If you have a high-end GPU that can properly decode 4K and 4K UHD, then maybe you can do it. The short answer is: no, you are not likely to get visual benefits, or at least none that most people will notice.

For the sake of simplicity, let's say that using a CRF of 17 is equal to using a constant QP of 17. The lower the QP, the higher the bitrate, and vice versa: QP 0 means no quantization loss, and QP 51 is the maximum possible (for 8-bit H.264). This technique of reducing information via quantization is decades old and is the basis of most lossy video compression algorithms.

The question is: where is the subjective point at which the compression benefits outweigh the loss in quality? Subjective tests, in which users view one compressed clip after another and rate its quality, suggest that for UHD video compressed with libx264, Mean Opinion Scores saturate around ~30–40 Mbit/s (with 2-pass encoding, medium preset). This means that human subjects cannot tell the difference with (or will not give a higher rating to) clips encoded at higher bitrates. With a CRF lower than 17, you'll usually reach 50 Mbit/s or more, depending on the content, so a lower CRF is usually a waste of bits that humans won't notice. In other words, there's a perceptual threshold past which using a lower QP won't give you better quality; it'll only mean you waste more bits than necessary. This threshold depends on the spatial features of the content, since some things are easier to compress than others. There's definitely no hard threshold, but most recommendations hover around QP/CRF 17–18 for H.264.

So basically, any movie released in 4K will also have a 1080 version available. You're welcome to ignore this advice, but then you're back to your original post, where you can only convert at about 1/5th of the speed needed. Hence, for all these reasons, only play 4K media on devices that can direct play it, and otherwise use the 1080 version that will direct play.
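The quantization idea behind QP can be illustrated with a toy sketch. This is a simplified uniform quantizer for illustration only, not the actual H.264 transform/quantization pipeline, and the coefficient values and step sizes are made up; the point is just that a larger step discards more coefficient precision (fewer distinct levels survive, reconstruction error grows), which is what trades quality for bits:

```python
import random
import statistics

def quantize_roundtrip(coeffs, step):
    """Quantize to a uniform step grid, then dequantize (toy model, not H.264)."""
    return [round(c / step) * step for c in coeffs]

random.seed(0)
# Stand-ins for transform coefficients; real codecs quantize DCT-like coefficients.
coeffs = [random.gauss(0, 10) for _ in range(1000)]

for step in (1, 4, 16):
    recon = quantize_roundtrip(coeffs, step)
    mse = statistics.fmean((c - r) ** 2 for c, r in zip(coeffs, recon))
    levels = len({round(c / step) for c in coeffs})
    print(f"step={step:>2}  surviving levels={levels:>3}  MSE={mse:.2f}")
```

Values that land on the grid survive unchanged; everything else is rounded, and the information lost this way can never be recovered by the decoder.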
Depending on your hardware, you may be able to transcode a 4K movie, but your server likely won't have the horsepower to do so. With the current state of 4K, as a general rule you want to put 4K files in their own library and only allow users that have a 4K TV and the needed bandwidth to access them. A lot of people don't have enough bandwidth to stream 4K outside the house, so remote 4K is a no-go just from that.

If the client is not able to play 4K, it's going to make the server do a lot of work that wouldn't be needed if the 1080 version was served. If the 4K movie is transcoded, it's almost surely going to get downgraded to 1080 anyway. So instead of just serving a 1080 version that was encoded using high-quality settings with no time limit on the encode, you try to take a 4K movie and convert it at better than real time, using settings that aren't designed for high-quality storage but to make something watchable. Chances are the 1080 version on disc is going to have better quality than the converted 4K movie, especially if it's an action movie with lots of fast-changing scenes. On top of that, if the 4K movie was a 4K UHD, it's going to have a different color space that will not look good at all on a non-UHD screen/setup.
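To put the bitrate figures above in perspective, here is a quick back-of-the-envelope calculation (the 40 and 50 Mbit/s values are illustrative, taken from the discussion above, not measurements of any particular disc) of what a sustained 4K stream amounts to over a feature-length film:

```python
def stream_size_gib(bitrate_mbps, duration_min):
    """File size in GiB for a stream at the given average bitrate."""
    bits = bitrate_mbps * 1e6 * duration_min * 60  # total bits transferred
    return bits / 8 / 2**30                        # bits -> bytes -> GiB

for mbps in (40, 50):
    print(f"{mbps} Mbit/s over a 2 h film ≈ {stream_size_gib(mbps, 120):.1f} GiB")
```

A single remote stream at 40–50 Mbit/s already exceeds the sustained upload capacity of many residential connections, which is why the advice is to reserve 4K for clients that can direct play it locally.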