I'd been putting this upgrade off for a while. The machine has had stability issues for a couple of years now, and the usual troubleshooting never pinned down the cause. I even moved it into a new case/power supply and the issues continued. Turns out it was an intermittently bad DIMM slot causing random (and I mean RANDOM) reboots under medium and heavy loads. Super hard to diagnose. At that point it became imperative to replace the mobo, CPU, and RAM. The upgrade was planned a year ago, but I got distracted with my UHD upgrade first, and then with getting laid off 7 months ago. Now that I have a new job I wanted to get this thing fixed and upgraded first thing. I think I built a little monster that should last a while.
Quote:
Originally Posted by el Filou
I watch UHD movies downscaled to 1080p tone mapped to 200 nits with madVR, but even with downscaling I've noticed that film grain/digital noise is much less aesthetically pleasing than in any 1080 Blu-ray I've seen. Don't know if it's because of HDR or the better source definition or something else, but it looks more like artifacts than grain or camera read noise (which usually doesn't bother me). But if you're using GPU decode for the codecs that are supported now, then this would only be useful for AV1 while there is no fixed function hardware support. If it will be really useful or not all depends on how often you upgrade your GPU.
Some movies are worse than others. And from what I can tell it really does match the original source material. Karate Kid 1984, for example, has a ton of grain in it, but so does the original source material. I think on a lot of 1080p Blu-rays they filter some of that out when they master it, whereas the UHD is trying to retain the source material as much as possible. I'm good with original film grain. I'm not as much of a fan of digitally added film grain meant to make it look like film. I think that's where they overdo it quite a bit. IMO anyway.