8th October 2011, 04:17   #10023
golagoda
Quote:
Originally Posted by Boltron
So, on my never-ending quest for the best quality playback possible, a question for you folks. On another forum where I sing the praises of madVR, some members commented that they get better PQ from Nvidia than from ATI.

Now I know that we all have our preferences (I am in the ATI camp), but can anyone offer a non-biased opinion on which, if either, will produce a better picture, Nvidia or ATI? I really don't care about leveraging H/W acceleration, so let's ignore that ability. However, full audio bitstreaming is a must.

I run Windows 7 and use MPC-HC/LAV Filters/madVR. Output is HDMI to a Denon receiver and on to an LG LCD TV. The TV is calibrated, and the calibration data is fed to madVR's yCMS.
If you're not seeing things like microstutter, driver problems, or random visual glitches with ATI, the picture won't really be any different from what you'd get with an Nvidia GPU; both cards are doing the same job here.

The only advantage on the Nvidia side is that you can use LAV CUVID for decoding, but that just offloads decoding to the GPU; it has nothing to do with visual quality. My only suggestion is to keep trying new settings in madVR, LAV Filters, etc.