Old 23rd October 2013, 19:52   #20561  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 851
huhn, don't you mean:
gpu -> hdmi -> avr
gpu -> dvi -> hdmi -> beamer


instead of what you are saying:
gpu -> hdmi -> avr -> beamer
gpu -> dvi -> hdmi -> beamer


This won't work anyway. Check the specs of my card here:
http://www.club-3d.com/index.php/products/reader.en/product/radeon-hd-7950-royalking.html

You can see here that:
Maximum Resolution analog: 2048x1536 (via Dual-Link DVI to VGA adapter)
Maximum Resolution Single-Link DVI: 1920x1200
Maximum Resolution Dual-Link DVI: 2560x1600
Maximum Resolution HDMI 1.4a: 4096x3112
Maximum Resolution Mini DP 1.2: 4096x2160
Maximum Outputs Simultaneously: 4

So DVI => HDMI => beamer will have a max resolution of 2560x1600, while I need 4096x2160.
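For a rough sense of the bandwidth involved (a back-of-the-envelope sketch, not exact CVT timings; the ~1.25 blanking factor is an assumption, and the per-port maximums in the spec sheet above are what ultimately apply), the approximate pixel clocks work out like this:

[CODE]
# Rough pixel-clock estimate: pixels per frame x refresh rate x blanking overhead.
# The 1.25 blanking factor is an assumption; real modelines (CVT, CVT-RB) differ.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_factor=1.25):
    return width * height * refresh_hz * blanking_factor / 1e6

modes = [
    ("2560x1600 @ 60 Hz", 2560, 1600, 60),   # dual-link DVI territory (~330 MHz ceiling)
    ("1920x1200 @ 60 Hz", 1920, 1200, 60),   # fits single-link DVI (~165 MHz) only with reduced blanking
    ("4096x2160 @ 24 Hz", 4096, 2160, 24),   # only advertised on HDMI 1.4a / DP 1.2 by this card
]

for name, w, h, hz in modes:
    print(f"{name}: ~{approx_pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")
[/CODE]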
Old 23rd October 2013, 20:02   #20562  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
this should work anyway because you use the DVI slot but it behaves like an HDMI slot.

i'm pretty sure all the slots are DisplayPort slots - read about Eyefinity, that's what it's called.

the card just detects what type of connection it is.

so just try it out.

a passive adapter works for sure in this case.

Last edited by huhn; 23rd October 2013 at 20:04.
Old 23rd October 2013, 20:39   #20563  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by pie1394 View Post
My card was made by Lantic, yet the CCC shows 6 audio devices from this card in total. It means the proprietary DVI connectors are actually wired with SPDIF signal pins when running in single-link mode --- exactly like HDMI. Of course the correct adapter is required, because the standard DVI connector has no SPDIF pin definition.
SPDIF and HDMI Audio are two completely different things. For example you can't output LPCM 5.1 over SPDIF, but you can over HDMI. Some cards support HDMI Audio over a DVI output with an adapter (sometimes you need to use the right adapter that has a shunt to "signal" the card it's actually HDMI), some don't.
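The raw numbers make the gap obvious. A quick sketch (assuming 48 kHz / 24-bit LPCM; note the real S/PDIF limitation is that an IEC 60958 frame only carries two PCM subframes, so multichannel LPCM doesn't fit regardless of bitrate - the arithmetic just shows the scale):

[CODE]
# Raw LPCM payload rates in Mbit/s - a simplification, see the note above.
def lpcm_rate_mbps(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample / 1e6

spdif_stereo = lpcm_rate_mbps(2, 48000, 24)    # what S/PDIF actually carries as PCM
lpcm_5_1     = lpcm_rate_mbps(6, 48000, 24)    # what 5.1 LPCM would need
hdmi_ceiling = lpcm_rate_mbps(8, 192000, 24)   # HDMI audio sample packets: up to 8ch/192kHz/24bit

print(f"S/PDIF stereo PCM payload: ~{spdif_stereo:.1f} Mbit/s")
print(f"5.1 LPCM requirement:      ~{lpcm_5_1:.1f} Mbit/s")
print(f"HDMI LPCM ceiling:         ~{hdmi_ceiling:.1f} Mbit/s")
[/CODE]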
Old 23rd October 2013, 20:41   #20564  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 851
Quote:
Originally Posted by huhn View Post
this should work anyway because you use the DVI slot but it behaves like an HDMI slot.

i'm pretty sure all the slots are DisplayPort slots - read about Eyefinity, that's what it's called.

the card just detects what type of connection it is.

so just try it out.

a passive adapter works for sure in this case.
I just connected one of the DVI outputs to my AVR with this cable:
http://www.beamershop24.nl/hdmi-kabel/oehlbach-real-matrix-hdmi-und-dvi-kabel-12-meter/

I get no sound... @pie1394: None of the 6 HDMI audio devices 'lit up'.
Old 23rd October 2013, 20:43   #20565  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 851
Quote:
Originally Posted by e-t172 View Post
SPDIF and HDMI Audio are two completely different things. For example you can't output LPCM 5.1 over SPDIF, but you can over HDMI. Some cards support HDMI Audio over a DVI output with an adapter (sometimes you need to use the right adapter that has a shunt to "signal" the card it's actually HDMI), some don't.
But do I just order a load of DVI=>HDMI adapters and try to see if one works? No adapter came with the card.

Last edited by Guest; 25th October 2013 at 14:32. Reason: rule 4: no profanity
Old 23rd October 2013, 20:51   #20566  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
try the onboard for audio and the AMD for picture.

both are normal HDMI connections.

i have an audio option for every connection in my Nvidia control center and AMD should work the same way... and on top of that i have 4 HDMI audio output devices in my sound settings.
Old 23rd October 2013, 21:00   #20567  |  Link
djfred93
Registered User
 
Join Date: Aug 2012
Posts: 32
THX-UltraII, you need an HDMI splitter or a Mini DP to HDMI active adapter.

Splitter:

gpu -> hdmi -> splitter -> avr
splitter -> Sony

Mini DP to HDMI active adapter:

gpu -> hdmi -> avr
gpu -> mini DP -> active adapter -> hdmi -> Sony

Last edited by djfred93; 23rd October 2013 at 21:22.
Old 24th October 2013, 00:30   #20568  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by Werewolfy View Post
Between maxDif 1.9 and 2.1 there is a slight difference in debanding quality, I find 1.9 is just a little too low sometimes. MaxDif at 2.0 could be a better compromise, I don't think there is a difference between 2.0 and 2.1. But it must be confirmed by other testers.
I'd ideally like for it to be set at 1.9, and yeah, turning just about anything down affects debanding effectiveness, but you always get more detail back as a result. It's a slider you just have to find the most acceptable position for, and I think the compromise here is worth it. I also tried lower settings, and a touch less on maxAngle, but the difference was too great to be worth considering.

If only the areas of "real" banded content were affected by these settings it would be great, but what I see is that enabling deband changes much of the picture quite subtly (almost 50% of the image gets changed in some way by enabling debanding at the low preset), so it's no wonder things look softer. Turning maxDif down a bit brings back a touch of those steeper gradients, so yeah, the debanded areas will look ever so slightly less blended, but remember the rest of the image also gets back what would otherwise have been lost to smoothing. I wouldn't be mentioning it if I didn't think it was worth it - I like debanding as much as the next guy and have been using 3kdb for quite some time.

Just wondering: with the content where you feel maxDif is a touch too low at 1.9, wouldn't that content be better served by the medium preset? For anime I'd like to keep low on by default, maybe even with 10-bit content - noticeable banding doesn't happen that often (unless you have low quality content). I do, however, also like some of the minor blending that occurs in very mildly banded areas, just not so much that it takes away the pop from the picture. Also, content is just going to improve in quality as time goes on (more bitrate, HEVC, etc.), so I'd personally like to err on the side of detail and let the medium preset take care of those harder to deal with bands.

I was thinking that perhaps some years down the track the presets may need to be looked at again, as banding becomes less of an issue.

Last edited by ryrynz; 24th October 2013 at 03:09.
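(Not madVR's actual code - just to illustrate how a maxDif-style threshold trades smoothing against detail in a debanding filter, here is a toy sketch; the box-blur reference, the radius and the threshold scale are all assumptions made for the example:)

[CODE]
import numpy as np
from scipy.ndimage import uniform_filter

def deband_sketch(plane, max_dif, radius=16):
    """Toy deband: flatten a pixel toward a local average only when it differs
    from that average by less than max_dif (8-bit value scale). Real filters
    (f3kdb-style, madVR) sample sparse reference pixels and add dithering."""
    src = plane.astype(np.float32)
    smoothed = uniform_filter(src, size=2 * radius + 1)
    diff = np.abs(src - smoothed)
    # Small differences are treated as banding steps and smoothed away; larger
    # ones are treated as real detail/edges and kept - so lowering max_dif
    # preserves more detail but debands less aggressively.
    return np.where(diff < max_dif, smoothed, src).astype(plane.dtype)

# Example: a dark 8-bit gradient with 1-step bands gets flattened,
# while anything with bigger local contrast is left alone.
gradient = np.tile(np.repeat(np.arange(16, 48, dtype=np.uint8), 20), (64, 1))
out = deband_sketch(gradient, max_dif=2.0)  # threshold scale here is arbitrary, not madVR's units
[/CODE]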
Old 24th October 2013, 06:45   #20569  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Have you checked that your card even has onboard audio? With my first card I thought I could get audio over DVI, but the card never even supported audio over HDMI. Maybe you have already done the following, but I suggest going into Device Manager and confirming that audio drivers are installed for your GFX card. You should also make sure that the audio output device is set correctly in ReClock, if you use it (or in your media player), to force sound over the GFX card's audio.
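A quick way to see which audio endpoints the OS actually exposes (a hedged sketch, assuming Python with the third-party sounddevice package is installed; Device Manager or the Windows Sound panel shows the same information):

[CODE]
import sounddevice as sd  # pip install sounddevice

# List every output-capable audio endpoint. If the GPU's HDMI audio driver is
# installed and a sink is detected on the connector, entries along the lines of
# "AMD HDMI Output" or "Digital Audio (HDMI)" should appear here.
for idx, dev in enumerate(sd.query_devices()):
    if dev["max_output_channels"] > 0:
        print(f"{idx}: {dev['name']} ({dev['max_output_channels']} out channels)")
[/CODE]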
Old 24th October 2013, 14:09   #20570  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
Another silly question.

Using a relatively weak GPU (HD 6650M - actually no problem on a 6750M, the only difference is DDR3 vs GDDR5 VRAM) for smooth motion with 23fps content on a 60Hz screen, smooth motion almost doubles the rendering time.

Are there any options to make smooth motion less "expensive"?
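(A note on where that cost comes from: smooth motion frame-blends the ~23.976fps source onto the 60Hz output clock, so most output frames are a weighted mix of two decoded frames. This is not madVR's code, just a rough sketch of how such blend weights can be derived under that assumption:)

[CODE]
# Toy model of frame-blended "smooth motion": for each display vsync, find the
# two source frames that straddle it and the blend weight between them.
SRC_FPS = 24000 / 1001   # 23.976 fps source
DISP_HZ = 60.0           # display refresh rate

def blend_schedule(n_vsyncs):
    schedule = []
    for v in range(n_vsyncs):
        t = v / DISP_HZ        # presentation time of this vsync in seconds
        pos = t * SRC_FPS      # position on the source-frame timeline
        f0 = int(pos)          # earlier source frame
        w = pos - f0           # weight of the later frame (0..1)
        schedule.append((v, f0, f0 + 1, w))
    return schedule

for vsync, f0, f1, w in blend_schedule(8):
    # Each output frame = (1 - w) * frame f0 + w * frame f1: two texture reads
    # and a blend per vsync instead of one plain copy - roughly why the render
    # time goes up so much on a weak GPU.
    print(f"vsync {vsync}: {1 - w:.3f} * frame {f0} + {w:.3f} * frame {f1}")
[/CODE]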
Old 24th October 2013, 16:49   #20571  |  Link
Niyawa
Registered User
 
 
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
Quote:
Originally Posted by baii View Post
Are there any options to make smooth motion less "expensive"?
Having "use a separate device for presentation" enabled halves my rendering time with smooth motion. But that's about all that made any significant difference.
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack
Old 24th October 2013, 16:54   #20572  |  Link
Werewolfy
Registered User
 
Join Date: Feb 2013
Posts: 137
Quote:
Originally Posted by ryrynz View Post
Just wondering: with the content where you feel maxDif is a touch too low at 1.9, wouldn't that content be better served by the medium preset? For anime I'd like to keep low on by default, maybe even with 10-bit content - noticeable banding doesn't happen that often (unless you have low quality content). I do, however, also like some of the minor blending that occurs in very mildly banded areas, just not so much that it takes away the pop from the picture. Also, content is just going to improve in quality as time goes on (more bitrate, HEVC, etc.), so I'd personally like to err on the side of detail and let the medium preset take care of those harder to deal with bands.
The content where I feel maxDif at 1.9 is just a bit too low is an anime. The medium preset will blur some details, so I try to deband as much as possible with the low preset while preserving details, and I don't see a difference there between a low or high value for maxDif, even though I'm very picky.

Like I said, other testers or madshi need to confirm that. But do you see the difference on all content? Or is there some content where the difference is far more pronounced? If that's the case, a video sample would help.
Old 24th October 2013, 18:43   #20573  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Quote:
Originally Posted by e-t172 View Post
SPDIF and HDMI Audio are two completely different things. For example you can't output LPCM 5.1 over SPDIF, but you can over HDMI. Some cards support HDMI Audio over a DVI output with an adapter (sometimes you need to use the right adapter that has a shunt to "signal" the card it's actually HDMI), some don't.
Urm... you are right... there is something wrong with my memory. @_@

The HDMI audio data frames also go over the TMDS signal pins -- just in a different packet format. The same goes for DP's audio.

Quote:
Originally Posted by THX-UltraII View Post
I just connected one of the DVI outputs to my AVR with this cable:
http://www.beamershop24.nl/hdmi-kabel/oehlbach-real-matrix-hdmi-und-dvi-kabel-12-meter/

I get no sound... @pie1394: None of the 6 HDMI audio devices 'lit up'.
A regular DVI-HDMI cable does not map all the HDMI signal pins (or the signal impedance) to the card's DVI connector, so it is always detected as a DVI device. You need the card vendor's own adapter to use the DVI connector as a standard HDMI output --- just like the cable in the picture I attached previously, which is provided with the Asus Royal King card.
Old 24th October 2013, 19:34   #20574  |  Link
Mangix
Audiophile
 
Join Date: Oct 2006
Posts: 353
Quote:
Originally Posted by pie1394 View Post
A regular DVI-HDMI cable does not map all the HDMI signal pins (or the signal impedance) to the card's DVI connector, so it is always detected as a DVI device. You need the card vendor's own adapter to use the DVI connector as a standard HDMI output --- just like the cable in the picture I attached previously, which is provided with the Asus Royal King card.
The weird thing is that on my nvidia card, a generic DVI-HDMI adapter provides sound. I am not sure what the exact situation is for AMD cards but I do also have an ATi adapter which works as well. The difference between both adapters is that the ATi one has a DVI-I interface while the generic is just regular single-link DVI-D.
Old 24th October 2013, 21:58   #20575  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Mangix View Post
The weird thing is that on my nvidia card, a generic DVI-HDMI adapter provides sound. I am not sure what the exact situation is for AMD cards but I do also have an ATi adapter which works as well. The difference between both adapters is that the ATi one has a DVI-I interface while the generic is just regular single-link DVI-D.
i used the nvidia adapter with a single-link DVI-I cable and it works too...

yeah, i put the adapter into the TV...
Old 25th October 2013, 03:21   #20576  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by Werewolfy View Post
But do you see the difference on all content?
Yeah I see it on all content. I don't expect many would pick up the difference so I'd rather spare some pixels the change. It may be that a calibrated screen helps out somewhat here.
Old 25th October 2013, 06:19   #20577  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Quote:
Originally Posted by Mangix View Post
The weird thing is that on my nvidia card, a generic DVI-HDMI adapter provides sound. I am not sure what the exact situation is for AMD cards but I do also have an ATi adapter which works as well. The difference between both adapters is that the ATi one has a DVI-I interface while the generic is just regular single-link DVI-D.
Now I understand why the special adapter is required for some ATi cards.

As the source information says, such a vendor-made adapter contains a specific I2C EEPROM wired onto the I2C bus (used for DDC). So I guess that is how the GPU driver decides whether it should drive that video port with HDMI capability or just as a regular DVI output.

But I don't know whether all ATi card manufacturers are forced to use this scheme when designing the DVI connectors' functions on their cards...

Last edited by pie1394; 25th October 2013 at 06:25.
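Related to the DDC/I2C detection above: a sink (or a suitably wired adapter) advertises itself as HDMI rather than plain DVI by carrying an HDMI Vendor-Specific Data Block (IEEE OUI 00-0C-03) in the CEA-861 extension of its EDID. A rough sketch of checking for that, assuming a Linux box where the kernel exposes raw EDIDs under /sys/class/drm (the parsing is simplified):

[CODE]
from pathlib import Path

HDMI_OUI = bytes([0x03, 0x0C, 0x00])  # IEEE OUI 00-0C-03, stored LSB-first in the VSDB

def sink_is_hdmi(edid: bytes) -> bool:
    """Walk the CEA-861 extension blocks looking for an HDMI Vendor-Specific
    Data Block. Simplified; ignores block maps and malformed EDIDs."""
    for base in range(128, len(edid), 128):
        ext = edid[base:base + 128]
        if len(ext) < 4 or ext[0] != 0x02:      # 0x02 = CEA-861 extension tag
            continue
        dtd_start = ext[2]                      # offset where detailed timings begin
        i = 4
        while i < dtd_start:                    # scan the data block collection
            tag, length = ext[i] >> 5, ext[i] & 0x1F
            if tag == 0x03 and ext[i + 1:i + 4] == HDMI_OUI:  # vendor-specific block
                return True
            i += 1 + length
    return False

for edid_path in Path("/sys/class/drm").glob("card*-*/edid"):
    data = edid_path.read_bytes()
    if data:
        kind = "HDMI sink" if sink_is_hdmi(data) else "DVI-only sink"
        print(f"{edid_path.parent.name}: {kind}")
[/CODE]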
Old 25th October 2013, 10:24   #20578  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Some nVidia cards also had audio output over DVI using a special adapter. Quite clever really but I think modern cards just have HDMI outputs instead.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
Old 25th October 2013, 13:29   #20579  |  Link
LigH
German doom9/Gleitz SuMo
 
 
Join Date: Oct 2001
Location: Germany, rural Altmark
Posts: 6,753
My only problem with HDMI is that it is not fully compatible with DisplayPort.

There are DP-HDMI adapter cables... but someone told me that you can connect DP outputs to HDMI inputs, but not vice versa. I did not know that before; a DP monitor connected to an HDMI graphics card reports "no signal".
__________________

New German Gleitz board
MediaFire: x264 | x265 | VPx | AOM | Xvid
Old 25th October 2013, 14:09   #20580  |  Link
jmonier
Registered User
 
Join Date: Oct 2008
Posts: 187
This LONG discussion about audio over HDMI has nothing to do with madVR. Maybe it could be moved to a new thread?