Go Back   Doom9's Forum > Hardware & Software > Software players

Old 24th September 2020, 17:51   #1841  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
Using Tensor cores for madVR sounds like a joke.
A bad joke actually.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 24th September 2020, 17:58   #1842  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,920
using cores meant for neural network operations in a video renderer full of neural network scalers is a bad joke...

edit: about the bug. the following is what i encounter, and i sadly need other users to reproduce it to confirm it.

i have the following issue with 456.38: the power management mode is ignored, so if you use DDU or the clean-installation option in the driver installer you will be stuck with "optimal power".

so how did i "prove" this?
i contacted nvidia support because my GPU was stuck at 800 MHz during madVR playback, the usual issue with "optimal power", even though i had changed this setting.
i just followed the support rep's orders: install an older driver, set some very questionable settings in the nvidia control panel, and select "prefer maximum performance". after that "worked" i was supposed to install the newest driver again. and it works now, but i'm stuck at maximum performance and the GPU idles at 1.5 GHz.

so what i'd like from some users here is a report on whether maximum performance works on your system, and whether the idle GPU clock is in the GHz range if your driver was not set to maximum performance before.

if you'll excuse me, i have to install 452.22, set it to adaptive, and then install 456.38. "fun"
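
for anyone trying to reproduce this, one quick way to check the idle clock is to query `nvidia-smi`. rough sketch in python: the query fields are standard `nvidia-smi` CSV output, but the 1000 MHz cutoff below is just an illustrative guess for "idle should be well under this", not an official number.

```python
import subprocess

IDLE_THRESHOLD_MHZ = 1000  # illustrative cutoff: an idle desktop GPU should clock far below this


def is_stuck_high(clock_mhz: int, threshold: int = IDLE_THRESHOLD_MHZ) -> bool:
    """True if the reported graphics clock looks pinned at boost levels."""
    return clock_mhz >= threshold


def read_gpu_clock_mhz() -> int:
    """Read the current graphics clock of GPU 0 via nvidia-smi (needs an NVIDIA driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])


if __name__ == "__main__":
    try:
        clock = read_gpu_clock_mhz()
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not available here")
    else:
        state = "stuck at boost clocks?" if is_stuck_high(clock) else "idling normally"
        print(f"graphics clock: {clock} MHz -> {state}")
```

run it with the desktop idle (no video playing); if it still reports GHz-range clocks, you're likely hitting the stuck power-mode issue.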

Last edited by huhn; 24th September 2020 at 18:37.
Old 24th September 2020, 18:55   #1843  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Klaus1189 View Post
I saw that as well, it's a big fat joke just as we thought.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 24th September 2020, 21:41   #1844  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
Quote:
Originally Posted by chros View Post
... it's a big fat joke just as we thought.
We? Who's we?
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 24th September 2020, 22:11   #1845  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,920
pretty much everyone here said to just ignore it; it's expensive and has use cases for professionals, that's it.
Old 25th September 2020, 12:00   #1846  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
The 3090 is a disaster.

It needs around 375W (custom 3090s are over 400W) and $1,500 to be 10% faster in 4K gaming than the $700 3080.

If 4K is a niche for most people, then 8K gaming is extra-terrestrial right now.
There are no monitors for it, and most games are not ready for this resolution.

8K gaming is feasible mainly in DLSS-enabled games using the new "Ultra performance" mode, rendering the game at 1440p and upscaling to 8K, a 9x (!) upscale.
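
The 9x factor is simple pixel arithmetic, if anyone wants to check it:

```python
# "Ultra performance" DLSS: shade at 1440p, reconstruct to 8K
rendered = 2560 * 1440   # 3,686,400 pixels actually shaded per frame
output = 7680 * 4320     # 33,177,600 pixels on an 8K display
factor = output / rendered
print(factor)            # 9.0 -> each shaded pixel must cover nine output pixels
```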

Also, it's not a professional card.
Very far from it.
Steve from Gamers Nexus is not right on this one.

It's a lot slower than the Turing TITAN RTX in applications like CAD, design, etc.
nVidia didn't enable the Quadro driver optimizations on a GeForce card like the 3090, as it did with the Titan series.

The result is a card much slower in these cases than even the Turing TITAN RTX.

Also, the 3090 has very slow FP64 performance and doesn't support GPU virtualization/sharing via SR-IOV.

nVidia probably reserved these features for a $3,000 Ampere Titan.

So, no.
Most of the users of this forum were wrong.
The 3090 is not a replacement for the Titan cards.

It could be useful only to content creators using Blender or other rendering software, editing 8K video, or running any productivity workload that can't fit in the 3080's 10GB of VRAM, because the chip and the drivers are exactly the same for the 3080 and 3090, with the features of real professional software disabled.

Prosumer Titan and professional Quadro users doing CAD, design or AI training should wait for a $3,000 Titan Ampere or a Quadro Ampere to do their job.

So if the 3090 isn't worth it for gaming and doesn't have professional/workstation drivers, what is it worth?

Ask the leather-jacket-man.

Or even better: gamers, prosumers and professionals could wait for the RDNA2 cards.
I hope this time AMD will grab the opportunity.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 25th September 2020, 12:36   #1847  |  Link
Polopretress
Registered User
 
Join Date: Sep 2017
Posts: 46
Quote:
Originally Posted by huhn View Post
using cores meant for neural network operations in a video renderer full of neural network scalers is a bad joke...

edit: about the bug. the following is what i encounter, and i sadly need other users to reproduce it to confirm it.

i have the following issue with 456.38: the power management mode is ignored, so if you use DDU or the clean-installation option in the driver installer you will be stuck with "optimal power".

so how did i "prove" this?
i contacted nvidia support because my GPU was stuck at 800 MHz during madVR playback, the usual issue with "optimal power", even though i had changed this setting.
i just followed the support rep's orders: install an older driver, set some very questionable settings in the nvidia control panel, and select "prefer maximum performance". after that "worked" i was supposed to install the newest driver again. and it works now, but i'm stuck at maximum performance and the GPU idles at 1.5 GHz.

so what i'd like from some users here is a report on whether maximum performance works on your system, and whether the idle GPU clock is in the GHz range if your driver was not set to maximum performance before.

if you'll excuse me, i have to install 452.22, set it to adaptive, and then install 456.38. "fun"
I can confirm: a contact of mine who tried 456.38 sees the same behavior.
The selected power mode is not managed; it stays stuck at the last value applied with the previous driver version.
(Where are the validation team and the regression-test database on nvidia's side? ...)

It also seems that CRU no longer works (via the extension blocks and DisplayID).

Last edited by Polopretress; 25th September 2020 at 12:42.
Old 25th September 2020, 13:11   #1848  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
@NikosD

Any chance you could calm down a bit and stop the misinformation?

Custom 3090s don't all need more than 375W. Apart from the fact that I like EVGA's support and 3-year warranty, I chose the XC3 because it only needs two 8-pin power connectors and draws a max of 375W. It's also smaller than the 3090 FE, so I hope it will fit in my case.

375W is at full load. It will draw a lot less otherwise.

Why are you so worked up about companies' strategies? They are not your friend or mine. There is nothing personal. They have a single goal: maximising profits for their shareholders. AMD is no more friendly than nVidia. Who cares who makes a GPU, as long as it's the best choice for your needs?

Anyway, re your other points: I don't really play games, I don't have the time, though I occasionally play fairly old games with my daughter. Apart from madVR, I plan to use the 3090 with Flight Simulator 2020 (in 4K) and for video editing and post-production work in 4K/8K workflows (I use the Studio drivers). Both applications need more than 10GB, so there are definitely some use cases for it, even if it's often more a want than a need.

If people are stupid enough to buy them to play games in "8K", then it's their problem, but I don't think many do that. Currently the 3090 is the best solution for my needs, especially as I don't upgrade often. I bought my 1080 Ti in 2017, I have zero interest in Turing due to the lack of HDMI 2.1, and I'm hoping to keep the 3090 (if it fits in my case) for at least 2-3 years. AMD is not even on the table due to the lack of tensor cores and BT.2020 flag support, so it's nice to not even care about what they might release next month. They could release a card more powerful than the 3090 for half the price and they would still not get my money.

When/if madVR starts using tensor cores, I want to be able to make the most of it, instead of having to upgrade once more. And being able to set the BT2020 flag is a requirement for me. I don't switch gamut or calibration manually in my cinema room, sorry. I start a film with my remote control or my iPad, and HD Fury does the rest.

Sure, I'd rather buy a 3080 Ti with 20GB for 30% less money, as that would suit my needs just as well, but we don't know when it will materialise or in what quantity, and the time I waste scouring forums and resellers to find out or to secure a new GPU IS money.

On the other hand it's quite nice to have the flagship with full bus and full memory (until Ampere Titan shows up, but I have no interest in that) and to stop thinking about upgrading for a while. I really don't care about overclocking, getting top scores at benchmarking, water-cooling etc.

I managed to snatch a pre-order for the 3090 during the 1-2 hour window it was available (entirely by luck, I thought I'd try once on the 24th). I was able to place an order for the model I wanted - EVGA XC3 Gaming - at the price I was expecting to pay, knowing that I can return it if it doesn't fit my case, so I did it.

My situation is slightly different from most, though, because it's a business purchase: I use it primarily for work, both with madVR and with video post-production. I get the 20% VAT back and I save 20% in taxes on company profits, so it effectively costs me 40% less than the full price. It's still expensive, but as a business expense it's not that bad compared to a high-end laptop, projector or AVR. I certainly wouldn't recommend a 3090 to anyone who doesn't need one and can't wait for the 3080 20GB. I'm curious to see how much my CPU will be a bottleneck for my use case; I expect it will be. I've budgeted an upgrade to Rocket Lake next year if necessary. Again, I don't upgrade often, and I've been waiting for Intel to implement PCIe 4.0 to do so, and only if I need to.
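
A quick sketch of how those two reliefs combine (hypothetical 1,500 sticker price; the 20% VAT and 20% profit-tax rates stated above; real accounting varies by jurisdiction). Note that the two reliefs compound rather than add, so under these simplified assumptions the combined saving works out nearer a third off than a flat 40%:

```python
gross = 1500.0                  # hypothetical price paid, including 20% VAT
net = gross / 1.20              # reclaim the VAT -> 1250.00
effective = net * (1 - 0.20)    # 20% relief on company profits -> 1000.00
saving = 1 - effective / gross  # fraction of the sticker price saved
print(f"effective cost: {effective:.2f}, saving: {saving:.1%}")
```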

Buying a 3080 with 10GB for anything but 4K video feels short-sighted to me; many games already need more than that, so to me it's a worse choice than the 3090, even if the 3090 completely loses on value for money.

But we're all different, with different needs, priorities, preferences and budgets.

Couldn't you accept that what's good for you might not be best for someone else, and vice-versa, and please could you stop the AMD fanboy rants?

In any case, I suggest that you start a new thread if you really want to keep going that way, because this brand war has nothing to do with the topic of this thread, and it's getting boring, frankly.

I subscribed to this thread to follow potential driver issues, not to hear fanboys ranting about their most-hated or favorite brand, or to have to justify my purchase. If AMD was giving me what I wanted, I'd most happily buy AMD. I had a Sapphire 7870 before the 1080ti and I was very happy with it (until they changed their drivers and made changing advanced video settings impossible, but that's another story).

I also tried a few AMD GPUs for my EGPU as they would be a better choice on my Macbook Pro when using MacOS, but they were all pants in Bootcamp so I sent them back.

Anyway, thanks for considering giving us a break!
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 25th September 2020 at 16:48.
Old 25th September 2020, 13:31   #1849  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
@Manni

I don't know if this thread is suitable for GPU discussions, but in the last few days I have clearly cut back on posting, and my posts are certainly not as huge as yours.

You are definitely an anti-AMD guy, even considering buying Rocket Lake in 2021, a CPU not yet officially mentioned anywhere, and you didn't even mention Zen 3.

You are an Intel fanatic and an anti-AMD fanatic.
That's why you cared so much about replying to me with a huge post.

Also, a rather large part of your post is desperately trying to justify the worst buy ever: the 3090.

Are you serious, telling people that you bought a 3090 for madVR and business?
Are you serious that Tensor cores could be used for anything outside very specific purposes like DLSS on GeForce RTX cards?
Who says such garbage?

You are definitely misleading people by justifying the huge power consumption of the 3080/3090 cards.

There are custom 3090 cards with a 420W TDP.
Do you know what you are talking about?

The strategies of companies like Intel, nVidia and AMD have serious impacts on our pockets and on the industry in general.

We have a free-market system and everyone can burn their money on garbage like the 3090 cards - no doubt about it.

But we have to say loud and clear that the only reason to justify buying a 3090 card is just this: because you have money.

Posts like the one above make me want to write even more about this subject.
Posts from nVidiatel fanatics.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 25th September 2020, 13:53   #1850  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
All right, I thought there was hope with you, I was wrong.

I'm out of this discussion. And yes, I work in film and as a consultant for madVR Labs (though what I post here are my own words), so the 3090 is 100% a business purchase. MS Flight Simulator 2020 is just for fun at the end of the day.

EDIT: as for Intel vs AMD, when AMD starts supporting Thunderbolt on more than a couple of motherboards, I might consider it. In the meantime, for video editing, especially with After Effects, a Threadripper is a waste of money and Intel provides better performance at the same price point. I don't expect this to change with Rocket Lake, but if it does, I'll reconsider. I use Intel because it's better for my needs, not because it's Intel. I mention Rocket Lake because I want (for my needs) PCIe 4.0 and Thunderbolt 4, and as far as I know only Intel will provide that next year.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 26th September 2020 at 11:55.
Old 25th September 2020, 15:02   #1851  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by NikosD View Post
Posts from nVidiatel fanatics.
Why do you assume people must be fanatics if they disagree with you? (Projection?) I really don't think of any of these companies as entities to be fans of; the idea seems weird to me. Billion-dollar corporations are not something I can be a fan of.

I am buying a 3090 ASAP, but for no good reason; a 3080 is a much better buy for my use cases. If Big Navi turns out to be better than the 3090, I will buy one.

Anyone without an AMD tattoo seems anti-AMD to you.
__________________
madVR options explained
Old 25th September 2020, 15:24   #1852  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Where is the like button when you need it?
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Old 25th September 2020, 15:59   #1853  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,667
https://www.youtube.com/watch?v=7YQ7rNgoqMA
Old 25th September 2020, 16:43   #1854  |  Link
videoh
Useful n00b
 
Join Date: Jul 2014
Posts: 1,667
If you are going to link to Germish stuff, how about an executive summary in English? The language for this forum is English.
Old 25th September 2020, 16:53   #1855  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,667
See the description of the video:
https://www.igorslab.de/en/what-real...0-andrtx-3090/

Last edited by Klaus1189; 25th September 2020 at 16:55.
Old 25th September 2020, 17:06   #1856  |  Link
videoh
Useful n00b
 
Join Date: Jul 2014
Posts: 1,667
Thank you, Klaus.
Old 26th September 2020, 03:52   #1857  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 5,351
huhn, you asked why I'd even consider an FE over an AIB. Given the latest reports out there, hopefully it's painfully obvious now. It looks like at least some of the AIBs cheaped out on capacitors, and the crash-to-desktop problems we're hearing about MAY be caused by voltage problems on those cards when they boost. nVidia used the good capacitors. ASUS did as well, and theirs is the only other card I was even looking at. And... I think I'll stick to that. LOL. Drivers are also not super awesome for these cards, on top of the problems you've found with power settings not sticking. This launch feels rushed even though it shouldn't have been. We know why that is, but nVidia really didn't have any room for a screw-up here. Hopefully these issues get sorted out.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 26th September 2020, 05:47   #1858  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
I think companies simply use the release as a QA step today. In-house QA at that level would be expensive, digital updates are cheap, and everyone who bought one can apply them, so the PR hit isn't too bad. Every company that can does it now.
__________________
madVR options explained
Old 26th September 2020, 08:10   #1859  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by SamuriHL View Post
This launch feels rushed even though it shouldn't have been. We know why that is
Yeah, Nvidia was scared of Big Navi.

Last edited by ryrynz; 26th September 2020 at 08:36.
Old 26th September 2020, 09:13   #1860  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,920
Quote:
Originally Posted by SamuriHL View Post
huhn, you asked why I'd even consider an FE over an AIB. Given the latest reports out there, hopefully it's painfully obvious now. It looks like at least some of the AIBs cheaped out on capacitors, and the crash-to-desktop problems we're hearing about MAY be caused by voltage problems on those cards when they boost. nVidia used the good capacitors. ASUS did as well, and theirs is the only other card I was even looking at. And... I think I'll stick to that. LOL. Drivers are also not super awesome for these cards, on top of the problems you've found with power settings not sticking. This launch feels rushed even though it shouldn't have been. We know why that is, but nVidia really didn't have any room for a screw-up here. Hopefully these issues get sorted out.
yes, i still do. the 3080 FE is part of this problem, and the 3080 FE is so bad that it will thermal throttle with, say, furmark; that's not the case with a decent AIB card.

there were AIB cards so terrible that they would die if you flashed a stock bios on them and ran furmark, and just to make that absolutely clear: i said die, not maybe die.

clock rates are currently being looked at as a reason why the cards are crashing. AIB cards often reach higher boost clocks than FE cards simply by being massively cooler with the same bios; it is what it is.

so just lay back and wait for some tests; it's sadly not unusual for stuff like this to happen. does anyone want an exploding EVGA card even though the VRMs are totally fine? there were and there will always be terrible AIB cards too.