Old 12th August 2018, 13:00   #1  |  Link
ShogoXT
Registered User
 
Join Date: Dec 2011
Posts: 95
Rumor: Nvidia GeForce 1180 / 2080

There has been information floating around that the next generation of GeForce cards will be introduced in September.

First will be the xx70 and xx80, with the other tiers later. The GeForce RTX branding has also been trademarked. Code names have been unclear, as both Ampere and Turing have come up. Volta has been mentioned as well, but Volta is likely limited to workstation and compute parts.

Major new features include GDDR6 memory and ray tracing.

NVENC hasn't been mentioned, and it's likely too early for any AV1 support. VP9 improvements would be nice.
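For anyone who wants to check what the current NVENC/NVDEC blocks already expose, here is a minimal sketch (my own illustration, assuming ffmpeg is on the PATH and was built with the Nvidia codecs enabled) that lists the relevant encoders and decoders; it says nothing about what the new cards will add:

Code:
import subprocess

def list_nvidia_codecs():
    """List the NVENC encoders and CUVID/NVDEC decoders in the local ffmpeg build."""
    encoders = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                              capture_output=True, text=True).stdout
    decoders = subprocess.run(["ffmpeg", "-hide_banner", "-decoders"],
                              capture_output=True, text=True).stdout
    nv_enc = [line.strip() for line in encoders.splitlines() if "nvenc" in line]
    nv_dec = [line.strip() for line in decoders.splitlines() if "cuvid" in line]
    return nv_enc, nv_dec

if __name__ == "__main__":
    enc, dec = list_nvidia_codecs()
    print("NVENC encoders:", *enc, sep="\n  ")        # e.g. h264_nvenc, hevc_nvenc
    print("NVDEC/CUVID decoders:", *dec, sep="\n  ")  # e.g. h264_cuvid, hevc_cuvid, vp9_cuvid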
Old 20th August 2018, 19:32   #2  |  Link
ShogoXT
Registered User
 
Join Date: Dec 2011
Posts: 95
https://videocardz.com/77498/nvidia-...0-and-rtx-2070

RTX 2000 Series Announced

MSRP Prices:

RTX 2080 Ti 11GB GDDR6 $999

RTX 2080 8GB GDDR6 $699

RTX 2070 8GB GDDR6 $499

Founders Edition is an additional $100 each, with the Ti being $200 more.

The event was mostly about ray tracing and real-time surface reflections. 8K decoding was rumored for content creators; otherwise I didn't see anything mentioned about NVENC, unfortunately.
Old 20th August 2018, 19:36   #3  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Can probably expect encoding improvements, as those have been rumored before. Decoding, there really isn't much to do right now. AV1 would be nice, but it's probably too early.

It would be interesting if real-time upscaling with neural networks on the Tensor Cores could be used much more performantly in video renderers, now that the Tensor Cores are coming to all NVIDIA cards and not only the workstation cards.
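As a rough illustration of the idea (my own sketch, not code from any actual renderer, assuming PyTorch with a CUDA build), an ESPCN-style 2x upscaler run in FP16 is exactly the kind of workload the Tensor Cores accelerate:

Code:
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Toy ESPCN-style 2x upscaler: two convolutions plus a pixel shuffle."""
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),   # rearranges channels into a scale-x larger frame
        )

    def forward(self, x):
        return self.body(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32  # FP16 is the Tensor Core path
model = TinyUpscaler().to(device=device, dtype=dtype).eval()
frame = torch.rand(1, 3, 540, 960, device=device, dtype=dtype)  # one 960x540 RGB frame
with torch.no_grad():
    upscaled = model(frame)
print(upscaled.shape)  # torch.Size([1, 3, 1080, 1920])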
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 20th August 2018, 19:58   #4  |  Link
Gser
Registered User
 
Join Date: Apr 2008
Posts: 418
Wow, that's just ridiculously expensive. I guess I'll be sticking with AMD from now on.
Old 20th August 2018, 20:33   #5  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The 2070 is supposed to outperform a Titan Xp, so the prices are "fine".

If these are binned Turing Quadro cards, then the 2080 Ti die is 754 mm², which is pretty big...

But these Tensor Cores are a headache to me: dedicated hardware for something like PhysX. All consoles are using AMD GPUs; if AMD isn't adding something like this too, this tech will die just like PhysX. And coding something only for RTX cards sounds like a waste given the limited user base.

I guess I will skip them too. I was really looking forward to them, but they are almost certainly 12 nm, and 7 nm cards are not that far away, consumer or not. Well, I guess I have to wait and see.
Old 20th August 2018, 21:18   #6  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
Quote:
Originally Posted by nevcairiel View Post
Can probably expect encoding improvements, as those have been rumored before. Decoding, there really isn't much to do right now. AV1 would be nice, but it's probably too early.
I could hope for a late 2050 (Ti) announcement around November, suitable for an AV1 implementation.
But then again, it's only a three-month difference.

Quote:
Originally Posted by nevcairiel View Post
It would be interesting if real-time upscaling with neural networks on the Tensor Cores could be used much more performantly in video renderers, now that the Tensor Cores are coming to all NVIDIA cards and not only the workstation cards.
Technically, Volta Tensor Cores have already shipped in a prosumer card, the Titan V, but that is definitely a very high-priced card at around $3,500, and very powerful too.

I'm really curious whether the 2060 and 2050 (Ti) will have Tensor Cores and ray tracing cores inside.

Quote:
Originally Posted by huhn View Post
The 2070 is supposed to outperform a Titan Xp, so the prices are "fine".
Hahahahaha.

You are such a funny guy!

Do you really believe the "leather-jacket man" was referring to rasterized gaming performance?

He was referring to some strange benchmark mixing Tensor or ray tracing core performance with CUDA core performance.

Any other comparison is ridiculous.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 20th August 2018, 21:30   #7  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
I said "supposed", but the RTX 2080 has a TDP of 250-285 W, which is the same as or more than the Titan Xp, on a better process, so...
Old 20th August 2018, 23:00   #8  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by huhn View Post
But these Tensor Cores are a headache to me: dedicated hardware for something like PhysX. All consoles are using AMD GPUs; if AMD isn't adding something like this too, this tech will die just like PhysX. And coding something only for RTX cards sounds like a waste given the limited user base.
AI is here to stay. It's seeing much broader adoption everywhere, the use cases are endless, and it's certainly useful in gaming. The entire industry is jumping onto this.

The same could be said for the ray tracing cores. Ray tracing is the holy grail of 3D rendering. And a major engine like UE4 adding support immediately, for everyone to use, will make it relevant, plus EA's Frostbite engine (BF5 etc.) and a bunch of other game studios. Those engines are also used for console games - the non-ray-traced rendering path isn't going away, it just won't look as pretty.

AMD really doesn't have a choice but to do something about both of those. These features are too big to ignore in the hope that they go away.
AI might be an "add-on" for gaming, but there will be loads of workstation tasks that make use of it, and AMD has been trying to get into that field. Ray tracing, I believe, will really make game developers happy, because it makes things just so much nicer.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 20th August 2018 at 23:14.
Old 20th August 2018, 23:22   #9  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Are they too big to be ignored? The next consoles are going to be Vega APUs; do you really see them adding these types of cores to the GPUs? Do they even have the time?

I'm a PC gamer myself, but I know very well that we PC gamers are not that important. Sad but true.

If Nvidia releases GPUs without it, and that is highly possible, the tech will have a hard time.

Even their "demos" with it toggled on/off don't look that great... https://twitter.com/NVIDIAGeForce/st...00171241463808

These engines can do PhysX too, and that at least makes a huge difference, unlike ray tracing. Ray tracing is pretty much like god rays in Fallout 4, or the Nvidia mod Vault 1080 if you like: yes, it looks a lot different in that part of the game, but in the rest of the game people can live totally fine without it.

It currently even reminds me of things like motion blur or depth of field: stuff that makes the image look more realistic but harder to see what is happening, and those are not that popular, or let's say not seen as useful by everyone.
And no, it's not useless, but PhysX is not useless either, and who cares about it? That's my point.

A couple of years back VR was the holy grail of 3D rendering; where is it now?

And for workstations we have Quadro cards.
Old 20th August 2018, 23:52   #10  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by huhn View Post
A couple of years back VR was the holy grail of 3D rendering; where is it now?
Ray tracing has been the holy grail of 3D rendering since its inception almost half a century ago; it just wasn't achievable in real time.
Any movie-style scene that was rendered offline is ray traced, because it's the only way we currently have to make something really look good.

VR really isn't that interesting from the rendering side; nothing much changes for the engine itself. It just has a different FoV and renders two views, one per eye, but the same concepts apply all around.

In a couple of years, when ray tracing becomes the standard for everything, game developers will sigh in relief as they can drop the ugly hacks and cheats used to make scenes look acceptable. That's the real beauty of ray tracing: you just build the scene and it automatically looks accurate. Right now you have to manually fix every reflection and every shadow.
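To make the "shadows just fall out" point concrete, here is a toy sketch (plain Python, not RTX or DXR code) that casts one primary ray, finds the hit point on a sphere, and then casts a single shadow ray toward the light; anything the shadow ray hits means the point is in shadow, with no shadow maps or screen-space tricks involved:

Code:
import math

def normalise(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def hit_sphere(origin, direction, center, radius):
    """Distance t along a normalised ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # a == 1 because the direction is normalised
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # small epsilon avoids self-intersection

sphere  = ([0.0, 0.0, -3.0], 1.0)   # centre, radius
blocker = ([0.0, 2.0, -2.0], 0.5)   # small sphere between the hit point and the light
light   = [0.0, 5.0, -2.0]

ray_origin = [0.0, 0.0, 0.0]
ray_dir = normalise([0.0, 0.0, -1.0])            # primary ray straight down -z

t = hit_sphere(ray_origin, ray_dir, *sphere)
if t is not None:
    hit = [o + t * d for o, d in zip(ray_origin, ray_dir)]
    to_light = normalise([l - h for l, h in zip(light, hit)])
    in_shadow = hit_sphere(hit, to_light, *blocker) is not None   # one extra ray = shadows
    print("hit point:", hit, "in shadow:", in_shadow)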
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 21st August 2018 at 00:13.
Old 21st August 2018, 00:19   #11  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
And Fallout 4 has "ray tracing" in real time; it's not 100% the same, but it comes close. Not a pretty game, but whatever.

It's said that RTX uses a part of DX12, which is good, but on the other hand it is part of Nvidia GameWorks, so AMD cannot use the "same" ray tracing. That is already half a nightmare, even though AMD has its own ray tracing.

If it were something like x86 on the CPU side, where AMD could use the same interface and build its own cores for the same type of ray tracing, I wouldn't see a big problem here. But as it looks now, AMD has to do something else to achieve the same ray tracing, and that's a problem: the results will not be the same. So as a game developer you have to create two or even three types of ray tracing.

And if we now come back to video rendering using NN operations: you currently have to create one precisely for Nvidia and only for Nvidia, because AMD/Intel will not output the same numbers if you ask them to do the same thing.
Old 21st August 2018, 07:07   #12  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
Nvidia is a highly aggressive and highly unethical company; take that as a fact.

The only thing it really cares about is destroying the competition.

There is only one company left, AMD.
All the others are gone because of Nvidia.

The GeForce Partner Program (GPP) was the latest attempt to destroy AMD completely, but fortunately it didn't work.

GameWorks, HairWorks and PhysX are all part of its ambition to eclipse the competition and stand alone, a great monopoly in the GPU arena.

Now, with the RTX cards, Nvidia is trying to sell at HUGE prices things like Tensor Cores, which are in general useless for gaming, and ray tracing cores, which are useless right now, as no game is actually using this technology.

Gaming is no longer a priority for Nvidia, although it still brings in more money than all the other categories together.

Tensor Cores are only used for AI and deep learning/machine learning, and in practice they are used by Quadros and Teslas.

Ray tracing could be useful in the future, but not right now.

But Nvidia asks for your money right now, while the rasterized gaming performance of the new cards certainly does not justify their price.

Do you have a spare arm or leg?

Sell it and buy a 2080 Ti for 1500€.

P.S.

Don't be surprised if you find that upcoming games do not work properly on Nvidia cards older than the Turing generation, or on AMD cards.

It's part of the plan.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all

Last edited by NikosD; 21st August 2018 at 08:46.
Old 22nd August 2018, 11:16   #13  |  Link
Atak_Snajpera
RipBot264 author
 
 
Join Date: May 2006
Location: Poland
Posts: 7,806
If you like path tracing in games, then you should try Quake 2 (do not forget to set 640x480 resolution for the full 90s experience):
http://amietia.com/q2pt.html
Old 26th August 2018, 00:17   #14  |  Link
Rumbah
Registered User
 
Join Date: Mar 2003
Posts: 480
Quote:
Originally Posted by nevcairiel View Post
Any movie-style scene that was rendered offline is ray traced, because it's the only way we currently have to make something really look good.
Well, I guess it's not that easy.
For example, Cars was Pixar's first completely ray-traced movie, if I remember correctly. Everything before was at least a hybrid. And even today full ray tracing is slow, and movies use something like the Unreal Engine for their effects (of course it may also depend on the budget).

And it can be harder for an artist to use. If you want the clean, realistic look, it's fine. But if you want to change the look or adjust the lighting of the scene, you have to work with all the (diffuse) reflections of your light sources. That can make it much harder to get the lighting you want artistically.
Old 26th August 2018, 11:04   #15  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
Nvidia's RTX has nothing to do with pure ray tracing.

You need petaflops, not teraflops, for that.

It's a hybrid and very slow implementation of ray tracing, mixing everything together (software, SDK, Tensor Cores, ray tracing cores and shading CUDA cores).

This generation of the RTX implementation, the Turing cards, should be skipped by gamers, as even the fastest card at 1400-1500€, the 2080 Ti, drops the resolution and frame rate back to 1080p@30fps when RTX is on.

By the time more games optimized for ray tracing show up, if ever, this generation of cards will be obsolete.

I think Nvidia's next architecture at 7 nm is not far away, probably less than a year, and it will be more suitable for ray tracing, if there is a market for such a thing in gaming.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 28th August 2018, 05:51   #16  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
It should be good for TensorFlow hobbyists.
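A hedged sketch of what that could mean in practice (TensorFlow 2.x assumed, nothing specific to these cards): the Keras mixed-precision policy runs the matmuls in FP16, which is the format the Tensor Cores accelerate, while keeping variables in FP32:

Code:
import tensorflow as tf

# Compute in FP16 (Tensor Core friendly), keep variables in FP32 for stability.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(256, activation="relu"),            # FP16 matmul
    tf.keras.layers.Dense(10),
    tf.keras.layers.Activation("softmax", dtype="float32"),   # final activation back in FP32
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()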
__________________
madVR options explained
Old 28th August 2018, 19:04   #17  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The first benchmark leaks are spreading.

The 2080 looks like it will be faster than a 1080 Ti, which means the CUDA cores got quite a lot faster. So it looks like roughly 0-30% more performance per dollar in classic applications. I guess what really matters is where the prices settle a week or two after the release.
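As a back-of-the-envelope version of that 0-30% estimate (the relative-performance numbers below are placeholders, not benchmark results; the only figures taken from the thread are the $699 2080 MSRP and the $799 Founders Edition price, plus the 1080 Ti's $699 launch MSRP):

Code:
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

baseline = perf_per_dollar(1.00, 699)   # GTX 1080 Ti at its $699 launch MSRP

scenarios = [
    ("2080 at $699 MSRP, +30% perf", 1.30, 699),
    ("2080 FE at $799,   +30% perf", 1.30, 799),
    ("2080 at $699 MSRP, same perf", 1.00, 699),
]
for label, perf, price in scenarios:
    change = perf_per_dollar(perf, price) / baseline - 1.0
    print(f"{label}: {change:+.0%} performance per dollar")
# -> +30%, +14% and +0% respectively, which brackets the 0-30% guess above.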
Old 28th August 2018, 21:40   #18  |  Link
NikosD
Registered User
 
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
I'll take the risk of saying that the 2080 will be around 8% faster than the 1080 Ti, and it will probably lose in specific game settings and resolutions.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1)
HEVC decoding benchmarks
H.264 DXVA Benchmarks for all
Old 30th August 2018, 01:28   #19  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
I'm sure its performance in traditional tasks will be a nice upgrade.

Keeping my eyes out for the dream display: a ~27-30" 4K @ 144 Hz HDR IPS LCD with 1000+ nits and G-Sync. Pair that up with a new 2080 and I'll be happy as a clam for the next couple of years.
Old 30th August 2018, 01:43   #20  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
It is not really a question whether these cards will be faster; the 2080 Ti will be faster than the 1080 Ti. The real issue is the price.

If a 1080 Ti costs as much as a 2080 and delivers the same performance, then that is an issue; it still favours the 2080, but it's a weak showing.

But I will make my judgement about two weeks after the release and see what the prices are then.

But I have to say I can't see high-settings 60 fps 4K gaming with these cards, not even the 2080 Ti. Newer games are not going to get easier.

But here is your screen: https://www.acer.com/ac/de/DE/conten...es/predatorx27