Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
7th December 2024, 23:07 | #32061 | Link | |
Registered User
Join Date: Aug 2005
Posts: 1,266
|
Quote:
I'm glad your expectations have been met... BTW, how's the Moose Stew...? |
|
8th December 2024, 04:13 | #32063 | Link | |
Registered User
Join Date: May 2007
Location: Wisconsin
Posts: 2,207
|
Quote:
While I was able to get around the issue this time, I still want to be able to create a UHD from 1080p HEVC videos in order to avoid re-encoding. Each re-encode introduces minor artifacts; the less re-encoding, the better. |
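To illustrate the "no re-encode" idea: if a tool accepts the existing stream, it can be remuxed into a new container without touching the bitstream. This is a generic sketch using ffmpeg's stream copy (filenames are hypothetical; BD-RB does its own muxing internally):

```shell
# Hypothetical filenames. "-c copy" passes the video/audio bitstreams
# through untouched, so no re-encode (and no new artifacts) occurs.
INPUT="movie_1080p_hevc.mkv"
OUTPUT="movie_remux.m2ts"
REMUX_CMD="ffmpeg -i $INPUT -map 0 -c copy $OUTPUT"
echo "$REMUX_CMD"
```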
|
9th December 2024, 01:22 | #32068 | Link |
Registered User
Join Date: Aug 2005
Posts: 1,266
|
Lovely... I have NO bloody idea what it does (except doesn't it have something to do with using your GPU to do the encoding? I don't know if I have the right kind. Radeon, I think), but I'll check it out.
If that is the case, I wonder what the benefit is rather than just using the CPU as usual? I've been pretty happy with the speed of the encodes now that I have a faster CPU and more memory, so I don't get why there is some whole other way to do it...? |
9th December 2024, 04:20 | #32073 | Link |
Registered User
Join Date: Aug 2005
Posts: 1,266
|
Kinda figured that too... So, does it use the exact same parameters and yield the exact same quality as you would get using x264 settings and the CPU? Would the only difference then just be the encoding time?
|
9th December 2024, 09:06 | #32075 | Link | |
Registered User
Join Date: Jul 2006
Posts: 556
|
Quote:
In terms of bitrate efficiency, which is better, software or hardware encoding? For the best bitrate efficiency (quality per bitrate), software encoding tends to be superior due to its flexibility and ability to use advanced encoding techniques. (14 Oct 2024)

Is it better to encode with the GPU or the CPU? You almost always want to use GPU encoding for recordings if you have access to it, because it minimizes CPU overhead. However, CPU encoding technically produces more efficient compression (better quality at a given bitrate, or a lower bitrate at a given quality). (19 Dec 2023)

Is software or GPU rendering better? The main attraction of software rendering is capability. While GPU rendering is generally limited to the hardware's present capabilities, software rendering is fully programmable, can perform any algorithm, and can scale across many CPU cores and several servers. |
|
9th December 2024, 12:03 | #32076 | Link |
Registered User
Join Date: May 2007
Location: Wisconsin
Posts: 2,207
|
Different parameters. As indicated by varekai's posting, software overall gives you more control, but it takes a lot longer. Depending on the job I'll still do some x264 software encoding despite the time. I'm not even set up to do x265 software encoding.
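To make the "different parameters" point concrete, here are two hypothetical constant-quality ffmpeg command lines for the same source: one software x264 encode on the CPU, one NVENC encode on an Nvidia GPU. Filenames and quality values are made up for illustration; note the flags themselves differ because the two encoders expose different options:

```shell
# Hypothetical filenames and quality values.
# CPU: libx264 uses -preset (ultrafast..placebo) and -crf for quality.
CPU_CMD="ffmpeg -i source.mkv -c:v libx264 -preset slow -crf 18 cpu.mkv"
# GPU: h264_nvenc uses its own presets (p1..p7) and -rc/-cq instead.
GPU_CMD="ffmpeg -i source.mkv -c:v h264_nvenc -preset p7 -rc vbr -cq 18 gpu.mkv"
echo "$CPU_CMD"
echo "$GPU_CMD"
```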
|
10th December 2024, 17:00 | #32079 | Link | |
Moderator
Join Date: Oct 2001
Posts: 21,107
|
Quote:
I personally use the GPU to do all my encoding and use constant quality mode (CRF [x264/5], QVBR [Nvidia], or ICQ [Intel]). That way I know exactly what quality level I will get. The only downside is that the size isn't easy to predict, as it depends on the quality you choose and the complexity of the source. But in BD-RB that is taken care of by the program with sampling and size prediction. And, as mentioned earlier by MrVideo -- GPU encoding is many, many times faster than using the CPU. Last edited by jdobbs; 10th December 2024 at 17:10. |
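The sampling-and-prediction idea is simple extrapolation: encode a short sample at the chosen quality level, then scale its size up to the full runtime. This is a rough sketch of the arithmetic (made-up numbers, not BD-RB's actual code):

```shell
# Made-up numbers: a 10-second sample from a 5400-second movie
# encodes to 12 MB at the chosen quality level. Extrapolate the
# final size by scaling sample size to the full runtime.
SAMPLE_SECONDS=10
SAMPLE_MB=12
MOVIE_SECONDS=5400
PREDICTED_MB=$(( SAMPLE_MB * MOVIE_SECONDS / SAMPLE_SECONDS ))
echo "Predicted size: ${PREDICTED_MB} MB"   # Predicted size: 6480 MB
```

In practice a program would take several samples from different parts of the movie, since complexity (and therefore size at constant quality) varies across scenes.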
|