Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.
Old 27th November 2024, 21:08   #20961  |  Link
rlev11
Registered User
 
Join Date: Aug 2020
Location: Pennsylvania
Posts: 150
Quote:
Originally Posted by stryker412 View Post
How do you think the 5700X or 5700X3D might do? I've had a few people suggest I just upgrade my 3700X to the 5700X and call it a day without having to do a full build.
Seeing how there is very little difference between the 3900X and 5900X, I doubt there would be much difference between the 3700X and 5700X in encoding performance. Remember, encoding performance depends a lot on core and thread count, so maybe go from the 3700X to a 5900X or 5950X to get more cores. You will have to account for additional cooling, though: either an AIO or a Noctua NH-D15. That also brings the case into play, whether it can handle an AIO or the 165 mm cooler height of the Noctua.

Last edited by rlev11; 27th November 2024 at 22:30.
Old 27th November 2024, 21:22   #20962  |  Link
rlev11
Registered User
 
Join Date: Aug 2020
Location: Pennsylvania
Posts: 150
Since we have been talking about power usage lately, I found something interesting. Instead of changing the maximum CPU state in the power plan to 99%, leave that at 100%. Then in regedit go here:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\54533251-82be-4824-96c1-47b60b740d00\be337238-0d82-4146-a960-4f3749d470c7 and set the "Attributes" value (DWORD) to 2.

Now there is a new setting in advanced power settings (in the legacy Control Panel power options): "Processor performance boost mode". Set it to "Disabled" and CPU power drops considerably.
This works for both AMD and Intel CPUs and forces everything to run at the base clock speed (setting 99% didn't do anything for me on my Intel systems).
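The same tweak can be scripted instead of done by hand. A sketch for an elevated Command Prompt (the GUIDs are the ones quoted above; PERFBOOSTMODE is powercfg's built-in alias for the boost-mode setting, and 0 means Disabled):

```shell
:: Unhide "Processor performance boost mode" in advanced power settings
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\54533251-82be-4824-96c1-47b60b740d00\be337238-0d82-4146-a960-4f3749d470c7" /v Attributes /t REG_DWORD /d 2 /f

:: Set boost mode to Disabled (0) on the active scheme, for both AC and DC power
powercfg /setacvalueindex scheme_current sub_processor PERFBOOSTMODE 0
powercfg /setdcvalueindex scheme_current sub_processor PERFBOOSTMODE 0
powercfg /setactive scheme_current
```

Setting the value back to 2 (or 1) re-enables boost without touching the BIOS.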

I have found that on my AMD boxes there is little performance difference with a considerable power decrease. On the Intel systems, since they have a much lower base clock (a larger spread from base to boost speed), it hampers performance a bit more. On my i7-14700, disabling boost dropped the P-cores to 2 GHz and the E-cores to 1.5 GHz, but CPU power dropped from 175 W to 35 W; I lost about 30-40% performance.

Just something else to consider if looking at lowering power usage quickly without messing with BIOS settings.

If I get some time this weekend, I'll run some numbers comparing the same video/chunk on each machine with boost enabled and disabled, work out fps versus watts used, and see where things shake out.

Last edited by rlev11; 27th November 2024 at 21:59.
Old 29th November 2024, 16:24   #20963  |  Link
Ryushin
Registered User
 
Ryushin's Avatar
 
Join Date: Mar 2011
Posts: 454
Quote:
Originally Posted by Retired@55 View Post
Why do ppl buy multi core CPU's, then hobble them just to save some energy / money ??

Why not buy a lesser CPU, and run it at 100%, what's the point in having a 16C CPU, that is hobbled down to be equivalent to a 12C ??
The upgraded processor is able to process more FPS. Lowering the max speed by 5% results in a processor that is more efficient per clock cycle and, more importantly, it will extend the life of the processor by not pushing it to its thermal limits for days or weeks on end.

And as someone else said, their machines won't sound like a jet turbine just to gain another 5% performance at double the power usage, most likely killing the processor sooner rather than later.

Last edited by Ryushin; 30th November 2024 at 14:09.
Old 2nd December 2024, 13:22   #20964  |  Link
ReinerSchweinlin
Registered User
 
Join Date: Oct 2001
Posts: 467
Quote:
Originally Posted by rlev11 View Post
This is the total package power as reported by CPUID HWMonitor when each server is in the middle of a chunk of a 4K encode. The only thing I do on the Ryzens is set a thermal limit in the PBO section of the BIOS: the 9000 and 7000 series to 85°C, and the 5000 and 3000 series to 75°C.

First number is the fps for the chunk; first wattage is total package and second is core/IA core watts:

CPU        fps    package  core/IA
9950X      20.02  200 W    146 W
7950X      18.30  190 W    144 W
7900X      17.37  170 W    125 W
7700X      11.26  128 W     95 W
5950X      11.15  147 W    104 W
5900X       9.07  148 W     95 W
3950X       9.43  112 W     85 W
3900X       9.78  130 W    101 W
i7-14700   13.98  175 W    170 W
i5-14500   12.44  133 W    130 W
i5-13500   12.64  119 W    117 W
i5-12600K   9.59  115 W    112 W
Thanx, this is really helpful. Without doing the math, at first glance the 7900X seems like the performance-per-watt winner.
Old 5th December 2024, 12:59   #20965  |  Link
hardkhora
Registered User
 
Join Date: Mar 2014
Posts: 8
Is there a way to force files shorter than 60 seconds to still be encoded distributively?
The problem I'm having is that my main RipBot controlling computer, which I don't usually encode with but like to use as the central server, has an iGPU (UHD 770).
Any video where KNLMeansCL is used gets extra blur/halos around the edges of things when encoded with the UHD 770, but with KNLMeansCL off, or when I use any of my other computers with discrete GPUs, I don't have that issue.
Example of the issue:
https://imgur.com/a/X09dQrh
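One thing that might be worth trying (a hypothetical workaround, not from the thread): KNLMeansCL takes device_type and device_id parameters, so the script could pin the filter to a specific OpenCL device instead of letting it auto-select the iGPU. A sketch, with placeholder filter strengths:

```avisynth
# Hypothetical: force KNLMeansCL onto a chosen OpenCL device rather than
# auto-selecting the UHD 770. d/a/h values here are placeholders; check
# the plugin docs for your machine's device numbering.
KNLMeansCL(d=1, a=2, h=1.2, device_type="GPU", device_id=0)
```

Whether RipBot's generated scripts can be edited to carry this through to the encoding servers is another question, but it may help isolate whether the halos are really an iGPU OpenCL issue.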

Last edited by hardkhora; 5th December 2024 at 13:14.
Old 5th December 2024, 14:39   #20966  |  Link
ReinerSchweinlin
Registered User
 
Join Date: Oct 2001
Posts: 467
Quote:
Originally Posted by hardkhora View Post
Is there a way to force files shorter than 60 seconds to still be encoded distributively?
The problem I'm having is that my main RipBot controlling computer, which I don't usually encode with but like to use as the central server, has an iGPU (UHD 770).
Any video where KNLMeansCL is used gets extra blur/halos around the edges of things when encoded with the UHD 770, but with KNLMeansCL off, or when I use any of my other computers with discrete GPUs, I don't have that issue.
Example of the issue:
https://imgur.com/a/X09dQrh

Do you really need NLMeans noise reduction on this footage? It looks fine to me... I usually use CPU degraining on anime; NLMeans rarely does a better job.
Old 5th December 2024, 18:18   #20967  |  Link
hardkhora
Registered User
 
Join Date: Mar 2014
Posts: 8
Quote:
Originally Posted by ReinerSchweinlin View Post
Do you really need NLMeans noise reduction on this footage? It looks fine to me... I usually use CPU degraining on anime; NLMeans rarely does a better job.
I agree, I don't need it for that footage; that was just to illustrate the problem.
I was trying to find out whether there is a way to force files shorter than 60 seconds to still be encoded distributively.

I also tried updating the iGPU driver and that didn't make a difference.
Old 5th December 2024, 19:28   #20968  |  Link
ReinerSchweinlin
Registered User
 
Join Date: Oct 2001
Posts: 467
Quote:
Originally Posted by hardkhora View Post
I agree, I don't need it for that footage; that was just to illustrate the problem.
I was trying to find out whether there is a way to force files shorter than 60 seconds to still be encoded distributively.

I also tried updating the iGPU driver and that didn't make a difference.
Hm, I just looked up the chunk size (I guess you tried that too) and found 1 minute as the smallest chunk...
If you don't run an encoding server on the master PC, shouldn't it pass the one chunk to one of the network encoders? Have you tried that?
Old 5th December 2024, 20:49   #20969  |  Link
hardkhora
Registered User
 
Join Date: Mar 2014
Posts: 8
Quote:
Originally Posted by ReinerSchweinlin View Post
Hm, I just looked up the chunk size (I guess you tried that too) and found 1 minute as the smallest chunk...
If you don't run an encoding server on the master PC, shouldn't it pass the one chunk to one of the network encoders? Have you tried that?
When the file can't be chunked (i.e. is below 60 seconds), the process bypasses distributed encoding and encodes it locally via the main RipBot window, the same as if you had turned off distributed encoding altogether. So even with no local encoding server, it still runs locally.
I tried setting the chunk size to 0, but it still won't pass the job to the encoding servers as I'd expect.
Old 5th December 2024, 22:32   #20970  |  Link
ReinerSchweinlin
Registered User
 
Join Date: Oct 2001
Posts: 467
Ah, OK... sorry, I can't help then.
Old 7th December 2024, 08:31   #20971  |  Link
bar72
Registered User
 
Join Date: Oct 2024
Location: Scotland
Posts: 4
Quote:
Originally Posted by Retired@55 View Post
Why do ppl buy multi core CPU's, then hobble them just to save some energy / money ??

Why not buy a lesser CPU, and run it at 100%, what's the point in having a 16C CPU, that is hobbled down to be equivalent to a 12C ??
It's a bit like having a 2.0-litre instead of a 1.6-litre diesel engine in my car.

It doesn't have to work as hard to sit at the speed limit, which puts less strain on the engine overall.
Old 7th December 2024, 19:58   #20972  |  Link
rlev11
Registered User
 
Join Date: Aug 2020
Location: Pennsylvania
Posts: 150
So I got a chance to run a full test of FPS and wattage on my entire encoding farm with CPU boost enabled and disabled. I did separate runs with a 4K source (3840x1600, light degraining) and a 1920x1080 source with heavy degraining, just to see how they compare. The link below is the spreadsheet with the results. For each run I took the fps from the distributed encoding window, and for each CPU it was the same chunk number, so the numbers match up. I also timed each run and saw about a 10-12% increase in total encoding time without boost, which is close to what the fps difference is. I saw about a 48% total drop in combined power usage.
So my math may be fuzzy on the end result, but it looks like it is much more cost-efficient to run at least the AMD processors with boost disabled, since their spread is minimal. Using my raw numbers: if an encode runs for 1 hour at full power, I use approximately 106,380 watt-minutes in total. Running with boost off, even if it takes 15% longer (69 minutes), I use 63,135 watt-minutes for the same encode.
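Those totals only work out if they are read as watt-minutes (summed farm package power × minutes): 106,380 / 60 min ≈ 1,773 W with boost and 63,135 / 69 min ≈ 915 W without. A small sketch of the comparison (the 1,773 W and 915 W farm totals are back-calculated here, not stated in the post):

```python
# Energy comparison for the whole farm (figures back-calculated from
# the totals in the post; watt-minutes = average watts * minutes).
boost_on  = 1773 * 60   # full boost: ~1773 W for 60 min
boost_off = 915 * 69    # boost off: ~915 W for 69 min (15% longer)

savings = 1 - boost_off / boost_on
print(f"{boost_on} vs {boost_off} watt-minutes -> {savings:.0%} energy saved")
```

Note the energy saving (~41%) comes out smaller than the 48% drop in instantaneous power, because the boost-off encode runs longer.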

I am not advocating anything with this; I was just curious about the difference and how it all fits together. Plus, as others have mentioned, your computers don't sound like a turboprop taking off.

https://docs.google.com/spreadsheets...#gid=983798900

Last edited by rlev11; 7th December 2024 at 20:03.
Old 8th December 2024, 00:44   #20973  |  Link
supersnakeyez
Registered User
 
Join Date: Sep 2012
Posts: 23
I'm running RB 1.27.4, and for some reason every time I try to encode a movie from Netflix it encodes as normal, but when played back on the computer or from a portable hard drive the movie keeps skipping and looping, which makes watching impossible. It only happens with RB for some strange reason; the same file encoded with other software works fine.
Old 8th December 2024, 21:30   #20974  |  Link
chainring
Registered User
 
chainring's Avatar
 
Join Date: Sep 2006
Posts: 183
Quote:
Originally Posted by rlev11 View Post
I also timed each run and saw about a 10-12% increase in total encoding time without boost, which is close to what the fps difference is. Saw about a 48% total drop in combined power usage.
Almost half the power for only up to a 12% increase in encoding time. That's an insane difference. For those of us who live where electricity isn't cheap, it's a seriously welcome difference.
Old 9th December 2024, 11:11   #20975  |  Link
dipais
Registered User
 
Join Date: Mar 2024
Posts: 9
Quote:
Originally Posted by Atak_Snajpera View Post
check log. (right click on job and then SHOW LOG)
Hello Atak Snajpera

I see from the logs shown in MediaInfo that RipBot has different default settings than x265:

Code:
x265 default   : --ctu 64 --max-tu-size 32 --qg-size 32 --merange 57 --aq-mode 3
RipBot default : --ctu 16 --max-tu-size 16 --qg-size 16 --merange 9 --aq-mode 2
I really hope you can explain why you chose settings different from the x265 defaults. What is the impact? What are the effects, and what advantages do RipBot users get from these settings?
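For anyone wanting to measure the impact themselves, a hypothetical side-by-side test might encode the same chunk twice and compare file size and SSIM. File names and CRF below are placeholders, not from the thread; the flags are the ones listed above:

```shell
# Hypothetical A/B test of RipBot's overrides vs. x265's own defaults
x265 --input chunk.y4m --crf 20 --ssim \
     --ctu 16 --max-tu-size 16 --qg-size 16 --merange 9 --aq-mode 2 \
     --output ripbot_settings.hevc
x265 --input chunk.y4m --crf 20 --ssim \
     --output x265_defaults.hevc
```

In general, the smaller CTU size and motion-search range favor encoding speed over compression efficiency, which would fit a distributed-encoding GUI, but that is an assumption; only the author can say why these values were chosen.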

Cheers,
D
Old 10th December 2024, 00:20   #20976  |  Link
supersnakeyez
Registered User
 
Join Date: Sep 2012
Posts: 23
Quote:
Originally Posted by Retired@55 View Post
Might need to upload a sample...

Are you using std RB or an old PD build, and what app did you use to have the encode work OK ??
I'm using the latest stable version of RipBot, and the program that works flawlessly is HandBrake. I've had to resort to HandBrake ever since RB kept crapping out on my encodes.
Old 10th December 2024, 03:34   #20977  |  Link
supersnakeyez
Registered User
 
Join Date: Sep 2012
Posts: 23
Quote:
Originally Posted by Retired@55 View Post
"Stable" version ???

I haven't been doing any encoding of late, and there have been a couple of updates to RB that I haven't tried yet,
but there were some problems with the build I was using months ago, and I had to swap a few things around for it to work like it should (or should I say, "to my liking")... sadly.

Support for RipBot is sadly not what it used to be.

I'll get back into it again, soon, no doubt.

So, the latest build of HB, 1.9.0 ??....never doubt good old Handbrake

Good luck.
yea HB has been working like a charm and gets updates regularly.
Old 10th December 2024, 19:35   #20978  |  Link
Boulder
Pig on the wing
 
Boulder's Avatar
 
Join Date: Mar 2002
Location: Finland
Posts: 5,812
I did a little power consumption test on a 5950X just for fun, using my chunked encoder to run an analysis step.

With PBO enabled and Curve Optimizer-tweaked cores, the process took 1h 45min 47s.
With the stock multiplier and PBO disabled (i.e. all cores constantly at 3600 MHz), the process took 1h 50min 22s.

The average power consumption of the whole PC was 416.7 W with PBO enabled and 361.8 W at stock clocks. These were measured using a Shelly smart plug at 5-second resolution, stored to an SQL database.
So for about a 5% loss in performance, a drop of roughly 13% in average power draw (about 9% less energy for the whole job), which is quite nice if, for example, you don't have excess solar power to utilize.
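The numbers above can be turned into total energy per run like this (durations and average wall-plug power are taken from the post; the kWh figures are derived here):

```python
# Rough energy check of the two runs above.
pbo_s   = 1*3600 + 45*60 + 47      # 1h 45min 47s with PBO enabled
stock_s = 1*3600 + 50*60 + 22      # 1h 50min 22s at stock clocks

pbo_kwh   = 416.7 * pbo_s   / 3.6e6   # watts * seconds -> kWh
stock_kwh = 361.8 * stock_s / 3.6e6

saved = 1 - stock_kwh / pbo_kwh
print(f"PBO: {pbo_kwh:.3f} kWh, stock: {stock_kwh:.3f} kWh, saved: {saved:.1%}")
```

The per-job energy saving is smaller than the drop in average power because the stock run takes longer to finish the same work.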
__________________
And if the band you're in starts playing different tunes
I'll see you on the dark side of the Moon...
Old 14th December 2024, 02:40   #20979  |  Link
chainring
Registered User
 
chainring's Avatar
 
Join Date: Sep 2006
Posts: 183
Quote:
Originally Posted by Retired@55 View Post
Fortunately, some of us here have "excessive solar power", and some of us have batteries as well, so it is of little interest or consequence.

Let them run @ 100%.
Neat. So don't comment, TDS.
Old 14th December 2024, 02:59   #20980  |  Link
chainring
Registered User
 
chainring's Avatar
 
Join Date: Sep 2006
Posts: 183
Quote:
Originally Posted by Retired@55 View Post
tds..
Yeah, your previous user name. What was the one before that? We all know.