20th January 2020, 01:10 | #7361 | Link | |
ffx264/ffhevc author
Join Date: May 2007
Location: /dev/video0
Posts: 1,844
|
Quote:
Also strong-intra-smoothing seems to have a positive effect at keeping noise. Many people are scared by the name, but I find it does improve things a bit. It will also help reduce banding on very clean scenes, like skies or other clean textures, by blending a bit instead of keeping the banding. merange has no effect when using HME, since the latter has its own range parameters you can adjust with --hme-range Last edited by microchip8; 20th January 2020 at 01:22. |
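A minimal sketch of how these options fit together on the x265 command line. The input and output file names and the CRF value are placeholders, and the exact set of HME options depends on your x265 build:

```shell
# strong-intra-smoothing is on by default; --no-strong-intra-smoothing disables it.
# With --hme enabled, --merange is ignored; the per-level search ranges
# come from --hme-range (one value per HME level).
x265 --input sample.y4m --crf 18 \
     --strong-intra-smoothing \
     --hme --hme-range 16,32,48 \
     --output out.hevc
```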
|
20th January 2020, 12:19 | #7362 | Link |
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,731
|
Ben Waggoner was doing some tests with --aq-mode 4; I don't think he got them finished though. He's probably the only one (apart from the devs) who has some kind of insight into how it looks with real-life examples.
For what it's worth, lately I've found --aq-mode 3 to be troublesome. With normal sources originating from film, like the older seasons of X-Files, which contain quite a lot of grain and noise, it works well at around ~0.6 for strength. Then there's something like Suits, or The Killing, which look terrible. The sources are terrible in quality -- lots of macroblocking, which becomes visible when the encoder starts removing the grain that was hiding some of it -- and aq-mode 3 produces a substantially lower average bitrate at the same CRF than aq-mode 1. I don't know why the rate control works this way; The Killing does contain quite a lot of dark scenes, but I was unable to get a good result without switching to --aq-mode 1 at the default strength. I also raised psy-rd to 3.0 and set deblock to 0:0 to try to keep the higher frequencies and avoid banding where the encoder removes too much from the original image. I really scratched my head at the issue the whole weekend. Yes, I did test x264 at --preset veryslow, but it wasn't any better. Sometimes I have used a dirty trick and added a light amount of fake grain with GrainFactory3 before feeding the video to the encoder. I may need to test that too
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon... |
20th January 2020, 16:37 | #7363 | Link | |
ffx264/ffhevc author
Join Date: May 2007
Location: /dev/video0
Posts: 1,844
|
Quote:
As for psy-rd/psy-rdoq, they are meant to keep the original energy of the source as much as possible at higher strengths. So if you have a (very) noisy sample, the bitrate will shoot up in order to preserve the noise/grain. These two settings are set to high values when using --tune grain (psy-rd is set to 4 and psy-rdoq is set to 10). I personally now use psy-rd of 3.2 and psy-rdoq of 15. I don't mind the increase in bitrate, as I aim to preserve the input as much as possible. These two especially have an effect on dark scenes and look better than using AQ mode 2 or 3. I haven't tested AQ mode 4, though Last edited by microchip8; 20th January 2020 at 16:43. |
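Side by side, the tune preset versus the manual override described above would look something like this (input name and CRF are placeholders; --tune grain also changes other options beyond the two psy settings):

```shell
# Preset: --tune grain sets psy-rd 4.0 and psy-rdoq 10.0, among other things.
x265 --input noisy.y4m --crf 18 --tune grain --output tuned.hevc

# Manual: the poster's preferred values, applied without the rest of the tune.
x265 --input noisy.y4m --crf 18 --psy-rd 3.2 --psy-rdoq 15 --output manual.hevc
```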
|
20th January 2020, 16:45 | #7364 | Link | ||
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
|
||
20th January 2020, 16:47 | #7365 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
|
|
20th January 2020, 17:01 | #7366 | Link |
ffx264/ffhevc author
Join Date: May 2007
Location: /dev/video0
Posts: 1,844
|
I have no need for 2-pass VBR. I only do CRF encoding and that's what matters to me, but I also tested 2-pass as you recommended in a post a week or so ago. Not much improvement. I find psy-rd/rdoq delivers better results than these AQ modes
|
20th January 2020, 19:25 | #7367 | Link |
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,731
|
If you want to try things, I uploaded the sample clip I used to test: https://drive.google.com/open?id=1Ar...5cr6UNO1WjfV7- . Take a look at the flat background in the shots with the cop, it's ugly in the source and gets much worse after encoding.
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon... |
20th January 2020, 19:47 | #7368 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
Right. But if you want to figure out the optimum efficiency for a CRF encode, comparing the features in 2-pass VBR shows you how features compare at the same file size. Once you nail down the optimum parameters, then you figure out what CRF is optimal for you with those other parameters. |
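The workflow being suggested, sketched as commands. The bitrate, input, and output names are arbitrary placeholders; the point is that 2-pass holds the file size constant while one feature is varied:

```shell
# Pass 1/2 at a fixed bitrate for each candidate setting:
x265 --input clip.y4m --pass 1 --bitrate 4000 --aq-mode 1 --output /dev/null
x265 --input clip.y4m --pass 2 --bitrate 4000 --aq-mode 1 --output aq1.hevc

x265 --input clip.y4m --pass 1 --bitrate 4000 --aq-mode 3 --output /dev/null
x265 --input clip.y4m --pass 2 --bitrate 4000 --aq-mode 3 --output aq3.hevc

# Compare aq1.hevc and aq3.hevc at identical size; once the winning
# parameters are nailed down, go back to --crf for production encodes.
```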
|
20th January 2020, 19:51 | #7369 | Link | |
ffx264/ffhevc author
Join Date: May 2007
Location: /dev/video0
Posts: 1,844
|
|
|
26th January 2020, 22:41 | #7370 | Link |
Registered User
Join Date: Jul 2011
Posts: 1,121
|
Annoying question but might as well ask it.
It's been a long time since I checked out x265, probably around two years ago, and back then x264 was better except for edge cases, mostly very low bitrates and very high resolutions. And even then it looked like crap anyway, so even if x265 won it didn't really matter (for me). How is it nowadays (I know it's a vague question)? If you have, say, a 1080p video and use crf=16 (which I would say results in high quality for x264, at least), would x265 achieve the "same quality" in less space? Or is it only at very high resolutions, like 4K and beyond, that x265 can outdo it because of how it works? I know it sounds like I'm saying x265 is crap compared to x264, but that's not what I'm trying to say; it's just hard to ask the comparison question without making it sound that way. I'm not dismissing x265 or the H.265 standard. It was obviously made for a good reason, and the encoder is surely aiming to be as good as possible compared to what currently exists (either for all cases or some). I just don't know much about it, except that the standard was made with very high resolutions in mind, as H.264 breaks down quickly there except at very high bitrates (which is usually the "throw money at the problem" solution ;P). Thanks |
27th January 2020, 17:41 | #7371 | Link |
結城有紀
Join Date: Dec 2003
Location: NJ; OR; Shanghai
Posts: 894
|
zerowalker, I can't answer your question but I do have a question for you. Were you using 10-bit and were you comparing 10-bit x265 against 8-bit x264?
We / I switched to x265 simply because we wanted to use 10-bit and AVC High10 had some compatibility issues with hardware players. |
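For an apples-to-apples comparison along the lines of this question, the bit depth should match on both sides. A sketch, with placeholder file names and CRF values (and note x264's 10-bit mode is the High10 profile mentioned above):

```shell
# x265, 10-bit output (requires a 10-bit-capable build):
x265 --input clip.y4m --output-depth 10 --crf 18 --output out265.hevc

# x264, High10 profile for a 10-bit comparison point:
x264 --profile high10 --crf 16 -o out264.mkv clip.y4m
```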
27th January 2020, 19:46 | #7372 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
The biggest problem with x265 years ago was how it handled grain, which is much improved now. |
|
5th February 2020, 13:43 | #7375 | Link |
German doom9/Gleitz SuMo
Join Date: Oct 2001
Location: Germany, rural Altmark
Posts: 6,781
|
The x265 source repo will migrate to Git and move to https://bitbucket.org/multicoreware/x265_git/ (Bitbucket will drop Mercurial support in June 2020).
Previously I used the combo of "hg pull" and "hg update" to keep a local working directory up to date. What would be the recommended Git commands? |
5th February 2020, 13:59 | #7376 | Link | |
SuperVirus
Join Date: Jun 2012
Location: Antarctic Japan
Posts: 1,351
|
Quote:
2) downloading with Mercurial = 35.4 MB, very fast; downloading with git = 290 MB, very slow; x265_git\.git\objects\pack\pack*.pack = 276 MB — ¿what the devil is that? Last edited by filler56789; 5th February 2020 at 14:57. Reason: typo |
|
5th February 2020, 16:05 | #7377 | Link | |
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,731
|
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon... |
|
5th February 2020, 22:19 | #7378 | Link | |
結城有紀
Join Date: Dec 2003
Location: NJ; OR; Shanghai
Posts: 894
|
Quote:
Pull updates from remote to local is "git fetch". Pull updates from local to working directory is "git checkout". Git is much more flexible (and thus harder to use), which is why I always recommend people use a GUI to get started. There are more concepts in Git than in Hg, and it takes time to get used to them. |
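A rough hg-to-git mapping for the "hg pull" + "hg update" workflow asked about above. The branch name is an assumption; check what the migrated repo actually uses as its default branch:

```shell
git clone https://bitbucket.org/multicoreware/x265_git  # one-time, like "hg clone"
cd x265_git
git fetch origin         # roughly "hg pull": update the local repo from the remote
git merge origin/master  # roughly "hg update": bring the working directory up to date
# or both steps in one:
git pull
```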
|
5th February 2020, 23:21 | #7379 | Link |
SuperVirus
Join Date: Jun 2012
Location: Antarctic Japan
Posts: 1,351
|
Just re-checking...
the latest cloned x265/.hg directory = 23.2 MB Apparently the "conversion" of x265's Hg repository into git included or added some TONS of *garbage*... Last edited by filler56789; 5th February 2020 at 23:22. Reason: clarity |
6th February 2020, 00:36 | #7380 | Link |
結城有紀
Join Date: Dec 2003
Location: NJ; OR; Shanghai
Posts: 894
|
I don't understand the point of mentioning the size of the repository.
I use Git not to save 200 MB of space, but to manage the source code. I'd happily trade a few gigabytes of my "precious?" hard drive space for a tool that works better for me. To answer your previous question, a git object pack is a compressed package of all kinds of objects used by git, including histories, files, deltas, etc. If network activity is a real concern to you, you can limit the depth of the repository you download. You can choose to only download and store the most recent 500 commits, for example, and still work on that as long as you don't need access to history more than 500 commits back. |
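The shallow-clone idea mentioned above, as commands. The depth values here are arbitrary examples:

```shell
# Fetch only the most recent 500 commits instead of the full history:
git clone --depth 500 https://bitbucket.org/multicoreware/x265_git

# Deepen later if older history turns out to be needed:
git -C x265_git fetch --deepen=1000
```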