6th February 2020, 00:43   #7381
qyot27 | Join Date: Nov 2005 | Location: Florida | Posts: 1,419
Quote: Originally Posted by LigH
The x265 source repo will migrate to Git and move to https://bitbucket.org/multicoreware/x265_git/ (Bitbucket will drop Mercurial support in June 2020).

Previously I used the combination of "hg pull" and "hg update" to keep a local working directory up to date. Which Git commands would be recommended?
Easy:
Code:
git pull
(as mentioned above)

More precise (e.g. when following more than one upstream branch):
Code:
git fetch <remote>
git merge <remote>/<branch>

which would look like:
Code:
git fetch origin
git merge origin/master
(to update the master branch to its current remote state)


If you don't care about being able to recover the file or commit history and just want raw download speed:
Code:
git clone --depth 1 https://bitbucket.org/multicoreware/x265_git
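Should the history turn out to be needed after all, a shallow clone can be deepened later; a minimal sketch (this is where the deferred network cost gets paid):
Code:
# fetch the rest of the history, turning the shallow clone into a full one
git fetch --unshallow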

Git is, as others have noted elsewhere, basically a content-addressable file system that happens to work great as a DVCS. Mercurial is just a DVCS.
6th February 2020, 02:00   #7382
filler56789 | Join Date: Jun 2012 | Location: Antarctic Japan | Posts: 1,351
Quote: Originally Posted by qyot27
...
If you don't care about being able to recover the file or commit history and just want raw download speed:
Code:
git clone --depth 1 https://bitbucket.org/multicoreware/x265_git


Quote: Originally Posted by MeteorRain
I don't understand the point of mentioning the size of the repository.
I use Git not to save 200 MB of space, but to manage the source code.
I'd happily trade a few gigabytes of my "precious?" hard drive space for a tool that works better for me.
Because the size of the hg repository is much smaller than the git one?
Because I'm old and therefore much more impatient than you?
Because I think that "huge and cheap" storage space is not an excuse for inefficiency?

Quote: Originally Posted by MeteorRain
To answer your previous question, a git object pack is a compressed package of all the kinds of objects Git uses, including histories, files, deltas, etc.
Then it compresses much worse than hg, or at least it seems so...
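For what it's worth, if pack size is the complaint, Git can sometimes repack a clone more tightly than the server delivered it; a sketch, with no promise it approaches hg's numbers:
Code:
# recompute deltas aggressively and drop unreachable objects; slow, but can shrink .git
git gc --aggressive --prune=now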

Quote: Originally Posted by MeteorRain
If network activity is a real concern to you, you can limit the depth of the repository you download. You can choose to download and store only the most recent 500 commits, for example, and still work on that, as long as you don't need access to history more than 500 commits back.
Yes, my interest in x265's source code is only to download and compile its most recent state; I have no use for a 300 MB "compressed" archive that is not the code I intend to compile.
6th February 2020, 06:15   #7383
MeteorRain | Join Date: Dec 2003 | Location: NJ; OR; Shanghai | Posts: 894
Then do a shallow clone, as we posted; a sketch follows.
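For illustration, a depth-limited clone along the lines described above (the 500-commit figure is from the earlier post; adjust to taste):
Code:
# clone only the most recent 500 commits' worth of history
git clone --depth 500 https://bitbucket.org/multicoreware/x265_git
# later, extend the history window by another 500 commits if needed
git fetch --deepen=500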
We are developers, and functionality is much more important to us than a few hundred megabytes.
No, huge and cheap storage is not an excuse for inefficiency; but this is not functional inefficiency, only "space" inefficiency.
You don't call ffmpeg garbage because it's a 60 MB binary.

BTW, compared to a few seconds of downloading, the actual compiling is much more time-consuming.
I'm impatient too, but each time I compile my x265 mod it takes a good 2 hours on a Xeon E5 server.
__________________
Projects
x265 - Yuuki-Asuna-mod Download / GitHub
TS - ADTS AAC Splitter | LATM AAC Splitter | BS4K-ASS
Neo AviSynth+ filters - F3KDB | FFT3D | DFTTest | MiniDeen | Temporal Median
7th February 2020, 14:00   #7384
LigH | Join Date: Oct 2001 | Location: Germany, rural Altmark | Posts: 6,753
Since the media-autobuild suite switched to the x265 Git repository, I have had trouble retrieving the sources. It seems that git receives all the expected data, but then handles the closing connection as if it had been interrupted by an error. Here is the output from an interactive MSYS2/MinGW console:

Code:
$ git clone https://bitbucket.org/multicoreware/x265_git.git x265_git-git
Cloning into 'x265_git-git'...
remote: Counting objects: 88154, done.
remote: Compressing objects: 100% (88115/88115), done.
remote: Total 88154 (delta 2853), reused 85263 (delta 0)
error: RPC failed; curl 56 OpenSSL SSL_read: Connection closed abruptly, errno 0 (Fatal because this is a curl debug build)
Receiving objects: 100% (88154/88154), 277.47 MiB | 1.58 MiB/s, done.
Resolving deltas: 100% (2853/2853), done.
Retrieving Git repositories from other sources works without error, so I wonder whether Bitbucket handles some details differently. If so ... I may not be able to report this issue to Bitbucket, as I am not the Atlassian customer responsible for the x265 repo.
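Two client-side workarounds commonly suggested for "RPC failed; curl 56" on large HTTPS clones, offered here only as untested possibilities (both are standard git config keys):
Code:
# enlarge the buffer git uses for HTTPS transfers (default is 1 MiB)
git config --global http.postBuffer 524288000
# force HTTP/1.1 in case the abort is HTTP/2-related (requires git 2.18+)
git config --global http.version HTTP/1.1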
__________________

New German Gleitz board
MediaFire: x264 | x265 | VPx | AOM | Xvid
7th February 2020, 15:03   #7385
LigH
I'll see if this Windows 7 hotfix has any impact...

Oh, it has been discontinued; I would need Windows 10 to solve it that way.

Or is it a different issue? Mine reports errno 0, not SysCall 10054.

Cloning x264 and the dependent ffmpeg/ffms2/lsmash works without aborting.

After x264 was retrieved (I had deleted build/x264-git), x265 was suddenly retrieved too? ... Strange technology.
7th February 2020, 16:20   #7386
Barough | Join Date: Feb 2007 | Location: Sweden | Posts: 480
No issues with the change to the x265 Git repository; everything runs smoothly here.
__________________
Do NOT re-post any of my Mediafire links. Download & re-host the content(s) if you want to share it somewhere else.
8th February 2020, 00:19   #7387
foxyshadis | Join Date: Nov 2004 | Location: Lost | Posts: 9,558
Quote: Originally Posted by LigH
I'll see if this Windows 7 hotfix has any impact...

Oh, it has been discontinued; I would need Windows 10 to solve it that way.

Or is it a different issue? Mine reports errno 0, not SysCall 10054.

Cloning x264 and the dependent ffmpeg/ffms2/lsmash works without aborting.

After x264 was retrieved (I had deleted build/x264-git), x265 was suddenly retrieved too? ... Strange technology.
Git is an abominable piece of trash IMHO, but it's what we're stuck with, since hg is rapidly disappearing. (I'm going to have to convert all of my repos, too.) 99% of all Git problems are best solved by deleting the entire folder and cloning from scratch. Sometimes a hard reset will also do the trick, but not as reliably. If you have local changes, save them as a patch first and reapply them afterwards.
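A minimal sketch of that save-a-patch-and-reclone routine (paths and names are illustrative):
Code:
# save local modifications as a patch, outside the repository
git diff > ../my-changes.patch
# delete the clone and fetch a fresh one
cd .. && rm -rf x265_git
git clone https://bitbucket.org/multicoreware/x265_git
# reapply the saved changes on top of the fresh checkout
cd x265_git && git apply ../my-changes.patch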
10th February 2020, 09:05   #7388
LigH
Cloning as such was the problem; or rather, the handling of the HTTPS connection.
11th February 2020, 18:26   #7389
MeteorRain
Try the SSH protocol instead?
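For reference, the SSH form of the Bitbucket clone would look like this (it requires an SSH key registered with a Bitbucket account):
Code:
git clone git@bitbucket.org:multicoreware/x265_git.git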
12th February 2020, 12:59   #7390
LigH
Well, it suddenly worked once (when I let it clone x264 before x265). Maybe something on my PC was simply out of sync.
14th February 2020, 09:11   #7391
filler56789
x265.exe 3.3_RC1+2-e386d3a8a713

(64-bit, multilib, GCC 9.2, Win32 threads)

Code:
regression: fix analysis-save/load commands
http://www.mediafire.com/file/qnc7l7...8a713.rar/file
15th February 2020, 13:16   #7392
nakTT | Join Date: Dec 2008 | Posts: 415
Quote: Originally Posted by filler56789
x265.exe 3.3_RC1+2-e386d3a8a713

(64-bit, multilib, GCC 9.2, Win32 threads)

Code:
regression: fix analysis-save/load commands
http://www.mediafire.com/file/qnc7l7...8a713.rar/file
Any interesting changes from version 3.2?
16th February 2020, 05:58   #7393
Greenhorn | Join Date: Apr 2018 | Posts: 61
Quote: Originally Posted by nakTT
Any interesting changes from version 3.2?
Against the last 3.2x nightly builds, or against 3.2 release?

Against the former, just what's mentioned. Against the latter: --hist-scenecut to enable a different method of selecting scenecuts, --scenecut-aware-qp to apply an offset to the QP of keyframes and the frames immediately after them, and a more or less total overhaul of how the analysis-save/load feature works. Probably more that I'm not remembering.
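Purely as an illustration of the two new switches named above (argument forms are assumed from 3.3-era documentation, so check x265 --fullhelp on your build; the file names are placeholders):
Code:
# hypothetical invocation exercising the new scenecut options
x265 --input source.y4m --crf 21 --hist-scenecut --scenecut-aware-qp 1 --output encoded.hevc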
17th February 2020, 04:37   #7394
nakTT
Quote: Originally Posted by Greenhorn
Against the last 3.2x nightly builds, or against 3.2 release?

Against the former, just what's mentioned. Against the latter: --hist-scenecut to enable a different method of selecting scenecuts, --scenecut-aware-qp to apply an offset to the QP of keyframes and the frames immediately after them, and a more or less total overhaul of how the analysis-save/load feature works. Probably more that I'm not remembering.
Thanks for the reply. My question was against the 3.2 release; sorry for not being specific. BTW, were there any changes related to improved compression? Thank you in advance.

17th February 2020, 22:19   #7395
benwaggoner (Moderator) | Join Date: Jan 2006 | Location: Portland, OR | Posts: 4,750
Quote: Originally Posted by nakTT
Thanks for the reply. My question was against the 3.2 release; sorry for not being specific. BTW, were there any changes related to improved compression? Thank you in advance.
Yes, the improved scene detection and optimized QP around scene transitions will both improve compression efficiency. The quality of frames around edits will be better at the same bitrate.

__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
18th February 2020, 15:55   #7396
filler56789
x265.exe 3.3+3-c2769ac5fa9d

Release notes: https://bitbucket.org/multicoreware/...leasenotes.rst

download: http://www.mediafire.com/file/op10k0...c5fa9d.7z/file
18th February 2020, 16:09   #7397
nakTT
Quote: Originally Posted by benwaggoner
Yes, the improved scene detection and optimized QP around scene transitions will both improve compression efficiency. The quality of frames around edits will be better at the same bitrate.
Thanks for the reply. Glad to know they are still improving the compression efficiency even after years of development.

Quote: Originally Posted by filler56789
x265.exe 3.3+3-c2769ac5fa9d
Thanks, will give it a try.
18th February 2020, 16:51   #7398
Boulder | Join Date: Mar 2002 | Location: Finland | Posts: 5,718
Quote: Originally Posted by benwaggoner
Yes, the improved scene detection and optimized QP around scene transitions will both improve compression efficiency. The quality of frames around edits will be better at the same bitrate.
Have you run any tests of the new functionality? Once again it appeared without any use cases or documentation.
__________________
And if the band you're in starts playing different tunes
I'll see you on the dark side of the Moon...
18th February 2020, 18:13   #7399
benwaggoner
Quote: Originally Posted by Boulder
Have you run any tests of the new functionality? Once again it appeared without any use cases or documentation.
Not myself, yet. I'm currently at the HPA Tech Retreat; I should have some physical and CPU time to run some tests next week.

I didn't mention the adaptive frame duplication feature. It promises significant efficiency savings, and also faster random access, for content containing runs of identical frames. Credits, screen activity, and anime/cel animation seem particularly likely to benefit.

I wish there were an optional flag that made --keyint count only unique coded frames, so --keyint 240 would mean 240 unique frames. With 24 fps content animated at 8 fps (each drawing held for three frames), --keyint 240 would then yield a GOP covering 30 seconds instead of 10. Not great for adaptive streaming, but it could save space in a file without impacting random access.
18th February 2020, 18:15   #7400
benwaggoner
Quote: Originally Posted by nakTT
Thanks for the reply. Glad to know they are still improving the compression efficiency even after years of development.
Encoders are never done; we're still seeing improvements to MPEG-2 encoding after all these years. As long as Moore's Law keeps delivering, we'll keep finding ways to take advantage of more MIPS per pixel.

With a codec as complex as HEVC, I would expect we'll be seeing material year-on-year encoder improvements for at least another five years.