Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
28th April 2019, 09:46 | #56001 | Link |
Registered User
Join Date: Feb 2015
Location: Bavaria
Posts: 1,667
|
For BFI you need double the refresh rate, don't you? So you need a device capable of that. I only know of devices that apply BFI themselves after receiving a 23.976 signal, for example, as part of their own processing. But do devices accept a doubled refresh rate as input? I am not sure.
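For reference, the arithmetic is simple, assuming BFI inserts one full black frame after every source frame (other BFI schemes exist; this is just the common 1:1 case):

```python
# Black frame insertion (BFI): each source frame is followed by one or more
# black frames, so the display/input must refresh that many times faster.
def bfi_refresh_rate(source_fps: float, black_frames: int = 1) -> float:
    """Refresh rate the display must accept when `black_frames` black
    frames are inserted after every source frame."""
    return source_fps * (1 + black_frames)

# 23.976 fps film content would need a ~47.952 Hz capable input.
print(bfi_refresh_rate(24000 / 1001))
```

So the question above boils down to whether the device will accept ~48 Hz (or 120 Hz for 60p material) on its input, rather than generating the black frames internally.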
|
28th April 2019, 10:49 | #56003 | Link | |
Registered User
Join Date: Oct 2016
Posts: 896
|
Quote:
Edit: ah, I posted this before seeing your last message. A decode queue of 1-2 means your CPU can't decode the video fast enough, so this confirms it is indeed the problem. The reason it is sometimes 12-24 and sometimes falls to 1-2 could be that some parts of the video have a much higher bitrate and can't be decoded quickly enough, or that your CPU is thermal/power throttling.
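My mental model of this, as a toy producer/consumer sketch (not madVR's actual scheduling; the queue size and timings are made-up illustrative numbers): the decoder fills the queue, the renderer drains one frame per refresh, and the queue only empties when decoding is slower than the frame interval.

```python
def simulate_queue(decode_ms: float, frame_ms: float,
                   queue_max: int = 24, frames: int = 200) -> int:
    """Toy model of a decoder queue: the decoder adds frame_ms/decode_ms
    frames per refresh, the renderer removes one. Returns the queue depth
    after `frames` refreshes, starting from a full queue."""
    depth = float(queue_max)
    for _ in range(frames):
        produced = frame_ms / decode_ms          # frames decoded per refresh
        depth = min(queue_max, depth + produced) - 1.0  # renderer takes one
        depth = max(0.0, depth)
    return round(depth)

# 24p has a ~41.7 ms frame time. Decoding at 35 ms/frame keeps the queue
# full; at 50 ms/frame (a high-bitrate scene) it drains towards empty.
print(simulate_queue(decode_ms=35, frame_ms=41.7))
print(simulate_queue(decode_ms=50, frame_ms=41.7))
```

That's why the queue can sit at 12-24 for most of the film and collapse to 1-2 only in the hardest scenes.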
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 Last edited by el Filou; 28th April 2019 at 10:53. |
|
28th April 2019, 10:56 | #56004 | Link | |
Registered User
Join Date: May 2018
Posts: 10
|
Quote:
I am testing with the 4K UHD Avengers: Infinity War. Maybe it is a hard file to test with. Thermals are normal, not higher than 55 degrees Celsius. Oh my god! It's so funny: I just paused the video at that laggy moment while I was answering in the forum. When I played the scene again, the render queue was good! The whole scene now runs smoothly with no lags! Why?))) Last edited by Koltos; 28th April 2019 at 10:58. |
|
28th April 2019, 11:31 | #56005 | Link |
Registered User
Join Date: Apr 2018
Location: Stockholm, Sweden
Posts: 31
|
Hi,
I found this information on a madVR wiki that sheds some light on the CPU & GPU queue sizes:

"CPU/GPU Queue Size — This sets the size of the decoder queue (CPU) and the upload/render queues (GPU). Unless you are experiencing problems, I would leave it at the default settings of 12/8. The higher these queue sizes are, the more memory madVR requires. With larger queues you could potentially have smoother playback on some systems, but increased queue sizes also mean increased delays when seeking if the 'delay playback…' options are enabled. Depending on your system, if you are having trouble getting smooth playback with madVR, sometimes turning the queue sizes all the way up or all the way down seems to help. It really depends on the machine.

Updated 13/06/2013: As of madVR 0.86.2, the decoder queue can now be increased to a maximum of 128 frames. In my experience, this increases the amount of RAM madVR uses to around 800MB with 1080p video. It may also greatly slow down seeking or switching between full-screen/windowed modes when 'delay playback…' is enabled. If you have the memory to spare, and your system is capable of filling the queues, it may result in smoother video playback (far less chance of dropped frames occurring). In general, the decoder queue should only be set as high as your system can actually fill. There's no point in setting it to 128 frames if your system can only fill 30/128."

https://wiki.jriver.com/index.php/MadVR_Expert_Guide

Sent from my SM-N960F using Tapatalk |
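The ~800MB figure in that guide roughly checks out if you assume the decoder queue holds 10/16-bit 4:2:0 surfaces (P010, ~3 bytes per pixel); the real memory layout may differ, but as a back-of-the-envelope:

```python
def queue_memory_mb(width: int, height: int, frames: int,
                    bytes_per_pixel: float) -> float:
    """Approximate decoder-queue RAM in MiB for `frames` queued frames.
    bytes_per_pixel: 1.5 for 8-bit 4:2:0 (NV12), 3.0 for 10/16-bit 4:2:0 (P010)."""
    return width * height * bytes_per_pixel * frames / 2**20

# 128 queued 1080p P010 frames: roughly 760 MiB, which is in the same
# ballpark as the ~800 MB reported in the guide.
print(round(queue_memory_mb(1920, 1080, 128, 3.0)))
```

It also makes clear why a 128-frame queue at 2160p would need about four times as much memory.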
28th April 2019, 13:28 | #56007 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Earlier drivers should work. They always have in the past.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
28th April 2019, 14:27 | #56008 | Link |
Registered User
Join Date: Apr 2018
Location: Paris, France
Posts: 92
|
I do have a GTX 1070, but still, I'm tired of reading that you need a war machine to run madVR.
You don't need an $800+ GPU to achieve great results. Lower presets will do a fine job, and most people won't see the difference anyway, unless they are pixel peeping or taking screenshots and switching back and forth. Even a $100 GTX 1060 is enough to get great 1080p > 2160p upscaling. It's just a matter of picking the right settings. Last edited by Charky; 28th April 2019 at 14:29. |
28th April 2019, 16:06 | #56009 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
Yes. I was running 1903 long before the new driver and I was using the last driver that didn't break metadata. It was fine.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
28th April 2019, 16:26 | #56010 | Link |
Registered User
Join Date: Apr 2019
Posts: 24
|
RTX 2080 @ stock clocks
i5 8600K @ stock clocks
Windows 10 1809
Kodi 17.6 + DSPlayer
MPC-BE 1.5.3 + MPC-HC 1.7.13 (not sure which one is being used)

http://prntscr.com/nfcmzu
http://prntscr.com/nfcnky
http://prntscr.com/nfcntw
http://prntscr.com/nfco4h
http://prntscr.com/nfcohv

Using madVR default settings.

PS. HDR only seems to work in fullscreen exclusive mode. Using the latest NVIDIA drivers, 430-something. NVIDIA drivers set to YCbCr 4:2:2 12-bit.

Any tips on how to achieve the most correct / best quality HDR? It's a bit hard to find what is actually best in almost 3000 pages. |
28th April 2019, 16:27 | #56011 | Link | |
Registered User
Join Date: Mar 2019
Posts: 14
|
Quote:
At the end of the day, the multitude of settings is there to allow madVR to run on a wide variety of cards. But with the lower quality settings, you're not really setting yourself apart from a lot of other renderers. A lot of the benefit of madVR lies in the higher-tier settings. |
|
28th April 2019, 16:46 | #56012 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
Also, it depends how many features you use:
- if you need software features such as black bar detection or BD menus with JRiver, you can't use native decoding, you have to use copyback, which is slower in most cases
- if you use 3D LUTs, it needs added power
- if you use 12-bit output, it needs added power
- if you use the latest dynamic HDR tone mapping algo, it needs a lot of power

So yes, if you only use minimal features in SDR with little to no upscaling needed (for example Blu-ray on a small 1080p screen), and if you turn down all the upscaling/processing because at the distance you watch your small screen nothing will make a difference, you can get away with less performance. But if you want to play 4K HDR and upscale earlier content at the best possible quality on a large screen from a fairly short distance, you need some serious power to get the best results. Every time I recommend a 1080 Ti or 2080 if possible, someone tells me "I get great results with xxx", and every single time, they only use a small subset of madVR's features, or seriously compromise on the possible PQ. I certainly wouldn't get anything lower than that for the content I want to play (mostly 4K HDR) on a large screen (projector sitting at 1.1 screen widths).
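To illustrate why a 3D LUT adds per-pixel work: every output pixel blends the 8 surrounding lattice entries of the LUT (trilinear interpolation). A rough pure-Python sketch of the lookup, nothing like the GPU implementation, which uses hardware-filtered 3D textures:

```python
def apply_3dlut(rgb, lut):
    """Trilinear lookup of a normalised (r, g, b) triple (each 0..1) in an
    n*n*n LUT given as lut[r][g][b] -> (r', g', b')."""
    n = len(lut)
    pos = [min(max(c, 0.0), 1.0) * (n - 1) for c in rgb]
    lo = [int(p) for p in pos]
    hi = [min(l + 1, n - 1) for l in lo]
    f = [p - l for p, l in zip(pos, lo)]
    out = [0.0, 0.0, 0.0]
    # blend the 8 surrounding lattice points
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                entry = lut[hi[0] if dr else lo[0]][hi[1] if dg else lo[1]][hi[2] if db else lo[2]]
                for c in range(3):
                    out[c] += w * entry[c]
    return out

# with an identity LUT, the interpolation returns the input colour unchanged
n = 17
g = [i / (n - 1) for i in range(n)]
identity = [[[(g[r], g[gg], g[b]) for b in range(n)] for gg in range(n)] for r in range(n)]
print(apply_3dlut((0.3, 0.6, 0.9), identity))
```

A calibration LUT is just this with measured corrections baked into the lattice instead of the identity mapping, which is why it costs GPU time but no extra logic per colour.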
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K Last edited by Manni; 28th April 2019 at 16:51. |
|
28th April 2019, 16:57 | #56013 | Link |
Registered User
Join Date: Apr 2018
Location: Paris, France
Posts: 92
|
You're right, of course. But I still believe that picking NGU AA over Lanczos 3 + AR for chroma upscaling, or meddling with dithering options, is probably the final item on the long list of things anyone can do to improve their home cinema experience.
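For anyone wondering what "Lanczos 3" actually is: it's a windowed-sinc resampling kernel. A minimal sketch of the kernel itself (the "+ AR" anti-ringing part of madVR's option is a separate filter, not reproduced here):

```python
import math

def lanczos(x: float, a: int = 3) -> float:
    """Lanczos kernel with `a` lobes: sinc(x) * sinc(x / a) for |x| < a,
    zero outside. a = 3 gives the 'Lanczos 3' scaler."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# tap weights for an output sample landing halfway between two source
# pixels: 6 taps, with small negative lobes that sharpen edges (and can
# also cause the ringing that the AR filter suppresses)
taps = [lanczos(x - 0.5) for x in range(-2, 4)]
print([round(t, 4) for t in taps])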
I mean, if you have a white-painted ceiling and white tiles on the floor in your projection room (and that's OK, a living room is not a projection room, you have to live in it too ), buying an RTX 2080 just to pick the ultimate chroma upscaling algorithm is kind of like buying a fridge before your house is built. It's a waste of money on something that's not really a priority. A very expensive and very powerful GPU will only benefit the tiny fraction of people who already have the right setup. Recommending it to everyone, regardless of their equipment, is IMO not the right course of action. Last edited by Charky; 28th April 2019 at 17:04. |
28th April 2019, 17:08 | #56014 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
It's a matter of personal choice, though. Some of us, like me, only do major upgrades once every generation of technology. What I mean is, just this week I replaced my 3770K machine with a 9900K. The 3770K was built around Blu-ray; the 9900K is built around watching 4K UHD. As for money, an $800 video card isn't much compared to some people dropping three grand on a TV. Why get the three-grand TV if you're not pushing for the best possible picture quality? So it comes down to personal choice. And that's also why madVR is so flexible.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
28th April 2019, 18:12 | #56015 | Link | |
Registered User
Join Date: May 2013
Posts: 714
|
Quote:
Yes, we DO need an $800 GPU. No one is talking about 1080p. HDR performance is the only metric worth discussing, because that's the format all new BDs will arrive in. 4K HDR dynamic tone mapping very nearly saturates a 1060 on NGU low, and that's on the highest-clocked variants of the 1060; on the ghetto 1060 it'll probably start dropping frames. That's why recommending a 1070 as the bare minimum is the logical next step. A 1080/1080 Ti or 2070/2080 is a much better place to be, because who knows what madshi will cook up. The 10xx series is favored (for now) because of driver problems on the 20xx. A 3D LUT really should not be optional. This is the absolute PRIME feature of madVR. The computer is fundamentally BLIND; it has no idea what it's outputting. A colorimeter gives the computer eyes, and therefore a soul. The video chain is incomplete without it. No one should watch movies without a 3D LUT.
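On why HDR tone mapping is the expensive part: the renderer has to compress, say, 4000-nit mastered highlights into a 100-700 nit display range, per pixel, per frame. A textbook extended-Reinhard curve sketches the basic compression step; madVR's dynamic algorithm is far more sophisticated (per-frame measurement, highlight recovery), so treat this purely as an illustration:

```python
def tonemap(nits: float, peak_in: float = 4000.0, peak_out: float = 100.0) -> float:
    """Extended Reinhard curve: maps luminance so that peak_in lands exactly
    on peak_out while near-black stays roughly linear. Illustrative only,
    not madVR's actual algorithm."""
    l = nits / peak_out
    lw = peak_in / peak_out
    return peak_out * l * (1 + l / (lw * lw)) / (1 + l)

print(round(tonemap(4000.0), 1))  # the mastering peak lands on the display peak
print(round(tonemap(10.0), 1))    # shadows are only mildly compressed
```

Doing something like this (plus gamut mapping and dithering) on every pixel of a 2160p frame is what eats the GPU.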
__________________
Ghetto | 2500k 5Ghz Last edited by tp4tissue; 28th April 2019 at 18:17. |
|
28th April 2019, 20:15 | #56016 | Link | |
Registered User
Join Date: Apr 2019
Posts: 24
|
Quote:
Windows 1809, driver 430.39, with an RTX 2080. Windowed HDR doesn't work, only fullscreen exclusive. |
|
28th April 2019, 21:55 | #56017 | Link |
QB the Slayer
Join Date: Feb 2011
Location: Toronto
Posts: 697
|
I can still run 1080/720 ==> 4K on my ancient R9 270X with just 2GB of RAM... framerates >30 FPS of course can't be pushed (HDMI 1.4 limitations), but in those few cases I just push 1080p to the OLED and let the LG C8's α9 processor do the final upscale to 4K... not ideal, but not bad either. I'm waiting for Navi to be released before buying a new GPU for my 3 week old OLED.
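The HDMI 1.4 limitation mentioned above is just a bandwidth ceiling. A rough calculation (ignoring blanking intervals, which push the real requirement somewhat higher) shows why 4K tops out around 30 Hz on that link:

```python
def hdmi_data_rate_gbps(width: int, height: int, fps: float,
                        bits_per_component: int = 8) -> float:
    """Approximate uncompressed RGB video data rate in Gbit/s, ignoring
    blanking intervals (actual HDMI timings need somewhat more)."""
    return width * height * fps * bits_per_component * 3 / 1e9

# HDMI 1.4: 10.2 Gbit/s raw TMDS, ~8.16 Gbit/s payload after 8b/10b coding
HDMI_1_4_MAX = 8.16

print(hdmi_data_rate_gbps(3840, 2160, 30))  # under the 1.4 payload limit
print(hdmi_data_rate_gbps(3840, 2160, 60))  # well over it
```

Hence the fallback of sending 1080p60 and letting the TV's own scaler do the final upscale.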
https://prnt.sc/ni12f6 QB
__________________
Last edited by QBhd; 28th April 2019 at 22:00. |
28th April 2019, 22:00 | #56018 | Link |
Registered User
Join Date: Apr 2019
Posts: 69
|
I just ran a new test for the 3D LUT on this new panel, and for whatever reason, yet again I cannot get the 3D LUT to pass. It always reads "not ok" and I don't understand why. I am using an i1 Display Pro and I still cannot get it to pass.
|
28th April 2019, 22:16 | #56019 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
Try reading through this: https://forum.kodi.tv/showthread.php...949#pid2238949
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
|
28th April 2019, 22:31 | #56020 | Link | |
Registered User
Join Date: Apr 2019
Posts: 24
|
Quote:
But if you must know, I have an LG C7 OLED TV. |
|