Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 10, 2019.
I would probably buy 1 or 2 tomorrow.......
Getting so damn tired of your posts now. Last warning.
What did I say wrong, Boss?
That's how 4K works. Put the same amount of compression artifacts on 1080p and your eyes will bleed. Quadruple the pixel count and suddenly you won't even see the issues.
Compression IQ has been compared over and over for years, and the results never changed: CPU encoding delivers much higher IQ at much lower bitrates than GPU encoding.
And on the GPU side, NVENC lost to AVC. Not that I would recommend AVC either.
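To make the pixel arithmetic concrete, here is a minimal Python sketch of bits-per-pixel at a fixed stream bitrate (the 8 Mbps / 60 fps figures are illustrative assumptions, not numbers from any test):

# The same bitrate spread over four times as many pixels means each 4K
# pixel gets roughly a quarter of the bits a 1080p pixel would get.
FPS = 60
BITRATE = 8_000_000  # assumed 8 Mbps stream

for name, w, h in (("1080p", 1920, 1080), ("4K", 3840, 2160)):
    bits_per_pixel = BITRATE / (w * h * FPS)
    print(f"{name}: {w * h:,} pixels/frame, {bits_per_pixel:.3f} bits/pixel")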
I saw a GPU that's 5~10% more powerful on average being sold at a 10% lower price. And I saw meaningfully improved IQ, and that's with the current state of driver optimizations.
And I saw a damn interesting die shot.
It would have been nice to see them post what memory speed they were using to achieve those benchmarks.
fox, you don't need to explain the differences in encoding algorithms to him; his entire premise is flawed. He's saying he's okay with using the GPU to encode, which would affect his framerate and frametimes, while the entire point of the AMD comparison was to highlight that higher thread counts let the CPU encode at higher quality, with no additional GPU load from the encoding method, and with better results than a much more expensive Intel chip. His objection doesn't make any sense.
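For anyone who wants to compare the two paths themselves, here is a minimal Python sketch (it assumes ffmpeg is installed with libx264 and NVENC support; the file names and the 6 Mbps bitrate are made-up placeholders):

import subprocess

SRC = "gameplay.mkv"  # hypothetical capture file

# CPU path: x264 scales across many threads, so a high-core-count chip
# can afford a slower preset (better IQ per bit) without touching the GPU.
subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                "-preset", "slow", "-b:v", "6M", "out_x264.mp4"], check=True)

# GPU path: NVENC is fast dedicated hardware, but the encode lives on the
# same card that is also rendering the game being streamed.
subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "h264_nvenc",
                "-preset", "slow", "-b:v", "6M", "out_nvenc.mp4"], check=True)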
You are letting your Intel bias get the better of you with these troll posts. You've already had two warnings handed out in one day.
Ghost Recon Breakpoint really looks good.
Ok Boss!... damn sure sorry about that!... Love you / Thanks.
I shall see my way out....
WOW 16 baby
The 3950X is a bit expensive, but Intel has NOTHING to face it with....
First of all, I did notice an IQ improvement using NVENC since I switched to my RTX card. As you refer to comparisons over the years, I assume you mean compared to previous iterations of NVENC.
Feel free to check it for yourself:
GPU NVENC encoding costs almost nothing in performance and produces great-quality output, as analyzed in depth in the link I posted above. I suggest you give it a shot yourself.
At my modest scale, with regular 4K streams and 7k subs on YouTube, I do use those features regularly.
Saying my "entire premise is flawed" is just arrogant and shows a lack of hands-on testing of this Nvidia tech.
It looks like AMD is going to be kicking butt in the CPU department.
The GPUs look decent to me; at this point they look slightly better than NVIDIA's at roughly the same price. While I would certainly like them to be cheaper, at least they are genuinely competitive again. We'll have to see whether team green lowers prices significantly going forward.
I doubt I'd be willing to invest a 5700 XT's worth of money in a video card, but the plain 5700 might be within my range, depending on how much the eventual custom models will cost over here. However, I will only really know what to think about it after reading the future Guru3D review. Ads are only ads, after all. I was initially interested in the RTX 2070 when the price hadn't yet been announced, but when it was, I lost my interest. I can't really see my interest being rekindled even if Nvidia dropped the price considerably, especially since the 2080 Ti is what you'd actually need for the RT.
I'll just say this in one thread.
No matter what, I'll be swapping my 2700X for a 3000-series chip; Hilbert's judgement will say which one. I do hope you push those suckers as far as you can with OC.
Like, can the 3600X/3700X/3800X be OC'd as far as the 3900X?
I'm pretty tempted to swap my 1080 Ti for a 5700 XT and push that crap to its OC limits as well.
The reason why: I'd need to pay 200€ for a waterblock for this EVGA SC2 card (which is a bad waterblock anyway, if I'm not mistaken). And even if I lose a few frames in the trade, I don't care;
I don't mind giving my euros to the underdog, and the 5700 XT ain't that bad...
AMD showed some kinda nice-looking features, if that really is how they work, with so little impact on frames.
But Guru3D, they shall tell the truth. I'll go with that. So I'm waiting for solid, unbiased results.
If I am not wrong, the RX 5700 @ $379 was compared against the 2060, while the RX 5700 XT @ $449 was compared against the 2070. A different thread two days ago stated that the 2070 refresh price was $399. That places the 2070 $20 above the RX 5700. AMD's whole premise of leadership at the $399 price point was outdated information. Just saying that if every frame counts, like they kept repeating, they did not deliver. Like I said, I was hoping, but not this time.
EDIT: The prices may be rumors, but there are going to be products close to those price points. The new tools being offered are great (though only of interest to developers). If the performance of the unannounced products is on par with AMD's offerings, price is the only deciding factor.
Ryzen is kicking some ass, the CPUs are looking amazing, but I feel Radeon is falling behind again. There was no 5800/5900 with more CUs; maybe they are saving them to compete with the RTX Super.
"x265 beats Turing HEVC at 6Mbps and 4Mbps. But otherwise the best HEVC encoder is Turing. It is also 3x faster than even x265 SuperFast."
- quote from article
I hope you do understand the basic difference in scale between what you do and what was tested.
Those tests were done at 1440p, resulting in 4~8 Mbps for software encoders. Your scenario has just double the pixel count but 7~10 times the resulting bandwidth. That's what one calls waste.
The thing that always played in GPU encoding's favor was speed, not IQ per bitrate. Streaming is always about the ability to make the most of a given bandwidth.
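As a rough sanity check on that scale argument, here is a minimal Python sketch (the 45 Mbps 4K figure is an assumption picked from the 7~10x range above; only the 4~8 Mbps 1440p numbers come from the cited tests):

# Pixel count roughly doubles from 1440p to 4K, while the assumed stream
# bitrate grows 7~10x, so bits-per-pixel goes up instead of down.
px_1440p = 2560 * 1440      # ~3.7M pixels/frame
px_4k = 3840 * 2160         # ~8.3M pixels/frame
print(f"pixel ratio:   {px_4k / px_1440p:.2f}x")               # ~2.25x

test_bitrate = 6e6          # 6 Mbps, mid-range of the cited 1440p tests
stream_bitrate = 45e6       # assumed 4K streaming bitrate
print(f"bitrate ratio: {stream_bitrate / test_bitrate:.1f}x")  # 7.5x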
That doesn't make sense at all! The GTX 1080 Ti is a way better card!