Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 16, 2020.
I hear that glands are actually fetching better prices.
What? Why cancel? You actually did it.
The whole plan was to return it anyway to get the 3090 or 3080-ti at some point in the future.
Just didn't want the hassle of returning the card and facing all the back-and-forth of an exchange, especially with COVID impacting local transport in France. :/
Well, I am glad you cancelled it, as buying something with the intention of returning it is both messed up and theft.
Ah screw it, after tonight’s news I decided I’m getting a 3090 too. Going to upgrade the rest of my computer and buy a new TV too. Might as well feel good now while I can.
Wanted to use the Step-Up program from EVGA, but the feedback seems to indicate it's a rather slow process with a whole queue system that can span months. :/
I can’t get into it because politics aren’t tolerated here but I’m in the US so that should be a good clue.
In any case, the pertinent part of my tweet is that I might as well splurge and get the fancy card because screw it. I’ve got money in the bank that I was going to hold on to for more important things down the line but I think I’ll just spend it instead.
Guys, check VRAM/RAM usage difference between 2080Ti/3080
From the start it's much better for the 3080, but later the 2080 Ti gets better results, interesting.
Yes, I checked comparison vids on YouTube last night for that same reason; I concluded they average the same VRAM usage.
you're misconstruing "allocates more" as "better"
@Astyanax but why is it doing it that way? For example, with the same RAM usage on both, VRAM usage is lower on the RTX 3080. Does the 3080 have better texture compression?
I'm assuming one of two things:
The 2080 Ti video was made prior to the new patch.
The game allocates memory as a percentage of free VRAM for early texture-stream preloading, so the card with more VRAM will start with what appears to be more in use.
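If that second theory holds, the effect is easy to sketch. This is a hypothetical illustration only, not the game's actual code: the 50% fraction, the baseline figure, and the function name are all made up for the example.

```python
def preload_budget_mb(total_vram_mb, baseline_used_mb, fraction=0.5):
    """Reserve a fixed fraction of currently-free VRAM for texture streaming.

    Hypothetical scheme: the game sizes its streaming pool as a percentage
    of whatever VRAM is free after its baseline allocations.
    """
    free_mb = total_vram_mb - baseline_used_mb
    return int(free_mb * fraction)

# Assumed baseline of 2048 MB used by the game before streaming kicks in.
for name, total_mb in [("RTX 2080 Ti", 11264), ("RTX 3080", 10240)]:
    budget = preload_budget_mb(total_mb, baseline_used_mb=2048)
    apparent_usage = 2048 + budget
    print(f"{name}: preload budget {budget} MB, apparent usage {apparent_usage} MB")
```

Under this scheme the 11 GB 2080 Ti reserves a bigger pool than the 10 GB 3080, so monitoring tools report higher "usage" on it even though neither card actually needs that much, which would explain the numbers in the comparison videos without any difference in texture compression.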
Well, yeah, it all looks well and good for the 3080. But I got a bad feeling about this one.
It's sucking way too much leccy, and whilst it doesn't get too hot, it also won't yield a 10% overclock.
Lads! I reckon we are dealing with a bad egg here!
But only the AMD 6000 cards will tell the truth, comparison-wise, if the new NV architecture is badly optimised.
So I'm holding out for the 3070 review, as in: sucking 150 watts or less, and if that falls over, then the AMD 6000 series reviews before I go bandying my money around.
We should know by now, guys, that early adoption can sometimes lead to burned fingers.
You can expect a lack of overclock headroom to be the norm now that manufacturers have developed good-enough technology to automatically get the best out of GPUs without user intervention.
When speeds are pushed near the limit, power consumption will always be high.
It doesn't mean it's bad, as long as the cooling is good enough.
These aren't early-adoption issues; both are quite normal when there's competition and manufacturers have the electronic and software tech to push boundaries harder.
I reckon AMD cards will be slightly worse for power draw and/or heat btw.
Well yeah, maybe, but the sweet spot for me is 150 watts power consumption, whatever the power-to-weight ratio is with this heavy-lifting gear.
Limbs are also lucrative to sell in India. Don't need my legs, soo...
Looking at the 3090 too, but I want it to be a big jump. Like blowing my nuts clean off levels of performance.
For the cost, it's gotta be more than the 10% over the 3080 that's rumoured.
600W SFX Power Supplies vs. RTX 3080 – Surprising Results
When his house burns down, this video will give his insurer cause to refuse him.
Interesting that the highest power draw was on 1440p, not 4k, due to the CPU working harder at that res.