By Adam Go To PostDell S2721DGF
Cheers bruv
By Wahabipapangus Go To Postif nothing else, at least make other people buy these cards, queen lisa. so i can get my rtx
It took Nvidia, who is out in front with AI, and who I'd guess is pretty ahead in these types of features, two iterations to really get it right.
And AMD wants to do something similar, but cross platform?
Yeah aight
It's pretty obvious what will happen: Nvidia will bring out the 3080ti in December, which will beat the 6900XT at the same price at 4K and do RT.
They've clearly been shuffling things around to counter this.
Adam won, but Smokey etc.
Lisa bringing it, but I'll need to see RT performance before even considering getting an AMD GPU.
By Adam Go To PostVapes in denial still…
I’m not in denial fam, well played to AMD. I just think we’re in for some real competition in the GPU space going forwards
Embrace Lord Su
By Laboured Go To PostI'm sure they didn't talk about RT performance because it's so good.
But is your logic effective against a spiderman gif, that's the question
One of the tweaktown rumours from last month was that AMD wouldn't be able to match ray tracing on Ampere but that they would be able to beat ray tracing on Turing
https://www.tweaktown.com/news/75139/amd-big-navi-hits-spitting-distance-of-geforce-rtx-3080-performance/index.html
There was also:
- Very close to RTX 3080 in rasterisation performance
- More efficient at originally intended clocks than Ampere
- Smaller than expected (whatever that means) and relative to Ampere
- AMD taking DLSS seriously, but no details on their approach (software stack not confirmed yet).
So, not a million miles off, and now we know they've been working with Microsoft on ML-enhanced resolution, so it will be interesting to see how that stacks up against RTX + DLSS in performance and image quality. From the Xbox blog post:
"At the very beginning of development of the Xbox Series X | S, we knew we were setting the foundation for the next decade of gaming innovation and performance across console, PC and cloud. To deliver on this vision we wanted to leverage the full capabilities of RDNA 2 in hardware from day one. Through close collaboration and partnership between Xbox and AMD, not only have we delivered on this promise, we have gone even further introducing additional next-generation innovation such as hardware accelerated Machine Learning capabilities for better NPC intelligence, more lifelike animation, and improved visual quality via techniques such as ML powered super resolution."
Side note, if it turns out that one of the console manufacturers isn't in on this, well... 😬
The problem with the reported AMD rumours over the last 6 months is that they ranged from plausible to absolutely batshit bonkers week to week. It was hard to take anything at all in the slightest bit seriously.
I'm shocked (and pleasantly surprised) by the performance they claim to offer and the price points they've announced. I was fully expecting them to match 2080Ti performance; I never thought they would've matched the RTX 3080. The fact they're hitting 3090 performance, though, on the same 7nm process as the 5700XT? Absolute wizardry, regardless of the benchmark fuckery around Smart Access Memory and/or "RAGE MODE" vs stock Nvidia performance. It's impressive.
Two things:
1) If they can't get near NV for RT then they should put all of their GPUs in the bin as a point of principle. No cunt, unless they are a phenomenal level of stupid, wants to pay for these if they can't do RT. You'd be paying 4-500 plus for last-gen GPUs in 2020 at any res.
2) Idk what RAGE MODE is but it already makes me sick and it's for losers.
It's an automatic boost that gives 1 to 2% more performance by tweaking power limits or something, with the push of a button. It's essentially bs marketing crap.
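For anyone curious what "tweaking power limits" roughly amounts to, here's a minimal sketch against the Linux amdgpu hwmon interface. This is just a guess at the general idea: the sysfs path, file names and units are assumptions based on typical amdgpu setups, and Rage Mode itself is a toggle inside Radeon Software, not a script like this.
```python
# Illustrative only: a small power-limit bump via the Linux amdgpu hwmon
# interface (paths/filenames assumed from typical amdgpu setups; values
# are in microwatts). Writing the cap requires root.
from pathlib import Path

def bump_power_cap(card: str = "card0", pct: float = 0.05) -> None:
    hwmon_root = Path(f"/sys/class/drm/{card}/device/hwmon")
    hwmon = next(hwmon_root.glob("hwmon*"))           # first hwmon node for the GPU
    cap_file = hwmon / "power1_cap"                   # current power limit (microwatts)
    max_file = hwmon / "power1_cap_max"               # board power limit ceiling

    current = int(cap_file.read_text())
    ceiling = int(max_file.read_text())
    target = min(int(current * (1 + pct)), ceiling)   # raise cap by pct, never past the ceiling

    print(f"power cap: {current / 1e6:.0f} W -> {target / 1e6:.0f} W")
    cap_file.write_text(str(target))                  # needs root / CAP_SYS_ADMIN

if __name__ == "__main__":
    bump_power_cap()
```
A slightly higher power cap just lets the card hold boost clocks a bit longer, which is how you end up with the quoted 1 to 2% rather than anything dramatic.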
By inky Go To PostIt's an automatic boost that gives 1 to 2% more performance by tweaking power limits or something, with the push of a button. It's essentially bs marketing crap.
By Laboured Go To PostTwo things:
1) If they can't get near NV for RT then they should put all of their GPUs in the bin as a point of principle. No cunt, unless they are a phenomenal level of stupid, wants to pay for these if they can't do RT. You'd be paying 4-500 plus for last-gen GPUs in 2020 at any res.
2) Idk what RAGE MODE is but it already makes me sick and it's for losers.
If they can beat Turing, then the performance penalty is probably going to be acceptable, but that’s assuming they can get super resolution working close to the level of DLSS. 2080Ti is still a £1000+ GPU, so similar performance at £500 is not a bad option.
Their problem might be that the 3070 is cheaper, but then you do get half the VRAM with it 🤷♂️
By HonestVapes Go To PostI’m not in denial fam, well played to AMD. I just think we’re in for some real competition in the GPU space going forwards
God I hope so. Maybe nvidia will even improve geforce experience if pressured enough
By Wahabipapangus Go To PostGod I hope so. Maybe nvidia will even improve geforce experience if pressured enough
Let's not get ahead of ourselves now
nvidia price drops incoming?
I guess it wouldn't matter anyway... it's not like I can find a card in stock
Why drop the price when you can just come up with a new line in the same price range that makes months-old purchases obsolete?
Ti versions here we go!
By Mister Go To PostRAGE MODE seemed to me like a reference to the old ATI RAGE cards
I had a Rage Pro with 4MB VRAM. It was shit.
Really excited to see some competition for nVidia. I think they've been seriously slacking in the mid-range space. It's the cards between 300 and 400 bucks that are the most important mass-market gamer cards. The GTX 1660 & RTX 2060 are still hovering at 300, the 2060 Super and 5700XT around 400.
The 1060 launched with 6GB in 2016 at 270-330 bucks (partners vs FE) and was an amazing price/performance sweet spot that ran cool and relatively quiet depending on the board partner.
Unfortunately I believe the 3060 is going to miss that target once more with 8GB. That's just not enough with Next-Gen launching and people thinking about leaving 1080p behind them.
By Wahabipapangus Go To Postwhen you finally see a 3090 in stock and it looks like this
It’s actually a decent card
what the fuck mens
Got a 3070 from Scan 🥳🥳🥳🥳
4K monitor I’m getting doesn’t launch until Feb so will see how it does with that and probably sell later in the year
By Wahabipapangus Go To Postwhen you finally see a 3090 in stock and it looks like this
what the fuck mens
It's ok no one will see.
holy shit i don't think I can handle 165hz
I've just spent a good 10 minutes in the nvidia control panel switching between 60Hz and 165Hz and just moving my cursor around.
By /bacon Go To Postholy shit i don't think I can handle 165hz
I've just spent a good 10 minutes in the nvidia control panel switching between 60Hz and 165Hz and just moving my cursor around.
next level bacon
HFR is similar to the first time you turn off hardware acceleration. At first everything is so smooth it's scary... twenty minutes later you start feeling dirty, wondering how you ever lived like that, mucking around in the mud like a filthy 🐖
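For a rough sense of why the jump from 60Hz to 165Hz feels so drastic, the frame-time arithmetic is simple. A quick illustrative snippet using just the two refresh rates mentioned above:
```python
# Time between refreshes at the two rates discussed above.
for hz in (60, 165):
    frame_ms = 1000 / hz  # milliseconds per refresh
    print(f"{hz} Hz -> {frame_ms:.2f} ms per frame")

# Output:
# 60 Hz -> 16.67 ms per frame
# 165 Hz -> 6.06 ms per frame
```
The cursor gets redrawn almost three times as often, which is most of why it feels that much smoother.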