By Zabojnik Go To Post4000 series is still DP 1.4?!I would guess they might just not be certified for 2.0 yet.
It's now or never, Lisa.
that's fucking insane
bring me RTX Morrowind (I doubt there will be much mod support when it's only 4000 series cards). Maybe in a decade.
So while there is no added latency, there's still a lot of doubt that 120fps with DLSS 3 will feel as good as normal 120fps input latency wise. Tests will be fascinating.
Yeah, I was sure it wouldn't add latency, just pretty certain it won't reduce latency at a rate corresponding to the increase in apparent framerate.
Pouring a big one out for SLI today. 40XX series doesn't even have NVLink. @Smokey
What a time that was.
By Kibner Go To PostYeah, I was sure it wouldn't add latency, just pretty certain it won't reduce latency at a rate corresponding to the increase in apparent framerate.If it reduces latency at all, it's still a positive though. If it can make a 30FPS game feel like a 60FPS one while displaying a 120FPS image, then it's still a step in the right direction.
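Back-of-the-envelope version of the latency point being argued here: frame generation raises the displayed framerate, but input is only sampled on actually-rendered frames, so the input-latency floor tracks the base framerate. The numbers below are illustrative assumptions, not measurements of DLSS 3.

```python
# Sketch: displayed frame time vs. input-latency floor under frame generation.
# Assumes every other displayed frame is interpolated (base fps doubled) and
# that input is sampled per rendered frame -- a simplification, not a spec.

def frame_time_ms(fps: float) -> float:
    """Time per frame in milliseconds."""
    return 1000.0 / fps

base_fps = 30                  # what the GPU actually renders
displayed_fps = base_fps * 2   # with one generated frame per rendered frame

print(f"displayed frame time: {frame_time_ms(displayed_fps):.1f} ms")  # 16.7 ms
print(f"input-latency floor:  {frame_time_ms(base_fps):.1f} ms")       # 33.3 ms
```

So the picture updates like 60FPS while the controls can still only feel as responsive as 30FPS, which is exactly why "120fps with DLSS 3 vs normal 120fps" tests will be interesting.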
I mean, you can run Spiderman Remastered at 8K60 with some DLSS on a 4090 so yeah I feel confident on that. Altho all subject to the emulator people producing the relevant plugins obvs.
By Kibner Go To PostThe only cool thing about the 4000 series reveal:I wish these videos just got to the point. First 15 seconds are useless.
By diehard Go To PostI would guess they might just not be certified for 2.0 yet.Possibly, but I doubt it. I fear DP 2.0 monitors will be few and far between in the next 12-24 months anyway, there's HDMI 2.1 and with DSC maybe Nvidia just don't see it as a necessity.
By FootbalIFan Go To PostI wish these videos just got to the point. First 15 seconds are useless.He literally says his name, takes 10 seconds to explain what this is about and dives right in.
Is this some misguided attempt to sell yourself as a no-attention-span zoomer, FF? You're not fooling anyone.
By Zabojnik Go To PostHe literally says his name, takes 10 seconds to explain what this is about and dives right in.no i get it but i just very quickly lost interest, because i just want to see what the video title/thumbnail says
ive maybe been watching too many tiktoks
Nvidia stupidly dropped their Quadro branding and now that the latest gen uses the same letter for their architecture they will have to drop that too. So instead of the Nvidia RTX A6000 it will be just the Nvidia RTX 6000. Naming things is hard i guess.
Prices in Germany:
- GeForce RTX 4090: 1.949 Euro
- GeForce RTX 4080 16 GB: 1.469 Euro
- GeForce RTX 4080 12 GB: 1.099 Euro
Ahahaha.
it is a shocking character flaw in some people, but we gotta respect them and see them as equals, even if they are in reality below us.
If you plan to get a 4090 you might want to watch this:
and looks like the Corsair hx1000i I bought in 2020 is atx 12v 2.4 not 3.0 oof
So much of that video is bollocks and sensationalized.
Starting with the lifespan of the connector, 30 cycles. Do you know how many cycles normal PSU connectors are rated for? Or CPU sockets, or PCIE slots? Generally in the realms of 25-50. The 12 pin plug found on 30 series cards was also only rated for 30 cycles.
Next up, quoting the video
…the graphics card is going to by default now just pull as much power as it possibly can
You hearing this bollocks or wha? Obviously there's an issue of transient power draw spikes, an issue already seen with every previous gen of GPUs, especially the 3000 series. Outside of those spikes though, GPUs are hard-locked so they can't pull as much power as they want. You have to physically modify a card to break past those hard limits. Simply using a 12 pin cable without the data pins to a standard PSU won't automatically disable all power limits and voltage controls. Shit's an absurd thing to come out with.
By HonestVapes Go To PostSo much of that video is bollocks and sensationalized.
sounds like white jay-z
By Celcius Go To PostIf you plan to get a 4090 you might want to watch this:always people overblowing shit smh
bet nothing will be wrong when it releases
By HonestVapes Go To PostSo much of that video is bollocks and sensationalized.Not really going to disagree with you, just adding some discussion:
By HonestVapes Go To PostStarting with the lifespan of the connector, 30 cycles. Do you know how many cycles normal PSU connectors are rated for? Or CPU sockets, or PCIE slots? Generally in the realms of 25-50. The 12 pin plug found on 30 series cards was also only rated for 30 cycles.I think some CPU sockets are even lower. Mini-Fit Jr (like on EPS 12v) is rated for as low as 75 but as high as 1500. Complaints about the 30-cycle rating for the 12VHPWR connector are valid but should come with context.
Next up, quoting the video
By HonestVapes Go To PostYou hearing this bollocks or wha? Obviously there's an issue of transient power draw spikes, an issue that's already seen with every previous gen of GPUs, especially the 3000 series. Outside of those spikes though, GPU are hard locked to not pull as much power as they want. You have to physically modify a card to break past those hard limits. Simply using a 12 pin cable without the data pins to a standard PSU won't automatically disable all power limits and voltage controls. Shit's an absurd thing to come out with.I think there's quite a bit of misunderstanding in general around all this new power stuff and it's pretty understandable. These cards do have a ton of safeties and intelligence when it comes to balancing power loads but a lot of it is lost when using adapters that are essentially combining multiple cables into one. It's a bit like how you could still have problems today when using a cable that has multiple 8-pin connectors. PCI-SIG recently sent out an email that they are seeing some issues with some lower quality adapters and I think that's where a lot of this fear is coming from.
Won't be any problems on a 4090 if you have at least one of these scenarios:
1. use 4 individual 8 pin connectors to 12VHPWR (or 2 individual connectors on a proprietary PSU cable + splitters that are rated for 300W at the PSU end)
2. use a not bottom of the barrel cable with proprietary connectors designed for this + a PSU rated to handle it (something like https://tinyurl.com/mr2uhy9x and 1200W PSU)
Most of this stuff is built to go over spec, but I wouldn't be surprised if people using adapters on 850W PSUs with 4090s have some issues.
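Rough sanity check of scenario 1 above: four 8-pin PCIe connectors, each rated for 150W, feeding a single 12VHPWR plug. The wattages are the standard PCIe CEM ratings and the 600W figure is the 12VHPWR maximum; actual cards and PSUs vary, so treat this as an illustration, not a safety calculation.

```python
# Sketch: spare capacity on the PSU side of an 8-pin -> 12VHPWR adapter.

EIGHT_PIN_RATING_W = 150   # standard PCIe 8-pin connector rating
TWELVE_VHPWR_MAX_W = 600   # 12VHPWR connector maximum

def adapter_headroom(num_8pin: int, card_draw_w: float) -> float:
    """Spare capacity in watts; negative means the draw exceeds the rating."""
    supplied = min(num_8pin * EIGHT_PIN_RATING_W, TWELVE_VHPWR_MAX_W)
    return supplied - card_draw_w

print(adapter_headroom(4, 450))  # 4090 at stock 450 W: 150.0 W of headroom
print(adapter_headroom(3, 600))  # 3 connectors hit by a 600 W spike: -150.0 W
```

Which is the crux of the transient-spike worry: at stock draw everything is within rating, but fewer connectors plus a big spike eats the margin fast.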
By Patriotism Go To PostWhat does 30 cycles mean in this context?30 connect/disconnects. By context I mean comparing it to other cables, like how Vapes pointed out that's the same as the connector (I think it's technically the same connector, Micro-Fit 3.0) they used on the 3080/3090 FE cards.
By reilo Go To PostSome of those are almost as wide as a PS5 😬😬😬😬The largest card there works out to be roughly 4.3 litres in volume. Well over half the size of a Series X.
By diehard Go To Post30 connect/disconnects. By context I mean comparing it to other cables, like how Vapes pointed out thats the same as the connector (I think it's technically the same connector, Micro-Fit 3.0) they used on the 3080/3090 FE cards.Nice one, thanks
By reilo Go To PostThe 5080 RTX will be a standalone eGPU the size of a XSXI hope so. These things need the over the top cooling honestly.
By diehard Go To Post30 connect/disconnects. By context I mean comparing it to other cables, like how Vapes pointed out thats the same as the connector (I think it's technically the same connector, Micro-Fit 3.0) they used on the 3080/3090 FE cards.The standard 8 pin PCIE power cables are also rated for 30 cycles.
I’m waiting for standalone GPUs to become a thing. Surprised it hasn’t happened already with the external shroud being in RGB 🤷🏾♂️
By Smokey Go To PostI’m waiting for standalone GPUs to become a thing. Surprised it hasn’t happened already with the external shroud being in RGB 🤷🏾♂️Something to do with the pcie cables, I think. Look at how expensive even just pcie4 riser cables are.
I mean you can do external GPUs through USB-C/Thunderbolt but there's a hefty drop in performance.
PCIE riser cables have come down a lot in price, but making one that's long enough to route to an external enclosure without it looking unsightly would be a major challenge. They're also not very durable cables.
By Batong Go To PostAlso can do through the m.2 slotHas a major impact on bandwidth given it cuts down to 4x or 2x PCIE lanes. Might even be 1x on some M.2 slots.
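To put numbers on that bandwidth cut: per-lane throughput below uses the nominal PCIe transfer rates with 128b/130b encoding (Gen 3/4). These are theoretical one-direction figures; real-world numbers come in a bit lower.

```python
# Sketch: approximate PCIe bandwidth by generation and lane count.

GT_PER_LANE = {3: 8.0, 4: 16.0}  # GT/s per lane for PCIe Gen 3 / Gen 4

def bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s (128b/130b encoding)."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

print(f"x16 Gen4: {bandwidth_gbps(4, 16):.1f} GB/s")  # full-size GPU slot
print(f"x4  Gen4: {bandwidth_gbps(4, 4):.1f} GB/s")   # typical M.2 slot
print(f"x1  Gen4: {bandwidth_gbps(4, 1):.1f} GB/s")   # worst-case M.2 slot
```

So an M.2 eGPU adapter is working with a quarter (or less) of the bandwidth of a proper x16 slot, on top of the power issue below.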
I believe it also requires supplemental power too. Since a normal PCIE slot can deliver up to 75w and the M.2 can't.
Oh yes, power would need to be separate. I've just seen guys buy one of these to build some kind of eGPU setup; normally less of a drop compared to Thunderbolt, but still similarly clunky as you described with the PCIE route.