How far apart is the bandwidth on Thunderbolt vs PCIe? I imagine long-term it'll be solvable, but onboard will always be faster, so it will always lag.
By Batong Go To PostOh yes power would need to be separate, just seen guys buy one of these to build some kind of eGPU setup, normally less of a drop compared to Thunderbolt but still similarly clunky as you described with the PCIe routeADT-Link, what a company. Must've spent close to 200€ on their angled DP/HDMI extensions.
By reilo Go To PostHow far apart is the bandwidth on Thunderbolt vs PCIe? I imagine long-term it'll be solvable, but onboard will always be faster, so it will always lag.Thunderbolt 4 is 40 Gb/s, which is about 5 GB/s.
PCIe 3.0 x16 is ~16 GB/s, 4.0 is ~32 GB/s, 5.0 is ~64 GB/s. Watch the units: Thunderbolt gets quoted in gigabits, PCIe in gigabytes.
I might need Diehard to double check those numbers, I pulled them straight from Google.
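If anyone wants to sanity-check the unit soup (gigabits vs gigabytes is the gotcha), here's a quick and dirty conversion; it ignores protocol/encoding overhead, so treat these as ballpark marketing figures:

```python
# Rough conversion of link speeds to a common unit (GB/s).
# Ignores encoding overhead; ballpark marketing figures only.
links_gbps = {
    "Thunderbolt 4": 40,       # quoted in gigabits/s
    "PCIe 3.0 x16": 16 * 8,    # ~16 GB/s -> 128 Gb/s
    "PCIe 4.0 x16": 32 * 8,
    "PCIe 5.0 x16": 64 * 8,
}

for name, gbps in links_gbps.items():
    print(f"{name}: {gbps} Gb/s = {gbps / 8:.0f} GB/s")
```

So even PCIe 3.0 x16 has roughly 3x the raw bandwidth of a full TB4 link.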
I don't think bandwidth is a limiting factor with TB4 necessarily though. I think there's something else at play, but it's hard to pin down because I can't find any true like-for-like testing comparing a desktop system using a GPU in the PCIe slot vs the same desktop system with an external GPU. It's all laptop-with-eGPU vs desktop PC testing.
Yea eGPU seemingly is for those with something like MBPs that want the portability but an extra boost in a workstation-like environment. Nobody seemingly uses it for gaming. I'm not quite sure what Windows support on that front looks like, either.
In theory someone could get an eGPU for their MBP and dualboot into Windows and benchmark it that way.
TB4 in PCIe mode is actually just 4 lanes of PCIe Gen 3 (32 Gb/s, or about 4 GB/s). The bandwidth loss at its worst (high fps gaming) can be pretty bad and at its best (compute workloads) not impactful at all.
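The ~4 GB/s figure falls out of the standard PCIe 3.0 signalling (8 GT/s per lane with 128b/130b encoding) if you want to check it:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth from the standard signalling.
GT_PER_LANE = 8e9        # 8 GT/s per lane
ENCODING = 128 / 130     # 128b/130b line code
LANES_TB4 = 4            # TB4 tunnels a Gen 3 x4 link

gbytes = GT_PER_LANE * ENCODING * LANES_TB4 / 8 / 1e9
print(f"Gen 3 x4:  {gbytes:.2f} GB/s")      # ~3.94 GB/s
print(f"Gen 3 x16: {gbytes * 4:.2f} GB/s")  # ~15.75 GB/s
```

A desktop x16 slot has 4x the bandwidth of the tunneled x4 link, which lines up with high-fps gaming hurting the most.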
Corsair confirming that existing power supplies will work fine with RTX 4000: https://www.corsair.com/newsroom/press-release/ready-to-go-beyond-fast-corsair-announces-compatibility-for-nvidia-40-series-graphics-cards
Yeah, super happy with my 5950X for the foreseeable future, though I still want to see how the 7950X runs in "Eco mode" or whatever it's called for more sane power consumption. I really just want some power-efficient parts, because the extra heat and power consumption is no bueno when sharing a room with someone else who also games.
By Kibner Go To PostYeah, super happy with my 5950X for the foreseeable future, though I still want to see how the 7950X runs in "Eco mode" or whatever it's called for more sane power consumption. I really just want some power-efficient parts, because the extra heat and power consumption is no bueno when sharing a room with someone else who also games.Briefly touched on here. Outperforms a 5950X in Cinebench multi when in Eco mode. No other benchmarks though.
There's some good stuff in that video regarding overclocking too. Seems Ryzen Master has improved a lot on that front.
I'm honestly shocked that the 6-core 7600X, pulling only ~110W, still runs at 93C with a 360mm AIO. I get that thermal density has increased with the new architecture, but fuck me, that is incredibly shit thermal transfer.
Good stuff from Ryzen.
So the Seasonic Vertex ATX 3.0 PSUs don’t hit until mid-Dec 😩. There’s an order of operations of how I want to upgrade so…
Think I’m going to drop in a 13900K and switch my motherboard to my Maximus Extreme Z690, as it has a single PCIe 5.0 M.2 slot. Move to my 7000D case, get a new display (PG42UQ or Neo G7), and then upgrade to a 4090 in Dec/Jan along with a Vertex PSU.
This is the way. See if I can hold to it lul
The new AMD stuff looks nice but I’ll bet new X3D chips will be the true upgrade most gamers are looking for (unless Raptor Lake comes out swinging, which is very possible)
Definitely going to need the 4090 just to see an appreciable difference in game benchmarks between the new gen AMD/Intel CPUs.
Currently the only reason I’m excited for the new GPUs.
By Patriotism Go To PostGot a good deal on your fuel bills ey vapesI fucking wish, 29.3p per kWh and going up in October. Surprisingly though, between me and my girlfriend we only use on average 6.5 kWh daily.
lol (a complete joke video about the 4090's size and power)
https://media.discordapp.net/attachments/136270927263432705/1023371152158097448/0bdab6a520.mp4
e: ahh, the original video it came from:
I can't tell if that guy speaks Danglish, or if he just tries to sound annoying
Edit: Oh, Danish, I see.
I thought there were normal-dimension ones as well? Weird.
But anyway there really isn't much of a curve, if that is an issue.
There's the Samsung Odyssey Neo with Mini-LED but it's also curved.
And it was quite a hideous gamer design
By reilo Go To PostAlso the ASUS PG32UQX which fits all criteria but it's $2,300!!!
That is miniled
probably the hardest monitor in the game rn, but no hdmi 2.1
By Smokey Go To PostThat is miniled
probably the hardest monitor in the game rn, but no hdmi 2.1
Just checked last night and it's like £3,300 over here 😭😭😭
By Smokey Go To PostThat is miniledYea meant to say it's Mini LED which is totally great but fuck that price.
Some of these are QHD at 32" and I'm like why bro
While price increases have only affected the Core i5 this year, power requirements are up all around. All Core i5, i7, and i9 chips have a base power of 125 watts, but the Core i7 and Core i9 will both need 253 watts for their max turbo power. That’s a 5 percent jump from the 241 watts on the Core i9 last year, but it’s a massive 33 percent jump for the Core i7, which moves from 190 watts last year to 253 watts for the 13th Gen. Intel’s Core i5-13600K now needs 181 watts for max turbo, up 20 percent from the 150 watts on the Alder Lake version.
fuck the earth and your utility bill - intel
Brightside: that's gaming and heating sorted for the winter tho.
edit: the new AMD chips boost at 95C whew
By inky Go To PostBrightside: that's gaming and heating sorted for the winter tho.They're basically overclocked out of the box. They run much cooler and much more efficiently in terms of power when locked to a lower power target, but they lose some performance.
By inky Go To PostBrightside: that's gaming and heating sorted for the winter tho.AMD took a new approach to their boost algo with this new gen. It basically doesn't care about heat until 95C and only then will it start backing off power/voltage. A better cooling solution won't result in lower CPU temps but instead more performance. (excluding extreme overclockers)
At least AMD offers a built-in "Eco mode" that lowers power consumption. Not sure if Intel offers a similar option.
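If the boost-to-95C behavior sounds abstract, here's a toy loop of the idea (emphatically not AMD's actual Precision Boost logic, just the concept of boosting to a temperature target):

```python
# Toy model of a boost-to-temperature-target loop. Illustrative only;
# AMD's real algorithm also weighs current, voltage, and power limits.
TEMP_TARGET_C = 95.0

def boost_step(power_w: float, temp_c: float) -> float:
    """Push power while under the target, back off once we hit it."""
    if temp_c < TEMP_TARGET_C:
        return power_w + 1.0   # thermal headroom left -> boost harder
    return power_w - 1.0       # at the target -> shed power/voltage

# A better cooler keeps temp_c lower at any given power, so this loop
# settles at a higher power (more performance), not a lower temperature.
```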
By Kibner Go To PostAMD took a new approach to their boost algo with this new gen. It basically doesn't care about heat until 95C and only then will it start backing off power/voltage. A better cooling solution won't result in lower CPU temps but instead more performance. (excluding extreme overclockers)
At least AMD offers a built-in "Eco mode" that lowers power consumption. Not sure if Intel offers a similar option.
Yeah you can do the same with Intel. You can set whatever power target you want with basically any CPU as long as the motherboard BIOS allows it. I believe you can do it through XTU in Windows but I’ve never bothered with that.
I have my 12700 locked to 65W for the long-term power draw. I can’t remember what I set the short-term boost to, I think 125W for 56 seconds, but stock is 180W indefinitely.
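That maps onto Intel's PL1/PL2/tau scheme, roughly like this (simplified sketch using the numbers from my setup above; real turbo tracks a moving power average, not a hard timer):

```python
# Simplified model of Intel's power limits. Real hardware tracks an
# exponentially weighted average of power, not a simple elapsed timer.
PL1_W = 65    # long-term (sustained) limit
PL2_W = 125   # short-term boost limit
TAU_S = 56    # roughly how long PL2 is allowed before falling to PL1

def allowed_power(seconds_under_load: float) -> int:
    return PL2_W if seconds_under_load < TAU_S else PL1_W
```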
I looked into the numbers for the 7950X. Locked to 105W, the power consumption drops by around 170W (total system power) and you only lose 20% performance in Cinebench multicore.
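Quick perf-per-watt math on that, assuming a hypothetical ~400W stock total system draw (that baseline is my guess, not a measured number):

```python
# Rough efficiency comparison using the figures above. The 400W stock
# system draw is an assumed baseline, not a measurement.
stock_w = 400
eco_w = stock_w - 170              # ~170W total system savings
stock_perf, eco_perf = 1.00, 0.80  # ~20% Cinebench multicore loss

print(f"stock: {stock_perf / stock_w * 1000:.2f} perf/W (normalized)")
print(f"eco:   {eco_perf / eco_w * 1000:.2f} perf/W")
# Eco mode comes out ~40% more efficient under these assumptions.
```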
If you want to push it, it can still run optimally up to 115C, damn
The new chips look good, but the limiting factor is the mobo price and DDR5.
Might legit look for a 5900x -> 5800x3D up/down/sidegrade and not care about CPUs for a generation or three.
AM5 mobo prices are out of control, lol. B650 gonna be essential for a lot of people.
I'm still on an 8700K and don't really even have any need to upgrade, but dammit, I want to build another computer again, it's been long enough, so I'm prob going Raptor Lake mostly for fun.
By diehard Go To PostI'm still on an 8700K and don't really even have any need to upgrade, but dammit, I want to build another computer again, it's been long enough, so I'm prob going Raptor Lake mostly for fun.Du eet.
There's no way anyone's topping this graph anytime soon. Incredible work from Pat & the boys.
#TheThinRedLine
Was talking shit about it earlier.
First, the games picked
Second, AVG FPS Ratio? What in the flying fuck
Might be the one to get. Should be cheaper than the AW3423DW in theory 'cause no G-Sync tax, or at the very least on par, but I fully expect them to adjust pricing, especially seeing how Samsung are charging 1700€ for their QD-OLED ultrawide.
Nvidia went and made power usage so high that Alienware had no choice but to do somewhat of a mesh-front case. Amazing.