yeesh
gsync monitors aren't dropping in price at all--I actually upgraded my asus by hand for the gsync mod but the prebuilt variant is like the same price as it was years ago
definitely going to have to go second hand if I upgrade
By Smokey Go To PostI still think that ASUS monitor that's hitting in Q3 is the end game. 144hz, local dimming, HDR, 4K, Gsync. It's also going to be $1,800 - $2,000.
Yeah, that's exactly what I'm after; I'm not really bothered about OLED.
I want two of them though, and don't really want to spend more than £1500, which might not even get me one lol...
there's a good chance in a year you'll be able to get a 4k OLED that does pretty much all of that though lol, at 55 inches and near the same price
By Dark PhaZe Go To Postyeesh
gsync monitors aren't dropping in price at all–I actually upgraded my asus by hand for the gsync mod but the prebuilt variant is like the same price as it was years ago
definitely going to have to go second hand if I upgrade
Yeah... dropped $799 on my Swift at release, which was what, 2+ years ago... and they're still about the same price for 1440p. Plus with all the new resolutions they've actually gotten more expensive.
yea no...
I love gsync but I'll take a decent 1440p low input lag monitor at 60 frames and just use fast-sync to prevent tearing rather than dish out that much
would be a fuking shame having to lock the fps at that though
By Dark PhaZe Go To Postthere's a good chance in a year you'll be able to get a 4k OLED that does pretty much all of that though lol, at 55 inches and near the same price
I'd be using these as monitors, so 2x27" is ideal for me. I guess I just haven't read enough about OLED; seems there are issues with burn-in? How's input lag? My current monitor is kinda dying, so when it's done I'll be in the market. Not sure how long that'll be from now.
By ReRixo Go To PostI'd be using these as monitors, so 2x27" is ideal for me. I guess I just haven't read enough about OLED; seems there are issues with burn-in? How's input lag? My current monitor is kinda dying, so when it's done I'll be in the market. Not sure how long that'll be from now.
It's all model-specific. You will need to Google reviews for any monitor you are interested in. OLEDs in general seem to have higher input lag.
By Dark PhaZe Go To Postyea no…
I love gsync but I'll take a decent 1440p low input lag monitor at 60 frames and just use fast-sync to prevent tearing rather than dish out that much
would be a fuking shame having to lock the fps at that though
I actually put my Swift up for sale on GAF. Took a look at prices on eBay and was shocked to see them in the $500+ range used.
Didn't realize they were worth half a rack even after this time.
Ryzen launches in March
http://www.pcworld.com/article/3163500/components/amd-confirms-its-ryzen-cpu-will-launch-in-early-march-followed-by-the-vega-gpu.html
“There will be widespread system availability from day one,” Su said during the call. Channel vendors will receive the first Ryzen chips, along with system integrators. More traditional hardware vendors will come later, Su added.
That statement implies that vendors like Dell or HP (neither of which have been officially confirmed to be using the Ryzen chip) will be asked to wait, while AMD caters to boutique PC vendors. Ryzen will take on Intel’s highest-end Core chips, specifically the Core i5 and Core i7 processors, Su said.
Vega Q2
What’s new, though, is a Vega timetable: Su revealed that the Vega GPUs will ship during the second quarter as well. In the second half of 2017, AMD still plans to launch a Zen-based APU, codenamed “Raven Ridge,” primarily designed for notebooks but also some desktops.
By Smokey Go To PostPeople been waiting for AMD to save industries for years now. Not sure they have it in them tbh.
CPU maybe; they have provided decent GPU competition fairly regularly.
There's a new NVIDIA HotFix driver out
http://nvidia.custhelp.com/app/answers/detail/a_id/4378
This is GeForce Hot Fix driver version 378.57 that addresses the following:
- Fixed crash in Minecraft and some other Java-based titles
- Resolved 'Debug Mode' as default option on Pascal based GPUs
I didn't know the bolded was a thing. Checked my NVCP and yep, debug mode is checked and grayed out. What does debug mode do?
Beginning with driver version 355.82, Nvidia added a new feature called "Debug Mode", which will downclock any factory-overclocked graphics card to Nvidia reference clock speeds. This is a great feature to help determine problems with the card, or for games that are sensitive to overclocked video cards. To turn on Debug Mode, open the Nvidia Control Panel, go to the Help menu, and check the Debug Mode option. If you have a reference model card, this option will be grayed out.
Except... I don't have a reference model. I have the 1070 Strix. It basically makes your custom card act as an NVIDIA reference card with regards to clock speed and boosting. Installed the HotFix and nope, still grayed out. I did some testing and my clocks from AfterBurner were reporting higher than a standard Asus Founders Edition 1070. Then I started OCing and the OC clock was being applied as expected in Heaven. So I dunno.
Kibner, you might wanna check yours when you can. Go to Nvidia Control Panel -> Help and you should see if Debug Mode is on/off. On a side note, do you have Samsung or Micron memory on your GPU?
Yeah, it says debug mode is enabled. Don't have time to check if it is actually affecting my clock speeds before having to leave to meet with a client.
I have no idea whose memory is on my card and no idea how to check.
GPU-Z or HWInfo64
Samsung memory is what you want. Micron is notably slower. I think if you got the 1070 closer to launch you got Samsung memory. I got it by luck somehow.
I was able to get my card to OC to 2025 MHz on the core, and my memory to, I think, 8290. Card still didn't break 60°C.
I'll try to check that out when I get home. I still haven't tested how high my card can clock; been running in silent mode instead.
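For what it's worth, those memory numbers can be sanity-checked with napkin math. A quick sketch, assuming the GTX 1070's 256-bit memory bus and reading a figure like 8290 as the effective GDDR5 data rate in MT/s (stock spec is 8008 MT/s; how AfterBurner reports the number is an assumption here):

```python
def memory_bandwidth_gbs(effective_rate_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective data rate (MT/s) times bus width in bytes."""
    return effective_rate_mts * (bus_width_bits / 8) / 1000

# Stock GTX 1070 memory: 8008 MT/s effective on a 256-bit bus -> ~256 GB/s
stock = memory_bandwidth_gbs(8008, 256)
# Overclocked to 8290 MT/s effective -> ~265 GB/s
overclocked = memory_bandwidth_gbs(8290, 256)
print(f"stock: {stock:.0f} GB/s, OC: {overclocked:.0f} GB/s")
```

So under those assumptions the memory OC is worth roughly 9 GB/s of extra peak bandwidth over stock.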
By Kibner Go To Post1060 is probably the card of the people, tbh. The 1070 is for those who are ok with paying twice as much to get 60fps at ultra settings as opposed to 60fps at high.
I have a strange fear that if I try to push my 1060 I will discover its FPS limits and will despise it. So I've only been playing indie games and StarCraft 2 on it while in bed before sleep. You can't just upgrade a laptop GPU.
I have a strange fear that if I try to push my 1060 I will discover its FPS limits and will despise it. So I've only been playing indie games and StarCraft 2 on it while in the bed before sleep. You can't just upgrade a laptop GPU.
The wife loves having me in bed at night instead of the basement, btw. Buying MYSELF a gaming laptop has paid off more than me upgrading HER solitaire.
I'm the lazy enthusiast. I don't have, or want to invest, the time to tweak for performance. Just buy the hardware, crank everything to ultra, and expect 60fps. That's why I upgraded from a 970 to a 1060.
The monitor game, idk. I got burned on my ASUS. It died in 1.5 years, so I'm just playing on a 32" LCD. Gaming monitors are expensive AF.
By Adam Blade Go To PostThe struggle is real. :(
LoL!
By RobNBanks Go To Postshould i sell my pc parts separately or try to find someone who would buy the whole thing.
Probably depends on the parts. What are the components?
It's likely to be quicker to sell everything together than to finish off selling everything individually.
It's really of no use to me; I don't need a 4790K and a 970 to browse slaent. I lost interest in gaming a while ago, and now I'm just focused on my gf and her daughter.
Before all of this I wanted to do a new build from scratch anyway, so even if I ever get the urge to get back into gaming, this rig is still unnecessary.
By Smokey Go To PostGPU-Z or HWInfo64
Well, I have Micron memory and I really don't feel like being arsed to manually OC'ing this thing. Maybe I'll try it while watching this Clippers game.
By Kibner Go To PostWell, I have Micron memory and I really don't feel like being arsed to manually OC'ing this thing. Maybe I'll try it while watching this Clippers game.
gg
By Kibner Go To PostWell, I have Micron memory and I really don't feel like being arsed to manually OC'ing this thing. Maybe I'll try it while watching this Clippers game.
Hello fellow Micron bro
Acer unveils 38" Ultrawide curved monitor
https://www.techpowerup.com/230313/acer-unveils-the-xr382cqk-38-inch-ultrawide-curved-monitor
Freesync
I will say....There are way more options for freesync than there are gsync, and they are cheaper. Will be paying attention to Vega for this reason really.
With RE7 showing what a mainstream AAA VR game can do, I'm becoming a little impatient for Vive's VR sequel. Valve also needs to bring something to the table software wise.
RGB RAM - GSKILL Trident
If I had a DDR4 platform I'd be in there...
Apparently RGB RAM is actually hard to do
By Smokey Go To PostApparently RGB RAM is actually hard to do
Really? I wonder why... Is it because it requires extra power and RAM is pretty fiddly with voltage and power and timings?
By Kibner Go To PostReally? I wonder why… Is it because it requires extra power and RAM is pretty fiddly with voltage and power and timings?
Linus gave some indication of why in the beginning of his video. Something about signal integrity issues, and that's just on regular LED RAM sticks, let alone RGB.
Switched from my Logitech G502 to my Sensei Rival and saw an immediate increase in my play on BF1. The Rival isn't as feature-heavy as the G502, but man, it feels a lot better in my hand. The logo is also RGB, so that works out well with everything else, Kibner :)
They announced the Quadro GP100; it's what I would want the 1080 Ti to be (it won't be), with 16GB of HBM2. Don't need the ridiculous FP64 performance though.
Ryzen (Euro) pricing revealed
http://www.guru3d.com/news-story/amd-ryzen-r7-lineup-of-8-core-16-thread-cpu-prices-revealed.html
AMD Ryzen R7 1800X
They start with the AMD Ryzen R7 1800X. This is the flagship processor; it has 8 cores with 16 threads and is assumed to get a Boost frequency of 4.00 GHz. The boost frequencies are not confirmed, but the indications we have seen the past few weeks would suggest a 4.0 GHz Turbo and a 3.6 GHz base clock. No further data was revealed. Now keep in mind (if the perf is close) that a similar 8-core Intel CPU would cost you about 1,200 euros; the flagship Ryzen R7 1800X processor would cost 599.99 euros. These are unlocked (multiplier) processors.
AMD Ryzen R7 1700X
The next AMD Ryzen in line is the R7 1700X. This one would again get 8 cores and 16 threads, but this time at a Turbo frequency of 3.80 GHz, so yes, this is pretty much the same processor, just with a lower base and Turbo frequency, albeit the Turbo on this model is again not confirmed. Both the R7 1700X and 1800X are focused at enthusiast and professional usage, and the price for the 1700X would be 469.99 euros.
AMD Ryzen R7 1700
Then there is the AMD Ryzen R7 1700; this would be a top-end CPU for gamers and yes, again you'll receive an 8-core, 16-thread processor, this time at a Turbo frequency of 3.70 GHz. Most notable is that it is the only model that indicates a TDP, which is set at 65W, whereas the other two would be 95W parts. This unit could indeed be very attractive to DIY PC gamers and would cost 389.95 euros, which is spot on with the Core i7 7700K. Even if the IPC perf would be a notch slower due to clock frequencies, you'd still get 8 cores over 4 on that Intel processor.
1800x = $640
1700X = $500
1700 = $415
The 1800X is substantially cheaper than Intel's 8-core offering, let alone the 10-core. Going to be watching the benchmarks on this one. I've said it before, but my only upgrade path is 8+ cores. I've been on 6 for 2+ years, so I'm not going down to 4, nor buying a whole new platform (Skylake-X) for another 6-core processor.
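To put rough numbers on that value gap, here's a throwaway price-per-core comparison at the rumored USD prices above. The Intel i7-6900K figure is its approximate launch list price, included purely as an illustrative assumption:

```python
# Price-per-core at the rumored Ryzen prices; the Intel entry is an
# approximate launch list price, used here only as an illustrative assumption.
cpus = {
    "Ryzen 7 1800X": (640, 8),
    "Ryzen 7 1700X": (500, 8),
    "Ryzen 7 1700": (415, 8),
    "Core i7-6900K (~$1,089, assumed)": (1089, 8),
}

def price_per_core(price_usd: float, cores: int) -> float:
    return price_usd / cores

for name, (price, cores) in cpus.items():
    print(f"{name}: ${price_per_core(price, cores):.2f}/core")
```

At these numbers, even the 1800X lands at roughly half the per-core price of Intel's 8-core part.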
By diehard Go To PostCool, but come correct with a proper high frequency 4 core CPU please.
Looks like the 4C/8T line will be between a 3.2 and 3.5 GHz base clock, according to the chart.
Wow. I'm gonna need benchmarks on the software I use before pulling the trigger, but those prices are head-turners.
Currently at Frys typing this on the ASUS PG348Q.
Say man
This thing is legit af. It's huge. First time using a Ultrawide monitor too.
It is priced at $1,189 tho.
By Smokey Go To PostCurrently at Frys typing this on the ASUS PG348Q.
Ultra-wide is great to replace side-by-side dual monitor setups. If I got one, it would ensure that I would be playing all my games in windowed mode, though. That shit is too wide, if not.
Fry's is literally Newegg in the flesh. Got everything here. Might actually be worse than Microcenter. Walked by the PSU aisle and found out that an RGB PSU does, in fact, exist.
By Smokey Go To PostFry's is literally Newegg in the flesh. Got everything here. Might actually be worse than Microcenter. Walked by the PSU aisle and found out that an RGB PSU does, in fact, exist.
Yeah, I always make it a priority to stop at one in Dallas whenever I go there for Quakecon. It's where I picked up my last two video cards. lol
They don't have any in Louisiana. :(
Houston is #blessed to have multiple Fry's and a Microcenter in town. I can't imagine doing inventory in either. Walked by the mice aisle and they still got Battlefield 4 edition gaming mice for sale.
Fry's is great. There's one outside of Portland but it's a good hour-long drive. Not sure there are any in the Bay.