HUB - DON'T Upgrade Your GPU, Upgrade Your Monitor Instead!
But ur mouse cursor and web scrolling be smooth af tho.
minesweeper unlocked FPS was pure fire!!!
reddit smooth scrolling the best
I fully agree with Hardware Unboxed on this.
Your monitor is literally the one component you are always using. A good monitor upgrade is much more meaningful than a GPU upgrade.
I typically go through 2-3 gpu upgrades for each monitor upgrade. Even old games look great on an oled.
Even old games look great on an oled.
well there's a major problem here.
if you upgrade your graphics card every 3 years, then the monitor would need to last 9 years.
which is not a problem for lcd garbage on average.
BUT oled is expected to burn-in way before that. depending on use WAY WAY WAY before that.
which is quite an issue until oled is gone and replaced by burn-in free tech.
so depending on your use then a graphics card can outlast the monitor, if it is an oled monitor.
even Monitors Unboxed already saw burn-in on the model they are using: a line in the middle, likely due to window tiling.
Depends how much you value having an OLED, I guess. I had a super dumb moment where I took a screenshot of Terraria shortly after getting my OLED, because I was so excited to share how nice it looked.
I wouldn't blame you, some stuff just makes OLED look like a fucking window instead of a screen.
OLED gives realistic contrast ratios. IPS at 1000:1 feels so fake once you've seen the real inf:1.
it's been 1.5 years and my oled isn't getting dimmer and i don't have burn-in. sometimes image retention, but not burn-in, despite having a taskbar on for 6 months earlier from limit testing. in fact, i might re-enable it, because the burn-in i thought i had was just retention that cleared up. it'll be fine
are you expecting to use the monitor only for 2 years?
because maybe with your use case it can make it 2 years.
we KNOW however, that burn-in is expected:
https://www.rtings.com/tv/tests/longevity-burn-in-test-updates-and-results
it'll be fine
based on the data, it will NOT be fine.
will you be free from burn-in for 5 years? 8 years? 10 years?
i've been using good displays for 10+ years, will you be able to?
i'm having fun with my oled monitor while you're questioning someone else's decision on reddit, so i already know who the winner in life is. enjoy anxiety and indecisiveness, lmao
ground truth doesn't lie, and i'd rather try something than be afraid of not even a real unknown, but a potential unknown. worst case i suffer from a small smidge of burn in and agonize over my next monitor
edit: yeah i think i'll be fine https://www.reddit.com/r/OLED_Gaming/comments/1cq435s/rtings_1_year_burn_in_results_for_1st_gen_qdoled/
i already know who the winner in life is. enjoy anxiety and indecisiveness, lmao
the winner is the industry selling inherently broken tech at increased prices and leaving customers with the issues, for what might be a massive investment to some people, but maybe not you.
ground truth doesn't lie
ground truth shows lots of burn-in in the rtings test at 12 months if you look at tvs.
are you expected to just dump the screen if it has burn-in in 3 years? well, that is planned obsolescence, which i certainly don't like and neither should you.
worst case i suffer from a small smidge of burn in
the worst case is massive burn-in that is a major issue.
but it sounds to me like you got the money to buy a new one if needed and are fine with the planned obsolescence inherent in the technology.
i wonder if you consider it a problem that people who saved up 6 months to buy a new monitor are buying one under the FALSE IMPRESSION that burn-in is fixed, and after just one year they see major burn-in problems and the company ignores any warranty claims.
is that acceptable to you? because it isn't to me. i want people to be aware of problems with tech. i want customers protected from marketing lies.
ground truth shows lots of burn-in in the rtings test at 12 months if you look at tvs.
it also has my exact model in question, if you want accuracy and not extrapolation from TVs, with the extent of burn-in easily visible on the 50% grey slide
and i don't consider it a problem because you are speculating. i believe dell, with their business arm etc., will likely honor the 3-year warranty, but i can't say that for sure any more than you can say for sure that the company will "ignore any warranty claims."
you're still without actual hard claims. you're still speculating. this is no better than a youtube video essay explaining what oled is and how burn-in is possible. i understand customer protection is important, but i'm not dell trying to swindle customers, i'm just saying in practice 99% of people won't be affected, so they should enjoy life. if they got an oled for non-business purposes and can't afford anything else once they get mild burn-in, that's a them decision-making issue that won't cause them to go into poverty or die
man, how often do you upgrade GPUs? burn-in on an oled would make it need replacing every 2 years or so for me (a lot of bright static elements)
I daily an lcd for wfh use and limit the oled for gaming and video content. I swing either monitor into center use using monitor arms.
I don’t play the same game for thousands of hours so don’t expect it to burn in. We’ve had an oled tv for a half decade now and don’t have any burn in so I made the leap for the pc finally as well.
Well your use case is different. I use my monitors for work that may include things like 16 hours straight of static UI elements with no time for LED cycling in between. Then the same thing next day, and the next, for months.
also, the traditional "but that's an incremental gain / not really worth it" argument sorta falls apart at the top of the price scale. when you are doing a $2k rig, another $100 to make the whole thing 10% faster, or to get a CPU that will last you 5 years instead of 3, is objectively good value. and since the gpu market has gotten more and more marginal too, the same pressures are occurring here.
the step from 4070 ti to 4080 can buy you a pretty nice monitor, or a nicer CPU and max out your memory, or max out your memory and buy a 1.5tb optane for storage, etc. like yes those things have traditionally not been considered worth it, but if you're looking at spending $400 for essentially an incremental step in gpu tiering or whatever, or if you're going to spend $800 more to go from 4070 ti super to 4090 or whatever, there are legitimately other things you can buy that will be a nice QOL boost around the edges and give you a little bit of immunity to some potential problems down the road (eg extra memory+optane should negate a lot of directstorage problems).
that said, HUB weren't preaching the merits of spending more for a nicer monitor ten years ago when it was a nice gsync ips monitor vs some flaky flickering samsung CF791 freesync VA monitor. they were camp "spend $150 less for a piece of junk monitor so you can spend $200 more on Vega and support the red team" - here's their advice at the time, they were telling people to buy a 1080p 144 hz monitor into 2018 and onwards and that a nicer monitor wasn't really worth it.
I have said for a long time, if you bit the bullet and bought a refurb XB270HU or XB271HU IPS back in 2015 or so for $350-520 (they actually increased a bit over time as they dialed in prices) you've enjoyed a really nice monitor that still is fairly relevant today etc. I paid $270 for a scratch-and-dent and used it for a couple years, then sold it during pandemic for $200 or something - those monitors are still relevant today actually, albeit not anywhere close to leading edge of course. That's a fine value prop overall, versus the radeon graphics circus (freesync 1440p meant fury x in the early days, or vega) and the early freesync clownshow etc. And sure, XF270HU and Nixeus EDG were great monitors too, but they were the exception in the freesync ecosystem.
HUB is in fact pretty often defined by elevating the "who wins at the $250 price point!?" horse-race stuff above the broader question of "should I, a gamer who wants 5 years out of my system, broaden my budget and buy something earlier/more expensive but also better" like gsync vs freesync, or 8700k/9900K vs 1800X/2700X. And in fairness the zen3/vcache stuff that came later is definitely good, but if you just bit the bullet and bought the 8700k or 9900K you were most of the way to zen3 performance in 2017/2018 respectively. If you had to buy a 7000 series, buying 7700K got you a lot further than 7600K did, and I think that was actually increasingly obvious for most non-gaming stuff - 5820K and 6800K were very justifiable purchases on the merits, despite the sentiments at the time.
There are actually times where the lower-tier stuff is obviously not future-proof and isn't going to make it, that's true of VRAM today I think too. If you want a 5-year card, 8gb or even 12gb isn't it, but also AMD is pretty consistently behind on software (consider: the focus of RDNA4 is really on RT, so after FSR, AMD needs to get to work on ray reconstruction/radiance cache/etc and start getting it into games...). And future-proofing is sometimes a very serious question, when things are going to slow down for a couple generations etc (intel obviously was going to stall out on skylake for a while and AMD was behind etc) sometimes it's just worth forking over the extra money for the higher-end product. You might be using it a while, and it gives you the best shot against these feature cliffs where most of the hardware in the market just isn't going to make it - and that’s including rdna2/3 cards with weak RT and no/weak ML support (respectively) for upscaling etc, vram alone isn’t enough to make it last either.
That's because most monitors to this day continue to be garbage.
There's been basically 0 improvement. When there's no new tech, there's no need to buy anything.
I got my 27GL83-A in 2020 for $345. Nowadays $350 or at best $300 gets you the same thing except with an extra 21Hz (165Hz) woohoo.
But you're still getting the same low contrast garbo IPS with useless HDR and backlight bleed and glow.
Even with artifacts, HDR backed by a good local dimming implementation can be quite striking. Also, if you're primarily using your computer for gaming and media consumption, OLED is flat-out gorgeous, although, unlike LCD, you're unlikely to want to hang onto a single monitor for 5+ years due to both wear concerns and the fairly rapid advancement in sustained and peak brightness.
I've been carrying my PC down two flights of stairs every weekend to play games on our new OLED tv and I don't think I'll be stopping anytime soon
That’s false, even looking at your example. I just bought a 4K 165hz HDR1000 (so real HDR) Mini LED for $400. Now, it was a refurb, but there’s a curved Samsung monitor floating around right now at the same price with the same specs.
Mini LED with FALD is so much better than anything else I’ve tried other than $500+ OLEDs.
The Samsung Odyssey G7 (the one with 1196 zone MiniLED and 4k 144Hz) was $450 brand new the other day.
the 32 inch?
I think so
Yeah, and $400 with student discount.
There's definitely been stagnation in the desktop PC monitor market because it's shrinking and that leads manufacturers to invest less and less into the market segment.
That said, it also means already excellent monitors are getting cheaper. And using TVs for PC monitor use is getting increasingly viable. A 42 inch OLED TV can be had for barely any more than a high end IPS mini-LED monitor nowadays, and often be even cheaper for way bigger size.
Nowadays you can get your level of monitor quality for less than $200 and for $280 you get MiniLED with true HDR. What do you mean there has been basically 0 improvement?
IPS monitors have indeed not improved as much as VA or OLED (OLED gaming monitors didn't even exist in 2020). But IPS monitors got WAY cheaper, are starting to come with better contrast (as much as 1500:1) and have better colors.
They fucking scammed us blind when we went from CRT to LCD displays.
Yeah, they are lighter and higher definition, but a CRT from 1997 shits all over most of today's LCDs in motion clarity and contrast.
Monitors are well worth splurging on, same principle as overkill PSUs.
Your monitor is literally the one component you are always using
I wouldn't want to be using a messy subpixel layout for a traditional 2D textual desktop interface though, which is the main reason I haven't seriously considered an OLED monitor yet.
Overall agree, but for some people with old hardware it's the other way around. I was on an RX 480 8GB until recently, when I got an RX 6700 XT.
I have an older BenQ XL2411Z, 144Hz, but it is still decent, and I think the GPU upgrade made more sense in my case.
Perhaps it's time for a new 360Hz screen? I play CS2.
Big Monitor spreading lies all over again I see smh.
Jokes and shitty clickbait titles aside, there's an argument to be made when you have limited money and a decent GPU costs as much as an OLED TV: if your current GPU has access to modern technologies (so at least RTX 2000 or RX 5000), you're probably better off skipping a gen and just getting a good display while relying on upscaling/frame gen. Besides, this gen was dogshit anyway.
besides this gen was dogshit anyway
It's not super exciting, but there's still decent value in some tiers.
Right now you can get a 7700 XT, giving you 85% of the 3080's performance at 1440p, with 2GB more VRAM, for little more than half the 3080's MSRP. I'd say that's pretty solid.
More than half of the MSRP of a 4 year old card? Incredible
It's like $30 above being exactly half, and it comes with 2 games bundled as well.
It's a decent deal no matter how much people want to whine.
I don't think over half the price of a 4-year-old card for 85% of that card's performance (and not the same feature set, as most people prefer DLSS to FSR) is a good deal, especially considering the 3080 came with a free game as well afaik. But I think value is very subjective, so it's fair that you, and I'm sure many others, think it's good value. No need to say I'm "whining", I just said it's not a good deal imo?
The 3080 not only did not come with a free game, you were essentially unable to buy it for this mythical 30 series MSRP until the very end of the generation. This is the major issue of comparing 6000/30 series MSRPs to 7000/40 prices: the former were never real.
That's not true, I bought mine for $700 two months after launch… Sure, not everyone could, but that's only because demand was insane (and scalpers)
Your experience was nothing like the experience for 99% of people.
i have never used any of the free games that come with my GPU. they are always games i'm not interested in.
the 7700xt is 5000 SEK, i.e. pretty much €439/$478 (let's say €/$500) in sweden, and pretty much the same in the rest of the nordic countries.
the 3080 was about 8k SEK, then during the crypto craze it was about 10k SEK (divide SEK by roughly 10 for USD/EUR).
so like many are saying, you get lower perf for, say, half the price after 4 years. that is not progress at all, that is standstill. back in the day you would get the same perf for less after just a year or max two, or much better perf for the same price. that was when there was actual progress.
Now the progress is locked behind a super high paywall, just like in the car world: if you want perf you have to pay a hefty premium.
You can't call it stagnation when you literally acknowledge there is progress lmao. "Back in the day" your card was obsolete in a couple of years and depreciated massively.
Unfortunately, yeah, the price/performance isn't going up as fast anymore. Go talk it out with TSMC engineers and ask them why they can't make their wafers cheaper, or copper miners and ask why they can't work for free so the PCBs and coolers are cheaper, because if either Nvidia, Intel or AMD could build GPUs cheaper, one brand would already be trying to undercut the others.
it is actually not as bad as you say: if raw material pricing has increased by 50% (an example), that does not mean you need to increase the price of the final product to the customer by 50%, you just increase the price by the extra cost of the materials used to make it. yet greedonomics, or price salting, makes everything so much more expensive when it doesn't need to be like that at all.
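to make the pass-through point concrete, here is a toy calculation in python (every number is made up for illustration, especially the material share):

```python
# toy cost pass-through sketch: if raw materials are only part of the
# final cost, a 50% material price hike justifies a much smaller
# increase in the retail price. all numbers below are made up.
retail_price = 500.0       # hypothetical GPU price
material_share = 0.30      # assume 30% of the price is raw materials
material_hike = 0.50       # materials get 50% more expensive

justified_increase = retail_price * material_share * material_hike
print(f"justified price bump: ${justified_increase:.0f} "
      f"({material_share * material_hike:.0%} of retail), not 50%")
# -> justified price bump: $75 (15% of retail), not 50%
```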
With shit drivers and support and no dlss and rt
This gen may be dogshit, but I like how the thermals are more efficient now, and the power consumption as well, so I don't have to upgrade my PSU (and risk getting a dud when mine has been working for years)
This gen isn't shit in terms of actual progress, it's great in fact, but it's completely garbage on price/perf.
The 4090, which isn't even the best nvidia could have pulled this gen, is basically a Pascal-tier jump, or at least close to it, but then you see the prices across the board and just get sad.
frame gen
interpolation frame gen is worthless garbage and objectively only does visual smoothing, at the great cost of increased latency.
based on that, the recommendation is to have at least 60 real fps before even thinking of turning on fake frame generation.
so interpolation frame gen certainly can't keep things going on an older, weaker card.
reprojection frame generation could, but we don't have that on the desktop yet.... for some dumb reason.
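(rough napkin math on why the 60-fps floor exists: interpolation has to hold back one real frame so it has two frames to blend between, so it adds at least one source frame time of latency. quick illustrative python, my own numbers:)

```python
def interpolation_added_latency_ms(base_fps: float) -> float:
    # interpolation needs frame N+1 before it can show the fake frame
    # between N and N+1, so it delays output by >= one source frame time
    return 1000.0 / base_fps

for fps in (30, 60, 120, 500):
    print(f"{fps:>3} real fps -> at least +{interpolation_added_latency_ms(fps):.1f} ms")
# 30 real fps -> +33.3 ms, which is why low-fps interpolation feels awful;
# 500 real fps -> only +2.0 ms (the extreme case discussed below)
```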
Framegen has a future...
Not for 30 FPS games, and interpolation framegen will never be for competitive FPS/fighting games...
BUT...
No problem using it on most single player games to drive a 360/480Hz monitor...
At a base FPS of 180-240, being behind a frame or two in latency isn't that bad. But what you get is the smoothness of a 360/480Hz monitor...
I mean, so what if your reaction time to shoot that NPC in GTA VI is slightly slower...
At a base FPS of 180-240, being behind a frame or two in latency isn't that bad. But what you get is the smoothness of a 360/480Hz monitor...
yeah, BUT we can do better.
also, if you wanna take your argument to the extreme, and blurbusters talked about this once i think:
if you were to run interpolation frame gen from 500 real fps and add 500 fake frames, so motion clarity of 1000 fps, then you only have ~2 ms of added latency (one 500 Hz frame time) for the improved clarity at least.
you could make the argument for that, BUT it doesn't make sense, not even in those cases.
why? because we have a superior technology, that is already getting used today by vr.
reprojection frame generation.
we can do THIS with reprojection frame gen:
https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/
(the first picture lagless frame generation via reprojection on a two-thread gpu workflow)
and with reprojection we are not giving up on responsiveness, but we are gaining it.
as we reproject EVERY frame with it (if we want), we reproject on the latest positional data, so we can UNDO the render time of the gpu. the gpu might take 10 ms to render a frame, but we don't care, because we reproject in under 1 ms, so we are removing ~9 ms of latency.
advanced depth-aware reprojection can also include enemy positional data and major moving objects.
and remember we aren't just gaining motion clarity here.
we are rendering real frames, because every frame contains the latest positional data from the player at least.
I mean, so what if your reaction time to shoot that NPC in GTA VI is slightly slower...
so instead we can shoot them quicker than without reprojection frame gen, especially with enemy positional data included, but even without it you're quicker.
and the kicker is that interpolation frame generation seems to be quite hardware-intensive.
reprojection meanwhile is extremely easy and fast to do.
it is so fast, that vr does basic reprojection for missed frames.
the render window got missed for 90 fps, let's say. no worries, we can just reproject the last known frame in the most basic way, and that is already better than nothing, because in vr you can't have too many actual missed frames or people throw up / feel sick, etc...
also yes, that is how reprojection can work on the desktop too.
you have, let's say, a 300 hz display. you always reproject to the 300 hz.
so your source fps might fluctuate between 80-150, let's say, but you won't notice, because you are getting a solid, perfectly responsive 300 fps experience.
to be clear, the current reprojection artifacts get worse the longer the time between source frames, and as current versions don't include enemy or major-moving-object data in the reprojection, a higher source frame rate is better of course. but it would still be amazing compared to anything else.
and of course the win from reprojection is even bigger at low source fps: even 30 source fps becomes playable with basic reprojection frame gen. you can test that yourself in the comrade stinger demo mentioned in the article.
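if you want to picture the two-thread workflow from the blurbusters article, here's a toy sketch in python (names and timings are my own made-up stand-ins, nothing from a real engine; a real implementation would do the warp on the gpu):

```python
import threading, time

DISPLAY_HZ = 300   # always present at the panel's refresh rate
RENDER_FPS = 100   # pretend real frames take ~10 ms, inside the 80-150 fps range

lock = threading.Lock()
latest = {"frame": None, "pose": None}  # newest real frame + the pose it was rendered at

def current_pose():
    # stand-in for polling the newest mouse/head position
    return time.perf_counter()

def render_loop(stop):
    # slow thread: produces real frames at whatever rate the gpu manages
    n = 0
    while not stop.is_set():
        time.sleep(1 / RENDER_FPS)  # simulated ~10 ms render time
        with lock:
            latest["frame"], latest["pose"] = f"frame{n}", current_pose()
        n += 1

def present_loop(stop):
    # fast thread: every refresh, warp the newest real frame to the newest
    # input pose. the warp itself would be ~1 ms on a real gpu, which is
    # how the ~10 ms render latency gets "undone"
    while not stop.is_set():
        with lock:
            frame, pose = latest["frame"], latest["pose"]
        if frame is not None:
            staleness_ms = (current_pose() - pose) * 1000
            warped = (frame, f"warped to now, hiding {staleness_ms:.1f} ms")
        time.sleep(1 / DISPLAY_HZ)

stop = threading.Event()
threads = [threading.Thread(target=f, args=(stop,)) for f in (render_loop, present_loop)]
for t in threads: t.start()
time.sleep(0.1)
stop.set()
for t in threads: t.join()
```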
so yeah, we've got something better that we can use, something that people would use in all competitive games.
reprojection frame generation certainly should be the future. anything else would be insane.
vr is using it as a requirement already.
so your point is reasonable, at very high fps interpolation frame gen lag is indeed much less bad, but
we just got something SOOOOOOOO MUCH BETTER! to use instead, with a million more advantages. :)
That's honestly pretty crazy and maybe an eye problem. The difference between 60 and 120 is night and day. I mostly play at over 100fps and going back to 60 is a blurry mess. If you can't see the difference then sucks for you I guess because it's drastically smoother/more fluid.
Well into my 40s and I can tell immediately. Double-check that you have it set to 120Hz in the adapter settings? Missing that is more common than you might think.
I notice the difference immediately. Not even in games - just moving the mouse cursor or scrolling websites is a big difference.
You went into your graphics settings and enabled the high refresh rate? High-refresh monitors will still run at 60Hz unless specifically changed in settings, and I've seen people miss that setting tons of times.
I can notice the difference between 60 and 120+, but I have to be specifically looking for it. If I'm just in gameplay, I will not notice it whatsoever.
I've been on a 1440p 144Hz Freesync monitor for a couple of years now. I'm yet to buy a GPU that can fully utilize it.
After I started playing on an OLED tv, I think an OLED monitor is by far the biggest "graphics" upgrade. Even playing things like Final Fantasy 14 is completely different. I'm not talking about resolution or fps, just colors and contrast!
I'm playing graphically intensive games on my OLED TV (LG C9) and it's beautiful but often I'd much rather play at my desk. Dying for a monitor upgrade but will be moving in little over a year and it'd be hard/expensive to transport :(
I also don't dare grab an OLED for my desktop because my C9 got some burn in pretty quickly.
Weird, i have a b7 and never had burn-in problems. Btw samsung launched a new monitor these days and it should not have burn-in problems according to them. You should look at it after you move :)
Will look into that monitor tech, thank you.
The B8 hasn't got any burn-in despite far greater use. The C9 has been used for ~150 hours of gaming. The weapon UI from Outlanders made a dim spot. Very annoying in films, especially grey-ish / low-bitrate ones :(
How do you find the text in XIV?
Text issues are overblown.
what do you mean? I can read perfectly on my tv
Monitors are improving so quickly that I'm going to feel bad when the next better one comes out every 3 months
That’s a very new development though. I bought a 144hz IPS WQHD monitor in 2019 and up until early last year, there wasn’t really much of an upgrade to be had.
Sure, there were some monitors with local dimming, higher brightness or faster refresh rates, but none of them would have made a significant enough difference to justify an upgrade.
Now you can get very solid mini led monitors for about $300 and (QD-)OLEDs for $500-800 that both massively outclass pretty much anything you could have bought in 2022, no matter how expensive.
Sure, there will be even better monitors in the future, but IMO, the big jump has already happened.
Newer OLED panels will only get marginally brighter, and mini led panels will probably see a bump in refresh rates and dimming zones; again, no improvements that would justify an upgrade.
As someone who bought a Samsung G5 in 2020 then, would you recommend upgrading it sometime soon? Has to stay ~32 inches, I can't be one of those dudes who use a TV for their monitor lol
Depends on what you’re playing and watching on that monitor.
I’d say the only thing that really makes a night/day difference with these new monitors is the HDR experience. If the games you play don’t support HDR and you only watch YouTube, no movies or tv shows on your monitor, it won’t be that big of an upgrade.
Generally, if your hardware can push 4K and you have the money, any of the new 32 inch, 240Hz, UHD QD-OLED panels will be a massive upgrade.
If you want to spend less or stay at WQHD, there are still some pretty solid recent options, the Neo G7 and Neo G8 should both be solid noticeable upgrades, but even some cheaper models like the Acer XV275K P3 or its WQHD sibling the XV275U P3 will still deliver a much better HDR experience than pretty much any older edge lit monitor.
As much as I love OLED I’m having a hard time justifying upgrading my 1440p 144hz IPS. The contrast isn’t great but everything else has been fine.
I have the AW2725DF and it’s phenomenal, pretty much everything you throw at it looks amazing, but in some regards it’s a sidegrade compared to the Dell S3422DWG (apart from the narrower aspect ratio) I had before.
SDR and general desktop brightness is bad; using it in a bright room is absolutely possible, but it struggles against reflections, and white often ends up looking gray
I stopped using my monitor for work. I got another separate monitor to work on as I don’t want to risk burning it in
I haven’t seen them in person, so I can’t say how good they really are, but in technical reviews, monitors like the Acer XV275K P3 or its WQHD sibling, the XV275U P3 look very promising.
I have a Hisense U8K, which also uses a VA panel with a mini led backlight (576 zones on my model), so I can say from experience that a good mini led implementation comes very close to QD-OLED in terms of picture quality. I've linked some other mini led monitors in my reply above.
The only downside of my 144hz IPS is that it can't do HDR. It's got most monitors beat in most other things, and it's also from 2019. Well, also brightness, it's lacking in that a bit.
I don't want a gaming monitor, I want a good 16:10 monitor with 100% sRGB, are there some modern 16:10 monitors?
Why specifically 16:10? If you need the extra vertical space, just get a larger monitor. A 27" 16:9 monitor is taller than a 24" 16:10 monitor.
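Easy to verify: for a w:h panel, height = diagonal × h / √(w² + h²). A quick throwaway check in Python:

```python
import math

def panel_height(diagonal_inches, w, h):
    # physical height of a w:h panel with the given diagonal
    return diagonal_inches * h / math.hypot(w, h)

print(f'27" 16:9  -> {panel_height(27, 16, 9):.1f}" tall')   # ~13.2"
print(f'24" 16:10 -> {panel_height(24, 16, 10):.1f}" tall')  # ~12.7"
```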
Because I like it. And many laptops now are 16:10 again, because it's really better.
I have a 24" 16:10 and I want a 30" 2560*1600.
You're partially right. But there are many details.
- Using larger 16:9 monitor adds more horizontal space. And it's not convenient for a lot of apps and websites.
- Larger 16:10 is better than larger 16:9.
- Watching photos or older movies is way more comfortable on 16:10. And some older games like The Lord of the Rings: The Battle for Middle-earth II are unplayable on 16:9 while pretty good on 16:10.
- The main thing for me is watching 16:9 videos, and I don't want the UI to cover the video. So 16:10 display is perfect for 16:9 videos.
Upgrade NEITHER and save your money instead
I bought a new IPS 1440p 165hz monitor about half a year ago, but with all the OLEDs coming out now I'm a bit afraid I regret it. Sure it's better than my old 1080p TN panel, but oddly enough not as much better as I thought it would be. And it's actually an LG that's well reviewed by Monitors Unboxed. On one hand I wish I had gotten an OLED for twice the money, but on the other hand I still don't trust OLED burn-in, and I'm not a fan of how much effort I have to put into preventing it from burning in: hiding taskbars, using dark mode, etc.
Hopefully this one will last me until QDEL is affordable, though.
Me as a VR user: You know nothing Jon Snow
I like VR a lot, but between my HP Reverb G2 screen and my s90c, i'll still take the image quality of the OLED tbh. I need something comparable to OLED before I'm ready to use it for everything, that and it needs to be a bit more comfortable to wear.
Understandable. I do a bit of work stuff on my PC and many of my games have a lot of static content so OLEDs are simply not for me.
what headset do you use for productivity?
Going from a 1050 to a 1660 Super was transformative, but the jumps to a 3060 Ti > RX 6800 > RX 6800 XT > 4070 Ti were all pretty marginal, with 6800 XT > 4070 Ti being the biggest improvement, mainly from the software features.
CPU upgrades have always been lacklustre.
My biggest PC upgrade however would be switching from a 1440p 144hz VA 34" UWQHD to a 55" s90c.
4k OLED with 120hz is perfect spot now, especially if you can convert your TV into a monitor (I do it);
Rant, if you're considering oleds (woled):
I had two oleds, both woled, one from lg and the other the asus strix. As someone who has been using a pc since forever and can't take the bright/light/standard windows theme without burning my eyes out, if you try to use dark mode, the woleds are impossible to use as a computer monitor.
The dirty gray issue is so annoying when displaying darker shades that it is just not worth it. for games it was almost perfect; if it went over 240hz it would have been perfect, it felt a bit restrictive maxing out at 240, so I went back to a crappy 390hz ips.
I'd rather buy a new gpu than an oled if I can't use dark mode because the darker shades are not solid and display artifacts like horizontal lines running through them (akin to scan lines on a crt, but thicker).
maybe when qd-oleds come out at 480hz or above I will try oled again.
Yeah, it's an issue with normal oleds: the crystal backplane is not perfect and there are slight variations every few millimeters, and thus brightness inconsistency.
It's called mura. on things like VR it's very obvious and somewhat annoying, but I still prefer it to any LCD
I have the 27GR95QE, and sometimes the color banding and dirty gray issue make the monitor seem like an inferior product compared to just a regular IPS monitor. In darker scenes it's great, but I hope this can be fixed with future panels.
Only reason I haven't is because I'm still on Polaris; I want an idea of nextgen so I know how much monitor to buy.
Went from a cheapo 24" 1440p 165Hz TN monitor to the then-new Samsung G7 and it was a night and day difference imo. I'd never had a VA before, but considering all the tests saying the new VA panels were good af, I figured why not. I don't regret my purchase at all
gotta plug this here, r/crtgaming
my aw3423dw is going to last me 2-3 gpu cycles
i went from a 980ti at 1080p, to a 1070ti sidegrade for a 1440p ultrawide freesync IPS (800:1 LG contrast ratio lmao), to a 3080 for a 1440p ultrawide oled
i can see myself getting 4090-level performance from the 5XXX generation in the future, when my framerates finally drop too much. currently i'm at like 40-50 fps on medium-high ray tracing with DLSS Quality (1.5x1.5 resolution scaling), and i'd like to get 175 FPS or similar with the best ray tracing then.
Wait a minute, wasn't this the channel that recommended Gigabyte G27QC monitor?
Shit had ghosting issues so bad one would think Casper lived inside it. Meh, not my money anyway.
They reviewed it but didn't recc it.
I am positive they were sponsored, along with the other misleading YouTubers who somehow, out of all the thousands of monitors out there, kept selling that one as the best. No one bothered to stress test the monitor, and many probably bought it.
60hz is not sufficient for gaming anymore. It's over
It's sufficient if you're fine with it and have one.
However, good 144hz monitors are so cheap these days that upgrading to one is a no-brainer if you don't already have one.
Not to you maybe but there are people happy with what they have.
Everything over 24hz is scam, humans can't see over 12hz per eye
Jokes aside, 60hz is simply too low, especially considering that most 120hz+ displays have VRR while 60hz ones don't. Also don't forget about response times, which no one cares about on 60hz monitors; in the end we get a very poor image with bad motion clarity even at the same FPS as 120hz+ displays.
It's like 800x600 resolution: you can still try to use it, it might even satisfy you, but you can't deny that as a resolution it's pretty much dead.
just one single black/white pixel is sufficient for all but the most demanding workloads
you need 30 printers working in tandem and then an assembly line rolling the sheets past you to make a smooth image.
In a single pixel you need much higher frequencies to transfer the same amount of information. So more pixels means lower hz. That's how it works, right? /s
For me, personally? 100% agree. Normal desktop use at 60hz is fucking gross to me now. But most people aren't that sensitive and for gaming it really depends on what you are playing.
If you're playing AAA games at only around 60fps then it's not *that* important. For competitive titles then yeah, 60Hz just isn't enough. 144Hz monitors are so cheap now and give you *SUCH* a huge advantage.
That said, even if you are one of those ~60fps gamers playing single-player titles, for me almost as important as the higher refresh rate is VRR. If you're only able to drive around 60fps, your options are screen tearing or vsync judder. If you can keep your minimums above 60, then you're *not* one of those gamers, and a higher refresh rate would greatly benefit you.
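(The judder half of that trade-off is just quantization: with plain vsync, a frame that misses a 60Hz vblank stays on screen for a whole extra refresh. A toy illustration in Python:)

```python
import math

REFRESH_HZ = 60
vblank_ms = 1000 / REFRESH_HZ   # ~16.7 ms per refresh interval

def vsync_display_time_ms(render_ms):
    # with plain vsync a finished frame waits for the next vblank,
    # so on-screen time snaps to whole multiples of the refresh interval
    return math.ceil(render_ms / vblank_ms) * vblank_ms

print(f"{vsync_display_time_ms(15.0):.1f} ms")  # 16.7 ms -> smooth
print(f"{vsync_display_time_ms(17.0):.1f} ms")  # 33.3 ms -> visible judder
# with VRR the panel simply waits 17.0 ms instead: no snapping, no judder
```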
True, hasn't been for at least 5 years
True, hasn't been for at least 5 years.
Says Hardware Unboxed, who have been bankrolled by the biggest players in the industry for years now lol. Having watched it, I don't buy into the logic they're arguing in this video.
lmao, imagine unironically thinking this.
What should their non-industry-bankrolled recommendation be, I wonder? Buy gold coins and seed packets from Alex Jones?
Source?
I'd take 60 Hz on a CRT over 120 Hz on an OLED easily personally. So it depends on the technology.
Unless you're playing games designed for CRTs, to be honest that's a completely irrational and borderline insane take.
I agree with him, for gaming CRTs are OP over any lcd or even oled.
when using a crt you will be called a cheater in mp games.
why would games being designed for a crt or not be of concern?
Why?
From a gaming perspective that must be right, but for non-gaming purposes, those higher-fps monitors still tend to have somewhat flashy/crude colour characteristics, so there are still pros and cons in each price range.
Sorry they do the worst monitor reviews, you are overpaying if you buy the ones they review.
clickbait detected.
The title is wrong. He's right that 1080p, especially at 60hz, is pretty outdated; I wouldn't recommend anyone stay on it unless they are on the tightest of budgets. Tim points out that the majority of Steam accounts are still on 1080p, but doesn't look at the GPU data... The #1 card is a 3060, a card that doesn't even average 1080p 120FPS in reviews; in some games it barely hits 60FPS (like RDR2). If you upgraded to QHD, your 3060 drops down to 85FPS. That may seem fine, but when you look at the rest of the Steam Survey, the situation gets worse, as a lot of common GPUs are well below a 3060: #2 is a GTX 1650, #4 a 2060, #6 a 1060, etc. So many of these gamers ARE actually GPU (and CPU) bottlenecked and haven't moved on from 1080p because QHD would lower their performance even more.
Personally I feel like it makes more sense to upgrade your GPU and CPU first, and then your monitor; that way you can run at ultra settings with no frame drops. If you upgrade your monitor first from FHD to QHD, your FPS will tank at the higher resolution, forcing you into windowed mode or lower quality settings, making the experience worse until you upgrade your GPU and CPU.
TL;DR: he's right that people should buy better monitors, but they absolutely need to buy better GPUs and CPUs to take advantage of a better monitor, and the GPU and CPU should be the priority.
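The resolution cost is easy to ballpark: QHD pushes about 78% more pixels than FHD, so a purely GPU-bound game loses a big chunk of fps. A naive linear-in-pixels estimate in Python (real scaling is usually a bit kinder than this, which is how 120FPS at 1080p can land near 85FPS at 1440p rather than ~68):

```python
fhd = 1920 * 1080   # 2,073,600 px
qhd = 2560 * 1440   # 3,686,400 px
print(f"QHD/FHD pixel ratio: {qhd / fhd:.2f}x")          # 1.78x
print(f"naive GPU-bound fps: {120 / (qhd / fhd):.0f}")   # ~68 fps worst case
```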
in some games it barely hits 60FPS (like RDR2)
At ultra settings. You can safely lower settings to high or medium and spend that extra compute on more performance or a higher resolution.
Sure, the 3060 struggles to hit 1080p 60 at Ultra settings. But having come from the days when you had to choose between high settings and 60 fps even on top end hardware, there's always the option to turn down the settings. And in my experience high refresh rate with maxed out textures feels better than max settings. Especially considering I can turn off some garbage post processing settings and gain both image quality and fps.
I'd rather have 90 fps Medium settings than 50 fps Ultra settings.
I swear in the past few years, PC gamers have completely forgotten that it's entirely possible to drop settings below ultra.
Shouldn't upgrade your monitor when OLED TVs are far superior, especially QD-OLED.
Some people, such as myself, don't have the space for a whole ass TV as a monitor.
Make space, it's worth it to experience actual HDR.
I can't just rearrange my entire house to set up a 2nd tv lol; in a few weeks my pc will be in the living room near my HDR TV, so I can use it then; but as it's a shared TV I couldn't reasonably do that all the time.
I have a dual monitor setup; the second one is a vertical one. My desk is already 2 meters and there's no way I can fit a TV on it.
Plus, I like curved ones
Shouldn't use a TV as a monitor.
I probably would have said something dumb like that once upon a time.
You should try it, at least you'll appreciate how bad and overpriced monitors are, if nothing else.
I did. The input delay is too noticeable.
Did you use an LCD* TV? OLED response times are instantaneous. Much faster than LED monitors/TVs.
Granted, the FPS is lower and larger screen is not suited to FPS games.
No, i used an IPS TV. OLED is not an option for my use case due to burn-in.
Sorry, I meant to say LCD.
Burn in fears are overblown, especially in 2nd gen QD-OLED. They're holding up very well in the RTINGs torture test.
Burn-in fear is not overblown for my use case (static elements for 16 hours straight with no time off for LED cycling to happen). For an average gamer PC i would agree burn-in is not an issue, but i use the screen for different things most of the time.
LCD responses in TVs are one of the worst you can get. Most are as bad as 50 ms.
Sellers selling stuff, nothing new. If you play single-player games and are not a fortnite rat kid, you are perfectly fine with a 60 hz monitor. 90% of people do not play at the max hz of their monitor anyway because of the lack of a better gpu.
And i say this having a 4090 and a 4k 144 hz monitor i bought recently to replace my older rog 4k 60 peasant hz, and i highly regret the day i let this kind of scammers convince me that i needed a new monitor.
Speak for yourself, 144Hz was the best upgrade I ever got.
Agreed. Not sure what OP is talking about. Every single game I play, especially single-player games, looks amazing and smooth at 165hz. A higher frame rate with slightly less graphical fidelity looks much better than playing at absolutely max settings but locked to 60fps.
I usually play older or indie games, and my 1080p 144hz monitor is often the "bottleneck". I only have an RX 6600, but in games like Cult of the Lamb I only get like 30% GPU utilization at max settings. Not everyone is interested in the latest AAA games, and I appreciate the extra smoothness.
Even phones are getting 120Hz. Why? Because anyone can obviously feel the difference in regular day to day tasks. You don't have to be a competitive gamer to feel the big improvement of more Hz.
But I agree that 60Hz is still "fine" for many things
90% of people do not play at the max hz of their monitor anyway because of the lack of a better gpu.
Older games are a thing; i remember when i got 144hz in 2014, running Civ 4 at that instead of 75hz was already much nicer, let alone any fps game.
Another thing is the desktop: after using a 144hz+ desktop (and a 90hz+ phone for 5+ years also) for so long, 60hz looks very laggy. even if your eyes can get used to it over time, the knowledge that it could be better is still there.
then Civ 5 came along, and running that at a good framerate was impossible because you could already see the spaghetti code falling apart (Civ 6 is outright broken on a technical level)
Next week: Upgrade your AM4 CPU, don't buy a new monitor
Don't have to watch this video to know it's full of shit, unless it's satire
I did this in 2018: got my 144 Hz Freesync monitor while having a 390. It wasn't until 2022 that I got a GPU upgrade, but I got to enjoy 85-144 fps all those years while waiting.