Seen this argument come up quite a lot in the gaming community. I honestly don’t think there’s much difference between a 60Hz screen and a 144Hz screen. What’s your guys’ take?
My main computer “monitor” is a 42" 4k TV. I know I know – it’s a bit excessive. I don’t sit too too close to the screen so the big size doesn’t bother me. And for the games I play/activities I do, the 60Hz seems to be just peachy.
I have been using a similar setup for more than a year now and can’t imagine myself going back to a monitor. If I need smoothness for anything (series, movies), I just trade some input lag for it, and I’m already over the 144 Hz hype.
Note that I don’t play any skill-based games - just chilling in Warframe or some older CoDs - so I don’t feel at any disadvantage here. That said, I’d take a secondary screen, but I can’t fit one anywhere at the moment…
Edit: If you ever decide to get one, make sure you let us know about your experience. It’s way better to hear about it from another TV user rather than from someone who switched from a regular 60-75 Hz monitor.
I switched from a 60Hz Samsung TV (40" if memory serves) that is now my living room TV to a 120Hz monitor. I do like seeing the higher framerates, but tbh, with my current rig I’m not getting 120 FPS in most games I play regularly. It’s obviously kind of pointless for video (unless you like watching your 60FPS YouTube at 2x), but it’s nice to have. I wouldn’t consider it essential, though - it really does depend on what your focus is. I can see the refresh rate and input lag being an issue with TVs, and in some games it definitely does make a difference (I found ICEY to be quite an experience). It’s obviously not as noticeable in most slower games, though.
Depends what types of games you play. If you’re playing EU4 and Civ games, then no. If you play anything with relatively high-speed gameplay, or if you’re at all competitive, then it’s night and day. Even just mouse movements (on the desktop) feel horrible at 60Hz after being on 120+. Any sort of camera panning in any game (Assassin’s Creed, for example) will feel so much better at 120+.
In terms of GSync/Freesync vs none, it only matters if you’re playing at 4K and/or can’t push framerates above the GSync/Freesync range. GSync/Freesync is useless if you’re running at hundreds of frames anyway, and the added input lag isn’t worth it if you’re at all competitive.
Worth noting that refresh rate isn’t the only thing that matters; a monitor can have terrible pixel response times that make 144Hz feel just as crappy as 60Hz on a good panel.
Anyway, the measurable difference is real… roughly 7ms per frame at 144Hz vs roughly 17ms at 60Hz is a gap your brain can pick up on; how much it actually matters depends entirely on your usage.
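If you want to see where those numbers come from, here’s a quick back-of-the-envelope sketch - frame time is just 1000 divided by the refresh rate, and real-world latency also depends on the panel’s response time and processing lag:

```python
# Rough frame-time comparison for common refresh rates.
# Frame time (ms) = 1000 / refresh rate (Hz).
for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")

# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms: each frame arrives
# roughly 10 ms sooner, before counting pixel response or input lag.
```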
I submit that 1/10th of the gaming community cares about such details because they actually impact their gameplay, while the other 9/10ths are simply in it for the hardon they get from having “the best.”
Hell I’m using a TV from 2007 and I just picked up a GTX 1050. I’m like a kid in a candy store with it. Still rocking 1080p for everything.
I’ve got a 480Hz monitor that would blow your mind. I typically run it at 120Hz @ 4K though, but it has basically zero input lag or signal-processing lag. Moving the mouse is like a hot knife through butter, so satisfying.