Most video games, especially on PC, have a vast list of settings that you can tweak to get the ideal ratio of powerful graphics to smooth performance. For many of these options, you don’t really need to understand what they do. Anisotropic filtering? Who has time to figure out what that means? Just crank up the slider until your graphics card gives out. But a few settings are powerfully important. Let’s talk about them.
We’ll point out up front that this will not be a guide to every word in your game’s settings menu; more than enough glossaries like this already exist. However, some features will have a stronger effect on your game than others, and a few can make the visuals look way better while destroying performance at the same time. Finding the right balance is key, so we’ll focus on the settings that will have the biggest impact on how your game looks and runs.
Ray tracing

This is the big one. The one everyone’s been talking about—and will continue to talk about for a long time. Ray tracing is a technique that calculates how light would bounce around a virtual scene in order to accurately create highlights, shadows, and reflections. If that sounds boring, understand that literally everything you see works on that same principle: Light bounces from a light source, off an object, and into your eye. That means ray tracing (or its big brother path tracing) is among the most accurate ways to simulate how objects would look if they were really there.
It’s also a massive resource hog.
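At its core, the "bouncing light" calculation is just geometry: fire a ray, see what it hits. Here's a toy sketch of the single most common ray-tracing query, the shadow ray—is anything between this point and the light? (The coordinates and sphere are made up for illustration; a real renderer shoots millions of these per frame, which is where the resource-hogging comes from.)

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def sphere_hit_t(origin, direction, center, radius):
    """Smallest positive t where origin + t*direction hits the sphere, or None."""
    oc = sub(origin, center)
    # Solving |origin + t*direction - center|^2 = radius^2 gives a quadratic in t.
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    for t in ((-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)):
        if t > 1e-6:  # the hit must be in front of the ray's start
            return t
    return None

def is_in_shadow(point, light_pos, center, radius):
    """Shoot a 'shadow ray' from a surface point toward the light. The point is
    shadowed if the sphere blocks the ray before it reaches the light (t < 1)."""
    t = sphere_hit_t(point, sub(light_pos, point), center, radius)
    return t is not None and t < 1.0

# A unit sphere at the origin sits between the light and the first point:
print(is_in_shadow((0, 0, -5), (0, 0, 5), (0, 0, 0), 1.0))  # True: shadowed
print(is_in_shadow((3, 0, -5), (0, 0, 5), (0, 0, 0), 1.0))  # False: lit
```

Real ray tracing repeats queries like this for reflections and bounced light too, recursively, for every pixel—hence the hardware demands.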
Pixar and other filmmakers have used ray tracing in some form at least as far back as the original Cars movie to accurately simulate reflections, though it didn’t become a big deal for gaming until graphics cards and consoles could calculate it in real time. Games that support ray tracing can simulate light bouncing off reflective surfaces and casting shadows quickly enough to still deliver enough frames for the game to be playable. But, uh, we’ll discuss what counts as “playable” a little further down.
Ray tracing’s primary benefit is that it makes games really, really, ridiculously good-looking. Everything in a game will look just a little more realistic, as if scenes were built from actual objects instead of well-made polygons. In some games, it might also provide a (highly debatable) gameplay benefit, like reflections in Doom Eternal letting you know that a mancubus is behind you, but the advantage is primarily cosmetic. If you have a beefy graphics card or console, it might be worth turning ray tracing on, but read the rest of this article before you decide.
Frame rate (fps)
Most movies and videos flicker before your eyes at around 24-30 frames per second (fps), but video games are a lot more demanding. Because you’re controlling player characters and deciding what to look at—often even changing where you’re looking with split-second reactions—smooth motion matters a lot more in gaming. That’s why it’s widely accepted that most modern games (the ones where rapid movements matter, anyway) should run at 60fps minimum, and some reach as high as 120fps.
That is a lot easier said than done. As we established before, games have to process data in real time, so doubling the number of frames the game has to show every second effectively doubles the amount of work your machine must do in that timeframe. This is why you might find some games that treat super-high frame rates (especially 120fps) as a tradeoff feature: You can have ray tracing or 120fps, but (probably) not both. At least for now.
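Another way to look at it: at a given frame rate, your CPU and GPU get a fixed slice of each second to finish everything—simulation, physics, rendering. The arithmetic is simple:

```python
# Per-frame time budget at common frame-rate targets.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:.2f} ms to simulate and render each frame")
```

Going from 60fps to 120fps halves the window from roughly 16.7ms to about 8.3ms per frame, which is why 120fps plus ray tracing is such a tall order.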
If you’re playing a shooter, or any first-person game that requires fast reflexes, that tradeoff might be worth it for you. The Doom Eternal demo shown above is actually a great case study, though you won’t be able to actually see the difference thanks to YouTube’s 60fps limit for videos. You can, however, watch the frame rate change within the on-screen data box. With ray tracing on, the frame rate is in the low 100s when the player is just walking around, but it drops to the 70s or 80s once combat starts. Compare that to another demo of the same game without ray tracing, which approached a mind-boggling 200fps. It’s up to you whether high frame rates or hyper-accurate lighting is more important, but one is currently likely to come at the expense of the other.
Resolution (4K)

We’ll assume that at this point, you understand how screen resolution—i.e., the number of rows and columns of pixels on your screen—works. For years, 1080p was the common standard, and it’s honestly still fine for most things today. But modern games are pushing more and more into 4K, around four times the resolution of 1080p (by total coincidence—4K refers to the image being nearly 4,000 pixels wide, not any kind of multiplier).
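That "around four times" figure comes straight from the pixel counts, and it's easy to verify:

```python
base = 1920 * 1080  # 1080p pixel count
for name, (w, h) in (("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K", (3840, 2160))):
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

A 4K frame is 8,294,400 pixels—exactly four times the 2,073,600 pixels of 1080p—so the GPU has four times as many pixels to fill every frame.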
But if “around four times the resolution” sounds like yet another way to melt your graphics card, you’ve got a good ear. Just like ray tracing and 120fps, 4K gaming puts pretty substantial stress on graphics processors. Of course, it’s possible—both the PS5 and Xbox Series X claim to be able to output as high as 8K—but it will naturally come at a performance tradeoff like other highly demanding graphics features.
So, would you rather play at 1080p or 1440p at super-high frame rates and with ray tracing turned on? Or do you want to stick with 4K and 120fps, but pass on the ray tracing? Maybe you’d rather do 4K ray tracing, and be happy with at least 60fps. These features form a sort of performance triangle: Right now you can get one or two of them at the same time, but if you plan to do all three, you’d better be ready to shell out some heavy-duty cash, and hope your game is optimized for it.
Vsync and adaptive sync

When you’re playing a game, your graphics card races to display enough frames per second to give you smooth motion, but your monitor has to keep up. Sometimes, your monitor and graphics card will slip out of sync, which results in a tearing effect, where parts of multiple frames appear at the same time, creating horizontal lines. This can be especially jarring during high-action games with lots of chaotic movement.
There are two main ways to handle this. One, which most games support, is called “vertical sync” or “Vsync.” This technique limits the number of frames that your GPU puts out, so it won’t overwhelm your monitor. This is a brute-force way of suppressing screen tearing, but it can make your frame rate worse. If, for example, your graphics card can’t consistently put out 60fps on a 60Hz monitor, classic Vsync won’t just dip slightly: it drops you to a divisor of the refresh rate, like 30fps.
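Here's a simplified model of why that sudden drop happens with classic double-buffered Vsync (triple buffering and adaptive sync behave differently): a frame that misses a refresh has to wait for the next one, so every frame occupies a whole number of refresh intervals.

```python
import math

def vsync_fps(raw_fps, refresh_hz=60):
    """Effective frame rate under classic double-buffered Vsync: each frame
    must wait for a whole number of refresh intervals, so the displayed rate
    is the refresh rate divided by an integer."""
    intervals = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / intervals

for raw in (60, 55, 40, 25):
    print(f"GPU manages {raw} fps -> Vsync displays {vsync_fps(raw):.0f} fps")
```

Notice the cliff: a card that manages 55fps gets knocked all the way down to 30fps, because each frame now straddles two refresh cycles.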
The better way to solve screen tearing is with adaptive sync technology like NVIDIA’s G-Sync or AMD’s FreeSync. These features let your graphics card sync up with your monitor to provide a variable frame rate—meaning your game can provide more or fewer frames depending on its performance in a given moment—without tearing. When the frame rate in your game changes, so does the refresh rate of your monitor. The downside is that there can be compatibility issues if you use an NVIDIA graphics card with a FreeSync-compatible monitor, or vice versa. It’s not impossible, and some combinations will work just fine, but you should check your specific hardware to be sure.
HDR

If you have a compatible monitor, high dynamic range (HDR) might be one of the best visual upgrades you can make. HDR is a spec (or, well, a family of specs) that lets your display show a wider range of colors, brighter highlights, and deeper blacks. The result is a much more vivid picture, especially in games that take advantage of it.
To use it, you’ll need an HDR-compatible monitor, which you might not have unless you specifically went looking for one. While most modern TVs have some form of HDR support built in, you should check your monitor to see if it’s supported. If it is, you can try flipping the HDR switch in your game’s settings to immediately get a richer color experience.
NVIDIA’s DLSS (and AMD’s FidelityFX Super Resolution)
One of the most complex problems to solve in gaming graphics is aliasing. High-polygon 3D models can look jagged and pixel-y, even at high resolutions. Anti-aliasing is a rendering technique that minimizes this problem by blending pixels with their surroundings to smooth out harsh edges. This method has worked for a long time, but it requires extra processing power on top of rendering the game itself.
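Anti-aliasing comes in many flavors, but the simplest to picture is supersampling: render more pixels than you need, then average them down. A miniature sketch, where pixel values are just brightness numbers:

```python
def downsample_2x(image):
    """Supersampling anti-aliasing in miniature: take an image rendered at
    double resolution and average each 2x2 block into one output pixel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            total = image[y][x] + image[y][x + 1] + image[y + 1][x] + image[y + 1][x + 1]
            row.append(total / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at high resolution...
hi_res = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1],
]
# ...downsamples so the edge pixel becomes gray (0.5), softening the "jaggies".
print(downsample_2x(hi_res))
```

The catch is right there in the setup: to smooth the edges, you first rendered four times as many pixels, which is exactly the extra work the text above describes.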
Enter NVIDIA’s Deep Learning Super Sampling (or DLSS). On supported games—one of two crucial limiting factors—DLSS uses machine learning to examine higher-resolution frames of a particular game, which it can then use to upscale lower-resolution visuals from the game engine. This goes beyond smoothing out edges like anti-aliasing does. It lets your graphics card create better looking frames while doing much less work, which frees up your graphics card to display even more frames in the same amount of time. It’s occasionally been described as higher frame rates for free. And while DLSS isn’t a silver bullet, it’s one of those rare things in tech that’s kind of close.
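The savings are easy to see in raw pixel counts. Assuming, purely for illustration, an internal render resolution of 1440p upscaled to a 4K output (actual internal resolutions vary by game and quality mode), the GPU shades well under half the pixels per frame:

```python
native = 3840 * 2160    # pixels the GPU would shade at native 4K
internal = 2560 * 1440  # assumed 1440p internal render resolution
print(f"Shaded pixels: {internal:,} instead of {native:,}")
print(f"Fraction of native 4K work: {internal / native:.0%}")
```

The upscaler itself costs some GPU time, so the real-world speedup is smaller than the raw pixel ratio suggests, but that's the basic reason upscaling can feel like frames for free.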
Unfortunately, as mentioned above, it only works for supported games. It also requires an NVIDIA RTX graphics card, which you might not have. AMD announced its own competing version in June, FidelityFX Super Resolution (FSR), which is compatible with a much wider array of graphics cards (including many NVIDIA cards), but its list of compatible games is currently even more limited—a mere handful of titles. If you have a graphics card that can use one of these features—especially DLSS—and a compatible game, it’s almost certainly worth turning on.