In this post, we are cutting through the confusion. We will explain what HAGS does, why Lossless Scaling demands it, and when you might actually want to turn it off.

But what exactly is HAGS? And why does a simple upscaling tool care so much about a deep Windows graphics setting?

Historically, your CPU told your GPU what to draw. The CPU would bundle rendering commands into a "buffer" and send them off to the GPU, with the scheduling of that work handled on the CPU side. This worked well, but it created a slight bottleneck.
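To make the CPU-driven model described above concrete, here is a minimal sketch in plain Python. It is purely illustrative: the class and method names (`CommandBuffer`, `GPUQueue`, `record`, `submit`) are hypothetical, and real Windows drivers use WDDM command buffers rather than anything like this.

```python
from collections import deque


class CommandBuffer:
    """A batch of draw commands the CPU assembles before submission."""

    def __init__(self):
        self.commands = []

    def record(self, command):
        self.commands.append(command)


class GPUQueue:
    """The GPU consumes whole buffers in the order the CPU submitted them."""

    def __init__(self):
        self.pending = deque()

    def submit(self, buffer):
        # In the pre-HAGS model, this submission step is managed by the
        # CPU/OS scheduler, which adds latency before the GPU sees the work.
        self.pending.append(buffer)

    def execute_next(self):
        buffer = self.pending.popleft()
        return [f"drew {cmd}" for cmd in buffer.commands]


# The CPU records a frame's worth of commands, then submits the whole batch.
frame = CommandBuffer()
frame.record("triangle")
frame.record("sprite")

gpu = GPUQueue()
gpu.submit(frame)
result = gpu.execute_next()
print(result)  # ['drew triangle', 'drew sprite']
```

The point of the sketch is the batching: the GPU cannot start until the CPU has finished assembling and submitting the buffer, which is exactly the bottleneck HAGS was designed to reduce.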