Oh the irony… Intel is making tremendous strides in the graphical power of their integrated GPUs. But I think that this, along with Apple’s primary focus on exquisite design, might be hurting the ever-budding population of Mac gamers.
Read on to find out why. TL;DR at the bottom.
There’s been a lot of debate and consternation about the fact that Blizzard Entertainment’s latest game, “Overwatch”, is officially not in development for the Mac. Blizzard have long been supporters of gaming on Mac, and Appleite gamers were distraught to hear that they wouldn’t get their latest game. Which, incidentally, is awesome, and which I adore. Also, disclaimer: I used to work for Blizzard. 🙂
Some have suggested that the reason for this decision is that the size of the market doesn’t warrant it. I dispute that idea: the market share of the Mac has never been higher, and Blizzard has supported it without fault until now.
Others have said that they’re allocating Mac budget to consoles (the game is coming to PS4 and Xbox One). I also think that makes little sense: Blizzard is a large company, they could do it for all these platforms, and they like money. Developing for the Mac would cost them less than it would bring in.
The official word from game director Jeff Kaplan is that, essentially, the tech behind today’s Macs makes it challenging. Call it corporate BS all you want, but I don’t think they’re happy about disappointing their fans, and I believe the answer is genuine: it’s all about the Intel HD Graphics technology.
Here’s a quick and easy recap of the tech involved:
- Central Processing Units (CPUs) aren’t great at rendering 3D graphics.
- In the ’90s, external graphics cards ushered in the era of 3D gaming.
- The chips on those are called Graphics Processing Units, or GPUs.
- GPUs are indispensable for 3D games, but they are also big and power hungry.
- Laptops can’t handle big and power hungry things.
- GPU makers started creating “mobile” GPUs with “ok” performance.
- Those mobile GPUs are discrete, meaning they are separate from the CPU.
- Intel, chasing 3D performance, started creating CPUs with integrated GPUs.
- These integrated GPUs weren’t good for a long time. Apple kept using discrete GPUs.
- Intel’s integrated GPUs have now improved a lot… But they are still poor for gaming.
- I put together a not-at-all-scientific 3D perf scale (1-10), just to give you an idea:
- CPU alone, no GPU: 0.5
- Integrated GPU 3 years ago: 1 to 1.5
- Discrete mobile GPU 3 years ago: 2 to 2.5
- Integrated GPU today: 2 to 2.5
- Discrete mobile GPU today: 3 to 4
- Discrete desktop GPU (external graphics card): 4 to 10
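To make the comparison concrete, here’s a tiny Python sketch of the scale above, using the midpoints of each range (these are my hand-waved numbers, not real benchmarks):

```python
# Midpoints of the not-at-all-scientific 1-10 perf scale above.
# These are illustrative values from the list, not measured benchmarks.
perf = {
    "cpu_only":               0.5,   # 0.5
    "integrated_3y_ago":      1.25,  # 1 to 1.5
    "discrete_mobile_3y_ago": 2.25,  # 2 to 2.5
    "integrated_today":       2.25,  # 2 to 2.5
    "discrete_mobile_today":  3.5,   # 3 to 4
    "discrete_desktop_today": 7.0,   # 4 to 10
}

# The irony in one comparison: today's integrated GPUs only just catch up
# to the discrete mobile GPUs of three years ago.
print(perf["integrated_today"] >= perf["discrete_mobile_3y_ago"])  # True
```

In other words, on this rough scale, an integrated GPU today lands right where a discrete mobile GPU sat three years ago.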
Please feel free to seek out precise benchmarks for yourself, but I believe that this is roughly representative of the relative performance of these chips, on average.
And the bottom line is this: integrated GPUs, even today, will probably not get you a satisfactory gaming experience in anything other than the most basic games. They might work for a less intensive game (MOBAs on low settings?), but not for something more demanding. Or at least they won’t work “well” (decent FPS at decent detail levels, etc.).
Ironically, the discrete GPUs from a few years ago likely gave you gaming performance similar to, or better than, integrated GPUs today. Remember this for later.