Today I read an interesting interview with Tim Sweeney of Epic, whose tagline is “PCs are good for anything, just not games”.
Summarizing the interview perhaps a bit unfairly, here’s what he says:
(1) People aren’t buying expensive enough PCs.
(2) Even the expensive PCs aren’t good enough to run his games.
(3) People who buy cheaper machines with Intel integrated graphics are giving their money to Blizzard instead of Epic.
(4) This aggression cannot stand. The solution is that everyone except us should change what they’re doing and buy machines with more expensive graphics hardware.
What I find sort of amusing about this interview is that the only time the word “stability” enters the discussion is when Sweeney talks about Epic’s development environment (“Do as we say, not as we do!”). This reveals a fascinating truth that doesn’t get a lot of wide discussion among hardcore gamers: cutting-edge graphics solutions tend to make your computer crash more (footnote 1).
The upper end of the videocard market is a dog-eat-dog business where margins are high. Both hardware and drivers are revved frequently, more frequently than the marketing names of the cards might indicate. As a developer, if I had a buck for every time I encountered a situation where software behaved differently on two allegedly “identical” videocards, I’d be rich. Likewise, I’ve heard my friends in the Microsoft OS group talk about graphics drivers negatively impacting system stability in great detail (this detail usually involves phrases like “utter and complete garbage”, sometimes coupled with words that I won’t use in this space).
All of which is a way of saying: complain about integrated graphics all you want, but computer and electrical engineering is, inherently, all about tradeoffs. What Intel is selling in their integrated chipset is more than just “crappy performance and low cost”. They’re selling some specific level of performance (“Good enough for some percentage of PC users to accomplish everything they need.”), along with low cost, low space, low power usage, low heat, and a certain level of stability.
Games are, for most of us, only a part of what we use our computers for. It’s not clear to me that a computer manufacturer — or a consumer — who accepts the tradeoff offered by an integrated chipset is making a bad choice. And if you insist on defining “the PC games market” as “those consumers who are willing to pay more money for a faster but louder, hotter, larger, less stable product,” then no wonder you think the market sucks. The market you picked sucks. Instead of plaintively wishing that perhaps the market will change, you should change your business plan to match reality.
In summary: I wish Epic was a publicly traded company, so that I could short them.
Footnote 1: Cue response of “Well, my computer has never blue screened, so you must be doing something wrong.”
Did you actually read the article? Sweeney’s main point was that the gap between the bulk of consumer hardware and gamer hardware has grown so large that they can’t scale their games between the two. He doesn’t want non-gamers to buy more expensive hardware, though I’m sure he wouldn’t mind. He wants Intel to develop integrated graphics that don’t suck so that the scaling factor is manageable on hardware designed for the non-gamer, and so they [Epic] can expand their market. He also calls out hardware manufacturers at the high end for fleecing their customers for marginal performance improvements…
Greg: yes, I read the article. The subtext of the article was “…and this is why developers are targeting consoles instead of PCs.” My only point is that the notion “Someone else should pay to increase the size of the market I chose to compete in” is not an argument I have a lot of patience for. Presumably, anyone to whom playing Unreal is super-important has chosen to not buy a machine with integrated graphics. Yet the machines are still selling like Krispy Kremes at the Policeman’s Ball, because it turns out that price matters to consumers.
My secondary annoyance is that I think dismissing Intel’s chipset as “sucking” is naive, and corresponds to the habit some of us have — sometimes, I have it too — of just comparing devices by their spec sheets. The integrated graphics chipsets sold by Intel are not high performance. They are, however, dirt cheap, small, low in energy consumption, low in heat output, and known for having stable drivers. Those are not minor benefits. On the one hand, I see a group of people blithely dismissing the tradeoffs Intel made as “sucking”, and claiming that making the chipset “not suck” is just a matter, apparently, of force of will. On the other hand, I see the machines built around that chipset flying off the shelves. Someone is wrong here, and if I have to choose who is wrong, I’m going to choose the people who think that selling games to people who are willing to spend $3,000 on a machine to play at their desk is a viable business plan.
“He wants Intel to develop integrated graphics that don’t suck” translates directly to “he wants everybody to buy more expensive graphics cards.” What Sweeney is complaining about is that Intel has realized that there’s much more market in low-end graphics than in gamer hardware. About a decade ago it became clear that the technology would soon deliver more graphics power than most of us needed. Now that this is the state of the world, Sweeney wants that reality to go away.
Cliffy B was right. Tim Sweeney is wrong.
I thought pretty much the same last time someone from Epic complained about Intel graphics. I remember thinking the same kind of thing back when I was an idiot teenage ‘hardcore’ PC gamer – why would anyone want a computer without a good graphics card?
These days I just don’t want to spend the time and money required to get a gaming PC working.
By the way: Are the Intel drivers all that good? The Vista drivers were pretty terrible; they kept crashing.
Haha! “Plaintively”! Someone’s been playing too much Dwarf Fortress.
I spend my computer gaming time shopping for high-end pc components but never buying.
I love PC gaming. I’ve never had any really bad experiences with compatibility and the like, which is maybe because I build on a budget rather than going for the bleeding edge. I don’t know that for a fact, because my knowledge of computer hardware and drivers and the like is strictly functional, but whenever I hear people going on about compatibility they seem to be talking about unnecessarily high-end machines.
Am I making any kind of decent point here? I’m honestly curious, because it seems my experience of PC gaming is much more happy and carefree than most, and it’s making me terribly paranoid.
Confession: I’m a PC game developer and I have Intel integrated graphics at home. I make AAA titles, not Flash games, and only one of the 5 titles I worked on does not run on my home PC. Still, I can play all the games I want to play. And given a choice, I would target Intel graphics as the max spec for all the titles I work on, for the simple reason that it gives the games the biggest possible target market, makes them less expensive to create, and frees up time for better gameplay. Because believe me, gameplay IS being cut in favor of cool graphics. All the time.
I have been a PC gamer for the best part of the last 10 years; even though I have owned many consoles along the way, only a couple of games have captured my time. As a gamer I have tried to keep my PC at least able to play the latest game I am interested in, but I slowly got tired of chasing the best graphics card because the gameplay just sucked. The latest Doom was what did it for me; simply put, it was just not fun.
The Wii has now recaptured me as a gamer. I have had many hours of fun playing several games on that console, and currently SSBB could not get any more fun. I think when people start putting too much stock in pixels and not enough in fun, they forget what makes games like WoW dominate… they are fun.
A lot of what Tim Sweeney said was pure wishful thinking. But the interview raised one point that might be worth pursuing – external graphics cards that can easily be plugged in and unplugged. That would give those who like graphics-intensive games the option to try out the hardware and software, while still having an internal card and drivers that are stable enough for non-game uses or for less graphically demanding games.
His ideas about game developers coding direct to the graphics card don’t make sense. Either their games become dependent on a specific card (commercial suicide) or they code to a standardised API. We already have usable standard APIs, including DirectX and OpenGL, both of which are still being upgraded. And external graphics cards will need standard APIs.
Game developers should test every build on a machine that was mid-range 2 years ago – that’s what dominates the market at any given time.
That will restrict how much they can put into fancy graphics. They should use the additional resources to: cut out cruft *; improve the way their software plays **; develop novel game concepts instead of re-implementing the most popular genre with fancier graphics ***.
*”Rich is good, complex is bad” – one of this site’s slogans.
**For example, I like strategy games, but their AIs seem no better than in the mid-1990s.
***When was the last truly novel game?