I don’t mean literally explode (though after the recent Dell laptop fires and the Sony-supplied batteries behind them, you could be forgiven for thinking so). I’m talking about the market suddenly being hit with multicore processors on the scale of 8, 12, and possibly even 20 or more cores.
After attending the Microsoft Gamefest event I started making some calls, and a number of the people I spoke with kept dropping little tidbits of information suggesting that AMD and Intel are, very shortly, going to go after each other in a major battle that will result in CPUs getting more cores than we currently have a use for. The word is that this battle is going to be fierce and nearly immediate.
Imagine a 20-core CPU. In that scenario you wouldn’t need a video card or a sound card; they would be entirely redundant. The CPU could easily provide the services those cards currently handle.
If this path is correct, it goes a long way toward explaining some recent news and the direction of major companies. For example, if ATI saw that add-on cards were soon to become redundant (or marginalized in much the same way sound cards have been), then it makes a huge amount of sense for them to get swallowed by a CPU manufacturer and continue the battle from there. Clearly ATI must have seen the writing on the wall. One only needs to look at what’s happened to Creative Labs’ share of the PC sound card market to understand the dynamics involved.
Microsoft’s insistence that future sound support be done entirely in software makes a lot more sense as well.
Remember that years ago many of us bought “Stacker” ISA-based cards to put into our machines to handle compressing our hard drives. Once CPUs proved up to the task, that market vanished as it all went to software. Today, drives and space are so cheap that compressing entire partitions is almost unheard of.
Why would anyone buy a video card if all the graphics in a game can be handled entirely by a core or two on a future CPU? After all, a GPU does very much the same kind of work; it’s just one step removed from the CPU. The dedicated core will be IN the CPU.
Another major element of this is the cost savings across the entire industry. PC manufacturers no longer need to worry about a myriad of peripheral cards; they can focus on motherboards, chipsets, memory, and CPU stability (for the most part). Game developers benefit from being able to target a “least common denominator” that reaches just about everyone. Publishers save because they won’t have to field calls from endless waves of customers with hardware and driver issues related to video and sound cards (the bulk of their support calls today), and they won’t have to ship as many costly patches, at least not the ones dedicated to video and sound problems. Consumers save in obvious ways: there will no longer be a need to spend huge amounts of money on video and sound cards, and you can instead focus on the core components of the motherboard, the CPU, and the memory.
I also have to believe that Steve Jobs is salivating at all of this. Think about how the Intel-based Apple looks in this world: it no longer matters whether you’re running a PC or a Mac, because they’re both going to run on identical platforms. Will Apple finally become a real gaming platform?
The counterarguments that have come up don’t sway my thinking much. The first is that graphics are far more complicated than sound. Ask an actual audio engineer how he feels about that statement. While audio in today’s games may pale in comparison to what goes on with video, that’s not a fair assessment of the complexity of the two areas. Video is simply more important to humans, so it gets the larger share of attention. Audio in today’s games isn’t very realistic at all.
Take a very simple case. In real life, a supersonic rifle round arrives well before the sound of the shot, so the victim of a sniper attack falls to the ground and starts bleeding before anyone around them actually hears the shot itself. No game works like that: when someone fires a sniper rifle, everyone on the map hears it at exactly the same time, instantly. Sound reflections are almost entirely non-existent in games, and yet they’re critical to our everyday hearing. Fully modeling even the most basic sounds would bring a room full of today’s best PCs to their knees. If you want to make an argument here, the real argument is about the balance between realism and fun. However, the fact remains that today’s audio is still in its infancy.
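To put rough numbers on the sniper example, here’s a quick back-of-the-envelope sketch in C. The figures are my own assumptions (a muzzle velocity of roughly 850 m/s for a typical rifle round and a speed of sound of about 343 m/s, with drag ignored), so treat it as an order-of-magnitude illustration rather than anything authoritative:

    #include <stdio.h>

    /* Rough sketch: when does the bullet arrive versus the sound of the shot?
       Assumed values: bullet ~850 m/s (treated as constant, drag ignored),
       sound ~343 m/s in air at about 20 C. */
    int main(void)
    {
        const double bullet_speed = 850.0;   /* m/s, assumed */
        const double sound_speed  = 343.0;   /* m/s */
        const double distances[]  = { 200.0, 500.0, 800.0 };

        for (int i = 0; i < 3; i++) {
            double d        = distances[i];
            double t_bullet = d / bullet_speed;   /* bullet flight time */
            double t_sound  = d / sound_speed;    /* time for the report to arrive */
            printf("%4.0f m: bullet %.2f s, sound %.2f s, gap %.2f s\n",
                   d, t_bullet, t_sound, t_sound - t_bullet);
        }
        return 0;
    }

At 500 meters the sound shows up almost a full second after the bullet does, and no game I know of models even that much.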
So back to the core argument. Yes, today graphics are more complicated than sound, by a wide margin. However, that doesn’t mean CPUs couldn’t pull off the same magic. Intel has been after this goal for many years and, for a good chunk of buyers, has succeeded. I suspect that AMD’s main interest in ATI wasn’t to sell top-brand video cards but to integrate top-brand graphics technology into future “massive-multicore” processors.
I remember reading arguments back in the 8088/80286 days that floating-point math was so intensive that it was beyond the CPU’s capability to handle. When’s the last time you bought a stand-alone math coprocessor? Once the CPU made that task trivial, the math coprocessor vanished from existence. I suspect most people reading this paragraph wouldn’t have any idea what I’m talking about unless they lived through it.
Interesting thoughts to ponder.
2 Comments
I think a distinction needs to be made between “being done in software” and “being done in hardware that is present on the CPU”. The comparison with the numeric coprocessor is precisely the case in point. Yes, the numeric coprocessor functions are now contained in the CPU, but they are NOT done by software. There is special hardware circuitry, the equivalent of an x87 chip, in the CPU core, and it is this hardware that performs the floating point operations. Software floating point is extremely slow compared to what the hardware can do. The newer instruction sets, MMX, SSE, SSE2, SSE3, are the same. They are hardware solutions, just packaged within the CPU instead of externally. So while I agree that the “add on” hardware may disappear, I do not agree that graphics or audio will be done in software. I think CPU manufacturers will continue to devote more and more transistors to specialized functionality and hardware support for operations common to video processing tasks. The video processing will still be done in hardware, not software. It will just be done on the CPU rather than off it.
Brian,
Your point is well taken. The CPUs may well include special hardware for this type of functionality, which will be accessed via software (API calls, etc.).
Thanks for the response. Great clarification and one I agree with entirely.
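For readers who haven’t run into this distinction, here’s a minimal sketch of what “on-CPU hardware accessed via software” looks like in practice. It uses the SSE intrinsics from <xmmintrin.h> (my choice of example, not something from the exchange above): the C code is ordinary software, but the _mm_add_ps call maps to a single instruction executed by dedicated floating-point hardware inside the CPU.

    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE intrinsics: the software interface to on-CPU vector hardware */

    int main(void)
    {
        /* Four additions performed by the CPU's SSE unit in one instruction.
           The "software" part is simply how we reach that hardware. */
        float a[4]   = { 1.0f, 2.0f, 3.0f, 4.0f };
        float b[4]   = { 10.0f, 20.0f, 30.0f, 40.0f };
        float out[4];

        __m128 va   = _mm_loadu_ps(a);       /* load four floats into an SSE register */
        __m128 vb   = _mm_loadu_ps(b);
        __m128 vsum = _mm_add_ps(va, vb);    /* one hardware instruction, four results */
        _mm_storeu_ps(out, vsum);

        printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }

Whether future CPUs expose their graphics and audio blocks through intrinsics like these, through drivers, or through something like DirectX is anyone’s guess, but the division of labor Brian describes holds either way.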