About graphical cards…

A friend of mine blogged about the infamous AMD-ATI merger. His concern is that, as with many mergers, the big company (AMD) will kill off the small one (ATI).

This might well be the case, but I’m not too worried. My feeling is that this merger is in some sense a symptom of a larger change in the market for graphical adapters. When SGI went into decline, many engineers left for two upstart companies, ATI and Nvidia. At the time, Matrox was the king of the hill; nobody nowadays has a Matrox graphics card. If the merger kills off ATI, the engineers will leave, but will they go to a company with a near monopoly and no reason to innovate? Maybe. Or they will start a new company.

The performance war between Nvidia and ATI has brought many innovations, but also its share of aberrations: adapters that need their own cooling and power supplies, frame-rate benchmarks well above the display’s refresh frequency. The graphical quality of games has improved a lot, but at the same time the development cost of games has exploded. The problem is, we have reached a limit: while graphical quality can still be increased, the change will not be very noticeable; a few more polygons or some additional vertex shaders are not going to be visible to the untrained eye.

My impression is that the world of graphical adapters is going to change a lot in the next few years, and the important factors are not so much related to games.

Desktop usage
Windowing systems have always taken advantage of some features of the graphical adapter, typically the ability to move rectangular regions of pixels around, but the amount of desktop work done by the graphical adapter has increased a lot lately. One important influence has been Quartz Extreme, introduced by Apple in Mac OS X 10.2, which offloads window compositing to the graphical card. Image and video filtering were offloaded to the graphical card with Mac OS X 10.3, and offloading of the drawing primitives (lines, Béziers, polygons and glyphs) is in the works. Windows Vista should have similar features, and so does the new version of the popular X server, X.org. This means that the primary application for graphical cards nowadays is not a game or a CAD program; instead it is a window manager.
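To give an idea of what “compositing” means here: each window draws into its own off-screen buffer, and the compositor blends those buffers into the final frame. On the graphical card this is simply one textured, alpha-blended quad per window. Below is a minimal CPU sketch of that blending step; the buffer sizes and the two windows are made up for illustration.

    import numpy as np

    def composite(screen, windows):
        """Blend each window's off-screen buffer onto the screen buffer.

        screen:  H x W x 3 float array (RGB values in 0..1)
        windows: list of (x, y, buffer, alpha) tuples, back to front;
                 buffer is h x w x 3, alpha is the window's opacity.
        """
        for x, y, buf, alpha in windows:
            h, w, _ = buf.shape
            region = screen[y:y + h, x:x + w]
            region[:] = alpha * buf + (1.0 - alpha) * region
        return screen

    # A grey desktop with an opaque and a translucent "window".
    desktop = np.full((480, 640, 3), 0.5)
    win_a = np.full((200, 300, 3), 0.9)
    win_b = np.full((150, 250, 3), 0.2)
    frame = composite(desktop, [(50, 50, win_a, 1.0), (150, 120, win_b, 0.7)])

A compositing window manager does exactly this, except the blending runs on the graphical card and the window buffers live in video memory as textures.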
Mobile computing
I don’t own a desktop anymore, and neither do many other people. Many computer manufacturers sell more laptops than desktops. Additionally, margins are higher in the laptop market than in the desktop market, which has largely become commoditized to the point where innovation is rare. The rare innovations in desktop computing come from a few companies, like Apple or Sony, and usually rely heavily on laptop technology. One important consequence is that power consumption matters: batteries never last long enough, and cooling is noisy.
Virtualisation
Until now, if you wanted good graphical performance, you had to switch the game to full-screen mode. A similar problem arises when you are running an OS under emulation or virtualisation: graphical performance is bad. As computers become more powerful, many applications will want to use the graphical adapter, from small games running inside windows to operating systems running on separate displays connected to the same adapter. All these applications need better virtualisation support.
High resolution
Display resolutions have increased over the last few years, from 72 DPI (dots per inch) to more than 120 DPI today. Still, this is very far from print resolution (300+ DPI). The problem is, interfaces have trouble coping. Resolution independence is slowly arriving, and once the APIs are adopted, I suspect we are going to see many more high-resolution displays.
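The arithmetic behind resolution independence is simple: sizes are specified in points (1/72 of an inch) and converted to pixels using the display’s DPI, so the same control covers more pixels on a denser panel instead of shrinking. A toy sketch, with DPI values chosen only as examples:

    def points_to_pixels(points, dpi):
        """Convert a physical size in points (1/72 of an inch) to pixels."""
        return points * dpi / 72.0

    # The same 12 pt control on displays of increasing density:
    for dpi in (72, 96, 120, 200, 300):
        print("%3d DPI -> %5.1f px" % (dpi, points_to_pixels(12, dpi)))
    # Without this conversion, a control specified as 12 pixels simply
    # shrinks (in millimetres) as the DPI increases, which is why old
    # interfaces look tiny on high-resolution displays.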
Display interaction
In the old days, the graphical card used to send a video signal to the display. If you were lucky, the display could actually handle the signal; figuring out whether a given display supported a given mode was impossible. DVI and newer VGA connectors support the Display Data Channel, which basically means the graphical card can talk to the display over a data bus. HDMI supports the same protocol. While the bandwidth is not that large, it is not negligible either: 3.4 Mbit/s. Currently, applications of this bus are minimal: basically reading the resolution and the maker of the display, and controlling its sleep mode. But many more are possible.
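To give an idea of what already travels over that bus: the display answers with an EDID block, and decoding the maker and the preferred resolution takes only a few lines. The sketch below assumes a Linux machine that exposes the EDID through sysfs (the exact path varies from system to system); the byte offsets come from the standard EDID 1.x layout.

    import glob

    def parse_edid(edid):
        """Decode maker and preferred resolution from a 128-byte EDID block."""
        assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"
        # Manufacturer ID: three letters packed as 5-bit values in bytes 8-9.
        word = (edid[8] << 8) | edid[9]
        maker = "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                        for shift in (10, 5, 0))
        # The first detailed timing descriptor (bytes 54-71) holds the
        # preferred mode; the active pixel counts are split over several bytes.
        d = edid[54:72]
        h_active = d[2] | ((d[4] & 0xF0) << 4)
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        return maker, h_active, v_active

    # The sysfs path is an assumption (Linux DRM exposes EDID blobs there).
    for path in glob.glob("/sys/class/drm/card*-*/edid"):
        with open(path, "rb") as f:
            blob = f.read()
        if len(blob) >= 128:
            maker, w, h = parse_edid(blob)
            print(path, maker, "%dx%d" % (w, h))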
In my opinion, new applications between the graphical card and the display are possible because displays are becoming increasingly complex. Some displays are also USB hubs or loudspeakers, and I suspect we will increasingly see displays that are also TV sets. Technically, you could stream audio over a 3.4 Mbit/s bus. Any LCD display has an integrated chip that can handle overlays, menus, configuration, basic filtering, two inputs and many other features.

I have the feeling the integration between the graphical adapter and the display could be improved a lot. For starters, gamma control and color correction should not be done in the graphical adapter, but in the display. Another feature I would like to see is the ability to tell the display to keep one or more pictures in its own buffer and show them while the graphical card slows down or shuts down to save power. When you are not using your computer, you don’t want the display to go to sleep immediately, but slowly lowering the frame rate and buffering frames in the display might be a good transition from awake to asleep.
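To make the gamma point concrete: gamma correction is nothing more than a per-channel look-up table, which today the graphical adapter applies to every frame it sends. A small sketch of such a table, with the gamma value chosen only as an example:

    def gamma_lut(gamma, size=256):
        """Build the per-channel look-up table used for gamma correction."""
        return [round(255 * (i / (size - 1)) ** (1.0 / gamma))
                for i in range(size)]

    lut = gamma_lut(2.2)                # a common display gamma
    print(lut[0], lut[128], lut[255])   # 0, 186, 255
    # Today the graphical adapter applies a table like this to every frame
    # it sends; the point above is that the display's own controller could
    # hold the table instead, and the card could send untouched pixel values.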

Of course, these are just hunches and ideas, and I might be completely wrong. I just think that the next king of the hill in the domain of graphical adapters might be a completely unknown company.

One thought on “About graphical cards…”

  1. I more or less agree with this, but I think you are underestimating the gaming market. It has been like five years since graphics chip manufacturers realized that triangle count was not such a big deal and that they had to find another path.

    This was achieved with vertex/pixel shaders and technologies like HLSL. These technologies are improving fast and gaining a lot of popularity among developers and gamers. For developers, this was a way to distinguish their games from others (having a nice shader effect), and for customers it produces some very interesting visuals.

    I think this was definitely not a good time for ATI to vanish. Now Nvidia & Microsoft can just impose the new ‘standard’ for pixel shaders!
