The 64-bit era approaches


One of the few benefits of Windows was that it accelerated the replacement of 16-bit processors by the now de facto standard 32-bit Pentiums and their like. The vastly improved addressing gave real advantages to software developers.

Intel, in cooperation with Hewlett-Packard, have been developing a 64-bit single-chip processor, basically upwards compatible with the 32-bit processors but with a number of RISC features. It is seen as important that there is a path to attract developers of RISC systems into the Intel (and clone) mainstream. HP are showing distinct signs of migrating to Intel processors, which leaves IBM/Apple (Power processor) and Sun as the only alternatives; the former has the size to continue development and to support both architectures, but Sun surely must embrace the Intel architecture before too long. There is little to be gained these days from one processor versus another; it is the system that employs it that is the critical factor. Indeed, there are huge advantages in using the most common processor and thus the best-supported instruction set. History saw the inferior Intel 8- and 16-bit processors beat the Motorola 68000 family largely because Gary Kildall had worked with Intel and thus wrote CP/M for the Intel instruction set, thereby gaining the toehold into the mass market we see today. Nevertheless, we should note that the Apple Macintosh used the superior Motorola processor and the software it produced was an order of magnitude better than what we got on the PC; it is not only the processor that counts, but it helps.
By using the most common processor a computer designer gains access to the latest software first. In earlier days software was written in low-level languages and was specific to a particular instruction set. Today most system software and many applications are written in C or C++, which can be recompiled to work with various processors; even the compilers need only a minimum of processor-dependent coding. But that bit of special code and, more significantly, the implementing and testing of code mean that non-mainstream processors lag behind in availability. Windows, for instance, is written in C and could be ported to other processors. Initially Microsoft made much of this fact, but in practice only paid lip service to it with the Alpha port, in order to win anti-IBM support from Digital. But Windows depends not only on the CPU but on the low-level drivers, in particular the video subsystem, and so it only ever matured on a PC-architecture machine. The same is true to some extent of Linux. The difference, however, is that Linux is implemented in a wide variety of forms, ranging from pocket PCs to mainframe servers. It is effectively a common base with many specific implementations. Thus, while there is a common core, a PC version of Linux cannot be fully ported to an IBM mainframe because of the graphics. They can, however, use the same databases, compilers, etc. The ability to produce a version specifically matched to a requirement, e.g. desktop, server or embedded, is a huge advantage of Linux, and Microsoft must come to terms with the fallacy of the "one OS fits all" concept. Who wants all the security problems that graphics support creates on a server?
The diversity of CPUs narrowed with the move from 8- to 16- and from 16- to 32-bit processors. It should narrow still further with the imminent addition of the 64-bit Intel Itanium processor to the small set of current 64-bit RISC processors. This is largely due to the cost of developing and manufacturing such massively complex components, but it is also due to a more intriguing question: who wants them? I am sure many people said they didn't need the move to 32-bit processing, but that was a real advance that many common-or-garden applications needed. Sound processing, databases, audio and video manipulation, indeed any sophisticated graphics, all needed the increased addressing capability. The introduction of 32-bit data types also helps, but it means that a lot more memory is needed. There are many examples of a 32-bit version of an older 16-bit application running slower than the original. The slight performance problems and the increase in memory requirements have so far been more than offset by technical advances. A 16-bit address equates to 64K, a 32-bit address to a massive 4096 MB. What application really needs more?

Martin Healey is a pioneer of the development of Intel-based computers and client/server architecture, a director of a number of specialist IT companies and an Emeritus Professor of the University of Wales.

Article published 17 January 2003, by Martin Healey.