In blogs and forums there is much discussion of multi-core processors as an obvious step in the development of computer systems. And so it is. But there is another important line of development besides multi-core machines: 64-bit technologies. What are they? What are their advantages? What problems do they involve? Here are the viewpoints of users and programmers.
By 64-bit technologies we mean both hardware and software.
The hardware side includes 64-bit processors (Intel 64, Intel Itanium). Consumer processors fall into x86-compatible ones (Intel 64, AMD64) and the Intel Itanium processors, which are not x86-compatible. Of course, 64-bit processors also require corresponding system boards, but these are usually not singled out as a separate class.
The software side includes 64-bit operating systems, 64-bit drivers, and applications (both 64-bit ones and those that merely run correctly on 64-bit operating systems). And of course do not forget the tools for 64-bit software developers: compilers, debuggers, and libraries.
As for the hardware part of 64-bit technologies, it has existed for a long time; it is in use and is most likely present on your desktop.
Things are not so good with the software part. 64-bit operating systems have also existed for a long time, both in the Unix world and in the Windows world, but they are not yet widespread. The reason is that there are still few programs that run without issues in a 64-bit environment. It has gone so far that Microsoft integrated a virtual machine into Windows 7 to deal with the issues of obsolete programs.
The lack of 64-bit programs is caused by a vicious circle that we hope will break in the future. Software developers do not want to invest in 64-bit software because few users run 64-bit operating systems, and users, in their turn, do not install 64-bit operating systems because there are few 64-bit programs. Only some third force can break this circle. For example, Microsoft released Windows Server 2008 R2 in a 64-bit version only.
So, on the one hand it seems we are ready for the mass move to 64 bits; on the other hand, this move is only beginning.
For users, 64-bit technologies mean the opportunity to give their applications more than two Gbytes of memory. Where is that needed? In "heavy" tasks (video, sound, graphics, archiving), in games, and even in browsers (when many dozens of tabs are open). This sounds attractive, but users take their time moving to 64 bits because of problems with interaction between 64-bit and 32-bit software. These problems are being solved, but perhaps not as quickly as users would like.
But 64-bit technologies are relevant not only to "heavy" tasks. Even if each program uses little memory (about one Gbyte), 64-bit technologies become necessary as multi-core processors grow popular. A 32-bit operating system can address no more than 4 Gbytes (2^32 bytes) in total, and in practice even less. And what if several applications run simultaneously on a computer with four cores, each needing a gigabyte of memory? You cannot get by without 64-bit systems.
As said above, programmers take their time and do not make 64-bit versions of their applications, although all the necessary tools exist: compilers for 64-bit applications for most programming languages (both from Intel and from Microsoft) and third-party tools (for example, the PVS-Studio code analyzer, intended for detecting errors characteristic of 64-bit and parallel applications).
The main reason programmers take their time is the need to maintain two versions of a program: a 32-bit one and a 64-bit one. In theory, you can simply recompile a 32-bit application for a 64-bit system, but in practice it is not so easy and many issues arise. That is why you have to maintain both versions, and as long as developers can get by with only one (as with 32-bit games, for example), they do.
The 64-bit world will surely come; how fast depends on both programmers and users. But it is clear that the software vendors who release 64-bit versions of their applications first will have a competitive advantage, and the users who move to 64-bit technologies first will enjoy those advantages before everyone else.
In the C language you may use functions without declaring them. Note that I am speaking about C, not C++. Of course, this ability is very dangerous. Let us look at an interesting example of a 64-bit error related to it. The error involves correct code that allocates and uses three arrays of 1 Gbyte each.
I often hear, in various formulations, the phrase: "The given examples show not code that is incorrect from the viewpoint of porting to x64 systems, but code that is incorrect in itself." I would like to discuss and theorize a bit on this point in the blog. Please take this note with a bit of humor.