The Viva64 and VivaMP tools are no longer developed as separate programs. All their capabilities for detecting errors specific to 64-bit application development, as well as for porting code from 32-bit to 64-bit platforms, are now available within the PVS-Studio analyzer.
64-bit processors, operating systems, and some programs have been available for quite a while, yet far from all users have moved to 64 bits. This article examines the reasons why.
In 2003-2004 another revolution started in the world of PCs. It embraced all computers, all operating systems, and all application programs without exception. Having begun rapidly, it should have boosted computer engineering to the next level by 2005-2006. However, as I write these words (2008 is coming to its end), this revolution has not yet reached its peak. What is meant here is the migration to 64-bit technologies. Let's see how this migration started, how it has gone on, and what the present situation is.
Generally, 64-bit technologies weren't a real innovation even in 2003. 64-bit solutions had existed on platforms other than the PC for rather a long time, but the first 64-bit PC processors appeared only in 2003. These processors were implementations of AMD's AMD64 technology and Intel's EM64T. The reason for their appearance is rather prosaic. 32-bit processors share a common limitation: they can address only 4 gigabytes of RAM, of which no more than 2 GB is available to each user application. As long as this limitation hindered only professional spheres (designers, engineers, etc.), the problem was solved in a very simple way: specialists used nonstandard computing hardware, and that settled the matter. However, when ordinary PC games reached 2 GB, even "home" users needed new processors.
Naturally, the market met the new demands of users. The first models of 64-bit processors appeared in 2003, and by 2005 it was hard to buy anything else. The problem seemed to have been solved, didn't it? Unfortunately, to enjoy the advantages of 64 bits a user must have 4 GB (or even more) of RAM in the computer, but in 2005 computers with 1 GB (2 GB at most) were the most typical configuration.
By the end of 2006 the RAM in common PCs had grown to 3-4 GB. One would think it was time to evaluate the advantages of 64-bit computers. But to do that, one also needs a 64-bit operating system. Fortunately, Microsoft had released 64-bit versions of Windows XP and Windows Server 2003 long before that. However, these operating systems were slow to reach users' PCs, and the situation remains the same now: practically all users' PCs run 32-bit operating systems such as Windows XP and Windows Vista. This can be explained by the lack of 64-bit drivers (then as now) and, even more importantly, by the lack of 64-bit programs. Indeed, although 32-bit programs run on 64-bit operating systems with hardly any trouble at all, the larger volume of RAM can be used only by 64-bit programs.
It is precisely the scarcity of 64-bit programs that explains why the majority of users cannot say they employ a 64-bit infrastructure in their work, despite all its advantages. The process of porting programs from 32-bit systems to 64-bit ones turned out to be rather difficult, and it is the developers themselves who are to blame for the shortage of 64-bit programs. Let us investigate the question.
To release a 64-bit version of a program, it must be built with a compiler able to generate 64-bit code. Some companies (for instance, Microsoft and Intel) released 64-bit compilers rather quickly, so by 2005 these compilers were already available to developers. Other companies (for example, Borland) did not release 64-bit versions of their development tools. As a result, programs developed in the Borland C++ environment cannot be compiled to work in a 64-bit environment.
But since many programs are developed with Microsoft and Intel tools, one would think there should be lots of 64-bit programs. Alas, the experience of many software development companies showed that simply recompiling a program with a new 64-bit compiler is not enough.
The matter is that programs compiled for 64-bit platforms may exhibit errors unexpected even by their developers. Here are some examples.
After being compiled for 64 bits, one program appeared to work perfectly until the user pressed F1. Instead of the expected help window, a short message appeared saying that the help could not be loaded. The help system would seem to have no relation to processor capacity, but the problem was precisely the program's incorrect interaction with the new system: the behavior of the functions responsible for the help system had changed, and it simply "broke".
Here is another example. A data visualization application consuming about 2 GB of RAM was a natural candidate for a 64-bit version. The new 64-bit version worked perfectly and users faced no problems, until someone tried to build an image from data exceeding several gigabytes. The image was built, but only half of it was displayed on the screen. Why? Because of incorrect handling of large volumes of data.
A person unfamiliar with the software development industry may ask: "Why aren't such errors discovered at the testing stage?" The answer is not entirely obvious. The existing systems of both internal and external software verification simply do not detect errors specific to 64-bit systems. Moreover, testing must often be performed quickly, so the processing of several gigabytes of data may never be exercised at all. Developer tools stood aloof from the problems of 64-bit software for a long time; specialized tools appeared only recently, in 2007. The author of this article takes part in the development of one such tool.
What tools are we speaking about? There is a special category of programs called static code analyzers. Such an analyzer "dismantles" a program's source code and produces a list of its potentially problematic fragments. The developer then corrects them, and the program is ready to enter the 64-bit world.
Now that up-to-date tools for 64-bit program development are available to developers, one may at last expect mass migration to 64-bit systems. 64-bit versions of almost all of the most widely used programs will soon be released, and users will finally be able to enjoy all the advantages of the 64-bit world.
Evgeny Ryzhkov works at OOO "Program Verification Systems", where he is engaged in the development of software products such as Viva64 and VivaMP. These are developers' tools for verifying 64-bit and parallel programs by means of source code analysis. He has several published works on the testing and development of complex software packages.
In forums, people often say that 64-bit versions of programs consume more memory and stack, usually arguing that data sizes have doubled. But this statement is unfounded, since the size of most types (char, short, int, float) in the C/C++ language remains the same on 64-bit systems. Of course, for ...
I often hear, in various wordings, the phrase: "The given examples show not code that is incorrect from the viewpoint of porting to x64 systems, but code that is incorrect in itself." I would like to discuss and theorize a bit on this point in the blog. Please take this note with a bit of humor.