PVS-Studio project - 10 years of failures and successes

Dec 29 2016
Author: Andrey Karpov

Ten years ago, we created a simple utility called 'Viva64' intended to detect problems in 64-bit code. This is how the PVS-Studio static code analyzer came into being. Although 10 years have passed, we only started doing something more or less 'worthy' as a company a few years ago. This article isn't a 'success story', because we think that the most interesting events are yet to come. However, 10 years is quite an occasion, and a good time to assess some of the results of our work, and to tell our readers how it all started, which mistakes we made, and what we finally did right. At times, I may not be very precise in the chronology of events: 10 years is a long period of time, and memory is not perfect.

Enjoy reading!

0465_pvs-studio_10_years/image1.png

Background

I (Andrey Karpov) and Evgeniy Ryzhkov worked in a small company in Tula that was developing a package for numerical simulation and data visualization. The job was rather interesting, and it let us get in touch with the cutting-edge technologies of that time. For instance, while working on data visualization, we experimented with 3D glasses that shut each eye in turn. The images were displayed on a CRT monitor, which alternated between a picture for the left eye and one for the right. I think very few people have seen this ancient thing: the glasses were connected to the computer with a cord, and the CRT monitor ran in interlaced mode at 100 fps, so that each eye got at least 50 fps. It looked really horrible, and the eyes started to hurt almost instantly, so it was no surprise that this technology never became popular.

Another thing we did was build a budget cluster out of ordinary PC motherboards. We used motherboards with two physical AMD processors. Today nobody would be surprised by a processor with 8 cores, but at that time a motherboard with 2 processors was a real wonder in an average shop in Tula. We combined 4 boards and got an 8-processor mini-cluster, which we used for various calculations. I personally took part in soldering this device:

0465_pvs-studio_10_years/image2.png

Figure 1. A cluster built with the material we had at hand. Characteristics: 8 CPUs, 16 gigabytes of memory.

It didn't look very nice, as we didn't manage to solve the problem of ventilation and we had to remove the back cover, sacrificing the outer beauty of the device. However, the cluster coped with the tasks quite well.

We also dealt with the first 64-bit computers available to ordinary users. These were machines with Opteron processors and what seemed at the time a huge 4 gigabytes of memory. So, it all started with those machines.

In 2005, Visual Studio 2005 was released, which made it possible to develop 64-bit programs for the 64-bit architecture (at that time it was called AMD64). I remember going to a Microsoft conference in Moscow, where it was demonstrated how easily code could be recompiled for a 64-bit processor. In general, 64-bit technology was an important trend in the development of computers.

Of course, 64-bit microprocessors existed before that too, for example, Itanium. But it was AMD64 that had the major influence on the IT industry; it was thanks to AMD64 that 64-bit processors became available to everybody, and Windows programmers got the chance to write programs for them in the comfortable Visual C++ development environment.

Memory space is extremely important in visualization and numerical modeling tasks, therefore, immediately after the release of Visual Studio 2005, we started working on 64-bit applications.

It was no surprise that we were among the first programmers adapting their application code for 64-bit processors, and as a result we stepped on quite a few rakes. For example, we found out that the hardware keys used to protect our software distribution weren't quite ready for 64 bits. I cannot recall exactly what the issue was, but we really went through a lot of trouble over the new variant of copy protection. There were some other nuances about building the distributions, but these are all details; the most interesting part came later.

Microsoft didn't lie - we really managed to recompile our applications in x64 mode quite quickly; it took us about 3 weeks. At that moment it seemed to us that we now had 64-bit versions of our applications.

Ha! We had built them, yes, but the programs didn't work correctly. Moreover, there was a real puzzle to it all. The programs passed the unit tests successfully and worked correctly on the test data. But when we tried to use all the power that a 64-bit application provides, strange errors appeared. The programs started glitching when they allocated more than 10 gigabytes of memory to handle large input datasets.

0465_pvs-studio_10_years/image3.png

Figure 2. I don't have any pictures demonstrating the visualization errors in the 64-bit application; I couldn't imagine that after many years I would need them. But this picture is very similar to what those errors looked like: suddenly, only a part of the object was displayed.

Now I know the causes of this strange behavior. In some fragments, a pointer was converted to int and then back to a pointer. If the pointer referred to an object in the lower 4 gigabytes of memory, everything was fine. But if the object was created above the 4-gigabyte boundary, problems were inevitable.
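
To show what I mean, here is a hypothetical minimal sketch of that failure pattern (an illustration of mine, not code from our product): the pointer survives the round trip through int only if the object happens to live in the lower 4 gigabytes of the address space.

#include <cstdint>

// Sketch: a pointer squeezed through a 32-bit int and back.
void UsePointer(float *p)
{
  int handle = static_cast<int>(reinterpret_cast<std::uintptr_t>(p)); // truncated to 32 bits
  // ... the handle is stored somewhere and later turned back into a pointer ...
  float *q = reinterpret_cast<float *>(
      static_cast<std::uintptr_t>(static_cast<unsigned>(handle)));
  *q = 1.0f; // writes to a garbage address if p pointed above 4 GB
}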

There were errors in such evaluations:

unsigned int X, Y, Z;
Uint64 Q = X * Y * Z;  // Uint64 is a 64-bit unsigned typedef; the product itself is still computed in 32 bits

Although the result is stored in a 64-bit variable, it doesn't help: the multiplication itself is performed in 32 bits and overflows before the result is widened.
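
The usual fix (shown here as a sketch of mine, not a quote from our code) is to force the multiplication itself to be done in 64 bits, for example by casting the first operand:

#include <cstdint>

unsigned int X = 3000, Y = 3000, Z = 3000;

// Broken: the product is computed in 32 bits, overflows,
// and only then is the truncated result widened to 64 bits.
std::uint64_t Broken = X * Y * Z;

// Correct: the cast makes every multiplication 64-bit.
std::uint64_t Fixed = static_cast<std::uint64_t>(X) * Y * Z;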

Of course, there are a lot of other ways to shoot yourself in the foot. You can find more details on various 64-bit pitfalls in the article "A Collection of Examples of 64-bit Errors in Real Programs".

At that time in 2005, the things happening to our programs seemed akin to some strange magic. What's more important, we had no idea how to find and eliminate such defects - all our methods ceased to work unexpectedly.

Once again, the unit tests ran correctly and did not reveal anything; everything was fine with small test data as well. It is almost impossible to debug on large amounts of data. Firstly, it is very slow: if the release build has to work for half an hour before the errors start showing up, the debug build has to work for long hours. Secondly, it is unclear what to look for in the debugger. Should we start examining billions of loop iterations to find where something goes wrong? When debugging, programmers always try to use a minimal data set to reproduce the problem, but small sets showed us that everything was fine.

We decided to use the BoundsChecker program, which was quite popular at that time and had already helped us several times. But it turned out that it could not yet work with 64-bit applications. And even if it had worked, I don't think it would have been of great help: a program run under BoundsChecker slows down by dozens of times, which in our case would have meant days of waiting.

We realized that we had come to an impasse: we knew that there were errors in the program, but did not know how to find them.

Our team began studying the situation in more detail. We experimented a lot and surfed the net to find at least some information. Gradually it became clearer what we were dealing with. We began to understand what kinds of bugs lived in our programs, but that wasn't much help. Suppose we suspect that an arithmetic overflow is to blame for some error. So what next? How do we find those fragments?

Being really frustrated, we started looking for a way out. We considered replacing all the integer types with special classes, such as SafeInt: this would at least allow integer overflows to be detected at run time. However, as it turned out, it is very difficult to do this in an existing application.
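
To give an idea of what such a replacement looks like, here is a minimal sketch of a checked-integer wrapper (my own illustration, not the real SafeInt interface); it traps on overflow in multiplication, which is exactly the class of bugs we were hunting.

#include <cstdint>
#include <stdexcept>

// Minimal illustration of a "checked int" in the spirit of SafeInt.
// Only multiplication is shown.
class CheckedUInt32
{
  std::uint32_t m_value;
public:
  explicit CheckedUInt32(std::uint32_t v) : m_value(v) {}

  CheckedUInt32 operator*(CheckedUInt32 rhs) const
  {
    std::uint64_t wide = static_cast<std::uint64_t>(m_value) * rhs.m_value;
    if (wide > UINT32_MAX)
      throw std::overflow_error("32-bit multiplication overflow");
    return CheckedUInt32(static_cast<std::uint32_t>(wide));
  }

  std::uint32_t value() const { return m_value; }
};

Replacing plain unsigned int with such a type across a large existing code base is exactly the part that turned out to be impractical for us.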

After that we tried out some static code analysis tools and even bought Gimpel PC-Lint - an analyzer that wasn't really useful here, since it wasn't meant to look for 64-bit errors.

That was the moment we realized that we were the first to run into this problem seriously: 64-bit errors existed, but the world didn't yet have any tools for finding them.

How did we sort it out? We decided to read the code; of course, not all of it. We configured PC-Lint so that it would issue a warning for every explicit type conversion, every implicit extension of int to 64-bit types, and so on. Of course, we got an incredible number of useless warnings, but it was still better than reading the code from beginning to end.

We reviewed the potentially dangerous fragments that PC-Lint pointed out to us with this special configuration. We also read the most important modules and functions in full. After several months we finally got stable versions of our 64-bit applications.

As you have probably guessed, only two people were involved in the task of porting to a 64-bit system: myself, and Evgeniy Ryzhkov.

Conclusions we made:

  • when porting systems to 64-bit, people will have to face 64-bit errors;
  • these errors are very difficult to search for;
  • there are no tools to help find such errors.

Around the same time, another fateful thing happened: Evgeniy Ryzhkov became very keen on books about start-ups and on a then-trending direction called ISV (Independent Software Vendor). Actually, the word "startup" wasn't really in use at that time; at least, not in the way we use it now.

Eleven years ago, the cell phone industry wasn't that developed; there was no AppStore or anything like it. If there had been, perhaps Evgeniy would have started creating games for phones and tablets. As it was, he was confined to an ordinary computer. He made his own application, a coloring 'book' for kids, and tried to sell it. He managed to sell some copies, but it was never going to lead to serious success or profit, so he started looking for new ways to apply his knowledge.

I also had some entrepreneurial inclination, and wanted to create something of my own; but these ideas didn't have a precise form, and were just thoughts. Then, Evgeniy suggested thinking about creating something that would conquer the world and make us rich and famous - the classic dream of a startupper was really contagious. So, we started thinking hard.

Viva64 version 1.0

What we had:

  • Two people who wanted to organize some kind of start-up;
  • These two people knew that there was a problem with finding errors in 64-bit C++ programs.

It would seem there was nothing to think about: we should just start making and selling a tool that looks for 64-bit errors. Still, it took us quite a long time to come to this understanding. It seemed to us a complex and obscure task that no one knew how to solve; after all, there were no tools for it.

First, we went through some plain and simple ideas. We didn't want to make websites; we wanted to create a finished software product. But the trouble was that we didn't understand what we could offer the world. We wanted to choose a direction that would be in demand, not just some abstract great idea.

In time, we really started to think about a tool that would search for 64-bit errors in C and C++ programs. In the beginning we were wondering, "What could it be?". At first we thought that it could be something like a dynamic analyzer - like BoundsChecker. However, it was too hard, plus it was unclear how to look for certain bugs.

Gradually, we came to the realization that it should be a static code analyzer, i.e. a program that points a programmer to areas of code which need to be reviewed. We thought we should create a tool like PC-Lint or Parasoft C++test, but aimed only at searching for specific types of errors.

The important question was whether we would be able to make such a tool at all, because it meant parsing C++ code. We had to study this question too.

There was no LLVM at that time, so we were considering taking the GCC compiler or some open-source library as a basis. Of course, there were paid libraries for parsing C++ code, but we didn't even consider those. GCC seemed too complex and ponderous, and it was unclear whether we could build a closed-source product on top of it. We didn't want to make an open project, because we didn't see how we could make a living from it. As a result, the choice fell on a little-known library called OpenC++. By that time the library had already been abandoned, but this didn't stop us; it seemed quite simple, and with its help we managed to write a small diagnostic quite quickly.

So, we had made the main direction clear for ourselves: we were going to make a tool that looks for 64-bit errors in C/C++ code. It would be a classic static code analyzer, but we tried to avoid that phrase at the time; we thought it would confuse people searching online for 'a tool to find 64-bit errors'.

Having experience with Gimpel PC-Lint, we decided to make the tool a plugin for Visual Studio. Back then, you had to use almost magic spells to run PC-Lint from Visual Studio; I wrote a post on this topic that turned out to be quite popular: "Installation of PC-Lint and its using in Visual Studio 2005". We felt that this kind of integration didn't really suit us, and that we had to provide users with a convenient interface: a person should be able to install the tool and start checking their project right away. This is a principle our team still follows, and we consider it very important.

At that time, we imagined Viva64 as a simple utility that we would sell for $200, but on a massive scale. We thought demand for the tool would rise dramatically, fueled by the global rewriting of programs for 64-bit processors. The plan was: we make a simple tool that everyone will badly need, sell it for 3-4 years, and then, having made a huge profit on this genius idea, move on to some other project. These were our youthful fantasies about a cool idea and a quick path to riches. We even made the following graph showing how we expected demand to look.

0465_pvs-studio_10_years/image4.png

Figure 3. In 2006, Evgeniy came up with this graph showing the supposed demand for a solution for checking the compatibility of C++ code with the AMD64 platform. We assumed that demand would decline in 2010, when Microsoft would release some standard solution that would force the Viva64 analyzer out of the market. Still, we were hoping for rapid sales in the first 2-3 years and planned to save some money for future endeavors.

Inspired by these dreams of immediate success, we started programming, building the distribution package, and creating the first version of the site. We did all of this in the evenings, because during the day we were still working full time at the office. The path from the idea to the first version took about a year.

Finally, D-Day came. On December 31, 2006, we posted the first public release of Viva64 1.00 online. I remember Evgeniy telling me to do it before the new year, so that the version history would start in 2006: users would think the tool was already a year old, and it would look more solid. Now, after a journey of 10 years, it all looks naive, but back then it seemed very important.

The budget for creating the first version of the analyzer and the site was 43,200 rubles. Of course, our own working time isn't included in this figure; that was an additional expense. For a better understanding, at the 2006 exchange rate this amount was about $1,710. So we can say we didn't spend much on the new project.

In 2007, we started selling our tool and gradually improving it, so the workload increased. Besides the programming tasks at our regular jobs and work on the analyzer, we somehow had to promote Viva64 among programmers. We started learning to write articles, answering on forums, and trying out various kinds of paid advertising.

This tempo of life quickly burnt through our mental and physical resources. Additionally, things weren't going well at the company where we worked full time; we realized that we could no longer go on like that and decided to quit.

Still, there weren't many sales; we managed to sell several copies, but so few that it's not even worth mentioning here. We didn't even try to withdraw the money from the reseller account, as the sum wouldn't have been of any help anyway.

It was a hard time for us: there was less and less money, and nowhere to get more from. We were sorely tempted to quit and just get a job somewhere, but we tried not to give up. We comforted ourselves with the thought that programmers are always in great demand, and if we went bankrupt, we would be able to find a job within a week.

Of course, we were looking for any possible source of finance so that we could continue development, and for some reason we believed that we just had to wait a little and there would be a huge breakthrough. There were a lot of chaotic actions at that time which I don't remember clearly now. For example, we once decided to visit the AutomatedQA company (now SmartBear, which has an office in Tula) and talk to the director, Sergey Lisitsin. We showed him what we had and tried to raise some interest in our tool, but our attempts were unsuccessful. Perhaps we were simply knocking on the wrong door and couldn't present ourselves properly.

We had to deal with this situation somehow, because Viva64 refused to become popular and famous. This pushed us into outsourcing work, and we were eager to take on anything at hand; Ingate and Intelsys were among the companies we worked with at one point. Later we took part in a large project for an Italian company: a program for dental technicians who make dental prostheses. A dental bridge or crown was designed on the PC, and then a special machine cut the prosthesis out. In fact, it was a highly specialized CAD system. We had to go back to the mathematics of rotation matrices and image transformations, and learn what NURBS surfaces are.

0465_pvs-studio_10_years/image5.png

Figure 4. One of the stages of working with a scanned jaw to create a bridge. Note that several teeth on the left are missing, and two teeth have already been ground down so that the technician can design a bridge to be fastened on top of them.

Once again, our workload increased dramatically. We spent 8 hours a day working on the CAD system and then, depending on how much energy was left, worked on improving the Viva64 analyzer and the site, and on promotion. The positive thing was that there was no commute, as we were working from home; but it's hard to say which is easier - sitting at the computer the whole day is also very difficult. Apparently, this is the only way if you want to do something extraordinary: you have to work even harder to see any change.

The popularity of Viva64 was growing, but very slowly. We started to realize that we weren't getting the quick start we had dreamed of. There was still a naive hope that it would come: just a little more patience, and 64-bit migration would become a burning question for programmers.

To make a long story short: 64-bit migration never generated the demand we expected. The programming world was slowly moving from 32-bit to 64-bit applications, and it is still moving. Even now, some clients choose PVS-Studio solely because it has diagnostics that detect problems related to 64-bit code.

Which means we were both wrong and right about 64 bits at the same time. We were right that there are real problems in migrating a large code base to a 64-bit platform. Over 10 years we have sold quite a number of licenses for Viva64, and later for the PVS-Studio tool that looks for 64-bit errors. But we were wrong about the time scale. We thought the transition would take 2-3 years and then slowly decline for another couple of years, and we started the project with those ideas in mind. Hoping for a sprint, we started a marathon that we have now been running for 10 years.

However, now we understand what it was all for. At that time, we continued believing in '64-bit development' and went on improving the analyzer.

Start of 2008, point of no return

In 2008, fortune led us to the state 'Start' program. To be more exact, we started preparing for it in 2007, but received the funding only in 2008. In a nutshell, here's what it is. A quote from its site:

"The objective of the program is the facilitation of innovators, seeking to develop and master the production of new goods, products, technologies, or services using the results of their technological research, that are at an early stage of their development and have a great potential for commercialization. It should be kept in mind that the "start program" is primarily focused on the initiative of researchers wishing to create a steadily operating business based on their innovative ideas."

To put it simply: a person presents an innovative project, the program gives a grant for its implementation and checks whether the stated results are reached. If everything goes well, the funding may be extended for a second and then a third year. By the way, the program [RU] is still running today.

We are left with mixed feelings about our participation in this program. On one hand, there is a lot of red tape and a lot of formalities. On the other hand, the program stimulates you to move to a new level. Overall, I would give the idea a positive assessment: participation gave us a great stimulus in various directions.

First, you have to register a limited liability company. Then you are given some funding (at that point the sum was 750,000 rubles for a year, about $30,000). But it's not that easy to spend this money: you cannot simply go out and buy PCs and ad banners; at the same time, it would be silly not to spend it. As a result, the need to do something with the money made us rent our first office, hire our first two employees, buy some furniture, and so on.

0465_pvs-studio_10_years/image6.png

Figure 5. OOO "Program Verification Systems", 2008. The first day in our own first office. You can click on the picture to enlarge.

So, participation in the Start program forced us to finally leave our homes and really start working on the Viva64 project, not just playing around with it. Start pushed us to register the LLC, hire our first employees, and feel like real founders of a company; this was the main value of the program for us.

It helped us think of ourselves not just as programmers with an interesting technical project, but as entrepreneurs; it served as a catalyst for our project. It made us leave our comfort zone and start setting bigger goals. We are really grateful to the government and to everyone who organizes this program.

That's probably enough about the positive sides; now for a fly in the ointment. Participation involves awful bureaucracy, and it takes days and days to prepare the technical and accounting reports. On top of that, you are rarely allowed to spend the money on the things that are really needed, so you end up spending it on useful but secondary things. It's hard to explain; you have to try it yourself to see what it's like, but believe me, there is a lot of turmoil and many limitations. It is clear that all these restrictions exist so that people cannot simply pocket the money while only simulating work, but they don't make life any easier for honest participants.

As you have probably guessed, the money from the fund didn't cover all the necessary expenses; we were still doing outsourcing work for the dental company, just now sitting in an office. Once more, the workload increased: besides the outsourcing work and the development and promotion of the analyzer, we now had to handle the paperwork for the fund, preparing reports and going to Moscow to hand them over.

It was hellish. When 2008 was over, we decided not to continue participating in the Start program: the bureaucracy was taking too much time, and most of the effort we put into it was wasted. Around the same time, Viva64 sales picked up, and we clearly saw that we weren't devoting as much time to it as we could have. We decided we would rather tighten our belts and focus on the promising directions than keep filling in tons of reports, and we did not apply for the following year. Such a decision may seem silly, but I'm sure we did the right thing and possibly saved ourselves a year or two.

I mentioned that there were sales of Viva64, and it really was so. Later we started raising the price, because our clients were quite large companies. That was when we realized we were making a tool not for individual developers but for companies; still, we were far from realizing that we were a B2B company.

VivaMP, the first mistake

We have made a lot of mistakes over these ten years. I won't tell you about the small ones, because they aren't that interesting and I don't remember them all accurately. One example of such a small mistake: our company operated under the general taxation system when we could have used the simplified one. We didn't know that we could apply for the simplified system only in the first 5 days after registration - a typical beginner's error. It wasn't a disaster, as the company didn't earn much back then, and therefore didn't lose much either.

So, let's speak about more epic fails. The first was the VivaMP project. We started this project back in 2008, but the first release happened only in March of 2009.

We had already accepted the fact that the 64-bit idea hadn't given us a quick start, so we decided to look for a new direction where we could get ahead of others. It seemed to us that we had found it: in 2008, multicore processors started appearing on a massive scale.

Programmers had to choose which technology would dominate the development of parallel programs in C and C++. There were different candidates: MPI, OpenMP, some existing library, or something that might soon appear.

Intel was promoting the OpenMP technology, or at least it seemed so to us. We thought we could repeat the same venture experiment: create a static analysis tool for parallel programs built on OpenMP. In general, static analysis of parallel programs is close to a hopeless task; dynamic analyzers are much more useful. Some static analysis of parallel code is possible, but the code has to be marked up in a special way, giving the analyzer hints about which fragments will be executed in parallel and which will not. In this respect, OpenMP is extremely convenient for the analyzer: the "#pragma omp ...." directives are exactly that markup.
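
A typical example of the kind of defect we meant to catch (a generic illustration of mine, not one of the actual VivaMP diagnostics): a variable modified by several threads inside a parallel region without any reduction or synchronization, which is visible right from the pragma.

// 'sum' is shared between the threads of the parallel loop,
// so the increments race with each other.
int SumArray(const int *data, int n)
{
  int sum = 0;
  #pragma omp parallel for        // missing reduction(+:sum)
  for (int i = 0; i < n; ++i)
    sum += data[i];               // data race on 'sum'
  return sum;
}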

So we studied the topic of OpenMP programming in detail and became quite sure that there were errors we could detect with static code analysis. Those who are interested can have a look at our article "32 OpenMP Traps For C++ Developers".

All in all, this new direction was also chosen incorrectly - absolutely wrong, in fact. While in the case of 64 bits there was at least some interest among programmers, even if not as much as we would have liked, in the case of OpenMP there was no interest at all.

Apparently, there were several reasons for this misfortune:

  • The OpenMP technology didn't become mainstream; it is just one of many parallel programming technologies.
  • Which means that not many developers actually use OpenMP in their projects, so the demand was bound to be small in any case. Besides, we didn't manage to reach that group of developers and tell them that the VivaMP tool existed.
  • On top of that, static analysis is quite weak at finding errors in parallel programs compared with other kinds of tools.

So the VivaMP project failed - nobody needed it. There were almost no questions about this analyzer in our mail, and no reports about bugs found with it; the world simply ignored the tool's existence.

Later on, VivaMP was integrated into PVS-Studio and then removed altogether. OpenMP kept developing, gaining new features and new keywords; supporting them required effort, and there was no point in doing that for a tool that was dead in the water - it would just be a waste of time. We braced ourselves and deleted this part of the analyzer.

So, VivaMP was our first great failure. It took us a lot of effort and time to create and promote a new tool.

Despite this misfortune, we found the courage to stop working as hired programmers and devote all our time to developing the analyzers. Viva64 was bringing in some profit; it was quite small, but enough for an independent existence. At that time we were earning much less than we could have if we had gone to work for some software company as programmers.

PVS-Studio and general analysis, first success

In 2009, we combined Viva64 and VivaMP into a single product, hoping that the VivaMP diagnostics would get bought as an addition to the 64-bit diagnostics. Again, this didn't bring any results, so there is no point in dwelling on it.

Nevertheless, we should consider 2009 as an important milestone in the life of our company. This was the year we released PVS-Studio, which was initially the combination of Viva64 and VivaMP.

By the way, a few words about the names of the tools. The name 'Viva64' came from the idea "Long live the 64-bit world!". The word "viva" was in my head because of the song "Viva Forever" that I had heard shortly before. I suggested the name to Evgeniy, and he agreed. Our site www.viva64.com was named the same way, and there was no point in renaming it later - the name became quite well known over time.

The PVS-Studio name came about in a more complicated way. The first three letters are an abbreviation of our company name, OOO 'Program Verification Systems'; 'Studio' was added to emphasize that it's not just one tool, but a set of tools. Actually, the name is not very apt, as it is often misspelled: people forget the dash, write PSV instead of PVS, and so on. If we were choosing a name now, we would pick something simpler. That was the idea behind the name CppCat, but that's a completely different story, which I will tell later.

Let's go back to the main story. In 2010, we thought we could increase interest in PVS-Studio by adding several general-analysis diagnostics. We planned to make these diagnostics free, because we didn't really believe they would bring any profit. In the sphere of general-analysis diagnostics there were already tools such as Coverity, Parasoft C/C++test, Klocwork, Gimpel PC-Lint, and other undisputed market leaders. We didn't intend to compete with them in any way; the free diagnostics were planned purely as a means of advertising. The idea was this: a programmer checks his project for free, and then learns about the paid diagnostics for 64-bit errors and for OpenMP.

In November 2010, we released a beta version of PVS-Studio 4.00 with a new set of general-analysis diagnostic rules - 45 of them at that time. Here's an article about the event: "Let the world tremble! We've released PVS-Studio 4.00 with a general-purpose analyzer!".

Then a key event happened which actually changed everything. We can say this was the turning point when we moved from relative obscurity to a successful strategy. Of course, it was too early to speak of huge success, but things started changing.

Here is what happened: a programmer contacted us, wrote about our general-analysis diagnostics, and asked how much he had to pay to get them. We replied that they were free, but that he could buy the 64-bit diagnostics, which were extremely useful. His response was that he needed neither the 64-bit diagnostics nor VivaMP. He thanked us warmly for such a cool tool and for the chance to keep using the general-analysis diagnostics for free.

We heard this signal from space and quickly reconsidered our approach. So PVS-Studio 4.00, released a month later, became a paid product. We even had to write an article explaining why we changed our minds so quickly: "What is the reason for making PVS-Studio 4.00 a commercial solution? :-(". In a nutshell, the answer is 'we want to make money', so you may not even need to read it.

Finally, PVS-Studio turned into a set of three analyzers (Viva64, VivaMP, and the general-analysis diagnostics), which we started selling as one tool. In this version we also introduced the first corporate licenses (Site License).

As time went by, the PVS-Studio analyzer gradually developed and brought in more profit. PVS-Studio 4.30 added incremental analysis: the ability to run the analyzer automatically on the files that have just been edited and recompiled. This made it possible to use PVS-Studio regularly on developers' local machines.

In PVS-Studio 4.32, released in July 2011, we stopped selling single-user licenses. That was one of the best business decisions in the company's history. We came to understand that PVS-Studio is a team tool which benefits the whole project, regardless of how many people work on it.

At the beginning of 2012, we released PVS-Studio 4.53, which already had 100 general-analysis diagnostics (V501-V600). Soon after, PVS-Studio 4.60 got a new set of diagnostics, 'Micro-optimizations', to search for the kinds of performance losses that a static analyzer can detect.
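
As an illustration of the kind of pattern such diagnostics aim at (a generic example of mine, not a quote of any specific PVS-Studio rule): a heavy object passed to a function by value where a const reference would do.

#include <string>

// The string is copied on every call although it is only read.
bool StartsWithSlash(std::string path)      // better: const std::string &path
{
  return !path.empty() && path[0] == '/';
}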

For about 3 years we had been making only correct decisions concerning the development of the project. But that was too long - it was time to do something stupid.

Embarcadero RAD Studio, the second mistake

It will be a short story, as there is not much to tell. PVS-Studio 5.00 added integration with Embarcadero RAD Studio. We thought there were a lot of C++Builder users out there, but we were wrong - or we just never found them.

In general, the story was similar to VivaMP. Yes, RAD Studio is used somewhere in the world, but very little, and there was no interest from the programmer community. Just like with VivaMP, there were neither e-mails with questions nor reports about errors in the tool. Just silence.

Of course, we had spent a lot of time and effort on supporting Embarcadero RAD Studio and advertising the new capabilities.

But apparently the Embarcadero RAD Studio idea wasn't crazy enough for us: a year later, we made another serious mistake.

CppCat, the third mistake

We released CppCat 1.00, a cheap version of the analyzer based on PVS-Studio. We called it 'the $250 version of PVS-Studio'. The idea was to make a high-quality, low-cost analyzer: it was much cheaper, so that, supposedly, more individual developers would buy and use our solutions. We were even considering eventually abandoning PVS-Studio, which we saw as a large and complex program with a long history, in favor of the small and easy CppCat, where a simple interface was combined with powerful code analysis.

I think the title of this chapter has already told you that it was a bad idea. I won't describe the details of this mistake here, as they would basically repeat the article "We are Closing Down the CppCat Project"; I highly recommend reading it - it is short and quite interesting.

0465_pvs-studio_10_years/image8.png

Figure 6. The CppCat project seemed great in every way: a simple name, easy settings, and a price any individual developer could afford. The bad thing about it was that it didn't bring any profit.

A little more than a year later, we closed the CppCat project, having wasted a lot of time and effort... again. As you can see, we have made a number of serious mistakes, and each of them could easily have bankrupted us. Now we are much more careful and put far more thought into new experiments, setting aside resources in advance in case an idea fails again.

By the way, shortly before closing CppCat, we removed the Embarcadero RAD Studio support and the OpenMP diagnostics. We realized it was time to get rid of ballast that brings no profit but still requires a lot of effort to maintain.

Nowadays

Failures with CppCat, VivaMP, Embarcadero RAD Studio didn't leave us devoid of enthusiasm, and we decided to invest our energy into three new directions:

  • Analysis of C# code;
  • Linux Support;
  • A free version of the license for PVS-Studio.

It is still too early to say how successful these new ventures will be - we will see in a few years. But we are still full of energy to conquer the world, and we can see it gradually happening.

We can consider 2009 the real start of our company - the year we became self-sufficient, without the support of outsourcing projects. At that time we had 4 employees; 7 years later, our team numbers 24 people. Of course, we cannot call this a huge success, but it is what it is, and I see no point in making the situation look better than it really is. Despite the 10 years, we are still at the beginning of the journey and only learning how to really make and sell a software product.

I hope that was an interesting story. I'll be glad if it inspires someone not to disregard small beginnings, and continue believing in a dream.

0465_pvs-studio_10_years/image9.png

Figure 7. Never give up!

Oh yes, bugs in Viva64 v1.0

I wouldn't be myself if I hadn't checked the first version of the Viva64 analyzer with the current version of PVS-Studio. There weren't many errors, actually, due to the tiny size of the analyzer kernel at that time: we had written only about 3-4 thousand lines of code ourselves, and the whole Viva64 kernel had only 210 files and 37 KLOC. For comparison, the kernel of PVS-Studio for C/C++ code is now 320 files and 208 KLOC. Accordingly, the amount of code we have written has increased around 40 times.

Note. Let me clarify once again that we are talking about the kernel for the analysis of C/C++ code. On top of it, there is the plugin for Visual Studio, the C# kernel, the Standalone utility, and much more. So the total amount of code has grown by hundreds of times.

So, let's look at the warnings from the current version of the analyzer that seemed interesting enough to include in this article.

rw_table_sanity_check(const rw_table table[])
{
  unsigned n = (sizeof table)/(sizeof table[0]);

  if (n < 2) return;

  for (const char* old = (table++)->name; --n; old = (table++)->name)
    if (strcmp(old, table->name) >= 0) {
      cerr << "FAILED: '" << old << "' < '"
           << table->name << "'" << endl;
      assert(! "invalid order in presorted array");
    }
}

This error is detected by two warnings of PVS-Studio:

  • V511 The sizeof() operator returns size of the pointer, and not of the array, in 'sizeof table' expression. lex.cc 822
  • V514 Dividing sizeof a pointer '(sizeof table)' by another value. There is a probability of logical error presence. lex.cc 822

The error is in the unit-test subsystem. The test checks nothing, because the n variable ends up equal to 0: 'table' is just a pointer here, not an array, so sizeof yields the size of the pointer.
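
The idiomatic way to make this mistake impossible (a sketch of mine, not the OpenC++ fix) is to compute the element count with a helper that only accepts genuine arrays, so passing a pointer fails to compile instead of silently returning a wrong value:

#include <cstddef>

// Sketch: an element-count helper that matches only real arrays.
template <typename T, size_t N>
constexpr size_t countof(const T (&)[N]) { return N; }

int values[16];
static_assert(countof(values) == 16, "works only for real arrays");
// countof(pointer) would simply not compile.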

Here is my own error:

bool IsLiteralFFFFFFFF(const char *buf, size_t len) {
 if (len < 10)
  return false;
 
 if (buf[0] != '0' && (buf[1] != 'x' || buf[1] != 'X'))
  return false;
 ....
}

PVS-Studio warning: V547 Expression 'buf[1] != 'x' || buf[1] != 'X'' is always true. Probably the '&&' operator should be used here. vivacasts.cpp 632

This is a quick check that a literal starts with "0x" or "0X". As written, however, the check considers any string that starts with the '0' character to be correct.
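
Here is a sketch of the intended logic (my own illustration, under a hypothetical name): reject anything that does not start with "0x" or "0X". Note that both operators had to change ('&&' to '||' in the outer condition, '||' to '&&' in the inner one).

#include <cstddef>

bool IsLiteralFFFFFFFF_Fixed(const char *buf, size_t len)
{
  if (len < 10)
    return false;
  if (buf[0] != '0' || (buf[1] != 'x' && buf[1] != 'X'))
    return false;
  // ... the rest of the check (examining the digits) stays the same ...
  return true;
}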

The following code fragment is quite long, but I decided not to shorten it:

Ptree* Append(Ptree* p, Ptree* q)
{
  Ptree *result, *tail;

  if(p == 0)
    if(q->IsLeaf())                    // <=
      return Cons(q, 0);
    else
      return q;

  result = tail = Cons(p->Car(), 0);
  p = p->Cdr();
  while(p != 0){
    Ptree* cell = Cons(p->Car(), 0);
    tail->SetCdr(cell);
    tail = cell;
    p = p->Cdr();
  }

  if(q != 0 && q->IsLeaf())            // <=
    tail->SetCdr(Cons(q, 0));
  else
    tail->SetCdr(q);

  return result;
}

PVS-Studio warning: V595 The 'q' pointer was utilized before it was verified against nullptr. Check lines: 360, 374. ptreeutil.cc 360

There will be a null pointer dereference if both actual arguments are equal to nullptr.

The following code was quite justifiable in those days, but you cannot write code like this now:

Class* Environment::LookupClassMetaobject(Ptree* name)
{
  TypeInfo tinfo;
  Bind* bind = 0;

  if (this == 0) {
    TheErrorLog().Report(
      MopMsg(Msg::Fatal,
             "Environment::LookupClassMetaobject()",
             "0 environment"));
    return 0;
  }
  ....
}

PVS-Studio warning: V704 'this == 0' expression should be avoided - this expression is always false on newer compilers, because 'this' pointer can never be NULL. environment.cc 115

There are ten more checks like these.

And... that's it. Readers would probably expect more, but there is really nothing else that deserves attention: 37 KLOC is not much code, and we have always been very careful about writing and testing it.

Conclusion

A couple more interesting observations. Ten years ago, I imagined the working processes of a company quite differently. It once seemed to me that when the company got larger, we'd be doing creative tasks, thinking up development strategies, and sitting in leather armchairs with smart, thoughtful looks on our faces. But it turned out that, as the work goes on, our job resembles that of firefighters who must selflessly battle all sorts of trouble. The more space we occupy and the more employees we have, the more 'emergencies' there are and the more varied they get. Examples: problems with electricity, a leaking ceiling, a jammed door lock, dealing with a 'violation' of 1 kopeck (1 penny) of unpaid tax, and so on. This doesn't mean that we personally go and fix an air conditioner hit by an icicle, but it is our task to organize the fixing.

0465_pvs-studio_10_years/image10.png

Figure 8. Quite a recent winter problem. These are the everyday troubles of a startup; they are not caused by an incorrectly chosen framework.

Sometimes you have to pick up real tools and fix something yourself so that things keep working.

0465_pvs-studio_10_years/image12.png

Figure 9. Sergey Khrenov and I fixing a press wall. If you want something done well, do it yourself. You can click on the picture to enlarge.

Perhaps this is a typical transition stage for any company, when the founders have to do a lot of tasks that have nothing to do with the project itself but simply need to be done. In time, the company should get a supply manager, a lawyer, and an administrator to take over part of those tasks. We aren't big enough for that yet, but we are doing our best.

Thank you for your attention. That's the story of our failures and successes. I wish you a Happy New Year and great holidays!


