PVS-Studio: Testimonials

Jul 09 2012

I've decided to collect comments from various people about our static code analyzer PVS-Studio in one article. Some comments are cited in full, others only in part. You can read the original texts by following the corresponding links.

OpenMP support was dropped from PVS-Studio after version 5.20. If you have any questions, feel free to contact our support. The Visual Studio 2005 and 2008 development environments are no longer supported. You can view the list of supported development environments in the documentation section "System requirements for PVS-Studio analyzer".

A couple of words about PVS-Studio in case you don't yet know what it is

PVS-Studio is a tool for programmers using the C/C++/C++11 languages. It is a static code analyzer that detects errors and other defects in software source code. The analyzer is intended for Windows software developers. PVS-Studio currently works by integrating into the Visual Studio 2005/2008/2010 environment and will soon support VS 2012. Checking projects built with MinGW is also possible. To learn more about the analyzer, please follow this link: http://www.viva64.com/en/pvs-studio/

Now we invite you to read some of the comments we have collected.

How we check open-source projects

Author: John Carmack.

Source: a post on Twitter (EN).

It is fantastic that static code analysis tools can use the open source corpus to demonstrate their value: http://www.viva64.com/en/a/0079/

John Carmack about his experience of using PVS-Studio

Author: John Carmack.

Source: quotation from the "Static Code Analysis" article (EN).

The next tool I looked at was PVS-Studio. It has good integration with Visual Studio, and a convenient demo mode (try it!). Compared to /analyze, PVS-Studio is painfully slow, but it pointed out a number of additional important errors, even on code that was already completely clean to /analyze. In addition to pointing out things that are logically errors, PVS-Studio also points out a number of things that are common patterns of programmer error, even if it is still completely sensible code. This is almost guaranteed to produce some false positives, but damned if we didn't have instances of those common error patterns that needed fixing.

There are a number of good articles on the PVS-Studio site, most with code examples drawn from open source projects demonstrating exactly what types of things are found. I considered adding some representative code analysis warnings to this article, but there are already better documented examples present there. Go look at them, and don't smirk and think "I would never write that!"

PVS-Studio can find many interesting issues in programs

Author: MetaQuotes

Source: a comment on the article (RU).

This is a very good analyzer that takes a completely new approach to finding errors.

Imagine, for example, that you have been diligently cleaning up your projects for years with Lint + Visual Lint + your own tools, and then you run PVS-Studio.

You think the code is already clean. But now you see very silly mistakes in your projects: copy-paste errors, identical condition branches, and other mechanical slips. A couple of days of work, and your project is rid of another pile of bugs.
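
To give an idea of what such mechanical errors look like, here is a minimal invented C++ sketch of the "identical condition branches" pattern (not taken from MetaQuotes' code):

    // Both branches are identical, so the "if" is useless - almost
    // certainly one of the branches was meant to differ.
    float reflect(bool mirrored, float x)
    {
        if (mirrored)
            return -x;
        else
            return -x;   // copy-paste slip; most likely "return x;" was intended
    }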

As for us personally, we are very fond of PVS-Studio - it is a fine addition to Lint and Intel Parallel Studio (all of these tools are licensed products).

Companies like this deserve support - I say this sincerely.

Background incremental analysis is great

Author: Sergey Vlasov

Source: quotation from the "PVS-Studio C++ Static Code Analyzer Tool For Visual Studio Review" article (EN).

Overall, PVS-Studio showed it can check for non-trivial errors without many false warnings (for example, Visual Studio code analysis gave me more than 200 warnings). The analysis process is very slow, but background incremental analysis of only the changed files after each build alleviates this problem. Integration with Visual Studio works well and is very convenient.

Comparing PVS-Studio to PC-Lint and Klocwork

Author: Alexander Lotokhov

Source: quotations from the "One more testing of PVS-Studio" article (RU).

PVS-Studio seems to be focused exclusively on Visual Studio support. After installation, it automatically integrates into whatever version of the IDE you have on your computer - this is the only step needed to set the tool up. All you have to do now is launch Visual Studio itself, and there it is: a new PVS-Studio menu item. Such transparent integration is beyond praise!

...

Analysis results are output in a separate PVS-Studio tab. Everything is arranged quite conveniently: for each detected (potential) issue you can see its description, its location (file and line), and a link to a detailed description of issues of this type: why it is considered an issue, how it can be fixed, and so on. The descriptions are very detailed and well written!

...

And the last thing - a brief comparison of PVS-Studio with PC-Lint and Klocwork.

PVS-Studio is much more convenient to use than PC-Lint. The main problem with the latter is the enormous number of potential issues it reports: run on the same project as PVS-Studio with all settings at their defaults, it generates over 23 thousand messages (PVS-Studio reports only 42). At the same time, PC-Lint doesn't let you hide repetitive issues (if the same .h file is included from two .c files, PC-Lint reports the problem in the .h file twice, while PVS-Studio reports it only once). In PC-Lint, you also cannot mark an individual issue as a false positive; you can only "turn off" the whole class of problems.

At the same time, if you dig through the heap of "trash" PC-Lint generates, you can find more real potential issues than PVS-Studio detects (PVS-Studio, for instance, failed to notice missing virtual destructors in a couple of classes, as well as missing explicitly defined copy constructors and default constructors). The trouble is that these useful grains "sink" in PC-Lint's flood of less crucial messages.

PC-Lint is more universal, although more complicated to use. Its integration into Visual Studio is fairly smooth too, however.

But in general, PVS-Studio is the clear leader here: you can just pick it up and get started, which you cannot say about PC-Lint, where the mere task of reviewing all the generated warnings might take you two or three days!

Compared to Klocwork, however, PVS-Studio unfortunately falls short. This concerns usability first of all (just to mention state tracking from build to build, which lets you see how many issues have been fixed and how many remain, and provides handy reports), versatility, and a better mechanism for finding critical potential issues. Besides, Klocwork is a multi-user server system where you can assign a programmer responsible for each detected problem and handle several projects at a time, easily tracking the status of each.

If we compare these tools by their prices, we'll get the following (information is taken from the tool developers' websites):

  • developers of PVS-Studio ask €3,500 for 5 licenses or €9,000 for 30 licenses per year.
  • developers of PC-Lint ask $389 for one license or $3,500 for 10 licenses, with no time restrictions.
  • developers of Klocwork ask €30,000 per year for a "server + 20 clients" pack.

Well, the prices generally correspond to the tools' capabilities. PVS-Studio stands somewhere in the middle between the cheap yet noisy and unwieldy PC-Lint and the nice and handy yet expensive Klocwork.

You have interesting materials on your website

Author: justinvh

Source: a comment on reddit.com (EN).

I've actually been amazed at how awesome PVS-Studio turned out to be. Plus, this was a neat read. It's worth the time to at least look at the tool to get an idea of what it is capable of; I typically just went with clang static for dealing with small things, but the tool has definitely shown its merit in the 64-bit realm.

Plus, their 64-bit lesson set is a fun read.

I've been using it in the process of porting Doom 3 and it's been seriously awesome. They're really supportive of students, so if you're trying to do a talk at a university comparing static analyzers (my example) or just want to learn a bit more about their product, then send them an email and ask away.

There have been a few oddities from time to time, but it's very interesting to see it in action, to say the least; I do wish I weren't under a VM all the time to use the tool, but the seamless integration with Visual Studio is pretty nifty. I am currently using clang as my preprocessor and it all seems pretty quick too.

As Carmack said, you will find bugs.

PVS-Studio makes it easy to master the static code analysis methodology

Author: Anteru

Source: quotation from the "Review: PVS-Studio" article (EN).

If you are new to static analysis, I can recommend giving PVS-Studio a shot (there is a trial version.) Static code analysis for C++ is still at an early stage, but even now, a tool like PVS-Studio can already help you discover lurking bugs. Especially if your code base is not already covered by unit tests, a static code analysis tool can quickly give you a hint which parts of your code base are ripe for review.

Oh, and before I forget: It also gets regular updates and the support is good — in fact, I reported a bug at the beginning of my review which was fixed in just a few days.

Thanks again to Viva64 for the review version, and keep up the good work!

PVS-Studio is useful when adopting 64-bit platforms

Author: Alex Chachanashvili

Source: quotation from the "Running the code through PVS-Studio" article (EN).

Overall I liked the tool; it did find a few non-critical issues in the server code, but to be fair I have run most of my code through FlexeLint and BoundsChecker (until my license expired, that is). I also have Visual Studio warning level 4 turned on for debug builds, and that catches a lot of issues.

The main benefit of PVS-Studio is that it is good at finding issues that may affect porting 32-bit code to 64 bits.
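
To illustrate the kind of 32-to-64-bit portability issue meant here, consider a minimal invented sketch (not from the original article): on a 64-bit platform, the container size no longer fits into a 32-bit loop counter.

    #include <vector>

    // On 64-bit targets std::vector<char>::size() returns a 64-bit value,
    // while "unsigned" stays 32 bits: for a container with more than
    // 2^32 elements the counter wraps around and the loop never finishes.
    void zeroAll(std::vector<char>& v)
    {
        for (unsigned i = 0; i < v.size(); ++i)   // should be "size_t i"
            v[i] = 0;
    }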

We are good fellows

Author: Fernando Moreira

Source: a comment on the presentation (EN).

Yep, I did try it and became a fan right at that moment :) I recommend it to every developer here at the lab. It provides seamless integration with VS and really educates a developer by calling his attention to several types of (dark) issues/pitfalls.

Haven't got a single false positive to this day, which is awesome!

You guys are doing a hell of a job!

On the advertising of PVS-Studio and Coverity

Author: nomarketingbs

Source: I will cite the "How to Not Present Static Analysis Results" article (EN) in full, as I just can't decide which ideas to single out.

Aside from the sad fact that it's impossible to try Coverity without first getting the Seal Of Coverity Sales Force Approval, the next big difference between Coverity and PVS-Studio that anyone can see is...

TEH MARKETING MATERIALS

How This Could be Done

Let's look at a typical PVS-Studio scan report. This one happened to come my way, so I link to it here.

A misprint... Yes, I see. Risk of array overrun... I see. Several more subtle defects... I see. No freaking idea how those defects affect the program's functioning, but they are presented quite well and are easily assessable by anyone willing to pay attention to them.

One might wonder how the program could run with those horrible defects.

This is quite simple. Defects that actually manifest themselves have been identified earlier using other methods - plain old debugging, unit tests, peer review, whatever else. All the rest require some effort to get exposed. That might be some unusual dataset. That might be some unusual sequence of user actions. That might be some unusual error indication. That might be upgrading a compiler or a C++ runtime.

Defects are defects. You can't lie to the compiler, they say, but not all defects are created equal. Those subtle things will sit in the codebase for years, and then all of a sudden someone runs PVS-Studio on the codebase and WOW WHAT A HANDFUL OF HORRIBBLE BUGS ZOMG ELEVENELEVEN!!! they will think.

So a scan report alone is worth nothing - it still takes a developer who is familiar with the codebase to assess and possibly address each reported defect. The PVS-Studio scan report does exactly the right thing: it presents defects one by one together with some analysis, nothing more.

How This Should Not Be Done

Now look at Coverity's marketing materials. You will have a very hard time finding a scan report like the one linked above among Coverity's scan results. Yet once in a while the Coverity dudes will issue an Integrity Report.

An Integrity Report is a very enthusiastic document containing such words as mission, seamlessly, and focused on innovation. Not bad as a starter - at least the presence of those keywords clearly signals that there's too much marketing in the first three pages.

Moving on to Table A... Oh, this table shows a distribution of project sizes. Using the word distribution somehow implies that the data gathered has some statistical significance and so deserves extra trust. Well, with 45 projects in total, trying to build a chart and call it a distribution is very silly. You see, they had TWO projects with more than 7 million lines of code. That's unbelievable, I'm breathless.

All the rest of the Report is also full of similar completely meaningless tables. Yes, it is cool you found 9,7654 defects per square foot of some project. But until you let me try your program - I don't care, those figures don't matter any more than a 132 percent efficiency claim (the post is five years old, yet still relevant).

Fast forward to Appendix A. Tables 2 and 3 summarize defects by assigning each a category and an impact. Let's see...

Control flow issues. What's that? Is it when I forget to put a "break;" at the end of a "case" in a "switch" statement? So you say it has medium impact... Okay. What about "main()" returning immediately? That's a control flow issue as well and don't tell me it has medium impact. Not all control flow issues are created equal.
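
For the record, a minimal invented sketch of the missing-"break;" pattern mentioned above:

    #include <cstdio>

    // The missing "break;" makes case 1 fall through into case 2,
    // so a warning is also reported as an error.
    void report(int severity)
    {
        switch (severity)
        {
        case 1:
            std::puts("warning");   // "break;" forgotten here
        case 2:
            std::puts("error");
            break;
        }
    }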

Null pointer dereferences have medium impact, don't they? Sure, my code dereferences null pointers here and there, and each time that happens users get a candy. Perhaps the Report authors meant potential null pointer dereferences, which is a situation where code dereferences a pointer without first checking that it is not null. The good news is that checking a pointer every time before it is dereferenced clutters code big time. Again, not all null pointer dereferences are created equal.
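
A minimal invented sketch of that "checked, yet dereferenced anyway" pattern:

    #include <cstdio>
    #include <cstring>

    // The pointer is checked, but a "return" is missing, so the code
    // still dereferences it on the null path.
    void printLength(const char* s)
    {
        if (s == nullptr)
            std::puts("(null)");              // missing "return;" here
        std::printf("%zu\n", std::strlen(s)); // dereferences s even when null
    }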

Error handling issues have medium impact. What is that? Is that checking the error codes of Win32 API functions? Sure, any time a program opens a file without checking whether the attempt failed and just proceeds to read it, it's almost no big deal for the user. No access to the folder? We'll pretend we've saved the file. Whatever. Not all error handling issues are created equal.
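
A minimal invented sketch of such an error handling issue:

    #include <cstdio>

    // fopen() may return nullptr (no access, missing file), but the
    // code proceeds to read and crashes instead of reporting an error.
    char firstByte(const char* path)
    {
        std::FILE* f = std::fopen(path, "rb");
        char c = static_cast<char>(std::fgetc(f)); // f is never checked
        std::fclose(f);
        return c;
    }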

Integer handling issues have medium impact. Sure, overflowing an integer while computing a memory allocation size is no big deal. Just allocate whatever amount it happens to be and pretend it's the right size. Not all integer handling issues are created equal.
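
And a minimal invented sketch of such an integer overflow in an allocation size:

    #include <cstdlib>

    // On a 32-bit platform, n * sizeof(double) is computed in 32 bits
    // and wraps around for large n, so malloc() allocates far less
    // memory than intended and later writes overrun the buffer.
    double* makeBuffer(unsigned n)
    {
        return static_cast<double*>(std::malloc(n * sizeof(double)));
    }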

Insecure data handling has medium impact. What's that? No freaking idea, but something tells me not all cases of insecure data handling are created equal.

Incorrect expression - medium impact. Sure, misplace braces wherever you want, no big deal.

Concurrent access violations - medium impact. You just spend the rest of your life debugging them, no big deal.

API usage errors - medium impact. Your code erroneously forgets to specify the path, and that causes the entire contents of Windows\System32 to be deleted. No big deal.

Program hangs - medium impact. The program hangs only when being run on a computer outside a Windows NT domain. You run it just fine inside your corporate network, then go to a trade show and it stops working, your laptop turns into a thousand dollar space heater with a screen. No big deal.

Why is no category assigned low impact, I wonder? Is it because the authors didn't dare declare a software defect low-impact merely because it belongs to some category?

This doesn't work. You can't throw several thousand defects into several categories and then assign each category an impact level. This is just impossible. If you're a software developer you must realise that beyond the shadow of a doubt, otherwise just quit your job immediately and go to the nearest McDonalds outlet - they have a "help needed" sign waiting for you.

The whole Integrity Report is just a big mess of numbers and diagrams. Its usability is not even zero - it is negative. The report scares the hell out of anyone concerned about software quality, and stops there.

The Outcome

So what's the difference between PVS-Studio's marketing materials and Coverity's? The former present facts that one can interpret and verify. The latter just try to scare with summaries that offer no chance of verification.

Because not everyone deserves a free Coverity trial.

Those who care about the quality of their code must have PVS-Studio

Author: Adam Sawicki

Source: quotation from the "Static C++ Code Analysis with PVS-Studio" article (EN).

Overall, PVS-Studio looks like a good tool for C++ programmers who care about the quality of their code. Finding issues related to OpenMP and 64-bit compatibility can be of great value, if you need such features.

Too bad that PVS-Studio, unlike Cppcheck, is a Visual Studio plugin rather than a standalone application, so it obviously requires a commercial MSVS version and does not work with the Express edition. But this is understandable - if you need OpenMP or 64-bit support, you probably already use Visual Studio Professional or higher.

In brief: PVS-Studio is a wonderful tool

Author: Florian George (programmer since 2009 in a company focused on computer vision/image analysis)

Source: a letter replying to our request to evaluate the PVS-Studio analyzer.

I am very impressed with PVS-Studio.

You already have a large diagnostic library and catch a lot more errors than, for example, /analyze and Cppcheck. Also, judging by the changelog, the tool is updated frequently. I really like the detailed explanations for each error, with examples and how to fix them. Sometimes, though, I don't fully understand the issue and how to fix it correctly. I think it would help to give your customers, or even visitors, a form at the end of each error page where they can say whether the page explained the issue properly and helped them fix the error in their own code; if not, they could submit their code, which would in turn let you easily improve the accessibility of your diagnostic explanations. For starters, it might already be enough to enable comments on the error documentation pages.

The performance is fine - it uses all cores - and to be honest I would expect speed to go down in the future as more and more checks are added.

The IDE integration into Visual Studio is good; it all works as expected. The "click to jump to the code" feature is very convenient, so it's not surprising that it's the main feature used to differentiate between the Trial and the Full Version.

I stumbled upon static code analysis by accident; since then I have read a lot of the posts on your blog (great posts!), as well as other articles on AltDevBlog, Random ASCII, etc., and have become quite interested in the topic.

I am working at a small 10-man company; unfortunately, I am not able to convince my boss to pay several thousand dollars for software that has no instant benefit for him personally. The same thing happened when I suggested licensing Intel VTune. So I'm currently sticking to the Trial Version and manual navigation, as suggested in one of the blog posts, but at some point that might become too inconvenient to use regularly. I guess you view this situation, which is probably similar for a lot of other people interested in PVS-Studio, with one laughing and one crying eye: you have people who appreciate the work you are doing, but for various reasons they don't go and buy the full version.

You have a great product at a very competitive price compared to other professional solutions. I'd love to see it make the jump to a bigger mainstream audience; right now it's just missing the last bit of appeal, I assume mostly on the pricing side, that would make individuals and small companies buy it without thinking twice, the way you grab a game you see discounted on Steam this week.

Hope this gave you a little insight into how PVS-Studio is perceived here, keep up the good work!

A fly in the ointment

Of course, we also get less positive comments like this one (RU). We don't cite them, not because we're trying to prettify PVS-Studio, but because many of the shortcomings they point to have either already been eliminated or are being worked on. For example, there's no point in citing a comment claiming that PVS-Studio cannot catch null-pointer errors at all. This is no longer true: the analyzer has already acquired several diagnostics in that area.

There are also many negative comments about PVS-Studio's low speed. But when we investigate the reasons, it usually turns out that the tool is not to blame. For instance, a user keeps a project on a network drive; analysis runs very slowly in this case, as the analyzer has to generate large preprocessed *.i files while running. Here's another example: PVS-Studio.exe (and the Clang preprocessor as well) is launched many times in parallel, and it's often an antivirus that slows down the process by pointlessly re-checking these executable modules again and again. So, please read the article "Tips on speeding up PVS-Studio".
