It must be said: some of the top players in the third-party antivirus (AV) testing industry have recently revealed themselves to be nothing more than pay-to-play operations, lining their pockets by perpetuating outmoded technologies while keeping more effective and innovative solutions out of the hands of the users who need them.
Furthermore, it seems that legacy AV vendors are more than happy to collude with certain testing houses that award them 100% efficacy ratings so that they can maintain market share, despite widespread industry acceptance that traditional AV actually performs at a fraction of that efficacy.
Symantec's own former Senior Vice President Brian Dye (now at McAfee) declared to the Wall Street Journal way back in 2014 that antivirus was dead, noting that traditional AV only catches 45% of cyberattacks at best, despite testing firms regularly giving their products efficacy scores of 100%.
Yet in the most recent Malware Protection Test performed by AV-Comparatives and the Exploit Test performed by MRG Effitas, both of which were "commissioned" by Symantec, it comes as no surprise that the report's major benefactor also turned out to be its major beneficiary.
We’ll let you, the reader, decide for yourself what "commissioning" an AV efficacy test actually means. The simple fact is that no one pays for a test that shows how poorly their product performs; they only pay for tests that claim they perform better than their competitors.
The silver lining to the above-mentioned report is that even though AV-Comparatives used a pirated version of our product that was almost a year out of date, had no access to our cloud console, was not configured properly, and did not have all available features enabled for maximum performance, Cylance still achieved 100% efficacy on the 500 most prevalent malicious samples, and 98% efficacy on 50 malicious websites.
This test once again proves Cylance’s outstanding performance against modern and prevalent malware, even when an outdated version of our product was used in the test. The bigger issue at hand here is exactly how much credence we should give to tests that are bought and paid for to produce results that do not reflect real-world performance.
This issue should be of the highest concern to the organizations and individuals who rely on these products and are ‘getting owned’ at record levels, while the testing firms and their clients, both of which have an undeniable financial interest in keeping the scam going, keep pumping out the same disinformation.
So, which is it? Does legacy AV only catch less than half of attacks, as senior Symantec officials have asserted, or does traditional AV indeed have 100% efficacy in a “real-world” scenario, as the testing firms would have you believe?
It would be an interesting survey to poll anyone using Symantec Endpoint Protection (SEP) today to see if they really are 100% protected against malware. If this is you, let us know: how many misses do you see on a daily basis from SEP?
Another standard testing practice that should do more than raise eyebrows: most of the testing firms in question offer legacy vendors like Symantec the option to choose for themselves exactly which malware samples are included in evaluations and which are excluded. Choosing which malware is encountered is a luxury that consumers unfortunately do not enjoy in a "real-world" situation.
Some testers also offer vendors the option to retain “editorial rights” prior to the release of the reports – for a healthy fee, of course – which further calls into question the validity of the entire testing and reporting process.
To add insult to injury, security vendors who are not complicit in the scheme risk having their solutions tested at subpar configuration settings that are guaranteed to produce poor results, and then having those results published for all to see.
This system ultimately leaves the enterprise and consumer end-users at significantly higher risk than is implied by the testing house findings, and results in more effective and innovative solutions struggling to demonstrate their prowess on a level playing field.
Bottom line: don’t take our word for it. If you think Symantec really is 100% effective, test it for yourself and see what you find.
Go to the Test for Yourself site and get the tools, and then get the truth.