
Emsisoft and Av-Comparatives


We're not currently participating in their testing. Note that this isn't necessarily permanent, and there's always the possibility that we may participate in their testing again some time in the future.

Also note that we're still participating in VB100 testing.

2 hours ago, GT500 said:

We're not currently participating in their testing. Note that this isn't necessarily permanent, and there's always the possibility that we may participate in their testing again some time in the future.

Also note that we're still participating in VB100 testing. 

Can we know the reason for this decision?

19 hours ago, onbox said:

Can we know the reason for this decision?

Passing tests is not the same as protecting against real-world threats. We don't have the resources of larger companies like BitDefender, Kaspersky, and ESET, so we have to be more selective about where we focus our development efforts. We've decided that it's more important to focus on the best protection for real-world threats than to focus on passing tests so that we look good in statistics.

29 minutes ago, GT500 said:

it's more important to focus on the best protection for real-world threats than to focus on passing tests

This is not a very encouraging statement!

Participating in VB100 is almost equal to ZERO. I recently checked VB100 and I see software like "Defenx Security Suite, ALYac, Exosphere..." passing VB100 with flying colors.

I've never heard of any of this software.

 

Emsisoft uses somebody else's engine, has the former Mamutu behavior blocker, and if it is not able to participate in AV-Comparatives or AV-Test and get a good result, well... that is not encouraging at all.

1 hour ago, andone said:

Participating in VB100 is almost equal to ZERO.

There are still many companies that will only use Anti-Virus software that is VB100 certified.

 

1 hour ago, andone said:

I recently checked VB100 and I see software like "Defenx Security Suite, ALYac, Exosphere..." passing VB100 with flying colors.

To make sure your software passes the test, all you have to do is design it to pass the test. Whether or not they design their software to do more than pass tests, I have no idea.

 

1 hour ago, andone said:

Emsisoft uses somebody else's engine ...

BitDefender's anti-virus engine isn't the only thing that allows them to get a 100% in testing. If all you use is their Anti-Virus engine and database, then you'll fail the testing every time. I imagine the same goes for every other Anti-Virus engine these days.

Sure, the Anti-Virus engine alone will probably still score over 90% most of the time, but other technology is necessary to get 100%. Rather than redesign our technology just to ensure we pass the testing every time, we prefer to devote resources to ensuring that EAM actually stops real-world threats.

 

1 hour ago, andone said:

... and if it is not able to participate in AV-Comparatives or AV-Test and get a good result, well... that is not encouraging at all.

I don't think Malwarebytes, Sophos, and Webroot participated in AV-Comparatives testing in 2018. We're just joining the ranks of companies that are backing off from testing temporarily.

1 hour ago, GT500 said:

I don't think Malwarebytes, Sophos, and Webroot participated in AV-Comparatives testing in 2018. We're just joining the ranks of companies that are backing off from testing temporarily.

Malwarebytes participated in AV-Comparatives a while ago with catastrophic results, and they decided not to participate anymore. Recently they participated again, with not-so-encouraging results. But they claim NOT to be an antivirus.

Webroot claimed to have an "innovative approach" to PC security (journalism) and they failed miserably in each and every test; later they decided not to participate anymore, and now the company has been acquired by Carbonite. Most likely the product will disappear from the market, the way it is.

 

The point is, you will never see a product which is successful in AV-Comparatives/AV-Test deciding NOT TO PARTICIPATE anymore. Typically, the companies which are not so successful in testing will invoke one reason or another not to participate any longer.

25 minutes ago, andone said:

But they claim NOT to be an antivirus.

They changed their tune with version 3 of their product, and started advertising it as "Anti-Virus". I don't know if they still do that, but the reality is that the only real difference between Malwarebytes Anti-Malware and "traditional Anti-Virus" was that it didn't focus on threats that were already commonly detected by other products, and thus required another product to be installed alongside it in order to provide full protection.

Also note that despite their consistently bad performance in testing over the years, their revenue increased considerably, as did their funding from investment firms. I haven't checked recently; however, I expect that they haven't suffered financially due to their testing failures, and I expect that their customers didn't lose confidence in them either.

 

32 minutes ago, andone said:

Webroot claimed to have an "innovative approach" to PC security (journalism) and they failed miserably in each and every test; later they decided not to participate anymore, and now the company has been acquired by Carbonite. Most likely the product will disappear from the market, the way it is.

Funny, I'd heard their sales were doing reasonably well. They used to be popular due to corporate contracts anyway, and not test results. If they lost money, then it's probably due to losing corporate contracts to other companies.

Regardless, Sophos is still one of the largest AV companies in the world, and they aren't participating in AV-Comparatives testing.

 

33 minutes ago, andone said:

The point is, you will never see a product which is successful in AV-Comparatives/AV-Test deciding NOT TO PARTICIPATE anymore.

That's because their test scores make for great marketing material.

 

37 minutes ago, andone said:

Typically, the companies which are not so successful in testing will invoke one reason or another not to participate any longer.

That's because there are legitimate reasons beyond just "we're doing bad in the tests".

Please note that I'm intentionally being vague. I'll leave you with the following link as an explanation, and hopefully you'll be able to figure out why:
https://malwaretips.com/threads/i-am-head-of-research-at-emsisoft-ask-me-anything.90999/


Personally, I think following the tests is a waste of time. If you are really concerned, then you will need to make the effort to do your own testing. That is what I did. Also, the tests don't tell you a thing about the nature of the company. I will stick with Emsisoft because I think it's the best.


5 hours ago, Peter2150 said:

do your own testing. That is what I did

Absolutely!!! You are entitled to your own opinion.

Also, next time you buy a new car, DO NOT trust the safety report about your car done by a third-party company. To be absolutely sure, crash-test it yourself....

2 hours ago, andone said:

Also, next time you buy a new car, DO NOT trust the safety report about your car done by a third-party company. To be absolutely sure, crash-test it yourself....

Hm... That sounds like fun...

Granted, the cost would be a bit high, whereas the cost of testing Anti-Virus software is considerably less. ;)


So Emsisoft will skip the tests and take the hit. I have 15 servers with Enterprise Security coming up for annual renewal, and I honestly did not like Arthur Wilkinson's explanations, nor do I have the confidence to renew. I have seen decisions like this in the past: for example, Symantec had very poor results in AV-Comparatives and in the real world, and only returned to AV-Comparatives when it achieved good results in the real world.

34 minutes ago, RodPaulo said:

... I honestly did not like Arthur Wilkinson's explanations, nor do I have the confidence to renew ...

Does this help?
https://avlab.pl/PDF_avlab/AVLab-Test-of-software-for-online-banking-protection.pdf
https://avlab.pl/test-software-online-banking-protection

AVLab Banking Protection Test from February 2019.

We're still participating in AVLab testing, in addition to VB100 testing. We haven't pulled out of testing entirely, and there are a lot more testing organizations than just AV-Comparatives.

43 minutes ago, GT500 said:

Does this help?
https://avlab.pl/PDF_avlab/AVLab-Test-of-software-for-online-banking-protection.pdf
https://avlab.pl/test-software-online-banking-protection

AVLab Banking Protection Test from February 2019.

We're still participating in AVLab testing, in addition to VB100 testing. We haven't pulled out of testing entirely, and there are a lot more testing organizations than just AV-Comparatives.

You could drop all the others and keep the AV-Comparatives tests; then I would not be commenting here.


Hello Arthur,

 

What do you mean by "all you have to do is design it to pass the test"?

On 3/31/2019 at 5:34 PM, GT500 said:

To make sure your software passes the test, all you have to do is design it to pass the test. Whether or not they design their software to do more than pass tests, I have no idea.

...and by "We've decided that it's more important to focus on the best protection for real-world threats than to focus on passing tests"?

 

On 3/31/2019 at 2:55 PM, GT500 said:

Passing tests is not the same as protecting against real-world threats. We don't have the resources of larger companies like BitDefender, Kaspersky, and ESET, so we have to be more selective about where we focus our development efforts. We've decided that it's more important to focus on the best protection for real-world threats than to focus on passing tests so that we look good in statistics.

 

Can you please explain in more detail? What could you do differently to pass the test, and why would this negatively affect Emsisoft's real-world detection capability?

In other words, if Emsisoft's real-world detection is very high, why shouldn't it also be high in the AV-Comparatives test?

thank you!

 

6 hours ago, pallino said:
 

what do you mean with "all you have to do is design it to pass the test"?

It means that the testers at AV-C and AV-T have a clear image of how they think AV software should work. The problem arises when your product doesn't fit the mould: then you get penalized for not doing what everyone else does, even though what everyone else does may not be in the best interest of the user to begin with. The best example is snooping around in your encrypted connections, which literally every AV vendor has screwed up at least once in the past and probably will again, exposing users to potentially greater risks than most malware does.

6 hours ago, pallino said:

In other words, if Emsisoft's real-world detection is very high, why shouldn't it also be high in the AV-Comparatives test?

For starters, the test sets aren't nearly as representative anymore. When we participated in AV-T and AV-C, both tested with fewer than 200 samples a month on average. 200 samples out of literally tens of millions. The exact selection isn't clear, and not representative of what users deal with either. None of them tests with PUPs, for example, even though a simple look at any tech support community will tell you that they are probably by far the biggest problem users are dealing with.

So no, neither of those test scores represents real-life performance, and it becomes blatantly obvious when you go to places like Bleeping Computer, GeeksToGo, Trojaner Board, Malekal, and all those other communities where people infected by malware show up for help, and look at what products these victims used at the time they became infected. Then you will notice that a lot of these products with perfect scores don't look nearly as perfect in real-life conditions.

The reason for this discrepancy is quite simple: most AV vendors will specifically optimise their products for these tests. The most severe cases are where vendors end up outright cheating by detecting the test environments, which then results in a change of behaviour of the product (think Dieselgate, but with anti-virus). But there are many ways you can game these tests. For example:

  • you can try to figure out the threat intel feeds the companies use, then just buy those same threat intel feeds so you have all samples in advance
  • you can track their licenses and supply different signatures to them or use your cloud to treat those test systems differently
  • some particularly shady organisations literally also sell you their sample and malicious URL feed, so you can just outright buy the samples and URLs your product will get tested on later

What you end up with as a result is a product that is optimised really, really well for the exact scenario it is being tested under, using the exact type of URLs and samples these testers use, but that is utterly useless when it comes to anything else. We just really don't want to create this type of product.

So when we were asked whether we wanted to continue to participate this year, we discussed the matter internally, looked at what we get out of these tests (meaning: whether these tests have a discernible impact on our revenue), and decided that they are simply not worth it, and that the tens of thousands of Euros we spent on them every year would be better spent on extending our team and building new ways of keeping our customers safe.



While Fabian's explanation makes a lot of sense (as always) --  here's the thing:

The timing of this decision to suspend participation in AV-C testing is far from ideal -- in fact, it could not have come at a worse time.

It is no secret that compared to its stellar performance over time in the AV-C RW Tests, in the past 2-3 RW Tests, EAM's results, while good, have not been up to the stratospheric levels of prior results.

We were told that these subpar results were caused by some bugs in a new driver.

I, for one, would have liked to have seen these bugs eradicated, as demonstrated by an AV-C RW Test in which EAM's performance was in line with its historical top-of-the-charts results.

 


I agree, and there were a couple of things in the last tests that I unfortunately am not able to talk about. One of the first things most testing companies make you do is sign an NDA; otherwise they won't even tell you their prices or anything else. That's also why Arthur mentioned that we can't really comment as in-depth as we would want to. I imagine people would be quite surprised about the amounts of money that change hands and all the discrepancies going on. I can completely understand why more and more companies drop out of testing. It almost feels arbitrary at times, and unless you start building your products specifically for testing, you can't really compete. For us, at least, the return on investment wasn't there to justify it any further.

The real-world protection tests for AV-C are a one-year commitment in general, so participating just once to show that the bug is fixed (which you wouldn't know in the first place, as the samples are completely different every time) isn't really an option. That being said, we may still participate in one-time tests with AV-T or AV-C occasionally if we release some major changes.

On 4/12/2019 at 8:13 AM, Fabian Wosar said:

When we participated in AV-T and AV-C, both tested with fewer than 200 samples a month on average. 200 samples out of literally tens of millions. The exact selection isn't clear, and not representative of what users deal with either

This is EXACTLY how every test in the world works; when I passed the university admission exam, I had to solve 5 math problems in 3 hours. Only 5, even though in high school I solved thousands of them. What would you expect me to say? That in real life I am a smart guy, but the tests were not representative of a "real life" situation, and even though I failed, I am still a good guy!

Anyway, as a consumer, I will always choose an antivirus which was tested by third parties over one that is untested and self-proclaimed "good / the best".

1 hour ago, andone said:

This is EXACTLY how every test in the world works; when I passed the university admission exam, I had to solve 5 math problems in 3 hours. Only 5, even though in high school I solved thousands of them.

Last time I checked, standardised tests in education are just as controversial. But even if you ignore that: tests in school are cumulative. Maybe it is different where you live, but at least here, to even get to university you had to go through all the prerequisites. Meaning the fact that you are even allowed to take the test is the result of those thousands of tests you did before on the various topics throughout your school career, up to the point where you demonstrated that you understood each of the sub-topics in question. Since the material taught doesn't radically change every couple of months, your previous achievements and test scores are still valid proof that you at least have the capacity to understand these topics, even though you may be a bit rusty now.

Such an effect doesn't exist when it comes to malware, as unlike algebra, for example, the threat landscape and body of malware you have to deal with changes almost every day. Therefore, results obtained just a few months ago are completely obsolete, as they reflect (if at all) performance in an environment that no longer exists in that form.

1 hour ago, andone said:

Anyway, as a consumer, I will always choose an antivirus which was tested by third parties over one that is untested and self-proclaimed "good / the best".

We are still participating in several tests and will continue to do so. We just dropped our AV-C engagement, as many other companies did before us.


Thank you for the explanation.👍

How widespread is the cheating problem?

How many might be cheating (e.g. in %)? And don't these companies cheat in all the tests they take part in, causing the same problem in all of them?

Shouldn't the testing companies detect these cheaters (or other AV companies denounce them), make them public, and ban them (at least one suspect got caught and banned in the past), and wouldn't this act as a big deterrent?

Were the driver problems during the AV-C tests the same ones that affected Cruelsister's test last year?

Have these all been fixed?

Thank you

 

 

13 hours ago, pallino said:

How widespread is the cheating problem?

Nobody knows. 

Quote

How many might be cheating (e.g. in %)? And don't these companies cheat in all the tests they take part in, causing the same problem in all of them?

Again, nobody knows. A couple of them were caught doing it (Qihoo most recently, IIRC). If the testers become aware, it usually results in disqualification, and sometimes a block from future tests for a certain period.

Quote

Shouldn't the testing companies detect these cheaters (or other AV companies denounce them), make them public, and ban them (at least one suspect got caught and banned in the past), and wouldn't this act as a big deterrent?

It probably depends a lot. I mean, do people still remember Panda being disqualified and hold it against them? I don't think so. Outside of a very small circle of enthusiasts, nobody cares about these results. The rest will buy what PC Mag or a quick Google search tells them to buy.

There is also a huge grey area. If AV-C and AV-T publish the exact PC models they perform performance tests on, including an exact list of software, and an AV vendor goes and buys the exact same hardware and starts tuning their product to work very well on that specific hardware, maybe even to the detriment of other more commonly found hardware, is that already cheating or just bending the rules a little bit? It's not strictly against the rules, but it also makes the test results not very applicable to the general public.

Quote

Were the driver problems during the AV-C tests the same ones that affected Cruelsister's test last year?

We can't tell you, as Cruelsister shares neither details nor samples. It may or may not be fixed. We will never know.


When Outpost Security Suite was sold and taken off the market, I searched the product forums and wound up with Emsisoft. Still happy with the product. I can get Norton free with Comcast > NO WAY. I was lucky enough to repair a telephone system that ran circles around every other system on the market, Lucent & Northern Telecom included, because of its flexible and easy-to-program software. A few minutes and an external relay, and wow! It would turn on a coffee pot for when I arrived in the morning. I kid you not.

So "real world" is all I look at in every product I buy. I believe that testing companies are out there to PUSH certain products.

Product forums seem to have these kinds of threads every so often. The way I look at them is: "someone trying to cause problems for their own gratification, and people should ignore them".

