Last week AV-TEST released its anti-malware assessment report for January and February. The test found that Microsoft Defender did quite well in malicious file detection (including false positives), though it fared poorly in the performance impact department, securing the lowest score of all the products tested.
AV-Comparatives, another major anti-virus testing firm, also released its Real-World Protection Test and Malware Protection Test recently. We covered the former yesterday; in case you missed it, Microsoft did really well, ranking among the best performers alongside the likes of Kaspersky, Bitdefender, and Total Defense.
Today we take a look at the Malware Protection Test. The difference between the two is that the Malware Protection Test deals with malware executed on the system itself, whereas the Real-World Protection Test covers web-based threats.
AV-Comparatives explains the test procedure as follows:
The Malware Protection Test assesses a security program’s ability to protect a system against infection by malicious files before, during or after execution. The methodology used for each product tested is as follows.
Prior to execution, all the test samples are subjected to on-access and on-demand scans by the security program, with each of these being done both offline and online. Any samples that have not been detected by any of these scans are then executed on the test system, with Internet/cloud access available, to allow e.g. behavioural detection features to come into play. If a product does not prevent or reverse all the changes made by a particular malware sample within a given time period, that test case is considered to be a miss. If the user is asked to decide whether a malware sample should be allowed to run, and in the case of the worst user decision system changes are observed, the test case is rated as “user-dependent”.
As with the Real-World Protection Test, Microsoft Defender has done quite well in the online detection and protection categories. However, its offline detection rate of 83% still falls behind several competitors, though it is not the worst, beating the likes of Trend Micro and Panda. The encouraging part, however, is the continuous improvement Defender has shown over the months: a year ago, Microsoft sat at just 60.3%, and it had improved to 69.8% six months later.
You can view the full chart below. In total, there were 10,015 malicious sample test cases:
The image below further breaks down the online protection rates and shows the number of compromised cases. Microsoft suffered two compromised cases, behind only McAfee and Norton, which each had a single compromise. Meanwhile, Trend Micro fared the worst with 281, which is quite appalling by comparison.
Finally, the chart below summarizes all the test results:
You can find the full test data on AV-Comparatives' website.