Microsoft Defender had been performing quite well in recent AV-TEST rankings, though it dropped off in the latest evaluation. And despite some positives in the latest May 2022 Real-World Protection test from AV-Comparatives, Microsoft Defender has not had the best showing in these assessments either.
Defender failed to secure any score in last year's Real-World Protection test due to an error. Later, it was found that Microsoft's product had a rather poor offline detection rate and that it was also quite a system resource hog.
Getting into the latest May 2022 Real-World Protection test results, we start with the most impressive feat: Microsoft Defender achieved the best result in the false positive test category. Alongside ESET, Defender had zero false positives (FPs). This is perhaps a little surprising, as there have been several reports of false alarms from Defender lately.
In the above image, in case you're wondering what "user-dependent" means, AV-Comparatives says these are false positive detections that resulted from a wrong user choice. It explains:
The evaluation process for each test case will recognise any variations among the malware files executed on each test machine. After the malware is executed (if not blocked before), we wait several minutes for malicious actions and also to give e.g. behaviour-blockers time to react and remedy actions performed by the malware. If the malware is not detected and the system is indeed infected/compromised, the process goes to “System Compromised”. Otherwise the product is deemed to have protected the test system, unless a user interaction is required. In this case, if the worst possible decision by the user results in the system becoming compromised, we rate this as “user-dependent”.
Where a tested product requests a user decision, we always select the option to let the program run (e.g. “Allow”). In cases where we do this, but the program is blocked anyway, the program is deemed to have protected the system.
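To make the methodology above easier to follow, here is a minimal sketch in Python of how a single test case might be classified. The function and outcome names are illustrative assumptions based on the quoted description, not AV-Comparatives' actual tooling.

```python
from enum import Enum

class Outcome(Enum):
    PROTECTED = "Protected"
    USER_DEPENDENT = "User-dependent"
    COMPROMISED = "System Compromised"

def rate_test_case(blocked: bool,
                   system_compromised: bool,
                   user_decision_required: bool,
                   worst_user_choice_compromises: bool) -> Outcome:
    """Classify one Real-World Protection test case per the quoted methodology.

    All parameter names are assumptions for illustration only.
    """
    if blocked:
        # The product blocked the malware before or during execution
        # (including cases where "Allow" was chosen but it was blocked anyway).
        return Outcome.PROTECTED
    if user_decision_required and worst_user_choice_compromises:
        # The product asked the user, and the worst possible choice would
        # have compromised the system.
        return Outcome.USER_DEPENDENT
    if system_compromised:
        # Malware ran unhindered and the system was infected after the
        # several-minute waiting period.
        return Outcome.COMPROMISED
    return Outcome.PROTECTED
```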
For every user-dependent false positive, 0.5 marks are deducted, and the final score appears in the FP Score column in the table image above. The worst offenders here are Malwarebytes and Trend Micro, both of which logged more than 40 FPs.
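As a rough illustration of that weighting, here is how an FP Score could be computed: a full false positive costs one mark and a user-dependent one costs half. This formula is my reading of the rule above, not AV-Comparatives' published methodology.

```python
def fp_score(full_fps: int, user_dependent_fps: int) -> float:
    # Full false positives count as 1 mark each; user-dependent ones
    # cost only 0.5 marks each (assumed reading of the rule above).
    return full_fps + 0.5 * user_dependent_fps

# Hypothetical product with 3 outright FPs and 2 user-dependent FPs:
print(fp_score(3, 2))  # 4.0
```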
Up next, we have the summary of the entire evaluation. A total of 725 cases were tested, and Defender was compromised seven times, a compromise rate of around 1%. Hence, Defender earns an overall Protection Rate of 99%.
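The headline numbers check out, as the quick calculation below shows; note that this simple version ignores any partial credit the report may assign to user-dependent cases.

```python
total_cases = 725
compromised = 7

compromise_rate = 100 * compromised / total_cases
protection_rate = 100 - compromise_rate

print(f"Compromise rate: {compromise_rate:.2f}%")  # 0.97%, i.e. around 1%
print(f"Protection rate: {protection_rate:.1f}%")  # 99.0%, the 99% shown
```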
TotalAV did the worst in the test with 15 compromised cases. Interestingly, TotalAV is said to use the Avira engine, yet while TotalAV fared poorly, Avira itself delivered one of the best protection scores in the test.
Here are the final protection awards that AV-Comparatives bestowed based on the performance of these products:
Although Defender had one of the poorer protection rates, the product still managed to win the AV-Comparatives ADVANCED award thanks to its excellent score in the false positive category. Conversely, high false positive counts dragged others like Malwarebytes, Norton, and Trend Micro down in this final awards ranking.
Source: AV-Comparatives