
Virtual Environment Protection Test Report

A test commissioned by Kaspersky Lab and performed by AV-Test GmbH. Date of the report: May 10th, 2012; last update: May 14th, 2012.

Executive Summary
In March and April 2012, AV-Test performed a comparative review of two security solutions for virtual environments to analyze their capabilities to protect against malware. The products under test were Kaspersky Security for Virtualization and Trend Micro Deep Security. Five individual tests were performed: a real-world test with 36 malicious URLs, a dynamic detection test with five samples, a static detection test with 141,290 samples, a false positive test, and a final test that measured the products' impact on system performance.

To perform the test runs, two VMware ESXi environments were set up on identical servers. The security software used VMware vShield to protect the virtual machines, which ran Windows XP with the latest Service Packs and updates. In the real-world and dynamic tests, the samples were executed and any detection by the security software was noted. Additionally, the resulting state of the system was compared with the original state before the test in order to determine whether the attack was successfully blocked. In the static detection test, the products had to scan a set of 141,290 malicious files. The Trend Micro product was tested with and without file and web reputation (in-the-cloud), which showed a significant difference in its detection rates.

It is common sense in IT security that anti-virus protection is a must. The usual agent-based anti-virus software includes several layers such as static detection, dynamic detection, a firewall and more. The agentless anti-virus solutions designed for virtual environments and tested in this review have a narrower scope: they provide traditional anti-virus protection only, which avoids a too heavy performance impact. There may also be circumstances in which critical systems require agent-based anti-virus applications with additional protection layers. This can lead to a mixture of agent-based and agentless anti-virus protection methods that must be administered and maintained.

Overview
With the increasing number of threats being released and spread through the Internet these days, the danger of getting infected is increasing as well. A few years back, new viruses were released every few days. This has grown to several thousand new threats per hour.

New unique samples added to AV-Test's malware repository (2005-2012)



Figure 1: New samples added per year

In the year 2000, AV-Test received more than 170,000 new samples, and in 2010 and 2011 the number of new samples grew to nearly 20,000,000 each. The numbers continue to grow in 2012, with already over 5 million new samples in the first quarter. The growth of these numbers is displayed in Figure 1. Since virtual infrastructures are an important topic for the enterprise, security vendors provide new products that are optimized for those environments.

Products Tested
The testing occurred in March and April 2012. AV-Test used the latest releases of the following products available at the time of the test:

Kaspersky Security for Virtualization 1.1
Trend Micro Deep Security 8

Methodology and Scoring


Platform

All tests have been performed on identical servers equipped with the following hardware:

Dell PowerEdge T310
Intel Xeon Quad-Core X3450 CPU
16 GB RAM
500 GB HDD

The hypervisor was VMware ESXi 5 (Build 623860) with vShield 5 (Build 473791). The protected virtual machines were configured as follows:

Windows XP Professional (32 Bit), SP3 + VMware Tools
1 CPU
2 GB RAM
50 GB HDD

Testing methodology

General

1. Clean system for each sample. The test virtual machines should be restored to a clean state before being exposed to each malware sample.
2. Product cloud/Internet connection. The Internet should be available to all tested products that use the cloud as part of their protection strategy.
3. Product configuration. All products were run with their default, out-of-the-box configuration. For Trend Micro Deep Security, reputation services are disabled by default for agentless setups; the tests were run both with and without reputation services.
4. Sample cloud/Internet accessibility. If the malware uses the cloud/Internet connection to reach other sites in order to download further files and infect the system, care should be taken to make sure that cloud access is available to the malware sample in a safe way, such that the testing network is not under the threat of getting infected.
5. Allow time for the sample to run. Each sample should be allowed to run on the target system for 10 minutes to exhibit autonomous malicious behavior. This may include initiating connections to systems on the Internet, or installing itself to survive a reboot (as may be the case with certain key-logging Trojans that only activate fully when the victim is performing a certain task).

The procedures below are carried out on all tested programs and all test cases at the same time in order to ensure that all protection programs have the exact same test conditions. If a test case no longer works, or its behavior varies between protection programs (which can be clearly determined using the Sunshine analyses), the test case is deleted. This ensures that all products were tested in the exact same test scenarios.
All test cases are solely obtained from internal AV-TEST sources and are always fully analyzed by AV-TEST. We never resort to using test cases or analyses provided by manufacturers or other external sources.

Dynamic Test/Real-World Test

1. The products are installed, updated and started up using standard/default settings. The protection program has complete Internet access at all times.
2. AV-TEST uses the analysis program Sunshine, which it developed itself, to produce a map of the non-infected system.
3. It then attempts to access the website or execute the malicious file, respectively.
4. If access to or execution of the sample is blocked by the program's static or dynamic detection mechanisms, this is documented.

5. Given that the detection of malicious components or actions is not always synonymous with a successful block, Sunshine constantly monitors all actions on the computer in order to determine whether the attack was completely blocked, partially blocked or not blocked at all.
6. A result for the test case is then determined based on the detection documented by the protection program and the actions on the system recorded by Sunshine.
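Steps 5 and 6 amount to combining two signals per test case: the product's own detection log and the system diff recorded by the monitor. A minimal sketch of that classification logic in Python (the function name and its inputs are our illustration, not AV-TEST's actual Sunshine tooling):

```python
def classify_result(detection_logged: bool, malicious_changes: int) -> str:
    """Combine the product's detection log with the post-run system diff.

    A detection alone is not a successful block (step 5): the state of
    the system after the run decides the verdict.
    """
    if malicious_changes == 0:
        return "blocked"            # no malicious changes survived the run
    if detection_logged:
        return "partially blocked"  # a warning was raised, but changes remain
    return "not blocked"            # neither detected nor prevented
```

A verdict is produced per sample, so the per-product totals in the results below are simply counts over these verdicts.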

Static Scanning Test

1. 141,290 malware samples have been scanned with the products with recent updates and a connection to the cloud. (Note: Trend Micro has been tested with the default setting File Reputation off and additionally with File Reputation on.)
2. A rescan of all remaining samples was performed 7 days later to determine the final detection rate.

Samples

The malware set for the dynamic test contains 41 samples. The set is separated into 36 URLs with malicious downloads and 5 executable files from other sources such as mail attachments or removable storage. These files were collected between March 29th and April 12th, 2012. Every sample has been tested on the day of its appearance in AV-TEST's analysis systems. The malware set for the static scanning test contains 141,290 samples of zoo malware. This includes files that were spread on the Internet in the preceding weeks and that were collected by AV-TEST during February and March 2012.
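The two-pass scheme above means the final detection rate counts samples caught either in the initial scan or in the rescan of the remainder 7 days later. A small sketch of that computation (the function and the example numbers are hypothetical, not the report's raw counts):

```python
def final_detection_rate(total: int, first_scan_hits: int, rescan_hits: int) -> float:
    """Final static detection rate in percent: samples detected in the
    initial scan plus samples from the remainder detected in the rescan."""
    if rescan_hits > total - first_scan_hits:
        raise ValueError("rescan cannot detect more samples than remain")
    return 100.0 * (first_scan_hits + rescan_hits) / total

# hypothetical usage: final_detection_rate(141290, 138000, 1300)
```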

Test Results
Real World Attacks
The real-world tests showed that Kaspersky uses signature-based detection only, while Trend Micro is also able to block entire URLs with its Web Reputation technique. Due to its high static detection rates and its short response times to new malware, Kaspersky was able to block 30 out of 36 malicious files that were downloaded from the web; one further sample was partially blocked. Trend Micro was able to block 24 URLs with its Web Reputation engine, meaning the user was not able to download the malicious files at all. However, when a file was successfully downloaded, the file guard detected only 15 files. All in all, Trend Micro blocked 27 samples.

Real World Detection Results



Figure 2: Real World Detection Results

Figure 2 shows only three files blocked by the Trend Micro file guard. The remaining 12 files were also detected by the Web Reputation engine and were therefore counted as blocked URLs.

Static Detection
The static detection tests showed that Kaspersky and Trend Micro are on a similar level as long as file reputation is enabled for Trend Micro. The default option is to turn file reputation on for agents only. Because our setup did not use the agent, we tested both options to show the differences.

Static Detection of Malware


[Figure: bar chart of static detection rates: 74.22%, 98.60% and 98.84%. Trend Micro Deep Security without File Reputation is the clear outlier at 74.22%; Kaspersky Security for Virtualization and Trend Micro with File Reputation sit close together at 98.60% and 98.84%.]

Figure 4: Static Detection of Malware

Figure 4 shows that the file reputation settings have a big impact on the detection results of Trend Micro.

Dynamic Detection
The next test determined the detection of new, unknown malware through dynamic detection methods. These are files that are not detected with static detection. Given the very good static detection rates from above, only a small number of files has to be caught with dynamic detection. Kaspersky was able to block one out of five samples, and Trend Micro detected two out of five samples but was not able to block them.

It became clear that neither Kaspersky nor Trend Micro uses dynamic detection methods in the tested configuration; the detections in this test were all based on signatures. The products use the VMware vShield Endpoint driver to access files on the protected virtual machine, which are then scanned by an additional virtual appliance, so the virtual machine itself does not have an anti-virus agent installed. Because passing all events to the virtual appliance for behavior-based analysis would cost too much performance, such methods are not supported.

The products can be configured with an anti-virus agent on each virtual machine; in the case of Kaspersky, the agent would be a normal Kaspersky Endpoint Security client. If an agent is installed, it is able to perform behavior-based analysis of malware, and the detection rates would increase. Whether to use a setup with or without an agent depends on the company's needs. The setup without an agent requires fewer resources per virtual machine, and therefore more machines can run on a single host.

Dynamic Detection of Malware


[Figure: bar chart (scale 0 to 5) comparing the Overall Detection (Warning) Rate and the Overall Detection and Blocking Rate for Kaspersky Security for Virtualization and Trend Micro Deep Security.]

Figure 3: Dynamic Detection of Malware

As figure 3 shows, dynamic detection methods do not work with an agentless setup.

False Positives
The false positive tests included the scan of two sets of files (static) and the installation of 20 clean applications (dynamic). The first set includes 11,604 files from several Windows and Office installations; detections in this set are therefore critical. Both products had no false positive detections in this set. The second set contains all kinds of files from popular programs, which were downloaded from major download sites. The total number of these less critical files is 231,872. Kaspersky had no false positive detections and Trend Micro detected two files. Given the size of the set, these numbers are very good.

False Positive Detections of less critical Files


[Figure: bar chart of false positive detections in the less critical set: Kaspersky Security for Virtualization 0, Trend Micro Deep Security 2.]

Figure 5: False Positive Detections of less critical Files

During the dynamic false positive tests, Kaspersky had no false alarms. Trend Micro removed one language DLL of an IrfanView installation; however, the program still started properly.

Performance
The performance test measured several synthetic I/O operations like creating and opening files as well as real usage scenarios like downloading files and running applications. The cycle was repeated 7 times. Figure 6 shows the total average time for specific real-world, non-synthetic, operations.

Total Average Time of Specific Operations


[Figure: bar chart (0 to 900 seconds) of the total average time for the reference system, Kaspersky Lab and Trend Micro across the operations: running applications/opening specific documents, installing applications, loading websites, copying files and downloading files.]

Figure 6: Total Average Time of Specific Operations (in seconds)

Kaspersky provides the better overall performance, but compared to the reference even Kaspersky needs more than twice the time to copy a set of files 3.4 GB in size. Trend Micro shows the largest impact when installing applications. The test shows a noticeable performance impact by both security solutions when copying files within one virtual machine and when installing applications. However, in a real virtual environment such operations should be rare.
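The overhead figures discussed here reduce to two simple aggregations: averaging each operation over its 7 repetitions, and dividing by the reference system's average. A sketch (the function names are our illustration; the report does not publish its exact aggregation code):

```python
from statistics import mean

def average_runtime(runs: list[float]) -> float:
    """Each operation was repeated 7 times; the score is the average time."""
    return mean(runs)

def slowdown(reference_avg: float, product_avg: float) -> float:
    """Factor by which an operation takes longer than on the unprotected
    reference VM; a value above 2.0 means more than twice as slow."""
    return product_avg / reference_avg
```

For example, "more than twice the time to copy a set of files" corresponds to a slowdown factor above 2.0 for that operation.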

Summary
The above findings show that protecting virtual environments is different from protecting the usual desktop PC. The security vendors have to use different approaches to protect the systems and to minimize the performance impact. It is obvious that careful configuration is required to tailor the security solution to the specific environment.

Appendix
Version information of the tested software
Developer, Distributor    Product name                               Virtual appliance version    Management console version
Kaspersky Lab             Kaspersky Security for Virtualization      1.1.0.49                     9.2.61
Trend Micro               Deep Security 8                            8.0.0.1199                   8.0.1448

List of used malware samples


Real World Attacks

hxxp://fotolog12.beepworld.it/files/slide-orkut85.exe
hxxp://www.haoxs.net/tools/file/c/q.exe
hxxp://tenda.infosapobla.com/temp/SYL-DC5.exe
hxxp://www.romanhitechinstitute.com/newsimages/svchost.exe
hxxp://www.clickplaystream.com/dl/java.exe
hxxp://swordsoul.110mb.com/OnePiece.com
hxxp://schokoweiss.de/uploads/media/media.exe
hxxp://heart-station.org/blog/f2.exe
hxxp://down.nurungzi.co.kr/main/t5/hinnrz.exe
hxxp://ceraxon.com/iemctsec/mvxrf0.exe
hxxp://ahaliaexchange.com/java.exe
hxxp://75.147.219.202/aspnet_client/system_web/receitanet_malha114001.exe
hxxp://bot.iamsoninja.com/downloads/server.exe
hxxp://uppdate.sytes.net/_u/stub.exe
hxxp://clickplaystream.com/dl/camfrog.exe
hxxp://facerboolksion.biz/fotoviews.php?=
hxxp://adest.com.au/readers/ADEST/ADEST.exe
hxxp://www.s3odicol.net/x5.exe
hxxp://alias1.adobedownloadcentre.selfip.biz/data.php
hxxp://test.ceuta-pesca.es/update.exe
hxxp://gonadee.com/media/files/np.exe
hxxp://www.dvornalipa.sk/foto_files/photos/1.exe
hxxp://78.111.51.123/files/6f82c
hxxp://smscrack.narod.ru/install_sms_cracker.exe
hxxp://www.veterinary-management.com/tmp/install_4a405aa301785/css/Correios-Telegrama586655.exe
hxxp://newserial.net/Ah-Istanbul-Cengiz-Ozkan.exe
hxxp://exehost.net/uploads/Adobe-Udpate.exe
hxxp://petrojobsearch.com/java/settings.exe
hxxp://lebleb2011.com/install.exe
hxxp://colegiowz.com.br/d&//
hxxp://91.217.153.35/upeksvr.exe
hxxp://www.motivity.com.tw/lib/thumb.php?redir=
hxxp://thecaswellhouse.com/caswellhouse.exe
hxxp://meinv.tv/5/steup.exe
hxxp://www.inews365.com/xml/xml.exe
hxxp://188.116.32.144/mgfugh/update.php?ver=1&type=movie

Dynamic Detection

0x06d8fe2fa094401e0c06c9d26dc274c8
0x166e6e813478f8c92ec245ef3bac1f83
0x328d9ef6c3d8770c0b144a7bff99a530
0x329428230075cb168a5aaa33c6df1cb3
0x3f53ea54adceec86de26f9a23b7ec90d
0x54115b1ceb020baf7402e24da33f2a67
0x787806ddd76b6e2caf25ae0e1be82641
0x95349dc075008283fb832f8fca2b6e08
0xad05c3c63d5b50cd820b9c43aa4cd489
0xceaece2b59a512c1d8344a2ea051a6c1

Static Detection

The list of the 141,290 samples is not given because of its size, but it is available on request.

Copyright 2012 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web http://www.av-test.org

