Here’s How We Move Performance Testing Forward

Jan 21, 2015
4 minutes

Historically, firewall throughput performance metrics have been published using User Datagram Protocol (UDP), a stateless protocol that provides a relatively unrealistic metric for customers attempting to select the right firewall appliance. UDP is unrealistic for several reasons. First, it does not place significant computing strain on the firewall appliance under test, even with security features such as application control, IPS, AV or APT protection turned on. Second, no real network carries 100 percent UDP traffic. Third, and most importantly, recent high-profile breaches highlight the fact that attackers try to hide in plain sight, navigating across your network and stealing data using applications already found on it. The need to secure your network based on applications, not ports, has never been more critical.
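To make that contrast concrete, here is a minimal sketch using only Python's standard sockets (the addresses are placeholders: 192.0.2.10 is a documentation address and example.com a public test host). A stateless UDP sender simply fires datagrams, while even one HTTP transaction forces a device in the path to track TCP state and carry application payload worth inspecting.

```python
import socket

def udp_blast(host="192.0.2.10", port=9, payload_size=1400, count=1000):
    """Send a stream of fixed-size UDP datagrams and never wait for a reply.

    A device in the path only has to forward packets: there is no session
    state to track, no handshake, and no application payload worth parsing,
    which is why UDP numbers flatter a throughput data sheet.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = b"\x00" * payload_size
    for _ in range(count):
        sock.sendto(payload, (host, port))
    sock.close()

def http_transaction(host="example.com", port=80, path="/"):
    """Run one complete HTTP/1.1 transaction over TCP.

    A firewall in the path must track the TCP handshake, reassemble the
    byte stream and, with application identification or threat prevention
    enabled, parse the HTTP payload itself: far more work per megabit.
    """
    sock = socket.create_connection((host, port))
    request = ("GET {} HTTP/1.1\r\n"
               "Host: {}\r\n"
               "Connection: close\r\n\r\n").format(path, host).encode()
    sock.sendall(request)
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)
    sock.close()
    return b"".join(chunks)
```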

Use of HTTP: a step in the right direction

There has been incremental movement toward application-centric performance testing. In 2007, when we began shipping our appliances, we chose to publish our performance metrics using HTTP, not UDP, because HTTP is more computationally strenuous and our platform classifies traffic based on applications, not ports. In 2011, Joel Snyder and David Newman tested our PA-5060 in Network World using static HTTP and then HTTP with a range of transaction sizes and payloads. In addition, some other firewall vendors have begun adding HTTP to their datasheets alongside their UDP metrics.

An industry first: a firewall performance test using 15 applications

In an effort to continue to drive the performance testing conversation toward the use of application-centric metrics, we partnered with Ixia and David Newman of Network Test to develop the first-ever public-facing performance test that uses applications commonly found on your networks.

The test plan uses 15 applications selected using data compiled from the analysis of application traffic on more than 5,500 networks. The applications were a combination of browser-based and client-server applications spanning a range of categories, including social media, email, file sharing, instant messaging and video. Network Test then executed the plan to validate the performance of the PA-7050, our newest firewall appliance, with security services (App-ID, Threat Prevention, NAT, Logging, and 1,000 rules) enabled.
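To illustrate how a weighted application mix drives a load generator, here is a minimal sketch; the application names and percentages below are illustrative placeholders, not the 15-application mix from the published report.

```python
# Hypothetical application mix: each entry is a share of the total
# offered load, in percent. These values are for illustration only.
APP_MIX = {
    "web-browsing": 30.0,
    "ssl":          20.0,
    "hotmail":      10.0,
    "facebook":     10.0,
    "youtube":      10.0,
    "smtp":          8.0,
    "ftp":           7.0,
    "dns":           5.0,
}

def per_app_load(total_gbps: float) -> dict:
    """Split a target offered load across the mix by percentage."""
    assert abs(sum(APP_MIX.values()) - 100.0) < 1e-9, "mix must total 100%"
    return {app: total_gbps * pct / 100.0 for app, pct in APP_MIX.items()}

if __name__ == "__main__":
    # Example: spread a 100 Gbps target across the mix.
    for app, gbps in per_app_load(100.0).items():
        print("{:14s} {:6.1f} Gbps".format(app, gbps))
```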

The PA-7050 achieved the following results:

  • Using the 15-application mix with security features enabled, the PA-7050 delivered 110 percent of its datasheet-rated performance for full threat prevention.
  • Using HTTP only as a baseline, the PA-7050 achieved up to 120 percent of its datasheet-rated performance.

You can see all the results, the methodology we used, and the application mixes and percentages here.

Did we pay for the test?

Yes, we paid Network Test to execute this test, and there will be those who say we rigged it because we paid for it. We chose to commission Network Test as a means of elevating the performance conversation to focus on applications and the impact that full-stack classification and inspection introduces. We have included as much detail as possible in the report, and David Newman was intimately involved in developing the test plan, providing feedback on the applications in use. For customers who wish to see the tests duplicated, we will make the configuration files available through our field teams in our Proof of Concept labs.

The applications listed may not be the most commonly found variants within their respective categories, due in part to what the testing tools support. For example, Hotmail may not be found on every network, but it is a web-based email application, a category of which an average of 12 instances are found on every network we analyze. So it is important to focus on the type of application and its underlying technology rather than the specific application name.

Performance testing is hard

This test took months to complete, with many hours spent troubleshooting the test bed and determining the application percentages. We would be remiss if we did not thank Ixia for lending us their BreakingPoint PerfectStorm chassis and providing significant support during the months-long effort.

Want to learn more?

  • Read the full report here.
  • Join Ixia and Palo Alto Networks on February 11 for a webinar discussing the PA-7050 performance testing.
