IBM Systems Magazine, Power Systems Edition - February 2010 - (Page 54)

IBM Perspective: Wisely Using Performance Benchmarks

Systems performance benchmarking can be wonderful for evaluating performance and comparing servers and storage. Numerous industry-standard benchmarks are ripe for the picking across application portfolios. If you can use existing, publicly available results from these benchmarks to help you in your analysis, the data is freely available.

How can you decide which benchmarks to select and what to emphasize in your systems analysis? There are so many industry-standard and ISV tools to choose from: benchmarks from the Transaction Processing Performance Council that evaluate online transaction processing and business intelligence (BI) applications; Standard Performance Evaluation Corporation benchmarks that focus on the Web, Java* applications and even energy efficiency; virtualization benchmarks; ISV benchmarks from SAP that include sales and distribution, BI, and banking; storage benchmarks from the Storage Performance Council; specialized application benchmarks that are perfect for high-performance computing applications, everything from computational fluid dynamics to the weather; and many more.

The most crucial input in your decision of which benchmarks to use is a deep understanding of the business and the application. You must choose a benchmark that maps well to your specific application. Sometimes a single benchmark works well. You may also want to use a combination of results on different benchmarks. Other times, nothing seems to fit and you may need to run your application in a special controlled-benchmark environment at your site or a vendor's location.

The most important aspect in comparing systems performance using industry-standard and ISV benchmarks is understanding the configurations you're comparing. Is one system clustered with hundreds of cores and the other a large, single, nonclustered system? Are they similar in size? Do they have the same or a similar number of processor chips, cores and threads? How different are the operating and database systems? It may not make sense to compare a Linux* system running DB2* with a Windows* system running SQL Server. One benchmark result may employ a sophisticated database for use with complex BI solutions, while another might use a very simple database. What about network protocols, or sizes of databases and load times? You probably don't want to compare a BI workload of 100 GB with one of 100 TB. Examine what storage is used; some of the latest benchmark submissions employ solid-state drives, which, though great technology, don't directly compare to other storage technologies. These are just some of the many differences in configurations used in benchmarks that may affect the end result. Be certain you're comparing like systems. And when you're not, your modeling must account for these differences.

It's also imperative to understand the benchmark consortia rules. For instance, are you allowed to derive other metrics from the benchmark's metrics? Are you allowed to compare one database size with another, or an older version of a benchmark with a newer version? Price/performance may or may not be a component of the benchmark that can be analyzed publicly.

The ultimate goal in using systems-performance benchmark results for evaluating performance is ensuring an apples-to-apples comparison. You need to understand when one result is a Cortland and one is a McIntosh, or if you're comparing a peach and a nectarine. (Peaches are fuzzy.) With these types of differences, the benchmark results can still be employed as an extremely valuable systems tool, as long as you take into account the differences in your modeling and comparison efforts. Most of all, make sure you're not directly and absolutely comparing apples to pineapples, where about the only thing in common is six letters.

Elisabeth Stahl
Chief Technical Strategist
Performance Marketing Executive
IT Specialist
IBM Systems and Technology Group

Send feedback to Managing Editor Tami Deedrick at
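The configuration checklist the column walks through (clustering, core counts, operating system, database, storage type, dataset size) can be sketched as a simple pre-screening step before two published results are compared. This is a hypothetical illustration only; the field names and the two result records below are invented for the example and are not drawn from any benchmark consortium's disclosure format:

```python
# Hypothetical sketch: flag configuration differences between two
# published benchmark results before comparing their scores.
# Field names and values are illustrative, not from any real disclosure.

def comparability_gaps(config_a, config_b, fields=None):
    """Return the configuration fields on which two results differ.

    An empty result suggests an apples-to-apples comparison; any
    differences listed must be accounted for in your modeling.
    """
    fields = fields or sorted(set(config_a) | set(config_b))
    gaps = {}
    for field in fields:
        a, b = config_a.get(field), config_b.get(field)
        if a != b:
            gaps[field] = (a, b)
    return gaps

# Two made-up results that fail most of the article's checks.
result_a = {
    "cores": 64, "clustered": False, "os": "AIX",
    "database": "DB2", "storage": "SSD", "dataset_gb": 100,
}
result_b = {
    "cores": 256, "clustered": True, "os": "Windows",
    "database": "SQL Server", "storage": "HDD", "dataset_gb": 100_000,
}

for field, (a, b) in comparability_gaps(result_a, result_b).items():
    print(f"{field}: {a} vs. {b} -- adjust the model or drop the comparison")
```

The point of the sketch is the discipline, not the code: every field it flags is a difference your modeling must explicitly account for, and a 100 GB vs. 100 TB gap in `dataset_gb` is usually grounds to disqualify the comparison outright.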
