Wednesday, August 19, 2009

Performance Testing Needs a Seat at the Table

It's time performance testing got a seat at the table. Architects and developers like to make all the decisions about a product without ever consulting the testing organization. Why should they? All testers have to do is test what's created. If testers can't handle that simple task, then maybe they can't handle their jobs.

I know this thought process well. I used to be one of those developers :). But I have seen the light. I have been reborn. I know that it is important to get testers' input on products upfront. And it is becoming more important now than ever before.

With RIA (Web 2.0) technologies, there are many different choices that developers can make. Should they use Flex, Silverlight, AJAX, or something else? If they use AJAX, which frameworks should they use? If Silverlight, what type of back-end communication are they going to use?

Let's just take AJAX as an example. There are hundreds of frameworks out there. Some are popular and common, but most are obscure or one-off frameworks. Developers like to make decisions based on what will make their lives easier and what is cool. But what happens if their choices can't be performance tested? Maybe the performance team doesn't have the expertise in-house, or maybe their testing tools don't support the chosen framework. What happens then?
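
To make this concrete, here is a minimal sketch (in Python, with an entirely made-up framework envelope and token format) of the hand-correlation a scripter ends up doing when the testing tool cannot parse the chosen framework's wire format:

    # All names and formats here are hypothetical, for illustration only.
    import json
    import re

    # A response body as a record/replay tool would capture it -- an imaginary
    # AJAX framework that wraps its payload in a proprietary envelope.
    captured_response = '7|0|["com.example.OrderService","a91f3c-session-token"]|{"total": 42}'

    # Step 1: dig the session token out of the envelope by hand,
    # because the tool has no parser for this format.
    match = re.search(r'"([0-9a-f]+-session-token)"', captured_response)
    if match is None:
        raise ValueError("correlation failed: token not found in response")
    token = match.group(1)

    # Step 2: rebuild the next request body in the same proprietary envelope.
    next_payload = json.dumps(["com.example.OrderService", token, "getOrders"])
    next_request_body = "7|0|" + next_payload
    print(next_request_body)

Multiply that by every dynamic value in every request, and an unsupported framework quickly turns a scripting task into a reverse-engineering project.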

I can tell you that many times the apps get released without being tested properly, and everyone just hopes for the best. It's a great solution. I like the fingers-crossed method of releasing apps.

How could this situation be avoided? Simply include the performance testing group upfront. Testing is a major cog in the application life cycle and should be included at the beginning. I'm not talking about testing earlier in the cycle (although that is important and should be done). I'm talking about getting testing involved in architecture and development discussions before development takes place.

If developers and architects knew up front that certain coding decisions would make it hard or impossible to performance test, then maybe they would choose other options for the application's development. Many businesses would not want to risk releasing an application if they knew that it could not be tested properly. But when they find out too late, they don't have a choice except to release it (the fingers-crossed method).

If the performance team knew upfront that they couldn't test something because of skills or tools, then at least they would have a heads-up and they could begin planning early for the inevitable testing. Wouldn't it be nice to know what you are going to need to performance test months in advance? No more scrambling at the 11th or 12th hour.

Think about this: if testing was involved or informed upfront, then maybe they, along with development, could help push standards across development. For example, standardizing on one or two AJAX frameworks would help out both testing and development. It would make the code more maintainable, because more developers would be able to update it, and it would help ensure that the application is testable.

We need to get more groups involved up front. The more you know, the better the decisions, the better off the application is going to be.

Thursday, August 13, 2009

What are the Goals of Performance Testing?

So what is the point of performance testing? I get this question often. And depending on who you talk to, you get different answers.

First, let me begin by telling you what the goals of performance testing/validation are NOT:
  • Writing a great script
  • Creating a fantastic scenario
  • Knowing which protocols to use
  • Correlating script data
  • Managing data
  • Running a load test

This is not to say that these things are not important. They are very important, but they are not the goals. They are the means to an end.

So why DO people performance test? What are the goals?

  • Validating that the application performs properly
  • Validating that the application conforms to the performance needs of the business
  • Finding, analysing, and helping fix performance problems
  • Validating that the hardware for the application is adequate
  • Doing capacity planning for future demand on the application

The outcomes of the performance test are the goals of testing. It seems basic. Of course these are the goals. But...

  • How many people really analyse the data from a performance test?
  • How many people use diagnostic tools to help pinpoint the problems?
  • How many people really know that the application performs to the business requirements?
  • How many people just test to make sure that the application doesn't crash under load?

Even though the goals seem obvious, many testers/engineers are not focusing on them correctly, or are not focusing on them at all.

  • Analysing the data is too hard.
  • If the application stays up, isn't that good enough?
  • So what if it's a little slow?

These are the reasons that I hear. Yes, you want to make sure that the application doesn't crash and burn. But who wants to go to a slow website? Time is money. That is not just a cliché, it's the truth. Customers will not put up with a slow app/website. They will go elsewhere, and they do go elsewhere. Even if it is an internal application, if it is slow performing a task, then it takes longer to get the job done, and that means it costs more to get that job done.
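
"Time is money" is easy to put a number on. A rough back-of-the-envelope calculation (all numbers purely illustrative) makes the point:

    # Hypothetical cost of a slow internal app; every figure here is made up.
    extra_seconds_per_task = 10      # app is 10 seconds slower than it should be
    tasks_per_day = 500              # times the task is performed each day
    working_days_per_year = 250
    loaded_hourly_rate = 50.0        # fully loaded cost of an employee hour, in dollars

    wasted_hours = extra_seconds_per_task * tasks_per_day * working_days_per_year / 3600
    annual_cost = wasted_hours * loaded_hourly_rate
    print(f"{wasted_hours:.0f} hours per year wasted, costing about ${annual_cost:,.0f}")

Ten seconds here and there adds up to roughly 347 hours and $17,000 a year in this made-up example, and that is a single task in a single application.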

Performance engineering is needed to ensure that applications perform properly and perform to the needs of the business. These engineers do not just write performance scripts. Just because someone knows Java does not mean that they are a developer. And just because a person knows how to write a performance script does not mean that they are a performance engineer.

Performance engineering requires skills that not all testers have. They need to understand the application under test (AUT), databases, web servers, load balancers, SSO, etc. They also have to understand the impact of CPU, memory, caching, I/O, bandwidth, and so on. These are not skills that are learned overnight, but skills that are acquired over time.
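
As an illustration, here is a minimal monitoring sketch in Python using the third-party psutil library (one option among many; PerfMon, sar, or your load-testing tool's own monitors do the same job) that samples the kinds of counters a performance engineer has to be able to read:

    import time
    import psutil  # third-party: pip install psutil

    # Sample CPU, memory, disk I/O, and network counters roughly every 5 seconds.
    for _ in range(5):
        cpu = psutil.cpu_percent(interval=1)    # % CPU averaged over 1 second
        mem = psutil.virtual_memory().percent   # % of physical memory in use
        disk = psutil.disk_io_counters()        # cumulative disk read/write bytes
        net = psutil.net_io_counters()          # cumulative network bytes sent/received
        print(f"cpu={cpu:5.1f}%  mem={mem:5.1f}%  "
              f"disk_bytes={disk.read_bytes + disk.write_bytes}  "
              f"net_bytes={net.bytes_sent + net.bytes_recv}")
        time.sleep(4)

Collecting the numbers is the easy part; knowing that, say, climbing memory under flat throughput hints at a leak is where the engineering comes in.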

I wrote a previous blog entry on "you get what you pay for". If you pay for a scripter, you get a scripter. If you pay for a performance engineer, you get a performance engineer (well not always :). Sometimes people exaggerate their skills :) ).

Companies can always divide and conquer. They can have automators/scripters create the scripts and the tests, then have performance engineers look at the test and analyse the results. In any case, the performance engineer is a needed position if you want to performance test/validate properly.

It needs to be mandatory to know what metrics to monitor and what those metrics mean. Knowing how to use diagnostic tools also needs to be mandatory. Again, in a previous blog I mentioned that if you are not using diagnostics, you are doing an injustice to your performance testing. Without this analysis knowledge you are not truly performance testing; you are just running a script with load. Performance testing is both running scripts and analysing the runs.
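
For example, "performs to the business requirements" is something you can check mechanically once you analyse the raw data. A minimal sketch, assuming a hypothetical requirement of "90% of requests must complete within 2 seconds":

    # Check measured response times against an SLA; data and SLA are hypothetical.
    def percentile(samples, pct):
        """Return the pct-th percentile of a list of numbers (nearest-rank)."""
        ordered = sorted(samples)
        index = max(0, int(round(pct / 100 * len(ordered))) - 1)
        return ordered[index]

    response_times = [0.8, 1.1, 0.9, 2.4, 1.3, 1.0, 3.1, 1.2, 0.7, 1.4]  # seconds
    sla_seconds, sla_pct = 2.0, 90

    p90 = percentile(response_times, sla_pct)
    verdict = "PASS" if p90 <= sla_seconds else "FAIL"
    print(f"{sla_pct}th percentile = {p90:.2f}s (SLA {sla_seconds}s): {verdict}")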

By looking at the monitoring metrics and diagnostic data, one can begin to correlate data and help pinpoint problems. One can also notice trends that may become problems over time. Just running a load test without analysis will not give you that insight. It will just let you know that the test appeared to run OK for that test run. Many times just running the test will give you a false positive. People wonder why an application in production is running slow if it already passed performance validation. Sometimes this is the reason (you never want this to be the reason). Proper analysis will ensure a higher-quality application.
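
Here is a minimal, hypothetical sketch of the kind of trend analysis that turns raw run data into that insight:

    # Fit a least-squares line to the average response time of each test
    # interval. A steadily positive slope (e.g. from a memory leak) can pass
    # a single short test yet bite in production. Data is illustrative.
    def slope(ys):
        """Least-squares slope of ys plotted against their indices 0..n-1."""
        n = len(ys)
        x_mean = (n - 1) / 2
        y_mean = sum(ys) / n
        num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(ys))
        den = sum((x - x_mean) ** 2 for x in range(n))
        return num / den

    # Average response time (seconds) per 5-minute interval of a load test.
    interval_averages = [1.01, 1.04, 1.10, 1.18, 1.27, 1.41]

    drift = slope(interval_averages)
    print(f"response time drifting by {drift:.3f}s per interval")
    if drift > 0.05:
        print("WARNING: degradation trend -- investigate before release")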

As I said, these are not skills that are created overnight. Performance engineers learn on the job. How do you make sure that this knowledge stays with a company as employees come and go? That is where a Center of Excellence (CoE) comes into play (you knew I was going to have to pitch this :) ). If you centralize your testing efforts, then the knowledge becomes centralized, as opposed to being dispersed throughout a company only to get lost when the employees with the knowledge leave. You can read yet another one of my blog entries for more information on the CoE. Wow! I've just been pitching my blog entries today :). But I digress.

Let's stop thinking that proper performance testing is just writing a good script, and agree that performance engineering is not an option but a must. Let's start to focus on the real goals of performance testing, and then all of the important "means to the end" will just fall into place.