INDUSTRY

Aerospace

PROJECT

Testing Performance built and documented a performance testing framework that fits within an Agile methodology, supporting both early performance testing within Continuous Integration and end-to-end performance testing requirements.

SYNOPSIS

Testing Performance / Fimatix - Performance Testing (Component-Level Performance Testing within Continuous Integration) Case Study

THE BACKGROUND

The customer is a large airline developing a service-based architecture that would gradually grow over time to replace existing applications.

Using an Agile methodology was key for the customer; this included the practice of constant testing from the development phase onwards.

Testing included code quality checks, automated functional testing and performance testing.

THE REQUIREMENT

Testing Performance was responsible for planning and implementing an approach to component-level and end-to-end performance testing that would fit with the new way of working.

Once the approach had been defined and set up, Testing Performance was to hand over to the incumbent supplier, who would then be responsible for all performance testing.

Early discussions during pre-engagement resulted in agreement to use the TMMi methodology to understand the requirements, agree them and then implement. There were to be four phases:

  • Discovery, where requirements were determined and the way ahead was agreed.
  • Design, where the detail of the deliverables was determined.
  • Build, where the deliverables themselves were built.
  • Deliver, where a workshop would feed back and demonstrate the assets created in order to achieve sign-off.
THE SOLUTION

Testing Performance supplied a Performance Test Lead and a Senior Performance Tester to carry out all activities, liaising with test management and development teams to ensure that as the concept grew, it broadly matched the airline’s requirements.

Testing Performance reviewed documentation and interviewed key personnel to understand requirements. This allowed the subsequent Design and Build phases to be defined against a high-level view of those requirements.

A number of activities were required, including:

  • Open-source performance test tools were to be used. A Proof of Concept (PoC) was required with both JMeter and Grinder; JMeter was selected, and along with implementing the tool we also produced and documented template scripts (see the illustrative sketch after this list).
  • The common platform architecture itself required performance testing. This was carried out using a pilot service that had been built specifically for functional testing against the platform.
  • A process for performance testing new services was designed, documented and set up. This included (but was not limited to) checklists, a performance test plan template, a performance test report template, questionnaires, status reports and a RAID log.
  • A second service was performance tested using the newly defined performance testing process. Lessons learnt were fed back into updates and amendments to the process and performance testing assets.
  • Various guides were built that aimed to cover everything that needed to be delivered to meet the process. These included a Service Delivery Checkbook and a Test User Guide.
  • A workshop was carried out at which the designed and documented performance testing assets were demonstrated and discussed.
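
As an illustration of the kind of component-level check the framework enables within Continuous Integration, the sketch below expresses a small JMeter test plan as a JUnit test using the open-source jmeter-java-dsl library. This is a hypothetical example only: the library choice, class and method names, service URL, load profile and response-time threshold are assumptions made for illustration. The engagement's actual deliverables were documented JMeter (JMX) template scripts rather than this code.

    // Illustrative sketch only: the service URL, load profile and thresholds
    // below are assumed values, not figures from the engagement.
    import static org.assertj.core.api.Assertions.assertThat;
    import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

    import java.io.IOException;
    import java.time.Duration;
    import org.junit.jupiter.api.Test;
    import us.abstracta.jmeter.javadsl.core.TestPlanStats;

    public class PilotServicePerformanceIT {

      @Test
      public void pilotServiceMeetsResponseTimeTarget() throws IOException {
        // Apply a small, repeatable load to a single service (component-level
        // testing) rather than a full end-to-end user journey.
        TestPlanStats stats = testPlan(
            threadGroup(10, 20, // 10 virtual users, 20 iterations each
                httpSampler("https://platform.example.com/pilot-service/health")
            )
        ).run();

        // Fail the CI pipeline if the agreed percentile target is breached
        // or any request returns an error.
        assertThat(stats.overall().sampleTimePercentile99()).isLessThan(Duration.ofSeconds(2));
        assertThat(stats.overall().errorsCount()).isZero();
      }
    }

Run as part of the build pipeline, a check of this shape gives each new service an early performance gate well before the more traditional end-to-end performance testing takes place.
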
THE OUTCOME

Over a period of three months, the airline, which had originally not fully known what it needed or wanted, and in some respects what it could have, was gently led through the Discovery, Design and Build of a performance testing process. The team fully accepted the new process thanks to the partnership and collaboration throughout the engagement, and retained ownership because they were fully consulted and involved in making choices at every stage.

THIS EXAMPLE DEMONSTRATES
  • Testing Performance’s ability to take a customer on a journey, collaborating with them and implementing a process that makes sense.
  • The ability to assess and select tools based on the customer’s needs.
  • Understanding how to implement ‘early’ component-level performance testing, not just the more traditional end-to-end performance testing.
  • The ability to use a recognised process such as TMMi, which allows for clearly defined milestones and deliverables.