Building Productive and Reliable Testing Centers

Our client, a personalized consumer marketing analytics company with one of the largest consumer purchasing history databases, builds brand awareness and creates loyalty for the world's leading CPG brands and retailers. Their goal is to target consumers with "behavior-based messaging" through the digital channels most likely to stand out to them: mobile, online, and in store. This year they will influence over 280 million shoppers and see more than $1B USD in annual purchases through their mobile commerce network.

Regression and Testing Standards

Our first engagement with this partner started in November 2013. At the time, they faced a few challenges:

  • Manual test cases lacked clear standards and were not structured inside ALM
  • Manual testers had no automation knowledge and were disconnected from the automation team and its activities
  • Regression testing was not being performed, since the team had no process to follow and no tests flagged for regression
  • Test cases were "hard-coded," making them impossible to reuse

Working together with a team on-site and nearshore in Brazil, we were able to address these issues and create standards and procedures that are now used throughout their IT department:

  1. Designed basic Test Case Standards, which are now used across different projects.
  2. Wrote test cases in a table-driven format (parameterization), increasing reusability and productivity.
  3. Created a manual regression suite covering the most important tests each release, used as the basis for the automation team's work.
  4. Established direct contact with the automation team, allowing manual testers to develop automation skills and deliver execution results faster.
  5. Contributed to testing process improvement and testing artifact organization.

Some of the tools we used to implement these plans: HP ALM (Application Lifecycle Management), HP QC (Quality Center), HP UFT (Unified Functional Testing, formerly QTP QuickTest Professional), and Selenium (including WebDriver).
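To make the table-driven idea concrete, here is a minimal sketch in Python. The real suites exercised the application through Selenium and UFT; `login_allowed` here is a hypothetical stand-in for the system under test, used only to show how one generic test body runs against a table of input/expected-output rows.

```python
# Minimal sketch of a table-driven (parameterized) test case.
# login_allowed is a hypothetical stand-in for the application logic
# the real suites exercised via Selenium/UFT.

def login_allowed(username: str, password: str) -> bool:
    """Hypothetical system under test: accepts one known credential pair."""
    return username == "alice" and password == "s3cret"

# Each row is one test case: inputs plus the expected outcome.
# Adding coverage means adding a row, not writing a new script.
TEST_TABLE = [
    ("alice", "s3cret", True),   # happy path
    ("alice", "wrong",  False),  # bad password
    ("",      "s3cret", False),  # missing username
]

def run_table(table):
    """Run every row through the same test body; collect pass/fail."""
    results = []
    for username, password, expected in table:
        actual = login_allowed(username, password)
        results.append((username, actual == expected))
    return results

if __name__ == "__main__":
    for name, passed in run_table(TEST_TABLE):
        print(f"user={name!r}: {'PASS' if passed else 'FAIL'}")
```

Because the test logic is written once and the data lives in the table, the same format ports cleanly to ALM test configurations or UFT data tables.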

Performance Testing

We also started working with them on their approach to Performance Testing in 2014. At the time there was no performance testing, no clear process for starting a performance engagement on a new project, and no plans or standards in place for driving performance activities. After evaluating their existing processes, we approached the work in three phases:

  1. Engagement/Planning Phase
    1. Gain an overall understanding of the project and the impact performance could have on business objectives
  2. Design/Implementation Phase
    1. Design and implement test data generation and performance test scripts
    2. Overlap with execution/reporting phase
  3. Execution/Reporting Phase
    1. Run performance test scripts and report performance numbers
    2. Advise development team on performance issues as needed
    3. Adjust performance test scripts as needed

We then implemented three types of testing:

  1. Load Test: Checks application behavior under normal and peak load conditions
  2. Stress Test: Determines or validates the system's behavior when it is pushed beyond normal or peak load conditions
  3. Capacity Test: Determines how many users and/or transactions the system can support while still meeting performance goals
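The load-test idea can be sketched in a few lines of Python. The actual engagements used JMeter against real endpoints; here `call_service` is a hypothetical stub simulating a fixed service time, so the sketch only shows the shape of a load run: N concurrent users each issuing requests, with latency aggregated afterward.

```python
# Minimal load-test sketch, assuming a stubbed service call in place of
# real HTTP endpoints (the actual suites used JMeter).
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def call_service() -> float:
    """Hypothetical request: simulate ~10 ms of service time and
    return the observed latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for a real HTTP round trip
    return time.perf_counter() - start

def load_test(users: int, requests_per_user: int) -> dict:
    """Run `users` concurrent workers, each issuing several requests,
    then summarize the latencies."""
    def worker(_):
        return [call_service() for _ in range(requests_per_user)]

    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = [t for batch in pool.map(worker, range(users)) for t in batch]

    latencies.sort()
    return {
        "requests": len(latencies),
        "avg_s": statistics.mean(latencies),
        "p95_s": latencies[int(0.95 * len(latencies)) - 1],
    }

if __name__ == "__main__":
    print(load_test(users=5, requests_per_user=4))
```

A stress test follows the same skeleton with `users` ramped past the expected peak, and a capacity test repeats the run at increasing user counts until the latency summary breaches the performance goal.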

Through these phases and tests, we were able to create and establish Performance Testing Standards and Procedures:

  1. Established a reference for teams that have performance needs and are looking for information or envisioning a new engagement.
  2. Created performance engagements for projects: monitoring SLAs, retesting production issues and fixes, and reporting results.
  3. Followed up constantly on the applications; frequent runs let the team confirm that systems and environments keep performing well.
  4. Created a reusable suite of JMeter scripts, allowing the team to start a new engagement faster and making maintenance easier.
  5. Structured the Performance Testing SharePoint (storing scripts, results, and documentation), making information available to sponsors.

Overall, our continued partnership with this client is based on our tenacity in finding issues and creating tailored action plans to solve them.
