PerformanceTracking

Summary

Use the test automation infrastructure to conduct performance testing and track progress over time.

Rationale

We should aim for better performance in key components and monitor for regressions.

Use Cases

  • Boot time: how long the system takes from power-on to a usable desktop (see the sketch after this list)
  • X.org: performance of the X server, e.g. rendering benchmarks
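
A minimal sketch of one way to capture a boot-time number, assuming we read the kernel's uptime counter once the desktop session has started; the script and its output key are hypothetical, not part of this spec:

    #!/usr/bin/env python
    # Report seconds since kernel start by reading /proc/uptime.
    # Run as a session-startup command to get a rough boot-time figure.

    def boot_time_seconds():
        # The first field of /proc/uptime is seconds since boot.
        return float(open("/proc/uptime").read().split()[0])

    if __name__ == "__main__":
        print("boot_time: %.2f" % boot_time_seconds())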

Assumptions

Design

  • Schedule and run the tests automatically on the test infrastructure
  • Log the results and plot them over time (a minimal sketch follows this list)
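
A minimal sketch of the log-and-plot step, assuming a flat CSV log and matplotlib for the plot; the file name, column layout and test names here are assumptions, not part of this spec:

    #!/usr/bin/env python
    # Append one timestamped measurement per run to a CSV log, then
    # plot the series for a given test so regressions stand out.
    import csv
    import sys
    import time
    import matplotlib
    matplotlib.use("Agg")           # render to a file, no display needed
    import matplotlib.pyplot as plt

    LOG = "performance.csv"         # assumed log location

    def log_result(test_name, value):
        # One row per run: unix timestamp, test name, numerical result.
        f = open(LOG, "a")
        csv.writer(f).writerow([int(time.time()), test_name, value])
        f.close()

    def plot_results(test_name):
        times, values = [], []
        for ts, name, value in csv.reader(open(LOG)):
            if name == test_name:
                times.append(int(ts))
                values.append(float(value))
        plt.plot(times, values, "o-")
        plt.xlabel("run timestamp")
        plt.ylabel(test_name)
        plt.savefig(test_name + ".png")

    if __name__ == "__main__":
        # e.g.: performance-log.py boot_time 48.2
        log_result(sys.argv[1], float(sys.argv[2]))
        plot_results(sys.argv[1])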

Tests

  • Which tests do we run?

Reporting

The hardware testing system, built on checkbox and the certification website, currently returns a Pass, Fail or Skip result for each test. A pass/fail verdict is not an appropriate result for performance testing, but the current system can be used as follows:

  • Fail: unused; we currently have no failure criteria defined for these tests

  • Skip: the test could not run because of a missing dependency

  • Pass: the test ran and returned a numerical result, which we log and analyse in a separate process (see the sketch after this list)
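
A minimal sketch of what such a test script could look like under this mapping. The benchmark (x11perf), the dependency check and the Skip exit code are all assumptions; in practice, checkbox job descriptions can express dependencies themselves:

    #!/usr/bin/env python
    # A performance test under the scheme above: the script exits 0
    # (Pass) and prints its numerical result on stdout, where a
    # separate process can log and analyse it.
    import os
    import subprocess
    import sys
    import time

    def main():
        # Skip: a dependency the test needs is missing.
        if not os.path.exists("/usr/bin/x11perf"):
            sys.stderr.write("skip: x11perf not installed\n")
            return 127   # assumed skip convention, not a checkbox rule
        # Pass: time a short benchmark run and print the measurement.
        start = time.time()
        subprocess.call(["x11perf", "-repeat", "1", "-rect100"])
        print("x11perf_rect100_seconds: %.2f" % (time.time() - start))
        return 0         # Fail stays unused until failure criteria exist

    if __name__ == "__main__":
        sys.exit(main())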

Implementation

Code Changes

Unresolved issues

Discussion


CategorySpec
