MetricsBasedTesting

Summary

All of our current testing is binary pass/fail, but we would also like to track the evolution of parameters such as boot speed and power usage that may not have a clear pass/fail threshold but for which historical data is desirable. We will extend our infrastructure to collect and display such data.

Rationale

Work to improve performance requires access to rich performance data rather than binary pass/fail.

User stories

Design

Gather non-binary data during various checkbox tests and feed it to the developers who need it for their work; a sketch of one such gatherer follows the examples below.

Examples

  • Tracking boot speed (using bootchart)
  • Power usage
  • I/O throughput
  • Amount of memory being used
    • polled during install, application testing, immediately after boot, etc.
  • Webcam framerate
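
As an illustration of the kind of gathering a checkbox-style test could do, here is a minimal sketch (the script name, field names and output path are hypothetical) that snapshots memory usage and writes it out verbatim as a blob, without judging the numbers:

#!/usr/bin/env python
# Hypothetical sketch of a checkbox-style metric test: it does not parse or
# judge the numbers, it only captures raw data for later submission.
import json
import time

def gather_memory_metric(output_path):
    """Snapshot /proc/meminfo and save it verbatim as a metric blob."""
    with open('/proc/meminfo') as meminfo:
        raw = meminfo.read()
    blob = {
        'metric': 'memory-usage',
        'timestamp': time.strftime('%Y%m%d-%H%M%S'),
        'raw': raw,
    }
    with open(output_path, 'w') as out:
        json.dump(blob, out)

if __name__ == '__main__':
    gather_memory_metric('memory-usage.json')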

Implementation

The initial pass should be implementable quickly and provide value to the groups who need this data.

Phase I (Karmic)

  • Gather metric data as attachments (files/blobs)
  • When a test successfully gathers and submits its metrics data, the test is marked as a Pass (sketched after this list)
  • Provide access to gathered metric data for developers
  • Do not parse it in the certification system or results tracker -- just gather and regurgitate
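
A rough sketch of that pass/fail rule, assuming a hypothetical submit_attachment() helper standing in for the real submission API; the result reflects only whether the blob was gathered and attached, never its contents:

#!/usr/bin/env python
# Minimal sketch of the Phase I pass/fail rule: the result reflects only
# whether the metric blob was gathered and attached, not what it contains.
# submit_attachment() is a hypothetical stand-in for the real submission API.
import os
import sys
import tarfile

def submit_attachment(tarball):
    # Placeholder: the real system would hand the blob to the results tracker.
    return os.path.getsize(tarball) > 0

def run_metric_test(metric_dir, tarball='metrics.tgz'):
    try:
        with tarfile.open(tarball, 'w:gz') as tar:
            tar.add(metric_dir)
        submitted = submit_attachment(tarball)
    except (OSError, tarfile.TarError) as error:
        print('FAIL: %s' % error)
        return 1
    print('PASS' if submitted else 'FAIL')
    return 0 if submitted else 1

if __name__ == '__main__':
    sys.exit(run_metric_test(sys.argv[1]))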

Provide access to the data without doing any parsing; a layout sketch follows this list.

  • Have a directory containing the data
  • .../bootchart/$machinename/$timestamp.tgz or .../$machinename/metrics/bootchart/$timestamp.tgz
    • linked from the certification 'librarian' so there is no need to copy the files around
  • Have a file containing metadata about the machine, available from the same view
    • this data is already available from the website and can be retrieved programmatically
    • information such as processor speed, ram, etc.
  • Only publish data for publicly released hardware
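
The layout sketched below assumes the second path variant ($machinename/metrics/bootchart/$timestamp.tgz) and a machine.json metadata file beside the blobs; the root directory, file names and metadata fields are illustrative only:

#!/usr/bin/env python
# Sketch of the proposed on-disk layout, assuming the variant
# .../$machinename/metrics/bootchart/$timestamp.tgz with a machine.json
# metadata file next to it.  Paths and field names are illustrative.
import json
import os
import shutil
import time

def publish_metric(root, machine, metric, tarball, machine_info):
    metric_dir = os.path.join(root, machine, 'metrics', metric)
    if not os.path.isdir(metric_dir):
        os.makedirs(metric_dir)
    timestamp = time.strftime('%Y%m%d-%H%M%S')
    destination = os.path.join(metric_dir, '%s.tgz' % timestamp)
    shutil.copy(tarball, destination)
    # Machine metadata (processor speed, RAM, ...) lives beside the blobs so
    # developers do not have to query the website separately.
    with open(os.path.join(root, machine, 'machine.json'), 'w') as meta:
        json.dump(machine_info, meta, indent=2)
    return destination

if __name__ == '__main__':
    publish_metric('/srv/metrics', 'machine-001', 'bootchart',
                   'bootchart.tgz', {'cpu_mhz': 1600, 'ram_mb': 2048})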

Phase I.I (Before Phase II)

  • Obtain validation from users that the information provided from Phase I is actually useful.
  • Lessons learned from this Phase might affect some of the points in Phase II.

Phase II (Later)

  • Store metric data in database as numeric value
    • Do not yet parse or analyze heuristically
  • Create a new test result state 'Data' to be used instead of Pass/Fail/Skip
  • Provide simple access for ad-hoc reporting
  • Generate alerts when metrics change by a sufficiently large amount or cross a threshold (see the sketch after this list)
  • Separate public and private results with access controls
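
A minimal sketch of the storage and alerting idea, assuming a plain SQLite table with a 'Data' state column and a relative-change threshold; the schema, metric names and threshold are illustrative only:

#!/usr/bin/env python
# Phase II sketch: each metric is stored as a single numeric value with a
# 'Data' state, and an alert fires when the new value deviates from the
# previous one by more than a relative threshold.
import sqlite3

def store_and_check(db_path, machine, metric, value, threshold=0.10):
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS results (
                        machine TEXT, metric TEXT, value REAL,
                        state TEXT DEFAULT 'Data',
                        recorded TIMESTAMP DEFAULT CURRENT_TIMESTAMP)""")
    previous = conn.execute(
        "SELECT value FROM results WHERE machine=? AND metric=? "
        "ORDER BY recorded DESC LIMIT 1", (machine, metric)).fetchone()
    conn.execute("INSERT INTO results (machine, metric, value) VALUES (?,?,?)",
                 (machine, metric, value))
    conn.commit()
    conn.close()
    if previous and previous[0] and \
            abs(value - previous[0]) / abs(previous[0]) > threshold:
        print('ALERT: %s/%s changed from %s to %s' %
              (machine, metric, previous[0], value))

if __name__ == '__main__':
    store_and_check('metrics.db', 'machine-001', 'boot-time-seconds', 32.5)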


CategorySpec
