ARMTestsuitesAndProfilers


UNDER CONSTRUCTION

== Summary ==

The purpose of this blueprint is to discuss, decide on, and prioritize the testsuites and profiling tools that would be most useful to run on ARM devices in an automated testing environment. The focus should be on what each provides, whether it works on ARM, and how long it takes to run.

== Release Note ==

No user visible change.

== Rationale ==

There are a lot of well-known testsuites already available that will cover much of the functional and performance testing we wish to perform. If these tests work and perform adequately on ARM, it makes sense to run them and track the results.

== Assumptions ==

 * A testing framework will exist for executing the tests against target ARM devices
 * A server for storing and tracking results of the tests will exist

== Implementation ==

Performance and functional tests from several testsuites will first be attempted and measured to see how well suited they are to running on ARM:

 * LTP
 * Phoronix
 * Autotest

Additionally, testsuites from ARM and SoC vendors will be investigated to see if any of these can be made available, and whether or not it would make sense to run them as well.

Tests should either be broken out of their original framework so that they can run under a different one, or a wrapper should be written that allows a test to run under its original framework while its results are extracted.

OProfile and perf-events profiling will be attempted on ARM to determine how usable they are, and wrappers should be written to allow running tests under these profilers.
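For the perf-events side, a wrapper would typically run the workload under `perf stat -x,` and parse the machine-readable counter output. A small sketch of that parsing step (the field layout assumed here, value/unit/event-name, varies between perf versions, so treat this as illustrative):

```python
def parse_perf_stat_csv(text):
    """Parse `perf stat -x,` style output into {event_name: value}.

    Assumes the common field layout: value, unit, event-name, ...
    Lines whose first field is not numeric (e.g. "<not supported>",
    comments, blanks) are skipped.
    """
    counters = {}
    for line in text.splitlines():
        fields = line.split(",")
        if len(fields) < 3 or not fields[0]:
            continue
        try:
            value = float(fields[0])
        except ValueError:
            continue  # non-numeric first field: unsupported counter, header, etc.
        counters[fields[2]] = value
    return counters
```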

=== Code Changes ===

Some tests have already been identified that will require changes to enable them to work on ARM. These changes should be fairly minimal in most cases.

== Test/Demo Plan ==

All identified tests will be successfully run and baselined on an ARM device before going into any automated test execution framework.

== BoF agenda and discussion ==

Blueprint: https://blueprints.edge.launchpad.net/ubuntu-arm/+spec/arm-m-testsuites-and-profilers

Testsuites and profilers to run in an automated environment, primarily on ARM

  - daily or more often
  - up-to-date images
  - Good coverage without long runtime
  - System level testing

LTP

  - mostly work on ARM - a few tests need fixing, some will be trivial
  - actively maintained
  - can blacklist tests
  - can be hours to run
  - Paul is considering just running the syscall tests to start with
  - qemu isn't really viable yet for this
    - Nicolas suggests that some failures will be due to cache coherency, but
      qemu doesn't emulate that
    - may be useful to run it alongside as testing of qemu
  - ACTION: Paul to investigate a base set of tests to run from LTP - may need
      help to debug some of the more interesting failures.

Multiple SoCs to test

  - Paul is interested in testsuites specific to particular SoCs.
    - Knowing about internal ones and asking them to run them when
      we integrate their changes could be useful
    - Asking them to open them and so be able to build on them would be good
    - Engaging with SoC vendors so they can run their internal tests on Ubuntu
      components and report back results could be productive.  This also applies
      to SoC-specific graphics hardware testing etc.
    - Freescale has one that tests codecs, Scott knows more.

Phoronix

  - Performance testing.
  - lmbench is similar
  - some obvious issues, like trying to use an x86 config in the kernel compile test.
  - extend to record the variance of multiple runs
        (PTS already records the variance of multiple test runs)
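The "record the variance" item above amounts to a small statistic over repeated benchmark runs. A minimal sketch of what the automation would compute (sample variance, since the runs are a sample of the machine's behavior):

```python
def run_stats(samples):
    """Mean and sample variance of repeated benchmark run times/scores."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, variance

# e.g. three runs of a benchmark, in seconds:
# run_stats([10.0, 12.0, 11.0]) -> (11.0, 1.0)
```

A high variance relative to the mean would flag a result as too noisy to compare across images.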

Want boot time testing, is running with bootchart regularly enough?
  - doesn't capture e.g. u-boot time.

* ACTION: dmart - Feed back on any testsuites or benchmarks ARM has which are
    relevant
* ACTION: dmart - Feed back on additional ARM sanity-checks we might want to 
    create.
* ACTION: dmart/will-deacon - Feed back on what ARM hardware profiling events
    would be valuable to track.


Profilers

  - Getting an oprofile dump of benchmarks could help in isolating what changed.
  - May not want to trace every run


  - perf events would be better to base on than oprofile


CategorySpec

Specs/M/ARMTestsuitesAndProfilers (last edited 2010-06-03 16:13:14 by 75-27-138-126)