The purpose of this blueprint is to document and prioritize the test suites and profiling tools to run on ARM devices in an automated testing environment.

Release Note

No user-visible change.


There are several well-known test suites already available that will cover much of the functional and performance testing we wish to perform. Many of these tests work and perform adequately on ARM, so it makes sense to run them regularly and track the results.


  • A testing framework will exist for executing the tests against target ARM devices
  • A server for storing and tracking results of the tests will exist


Performance and functional tests from several testsuites will first be attempted and measured to see how well suited they are to running on ARM.

Additionally, testsuites from ARM and SoC vendors will be investigated to see if any of these can be made available, and whether or not it would make sense to run them as well.

Tests should either be broken out from the test framework so that they can run under a different framework, or a wrapper should be written that will allow the test to run under the original framework and the results extracted.
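One way to sketch such a wrapper is to run the suite under its native framework and scrape the results from its output. This is a minimal illustration only: the PASS/FAIL patterns and the stand-in command are assumptions, since each real framework prints results in its own format and needs its own parser.

```python
import re
import subprocess
import sys

def run_wrapped(cmd, pass_pattern=r"\bPASS\b", fail_pattern=r"\bFAIL\b"):
    """Run a test suite under its original framework and extract results.

    The PASS/FAIL patterns are placeholders for illustration; a real
    wrapper needs a parser matched to the framework's output format.
    """
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return {
        "passed": len(re.findall(pass_pattern, proc.stdout)),
        "failed": len(re.findall(fail_pattern, proc.stdout)),
        "exit_code": proc.returncode,
    }

# Stand-in command that mimics a framework's result output:
result = run_wrapped([sys.executable, "-c",
                      "print('test1 PASS'); print('test2 FAIL'); print('test3 PASS')"])
```

The extracted dictionary is what would be posted to the results-tracking server.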

OProfile and perf events profiling will be attempted on ARM to determine how usable they are, and wrappers should be written to allow running tests under these profilers.
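For perf, such a wrapper can be as simple as prepending the profiler invocation to the test command; a sketch follows. Only the perf case is shown, since OProfile's opcontrol start/stop model would need a stateful wrapper around the test rather than a command prefix.

```python
def profile_cmd(test_cmd, data_file="perf.data"):
    """Wrap a test command so it runs under 'perf record'.

    Sketch only: wrapping OProfile would instead need
    'opcontrol --start' / 'opcontrol --stop' around the test run.
    """
    return ["perf", "record", "-o", data_file, "--"] + list(test_cmd)

# Hypothetical test binary name, for illustration:
cmd = profile_cmd(["./run_test", "--quick"])
```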

Code Changes

Some tests have already been identified that will require changes to enable them to work on ARM. These changes should be fairly minimal in most cases.

Test/Demo Plan

All identified tests will be successfully attempted and baselined on an ARM device before going into any automated test execution framework.

BoF agenda and discussion

Testsuites and profilers to run in an automated environment, primarily on ARM

  - daily or more often
  - up-to-date images
  - good coverage without long runtime
  - system-level testing


  - mostly work on ARM - a few tests need fixing, some will be trivial
  - actively maintained
  - can blacklist tests
  - can be hours to run
  - Paul is considering just running the syscall tests to start with
  - qemu isn't really viable yet for this
    - Nicolas suggests that some failures will be due to cache coherency, but
      qemu doesn't emulate that
    - may be useful to run it alongside as testing of qemu
  - ACTION: Paul to investigate a base set of tests to run from LTP - may need
      help to debug some of the more interesting failures.
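Paul's plan above (run only the syscall tests, blacklist known-bad cases) maps onto runltp's -f scenario and -S skip-file options. A hedged sketch, where the skipped test names are merely examples:

```python
def ltp_invocation(skip_tests, scenario="syscalls", skip_file="skiplist"):
    """Build a runltp command line that runs one scenario file and
    blacklists tests via runltp's -S skip-file option.

    Sketch only: paths and skipped test names are examples, not a
    vetted blacklist.
    """
    with open(skip_file, "w") as f:
        f.write("\n".join(skip_tests) + "\n")
    return ["./runltp", "-f", scenario, "-S", skip_file]

cmd = ltp_invocation(["fork13", "msgctl11"])  # example test names
```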

Multiple SoCs to test

  - Paul is interested in testsuites specific to particular SoCs.
    - Knowing about vendors' internal testsuites, and asking them to run
      those when we integrate their changes, could be useful
    - Asking them to open those testsuites so we can build on them would be good
    - Engaging with SoC vendors so they can run their internal tests on Ubuntu
      components and report back results could be productive.  This also applies
      to SoC-specific graphics hardware testing etc.
    - Freescale has one that tests codecs, Scott knows more.


  - Performance testing.
  - lmbench and similar benchmarks
  - some obvious issues, like trying to use an x86 config in the kernel compile test.
  - extend to record the variance of multiple runs
        (PTS already records the variance of multiple test runs)
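Recording the variance could be as simple as keeping the per-run numbers and summarizing them, so a regression is judged against run-to-run noise rather than a single measurement. A minimal sketch with invented timings:

```python
import statistics

def summarize_runs(times):
    """Summarize repeated benchmark runs (times are example values).

    Sample variance gives a noise estimate to compare any
    apparent regression against.
    """
    return {
        "runs": len(times),
        "mean": statistics.mean(times),
        "variance": statistics.variance(times),  # sample variance
    }

summary = summarize_runs([12.1, 11.9, 12.3])
```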

Want boot time testing; is running bootchart regularly enough?
  - doesn't capture e.g. u-boot time.

  - ACTION: dmart - Feed back on any testsuites or benchmarks ARM has which are
  - ACTION: dmart - Feed back on additional ARM sanity-checks we might want to
  - ACTION: dmart/will-deacon - Feed back on what ARM hardware profiling events
      would be valuable to track.


  - Getting an oprofile dump of benchmarks could help in isolating what changed.
  - May not want to trace every run

  - perf events would be better to base on than oprofile
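The "isolating what changed" idea above can be sketched as a diff between two per-symbol profiles. The symbol shares below are invented; in practice they would be parsed from perf report (or oprofile) output.

```python
def profile_delta(before, after, threshold=1.0):
    """Diff two {symbol: percent-of-samples} profiles and keep only
    symbols whose share moved by at least `threshold` points.

    Sketch only: real input would come from parsing profiler reports.
    """
    symbols = set(before) | set(after)
    return {
        sym: round(after.get(sym, 0.0) - before.get(sym, 0.0), 2)
        for sym in symbols
        if abs(after.get(sym, 0.0) - before.get(sym, 0.0)) >= threshold
    }

# Example data: memcpy's share jumped between two benchmark runs.
delta = profile_delta({"memcpy": 10.0, "main": 5.0},
                      {"memcpy": 14.5, "main": 5.2})
```

Symbols whose share barely moved (here, main) are filtered out, leaving only candidates worth investigating.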


Specs/M/ARMTestsuitesAndProfilers (last edited 2010-06-03 16:13:14 by pwlars)