Packages in Ubuntu should be tested automatically to ensure that they meet basic standards. At a minimum, packages should be installable and uninstallable, and the files they install should not conflict with those in other packages.

Various tools exist to ensure package quality and should be evaluated for regular package checks.

Release Note

Ubuntu is now automatically testing all packages in the Ubuntu repositories to ensure that they will install, upgrade, and remove cleanly and without error.


Packages should always be cleanly installable and uninstallable. If a user cannot install or remove a package without encountering an error, the package is effectively broken, which is a serious usability concern.

Automated testing to ensure that packages meet basic requirements will improve the quality of the software distributed by Ubuntu and ensure a clean user experience.
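The basic requirement described above can be sketched as a small shell helper. This is a minimal illustration only, not the actual test harness; smoke_test and status_is_installed are hypothetical names, and the apt-get calls assume root in a throwaway chroot or VM.

```shell
# Minimal install/remove smoke test (hypothetical helper names).

# Succeeds if a dpkg status string describes a fully installed package.
status_is_installed() {
    printf '%s\n' "$1" | grep -q 'install ok installed'
}

# Install, verify, then purge one package. Run as root in a disposable
# chroot or VM; any non-zero exit marks the package as failing the test.
smoke_test() {
    pkg=$1
    # Non-interactive so debconf questions cannot stall automation.
    DEBIAN_FRONTEND=noninteractive apt-get install -y "$pkg" || return 1
    status_is_installed "$(dpkg-query -W -f='${Status}' "$pkg")" || return 1
    # A clean package must also purge (remove including conffiles) cleanly.
    DEBIAN_FRONTEND=noninteractive apt-get purge -y "$pkg"
}

# usage (as root in a disposable environment): smoke_test hello
```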

User stories

  • Aaron uses Synaptic to install an application from one of the Ubuntu repositories. This package obsoletes an already installed package. Because the packages have been tested to ensure clean installation and uninstallation, the change from one package to another proceeds as expected and Aaron is able to use his newly installed software.
  • Bob uses Update Manager to upgrade his packages to the latest version. No installation or update errors are encountered as the packages have been checked to ensure that they upgrade cleanly.
  • Carrie is the packager for an application. She uploads a new version of the package and is notified if the package fails automated testing.


  • Packagers can be notified when package tests fail.


Once a package is built, an automatic package probe system will see the new package and assign it for testing on a test system. If these tests fail, the uploader will be notified that the package tests failed (and will be provided with the output from the tests).


Initial Step / Feasibility Test

As a first step, QA will set up a server with a chroot environment and conflictchecker. The packages already in the archive will be tested with conflictchecker and a list of conflicts will be created.
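The core of the file-conflict check can be illustrated with a short shell sketch. This only stands in for conflictchecker itself, which is the tool actually being evaluated; deb_files and common_paths are hypothetical names.

```shell
# List the file paths shipped in a .deb, directories excluded, sorted.
deb_files() {
    dpkg-deb --fsys-tarfile "$1" | tar -tf - | grep -v '/$' | sort
}

# Print paths that appear in both sorted lists: any output is a potential
# file conflict (unless the packages declare Conflicts/Replaces on each other).
common_paths() {
    comm -12 "$1" "$2"
}

# usage:
#   deb_files a.deb > a.list
#   deb_files b.deb > b.list
#   common_paths a.list b.list
```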

Automated Implementation

If the results of the initial step prove useful and the initial test works properly, QA will work with the Foundations team to develop a plan for conducting tests after package builds, possibly using virtual machines in the cloud.

The package prober is already in place as a part of the certification environment. Some extensions may be necessary to add individual package testing functionality; these extensions will be evaluated during the feasibility testing phase.

Test Extension

After the automated environment is working to test packages for file conflicts, further tests may be added by other teams (Foundations, Server). piuparts is the most likely candidate for use but the specific tool(s) to be used will be determined during this phase.
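A single-package piuparts run might be driven roughly as follows. This is a sketch under assumptions: run_piuparts is a hypothetical wrapper name, the tarball path is a placeholder, and the flags should be checked against the installed version's documentation.

```shell
# Wrapper for a single-package piuparts run (hypothetical helper name).
run_piuparts() {
    basetgz=$1  # pre-built chroot tarball (placeholder path below)
    pkg=$2
    # piuparts installs, upgrades, and purges the package in a throwaway
    # chroot and reports anything left behind or broken.
    sudo piuparts --basetgz "$basetgz" --log-file "$pkg.log" "$pkg"
}

# usage: run_piuparts /var/cache/piuparts/base.tgz hello
```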

Test/Demo Plan

Testing will be performed in phases as the system is implemented. As the true value of automated package testing is unknown, the initial phase itself is a test of feasibility for the specification as a whole.

  1. Initial testing will be done on an isolated server in a chroot environment. This will include testing the packages in the archive for file conflicts.
  2. During the second phase, testing will be performed by manually monitoring the tests on initial package builds.
  3. After an initial period of manual monitoring, audits of the test logs should be performed to ensure that the package tests are performing properly.

Unresolved issues

BoF agenda and discussion


  • Catch package issues more quickly


  • lintian
    • checks for package bugs
    • will not catch post-install script errors
    • already done by Debian, so might not give us a large bang for buck
  • apport
    • reports bugs when install scripts fail or crash
    • we have some data on this, but nobody has really fixed any of it
  • conflictchecker

  • piuparts
  • autopkgtest

Want to move piuparts sessions into the cloud

  • currently chroot-based
  • need to do this to do things like run services, do upgrades, etc.

Historically we get most bang for our buck by doing large-scale system testing

  • Install a system, do install/upgrade of a lot of packages, test
  • Is it now time to start testing at a package level?

Would be great to pool resources and do all this package testing together... Currently just have people doing it ad-hoc

When should we run these tools?

  • Package testing probably does not fall into Checkbox
  • Could this be tied into the build process? As part of build process, test install and uninstall of package
    • Need to run conflict-checker on packages
  • At times we need to touch every package in the archive
    • E.g. "If you call a certain function, how many applications use that function?"
    • Can we use a similar process?
      • Need a local mirror... use a machine to run this in the datacenter
  • Should run lintian on everything, whenever a package gets updated
    • Can specify lintian verbosity
    • we should probably start with just errors and leave warnings off to minimize data firehose
  • Should we set up some VMs and install a ton of packages?
    • we'd have to set the debconf level to critical or be overwhelmed by debconf questions
      • we'll still probably get some questions, so someone will have to monitor it
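The "start with just errors" idea above amounts to a trivial filter over lintian's output. errors_only is a hypothetical name; newer lintian versions can also filter by severity directly, but grepping the output works everywhere.

```shell
# Keep only error-severity tags, dropping the warning firehose.
errors_only() {
    grep '^E:'
}

# usage (assumes lintian is installed):
#   lintian somepackage.deb | errors_only
```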

When do we not want to run these?

  • Do not want to make it a requirement to go into the archive
    • Only block a package that fails when we are approaching a milestone?
    • Even if we don't block in a milestone, we should notify the developer

Where do we run these?

  • A QA server to be determined
    • will probably run in a VM


  • Run lintian on all packages in archive (once)
    • or how about only packages with Ubuntu delta, as we assume Debian tests all of theirs
  • After that run it whenever a package is checked in
    • watch archive for package updates
    • download and run lintian
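A hedged sketch of that watch-and-run loop, assuming "name version" snapshots have already been extracted from the archive's Packages indices; changed_packages is a hypothetical name, and the download command may differ between releases.

```shell
# Print packages that are new or have a changed version between two
# snapshots. Each snapshot is a sorted file of "name version" lines.
changed_packages() {
    comm -13 "$1" "$2" | awk '{print $1}' | sort -u
}

# usage sketch:
#   changed_packages old.list new.list | while read -r pkg; do
#       # fetch and check each updated package ("apt-get download" may be
#       # "aptitude download" on older releases)
#       apt-get download "$pkg" && lintian "$pkg"_*.deb
#   done
```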

Upgrade Testing
  • Create a VM of the previous release (once) and clone it every day
  • Upgrade to latest version of previous release
  • Dist-upgrade to latest release
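The in-guest part of those steps might look like the following sketch. upgrade_test is a hypothetical name, the codenames in the usage line are only examples, and the host-side VM creation and cloning are not shown.

```shell
# Run inside the cloned guest of the previous release (as root).
upgrade_test() {
    oldrel=$1  # previous release's codename
    newrel=$2  # development release's codename
    # 1. Bring the previous release fully up to date first.
    apt-get update && apt-get -y upgrade || return 1
    # 2. Point sources at the new release and dist-upgrade to it.
    sed -i "s/$oldrel/$newrel/g" /etc/apt/sources.list
    apt-get update && DEBIAN_FRONTEND=noninteractive apt-get -y dist-upgrade
}

# usage (example codenames): upgrade_test jaunty karmic
```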

First Step
  • Henrik Omma and Ronald McCollam will coordinate and learn how to use lintian

    • Coordinate with Steve Beattie and Juanje Ojeda from Guadalinex (juanje on Launchpad)
    • Further discussions reveal that conflictchecker would have a greater impact and would be a better start
  • Ronald McCollam will begin running these tests and evaluate how much value they provide

  • run tests as described in the "lintian" proposal and provide output on a webpage somewhere
    • Marc Tardif requests having conflict-checker as well
  • NB: There is similar work going on here:

    • we should coordinate work to avoid duplicating effort

Next Steps
  • include piuparts or autopkgtest
    • these tools may need some updates or work and bugfixes before they are useful
    • another issue here will be receiving a number of false positives
    • piuparts is/will be better maintained than autopkgtest for the immediate future and gives results that are more immediately relevant


QATeam/Specs/PackageTesting (last edited 2009-06-10 10:40:46 by cpc4-oxfd8-0-0-cust39)