ServerKarmicAutomatedKvmTesting

Revision 2 as of 2009-11-23 21:02:39


Summary

We should define a stack of functionality and performance tests and run them against every new revision of qemu-kvm, in various configurations.

Release Note

Ubuntu 10.04's KVM virtualization stack has undergone regular, automated testing.

Rationale

KVM is the virtualization hypervisor upon which we build the Ubuntu Enterprise Cloud, and as such a cornerstone technology. We can, and should, automate the testing of KVM.

User stories

  • As an administrator of a local installation of UEC, Katie wants to ensure the stability of her users' instances, which run in KVM virtual machines.

Assumptions

  • Canonical IS will provide some VT-capable hardware which can run these automated tests.
  • Canonical QA will instrument checkbox or some other testing framework to test KVM and report the results.
  • Canonical Server Team will respond and fix bugs found/filed by QA.

Design/Implementation/Test/Demo Plan

This specification is in fact mostly a Test Plan.

Unresolved issues

UDS/Karmic Raw Notes

  • Use Autotest
    • Wide support with Google, Red Hat, IBM, etc.
  • Use Step Files for different distros.
  • Currently use automated scripts to test for certain end products.
  • Ubuntu community should build Step Files, and get them committed to Autotest.
  • Autotest currently doesn't have a large number of functional tests.
    • Ubuntu could greatly expand these tests.
  • Do libvirt testing as well as KVM testing.
    • Add libvirt tests to Autotest.
  • libvirt-sim does some tests with Autotest.
  • Need much more functional testing for KVM.
  • From a user perspective, should we be testing Windows VMs?
  • What are Ubuntu's requirements for testing?
  • Guests to Support:
    • Supported versions of Ubuntu Desktop and Server i386, amd64 (Hardy, Jaunty, etc).
      • Secondarily Kubuntu, Xubuntu, etc.
    • RHEL, Fedora
    • SUSE, OpenSUSE
    • Windows -- Functional not so much performance.
      • Windows 2003 Server
      • Windows XP
      • Windows 7
  • Tests should hopefully find regressions.
  • Automated test procedure:
    • Start out with a guest that has a bash shell on a serial port.
    • Scripts to interact with the shell.
    • Can put cygwin on Windows to provide a bash shell.
    • Build a collection of images that we care about testing.
    • Determine which functional tests are important.
    • Every test has a timeout.
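The procedure above (scripts driving a bash shell on the guest's serial port, with a hard timeout on every test) could be sketched roughly as follows. This is a minimal illustration only: the commands, return codes, and timeout values are assumptions, and a local shell stands in for the guest's serial console.

```python
import subprocess

def run_test(cmd, timeout):
    """Run one functional test command; every test gets a hard timeout.

    In the real harness, cmd would be sent to a bash shell listening on
    the guest's serial port; here a local shell stands in for the guest.
    """
    try:
        result = subprocess.run(cmd, capture_output=True, text=True,
                                timeout=timeout)
    except subprocess.TimeoutExpired:
        return "TIMEOUT"
    return "PASS" if result.returncode == 0 else "FAIL"

if __name__ == "__main__":
    # A responsive guest passes quickly; a hung guest hits the timeout.
    print(run_test(["sh", "-c", "echo guest-alive"], timeout=5))
    print(run_test(["sh", "-c", "sleep 10"], timeout=1))
```

The same loop works for Windows guests once cygwin provides the bash shell, as noted above.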
  • Do performance testing by timing the entire process.
    • Should determine very quickly if there is a regression.
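Timing the entire run against a stored baseline, as suggested above, gives a cheap first-pass regression signal. A minimal sketch; the 20% tolerance is an assumption, not a figure from the session:

```python
def is_regression(elapsed, baseline, tolerance=0.20):
    """Flag a run whose total wall-clock time exceeds the baseline
    for a known-good qemu-kvm revision by more than the tolerance
    (20% here, by assumption)."""
    return elapsed > baseline * (1.0 + tolerance)

if __name__ == "__main__":
    baseline = 600.0  # seconds for the last known-good revision
    print(is_regression(730.0, baseline))  # ~22% slower: flagged
    print(is_regression(650.0, baseline))  # within tolerance
```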
  • Integrate current hardware certification testing into Autotest framework.
    • The framework is Open Source.
    • Some tests are provided by specific hardware vendor, and there may be issues integrating with Autotest.
  • Guest certification -- What level of testing is appropriate for an Ubuntu guest running on another distro like Fedora?
    • Would be treated as a native hardware platform.
    • Currently VMware is treated in this fashion.
    • An upstream release kvm-XX.XX is supported on Ubuntu if it passes a standard set of tests.
  • Most of the work needs to be done regarding developing the tests, not so much on which framework to use.
  • Test Times:
    • Alpha, Beta
    • Test at the commit level.
  • Deliverables:
    • Owe upstream Step Files for supported releases (i386 and amd64).
      • Preseed data submitted in conjunction with Step Files.
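A preseed file of the kind mentioned above might contain, at minimum, answers like the following; the values are illustrative, not the ones owed upstream:

```
d-i debian-installer/locale string en_US
d-i console-setup/ask_detect boolean false
d-i netcfg/get_hostname string kvm-guest
d-i partman-auto/method string regular
d-i partman/confirm boolean true
d-i passwd/user-fullname string Test User
d-i passwd/username string tester
d-i passwd/user-password password insecure
d-i passwd/user-password-again password insecure
d-i finish-install/reboot_in_progress note
```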
  • Where to run tests:
    • Upstream and Canonical will run tests concurrently.
  • Test suite that users can run to test their hardware.
  • How to test KVM plus the stack.
    • KVM with libvirt
    • libvirt with virt-manager
  • Determine better bug reporting process.
    • Reporting bugs upstream to KVM.


CategorySpec