DeveloperWeatherReport

Please check the status of this specification in Launchpad before editing it. If it is Approved, contact the Assignee or another knowledgeable person before making changes.

== Summary ==

Provide a dashboard covering all the variables that go into determining whether Ubuntu is release-ready - bug count, RC bug count, archive analysis reports, iso testing summaries, etc. - with links off to the detailed data behind each metric.

== Release Note ==

##This section should include a paragraph describing the end-user impact of this change. It is meant to be
##included in the release notes of the first release in which it is implemented. (Not all of these will
##actually be included in the release notes, at the release manager's discretion; but writing them is a
##useful exercise.)

== Rationale ==

This will give the release managers and the Distro Team an actual metric to help measure the quality of an upcoming release. Much of this data already exists today but is spread across different locations; we want to gather it into one central location.

== Use Cases ==

 * Release manager Bob wants to know which showstopping bugs exist and how many there are.
 * Mary on the distro team wants to know the current iso testing status and which areas could use more focus and help.

== Assumptions ==

 * The system should scale gracefully, i.e. it should be trivial to add a new data source and incorporate it into the dashboard display (see the sketch after this list).
 * Visual cues (e.g. color) should be used to show at a glance whether anything is outstanding.
 * Each metric should link to its detailed data, enabling viewers to drill down to the underlying information.
 * A distinction could be made between critical (blocking) indicators and other metrics (e.g. bug counts).
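
A minimal sketch of what "trivial to add a new data source" could mean in practice: a registry where each metric is one registered callback. All names below are hypothetical, not an agreed interface.

{{{
# Minimal sketch of a pluggable data-source registry; all names are
# hypothetical, not an agreed interface.
DATA_SOURCES = []

def register(name, link):
    """Adding a new metric to the dashboard is one @register(...) away."""
    def wrap(func):
        DATA_SOURCES.append((name, link, func))
        return func
    return wrap

@register("Showstopper bugs", "https://launchpad.net/ubuntu/+bugs")
def showstopper_bugs():
    bugs = []  # a real source would query Launchpad here
    return ("blocked" if bugs else "ok", ", ".join(bugs) or "None")

def render():
    for name, link, func in DATA_SOURCES:
        status, detail = func()
        print("%-25s %-8s %s  (%s)" % (name, status, detail, link))

render()
}}}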

== Design ==

=== Ideas (Release readiness / critical path) ===
 * display a list of high/critical milestoned (rc) bugs (Launchpad / python-launchpad-bugs)
 * iso testing (qa tracker)
 * isos up to date (archive, ISOs)
 * automatic ubiquity test (UbuntuSpec:ubiquity-vmware-automation) results (qa tracker?)
 * iso bug count (Launchpad / python-launchpad-bugs)
 * automatic upgrade testing result (http://people.ubuntu.com/~mvo/automatic-upgrade-testing/)
 * packages which have no current build (archive or Launchpad?)
 * failed builds (Launchpad?, include links to failed-build logs)
 * oversized CD images - which images are of interest? (cdimage - an *.OVERSIZED marker file shows up)
 * pending uploads in the queue that need to be approved (Launchpad or http://people.ubuntu.com/~ubuntu-archive/queue/)
 * installability of *-desktop packages and other archive inconsistencies (http://people.ubuntu.com/~ubuntu-archive/testing/)
 * Launchpad distrorelease status (Launchpad)
 * livefs up-to-dateness (cdimage)
 * mirror publishing (https://launchpad.net/ubuntu/+archivemirrors, https://launchpad.net/ubuntu/+cdmirrors)
 * meta-release index (http://changelogs.ubuntu.com/meta-release)
 * bittorrent (torrent.ubuntu.com:6969?) - make sure the tracker is seeding (see the sketch below)
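
Two of the items above - the oversized-image marker and the bittorrent tracker - can be probed with nothing but the standard library. A rough sketch; the .OVERSIZED URL construction is an assumption about the cdimage layout, not a confirmed path.

{{{
# Rough sketch probing two indicators from the list above.
import socket
import urllib.error
import urllib.request

def tracker_up(host="torrent.ubuntu.com", port=6969, timeout=5):
    """Crude liveness check: can we open a TCP connection to the tracker?"""
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

def image_oversized(image_base_url):
    """cdimage drops an *.OVERSIZED marker file next to an image that is
    too big; assume the marker sits at <image base URL>.OVERSIZED, so a
    404 for it means the image fits."""
    try:
        urllib.request.urlopen(image_base_url + ".OVERSIZED")
        return True
    except urllib.error.HTTPError:
        return False

print("bittorrent tracker reachable:", tracker_up())
}}}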

=== Ideas (other metrics) ===
 * quantity of duplicate bugs reported (Launchpad / python-launchpad-bugs)
 * determine bugs of interest based on subscriber or duplicate count (Launchpad / python-launchpad-bugs)
 * open bugs with patches attached (Launchpad / python-launchpad-bugs)
 * percentage change of iso testing passes/fails
 * merge status (http://merges.ubuntu.com/stats.txt)
  * Number of merges needed for different areas (e.g. Desktop, Xorg)
 * sync status (during freeze periods, including link to diff) to see which Debian RC bugs could be fixed by a sync
 * http://popcon.ubuntu.com/ - more for bug statistics
 * upstream version currency (some code in http://dehs.alioth.debian.org/ and svn://svn.debian.org/qa/trunk/)
 * outstanding archive requests: syncs, removals, ... (Launchpad / python-launchpad-bugs)
  * https://launchpad.net/~ubuntu-archive/+subscribedbugs?field.searchtext=sync
  * https://launchpad.net/~ubuntu-archive/+subscribedbugs?field.searchtext=remov
 * Language pack status (update package versions should be identical to the -base ones; in other words, update packages are empty)
 * GnomeAppInstallDesktopDatabaseUpdate: is http://people.ubuntu.com/~mvo/gnome-app-install/ up to date?
 * ReadaheadListUpdate: has the /etc/readahead/boot file in the readahead-list package been updated in the last week?
 * a count of lintian errors of carefully selected error classes - not everything is worth worrying about (see the sketch below)
  * liw will be running lintian on Ubuntu, at least main, and will make such a selection list anyway
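
Counting only a curated set of lintian tags could look like the sketch below; the tag selection here is illustrative, not liw's actual list.

{{{
# Sketch: count lintian errors restricted to a curated tag list.
# The tags chosen here are illustrative only.
import subprocess
from collections import Counter

SELECTED_TAGS = {"binary-without-manpage", "no-copyright-file"}

def count_selected_errors(package_path):
    out = subprocess.run(["lintian", package_path],
                         capture_output=True, text=True).stdout
    counts = Counter()
    for line in out.splitlines():
        parts = line.split()  # lintian emits e.g. "E: foo: some-tag extra"
        if len(parts) >= 3 and parts[0] == "E:" and parts[2] in SELECTED_TAGS:
            counts[parts[2]] += 1
    return counts
}}}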

=== Implementation ===
 * all data should perhaps be tracked historically, but not in the first version
  * but keep all output files (data files?) in bzr
  * low-priority for implementation
 * package build failures will probably want a separate report for main / restricted / universe / multiverse
  * https://launchpad.net/ubuntu/hardy/+builds?build_text=&build_state=failed
  * http://people.ubuntu.com/~ubuntu-archive/testing/hardy_outdate.txt has FTBFS+NBS+NEEDSBUILD for main
  * http://people.ubuntu.com/~ubuntu-archive/NBS/ has NBS for all components
 * https://launchpad.net/ubuntu/hardy/+queue?queue_state=0 - new packages to consider in release queue
 * https://launchpad.net/ubuntu/hardy/+queue?queue_state=1 - release queue packages which need to go through the 7-step SRU process
 * or http://people.ubuntu.com/~ubuntu-archive/queue/
 * have some sort of distinction in the reports (e.g. main, restricted, universe, multiverse)
 * there should be an indication as to when the stat was last updated as not all reports are run at the same time
 * XML-RPC interface? or scrape HTML
  * Scraping HTML will be very brittle
  * Have each source output javascript object notation (JSON)
   * Each data source can then be "syndicated" at a given location (e.g. http://.../.../datasource1.js) and updated at whatever frequency makes most sense for that particular data source (hourly, daily, etc.); a minimal sketch follows this list
   * Can still use scraping for areas that do not output this format, but have those tools each output JSON for inclusion into the interface
   * The weather report page can then re-load these objects asynchronously, so the page will update itself without needing a manual reload
 * how is severity determined?
  * package dependent
 * restrict output to problems, hide "green status"
 * Metrics from http://qa.debian.org/developer.php?login=debian-gcc%40lists.debian.org&comaint=yes (and DEHS)
  * upstream version
  * debian version
  * bug counts
 * buildd status (including age of failed build)
  * https://launchpad.net/ubuntu/hardy/+builds et al (can select by state)
  * http://merges.ubuntu.com/stats.txt
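
The JSON "syndication" idea above might be no more than a cron job writing a timestamped object per metric. A minimal sketch; the file name and field names are hypothetical, not a settled schema.

{{{
# Minimal sketch of a JSON-syndicating data source.  A cron job runs this
# at whatever frequency suits the metric; the weather-report page fetches
# the file asynchronously and can display the "updated" stamp, since not
# all reports are run at the same time.
import json
import time

def publish(path, status, detail):
    payload = {
        "status": status,  # e.g. "ok" / "warning" / "blocked"
        "detail": detail,
        "updated": time.strftime("%Y-%m-%d %H:%M UTC", time.gmtime()),
        "updated_epoch": time.time(),
    }
    with open(path, "w") as f:
        json.dump(payload, f)

publish("failed-builds.json", "warning", "5 pending package builds")
}}}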

=== Example Prototype (mdz) ===
|| '''Metric''' || '''Status''' ||
|| Launchpad release status ||<bgcolor="#ccffcc"> FROZEN ||
|| Pending uploads ||<bgcolor="#ccffcc"> Empty ||
|| Failed package builds ||<bgcolor="#ccffcc"> None ||
|| Pending package builds ||<bgcolor="#bb3333"> 5 ||
|| Pending livefs builds ||<bgcolor="#ccffcc"> None ||
|| Archive inconsistencies ||<bgcolor="#bb3333"> 3 ||
|| Outdated installation media ||<bgcolor="#bb3333"> desktop, alternate ||
|| Showstopper bugs ||<bgcolor="#bb3333"> #1234, #2345 ||
|| Automated installation tests ||<bgcolor="#ccffcc"> Successful (current build) ||
|| Manual installation tests ||<bgcolor="#ffffcc"> Successful (build 20071031.2) ||
|| Pre-published mirrors ||<bgcolor="#ccffcc"> 0 ||
|| Meta-release index ||<bgcolor="#ccffcc"> Pending ||
|| Bittorrent ||<bgcolor="#ccffcc"> Pending ||
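
Rows like those above could be generated straight from the syndicated JSON. A sketch that emits the same MoinMoin markup, reusing the color values from the prototype table:

{{{
# Sketch: emit MoinMoin table rows like the prototype above, coloring
# the status cell by severity (colors taken from the prototype table).
COLORS = {"ok": "#ccffcc", "warning": "#ffffcc", "blocked": "#bb3333"}

def table_row(metric, status, detail):
    return '|| %s ||<bgcolor="%s"> %s ||' % (metric, COLORS[status], detail)

print("|| '''Metric''' || '''Status''' ||")
print(table_row("Pending package builds", "blocked", "5"))
print(table_row("Manual installation tests", "warning",
                "Successful (build 20071031.2)"))
}}}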
=== Logistics ===
 * How often do we want the reports generated?
  * hourly for many components; sections regarding CDs should be refreshed more frequently
  * how far back do we want these reports archived?
 * Will the site poll for data, or will data be pushed? (a staleness sketch follows this list)
 * Will we display results at the new QA website qa.stgraber.com or on Launchpad?
  * more likely on qa.ubuntu.com
 * Also, once we have test automation infrastructure running, we'll probably want to display this info as well.
 * Ideally viewers should be able to filter the stats they want to view and save(?) preferences
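
If the page polls JSON files that are regenerated on different schedules, each metric can carry its own freshness budget and be flagged when stale. A sketch; the intervals and file names are hypothetical.

{{{
# Sketch: flag a metric as stale when its JSON file is older than the
# refresh interval expected for it (intervals and names hypothetical).
import json
import time

MAX_AGE = {
    "iso-status.json": 15 * 60,  # CD-related: refreshed more often
    "bug-counts.json": 60 * 60,  # most metrics: hourly
}

def is_stale(path):
    with open(path) as f:
        updated = json.load(f)["updated_epoch"]
    return time.time() - updated > MAX_AGE[path]
}}}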

== UI Changes ==

##Should cover changes required to the UI, or specific UI that is required to implement this
There is a risk the interface could get very "cluttered", especially if we wish to show historical information (graphs, etc.). One graphical technique for displaying such information with minimal clutter is the "sparkline" (http://en.wikipedia.org/wiki/Sparkline), which shows a historical trend in a very small amount of space.

It may also help to have collapsed sections that can be expanded to reveal the next level of detail. For instance, one line may show the total number of new, open, etc. bugs, but expand into a further breakdown (per milestone, per area, over time, and so on).
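
Sparklines need not even require images. A sketch that renders a value series (say, the open-bug count over the last week) as Unicode block characters:

{{{
# Sketch: render a value series as a one-line Unicode sparkline.
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1
    return "".join(BARS[(v - lo) * (len(BARS) - 1) // span] for v in values)

print(sparkline([120, 135, 160, 150, 90, 60, 45]))  # prints ▅▆█▇▃▁▁
}}}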

== Code Changes ==

##Code changes should include an overview of what needs to change, and in some cases even the specific details.

== Migration ==

== Test/Demo Plan ==

##It's important that we are able to test new features, and demonstrate them to users. Use this section to describe a short plan that anybody can follow that demonstrates the feature is working. This can then be used during CD testing, and to show off after release.

== Outstanding Issues ==

##This should highlight any issues that should be addressed in further specifications, and not problems with the specification itself; since any specification with problems cannot be approved.

== BoF agenda and discussion ==

Use this section to take notes during the BoF; if you keep it in the approved spec, use it for summarising what was discussed and note any options that were rejected.


CategorySpec
