Launchpad Entry: other-p-plusonemaint-infrastructure
We have many reports that help us keep track of automatically-detectable problems in the development release (FTBFS, NBS, component-mismatches, the conflict checker, the transition tracker, etc.). These are all well and good, but they are rather disconnected from each other and in many cases do not provide very good facilities for distributing work among developers. If we want to drive these reports consistently to zero, some time spent on infrastructure would be worthwhile. What can we do to improve matters?
This is not a user-visible specification.
We have lots of tools for tracking problems in the development release. Many have difficult-to-parse output:
component-mismatches
- plain text files
- no annotations, no bug links
NBS
- giant text files, no annotations, no bug links, no ability to blacklist
FTBFS
- no bug links
transition tracker
- needs an overview index
- should be merged from Debian trunk
test rebuilds
- no bug links (not needed for simple rebuilds, but often ties into build failures)
- should show build failures more explicitly
UEHS (ubuntuwire) - is anyone still using it? (more useful from a Debian perspective)
- out of scope for the +1 maintenance team
As general principles, reports should be better connected to each other; they should live in (or be linked to from) one place; and they should be easily perusable.
A previous attempt at attacking this problem was in the form of the weather report (old, new; the new version has rather less detail). This was intended as a release management dashboard. Release management and the +1 maintenance team have different goals here in terms of the level of detail required, and it seems unlikely that first-pass work will be able to accommodate both in a single tool.
Ideally, we would like a consolidated report listing packages with problems, rather than having to work through multiple reports. A good base for this appears to be Daniel Holbach's Harvest tool, which already supports taking input from many data sources, limiting the view to the sources you are interested in, and marking items as irrelevant. We will extend it to meet our needs, adding data sources as we go; for example, links to NBS data could be improved, and it needs to list build failures from test rebuilds. Editing (adding comments) appeared to be broken when we tested it at UDS, and should be fixed.
(For comparison, it's worth looking at the LP QA dashboard.)
We will improve the various *-mismatches reports and the conflict checker to use HTML rather than text files, and to allow annotations and bug links.
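As an illustration of the kind of output change involved, here is a minimal sketch of turning plain-text report lines into annotated HTML with bug links. The "LP: #NNNN" convention, the page layout, and the function names are assumptions made for the example, not the real report generators' formats.

```python
import html
import re

# Illustrative convention: report lines mention bugs as "LP: #NNNN".
LP_BUG = re.compile(r"LP: #(\d+)")

def annotate_line(line):
    """Escape one report line and turn 'LP: #NNNN' references into
    Launchpad bug links."""
    escaped = html.escape(line)
    return LP_BUG.sub(
        r'<a href="https://bugs.launchpad.net/bugs/\1">LP: #\1</a>', escaped
    )

def report_to_html(title, lines):
    """Wrap annotated report lines in a minimal HTML page."""
    body = "\n".join("<li>%s</li>" % annotate_line(l) for l in lines)
    return (
        "<html><head><title>%s</title></head>\n"
        "<body><h1>%s</h1>\n<ul>\n%s\n</ul>\n</body></html>"
        % (html.escape(title), html.escape(title), body)
    )

print(report_to_html("component-mismatches",
                     ["foo 1.2-1 wants to move to main (LP: #123456)"]))
```

The same approach extends naturally to annotations: once the report is HTML, a per-line comment or blacklist marker is just another element to render.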
We will update the transition tracker from Debian trunk, and add an overview index.
QA is taking over Michael Vogt's automatic upgrade testing, but the +1 maintenance team will need to process the output. It remains to be seen what that output will look like.
Evan Broder has volunteered to set up an Ubuntu instance of Lintian.
After CD building moves to its new machine, we will repurpose the old machine (antimony) as a piuparts runner.
Graphs over time
It would be helpful to graph some of these categories of problems over time, so that we can see whether any particular areas are falling behind and work out objectively whether we need more people. Some graphs that would be helpful:
- Build failures (all; main; by package set)
- Ditto, but in the latest test rebuild
- component-mismatches / component-mismatches-proposed size
- NBS size
- Completion percentage of transition trackers
Unless otherwise stated, we don't currently have the data needed to start building graphs, so the next stage in each of these is to cause the jobs that generate them to emit the necessary time series data.
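As a sketch of what emitting that time series data might look like, each report job could append one dated row per category to a CSV file at the end of its run. The record_datapoint helper, the file name, and the schema here are hypothetical, chosen only to illustrate the idea.

```python
import csv
import datetime
import os

def record_datapoint(path, category, count, when=None):
    """Append one (date, category, count) row to a CSV time series,
    writing a header row if the file is new. The schema is a sketch;
    the real jobs can emit whatever the graphing tool ends up needing."""
    when = when or datetime.date.today()
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "category", "count"])
        writer.writerow([when.isoformat(), category, count])

# e.g. at the end of a (hypothetical) FTBFS report job:
record_datapoint("ftbfs.csv", "main", 17)
```

Keeping the format this simple means any plotting tool can consume it later, and the jobs need no knowledge of how the graphs are drawn.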