2010-05-27

   1 [19:01] <mdeslaur> So...is it time?
   2 [19:02] <ayan> i think so!
   3 [19:02] <mdeslaur> Great! I'll start then
   4 [19:02] <mdeslaur> Hi, I'm Marc from the Ubuntu Security Team. Today I will be talking about producing security updates for stable Ubuntu releases.
   5 [19:03] <mdeslaur> This won't be a hands on session, rather, it will be an informative talk about the environments and tools the security team uses, and how those tools help us to produce updates.
   6 [19:03] <mdeslaur> This information can be used as a reference to set up your own environment to produce updates for all the stable releases.
   7 [19:04] <mdeslaur> First, an overview of our setup, and after I will explain how you can set it up yourselves.
   8 [19:04] <mdeslaur> Since we currently produce security updates for Dapper, Hardy, Jaunty, Karmic, Lucid and Maverick, we need to be able to build and test in all those environments.
   9 [19:05] <mdeslaur> The security team uses KVM virtual machines for testing, and schroots with sbuild for building and testing. Our schroots used to use LVM snapshots, but now with Lucid we use aufs schroots, which are easier to set up.
  10 [19:05] <mdeslaur> Because we test all our security updates on amd64 and i386 before publishing, we use amd64-capable hardware and the amd64 version of Ubuntu. That way, we can use both amd64 and i386 virtual machines and schroots. Of course, this isn't a requirement to build security updates, but helps in making sure we are not introducing platform-specific regressions.
  11 [19:06] <mdeslaur> Releasing security updates is very different from uploading a package to a dev release, where system breakage and regressions are part of the development process. If you upload a package to the dev release and something breaks, you can simply upload a package that fixes the breakage.
  12 [19:06] <mdeslaur> Users of the dev release are mostly technical people who readily accept breakage as a compromise for running the latest and greatest.
  13 [19:07] <mdeslaur> Once we press the figurative big red button to release a security update to a stable release, ~12 million Ubuntu users will be installing it in the next few hours. This is a user base that has zero tolerance for updates breaking their system, and any regressions are likely to reflect badly on Ubuntu's reputation.
  14 [19:07] <mdeslaur> Regression testing is _the_ most important part of producing security updates and is what takes up most of the time.
  15 [19:07] <mdeslaur> Now, back to setting up the environment:
  16 [19:08] <mdeslaur> My colleague jdstrand wrote up an excellent wiki page on how to set up a virtual machine environment like the security team uses. It is available here: https://wiki.ubuntu.com/SecurityTeam/TestingEnvironment
  17 [19:09] <mdeslaur> Basically, we have a tool called vm-new that is a wrapper script around soren's excellent vmbuilder tool. With this, we create a whole set of "clean" virtual machines, one for i386 and one for amd64, for each Ubuntu release.
   18 [19:09] <mdeslaur> Once we have the "clean" virtual machines set up, we copy them over to a temporary set of VMs that we can install stuff in, run our testing scripts on, and ultimately destroy as we see fit. Once we've tested our updates, we simply erase the temporary set of VMs and use a script to copy the "clean" ones over again.
  19 [19:10] <mdeslaur> Doing this allows us to always start with a "known good" image, and allows us to reproduce our tests at will.
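A rough sketch of creating one such "clean" KVM image directly with vmbuilder (vm-new wraps this with the team's preferred defaults; the hostname, sizes and destination below are illustrative assumptions):

    sudo vmbuilder kvm ubuntu --suite lucid --arch amd64 --flavour virtual \
        --hostname sec-lucid-amd64 --mem 512 --rootsize 8192 \
        --libvirt qemu:///system --dest /vm/clean/sec-lucid-amd64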
  20 [19:10] <mdeslaur> Personally, I always get a transparent mouse working in my "clean" virtual machines. I do this by installing the xserver-xorg-input-vmmouse package.
  21 [19:11] <mdeslaur> With lucid, this works out of the box, but in previous releases you needed to configure an adequate xorg.conf file. For dapper, there was no xserver-xorg-input-vmmouse package in the archive, but you can find one I made for it in my PPA here: https://launchpad.net/~mdeslaur/+archive/ppa?field.series_filter=dapper
  22 [19:11] <mdeslaur> To set up the schroots, my colleague jdstrand has again written an excellent wiki page here: https://wiki.ubuntu.com/SecurityTeam/BuildEnvironment
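One way to create such build chroots, assuming the mk-sbuild helper from ubuntu-dev-tools (the wiki page describes the full setup; release and architectures here are just examples):

    mk-sbuild --arch=i386 hardy
    mk-sbuild --arch=amd64 hardy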
  23 [19:12] <mdeslaur> Once the schroots are set up, we can simply launch a command like "schroot -c jaunty-amd64 -u root" to enter a jaunty schroot and test utilities and proofs of concept, run some testing scripts, or try to reproduce an issue someone has reported.
  24 [19:12] <mdeslaur> Are there any questions so far?
  25 [19:13] <mdeslaur> ok, moving on...
  26 [19:13] <mdeslaur> To ease the process we use when building security updates, and to mimic the build process of the official build servers, we wrote a tool called "umt". This is the main tool we use daily to produce security updates, and it requires having schroots set up, and a few other dependencies.
  27 [19:14] <mdeslaur> There are instructions on the following wiki page that detail some of the steps necessary to set it up: https://wiki.ubuntu.com/SecurityTeam/BuildEnvironment
  28 [19:14] <mdeslaur> The umt tool has the following commands: search, download, changelog, source, binary, build, build-orig, sign, check, compare-log, compare-bin, repo, upload. These commands illustrate basically the whole process we use to produce an update.
  29 [19:15] <mdeslaur> search: This command will search the repos and tell us what the best source package is for each release. This will also warn us when a newer version is currently in -proposed.
  30 [19:15] <mdeslaur> Security updates are always built using the latest -updates release, and not the -proposed pocket, but we need to know about what is in -proposed so we can make sure our package version will trump it and make sure we add a note to the -proposed package's bug report.
  31 [19:16] <mdeslaur> download: this command will download a particular package for each release. For example, if I type "umt download gedit", it will create a directory for each release, download the corresponding source package for that release and unpack it. In one simple command, I have source trees for all releases we need to patch.
  32 [19:16] <mdeslaur> Once I've searched for a package and downloaded it, I'm ready to patch it.
  33 [19:17] <mdeslaur> What's special about producing security updates is that we get to work on every single package in Ubuntu. We need to learn _every_ patch system in the archive.
  34 [19:17] <mdeslaur> Luckily, my colleague kees and a few others wrote a great tool called "what-patch", available in the ubuntu-dev-tools package that simplifies trying to figure out what patch system a particular package uses. I simply need to enter the source tree and launch the "what-patch" command to see what patch system is being used.
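For example, from inside a hypothetical quilt-based source tree:

    $ cd gedit-2.30.3/
    $ what-patch
    quilt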
  35 [19:18] <mdeslaur> From there, I use dpatch-edit-patch, cdbs-edit-patch, quilt-edit-patch or other similar tools to quickly apply and adjust security patches to the package. I tag all patches as per DEP3: http://dep.debian.net/deps/dep3/
  36 [19:19] <mdeslaur> Tagging patches really helps to figure out where a patch came from in case of a regression. It also helps a lot when reviewing other people's contributions.
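A minimal DEP3 header at the top of a patch looks roughly like this (all values below are hypothetical):

    Description: fix buffer overflow in foo_parse()
    Origin: backport, http://git.example.org/commit/abc123
    Bug-Ubuntu: https://launchpad.net/bugs/000000
    Forwarded: not-needed
    Author: Jane Doe <jane@example.com>
    Last-Update: 2010-05-27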
  37 [19:19] <mdeslaur> The approach used when producing security updates for stable releases is to always backport the minimum fix necessary to fix the issue instead of trying to update the packages to a more recent version.
  38 [19:20] <mdeslaur> Why do we always backport patches? Newer versions of software may need updated libraries and dependencies, might break API/ABI, may introduce configuration file changes, need a lot more testing than a simple fix to the existing version, and, most importantly, newer versions may introduce new bugs!
  39 [19:20] <mdeslaur> Are there any questions so far?
  40 [19:21] <mdeslaur> ok, moving along...
  41 [19:21] <mdeslaur> Once my patches are added, I go back to the UMT tool:
  42 [19:21] <mdeslaur> changelog: this command is run from inside the source tree, and spawns the dch command with an appropriate security update version number and pocket already filled out.
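The resulting entry follows the usual Ubuntu security-update conventions; a hypothetical example for a Lucid update (version, CVE number and names are made up):

    gedit (2.30.3-0ubuntu1.1) lucid-security; urgency=low

      * SECURITY UPDATE: crash via malformed document
        - debian/patches/CVE-2010-XXXX.patch: validate lengths in gedit/foo.c
        - CVE-2010-XXXX

     -- Jane Doe <jane@example.com>  Thu, 27 May 2010 19:21:00 -0400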
  43 [19:22] <mdeslaur> build: this command will call the source and binary commands in order. The source command will build a source package in a directory called....source! A .debdiff file will also be in the source directory for analysis.
  44 [19:22] <mdeslaur> The binary command will use sbuild in the schroots to build the package with dependency calculation that approximates what the official builders do. The resulting binary packages will be in the directory called "binary". There is logic in the umt script that automatically determines what release we are building for based on the directory and the changes file and uses the appropriate schroot.
  45 [19:23] <mdeslaur> build-orig: This command will download the previous release of the package, build it and discard it in order to get the build log. When building security updates for stable releases, and backporting patches, it is very important to compare the build log of the original package with the build log of the patched package.
  47 [19:24] <mdeslaur> This helps to spot compiler warnings with backports, helps spot missing files in the updated package, and also helps spot regressions when the package runs its test suite. Doing this has saved our skin many times in the past.
  48 [19:24] <mdeslaur> compare-log: This command will compare the build log of the previous release with the build log of the current (patched) package. UMT will try and standardize and normalize the build logs before doing a diff on them so we can easily see only what's relevant.
  49 [19:25] <mdeslaur> sign: This command will sign the source package that is in the source directory.
  50 [19:25] <mdeslaur> check: This command will perform a great big series of sanity checks on the package before we upload it into the archive. Basically, every time we've screwed up in the past, we've added logic to this check so it wouldn't happen again. :)
  51 [19:25] <mdeslaur> It includes checks for: making sure the changelog got updated, making sure the pocket is -security, making sure the version number got incremented and is more recent than what's currently in the archive, making sure the patches are all tagged properly, making sure the patch system was used, making sure we don't have quilt directories in our patch, etc.
  52 [19:26] <mdeslaur> compare-bin: This command will download the previous version of the binaries, perform an analysis on them and perform the same analysis on the binaries that we've just compiled. It checks for: missing files, changed library symbols, etc.
  53 [19:27] <mdeslaur> repo: This command will copy the binaries that we've just produced into a local repo, so they are accessible to our schroots and our virtual machines for testing purposes. This ensures our updates work properly with apt-get and update manager.
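Inside a test VM, the local repo can then be added to apt; a sketch assuming it is exported over HTTP from the host (address and path are made up):

    # /etc/apt/sources.list.d/local-security-test.list
    deb http://192.168.122.1/umt-repo ./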
  54 [19:27] <mdeslaur> At this point in time, we would perform testing procedures on our new packages, which I'll get to in a few minutes.
  55 [19:27] <mdeslaur> upload: This command will upload our packages to the Ubuntu security private PPA for building, after performing a few last-minute sanity checks to make sure nobody uploaded a more recent version while we were working on our updates. (This has happened before!)
  56 [19:28] <mdeslaur> Once the package has been uploaded to the security PPA, and has been built, _we perform testing a second time on the actual binaries_. This makes sure the actual bits that have been produced by the build system are regression-free. We have had issues in the past where minute differences between local builds and official builds have introduced regressions.
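Putting the umt commands together, a typical session looks roughly like this (package name and directory layout are illustrative; the actual testing between steps is omitted):

    umt search gedit
    umt download gedit
    cd lucid/gedit-2.30.3        # apply and tag the security patches here
    umt changelog
    umt build
    umt build-orig
    umt compare-log
    umt sign
    umt check
    umt compare-bin
    umt repo                     # then test the packages in schroots and VMs
    umt upload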
  57 [19:28] <mdeslaur> Are there any questions so far?
  58 [19:28] <Thomas_Zahreddin> yes
  59 [19:28] <mdeslaur> Thomas_Zahreddin: shoot
  60 [19:29] <Thomas_Zahreddin> I did not read everything since I came a little late:
  61 [19:29] <Thomas_Zahreddin> how do you automate your tests - what tools do you use?
  62 [19:29] <mdeslaur> Thomas_Zahreddin: I'm getting to that, wait a few minutes and see if I answer your question
  63 [19:30] <mdeslaur> ok, moving along...
  64 [19:30] <mdeslaur> So, how do we test packages for regressions?
  65 [19:30] <mdeslaur> The security team maintains a large collection of testing scripts we have written in the following repository: https://launchpad.net/qa-regression-testing
  66 [19:31] <mdeslaur> The repository contains testing scripts, a lot of information on how to use test suites of particular packages, how to reproduce certain environments, and any other information we think is pertinent.
  67 [19:31] <Thomas_Zahreddin> mdeslaur: thanks
  68 [19:31] <mdeslaur> Basically, every time we produce a security update for a package, we write a test script that will test the codepath that we've touched. Since we need to test for multiple releases * multiple archs, and test our local build and the official build, writing a test script is the only way to ensure that everything gets tested properly.
  69 [19:31] <mdeslaur> Thomas_Zahreddin: welcome :)
  70 [19:31] <mdeslaur> When possible, we will use the upstream's test suite. Some are already built into the package, in which case we try to backport not only the security fixes, but the upstream tests that go with it.
  71 [19:32] <mdeslaur> Some packages contain a test suite, but will still build successfully when the test suite fails. In this case, comparing the build logs before and after the update during our process will spot any failing items.
  72 [19:32] <mdeslaur> Even when a package has an extensive test suite, we still write testing scripts for sanity testing basic functionality once the package gets installed.
  73 [19:33] <mdeslaur> Here is an example of a test script we use with apache: http://bazaar.launchpad.net/~ubuntu-bugcontrol/qa-regression-testing/master/annotate/head:/scripts/test-apache2.py
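Fetching and running one of these scripts on a test VM looks roughly like this (exact invocation varies per script; see the repository's README):

    bzr branch lp:~ubuntu-bugcontrol/qa-regression-testing/master qa-regression-testing
    cd qa-regression-testing/scripts
    sudo ./test-apache2.py -v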
  74 [19:33] <mdeslaur> Some of the test scripts are quite simple, but some are getting complex...it's a matter of what we want to test for each particular security update. Of course, the more we do, the more tests get added.
  75 [19:33] <mdeslaur> Other teams within Ubuntu have also started using and contributing to our test scripts, which is great.
  77 [19:34] <mdeslaur> I am looking forward to community participation in writing test scripts for non-Canonical-supported packages (essentially what used to be called Universe). sbeattie has done a training session on how to write scripts for the qa-regression-testing suite, which is archived somewhere and is a great read for getting involved.
  78 [19:35] <mdeslaur> So, basically, patching the package itself is only a small part of the whole security update process. The important and time-consuming part when producing updates for stable releases is testing, testing, and more testing.
  79 [19:35] <mdeslaur> I hope this overview of the tools and environment the security team uses was useful and can be used as a guide to get your own environment set up. If you have any questions at all about Ubuntu security, or what I've discussed today, please don't hesitate to come over to #ubuntu-hardened and ask away!
  80 [19:36] <mdeslaur> Now it's time for the question period!
  81 [19:36] <mdeslaur> any questions?
  82 [19:36] <mdeslaur> not all at once, please! :)
  83 [19:37] <mdeslaur> ok, thanks everyone!
  84 [19:38] <Thomas_Zahreddin> mdeslaur: thank you
  85 [19:38] <Thomas_Zahreddin> mdeslaur: could you please give a little insight into your build scripts?
  86 [19:39] <Thomas_Zahreddin> i think most of the magic happens there (and also bzr is very important i suppose)
  87 [19:39] <mdeslaur> Thomas_Zahreddin: sure. Do you mean the security team's UMT script?
  88 [19:39] <Thomas_Zahreddin> yepp
  89 [19:40] <mdeslaur> Thomas_Zahreddin: it's simply a wrapper around everything we do when making security updates
  90 [19:40] <mdeslaur> Thomas_Zahreddin: we kept doing the same things over and over, like downloading source packages for each release, etc.
  91 [19:41] <mdeslaur> Thomas_Zahreddin: so we scripted as much as possible to simplify the specific task of producing security updates
  92 [19:41] <Thomas_Zahreddin> mdeslaur: sure, I'm thinking of the usage of hudson (apache) for this task
  93 [19:42] <mdeslaur> Unfortunately, I'm unaware of hudson
  94 [19:43] <Thomas_Zahreddin> http://hudson-ci.org/
  95 [19:43] <Thomas_Zahreddin>  apt-get update; apt-get install hudson 
  96 [19:44] <mdeslaur> hmm...looks interesting for a "daily ppa" type of situation
  97 [19:45] <mdeslaur> Thomas_Zahreddin: I'm not sure how that would work in a security update situation though
  98 [19:46] <Thomas_Zahreddin> mdeslaur: i suppose it is not as fast (and flexible) as 'small' scripts (umt)
  99 [19:46] <Thomas_Zahreddin> mdeslaur: but it offers a great user interface and very clear oversight - like which tests are run, failed etc.
 100 [19:47] <mdeslaur> Thomas_Zahreddin: I'm afraid I can't comment on how hudson would work in a security-update type scenario
 101 [19:47] <mdeslaur> Thomas_Zahreddin: as I don't know enough about it
 102 [19:47] <Thomas_Zahreddin> i see, of course.
 103 [19:48] <Thomas_Zahreddin> mdeslaur: so you did no evaluation for your build tool chain (i suppose) ?
 104 [19:49] <mdeslaur> Thomas_Zahreddin: we try to mimic the ubuntu builders as much as possible, so no, we have not looked at hudson
 105 [19:49] <Thomas_Zahreddin> mdeslaur: thank you for your answers
 106 [19:50] <mdeslaur> Thomas_Zahreddin: you're welcome
 107 [20:11] <MTecknology> mdeslaur: just read up - thanks for the details of everything - I'd definitely like to find some time to help out with things
 108 [20:12] <mdeslaur> MTecknology: cool, thanks!
 109 [20:36] <ari-tczew> mdeslaur: are you (the security team) testing the patches yourselves, or do you rely on the uploader's comments?
 111 [20:37] <mdeslaur> ari-tczew: for canonical-supported packages (main), we test ourselves; for non-canonical-supported packages (universe), we ask the uploader to describe the testing he has performed, and we take a look at the patch.
 112 === ayan is now known as ayan-afk


CategoryPackaging
