AutoServerTests

Dev Week -- Automated server testing -- soren -- Tue, Jan 26

UTC

   1 [20:00]  * soren clears throat
   2 [20:00] <soren> Hi, everyone.
   3 [20:00] <soren> Thanks for coming to my session on Automated Server Testing.
   4 [20:00] <soren> So..
   5 [20:00] <soren> In the server team, we've traditionally had a problem with collecting test results.
   6 [20:01] <soren> (question in #ubuntu-classroom-chat, by the way. please put "QUESTION" so that I will spot them)
   7 [20:01] <soren> This is because our target audience and most of our users are using Ubuntu on servers that are being used to service real users.
   8 [20:01] <soren> Real users, as you are probably aware, depend on their servers to work.
   9 [20:01] <soren> They need their mail server to be up and delivering mail so that they can get their daily dose of spam..
  10 [20:02] <soren> They need their file server to be around so they can get access to their music and various pirated software..
  11 [20:02] <soren> They need their proxy server to work so that they can log onto facebook..
  12 [20:02] <soren> They need the LDAP server to work so that they can look up the phone number for the pizza guy..
  13 [20:02] <soren> And other important things.
  14 [20:02] <soren> You get the idea.
  15 [20:02] <soren> If something should fail, it means pain and suffering for the poor sysadmin.
  16 [20:02] <soren> Hence, sysadmins are very hesitant to upgrade anything before it's been through lots and lots of QA.
  17 [20:03] <soren> However, unless /some/ of them /do/ upgrade, there's not going to be much QA work done.
  18 [20:03] <soren> This places us in a rather unfortunate situation, where a significant portion of our bug reports don't come in until after release.
  19 [20:03] <soren> Anyone involved in Ubuntu development will know that this is a hassle: fixing things after release is much more tedious than before release, because we have much less freedom to make changes.
  20 [20:04] <soren> This is very difficult to change, and I haven't come up with a golden solution.
  21 [20:04] <soren> However, the sooner we catch problems, the more time we have to work on fun stuff, since we'll be putting out fewer fires in the end.
  22 [20:04] <soren> See, while we're cursed with a user base that doesn't start testing our product until it's essentially too late..
  23 [20:05] <soren> ..we are blessed with a type of software that traditionally comes with a good test suite.
  24 [20:05] <soren> MySQL, for instance, comes with an extensive test suite.
  25 [20:05] <soren> This test suite runs every time we upload a new version of mysql to Ubuntu.
  26 [20:06] <soren> If the test suite fails, the build fails, and the uploader gets an e-mail.
  27 [20:06] <soren> ...and it's all very obvious that something needs fixing.
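
To make the mechanism concrete, here is a minimal sketch of the idea only; it is not MySQL's actual packaging, just an illustration of a build that runs the upstream test suite and treats a test failure as a build failure:

    # illustrative build wrapper (not the real MySQL packaging)
    set -e
    ./configure
    make
    make test    # a non-zero exit status here fails the whole package build,
                 # so the uploader gets a failed-build notification
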
  28 [20:06] <soren> This is great.
  29 [20:06] <soren> Well..
  30 [20:06] <soren> Sort of.
  31 [20:06] <soren> The thing is, every package in Ubuntu has dependencies of some sort.
  32 [20:06] <soren> For instance, almost everything depends on libc
  33 [20:07] <soren> This means that a change in libc will inevitably affect MySQL somehow.
  34 [20:07] <soren> Luckily, if this causes problems, it is (hopefully) caught by MySQL's test suite.
  35 [20:07] <soren> Less luckily, this test suite, as I just mentioned..
  36 [20:07] <soren> is run when MySQL is uploaded..
  37 [20:07] <soren> not when libc is uploaded.
  38 [20:08] <soren> So we may not notice a problem until the next time someone uploads MySQL. This could be weeks or even months!
  39 [20:08] <soren> And trying to narrow down the change that broke something is hard with all the stuff going on in Ubuntu development over the course of months.
  40 [20:08] <soren> So..
  41 [20:09] <soren> to address this, we've set up an automated system that rebuilds MySQL (and a bunch of other stuff) every night in a PPA.
  42 [20:09] <soren> That way, if we trust the test suite, we can relax and know that MySQL still works, despite any changes in its dependency chain.
  43 [20:09] <soren> We do the same for libvirt, php5, postgresql, etc.
  44 [20:10] <soren> Basically, anything that has a test suite that runs at build time and that causes the build to fail if it doesn't pass should be added.
  45 [20:10] <soren> This at least makes me sleep better :)
  46 [20:11] <soren> So, the automated testing stuff in Lucid consists of two parts.
  47 [20:11] <soren> The above is the first part, which is pretty nice.
  48 [20:11] <soren> The second part is awesome:
  49 [20:11] <soren> :)
  50 [20:11] <soren> It's an automated ISO testing system.
  51 [20:11] <soren> ISO testing is the thankless and tedious job of installing Ubuntu from an ISO over and over again..
  52 [20:12] <soren> ..with small adjustments each time to make sure things haven't changed unexpectedly.
  53 [20:12] <soren> QUESTION: ~Shkodrani> why not run the test suite only when a package that, for instance, MySQL relies on changes?
  54 [20:13] <soren> The cost of checking whether something in MySQL's dependency chain has changed is rather high. At the very least, it's tedious.
  55 [20:13] <soren> ..and just doing the rebuild is cheap and simple to get up and running.
  56 [20:13] <soren> It's all run by a 10 line shell script or thereabouts.
  57 [20:13] <soren> Ok, ISO testing..
  58 [20:14] <soren> Every time we come close to an alpha, beta or any other kind of release..
  59 [20:14] <soren> ..we all spend a lot of time going through this install process.
  60 [20:14] <soren> Well, we /should/ anyway. I positively suck at getting it done, but there you go.
  61 [20:14] <soren> My fellow server team member, Mathias Gug, has had a preseed based setup running for a while now.
  62 [20:15] <soren> Basically, preseeding is a way to answer all of the installer's questions up front.
  63 [20:15] <soren> So, he takes all the answers..
  64 [20:15] <soren> ..passes them to the installer using clever hacks..
  65 [20:15] <soren> ..and the installer zips through the installation without bothering Mathias with questions.
  66 [20:15] <soren> In the end, he can log into the installed system and run the last parts of the test cases.
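
To give a flavour of what preseeding looks like, here are a few illustrative lines using standard debian-installer preseed keys; this is not Mathias' actual file, just a sketch:

    # illustrative preseed snippet (standard d-i keys, not Mathias' actual file)
    d-i debian-installer/locale string en_US
    d-i partman-auto/method string lvm
    d-i passwd/user-fullname string Ubuntu User
    d-i passwd/username string ubuntu
    tasksel tasksel/first multiselect lamp-server
    d-i pkgsel/include string openssh-server

Such a file is typically handed to the installer via a boot parameter like preseed/url= or preseed/file=, which is presumably one flavour of the "clever hacks" mentioned above.
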
  67 [20:16] <soren> This has served us well, and has probably saved us several man days (or weeks?) of testing time over the last few years.
  68 [20:16] <soren> However, it doesn't actually test the same things as the ISO test cases describe.
  69 [20:16] <soren> The ISO test cases speak of the interaction between the user and the installer..
  70 [20:16] <soren> However, the point of preseeding is to /avoid/ interaction, and to skip it entirely.
  71 [20:16] <soren> Don't get me wrong..
  72 [20:17] <soren> Preseed testing is super valuable.
  73 [20:17] <soren> Installing that way is a supported install method, so having this well tested is wicked cool and really important.
  74 [20:17] <soren> ...but I wanted to test the interactivity as well.
  75 [20:18] <soren> So, being the virtualisation geek that I am..
  76 [20:18] <soren> I decided to use the KVM autotest framework to do the ISO testing.
  77 [20:18] <soren> Now, KVM autotest was designed to test KVM.
  78 [20:19] <soren> KVM developers use it to install a bunch of different operating systems and test things to make sure they didn't change anything in KVM that broke functionality in one of the guest operating systems.
  79 [20:19] <soren> What we want to do, though, is somewhat the opposite.
  80 [20:19] <soren> We assume that KVM works and instead want to test the operating system.
  81 [20:20] <soren> So, the KVM autotest framework works by running a virtual machine..
  82 [20:20] <soren> ..grabbing a screenshot every second..
  83 [20:20] <soren> ..and when the screenshot looks a particular way (e.g. when a particular dialog comes up),
  84 [20:21] <soren> it can respond with a series of key presses or mouse events.
  85 [20:21] <soren> This way, we can emulate a complete, interactive install session.
  86 [20:21] <soren> Awesome stuff.
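
A very rough sketch of that loop, in shell rather than the framework's actual Python, under the assumption of a QEMU/KVM monitor listening on a TCP port; the port, region, checksum and key below are all placeholders:

    # rough sketch only; the real framework lives inside autotest (Python)
    MON="localhost 4444"                                   # hypothetical monitor address
    while true; do
        echo "screendump /tmp/screen.ppm" | nc -q 1 $MON   # screendump is a real monitor command
        # crop the region of interest and hash it (region and md5 are placeholders)
        sum=$(convert /tmp/screen.ppm -crop 100x30+50+200 ppm:- | md5sum | cut -d' ' -f1)
        if [ "$sum" = "0123456789abcdef0123456789abcdef" ]; then
            echo "sendkey ret" | nc -q 1 $MON              # sendkey is also a real monitor command
            break
        fi
        sleep 1
    done
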
  87 [20:21] <soren> I've started documenting this, but haven't gotten all that far, since I kept changing things faster than I could update the docs :)
  88 [20:22] <soren> The documentation lives at  https://wiki.ubuntu.com/AutomatedISOTesting
  89 [20:22] <soren> If you all open that page..
  90 [20:22] <soren> ..and scroll down to the "step files" section..
  91 [20:23] <soren> you can see a sample step from a "step file".
  92 [20:23] <soren> A step file is a description of a test case.
  93 [20:23] <soren> Now, looking at the sample, you can see a "step 9.45" and a "screendump" line.
  94 [20:23] <soren> They're pretty much just meta-data for the creator or editor of the step file
  95 [20:24] <soren> so don't worry about those.
  96 [20:24] <soren> The important lines are the "barrier_2" and "key" ones.
  97 [20:24] <soren> The barrier_2 line tells the testing system to wait..
  98 [20:24] <soren> ..until the 117x34 rectangle of the screen starting at (79,303)..
  99 [20:24] <soren> ..has md5sum de7e18c10594ab288855a570dee7f159, giving it up to 47 seconds for that to happen.
 100 [20:25] <soren> If this doesn't happen, the test will fail, and a report will be generated.
 101 [20:25] <soren> If it does pass, it goes on to the next step: "key ret"
 102 [20:25] <soren> As you can probably guess, "key ret" sends a keypress to the guest, namely Return.
 103 [20:26] <soren> The result of those two lines is: Wait for the language prompt right after boot to show up, and once it does, press return to accept the default "English".
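
Putting those pieces together, the sample step described above looks roughly like this; the screendump filename is a placeholder, and the field order simply follows the description above rather than any authoritative format specification:

    step 9.45
    screendump 20100126_000001_sample.ppm                          # filename is a placeholder
    barrier_2 117 34 79 303 de7e18c10594ab288855a570dee7f159 47    # width height x y md5sum timeout
    key ret
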
 104 [20:26] <soren> Now, pretty soon, it became obvious that there was going to be a lot of duplication involved here.
 105 [20:26] <soren> ...all the installs would have to wait for that prompt and respond to it in the same way.
 106 [20:27] <soren> Even worse: If that prompt were to change, /every/ step file would need to be updated.
 107 [20:27] <soren> Even worse again: In the beginning there was no concept of "updating" step files. You had to start all over.
 108 [20:28] <soren> Starting over makes plain old ISO testing feel like a fun time.
 109 [20:28] <soren> It's not.
 110 [20:28] <soren> Just so you know.
 111 [20:28] <soren> I love people for doing it, but it's really not that much fun. :)
 112 [20:28] <soren> Ok, so to address the mass duplication of steps and stuff, I added a step file generator.
 113 [20:29] <soren> The step file generator generates a step file (you probably guessed this much) based on the task to be installed and the partitioning scheme to be used.
  114 [20:30] <soren> This means that I can tell the test framework: Hey, please test an install of the LAMP task, with LVM partitioning, and do it on amd64.
 115 [20:30] <soren> And it does so.
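
The generator's exact command line isn't shown in this session, so the invocation below is purely hypothetical; it only illustrates the kind of request being made:

    # hypothetical invocation; the real generator's interface may differ
    ./generate_steps --task lamp-server --partitioning lvm --arch amd64 > lamp-lvm-amd64.steps
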
  116 [20:30] <soren> See, this is all running in virtual machines.
 117 [20:30] <soren> Virtual machines are cool.
 118 [20:30] <soren> So cool, in fact...
 119 [20:30] <soren> That you can use them to make installer videos.
 120 [20:30] <soren> So, to see what happens during a test run, you can attach a recorder thingie and turn the result into an avi.
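
One way to do that, assuming the per-second screenshots are kept as numbered PPM files; the filename pattern and frame rate here are assumptions, not how the framework necessarily does it:

    # turn the per-second screendumps into a video (filenames are assumed)
    ffmpeg -f image2 -r 1 -i screendump_%04d.ppm -qscale 2 lamplvminstall.avi
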
 121 [20:31] <soren> Now, like any decent TV chef, I've cheated and done this all in advance.
  122 [20:31] <soren> Now, unlike most decent TV chefs, what I did in advance failed.
  123 [20:31] <soren> And even more unlike TV chefs, I'm going to show it to you anyway, because it's useful.
 124 [20:32] <soren> Without further ado:
 125 [20:32] <soren> heh..
 126 [20:32] <soren> wait for it..
 127 [20:32] <soren> http://people.canonical.com/~soren/lamplvminstall.avi
 128 [20:32] <soren> There we go.
 129 [20:32] <soren> wget http://people.canonical.com/~soren/lamplvminstall.avi ; mplayer lamplvminstall.avi
 130 [20:32] <soren> This test case failed.
 131 [20:33] <soren> Somewhat surprisingly.
 132 [20:33] <soren> If you fast forward all the way to the end..
 133 [20:33] <soren> (watch the rest as well, it's fun to watch the test system typing the username "John W. Doe III" and the password and whatnot)
  134 [20:34] <soren> ..at the end, you'll see it breaks off before the install actually finishes.
 135 [20:34] <soren> Like... seconds before it would have finished.
 136 [20:34] <soren> Honestly, I did not mean for this to happen, but it's a good learning experience :)
 137 [20:34] <soren> Ok, if we all look at..
 138 [20:34]  * soren digs through launchpad, bear with me.
 139 [20:34] <soren> http://bazaar.launchpad.net/~soren/autotest/automated-ubuntu-server-tests/files/head:/client/tests/kvm/generator_data/lucid/
 140 [20:35] <soren> Those are the input files for the step file generator.
 141 [20:35] <soren> Yes, they are poorly named, but please appreciate that just days ago, they were all named "foo", "bar", "wibble", "wobble", etc. so this is a massive improvement.
 142 [20:36] <soren> QUESTION: That method could be used for UI testing in a *lot* of different GUI apps, not just ISO installations. Any plans to  document/release it more generally?
 143 [20:36] <soren> (from rmunn)
  144 [20:36] <soren> Yes!
  146 [20:36] <soren> I meant to get that done for today, but the real world intervened and made a mockery of my plans.
 147 [20:36] <soren> This can totally be used to do GUI installs as well.
 148 [20:37] <soren> Looking at http://bazaar.launchpad.net/~soren/autotest/automated-ubuntu-server-tests/files/head:/client/tests/kvm/generator_data/lucid/ again..
 149 [20:37] <soren> Specifically, 060-finish_install_and_reboot.steps
 150 [20:37] <soren> http://bazaar.launchpad.net/~soren/autotest/automated-ubuntu-server-tests/annotate/head:/client/tests/kvm/generator_data/lucid/060-finish_install_and_reboot.steps
 151 [20:38] <soren> This is the step that failed.
 152 [20:38] <soren> For some reason (that I have yet to figure out, I only spotted this failure an hour ago) this times out.
  153 [20:38] <soren> It says 579, but perhaps those are a special kind of seconds that are not as long as most people's seconds.
 154 [20:39] <soren> The point is this: I only have to change the timeout in this one place, and all the test cases will be updated.
  155 [20:39] <soren> < ~rmunn> QUESTION: I see a lot of keystrokes used to select various dialog widgets. Can the KVM testing system simulate mouse clicks and/or mouse movements (e.g., for testing mouseover stuff) as well?
 158 [20:39] <soren> Well..
 159 [20:39] <soren> Yes.
 160 [20:39] <soren> Sort of :)
 161 [20:40] <soren> The autotest framework supports it, I've added support for it to the frontend, but kvm has an.. um.. issue :)
 162 [20:40] <soren> It used to emulate a mouse, so it would move the cursor relative to the current position.
 163 [20:40] <soren> However, these days, GNOME and such give you...
 164 [20:40] <soren> mouse acceleration!
 165 [20:40] <soren> Yay!
 166 [20:40] <soren> No. Not yay.
 167 [20:41] <soren> Mouse acceleration is the enemy when you're actually warping the mouse from one place to another, because it thinks you just moved your mouse /really/ fast, and then moves it even further than you wanted it to.
 168 [20:41] <soren> This took me /forever/ to realise.
 169 [20:41] <soren> So, I've made it pretend to use a tablet.
 170 [20:41] <soren> Tablets offer absolute positioning, so this helped a lot.
  171 [20:42] <soren> However, the command to tell kvm to click on something was internally translated into "mouse_event(click_button1, 0, 0, 0)", where 0,0,0 are the coordinates.
 172 [20:42] <soren> Now, if you're in relative positioning mode (using a regular mouse), this is good.
 173 [20:42] <soren> You want to click right where you are.
 174 [20:42] <soren> ..if you're using a tablet, it means you can only click in the top left corner.
 175 [20:42] <soren> No fun.
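
For reference, this is roughly what clicking looks like at the QEMU/KVM monitor level once the guest uses an absolute tablet; the monitor address and coordinates are assumptions, and with an absolute pointer mouse_move takes positions on a fixed scale (roughly 0-32767) rather than relative deltas:

    # illustrative monitor commands; with -usbdevice tablet the coordinates are absolute
    MON="localhost 4444"                         # hypothetical monitor address
    echo "mouse_move 16000 16000" | nc -q 1 $MON # move to roughly the centre of the screen
    echo "mouse_button 1"         | nc -q 1 $MON # press button 1
    echo "mouse_button 0"         | nc -q 1 $MON # release
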
 176 [20:43] <soren> I wrote a patch for that, but I'm not sure it's in upstream KVM yet, but it'll be in Lucid half an hour after I start working on those GUI test cases :)
  177 [20:44] <soren> So, yes, GUI testing is totally an option, too.
  179 [20:44] <soren> Another problem I had with this is that it was designed to test a variable KVM against a static set of OSes.
  180 [20:45] <soren> The OSes should look and act the same regardless of what changed in KVM. That is the whole point of these tests: to make sure they don't change.
 181 [20:45] <soren> However, we change the OS all the time. That's what we do :)
 182 [20:46] <soren> ..but since the designers of this test system never meant for it to be used this way, they didn't add an option to edit these step files very conveniently.
 183 [20:46] <soren> To fix this, I've added an option to the test system to fall back to the stepmaker (the GUI used to create step files) if a test fails.
 184 [20:46] <soren> This is great if you're running tests on your laptop or a machine you have direct access to rather than a machine running in a dusty corner of a data center.
 185 [20:47] <soren> It really comes in useful when the screens change (wording changes, extra/fewer dialogs, change of theme (in the GUI)).
 186 [20:47] <soren> Having to start over is, as I mentioned, no fun at all.
 187 [20:48] <soren> Please shoot any questions you may have. I haven't really prepared much more than this.
 188 [20:48] <soren> Still, questions belong in #ubuntu-classroom-chat
 189 [20:50] <soren> If there are no more questions, I'll sing for the rest of the time slot.
 190 [20:50] <soren> 21:50:25 < ~Omahn23> soren: QUESTION As an end user/sysadmin, is there anything I can do to help in testing with this new framework?
 191 [20:51] <soren> Well, seeing as these things run in virtual machines, running them in more places is not going to make much difference, so /running/ the tests is probably not something we need help with
 192 [20:51] <soren> However!
 193 [20:51] <soren> The more test cases we can include, the better.
 194 [20:51] <soren> The more, the merrier.
 195 [20:51] <soren> I'd love to have more test cases to include in our daily runs of this system.
 196 [20:52] <soren> 21:52:01 < hggdh> QUESTION: so we can automate pseudo-interactive testing. How to deal with the tests that require meat between the keyboard and the chair?
 197 [20:52] <soren> Examples?
 198 [20:52] <soren> 21:50:12 < Ramonster> soren: Any thoughts on testing servers while they actually perform one of the roles you talked about at the start ?
 199 [20:52] <soren> Ramonster: You mean functional testing of e.g. a LAMP server?
 200 [20:53] <soren> 21:52:23 < mscahill> QUESTION: you briefly mentioned PPA testing. what packages are included in this testing?
 201 [20:53]  * soren looks that up.
 202 [20:53] <soren> 21:53:08 < Ramonster> soren: Yeah
 203 [20:53] <soren> alright.
 204 [20:53] <soren> Um, yes, but it's not part of this work I've been doing.
 205 [20:53] <soren> We're not very strong in that area at all.
 206 [20:53] <soren> ...and that's a shame.
 207 [20:54] <soren> PKGS="libvirt postgresql-8.3 postgresql-8.4 mysql-dfsg-5.0 mysql-dfsg-5.1 openldap php5 python2.6 atlas"
  208 [20:54] <soren> That's the list of packages built daily.
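
The roughly-ten-line script itself isn't shown in the log, so the following is only a sketch of what such a nightly rebuild loop could look like, using standard Ubuntu developer tools; the PPA name and the version suffix are made up:

    #!/bin/sh
    # sketch of a nightly no-change-rebuild loop; PPA name and version suffix are made up
    set -e
    PKGS="libvirt postgresql-8.3 postgresql-8.4 mysql-dfsg-5.0 mysql-dfsg-5.1 openldap php5 python2.6 atlas"
    for pkg in $PKGS; do
        apt-get source "$pkg"                     # fetch and unpack the current source package
        cd "$pkg"-*/
        dch --local "~daily$(date +%Y%m%d)" "Automated daily rebuild."
        debuild -S                                # build (and sign) a new source package
        cd ..
        dput ppa:my-team/daily-builds "${pkg}"_*_source.changes
    done

Because the rebuilt packages run their test suites at build time, a failed build in the PPA flags a regression introduced somewhere in the dependency chain.
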
 209 [20:55] <soren> Well, the security team has a bunch of tests they run whenever they change anything. They often can't rely on anyone else to test anything (since they don't go through -proposed), so they need to be really thorough.
 210 [20:55] <soren> I'm working on getting those run every day as well. They should provide some amount of functional testing.
 211 [20:55] <soren> 21:53:12 < yltsrc> QUESTION: is writing test cases required for bugfixing?
 212 [20:55] <soren> Not per se.
 213 [20:56] <soren> Most test cases need updating once a bug is fixed, and most things I can think of would be covered by this, so new test cases (for this system, I mean) wouldn't be a requirement for bug fixes.
 214 [20:57] <soren> 21:54:55 < mscahill> QUESTION: are there plans to allow automated testing for package maintainers with their own PPA?
 215 [20:57] <soren> Sure, anyone is free to run that script and do their own testing.
 216 [20:57] <soren> Hm... I may not have published it anywhere.
 217 [20:57]  * soren fixes that.
 218 [20:57] <soren> Well, /me makes a note to fix that
  219 [20:58] <soren> I do have a few ideas for doing functional testing of upgrades of various sorts, but most of those ideas are only a few hours old, so they're not even half baked yet :)
 220 [20:59] <soren> Did I miss any questions?
 221 [20:59] <soren> 21:59:33 < Ramonster> soren: That's the problem atm everyone is walking around with these half-baked ideas :)
 222 [20:59] <soren> Whoops, didn't mean to post that here :)
 223 [21:00] <soren> Thanks for showing up, everyone.
 224 [21:00] <soren> that's it!
