App Developer Week -- GStreamer+Python: Multimedia Swiss Army Machete -- jderose -- Mon, Apr 12th, 2011

   1 [19:01] <dpm> thanks seiflotfy and m4n1sh for a great session! Next up: jderose will tell us all about using python and gstreamer in many interesting ways :)
   2 [19:01] <ClassBot> Logs for this session will be available following the conclusion of the session.
   3 [19:01] <m4n1sh> Thanks everyone
   4 [19:03] <jderose> okay, should i start?
   5 [19:03] <nigelb> yes :)
   6 [19:03] <jderose> Hi everyone, hope you're having a great Ubuntu App Developer Week so far!
   7 [19:03] <jderose> Okay, let's get started...
   8 [19:03] <jderose> I have a short prepared intro to get you pumped up about GStreamer + Python.
   9 [19:03] <jderose> Then we'll spend the rest of the hour getting our hands dirty with real code, and I'll do my best to answer all of your questions.
  10 [19:04] <jderose> During the intro, why don't you make sure the packages needed for the code examples are installed:
  11 [19:04] <jderose> sudo apt-get install gstreamer0.10-tools python-gst0.10 gstreamer0.10-plugins-good
  12 [19:04] <jderose> Aside from `gstreamer0.10-tools`, you probably already have the rest installed, but it's good to make sure we're on the same page package-wise.
  13 [19:04] <jderose> I'm running Natty, but the examples should work fine under Maverick and Lucid too, and even older releases.
  14 [19:05] <jderose> == INTRO ==
  15 [19:05] <jderose> First, I'm going to share why I think GStreamer is *the* multimedia framework, and is going to totally dominate in *everything* from simple playback to big production video editing.
  16 [19:05] <jderose> I hope you're a bit surprised as to why I think this, because it's exciting, and I want to get you excited!
  17 [19:05] <jderose> If I don't surprise you, then I assume you're already as excited as I am :)
  18 [19:05] <jderose> Second, I'm going to share why I think Python is *the* language for building GStreamer apps, and correct some misconceptions I frequently hear about Python threading and GStreamer.
  19 [19:06] <jderose> -- Why GStreamer? --
  20 [19:06] <jderose> Ah, I should introduce myself.  My name is Jason Gerard DeRose, and I started writing pygst apps 7 years ago, back when gstreamer0.8 was the hot newness.
  21 [19:06] <jderose> So I have a longtime love affair with GStreamer.
  22 [19:06] <jderose> But recently I had to pick the multimedia framework for Novacut, my distributed (ala bzr/git/hg) video editor project.
  23 [19:07] <jderose> Novacut isn't just a project, it's a startup, so I needed to pick something that makes good longterm strategic sense.
  24 [19:07] <jderose> GStreamer was my gut feeling, but I played devil's advocate with myself and looked at a number of other options.
  25 [19:07] <jderose> I looked most seriously at Media Lovin' Toolkit (MLT), as Jonathan Thomas originally was using GStreamer + Gnonlin for OpenShot, and then switched to MLT out of frustration.
  26 [19:07] <jderose> I believe Jonathan pointed out some legitimate weaknesses in Gnonlin, and the OpenShot development pace has been impressively quick, so you can't argue with that.
  27 [19:07] <jderose> However, I still chose GStreamer without hesitation.  Why?
  28 [19:08]  * jderose makes "drumroll" sounds...
  29 [19:08] <jderose> * GStreamer is on the Kindle
  30 [19:08] <jderose> * GStreamer is on the Nokia N900
  31 [19:08] <jderose> * GStreamer is on webos phones and tablets
  32 [19:08] <jderose> * GStreamer is what's getting attention from those wonderful Linaro folks
  33 [19:08] <jderose> * GStreamer is on every Ubuntu desktop, along with most other desktop Linux distros
  34 [19:08] <jderose> In short, I chose GStreamer because of its economy of scale.
  35 [19:09] <jderose> GStreamer is already running on everything from small to medium, and although running it at industrial scale (big) might not be that common right now... it's inevitable.
  36 [19:09] <jderose> And doesn't that sound strikingly similar to something?
  37 [19:09] <jderose> It does to me: the Linux kernel, running on everything from smart phones to supercomputers, everything from consumer grade to pro grade.
  38 [19:09] <jderose> Once you reach that economy of scale, you're pretty unbeatable.  And I believe that over the past several years GStreamer has reached that tipping point.
  39 [19:09] <jderose> Nonlinear editing easily exercises 90% of the same code paths as playback.
  40 [19:09] <jderose> And from a business perspective, I'd choose something where I knew that 90% would be getting serious investment across the industry...
  41 [19:10] <jderose> even if the other 10% might currently have some shortcomings compared to other options.
  42 [19:10] <jderose> I believe Edward Hervey has built an excellent foundation in Gnonlin.  It just needs more developers, more apps using it, more users abusing it.
  43 [19:10] <jderose> -- Why Python? --
  44 [19:11] <jderose> Why not? GStreamer gives you a lot of power, you can build arbitrarily complex pipelines.
  45 [19:11] <jderose> And that's exactly the place where a simple, clear language like Python is perfect.
  46 [19:11] <jderose> You want to be able to iterate quickly, and write tons of tests without a lot of friction.
  47 [19:11] <jderose> Now if you want to write new GStreamer plugins (say some new video filter), those should of course be written in C.
  48 [19:12] <jderose> But the job of assembling a GStreamer Pipeline can get surprisingly complex, and that's a great place for Python.
  49 [19:12] <jderose> Q: But won't Python make my GStreamer application slow, because Python only allows one thread to run at once because of the Global Interpreter Lock (GIL)?
  50 [19:12] <jderose> A: No :)
  51 [19:12] <jderose> The Python GIL means only one thread at a time can *manipulate Python state*.
  52 [19:12] <jderose> But an arbitrary number of threads can run at once assuming those threads aren't manipulating Python state (aka pretty much everything GStreamer does).
  53 [19:13] <jderose> So repeat after me:
  54 [19:13] <jderose> "Python won't make my GStreamer application slow, because after I assemble and start the pipeline, Python just sits there waiting for signals from GStreamer, and GStreamer runs with exactly the same performance it would have if the pipeline were assembled and started in C!"
  55 [19:13] <jderose> :)
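The claim above is easy to check with plain Python (no GStreamer needed). In this minimal sketch, time.sleep() stands in for GStreamer's C streaming code, since both release the GIL while blocked:

```python
# Demonstration that Python threads blocked in C code run concurrently:
# time.sleep() releases the GIL, just as GStreamer's C streaming threads do
# while Python sits idle waiting for bus messages.
import threading
import time

def blocking_work():
    # Stands in for a GStreamer streaming thread running pure C code.
    time.sleep(0.2)

start = time.time()
threads = [threading.Thread(target=blocking_work) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
# The four 0.2 s blocks overlap instead of serializing (~0.2 s, not 0.8 s).
print('elapsed: %.2f seconds' % elapsed)
```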
  56 [19:14] <jderose> == LEARNING BY DOING ==
  57 [19:14] <jderose> okay, is everyone ready to play with some code?
  58 [19:15] <jderose> if anyone needs a moment to catch up, at any point, please say so in #ubuntu-classroom-chat, which is also where you ask questions
  59 [19:15] <jderose> QUESTION: So, that's because the threads are just running code from an external lib, then, rather than Python code?
  60 [19:15] <jderose> chadadavis: basically, yes.
  61 [19:16] <jderose> gstreamer can/will create quite a few different threads, say for video playback
  62 [19:17] <jderose> and unless you wrote gstreamer plugins in python (which is possible, and handy for prototyping)
  63 [19:17] <jderose> python won't actually be doing anything in any of those threads
  64 [19:17] <jderose> python will just be sitting idle waiting for events from gstreamer
  65 [19:17] <jderose> the normal way to use gstreamer is all asynchronous
  66 [19:18] <jderose> okay, to do stuff with multimedia, you always need a test video to work with:
  67 [19:18] <jderose>
  68 [19:18] <jderose> :)
  69 [19:19] <jderose> everyone go ahead and grab the example code here:
  70 [19:19] <jderose> bzr branch lp:~jderose/+junk/machete
  71 [19:20] <jderose> or you can browse it here -
  72 [19:20] <jderose> i didn't quite have time to get all the minimal python examples together i wanted, so i'm gonna wing it a bit, but that's okay :)
  73 [19:21] <jderose> gstreamer is a graph based pipeline, very generic at its core
  74 [19:22] <jderose> the `gst-launch-0.10` command is very handy for quickly testing a pipeline, so let's look at -
  75 === kevin7060 is now known as seidos
  76 [19:23] <jderose> i know, not python yet, but this is a good way to see what gstreamer is doing conceptually :)
  77 [19:23] <jderose> so the first element in this pipeline is `filesrc`... which reads from a file, in this case "jorge.ogv"
  78 [19:24] <jderose> the next element is `oggdemux`... ogg is a container that can hold many different types of data inside: theora video, vp8 video, vorbis audio, flac audio, etc
  79 [19:25] <jderose> so a demuxer will take a container and split out individual elementary streams
  80 [19:25] <jderose> in this example, we're just going to split out the vorbis audio and transcode it to flac
  81 [19:26] <jderose> now gst-launch has some magic it does behind the scenes, so it's a bit more complex from python, where you're doing everything very explicitly
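The pastebin link for the example pipeline didn't survive in the log; based on the elements walked through in this session (filesrc, oggdemux, vorbisdec, audiorate, audioconvert, flacenc), it was presumably something along these lines. This is a reconstruction, not the session's actual paste, and the output filename is a placeholder:

```shell
# Hypothetical reconstruction of the vorbis-to-flac pipeline discussed in this
# session; "jorge.ogv" comes from the log, "jorge.flac" is a placeholder.
gst-launch-0.10 filesrc location=jorge.ogv ! oggdemux ! vorbisdec \
    ! audiorate ! audioconvert ! flacenc ! filesink location=jorge.flac
```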
  82 [19:27] <jderose> now, let me introduce you to a handy tool you'll use all the time if you do much with gstreamer
  83 [19:27] <jderose> in a terminal, run:
  84 [19:27] <jderose> gst-inspect-0.10 vorbisdec
  85 [19:28] <jderose> sudo apt-get install gstreamer0.10-tools
  86 [19:28] <jderose> you might have to install that ^^^
  87 [19:28] <jderose> that work for everyone?
  88 [19:29] <jderose> if you scroll up in the output, you'll see something like this:
  89 [19:29] <jderose>   SINK template: 'sink'
  90 [19:29] <jderose>     Availability: Always
  91 [19:29] <jderose>     Capabilities:
  92 [19:29] <jderose>       audio/x-vorbis
  93 [19:30] <jderose> vorbisdec can receive 'audio/x-vorbis', only
  94 [19:31] <jderose> gstreamer has "caps" (capabilities) that describe what an element can consume (at its sink pads), and what an element can produce (at its src pads)
  95 [19:31] <jderose> so when you assemble and start a pipeline, the elements do some pretty amazing dynamic negotiation
  96 [19:32] <jderose> okay, back to example -
  97 [19:33] <jderose> the `audiorate` element will duplicate or drop samples in order to make the buffer timestamps match whatever the global clock of the pipeline is
  98 [19:33] <jderose> it can also correct badly constructed files, or deal with issues where one format's idea of time is different than another's
  99 [19:34] <jderose> this stuff gets tricky to make work all the time because so many of the media files in the wild are slightly broken, and don't totally comply with a spec
 100 [19:34] <jderose> `audioconvert`, okay, now we go back to gst-inspect-0.10
 101 [19:35] <jderose> gst-inspect-0.10 vorbisdec
 102 [19:35] <jderose>   SRC template: 'src'
 103 [19:35] <jderose>     Availability: Always
 104 [19:35] <jderose>     Capabilities:
 105 [19:35] <jderose>       audio/x-raw-float
 106 [19:35] <jderose>                    rate: [ 1, 2147483647 ]
 107 [19:35] <jderose>                channels: [ 1, 256 ]
 108 [19:35] <jderose>              endianness: 1234
 109 [19:35] <jderose>                   width: 32
 110 [19:35] <jderose> gst-inspect-0.10 flacenc
 111 [19:35] <jderose>   SINK template: 'sink'
 112 [19:35] <jderose>     Availability: Always
 113 [19:35] <jderose>     Capabilities:
 114 [19:35] <jderose>       audio/x-raw-int
 115 [19:35] <jderose>              endianness: 1234
 116 [19:35] <jderose>                  signed: true
 117 [19:35] <jderose>                   width: 8
 118 [19:35] <jderose>                   depth: 8
 119 [19:35] <jderose>                    rate: [ 1, 655350 ]
 120 [19:35] <jderose>                channels: [ 1, 8 ]
 121 [19:36] <jderose> so vorbisdec produces audio/x-raw-float, but flacenc consumes audio/x-raw-int
 122 [19:36] <jderose> you might try removing the `audioconvert` from that pipeline, and you'll see that things won't work
 123 [19:37] <jderose> so audioconvert sees that on one side there is audio/x-raw-float, on the other audio/x-raw-int, and it converts between the two
 124 [19:37] <jderose> make sense?
 125 [19:38] <jderose>
 126 [19:38] <jderose> i didn't have time to trim this down, but here we go
 127 [19:39] <jderose>
 128 [19:39] <jderose> look at the AudioTranscoder class
 129 [19:39] <jderose> this is a common pattern in pygst
 130 [19:40] <jderose> there is a step you need, like transcoding audio, and you want it to be reusable
 131 [19:40] <jderose> so you put the whole process into a gst.Bin, and use that element abstractly
 132 [19:40] <jderose> very handy
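The AudioTranscoder class itself isn't pasted into the log; a minimal sketch of the gst.Bin pattern being described might look like this (python-gst0.10 API — the function name and element choices are illustrative, not the repo's actual code):

```python
# Sketch of the gst.Bin reuse pattern described above. Requires the old
# python-gst0.10 bindings; this is not the actual AudioTranscoder class
# from the session's repo.
import gst

def make_audio_transcoder_bin():
    # Bundle decode -> rate-correct -> convert -> encode into one element.
    bin = gst.Bin('audio-transcoder')
    dec = gst.element_factory_make('vorbisdec')
    rate = gst.element_factory_make('audiorate')
    conv = gst.element_factory_make('audioconvert')
    enc = gst.element_factory_make('flacenc')
    bin.add(dec, rate, conv, enc)
    gst.element_link_many(dec, rate, conv, enc)
    # Expose the inner sink/src pads as ghost pads so the bin can be
    # linked into a pipeline like any ordinary element.
    bin.add_pad(gst.GhostPad('sink', dec.get_pad('sink')))
    bin.add_pad(gst.GhostPad('src', enc.get_pad('src')))
    return bin
```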
 133 [19:40] <jderose> gst.element_factory_make('queue')
 134 [19:40] <jderose> this deserves special mention
 135 [19:41] <jderose> when you have something like a jorge.ogv, which has audio and video, you need to use queues like this:
 136 [19:42] <jderose> audio side: demux => inq => dec => enc => outq => mux
 137 [19:42] <jderose> video side: demux => inq => dec => enc => outq => mux
 138 [19:43] <jderose> this is because the audio and video are interleaved in the container, and if you don't do this, things will just hang because there won't be exactly enough to keep all the consumers happy
 139 [19:44] <jderose> -- Getting Signals/Events from pygst --
 140 [19:44] <jderose>
 141 [19:45] <jderose> chadadavis: QUESTION: So, a queue can be a mux or a demux, How does it know what's what?
 142 [19:45] <jderose> well, a queue itself is neither, a queue is a type of gstreamer element
 143 [19:45] <jderose> gst-inspect-0.10 queue
 144 [19:46] <jderose> a queue just means that buffers can be added before the last was consumed
 145 [19:46] <jderose> most of the gstreamer elements are 1-to-1: consume a buffer, do stuff, produce a buffer
 146 [19:47] <jderose> self.bus = self.pipeline.get_bus()
 147 [19:47] <jderose> you get messages from pygst using a "bus"
 148 [19:47] <jderose> this is quite nice because it takes care of a threading issue that can be a pain...
 149 [19:48] <jderose> messages from the bus are only emitted in the main thread
 150 [19:48] <jderose> so your UI code can always safely manipulate the UI state based on the signal
 151 [19:49] <jderose> self.bus.connect('message::eos', self.on_eos)
 152 [19:49] <jderose> this signal is fired when the pipeline has completed, when say an entire file has been transcoded, rendered, played back, etc
 153 [19:49] <jderose> self.bus.connect('message::error', self.on_error)
 154 [19:50] <jderose> and this one when gstreamer encounters an error.... any time you build a pipeline, you'll probably have those two signals
 155 [19:50] <jderose> at least those two, that is
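Putting the bus snippets above together, the usual skeleton looks roughly like this (python-gst0.10 API; the class name is illustrative). Note the bus.add_signal_watch() call, which the log doesn't show but which is required before any 'message::*' signals are emitted:

```python
# Sketch of the bus/signal pattern described above (python-gst0.10 API).
# The Transcoder class name is illustrative, not from the session's repo.
import gobject
import gst

class Transcoder(object):
    def __init__(self, pipeline):
        self.pipeline = pipeline
        self.mainloop = gobject.MainLoop()
        self.bus = self.pipeline.get_bus()
        # Without add_signal_watch(), no message signals are emitted:
        self.bus.add_signal_watch()
        self.bus.connect('message::eos', self.on_eos)
        self.bus.connect('message::error', self.on_error)

    def run(self):
        self.pipeline.set_state(gst.STATE_PLAYING)
        self.mainloop.run()

    def on_eos(self, bus, msg):
        # Fired in the main thread once the whole file has been processed.
        self.pipeline.set_state(gst.STATE_NULL)
        self.mainloop.quit()

    def on_error(self, bus, msg):
        error, debug = msg.parse_error()
        print('error: %s' % error)
        self.pipeline.set_state(gst.STATE_NULL)
        self.mainloop.quit()
```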
 156 [19:50] <jderose>
 157 [19:51] <jderose> a Pipeline is sort of the main container for all the gstreamer elements you chain together
 158 [19:51] <ClassBot> There are 10 minutes remaining in the current session.
 159 [19:51] <jderose> so any element that is linked into the chain *must* be in the pipeline
 160 [19:52] <jderose> murphy: QUESTION: what about progress events?
 161 [19:52] <jderose> good question :)
 162 [19:52] <jderose> so gstreamer doesn't have intrinsic progress events
 163 [19:53] <jderose> so what you do is create a gobject timeout that fires every 1 second or whatever
 164 [19:53] <jderose> and then you query gstreamer to figure out where it is in the pipeline
 165 [19:53] <jderose> you would do this for a seek bar for audio/video playback
 166 [19:53] <jderose> or to get progress for transcoding
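A sketch of that polling pattern (python-gst0.10 API; the function and variable names are mine):

```python
# Sketch of the progress-polling pattern just described: since GStreamer has
# no intrinsic progress events, poll the pipeline position from a gobject
# timeout. Requires python-gst0.10.
import gobject
import gst

def on_progress(pipeline):
    try:
        pos, fmt = pipeline.query_position(gst.FORMAT_TIME)
        dur, fmt = pipeline.query_duration(gst.FORMAT_TIME)
    except gst.QueryError:
        return True  # pipeline not queryable yet; keep the timeout alive
    print('%.1f / %.1f seconds' % (pos / 1e9, dur / 1e9))
    return True  # returning True keeps the timeout firing

# Fire once a second while the mainloop runs (`pipeline` is assumed to be
# an already-assembled gst.Pipeline):
# gobject.timeout_add(1000, on_progress, pipeline)
```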
 167 [19:54] <jderose>
 168 [19:54] <jderose> i know the time's about up, but i want to talk about states a bit
 169 [19:54] <jderose> gst.STATE_NULL - no resources have been allocated at all
 170 [19:55] <jderose> gst.STATE_READY - plugins are ready, but they haven't actually touched any data, allocated buffers
 171 [19:56] <jderose> gst.STATE_PAUSED - the first buffers have been consumed, pipeline is negotiated
 172 [19:56] <ClassBot> There are 5 minutes remaining in the current session.
 173 [19:56] <jderose> gst.STATE_PLAYING - the loop is running, all the elements are consuming, producing, doing their thing
 174 [19:57] <jderose> so to query the pipeline at all, it must be in at least gst.STATE_PAUSED
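A tiny sketch of that last point (python-gst0.10 API; the pipeline string is illustrative):

```python
# Sketch: a pipeline must pre-roll to at least PAUSED before position
# queries will succeed. Requires python-gst0.10.
import gst

pipeline = gst.parse_launch('audiotestsrc ! fakesink')
pipeline.set_state(gst.STATE_PAUSED)
pipeline.get_state()  # block until the state change (pre-roll) completes
pos, fmt = pipeline.query_position(gst.FORMAT_TIME)
pipeline.set_state(gst.STATE_NULL)
```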
 175 [19:57] <jderose> well, we're about out of time
 176 [19:57] <jderose> sorry if this was a bit rough - this is my first time doing a session like this :)
 177 [19:58] <jderose> i'm going to continue to work on that example repo, make it more useful
 178 [19:58] <jderose> so thanks everyone, and enjoy all the rest of the sessions! :)

MeetingLogs/appdevweek1104/GstreamerPython (last edited 2011-04-12 19:47:50 by 178)