October 16th, 2009
Yesterday at lunchtime I wrote a simple GStreamer app in Python to run on the N900, with the video embedded in a widget inside the app. I committed it to gst-python git in the examples folder. It just does videotestsrc ! videoscale ! ximagesink. It could have been videotestsrc ! xvimagesink, but I wanted to grab a screenshot.
Code (with comments): http://cgit.freedesktop.org/gstreamer/gst-python/tree/examples/maemogst.py
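For reference, the pipeline is built from a parse-launch style description. A minimal sketch (the element names come from the post; everything else here is illustrative, and in gst-python the string would go to gst.parse_launch(), with the widget's X window handle handed to the sink so the video draws inside the app):

```python
# Sketch only: the pipeline description the post mentions. In the real
# example this string is given to gst.parse_launch(), and the sink is
# pointed at the widget's X window so the video is embedded in the app.
elements = ["videotestsrc", "videoscale", "ximagesink"]
pipeline_desc = " ! ".join(elements)
```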
October 12th, 2009
So last week, I started writing a Maemo Fremantle widget in a few spare pockets of time. One thing got me stuck: I could not get the thing to be transparent. I tried looking at other widgets’ code but just couldn’t figure it out. Finally at the Extending Hildon Desktop BoF yesterday at the Maemo Summit, I got my answer: set the colormap of the window to 32-bit RGBA.
Here is the code needed in __init__ in a Python widget (getting the RGBA colormap and actually setting it on the widget):
colormap = self.get_screen().get_rgba_colormap()
self.set_colormap(colormap)
In the expose event handler, you have to clear the window with the cairo context (which I had figured out last week), with code like:
cr = self.window.cairo_create()
cr.set_source_rgba(1.0, 1.0, 1.0, 0.0)
cr.set_operator(cairo.OPERATOR_SOURCE)  # needs "import cairo"; paint transparency rather than blend over it
cr.paint()
My Flickr fremantle tag photoset shows the progression, and what runs now is:
There is still a little work to do, like respecting theme colours and more accurate positioning of the hour, etc., before I create a package to push into extras.
October 8th, 2009
Above (taken on Fri Oct 2 2009) was the last photo taken of me with the glasses I picked up and wore for the first time on my wedding day 6 years and 2 months ago. They have found their destiny to rest in peace just outside Barcelona in Spain. Yesterday I ordered a replacement pair.
September 6th, 2009
So in my spare time, I have been hacking on a couple of new projects. The first is a web-based video switch controller for the mosque. We have an Autopatch 16-input, 48-output video matrix switch, and the APControl software that comes with it just does not suit our workflow at all: macros are tedious to set up, and figuring out which one to run is hard given the complexities of our programmes. It is also limited to running on one computer. So about 9 months ago I started on a web-based replacement called Zap, but development had stalled.

The idea was that there would be two views. One is a matrix view which lets you see the status and manually make small switching changes. The other is a view with inputs and zones, where you select one input and multiple zones (each zone consists of a set of outputs). I chose Django to make administration of the data corresponding to the inputs, outputs and zones easy, and for an easy data model to expose the initial views to the web. Orbited is used so there is dynamic feedback from any changes on the video switch to the UI (very cool to see on the matrix view: changes other people have made, whether through Zap, the original APControl, or even the LCD and physical buttons on the switch, appear instantaneously). I feed those changes, and accept change requests, over STOMP between the web pages and a daemon using twisted‘s serial port handling.

This Ramadhan, I set myself a deadline of the 15th of Ramadhan, which is today, to get it into a working state so it could be deployed at the mosque. Thankfully, I had it ready and working last weekend and deployed it on Monday. All the other users now seem used to it, and I am just working on little tweaks here and there.
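The routing model behind the two views can be sketched in plain Python. This is a hypothetical simplification (the real Zap keeps this state in Django models and pushes changes out over Orbited/STOMP; the class and zone names here are invented): each output carries exactly one input, and a zone is just a named set of outputs.

```python
class MatrixState(object):
    """Hypothetical sketch of Zap's routing state: which input feeds
    each output, plus named zones (sets of outputs)."""

    def __init__(self, inputs=16, outputs=48):
        # routing[output] -> input currently feeding it (None if unrouted)
        self.routing = dict((out, None) for out in range(1, outputs + 1))
        self.zones = {}  # zone name -> set of output numbers

    def switch(self, input_, output):
        """Matrix view: route a single output to an input."""
        self.routing[output] = input_

    def switch_zones(self, input_, zone_names):
        """Zone view: route every output in the selected zones to one input."""
        for name in zone_names:
            for output in self.zones[name]:
                self.routing[output] = input_

state = MatrixState()
state.zones = {"main hall": set([1, 2, 3]), "ladies": set([4, 5])}
state.switch_zones(7, ["main hall", "ladies"])
```

The point of the zone view is exactly this loop: one click fans a single input out to every output in the selected zones, instead of switching each output by hand in the matrix view.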
The other project I have been working on is for a charity TV station who approached me to see if I knew of any open source playout server software. I didn’t, but the playout server concept is what we have in Flumotion as the playlist producer. So I ripped out the playlist producer code from Flumotion and made it render the graphics on screen and the sound through the soundcard. Of course this is not enough: the TV station want an overlaid logo. Initially I thought about using videomixer from GStreamer, but on SD PAL it is really CPU intensive. Then I thought Clutter would do this really easily. So I went on the Clutter IRC channel and asked for a simple example in pyclutter of playing a video with a custom GStreamer pipeline and overlaying an image on top. To my delight, someone wrote one on the spot and pastebin’d it for me. I integrated it all, tested it, and kept Emanuele Bassi awake helping me by creating a jhbuild for Clutter. Fedora 11 has only Clutter 0.8.x, which meant it would not do the hardware colorspace conversion, and it also has an open bug with pyclutter segfaulting on “import clutter” on x86_64. I showed a proof of concept at a meeting on Friday evening and it worked nicely, so Clutter/GStreamer/GNonlin may well be powering a TV channel’s broadcast pretty soon. Thank you to the Clutter guys who helped me so much this week! I have pushed the code for this project to github here: http://github.com/zaheerm/zplayout/tree/master.

So the idea is that the output of the computer will be fed out as SDI (either with an SDI output graphics card or through a DVI to SDI converter) and uplinked to the provider, who would then put it into their hardware MPEG2 encoder and transport stream muxer and out onto the satellite itself. What the code now does is watch a directory for playlist files which have entries saying something like: file A is to be played at time x, starting from y seconds into the file, for a duration of z seconds.

The code then adds this into the GNonlin compositions, and at the expected time it starts playing back. Clutter does the overlaying of the logo. In the future, playlist items would include live feeds from their studio through FireWire or SDI input into the machine, live relay from DVB-S (capturing a video feed and sending it out), and live titling, tickers, etc., which I think could be done easily through Clutter. Pretty exciting stuff. So thanks to the rest of the GStreamer and GNonlin guys for making this even possible, to Flumotion for having a ready codebase to start from, and to the Clutter guys for having such an easy-to-code-with platform for cool 2D stuff with video support, and for all their patient and great support for someone who had only listened to talks about Clutter at Guadec/GCDS and never got round to playing with it!
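The playlist entries can be sketched like this (the comma-separated format below is hypothetical, purely for illustration; the real format is whatever zplayout parses). Each entry maps naturally onto a gnlfilesource in a GNonlin composition: the play time becomes its start on the output timeline, the offset into the file its media-start, and the length its duration.

```python
from collections import namedtuple

# Hypothetical entry format for illustration: one line per item,
# "path,start,media_start,duration" with times in seconds.
Entry = namedtuple("Entry", "path start media_start duration")

def parse_playlist(text):
    """Parse playlist text into entries: play `path` at `start`,
    beginning `media_start` seconds in, for `duration` seconds."""
    entries = []
    for line in text.strip().splitlines():
        path, start, media_start, duration = line.split(",")
        entries.append(Entry(path, float(start), float(media_start),
                             float(duration)))
    return entries

# In the real code each entry would become a gnlfilesource (with
# start/media-start/duration properties, in nanoseconds) inside a
# gnlcomposition, which plays it back at the scheduled time.
entries = parse_playlist("promo.ogg,0,5,30\nshow.ogg,30,0,1800")
```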
July 20th, 2009
At GCDS/Guadec, I discussed with Frank about Spykee being exposed over UPnP and started implementing a Coherence backend for it. Only tonight did I get a little time to refine and test what I had written. Below is a screenshot of UPnP Inspector and an Ogg/Theora stream exposed by the Spykee Coherence backend being sent to 2 GStreamer-based MediaRenderers. I also right-clicked on the URL in UPnP Inspector, selected Open URL, and it opened and played in Firefox 3.5 directly.
July 2nd, 2009
So yesterday was the 5th anniversary party for the Fluendo group. It only seems like yesterday that I joined the group (3 years and 4 1/2 months ago) to work for Flumotion. It was really nice of the company to fly me in to Barcelona especially for the party, so thank you.
June 20th, 2009
So this afternoon, I spent the time starting a GStreamer source element for Spykee. I have made it respond to key press navigation events coming upstream from, say, xvimagesink, and move forward, back, left and right, with d for dock. The code so far is here: http://github.com/zaheerm/zspykee/commits/master/flumotion-spykee/flumotion/component/spykee/gstreamer.py
Video with Spykee moving controlled through navigation events coming from xvimagesink: http://zaheer.merali.org/spykee_gstreamer.ogv
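The key handling can be sketched roughly like this. This is a hypothetical simplification for illustration (the function and mapping names are invented); the real element, linked above, reacts to GStreamer "key-press" navigation events sent upstream by xvimagesink when the user presses a key over the video window:

```python
# Hypothetical key-to-command mapping for illustration; the real element
# handles navigation events arriving upstream from the video sink.
KEY_COMMANDS = {
    "Up": "forward",
    "Down": "back",
    "Left": "left",
    "Right": "right",
    "d": "dock",
}

def handle_navigation_event(event_type, key):
    """Translate a key-press navigation event into a Spykee movement
    command; ignore anything that is not a key press."""
    if event_type != "key-press":
        return None
    return KEY_COMMANDS.get(key)
```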
June 18th, 2009
So tonight, I added wizard support for flumotion-spykee, so now anyone with a Spykee robot can control and stream from the robot with Flumotion without having to write a config file. It can all now be configured with ease in Flumotion, including discovery of Spykee robots on any interface your Flumotion installation is on.
Two screencasts (both Ogg Theora videos):
June 11th, 2009
Simple UI to move the Spykee robot around, committed and pushed: