Posted by & filed under FOSS Training.

We (my wife and I) have been teaching our neighbour, Sibongile (‘Bongi’), to use a computer over the last few months. We recently employed Bongi for a few evenings for the Gentle Introduction to GIS project we were contracted to do. The project involved creating educational GIS materials for teachers and scholars and was based on QGIS. Bongi did some narration work on some of the screencasts we made. A domestic worker in her early 30s, Bongi had never touched a computer before we started, and had no idea what a keyboard or a mouse was, or what clicking and dragging meant. It’s not an atypical scenario here in Africa, where many people grow up on the wrong side of the digital divide.


The one technology that has made it across the divide is the use of cellphones. Almost everyone has a cheap grayscale Nokia something or other – something like you were probably using 10 years ago. Cellphones are a good entry point into introducing the idea of a digital device as a communications tool.

Starting to train someone from scratch is a novel experience for the trainer as well as the trainee. For one thing, since the person has no idea what software is to start with, they also have no preconceptions about proprietary software versus Free Software. Naturally that means we can make their first computing experience a FOSS one (I’m talking Ubuntu here!) rather than one based on Windows. Imagine the potential this brings: with so many people still on the other side of the digital divide, they could all experience the digital world for the first time as a Free one.

The first thing Bongi learned to do was to type. I didn’t explain anything except how to log in, open Klavaro (an open source touch typing tutor) and start typing. A month or two later she has worked through all 42 or so lessons in Klavaro, and in the interim we have taught her to use the mouse, what drag and drop is, what windows are and how to resize, minimise, maximise and close them. I have been trying to teach her in a generic way – i.e. not how to use specific programs, but how to learn to use any program. I teach her to look for familiar things when she encounters a new program – maybe some text formatting buttons on a toolbar will give her cues that she can format text. Save, open and close buttons indicate that she can manage her work. When she is not sure what to do, I have shown her how to use the right-click context menu to discover functionality, and the help menu to read up more. I told her that learning to use a computer should be like moving to a new city – you need to walk around and discover the place, and find which areas you like. If you find places you like, you will revisit them more often and become more familiar with them. Learning by discovery is an important part of being flexible and able to adapt to a changing digital landscape.

Bongi still has a long way to go, but already she has come a long way. One of the interesting things has been trying to explain what a computer is actually useful for. I showed her some educational programmes that she can use to teach her 5 year old daughter new things. I showed her a word processor, spreadsheet, presentation package and drawing app (all OpenOffice tools). While the basic functionality is apparent, I think it will take her a while to see how these tools can be applied to her own life. Near the beginning I gave her a computer that I wasn’t using, and a friend donated her an old CRT screen. Yesterday I gave her an old digital camera and showed her how pictures can be copied onto the computer and set as her wallpaper. Slowly I think she is starting to see how these things converge into an information repository. Yesterday I also created her first email account and explained how the internet can be used to send messages to people. In the near future I am going to set up a blog site for her and encourage her to start blogging about her (computing and general life) experiences. I’ll put some Google ads on her blog and hopefully she will even start to generate a little income from her digital life. She doesn’t have internet access, so she will check her email and update her blog when she comes over to visit, writing her articles at home and bringing them across on an old memory stick I have given her. One other thing I haven’t mentioned is that Bongi’s first language is Zulu, so she has to learn everything in a second language, making it that much more of a challenge for her.

The coolest part is that she is experiencing everything in her new-found digital world as FOSS software. Ok, I know I said that already, but I still find it incredibly cool. I’ll probably post occasional updates on her progress, and who knows, maybe one day she will be using FOSS GIS software too! Speaking of FOSS GIS, QGIS recently got a Xhosa translator (thanks Andiswa Silinga!) and I hope to get it translated into the rest of the 11 official South African languages to make things a little easier on folks when they do get to the point of wanting to learn GIS.

If you would like to write to Bongi to wish her well on her digital journey (she will be thrilled to receive her first emails too I am sure!), you can pop her a note on: speperembe at gmail dot com

I haven’t yet explained to her all the rather more unfathomable aspects of the internet, so hopefully she doesn’t get inundated with Nigerian get-rich-quick schemes or offers for viagra!

Oh, and one more thing: some of Bongi’s neighbours have found out that she is learning computers and are keen to learn too. I’ve started teaching Thandi (pupil #2!) but don’t have a computer for her. If anyone has old laptops they don’t need (they really don’t have to be anything fancy, just enough to run Ubuntu or Xubuntu etc.) and would rather see them put to good use than gathering dust in a cupboard, please contact me! Laptops are particularly good as they have a built-in UPS (the power supply here is extremely variable, especially in the rainy season) and people typically live in single-room dwellings without a lot of space.

Do you have similar experiences? I’d love to hear any tips and tricks!

Posted by & filed under General FOSSGIS.

I’ve been researching options for thin clients over the past few months and trying to get my Fit PCs to boot over LTSP / etherboot (still no joy so far). I do have them working as thin clients via XDMCP though. Along the way I have found some handy websites which I thought I would make a note of here:

VM from GDM – This one explains how to set up a custom session type so that it logs you directly into a virtual machine instance from the Gnome Display Manager. You get a full-screen Windows (or whatever) session with no Gnome running in the background.

Thin Client Howto – This one is a slightly out-of-date but still useful run-through of how to get clients booting via etherboot. – This is a commercial vendor of thin client workstations, some of which support dual screens, which is useful for GIS workers and software developers. I haven’t purchased any of their products and would love to hear from those who have.

Ubuntu LTSP quick install guide – contains a useful hint for setting up a 64-bit server with 32-bit clients (sudo ltsp-build-client --arch i386)

This guide has a few handy nuggets, like reminding you that every time you update the server you should re-run the client image update commands.


One other thing to note is that I believe LTSP is now bundled with the Ubuntu alternate CD, so I’m thinking I might do a complete system re-install when 9.10 comes out.

I really want to get this LTSP stuff working nicely – I think it’s an environmentally responsible way to go (my FIT-PCs only consume around 7 watts of electricity), and it makes a lot of sense administratively: you configure and secure one server, and all your clients automatically benefit from each admin action.

Posted by & filed under General FOSSGIS.

Hey hey hey! Our openModeller article has finally been published in the journal GeoInformatica. Unfortunately it’s not an open access article, so contact me if you want a copy (assuming I am allowed to redistribute it). Here is the abstract:

Species’ potential distribution modelling is the process of building a representation of the fundamental ecological requirements for a species and extrapolating these requirements into a geographical region. The importance of being able to predict the distribution of species is currently highlighted by issues like global climate change, public health problems caused by disease vectors, anthropogenic impacts that can lead to massive species extinction, among other challenges. There are several computational approaches that can be used to generate potential distribution models, each achieving optimal results under different conditions. However, the existing software packages available for this purpose typically implement a single algorithm, and each software package presents a new learning curve to the user. Whenever new software is developed for species’ potential distribution modelling, significant duplication of effort results because many feature requirements are shared between the different packages. Additionally, data preparation and comparison between algorithms becomes difficult when using separate software applications, since each application has different data input and output capabilities. This paper describes a generic approach for building a single computing framework capable of handling different data formats and multiple algorithms that can be used in potential distribution modelling. The ideas described in this paper have been implemented in a free and open source software package called openModeller. The main concepts of species’ potential distribution modelling are also explained and an example use case illustrates potential distribution maps generated by the framework.

Posted by & filed under QGIS.

Well, I found this oracle-xe article hilarious. Here are a couple of choice phrases:

"Deploying [oracle-xe] can also offer a solution to the common problem of users or developers downloading and installing open-source databases, which leaves you with a minefield of maintenance, support, and security headaches"

Excuse me? I could easily rewrite this to say

"Deploying an open source database addresses the common problem of paying exorbitant license fees for a proprietary database that hogs resources, requires expensive DBA staff to keep it running and offers few tangible advantages over a robust, easy to manage, secure and maintain open source database system such as PostgreSQL."

The article has more similarly laughable statements… which leave me just shaking my head…

Posted by & filed under General FOSSGIS.

University of the Witwatersrand

Today Graeme McFerren and I gave a presentation about the FOSSGIS software stack to a group of honours students at the University of the Witwatersrand. The group was receptive, and since they had pretty much never heard of FOSSGIS, it was a great opportunity to show them that there are alternatives out there! The presentation we gave is attached to this post…

TheFossGisStack (odp)
TheFossGisStack (pdf)

Posted by & filed under General FOSSGIS.

Mapnik is an open source, high quality map rendering engine. From the front page of their site:

"Mapnik is a Free Toolkit for developing mapping applications. Above all Mapnik is about making beautiful maps. It is easily extensible and suitable for both desktop and web development."

For some time I have been planning to take a closer look at it, but I’ve been too busy to sit down and pick my way through it. Over the last two evenings I finally made a go of it: I built it on my system and took it for a test drive. The first step to installing (on Ubuntu Jaunty 64-bit) was to get the build dependencies from apt.

Mapnik Library Setup

sudo apt-get install g++ cpp libboost-dev libxml2 libxml2-dev \
libfreetype6 libfreetype6-dev libjpeg62 libjpeg62-dev libltdl7 \
libltdl7-dev libpng12-0 libpng12-dev libgeotiff-dev libtiff4 libtiff4-dev \
libcairo2 libcairo2-dev python-cairo python-cairo-dev libcairomm-1.0-1 \
libcairomm-1.0-dev ttf-dejavu ttf-dejavu-core ttf-dejavu-extra libgdal1-dev \
python-gdal postgresql-8.3-postgis postgresql-8.3 postgresql-server-dev-8.3 \
postgresql-contrib-8.3 libsqlite3-dev  subversion build-essential

Note: the above list was updated for Karmic, 29 Nov 2009.

I am installing the most current state of the software from the mapnik subversion trunk. To use mapnik with the Quantumnik QGIS plugin (which we cover further down), you need at least version 0.6.1 of Mapnik. Using the trunk covers that base, but you could also install from the official release tarballs. The mapnik packaged in apt, however, is too old, so we need to build from source.

Next I went to my development dir:

cd ~/dev/cpp

And then checked out the mapnik sources:

svn co

Then I built mapnik:

cd mapnik
python scons/scons.py configure INPUT_PLUGINS=all \
  OPTIMIZATION=3 SYSTEM_FONTS=/usr/share/fonts/truetype/ttf-dejavu/
python scons/scons.py
sudo python scons/scons.py install

It didn’t take long to build and install. Since I am running the 64-bit version of Jaunty, I had to make a small tweak to my system library search path:

sudo vim /etc/

To which I added the following line:


And then I updated the library search path:

sudo ldconfig
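For anyone following along: a default Mapnik install puts its shared libraries in /usr/local/lib, so the entry needed is typically just that one path dropped into the ld.so.conf machinery. The exact file name below is my assumption, not necessarily what was used here:

```
# /etc/ld.so.conf.d/mapnik.conf -- file name is illustrative
/usr/local/lib
```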

Next I did the obligatory ‘hello world’ test:

import mapnik

Installing the Quantumnik plugin

So the mapnik library is a high quality map rendering engine. To define the maps, you need to create XML definition files (mapfiles). The Quantumnik plugin for QGIS will take an existing QGIS project and generate a mapnik mapfile for it – with some limitations. The limitations stem from limitations in the QGIS symbology infrastructure, which does not allow for named styles, multiple symbols per feature or scale-based symbol switching in a single layer. Some of these issues should be resolved in QGIS 1.3 with the symbology work that Martin Dobias is busy with. In the meantime you can still create a fairly pleasing mapfile using QGIS.
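To give a feel for what such a mapfile contains, here is a minimal hand-written example. The element and attribute names follow the Mapnik 0.6 XML format as best I recall it, so treat this as an illustrative sketch rather than actual Quantumnik output:

```xml
<Map bgcolor="#f0f0ff" srs="+proj=latlong +datum=WGS84">
  <Style name="roads">
    <Rule>
      <LineSymbolizer>
        <CssParameter name="stroke">#444444</CssParameter>
        <CssParameter name="stroke-width">1.5</CssParameter>
      </LineSymbolizer>
    </Rule>
  </Style>
  <Layer name="roads" srs="+proj=latlong +datum=WGS84">
    <StyleName>roads</StyleName>
    <Datasource>
      <Parameter name="type">postgis</Parameter>
      <Parameter name="host">localhost</Parameter>
      <Parameter name="dbname">gis</Parameter>
      <Parameter name="table">roads</Parameter>
    </Datasource>
  </Layer>
</Map>
```

The layer, database and style names are made up for illustration – Quantumnik fills these in from your actual QGIS project.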

To start I added the mapnik repository to QGIS using the python plugin manager. The repo url is:

After adding the repo I went ahead and installed the Quantumnik plugin, enabled it in the Plugin Manager in QGIS and then restarted QGIS.

Next I put together a simple project in QGIS using vector layers stored in a PostGIS database.

Note that the layers need to be listed in the geometry_columns table in PostGIS in order for mapnik to recognise them.

Once my project was ready, I could use the Plugins -> Quantumnik -> Quantumnik menu to switch the QGIS map canvas over to rendering the view with Mapnik. Once you are happy with the result, use the Plugins -> Quantumnik -> Export XML menu to write the mapnik XML mapfile out to disk.

Here is what my project looked like using the standard QGIS render engine.

Same scene rendered using QGIS native engine

And here is what the mapnik renderer looks like:

Quantumnik in action

Testing the tilelite standalone server

You can test your mapnik mapfile out using a lightweight web server, akin to the development server used for testing Django projects. First we need to install the Mercurial revision control system:

sudo apt-get install mercurial

Then install the tilelite server:

cd ~/dev/python
hg clone
cd tilelite
sudo python setup.py install

Now go to where you exported the mapnik mapfile using Quantumnik – I put mine into /tmp/ while testing. Then run the tilelite ‘liteserv’ server against the mapfile:

liteserv.py mapnik.xml

Finally point your web browser at http://localhost:8000 and you should see your map.

Publishing your map

From here you have basically three options for publishing your maps to the greater world:

– use the apache mod_tile module to serve the tiles in Google Mercator
– use tilecache (see my previous blog post on setting up tilecache). Tilecache has special support for caching mapnik data. You just need to remember to specify the projection = <proj4literal> and extension=png256 options. The latter reduces the size of the rendered tiles.
– use OGCServer (also a mapnik project) to publish the map as WMS. Publishing the map as WMS means you should be able to use it from a WMS client like QGIS, since it does not render to tile boundaries.
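For the TileCache route, the layer section in tilecache.cfg ends up looking something like the following – the layer name, mapfile path and projection value are placeholders for your own:

```ini
[mymapniklayer]
type=Mapnik
mapfile=/tmp/mapnik.xml
projection=+init=epsg:4326
extension=png256
```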

You can get an idea of how simple it is to overlay your mapnik data onto Google Maps from this code example:
In a follow-up post I plan to walk through the configuration I use to publish a pleasing-looking map via mapnik and TileCache.


Many thanks to Dane Springmeyer (mapnik developer) who walked me through the whole process outlined above – much appreciated!

Posted by & filed under Postgres & PostGIS.

Someone asked on Twitter whether it is possible to dump all the tables in Postgres to individual shp files. Some time ago I wrote a script to dump all tables as SQL dumps, and the question prompted me to tweak that script to drop out shapefiles instead.

My original script looked like the listing below. The dump files contain data only (see the comments in the bash script below) because I use this script to create fixtures for my Django projects.


#!/bin/bash
# A script to create sql formatted fixtures (serialised models)
# used to initialise the application if you install it on another
# machine. You should run this any time you change your models
# or when you need to make a backup of all your data.

# Tim Sutton 2009
mkdir -p bees/sql
for TABLE in `echo "\d" | psql sabio | grep -v seq | awk '{print $3}'`
do
  echo $TABLE
  # -a data only
  # -t table
  # -D dump as sql inserts
  pg_dump -a -t $TABLE -D sabio > bees/sql/${TABLE}.sql
  #bzip2 bees/sql/${TABLE}.sql
done
To make the script drop out shapefiles I modified it a bit, as shown in the next listing. Obviously, since we are dumping shapefiles, we should only bother with tables that contain geometry, so I went the route of using the geometry_columns table to decide which tables to dump…


#!/bin/bash
# A script to dump shapefiles of all tables listed in geometry_columns
# Tim Sutton 2009
mkdir -p bees/sql
for TABLE in `echo "select f_table_name from geometry_columns;" | psql sabio \
  | head -n -2 | egrep -v "\-\-\-\-\-\-\-\-\-" | egrep -v "f_table_name"`
do
  echo $TABLE
  pgsql2shp -f bees/sql/${TABLE}.shp sabio $TABLE
done
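As an aside, composing the per-table commands is trivial to do from Python too if you ever want to drive this from a larger script. This is just a sketch – the table and database names are made up, and in a real run you would read them from geometry_columns just as the bash script does:

```python
# Compose the pgsql2shp command line for each geometry table.
# Database and table names below are illustrative placeholders.

def shp_dump_commands(database, tables, out_dir="bees/sql"):
    """Return one pgsql2shp command line per table."""
    return [
        "pgsql2shp -f {out}/{t}.shp {db} {t}".format(out=out_dir, t=t, db=database)
        for t in tables
    ]

if __name__ == "__main__":
    for cmd in shp_dump_commands("sabio", ["localities", "specimens"]):
        print(cmd)
```

You could then hand each command line to subprocess to actually run the dumps.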

Hope this is useful to someone out there :-)

Posted by & filed under General FOSSGIS.

I have been working on building a small training lab. The idea is that I will have six thin client PCs and use my day-to-day desktop machine as a server. The server will run DHCP and the Linux Terminal Server Project (LTSP) server modules to provide a boot-over-ethernet (PXE) service for the thin clients. With LTSP I will be able to look at the screens of my trainees and share my screen back with them.

I did the basic install something like this:

sudo apt-get install ldm ldm-ubuntu-themes thin-client-manager-backend \
         thin-client-manager-gnome ltsp-manager controlaula ltsp-server-standalone \
         ltsp-build-client ltsp-server

I’ve disabled the DHCP functions on my home router and set up my dhcp configuration file like this:

# Default LTSP dhcpd.conf config file.


subnet netmask {
    # Give any computer requesting an address an IP Address in the range 20 to 250
    option domain-name "";
    # Using open dns prevents phishing attacks and other potential dns based exploits
    option domain-name-servers,;
    # Example of how to give a computer the same IP address each time it connects
    host timbuntu {
      # Always make this computer ip 1
      hardware ethernet 00:22:3f:df:bf:0b;
    }
    option broadcast-address;
    option routers;
    option subnet-mask;
    # For pxe boot

    # next-server;
    # get-lease-hostnames true;
    option root-path "/opt/ltsp/i386";
    if substring( option vendor-class-identifier, 0, 9 ) = "PXEClient" {
        filename "/ltsp/i386/pxelinux.0";
    } else {
        filename "/ltsp/i386/nbi.img";
    }
}

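Since the address values make or break a dhcpd.conf, here is a complete illustrative version on an imaginary 192.168.0.0/24 network. Every address below is a placeholder rather than my actual setup (the DNS servers shown are the public OpenDNS resolvers):

```
# Illustrative LTSP dhcpd.conf -- all addresses are placeholders
subnet 192.168.0.0 netmask 255.255.255.0 {
    range 192.168.0.20 192.168.0.250;
    option domain-name "example.lan";
    option domain-name-servers 208.67.222.222, 208.67.220.220;
    option broadcast-address 192.168.0.255;
    option routers 192.168.0.254;
    option subnet-mask 255.255.255.0;
    option root-path "/opt/ltsp/i386";
    next-server 192.168.0.2;
    host timbuntu {
        hardware ethernet 00:22:3f:df:bf:0b;
        fixed-address 192.168.0.1;
    }
    if substring( option vendor-class-identifier, 0, 9 ) = "PXEClient" {
        filename "/ltsp/i386/pxelinux.0";
    } else {
        filename "/ltsp/i386/nbi.img";
    }
}
```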
I did encounter one gotcha when following the instructions, due to the fact that the server runs x86_64 Ubuntu and the clients are 32-bit. To take care of future clients that may also be 64-bit, I added client support for both architectures like this:

sudo ltsp-build-client
sudo ltsp-build-client --arch=i386

Currently I am using my fit-pc as the thin client, and I will probably look at buying four Acer Aspire One netbooks (going for ZAR 2000 a pop now at Incredible Connection!) for the remaining clients, assuming they can do PXE boot and run at a decent resolution on an external monitor.

I’m still getting things set up, configured and tested, so I can’t say yet how well this approach works out. My fallback will be to use NX Server if it doesn’t, which has the advantage of requiring less configuration and less bandwidth, but the disadvantage of not providing the training room tools like screen takeovers and screen sharing. I’ll post more once I have everything running in earnest to let you know how it works out…

Posted by & filed under Postgres & PostGIS.

Ok, so I have a few production databases that I need to back up regularly. The trick is that I want to run the backup from a remote machine so that the backup lives on a separate server from the actual database system. You can run a backup manually like this (assuming your database is called ‘postgis’):

pg_dump -h dbhost -f postgis_`date +%d%B%Y`.sql.tar.gz -x -O -F tar postgis

When you run the above command you will be prompted for a password, after which you will find a date-stamped backup file. Very nice, but you may have noticed that pg_dump has no option for passing the password on the command line – it expects you to enter it interactively. So what do we do if we need to automate the backup from a cron job? The solution is to use either ~/.pgpass or the PGPASSWORD environment variable. Here is how I automated the backup by placing a script in /etc/cron.daily/

#!/bin/bash
export PGPASSWORD=secret
pg_dump -h dbhost -f postgis_`date +%d%B%Y`.sql.tar.gz -x -O -F tar postgis
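For the ~/.pgpass route, the file format is hostname:port:database:username:password, one entry per line – the values below are placeholders:

```
# ~/.pgpass -- all values are placeholders
dbhost:5432:postgis:postgres:secret
```

Remember to chmod 600 ~/.pgpass, otherwise libpq will ignore the file.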

Posted by & filed under Uncategorized.

I’ve been doing quite a bit of web development these days and have found the jQuery library to be indispensable. Here are a few add-ons and samples that I found to be really useful:

I’ll add to this list over time as I come across more interesting gizmos.