Rufus Pollock’s keynote was the highlight of the weekend’s Open Data Mixer event. Pollock gave an entertaining and inspiring talk about open knowledge. The key takeaway messages were that open data is a platform, not a commodity, and that open data is a means to spur action and behaviour change, rather than being an end in itself.
While there were several interesting sessions, and the breaks were enlivened by a showcase of the GovHack winners from Queensland, at times it did feel a little like Groundhog Day in terms of the territory covered. The Queensland Government is clearly moving forward with its open data agenda, gradually extracting data from departments with entrenched cultures around data ownership. For this, they should be congratulated, despite the dubious quality and utility of some of the data that has been released. As NICTA’s Bill Simpson-Young pointed out at the event, in the early phases of an open data initiative, if the choice is between releasing tiny amounts of clean, machine-readable data, and lots of dirty data in formats that are not easily machine-processable, the latter choice should win most of the time.
The big question for the state government, as it attempts to keep community interest in open data high, is where the conversation moves from here. Yes, it will be interesting to see what applications are built for the Queensland Premier’s Awards. Yes, it’s great that agencies and departments such as Translink and the Queensland Police are seemingly very committed to the open data cause. Yes, it will be a long and slow process effecting cultural change in many of the other departments; it’s completely understandable that there are reservations about releasing certain data sets.
In terms of the bigger open data picture in Queensland, though, where do we go from here? Simon Cook of Translink hit upon one of the more interesting conversations to be had in the coming months and years: collecting and publishing real-time data can be a very expensive exercise, and innovative techniques will be required to supplement the existing data collection infrastructure in many domains. Cook stated that, while he was very happy with the pilot real-time feeds available for the CityGlider bus service and some Logan bus services, they were also proving to be expensive.
Crowdsourced data from commuters carrying mobile phones might be part of the solution in the case of real-time tracking of buses. This technique is being used to good effect by Moovit, a public transit app originating from Israel, now operating in more than sixty cities around the world, including Adelaide, Perth, Canberra and Sydney.
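To give a rough sense of how crowdsourced tracking can work, here is a minimal, hypothetical sketch (not how Moovit actually does it): each commuter's phone reports a GPS fix for the bus it is on, and a robust aggregate — the median, with obvious outliers discarded — gives a usable position estimate even when some reports are noisy or wrong. The coordinates and threshold below are illustrative assumptions.

```python
import statistics

# Illustrative outlier threshold in degrees (~1 km); an assumption, not a real system's value.
THRESHOLD = 0.01

def estimate_position(reports):
    """Estimate a vehicle's position from crowdsourced (lat, lon) reports.

    Takes the coordinate-wise median, drops reports far from it
    (likely from people not actually on the bus), then re-takes the median.
    """
    lats = [lat for lat, _ in reports]
    lons = [lon for _, lon in reports]
    med_lat, med_lon = statistics.median(lats), statistics.median(lons)
    # Keep only reports close to the initial median estimate.
    inliers = [
        (lat, lon) for lat, lon in reports
        if abs(lat - med_lat) <= THRESHOLD and abs(lon - med_lon) <= THRESHOLD
    ]
    return (
        statistics.median(lat for lat, _ in inliers),
        statistics.median(lon for _, lon in inliers),
    )

# Three plausible reports near a Brisbane bus plus one bad fix far away.
reports = [(-27.470, 153.020), (-27.471, 153.021), (-27.469, 153.019), (-27.6, 153.3)]
print(estimate_position(reports))  # the outlier is discarded
```

A production system would also need to weight reports by GPS accuracy, match them to a route, and smooth over time (e.g. with a Kalman filter), but the core idea is the same: many cheap, unreliable sensors aggregated into one useful signal.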
Cook gave his own example of an innovative method for determining the position of passenger ferries on the Brisbane River. His team conducted an experiment to find out whether it was possible to determine the location of Brisbane River ferries and CityCats by listening in to their position calls, which are broadcast over VHF when they approach sharp bends and bridges and when overtaking other vessels. At the cost of a few hundred dollars, the team bought a VHF receiver, set it up on a desk at Translink and monitored the relevant VHF frequency. Sure enough, they were able to pick up the locations of Brisbane River ferries. This could prove to be a cost-effective technique for tracking ferries on the Brisbane River, and does not require the vessels to be modified.
At a time when government departments are feeling cost pressures while also being pressed to deliver on open data, innovative solutions that harness the collective power of citizens should come to the fore. Whether this can be achieved, however, is largely down to government policy-makers, who will need to ensure the path is clear for departments that wish to employ “crowd power” to fulfil their open data goals.
TSJ was there to cover proceedings. Here’s a gallery of (rather dark) photos from the Open Data Mixer.