Category Archives: Web 2.0

Urban Planning in the 21st Century

Over the last few months the new CRC-SI 2 has been kicking into motion, positioning itself around the research areas the new program aims to pursue. Those who have been involved will no doubt be aware of the history of the CRCSI, but for those not up on the current lingo, CRCSI is short for the Cooperative Research Centre for Spatial Information, and the ‘2’ denotes the second round of funding the organisation has received from the federal government.

With this second round of funding the CRCSI will be able to work for the next seven years around the theme of Spatially Enabling Australia. The core research areas are Positioning, Automated Spatial Information Generation and Spatial Infrastructures. See this handout for more information.

As part of the CRCSI 2 work programs a number of application areas will also be investigated, and one has piqued my interest. The Sustainable Urban Planning program (program 4.5 pdf: here) organisers recently held a seminar on the subject of greyfields.

“Greyfields are the ageing occupied residential tracts of suburbs that are physically, technologically and environmentally obsolescent… typically found in a 5 to 25 kilometre radius of the centre of each capital city” – Professor Peter Newton.

With population growth, affordable housing pressures and an increased need for better government spending, the rejuvenation of these areas of urban living is proving a great challenge. What I feel will be key for this research theme is community engagement on urban planning, as it was highlighted that within greyfields most of the land is under private ownership. This is an interesting sticking point for redevelopment in light of current issues.

With housing affordability and population growth being hot topics, and urban sprawl becoming more and more unsustainable, greyfields are key to helping address these problems. In another session I attended recently, titled ‘Boom Town 2050’, it was highlighted that dwelling density in Australia is not at the levels needed to support population growth. Hopefully, with research, community engagement and, importantly, action, the use of spatial information in this area will be seen as critical to realising and communicating what needs to be done.

Moving Away from Gov Silos

A few days ago the Rudd government posted its response to the Gov 2.0 taskforce report, Engage: Getting on with Government 2.0. The response, hosted on the Department of Finance and Deregulation (DFD) website, can be found here.

Reading through the response, I am pleased with most aspects of the Government wanting to take action on making more data available to the public and generating tools that will help inform and direct new policy. However, the steering committee that will be set up to direct the DFD is, for the better part, made up entirely of federal government agencies. That is my first bone of contention, but more on that in a sec.

Some things to note coming out of the report included:
– Defining what Public Sector Information should be
o free
o based on open standards
o easily discoverable
o understandable
o machine-readable
o freely reusable and transformable.

– Establishment of metadata standards to improve sharing, reuse and discoverability of PSI. All well and good, and there are standards that can be adopted, although a focus on how custodians can easily manage and update metadata needs to be high on the agenda.
– The creation of a ‘Gov 2.0 Awards’ program that will recognise outstanding practice in the use and impact of Gov 2.0 tools to improve agency and program performance. Nice idea, although will this lose focus on the bigger picture of interagency collaboration and the overall reduction of duplication across government? I hope the awards will take into account those agencies that, without producing a big wondrous application, get into the nitty gritty of creating a more efficient government.
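The PSI principles and metadata point above can be sketched concretely. Below is a minimal, hypothetical machine-readable record for a dataset; the field names are illustrative (loosely Dublin Core in spirit) and not an official Australian government schema:

```python
# A minimal, hypothetical machine-readable metadata record for a PSI
# dataset. Field names are illustrative, not an official schema.
import json

record = {
    "title": "Bushfire incident points",
    "publisher": "Example State Land Agency",   # hypothetical custodian
    "licence": "CC-BY",                         # Creative Commons: freely reusable
    "format": "GeoJSON",                        # open, machine-readable standard
    "updated": "2010-05-01",
    "download_url": "https://example.gov.au/data/bushfires.geojson",
}

# A record is only discoverable and reusable if the core fields are filled in.
REQUIRED = {"title", "publisher", "licence", "format", "updated", "download_url"}

def is_discoverable(rec):
    """Check that every core metadata field is present in the record."""
    return REQUIRED.issubset(rec)

print(is_discoverable(record))
print(json.dumps(record, indent=2))
```

The point of the check is the custodian-management issue raised above: a catalogue is only as useful as the completeness and currency of its records.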

In addition, the adoption of Creative Commons should greatly increase participation and use of the data. I will say that while using Creative Commons is great, thought needs to go into how a custodian can still assure correctness where required.

There are two areas in which I am a little disappointed:

1. The Gov 2.0 response to the report (and even the original report itself) gave little recognition to state-level and local-government-level data. Much community interaction happens at this level, more so at the local government level, and so I feel more thought and support needs to be focused on this area.

2. The steering group for Gov 2.0 looks to be formed entirely of federal government agencies, giving no thought to private industry (who will be supplying and even building the Web 2.0 tools that support government), academia, or the citizens who will be the beneficiaries of the openness of government data.

If Government is really going to get on with Gov 2.0, then we need to realise that the world extends beyond government; to be proactive around the use of PSI, we need to engage with those who might get the best use out of access.

With more and more governments pushing towards open access to data, such as the UK and US portals, we should take our lead from them and look at what we can implement to ensure a Data.Gov.Au portal becomes a success. Catalogues, web services and download realms are a good start, but let’s fully embrace what the term “2.0” is supposed to represent and ensure the portal has a good user interface that makes accessing data easy and useful.

The Age of Social Media: A Look at Emergency Management

Happy New Year to you all. It has been a while since my last update on Project Spatial, and in that time quite a bit has happened within Australia and the spatial industry. As with each new year in Australia, extreme weather conditions seem to be the norm and bushfires are ever prevalent.

Recently in Western Australia, bushfires have ravaged the town of Toodyay, and multiple fires are burning all over Australia, which makes me think of a round-table discussion I participated in near the end of 2009. Under the Gov 2.0 taskforce, a project emerged on how the government could use Web 2.0 technologies within the social media sphere to help manage incidents such as bushfire, flood and the like.

The project, which only ran for a little over a month, has delivered its report on how government can use social media tools to help prepare for and manage emergency situations. Key to its findings is that government needs to be able to convey trust, transparency and timeliness. In some situations, accuracy and reliability can be traded off against timeliness of information. Getting a message out there stating the threat can be more important than knowing exactly where the threat is. Of course, you don’t want to instigate panic.

I’ve talked about mashups in previous posts, and the ability to provide timely information can easily be mapped. Take the Landgate Firewatch service out of Western Australia: this data feed can be combined with other feeds to create a ‘mashup’ of incidents happening around Australia. One good example combines RSS feeds from NSW and Victoria with Firewatch, BOM weather and other data into a simple map.
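A mashup like this is conceptually simple: pull each agency’s feed, extract the coordinates, and merge the items into one list ready for mapping. A rough sketch, using a hard-coded GeoRSS sample in place of the live NSW/Victorian feeds (the feed layout here is an assumption, not the actual agency format):

```python
# Sketch: merge incident items from GeoRSS feeds into one list of
# (title, lat, lon) tuples ready for plotting on a map. A hard-coded
# sample stands in for live feeds, which you would fetch by URL.
import xml.etree.ElementTree as ET

GEORSS = "{http://www.georss.org/georss}"

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <item>
      <title>Grass fire - Toodyay</title>
      <georss:point>-31.55 116.47</georss:point>
    </item>
    <item>
      <title>Bushfire - Gippsland</title>
      <georss:point>-37.80 147.10</georss:point>
    </item>
  </channel>
</rss>"""

def incidents(feed_xml):
    """Yield (title, lat, lon) for every item carrying a georss:point."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        point = item.find(GEORSS + "point")
        if point is None:
            continue
        lat, lon = (float(v) for v in point.text.split())
        yield item.findtext("title"), lat, lon

# Combine however many feeds you have into one 'mashup' layer.
layer = [inc for feed in (SAMPLE_FEED,) for inc in incidents(feed)]
print(layer)
```

From there, each tuple in the merged layer becomes a marker on whichever mapping API you choose.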

Further out from Australia, we are even seeing new and exciting uses of social media for EM. For example, in San Francisco there is a Twitter account set up for the earthquake-prone area that people can subscribe to. Combine this with TwitPics and you have a detailed account of an earthquake, providing more timely information, shared across many users faster than traditional media sources.

What is needed in Australia is a coordinated approach to 2.0 technologies in areas such as emergency management. Setting standards and policies will help ensure that information is not abused or, worse, becomes mistrusted.

The key really is to keep it low tech (another finding of the EM 2.0 report). As technology evolves very quickly and newer generations turn away from traditional media channels (radio, TV), any implementation of 2.0 technologies needs to stay consistent, reach a broad range of users and be simple. Technology isn’t a barrier, although controlling how much technology is used will remain a factor. Remember, they say you only have 8 seconds to capture someone’s attention on the Internet, so information related to EM incidents needs to stay clearly articulated and remain accessible.

Interesting Case Study on the Victorian Black Saturday Fires:


Attrib: Government 2.0 Taskforce Draft Report 2009.

It was with great excitement that I downloaded the preliminary draft report of the Australian Government 2.0 Taskforce a couple of days ago, and yet it is with some sadness that I write this post. The ‘Engage – Getting on with Government 2.0’ report details the taskforce’s findings on where Australia should be moving in the area of open access to public sector information (PSI). The document is a long read even if you only skim through or read the executive summary, and this in my opinion is one of the biggest pitfalls in government: great work gets lost in translation. Web 2.0 and Government 2.0 are about interaction, engagement and fostering cultural change through collaboration, open access to data and crowd-sourced interaction with data. So why is it that the community we are trying to engage gets lost along the ride through these large, cumbersome documents? If anything, Web 2.0 is about simplicity and interaction, certainly not segregating your audience into those who have the patience to read through an engagement plan and those who do not.

  1. This report is a step in the right direction, and I do not want to tarnish the effort the taskforce put into writing it, although I feel there are components lacking. Where is WA in the scheme of things? OK, as a sandgroper I take this one to heart, but it must be noted that while this report details the undertakings of state government initiatives, WA is nowhere to be seen. Poor form if the Shared Land Information Platform (SLIP) doesn’t get a mention in a report detailing and promoting open access to public sector data. To that, one might even ask: where is LIST in Tasmania?
  2. Integration methods. Knowing through experience how long it can take to successfully integrate data before it becomes usable, it was with a heavy sigh that I could not find any reference to having the designated “lead agency” lead a common integration framework. The report details interoperability between different systems and the data used in those systems, yet what I have found key to interoperability is the integration of data. For a very long time governments and the private sector have achieved interoperability by sharing data manually and transforming it to suit their systems, yet ease of integration, especially through a Web 2.0 framework, must have its place in the sun. Surely this is where location systems and spatial data should have been referenced (and included as terms in the glossary!)
  3. Governance. A lead agency concept is recommended throughout the report, and I’m sure governance will be a requirement. If this is the case, I would like to see governance by government that meets community and business needs be a driving factor. Opening access to data invites scrutiny and misinterpretation. Governance on how data should and could be used and, importantly, fed back will be a success factor going forward.

And finally: ‘Information’. Data sharing and opening access through Creative Commons frameworks is a great step, but if we cannot capture data through governance frameworks, and if it cannot easily be integrated, then it is difficult to derive information from the data that will inform government, inform policy and inform the community. Information that is easily understood and acted upon will drive a proactive, engaged Australian information economy. To this end, I recommend reading the report and providing feedback on its future directions.

Where did I go at 1am in the morning?

The answer: the O’Reilly Where 2.0 Online Conference. This was an online conference focused on utilising the Apple iPhone’s sensors and how applications can easily be built to use them in weird and wonderful ways. Quite an insightful conference, and I was amazed at how awake I was at 1am.

So why would I attend an online conference, particularly one at 1am?

  1. Most importantly, it allowed me to attend in my PJs, as the conference was run on New York time;
  2. Online participation is exceedingly high. No more waiting for someone to stand up and ask that first question. Just type away!
  3. I could save the presentations as they were given.
  4. My work did not want to fork out the cost of sending me over to America.

The online conference turned out to be a bit of a code fest, although I did gain some pretty insightful knowledge into what goes into building an application. It was especially interesting to see how the sensors are being used and what the developers would like to see added. So, lesson one: these are the sensors in your modern iPhone:

  1. The Accelerometer – This pivots and turns the screen based on the movement of the iPhone,
  2. The Magnetometer – The digital compass in the 3Gs and the basis of many new and cool applications for the iPhone.
    Magnetometer Settings

  3. The GPS Receiver – this is what gives you your location, although the phone falls back to triangulation (~700 m accuracy) when you don’t have clear line of sight to the sky.
  4. The Proximity sensor – This turns the screen off so you don’t accidentally hang up while talking to someone on the phone!

The Where 2.0 conference set up this dedicated session on the iPhone because it is the most dominant ‘smart’ phone in Australia (and most places in the world, for that matter), and because more spatial data requests and captures will happen on devices like these in the future than from traditional desktop GIS applications. In fact, it was predicted back in 1999 by Max Egenhofer, speaking at the 1st Brazilian Workshop on GeoInformatics, that the smart phone would be the leading GIS device of the future:

“Spatial Information Appliances – portable tools for professional users and a public audience alike, relying on fundamentally different interaction metaphors: Smart Compasses that point users into the direction of certain points of interest, Smart Horizons that allow users to look beyond their real-world field of view or Geo-Wands – intelligent geographic pointers that allow users to identify geographic objects by pointing towards them”

Ref: Simon R., Fröhlich P. & Anegg H., ‘Beyond Location Based – The Spatially Aware Mobile Phone’.
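The “Geo-Wand” idea falls straight out of the sensors listed above: combine the GPS fix with the magnetometer heading and pick the point of interest whose bearing best matches where the phone is pointing. A toy sketch of that logic (the POI list and the angular tolerance are made-up illustrations, not anything from the conference):

```python
# Toy "Geo-Wand": given the phone's position (GPS) and compass heading
# (magnetometer), identify the point of interest the user is pointing at.
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def geo_wand(lat, lon, heading, pois, tolerance=15.0):
    """Return the name of the POI whose bearing is closest to the compass
    heading, or None if nothing lies within the angular tolerance."""
    def angular_gap(poi):
        diff = abs(bearing(lat, lon, poi[1], poi[2]) - heading) % 360
        return min(diff, 360 - diff)
    best = min(pois, key=angular_gap)
    return best[0] if angular_gap(best) <= tolerance else None

# Hypothetical POIs around Perth (name, lat, lon).
pois = [
    ("Kings Park", -31.9614, 115.8320),
    ("Swan Bells", -31.9590, 115.8605),
]

# Standing near Perth station, pointing roughly south.
print(geo_wand(-31.9505, 115.8605, 175, pois))
```

The ~40 m GPS accuracy and compass noise on a real phone mean the tolerance would need tuning, but the principle is exactly the “intelligent geographic pointer” described in the quote.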

Sounds pretty cool, hey? Lesson two: it may not be widely known in the spatial sector, since we generally deal with top-end GNSS receivers, but companies like Apple and Nokia are among the biggest GPS receiver sellers and consumers in the world. Knowing this, I feel it would almost be right to say that these companies are the new leaders in GPS and navigation. Certainly mobile mapping is the new fad, and with so many people out there collecting and geotagging information, it seems likely that this is the new way for us to collect information.

It was nice to listen and provide input into this conference, considering that in each presentation ‘location’ was key to the applications being discussed and to how the future would be built around greater utilisation of GPS. At around 4am I perked up at the mention of a new iPhone application called ‘Theodolite’.

Imagine that: a surveying term wrapped up in a surveying application for the iPhone. OK, I know the GPS receiver in the iPhone is only accurate to ~40 metres, so this isn’t truly a surveying application, but the future looks bright.

I talked about Wikitude and a little on augmented reality in a previous post, and attending this conference reassured me that this new technology can really make it in this mobile mapping, smart phone, social media age. I don’t think we will be calling our phones “Geo-Wands” in the future, but damn they are cool.

NeoGeography and Opening Access to Australian Data

Neogeography. What is it, where did this term come from and how does it affect me? I was reading an article by Michael Goodchild, published in a recent edition of the Journal of Location Based Services, which talked about how volunteered geographic information, and the technologies that support it, are blurring the lines between the academic/professional spatial scientist and the keen amateur. Neogeography is the buzzword for the rise of new geography technologies used by non-experts to collect and share spatially referenced information.

Consider how many mash-ups are now available built on APIs such as Google Maps and Bing Maps. These mash-ups are now commonplace and contain information that was collected, published and referenced by those who might not have a GIS degree but are quite into new technologies and social media interaction. Apps for Democracy, Geovation and the Gov 2.0 Mashup Australia contest are examples of neogeographers with innovative ideas making them known to a wider audience.

This explosion of new ideas and uses of spatial data is leaving government and private organisations behind in recognising how a neogeographer can be a critical link in the way data is collected and disseminated to the wider community.

This got me thinking about how we manage government data and make it accessible to the community. Working in a leading state land agency in Australia, the use of data is something that crosses my desk day to day. Time and time again, data becomes outdated, untrusted and eventually cost-prohibitive to bring back up to speed.

Web 2.0, if anything, has opened the possibility for government to take a lead role in engaging with the community, leveraging the local knowledge of individuals to deliver data that is accurate, stays accurate and becomes trusted. The OpenStreetMap initiative is a prime example of the community leveraging Web 2.0 technologies to produce a product greater than any one government agency could produce.

Government is good at governance, and this is the strength government can bring to the table in engaging with neogeographers. There is no value in government continuing to try to absorb all components of a data value chain. Focusing attention on developing processes for neogeographers to access, use and, most importantly, feed back data changes will ultimately lead to greater benefit for those in government, those in the private sector and those who are taking neogeography to the next level.

We are all observers of our local environments. Let’s open access to data and ensure that if I make an update to some data, it is easily recorded, absorbed, processed and fed back out to the community.

Power of Local Knowledge