Monthly Archives: May 2010

Why Spatial Data Will Fail:

‘Fail’ is a hard word; perhaps saying that spatial data will never quite reach its potential would be a better way of summing up the title of this post. For the last few months I have been participating in a strategy document for Western Australia related to the power of location. This strategy document looks at how information can be used to benefit the future development of the state, where embedding location into data becomes a recognised core element in expanding and deriving value from the linked data concept.

What I have really learnt from this experience is that ‘spatial’ is unknown; an elusive term whose understanding is limited to those geeks sitting in the dark corner of one’s office. The other side of the coin is that data is meaningless until you link it with other bits to draw out useful information that can be easily understood.

Recent updates show how much data is generated per year, and the numbers are becoming astronomical (see the All Things Spatial blog). In 2010 the amount of data generated will pass the zettabyte level, a figure at which I know my computer would gladly roll over, hand in its resignation and retire to some distant silicon oasis. To put it in perspective, a zettabyte is equivalent to 75 billion fully loaded 16 GB Apple iPads.

With that much data being generated yearly there is no doubt that most of it is unintelligent, and it would be difficult to mine and massage it all into a useful form.

So, if you take ‘spatial’ on its own and ‘data’ on its own, they are both pretty meaningless. The Power of Location strategy for Western Australia takes its lead from others around the world, such as the UK Location Strategy: Place Matters, where ‘everything happens somewhere’; we can add to this by including ‘and sometime’. Looking at how information is collected across a variety of sectors, and identifying and embedding a location element in it, will help with data mining and massaging, ensuring that the right information can be generated when needed for the right area.

Information is what gets delivered in applications and in reports, and helps make those critical decisions that are needed. Data that is spatially enabled (i.e. has a location) provides the links to other types of data, including environmental, social and economic. A triple bottom line effect on how data is collected, managed and used to derive information will ensure that ‘spatial’ is catapulted into people’s consciousness, as data needs to relate to its surroundings at a particular location.
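As a rough sketch of what ‘spatially enabling’ a record can look like in practice, here is a minimal example that wraps a plain observation as a GeoJSON Feature. The dataset, field names and values are all invented for illustration; the point is simply that location (and time) become the common keys that let environmental, social and economic data be linked.

```python
import json

# Hypothetical sketch: a plain data record, spatially and temporally
# enabled by wrapping it as a GeoJSON Feature. All names and values
# here are invented for illustration.
record = {
    "site": "groundwater-bore-042",              # hypothetical identifier
    "nitrate_mg_per_l": 3.2,                     # the raw observation
    "observed_at": "2010-05-14T09:30:00+08:00",  # 'and sometime'
}

feature = {
    "type": "Feature",
    "geometry": {                                 # 'everything happens somewhere'
        "type": "Point",
        "coordinates": [115.86, -31.95],          # lon, lat (roughly Perth, WA)
    },
    "properties": record,
}

print(json.dumps(feature, indent=2))
```

Once location sits inside the record itself rather than in a separate spreadsheet column nobody understands, any other spatially enabled dataset can be joined against it by geometry.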

Moving Away from Gov Silos

A few days ago the Rudd government posted its response to the Gov 2.0 taskforce report, Engage: Getting on with Government 2.0. The response, hosted on the Department of Finance and Deregulation (DFD) website, can be found here.

Reading through the response, I am pleased with most aspects of the Government wanting to take action in the area of making more data available to the public and generating tools that will help inform and direct new policy. The steering committee that will be set up to help direct the DFD is, for the better part, entirely made up of federal government agencies. This is my first bone of contention, but more on that in a sec.

Some things to note coming out of the report included:
– Defining what Public Sector Information should be:
  o free
  o based on open standards
  o easily discoverable
  o understandable
  o machine-readable
  o freely reusable and transformable.

– Establishment of metadata standards to improve sharing, reuse and discoverability of PSI. This is all well and good, and there are standards that can be adopted, although a focus on how custodians can easily manage and update metadata needs to be high on the agenda.
– The creation of a ‘Gov 2.0 Awards’ that will recognise outstanding practice in the use and impact of Gov 2.0 tools to improve agency and program performance. A nice idea, although will this lose focus on the bigger picture of interagency collaboration and the overall reduction of duplication across government? I hope the awards will take into account those agencies who, without producing a big wondrous application, get into the nitty gritty of creating a more efficient government.
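To make the metadata point concrete, here is a minimal sketch of the kind of record a custodian might maintain for a PSI dataset, loosely modelled on Dublin Core elements. The dataset name and all values are invented; the design point is that the record must be simple enough that updating it is routine, not a chore.

```python
# Hypothetical, minimal metadata record for a PSI dataset, loosely
# modelled on Dublin Core elements. Dataset and values are invented.
metadata = {
    "title": "WA Public Transport Stops",       # hypothetical dataset
    "creator": "Department of Transport (WA)",
    "date": "2010-05-01",
    "format": "text/csv",                       # machine-readable
    "rights": "Creative Commons Attribution 2.5 Australia",
    "coverage": "Western Australia",            # the spatial element
    "identifier": "wa-pt-stops-2010",
}

# Easy custodian maintenance is the agenda item: refreshing the dataset
# should be a one-field update, not a re-authoring exercise.
metadata["date"] = "2010-06-01"
print(metadata["date"])
```

If updating a record is this cheap, metadata stays current; if it requires a form, a committee and a quarter, it rots.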

In addition, the adoption of Creative Commons should greatly increase participation and use of the data. I will say that while using Creative Commons is great, thought needs to go into how a custodian can still assure correctness where required.

There are two areas that I am a little disappointed in:

1. The Gov 2.0 response to the report (and even the original report itself) gave little recognition to state-level and local government-level data. Much community interaction happens at this level, more so at the local government level, and so I feel more thought and support need to be focused in this area.

2. The steering group for Gov 2.0 looks to be entirely formed out of federal government agencies, giving no thought to private industry (who will be supplying and even building the Web 2.0 tools to support government), academia, and the citizens who will be the beneficiaries of the openness of Gov data.

If Government is really going to get on with Gov 2.0 then we need to realise that the world extends beyond government, and to be proactive around the use of PSI data we need to engage with those who might get the best use out of access.

With more and more governments pushing towards open access to data, such as the UK and US portals, we need to take our lead from them and look at what we can implement to ensure a Data.Gov.Au portal becomes a success. Catalogues, web services and download realms are a good start, but let’s fully embrace what the term “2.0” is supposed to represent and ensure that the portal comes with a good user interface that makes accessing data easy and useful.