Following the West Midlands Urban Tech Summit, Stefan Webb and Joseph Bailey talk about the importance of opening up planning data and working with Birmingham City Council to understand the impact of new developments
The world of data analytics, big data and machine learning seems to have passed by much of the planning system. Yet, of all public city services, the planning system possibly spends the most money on generating and retrieving data. This data, required (usually as a result of regulation and legislation) to provide the evidential grounding for planning applications, masterplans and city plans, is held across a number of overlapping document management systems that have little or no interoperability, and are inaccessible to machines and humans alike.
This lack of accessibility has consequences, the most obvious being the significant cost of repeatedly retrieving, generating and analysing the same data. The data is extracted either by local authority officers, or by developers and the consultants they employ, who are paid the same price for what is essentially a cut-and-paste of a previous piece of analysis.
Having open and accessible planning data available should allow others – be they different services in local authorities or those in the market wishing to develop new data-driven products and services – to use the same data without the costs of having to generate or retrieve it over and over again.
And this is recognised in government. Speaking at the Urban Tech Summit in Birmingham, communities and local government secretary Sajid Javid made it clear that his number one priority is getting more homes built, and called out planning as being “an area ripe for innovation”. The Department for Communities & Local Government (DCLG) will be launching a new platform to unlock data, with Javid commenting that “embracing digital is no longer a ‘nice to have’ for local government”.
In the absence of more open, systematic and accessible planning data, even the simplest of tasks – such as comparing two developments – is no small feat. Yet this comparison would enable local authorities to better communicate the opportunities and impacts associated with development to planners and citizens alike.
In our own attempt at such a comparison, we reviewed the documentation for two developments and unearthed more than 180 PDF documents, a large portion of which contained rich information about the impact of both developments, during and after construction. The PDF format meant that much of this information was effectively ‘trapped’ and cumbersome to extract for use in digital tools.
However, the nature of the information – such as text, tables and geospatial features – means that it shouldn’t be difficult to provide in machine-readable formats. In fact, there are many existing standards that could be used to facilitate this.
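To make this concrete, here is a hypothetical sketch of what such data could look like if a development’s key attributes were published as GeoJSON (an existing open standard for geospatial features) rather than embedded in a PDF. All field names and figures below are illustrative, not drawn from any real planning application:

```python
import json

# Hypothetical example: a development site published as structured GeoJSON
# rather than as a map and tables embedded in a PDF. The application
# reference, dwelling count and site area are invented for illustration.
site_geojson = """
{
  "type": "Feature",
  "properties": {
    "application_ref": "2017/00001/PA",
    "proposed_dwellings": 350,
    "site_area_ha": 4.2
  },
  "geometry": {
    "type": "Polygon",
    "coordinates": [[[-1.90, 52.48], [-1.89, 52.48],
                     [-1.89, 52.49], [-1.90, 52.49],
                     [-1.90, 52.48]]]
  }
}
"""

site = json.loads(site_geojson)
props = site["properties"]

# With structured data, a derived metric such as dwelling density is a
# one-line calculation rather than a manual copy-out from a PDF.
density = props["proposed_dwellings"] / props["site_area_ha"]
print(f"{props['application_ref']}: {density:.1f} dwellings per hectare")
```

Because every consumer parses the same fields, the same file can feed a council dashboard, a developer’s viability model, or a citizen-facing comparison tool without re-keying.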
Although the planning system needs an overhaul, there are many small, interim changes that could be implemented to stimulate positive transformation. Based on our experience of liberating data on development while working with Digital Birmingham, we’ve suggested five small changes to facilitate the development of digital tools for communicating the opportunities associated with new developments:
1. Set minimum data provisions – Authorities should require a specific list of data to be provided for every development. This list may overlap with other mandatory requirements, but it would enable a systematic summary of a development’s impact that extends current requirements, and would help with like-for-like comparisons.
2. Mandate the provision of machine-readable information – Documents should be supplied in machine-readable formats, as demonstrated by ODI Leeds, and authorities should not restrict these formats to particular licensed software.
3. Consider sharing by default – Unless developers and consultants are generating commercially sensitive information (if so, consideration should be given to aggregating and anonymising), they should upload their information to an open platform. This enables automated testing of the submission (for example, ensuring that all the required information is present) and, in turn, saves admin time.
4. Recognise the value in external data capture – Collecting more information about the city (eg real-time monitoring of air quality, traffic, noise, waste generation) and making it available to others makes it easier for developers and consultants to fulfil their minimum data requirements, and encourages consultancy processes to use the same data. In the long term, it may even enable the authority to automate some processes performed by developers or consultants.
5. Enhance transparency – To enable the improvement and reuse of impact assessments and analyses, authorities should mandate that developers and consultancies provide transparent and reproducible methodologies alongside their insights.
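As a rough illustration of points 1 and 3, the sketch below shows how an open platform might automatically check a submission against a minimum data list before it reaches an officer. The required fields, names and types here are assumptions chosen for illustration, not a real authority’s requirements:

```python
# Hypothetical minimum data list (point 1): field name -> expected type.
# The entries are illustrative only.
REQUIRED_FIELDS = {
    "application_ref": str,
    "site_address": str,
    "proposed_dwellings": int,
    "transport_assessment": str,   # e.g. a link to a machine-readable file
    "air_quality_assessment": str,
}

def validate_submission(submission):
    """Return a list of problems; an empty list means the submission passes."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in submission:
            problems.append(f"missing required field: {field}")
        elif not isinstance(submission[field], expected_type):
            problems.append(f"wrong type for {field}: "
                            f"expected {expected_type.__name__}")
    return problems

# Usage: an incomplete submission is flagged automatically at upload time
# (point 3), rather than after an officer has read through the documents.
submission = {
    "application_ref": "2017/00001/PA",
    "site_address": "1 Example Street, Birmingham",
    "proposed_dwellings": 350,
}
issues = validate_submission(submission)
print(issues)
```

A real platform would likely express the minimum data list as a published schema (for example JSON Schema) so that developers and consultants could run the same check locally before submitting.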
While there is and should be the ability to reflect local circumstances and priorities in how local plans, masterplans and planning applications are evidenced, core data requirements would allow for greater standardisation and automation of evidence gathering and analysis.
Implementing these incremental changes will fuel the design of digital tools that communicate the opportunities associated with new developments in a transparent and consistent way. Comparing developments like-for-like, along with their impacts and the opportunities they offer, will be quicker, easier and, perhaps most importantly, a lot more compelling.
Stefan Webb
Head of Projects
Dr Joseph Bailey
Data Science Team Lead
www.futurecities.catapult.org.uk/
Future Cities Catapult
Tel: +44 (0)20 7952 5111
Twitter: @futurecitiescat
Facebook: www.facebook.com/futurecitiescatapult/
LinkedIn: Future Cities Catapult
YouTube: Future Cities Catapult