Cities are sitting on a wealth of valuable data locked away in planning documents. It’s time they learn how to better use a resource that’s entirely in their control, explains Stefan Webb
Big data, artificial intelligence and visualisation are transforming the way people process and interpret information. But the methods used by many cities to plan new developments creak with age and smack of desperate inefficiency. It’s time those systems caught up with the modern world.
The processes in place within city authorities to gather information about sites, compare proposals from developers and engage with citizens are certainly rigorous, and produce huge quantities of data at no small expense. If you’re sufficiently determined, you can find it in the appendices of local plans – and those brave enough to bother will discover reams of data, pages of tables and an atlas-worth of maps. But as well as finding it difficult to understand, they’ll also see that it’s locked up inside PDFs that are difficult for machines to search and analyse.
In the offices of the architects and developers who bring those developments to life, though, things look a little different. There, before bricks or steel are even considered, data, models and digital maps are used to explore sites, proposals and plans in exquisite detail. Crucially, these organisations have come to realise the value of maintaining easily accessible data, which they can draw on quickly, easily and repeatedly.
By contrast, the cost of generating data to support local plans is sunk the moment it is dumped into a series of analogue reports and planning applications. Not only do local planning authorities have to commission new studies, time after time, to obtain the same evidence, but because that evidence is stored away in PDFs, it can’t easily be reused to inform other services.
For example, many of the datasets collected as part of a housing market assessment are the same as those which inform a community infrastructure levy, a strategic housing land availability assessment or an infrastructure capacity assessment. But, bewilderingly, the information for these four studies is all procured separately. And any synergies or interdependencies that do occur between the four are managed by human hand – so the process can be slow, contain errors and result in loss of fidelity.
The problem is exacerbated when different city departments decide to commission their own data-driven exercises to understand, say, the demand for school places, pressure on GP services or where new job opportunities will be arising in the near future. Data from planning documents could easily be reused to help provide such insights, but instead it’s gathered once more at high cost.
What’s needed, then, is for cities to hold their spatially relevant data in one place, where it can be used over and over again, not just for multiple plans but across departments. Such a system would not just provide efficiency savings by reducing the cost of updating the evidence base for local plans, but also ensure everyone is working with the same figures and assumptions, and make it easier to build tools to access, interpret and analyse the data.
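To make the idea of a single, reusable evidence base concrete, here is a minimal sketch in Python. All field names, figures and the pupils-per-home ratio are hypothetical; the point is simply that two departments can draw different insights from one machine-readable dataset instead of commissioning it twice.

```python
import json

# A hypothetical, machine-readable extract of the kind of site data
# that today sits inside PDF appendices. Field names and figures are
# illustrative only.
SITES_JSON = """
[
  {"site_id": "S001", "ward": "Central",   "hectares": 2.4, "est_homes": 120},
  {"site_id": "S002", "ward": "Riverside", "hectares": 0.9, "est_homes": 45},
  {"site_id": "S003", "ward": "Central",   "hectares": 1.6, "est_homes": 80}
]
"""

sites = json.loads(SITES_JSON)

def housing_capacity(sites):
    """Housing team: total estimated homes across all candidate sites."""
    return sum(s["est_homes"] for s in sites)

def school_place_demand(sites, pupils_per_home=0.3):
    """Education team: rough pupil yield from the *same* dataset,
    using an illustrative pupils-per-home ratio."""
    return round(housing_capacity(sites) * pupils_per_home)

print(housing_capacity(sites))     # total estimated homes
print(school_place_demand(sites))  # rough school-place demand
```

Because both functions read from the same source, updating the site list once automatically updates every downstream analysis – the opposite of re-procuring the same evidence for each study.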
Greater Manchester has already shown that it is possible to generate and reuse planning data in this way. Its Open Data Infrastructure Map shows key infrastructure across the entire region in one open, accessible location. But it goes further than this. Using the same mapping platform, it seeks suggestions for new development sites and includes new automated processes to carry out parts of the shortlisting process without human intervention.
There is movement at the national level, too. The Geospatial Commission is beginning to work on making key public spatial data more accessible; the National Infrastructure Commission is promoting a national digital twin; and the Centre for Digital Built Britain is encouraging the release of more development-related data from building information models. The risk is that these efforts appear to be proceeding with little coordination or collaboration.
Who pays for all this? Well, much of the evidence required for local plans is driven by national legislation, and the cost of building planning data platforms is too large to be borne by any single planning authority. So, ideally, central government should be investing in UK local authorities and companies to prototype the planning system of the future.
A city data environment that functions in this way will allow local authorities to maximise the value of the data generated as part of the planning process. In turn, it will reduce the time it takes to produce local plans and make them more transparent and understandable to citizens and developers. The data is already there – cities just need to realise its potential.
Stefan Webb
Director of Digitising Planning & Standards
Connected Places Catapult
Tel: +44 (0)20 7952 5111
futureofplanning@cp.catapult.org.uk
Twitter: @cpcatapult
LinkedIn: Connected Places Catapult
Facebook: CPCatapult