planning as a platform | thinkthinkthink #25
on how AI-powered data ingestion at scale will transform urban planning
This issue on the future of urban planning is 533 words long and takes ~2 minutes to read. I hope you enjoy it.
Urban environments are complex adaptive systems. Yet most planning processes remain strangely static—predicated on long-range assumptions, updated on multi-year cycles, and allergic to real-time data. Layer is working on a paradigm shift: from fixed masterplans to a continuously adaptive framework, capable of responding in real time to changes in the urban landscape. Much like a biological system, planning should evolve iteratively—metabolizing friction, adapting through feedback, and growing more resilient with each cycle.
A plan should not be a single static document but a modular living platform. It should process ongoing data inputs, model interdependencies, and recommend calibrated adjustments that can be reviewed and debated by decision-makers and the public alike. This creates a form of urban growth that is less speculative and more responsive—akin to a series of interacting feedback loops rather than a blueprint.
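To make the loop concrete, here is a minimal sketch in Python. None of this reflects Layer's actual stack; the class names, fields, and the detect_gap hook are purely illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Proposal:
    description: str          # plain-language summary of the suggested adjustment
    evidence: dict            # the data and model outputs that motivated it
    status: str = "pending"   # pending -> reviewed -> adopted / rejected

@dataclass
class LivingPlan:
    """A plan as a process: ingest data, update the shared model,
    and queue calibrated adjustments for human review."""
    model: dict = field(default_factory=dict)
    proposals: list[Proposal] = field(default_factory=list)

    def ingest(self, source: str, observation: dict) -> None:
        # Fold a new observation from any source into the shared model.
        self.model.setdefault(source, []).append(observation)

    def recommend(self, detect_gap: Callable[[dict], list[Proposal]]) -> list[Proposal]:
        # Run a divergence check and queue its proposals for stakeholder review.
        new = detect_gap(self.model)
        self.proposals.extend(new)
        return new
```

The point is the shape of the loop, continuous ingestion and a shared model feeding recommendations that always pass through human review, rather than any particular implementation.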
The backbone of such a platform includes several interdependent components, each echoing a biological or computational analogue:
Data Ingestion and Standardization: Inputs are drawn from a diverse range of sources—traffic systems, environmental sensors, economic indicators, construction permits, utility data, social sentiment, and so on. In the age of AI there is virtually no limit to the inputs. Urban data is then semantically indexed (think urban vector embeddings), enabling integration into a shared spatial-temporal framework.
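As a toy illustration of what that indexing step might look like, the sketch below hashes records from any source into one vector space and files them in a shared spatial-temporal grid. The record schema and the stand-in embedding function are assumptions for the example, not a description of any real pipeline.

```python
from dataclasses import dataclass
from datetime import datetime
import hashlib
import math

@dataclass
class UrbanRecord:
    source: str          # e.g. "traffic_sensors", "permits", "social_sentiment"
    location: tuple      # (lat, lon) in a shared spatial frame
    timestamp: datetime  # shared temporal frame
    payload: str         # raw observation, normalized to text

def toy_embedding(text: str, dims: int = 8) -> list[float]:
    """Stand-in for a real embedding model: hash the text into a fixed-length,
    unit-norm vector so records from any source share one index."""
    digest = hashlib.sha256(text.encode()).digest()
    vec = [b / 255 for b in digest[:dims]]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

# Bucket records by (rounded grid cell, hour) so that a permit, a sensor
# reading, and a complaint about the same block and hour sit side by side,
# regardless of their original format.
index: dict[tuple, list[tuple[UrbanRecord, list[float]]]] = {}

def ingest(record: UrbanRecord) -> None:
    cell = (round(record.location[0], 3), round(record.location[1], 3))
    hour = record.timestamp.replace(minute=0, second=0, microsecond=0)
    index.setdefault((cell, hour), []).append((record, toy_embedding(record.payload)))
```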
Urban Knowledge Graph: A structured representation of the city’s interconnected elements—zoning codes, infrastructure, population dynamics, public services, and so on. This allows the system to model how changes in one domain (e.g., mobility) affect others (e.g., emissions, land use, public health). The urban knowledge graph is continuously git-diffed and updated as new data is ingested. Because the system preserves every incremental change, it can better attribute causality to specific interventions.
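One way to picture the "git-diffed" graph is as a set of typed relationships whose snapshots are compared after every ingestion cycle. The toy version below stands in for a real graph store; the example edges are invented.

```python
# Nodes are city elements, edges are typed relationships, and each ingestion
# cycle yields a new snapshot that is diffed against the previous one so
# every change stays traceable.
Edge = tuple[str, str, str]  # (subject, relation, object)

def diff(old: set[Edge], new: set[Edge]) -> dict[str, set[Edge]]:
    """Return the relationships added and removed between two snapshots."""
    return {"added": new - old, "removed": old - new}

snapshot_v1: set[Edge] = {
    ("bike_lane_5th_ave", "reduces", "car_capacity_5th_ave"),
    ("car_capacity_5th_ave", "drives", "corridor_emissions"),
}

# Newly ingested permit and footfall data add one relationship.
snapshot_v2 = snapshot_v1 | {("bike_lane_5th_ave", "increases", "retail_footfall")}

# Preserving every increment is what lets the system tie observed outcomes
# back to specific interventions later on.
history = [diff(snapshot_v1, snapshot_v2)]
print(history[0]["added"])
```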
Simulation and State Comparison: The platform uses predictive modeling to explore future states—such as housing demand, flood risk, or demographic shifts—and compares them to current conditions. The system then identifies divergence and continuously suggests interventions that attempt to bridge the gap between the present state and future ambition.
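In miniature, the divergence check might look like the sketch below; the indicators, growth rates, and targets are invented placeholders rather than real figures.

```python
# Project a few indicators forward, compare them to stated policy targets,
# and flag the gaps large enough to warrant intervention.
current = {"housing_units": 120_000, "flood_exposed_homes": 4_200}
targets = {"housing_units": 135_000, "flood_exposed_homes": 2_000}

def project(state: dict, years: int, growth: dict) -> dict:
    """Naive linear projection; a real platform would run calibrated models."""
    return {k: round(v + growth.get(k, 0) * years) for k, v in state.items()}

projected = project(current, years=5,
                    growth={"housing_units": 1_800, "flood_exposed_homes": 150})

# A large gap between trajectory and ambition, in either direction, is the
# signal to start suggesting interventions.
divergence = {k: projected[k] - targets[k] for k in targets}
flags = {k: gap for k, gap in divergence.items() if abs(gap) > 1_000}
print(flags)  # both indicators miss their targets on the current trajectory
```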
Governance Interface: Proposed changes are version-controlled, traceable, and subject to stakeholder review. Urban planners act as stewards of the process rather than decision-makers—they assess technical feasibility, social implications, and might even offer their opinion. Ultimately, though, decisions are made by elected officials or, ideally, the public at large.
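A sketch of what a version-controlled, traceable proposal record could look like; the roles, verdicts, and fields are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Review:
    role: str     # "planner", "elected_official", "public_ballot"
    verdict: str  # e.g. "feasible", "needs_work", "approve", "reject"
    note: str
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class PlanChange:
    change_id: str
    summary: str
    evidence: dict                           # model outputs that motivated it
    reviews: list[Review] = field(default_factory=list)

    def add_review(self, review: Review) -> None:
        # Append-only: the full review trail is the record.
        self.reviews.append(review)

    def decided(self) -> bool:
        # Planners advise; only elected officials or the public close the loop.
        return any(r.role in ("elected_official", "public_ballot")
                   and r.verdict in ("approve", "reject")
                   for r in self.reviews)
```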
Interactive Visualization: A living digital twin of the city enables users—planners, residents, investors—to visualize potential modifications, propose or identify opportunities, and simulate possible impacts. This might enable true participatory engagement while also empowering individuals with the data to make better decisions.
Rather than prescribing a fixed end-state, planning as a platform supports continuous evolution. It introduces mechanisms for detecting misalignment between policy objectives and on-the-ground realities. More importantly, it enables cities to adapt without waiting for the next planning review cycle.
This model also reframes the role of the planner—from architect of a master vision to facilitator of an ongoing process. A steward of the city, the urban midwife of an ever-changing plan. A plan that is capable of learning, adjusting, and balancing diverse inputs in a structured and transparent way.
By embedding adaptability and interactivity into the process, planning as a platform opens a path toward living urban governance. Not by replacing human judgment with algorithms, but by making that judgment better informed and more responsive to change. What emerges is not a “smart” city, but a sentient one—capable of listening, learning, and iterating with laser focus on citizens—the agents whose inputs shape its evolution.
Did you like this issue of thinkthinkthink? Consider sharing it with your network:
📚 One Book
Billion Dollar Whale by Tom Wright and Bradley Hope
Billion Dollar Whale reads like a financial thriller but hits harder knowing it's all true. The book unpacks the audacious rise of Jho Low, a chameleon-like financier who engineered one of the largest heists in history via the 1MDB fund—leaving banks, governments, and Hollywood in his wake. It's not just a story of greed, but a brutal indictment of how easily power systems can be manipulated when opacity meets ambition. If The Big Short was about the system failing itself, Billion Dollar Whale is about someone hijacking it with a smile and no spreadsheet.
📝 Three Links
Claude’s Constitution by Anthropic
Teaching AI ethics by hardcoding its conscience.
They wanted to save us from a dark AI future by J. Oliver Conroy
When AI ethics curdles into cult behavior.
Python isn't just glue, it's an implicit JIT ecosystem by Stephen Merity
From glue code to ecosystem intelligence.
🐤 Five Tweets
This was the twenty-fifth issue of thinkthinkthink - a periodic newsletter by Joni Baboci on cities, science and complexity. If you liked it why not subscribe?
Thanks for reading. Have a question or want to add something to the discussion? Don't hesitate to reach out at dbaboci@gmail.com or @dbaboci.
Glad to see you writing here again!
I appreciated your comment about technological advances supplementing human decision making, rather than replacing it. I have to admit, while I think that the former is the right goal to have, I worry that the temptation for many will be to default to the latter due to the complexity of the system and the issues at hand.