A new blog post by David Luberoff, the Deputy Director of the Joint Center for Housing Studies at Harvard University, dives into digitalization and how it has changed the way housing is produced, financed, reviewed, and regulated.
However, according to Luberoff, policymakers and advocates who want to take advantage of digitalization must overcome significant obstacles.
In a new white paper called “Data-Driven Multi-Scale Planning for Housing Affordability,” Paul Waddell and Arezoo Besharati call out growing concerns about housing affordability, which have spurred the development of digital tools to assess potential sites for future housing.
To illustrate the potential of these tools, Waddell, a professor emeritus of city and regional planning at UC Berkeley and CEO of UrbanSim (which uses data and analytics to improve planning), and Besharati, the firm’s director of product, discuss efforts to develop a nationwide, data-driven planning strategy in Canada, where UrbanSim has been working with the Canada Mortgage and Housing Corporation and several local governments. This work includes new techniques for collecting and synthesizing data, as well as machine learning algorithms and other tools for developing, analyzing, and assessing policies and projects. For example, the company worked closely with Housing Now, a Toronto housing advocacy organization, to spur the development of over 12,000 units of mixed-income, mixed-use housing on “unexpected sites” near high-capacity transit lines.
Efforts to use this digitized data for urban development must also keep in mind that, historically, those in power have often used such data to elevate some groups and marginalize others, whether intentionally or not. To illustrate this point, the JCHS examined two initiatives: the Equity Indicators project at CUNY’s Institute for State and Local Governance, and a set of interrelated initiatives in Seattle that used data to identify places with social, economic, and environmental disparities and then built on that work through community planning processes.
Four important themes emerge from the authors’ analysis of these projects:
- Building a proper team and engaging stakeholders is essential to defining equity.
- It is critical to evaluate both the source of the data and the data’s appropriateness for measuring equity because, while Census data is an excellent way to measure challenges, it doesn’t help policymakers understand which issues are most important to specific communities.
- While it’s important to find the right data, it is equally important to find the appropriate analysis techniques for translating data into actionable knowledge.
- Communicating results effectively is what turns data into compelling insights; the authors caution against presenting findings in ways that cannot be easily understood or analyzed by people without advanced analytical training.
Digitalization also has an under-appreciated potential to advance fair housing goals, contends Nestor Davidson, the Walsh Chair in Real Estate, Land Use, and Property Law at Fordham University’s School of Law and faculty director of the school’s Urban Law Center, in “Innovations in Digitalization and the Future of Fair Housing.”
Davidson acknowledges concerns that digitalization makes it easier for new investors to enter the market and harder for people of color to find and finance housing. However, he asserts that innovations in how we collect and analyze the vast amount of information about housing markets could “move our complex, all-too-opaque, fragmented systems of housing in more equitable and sustainable directions.” Indeed, aggregating and analyzing new data might reveal hidden patterns of bias, while improved analytical tools, such as AI, might make it easier to assess the fair housing implications of housing subsidies, zoning policies, infrastructure investments, and affirmative marketing plans, although such efforts may be constrained by court decisions that limit race-conscious policymaking.
In their own ways, the research discussed in this article focuses, in part, on problems with local zoning. Historically, the U.S. has not had good tools to measure and compare zoning codes across its more than 30,000 municipalities, but Sara Bronin argues that a National Zoning Atlas, which does not currently exist, could help move things along.
While a National Zoning Atlas will not automatically lead to better outcomes, it should make it easier to implement many of the approaches highlighted in the new papers. Doing so, Bronin asserts, could strengthen planning at the local, regional, state, and even national levels; help people become better-informed participants in land-use decisions that affect them; and provide baseline information for researchers exploring the impacts of zoning on issues like affordability.
Click here to read the blog post in its entirety.