Six months after the federal levees failed and Katrina flooded 80 percent of New Orleans, I went to City Hall to try to get electricity restored to our house.
The government building’s seventh floor was so full that the overflow had to be moved to the ground-floor lobby. Most people waiting carried stacks of paperwork and photographic documentation of their damaged homes. One elderly woman ahead of me finally got her turn but walked empty-handed to the counter. She didn’t have a permit to file. She just wanted to know: Had any of her neighbors gotten permits to rebuild? Could anyone tell her which of her neighbors might be moving back? She didn’t have enough information to decide what to do next. And she wasn’t alone. Without data on the rapidly changing housing and demographic situation, businesses didn’t know how many customers they might have. Charities didn’t know which services were most needed and where. Neighborhoods didn’t know how to prioritize volunteer efforts to rehab houses. The whole city was flying blind.
As months turned to years, people increasingly lost confidence in government agencies and philanthropy. News reports on federal dollars going to the region and donations coming into nonprofits were abundant, but people looked at their own stalled recovery and asked, “Where’s the money?” The lack of financial transparency only added to the sense of uncertainty and suspicion.
We can do better with Harvey and Irma.
Like Katrina, these 2017 hurricanes are all-hands-on-deck disasters. Government can’t do it alone. Public and private recovery efforts will need to align, but the required level of coordination will only be possible if everyone is working from a shared base of common information and trust.
The good news is that, 12 years after the storm that devastated New Orleans, communities today can take advantage of data transparency—a powerful tool that can help align federal, state, and local government efforts with those of the private sector and philanthropy.
That may seem obvious—of course these institutions should share relevant information with the public in times of need. But it’s harder than it appears. For one, transparency during crisis doesn’t come naturally. Some worry that opening up data will invite scrutiny. Power companies might be nervous that outage data would make them look bad. Elected officials might worry that families won’t move back if environmental data reveal contamination or if crime looks to be on the rise. Nonprofits might be hesitant to open data on donations and outcomes because of public scrutiny of overhead costs. Institutions might worry about privacy or fraud—say, a scam artist targeting households that received recovery benefits. Or they may simply not realize they have useful information to share.
New rules that require some federal and local government agencies to make certain data open and accessible to citizens have helped. But some high-value data sets might require public records requests, and politics can still get in the way. (This White House, for one, has increasingly put inconvenient data in its crosshairs.)
Transparency isn’t just about what information these organizations share but also how they share it. It’s most effective in the form of “open data,” or data that’s released in a structured, machine-readable format that can be downloaded, sorted, analyzed, mapped, and graphed. (In its highest form, it’s also available through an API, or application programming interface.) This digestible format makes it especially valuable for experts such as meteorologists, policymakers, data journalists, emergency managers, software developers, and advocacy organizations. It can also be a constructive resource for the general public, especially when software developers make it available through easy-to-use mobile apps and websites, like those that tell you when the next train is arriving.
Why is the format so important? New Orleans provides a good illustration. Just after the storm in 2005, the city government offered public access to information about building permits, but users could only look them up one address at a time. There was no way to analyze citywide permit data to, for example, see how many homes in a flooded neighborhood had construction permits for rebuilding above base-flood elevation, or to check what historic buildings were up for demolition. Maddeningly, the simplest way to answer these questions seemed to be to literally drive every street yourself.
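The difference between one-address-at-a-time lookup and true open data can be made concrete. Here is a minimal sketch—with an invented permit export and made-up field names, since any real city would define its own schema—of the kind of citywide question that machine-readable data answers in seconds and a lookup-only website cannot answer at all:

```python
import csv
import io

# Hypothetical open-data export of building permits. The field names
# (neighborhood, permit_type, elevation_above_bfe_ft) are invented for
# illustration; "bfe" stands for base-flood elevation.
PERMITS_CSV = """address,neighborhood,permit_type,elevation_above_bfe_ft
1200 Oak St,Broadmoor,rebuild,2.5
1204 Oak St,Broadmoor,rebuild,0.0
410 Pine St,Lakeview,demolition,
1300 Oak St,Broadmoor,rebuild,3.0
"""

def rebuilds_above_bfe(csv_text, neighborhood):
    """Count rebuild permits in a neighborhood elevated above base-flood elevation."""
    reader = csv.DictReader(io.StringIO(csv_text))
    count = 0
    for row in reader:
        if (row["neighborhood"] == neighborhood
                and row["permit_type"] == "rebuild"
                and row["elevation_above_bfe_ft"]  # blank for non-rebuild permits
                and float(row["elevation_above_bfe_ft"]) > 0):
            count += 1
    return count

print(rebuilds_above_bfe(PERMITS_CSV, "Broadmoor"))  # 2 of 3 rebuilds are elevated
```

With an address-by-address lookup interface, answering the same question for a whole flooded neighborhood means driving the streets; with a downloadable file, it’s a few lines of analysis.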
We’ve learned a lot since then, and open data has already played a crucial role in the preparation and response to both Hurricane Irma and Hurricane Harvey. The National Oceanic and Atmospheric Administration’s geospatial data on active hurricanes, for example, gave residents and responders important real-time updates about the path and severity of the storms. The Census Bureau’s emergency management map—which includes downloadable data on population density, language, disability status, household income, and vehicle ownership—has been, and will continue to be, used by emergency planners to coordinate their response. And Miami’s Planning & Zoning team is collecting flood survey data, photos, and resident stories, and making it public as it comes in. This on-the-ground information provides a much-needed update to the Federal Emergency Management Agency maps that are used to model storm surges and will help citizens and city planners make informed decisions about waterfront zoning and land use.
But as I saw in New Orleans, recovery is a long-term process. Stakeholders in the recovery will expect—and need—regular, timely, detailed updates from a variety of sources. Some data sets are worth prioritizing over others—and some of them come from surprising places.
As the elderly woman in line with me in 2006 understood, building permits—that first signal of a property owner’s intent to rebuild—are among the most important data that local governments can release to the public. After Katrina, for example, a team from Harvard University got special permission to access the city’s permitting data. The team then analyzed the records to help the Broadmoor neighborhood identify and strategically invest in blocks that appeared to be near a tipping point of returning residents and businesses. Today, despite being one of the hardest-hit neighborhoods in the city, Broadmoor has rebounded to 90 percent of its pre-Katrina addresses.
Other New Orleans neighborhoods, however, ran into bureaucratic barriers when they tried to acquire the same information, exacerbating the already uneven recovery. In 2009, when I (as a citizen) asked the city of New Orleans to open up its permitting data so that all neighborhoods would have timely updates, officials told me that the city couldn’t release the information. They said that to do so, they’d have to pay tens of thousands of dollars to the software company that built their permitting system just to add a “download data” button to the public website. It’s a frustratingly common problem. Companies often intentionally build obstacles into their contracts or software architecture that make it difficult for clients to extract their own raw data (which, no surprise, makes it difficult for clients to switch their service provider). The tactic, known as vendor lock-in, also makes it arduous and expensive for local governments to open up their data to the public.
Some municipalities have already switched to software providers that, by default, provide systems that make it easy to share data with the public. Miami-Dade County citizens, for example, can view newly issued building, repair, and demolition permits in nearly real time thanks to homegrown permitting software. As recovery efforts from the 2017 hurricanes continue elsewhere, it’s time for other cities and gov-tech companies to step up and make this kind of open data a standard feature.
Details related to parcels, those rectangles of land that people own and pay taxes on, are also among the most valuable post-disaster data sets. The boundaries are particularly useful as a scaffold over which you can map other information such as flood zones, building permits, code enforcement violations, 911 call origins, and the statuses of various services, such as water, electricity, mail delivery, and trash pickup. The Washington Post, for example, recently combined a Harris County parcel map and FEMA data to help readers visualize repeat flood damage in a particular Houston neighborhood. A map like this can serve to ground difficult dialogues about possible government buy-outs for neighborhoods with repeat flooding.
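Because every other layer—permits, violations, utility status—can key off the same parcel identifier, layering data over the parcel map is essentially a join. A minimal sketch, with invented parcel IDs and status fields, of how parcel boundaries act as that scaffold:

```python
# Hypothetical parcel records and a separate utility-status feed, joined on a
# shared parcel ID. All IDs, addresses, and field names are invented.
parcels = {
    "P-001": {"address": "1200 Oak St", "flood_zone": "AE"},
    "P-002": {"address": "410 Pine St", "flood_zone": "X"},
}

electricity_status = {"P-001": "restored", "P-002": "out"}

def overlay(parcels, status_by_parcel, layer_name):
    """Attach one extra data layer to each parcel record, keyed by parcel ID."""
    merged = {}
    for pid, record in parcels.items():
        merged[pid] = {**record, layer_name: status_by_parcel.get(pid, "unknown")}
    return merged

combined = overlay(parcels, electricity_status, "electricity")
print(combined["P-002"])  # {'address': '410 Pine St', 'flood_zone': 'X', 'electricity': 'out'}
```

The same `overlay` step repeats for each new layer—building permits, 911 calls, trash pickup—which is why an open, consistently keyed parcel data set multiplies the value of every other data set released after it.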
Despite their foundational role in disaster recovery, parcel data sets are often not released due to vendor lock-in and because some jurisdictions sell, or think they can sell, that data. In post-Katrina New Orleans, neighborhood groups unable to get the information through public record requests traded bootleg copies of the parcel map. Even federal agencies such as FEMA struggled to get this data from the local government. When I was co-directing the nonprofit Greater New Orleans Community Data Center, the feds asked me for parcel data sets. I had to direct them, too, to the bootleg copy. When I joined the city of New Orleans government in 2010, one of the first things I did was release the parcel data—an act that was met with hugs from grateful residents at public meetings.
Accurate data on the location and status of open businesses also serve as a lifeline to returning residents, who may have narrow windows between work shifts, caregiving, and rebuilding to pick up essentials, and may have limited transportation options to get from place to place. The data can also provide crucial information about which neighborhoods are being left behind in the recovery. When I returned home four months after the storm, with an 18-month-old child on my hip and a husband deployed overseas with the military, I relied on signs in the neutral ground (New Orleans speak for median strips), word of mouth, and trial and error to find an open hardware store, grocery store, or pharmacy. After disasters, local journalists often manually compile lists like this “Find Open Stores in Your Area After Harvey” feature by Houston Public Media. But these can be difficult to maintain. To be a definitive resource, you first need a complete, accurate list of all the businesses in an area. You then need regular, in some cases daily, updates on their hours of operation.
Open government data solve the first problem. States and cities typically have business license data that include the address and business type, such as hardware store, pharmacy, grocery store, or gas station. They’ll also have data about re-inspection for licensed facilities like child care centers, nursing homes, medical facilities, and restaurants. But that information only goes so far. To make up for the gaps, some places have been creative. After flooding in 2016, the Louisiana Business Emergency Operations Center created a website where people could tag a business as being open or closed. Areas recovering from disasters could also ask credit card companies to release nightly data on businesses that have processed customer transactions in the past 24 hours, a form of “passive crowdsourcing” that helps take the burden off residents and journalists to keep this information updated.
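The passive-crowdsourcing idea reduces to a simple nightly query: presume a business is open if it processed at least one transaction inside a lookback window. A sketch with invented business IDs and sample timestamps (a real feed would come, aggregated and anonymized, from a payment processor):

```python
from datetime import datetime, timedelta

# Hypothetical nightly feed: each business's most recent card transaction.
# Business IDs and times are made up for illustration.
last_transaction = {
    "hardware-store-12": datetime(2017, 9, 10, 14, 30),
    "pharmacy-7": datetime(2017, 9, 4, 9, 0),
}

def presumed_open(last_transaction, now, window=timedelta(hours=24)):
    """Flag businesses with a transaction inside the lookback window as open."""
    return {biz: (now - ts) <= window for biz, ts in last_transaction.items()}

status = presumed_open(last_transaction, now=datetime(2017, 9, 11, 8, 0))
print(status)  # {'hardware-store-12': True, 'pharmacy-7': False}
```

No resident or journalist has to phone anyone: the signal is a byproduct of commerce that’s already being recorded.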
Passive crowdsourcing has also played an important role in tracking whether people are returning home—arguably the single most important indicator of recovery in a neighborhood. There’s someone who usually knows which houses are occupied and which are vacant: the mailperson. In 2007, the Tampa Bay Times published a fantastic profile of Charles McCann, a veteran letter carrier who tracked recovery progress in New Orleans as he walked his daily Lower 9th Ward route. The U.S. Postal Service typically does not allow for public release of this information, which is part of its paid Delivery Statistics Product. But if the USPS fails to step in, direct mail marketing companies could fill the gap by sharing their data about addresses receiving mail—as the coupon circular company Valassis did to track repopulation after Katrina.
Data can also play a vital role in addressing public safety concerns, which don’t end when the wind dies down and the water recedes. Sharing open and accurate information from police about crime reports, arrests and citations, use of force, traffic stops, and 911 response times will help keep trust and accountability flowing in the tense atmosphere of a long recovery (an area where the New Orleans Police Department notably struggled).
Governments must also treat environmental data as vital public safety information. In the aftermath of Katrina, I never felt like I knew for sure whether the air, soil, and water were safe. We chose to stay, but two families on my block never returned, concerned about raising their children in contamination. Reports of toxic floodwater and contaminated air are already widespread in Houston. Though the Texas Commission on Environmental Quality website doesn’t publish its findings in an open format, a local civic tech company created an open-source script to scrape the data so researchers, journalists, and the public could analyze it. It was a crafty move, but we shouldn’t have to depend on the benevolence of private citizens. Governments and even nonprofits (such as the Environmental Defense Fund) should make their environmental testing data open by default.
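Scraping a closed-format page into open data usually means parsing HTML tables into structured rows. The actual markup of the Texas commission’s site isn’t reproduced here; this is a generic sketch of the pattern, using Python’s standard-library parser on an invented table fragment:

```python
from html.parser import HTMLParser

# Invented HTML fragment standing in for a results page. A real page's
# structure would differ, but the scraping pattern is the same.
SAMPLE_HTML = """
<table>
  <tr><th>site</th><th>pollutant</th><th>reading_ppb</th></tr>
  <tr><td>Houston-01</td><td>benzene</td><td>14.2</td></tr>
  <tr><td>Houston-02</td><td>benzene</td><td>3.1</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collect each <tr> in the page as a list of cell strings."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE_HTML)
header, records = scraper.rows[0], scraper.rows[1:]
print(records)  # [['Houston-01', 'benzene', '14.2'], ['Houston-02', 'benzene', '3.1']]
```

Once the rows are structured, they can be written out as CSV for researchers and journalists—exactly the machine-readable form the agency could have published in the first place.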
This is just a sampling of the high-value data sets that can aid communities in Texas and Florida as they rebuild after their devastating storms. For those involved in the efforts, I worked with veterans of Katrina and other disasters to assemble a more complete—and growing—list here. For those affected, data transparency can make recoveries more predictable, fair, and efficient. It aids citizens considering when and whether to move back, businesses debating whether to invest, and government and philanthropy deciding how to spend dollars responsibly.
Perhaps more important, it provides the kind of information that allows residents affected by storms like Katrina, Sandy, Harvey, and Irma to feel like they can play an active role in our democracy and have a say in the shaping of their collective future. Disaster heightens citizen engagement, and all levels of government will be better served if officials and citizens can use accurate and open data to have constructive dialogues about how to move forward, based on a shared base of information.
Five years after the storm, the transparency tide finally turned in New Orleans. I distinctly remember the moment at a crowded City Hall meeting when I saw the power of what open data can do. The topic was the tens of thousands of storm-damaged, vacant buildings still plaguing the city. A resident came to the podium to complain about her neighbor’s falling-down house, which was attracting vermin, crime, and illegal dumping, and making it hard for those on her block to move forward with recovery. She gave the address and pulled out a printout from BlightStatus—a simple tool created by a team at Code for America that made it easy for anyone to look up recently opened government data on the current state of blighted properties—that detailed every time inspectors had visited the site, what they found, the property owners’ responses to hearings, and more. The city’s director of code enforcement pulled up the same app on his tablet and entered the address. City Council staffers scrambled to do the same for their bosses. For the first time since the storm, everyone had the same information. There was no arguing about the facts. Instead, the conversation that ensued focused only on possible solutions.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.