Stories from the World of Municipal Analytics

Our cities are getting smarter

The field of analytics continues to gain momentum in municipalities across the United States as local governments begin to take Big Data seriously as a means to better understand their citizens, increase the effectiveness of their projects and policies, and save money. More cities are adopting analytics practices than ever before, which increases the need for analytics professionals to oversee the adoption, deployment, and assessment of these new technologies at the local level. Many of the successfully implemented programs have pushed through multiple failures and iterations to meet their governance goals. And with sophisticated technology like machine learning and artificial intelligence comes the need to understand the complex assumptions and biases that may work against an objective meant to serve both comfortable and vulnerable populations.

Civic problems that could be addressed with analytics may reach the desks of those in power through internal, funded problem-sourcing efforts, but in an engaged community, problems in need of prioritization often present themselves. To address a civic issue in tandem with analytics professionals, there must be not only a clear need for analytics but, more crucially, buy-in from stakeholders with decision-making authority, the right talent and technology, and mature, vetted data. Efforts to innovate in the public sector are often met with bureaucracy and a low tolerance for risk, which is why smaller-scale pilot iterations of analytics projects play an important role in demonstrating initial value. Despite the obstacles inherent in the domain, I, for one, am extremely excited about the potential of municipal analytics and data science. In fact, we have already seen plenty of successful examples throughout the country. To spark your interest in the growing field, and with the hope that you’ll engage with and understand your own municipality’s progress in applying Big Data to policy, I’ll summarize some of my favorite success stories, inspired by leading national resources on the successful implementation of civic analytics, to give you a sense of what’s already possible.

Smoke Detectors + New Orleans

Source: NOLA Office of Performance and Accountability

Armed with the insight that working smoke alarms cut the risk of death in a home fire roughly in half, the New Orleans Fire Department decided to give out free, life-saving smoke alarms to residents in need. The city had built out its data capabilities through its Office of Performance and Accountability (OPA), whose data team decided to use a predictive model to identify both the attributes of a home that signal the need for a new smoke alarm and each property’s overall fire-fatality risk. The office had never collected data of this sort, but the project took off when a researcher uncovered a question about smoke alarms on the US Census Bureau’s American Housing Survey and joined it with data from the Bureau’s American Community Survey. The team identified key features that predicted both the need for a new alarm and high fire susceptibility, and used them to build a block-by-block smoke detector outreach and distribution plan. Two years and 8,000 new, risk-targeted smoke alarms later, the department could celebrate a job well done, affirm the power of analytics, and inspire the American Red Cross and DataKind to replicate the project across the nation.
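To make the idea of a block-by-block distribution plan concrete, here is a minimal sketch of risk-targeted outreach: score each block by the estimated share of homes lacking an alarm and a fire-fatality risk proxy, then rank blocks for door-to-door visits. The field names, weights, and numbers below are all hypothetical; the actual OPA model was built from Census survey features.

```python
def block_risk_score(block):
    """Combine two hypothetical block-level rates into one outreach score."""
    return 0.6 * block["pct_homes_no_alarm"] + 0.4 * block["fire_fatality_risk"]

def outreach_plan(blocks, top_n):
    """Return the top_n highest-risk blocks to visit first."""
    return sorted(blocks, key=block_risk_score, reverse=True)[:top_n]

blocks = [
    {"id": "A", "pct_homes_no_alarm": 0.10, "fire_fatality_risk": 0.05},
    {"id": "B", "pct_homes_no_alarm": 0.50, "fire_fatality_risk": 0.30},
    {"id": "C", "pct_homes_no_alarm": 0.25, "fire_fatality_risk": 0.60},
]

plan = outreach_plan(blocks, top_n=2)
print([b["id"] for b in plan])  # highest-risk blocks come first
```

In a real deployment, the per-block rates would come from a model trained on the joined survey data rather than being supplied directly.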

See “Predicting Fire Risk: From New Orleans to a Nationwide Tool” (Ash Center, Harvard)

Early Police Intervention + Nashville/Charlotte

Early police intervention systems, developed by the University of Chicago’s Center for Data Science and Public Policy, enable police departments to identify officers likely to have an adverse incident so they can be offered training, counseling, and similar resources. The group tracks “adverse incidents,” which may take the form of a citizen or colleague complaint, a use of force, an accident, or an injury. The system searches the department’s data for predictors of adverse incidents, including instances of officer conduct that potentially reveal underlying tendencies which may or may not affect adverse-incident occurrence. It also learns how supervisors prioritize or deprioritize its findings. Traditional systems have a feasibility problem: they might capture 80% of officers who go on to have an adverse incident, but only by flagging almost 70% of the entire police force. Recommending that a department continuously retrain 70% of its officers isn’t very useful given the already limited resources of many police departments. The newer system designed at UChicago still captured about 80% of officers with adverse incidents but was able to home in on the 30% of the force most in need of intervention, giving each officer an aggregate risk score based on historical performance. A system like this is clearly not meant to replace a police supervisor’s responsibility to make a judgment call but can help inform it, and the long-term impact of implementation is still being measured.
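The flag-rate vs. capture-rate tradeoff described above can be sketched in a few lines: rank officers by a model’s risk score, flag only the top fraction, and measure what share of actual adverse incidents those flags would have captured. The scores and labels below are invented for illustration, not real departmental data.

```python
def capture_rate(scores, had_incident, flag_fraction):
    """Fraction of adverse incidents found among the top `flag_fraction`
    of officers when ranked by risk score (highest first)."""
    ranked = sorted(zip(scores, had_incident), key=lambda p: p[0], reverse=True)
    n_flagged = max(1, round(flag_fraction * len(ranked)))
    flagged_hits = sum(label for _, label in ranked[:n_flagged])
    return flagged_hits / sum(had_incident)

# Ten illustrative officers: model risk scores and whether an incident occurred.
scores       = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
had_incident = [1,   1,   1,   0,   1,   0,   0,   0,   1,   0]

# Flagging only the top 30% of this toy force captures 3 of 5 incidents.
print(capture_rate(scores, had_incident, 0.3))  # → 0.6
```

The real system’s value was pushing that curve upward: a high capture rate at a flag budget (30% of the force) that a department could actually act on.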

See “Police Project Update: Expanding and Implementing the Early Intervention System” (UChicago’s Center for Data Science and Public Policy)

Landlord Discrimination + New York City

New York City prohibits discrimination against prospective tenants based on their race, gender, and source of income…on paper. Despite Title 8 of the Administrative Code of the City of New York, income-based discrimination remains among the most reported housing-related issues. The New York City Mayor’s Office of Data Analytics (MODA) and the Commission on Human Rights decided to build a model to deploy investigative resources more efficiently by identifying areas in which landlords were most likely to illegally turn away potential tenants with housing vouchers. The researchers hypothesized that targeting the larger property management firms guilty of such conduct would have trickle-down benefits, encouraging smaller potential offenders to follow the law. By identifying the larger offenders, the model has helped reveal illegal housing practices more effectively, allowing the Commission on Human Rights to better prioritize its workflow and improving NYC’s ability to curb housing discrimination.

See the project “The NYC Commission on Human Rights partnered with MODA to assist in identifying sites of potential income discrimination” (MODA NYC GitHub)

Urban Damage + New Orleans

Many New Orleans properties damaged by Hurricane Katrina in 2005 were left untouched by the city for years after the storm, and when Mayor Mitch Landrieu took office in 2010, deciding whether to renovate or demolish damaged properties around the city became a key topic of discussion. The city’s data hub, the Office of Performance and Accountability (OPA), was tasked with developing an analytics-driven performance management tool to help approach the issue. In response to a backlog of 1,500 properties awaiting a decision from housing code enforcement to either sell the house or move forward with demolition, the office developed BlightSTAT and a Blight Scorecard. The system allowed mid-level supervisors to go out and score properties, increasing the speed and consistency of the evaluation process. In this case, as in many others, human judgment remained an irreplaceable factor in the civic analytics process: the decision to demolish a building was informed by, not replaced by, an advanced data system. The technology essentially eliminated paper communication within the Department of Code Enforcement, and with it the backlog of properties awaiting decisions.

See “Code Enforcement Abatement Tool” (City of New Orleans)

See “Early detection of properties at risk of blight using spatiotemporal data” (UChicago’s Data Science for Social Good)

Food Inspections + Chicago

Chicago Restaurants Failing to Pass Inspections: 2015–2016 (https://chicago.github.io/food-inspections-evaluation/)

Despite extensive regulations for food preparation, storage, and service, monitoring food safety across a city is genuinely difficult, but analytics models can help city officials allocate expertise and financial resources to the task efficiently. Chicago has more than 7,300 restaurants, plus thousands of grocery stores and other food vendors, all of which collectively affect the overall health of the city. Yet there are only about 35 inspectors tasked with covering over 15,000 total food establishments. Instead of simply hiring more inspectors, the city decided to improve its inspection performance with analytics. Chicago’s Department of Innovation and Technology collaborated with the Department of Public Health to forecast a restaurant’s risk of failing an inspection. The eventual model empowered the city to identify violations about seven days earlier than the manual process could, and along the way the team created a transparent, community-accessible food inspection tracker updated with real-time inspection results.
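The scheduling idea behind risk-based inspections can be sketched simply: with far fewer inspectors than establishments, visit the highest predicted-risk venues first so that violations surface sooner. The venue names and risk scores below are invented for illustration; Chicago’s actual model derives its risk predictions from historical inspection and business data.

```python
def schedule(establishments, inspections_per_day):
    """Assign each establishment a day number, highest predicted risk first."""
    ranked = sorted(establishments, key=lambda e: e["risk"], reverse=True)
    return {e["name"]: i // inspections_per_day + 1 for i, e in enumerate(ranked)}

venues = [
    {"name": "diner",    "risk": 0.15},
    {"name": "taqueria", "risk": 0.80},
    {"name": "cafe",     "risk": 0.55},
    {"name": "deli",     "risk": 0.30},
]

days = schedule(venues, inspections_per_day=2)
print(days)  # the riskiest venues land on day 1
```

The same number of inspections happen either way; the gain comes purely from ordering, which is how the model surfaced violations days earlier without new hires.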

See “Delivering Faster Results with Food Inspection Forecasting” (Harvard’s Ash Center)

See “Food Inspections” (Chicago Data Portal)

Traffic Fatalities + New York City

New technologies and rising urban populations have paved the way for a revolution in how residents traverse cities, and with such change, cities must adapt their systems and regulations to keep citizens safe. Ample transportation data and new analytics models give cities a low-cost avenue to assess and improve their transportation systems. As part of the global Vision Zero movement to address transportation issues and inefficiencies with data analytics, New York City collaborated with DataKind in 2015 to assess the risks and likely outcomes of all new transportation improvement projects. Projects that aim to improve traffic patterns and minimize delays may either correlate with improved safety or trade off against it. After getting over the speed bump of limited useful data, the team produced a model that maps exposure to more accurately judge the effectiveness of particular street designs. The model lets the NYC Department of Transportation forecast active car counts in given areas at given times, helping optimize traffic patterns and improve road safety.
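As a toy sketch of what forecasting car counts by area and time can look like, here is a naive baseline that averages historical counts per (area, hour) bucket. The street names and counts are invented, and the real DataKind/NYC DOT exposure model is far more sophisticated; this only illustrates the shape of the problem.

```python
from collections import defaultdict

def fit_baseline(records):
    """records: (area, hour, car_count) tuples -> mean count per (area, hour)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for area, hour, n in records:
        sums[(area, hour)] += n
        counts[(area, hour)] += 1
    return {key: sums[key] / counts[key] for key in sums}

history = [
    ("canal_st", 8, 120), ("canal_st", 8, 140),
    ("canal_st", 14, 60), ("delancey", 8, 90),
]

model = fit_baseline(history)
print(model[("canal_st", 8)])  # → 130.0
```

An exposure estimate like this matters because a street design can only be judged fairly per vehicle on the road, not per raw crash count.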

See “Can Better Data Make Zero Traffic Deaths a Reality?” (Harvard’s Ash Center)