CITRIS Helps City of San Francisco Evaluate Community Safety Camera Program

by Gordy Slack

When the City of San Francisco decided in 2005 to test the crime-fighting efficacy of cameras placed in public areas, the idea seemed “straightforward and intuitively simple,” says Richard Robinson, the Chief Operations Officer (COO) for the Department of Technology. They would install about seventy cameras in high-crime areas and see what happened. If they captured crimes on tape, helped police catch criminals, and helped prosecutors to put them behind bars, great. If the cameras didn’t help, that would be good to know, too.

But by April of 2008, it was clear that what had seemed a straightforward path was actually a twisted one peppered with landmines. There were the technical questions: What kind of cameras to use? Where to place them? At what resolution to shoot? What storage medium to use, and how long to keep the footage? How much memory would be needed? What kind of network should carry the images?

But those challenges were charged and made messier still by political, legal, and social questions. “Half the community strongly wanted the cameras and the protection they hoped they’d provide, and the other half didn’t want them at all. Some felt there were privacy issues. It was very polarized,” says Robinson.

Although Chicago and Los Angeles both have surveillance camera systems that they watch in real time, the San Francisco Police Commission decided that active monitoring would not fly in the City by the Bay, a place where civil liberties and the right to privacy are taken very seriously. Instead, the program would run the cameras 24/7, and then examine the tapes only if and when the Department of Emergency Services (which archives the tapes) agreed that a crime or suspected crime warranted the look. The approach made political sense, but it introduced technical problems. How, for example, could the operators maintain the cameras if they couldn’t be checked until a crime occurred, when it might well be too late to find a technical glitch?

The program was controversial and unwieldy to administer, for sure. But was it worth it? Nobody could tell. There were no metrics in place to test its efficacy. When launching the pilot project, the well-intentioned Police Commission had been less than precise about its objectives. In fact, there was no simple way to determine whether the program was working or not.

Robinson realized that he needed to take a step back and evaluate. “We didn’t have a lot to go by. No other city had done this kind of study. If we were going to spend the City’s money on these cameras, I wanted to know upfront if there was some type of metric that could measure success,” he says.

The City contracted CITRIS in March 2008 to study the Community Safety Camera (CSC) program. “I needed an objective, collaborative team that understood legal, social, and technical issues and could synthesize all those perspectives into an evaluation of the program,” Robinson says.

Professor Deirdre Mulligan led a multi-disciplinary team to investigate the effectiveness of the SF security camera program.

“CITRIS had the research experience in the technical, statistical, political, and legal areas,” says Deirdre Mulligan, an Assistant Professor at the UC Berkeley School of Information and a researcher on the project.

In the eight months it had to conduct the study and prepare the report, the CITRIS team examined the technical and administrative history of the program, conducted interviews with over thirty CSC stakeholders and end-users, reviewed minutes and video recordings of public hearings, press releases, and news articles, visited Los Angeles and Chicago for comparative insights, and did statistical analyses of nearly 60,000 crime incident reports from the CSC study areas dating from 2005 through 2008.

On the CITRIS evaluation team were Jennifer King, a Research Specialist from the UC Berkeley School of Law; Mulligan, at the School of Information; and Steven Raphael, a statistician at the Goldman School of Public Policy. Their 184-page CITRIS report, An Evaluation of the Effectiveness of San Francisco’s Community Safety Cameras, released in January, examines the program’s technical aspects, management, goals, and policy components, and presents a statistical analysis of crime reports in order to provide a comprehensive evaluation of the program’s effectiveness.

To evaluate the pilot project’s influence on crime, Raphael first calculated an average daily crime rate at each camera location. He then broke those numbers down by type of crime and distance from the cameras and finally compared each group with the average daily crime rate from the period before the cameras were installed.
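To make the comparison concrete, here is a minimal sketch of that before-and-after calculation in Python. All of the column names, distance bands, observation windows, and sample data below are illustrative assumptions for the sake of the example; this is not the report’s actual code or dataset.

    # Minimal sketch of the before/after crime-rate comparison described above.
    # All column names, distance bands, windows, and data are hypothetical.
    import pandas as pd

    # Hypothetical incident reports: one row per incident, tagged with its
    # distance (feet) from the nearest camera and days relative to install.
    incidents = pd.DataFrame({
        "crime_type": ["larceny", "larceny", "larceny", "assault", "assault"],
        "distance_ft": [80, 90, 250, 60, 70],
        "days_from_install": [-120, 45, -30, -200, 310],  # negative = before
    })

    PRE_DAYS, POST_DAYS = 365, 365   # assumed length of each observation window
    BANDS = [0, 100, 250, 500]       # distance bands from the camera, in feet

    incidents["band"] = pd.cut(incidents["distance_ft"], BANDS)
    incidents["period"] = incidents["days_from_install"].apply(
        lambda d: "pre" if d < 0 else "post")

    # Average daily crime rate for each (crime type, distance band, period).
    counts = incidents.groupby(
        ["crime_type", "band", "period"], observed=True).size()
    rates = counts / counts.index.get_level_values("period").map(
        {"pre": PRE_DAYS, "post": POST_DAYS})

    # Percent change in the daily rate after installation, per group.
    wide = rates.unstack("period").fillna(0)
    wide["pct_change"] = 100 * (wide["post"] - wide["pre"]) / wide["pre"]
    print(wide)

The key design point is the same one the study relied on: by comparing each location’s own pre-installation rate rather than a citywide average, the effect of a camera is separated from neighborhood-level differences in baseline crime.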

Raphael’s analysis showed that the CSCs weren’t reducing violent crime. The cameras also had no significant effect on burglaries or car theft; at best they were just moving these crimes up the road, out of the cameras’ field of view. However, the analysis showed that nonviolent thefts, such as pickpocketing, purse snatching, and theft from cars, did drop between 20 and 30 percent within 100 feet of the cameras.

The success of the cameras as an investigatory tool was limited as well, but not insignificant. In at least six reported cases, the CSC footage helped police charge a suspect with a crime.

The cameras did help police to investigate some crimes, says Mulligan, by providing evidence for storyboards and timelines. Also, even if a camera didn’t capture a crime on film, it could help police evaluate the reliability of witnesses. “Knowing how much weight or authority to give different accounts helped police to reconstruct the timeline for the crime and to figure out which way different cars were headed, which way people ran, and where to look for evidence,” says Mulligan.

Because the SFPD’s technological infrastructure is about 20 years old, a modern CSC system’s capacity to interface with it is limited, the report found, making storage and retrieval of the footage cumbersome.

Most importantly, though, the researchers found a “managerial vacuum” at the top of a program that needs an “owner.” While the Mayor’s Office of Criminal Justice has been officially in charge of the program, the Department of Emergency Services is responsible for archiving the tapes, and the SFPD is the main user of the system. The diffusion of accountability for the program led to a lack of coordinated control, the report concluded.

The paucity of clearly defined objectives was also a big handicap. Going forward, the report recommended, the project should establish clear benchmarks for success and failure. That way, the Police Commission can determine whether the project is worth its cost.

“I look at things from an operational level,” says Robinson. “I prioritize my spending for the highest value. Without this type of study, with no feedback, then you’re just hoping that things are going well, but you can never really defend or support that hope with data.”

The report is the first of its kind at this length and depth. “It should be useful to other cities considering CSC programs,” says Raphael, “by giving them an idea of the challenges and controversies they’ll run into and how to address them from the outset.”

The program is budgeted through the end of this year, and the City has already taken the CITRIS report’s advice and put the program under the control of the Police Department. The San Francisco Police Commission is currently considering other aspects of the report and will soon decide whether to expand the CSC into a full-fledged program. No doubt they are glad to have more to base their decision on than assumption and intuition.

“In the public sector and government we apply technology in programs that have social impact, but we rarely have a chance to vet that technology from an academic, objective, independent view,” says Robinson. “CITRIS did that for us, and it is a huge asset.” 

The entire report can be found online at:
http://www.citris-uc.org/files/CITRIS%20SF%20CSC%20Study%20Final%20Dec%202008.pdf