Inferring Serial Killers with Data: A Lesson from Vancouver

For those happily not in the know, my hometown of Vancouver was afflicted with a serial killer during the '80s and '90s who largely targeted marginalized women in the Downtown Eastside – the city's (and one of the country's) poorest neighborhoods.

The murderer – Robert Pickton – was ultimately caught in February 2002 and convicted of the murders of six women in December 2007. He is accused of murdering an additional twenty women, and may be responsible for the deaths of a number more.

Presently there is an inquiry under way in Vancouver into the failure of the police to investigate and act on the disappearing women earlier. There has been heart-wrenching testimony from one female officer whose own efforts went largely ignored. But what is really striking is that during the late 1990s the Vancouver Police Department actually had an expert analyzing crime data – particularly regarding the disappearing women – and his assessment was that a serial murderer was at work in the city. The expert, Kim Rossmo, advised the police to issue a press release and begin to treat the case more seriously.

He was ignored.

The story is relatively short, but worth a read – it can be found here.

What’s particularly discouraging is looking back at past articles, such as this Canadian Press piece from as late as June 26th, 2001, less than a year before Pickton was caught:

Earlier that day, Hughes stood with six others outside a Vancouver courthouse and told passers-by she believes a serial killer is responsible.

Vancouver police officially reject the suggestion.

But former police officer Kim Rossmo supported it while he was a senior officer. He wanted to warn residents about the possible threat. Rossmo is now involved in a wrongful dismissal trial against the force in B.C. Supreme Court.

Last week, he testified he wanted to issue a public warning in 1998, but other officers strongly objected. The force issued a news release saying police did not believe a serial killer was behind the disappearances.

Indeed, Rossmo was not just ignored; other officers on the force actively made his life difficult. He was harassed, and data that would have helped him in his analysis was withheld from him. Of course, a few months later the murderer was caught, suggesting that his capture might have come much earlier had the force taken the potential problem seriously.

A few lessons from this:

1) Data matters. In this case, the use of data could have, literally, saved lives. Rossmo’s data model is now used by other police forces, and he has since become a professor in the United States. (A rough sketch of the kind of geographic profiling model he pioneered follows this list.)

2) The challenge with data is as often cultural as it is technical. As with the Moneyball story, the early advocates of using data to analyze and reassess a problem are often victimized. Their approach threatens entrenched interests, and the work is often conducted by people on the margins. Rossmo was the first PhD in Canada to become a police officer – I’m pretty sure that didn’t make him a popular guy. Moreover, his approach implicitly, and then explicitly, suggested the police were wrong. Police forces don’t deal with errors well – but neither do many other organizations or bureaucracies.

3) Finally, this case study speaks volumes about police forces’ capacity to deal with data. Indeed, some of you may remember that the other week I deconstructed the Vancouver Police Department’s misleading press release regarding its support for Bill C-30, which would dramatically increase the police’s power to monitor Canadians online. I find it ironic that the police are seeking access to more data when they have been unable to effectively use data they can already legally acquire (or that, frankly, is open, such as the number and locations of murder/disappearance victims).
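
For those curious what a “data model” means here: Rossmo is best known for geographic profiling, which scores every cell of a grid laid over a city by how consistent that location is with the sites of known, linked crimes, producing a map of where the offender most plausibly lives or works. Below is a minimal, illustrative sketch of that idea – the crime coordinates, grid size, and tuning constants are invented purely for demonstration, and this is in no way the actual system the VPD had available to it.

```python
# Illustrative sketch of Rossmo-style geographic profiling.
# Crime sites, grid, and constants are made up for demonstration only.

def rossmo_score(grid_point, crime_sites, B=2.0, f=1.2, g=1.2):
    """Score one candidate anchor point against known crime sites.

    Uses Manhattan distance. Points just beyond the "buffer zone" B
    (offenders tend not to act too close to home) score highest, and
    the score decays with distance beyond it.
    """
    xi, yi = grid_point
    score = 0.0
    for (xc, yc) in crime_sites:
        d = abs(xi - xc) + abs(yi - yc)      # Manhattan distance to crime
        if d > B:                            # outside the buffer zone
            score += 1.0 / (d ** f)          # distance-decay term
        else:                                # inside the buffer zone
            score += (B ** (g - f)) / ((2 * B - d) ** g)
    return score

# Hypothetical crime locations on a 10x10 grid.
crimes = [(2, 3), (3, 7), (6, 5), (7, 2)]

# Score every cell; the highest-scoring cells suggest where to focus.
surface = {(x, y): rossmo_score((x, y), crimes)
           for x in range(10) for y in range(10)}
top = sorted(surface, key=surface.get, reverse=True)[:5]
print("Most likely anchor cells:", top)
```

The point of the sketch is simply that the inputs – the number and locations of disappearances – were already in the police’s hands; the value comes from being willing to analyze them.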
