Palantir has secretly been using New Orleans to test its predictive policing technology, gaining broad access to private data without oversight thanks to a loophole

The program began in 2012 as a partnership between New Orleans Police and Palantir Technologies, a data-mining firm founded with seed money from the CIA’s venture capital firm. According to interviews and documents obtained by The Verge, the initiative was essentially a predictive policing program, similar to the “heat list” in Chicago that purports to predict which people are likely drivers or victims of violence.

The partnership has been extended three times, with the third extension scheduled to expire on February 21st, 2018. The city of New Orleans and Palantir have not responded to questions about the program’s current status.

Predictive policing technology has proven highly controversial wherever it is implemented, but in New Orleans, the program escaped public notice, partly because Palantir established it as a philanthropic relationship with the city through Mayor Mitch Landrieu’s signature NOLA For Life program. Thanks to its philanthropic status, as well as New Orleans’ “strong mayor” model of government, the agreement never passed through a public procurement process.

In fact, key city council members and attorneys contacted by The Verge had no idea that the city had any sort of relationship with Palantir, nor were they aware that Palantir used its program in New Orleans to market its services to another law enforcement agency for a multimillion-dollar contract.

Even within the law enforcement community, there are concerns about the potential civil liberties implications of the sort of individualized prediction Palantir developed in New Orleans, and whether it’s appropriate for the American criminal justice system.

“They’re creating a target list, but we’re not going after Al Qaeda in Syria,” said a former law enforcement official who has observed Palantir’s work first-hand as well as the company’s sales pitches for predictive policing. The former official spoke on condition of anonymity to freely discuss their concerns with data mining and predictive policing. “Palantir is a great example of an absolutely ridiculous amount of money spent on a tech tool that may have some application,” the former official said. “However, it’s not the right tool for local and state law enforcement.”

Six years ago, one of the world’s most secretive and powerful tech firms developed a contentious intelligence product in a city that has served as a neoliberal laboratory for everything from charter schools to radical housing reform since Hurricane Katrina. Because the program was never public, important questions about its basic functioning, risk for bias, and overall propriety were never answered.
Palantir’s prediction model in New Orleans used an intelligence technique called social network analysis (or SNA) to draw connections between people, places, cars, weapons, addresses, social media posts, and other indicia in previously siloed databases. Think of the analysis as a practical version of a Mark Lombardi painting that highlights connections between people, places, and events. After entering a query term — like a partial license plate, nickname, address, phone number, or social media handle or post — NOPD’s analyst would review the information scraped by Palantir’s software and determine which individuals are at the greatest risk of either committing violence or becoming a victim, based on their connection to known victims or assailants.
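The query-and-rank step described above can be sketched as a simple graph traversal. This is a minimal illustration only, not Palantir's actual implementation, which has never been made public; the entity names, the links between them, and the hop-distance scoring are all assumptions made for the example.

```python
from collections import defaultdict, deque

class SocialNetwork:
    """Toy social network analysis: entities linked through shared records."""

    def __init__(self):
        self.adj = defaultdict(set)  # adjacency list of undirected links

    def link(self, a, b):
        # An undirected edge, e.g. two people named on the same field
        # interview card, traffic stop, or tied to the same address.
        self.adj[a].add(b)
        self.adj[b].add(a)

    def hops_from(self, sources):
        # Breadth-first search: distance (in hops) from any source node.
        dist = {s: 0 for s in sources}
        queue = deque(sources)
        while queue:
            node = queue.popleft()
            for nbr in self.adj[node]:
                if nbr not in dist:
                    dist[nbr] = dist[node] + 1
                    queue.append(nbr)
        return dist

    def risk_ranking(self, known, max_hops=2):
        # Rank everyone within max_hops of a known victim or assailant;
        # a closer connection is treated as a higher risk in this toy model.
        dist = self.hops_from(known)
        candidates = [(n, d) for n, d in dist.items()
                      if n not in known and d <= max_hops]
        return sorted(candidates, key=lambda nd: nd[1])

net = SocialNetwork()
net.link("A", "B")  # e.g. co-listed on a field interview card
net.link("B", "C")  # e.g. stopped together in a traffic stop
net.link("C", "D")
ranked = net.risk_ranking(known={"A"})
# "B" is one hop from known victim "A"; "C" is two hops away;
# "D" falls outside the two-hop cutoff.
```

In practice a real system would weight edges by record type and recency rather than counting raw hops, but the core idea — flagging individuals by their graph distance to known victims or assailants — is the same.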

The data on individuals came from information scraped from social media as well as NOPD criminal databases for ballistics, gangs, probation and parole information, jailhouse phone calls, calls for service, the central case management system (i.e., every case NOPD had on record), and the department’s repository of field interview cards. The latter database represents every documented encounter NOPD has with citizens, even those that don’t result in arrests. In 2010, The Times-Picayune revealed that NOPD Superintendent Ronal Serpas had mandated that the collection of field interview cards be used as a measure of officer and district performance, resulting in over 70,000 field interview cards filled out in 2011 and 2012. The practice resembled NYPD’s “stop and frisk” program and was instituted with the express purpose of gathering as much intelligence on New Orleanians as possible, regardless of whether they had committed a crime.
NOPD then used the list of potential victims and perpetrators of violence generated by Palantir to target individuals for the city’s CeaseFire program. CeaseFire is a form of the decades-old carrot-and-stick strategy developed by David Kennedy, a professor at John Jay College in New York. In the program, law enforcement informs potential offenders with criminal records that they know of their past actions and will prosecute them to the fullest extent if they re-offend. If the subjects choose to cooperate, they are “called in” to a required meeting as part of their conditions of probation and parole and are offered job training, education, potential job placement, and health services. In New Orleans, the CeaseFire program is run under the broader umbrella of NOLA For Life, which is Mayor Landrieu’s pet project that he has funded through millions of dollars from private donors.

According to Serpas, the person who initially ran New Orleans’ social network analysis from 2013 through 2015 was Jeff Asher, a former intelligence agent who joined NOPD from the CIA. If someone had been shot, Serpas explained, Asher would use Palantir’s software to find people associated with them through field interviews or social media data. “This data analysis brings up names and connections between people on FIs [field interview cards], on traffic stops, on victims of reports, reporting victims of crimes together, whatever the case may be. That kind of information is valuable for anybody who’s doing an investigation,” Serpas said.
Of the 308 people who participated in call-ins from October 2012 through March 2017, seven completed vocational training, nine completed “paid work experience,” none finished a high school diploma or GED course, and 32 were employed at one time or another through referrals. Fifty participants were detained following their call-in, and two have since died.

By contrast, law enforcement vigorously pursued its end of the program. From November 2012, when the new Multi-Agency Gang Unit was founded, through March 2014, racketeering indictments escalated: 83 alleged gang members in eight gangs were indicted in the 16-month period, according to an internal Palantir presentation.
Call-ins declined precipitously after the first few years. According to city records, eight group call-ins took place from 2012 to 2014, but only three took place in the following three years. Robert Goodman, a New Orleans native who became a community activist after completing a prison sentence for murder, worked as a “responder” for the city’s CeaseFire program until August 2016, discouraging people from engaging in retaliatory violence. Over time, Goodman noticed more of an emphasis on the “stick” component of the program and more control over the non-punitive aspects of the program by city hall that he believes undermined the intervention work. “It’s supposed to be ran by people like us instead of the city trying to dictate to us how this thing should look,” he said. “As long as they’re not putting resources into the hoods, nothing will change. You’re just putting on Band-Aids.”

After the first two years of Palantir’s involvement with NOPD, the city saw a marked drop in murders and gun violence, but it was short-lived. Even former NOPD Chief Serpas believes that the preventative effect of calling in dozens of at-risk individuals — and indicting dozens of them — began to diminish.

“When we ended up with nearly nine or 10 indictments with close to 100 defendants for federal or state RICO violations of killing people in the community, I think we got a lot of people’s attention in that criminal environment,” Serpas said, referring to the racketeering indictments. “But over time, it must’ve wore off because before I left in August of ‘14, we could see that things were starting to slide.”

Nick Corsaro, the University of Cincinnati professor who helped build NOPD’s gang database, also worked on an evaluation of New Orleans’ CeaseFire strategy. He found that New Orleans’ overall decline in homicides coincided with the city’s implementation of the CeaseFire program, but the Central City neighborhoods targeted by the program “did not have statistically significant declines that corresponded with November 2012 onset date.”
The secrecy surrounding the NOPD program also raises questions about whether defendants have been given evidence they have a right to view. Sarah St. Vincent, a researcher at Human Rights Watch, recently published an 18-month investigation into parallel construction, or the practice of law enforcement concealing evidence gathered from surveillance activity. In an interview, St. Vincent said that law enforcement withholding intelligence gathering or analysis like New Orleans’ predictive policing work effectively kneecaps the checks and balances of the criminal justice system. At the Cato Institute’s 2017 Surveillance Conference in December, St. Vincent raised concerns about why information garnered from predictive policing systems was not appearing in criminal indictments or complaints.

“It’s the role of the judge to evaluate whether what the government did in this case was legal,” St. Vincent said of the New Orleans program. “I do think defense attorneys would be right to be concerned about the use of programs that might be inaccurate, discriminatory, or drawing from unconstitutional data.”

If Palantir’s partnership with New Orleans had been public, the issues of legality, transparency, and propriety could have been hashed out in a public forum during an informed discussion with legislators, law enforcement, the company, and the public. For six years, that never happened.

Source: Palantir has secretly been using New Orleans to test its predictive policing technology – The Verge

One of the big problems here is the lack of public knowledge of, and oversight over, the program. There is no way to tell whether the system is being implemented fairly or cost-effectively (the costs are huge!), or even whether it works at all. It seemed to work for a while, but the effects appeared to drop off after about two years of operation, largely because the program leaned ever harder on the “stick” while abandoning the “carrot.” The amount of private data handed to Palantir without any discussion or consent is worrying, to say the least.