TechScape: can AI really predict crime? | Technology


In 2011, the Los Angeles Police Department introduced a novel approach to policing called Operation Laser. Laser - which stood for Los Angeles Strategic Extraction and Restoration - was the first predictive policing program of its kind in the US, allowing the LAPD to use historical data to predict, with supposed laser precision (hence the name), where future crimes might be committed and who might commit them.


But it was far from precise. The program used historical crime data - arrests, calls for service, field interview cards, which officers filled out with identifying information every time they stopped someone for any reason, and more - to identify "problem areas" for officers to focus on and to assign criminal risk scores to individuals. The information gathered through this police work was fed into computer software, which further automated the department's crime prediction efforts. Activist groups such as the Stop LAPD Spying Coalition argue that the picture of crime the software presented simply confirmed existing police patterns and decisions, and inevitably criminalized places and people, based on a contested hypothesis: that where crimes have once taken place, they will recur. The data the LAPD used to predict the future was fraught with bias, leading to excessive surveillance and the disproportionate targeting of Black and brown communities - often the same ones police had targeted for years, experts argue.

About five years after the program began, the LAPD focused on an intersection in a south LA neighborhood that the late rapper Nipsey Hussle was known to frequent. It was the crossroads where he grew up and later opened a flagship clothing store, both as an ode to his neighborhood and as a means of helping the community grow economically. There, the LAPD stopped 161 people within two weeks while searching for a suspect described as Black and between the ages of 16 and 18. Nipsey Hussle had previously complained about constant harassment by the police, saying as far back as 2013 that LAPD officers "come up, get out, ask you questions - your name, your address, your cell phone number, your social - when you haven't done anything. Just so they know everyone in the hood." In an interview with Sam Levin, Nipsey's brother Samiel Asghedom said that no one could go into the store without being stopped by the police. The brothers, co-owners of The Marathon Clothing store, even considered relocating it to escape the harassment.

Ultimately, the LAPD was forced to shut the program down because the data did not paint a complete picture.

Fast forward almost 10 years: the LAPD has been working on a trial basis with a company called Voyager Analytics. Documents reviewed by the Guardian, written in November, show that Voyager Analytics claimed it could use AI to analyze social media profiles and detect emerging threats based on a person's friends, groups, posts and more. It was essentially Operation Laser for the digital world. Rather than focusing on physical locations or people, Voyager examined the digital lives of persons of interest to determine whether they were involved in crime rings or intended to commit future crimes, based on who they interacted with, what they posted and even their friends of friends. "It's a 'guilt by association' system," said Meredith Broussard, professor of data journalism at New York University.

Voyager claims that all of this information about individuals, groups and pages enables its software to perform "sentiment analysis" in real time and surface new leads when investigating "ideological solidarity". "We're not just connecting existing dots," says a Voyager promotional document. "We're creating new dots. What appear to be random and irrelevant interactions, behaviors or interests suddenly become clear and understandable."

But systems like Voyager Labs' software and Operation Laser are only as good as the data they are built on - and skewed data leads to skewed results.

In a case study showing how Voyager's software could be used to identify people who "most identify with a mindset or a particular topic", the company analyzed the social media presence of Adam Alsahli, who was killed while attempting to attack the Corpus Christi naval base in Texas. Voyager said its software determined that Alsahli's profile showed a strong propensity for fundamentalism. The evidence it cited included the facts that 29 of Alsahli's 31 Facebook posts were Islamic-themed images and that one of Alsahli's Instagram handles, blacked out in the documents, reflected "his pride in and identification with his Arab heritage". The company also noted that of the accounts he followed on Instagram, "most of them are in Arabic" and "generally" post religious content. Of his Twitter account, Voyager wrote that Alsahli tweeted mainly about Islam.

Although the case study was redacted, many aspects of what Voyager treated as signals of fundamentalism could equally be counted as free speech or other protected activity. The case study, at least the parts we were able to see, reads like your average Muslim dad's social media profile.

While the applications may seem different, the two cases illustrate law enforcement's continued desire to upgrade its policing, and the limitations - and in some cases the biases - deeply ingrained in the data these systems rely on. Some activists say police are employing systems that purport to use artificial intelligence and other advanced technologies to do what those technologies actually cannot: analyze human behavior to predict future crime. In doing so, they often create a vicious feedback loop.

The main difference is that an entire technology sector is now clamoring to meet law enforcement's calls for more advanced systems. And it is not just companies that openly build surveillance or policing products answering the call, but also consumer tech companies that the average person interacts with daily, like Amazon. For its part, Amazon has worked directly with the LAPD to give its officers access to its network of Ring cameras. For police, the motivation for such partnerships is clear: such technologies lend credibility to their policing decisions and potentially make their work easier or more effective. For tech companies, the motivation is finding revenue streams with growth potential. The lucrative government contract, with its seemingly endless funding, is hard to resist, especially as many other avenues of growth dry up. That is why internal opposition from employees hasn't deterred companies like Google, which continues to pursue military contracts despite years of worker protest.

From the New York Times: "In 2018, thousands of Google employees signed a letter protesting the company's involvement in Project Maven, a military program that uses artificial intelligence to interpret video images and could be used to refine the targeting of drone strikes. Google management relented and agreed not to renew the contract once it expired.

The outcry led Google to issue guidelines for the ethical use of artificial intelligence that bar the use of its technology for weapons or surveillance, and it accelerated a shake-up of its cloud computing business. Now, as Google positions cloud computing as a key part of its future, the bid for the new Pentagon contract could test the limits of those AI principles, which have set it apart from other tech giants that routinely seek military and intelligence work."

Where can a company like Google, whose tentacles already reach into nearly every industry, still expand its business? At the moment, the answer seems to be: working with the government.

Reader, I'd love to hear what you think of tech companies working with law enforcement to equip them with predictive policing or other surveillance technology.

If you would like to read the full version of the newsletter, please subscribe to receive TechScape in your inbox every Wednesday.
