Amazon has joined the growing number of companies selling facial recognition technology to law enforcement agencies, offering to “identify persons of interest against a collection of millions of faces in real-time.” Civil libertarians are nettled. “This is a perfect example of technology outpacing the law,” says the Electronic Frontier Foundation.
The revelation this week that Amazon is selling facial-recognition technology, branded Amazon Rekognition, to law enforcement agencies raised questions about which laws or regulations govern police use of the technology. The answer: more or less none, reports Wired. More than two dozen nonprofits wrote to Amazon CEO Jeff Bezos asking that he stop selling the technology to police, after the ACLU of Northern California released documents shining light on the sales. Amazon says its technology can “identify persons of interest against a collection of millions of faces in real-time.” The letter argues that Amazon Rekognition “is primed for abuse in the hands of governments.”
State and federal laws generally leave police departments free to do things like search video or images collected from public cameras for particular faces. Cities and local departments can set their own policies and guidelines, but even some early adopters of the technology haven’t done so. Documents released by the ACLU show that Orlando, Fla., worked with Amazon to build a system that detects “persons of interest” using eight public-security cameras. “Since this is a pilot program, a policy has not been written,” a city spokesperson said when asked about guidelines for its use. “This is a perfect example of technology outpacing the law,” says Jennifer Lynch of the Electronic Frontier Foundation. “There are no rules.” Other companies offer similar technology, including Massachusetts-based MorphoTrust, which works with the FBI, and South Carolina’s Data Works Plus, which has worked with Detroit police.
KidGuard, a phone app that can help keep tabs on children, promotes its surveillance for other purposes, like “How to Read Deleted Texts on Your Lover’s Phone.” A similar app, mSpy, advised a woman on secretly monitoring her husband. Still another, Spyzie, ran Google ads alongside results for search terms like “catch cheating girlfriend iPhone.” As such digital tools have multiplied, so have the options for people who abuse the technology to track others without consent, the New York Times reports. More than 200 apps and services offer would-be stalkers a variety of capabilities, from basic location tracking to harvesting texts and even secretly recording video. More than two dozen services were promoted as surveillance tools for spying on romantic partners. Most spying services required access to victims’ phones or knowledge of their passwords — both common in domestic relationships.
Digital monitoring of a spouse or partner can constitute illegal stalking, wiretapping or hacking. Laws and law enforcement have struggled to keep up with technological changes, even though stalking is a top warning sign for attempted homicide in domestic violence cases. “We misunderstand and minimize this abuse,” said Erica Olsen of the National Network to End Domestic Violence. “People think that if there’s not an immediate physical proximity to the victim, there might not be as much danger.” Data on electronic stalking are scarce, but data breaches at two surveillance companies last year revealed accounts of more than 100,000 users. The Centers for Disease Control and Prevention says 27 percent of U.S. women and 11 percent of men at some point endure stalking or sexual or physical violence by an intimate partner with significant effects. Many law enforcement agencies don’t have the computer skills to help survivors, or don’t devote forensic resources to domestic abuse and stalking cases.
A professor at the University of California Davis School of Law predicts Supreme Court justices will defend the First Amendment principles of free speech against government attempts to curb Internet abuses—even when those abuses involve promoting falsehoods online.
How will the Roberts Supreme Court weigh in on the emerging debate over how to prevent the abuse of online media and social networks?
A forthcoming paper argues that, although the justices are now evenly divided between “technology optimists and technology pessimists,” they are likely to defend the principles of free speech against attempts to regulate content on the Internet.
Ashutosh Bhagwat, a law professor at the University of California Davis School of Law, bases his prediction on several recent rulings—although he notes that it is “astonishing” that Internet and free speech issues have rarely been addressed in the 12 years since Chief Justice John Roberts was appointed.
“It seems inevitable that going forward, this is going to change,” Bhagwat writes in an article scheduled for publication this month in the Washington University Law Review.
“Recent calls to regulate ‘fake news’ and otherwise impose filtering obligations on search engines and social media companies will inevitably raise important and difficult First Amendment issues.”
Basing his analysis on reviews of several cases brought before the Roberts Court, Bhagwat identifies Chief Justice Roberts and Justice Samuel Alito as the “pessimist” justices most in favor of stricter regulation, and Justices Anthony Kennedy, Sonia Sotomayor, Ruth Bader Ginsburg, and Elena Kagan as those most aligned with defending free speech.
The remaining justices—Clarence Thomas, Stephen Breyer and Neil Gorsuch—are somewhere in the middle, he writes.
One of the cases Bhagwat reviews, Packingham v. North Carolina, concerned a challenge to a North Carolina statute that forbade any registered sex offender from accessing a commercial social networking Web site where the sex offender knows that the site permits minor children to become members or to create or maintain personal Web pages.
The Court upheld the challenge, ruling the statute unconstitutional. Justice Kennedy, writing for the majority, held that First Amendment protections could be constitutionally extended to the “vast democratic forums of the Internet…and social media in particular.”
The Court’s decision in a non-Internet case, United States v. Alvarez, which upheld an individual’s right to make a false claim that he had received the congressional Medal of Honor, made clear that “even intentional falsehoods are entitled to some level of First Amendment protection, and there is no reason to expect that principle not to be extended” to cyberspace, Bhagwat wrote.
“Given the enormous risk of self-serving political manipulation or bias posed by government regulation of social media falsehoods on political topics, I would expect all the Justices to balk” at similar attempts to discipline the use of so-called fake news, he added.
Why Supreme Court Justices lean one way or another is uncertain, but Bhagwat argues the Roberts Court’s approach to free speech issues reflects the “longstanding tension in American political thinking between Jeffersonians who embrace change and individual autonomy at the cost of occasional disorder; and Hamiltonians, who embrace order at the cost of occasional limits on liberty.”
But the paper finds that more Justices lean in the direction of free speech and openness when it comes to regulating technology.
“I think it likely, but not certain, that a working majority of the Roberts Court will vote to fend off heavy-handed efforts to assert state control over new technology such as the Internet and social media,” he writes.
He cautions that for the “technology optimists” to succeed in future cases they only have to persuade one of the three “uncertain” Justices, whereas the technology pessimists would have to persuade all three.
Nevertheless, he adds, the most critical element in shaping how the Constitution is interpreted on these issues will be the regulatory initiatives emanating from Congress, the Federal Communications Commission (FCC), and state legislatures.
“If past history is any guide, content-neutral structural regulations such as the Net Neutrality policy adopted by the Obama-era FCC (and recently repealed by the Trump-era FCC) are likely to fare well in courts and the Court, especially given the existence of precedent, authored notably by Justice Kennedy, upholding similar structural regulations of cable television,” writes Bhagwat.
Facial recognition software has been in use for more than a decade. As it gets cheaper, retailers and many smaller police departments are eyeing it as a viable tool for targeting shoplifters; but how will privacy concerns be addressed?
Shoplifting is harmless, right? It’s nothing more than a victimless petty crime.
Besides, it doesn’t really hurt the retailers because they just write off their losses. Shoplifters don’t even face jail time.
If you believe these so-called “facts,” you are buying into the myths surrounding shoplifting that have very little to do with the reality of the crime.
Consider this. According to the National Retail Federation, the loss of inventory from retail stores due to shoplifting and employee theft costs the U.S. retail industry nearly $48.9 billion a year. Moreover, the average cost per shoplifting incident is $798.48.
That’s not exactly “petty.”
According to the National Association of Shoplifting Prevention, there are approximately 27 million shoplifters – one in every 11 people – in our country today. More than 10 million people have been caught shoplifting in the last five years, but shockingly, only one in 48 shoplifters is ever caught, and only half of those are turned over to the police for prosecution.
Especially troublesome is the fact that 10 percent of the total dollar losses due to shoplifting are attributable to “professional” shoplifters who steal solely for resale or profit as a business. These include hardened criminals who steal as a lifestyle, international shoplifting gangs who steal for profit, and drug addicts who steal to feed their habit.
That last group is particularly worrisome, given the escalating rate of opioid addiction in the US.
It is estimated that more than two million Americans are now addicted to prescription pain killers, while nearly 600,000 have a substance-use disorder involving heroin. People addicted to these drugs often steal from large retail stores with the intent of returning the stolen merchandise (with no receipt) for a gift card which they can then resell for cash.
While there is no nationwide research showing the connection between opioid addiction and shoplifting, many police departments have direct experience with it. The Knoxville, Tenn., police department found that almost 85 percent of drug overdoses in a three-month window in 2017 were linked directly to gift cards.
Given those staggering numbers, is there anything retailers and the police can do to stem the tide of shoplifting?
Increasingly, police, along with some retailers, are turning to a technology solution that has demonstrated its effectiveness in cracking down on criminals: facial recognition software. Starting last year, a number of retailers across the country began displaying signs informing customers that management is using facial recognition software, putting would-be shoplifters on notice that the store is a monitored zone.
While facial recognition software has been in use for more than a decade, retailers and many smaller police departments only began to consider it as a viable tool for targeting shoplifters in the past year or two as prices have dropped.
Facial recognition software works by using image processing and machine learning algorithms to match a photo of an unidentified person (“probe” photo) against a database of photos of identified persons who previously have been convicted of shoplifting or other crimes. The face-identification algorithms in the software will produce a list of possible matches, with each match having a score that indicates the quality or likelihood of a match.
In the past, low resolution, poor lighting, motion blur, off-angle faces, facial hair, and other scenarios have challenged these algorithms to produce a good match. Advances in the technology based on algorithms such as “deep learning,” however, have produced significant gains when processing challenging probe photos.
Despite such advances, even the best facial recognition systems are unlikely to generate just a single match from something like a store security camera photo. Instead, the system will generate a list of possible matches. The police working the case will then need to use standard investigative methods to either rule out or further investigate each match, just as they would with any investigative lead.
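The match-and-rank process described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual implementation: it assumes an upstream face-embedding model has already converted each photo to a fixed-length vector (random vectors stand in for real embeddings here), and it scores a probe against a gallery by cosine similarity, returning a ranked candidate list rather than a single answer.

```python
import numpy as np

def rank_candidates(probe, gallery, top_k=3):
    """Rank gallery identities by cosine similarity to the probe embedding.

    probe:   1-D embedding vector for the unidentified ("probe") face
    gallery: dict mapping identity name -> 1-D embedding vector
    Returns a list of (name, score) pairs, best match first.
    """
    probe = probe / np.linalg.norm(probe)
    scores = []
    for name, emb in gallery.items():
        emb = emb / np.linalg.norm(emb)
        scores.append((name, float(probe @ emb)))
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]

# Stand-in 128-dimensional embeddings; a real system would get these
# from a face-embedding network, not a random generator.
rng = np.random.default_rng(0)
gallery = {f"subject_{i}": rng.normal(size=128) for i in range(5)}

# Simulate a probe photo of subject_2 captured under noisy conditions.
probe = gallery["subject_2"] + 0.1 * rng.normal(size=128)

for name, score in rank_candidates(probe, gallery):
    print(f"{name}: {score:.3f}")
```

Each entry on the returned list is an investigative lead with a confidence score; the threshold for acting on a score, and the length of the candidate list, are policy choices for the agency, not properties of the algorithm.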
In other words, the software isn’t doing anything that wouldn’t occur during a normal police investigation. It is simply doing what investigators would do, but faster and with a higher degree of accuracy.
It is equally important to note that the way in which facial recognition software is currently deployed poses little threat to privacy and offers limited potential for abuse. Most systems immediately discard images of anyone who isn’t a match for a known shoplifter.
Lack of information or even misinformation, however, can cause a reaction on the part of the public. Could the retailer, for example, collect information on everyone who walks into the store, their buying habits, and so on?
Worse still, could that information be sold to others? As a result, it is important for retailers and law enforcement to fully understand the spectrum of possible uses of the technology, as well as how the public may perceive those uses.
Facial recognition software has the potential to change the rules of retail, generating leads in a great many cases that might otherwise go unsolved. And while each case may not be high profile, in aggregate they represent a staggering amount of criminal activity.
Given that, stopping even a single shoplifter could prevent tens of thousands of dollars in future theft.
Nick Coult is Senior Vice President for Law Enforcement and Public Safety at Numerica Corporation. He is one of the creators of Lumen, a platform for law enforcement search, analysis, and data sharing. Numerica is currently beta-testing a version of their Lumen software called Lumen FR, which integrates next-generation facial recognition algorithms directly into Lumen. Numerica anticipates that Lumen FR will be available by early summer. Readers’ comments are welcome. For more information, visit https://www.numerica.us/
Guest Blogger: John Bamford, Detective, Arlington County Police Department
In July of 2017, the Federal Bureau of Investigation announced a multi-country operation that resulted in the takedown of one of the largest illicit marketplaces. This virtual marketplace, known as Alphabay, was alleged to have operated for more than two years, with transactions totaling over $1 billion in cryptocurrencies such as Bitcoin. At its peak, Alphabay was estimated to have more than 200,000 users and 40,000 vendors supplying illicit substances.
What is vital to know is that, unlike the physical transfers of the past, users of Alphabay completed all their transactions online in what is known as the “darkweb.” Since the dismantling of Alphabay and another marketplace called Hansa, the number of websites specifically involved in the sale of illicit items has multiplied. For law enforcement, this means there are more avenues for suspects to obtain items such as heroin, fentanyl, weapons, or stolen credit card numbers that can bring harm to communities. With this increase in suspects, local police departments can no longer rely solely on federal authorities to investigate these marketplaces. Local authorities should begin learning how to initiate their own investigations and understand the important role they play in identifying these suspects.
This blog provides guidance for local departments in setting up a system to undertake their own investigations into the various darkweb marketplaces.
Selecting the right person for the job
Unlike traditional narcotics investigations, these investigations don’t usually involve a lot of hand-to-hand purchases and physical undercover work. Rather, they require a large amount of paperwork and a willingness to pore over documents looking for the single mistake that allows a suspect to be identified. Since they are almost exclusively cyber-based, having a detective who is either technologically inclined or willing to learn the various ins and outs of cyber investigations is also vital.
Because many marketplaces center on either fraud or narcotics, many departments shift white-collar or vice detectives into investigating darkweb marketplaces. Detectives who typically work narcotics or fraud investigations already have experience sorting through documents and dealing with legal nuances, and may be prepared for the inevitable pitfalls and roadblocks that occur in investigations into darkweb markets.
Obtaining the necessary training
In order to investigate a darkweb marketplace effectively, officers have to know what to look for and should be thoroughly trained. For example, understanding cryptocurrency or hiding a computer’s Internet Protocol (IP) address are skills that law enforcement should not learn through trial and error. It is very easy to ruin an entire case by leaving a digital trail right back to your department. Training is also legally vital: it equips officers to articulate sound, informed inferences when applying for and executing search warrants.
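A concrete pre-investigation habit that follows from the point above: before touching a marketplace, confirm that your outbound traffic is not attributable to the department. As a simplified sketch (the address blocks below are hypothetical placeholders, not real agency prefixes), the core check reduces to learning your current public egress IP, then verifying it falls outside every network block registered to your agency. In practice the current IP would come from an HTTPS request to a what-is-my-IP echo service, made through the same tunnel the investigation will use.

```python
import ipaddress

def egress_is_masked(current_ip, department_prefixes):
    """Return True if the current public IP falls outside every
    network block known to belong to the department."""
    addr = ipaddress.ip_address(current_ip)
    return not any(addr in ipaddress.ip_network(p) for p in department_prefixes)

# Hypothetical department-owned blocks; substitute your agency's real prefixes.
DEPT_PREFIXES = ["203.0.113.0/24", "198.51.100.0/24"]

# A department IP leaks a trail; an address outside both blocks does not.
print(egress_is_masked("203.0.113.7", DEPT_PREFIXES))   # department block: not masked
print(egress_is_masked("192.0.2.50", DEPT_PREFIXES))    # outside both blocks: masked
```

This check catches only the most basic form of attribution; proper training covers the many other ways a digital trail can lead back to the investigating agency.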
The good news for many departments is that there is a large amount of training available that does not require physical travel. This training, offered online by many companies, is almost always beneficial no matter the investigator’s experience level, given the variety of forums and marketplaces where criminal activity occurs. There are also numerous training opportunities available through federal government entities such as the Federal Bureau of Investigation’s Criminal Justice Information Services (CJIS) Division and various federal task forces.
Joining a Task Force
When undertaking an investigation into a darkweb market, it is very likely that some of the vendors or administrators of the marketplace live outside of your jurisdiction. To effectively investigate and identify suspects, it is advisable to join a federal task force. A task force provides your department with additional technical experience and knowledge. Being part of a task force can help avert conflicts between investigations being conducted by different departments or, alternatively, allow investigators to combine their cases into a larger one. Finally, it serves as a force multiplier, allowing for the pooling of both personnel and financial resources.
Picking targets of investigations
While the owner of a darkweb marketplace likely lives outside of a department’s jurisdiction and reach, it is very common that individual users of the darkweb marketplace live within the department’s jurisdiction. Local departments should utilize situations where they have actual day-to-day interaction with the suspects utilizing the darkweb marketplace to work their way up the chain. Consider this example:
An overdose death occurs in a local jurisdiction, which pulls in both homicide and narcotics detectives. Upon arrival, they discover that the victim of the overdose obtained the controlled substance via an online order. Using the victim’s laptop and cellular phone, law enforcement may be able to identify the actual supplier of the narcotics through the darkweb marketplace. While identifying the owner of the website may be extremely difficult, identifying the supplier of the narcotics and working up the supply chain is a more feasible challenge for local law enforcement.
Working with prosecuting attorneys
To successfully prosecute a complex case involving a darkweb marketplace, it is vital that law enforcement officers and prosecutors are on the same page. While both sides may not see eye-to-eye on every single issue, they must be able to work together to move the case towards a successful prosecution. In darkweb marketplace cases, much of the case development and investigation will involve legal processes directed to various entities such as internet service providers or internet companies such as Facebook or Google. Working with the prosecuting attorneys can help ensure that the evidence is obtained through the correct legal processes. The prosecutors must also work with law enforcement to ensure sufficient evidence has been collected, especially since the investigation into a darkweb marketplace oftentimes requires a technical and specialized understanding that many prosecutors may not have.
In conclusion, many local departments have the ability to investigate crimes arising from darkweb marketplaces. However, to obtain a successful prosecution, it is important that departments position the right investigators for the job, ensure investigators receive proper training and resources, pursue the right suspects, and work with prosecutors to reach a favorable case conclusion.
Interested in learning how to successfully conduct dark web investigations including how to seize cryptocurrencies in a forensically sound manner? Join us at the 2018 IACP Technology Conference in Providence, Rhode Island, May 21-May 23, 2018. Visit: http://www.theiacp.org/Tech-Conference for more information!
Several technology companies are working with police departments to develop capability to add artificial intelligence to video surveillance and body cameras that could identify faces in real time, potentially expanding the reach of police surveillance. The body-camera technology is expected to be ready by the fall.
Several technology companies are working with police departments to develop capability to add artificial intelligence to video surveillance and body cameras that could identify faces in real time, potentially expanding the reach of police surveillance, the Wall Street Journal reports. The body-camera technology, expected to be ready by the fall, hasn’t yet been purchased by police departments and is still in the development stage. Police departments already use facial recognition to review surveillance footage after a crime has occurred. The new software uses an algorithm to tell an officer on the spot, through a body camera or a video surveillance camera, that it has found a suspect. The officer then must decide whether to stop the suspect or take some other action.
The technology underscores law enforcement’s growing dependence on software and high-tech tools, including gun-shot-detection technology and predictive analytics. The tools have been hailed by law enforcement, but raise concerns about privacy. Chicago-based Motorola Solutions, a maker of police communications and body-camera technology, has partnered with artificial-intelligence company Neurala to produce a body-worn camera, ready for deployment this fall, that executives say will learn to identify a suspect or a missing child and spot them in a crowd. The technology would get smarter by taking in more data over time. “This frees up some of your cognitive space so you aren’t trying to do a thousand things at one time,” said a sergeant at a Midwest police force, who is working with Motorola to provide feedback on the technology. Motorola said it is working with a number of departments around the country.
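The on-the-spot alerting described above differs from after-the-fact search mainly in that each frame is scored against a short watchlist, and an alert fires only above a confidence threshold, leaving the stop-or-not decision with the officer. The following is a minimal sketch of that control flow, not any vendor's product: it assumes an upstream model supplies a per-frame face embedding (simulated here with random vectors), and the threshold value is an illustrative placeholder.

```python
import numpy as np

ALERT_THRESHOLD = 0.85  # tunable; higher means fewer, more confident alerts

def check_frame(frame_embedding, watchlist, threshold=ALERT_THRESHOLD):
    """Compare one frame's face embedding against a watchlist.

    Returns (name, score) for the best watchlist match if it clears the
    threshold, else None. A non-None result is an alert for the officer
    to assess, not an automatic identification.
    """
    v = frame_embedding / np.linalg.norm(frame_embedding)
    best_name, best_score = None, -1.0
    for name, emb in watchlist.items():
        score = float(v @ (emb / np.linalg.norm(emb)))
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# Stand-in embeddings for two watchlist entries.
rng = np.random.default_rng(1)
watchlist = {"missing_child": rng.normal(size=128),
             "suspect_a": rng.normal(size=128)}

# A frame showing suspect_a under mild noise, and an unrelated bystander.
clean_frame = watchlist["suspect_a"] + 0.05 * rng.normal(size=128)
unrelated_frame = rng.normal(size=128)

print(check_frame(clean_frame, watchlist))      # alert expected
print(check_frame(unrelated_frame, watchlist))  # no alert expected
```

The key design point is the asymmetry: frames that match no one produce nothing and can be discarded immediately, which is also how such systems can limit the data they retain on bystanders.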
Technology companies that vocally opposed the travel ban and transgender military ban have been noticeably quiet in the aftermath of the shooting in Parkland, Fl.
Tech companies that have weighed in on other social issues are staying noticeably silent on the gun debate, reports the Chicago Tribune. A reputation research team at Weber Shandwick, the global public relations firm, has tracked corporate responses to six controversial moments in the past year – events like President Trump’s initial travel ban, his plan to withdraw from the Paris climate accord, and his remarks following the violent protests in Charlottesville, Va.
The analysis scanned company statements and social media platforms for keywords, and found that fewer tech companies weighed in on gun control or the shooting than had spoken up over past events. For example, 21 of the 28 corporate responses about the transgender military ban, or 75 percent, came from tech companies. But in the aftermath of the Parkland shooting, tech came in third behind finance and retail, with just 16 percent of the responses. The analysis also found that for all six events, more companies pointed to their customers rather than corporate “values” as the reason they took action, and that fewer CEOs attached their names to the company’s statement, issuing unsigned or more general remarks instead.
Guest Blogger: Bonnie Locke, Nlets Director of Business Development and Chair of the IACP LEIT Section.
Today’s law enforcement professionals face unprecedented technological challenges, from cyber-attacks that compromise personal information, to the difficulty in monitoring active intelligence from social media.
Similar to officers on the street, law enforcement information technology professionals face a diverse set of issues depending on the size of the agency, location, budget, and existing infrastructure. While some agencies may be asking for guidance on how to create, deploy, and maintain a data warehouse, other agencies may be looking for guidance on how to develop an in-house advanced video analytic system or how to conduct successful dark web investigations. The law enforcement community needs to address these problems together, keeping an open line of communication toward the goal of interoperability, unified standards, and the fusion of disparate information resources.
Although today’s public safety personnel rise to the challenge every day, they need the tools to keep up in an evolving landscape. It takes cutting edge information technology and policy guidance to ensure law enforcement is able to respond to real-time crime intelligence, communicate, and function efficiently. The 2018 IACP Technology Conference, May 21-23, in Providence, Rhode Island, provides criminal justice and public safety professionals an opportunity to share ideas that will help keep citizens and officers safe.
This three-day conference will cover a variety of emerging issues in technology including:
Leveraging Blockchain in Criminal Investigations
Highly Autonomous Vehicles: Is Law Enforcement Ready?
Using Sensor-Based Technology to Improve Officer Safety
Monitoring Social Media in Real Time with Free Tools
As well as more familiar issues such as:
NIBRS: How to Work with Vendors to Ensure a Seamless Transition
How to Improve Communications using Mobile Apps
Finding a Policy Framework to Use When Procuring New Technology
Developing a Long-Term IT Vision
Today’s technology is changing so incredibly fast and it’s an integral part of what we do. I am excited to hear from leading practitioners that can talk about what is working in the field, generate thought provoking ideas, and help identify the solutions that agencies can adopt today or consider for the future. Every year, I meet law enforcement professionals and industry partners that are game changers. Whether you are a public safety technologist, analyst, manager, or executive – the IACP Technology Conference is a must. I hope you will join me this year and discover the possible.
The CLOUD Act is an attempt to update an obsolete stored communications law that was passed in the 1980s before the World Wide Web existed. Sen. Rand Paul of Kentucky opposed the proposal as a violation of Americans’ privacy. He tweeted, “But guess what? Congress can’t vote to reject the CLOUD Act, because it just got stuck onto the Omnibus (spending bill), with no prior legislative action or review.”
Police in other countries will be able to get emails and other electronic communication more easily from their own citizens and from Americans under a bill that Congress stuffed inside the massive $1.3 trillion spending deal passed last week, says USA Today. Supporters said the bill, dubbed the CLOUD Act, will simplify the process for the U.S. government and its allies to get evidence of serious crimes and terrorist threats when that evidence is stored on a server in another country. Opponents, including civil liberty and privacy rights groups, said the law could make it easier for nations with human rights abuses to spy on dissidents and collect data on Americans who communicate with foreign nationals. Internet providers had been able to legally stop police agencies from gaining access to their own citizens’ emails if those emails were stored in a foreign nation. Microsoft stores data on about 1 million servers in 40 countries.
“Tucked away in the omnibus spending bill is a provision that allows Trump, and any future president, to share Americans’ private emails and other information with countries he personally likes,” said Sen. Ron Wyden, D-Ore. “That means he can strike deals with Russia or Turkey with nearly zero congressional involvement and no oversight by U.S. courts.” In a letter to Congress, the ACLU, the Electronic Frontier Foundation, Human Rights Watch and 20 other civil liberties groups said the CLOUD Act allows foreign governments to wiretap on American soil, using standards that don’t comply with U.S. law, and gives the executive branch the power to enter into agreements with other nations without congressional approval.
The FBI and Justice Department are renewing their call for a legal mandate to gain access to smartphones during criminal investigations.
FBI and Justice Department officials are renewing a push for a legal mandate that tech companies build tools into smartphones and other devices that would allow access to encrypted data in criminal investigations, reports the New York Times. The federal officials have been quietly meeting with security researchers who have been working on approaches to provide such “extraordinary access” to encrypted devices. Based on that research, Justice Department officials are convinced that mechanisms allowing access to the data can be engineered without intolerably weakening the devices’ security against hacking. The officials have also revived talks inside the executive branch over whether to ask Congress to enact legislation mandating the access mechanisms. The Trump White House circulated a memo last month among security and economic agencies outlining ways to think about solving the problem, officials said.
The FBI has been agitating for versions of such a mandate since 2010, complaining that the spreading use of encryption is eroding investigators’ ability to carry out wiretap orders and search warrants — a problem it calls “going dark.” The issue repeatedly flared without resolution under the Obama administration, peaking in 2016, when the government tried to force Apple to help it break into the iPhone of one of the attackers in the terrorist assault in San Bernardino, Calif. Rod J. Rosenstein, the deputy attorney general, and FBI Director Christopher A. Wray have begun talking publicly about the “going dark” problem, though their renewed push is certain to be met with resistance.