December, 2018

DEA and ICE Installing
Cameras in Streetlights

Government contracts for the DEA and ICE have been awarded to a Texas company, Cowboy Streetlight Concealments, LLC, for video-recording equipment to be installed in streetlights. One of the company's owners, Christie Crawford, was quoted as saying, “[T]hings are always being watched. It doesn’t matter if you’re driving down the street or visiting a friend, if government or law enforcement has a reason to set up surveillance, there’s great technology out there to do it.”

Source:  Justin Rohrlich and Dave Gershgorn, “The DEA and ICE are hiding surveillance cameras in streetlights,” November 9, 2018.

Justice Department Awards Contract for
Camera Installed in Shop-Vac

The DEA has ordered a customized shop-vac. The vacuum will be fitted with a Canon M50B, described as a “high-sensitivity…PTZ [Pan-Tilt-Zoom] network camera” that “captures video with remarkable color and clarity, even in very low-light environments.”

Sources:  Department of Justice contract award.
Justin Rohrlich, “The DEA Is Hiding Surveillance Cameras In Vacuum Cleaners,” December 6, 2018.

AI Predictive Application in the UK

Nine police departments in the UK will test an AI system, the National Data Analytics Solution (NDAS), which uses a “combination of AI and statistics to assess the risk of someone becoming a victim of gun or knife crime, as well as the likelihood of someone falling victim to modern slavery,” according to recent reports. Once flagged as being at risk of either committing an offense or becoming a victim, a person would then be contacted by local health and social services to receive support. The system was developed using data compiled from local law-enforcement databases, “including records of people being stopped and searched and logs of crimes committed.” Records of about five million people were identified.

From the data obtained, the software then produced 1,400 indicators to help predict crime. Among the indicators are “the number of crimes an individual had committed with the help of others and the number of crimes committed by people in that individual’s social group.” Using the indicators allows the AI to assign a risk score to individuals and then “to predict which individuals known to the police may be on a trajectory of violence similar to that observed in past cases, but who haven’t yet escalated their activity.”
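The scoring approach described above — combining many per-individual indicators into a single risk score used to flag people — can be sketched minimally. The indicator names, weights, and flagging threshold below are illustrative assumptions, not details of NDAS, which reportedly draws on about 1,400 indicators.

```python
# Minimal illustrative sketch of indicator-based risk scoring.
# All indicator names, weights, and the threshold are hypothetical;
# they stand in for the ~1,400 indicators NDAS reportedly derives
# from police records.

def risk_score(indicators, weights):
    """Return the weighted sum of indicator values for one individual."""
    return sum(weights[name] * value for name, value in indicators.items())

# Hypothetical weights for two indicators mentioned in reporting.
weights = {
    "crimes_with_accomplices": 0.6,  # crimes committed with others
    "crimes_in_social_group": 0.4,   # crimes by the person's social group
}

person = {"crimes_with_accomplices": 3, "crimes_in_social_group": 5}

score = risk_score(person, weights)
flagged = score >= 2.5  # hypothetical intervention threshold
print(score, flagged)   # -> 3.8 True
```

A real system of this kind would fit the weights statistically from historical case data rather than setting them by hand, which is also where critics locate the risk of encoding existing bias.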

Some people have raised ethical concerns and are critical of such a system. Andrew Ferguson, of the University of the District of Columbia, was quoted in a recent article as cautioning that such “analytical tools run the risk of limiting police enquiries to well-trodden locations,” can reinforce existing bias, and can disproportionately affect people of color and residents of poor neighborhoods.

Sources:  Adam Smith, “UK Police Use AI System to Stop Crime Before It Happens,” November 26, 2018.
Chris Baraniuk, “Exclusive: UK police wants AI to stop violent crime before it happens,” November 26, 2018.

by Neil Leithauser
Associate Editor