What do drones, AI and proactive policing have in common?
Data.
Ellen Joyner-Roberson, CFE, Advisory Global Marketing, Fraud & Security Intelligence, SAS
In Marvel Studios’ Iron Man movies, industrialist and genius engineer Tony Stark transforms himself into a superhero who battles evil. In the ultimate display of proactive policing, he’s aided by Jarvis (Just a Rather Very Intelligent System), a highly advanced artificial intelligence created by Stark to manage almost everything in his life, but especially to help him fight crime.
Jarvis is just an AI in a superhero movie. But what if you could use smart machines – like drones – to fight crime and make communities safer in real life? Or use them to assist police in patrolling quarantine zones? Today that's not so futuristic. Technology that enables proactive policing practices may be coming soon to a community or country near you.
For example, in 2017 at least 167 US fire and police agencies acquired drones as part of their proactive policing strategies. This is more than double the number of agencies that obtained unmanned aircraft in 2015. By 2021, the number of unmanned aerial vehicles (UAVs) in the US is expected to be nearly 3.5 million. Police agencies are now using UAVs for search and rescue, traffic collision reconstruction, investigations of active shooter incidents, crime scene analysis, surveillance and crowd monitoring.
Analyzing visual data captured by drones
The increasing use of drones by policing organizations represents a practical application of AI and machine learning. While drones are capable of doing much more than visual surveillance, advances in object detection have greatly expanded the use of drones for image and video stream analytics. Object detection and classification are fundamental tasks in video analytics and are at the forefront of research in AI and machine learning. A variety of algorithms, including YOLO (you only look once) and deep learning methods such as CNNs (convolutional neural networks), are at the heart of these systems and the basis for developing more complex applications.
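To make that concrete, here is a minimal sketch of frame-by-frame object detection on drone footage using a pretrained YOLO model from the open-source ultralytics package. The video path and model file are placeholders, and the sketch does not represent any specific product or agency system mentioned in this article.

```python
# Minimal sketch: running a pretrained YOLO detector over drone video frames.
# Assumes the open-source "ultralytics" package and OpenCV are installed;
# "drone_footage.mp4" is a placeholder path.
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")          # small pretrained COCO model
video = cv2.VideoCapture("drone_footage.mp4")

while True:
    ok, frame = video.read()
    if not ok:
        break
    # Run object detection on a single frame
    results = model(frame, verbose=False)
    for box in results[0].boxes:
        label = model.names[int(box.cls)]        # class label, e.g. "person", "car"
        conf = float(box.conf)                   # detection confidence
        x1, y1, x2, y2 = map(int, box.xyxy[0])   # bounding-box corners
        print(f"{label} ({conf:.2f}) at [{x1},{y1},{x2},{y2}]")

video.release()
```

In practice, detections like these feed downstream tasks such as tracking, counting or alerting, which is where the more complex applications mentioned above are built.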
Historically, object detection and video analytics approaches were manual and time-consuming, requiring extensive human involvement. Notably, experts had to restrict image quality and perform a variety of pre-processing steps before training object detection algorithms. This led to high error rates and sometimes questionable results due to algorithmic limitations and limited computing capacity. It wasn't so long ago that 10 frames per second was considered fast. Today, however, it's not uncommon to perform analysis at 100 frames per second or more, with human-level accuracy or better. Our capacity to analyze images at scale and with high quality has never been greater, and the technology is only beginning to grow.
Consider what's possible with computer vision. It can be used for facial recognition as well as object detection, and it can even infer the emotions and intent of individuals. Let's say surveillance cameras capture footage of an individual who appears in a law enforcement database of people considered "threats." If that person is walking toward a federal building, computer vision tools could analyze the person's gait and determine that they are leaning heavily to one side. That could mean the individual is carrying a bomb or some other dangerous object.
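As a rough illustration of the gait idea, the sketch below scores lateral lean from pose keypoints. The keypoint format, threshold and sample data are purely hypothetical; a real system would rely on a trained pose-estimation model and far more robust gait features.

```python
# Illustrative sketch only: flag pronounced one-sided lean from pose keypoints.
# Keypoints (shoulders and hips per frame) are assumed to come from an upstream
# pose-estimation model; the data and threshold here are hypothetical.
from statistics import mean

def lean_ratio(frame):
    """Horizontal offset of the shoulder midpoint from the hip midpoint,
    normalized by shoulder width. Positive = leaning right, negative = left."""
    shoulder_mid = (frame["l_shoulder"][0] + frame["r_shoulder"][0]) / 2
    hip_mid = (frame["l_hip"][0] + frame["r_hip"][0]) / 2
    shoulder_width = abs(frame["r_shoulder"][0] - frame["l_shoulder"][0])
    return (shoulder_mid - hip_mid) / shoulder_width

def flags_heavy_lean(frames, threshold=0.25):
    """Average lean over a sequence of frames; flag if it exceeds the threshold."""
    avg = mean(lean_ratio(f) for f in frames)
    return abs(avg) > threshold, avg

# Hypothetical keypoints: (x, y) pixel coordinates for two frames
frames = [
    {"l_shoulder": (310, 200), "r_shoulder": (370, 205),
     "l_hip": (325, 330), "r_hip": (365, 332)},
    {"l_shoulder": (312, 201), "r_shoulder": (372, 206),
     "l_hip": (327, 331), "r_hip": (367, 333)},
]
flagged, score = flags_heavy_lean(frames)
print(f"heavy lean: {flagged} (avg lean ratio {score:.2f})")
```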
Keeping pace with technology
Some restrictions – laws, policies and privacy concerns – have slowed progress in using drones and AI to prevent crime through proactive policing. Local and state agencies in the US have passed rules imposing strict regulations on how law enforcement, criminal justice and other government agencies can use drones. Indiana law, for example, specifies that police departments can use drones for search-and-rescue efforts, to record crash scenes and to help in emergencies; otherwise, a warrant is required. That means police probably won't be able to fly them near large gatherings unless a terrorist attack or crime is underway.
Similarly, many countries have procured drones for law enforcement and aerial surveillance, and European and other international institutions are taking a keen interest in AI. These institutions play a role in establishing suitable international ethical and regulatory frameworks for such developments – the General Data Protection Regulation (GDPR), for example, as well as new frameworks for the ethical use of AI, including transparency of decision making.
Efforts to align drones and AI with privacy protections continue to evolve. One California police agency built privacy considerations into a 2018 pilot program that used drones for general law enforcement: The Chula Vista Police Department, which serves a city of about a quarter-million residents, tested the viability and effectiveness of UAVs for responding to in-progress calls.
Other forms of AI for proactive policing
Drones are not the only form of AI being considered for crime prevention and proactive policing. What about identifying stressed-out police officers who may need a break? A system developed by Rayid Ghani at the University of Chicago increases the accuracy of identifying these "at-risk" officers by 12% and reduces false positives by a third. The system is also used by the Charlotte-Mecklenburg (NC) Police Department.
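The general shape of such an early-intervention model is a supervised classifier that ranks officers by risk so limited wellness resources go where they are most needed. The sketch below uses scikit-learn on synthetic data; the features, model choice and threshold are illustrative assumptions, not details of the system described above.

```python
# Generic sketch of a risk-scoring classifier, in the spirit of early-intervention
# systems for officers. Features, data and model choice are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

# Hypothetical features per officer: recent complaints, overtime hours,
# high-stress dispatches, and sick days over the last quarter.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Rank officers by predicted risk so a limited wellness budget goes to the top cases
risk_scores = model.predict_proba(X_test)[:, 1]
preds = (risk_scores > 0.5).astype(int)
print("precision:", precision_score(y_test, preds))
print("recall:   ", recall_score(y_test, preds))
```

Precision and recall matter here because the article's headline numbers – higher accuracy, fewer false positives – are exactly the trade-off an agency tunes when deciding how many officers to flag for follow-up.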
AI essentials in proactive policing
Law enforcement and public safety agencies at all levels of government must exploit disparate and diverse data sets to be effective in their operations – that's what drones, AI and proactive policing have in common. But massive and ever-increasing quantities of data can strain limited operational resources. And without appropriate focus, the risk of failing to identify areas of increasing threat multiplies. The challenge for agencies is further intensified by the need to adhere to mandatory legislative processes. To succeed in this environment, agencies need modern tools that can join and interpret huge quantities of data and that support alert generation, anomaly detection and AI techniques, all while supporting officers through the rigors of due process.
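To illustrate one piece of that toolchain, here is a minimal sketch of anomaly-based alert generation over records that have already been joined from several sources, using an isolation forest from scikit-learn. The column names, sample values and contamination rate are hypothetical.

```python
# Minimal sketch of anomaly-based alert generation over joined records,
# assuming the data has already been merged from multiple sources.
# Column names, values and the contamination rate are illustrative.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical joined dataset: one row per subject, features from several systems
records = pd.DataFrame({
    "wire_transfers_30d":     [2, 1, 0, 45, 3, 2, 1, 0],
    "night_border_crossings": [0, 0, 1, 6, 0, 1, 0, 0],
    "darkweb_mentions":       [0, 0, 0, 3, 0, 0, 1, 0],
})

detector = IsolationForest(contamination=0.1, random_state=0)
records["anomaly"] = detector.fit_predict(records)   # -1 = anomalous, 1 = normal

# Raise alerts only for the outliers, keeping the full record as an audit trail
alerts = records[records["anomaly"] == -1]
print(alerts)
```

Keeping the underlying records alongside each alert is one simple way the tooling can support the due-process obligations mentioned above.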
The reality is that AI is showing improved results. Intelligence analysts are testing and using deep learning, natural language processing and machine learning techniques in real-life law enforcement scenarios that can help change our world for the better. Although we haven't reached superhero status in our accomplishments (though it might have appeared so when Iron Man 3 was filmed at our own SAS headquarters), who knows what the future holds? Like anything truly worthy and good for humankind, there will always be pitfalls and challenges. But now is the time to seize the possibilities that lie ahead.
About the Author
Ellen Joyner-Roberson, CFE, is part of the Advisory Global Marketing, Fraud & Security Intelligence business unit at SAS, where she defines industry strategy and messaging for the global fraud and security markets in banking, insurance, health care and government. With more than 30 years of experience in information technology, she helps clients capitalize on the power of analytics to combat fraud and keep the public safe. This includes bringing greater awareness of how to apply machine learning and AI to detect evolving fraud tactics while realizing ROI in technology investments. In addition, she consults with clients to reduce fraud losses and mitigate risk across their enterprises. Joyner-Roberson graduated from Sweet Briar College with a degree in math and computer science.
Recommended reading
- What are chatbots? – Chatbots are a form of conversational AI designed to simplify human interaction with computers. Learn how chatbots are used in business and how they can be incorporated into analytics applications.
- A guide to machine learning algorithms and their applications – Do you know the difference between supervised and unsupervised learning? How about the difference between decision trees and forests? Or when to use a support vector algorithm? Get all the answers here.
- Small-time cheats and organized crime: Benefits fraud re-examined – To combat benefits fraud, better utilize resources and improve ROI, analytics can detect organized crime rings rather than just small-time fraudsters.
- Rethink customer due diligence – To streamline compliance and protect against financial and regulatory risk, re-examine your customer due diligence processes and technologies regularly. With new analytical tools, you can monitor customer transactions or personal information in real time, and accurately segment customers by the risk they represent.