Increasing the Probability of AI Platform Success: Top Challenges that can Lead to AI Deployment Failures

Online webinar presented by Matthew Lucas, PhD Computer Science, TeleStrategies ISS World

Free for law enforcement, the government intelligence community, private-enterprise cybersecurity managers, and ISS vendors. Pre-registration with your government- or corporate-issued email address is required.
Over the past several years, ISS vendors have developed AI-based product solutions in support of law enforcement investigations and government intelligence-gathering operations. But before these agencies invest at scale, vendors and agencies must work together to address the probability of success of their AI platforms, that is, their accuracy and their tendency to hallucinate. Why? Because in both science and business management, measurement has long been recognized as critical.
Government and law enforcement agencies (i.e., the AI customers) are realizing that AI-based solutions are probabilistic: no outcome is 100% correct all the time. Some early vendor benchmarks suggest that accuracy of 80% and above is great, whereas 30% or lower is too low to be useful. Law enforcement and intelligence analysts must learn to deal with these accuracy limitations. Likewise, agencies must understand that poor AI training data can lead to bad results. For example, early AI-based facial recognition platforms were rejected by law enforcement because their success rate at recognizing women of color was less than 34%. Why? The training data used to develop those platforms consisted entirely of Caucasian faces. Similarly, last week the Washington Post published a head-to-head comparison piece titled “Which AI Bots Reads Best?” in which none of the models tested achieved better than 70% accuracy.
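As a concrete illustration of the measurement point above, here is a minimal sketch, using entirely hypothetical data and group labels, of how an aggregate accuracy number can mask subgroup failures of the kind the facial recognition example describes:

```python
# Minimal sketch with entirely hypothetical data: why an aggregate accuracy
# number can hide subgroup failures like the facial-recognition case above.
from collections import defaultdict

# Each record pairs a (hypothetical) demographic group with whether the
# platform's prediction was correct for that subject.
results = (
    [("group_a", True)] * 9 + [("group_a", False)] * 1 +
    [("group_b", True)] * 1 + [("group_b", False)] * 3
)

def accuracy(records):
    """Fraction of correct outcomes; a probabilistic system never reaches 100%."""
    return sum(correct for _, correct in records) / len(records)

# Aggregate accuracy looks tolerable (about 71% here)...
print(f"overall: {accuracy(results):.0%}")

# ...but breaking the same results out per group exposes where the
# training data fell short (group_b scores only 25%).
per_group = defaultdict(list)
for group, correct in results:
    per_group[group].append((group, correct))
for group, records in per_group.items():
    print(f"{group}: {accuracy(records):.0%}")
```

The same breakdown applies to any benchmark an agency runs against a vendor platform: reporting accuracy per category, not only overall, is what surfaces training-data gaps before deployment.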
This webinar addresses the accuracy of AI platforms and how to increase the probability of success, even where it is not yet directly measurable. Specific topics include:

Presented by: Matthew Lucas, PhD Computer Science, TeleStrategies

All Contents Copyright © 2025