AI-trends.today

ICE and CBP’s Face-Recognition App Can’t Really Confirm Who Folks Are

AI · By Gavin Wallace · 05/02/2026 · 4 Mins Read

The face-recognition app Mobile Fortify, now used by United States immigration agents in cities and towns across the US, is not designed to reliably identify people in the streets and was deployed without the scrutiny that has historically governed the rollout of technologies that affect people's privacy, according to records reviewed by WIRED.

The Department of Homeland Security launched Mobile Fortify in the spring of 2025 to "identify or verify" the identities of people stopped or detained by DHS officers during federal operations, records show. DHS explicitly linked the rollout to an executive order, signed by President Donald Trump on his first day in office, which called for a "total and efficient" crackdown on undocumented immigrants through the use of expedited removals, expanded detention, and funding pressure on states, among other tactics.

Despite DHS repeatedly framing Mobile Fortify as a tool for identifying people through facial recognition, however, the app does not actually "verify" the identities of people stopped by federal immigration agents, a well-known limitation of the technology and a function of how Mobile Fortify is designed and used.

"Every manufacturer of this technology, every police department with a policy makes very clear that face recognition technology is not capable of providing a positive identification, that it makes mistakes, and that it's only for generating leads," says Nathan Wessler, deputy director of the American Civil Liberties Union's Speech, Privacy, and Technology Project.
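The distinction Wessler draws, leads versus positive identification, reflects how such systems are generally built: a probe image is converted to an embedding vector and compared against a gallery, yielding a similarity-ranked candidate list rather than a yes-or-no answer. The sketch below is purely illustrative (the gallery, dimensions, and threshold are hypothetical assumptions, not Mobile Fortify's actual design):

```python
import numpy as np

def rank_candidates(probe, gallery, threshold=0.6):
    """Return gallery entries ranked by cosine similarity to the probe.

    Every result is a *lead* with a score, never a confirmed identity:
    several people can score above the threshold, and the true match
    may score below it, or be absent from the gallery entirely.
    """
    p = probe / np.linalg.norm(probe)
    scores = []
    for name, vec in gallery.items():
        v = vec / np.linalg.norm(vec)
        scores.append((name, float(p @ v)))
    scores.sort(key=lambda kv: kv[1], reverse=True)
    return [(name, s) for name, s in scores if s >= threshold]

# Hypothetical 4-dim embeddings; real systems use hundreds of dimensions.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=4) for i in range(5)}
probe = gallery["person_2"] + rng.normal(scale=0.3, size=4)  # noisy street capture
leads = rank_candidates(probe, gallery)
```

A noisier capture (a subject handcuffed and looking down, say) lowers every score and reshuffles the ranking, which is why two photos of the same person can return different top candidates.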

Records reviewed by WIRED also show that DHS's hasty approval of Fortify last May was enabled by dismantling centralized privacy reviews and quietly removing department-wide limits on facial recognition, changes overseen by a former Heritage Foundation lawyer and Project 2025 contributor who now serves in a senior DHS privacy role.

DHS, which has declined to detail the methods and tools that agents are using despite repeated calls from oversight officials and nonprofit privacy watchdogs, has used Mobile Fortify to scan the faces not only of "targeted individuals" but also of people later confirmed to be US citizens and others who were observing or protesting enforcement activity.

Reporting has documented federal agents telling residents they were being recorded with facial recognition and that their faces would be added to a database without consent. Other accounts describe agents treating accent, perceived ethnicity, or skin color as a basis to escalate encounters, then using face scanning as the next step once a stop is underway. Together, the cases illustrate a broader shift in DHS enforcement toward low-level street encounters followed by biometric capture like face scans, with limited transparency around the tool's operation and use.

Fortify's technology mobilizes facial capture hundreds of miles from the US border, allowing DHS to generate nonconsensual face prints of people who, "it is conceivable," DHS's Privacy Office says, are "US citizens or lawful permanent residents." As with the circumstances surrounding its deployment to agents with Customs and Border Protection and Immigration and Customs Enforcement, Fortify's functionality is visible today primarily through court filings and sworn agent testimony.

In a federal lawsuit this month, attorneys for the State of Illinois and the City of Chicago said the app had been used "in the field over 100,000 times" since launch.

In Oregon testimony last year, an agent said two photos of a woman in custody, taken with his face-recognition app, produced different identities. The woman was handcuffed and looking downward, the agent said, prompting him to physically reposition her to obtain the first photo. The motion, he testified, caused her to yelp in pain. The app returned a name and photograph of a woman named Maria, a match the agent rated "a maybe."

Agents called out the name, "Maria, Maria," to gauge her response. When she did not respond, they took another photo. The agent testified the second result was "possible," but added, "I don't know." Asked what supported probable cause, the agent cited the woman speaking Spanish, her presence with others who appeared to be noncitizens, and a "possible match" via facial recognition. The agent testified that the app did not indicate how confident the system was in a match. "It's just an image, your honor. You have to look at the eyes and the nose and the mouth and the lips."

Tags: artificial intelligence, crime, department of homeland security, face recognition, immigration, immigration and customs enforcement, police, privacy, surveillance