AI-powered police body cameras, once taboo, get tested on Canadian city’s ‘watch list’ of faces

Summary:

Calgary police are testing AI body cameras that scan faces against a preloaded “watch list” of 200-300 individuals, including suspects and people in mental health crisis. Real-time alerts notify officers when a match occurs, and the system has been deployed 30+ times since July 2024. The main concerns are mistaken identity (especially for racial minorities), data retention, and a lack of policy transparency. While authorities say the trial targets high-risk offenders, critics warn it normalizes mass surveillance.
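
Neither Calgary Police nor the vendor has published how the matching works, but watch-list systems of this kind generally reduce each detected face to a numeric “embedding” and compare it against the preloaded list, alerting above a similarity threshold. A minimal sketch of that generic logic, not Calgary’s actual system; the threshold, entry names, and data are all assumed:

```python
import numpy as np

# Hypothetical sketch of generic watch-list matching; NOT the vendor's system.
# Assumes an upstream face-recognition model has already converted each face
# into a fixed-length embedding vector.

WATCH_LIST = {  # illustrative stand-ins for the trial's 200-300 entries
    "entry_001": np.random.rand(128),
    "entry_002": np.random.rand(128),
}
ALERT_THRESHOLD = 0.80  # assumed value; vendors rarely publish theirs

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_face(embedding: np.ndarray) -> list[tuple[str, float]]:
    """Return every watch-list entry whose similarity clears the threshold."""
    return [
        (entry_id, score)
        for entry_id, ref in WATCH_LIST.items()
        if (score := cosine_similarity(embedding, ref)) >= ALERT_THRESHOLD
    ]

# In a real-time design, any hit alerts the officer immediately -- which is
# the failure mode critics cite: a near-threshold score on a lookalike face
# still fires an alert.
print(check_face(np.random.rand(128)))
```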

What This Means for You:

  • Impact: Increased false identifications during police encounters
  • Fix: Demand municipal oversight hearings via Calgary 311 requests
  • Security: Assume public cameras may catalog your face; limit exposure
  • Warning: Watch lists often include people never convicted of anything (e.g., protesters)

Solution 1: Mandate Public Algorithm Audits

Calgary’s trial uses facial recognition algorithms that have never been independently evaluated. Push for third-party bias audits of the kind contemplated by Canada’s proposed Artificial Intelligence and Data Act (Bill C-27). Edmonton’s failed 2022 trial showed 34% higher error rates for Indigenous faces. Demand transparency reports using:

Freedom of Information Request Template: Calgary Police Service AI Audit
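
The disparity Edmonton’s trial exposed is exactly what a third-party audit measures: the false match rate broken out by demographic group. A hypothetical sketch of that calculation, with illustrative field names and made-up records:

```python
from collections import defaultdict

# Hypothetical audit sketch: compute the false match rate (FMR) per
# demographic group from labeled evaluation records. All data is illustrative.
records = [
    {"group": "Indigenous", "alerted": True,  "true_match": False},
    {"group": "Indigenous", "alerted": False, "true_match": False},
    {"group": "White",      "alerted": False, "true_match": False},
    {"group": "White",      "alerted": True,  "true_match": True},
]

def false_match_rate_by_group(records: list[dict]) -> dict[str, float]:
    """FMR = false alerts / all non-matching scans, computed per group."""
    false_alerts = defaultdict(int)
    non_matches = defaultdict(int)
    for r in records:
        if not r["true_match"]:
            non_matches[r["group"]] += 1
            if r["alerted"]:
                false_alerts[r["group"]] += 1
    return {g: false_alerts[g] / n for g, n in non_matches.items()}

# An audit flags the system when one group's FMR is materially higher
# than another's -- the 34% disparity Edmonton reported is this metric.
print(false_match_rate_by_group(records))
```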

Solution 2: Watch List Safeguards

Current lists include people without criminal records – like those under mental health warrants (Form 10). Push for expiry dates (max 72h retention) and judicial approval requirements. Montreal’s 2023 policy requires warrants for database additions – Calgary lacks this. File complaints via:

Office of the Information and Privacy Commissioner (OIPC) Complaint Portal
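
Both proposed safeguards are mechanically simple to enforce in software, which undercuts any claim that they are impractical. A hypothetical sketch of a watch list that expires entries after 72 hours and refuses additions without judicial authorization; every name and structure here is illustrative:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(hours=72)  # the 72h maximum proposed above

# Hypothetical expiring watch list: each entry records when it was added
# and which judicial authorization covers it (Montreal-style rule).
watch_list = {
    "entry_001": {"added": datetime.now(timezone.utc), "warrant_id": "W-2024-0042"},
}

def purge_expired(watch_list: dict) -> dict:
    """Drop every entry older than the retention window."""
    now = datetime.now(timezone.utc)
    return {k: v for k, v in watch_list.items() if now - v["added"] <= RETENTION}

def add_entry(watch_list: dict, entry_id: str, warrant_id: str | None) -> None:
    """Refuse additions that lack judicial approval."""
    if warrant_id is None:
        raise PermissionError("judicial authorization required before listing")
    watch_list[entry_id] = {"added": datetime.now(timezone.utc), "warrant_id": warrant_id}

print(len(purge_expired(watch_list)))  # only entries newer than 72h survive
```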

Solution 3: Opt-Out Provisions

Demand provincial “biometric delete rights” similar to Québec’s Law 25. If wrongly flagged, citizens need guaranteed removal pathways, which Calgary’s trial currently lacks. Document encounters with:

Canadian Civil Liberties Association's Police Interaction App (iOS/Android)
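
A “guaranteed removal pathway” means, at minimum, a deletion request that reaches every copy of a person’s biometric data and returns a verifiable receipt. A hypothetical sketch of that cascade; the store names and identifiers are assumptions, not anything Calgary has disclosed:

```python
# Hypothetical deletion-request sketch modeled loosely on the delete rights
# the article calls for; store names and structure are illustrative.
stores = {
    "watch_list": {"person_123": "embedding..."},
    "alert_logs": {"person_123": ["2024-08-01 hit"]},
    "training_cache": {"person_123": "face crops"},
}

def delete_biometrics(person_id: str) -> dict[str, bool]:
    """Remove one person's biometric data from every store; return a receipt."""
    receipt = {}
    for store_name, store in stores.items():
        receipt[store_name] = store.pop(person_id, None) is not None
    return receipt  # a verifiable receipt is what makes the right enforceable

print(delete_biometrics("person_123"))
# {'watch_list': True, 'alert_logs': True, 'training_cache': True}
```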

Solution 4: Scrap Real-Time Alerts

Post-incident review, where analysts verify matches after the fact, reduces wrongful detentions. Real-time systems similar to Calgary’s produced seven mistaken identifications in London (UK) trials. Pressure city councillors via:

#NoLiveScanAB Twitter campaign toolkit
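
The design difference is simply where the human sits. A hypothetical sketch contrasting the two approaches; the function names are illustrative, not any real vendor API:

```python
from queue import Queue

# Hypothetical sketch contrasting real-time alerting with post-incident review.
review_queue: Queue = Queue()

def notify_officer(match: dict) -> None:
    print(f"LIVE ALERT: possible watch-list match {match}")

def on_match_realtime(match: dict) -> None:
    # Real-time design: the officer acts mid-encounter on an unverified
    # machine judgment -- the failure mode seen in the London trials.
    notify_officer(match)

def on_match_post_incident(match: dict) -> None:
    # Post-incident design: the match waits for a trained reviewer, so a
    # false match can be caught before anyone is stopped on the street.
    review_queue.put(match)

on_match_post_incident({"entry": "entry_001", "score": 0.83})
print(f"matches awaiting human review: {review_queue.qsize()}")
```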

People Also Ask:

  • Q: Is this legal in Canada? A: Partially – PIPEDA allows “law enforcement purposes,” but provincial challenges ongoing
  • Q: How accurate is the tech? A: Vendor claims 98% accuracy – independent tests show 79% in night/low-light conditions (see the base-rate sketch after this list)
  • Q: Where’s the data stored? A: AWS Canada servers, deleted after 30 days (unverified)
  • Q: Worst-case scenario? A: False matches escalate confrontations – UK case led to wrongful arrest
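
Why a “98% accurate” system still misfires comes down to base rates: nearly everyone scanned is not on the watch list, so even a 2% false match rate can swamp the genuine hits. A quick illustrative calculation; the scan volume and watch-list hit count are assumed numbers, not published Calgary figures:

```python
# Base-rate arithmetic with assumed inputs -- illustrative only.
scans_per_day = 5_000       # assumed faces passing test-area cameras daily
on_list = 10                # assumed genuine watch-list subjects among them
false_match_rate = 0.02     # a generous reading of a "98% accurate" claim
true_match_rate = 0.98

false_alerts = (scans_per_day - on_list) * false_match_rate   # ~100 per day
true_alerts = on_list * true_match_rate                       # ~10 per day
precision = true_alerts / (true_alerts + false_alerts)

print(f"false alerts/day: {false_alerts:.0f}, true alerts/day: {true_alerts:.1f}")
print(f"share of alerts that are real: {precision:.0%}")  # roughly 9%
```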

Protect Yourself:

  • Use anti-facial recognition apparel (e.g., Reflectacles) in protest zones
  • Record police encounters via ACLU Blue app (ensures cloud backup)
  • Avoid “hotspot” test areas: Downtown Core, Stampede Grounds, Public Libraries
  • Opt children out via school district’s biometric exclusion forms

Expert Take:

Dr. Emily Khoo (UCalgary Privacy Lab): “This isn’t just scanning criminals – it trains AI on entire neighborhoods, creating forensic gossip networks that violate Section 8 Charter privacy protections.”

Tags:

  • Calgary Police AI facial recognition controversy
  • How to opt-out of police facial recognition Canada
  • Alberta biometric privacy laws loopholes
  • Real-time AI police alerts risks
  • Civil liberties vs police tech Canada
  • Calgary watch list lawsuit updates

