Biometrics, AI and the evolution of US privacy laws: reflections from Europe

Facial recognition - privacy laws and regulation


Simon Randall, CEO of Secure Redact, powered by Pimloc, examines digital privacy laws in the US and how lessons can be learned from Europe.

Surveillance and security

The power of security and surveillance technology in the US has accelerated in the last year.

The benefits of efficiency, accuracy and enhanced capabilities are significant but such rapid advancements, underpinned by AI, have fueled data privacy concerns.

Momentum for federal data regulation has grown alongside an expansion of state-specific privacy laws – 34% of states now have specific data privacy laws in place or on the way.

Organizations must manage unique and complex compliance challenges with proper processes and strategies to harness modern security technologies while protecting civil liberties.

This is a fast-moving and contentious issue at the heart of American freedoms, US security and international relationships.

Biometrics and AI for security and operations

Biometric and AI technologies are used across the US to improve security and operations.

Today, biometric authentication is almost ubiquitous and the early adoption of facial recognition technology (FRT) has spread and matured across industries including public security, education, healthcare and public services. 

Examples of uses in security include multimodal authentication, which combines biometrics – from voice analysis to gait recognition – for individualized security measures.

Spoofing attacks can be thwarted with liveness detection, and behavioral biometrics, where a security system picks up a user’s typing patterns and mouse movements to match them to an individual profile, is a growing area of enhanced user authentication.
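The keystroke-dynamics idea can be sketched in a few lines. This is a toy illustration only – the profile format, scoring method and function names are hypothetical, not any vendor's API: enroll a user's average inter-keystroke timings, then score a new session by how far its rhythm deviates from that profile.

```python
from statistics import mean

def enroll(samples):
    """Build a simple profile: mean inter-keystroke interval (ms) per key pair,
    averaged over several enrollment typing sessions."""
    return {pair: mean(intervals) for pair, intervals in samples.items()}

def match_score(profile, observed):
    """Average relative deviation between an observed session and the profile.
    Lower scores mean the typing rhythm is closer to the enrolled user."""
    deviations = [
        abs(observed[pair] - expected) / expected
        for pair, expected in profile.items()
        if pair in observed
    ]
    return mean(deviations) if deviations else 1.0

# Enrolled rhythm for two key pairs, from three sessions (intervals in ms)
profile = enroll({("t", "h"): [110, 120, 115], ("h", "e"): [90, 95, 100]})

genuine = {("t", "h"): 118, ("h", "e"): 93}    # close to the enrolled rhythm
imposter = {("t", "h"): 210, ("h", "e"): 160}  # a much slower, different rhythm
```

A real system would track many more features (hold times, mouse trajectories) and use statistical or learned models rather than a single average, but the principle is the same: authenticate the behavior, not just the credential.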

These technologies are highly effective and innovation has increased rapidly with the development of closed and open-source foundational models targeting artificial general intelligence (AGI).

While the jury is out on when we may reach AGI, the narrow application of these technologies into security solutions is enabling more powerful, adaptable and affordable technology.

However, the risks associated with biometric data have exponentially increased.

In a world where faces, voices and personally identifiable images are captured in large-scale data sets, generative AI (gen AI) has lowered the barrier to entry for bad actors.

While cyber-attacks and data breaches are on the rise, almost anyone can use gen AI to create ‘deepfake’ synthetic images and videos that threaten to compromise biometric security systems and enable fraud and other exploitation of sensitive biometric data.

Video surveillance for law enforcement

In the public sector, police departments have increasingly adopted biometric and AI-powered surveillance measures.

Drones are revolutionizing operations in global warfare and are now being deployed for everyday use by first responders and for routine monitoring – with the potential to provide invaluable live and post-event video.

At least 1,400 police departments across the US are using drones today, according to the American Civil Liberties Union (ACLU).

In 2023, Oregon’s police conducted 721 drone flights across 230 incidents, including search and rescue missions, traffic accidents, crime scenes and search warrants.

In San Francisco, March 2024 saw drones integrated with facial recognition for criminal investigations and pursuits.

In January 2024, in Grayson County, Kentucky, police used a drone to locate a suspect in a domestic abuse case.

The same trend is reflected in the UK, where the government is committed to expanding facial recognition, with £55.5 million ring-fenced for its rollout across the country over the next four years.

As the UK government website says, this will include at least £4 million for bespoke mobile units that can be deployed in crowded areas to identify people wanted by the police. 

The US military has greatly expanded its use of FRT drone capabilities, with the Air Force signing a contract to deploy advanced AI technology from US-based public company RealNetworks for ‘intelligence, surveillance, and target acquisition.’

Growing controversy 

As law enforcement and public institutions increasingly deploy advanced technology, scrutiny and privacy concerns have grown.

The RealNetworks contract has been one of the most controversial developments this year, eliciting warnings from civil advocates and sparking widespread concern for privacy in the face of a ‘surveillance society’.

The ACLU website describes the issues: “without proper regulation, drones equipped with facial recognition software, infrared technology, and speakers capable of monitoring personal conversations would cause unprecedented invasions of our privacy rights.

“Interconnected drones could enable mass tracking of vehicles and people in wide areas.

“Tiny drones could go completely unnoticed while peering into the window of a home or place of worship.” 

The ACLU has warned of insufficient oversight and the need for transparent regulations that balance privacy interests with the advantages of aerial surveillance.

The concern for loss of privacy is echoed in corporate data privacy practices.

A 2023 study by the Pew Research Center shows how personal data privacy fears have captured US citizens.

Over nine in ten US adults feel they’ve lost control over how their personal information is used; eight in ten express unease about third-party access to their social media data; and more than seven in ten Americans advocate for increased regulation of technology and governance.

A tightening of data compliance rules across the US

Fierce debate around these subjects has given rise to significant levels of US state legislation and united calls for enhanced privacy protection across public and private spaces. 

There is already a patchwork of US federal mandates with privacy overlap by sector.

The Family Educational Rights and Privacy Act (FERPA) protects student data held by educational establishments like schools, and its requirements overlap with the Children’s Online Privacy Protection Act (COPPA), which covers online services directed at children under 13 years old.

The Health Insurance Portability and Accountability Act (HIPAA), the strictest of the three, protects the privacy and security of individually identifiable patient information. 

Because it protects individual privacy, information covered by FERPA, COPPA and HIPAA is exempt from the Freedom of Information Act (FOIA).

FOIA grants any person the right to request access to federal agency records and these records must be disclosed unless they fall under specific exempt categories.

On top of the complexity of these sector-specific regulations and privacy laws, the US is seeing a clear trend toward stricter compliance and measures for enhanced data protection concerning the handling of AI and biometric data.

But it’s still a lottery based on which state you live in.

This is best demonstrated by Illinois’ Biometric Information Privacy Act (BIPA).

Created in 2008, the BIPA has been the subject of extensive legal battles and significant penalties.

The first jury trial in a BIPA class action lawsuit ended last summer with $228 million in damages awarded against a transport business for requiring truck drivers to scan their fingerprints when entering the company yard. 

The BIPA has been so notoriously strict that in May this year, legislators approved amendments that count repeated scans of the same biometric as a single violation, rather than one violation per scan.

Supporters of the changes want to protect small enterprises from crippling fines, but we may see some backlash from privacy advocates.

Texas and Washington have also enacted biometric privacy laws, while New York and Maryland have pending bills that would join Illinois in allowing a private right of action if passed.

In the absence of a comprehensive federal law, 17 US states have enacted comprehensive consumer data privacy laws that grant rights to individuals regarding the collection, use and disclosure of their personal data by businesses.

This fragmented landscape creates complexity and uncertainty for cross-state enterprises, as they grapple with diverse compliance requirements and legal obligations.

Organizations must stay ahead of evolving data compliance, adopting robust strategies and technologies to navigate the complexities of privacy laws.

This entails leveraging advanced encryption technologies, implementing stringent access controls and deploying AI-driven data redaction techniques to fortify defenses against potential security breaches and data intrusions.
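The redaction step itself can be surprisingly simple once something has been detected. As a minimal sketch – assuming the regions to redact come from an upstream face or plate detector, and with an illustrative frame format rather than any real video API – pixelating a region averages away the detail that would identify a person:

```python
def redact_region(frame, box, block=2):
    """Pixelate box=(top, left, bottom, right) in a 2D grayscale frame,
    replacing each block x block tile with its average so the original
    detail inside the region cannot be recovered."""
    top, left, bottom, right = box
    for y in range(top, bottom, block):
        for x in range(left, right, block):
            tile = [frame[j][i]
                    for j in range(y, min(y + block, bottom))
                    for i in range(x, min(x + block, right))]
            avg = sum(tile) // len(tile)
            for j in range(y, min(y + block, bottom)):
                for i in range(x, min(x + block, right)):
                    frame[j][i] = avg
    return frame

# A tiny 4x4 "frame"; redact the top-left 2x2 region
frame = [[1, 3, 9, 9],
         [5, 7, 9, 9],
         [9, 9, 9, 9],
         [9, 9, 9, 9]]
redact_region(frame, (0, 0, 2, 2))
```

In production this runs per frame against detections from an ML model, and the block size controls how aggressively identifying detail is removed; the key design point is that the redaction is destructive, unlike reversible blurs that can sometimes be undone.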

Fostering a culture of privacy awareness and accountability within organizations is also paramount to creating trust among consumers and stakeholders, enhancing brand reputation and mitigating reputational risks.

A federal way forward: the American Privacy Rights Act

The recent proposal of the American Privacy Rights Act (APRA) marks a significant milestone in the trajectory of US data privacy legislation.

The APRA builds on previous legislative efforts, such as the American Data Privacy and Protection Act (ADPPA), by creating new privacy rights and protections for all Americans and preempting most state privacy laws to establish a uniform national standard.

It covers very wide categories of data, including health, biometric and online behavioral data, with multiple methods of enforcement via federal and state governments and a private right of action.

It would require organizations to keep robust data security practices, such as designating privacy or data security officers to submit annual certifications to the Federal Trade Commission (FTC).

However, concerns have arisen that it doesn’t go far enough.

Particular opposition has come from California, a state which sees the APRA as a slackening of its high data privacy standards and compliance requirements.

In a letter sent to Washington lawmakers in April this year, the California Privacy Protection Agency (CPPA) argued that the APRA would grant “compliance safe harbors to businesses,” and “lock the country into a standard that stymies California’s rulemaking innovation.”

Even if this year’s presidential election and ongoing criticisms of the APRA slow the passage of federal-level regulation and privacy laws, organizations must prepare to quickly adhere to both existing state legislation, such as the CCPA, and new federal privacy laws.

Companies must also understand how the AI Bill of Rights impacts their businesses and follow its guidance surrounding data privacy, system evaluation and monitoring.

This White House framework – reinforced by the executive order on AI of 30 October 2023 – encourages more transparent, trustworthy and equitable AI systems that protect civil rights and prevent algorithmic discrimination or privacy law violations.

Alignment of privacy laws at home and abroad

If enacted, APRA would usher in a new era of digital privacy laws and regulation in the US, establishing an equivalent to the European Union’s General Data Protection Regulation (GDPR) – a high bar in privacy protection for individuals.

This alignment of standards would bring benefits to both sides of the Atlantic.

Building on the current EU-US Data Privacy Framework, a higher level of compliance on the US side could help to further open up cross-Atlantic data flow.

According to AmCham EU, which represents American business in Europe, US-EU data flows make up half of the US’ total global data flow.

The US Chamber of Commerce estimated the secure and smooth data transfer relationship between the two is worth over $7 trillion.

Beyond obvious economic benefits, the stakes for Western global competitiveness, innovation and security rely on it.

Implementing new privacy laws and regulations takes time.

With no case law yet, sharp regulatory scrutiny ahead, and public empowerment and awareness of privacy rights still developing, the APRA has a way to go.

In Europe, by contrast, the market for privacy-enhancing technologies has matured since the GDPR’s introduction, offering more options for businesses to protect privacy while taking advantage of the latest technology.

The development of biometrics, AI, security and data privacy laws and regulation in the US shows no sign of slowing down.

It will continue to be fascinating to follow, and organizations across the public and private sectors must keep up with this fast-moving and complex environment.

Matching innovation with privacy laws and data protection is a commercial, moral and legislative obligation that cannot be ignored.

About the author

Simon Randall is the Co-Founder and CEO of Secure Redact, powered by Pimloc, a global privacy and security company specializing in anonymization technology for visual data.

Simon has spent decades working in the tech and security sectors and advocates for greater legislation and education on all things data.