If a company runs afoul of the law and racks up penalties for doing so, the answer should not be to throw out a good law.
Illinois has a good law in the Biometric Information Privacy Act, or BIPA. It is designed to prevent companies from quietly collecting personal information such as fingerprints, voice prints and retinal images and then sharing that information with third parties. Once private biometric data leaks through a breach, it can end up on the dark web, where criminals can get hold of it, and it can become another piece of the surveillance state.
BIPA doesn’t ask much. Companies just have to notify people that they are collecting the information and explain what they plan to do with it. Companies also have to get consent from the people who provide their biometric information.
Yet BIPA has been under attack almost since the day it was enacted, from companies that don’t want to bother with it.
In a current example, White Castle ran afoul of the law by selling biometric data it had collected from employees without telling them. Because penalties accrue with each violation, the company is looking at damages that could top $17 billion, although the Illinois Supreme Court has said the penalties are “discretionary rather than mandatory.”
Make sure businesses know the law
The point of the law is not to crush companies with burdensome penalties. But the answer should not be to toss out the law or weaken the penalties to the point that companies can safely ignore BIPA.
A state Senate working group has been talking with stakeholders about the issue. One proposal would add a provision to BIPA requiring companies that sell equipment that collects biometric information to notify their customers about the law. That makes sense.
Google, Facebook and TikTok have paid large sums in settlements under BIPA, which emphasizes the need for such a law. Ignoring privacy issues tied to biometric information makes life simpler for companies, but not for people whose biometric information is misused.
As artificial intelligence becomes more advanced, society will need stronger laws to protect people from criminals who can misuse their images and voices to impersonate them, perhaps in dangerous ways. Such “deepfakes” will be made easier by biometric information that leaks out onto the web.
The Legislature should be thinking about how to strengthen BIPA, not how to undermine it.