By Eric Kaufmann
Last week (Jan. 25), the Illinois Supreme Court issued a ruling on the state’s Biometric Act that reaffirmed companies’ obligations to disclose details about their biometric information collection efforts to the subjects of those efforts and to pay penalties when they don’t. The legal issue had been whether damages were due even if the person being scanned suffered no financial or other harm. The Supreme Court ruled that merely violating the state law was itself damage to the individual.
“Through the Act, our General Assembly has codified that individuals possess a right to privacy in and control over their biometric identifiers and biometric information. The Act vests in individuals and customers the right to control their biometric information by requiring notice before collection and giving them the power to say no by withholding consent,” the court ruled in the case of Stacy Rosenbach, as mother and next friend of Alexander Rosenbach, Appellant, v. Six Flags Entertainment Corporation et al., Appellees. “A person who suffers actual damages as the result of the violation of his or her rights would meet this definition of course, but sustaining such damages is not necessary to qualify as ‘aggrieved.’”
The message to businesses from the Court is: You are on the hook for damages for a violation of the statute and, oh, by the way, the complainant does not have to prove any damages. The statute directs businesses to “store, transmit, and protect from disclosure all biometric identifiers and biometric information using the reasonable standard of care within the private entity’s industry.”
The cleanest takeaway from this ruling is that enterprises need to take the Biometric Information Privacy Act very seriously. With that in mind, let’s look at what the Act says and what it means for enterprises. Not surprisingly, much of the legislation is written with little understanding of how biometric authentication works. *sigh*
First, what specific penalties are companies exposed to if they are found to be noncompliant? That’s a bit tricky.
The court ruled that “when a private entity fails to comply with one (emphasis added) of section 15’s requirements, that violation constitutes an invasion, impairment, or denial of the statutory rights of any person or customer whose biometric identifier or biometric information is subject to the breach. Consistent with the authority cited above, such a person or customer would clearly be ‘aggrieved’ within the meaning of section 20 of the Act (id. § 20) and entitled to seek recovery under that provision. No additional consequences need be pleaded or proved. The violation, in itself, is sufficient to support the individual’s or customer’s statutory cause of action.”
OK, that sounds fine and reasonable, but let’s take a look at how the original Act spells out those exposures to fines. Section 20 of the Illinois Biometric Information Privacy Act provides, in part:
“A prevailing party may recover for each violation:
(1) against a private entity that negligently violates a provision of this Act, liquidated damages of $1,000 or actual damages, whichever is greater;
(2) against a private entity that intentionally or recklessly violates a provision of this Act, liquidated damages of $5,000 or actual damages, whichever is greater;
(3) reasonable attorneys’ fees and costs, including expert witness fees and other litigation expenses; and
(4) other relief, including an injunction, as the State or federal court may deem appropriate.”
As I read this language from the opinion in conjunction with Section 20 of the Act, it is reasonable to conclude that each separate failure to comply with one of Section 15’s requirements would constitute a separate violation, subjecting the violator to damages. Conceivably, then, if a business violates five of Section 15’s requirements as to one individual, it would be exposed to potential liability of $5,000 ($1,000 x 5) for negligent violations and $25,000 ($5,000 x 5) for intentional/reckless violations, without any showing of actual damages. Likewise, if the business violated a single Section 15 requirement as to 1,000 individuals, the potential liability would be $1,000,000 or $5,000,000, again without any showing of actual damages.
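That back-of-the-envelope exposure math can be sketched in a few lines. (The per-violation counting below reflects this article’s reading of the opinion, not settled law; the dollar figures are the statutory liquidated damages.)

```python
# Statutory liquidated damages per violation under Section 20 of BIPA.
NEGLIGENT_DAMAGES = 1_000      # $1,000 per negligent violation
INTENTIONAL_DAMAGES = 5_000    # $5,000 per intentional/reckless violation

def potential_exposure(requirements_violated: int, individuals: int):
    """Assumes each Section 15 requirement breached, per individual,
    counts as a separate violation -- the reading discussed above."""
    violations = requirements_violated * individuals
    return (violations * NEGLIGENT_DAMAGES, violations * INTENTIONAL_DAMAGES)

# Five requirements violated as to one individual:
print(potential_exposure(5, 1))      # (5000, 25000)
# One requirement violated as to 1,000 individuals:
print(potential_exposure(1, 1000))   # (1000000, 5000000)
```

The numbers escalate quickly because the two multipliers compound: more requirements breached, or more individuals affected, each scale the total linearly.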
The million-dollar question for businesses: What constitutes a separate violation of the Act exposing the business to liability?
Now that we know that the financial exposure is potentially massive, let’s drill down into exactly what a company is supposed to do.
Let’s start with what the Act opted to exclude. On biometrics itself, it says “Biometric identifiers do not include writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color.”
Really? Eye color is excluded? As is hair color? That’s news to most facial-recognition systems, which often factor in eye color and hair color. Height is excluded? For a full-body scan used for authentication, that would certainly be an issue.
It’s also good to know that authentication systems leveraging handwriting recognition (“written signatures”) and simple ID cards with photographs on them are not covered either.
Then there are the kinds of businesses excluded. Hospital chains apparently need not worry about biometrically tracking patients without getting the state’s permission: “Biometric identifiers do not include information captured from a patient in a health care setting or information collected, used, or stored for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996.”
If that phrasing merely excluded data collected for medical tests, it would make sense, but by saying “information captured from a patient in a health care setting or”—emphasis placed on the “or”—it means that any biometric data collected in a healthcare setting is exempt. So healthcare execs wanting to identify patients biometrically: go for it. Just know that if you start using biometrics to identify visitors, staff or any other non-patients, your protection vanishes.
Anyone else excluded? Absolutely. “Nothing in this Act shall be deemed to apply in any manner to a financial institution or an affiliate of a financial institution that is subject to Title V of the federal Gramm-Leach-Bliley Act of 1999 and the rules promulgated thereunder.” That’s a rather massive exemption. So big banks can biometrically authenticate anyone at will; no permission from Illinois needed.
Of course, state legislators made sure to exclude themselves and the many Illinois state employees who enforce their edicts: “Nothing in this Act shall be construed to apply to a contractor, subcontractor, or agent of a State agency or local unit of government when working for that State agency or local unit of government.”
So what does it require for those of us left?
The Act doesn’t solely cover biometric data, opting to define confidential and sensitive information broadly. “Examples of confidential and sensitive information include, but are not limited to, a genetic marker, genetic testing information, a unique identifier number to locate an account or property, an account number, a PIN number, a pass code, a driver’s license number, or a Social Security number.”
The key requirement: “A private entity in possession of biometric identifiers or biometric information must develop a written policy, made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information when the initial purpose for collecting or obtaining such identifiers or information has been satisfied or within 3 years of the individual’s last interaction with the private entity, whichever occurs first. Absent a valid warrant or subpoena issued by a court of competent jurisdiction, a private entity in possession of biometric identifiers or biometric information must comply with its established retention schedule and destruction guidelines.”
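The retention rule quoted above reduces to a "whichever occurs first" comparison, which a written policy has to operationalize. A minimal sketch, assuming a hypothetical `destruction_deadline` helper (the Act does not define how to count "3 years"; 3 × 365 days is an assumption here):

```python
from datetime import date, timedelta
from typing import Optional

def destruction_deadline(purpose_satisfied: Optional[date],
                         last_interaction: date) -> date:
    """Destroy biometric data when the initial collection purpose is
    satisfied, or 3 years after the individual's last interaction,
    whichever occurs first. '3 years' approximated as 3 * 365 days."""
    three_years_out = last_interaction + timedelta(days=3 * 365)
    if purpose_satisfied is None:      # purpose still ongoing
        return three_years_out
    return min(purpose_satisfied, three_years_out)

# Purpose satisfied before the 3-year mark: destroy at purpose satisfaction.
print(destruction_deadline(date(2020, 6, 1), date(2019, 1, 15)))  # 2020-06-01
```

Even this toy version exposes the hard part: something has to decide, per individual, when the "initial purpose" has been satisfied, and that judgment is nowhere defined in the statute.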
The nature of biometric authentication, though, makes this a little tricky. There are two basic types of biometrics: those that are generally stable and identical for the long run, such as fingerprints and retina scans; and those that typically change over time, such as facial recognition, which is designed to recognize gradual changes in makeup, the shaving of a mustache or the replacement of glasses with contacts. In short, biometrics can learn and adapt to expected facial changes. (Note: Yes, fingerprints can get damaged in a hand accident, or the grooves can thin out due to some prescription drugs and some chemicals used for cleaning, but those are the exceptions. The same with material damage to the eye from, perhaps, a car accident. In general, it’s assumed that a fingerprint and a retina will remain the same. Not so with facial recognition.)
With changeable biometric authentication methods, does each updated view constitute a brand-new dataset, with the retention clock starting over? Or does retention run only from the initial scan? And how is the private entity defined? Let’s say the entity is a large multinational with many subsidiaries and independent operating units. Can they all share one database of biometric datasets for all employees, customers and contractors globally? Or does each independently operating unit have to run its own timetable?
Here’s another interesting section: “No private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person’s or a customer’s biometric identifier or biometric information. No private entity in possession of a biometric identifier or biometric information may disclose, redisclose, or otherwise disseminate a person’s or a customer’s biometric identifier or biometric information unless: the subject of the biometric identifier or biometric information or the subject’s legally authorized representative consents to the disclosure or redisclosure; the disclosure or redisclosure completes a financial transaction requested or authorized by the subject of the biometric identifier or the biometric information or the subject’s legally authorized representative; the disclosure or redisclosure is required by state or federal law or municipal ordinance; or the disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.”
How many enterprises share biometric data with biometric vendors that provide the technology and manage the system for them? You’ll note that that is not one of the listed exemptions. The Act exempts sharing that completes “a financial transaction requested or authorized by the subject of the biometric identifier,” but what if the authentication doesn’t immediately relate to a financial transaction? And what happens if you’re accused of improperly sharing biometric datasets with the very vendor you need to manage the process?
Then there’s the reference to “otherwise profit.” What if a program used biometric authentication to allow a customer access to a restricted area, which was specifically the case with Six Flags in this matter? Doesn’t that constitute a form of profit to the business? What if it’s used to help with marketing analysis? Doesn’t that potentially become a profit issue as well?
Next, we have an area that directly speaks to security concerns. “A private entity in possession of a biometric identifier or biometric information shall: store, transmit, and protect from disclosure all biometric identifiers and biometric information using the reasonable standard of care within the private entity’s industry; and store, transmit, and protect from disclosure all biometric identifiers and biometric information in a manner that is the same as or more protective than the manner in which the private entity stores, transmits, and protects other confidential and sensitive information.”
That sounds specific, but read it closely and it’s anything but. Biometrics is relatively new. What exactly is the “reasonable standard of care within” retail or manufacturing or the airline industry or petrochemical firms? What if most companies in that particular vertical are reckless and treat data poorly? (Yes, I am looking at you, Retail.) Will simply mimicking the vertical’s standard of care be sufficient here? If challenged, how does a company prove what a specific vertical’s standard of care for protecting biometric datasets is?
Then there’s the second part of that quote: “protect from disclosure all biometric identifiers and biometric information in a manner that is the same as or more protective than the manner in which the private entity stores, transmits, and protects other confidential and sensitive information.”
We know that “confidential and sensitive information” includes routine account numbers, which are often printed on bills and sometimes visible through the envelope window. That’s not necessarily a high standard. Ideally, the legislature would have specified a security standard for companies to work toward, rather than saying, in effect, that whatever you’re doing now for your account numbers is perfectly fine for biometric data. *Triple sigh*
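One way to read the “same as or more protective” clause is as a relative ordering over protection levels rather than an absolute bar. A minimal sketch of that reading (the control names and their ranking are illustrative assumptions, not statutory terms):

```python
# Illustrative ranking of data-protection controls, weakest to strongest.
# These tiers are assumptions for this sketch, not terms from the Act.
PROTECTION_RANK = {
    "plaintext": 0,
    "access_controlled": 1,
    "encrypted_at_rest": 2,
    "encrypted_and_audited": 3,
}

def biometric_storage_compliant(biometric_control: str,
                                sensitive_control: str) -> bool:
    """Biometric data must be protected at least as strongly as the
    entity's other confidential and sensitive information."""
    return PROTECTION_RANK[biometric_control] >= PROTECTION_RANK[sensitive_control]

print(biometric_storage_compliant("encrypted_at_rest", "access_controlled"))  # True
print(biometric_storage_compliant("access_controlled", "encrypted_at_rest"))  # False
```

The real comparison is qualitative, of course. The point is that the statute pegs the bar to whatever the entity already does for account numbers and PINs, so a company with weak baseline practices faces a correspondingly weak biometric requirement.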