We knew it was coming, and now we’ve received two AI directives to dissect in less than 30 days! With the release of the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence and the California Privacy Protection Agency (CPPA) draft Automated Decisionmaking Technology Regulations, the onslaught of AI-related governance and compliance rules is upon us.
The CPPA’s release is much more prescriptive than the Executive Order, which called upon agencies and government entities to develop parameters for ethical AI use. The CPPA’s draft regulations are broken down into three critical categories. Here’s how they compare to the content of Biden’s Executive Order.
As outlined in the CPPA draft regulations, a company using automated decision-making technology must give consumers a “Pre-use Notice.” This notice should explain how the business uses AI technology and inform consumers that they have the right to choose not to participate or to request information about how the technology is used. Following that notice, consumers must be given the option to opt out of all automated decision-making practices. It’s a much more comprehensive opt-out than in past privacy legislation.
The Executive Order on AI doesn’t spell out an opt-out requirement or notice obligations; however, it states throughout that committees will be formed to address these issues and calls upon Congress to develop bipartisan data privacy legislation that treats the privacy of American citizens’ data as a top priority.
What does this mean for your organization? Truyo President Dan Clarke weighs in on what is arguably the most important, yet most complicated, section of this and any future legislation governing the use of AI. Clarke says, “In the privacy community, opt-in and opt-out signals are old hat, but the CPPA’s pre-use notice is significant and poses potential complications for organizations. How do you get this notice to your consumers before automated decision-making begins?”
Clarke goes on to say, “First and foremost, you need to understand where and how AI is being used. It may sound self-serving since that’s exactly what Truyo’s AI Governance Platform does, but identification of AI usage is vital. To offer a notice and an opt-out to your consumers, you have to know exactly what they’re opting out of in order to meet requirements compliantly. Without that knowledge, you are flying blind into a governance nightmare. Regarding opt-out, you absolutely can and absolutely should put anyone submitting a request through identity verification, similar to a right-to-know or right-to-delete request under CPRA.”
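To make the identification and verification steps concrete, here is a minimal Python sketch assuming a simple in-house inventory of automated decision-making uses. The names (AIUseCase, OptOutRequest, handle_opt_out) are hypothetical illustrations, not part of the draft regulations or any Truyo product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIUseCase:
    """One inventoried use of automated decision-making technology."""
    use_case_id: str
    description: str              # what the technology decides and how it is used
    covered_by_pre_use_notice: bool = False

@dataclass
class OptOutRequest:
    consumer_id: str
    use_case_ids: list[str]       # the inventoried uses the consumer is opting out of
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def handle_opt_out(request: OptOutRequest,
                   inventory: dict[str, AIUseCase],
                   identity_verified: bool) -> list[str]:
    """Honor an opt-out only after identity verification, much like a
    right-to-know or right-to-delete request under CPRA."""
    if not identity_verified:
        raise PermissionError("Identity not verified; do not process the opt-out.")
    honored = []
    for use_case_id in request.use_case_ids:
        if use_case_id not in inventory:
            continue  # you can't opt someone out of a use you haven't identified
        honored.append(use_case_id)  # record the opt-out against this specific use
    return honored
```

In this sketch, the opt-out is only honored against uses that have actually been inventoried, which reflects Clarke’s point that identification has to come first.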
Consumer rights are top of mind for the CPPA, as indicated by the previously released operating rules under CPRA, and those same considerations continue to be apparent in the Automated Decisionmaking Technology draft regulations. The proposed regulations outline the right of consumers to request access to information about the business’s use of automated decision-making. Much like a Data Subject Access Request, organizations will have to accept incoming requests and act on them in a timely manner, a window that has yet to be defined.
Again, the Executive Order does not spell out what rights consumers will have, but it is evident that the legislation Biden called for would include rights afforded to American citizens to foster the transparent and ethical use of automated decision-making technology. From the Executive Order: “To better protect Americans’ privacy, including from the risks posed by AI, the President calls on Congress to pass bipartisan data privacy legislation to protect all Americans, especially kids, and directs the following actions…” Without recommending specific and detailed compliance requirements, Biden signaled that those would be forthcoming in the form of federal legislation and actions by the committees created by the Order.
Article 6 of the CPPA draft regulations outlines special parameters for consumers under 16 years of age, requiring opt-in consent rather than an opt-out. For those under 13, businesses using profiling for behavioral advertising must establish a reasonable method for a parent or guardian to opt in on behalf of their minor. Even where parental consent already exists under COPPA, this additional consent for profiling is required.
Biden’s deliberate mention of kids, or minors as they are typically designated in privacy legislation, in the Executive Order foreshadows specific callouts in future federal legislation aimed at stricter regulation and protection of younger citizens’ data.
Exemptions are less significant than they first appear. It may seem that several exemptions are included, such as those allowing organizations to forgo the opt-out mechanism, but the available exceptions are few and far between and don’t give most businesses a use case for avoiding the opt-out. Even government agencies will likely fall within the scope of the CPPA rules.
We expect further clarification from the CPPA on what qualifies as an acceptable opt-out mechanism. Truyo President Dan Clarke says, “I think iterations of these regulations from the CPPA are likely to strengthen the argument that this opt-out must be specific to automated decision-making and that you can’t simply employ an umbrella opt-out. A consumer complaint mechanism may also be released, giving consumers an option to submit grievances to the CPPA for consideration.”
There is a mention of a 15-day timeline for the opt-out, but we may see this revised for clarity, potentially requiring the opt-out to be processed immediately upon first interaction. Clarke says, “Technically speaking, it makes sense for the opt-out to be processed immediately. You can’t retain data once someone opts out, but how do you operationalize that? It’s a huge burden on companies looking to comply with these proposed regulations.”
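As a rough illustration of the operational burden Clarke describes, the sketch below uses hypothetical names and simple in-memory stores, and assumes immediate processing is ultimately required: it gates each automated decision on the consumer’s current opt-out status and stops retaining their data the moment they opt out.

```python
from typing import Optional

# Hypothetical in-memory stores; in practice these would live in your
# consent-management platform and your data/feature stores.
OPTED_OUT: set[str] = set()
RETAINED_PROFILES: dict[str, dict] = {}

def record_opt_out(consumer_id: str) -> None:
    """Treat the opt-out as effective immediately, on first interaction."""
    OPTED_OUT.add(consumer_id)
    # Stop retaining data used for automated decision-making about this consumer.
    RETAINED_PROFILES.pop(consumer_id, None)

def run_automated_decision(consumer_id: str, features: dict) -> Optional[dict]:
    """Gate every automated decision on the consumer's current opt-out status."""
    if consumer_id in OPTED_OUT:
        return None  # route to a human or other non-automated process instead
    RETAINED_PROFILES[consumer_id] = features
    # ... model scoring would happen here ...
    return {"consumer_id": consumer_id, "decision": "placeholder"}
```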
In analyzing what this means for American companies, Dan Clarke had this to say: “Maybe the biggest takeaway for a business owner is that you may have to require your entire organization to carefully disclose the purpose, usage, and methodology with which it’s leveraging AI, especially in automated decision-making. This is no easy task, but under these proposed regulations and other current laws, there’s no way around it. You might, and probably should, create a ROPA-like document trail demonstrating what you’ve done to identify automated decision-making and how you’ve required your organization to document its usage and subsequent compliance efforts.”
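A ROPA-like document trail could be as simple as one structured record per automated decision-making use. The sketch below is one hypothetical way to capture purpose, usage, and methodology; the record fields, the example entry, and the export format are illustrative, not a format prescribed by the draft regulations.

```python
import json
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class ADMTRecord:
    """A ROPA-style record documenting one automated decision-making use."""
    system_name: str
    purpose: str               # why the organization uses this technology
    usage: str                 # where and how it is applied
    methodology: str           # inputs, model type, and decision logic at a high level
    pre_use_notice_given: bool
    opt_out_offered: bool
    last_reviewed: str         # ISO date of the most recent compliance review

def export_register(records: list[ADMTRecord], path: str) -> None:
    """Write the register to JSON so compliance efforts are documented and auditable."""
    with open(path, "w") as f:
        json.dump([asdict(r) for r in records], f, indent=2)

register = [
    ADMTRecord(
        system_name="resume-screening",
        purpose="Prioritize inbound applications for recruiter review",
        usage="Runs on every application before any human review",
        methodology="Gradient-boosted classifier over structured application fields",
        pre_use_notice_given=True,
        opt_out_offered=True,
        last_reviewed=date.today().isoformat(),
    )
]
export_register(register, "admt_register.json")
```

Keeping the register exportable makes it easier to demonstrate your identification and documentation efforts if a regulator asks for them.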
We’ve seen Congress come to a standstill on federal privacy legislation in the past, leaving it to the states to fill the gap. Dan Clarke is hopeful, saying, “I think we’re going to see movement on privacy protection as a result of Biden’s willingness to publicly speak on the privacy implications of AI. This has reinvigorated the federal privacy discussion, and I anticipate movement on that front sooner rather than later, either through Magnuson-Moss rulemaking or comprehensive legislation. I think one of the quickest things we’re going to see is a response from NIST and a cybersecurity program for AI compliance. I was told by a Biden administration insider that there are deadlines for new elements outlined in the Executive Order, some as short as 45 days, so we should see material AI-related compliance components come out in the next 90 days.”