By Suzanne Cosgrove
The CFTC’s Technology Advisory Committee on Tuesday took on some of the weightier tech topics facing regulators today, including how the responsible use of AI can be mandated, particularly in financial services, and the movement of some digital asset markets toward DeFi.
“Artificial intelligence is at the heart of much public conversation right now, including the tremendous opportunity presented as well as some fear of the unknown,” noted Commissioner Goldsmith Romero, in prepared remarks delivered before the committee’s meeting. “It is important for the Commission to keep pace with technology on these cutting-edge issues, especially as Congress, other federal regulators, and the Commission consider policies on these technology issues.”
A two-sided coin. Leading the TAC subcommittee members who testified at the meeting, Nicol Turner Lee, senior fellow in Governance Studies and director of the Center for Technology Innovation at The Brookings Institution, said the emergence of AI was part of a “new ecology” representing both opportunities and challenges. Further, it’s not just about autonomous systems, but about the iterative process by which they are employed, Turner Lee said.
Turner Lee maintained that who designs AI is critically important. Currently, there is not much diversity in that initial process, she said. As an example, she cited “traumatized data,” such as that involving the criminal justice system, “where the training data skews disparately for people of color.”
Privacy protections also are at risk when machines evaluate voluminous data, she noted. Individuals are often not given the opportunity to exercise their right to consent when machines process data in bulk.
Market risk. “Across the industry, risk professionals have a critical role in safeguarding our markets” amid the adoption of AI and other technological advancements, said Commissioner Caroline Pham. She suggested utilizing existing risk governance frameworks and risk management disciplines when identifying, measuring, monitoring, and controlling emerging risks and new technologies.
Operational risk management now includes technology risk, cyber risk, and third-party risk, Pham noted.
The risks and concerns of regulation are socio-technical, requiring a balance between research and attention to disparate outcomes, Turner Lee said. The Commission should be exploring issues of trust and safety for both consumers and systems and, in financial services, ensuring sound risk management and guarding against fraud.
The current regulatory landscape in the U.S. is divided between “soft law,” which entails voluntary self-regulation or opt-in practices, and “hard law,” with enforceable regulations. U.S. regulation also is risk-based, sector-specific and highly distributed across regulatory agencies. That approach is different from what has evolved in the EU, where there is very prescriptive legislation, said Turner Lee. “What’s needed is less segmentation and more clarity over jurisdictional authority in the U.S.,” she said.
Exploring solutions. Questioned about “watermarking” as a way to mitigate some concerns about AI’s risks, Travis Hall, acting deputy associate administrator of the National Telecommunications and Information Administration, said it was “not the end-all and be-all.”
Watermarking, a technique for identifying that a piece of text was written by AI, could help guard against issues like plagiarism in schools, Hall said. But it “does not address some of the more systemic risks about how it (AI) is being deployed.”
For markets, the question often is “whose data is it?” said Joseph Saluzzi, partner and co-head of equity trading at Themis Trading LLC, a brokerage firm. He noted SEC Chairman Gary Gensler also raised that issue in a speech earlier this week. As an example of that conflict, Saluzzi cited two data products offered by the Depository Trust & Clearing Corporation (DTCC) that could potentially be used as inputs for computerized trading strategies. The DTCC recently announced it would cease offering the products.
The question of ownership also comes up in terms of copyrights for art or images fed into AI systems, Hall said. The issue is not really about the selling of protected data or the image itself, but about the AI system that might be able to infer proprietary information, he added.
DeFi definitions. Moving to a discussion of decentralized finance, Dan Awrey, professor of law at Cornell Law School, advised the committee that in order to regulate DeFi, “we need to first define what we are talking about.”
Existing regulatory frameworks rely on a high degree of centralization, Awrey noted. “Once we get into decentralized organizations, we need to ask how these frameworks need to be adjusted,” he said. “Define not just what the regulation is, but what it needs to be.”