Our second story is about JPMorgan Chase.
While it’s difficult to feel the same level of sympathy for the country’s largest bank as we do for Mr. Chicco, JPMC recently discovered that it, too, had been wronged by a data broker:
JPMorgan Chase has sued TransUnion for what it calls an "elaborate, decade-long scheme" by Argus, a unit of TransUnion, to "secretly misappropriate JPM's valuable trade secret data." The "trade secret data" the bank refers to is anonymized credit card data.
In the lawsuit the New York bank filed against Chicago-based TransUnion last week in Delaware Federal District Court, it said Argus Information & Advisory Services collected the bank's credit card data while under contract as a data aggregator for the Office of the Comptroller of the Currency, the Federal Reserve Board and the Federal Reserve Bank of Philadelphia. Argus then, without permission and in violation of its contracts with those agencies, used that data in the benchmarking services it sells to other banks, according to the bank.
Whoops!
And despite TransUnion’s rather uninspiring defense that all this happened before it acquired Argus from Verisk (another data broker) in 2022, legal observers don’t like TU’s chances in the coming court case because Argus just settled a different case with the government on the same basic charges:
The bank's case is bolstered by the fact that earlier this month, Argus agreed to pay $37 million to settle a civil investigation by the Department of Justice and other federal authorities accusing it of the same thing – using anonymized credit card data collected for government agencies in the benchmarking services it sells to credit card issuers, in violation of its government contracts.
The contracts each restricted Argus' ability to use, disclose or distribute credit card data collected from banks for purposes other than the performance of the work under the government contracts.
But the company used this anonymized data to create synthetic data that it incorporated into the products and services it sold to some commercial customers, the DOJ said.
Whoops! Whoops! Whoops!
These stories are obviously bad, but the reason they’re important is that they (and many more like them) are motivating a specific financial services regulator to take some dramatic actions.
Enter: The CFPB
Rohit Chopra, Director of the CFPB, has been talking non-stop for the last couple of years about the huge potential for consumer harm coming from AI and its insatiable appetite for consumer data:
Today, “artificial intelligence” and other predictive decision-making increasingly relies on ingesting massive amounts of data about our daily lives. This creates financial incentives for even more data surveillance. This also has big implications when it comes to critical decisions, like whether or not we will be interviewed for a job or get approved for a bank account or loan. It’s critical that there’s some accountability when it comes to misuse or abuse of our private information and activities.
Left to the free market, Director Chopra fears that this demand for consumer data will be met by increasingly unscrupulous companies and business practices, which he lumps into the category of ‘data surveillance’:
While these firms go by many labels, many of them work to harvest data from multiple sources and then monetize individual data points or profiles about us, sometimes without our knowledge. These data points and profiles might be monetized by sharing them with other companies using AI to make predictions and decisions.
In many ways, these issues mirror debates from over fifty years ago. In 1969, Congress investigated the then-emerging data surveillance industry. The public discovered the alarming growth of an industry that maintained profiles on millions of Americans, mining information on people’s financial status and bill paying records, along with information about people’s habits and lifestyles, with virtually no oversight.
Of course, today’s surveillance firms have modern technology to build even more complex profiles about our searches, our clicks, our payments, and our locations. These detailed dossiers can be exploited by scammers, marketers, and anyone else willing to pay.
Those statements are difficult to deny – remember, Mr. Chicco’s car was essentially spying on him to generate profits for G.M. – but the question remains: what, specifically, is Director Chopra going to do about it?
Open Banking and the FCRA
There are two big initiatives underway at the CFPB, which, taken together, have the potential to dramatically reshape the financial services data economy.
The first is open banking.
I have written a lot about the CFPB’s personal financial data rights rule, which it is working to finalize under the authority granted to it by Section 1033 of the Dodd-Frank Act (check out this essay and this follow-up if you need a primer). I won’t rehash all the details here.
Suffice it to say that the CFPB is proposing a relatively narrow regulatory framework in order to advance a much larger and more comprehensive philosophical idea – financial data that is generated by or associated with a consumer is owned by the consumer and should only be shared with their explicit authorization and only for the narrow purpose for which it has been authorized.
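If it helps to make that philosophy concrete, here is a minimal sketch (in Python, with entirely hypothetical field names; nothing here comes from the actual proposed rule) of what purpose-bound, time-bound data authorization looks like as a data structure:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsumerConsent:
    """A consumer's authorization to share their data (hypothetical schema)."""
    consumer_id: str
    data_recipient: str       # the specific company the consumer authorized
    authorized_purpose: str   # the narrow purpose the consumer approved
    expires_at: datetime      # authorization is time-bounded, not perpetual

def is_use_permitted(consent: ConsumerConsent, recipient: str, purpose: str) -> bool:
    """Data may flow only to the authorized recipient, for the authorized
    purpose, and only while the authorization is still in effect."""
    return (
        consent.data_recipient == recipient
        and consent.authorized_purpose == purpose
        and datetime.now(timezone.utc) < consent.expires_at
    )
```

The point of the sketch is the scoping: a recipient the consumer authorized to pull transaction data for, say, budgeting can’t quietly reuse that same data for marketing, because the purpose check fails.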
You can easily see how this consumer-driven data-sharing philosophy conflicts with the General Motors and TransUnion stories I shared earlier.
Checking a box and agreeing to a densely written privacy policy that says the company may share your data with unspecified third parties, for unspecified reasons, obviously falls well short of the general data privacy standard that the CFPB is trying to help set in the U.S. with its open banking rule.
The same is true in the case of JPMC and TU.
Does a company that is in possession of sensitive data have the right to anonymize that data or use it to create synthetic data without the explicit permission of the owners of that data?
If you squint your eyes and tilt your head just so, you can maybe see how someone (like the folks working at Argus) could reasonably believe that the answer is ‘yes’. After all, who’s getting hurt? The data is being anonymized or otherwise transformed in such a way that there is no risk of a data breach or negative outcome for an individual consumer. What’s the problem?
The problem is that it’s not their data!
In this specific case, Argus’s contracts with the government explicitly forbade it from putting the data it was collecting from JPMC and the other big banks to any secondary use. But even if they hadn’t, it’s still not Argus’s data! They can’t just do with it whatever they please! That’s the general line in the sand that the CFPB is attempting to draw with 1033 – you are only allowed to use someone else’s data for the specific purpose they have authorized. Nothing else.
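To see why the “who’s getting hurt?” intuition misses the point, it helps to look at what anonymization and synthetic data actually involve. Here is a minimal sketch (in Python, with made-up field names; we don’t know what Argus’s actual pipeline looked like). Anonymization strips the identifiers from real records; synthetic data is newly generated to mimic the real data’s statistical properties:

```python
import hashlib
import random

# Hypothetical raw records of the kind a data aggregator might collect from banks.
raw_records = [
    {"account_number": "4111111111111111", "monthly_spend": 2450.00},
    {"account_number": "4222222222222222", "monthly_spend": 910.00},
]

def anonymize(record: dict) -> dict:
    """Replace the direct identifier with a one-way hash; the spend figure,
    which came from the bank's data, passes through untouched."""
    return {
        "account_hash": hashlib.sha256(record["account_number"].encode()).hexdigest(),
        "monthly_spend": record["monthly_spend"],
    }

def synthesize(records: list) -> dict:
    """Generate a brand-new record whose statistics mirror the real portfolio.
    No real account appears in the output, but every number is derived from
    the banks' actual data."""
    avg_spend = sum(r["monthly_spend"] for r in records) / len(records)
    return {"monthly_spend": round(random.gauss(avg_spend, avg_spend * 0.1), 2)}
```

No individual cardholder is identifiable in either output, which is why the “no one is harmed” argument feels plausible. But every number in those outputs traces directly back to data that belongs to the banks and, ultimately, to their customers.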
Now, of course, there’s quite a bit of consumer data that gets shared without consumers’ explicit authorization in the financial services industry today. And a lot of that data sharing is vitally important for ensuring that financial services providers can serve consumers in a safe, profitable, and legally compliant manner.
This is where the Fair Credit Reporting Act (FCRA) comes in.
The FCRA was enacted by Congress in 1970. It was one of the world’s first data privacy laws, and it was intended to give consumers more rights and control over the data that consumer reporting agencies (the legal term for credit bureaus) were assembling and selling. The FCRA allows certain sanctioned uses of consumer report data (so-called permissible purposes, such as making credit, insurance, or employment eligibility decisions) but strictly prohibits other uses. Additionally, the FCRA mandates certain accuracy requirements, gives consumers the right to see their data, and provides due process rights to dispute inaccurate or incomplete information in their files.
The CFPB, which has rulemaking authority under the FCRA, believes that the law has, by virtue of being more than 50 years old, failed to keep pace with the data surveillance industry it was designed to constrain: