FTC To Ramp Up Privacy Protections In Big Data Arena

Fordham Law School in Law360, March 2, 2012


Federal Trade Commissioner Julie Brill on Friday called on data brokers to be more transparent about the use and retention of so-called big data, but stopped short of saying whether companies that mine vast troves of consumer data, such as Google Inc. and Facebook Inc., would be subject to the same level of scrutiny.
 
The FTC will be keeping a close eye on these practices to ensure that consumer privacy is not being compromised, Brill said in the keynote speech at Fordham University's Sixth Annual Law & Information Society Symposium. But she also was careful to acknowledge that the aggregation and analysis of large, interconnected databases, commonly referred to as “big data,” can be beneficial in predicting useful public health information and social trends.
 
“I am concerned that the vast collection of data about consumers can unintentionally — or even intentionally — include sensitive information, and that the necessary heightened protections are not being provided,” Brill said. “At first blush, it seems that some form of heightened protections are in order.”
 
To help achieve this objective, Brill renewed a call she made six weeks ago, during a speech at George Washington University, for data brokers to develop a user-friendly, one-stop shop that would give consumers access to the information data brokers have amassed about them and allow them to opt out of the sale of that information for marketing purposes.
 
Both Brill and Maneesha Mithal, associate director of the FTC's Division of Privacy and Identity Protection, who sat on a panel at the Fordham conference, pointed to a warning the FTC issued a few weeks ago to several marketers of apps that provide background screening on individuals, reminding them that they must comply with the Fair Credit Reporting Act of 1970 if they have reason to believe their reports will be used for employment, housing, credit or insurance screening.
 
Although Brill refused to directly address the question of whether Google and Facebook — which collect and store vast amounts of information that could be used to determine eligibility for these benefits — could be considered consumer reporting agencies subject to the FCRA, she said that “the [act] needs to be looked at in a creative way, and I do think that we need to be thinking more broadly about the use of the FCRA in the modern paradigm.”
 
While the FTC has yet to bring enforcement actions against companies that use big data for marketing and research purposes, the agency is very likely to exercise this authority in the future, given its previously articulated concerns about the related subject of behavioral advertising and its support of a “privacy by design” model, according to SNR Denton partner Carol Anne Been.
 
“The FTC has gradually been leaning more and more toward getting involved in the big data sector,” Been said. “The FTC is sensitive enough that it realizes that there is very interesting and valuable information that can come from big data sets, but its point is that consumers may not realize what information is being collected about them and that they should be given more control over this information.”
 
During a panel at the Fordham conference on the privacy risks associated with big data, Mithal said the FTC viewed its job as “addressing privacy risks while, at the same time, protecting the benefits of big data.”
 
In order to achieve this objective, the FTC plans to encourage companies to follow best practices, educate consumers and businesses, and punish bad actors, a formula the agency typically uses in policing other practices that raise privacy concerns, according to Mithal.
 
Citing a 2011 report that identified big data use as one of the “major hopes for the economy,” Nelson Mullins Riley & Scarborough LLP partner Jon Neiditz stressed that analyzing these vast databases could lead to positive reforms in health care, education and other sectors, and that any potential regulation should account for the harm of taking this valuable tool away.
 
“A lot of people describe information proliferation as a big problem, but as the ability arises for companies to make more sense out of it and search through the information in new ways, it's becoming less of a problem and more of an asset,” Neiditz said. He added that the approach laid out in the White House's recent privacy paper, under which industry would agree to abide by enforceable codes of conduct, could work well in the big data industry.
 
Brill acknowledged that not all forms of big data use raise privacy concerns, specifically singling out sentiment analysis, which uses large amounts of anonymous data to tease out early warning signs to aid better planning and predict job losses, spending reductions and other events.
 
“If this information indeed is — and remains — truly anonymous, and can be put to such creative and beneficial uses, I'm not going to lose sleep over it,” Brill said. “Rather, I'll be intrigued to see how beneficial sentiment analysis proves to be in the coming years.”
 
But this anonymity cannot be guaranteed, Brill noted, as it can be “relatively easy” to de-anonymize data and reassociate it with a specific consumer, and some methods fail to remove data tied to a specific smartphone or laptop that can be used to identify the device's owner.
 
The commissioner also pointed out that the collection and retention of vast amounts of identifiable data can greatly increase the risk of widespread damage when a data breach occurs.
 
While the Digital Advertising Alliance's recent commitment to work collaboratively with browser-based solutions to create effective Do Not Track mechanisms is a promising start to reduce these risks, the industry needs to continue to work to ensure that “Do Not Track is not just Do Not Target, but also, when the consumer so chooses, Do Not Collect,” according to Brill.
 
This vast amount of available information is also causing privacy concerns and increased costs in litigation and the provision of other legal services, according to Ray Bayley, CEO of legal services firm Novus Law.
 
As litigation and transactions become more global, firms must work to comply with the data protection laws of other countries, and many law firms have failed to meet the international standard governing what organizations must do to make sure the information they hold is secure, Bayley added.
 
As with the other perceived problems associated with big data, Bayley said, a change in how these practices are handled is necessary, whether through revisions to the rules governing civil procedure or improvements in technology that would allow firms to analyze this data securely and efficiently.
 
No matter how the industry and the FTC choose to tackle these big data problems, experts agree that consumers need more education about how their data is being used and more assurance that their information is being adequately protected.
 
“No one has the ability to see the variety of uses and abuses in the field of big data,” University of Virginia Professor Siva Vaidhyanathan said during a Fordham conference panel. “In an era of big data, we don't have big understanding. We need some sort of system of regulation to build cost into the system and create a deterrent.”