The U.S. government has so far failed to pass federal privacy regulations despite widespread desire for them, but the FTC’s privacy-compliance crusade may be the push lawmakers need to get the ball rolling.
In 2022, the Federal Trade Commission filed a lawsuit against Kochava, accusing the company of selling sensitive location data tied to reproductive health clinics, places of worship, and more. The FTC further alleges that Kochava’s actions exposed consumers to risks like “stigma, stalking, discrimination, job loss, and even physical violence.”
Kochava isn’t the only company playing fast and loose with consumer data. A recent study by data compliance tech company Compliant revealed that 90% of publishers shared consumer data with third parties without consent. Despite some progress on privacy compliance, these numbers show there’s plenty of work left to do, and it’s high time publishers and ad tech companies prioritized consumer privacy.
New Year, New FTC. The FTC is doubling down on protecting consumer privacy rights, from its ongoing lawsuit against Kochava to a recent settlement with Outlogic and other enforcement actions. These cases could lay the groundwork for the future of digital media privacy.
Kochava’s (Alleged) Controversial Practices
Kochava, the self-proclaimed industry leader in mobile app data analytics, is in the FTC’s crosshairs for its data collection practices. The FTC’s recent settlement with Outlogic, which bans that company from using and selling sensitive location data, suggests Kochava could face similar repercussions.
The mobile data analytics company disputes the FTC’s accusations. The FTC, however, contends that Kochava offers a ‘360-degree perspective’ on individuals, combining geolocation with other data to identify consumers. The agency also alleges that Kochava uses AI to predict and influence consumer behavior in invasive ways before selling that data.
The Dangers of Data Misuse
Court documents unsealed last November revealed Kochava’s secret sale of ‘Kochava Collective’ data, which includes precise geolocation, consumer profiles, app usage, and sensitive details like gender identity and medical data. With this data, Kochava’s customers can target specific groups down to exact locations, raising concerns about privacy and potential misuse.
This allows advertisers, insurers, political campaigns, and even individuals with harmful intentions to purchase data containing the names, addresses, emails, economic status, and more of people in selected groups.
This sensitive data, allegedly collected without user consent, is acquired through Kochava’s SDKs installed in over 10,000 apps globally, as well as directly from other data brokers. In a separate California lawsuit, plaintiff Greenley alleges that Kochava engages in covert data collection and analysis, selling customized data feeds tailored to clients’ specific needs.
Inching Towards Stronger Privacy and AI Regulations
U.S. District Judge B. Lynn Winmill initially dismissed the FTC’s complaint for lacking sufficient detail. While Winmill has not yet ruled on a motion to dismiss the amended complaint, discovery is underway, with a trial expected in 2025.
Advancing AI tools are reshaping data analysis and enabling new invasions of privacy. Generative AI can infer and disclose sensitive information, such as medical records, and can predict consumer behaviors, influencing people’s decisions without their knowledge.
With the FTC’s lawsuit against Kochava unfolding and new proposals for AI regulation on the table, the legal landscape for data privacy is evolving. The case’s outcome, alongside the push for federal privacy laws, could mark a significant shift in how data and AI are regulated and used ethically.
The White House is also concerned with protecting consumer privacy and curbing surveillance advertising. While the U.S. is still stuck with state-led privacy regulations, most agree that more robust federal privacy laws should be enacted. This case may just be the push needed to get some version of the American Data Privacy and Protection Act (ADPPA) passed.