FTC: Consumer Privacy on Mobile Devices

FTC Attorney Jim Elliott breaks down mobile privacy

In preparation for our upcoming AdMonsters Mobile Publisher Forum, we’ve interviewed Attorney Jim Elliott of the Federal Trade Commission to get a preview of his presentation on consumer privacy on mobile devices, which he will be delivering at the event. 

First, an intro from Jim:

Maria, I appreciate this opportunity you and AdMonsters have afforded me to speak on privacy issues facing the mobile app industry. As usual, the views expressed here, and during the conference, are my own and not necessarily those of the Federal Trade Commission, any individual Commissioner, their staff, the Bureau of Consumer Protection, or the Southwest Region.

The mobile space is booming, and privacy issues have come along with the prevalence of such personal connected devices. What are the biggest issues you see when it comes to privacy on mobile devices, and how is privacy on mobile different than privacy on desktop computers? 

Mobile devices present significant privacy issues. Geolocation information and the tracking of app users are real concerns.

The rapid growth of the mobile marketplace illustrates the need for companies to implement reasonable limits on the collection, transfer, and use of consumer data and to set policies for disposing of collected data. The unique features of a mobile phone – which is highly personal, almost always on, and travels with the consumer – have facilitated unprecedented levels of data collection. News reports have confirmed the extent of this ubiquitous data collection.

Researchers announced, for example, that Apple had been collecting geolocation data through its mobile devices over time, and storing unencrypted data files containing this information on consumers’ computers and mobile devices. The Wall Street Journal has documented numerous companies gaining access to detailed information – such as age, gender, precise location, and the unique ID associated with a particular mobile device – that can then be used to track and predict consumer behavior. Not surprisingly, consumers are concerned: for example, a recent Nielsen study found that a majority of smartphone app users worry about their privacy when it comes to sharing their location through a mobile device.

The Commission seeks to have companies limit collection to data they actually need for a requested service or transaction, and that data should only be kept for as long as it is needed. For example, a wallpaper app or an app that tracks stock quotes does not need to collect location information.

The extensive collection of consumer information – particularly location information – through mobile devices also heightens the need for companies to implement reasonable policies for purging data. Without data retention and disposal policies specifically tied to the stated business purpose for the data collection, location information can be used to build detailed profiles of consumer movements over time that can be used in ways not anticipated by consumers. Location information is particularly useful for uniquely identifying (or re-identifying) individuals using disparate bits of data. For example, a consumer can use a mobile application on her cell phone to “check in” at a restaurant for the purpose of finding and connecting with friends who are nearby. The same consumer might not expect the application provider to retain a history of restaurants she visited over time. If the application provider were to share that information with third parties, it might reveal a predictive pattern of the consumer’s movements thereby exposing the consumer to a risk of harm such as stalking.

Taken together, the principles of reasonable collection limitation and disposal periods help to minimize the risks that information collected from or about consumers could be used in harmful or unexpected ways.
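To make the retention principle concrete, here is a minimal sketch of a purge routine for the check-in scenario above. The 24-hour retention window and the record structure are illustrative assumptions, not values prescribed by the FTC; the point is that the retention period is tied to the stated purpose (finding nearby friends) rather than indefinite.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window: keep check-ins only long enough to
# support the "find nearby friends" feature, not to build a history.
RETENTION = timedelta(hours=24)

def purge_stale_checkins(checkins, now=None):
    """Drop location records older than the retention window."""
    now = now or datetime.now(timezone.utc)
    return [c for c in checkins if now - c["timestamp"] <= RETENTION]

now = datetime.now(timezone.utc)
checkins = [
    {"user": "alice", "place": "cafe", "timestamp": now - timedelta(hours=2)},
    {"user": "alice", "place": "gym",  "timestamp": now - timedelta(days=30)},
]
fresh = purge_stale_checkins(checkins)
# The 30-day-old record is purged; no long-term movement profile accumulates.
```

A real implementation would run a job like this against the production datastore on a schedule, with the window documented in the privacy policy.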

The FTC recently released, “Marketing Your Mobile App: Get It Right from the Start,” which provides general guidelines to all app developers. Of these guidelines, which do you believe are the most critical for publishers working in the mobile ecosystem?

You know Maria, that’s like asking me which of my children I love the most. Each of the nine points is significant from a regulatory agency standpoint. App developers should use the brochure as a checklist to ensure that these key components – truth-in-advertising standards and basic privacy principles – are incorporated in their app.

App developers should become familiar with each of the nine points and how each may apply to their product. For instance, a publisher developing a children’s product will have additional requirements under the Children’s Online Privacy Protection Act and the FTC’s COPPA Rule.

Use the release as a starting point and check our BCP Business Center – http://business.ftc.gov/about-the-business-center – to drill down for additional information and guidance as you develop your application. Companies should promote truth-in-advertising and consumer privacy throughout their organization and at every stage of development of their products and services.

What are some of the most egregious abuses of user data that you’ve seen while at the FTC? How can marketers avoid making the same mistakes?

Maria, I think I can give you a few examples of egregious abuses of user data. There are a variety of situations where consumers’ data is placed at risk by those charged with keeping it secure.

The first type of situation is the blatant failure to guard sensitive consumer and employee personal information. The two that come to mind here are Blockbuster and Rite Aid, both of which simply threw sensitive consumer and employee records, in plain text, into dumpsters at their respective places of business.

Between 2005 and 2007, Blockbuster threw customer records into trash bins at a few of its outlets. These plain-text records included old Blockbuster application forms containing sensitive financial information, such as credit card and Social Security numbers.

More recently, in 2010, Rite Aid settled charges that it failed to protect the sensitive financial and medical information of its customers and employees, in violation of federal law. In a separate but related action, the company’s pharmacy chain also agreed to pay $1 million to resolve Department of Health and Human Services allegations that it failed to protect customers’ sensitive health information.

Both FTC and HHS investigations followed news reports about Rite Aid pharmacies throwing away consumers’ personal information such as pharmacy labels and job applications into dumpsters.

The obvious solution to these types of situations is to securely maintain plain-text records and to dispose of sensitive consumer and employee information by shredding.

Another type of situation is when online databases are compromised and consumer information makes its way into the public domain.

ChoicePoint is probably the one that comes to mind most often, probably because it was one of the earliest and largest breaches. In 2005, personal financial records of more than 163,000 consumers in its database were compromised. ChoicePoint obtained and sold the personal information of consumers, including their names, Social Security numbers, birth dates, employment information, and credit histories, to more than 50,000 businesses.

The FTC alleged that ChoicePoint did not have reasonable procedures in place to screen prospective subscribers, and turned over consumers’ sensitive personal information to subscribers whose applications raised obvious “red flags.” ChoicePoint even approved as customers individuals who lied about their credentials, used commercial mail drops as business addresses, and had no legal right to obtain the information.

By the time ChoicePoint settled these charges with the FTC, it was determined that at least 800 cases of identity theft arose from the company’s data breach.

A more recent example might be Twitter’s settlement of charges that it deceived consumers and put their privacy at risk by failing to safeguard their personal information, marking the agency’s first privacy case against a social networking service. The FTC charged that serious lapses in the company’s data security allowed hackers to obtain unauthorized administrative control of Twitter, including access to non-public user information, tweets that consumers had designated private, and the ability to send out phony tweets from any account.

The FTC alleged that between January and May 2009, hackers gained administrative control of Twitter on two occasions. In January 2009, a hacker used an automated password-guessing tool to gain administrative control of Twitter, after submitting thousands of guesses into Twitter’s login webpage. The administrative password was a weak, lowercase, common dictionary word.

During a second security breach, in April 2009, a hacker guessed the administrative password of a Twitter employee after compromising the employee’s personal email account, where two similar passwords were stored in plain text.

One solution to these types of data compromises is to build data security into your app, which helps prevent unauthorized administrative control. The Twitter order lists some reasonable steps:

  1. Require employees to use hard-to-guess administrative passwords that they did not use for other programs, websites, or networks;
  2. Prohibit employees from storing administrative passwords in plain text within their personal e-mail accounts;
  3. Suspend or disable administrative passwords after a reasonable number of unsuccessful login attempts;
  4. Provide an administrative login webpage that is made known only to authorized persons and is separate from the login page for users;
  5. Enforce periodic changes of administrative passwords, for example, by setting them to expire every 90 days;
  6. Restrict access to administrative controls to employees whose jobs require it; and impose other reasonable restrictions on administrative access, such as by restricting access to specified IP addresses.
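Two of the steps above – suspending logins after repeated failures (step 3) and expiring passwords periodically (step 5) – can be sketched in a few lines. This is a simplified illustration, not production code: the threshold and expiry values are assumptions, and a real system would store salted password hashes and compare them in constant time rather than comparing plain strings.

```python
import time

# Hypothetical policy values mirroring the order's steps.
MAX_ATTEMPTS = 5                   # suspend after repeated failed logins
EXPIRY_SECONDS = 90 * 24 * 3600    # force a password change every 90 days

class AdminAccount:
    def __init__(self, password, set_at):
        self.password = password   # real code: store a salted hash, never plain text
        self.set_at = set_at       # when the password was last changed
        self.failures = 0
        self.locked = False

    def login(self, attempt, now=None):
        now = now if now is not None else time.time()
        if self.locked:
            return "locked"
        if now - self.set_at > EXPIRY_SECONDS:
            return "password expired"
        if attempt != self.password:
            self.failures += 1
            if self.failures >= MAX_ATTEMPTS:
                self.locked = True   # step 3: disable after too many failures
            return "denied"
        self.failures = 0
        return "ok"
```

For example, five wrong guesses lock the account, so an automated password-guessing tool like the one used against Twitter in January 2009 would be cut off after a handful of attempts instead of thousands.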

My final example of user data abuse comes from a recent case involving violations of the Children’s Online Privacy Protection Act (COPPA) and the FTC’s COPPA Rule: user data was collected in violation of the law, placing children’s information in jeopardy.

Recently, the operator of fan websites for music stars Justin Bieber, Rihanna, Demi Lovato, and Selena Gomez agreed to settle FTC charges that it violated COPPA by improperly collecting personal information from children under 13 without their parents’ consent.

The FTC charged that the website operator, Artist Arena, violated COPPA and the FTC’s COPPA Rule, which require website operators to notify parents and obtain their consent before they collect, use, or disclose personal information from children under 13.

The FTC alleged that Artist Arena operated fan websites where children were able to register to join a fan club, create profiles, and post on members’ walls. Children also provided personal information to subscribe to fan newsletters. Artist Arena falsely claimed that it would not collect children’s personal information without prior parental consent and that it would not activate a child’s registration without parental consent, the FTC alleged.

Artist Arena registered over 25,000 children under age 13 and collected and maintained personal information from almost 75,000 additional children who began, but did not complete, the registration process.

If you designed your app to collect personal information from children, stop and redesign it. For new apps, design and build in privacy from the start by complying with COPPA and the FTC’s COPPA Rule. Go to our Business Center (www.business.ftc.gov/documents/bus51-you-your-privacy-policy-and-coppa-how-comply-childrens-online-privacy-protection-act) and download a copy of “You, Your Privacy Policy and COPPA – How to Comply with the Children’s Online Privacy Protection Act.” You don’t want to find, as Artist Arena did, that you have to pay a civil penalty of $1 million.
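One common building block in a COPPA-aware registration flow is a neutral age screen: compute the user’s age from a birth date collected before any other personal information, and route under-13 users into a parental-consent path instead of normal registration. The sketch below is only an illustration of that gating logic, not legal advice; the full Rule has many more requirements, and the function names here are my own.

```python
from datetime import date

COPPA_AGE = 13  # the COPPA Rule applies to children under 13

def age_on(birthdate, today):
    """Age in whole years as of `today`."""
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )

def may_collect_without_parental_consent(birthdate, today=None):
    """Neutral age screen: collection of further personal information
    from under-13 users must wait for verifiable parental consent."""
    today = today or date.today()
    return age_on(birthdate, today) >= COPPA_AGE
```

A registration form would call this before asking for a name, email address, or profile details, and an under-13 result would trigger the notice-and-consent process rather than silently activating the account, which is the practice Artist Arena was charged over.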

What are your hopes and fears for the future of privacy on mobile devices?

Maria, my hope is that app developers go to our Business Center and review the information and publications under the Privacy and Security section. As stated in the FTC’s “Protecting Consumer Privacy in an Era of Rapid Change,” the FTC urges companies to adopt the following practices:

  1. Privacy by Design: Build in privacy at every stage of product development;
  2. Simplified Choice for Businesses and Consumers: Give consumers the ability to make decisions about their data at a relevant time and context, including through a Do Not Track mechanism, while reducing the burden on businesses of providing unnecessary choices; and
  3. Greater Transparency: Make information collection and use practices transparent.

I fear that there are developers who will not take advantage of the guidance the FTC provides at the Business Center and will refuse or fail to adequately design consumer privacy and children’s privacy into their products and services.


The AdMonsters Mobile Publisher Forum hits San Antonio this November 11-14th! Get ready for presentations from industry leaders on hot-topic mobile publisher and app developer issues such as mobile standards, mobile video, developing a mobile strategy from the ground up, m-commerce, and much more! Register today to join your peers for this invaluable conference!