Pixalate's Mobile COPPA Compliance Tools - Methodology
Featuring Pixalate's Trust & Safety Advisory Board
Allison Lefrak
SVP of Public Policy, Ads Privacy, and COPPA Compliance
Tabitha Walker
Trust and Safety Advisory Board Member
Crystal Pearson
Trust and Safety Advisory Board Member
Emma Burdis
Trust and Safety Advisory Board Member

Effective as of 16 February 2023

BACKGROUND


WHAT IS COPPA

The Children's Online Privacy Protection Act ("COPPA") is a United States federal law passed by Congress in 1998 to protect children’s online privacy. COPPA requires the Federal Trade Commission (“FTC”) to issue and enforce a rule implementing the law (“the COPPA Rule” or “the Rule”). The FTC’s COPPA Rule became effective in 2000 and was amended in 2013. The FTC is presently in the process of reviewing the Rule again. COPPA is enforced by the FTC and by state Attorneys General, who have the authority to seek civil penalties from companies that violate the law.

EXTRATERRITORIALITY 

Although COPPA is a U.S. law, foreign online service operators must comply with the COPPA Rule if their services, or a portion of the services, are directed to children in the U.S., or if they have actual knowledge that they collect information from children in the U.S. Relatedly, American online service operators that collect information from foreign children are also subject to COPPA.

WHO MUST COMPLY WITH COPPA

The COPPA Rule applies to online service operators whose service, or a portion of the service, is directed to children under the age of 13 or who have actual knowledge that they collect information from children under the age of 13. Actual knowledge can be gained in a variety of ways, including by a user attempting to pass through an age gate, a user or parent telling a customer service team the user's age, or by user-generated content (e.g., a user stating she is in elementary school in her profile).

Third parties in the digital advertising ecosystem (e.g., not the online service operator) must comply with the COPPA Rule when they have actual knowledge that they collect information from users of an online service directed to children. The FTC sets forth two scenarios where ad networks and other third parties will likely be deemed to have actual knowledge: 

  1. where a child-directed content provider directly communicates the child-directed nature of its content to the ad network or other third party; or
  2. where a representative of the ad network or other third party recognizes the child-directed nature of the content.

 

See the FTC’s 2013 Statement of Basis and Purpose.

COPPA’s requirements apply only when the online service operators or third parties described above collect personal information. Because personal information includes persistent identifiers, COPPA applies to the collection of mobile device identifiers, browser-based cookies, or other persistent device identifiers, regardless of whether the device is a personal device (e.g., a mobile phone) or shared device (e.g., a smart TV). Please also refer to the section below: HOW DOES COPPA APPLY TO ADTECH.

COPPA’S REQUIREMENTS FOR COVERED OPERATORS

Operators covered by the Rule must:

  1. Post a clear and comprehensive online privacy policy describing their information practices for personal information collected online from children;
  2. Provide direct notice to parents and obtain verifiable parental consent, with limited exceptions, before collecting personal information online from children;
  3. Give parents the choice of consenting to the operator’s collection and internal use of a child’s information, but prohibiting the operator from disclosing that information to third parties (unless disclosure is integral to the site or service, in which case, this must be made clear to parents);
  4. Provide parents access to their child's personal information to review and/or have the information deleted;
  5. Give parents the opportunity to prevent further use or online collection of a child's personal information;
  6. Maintain the confidentiality, security, and integrity of information they collect from children, including by taking reasonable steps to release such information only to parties capable of maintaining its confidentiality and security;
  7. Retain personal information collected online from a child for only as long as is necessary to fulfill the purpose for which it was collected and delete the information using reasonable measures to protect against its unauthorized access or use; and
  8. Not condition a child’s participation in an online activity on the child providing more information than is reasonably necessary to participate in that activity. 

See COPPA Rule at 16 C.F.R. § 312. 

HOW DOES COPPA APPLY TO ADTECH

In addition to publishers of content directed to children or that have actual knowledge of a child using their service, any company that handles persistent identifiers of users of an online service it knows to be directed to children is subject to COPPA. This includes demand-side platforms (DSPs), supply-side platforms (SSPs), ad networks, data management platforms (DMPs), customer data platforms (CDPs), analytics and fraud detection companies, measurement providers, and other advertising technology companies.

Although COPPA does not prohibit advertising to children, the Rule prohibits the collection of personal information (including cookies and other persistent identifiers) from children under 13 without verifiable parental consent. The intention behind this prohibition is to stop behavioral advertising, retargeting and profiling of children under 13. Contextual advertising is permissible under COPPA. In practice, this means contextually-based advertising that does not track the user over time and across online services. 

THE COPPA FLAG IN PROGRAMMATIC ADVERTISING

Real-time Bidding (“RTB”) is a way of transacting media that allows an individual ad impression to be put up for bid in real time through a programmatic on-the-spot auction, and it makes it possible to serve targeted ads. The Interactive Advertising Bureau’s OpenRTB is an API specification for an open protocol for the automated trading of digital media across a range of platforms and devices. Part of IAB’s OpenRTB specification, the COPPA flag is an attribute of a bid request that signals whether that request is for the opportunity to serve an ad to a child protected by COPPA. The publisher issuing the bid request is responsible for determining whether the user is a child. The flag has a value of 1 if the user is a child under 13, and a value of 0 otherwise. This flag allows media buyers to programmatically decide whether to make a bid and whether they can use tracking and targeting technologies with that impression. See IAB Guide to Navigating COPPA. For more information about how an advertiser or adtech partner can act upon the flag, refer to the Knowledge Base.
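
For illustration only, the Python sketch below shows how a media buyer might read the COPPA flag from a bid request and restrict itself to contextual handling when the flag is set. The regs.coppa field name comes from the OpenRTB specification; the bidder logic and example values are assumptions, not a description of any particular platform.

    # Illustrative sketch: reading the OpenRTB COPPA flag ("regs.coppa") from a
    # bid request and deciding how to handle the impression. The decision logic
    # below is hypothetical and shown only to demonstrate the flag's meaning.

    def is_coppa_flagged(bid_request: dict) -> bool:
        """Return True when the publisher has flagged the request as child traffic."""
        return bid_request.get("regs", {}).get("coppa", 0) == 1

    def handle_bid_request(bid_request: dict) -> dict:
        if is_coppa_flagged(bid_request):
            # Child-directed impression: bid contextually only, with no persistent
            # identifiers, behavioral targeting, or retargeting.
            return {"bid": True, "targeting": "contextual", "use_identifiers": False}
        return {"bid": True, "targeting": "behavioral", "use_identifiers": True}

    # Example bid request fragment in which the publisher has set the flag to 1.
    request = {"id": "1234", "app": {"bundle": "com.example.kidsgame"}, "regs": {"coppa": 1}}
    print(handle_bid_request(request))  # contextual-only handling, no identifiers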

DIRECTED TO CHILDREN, INCLUDING MIXED AUDIENCE ONLINE SERVICES UNDER COPPA

There are two categories of online services under COPPA: directed to children and general audience. Mixed audience online services are a subset of the child-directed category of online services. 

Directed to children is defined in the COPPA Rule as a commercial website or online service, or portion thereof, that is targeted to children. See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children”). Classifying a service as child-directed is subjective; however, it requires consideration of a number of factors set forth in the COPPA Rule. These factors include:

  1. the subject matter;
  2. visual content;
  3. the use of animated characters or child-oriented activities and incentives;
  4. music or other audio content;
  5. age of models;
  6. presence of child celebrities or celebrities who appeal to children;
  7. language or other characteristics of the website or online service; 
  8. whether advertising promoting or appearing on the website or online service is directed to children;
  9. competent and reliable empirical evidence regarding audience composition; and
  10. evidence regarding the intended audience of the site or service.

See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children,” paragraph (1)). In other words, if an online service is targeted to appeal to children, then it is considered child-directed, and the operator is required to treat every user of the service as if they are a child and comply with the COPPA Rule. This is true even if the user is accessing the service from a shared device or a device that belongs to an adult.

The COPPA Rule provides for a mixed audience subcategory of the directed to children category. See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children,” paragraph (3)); and Complying with COPPA: Frequently Asked Questions (see FAQs D.4-8). A mixed audience online service falls under the definition of directed to children despite not targeting children under 13 as its primary audience. Operators of mixed audience online services are permitted to implement a neutral age screen for their users. Using the age screen enables the operator of a mixed audience online service to collect personal information from users who indicate they are under 13, but only after obtaining verifiable parental consent. Notably, an online service may be deemed directed to children even if its Terms of Service or Privacy Policy prohibit children under 13 from using the service. In determining whether an online service is child-directed, the FTC will consider the factors set forth in the Rule listed above.
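
For illustration only, the following Python sketch shows one way a neutral age screen could gate data collection on a mixed audience service. The prompt wording, function names, and consent flow are hypothetical assumptions, not requirements drawn from the Rule.

    # Hypothetical sketch of a neutral age screen: the user is asked for a date of
    # birth without any hint of a "correct" answer, and users who indicate they are
    # under 13 are routed to a verifiable parental consent flow before any personal
    # information is collected.

    from datetime import date

    def route_after_age_screen(birth_year: int, birth_month: int, birth_day: int) -> str:
        today = date.today()
        birthday = date(birth_year, birth_month, birth_day)
        age = today.year - birthday.year - ((today.month, today.day) < (birthday.month, birthday.day))
        if age < 13:
            # Do not collect personal information yet; obtain verifiable parental consent first.
            return "require_verifiable_parental_consent"
        return "proceed_without_coppa_consent_flow"

    print(route_after_age_screen(2015, 6, 1))   # under 13 -> parental consent required
    print(route_after_age_screen(1990, 6, 1))   # adult -> normal flow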

GENERAL AUDIENCE SITES AND SERVICES UNDER COPPA

General audience is the term used by the FTC to describe sites and services that do not target children under 13 as a portion of the audience. See the FTC’s 1999 Statement of Basis and Purpose. The COPPA Rule applies to operators of general audience online services with actual knowledge that they are collecting information from children under 13. 

Importantly, a general audience online service does not become a mixed audience service simply because some children use the site or service. However, sites that are widely known to have a large proportion of children using the service will likely be considered mixed audience sites.

PERSONAL INFORMATION UNDER COPPA

The COPPA Rule defines personal information to include:

  1. first and last name
  2. home or other physical address including street name and name of a city or town
  3. online contact information
  4. screen or user name that functions as online contact information
  5. telephone number
  6. social security number
  7. persistent identifiers that can be used to recognize a user over time and across different websites or online services including a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or unique device identifier
  8. photograph, video, or audio file, where such file contains a child’s image or voice
  9. geolocation information sufficient to identify street name and name of a city or town
  10. information concerning the child or the parents of that child that the operator collects online from the child and combines with an identifier described above

 

See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Personal information”).

DATA COLLECTION UNDER COPPA

Data collection is defined under COPPA as the gathering of any personal information from a child by any means, including but not limited to:

  1. Requesting, prompting, or encouraging a child to submit personal information online;
  2. Enabling a child to make personal information publicly available in an identifiable form (unless reasonable measures are taken to delete all or virtually all personal information from a child's postings before they are made public and also delete such information from the operator’s records); and
  3. Passive tracking of a child online.

See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Collects or collection”).

SAFE HARBORS UNDER COPPA

COPPA includes a provision which enables industry groups, commercial entities, or others to develop their own COPPA oversight programs, known as Safe Harbor programs. There are currently six COPPA Safe Harbor programs approved by the FTC: Children’s Advertising Review Unit (CARU), the Entertainment Software Rating Board (ESRB), PRIVO, kidSAFE, iKeepSafe, and TRUSTe. A Safe Harbor organization may audit, monitor and provide guidance to its participating companies. A benefit of certification with a Safe Harbor program is that, generally, a disciplinary review for a COPPA violation will allow for a period to cure the violation instead of a formal investigation by the FTC. To be clear, Pixalate is not an FTC approved COPPA Safe Harbor. 

Pixalate is providing tools designed to help ad tech companies avoid serving targeted ads on child-directed websites. In the first instance, it is the publisher’s responsibility to designate its app as child-directed wherever appropriate. If a publisher identifies its app as child-directed, ad networks must NOT serve targeted ads on that property. Many ad tech companies have expressed concern, however, that publishers do not always get such designations correct. These ad tech companies are trying to go above and beyond the requirements of COPPA to ensure that targeted ads are not served on child-directed properties. This is where Pixalate’s compliance tools add significant value. Given that manual review of all of the transactions that take place in the ad ecosystem is virtually impossible and that third parties cannot often ascertain the intended audience of an app, Pixalate provides a solution that flags likely child-directed properties for further review or blocking of targeted ads by ad tech companies. 

Please note that Pixalate’s assessment of COPPA risk, or potential child directedness, does not provide a legal conclusion regarding an app’s intended audience or the sufficiency of an app’s COPPA Rule compliance.

PIXALATE’S AUTOMATED CHILD DIRECTED RISK ASSESSMENT METHODOLOGY

Pixalate uses a combination of signals to determine if an app is likely directed to children under 13. These signals include: app store category information (e.g., Games, Education, Entertainment); content rating (e.g., Everyone and Everyone 10+ in the Google Play store or 4+ and 9+ in the Apple App Store); whether the app is in Google’s Teacher Approved program; and the presence of child keywords in an app’s title or description. Additionally, Pixalate cross-references apps between the two stores for consistency. If an app is likely child-directed in one store based on Pixalate’s automated methodology, the algorithm will match the equivalent version of the app in the other store and designate it as likely child-directed as well.
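
The Python sketch below illustrates, in simplified form, how signals of this kind could be combined. The thresholds, field names, and the short keyword list are illustrative assumptions and do not reproduce Pixalate's production algorithm.

    # Simplified, hypothetical combination of the signals described above.
    # All names and thresholds are assumptions for demonstration purposes.

    CHILD_CATEGORIES = {"Games", "Education", "Entertainment", "Kids", "Family"}
    CHILD_CONTENT_RATINGS = {"Everyone", "Everyone 10+", "4+", "9+"}
    CHILD_KEYWORDS = {"kids", "toddler", "preschool", "abc", "coloring"}  # illustrative subset

    def likely_child_directed(app: dict) -> bool:
        """Return True if store metadata suggests the app is likely child-directed."""
        if app.get("teacher_approved"):  # membership in Google's Teacher Approved program
            return True
        in_child_category = app.get("category") in CHILD_CATEGORIES
        child_rating = app.get("content_rating") in CHILD_CONTENT_RATINGS
        text = f"{app.get('title', '')} {app.get('description', '')}".lower()
        has_child_keywords = any(kw in text for kw in CHILD_KEYWORDS)
        # Category and rating signals are used together with keyword signals, not alone.
        return in_child_category and child_rating and has_child_keywords

    def propagate_across_stores(google_app: dict, apple_app: dict) -> bool:
        """Cross-store consistency: if either store version is flagged, flag both."""
        return likely_child_directed(google_app) or likely_child_directed(apple_app)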

The table below provides an overview of Pixalate’s automated methodology for determining whether the audience for an app is likely child-directed. As explained above, child-directed includes mixed audience apps that may be targeting children as a portion of their audience. If an app is not assessed to be likely child-directed under Pixalate’s methodology, the algorithm will deem the audience to be likely general audience.

Pixalate updates the COPPA Audience assessment on a weekly cadence, but some components used in the assessment, such as content rating, category, or app description, are refreshed only on a monthly cadence.

Store Content Category Information 

Pixalate uses app store content category and subcategory information, along with other app information, in assessing whether apps are likely child-directed. 

Games, Education and Entertainment - Google and Apple

Based on a manual review, Pixalate determined that most apps that target children fall into the Games, Education or Entertainment categories or subcategories in both of the app stores. As early as 2012, the FTC found the highest percentage of apps for children were located in the Games and Education categories in the Apple and Android stores. See the FTC’s Staff Report Mobile Apps for Kids: Current Privacy Disclosures are Disappointing, Table 1 at page 5. Based on Pixalate’s analysis, this remains true ten years later. Pixalate also found that a large number of apps that target children are in the Entertainment category. Pixalate uses the Games, Education and Entertainment categories/subcategories along with specific age ratings (discussed below) and child keywords (discussed below) to designate apps as likely child-directed.

Kids and Family - Google and Apple

Google states that app developers who have designed an app for children must participate in the Designed for Families program. Google also states that developers who have designed apps for everyone, including children and families, may apply to participate in the Designed for Families program. As part of the application, in addition to selecting a content rating, app developers declare that their app is designed for specific target age groups. Apps that are approved for the Designed for Families program appear in the Family category in the Play store. Pixalate uses this category information in combination with specific content ratings (discussed below) to designate apps as likely child-directed. 

Apple provides app developers the ability to declare that their apps should be included in the Kids Category on the App Store. App developers that participate in this program are supposed to follow certain guidelines, including adherence to children’s privacy laws. Apps in the Kids category are supposed to be designed for children 11 and under. In addition to selecting a content rating, app developers who have designed apps for Apple’s Kids category choose a target age range for the app: 5 and under, 6-8, or 9-11. See Choosing a Category, Special Cases, Apps for Kids. Apple also allows app developers to assign their app to the Family subcategory. Pixalate uses this category information in combination with specific content ratings (discussed below) to designate apps as likely child-directed.

Periodically, both app stores rename their categories. In general, any categories that contain the words “Kids” or “Family” are utilized by Pixalate (along with other app information) to designate apps that are likely child-directed. For more detailed information about the app stores’ content category information that Pixalate uses in its automated methodology, refer to the Knowledge Base.

Content Ratings and Age Ratings

Both Google and Apple use ratings to describe the app’s content. These ratings do not describe whether an app is targeting children under 13. Pixalate does not rely on these ratings alone in its child-directed automated assessment. Rather, Pixalate uses these ratings only in combination with other factors in its methodology to assess whether apps are child-directed. 

Google Content Ratings

Google explains that content ratings are the responsibility of the app developers and the International Age Rating Coalition (IARC). See Apps & Games content ratings on Google Play. Google states, “Content ratings are used to describe the minimum maturity level of content in apps. However, content ratings don’t tell you whether an app is designed for users of a specific age.” See Age-based ratings & description. Rating standards vary by country or region. For North and South America, Google uses ratings that are maintained by the Entertainment Software Rating Board (ESRB). More information about these ratings can be found on the Google Play Help website.

Pixalate uses two of the ESRB content ratings (Everyone and Everyone 10+) in combination with other factors in its child-directed automated assessment. Similarly, for apps from other regions, Pixalate uses content ratings that correspond to these ratings. Please refer to the Knowledge Base for a full listing of the content ratings that Pixalate uses in its child-directed automated assessment.

Apple Age Ratings

According to the Apple developer website, “An age rating is a required app information property used by the parental controls on the App Store.” Apple provides a list of content descriptions and the app developer identifies how frequently each content type appears in the app. These selections are converted into one of four App Store age ratings: 4+, 9+, 12+, or 17+. See App Store Preview, Get Started, Age Ratings and App Store Connect Help, My Apps, Age ratings.

Pixalate uses two of the Apple age ratings (4+ and 9+) in combination with other factors in its child-directed automated assessment. 

Google’s Teacher Approved Program 

Google Play has a program called Teacher Approved in which apps are evaluated by teachers and other specialists for age and content appropriateness. Google explains that teachers and specialists rate apps in the Designed for Families program (discussed above) based on design, appeal, and enrichment; age appropriateness; and the appropriateness of ads, in-app purchases, and cross-promotion. Teacher Approved apps are eligible to appear on the Kids tab on Google Play and display a Teacher Approved badge. Unlike most apps on Google Play, these apps also display the app’s target age group(s) for children under 13.

Any app that is in Google’s Teacher Approved program is designated as child-directed under Pixalate’s automated child-directed assessment. 

Child-Related Keywords 

Pixalate uses a curated list of child-related keywords in an app’s title or app store description to assess whether an app likely targets children under 13. Pixalate uses both a qualitative and quantitative approach for generating the curated child-related keyword list. Pixalate uses a statistical technique based on conditional entropy to determine the most important words/phrases used to describe apps for children. Pixalate also supplements the curated list of child-related keywords based on input from the educators on the Trust and Safety Advisory Board as they manually review apps.
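
To illustrate the statistical idea, the Python sketch below ranks candidate keywords by the conditional entropy of the child-directed label given the presence or absence of the keyword; lower values indicate more informative keywords. The toy corpus and tokenization are hypothetical and not Pixalate's actual pipeline.

    # Illustrative ranking of keywords by conditional entropy H(label | keyword).
    # The toy corpus below is invented for demonstration.

    import math
    from collections import Counter

    def entropy(labels):
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

    def conditional_entropy(keyword, docs):
        """H(label | keyword present/absent) over a list of (text, label) pairs."""
        groups = {True: [], False: []}
        for text, label in docs:
            groups[keyword in text.lower()].append(label)
        total = len(docs)
        return sum((len(g) / total) * entropy(g) for g in groups.values() if g)

    docs = [
        ("fun coloring game for toddlers", True),
        ("learn abc with friendly animals", True),
        ("fast vpn and secure browser", False),
        ("stock portfolio tracker", False),
    ]

    # Keywords whose presence leaves the least uncertainty about the label rank highest.
    for kw in ["toddlers", "abc", "secure", "game"]:
        print(kw, round(conditional_entropy(kw, docs), 3))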

PIXALATE’S MANUAL CHILD DIRECTED RISK ASSESSMENT METHODOLOGY

Pixalate has formed a Trust and Safety Advisory Board, helmed by a former FTC enforcer and composed of qualified educators, to review and assess whether apps are child-directed. The educators on Pixalate’s Trust and Safety Advisory Board make assessments of apps based on the child-directed factors outlined in the COPPA Rule discussed above. App review is an ongoing process. Pixalate prioritizes apps for review based on the popularity of the app measured by the number of downloads in Google Play and the number of reviews in Apple’s App store.

PIXALATE’S OVERALL COPPA RISK ASSESSMENT METHODOLOGY

Pixalate approaches COPPA as a risk factor. Pixalate’s assessment of COPPA risk does not provide a legal conclusion regarding whether an app is “directed to children” under COPPA. Pixalate analyzes multiple signals and produces a risk score (low, medium, high or critical) that captures the likelihood that a given app is a potential COPPA risk. In order to do so, the following signals are used:

  1. Audience: Is the app likely directed to children under 13?
  2. Privacy Policy: Does the app have a detectable privacy policy? 
  3. Sensitive Data Permissions: Does the app require permissions that could allow for the collection, use or disclosure of personal information from children?
  4. Passes Residential IP: Does the app pass residential IP in the bidstream? 
  5. Passes Location Information: Does the app expose GPS coordinates that correspond to granular information about the user’s location in the bidstream? 

Because Pixalate continuously monitors these signals, a mobile app’s COPPA Overall Risk Assessment rating can change between low, medium, high and critical over time.

Audience

See above: PIXALATE’S AUTOMATED CHILD DIRECTED RISK ASSESSMENT METHODOLOGY and PIXALATE’S MANUAL CHILD DIRECTED RISK ASSESSMENT METHODOLOGY. 

Privacy Policy

Operators that are covered by the COPPA Rule are required to post a clear and comprehensive online privacy policy describing their information practices for personal information collected online from children. Pixalate deems the lack of an identifiable privacy policy for a child-directed app to be a critical COPPA risk factor because it is a violation of COPPA for operators of websites and online services that collect, use, or disclose personal information from children to fail to post a privacy policy online. Additionally, the FTC recommends that all websites and online services, particularly those directed to children, post privacy policies online so visitors can easily learn about the operator’s information practices. Pixalate determines whether an app has a privacy policy based on information provided in the app stores. Additionally, Pixalate uses crawlers to scan developer websites for privacy policies. An app will be flagged as not having a privacy policy if one is not detected by either of these two methods.
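
The Python sketch below illustrates the two detection paths in simplified form: checking the store listing for a developer-supplied privacy policy URL, and scanning the developer's website for a privacy policy reference. The listing field name, matching pattern, and single-page crawl are assumptions rather than a description of Pixalate's crawlers.

    # Hypothetical privacy policy detection: store listing metadata first, then a
    # simple scan of the developer's homepage for a privacy-policy reference.

    import re
    import urllib.request

    POLICY_PATTERN = re.compile(r"privacy[\s_-]*policy", re.IGNORECASE)

    def policy_in_store_listing(listing: dict) -> bool:
        """App stores expose a developer-supplied privacy policy URL in the listing."""
        return bool(listing.get("privacy_policy_url"))

    def policy_on_developer_site(url: str) -> bool:
        """Fetch the developer homepage and look for a privacy policy mention or link."""
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            return False
        return bool(POLICY_PATTERN.search(html))

    def has_detectable_privacy_policy(listing: dict, developer_url: str) -> bool:
        return policy_in_store_listing(listing) or policy_on_developer_site(developer_url)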

Sensitive Data Permissions

Photographs, videos, or audio files where such files contain a child’s image or voice are personal information under the COPPA Rule. Geolocation sufficient to identify street name and name of a city or town is also personal information under the Rule. Mobile apps request access to certain device permissions in order to operate, such as access to the device’s camera, microphone, or geolocation. In some cases, not all of the permissions that are requested are used. However, the fact that access to certain permissions has been requested creates additional risk, since the permissions can be used at any time in the future. Pixalate has classified the most common mobile app permissions in terms of their COPPA risk, i.e., the risk of exposing personal information. The permissions that Pixalate deems to be sensitive permissions are shown in the Knowledge Base.
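
For illustration, the Python sketch below flags an app whose declared permissions include any of a small, assumed subset of sensitive Android permissions; Pixalate's full classification is listed in the Knowledge Base.

    # Hypothetical check against a subset of standard Android permission names
    # that could expose a child's image, voice, or precise location.

    SENSITIVE_PERMISSIONS = {
        "android.permission.CAMERA",
        "android.permission.RECORD_AUDIO",
        "android.permission.ACCESS_FINE_LOCATION",
        "android.permission.ACCESS_COARSE_LOCATION",
    }

    def has_sensitive_permissions(declared_permissions) -> bool:
        """Return True if any declared permission is in the sensitive set."""
        return bool(SENSITIVE_PERMISSIONS & set(declared_permissions))

    print(has_sensitive_permissions(["android.permission.CAMERA", "android.permission.INTERNET"]))  # True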

Residential IP Passed in the Advertising Bidstream

A residential IP address is assigned by an ISP to a homeowner and is associated with a single owner and location. IP addresses are persistent identifiers that fall under the definition of personal information in the COPPA Rule. Accordingly, Pixalate deems passing residential IP information to be a COPPA risk factor. Pixalate examines the traffic associated with an app and determines if the end-user IP is transmitted through the advertising pipeline, which exposes granular information about the user’s location. If the IP is passed, it is categorized based on network type, for example: Cable/DSL, Cellular Tower, etc. Cable/DSL IPs are residential IPs which can be reverse geocoded to expose the location of the user. If the majority of the traffic from an app exposes such Cable/DSL IPs, then Pixalate flags it as passing residential IP traffic in the bidstream. Pixalate interprets the passing of a Residential IP in the bidstream as a COPPA risk because advertising can be targeted using this information. However, if the last octet of the Residential IP address has been truncated in the programmatic bidstream, it is no longer considered personal information under COPPA. Accordingly, Pixalate does not interpret the passing of a truncated Residential IP in the bidstream as a COPPA risk.
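
The Python sketch below illustrates this kind of check. The network-type lookup is a placeholder standing in for an IP intelligence database, the majority threshold is an assumption, and the last-octet test is a simplification.

    # Hypothetical residential-IP check for an app's bidstream traffic.

    def truncate_last_octet(ip: str) -> str:
        """Zero the last octet of an IPv4 address (e.g., 203.0.113.42 -> 203.0.113.0)."""
        octets = ip.split(".")
        octets[-1] = "0"
        return ".".join(octets)

    def is_truncated(ip: str) -> bool:
        return ip.endswith(".0")

    def passes_residential_ip(bidstream_ips, network_type_lookup, threshold=0.5):
        """Flag the app if the majority of observed IPs are untruncated Cable/DSL addresses."""
        residential = [
            ip for ip in bidstream_ips
            if network_type_lookup(ip) == "Cable/DSL" and not is_truncated(ip)
        ]
        return len(residential) / max(len(bidstream_ips), 1) > threshold

    # Example with a stubbed lookup that labels every address as Cable/DSL.
    ips = ["203.0.113.42", "203.0.113.77", "198.51.100.0"]
    print(passes_residential_ip(ips, lambda ip: "Cable/DSL"))  # True: 2 of 3 are untruncated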

Geolocation Information Passed in the Advertising Bidstream

Geolocation information sufficient to identify street name and name of a city or town is personal information under the COPPA Rule. Pixalate deems passing geolocation information to be a COPPA risk factor. Pixalate examines the traffic associated with an app and determines if the end-users’ GPS coordinates are being transmitted through the advertising pipeline, which exposes granular information about the users’ locations. If traffic from an app exposes such GPS location data, then it is flagged as passing location information.
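
As a rough illustration, the Python sketch below uses coordinate precision as a proxy for how granular the exposed location is; the decimal-place threshold is an assumption, not Pixalate's actual test.

    # Hypothetical precision check: roughly, 2 decimal places of latitude/longitude
    # corresponds to about 1 km, and 3 or more decimal places to about 100 m or finer.

    def decimal_places(value: float) -> int:
        text = f"{value}"
        return len(text.split(".")[1]) if "." in text else 0

    def exposes_precise_location(lat: float, lon: float, min_decimals: int = 3) -> bool:
        """Treat coordinates with 3+ decimal places as granular, street-level location."""
        return decimal_places(lat) >= min_decimals and decimal_places(lon) >= min_decimals

    print(exposes_precise_location(40.7128, -74.0060))  # True: street-level precision
    print(exposes_precise_location(40.7, -74.0))        # False: city-level only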

Risk Scores: Low, Medium, High and Critical

  • If Pixalate’s assessment is that an app is likely “general audience”, that app is assigned a low COPPA risk score. 
  • If Pixalate’s assessment is that an app is likely “child-directed”, that app is assigned either a medium, high or critical risk score. 
    • Child-directed apps that have a privacy policy and do not pass residential IP or geolocation information in the bidstream are assigned a medium risk score regardless of whether they have sensitive permissions. 
    • Child-directed apps that have sensitive permissions and pass residential IP and/or geolocation are assigned a high risk score, even if a privacy policy is detected. This is because there is a risk that the app is collecting, using, or disclosing personal information from children without obtaining verifiable parental consent as required by COPPA. Pixalate’s automated assessment does not verify whether an app has the ability to obtain verifiable parental consent, so Pixalate conservatively assumes that it does not.
    • Child-directed apps that do not have sensitive data permissions but pass residential IP and/or geolocation are also assigned a high risk score, even if a privacy policy is detected, for the same reason.
    • Child-directed apps that have no detectable privacy policy are assigned a critical risk score regardless of whether they pass residential IP or location in the bidstream and regardless of the app’s sensitive permissions. This is because it is a violation of COPPA for operators of websites and online services that collect, use, or disclose personal information from children to fail to post a privacy policy online. 

 

The table below details the impact of various signal combinations on the COPPA risk score. 
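
For readers who prefer pseudocode, the risk tiers described above can be summarized as the following illustrative Python sketch; it mirrors the bullet points and is not Pixalate's production logic.

    # Illustrative encoding of the risk tiers: general audience -> low;
    # child-directed with no detectable privacy policy -> critical; child-directed
    # passing residential IP or granular geolocation -> high; otherwise -> medium.

    def coppa_risk_score(child_directed: bool,
                         has_privacy_policy: bool,
                         sensitive_permissions: bool,
                         passes_residential_ip: bool,
                         passes_geolocation: bool) -> str:
        if not child_directed:
            return "low"
        if not has_privacy_policy:
            return "critical"
        if passes_residential_ip or passes_geolocation:
            # High regardless of whether sensitive permissions are also detected.
            return "high"
        return "medium"

    # Example: child-directed app with a privacy policy that passes residential IP.
    print(coppa_risk_score(True, True, False, True, False))  # "high"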

DISCLAIMER

Pixalate’s COPPA Compliance Tools render opinions that Pixalate believes may be useful to our clients and others in the digital media industry. It is important to note, however, that the mere fact that an app appears to be directed to children (e.g., data subjects under 13 years of age, as defined by the COPPA Rule) does not mean that any such app, or its operator, is failing to comply with the COPPA Rule. Further, with respect to apps that appear to be child-directed and have characteristics that, in Pixalate’s opinion, may trigger related privacy obligations and/or risk, such assertions reflect Pixalate’s opinions (i.e., they are neither facts nor guarantees); and, although Pixalate’s methodologies used to render such opinions are derived from automated processing and at times coupled with human intervention, no assurances can be – or are – given by Pixalate with respect to the accuracy of any such opinions.
