FTC Hosts Workshop on Kids' Safety on the Internet

BakerHostetler
We love hearing Commissioner Holyoak speak and were particularly heartened when she criticized the Khan FTC’s use of charged headlines in press releases and speeches, including its habit of calling targeted advertising “surveillance marketing.” She has said things like “[p]erhaps this re-branding is just silly—an attempt to boost press appeal, pander to the likeminded, score some political points—and basically harmless. But I fear that the silliness belies something more troubling: a glossing over—and perhaps even a degree of prejudgment—of difficult issues.” And, “[i]n my view, we should be careful to use neutral terminology that does not suggest any prejudgment of difficult issues.” Amen. We at AD-Ventures in Law did not like the coining of nefarious-sounding labels for practices or the press release headlines vilifying settling companies.

The workshop the FTC held today to explore harms from Internet use by those under the age of 18 was titled “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families.” The workshop page promised that “the event will bring together parents, child safety experts, and government leaders to discuss how Big Tech companies impose addictive design features, erode parental authority, and fail to protect children from exposure to harmful content.” The Cato Institute published an interesting piece yesterday noting that the original workshop was titled in a more neutral way.

Prior to today’s workshop, the Khan FTC held a workshop focused on Internet advertising to kids and issued a staff report; both were called “Protecting Kids from Stealth Advertising in Digital Media.” We previously blogged on the staff report. The prior workshop had invited participation from industry, including Google. Today’s workshop had no invited industry participation, but lots of star power nonetheless. Chairman Ferguson and Commissioners Holyoak and Meador all participated, as did the Director of the Bureau of Consumer Protection, Chris Mufarrige, members of Congress and a host of child and parent advocates.

What were the highlights? Ferguson opened the event, noting that protecting children is a top priority for the FTC and asking Congress to pass stricter laws in this area, while adding that the FTC will use its enforcement and rulemaking authority to help in the effort. Sen. Blackburn spoke next, encouraging passage of the bill she has sponsored, the Kids Online Safety Act, or KOSA, which would require apps to default to the highest safety and privacy settings for kid users.

The first panel set the table, responding to the rhetorical question posed by the workshop: “Are Kids in Danger Online?” The panel laid out the worst harms that affect kids, from cyberbullying to sexual grooming. The panelists asserted that parental controls are not set automatically when a child’s age is entered and that, on most websites, the controls are difficult to find and set and do not provide protection even when turned on. The panel concluded that if platforms are telling parents their parental controls are robust when they are not, that may be an act of false advertising best addressed through consumer protection enforcement.

The second panel addressed how the FTC can protect kids online. The panelists urged passage of new laws and an update to COPPA to provide protections beyond privacy, including with respect to how schools use data and encourage the use of technology. The FTC’s chief technology officer, Jake Denton, articulated his view that targeted advertising is at the root of many of the safety problems. In his view, advertising revenue informs every decision platforms make to keep users online as long as possible so they can gather more data for more effective individualized advertising, including from the very lucrative 13-to-18-year-old demographic. Push notifications, likes and infinite scroll effectively require minors to stay online all day because the platforms have become their entire social structure, and that addictive quality leaves kids exposed to predatory acts simply because of the amount of time they spend online. Asked what the FTC should do with its current enforcement tools, the panel noted the FTC has enforcement authority under the TAKE IT DOWN Act, which requires online platforms to remove nonconsensual intimate imagery. The act requires platforms to set up a process to effectuate takedowns, and the FTC will monitor for compliance. The panel also suggested the FTC bring enforcement actions, on either an unfairness or deception theory, against platforms whose parental controls are difficult to access.

The third panel discussed age verification legislation that would require users to provide proof of identification to access pornography and other adult-oriented websites. Twenty-three states have passed such laws, but First Amendment challenges to those laws are currently unfolding.

Commissioner Holyoak’s remarks were a terrific summary of the day. She reiterated how important parental controls are but stressed that they must be easy to find and implement as well as effective at protecting kids online. She indicated that the surveys she has seen show very few parents use these tools, which suggests the tools are difficult to find and understand. She emphatically said that the default setting for any child user should preclude communication with others and encouraged app and website owners to act voluntarily now to implement this. She summarized recent FTC enforcement and consumer education activities. And she said generative AI chatbots that function as companions should be investigated by the FTC under its 6(b) authority to better understand whether Congress should act to ban minors’ access to these chatbots.

While much of the event expressed a consensus view that parents should not be solely or even primarily responsible for protecting their children online, the fourth panel discussed some strategies for families, largely advocating that parents should feel comfortable limiting or banning screen time for their children and teens.

So, what is next? Reading the tea leaves suggests there will be a staff report of some sort, but there is real tension between this FTC’s focus on enforcing laws as written, not as the agency wishes they were written, and what the law in this area actually prohibits. COPPA is not the cure-all here. It will be interesting to see whether the FTC begins looking at what platforms and apps say about their parental controls to fashion some sort of deception case if the controls are hard to access or do not protect as promised. Any platform should revisit its parental control policies and consider whether proactive voluntary changes may be in order.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© BakerHostetler
