Instagram's Algorithm Recommended Minors to Putative Pedophiles
In Week 4 of the FTC v. Meta Platforms antitrust trial, shocking statistics about Meta’s failure to address known security risks, plus one more look at WhatsApp.
Day 14 of the FTC’s trial to break up Meta had some of its most explosive moments with Guy Rosen, Meta’s Chief Information Security Officer, on the stand. We start there, before circling back to Day 13. And then, a note about some of the public access issues we’re facing in court. We’ll cover Day 15 (May 7) in a future post.
“An Indirect Tax on the FB App”
Guy Rosen joined Meta in 2013 when the company acquired Onavo, a mobile analytics firm Rosen co-founded. Public reports showed how Meta used data from Onavo to help “clone” Snapchat’s functionality, eventually resulting in Stories. According to Meta, Rosen, as Meta’s Chief Information Security Officer, has led “an overhaul of the company’s approach to fighting abuse on the platform and built our industry-leading Integrity teams that pioneered approaches to complex challenges such as harmful content, election readiness and adversarial threats.” Basically, Rosen leads the company’s product and safety integrity efforts.
The FTC’s point in calling Guy Rosen was to show that Meta under-invested in Instagram relative to Facebook, which goes to Meta’s claimed “procompetitive” justification that acquiring Instagram helped the app grow and that integration into Meta’s ecosystem improved safety and security efforts. The FTC wants to show that this procompetitive benefit is pretextual—and therefore doesn’t count—by highlighting all the ways in which Meta didn’t want to devote resources to Instagram. Chief Judge Boasberg said there was an “awful lot of detail” on that point, which he thought was “ancillary.” We first reported last week that testimony was getting increasingly cumulative and repetitive and that the court shared that concern.
The FTC’s direct of Rosen set this up with a shot and chaser. First, it established, with some more Zuckerberg emails, that Facebook was throttling the allocation of resources to Instagram. It’s a point we’ve heard before from Instagram co-founder Kevin Systrom himself. A 2018 email summarized “Mark’s thoughts on family”—the app family, that is (emphasis added):
“High level direction: For our mission and business, we need to make sure the FB app is growing sustainably. We need to prioritize addressing this across the company over the next 2 years . . . . The top priorities will likely be some combination of stories + MSI, perf + reliability, and integrity. Any integrity focus will likely be based on the people we’ve already transferred to work on it though . . . . Of course we need to address basic integrity risks across Instagram, WhatsApp, and Messenger as well . . . . The direction is that the primary focus from the central team should be on the FB app, but not the exclusive focus . . . . . One specific thing I worry about is that supporting integrity in other apps shouldn’t be an indirect tax on the FB app.”
Rosen, for his part, flagged in 2019 that Instagram still wasn’t getting adequate staffing:
“Currently, IG is understaffed as an app-surface when compared to Facebook, and central problem teams have not been able to prioritize Instagram parity . . . Ultimately we need a strong app surface team on Instagram to integrate with central teams to achieve parity.”
Rosen wanted at least 149 more employees allocated to Instagram for that, and it’s not clear that he got everyone he wanted. In the 2017-2018 period, Meta bought back $15 billion worth of stock, boosting its share price, while Instagram was hurting for investment.
Why was there a pressing need for more staffing on Instagram, even years after the acquisition closed? We saw one reason raised by Instagram head Adam Mosseri:
“Harmful behavior (e.g. grooming especially) . . . this really worries me given . . . IG’s younger audience. I bet we’ll find we have work to do there.”
There were some other statements in emails along those lines, although I didn’t catch the to/from in my handwritten notes when the media room was closed (emphasis added):
“He had me come over to dispel some myths/worries that IGWB team had about CI, that we don’t really care etc.”
“Since IG hasn’t done much on harmful content . . .”
“Kevin S is still worried that CI has a strong Blue bias.”
2019 email from Brady Lauback, Head of Data Analytics: “It’s unclear to me if we ever funded Instagram integrity.” Rosen’s reply: “You are correct, there is a growing realization this is underfunded. This was deliberate - I explicitly had the convo with Mark at HC planning and he thought IG has another year or two to catch up. I think we are not sure that is the case any more.”
Meta did have more work to do on “child grooming,” as we saw in a June 2019 deck titled “Inappropriate Interactions with Children on Instagram.” An early page called out that “IG recommended a minor through top suggested to an account engaged in groomer-esque behavior.” Grooming refers generally to the tactics a child predator uses to gain the trust of potential victims in order to sexually abuse them. Subsequent pages gave some broader data: “27% of all follow recommendations to groomers were minors.” There’s a lot we don’t know about this statement: How did Meta identify accounts as “groomers” or “engaged in groomer-esque behavior”? Why were those accounts allowed at all? How was the statistic generated? It’s also important to caveat that Meta may not have known whether any potential groomers were actual criminals. But by any measure, the headline is troubling.
There was more data than that. 33% of all Instagram comments reported to Meta as inappropriate were reported by minors, the deck said. Of the comments reported by minors, more than half were left by an adult. “Overall IG: 7% of all follow recommendations to adults were minors,” the deck concluded.
The presentation also noted that during a “3-month period”—presumably in 2019—2 million minors were recommended by Instagram’s algorithm for groomers to follow. 22% of those recommendations resulted in a follow request from a groomer to a minor. Doing some back-of-the-envelope math (and assuming roughly one recommendation per minor), that’s approximately 440,000 minors over just a three-month period who received a follow request from someone Meta labeled a “groomer.” That number is shocking even before being annualized.
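For those checking the math, the sketch looks like this (the annualized figure assumes the quarterly rate held constant, which the deck did not say):

$$0.22 \times 2{,}000{,}000 = 440{,}000 \text{ follow requests in one quarter}$$

$$440{,}000 \times 4 \approx 1.76 \text{ million per year}$$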
Rosen said Meta takes the grooming issue seriously and has made strides toward addressing it since then. On cross, he said there are now 40,000 employees working on “integrity” (a catchall word for security; more on that definition below), and Meta has invested $30 billion in integrity over the past decade. We also saw on cross a 2018 email from Zuckerberg asking Rosen how many more employees Rosen might need for integrity outside the normal budgeting process; Rosen’s request for more bodies was granted. Meta also developed software security “tools” with code names like “Sigma,” “Karma,” and “Black Hole.” And Sigma has recently been updated and renamed “Omega.” For what it’s worth, we saw data on cross showing that at least “adult nudity and sexual activity” content (which is forbidden under Meta’s terms) had a low prevalence on Meta’s apps—around one-half of one percent of all content.
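A note on that metric: as we understand Meta’s public enforcement reports to define it, “prevalence” is a views-based estimate rather than a simple count of posts, roughly

$$\text{prevalence} = \frac{\text{estimated views of violating content}}{\text{total sampled content views}}$$

so, assuming the figure shown in court tracks that definition, even one-half of one percent translates into an enormous absolute number of views at Meta’s scale.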
Chief Judge Boasberg interrupted cross, too, asking if Meta wasn’t “falling into” the FTC’s “trap” by getting too into the weeds on this “ancillary” point.
Direct, cross, and re-direct also had us looking at a heat map of 20+ different integrity issues, with color shading (reproduced for us in black and white) scoring the risk level to Meta’s apps on a 0-1-2 scale. In the first half of 2017, 22 of 27 categories were at 0, the worst score, including “suicide and self-injury.” Some improved to level 1 and then to level 2 over time.
The Real McCoy
The FTC’s next witness was an expert: Damon McCoy, a professor of computer science at NYU. A colorful witness with dyed purple hair and a bolo tie, McCoy was assigned to assess whether Meta’s acquisitions of Instagram and WhatsApp were necessary to achieve the claimed “integrity” benefits and whether those benefits were actually achieved. “Integrity” is a “broad field that focuses on establishing trust” with users and advertisers, and on “making the platform seem safe” by protecting against “harmful, objectionable content and behavior,” McCoy explained.
McCoy concluded that the acquisitions weren’t necessary to help WhatsApp and Instagram address integrity. He didn’t find any quantitative metrics in studying these issues, so his testimony boiled down to regurgitating emails and deposition testimony from other witnesses. We flagged this issue with a previous FTC expert: typically, just rehashing fact testimony and documents isn’t a proper expert opinion.
The direct was helpful, though, in that McCoy had slides putting evidence we’d already seen into a timeline, focusing on the 2014 and 2018 staffing issues with Instagram. Pre-acquisition, Instagram had been using a third-party vendor called Impermium (which Google ended up acquiring in 2014) for help addressing spam. And Instagram had “reached out” externally to integrate something called a “photo DNA” tool, which can compare an image’s “hash” (a digital fingerprint computed from the file’s contents) to hashes of known images of child porn. At points, though, McCoy started to sound like a defense witness, admitting that Instagram was able to plug into Facebook resources after the acquisition—like Facebook’s photo DNA tool. At one point, Chief Judge Boasberg agreed, saying that “it seems to me that what it says there seems to contradict what you just said.”
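To make the mechanics a little more concrete, here is a minimal sketch of hash matching in Python. The `fingerprint` helper and `KNOWN_BAD_HASHES` set are hypothetical names for illustration, and the exact-match lookup is a simplification: real perceptual-hash systems like PhotoDNA compare fingerprints within a similarity threshold so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images, of the
# kind maintained by clearinghouses. (Placeholder value for illustration.)
KNOWN_BAD_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint: a SHA-256 of the raw file bytes.

    A real perceptual hash (like PhotoDNA) is computed from the image's
    visual content, so it survives resizing and re-encoding; a
    cryptographic hash like this one matches only byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_bad(image_bytes: bytes) -> bool:
    # Exact-match lookup; production systems compare perceptual hashes
    # within a distance threshold rather than requiring equality.
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```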
And the direct also flagged some documents we hadn’t seen before, with statements like these:
“I should call out though that we’re facing more extreme issues on FB right now with the murder, bad activity in private groups, etc.”
From Kevin Systrom: “I don’t think Mark understands the urgency of working on integrity related issues at IG. Do you guys have anecdotes and links I can send? I’m assuming the child killing himself on live is an important one, but I think there were others (a suicide?)”
This likely referenced an accident in which a 13-year-old playing with a gun on Instagram Live fatally shot himself.
June 2017 email to Mosseri: “I think IG is underinvested in integrity relative to its scale and importance to the business. I’m [not] sure I can help you pull headcount out of a hat.”
The risk to minors from groomers didn’t go away. McCoy showed a Meta document stating that “~1 in 100 IG teen users in the US are exposed to IIC convos (child grooming) with adults every day.”
On the first part of cross, Meta’s counsel pointed out that McCoy hadn’t benchmarked an industry-standard level of harmful content or of the tools to address it, which left a key question unanswered: was Meta actually behind its peers? McCoy was also impeached with his prior testimony, and he hadn’t considered some potentially unhelpful statements in the materials he relied on.
In all, McCoy didn’t move the needle much either way. The theme of child exploitation through Meta’s apps runs the risk of cutting the other way for the FTC. Sure, it’s possible Meta underinvested there. But would Instagram have been better at addressing these issues on its own? While the FTC did a lot to show that Meta was deliberately underinvesting in safety on Instagram, many witnesses have had to admit that Instagram got further, faster because it could use Meta’s tools and team.
That was Day 14. Now, let’s turn back to Day 13.
(Don’t) Shoot the Messenger
Day 13 started and finished with Peter Deng, who spent a decade at Meta. Deng started as the sole product manager on the Facebook Messenger team before eventually moving into a similar role for Instagram, just before Facebook bought WhatsApp. For the unfamiliar, tech companies love these “project” or “product” manager roles: usually a non-technical, managerial position in charge of a particular product or feature, though it helps to have some engineering background. Back in 2012, Deng publicly explained how consumers using messaging services might want to branch out into using a social networking platform.
The FTC’s direct of Deng focused on Meta’s motivations for acquiring WhatsApp. To get up to speed on what we’ve seen on the WhatsApp story so far at trial, check out our summaries of Days 2, 5, 8, and 10.
While some good documents highlighted below support the FTC’s case that Meta took over WhatsApp to take out a putative competitor, Deng consistently testified on direct and cross that Meta saw no indication (and had no concern) that WhatsApp was or would be expanding into social networking features, in contrast to other chat apps like Kakao and LINE.
Some hits from the documents shown during Deng’s testimony (emphases added):
Message to Deng in a 2012 chain: “Once we destroy WhatsApp, I can't believe they are beating us, then we can focus on other stuff.” In a follow-up: “BTW, if WhatsApp falls into Google’s hands, that would be very, very bad.”
An April 1, 2012 email from Mark Zuckerberg: “I've been thinking a lot about mobile messaging, WhatsApp, and how we should approach this since we started this discussion a couple of weeks ago. . . . Over time they will also need to figure out how to make more money per user which will either mean they need to charge more, add ads, or sell services that we could potentially give away [for] free. . . . Right now, aside from Facebook integration, WhatsApp is legitimately a better product for mobile messaging than even [our] standalone Messenger app. It's more reliable and faster for sending messages. You get better signal and feedback via read receipts and last seen times.”
Deck from August 2012: “Overview: The Threat of Messaging Apps”: LINE, KakaoTalk, and WeChat “[p]ose a tremendous threat beyond messaging.” WhatsApp was listed as a competitor, but not included in the line about posing a “tremendous threat beyond messaging.”
An October 2012 email from Deng: “We are facing a huge threat with messaging competitors . . . . As I told the messages team at our all hands yesterday, this is the biggest threat to our product that I've seen in my five years here at Facebook. It's bigger than G+, and we are all terrified.”
March 22, 2013 email from Zuckerberg: “Being the best messaging service is the biggest opportunity and mitigation to the biggest threat we face today. . . . The space keeps heating up, including Google about to get into this in a big way, LINE with more than 1,000 engineers just focused on their app, and rumors of WhatsApp expanding into more services as well.”
On cross, Deng reiterated that WhatsApp wasn’t seen as a competitive threat to Facebook Blue (the lingo referring to the classic Facebook site or app) or Facebook Messenger. Meta’s counsel also covered Deng’s time with Instagram, and Deng painted a picture that countered Systrom’s view of how much Meta supported the app after acquiring it. Instagram’s monetization growth “would have been harder and it would have been slower” without Facebook’s help, he testified. Deng didn’t recall whether Facebook had pulled all of the growth team from Instagram in 2014, and instead remembered them “literally mov[ing] the desks” from across the walkway to sit next to Deng and the Instagram team. Facebook also helped Instagram launch a bunch of features like videos, new filters, “separate creative apps,” direct messaging, and push notifications, Deng said.
On re-direct, the FTC highlighted messages between Systrom and Deng in which they discussed Instagram having less “growth support,” which Deng said “wasn’t great,” was “pretty disruptive,” and “was definitely in the wrong direction.” Their Meta growth contact didn’t have “that much skin [in] the IG game,” while Deng and Systrom did.
Deng stepped down, and then a surprise: court ended for the day early, around 11:30 a.m. Read on to find out why.
Public Access to Public Trials
The U.S. District Court for the District of Columbia (the federal court where the trial is happening) has, for the first few weeks of trial, done a great job accommodating the media attending “Antitrust Woodstock”—the simultaneous Meta trial and Google Search remedies hearing, along with active days in other litigation concerning the Trump administration, including lawsuits by Big Law firms targeted by executive orders.
But heading into Week 4, things are becoming more restricted, in two ways.
The first is media room access. The court has made “media rooms” available—usually spare conference rooms in the courthouse, but sometimes an unused courtroom—where the press can sit with electronic devices to watch and listen to a video and audio feed of the trial, usually with 3-4 camera angles. One of those angles is on the witness, and another on whatever exhibits get displayed in court. It’s a lot easier for folks to cover the trial in a timely way by typing what we see, rather than recreating the trial from handwritten notes later. And in some ways, “supply creates its own demand”: creating a dedicated press space for this trial has had a positive effect on the number of outlets covering it.
Industry publications like the Global Competition Review, MLex, and the Capitol Forum have been here daily, usually joined by national outlets like The New York Times, Bloomberg, and The Verge, with occasional stops on the beat from Reuters, The Wall Street Journal, The Washington Post and others.
But on Day 14, a media room wasn’t available for the first couple hours of trial. The court eventually opened one up, but reporters were shuttling to and from the courtroom trying to find out whether it was open. This follows a general trend of the media room for this trial changing location without notice or signage. It would be better to keep things in one room if possible. A related issue: when the court is sealed for confidential testimony—and the confidentiality concerns so far have been wildly overblown, in my view—the court does not always restore the feed to the media room when unsealed testimony resumes, leaving the press sitting in a dark room while it’s lights, cameras, action in court.
These might seem like minor gripes coming from someone covering the trial, but there’s a second limitation that’s more impactful: the court’s decision to take deposition videos under submission and watch them privately in chambers, which comes at the public’s expense. We reported that at the end of Week 3, Chief Judge Boasberg invited the parties to propose ways to streamline the rest of trial. Over the weekend, the parties took him up on that invitation with a joint status report proposing that deposition videos be watched in chambers instead of played in open court. On Day 13 (this Monday, May 5), Chief Judge Boasberg agreed. Because the FTC had planned to play deposition videos for the entire post-lunch trial period on Day 13, Chief Judge Boasberg adjourned, cutting the trial day in half.
So why does this matter? It’s great that trial will move along more briskly, but the press and the public didn’t get to see the depositions or the transcripts of those depositions, and presumably we’re not invited to hang out in chambers with the Chief and his clerks with popcorn to watch them. It’s anyone’s guess what the testimony said or how it was said. Full-time, professional journalists at the Times and Bloomberg have led the way in liaising with the FTC to make those transcripts and videos available. Those discussions are ongoing, and we’ll report on the missing witnesses when we have access to those materials.
There’s a simple fix for this: just put the transcripts and video clips in the public repository for exhibits. It sounded like there’s some concern somewhere in the chain of interested parties about posting the videos themselves, which would let the public cut clips of them and do what they want with them. And typically, courts prohibit live streaming from court. But deposition videos are a different thing: as this blog has pointed out, the excerpts from Bill Gates’s deposition in the DOJ’s antitrust suit against Microsoft are essential viewing. And it seems like a light lift, given that material portions of testimony in this trial have already taken place under seal.
It’s been an axiom of the common law for nearly three centuries that “the public has a right to every man’s evidence.” Though the statement more typically applies to questions about whether a subpoena is enforceable, for my money it means more than just that. It’s a clarion call for public access. The law is settled that at least in criminal cases, the First and Sixth Amendments to the U.S. Constitution provide for a public trial. Publicity is one check against abuse of power or the proceedings. Those concerns matter just as much, if not more, in a trial to break up one of the country’s biggest companies, of which hundreds of millions of our fellow Americans are customers. It is not asking too much to insist that, if every man’s evidence will not be played in open court, it will at least be open to the press and the public.
Stray Thoughts
Matt Stoller covered Mark Zuckerberg’s interview last week about how Meta AI will give you holographic friends and AI therapists. And this blog covered the report that Meta’s AI bots can make sexually suggestive comments to minors. It’s worth revisiting those stories in light of the shocking evidence about the breadth of the child grooming problem on Meta’s apps.
If FTC Commissioner Melissa Holyoak has carved out any signature issue, it’s the harm to children posed by online platforms. As Utah’s Solicitor General, she was part of a lawsuit against TikTok along those lines, and she has made several public statements since joining the FTC focused on the privacy of children’s data.