Instagram is harmful for users, especially young women, according to research conducted by Facebook.
Facebook owns Instagram and recently conducted studies to determine how the platform affects its users. The data reveals that 32% of teen girls who reported feeling bad about their bodies only felt worse after looking at Instagram.
What is Instagram doing with that data?
“Nothing,” says Kara Frederick, a Heritage Foundation research fellow in technology policy. “What I think is particularly galling is the fact that they are still moving forward with plans to create an Instagram for children under 13 years old,” she adds.
Frederick, who is also a former employee of Facebook and helped to create and lead Facebook’s global-security counterterrorism-analysis program, joins the “Problematic Women” podcast to discuss what we now know about Instagram’s effect on users, especially young women, and what the platform should do to mitigate further harm.
Frederick also shares a little of her own journey into the field of Big Tech and cybersecurity.
Listen to the podcast below or read the lightly edited transcript.
Virginia Allen: I am so excited to welcome to “Problematic Women” Heritage Foundation research fellow in technology policy Kara Frederick. Kara, thanks for being here.
Kara Frederick: Of course. Anytime, Virginia.
Allen: Kara, you work in the field of technology policy. You’re a regular on Fox News, talking about Big Tech issues. You actually helped to create and lead Facebook’s global security counterterrorism analysis program. You spent six years as a counterterrorism analyst at the Department of Defense. And that’s just a highlight of your resume. I could keep going and going. So tell me a little bit about how you got into this field. Tech policy is such a unique field. What drew you to it?
Frederick: Yeah, well, it was purely accidental, as are most good things in life, I like to think. My father was in the Marine Corps for three decades, so I always wanted to follow him into the Marine Corps. I was just fascinated with national security and how America projects power in the world. I figured I’d do something along those lines. And when I got out of grad school, I ended up going into a three-letter agency not that many people had heard of: the Defense Intelligence Agency.
And it was described to me as the “redheaded stepchild” of the intel community, because nobody really knows [about it]. The CIA writes for the president. At DIA, we wrote and analyzed for the war fighter. So, definitely not as sexy. We tried to [juice] it up a little bit.
I found myself there, and I was really interested in human intelligence. I didn’t care for the bits and the bots and the bytes, and all the crazy ones and zeros. I was kind of a Luddite when it came to those kinds of things.
I like to be athletic. I played soccer in college and afterwards. And so I wanted to be the cool girl that you saw in the movies. DIA, not really like that. I got the opportunity to do a rotation at the National Security Agency, and I even told my boss, “If you make me do this, I will quit.” I don’t want to do signals intelligence. I don’t want to do all [that] computer stuff. I’m not interested. I ended up going to Fort Meade anyway, and I spent a year there on my first rotation, and I was fascinated.
I was like, “Wow!” The ability to see the fruits of your labor when you’re looking at a target. I was a target developer at DIA and later a targeter when I worked for a naval special-warfare development group, a smaller command. I was just fascinated by the ability to get battlespace effects using your computer, actually doing analysis alongside the technical work, and gaining a lot of technical proficiency at the tip of the intel spear.
I ended up doing two rotations over two years at the National Security Agency, and that really whetted my appetite for how to use technology to get actual real-world effects. That led to three deployments in Afghanistan using technology the entire time. I was the nerd plugging away in the closet in the back.
Allen: That’s like a movie, Kara.
Frederick: It was not an interesting one. The other guys were doing the cool stuff, but I was the one in the back helping orchestrate some of the things that they did, which was great. And the same thing: You saw who was actually pulling the strings in the intel community. And let me tell you, it was those people behind the computers doing some of those interesting things. I did that for a while. I always told people I would still be at the National Security Agency if it were anywhere but Fort Meade, Maryland.
It’s not my favorite place to drive to, but I was approached by somebody I had worked with who was now working at a little place called Facebook in Menlo Park, California. And they said, “We need people who can do social network analysis, digital analysis, look at the digital behavior of terrorists and other sorts of bad guys on these platforms. So would you come over to Facebook?”
No, no, no, definitely don’t want to do that. Not my jam. I’m really happy here. At that time, I was working for a small naval special-warfare command, and we had all the resources and got to be the cool kids on the block. Loved my job there, but she finally convinced me to come over.
So, I helped create the counterterrorism analysis team for global security over at Facebook, and really started working on surfacing high-quality, publicly available information to improve our platform-based reactions, and also looking at threats in the physical space that had manifested online against Facebook entities all over the world.
I got to do sort of the same thing I was doing, but for a bigger corporation in the private sector. That was where I got that technical proficiency. Super-excited about all of that. But I also realized that my heart was still back in D.C. There were these problems arising, and the smartest people coming up with answers were still in D.C. A lot of my confreres in Silicon Valley didn’t think that way, but I never quite got it out of my system and decided, hey, maybe there are rules to be written.
We hadn’t set the rules of the road for a lot of these emerging technologies yet, and D.C. really needed help. Perhaps it was too ambitious of me to feel I could actually help, but I came back and figured I’d start. So I did tech policy for a smaller national security think tank, then realized there were some bigger problems afoot that conservatives needed to find better solutions for, and decided to come to [The Heritage Foundation], where it all happens.
Allen: Well, we’re so glad that we were able to rope you in.
Frederick: Of course.
Allen: It’s such an honor to have someone with your background bringing that really holistic perspective on tech policy. If you would, give us just a real fast, high-level look: What is it that you’re doing here at Heritage, and why is Big Tech so important today?
Frederick: Yep. I think at this point we’ve seen what happened in the 2020 presidential election cycle, when it comes to Big Tech censoring legitimate news. Look at the Hunter Biden laptop story with The New York Post. Twitter and Facebook suppressed legitimate information. For Twitter’s part, they said, “This was in violation of our hacked-materials policy, and we’re therefore suppressing that information. We are not letting people click on links. We’re suspending The New York Post for even publishing information about this.”
In my mind, that was a “cross the Rubicon” moment, where I was like, wait a second. How do they know? And you had all those intel officials or even just regular analysts like myself come out and say, “No, we believe this has all the hallmarks of Russian disinformation.”
We know that a few days ago, Politico basically said, “No, this was actually genuine. This was legitimate.” It was that stifling of legitimate debate, that cutting into the marketplace of ideas and manipulating it, that really raised my eyebrows and started me thinking: What is going on here? And then it was followed in quick succession by President [Donald] Trump being suspended or banned from 17 different platforms in two weeks in early January. And that blew my mind. That made me think, OK, this is a problem. We have to get a grip on this.
So, I think the problems of Big Tech, especially the ones that face conservatives now, can be summed up in four ways. One is censorship and suppression of conservatives. We’ve seen that. Another is opaque content-moderation decisions: When regular people are suspended or banned from the platforms, they don’t always know why. There are a lot of question marks there, and these rules are vague and inconsistently enforced.
Another, and I’m glad this is catching on in our community now, is that the mistakes only go one way. I think a good example of that is our Heritage scholar Mike Gonzalez. We wanted to push ads for his book, and we weren’t allowed to do that for a time because it was too contentious. It didn’t fit under a specific Amazon policy.
We pushed back, and they later said, “Actually, this was human error.” Those human errors only tend to go in one direction, and that is against the conservative side. So, in my mind, hugely problematic. And then lastly, there’s a lack of transparency and genuine recourse. We’re lucky: At Heritage, we have resources. We have people who are fired up to actually fight and contest those things that Amazon does.
But your average person may be running a small business on Instagram, getting the word out about the art they make or something like that. They don’t necessarily have those resources. They don’t know what their recourse is if they put a foot wrong and all of a sudden find themselves suspended and unable to make a living.
So, in my mind, those are really the four things that I’m very much concerned about when it comes to Big Tech, and it’s past time to address those issues and come up with viable, technically feasible, and acceptable policy solutions.
Allen: Yeah. That’s so critical, and your highlighting each of those areas is such a reminder of how much Big Tech really affects all of us, often in our personal lives on a day-to-day basis. There was a really interesting piece that came out Sept. 14 from The Wall Street Journal that I want us to dive into now.
They released a story reporting on internal Facebook research they had obtained. Facebook owns Instagram, so we’ll use Facebook and Instagram somewhat interchangeably in this conversation.
The company has been doing research for a long time on how Instagram actually affects its users, and more specifically how it affects young women. Facebook’s own research has concluded that Instagram is toxic for teen girls.
In fact, their research shows that 32% of teen girls said that when they felt bad about their bodies, Instagram only made them feel worse. So, Kara, what is Facebook doing with this information? They’re learning all these things about Instagram, how it impacts us. What are they doing with that?
Frederick: In a public way? Nothing. What I think is particularly galling is the fact that they are still moving forward with plans to create an Instagram for children under 13 years old. They had the information that you referenced in hand in 2019 and 2020, and even fuller details, too: 6% of American teenage girls who had suicidal thoughts directly traced those thoughts to Instagram. So, they had a lot more color to that research, even beyond that striking 32% statistic, 1 in 3.
Remember when you heard 1 in 4 on college campuses, that 1 in 4 people are victims of assault? Well, 1 in 3 teen girls right now are victims of Instagram when it comes to body-image issues. In my mind, if you want one data point that Facebook knows, it’s 1 in 3. They know it makes these girls feel worse about themselves and creates body-image issues, or at least exacerbates body-image issues that already exist.
So, the fact that they’re still going forward, or at least not arresting any plans to continue with this other platform aimed at younger and younger kids, I think that tells you all you need to know. And I don’t just want to pick on Facebook here. YouTube Kids was created in 2015. YouTube was slapped with a $170 million fine by the [Federal Trade Commission] for collecting information on children under 13 years old without parental consent.
This is in direct violation of a law that was passed back in [1998], the Children’s Online Privacy Protection Act. So, they’re flouting regulations that already exist. They’re saying, “We care about one thing. That’s our bottom line.”
What I saw inside these companies, in the belly of the beast, I like to say, was that there were three things they cared about most: the bottom line, growth at all costs, and their brand and reputation. So if you look at what they’re doing, maybe their brand and reputation take a hit, and right now it’s in the news cycle, but that doesn’t really seem to stay their hands when it comes to plans to, frankly, do worse. It’s something I think we need to think very seriously about in the policy world.
Allen: Yeah. Well, what responsibility falls on a company like Instagram or Facebook to take action to remedy these things, now that we’re learning this impacts teen girls so negatively that it causes suicidal thoughts in young people? What are the steps these tech companies should be taking? And how much of this is a larger societal problem where we as individuals need to be taking steps, versus the companies?
Frederick: Yeah. So, these companies are really the prime movers when it comes to this. I believe in individual agency. I believe people can make their own choices, and parents sometimes make choices for their children in this regard, too. I don’t want to absolve everyone of responsibility in this matter, but again, these tech companies are the prime movers. And we know that their business models trade on our attention. It’s engagement.
When The Wall Street Journal released [its] exposé, one of the articles talked about “meaningful social interactions.” Facebook tweaked their algorithm in 2018, and what it basically rewarded was more incendiary content, even though maybe their goal was a little different. They wanted people to talk to their friends and family members, and they were worried about less engagement with more curated content.
Yet it had the effect of making everything worse, and we know that they know this. So, when these companies understand that this is happening and the effects are insalubrious for people’s psyches and their souls, then I think they have the responsibility at least not to continue with plans to make it worse for children under 13.
Facebook Messenger Kids already exists on Facebook for younger children as well. And we know that they’re targeting these audiences.
[Rep. Bill Johnson, R-Ohio] said in a Big Tech hearing in March that giving kids these platforms is like handing them a lit cigarette and hoping they get addicted for the rest of their lives. And it’s true.
They want these kids to be so steeped in the online world, and now the metaverse, that they disengage from the real world and use these products into perpetuity. That gives them money, and that growth factor they’re after, into perpetuity as well. So, their responsibility, I say, is not to target these younger audiences, and if they know that these platforms are actively causing harm, which [they] have proof of, to stop.
Allen: Well, and you mentioned the algorithm. “The algorithm” has become almost a household term, but there’s still some mystery around it. How does it actually work? Like you said, you’ve been inside the belly of the beast. You’ve worked at Facebook. Explain a little bit more how the Instagram algorithm works and how it’s actually built to keep us addicted.
Frederick: Yep. Not all algorithms are created equal. There are different algorithms for different platforms and different algorithms for different purposes. I think a good example of this is TikTok. Everybody knows about TikTok now. It’s the thing. By the way, it’s owned by ByteDance, which is a Chinese company, so its parent company is beholden to the laws of the Chinese Communist Party.
So, everybody, if your kids are on TikTok, it’s not a good thing. You better make sure that they tread carefully, because all of their information, not just their data but the leverage over their information, is a key vulnerability there.
On TikTok, 62% of American users are between 10 and 29. So, a huge youthful user base. But the secret sauce of TikTok is its algorithm. They call it the “For You” algorithm, but if you ask for insight into these algorithms, they’re a black box. And that’s part of the problem: There’s no algorithmic transparency. This is something lawmakers are trying to change by incentivizing these companies to show what these algorithms do, what feeds into them, and what the output is. So, you have data inputs, you have your algorithmic black box, and then you have the output.
They tend to be coded differently. They work differently. YouTube’s basically plays on how long you watch a video, not just how many likes or clicks, which could be inputs for other algorithms. So, they’re all different. When it comes to Facebook, we’ll look at their news-feed algorithm. News feed is a central piece of the core platform, which is Facebook.
It’s distinct from what Instagram does and what WhatsApp does with their algorithms, so we’ll stick with the algorithm for news feed. In 2018 or a little before that, according to the documents leaked to The Wall Street Journal, the head shed at Facebook, we’ll call it, realized that their platform was losing fully engaged users.
So, that’s when they decided to change from pushing videos curated by, say, The New York Times or Fox News, to pushing things to the top of the feed that were more organic: a video that your mom made of her cooking, so your child could watch Grandma cook in the kitchen or something like that. And the idea was they would reward different gradations of interaction with that video.
So, there’s a point system that fed into the algorithm. It was, I think, one point per like, something like five points for an emoji reaction, and 30 for a comment with lively verbiage and things like that. That means with the algorithm tweaked, if mom’s video got more points, it would pop to the top of your news feed and be more prevalent. It would not be down-listed, down-ranked, whatever the platform calls it.
It works like that point system. Classifiers, which are automated systems, also work within these platforms, but the algorithm itself can be tweaked and refined. And it’s different for all platforms, and for different technologies, too. You’ll hear about facial-recognition technology algorithms and things like that. So, there’s a lot involved, but that’s a rundown of two of the major platforms and how their algorithms work, at a very rudimentary level. … Not technical at all.
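To make that point system concrete, here is a minimal sketch in Python of an engagement-weighted feed ranking, using the illustrative weights Frederick describes (1 point per like, roughly 5 per emoji reaction, 30 per comment). All the names, weights, and data structures here are hypothetical, chosen only to illustrate the idea; this is not Facebook’s actual code.

```python
from dataclasses import dataclass

# Hypothetical weights per interaction type, matching the rough
# point values described above (not Facebook's real numbers).
WEIGHTS = {"like": 1, "emoji_reaction": 5, "comment": 30}


@dataclass
class Post:
    author: str
    likes: int
    emoji_reactions: int
    comments: int

    def engagement_score(self) -> int:
        """Sum the weighted interactions into a single ranking score."""
        return (
            self.likes * WEIGHTS["like"]
            + self.emoji_reactions * WEIGHTS["emoji_reaction"]
            + self.comments * WEIGHTS["comment"]
        )


def rank_feed(posts: list[Post]) -> list[Post]:
    """Higher-scoring posts 'pop to the top' of the feed."""
    return sorted(posts, key=lambda p: p.engagement_score(), reverse=True)


if __name__ == "__main__":
    feed = [
        Post("news_outlet", likes=200, emoji_reactions=10, comments=2),
        Post("mom", likes=15, emoji_reactions=8, comments=12),
    ]
    for post in rank_feed(feed):
        print(post.author, post.engagement_score())
```

Note how, in this toy example, a dozen lively comments on mom’s video (415 points) outweigh a couple hundred likes on a news outlet’s post (310 points). A weighting like this surfaces organic content, but by the same mechanism it can reward whatever provokes the most commentary, including incendiary material.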
Allen: We could probably do a whole podcast just talking about the algorithms, because that is complex. There’s obviously so much involved there. But one thing is for sure: They work. They get us addicted. I’m guilty of it. I mean, just last night I was scrolling through Instagram Reels. You’re looking at the clock, and you’re like, OK, five more minutes. And then 10 more minutes goes by. All right, I have to get off.
So, I know some people have been talking about whether there should actually be some sort of warning label on these platforms, now that we do know they are so addictive. In the same way that we tell folks cigarettes are addictive, should we be telling them openly that these platforms are addictive? What are your thoughts on that?
Frederick: I think it’s an interesting concept. I actually heard it for the first time a couple days ago. And that 1-in-3 formulation, maybe that’s something you could put on there: 1 in 3 teen girls, body-image issues made worse when they open up Instagram. You know what I mean?
You’ve heard sunlight is the best disinfectant. More information is better than less information. I’m not necessarily opposed to giving people information and letting them decide what to do with it. I think one of the biggest problems that afflicts tech companies themselves is they have a “We know what’s best for you” ethos. And that really informs their censorship of what they’re calling “disinformation” or “misinformation.” It changes by the day, as we know.
I don’t subscribe to the notion that these platforms know what’s best for us. So, maybe it is time to put on some objective labels, based off their own internal research. I think that’s an interesting idea.
Allen: Yeah. And would that be something Congress would have to get involved with, or would the platforms on their own say, “Yeah, we agree. We’ll add this label”?
Frederick: Yeah. I think there’s an opportunity for both. We’ve tried self-regulation. We’ve tried, “Hey, you guys, regulate yourselves.” It doesn’t really appear to be working because, again, they’re beholden to those three things: the bottom line, growth, and their brand and reputation. Those are the things that they value.
I do think having some sort of teeth when it comes to incentivizing these companies to change is something conservatives are really getting behind. There’s a lot of energy behind that on the Hill right now. I wouldn’t be surprised if something comes out of the [Sen. Marsha Blackburn, R-Tenn., and Sen. Richard Blumenthal, D-Conn.] investigation, when they said, “Hey, we see that Facebook knew these are harmful products, and yet they’re pushing them anyway.” That’s a bipartisan look into holding these companies accountable.
We can try to make them more transparent from within. If they worry that much about their brand and reputation, maybe that’s a small step they can take. But in my mind, Washington is going to have to think a little harder, and conservatives are going to have to think a little harder about this. We don’t necessarily like to wield power, frankly. We don’t. It’s not really in our constitutions.
But look at what we’ve seen, in particular this growing symbiosis between Big Tech and government entities. You have [press secretary] Jen Psaki from the podium at the White House basically saying, “Oh, yeah, we talk to Facebook. We are in touch with them regularly. There are these people we don’t like because they’re purveyors of disinformation, and we’re putting our thumb on these private companies to try to take these people down.”
And then they do it. They respond to it. At the state level, the same thing happened with Twitter and the secretary of state in California.
These are agents of the state basically saying, “We don’t want these people on your platform because we think they’re harmful,” and these Big Tech companies are complying with agents of the state, small state and big state, too. So, in my mind, conservatives have to be very, very aware that this is happening and not just say, “Private companies are private companies,” because right now it looks like the government is really putting its thumb on these private companies. And we need to think about them in that context.
Allen: Absolutely. And so many of our listeners, I know, use social media. “Problematic Women” has an Instagram account. I use social media. As someone who is in the middle of this field, in the weeds, who knows the details and has worked at Facebook, what would you say to the 18-year-old girl, the 22-year-old girl who’s like, “OK, I hear what you’re saying, that these platforms are harmful, but all my friends are on them. They’re fun. They’re creative. It’s a great outlet”?
What would you want to say to them?
Frederick: It’s really funny, because two weekends ago I found in a box my employee handbook, the one I got the day I arrived at Facebook. I was flipping through the pages, and there’s one image of a man graffitiing a wall with the words, “I just want to be where all my friends are.”
So, I totally understand that impulse. But look at what that impulse spawns. If something’s good, what are its fruits? Right? I think you look at the fruits of something and then determine whether it is actually good or not.
The fruits of Instagram, as Facebook knows, are toxic. I think there is a utility in these platforms in terms of getting your voice out there and letting people know about the work you’re doing and whatnot.
I want to tell young girls to abandon these platforms altogether, but I know how hard it is. You talked about how hard it is. So, what I would say, and I’ve said this before: Young ladies should find their worth elsewhere. They should find their identities in something higher than themselves. And this might be apocryphal, but Mark Twain said it, my mom said it: “Comparison is the death of joy.” And these things are comparison machines. So, you will feel worse as you scroll and scroll and scroll. The outcomes are not good for you.
So, I would say, if you can, put the phone down, go outside, do something good for someone else. That is going to edify your soul. Because we are talking about souls here, right? We’re not just talking about dopamine effects and the molecules in your body that respond to likes and whatnot. We’re talking about what happens deep down inside, at a spiritual level, when you compare yourself to other human beings and watch their highlight reels, because that’s what they are, as you know.
You don’t really get people’s bad days necessarily. But what this comparison machine engenders is jealousy, is anger, is resentment. So, when you look on balance at the fruits of this comparison machine and find that they’re rotten, I say, “Put it down. Don’t even start.”
Allen: Wisdom. Thanks, Kara. All right. Well, before we let you go, I have to ask you the question we love to ask all of our guests on this show, and that is: Do you consider yourself a feminist? Yes or no? Why or why not?
Frederick: Yeah, it’s a good question. I had a professor when I was an undergrad look me straight in the eye one time and say, “Are you a feminist?” And I was like, “I don’t know.” Because, you know, growing up a conservative, right, it’s a dirty word. It’s kind of gross.
So, what I found out is that I treat people as individuals. When you’re talking about men and women, I definitely think sex differences are real, of course. Women have [gifts] that are inherently different from men’s, and men have gifts and purposes that are inherently different. And we’re put on this earth for very different things, clearly.
But treating people as individuals, that’s my doctrine, whatever “-ism” you put at the end of it. After I gave him a kind of clunky explanation of this, my professor was like, “So, you’re a post-feminist.” And I was like, all right, whatever, that works too. I’m an individualist. I take people one at a time, and I don’t think you can go wrong there.
Allen: I don’t think you can either. Kara, thank you for your time.
Frederick: Of course.
Allen: We so appreciate you coming on. Hope to have you back on sometime.
Frederick: Anytime.