Does social media have an impact on your well-being?
Facebook has delayed the development of an Instagram app for children amid questions about its harmful effects on young people’s mental health. Does social media have an impact on your well-being? What is your relationship with social media like? Which platforms do you spend the most time on? Which do you stay away from? How often do you log on?
What do you notice about your mental health and well-being when spending time on social networks?
In “Facebook Delays Instagram App for Users 13 and Younger,” Adam Satariano and Ryan Mac write about the findings of an internal study conducted by Facebook and what they mean for the Instagram Kids app that the company was developing:

Facebook said on Monday that it had paused development of an Instagram Kids service that would be tailored for children 13 years old or younger, as the social network increasingly faces questions about the app’s effect on young people’s mental health.
The pullback preceded a congressional hearing this week about internal research conducted by Facebook, and reported in The Wall Street Journal, that showed the company knew of the harmful mental health effects that Instagram was having on teenage girls. The revelations have set off a public relations crisis for the Silicon Valley company and led to a fresh round of calls for new regulation. Facebook said it still wanted to build an Instagram product intended for children that would have a more “age appropriate experience,” but was postponing the plans in the face of criticism.
The article continues:
With Instagram Kids, Facebook had argued that young people were using the photo-sharing app anyway, despite age-requirement rules, so it would be better to develop a version more suitable for them. Facebook said the “kids” app was intended for ages 10 to 12 and would require parental permission to join, forgo ads and carry more age-appropriate content and features. Parents would be able to control what accounts their child followed. YouTube, which Google owns, has released a children’s version of its app.
But since BuzzFeed broke the news this year that Facebook was working on the app, the company has faced scrutiny. Policymakers, regulators, child safety groups and consumer rights groups have argued that the app would hook children at a younger age rather than protect them from problems with the service, including child predatory grooming, bullying and body shaming.
The article goes on to quote Adam Mosseri, the head of Instagram:
Mr. Mosseri said on Monday that “the project leaked way before we knew what it would be” and that the company had “few answers” for the public at the time.
Opposition to Facebook’s plans gained momentum this month when The Journal published articles based on leaked internal documents that showed Facebook knew about many of the harms it was causing. Facebook’s internal research showed that Instagram, in particular, had caused teen girls to feel worse about their bodies and led to increased rates of anxiety and depression, even while company executives publicly tried to minimize the app’s downsides.
But concerns about the effect of social media on young people go beyond Instagram Kids, the article notes:
A children’s version of Instagram would not fix more systemic problems, said Al Mik, a spokesman for 5Rights Foundation, a London group focused on digital rights issues for children. The group published a report in July showing that children as young as 13 were targeted with harmful content, including material related to eating disorders, extreme diets, sexualized imagery, body shaming, self-harm and suicide, within 24 hours of creating an account.
“Big Tobacco understood that the younger you got to someone, the easier you could get them addicted to become a lifelong user,” Doug Peterson, Nebraska’s attorney general, said in an interview. “I see some comparisons to social media platforms.”
In May, attorneys general from 44 states and jurisdictions signed a letter to Facebook’s chief executive, Mark Zuckerberg, asking him to end plans for building an Instagram app for children. American policymakers should pass tougher laws to restrict how tech platforms target children, said Josh Golin, executive director of Fairplay, a Boston-based group that was part of an international coalition of children’s and consumer groups opposed to the new app. Last year, Britain adopted an Age Appropriate Design Code, which requires added privacy protections for digital services used by people under the age of 18.
Students, read the entire article, then tell us:
Do you think Facebook made the right decision in halting the development of the Instagram Kids app? Do you think there should be social media apps for children 13 and younger? Why or why not? What is your reaction to the research that found that Instagram can have harmful mental health effects on teenagers, particularly teenage girls? Have you experienced body image issues, anxiety or depression tied to your use of the app? How do you think social media affects your mental health?
What has your experience been on different social media apps? Are there apps that have a more positive or negative effect on your well-being? What do you think could explain these differences? Have you ever been targeted with inappropriate or harmful content on Instagram or other social media apps? What responsibility do you think social media companies have to address these issues? Do you think there should be more protections in place for users under 18? Why or why not?
What does healthy social media engagement look like for you? What habits do you have around social media that you feel proud of? What behaviors would you like to change? How involved are your parents in your social media use? How involved do you think they should be? If you were in charge of making Instagram, or another social media app, safer for teenagers, what changes would you make?