
I gave Instagram photos of my baby. Instagram returned fear.


High on my list of demands: We the users need transparency about how algorithms work, and the ability to press reset when they're not serving us. I learned this firsthand by going on a hunt to unravel how my baby's Instagram account got taken over by fear.

More than a billion people spend time on Instagram in part because they enjoy it. But of all the millions of images across the app, these are the ones Instagram chose to show my son's account, and I have no way of knowing why. What I question is how Instagram decided to show me these specific images, and at this volume, when I have no connection to these families. Other new parents on Instagram tell me they also feel they're being recommended posts that prey on our specific insecurities, from breastfeeding to vaccination.

The algorithms used by Instagram and Facebook look for "signals." Some are obvious: Liking a post, following an account, or leaving a comment on a post are all signals. In my case, I didn't do any of that with Instagram's suggested posts.

You can't edit that list of "your topics," but you can give feedback on an individual recommended post, if you know where to look. Reporting this column, I learned Instagram offers this one lever of control over its algorithm: When you see a suggested post (or an ad), in the upper right corner there are three dots.
But amplifying extreme content is one of the consequences of training algorithms to focus on what the company calls "engagement," or content that leads people to interact. According to the documents Frances Haugen leaked, changes to Facebook's algorithms in 2018 and 2019, meant to encourage what it called "meaningful social interactions" between users, had the consequence of promoting posts that sparked arguments and division. Extreme content can also become a gateway to misinformation about vaccines, scams, or even sharing illicit images and information.

On my son's account, I witnessed another unintended consequence: what Haugen calls "engagement hackers." They're a kind of spammer who has learned how to hijack Instagram's logic, which encourages them to post shocking images to elicit reactions from viewers and thus build credibility with the algorithm. Several of the accounts behind the images Instagram recommended to my son's account appear not to belong to the parents of the children featured in them.

To completely shut off Instagram's recommended posts from accounts you don't follow, and make at least your main feed a friends-only experience, you have to select the Favorites-only view and put all your friends in that category. An even better idea: Give us an algorithmic reset button.

One proposal in Congress is the bipartisan Platform Accountability and Transparency Act (PATA), which would force companies to open up their algorithms by turning over information about how they work, and their consequences, to researchers and the public.

"We agree people should have control over what they see on our apps and we'll continue working on new ways to make them more transparent, while also supporting regulation that sets clear standards for our industry in this area," said Stephanie Otway, a spokeswoman for Instagram. Now it's time to hold them to it.

More in this series: We the users want technology to work for us.

By Geoffrey A. Fowler, The Washington Post