In 2021, an internal document leak from the company then known as Facebook showed it was aware of harmful societal effects from its platforms.
| FactSnippet No. 1,653,278 |
In mid-September 2021, The Wall Street Journal began publishing The Facebook Files, a series of articles based on internal documents of unknown provenance.
| FactSnippet No. 1,653,279 |
The files show that Facebook had spent the previous three years conducting internal research into how Instagram affects its young users.
| FactSnippet No. 1,653,280 |
Facebook published some of its internal research on September 29, 2021, saying that the Journal's reports mischaracterized the purpose and results of that research.
| FactSnippet No. 1,653,281 |
The files show that Facebook formed a team to study preteens, set a three-year goal to create more products for this demographic, and commissioned strategy papers about the long-term business prospects of attracting preteen users.
| FactSnippet No. 1,653,282 |
An internal memo seen by The Washington Post revealed that Facebook has been aware of hate speech and calls for violence against groups such as Muslims and Kashmiris on its platform in India, including posts showing photos of piles of dead Kashmiri bodies with glorifying captions.
| FactSnippet No. 1,653,283 |
Documents reveal that Facebook has responded to these incidents by removing posts that violate its policies, but that it has made no substantial effort to prevent repeat offenses.
| FactSnippet No. 1,653,284 |
The New York Times pointed to internal discussions in which employees raised concerns that Facebook was spreading QAnon conspiracy theory content more than a year before the 2020 United States elections.
| FactSnippet No. 1,653,285 |
In 2015, in addition to the Like button on posts, Facebook introduced a set of other emotional reaction options: love, haha, yay, wow, sad, and angry.
| FactSnippet No. 1,653,286 |
The Washington Post reported that for three years, Facebook's algorithms promoted posts that received the 'angry' reaction, based on an internal weighting under which such reactions counted five times as much as ordinary likes.
| FactSnippet No. 1,653,287 |
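As a rough illustration of that reported weighting, the following minimal Python sketch scores posts with a simple linear formula; the 5x multiplier comes from the Post's reporting, while the function and field names are hypothetical and the real ranking system is far more complex:

    # Hypothetical engagement score illustrating the reported weighting,
    # where an 'angry' reaction counts five times as much as a like.
    REACTION_WEIGHTS = {"like": 1.0, "angry": 5.0}

    def engagement_score(reaction_counts: dict[str, int]) -> float:
        # Sum each reaction count scaled by its assumed ranking weight.
        return sum(REACTION_WEIGHTS.get(r, 1.0) * n
                   for r, n in reaction_counts.items())

    # Under this weighting, 100 'angry' reactions outrank 400 likes:
    print(engagement_score({"angry": 100}))  # 500.0
    print(engagement_score({"like": 400}))   # 400.0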
Years later, Facebook's researchers pointed out that posts with 'angry' reactions were much more likely to be toxic, polarizing, fake, or of low quality.
| FactSnippet No. 1,653,288 |
In 2018, Facebook overhauled its News Feed algorithm, implementing a new ranking system that favored "Meaningful Social Interactions", or "MSI".
| FactSnippet No. 1,653,289 |
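Facebook's actual MSI formula has never been published; the sketch below only illustrates the general idea of re-ranking a feed by a weighted sum of interactions, with invented interaction types and weights:

    # Hypothetical MSI-style re-ranking: active interactions (comments,
    # shares) are assumed to outweigh passive likes. Weights are invented.
    MSI_WEIGHTS = {"like": 1, "comment": 15, "share": 30}

    def msi_score(post: dict) -> int:
        return sum(MSI_WEIGHTS.get(k, 0) * n
                   for k, n in post["interactions"].items())

    feed = [
        {"id": "a", "interactions": {"like": 500}},
        {"id": "b", "interactions": {"comment": 25, "share": 10}},
    ]
    ranked = sorted(feed, key=msi_score, reverse=True)
    print([p["id"] for p in ranked])  # ['b', 'a'] (scores 675 vs 500)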
Politico quoted several Facebook staff members expressing concerns about the company's willingness and ability to respond to the damage caused by its platform.
| FactSnippet No. 1,653,290 |
In 2021, Facebook developed a new strategy for addressing harmful content on its site, implementing measures designed to reduce and suppress the spread of movements deemed hateful.
| FactSnippet No. 1,653,291 |
Internal engineers and researchers at Facebook have estimated that its AI has been able to detect and remove only a small fraction of rule-breaking content, reportedly as little as 0.6 percent of content that violated its policies against violence and incitement.
| FactSnippet No. 1,653,292 |