by Julia Carrie Wong and Sam Levin in Oakland
Site's move comes amid continuing pressure over its role as a platform for misinformation and extremism

YouTube will recommend fewer videos that "could misinform users in harmful ways", the company announced on Friday, in a shift for a platform that has faced criticism for amplifying conspiracy theories and extremism.

The change concerns YouTube's recommendations feature, which automatically creates a playlist of videos for users to watch next. The recommendations are the result of complex and opaque algorithms designed to capture a user's interest, but they have become a locus of criticism when YouTube directs people to potentially harmful and false content that they would not otherwise have sought out.