Harassment-Moderating Systems

To Fight Online Harassment MIT's Squadbox Brings in a User's Friends

MIT's Squadbox is the latest attempt by researchers to combat online harassment in a way that could aid Facebook, Twitter, Discord and other social media platforms. Developed by MIT's Computer Science and Artificial Intelligence Laboratory, the tool proposes that harassment be handled by a user's friends rather than by anonymous moderators. The system relies on 'friend-sourcing': trusted friends act as moderators who filter incoming messages, support the person being harassed, and are better positioned to distinguish material the user does or does not want to see.
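
To make the friend-sourcing workflow concrete, the sketch below models a moderation queue in Python: mail from trusted senders is delivered directly, while everything else is held until a designated friend reviews it. The class and field names here are hypothetical illustrations of the idea described above, not Squadbox's actual API.

```python
# A minimal sketch of a friend-sourced moderation queue. Names such as
# Message and ModerationQueue are illustrative assumptions, not Squadbox code.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Message:
    sender: str
    body: str


@dataclass
class ModerationQueue:
    owner: str                      # the person being protected from harassment
    moderators: List[str]           # trusted friends who review held messages
    allow_list: List[str] = field(default_factory=list)  # senders who bypass review
    held: List[Message] = field(default_factory=list)
    delivered: List[Message] = field(default_factory=list)

    def receive(self, msg: Message) -> None:
        """Deliver mail from trusted senders; hold everything else for a friend."""
        if msg.sender in self.allow_list:
            self.delivered.append(msg)
        else:
            self.held.append(msg)

    def review(self, moderator: str, msg: Message, approve: bool) -> None:
        """A friend moderator releases or discards a held message."""
        if moderator not in self.moderators:
            raise PermissionError(f"{moderator} is not a moderator for {self.owner}")
        self.held.remove(msg)
        if approve:
            self.delivered.append(msg)


# Usage: a friend reviews and discards a message from an unknown sender.
queue = ModerationQueue(owner="alice@example.com",
                        moderators=["bob@example.com"],
                        allow_list=["carol@example.com"])
queue.receive(Message("stranger@example.com", "You'll regret this."))
queue.review("bob@example.com", queue.held[0], approve=False)
```

The design choice mirrors the article's point: the filtering decision is delegated to a person the user already trusts, rather than to a platform-appointed stranger.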

MIT's Squadbox currently works only on email, but the team of researchers hopes to eventually expand the tool to a range of social media platforms. MIT has noted the system's current limitations: using friends as moderators addresses privacy and personalization concerns, but it also introduces a relationship-maintenance challenge. Early testers of Squadbox reported feelings of guilt because they were relying heavily on their friends, and they grew reluctant to ask for further favors.
Trend Themes
1. Friend-sourced Moderation - Enlisting users' friends as moderators to combat online harassment presents opportunities for social media platforms to increase personalization and user control through equitable and empathetic policies.
2. Personalized Moderation - Favoring friend-sourced moderation over anonymous moderators promotes user control and gives brands an opportunity to differentiate themselves through ethical and empathetic policies.
3. Moderation Limitations - Exploring the limitations of friend-sourced moderation, such as relationship maintenance, guilt, and the need for more personalized moderation techniques, can improve strategies for combating online harassment.
Industry Implications
1. Social Media - By utilizing a friend-sourced moderation system, social media platforms can increase personalization, control, and differentiation.
2. Artificial Intelligence - Incorporating AI and machine-learning text analysis can help pinpoint explicit or implicit patterns of online harassment and improve friend-sourced moderation (a minimal sketch follows this list).
3. Consumer Relationship Management - Exploring the impact of enlisting friends as moderators on social media platforms may lead to breakthroughs in personalized consumer relationship maintenance strategies.
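
As a rough illustration of the AI angle noted above, the following Python sketch uses a simple TF-IDF and logistic-regression classifier to score held messages so friend moderators can review the riskiest ones first. The toy training data and function names are assumptions made for illustration; they are not part of Squadbox.

```python
# A minimal sketch of pre-scoring held messages with a text classifier so
# friend moderators see likely harassment first. Toy data; illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny labeled examples (1 = harassing, 0 = benign); a real system would
# need a large, carefully curated dataset.
texts = [
    "you are worthless and everyone hates you",
    "nobody wants you here, just leave",
    "thanks for the update, see you at the meeting",
    "great talk today, really enjoyed it",
]
labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(texts)

classifier = LogisticRegression()
classifier.fit(features, labels)

def harassment_score(message: str) -> float:
    """Estimated probability that a message is harassing, used to rank the queue."""
    return classifier.predict_proba(vectorizer.transform([message]))[0][1]

held_messages = ["leave and never come back", "lunch tomorrow?"]
# Show moderators the highest-risk messages first.
for msg in sorted(held_messages, key=harassment_score, reverse=True):
    print(f"{harassment_score(msg):.2f}  {msg}")
```

In this framing the model does not replace the friend moderator; it only orders the review queue, leaving the final judgment to a person who knows the user.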
