
System Filters Out Harassment from Incoming Social Media Data

January 14, 2020, 3:00pm - 3:30pm | State of AI and ML - January 2020

Speaker

Corinne David

CEO / Founder, Emakia
  • January 14, 2020
  • 3:00pm - 3:30pm

Abstract

Social media users are subject to harassment when unwanted, offensive content reaches them, and social media companies are reluctant to police that content. The Emakia system intervenes at the point where incoming real-time data are received: it uses sets of machine-learning classifiers to filter text, images, audio, and video. In its first development, a text classifier is trained on labeled data and then used to screen the incoming real-time data. On the device or on the server, a "bag-of-words" filter acts as an adaptive filter, retraining the model on content not yet known to it. Only approved content is displayed on the user's device; the withheld data remain available if the user chooses to access them. Detection accuracy improves as the labeled data set grows. The initial application offers a single text classifier; subsequent developments will provide multiple classifiers that individual users can adjust. The Emakia system will contribute to a safer social media environment.
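
To make the described pipeline concrete, here is a minimal sketch of a bag-of-words text classifier screening incoming posts. It assumes scikit-learn and uses a toy labeled set; the model choice, training data, and `screen` helper are illustrative assumptions, not Emakia's actual implementation.

```python
# Minimal sketch, not the Emakia system: bag-of-words features feeding a
# text classifier, then screening of incoming posts. All data and names
# here are hypothetical.
from sklearn.feature_extraction.text import CountVectorizer  # bag-of-words features
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: 1 = harassment, 0 = acceptable.
train_texts = [
    "you are worthless and everyone hates you",
    "great talk, thanks for sharing the slides",
    "nobody wants you here, just leave",
    "congratulations on the launch, well deserved",
]
train_labels = [1, 0, 1, 0]

# Train the classifier on the labeled set.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

def screen(posts):
    """Return posts the model approves for display; flagged posts are
    withheld but kept so the user can still access them on request."""
    approved, withheld = [], []
    for post in posts:
        (withheld if model.predict([post])[0] == 1 else approved).append(post)
    return approved, withheld

incoming = ["loved your keynote today", "shut up, nobody asked you"]
approved, withheld = screen(incoming)
print("Displayed:", approved)
print("Withheld (still accessible):", withheld)
```

In the system described above, posts routed to the withheld set would also serve as candidates for relabeling and retraining, which is how the adaptive filter could grow the labeled data set over time.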