We build a multi-headed model capable of detecting different types of toxicity, such as threats, obscenity, insults, and identity-based hate. Discussing things you care about can be difficult: the threat of abuse and harassment online means that many people stop expressing themselves and give up on seeking differing opinions, and platforms struggle to facilitate conversations effectively, leading many communities to limit or completely shut down user comments. So far, a range of publicly available models has been served through the Perspective API, including one for toxicity, but the current models still make errors, and they do not allow users to select which types of toxicity they are interested in finding.
Toxic Comment Classification
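As a rough illustration of the multi-headed setup described above, the sketch below trains one binary classifier per toxicity type over a shared text representation. It is a minimal sketch rather than the paper's actual model: scikit-learn with TF-IDF features and one-vs-rest logistic regression stands in for whatever architecture the authors used, and the input file name (train.csv), the comment_text column, and the six label columns are assumptions that follow the Jigsaw Toxic Comment Classification dataset layout.

# Minimal multi-label ("multi-headed") toxic comment classifier sketch.
# Assumes a CSV with a "comment_text" column and one binary column per
# toxicity type; the file name and label names are illustrative assumptions.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import Pipeline

LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

def main() -> None:
    data = pd.read_csv("train.csv")          # assumed input file
    X = data["comment_text"].fillna("")
    y = data[LABELS].values                  # one binary target per toxicity type

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # One shared TF-IDF representation feeding an independent logistic
    # regression "head" per label via one-vs-rest.
    model = Pipeline([
        ("tfidf", TfidfVectorizer(max_features=50_000, ngram_range=(1, 2),
                                  sublinear_tf=True, strip_accents="unicode")),
        ("clf", OneVsRestClassifier(LogisticRegression(C=4.0, max_iter=1000))),
    ])
    model.fit(X_train, y_train)

    # Per-label probabilities let a user filter for only the toxicity
    # types they care about (e.g. threats but not insults).
    probs = model.predict_proba(X_test)
    for i, label in enumerate(LABELS):
        print(f"{label:14s} AUC = {roc_auc_score(y_test[:, i], probs[:, i]):.3f}")

if __name__ == "__main__":
    main()

Because each label gets its own head, the model can report, say, a high threat probability and a low insult probability for the same comment, which is what allows per-type filtering rather than a single toxic/non-toxic decision.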