According to Bloomberg, Facebook is hoping to bring together outside experts to help curb suicide and self-harm content on its platform.
According to the report, Facebook will share users' suicide-related posts with expert bodies and academic institutions for research, monitoring, and intervention. In addition, Facebook has hired a health specialist to join its safety team.
On the health and safety front, users are barred from sharing images related to suicide and self-injury on the platform, and such content has been made harder to find through search. Facebook officials said that between April and June this year they took action on more than 1.5 million pieces of suicide-related content, and that 95% of the violating content was detected by its algorithms.