January 26, 2019 10:19 AM
Updated January 26, 2019 10:24 AM
Molly was the youngest of three sisters. She was 14 years old and a normal teenager. "She was excited. That night she finished her homework and packed her backpack for school. When we woke up the next morning, she was dead." Ian Russell's voice breaks: he is the father of Molly, a 14-year-old English girl who took her own life in 2017.
"It's very sad. In a moment you realize that your life will never be the same," he explains.
"Molly left notes. We're lucky to have notes from her after she died, in which she tried to explain how she felt," he says.
Some of the notes said: "I am the problem in everyone's life. I love you all. Be strong. I'm proud of you."
Her father says the content she saw on Instagram encouraged her death
After her death, the family looked through the social network accounts Molly had followed and found material about depression and self-harm.
"I have no doubt that Instagram helped kill my daughter," Russell now says.
According to World Health Organization data, in 2016 more than 200,000 people between the ages of 10 and 29 took their own lives worldwide.
"My daughter had so much to offer, and all of it is gone. We have to accept that. The hard part is that it was all taken away with the help of the Internet and social networks."
The content on social networks
"I remember finding a drawing with a caption: 'This world is so cruel. I don't want to see it any more,'" he explains.
"There were accounts of people who were depressed, self-harming or suicidal, and Molly had access to many similar accounts," he adds.
Ian says some of the material was positive: groups of people trying to help one another, to keep a positive attitude and to stop harming themselves.
But he explains: "Another part of the content is shocking. It encourages self-harm and links self-harm to suicide."
Molly left notes explaining how she felt
The BBC searched Instagram for content appearing under the hashtag "selfharm" and found very explicit images posted by users.
In addition, hashtags help users find more similar content, as they can subscribe to and follow posts carrying a particular tag.
Content tagged "depression" leads to disturbing material, such as suicide videos.
"The posts from these accounts are usually black and white. They are fatalistic, they leave no hope. It's like saying: join the group; you're depressed, so am I," explains Molly's father.
"We could not imagine that this kind of content could be on a platform like Instagram. And it's still there. It's very easy to find; it's not hidden. It's available," he adds.
The role of algorithms
Ged Flynn is the director of Papyrus, an organization for the prevention of suicide among young people, founded in the United Kingdom in 1997.
In an interview, the BBC showed Flynn the images it had found on Instagram.
"I would like to say I'm not surprised, but I am. Suicide is not a trivial matter; it is an unimaginable and devastating tragedy," says Flynn.
There is an added problem: Instagram's algorithms help users find related content. Once a user follows one account of this type, the social network suggests more.
"If a social network's algorithm is programmed to offer more content of the same type a user has searched for, in some cases it must be more careful than when the search is for, say, the term 'flowers,'" explains Flynn.
He adds: "The laws on suicide are very clear: encouraging someone to end their life is illegal, whether on the Internet or in real life, whether with words or with images. Anyone who suggests you should do it is, at the very least, a possible accomplice."
"Instagram must seriously consider changing its algorithms to save lives, and it has to do it now," he says.
Instagram's response
"We do not allow content that promotes or glorifies eating disorders, self-harm or suicide. We have removed content of this type," the company said in a statement. Instagram has a tool that flags certain search terms and offers help.
But users can simply dismiss the offer of help and continue browsing.
Papyrus, which operates in the United Kingdom, provides information and practical advice to young people who have suicidal thoughts.
Molly was the youngest of three sisters
All the members of the organization's board of directors have been personally affected by suicide. Many have lost a child to suicide.
"It is not right for a child to be able to access such explicit images," says Flynn. In his view, social networks cannot keep relying on measures such as a button that offers help if the algorithm detects terms like "self-harm" or "suicide" three or four times.
"My message would be: take this seriously. Suicide is the leading cause of death among young people in the United Kingdom. How long do we have to wait before something is done?" he adds. The British government is urging social networks to take more responsibility for content that illustrates and promotes methods of suicide and self-harm.
"Devastated" and "complicated"
Steve Hatch, Facebook's director for Northern Europe
After Instagram's statement, Steve Hatch, Facebook's director for Northern Europe (Facebook is the company that owns Instagram), told the BBC in an exclusive interview that Molly Russell's death was "a devastating event".
Hatch told BBC editor Amol Rajan that he felt "deeply worried" when he heard Molly's father's accusation that the social network was partly responsible for the girl's death. "I can't even imagine how Molly's father and the rest of the family feel," he said.
When Rajan showed him images of self-harm that supposedly violate Instagram's policies but were still available on the social network, the executive responded: "We must look at these images and make sure they are removed." Hatch also said that Instagram constantly reviews its policies on everything related to images of depression and suicide.
"It's a very complicated thing," he added.
"We are working with experts who help shape our self-harm policies. It's a very complex area."
"Experts tell us that when these images are posted by people who are clearly going through a very difficult situation, they often do so because they are looking for help or support."
"In those cases, it can be very useful and very helpful for these images to be available on the platform, so we allow them and offer support to those who may see them."
"What we do not allow is content that applauds or glorifies (suicide)." However, he declined to answer whether he allows his own children to use Instagram, noting only that the social network "works hard" to remove these types of images and offers "a lot of support" to its users.