
X Adds ‘Moment of Death’ Form To Violent Content Policy


This is a bit morbid, and a little onerous on X’s part, considering the circumstances.

Over the weekend, X updated its Violent Content policy to include a new clause called “Moment of Death”, along with a form that people can fill out if they want a video of a loved one dying removed from the app.

[Screenshot: X’s Moment of Death form]

Yeah, it’s pretty bleak, especially this:

“X values the maintenance of a robust public record, especially for significant historical or newsworthy events. This value is weighted against our commitment to honor your request to maintain the dignity and privacy that should accompany death.”

So you can apply to have a video of somebody you know dying removed from the app, but, for one, you’ll have to fill in a form that asks for various pieces of qualifying info (including a death certificate), and, for two, X can still reject your request if it feels that the video is newsworthy enough.

“Immediate family members or legal representatives can report Moment of Death content for review via our Moment of Death report form. If you want to request the deactivation of a deceased person’s account, immediate family members and those authorized to act on behalf of the estate can do so via our Deactivation report form.”

It feels like this should be a lot easier: if there’s a video depicting someone’s death, it should probably be removed on request, no matter who requests it. But X is committed to defending freedom of speech wherever it chooses, and clearly, the depiction of people dying has become a point of debate among whoever sits on the X moral committee.

X infamously refused to take down a video of a violent stabbing that occurred in Sydney last year, which did not result in the victim’s death, but which Australian authorities had requested be removed, due to fears that it could spark retaliatory violence. X stood firm on freedom of speech grounds, and that video remained freely available to users in its app. Then, earlier this year, a man who murdered three girls in the UK was found to have viewed that stabbing video before undertaking his attack.

It’s not clear whether this rule change is related to that specific case, but the “newsworthy” qualifier here seems to suggest that X would be able to keep violent content like this active in its app, if it so chooses. Also, only the victim’s family would be able to request its removal.

So basically, posting a video of someone dying is okay, and if it’s deemed relevant enough, X will keep that content active even if a relative requests that it be removed.

I guess, in the broader free speech debate, this is a valid process, and a logical, systematic approach to what could be a significant problem.

It seems like the answer should be simpler, but X follows its own rules.


