Four years ago the Military Times headline read: “The military’s suicide prevention fight has moved to Facebook and Twitter.” The article reported on Marine Sergeant Raheem Boyd, who responded to a frantic Facebook message: a friend had alerted Boyd that another Marine was posting messages on social media that implied he was suicidal.
“If it wasn’t for social media, we never would have known what was going on in his head,” Boyd told the newspaper. Without that warning, the Marine likely would have committed suicide.
It’s fortunate that someone was watching social media, and that this same “someone” alerted Boyd, who courageously intervened, even risking his own life.
The same article reported that researchers from Northrop Grumman and the University of Utah National Center for Veterans Studies have studied how social media can either contribute to suicide or help prevent it. Cyberbullying, for example, can be a factor in suicide.
There is a Defense Suicide Prevention Office. Studies have identified worry and stress; isolation and alienation; marriage, family, and money problems; and, of course, depression. The article notes the usual findings — that those who commit suicide show “hopelessness, social withdrawal, and insomnia” and, as time goes on, discuss their “distress, relationship problems and religious affiliation” more and more. It’s hardly surprising that suicide candidates have a “more pessimistic worldview.”
One researcher spoke of “these clues on social media” as “warning signs in life — expressing their intent to die … a sense of being a burden and having no purpose in the world.”
A series of articles in the New York Times reported on how members of a Marine regiment tried to use social media to keep track of their buddies. The articles further reported that when Marines posted “done with life” or “can’t do it anymore,” their compatriots responded, in one case saving a life.
About four years ago, Facebook introduced a feature that lets users report suicidal content so that the person can be supported and reached out to.
The Department of Defense, according to the Military Times article, had recommended “integrating social media data in psychological ‘autopsies’ conducted after a service member dies by suicide and figuring any publicly available social media data into wellness assessments.”
Autopsies? I suspect we know enough already to prevent many of these suicides, but we need more than a casual approach.
All of this information can serve as building blocks for what we now can do — using artificial intelligence to monitor public social media in real time. In other words, we don’t have to rely on friends who may or may not be watching social media, or who may not be able to interpret the telltale posts correctly or soon enough.
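To make the idea concrete, here is a minimal, purely illustrative sketch of real-time flagging. The phrase list echoes warning-sign language quoted earlier in this piece, but the specific phrases and matching logic are my assumptions for illustration only; an actual system would rely on trained language models and clinical guidance, not a hand-made keyword list.

```python
# Illustrative sketch only: flag public posts that contain warning-sign
# language like that quoted above. The phrase list is an assumption for
# demonstration, not a clinically validated screening tool.

WARNING_PHRASES = [
    "done with life",
    "can't do it anymore",
    "no purpose",
    "a burden",
    "want to die",
]

def flag_post(text: str) -> bool:
    """Return True if the post contains any warning-sign phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in WARNING_PHRASES)

# Example: scan a small batch of (hypothetical) public posts.
posts = [
    "Great day at the range with the unit.",
    "I'm done with life. Can't do it anymore.",
]
flagged = [p for p in posts if flag_post(p)]
# Flagged posts would then be routed to a human responder for review.
```

Even this toy version shows the key shift the article argues for: the same language examined after the fact in “autopsies” can instead trigger an alert while there is still time to intervene.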
By all means, let’s continue programs to ameliorate PTSD, and let’s pursue and improve psychological counseling, psychotherapy, psychiatry, support programs, and the Veterans Crisis Line.
But we are now at the stage where we can do so much more with social media. The very words and emotions that are analyzed in “autopsies” can be analyzed in real time rather than retrospectively (i.e., the lament that it’s “too bad” we didn’t act). In other words, we now have the capability, through such analysis in conjunction with artificial intelligence, to monitor troubled souls, help them, and even prevent them from taking their own lives.