Three years ago ABC reported that the Overwatch group monitors social media to prevent veteran suicides. At the time of that article, Overwatch had 4,500 volunteers operating throughout Australia to support those who had served in the Australian Defence Force.
ABC noted that Overwatch described itself as “peer-to-peer, boots-on-the-ground, rapid-response” to assist veterans “who are at risk or in crisis.”
This organization stands ready to help veterans “having a meltdown,” and its volunteers would alert the authorities or police when needed.
The bottom line is that PTSD is not unique to America’s armed forces. And what these Australian volunteers were doing is commendable. This kind of buddy system is wonderful and should be encouraged.
We see social media campaigns that involve “Buddy Check” and support groups online and offline. All of this should be continued. But it’s not enough, whether for suicides of veterans (a concern of the Department of Veterans Affairs) or suicides of those on active duty (a concern of the Department of Defense).
Just two years ago Gen. Paul Selva, then vice chairman of the Joint Chiefs of Staff, said, “I’m not a big fan of social media,” but then conceded that active-duty armed forces interact mainly on social media. The U.S. surgeon general, addressing the matter of suicide among active-duty members, noted that “the foundation of connection is dialogue,” and that technology can be a positive or a negative.
For several years, I’ve been working on the positive, including making sense of the chaos of social media to help people, and I really want to help our military and veterans.
At the Pentagon meeting that focused on suicide, research that studied 700 suicides found that public media postings provided “reliable signals or clues.” The US News article that provided much of the preceding information for this blog item noted: “Posts about stressful life events followed by posts about negative emotions a few days later were a signal for death by suicide.” Interestingly, “a danger sign was when service members continued posting about these behaviors but no longer wrote about negative thoughts.”
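To make the cited pattern concrete, here is a minimal sketch of the temporal logic: flag when a post about a stressful life event is followed, within a few days, by a post expressing negative emotions. Everything here is illustrative; the keyword lists, function names, and three-day window are my own assumptions, and a real system would use trained classifiers rather than keyword matching.

```python
from datetime import datetime

# Toy keyword lists -- purely illustrative, not from the cited research.
STRESS_WORDS = {"divorce", "fired", "eviction", "funeral"}
NEGATIVE_WORDS = {"hopeless", "worthless", "alone"}

def contains_any(text, words):
    text = text.lower()
    return any(w in text for w in words)

def flag_risk(posts, window_days=3):
    """posts: list of (datetime, text) pairs, oldest first.
    Returns True if a stressful-event post is followed within
    `window_days` by a post expressing negative emotions --
    the temporal pattern the research described as a signal."""
    stress_times = [t for t, text in posts if contains_any(text, STRESS_WORDS)]
    for t, text in posts:
        if contains_any(text, NEGATIVE_WORDS):
            if any(0 <= (t - s).days <= window_days for s in stress_times):
                return True
    return False

posts = [
    (datetime(2019, 6, 1), "Just got fired today."),
    (datetime(2019, 6, 3), "Feeling hopeless about everything."),
]
print(flag_risk(posts))  # True: negative-emotion post two days after a stressor
```

The point of the sketch is the ordering and the time window, not the word lists: a stressor alone, or negative emotion alone, does not trip the flag.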
Siri is supposed to be empathetic, but now feminists are attacking Siri as being too submissive. As the saying goes, “you can’t win for trying.” Apple has tried to pursue its own suicide prevention initiatives, whatever they are. And Facebook tries, in its own way.
But back to the issue at hand — active duty and veterans. According to an article just six months ago in the New York Times, reducing the suicide rate among veterans is the top priority of the Department of Veterans Affairs. But the article reported confusion among bureaucrats and a lack of leadership. The director of the VA’s suicide prevention office resigned because, she said, the office was ineffective.
But we now can do more, much more than what I’ve been writing about in these four posts. We don’t have to depend only on volunteers and buddies, as terrific as they are. Nor do we have to leave it to chance whether someone spots a troubling post. No more hit-and-miss. We can use a sophisticated system, built on artificial intelligence and machine learning, to monitor public social media in real time, with the potential to flag alerts for active-duty military and veterans who may be a threat to themselves or to others.
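For readers who wonder what “monitor in real time and flag alerts” means in practice, here is a bare-bones sketch of the loop such a system would run. The `score_post` stub and the 0.9 threshold are my own placeholders, standing in for whatever trained model and policy a real deployment would use.

```python
def score_post(text):
    """Placeholder risk scorer -- a stand-in for a real ML classifier.
    Here it just reacts to one word so the sketch is runnable."""
    return 1.0 if "hopeless" in text.lower() else 0.1

def monitor(post_stream, threshold=0.9):
    """Consume an iterable of (user, text) pairs -- conceptually a live
    feed of public posts -- and yield an alert whenever a post's risk
    score crosses the threshold."""
    for user, text in post_stream:
        if score_post(text) >= threshold:
            yield {"user": user, "text": text, "action": "notify_responder"}

feed = [("vet_a", "I feel hopeless tonight"), ("vet_b", "Great day at the range")]
alerts = list(monitor(feed))
print(alerts)  # one alert, for vet_a
```

The design point is the streaming shape: posts flow in continuously, scoring happens per post, and only threshold-crossing items generate an alert for a human responder, which is what distinguishes this from the hit-and-miss of hoping a buddy happens to see the post.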
Let’s get on with it.