>> It was in this deserted hotel in Phuket that a father filmed himself tying a rope around his baby's neck before dropping her from the rooftop. The video was up for roughly 24 hours on Facebook. This came just two weeks after a man in Cleveland killed a 74-year-old, live on the platform.
Reuters reporter Dave Ingram says pressure is mounting for the social media giant to fix this. But just how is unclear.>> Live is live, on Facebook. And they don't have a good automated way to check for acts of violence and prevent those things from being broadcast. They rely largely on their user base, thousands and thousands of people.
Potentially millions of people who are watching videos and can report when they see something that violates the Facebook terms of service. Now when they report, those videos get flagged and sent to workers at Facebook. They actually do have thousands of workers at Facebook who review these videos.
But clearly these workers and the viewers are not catching all of the violent content on Facebook Live.>> Facebook Live was launched a year ago as a way for users to stream their lives in a raw and unfiltered way. But with that openness comes obvious risks. And what began with Chewbacca mom now includes a gang rape, a kidnapping, and the recent murders.
In the latest incident, the father's suicide was not broadcast. But he was found dead near his infant daughter, who was put to rest Tuesday. The disturbing video was removed after Thai officials flagged it to Facebook, but not before garnering hundreds of thousands of views.>> So far Facebook has not faced any legal liability for these incidents.
They have cooperated in most, if not all, cases as far as we know, working with authorities and giving them information in response to court orders or other requests.>> Meanwhile, Facebook CEO Mark Zuckerberg admitted at a recent conference that his company has not solved the problem.>> We have a lot more to do here.
>> There are calls from some civil rights activists to take Facebook Live offline while the company figures out a better way to curb this behavior. But so far there is no indication from Facebook that that's likely to happen.