Facebook is a massive corporation with the power to do great good as well as great harm in communities. An unfortunate example of the latter recently caused social media users in the United Kingdom to see extremely vulgar ‘suggestions’ appear in Facebook’s search bar. In response to complaints, the social media giant claimed that the automated suggestions users see are generated from the popularity of other users’ search histories.
While this may reflect how the site’s algorithm works, it does not excuse the fact that users were exposed to suggestions of vulgar sex acts and illegal child pornography. Reportedly, Facebook account holders in the U.K. were typing the phrase ‘videos of’ followed by the subject they were searching for. It was quickly discovered, however, that this phrase alone was enough to prompt the site to push some undesirable suggestions.
Among those suggestions was child pornography, which, it should go without saying, is highly illegal and immoral. While the remaining sexual content might otherwise have fallen through the cracks, this proved too offensive for users to ignore, and they rightfully reported it.
Not only did the platform fail to protect vulnerable lives, but it also violated its own security policies. In its terms of service, Facebook clearly states that inappropriate content is prohibited on the site. While this rule may protect some users from potential abuse, its main purpose is to keep the site clean and appropriate for everyone.
The social media platform not only failed to abide by its own rules but also put others at risk, most importantly children, who cannot defend themselves against predatory adults.
The site, however, is not taking responsibility for the way it momentarily allowed users to access shocking content. Instead, it claims that search suggestions are the result of aggregated searches made by other active users.
While the algorithm may indeed work this way, that does not excuse how such content surfaced in the first place.
The site not only failed to censor content that it itself deems inappropriate but also essentially promoted the disturbing crime of child sexual assault.
Admittedly, Facebook has a daunting amount of content to oversee, as in 2017, the site reported having 2.2 billion ‘active users.’
Nonetheless, it made another critical error recently when it presented users with a highly disturbing poll soliciting their opinions on child abuse.
Specifically, it asked whether users believed that pedophilia was wrong, with questions referring mostly to “underaged girls” and “sexually explicit photographs.”
This content, which mirrors the ‘grooming’ practices predators use to prepare victims for assault, is something the corporation has a responsibility to address, as online crimes can be carried out more effortlessly than their physical counterparts.
Facebook defended its actions, stating, “We run surveys to understand how the community thinks about how we set policies.”
While that might be an acceptable explanation for virtually any other topic, it is repulsive that the company had not already implemented policies to catch sexual predators.
“That was a mistake,” Facebook vice president Guy Rosen admitted.
Diana Graber, founder of the accountability organizations Cyber Civics and CyberWise, added, “The fact that Facebook would even pose this question theoretically is disgusting.”
The social media company insists that it “work[s] with law enforcement to find offenders”; however, it clearly failed to do so in this instance, as an undetermined number of pedophiles likely responded to the survey.
This recent revelation is tragic considering the children who are currently facing similar abuse. While that is arguably much worse than any of the company’s other offenses, it is not surprising coming from Facebook.
The social media corporation has been shown to censor content based on what it determines will be popular in users’ news feeds. Most recently, this has infringed upon the free speech of conservative voices, who are less likely to be heard than their liberal counterparts.
Again, the platform defends itself yet still blames such prejudices on its algorithm. It claims to prioritize content based on its ‘trending’ factor while failing to acknowledge the validity of what it is endorsing.
This has caused many to question the legitimacy of the site’s content, which is unfortunate considering that Facebook was designed to be a collaborative platform encouraging user feedback.
It appears, however, that the social media company cares more about its bottom line than that of its users.
While the company does not directly charge users for its services, the site’s popularity makes clear that the time users spend there translates into revenue, revenue that, in cases like these, effectively promotes inappropriate behavior, sexual assault, and child abuse.