Facebook takes on a more proactive role in monitoring Groups


Facebook is taking a more proactive approach to moderating content in its Groups feature, starting with how people find groups and how the platform monitors what is posted inside these micro-communities.

Facebook Groups is a widely used feature of the social media site, connecting like-minded people so they can chat, share, and exchange information with one another.

These micro-communities currently come in three privacy settings, which many users find confusing: public, closed, and secret. If a group is public, any Facebook user can find it, view what members are posting, and request to join. In closed groups, only current members can see who is in the group and what members are saying. Secret groups are more exclusive still: only current members can find the group, and joining requires an invite.

Facebook announced in a blog post that it is overhauling the feature, starting by clearing up the confusion around privacy settings. The platform is replacing the public, closed, and secret labels with a more straightforward scheme.


Soon, Facebook Group administrators will choose to make a group either Public or Private. In a Public group, anyone can see the group shared in people's timelines, search for it directly, and read the posts and comments members leave.

Meanwhile, choosing “Private” prompts group admins to make the group either Visible or Hidden. A Private-Visible group is similar to a closed group: people can search for it but cannot see the content shared inside. A Private-Hidden group is completely obscured from other people unless they have received an invite to join.
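For readers who think in code, the renaming amounts to a one-to-one relabeling of the three existing settings. Below is a minimal, hypothetical sketch of that mapping; the enum and dictionary names are illustrative stand-ins, not part of any Facebook API.

```python
# Illustrative sketch of the Groups privacy renaming described above.
# Everything here is a hypothetical stand-in, not Facebook's actual API.
from enum import Enum

class OldSetting(Enum):
    PUBLIC = "public"   # discoverable, content visible to everyone
    CLOSED = "closed"   # discoverable, content visible to members only
    SECRET = "secret"   # invite-only, not discoverable

class NewSetting(Enum):
    PUBLIC = "public"
    PRIVATE_VISIBLE = "private_visible"
    PRIVATE_HIDDEN = "private_hidden"

# The announced change maps the three old settings one-to-one.
MIGRATION = {
    OldSetting.PUBLIC: NewSetting.PUBLIC,
    OldSetting.CLOSED: NewSetting.PRIVATE_VISIBLE,
    OldSetting.SECRET: NewSetting.PRIVATE_HIDDEN,
}
```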

“By separating the privacy settings for posts and group membership from the overall discoverability of the group, it is easier for admins to understand and manage their group privacy settings, and also easier for members to know important information like who can find the group,” Jordan Davis, product manager for Facebook Groups, said in a blog post.

The new changes rolling out on Facebook Groups are part of the “Safe Communities Initiative,” which includes something Facebook is calling “proactive detection.”

The company started the initiative two years ago in an effort to monitor and detect bad content in Facebook Groups.

Although Groups have been an effective online meeting place for friends and like-minded people, they have also provided a gathering place for people with bad intentions, spreading harmful ideas that have spilled over into real-life harm.

In particular, Facebook Groups with the secret privacy setting have been found to harbor racist and offensive activity, extremism, and white supremacy; ProPublica, for example, discovered a secret group of Border Patrol agents joking about migrant deaths.

In other, less physical instances, Facebook Groups have been known sources of misinformation that continues to degrade the quality of information available to the public. Such groups have been blamed for disrupting elections and fueling the spread of the anti-vaccination movement.

To address this, Facebook has decided to deploy stricter monitoring tools that would “proactively detect bad content before anyone reports it, and sometimes before people even see it.”

Facebook says it will use artificial intelligence to sift through millions of Facebook Groups, including supposedly hidden ones, to prevent threats such as terrorism and violence from taking root on the platform.

Notably, some experts say Facebook will need to be more transparent about what posts and information this AI can see and who gets access to them, especially given reports indicating that the social media site is not the most privacy-centered platform out there.

In a blog post on Wednesday, Tom Alison, Facebook’s VP of engineering, said this technology is part of a new tool the company is calling Group Quality.

“As content is flagged by our systems or reported by people, trained reviewers consider context and determine whether the content violates our Community Standards,” Alison said in the blog post. “We then use these examples to train our technology to get better at finding and removing similar content.”
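The loop Alison describes, machine flagging, human review, then retraining, can be sketched in a few lines. The following is a hypothetical illustration of that feedback cycle, not Facebook's actual system; every class and function name below is an assumption.

```python
# A minimal, hypothetical sketch of the flag -> review -> retrain loop
# Alison describes. Post, LabeledExample, review_queue, and retrain are
# illustrative stand-ins, not Facebook's systems.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    text: str

@dataclass
class LabeledExample:
    post: Post
    violates_standards: bool  # the trained reviewer's decision

def review_queue(flagged: List[Post],
                 reviewer: Callable[[Post], bool]) -> List[LabeledExample]:
    """Reviewers consider each flagged post in context and decide
    whether it violates the Community Standards."""
    return [LabeledExample(post, reviewer(post)) for post in flagged]

def retrain(model, examples: List[LabeledExample]):
    """Reviewer decisions become labeled training data, so the model
    gets better at finding and removing similar content next time."""
    texts = [e.post.text for e in examples]
    labels = [e.violates_standards for e in examples]
    # `model` is assumed to expose a scikit-learn-style fit(), e.g. a
    # Pipeline pairing a text vectorizer with a classifier.
    model.fit(texts, labels)
    return model
```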

Group Quality also gives admins more authority through moderation tools and gives members the option to see the group’s history and preview its content before accepting or declining an invitation.

Specific to the new admin monitoring tools, Facebook will provide insights into why posts are removed, while also giving admins the option to share which rules were broken when they decline pending posts, remove comments, or mute members.

Relatedly, Facebook warned Group admins a few years back that they will need to moderate their content more strictly or risk having their groups removed entirely.

“Being in a private group doesn’t mean that your actions should go unchecked,” Alison said.
