“Auto-reported by Facebook” Rolls Out on Facebook Groups
FBtutorial.com — It seems the social network giant has rolled out a new reporting system called “Auto-reported by Facebook”, as witnessed in Facebook Groups, FBtutorial.com reports.
On April 26, 2016, we discovered in one of the Facebook Groups we manage (currently over 24K members) that a new auto-reporting feature by Facebook has been rolled out and is in effect.
We are not yet sure whether the new automated reporting system is enabled only for Facebook Groups or site-wide, but it seems likely Facebook will eventually roll the feature out to Facebook Pages as well, so Page Admins can review and remove any disturbing content.
How Does Auto-reported by Facebook Work?
The Auto-reported by Facebook robot works by automatically detecting and reporting disturbing and pornographic content to the administrators of a Facebook Group, so they can review and remove the violating content.
Group Administrators can also approve automatically reported content so that it remains visible in the Group they manage, if the content was flagged in error and does not violate Facebook's terms.
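For illustration only, the detect-report-review flow described above can be sketched as a toy moderation queue. Every name here (`Post`, `auto_report`, `admin_review`, the banned-term list) is a hypothetical stand-in; none of this reflects Facebook's actual implementation or API:

```python
from dataclasses import dataclass

# Toy sketch of the auto-report flow; all names are hypothetical,
# not Facebook's real data model or API.

@dataclass
class Post:
    post_id: int
    text: str
    flagged: bool = False   # True once the "robot" auto-reports it
    visible: bool = True    # False once an admin removes it

def auto_report(post: Post, banned_terms=("porn", "obscene")) -> Post:
    """Toy detector: flag the post if it contains a banned term."""
    if any(term in post.text.lower() for term in banned_terms):
        post.flagged = True
    return post

def admin_review(post: Post, violates_terms: bool) -> Post:
    """Admin resolves a flagged post: remove it, or approve it so it
    stays visible (the false-positive case described above)."""
    if post.flagged:
        if violates_terms:
            post.visible = False
        post.flagged = False  # report resolved either way
    return post
```

Approving a wrongly flagged post would be `admin_review(post, violates_terms=False)`, which clears the flag and leaves the post visible.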
Below is an example of the disturbing content shared in a Facebook Group we manage, which was auto-reported by Facebook.
From our experience, when users whose accounts have been compromised share or post obscene pornographic content in a Facebook Group we manage, a Facebook robot automatically detects and reports the disturbing content for the Group Admin to moderate.
This new feature is definitely needed in Facebook Groups and is a time-saver. Auto-reported by Facebook will help Group Admins better manage their communities by flagging the obscene content that has become the order of the day in Groups.
If you own, manage, or administer a Facebook Group, have you noticed the new “Auto-reported by Facebook” feature? Share your thoughts on FBtutorial.com!
UPDATE: It is important to add that not everyone is having a positive experience with the new “Auto-reported by Facebook” in Groups. In fact, there seem to have been more negative experiences with the auto-reporting feature, a robot Facebook recently implemented to curb the flood of obscene content being shared in Facebook Groups by users whose accounts have been compromised. Read the comments below.
We have had five of these auto-reported posts by Facebook in the past 24 hours, and unfortunately, they are not working as a time-saver. Indeed, the opposite applies. None of the posts that were flagged by their algorithms as being in breach of their standards were actually breaching standards. One was only a drawing of a stick figure, a drawing like you would expect to see in a child's drawing book, and yet that was flagged and reported. The result it has had in the past day is not to save us time as administrators, but to increase our workload. We have to check each auto-reported post, evaluate it, and in all instances we have had so far, leave the post active, because it was not in breach. Having had five posts auto-flagged in a single day, none of which breached their standards, is a sign of things to come. I envision this auto-reporting being nothing but a pain in the neck for administrators, giving them extra work. A far cry from saving time and making the administering of groups any easier.
@Heather — thank you for your feedback.
Sorry to hear that your experience with the new auto-reporting system by Facebook has been negative. Considering this is a new feature, it is expected there will be issues from inception, and FB will continue to fine-tune its algorithm to be more accurate.
We can't help but wonder whether members of your Group manually report disturbing posts to Admin. In the Groups we manage (especially in the fashion niche), members do report obscene content to Admin, so we wonder whether Facebook's algorithm is somewhat influenced by the type of content members have reported.
• What niche is your Facebook Group in, and do users with compromised accounts post or share disturbing content in the group?
So far, all content flagged by the new “Auto-reported by Facebook” has been 100% accurate in the groups we manage, where at least 10 obscene posts from compromised accounts are dumped in the group on a daily basis.
That said, we've had no negative experience whatsoever with the new auto-reporting feature by Facebook. But of course, everyone's experience will differ, and that's understandable. We've shared this post with the Facebook team both on their Twitter account and on FB, so they can take note of the reactions to the new reporting feature in Facebook Groups.
We very seldom have any posts reported in our group, as all our posts are within Facebook standards. Over the past 18 months, we have had at most, I would estimate, 5 posts reported to Facebook (3 of which were not in breach of Facebook standards and were not removed). It is a humour group, for your information. Posts reported to admin have also been very minimal. We are already up in double figures with these auto-reported-by-Facebook posts, and quite frankly, if they do not improve their detection software to prevent it flagging all these innocent posts, they are going to have a great deal of angry and frustrated group administrators. I also admin a group solely for group admins, and among the admins, the same issue is being mentioned over and over again, namely that these auto-reported posts are innocent posts and not in breach of their standards. So far, my previous comment about envisioning these auto reports becoming an administrator's pain in the neck has, unfortunately, proven to be warranted.
Heather is correct. The auto-report is creating more work, not less, as pictures are being reported that have zero negative content, such as a swimsuit offered for sale in a selling group. If I do not reply and FB has to review it, it is more work for FB. If I do not check into my group as often as things are reported, I risk FB deleting a person's post, and then the angry “Why did you delete my post” PMs start arriving in my message box that I have to take time to address.
It sucks!!!!! I have a 114,000-member group and 988 reports; 95% of them are posts of people trying to sell their services, like spas, tattoos, underwear, and many other services that are legal and within the Facebook guidelines. It is more work for the admins because, even though this doesn't help, accepting or deleting a post takes a while, so moderating 10 posts takes about 5 minutes.
I ran afoul of the auto-report too. A friend in Messenger asked me for the definition of something, and I did a quick Google search for an image that explained the term and pasted the link in the chat window. WITHIN 20 SECONDS, I was COMPLETELY blocked from my FB account: no posts or responses, no chat responses, Likes, or changes to my profile or cover photo.
Even worse, the person on the other end has no idea what has happened. So for a week now, I have been receiving chat messages from angry and frustrated friends who logically believe I am ignoring them. And since there is no usable appeal process or reporting procedure, everyone in my business and personal networks thinks I have simply disappeared or gone to jail.
I am recommending to all my friends and business associates to start moving to Google+ because Facebook is now COMPLETELY UNRELIABLE as a communications medium.
No doubt.
I am currently dealing with the same situation. I cannot send messages for another who-knows-how-long because of a post in a private group of artists that was reported. It wasn't even a “hey, this is bad, take it down” notice. It was a “you have been silenced for 28 hrs with no recourse; on top of that, you are no longer allowed to use Messenger for an indeterminate amount of time.” I run 3 businesses via FB. The clients/friends on the other side have NO idea why I am not responding. This is ridiculous; they just get mad / I lose my business. I agree, no more FB Messenger for me.
BS, it's causing more of a workload in our group. Nothing, out of almost a dozen in the past two days, has been inappropriate.
I own a few groups and had to close one due to hacker attacks.
I opened a new one, and a lot of posts that have nothing wrong with them are being reported. I have a zero-tolerance policy for sick things in my groups, and an instant ban for anyone who tries to post it.
My newest group is a joke adult group, over-18s only, and a closed group; my admins watch the group like hawks and take no crap.
I find this new feature very irritating and annoying; I would also rather report posts myself.
That sick porn vid that's been going round with the young couple (and I've seen many more): I've had attempts to post that many times and sent the gobshites packing.
I took it that the auto-report was for posts like that, not joke posts or normal posts.
We too have had a couple of auto-reported posts, reported while they were awaiting Admin approval. So obviously no group members had reported them, as they were not yet posted!! Nor was there anything the least bit disturbing or offensive in them. (Well, other than the usual disturbing political news 😉)
So far, in our group, the Auto-Report function has been 100% INCORRECT!
This has been a truly irritating issue; every day we get dozens of reports for a dress, make-up, cushions. The algorithm isn't working.
I just received an auto-report message from Facebook for a post in my mental health group. The post? Pictures of a woman and her puppy.
I have posted pictures and memes that are neither pornographic nor dirty in a group, and they have all been auto-reported by Facebook. The admin told me that he approves them all, but a lot of them are being auto-reported by Facebook. I have the pictures if you want to see them; please fix this.
We are having horrid issues with auto report, and 99% of the pictures have no issue.
This is killing our groups. 100% Fail.
@Steve — sorry to hear your experience with the new “Auto-reported by Facebook” has also been negative.
When we get any feedback from Facebook regarding the issues (considering we’ve already alerted them of the complaints), we’ll be sure to post an update here.
Thank you for your comment.
it’s November and this shit is even worse. And now when one is post banned they can’t even PM friends. Gee I wonder why people make fake accounts? Oh yea, to bypass Facebook fascism
Perhaps Facebook doesn't need a robot to decide what Facebook users want to see and share with each other. It's bad enough we have actual people reporting each other, but at least they are better at recognizing what they are offended by. Auto-reporting flags things that no person in the group took offense at, and it makes the Facebook experience worse for all users, not better. It creates a punitive environment, actually punishing people for using Facebook, when they can go elsewhere on the internet without risking such draconian penalties.
You are in the majority I think, Steve. Same is happening to our groups. In a group I admin, for admins only, the same problem with these auto reports is being found, namely that they are auto reporting innocent posts. All it is doing is giving us extra work and causing frustration.
I made two posts to my own group as admin last night. They were both auto-reported. What’s even worse is that neither I nor my fellow admin (whom I added in desperation today) can “clear” the posts – they’re still flagged. Need to know what I can do about it.
@Lynn — sorry to hear you're also having a negative experience with the new “Auto-reported by Facebook” robot. So far, we've had no issues with it in the Groups we manage, but we've reached out to Facebook on behalf of those having a negative experience with the new feature, so they can fix the bug.
In the meantime, we highly recommend contacting Facebook Support directly via the Facebook Help Center (https://www.facebook.com/help/162866443847527). The more complaints from users, the quicker they'll likely fix the issues some Group Admins are experiencing.
Upon visiting the Facebook Help Center, click the “Contact Us” button; under “Choose the issue(s) that you're having”, select “Something isn't working” or “Something else”, then explain in detail the issues you're experiencing with the “Auto-reported by Facebook” robot.
Thank you for your comment, and please be patient while the Facebook team works to fix the issues.
— FBtutorial.com team
So, in order to deal with this latest ill-functioning “improvement” by Facebook, which is wasting Admin time, we now have to submit more reports detailing the latest failure of the latest improvement??
Thanks 😛
Add our group to the list of those who hate the new “auto report” feature and want it to go away forever.
Not a single ONE of the images being flagged violates FB standards, let alone would bother anyone in our group. It's a huge pain in the neck and a huge waste of time for those of us who have to review the auto-reported images.
If FB is going to add this feature, we need to have the ability to turn it off if we don’t want to use it. (Because we DON’T want it.)
@Carole — sorry to hear your Group is also negatively affected by the new “Auto-reported by Facebook” robot. As stated in a previous comment, we've reached out to Facebook on behalf of those having a negative experience with the new feature, so they can fix the bug.
In the meantime, we highly recommend contacting Facebook Support directly via the Facebook Help Center (https://www.facebook.com/help/162866443847527). The more complaints from Group Admins, the quicker Facebook will likely fix the underlying issues with the new auto-reporting robot in Groups.
Upon visiting the Facebook Help Center, click the “Contact Us” button; under “Choose the issue(s) that you're having”, select “Something isn't working” or “Something else”, then explain in detail the issues you're experiencing with the “Auto-reported by Facebook” robot.
In response to your last statement, we totally agree with your suggestion that Facebook provide Group Admins the option to turn off the auto-reporting feature, or at least an option to tweak the automatic reporting setting to suit each Group (for the auto-reporting robot to act “Normal”, “Permissive”, or “Aggressive”, similar to anti-virus programs).
Thank you for your comment. We look forward to Facebook fixing the issues and improving the auto-reporting system in Groups.
— FBtutorial.com team
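The per-group sensitivity setting suggested in the reply above (“Permissive” / “Normal” / “Aggressive”, like an anti-virus program) could be sketched as a simple score threshold. This is purely an illustration of the idea; the mode names, thresholds, and the notion of a violation score are all assumptions, not anything Facebook actually offers:

```python
# Hypothetical sketch of an admin-tunable auto-report sensitivity.
# Mode names and thresholds are invented for illustration.

THRESHOLDS = {
    "permissive": 0.9,   # flag only near-certain violations
    "normal":     0.7,
    "aggressive": 0.4,   # flag anything remotely suspicious
}

def should_flag(violation_score: float, mode: str = "normal") -> bool:
    """Auto-report a post only when its (assumed) violation score
    meets the threshold the group admin selected."""
    return violation_score >= THRESHOLDS[mode]
```

A group drowning in false positives could then switch to “permissive”, so a borderline score of 0.8 would no longer be flagged.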
This is not a “bug”. This is not something Facebook will ever have any interest in “fixing”. You can report your issues till you’re Facebook blue in the face, and nothing will change.
@FBtutorial.com I can’t find a “contact us” link…. I’m thinking maybe they’ve changed things since you were last there?
Here are screenshots of what I see when I’m there (the first is the page you linked to, the second is the main help center page, the third is the page I’m taken to when I click “report an issue”): https://www.dropbox.com/s/kzjqqmp8rxr0t4t/FB-help-center-screenshots.jpg?dl=0
@Carole — We've seen your screenshots. We don't know why the “Contact Us” button is not showing for you in the Facebook Help Center, but it's showing from our end, even right now. (see screenshots: http://fbtutorial.com/wp-content/uploads/2016/04/contacting-fb-help.jpg | http://fbtutorial.com/wp-content/uploads/2016/04/contacting-fb-help-2.jpg)
Facebook sometimes shows different things (or doesn't show the same things) to all users; we can't really explain why. It may also depend on how long you've been a registered user on Facebook, whether your account is confirmed/verified, etc.
Can you have someone else with a Facebook account visit the Facebook Help Center and see whether the blue “Contact Us” button appears for them?
Facebook went through several weeks ago and selectively removed reporting features for thousands of users. Users who have a favorable attitude toward Facebook or who spend thousands of dollars per day on ads can still access Chat, etc. Low dollar advertisers and users who have ever expressed any sort of negative opinion of Facebook whatsoever have had ALL their contact options removed.
You say to contact FB for help, yet when we do, it isn't addressed or looked at. SMH. ANOTHER FAIL.
I agree with you. I have been reported twice and had a one-day and a 3-day block put on me for pictures that had no nudity whatsoever. You try to reach out to Facebook to find out what is going on, but of course there is no response and no way to contact someone with a heartbeat. This is ridiculous, and nothing I got blocked for was reported to any admin.
Here I am, more than 3 years later, with exactly the same issue. Just posted to a sell group as usual, hit the “Post to another group” button, and instantly received a bunch of messages that my posts were blocked because, you won't believe it, of spam! It was just a fucking refurbished turntable, no porn, no machine gun. So I clicked the “No spam” button, the posts were then published, and everything seemed nice and shiny. Until I received another message a minute later that I was banned from posting for three days. Kudos, Facebook! Like really, more than three years and you were completely unable to handle this primitive issue. Like some poor niche startup…
Is it not possible for Facebook to instead have a filter that will allow for art nudes, violence, etc.? I belong to several photographic groups, fairly large at that, one of them being pixoto.com; they have a feature that allows the viewer to choose what they want to see, in that way not offending anyone. Surely this would be better than a draconian auto-report that none of the groups I belong to like?
But Peter, if Zuckerberg (aka World’s Biggest Net Nanny) allowed adult FB users to control what they wanted to see, then he wouldn’t be in charge of the whole world anymore, and might have a meltdown.
I think it's ridiculous that grown people can't do their own thing in a well-maintained and screened group. But you have groups where children are subjected to grown men. I have reported these many times with no outcome. Get off the backs of grown consenting adults and get these pedophiles. You've got people being banned over seat cushions because of this program. Thanks a lot… fail.
AGREED… There are pages on here promoting hate, violence, and the killing of police, and nothing is done to them. But let a nip slip, and you get blocked or removed.
I reported a video last year that graphically showed three laughing men torturing and branding a screaming child. I was horrified and couldn’t eat, sleep or stop crying for days. When Facebook finally got back to me, I was informed they had reviewed the video and…drum roll, please…”it does not violate our community standards.”
I agree, this new feature is repulsive and idiotic. There are numerous adults-only secret groups where consenting adults with alternative viewpoints share. If no one in the group is complaining, this feature kicks down the door like the Gestapo and bans good people who are well liked within the group. This may drive half of Facebook's users to Tumblr… I know if I'm put in Facebook jail again, I'll start the move over there, and after the conversations I've had… I'm not alone.
Will it show us the names of the people reporting, to help us as admins stop those who are reporting just out of malice toward another member?
I’m experiencing that right now! I’ve had 3 bans in the last 2 weeks, and I suspect some “social justice warrior” has it out for me. Memes from months ago are being reported, and I’ve never threatened, used foul language, or posted anything pornographic. Facebook doesn’t care. They have a crusade to win. Narcs’ names are never divulged.
I am in a closed group of adult male veterans. Some members do post pin up girls. Some post pictures of war time experiences. Others post political comments. For now, there is no other option to Facebook. We do NOT need Big Brother type monitoring of what we choose as adults. The images in our optional membership closed group are harmless.
The only problem I’ve had with it is, as an admin, it won’t let me approve some of the posts… only the group creator seems to be able to approve certain posts. Members tend to get a little pissy when their posts are held up for approval for several hours until someone can approve it. Not sure what the issue is but it will let me approve some but not all.
I am the admin of a couple of book groups on Facebook.
I think this auto-report is a load of crap, because most of the posts of mine that get reported are for books, since I am the admin of a book group. No porn, no violence, no obscene language, or anything, and as soon as it gets posted, it is AUTO-REPORTED.
THIS SUCKS!!!!!
Agreed. Time waster, reports everything and NONE of it is objectionable. At this point I spend my FB time running around fixing this FB fail in my many book & author groups.
These are groups I created and have run successfully and without issue for years. With the exception of one group, there have never been reports to admin, and in the one group where there was a report by a member, it was THAT member who had an issue with spamming many groups (whom I promptly removed), not the reported post or the reported poster. I'd rather determine myself what's acceptable or not and remove members accordingly.
Sometimes there are just things where technology should NOT be used in place of a human.
This new feature is auto-fail, in my opinion, and makes for a horrible FB experience.
Looks like Facebook has added another useless feature. I have a suggestion: stay OUT of Closed/Secret groups that are (Adult) Members Only! Let the creator and admins control the individual group without Facebook sticking its nose in where it doesn't belong! This type of crap is ruining Facebook for its users.
Right? Adults can adult without FB’s “help”
As an admin of a group, I would rather have the ability to know who is reporting posts to facebook. We’re an art group that has a strict rule against nudes/porn, yet we have had several streaks of posts being reported to the point that the original owner of the pages quit. It seems that there are some that just report posts to be jerks, and it would help to be able to identify people that habitually report posts that are not violations.
I think Facebook needs to join the 21st century and grow up. I'm probably going to be blocked for this, but some people are mature and want the content that has been banned. Allow for it; just make sure there are rules. Just as prohibition did not work, blocking content that you don't approve of won't work; it will keep happening. So learn to deal with it.
Yes!
Exactly. What is the purpose of allowing these groups to be private if Facebook is going to continually change the dynamics of the website? Those who are like-minded and mature do not need to be treated like children and monitored. Yes, some abuse things, but everyone should not be punished for the few.
In one day I had ten postings auto-reported by Facebook. The subject of my group is wrestling, and Facebook auto-reported 6 YouTube videos from WWE and 4 blog links posted by users.
Might work better if you FB admin dipshits got the idea that breastfeeding and childbirth aren't obscene or inappropriate, but memes that make fun of disabled people are. Honestly, this is going to take a hell of a lot of fixing to get my vote.
I help run a political group and I’ve noticed that a couple of well respected (amongst people of my political persuasion, the dominant party in my country) blogs and twitter feeds are now being reported. I can only assume that this is political opponents gaming facebook’s robot into thinking that perfectly acceptable content is unacceptable – they are doing this in an attempt to suppress political discussion. Facebook really needs to rethink these bots.
It's ridiculous. In a group where there are no children, all responsible adults, why do we need this? It's a pain, and "big brother" going too far. Concentrate on trolling and nasty posts that have more of an impact than a pair of boobs.
I posted something to our group that did not violate any community standards; it was two ladies in bikinis. FB took it upon themselves to delete it and block me from posting for 24 hours. I think the auto-reporting needs to go away, or let us have the option to turn it on or off. Not to mention, when you get blocked and it asks you to let them know if you think this was in error, and you do, you never get a response.
I help admin a politically incorrect meme group. The auto-report feature has alerted us 5 or 6 times to inappropriate content. The users who posted the content were not active members. So far, this feature has helped us and has been a positive experience.
@Tina — Glad to know your experience with the new “Auto-reported by Facebook” has been positive.
We had a positive experience with the auto-reporting too, in the few groups under our management. However, the negative experiences with the auto-reporting robot have overwhelmed the few positive ones.
Hopefully Facebook will fix the issues associated with the new feature soon, or give Group Admins the option to turn it off/on, as the feature may not be necessary for some (adult) groups.
As I am sure you have noticed by now, this auto-reporting feature is proving very unpopular indeed. Regarding the post you published a screenshot of in your article: that looks like a scammer's post. These types of posts are made by scammers on behalf of the account holder, usually after the account holder clicks on a scam link, which inadvertently gives them permission to post on the account holder's behalf, often taking the account holder to a fake page and asking them to log in to access the link. If that was all the auto-reporting was doing, it would be an asset, but that is not what it is doing. It is reporting numerous innocent posts, clogging up admin panels and costing us time. You will have noted by now that positive experiences with this feature are in the minority.
@Heather — You’re right, the negative experiences far outweigh the positives. In fact, the negative reactions are overwhelming. Imagine if Facebook had released the “auto-reporting robot” site-wide?
Anyway, we don't think Facebook had any intention of flagging appropriate content with the “auto-reporting” robot in groups; that would be counter-productive. However, the robot was released too soon and is, without a doubt, unstable, if not inappropriate.
The robot should, at best, be programmed to only flag malicious (and inappropriate) content from users whose accounts have been compromised and used to flood groups with obscene materials/links in an attempt to hijack more accounts. You've done a great job explaining how users' accounts get compromised below:
“These types of posts are made by scammers on behalf of the account holder, usually after the account holder clicks on a scam link, which inadvertently gives them permission to post on the account holder's behalf, often taking the account holder to a fake page and asking them to log in to access the link. If that was all the auto-reporting was doing, it would be an asset.”
We still haven’t heard any word from the Facebook team.
My question isn't about auto-reporting; so far we have had none of that. What I'm having trouble with is a member flagging posts. How do I stop this? Is there a way to find out who it is and remove them? She is flagging 2-3 posts a day.
@Toni — When a member in your Facebook Group flags a post, you will also see the name of the member who did the flagging.
If you are the Group Admin, you have the option to remove any member flagging posts unnecessarily and even block them from requesting to join your Group again.
Aren’t you able to do this in your Group page?
When I post or share in a group in which I'm an admin, my posts are auto-reported; when I share to a group where I'm just a member, no problems 🙂 pls correct it.
I admin more than one group, and because of this stupid robotic reporter I am spending almost 7 to 8 hours a day dealing with reports.
This is Facebook; it's supposed to be fun, not a second full-time job.
The admins in the groups are having to add more admins to deal with the reports.
One group is a push-the-Facebook-regulations-to-the-limits-but-stay-within-them group.
5 hours a day on that one alone…
Fix it or get rid of it.
People will start to leave Facebook groups if it's not fun anymore.
What would be helpful is a policy where FB lets people know who reports their pics! That way we can keep our groups happy and at the level we like them to be, removing people who don't fit in with what it's about. I run an 18+ group; our rules are strict, yet every now and then you get people who think it's funny to report photos that are not even breaking the rules! It's annoying and painful. So far, this new feature has me thinking FB is sexist, as naked female bottoms are allowed but male bottoms are “auto reported” lol. How about asking admins and owners of groups what WE want for our groups???
Yes!! This!! Let the admins know who is reporting posts. No more hiding behind anonymous reporting to cause trouble!
Reports to Facebook must ALWAYS remain anonymous. Imagine if someone reported a post they read about someone 'bragging' that they had just raped or killed someone. Were Facebook to notify the reported person who it was that reported them, the reporter is then placed at serious risk of harm. That risk of harm could prove to have tragic or even lethal consequences, should the reported person decide to exact revenge. Even innocent posts that are reported, although they are a pain in the neck, must be kept anonymous for safety reasons. Unfortunately, there are many 'unstable' and violent people in this world who would seek to harass, bully, threaten, or cause actual harm to someone, were they to be notified that their post was reported by them.

Unfortunately for us admins, Facebook reporters must remain anonymous to ensure the safety and well-being of the reporters. However, what I would like to see them doing is inflicting penalties on petty reporters, just as they inflict penalties on people who have been reported. People who repeatedly make petty and trivial reports should be blocked from using Facebook and warned that repeated lodging of petty reports may result in their account being permanently disabled. Maybe then, these petty reporters would think twice before lodging a report.

Two of the admin team in our group had our profile pictures reported to Facebook for containing nudity. The profile pictures were face shots, so why were they reported as containing nudity? One can only assume it was a petty, misguided idiot who was simply trying to cause trouble. People like that should be blocked from using Facebook, but alas, reporters' identities must always remain confidential.
Perhaps in that case it would make sense that if someone reports a person, they should be automatically blocked from the other Facebook user's profile. As for the comment about violence and rape, that's a police matter, not Facebook's; the post would be better kept up as evidence. I too manage groups; there's no bullying, no harassment, no kid pics, no bestiality. Respect people as you would like to be treated in real life. This auto-report has taken me from enjoying Facebook to "I'll never come back to Facebook again", and the feeling is mutual all round. There are so many things Facebook could do, like letting admins deal with reports. My groups are only small and private, adults only; we don't need auto-report.
This auto-reporting is crazy. I have members getting red-flagged, and the pictures are not nudity as Facebook says they are. One was in memory of someone's mother; why would that be considered nudity? Like everyone else said, it's more time for owners and admins. You have to watch notifications and reports to admins as well. Groups are going to lose good, faithful members over this. Members are afraid to post anything now because they don't know if they will be red-flagged.
What is annoying is when you get that crap in an adult-content-oriented group. There is a reason there are admins in a group: to delete issues. What pisses me off is I've reported stuff that had animal or child abuse, yet it's allowed to stay, while in an adult-themed group or page little crap gets reported.
Going to be pretty funny when the groups leave FB because of this. The groups leave and the members will follow.
All responses are negative… get rid of it!!!
I admin three groups and so far this has only been an issue in one of them. My problem isn't so much with the new reporting app but with the constant changing of what is and is not appropriate content for posting. I was recently banned for 24 hours for a picture that showed no nudity at all, but in the reasoning for why it was inappropriate, Facebook stated "arousal"??
Here's an idea: let admins admin their groups. Public groups and profiles are FB's domain. Closed and secret groups, since the public cannot access them, are of no concern to FB. If a group wants to be successful, it will have its own pinned posts outlining its own rules, and it will police its own members. If members break those rules, they are warned, kicked, or banned. If members don't like the content of the group, they can leave; it's that simple. I am in quite a few groups and admin many of them. The admins keep in contact to discuss any problems within the groups and to communicate any bans or deletions that occurred and why. I like groups that have order and organization to them. If the group lacks order, I leave the group. If a user is offensive, I block the user. This is nothing more than highly unnecessary censorship.
I am an admin for a couple of groups. I understand the rules and regulations of Facebook. But as admins, please allow us to admin our groups as we see fit. It is understandable for Facebook to get involved if violence is an issue. But please allow us to post in the groups that we admin without getting flagged and/or banned. This app needs to be removed or really needs improvements. Thank you!
I run and admin several book groups, and I'm seeing an increase in reported posts (mainly book covers) that do not violate standards. This is frustrating and takes up already precious time to approve posts that shouldn't have been reported. What may work in some does not work in all.
Note that in the past I may have had the occasional reported post, maybe 4 or 5 in over a year! And within the last 6 hours, 2 (non-violating) posts were reported. Seriously hoping this is not a sign of things to come.
I use auto-scheduling to my Facebook group through Hootsuite. The first post shared a link to an article on the Forbes website. Facebook auto-reported it, and when I click on approve, nothing happens. It says it's approved but the notification still remains. How do I get rid of the notification and approve the auto-scheduled posts?
What about those of us who WANT the adult material in our private group? Let us share what we will amongst friends.
That’s against Facebook’s rules. Get over it. Don’t like it? Use another site.
If it is a secret group that only members of the group can see, and all of the members are 18+ and have agreed to the rules of the group, then why is it a problem? If something illegal is posted (child porn, a picture posted without someone's consent), the admins take it down and boot the person. There is nothing illegal or wrong being done. There is absolutely no reason to go all Gestapo on everyone.
I understand why the rule exists, so that children or other people who don’t want to see that won’t. But if it is a closed or secret group, the only people who will see it are the members, who are of legal age and have agreed to the group’s rules.
I have a secret group. Only friends have access. What we post to one another should be of NO concern to Facebook!
Apparently, my group doesn’t have the auto-report although at this point, I wish it did. My group, along with several others I know of, is getting bombarded with porn in the comments and I am practically having to stay on the computer 24/7 to deal with it. I have my group set for an admin to approve all posts but the problem is they are posting to the comments which Facebook allows regardless of your settings. Whoever the spammer is, is hacking the accounts of my users. It’s very frustrating.
It won't stop porn, just things that look like they MIGHT be porn. People who actually promote porn will take probably about an hour to work out how to get round the censor, and then it's back to square one.
By the way, there is a good chance you have spambots in the group. I would look at the accounts that are posting them VERY carefully and see if you are really sure they are real people. The easiest way to tell is to look at their timeline and see if they actually interact with comments on it. If you aren't sure about someone, you don't need them in your group!
Google Plus has no issues with that sort of thing. Facebook just gets more boring than ever. Way to go.
I depend on secret groups to exercise my First Amendment right to freedom of speech. Money is deemed speech in Citizens United. I just want to reach the communities that I engage with and not have my admins distracted by an algorithm. No matter how well the algorithm COULD work, some posts are not pornographic; they are art. Some speech is not "racy"; it is just not politically correct. As long as it is not child pornography and illegal, allow moderators to block your algorithm. It may be failing now, but I worry about a day when it is too good, and secret groups that could avoid anyone being offended will be relegated to inferior platforms (inferior to Facebook).
I rolled out a group named ALL TEXAN MUSIC under a personal account of the same name, not knowing it was heresy to make a personal account from anything other than a personal name. AFTER the personal account reached 5K friends, it was deleted from FB without notice or reply to my contacts in the HELP section. I never named myself as the Admin, but I operate a PAGE and another Group named ALL TEXAS MUSIC (from the website of the same name). As a result the group ALL TEXAN MUSIC has 0 (zero) admins and I have no way to moderate my own group. I have sent numerous messages to FB about this via the REPORT system and nothing is ever done; no replies are given. Is there any way I can actually correspond with a person working AT FB so I can regain access to my own group, which now has no Admins and, as a result, gets spam and porn I cannot address?
This new robot is reporting things that never should have been reported to begin with. The one reported today in my group was someone wearing blue jeans. Really? Another last night showed a bare back and nothing offensive or pornographic. Others in the Admin groups are reporting that ducks, candy, fridges, cars, etc. are being auto-reported.
Let the admins of the groups decide how to police their groups. We don't need some censor bot developed by special snowflake professionally offended liberal pansies to control language in private groups. It's ridiculous that you must not only control what others post that you 'might' see, but you must also control what others post even if you can't see it. It's pathetic and weak.
We have had several pictures auto-reported that are in complete compliance with Facebook's terms of service, and when we go to approve them and refresh, they are still in the auto-report window, so they never go away unless we remove the pictures. We reported the issue to Facebook with no changes or resolution in over a week. We have all our groups set for admin approval of all posts, so the new auto-reporting only adds more work and hassle for us. Just another Facebook fail that I hope they fix soon.
An absolute waste of time, and an annoyance. I am getting dozens of these a day, and so far only one of them was something that was removed, which we would have removed of our own accord anyway. Just more unwanted Facebook fascism.
This is one of the many reasons people are going to fuimpostingit.com. Groups and pages are no longer fun.
Yes, and their credo is absolutely no censorship over at fuimpostingit.com, given the constraint that users confine pr0n and other provocative material to groups created for those specific purposes… and that is reasonable.
The new auto report feature is just creating more work as none of the posts reported on my five groups have been in breach of Facebook community standards whatsoever. We have had porn saturating the comments on legitimate posts which are not detected, that in itself is enough work for the admins let alone the hindrance of notifications that plain waste our time. Most of the reports have been for admins posts, it’s absolutely ludicrous!
It flat out doesn't work correctly, which means it needs to be removed; from the many complaints shown here this is simple and obvious. The program just isn't smart enough to detect what's a breach and what's not a breach of the rules. No one should get a ban just because they post a banana next to a couple of oranges and not be able to correct the issue, but even if it were corrected, this kind of thing is happening constantly.
It's a simple fail.
This is a joke. We're having over 50% of our posts "Auto Reported" and 99% of those posts have been FINE. This is causing us to basically post twice. I, as the owner of the group, make a post; it is instantly reported by FB and then hidden until we go and approve it. Then we still get notified on the main page saying things have been reported. Hell, half the time even after we approve it they auto-report it again. FB is such a controlling asshole. This new feature is completely GARBAGE and it's ruining my damn groups :/
I was surprised yesterday when I put up a second pic on a post in my group and it was auto-reported. I was trying to figure out why as, though the guy was shirtless, so was the man in the first pic. So random. And as neither photo was in any way a breach of Facebook community standards, the reporting made zero sense. I’m worried that the new tool may be more sensitive than need be. Something needs to be done because auto-flagging innocuous content is a time suck for the user and will be, when enough complaints add up, an annoyance for Facebook to have to respond to. Might be best to nip it in the bud now.
My group, "Rubber, pup play", had some pics auto-reported. My group is a community for people who enjoy that lifestyle. If you will, please remove the feature from that group; it's unfair to my members. They are not posting any porn whatsoever, just friendly puppy-play cosplay. I have been monitoring my group very carefully for years and not once have I seen any porn. I have my own rules as well: as long as they are not showing any private parts and have some type of clothing on, I am fine with it. I am very devoted to my members, so please let me handle it myself.
This is NOT what group owners have asked you for. We asked that any reports made by members in our groups be directed to us first instead of FB. Some of us enjoy being members of secret groups where we can post what we want without being censored by your stupid standards or worrying about being reported by butthurt jerks that just want to cause trouble because something they saw in a group that they don't even have to be a part of offended them, ruining the little bit of freedom for the people who actually want to enjoy the group. This is just a further invasion of our privacy over things that don't concern you. Facebook needs fewer opportunities to screw people over by reporting things they're not being forced to look at, not robots babysitting grown people and harassing our groups. It's bad enough we can't post things we find amusing to our own wall because some pansy ass PC nut job might get their panties in a wad; now you've got to invade the private safe havens we've made for ourselves too. Fuck you FB. Fuck you right in the Zucker.
This tool should be optional. FB… give us a way to turn it off please.
Oh…so our comments about this have to be moderated too huh? Fuck you too.
Okay this is absurd. First time its happened to me. Shared a perfectly innocent link I’d seen on another FB page (couldn’t find the page so I tried the original link) and it auto reported in the group I’m admin of. It was about a fan fundraiser. Nothing even remotely offensive. Deleted and laboriously tracked down original FB post on page and shared from there. It had no problem sharing from there.
Facebook: if it’s NOT broken DO NOT fix it. The spammy stuff is robots and hacked accounts. That’s a security fix thing not a ‘block random posts’ thing!
I am an Admin in several groups on FB … The largest group we moderate is getting these auto-reports from FB daily now, although up to just a moment ago only ONE post violated our group standards & FB's, & would have been removed anyway by us … Also our members are very prompt at 'reporting to admin' anything nasty … all the others were unremarkable posts not violating anything at all … We have a lot of members but we have a lot of admins & the group is diligently moderated … Yes, occasionally we get these posts I'm sure most group admins have seen … (like the one at the head of this forum used as an example) but these posts are caused by malware & are not (usually) posted intentionally by members! The virus 'auto posts' to any groups the unfortunate clicker is a member of … & the ironic thing is it's a virus they've picked up on FB itself from all the 'click-bait' there!! I cannot imagine that this feature will be of any help to most of us … We don't need FB telling us this isn't correct … when yes, actually, it is!! When I see all the pornography, hate & abusive material on FB, & see all the arbitrary account deactivation when there is often no logical explanation given … I cannot get my head around the fact that they have come up with this pretty useless 'auto report' feature … I have sent several reports with screenshots via Help for Groups … Let's see, shall we? … Meanwhile, come on FB … Who makes these decisions? … It's a bit 'Nanny State', isn't it? … Don't you have people in your organisation going … 'Now hang on a minute' … 'Let's just think about this first'??!! You have too much automation & not enough humans, FB … You need a bit of reason & common sense … not just algorithms!
I posted a video of my daughter’s local dance recital number and it auto flagged and removed my post as well saying it was copyrighted and owned by Let’s Dance. Seriously?! I’m not impressed.
It works great, I guess, though I'm not sure how the program detects what's not acceptable, as none of the auto-reports submitted to the one group I admin have breached any of Facebook's rules. But it's no real drama to look at the said report and let it through to the wall or delete it if necessary.
I am one of 5 admins in a sewing/embroidery group. We are a closed group and we have very strict guidelines on what embroidery designs can be posted. I find nothing pornographic about a wine glass or a flower. How about you stop trying to make things better, which in turn makes our lives as admins more difficult? You changed the way documents in the files are viewed, and now the hyperlinks have gone. You have changed the way photos are uploaded into albums, in that we cannot add details before posting; we have to post the pic and then go in and edit that pic to add details/description. YES, I have complained and pointed out these problems, but have been totally ignored.
What's the point of having a private group if there's no privacy?
I literally just sat here over an hour to read people's feedback, and NOT ONE was in favor of this new bot being in groups. Even with 100% negative feedback, FB isn't going to change anything, because what the people WANT isn't a concern of theirs. Time to revive MySpace, maybe? It's only dead because we all came to FB instead.
I was blocked for 15 days from posting in groups or joining, with no explanation why and no replies to the so-called 'appeal' button they have. I posted nothing remotely offensive and this happened. Unreal.
This is by far the most clumsy and incompetent “feature” that Facebook has ever implemented. Today an article shared from the Washington Post, that in no way featured any questionable content (in text or images), was auto-reported. Nobody needed this feature, and it’s pretty obvious that nobody actually wants it either. It doesn’t save time, and the only thing it seems to have accomplished is to annoy admins and FB users around the globe.
I am the admin of two groups on FB which have had an experience with the auto-report dingus. In each case, auto-report flagged a post I myself had posted to my own group. One group is Celebrate What Christians Have in Common and the other is Gloriamarie's Progressive Stuff.
I am also the admin of two other less active groups.
In all cases the groups are closed, I have to personally vet and approve each potential member. While I appreciate that FB wants to crack down on porn, illegal stuff, etc, it seems as if the robot or whatever needs to have its parameters tweaked.
If FaceBook would employ industry standard quality assurance practices in their software development efforts, perhaps so many of their new “features” wouldn’t be so badly broken!!
I have a whole different version of 'auto-reported to Facebook'.
Mine means that anything posted in the group gets automatically reported to Facebook, not to the group admins.
I think that this system is stupid and a complete disregard of users’ rights and privacy.
There are so many people working for Facebook. Maybe they should have a better team in place to moderate content that is reported by users, and not have things automated.
There is a reason why God gave humans a brain and not an off-switch.
Users have no rights or privacy. Facebook hates its members. That's why you can't contact them and why they automate things, so their elite staff don't have to deal with the little people who use Facebook.
This is fine for main posts but does not work if they posted in the comments.
Yes, like a lot of crap "features" they roll out, it doesn't work and you can't tell them it doesn't work. The dumb "robot" has picked up and reported pictures of Jimmy Kimmel, Gwyneth Paltrow's dog, and a watermelon.
Why don’t they fix existing “features” before they introduce something else that doesn’t work?
I had one post flagged in my group today: a lawyer announced his office relocation with a link to his professional website. No idea how that is inappropriate by FB standards.
I own 7 groups, mostly sports groups. I was logged out of my account; when I logged back in, Facebook had removed a post that was not nude. My members have had theirs removed too. I thought it was supposed to be reported to group admins to remove; we do not get notifications. Also, over my removed post I was banned for 7 days and locked out of my account, while none of my members have been banned for any length of time. This is unfair since I am the group owner. I understand the removal, but don't ban people from posting for 3 to 7 days; that's not right.
I guess Facebook is trying to get people to leave, and maybe someone needs to create a whole new web community and just put Facebook in the dumpster. Go the way MySpace did.
The problem I have with the new "feature" is not the detection itself; I suspect that will improve over time. But once a post is reported, I often find it hard to clear the report.
After inspecting a photo and finding it does not breach Facebook's rules, I click the tick to allow it back in the group, but after doing that and clicking refresh, it's still there, marked as reported, and it seems to take hours to clear, if at all.
I currently have 3 posts on report in my group that will not go away.
An obscene post piggy-backed on an innocent post on my Facebook last night. It was removed quite quickly. One thing that was disturbing was that the woman depicted looked very young.
I'm so glad to read these comments and know that I am not the only one who thinks this new feature is a huge time-waster. Since they first implemented it, I have had way too many notifications of posts I've had to approve that had nothing wrong with them. Yet when someone reports something that is actual violence, threats, or otherwise disturbing imagery, Facebook refuses to do anything. But by all means, let's send out the robots to stop the porn. The whole reason groups have admins is so we can monitor things and delete spam and inappropriate posts. I understand some may appreciate this feature, but it should be optional, something you can turn off, not mandatory.
We’ve had 3 or 4 auto-reports in our group and none of them were remotely offensive. It has been nothing but a waste of time. A good group will remove the bad posters anyway.
Up to 6 now and all of them have been normal content.
The beginning of the end of Facebook. Hate speech, animal abuse, and racism are tolerated, but boobs get you banned? Absolutely idiotic!
Yeh, my group is having the same issue. No nudes at all, not even close. Facebook just gets worse and worse. Don't get me wrong: my group is not for the easily offended; the point of my group is to be as free as can be, and Facebook is saying no to it.
facebook.com/groups/502330739953502
Be warned: if you can be offended, you will be, so do not bother looking unless you have a thick hide. If you appreciate freedom then join; do not blame me if you are devoured. The scum of this planet roam there. You will see no nudes, maybe bikini pics here or there, but that is it.
I’m the admin of several groups and whether it’s a link, photo or just text, everything I post is flagged. I can’t even approve my own posts! This almost feels like Facebook is targeting certain accounts and groups for removal.
The only thing I can do is share my page posts. That’s about the ONLY thing that gets through.
I don't get why Facebook even feels it necessary to implement such a thing. Most groups on Facebook are a private group of friends with like-minded posts and comments. I myself have been placed on a ban repeatedly. All Facebook is going to do with this program is make MANY people move to other social media sites. I do understand some posts go beyond the limits of good taste, but in private groups I find this program to be ridiculous. I'm sure Facebook will be seeing a large number of its followers finding another venue.
Your auto-reporter is totally inaccurate and you can't admit when you're just f'n wrong. I guess now my comment will be auto-reported too.
The good news is that if admins allow a certain number of what we deem innocent posts that have been auto-reported by Facebook, the auto-reporting by Facebook stops! We were plagued by posts being auto-reported that were within standards, and we let them all through to be posted and remain on the group. We no longer get any! Woo hoo! See: https://www.facebook.com/help/225716337798978
Autoreport is not helping at all in a group I admin. It is identifying a post which has absolutely no obscene or inappropriate content. Our group settings are already that admins must approve all posts so we do not need autoreport at all. And when I go on to respond to the autoreport by approving the post for a second time it immediately gets flagged and sent to admin again. This needs to be fixed.
I am the administrator of a Pagan humor group. I started it five years ago. I have one co-administrator and 1,500 members (roughly 50% of whom are active daily). My group is a closed group, which means only members can see what is posted. I have a very clear set of rules for the group. People are blocked and banned from my group at my discretion, depending on which rule they broke and whether they have been warned about it. It is also clearly stated in the "about group" area that if you are offended by a post you should report the post (there is a button for you to do so), and my co-administrator and I will message you to see why the post offends you. Based on your answer we will either allow or remove the post. My members are extremely intelligent and quite capable of deciding what offends them and what doesn't. At the moment, however, they are just as offended as I am that Facebook has decided they aren't able to make up their own minds about what is or is not offensive. Quit trolling my group and stepping on our 1st amendment rights. If you don't walk every individual's personal path and share their beliefs and values, how can you judge what should and should not be reported?
Leave CLOSED FB groups alone!
I have been a victim of the new bot system. It singled out a picture of a blonde woman, fully clothed in a white dress, looking over her shoulder (which had been reported and found OK 3 years ago) and found it obscene, after which I had to affirm that no other pictures were improper in an entire album of close to 1,500 pictures that I have amassed over the 6 years I have had this account. I too admin at least 4 groups with a total of over 39,000 people between them. I do see the benefit of helping group admins. But singling out individuals for no reason at all and placing them on a 3-day or longer ban from using their accounts is in fact violating Facebook's own terms of service, i.e. bullying or harassing; by definition, the behavior hurts, humiliates, or harms another person physically or emotionally.
Those targeted by the behavior have difficulty stopping the action directed at them, and struggle to defend themselves.
There is also a real or perceived "imbalance of power."
I feel that, at the very least, I and others improperly singled out for no reason whatsoever should be offered a public apology from Facebook. I also think a page should be set up for those affected so that a written record can be maintained, so this "robot system" can be deactivated until such time as it functions fairly and accurately, with a link or email made available to locked accounts to report incorrect actions so that a correction may be made in a much more timely manner.
This should be an option and not automatic. Please make it optional.
We have had several auto-reported posts. None of them were offensive, and one was actually a government site with information on Bernie Sanders and his time as a House member and a Senator, his voting record, etc. Since we are a Democratic support group, this is not only appropriate, it is essential information for the members who have not bothered to check that out. Meanwhile the porn still arrives unflagged and is deposited in your group as comments on political activity. So this is not exactly working as FB planned, and it is a total pain in the behind.
I am also the owner of 10 groups, and I also say auto-reporting is not a good option. When I see an auto-reported post in my group and I think it's OK, not any kind of violation, I approve it; but when I refresh, the post remains in the reporting section and is not cleared from there.
It seems the end game is to ensure Facebook isn't held liable for content someone might see. Simply have users and page administrators check a box saying they take responsibility, so Facebook won't have to. The market has a wonderful way of correcting bad algorithms.
Just call a spade a spade: you're only trying to stop any postings that include sexual content. I've flagged disturbing content, murders, and different acts of violence, and was told it was OK because it was "freedom of speech." Why not just let admins monitor their own groups and work on other things, like allowing people to know who reported things in private groups? And while I'm at it, stop suggesting people I should be friends with!
Waste. Of. My. Time. So far two photos have been auto-reported in my closed extremely tight-knit group of all women. Both featured sexy guys in states of undress, but no genitals or bare ass parts showing. Has Facebook turned into a priggish maiden aunt that is now even shocked by man-chest? Seriously? In a private, closed group?
So, yeah, waste of time. I’m very careful with who joins my group, so we’ve never had problems with porn.
Can I join?
I don't think it is needed at all. Facebook doesn't even respond half the time to reported content, and now some robot code is going to determine what can or cannot be posted? Facebook should abandon this project for features that are actually needed, rather than some robot police that, like real cops, can't do their job.
I'm admin in a number of groups, and in only one was anything 'auto-reported'. None of the actual pornography was flagged, only perfectly innocuous member posts to sell books. I still have to go through and boot out the pornography in the usual 'pending' section. The 'auto-reported' but innocent things need to be personally posted by me, by hand. It's making work and not doing as intended.
Why should anyone, much less a robot, have the right to say what offends me? Can your robot tell the difference between someone being old enough and whether it's real, staged, or automated? How do you know your robot will detect child porn? Where did you get your child porn to test it? Why should you pick out pictures in closed or secret groups when no child porn has ever been reported?
I admin a moms group on facebook.
In the past week since the “auto report” feature reached our group it has reported a picture of a pregnant belly (with the woman wearing pants and a shirt but her belly sticking out) which I can sort of understand why the robot may think was ‘nudity’ (because there was skin) but then today it reported an ultrasound picture. There are definitely some bugs that need to be worked out.
In short, it sucks. Look at all the negative comments here: 10 against your 1.
And no, the people reporting a post should be disclosed to the admins of the page. They want to accuse the page of wrongdoing, but we don't get to face our accusers. Please. I thought you were supposed to be American.
If someone in a private/secret group is reporting things in the group, the admin should have a right to know who is reporting their page, so they can either ask them to come to them with their problems, ask them to leave the group because the humor doesn't fit their taste, or block them from the group. Why is your company punishing the majority of people in these groups?
As far as auto-reporting goes, are you trying to figure out how to pick people for concentration camps or what? lol… It is the dumbest thing I've seen a company do. You are literally pushing people away from your business with this BS.
So basically you are telling us it’s here whether we like it or not and whether it works or not?
It's just getting shoved onto us, so there seems to be no point in leaving comments or feedback.
As a note, I just got banned from a cat group for posting a picture of my cats! Not good guys!
I believe that auto-report is the worst invention added to Facebook EVER. It has ruined almost every page or group I know, including mine. It's a machine; it won't understand whether someone is targeting my business or it's a real report of something that needs to be blocked! And now my page has been blocked for a month and the reason is unknown!
What is Facebook doing? We do not like this auto-reporting robot. Please remove it.
Agreed!
Mastectomy photos with reconstruction are being reported in many of the groups set up for women who have had cancer or are at high risk for cancer (ie, genetic mutations). How does one know if it is auto reported or reported by a rogue member of the group who is either jealous, bitter, or a troll?
How do you run an adult group with this BS? I co-moderate a group for trans guys all over 40. We’re now getting ridiculous “auto reports” (underwear, top surgery, etc). We can moderate our own group, don’t need nanny FB “helping” us.
I agree totally, Jay. I am in a closed/secret group for adults only. Our Admins monitor posts to make sure our group’s standards are met. But now we’ve had members blocked (some multiple times, and for up to a month) for posts being reported to FB, apparently automatically, with our Admins given no option to approve or deny and no option for appeal. Mind you, none of the posts were offensive to our group. We don’t need extra censorship of things only our members can see in the first place.
Even if you wanted to contact Facebook, it’s a losing game. I have been banned over and over again because of the pictures I post. They are not pornographic in any way, but I’m immediately sent a message that I’ve been banned. I have written back to FB over and over again to no avail. There’s absolutely no customer service, PERIOD. My friend Michael Stokes is constantly getting banned from his Page for posting his beautiful pictures of men. FB needs to come up with a new way of policing what’s on here, because I’ve seen posts of people being beheaded that are left up, and that’s okay according to their policies.
This auto feature is not accurate, and combined with Facebook’s refusal to set up a dispute-resolution function, it is extremely aggravating. I have resorted to filing a complaint with the Better Business Bureau, since I am unable to communicate with Facebook directly. I urge all individuals experiencing similar issues to do the same. Maybe then they will fix this, or better yet, get rid of it and employ real people who have the ability to reason. I hope to send this message far and wide to all the individuals being blocked for no reason, with no right to dispute it and have the block removed.
I run a plant page, and today we have had 3 of these so-called Facebook reports. They are photographs of PLANTS!!! How can this possibly be porn or any other damaging material? This is making things more time-consuming and needs to be removed. Absolutely awful feature!
I am the owner of 2 groups and am facing a problem with your new “Auto-reported by Facebook” feature. Even after approving a post, we can’t see it; it says it is deleted or could not be loaded.
How about they introduce a new feature called “talk to a human” in support, rather than click-this-and-click-that that goes absolutely nowhere? Your Help and Support section is useless unless you want to waste hours searching through endless crap that doesn’t apply to the problem!
WHY is Facebook supporting genocide??? Serious question, because I’m an admin for one of the West Papua groups concerned about the systemic killing and looting of the Melanesian population by the foreign Javanese/Malay people whom the US and UN decided in 1962 to appoint as administrators of the colony, while the US mines the gold and other wealth. There are around 250 million Javanese people who have been taught to treat Papuan people like unwanted animals, and there are multiple Indonesian “cyber hacker” Facebook groups who think it is their patriotic duty to silence groups opposed to the suffering and denial of rights.
Content has been disappearing without explanation. I suspect some Indonesian users, by reporting posts they do not like for political reasons, are causing this system to first flag legit posts; then, when I confirm it’s a legit human-rights post, the content (photo/link/etc.) disappears within a few hours anyway, because those users are reporting the post to Facebook instead of to the group’s admin…
So Facebook has created a system that removes control from the admins and gives political groups control over what content can or cannot be posted in groups!
This new feature is awful. 1) Standards or not, many people have or join closed/secret groups specifically to post stuff that is not safe for work, because IT STAYS IN THE GROUP AND EVERYONE IS OKAY WITH RACY CONTENT.
2) I’ve had more safe posts auto-banned than, God forbid, a woman’s tit!
3) Why does FB hate tits anyway?
This feature is upsetting more and more people.
I guess you should give groups an option so that posts which only members can see are exempt from auto-reports or reports, because people join groups to see that stuff of their own accord.
I’d rather you just kept your f*cking nose out of my business. If I have a problem, I’ll tell you. Otherwise, f*ck off. Is that direct enough?
Or, if we have a problem, WE could contact YOU, instead of having you peek into the clubhouse like Daddy…or a pervert.
https://ageofshitlords.com/how-to-find-out-who-reported-you-on-facebook/
Ok, I am in a group that keeps having reporting issues as well. The members are getting reported, but our Admins are not being alerted to the post first; it goes directly to FB. We’ve had people blocked multiple times, for up to a month. My question related to this is: we have a closed/secret group, and the members are following the protocol for what is allowed to be posted in our group. Why does FB feel the need to censor posts that only our members can see and that our group approves of? FB should only step in if our Admins have an issue and go to them. Otherwise, butt the hell out.
This new feature doesn’t bother me at all, but I would like more options when reporting a post to FB. They need to add one for someone violating the Community Standards, like posting animals for sale! There is too much of it out there, and most other admins still allow it and don’t care. Better yet, give us a comment box to type in if what we need to report isn’t already listed!
Why does FB hate women? I can show a male nipple, but a female nipple is dirty???
Does anyone know if this “auto report” thing on Facebook works in secret groups? Shouldn’t secret groups be, ya know, secret, even from Facebook bots? What if there is a secret group where people post porn pics and the secret group is specifically for that; does Facebook monitor what is going on in that group? Is there no such thing as privacy at all in Facebook groups?
If Zuckercuck doesn’t like the content, he can go fap to something else.
What’s “horrific” about a fake Photoshopped angel?
Go find some testicles.
Another ridiculous bot that flags everything for no apparent reason. I’ve had GIFs that are available in Facebook, not sourced from anywhere but Facebook, flagged for inappropriate content (sex, or whatever the reason was), and the GIF had absolutely NO sexual content whatsoever.