Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to object to how we do things here, your post will be removed.
==
A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.
I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.
I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.
Removed by mod
There is a stark contrast between fetishizing body parts and fetishizing underage people. Calling it “shitty” is an understatement by several orders of magnitude. If you are attracted to someone because they look underage, I’d have to question your moral judgement when it comes to the real thing. People aren’t known for making wise choices when under the influence of hormones, and I think that the venn diagram of pseudo-pedos and real ones may overlap a lot more than you seem to want to accept.
I am all for blocking this instance and defederating it. If someone wants to ride that line, they know that address, but now they have to type it into their browser instead of hoping those images show up in their feed.
They didn’t defederate the instance for the content alone, but specifically for being on board with intentionally making content seem like CSAM. That’s a long, long step beyond the subject of an image just looking less than 18.
Do we actually know what the community was about? I haven’t seen anything remotely as described on lemmynsfw. I have no clue as to how the admins here ran into such a thing.
We also don’t know what the admins talked about between themselves. I’m not buying it. Seems more likely that the admins here personally didn’t like whatever that was, and added the CSAM label to stop any discussion (cuz if you raise any question you must be a pedo right?). It’s their instance they can do as they please.
Another more feasible thing is that they just don’t want to deal with NSFW content. Just like Reddit & imgur did under the guise of ‘protecting users’.
Both of those seem way more likely and with precedent than admins of a server willingly fostering CSAM-like stuff
Their sidebar states
That’s a real argument, don’t get me wrong, but what Ada was arguing was that they were allowing images that, although they had subjects of legal age, put the subjects in situations where one could see them as underage.
Being real? I have pretty much all of the porn C/s blocked now, so I had to go look specifically for what the problem was.
Nobody with eyes and basic reading comprehension could mistake anything on that community for underage people. The instance has very clear rules about age verification, and the posts all followed them.
Seeing them as underage would take either a horrible ability to discern age, or being so wound up that you weren’t willing to consider otherwise.
Which, again, this is their instance, they can make the decision for no reason at all, and I’m okay with that. It’s just that the stated reason is, bluntly, malarkey.