Emyn Dean
@emyndean.bsky.social
46 followers 100 following 140 posts
I think about things. Interdisciplinary computer/social scientist, mostly data, anthropology, ethics, language. Intersex.
I'm slightly more concerned that she doesn't have her hands on the wheel
So, a reporter fails to do basic source checking, then blames the source? What is going on at the Times?
And, for those of us researching Facebook, we should now assume that there is a high frequency of artificially generated content in any Facebook dataset.

/end
Maybe Facebook has decided to do what most of the internet does already: offload the work of controlling misinformation onto volunteer community builders, and empower them to manage those spaces themselves.

That's okay, but if that's now Facebook's policy, they should say so.
I'm a professional in this domain, so it's not much more work for me to refuse a few more applications every week.

But what about communities which are more fragile or have less professional oversight, like groups for town residents? They will have to work it out, like this small city...
Is Facebook paranoid that without these accounts, numbers would show that their user base is declining, hurting their bottom line?

Maybe.

But surely it is a far greater reputational risk to have to admit to clients, "A lot of the people looking at your ads don't exist".
But for whatever reason, this ethos is now gone.

Impersonating accounts, even those of real people, is no longer considered dangerous.

A user creating hordes of bots for data farming or misinformation seems to be tolerated, even welcomed.
And that policy was part of Facebook's ethos for a very, very long time.

In the 2010 book "The Facebook Effect", Zuckerberg said this to author David Kirkpatrick:

“You have one identity... Having two identities for yourself is an example of a lack of integrity.”

Unambiguous, no?
You may or may not agree with Facebook's "real people only, one account per person" policy. Such policies aren't necessarily bad for the internet in general.

However, bots that impersonate real people, apparently for the sole purpose of data farming, seem to offer minimal value to online communities.
What happened between 2020 and 2022 to create such a dramatic shift in corporate policy? Has Facebook just given up fighting bots in the age of AI?

I'm not sure, but Facebook hasn't actioned a single report of a fake account *this year*.

I get the same reply every time, usually fairly promptly:
As recently as 2020, they were claiming they were cracking down *more* on fake accounts, such as in this interview with Tripepi Smith:
Now, in the past, when I reported these accounts to Facebook, they would usually remove them - not all the time, but most of the time. Facebook has a very strict no-bots, real-people-only policy.

Or, so they say.
At other times, it takes a little bit more effort, but it's still pretty obvious. For example:

- The accounts usually have only a few disconnected "friends"
- They often have AI-generated profile pics
- The account is often recently made
- There's rarely much information in their profile
Now, I'm a social media researcher, and I have done a lot of research into disinformation and fake accounts, so I know what to look for.

However, most of the time, it's not very subtle.

For example, on one joining form, they always answer a question asking for an email address with "ok".
Unsurprisingly, over the past couple of years, the number of fake accounts trying to join my groups has increased a lot - sometimes several a day, where it used to be only a few each month.

I do what I always do: I check the profile to ensure my judgement is correct, refuse the application, and report the accounts.
The largest has about 20k members and the smallest has about a dozen.

All of them, but particularly the larger ones, get a lot of spurious accounts trying to join. A lot of the activity on Facebook takes place in groups, so these are often data-miners or advertisers.
I help admin some groups on Facebook.

Private groups, for those who don't know, usually ask joining questions to screen new entries. Usually these questions are really simple, like "do you agree to the group rules", but sometimes they're more complicated.
So, it turns out Facebook isn't removing bots anymore (a thread)...
hard men make gentle hobbits

gentle hobbits make hard times

hard times make rings of power
cat Max Scherzer just out here blowing up bridges
Reposted by Emyn Dean
"Tzefardea Tzedek" (Frog of Righteousness)

As soon as the Portland Frog became a meme, Jewish social media blew up with jokes about the 2nd Passover plague - tzefarde'a (frogs) - and the Rabbinical commentary around it. I knew I had to make a calligraphy piece encompassing all of it. (more in alt.)
that would explain why they get more funding
I completely missed that media narrative

Although I do know several people who voted and took it quite seriously