Zero fucks

How much companies care about your feelings when it comes to AI.

I’m amazed (but also not really) at how little respect companies have for users and customers when rolling out their AI plans.

There were, of course, the numerous examples of AI makers scraping even websites that had explicitly opted out of being turned into training food for the machines. If it wasn’t clear by now: this data-hungry pack of wolves, led by The Lord and Savior Sam Altman, will stop at nothing, all in the name of Progress.

I’m not that surprised because this culture of [moving fast and breaking things / asking for forgiveness] has been inherent to the tech industry. And I have to admit: a younger version of me felt little pity for Old Media Companies complaining about The Google. I’m more realistic now: even if you’re not innovating yourself, that shouldn’t be an excuse for innovators to blatantly steal your stuff.

But I digress. What I am somewhat more amazed by is how platforms implementing AI keep plunging themselves into the same PR hot water. How? By blatantly disregarding sensitivities over user-generated content and AI training data.

There were last year's Reddit blackouts. Now we’re seeing outrage among LinkedIn users over having to opt out of becoming AI training fodder. (A true existential threat, by the way: we fed AI all the books in the world, and it still creates slop. Imagine a model trained only on slop.)

Perhaps even more egregious examples are Adobe and Udemy. Paying Adobe customers were understandably angry to see their designs being used to design a design automation tool. E-learning platform Udemy went even further: it didn’t rip off its customers, but the actual teachers providing the courses. (Training material training material.) They were given a three-week period to opt out ... only to discover the period had already ended. If biting the hand that feeds you is a bad idea, try biting the hands that feed your entire customer base.

And of course there’s Meta. In longstanding chain-letter tradition, people are posting declarations to the company, refusing to become training data for Meta’s AI. Sorry, folks (and that includes several A-list celebrities): that’s not how Terms of Service work.

Now that Zuckerberg is buff and necklaced, he has become so confident that he’s actually saying the quiet part out loud:

‘I think individual creators or publishers tend to overestimate the value of their specific content in the grand scheme of this.’

In a way, it’s another great example of Zuckerbot not reading the humans in the room. If you make your money on Meta’s apps by creating the very content that Meta, in turn, makes a lot of its money off:

  • You’re understandably nervous about AI-generated content that could end up looking a lot like yours.
  • Meta would do well to keep those creators close.

Or would they? Isn’t Zuckerberg just being brutally honest? His counterparts at Udemy or LinkedIn were probably thinking the same thing: where are you going to go? And saying: if you don’t want to peddle your [Reels/courses/motivationals] in our market square, just go someplace else.

That’s not really an option, is it? Sure, there are places to go. But you can’t leave. (Case in point: Reddit’s management held fast, and the platform is thriving.)

So what can we do? Apart from venting our frustrations into the void?

Here’s me being very naive: maybe we will learn from this?

Maybe in the future, people will be more mindful of their rights when choosing the next platform to post or promote their wares on. Maybe they will even start caring about interoperability and the freedom to truly move elsewhere.

But that’s another story. A more positive one.

