Well he and his people are far too stupid and incompetent to have come close to succeeding. While it's not great that there was no punishment, we should at least be thankful that they act on emotion and can only loosely follow playbooks for corruption from the past rather than write new ones for modern times.
It gets lost in the distracting partisan bickering over Musk/etc, but Twitter has gotten hostile and crappy in many ways like this that have nothing to do with politics. Imagine how much more hostile this action would have seemed in 2010. But now, people put up with it.
As a 50 year old, I can recall a lengthy stretch of time in the US when lamenting the lack of a "white homeland" would not be considered "partisan", but extremely fringe speech that the mainstream would mostly shun.
Twitter is certainly terrible for those reasons as well. Terrible people are excusing apolitical enshittification because they're thankful the Overton window has been pushed down to where they live in the bottom of the barrel. You just can't say the latter part too loudly here because there's sufficient sympathy and affinity for it.
> You just can't say the latter part too loudly here because there's sufficient sympathy and affinity for it.
I think you're right, and I find this revolting. Tech always had its weirdos, but mostly they were idealistic, naive, or otherwise a bit quirky; they weren't into that kind of flat-out evil ideology. Or at least not openly, because there was a sense of shame around that kind of ideology.
Not really sure how much people really even put up with it. I just went to Bluesky once I got an invite, and I've generally noticed my cohort migrating there over time too. Sure, some content isn't there, but a smaller social network is better than beating your head against the wall.
Your comment doesn't hold up: the fact that "dead internet" was coined since then (along with the popularization of "slop" and "hallucination") shows there is a line and we have crossed it. Denial doesn't stand up to any scrutiny.
It's too bad we weren't more skeptical about the ways emerging technologies would eventually be used against us. Some warned about it but many (including me) ignored them. Perhaps we could be forgiven for that naivete, but there's no excuse to be ignorant of what's going on now.
The utility of those larger sites is coming to an end, but most people aren't discerning or ambitious enough to leave and seek out the smaller places you mentioned. Places like this will remain but will join Facebook, Reddit, and Twitter as shadows of their prior useful selves. The smaller, better sites won't have to worry about attracting the masses and therefore worsening, because the masses have finally settled.
I don't know how he could ban it. AI can't be used to reliably detect AI for the same reasons it's unreliable at other tasks. He didn't really have a choice but to sell the same grift everyone else is selling. He's stuck in a prisoner's dilemma that everyone is losing except a few people at the top.
That's surprising because runtime debugging depends on the state of the call stack, all the variables, etc. Syntax errors happen independent of any of that state.
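The distinction above can be sketched in a few lines of Python (a hypothetical illustration, not from any comment in the thread): a syntax error is detectable from the source text alone, before anything runs, while a runtime error on the very same line depends entirely on the values in play at that moment.

```python
# Syntax error: caught by parsing alone; no execution, no call stack,
# no variable state involved.
try:
    compile("def f(:\n    pass", "<example>", "exec")
except SyntaxError:
    print("syntax error found statically")

# Runtime error: this function parses fine; whether it raises depends
# entirely on the state (here, the value of b) when it is called.
def divide(a, b):
    return a / b

print(divide(6, 3))   # succeeds with this state
try:
    divide(6, 0)      # fails only because b == 0 at call time
except ZeroDivisionError:
    print("runtime error depends on state")
```

The same `a / b` expression is valid syntax in both calls; only the runtime state distinguishes success from failure, which is why debugging the second kind of error requires inspecting the live call stack and variables.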
Like the sibling comments, I see it the opposite way. Caring about your work in detail, anything the slightest bit bespoke, is becoming an antipattern. Employers want you to generate mediocre work because it's cheaper, and you only need to make sure it's not on fire. Mediocre peers are happy to go along with it as the short-term path of least effort.