We live with GenAI, and the human-to-bot ratio is now leaning in a different direction. The old norms are dead, because the old structures that held them up are gone.
The idea on this thread that "more hoops means losing participation" keeps assuming the community is unaffected by the macro trends. It weirdly posits that HN posts and users are somehow immune to them.
No, this goes beyond that. A well-written article or book doesn't need to be padded with junk to cater to bad readers, or to preempt trolls, because trolls can't scrawl all over it in a way that disrupts other readers' experience. You have to go to, e.g., the Amazon reviews to find people complaining that an author didn't address something they very definitely did, or claimed something they certainly did not; that stuff doesn't show up in footnotes on the page or turn into flame wars where everyone sees them.
Emotional support is one of the most common use cases for generative AI tools in the UK, and the percentage of people with mental health issues in first-world countries is an order of magnitude higher than 0.1%.
Behavioral addictions are even more commonplace.
These numbers grow worse as you move toward the global majority, which has even fewer doctors, let alone mental health professionals.
0.1% is a feel-good figure that minimizes cognitive dissonance when we don't want to harm others but also don't want to curtail our own benefits.
The question I'd ask is: what threshold percentage of the human population would you consider too much?
That would be true if this were crisis intervention, though. The arguments I'm currently reading here posit that this was simply role play.
Automated crisis response is challenging because it's a perfect storm of high variance, unpredictable behavior, high stakes, responsibility, and liability.
It's perhaps unintentional, but your framing makes this seem like baseless whimsy.
At this point, it appears we will be talking to bots more than to humans. It's a brave new world, and those who don't adapt will see the humans leave.