
It can be. It can also not be. A friend of mine had a PITA boss. Thanks to ChatGPT he salvaged his relationship with him even though he hated working with him.

He went on to something else but his stress levels went way down.

All this is to say: I agree with you if the human connection is in good faith. If it isn’t then LLMs are helpful sometimes.

It sounds like that relationship was not meant to be salvaged in the first place. ChatGPT perhaps prolonged your friend's suffering: he ended up moving on anyway, just after an unnecessary delay.

My knee-jerk reaction is that outsourcing thinking and writing to an LLM is a defeat of massive proportions, a loss of authenticity in an increasingly less authentic world.

On the other hand, before LLMs came along, didn't we ask a friend or colleague for their opinion on an email we were about to write to our boss about an important professional or personal matter? I have been asked several times to give advice on the content and tone of emails or messages that some of my friends were about to send. On some occasions, I have written emails on their behalf.

Is it really any different to ask an LLM instead of me? Do I have a better understanding of the situation, the tone, the words, or the content to use?


I think there are a couple of differences here:

Firstly, when you ask a friend or colleague you're asking a favour that you know will take them some time and effort. So you save it for the important stuff, and the rest of the time you keep putting in the effort yourself. With an LLM it's much easier to lean on the assistance more frequently.

Secondly, I think when a friend is giving advice the responses are more likely to be advice, i.e. more often generalities like "you should emphasize this bit of your resume more strongly" or point fixes to grammar errors, partly because that's less effort and partly because "let me just rewrite this whole thing the way I would have written it" can come across as a bit rude if it wasn't explicitly asked for. Obviously you can prompt the LLM to only provide critique at that level, but it's also really easy to just let it do a lot more of the work.

But if you know you're prone to getting into conflicts in email, an LLM powered filter on outgoing email that flagged up "hey, you're probably going to regret sending that" mails before they went out the door seems like it might be a helpful tool.
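To make that concrete, here's a minimal sketch of such a filter. The keyword heuristic is just a stand-in for a real model call (in practice you'd send the draft to an LLM and ask it to judge the tone); all names are hypothetical:

```python
# Sketch of an outgoing-mail "you might regret this" filter.
# looks_regrettable() is a placeholder for an LLM judgment call;
# the keyword check only exists to keep this example self-contained.

HEAT_MARKERS = {"incompetent", "ridiculous", "never again", "fed up"}

def looks_regrettable(draft: str) -> bool:
    """Stand-in for asking an LLM whether a draft reads as hostile."""
    text = draft.lower()
    return any(marker in text for marker in HEAT_MARKERS)

def outgoing_filter(draft: str) -> str:
    """Return 'hold' to flag a draft for review, 'send' otherwise."""
    return "hold" if looks_regrettable(draft) else "send"
```

The interesting design question is the same either way: the filter only warns, it never rewrites, so the words that go out are still yours.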


"Firstly, when you ask a friend or colleague you're asking a favour that you know will take them some time and effort. So you save it for the important stuff, and the rest of the time you keep putting in the effort yourself. With an LLM it's much easier to lean on the assistance more frequently."

- I find this a point in favor of LLMs, not a flaw. It is a philosophical stance, one in which whatever does not require effort or time is intrinsically less valuable (see taking GLP-1 agonists vs. sucking it up to lose weight). Sure, it requires effort and dedication to clean your house, but given the means (money), wouldn't you prefer to have someone else clean your place?

"Secondly, I think when a friend is giving advice the responses are more likely to be advice"

- You can ask an LLM for advice instead of having it write directly, and without reflecting further on the writing the model provides. Here I find parallels with therapy, which, in its modern version, does not provide answers but questions, means of investigation, and tools to better deal with the problems of our lives.

But if you ask people who go to therapy, the vast majority of them would much prefer to receive direct guidance (“Do this/don't do that”).

In the cases in which I wrote a message or email on behalf of someone else, I was asked to do it: can you write it for me, please? I even had to write recommendation letters for myself--I was asked to do that by my PhD supervisor.


I wasn't arguing that getting LLMs to do this is necessarily bad -- I just think it really is different from having in the past been able to ask other humans for help, and so that past experience isn't a reliable guide to whether we might find we have problems with unexpected effects of this new technology.

If you are concerned about possible harms in "outsourcing thinking and writing" (whether to an LLM or another human) then I think that the frequency and completeness with which you do that outsourcing matters a lot.


It all depends on the use one makes of it.

It can become an indispensable asset over time, or a tool that can be used at certain times to solve, for example, mundane problems that we have always found annoying and that we can now outsource, or a coaching companion that can help us understand something we did not understand before. Since humans are naturally lazy, most will default to the first option.

It's a bit like the evolution of driving. Today, only a small percentage of people are able to describe how an internal combustion engine works (<1%?), something that was essential in the early decades after the invention of the car. But I don't think that those who don't understand how an engine works feel that their driving experience is limited in any way.

Certainly, thinking and reasoning are universal tools, and it could be that in the near future we will find ourselves dumber than we were before, unable to do things that were once natural and intuitive.

But LLMs are here to stay, they will improve over time, and it may well be that in a few decades, the human experience will undergo a downgrade (or an upgrade?) and consist mainly of watching short videos, eating foods that are engineered to stimulate our dopamine receptors, and living a predominantly hedonistic life, devoid of meaning and responsibility. Or perhaps I am describing the average human experience of today.


Not really, he was looking for other jobs. One can't just quit without a job lined up unless they have enough savings, which he didn't.


