LLMs write prose that's too robotic
LLM output is too dependent on prompts to be interesting
LLMs take too much RAM to run effectively
LLMs take too much electricity to run locally
LLMs work locally but are a bit too slow for my taste
LLMs output mostly correct code but it isn't applicable to my codebase
LLMs make tool calls to pull in additional context
LLM-generated code works for most developers but not my codebase <---- you are currently here
The rest is indeed no longer true.