The remaining issue is that even an AI-generated UI needs considerable UX input in order to work well, especially when you have to fit it around domain specific knowledge, use-cases, and prior art. Is it for power users or not? All that.
At risk of shifting the goalposts on what I originally said, unique here isn't meant to mean quirky or weird but, simply, something that hasn't been done before, or hasn't been done as effectively.
This is the challenge for B2B startups switching to LLM-based development while trying to offer more than cloud compute resold at a markup with specialised functionality bolted on - because AI turns SaaS into a sexy version of MS Access.
> when AI eliminates the need for human creativity
We haven't needed the overwhelming majority of human creativity. We still paint and play guitar even though they have no economic value. I think we'll continue to do these things regardless of AI.
You really need to look again. If you're still manually writing code you have your head in the sand.
AI can produce better code than most devs produce. This is true for easy stuff like crud apps and even more true for harder problems that require knowledge of external domains.
I'm not sure about other devs, or even their number, but AI can most definitely NOT produce better code than I can.
I use it after I have done the hard architectural work: defining complex types and interfaces, figuring out code organization, solving thorny issues. Once these are done, it's time to hand over to the agent to apply stuff everywhere following my patterns. And even there, SOTA models like Opus make silly mistakes; you need to watch them carefully. Sometimes they lose track of the big picture.
I also use them to check my code and to write bash scripts. They are useful for all these.
What you're describing is using it to do something you can already do at an expert level, where you already know exactly what you want the result to look like and won't accept anything that deviates from what's already in your head. So, like code autocomplete. You don't really want the "intelligence" part, you want a mule.
That's fine, and useful, but you're really putting a ceiling on its potential. Try using it for something that you aren't already an expert in. That's where most devs live.
Even expert coder antirez says "writing the code yourself is no longer sensible".
AFAIU antirez is mostly writing in C, a verbose language where "create a hashtable of x->y" turns into a wall of boilerplate. In high-level languages the length difference between a precise specification and the actual code is much smaller.
He also mentions using it for Python which is minimal boilerplate.
And he didn't limit his take to just C code. He said: "state of the art LLMs are able to complete large subtasks or medium size projects alone, almost unassisted, given a good set of hints about what the end result should be."
But if using them as mules still produces silly mistakes, how will I have the confidence to defer to their intelligence for much more complex stuff?
These things bullshit their way through all the time. I've lost track of how many times they seem to produce something great, only for me, upon deeper inspection, to see what a subtle mess they have made. And when the work is a bit complex, I cannot verify on sight; I'd have to take time to do it.
Also, they absolutely cannot produce some levels of code at all. Do you think I can just give them a prompt to produce a Haskell-like language, let them crank for a few hours, and have a ready-made language?
Want an example? Here is something Sonnet gave me just today:
I get this as the type of xx: Promise<Result<Pick<Cabinet, "name">[]>>
Which is obviously wrong: I should be getting the full type, i.e., all columns picked. The problem is that the Column generic parameter is not being properly inferred, probably because of the sorting by name: the sort column is constrained to be one of the query's fields, so when fields is not provided, TypeScript infers the fields as just the sort column name.
Neither ChatGPT nor Claude Opus have been able to solve this after one hour, suggesting all kinds of things that don't work. But I have solved it myself, with:
export type QueryArgs<
  Rec extends StdRecord = StdRecord,
  Fld extends StrKeyOf<Rec> = StrKeyOf<Rec>,
  FltrOp extends FilterOpsAll = FilterOpsAll,
  Srt extends Fld = Fld
> = {
  /** Fields to include in results (defaults to all) */
  fields?: Fld[],
  /** Filters to apply */
  filter?: RecordFilter<Rec, FltrOp>,
  /** Sorting to apply */
  sort?: {
    field: Srt  // StrKeyOf<Rec>
    order: SortOrder
  },
  /** Pagination to apply */
  page?: {
    maxCount?: number | undefined
    startFrom?: { sortFieldKey: any, idKey: ID } | undefined
  }
}
And:
queryX: <Ent extends EntityNamePlural, Col extends StrKeyOf<Dto<Ent>>, Srt extends Col = Col>(
  args: {
    entity: Ent,
    query: QueryArgs<Dto<Ent>, Col, fOperators, Srt>,
    auditInfo?: AuditSpec
  }
) => Promise<Result<Pick<Dto<Ent>, Col | Srt>[]>>
You’re equating two things that aren’t the same. I’m not still manually writing code, but it’s not at all because Claude can produce better code than me. It’s worse at CRUD apps and a lot worse at domain specific bits. But it’s more parallelizable, so if I drive it well I can focus my skill on the small subset of problems that actually require it and achieve increased throughput.
I partially agree. I can see the before-and-after difference in colleagues' code. It's night and day.
They're doing things now that they either flat-out could not do before, or that would have been a giant mess if they did (I realize they still can't really do it now; AI is doing it for them).
And Java typically does produce both (see the Exception "cause" field). So when an exception stack trace is printed, it's actually a list of stack traces, one for each cause. You can skip the stack traces and just concatenate the causes' messages (like people often do in Go).
So the full message would be like "Cannot add item X to cart Y: Error connecting to warehouse Z: Error establishing TLS connection to example.com 127.0.1.1: PKIX failed".
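That concatenation is a few lines of Java: walk Throwable.getCause() and join the messages Go-style. A sketch (the class and the shortMessage helper are made-up names, not a JDK API):

```java
public class CauseChain {
    /** Flatten an exception's cause chain into one Go-style "a: b: c" message. */
    public static String shortMessage(Throwable t) {
        StringBuilder sb = new StringBuilder();
        for (Throwable cur = t; cur != null; cur = cur.getCause()) {
            if (sb.length() > 0) sb.append(": ");
            sb.append(cur.getMessage());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Exception e = new RuntimeException("Cannot add item X to cart Y",
                new IllegalStateException("Error connecting to warehouse Z",
                        new javax.net.ssl.SSLException("PKIX failed")));
        System.out.println(shortMessage(e));
        // prints: Cannot add item X to cart Y: Error connecting to warehouse Z: PKIX failed
    }
}
```

The full stack traces are still on the Throwable if you want them; this just chooses the terse rendering.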
Exceptions with stack traces are so much more work for the reader. The effort of distilling what's going on is pushed to me at "runtime". Whereas in Go, this effort happens at "compile time": the programmer curates the relevant context.
And come on, skipping 5 lines and only reading the two relevant entries is not "much work". It's a feature that even when developers eventually get lazy, you can still find the error; meanwhile, in Go you are at the mercy of the dev (and due to the repetitive, noisy error handling, many issues will fail to be handled properly - automatic bubbling up is the correct default, not swallowing).
The Go errors that I encounter in quality codebases tend to be very well decorated and contain the info I need. Much better than the wall of text I get from a stack trace 24 levels deep.
Quality Java codebases also have proper error messages. The difference is that a) you get additional info on how you got to a given point, which is an obviously huge win, and b) even if it's not a quality codebase - which, let's be honest, is the majority - you still have a good deal of information that may be enough to reconstruct the erroneous code path. Unlike "error", or even worse, a swallowed error case.
This is only useful to the developers who should be fixing the bug. Us sysadmins need to know the immediate issue to remediate while the client is breathing down our neck. Collect all the stack traces, heap dumps, whatever you want for later review. Just please stop writing them to the main log where we are just trying to identify the immediate issue and have no idea what all the packages those paths point to do. It just creates more text for us to sift through.
ExceptionName: Dev-given message
    at Class(line number)
    at Class(line number)
Caused by: AnotherCauseException: Dev-given message
    at Class(line number)
It's only the dev-given message that may or may not be of good quality, exactly as it is in Go. It's a plus that you can't accidentally ignore error cases, and even if a dev was lazy, you still have a pretty good record of where a given error case could originate from.
Again, I am a sysadmin, not a developer. Telling me line numbers in files written in a language I don't understand is not helpful. I don't care where the error occurred in the code. I care what the error was so I can hopefully fix it, assuming it's external and not a bug in the code.
Especially when they forget to properly handle an error case among the litany of if err line noise, and you get erroneous code execution with no record of it!
This is why stack traces exist. But I agree Java seems to not really have a culture of “make the error message helpful”, but instead preferring “make the error message minimal and factual”.
For what it’s worth, the rise of helpful error messages seems to be a relatively new phenomenon the last few years.
And that's why you should have multiple appenders. So in code you write "log.error("...", exception)" once, but logging writes it in parallel to:
1. STDOUT for quick and easy look, short format.
2. File as JSON for node-local collectors.
3. Proper logging storage like VictoriaLogs/Traces for distributed logging.
Each appender has its own format, some print only short messages, others full stacktraces for all causes (and with extra context information like trace id, customer id etc). I really think STDOUT-only logging is trying to squeeze different purposes into one unformatted stream. (And Go writing everything to STDERR was a really strange choice).
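A minimal sketch of that fan-out with java.util.logging, which calls appenders "handlers" (logback or log4j2 would be the richer real-world choice; the logger name, file pattern, and MultiAppender class here are made up):

```java
import java.util.logging.ConsoleHandler;
import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class MultiAppender {
    // One logger, several handlers: each log call fans out to all of them.
    public static Logger build() throws Exception {
        Logger log = Logger.getLogger("app");
        log.setUseParentHandlers(false);               // avoid double-logging via the root logger

        ConsoleHandler console = new ConsoleHandler(); // 1. quick and easy look, short format
        console.setFormatter(new SimpleFormatter());
        log.addHandler(console);

        FileHandler file = new FileHandler("%t/app.log"); // 2. node-local file ("%t" = temp dir);
        file.setFormatter(new SimpleFormatter());         //    a JSON Formatter would go here
        log.addHandler(file);

        // 3. shipping to VictoriaLogs etc. would be a third, network-backed Handler
        return log;
    }

    public static void main(String[] args) throws Exception {
        build().info("written once, fanned out to every handler");
    }
}
```

The point being: the code logs once, and per-destination format (short message vs. full stack traces plus context) lives entirely in handler configuration.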
> The relentless repetition flawlessly locks in their foundational muscle memory and basic swing mechanics
If only this were true there wouldn't be an army of duffers who after a lifetime of "training" still dig a trench in front of the ball every time they play.
These might be theoretical issues that people without experience worry about, but let me share what I've witnessed in practice working almost a decade with Clojure at Nu.
We mostly hired people with no previous Clojure experience. The majority of hires could pick it up and get productive quickly. People fresh out of college picked it up faster. I even had a case of an employee transitioning careers to software engineering, with no previous programming experience, and the language was a non-issue.
I can't remember an instance where the language was a barrier to shipping something. Due to the reduced syntax surface and lack of exotic features, the very large codebase followed the same basic idioms. It was often easy to dive into any part of the codebase and contribute. Due to the focus on data structures and the REPL, understanding the codebase was simply a process of running parts of the program, inspecting its state, making a change, and repeating. Following this process naturally led to having a good test suite, and we would rely on that.
Running on the JVM is the opposite of a problem. Being able to leverage the extensive JVM ecosystem is an enormous advantage for any real business, and the runtime performance itself is top tier and always improving.
The only hurdle I could say I observed in practice was not having a lot of compile time guarantees, but since it was a large codebase anyway, static guarantees would only matter in a local context, and we had our own solution to check types against service boundaries, so in the end it would've been a small gain regardless.
Clojure is easily the most boring, stable language ecosystem I’ve used. The core team is obsessed with the stability of the language, often to the detriment of other language values.
This attitude also exists among library authors to a significant degree. There is a lot of old Clojure code out there that just runs, with no tweaks needed regardless of language version.
Also, you have access to tons of battle tested Java libraries, and the JVM itself is super stable now.
I won’t comment on or argue with your other points, but Clojure has been stable and boring for more than a decade now, in my experience.
What I meant by that is the metaprogramming capabilities that often get cited for allowing devs to create their own domain specific "mini languages". To me that's a "creative" way to write code because the end result could be wildly different depending on who's doing the writing. And creativity invites over-engineering, over-abstraction, and hidden costs. That's what I meant by the "opposite of boring".
You linked me to this comment from another one and I have to agree with this sentiment.
Creating these mini DSLs is something that requires a lot of thought and good design. There is a danger here as you pointed out sharply.
But I have some caveats and counter examples:
I would say the danger is greater when using macros and far less when using data DSLs. The Clojure community has been moving towards the latter for a while now.
There are some _very good_ examples of (data) DSLs provided by libraries: hiccup (and derived libraries), reitit, malli, honeysql, core.match, spec, and the Datalog flavor of Clojure come to mind immediately (there are more that I forget).
In many cases they can even improve performance, because they can optimize what you put into them behind the scenes.
In practice, though, most developers don’t do that.
There’s a rule of thumb: write a macro as a last resort.
It’s not hard to stick to it. In general, you can go a long, long way with HOFs, transducers, and standard macros before a hand-rolled macro would serve you better.
> syntax is hard to read unless you spend a lot time getting used to it
That’s pretty much exactly the opposite of how I always felt. Perhaps because I’m not a programmer by education, I always struggle to remember the syntax of programming languages, unless I’m working in them all the time. After I return to a language after working in other languages for a while, I always have difficulties remembering the syntax, and I spend some time feeling very frustrated.
Clojure and Lisps more generally are the exception. There is very little syntax, and therefore nothing to remember. I can pick it up and feel at home immediately, no matter how long I’ve been away from the language.
I don't think the syntax is hard to read in any kind of objective sense, it's just different than most mainstream languages. Greek would be hard for me to read too, but that's not because it's necessarily harder to read than English, just that I don't really know Greek.
I agree with the short variable name convention, that's annoying and I wish people would stop that.
Everyone complains about a lack of type safety, but honestly I really just don't find that that is as much of an issue as people say it is. I dunno, I guess I feel like for the things I write in Clojure, type issues manifest pretty early and don't really affect production systems.
The clearest use-case I have for Clojure is how much easier it is to get correct concurrent software while still being able to use your Java libraries. The data structures being persistent gives you a lot of thread safety for free, but core.async can be a really nice way to wrangle together tasks, atoms are great for simple shared memory, and for complicated shared memory you have Haskell-style STM available. I don't remember the last time I had to reach for a raw mutex in Clojure.
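For what it's worth, the atom mentioned above maps almost directly onto the JVM's own primitives: it's a compare-and-swap retry loop over a value that is never mutated in place. A rough Java sketch of the idea (AtomSketch and swap are made-up names; the HashMap copy-on-write stands in for Clojure's persistent maps):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.UnaryOperator;

public class AtomSketch {
    // Clojure's swap! retries a pure function under compare-and-swap; on the
    // JVM that is what AtomicReference.updateAndGet does. This is safe only
    // because the stored value is never mutated in place.
    public static <T> T swap(AtomicReference<T> atom, UnaryOperator<T> f) {
        return atom.updateAndGet(f);
    }

    public static void main(String[] args) {
        AtomicReference<Map<String, Integer>> counts =
                new AtomicReference<>(Map.<String, Integer>of());
        swap(counts, m -> {
            // copy-on-write stands in for Clojure's persistent maps
            Map<String, Integer> copy = new HashMap<>(m);
            copy.merge("hits", 1, Integer::sum);
            return Map.copyOf(copy);
        });
        System.out.println(counts.get().get("hits"));  // prints 1
    }
}
```

Clojure makes this the default shape of shared state rather than something you have to assemble by hand, which is most of why the "thread safety for free" claim holds up.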
Good concurrency constructs are actually how I found Clojure; I was looking for a competent port of Go-style concurrency on the JVM and I saw people raving about core.async, in addition to the lovely persistent maps, and immediately fell in love with the language.
Also, I really don't think the JVM is a downside; everyone hates on Java but the fact that you can still import any Java library means you're never blocked on language support. Additionally, if you're willing to use GraalVM, you can get native AOT executables that launch quickly (though you admittedly might need to do a bit of forward-declaration of reflection to get it working).
> - syntax is hard to read unless you spend a lot time getting used to it
This is only true if you assume C-like syntax is the "default."
But regardless of that, I'd argue that there's much less syntax to learn in LISPy languages. The core of it is really just one single syntactic concept.
To be pedantic, this isn't quite correct. Syntax isn't countable like that. What S-expressions are light on is production rules. At their most basic they have IIRC 7 production rules but there are absolutely 0 languages based on s-expressions which are that simple, since it doesn't give you anything like quasiquotes, vectors, Lisp 2 function resolution, etc. Reader macros make matters much worse.
What we can say is that they are constructively simple, but not particularly unique in that. Once you get into real sexpr languages they aren't simpler than horn clauses, and are constructively more complex than languages like Brainfuck and Forth.
It's repeated a lot because it's true. The collective developer world has decided that LISP syntax is not the preference. Good if you prefer it, but you're in the overwhelming minority.
The key issue is that Lisp's minimal uniform syntax has less variation to help with visual pattern matching, which we humans are good at (compared to richer syntax).
The meta-programming power of Lisp may be largely due to being homoiconic, although Dylan/Julia etc achieve similar without it. However Lisp's minimal syntax is not a prerequisite for homoiconicity: S-Plus/R has a more conventional syntax while retaining "code is a list" representation.
minimal and simple is not the same thing as easy to use and natural/obvious.
what looks easier to read:
(if (< a b)
  (let [x (long-function-name a b)]
    (another-long-function (+ x c)))
  (+ a b))
or
if a < b {
    let x = long_function_name(a, b);
    another_long_function(x + c)
} else {
    a + b
}
to me the first one is way more noisy and confusing. and you really need a text editor with auto-close and rainbow brackets to be productive, of course that's a non-issue today with vscode and zed/neovim/helix but still something to think about. now rust might not be the best example for "easy to read syntax" but there's also python, lua, kotlin, even js if you ignore strict/loose equals and weird casts. all of them use more procedural/c-like syntax because it's the natural way humans think about running an algorithm. there's a reason why pseudocode in every textbook looks like that.
> minimal and simple is not the same thing as easy to use and natural/obvious
You are right about that.
I value "minimal and simple" more than "familiar" because it makes my growth trajectory less arbitrary and more about intrinsic properties of the code. I don't care about learning how to do things the same way as everyone else nearly as much as I care about learning how code can be improved generally.
I know that code is written for humans, and that you can't remove the human from the equation. But I'm more interested in the future of code than in present-day code culture.
I think by the same reasoning the Qwerty keyboard is better than Dvorak. People are just used to Qwerty, although they would type faster on Dvorak - or, in the case of s-expressions, with structural editing.
Honestly it's so exhausting. Every time Clojure gets mentioned on a broader forum, there's always some ridiculous claim that the Lispy syntax is "un-natural". Other Lisp dialects mostly pass unnoticed, but Clojure being more popular always causes some ruckus and I never get it - do people think that Clojurists stumble on it and be like: "holy mother of Alan Turing, this is so much more 'natural' to me than everything else..." Both choices are in the same sense "natural" as skiing and sledding. None of it is "natural" - reading prose in English, Thai or Katakana - all that is "unnatural". Nobody stumbles on the language and immediately thinks the syntax is just better - the majority of Clojurists come to it after years, often decades of using other PLs and they have to struggle at first.
Comparison with sledding is apt here, because both methods let you achieve the same goal - going down to the base of the mountain. Of course, skiing is more difficult to start with, it's more expensive, it requires deliberate effort and dedication. But the practical, learned experience far exceeds initial expectations. Do you realize how ridiculous it looks when inexperienced people try to convince them it's not worth it? Well, you may say, "the point is not to convince 'them', but to show the wider public..." And that's even more imbecilic. Imagine trying to point at people zigzagging 70 miles down from the peak, having enormous fun and telling the observers not to even try that? I'd dare anyone to argue with an experienced skier that sledding is more fun.
> theres a reason why pseudocode in every textbook looks like that.
Like I said, most - the absolute majority of Clojure programmers come to it after many years of programming in other languages (see the surveys). They are using it as a real instrument to achieve real goals, to solve practical problems. It's not an educational tool, not a "hello world app" incubator, not a "good for the resume" kind of a thing for them. If you (not you personally, but some proverbial programmer) are arguing just for the sake of it, well, with all respect, then "fuck you" (for wasting people's time). If you're sincerely trying to make a choice - nobody can "make a skier" out of you - that is something you must do on your own. No theory, no books, no videos can ever convey to you the enormous joy you may get out of it later - there's too much nontransferable tacit knowledge there. Just keep in mind, people in this community didn't make the choice because "their brains are wired differently" or something, not because "they are oblivious", no. Unlike you - they have seen, walked and lived through both of these worlds. Most of them have to switch between them, sometimes multiple times a day. And yes, the wider majority can often be wrong. In fact, history shows us that it makes wrong choices all the time. Lispers don't care about popular choices - they prioritize pragmatism above all.
I think this is kind of misleading. Yes, s-expressions have very simple syntax in and of themselves. But s-expressions are not all that's required to get all the control structures in Clojure. You need to memorize all the special forms and all the standard macros that people use on a day-to-day basis. And they're just as hard (actually, IME, harder) to memorize as any other syntax: let, cond, defrecord, if, condp, if-let, fn, def, defn, loop, recur, if-some, when-let, for, ->, ->>, as->, cond-> ...
To this day I have to look up whenever I get back into clojure what the "syntax" is of ns, require, import, etc.
i can link you similarly undecipherable walls of text in rust and zig and c
but i bet if you sat down a junior developer not yet entrenched in any style yet, they'd be able to grok lisp code MUCH faster than the intricacies of syntax of the other alternatives ¯\_(ツ)_/¯
My opinion about this, is that it appears to depend on how a particular person's brain is wired, as to which language(s) they will understand faster and like. There can be a "winner" in terms of which language more people gravitate towards, but then that is very relative to many factors, including corporate influence.
People also put themselves into bubbles. Once in, they can filter out other languages (with all kinds of excuses), and be overly focused on certain families or only specific languages.
But that's exactly the root of the complaint. Because there's (for the sake of argument) only one syntactic concept, there's no bandwidth for structural concepts to be visible in the syntax. If you're used to a wide variety of symbols carrying the structural meaning (and we're humans, we can cope with that) then `)))))))` has such low information density as to be a problematic road bump. It's not that the syntax is hard to learn, it's that everything else you need to build a program gets flattened and harder to understand as a result.
Even among Lisps this has been problematic; you can look at Common Lisp's LOOP macro as an attempt to squeeze more structural meaning into a non-S-expression format.
To me and many other people the syntax looks like Lego, but that's a taste thing and arguing about taste isn't very productive. What is more objective is that there are fewer syntactic patterns to care about to cover at least 90% of pretty complex systems, including concurrency. The rest can usually be limited to a few namespaces that are rarely touched later because they just work. Compare that with Python...
If you want to calcify something and add robustness, use clojure.spec or Malli. Clojure encourages writing testable code and also in general, there is less code to test. Smaller problem, easier to tackle well.
The JVM is a beast for serious things because of its performance and tooling. If you need something small/ with a quick start, you can use GraalVM or some of the dialects like ClojureScript or Babashka to do what needs to be done. There is ongoing work on ClojureCLR, Jank, Janet, Basilisp, Hy and other dialects or inspired languages. Usually, these are pretty close to Clojure or try to follow the behavior of Clojure so that stuff written using Clojure.core just works the same. Clojure is turning out to be the actual lingua franca.
For me, programming in Clojure is the nearest thing to fun that I've ever had programming. There seems to be less ceremony about things, especially on bigger projects. For the little things, Babashka tends to be even more straightforward.
And yes, there are things about Clojure that can make life harder. Usually it has to do with laziness, e.g. when you just try to get a data structure written to a file. Or when you want restartable, stateful components such as database connections, web servers, etc. and need to start them in a certain order. There are some functions that are unexpectedly slow, and stuff like this could be somewhat more predictable. All this would be more approachable if there were real documentation for beginners, with a little more explanation than the terse descriptions that senior developers with 20+ years of experience find sufficient.
You also need to learn a new tool to write lisp, like paredit.
While it's amazing once you've learned it, and you're slurp/barfing while making huge structural edits to your code, it's a tall order.
I used Clojure for a long time, but I can't go back to dynamic typing. I cringe at the amount of time I spent walking through code with paper and pencil to track things like what are the exact keyvals in the maps that can reach this function that are solved with, say, `User = Guest | LoggedIn` + `LogIn(Guest, Password) -> LoggedIn | LogInError`.
Though I'm glad it exists for the people who prefer it.
i'm surprised anybody coming from clojure would say this.
you absolutely do NOT need to learn paredit to write lisp, any modern vim/emacs/vscode plugin will just handle parentheses for you automatically.
that said, if you do learn paredit style workflow - nobody in any language in any ide will come even close to how quickly you can manipulate the codebase.
I realize how lame it is to relitigate Clojure's downsides every time it comes up. I fell into the same trap that annoys me about Elm threads: people who haven't used it in a decade chiming in to remind everyone they didn't like some aspect of it. Wow, such contribution.
It's like seeing that a movie is playing at the theater so you show up only to sit down next to people to explain your qualms with it, lolz. Sometimes you need to let others enjoy the show.
The OP of this thread even said all that needed to be said "The learning curve is steep but very much worth it" yet we're trapped in this cycle because someone had to embellish it with a listicle.
I didn't realize it was a no-no to share opinions here.
Plus to be fair we're having this discussion in the context of an article from 2021 that just rose to front page of HN, only to repeat the same set of pros we've been hearing about Clojure for ages (code as data, repl, etc).
The JVM is one of the major selling points of Clojure. You can "write once, run anywhere" and benefit from Java's massive ecosystem, all without having to use a Blub language. Modern JVM implementations are also incredibly fast, often comparable in performance to C++ and Go.
It's almost as if different tools exist for solving different problems. Clojure is "Lisp on the JVM". That's the core premise behind the language. Rust is a "systems programming language with a focus on type and memory safety". This is an apples-to-oranges comparison. They offer different benefits while providing different drawbacks in return. Their ecosystems are likewise very different, in each case more closely tailored to their particular niche.
> i don't think you're wrong necessarily...but rust, golang, zig, mojo, etc are gaining popularity and imo they wouldn't be if they were JVM languages.
> understood, i'm just pointing out that people seem to prefer the apple over the orange.
This is kind of like saying that fewer people are drinking Coke every year and are choosing other beverages. It might be objectively true but it glosses over the fact that literally billions of people drink Coke daily and will continue to do so for decades to come.
The JVM is the same. Some people and organizations might be using zig or mojo (and I have absolutely nothing against zig or mojo, to be clear, I hope they succeed) but many multiple orders of magnitude more individuals and organizations run JVM stuff in a given year and will continue doing so.
At this point, the JVM is a civilizational technology. If it went away tomorrow, multiple banks would fail, entire countries would no longer be able to administer social services, millions of people would die. The JVM is in everything.
Developers on HN using zig, mojo, etc. aren't really a representative sample.
> Developers on HN using zig, mojo, etc. aren't really a representative sample.
Agree. A lot of manipulation and astroturfing goes on, from many different groups, that affects what appears to be popular on a particular site. The bubbles formed at a site can become self-reinforcing.
The more time one spends at a particular site, the more likely one is to fall for the illusion, or come to believe that what is being presented is representative of everyone and everywhere else. Kind of like a person who only watches Fox News or PBS News.
That's fair if you're looking at it from a performance perspective.
Not entirely fair if you look at it from a perspective of wanting fast feedback loops and correctness. In Clojure you get the former via the REPL workflow and the latter through various other means that in many cases go beyond what a typical type system provides.
> the opposite of boring
It's perhaps one of the most "boring in a good way" languages I ever used.
"The TIOBE index measures how many Internet pages exist for a particular programming language."
For some reason I doubt this is in any way representative of the real world. Scratch, which is a teaching language for children, bigger than PHP? Which is smaller than Rust? Yeah, these are results you get when you look at the Internet, alright.
Sure that index isn't great (I think it's basically a regurgitation of Google Trends), but I don't think you're suggesting Clojure is actually a popular language are you? Which is the only point I'm trying to make (that it isn't popular).
Clojure is reasonably popular as far as programming languages go. It's not difficult to get a job as a Clojure developer, particularly in certain sectors (fintech and healthcare are the heaviest Clojure users). Of course C++, Java, C# and PHP dwarf both Clojure and Rust by several orders of magnitude.
> It's not difficult to get a job as a Clojure developer
Let's be honest and avoid painting a misleading picture. Getting a job as a software developer of any kind is genuinely difficult right now. Finding a position on a Clojure team has always been relatively harder for various reasons - and not simply because of its [un]popularity.
Clojure tends to attract older, more experienced developers. If you want a full-time Clojure role but have no prior experience with it, you'll often need to accept a junior-level salary - something many seasoned developers can't afford or simply won't do.
Junior developers have it even harder. Recruiting pipelines don't really distinguish between experience levels - everyone goes through roughly the same process, and juniors are expected to keep up with veterans, with almost no room for error.
Senior, battle-tested Clojure devs face a different kind of pressure. Interviews are frequently grueling, mentally exhausting sessions comparable to architect-level evaluations in other places. And because Clojure enables small, skilled teams to accomplish a lot, companies rarely need to hire in bulk - so competition for each opening is fierce.
This creates a frustrating situation for everyone, companies included. They want top-tier talent but offer junior salaries, while simultaneously rejecting juniors and anyone without direct Clojure experience. Supply and demand are badly out of balance.
That breeds resentment - "why bother learning it if I'll never get hired?" Honestly, there's no clean answer, and Rust seems to be in a similar spot right now. Even so, the language is worth learning. It has real practical value, even when you're not using it on a team. The future-proof choice, I believe, is to learn both Rust and Clojure. Having explored both languages, I honestly think things will change in their favor. Unless you want to stay sad at near-burnout levels for the next decade or more with TS/Python/Java/etc.
If we measured the value of jobs by their popularity, everyone would want to be a retail salesperson or a cashier - according to US Bureau of Labor Statistics data, those are the most common occupations in the States. People still writing Clojure professionally after 15 years (of other languages) are disproportionately serious engineers. The language self-selects. A small community means concentrated competence, not weakness.
The network effect assumption is wrong here - a programming language isn't a social network. A better hammer isn't worse because fewer people own one. Most job listings reflect what organizations already know how to hire for, not what produces the most value.
If anything, I think that makes Clojure better. Almost no one in the community is doing stuff to serve the "lowest common denominator", compared to how most JS/TS development is done, which is a breath of fresh air for more senior programmers.
Besides, the community and ecosystem are large enough that there are multiple online spaces for you to get help. Personally, I've been a "professional" (employed + freelancing) Clojure/Script developer for close to 7 years now, and I've never had any issues finding new gigs or positions, nor any issues hiring for Clojure projects.
Sometimes "big enough" is just that, big enough :)
Spinning dwindling adoption as a good thing because it "unburdens the community from serving lowest common denominator use-cases" is exactly the kind of downplaying/deflection of every issue that I'm talking about, which constantly happens in the Clojure community. It's such an unhealthy attitude for a community to have, and it holds it back from clearly seeing what the issues are and coming up with solutions to them.
Every problem people face is "not a problem" or "actually a good thing" or, maybe if all else fails we can make users feel bad about themselves. Clojure is intended for "well experienced, very smart developers". Don't you know, our community skews towards very senior developers! So if you don't like something, maybe the problem is just that you're not well experienced enough? Or, maybe what you work on is just too low-brow for our very smart community!
> It's such an unhealthy attitude to have as a community
How about just "different"? Turtle wants to teach everyone to program, and that's fine - just another way of building and maintaining a language. Clojure is clearly not trying to cater to the "beginner programmer" crowd, and while you might see that as an "unhealthy attitude", I'd personally argue that having many different languages for different people is far better than every language trying to do the same thing for the same people. Diversity in languages is a benefit in my eyes, rather than a bad thing.
I'm glad it works for you and many others and gives you a good living. Nothing wrong with that. I wasn't trying to attack it or anyone that uses it, just stating why I never warmed up to it and projecting why I think it hasn't become popular.
* Extreme thread-safety, better than all JVM languages and on par with the best.
* Macros - in other languages you have to reach for a separate language to program in other paradigms, for example rules engines, logic programming, or a custom DAG. You'll probably say you don't need other paradigms, when the truth is you just avoid them in other languages because it's too much of a hassle.
I am a Clojure fan and would love to use it. But you are right, we live in a real world where money talks and most organizations want to see developers as cheap, replaceable commodities.
Not to mention that in a post-AI world, the cost of code generation is cheap, so orgs need even fewer devs. Combine all this with commonly used languages and frameworks, and you need not worry about anyone being "too valuable to replace or fire".
Having said that - there may be a (very) small percentage of orgs which care about people, code crafting and quality and may look at Clojure as a good option.
Ah, here we go again. Every single time Clojure gets mentioned on HN, some clueless egghead comes listing various "issues" without considering holistic, overall experience of using the language for real. Because they effing never did. Sure, it's so easy to "hypothesize" about deficiency of any given PL:
- Python: slow; GIL; dynamic; package management is shit; fractured ecosystem for a decade due to version split.
- Rust: borrow checker learning curve; compile times; half-baked async; too many string types; unreadable macros; constantly changing.
- Go: no generics for a decade, now bolted on awkwardly; noisy error handling; no sum types; no enums; hard to get right concurrency.
I can keep yapping about every single programming language like that. You can construct a scary-sounding wall of bullet points for literally anything, without ever capturing the cohesive experience of actually building something in the language. For all these reasons, programming in general could sound like a hard sell.
Stop treating Clojure like a "hypothetical" option. It doesn't need your approval to be successful - it already is. It's not going away whether you like it or not - despite your humble or otherwise IMOs and uneducated opinions. It's endorsed by the largest digital bank in the world, it scales to serious, regulated, high-stakes production systems. Not theoretically, not conceptually, not presumably - it has proven its worth and value over and over, in a diverse set of domains, in all sorts of situations, on different teams, dissimilar platforms. There are emerging use-cases for which there's simply no better alternative. While you've been debating whether to try it or not, people have been building tons of interesting and valuable things in it. Clojure is in no rush to be "sold" to you or anyone else. It's already selling like ice cream in July (on selected markets) and you just don't know it.
> Ah, here we go again. Every single time Clojure gets mentioned on HN, some clueless egghead comes listing various "issues" without considering holistic, overall experience of using the language for real. Because they effing never did
Stopped reading here because of your hostility so I'll just say: yes I tried to use it "for real" but I didn't like it.
There's a difference between "attempted" and "tried". And from the points you have nitpicked I can confidently say: no, you have not really tried using it in a real, production setting. It is just that obvious. No experienced Clojurista would ever flatly list some reasons without specific context. Every single point you're trying to make has a caveat; every single one of them is disputable. Your statements are not factually false, but that doesn't mean they carry any meaningful, practical insight into the functional relation between the parts that make up the overall experience, and why they make it an excellent choice for many problem spaces.
> because of your hostility
Clojure, just like pretty much every language, tool, technique or paradigm, does have its pros and cons, there's no denying that, but you can't just blindly come and shit all over someone's backyard expecting people to happily explain to you how inaccurate a path your thinking took there. And it's not just a reaction to the post about Clojure - I'd defend any other tool the same way if someone did what you have.
> Stopped reading
If you don't have the mental capacity to visually scan through four paragraphs of a response to your own remarks, that's pretty indicative. I guess you're not here to learn something new, but rather to assert your own perceived rectitude. Well, your perception is misguided. I suggest you correct it by learning more about the topic you're so confidently trying to argue about, or respectfully and humbly STFU. If you think you know better than Goetz, Odersky, Kay, Steele, Felleisen, Friedman - that perhaps is not a good reflection. Just something to think about.
Typically you're either deploying via a container, in which case there's no more overhead than any other container deployment, or you're deploying directly to some Linux machine, in which case all you need is a JVM - hardly an arcane ritual.
People can invest in markets without a 401k, with more options (plans commonly have only a handful of funds available) and lower fees (both admin fees and inflated fund expense ratios). And you may pay more taxes with a 401k than otherwise, depending on your future tax rate (which is unknowable).
The only pure advantage is employer matching if you have it and stay employed long enough for it to vest.
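The fee point compounds more than people expect. A rough sketch, using entirely hypothetical numbers (a $10,000 lump sum, 7% gross annual return, a 0.04% expense-ratio index fund versus a 0.75% plan fund), of the drag over 30 years:

```python
# Hypothetical fee-drag illustration: same gross return, different
# expense ratios, compounded over a 30-year horizon.
def final_value(principal, gross_return, expense_ratio, years):
    # Net return each year is the gross return minus the fund's expense ratio.
    return principal * (1 + gross_return - expense_ratio) ** years

cheap_fund = final_value(10_000, 0.07, 0.0004, 30)  # low-cost index fund
plan_fund = final_value(10_000, 0.07, 0.0075, 30)   # pricier 401k plan fund

print(f"low-fee:  ${cheap_fund:,.0f}")
print(f"high-fee: ${plan_fund:,.0f}")
```

Whether the tax treatment offsets that drag depends on the unknowable future tax rate mentioned above.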
> It's infuriating. Nearly all of the agentic coding best practices are things that we should have just been doing all along
There's a good reason why we didn't though: because we didn't see any obvious value in it. So it felt like a waste of time. Now it feels like time well spent.
It's the user's fault. They vote for this crap with their attention. Junk sites like this shouldn't exist, but they do and aren't going anywhere until people stop using them.
Some users might enable these kind of features with their attention, but I don't think users actually want these features and any kind of "voting" is likely unintentional. It's manipulation. The fault lies mainly with the company and their carefully planned dark patterns. Ideally, users should punish them by e.g. leaving the platform but there's friction that may be a bigger problem than the dark patterns (depending on user). And I don't think there are any platforms that always guarantee good user experience now and in the future.
Not sure if users even realize what the dark patterns are and do. Users aren't all-knowing, with endless time, carefully balancing their attention to try to provide markets with the optimal signal to wisely guide the misbehaving actors.
Is it really the users' fault when the apps are literally designed by neuroscientists who explicitly engineer them to be addictive, all of it funded by monopolist companies whose leadership tends to have antidemocratic views about humanity?
Maybe we should finally regulate these addict boxes as the dangerous substances they are.
Users are not perfect agents. How can you expect the average non-technical person to figure out what is happening? For most people, if they don't visually see something happening on the screen, it doesn't exist. They simply have no frame of reference to figure out that LinkedIn is hijacking their scroll speed.
This is exactly what I want in a UI.