Hacker News | new | past | comments | ask | show | jobs | submit | jnpnj's comments

Sorry if this sounds absurd, but with diffusion language models, which generate text non-linearly (from the little I understand, they relate terms without a simple linear order), I wonder if new syntactic ideas will come up.

A blend of NeXTSTEP and classic Mac OS somehow.

I thought of BeOS for some reason.

I was asking on Mastodon whether people have tried leveraging very concise, high-level languages like Haskell or Prolog with 2025 LLMs. I'm really, really curious.

The problem there might be limited training data?

Jane Street had a cool video about how you can address the lack of training data for a programming language using LLM patching. The video is called "Arjun Guha: How Language Models Model Programming Languages & How Programmers Model Language Models".

The big takeaway is that you can "patch" LLMs and steer them toward correct answers in less-trained programming languages, allowing for superior performance. Might work here. Not a clue how to implement it, but stuff like llm-to-doc and the like makes me hopeful.
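As a toy illustration only (not the actual method from the talk, and with made-up vectors rather than a real model), "steering" in this sense usually means adding a scaled direction to a hidden activation at inference time:

```python
# Toy sketch of activation steering: nudge a hidden-state vector toward
# a "target language" direction before decoding. Purely illustrative;
# the vectors below are invented, not taken from any real model.

def steer(hidden, direction, alpha=1.0):
    """Add a scaled steering direction to a hidden-state vector."""
    return [h + alpha * d for h, d in zip(hidden, direction)]

hidden = [0.2, -0.5, 0.1]       # pretend residual-stream activation
lang_dir = [0.0, 1.0, -0.3]     # pretend "this niche language" direction
patched = [round(x, 2) for x in steer(hidden, lang_dir, alpha=0.5)]
print(patched)  # [0.2, 0.0, -0.05]
```

In a real setup the direction would be extracted from model activations (e.g. by contrasting prompts in the well-trained vs. under-trained language), and `alpha` tuned so the steering helps without degrading fluency.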


So you're saying we should be vibe coding more open source stuff in languages for discerning programmers ;)

We need a new pair of words to distinguish these two mindsets. Digging deep, finding abstractions, solutions that say more with less: that's one kind of fun. Other people want to see the magic happen with a few keystrokes, it seems; they call it fun, I call it death.

> they call it fun, I call it death.

Are you just sitting there as if dead when using AI? I find AI work exciting, always something new to discover.


It's been a few months since Gemini 3 and Opus 4.5 were released, and I still regularly have feelings of dread because I'm deprived of something (which I assume is the thrill and pride of being able to explore solution spaces in non-stupid ways to find plausible answers on my own).

Maybe it's the usual corporate webdev job that is too focused on mainstream code, and where AI is used to sell more, not to find new ideas that could be exciting.


I mean I guess it really depends on what you're interested in.

There are plenty of projects I have wanted to do that I don't because the "activation energy" is too high, and if I can get a machine to basically get past the boring crap then I can focus on the parts of the project that I think are fun.


Lovely. I wonder how many people have done similar things in their own Django instances, because the lack of an embedded monitor is often a source of friction.

I think any large enough Django project has toyed around with extending the admin in some way. Hopefully this project can help establish a standard to make this sort of thing easier.

Crazy to think that Fortnite might unleash a new population of people who toyed with functional-logic as their first paradigm.


This is the first thing that I used LLMs on. Not code generation, but parser and tooling to gain understanding. Also saves resources in the long run.


One of my favorite uses for Claude Code is to point it at a section of seriously badly written code with indecipherable symbol names, over-the-top cyclomatic complexity, etc., and just ask it to make the code readable.


There's a similar story by kragen on .. hmm, HN or maybe SO, where he describes bootstrapping from a tiny hand-crafted asm monitor up to Forth and then to more and higher-level languages. A "stuck-in-a-basement" kind of challenge.


Schemers used a good old `compose` instead of a dedicated syntax.


And besides multiple args, there are the usual threading macros:

    (-> [1 2 3] f g)
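For comparison, here's a rough Python sketch of the two styles the comments above contrast: right-to-left `compose` (the Scheme tradition) versus left-to-right threading (the Clojure `->` macro). The `f` and `g` below are hypothetical stand-ins, not anything from the thread:

```python
import functools

def compose(*fns):
    """Right-to-left composition, like the classic Scheme `compose`."""
    def composed(x):
        return functools.reduce(lambda acc, fn: fn(acc), reversed(fns), x)
    return composed

def thread(x, *fns):
    """Left-to-right pipelining, in the spirit of Clojure's (-> x f g)."""
    for fn in fns:
        x = fn(x)
    return x

f = lambda xs: [n * 2 for n in xs]   # hypothetical stand-ins for f and g
g = sum

print(compose(g, f)([1, 2, 3]))  # 12
print(thread([1, 2, 3], f, g))   # 12
```

Same result either way; threading just reads in application order, which is the whole appeal of the macro.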

