Hacker News | patrickkidger's comments

I'm not sure if I'm about to be the old man yelling at clouds, but Anthropic seem to be 'AWS-ifying': an increasing suite of products which (at least to me) seem undifferentiated amongst themselves, and all drawn from the same roulette wheel of words.

We've got Claude Managed Agents, Claude Agent SDK, Claude API, Claude Code, Claude Platform, Claude Cowork, Claude Enterprise, and plain old 'Claude'. And honourable mention to Claude Haiku/Sonnet/Opus 4.{whatever} as yet another thing with the same prefix. I feel like it's about once a week I see a new announcement here on HN about some new agentic Claude whatever-it-is.

I have pretty much retreated in the face of this to 'just the API + `pi` + Claude Opus 4.{most recent minor release}', as a surface area I can understand.


Quick heads-up that these days I recommend https://github.com/patrick-kidger/jaxtyping over the older repository you've linked there.

I learnt a lot the first time around, so the newer one is much better :)


Ah, I would have never thought jaxtyping supports torch :)


I think so!

I've not tried a couple of things you mention (e.g. background images) but e.g. for dynamic placing there are libraries like https://typst.app/universe/package/meander

The core of Typst is a pretty featureful well-thought-out FP language. This makes expressing libraries for any piece of functionality very pleasant.


This is how static type checkers are told that an imported object is part of the public API for that file. (In addition to anything else present in that file.)

Cf. "the intention here is that only names imported using the form X as X will be exported" from PEP 484. [1]

I'm generally a fan of the style of putting all the implementation in private modules (whose names start with an underscore) and then using __init__.py files solely to declare the public API.

[1] https://peps.python.org/pep-0484/
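As a runtime sketch of the layout described above (my own minimal example, not any particular library's actual files): the implementation lives in an underscored module, and `__init__.py` only re-exports the public names. Note the `solve as solve` part matters to static checkers like pyright, not at runtime — the runtime effect shown here is just which names end up on the package.

```python
# Build a tiny throwaway package on disk, then import it, to show the
# "private modules + __init__.py re-exports" layout. All names here
# (mypkg, _core, solve, _helper) are hypothetical.
import os
import sys
import tempfile
import textwrap

pkg_root = tempfile.mkdtemp()
os.mkdir(os.path.join(pkg_root, "mypkg"))

# Implementation module: underscore-prefixed, i.e. an internal detail.
with open(os.path.join(pkg_root, "mypkg", "_core.py"), "w") as f:
    f.write(textwrap.dedent("""
        def solve(x):
            return x + 1

        def _helper(x):  # internal, deliberately not re-exported
            return x * 2
    """))

# __init__.py solely declares the public API. The `solve as solve` form is
# the PEP 484 convention telling static type checkers this is exported.
with open(os.path.join(pkg_root, "mypkg", "__init__.py"), "w") as f:
    f.write("from mypkg._core import solve as solve\n")

sys.path.insert(0, pkg_root)
import mypkg

print(mypkg.solve(1))              # the public API works
print(hasattr(mypkg, "_helper"))   # the helper never reached the package namespace
```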


That looks like it's only for stub files, not __init__.py.


It also applies to any .py file. (At least in practice with e.g. pyright)

That said, the documentation on this matter is close to nonexistent.


Oh neat! This is my library. Happy to answer any questions.

(Though it's really a pretty tiny library that just does what it says on the tin, not sure how many questions there can be. :D )


I have a question. Why do you prefix your package files with an underscore?

In fact, you write all of your python like you really have something to hide ;) Like `_Todo`.

Where did you get this pattern?

(I’m way more curious than accusatory. Are people embracing private modules these days as a convention, and I just missed it?)


I think _private has always been a convention in Python, though I'd say most Python is not so rigorous about it. I don't see why it couldn't be applied to modules.

I honestly love when I see a package do stuff like this: it's then very clear what is the public interface that I should consider usable (without sin), and what is supposed to be an internal detail.

Same with the modules: then it is very clear that the re-export of those names in __init__.py is where they're meant to be consumed, and the other modules are just for organizational purposes, not API purposes.

_Todo is then a private type.

Very clean.


I tend to do the same, some colleagues as well, so I guess this is some common pattern.

The way I see it there are two schools:

- The whitelist school: You write everything without _ prefix, then you whitelist what you want accessible with __all__.

- The explicit school: You forget about __all__ and just use _ for symbols, modules, etc.

I find the latter more readable and consistent (can be applied to attributes, free functions, modules...).
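The two schools above can be compared concretely: `from mod import *` honours `__all__` when present, and otherwise skips underscore-prefixed names, so both styles hide their internals from a star import. A self-contained sketch (hypothetical module sources written to a temp dir):

```python
# Write one module per "school", then check what a star import exposes.
import os
import sys
import tempfile
import textwrap

d = tempfile.mkdtemp()

# Whitelist school: everything is bare; __all__ picks the public names.
with open(os.path.join(d, "whitelist_mod.py"), "w") as f:
    f.write(textwrap.dedent("""
        __all__ = ["api"]
        def api(): return "public"
        def helper(): return "internal, but not underscored"
    """))

# Explicit school: no __all__; internals carry a leading underscore.
with open(os.path.join(d, "explicit_mod.py"), "w") as f:
    f.write(textwrap.dedent("""
        def api(): return "public"
        def _helper(): return "internal"
    """))

sys.path.insert(0, d)

# exec'd code is module-level, so `import *` is legal here.
ns1, ns2 = {}, {}
exec("from whitelist_mod import *", ns1)
exec("from explicit_mod import *", ns2)

print("api" in ns1, "helper" in ns1)    # __all__ hid the helper
print("api" in ns2, "_helper" in ns2)   # the underscore hid the helper
```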


Yup, you(/sibling comments) have it correct, it's to mark it as private.

Not sure where I got it from, it just seems clean. I don't think I see this super frequently in the ecosystem at large, although anything I've had a hand in will tend to use this style!


I just want to say this is brilliant. I've had my share of problems with asyncio and went back to using sync python and deque instead.
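For anyone curious what the sync-Python-plus-deque fallback looks like in practice, here's my own minimal sketch (not the article's code, and not claiming this is the commenter's exact setup): a plain FIFO work queue processed in a loop, with no event loop involved.

```python
from collections import deque

# Tasks are just zero-argument callables; hypothetical names throughout.
queue = deque()

def submit(task):
    queue.append(task)

def run_all():
    results = []
    while queue:
        task = queue.popleft()  # FIFO: the oldest submitted task runs first
        results.append(task())
    return results

submit(lambda: 1 + 1)
submit(lambda: "done")
out = run_all()
print(out)  # [2, 'done']
```

`deque.popleft()` is O(1), so this scales fine as a simple single-threaded scheduler; the trade-off versus asyncio is that nothing here can overlap I/O.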


Went scrolling looking for this! Most of the article is about problems solved in JAX.

Also worth noting the Array API standard exists now. This is generally also trying to straighten out the sharp edges.


Same here, beautiful solution


> What open source alternatives?

Helix:

https://github.com/helix-editor/helix/

Like vim, but with an LSP etc. out of the box. Things are already there, so the config files are minimal.


I have a US number and live in Switzerland. At least for me, I only receive SMS messages whenever I visit the US -- the rest of the time they're just dropped and I'll never see them.

(Doesn't really bother me, my friends and I all use WhatsApp/etc. anyway.)

n=1 though, maybe this is some quirk of my phone provider.


FWIW - I used to do research in this area - PINNs are a terribly overhyped idea.

See for example https://www.nature.com/articles/s42256-024-00897-5

Classical solvers are very very good at solving PDEs. In contrast, PINNs solve PDEs by... training a neural network. Not once, producing something that can be reused later — but every single time you solve a new PDE!

You can vary this idea to try to fix it, but it's still really hard to make it better than any classical method.

As such the main use case for PINNs -- they do have them! -- is to solve awkward stuff like high-dimensional PDEs or nonlocal operators. Here it's not that PINNs get any better, it's just that all the classical solvers fall off a cliff.
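To make "classical solvers are very good" concrete, here is a toy illustration, my own sketch rather than anything from the thread: an explicit finite-difference scheme for the 1D heat equation u_t = u_xx on [0, 1] with zero boundary conditions, in pure Python. No training loop, just a stable time-stepping rule.

```python
# Explicit finite differences for u_t = u_xx. Stability of this scheme
# requires r = dt/dx^2 <= 0.5; we pick r = 0.4.
n = 51                  # grid points on [0, 1]
dx = 1.0 / (n - 1)
dt = 0.4 * dx * dx      # so r = dt / dx^2 = 0.4
r = dt / (dx * dx)

# Initial condition: a unit spike in the middle of the domain.
u = [0.0] * n
u[n // 2] = 1.0

for _ in range(500):    # time steps
    new = u[:]          # boundary values stay at 0 (Dirichlet)
    for i in range(1, n - 1):
        new[i] = u[i] + r * (u[i - 1] - 2.0 * u[i] + u[i + 1])
    u = new

peak = max(u)
print(peak)
```

Because r <= 0.5, every update is a convex combination of neighbouring values, so the solution stays non-negative and the initial spike diffuses outward with its peak decaying — the qualitative behaviour one expects from the heat equation, obtained in a few dozen lines with no optimisation at all.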

---

Importantly -- none of the above applies to stuff like neural differential equations or neural closure models. These are genuinely really cool and have wide-ranging applications! The difference is that PINNs are numerical solvers, whilst NDEs/NCMs are techniques for modelling data.

/rant ;)


I concur. As a postdoc for many years adjacent to this work, I was similarly unimpressed.

The best part about PINNs is that since there are so many parameters to tune, you can get several papers out of the same problem. Then these researchers get more publications, hence better job prospects, and go on to promote PINNs even more. Eventually they’ll move on, but not before having sucked the air out of more promising research directions.

—a jaded academic


I believe a lot of this hype is purely attributable to Karniadakis and to how bad a lot of the methods in many areas of engineering are. The methods coming out of CRUNCH (PINNs chief among them) seem (if they are not actually) more intelligent in comparison, since engineers are happy to take a pure brute-force solution to inverse or model-selection problems as "innovative" haha.


The general rule of thumb to go by is that whatever Karniadakis proposes, doesn't actually work outside of his benchmarks. PINNs don't really work, and _his flavor_ of neural operators also don't really work.

PINNs have serious problems with the way the "PDE component" of the loss function needs to be posed, and outside of throwing tons of (often Chinese) PhD students and postdocs at it, they usually don't work for actual problems. This is mostly owed to the instabilities of higher-order automatic derivatives, at which point PINN people go through a cascade of alternative approaches to obtain these higher-order derivatives. But these are all just hacks.


I love Karniadakis's energy. I invited him to give a talk in my research centre, and his talk was fun and really targeted at physicists who understand numerical computing. He gave a good sell and was highly opinionated, which was super welcome. His main argument was that these are just other ways to arrive at optimisation, and that they worked very quickly with only a bit of data. I am sure he would correct me greatly at this point. I'm not an expert on this topic, but he knew the field very well and talked at length about the differences between one iterative method he developed and the method that Yao Lai at Stanford developed. I had her work on my mind because she talked at an AI conference I organised in Oslo. I liked that he seemed willing to disagree with people about his own opinions, because he simply believed he was correct.

Edit: this is the Yao Lai paper I'm talking about:

https://www.sciencedirect.com/science/article/pii/S002199912...


What do you do now?



Likewise, spending another comment just to agree. Both on the low profile and the low travel distance.

I've tried low-profile chocs and they still have too much travel! But I'm stuck with them as split keyboards are important for me just for the usual collection of wrist health reasons.

So I'm just waiting for Apple to make a split keyboard I guess :)


I have sincerely been considering a bandsaw and a soldering iron! To find out how hard it is to split a keyboard that’s already in one piece and have it remain working.


M1242 but it has too much travel by modern standards.

