Hacker News | new | past | comments | ask | show | jobs | submit | BoppreH's comments | login

> Without explaining that, the rest of this blog post is just rambling notes about developer ergonomics.

That's how I took it, and I enjoyed it thoroughly. If you're making a small app by yourself, sufficiently bad developer ergonomics can be the reason the app doesn't get made at all, or the frustration makes you regret making it. That's important to me.

> Maybe I'm just too young to have ever experienced the kind of stability expected here.

This could be it. I've been around many cycles of technology, and it always feels like a great waste when you have to abandon your tools and experience for something that's buggy and better in only a few small ways. I'm willing to tolerate a lot more bullshit for something that I know will be long-lived, like Qt or a static website, than for Microsoft's UI-framework-of-the-month.


Lots of good ideas here.

Flow-sensitive type inference with static type checks is, IMHO, a massively underrated niche. Doubly so for being in a compiled language. I find it crazy how Python managed to get so popular when even variable name typos are a runtime error, and how dreadful the performance is.
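A minimal illustration (my own snippet, not from the article): Python only resolves names when a line actually executes, so a misspelled variable becomes a NameError at runtime rather than an error at compile time.

```python
def greet(name):
    message = "Hello, " + name
    print(mesage)  # typo: 'mesage' instead of 'message'

# The file byte-compiles fine; the typo only surfaces
# when the function is actually called:
try:
    greet("world")
except NameError as err:
    print(err)  # name 'mesage' is not defined
```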

All the anonymous blocks lend themselves to a very clean and simple syntax. The rule that 'return' refers to the closest named function is a cute solution for a problem that I've been struggling with for a long time.

The stdlib has several gems:

- `compile_run_code`: "compiles and runs lobster source, sandboxed from the current program (in its own VM)."

- `parse_data`: "parses a string containing a data structure in lobster syntax (what you get if you convert an arbitrary data structure to a string) back into a data structure."

- All the graphics, sound, VR (!), and physics (!!) stuff.

- imgui and Steamworks.

I'll definitely be watching it, and most likely stealing an idea or two.


> I find it crazy how Python managed to get so popular when even variable name typos are a runtime error

Tangential point, but I think this might be one of the reasons Python did catch on. Compile-time checks are great for production systems, but a minor downside is that they introduce friction for learners. You can have huge errors in Python code, and it'll still happily try to run it, which is helpful if you're just starting out.


I don't think typing was the issue. At the time there didn't exist any typed languages that were as easy to use as Python or Ruby. (OK, not quite true, there was at least one: Pike, and LPC, which it is based on. Pike failed for other reasons.) Otherwise, if you wanted types your options were C, C++, and then Java, none of which were as easy and convenient to use as Python or Ruby. Java was in the middle, much easier than C/C++, and that did catch on.

What about when you have a long-running program? You can't brag about NumPy, Django, and the machine learning library ecosystem while also promoting "It's great for when you just want to get the first 100 lines out as soon as possible!"

I am guessing that Python, like Ruby, is dynamic enough that it's impossible to detect all typos with a trivial double-pass interpreter, but still.
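A tiny example of why a naive static pass can't catch every typo in Python: names can be created dynamically, so a name that looks undefined on paper may be perfectly valid at runtime (hypothetical snippet, mine):

```python
# A simple "double-pass" checker scanning this file would flag
# 'answer' as undefined, yet the program runs fine: the name is
# injected into the module namespace at runtime.
globals()["answer"] = 42
print(answer + 1)  # 43
```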

Wonder if there was ever a language that made the distinction between library code (meant to be used by others; mandates type checking [or other ways of ensuring API robustness]), and executables: go nuts, you're the leaf node on this compilation/evaluation graph; the only one you can hurt is you.
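As an aside, gradual typecheckers can approximate that library/executable split today. A sketch using mypy's per-module options (the module names `mylib` and `scripts` are hypothetical):

```ini
# mypy.ini: strict checking for the published library,
# anything goes for the leaf-node scripts.
[mypy]
ignore_missing_imports = True

[mypy-mylib.*]
disallow_untyped_defs = True
disallow_any_generics = True

[mypy-scripts.*]
ignore_errors = True
```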


Is it though?

As long as warnings are clear I’d rather find out early about mistakes.


People learn by example. They want to start with something concrete and specific and then move to the abstraction. There's nothing worse than a teacher who starts in the middle of the abstraction. Whereas if a teacher describes some good concrete examples the student will start to invent the abstraction themselves.

It looks like it.

Based on what I observe as an occasional tutor, compiler warnings & errors are scary for newcomers. Maybe it's because they share the trait that made math unpopular for most people: a cold, non-negotiable set of logical rules. As a result, some people treat warnings & errors as a "you just made a dumb mistake, you're stupid" sign rather than a helpful guide.

Weirdly enough, runtime errors don't seem to trigger the same response in newcomers.


Interesting angle: compiler errors bring back math teacher trauma. I noticed Rust tries to be a bit more helpful, explaining the error and even trying to suggest improvements. Perhaps "empathic errors" is the next milestone each language needs to incorporate.

I suddenly understand part of why experienced programmers seem to find Rust so much more difficult than those who are just beginning to learn. Years of C++ trauma taught them to ignore the content of the error messages. It doesn't matter how well they're written if the programmer refuses to read.

Interesting. I think over the long term many people come to realise it's better to know at compile time (mistyping something and ending up with a program that runs but is incorrect is worse than the program not running and just telling you your mistake). But perhaps for beginners it can be too intimidating having the compiler shout at you all the time!

Perhaps nicer messages explaining what to do to fix things would help?


That's surprising, because runtime debugging depends on the state of the call stack, all the variables, etc. Syntax errors happen independently of any of that state.

I think languages with strong support for IDE type hints as well as tooling that takes advantage of it are a fairly recent phenomenon, except for maybe Java and C# which I think are regarded by the wider hacker community as uncool.

C/C++ IDE support is famously horrible owing to macros/templates. I think the expectation that you could fire up VS Code and get reliable TypeScript type hints has been a thing only for a decade or so; for most of modern history, a lot of people had to make do without.


This is the “types make me slow” argument that everyone self-debunks after they program that way for a handful of years.

> that everyone self debunks

Speak for yourself.


Ask and you shall receive: https://stacktower.io/

Oh cool. That's a promising start.

I don't know if "The Nebraska Guy Ranking" this project uses is very useful, though. In particular, the "depth" criterion doesn't make much sense to me, since it assumes the more foundational a dependency is, the more robust it must be. This seems to run counter to the point of the original comic, where the "Nebraska Guy" piece was the fragile block holding up the entire tower.

This project also doesn't attempt to measure or visualize the complexity of a project. Theoretically a more complex project would require more support than a simple one, so I think that's an important metric to capture.


Bro, it asks for the ability for some random GitHub user to literally take over your private repositories.

You’re 100% right to call that out. The current GitHub OAuth scope is too broad.

I’m changing this ASAP to least-privilege and I’ll publish a clear explanation of scopes + data handling. In the meantime: please run the local/CLI path if you want zero-trust.


Damn dude. That’s awesome! I saw the permissions it wanted out of every org I’m a part of (including some big open source orgs) — I’d probably find myself booted out of those orgs if I accepted that. They def get a notification on every authentication like that and take potential impersonation seriously.

Yeah, if it weren't for that, I think this would blow up. Plus, even if you get past that, if you try a larger project, it times out after 1 minute and gives up. But it's a pretty awesome idea!

Hey! I built this. I know it's really scrappy; I just don't have enough time currently to do right by users. I'm on it though... stay tuned.

I would suggest adding the /r/ProgrammerHumor version too: https://www.reddit.com/r/ProgrammerHumor/comments/1p204nx/ac...

The AI crank always cracks me up.


AWS definitely lives above unpaid developers. In fact they should probably be the bird flying straight at the unpaid developers as they force yet another company to move to a closed license to survive.

You don't think AWS is internally built on massive amounts of open source?

That's what it would mean to place them above unpaid developers in the illustration, yes.

The shark biting the cable is what gets me


Can someone help me understand the single brick at the very bottom under Linux? What is it representing?

The undersea cables actually connecting the entire internet. Sometimes sharks just take a bite of them; they're reasonably well protected, but it's enough damage to cause outages and disruptions.

It's the single pin under everything because there are a limited number of those cables, especially in some regions, so a single shark can take out the entire internet for some countries.

http://www.mirceakademy.com/uploads/MSA2024-6-6.pdf


I feel like having them as a single brick is a bit hyperbolic, since undersea cables are pretty redundant in most of the world. Get rid of one and traffic just routes around it. Ships have been routinely destroying cables in the Gulf of Finland and the Baltic Sea in the past couple of years without causing significant disruptions.

"most of the world" is doing a seriously large amount of heavy lifting in this sentence.

There are many regions that are served by a single line, more than you think.

Even "well connected" places have fewer cables than you expect, and the frustrating thing is that you don't know that you can route around an issue until you try.

BGP is really resilient, which is great, but if your path is not clear then you'll only realise it when the failover doesn't happen; until then, you'll think there's a redundant path.


Only mildly. There aren't huge amounts of dark capacity just sitting around waiting to take over, so if a major fiber connection goes down the remainder will get congested with the extra traffic. It won't cascade like a power outage, but the remaining lines will slow down.

The whole Internet was designed for precisely this use case. If there is an outage, the distributed system will try to find another path. No actual central point of failure. As you say, the single brick is hyperbolic. But yea, those sharks can certainly be disruptive at times.

Well, that depends on how much traffic that cable was supporting, how much free capacity is available on other cables heading to the same area, how much additional latency the rerouting will add, and how sensitive to latency the rerouted traffic is, doesn't it?

Do satellite networks not move the needle in terms of capacity/reliability now?

Conceptually, it's the difference between your wifi versus running a single fiber to each room in your house. The difference in bandwidth is multiple orders of magnitude.

This is never going to change because from a physical perspective free radio is a shared medium while each individual fiber (or wire) has its own private bandwidth.


Only a little bit. Just clicking around, a new Hawaii cable is supposed to have 24 fiber pairs and 18 Tbit/s per fiber pair at the end of this year. If you lose several Tbit/s of bandwidth, you're going to have a hard time making it up with satellite.

For small island countries and such, satellite capacity may be sufficient; and it is likely helpful for keeping international calling alive even if it's not sufficient for international data. But when you drop capacity by a factor of 1000, it's going to be super messy.


No. They're not set up to be a principal route between two nations, and most satellite networks until very recently didn't even route messages through other satellites, but instead retransmitted them to a ground station with access to hardline internet. Even Starlink mostly does this still, because it's way cheaper and easier.

You can see an unofficial tracker [0] of the Starlink downlink network and see how, outside of some rural areas, your data is only moving a few tens of miles away most of the time before it's sent down to a ground system. Their sats have three 200 Gbps laser communicators for intra-constellation routing, which is pretty small for the task of replacing fiber optic links.

[0] https://www.google.com/maps/d/viewer?mid=1805q6rlePY4WZd8QMO...
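Rough arithmetic from the numbers in these comments (24 fiber pairs at 18 Tbit/s each for the new Hawaii cable, three 200 Gbps laser links per satellite); a back-of-the-envelope sketch only, ignoring real-world routing and utilization:

```python
cable_tbps = 24 * 18             # ~432 Tbit/s for one modern cable
sat_laser_tbps = 3 * 200 / 1000  # 0.6 Tbit/s of laser capacity per satellite
sats_per_cable = cable_tbps / sat_laser_tbps
print(sats_per_cable)  # 720.0 satellites' worth of laser links per cable
```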


God, those nicknames. The Algo of Power GW (gateway?), the Pew Pew GW? Elon chose these.

It's all so pick me. Like his insistence that he's a top level gamer.

The capacity of satellite networks is minuscule compared to that of undersea fibre optics.

Plus still have to contend with the space sharks.

I never understand why questions like this get downvoted around here.

They don't, you just have to wait longer than an hour for an accurate rating

Undersea cables. With a shark biting one.

The cables at the bottom of the ocean.

Looks like an undersea cable to me

I like that the hand crank is going counter-clockwise

Crap, I saw it as clockwise. (Furious reversal of effort…)

One of DNS pillars should be replaced by BGP.

And NTP, if I recall correctly.

When was that?

Apparently it is impossible to find the time or place to add them.

When was BGP? Or when was NTP?

I think it was a joke based on NTP being a time protocol.

whoosh

The "Whatever Microsoft is doing" bit was always my favorite.

The depiction of Microsoft as "angry birds coming to indiscriminately fuck everything up" is absolutely on point for Microsoft in 2025/26

given the events of the last few days, one could add a Shahed drone too.

Oh wow! :)

Thank you for the laughs. I needed that!


What is the point being made here? Some past technologies were overhyped, therefore AI is overhyped? Well, some past consumer technologies did change the world (smartphones, texting, video streaming, dating apps, online shopping, etc), so where's the argument that AI doesn't belong to this second group?

Also, every single close friend of mine makes some use of LLMs, while none of them used any of the overhyped technologies listed. So you need an especially strong argument to group them together.


> Here's what I mean by "good code": [...]

What a fantastic list. I'll be saving it to show the junior developers.

My only nitpick is that "reliability" should have been a point by itself. All the other "ilities" can be appropriately sacrificed in some context, but I've never seen unreliable software being praised for its code quality.

Which is part of why LLMs are so frustrating. They're extremely useful and extremely unreliable.


I agree with the general message, but I'm curious what ingredients go in your 800 calorie sandwich. That's more than a double Big Mac with 4 patties (780 kcal)!



Lots of mayo... and butter. It's more like a butterwich, realistically. ;)


The meeting notes in the repo were a nice surprise. Overall it looked great, striking a good balance.

  .input {$var :number maximumFractionDigits=0}
  .local $var2 = {$var :number maximumFractionDigits=2}
  .match $var2
  0 {{The selector can apply a different function to {$var} for the purposes of selection}}
  * {{A placeholder in a pattern can apply a different function to {$var :number maximumFractionDigits=3}}}
Oof, that's a programming language already. And new syntax to be inevitably iterated on. I feel like we have too many of those already, from Python f-strings to template engines.

I hope it'll at least stay small: no nesting, no plugins, no looping, no operators, no side effects or calls to external functions (see Log4j).


It looks more like a DSL than configuration, but then given what I've learned about localization that's probably necessary in some cases!

However, ideally / in most cases it isn't.


English has just singular and plural: one car, two cars, three cars (and zero cars).

Some languages have more variations. E.g. Czech, Slovene, and Russian have 1, 2-4, and 5 as different cases.
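For instance, CLDR's plural rules for Russian pick the category from the number itself; a rough Python sketch (integer cardinals only, my own illustration):

```python
def ru_plural_category(n: int) -> str:
    """CLDR-style plural category for Russian integer cardinals."""
    if n % 10 == 1 and n % 100 != 11:
        return "one"   # 1, 21, 31... but not 11
    if n % 10 in (2, 3, 4) and n % 100 not in (12, 13, 14):
        return "few"   # 2-4, 22-24... but not 12-14
    return "many"      # 0, 5-20, 25-30...

# 'car' in Russian, in its three plural forms:
forms = {"one": "машина", "few": "машины", "many": "машин"}
for n in (1, 3, 5, 11, 21):
    print(n, forms[ru_plural_category(n)])
```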

Personally I think the syntax is too brittle. It looks too much like TeX code, and it has the Lisp-like problem of lines ending with too many } braces.

I would separate it into two cases: simple strings with just simple interpolation, and a fuller markup language, more like a simplified XML.

There is more example code at https://github.com/unicode-org/message-format-wg/blob/main/d...


Oh, the language aspect gets a lot worse than that. They explicitly have a non-goal of "all grammatical features of all languages", but the "common" cases are hard enough. From https://github.com/unicode-org/message-format-wg/blob/main/s... :

  .local $hasCase = {$userName :ns:hasCase}
  .match $hasCase
  vocative {{Hello, {$userName :ns:person case=vocative}!}}
  accusative {{Please welcome {$userName :ns:person case=accusative}!}}
  * {{Hello!}}
But if anyone can find a good compromise, it's the Unicode team.


As a happy Bazzite user, I had no idea things were so bumpy. At least the migration to other ostree distros is trivial (Fedora Kinoite -> Bazzite was one or two shell commands). My main reason for using the distro was the built-in NVIDIA drivers for my old graphics card.


I think it's also appropriate to use it when the rule is so strong that exceptions are famous because they are exceptions. "Birds are capable of flight" is strong enough that penguins and ostriches are famous for being counterexamples.


But that's not following the saying - it's still not proving, it's modifying the rule. It shifts the rule from "birds can fly" to "most birds can fly". Pointing out that penguins can't fly doesn't make the case that birds can fly stronger in any way.


You're right in a strict sense. But in my experience such strictness is only useful in hard sciences and (maybe) legalese. There are exceedingly few things we can claim to apply everywhere, and even fewer we can "prove" to each other.

Give it a try if you don't believe me. Even categories we take for granted, like trees and fish, are not perfectly crisp, and "obvious" facts like "humans need a heart to live" have surprising exceptions.

> Pointing out that penguins can't fly doesn't make the case that birds can fly stronger in any way.

I disagree. It's such a common rule that there's a long Wikipedia page for the exceptions[1], and the first photo is of penguins, labelled "penguins are a well-known example of flightless birds".

If I knew nothing else about the topic, I would take it as evidence that it's common for birds to fly, otherwise that fact would have been unremarkable. Not hard proof of a universal quantifier, but a useful rule nonetheless.

[1] https://en.wikipedia.org/wiki/Flightless_bird


> There are exceedingly few things we can claim to apply everywhere, and even fewer we can "prove" to each other.

Yes, this is why hard and fast rules don't make sense, and why they should have "generally", "normally", or "mostly" attached to them.

If you have two categories of birds, one with those that fly and one that doesn't, having that second list doesn't make the first stronger. At some point that second list dilutes that first one so much that it doesn't make sense anymore.

If my rule is that "white guys are named Dave" does my building a list of every example of a Dave and non-Dave make my rule stronger? When does the "strong" nature of the rule get watered down sufficiently? Honestly, a list of hundreds of birds tells me that it's a weak rule and that the "birds fly" rule is wrong.

