Hacker News | KingOfCoders's comments

One thing I love about Go: no chasing the latest hyped features until the language collapses or every upgrade becomes a nightmare, just adding useful stuff and getting out of the way.

I know; I recently upgraded some large codebases, skipping several releases, without any issues.

The compatibility guarantee is a massive win. It's exciting to have a boring language to build on that doesn't change much but just gradually gets better.


Really? My experience is that, of C, C++, Go, Python, and Rust, Go BY FAR breaks code most often (the Python 2->3 change excepted).

Sure, most of that is not the compiler or standard library, but dependencies. And I'm not talking about some random open-source library (I can't blame the core team for that), but things like protobuf breaking EVERY TIME. Or x/net, x/crypto, or whatever.

But also yes, from random dependencies. It seems that language-culturally, Go authors are fine with breaking changes. Whereas I don't see that with people making Rust crates. And multiple times I've dug out C++ projects that I have not touched in 25 years, and they just work.


The stdlib has been very very stable since the first release - I still use some code from Go 1.0 days which has not evolved much.

The x/ packages are more unstable yes, that's why they're outside stdlib, though I haven't personally noticed any breakage and have never been bitten by this. What breakage did you see?

I think protobuf is notorious for breaking (though more from user changes). I don't use it, I'm afraid, so have no opinion on that; it has gone through some major revisions, so perhaps that's what you mean?

I don't tend to use much third-party code apart from the standard library and some x libraries (most libraries are internal to the org). I'm sure if you do have a lot of external dependencies you might have a different experience.


Well, for C++ the backwards compatibility is even better. Unless you're using `gets()` or `auto_ptr`, old C++ code either just continues to compile perfectly, or was always broken.

Sure, the Go standard library is in some sense bigger, so it's nice of them to not break that. But short of a Python2->3 or Perl5->6 migration, isn't that just table stakes for a language?

The only good thing about Go is that its standard library has enough coverage to do a reasonable number of things. The only good thing. But any time you need to step outside of that, it starts a bit-rotting timer that ticks very quickly.

> though [protobuf] has gone through some major revisions so perhaps that's what you mean?

No, it seems it's broken way more often than that, requiring manual changes.


> But any time you need to step outside of that, it starts a bit-rotting timer that ticks very quickly.

This is not my experience with my own or third party code. I can't remember any regressions I experienced caused by code changes to the large stdlib at all in the last decade, and perhaps one caused by changes to a third party library (sendgrid, who changed their API with breaking changes, not really a Go problem).

A 'bit-rotting timer' isn't very specific or convincing, do you have examples in mind?


>> But any time you need to step outside of that

"That" here refers to the standard library, so:

> I can't remember any regressions I experienced caused by code changes to the large stdlib at all in the last decade

I agree. But I'm saying it's a very low bar, since that's true for every language. Repeating myself, I do acknowledge that Go in some sense has a bigger standard library. It's still just table stakes not to break the stdlib.

> A 'bit-rotting timer' isn't very specific or convincing, do you have examples in mind?

I don't want to dox myself by digging up examples. But it seems that maybe half the time dependabot or something similar encourages me to bump versions on a project that's otherwise "done", I have to spend time adjusting to non-backwards-compatible changes.

This is not my experience at all in other languages. And you would expect it to be MORE common in languages where third party code is needed for many things that Go stdlib has built in, not less.

I've made and maintained open-source code continuously since the years started with "19", and aside from Java Applets, everything else just continues to work.

> sendgrid, who changed their API with breaking changes, not really a Go problem

To repeat: "It seems that language-culturally, Go authors are fine with breaking changes".


I disagree about the culture; I'd say that's the culture of JS.

For Go I'd say it's the opposite, and you have obviously been unlucky in your choices, which you don't want to talk about.

But it is not a universal experience. That is the only third party package with breaking changes I have experienced.


Isn't the x for experimental, meaning breaking API changes are expected?

Sure.

To repeat: "It seems that language-culturally, Go authors are fine with breaking changes". I just chose the x/ packages as examples of near-stdlib, as opposed to appearing to complain about some library made by a random person with skill issues, or who held the reasonable opinion that since almost nobody uses the library, it's OK to break compat. Protobuf is another example. (Not to mention the GCP libraries, which both break and move URLs, and/or get deprecated for a rewrite every Friday.)

The standard library not breaking is table stakes for a language, so I find it hard to give credit to Go specifically for table stakes.

And it's not as if the Go standard library isn't a bit messy, as any library would be in order to maintain compatibility. E.g. net.Dialer has Timeout (and Deadline), but it also has DialContext, introduced later.

If the Go standard library had managed to maintain table stakes compatibility without collecting cruft, that'd be more impressive. But as those are contradictory requirements in practice, we shouldn't expect that of any language.


I initially loved Zed because it was so much snappier than VSCode/Cursor, but running several Zed instances alongside Claude made my Ryzen/32 GB machine unusable; Zed seems to be such a memory hog. Not using it anymore. (Win11)

Most software that was put into production at the companies I worked for was not verified. There were spotty code reviews, mostly focusing on "I would have done it differently", a limited number of unit tests with low coverage, and flaky ("Heisenberg") E2E tests, often turned off for exactly that reason. Sometimes overworked testers who became a bottleneck.

There is hope that with AI we get to better tested, better written, better verified software.


> There is hope that with AI we get to better tested, better written, better verified software.

And that is one thing we surely won't get.

This tech, in a different world, could empower common people and take some weight off their shoulders. But in this world its purpose is quite the opposite.


Windhawk vertical task bar is a game changer for me.

I thought a Studio would be my local LLM machine for 2026, but this is $2000+ for the 126 GB option; not for me. I assumed $6000 for that Studio machine, but it now looks more like $8000.

Perhaps you're not the target audience of the article.

No the way to do it is this:

Break the law, make $1B in illegal profits, then get dragged to court and pay a $200M fine, while you keep most of the profit and the market position you illegally gained.

Bonus: shield all managers from personal accountability, ideally in a way where they got their bonuses and salaries and moved on long before the verdict hits.

Best: don't go to court at all, but make a $100M out-of-court settlement.


Interesting, but I'm not adding something to my CI just for a badge; too paranoid.

I also prefer CLI over MCP and wrote about why (and about when to use FUSE to integrate AIs and data):

https://www.tabulamag.com/p/a-new-way-to-integrate-data-into

My latest CLI instead of MCP:

https://github.com/StephanSchmidt/human (alpha)


Not sure how this works: will 'enveil --run claude' give the env values to the AI?

