That’s my feeling when reading nixos forums. People are willing to help but don’t realize how little newbies know about nix when asking for help. The first month of nixos was a massive uphill climb for me, and that knowledge doesn’t stick well because I get to interact with nix every few months to tweak things, not weekly or daily.
It's a solid OS, and I'm enjoying it, and I love that I can't break things while tweaking. But the docs and discussion threads are not written for beginners (it's really hard to write for beginners).
Not the greatest fan of python, but when I've got to run a python script, I do `nix-shell -p 'python3.withPackages (ps: [ps.requests])' --command 'python3 your-script.py'`. Note that there is one argument to -p and one argument to --command; both are quoted. The argument to -p is a nix expression that provides a python3 command, referring to a python3 with the requests package. The argument to --command is a bash script that runs python3 with the argument "your-script.py", i.e. it runs your-script.py with the python3 that has the requests package.
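For repeat use, the same one-liner can be captured in a file. Here is a minimal sketch of an equivalent shell.nix, assuming `<nixpkgs>` resolves to a reasonably recent nixpkgs on your machine:

```nix
# shell.nix -- sketch of the same environment as the one-liner above;
# assumes <nixpkgs> is on your NIX_PATH.
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  packages = [
    (pkgs.python3.withPackages (ps: [ ps.requests ]))
  ];
}
```

With that in place, running `nix-shell --command 'python3 your-script.py'` from the same directory does the same thing.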
I think there are ways to auto-derive a python3 with specific packages from python dependency files, but I can't help you there. I do find AI to be reasonably helpful for answering questions like this; it just sometimes requires reminding it that you want to understand the answer rather than receive a perfectly packaged shell.nix file.
Do you have to figure this out? Sure, it's nice and "pure" if everything is configured through Nix but there is something to be said about being pragmatic. Personally, I just enabled nix-ld[0] and use uv to install and handle my Python versions and package dependencies. Much, much easier.
Easier and largely compatible with the rest of the world. Solving problems with "If we all switched to NixOS..." is a non-starter in most organizations.
My rule of thumb: keep a strict separation between my projects (which change constantly) and my operating system (which I set up once and periodically update). Any hard nix dependency inside the project is a failure of abstraction IMO. Collaborating with people on other operating systems isn't optional!
In practice this means using language-specific package management (uv, cargo, etc) and ignoring the nix way.
This website needs to be simpler, snappier, and more polite on the homepage. I should be able to send it as a quick reply to anyone doing the deed, just like nohello.net.
To be fair, if I had a company and won a lawsuit like that... a lawsuit which makes for a good underdog story, I'd let my PR team use it as much as they desire! That lawsuit is a golden asset for them now.
I would honestly expect AI ads to be invisible, and for them to just be injected by the provider as part of the prompt. For example, you ask for something about firefox, but the AI tells you that firefox has a nasty ugly way to solve your problem and it would be easier to install chrome.
Ditto. AI has the power to make you believe stuff without you noticing; why would they bother with garish ads when they could make you think it was YOUR idea to buy Clorox?
I guess the garish colors could maybe increase your suggestibility indirectly?
> against the spirit of science to keep them from the general public
Within science, participants have always published descriptions of methodology and results for review and replication. Within the same science, participants have never made access to laboratories free for everyone. You get blueprints for how to build a lab and what to do in it, you don't get the building.
Same for computation. I'm fairly sure almost all (if not all) algorithms in these suites are documented somewhere and you can implement them if you want. No one is restricting you from the knowledge. You just don't get the implementation for free.
Generally I agree, up until now, when we appear to be on the threshold of AI being orders of magnitude more powerful. Given that it has the potential to displace large swaths of the labor force, I feel as though society deserves a larger return on investment.
Software is fundamentally different than lab equipment, just like PDFs are not paper journals that have to be printed, stored, and shipped. Most things in the digital domain have to be treated in a post-scarcity mindset, because they essentially are.
This is why the incoming generation of AI engineers, organizing autonomously and openly on git etc., will decimate the dusty, locked-away AI academia generation.
The concept of heavy gatekeeping and attribution chasing seems asinine as knowledge generation and sharing isn't metered.
I would say almost exactly the opposite is happening. Academia generally publishes its results relatively freely, but academic AI research is largely being left in the dust by large corporations who do not find it in their interest to publicly describe the "magic dust" that makes their products work.
> Same for computation....You just don't get the implementation for free.
Software packages aren't computation... While software takes time and effort (and money) to make, the finished product is virtually free to store and distribute. I see it as similarly against the spirit of science. How is it that there is more free software in the layman space?
Ladybird has a strong "all dependencies built in house" philosophy. Their argument is that they want an alternative implementation to whatever is used by other browsers. I'd argue they would never use a third-party library like Servo, on principle.
No they don’t - SerenityOS did, but when Ladybird split out they started using all sorts of third party libraries for image decoding, network, etc.
Now, a core part of the browser rendering engine is not something they're going to outsource, because it would defeat the goal of the project, but they have a far different policy on dependencies now than they used to.
I would say the complexity of implementing defer yourself is a bit annoying for C. However defer itself, as a language feature in a C standard is pretty reasonable. It’s a very straightforward concept and fits well within the scope of C, just as it fit within the scope of zig. As long as it’s the zig defer, not the golang one…
I would not introduce zig's errdefer, though. That one would need additional semantic changes in C to express errors.
It starts out small. Then, before you know it, the language is total shit. Python is a good example.
I am observing a very distinct phenomenon in which the internet makes very shallow ideas mainstream and ruins many, many good things that stood the test of time.
I am not saying this is one of those instances, but what the parent comment says makes sense to me. You can see another commenter who now wants to go further and add destructors to C. Because of the internet, such voices can now reach out to each other, gather, and cause a change. Before, such voices would have had to go through a lot of sensible heads before they could reach each other. In other words, bad ideas got snuffed out early before the internet, but now they go mainstream easily.
So you see, it starts out slow, but then more and more stuff gets added which diverges more and more from the point.
I get your point, though in the specific case of defer, it looks like we both agree it's really a good move. No more spaghetti of `goto err_*;` in complex initialization functions.
Actually I am not sure I do. It seems to me that even though `defer` is more explicit than destructors, it still falls under "spooky action at a distance" category.
I don't understand why destructors enter the discussion. This is C; there are no destructors. Are you comparing "adding destructors to C" vs "adding defer to C"?
The former would bring so much into C that it wouldn't be C anymore.
And if your point is "you should switch to C++ to get destructors", then it's off topic. By definition, if we're talking about language X and your answer is "switch to Y", that's an entirely different subject, of very little interest to people programming in X.
Defer is not spooky action at a distance. It is an explicit statement that gets executed as written. Unlike (for example, a familiar feature which C doesn't have) operator overloading... which causes code that looks like one thing (addition, for example) to behave like another (a function call). Defer does exactly what it says on the tin ("move this line to the end of the scope"), just like goto does exactly what it claims to do.
Macros (in general) are way spookier than a defer statement.
Where it is invisible! What is so hard to understand about this?
> operator overloading...
Yes, but if we go by your argument, you can say it gets executed exactly as it is written. It is just that it is written (i.e. the overload) somewhere else, i.e. "at a distance"... just like a defer block that could be far from the end of the scope that triggers it.
> `defer` is still in "spooky action at a distance" category
Agree, this is also why I'm a bit wary of it.
What brings me to the "pro" side is that, defer or no defer, there will need to be some kind of cleanup anyway. It's just a matter of where it is declared, and close to the acquisition is arguably better.
The caveat IMHO is that if a codebase is not consistent in its use, it could be worse.
It is; the mere existence of goto makes control flow significantly harder to understand. People complain about exceptions in C++ obfuscating control flow, but then they recreate exceptions using goto. The funny thing is that exceptions are just fancy goto; the assembly is almost the same.
The bigger picture of C as a language is not that it's simple, because it's not simple at all. It's inept. It doesn't give developers the tools to write simple code. So easy things become hard, and we sort of jank together solutions that kind of work but usually don't.
I like to compare it to building a shed with power tools versus only a screwdriver. Is a screwdriver simpler than a power saw and all that? Of course. Now think about building a shed. Is it simpler to do with a screwdriver? No. It's much, much more complex. You have to develop complex processes to make that work, and it's not intuitive at all.
C is a language that already makes use of implicit control flow A LOT. I don't see defer being a problem. The irony is that if C just supported these use cases out-of-the-box, it would be simpler and easier. As a concrete example, consider polymorphism in C versus C++. Both languages can do it, but one provides the tools and one doesn't. In C++ I can go to definition, I can concretely define what polymorphism is allowed and what isn't, and the type system gives me the tools to make it safe. In C, none of that is true, so when we do polymorphism with function pointers, it's much harder to understand what's actually going on, or what could be going on.
The problem is that notepad itself would download and execute bad stuff if you clicked the evil link. If you pasted that same link into a browser, you'd be OK.
And the problem is a notepad app is expected to be dead simple, have few features, and be hard to get wrong while implementing.