Hacker News | BearOso's comments

It's fairly easy when it's in print like this. When it's handwritten I have trouble just going back to 1900.

From the beginning, one of the advertising tricks they have used for AI is FOMO. I presume that is so they can sell you as much of it as they can before you realize its flaws.

Everybody's so worried about getting in on the ground floor of something that they don't even imagine it could be a massive flop.


Did you pay attention in computer science classes? There are problems you can't simply brute-force. You can throw all the computing power you want at them, but they won't terminate before the heat death of the universe. An LLM can only output a convolution of its data set. That's its plateau. It can't solve problems; it can only output an existing solution. Compute power can make it faster to narrow down to that existing solution, but it can't make the LLM smarter.
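To make the brute-force point concrete, here's a back-of-the-envelope sketch; the 128-bit keyspace and the trillion-guesses-per-second rate are illustrative assumptions, not figures from the comment:

```rust
fn main() {
    // Illustrative assumptions: a 128-bit keyspace searched at
    // a trillion (1e12) guesses per second.
    let keyspace = 2f64.powi(128);      // ~3.4e38 candidates
    let rate = 1e12;                    // guesses per second
    let seconds_per_year = 3.15576e7;   // Julian year in seconds

    let years = keyspace / rate / seconds_per_year;
    // Comes out around 1e19 years; the universe is ~1.4e10 years old.
    println!("{years:.2e} years to exhaust the keyspace");
    assert!(years > 1e18);
}
```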

Maybe LLMs can solve novel problems, maybe not. We don't know for sure. The trend suggests they can.

There are still plenty of problems that more tokens would allow us to solve, and solve faster and better. There is absolutely no way we've already met AI compute demand for the problems that LLMs can solve today.


There is zero evidence that LLMs can do anything novel without a human in the loop. At most, an LLM is a hammer. Not exactly useless by any stretch of the imagination, but yes, you need a human to swing it.

Every solution generated by an AI for a novel problem was ultimately retracted. There is no trend, there is only hope.

LLMs are considered Turing complete.

Only if you instantiate it once.

If you use it like an agent, sticking it in a loop and running it until it achieves a specific outcome, it's not.


Not really. You can leverage randomness (and LLMs absolutely do) to generate bespoke solutions and then use known methods to verify them. I'm not saying LLMs are great at this; they're hampered by their inability to "save" what they learn. But we know that any kind of "new idea" is a function of random and deterministic processes mixed together in varying amounts.
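A toy sketch of that generate-and-verify split: a tiny linear congruential generator stands in for the random proposer, and a deterministic divisibility check is the verifier. The composite 8051 (= 83 × 97), the seed, and the LCG constants are all arbitrary choices for illustration:

```rust
fn main() {
    // Deterministic target: find a nontrivial factor of n.
    let n: u64 = 8051; // 83 * 97, picked for the example
    let mut state: u64 = 42; // arbitrary seed

    loop {
        // Random proposer: one step of a linear congruential generator.
        state = state.wrapping_mul(1664525).wrapping_add(1013904223);
        let candidate = 2 + state % (n - 2); // candidate in [2, n-1]

        // Known method verifies the bespoke guess deterministically.
        if n % candidate == 0 {
            println!("verified factor: {candidate}");
            break;
        }
    }
}
```

The proposer can be as dumb or as clever as you like; only the verifier has to be trustworthy.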

Everything is either random, deterministic, or some shade of the two. Human brain "magic" included.


Microsoft doesn't seem to care unless you're a company. That's the reason Community Edition is free. Individual licenses would be pennies to them, and they gain more than that by having a new person making things in their ecosystem. It's in their interest to make their platform as accessible as possible.

PulseAudio, when it came out, was utterly broken. It was clearly written by someone with little experience in low-latency audio, as if the only use case were Bluetooth music streaming and nothing else. Systemd being from the same author made me heavily averse to it.

However, unlike PulseAudio, I've encountered few technical problems with systemd. I certainly dislike the scope creep and appreciate that there are ideological differences and portability problems, but at least it works.


If Rust has one weakness right now, it's bindings to system and hardware libraries. There's a massive barrier in Rust communicating with the outside ecosystem that's written in C. The definitive choice to use Rust and an existing Wayland abstraction library narrows their options down to either creating bindings of their own, or using smithay, the brand new Rust/Wayland library written for the Cosmic desktop compositor. I won't go into details, but Cosmic is still very much in beta.

It would have been much easier and cost-effective to use wlroots, which has a solid base and has ironed out a lot of problems. On the other hand, Cosmic devs are actively working on it, and I can see it getting better gradually, so you get some indirect manpower for free.

I applaud the choice to not make yet another core Wayland implementation. We now have Gnome, Plasma, wlroots, weston, and smithay as completely separate entities. Dealing with low-level graphics is extremely difficult, and every implementor encounters the same problems and has to come up with independent solutions. There's so much duplicated effort. I don't think people getting into it realize how deceptively complex low-level graphics is and how many edge cases it entails.


(xfwl4 author here.)

> using smithay, the brand new Rust/Wayland library

Fun fact: smithay is older than wlroots, if you go by commit history (January 2017 vs. April 2017).

> It would have been much easier and cost-effective to use wlroots

As a 25+ year C developer, and a ~7-year Rust developer, I am very confident that any boost I'd get from using wlroots over smithay would be more than negated by debugging memory management and ownership issues. And while wlroots is more batteries-included than smithay, already I'm finding that not to be much of a problem, given that I decided to base xfwl4 on smithay's example compositor, and not write one completely from scratch.


Thanks for the extra info. I'm glad it hasn't turned out to be much of an issue. I've looked at your repository and it seems to be off to a great start.

Personally, I'm anxious to do some bigger Rust projects, but I'm usually put off by the lack of decent bindings in my particular target area. It's getting better, and I'm sure with some time the options will fill out more.


There really isn't a "massive barrier" to FFI. Autogenerate the C bindings and you're done. You don't have to wrap it in a safe abstraction, and imo you shouldn't.
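As a sketch of how little ceremony the raw route needs, this declares one libc function by hand and calls it directly; in practice a tool like bindgen autogenerates declarations of this shape from the C header, and `strlen` is just a convenient stand-in:

```rust
use std::ffi::CString;
use std::os::raw::c_char;

// Hand-written declaration of a C function from libc.
// bindgen would generate declarations like this from a .h file.
extern "C" {
    fn strlen(s: *const c_char) -> usize;
}

fn main() {
    let msg = CString::new("hello from C").unwrap();
    // Crossing the FFI boundary is `unsafe`: the compiler cannot
    // check the pointer contract on the C side.
    let len = unsafe { strlen(msg.as_ptr()) };
    assert_eq!(len, 12);
    println!("strlen says {len} bytes");
}
```

No safe wrapper layer is involved; the call site reads exactly like the C documentation.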


This. It is somewhat disheartening to hear the whole interop-with-C with Rust being an insurmountable problem. Leaving the whole "it's funded by the Government/Google etc." nonsense aside: I wish that at least a feeble attempt would be made to actually use the FFI capabilities that Rust and its ecosystem have before folks form an opinion. Personally, and I'm not ashamed to state that I'm an early adopter of the language, I find them very good. Please consider that the Linux kernel project, Google, Microsoft, etc. went down the Rust path not on a whim but after careful analysis of the pros and cons. The pros won out.


> This. It is somewhat disheartening to hear the whole interop-with-C with Rust being an insurmountable problem.

I have done it and it left a bad taste in my mouth. Once you're doing interop with C you're just writing C with Rust syntax topped off with a big "unsafe" dunce cap to shame you for being a naughty, lazy programmer. It's unergonomic and you lose the differentiating features of Rust. Writing safe bindings is painful, and using community written ones tends to pull in dozens of dependencies. If you're interfacing a C library and want some extra features there are many languages that care far more about the developer experience than Rust.


> a big "unsafe" dunce cap to shame you for being a naughty, lazy programmer

You just have to get over that. `unsafe` means "compiler cannot prove this to be safe." FFI is unsafe because the compiler can't see past it.
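A minimal illustration of that reading of `unsafe`, using `get_unchecked`: the bounds invariant is checked by the programmer instead of proved by the compiler, which is all the keyword records:

```rust
fn main() {
    let data = [10, 20, 30, 40];
    let i = 2;

    // We uphold the invariant ourselves; `unsafe` only records that
    // the compiler can't prove it, not that the code is wrong.
    assert!(i < data.len());
    let x = unsafe { *data.get_unchecked(i) };

    assert_eq!(x, 30);
    println!("element {i} is {x}");
}
```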

> Once you're doing interop with C you're just writing C with Rust syntax

Just like C++, or Go, or anything else. You can choose to wrap it, but that's just indirection for no value, imo. I honestly hate seeing C APIs wrapped with "high level" bindings in C++ for the same reason I hate seeing them in Rust. The docs/errors/usage are all in terms of the C API, and in my code I want to see something that matches the docs, so it should be "C in the syntax of $language".


> a big "unsafe" dunce cap to shame you for being a naughty, lazy programmer

That's bizarrely emotional. It's a language feature that allows you to do things the compiler would normally forbid you from doing. It's there because it's sometimes necessary or expedient to do those things.


My point is that using C FFI is "the things the compiler would normally forbid you from doing," so if that's a major portion of your program then you're better off picking a different language. I don't dislike Rust, but it's not the right tool for a project that relies heavily on C libraries.


Rust becomes a significant burden if you need a GUI or hardware-accelerated graphics.


C++ isn't much better for GUI.


C++ was the GUI king during the 1990s, and none of the Rust toolkits is half as good as the surviving frameworks like C++ Builder, Qt, or wxWidgets; heck, even MFC has better tooling.


> wxWidgets

I'll trade you wxWidgets for FLTK.

Trying to defer to native widget rendering per platform was a mistake, and every time I've touched wxWidgets in the past decade and a half I've regretted it.

FLTK on the other hand is ugly as sin, but I've found it reliable enough to not act in surprising ways, and it's also small enough to where you can vendor it and build it alongside your program.


I assume most of them are just grabbing Qt.


HFCS came about when there was an abundance of corn and nothing to do with it. So when they discovered corn syrup, they added corn subsidies and heavily tariffed cane sugar. Ethanol appeared and is a far greater corn sink, so HFCS no longer even serves that purpose.

But the processing industry doesn't want to disappear (money and job losses), so they lobby and the status quo remains. Same with private health care in a cozy position where they act as an unneeded middleman. It's too lucrative to certain people, and they won't willingly give it up.


My understanding was that the meat-packing process in the US involves a butchering method that results in more fecal matter contamination, posing the risk of salmonella, which necessitates the wash. Those bacteria occur naturally, so you can't avoid that without being careful with butchering, which is probably what the EU standards require. But I doubt the big meat conglomerates like Tyson will want any hit to productivity, and they would fight a change every step of the way.


Mechanically separated meat bluntly ruptures the digestive tract and smears the flesh with feces. So they soak the feces and flesh together in a chlorine or acid bath to sanitize it. It's disgusting.


Does EU not use mechanical meat separation for chickens? If not, wouldn't their costs be dramatically higher?


They, and everyone at the time, were kind of forced to switch to lead-free solder by RoHS. At that point, there probably hadn't yet been tests showing the results of constant thermal cycling, so the embrittlement effect was unknown. Apple was particularly affected as an early adopter because of their PR stance on environmental issues.

Refusing to acknowledge anything was wrong was the real problem. But that's just a reminder that companies don't care about you. Brand loyalty is a quagmire.

