wrxd's comments

That’s a way to keep the daily active users number going up

This post does a great job at showing that the core idea behind GPT is relatively simple.

To do something useful, though, it needs tons of data, and then everything becomes more and more complex to deal with
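The "relatively simple core idea" is the autoregressive loop: predict a distribution over the next token, sample from it, append, repeat. A toy sketch below illustrates just that loop, standing in the transformer with simple bigram counts over a tiny corpus (the corpus and helper names are illustrative, not from the post being discussed):

```python
# Toy autoregressive next-token loop: the model here is just bigram
# counts, a stand-in for the transformer a real GPT would use.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ran".split()

# "Training": count which token follows which.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def generate(start, n_tokens, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(n_tokens):
        counts = bigrams[out[-1]]
        if not counts:          # dead end: no known continuation
            break
        tokens, weights = zip(*counts.items())
        out.append(rng.choices(tokens, weights=weights)[0])
    return " ".join(out)

print(generate("the", 5))
```

Everything beyond this loop (tokenization, attention, scale) is where the "tons of data" and the complexity come in.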


> I have an iPhone 16 and I'm locked in because of all my photos being on my iCloud subscription

Ever heard of the Data Transfer Project? https://support.google.com/photos/answer/10502587?sjid=95203...


Copilot, what have you done again?

Why would the correct output of a C compiler not work with a standard linker?

> Why would the correct output of a C compiler not work with a standard linker?

I feel it should for a specific platform/target, but I don't know if it did.

Writing a linker is still a lot of work, so if their original $20k cost of production did not include a linker I'd be less impressed.

Which raises the question: did CC also produce its own preprocessor, or did it just use one of the many free ones?


This thing likely has all of GCC, Clang, and every other open-source C compiler in its training set.

It could have spat out GCC source code verbatim and matched its performance.


It's kind of a failure that it didn't just spit out GCC, isn't it?

If I had GCC and was asked for a C compiler I would just provide GCC..


That’s the equivalent of getting less than 50% on a quiz consisting entirely of yes/no questions.
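The quip rests on simple arithmetic: blind guessing on yes/no questions scores about 50% in expectation, so anything below that is worse than a coin flip. A quick simulation (illustrative, not from the thread):

```python
# Random guessing on yes/no questions scores ~50% in expectation.
import random

rng = random.Random(42)
n = 10_000
answers = [rng.choice([True, False]) for _ in range(n)]
guesses = [rng.choice([True, False]) for _ in range(n)]

score = sum(a == g for a, g in zip(answers, guesses)) / n
print(f"{score:.1%}")  # close to 50%
```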

It’s in Rust…

It's an LLM, surely it could read gcc source code and translate it to Rust if it really tried hard enough

It wasn't given gcc source code, and it wasn't given internet access. To the extent it could translate gcc source code, it'd need to be able to recall all of the gcc source from its weights.

Right. And the arguably simpler problem, where the model gets the C code directly, is active research: https://www.darpa.mil/research/programs/translating-all-c-to...

All of this work is extraordinarily impressive. It is hard to predict the impact of any single research project the week it is released. I doubt we'll ever throw away GCC/LLVM. But, I'd be surprised if the Claude C Compiler didn't have long-term impact on computing down the road.


I occasionally - when I have tokens to spare, a MAX subscription only lasts so far - have Claude working on my Ruby compiler. Far harder language to AOT compile (or even parse correctly). And even 6 months ago it was astounding how well it'd work, even without what I now know about good harnesses...

I think that is the biggest outcome of this: The notes on the orchestration and validation setup they used were far more interesting than the compiler itself. That orchestration setup is already somewhat quaint, but it's still far more advanced than what most AI users use.


Anti-LLM: isn’t all this intelligence supposed to give us something better than what we already have?

Me: Top 0.02%[1] human-level intelligence? Sure. But we aren't there yet.

[1] There are around 8k programming languages that are or were used in practice (that is, they were deemed better than existing ones in some respect), and there are around 50 million programmers. I use this to estimate how many people have made something objectively better than existing products.
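A quick back-of-the-envelope check of the 0.02% figure, using the rough counts from the footnote (~8,000 languages, ~50 million programmers):

```python
# Fraction of programmers who have created a programming language
# that saw real use, per the footnote's rough counts.
languages = 8_000
programmers = 50_000_000

fraction = languages / programmers
print(f"{fraction:.4%}")  # about 0.016%, i.e. roughly the top 0.02%
```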


What about the hype? If you claim your LLM generated compiler is functionally on par with GCC I’d expect it to match your claim.

I still won’t use it until it also matches all the non-functional requirements, but you’re free to go and recompile all the software you use with it.


Discord is still not the same and, in my opinion, inferior. It’s mostly synchronous chat with poor searchability, something very different from what forums used to be

G-Research is a trading firm, not Google research


The G stands for "Google", does it not?


There is no relation.

