Hacker News | new | past | comments | ask | show | jobs | submit | lwhi's comments | login

The fundamental truth is that we need fewer people to keep things running as they are running right now.

But who's to say that things will be 'running as they are now' for long? And who knows what a new economy will look like?

If and when that transition occurs, I think the job market will pick up.


I disagree.

All of these foundation concepts are vocabulary.

We need vocabulary in order to understand and have reasonable conversations.

Do you need to be an expert? Probably not .. but yes, we should all understand.

I think we'll develop personal moats automatically. Some people are naturally uninquisitive. They'll be most at risk.


No, they don't need vocabulary; I can vibe code my SQL. For real though, what's the point of learning it now for analytics? Absolutely none.

Similar point to learning how to capitalise in a sentence.

And learning what analytics actually means (it's not the same as debugging).

Learn how to communicate.
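For what it's worth, even 'vibe coded' SQL trades on that shared vocabulary. A minimal sketch (hypothetical orders table, using Python's built-in sqlite3) of the kind of aggregate query the vocabulary describes:

```python
import sqlite3

# Hypothetical data: the point is the vocabulary (aggregate, GROUP BY,
# ordering) that makes a conversation about analytics possible at all.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 10.0), ("north", 20.0), ("south", 5.0)],
)

# An aggregate query: total revenue per region.
for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"):
    print(region, total)  # -> north 30.0, then south 5.0
```

Being able to say "that's an aggregate over a grouping" is exactly the kind of vocabulary the parent comment means.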


Who are the big blocks that survive the collapse though?

Some BSD server somewhere which was last rebooted in 1994. No one is really sure where it’s physically located, but it keeps everything running.

And it still pings, of course


'Dozens' should be multiplied by many millions, but I definitely agree with the sentiment.

Please, come on.

Is this really the best backup?

Sam Altman has demonstrated that he's a piece of ** with this move.

We can now safely assume that all the pronouncements and grand statements before were simply posturing.


> Is this really the best backup?

It is the justification for anything any corporation does. This is a company with a board of directors and shareholders; do you really think this is just Sam's opinion guiding this?

> Sam Altman has demonstrated that he's a piece of * with this move.

That's your opinion based only on what his company has publicly disclosed to you. I prefer not to judge a man's character based on corporate puffery.


Sympathy for the Devil.

So you've never done anything you look back on and regret? You don't believe in second chances or reform?

It sounds like your opinion is that when someone makes a bad decision, that's them for life.


If that's you Sam, repent now.

There has to be a line that will not be crossed, in order to be seen to have principles.

Until that line has been reached, we can safely assume there are no principles at play.


You are the one drawing that line for yourself. Everybody's line of principles is in a different place.

No, you're incorrect.

Some people never reach a line. And if they do get close, it gets hastily redrawn.


That's exactly what I said/implied. Everybody has their own line, and can draw it and move it wherever they like.

If you draw a line and somebody else crosses it, that affects you and you only. It's unfair to impose your own standards onto another person.


A person can be judged by their actions. Especially if they have a habit of redrawing their own 'line'.

It leaves my camera on even after I close the tab.


If the moat is taste, this is democracy in action.


So you need to approve all actions that actually do something, individually?


It's because of the Mac Mini's unified memory architecture, which is ideal for inference.


The amount of RAM available on a Mac Mini is not enough for a decent open model for OpenClaw; everybody is using remote AI services on those.


You can get up to 64 GB of memory.

It's very difficult to get this much memory on a graphics card.


I know, but which open model that fits in there is useful enough for OpenClaw? I don’t think there is one.

If you look at the videos and blog posts recommending a Mac Mini for this, they recommend the base model (which comes with just 16 GB), precisely because it's the cheapest Mac that can read your reminders, use iMessage etc. That's what those using OpenClaw want from the Mini, not its inference capabilities.


I disagree.


What model are you running with 64GB of VRAM (equivalent)? I doubt most users are doing that. Looking at their documentation, the default path for openclaw seems to be a 3P API for the model.


It doesn't matter what 'most users' are doing.

On a 64 GB Apple silicon Mac mini you can natively host mid-sized and some larger quantised local models .. using Ollama.

For example:

Qwen3-Coder (32B), GLM-4.7 (or GLM-4 Variants), Devstral-24B / Mistral Large (Quantized)
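As a back-of-the-envelope check that a model of that size fits: at 4-bit quantisation the weights take very roughly 0.6 bytes per parameter (weights plus quantisation overhead), and you need headroom for the KV cache and runtime. These figures are approximations, not measurements:

```python
# Rough feasibility estimate for hosting a quantised model in unified memory.
# bytes_per_param ~0.6 assumes a 4-bit (Q4-style) quantisation with overhead;
# overhead_gb is a guessed allowance for KV cache, OS and runtime.
def fits_in_unified_memory(params_billions, bytes_per_param=0.6,
                           overhead_gb=6, ram_gb=64):
    weights_gb = params_billions * 1e9 * bytes_per_param / 1e9
    return weights_gb + overhead_gb <= ram_gb

print(fits_in_unified_memory(32))             # 32B at Q4: ~19 GB weights -> True
print(fits_in_unified_memory(32, ram_gb=16))  # base 16 GB Mini -> False
```

Which is the gap between the 64 GB configuration and the base 16 GB model the parent comment mentions.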

