Hacker News | new | past | comments | ask | show | jobs | submit | danw1979's comments | login

It works perfectly well when you’ve got deep pockets and unmanned test vehicles though.

False. SpaceX's development of Starship is much cheaper than SLS despite using more test vehicles. The claim that building hardware-rich is more expensive is not really supported by the data.

NASA has done some analysis of early SpaceX and shown that their methods produced a 10x improvement in cost. And that was using NASA's own estimating methods, which often turn out to be wrong.


Those deep pockets are funded by the same pot we all feed from.

And everyone should be happy that SpaceX's draw on that pot is TEN TIMES smaller than that of the other pot-drainers pursuing the same goal.

We are repeating this same UX mistake with induction hobs now.

At least there's a good reason there - they're easier to clean. That's not much of a concern with microwave controls.

But I disagree with the article's/book's idea that we don't need precise times on a microwave. The thing I most regularly microwave (milk for my kids) needs 1 minute 50 seconds; at 2 minutes they'll reliably complain it's too hot.

The real problem with microwave UX is that the interfaces are often simply bad. People think the power/time dial interface is good but that's because it's difficult to mess it up (though they usually manage anyway by having them go up to 30 minutes or whatever).

It's really easy to mess up a button interface but you can also do it well. My microwave is close to doing it really well. You press a high/med/low button, then 1s/10s/1m/10m buttons to the desired time, then start. The only things they got wrong are that it requires pressing the power button when 99% of the time you want high, and that the time increments could be more usefully distributed (I'm literally never going to use the 10m button).

But apart from that it's nicer than dials, which are often very cheap and imprecise.


  >they're easier to clean
I've never had an issue cleaning the dials. They're smooth hard plastic, and they don't get particularly dirty.

  >though they usually manage [to mess up the interface] anyway by having them go up to 30 minutes or whatever
What's the issue? I've microwaved that long before.

  >My microwave is close to doing it really well. You press a high/med/low button, then 1s/10s/1m/10m buttons to the desired time, then start.
We're very different people! That UX sounds dreadful to me, one of the worst I've heard (and unfortunately encountered).

Enter time on the keypad, optionally press Power and enter that, press Start. Also needs a Plus 30s button. This is the one and only correct way to implement a push button microwave. ;)

I count five presses instead of 3 to get 90 seconds, including one way that's just pressing the same button 3 times (+30s).

  >needs 1 minute 50 seconds. 2 minutes and they'll reliably complain it's too hot.
Seven presses?

The dial microwave I use can distinguish between those two. It helps that the shorter times are given more room, so you can adjust them more precisely. 1:50 vs 2:00 will make a difference in my experience, but 7:50 vs 8:00 generally won't.

You could have a hybrid approach of course, but then I suspect the engineering tendency would be to "lock in" the time after starting the oven, so it can't "accidentally" be changed.

Looking for a photo of my microwave dial, I came across this surprisingly relevant post:

https://ux.stackexchange.com/questions/90769/why-do-microwav...


> What's the issue? I've microwaved that long before.

Really? What for? Anyway, the vast majority of microwaving is going to be in the 1-5 minute range. By making the dial linear and giving it a huge range up to 30 minutes, you end up making e.g. 30 seconds and 1 minute impossibly close together.

The commercial microwave oven someone else linked had a solution - make it logarithmic.

> Seven presses?

Eight, actually, but it really is quicker and easier than doing the same with a dial. I agree it could be optimised though: it shouldn't be necessary to select the power, and a 30s button would be good (down to 5 presses).


The vast majority of microwaving is in the 1 to 5 minute range only for those who use a microwave oven only for reheating.

For cooking, times from 10 to 15 minutes are more frequent, though things like potatoes or sweet potatoes need only 7 to 8 minutes. Only a few delicate vegetables or fruits may be cooked in the 1 to 5 minute range, e.g. onion, garlic, leek, parsley, dill, etc. Meat needs to be cooked at low power, which in turn requires long times, typically over 20 minutes. There is also a very small number of vegetables that need cooking times over 15 minutes, e.g. common beans, for which even 30 minutes may be needed.

That said, all the microwave ovens that I have used (in Europe) had rotary knobs with variable resolution, fine for short times and coarse for long times.


> only for those who use a microwave oven only for reheating.

Which is most people, as the article notes!

> had rotary knobs with variable resolution

Eh fair enough. Maybe I have just happened to only see bad ones.


For many years I also belonged to "most people": I cooked by traditional means on one of the weekend days, which required many hours, then reheated that food in a microwave oven for the rest of the week.

Only a few years ago I began to experiment with cooking raw ingredients in the microwave oven. After discovering how much this simplifies cooking, I very much regretted not having tried it earlier.

Because cooking with microwaves is much faster, nowadays I cook most food immediately before eating it.


I have used only microwave ovens with rotary knobs.

They had a finer resolution of 10 seconds for short times, then the resolution was progressively coarser for longer times, e.g. of 30 seconds for times over 10 minutes.

This is perfectly adequate for finding optimum times, and I cook in a microwave oven all the food that I am eating, from raw ingredients.


I noticed, it's an unfortunate regression.

What's amazing is how the vibe of using the microwave completely changed. Before it was:

"Okay, how much time?? I've gotta get this right, I only get one shot. Think!!"

to:

"Probably 2 minutes." moves knob, cooking starts "Eh, maybe 90 seconds actually." moves knob again

That alone probably reduces the error rate, and it certainly reduces annoyance.

With the new stoves, I've noticed people are starting to dread using their stove the same way they dread the microwave. Hopefully we can fix both.


When you say pre-launch, what do you mean exactly?

Seems the founders edition of the software is a $30/month subscription, per the GitHub.

My two cents - don't do it. There are plenty of terminal editors (and personal opinions about them) to choose from. You will end up reinventing an IDE.

This is great!

Did you have any thoughts about how to restrict network access on macOS too?


I haven’t found an easy way, but I have a working theory -

sandbox-exec cannot filter based on domain names, but it can restrict outbound network connections to a specific IP/port (and drop the rest). If I can run a proxy on localhost:19999, I can allow agents to connect through it and filter connections by hostname. From my research, most agents support $HTTP_PROXY, so I'll try redirecting their HTTP requests through my security proxy. IIRC, if I do this at the CONNECT level, I don't need to MITM their traffic nor require a trusted root cert.

Recently, Codex CLI implemented something like DNS filtering for their sandbox, so I'd investigate their repo.
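The CONNECT-level filtering idea above can be sketched in a few dozen lines. This is a minimal illustration, not anything from Codex CLI or sandbox-exec; the port, the `ALLOWED` hosts, and all function names here are my own assumptions:

```python
# Minimal hostname-filtering CONNECT proxy sketch.
# Agents pointed at it via HTTP_PROXY/HTTPS_PROXY send
# "CONNECT host:port HTTP/1.1"; only allowlisted hosts get tunnelled,
# so no MITM and no trusted root cert is needed.
import socket
import threading

ALLOWED = {"api.github.com", "pypi.org"}  # illustrative allowlist


def host_allowed(host: str, allowed=ALLOWED) -> bool:
    """Exact match, or subdomain of an allowlisted host."""
    host = host.lower().rstrip(".")
    return any(host == a or host.endswith("." + a) for a in allowed)


def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Relay bytes one way until the source closes."""
    try:
        while data := src.recv(65536):
            dst.sendall(data)
    finally:
        dst.close()


def handle(client: socket.socket) -> None:
    try:
        request = client.recv(4096).decode("latin-1")
        line = request.split("\r\n", 1)[0]       # e.g. "CONNECT host:443 HTTP/1.1"
        method, target, _ = line.split(" ", 2)
        host, _, port = target.partition(":")
        if method != "CONNECT" or not host_allowed(host):
            client.sendall(b"HTTP/1.1 403 Forbidden\r\n\r\n")
            client.close()
            return
        upstream = socket.create_connection((host, int(port or 443)), timeout=10)
        client.sendall(b"HTTP/1.1 200 Connection Established\r\n\r\n")
        for a, b in ((client, upstream), (upstream, client)):
            threading.Thread(target=pipe, args=(a, b), daemon=True).start()
    except (ValueError, OSError):
        client.close()


def serve(port: int = 19999) -> None:
    """Listen on localhost and hand each connection to a worker thread."""
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

Filtering at CONNECT time means the proxy only ever sees the hostname in plaintext, never the TLS payload, which is exactly why no certificate trickery is required.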


Some commercial firewalls will snoop on the SNI field in TLS ClientHellos and send a RST towards the client if the hostname isn't on a whitelist. Reasonably effective. If there's a way with the macOS sandboxing to intercept socket connections, you might find proxy software that already supports this.
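For illustration, the SNI field such a firewall snoops on is plain byte parsing of the ClientHello. This sketch is my own (the builder constructs a synthetic, minimal ClientHello purely for demonstration; real captures have more extensions):

```python
# Sketch: extract the SNI hostname from a raw TLS ClientHello record,
# the same field firewalls inspect to allow/deny connections by name.
import struct


def build_client_hello(host: str) -> bytes:
    """Build a minimal synthetic ClientHello carrying an SNI extension."""
    name = host.encode()
    sni = struct.pack("!BH", 0, len(name)) + name        # host_name entry
    sni_list = struct.pack("!H", len(sni)) + sni         # server_name_list
    ext = struct.pack("!HH", 0x0000, len(sni_list)) + sni_list
    exts = struct.pack("!H", len(ext)) + ext
    body = (
        b"\x03\x03" + b"\x00" * 32                       # version + random
        + b"\x00"                                        # empty session_id
        + b"\x00\x02\x13\x01"                            # one cipher suite
        + b"\x01\x00"                                    # null compression
        + exts
    )
    hs = b"\x01" + struct.pack("!I", len(body))[1:] + body   # handshake header
    return b"\x16\x03\x01" + struct.pack("!H", len(hs)) + hs  # TLS record


def extract_sni(record: bytes):
    """Return the SNI hostname from a ClientHello, or None if absent."""
    if len(record) < 5 or record[0] != 0x16:
        return None                                      # not a handshake record
    pos = 5 + 4 + 2 + 32                                 # headers, version, random
    pos += 1 + record[pos]                               # session_id
    pos += 2 + int.from_bytes(record[pos:pos + 2], "big")  # cipher_suites
    pos += 1 + record[pos]                               # compression_methods
    pos += 2                                             # extensions total length
    while pos + 4 <= len(record):
        etype, elen = struct.unpack_from("!HH", record, pos)
        pos += 4
        if etype == 0x0000:                              # server_name extension
            # skip list length (2) and entry type (1), read name length (2)
            nlen = int.from_bytes(record[pos + 3:pos + 5], "big")
            return record[pos + 5:pos + 5 + nlen].decode()
        pos += elen
    return None
```

Because the SNI travels in cleartext before any encryption starts, this works without terminating TLS; the trade-off is that clients using Encrypted ClientHello defeat it.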

The HTTP_PROXY approach might be simpler, though.


I’m married to someone running various prostate cancer studies in the UK. I hear the arguments against screening a lot and the issue really blew up recently in the news here.

The thing is, when researchers talk about "worse outcomes" they're often weighing survival (or rather the lack of it) against terrible side effects.

What this fails entirely to capture is that doing something to increase your odds of survival, damn the consequences, is an individual choice. It shouldn’t be up to a health economist to make that judgement.


> What this fails entirely to capture is that doing something to increase your odds of survival, damn the consequences, is an individual choice.

What you're failing to capture is that this is a hard problem because it's both an individual choice and a collective one as well. Those "terrible side effects" might actually end up killing someone. You're choosing between a high-chance lottery on a small population or a low chance lottery on a far larger one. It's not that simple.


But who will pay for the hundreds of thousands of screening MRIs, along with the large number of incidental results that will require some sort of follow-up? Many patients will seek second opinions if not recommended to "cut it out", with additional costs also for the complications resulting from unnecessary biopsies. US medical care is already tremendously expensive; adding all of these costs will break the bank and for no real benefit.

[flagged]


Bot?

I've been a Mac user since 1994 (System 7), and it feels to me like the overall Mac user experience and reliability (stability, speed, etc.) really peaked with Snow Leopard, 10.6.

This probably has a lot to do with the vastly improved hardware design around then - the touchpad specifically on the "blackbook" Core 2 Duo era MacBooks was a step change, and the keyboard was pretty great too. Multi-monitor support was fantastic compared to everything else as well.

You have to wonder what the design principles of pre-X MacOS paired with modern Apple hardware could achieve.


I'm sorry guys, it's my fault.

My first Mac was an '09 MBP with Snow Leopard; shortly after, they updated and started removing random features and closing down customization. For some reason, you couldn't be trusted with more than one right-click method anymore.

A solid 15 years later I tried Macs again: had a nice M3 Air at work and bought a personal M4 Air. A few months later Tahoe came out. I bought the thing because modern dark-mode macOS looked so great and was such a pleasure to use. Now it's full-on bubbleboy.

Word must have gotten back to Cupertino that I was back in the ecosystem...


>...really peaked with Snow Leopard, 10.6.

Which was just a couple of years after the iPhone. After the iPhone, the Mac was the new Apple ][, i.e. something they kept around to make some money, but didn't really care about.


Pretty sure I went to a rave once where they used all of these.


The thing about most art, architecture, etc is that it’s incredibly subjective, so contrasting your own views with “certain left wingers” is pretty much pointless.

I personally think the entire south bank is pretty ugly, but my views on this, my political views or my views on other styles of architecture don’t matter one jot.

If there’s a building a bunch of people care very much about, then let them protect it.


Thanks Neal. It's fun.

