Hacker News | past | comments | ask | show | jobs | submit | mmcnl's comments

Decent package manager: brew is awful compared to apt. Window snapping can only be done on Apple keyboards, not on external keyboards. No Alt+Tab; Cmd+Tab is not the same. No window previews when hovering over the Dock, and a ridiculous animation speed when switching workspaces that can't be changed (and somehow Ctrl+1/2/3 is 2x faster than Ctrl+Left/Right? What is that all about?). Needing third-party apps for basic things like setting a custom resolution (BetterDisplay) or setting the scroll direction for the mouse wheel independently of the touchpad scroll direction. And the Settings app is super slow.

What is bad about brew? I have used it in the past and found it fine. With apt I have less experience, since I've only used it when playing with a Raspberry Pi.

I find it generally slow, and by default it gets in the way in a very annoying way. Without disabling the feature, every single time I try to install something it also looks for updates, so instead of installing a single package I end up upgrading many additional packages.
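For what it's worth, Homebrew does document an opt-out for exactly that behavior: the HOMEBREW_NO_AUTO_UPDATE environment variable skips the implicit `brew update` before installs. A minimal shell-profile fragment (assuming zsh; adjust the file for your shell):

```shell
# In ~/.zprofile: stop Homebrew from running `brew update`
# before every `brew install` / `brew upgrade`
export HOMEBREW_NO_AUTO_UPDATE=1
```

You can also set it inline for a single command (e.g. `HOMEBREW_NO_AUTO_UPDATE=1 brew install <pkg>`) if you only want the fast path occasionally.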

> Decent package manager, brew is awful compared to apt.

Use MacPorts. It installs itself properly out of the way in /opt, works with the Apple frameworks (e.g. Python), and allows multiple versions of software to be installed in parallel (using port select).
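A sketch of that parallel-versions workflow (the Python port names are examples; check `port search` for what's actually available, and note these commands require MacPorts and admin rights):

```shell
# Install two Python versions side by side from MacPorts,
# then pick which one the `python` command points to.
sudo port install python311 python312
sudo port select --set python python312

# Show the installed variants and which one is currently active
port select --list python
```

Switching back is just another `port select --set`, which is the main thing apt-style single-version packaging can't give you.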

> Window snapping can only be done on Apple keyboards not on external keyboards.

Yes, you need some free 3rd party apps for affordances that should be built in. Hardly a deal breaker.

Rectangle allows you to set the hotkeys for window snapping and sizing for example.

As for scroll directions, yes, it's different to Windows, but it's the same on the Mac and iPhone. Didn't take very long to adjust.

Agreed that the new Settings app is a PITA and obviously inherited from iOS and sucks, but how often are you accessing Settings?


Macports is alive and well and works great.

nix-darwin plus the Nix package manager, with AeroSpace and SketchyBar, makes it almost the same as my desktop PC. Could that be an alternative?

True for Geekbench.

Which notoriously favors anything made by Apple.

Said only by those that don’t favor Apple.

Agreed, macOS has hardly improved in the past decade. The only improvements are about ecosystem integration, which I don't really care about. Everything else is stuck in the 2010s. UI has regressed if you ask me.

What improvements has Windows made in the last decade? I think what you're describing is a symptom of modern software development as a whole.

I wouldn't say as a whole. KDE is way ahead of where it was 10 years ago!

KDE was far less mature than macOS and Windows 10 years ago. Of course it’s come a long way.

Windows Terminal and PowerToys are pretty nice. The Phone Link app is convenient, and screenshots are way better (no need to paste into Paint anymore, just use Snip & Sketch).

The snipping tool (with all features I'm using today) was added to PowerToys more than 20 years ago. It was integrated directly into Windows 10 pretty early in the update cycle. Not sure it qualifies for "the last decade".

You get AI EMBEDDED EXPLORER! Just wait 2 seconds for the LLM to open up your folder.

Oh and don't forget to watch the ads.


Or their constant use of dark patterns to push you into using Bing and Edge. I was actually an Edge user myself. I liked a few of its built-in features, and it felt pretty fast. But then they started tricking me into changing my default search engine to Bing. I fell for it a couple of times, and then I quit.

> Agreed, macOS has hardly improved in the past decade

I would argue the opposite. Shared clipboard with my iPhone is a killer feature (I copy a lot of OTP tokens), and I envy those of you in the US who can remote access the iPhone (it is currently blocked in the EU, but hopefully will come eventually). Also, the multi-monitor setup has become way better (I used to use 3rd party tools to restore window and monitor positions).


And I can share between my android, iOS, and linux devices with KDE Connect https://kdeconnect.kde.org/

If there are reasons it's not good enough, since it's open source you should be able to help fix them (excepting iOS issues, since those are mostly just Apple locking down the OS too hard for various things to work).

We're on hacker news, we should all want something we can hack on. Shared clipboard between two devices with proprietary OSs we can't hack on is a great feature for the masses, but not us.


Good for you, but I don't see how that is relevant to the discussion of what features were added to macOS. Also note that clipboard sharing, just like AirDrop, is point-to-point: it neither requires an internet connection, nor is the data sent through a third party or network.

> since it's open source you should be able to help fix them

And I can also grow my own tomatoes and cucumbers in my back yard, but I still prefer to buy them from a supermarket.


It has even regressed. I'm still on my 2011 MacBook Air running High Sierra, but on my mom's M3 Air I can't help but observe that they did all that engineering to reduce the black bezel around the lid, only for Tahoe to have overly rounded windows and huge title bars.

I wouldn't say it hasn't improved. Security has improved considerably, and it's one of the main reasons to use a Mac.

However, there are too many bundled apps. I just wrote about this last week: https://medium.com/@hbbio/let-me-uninstall-spotlight-1fe64a3...


The tab key doesn't even work consistently across apps and screens.

No, macOS has improved a ton in a lot of ways under the hood: battery life, memory compression, paging behaviour. The MacBook Neo wouldn't be possible at 8GB without all this stuff.

Ecosystem integration is the shining difference between Apple and others, as it is radically better than any other available implementation.

I would argue that ecosystem integration is the only primary consideration that you need to use at the top/first-culling-step of the flowchart to either include or discount Apple products in any purchasing decision. Anything else is secondary, and has workarounds.

> UI has regressed

Honestly, I love the UI of MacOS 9.2.2 the most. But I don’t have a Time Machine or Elon Musk levels of wealth to chart a different course.

And sure, some UI decisions of late have been questionable. That is always the case with non-niche products that don’t have highly focused and largely conforming users. Apple moved out of that category back in the early 2000s, and it is forced to make the same UI tradeoffs that Microsoft makes.

I actually don’t mind the modern UI, and aside from a few warts I think they’re going in a very user-friendly direction even if power users feel slighted and abandoned.


I had high hopes for Surface as well, but the pricing is ridiculous. The Surface Laptop 7 is more expensive than a MacBook Air, with the added benefit of having worse battery life and performance. Pricing hasn't come down in almost 2 years either. Availability is almost zero; I've never seen one in real life.

This still doesn't tell me how they differ. What are the factual objective measurable differences between E/L/T/P?

I was assigned an E14 once. Compared to a T14:

- The case is all thick ABS.

- It weighs about 2.4 kg, and the weight is unbalanced.

- USB-C charging only works at 20 V, nothing less.

- While charging it overheats and spins up the fans.

- It came with a TN screen with terrible viewing angles that could not be used in a brightly lit room. I didn't use the laptop for two months while I waited for a replacement screen from AliExpress.

- The keyboard is much thinner, and the TrackPoint drifts easily.

- Camera quality is worse; somehow it cannot handle sunlit scenes. Microphone and speakers are similar to the T14.

- It stopped receiving firmware updates after two years.

- It uses about 0.5 W while suspended, so its tiny 48 Wh battery typically doesn't last the weekend with the lid closed.

- The motherboard has design issues: a missing protection diode on the headphone-jack microphone input ended up frying the CPU due to a ground loop. Meanwhile the T14 has eaten the same ground loop, and even 48 V passive PoE in an accident, and dealt with it by rebooting. A T450 from 2015 is still running.


Interesting, I own an E14 and it charges with a 12 V PD profile from a stock Ugreen power bank. Maybe they differ across models?

Spoiler: they are all identical hardware, but marketed differently.

It's incredibly ugly; part of the value of Apple was aesthetics. It's also distracting: when I want to focus on something, I want distracting elements to get out of the way.

I don't know, the majority of my colleagues have no idea how to do anything in a greenfield environment. They need guardrails.


I don't get the fight against estimates. An estimate is an estimate. An estimate can be wrong. It likely is wrong, that's fine, it doesn't have to be perfect. There is a confidence interval. You can communicate that.

Very often something like "6-12 months" is a good enough estimate. I've worked in software a long time and I really don't get why many people think it's impossible to give such an estimate. Most of us are developing glorified CRUD apps, it's not rocket science. And even rocket science can be estimated to a usable degree.

Really you have no idea if feature X is going to take 1 day or 1 year?


I think there is a range of situations causing anti-estimate sentiment. Each team is different, and each team has some combination of these, at different levels:

- manager’s refusal to acknowledge any uncertainty

- unclear requirements/expectations/system under change

- changing requirements

- negative consequences (“penalties”) for inaccurate estimates

I’m now also in the environment where I give ranges, but not everybody is.


> It likely is wrong, that's fine

It's almost never fine, though. When it's fine, people aren't pressured into giving estimates.

> It likely is wrong, that's fine

The most you can do is say it. Communication demands effort from all involved parties, and way too many people in a position to demand estimates just refuse to put any effort into it.


You've never had a manager or product person take estimates, even clearly communicated as low confidence or rife with unknowns, as gospel truth? Lucky you.


Engineer: “It will take me two days [of work].” Sales: “We will have your fix ready in three calendar days [today + 2].”

Actual work that week gives the employee 3 hours of non-meeting time, each daily meeting adds 0.5 hours of high-urgency administrative work, and on Fridays we have a mandatory all-hands town hall…

Repeat that cycle for every customer-facing issue, every demo-facing issue, and every internal political issue, and you quickly drive deep frustration and back-talking.

I think there’s a fundamental truth: no one in their right mind, not even a motivated engineer, actually hears anything but calendar days when given a “days” estimate. It’s a terrible misrepresentation almost all the time, and engineers do a disservice when they yield to pressure to deliver such estimates outside the broader planning process.

Project schedules should be the only place that time commitments come from, since they’re informed by actual resource availability.


For me the worst part is that I (and they) don't fully know what the person asking me for the estimate wants me to build, and usually the fastest way to find out is to just build the thing.


But very often the CI operations _are_ the problem. It's just YAML files with unlimited configuration options, very limited documentation, and no LSP of any kind.


Personally I think this is an extreme waste of time. Every week you're learning something new that is already outdated the next week. You're telling me AI can write complex code but isn't able to figure out how to properly guide the user into writing usable prompts?

A somewhat intelligent junior can dive deep for one week and reach the knowledge level it took you roughly 3 years to build.


No matter how good AI gets, we will never be in a situation where a person with poor communication skills can use it as effectively as someone whose communication skills are razor sharp.


But the examples you've posted have nothing to do with communication skills, they're just hacks to get particular tools to work better for you, and those will change whenever the next model/service decides to do things differently.


I'm generally skeptical of Simon's specific line of argument here, but I'm inclined to agree with the point about communication skill.

In particular, the idea of saying something like "use red/green TDD" is an expression of communication skill (and also, of course, awareness of software methodology jargon).


Ehhh, I don't know. "Communication" is for sapients. I'd call that "knowing the right keywords".

And if the hype is right, why would you need to know any of them? I've seen people unironically suggest telling the LLM to "write good code", which seems even easier.


I sympathize with your view on a philosophical level, but the consequence is really a meaningless semantic argument. The point is that prompting the AI with words that you'd actually use when asking a human to perform the task, generally works better than trying to "guess the password" that will magically get optimum performance out of the AI.

Telling an intern to care about code quality might actually cause an intern who hasn't been caring about code quality to care a little bit more. But it isn't going to help the intern understand the intended purpose of the software.


I'm not making a semantic argument, I'm making a practical one.

> prompting the AI with words that you'd actually use when asking a human to perform the task, generally works better

Ok, but why would you assume that would remain true? There's no reason it should.

As AI starts training on code made by AI, you're going to get feedback loops as more and more of the training data is going to be structured alike and the older handwritten code starts going stale.

If you're not writing the code and you don't care about the structure, why would you ever need to learn any of the jargon? You'd just copy and paste prompts out of Github until it works or just say "hey Alexa, make me an app like this other app".


I'm going to resist the temptation to spend more time coming up with more examples. I'm sorry those weren't to your liking!


Why do you bother with all this discussion? Like, I get it the first x times for some low x, it's fun to have the discussion. But after a while, aren't you just tired of the people who keep pushing back? You are right, they are wrong. It's obvious to anyone who has put the effort in.


Trying to have a discussion with people who aren't actually interested in being convinced is exhausting. Simon has a lot more patience than I do.


It's a poorly considered hobby.

It's also useful for figuring out what I think and how best to express that. Sometimes I get really great replies too - I compared ethical LLM objections to veganism today on Lobste.rs and got a superb reply explaining why the comparison doesn't hold: https://lobste.rs/s/cmsfbu/don_t_fall_into_anti_ai_hype#c_oc...


I like debate as much as the next guy(almost). Your patience is either admirable or crazy, I'm not sure which.


Neither am I!


Yes and no. Knowing the terminology is a short-cut to make the LLM use the correct part of its "brain".

Like when working with video, if you use "timecode" instead of "timestamp", it'll use the video production part of the vector memory more. Video production people always talk about "timecodes", not "timestamps".

You can also explain the idea of red/green testing the long way without mentioning any of the keywords. It might work, but just knowing you can say "use red/green testing" is a magic shortcut to the correct result.

Thus: working with LLMs is a skill, but also an ever-changing skill.


"There is considerable overlap between the intelligence of the smartest bears and the dumbest tourists."

At some point you'll just have to accept the tool isn't for everyone =)

