Hacker News | new | past | comments | ask | show | jobs | submit | kaashif's comments

Apple is not catering to minimum salaries in poor countries. Does this really need to be explained?

$3499 is definitely enthusiast compatible. That's beefy gaming PC tier, which is possibly the canonical example of an enthusiast market.

This isn't tens of thousands of dollars for top tier Nvidia chips we're talking about.


Seems I misunderstood what an "enthusiast" is. I thought it was about someone "excited about something", but it seems the typical definition includes having a lot of money too, my bad.

I'm an immigrant to Canada, and yes, English has both literal meanings and colloquial meanings.

In the most literal meaning, absolutely, "Enthusiast" just means a person who likes something, is excited about something.

When it comes to markets and products though, you'll typically see the word "Enthusiast" as a mid-tier - something like: Consumer --> Enthusiast --> Professional (there may be words like "Prosumer" in there as well, etc. :)

In that context, which is typically the one people use when discussing product pricing and placement, an "Enthusiast" is somebody who, yes, enjoys something, but is into it enough to be discerning and capable of purchasing mid-tier or above hardware.

So while a consumer photographer may use their phone or a compact or all-in-one camera, an enthusiast photographer will probably spend $3000 - $5000 on camera gear. Equivalently, there are myriad gamers out there (on phones, consoles, GeForce Now, whatever :), but an enthusiast gamer is assumed to have a dedicated gaming computer, probably a tower, with a dedicated video card, likely say a 5070 Ti or above, probably 32GB+ RAM, a couple of SSDs which are not entry level, etc.

Again, this is not to say a person with limited budget is "not a real enthusiast", no gatekeeping is intended here; simply, if it may help, what the word means when it comes to market segmentation and product pricing :)


Additionally, "enthusiasts"/"hobbyists" tend to be willing to spend beyond practical utility, while professionals are more interested in pragmatism, especially in photography from what I can tell.

If you're an actual pro, you need your stuff to work properly, efficiently, reliably, when it's called for. When you're a hobbyist, it's sometimes almost the goal to waste money and time on stuff that really doesn't matter beyond your interest in it; working on the thing is the point, not the value it generates. Pros should spend money on good tools and research and knowledge, but it usually needs to be an investment, sometimes crossing over with hobbyist opinions.

A friend of mine who's a computer hobbyist and retail IT tech, making far far less than I do, spends comically more than me on hardware to play basically one game. He keeps up to date with the latest processors and all that stuff, he knows hardware in terms of gaming. I meanwhile—despite having more money available—have a fairly budget gaming PC that I did build myself, but contains entirely old/used components, some of which he just needed to get rid of and gave me for free, and I upgrade my main mac every 5 years or something. I only upgrade when hardware is really getting in my way.


>> So while a consumer photographer may use their phone or a compact or all-in-one camera, an enthusiast photographer will probably spend $3000 - $5000 on camera gear.

It's interesting that you chose photographers as the example here. In many cases that I've seen, enthusiast photographers spend much more than professional photographers on their gear, because the professionals make their money with their gear and therefore need to justify it, while the enthusiasts are often tech people, successful doctors, etc., who spend lots and lots of money on their hobbies...

In any case, your point stands, that "enthusiast" computer users would easily spend $3-4K or more on gear to play games, train models, etc.


$3.5k is a lot of money, but not a ton by American hobby standards. It's easy to spend multiples, even orders of magnitude more than that on hobbies like fishing, wine, sports tickets, concerts, scuba, travel, being a foodie, golf, marathons, collectibles, etc.

It's out of reach for lots of people, even in developed countries. But it's easily within reach for loads of people that care more about computing than other stuff.


I live in America, I am very well compensated. Have been for 15 years now. $3500 is a lot of money. A lot. There is a tiny bubble of us tech folks who think it is accessible to most people. It is not. It is also the same reason Macs are still a niche. Don't take your circles to be the standard, it is very very far from it, especially if you think $3500 is not a lot of money.

It is easy to confirm this, just look at the sales number of these $3500 devices. It is definitely not an enthusiast price point, even in the US.


It's not nothing for most people... it's more than a month of rent/mortgage for a significant number of Americans even. But if it's your primary hobby, it's not completely out of reach, and it's not something you necessarily spend every year. A lot of people will upgrade to a new computer every 3-5 years and maybe upgrade something in between those complete system upgrades.

I know plenty of people who don't make a lot of money (say top 25% or so) who have a boat or RV that costs more than a $3500 computer, yet balk at the thought of spending that much on a computer. It just depends on where your interests are.


The first words I said: "$3.5k is a lot of money..."

There are tens of millions of top 10% income adults in America. So something can be both unaffordable to most people, and also easily accessible to very many people.


It’s a midrange to upper expense in the US if it’s your hobby. Most people don’t have a serious computer hobby but they golf, trade ATVs, travel, drink, etc.

There are something like 24 million millionaires in the United States... Estimates are that Americans spent $157 billion on pets in 2025.

There are a lot of people who could easily choose to spend $3,500 on a computer.


There is no Apple device priced above $3k that has done 1 million units in annual sales. The US population is >300M; that's <0.3% of the population. Don't take your bubble to be representative of society. $3500 is a lot of money, even in the US.

$3500 would have been 3–4 months' discretionary spending as a PhD student in Finland 15 years ago. A sum you might choose to spend once a year on something you find genuinely interesting.

Some people succumb to lifestyle creep or choose it deliberately. Others choose to live below their means when their income grows. The latter have a lot more money to spend on extras, or to save if that's what they prefer.


Mac has about 15% of the market share in the US. It's not really a niche.

$3500 is more than I would spend on a hobby too, but there are, in absolute terms, a large number of Americans who can spend this much on their hobbies.


In June 1977, the base Apple II model with 4 KB of RAM was $1,298 (equivalent to about $6,900 in 2025), and with the maximum 48 KB of RAM it was $2,638 (equivalent to about $14,000 in 2025).

(Source: Wikipedia via Claude Opus)


Wow, 48k for $14000. Now you can get a MBP with a million times more memory for $3500 or so. Whereas that CPU was clocked at 1 MHz, so CPUs are only several thousand times faster, maybe something like 30,000 times faster if you can make use of multi-core.
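Those rough ratios can be sanity-checked with back-of-envelope arithmetic (the modern specs used here — 48 GB of RAM, a ~3 GHz clock, 10 cores — are illustrative assumptions, not figures from the thread):

```python
# Back-of-envelope comparison of a 48 KB Apple II (1 MHz) against an
# assumed modern MacBook Pro spec: 48 GB RAM, ~3 GHz clock, 10 cores.
apple_ii_ram_kb = 48
apple_ii_clock_mhz = 1.0

mbp_ram_kb = 48 * 1024 * 1024  # 48 GB expressed in KB
mbp_clock_mhz = 3000.0         # ~3 GHz per core
mbp_cores = 10

ram_ratio = mbp_ram_kb / apple_ii_ram_kb          # ~1 million
clock_ratio = mbp_clock_mhz / apple_ii_clock_mhz  # ~3,000
multicore_ratio = clock_ratio * mbp_cores         # ~30,000

print(f"RAM:        {ram_ratio:,.0f}x")
print(f"Clock:      {clock_ratio:,.0f}x")
print(f"Multi-core: {multicore_ratio:,.0f}x")
```

Clock speed alone understates the real speedup (IPC, caches, SIMD all improved), so even the ~30,000x multi-core figure is conservative.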

I'd argue that some of those are more consumption and activity than hobby, depending on how they're engaged with, and that people use the word "hobby" too loosely, but I would agree that Americans in particular consume at obscene rates.

Golf equipment, mountaineering equipment, skiing and snowboarding lift tickets and gear, a single excessive graphics card that's only used for increasing frame rates marginally, or basically a single extra feature on a car, are all things that accumulate quite quickly. Some are clearly more superfluous than others and cater to whales, while some are just expensive by nature and aren't attempting to be anything else.


Those are the prices for just buying equipment, which at least retains some kind of value. 3 million+ American kids are enrolled in competitive soccer with annual club dues between $1K and $5K, and that money is just gone at the end of the year. Basically none of those kids are going to have a career in soccer, so it's clearly a hobby, and everyone knows it. And soccer isn't even the most popular sport!

Ya, I guess that's another category entirely. The cost of enrolling a kid in anything, potential travel involved, etc.

An enthusiast in the hobby space is by definition someone willing to pour much more money into whichever hobby we are talking about than someone less enthusiastic about it.

Well, and who also has a bunch of money, not just the willingness. I guess locally we don't really make that distinction, as two other commenters here pointed out, which is why I had to update my local understanding of "enthusiast". Usually we use it for how engaged/interested a person is, regardless of how much money they can or are willing to spend.

Learned something new today at least, so that's cool :)


Yes, when tech gear is sold as "enthusiast" gear, it is almost invariably the most expensive non-professional tier of equipment. That is roughly the common understanding: expensive and focused on features more than on the security required for professional use, while remaining within reach of at least some individuals, not only corporations.

In a hobby where there are (strong) HW requirements, it mostly takes for granted you have money to shell out for your hobby, indeed.

For an individual making median income in the US, it would cost 2% of your income to get a machine like this every 4-5 years. That's a matter of enthusiasm, not a matter of having a lot of money. Sorry that income is less where you are, but the people talking about the product tier are using American standards.
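As a sketch of that 2% figure (assuming a US median individual income of roughly $40,000/year — an assumed round number, not one given in the thread):

```python
# Share of income a $3,500 machine represents over a 4-5 year
# upgrade cycle, at an assumed $40,000/year median individual income.
price = 3_500
median_income = 40_000

for years in (4, 5):
    share = price / (median_income * years)
    print(f"{years}-year cycle: {share:.1%} of income")
```

Both cycle lengths land around the 2% of income the comment cites.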

$1200 as the minimum salary covers probably 70% of Europe by population?

The Neo has enough power to do small LLM testing and pretty much anything else a bit slowly, and costs $600?

Neo tops out at 8GB RAM. What LLM are you going to run there? Functiongemma?

You can absolutely do some ML inference on it, but not much in terms of LLMs.
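A rough way to see the 8 GB constraint is to estimate weight memory as parameters × bits per parameter (a sketch only; it ignores the KV cache, activations, and OS overhead, which push the practical limit lower):

```python
# Approximate memory needed just for LLM weights at various
# quantization levels, versus an 8 GB RAM ceiling.
def weight_gb(params_billion: float, bits_per_param: int) -> float:
    """GB of memory for the model weights alone."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for params in (3, 7, 13):
    for bits in (16, 8, 4):
        gb = weight_gb(params, bits)
        fits = "fits" if gb < 8 else "does not fit"
        print(f"{params}B @ {bits}-bit: {gb:.1f} GB ({fits})")
```

So a 4-bit 7B model's weights (~3.5 GB) fit with some headroom, while anything much larger leaves little room for the rest of the system.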


Maybe, but that does not mean that the Mac Studio is not very expensive hardware even for rich first world countries.

Did you need to add "poor"? Unless Apple isn't catering to the US.

The Goodhart's law effect there seems obvious - rather than code getting better, you might just become less rigorous in your reviews and stop commenting as much. You may not even realize your standards are dropping.

Yeah, that's pretty appalling.

Regardless of how good the philosophy of something is, if it's as niche and manpower constrained as OpenBSD is then it's going to accumulate problems like this.


I actually think this isn't even surprising from OpenBSD philosophically. They still subscribe to the Unix philosophy of old, more so than FreeBSD and much, much more than Linux.

That is, "worse is better" and it's okay to accept a somewhat leaky abstraction or less helpful diagnostics if it simplifies the implementation.

This is why `ed` doesn't bother to say anything but "?" to erroneous commands. If the user messes up, why should it be the job of the OS to handhold them? Garbage in, garbage out. That attitude may seem out of place today but consider that it came from a time when a program might have one author and 1-20 users, so their time was valued almost equally.


> That attitude may seem out of place today

It absolutely doesn't. Everywhere I've worked we were instructed to give terse error messages to the user. Perhaps not a single "?", but "Oops, something went wrong!" is pretty widespread and equally unhelpful.


It is normal to return a terse message to a remote user via an API. The remote user may be hostile, actively trying to gather information useful for breaking in.

But the local user who operates pf is already trusted, normally it would be root.

In either case, no error should be silently swallowed. Details should be logged in a secure way, else troubleshooting becomes orders of magnitude harder.


> That attitude may seem out of place today

That attitude was out of place at every point in time. It was excusable when RAM and disk space were scarce; it isn't today, when it has nothing but drawbacks.


Code size would balloon if you tried to format verbose error messages. I often look at the binaries of old EPROMs, and I notice that 1) ASCII text is a big fraction of the binary, and 2) the messages are still just categories ("Illegal operation"). For the 1970s, we're talking user programs that fit in 2K.

I write really verbose diagnostic messages in my modern code.


There was also an implicit saving back then that an error message could be looked up in some other system (typically, a printed manual). You didn't need to write 200 chars to the screen if you could display something much shorter, like SYS-3175, and be confident that the user could look that up in the manual and understand what they're being told and what to do about it.

IBM were experts at this, right up to the OS/2 days. And as machines got more powerful, it was easy to put in code to display the extra text by a lookup in a separate file/resource. Plus it made internationalization very easy.


Even in that scenario that attitude seems out of place, considering a feature is implemented once and used many times.

Is this an expertly crafted joke response or an extremely depressing authentic AI sales pitch response?


The latter, see GP's comment history.


Banned now.


> you just can’t play StarCraft that much better than the best humans

I could not disagree with this more.

Just the perfect micro part means that computers have a far higher ceiling than humans.

No, it is not possible in theory for humans to have perfect micro with thousands of APM!

We're talking about hundred unit zergling swarms perfectly dodging tank shells. Hundreds of APM at multiple locations on the map. Perfect timing and placement for every order.

This is like saying an aimbot wouldn't make a top CS pro much better.


Having written AI systems for Robocode bots 15 years ago, I can say they perform at such a higher level than humans that there is no way, given all the time in the world, a human can compete with a full statistical targeting and movement system. We just don't think in that way.


> We're talking about hundred unit zergling swarms perfectly dodging tank shells.

Exactly the reference I was thinking of https://www.youtube.com/watch?v=IKVFZ28ybQs


I think the "you" they refer to there is the hypothetical other skilled human, not a computer. The wording is confusing but I think they're just saying that the human players will reach a ceiling with each other (they then contrast this with real life where the ceiling is always moving). That whole paragraph is a bit muddy with the point it's trying to make.


I bet if people could star repos anonymously those porn repos would have more stars.


Unless they invent kernel as a service or undertake a remarkably ambitious AI license laundering project, I think you're right.


Anthropic is deemed a betrayer and a supply chain risk for actually enforcing their principles.

OpenAI agrees to be put in the same position as Anthropic.

It seems like you must actually somehow believe that history will repeat itself, Hegseth will deem OpenAI a supply chain risk too, then move to Grok or something?

There's surely no way that's actually what you believe...


> We'll still get full self-driving electric cars and robots next year too.

I've taken a Waymo and it seemed pretty self driving.


Not that 1. Wink.


Money can be exchanged for services.

Hope this helps.

