I'm paying $20 for Codex and $90 for the Claude Max plan. They are a "pry from my cold dead fingers" product for me.
IMO if someone last tried this tech 6 months ago, or their only exposure is, e.g., via MS Copilot, they do have a rational reason for skepticism. No technology of this complexity has improved this rapidly in my memory (well, ok, we had the CPU speed races from the '90s to the early 2000s).
The CPU speed race might be the most apt comparison I've yet heard.
From the 80486 to the AMD Athlon 64 X2, much of that progress was enabled by better EDA tools running on the more powerful CPUs produced with each improvement.
Now, we have better models helping to create even better models.
How about if they plateau but prices skyrocket? Most companies would pay, but if you're not working for a company that does, what's the line beyond which you'd think twice about paying for it yourself? $500? $1000? $1500?
Let's say they have already plateaued. But hardware continues to get better, right? So tokens should go down in price, not up. Since margins on inference are already 50%+ today, better hardware would allow them to generate more tokens for less money.
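The reasoning above can be sketched as a toy calculation. All numbers here are illustrative assumptions, not figures from the thread:

```python
# Toy model: if the gross margin on inference is already positive,
# cheaper hardware lowers the break-even price per token.
# Every number below is an assumption for illustration only.

price_per_mtok = 10.00          # assumed price charged per million tokens ($)
cost_per_mtok = 5.00            # assumed serving cost per million tokens ($)
margin = 1 - cost_per_mtok / price_per_mtok
assert margin == 0.5            # the "50%+ margin" premise

# If next-gen hardware serves tokens at, say, 60% of today's cost...
new_cost = cost_per_mtok * 0.6  # $3.00 per million tokens
# ...the vendor can cut prices 40% and keep the same relative margin:
new_price = new_cost / (1 - margin)
print(new_price)  # 6.0 -- tokens get cheaper, not more expensive
```

The point is just that with a positive margin, falling hardware costs leave room for prices to fall rather than rise, unless demand or market power pushes the other way.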
I would pay $500 to start, build stuff with it, then keep going up the tiers as the stuff I'm building makes money.
I mean technically they did, with Windows on NT and again with WOW64. Vista was also a huge redesign, moving from 1990s-era NT to a lot of the new technologies they'd built for Longhorn.
If Unity were to ship a platform-native replacement for a WPF equivalent (hell, or even WinForms), it would become a really enticing app development platform.
Aren't these pretty much the most trivial UI apps possible? Compared to other native apps like Photoshop, Blender, Visual Studio, or Office, CRUD is mostly just about banging together a custom UI frontend for a database.
Unity's editor is implemented in its own (old) UI system, same with Godot, so in both engines it's possible to create 'traditional' non-game UI applications.
A Unity expert can correct me, but IIRC (possibly wrongly) at least the following limitations apply:
For example, Unity does not have accessibility features (screen readers etc.), and I don't think it's DPI-aware. I would _guess_ it does not support platform fonts. Not sure if it supports non-Latin text shaping like Arabic. Etc. etc.
I want to have a computer with a stable, vendor-supported OS so _I can do my stuff_, not tweak OS-level configs.
I _don't_ want to spend my time playing OS systems programmer.
The OS is a _component_. Like the wifi driver. I think it's great that some people love developing wifi drivers, but personally I just want a network that just works, because there are a billion other cool things you can do with a computer.
Similarly, I want an OS that just works! Without asking me to do anything! Because _I don't really care_. (I mean, I care that it works, but I expect the engineers actually developing an OS offering to have a far better idea than I do of what a good, stable default config for the system is.)
This is exactly why modern Windows is problematic. macOS is better. The right Linux distro (e.g. Fedora Silverblue) on the right hardware (e.g. a ThinkPad T series) also just works™; this is basically the same kind of limitation as with macOS.
I wish they issued a Windows Rock Stable edition. Ancient as rocks (Win7 look, or maybe even WinXP look), every known bug fixed, every feature either supported fully, or explicitly not supported. No new features added. Security updates issued regularly. It could be highly popular.
IMHO I disagree, but it depends on your point of view, so this is not "you are wrong" but "in my view it's not like that".
I think it’s the role of the software vendor to offer a package for a modern platform.
Not the role of OS vendor to support infinite legacy tail.
I don’t personally ever need generational program binary compatibility. What I generally want is data compatibility.
I don’t want to operate on my data with decades old packages.
My point of view is either you innovate or offer backward compatibility. I much prefer forward thinking innovation with clear data migration path rather than having binary compatibility.
If I want 100% reproducible computing, I think the viable options are open source or super-stable vendors, and in the latter case one can license the latest build. Or using Windows, which mostly _does_ support backward binary compatibility, and I agree it is not a useless feature.
Software shouldn't rot. If you ignore the cancer of everything as a subscription service, algorithms don't need to be tweaked every 6 months. A tool for accounting or image editing or viewing text files or organizing notes can be written well once and doesn't need to change.
Most software that was ever written was done so by companies that no longer exist, or by people (not working for a software company) no longer associated with the company they wrote the tool for. In many of these cases the source is not available, so there is no way to recompile it or update it for a new platform, but the tool works as well as ever.
Mac works great out of the box. Linux can do whatever you want if you put some work into it. Windows sits kind of in the middle, and it turns out for a lot of people that's a comfortable spot even with its trade-offs.
They would still need to develop new drivers for new hardware, which could cause issues. But yes, the situation you describe would be much more stable than Win11.
Open source is a supply-chain-specific issue, and consumers don't care about the supply chain.
Anyone with any illusions about this: quickly name the top vendor for the third item in the ingredient list of the first thing with an ingredient list you get your hands on. (For me it's usually food. Who is the main vendor for citric acid? Or sugar? Or that red dye that supposedly causes ADHD? I have no clue.)
General consumers could not care less about open source.
Well, if you buy only one pair it does work really nicely with all Apple kit. So you get really nice cinematic sound from Apple TV (for my non-prosumer ears), and effortlessly can switch between phone, laptop etc. The sound is really good for video calls.
They just work.
I mean there are other pieces of kit that probably just work as well but with these you don't need to do market research.
It's surprising how non-trivial even _adequate_ sound still is in 2026, and that's what these are guaranteed to give in any situation, IMHO.
If you have only one Apple device, there's probably no selling point as such.
The Apple Beats Studio Pro should satisfy this reasoning for $170 (on Amazon; $350 on apple.com, which I guess explains the AirPods Max pricing), and the battery lasts twice as long. I have two near my Apple TV just so everything plays nicely together.
Yeah I basically don’t trust anyone but Apple for wireless audio because every time I’ve tried allegedly-good non-Apple Bluetooth audio devices, they’ve been a ton worse, so bad I ended up barely using them.
In this case these are more expensive than I’d pay for headphones, but that just means I won’t have any Bluetooth headphones in this form factor. Been down that road before, non-Apple was a frustrating waste of money.
I mean FFS my AirPods are worse on Windows and Linux than in the Apple ecosystem, but are still better than the non-Apple ones I’ve tried, even there. It’s not even just the home-field advantage.
Windows and Bluetooth is a really difficult combination. The problem is that when Windows detects a microphone on the Bluetooth headphones, it switches from the high-quality playback profile (A2DP) to the hands-free profile, which (a) allows the mic through but (b) makes the audio sound horrible. So disabling the mic sometimes helps there.
The 2000s tech bubble was caused, among other things, by over-investment in infrastructure and technology that had no users yet.
Totally different setup.
Does not mean the AI boom will not turn to bust, but weak analogies generally don't help with understanding complex systems.