gignico's comments

Too bad HN nicknames can't be put up for sale :)

So LLMs are destroying the economy and the environment but at least “catastrophic risk” is still low. Ok then…


Off topic, but I sincerely ask: am I the only one who is disturbed by the use of the term "Mac OS X" to refer to modern versions of the OS that is currently called "macOS"? (and not "MacOS" either)

I mean, the name was changed ten years ago...


It wasn’t even “Mac OS X” ten years ago, but “OS X”. “Mac OS X” was 15 years ago.


I don't think the "Mac" in "Mac OS X" was ever officially dropped; I think it was like Xerox: we just stopped saying "Mac" and called it "OS X", even though the X stood for 10, from 10.0 through 10.15. This is also why Microsoft tried the Windows 10 game and alluded to never having a Windows 11, which obviously changed.

Personally, I wish Apple would stop creating a new OS yearly. Most of the features users actually use were born in the System 7/8/9 days.


Looks like the submitter used the wrong term - the actual link uses "macOS".

But to answer the question: Yes! I opened it thinking it was going to be some awesome Leopard or Lion app.

I miss the name, mostly because the OS was interesting and fun in those days. Now it's boring, dreary, buggy, low contrast, poor UX, all squircles and flat colours, if it even has colours any more.


Is it really worth it? YMMV, but yes if you ask me.


For reasons, I used to go to Rome quite frequently in the 2010s, and the construction of Metro C was already a meme. But now some of the stations are quite interesting indeed.


You know the meme: "they found ruins of Metro C while building Metro C".


As a casual Rome enthusiast, give us the lore drop.


Positively surprised to see stuff like this on the HN front page!

If any author is around, do you have an implementation that can be compared with CUDD and similar BDD libraries?


Hi, author here! Also positively surprised to see this on HN haha

We (well, mainly Guy, if he's around) are working on an implementation, which will be made open source at some point (still rounding off the edges a bit). We have very encouraging preliminary results; it compares well against SDD and CUDD. There are still some ideas we would like to try, specifically for model counting.
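
For anyone who wants to poke at this kind of comparison in the meantime, here's a minimal sketch (not the authors' code; it assumes the Python dd package, whose optional dd.cudd module wraps CUDD) of building a BDD and doing a model count:

    from dd.autoref import BDD   # pure-Python backend; dd.cudd wraps CUDD

    bdd = BDD()
    bdd.declare('x', 'y', 'z')
    u = bdd.add_expr('(x & y) | ~z')   # BDD for a toy formula
    print(len(bdd))                    # node count: one size metric to compare
    print(bdd.count(u, nvars=3))       # number of satisfying assignments (5)

Swapping the import for dd.cudd gives the same API on top of CUDD, which makes side-by-side timing fairly painless.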


Thanks!

The arXiv submission says the paper was submitted to SAT26; did it get accepted?


We are still waiting for the reviews. The rebuttal phase should start soon, we'll see!


> Running Windows 3.1 in True Color Full HD

People from that time would be astonished by the hardware we have now, yet bloated software gobbles up every ounce of performance. What a waste! </granny>


I've a Compaq Armada E500, and it runs Windows 98 fairly swiftly with its PIII processor and 256MB RAM. I've also a 2009 MacBook, and it runs Snow Leopard like a dream, yet with "only" 2GB of RAM. Either of these machines could do nearly anything I ask of a PC today: programming, web browsing, comms, gfx edits, even some gaming, all while feeling snappier, with less shite flying in my eyes ("notifications" and their wretched noises) as I work.

Someone will explain the business and economic reasons to me, but it all flies over my caveman brain, which just asks: "why does bashing rock feel slower?"


The problem with these comparisons is often that the old OS doesn't actually do the same things modern software does. Smoothly rendering a GIF/mp4/webm in a chat channel will bring that Windows 98 machine to its knees. Even complex software like web browsers did a lot less work on these older machines. They were also often a lot slower, as modern SSD load times are closer to old RAM than to the hard drives of the time.

I can imagine that your particular workload doesn't require all those bells and whistles, and I think it's probably true that running only the bare minimum software, like you would back in the day, is horrifically inefficient on modern operating systems. But, at the same time, kernels don't crash as often, disk encryption is actually a thing now, file downloads are no longer expressed in kilobits per second, and much prettier screens render much smoother media for a fraction of the performance impact.

Of course there are inefficiencies that could be fixed (like how chat apps are skins around browsers now), but a lot of the efficient software from back in the day cost an arm and a leg to build. In the end, the software industry found out that customers are happier to pay when you deliver new features fast than when you deliver them later (still running on the old hardware, though the customer may have already replaced that hardware by the time you release your feature).

With current prices for RAM and other system components, I hope companies will once again feel the pressure to build for limited hardware. Then again, when I look at the hardware developers are lugging around, I highly doubt things will change quickly enough.


> Smoothly rendering a GIF

Animated GIF is a format that was designed for playback on late 1980s PCs with a 20 MHz 386 and VGA graphics…

If anything, this example proves the point that we’ve made the simple stuff much too complex. The GIF format hasn’t changed, but somehow getting those indexed color frames to screen on time now requires a GHz core.


GIF playback should be efficient but...

About twenty years ago I was generating long animated GIFs. They worked fine in Firefox. In Internet Explorer they started fine but became jankier as playback progressed. I realised that every time IE displayed a frame, it was rereading the entire file from the beginning to get to the current frame, which took longer and longer as playback advanced.

It's just so easy to squander performance without noticing.
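
A toy illustration of the difference, sketched with Pillow (anim.gif is a placeholder; this is not IE's actual code): stepping to every frame from scratch does O(n^2) decode work in total, while keeping decoder state does O(n).

    from PIL import Image

    im = Image.open("anim.gif")

    # naive player (the IE behaviour): re-decode from frame 0 for every frame shown
    for n in range(im.n_frames):
        im.seek(0)
        for _ in range(n):
            im.seek(im.tell() + 1)   # decode forward one frame at a time

    # stateful player: advance one frame per tick, O(n) decodes in total
    im.seek(0)
    for _ in range(im.n_frames - 1):
        im.seek(im.tell() + 1)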


The reason you need a GHz core is that modern GIFs stretch the file format to its limits, doing 30 or even 60 fps in heavily coloured files whose resolutions easily beat the entire render resolution of a 1980s PC in just a little corner of the screen.

GIF is an awful format for its modern usage; it will easily waste tens of megabytes on even a short, small file. That's why many services quietly convert GIF files and serve them as video, or as more efficient animated formats (such as WebP).
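
Some back-of-the-envelope numbers (the resolutions and frame rates here are illustrative assumptions, not from the thread): even at one byte per pixel of indexed colour, the raw frame data alone is an order of magnitude apart.

    # raw indexed-colour data rate, 1 byte per pixel, before LZW compression
    def rate_mb_per_s(w, h, fps):
        return w * h * fps / 1e6

    print(rate_mb_per_s(320, 200, 10))  # late-80s full screen: ~0.6 MB/s
    print(rate_mb_per_s(480, 270, 60))  # a modern "small" meme GIF: ~7.8 MB/s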

The difference in opinion between "the simple stuff" and "missing the bare basics" seems to come down to what year you were born and what kind of services you grew up with. I don't need 90% of what Discord has to offer me, but when reading along with discussions of Discord users looking for alternative platforms (fleeing age verification and such), I find that most Discord users will absolutely demand features I didn't even know chat apps supported.


Things are slow, aren't they? I feel there was a lot less lag in old operating systems and software.

I use two editors now: VS Code as a full IDE when I want to code heavily, and a homemade FLTK-based editor with just basic syntax colouring for writing notes and doing quick things.


I use a 27-year-old Pentium II laptop with Windows 98 for a hobby project. And I keep asking myself: why does this thing feel so fast? And it could be even faster if I replaced the HDD with a CF card.


I believe Win9x (and the rest of the DOS-based Windows) has lower input latency than the NT-based ones, largely due to a simpler architecture with shorter codepaths.

Here's a related article: https://hackernews.hn/item?id=16001407


But there's a big qualitative difference in UX interaction latency between Win 2k and XP. 2k appeared to have a background thread do all the UI work, while XP and later did not. NT 3.1, 3.5x, and 4.0 (without Active Desktop) all appeared pretty responsive too. ME, XP, Vista... felt really slow.

In terms of apparent responsiveness, Win 3.1x, NT <4, and 2k felt the fastest.


I’m sure my privacy-conscious setup would STILL transmit > 56kbps of telemetry on average, despite my best efforts.


Can you install Office 4.3 from floppies faster than installing the latest Windows updates? Not being a recent Windows user, I'm actually kinda curious. I'm sure the CD version would install faster.


Evolution is not a process toward better quality of life or life expectancy for individuals. As long as enough individuals reach the age to procreate in their environment, evolution is done. Evolution didn't train our bodies to reject the diseases we already have vaccines for either, so your reasoning would apply to smallpox as well. And what about viruses that appeared after Homo sapiens evolved (such as HIV)?


I don't think it works like that, from my recollection of the uni courses I did 20 years ago.

Even a small advantage like 1% will quickly propagate in a population, because it's about advantage over 1,000s of generations.
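
A toy haploid selection model (the numbers are illustrative assumptions, not from any source) shows how fast a 1% edge compounds:

    # allele frequency p under a constant 1% fitness advantage s
    s, p, gens = 0.01, 1e-4, 0
    while p < 0.99:
        p = p * (1 + s) / (1 + s * p)   # one-generation selection update
        gens += 1
    print(gens)   # ~1,400 generations to go from 0.01% to 99%

On evolutionary timescales, ~1,400 generations is the blink of an eye.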

That this disease defence CAN be turned on means that, at some point, some people must have had a genetic mutation that turned it on.

As the GP pointed out, it must therefore be a net negative from an evolutionary standpoint.

I also suspect the cost is calorific consumption, as someone else said, so it might be OK.

However, there are other plausible explanations. For example, there are medical conditions that result from an overly aggressive immune system, and keeping the defence off could instead be reducing the chance of those occurring.


The problem is the implication that "if evolution did not do it, there must be a reason", because 1) it makes evolution look like an engineer evaluating trade-offs, which it is not, and 2) it treats the current state of affairs as the final "product", which it is not. For example, flowers did not exist in the Jurassic, so somebody looking at what evolution had done up to then would say "if evolution did not invent flowers, then we'd better not do it". But of course that's absurd.

Also, as I said, evolution is not a process towards a goal. There are 8 billion people in the world, which shows Homo sapiens is quite fit for its environment, so the pressure to evolve further features is quite low.


> pressure to evolve further features is quite low

I'm really sorry, but you're misunderstanding how evolution works.

Worth reading something like The Selfish Gene if you want to understand it a bit better.

There are always reproductive pressures and there are always genetic variations.

Modern civilization and medicine have simply changed what the pressures are.

As an example, if a genetic variation occurred tomorrow which gave resistance to spermicide, within 100 generations that variant would probably be quite successful and prevalent in the human population.


I know about reproductive pressure and I've read The Selfish Gene. What you say is correct, but it does not explain the "if evolution did not do it, better not do it" attitude of the original comment, which I think is wrong for many reasons, as I've written.


I would say you are both right: with two competing variables (on-time for the defence vs. calorie consumption), and with infectious disease and malnutrition being the main causes of death before procreation in pre-modern times, I would expect some equilibrium to be reached. And we have not had much time to evolve since caloric scarcity became a solved problem for large swaths of the Western world.

If in the future we could trade a few hundred extra calories per day for a great immune system (without auto-immune side effects) we would have found a nice cheat code!


Thinking about your point: I bet we do not know whether some people have it on or not. It feels like something that would have to be specifically investigated.


Until you need to do more than all-or-nothing parsing :) See tree-sitter, for example, or any other LSP implementation with efficient incremental parsing.


It is easily possible to parse at > 1MM lines per second with a well designed grammar and handwritten parser. If I'm editing a file with 100k+ lines, I likely have much bigger problems than the need for incremental parsing.


It's not just speed - incremental parsing allows for better error recovery. In practice, this means that your editor can highlight the code as-you-type, even though what you're typing has broken the parse tree (especially the code after your edit point).
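
For the curious, here's a minimal sketch of what that looks like with tree-sitter's Python bindings (assuming a recent py-tree-sitter plus the tree_sitter_python grammar package; the byte offsets are for this toy edit only):

    import tree_sitter_python as tspython
    from tree_sitter import Language, Parser

    parser = Parser(Language(tspython.language()))
    src = b"def f(x): return x + 1\n"
    tree = parser.parse(src)

    # the user types "y" inside the parameter list: "f(x)" -> "f(xy)"
    new_src = b"def f(xy): return x + 1\n"
    tree.edit(start_byte=7, old_end_byte=7, new_end_byte=8,
              start_point=(0, 7), old_end_point=(0, 7), new_end_point=(0, 8))

    new_tree = parser.parse(new_src, tree)  # reuses unchanged subtrees
    print(new_tree.root_node.has_error)     # False here; a half-typed edit would
                                            # still yield a tree, with ERROR nodes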


I don’t know if someone has said it already, but when Steve Jobs used this famous quote (“reports of my death are greatly exaggerated”), he died maybe just a couple of years later.

Hope this does not happen to code :)


Mark Twain.

