ACS_Solver's comments

This is one of the most interesting questions to me about human brains, and as far as I know no significant progress has been made in answering it.

Some people appear to have a capacity for learning, retention and understanding that is well outside the normal range. People like Ramanujan, von Neumann, or Tao. They learn at a speed far exceeding that of students we would consider gifted, they reach a deep and intuitive understanding of the material, and they go on to make many discoveries or inventions, any one of which would be enough for an ordinary scientist to be considered successful.

It seems there is something very different about their minds, but just what is it that allows those minds to operate at such a level?


Could be something even more difficult to identify that keeps everyone else from doing it.

I'm sure Apple has data showing that their extreme lockdown strategy is good for their business, but I feel like I'm one of the potential customers Apple could gain if they didn't have it.

They're a fantastic hardware company. But my admittedly very limited experience with Apple software, from iPad to their streaming service website, has been miserable. The UX doesn't work for me; the software just doesn't do what I want. That's understandable: Apple very much designs their software around a particular workflow they come up with. If you like that workflow it's great; for someone like me it's miserable. But I would gladly buy their hardware if I could freely run an OS of my own choosing.


I doubt that any company actually cares, at the C-suite level, about what any of the myriad metrics they collect mean. I mean, maybe some do, but I think it is unlikely. I bet 9/10 times someone just makes a decision about how things "ought" to be and then that's the way it is going forward.

The assumption that this is a triangulated and well-researched strategy doesn't match my experience in the "real-job" world. I mean, maybe Apple is different because of their history, but I am not convinced anyone listens to anyone who articulates math ideas beyond Algebra outside of some niche specialties, because they don't understand it. And it's not that I'm some math god - I mean, that's what I studied, but there are people SO much more knowledgeable and capable, and they seem to get ignored too.

Like, I'm sure the guy who runs an insurance company listens to the actuaries about relative risk, but mostly, what I've seen is someone makes a decision, and then finds post hoc ergo propter hoc rationales for why it was a good decision down the line when they have to account for their choices.

For instance, it took me like a year at my old job, but I finally got most of the KPIs we were using to set strategy cancelled. The data we were using to generate those KPIs? Well, in a few cases, after you seasonally differenced it, the data was no different from white noise. No autocorrelation whatsoever. In ALL the cases the autocorrelation was weak, and it all evaporated after a month or two. You could MAYBE fit an MA model to it, but that seemed dodgy to me. And like, I'm not a major expert - I took one time series class in grad school, and frankly, time series is kind of hard. But management had ZERO idea what I was talking about when I was like, "hey, I don't think these numbers actually mean anything at all? Did anyone run an ACF?"
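For anyone curious, the kind of sanity check I mean is easy to sketch. This is my own illustrative toy, not the actual KPI data: a pure-Python sample ACF compared against the usual approximate 95% white-noise band of ±1.96/√n.

```python
import math
import random

def acf(x, nlags):
    """Sample autocorrelation for lags 1..nlags."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    denom = sum(v * v for v in d)
    return [sum(d[i] * d[i + k] for i in range(n - k)) / denom
            for k in range(1, nlags + 1)]

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(500)]

# For white noise, the sample autocorrelations should mostly fall inside
# the approximate 95% confidence band of +/- 1.96 / sqrt(n).
band = 1.96 / math.sqrt(len(noise))
outside = sum(1 for r in acf(noise, 12) if abs(r) > band)
print(outside)  # count of lags outside the band; ~0-1 expected for white noise
```

If your KPI series, after seasonal differencing, looks like this, the month-to-month wiggles are just noise and the KPI isn't telling you anything.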

Then each month someone higher up the chain would ask, "why is this number low?" And then they'd go search through the reams of data they had to come up with an answer that plausibly explained things. Was the number particularly "low"? No, it was within expected statistical noise thresholds; you are probably going to have at least one number out of whack every 20 cycles or so... You still had to spend an hour in a meeting coming up with reasons for why it was low that went beyond "ummm, well, this is kind of random, we'd expect to see this sort of thing once or twice every couple of years, and we won't know if it's a trend for a few more months."
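That "one number out of whack every 20 cycles" intuition is just the false-positive rate of a 95% band. Back-of-the-envelope (my numbers, not any actual dashboard's):

```python
# If each monthly KPI reading independently has a 5% chance of landing
# outside its "expected" band, the flags are just base-rate noise.
p = 0.05

# Expected number of cycles between false alarms for a single metric:
print(1 / p)  # 20.0

# With, say, 10 metrics on a dashboard, the chance at least one
# flags in any given month:
print(1 - (1 - p) ** 10)  # ~0.40
```

So on a ten-metric dashboard, something "looks wrong" about four months out of ten even when nothing is happening.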

Anyway, this is a long anecdote to explain why I have no confidence that most companies do any sort of actual introspection. CEO creates targets and underlings build models that show how they're meeting or not meeting those targets. Now, hilariously, with Apple in particular I might be wrong, because in Tim Cook's defense, I'm pretty sure his education is in Industrial Engineering? So if any CEO is thinking about that stuff, it's him. Still, I am totally and completely unimpressed with the C-Suite sort of thinkers.

They're not dumb - like, I've never really had a straight-up dumbass manager outside of shitty lower jobs or small mom-and-pop businesses? But I have seldom seen a company that actually cared about the numbers - most say they do, but most just use those numbers to justify decisions they've already made.

Am I just unlucky? Am I the witch in church here?


This comment expresses how it feels to work in a corporate environment better than anything I've ever seen on this site.


This is validating, thanks.

The environment is why I quit my job and started working for myself in January. I hated it. And not to sound like an arrogant ass, because there were a LOT of people way smarter than me at $PREVIOUS_EMPLOYER, but having to have meetings to set our meetings, having to explain things that aren't statistically meaningful to people who don't understand stats anyway, and getting code reviews (when I could get them scheduled) from dudes who hadn't touched a keyboard in 5 years was... soul-sucking? I'm not doing that anymore. Or ever again.

I mean, maybe it's because I had a more hands-on, blue-collar-adjacent job before I got into tech? Maybe it's because I'm a fool and couldn't play the game of "pretend to work and look busy." But - and I know this might be kind of messed up - I really like not having to explain things in a series of emails to people other than the customers. I really like not having to answer to anyone but myself and my customers. If I want to do something, well, I just do it now? That's a nice place to be. Riskier for sure, but I think the prior environment would have killed me, so maybe not.

Also, I have time to do shit that's interesting? Who would have guessed how much more time I'd have in the day when I didn't have 4.5 hours of meetings per day? Hell, I'm taking 2 classes at the university for fun (weird, right?!) - I never could have done that before because I would have had to make a slide deck for Thursday's All-Hands or whatever and couldn't have missed the SUPER IMPORTANT MEETING that Jake has on the schedule that he'll show up for unprepared or just not show up to at all.

Nah, the hell with that. I'm never going back.


We are all witches holding our breath in the Church of the Line


The Arch wiki is far better than most man pages. I've referred to it for my own non-Arch systems and when building Yocto systems. Most Arch info applies.

In the ancient days I used TLDP to learn about Linux stuff. Arch wiki is now the best doc. The actual shipped documentation on most Linux stuff is usually terrible.

GNU coreutils have man pages that are at least correct and list all the flags, but they suffer from GNU jargonisms and usually lack any concise overview or examples section. Most man pages are a very short description of what the program does plus an alphabetical list of flags. For something as versatile and important as dd, the entire description reads "Copy a file, converting and formatting according to the operands", and not a single example of a full dd command is given. Yes, you can figure it out from the man page, but it's like an 80s reference manual, not good documentation.

man pages for util-linux are my go-to example for bad documentation. Dense, require a lot of implicit knowledge of concepts, make references to 90s or 80s technology that are now neither relevant nor understandable to most users.

Plenty of other projects have the typical documentation written by engineers for other engineers who already know the material. man pipewire leaves you completely in the dark as to what the thing even does.

Credit to systemd, that documentation is actually comprehensive and useful.


Proton is amazing and it's really three different subprojects that deserve a lot of credit each.

First is Wine itself, with its implementation of Win32 APIs. I ran some games through Wine even twenty years ago but it was certainly not always possible, and usually not even easy.

Second is DXVK, which fills the main gap in Wine, namely Direct3D compatibility. Wine has long had its own implementation of the D3D libraries, but it was not as performant and, more importantly, it was never quite complete. You'd run into all sorts of problems because the Wine implementation differed from the native Windows D3D, and that was enough to break many games. DXVK is a translation layer that translates D3D calls to Vulkan with excellent performance, and it basically solves the problem of D3D on Linux.

Then there are the parts original to Proton itself. It applies targeted, high-quality patches to Wine and DXVK to improve game compatibility, brings in a few other modules, and most importantly glues it all together so it works seamlessly and with excellent UX. From the first release of Proton until recently, running Windows games through Steam took just a couple extra clicks to enable Proton for that game. And now even that isn't necessary: Proton is enabled by default, so you run a game just by downloading it and launching it, the same exact process as on Windows.


Yes. The unique point of ReactOS is driver compatibility. Wine is pretty great for Win32 API, Proton completes it with excellent D3D support through DXVK, and with these projects a lot of Windows userspace can run fine on Linux. Wine doesn't do anything for driver compatibility, which is where ReactOS was supposed to fill in, running any driver written for Windows 2000 or XP.

But by now, as I also wrote in the other thread on this, ReactOS should be seen as something more like GNU Hurd. An exercise in kernel development and reverse engineering, a project that clearly requires a high level of technical skill, but long past the window of opportunity for actual adoption. If Hurd had been usable by say 1995, when Linux just got started on portability, it would have had a chance. If ReactOS had been usable ten years ago, it would also have had a chance at adoption, but now it's firmly in the "purely for engineering" space.


"ReactOS should be seen as something more like GNU Hurd. An exercise in kernel development and reverse engineering, a project that clearly requires a high level of technical skill, but long past the window of opportunity for actual adoption."

I understand your angle, or rather the attempt to fit them into the same picture somehow. However, the differences between them far surpass the similarities. There was no meaningful user base for Unix/Hurd to speak of, compared to the NT kernel's. There's no real basis to apply the "kernel development" argument to both: one was indeed a research project, whereas the other is a clean-room engineering march toward replicating an existing kernel. What ReactOS needs to succeed is to become more stable and complete (on the whole, not just the kernel). Once it does that, covering the later Windows capabilities will be just a nice-to-have. Considering all the criticism the current version of Windows receives, switching to a stable and functional ReactOS, at least for individual use, becomes a no-brainer. Comparatively, there's nothing Hurd can do to get to where Linux is now.


I'd still consider them more similar than not.

Hurd was not a research project initially. It was a project to develop an actual, usable kernel for the GNU system, and it was supposed to be a free, copyleft replacement for the Unix kernel. ReactOS was similarly a project to make a usable and useful NT-compatible kernel, also as a free and copyleft replacement.

The key difference is that Hurd was not beholden to a particular architecture, it was free to do most things its own way as long as POSIX compatibility was achieved. ReactOS is more rigid in that it aims for compatibility with the NT implementation, including bugs, quirks and all, instead of a standard.

Both are long irrelevant to their original goals. Hurd because Linux is the dominant free Unix-like kernel (with the BSD kernel a distant second), ReactOS because the kernel it targets became a retrocomputing thing before ReactOS could reach a beta stage. And in the case of ReactOS, the secondary "whole system" goal is also irrelevant now because dozens of modern Linux distributions provide a better desktop experience than Windows 2000. Hell, Haiku is a better desktop experience.


"And in the case of ReactOS, the secondary «whole system» goal is also irrelevant now because dozens of modern Linux distributions provide a better desktop experience than Windows 2000. Hell, Haiku is a better desktop experience."

Yet there are still too many desktop users who, despite the wishful thinking or the blaming, still haven't switched to either Linux or Haiku. No matter how good Haiku or the Linux distributions are, their incompatibility with existing Windows software simply disqualifies them as options for those desktop users. I bet we'll see people switching to ReactOS once it gets just stable enough, long before it gets as polished as either Haiku or any given quality Linux distribution.


No, people will never be switching to ReactOS. For some of the same reasons they don't switch to Linux, but stronger.

ReactOS aims to be a system that runs Windows software and looks like Windows. But, it runs software that's compatible with WinXP (because they target the 5.1 kernel) and it looks like Windows 2000 because that's the look they're trying to recreate. Plenty of modern software people want to run doesn't run on XP. Steam doesn't run on XP. A perfectly working ReactOS would already be incompatible with what current Windows users expect.

UI-wise there is the same issue. Someone used to Windows 10 or 11 would find a transition to Windows 2000 more jarring than one to, say, Linux Mint. ReactOS is no longer a "get the UI you know" proposition; it's now "get the UI of a system from twenty-five years ago, if you even used it then".


"UI wise there is the same issue. Someone used to Windows 10 or 11 would find a transition to Windows 2000 more jarring than to say Linux Mint. ReactOS is no longer a «get the UI you know» proposition, it's now «get the UI of a system from twenty five years ago, if you even used it then»." "A perfectly working ReactOS would already be incompatible with what current Windows users expect."

That look and feel is the easy part; it can be addressed if it's really an issue. The hard part is the compatibility (limited by the many still-missing parts) and stability (the still-defective parts). The targeted kernel matters, of course, but it is not set in stone. In fact, there is Windows Vista+ functionality being added and written about here: https://reactos.org/blogs/investigating-wddm - although doing it properly would mean rewriting the kernel, bumping it to NT version 6.0.

I'm sure there will indeed be many users who find various ReactOS aspects jarring for as long as there are still defects, lack of polish, or dysfunction at the application and kernel (driver) level. However, considering the vast pool of Windows desktop users, it's reasonable to expect ReactOS to cover the limited needs of enough users at some point, which should bring attention, testing, polish, and funding to address anything still lacking, which in turn should further feed the adoption and improvement loop.

"No, people will never be switching to ReactOS. For some of the same reasons they don't switch to Linux, but stronger."

To me, this makes sense, maybe, for the corporate world. The reasons that made corporations stick with Windows have less to do with familiarity or application compatibility (given that a lot of corporate infrastructure is in web applications). Yes, there must be something else that governs corporate decisions, something to do with the way corporations function, and that will most likely prevent a switch to ReactOS just as it did to Linux-based distributions. But this is exactly why I intentionally specified "for individual use" when I said "switching to a stable and functional ReactOS, at least for individual use, becomes a no-brainer". For individual use, the reason that prevented people from switching to Linux is well known, and ReactOS's reason for being was aimed exactly at that.


> There was no meaningful user-base for Unix/Hurd so to speak of compared to NT kernel.

Sure, but that userbase also already has a way of using the NT kernel: Windows. The point is that both Hurd and ReactOS are trying to solve an interesting technical problem but lack any real reason to be used over the alternatives, which solve enough of the practical problems for most users.


While I think better Linux integration and improving WINE are probably better uses of the time... I do think there's some opportunity for ReactOS, but I feel it would have to at LEAST get to pretty complete Windows 7 compatibility (without the bug fixes since)... that seems to be the last Windows version most people remember relatively fondly, and a point before they really split-brained a lot of the configuration and settings.

With the contempt for a lot of the Win10/11 features, there's some chance it could see adoption, if that's an actual goal. But the effort is huge, and it would need to be ready for wide desktop installs sooner rather than later.

I think a couple of the Linux + WINE UI options, where the underlying OS is Linux and WINE is the UI/desktop layer on top (not too dissimilar from DOS/Win9x), might also gain some traction... not to mention distros that smooth out the use of WINE for new users.

Worth mentioning a lot of WINE is reused in ReactOS, so that effort is still useful and not fully duplicated.


> I do think there's some opportunity for ReactOS, but I feel it would have to at LEAST get to pretty complete Windows 7 compatibility

That's not going to happen in any way that matters. If ReactOS ever reaches Win7 compatibility, that would be at a time when Win7 is long forgotten.

The project originally had a target of Windows 2000 compatibility, later changed to XP (which is a relatively minor upgrade kernel-wise). Now, as of 2026, ReactOS has limited USB 2.0 support and wholly lacks critical XP-level features like Wi-Fi, NTFS, or multicore CPU support. Development on the project has never been fast, but somewhere around 2018 it dropped even more; just looking at the commit history, there's now half the activity of a decade ago. So at current rates, it's another 5+ years away from beta-level support of NT 5.0.

ReactOS actually reaching decent Win2K/XP compatibility is a long shot but still possible. Upgrading to Win7 compatibility before Win7 itself is three plus decades old, no.


Maybe posts like this will move the needle. If I could withstand OS programming (or debugging, or...) I'd probably work on ReactOS. I did self-host it, which I didn't expect to work, so at least I know the toolchain works!


Basically, if you do the math, it means a whole generation got tired of the project and moved on to something else, and there is no new blood to make up for that.

The story of most FOSS projects after they've been running for a while.


ReactOS has been very slow to develop, and probably missed the point where it could make an impact. It's still mostly impossible to run on real hardware, and their beta goal (version 0.5 which supports USB, wifi and is at least minimally useful on supported hardware) is still years away. But I never had the impression that gaming was a particularly important focus of the project.

ReactOS is mostly about the reimplementation of an older NT kernel, with a focus on driver compatibility. Their ultimate goal is to be a drop-in replacement for Windows XP such that any driver written for XP would work. That's much more relevant to industrial applications where some device is controlled by an ancient computer because the vendor originally provided drivers for NT 5.0 or 5.1 which don't work on anything modern.


> But I never had the impression that gaming was a particularly important focus of the project.

> ReactOS is mostly about the reimplementation of an older NT kernel, with a focus on driver compatibility. Their ultimate goal is to be a drop-in replacement for Windows XP such that any driver written for XP would work. That's much more relevant to industrial applications where some device is controlled by an ancient computer because the vendor originally provided drivers for NT 5.0 or 5.1 which don't work on anything modern.

Fifteen years ago, they could have focused on both the industrial and consumer use cases. There were a lot of people who really didn't want to leave Windows XP in 2010-11, even just for their personal use.

Admittedly, FLOSS wasn't nearly as big of a thing back then as it is now. A larger share of GNU/Linux and BSD installs were on servers at the time, so it was a community mainly focused on commercial and industrial applications. Maybe that's what drove their focus.


It functionally is a project from fifteen-to-twenty years ago. Development activity was somewhat slow but steady until it largely fizzled out around, I think, 2018. The project tried to get political and financial support from the Russian government but failed to secure it, Aleksey Bragin transitioned to working in the crypto space, and of course with every year the number of potential users dependent on Windows 2000/XP decreases.

I think by now ReactOS is best viewed as an enthusiast research / challenge project with no practical use, like GNU Hurd. Just as Hurd is interesting in terms of how kernels can be done, but isn't a viable candidate for practical use, ReactOS is now in the same category. Very interesting as an exercise in reimplementing NT from scratch using clean room techniques but no longer a system that has a shot at gaining any adoption.


> That's much more relevant to industrial applications where some device is controlled by an ancient computer because the vendor originally provided drivers for NT 5.0 or 5.1 which don't work on anything modern.

In most of those applications, you just leave the computer be and don't touch it. In some cases (especially medical devices) you may not even be allowed to touch it for legal/compliance reasons. If the hardware dies, you most likely find the exact same machine (or something equivalent) and run the same OS - there are many scenarios where replacing the computer with something modern is not viable (lack of the correct I/O interfaces, computer is too fast, etc.)

If there were software bugs which could impact operations, they probably would have arisen during the first few years when there was a support contract. As for security issues - you lock down access and disconnect from any network with public internet access.

All that assumes that ReactOS is a perfect drop-in replacement for whatever version of Windows you are replacing, and that is probably not a good assumption.


In my experience, things like ReactOS would have been more useful in parts of the world with let's say a less thorough approach to things like compliance.

A factory has a CNC machine delivered fifteen years ago that's been run by the same computer all along. The computer eventually gives up the ghost, the original IT guy who got the vendor's drivers and installed them on that computer with an FCKGW copy of WinXP is long gone. Asking the current IT guy, the easiest solution (in a hypothetical timeline where a usable ReactOS exists) is to take the cheapest computer available, install ReactOS, throw in drivers from the original vendor CD at the bottom of some shelf and call it a day.


We might have to agree to disagree here, but I think the scenario where the IT guy uses XP and "finds" a license for it is the approach I would take in this situation. If the vendor of the CNC machine certified/tested their machine against Windows XP and does not offer support for newer operating systems, I would be very reluctant to use anything else - whether another version of Windows that could accept the same drivers, or an open-source clone. Again, I'm assuming that ReactOS manages to be a perfect clone, which may or may not be the case in practice.


Kryptos K4 seems to me like a potential candidate for AI systems to solve if they're capable of actual innovation. So far I find LLMs to be useful tools if carefully guided, but more like an IDE's refactoring feature on steroids than an actual thinking system.

LLMs know (as in, have in their training data) everything about Kryptos: the first three messages and how they were solved, including failed attempts; years of Usenet / forum messages and papers about K4; the official clues. They know about the World Clock in Berlin, including things published in German, and they can certainly write Python scripts that replicate any viable pen-and-paper technique in milliseconds, and so on.
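As a toy illustration of what I mean by "trivially scriptable" (my own sketch, nothing to do with K4 itself): the classical Vigenère tableau behind K1/K2 takes a few lines. The real K1/K2 actually use a "KRYPTOS"-keyed alphabet, which I've omitted for brevity.

```python
# Toy Vigenere over the plain A-Z alphabet. The actual K1/K2 ciphers use
# a keyed "KRYPTOS" alphabet; this is the plain-tableau simplification.
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    out = []
    for i, c in enumerate(text):
        shift = ord(key[i % len(key)]) - ord('A')
        if decrypt:
            shift = -shift
        out.append(chr((ord(c) - ord('A') + shift) % 26 + ord('A')))
    return ''.join(out)

# Round-trip sanity check, using the opening words of the K1 plaintext
# and its keyword PALIMPSEST.
ct = vigenere("BETWEENSUBTLESHADING", "PALIMPSEST")
print(vigenere(ct, "PALIMPSEST", decrypt=True))  # BETWEENSUBTLESHADING
```

The mechanical part is a solved problem; what's missing is the idea of which mechanism to try, and that's exactly what LLMs haven't produced.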

Yet as far as I know (though I don't actively follow K4 work), LLMs haven't produced any ideas or code useful to solving K4, let alone a solution.


Yeah, you would suspect that the individual elements of solving K4 exist in some LLM, but so far the LLM slop answers are just very confident and very wrong.

My biggest complaint is that the users aren’t skeptical. They don’t even ask the LLM to verify if the answer it just generated matches the known hints from the puzzle artist. Beyond that, they don’t ask it to verify whether the decryption method actually yields the plaintext it confidently spit out.

I’m super impressed with Claude Code, though. For my use case, planning and building iOS app prototypes, it is amazing.


Technically, there has only been one fatal accident in space: the Soyuz 11 failure, which killed the crew of three. It occurred above the Kármán line; all other spaceflight-related fatalities were at much lower altitudes or on the ground.


The other one that comes close is Columbia which broke up at around 60 km.


Surely AGI would be matching humans on most tasks. To me, surpassing humans on all cognitive tasks sounds like superintelligence, while AGI "only" needs to perform most, but not necessarily all, cognitive tasks at the level of a human highly capable at that task.


Personally, I could accept "most" provided that the failures were near misses as opposed to total face-plants. I also wouldn't include "incompatible" tasks in the metric at all (but using that to game the metric can't be permitted either). For example, the typical human only has so much working memory, so tasks which overwhelm it aren't "failed" so much as "incompatible". I'm not sure exactly what that looks like for ML, but I expect the category will exist. A task that utilizes adversarial inputs might be an example.


Superintelligence is defined as outmatching the best humans in a field, but again, on all cognitive tasks, not just a subset.

AI can already beat humans in pretty much any game like Go or Chess or many videogames, but that doesn't make it general.


Thanks, I'll see about an on-page zoom. On my 1440p display the whole table fits even with the side drawer open, and with my webdev inexperience I didn't even think of zoom controls beyond the browser's.

I love your slide puzzle too. Very cool with different hint levels, where you can have just the element symbol or the full name as well. Surely trivial for chemists but not so for me.


Thanks! My brother is a chemical engineer so he's about the only one who can come close to solving the puzzles. :)

Also really like how the Timeline fades out the elements to filter by year.

Hmmm... I wonder if it's a HiDPI scaling thing? I tried the site in a couple of browsers (Safari, Chromium) and even on a 4K monitor it only fits Hydrogen to Nitrogen.

