Why Darwin failed (2006) (synack.net)
93 points by jckahn on May 31, 2022 | 126 comments


Apple still regularly releases large portions of macOS. Confused about some poorly documented keychain behavior? Just look at the source, e.g.,

https://opensource.apple.com/source/Security/Security-59754.... https://github.com/apple-oss-distributions/Security/blob/Sec...

Access to such source code is not nothing, and certainly seems more than just a marketing gimmick.


They were always releasing code; that wasn’t the problem.

They clearly never helped to make it a usable standalone OSS product, at all.

Just look at what the author said about breaking backwards compatibility upstream, with no regard for non-Mac Darwin.

It’s melodramatic to call what Apple did a “marketing stunt”, but that code on its own is pretty useless.


Similar behavior is what turned me off of Swift. I still think it’s a nice language to write, but I tried to adopt it on Linux and the tooling situation was horrendous.

Lots of things were under-supported or under-documented, and every release would break things in unexpected ways.

I don’t know if it was intentional, or a lack of priority, but I got tired of having to work so hard just to get my programs to compile.


My own anecdotal experience is that it's not that much better on Apple's platforms. Documentation may have gotten the tiniest bit better, but there's still way too much information missing. And Apple's platforms are a constantly moving target. It may turn out tomorrow that the work required to keep up is just too much. Too many incentives, IMO, to not develop for these platforms if one can avoid it.


> Too many incentives, IMO, to not develop for these platforms if one can avoid it.

C# had this issue for a long time as well. Any desktop app support outside of Windows is abysmal.

These languages don’t really bring anything groundbreaking to the table anyway. They just coat familiar concepts in syntax sugar so you’ll stay hooked on their platform.

It’s much more sensible to write web apps and have them run in Electron or I guess alternatives, but you didn’t need me to say that.


Yeah, I was thinking the same thing: that having access to source code is really useful for developing and debugging, regardless of whether there is a community around the project.

I've been experimenting with ONVIF to create an IP camera that can be integrated with Windows without the user having to install any additional software. I almost never touch Windows and I must say that the experience has been very frustrating because things do not behave according to spec and I can't look at the source code to see what's actually happening.

Access to source code makes a world of difference. Otherwise, you're left guessing and the quality of your product will suffer because you end up doing "whatever works" under your circumstances. Under different circumstances that can break.


News flash: big company only does things in its interest.

When Apple was trying to escape from its near death experience (the iMac era) they were super enthusiastic about open standards (JPEG, MP3, iCal, et al.). When I say "super enthusiastic" I mean to the point of Jobs touting that on stage.

Of course they also paid Adobe and Microsoft for access to some of their very popular formats. They paid for USB and made one of the first, and as is usual for them, most strict and compliant, implementations. They were probably the first consumer 802.11 company. As much as possible they wanted to talk to as many things as possible.

As they clawed their way back those standards were still OK (and still are -- Apple is actually pretty good at supporting open standards, better than most in my experience) but they didn't care as much about going off in their own direction again, and making those their defaults.


Exactly. Underdog companies always try to differentiate themselves with standards and openness and talk about how friendly they are compared to the big giants, and then close themselves off once they become #1. I have personally seen this happen so many times. And it's happening right now with AMD as they're slowly overtaking Intel. And expect to see Intel scramble soon about how they're all geeks at heart and love open-source and standards.


It's the tech corporate version of "When you are stronger than me, I plead for freedom because it is according to your principles. When I am stronger than you, I take away your freedom because that is according to my principles."


They may have paid lip service to openness a few times, but very seldom (apart from their legal obligations) openness of source. They did very broadly embrace interoperability when it suited them, and it’s increasingly suited them except in a few key competitive areas. And they’ve quietly relented in areas where they were overly aggressive in resisting for very little business purpose (eg AV codecs).

I’d be surprised if, in a perfect universe of their imagining, Apple wouldn’t fully embrace open source… because they don’t see much value in software other than integration. Their resistance to standards and openness now is almost entirely about protecting trade secrets. Which definitely has flaws! But it’s a substantial difference from Apple in the early 2000s where they couldn’t coherently tell a story about what was public or why.


In some sense, having a UNIX-based OS with stuff like OpenGL and PDF was that openness.

You won't find any of it in the classic Mac OS predecessors, nor on the Copland roadmap and early developer betas.


Can you define "very seldom (apart from their legal obligations) openness of source"?

There are major OSS projects that were started by Apple, and plenty that weren't started by, but definitely get the bulk of their engineering from Apple, in public repositories.


Apart from Swift I can’t think of an example of this. But I do think Swift is a great example of how Apple would embrace OSS if they had the right incentives.


Clang is another, iirc.


Nope, LLVM was started at the University of Illinois, and clang only came into the picture because Apple wasn't going to get a ticket for the GPL v3 GCC ride.

Had it not been for the license change, clang would never have happened.


Clang fits the second half, that Apple contributes a ton to it.

CUPS is another; it was OK but not great until Apple took it in and made it work with a ton of devices, and also made vendors wake up and support it. Now Windows has a much worse printing environment, and would do well to just adopt CUPS.


Nope, one of the reasons why clang is lagging behind everyone in C++20 support is that C++17 seems to be good enough for Apple's purposes regarding their use of C++, and they withdrew resources from it.

Also they don't contribute everything upstream, like the stable bitcode format used in watchOS, or the SafeC used in iBoot.

Apple hired the CUPS developer; that is why. Again, it was not started by them, and they had to obey the license.

Windows a worse printing environment than CUPS? What a joke.


Well, if Apple didn't contribute to Clang, it would have C++20 support, because it would be developed by all those developers who aren't Apple. How can Apple be responsible for its deficiencies and not its development?


> Clang fits the second half, that Apple contributes a ton to it.

So where is that ton of contribution?


I know you were around way back then so you probably should remember it took a bit to bring clang to feature parity with gcc. There was a lot of excitement that Apple was involved in the work, and it really did get there to the point where FreeBSD, among others, could switch to clang full time.


Of course, they needed to get rid of GCC.

The point is what contributions they are doing in 2022.


Clang was started by, and primarily developed by, Apple. Why Apple started Clang isn't relevant - because there is no reason they had to make it OSS. Clang also wasn't only a licensing thing - gcc doesn't work as a general tooling platform.

Similarly libc++, etc.


It surely is relevant, because it wasn't out of love for FOSS like many are advocating.

GCC originally meant GNU C Compiler and was renamed to GNU Compiler Collection exactly because it became a general platform to write compilers for the GNU ecosystem.

The irony that now GCC is far ahead of clang in ISO C++ support shows how much Apple and others care about upstream.


> they were super enthusiastic about open standards (JPEG, MP3, iCal, et al)

Let’s not forget that Jobs promised iChat would be opened for third parties to interoperate with, and never did.


iChat originally used the AIM ("Oscar") protocol.

https://en.wikipedia.org/wiki/OSCAR_protocol

Later it added XMPP and Yahoo IM support.

https://en.wikipedia.org/wiki/Messages_(Apple)#macOS_version

I am not some Apple-booster here. I use Franz, myself. But it was open. I don't have anything newer than 10.14 here so I can't talk about later versions.


iChat supported jabber


Oh, like FaceTime!

D:


One big example was OpenGL adoption. Nowadays the FOSS folks get salty about Metal, yet the pre-Jobs-revival Apple was doubling down on QuickDraw 3D and QuickDraw VRML; they couldn't care less about OpenGL.


No, the iMac era was not relevant to the business failure of Apple. The issue was that they were built on PPC, and IBM, et al, were utterly failing to provide hardware that was in any way competitive from a price perspective, and for the last few years of that architecture they weren't even providing performance competitive hardware at any price.

As far as hardware goes, Apple could reasonably charge a premium for a lot of Mac hardware: they were known for high-end LCDs and speakers, and they were ahead on USB, DVD, FireWire (...). But people don't care about the architecture of the CPU, they care about the speed it can do something, and PPC CPUs cost more than Intel or AMD by a very large margin (I've always assumed a whole-number factor?), and towards the end of the era PPC was also slower than the competition at any price. I'm not a business person, and obviously know nothing about Apple's financial or business plans, but I can't imagine that having to pay a huge premium for the most expensive part of a Mac (one that was also both slower and hotter than the competition) did anything but chew up whatever profit margin was achievable.

Paying licensing fees for popular formats is an obvious thing for a personal computer company to do - that Ubuntu, or some of the other "end user" distros, never did just seems petulant.


Were IBM CPUs really magnitudes more expensive than Intel/AMD CPUs, or were they just packaged in Macs that had their own premium tacked on (in part because of the integration of cutting-edge standards like USB and WiFi when Wintel vendors were years behind, admittedly)? I was reminiscing the other day about the common early-aughts meme of PC builders on forums always talking about how they could build a PC that was twice as powerful as a Mac for nearly half the cost :), but if anything the price of the IBM PowerPC CPUs was an abstract mystery because consumers only ever saw them in Macs (until the Xbox 360, anyway).


> common early-aughts meme of PC builders on forums always talking about how they could build a PC that was twice as powerful as a Mac for nearly half the cost

Why do you need to reminisce about this? These exact posts show up on every Apple hardware release to this day lol


The Apple Silicon CPUs are like twice as fast as the offerings from AMD and Intel, and barring the MCM tech and speed of Apple Silicon, the otherwise comparable x86_64 stuff costs almost the same as the Apple stuff, except for the Mac Pro.


That’s actually a really interesting question. As I think about it, all I can produce is “everyone knows they were more expensive”, which is questionable in terms of veracity :D

I’ll see if I can find some actual figures


I spent a bunch of time searching, and all I can find is parts on ebay today. Everything of the era is just people saying that everyone knows they cost more than intel parts. Nothing I found actually backs up the claim with real numbers, so?


Joel had an article on his "blog" - way back in 2002 - about something like this.

https://www.joelonsoftware.com/2002/06/12/strategy-letter-v/

Back then, it was not really much about Apple, but about Microsoft and Sun. But you get the picture.


> When Apple was trying to escape from its near death experience (imac era)

The "imac era" https://en.wikipedia.org/wiki/IMac_G3#History

Steve Jobs reduced the company's large product lines immediately upon becoming Apple's interim CEO in 1997 ... The company announced the iMac on May 6, 1998


The “era” definitely began then, but it didn’t have much impact other than cultural until a few years later. Really it should be called the iPod era, but even that lagged a while as Apple gave up on mainstreaming FireWire.


The iMac really did keep apple from falling down the well and dying. They were busy throwing whatever they could into the fight, and (to mix metaphors) tossing overboard anything that didn’t help. This was the era of iTunes, iDVD and iEverythingElse. The iPod was at first something to plug into that nascent iEcosystem. It began a period of growth.


Exactly my point. The iMac was a product for years before any of these other efforts. It did help, but it took all of these other products (most importantly the iPod, and eventually iTunes along with it) to really make it more than a flash in the pan. By the time it was effective, the iMac had been repositioned from an entry level computer to a status symbol (as displayed in many product placement ads), and eventually the non-workstation pro Mac. It’s iconic mainly because it was a pivotal historical point for Apple, not because it actually achieved that pivot.


> Of course they also paid adobe and Microsoft for access to some of their very popular formats.

[[citation needed]]


> Prior to Mac OS X, people would have been laughed out of the building for having a Mac, but now, with Mac OS X and the magic of open source, it was suddenly OK or even cool.

To push a bit further, a bunch of us switched to OSX not from Windows but from FreeBSD or Linux. Hardware support for open source OSes was abysmal, and getting commercial software to run on even Linux was death by a thousand cuts at best.

On the work side, anything "not windows" was also just not an option, and we'd code on windows and remote to unix servers as needed.

Introduction of Darwin really created a space where Unix could live in personal and corporate environments.


> Introduction of Darwin really created a space where Unix could live in personal and corporate environments.

100% this. I am a huge Linux lover but also really appreciate having the option at work to use MacOS due to the Unix userspace it provides.

That being said, I would pretty much hate MacOS without the open source tooling (Nix, Homebrew, Yabai, SKHD, iTerm, etc.) I can bootstrap onto it.


I get an awful lot of mileage out of the Xcode command line tools. Most of the things you need to emulate a CI pipeline locally are there, though the exceptions can occasionally be a bit of a problem.
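For anyone curious what that looks like in practice, here's a rough sketch of getting and sanity-checking the command line tools; the commands are standard Apple/Unix CLIs, but which checks matter is just my assumption:

    # Install the standalone command line tools (no full Xcode required)
    xcode-select --install

    # Confirm which developer directory is active
    xcode-select -p

    # A few of the usual CI suspects ship with the tools
    xcrun --find clang    # compiler toolchain
    git --version         # Apple's bundled git
    make --version        # make for build scripts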


The insistence on having to install the Xcode command line tools (hello, "be at 99% installed for 10 hours" bug I have had over 5 different machines) is very annoying. But I've found that brew has been a major mess as well.

Nothing like coming in on a Monday and having to debug 10 coworkers' machines cuz suddenly python and postgres were silently upgraded (or some random shared lib was upgraded).

"Just use Docker" is also a PITA cuz of the abysmal file system performance stuff... I just want to run software! Or rather, I want my coworkers who might not be as well versed in system software to be able to have working systems.

All that being said, it's undeniable that I can daily drive Linux thanks to Mac stuff existing.


I've found that asdf has replaced a lot of my brew installs. It's great to be able to pin tool dependencies per project. I used, and liked, brew for a long time, but my tooling these days is almost entirely MacPorts, asdf-vm, and Docker + docker-compose (I'm also intrigued by podman).
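A minimal sketch of the per-project pinning described above, assuming the relevant asdf plugins exist for your runtimes (the plugin names and version numbers here are just illustrative):

    # One-time: add plugins for the runtimes you care about
    asdf plugin add nodejs
    asdf plugin add python

    # Inside a project: install and pin exact versions
    asdf install nodejs 16.15.0
    asdf install python 3.10.4
    asdf local nodejs 16.15.0     # writes .tool-versions in this directory
    asdf local python 3.10.4

    # .tool-versions can be checked in, so everyone gets the same tools
    cat .tool-versions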


Xcode is enough to make nvm and asdf work. With only a very few exceptions I've found it to be enough to run JavaScript frontends, and backends in JavaScript (Node), Ruby...


What tools do you use/find essential that get installed with Xcode's command line tools? My experience with Xcode's command line tools is pretty much just installing them for Homebrew. I don't think I've taken time to actually dig into the installed tools -- so I'd love to hear what tools increase your mileage! :)


I've been on Linux at work full time for about 7 years now. It was kinda janky for a while, but in the last couple years it feels as solid as, nay, more solid than, my MacBook. I did take the plunge with NixOS though, which is a quantum leap in reliability and stability.


It's still a massive pain in the ass. Their UI is inconsistent, their new arch breaks docker builds, and not enough is configurable. Brew fails at usability too; it doesn't even put executables on the path for you.


> their new arch breaks docker builds

I’ve seen this complaint quite a bit on HN, and I don’t really understand it. Apple helped Docker port over to arm64 on the Mac (which was one of the first pieces of software to use the new fancy hypervisor APIs), and it’s been working for over a year now. If your docker images can’t build on arm64, I don’t really understand how Apple or Docker can make that any easier. Most of the popular images have arm64 builds now, the only holdout I know of is MySQL due to Oracle being Oracle. If you’re building your own images, GitHub Actions and many other CI providers have had ARM64 agents for a long time (and you can build arm Docker images in QEMU super easy on GHA if you need hosted agents).

There’s not much of an excuse anymore, and since arm instances are cheaper on some public clouds, you should be making sure your stuff runs on arm64 now. Especially if it’s a typical Node/.NET/Java/Ruby/Python app, they’ve been arm native for years.


> the only holdout I know of is MySQL due to Oracle being Oracle.

I am (partially) responsible at the MySQL team for Docker images. I however don't use Mac myself. The only thing I am aware of is that for 8.0.28 we were a little late with ARM images due to technical issues, but that should be resolved.

But mind there are different sets of container images: `mysql` is maintained by Docker, Inc. where we only have limited influence and `mysql/mysql-server` which is maintained by us.

I see for instance

https://hub.docker.com/layers/mysql-server/mysql/mysql-serve...

Any specific issue with that? I'd then work with the involved teams to get it resolved. (Preferably issues via bugs.mysql.com, but I'll check responses here as well)


I thought “mysql” was y’all. Sorry! I'll have to remember in the future to use "mysql/mysql-server". I just remember that GitHub issue for "mysql" being open forever.


> you can build arm Docker images in QEMU super easy on GHA if you need hosted agents

Yeah, you nailed it: it's fairly trivial to create cross architecture images -- my preferred tool is Docker's buildx plugin[1].

I have an M1 Mac, a Raspberry Pi 4 running 64-bit Ubuntu Server, and multiple x86_64 machines. I try to exclusively use Docker images that are cross-architecture. When they are not cross-architecture I usually just build the project's Dockerfile using the Docker Buildx Github Action[2]. The majority of the time most projects' stock Dockerfile works on all architectures. The few times they don't work are usually when a Dockerfile downloads a prebuilt x86_64 binary. I feel like that is bad practice so I usually just write my own Dockerfile when I encounter that.
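For reference, a rough sketch of that buildx flow (the builder and image names are placeholders, and cross-builds rely on QEMU emulation for the non-native architecture):

    # Create and select a builder that can target multiple platforms
    docker buildx create --name multiarch --use

    # Build for both architectures from one Dockerfile and push in one go
    docker buildx build \
      --platform linux/amd64,linux/arm64 \
      -t example/myimage:latest \
      --push .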

A recent image I did this with was Gitleaks[3] (link goes to my Github Action).

I will say I have found that while most projects usually compile well on all platforms, if I ever do run into issues it is usually on 32 bit ARM.

[1] https://docs.docker.com/buildx/working-with-buildx/

[2] https://github.com/docker/setup-buildx-action

[3] https://github.com/heywoodlh/actions/blob/master/.github/wor...


You've certainly painted a sunny picture of arm adoption that I'm simply not seeing.

One of the advantages of Docker is that you can have systems lock their versions. So our project using, say, Neo4j uses a particular version of it. Not necessarily one with an arm build. This creates predictable and repeatable builds.

So now either we upgrade dependencies out of our expected cycle, or we abandon Macs as our development machines. A decent number of us wanted Linux anyway. This is a cost that Apple foisted on us.
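(As an aside, and only as a sketch of a partial workaround rather than the parent's setup: Docker on Apple Silicon can still run a pinned amd64-only image under emulation by requesting the platform explicitly, with a real performance penalty; the image tag below is just a placeholder.)

    # Keep the exact pinned version, but explicitly ask for the amd64 image;
    # Docker on an Apple Silicon host will run it under emulation (slowly)
    docker pull --platform linux/amd64 neo4j:4.4
    docker run --platform linux/amd64 neo4j:4.4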

> since arm instances are cheaper on some public clouds, you should be making sure your stuff runs on arm64 now.

Some instances might be cheaper, therefore we have to adopt it now? I think the cost/benefit analysis you're performing here is ignoring development costs. Are you a project manager of some kind?

> they’ve been arm native for years.

The native libraries and tools that make them worthwhile are not. Really it only takes a single component to ruin everything.


I’m not a PM, but I’ve been in the room during those conversations. Granted, I’ve never had to rely on a ton of native extensions (at least in a business context, personal stuff for sure).

It’s just like any other dependency. Sometimes you can afford to freeze on a single version until the end of the project, but most of the time projects I’ve been on will regularly upgrade the third party stuff we rely on if for no other reason than security, with things like arm64 support coming along for the ride. We usually prefer to be able to get help with our issues faster than our tickets getting closed with “fixed in new version update”, of course weighing that against breaking API changes or what have you.


Yes, the trend for the last few years hasn't been great.

With Apple's current market share, it sometimes feels like the Unix core is more of a historical artifact, or one of the bullet points in commercial presentations. I don't see them put much effort into pushing Darwin forward when they could use the same resources for Bridge OS or other parts they see as a differentiating factor.

On the bright side, Mac OS X coming to enterprise settings also opened the door for Linux laptops, and they became a truly viable alternative. Perhaps Linux getting better representation and commercial foothold could help keep Darwin competitive and properly evolving, one could hope.


It does except in cases where it might be dangerous to (e.g. `node@16` doesn't by default as that could conflict with plain unversioned `node`). You can generally `brew link` the package anyway if you need.

Or use Nix instead. I've been having a good time with Nix on macOS.
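A rough sketch of what that looks like, using the node@16 example from above (keg-only formulae are the case where linking is opt-in):

    # Installed, but deliberately kept off the default PATH (keg-only)
    brew install node@16

    # Either link it explicitly...
    brew link --force node@16

    # ...or just put its bin directory on PATH for this shell
    export PATH="$(brew --prefix node@16)/bin:$PATH"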


Darwin was never a marketing ploy. Marketing had no idea what to do with it. The idea was to share the source for the highly usable BSD fork that Apple was working with at the time. Ultimately the effort failed because there were problems with the technology and not a lot of extra effort available. People think of Apple now and fail to remember that back at this time Apple was still an unhealthy and barely profitable corporation with a stock price just over $20.

And it is worth remembering what happened with copylefted code in general. Any copylefted content that Apple published caused a bunch of mean spirited and detail oriented critics to insist on proof that all relevant source diffs had been shared. This turned out to be an ongoing support problem that could not be solved despite considerable resources being applied. The intense ongoing hostile attacks from free software advocates caused Apple to rigorously remove as much copylefted content as possible in order to stop the bleeding. Far from this being a weak marketing effort, it was an engineering effort that was not strong enough to succeed and got huge amounts of negative attention from the open source community. Interesting that some of the people who put their energy into killing this effort now confidently claim it was never anything but marketing. Never underestimate the capacity of the free open source community to generate rude and mean spirited criticism based on near zero understanding of corporate operations and constraints.


This seems very plausible to me, but I’m curious if you can point to any evidence to substantiate what you’re saying?


> Prior to Mac OS X, people would have been laughed out of the building for having a Mac, but now, with Mac OS X and the magic of open source, it was suddenly OK or even cool.

I think this is more a function of Apple creating the best lightweight Unix laptop at that time. Windows laptops at that time had a horrible command line experience and did not play well with Unix/Linux, which is what most web servers were running on. Linux laptops had horrible hardware support.


Toshiba used to sell laptops with Solaris on them, and there were some SPARC versions from Sun as well.


> Windows laptops at that time had horrible command line experience and did not play well with Unix/Linux which is what most web servers were running on.

In 2006, the main reason for Linux was to fix things Microsoft broke, which upon discovery Microsoft would break again, and Linux devs would again restore functionality to Windows networks. So your first point here is somewhat true, Microsoft did not play well with Linux which was moderately successful at playing well with Microsoft.

In 2006, however, most webservers did not run Linux, hardly any did, in fact. Linux took over the data center about 5 years later. In 2006, most webservers were either IIS on Windows Server or, believe it or not, NetBSD. In fact, the vast majority of webservers globally, like 90% or more, ran NetBSD w/ Apache, and if not for fanaticism, this would still be true today, because there was and is absolutely nothing wrong with NetBSD that Linux fixes.


> In 2006, most webservers were either IIS on Windows Server or, believe it or not, NetBSD. In fact, the vast majority of webservers globally, like 90% or more, ran NetBSD

Do you have any sources to back this claim? I've always thought that (perhaps mistakenly) by 2005-ish Linux was already fairly dominant as a web server OS/platform.

Not asking because I think you are wrong -- I just would love to read more about this and couldn't find anything more than a web server survey from 2006[1]. The survey doesn't really mention specifics about any of the Unix-like platforms other than "Linux" or "Apache" (by which I assume they are grouping all non-Windows OSes together).

My interest: I'm a regular Linux user with some FreeBSD experience who finds the BSDs super interesting and their approach very admirable.

[1] https://news.netcraft.com/archives/2006/04/06/april_2006_web...


In my corner of the world, HP-UX, Solaris and AIX were still the UNIX workhorses we would trust to put into production, with CERN being the exception where I used Linux in production.

Yet, even at CERN in the early 2000's, while migrating from Solaris to Linux (my office had a pile of pizza boxes to be thrown out), most researchers were using Windows 2000 with FrameMaker/Word for writing their papers, with remote tooling to access the dev environments.


I think it’s a question of perspective. In 2006 I think it’s fair to say that Linux was becoming dominant on the web, meaning that most greenfield projects would run on Linux from day one.

Enterprise IT was another thing entirely. At the time I was working at an Enterprise Linux vendor and we were very focused on these “Unix to Linux” campaigns. There was so much to do, and a lot of convincing needed to happen to get these business critical workloads over (accounting, payroll, ERPs, etc.).

Ironically, VMware ended up being more instrumental in getting Linux adopted by IT depts everywhere than we ever were. As well as Oracle with Unbreakable Linux, which was released that year.


> In 2006 I think it’s fair to say that Linux was becoming dominant on the web,

I don't see how that is possible. Dominant means a majority. Even by 2009 Linux still only ran maybe 40% of all Internet webservers, and that is being very liberal. So even by 2009, Linux was not dominant. In 2006, Linux possibly had 2-5% of webservers... if that. By late 2011, Linux probably ran 90% of webservers. It didn't happen overnight; it took more than a decade after 2001 for Linux to dominate the datacenter.

We ran racks of Linux servers in 2001 for back end processing. But our webserver ran NetBSD, like nearly all the other webservers, until about 2009. After 2009, perhaps all new webservers ran Linux, as hardware was replaced, but the Internet was still mostly NetBSD in 2006 with the rest being Windows Server IIS.


If you look at the graph of web server market share on Netcraft's site, you can see that Apache ramped up really quickly in the late 90s and stayed ahead of IIS until Nginx started making real inroads.

https://news.netcraft.com/archives/category/web-server-surve...

The vast majority of these web servers were running on Linux. NetBSD was a very niche OS from about the early 2000s on, and far from common before that. Up until about '97 or so most web servers seemed to be running on NT4 or even Windows 95 (which was in itself a fairly terrible idea but they were simpler times).


> Dominant means a majority.

My full sentence was qualifying what I meant by "becoming dominant" here: "meaning that most greenfield projects would run on Linux from day one."

In 2006, Linux was not powering a majority of the web, but it was powering a majority of the _new_ web properties launched that year.


> by 2005-ish Linux was already fairly dominant as a web server OS/platform.

I lived through it, and I don't have any citation, but this is incorrect. Linux did not take over the datacenter until 2011/12. In 2006, the main reason for Linux's existence was to fix the things Microsoft kept breaking; also, it was nice to revive older hardware that wouldn't perform with or run current OS versions. IBM actually tried to switch everything to Linux right around 2006, but grey beard AIX system admins revolted en masse, and IBM postponed the "migration." In 2006, Linux was just not mature or stable enough for big iron nor servers for popular websites, and Linux certainly wasn't dominant in any space in 2006, except maybe the dev hobbyist space.


I don’t agree at all. Linux was fairly ubiquitous even in the late 90s. For example the entire line of Cobalt server appliances were Linux based, and they were a favorite in colos and shared hosting.


> Linux was fairly ubiquitous

Linux was not omnipresent in the late 90's, regardless of a single server appliance. It was only in 2007 that any major hardware vendor (Dell, HP) even announced it would ship with Linux preinstalled. By 2008, NetBSD still had about 50% of the webserver space, with 30% run on Windows Server. In 2008, Linux ran maybe about 20% of Internet webservers, regardless of Steve Ballmer's claim (remember how big the Internet is, i.e. it does not only consist of Apple, Amazon, Facebook, and Microsoft... that is only 4 sites... there were 31M websites in 2008, and most of them still ran NetBSD). A lot of file servers ran on Linux, there was big data being stored on Linux by 2007, there was massive interest in Linux, but Linux was not all there was, and Linux really did not take over the datacenter, where nearly all the hardware ran Linux, until late 2011.


> there were 31M websites in 2008, and most of them still ran NetBSD

Extraordinary claims require extraordinary evidence.


But the status quo is, pretty much by definition, not extraordinary. I also remember "common knowledge" of the era as "BSD standard, Linux up-and-coming". So where's your evidence for your extraordinary claim that this wasn't how it was?


I looked after dozens of those, both the Raqs and the Qubes, in the earlyish 2000s. I want to say, about 2003-2004ish?

The Raqs got through fans like crazy, for some reason. I think I went through all of the servers twice in two years.


I think you're overstating a little here. There were definitely popular websites running on linux in 2006. I won't make any claim about dominance (it's too hard to find good data now and it's just a battle of memory at that point), but definitely some high rankers on alexa at the time were very much LAMP.

Notably, I believe Facebook was LAMP, and was already becoming a pretty big force in 2006. It's hard to find a good citation for that, but it was my understanding at the time they were.

I think it was pretty common among up-and-comer startups, really. Maybe not so much on the low or high end, which were still dominated by shared hosting and enterprise contracts at the time, respectively. Still, it's the year RHEL launched so things were changing.


Better said, the year Red Hat realized there was no money in the Linux Desktop and pivoted to RHEL.


> There were definitely popular websites running on linux in 2006.

There were about 32M websites in 2006. Nearly all of them were not running Linux. Not in 2006.


First of all, you said popular. Nearly all of those 32m websites were not popular by any useful definition of that word.

At any rate, you’re definitely going to have to provide more than a stern tone to make me believe this is true as it’s either very at odds with my experience at the time or your definition of “nearly all” is very unusual.

And finally, as another commenter suggested, if your idea is that this doesn’t also apply to NetBSD (i.e. nearly all servers weren’t NetBSD) then I dunno what to tell you. Some kind of BSD maybe, but NetBSD? I really don’t think so.


No, I did not say "popular."

It is quite simple. Linux took over in 2011. That is 5 years after 2006. We don't have to rewrite history to make Linux look better than it is. Absurd.


> Linux was just not mature or stable enough for big iron nor servers for popular websites,

On the other hand, I absolutely made no claims about taking over. That's not the argument I have with what you said.

But I was running, or involved in the running of, large and popular production websites in 2006 that absolutely ran on Linux in the datacenter. There were BSD machines in the mix, sometimes as frontend proxies, but the actual website software and the bulk of the compute power of the sites were Linux.

That’s the claim I’m making and i make it because i lived it: Linux was used by popular websites in 2006, regardless of your claims of overall popularity.


I don't understand what you're on about, dangling on a word. My claim is that most of the WWW, the vast majority of public webservers, popular or not, ran on NetBSD between 1994 and 2007, the rest ran on Windows Server IIS. Linux, if it had any marketshare at all in the public WWW space in 2006, was in an extreme minority, part of a single percent or very low single digits of the 30M websites available in 2006. Barely perceptible. Things moved fast, and by 2008, Linux gained a lot of marketshare, and HP and Dell facilitated that by shipping Linux since 2007, but Linux still only had about 40% of webserver marketshare in 2008. But by the end of 2011, it effectively gained the rest.

But you worked on a large website in 2006 that ran on Linux. So you were a pioneer. Satisfied? In what way does it offend you that Linux didn't have any webserver marketshare to speak of in 2006? It's nothing personal. You are not Linux.


So, if this is your claim, provide some evidence to support it.

Web serving is not my field but I've been in tech since 1988, and while IIS rapidly gained marketshare in the late 1990s, my impression was that by the 21st century, Linux was becoming dominant.

The biggest NetBSD server I'm aware of is SDF: https://en.wikipedia.org/wiki/SDF_Public_Access_Unix_System

And that is pretty niche.

I have never, ever seen or heard of NetBSD in production in my entire career. Not once, anywhere.


> my impression was that by the 21st century, Linux was becoming dominant.

This is just bias and rewriting history. Linux was popular among younger admins and developers. It was a verve among that generation. Many were running it by 1996, and any who did will have the false impression that it was dominant because they were exposed to it. And while Linux was in the datacenter by 2001, it was not used for web front ends, it was for running backend processing. What I am saying is that Linux had not been widely adopted by the corporate or industrial space as webservers by 2006.

If there were any public facing Linux webservers in 2006, they were in such a minority that claiming it was becoming dominant is only said with the benefit of hindsight, because by 2012, Linux did take over most webservers. But in 2006 no one could know that, because most webservers, 80% or more, from 1994 to 2006 ran NetBSD, and IIS never had as much deployment as NetBSD webservers.

Again, I am not talking about any MAMAA. I am talking about Joe's Electronic Shop, and Xie's Ceramic Hippos, and the vast majority of all the public facing webservers online in the 1990's were running NetBSD, because by the mid-90's, it became ordinary convention, thus no one said, "we should document this for historical purposes."

The UNIX Wars were over and BSD won, so there was no hemming and hawing about what OS to use for a webserver. It was a choice between a handful of proprietary paid OSes (Windows Server, AIX, HP/UX, etc.) and the free BSDs. For whatever reason (stability, maturity, practicality, etc. etc.), NetBSD was the vastly more popular choice for webservers, right up until the Linux verve came in and fixed something that wasn't broken. Linux was just a preference at best and a fad at worst. It brought no new features (Apache 2 is Apache 2) and was not more secure; though it booted faster, NetBSD never needed to be rebooted. Linux took over like Pepsi took Coke's marketshare, and there was no rational reason behind this. Today, one is as good or bad as the other. In 2006, Linux was not mature. I know this because IBM scrapped their wide-deployment Linux plans in favor of continuing to develop AIX, which, as awful as it is, was far more stable and mature than Linux in 2006.

Linux only got great, like, astoundingly great (which means as good as any BSD or SysV variant), maybe a decade after the Millennium. In 2001, no company, not even the small ones, were trusting Linux for public facing anything. Maybe Linux had small gains in the web space by 2006, but it couldn't have been more than a few percent, if even a fraction of a percent, of the web. In 2006, the web was nearly entirely Windows Server IIS and NetBSD Apache 2, with the rest in small minority sharing a few percent, Solaris, AIX, HP/UX. And Mary's Coffee Mugs Boutique was not running AIX or Solaris, would have cost a fortune. NetBSD was convention for webservers, and it was free, very well-documented, well-supported, had dozens of thousands of free ports available through pkgsrc, and was very stable, and a mature OS. So it should not be surprising in the least, because it worked, and Linux was not ready, and it wasn't ready because, though the younger IT gen was ravenous for it and crusading for it, the grey beards in decision-making positions hated it (I think because Linux arbitrarily messes with the established directory structure of UNIX, so things were not where they were expected to be, causing frustration among the most experienced UNIX & BSD admins).

I would bet, before Craigslist migrated to CentOS sometime after 2004, from inception and for the entirety of the 90's and the early naughts at least, it probably ran Apache on NetBSD. How the heck can anyone find anything about what OS Newmark was running except the man himself? (I'm fishing to see if Craig lurks here, but at 69, I hope he's lurking on a yacht somewhere instead). I'd also be willing to bet that Amazon was originally hosted and launched from NetBSD servers, but I'd also bet Jeff doesn't remember.


> In 2001, no company, not even the small ones, were trusting Linux for public facing anything.

> I'd also be willing to bet that Amazon was originally hosted and launched from NetBSD servers

That's funny. In fact, Amazon decided to switch to Linux in 2001 [0], and completed the transition by 2002 [1]. They were moving from Solaris [2].

[0] https://web.archive.org/web/20010608093419/http://news.cnet....

[1] https://www.cnet.com/tech/tech-industry/how-linux-saved-amaz...

[2] https://twitter.com/DanRose999/status/1347677573900242944

As I mentioned in another comment - it's bizarre how NetBSD supposedly absolutely dominated and yet there's absolutely zero documentation of that "fact", while there's lots of people talking about using Unix, Windows, and Linux at the time.


> while there's lots of people talking about using Unix, Windows, and Linux at the time.

Not so strange that the talk was all about OS with paid licensing. And I don't see lots of people talking about Linux webservers in the early naughts. Linux was in the news because it was interesting, but the news was about reviving older hardware with free OS. Linux made it into production backends in 2001 at the latest, but no one was writing about it.

And in 2001/2, I'm not sure Amazon was all that notable, still a small Internet company then. Also, I'd find it hard to believe Amazon began on Solaris, rather than Solaris being the first early migration (Sun had even better support than NetBSD, but getting the nines was closely tied to Sun hw). Amazon was initially a garage company. I guess it is possible Bezos had a hand me down Solaris server, and ran without Sun support (the hw was that good, Sun's excellent support was hardly needed), or that it was part of the initial investment, but those servers cost tens of thousands.

I'm really beyond trying to convince anyone that 1) Linux only went everywhere later, 2011/12, not in 2001/6, and that 2) NetBSD, for all intents and purposes, ran the entire WWW for a decade or more, ignoring IIS and the small share of paid Unixes. I saw web audits of OS. No one saved any for future review and nostalgia. NetBSD was an incredibly popular webserver for a long, long time, all through the 90's and into the naughts, losing out to Linux sometime between late 2007 and 2009, by which time it had all but disappeared (or the web got massively bigger).

Believe it or not. I'm not sure what my motive would be for trying to deceive die-hard Penquinistas. If we can't find evidence NetBSD dominated webservers, then find evidence some other OS did, but don't trust Microsoft's BS. IIS was a dog and had some minority marketshare increasing by the late 90's only because no one gets fired for choosing Windows.


> And in 2001/2, I'm not sure Amazon was all that notable, still a small Internet company then.

In 2002? No, Amazon wasn't a small Internet company. It had been around for a decade; it had been publicly traded for half that time. Look at that article I linked - they were spending nearly $100 million a year on infrastructure! Their annual revenue was three quarters of a billion dollars! They were a household name: Time Magazine had already named Jeff Bezos Person of the Year three years prior, in 1999! That CNET article describes them as an "e-commerce giant"...and they felt comfortable betting their tech infrastructure on Linux in 2001, when you insist it would be another decade before it wasn't just for hobbyists.

(And FWIW, Bezos was already quite rich and successful pre-Amazon:

> He first worked at Fitel, a fintech telecommunications start-up, where he was tasked with building a network for international trade. Bezos was promoted to head of development and director of customer service thereafter. He transitioned into the banking industry when he became a product manager at Bankers Trust. He worked there from 1988 to 1990. He then joined D. E. Shaw & Co, a newly founded hedge fund with a strong emphasis on mathematical modelling in 1990 and worked there until 1994. Bezos became D. E. Shaw's fourth senior vice-president at age 30.

Amazon was also quite well-funded from the jump, with dozens of investors, including hundreds of thousands from his parents alone.

https://www.scmp.com/news/world/united-states-canada/article...

He could have easily afforded a Solaris box or two for his site.)

I understand that you're not trying to convince anyone of anything, because all you're doing is saying "trust me" over and over. Please understand that, likewise, I'm not trying to convince you, because you're clearly completely disinterested in fact-checking your own beliefs. Nor am I a "Penquinista" fighting for Linux's honour or a BSD Hater or something - while I have one machine running Linux, I've got another running Windows; I've run both OpenBSD and FreeBSD at various times; I'm typing this on a Mac now. My motivation in arguing and providing counter-evidence here is just to try to set the record straight for anyone younger who might be reading this, who wasn't around at the time, whose head might be getting filled with misinformation.


Citations.

Pics or it didn't happen.

You have an axe to grind and I'm calling BS. If this was so prevalent, it will be trivially easy for you to prove it.

I was deploying Linux in production in the 1990s. A friend of mine ran an entirely Linux-based ISP in the late '90s. I was writing about Linux in the 1990s, e.g. here:

https://web.archive.org/web/20030529000441fw_/http://members...

I put together a custom remix of SUSE Linux Pro for PC Pro magazine in 1999.

I was putting in Smoothwall boxes from 2000:

https://smoothwall.org/releases.html

If what you are saying is true, prove it. Show us site reports, deployment queries, support conversations. Show us comparative reviews. Show us some evidence.

I am fascinated and I'd love to write about this, but at the moment, you have provided nothing and you sound more and more like a random ranting Internet loonie with every post.


Well, regardless of any citations this is very interesting anecdotal experience. Thanks for sharing -- good to understand a bit more of the history there. It's interesting to learn more about previous timeframes in which I didn't have an IT career.


I think this just isn’t measuring platform at all, though you can kind of assume they wouldn’t be much different. Apache could and did run on Windows as well, though it was definitely an outlier case. But if you wanted to host PHP and you had a Windows server you might wind up using Apache on Windows, as far as I remember (and that was a use case for a client of mine around that time).

I definitely remember there being platform horse race info from netcraft around that time as well, so i wonder what happened to it.

God this brings back some… fun… memories: https://httpd.apache.org/docs/2.4/platform/windows.html


> I think this just isn’t measuring platform at all, though you can kind of assume they wouldn’t be much different.

I think you're absolutely right -- I just couldn't find anything from that timeframe with similar data points (probably lacking Google-fu on my part).


"90% or more" ran _NetBSD_? At least say FreeBSD if you want to be even remotely believable. I challenge you to name a single major website that ever ran on NetBSD.

Linux was rapidly becoming a successful free platform for web servers by 1998 - that's when the term LAMP was coined [0]. By 2002 "BSD is dying" was already a meme [1] on Slashdot - it was always a joke, exaggerated, but one with a grain of truth. Unlike the claim that NetBSD dominated the market! And while Netcraft never did claim BSD was dying, they did note in 2001 that far from having a near monopoly, "[BSD] has not achieved significant mindshare widely, as Linux has. This limits BSD’s successes largely to a few particular areas." [2]

[0] https://en.wikipedia.org/wiki/LAMP_(software_bundle)

[1] https://everything2.com/title/BSD+is+dying

[2] https://news.netcraft.com/archives/2001/04/01/april_2001_net...


The WWW was developed on a NeXT machine. But NetBSD ran the web for more than a decade, and I mean 90% of 30M webservers globally between 1994 and 2006 ran on NetBSD installations, not FreeBSD, NetBSD. FreeBSD started gaining popularity later. The reason is NetBSD was mature, incredibly stable, free and well-supported. Linux was nowhere in 1998, had zero support, was not stable or mature, and there were no Linux webservers serving the public anywhere in 1998. In 1998, Linux was barely a curiosity, a hobby, and it continued to have no foothold in the WWW until Dell and HP started shipping Linux installations in 2007.

Linux fanaticism is compelling, but it can not rewrite history. Linux took over far later than you think. Linux started gaining webserver marketshare only by 2008 as a minority, quickly ramped up in a few years, but only by 2011/12 were most webservers running on Linux.


I didn't ask about NeXT, I asked about NetBSD. Surely if it ran 90% of the web, you can name a single website that was served by it.

Like, I honestly can't tell if this is some kind of bizarre troll. People keep asking you for sources for your incredible claims, and you just keep making them without a single reference. Where are you getting this 30M number? This 90% number?

Here's another source: Red Hat IPO'd in 1999 on the back of a Linux operating system, per Wikipedia, "achieving—at the time—the eighth-biggest first-day gain in the history of Wall Street". Pretty amazing for something that was "barely a curiosity, a hobby".

Here's Netcraft in 2003, claiming a mere 2M sites running FreeBSD and describing it as a blip compared to Linux and Windows.

> Although nearly all of the public focus is geared around advocacy of Linux and Windows, there is a third Intel based operating system, which generates a tiny fraction of the publicity surrounding these operating systems, and has a much smaller user and developer community.

https://news.netcraft.com/archives/2003/07/12/nearly_2_milli...

Notably they don't even mention NetBSD. In fact, Netcraft barely ever mentions NetBSD. I get _two_ hits on their entire site for it. Pretty big oversight for something running 90% of the web!

Here's a FAQ from 2001 from a commercial BSD company describing FreeBSD as "the most popular open source version of the BSD OS". Weird how, according to you, NetBSD was running 90% of the web up until 2006 but in 2001 FreeBSD was even more popular than it!

I mean, I could go on and on. It's frustrating that I can't find any clear cut OS usage stats from that era, but everything points to Linux being a major player by the early 2000s. For example, Java - which was a very big deal at the time - never really supported BSDs, but supported Linux.

https://coderanch.com/t/110571/os/Java-NetBsd

You had to buy Netcraft's more detailed analysis, but in 2003 based on their previews it's clear they split the world up into "Linux", "Microsoft", "Other", "Solaris", and "Unknown". In the screenshot they show of one DNS owner, Linux was roughly 30% of servers, Microsoft was roughly 50%. Wild how BSD doesn't appear there at all!

https://web.archive.org/web/20030425195510/http://news.netcr...

But why am I wasting my time providing these sources? I gave a bunch already, you ignored them and just kept making the same baseless claims about NetBSD's completely undocumented hegemony over the web that collapsed instantaneously. "Rewriting history" indeed!


Your citations don't say anything about the web. I also don't know why you are wasting your time. I am not really interested in trying to convince you since you are astoundingly abusive.

FreeBSD was certainly loved, by researchers and universities, but I don't think we can trust a source that appears to be marketing, "we're No. 1!" I know why NetBSD ran the web for so long, because it was based on BSD4.3, ran on everything, was well-supported by a community in active development, included pkgsrc and all the ports that came with it, and it is free, unlike the alternatives Solaris, AIX, HP/UX, which also more often than not took advantage of pkgsrc. Linux was just not a viable option for running a public webserver between 1993-200x, but as to why NetBSD and not FreeBSD, I can't answer, but I know for a fact that most webservers globally ran NetBSD from the development of WWW and for over a decade, and the rest, in minority, ran Windows Server IIS. Why this is not widely documented is no mystery; it just isn't that interesting.

Believe it or not, I honestly don't care. Believe what you wish.


I'm finding that hard to believe, and I'm a massive NetBSD fanboy.

In 2006 I was running about two dozen web servers, and only one of those was NetBSD - the rest were all Debian.

The NetBSD one was my personal server, that lived in the water tank cupboard of my flat hooked up to my thumping great 10Mbps cable modem.


Why is it so hard to believe? NetBSD was a fully mature operating system by the time the WWW was developed. The serious options were Solaris, AIX, HP-UX. They weren't free. NetBSD was under active development and runs on literally everything, plus it has pkgsrc, with massive amounts of source code available for free that pkgsrc managed. Maybe I am biased because I already know, but it makes a lot of sense that NetBSD was so widely deployed for webservers for so long.


Because I was running dozens of web servers in 2006 and was a very active part of the ops community, and everybody thought running a website on NetBSD was bonkers.

Linux was already well-proven technology by this point, with a variety of free and commercial offerings.

We mostly used Debian, although if people really enjoyed handing over money we'd put RHEL on for them.


So prove it. Put up or shut up.

Provide supporting evidence, and we will listen. Otherwise: nah.


In 2006 I was still mostly using commercial UNIXes in production, with Linux for UNIX at home purposes.


> Apple's Open Source efforts with the Darwin project have been a failure. Apple failed to build a community around Darwin in the 7 years since its original release because it was not a corporate direction, but rather a marketing stunt. Culturally, Apple did not and does not understand what it means to be open source or to build a working community.

WebKit[0] would be a counter-example.

Maybe Darwin didn't have value beyond what Apple was doing with it.

[0] https://en.wikipedia.org/wiki/WebKit


In addition to the other reply, note that WebKit only happened when the KDE developers (whose code WebKit was originally forked from) kicked up a fuss about Apple's misleading marketing of this as a successful open source engagement. Apple were originally doing bare minimum tarball dumps (which, to be clear, KDE had no real problem with in itself - that's widely accepted as a legal way to work with LGPLed code) until they got called out.


WebKit is LGPL only because it began life as an (internal!) fork of the KDE Konqueror codebase. And the lack of community stewardship from Apple led to it being forked again into Blink/Chromium, which has largely taken over the mantle[1] of "standard open source HTML engine".

WebKit itself is hugely successful because of its use in proprietary projects basically. It's "open source" only by accident, and it seems pretty clear Apple would prefer it not be.

[1] Not that Chromium is a particularly great example of an open source project, but lots of people outside Google build it, ship it, and contribute.


I read that one of the biggest driving forces of Google forking WebKit into Blink was an increasingly large gap in the direction that the two companies wanted to take the engine, including differences on things that both parties agreed should be implemented (I seem to remember there being a difference of opinion on how to implement multi-process architecture for example), making it progressively more difficult for the two to coexist on the same codebase.

Was that not true? Certainly today WebKit and Blink are significantly different engines with very different performance profiles.


This is correct, although I wonder how much they still contribute between the projects. Apple is notoriously secretive about employee OSS contributions (and everything, natch), but individual contributors pop up contributing to all sorts of things. And WebKit isn’t as isolated as it might seem, just today I saw tweets about some big perf improvement contributions from the creator of Bun who’s definitely contributing for his own purposes.

If I had to guess, coordination between WebKit and Blink is far less centralized than it was before, but I’d be shocked if it isn’t a going concern in places they don’t have different goals. It’d be a terrible waste of energy to have subject matter experts duplicating efforts like that.


> Apple is notoriously secretive about employee OSS contributions

Really? I see a lot of contributions from Apple engineers to the LLVM codebase


My understanding is that, apart from high profile involvement in certain projects (like LLVM), Apple engineers generally contribute “as individuals” without much to indicate they’re working for Apple. I could be wrong, but that’s how I’ve heard it.


That's about right. WebKit continues to be used by Apple, and finds its way embedded into a lot of platforms like game consoles.


Blink/Chromium forked because of the fundamentally different process architecture, coupled with the huge cost of supporting a separate JS engine. Their big blog post bragging about code size reduction was almost entirely about removing code required to support JSC and Objective-C.

Chromium/Blink will fundamentally never make any engine changes that support user privacy if that change will in any way hurt their ad business. So the "standard open source HTML engine" now means the engine that supports the ad industry over users.


Apple seems pretty keen to promote WebKit as a bona fide open source project these days, even touting its use beyond Safari entirely as well as the degree of outside (of Apple) contribution. https://webkit.org/blog/12733/happy-birthday-wpe-webkit/

More marketing stunts? Perhaps. But I'm all for marketing stunts to promote public open source collaboration. Now we'll just continue to hold them to it.


>Not that Chromium is a particularly great example of an open source project

Chromium isn’t an open source project at all. It’s a company project that releases open source code, but all the decisions are made by the company, not by any community.


I disagree that this could not be considered open source. Chromium could be forked at any time by anyone, and then developed freely from there. That’s as much as you can reasonably expect from any project you’re not paying for, IMO.


You can fork - the source is technically open - and you could form an open source project around it, but the current project isn’t.

And you could expect a whole lot more. To me Open Source is less about being able to fork, and more about not being dependent on some particular company.


Was anything really lost?

You have a few engineers with hurt feelings and Apple lost the potential for some contributors to fix issues (at the expense of security by obscurity).

Did the world really need another open source OS?


It does sound like a big deal until he quietly mentions half the engineers hired were like 3 to 4 people lol.

Darwin is a big deal in that it’s backed by one of the 5 largest tech companies. But when you look at it, it’s essentially just Mach with FreeBSD (NetBSD core utils in PowerPC days) bolted on top.


If you're finding that a bit "wall of text", hit F12 in Firefox or CTRL-SHIFT-C in Chrome, and then pick <body> from the Inspector. Then into "element.style" put in:

    width: 50%;
    margin: auto;
    margin-top: 5ex;
    line-height: 3ex;
This should make the lines about the right length to read comfortably with adequate line spacing.

Most folk will find it more comfortable, and particularly if you're a bit dyslexic. Feel free to fiddle with those numbers to make it fit your eyes better.


Or select "reader mode" on firefox.


That's never really worked on mine.


The article refers to Apple's OS called Darwin, not Darwin the naturalist.


Considering how many evolution deniers are still around, the naturalist still hasn't quite succeeded either.


How the heck did PureDarwin crawl out of those ashes 9 years later with a preview release of Darwin 9 with an X11 GUI (PureDarwin Xmas), followed three years after that by a command-line only 17.4 Beta based on Darwin 17 (corresponding to macOS High Sierra (10.13.x))?

PureDarwin is quite simply impossible because Apple doesn't understand or care about OSS.


> PureDarwin is quite simply impossible because Apple doesn't understand or care about OSS.

In an alternate reality it would be so cool if you could run MacOS apps on an open source OS. Maybe Darling[1] will one day get us to the point where you can run any MacOS app on Linux (not just simple GUI or CLI apps). That being said there aren't really any apps that I use that aren't available on Linux. But the option would be nice!

[1] https://www.darlinghq.org/


I just booted the 17.4 beta in a VM. It does work, but can't really do much in its current form. Interesting nonetheless!


Wifi is not going to work, but ethernet should. To do anything at all with it, you'd need to install MacPorts and start building packages. You can make it a DAMP nearly effortlessly using MacPorts.



TLDR

> The problem was Darwin was a marketing ploy, and nothing more.


Having a proper terminal was the main reason I bought a macbook pro back in 2007.



