I'm puzzled why there is such a strong dependency on Mac. I mean, the M1 looks pretty nice; but if the author is budget-constrained AND the hardware is too slow, wouldn't "normal" computers with Windows/Linux make much more sense anyway?

I fondly recall how a good friend of mine was impressed by the compute performance of the Apple-based workstations at her university. I convinced her to get a PC for the same cost as the Mac, and then she complained about how slow the university Macs suddenly felt. (That was ca. 2014.)



Imagine Windows deciding to do a forced auto-update (or hard disk check) ten minutes before a performance. Or breaking down because the system doesn't work well after the update.

I set up a VR performance in a gallery once. It was supposed to run the same basic program for 3 months.

Since it was Windows, we had to take additional steps to make sure that it wouldn't try to update* and possibly break drivers, because there wasn't IT staff around to fix things if it broke down.

I never had such issues with MacOS.

* - Additional steps like making sure the built-in wireless is disabled and the machine doesn't remember any wifi passwords, so it doesn't try to fetch any sort of update, because even if you disable Windows updates, a bunch of other drivers may ignore those settings. And then, if we needed to put anything new on the computer, we had to either use USB sticks or break the air gap and redo all the testing - so it had to be timed so that we would have at least half a day to fix things if they broke.
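For reference, the baseline "disable Windows updates" step here means the documented Automatic Updates policy (the NoAutoUpdate value under HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU). A minimal sketch of setting it programmatically - illustrative only, and as noted above it still doesn't stop vendor updaters that ignore the policy:

    // Sketch: set the documented policy value NoAutoUpdate=1 under
    // HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU.
    // Requires admin rights; vendor updaters may still ignore it.
    #include <windows.h>
    #include <cstdio>

    int main() {
        HKEY key = nullptr;
        LONG rc = RegCreateKeyExW(
            HKEY_LOCAL_MACHINE,
            L"SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU",
            0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr);
        if (rc != ERROR_SUCCESS) { std::printf("open failed: %ld\n", rc); return 1; }

        DWORD one = 1;
        rc = RegSetValueExW(key, L"NoAutoUpdate", 0, REG_DWORD,
                            reinterpret_cast<const BYTE*>(&one), sizeof(one));
        std::printf(rc == ERROR_SUCCESS ? "policy set\n" : "set failed\n");
        RegCloseKey(key);
        return rc == ERROR_SUCCESS ? 0 : 1;
    }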


I'm watching MacOS gradually become this, and I'm not pleased about it. I'm pretty sure I understand why it's happening: Apple is managing its full ecosystem for its own benefit.

My concern is that, as a fallible lil' human, and moreover one with autism, I probably find it more upsetting than most to have stuff randomly break for no reason. I depend on continuity and repetitiveness to be well. As such I try to operate on computer systems that don't change out from under me at someone else's whim, because something blowing up at an unexpected level of abstraction can leave me unable to function.

This most recently happened under macOS Mojave when my (non-Apple) apps could no longer check with the authentication server and would not launch. I lost a day trying to repair my system: Apple had never told me 'by the way, everything you run now has to talk to a server of ours or it'll refuse to launch'. I disabled the functionality, but I can't have things like that going on. It makes Apple the cyber-terrorist they are trying to 'protect' us from.

Again, I understand their motivation as they are a titanic collective entity trying to administrate and tend another titanic collective entity, their userbase, which they feel is a part of themselves.

However, as a lil' human type, I am too deeply committed to maintaining usefulness for lots of older computers owned by other lil' human types, many of whom can't consistently throw thousands of dollars at Apple to stay in the ecosystem as Apple understands it. I find Apple's actions morally reprehensible (granted, among the least reprehensible actors in the world of computers and the internet, but still).


> I set up a VR performance in a gallery once. It was supposed to run the same basic program for 3 months.

Windows 10 LTSC is a good fit for this type of scenario.

https://techcommunity.microsoft.com/t5/windows-it-pro-blog/l...


You ever work with Joanna Klass? Also... the Posterous link in your profile is of course dead


Nah, the VR work was with Norbert Delman: http://norbertdelman.com/2017/1951

Posterous - oh no! ;) Perhaps I will change the link to its Web Archive page :)


Turning off Windows Updates is 10 minutes of work, and I am no Windows expert; I just learned this from a YouTube video. I am not sure how you couldn't disable updates.

I am not a professional, so I might not know the audio stuff. But switching Windows Updates to manual is easy.


Nvidia has its own auto-update system, as do some other programs.


If you don't install "GeForce Experience" (which, given your use case, you obviously would not), the graphics driver does not auto-update and has no way to do so.


The software we use is only available for macOS. Also, Mac mini availability across the world is important. Lastly, the Mac mini form factor is perfect for the use case.


I suspected so. Excuse my rant; it's not directed at you, but at the software companies only supporting macOS: I never understood that crap. Only offering macOS forces their customers (e.g. you) into the Apple lock-in, preventing them from choosing the best hardware for the task.

Directed at you: I would try to avoid such software, and look for alternatives. (I expect this to be futile?).

x86 is available across the world as well. Others already pointed out comparable form factors.

Also, computer hardware reliability is pretty good these days. Maybe you could set up a scheme to reuse your tech stack?

On the extreme end, server hardware can run for years with zero hardware-related downtime (and offers nice things like redundant power supplies and 19" rack cases [but much deeper than audio stuff?]). And brutal performance: even my 450 Euro used-and-modified (new NVMe disk, faster CPUs), 5-year-old, 1U(!) Intel dual-socket system can mop the floor with most desktops below a Ryzen 3900 (at least on my compile workloads, and on anything that swaps on less than 128 GB RAM in general).


If you're interested, Intel NUC mini PCs are available all around the world in the same form factor.

They don't run macOS, so won't run your software though.


How does their performance compare to an M1 Mac mini?


Appallingly


Citation needed. There are Ryzen 5 mini PCs with active cooling [1]. You don't need to use an Intel chip manufactured on an aging 14 nanometer node.

[1] https://www.youtube.com/watch?v=fLATODi7KlU


> There are Ryzen 5 mini PCs

So not an Intel NUC then, which is what I was specifically referring to.


macOS has all sorts of serious MIDI capability built into the default install that you'd need a bunch of different plugins on Windows to get.
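For example, Core MIDI ships with every default macOS install, so just listing what's connected needs nothing beyond the system frameworks. A minimal sketch (not tied to any particular rig; compile with clang++ plus the CoreMIDI and CoreFoundation frameworks):

    // Sketch: enumerate the MIDI sources macOS already knows about,
    // using only the built-in Core MIDI framework.
    #include <CoreMIDI/CoreMIDI.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <cstdio>

    int main() {
        ItemCount n = MIDIGetNumberOfSources();
        std::printf("MIDI sources: %lu\n", (unsigned long)n);
        for (ItemCount i = 0; i < n; ++i) {
            MIDIEndpointRef src = MIDIGetSource(i);
            CFStringRef name = nullptr;
            if (MIDIObjectGetStringProperty(src, kMIDIPropertyDisplayName, &name) == noErr && name) {
                char buf[256];
                if (CFStringGetCString(name, buf, sizeof buf, kCFStringEncodingUTF8))
                    std::printf("  %lu: %s\n", (unsigned long)i, buf);
                CFRelease(name);
            }
        }
        return 0;
    }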

Also, while a lot of people use Pro Tools for recording/mastering and Ableton for live stuff (both of which are cross-platform), lots of shows use MainStage, which is Mac only. Logic is really good for recording and sequencing too, and is also Mac only.


I bet macOS audio APIs are way better than the Windows ones. I don't have experience with them myself, but I have been burned by the Windows ones.

I have a small story to tell about this. WASAPI on Windows has a way to tell you when, supposedly, the endpoint device started recording a buffer. This is important if one wants sync with sub-millisecond precision.

Well, it doesn't quite work like that in reality. I noticed the hard way that if you get a 2 ms long packet, query the high-performance timestamp, then query the packet's recording time, then query the timestamp again, the packet was supposedly recorded after the first timestamp but before the second. And since the packet was 2 ms long and the first timestamp was taken way, way less than 2 ms before the query, it was impossible for the capture to have started at that point.

So instead of returning the start of the recording as the docs state, WASAPI just returned the current high-precision timestamp. It's basically like asking someone "Hey, this 10 hour video you gave me, when did you actually start recording it?" and getting "Now. I started the recording right at this moment."
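The kind of check that exposes this is roughly the following - a minimal sketch, not the original code, assuming an already-initialized IAudioCaptureClient. The pu64QPCPosition value GetBuffer reports should predate the call by about the packet length, but instead it lands inside the call window:

    // Sketch: bracket GetBuffer with QueryPerformanceCounter readings and
    // compare them with the packet's reported capture time (pu64QPCPosition,
    // in 100-ns units per the WASAPI docs).
    #include <audioclient.h>
    #include <windows.h>
    #include <cstdio>

    static UINT64 QpcNow100ns() {
        LARGE_INTEGER c, f;
        QueryPerformanceCounter(&c);
        QueryPerformanceFrequency(&f);
        // Convert ticks to 100-ns units, the same scale GetBuffer reports in.
        return (UINT64)((double)c.QuadPart * 10000000.0 / (double)f.QuadPart);
    }

    void CheckPacketTimestamp(IAudioCaptureClient* capture) {
        BYTE* data = nullptr;
        UINT32 frames = 0;
        DWORD flags = 0;
        UINT64 devicePos = 0, qpcPos = 0;

        UINT64 before = QpcNow100ns();
        HRESULT hr = capture->GetBuffer(&data, &frames, &flags, &devicePos, &qpcPos);
        UINT64 after = QpcNow100ns();
        if (FAILED(hr) || hr == AUDCLNT_S_BUFFER_EMPTY) return;

        if (qpcPos >= before && qpcPos <= after) {
            // A ~2 ms packet cannot have *started* recording inside this
            // sub-millisecond window; the "capture time" is really just "now".
            std::printf("bogus timestamp: %llu within [%llu, %llu]\n",
                        (unsigned long long)qpcPos,
                        (unsigned long long)before,
                        (unsigned long long)after);
        }
        capture->ReleaseBuffer(frames);
    }

A correct implementation would, per the docs, report a value roughly one packet length earlier than the "before" reading.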

This story has a happy ending though. I grabbed PortAudio, hacked its Windows low-level implementation to return the actual timestamp based on the sound card register that points to the FIFO buffer, and managed to get sub-ms precision.

So if macOS audio APIs are as good compared to Windows (which is a dumpster fire unless one uses some custom, manufacturer-dependent stuff) as iOS APIs are compared to Android (where only Pixel phones are not dumpster fires), I can easily see why Macs are the devices of choice.


Nothing about Windows audio APIs actually stops them from being good enough to do a production on.

Obviously it's not Broadway, but I ran the sound for school productions for years, and we were quite happily able to do live audio I/O (increasingly DSP-based as the years went on) on literally the worst Windows machines you could imagine.

The whole idea that the OS really makes any difference, especially for being "creative", is just a placebo.


On a conceptual level they seem all right. The downside is that they don't actually work as documented.

If Mac APIs actually work as advertised, then the difference is that Windows programs either work weirdly (bad sync, etc.) or their developers have had to spend tons of time writing custom code to work around the badness of the "professional" audio API, costing way more.

And another example is Android. It's definitely not placebo that it tends to have horrendous latency: originally due to design, and nowadays due to bad vendor implementations, with only Pixel phones reaching iPhone levels of latency. We're talking about latency of only a few ms on iPhones versus tens of ms or even more on the average Android phone. That does actually matter in music production.


Mac APIs have indeed worked as advertised for many years.

My own concern is watching Apple pursue a path of driving upgrade purchases through (A) increased performance, (B) breaking older systems, or (C) disqualifying them from using current software.

C is easy and practiced intensely by Apple, which abandons support for older stuff VERY REGULARLY in Xcode. This may or may not be better than allowing it to rot and become deeply broken through lack of maintenance, but it's a choice, and Apple repeatedly chooses to throw away even the possibility of supporting older machines.

B happens through rot: things get complicated, and if they don't care what happens to you AND they are changing your machine out from under you, they can just randomly brick it one day and not be a bit sad about it. It was your fault for not buying newer things, regularly.

A is also something Apple's been capable of. You buy into that, and if you stay on the bleeding edge, Apple's become pretty good at keeping you riding the wave of the best computers can do at any given moment. This also (to some extent) helps the older stuff become more affordable as it's left behind: that's positive in its way. 'Bleeding edge' is not the only kind of functionality to have. I've noticed that for music production testing of Apple Silicon, all the test cases are completely unrealistic: 4000 tracks, each of which has 10 Space Designers, etc., etc. That means the use case is a solved problem: you don't need the new Mac to do it at any reasonable level. 8K feature film video on the desktop, yeah, you can still need bleeding edge for that. Music, no, not at all.

This is also why it's important that Apple not break its own APIs or let them rot. On the whole, the functionality just works, every time, no matter what. This is a serious thing to risk by allowing the platform to become less reliable due to needing to 'churn' it and sell new generations of machines.


How much does it cost to build an Intel/AMD computer with performance comparable to an M1 Mac mini?


The author talks about his past experience, so did I. Back in 2014, the 1500 Euro Mac was beaten in image manipulation workloads (Adobe stuff) by a PC in the same price range.

Today, for a proper answer I'd have to check specific benchmarks and compare prices. If noise is no problem (e.g. a separate tech room backstage, where the power amps live), used server hardware could be a reliable and performant option on the cheap.


Windows is a complete mess these days from a usability perspective, and even Microsoft doesn't seem to care much about it. As for Linux, it's rock stable once you get it running (at least the OS itself), but lack of software support and other quirks make it less desirable on the desktop.


Rock stable?


Solid as a rock?


If performance is really a problem, there were faster Intel Macs than a mini, as well.


True, but form factor plays into this a lot. You can fit two minis on a single 19” rack shelf, whereas trying to use an iMac or a laptop isn’t as convenient.

If they made a smaller and much cheaper entry-level Mac Pro that had a rack mount kit to fit in 2U or something, that would be amazing for production.


The current Mac Pro is available in a 4U rack form factor for US$500 more.

Click the “Buy” link: https://www.apple.com/mac-pro/


Just two of those blow through the 10k budget for two complete redundant sound systems given as an example in the article.


That is true. A Mac mini is much more cost-effective (in processing power per volume). But if you're getting a Mac Pro, it's because you either (1) need a Mac Pro, or (2) have enough money to blow that much on the fancy new stuff.


None in the form factor/price range we need. Mac mini is unique.


Entertainers are not Engineers.

They fall for marketing significantly more easily and use Veblen goods to signal wealth and power.

Then they get used to a system and it's all they know. It's more effort to change and they get locked in.


Out of the FAANG companies I can’t speak for Facebook and Netflix, but I can assure you that the engineers at Apple, Amazon, and Google have a broad preference for macOS and Apple laptops. Since the pandemic even the engineers using desktop Linux have mostly transitioned to Apple laptop + remote Linux server.


Software engineers are not Engineers if we use the strict definition of an Engineer as an applied scientist.

Software engineers are forced to rely on tradition and authority due to abstraction.

I say this as a programmer.


Applied scientist is not and has never been the definition of an engineer. Dating back to the term’s earliest usage, predating both modern English and the scientific method, it has meant someone who designs and maintains complex machines.


Don't be ridiculous. The designers at the highest level of entertainment are extremely technical.



