I know being expandable is important here but I am stunned by the fact that an iMac Pro actually comes with better SSD and Video Card for less money and you get a screen to boot!
256g SSD is shameful, I don't care what they were thinking. No Nvidia support mentioned so I know a few that just wrote it off for that oversight alone. They will probably hold out just to be sure but that was the hope for them and others.
While I was never in the market for a Mac Pro (though an iMac Pro as a jump over the top-tier iMac is not out of the question), I never expected such low starting specs for SSD and video at this price point. The 580X is old, 2017 old. If this is what they are stuck with by not using Nvidia, they have done everyone a disservice.
* did they spec this out in 2017 when they announced it and just lock it in?
* edit: wrong on price, the iMac Pro is $1,000 less than the base Mac Pro
The first step up from 256GB is 2x512GB, and there's configurations up to 2x2TB. My guess is that the 256GB configuration is meant for users that work primarily with data that's kept on network storage.
There's no professional that would buy a machine with 256GB of storage. Let's call it what it is - a throwaway configuration meant to hit a price point.
If there's a single truth of Hacker News Apple threads, it's that someone will eventually say "no professional would ever..." and then immediately be rebutted by multiple professionals claiming they indeed would "...", and not only would they "...", not doing "..." would be unusual and even undesirable.
Every machine I've worked on, bought, or built has had 256 or 500GB. We only run the OS/apps on the main drive, then a second drive for scratch disk, and then SAN/NAS storage for the media. It's VERY normal.
I've found that 256G is 'not quite enough', so my old Macbook has 256G and I moved my media folder to one of those flush-mount microSD card adapters. It's read-mostly so who cares about write cycles?
We only store the OS and apps, so anytime the storage is full it's usually because someone hasn't emptied their downloads folder or they're incorrectly caching to the OS drive.
My experience in personal use is that data loss with microSD is extremely common, much worse than, say, magnetic disks (and certainly much worse than SSD).
I'm not sure Lightroom users are the best use case for a Mac Pro; if you're working on something that requires that kind of speed/power, you're probably on something like Photo Mechanic Plus and/or Capture One Pro.
Lightroom lets you move the cache to another drive...
Besides, perhaps you're the user that needs the multi-TB version. Others, e.g. video professionals, won't keep huge tens-of-TB source files on their main system drive.
It should, but with a 24MP camera you can create ~50GB of compressed RAW files per session pretty easily, so a TB of storage won't do.
If you add a high-performance spinning disk, its performance won't be enough and it will be noisy even when idle.
In the end, it’s better to not add anything and let professionals bring their own storage into the mix.
At the end of the day this won't be sold to beginners with no files. The buyers will have their own resilient storage, and the internal drive won't be used much anyway.
Unless it’s configured as a developer workstation, but it’s generally MacBook Pro’s role.
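To put rough numbers on the ~50GB-per-session claim above, here's a back-of-the-envelope sketch (illustrative figures, not from the thread; one 24MP compressed RAW is assumed to be ~28MB):

```python
# Back-of-the-envelope storage math for a photo workflow.
# Assumption (not from the thread): one 24 MP compressed RAW ~= 28 MB.
RAW_MB = 28
SESSION_GB = 50        # claimed output of one shooting session
DRIVE_TB = 1

frames_per_session = SESSION_GB * 1024 // RAW_MB    # frames in one session's output
sessions_per_drive = DRIVE_TB * 1024 // SESSION_GB  # sessions before a 1 TB drive fills

print(frames_per_session)  # ~1800 frames per session
print(sessions_per_drive)  # ~20 sessions before the drive is full
```

At those assumed sizes, a 1TB drive holds only about twenty sessions before you're shuffling files, which is the point being made.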
So I'm a web dev, but in a previous role we bought a 2-man company that made medical renders. I ended up making them a mini-SAS RAID 5 array out of SSDs. This was before M.2 started taking off. That thing did 1.1GB/s write on a little tabletop with terabytes of storage. It was over 20x more than they were used to.
There are plenty of cheap NVMe SSDs that can push 3.5GB/s (Samsung 970, ADATA XPG). Even with dual 10GbE NICs you can't match that, nor the low access times of local storage.
Lots of animation / multimedia houses use workstations with 40Gbps+ adapters (40GbE or Infiniband) connecting to network storage.
It's done this way so people can connect to the storage that's appropriate for the task at hand.
Different projects can be stored on different SAN/NAS arrays, each spec'ed out according to the size/needs of the project.
e.g. a short-run animation doesn't need the same storage capacity as a full-length feature film. They may have similar throughput needs, though. (Summarising here, but the general concept is ok.)
So, let's say someone is a Flame editor (Flame generally has high bandwidth needs). They're working on project A today, so they connect to the storage being used for that project from their workstation. The next day they might be working on a different project, so they'll connect to a different set of storage.
Other people using different software will connect to the same storage for their tasks, but can have different latency/throughput needs.
Obviously, this isn't the approach taken by single person multimedia er... "houses". ;)
It's usually direct-attached storage, not network-attached. They use USB-C or Thunderbolt cables to join to a RAID storage device, and then backup/archive to a network based storage pool later.
I rarely see direct-attached storage anymore. It's too cost-effective from a media-management standpoint to not just go right to 10GbE RJ45 or fiber network storage. The only direct storage I see is when speed is absolutely critical, and that's very rare, mostly just high-end 3D stuff.
Got it. I'm a few years removed but fast DAS raid boxes for each workstation were common with work product being synced to a network share. Looks like the NICs and SANs are fast enough now to run everything off the network.
We are past that now; new PCIe 4.0 SSDs have just been showcased along with the new AMD chips, and they can do 5GB/s read and a bit above 4GB/s write (AMD is rumored to have invested in the R&D of the controller). You'd need 40GbE to match a single one, and EPYC Rome, also scheduled for this fall, will have 160 lanes, allowing for dozens of them. You could very easily reach 100GB/s read, which no network will match.
>> You could very easily reach 100 GByte/s read which no network will match.
> High end networking gear already has higher throughput:
100GB/s = 800Gbps > 200Gbps (per port)
You would need 4x 200Gbps ports to reach 100GB/s, so 2x MCX653105A-ECAT (each 2x 16 lanes) at > $700 each, and pay for 1/10th of a ~$30,000 switch. IOW 100GB/s would cost you ~$4,400, before paying for the storage.
Sure, it could be done, but it wouldn't be cheap, and you'll have used most of the PCIe lanes.
Nah, a motherboard with enough M.2 connectors could easily exist. Or, U.2 or OCuLink. We have already seen 1P EPYC servers with six OCULink connectors...
Twin dual-port ConnectX-6 adapters give you 800Gbps, or ~100GB/s, at an absolute theoretical max.
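Since this subthread mixes GB/s and Gbps freely, a quick conversion sketch may help (bits-to-bytes only; real-world protocol overhead is ignored):

```python
def gbps_to_GBps(gbps: float) -> float:
    """Link speed in gigabits/s -> gigabytes/s (8 bits per byte)."""
    return gbps / 8

# One 200 Gbps port moves at most 25 GB/s:
print(gbps_to_GBps(200))      # 25.0
# Four 200 Gbps ports (or twin dual-port 200 Gbps adapters) reach the
# hypothetical 100 GB/s local-read figure discussed above:
print(4 * gbps_to_GBps(200))  # 100.0
```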
It's good to see that local storage has finally returned to the reasonable state of being faster than network storage. SATA / SAS was a long, slow period ...
A typical product aimed at mid-range video producers, the G-Speed Shuttle SSD, can do up to 2800MB/s. That's 32TB of local Thunderbolt 3 attached SSD storage.
Mind you, you'll pay $15K for it, but if you're in that business you can well afford it even if you're not a top tier Hollywood production shop.
Given that your storage array costs that much, the fully loaded Mac Pro price (somewhere in the $20K range?) is not that outrageous. The people who use Red cameras and G-Tech storage arrays are the Mac Pro demographic Apple is going for here.
Disclaimer: I used to work with G-Tech but no longer there.
Prices have dropped then, because the G-Speed Shuttle I use is 96TB and doesn't cost half that much. I've also used almost every model, and in the real world you don't ever get close to the advertised R/W speeds on those. Plus, when the volume gets full it will drop to <100MB/s write.
They are popular though. I see them a lot, but I have had very little success with them over the years.
I'm just speculating here, but I assume that online storage isn't faster, but it may be bigger, online, easier for IT to manage, and fast enough. It also seems likely that some employers are worried about engineers or creatives just walking out with commercially sensitive data.
This. Have a look at the various Linus Tech Tips videos on their ingestion and editing workflow. There's a reason they upgraded their network to 10GbE, and it wasn't for fun.
For a home lab or dev/test gear, where getting stuff off eBay is an OK thing to do, older Mellanox cards can commonly be found for under $100. Sometimes much cheaper, depending on what the seller is doing. :)
Looking quickly at the US Ebay just now, here are some older single port ConnectX-2 cards:
US $23.00 each, free shipping. Note - I don't know the seller at all, this is just from a quick look on Ebay.
There's a tonne of stuff on there. ConnectX-2 is no longer supported by Mellanox, though the cards themselves are generally pretty bulletproof.
Drivers for Linux and FreeBSD come with the OS. :)
Drivers for Windows are a bit more complicated. Mellanox has an archive section with the ConnectX-2 drivers, which work for most people.
Cabling directly between two cards - instead of going via a switch - is pretty common for people just trying out the tech. It lets them plug one card into (say) a FreeNAS server, with the other card in their desktop or workstation. Removes the ~120MB/s limit of 1GbE, assuming any kind of reasonable disks in the connected NAS. :)
If this kind of thing is of interest, probably the best place on the whole internet (not joking) for this stuff is Serve The Homes' Networking forum:
I can have a 100TB volume for under $10,000 that multiple editors can R/W to over 10GbE. We probably average about 400-600MB/s to a client on a simple setup in real-world situations, which is fast enough for a few editors to work on multiple streams, depending on the server. It's easier to backup/archive from an IT standpoint.
Most people working solo don't need to access hundreds of TBs though. They're fine with local or thunderbolt attached storage devices.
My OS drive is for the OS and applications I don't care about. 128GB is plenty tbh.
I really can't imagine this world where "professionals" keep everything on a single drive and are primarily concerned with capacity of that single drive.
My gaming computer has three SSDs, including a 120GB for the OS, and a pair of magnetic drives.
My 2011 Air has 128GB and I'm always having to clean it up to install updates. Yes, it has years of cruft, and some userland stuff, but it's really just a netbook for me, so I don't keep much on it.
I find 256GB fine for writing apps. I think more space would be a waste compared to a faster processor to speed up my compile times. If I had infinite money I'd probably put any additional space on extra drives as a RAID array to help with IO read bottlenecks during compiling since I don't have any use for more.
I totally understand gamers need more space, people who collect movies and videos need space, graphic designers need more space. Maybe even Android OS developers need it due to the Android Open Source Project, AOSP, having like 100 repos you need to clone all at once. I don't need it as an app developer, though.
Apple has a history of offering small base storage/memory options, and then charging more for the upgrade versions than they physically cost to produce. It's how they make different price level products when the products are all fundamentally the same. This lets them cater to poorer customers than they could otherwise since the rich ones buy the overpriced upgrades and subsidize the poorer customers' hardware. So it isn't all bad.
Most often (depending on where we are in the SSD tech cycle), larger SSDs are faster (see the new PCIe 4.0 controllers, which need a number of NAND chips for best performance). A 256GB drive has a high probability of not being the best choice for compilation. A 2018 MacRumors thread shows:
Well, I know plenty of developers who opted for the 256GB MacBook Pro models and they're fine with it, myself included. Sure, I'd prefer more storage if I can have it, but I can definitely live with this and still have plenty of space.
I consider myself a professional (at least in that I get paid to do what I do), and I don't think my work hard drive has more than probably 40GB used on it. So 256GB seems like excessively plenty.
I never use the OS drive for storage, and unless it's a requirement to run, I don't load apps on the system drive either. I normally have at least 3 drives in my computer: OS/Apps/Data. With imaging software, replacing one piece of the 3 is damn simple, and there is a better chance of ransomware not taking out your data. So I would definitely use and keep the 256GB drive.
A lot of people are disagreeing with you but I certainly concur. Sure, there are ways to work around it, but this is a $6,000 computer and you could get more storage than this in the 2004 iMac.
I refuse to believe that meticulously emptying out your trash and downloads folder so that you can continue to squeeze by with 256GB is now acceptable for ultra high end, "I don't know a single person who can afford this" computing.
I'd buy one in a heartbeat, I don't honestly give a shit what storage comes with the system. I've got a separate, dedicated system for storage that's got all kinds of backup/redundancy/failover/etc. configured.
If you're a developer on a large (multi-GB) codebase, a beefy multicore system can be great for quick compile times, but hefty storage may not be a necessity.
Or is this meant to be the "I just want to buy my own disks separately model", with 256GB drive just there to allow OS to be pre-installed?
EDIT: I see now there are only two internal drive bays, so that would mean tossing the 256GB that comes with it if you wanted to max out the internal storage.
> I see now there are only two internal drive bays, so that would mean tossing the 256GB that comes with it if you wanted to max out the internal storage.
Or using some of the several PCIe slots to add lots more SSDs. It looks like the default configuration has enough spare PCIe slots and lanes to accommodate 8 M.2 SSDs in suitable risers. That can get you another 16TB for ~$1.8k (consumer-grade SSDs).
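The riser math above, spelled out (the per-TB price is an assumption based on rough 2019 consumer-NVMe street pricing, not a quote):

```python
# Filling spare PCIe slots with M.2 risers: capacity and rough cost.
SSDS = 8           # M.2 SSDs across the riser cards
TB_PER_SSD = 2
USD_PER_TB = 110   # assumed consumer NVMe street price

total_tb = SSDS * TB_PER_SSD
total_usd = total_tb * USD_PER_TB
print(total_tb, total_usd)  # 16 TB for $1760, in the ballpark of ~$1.8k
```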
The 256GB model is an old Apple trick to show a lower starting price (which hardly anyone would buy these days) that quickly goes up as you upgrade the components.
Right. If you're working with multi-terabyte files that live on a NAS or an external directly-attached storage, the local drive is basically a cache of the OS plus whatever you're physically working on at that moment.
I find that link aggregation is pretty much useless and/or unstable, so I'm not sure what good two 10GbE ports do. But seeing 10GbE become part of the standard spec is a really welcome improvement, not least because it will drive prices down.
I'm not mad that it's 10GBASE-T, but I'd personally rather 10G over SFP+. The sorts of shops that can afford these probably already have some pretty burly networking infrastructure, and 10GBASE-T is both expensive and (IME) less reliable.
The encoding/decoding capacity of Ethernet-over-twisted-pair seems to be reaching a point of sharply diminishing returns and the hardware for 10GBASE-T is really expensive for what's been, in my experience, a less reliable experience than SFP+. I'd rather tell somebody who needs 10GBASE-T to use an RJ-45 adapter than to have to go grab yet another Mellanox card to stick in a computer this expensive.
But isn't SFP+ incompatible with 1G (i.e. only usable with 10G network equipment)? If you are selling a machine that people will plug into either a 1GbE or 10GbE network, you'd rather have an interface that can downgrade.
You can use an SFP module in an SFP+ socket, but not the other way around. The downgrade to 1Gb would work fine.
Since this is still a desk-focused machine vs. a datacenter one, I'd expect most offices to be wired with Cat 6 Ethernet, which can do 10Gb if needed, vs. SFP+ cables or SFP+ to fiber.
1 x 10 GbE for your regular Ethernet/IP traffic and 1 x 10 GbE for your storage traffic. You do run your network storage over a physically separate network, right?
I don't know. Intuitively I would have thought that a NAS that updates itself automatically, with SMB1 disabled, is less dangerous than a NAS that is in theory insulated from the internet but not patched for security vulnerabilities (since it's not connected), while connected to machines that are on the internet and could be infected.
I am not sure what “a physically separate network” means when it is used by the same computer. (Also, what about accessing your storage over WiFi? Or, you have to have cables strewn all over the place?)
For a lot of "pro" usage scenarios, the local environment will have one "LAN" that has Internet gateway, and various local services on it, plus a dedicated storage network using a separate switching infrastructure (and thus, separate NIC port on your machine).
WiFi is not really an option, for performance but also often for security reasons.
So yes, you'll have two cables from wall-jacks to the machine for networking.
I tried between two enterprise dlink switches, and also between a synology NAS and a dlink switch.
Link aggregation in general only provides a performance benefit with multiple connections to multiple machines, so using link aggregation between one client (Mac) and one NAS will likely result in zero performance improvement (the packets will only use one of the cables). It only makes sense if you have two or more NASes that you want to access simultaneously (or two or more clients accessing the same NAS, but that wouldn't be a use case for 2x 10GbE ports on a client).
Synology also supports balance-slb bonding, which in theory gets around this single-connection restriction. However, I ran into some connection problems with some Windows clients. I never got to the bottom of them, but they went away when I disabled the bonding.
In any case, it is hard to saturate a 10gbe connection with a single NAS, unless it is packed with SSDs, which I wouldn't assume for mass storage. So I am not sure there is much value in aggregating the links in the first place.
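A toy sketch of why link aggregation doesn't help a single client-to-NAS connection: the egress link is chosen by hashing flow fields (addresses/ports), so every packet of one flow lands on the same physical link. This is illustrative only, not a real switch implementation:

```python
def egress_link(src: str, dst: str, n_links: int) -> int:
    # Real switches hash MAC/IP/port tuples; Python's hash stands in here.
    return hash((src, dst)) % n_links

# Simulate 1000 packets of one client<->NAS flow over a 2-link bond:
links_used = {egress_link("client", "nas", 2) for _ in range(1000)}
print(len(links_used))  # 1 -- the whole flow rides a single link
```

Only when multiple distinct flows exist (different hosts or sessions) do the hashes spread across links, which is why aggregation pays off for a server with many clients but not for one editor talking to one NAS.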
D-Link gear isn't exactly what I'd judge networking standards on; they work, I suppose, but they're hardly what I would install in even a small business office.
I have multiple LACP bonds on my Juniper EX2200 at home working without issue, though the single stream limits you mentioned are the one thing LACP can’t fix.
"enterprise" DLink switches aren't really a thing yet, regardless of what their marketing team wants to brand them as. :(
Cisco, HPE, etc have "enterprise" switches. DLink might be in a decade.
> it is hard to saturate a 10gbe connection with a single NAS, unless it is packed with SSDs
No, it's just a matter of having enough spindles behind it.
As a rough guide, with a (say) average spinning rust HDD able to push out 100MB/s when reading, you'd only need 10 such drives to push out 1000MB/s (raw).
In the real world, you need extra spindles as some of the data being pushed out is just internal checksum/redundancy, and doesn't go over the network.
But for reading back large files in mostly sequential access, you'll hit 1GB/s from about 10 drives onwards pretty easily. More drives, more throughput.
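The spindle arithmetic above as a sketch (assumed 100MB/s sequential read per drive; redundancy overhead modeled crudely as whole drives that don't contribute client-visible throughput):

```python
PER_DRIVE_MBps = 100  # assumed sequential read of one average HDD

def raw_read_MBps(n_drives: int) -> int:
    return n_drives * PER_DRIVE_MBps

def client_read_MBps(n_drives: int, parity_drives: int) -> int:
    # Redundancy data read from parity spindles never crosses the network.
    return (n_drives - parity_drives) * PER_DRIVE_MBps

print(raw_read_MBps(10))        # 1000 MB/s raw from 10 drives
print(client_read_MBps(12, 2))  # 12 drives less 2 for parity still nets 1000 MB/s
```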
I would defer to people with more enterprise hardware experience than me for serious NAS setups, but my experience with various generations of 12-disk Synology NASes is that you lose a lot of performance to disk vibrations / inefficiencies of the RAID implementation / sync between drives / TCP, etc. So I don't think it scales linearly. With a Synology DS3615 and 12 HGST Helium drives in RAID 5, I barely get over 1GB/s locally, while each drive individually is capable of over 200MB/s sustained.
Yeah, no idea with Synology. When I was originally looking at NAS solutions, they (and QNAP) just seemed expensive for not much product.
Went with FreeNAS instead, as I was already very familiar with building systems, it's based on FreeBSD (OSS), and it gives better tuning on the higher end.
LACP does generally "just work"; the problem is when you want one machine or session to be able to max multiple links. The solution here remains the same as it always has been: multipath. I hope Apple has added SMB multichannel support for these users since I last checked.
Even without link aggregation (which does generally work quite well), this would allow you to have a dedicated SAN + network solution, which is often very useful.
It's funny how many people here who claim to use a SAN are, for sure, actually using a NAS.
Having maintained large image-storage FC SANs with Apple Xserve and IBM DS for some years: this is no fun to administer for peak performance, and I would assume most smaller shops have NAS instead of SANs (no argument that larger shops have SANs, as we did driving ten photo studios).
Considering what a price tax Apple has had on internal storage, "lower cost" is a bit far-fetched.
Even the old 5400 RPM HDDs cost an arm and a leg if you bought it pre-installed into your MacBook. This was back in the days where you could unscrew the lid on your MacBook Pro with a standard screwdriver and replace the internal memory.
It's not a transparency issue, this dark pattern exists everywhere. You advertise "New X starting at $x" which is low enough to attract customers. But the thing you sell for $x is actually lacking in some small but important capacity making it unattractive for actual use. Then you sell an upgrade for the target price and customers are more willing to pay because it feels like they're paying a small premium for the thing they want.
It's a neat psychology trick. Customers are far more receptive to upsells when they're the one doing the upselling.
Where do you think the 'switch' is in that? At what point do they switch something from what was previously offered to something not expected? I don't think there is any switch. The specs are presented as up-front as any hardware spec is.
The switch is that you're some way through the "funnel" before you discover that to get a decent spec machine you need to spend a lot of money. It's the "power of commitment" sales trick.
I don't agree - it says how much storage each version comes with up front in large font on the marketing page https://www.apple.com/mac-pro/specs/. It even lists the 256GB SSD option first, as the default option.
Your assumption requires that no one actually wants or will buy the 256GB option. I for one would definitely consider it; all my current computers have only 256GB and I haven't had issues (external hard drives are cheap).
For development, I'm actually perfectly fine with 256GB SSD. And I don't care about the video card much at all.
Still, I don't see this as a developer machine — I think an iMac Pro is a much better value proposition for developers, and in my case, where I really mostly care about single-threaded performance for interactive development in Clojure and ClojureScript, I'm looking at the iMac.
On the flip side over in Linux land I built an 8 core 16T desktop with 64GB of RAM and two 27” 4K HDR monitors for less than half the price of the base Mac Pro.
Oh and it has an RTX2080 which beats an RX580 so badly it’s practically attempted murder.
I get the Apple tax, I really do (typing this on a 5th-gen mini), but the base model is hilariously expensive in a world where I can buy a 12C/24T CPU for $499 that beats a $1,200 Intel CPU, making it obsolete before it's released (and that 12-core will drop straight in).
They'll sell, Apple stuff always does, but unless you need macOS for some reason I don't see why.
The new Mac Pro has a ton more PCIe lanes and DDR4 channels. It's a completely different league, it can't be directly compared to mainstream platforms that only have dual-channel and 16-24 lanes.
EPYC would've been a better value, but it's still not cheap. Big computers with lots of memory and I/O capacity will always look ridiculously expensive next to mainstream desktop.
Threadripper exists for PCIe + DDR4 (less than the reported 2TiB because of no RDIMM support, but it could theoretically hit 2TiB if people start producing 256GB UDIMMs).
But... 64 lanes of PCIe 3.0. 32 cores. 64 threads with SMT. 80MB of cache (think of all the locality!). 256GB of RAM support (if you're using more, you're probably doing scientific compute and you're probably better off on Linux anyway, I'd assume). ECC support. 1700 USD. Quad channel RAM.
Xeon W-3175X? 48 lanes of PCIe 3.0. 28 cores. Probably no more Hyperthreading after Zombieload. 512GB RAM support. ECC support. $3000 USD. Hexa channel RAM.
And better yet? Competent TR motherboards with all the RAM/GPU/whatever support you need go as low as 400 CAD.
Also, if you're less Mr. Moneybags, the 2920X exists. 12C/24T, same memory and IO capacity. 650USD.
But wait, there's more! TR 1900X is older, but: 8C/16T, 64 PCIe 3.0 lanes, quad channel RAM. Same kind of memory support: 256GB. 300USD.
A far shot from "ridiculously expensive" considering 300 USD (or even 650 USD) is less than some mainstream desktop CPUs. The 9900K is 490USD. The 9980XE is 2000USD.
Also, TR3 with PCIe 4.0 is on the horizon, and Zen 2 with PCIe 4.0 is here. 24 lanes of PCIe 4.0 has equivalent bandwidth to 48 lanes of PCIe 3.0: same as the Xeon W-3175X.
Big computers with lots of memory and IO capacity can be decently cost effective. You just can't ask Intel.
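A sketch of the lane-count equivalence claimed a couple of comments up: per-lane throughput roughly doubles each PCIe generation (the figures below are approximate usable GB/s per lane after encoding overhead):

```python
GBps_PER_LANE = {3: 0.985, 4: 1.969}  # approx. usable GB/s per lane

def total_GBps(gen: int, lanes: int) -> float:
    return GBps_PER_LANE[gen] * lanes

# 24 lanes of PCIe 4.0 carry about as much as 48 lanes of PCIe 3.0
# (the Xeon W-3175X's lane count):
print(round(total_GBps(4, 24), 1))  # ~47.3
print(round(total_GBps(3, 48), 1))  # ~47.3
```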
Don't worry, by the time this baby ships, you'll be able to get Ryzen 3900X's that are PCIe 4.0 instead of the Mac Pro's PCIe 3.0. Only 24 lanes, but that's all you'll need.
Do you have a quote for that? It was true for a while on cost/performance, because the previous Mac Pro was never upgraded, but outside of that anomalous period the major players tend to be fairly competitive if you're comparing equivalent parts.
Well, that's cool, but that's comparing Apples to Linux-Oranges, I guess.
I do have a Linux box with a fast Intel CPU and it's nice and fast, but a Mac it ain't. I'm happy to pay more for a Mac machine, if only for the fact that I get an OS with working copy/paste in all applications. I won't be buying the Mac Pro anytime soon, though, I'm not willing to pay that much. But there are people who are and I'm glad there is something being offered there.
>if only for the fact that I get an OS with working copy/paste in all applications
That's such an oddly specific reason to require MacOS. Can you say more? I can't say I've ever had a problem with copy/paste in Linux, in any application.
If this issue were serious enough to me to consider doubling my computer's price tag, I'd at least first look at Hackintoshing, which tends to be pretty solid these days when you can pick out specific hardware in advance.
I suspect the issue is that in Linux the copy/paste commands are specific to each application, which can be annoying when copy/pasting between multiple applications without a lot of configuring.
Hmm, that sounds like a possibility, though as a practical matter every UI application I can remember using on Linux uses Ctrl-C / Ctrl-V, and terminal applications use Ctrl-Shift-C / Ctrl-Shift-V - because Ctrl is for sending signals to applications (like Ctrl-C). That's universal enough that I don't see any issue. (Actually I prefer it to the way Apple separates Cmd and Ctrl, which I find infuriating because of the awkward finger positioning it requires.)
Not to come out over-defensive, but your base configuration (I guess?) doesn't include a GPU, I bet your caches are significantly smaller, and also no ECC ram. I also bet your storage is considerably slower.
No, that price included an RTX2080, which is a pretty decent card and better than the base config in the new Mac Pro.
The platform supports ECC, as all Ryzens do, though it's on the board vendors to support it officially.
Cache size, no idea, but I will note that the new 3900X would obliterate the Skylake Xeon in the Mac Pro base config (12 cores vs 8, and comparable or better IPC). It's an older 8-core architecture against AMD's new best consumer processor, and it's $499 vs Intel's nearest equivalent at $1,100. It's not even remotely close at the moment.
My point was that ECC ram is twice as expensive. Cache is super expensive (your consumer chip will have significantly less), so are PCI lanes (again, consumer chips have significantly fewer of these), and also your Ryzen chip doesn't have AVX-512 support which is important for e.g. video work.
>On the flip side over in Linux land I built an 8 core 16T desktop with 64GB of RAM and two 27” 4K HDR monitors for less than half the price of the base Mac Pro.
I am betting that at half the price of the Mac Pro, it doesn't include support for ECC, hence the memory isn't ECC, the CPU isn't server-grade (even on the AMD side you'd need at least EPYC), the motherboard isn't server-grade with fewer PCIe slots, and it lacks a decent case with a decent power supply. Yes, Ryzen supports ECC, but it is not validated as such, and board vendors have to do their own testing as well. And testing is expensive (hence why server-grade CPUs are expensive).
Honestly, I love AMD and loathe Intel. But this kind of comparison, which is all over the internet, is like saying I could get a 500hp Nissan GT-R over a 500hp Ferrari at half the price. Why do we never see that argument on car forums, only on tech forums?
Because the Xeon-W CPUs reduce frequency per core as you increase your usage to more and more cores. The 18-core version still has 4.3 GHz single-core turbo frequency.
My workflow is mostly multicore so to me the iMac Pro is clearly superior.
What's the general split between laptops and desktops for developer use? Certainly I've migrated to 100% laptop which gets regularly used in a variety of locations.
Everything about the base specs are just good enough. I’m not defending it per se, but if you spec’ed the machine for any given purpose, whether software dev, or rendering, it’d probably raise the cost by at least $1-2k, and then you’d have the “real” machine. So, the reason the price looks egregious is because most of it is just base cost.
This is a computer made for the engineers who’ve already “made it,” who are making that $500k a year and are looking for a Porsche over a Corvette.
It's designed to be useful for more than one kind of "Pro". The old Mac Pro, for instance, seemed to completely forget about music studios and their professional requirements for Macs. This new machine may seem like overkill to software developers, but as an audio engineer, it's perfect.
I don't even want a 1TB SSD in it; the 256 is perfect to hold the OS, a few DAWs, and all the plugins I could ever want. Everything else gets saved to drives in a toaster anyway. A rackmountable unit with a ton of PCIe slots for HDX/Dante cards was on my Christmas list, and I'm not alone; there's a reason they made a point of showing how many HDX cards it can fit in their presentation.
It also looks like an amazing workstation for video editors. I really don't think it's designed for software engineers who make 500k a year.
256 is not enough for a serious main drive in a DAW. Sample libraries should all be on the fastest drive; there are single instruments that take up 50GB. And consider that most studios are recording in 24 or 32 bits at sample rates higher than 44.1kHz. 1TB is probably enough for a music production system, although I'd personally prefer larger so that I don't have to be swapping things around all the time.
Maybe you haven’t looked around in a while; the toasters are Thunderbolt-attached now, and they take (en-cartridged) NVMe SSDs. There’s nothing slow or high-latency about that. Copy your assets over to your project disk from your NAS at the start of a new project, and then forget about it.
Alternatively, forget hotswap and use a Thunderbolt DAS with RAID6. Burn your projects from your DAS to a portable SSD when you want to pass them over. Only takes a minute or two.
In addition to computing I also dabble in woodworking, where there are tools in the 'corvette or porsche' classification that everyone drools over, and those guys can spend way more money on tools at a lower salary than I do for 'fancy' Macbooks. Of course, their tools last 3x longer if taken care of, but the outlay can still be breathtaking.
"A good craftsman doesn't blame his tools" isn't a warning against complaining. It's a warning about picking bad tools in the first place and scapegoating them instead of accepting that it was your decision all along.
Download and install the app "Blind" to be shamefully informed of how many there are. Seemingly there are either a very large number of them, or lots of SWEs who like to lie. A lot of it seems to be stock options from FAANG and Uber.
I've been on the internet. I know people lie. The OKCupid blog found that "There are consistently 4x the number of people making $100K a year than there should be" [1]. They also lie about their height, and what they look like.
What reason should I have for trusting anonymous self-reported data in a category where people are known to exaggerate?
Yeah, so... plenty of software engineers are making $500k a year. That is total compensation; you should expect half of that to come from non-salary things like stock options and bonuses.
Maybe some people are lying, but that seems about right to me for actual senior people (leading projects, maybe managing people).
My last year at Google, my W2 income was in the area of $300,000. I was a "level 5" with good performance reviews, and the scale goes up to 9. I sold all my stock the second it was issued ("autosale"), so the W2 income is pretty close to the amount of cash I got.
Programmers focused on the right task are worth their weight in gold. There are very few fields where an hour of time put in can save society as a whole thousands of hours. Software engineering is one of those, and we get to skim off a little bit of that value we created in the form of cash.
There is also much software that is complete garbage. If the ones with $500k TC are writing decent software, then those making $100k may be writing the garbage.
Depends a lot on where they're working. I know more than one person with 15+ years experience who was making around 100k/year, then moved to a FAANG and hit > 500k total compensation in just 2-3 years.
For OkCupid there's a clear motivation for exaggerating things like height, looks, and income. What motivation does one have for anonymously posting an exaggerated income? Are there really that many trolls who want to depress those making less than $500k/yr?
People lie in anonymous communities all the time. One motivation would be to impress other (perceived-to-be) successful members so as to be asked for stories to tell, for advice, etc.
I don’t know why people are like that, but anonymity thins out the middle group of semi-/socially-truthful people by providing an opportunity to be much more honest or much less honest than is normally possible.
$500k is achievable total compensation at a few companies for some people, though.
Yes. Move to Vermont or Michigan, get 3 of those new monitors, plus this machine, and bask in the glory as they warm your room in the winter with the heat turned off.
I was tempted to make a joke about the Porsche or Corvette being luxury vehicles, but I've noticed we spend way too much time nitpicking over fine details. It deflects from the thesis to do so and I'm not a fan. I can understand GP's point just fine without getting into quibbles over that.
Instead I took umbrage with the idea that a $3000 laptop which is our primary tool is a luxury item. I think it's one of many signs that we're a bunch of cheapskates. Other industries have different perspectives on this.
While I tend towards being a cheapskate on many physical things, I also understand diminishing returns; once you hop to the steep side of the price-performance curve, I'd define that as a luxury. For me, I get a lot more value out of a laptop and accompanying software ecosystem that helps me be more productive in my typical development cycle, and a lot of the stuff at the OS level and above is pretty subjective and context-sensitive to that cycle.
From an overall productivity standpoint, the biggest bottleneck to programmer productivity is mental and physical health. The data would imply that I should spend more on exercise equipment and a better chair that keeps me from getting injured than on a laptop that compiles maybe 10% faster when my builds already take less than 30 seconds (incremental compilation, anyone?) for the $1k difference between a 15" Macbook Pro and a 13" Macbook Air. No amount of money I dump into hardware or software will make AWS provision its resources any faster either, and that's what I sit and wait on the most for feedback, rather than direct code compiles. And a fat CI/CD server is not run on my laptop unless I'm running Jenkins locally, or Concourse.
For a great craftsman, I expect them to get more out of any tool than a lesser person could. They can make a good tool do things I wouldn't think of, but can also work around the limitations of a lesser tool.
So if you couldn't get anything out of a better tool, I'd start asking uncomfortable questions about you.
Do I think I as a developer could leverage a Mac Pro to speed up my code-build-test cycle? If the whole team had them, then I'd absolutely tune our tools to use the extra cores, monitors, etc. We are better at troubleshooting when the feedback loop is shorter.
But to me the Mac Pro is more of a tool for designers. If a designer is turning in the same work on a 1k machine I would ask about our process first, the designer second, and the tool third.
Most of the people that are buying this will attach very large external storage, as it is most likely going to be used as a video editing workstation.
are we talking about workstation nas here?
we have nothing but praise for synology performance and reliability. software is stable, tiering works flawlessly, expansion is inexpensive, web ui is good, ssh gives you a proper shell - do whatever operations on your nas locally, faster.
Editing workstation NAS, different IOPS use case. I ran a Synology as a test with VMware and even though the unit was certified, it was nightmarishly sluggish. That said, they're perfect for most businesses not in the editing space.
Yes, the 580X is old (it's the same hardware as the RX 480 from 2016). In fact, everything in that machine is old: the Xeons they use are rocking the Skylake core from 2015, with all of the recently discovered side-channel vulnerabilities.
The upcoming AMD Threadripper 3 CPUs with PCIe4 would be a much better Pro offering. Clock-for-clock they are beating Skylake now, rumored to have up to 64 cores. Also, PCIe4-based RAID 0 arrays are pushing 15GB/sec transfer speeds IIRC.
It’s more likely the new Mac Pro uses the as-yet-unannounced (but leaked [1]) Xeon W-3xxx CPUs. The current Xeon W fare doesn’t have the core counts Apple is advertising.
But yes, I would expect a dual die 64 core monster like the 92xx series they recently announced. Of course the 300W power/thermal solution apple was talking about seems a bit limiting in that case.
Yes, but Apple lists 12, 16, and 24 core Xeon W options, none of which are currently available from Intel but do match those leaked for Cascade Lake Xeon W. So it’s quite doubtful Apple is using Skylake Xeon W since the majority of the specs don’t match.
Don't forget that Apple can see AMD and Intel's future roadmaps.
AMD definitely has a superior offering with Threadripper 3, but we haven't seen what Intel is going to offer in comparison. And switching CPU manufacturers is never trivial.
Intel’s public roadmap has been highly unreliable. They’ve been delayed by years, and that was before the recent vulnerabilities were discovered. I’m not convinced about the value of their roadmaps.
That being said, if you were only allowed to pick one platform to sell to your customers, in a machine which should last years, then Intel would definitely be the safer bet. It’s also the platform most of the software providers are likely gonna optimize first.
I'm not sure why they'd consider Threadripper for the Mac Pro. This machine fits into Epic's target cases and could possibly live up to the asking price of the whole package at that point. Given how the rest of the Mac ecosystem is going, I'd only expect the most stubborn of professional communities to pick this up (film editing folks seem to still buy the Apple hype).
I'd be surprised if the monthly charge is a problem here. We're talking about a machine that costs more than 6 years worth of subscriptions (for the entire suite, business price, ~1/3 less for just premiere pro, business price) for the lowest tier machine (which isn't all that great for the stated use case).
Maybe there is a technical reason, but the story I've gotten from people I know doing this is that they are looked down on if they're PC users. It could be that Apple's software is that much better, but it seems much more likely that they're charging what they know they can based on the general stubbornness folks have around their software and workflow. I partly can't blame them: if your job is in a creative space, the last thing I'd want to do is constantly rework a workflow and deal with the machinery itself rather than the content and output.
The 580X is technically "new" but it's the same Polaris 20 GPU chip that has been around since the RX480 in 2016 and has seen several rebadges accompanied by slight clockspeed bumps. This particular rebadge is Apple-specific and debuted with the recent iMac refresh.
Sure maybe that mobile-workstation card came out this year but the tech behind it is very old (GCN 4.0). In fact all of AMD's current offerings are fairly old now. Navi is the first GPU architecture since 2012 that's not based on GCN.
My understanding is that a Pro 580X is just a rebadging of the Pro 580, which is from mid-to-early 2017, and all the 5xx series is just minor spec bumps on top of the 4xx series.
>I know being expandable is important here but I am stunned by the fact that an iMac Pro actually comes with better SSD and Video Card for less money and you get a screen to boot!
I'm guessing it's just price discrimination. People who are in the market for a Mac Pro (or rather, their employers) are probably not price sensitive.
I partly understood it with the "first"-gen New MacBook; it was experimental, and they wanted to give it this kind of minimalist-mystique of flipping the naming convention where this forward-looking device was now simply "The MacBook". But why does it even still exist now that the other ones have adopted most of its features?
There’s a pretty obvious general pro workflow for a configuration like this, where OS and apps are stored on the main drive and data is stored elsewhere.
The entry level Macbook Pro is not the best bang for the buck. For about as long as I can remember, the best value has been in the neighborhood of $2950. So the 'good' Macbook gets cheaper due to inflation and not much else.
Right now you can get 32 gig of RAM and the 512 SSD for around that mark. I'd probably throw in the video card upgrade for a personal computer, maybe not for a work one unless you're doing AI.
I'm less concerned about the 256GB SSD. If this is a true Pro set up, then there's going to be very large storage media attached. The internal SSD is just to store the OS and apps. My main concern is the lack of a mention of Nvidia support. I'm very curious of the apparent bad blood between Apple and Nvidia. Were the GPU issues from the 2011 MacBook Pros to blame, or just one of the pieces of straw?
Honestly, I think the SSD size is okay. 512 would have been better, but considering this is fully expandable, it's much cheaper to manually upgrade the SSD to whatever you like with aftermarket SSDs anyway vs Apple charging more for it. Also, the people who are looking to buy such a beast aren't going to be price sensitive. So this lets them price it at $5k.
Neat. My new MacBook Pro has 256GB. I'm using ~133GB. That's Xcode and the Xcode beta, the full Office suite, Logic Pro, Firefox, Chrome, Safari, IntelliJ, GoLand, Photoshop, Illustrator, a bunch of other Adobe apps, and on and on. When I'm at the desktop I use an external drive for dumping junk to, and for Time Machine backups.
There is a point of diminishing returns, and given that most people don't have a giant Steam library on their Mac, 256GB is more than sufficient for a large number of users. And as many others have said, when you're working on professional data like video and audio it is almost always on an external array.
You are right. It is sufficient for a large number of users. However, the Mac Pro is a high end workstation used by professionals, targeted for applications like high resolution video editing that require insane amounts of storage.
Relatively speaking, storage is cheap. A high performance, 1 TB NVME drive can be had for under $350. This should be the base configuration on a high end workstation...
What benefit does a 1TB SSD have over 256GB for a video workstation that needs to connect to a 100+ TB array for the real work? If your video editing task can get by with even the 4TB maximum build-to-order option, then you're not in the target market.
I would definitely look at a Mac Pro for software development (every minute counts) and if you're in that group there is simply no need for more storage or decent graphics.
That's silly to be honest. I am a developer myself and I can tell you Mac Pro or iMac Pro is a waste of money in every way imaginable.
Look up barefeats benchmarks. Unless you have very specialised needs (like video editing), there is a visible penalty as you increase the number of cores. For everyday usage, 6-8 cores is what you should be aiming at, preferably with the highest clock speed possible.
The more cores you have, the lower clocks get and the harder it is to sync cache between them. If your software cannot really use that much parallelisation it will be slower, not faster.
For 99.99% of devs, spending cash on an iMac Pro over an iMac with an i9 is a waste of money and probably degraded performance. The Mac Pro is even worse, as you need to buy an external monitor too.
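A toy Amdahl's-law calculation illustrates the tradeoff; the clock speeds and parallel fraction below are invented for illustration, not measured:

```python
# Amdahl's law, adjusted for the clock penalty that tends to come with
# higher core counts. All inputs here are hypothetical.
def effective_speedup(cores, parallel_fraction, clock_ghz, base_clock_ghz=5.0):
    # Speedup relative to a single core running at base_clock_ghz.
    amdahl = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return amdahl * (clock_ghz / base_clock_ghz)

# A workload that's only 50% parallelizable:
few_fast = effective_speedup(cores=8, parallel_fraction=0.5, clock_ghz=5.0)
many_slow = effective_speedup(cores=28, parallel_fraction=0.5, clock_ghz=4.4)
print(f"8 fast cores: {few_fast:.2f}x, 28 slower cores: {many_slow:.2f}x")
```

With these made-up numbers, the 8 fast cores edge out the 28 slower ones (~1.78x vs ~1.70x); flip the parallel fraction up toward 0.95 (a heavily parallel build or render) and the 28-core machine wins easily.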
My compile times being shorter by seconds isn't important. Having lots of cores and memory is. There are many types of software to develop and mine is best developed and tested on a bevy of VMs or containers.
That said, I could still avoid the Apple tax with a Precision or something from System76. My employer tends toward Apple on the desktop and Dell in the datacenter, but $dayjob will more likely refresh with a MBP than one of these behemoths.
I build my own whitebox systems at home mostly, and AMD's been good to me on price/performance. I have a few Macs. Wouldn't mind a Talos II from Raptor Engineering or one of these newest Mac Pro machines if I could justify the cost.
There are edge cases for sure, never said there aren't any. If you need to run several VMs, each with a few cores and few gigs of memory, and all of those under heavy load at the same time, then apple "Pro" machine might be just what you need.
What you say is true, but Apple has generally been better at selecting processors from Intel that aren't just rebadged low-clock Xeons. It wouldn't surprise me if this thing is 5GHz turbo on a couple cores. There was that 28-core 5GHz beast Intel was talking about (definitely not 300W though), and a rumored W-3175X follow-on.
All of the Xeons in Mac Pro say 4.4Ghz turbo from what I saw at apple.com.
So the 2019 iMac i9 @ 5Ghz on 4 cores will beat the crap out of them for software development purposes, and from the benchmarks so far it seems there is very little (or none at all) throttling on new iMacs.
I disagree. Having more cores is becoming more and more relevant to software engineers these days. The power wall is forcing application scaling to happen through core count, so a lot of the software development in the world is focused on that. Having a beefy multicore machine is becoming more and more relevant to devs.
"100%" CPU utilization means very little with modern Intel CPUs and TurboBoost. Intel CPUs have the headroom to clock much faster than base clocks but often can't sustain due to thermal constraints.
Using a desktop processor with more thermal headroom (with adequate cooling) would presumably be much better bang for your buck than increasing core counts in an already thermally-challenged laptop.
meh, i'm just starting to fool around with music editing and my 250 GB laptop is out of storage. Also, it's a three year old laptop and that's what it came with. They could have kept the stainless steel and given me a bigger drive.
Who exactly is the intended user for this ultra-expensive, non-portable, box if it's too lame for hobbyists?
This is just apple sticking a vacuum into their customers' pockets, applying the same over-pricing policy that is now standard on their phones.
$5000 for imac pro(+applecare)? that is bonkers.
$6000 is even more crazy.
You can get more storage, a better GPU, and more options for networking for much less. And you won't need to deal with Apple geniuses to fix the issues.
It's funny this just came out when it did. I was just looking at getting an older Mac Pro 5,1 and upgrading all the components in it like a few friends have done.
For under $2K you can have a machine that competes with newer machines that cost twice that much. If you're a professional creative type, these machines are the real deal:
the Mac Pro 5.1 was designed to accommodate up to 12 cores: “Even though a single core isn’t fast, imagine having 12 of them for video editing and audio—those cores together are faster than my brand-new MacBook,” Mazzarolo said. The new iMac Pro can have up to 18 cores; new MacBook Pros max out at four cores.
The 5.1 can take a whopping 128 GB of RAM, which is equal to what a fully upgraded iMac Pro can take and double what Apple says the trash can Mac Pro maxes out at (it’s worth noting that the RAM used in newer Mac computers is usually faster)
The 5.1 can be modified to use modern SSDs, which Mazzarolo said are in some cases faster than the ones used in the new iMac Pro
The 5.1 can use almost any brand-new graphics card from most manufacturers, which is the main reason why a fully souped-up, old Mac Pro can outperform new computers. “With some rendering engines, the AMD cards that Apple uses [in new Mac Pros] don’t even work,” he said. “In general, even mid-level graphics cards we put in are as fast as those in the iMac Pro. We can put in better cards and we can put in two of those.”
On the Facebook group, Mazzarolo posted benchmarks of one of his custom-built rigs playing 5K, 6K and 8K RED RAW video clips against current-model Apple computers. A new, 15-inch MacBook pro and a recent “trash can” Mac Pro weren’t capable of playing the video at more than 8 frames-per-second. His custom-built model was able to get 24 fps in each case.
There are ways around the official support drop as well, I run macOS mojave fine with a Mac Pro 3,1. These original cheese grater macs are truly still the best desktop tower they'll ever make at this point. It'd be fun to trick one out with a 3rd gen threadripper, which is probably faster than the 28-core rumored $35,000 mac pro.
I get where you're coming from but the whole point of getting a Mac is "it just works". I just don't have the time to deal with a Hackintosh. Also from past experience, there's a reason why they sunset older machines. At some point the OS updates are just no longer compatible with the old hardware. The Mojave patch notes that you linked further confirms this.
Another issue is that Windows 10 is now at the point of working decently and it has Ubuntu baked in. Apple also has iCloud sync software for it. I will miss Mac Apps, but given economics, most people just crank out electron apps anyways.
I give up on Apple computers. imo there's no viable alternative to the iPhone or iPad though, so I'll still be with Apple on some level.
I'm not sure if you can say Macs just work anymore, and obviously this is the realm of people who like to tinker or make things work for the sake of the process. If only there was a word for it... Oh wait, I think it's up next to that Y up there :)
The old hardware is pretty well supported in the Mac Pro because of how modular it remains. The majority of the incompatibility comes from not having a GPU that works with Metal. Those patch notes are mostly for devices with soldered components, like the GPU.
They don't just work any more; I have an older cheese grater mac pro; and new video cards for it are a dismal lottery.
You could say this is why a modular Mac is a bad idea.
My Apple Mac SE still just works; it runs Word 5, which is all it ever did.
dang. that is a bummer. i picked up a MacPro5,1 from a govdeals auction and it is a pretty nice computer. it even runs dark mode in OSX Mojave. guess its days are numbered =(
i do have to say though, the single core performance of my MacPro5,1 is pretty abysmal. even with similar clock speeds, my 2015 mac mini is 50% faster in single core performance.
Intel(R) Xeon(R) CPU E5620 @ 2.40GHz
quad core, dual cpu. so 8 cores total.
multi-core performance is great!
we use it as a Unity build box and it functions great since we're normally doing multiple builds at a time, and image compression & shader compilation are multi-threaded. but it's definitely slower if you are just doing 1 build at a time.
sucks that in 2 years when Apple starts requiring the latest Xcode for iOS build submissions we will have to retire this guy =(
As an upgrade thought, that box should be able to handle dual X5675's, which are fairly cheap on Ebay.
There's a very noticeable performance improvement from (eg) the E5645's, let alone the E5620's.
Saying that as I used to use a self build box (Supermicro motherboard) with dual E5645's as my desktop, but it always "felt" a bit slow. :/
Had the opportunity to upgrade to X5675's, and now it "feels" fine interactively due to the higher base and boost clock.
The highest model, X5690's, from all reports are literally no faster in use than the X5675's (maybe thermal throttling?). So not worth the extra spend. ;)
Pro machines for A/V production are routinely $7-10k and have been for as long as I can recall, right back to the 90s. You’re probably just not in the market. These are for commercial buyers who make lots more money from these machines than they spend.
Also, six months ago everyone was bellyaching about how Apple has no serious pro devices and was abandoning the desktop. Now it’s too expensive, and my eyes can’t roll back far enough.
Arguably, the whole idea of a modular machine is that the base price can be lower, and then you can spend the money on the things that are valuable to you.
I've owned the last two Mac Pros. I won't be able to afford this one. But fair enough: in this case it may be that they can't raise the ceiling without raising the floor.
I guess you’re not very familiar with the Mac Pro line. The ones Apple let die in 2012. You know, the ones with starting prices at $2000-$3000 adjusted for inflation.
Apple leadership also deems developers “pros”, at least publicly. I don’t know what they were thinking.
Yep. Ubuntu lost by several orders of magnitude. They lost me with Unity, which seemed like something overly fancy that wasn’t required. Had they gone for stability instead of UI, they might have kept the users and developed a more suitable alternative to macOS.
It really feels like a lost battle. I’d pay $200 a year for an OS, especially an open-source one that doesn’t send my data to Amazon.
If you don't like Ubuntu's GUI, it's baked into Windows 10 now. It's not ideal since you're missing out of the remaining good Mac apps (that aren't electron based) but it's a viable alternative now.
Well of course, they've done the marketing work and know who has the money to buy it. People complaining about GPUs in this thread are missing the point.
None of those things really matter to the target audience though. Freesync doesn't have much use when you're working with static images or videos with a fixed frame rate. When you're editing video, you want your refresh rate to equal the frame rate of your video (or a multiple or it) to avoid judder and interpolated frames. A refresh rate higher than your frame rate, but not an even multiple of it, would be worse than running at the same rate as the video. Running it at an even multiple of the frame rate may provide some benefit, but it is marginal.
The pros are mainly concerned with color accuracy and gamut, neither of which gaming displays are known for.
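The judder point can be made concrete by counting how many panel refreshes each video frame occupies; the frame rates and refresh rates below are just example values:

```python
import math

# How many panel refreshes each video frame stays on screen.
# An uneven pattern means judder; a constant one means a smooth cadence.
def refreshes_per_frame(fps, hz, n_frames=8):
    shown = 0
    pattern = []
    for frame in range(1, n_frames + 1):
        due = math.floor(frame * hz / fps)  # refresh at which the next frame starts
        pattern.append(due - shown)
        shown = due
    return pattern

print(refreshes_per_frame(24, 60))   # [2, 3, 2, 3, ...] -> uneven, judders
print(refreshes_per_frame(24, 120))  # [5, 5, 5, 5, ...] -> even multiple, smooth
```

24 fps on 60 Hz forces the alternating 2-3 cadence (classic 3:2 pulldown territory), while 120 Hz, an even multiple, holds every frame for exactly five refreshes.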
Yeah the pricing for the base model is insane. You can get a more powerful PC with 4x the storage for half the price.
And you're right about Nvidia. I also stream on Twitch, was hoping I might be able to get one of these to replace my current PC. Without Nvidia there's no way.
The CPUs they're using are Cascade Lake, not Skylake. So I'd add at least about $500 there. You'll also need to add ~$300 for the giant 1400 watt PSU, and at least $500 for the custom cooling solution and case. Most PC cases are really not built that well.
Where did you get that $500 increase? Intel doesn't increase prices between generations, at least not that much. If anything, they could even reduce the price to have a chance to compete with new Threadrippers.
The $500 came from the new chips, plus the motherboard Apple is using is much higher spec than the SuperMicro.
It's got 8 full length PCIe slots, and two of those are the MPX slots so it's effectively 10 if you use MPX cards. It has dual 10gb NICs, and two TB3 ports.
We are well into the Skylake era; Broadwell is ancient, comparatively speaking. The Mac Pro will be Cascade Lake, not state of the art like EPYC on 7nm, but still a very large improvement.
>HP Z Turbo Drive
Even if we choose a 256GB version of the Z Turbo instead of SATA as you configured, the Turbo Drive will still be a lot slower.
>HP Thunderbolt 2 (20GB/s; one port) +$200
Yes, Thunderbolt ports are expensive. And these are only TB2, not even TB3, and still missing 3 ports.
Even this 2013 Dual Port 10Gbps Ethernet is a $250 upgrade.
>Wireless Mouse and Keyboard, and WiFi
Added those.
So a not-quite-comparable config with a very old Intel CPU costs $3475. I would imagine if you spec it exactly the same as the Mac Pro, with TB3 and all the hardware and ports, the HP would cost $4K+, with an ugly design. Quite a bit of saving, but not as dramatic as some would imagine.
I assume this uses ECC DDR4, which would at a minimum explain some of the price tag. That will set you back $500+ retail for 32GB alone. I still don’t understand where the rest is going with an 8-core CPU and a RX 580X though.
I guess an 8-core Xeon W costs a cool $1K by itself, and 10Gb networking isn’t tremendously expensive these days, so it probably contributes $300 to the cost at MSRP. Still, $6K is too much; $4-5K would be less outrageous and still turn a healthy markup.
Yeah, I was expecting it to be either significantly more powerful than the iMac Pro or to have a lower cost for the base model. Neither of those things happened. I was willing to go out of my way to get hardware that's not quite ideal for my use case and still pay extra for it just to have macOS, but $6k is just too much. It might be expandable but each generation of Intel CPUs is incompatible with the last generations motherboards. If you care about single-core performance then it just doesn't make sense.
How many people actually need an 8 core Xeon vs an i9? Almost none. And that's without taking into account how much extra you're paying just for the hardware you're getting on the base model. I was willing to compromise and pay for workstation hardware just to have macOS, but not at a 2x markup.
256g of SSD would be appropriate under the conditions that a) this is a very basic and affordable configuration, so less than 3k and b) the machine would be user-expandable with NVMe drives or similar. Then you would get the minimal SSD just good enough to install the OS and put in third party storage to save money. And of course, you would want to be able to put in more than 4T of storage.
That price is just absurd. I waited for the Mac Pro announcement to decide whether I'll stay with Windows or jump to macOS. Macbooks are not for me, but I hoped for a reasonably priced computer, something like $2000 with another $1000 to expand with more memory, etc. But $6000 for an 8-core CPU, a 256 GB SSD, and a GPU that is less powerful than my old $200 NVidia 1060? It's just laughable.
Otherwise it seems like a good machine, but not at this price. The configuration that I would like to have will probably be something around $10k, and I could build it myself for $3k.
The new Mac Pro, like Mac Pros always have been, is for music/TV studios and design agencies that bill their time starting at $500/hour. It's really not for individuals, and that's ok. Apple made a computer "just for the pros", like HN begs for in every butterfly keyboard thread. I'm thrilled about it even though I know I'll never touch one. It's for people who don't know what a CPU core is but know how to produce a platinum album, get an Oscar nomination, or bring a revolutionary industrial design to market. This is really something where the price is superfluous. I personally wish it was higher so that was more obvious.
I disagree that it’s always been this way. I bought a G4 tower when I was in college and it sure as heck wasn’t $6,000 back then.
This is the only traditional desktop computer that Apple sells, so it’s disappointing that everybody outside of movie studio employees has to fall back to an iMac or a Windows/Linux machine.
I was curious what prices really were. I don't know exactly which G4 tower line you are talking about, but according to https://www.macstories.net/mac/the-power-mac-g4-line/ , the first reasonable Power Mac G4 was summer 2000, and
> Prices ranged from $1,599 to $3,499,
According to one online inflation calculator, $1599 in 2000 dollars is $2,372.95 in 2019 dollars, and $3499 is $5,192.58.
So, I guess, yeah. I do recall them seeming awfully expensive.
I agree this new release seems awfully awfully expensive.
They got even cheaper than that. In 2003, the Power Mac base price was $1499. In 2019 dollars, that's only $2082.
By 2012, the entry level Mac tower was up to $2499. Since 2013, the cheapest Mac Pro (no PCI/E slots) has been $2999.
The 2019 model is literally more than twice as expensive as the 2013, which in turn was already the previous most-expensive entry-level Mac workstation ever.
PowerMac/MacPro prices have always seemed steep, but this is a new ballgame even for Apple.
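The inflation adjustments above are easy to reproduce with annual CPI-U averages; the CPI values below are approximate, not official figures:

```python
# Approximate annual CPI-U averages; treat these as illustrative only.
CPI = {2000: 172.2, 2003: 184.0, 2019: 255.7}

def in_2019_dollars(price, year):
    # Scale a historical price by the ratio of CPI levels.
    return price * CPI[2019] / CPI[year]

print(round(in_2019_dollars(1599, 2000)))  # entry G4 tower, roughly $2,370
print(round(in_2019_dollars(1499, 2003)))  # 2003 Power Mac base, roughly $2,080
```

Both results land within a few dollars of the figures quoted in the thread, so either way the $5,999 entry point is well over double the historical entry-level tower.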
The difference is that the iMac Pro and Mac Mini didn’t exist to segment that market back then; all the tiers of professional hardware were rolled into just “the Mac Pro.” Roll those back in and the pricing tiers look more equivalent.
I found the invoice a while ago for the professional desktop publishing setup that my parents bought in 1993...a Quadra 800 mac, upgraded video memory, a flatbed scanner, b&w laser printer, a 20" monitor, a 14400 modem and some kind of fancy magneto optical drive or something for FedExing large files. Total price was $13k at the time.
Yeah it was definitely something geared towards graphics professionals...just found an old magazine ad for it, looks like it had a Trinitron tube and ran for $2700 at the time. It weighed an absolute ton. I certainly enjoyed the screen real estate for playing Sim City when my parents weren't working!
It's unclear to me if you were agreeing or disagreeing but the upcoming mac pro entry model is going to be more than double the price of the entry level G4 towers after adjusting for inflation. So the original commenter seems to be correct, they used to be more affordable.
Is the crazy price due mainly to the all aluminum enclosure? I don’t remember the cheese grater models being priced this high accounting for inflation.
Accounting for inflation, the old 2012 cheese grater models started at $3150 in 2019 dollars. The older models were $1000 cheaper.
Hopefully they've included built-in battery powered GPS because at those prices and with the handy carry handles these are going to be walking away from their owners in record numbers.
- The new Mac Mini has soldered storage. This precludes upgrading storage, replacing internal storage (which is important for preserving data due to a motherboard failure), and adding additional internal storage.
- Apple finally made the RAM on the new Mac Mini upgradeable, but it is not user-serviceable; you have to get the RAM installed by an Apple-authorized repair center in order for the warranty to stay valid on the Mac Mini. The same holds true for the iMac Pro.
- The Mac Mini lacks expansion slots. All expansion must be done externally using Thunderbolt or USB-3.
Until the 2013 Mac Pro was released, Apple had always sold an entry-level Power Mac or Mac Pro for around $2,500 in inflation-adjusted currency that was user-serviceable and had expansion slots. The 2013 Mac Pro was still user-serviceable, but it lacked expansion slots. Now Apple is finally selling a user-serviceable, expandable Mac, but at prices beyond what many users can afford. There are no user-serviceable, expandable Macs that cost under $5,999.
> - Apple finally made the RAM on the new Mac Mini upgradeable, but it is not user-serviceable; you have to get the RAM installed by an Apple-authorized repair center in order for the warranty to stay valid on the Mac Mini. The same holds true for the iMac Pro.
I'm pretty confident this isn't legal in the US. Under the Magnuson-Moss Warranty Act, a law from the '70s, companies can't void a warranty just because a third party performed a repair, and the FTC released a letter reiterating this fact last year. Apple and other companies have kept this purposefully vague while also fighting right-to-repair laws.
> There are no user-serviceable, expandable Macs that cost under...
This has been a gripe against Apple for at least 20 years.
The memory on the iMac Pro is user upgradeable, and I believe it’s the case with the new Mac Mini. One big strike against the new Mac Mini is how much the higher end CPUs are thermally throttled. Not good for a relatively expensive machine.
GPU for me. I’d snap one up if it had a GPU in line with that in the MacBook Pro. I don’t particularly want a large external GPU hooked up over Thunderbolt.
I don't think there is much demand anymore for a Mac tower in the consumer space; actually, I don't think towers have been a popular option in the consumer space for a long time for any platform.
1. Apple first failed to update the cheese grater line for years on end.
2. Then Apple's redesign turned the Mac Pro into the trashcan, essentially a larger Mac Mini that wasn't really expandable compared to the original cheese grater line. No one asked for that.
There is evidence for demand for a reasonably priced Mac Pro Tower if you look at the Hackintosh community and eBay sales of older cheese grater Mac Pros, but Apple keeps refusing to meet demand.
The cheese grater was already pro, not consumer. The last consumer tower was probably the G4 era, afterwards the iMac, mini, and laptops took over consumer completely, leaving towers to the pro market.
There is "niche" demand for consumer towers, sure, but I don't think it would be worth Apple's effort. The iMac 5K is really good, and its even the cheapest option to get one of the best displays out there.
Yeah, you're paraphrasing what I wrote. It is not consumer. It was called a Mac Pro, and it's half the cost of the new Mac Pro, accounting for inflation.
I paid under 3k for my PowerMac G5, ~3300 for the original Mac Pro after that, then 3k for my current 4 core 2013 Mac Pro. This is definitely a course change on the entry level pricing.
> Apple made a computer "just for the pros", like HN begs for in every butterfly keyboard thread.
> It's for people who don't know what a CPU core is but know how to produce a platinum album, get an Oscar nomination, or bring a revolutionary industrial design to market.
Wait, what? When I say "for pros" I mean able to support latest specs. Something that had the latest NVIDIA cards, or could at least USE the latest NVIDIA cards to do ML or rendering would qualify under most people's definition of Pro I think? Right? When did the definition of Pro become "Music Producer"?
Music, graphic design, video have been the core Apple "Pro" markets since forever. Developers focused on anything other than the Apple ecosystem are a relatively new addition to that lineup, and one that Apple hasn't _really_ ever catered to.
It feels like empty words, similar to Steve Ballmer's "Developers, Developers, Developers". Satya Nadella didn't have to say it; you could see it in what Microsoft was doing.
There was a time when research science was on that list as well, and Apple effectively abandoned them - and continue to do so with nonsense like utterly lacking cuda-capable GPUs.
My lab has grant support. I've bought workstations just as expensive as this recently, but the new Apple ones are non-starters, even though I've been using Macs since I was in 2nd grade.
Yep. For me personally, it's a reassuring signal that Apple is doubling down on macOS and professionals, and gives me hope that next year's MacBook Pro redesign won't be a disappointment.
That was basically my takeaway as well. Writing/compiling code just isn't "pro" anymore — doesn't require pro hardware. The new Mac Pro is meant for professional studios.
And to think music studios need so much horsepower when "Sgt. Pepper's" was recorded on 4 tracks…
> And to think music studios need so much horsepower when "Sgt. Pepper's" was recorded on 4 tracks…
What audio effects there were in Sgt. Pepper’s were generated in analogue effects pedals and synths. Today, those same effects are expected to be simulated, in parallel, in realtime, on the DAW you’re producing the track on. (And modern music genres use way more effect layers. Modern pop music is a fifty-deep effects tree with one-second-long samples at the bottom, with full-fidelity rendering of such monitored in real-time as performers feed vocals and other live input in.)
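The real-time budget described above is what makes this expensive: every audio buffer must pass through the entire effects chain before the next callback deadline. This is a toy sketch of that structure; the "effects" here are simplified stand-ins, not production DSP.

```python
import numpy as np

SAMPLE_RATE = 48_000
BUFFER_SIZE = 256  # samples per callback; the deadline is ~5.3 ms at 48 kHz

def gain(buf, db):
    # Scale the buffer by a decibel amount.
    return buf * (10 ** (db / 20))

def soft_clip(buf):
    # Crude saturation stand-in for an analogue-modelled effect.
    return np.tanh(buf)

def process(buf, chain):
    # Effects run serially: a 50-deep tree means 50 passes over every
    # buffer, for every track, all inside the real-time deadline.
    for fx in chain:
        buf = fx(buf)
    return buf

chain = [lambda b: gain(b, -3), soft_clip] * 25  # a 50-effect chain
buf = np.zeros(BUFFER_SIZE)
out = process(buf, chain)
print(out.shape)  # (256,)
```

Multiply that by dozens of tracks being monitored in real time as performers play, and the CPU demand of a modern DAW session starts to look very different from four tape tracks.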
> And to think music studios need so much horsepower when "Sgt. Pepper's" was recorded on 4 tracks…
Estimates are >700 hours for the album; 700/13 ≈ 54 hours per track. A lot of that time was bouncing down four tracks to fewer for future takes.
My dad is a graphic designer who got his start doing desktop publishing on a Macintosh II using QuarkXPress. He's faithfully stuck with the Mac since the '80s and currently has a trash can Mac Pro, with which he is mostly happy. He's never billed anywhere near $500/hour as most of his work is for small print periodicals (<50k circulation) and online presence for small nonprofits.
Currently, his only outstanding need is for better render times on Final Cut Pro (rendering videos of conference proceedings for YouTube), but I know he'll balk at $6000 for the new Mac Pro. I don't know what I'll recommend.
On the developer side of things, the 2018 Mac Mini update is a much lower priced computer with an option for a 6-core i7. If you really need a dedicated GPU, you can add one with a Thunderbolt enclosure.
The new Mac Pro is for movie studios and maybe very large machine learning models.
You have to wonder how many you could reasonably expect to sell that way, though. Everyone in this thread is talking about some upper-elite of pro computer users (above your ordinary multi-six-figure developer salary on HN), but at only $6k per, that's not any way to make money. If that's really the target market, I would worry about making my design money back, let alone making a profit.
So I have to assume that that's not quite true: it's a computer targeted at studios, other million dollar a year creative professionals, and (the largest market) relatively rich folks who want to play at being creative professionals. Nothing else makes sense: it's not a computer for enthusiasts, because it's at least 50% overpriced for the market, it's not for developers, as this thread amply illustrates. Consumers are far more likely to buy an iMac than a Mac Pro if they have a desktop at all. The only major market left are people with enough disposable income to buy products because of their branding and because they know it's what the fancy creatives are buying. Buying a Mac Pro will make you feel like a multimillionaire creative - that's the selling point.
It's really the same principle as in digital cameras. Pricey full frame SLRs like the Canon 5D mk.IV are great cameras that are essential to photography professionals, but the vast majority of the market for these cameras is wealthy people who want to buy the expensive model from a recognizable name brand because "it's what the pros use".
Edit: perhaps those who disagree with this could provide their own breakdown of who they think is the target market for these, and how many Apple can expect to sell to each.
It's so weird that CUDA has latched on to the ML world so tightly. It's a field where literally everything else is about open-sourcing and giving away ivory-tower knowledge, and then there's so much BS that Nvidia makes you go through - I had to wait a few days to get "approved" to download cuDNN; where else is that acceptable? AMD releases high-quality open-source drivers, has better performance/TDP, usually better performance/$, and isn't having security issues every month. It'd be great to see them start to compete with ML libraries as well; they're such a better citizen.
This is not "just for the pros". It's just for an elite echelon of pros that aren't the normal working professional. It is way overpriced for the spec. HN wasn't asking for that.
It's competing against machines such as the Boxx dual-CPU workstations, which start at $8000 [1]. I can see movie studios and such shelling out the big bucks to buy these for their 3D modellers, but only the super rich would buy one for themselves.
3D modelers want NVIDIA cards, not Radeon, as NVIDIA supports CUDA. This has been a longstanding issue with 3D folks; the old cheese grater Mac Pro offered NVIDIA GPUs even with otherwise-outdated internals. When rumblings came of new Mac Pros a few years ago there was excitement that they could get some fast machines again; alas, this appears not to be the case.
macOS does not support vendor-supplied drivers, which has been the bane of every graphics engineer's existence. Supporting the Mac for games, as a result, is a completely inane task we avoid like the plague. To me, because of the GPU, this machine is a giant paperweight.
I’d argue this is good for users, because vendor-supplied drivers always seem to have crappy updater software running at all times and send pop-ups at all hours on Windows. Vendors really need to better support open-source drivers for user experience purposes, but they don’t because they want more control over their market.
Don't conflate the updater with the driver itself. Apple could have enforced a policy around integration with the system update functionality or something, but the driver supplied by the vendor will always be better/faster because it can exploit internal knowledge about the hardware.
Yes, I get that. All I’m trying to say is that the current user experience with third-party drivers for anything (especially nVidia graphics cards) is not the kind of thing Apple would want to inflict on their users.
That's not true. If you let Windows manage your drivers, you won't have any popups; drivers will be updated as part of standard Windows Update. Now, if you want the latest drivers and you're installing the GeForce Experience application, then yeah, it'll notify you of new drivers about once a month. It's just a standard Windows notification, and if you installed that application you probably want to stay on the latest drivers, so why would you complain about that?
Often when I plug one of my Razer mice into a different USB port I get a big pop up with a prompt to install their Synapse software that takes two clicks to dismiss.
It's not good for users when vendor drivers work and Apple's drivers don't. I know OpenGL is deprecated, but a lot of apps still use it because it's cross platform, and Apple has the worst GL drivers in the industry. Even Metal is missing many features from Vulkan [1].
Most major 3D modeling programs support OpenCL as well, not just CUDA. In particular, any of them with macOS ports are not CUDA-only (since, you know, macOS hasn't supported CUDA-capable GPUs for many years now), making the argument for CUDA specifically rather moot.
In terms of just GPGPU performance Vega is pretty competitive against Nvidia. It's not nearly as one-sided as the gaming space is. That is, in fact, the one saving grace of the Radeon VII in the first place vs. the RTX 2080 - its stronger professional workload capabilities.
I used to develop professional 3D tools, and CUDA is used to accelerate features in 3D software including scene manipulation, simulation, real-time rendering, ray tracing, shader effects, and video encoding.
Support for all the CUDA code that's already out there. Not that I've used Metal, but that's what seems to be the reason behind OpenCL's lack of adoption
Sure. But there is also a lot of Metal code out there in order to make iPads fly. And they really do fly, I was amazed what kind of performance I could get out of them. For example, I had an algorithm running on my iMac Pro, programmed in Swift, utilising all of its 18 cores fully. It took about 5 minutes on a typical example. Then I recoded the algorithm for Metal, and it runs on the same example on my iPad Pro in under 10 seconds.
So, my bold prediction: Apple is going to shred Nvidia to pieces within the next 10 years.
The video says the accelerator card is for 4k & 8k RAW video. So I wouldn't expect it to do anything at all other than decode 4K & 8K RAW footage, kinda like the RED ROCKET-X.
I've heard that CUDA can help decrease render times on Adobe Premiere Pro and several other programs. I'm guessing that includes 3D modeling software like Fusion 360 or something to help with various molding and simulation features.
Not F360 that I'm aware of, but for 3D, 3ds Max, C4D, and Maya all support GPU-based rendering using CUDA, which can be a big performance boost. On the video side, Premiere Pro, Vegas, and Avid tend to have better support for CUDA as well. However, AMD support has been catching up in the last couple of years.
FWIW, I've done some pretty complex part simulations in F360, and it's not that slow, on my Vega MacBook pro 2018. But, if it is, Fusion has online-sim calculator or renderers that are pretty affordable.
Probably has to do with other software like Maya or other simulation-heavy programs. In fact, F360 worked flawlessly on my 2016 dual-core 2.0 GHz CPU with 8GB of RAM.
Tin-foil-hat thought: what if CUDA on macOS allowed Adobe products to outperform Final Cut rendering times, and Apple doesn't want to jeopardize its offering?
Could be possible, but given that Apple optimizes like crazy across its own hardware and software, Apple products get crazy performance. On top of that, Nvidia and Apple aren't on good terms, so they obviously weren't going to build this workstation around Nvidia.
Raytracing, rendering, certain kinds of vector operations, physical simulations.
CUDA/OpenCL is all about writing super parallelism code to be run on the GPU instead of CPU, so many trivially parallel problems are usually outsourced to it.
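The "trivially parallel" shape mentioned above is worth making concrete: the same operation applied independently to every element, so thousands of GPU threads can each handle one element with no coordination. This sketch uses NumPy's vectorized form as a stand-in for what a CUDA/OpenCL kernel does; it is an illustration of the pattern, not actual GPU code.

```python
import numpy as np

def saxpy_loop(a, x, y):
    # The serial CPU picture: one element per iteration.
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_parallel(a, x, y):
    # Every output element depends only on its own inputs, so the whole
    # array can be computed at once -- exactly what a GPU kernel exploits
    # by assigning one thread per element.
    return a * x + y

x = np.arange(4.0)
y = np.ones(4)
print(saxpy_parallel(2.0, x, y))  # [1. 3. 5. 7.]
```

Raytracing (one thread per pixel) and physics simulation (one thread per particle) fit the same mold, which is why they are the canonical GPGPU workloads.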
Unfortunately for reasons that remain murky Nvidia still hasn’t put drivers out for Mojave and their absence from the keynote leads me to believe they won’t have drivers for Catalina. Nvidia blames Apple saying they’re not approving drivers. So yes you can slap a 2080ti in your Mac Pro but without drivers it’s useless.
This is a professional machine. No one wants to buy it and use hacky work-arounds, they want good drivers. Which aren't an option with nvidia on macos.
It's also competing against machines such as their own iMac Pro, whose base model is $1000 cheaper and has 4x the storage and a faster GPU, to say nothing of the built-in 5K monitor.
That $8000 base model Boxx workstation has 20 cores instead of 8, 50% more RAM, and twice the storage.
To be clear, I'm not talking about this Mac Pro in particular, but things like the Boxx or Mac Pro.
You would get something like this instead of consumer hardware because man-hours cost real money. At a previous job they wanted a VR PC, so someone specced out a bunch of high-end parts and a case with flashy lights. A couple of days of my salary went to putting it all together. That was the first time I had to RMA a motherboard. When I asked, they hadn't thought to confirm the GPU would fit in the case; luckily it did. It also took a couple of weeks longer than it should have to get up and running because of things like this.
Some specialty software is only certified for specific hardware. You can probably deviate, but it's like building a hackintosh. You might not be able to upgrade it and the man hours will likely kill any benefit.
I also see a lot of managers forget that artists aren't always tech-savvy. Honestly, you don't pay them to be tech support anyway. You want to pay for a known quantity and get improved returns. It can be a huge time suck to nickel-and-dime when you're trying to manage large projects.
However, using consumer grade hardware has its place.
Yep and there is a vast middle ground of use-cases. The parent asked about consumer grade hardware versus something like Boxx. I gave 3 separate reasons. A lot of people in this thread are wondering why Hackintoshes aren't an option so I wanted to bring it up.
Honestly, Apple wasn't what I had in mind when I was replying. Personally, I've had to deal with HP z-series workstations compared to various cobbled together machines. Having machines where the series will be in production for a decade, data sheets are readily available, fewer tools needed to open it and install stuff saves so much time and allows less downtime and faster turnaround.
It runs macOS and those consumer grade computers don't. If the best colorist in the area is working on your film, you give them the tools they ask for.
The color graders here run 10-bit/color HDR monitors that have a wide gamut. Apple doesn't have monitors that support this. (The new one _might_ do the trick; we'll see.) Everyone here doing professional color work is on PCs supplied by Thinkmate. This is standard in Hollywood, too.
Same here, I'm looking to build a machine to process large data sets. So far nothing can beat Linux on an EPYC or ThreadRipper on price/performance. It would cost half as much for a 1TB NVMe SSD, RTX 2080, at least 16 cores, etc.
The only reason I'm even waiting is because Zen 2 is right around the corner.
The 256GB SSD is so confusing. The price difference between a 970 EVO 250GB and a 970 EVO 500GB, for example, is only $50 (I couldn't even find a 256GB 970 PRO or 980). Are they really that desperate for margin on a machine this expensive?
The lack of Nvidia is also strange; do the AMD cards in this config even support TensorFlow?
To me it's a win-win situation for Apple: they help Nvidia with drivers, and more people buy the Mac Pro to use Nvidia GPUs. I have no idea why Apple doesn't want to work with Nvidia. They aren't selling desktop GPUs of their own, after all. Maybe they have some hidden agreement with AMD.
Well obviously they have agreements with AMD. Apple gets special treatment from AMD/Intel/Nvidia, they get to see unrevealed roadmaps, custom contracts etc.
Whether it's an exclusivity deal or not we may never know. But it does seem strange, given that Nvidia cards generally deliver better performance per watt (something all Macs struggle with).
Does any current-gen Apple hardware ship with Nvidia GPUs? To my knowledge it doesn't, which would explain why there aren't any drivers for it (the only people who could pair a GTX 1070 with macOS would either have to spend $500 on an eGPU enclosure or have a Hackintosh).
But this one has PCIe slots, so theoretically nothing is stopping the average Joe from just adding a GTX 1070? That'd certainly add a lot more justification for updating the drivers, I hope.
Cheers to everyone who wants nvidia drivers for hackintoshs :D
The 256GB SSD is for people who just have the OS and apps on the drive, while the multi-terabyte files they're manipulating (e.g. for video editing) reside on a NAS or SAN, hence the 10GbE and 40Gb TB3.
I know this can be controversial, but I do all my work at home on a computer I built myself for ~$900 running macOS. I had a Coffee Lake processor before you could get one straight from Apple. It's fast, and with three monitors, plenty of RAM, and NVMe storage I noticed increased productivity over a laptop with an external display.
With the T2 chip this may not be an option forever, and the initial setup means it may not be an option for everyone. But right now it feels like a very stable and viable option for those who were never going to buy a $6000 cheese grater in the first place.
I know several people who were waiting to see what Apple offered for a pro desktop machine. I replied to a comment that said "I waited for Mac Pro announcement to decide whether I'll stay with Windows or jump to macOS". The whole issue is that there is a segment of the population demanding a product that Apple is, again here, not offering.
Had I been able to justify it, I could have also chosen n-core Xeon processors (some people do go this route) and I still wouldn't have been anywhere near $6000, and that's the base model here. That's obscene, and in that light I think that my original comment is relevant.
This is the one aspect that was truly ridiculous. When that number was quoted you could hear the crowd pause in disbelief, followed by a rumble of conversation interspersed with laughter.
From The Verge:
> Still, all hope isn’t lost: we also don’t know how robust the box that the Pro Display XDR comes in is. It’s entirely plausible that you’ll just be able to prop up the $5,999 display with a hunk of cardboard or by leaning it up against a wall. Just be careful not to drop it.
People have been begging for this for over a decade now. It doesn't take much to see demand if you look at the Hackintosh community or the eBay market for old Mac Pros.
They've been purposefully avoiding the mini-tower market for decades. For whatever reason (prestige, cannibalizing other product lines, etc) they're not interested.
I know, and I get it. Maybe "mini-tower" is the wrong description. It should be a maxi-cube. I'm not holding my breath for it though, the things they sell huge numbers of aren't terribly configurable or expandable and it gives them an edge that way.
MacOS is what keeps me on Apple for now, but this box is not the answer everyone is looking for.
People just want an affordable desktop that you can easily expand over time. At $6000 just for the starting model, in my head I'm already planning to move back to Windows. The only major thing I might miss is iCloud. The cheese grater lineup is still the best incarnation of the Mac Pro overall.
People don't want to be forced to buy an integrated screen, nor do they want a headless laptop, and not everyone can live with a laptop alone. In other words, I don't want a semi-powerful machine trapped in amber or a larger version of an iPhone / iPad.
I just have to figure out a good replacement for iCloud and its related apps.
> People just want an affordable desktop that you can easily expand over time.
Who wants that? You can tell what people want based on what’s selling in the market. Most people want a laptop. Most of the rest want a Word/Excel machine. A small minority want an expandable desktop.
The same people who are into Hackintoshes and used, older Mac Pros on eBay. Neither community would be as large if Apple would just build a lower-end desktop. Why not just update the old Mac Pro and call it the Mac Pro Classic?
The solution I settled on was a Mac Mini to serve as my interface and run desktop applications, and a big hefty expandable Threadripper box running Linux that I SSH into to do all heavy lifting (and that I can put away in a closet). A NAS holds most files.
I've been really happy with the Mac Mini for this role, and the Threadripper box provides more power than I could reasonably get from any Mac.
I don't want an overpriced, glued-shut Mac Mini. I don't have issues with physical space. I tried the Thunderbolt/USB 3 chaining solution when I had an iMac, and I didn't like it. I want a reasonably priced Mac desktop like the ones from years ago, but that's not going to happen.
This is going to make my transition back to Windows much easier. Maybe the people at Apple have failed to realize that Windows has gotten a lot better under Nadella?
There's the Windows Subsystem for Linux as well: https://docs.microsoft.com/en-us/windows/wsl/about Not to mention stuff like docker also made working on Windows less of an issue and I think Ubuntu will be shipping on Windows by default in the future.
For me the sticking points on Windows that prevent me from considering it for more than gaming are 1) advertising to me within the OS, 2) frequent, aggressive restarts for updates, even when actively using the computer and 3) the higher quality productivity app ecosystem on the Mac. The first two signal a big lack of respect for their users, imo. I know you can tweak it, but they make it unnecessarily hard, and I think I’ve seen updates reverting my choices. Apple has also done some of #2, but not nearly to the same extent.
Also, the Mac Mini isn’t glued shut, first thing I did when I got it after verifying that it booted was to swap in 32 gigs of aftermarket RAM.
I’ve been using windows 10 as a gaming machine all this time and I haven’t noticed advertising. The restarts were annoying for my windows box acting as my video security server, but for personal machines you can easily schedule updates and restarts to when you’re asleep so I’m not seeing that as a real issue either.
You are right about the higher-quality software on the Mac. I will miss that, but I'm not sure how long it'll last given Apple's hardware direction. You can already see it with the rise of Electron-based apps.
It’s a pain upgrading the mini compared to a bigger box. You run into all kinds of annoyances. Also I want to upgrade storage as well and not just the ram.
What you save by not upgrading Apple’s egregiously priced flash memory, you can put towards a much larger NAS :-) With 10GbE becoming available, you can get pretty great performance from 6-8 spinning disks, with gobs of space.
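The claim that spinning disks can make good use of a 10GbE link checks out on the back of an envelope. The per-disk throughput below is an assumed typical sequential rate for a modern HDD, not a measured figure.

```python
# 10GbE link capacity, ignoring protocol overhead.
LINK_GBPS = 10
LINK_GB_PER_SEC = LINK_GBPS / 8  # = 1.25 GB/s

# Assumed sequential throughput of one spinning disk (typical, not measured).
DISK_MB_PER_SEC = 200

for disks in (6, 8):
    aggregate_gb = disks * DISK_MB_PER_SEC / 1000
    # 6 disks: 1.2 GB/s (just under the link); 8 disks: 1.6 GB/s (saturates it)
    print(f"{disks} disks: {aggregate_gb} GB/s vs link {LINK_GB_PER_SEC} GB/s")
```

So a 6-8 disk array striped or pooled behind a 10GbE NAS can plausibly keep the link busy for sequential media workloads, which is the use case being described.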
I could never really use a remote system as well as a local one - the various kinds of file syncing or mounting (samba, SSHfs, expandrive) have their problems or only work 99% of the time.
Yeah, I mostly use emacs, which is pretty terminal friendly, or ssh-piped Jupyter, which I love and hate. I used to try to make those remote mounting solutions work.
One thing you can do is run a Linux VM locally with VMWare’s shared folder stuff (annoying to get working but solid once working). Develop using your local IDE, test locally on your VM, and then check in/check out on the heavy weight machine once you have something ready to run something big. Depends on what you’re doing, though. For machine learning, Jupyter is niceish.
It's not like people haven't been able to run it illegally in a VM for at least a decade. That was one of the preferred ways of running it if your machine didn't have the exact correct hardware, back when I paid attention to such things.
No, no you won't. Xeons and boards that will run 2TB of RAM cost a bloody fortune. Just because it's not for you doesn't mean it's not a valuable product for others. I'll certainly be getting one (not for a while, but eventually) because I want an upgrade path, and dev tools and Docker can put to good use every GB of RAM I can afford.
I want a tower case, silent and performant. I want a powerful GPU for the few games I play. I want ECC memory. I want to put in one or two HDDs along with the main SSD. The Mac Mini does not sound like something I would like to use. I don't need small size, I have enough room. I don't need portability, I work from one place.
Blizzard has released a lot of games for macOS. I also played RimWorld and Factorio recently, and those games seem to have Mac versions as well. I'm not a very involved gamer, I just play sometimes, and it seems that the Mac supports almost all the games I need.
> Blizzard has released a lot of games for macOS. I also played RimWorld and Factorio recently, and those games seem to have Mac versions as well.
I don't think any of those games are GPU intensive. Blizzard seems to want their games to run on as many machines as possible, and Rimworld and Factorio are both 2D.
That depends on resolution. At ultra settings and 1920x1080, an integrated GPU won't be able to run World of Warcraft, for example. Even an Nvidia 1060 can't, and I think that's a pretty powerful GPU. And for 4K resolution, probably something high-end is required.
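The resolution scaling above comes down to raw pixel counts: 4K pushes exactly four times the pixels of 1080p every frame, so GPU load scales accordingly. A quick sketch of the arithmetic:

```python
def pixels(width, height):
    # Total pixels the GPU must shade per frame at a given resolution.
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p4k = pixels(3840, 2160)    # 8,294,400 pixels

print(p4k / p1080)  # 4.0 -- 4K is exactly 4x the fill-rate work of 1080p
```

That 4x factor is why a card comfortable at 1080p ultra can fall well short of playable frame rates at 4K with the same settings.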
A mini can be all of that, minus the ECC memory. 6/8 core CPU, 64GB RAM, external SSD (at 3000MB/s if you need that), external GPU. That's a beast machine for less than $2000.
I wouldn't recommend it for gaming as it doesn't shed heat fast enough. You can build a separate really good gaming PC for ~$1000 and still be far off from even the iMac Pro.
> I don't need small size, I have enough room. I don't need portability, I work from one place.
It's like the previous generation of Apple computers were primarily designed for people in crowded metros like Tokyo. I still don't understand Apple designers' obsession with compactness for their desktops at the cost of everything else.
> I worked at two movie studios (Pixar, Weta Digital) and they were both almost 100% Linux with a smattering of Windows machines here and there. Apple stuff was mainly iPads for story boarding and that kind of stuff.
> This image was rendered at 2048x858 with 256spp and took 3.5 hours to render. I allowed it access to 28Gb of RAM but had to estimate how much embree would use for BVH data so it occasionally would exceed that number enough that the OS would start paging.
Mac Pro will start at $5,999, the Pro Display XDR at $4,999 (without stand); the matte version will cost $5,999, and the stand for the display is $999, or $200 for the VESA mount adapter.
“If you think the computer is expensive, you’re going to completely lose it when you find out the salary of the boffin who will put it to good use. Or, more mind-numbing yet, the cost of having neither.”
I am not sure about aerospace, but for mechanical engineering, SolidWorks is a must, and it is only available on Windows. Many other important pieces of software, for example ANSYS, ABAQUS, and Fluent, are not available for the Mac. In fact, the only mechanical engineering software I know of that is available for the Mac is AutoCAD.
I think Apple are trying to make the point that they have invested again in this space, so they hope that software developers will also invest. The expectation might not be for an immediate switch from software developers, but laying the groundwork for something longer term. We'll see how it pans out, but Apple are experiencing pressures they didn't experience a few years ago.
This is in the context of Apple increasing its prices to ludicrous levels across its product lines though (look at the iPhones!). I'm sure it's great for short-term profitability, but I think they'll come to regret this.
They used to do premium performance and design at a premium price. For the macs they just seem to do a bit of design lately, and that's considering that PC's look better now than a decade or two ago and that apple still pushes proprietary stuff like this MPX Module.
Hardware was many orders of magnitude more expensive to manufacture in 1987[1]. The base model that you get for $6000 is not impressive at all by today's standards. The max specs would make it a powerhouse but it probably will also cost ~$12000.
If aerospace get computers like this, I'd love to hear where I need to be applying to. We're running on Windows 7 machines with i5-2400 CPUs, about 4GB of RAM, and integrated graphics. All our CAD software is run on a grid system, where we access e3, Altium and Catia via a remote server. The engineers are all on hardware about 7 years old.
They were priced like a low end Sun computer, so that must have been what they were after [1]. But it still didn't make sense. The near-identical Amiga 2000 in 1987 was 1495 dollars.
It's also no strategy for a healthy personal life. Neither is being so uptight about everything. The comment was made tongue-in-cheek to include the name of the phenomenon for posterity and for those who would like to research it further :)
Since most of us here are wearing the developer hat, I think it's worth remembering that the target audience for these machines is pro creatives.
(those who didn't ditch Apple yet ;) )
It's also only the second time I can remember Apple reverting a critical interface removal (the first was FireWire on the first unibody MacBook).
Apple lost a lot of users when they ditched PCIe. You can buy those Thunderbolt-to-PCIe cages and hang a lot of external media drives off your machine, but for pro users that meant the new Mac Pro wasn't well suited to a machine room or to stacking under the table.
Back when Apple announced the switch to Intel, a lot of studios had to do an expensive upgrade, replacing not only their old G5s but also their additional DSP/accelerator cards, going from PCI-X to PCIe. When Apple showed the trashcan (Mac Pro 2013) it was a hard sell that kept many folks on their Mac Pro 2010-2012.
While it's not stylish looking (imho) or the ground-breaking modular design I'd assumed, it's the first time in a while that Apple is responding to their hard-core users.
Price? I had macOS running on my AMD Athlon X2 back in the day (Leopard!), and even then Mac Pros were much more expensive if you compared pure hardware. Professionals got Mac Pros because they preferred the peace of mind. And I must admit, with the exception of memory and GPU failures, I know many of those machines that are still rockin'.
The last thing Apple needs to do to give pros their freedom is to sign Nvidia's drivers...
But as a developer, I wanted a desktop class CPU and just a separate monitor (tired of hearing my laptop fans all the time, and it not being expandable), so I can upgrade them independently.
So while developers are pros, we won't appreciate things like XDR because we don't need them, unlike movie studios and photographers. So a $5k monitor is overpriced for editing code. But I honestly would've shelled out even $2k for a good monitor. $5k is too much.
The computer seems like a better value if you know you'd keep it for 5-10 years.
I’m right here with you. Just built a Win10 machine because of thermal issues. I don't even mind having a laptop, but is it too much to ask to pay iMac prices for mid-grade hardware??
The thing is, it doesn't need to be Win10 machine.
Even if you're developing for Apple.
If you need a "Mac dev" machine you can set up any Intel-based machine with macOS. It's not "plug-n-play", but it's not that far from getting Linux running with the drivers and performance you'd expect.
> But as a developer, I wanted a desktop class CPU and just a separate monitor (tired of hearing my laptop fans all the time, and it not being expandable), so I can upgrade them independently.
What's wrong with the Mac Mini? Sure, there's no way to upgrade the CPU, but considering the resale value of Apple hardware replacing the whole machine every few years is actually a worthwhile upgrade strategy.
Of course the same sell-and-replace upgrade path applies to the iMac (Pro). I understand that you would like to keep the display separate, but it's a fantastic display which doesn't really have any alternatives right now. I don't see how you could do much better in that category.
Of course, if MacOS isn't that important to you, this discussion isn't worth having - in that case you could buy whatever you want.
I wish macOS wasn't as important, but honestly it's the most important. I don't want to switch back to Windows or Linux at this point since I'm too invested in the Apple ecosystem (and it's certainly the most privacy conscious option right now).
Same here. And that's exactly why the choice between upgradeability and macOS is still an easy one, at least for me. Sure, I'd love to build an affordable, modular computer, particularly since AMD's renaissance, but not if it means sacrificing macOS.
> Since most of us here are wearing the developer hat, I think it's worth remembering that the target audience for these machines is pro creatives. (those who didn't ditch Apple yet ;) )
I seriously wonder about those who haven't left Apple yet and are willing to shell out $6k for a new Mac Pro. There was a small cohort of studios and creative professionals who invested in the old Mac Pro trashcan and got severely burned by hardware that was overpriced, obsolete, and non-upgradable. Almost everyone I know who bought one of those ended up re-investing in a W10 PC with 1080s, TITANs, or now 2080s.
You can't do anything meaningful in 3D with the specs on this machine, so it's instantly DOA for the entire top end of the (read: willing to spend $$$) creative field: motion graphics, animation, modeling, game design, VR, vis fx, simulation, etc. Apple took so long to ship this that those users have all experienced the wonders of a ~$3k-$4k computer, bought 2 years ago, that runs circles around even the highest-spec Mac Pro.
The next tier down, your UX designers, visual designers, sound designers, print designers, type designers, etc. have no need for this that a Macbook Pro doesn't serve.
Mac still has Final Cut Pro, Logic, and Sketch. Those apps are what tether creatives to the OS, not much else. But I think many, many people are seeing that PCs are becoming the better route for the creative industry, generally.
This is the first Mac in many years that feels like it earned its "Pro" moniker, especially at that price point. It has a proper power supply. And cooling! And you can swap out/upgrade the hardware. It's not a bunch of thermally throttled laptop parts soldered onto some mini motherboard.
Lack of NVIDIA support is a deal-breaker. The AMD ecosystem is just so far behind when it comes to frameworks like CUDA, OptiX, cuDNN, etc. Why can't Apple open up kernel-level support by cooperating more with NVIDIA? This state of things seems completely bizarre to me.
I know this is an Apple/Mac discussion, but from a Linux/OSS perspective, it looks very different. If it weren't for CUDA, Tensorflow and so on, I wouldn't even consider buying NVIDIA (anymore).
Currently, I have a laptop with NVIDIA graphics and a desktop PC with AMD graphics. The AMD stuff just works; out of the box (OSS, good performance, happy user). But NVIDIA either comes with nouveau (OSS, poor performance) or the nvidia proprietary binary driver which has all sorts of weird issues (e.g. always-on fans, animations running at different speeds, etc.).
Sure, this doesn't say anything about how a Mac would run with NVIDIA drivers, but it hints that NVIDIA has its own weak spots. The reason Apple decided to skip NVIDIA in this case is probably rooted deeper in their strategy, though.
There was an anecdote on a semi-recent (sometime this year) episode of ATP where an inside source related to them that nVidia were a terrible organization to try and work with. They apparently screwed Apple over, and Apple has a long memory for that sort of thing.
The AMD ecosystem will always remain behind if application developers continue to choose single-vendor proprietary frameworks rather than standard APIs.
There was some sleight of hand in the benchmarking for me too: they compared GPU compute rendering engine speeds to "Nvidia Quadros", when no one doing CUDA rendering on Windows uses Quadro cards. They all use GTX because the price-to-performance is ridiculous on Octane Render/Redshift/etc.
Really excited to see the future of this hardware even though I'm not the target demographic. The ability to swap out with newer PCIe components down the line will be nice and it appears Apple is likely using some sort of desktop socket, so mounts and adapters for customizing the CPU will be available from 3rd parties eventually. Hopefully Intel doesn't change the CPU socket out from underneath everyone too soon.
I'm also curious whether it will support PCIe Gen4, since Apple has obviously worked closely with AMD. I'm not up to speed on whether Intel CPUs can take advantage of that hardware yet, though, or if we'll have to wait a year. Maybe that's why Apple didn't give a hard release date?
What I'm really disappointed about is the new display. I would gladly pay another $1-2k, similar to the old Cinema Displays, for something that's been updated but without all the HDR features. $5k for a display is insane for anyone except video professionals who need the HDR capabilities. The icing on the cake is getting nickel-and-dimed for the stand too. At $200 I think they could make a good-faith argument, but $1k, without even a VESA mount in the box, is really disappointing.
You could definitely tell the crowd was bummed on that tidbit in particular.
Overall though, still a great showing by Apple and a move in the right direction for power users.
Can people really use all this power with the GPUs offered?
The base is a 580X. The original AMD 7870 was a great gaming card in 2012... but boy the latest attempts to keep that thing going with the 480 and 580 are really pushing it. It's still a good budget gaming card but an embarrassment to offer with a Pro computer at that price.
I love that Vega 2 can come with so much RAM, but until the world moves away from CUDA, are there a lot of uses for this? Even if I got this desktop for free I'd have to use my old Windows PC for anything using the GPU because it has nvidia.
The 580X is an extremely capable GPU for normal usage scenarios. If you're doing compute, they offer the Vega 2 which is very competitive with anything on the market.
People doing CUDA work are not using Macs. And Mac apps that do use compute will take advantage of it.
I wish Apple wasn't so anti-nvidia, but at the same time an awful lot of the comments in here are basically saying "People who would never have bought this now won't buy it because..."
The important question is are the people who want $6,000 workstations using CUDA?
For me personally, I would LOVE a monster workstation, but if I found a Mac Pro under my Christmas tree I'm not sure I would actually use it much. I would have to keep switching back to my old Nvidia desktop to do the things I specifically want a monster workstation for.
Complaining only signals that you are not the market they’re targeting. This is for Hollywood media makers, high end studios, etc. This was never to be a mass market item.
I work in Hollywood. The complaints are here too. But people are more concerned with upgrades and repairs. Loss of a machine for a few days/weeks for a repair is far more costly than anything else.
That said a lot of the industry is moving to the cloud anyway and doesn't require workstations. Some shops are entirely on Linux. Audio post/recording and editing are the last bastions of mac pro dominance, imo.
I do music and audio in Hollywood, and yup, the Mac Pro will never die here. Which is funny, because this machine isn't exactly for us, either -- a Xeon is often less favorable than an i7 or i9 in our world.
There is, but it's changing. Disclaimer, I'm on the audio side of things and my knowledge of video post is much more limited. Dedicated processors are bigger for us than cloud solutions.
Audio post/recording shops use Protools HDX systems [1] which are rackmount units holding a handful of DSPs connected to the PC with proprietary PCIe cards. The industry is moving towards ASPs as a whole, through products like Waves Soundgrid [2] and Universal Audio's UAD [3]. Of these, PT HDX has an API for developing 3rd party code to run on their hardware (it's proprietary however) and the SOUL [4] project is aimed at these kinds of solutions.
Dedicated audio processing units are about where GPUs were 25-30 years ago. Almost all proprietary, no shader languages, but CPU time is coming at a premium so we'll have to figure something out.
That's not to say that having a fast machine isn't important; it certainly is, since a large chunk of DSP is not running on dedicated hardware (and that hardware is freaking pricey). But in most situations your audio processing is dominated by single-threaded performance. That's why some people were irritated by the iMac Pro being advertised as a pro workstation when it will thermal throttle under moderate load, and why this new Mac Pro, which hopefully has better cooling and a better processor, will be a huge relief for users and developers who want more CPU cycles for our code.
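To make the single-threaded point concrete: most audio effects are recursive filters whose output feeds back into the next sample, so one channel's samples can't be split across cores. A minimal sketch in Python (the coefficients below are placeholders, not a real filter design):

```python
def biquad(samples, b0, b1, b2, a1, a2):
    # Direct Form I biquad filter: y[n] depends on y[n-1] and y[n-2],
    # so the samples of one channel must be processed strictly in order.
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x  # shift the input history
        y2, y1 = y1, y  # shift the output history (the feedback path)
        out.append(y)
    return out

# Feedback example: y[n] = x[n] + 0.5 * y[n-1] (i.e. a1 = -0.5).
# An impulse decays geometrically; each output needs the previous one.
impulse_response = biquad([1.0, 0.0, 0.0], 1.0, 0.0, 0.0, -0.5, 0.0)
```

You can parallelize across channels or across independent tracks, but within a track the feedback chain is inherently serial, which is why clock speed and per-core throughput matter so much here.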
With respect to cloud, some applications just simply can't use workstations. VFX stuff is mostly on distributed Linux systems these days because you can't put the power you need into a single box, or at least it's more effective to have people edit over 10Gbit connections through a terminal.
The rest of the cloud stuff is more about workflow than power, at least in the stuff I've seen most cloud applications are about synchronizing projects in the highly parallelized production pipelines used in Hollywood studios. NAB is littered with cloud workflow solutions every year.
If you're looking at kitting out an entire department with these or buying the equivalent rendering firepower in the cloud, the cost of a couple of leased lines and cloud interconnects are probably going to be the least of your worries.
I worked at two movie studios (Pixar, Weta Digital) and they were both almost 100% Linux with a smattering of Windows machines here and there. Apple stuff was mainly iPads for story boarding and that kind of stuff.
This is more for other readers as I'm sure you know this, but VFX / animation facilities are very different than other post-production functions like editing and sound, which almost always run on Macs. Final Cut and Logic are mostly the reasons for that (Blackmagic is making a real effort to change that though).
Well, you forgot about Pro Tools, which, although I really dislike it, is even more important than Logic in post production. And Avid/Adobe on the video side. All of these will run on Windows, and I am seeing increasing adoption of Windows in these places.
I work in Hollywood (editing television/movie trailers)... at least our agency is editing off of iMacs. I can't speak for others, but shared storage (Avid Nexis) is very costly. We transcode to proxy material, cut with proxies, and then uprez/conform to HD, UHD, etc. No one (at least no agency I can think of) is editing higher than 1080p. Hell, you even end up editing in low-res, super-compressed mp4 for some projects.
How many Hollywood media makers are there? As an Apple shareholder should I be excited to learn that they made a specialty computer for this demographic?
2015 was a banner year, nearly 35,000 sold in the USA. Only about 19,000 last year. Estimates from a number of years ago were of about $10,000 profit on each sale
There are a lot of Youtube stars who would jump at a Mac Pro if it meant they can turn around videos quicker. Time is definitely money in their industry as well.
Point is that "professional" is a lot broader than just a few big enterprises.
supporting the highest of the high end is beneficial for the platform overall. almost no one who needs this machine will sweat spending $15,000 on a configuration.
What's wrong with making a mass market Mac Pro? I mean, old Mac Pros were much more reasonably priced. It's not like inflation was that huge over recent years or computer components suddenly have much higher costs.
I don't see how "the target market will still pay for it" is justification for insane markups on commodity hardware.
I understand paying a premium for the R&D that went into the case and cooling design, but then they go and put such dated hardware in it and still start at $6000.
I once heard an opinion that if you see an ad in a place you often visit, then you are the target. If it's out of your reach now, you're supposed to be impressed and desire to get that thing some day. And complaining about the price is a form of admiration; would anyone complain (not ridicule) about something they think is useless?
Well, if the baseline for the Pro market was made up of people who could afford the last version, then one can make the argument that Apple is the one that has redefined "Pro".
>> It seems evident that Apple views the "pro" market as movie studios, large corporations, and media entities that need serious computing power.
If Apple is going to price out people who had purchased Mac Pros in the past (the lowest point of entry for the new model is -double- the price of the lowest late 2013 model), I don't really have a problem with that group complaining about the price.
For that particular group, this new Mac Pro wasn't worth the long wait at all.
You can make an argument about performance multiplier, but I would bet that there are people who owned the trash can Mac Pros who want access to the new -form factor- at the $3-4K price point.
I'm sure the number crunchers at Apple (and you) are OK with losing those people, but it's a slippery slope, IMO.
According to their own marketing, the new 18-core model is between 2.2 and 3.7 times faster than the 2013 12-core model. They don't compare the entry-level ones, but I highly doubt it is even twice as fast.
The criticism isn't just the price; it's that the hardware does not match the price at all. A similarly specced PC would cost half as much, and would probably be able to use things like Nvidia cards with CUDA, which has become pretty important for some of the groups you mention.
If it was a beefy workstation with specs to match the price I don't think people would complain (or at least as much).
Yeah, HN commenters just want an expandable mid-sized tower: a gaming PC, but a Mac. But the best use case for a gaming PC is to play games... which the Mac will never be the right platform for. Even if Apple signed Nvidia's drivers, if you want a gaming PC just get one.
I was really hoping Apple would price the Mac Pro at a reasonable starting point, hopefully stymying the flow of developers from Apple hardware to alternatives. The features are obviously right for extreme creative pros, but I don't think any developer in their right mind should pay this much for such little hardware. I would have loved to have seen support for the LGA 1151 socket, off-the-shelf Nvidia GPUs, and user-replaceable RAM and storage.
My company offers a $2500 hardware budget, and I have a refresh coming up this fall. I already have a MacBook Pro, so I want to go with a desktop this time around. With that budget I could get a Mac Mini with an i7-8700, 32GB of RAM, and a 1TB SSD; or I could build a PC with an i9-9900K, 32GB of RAM, a 1TB SSD, and the same GPU as the Mac Pro _and still have $1000 left of my budget_.
Too bad that would only get me the Pro Display stand. That would have been a nice display to have.
> The starting point for Mac Pro, the stainless steel space frame
Only Apple could manage to call a bent piece of metal pipe a "space frame".
edit: I'm getting downvoted because a "space frame" is a real thing. However, those are very specifically designed as a "three-dimensional truss based on the rigidity of the triangle", and this is absolutely not that. This is Apple co-opting an existing term and applying it where it doesn't make any sense.
Sure. Looking at each of the external sphere holes, each seems to be associated with a triangle of connected internal sphere holes. The central point of each sphere is therefore associated with the central points of three others in an adjacent plane.
It is easily possible to conceptualise the 'grate' of this mac-pro as a set of interlocking tetrahedra of origins of the spherical spaces, instead of interlocking tetrahedra of the joints; a little like a photo-negative.
I dislike the appearance myself, it provokes a mild feeling of trypophobia, but I don't see anything dishonest about their term.
> In architecture and structural engineering, a space frame or space structure (3D truss) is a rigid, lightweight, truss-like structure constructed from interlocking struts in a geometric pattern. Space frames can be used to span large areas with few interior supports. Like the truss, a space frame is strong because of the inherent rigidity of the triangle;
I don't see any triangles here.. All the examples on that wikipedia page show triangles. Does the frame they built actually count as a space frame or did they coopt the term?
From what pics I've seen, it looks the circles are arranged in a triangular pattern. So there are your triangles. (One of my most eye opening realizations when I was in engineering school was that just because something does not superficially look like something else, doesn't mean they're not functionally similar. My particular realization was with four bar linkages, which is another fun structure to know about.)
Also, it looks like there's at least two layers, so it very well could be a 3D structure and not just a flat panel.
So, all in all, I'd say it probably counts as a space frame, though I'm not part of the Space Frame Police, so who knows.
Just for kicks, I too google image searched for "space frame" and while most of the images were of entirely triangle-based space frames, there were in fact some that showed constructions using curved bars. So if we're using google image search as an authority, it's now proven that space frames can have curved bars. Notably, most of the curved bars I saw were in boat- or car-related designs, if you want to see the "proof" (though I would hardly call google image searching for something to be actual proof, but since it seems to be a metric you're using, I'll go with it).
Can you show me an example of one? A google image search for "space frame" shows lots of triangles, and not a single thing that consists of a curved bar.
Those of us who are slightly older may still remember how much real workstations (as in a Sun SparcStation or an SGI Indigo) cost back in the day, and are less shocked by the pricing :-)
Computing has really been democratized over the last 25 years.
Those workstations were substantially different (and faster) than commodity x86 hardware at the time. Today you can buy a latest-gen 8-core CPU for $300 or so.
The sad truth is they really weren't that much faster. Sun workstations were using the same 68k processors that Macs were using. There is a reason the Unix Workstation market didn't survive.
That said, this Mac is not as overpriced as it appears on the surface. A server-class motherboard with ECC RAM support and an 8-core Xeon aren't cheap components. I'm guessing they aren't using bottom-of-the-barrel SSDs either, and 32GB of ECC memory isn't exactly free. I wouldn't be surprised if the BoM on this was over $3k.
Sun's mid-1990's SPARCstation models used a SPARC CPU (hence the name). The single-core workstations were a bit faster than the top of the line 386 or 486 available at the time. As I understand it, the draw was the CPU and OS compatibility with Sun's million dollar servers.
Not necessarily that much faster. What people mostly paid for was not the hardware itself, but the whole package, including the operating system.
Which is exactly what people pay for when getting Macs, especially on the high end. You can get an Intel CPU in a box for cheap, if that's what you're looking for.
And look how great Sun and SGI are doing with their workstations today :). The list of companies, who are willing to spend 10k or more on the workstation of a single employee is growing short. And some of those still need NVidia graphic cards.
Sure, this is not a mass-market product. But I'm actually glad that the concept of a high-end workstation is making a comeback. Not everything needs to be dirt-cheap, we should not necessarily feel entitled to every product on the market being in our price range, and I'd love to see some competition on the high-end.
Yes, it is great that they have a real high-end product. But in the past, at least the starting price of the Mac Pro was in enthusiast-range. There should be a product between the Mini and the Pro which supports a desktop graphics card.
On that, we agree. I write code, and I used to own a 2008 "cheese-grater" Mac Pro. That one was affordable enough for developers, came with a low-end video card (which was fine!), supported lots of RAM, and provided a rock-stable machine for development. The new one, not so much, it seems to be mostly good for specialized applications (like video editing).
But on the other hand, I don't think I'd want to buy this Mac Pro anyway, because of the display situation. There are no good external 5K displays, and most of the PC world is stuck in 1080p (or thereabouts) because of gamers. So, if I am to get a computer with a fast CPU and a good 5K display, I will be well served by the iMac or iMac Pro.
I am stuck in the same situation. I am using my Mac for development and photo processing. Currently I am using a late 2015 5k iMac. The screen is beautiful, I wouldn't mind a somewhat larger screen though. From that sense, the new 6k screen has the perfect size. The big problems I have with my current iMac are:
- disk space, I need more disk space and have no good solution for that
- I would like a real graphics card; even StarCraft struggles at 1920x1080 resolution.
Even the iMac Pro doesn't really solve either of those, at an already too-high price. I would have been willing to spend around 6k on a basic Mac Pro with a display, which wouldn't sound impossible considering that in older times the Mac Pro started well below 3k, but the new one is in an entirely different league. I am not sure why Apple thinks that "Pro" users are a synonym for video makers. There are photographers, there are software developers and many more.
I was expecting an "April fools" banner to pop out any second during the segment (for the look, not the specs, which seem quite interesting though (very) pricey).
This vent is uncharacteristically ugly for an Apple product. I have a hard time believing that this sphere pattern is really optimal for performance. Maybe optimal for minimizing the number of machine operations required to produce it... Those holes are just comically large.
The price saddens me. The $6K base is twice the base of the 2013 Mac Pro when it was released. Some quick spec-ing for a somewhat-equivalent DIY build gives somewhere around $3K. The build I was already spec-ing for my next workstation has 2x RAM, 16x SSD, SFP+ NIC and is under $5K.
I hope at least that the SSDs are standard M.2, but looking at the pictures I’m not sure they are.
Do they have to be special sauce drives? The picture makes it look like standard M.2 drives. It doesn't seem like you should need anything special if the T2 chip is just acting as a bog standard drive controller that happens to send only encrypted data to the drive.
I believe those drives are just the flash, as the T2 chip actually acts as the controller. That might be wrong but that's what I've gathered from how the iMac Pro works and I assume this is a similar configuration.
That is the one that really got me. I expected the Mac Pro to start at a ridiculous price and I wasn't shocked by the monitor but $1000 for a monitor stand is crazy
Have you ever bought a nice piece of furniture? This stand is a super-low-volume, custom design. It's solid milled aluminum and likely very hefty, since it needs to handle a 32" monitor (which itself is probably pretty hefty). Not to mention the intricate hinge design that they say is rock solid (not an easy feat to achieve).
This whole system is built to compete against $25k-50k reference monitors.
Here's a page of Table Lamps from Restoration Hardware for a sense of perspective:
And the whole reason they don't include the stand with the monitor is because it's built for media production houses that already have rigs set up with their own mounts.
They're not building this for designers to run Sketch or developers to run Xcode.
This is for production servers, processing Pixar movies, and YouTube features. It's a machine that will be expensed as the cost of doing business.
That link does not help your case: a $300 fashionable lamp that you can use for life and that is a major fixture of a living space, vs a $1,000 monitor stand? Congrats, Apple, you made Restoration Hardware look like IKEA in terms of pricing.
Did you see how big that monitor is? It's going to be a permanent fixture on your desk. It's as much a part of your room and your life as anything else.
In any case, the only issue here is how they presented the pricing. They should have just said that they're selling a $6k display but if you don't want the stand, they'll sell it separately for $5k. If they framed it like that then no one would complain.
Am I the only person more concerned about the new "Find My Mac" feature? Call me a cynic, but this feels like it has all kinds of privacy problems.
"The new Find My app combines Find My iPhone and Find My Friends into a single, easy-to-use app on Mac, iPadOS, and iOS devices. Find My can help you locate a missing Mac — even if it’s offline and sleeping — by sending out Bluetooth signals that can be detected by Apple devices in use nearby. These devices then relay the detected location of your Mac to iCloud so you can locate it in the Find My app.
It’s all anonymous and end-to-end encrypted so no one, including Apple, knows the identity of any reporting device. And because the reporting happens silently using tiny bits of data that piggyback on existing network traffic, there’s no need to worry about your battery life, your data usage, or your privacy."
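For what it's worth, the "anonymous and end-to-end encrypted" claim maps onto a well-known construction: the beacon is a public key, finders encrypt their location to it, and only the owner's private key can decrypt. Here's a toy sketch of the idea in Python, using textbook finite-field Diffie-Hellman purely for illustration (the real protocol uses rotating elliptic-curve keys and proper AEAD, not this):

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters, illustrative only.
P = 2**255 - 19  # a large prime
G = 5

def keypair():
    """Return a (private, public) pair for the toy group."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def stream_key(shared_secret: int) -> bytes:
    # Derive a 32-byte key from the DH shared secret.
    return hashlib.sha256(shared_secret.to_bytes(32, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-shot XOR over a short payload; a toy stand-in for real AEAD.
    assert len(data) <= len(key)
    return bytes(a ^ b for a, b in zip(data, key))

# 1. The sleeping Mac broadcasts only the public half as its Bluetooth
#    beacon, so passers-by never learn whose device it is.
owner_priv, beacon_pub = keypair()

# 2. A nearby device spots the beacon, makes an ephemeral keypair, and
#    encrypts its own current location to the beacon's public key.
eph_priv, eph_pub = keypair()
finder_key = stream_key(pow(beacon_pub, eph_priv, P))
ciphertext = xor_cipher(finder_key, b"37.3349,-122.0090")

# 3. Only (eph_pub, ciphertext) is relayed to the server, which cannot
#    decrypt it or link the finder to the owner.
# 4. The owner fetches the blob and decrypts with their private key.
owner_key = stream_key(pow(eph_pub, owner_priv, P))
location = xor_cipher(owner_key, ciphertext)  # b"37.3349,-122.0090"
```

The decryption works because both sides arrive at the same shared secret (G^(owner·eph) mod P), while the relay server only ever sees an ephemeral public key and an opaque ciphertext, which is the basis for the "not even Apple can read it" claim.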
The copy about the lattice pattern front being inspired by nature for airflow and rigidity is pretty ridiculous.
It looks cool to have little spheres cut into a metal block, but that's it. If you were actually optimizing for airflow or density you would do something entirely different (or you would look at the effect the grille has on airflow, realize it's minimal regardless of what you do, and move on to other priorities).
On the other hand it looks like it would attract dust and be obnoxious to clean.
Not sure what you are talking about. They actually were optimising for airflow. If you look at the picture, you couldn't design it to allow more air through whilst maintaining a rigid structure: https://www.apple.com/mac-pro/design/
As for dust/dirt you could just spray it with compressed air.
If your real goal were airflow, you could just leave the front completely open, or perhaps have a small number of bars crossing it for rigidity.
You also would not pick spherical cutouts; the bowl shapes are going to add drag.
This is only being done to show off manufacturing sophistication, you can say you did something that looks cool without adding nonsense about performance.
Do they need a rigid structure on the faceplate? It's got the space frame bars, and there's a standard plate with a grid of holes behind the fancy faceplate. Just use the standard grate.
For developers, there's another constraint in investing in the Mac Pro: the increasingly high likelihood Apple will move to ARM:
> Mac Pro + display coming in at $11,000 a year or so before a switch to ARM is a hard nope from me. It's a little freeing, so I can now actually get a Mac mini
Why on earth would you ever even consider getting a Mac Pro if a Mac Mini will do the job you need? Unless you have $5,000+ just sitting around you want to throw away on street creds? This is a serious question, because I really cannot possibly fathom what scenario would actually put you in that position as a buyer other than, "I have money and I like to spend it".
Also the idea that ARM hardware is somehow a threat to your "investment" is just random FUD. First, it's still just speculation. But second, assuming it's 100% true -- you're kidding yourselves if you think Apple is suddenly going to appear with magical Xeon-class 28-core ARM CPUs sometime in the next year that can compete with what they're offering here. Try 5+ years (minimum) after their first desktop-class CPUs are proven rock-solid. This Mac Pro cycle will probably last that long. If they do switch anytime soon, it will likely be only lower end machines like the MacBook Airs and non-Pro models. And even then, if history is anything to go by, they'll offer 5+ years of binary compatibility for newer devices to run old applications and multi-target toolchain support for developers to target all systems. People keep computers, even laptops, for longer and longer now. x86 macOS machines are not going anywhere, anytime soon, and Apple knows not to randomly segment/abandon users like that.
I'm not going to buy this machine for a number of reasons (including the astronomical base cost), but FUD about future magical ARM machines doesn't hold up at all, IMO.
It's what people asked for – a workhorse, designed to be fast, reliable, and reasonably quiet. They tried the "beauty first" approach on the 2013 version and it was a disaster.
They can accomplish all that, just without the tasteless design.
The stainless steel frame contrasts with the shell, and not in a good way. The ventilation holes are really weirdly proportioned and still manage to look cramped with that inner baffle. And finally, that Apple logo on the side stands out a bit too much; on the old cheese grater it was a darker gray and a bit smaller.
The old G5 chassis had an understated and refined look; each of its elements blended seamlessly into the others. This thing is just sharp to look at; my eyes don't know where to rest on it.
However, design is always about compromises. The G5 had more beautiful handles, but they had sharp edges and weren't nearly as ergonomic as the new ones. And I wouldn't be surprised if the weird holes in the new Mac Pro allow for twice the airflow. Both are tradeoffs that I, personally, welcome.
Yeah, I, too, am personally not a fan. It honestly looks like something the Empire from Star Wars would put together. I'm looking at my 'Two Towers' (named after The Lord of the Rings, not the disaster!) - a Quad G5 with a 256GB SSD/1TB HDD and 16GB of RAM I use for my music production, and a 6-core MacPro5,1 with 32GB of RAM and two 512GB SSDs I use for video editing.
They're just beautiful, and they already do more than I could ever need them to do, as a music label manager. I don't see myself replacing them.
What about Mac Mini + minimum viable ssd (256-512GB) + eGPU + decent SSD like Samsung T5 (budget) or X5 (high end) ?
After today's announcements this is what I am looking at. The only problem I see is that current eGPUs do not support the 5K Thunderbolt monitor, with the exception of Blackmagic (but, like most Apple products, those are crazy expensive and non-upgradeable).
Until that 5k monitor situation is solved, I am stuck with iMac. I tested various 4k monitors, and somehow I find the 5k Retina (1440p) to be sweet spot for daily usage & development.
Edit: Funny thing is, if you count the $$ for the above setup, you will find out it costs as much - if not more - than a similar 2019 iMac. So it probably comes down to the question - do you really need that top-of-the-line GPU from AMD (which isn't that fast anyway)? It seems that buying an iMac that you can still sell for 50% of the original price after 2-3 years and simply getting a new one with bumped specs might give the same result as being able to update your RAM/GPU/SSD/monitor.
Seems that the iMac truly is the sweet spot for most "pros" that want a desktop.
No Nvidia? No AI?
Boot to Windows?
Might as well use a Mac mini?
Starting to learn Hackintosh after my two 2010 Mac Pros die?
Is that thing the ugliest thing I've seen in years from Apple?
Very sad day. In the first moments, I thought everyone's wishes had come true and they had brought back the old-style Mac Pro with a redesigned housing. But the old Mac Pro had started around 2-3k, so while never cheap, it was not out of reach of an enthusiast user. While the specs of the new Mac Pro are certainly impressive, the price is just out of the range of all private and most professional users. Not sure why Apple refuses to sell a machine which just allows you to put in a normal desktop graphics card.
Also: they are selling a "Pro" desktop Mac without real mass storage? Just 4TB of SSD? (Of course only available at Apple prices.) The old Mac Pro had a lot of storage slots, both for 3.5 and 5.25 inch drives, but the new one couldn't house at least a couple of 2.5 inch bays as well as NVMe slots?
Magnetic drives perhaps not, but I was mostly thinking of SSDs. But indeed, if you can use some of the PCIe slots for storage, it would probably be ok. Still a too large gap between the Mini and the Pro.
You have to bear in mind the target market - notice that it has dual 10Gb Ethernet, and that many users will be using a NAS. There are a shedload of PCIe slots there too for more storage.
I still do some work on my 2010 iMac. It cost me 3,400 EUR. It will be its 10th anniversary soon. That's 28.3 EUR/mo over 10 years for a piece of hardware that helped me earn a lot of money. This new Mac Pro, the base one, will cost me around 57.6 EUR/mo over 10 years. No monitor included, but if we include the new one, it will be around 106.9 EUR/mo over 10 years. I don't know if that is too much, but I find this new Mac Pro too expensive, and I am not even choosing big upgrades; if you go a bit crazy you can easily calculate 200 EUR/mo over 10 years.
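The amortization above is just price divided by months of service; a quick sketch (the 6,912 EUR base Mac Pro figure is an assumed local price backed out from the comment's 57.6 EUR/mo, not an official Apple price):

```python
# Back-of-the-envelope monthly cost of hardware amortized evenly
# over its service life. Figures are the rough EUR numbers from
# the comment above, not official pricing.

def monthly_cost(price_eur: float, years: int) -> float:
    """Purchase price spread evenly over the machine's lifespan."""
    return price_eur / (years * 12)

print(round(monthly_cost(3400, 10), 1))  # 2010 iMac -> 28.3
print(round(monthly_cost(6912, 10), 1))  # base Mac Pro (assumed) -> 57.6
```

Of course this ignores resale value, which is the parent comment's counterargument for the iMac.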
This new Mac Pro design gives me trypophobia. Do we really need that many holes on the front? Somehow this "industrial design" is really giving me goosebumps. That's a first for an Apple product...
Thanks for my new word of the day: "Trypophobia is a proposed phobia of irregular patterns or clusters of small holes or bumps."
Someone elsewhere on the thread said the design reminded them of something the Empire from Star Wars would make. It seems there may be a physiological reason for this kind of reaction to the design.
People complaining about the price and comparing hardware specs tend to forget that the price incorporates running the world's best operating system with a visual UI. Although the hardware specs might not always be too impressive, consider that macOS can actually max them out, as it's purpose-built for the hardware it runs on, which is not the case for most Windows/Linux systems.
And no, I'm not an Apple fanboi, but I think the products are pretty solid.
I've waited for this for a few years.. Until I stopped caring last year. Alas, I'm underwhelmed. Mostly in the form factor.
I had an Intel beta machine back in the day. Apple sent everyone an iMac that year; we had to send those back. I had that and a MacBook Pro for a few years until I got a 2009 Mac Pro. Had to retire that when OS X upgrades stopped working on it. I loved that machine and put it through its paces for many years.
In 2014 I got a trashcan Mac Pro and that's probably been the most beautiful Mac I've ever owned. I know a lot of people don't like it, but it's been quietly humming away on my desk for years. It takes up about a 7" x 7" corner of my desk. It's perfect. I love stuff that doesn't make a sound and gets out of my way.
The new form factor, I'm not into at all. It's great that it's expandable and powerful, but it just doesn't do it for me. I've been doing Mac and iOS development for so long now and thrown so much money at them that I think I'm just burned out. I mean, how much 'best ever', 'amazing', 'sensational', every year can you take? I no longer drool every time Ive speaks and the video for it had me yawning. I've seen it before, even though I haven't. Yah, no one else makes that design, I get it. But, I am now asking myself, so what?
It seems like they have been less 'pro' the past years and now they're extreme pro. I'm just average pro I guess. They didn't design this machine for me and that's ok. I'm just underwhelmed by everything Apple now.
It looks traditionally nice, definitely a throwback. Just not too conservative to where it looks like every other boring black PC case these days. But also not cylinder shaped like something out of a Sci-fi movie.
I don't really care about the specs because that's not really a concern one should have when buying an Apple product. It's more about being promised a solid experience, without having to be technologically conscious and do your own homework.
All I want is to be able to replace the hard drive when it fails after 18 months, and upgrade the RAM. Also need an optical bay for a tower.
It would be nice to have more than 2 USB ports, when I'm paying over $6000, without having to look for a PCIe board that has a compatible driver. Also, audio input would be a cool feature in that price range.
Main thing really though is replacing that hard drive when it fails, which they always do before long.
Not directly relevant, but: boy, do I love Apple's marketing. They put so much passion into conveying their products, it's ridiculous. It's almost like their marketing borders on small art projects and "how it's made"-style docs. I know that's a weird thing to say, but I find them very creative and interesting to watch. Sure beats the average commercial.
You almost need a dedicated 15-amp circuit to run this or have 20-amp circuits so you have some overhead for the rest of your electronics (or a cubicle neighbor).
The power supply draws 1280W. A 15-amp circuit provides 1800W, but you are really supposed to keep steady loads at no more than 80%, which is 1440W. I bet some circuit breakers are going to start tripping when these go live.
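The arithmetic checks out, assuming a standard 120V US branch circuit and the common 80% derating for continuous loads:

```python
# Continuous-load headroom on a US 120V branch circuit.
# 1280W is the Mac Pro's rated power supply from Apple's spec sheet;
# the 0.8 factor is the usual "80% rule" for sustained loads.

VOLTS = 120

def continuous_capacity(amps: int, volts: int = VOLTS, derate: float = 0.8) -> float:
    """Max sustained wattage a breaker of the given rating should carry."""
    return amps * volts * derate

print(continuous_capacity(15))          # 1440.0 W usable on a 15A circuit
print(continuous_capacity(15) - 1280)   # 160.0 W left for everything else
print(continuous_capacity(20) - 1280)   # 640.0 W of headroom on a 20A circuit
```

So a fully loaded Mac Pro plus a monitor and anything else on the same 15A circuit is indeed flirting with the breaker.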
It’s a bit beyond my budget to buy for sentimental reasons because I owned a cheese grater PowerMac G5 and a MacPro back in my video production days. But this looks like exactly what the creative or media production professional of today needs. If you need less grab an iMac Pro or iMac.
I rather like Windows machines in 2019. I believe Apple fumbled the "Pro" market big time, but it's really not something worth chasing for them, in my view. I really see this new iteration more as a goodwill gesture to their pro user base than an attempt to recapture the market.
I was ecstatic about the 2019 Mac Pro until I heard its price. $5999, which is well above the price range of the entry-level 2013 Mac Pro ($2999) and of other Mac Pro and Power Macintosh models, which have historically had an inflation-adjusted entry level price range of $2500-3000 for the past two decades. In fact, here's a list of prices for entry-level Power Macintosh and Mac Pro models in the last 20 years (all of these prices were found on Low End Mac):
Blue and White Power Mac G3 (January 1999) -- $1,599 ($2,453 in 2019 dollars)
Graphite Power Mac G4 (December 1999) -- $1,599 ($2,453 in 2019 dollars)
2001 Power Mac G4 (January 2001) -- $1,699 ($2,453)
2001 Quicksilver Power Mac G4 (July 2001) -- $1,699 ($2,453)
2002 Mirrored Drive Door Power Mac G4 (August 2002) -- $1,699 ($2,413)
2003 Power Mac G5 (August 2003) -- $1,999 ($2,776), reduced to $1,799 ($2,499) in November 2003
2006 Mac Pro (August 2006) -- $2,199 ($2,787)
2010 Mac Pro (July 2010) -- $2,499 ($2,929)
2013 Mac Pro (December 2013) -- $2,999 ($3,289.83 in 2019 dollars, but you can still purchase an entry-level 2013 Mac Pro today from Apple for $2,999 in 2019 dollars).
I wouldn't have been surprised if Apple had announced a $3,299 or $3,499 entry-level Mac Pro model given how inflation-adjusted prices have crept upwards over the past 20 years, but $5,999 is a gigantic leap from $2,999 or even $3,499.
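The inflation adjustments in the list above are straightforward CPI scaling; a sketch (the CPI-U annual averages below are approximate values I'm assuming from memory, so results land within a few dollars of the Low End Mac figures quoted):

```python
# Adjust a historical US price to 2019 dollars via a CPI ratio.
# CPI-U annual averages are approximate, not official BLS quotes.
CPI = {1999: 166.6, 2006: 201.6, 2010: 218.1, 2019: 255.7}

def in_2019_dollars(price: float, year: int) -> float:
    return price * CPI[2019] / CPI[year]

print(round(in_2019_dollars(1599, 1999)))  # ~2454 (vs. $2,453 quoted)
print(round(in_2019_dollars(2199, 2006)))  # ~2789 (vs. $2,787 quoted)
print(round(in_2019_dollars(2499, 2010)))  # ~2930 (vs. $2,929 quoted)
```

By the same scaling, the $5,999 base price is roughly double any entry-level predecessor in real terms.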
There's no doubt that the 2019 Mac Pro has answered the needs of some classes of pro Mac users. People in the movie and music industries, as well as professionals working in certain engineering and architecture disciplines, have clamored for user-serviceable, upgradeable Macs that are also very powerful, and are willing to pay top dollar for their equipment since time is money for them. The Mac Pro certainly delivers in these aspects, and I'm glad that Apple has recommitted itself to these users.
Unfortunately Apple has disappointed another class of Power Macintosh/Mac Pro user: those who want an affordable machine that is also user-serviceable. Apple used to provide entry-level Power Macintosh and Mac Pro models that catered to this group. Unfortunately the Mac lineup under Tim Cook has largely become unserviceable by users, complete with soldered RAM, soldered storage, and design decisions that make upgrading and repairs difficult to impossible. The 2019 Mac Pro is a tease, a kick in the face for this class of user. We finally have access to an expandable, upgradeable, user-serviceable Mac, but at unattainable prices. It's like being an Acura fan who clamored for the return of the Acura Integra or Acura RSX, which were affordable sports cars that would probably cost $30,000 brand new if they were sold today, but Acura points these fans to the Acura NSX, a $150,000+ supercar.
And so, as a user of a 2013 Mac Pro, I have mixed feelings. I'm glad to see that Apple is committed to its pro users and to the longevity of macOS, but unfortunately Apple has doubled down on its model of selling non-serviceable, non-upgradeable machines to all but the most wealthy of its customers.
Unfortunately I feel trapped as a Mac user. Windows 10 sucks and the Linux desktop is still not ready for prime time, but at least there are a wide variety of PCs out there that are user-serviceable, upgradeable, and affordable. macOS is the best operating system out there and is the only one in my opinion that is truly designed for users in mind, but unfortunately it is largely tied to hardware that is virtually closed off to repairs and upgrades unless you're willing to spend $6000+ on a computer, which is insanely expensive for a computer in 2019 unless you're in a field that absolutely needs the most powerful machine around.
Thankfully my 2013 Mac Pro provides ample power for my needs, and so I'll be using that for years to come. Hopefully I'll be able to pick up a used 2019 Mac Pro for a more affordable price once my 2013 Mac Pro gets long in the tooth.
I think if they had released an i9 model with a better GPU at, say, $3,499, they could have sold a lot of them. I don't see any point in server chips and ECC RAM being bottlenecked by a single weak GPU with proprietary upgrades.
I wonder if the people who complain about the price here have ever worked with computers. For example, I am a software developer. It costs about 5000€/month or so to hire me after I graduate. Every workplace I have been in has given me a 3500€ MBP to write code with, no questions asked. Why? I need it and it's cheap compared to my wage anyway. Even a 10000€ computer wouldn't be very expensive considering its estimated lifespan is 5-12 years.
For companies the cost of computers next to the cost of people is insignificant, and that is the target customer group for this machine.
Anecdotally, this page takes a very long time to load in Chrome, whereas it loads immediately in Safari. I wonder if they're artificially slowing the response for non-Safari clients? Seems a bit hard to explain otherwise.
I've been waiting on Apple's Mac Pro announcement to decide if I would be buying their new pro monitor. I have previously owned an Apple Cinema HD Display 30", which was great, but started to show its age over time. I believe it cost me about $2500 at the time. I'm not afraid of spending a lot for a great monitor.
This is $5000, or $6000 with the stand. Oof. I wish there were a cheaper 4K version.
I guess I'll just spend like 10-20% of that price on some other 4K monitor.
The most interesting and least elaborated detail is the Afterburner FPGA. I’m curious what the programming model is (Metal compute kernels?) and how it compares to GPU.
I get the complaining, but hey, Apple releasing a new Mac Pro is pretty cool. They could have just decided to quietly leave the market completely.
I also appreciate how the display is a thing now. Not everyone remembers the verrrry long period where 1920x1200 was the best you could do even when laptop displays were shooting off into the retina world. A high dpi developer workstation with tools to match could be a beautiful thing.
I don't see it mentioned anywhere, but to me it's obvious they intentionally went for an actual "cheese grater" look, as a play on the old "cheese grater" Mac legacy.
(They made it functional of course, with the holes used for cooling etc, but the 100% cheese grater look was intended this time).
The same specs as the $6,000 base-level Mac Pro come out to $1,600. Granted, I would use an Intel i9-9900K, a non-server motherboard, and non-ECC RAM.
As a media production professional who has spent most of my career on a Mac, the original "cheese grater" is probably the best hardware platform I've ever worked on. In my experience, macOS as an operating system has been vastly superior in terms of maintenance and support for a lot of post houses. When I install a Mac, I rarely have to come back and touch it until an editor mistakenly updates the OS or Adobe suite and breaks something in the chain. The original Mac Pro cheese graters lasted 5-7 years in facilities I ran.
But times have changed. Cameras got bigger and Apple started severely limiting what their computers could do.
The Mac Pro "trash can" was drastically under-engineered. It overheated with 4k footage and could barely handle larger productions. You also ended up with a tangled mess of peripherals that used to just live inside the tower. I can't count how many times an editor would say something wasn't working and it was just because they knocked a thunderbolt cable loose.
At first glance, the new Mac Pro is nothing short of impressive.
The big thing media people want from Apple is upgradable GPUs, which Apple kind of did here.
The MPX module for the GPUs looks to be entirely proprietary. I haven't found any mention of potential 3rd-party GPU support. And I assume you'll buy the GPUs preinstalled in those modules at the highest market value, despite the rapid drop in video card prices over time. But without Nvidia support, I fear this feature is going to be a stalemate with the industry.
What is most appealing to me is that the Mac Pro is basically a server motherboard with a lot of room. Building a budget PC is restricting. The gaming motherboards tend to move the fastest with technology and newer/cheaper chips that are capable of hanging with server chips, but they're part of the silicon lottery: one in four that I receive is DOA or has to be returned for some reason. That, and they lack enough PCIe slots for big post-house media work. If I have to drop two GPUs in a PC, there is rarely room for I/O cards, 10GbE, and other requirements.
For a post house that wants something stable, the high price of the new Mac Pro will be returned fairly easily (maybe not with those new monitors). I don't see the freelance/independent and small-production team market buying >$6000 Mac towers, though.
The latest offerings from AMD with PCI 4.0 motherboards will weigh in at $3000'ish with a single 2080 ti GPU. That's an extreme bang for your buck.
The downside in the PC world is mainly that it has become a build-your-own adventure with next to no warranty and a ton of technological hurdles and compatibility issues. For example, very few motherboards for Intel chips support TB3 and have 10GbE built in. And if you go AMD, you can forget Thunderbolt altogether.
Overall, I know a lot of places I've done work with are waiting to hit buy on the new Mac Pro. Part of that is because it just works, and they'll deal with the shortcomings later on. But really, it looks cool, and the people they attract are impressed by that fancy tower. Everyone who works alone or at smaller companies, though, is probably now waiting to drop their Ryzen 3000 and X570 orders into their shopping carts.
Disclaimer: I don't know any professionals who bought an iMac Pro. I haven't seen one in the wild since it was released.
After the iTunes calendar joke I actually thought when the launch video started it was another joke about the old cheesegrater design and was going to show an actual cheesegrater, until it cut and made it clear it was a real video.
Most of the specs sound like Cascade Lake parts, but the cache sizes don't quite match. It's possible that Apple's using a custom part, or one which Intel hasn't announced yet.
"Nobody" is obviously the market that RED is targeting with their 8K cameras (shooting raw/lossless, for that matter) or various projector manufacturers. Sure, not everyone in the production pipeline is going to be watching or manipulating the edit list in 8K, but you'd better be sure there are a few people looking at the dailies at full 8K res before the OCD/dictatorial director shows up.
Apple obviously thinks there is a market here too, or they wouldn't be talking about how the machine can show three simultaneous 8k streams.
Yes, I do edit 12M images on a ~15M monitor. It's also the weak point in the system, as your average midrange+ DSLR is 25-50M, and it's pretty easy to create a printout where the effective resolution of the page/poster is vastly higher.
But that is not really the point: the Apple presentation goes on about how the machine can play three 8K video streams simultaneously, but you have to buy a 3rd-party monitor/projector to even be able to view a single stream.
This is the ugliest computer I have ever seen. The monitor as well. What in the world. It's like someone discovered Trypophobia and wants to terrify people. I can't believe I'm the only person to comment on that here.
I loved the original "cheese grater" (which doesn't even seem deserving of that title now).
Nice to see that Apple can still build a real (UNIX) workstation. Just about the right combination of power, standard-but-exotic IO, price, and design (though not my favourite).
Design-wise it's not great. The quite minimal case with the huge handles looks a bit amateurish and cheap. Although they try to inflate every feature in the typical Apple manner - like the "spherical array" machined "onto the internal and external surfaces of the aluminum" - the front panel is not sound. Actually, a perforated metal plate like in the old Macs looks better. Obviously it's a question of taste, but it's not the great Apple design that used to impress me in the past.