• MST (DisplayPort daisy chaining) in macOS. The hardware has supported it for over a decade; the OS is the weak link here. It's the difference between spending $75-$300 on a dock, plus cables, and simply connecting the monitors together with a single cable running to the first one.
• Non-soldered storage. Seriously. Storage is the most likely component to fail. SSDs only have so many write cycles.
• BIOS on a separate chip. To make matters worse, with the introduction of the T2 chip, the BIOS is stored on the SSD as well. This means if the SSD fails, you don't just lose your data; you have an expensive brick. You can't even boot from an external drive anymore if one of the two SSD chips fails.
• Safer SSD chips. If a cheap capacitor fails on newer Macs, 13V gets shorted straight to the SSD. The SSD commonly doesn't survive this. And since the BIOS is on the SSD now… A literal ten-cent part blows up your multi-thousand-dollar laptop with zero warning.
The lack of MST support caught me by surprise. Six years ago, I intentionally bought an expensive monitor with USB-C and daisy-chaining support as the second screen so that the monitor itself could function as my dock. Lo and behold, it worked fine on Windows but not on macOS. I had to buy a dock separately and now have one more thing cluttering my desk.
I spent so long troubleshooting and trying to figure out why it didn't work before I stumbled on this article[1].
I'm in an org with tens of thousands of Macs deployed. We've gone through butterfly keyboard failures, screen ribbon issues, bloated batteries, and dead logic boards.
We've had some SSDs fail, but it's virtually unheard of. It's so rare I can't recall a specific instance. I submit that SSD failures within a Mac's usable lifetime are so rare as to be a non-issue.
That said, I'm always going to strongly want user-replaceable, modular components everywhere I possibly can. Barring that, I want cheap repair options (buying parts like logic boards from Apple is obscenely expensive).
I also factor AppleCare+ into device cost, because forget buying a $2,000-3,000 Apple device without it.
>Non-soldered storage. Seriously. Storage is the most likely component to fail. SSDs only have so many write cycles.
One thing to note is that the quality of Apple's storage is above and beyond anything else you can buy. They have teams at every level of the storage stack and an amazing focus on quality. From the individual chips all the way up to the filesystem, the software is customized to extend the lifespan of the device. They do wear-leveling, so the larger the drive you buy with your device, the longer it will last, but honestly the drives they ship will last much longer than 99.9% of people will ever actually use a Mac.
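Wear-leveling in general can be sketched in a few lines (a toy model; the block counts and erase limits below are made up, and this is not Apple's actual controller logic). The key point is that spreading writes over the least-worn blocks makes total endurance scale with capacity:

```python
# Toy model of wear-leveling: steer each write to the least-worn erase
# block so wear spreads evenly, meaning total endurance scales with
# capacity. Block counts and erase limits are illustrative only.

class WearLeveler:
    def __init__(self, num_blocks, max_erases_per_block):
        self.erase_counts = [0] * num_blocks
        self.max_erases = max_erases_per_block

    def write(self):
        # Always pick the block with the fewest erases so far.
        i = self.erase_counts.index(min(self.erase_counts))
        if self.erase_counts[i] >= self.max_erases:
            raise RuntimeError("drive worn out")
        self.erase_counts[i] += 1

    def remaining_writes(self):
        return sum(self.max_erases - c for c in self.erase_counts)

small = WearLeveler(num_blocks=100, max_erases_per_block=1000)
big = WearLeveler(num_blocks=200, max_erases_per_block=1000)
print(small.remaining_writes(), big.remaining_writes())  # 100000 200000
```

Doubling the number of blocks doubles the total writes the drive can absorb, which is why a larger drive at the same per-cell endurance lasts longer.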
> One thing to note is the quality of Apple’s storage is above and beyond anything else you can buy.
Average good experience of other users doesn't make a guy having a failing SSD feel any better. The absence of easy repair/replacement just adds an insult to injury.
> They do wear-leveling so the larger the drive you buy with your device the longer the device will last
Correct me if I'm wrong, but it sounds like a description of an average modern SSD. Heck, I was reading similar things about wear leveling regarding USB drives, like, a decade and a half ago.
I think you're being overly creative in your interpretation of "average," as if you're imagining a bunch of people who have it fail after a year of regular use and others who have it fail after 30 years, to 'average it out' and mask a bad experience for many people.
>all of the software is customized to extend the lifespan of the device
It is widely known that macOS pages out a lot more than Windows. With a multi-tab Safari session you can easily see 100GB of writes or more per day. kernel_task lockups are an extremely common Google search result.
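To put that write rate in perspective, here is rough endurance arithmetic. The TBW ratings below are illustrative consumer-drive ballparks, not specs for any actual Mac SSD:

```python
# Rough SSD endurance math for heavy swap traffic. TBW ratings here are
# illustrative consumer-drive ballparks, not Apple's actual specs.
def years_until_worn(tbw_rating_tb, writes_per_day_gb):
    days = tbw_rating_tb * 1000 / writes_per_day_gb
    return days / 365

# e.g. a drive rated ~300 TBW, at 100 GB/day of swap writes:
print(round(years_until_worn(300, 100), 1))  # 8.2 (years)
# the same drive at a lighter 20 GB/day:
print(round(years_until_worn(300, 20), 1))   # 41.1 (years)
```

So sustained 100GB/day of swap doesn't kill a drive overnight, but it does cut the wear-out horizon by several times versus lighter usage.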
Thank god I'm not the only one who knows that. People keep talking about the awesome efficiency of Apple software, but this happened to me and I trashed quite a few SSDs.
Supposedly macOS is more efficient with the hardware, but having built Hackintoshes I know it's the reverse. Apple fanboys are so confident in their bullshit that it's hard to trust your own experience.
For clarity I am still a macOS and Apple user. Not for long though, I won't buy new Apple hardware until they change their ways. Got the iPhone 12 mini and deeply regret it, even though the physical format is pretty nice.
Watch the video I linked. Modern Macs are a single capacitor failure away from shorting 13V straight to those "amazing focus on quality" SSD chips. If either of the two chips dies, you have a brick. No booting from an external drive. Nothing. A brick. All because one capacitor fails.
If it were just a matter of replacing a capacitor, I'd agree with you. But the wiring decisions from their "amazing focus" default to a 13V short to the storage chips, sending all your files AND the BIOS up in a puff of literal smoke.
And finally, Apple doesn't design and manufacture solid state storage devices. They contract out to other companies. Those other companies also sell NVMe storage devices with substantially similar chip designs.
Don't believe the hype. A ten cent capacitor taking out a $3K laptop isn't "above and beyond"; it's penny pinching at the expense of reliability with no concessions made to either repairability or resilience.
Guilt by association and/or ad hominem are weak sauce. What part was factually inaccurate?
• That storage is soldered unlike the vast majority of laptops on the market, which use standard NVMe sticks?
• That the BIOS is stored in those soldered SSD chips making a single point of failure?
• That if even one of the chips fails, you lose all the data including the BIOS boot logic and brick the device?
• That the board layout sends 13V down the rail directly to the aforementioned SSD chips on capacitor failure?
• That they cannot be repaired anymore, even by experienced professionals?
• That replacement parts that aren't severely compromised (unknown number of write cycles for SSD chips) simply aren't available?
Which part is false and requires a disclaimer? And I ask this as someone who has personally used Macs exclusively on the client since 2003 and professionally since 2011. Owned nothing but iPhones since the original was released in 2007. 5 iPads in total.
Perhaps it would help to know that I upgraded the RAM and hard drive on my 2005 Santa Rosa MBP. Really easy due to standard DDR2 SODIMMs and SATA connectors.
Did the same to my 2009 unibody, put a 500GB hybrid solid state hard drive in it and bumped the RAM first to 6GB and then to 8GB.
Not only did I bump the RAM in my 2012 unibody to 16GB—which wasn't even a build option when initially released—but I replaced the DVD drive with a second 2.5" drive, ran two SSDs in RAID-0, and got 1GB/sec r/w transfer rates, which for that vintage was unheard of, especially since it started with a 240GB HDD. That one lasted me quite a while; the only real drawback versus a new one was the screen resolution. When I broke the screen, I was able to upgrade it to a slightly higher resolution, so at least a minor improvement there.
Then the Retinas started. Stuck on 16GB RAM on my 2015 MBP, but I loved the new screen, and I eventually replaced the stock 512GB SSD with a 1TB NVMe stick, then a 2TB NVMe stick. This is still my personal laptop, and I'm eyeing a 4TB upgrade for it now, 8 years after initial release.
Got a 2022 M1 Max for work, and it was great, but it had a boot hiccup six months back. (SSD issues.) Got it "repaired" (really just a replacement, since the whole mainboard was useless). Luckily I made regular backups, but I became wary once I learned this wasn't an isolated issue, and why the problem probably popped up in the first place.
Most of those laptops still work! The parts that died first almost without exception were the storage devices. The CPUs generally kept plugging along well after their usefulness had expired. Screens still light up. RAM still plugging away. The drives have clearly been the weakest link, but when storage is replaceable, we can keep the rest of the laptop out of the trash for many more years. (Assuming you keep them plugged in due to the battery being shot, but even batteries are replaceable or the laptop is usable as a desktop alternative for a loved one.)
So now I'm far more reluctant to purchase a new Mac for personal use, precisely because of the design issues found in recent Macs.
I remember modding OS X Yosemite to keep the swap in RAM when I had an SSD, since I was so worried about it. I'm not sure if it still does this, but the infrequency of actual problems, plus the fact that one of my friends used a 2GB 2009 MacBook Air until 2017, made me realize it probably doesn't matter.
A 2009 MBA can still be used if the SSD dies. You can get a replacement or you can plug in an external drive to install on and boot off of. You can upgrade the storage space on that model if you want to.
Since 2018 and the T2 chip, the BIOS is stored on the soldered pair of storage chips. If either of those two chips, or a capacitor adjacent to them, dies, you haven't just lost your files and OS. You can't replace the chips. You can't even plug in an external drive to boot from, since the BIOS is now stored on the SSD chips as well, instead of on a separate ROM as before.
2016-2017 soldered the storage, but you could at least work around it. 2018+, there are no more workarounds. If any part of your SSD fails now, you've got an expensive useless brick.
Swap wears your SSD out faster? SOL. A capacitor fails and shunts 13V through your SSD? SOL. Anything at all happens to either of your SSD chips? SOL.
Either pay for that AppleCare in perpetuity or buy a whole new laptop. There are no more workarounds.
Even more so when the RAM remains limited on Macs, as it too is integrated with the CPU and hence non-upgradable. This is even worse than the soldered SSD, since one can at least attempt to replace SSD chips with new ones via careful soldering, but nothing can be done about the RAM. These deliberately irreparable new Macs are just shitty products.
All my old laptops since 2003 still have functional CPUs and RAM. I'm not overly concerned about the CPU/RAM situation, especially having seen the advantages of a shared CPU/GPU/RAM Arm architecture. Would I prefer upgradeable RAM? Sure, but it's not the weak link. Know what's actually failed me over the years?
To be fair, there aren't that many if you want an alternative to the Air and size/weight is a priority. Also... I'm not sure what you're trying to say. That there's something wrong with criticizing Apple's design choices?
* Just use TB monitors and daisy chaining works fine
* Nothing to add.
* Hardware enclave is a requirement for most corporate/government computers so you're just comparing two different classes of computer. All the enterprise Windows machines also do this.
> * Just use TB monitors and daisy chaining works fine
Not everybody has a zillion dollars to spend on expensive Thunderbolt displays, and they shouldn't have to when the DisplayPort standard (and other operating systems) supports daisy chaining.
> when the DisplayPort standard (and other operating systems) supports daisy chaining
It doesn't support it directly, exactly; it supports a compressed image in each of the multiple streams transported, which can introduce quality problems for the production work people use higher-end Macs for.
I have dual 4K monitors right now, daisy chained from a Mac at 60Hz. Works totally fine, and it was a 2019 MBP (and now a 2021 M1 Pro MBP). This is already a solved problem if you just stick with TB3+ gear.
With a large 4K screen, I'm not seeing any benefit to MST, and studies have shown productivity decreases beyond 2 screens. The other problems are simply fixed by using iCloud and treating your laptop as disposable, which many laptops have been designed to be. If you use the MacBook as intended and invest in AppleCare, it's unbeatable.
“Studies show” cannot possibly account for all workflows. The most extreme monitor users I saw were security personnel who used them to… monitor things. Their office looked very cyberpunk being utterly littered with corporate desktops and cheap TVs and LCD monitors.
These aren't studies, just anecdotes, and the actual study quoted in the NYT shows the opposite:
> One study commissioned by NEC and conducted by researchers at the University of Utah showed that people using a dual-display machine to do a text-editing task were 44 percent more productive than those who used a single monitor.
I think it's fair to say that any research from 2014 isn't the most relevant now. Both screen size and (importantly) software has changed to better support both multiple screen usage as well as large screens.
The argument for single screens in the NYT is mostly that it reduces distractions:
> But for most people, the time spent juggling two windows or scrolling across large documents isn’t the biggest bottleneck in getting work done. Instead, there’s a more basic, pernicious reason you feel constantly behind — you’re getting distracted.
This seems to indicate they were comparing a maximized window on a single screen vs non-maximized windows on multiple screens, and probably in the days before modern notifications on desktop OSs.
Based on this it is fair to say more research is needed, but it is inaccurate to claim that studies support the idea that one monitor is better.
Realistically, where would you position these monitors? Ergonomically there aren't many great solutions for extended use, especially once you go up to multiple large monitors. Tilt your head in those directions and see how long you can hold them comfortably. If you think modern notifications are less distracting, I'll respectfully disagree.
Three monitors along a curve on a curved desk with a rotating office chair (or just one on wheels) is perfect. It might be OCD, but I don't like two monitors because there's a gap between screens in the middle. I can physically rotate to face either side monitor. I'm never tilting my head for more than an occasional glance, and this setup provides some kind of task repartition in space, which helps me with switching contexts.
From an ergonomic point of view this is no different than using a single monitor. They are at the right height and the right angle and with a good chair. The trick is to rotate the chair rather than your head. And it does take a bit of space.
> Realistically where would you position these monitors?
I have a 34" main monitor positioned directly in front of me, a 24" in vertical orientation I use mostly for multiple terminals and my MPB screen below the 34".
I don't think this is an unusual setup.
> Tilt your head in those directions
This applies to large single monitors too. I think (hope?!) most people have a setup where their main work is right in front of them.
> If you think modern notifications are less distracting I'll respectfully disagree.
No I'm saying they are more distracting, so reduce the relevance of old articles claiming a single monitor reduced those.
Are they all in front of you? They sound like they are. Do you find the positions comfortable? I cannot find a single article claiming multiple monitors are ergonomic beyond 2, and most only recommend one. This article (https://ergo.human.cornell.edu/ergoguide.html) might not endorse the positions you use, but it offers suggestions that I read and found useful.
I had two 4K monitors in portrait mode. Meant less vertical scrolling. I generally put different contexts on different screens, eg. code/terminals on one monitor and browsers with results on the other. Laptop screen left open as well for email, Slack, etc.
Come on, those are not “studies” - they are just the opinions of some random people, and they carry no more or less weight than, for example, my opinion, which is “you can pry my 4 x 27” screens out of my cold, dead, hands.”
The NYTimes article is actually explicit about that:
> Unlike monitor makers with their multidisplay studies, I have no research proving you’ll find as much benefit from a single monitor as I did.
In other words “there are actual studies proving multimonitor setups enhance productivity, but Big Screen(tm) paid for them so they must be wrong” which has no basis in science at all.
Also:
> “Two monitors are a double-edged sword,” said Gloria Mark, a professor who studies workplace distractions at the University of California, Irvine. Ms. Mark hasn’t specifically researched how second monitors might affect focus, and when she recently had a chance to work at a two-monitor machine, she felt that it did make some of her tasks easier. “But most people have their email up on the second screen, and of course, when anything comes in, it’s a great source of distraction,” she said.
So the only actual scientist in all of this actually said “hey, this is making me more productive, unless I fill the screens with distractions”
The tl;dr is that people need to set up their workspace to enhance their flow and focus, and reduce distraction to be productive. The amount of screens is about as relevant to that as the weather outside. (“Hey, the sun is shining and the birds are singing, it is a beautiful day outside, let’s go out!”). As per your own conclusion (“the studies don’t indicate it’s universal” - setting aside the fact that there _are_ no studies) It is not universal.
Trying to whitewash Apple’s lack of DP Daisychaining support as “You don’t need this anyway, Apple simply cares about your productivity!” Is silly.
My point was that Apple probably doesn't care to cater to these use cases and sees them as niche. I know 0 people who use more than two monitors on any OS.
> I know 0 people who use more than two monitors on any OS.
Three in this thread alone, and shifting the goalposts from “studies show it is bad” to “I don’t know anyone with multiple monitors” or “It is tiring to move your head around all the time” indicate you are not debating this in good faith, so I am done with this chat.
Every person in my company has a laptop screen and two monitors on their desk, including devs, product, managers, infra, admin... Literally everyone. The only people who don't use them are senior leadership because they're rarely at their desks.
There’s a paragraph under the title “Account for Dual or Single Monitor Arrangements” on that web page. They just say that you have to be particularly mindful of ergonomics if you have more than one. How is that not accounting for more than one?
Got any sources for that? I find that hard to believe.
> MST (DisplayPort daisy chaining) in MacOS. The hardware has supported it for over a decade. The OS is the weak link here.
This is indeed annoying, but with USB4/Thunderbolt support becoming more common in monitors, I believe this is going to be less of an issue going forward.
SSDs and batteries are the only components (as far as I remember) that have a "guaranteed" cycle lifetime. Or in other words, a "very likely to fail after this point" threshold.
I completely agree on batteries (and in fact, that this is why most devices get replaced was what I was getting at!), but where is that point, practically, for SSDs?
With typical usage, it should hopefully be out many more years than the laptop gets security updates.
That's a fair point. If I recall correctly, I had three laptops for about 5 years each, and about 7 over a 5-year period as I switched jobs and laptops frequently.
And if a man mates with nine women he'll be holding one baby this time next month? With both electronic devices and gestating females concurrency is common.
No, you don't understand. SSDs begin to encounter problems over time after some number of read/write cycles. This is entirely expected and when one sector has issues it gets remapped to one of some spares that are built into the drive. Once those are exhausted it's a matter of time before you have an unusable SSD and, if it's soldered to the motherboard, an unusable laptop.
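The remapping described above can be modeled in a few lines (a simplified sketch, not any vendor's actual firmware): failed sectors get redirected to a spare pool, and the drive only becomes unusable once that pool is exhausted.

```python
# Sketch of bad-block remapping: a failed sector is redirected to one of
# a fixed pool of spares; the drive is only done once the pool runs out.
# Simplified illustration, not any vendor's actual firmware.
class SSD:
    def __init__(self, spare_blocks):
        self.spares = spare_blocks
        self.remap = {}          # logical sector -> spare id
        self.usable = True

    def sector_failed(self, sector):
        if self.spares == 0:
            self.usable = False  # no spares left: drive is done
            return
        self.spares -= 1
        self.remap[sector] = self.spares

drive = SSD(spare_blocks=2)
drive.sector_failed(10)
drive.sector_failed(42)
print(drive.usable)   # True: both failures were remapped invisibly
drive.sector_failed(7)
print(drive.usable)   # False: spare pool exhausted
```

On a socketed drive, that final state means replacing a stick; soldered to the board behind a T2, it means the laptop itself is done.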
But where did they say they only used each device for 1.5 years? That seemed like your assumption based on the quoted dozen laptops across twenty years; I certainly have a number of devices concurrently, with the oldest now being 14 years (still working though only on a power supply as the battery is badly degraded). That said I also strongly prefer a replaceable drive, but more because I don't want to buy the capacity at today's price that I'll only truly need in a few years time, if ever.
It's a pretty fair assumption, I think. It'd be really unusual if they kept buying laptops and somehow continued using all or some of them at the same rate.
Apple devices aren't engineered to last; they're engineered to be easy to manufacture, to sell new units, and sometimes to sell expensive repairs through Apple.
Also, NVIDIA cards in eGPUs and Thunderbolt (arm64) aren't supported.
The best solution is either don't use Apple, or have a machine under warranty and very good backups.
Complaints aside, the Anker 777 dock does 5K-8K DisplayPort over USB-C, 1 GbE, 100W charging, 4 USB-A ports and more, and works with M1 and M2 MBPs. Add a quality magnetic USB-C connector, and it's "MagSafe" docking to a monitor, power, and accessories with one cable. I'm in the camp of one giant 49" monitor, with occasional AirPlay to a nearby Apple TV 4K on an 85" TV.
I have had many Apple devices since the 2nd-gen iPod, and they have all lasted a very long time compared to my other devices. All my laptops, back to my 15" PowerBook, are still working. They fixed a faulty GPU on a 4-year-old laptop for free, even though I had opened the laptop and changed the thermal paste. (Known manufacturing issue with that GPU, but still impressive.)
They have made mistakes, like the battery in my current 2015 MBP being too hard to replace. And I avoided upgrading for a long time due to the known issues with the keyboards. But it seems like they generally course-correct after mistakes like that.
I do agree that SSDs shouldn't be soldered on, though. Hopefully it's one of the mistakes they'll eventually fix. I kind of understand the motivation: you can make it impossible to effectively steal a device or access its contents. The device becomes very simple and robust in some ways. But they could come up with a good compromise.
Like, sure, keep the concept of not having an SSD controller or firmware on the drive. But sell replacement SSDs with the firmware and a recovery OS preloaded, ones that let you provision the new drive with the SoC through your iCloud account. Depending on the on-chip storage, and whether it's fuse/EPROM-based or flash, the number of possible drive replacements might be limited. But that's an OK compromise.
Yeah, but equivalently priced hardware is just as good, if not better. I put as much money into a PC build as I would have into a Mac Pro, and I got a 1000W PSU replaced for free because it had a 7-year warranty. So it's not something special about Apple: if you're willing to pay the price, you'll get pretty good hardware. If anything, you're more likely to be left holding the short end of the stick with Apple, because they're unwilling to warrant their stuff for long no matter what price you pay. Not without reason, because realistically, most of their post-2015 hardware really isn't any good.
> They have fixed a faulty GPU on a 4 year old laptop for free
I have to laugh at this.
Apple had a well-known strategy of insisting that obvious problems with their devices were not actually problems until 5-10 years later, when no one would be using those devices anymore, at which point they would announce a recall, which involved a tedious process of proving you had the right device and shipping it to Apple for days or taking it to an Apple Store for repairs.
In the meantime they would deny it was ever a problem. The good thing was that a lot of Apple Geniuses were absolutely decent people and would probably give you a repair for free, even though they were supposed to try to get you to pay for it.
I mean, just consider how long Apple denied the following, highly public recall worthy issues (and then extrapolate that to how they would behave with something that was less publicized and had a failure rate of about 10% as opposed to the much higher failure rates exhibited in these situations):
1. You're holding it wrong, iPhone 4.
2. No, your battery is not degraded...you're just imagining that your battery life is low
3. We fixed the degraded battery....no your phone hasn't slowed down...you're just imagining it.
4. Butterfly keyboards.
I suppose the interesting question is whether this is true of the newest devices or not. My wife's MacBook Pro lasted 11 years. It finally died this year. Can I expect the MacBook Pro that's replacing it to last 11 years too? Only time will tell. Hope so.
Didn't they end up recalling those and replacing the main boards for free (of course, those replacement boards had the same issues as the originals)? AFAICT they still refuse to touch an Nvidia GPU over this. It is not a common occurrence.
It was a 2008 laptop, and they decided it was a real problem by 2014 or so. So 'yeah': after 6 years, and we know what happened after. My iPod was replaced within an hour when it broke.
>Apple devices aren't engineered to last; they're engineered to be manufactured, sell new, and sometimes sell expensive repairs by them.
I guess you should say "modern" Apple Macs. Post-2015 MacBooks are pretty bad compared to older ones. Anyone in EE or interested in hardware should watch a few of Louis Rossmann's videos.
But then, because the rest of the industry is so poor on reliability, they still get to claim to be one of the most reliable notebook brands. Maybe apart from Lenovo.
It's the only laptop I know of machined from a single piece of aluminum. Not the only reason, but it's certainly a good one. No other laptop feels as solid, and I don't understand why nobody else does it.
I haven't had any of the recent MBPs, but my 2020 HP EliteBook's main body looks similar to my 2013 Retina MBP, which already had the "unibody" construction. It also feels quite solid when I hold it by one corner while open. The lid does feel somewhat flimsier, though, especially the hinge.
I have a Dell that looks the part when it’s sitting on a desk, but is actually quite flimsy (the underside is actually plastic, and the hinge does not look strong at all, the screen shifts colour when a slight pressure is applied on the lid). And it is a middle of the road Latitude, not a €1000 MacBook Air.
This particular HP is full metal. I was able to confirm it when using a random belkin charger with no ground: I got the familiar "vibrating" feeling in my fingers when moving them along the case.
This specific model is a 840/845 G8 (I have both, they're the same, one Intel and the other AMD). I haven't seen the current G10, but I've seen the supposedly higher-end 1040 G10 and, like the Dell, it looks the part, but it's clearly all plastic. Complete with creaking noises if you look at it wrong. And the one I'm talking about was brand new, maybe 5 hours old.
> the screen shifts colour when a slight pressure is applied on the lid
Heh. Must be nice to have a screen whose color is consistent if you leave it alone. My HP changes colors if I shift my head. No, these are not cheap 2003 models. Yes, they cost as much as a similar MBP.
One (but certainly not the only) reason is that Apple rarely discounts their products and even if they do it’s not by that much.
It's not that hard to find a high-end ($2000+) PC laptop at 40% or so off 1-2 years after its release; not so much a MacBook. Apple is still selling refurbished 2020 M1 Airs for $849 ($999 was the price at introduction).
The short version is: "Wouldn't it be disruptive if Apple went back to replaceable batteries?" but the author stopped short of where that leads.
One of the things that's doable "now" that wasn't doable "then" is wireless charging. I love that I can put my phone on a pad next to the bed at night and it charges, pick it up and there is no trailing wire to unplug, put it down and there is no wire to find and plug in. This is a big improvement in my user experience.
Another thing that has moved us forward is that power efficiency has gone up dramatically. As a result, a smaller battery can give the same "run time" as a large battery of old, without the bulk associated with the larger energy storage capacity.
Why not combine them?
Why not have no battery in the device? Seal it against the elements. Then provide a slot where a battery that is also a wireless "charger" can drop in and supply the power. Current demands on a modern device are low enough that you could just go wireless all the time for main power. Now your device takes a sealed "toaster pastry"-sized unit that looks innocuous, but slide it into the slot and poof, the device powers up; put it on a charging mat and it recharges itself.
Now you have no exposed terminals to "short out", no worries about the battery in your device becoming a "pillow of doom"[1]. And you can carry a couple of extras if you're going to be away from power for a while.
You get all the benefits of replaceable batteries and none of the downsides with the possible exception of a "slot" in your device that looks funny when there isn't a battery in it.
> One of the things that's doable "now" that wasn't doable "then" is wireless charging. I love that I can put my phone on a pad next to the bed at night and it charges, pick it up and there is no trailing wire to unplug, put it down and there is no wire to find and plug in. This is a big improvement in my user experience.
The Palm Pre and several early Android phones had both removable batteries and wireless charging. The back panel just had pogo pins for the coil to transfer power.
In many cases the coils were integrated into the casing of the battery, and the back panel of the phone was thin enough to transfer sufficient power through.
Wireless charging wastes a lot of energy. I'm not sure of the exact figures, but I recall around 50%. Not a huge deal on a personal level for a plugged-in charger (although if everyone in the world starts doing this, it'll be more of a problem).
However wireless charging from a battery pack is not ideal because the battery will have to be bigger. I already feel my phone is too big to be comfortable in my pocket, and I know it's even more of an issue for smaller people/women's clothing with tiny pockets.
That's pretty close to how power tools work, minus the wireless charging of course. My first concern would be warmth, my phone always warms up when wireless charging. Doing something CPU intensive would probably get the device blazing.
My second concern would be size - is the total package going to get a lot bigger here? If we forgo the pocket and just use magnets at each corner maybe it gets slimmer, but can it be about as slim as they are today?
If we could overcome those two hurdles, I'd be a huge fan of the idea.
There is a tradeoff here which might mislead. When you're wirelessly recharging a battery, "good" is defined as "charges in a short time," which means a lot of current and thus a lot of heat. When operating a device, "good" is defined as "runs a long time," which means minimal current and thus little heat.
So before we could dismiss the idea, we would really need to run the experiment of "running" via wireless power to see how the current requirements in that state are reflected in device heating (or not). I agree with you, though, that if the answer were "always smoking hot," it would be a non-starter.
As for size, the "wireless" part is really, really thin (a loop antenna). Battery size is a function of how much cell material it can carry, and Li-ion batteries are about 0.5 g/cc and 250 Wh/kg (or 0.25 Wh/g). iPhones are around 12 Wh, so about 48 g of battery, which is 96 cc of lithium. Call it 100 cc and you're looking at a roughly 7.8 x 15 x 1 cm "battery card" (an order-of-magnitude approximation based on that math and rough measurements of my iPhone 13SE, which is a smallish phone).
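That back-of-envelope sizing can be checked directly. Note the 0.5 g/cc density is the comment's own assumption (close to lithium metal's density; packaged Li-ion cells run denser, which would shrink the volume estimate):

```python
# Reproducing the back-of-envelope battery sizing. The density figure is
# the parent comment's assumption: 0.5 g/cc is near lithium metal, while
# packaged Li-ion cells are denser, which would shrink the volume.
energy_wh = 12            # rough iPhone battery capacity, Wh
specific_energy = 0.25    # Wh per gram (i.e. 250 Wh/kg)
density = 0.5             # g per cc (assumed)

mass_g = energy_wh / specific_energy   # grams of cell material needed
volume_cc = mass_g / density           # volume of that mass
print(mass_g, volume_cc)               # 48.0 96.0
# ~100 cc is consistent with the ~7.8 x 15 x 1 cm (~117 cc) card above.
```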
I like the idea of magnets but I don't like the idea of the battery being "jostled off" my phone in my pocket and so effectively turned off when I think it should be on. So I think that would take some design work to get right.
MagSafe's auto-alignment really helped the efficiency of wireless charging: supposedly 75% efficient with MagSafe (based on the cable pulling 20 W and the phone supposedly accepting 15 W) versus 50% for regular Qi charging.
How about a nuclear battery, aka radioisotope thermoelectric generator (RTG)? This would be sealed in (carefully!) and would last the lifetime of the device, with no need for charging, ever. Would need special steps/handling to ensure safe disposal at end of product life, however.
This is challenging because you probably don't want a phone that is hot all the time. There is, however, a very interesting technology which uses beta radiation and what are essentially "solar cells" to generate electricity directly[1]. Basically you could paint some Cesium-137 onto a solar cell and it would generate power 24x7!
I think you meant "big" rather than "hot"? Its decay rate is low enough that it doesn't even melt itself when not cooled (and it melts near room temperature). Adding precision here: a thin layer would not melt itself, though you could probably keep a 10+ kg chunk liquid. But the low efficiency means you need a lot of surface area between the Cs-137 and the silicon lattice, so one could calculate the number of square meters of "battery" surface you would need to power a device and then see if you could topologically fold that into a relatively small volume. Cs-137 can also produce a small amount of gamma radiation which you'd want to shield against, so the whole thing might end up in a lead box (not great for making it light :-)).
It would be a fun engineering challenge and a good use of nuclear waste but difficult to sell politically because the association with "nuclear." And yes, it isn't actually practical at this level. Looking at where folks are looking at this seriously it seems to be more of a spacecraft thing (long duration probes far from the Sun kind of thing)
Being big is also a problem, but I did in fact mean heat.
"Not melting itself" is not the threshold we need to meet for a phone. We should probably consider 5 thermal watts the limit. If we have a 30% efficient betavoltaic panel, then we can produce 1.5-2 watts of device power within that limit. If we have a 5% efficient panel, then we're only getting a quarter of a watt. A quarter of a watt can stop an idle phone from draining, but you'd only recover about 1% of a normal phone battery per hour.
By the time we're considering external devices, I think "battery bank with a solar panel" beats betavoltaic on the technical merits even when we're ignoring difficulty and radiation. Even a tiny panel built into the back of a phone generally wins.
Higher levels of waste heat could be sold as a benefit, especially in cold climates. Nice little personal heating device to keep you warm on cold nights?
Optional accessories could harvest the heat to heat water, make coffee, cook food, etc?
Those people typically want to use their phone indoors too. There's really no good way to spin an uncomfortably hot phone.
If it's separate from the phone then you don't have to worry about that, but again if it's separate from the phone the entire scenario that made you want this kind of device starts to collapse because the requirements are so different.
You can't really have more than 4-5 watts of heat in a phone, so even with the friendliest heat source don't expect more than a quarter watt of electricity.
If you want to go totally crazy, you can use a black hole as a power source. The laws of physics don't give you a minimum size for them either. However, for black holes, the smaller they are, the more power they emit as Hawking radiation.
Yes. I tried to find the calculations, but couldn't. Google Bard to the rescue (cutting out the actual calculations it gave):
> Therefore, a black hole needs to be at least 9.49 × 10^{22} kilograms in mass to emit 100 watts of Hawking radiation. This is about the mass of the Moon.
No clue whether those numbers check out.
EDIT: poking bard a bit more gives this correction:
> However, for a black hole to emit 100W in gamma rays, it would need to be very small, with a mass of only about 10^22 kilograms. This is much smaller than the Moon, which has a mass of about 10^25 kilograms.
But poking more and more makes me less and less confident, because Bard seems to believe that bigger black holes emit more than smaller ones.
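For what it's worth, the standard Hawking luminosity formula, P = hbar * c^6 / (15360 * pi * G^2 * M^2), can be inverted for the mass that radiates a given power. A quick check with ordinary physical constants (no chatbot involved) lands nowhere near either of the quoted figures, and since M appears squared in the denominator, smaller black holes really do emit more:

```python
# Mass of a black hole whose Hawking radiation carries a given power.
import math

HBAR = 1.0546e-34   # J*s
C = 2.998e8         # m/s
G = 6.674e-11       # m^3 kg^-1 s^-2

def mass_for_power(p_watts):
    """Invert P = hbar*c^6 / (15360*pi*G^2*M^2) for M."""
    return math.sqrt(HBAR * C**6 / (15360 * math.pi * G**2 * p_watts))

print(mass_for_power(100.0))   # ~1.9e15 kg, vastly lighter than the Moon (~7.3e22 kg)
```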
Yes, I do: use a bunch of carefully controlled magnets to spin your black hole in a circle inside the phone.
Black holes can hold charge, and thus electromagnetism works on them.
(Not really useful for a phone, of course. But manipulating black holes like that is not any harder than manipulating an equally massive lump of ordinary matter.)
You really don’t have to go wireless to protect your device/battery against the elements. In fact, most smartphones today are totally waterproof while still being charged by a good old cable.
It's also pretty easy to design a connector with exposed pins that are totally inoffensive until something real is connected. In fact, modern USB charging already works like this: a charger will not send anything over 5 V x 150 mA before having negotiated with the device to know what it needs.
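A toy sketch of that safety behaviour (illustrative only; real USB Power Delivery negotiation is a hardware protocol, and the class, method names, and profile values here are invented for the example):

```python
# A source exposes only a harmless default on its pins until the sink
# explicitly negotiates a higher-power profile.
SAFE_DEFAULT = (5.0, 0.15)   # 5 V, 150 mA: inoffensive on exposed pins

class ChargerPort:
    def __init__(self, profiles):
        self.profiles = profiles      # e.g. [(5.0, 3.0), (9.0, 2.0)] as (V, max A)
        self.active = SAFE_DEFAULT    # nothing "real" until negotiated

    def negotiate(self, volts, amps):
        """Grant the request only if an advertised profile covers it;
        otherwise stay at whatever was already granted."""
        for v, max_a in self.profiles:
            if v == volts and amps <= max_a:
                self.active = (volts, amps)
                break
        return self.active

port = ChargerPort([(5.0, 3.0), (9.0, 2.0)])
print(port.active)               # stays at the safe default until the device asks
print(port.negotiate(9.0, 2.0))  # granted: profile covers the request
```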
You'd still need some in-device battery to keep the phone from fully powering off every time you change the outside battery. I think people want their phones to not have to fully power off all the time.
30min of standby capacity for an internal lil battery, external poptarts galore? I don't hate it.
It's kind of an interesting play because apple could make a super thin phone that has 30 min battery life and then a few different size pop tarts with different capacities. You want 3 days, get the mondo brick.
Interestingly, Apple's old PowerBooks had this. They had a very small internal lithium rechargeable battery, so you could close the lid (making the PowerBook go to sleep) and then swap the main battery without having to shut down the laptop. It could power itself off the internal battery (in sleep mode) for a few minutes.
Wasn’t a similar but worse version of this common in laptops 15ish years ago? You often had the option of running two batteries instead of 1 and a CD drive because they had interchangeable slots.
You are correct. “Pillow of doom” is sometimes used to refer to bulging/swollen batteries. There is subreddit named “SpicyPillows” that’s centered around the same concept. I assume it’s due to the pillow shape a swollen battery takes.
I really enjoyed the first half talking about how Apple and Nintendo followed a blue ocean strategy. Then the second half, ranting about removable batteries, felt totally out of place. I'm not sure I buy the blue ocean framing of that, although it was well written.
> I really enjoyed the first half talking about how Apple and Nintendo followed a blue ocean strategy.
Let me fix that for you.
Blue ocean is roughly the idea of inventing new markets. While it might apply to Nintendo, that's really a stretch; I view it as product differentiation in a highly entrenched market where Sony and Microsoft are clear sharks in the water.
If you look at Apple's product and marketing history, blue ocean has not applied in two decades. The iPod was hardly the first MP3 player; Diamond Rio was already in that space. There were several smartphones in the market before Apple shipped the iPhone, and people had speculated for years about when it would arrive. Smartwatches supported Android two years before the Apple Watch Series 0 shipped. The HomePod shipped five years after the first Amazon Echo. What Apple does is wait for other people to prove out the market, and then produce an expensive high-end version. This is an effective strategy that captures a lot of high-margin, affluent demand, but it's very clearly a red ocean strategy.
A blue ocean strategy doesn't have to shout "Think Different"; your product's uniqueness is obvious and usually confusing. A red ocean strategy means focusing on margins, inventory, and balancing product differentiation against convention. It means keeping your product secret until the big announce and suing competitors who copy you. At one point Tim Cook was calling himself the Attila the Hun of operations, and was quoted in articles as saying "inventory is the root of all evil."
This is very reductive of the impact the iPod and iPhone had on the market. Every product has little bits and pieces of other products in its DNA, but to say that something like the iPhone didn't create the smartphone market just because the BlackBerry or the Sidekick existed is a little silly.
Even the iPad, which has a clear lineage going back decades to the horrible Windows XP tablets, created a whole new market category out of thin air.
I'm not saying these are not great products, or that they didn't raise the bar. I'm saying they all launched with directly comparable competitors, and everyone knew what it was supposed to do. In popular imagination, the iPhone was supposed to make calls, send text messages, take pictures, and play music. A cameraphone, iPod and BlackBerry. The fact that I can make such an equation should be a hint that we're not talking blue ocean.
> Even the iPad, which has clear lineage decades back to the horrible Windows XP tablets
And maybe even the Newton. That was a better example of "blue ocean" product strategy.
I agree. It made no sense after the excellent first half.
Apple has designed the Mac to become obsolete in approximately 10 years. That’s when you lose official OS support. And yes, you can get your battery serviced but nobody’s going to buy a $200 battery for a laptop worth $500. Adding it back wouldn’t be a blue ocean, it would just be Apple ceding profits to nobody.
Apple’s blue ocean is stacking ecosystem benefits. If you have an iPhone it makes more sense to own a Mac or an Apple Watch. If you own a Mac or Apple Watch it makes more sense to own AirPods. If you own AirPods it makes more sense to own an Apple TV.
The future is batteries with more charge cycles, not replaceable batteries. The solid state battery people are claiming 10,000 charge/discharge cycles. If they can achieve that, it's 27 years of battery life. The case will wear out first.
Lithium-ion batteries that can handle a thousand to several thousand charge cycles while losing only 20% of their capacity have been available for ten years.
My 2013 Macbook Pro has just such a battery, rated for 80% capacity after 1000 cycles, and it lasted over 6 years before the battery's internal resistance rose enough that heavy load would cause a brownout and the machine to force sleep.
Yet they are not in widespread use; most devices have prioritized capacity/weight/size over longevity, Apple included. My current M1 has lost 10% of its capacity after just 120 cycles and less than a year of ownership, even using a utility that allows you to set a limited maximum charge level to preserve the battery. And the battery is less accessible, less replaceable.
Battery companies and device manufacturers don't care. There's no incentive for anyone involved and consumers have gotten used to replacing devices every few years.
Solid state batteries with 10k charge/discharge cycles are never going to happen, for the same reason every single lightbulb manufacturer purposely made incandescent bulbs fail once they figured out, a hundred years ago, how to make them last nearly forever.
We need something like a e-waste import duty, tax, fee, or 'recycling deposit' on devices with non-replaceable lithium ion batteries.
It's a bit of a myth, the everlasting bulb; it's more complicated than that. Lifetime was a tradeoff against colour and brightness. If you make them very red-shifted and quite dim, then yes, there are bulbs that are very old still around, but they don't produce much light and never did. The harder you drive the filament, the better the light properties, but the quicker it dies. The industry chose a point in that tradeoff as a standard. You can watch the history of it on Technology Connections: https://www.youtube.com/watch?v=zb7Bs98KmnY
That said, the Dubai Philips LED bulbs really do last a lot longer, since they have twice the filaments and run them a lot less hard (as well as being more efficient). It is possible nowadays to make an LED bulb that won't fail anytime soon, although LEDs are already much better than incandescents in that regard, and bulbs that will likely outlast you are available at a higher price, but only in Dubai. Most LED bulbs are designed to fail sooner, but they are also cheaper.
Those 10k-recharge batteries are absolutely coming and we will have them in a few years' time; li-ion as it stands today will simply be obsolete.
I'd be happy with an LED bulb that lasted a year. I haven't yet found one that can do that reliably.
Remember that 60W incandescent bulbs costing $0.25 used to be available that met that goal. We've taken huge steps backwards on interior household lighting.
Just joining in with others who report that I haven't had any problems with LED bulb failures. I've used exclusively LED bulbs for at least 5 years now (maybe about 15 bulbs across my home), and none has ever failed on me.
They do sometimes flicker when they're in a traditional dimmer set at an intermediate level, even though the ones I've bought claim to work with a dimmer. That's frustrating.
But I haven't had any outright failures.
If it helps: I bought mostly SANSI bulbs off Amazon.
> "I'd be happy with an LED bulb that lasted a year. I haven't yet found one that can do that reliably."
I do remember that in the early days of LED bulbs, many were quite unreliable, especially dimmable ones. It was probably the electronics in them failing rather than the LEDs themselves?
But nowadays LED bulbs seem much more reliable and I very rarely have to replace them. In fact, the only bulbs I can recall replacing in the past 18 months living in my current house were some old CFL bulbs that hadn't actually completely failed, but had gradually become a bit too dim.
Certainly, nowadays, I need to replace bulbs far, far less often than in the old incandescent days, when it was a regular chore in any house.
(Also, nowadays I avoid using old-style dimmer switches, and even removed some in one house I lived in. Better to get a smart bulb with its own built-in dimming if you want it.)
Jumping in to recommend Cree if they're still around. Used to be able to get them at Home Depot here in the US.
I started going Cree when I was renting 10 years ago, and I've yet to have a single failure among the bulbs that I've dragged with me to the place I live now. I have at least 25 Cree LED bulbs currently in the house. Their reliability actually has me frustrated with the current trend of lighting fixtures with integrated LEDs, which I have zero trust in.
I'm replacing all the EcoSmart bulbs with Crees. The normal 60W/100W equivalents have been fine so far, but I have some 7500-lumen Cree ED37s that are trash; I've replaced 3 already out of ~7. Not sure what the deal is. Maybe the power at my house is bad or something. We blow through LEDs like crazy. They just start flickering.
At that point it is the wiring in your house. LEDs are super sensitive to crappy wiring. I have a couple receptacles in my house that just chew up LEDs, one a year, while the LEDs next to them have lasted for 5+.
Ikea and generic home depot LEDs have lasted 13 years of daily use with zero failures. Also, I've never seen that high a rate of LED failure. You may want to have your house checked out by an electrician for peace of mind.
Not having replaced a bulb due to failure for ten years (and to be clear, not having replaced an LED bulb ever) is not the same as not having bulbs that are under 10 years old. We have a variety of LED bulbs from 2013, and then others I've incrementally replaced either as they annoyed me (heat, in almost all cases) or because I just got around to it (our external security bulbs most recently).
If LED bulbs are inherently prone to failure I’m not seeing it.
I do agree they’re more expensive, even if you adjust for inflation, but as far as I can tell they last forever.
You likely have an issue with your wiring. LED bulbs last much longer on average on a good circuit, but they are less tolerant of spikes and drops in voltage, so circuit, switch, and wiring quality come into play.
It would be much better if, instead of embedding an AC/DC converter inside each bulb, we got standard 12 V DC for lighting throughout homes. That would centralize the voltage sensitivity and allow larger circuits/capacitors to regulate it.
Most of the LED bulbs I've gotten have lasted quite a while (multiple years). I just recently replaced a front porch light that was about 10 years old. Granted it was controlled by a motion/light sensor so it wasn't on all night, but OTOH it was outside in the heat and the cold for 10 years.
IME the worst are the ones that get really hot. I've had multiple (rather expensive) LED spotlights like that in my kitchen which never lasted more than a year or so.
That’s crazy to me, I couldn’t tell you the last time I had to change an LED bulb, certainly it’s at least five years. And I work from home in a part of the world where it’s very dark for half the year so they get a lot of use.
They’re Philips Hue bulbs, a mix between LWB010 and LTW001, but even the cheap supermarket bulbs have lasted so long I also couldn’t tell you when I last changed them. Some of them have been in the house longer than I have.
> "Yet they are not in widespread use; most devices have prioritized capacity/weight/size over longevity, Apple included. My current M1 has lost 10% of its capacity after just 120 cycles"
A simple way to significantly extend battery longevity is to limit charge to 80% (only charging to 100% occasionally when you know you might need it, like a long day travelling).
Unfortunately Apple doesn't yet provide this option on most of its devices, other than the iPhone 15 line. But you can use third-party apps to do it on Macs.
Yeah, that doesn't work very well in my experience. Not aggressive enough about stopping at 80% unless your routine is very regular. I want it to always stop charging at 80% unless I say otherwise.
> even using a utility that allows you to set a limited maximum charge level to preserve the battery
Please tell me what that utility is--I've long wanted something like that but have yet to find anything like it.
FWIW, I've been doing something similar with my phone, but manually. I found an unusually low current charger that charges the phone at a rate of ~5% per hour, and I almost always keep the battery level between 25% and 75%. So far the phone is 4 years old and iOS claims that the maximum capacity of the battery is still 92%.
> Solid state batteries with 10k charge/discharge cycles are never going to happen for the same reason incandescent bulbs are purposefully manufactured by every single lightbulb manufacturer to fail when they figured out a hundred years ago how to make them last forever.
My first phone was a NiCad Nokia. Later Nokia monochrome-display phones used Li-ion well before smartphones. I use an iPhone now, but if a user-upgradeable Android phone came out with a solid-state battery, I would switch.
Actually, if the EU gets its way, a future with mandatory removable batteries might be the only possible future. And I'm so looking forward to it. Next up should be mandatory recycling of batteries by the manufacturers.
I think, if given an exclusive choice between the two, I would take (much) more reliable batteries over user-replaceable batteries.
10k cycles (~27y) sounds absurdly good - I think 80% of original capacity by 10y would be more than adequate. Basically you never need to replace the battery throughout the useful life time of a device.
Exactly. And the calculation should be total energy output over all cycles. In the future your solid-state battery will last 2-3 times longer per cycle, meaning you will only go through a full recharge cycle once every three days, if not less often. A solid-state battery that degrades to 80% after 1000 fast-recharge cycles would still provide you with 8 years of usage.
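The cycle-count arithmetic behind these comments, made explicit (one full cycle per day, or one every three days, are the usage assumptions being traded off, not measured figures):

```python
# Convert a battery's rated cycle count into years of use.
def years_of_use(rated_cycles, days_per_full_cycle=1.0):
    return rated_cycles * days_per_full_cycle / 365.0

print(years_of_use(10_000))       # ~27.4 years at one full cycle per day
print(years_of_use(1_000, 3.0))   # ~8.2 years at one full cycle every 3 days
```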
People aren't realizing this ship has sailed. Apple switched to supporting right to repair because:
1) They are no longer dependent on the loss of battery capacity to drive new device purchases, meaning this issue is effectively over. Batteries now last all day, and keep doing so for longer than most buyers will keep the device.
2) Margins on services sold to hand-me-down family devices are a growing high-margin and young-user market and much of the market isn’t open to buying a device here — they will only use the service if they get a device for “free.”
3) The net outcome is an increase in service revenue and decrease in recycling costs for Apple.
These are good points and interesting, I’ve never seen such analysis, but imo their support of right to repair is because they’ve steered the discussion into a direction they are comfortable with.
The Apple Self Service Repair program is Apple's ideal of right to repair: a shitty program where you can buy board assemblies and larger parts at fairly high prices, with no changes to their software approach whatsoever. A battery change through this program is ~$120 up front (if you rent the tools, which you probably should) with a $30 refund when you return your dead battery. A battery change via the Apple Store is $99. You can only buy parts if you give them the device serial number up front (so fuck independent shops), and you need to call Apple at the end to pair the part on their end.
A pointless program with tons of waste still created. A short on your laptop motherboard? Better buy a new motherboard. An issue with the camera on your MacBook? Better buy a new lcd assembly.
They will suddenly become very against right to repair the moment it starts to actually fight for things like schematics and board diagram access, component availability and not just parts assemblies, parts pairing that’s done in a consumer friendly way, etc.
Imagine legislation that says Apple has to change parts pairing so that if a consumer unlocks their phone, the parts can be sold too. Or that they can no longer enter a contract with Texas Instruments to buy 100% of their stock of a USB-C controller IC for a MacBook Pro so that repair shops can't buy it on Digi-Key or Mouser. They will fight that hard, and then just find a way to circumvent it (like making their own USB-C controller IC).
The Blue Ocean strategy is choosing a specific market segment underserved by your direct competitors and refusing to compete on the product attributes that your competitors are emphasizing, not just "trying something new".
This article is a hodgepodge of Apple praise [1], incorrectly applied business analysis, and a misguided hope that Apple will be considerate enough to build user-serviceable parts. It makes no sense.
[1] yes, the trillion dollar company is better than everyone else
> [fewer product sales per unit time] is a problem that can be solved using one of Apple’s favorite financial tools: higher product margins.
It's weird people forget the main message Apple has been touting for years now: services revenue.
Even without removable batteries, device longevity will increase with right to repair, and they've seen the writing on the wall for a long time.
Apple will continue increasing services revenue. They'll fight the bitter end for the 15/30% tax on everything, iCloud tiers, Apple Music, TV, etc.
If they find a blue ocean, it will be in services. Their entry into emergency satellite calls and other "wouldn't you feel bad being dead because you didn't pay us?" pushes could be that.
The second part seemed pretty weak to me, too, but I think there's a potentially bigger shift which could be steel-manned out of the battery idea. e-waste is getting a lot of attention, as are right to repair laws, and general realization that we can't keep generating waste on a 20th century scale.
Apple might have an interesting angle for embracing that because, unlike their competitors, they profit from the whole stack, have a robust service portfolio, and have retail presence near a large portion of their customers. Turning device longevity into a competitive point really puts pressure on anyone who can't easily escape Qualcomm's blink-and-you-miss-it support period or negotiate some kind of revenue-sharing agreement between themselves, Microsoft/Google, and services like Spotify or Netflix. And unlike most other attempts to make it harder to compete with Apple, this one would actually be seen as a general good by almost everyone other than their direct competitors.
Really bad article; the author doesn't seem to have any idea what blue ocean strategy is about. It's not about making "bold moves" in an unchanging market. It's about cutting cost by focusing on what really matters to an overlooked consumer audience, one the competition isn't even targeting. Also, low cost wasn't just Nintendo's choice; it is a major factor in the definition of blue ocean. Apple cuts cost in some areas to pay for quality and user experience in others, keeping the price high. They do this because they make luxury items.
Yellow Tail wine and Nintendo's Wii were textbook examples. They both found an untapped audience: novice consumers who neither know nor care about over-the-top quality. In Nintendo's case, hardcore gamers valued graphics and performance as quality; for Yellow Tail, it was wine enthusiasts and their interest in vineyards, vintages, and hints of whatever.
Apple products are for novices, but they aren't cheap. The Chromebook is a better example of blue ocean. It does only what an entry-level consumer wants in a laptop: it browses the web and writes documents. Google said, "hey, pretty much everyone who owns a MacBook but isn't an artist or techie could use this." Like MacBooks, it doesn't require expertise and it offers a similar quality of user experience, but unlike them it's affordable, because it doesn't have unnecessary hardware specs and Swiss-watch-level build quality.
I don't blame the author for not reading the whole-ass book. But for God's sake, at least read the wiki page about it; don't use a gaming magazine as your source. He clearly took the Wii example in the magazine and ran with it, thinking he could map it onto Apple products because, what, they get rid of stuff and are innovative like the Wii was?
I’m glad I’m not the only one who found the battery ramble irrelevant.
I thought that EU regulations were going to force smartphones to have removable batteries within the next few years. That’s hardly swimming out into the blue ocean.
But regardless, easily removable batteries are going to create more e-waste, not less. Replacing a battery is perfectly doable right now, but there is a hurdle you have to jump over - you aren’t going to do it until you really need it. Once those batteries are easy to replace, people are going to be replacing them much more frequently, they’re going to be buying spares, etc. And people are going to be throwing old batteries in the trash without even thinking about it.
Batteries are still expensive. An iPad battery, for instance, easily runs 100-200 bucks, same for laptops. The iPhone might come in lower (50?), but even with a simple system, opening the device will still probably be enough friction to make it not worth it for most people.
To make a comparison to cameras, I don't know how many people care to buy spare batteries outside of pros and people actually going through 2 or 3 batteries on their trips. Everyone else is probably using the one coming with their camera until it dies and then replaces it.
I think it's "replaceable," not "removable," batteries, using "readily available tools."
So don't think of the removable batteries of the past, held together by a sliding cover; think more like grab an eyeglass screwdriver and a few minutes later it's out.
I think the distinction is slight, but there is a difference between an object meant to attach/detach (like old laptop batteries) and the M.2 drive in my laptop. Both are technically replaceable; only one was designed to be removable.
Of all the things I've ever wanted to be different on my MacBook, removable batteries have never been one of them. Now that battery life exceeds a (long) working day, why on earth would anyone want this? The very few times you will be away from a wall socket for more than 12 hours, you can just bring a power bank.
At some point Apple refuses to replace the batteries of your macbook as they stop supporting the model. At that point you could be down to a few hours of battery life, with no third party repair shop willing to work on your laptop at a reasonable price. That's where you'd be with a 2017 macbook Air for instance.
Replaceable batteries give an out to that situation: one battery maker somewhere in the world could be enough to keep the remaining laptops alive, with no need for invasive servicing.
Of course that brings Apple close to nothing in revenue, but would boost good will from users and let them tout sustainability?
It also won't resonate if you're cycling through devices every 2 years anyway.
The 2017 MacBook air has a ton of aftermarket batteries available for less than $60. It's really not that difficult to replace it yourself. The ifixit guide says it takes 15 minutes.
"with no third party repair shop willing to work on your laptop at a reasonable price."
I am not aware of any third party repair shop who would refuse to work on a 2017 MacBook Air. Unless it is a repair shop that refuses to work on Apple products in general.
My mistake, I mixed up a MacBook Pro and an Air. The Air was from 2014; the Pro is 2017.
Could it be regional differences? I rechecked on the online support page (both JP and FR), and repair support falls off right at 2015 (which is nothing to sneeze at for a manufacturer, but I'd still wish for more).
When I did, they couldn't guarantee to repair my 2013 laptop, but they sent out a parts request to see if they had a replacement part in the warehouse (top case), and they did have one, and they were happy to use the battery replacement price if my battery showed the "service recommended" state, same rule as for newer models.
They don't provide a warranty for repair on such an old machine, because they can't guarantee being able to find another replacement part.
I asked for a price on a screen replacement too and they didn't have a screen, so couldn't do that. The screen is in pretty good shape though.
Make battery packs (call them "removable batteries", if you like) 100% completely and totally separate from the device itself. Think of it like a UPS for a desktop: no (or practically no) battery on/in the device; instead you run the device with a Qi-compatible battery pack.
IOW - the device becomes 30-50% lighter (on its own), and the buyer determines how heavy his/her experience will be based on which Qi-pack [s]he has chosen to run their phone
Want a 10,000mAh battery? No problem!
Want only a 2,000mAh battery? Again: no problem!
This even solves a problem Apple has let other vendors 'solve' with regard to cases. Want an OtterBox? CrayolaCase? SiliconSoftie? Go get your case from any of a thousand manufacturers.
Apple could provide a pair (or trio, etc) of Qi-packs for their devices, and let other manufacturers go banana pants coming up with other options
This modularizes the iPhone in a smart way - not like that goofy Motorola method (which was reminiscent of IBM in the 80s open-standardizing the buses for their Personal Computer), whereby the only thing you "have" to get from Motorola was the core module.
Apple's "core module" with such a program would still be iPhone ... and that is still the biggest differentiator Apple has vs the 80 scadzillion Android makers out there: iPhone is only Apple. Android is whomever wants to make one.
Apple has had this for a while now with the MagSafe battery pack.
I actually think it should become a standard that all companies adopt. It's far easier and quicker to use than removable batteries and doesn't remove valuable space inside the phone.
It’s also incredibly wasteful. 30-40% of a battery bank’s capacity is lost to inefficiency, and even worse with a case on. It would be great if phones commonly offered pogo pins on the back in addition to wireless charging.
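To put rough numbers on that loss claim - a minimal sketch, where the 90% converter and 70% inductive-link efficiencies are illustrative assumptions, not measurements:

```python
# Energy delivered by a battery bank over wireless (Qi) charging.
# Losses compound: first the bank's boost converter, then the inductive link.
# The efficiency figures below are illustrative assumptions.

def delivered_wh(bank_wh: float, converter_eff: float, link_eff: float) -> float:
    """Watt-hours actually reaching the phone's battery."""
    return bank_wh * converter_eff * link_eff

# A ~37 Wh bank (10,000 mAh @ 3.7 V), ~90% converter, ~70% wireless link:
out = delivered_wh(37.0, 0.90, 0.70)
print(out)             # ~23.31 Wh delivered
print(1 - out / 37.0)  # ~0.37 -> roughly the 30-40% loss cited above
```

A case between the coils typically drops the link efficiency further, which is why the loss gets "even worse with a case on".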
I used to feel like you - the non-removable battery thing was annoying, the non-modular flash was a deal breaker, etc.
None of them turned out to matter, at all. What does matter is a solid feel and solid construction and the flash and batteries lasting the lifetime of the product. I'd rather have that than replaceable.
Are the apps you're using substantially better than they were 4 years ago? Are they doing substantially more for you that you wanted from them?
If the hardware is failing then I guess fair enough (it shouldn't be, that's a design defect), but inflated hardware demands to compensate for increasingly sloppy code isn't a reasonable justification to demand dropping another $500+ on a device that works fine.
I recently gifted a used iPhone X to a relative. It's still an exceptional device on which the hardware works as well as it ever did, minus the reduced battery life. I would have been happy to keep using it if I could replace the battery simply and it was still getting O/S updates.
(As it is, the device still receives security updates, which I can't really grump at Apple for after almost six years - but only because they're better than the rest of the industry. It's still a poor situation).
>(As it is, the device still receives security updates, which I can't really grump at Apple for after almost six years - but only because they're better than the rest of the industry. It's still a poor situation).
I am continually astonished at folks who expect OS and/or security updates past a few years (this is not a slight against you, just an observation on my part)
A lot of folk like to gripe about "planned obsolescence" or similar because eventually their device is no longer supported
Yet the fact that manufacturers support equipment as long as they do is ... kind of astonishing
The higher the volume of production, the shorter you would expect a manufacturer to support it (a 747 is worlds different from a pickup truck which is worlds different from a phone or laptop)
The cost-benefit analysis of continuing to support/update a given device has to be taken into account ... and - for most phones (since that is what we are talking about in this thread) for most people, that means that between 3 and 4 years after release, it makes [nearly] no economic sense to keep supporting them
That Apple supports devices for as long as it does speaks volumes about their commitment to their customers (even though, of course, they would like you to get a new phone every year, I am sure)
I guess my view on it is that there's really no reason for the obsolescence.
iOS 17 may not be technically able to run on the A11 chip in the iPhone X, but only on the basis that it's been designed not to do so. Any features that genuinely needed extra neural cores (or whatever limitation there might be) were clearly designed that way with the intention of forcing a new purchase - the software could in all likelihood have run just fine on the A11 if they wanted it to. OCLP kind of proves this on macOS, which should (in theory) be even harder to maintain support for than iOS, as there are many more binary driver blobs that Apple doesn't control the source code for and therefore can't maintain or patch if the original vendor has stopped doing so.
> I am continually astonished at folks who expect OS and/or security updates past a few years
How many is “a few” to you?
Many people finance their phones through carrier deals that last three years. It should at least be supported that long.
And why not expect longer? To an outsider looking in, the software isn’t changing that much. I still have a messaging app, a browser, a camera app - that all basically haven’t changed their core functionality in the time I’ve had my device.
Just like computers, I think there is a correlation between price and expected support.
If I’m paying $1,500 for an iPhone 15 Pro, I am doing so with the expectation that in 5 years it will be supported.
If I am paying $100 for a cheap Android, my expectations are lower.
> Are the apps you're using substantially better than they were 4 years ago? Are they doing substantially more for you that you wanted from them?
Yes - in general, the apps I am running are doing more and are better than they were when I first got a given model phone :)
I went from the 4S to the 6S Plus to the 11 and am waiting on my backordered 15 Pro currently
I still have the 6S Plus in a drawer in my utility room - it "works" for lite tasks like web browsing, but will not run most of the productivity apps I need/want for work or the educational apps my kids need/want for school. So, for me, it is a "useless" device (so the kids get to play with it - and nobody cares what happens because it is 8 years old)
My 11 still "works", too - but it is getting very long in the tooth
Agree with most of your premise - but a magnetically aligned and secured pogo-pin interface on the bottom back edge of every MacBook might be superior to inductive charging, which wastes electricity and generates too much heat. Such battery packs could also include a second pair of pass-thru pins for stacking and connecting to a direct wall power adapter.
was not really thinking about MacBook, more iPhone (and possibly iPad) ... but that is an intriguing idea, too: supply a basic battery that would last for X long, and have a pass-through (hermetically sealed, if you wanted to stay in IP-rated territory for dust/liquid resistance) MagSafe (or similar) connector for additional battery packs on the bottom
I could totally see getting a second laptop-footprint-shaped pod for my MBA that would click on and give me another 20-40h of run time between charges!
This is basically what they're doing with the Vision Pro, though out of engineering necessity rather than preference: the headset by itself only has about 5 minutes of battery life, to give you just enough margin to swap out the external battery pack you keep in a pocket.
Hopefully that internal battery is purely optional, i.e. implemented like in most laptops and not like in most phones, where it's impossible to use a device with a dead battery even when externally powered. Otherwise it'd be the worst of both worlds.
Apple never had a blue ocean strategy. Almost none of their products sold because they were innovative. They sold well because they had a better execution and integration.
Not going to happen. Battery life is good enough for 99% of people that they aren't going to want to swap batteries. Only nerds really did that in the first place.
Battery technology has improved so much that removable batteries are less desirable.
The article is about Apple, but it describes the https://frame.work/ blue ocean of modularizing the laptop, where nearly everything is replaceable and upgradeable
The real Blue Ocean strategy that Apple should apply would be to use AI to help analyze our behaviors and then actually change our behaviors to help us be happier and healthier, not just use it to sell us more stuff (as some competitors are doing).
I call BS on this. Apple’s strategy is not battery tech. It’s personal computing devices. Always has been. Always will. Wearables, services, AI-capable hardware, is where Apple wins. Wearables beyond the watch and the vision pro. Meta and Ray-Ban teamed up for some Google Glass-like shades. Apple will come out with Shade (TM) if they choose to go into that market.
The point is, Apple has had a strategy for a while now. It's not going to win against Android in the phone wars. It's going to win in the AR/XR space if they can manage to get a decent wearable out and provide cores within cores within cores for AI NNs.
I would say that Apple has already won against Android. Android may have more units world-wide, but Apple owners spend more, are more desirable customers etc.
>Second, people still crave the advantages of removable batteries that were left behind: increasing battery life by swapping batteries instead of using a cumbersome external battery pack, inexpensively and conveniently extending the life of a product by replacing a worn-out battery with a new one—without paying for someone else to perform delicate surgery on the device.
This whole paragraph is delusional. Apple users carrying and swapping out batteries? Apple supporting and allowing people to simply swap out a battery instead of going to the apple store for a certified repair? Apple??
I think Apple's blue ocean strategy is unique in the sense that it's not about when you feel they should release something, but about when they feel it's the right time - doing so in typical Apple fashion by being late, because they refine it to their standards.
In terms of blue oceans, it's where they see a lack of innovation. You can have many blue oceans within anything you do if you see the problems that exist in a way your competitors don't, and then apply great engineering with great marketing (although sometimes it's just marketing) to reach the numbers you need.
If there's going to be removable batteries for any product it will be for new product categories like the Vision Pro. Apple is definitely not going to take established products like the iPhone and change it up just to be different (which isn't a sustainable competitive advantage because anyone can do it). Even with the Vision Pro it's a compromise that Apple is likely tolerating, rather than what the 'final form' of the product will be in 10 or 15 years.
>Starting in 2009, Apple began to phase out removable batteries across its laptop line in favor of batteries that were sealed inside the case and were not user-accessible.
>The upsides, which Apple touted, were many: lighter weight, smaller size, better reliability, longer battery life.
What are you talking about? The link provided to Apple's press release for that laptop did not include anything about why a non-removable battery was advantageous. They provided other reasons the new battery was great, but they did not say the elimination of user-serviceability allowed for any of those benefits.
>The iPhone defied so many other norms that the sealed battery was less remarked upon than it might have been, but it was still noted.
Noted by who? Remarked upon by who? It sounds like this author is generalizing his own opinions to the entire population. I'm sure this comment sounds nitpicky but I just don't like this kind of fast and loose writing.
Their next big move will probably be some AI product that ties together their other offerings into their exclusive platform. The ideal would be that you buy all their products and they automatically learn how you live your life, anticipate your needs, and provide for them seamlessly. Other companies try to do this now but it's not seamless at all. If Apple can use some tricks to identify a user, link all their stuff together, begin predicting needs and popping up solutions for them, all tied into a privacy-first data warehouse (nobody can use your data except you on your Apple devices), that could capture much bigger segments of the worldwide tech product landscape.
At least, that's my assumption based on their massive spend on infrastructure related to AI. I've been waiting for their cloud hosting project to jump out of stealth mode, if it's still in progress, but maybe it was never intended for anyone other than Apple's use. If that's the case then it makes more sense that they'd sort of pull a Google and invest more heavily in the backend server side for one product that could really be a killer, which would most likely be something AI. An answer to Google's search product that isn't search.
The first half of the article was great. But regarding going back to replaceable batteries, I couldn't disagree more. If you know anything about Apple's design philosophy, you'd know the last thing Apple wants is for users to even consider opening up their device to replace/swap a battery.
I very much doubt that giants such as Microsoft, Apple, etc. rely on strategies such as "Blue/Red Ocean"; it's too fancy/romantic to be practical at their level. They tend to focus on established markets, which is very messy and competitive.
More like comparing walled gardens with few open tech standards.
I was going to rant. But it is pointless. These companies find ways to hold customers because of their tailored experiences and it has less to do with technology and more with targeting.
It could be easily done. Sadly Apple doesn’t seem to view that in its interest, despite its repeated claims to care about the environment.
They could make slick, elegant devices which are still repairable and have replaceable batteries.
I remember the original iMac G5 had a great design which let you simply remove the backplate with 3 screws and access everything. Of course in the next iteration they changed the design entirely so that opening it up became a huge ordeal.
Non-US resident here; I just switched from a MacBook 13 to a System76 Lemur. Half the price and double the performance; could not be happier. The market is ready for segmentation, and users today are more tech-savvy than before.
What kind of performance? The best Geekbench scores for a System76 Lemur Pro with an i7 are still significantly lower than those of a M1 MacBook Air, which is the slowest Apple Silicon laptop. The Lemur also costs $1,300+ with the i7 (i.e. not cheaper than the Air) and has a 1080p screen.
Not saying it's a bad machine or anything, but the "half the price compared to a MacBook" meme is now more irrelevant than ever.
I'm all for Apple reversing their long-standing decision to make device batteries non-user-replaceable - an original decision which I personally loath, and one where the EU will most likely force Apple's hand anyway by requiring batteries to be user-replaceable.
But excuse me while I throw up in my mouth a little with this "Blue Ocean" nonsense. Apple makes an original, user-hostile decision for the sole purpose of increasing planned obsolescence and to make them more money, and then when the winds shift they might go back the other way - but, again, they'll likely be forced to anyway. No need for poetic blog posts.
I suppose we're owed another "Blue Ocean" missive about how Apple led the way with Lightning connectors and then found another Blue Ocean with ... USB-C. Puhleeez.
In the Apple devices I have seen the insides of, there is plenty of room to add screwed brackets that hold the battery in - the battery does not fill all available space, because even pouch cells are basically rectangles. That means in every curve there is plenty of space for a screw post or hook.
But I agree that "Apple makes an original, user-hostile decision" is a bit too simplistic. I have long since come to the conclusion that Apple simply always chooses the cheapest option for them where user-serviceability is concerned - or, to put it another way, they ignore user-serviceability semi-actively. Routing screw posts costs time and wear on tools; screwing in extra screws costs time; thus just gluing the battery in place is the cheapest option.
Soldering on flash and RAM saves on costs and makes the PCB design easier. At the same time, it "locks in" the longevity of the device neatly. This leads to not-overlong replacement cycles simply for practical reasons - for instance, modern Chrome eats all the RAM.
R2R enthusiasts are incapable of seeing any decision from a perspective that isn't their own personal fetish for tinkering. So any and all advantages of sealing the battery into the chassis are just waved away and not acknowledged at all.
Lighter devices? That's irrelevant, no one cares about device weight.
Cheaper to manufacture? That's just a ploy to gasp make more money! How dare they?!
Waterproofing? Well I don't go SCUBA diving with my phone, so clearly no one else does. And if they do, just put it in a ziplock bag!
On the contrary, I find it amazing to see the lengths that Apple apologists will go to in pointing out these spurious reasons why Apple included non-removable batteries.
The only point that you really list that is defensible is waterproofing. Lighter devices? Cheaper to manufacture? The idea that making a removable battery would make a phone noticeably heavier or impact the cost more than a cent or two is laughable.
Even with the waterproofing, I'm dumbfounded on how one hand people can laud Apple for their truly amazing engineering capabilities (I may question its utility but the Vision Pro is really an incredible piece of engineering), yet somehow a waterproof removable battery (which some other phones have) is beyond their grasp.
Removable batteries require valuable space for the mechanism to hold the battery, transfer power from the battery, eject the battery, etc. It doesn't come for free.
At least based on what I've seen with camera mechanisms, you're talking about a 10-20% reduction in the battery cell.
Since the EU law is about making batteries "removable with normal tools", one could argue that encapsulating the battery like a camcorder battery with a plastic hull isn't needed.
A normal pouch with contacts will do.
Assuming "all day battery life" (24h), a 1% reduction in battery life is about equal to 15 minutes, but frankly I can't see a replaceable battery reducing the capacity at all if you are doing halfway good engineering - there is plenty of void space in an iPhone even now.
By that I mean space over/under PCBs where the battery cannot go, because it is still just a rectangular polygon even in, say, the iPhone 15. I think it has 2 smaller rectangular cells with a casing around them to make an L shape.
So why doesn't Apple forgo the "casing" and take that sweet 10 to 20% extra capacity? :)
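The arithmetic in this sub-thread can be sketched quickly - the 24h "all day" figure and the 10-20% mechanism overhead are taken from the comments above as assumptions:

```python
# Minutes of runtime lost per percent of battery capacity, assuming the
# "all day" (24h) battery life mentioned above. Inputs are assumptions
# from the discussion, not measured figures.

def runtime_loss_minutes(reduction_pct: float, total_hours: float = 24.0) -> float:
    """Runtime lost, in minutes, for a given % capacity reduction."""
    return total_hours * 60 * reduction_pct / 100

print(runtime_loss_minutes(1))        # 14.4 -> the "about 15 minutes" figure
print(runtime_loss_minutes(10) / 60)  # 2.4 hours lost at a 10% capacity hit
print(runtime_loss_minutes(20) / 60)  # 4.8 hours lost at a 20% capacity hit
```

So a 1% hit really is negligible, while the claimed 10-20% mechanism overhead would cost hours, which is why where the space comes from matters so much to this argument.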
There are reasonable levels of supply chain integration, and then there is user-hostility. With Apple we've seen both sides of the coin, so you'll have to excuse the people who accuse them out the gate.
It stands to reason that a company as large as Apple is capable of making decisions that impact large amounts of people. It's not just Right to Repair enthusiasts that care about that impact, it's extended to regulatory bodies that are tired of meaningless standards competition and arbitrary market separation. Apple can excuse these base truths with whatever feature or novelty they choose, but it doesn't overwrite the insidious functionality of their choices.
You're basically describing the situation in reverse, from the perspective of foreign markets and regulatory watchdogs:
"Another special, licensed connector for data and power? Why do consumers need that!"
"An App Store? Why give Apple complete control over a feature as basic as installing software?"
"Only one browser engine? How are new technologies intended to compete on a platform that doesn't acknowledge newcomers?"
These "advantages" to sealing the battery are incredibly weak/false.
The weight difference from sealing the battery is negligible. If it saved several hundred grams, it would be one thing, but a replaceable battery adds at most tens of grams.
Even if the sticker price of the device is cheaper, I would still have to pay more to replace the entire device once the battery is worn. This is the primary reason most companies seal the battery: to force buying an entire new phone. Having a replaceable battery would save me money.
Sealing the battery was not responsible for waterproof phones. There were already waterproof phones with replaceable batteries.
I have huge respect for John Siracusa. His articles at Ars Technica and his pragmatic opinions on the world of computing always spoke to me, but this article sounds like something written by John Gruber.
Apple indeed innovated in multiple ways and markets, but non-removable batteries wasn't one of them.
I also have massive respect for Siracusa and I find I have a few similar opinions that made it really enjoyable listening to him and reading his articles.
But over the years, as John has gotten older (as have the other members of ATP), I find that my values are starting to diverge. It feels much more like a news show than a tech show, and I find I'm not gaining much insight from their discussions or opinions. Even their tech insights are starting to feel out of touch with the rest of the world.
Marco was always disconnected because he's well off and could do whatever he wanted (good for him). When Casey and John had jobs, they were able to keep Marco grounded, and the trio really worked. Now that they both only really do the podcast, they're all a bit disconnected. I'm happy for all of them to make money doing what they love, but the show definitely changed once they all became essentially full-time podcasters.
I didn't know John pivoted to full time podcasting. This actually explains, IMHO, the shift towards a more forgiving view of Apple's product directions.
Indeed, if I remember correctly, he quit about 12 months ago or so. It made me wonder what the earnings of a high-profile podcast can be! Also how much that can fall away when the advertising business drops (as they constantly like to remind me, multiple times on every show…).
It makes me confused that while they’re working on the show(s) full time, the quality is dropping? It’s less interesting somehow?
I did the math a couple of years ago, and it came out to roughly $100k per podcaster. This has obviously dropped off significantly the last year with ad sales in general tanking.
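For anyone curious how such an estimate is built: a sketch of the usual CPM-based back-of-envelope. Every input below (downloads, ad slots, CPM, episode count, host count) is a hypothetical assumption chosen only to show how a number near $100k can fall out, not a reported figure:

```python
# Rough podcast ad revenue per host: ads are typically sold per thousand
# downloads (CPM), so revenue scales with (downloads / 1000) * CPM * slots,
# times episodes per year, split across hosts. All inputs are hypothetical.

def annual_revenue_per_host(downloads: int, ads_per_episode: int,
                            cpm_usd: float, episodes_per_year: int,
                            hosts: int) -> float:
    per_episode = downloads / 1000 * ads_per_episode * cpm_usd
    return per_episode * episodes_per_year / hosts

# e.g. 80k downloads/episode, 3 ad slots at a $25 CPM, 52 episodes, 3 hosts:
print(annual_revenue_per_host(80_000, 3, 25.0, 52, 3))  # 104000.0
```

This also makes the sensitivity obvious: halve the CPM (as happens when ad sales tank) and the per-host figure halves with it.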
Siracusa's simpering, consoomer defensiveness of Apple has become intolerable. He seems unable to deal with the fact that his precious company's business practices are incompatible with his political/environmental beliefs, instead concocting this 'blue ocean' dream that Apple will switch back to user-serviceable batteries, creating a new market where their competitors don't dare tread.
What?
There are still plenty of Android devices that have user serviceable batteries, but they don't sell as well because what the majority of users want is to minimize size and weight at any cost. The minority of geeks out there who think something like this is important doesn't move the needle. Apple (particularly under Tim Cook) only cares about maximizing profits. Their environmental schtick is merely dressing in front of the ruthless profit making machine at work.
Obviously Siracusa can't deal with the dissonance that his precious Apple doesn't care about the environment as much as he does. Thus typing up silly articles like this...
> they don't sell as well because what the majority of users want is [...]
I think in the current market we have no idea of what the majority of users want because there's no combination of all the available options.
The majority of users might value iOS more than any specific hardware feature. Second to that, they might value price and carrier rebates.
That makes fringe Android phones' failures irrelevant and would leave us with only random guesses about what people would want outside of what Apple and Samsung are offering right now.
Exactly. Just because a company sells something doesn't mean the majority of users want it that way. I would love for Apple to sell a laptop akin to the 2006 MacBook and MacBook Pro but with modern components, complete with removable batteries as well as user-upgradeable RAM and storage. It doesn't need to replace the existing Apple laptop lineup; it could supplement it (let's call it the MacBook Pro Max or the MacBook Extreme or something to that effect). Of course, Apple doesn't sell this product, and so I have to choose between two compromises: dealing with non-replaceable components so I could use macOS, or dealing with Windows or Linux so I could use something like a Framework laptop. In other words, I must choose between macOS or user-serviceable laptops, and unless Apple experiences noticeable declines in sales, this won't change; in fact, even PC laptops have largely moved toward the Apple model; even many ThinkPads have soldered RAM these days.
Nintendo died by hiding in the corner, being afraid to compete, not making anything cool, and instead making something nobody wanted in a category too small to sustain itself.
Nintendo died? Who's making these nice, popular hybrid portable/docked consoles then that I see all around my friends' living rooms and on public transit?
• MST (DisplayPort daisy chaining) in macOS. The hardware has supported it for over a decade; the OS is the weak link here. It's the difference between spending $75-$300 on a dock (plus cables) and just connecting the monitors together, with a single cable to the first monitor.
• Non-soldered storage. Seriously. Storage is the most likely component to fail. SSDs only have so many write cycles.
• BIOS on a separate chip. To make matters worse, with the introduction of the T2 chip, the BIOS is stored on the SSD as well. This means if the SSD fails, you don't just lose your data; you have an expensive brick. You can't even boot to an external drive anymore if one of the two SSD chips fails.
• Safer SSD chips. If a cheap capacitor fails on newer Macs, 13V gets shorted straight to the SSD. The SSD commonly doesn't survive this. And since the BIOS is on the SSD now… A literal ten-cent part blows up your multi-thousand-dollar laptop with zero warning.
https://youtu.be/RYG4VMqatEY
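The write-cycle concern in the storage bullet can be put in rough numbers. A sketch using an assumed TBW (terabytes written) endurance rating and daily write volume - neither is an Apple spec:

```python
# SSD lifetime estimate from a TBW (terabytes written) endurance rating.
# Both inputs below are illustrative assumptions, not Apple specifications.

def ssd_lifetime_years(tbw_rating_tb: float, daily_writes_gb: float) -> float:
    """Years until the rated write endurance is exhausted."""
    total_writes_gb = tbw_rating_tb * 1000
    return total_writes_gb / daily_writes_gb / 365

# e.g. a 512 GB-class drive rated for 300 TBW, averaging 50 GB of writes/day:
print(round(ssd_lifetime_years(300, 50), 1))  # 16.4 years
```

Under typical workloads the rated endurance outlasts the machine, which fits the "failures are rare" experience reported upthread; the objection to soldered storage is that when the rare failure does happen, it takes the whole logic board with it.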