This only makes sense if you pretend developer time is free and low morale has no effect on productivity.
10 minutes of lost productivity a day due to inadequate hardware means you are paying at least $4k per year per dev for that shit pile of a laptop. Usually it costs more than 10 minutes a day too.
The hit to morale when you tell a professional they can't pick their tools is even worse. Now they are spending at least 1 hour a day looking for a new job because devs are good enough at math to know if you can't afford a new laptop you've got six months of runway tops.
If you believe employee happiness has any correlation to productivity you always buy them whatever hardware makes them happy. It is a small price to pay.
And yet not every developer who isn't using a tiling window manager is fired on the spot for wasting company time.
"productivity" (or what people think of as productive activities) is overrated. Shiny new hardware makes employees, especially technically oriented ones, feel important and appreciated. That's about the gist of it. Nothing wrong with that, but let's not blow up egos with claims of "productivity".
Productivity is one of those things that I think should be motivated with reward for increasing it instead of punishment for decreasing it. I love the feeling I get when I use my own custom window manager to shave another 3 seconds off something I do all the time. It makes me feel like Tony Stark building my own personalized JARVIS: a program that automatically does exactly what I would have done manually. That's a big part of why I built my window manager and why I want to share it with people. I want them to feel that same excitement and joy of directly improving their own quality of life, even in a tiny but very real way. I would open source it and give it away for free if I could do that and still keep the lights on.
My reaction to customizations that shave off seconds is: "so what, it'll be blown away the next time the tech stack changes." I do automate, but there's a subtle difference in goals.
If I automate my personal toolset, I follow the same procedure I use around automation anywhere else: don't start off doing it to save time, do it to increase reliability. I will write small scripts, sometimes one-liner scripts, sometimes largish hundreds-of-lines scripts. But the outcome I am aiming for is that I have a procedure that is documented and puts all the configuration in a place where I can see it, so that when the situation changes, it is fixable. A productivity boost is a frequent byproduct of successfully automating, but it's usually a side component to "reliable and documented". The boost is perceived as reduced friction and increased conceptual integrity: fewer things to check or to accidentally get out of sync, and thus less stress involved.
Focusing on UI being both shiny and fast likewise often misses the point, and that's usually what's at stake when discussing new hardware. There are order-of-magnitude thresholds for human attention that are important to keep in mind, but hitting a lower attention threshold usually doesn't solve a productivity problem. It creates a case of "wrong solution faster", drawing the user into a fast but unplanned and reactive feedback loop.
See for example the case of writers who like the AlphaSmart, a device that makes editing so tedious that you don't do it: you just write the draft and edit later.
I work with .NET, and that used to mean you had to be on a Windows® computer. At a place I used to work, I had an HP EliteBook laptop running Windows 7 with an i7 processor, 8GB RAM, and a spinning hard disk. That by itself is not the problem. The problem is that "asset management" software was installed (by default, I assume) that was overly active; combined with an antivirus doing "real-time protection", it meant a Subversion checkout could take a long time. That definitely degrades employee morale, I think.
What does that mean? You would fire anyone who opens up Finder ever? As opposed to what? Doing everything from the command line? This sounds ridiculous without clarification.
>10 minutes of lost productivity a day due to inadequate hardware
What on earth are you doing that a 2 year old mac is inadequate for?
Yeah, there is a point at which hardware becomes a problem. But I'm working on a 5 year old, mid range PC and I don't think an upgrade would really change any of my day to day work. Maybe IntelliJ would be a little faster and compile times would be a fraction faster, but I doubt I'd notice it.
I have a 10 year old PC at home and the only pull to upgrade is gaming, but I'd rather not sink time into that (I get enough time sitting behind a screen at work) so I hold back on spending money on it too.
Maybe developers' happiness does drop if you give them older hardware, but I don't think that's based on realistic changes in the performance of that hardware.
> Maybe IntelliJ would be a little faster and compile times would be a fraction faster but I doubt I'd notice it.
Just because you don't notice it doesn't mean that it's not there. The argument is that it's insane to pay developers north of $100k per year plus benefits and then not invest $4k every 3 years to buy them fast hardware.
No, the argument is that it's insane to pay developers north of $100k per year plus benefits and then not invest $4k every 3 years to buy them marginally faster hardware.
And I don't think that argument is particularly convincing. Typing this from my 3.5 year old work MacBook Pro.
Not sure where your experience originates. It obviously also depends on what exactly you do with your computer. The Mid 2015 MacBook Pro to the Mid 2018 MacBook Pro got a 50% improvement for compute workloads according to Geekbench [1].
I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.
Not sure what marginally faster hardware means exactly for you, but for us it's definitely been significant, not marginal.
YMMV, but once you do the math (saving 10 minutes/day at $200/h over 200 days is more than $6,000 per year), it becomes pretty hard to economically argue against investing in faster tooling of some sort.
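For what it's worth, the arithmetic holds up; here's a quick sketch in Python using the same assumed figures (the $200/h rate and 200 working days come from the comment above, not from any real billing data):

```python
# Back-of-the-envelope value of saving 10 minutes/day,
# using the comment's assumed figures (not real billing data).
minutes_saved_per_day = 10
hourly_rate = 200          # assumed fully loaded $/hour
work_days_per_year = 200

yearly_value = (minutes_saved_per_day / 60) * hourly_rate * work_days_per_year
print(f"${yearly_value:,.0f} per year")  # roughly $6,667
```

Tweak the rate and day count to match your own shop; the conclusion survives fairly large changes to either.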
If you're dealing with something intensive then upgrading makes a lot of sense. If you've got large compilation times in your pipeline or if you're doing machine learning and need to throw loads of hardware at a problem I totally get that. I'm sure there are plenty of other situations that justify this too.
But if you're like me where most of that happens off on the build machines there is very little impact in upgrading your hardware.
A 50% improvement on compute workloads probably wouldn't be noticeable on the setup I run. Outside of compiling I don't think I push a single core much above 30%.
I guess it really comes down to what you're doing.
> Mid 2015 Macbook Pro to Mid 2018 Macbook Pro got a 50% improvement for compute workloads according to Geekbench
If this is enough to make a huge difference, then you should be running the workload on a low end (or better) server instead of a MacBook. You'll get much more performance for a fraction of the cost and won't have to pay for the things you don't need, like a new battery, screen, etc. that come attached to the CPU you need.
> I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.
Have you exhausted all other avenues there? Do you have distributed builds? Is everything componentized? Do you compile to RAM disks?
For that matter, why a macbook? Why not a high end gaming laptop with more CPU, RAM, GPU resources?
This conversation reminds me of that conversation from a day or two ago, when a writer argued that, if you're trying to hire senior engineers, you need to woo them, rather than expecting them to woo you.
In high-end job markets, hardware is so cheap compared to salary that there is no reason not to get the top stuff. If you don't, that may impinge on productivity, and in addition it will send a very negative signal.
Honestly this doesn't just go for high-end jobs. If you're a coffee shop and refuse to shell out for some nice stools for employees to take a rest on then you're just not using math.
All labour in the US costs far more than a decent chair or a nice ergonomic keyboard and mouse, a hands free headset... All of these things are peanuts compared to the costs of employing someone.
Ah, I’m sure that’s because it would be hard to fit in the office, not because of the price.
It’s actually cheap enough that you could buy it yourself if you wanted to. It certainly went into my bookmarks, just in case I ever actually need a literal battlestation.
Yup. I really appreciate working a place where my employer understands this. Any tools, equipment, technicians we need to bring in...anything...we just say and we get it. The difference between this and my last job where we used equipment until it broke down...and then had to keep using it...where even a pair of earplugs seemed to be a source of stress...it's like night and day.
I honestly don't think I could work for a company that cheaps out on equipment and supplies anymore. The difference it makes every day to the quality of not only the work I do, but the working conditions for me and the people I work with is worth it.
I seriously think there are power games being played by employers who give their labor terrible working conditions when a marginal improvement would yield such a tremendous morale boost. Perhaps, out of a feeling of impotence elsewhere, such an employer feels satisfied in seeing the toil of someone subservient.
And yet, people insist on using heavy, slow development tools and SDKs, which make much more of an impact on productivity and iteration times than a 2 year difference in hardware.
> Usually it costs more than 10 minutes a day too.
Anything that slows you down or irritates you eats up more minutes than just the raw time you're waiting too.
Say something takes 1 second to compile vs 6 seconds on slower hardware, and you're having to recompile ten times to fix a bug. The raw waiting time might only be 10 seconds vs 60 seconds, but that extra pause of having to wait every time to see if you've fixed the bug might annoy you enough to drain significantly more productivity out of you.
You're best keeping your developers happy so they enjoy plowing through the work instead of hating it.
The OP mentioned a two year old computer. Do things compile 6 times slower than on today's computers? I'd guess the old machine is at most 1/6th slower, i.e. today's would take about 5/6ths of the time...
It's a hypothetical example to illustrate the point that "anything that slows you down or irritates you eats up more minutes than just the raw time you're waiting".
For the sake of a few thousand dollars for a laptop that can be used for several years, even a 1% increase in efficiency is very likely worth it.
If you think developers churn out code at maximum productivity throughout the day and a 10m improvement in compile times per day will reap you benefits in a year, you're sorely mistaken. Unless that improvement lowers the compile time below a certain threshold where the dev will say "fuck it, I'll read some HN while it builds" you're probably not gaining anything.
Ah but then that dev comes across an insightful article, that changes how they think about programming, and becomes a better programmer.
What we should really be doing is spending all our time 'researching' on the internet. Increasing our value to our companies. That's what I tell my boss anyway.
The original argument I read was that slowdowns or waits past a certain time affect mental flow. The flow is basically you being in the zone coding optimally. Interruptions like phones can jolt a person out of flow. So can long wait times.
If a person is in flow, you want them cranking out as much output as they can with no slowdowns. That was an advantage of Smalltalk and Lisp. Any language with a REPL, or an IDE supporting something like one, will let you do this.
If you are using Node as the original poster mentioned, what is a slightly faster computer doing for you? You’re not spending time waiting on a compiler.
Exactly. They cannot break out the "my code's compiling" argument. Lost morale over a 2 year old laptop? Wow, they should work in education. Try a 10 year old laptop and no hope of ever buying a new "high end" machine. Try a shiny new $500 Dell. I think some new programmers are a bit out of touch on hardware...
Yes, because developers whining about having to use a two year old laptop while their company is trying to save money seem more like a bunch of entitled brats.
In the end, what matters is that you keep your employees happy. Whether or not they are entitled brats (they are...) is irrelevant if you need those employees.
If you're bringing in 3x your cost (as you should as an employee), the cost of a laptop is practically nil for the company. Nickel-and-diming you is worse for morale and retention.
Anyone who will quit a job for something so minute as having a two year old laptop is someone that is probably so immature that they couldn’t handle the normal rigors that come from being a professional.
You may think so. When you see certain coworkers get new machines every 2 years but you don't, for example, it builds a power structure where there wasn't one before. If you feel less valued as a result, I don't think that means you can't handle the "normal rigors" of being a professional. It means that if you can find somewhere else where you feel more valued, then more power to you.
Again if you can’t handle the “power structure” of not getting a laptop every two years, you will never be able to handle navigating life in corporate America.
I'd say anyone that drops the money on a 4 year CS education of any rigor, survives it, survives and succeeds at the fairly rigorous interview cycles required to get a job in SWE these days... is absolutely entitled to requesting top of the line hardware to perform their duties.
I'd say your argument makes it "seem" like it's coming from envious, curmudgeonly luddites.
If individuals of other vocations feel that these developers are "entitled brats", perhaps they should switch careers? I can't imagine a teacher went in to education seriously expecting to get provisioned the latest rMBP as a perk?
Are we assuming all jobs are equally dependent on high-end hardware to provide the best RoI for time spent?
Teaching yourself to program is not rocket science either. I was writing 65C02 and x86 assembly in 6th grade in the 80s. I got a C.S. degree because that's what I was supposed to do.
It's not "another field", it's literally almost every field other than computing, as outside tech people either don't have money for top hardware, or don't have interest in buying it.
In so far as it makes developer more empathetic towards users, it's a good point to make.
If you were paid at your job the same as in tech, and you knew the direct value you add, then you'd understand that the cost of the tools you prefer is a literal drop in the bucket and not worth serious discussion.
Wow seeing that I’ve been “in tech” developing professionally for over 20 years and a hobbyist programmer in assembly since 12, I think I’ve earned my geek cred...
It's actually you that is out of touch. I could reiterate the arguments already well expressed but perhaps I should just suggest you read the entire rest of the thread.
A teacher presumably spends most hours actually interacting with students and uses the computer for preparation and paperwork, something your 10 year old Dell is probably well suited for during the minority of their time spent on it.
Your dev spends most of their time on their machine and even if they don't have to build software in a compiled language may still be running a heavy development environment, several browsers to test the result, virtual machines, etc etc etc.
To drive the point home lets consider the cost of a 5% decrease in productivity due to using an inferior machine.
If a teacher is contracted to work 185 days, or 37 work weeks, in a year and earns $60k, the teacher earns about $32/hour at 50-hour weeks.
If the teacher spends 10 hours per week on the computer, the cost is no more than $32 * 37 weeks * 10 hours * 0.05 = $592.
If your software developer earns $100k over 50 weeks and works 50 hours per week, almost all of it at the computer, then the cost is $40 per hour * 40 hours on the computer * 50 weeks * 0.05 = $4,000.
This doesn't account for the actual costs incurred by having to hire more developers because management is too incompetent to retain them by buying them nice tools.
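The comparison above can be sketched in a few lines of Python (all salaries, hours, and the 5% loss figure are the assumptions from this comment, not measured data):

```python
# Yearly cost of a 5% productivity loss from an inferior machine,
# using this comment's assumed figures (not measured data).
def yearly_cost(hourly_rate, computer_hours_per_week, weeks, loss=0.05):
    return hourly_rate * computer_hours_per_week * weeks * loss

# Teacher: ~$32/h ($60k over 37 weeks at 50 h/week), ~10 h/week on the computer.
teacher = yearly_cost(32, 10, 37)
# Developer: $40/h ($100k over 50 weeks at 50 h/week), ~40 h/week on the computer.
dev = yearly_cost(40, 40, 50)

print(f"teacher: ${teacher:,.0f}, dev: ${dev:,.0f}")  # teacher: $592, dev: $4,000
```

The gap comes almost entirely from hours spent at the machine, which is the whole point of the teacher comparison.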
Classrooms often use interactive whiteboards driven by PCs that can be quite slow to log in to the teacher's active directory profile as this involves pulling data over a network of variable speed. There can also be issues with software upgrades deciding to start at random times. The PC will need to be logged off and logged on 8 times a day...
Teachers don't talk to themselves in classrooms, so the time loss can affect a small percentage of up to 180 student-hours per day, or 900 student-hours per teaching week, per teacher. A typical '8 form entry' secondary school in the UK will have around 100 teachers plus admin / heads of subject / 'leadership'. The school year is around 38 weeks.
I sometimes think something like ChromeOS, but able to run interactive whiteboard software and just stay booted, would be better. An appliance.
You need to factor in the time it would take for teachers to move all of their resources over to another format. Some teachers have decades worth of work that they teach with.
I do not and have never used Vim, if I started at your company would you prefer to have me waste three or four days learning it or just pay an extra six hundred dollars on hardware?
Just so you know: your answer probably differs from the one I'd get from whoever does the accounting at your company, and their answer is the right one.
If you're a proficient IDE user, learning and setting up Vim to a level comparable to a top-notch IDE (Visual Studio, IntelliJ) would take more than three or four days. Three or four weeks (or even months) sounds more realistic in order to get efficient.
Don't let anyone tell you otherwise, I'm a Vim user and the people that say it will take just days or a few weeks have just forgotten how long it took them to ramp up. Or they're fooling themselves that their Vim does everything a capable IDE does on strong hardware.
I agree, but I deliberately wanted to underestimate, because three or four days is already enough cost-wise and hard to argue with. I think two weeks is on the optimistic side; real proficiency would be a slow process and likely wouldn't pay off for about a quarter, though you'd be productive in other ways during that time.
At a previous company, I never told the new college grads that there were options other than vi, so unless they were enterprising enough to figure out alternatives by themselves (this was before the widespread availability of Linux and the web), they were forced to learn it.
And then I'd tell them about alternatives after they were proficient with vi. Not one ever switched away from vi.
Did that cost us in initial productivity? Probably. But it's such a minor thing when it comes to NCGs.
Good thing I'm not in charge of these things at my company; I'd probably look for people who use hardware properly rather than just relying on Apple to make their POS GUI IDEs run... Most of my colleagues run IDEs, but not the ones who actually get stuff done.
I generally use nano for quick and dirty things, but prefer to push/pull changes. The environment I'm working in doesn't necessitate editing local files on a remote very often.
For me, my ~4 year old MacBook Air (Yeah, not really a top end dev machine) has started to struggle with more than 15 or 20 browser tabs open. I regularly spend _way_ more time researching than compiling, so it's starting to annoy me enough to think about pushing for an upgrade.
(I put the blame half on the browser vendors, and half on modern cloud web apps. My tabs usually include Gmail, Jira, Confluence, Trello, and Slack. Even doing nothing, between them they'll sometime have the fans spinning...)
Bundling + dev servers place a heavy load. I'm actually looking into a way I can avoid using bundling for production but still get some kind of "hot JavaScript file reload" in the browser.
As a counterpoint: tests are your application, and thus should run at the exact specs of your average consumer. And you don't get to compensate for the test suite itself either, unless you're 100% sure the average customer is only going to be running your app.
This is also a good argument for running tests on a separate dedicated machine
I like running tests on a separate dedicated machine and enjoy a build environment that verifies each commit with a full test suite... but being able to run unit tests for a file on each save of a file is something that can save you some time pretty trivially.
I also disagree strongly about needing to run tests on the exact specs of your average consumer. Most of us aren't writing software for a small set of hardware configurations, so determining those average specs is likely not possible. But if you're working on an embedded platform or with specialized hardware, I do agree that you absolutely need to run tests on the actual hardware regularly; I'd still argue that those tests should run on a dedicated machine and should be in addition to tests that verify the code is obeying its contract.
> should run at the exact specs of your average consumer.
Kinda expensive, having a set of multiple dedicated servers (or VMs) running on God knows how many dozen cores and hundreds of GB of RAM just to run the tests that my laptop runs fine by itself, all in the name of matching the "exact specs" my stuff is going to run on.
You're describing system testing, which takes place extremely late in the product cycle.
This argument is particularly badly formed. Tests are your application, but you are testing for correctness, not speed. In theory your tests could exercise, as quickly as possible, as many unique operations as a customer could perform in hours of normal use. You would never want this to take hours.
Do you have incremental testing, so that only what's changed is tested? If not, that would be a better investment than new hardware; IME most shops don't have incremental testing.
Almost no present-day software will run acceptably on a 486. Since machines orders of magnitude faster are available at Walmart for $300, the question of what software would is mostly academic.
In 1999 we were 'upgrading' to Evergreen 586s in K12 for a lucky few boys and girls. Good times and a lot of nothing going on. PIO4 and PATA with PCI buses clocked at 33 MHz.
Can you imagine really trying to run 20+ year old tech in developer space? 8-10 MB/s throughput and 70 bogomips?
I think it's easy for a lot of people to criticize your mention of morale, and many have, but the cost ratio of tools to labour is pretty extreme. Getting a new laptop every week is stupid, but if your hardware is on the fritz your company should be willing to replace it pronto; having downtime without a machine while yours is in the shop is a reckless waste of company resources.
My favorite example here is from an old CS job my wife had in connection with HP. This shop was such a grind house that they refused to buy a new chair that would be compatible with her back issues. Refusing to buy an employee a $120 chair at a loss of perhaps 10-20% of their productivity just doesn't make sense mathematically; ditto for any company that has people working on computers but refuses to shell out for good keyboards and mice. These pieces of hardware are so cheap that having a lengthy discussion about why you're not going to get them probably costs your company more money than just snapping them up.
HP somehow became the epitome of crappy enterprise bean counting. They lost all the talent that had any options available, and the results are readily visible.
I'm just never quite sure the gain in productivity and happy developers outweighs the cost of shipping a software product that requires hardware that costs 4-digit US$ figures to run smoothly (which seems to be the case for most everything produced by startups nowadays).
I for my part would prefer if, for instance, the Slack developers were confined to machines on which their current product runs as badly as it does on mine, even if they feel so miserable and underappreciated as a consequence that their uplifting have-a-nice-day loading messages get replaced by passive-aggressive snark or something.
I agree with the first point: wasted time quickly stacks up with old/cheap tools. I don't buy your second argument about morale. The vast majority of professionals don't get to pick their tools; they are handed work and tools based on what is available and cost-effective. If these professionals' flexibility does not extend to programming on a 2 year old laptop vs a brand new one, wow, that's weak grit. If your employees' morale is destroyed by not working on the latest gadget, why is that? What else about the company is so lacking that their morale has sunk this far?
As a professional I know what is cost effective for me better than anyone else. For example I know that I'm always bumping up against my 16 gig memory limit, but I have plenty of CPU. I know that I get less eye fatigue on a 4k monitor - so I picked a smaller 4k monitor that was cheaper than the monster lower resolution wrap around thing my colleague prefers.
I know that I'll be significantly more cost effective with 32 gigs of ram because I don't need to spend time killing processes and rebooting VMs after half a day of work.
I know what keyboard and mouse is still comfortable for me after working 8 hours, etc.
I know I'll be more productive on a macbook not because I'm an apple fan boy. I hate apple because they've done more to kill open source than any other company - even MS. I'm a linux fan boy. But I need to use various tools that don't run on linux. I could "tough it out" on a cheaper windows machine, but it wouldn't be cost effective. I would be less productive.
A professional knows what is cost effective and spends their hardware budget wisely to optimize their productivity. They don't rip off the company for the latest shiny gadget. It is silly to trust a dev to essentially run a multi-million dollar company, but not to pick out a computer.
Do you make your employees only listen to one genre of music while working? Do you only allow specific food for lunch? Why do you just care about this specific point in how they work?
> 10 minutes of lost productivity a day due to inadequate hardware means you are paying at least $4k per year per dev for that shit pile of a laptop.
This only holds true if your developers are 100% efficient, programming every second the machine is running. But let's face it: the first hour of the day is most certainly not the most productive (10 minute boot? Fine, I'll make coffee meanwhile). You could easily schedule a meeting, like a stand-up, while the machines fire up, if those 10 minutes were really lost.
Mathematically flawed. There is no reason to suspect you can subtract the time spent waiting for the computer from time that was already wasted; in reality, inefficiency from poor hardware is distributed throughout the day, including productive periods.
You would actually multiply a percentage of inefficiency by hours worked.
Also, honestly, human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity, because intellectual work is fundamentally dissimilar from assembling widgets or managerial work.
Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.
> Also, honestly, human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity, because intellectual work is fundamentally dissimilar from assembling widgets or managerial work.
Yup, I'm talking about the beginning of the work day. Nothing productive would be interrupted.
> Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.
I have no idea how you reached your conclusions about me from reading my comment, but thanks for judging. Regarding this piece by PG: did you read it? Because it actually supports my claim, to schedule a meeting at the beginning of the day, while the machines boot, in order to save the precious time between arriving and the start of real work.
To reiterate: the prior poster claimed that 10 minutes of lost productivity could cost more than the dev's desired computer.
You said that this calculation is erroneous because the developer could easily make coffee or have a meeting while his machine boots up and thereby recover that lost time.
This is a very puzzling suggestion. Slow machines aren't merely slow to start; they are slow to complete user operations while the user sits at the machine awaiting the result. The time cost is the sum of 1000 small delays throughout the day. You can't productively fill the extra 30 seconds you spent waiting, 20 times a day, with a meeting, for example.
In fact acceptable hardware/software wakes from sleep in a second or cold boots in 30-90 seconds. Boot up isn't really the problem.
You said
>Because it actually supports my claim, to schedule a meeting in the beginning of the day
What the actual article says
>Several times a week I set aside a chunk of time to meet founders we've funded. These chunks of time are at the end of my working day, and I wrote a signup program that ensures all the appointments within a given set of office hours are clustered at the end. Because they come at the end of my day these meetings are never an interruption.
PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.
> PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.
I suggest you understand it instead of mindlessly quoting it. It's clear PG wants no interruptions during productive work time, but if a meeting is scheduled before productive work time begins, nothing gets interrupted.
It's less clear how you can deal with a slow computer by making coffee or holding a meeting. Did you think slow computers take 10 minutes to boot up but run real fast after?
Read it; almost everything written there applies mostly to the IT industry. It does NOT operate anywhere near like that in any warehouse, full-bore semiconductor manufacturing plant, or even fast food job I've ever had.
What may apply to one person or industry absolutely does not apply to them all.
And to boot - I'm the GM of a very, very large solar company.
Plus the opportunity cost when developers talk, and the good ones decide between a job with servers or VMs in the cloud and a MacBook Pro to use them, or a shitty, out of date, slower than Christmas Windows desktop rendered on a VDI appliance connected to a Citrix server in the next state. I mean... hypothetically, of course.
I hate these kinds of Taylorist arguments. A dev's limiting factor is always energy, not time. And I cannot imagine any good devs I know truly caring that they aren't on the latest and greatest. I guess I just wouldn't work somewhere people care about that kind of shit, so maybe I'm biased.
> dev's limiting factor is always energy, not time
If this is true, I would expect that investing in hardware that most effectively gets out of a dev's way to have an even higher return on investment than is suggested by time and productivity arguments. The emotional toll of dealing with the dev equivalent of paper cuts should not be under-appreciated.