
I think the comparison to giving change is a good one, especially given how frequently the LLM hype crowd uses the fictitious "calculator in your pocket" story. I've been in the exact situation you've described, long before LLMs came out, and cashiers have had calculators in front of them for longer than we've had smartphones.

I'll add another analogy. I tell people when I tip I "round off to the nearest dollar, move the decimal place (10%), and multiply by 2" (generating a tip that will be in the ballpark of 18%), and am always told "that's too complicated". It's a 3-step process where the hardest thing is multiplying a number by 2 (and usually a 2-digit number...). It's always struck me as odd that the response is "this is too complicated" rather than treating it as a nice tip (pun intended) for figuring out how much to tip quickly and with essentially zero thinking. If any of those three steps seems difficult to you, then your math skills are below those of an elementary schooler.
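
If it helps, here's the whole thing spelled out as a minimal Python sketch (the function name and the example bill are mine, purely for illustration):

    def quick_tip(bill):
        rounded = round(bill)        # step 1: round off to the nearest dollar
        ten_percent = rounded / 10   # step 2: move the decimal place
        return ten_percent * 2       # step 3: multiply by 2

    quick_tip(47.60)  # rounds to 48 -> 4.8 -> 9.6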

I also see a problem with how we look at math and coding. I hear "abstraction is bad" so often, yet that is all coding (and math) is. It is fundamentally abstraction. The ability to abstract is what makes humans human. All creatures abstract; it is a necessary component of intelligence, but humans certainly have a unique capacity for it. Abstraction is no doubt hard, but when in life was anything worth doing easy? I think we are unfortunately willing to put significantly more effort into justifying our laziness than into not being lazy. My fear is that we will abdicate doing worthwhile things because they are hard. It's something people do every day. So many people love to outsource their thinking, be it to a calculator, Google, "the algorithm", their favorite political pundit, religion, or anything else. Anything to abdicate responsibility. Anything to abdicate effort.

So I think AI is going to be no different from calculators, as you suggest. It can be a great tool that helps people do so much, but it will far more commonly be used to outsource thinking, even by many people considered intelligent. Skills atrophy. It's as simple as that.





I briefly taught a beginner CS course over a decade ago, and at the time it was already surprising and disappointing how many of my students would reach for a calculator to do single-digit arithmetic, something we were required to commit to memory when I was still in school. Not surprisingly, teaching them binary and hex was extremely frustrating.
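
For what it's worth, the arithmetic those conversions actually require is exactly the kind they were punting to the calculator; a small illustrative sketch (the particular values are arbitrary):

    # Converting binary or hex to decimal is nothing but single-digit
    # multiplication and addition of place values:
    assert 0b1011 == 8 + 2 + 1 == 11
    assert 0x2F == 2 * 16 + 15 == 47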

> I tell people when I tip I "round off to the nearest dollar, move the decimal place (10%), and multiply by 2" (generating a tip that will be in the ballpark of 18%), and am always told "that's too complicated".

I would tell others to "shift right once, then divide by 2 and add" for 15%, and get the same response.
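
Spelled out as a quick sketch (the function name and example amount are mine):

    def tip_15(bill):
        ten = bill / 10       # "shift right once": move the decimal point for 10%
        return ten + ten / 2  # divide by 2 and add: 10% + 5% = 15%

    tip_15(40.00)  # 4.0 + 2.0 = 6.0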

However, I'm not so sure I see the problem with people saying that abstraction is bad. Yes, abstraction is bad, because it is a way to hide and obscure the actual details, and one could argue that such dependence on opaque things, just like a calculator or AI, is the actual problem.


> shift right once, then divide by 2

So, shift right twice? ;)


I think asking people to convert to binary might be a bit too much lol

  > Yes, abstraction is bad
Code (and math) is abstraction.

No ifs, ands, or buts about it.

I'm sorry, but I think you are teaching people the wrong thing if you make the blanket statement that "abstraction is bad". You are throwing the baby out with the bathwater. You can "over abstract", and that certainly is not good, but it's not easy to define because it is extremely problem dependent. With these absurd blanket statements you just push code quality and performance down.

Over abstraction is bad either because the result is too difficult to read or because it de-optimizes programs. "Too difficult to read or maintain" is ultimately a skill issue: we shouldn't let the juniors decide that, but neither should we have abstraction that only wizards can maintain. Both are errors.

But abstraction can also greatly increase readability and help maintain code. It's the reason we use functions. It's the reason we use OOP. It helps optimize code, it can reduce how much we have to write, and it does many other beneficial things.
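
A toy example of what I mean, in Python since it already came up (the names and numbers are made up, just to show the shape of it):

    price = 100.0

    # Without the abstraction, the intent is buried in the arithmetic:
    total = price + price * 0.0825 + max(5.0, 0.1 * price)

    # With it, each piece of the calculation has a name you can read,
    # reuse, and test on its own:
    def sales_tax(price, rate=0.0825):
        return price * rate

    def shipping(price, minimum=5.0, rate=0.1):
        return max(minimum, rate * price)

    def order_total(price):
        return price + sales_tax(price) + shipping(price)

    assert order_total(price) == total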

Lumping everything together is just harmful.

Saying abstraction is bad is no different than saying "Python is bad", or that any duck typing language (including C++'s auto) is bad, because you're using abstract data types. The "higher level" the language, the more abstract it is.

Saying abstraction is bad is no different than saying templates are bad.

Saying abstraction is bad is no different than saying object oriented programming is bad.

Saying abstraction is bad is saying coding is bad.

I'm sorry, literally everything we do is abstraction. Conflating "over abstraction" with "abstraction" is just as grave an error as the misrepresentation of Knuth's "premature optimization is the root of all evil." Dude said "grab a fucking profiler" and everyone heard "don't waste time making things work better".

If you want to minimize abstraction then you can go write machine code. Anything short of that has abstracted away many actions and operations. I'll admire your skill, but this is a path I will never follow nor recommend. Abstraction is necessary, and our ability to abstract is foundational to making code work at all.

*I will die on this hill*

  > because it is a way to hide and obscure the actual details
That's not abstraction, that's obfuscation. Do not conflate these things.

  > one could argue that such dependence on opaque things, just like a calculator or AI, is the actual problem.
I'll let Dijkstra answer this: https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667...


