For Donald Knuth, good coding is synonymous with beautiful expression (quantamagazine.org)
669 points by theafh on April 16, 2020 | 225 comments



People often say "TAOCP is a dense, technical book, you don't just sit down and read it cover to cover." It is dense and it is technical, but it's also a joy to sit down and read and this is why. Knuth doesn't present an algorithm and annotate it with (Foobar 1968) and that's that. He says "Bazquux came up with an early version of this algorithm in 1962 while working on missile trajectories for the Department of Defense, then Foobar came up with the linear-time optimization that we all use today." Stuff like that.


I feel like this is often a distinction between engineering and science books.

I'm used to reading books on physics where the authors (even the Russians) give a bit of background and (usually made-up) history, but when I go to read a book on something like Finite Element Analysis (for engineers), it feels like the author is just dumping his life's work into LaTeX without any thought beyond A->B.

Don't get me started on pages and pages of MATLAB or Fortran code without any indentation, comments or variable names longer than two characters. (This is why I believe writing algorithms in real programming languages is playing with fire in textbooks).


Matlab by itself isn't the problem, it's just that a lot of professors write pretty terrible code in their textbooks. Python won't fix that. None of the mathematics, physics, or engineering texts had much history outside of some occasional excerpts. I'd like to see more of that though as it helps things click for me.


I think MATLAB is a little bit of a problem. Or at least, I want to kill myself whenever I have to use it. Python too, although in more subtle ways (I like to abuse the type system; Python is just a hacked-together pile of shit with tolerable syntax).

Physics textbooks tend to get "unique" when you get to topics like Quantum Field Theory. Basic textbooks are usually fairly to the point.

"Advanced Engineering Mathematics" is a good book with a fair amount of history (and written by a mathematician, so no engineering-isms; e.g. the notation is recognizable to other human beings).


With Matlab, you can immediately create a matrix and transpose it, find the inverse, etc. I think it is optimized really well for undergraduate exercises that are really simple (I have two numerical methods textbooks which use Matlab fine). I agree that anything more complex than a couple of functions and a loop is probably better off somewhere else in a lot of cases. With that being said, it has built-in support for GUIs, sparse matrices and other stuff which is a lot harder in C++, so it's easy to see why it is popular for academic and R&D type work. For a student, the commercial cost is only ~$30, or free at a lot of schools, so usually not a barrier.


You seem to not consider a major player in the field: Python. There's probably a reason Andrew Ng switched from Matlab to Python for teaching.


I literally mentioned Python in one of my above comments lol, and it is my daily driver. Still, as an undergrad with zero programming experience, Matlab was very easy to use for the kind of things that undergrads do in engineering and physics (play with matrices, solve differential equations, make charts, etc.). I would agree that machine learning is likely to be easier in Python, but then again, ML students already know how to code, understand OO, and other concepts. Matlab is popular where people are good at math but have no clue what objects are.


These are issues that a good editing process would resolve. Unfortunately the market for advanced engineering textbooks is so small that it's tough to justify paying an experienced editor to work through multiple revisions.


That's true about physics! I wonder if there are other fields whose books are written in such a casual style. Undergrad textbooks in math are also very dry affairs. David MacKay was an exception, but he is a physicist.


I agree that Knuth's exposition is a joy to read, and there are a lot of significant mathematical insights to be had within those volumes. I just wish he could get away from MIX/MMIX. I don't find it very useful or interesting as a pseudocode, and I think it gets in the way of understanding.


On the contrary, I think it's extremely useful to have the algorithms realized in a form that has a cost model free of handwaving. The books are to a great extent focused on computational costs, and high-level languages obscure those costs.

The only exception is that the books only rarely dive into the variable computational costs due to operating on values of different sizes; heapsort, for example, is only O(N log N) time if comparisons take constant time, while, in fact, as key cardinality approaches infinity, key size grows as O(log N). (Which is why O(N) time sorting is possible.) This kind of thing is growing more important as hardware design becomes more accessible; there are probably more people writing VHDL and Verilog now than were writing any kind of software when Volume 1 came out.
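The parenthetical about O(N)-time sorting can be made concrete with a least-significant-digit radix sort. This is my own generic sketch for fixed-width integer keys, not code from the thread or from TAOCP:

```python
def radix_sort(keys, key_bytes):
    """LSD radix sort for non-negative integers that fit in key_bytes bytes.

    Each pass does O(N) work and there are key_bytes passes, so the total
    is O(N * key_bytes) -- linear in the volume of key data, with no
    comparisons at all.
    """
    for shift in range(0, 8 * key_bytes, 8):
        buckets = [[] for _ in range(256)]
        for k in keys:
            buckets[(k >> shift) & 0xFF].append(k)
        # Stable concatenation preserves the order established by earlier passes.
        keys = [k for bucket in buckets for k in bucket]
    return keys
```

The linearity holds only for fixed-width keys, which is exactly the point about key size the comment above is making.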

So I think using a high-level language would get in the way of understanding those costs, and even to some extent how to implement the algorithm in other contexts.

If you just want to implement a red-black tree or something, then sure, the MMIX code isn't going to help you. But he also presents all the algorithms in pseudocode, so you can use that.


There's no reason you can't take into account things like varying key size, memory allocation, etc. in your computational model when it matters. For comparison sorting, the reason it isn't done is that if you're comparing keys of the same length, in the worst case, you have to look at the entirety of both keys. O(n log n) might be misleading, but it's a white lie at best.


There are a lot of very interesting issues there. For example, surely you have to look at the entirety of both keys at least once, yes; but you don't have to do it on every comparison, for example in mergesort where you can store a longest common prefix between pairs of adjacent keys and avoid comparing them byte by byte most of the time, or even storing them. The algorithm is tricky both to implement and to analyze, and I don't think anyone has done it, because there are better options for long keys; but I conjecture that it actually gets mergesort down to a real O(N log N) instead of the O(N log N log N) of the usual mergesorts.
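The prefix-skipping mechanism behind that conjecture is easy to sketch in isolation, even if the full LCP-maintaining merge is, as noted, tricky. A hypothetical helper of my own illustration:

```python
def compare_from(a, b, lcp):
    """Compare strings a and b given that their first `lcp` characters are
    already known to be equal; returns (sign, new_lcp).

    The point: across many comparisons of nearby keys, a shared prefix is
    scanned once instead of once per comparison.
    """
    i = lcp
    n = min(len(a), len(b))
    while i < n and a[i] == b[i]:
        i += 1
    if i < n:
        return (-1 if a[i] < b[i] else 1), i
    # One string is a prefix of the other (or they are equal).
    return ((len(a) > len(b)) - (len(a) < len(b))), i
```

The hard part, maintaining correct `lcp` values for every pair of adjacent keys as runs are merged, is what this sketch deliberately leaves out.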

I suspect that isn't the reason this kind of analysis isn't usually done, though. If you measure the running time of a standard comparison sorting algorithm on real data on almost any general-purpose computer from 1955 to 1985, you will find that, above very small sizes, it is quite precisely proportional to N log N, where N is the number of records. The extra factor of log N doesn't show up. Why is that?

It's because almost all of those computers, even bit-serial designs like the RCA 1802 and the LGP-30, aren't bit-addressable; the smallest amount of data an instruction can manipulate is a character, typically 6-9 bits, but often 32 bits. And almost none of them could manage more than 4 gigabytes of data, because that would have been too expensive. So in practice the average time to compare two separately stored keys does not vary by a factor of 100 or so over the data sets the machine could process, as you would expect from looking at key unique prefix length. It might vary by a factor of 2 to 4, but more often was actually constant. MIX in particular had 5-character memory words, but only 4000 words of memory, so its comparison time for internal sorting is quite precisely constant for any realistic data.

After 1985, caches and massive parallelism complicate the picture a bit, initially for largish machines (though Stretch too had cache, such monsters were exceptions) and finally for almost all types of computers, though perhaps not yet the numerical majority of computers, which are probably mostly 8051s and Cortex-Ms and things like that.

Anyway, back to the issue at hand: assembly language exposes all the computational costs by default, which is what you want if you're going to prove theorems about them, while high-level languages obscure them by default, so more of your theorems will be incorrect. And that's Knuth's primary concern.

That level of single-minded pursuit of excellence is the reason we can learn things from books Knuth wrote in the 1960s that are still useful for programming computers that are literally a billion times bigger than the machines Knuth used as his model. It's also the reason he's not done yet, 55 years later, with what he thought would be a single book, done in a year or two.


>I conjecture that it actually gets mergesort down to a real O(N log N) instead of the O(N log N log N) of the usual mergesorts.

I don't think it does in the worst case. I'd bet money you could construct an adversarial input that would force you to look at more of each key than you'd have to in order to get down to "real" O(n log n).

>MIX in particular had 5-character memory words, but only 4000 words of memory, so its comparison time for internal sorting is quite precisely constant for any realistic data.

If we're being precise here, then it is certainly bounded above by a constant. But, this is also the same sense in which there is literally no such thing as a Turing machine (all physical machines, even one the size of the universe, are linear bounded automata, at best).

> Anyway, back to the issue at hand: assembly language exposes all the computational costs by default, which is what you want if you're going to prove theorems about them, while high-level languages obscure them by default, so more of your theorems will be incorrect. And that's Knuth's primary concern.

Except, no, you don't. Both you and I just got through explaining why.

> That level of single-minded pursuit of excellence is the reason we can learn things from books Knuth wrote in the 1960s that are still useful for programming computers that are literally a billion times bigger than the machines Knuth used as his model. It's also the reason he's not done yet, 55 years later, with what he thought would be a single book, done in a year or two.

No, it most certainly is not. I'm assuming you've read at least some of TAOCP. Knuth certainly introduced some mathematics and techniques for the analysis of algorithms in TAOCP, but literally none of the algorithms themselves were first published there. This is not a research monograph. It's a reference. The algorithms themselves were published and proven, including runtimes, before they ever made it into those pages.

And, yes, there is plenty that almost anyone can learn from these books and the subsequent fascicles, but it's nothing to do with the computational model. Not one result in TAOCP contradicts the previously published work. We can learn from it because there is simply so much there to learn from. There's a reason that when Steve Jobs told Don Knuth "I've read all your books," Knuth responded that Jobs was "full of shit." [0]

---

[0]:https://www.folklore.org/StoryView.py?project=Macintosh&stor...


Only a very tiny part of the books is in MIX/MMIX (by my count from 3 years ago; details here: https://news.ycombinator.com/item?id=14520230) — if you don't care about it just ignore those programs/sections; it doesn't get in the way of anything. (Similarly you can ignore the mathematics if that's not what you want. You'd be skipping past a big chunk of the book in that case, though.)

On the other hand, MMIX is actually a very nice architecture for a programmer to write assembly/machine code for — it has lots of registers, etc. It's more pleasant to write in MMIXAL than x86 Assembly or (I imagine) ARM or even RISC-V. So if you just want that part of the programmer experience, it's a great way to have some understanding at all levels.


Yeah that’s actually what I found intimidating about TAOCP when I was in undergrad. But I stuck with it, and it was a genuinely good read. I feel like SICP is a better introductory text, though.


TAOCP was never intended to be an introductory text, so that's unsurprising.


Totally agree. It doesn't help at all. Pseudocode would have been sufficient. Fake machine code? Nope.


MMIX programs can actually be run, and MMIX has multiple simulators. Actual MMIX chips can be made. MMIX also allows students to reason about the properties of modern chips without experiencing quite the complexity they'd have to deal with normally. This is valuable for many reasons.


Yes, that's part of the tradeoff between (what could theoretically be) actual machine code and pseudocode. The other advantage of a language like MMIX is, as another comment stated, you can't ignore things in the cost model. This allows more precise calculations of what the runtime of these algorithms is.

OTOH, pseudocode is easier to understand, partially because you don't have to think about some of those extra things that don't actually matter in the cost model. For an expository text (and TAOCP is an expository text), I think MMIX comes down on the wrong side of the tradeoffs.


Things like branch prediction, caches, instruction pipelining? Not really. People want the algorithm, not a difficult to translate implementation.


To paraphrase Knuth in what I think was a Dr. Dobbs interview: the book isn't for those people, then; they should make their own book.


> I write an average of five new programs every week. Poets have to write poems. I have to write computer programs.

It seems more obvious now that I read it. A lot of programmers out there, including myself, are looking for ways to improve their skills, and it never occurred to me to simply write more programs. And not just programs for the sake of programs, but programs that force you to demonstrate to the computer that you understand certain concepts. This man is and will continue to be a huge inspiration.


>The ceramics teacher announced on opening day that he was dividing the class into two groups. All those on the left side of the studio, he said, would be graded solely on the quantity of work they produced, all those on the right solely on its quality. His procedure was simple: on the final day of class he would bring in his bathroom scales and weigh the work of the "quantity" group: fifty pounds of pots rated an "A", forty pounds a "B", and so on. Those being graded on "quality", however, needed to produce only one pot - albeit a perfect one - to get an "A". Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the "quantity" group was busily churning out piles of work - and learning from their mistakes - the "quality" group had sat theorizing about perfection, and in the end had little more to show for their efforts than grandiose theories and a pile of dead clay.


Ha! I had the same experience in my art evening classes. One evening our teacher told us that we had to produce three paintings in the 2.5 hour lesson. It is challenging enough producing one in that time period. It was a fantastic learning experience in letting go. The first picture I did was OK, the second was better and the third was great - all because I'd given up thinking at that point and was just doing.

Sometimes I get the same feeling when coding, when really in the zone with the magical feeling of flow.


On the other hand, I took a few semesters of elective photography courses in high school, and there’s a certain essential amount of time it takes to make a good photo print and run it through all the chemicals. It can’t really be compressed down shorter than about 35–40 minutes for the whole process with only limited ability to parallelize the work while maintaining quality, which makes it tough to make more than 1 or possibly 2 good prints in 1 class period.

All of the students I knew who focused on making 1 good print every day ended up learning more and making better images than the students who constantly tried to rush.


Apples to oranges. Painting is a purely creative process, with style being valued over correctness. While there is artistic decision-making during manual film development, there is definitely a wrong way to develop a print. If you rush the chemical process, it will end up bad. Film is more like baking in that regard, while painting is stove-top cooking.


I guess my point is: the important thing to learn here (in my opinion) is that regular deliberate practice with focused attention is helpful, not that people should try to rush out as much sloppy work in as short a time as possible.

Even the highest quantity of practice is not necessarily essential. My impression is that spacing it out over time is even more crucial.

I have a 3.5 year old, and I’ve been watching him learn all sorts of skills (learning to understand and use language, walk, run, jump, ride a balance bike, solve logic puzzles, build with construction toys, draw, ...), and it’s amazing the kinds of leaps he will make in balance, coordination, speed, understanding, etc. at some specific skill over the course of 3 or 4 months, even if we only practice the skill for 20 minutes once every few weeks.

Somehow the brain is churning on it in the background, and there are sudden leaps in ability which can’t be obviously explained based on direct practice time.


The art of a good film photograph isn't printing lots of pieces; it's taking 10 shots and choosing to print two of them. There are parts you can't rush; there are clear steps to taking clear, well-exposed photos. Being able to look at the negative and know if it turned out well is the skill you learn.

The students who focused on one good print had a lot of prep work. Getting the final step right (for software, shipping and deploying to production) isn't something we should rush. Learning all the steps which lead up to that final one well makes a tremendous difference.


Is this maybe then analogous to the essential and accidental complexity premise that Fred Brooks put forwards?

It may be that in painting the cost of accidental complexity can be reduced greatly, but in photography, due to necessary post-processing, there is less of an opportunity.

That doesn't mean there isn't an opportunity here. Being neither a photographer nor a painter, I am most likely not the best person to comment, however!


Resonates with the rise of the USA. A few books about electricity mentioned that Europe was caught in a never-ending debate about ideal theories while the US was applying simple tricks and improving its understanding along the way.

I'm also wildly guilty of analysis paralysis... I did try to set up schedules and goals to iterate, but they always died before I made any real progress.

Oh, and btw, some dude said the Wright bros. won the race to flight because they approached it with a low-cost engineering/lab mindset, in rapid-prototyping fashion. Other companies with more resources tried the moonshot approach, built one so-called plane in two years, failed to fly, ran out of resources, and died.


Reminds me of SpaceX. Every time I see their BF metal tank blow up, I'm like "yasssss progress".

I absolutely believe that SpaceX will land a human on Mars in our lifetime. Not sure how long they’ll survive and whether a colony is possible but they’ll definitely make it way cheaper to send shit to mars/moon than it is today.

Also I’m bullish that we’ll make big breakthroughs in DNA and protein folding/interaction understanding. Building with the same molecular machines of life opens up so many possibilities.


What is this from? Love the general idea.


War of Art. Solid book; picked it up after seeing this passage on HN hehe


Art & Fear? It's also the third-highest-score comment on HN: https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...


yes, it's from Art & Fear.

The Art of War is unrelated, but also from an insightful fella. More focused on adversarial confrontations than Art & Fear, though.


So, I have now put /War of Art/ and /Art and Fear/ in my queue. I also have a physical of /Art of War/ somewhere. I'm assuming that is a different thing, though.


yes, it is. No idea why I read Art of War before :-)


I don't think you were alone in that. I stared at it for a while making sure I had it right. :D


I really disliked that book, I found nothing actionable in it.

It is so often recommended that I read it to the end thinking there would be some jewel waiting for me. There wasn't.


The problem with so-called self-help books is this: they are so obvious for some and totally novel for others.


This does not work for software. No matter how long people have tried, quality programs are very, very rare.


And when they happen, developers decide they need to "further improve" it, destroying everything that was valuable about it in the first place.


I thought that story had been debunked, the experiment never took place.


Quantity has a quality all of its own.


Cool but fake.



I wonder if there has been a proper psychological study on this. It makes logical sense and corresponds with my real-world experience, but that's not how science works.


You should really post a source.


A source that disproves a made up unsourced story is fake?


And here's the text in another book, courtesy of Google Books, on page 264:

https://books.google.com/books?id=TuvOng_Yh6wC&newbks=0&prin...


The story could still be made up, but the source for the quote is this book:

Art & Fear: Observations On the Perils (and Rewards) of Artmaking

by David Bayles and Ted Orland

https://www.amazon.com/Art-Fear-Observations-Rewards-Artmaki...



Prove what? That someone printed the anecdote in a book?


What would suffice, video recordings of every class?


Naming the teacher, the school, the year, the city, something that would give a hint for someone to follow up for verification.


Except for Politicians, Salespeople, and Known Criminals, I tend to take people at their word.

Your approach might be different than mine.


And here's audio of the authors saying that it's a true story:

http://www.innonavi.com/wp-content/uploads/2017/04/David-Bay...


That's a repetition of the same claim, not a source. He doesn't mention simple information like who or where the class was, which there is no reason to keep secret.

Also, it doesn't pass the obvious sniff test: that a teacher would spend a whole semester giving half of his class a terrible experience that utterly failed to teach them anything, leaving them with "a pile of dead clay," as the author claims.


To be fair, there exist sites like snopes.com which do this type of fact-checking. But I couldn't find it there.


If I were graded solely on quantity, I would produce stuff that only just barely passes the bar of being a pot. (If you take a clump of clay and stick your finger in it, you basically have a pot.) Therefore, I think the story is false.

Nice story though.


But then you'd get sick of making such shitty pots, if you liked pottery as an art form enough to want to learn it, and you'd start trying to make better ones, because just passing the bar probably wouldn't be enough for your sense of aesthetics/craftsmanship (or maybe it would, and then you would have learned something as well).


It depends a lot on whether it is a voluntary art class or you are forced to take it. (Or it is the only job you could get, and you are not too interested in it.)


True if all you care about are grades (which are what? A badge?). I'd probably knock out what I needed to pass (make it through the gate) and then focus on improving.


Any metric taken to the extreme is worthless.


Something that I found useful is taking the concept of katas from martial arts: writing the "same" program multiple times, but trying to refine the parts I didn't like in the previous iteration (e.g. try writing it in less code, with smaller functions, with fewer libraries, in a more testable way, etc.).

It's especially rewarding to do this exercise with utility libraries, as it teaches you about the nuances one abstraction level below (both in terms of learning what is possible but not exposed by the abstraction, and in terms of learning what pitfalls exist). Many times I find that it's more elegant to drop down one abstraction level to accomplish something than to add more abstractions on top due to a lack of understanding outside my primary abstraction level.


The concept of code katas is well established. I learned it from The Pragmatic Programmer [1]

[1] https://en.wikipedia.org/wiki/Kata_(programming)


I think this is a really high value endeavor and efficient use of time. It's the same way great writers learn how to write and produce great works.


What does it mean though? What programs? I can't write new programs at work where I maintain existing applications for the most part.

What kind of programs, not for the sake of programs, can you actually write almost daily?


I think what he has in mind are small programs that do some kind of nontrivial computation without much in the way of IO or UI. For example, [1] is a program from December 2019 that "finds n-bit binary squares that are palindromic".

A great way to get into this kind of thing if you're stuck for ideas is to go through the Online Encyclopedia of Integer Sequences [2] and just write little programs to print out the first n entries or whatever for a particular sequence. As other comments have suggested, don't worry about making them perfect, just get the numbers right and then move on to the next one.

[1] https://www-cs-faculty.stanford.edu/~knuth/programs/squarepa...

[2] https://oeis.org/


> is a program from December 2019 that "finds n-bit binary squares that are palindromic".

No disrespect to Donald Knuth (whom I of course consider orders and orders of magnitude above programmers such as myself) and no disrespect to people who find those types of programs interesting, but there are also programmers (like me) for whom those programs are just simple riddles, with almost no interest whatsoever.

They're like sudoku or crosswords: interesting and somehow intellectually stimulating, but they don't help shape/change/modify the world around us. And I got hooked on programming when I realised how it could help me change the world around me and, most of all, how it could help me (and others) in trying to make sense of said world that surrounds us.


Well, yes, of course.

The question was about the sort of little programs someone could write daily, and the answer resembles crosswords or sudoku, which many people are in the habit of completing once a day.

Would completing one integer sequence algorithm (brilliant suggestion btw!) change the world around you? No, of course not.

Would it make you a better programmer, though? Certainly it would. Which would mean, when you settle down, crack your knuckles, and start changing the world, you'd do a better job of it.

Your objection strikes me as that of a hunter who won't go to the range, because when he fires his rifle, he wants to put meat on the table!


> Would completing one integer sequence algorithm (brilliant suggestion btw!) change the world around you? No, of course not.

Depends on the sequence, of course. I hear a lot of people would be interested in a fast algorithm for computing membership in OEIS A000040.


> They're like sudoku or crosswords, interesting and somehow intellectually stimulating but they don't help shape/change/modify the world around us.

I don't think this is accurate: sudoku and crosswords have deterministic, known solutions. Solving them is a mostly mechanical feat and indeed somewhat useless beyond the joy of solving.

In contrast, the challenge of creating these sorts of programs is creative and open ended. It is very well possible that someone will come up with an even more efficient algorithm. Or a much simpler one. Or just a completely new strategy.

As to the usefulness, these little toy programs are a constant source of inspiration for solving actual real-world problems. A perfect example of this is Algorithm X, one of Knuth's toys that solves exact-cover problems such as Sudoku puzzles. The same algorithm is easily adapted to solve scheduling problems and actually is a major innovation.
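Algorithm X itself fits in a page. The sketch below uses the well-known dict-of-sets formulation rather than the dancing-links data structure Knuth actually pairs it with, so it shows the search strategy, not the clever memory management:

```python
def solve_exact_cover(X, Y, solution=None):
    """Knuth's Algorithm X over dicts of sets.

    X maps each constraint to the set of rows satisfying it;
    Y maps each row to the list of constraints it satisfies.
    Yields lists of rows that cover every constraint exactly once.
    """
    if solution is None:
        solution = []
    if not X:
        yield list(solution)
        return
    c = min(X, key=lambda col: len(X[col]))   # most-constrained column first
    for r in list(X[c]):
        solution.append(r)
        removed = _select(X, Y, r)
        yield from solve_exact_cover(X, Y, solution)
        _deselect(X, Y, r, removed)
        solution.pop()

def _select(X, Y, r):
    # Remove every constraint row r satisfies, and every row that
    # conflicts with r; remember what was removed for backtracking.
    removed = []
    for j in Y[r]:
        for i in X[j]:
            for k in Y[i]:
                if k != j:
                    X[k].remove(i)
        removed.append(X.pop(j))
    return removed

def _deselect(X, Y, r, removed):
    # Undo _select in exactly the reverse order.
    for j in reversed(Y[r]):
        X[j] = removed.pop()
        for i in X[j]:
            for k in Y[i]:
                if k != j:
                    X[k].add(i)
```

Sudoku reduces to this by making one row per (cell, digit) choice and one constraint for each cell, row, column, and box requirement; scheduling problems reduce in much the same way, which is the adaptability the comment describes.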


No one was holding their breath for a program that finds n-bit binary squares that are palindromic, but this kind of mucking about can be useful as exercise and as a source of new insights, both of which will aid the programmer in achieving more ambitious goals.

The musician doesn't noodle around randomly or repeat uninteresting exercises with the ambition that that in itself is going to constitute a meaningful and memorable work.


I think others have already done a good job of describing how programs like that help make you a better programmer, and can be inspiring.

I'd like to add that something I find interesting and inspiring about programming is managing complexity. That's not as easily explored with programs like the above. But, for example, you could try to create as much of a spreadsheet program as possible in a few hours. Do that 4 times, starting from scratch. How can you organise the code in such a way as to make it easy to read, performant, maintainable, and fast to write? What language features will you take advantage of? Maybe try making a part of a word processor, an IDE, a raytracer, an HTML viewer, an FTS engine, a relational database, etc. Try implementing MVC, or reactive, or two-way-binding UIs. I would consider these a different genre of poem, one that lets you explore a different problem space. Maybe this is the genre that you're more interested in?


Check out his latest fascicles on applications of zero-suppressed binary decision diagrams to all sorts of problems, from beautiful tiling problems to the travelling salesman.

His latest fascicles are bleeding edge when it comes to a general algorithm that solves a wide variety of problem instances very efficiently (much like his Algorithm X, but superior in almost every way).


When the very adjacent sentence is “Poets have to write poems”, I think this question is like asking “what kinds of poems, not for the sake of poems, can you actually write…”, then ask about how they change the world, etc.

Even poets don't write poems just for the sake of poems — they write because it gives them pleasure, because they are moved by some strong feeling or because they have some irresistible urge to say something, etc. (And sometimes because they're being paid for it, or hope to be.) Anyway, here are some reasons (in no particular order, and not mutually exclusive—there's some overlap) for Knuth to write programs:

1. Because it gives him pleasure.

2. Because he wants to find out the answer to some question — for instance, how many knight's tours are unchanged under a 180° rotation?

3. Because he wants to experiment or gain more experience with some algorithm(s) or method(s) that he's learning or writing about.

4. Because he wants to collect data on their performance, so that as a scientist when he makes a statement like “a program using this algorithm finds the answer for N=13 using only 0.3 billion memory accesses”, it is a correct statement.

5. Because he wants to understand something better (e.g. his program for linear programming where he says he understood the simplex method clearly only after implementing it).

6. Because he wants to "debug" or interactively see what sub-components of some algorithm do.

7. Because he has encountered (or been posed) a hard problem and wants the challenge of solving it.

8. Because he has seen a beautiful program and wants to translate it to his preferred style, as he'd like more people to appreciate its clever ideas. (E.g. his reimplementation of the original "Colossal Cave Adventure" game of Crowther and Woods.)

9. Because he wants to provide this program as an illustration of some idea or algorithm mentioned in his books.

10. Because someone has asked him for the program.

One can easily imagine more reasons. Though at the rate of five a week it would be more than 7500 programs over the last 30 years, he has given a few examples on his webpage: https://cs.stanford.edu/~knuth/programs.html (Many of them are in CWEB so to help people who don't have `cweave` installed, I typeset them in 2017: https://github.com/shreevatsa/knuth-literate-programs/tree/m... — I ought to update the repository sometime.)
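Reason 2 can be made concrete with a small program. This is my own sketch, not one of Knuth's: it enumerates open knight's tours on a toy 5×5 board by plain backtracking (the 8×8 case he would care about needs much cleverer machinery) and counts the tours that are unchanged by a 180° rotation, interpreting "unchanged" as "rotating the tour yields the same tour traversed backwards".

```java
public class SymmetricKnightsTours {
    static final int N = 5;                       // toy board; 8x8 is far too big for naive search
    static final int[] DR = {1, 1, -1, -1, 2, 2, -2, -2};
    static final int[] DC = {2, -2, 2, -2, 1, -1, 1, -1};
    static int total = 0, symmetric = 0;
    static int[] path = new int[N * N];
    static boolean[] visited = new boolean[N * N];

    // Depth-first backtracking over all directed open tours.
    static void search(int r, int c, int depth) {
        path[depth] = r * N + c;
        visited[r * N + c] = true;
        if (depth == N * N - 1) {
            total++;
            if (isSelfSymmetric()) symmetric++;
        } else {
            for (int k = 0; k < 8; k++) {
                int nr = r + DR[k], nc = c + DC[k];
                if (nr >= 0 && nr < N && nc >= 0 && nc < N && !visited[nr * N + nc])
                    search(nr, nc, depth + 1);
            }
        }
        visited[r * N + c] = false;
    }

    // A 180-degree rotation maps square (r,c) to (N-1-r, N-1-c); a directed open
    // tour is fixed by the rotation when rotating it gives the reversed tour.
    static boolean isSelfSymmetric() {
        for (int i = 0; i < N * N; i++) {
            int r = path[i] / N, c = path[i] % N;
            int rotated = (N - 1 - r) * N + (N - 1 - c);
            if (rotated != path[N * N - 1 - i]) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        for (int r = 0; r < N; r++)
            for (int c = 0; c < N; c++)
                search(r, c, 0);
        System.out.println(total + " directed open tours, "
                + symmetric + " unchanged under 180-degree rotation");
    }
}
```

Each undirected tour is counted twice here (once per direction); dividing by two, or fixing a canonical direction, would give the undirected count.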


Well you can write an interpreter or a compiler for a simple language you design.

You can try to make text-based games.

You can make a bare-bones browser.

You can write a simulator for ..say.. classical mechanics.

Are none of these interesting for you?


You can't really write five of those in a week.


500 lines or less, hold my beer.


Is this your way of saying you're a programmer not a computer scientist? Because his job is as a scientist.

Calling his exercises simple riddles is disrespectful. It's like calling doing a 5km run every morning, simple walking. The proof is in the pudding, he has impacted the world in a profound way.

But on a more positive note: which non-CS programmer heroes exist today? I'm thinking John Carmack.


Yeah, of course. I have never thought of myself as a computer scientist; I like my job/title as a programmer, and I don't see any downside in that. Manipulating data is fun.

I don't personally have "programmer heroes", maybe because I'm 39 years old and as such I haven't believed in cowboy-like programmers for almost a decade now. I did strongly believe, though, in what Aaron Swartz used to do and believe in, i.e. an open internet and open data for almost everyone, but that dream died in the late 2000s - early 2010s and I don't think we're ever going to get close to anything like that again.


No one said anything about cowboys? Just shining examples of programmers.

Maybe you are your own hero?


Many people praise Fabrice Bellard. Not sure if he's counted as a non-scientist, but his achievements are astounding.

Although I think that many heroes are hidden behind corporations. Those people who contributed most to Google Search or architected Chrome. Those people who built systems everyone relies on. For example, almost every Java project uses the Spring Framework, and I'm sure it was founded by one or a few very talented people. Examples like that are many, but not very public or exposed.


Fabrice Bellard seems like an awesome guy!

Do you guys think programmers don't recognise each others work as much as in other fields?


I don't really know much about other fields. Do architects recognize those who architected outstanding projects, like huge bridges and tallest skyscrapers?


There were quite a few people on the burnout[1] thread that wanted to change the world.

[1] https://news.ycombinator.com/item?id=22876241


Along these lines, Project Euler is great.


So is the blog "Programming Praxis". Its author has been posting short and interesting programming exercises at least once a week, but often more, for years.


Of course the answer for OP is reading and working through TAOCP; that practice makes you better, which is why Knuth spent his life writing these books. Even if, technically, they are, as Knuth states, just a study of 'alg A compared to alg B', the byproduct is more of the practice OP is looking for.


I feel like solving leetcode problems would fall somewhere in the category of programs you're describing. (They seem like small problem-solving programs.)


You can write "meta" programs that help you maintain existing applications.

Whenever I start at a new company, before I get thrown into the heart of things and become "too busy" to write these kinds of programs, I like to write things that automate the boring stuff.

Some recent examples:

- a cli for inspecting, adding comments to, and updating the status of JIRA tickets, because our hosted JIRA instance was god-awful slow and their web interface is a mess.

- a tool for monitoring things and pinging me with macOS native notifications when things crash or complete.

Was my net productivity positive for writing these tools? hard to tell.

There are some companies full of engineers whose actual job is to write tools like this; I'm not sure what they would do for fun :P


> There are some companies full of engineers whose actual job is to write tools like this; I'm not sure what they would do for fun :P

I think there are a lot of organizations that would be better served devoting more resources to the meta work given the labor and infrastructure costs. I would go out on a limb and say the capital expended on the majority of projects I've worked on in my career had a smaller ROI (because it was negative in most cases) than would have been the case had the organization just invested in improving internal tools and processes of their existing systems.


I totally agree with you. I think that most corporations are in a state of suspended myopia.

We’d rather direct engineers to improve the effectiveness of a single system by 200% (even though it impacts 1% of revenue), than to impact all (or many) systems by 5%.

Yet, most of the impressive software we admire today was built in the environment you prescribe. C and UNIX, famously, but also git, Go, Rust. I imagine many others.

How much has C improved productivity, over Assembly and COBOL? How much has Borg or BigTable?

The issue, IMO, is that no ambitious manager wants to invest the social capital in defending and advocating for these teams, because they don’t look good until they look exceptionally good.


I find lots of little utilities fit the bill. They are oftentimes sufficiently narrow in scope to be of little use to anyone but the author, but that shouldn't necessarily eliminate their use.

For example, here is a Gist with a shell script I wrote that, given a set of Maven coordinates, works with a few other tools to build out a docset for use in Zeal / Dash.

https://gist.github.com/michaeljmcd/5564758537946963e946806e...

I had to dig into some corners of Maven I didn't know well to pull it together, and it formalized some random bits of minutiae.

Nothing elegant or crazy, perhaps, but it is a bit of coding apart from the normal day job that taught me a couple of tricks.


I find myself writing debugging tools all the time. For example, our logs are overwhelming because our developers use them for debugging and then never remove their logging statements. So finding anything in them is next to impossible, and it's often interleaved with a bunch of other junk that's meaningless to you. And fixing the logging system we use to make this work better would also be a huge task.

So a little while ago we needed to log some info to figure out where 2 parallel sets of calculations were going wrong. So I wrote some logging that output the info to a file instead of the main log, and then wrote a simple parser to read them in and display them side-by-side in a nice hierarchical tree structure. It made something that normally takes us a week to work out only take a few hours. It's something we'll never ship to customers, but helps us enormously.

I'm finding myself writing more and more stuff like that these days.


You might also try to get your QA process to require removal of debug print messages, or at the very least an "off" flag as an environment variable or config file element.


We're business application/infrastructure maintainers, not programmers. It's a different world from Knuth's expressions of mathematical ideas.

Outside of work, it's easy to write whole programs. Project Euler, LeetCode, etc. These are the kinds of programs Knuth is talking about -- example programs to illustrate algorithms from TAOCP.


I understand that but those don't fall into the "not for sake of programs" category.

Under that category, I imagine something where the resulting program itself is of any, however minuscule, use to me.


Stack Overflow is a great place to practice writing short programs or even just code snippets. Just look for interesting problems and solve them.


I'm a frequent user of SO, but I find leetcode [1] much better when I just want to practice coding / problem solving. Their free account is more than enough to have some fun ;)

[1] https://leetcode.com/


I disagree. Leetcode and related sites have problem descriptions that are often vague or nonsensical. Just figuring out what the author is trying to say is more of a challenge than the actual program sometimes.


I feel exactly the same, I thought that it's because I'm not a native speaker, at least now I know that I'm not alone.


There are a few utility functions that come to mind. Git hooks, for example.


After Michelangelo died, someone found in his studio a piece of paper on which he had written a note to his apprentice, in the handwriting of his old age: “Draw, Antonio, draw, Antonio, draw and do not waste time.”

(The Writing Life by Annie Dillard)


As obvious as it seems, the advice really does apply to everything. I'm okay starting a bunch of projects and never finishing them, because I'm always learning something. When I do start a new project, I usually try ten or so different programming languages and environments to see which one is a good fit for the project. That said, there's certainly something to be said for finishing projects that you're proud of and getting to know one toolchain very well. A mix of both is good.


Remember that Knuth is/was a professor of mathematics and computer science, so he didn't spend his days coding; for him it was beneficial to do something that complemented the teaching and kept him grounded. What would complement yours? Maybe more mathematics instead? Or design? Or economics?


I have a folder named "proofs" on my machine which is just this. About 1500 and counting little programs that demonstrate or prove that something works some way or another. The class name expresses the proven fact. When I find out I am wrong, I change the name of the class to reflect the correct understanding. I read through the names in my IDE when I need to know some detail about something I've forgotten.


Could you give some examples? This sounds really interesting.


Sure. In Java a class named :

ArraysOfConcreteParameterizedTypesAreDisallowed

The body of this class is a proof of this fact.
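As a concrete illustration (my own sketch, not the parent's actual file), a class with that name might look like this. The commented-out line is the compile-time "proof"; the rest shows at runtime *why* Java imposes the rule: array stores are checked against the reified component type, but generics are erased, so the store check cannot tell a List&lt;Integer&gt; from a List&lt;String&gt;.

```java
import java.util.Arrays;
import java.util.List;

public class ArraysOfConcreteParameterizedTypesAreDisallowed {
    public static void main(String[] args) {
        // The direct form is rejected by the compiler:
        //     List<String>[] lists = new List<String>[2];  // error: generic array creation

        // The unchecked workaround compiles (List<?> is reifiable), which lets
        // us demonstrate the heap pollution the rule exists to prevent.
        @SuppressWarnings("unchecked")
        List<String>[] lists = (List<String>[]) new List<?>[2];

        Object[] objects = lists;            // legal: arrays are covariant
        objects[0] = Arrays.asList(42);      // passes the runtime store check!

        try {
            String s = lists[0].get(0);      // compiler-inserted cast to String fails here
            System.out.println("unreachable: " + s);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as the class name promises");
        }
    }
}
```

Running it prints the catch-branch message: the bad store succeeds silently, and the error only surfaces later, far from the code that caused it.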


I misunderstood how to use git before college and used to commit everything to master. On GitHub you get this fancy contribution graph when you do this, and so it acted as a kind of “gamification” for me. I don’t think anything else has so dramatically improved my abilities as a programmer.


It's the same with almost every creative endeavour. Painting as well as poetry and programming. Practice makes you better, even if the result is sometimes poor in isolation.


Isn't it obvious that practicing your art is a prerequisite for improving your skills?


As Paul Dirac so eloquently put it,

> “I think that there is a moral to this story, namely that it is more important to have beauty in one's equations than to have them fit experiment. If Schrödinger had been more confident of his work, he could have published it some months earlier, and he could have published a more accurate equation. It seems that if one is working from the point of view of getting beauty in one's equations, and if one has really a sound insight, one is on a sure line of progress. If there is not complete agreement between the results of one's work and experiment, one should not allow oneself to be too discouraged, because the discrepancy may well be due to minor features that are not properly taken into account and that will get cleared up with further development of the theory.”

> Scientific American, May 1963.

By that token, coding should be beautiful first, in elegance of structure and forms. And then we can iron out the bugs.

I tend to agree.


Isn't this the line of thinking that has held physics back for the past few decades? I thought physics is gradually moving toward being more data-driven rather than driven by ideas of beauty. Sabine Hossenfelder has written quite a bit about this.


The problem with theoretical physics, in my external perception, is more political, within academia.

- selecting for the H index, that is, what will be most cited, leading to strong convergence around "consensual" or "popular" research, at a huge detriment to diversity of ideas

- one negative effect is a major push for incremental publication (equivalent of a LOC KPI for coders... it's just wrong). “What did you publish this year? How many times was your name in a paper somewhere?”

- putting funds and institutional support behind people you like as opposed to people with the most promising ideas. That's pretty much the story of string theory, and see how despite the failure they are still "winning" politically, occupying top positions and feeding the mainstream vulgarization, whereas those who "lost" decades ago are still AWOL even as their ideas are freshened up by a new generation, whose future against the institutional pyramid is all but guaranteed.

We might still be there (a situation of total stagnation, since the 1970s, at the theoretical level) by 2040 if something doesn't change profoundly in the way university politics have such a compelling handle on research, thus on our progress in science in general.

My 2cts but I'm parroting, people who now speak up in academia.

What I do know about, ML, is all but dead currently at MIT, Stanford, etc. It's just monkey coding, applying whatever the industry likes for short-term benefits (again this "incremental"-only approach, playing it safe with 0.001% gains that are, sure, predictably doable). There's no actual research on "AI", let alone the general notion of "intelligence". Not even in neuro. I fear we've lost sight of the importance of basic research, as basic as it gets in a given field, because industry deals run the show in terms of funding. Yet look what Bell Labs or MIT did in the 1970s-80s, surfing on the last wave of fundamental breakthroughs of the 50s-60s. Those days are long gone.


Whenever people bring up aesthetics in coding they forget to balance it. Here is the man himself on how he feels about code aesthetics, “I do think issues of style do come through and make certain programs a genuine pleasure to read. Probably not, however, to the extent that they would give me any transcendental emotions.” [1]

[1] Knuth, D. E. Things a Computer Scientist Rarely Talks About. Center for the Study of Language and Information, Stanford, California. 2001.


I like this:

"A person’s success in life is determined by having a high minimum, not a high maximum. If you can do something really well but there are other things at which you’re failing, the latter will hold you back. But if almost everything you do is up there, then you’ve got a good life."


I've heard it said (maybe by Frank Zappa? can't remember for sure) that the difference between a professional musician and an amateur is not how well they play at their best, but how well they play at their worst.


The hardest part about professional music is that you have to perform on a schedule. It takes a lot of effort to prepare your body physically and mentally for peak performance ... and oftentimes you just fail. Have a bad day. That happens.

There will be many concerts, where you are just at 70% of your peak ability.

As a professional musician, you have to make sure your 70%-version is good enough that people gladly pay for seeing it.


>As a professional musician, you have to make sure your 70%-version is good enough that people gladly pay for seeing it.

Or you just do electronic music, so that's not an issue!


A lot of live electronic music performances involve some pretty serious showmanship.


One of my favorite "laptop DJ" moments -

I saw Brock Hampton at a festival in 2018. Hadn't heard of them before, and they were described to me as a boy band so I had low expectations. Their performance was bizarrely wonderful - they were all painted blue and wearing orange jumpsuits. Anyways, they have a DJ with a Macbook and a bunch of rappers. At one point mid-song they have a bar of silence, and I swear I saw the DJ lean forward and hit spacebar to pause the track and then resume it after.


Showmanship, yes, which is different from musicianship and from needing your "70%-version" of the musical performance to be "good enough".

Also, a lot of other live electronic music performances have almost zero showmanship, so there's that too.

(Speaking as an electronic musician)


to be fair, a lot of them are just pushing a play button as well


my personal favorite (which I don't know where I picked up): the difference between good programmers and average programmers is in how they cut corners. average ones will mostly walk themselves into a corner where the amount of effort required to fix it means it will never get fixed. good programmers will leave it in such a way that it can be improved/fixed given more time.


There is a statement very like this, about construction workers--masons or carpenters--in Christopher Alexander's A Pattern Language.


need to check it out. thank you.


This is similar to skateboarding. If someone wants to go from zero to landing one kickflip in a couple of days, he can (albeit a terrible one). The best skateboarders may very well miss their tricks more often than videos would make you think, but their overall consistency is incredible. When one lands a trick for the first time, the saying goes: "Two to make it true".


Really awesome seeing someone talk about skateboarding here on HN as it relates to the subjects covered by Knuth in this article. I grew up skateboarding and I credit those years with a huge amount of learning I didn't even know I was doing.

To expand upon what you said: it's amazing when you take that trick, a kickflip for example, and you're so confident in your consistency, that you apply it to something new. Kickflip to rock to fakie, kickflip down a set of stairs, kickflip to 50-50, and so on. The consistency you gain from that trick opens up the door to a whole new world of combinations.

There's so much about skateboarding and the culture (or counterculture in some cases) around it that translates so well to programming/hacking (and life in general): the trials of learning something new, the acceptance and overcoming of failure, the respect shown and earned when you're at the park and you or someone else puts down a sick trick.

One of my favorite creative minds in skateboarding, Rodney Mullen, talks a bit about the relationship between hacking and skateboarding in his TED talk. [1]

[1] https://www.youtube.com/watch?v=3GVO-MfIl1Q&t=683


Amateurs focus on how to get things right; professionals focus on how to make sure things don't go wrong.


That’s still saying talent is awesome. A metaphorical chess player who can beat you half asleep probably has both a high minimum and maximum.


Actually that makes even more sense with extreme sports.


"An amateur practices until he can do a thing right, a professional until he can't do it wrong." -- (probably by) George W. Loomis


This is interesting and very true. It reminds me of people who are excellent at something but their life gets held back by alcohol/drugs or bad relationships or some other _low minimum_.

That first sentence is really profound in my opinion.


You've never experienced good art by a tortured artist, then?


That would be success in art but failure in life, by the quote's definition. I infer he's equating success in life with your minimum quality-of-life indicators being as high as possible, if I got it right.


That's how I read it also.

Take for example Kurt Cobain (whatever you think of him and his music). He had major success in music, but his life was a disaster and not something I'd hope for a loved one.

I do agree that it seems like the best art is created by "tortured artists".


Do you think he'd be ok with the opposite - a bland, contained and long life?

What is life for? There are fundamental questions here and I don't think the answers are obvious as you seem to think.


Tortured artists have produced some great work, true.

J. S. Bach had some hardships in his life, but he wasn't tortured.

J. R. R. Tolkien, similarly, was not tortured.

da Vinci's life doesn't seem to have been horrible either.

I think they stand as strong counterexamples to the idea that tortured artists make the best art.


Tolkien wasn't tortured, but he fought in WWI.


I guess people (particularly Tom Shippey, IIRC) do argue that's a lot of what made his work what it is. I tend to forget about that part of his history.

Certainly it would have been a traumatic experience.

Definitely a relevant point. Thanks for mentioning it.


Is their art good because they are tortured, or despite being tortured? People tend to assume the former, but it's unclear why the latter cannot be true.


Interestingly a lot of the self-helpy books coming out recently say the opposite: that investing effort into improving strengths has a higher return than improving weaknesses.


Initial impressions are driven by strength while lasting impressions are formed by your weaknesses. So building up those strengths will get you the job offer, but bolstering your weaknesses is what gets you the promotion.


I also heard that McKinsey adheres to that strategy when investing in training for their employees. Though that is only hearsay and it doesn't mean anything, they could also be wrong. But I think you are right when saying that the opposite approach also has a lot of followers right now.


I suspect what's best for an employer isn't necessarily the same as for an employee. Big employers want specialisation and obedience. That's fine while the demand is there but the narrower your specialisation the less the market needs to move to make you obsolete.


That doesn't really make sense. Everyone's minimum is 0 (or negative) at something.

Knuth is famously proudly terrible at email and "keeping on top of things". Does that make him a failure? Of course not.


Knuth isn't "terrible" at electronic mail or keeping on top of things. He used electronic mail for decades. He's old. He doesn't want to use it anymore. So he doesn't.

https://www-cs-faculty.stanford.edu/~knuth/email.html


Well, one way to raise your minimum is by cooperating with other people so that they help cover your minimum, and you do theirs.


This is why it's important to identify your weaknesses. Once identified they are no longer weaknesses but areas to work on.


I've heard that it's better to leave weaknesses be and instead utilize your strengths. The reason is because if you just work on your weakness, you end up being just mediocre.

What's your opinion on this?


> it's important to identify your weaknesses. Once identified they are no longer weaknesses but areas to work on

As Tevye might say, You're right!

> it's better to leave weaknesses be and instead utilize your strengths

You're right!

> They can't both be right.

You know, you are also right.

- - -

Or maybe this illustrates that overly simplistic "solutions" to the complexities of life are just that... overly simplistic, that is, not solutions.


It depends what they are. Early in my career I was told I was too introverted and I took that on board. Looking back I think should have told those people to fuck off.

That's a lot different from having a shallow understanding of some technology you use or not understanding your employers business that well.


My opinion is that you can do both. Working on your weaknesses doesn't mean ignore your strengths.


Are you all claiming that people with one-hit wonders aren't successful? Like yeah, Baauer isn't especially talented but he's going to be remembered forever for "Harlem Shake".

Same with Sir Mix-a-Lot and "Baby Got Back".

They have some pretty low minimums.

If you make one great work - you can coast on that forever.


One hit wonders are statistical flukes. They seem to achieve something by being at the right place at the right time and delivering an amazing performance.

I remember the Harlem Shake is a thing but can't remember who Baauer is. I know about Sir Mix-a-Lot, but couldn't recall any of his tunes.

I think you need more than great work to get me to remember you forever.

The Beatles certainly achieved that, in year 3000 some people will probably talk about them. I'm not so sure about Vanilla Ice.


How about defining your own idea of success in life?

Taking your utility function, your definition of success, from someone else, is only one step removed from being jealous of someone on Instagram.

Think, and judge, for yourself.


Oddly, this comment seems mainly to put down the GP precisely for thinking and judging for themselves.

Please allow people to like and utilize the good ideas of others. Thankfully, we don't have to invent every wheel we use.


If you meet the Buddha on the road ...

It's true we don't have to invent every wheel. But it's also true that we don't have to treat invented wheels as gospel. There's more than one standard for living a good life, which depends on the person.


Eh. The direction irked me, smacked too much of an objective measure of people's lives.


> Think, and judge, for yourself.

But surely also be inspired by others? OP simply said "I like this". I agree we shouldn't blindly follow or resent others, but that seems a stretch from "like".


would someone mind ELI5-ing "high minimum, high maximum"?


[flagged]


Based on similar things he's said in many previous interviews, I think you should read "success in life" as "happiness in life", etc: read the quote as something like: A person’s happiness in life is determined by having a high minimum, not a high maximum. If you enjoy something a lot but there are other things which make you miserable, the latter will hold you back. But if almost everything is enjoyable (or at least tolerable) to you, then you’ve got a good life. — see the example from the article where he got uniforms to make toilet-cleaning more fun.


interesting interpretation. I guess sometimes you need to know the author to understand what they meant more than what they said.


He probably meant an attitude towards life; giving your best effort to even the most mundane task to get the best out of it. That would also elevate your success in what you do best, and your satisfaction in life in general.


the original quote lost its full context and is giving the wrong meaning. knuth is trying to tell us to find joy in the most mundane things we do. he gave the example of cleaning the toilet, which he and his wife do wearing uniforms.

solving top problems in quantum mechanics is, by any means, no mundane task.

but i have heard similar quotes before. about the joy of doing something being more important than the result. i think it's because joy is so contagious. shared joy makes you even happier, imo.


There is a De-motivational Poster from back in the day with a picture of a skier doing an airborne yard sale. Captioned with 'If you can't learn to do something well, learn to enjoy doing it poorly'

A diamond right there.


I feel like when people start talking about TAOCP, the first thing that comes up is the fact that you need to know a good amount of math to understand Knuth's language. But what people are forgetting is that his books are just a bunch of stories told in mathematical language. You need just the slightest math knowledge to be able to follow his stories. Maybe hard in the beginning - because yes, it's dense - but once you get used to his way of storytelling, you get hooked.


Donald Knuth manages to be brilliant, well-read and also just a nice person. Sort of refreshing given our field's tendency to let mean-spirited people off the hook if they happen to be good at what they do.


Even here on HN I've seen quite a bit of criticism of Donald Knuth implying that he's somehow become irrelevant. Personally I appreciate Donald Knuth the scientist as well as Donald Knuth the person. Mean-spirited people will get better at what they do but will probably miss the mark on becoming a great and humble person such as Donald Knuth. I'm really grateful that he's still among us, still working on his books, and that he shares his wisdom with us.


A thing I never knew before: Donald Knuth, the paragon of scholarship and of a life well spent, not only cleans his own toilets but bought a specific janitorial uniform to make the process more efficient.

I find this inspiring enough that I am going to clean my toilet now.


I believe you misunderstood a very important point.

The uniform isn't for "efficiency". For cleaning one toilet, changing in and out of uniform would ruin the efficiency of having a pocket for 409 spray.

The uniform is for fun.

"Jill and I got uniforms that have a slot where the 409 cleaner fits. You go over there and squirt and feel good cleaning the toilet!"


I imagine they clean more than the toilet when they each put their uniform on. Perhaps do some garden work too.


Excellent point. Thank you.


> a life well spent

When I read this, my first thought was about how Knuth hasn't used email since 1990.

"I have been a happy man ever since January 1, 1990, when I no longer had an email address. I'd used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime."

Source: https://www-cs-faculty.stanford.edu/~knuth/email.html


Who else would clean his toilets? Is he wealthy? I assumed no.


Yes, that's how he was able to build his house around a custom-built pipe organ. Also, though, many men of his generation leave the toilet-cleaning up to their wives. But he says he and Jill each have uniforms for this purpose.

(My toilet looks and smells much better now.)


Makes a bit more sense.


TAOCP has sold more than 1 million copies over the years. That doesn't make him Zuckerberg rich, but wealthy enough to afford a housekeeper if he wanted.


I haven't asked him but I don't think his book earnings are significant compared to his salary and pension.


Wealthy enough to have a pipe organ in his home on the Stanford campus! I'm sure he could afford a housekeeper if he wanted one.


I learned programming from TAOCP, but that was in the mid-70s, when good expository texts were rarer. I mostly skipped over the math then, and tried to figure out the algorithms. Actually wrote a MIX assembler/simulator, which taught me a ton of low-level programming.

If you're into the math side of this, I'd highly recommend the book "Concrete Mathematics", which Knuth wrote with Graham and Patashnik. I got it to get better at solving problems over at Project Euler, but it's a fun read, like TAOCP. As deep as it is, it's amazingly light-hearted, down to the graffiti in the margins from the students who proof-tested the material in a Stanford course.


There is a joy that shines through everything Knuth does. Those who have ears hear the reason for this.


> Those who have ears hear the reason for this.

You think that's it?


Absolutely.


Man, for an older person he is using a really small font on his screen! Also: it looks like a GNU/Linux variant, not Windows or macOS.


Knuth does indeed use emacs. There are some elisp modules on his site somewhere. He has also said that he uses other OSes, but only trusts GNU/Linux with the "crown jewels".


And in the older photo, that's a SAIL keyboard¹, isn't it?

¹ http://xahlee.info/kbd/sail_keyboard.html


True. The editor on the left side does look like emacs.


That's the FVWM window manager. His FVWM config (or at least some version of it) is available if you're curious: https://www-cs-faculty.stanford.edu/~knuth/programs/.fvwm2rc .


Thanks so much for this link! It also explains what I am seeing in that picture:

"Now comes the fun part: Buttons to push that will take me from one desktop to another.

Being a control freak, I am not trusting FvwmButtons to find the correct button layout; I'm building it myself. The goal is to have a 64x64 ASClock at the upper right, preceded by four 32x32 buttons that will aim my display at another desktop, all above a 16x128 CPU load display. Geometry-wise, I consider it to be an 8x5 grid of 16x16 squares (although I could have regarded it as a 4x5 grid of 32x16s)."


Man, he's really bought into that whole literate programming thing..


That's the most-pleasant .rc file I've ever read.


Thank you for this..

When I saw the picture in the article, I was wondering what kind of editor he was using.


Damn, even his .rc files are a joy to read.


Attended one of his lectures many years ago and he was indeed using Emacs back then. Don't remember which OS, but I think he was just in Emacs for most of the session. At one point he stopped, puzzled about what was going on, quickly realizing that he'd forgotten to save the file. To this day, when I forget to save, I remember this and think: "If it can happen to Knuth, it's OK if it happens to you."


So much respect for the man. Hope he stays safe during these times and lives long enough to finish the series :D


An interesting book interviewing high-level programmers is https://en.wikipedia.org/wiki/Coders_at_work. The book starts out with JWZ and ends with Knuth. In addition to covering their work, Peter Seibel also asks each one whether or not they have read TAOCP, how they feel about C++, and whether they use a debugger.

Only one of the people has read the book, with even Knuth saying that he hasn't read it. Of course, he is joking.

But this is a fun read.


Does anyone know which books are on his bookshelf? Alternatively, is there an interview where Donald Knuth mentions the books he studies?


I think a point is that when you create software there are so many choices for how you could create a given piece of program. Trying to figure out the best way to do it before you actually do it could take very long, and you still would not be much wiser, because it would all be in your head, not in software. Whereas if you just do it, you have something you can learn from and make incrementally better.


Has anyone read his books? Seems like an interesting way to get into programming. I enjoy reading, but I never thought that a book about programming would present its ideas and technical details through stories.


My first major programs were written under the influence of Knuth—I spent a lot of time reading the source code of TeX and its related programs and learned a lot from that. In fact, I only ever took one computer science class in my life (I went three times and got a C), so I'd have to say that Knuth was by far the most formative influence on my early programming. The other day I was actually looking to see if there was any trace of the DVI previewer I wrote for VM/CMS back in the 80s around on the internet. As near as I can tell, there is not. It'd probably be embarrassing to see (among other things, I didn't do any caching of font bitmaps, mostly because I didn't know how and didn't have time to learn, so every character displayed on screen re-read the bitmap data from disk).


How do you get pixels on a screen from VM/CMS?


There were two graphic output options in the previewer. There were specialized graphic terminals using the GDDM protocol (this code was actually written by someone in Germany who sent me their changes). The original code that I wrote used the Tektronix graphics protocols available in the terminal driver; it connected to the mainframe via a protocol converter that enabled the use of cheap ASCII terminals instead of the standard dedicated IBM terminals.


Oh man, you were drawing pixel fonts on a 4014 storage tube? That must have taken forever.


It was an emulation of it on a PC. VT100+Tektronix was a common graphics option on terminals of the era and the PC terminal software provided that as its graphics choice. I had some optimizations like replacing any characters below a certain threshold size with a solid box based on the bitmap's bounding box. It was reasonably fast given the speed of the connection.


I think it's potentially dangerous. When I was first learning to program, there were a lot of things I couldn't figure out how to do. Then I came across an algorithms textbook (Algorithms in C, by Knuth's student Sedgewick) and it explained how to do a lot of things I had never been able to figure out how to do before, and with beautiful code. This was a wonderful revelation! I then spent a lot of time studying algorithms.

Unfortunately, I didn't learn how to program! I thought that what I was missing was knowledge about algorithms, and that was occasionally right but mostly wrong. Worse, algorithmic textbooks bias you to look for the one weird trick that makes your apparently complex problem simple. But usually that trick doesn't exist, and when it does you usually have to solve the problem the hard way first before you understand the problem well enough to find it. The process of debugging, refactoring, optimizing, and testing that gives rise to the final polished form of a program cannot easily be inferred from what remains. Books like The Practice of Programming and Code Complete were much more helpful, but you can't learn to program by reading books, any more than you can learn to play baseball or win lawsuits by reading books.

I did eventually learn to program pretty well, though I'm not yet a master of the craft like Knuth, Jeff Dean, Rob Pike, Walter Bright, or Norvig. I did it largely by a practice described in this interview: writing new programs every day. I also learned a lot from pair-programming, which taught me both to read other people's code (we had collective code ownership) and to write code others could understand. My main obstacle was not ignorance but perfectionism and lack of practice.


I used his books as a way to learn the things about computer science that I missed out on by not having a CSci degree. They are very mathematical, but that's just a bonus for me. They are certainly not light reading.


> I missed out on by not having a CSci degree

I have a BS & MS in CS, and most of the material in TAOCP was still new to me. Even the stuff I thought I knew like hash functions was covered in a new depth I never would have imagined.


I often consult the "Seminumerical Algorithms" volume, as someone who has to implement lots of digital arithmetic. It is wonderful to help me base my choice of algorithms on math. I wouldn't recommend it as a way to get into programming though, as it is more about algorithms than about how to code them in a practical language. Perhaps his books on literate programming would be better for this purpose.


I try to implement in Emacs Lisp Knuth's combinatorial algorithms from

https://www-cs-faculty.stanford.edu/~knuth/fasc2b.ps.gz

I just managed to finish the first, Alg. P (plain changes); it was really hard, considering that I didn't understand the algorithm and the description is written for Pascal-like languages.
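(For anyone curious, Algorithm P is the "plain changes" method, also known as Steinhaus–Johnson–Trotter: it generates every permutation by a single adjacent swap. A rough Python sketch of the idea, not a translation of Knuth's pseudocode or of my Emacs Lisp:)

```python
def plain_changes(n):
    """Yield all permutations of 1..n, each differing from the
    previous one by a single adjacent transposition (plain changes,
    a.k.a. Steinhaus-Johnson-Trotter; the idea behind Knuth's Alg. P)."""
    perm = list(range(1, n + 1))
    dirs = [-1] * n  # every element initially "points" left
    yield perm[:]
    while True:
        # Find the largest "mobile" element: one whose direction
        # points at a strictly smaller neighbor.
        mobile = -1
        for i, v in enumerate(perm):
            j = i + dirs[i]
            if 0 <= j < n and perm[j] < v and (mobile == -1 or v > perm[mobile]):
                mobile = i
        if mobile == -1:          # nothing mobile: all permutations emitted
            return
        i, j = mobile, mobile + dirs[mobile]
        perm[i], perm[j] = perm[j], perm[i]   # swap with the pointed-at neighbor
        dirs[i], dirs[j] = dirs[j], dirs[i]   # the direction travels with the element
        moved = perm[j]
        # Reverse the direction of every element larger than the moved one.
        for k in range(n):
            if perm[k] > moved:
                dirs[k] = -dirs[k]
        yield perm[:]
```

For n=3 this produces the six permutations 123, 132, 312, 321, 231, 213, each one adjacent-swap away from the last.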


> Seems like an interesting way to get into programming.

This would be an iconoclastic route into programming, to say the least.

If you are looking for a more approachable or practical book that gets you programming and introduces some theory I recommend “Classic Computer Science Problems in Python” by David Kopec.

Reading TAOCP to get into programming is sort of like reading a physics textbook in order to build a doghouse in your backyard.


Iconoclastic? What idols are being smashed by our hypothetical TAOCP-reading novice?


Not the best word choice. s/iconoclastic/singular/


perhaps the learn how to program X in Y days "icon"?


He wrote a love story about Surreal numbers. I haven't read it yet but it's on my list FWIW.


> Science is much easier to learn if you know the sequence of discoveries.

Is there a math book that gives the historical context (what is the exact problem this piece of math is trying to solve)?


Here’s the funny thing... If Knuth (or a modern avatar) were applying for a faculty position today, or coming up for tenure, would s/he succeed?


How does that gel with TeX, whose syntax creates expressions that are nearly universally regarded as ugly?


I tend to be a good writer and able to express specific feelings easily. Should I learn to code?


Coding is a valuable skill for everyone, not only computer scientists. A popular recommendation of mine for those who want to see what it looks like, while getting a decent introduction, is Automate the Boring Stuff with Python: https://nostarch.com/automatestuff2


Knuth invented a system that combines writing and coding: literate programming. Maybe you'd like that.


Donald Knuth and Larry Wall - I find these guys in the field of Computer Science the most intriguing.

