I remember a 120MHz Pentium Linux box arriving at a cottage in Crete, where, with the aid of a 56k USRobotics modem, we (my wife and I) worked remotely in 1995-6. She had a Mac SE/30 for her tourist guidebook work. She later upgraded to a 6100 PowerMac "pizza-box", then various iMacs, G3/G4/G5, whereas I saved a quad 200MHz Pentium Pro monster (Compaq Professional Workstation 8000, tricked up to 3GB RAM) from the skip. I regret taking that to the recycling centre many years later.
There are two ways of constructing software: one way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult.
That's cute, but pure fantasy, imo. No non-trivial code that's actually used and maintained is immune to obvious deficiencies that get overlooked. Look at this [0], for example: simple as can be, an obvious defect in a critical area, sitting in open-source code for ages, the type of bug an LLM would be liable to generate in a heartbeat and a reviewer could easily miss. We are human after all. [1]
The sheer amount of dedication is awe-inspiring. In the 80s, in the Theoretical Physics Group at Imperial College, London, we came across some correspondence intended, I think, for one of the professors. The writer had evidently "translated" his native Spanish into English laboriously, via a dictionary, we guessed. We spent many tea breaks puzzling over phrases such as:
"Is well-knew the refran: of the said to made it has a good road."
"The language assumes contiguous memory, column-major order, and no hidden aliasing between arrays."
One out of three ain't bad.
"Column-major order": that's the one that holds. It just says that there is an implied order for the elements of an array, which lets an implementation map an array, as the language defines it, efficiently onto storage with a linear address space. It does not require storage to be organised in any particular way.
"Contiguous memory". Not really: the language definition does not even use the word "memory". The language is carefully defined so that it can be implemented efficiently on a linear address space in the presence of caches (no accident that IBM machines in the 80s were pushing caches while CRAY was pushing vector processors). The term "contiguous" in the language definition just means a collection whose parts are not separated by other data objects.
"No hidden aliasing between arrays". This is a crude mis-statement of the actual rule of argument association across the subroutine/function caller-callee boundary. The rule takes pages to describe fully. A language that still has the EQUIVALENCE statement (albeit marked obsolescent) cannot be said to disallow aliasing. It is still quite hard to find a compiler that will reliably diagnose inadvertent aliasing at runtime. The actual rule in Fortran says something like "thou shalt not cause any effect on (dummy name) A through a reference to some other name, unless (some-clause-x or y or z)".
It is not suited to cases where the machine state is experienced by the user at all times (games, hardware control, real-time transactions). It is very well suited to cases where the machine is expected to act like an SSBN, disappearing from view until a large floating-point calculation is ready.
Can someone explain to me why, although we've known for at least 100 years that classical physics is not a correct description of the hardware of the universe, we are still hooked on 90-year-old Turing Machines, which cannot physically exist (they violate quantum mechanics) and whose theoretical limitations are, consequently, irrelevant?