> Seriously. How many times have you crashed out on a knotty problem and awakened with a clear solution?
It's usually the opposite for me: I go to sleep thinking I've finally solved some problem, and when I wake up, literally my first thought is a clear counter-example on which the solution fails. It's frustrating.
Same here. It's particularly annoying when I've thought through a problem (and perhaps slept on it too), figured I had it solved, started working on it, and then one day, halfway through the implementation, woke up realizing the design was broken.
"The Fastest And Shortest Algorithm For All Well-Defined Problems
...
An algorithm M is described that solves any well-defined problem p as quickly as the fastest algorithm computing a solution to p, save for a factor of 5 and low-order additive terms."
As someone who works exclusively alone, I've come to the conclusion that the maximum velocity at which one can advance a project is how fast the brain can adapt to its new state. And sleep plays an important role in that.
This reminds me of a clever hack I learned for finding bottlenecks when using a debugger or in cases where you're able to get a backtrace from a running program.
If you run the program and then trigger the debugger at a random moment with Ctrl-C, you've most likely landed on a slow code path, because that's where the program spends most of its time.
Or, if you have a semi-locked down production box and a process that's already gone unusably slow, a quick shell loop that calls pstack on the slow process 10 times will get you a good idea where the process is spending its wall clock time. (I've seen locked-down environments where pstack can be run, but gdb can't.)
Note that pstack will probably take a bit for each stack snapshot it prints. It has to attach as a debugger, walk the stacks, and read the symbol tables each time. It's easy for the whole process to freeze for a second or two, so you probably don't want to do this on a production process that is limping along okay-ish.
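The loop described above could be sketched roughly like this (a sketch, not a polished tool; the snapshot count, one-second spacing, and the availability of pstack on your box are all assumptions):

```shell
#!/bin/sh
# Take 10 stack snapshots of a process, one second apart, then show
# which frames turned up most often -- those are where the wall
# clock time is going.
sample_stacks() {
  pid=$1
  : > samples.txt
  for i in $(seq 1 10); do
    pstack "$pid" >> samples.txt
    sleep 1
  done
  summarize_samples
}

# Count repeated lines across all snapshots; hot frames float to the top.
summarize_samples() {
  sort samples.txt | uniq -c | sort -rn | head
}
```

Run it as `sample_stacks <pid>`; any frame that shows up in most of the 10 snapshots is a good bottleneck suspect.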
> Sleep sort is an O(N) sorting algorithm for non-negative integers.
No, it is not. It's O(N+M), where M is the largest value in the list. And, notably, it can fail if scheduling the jobs takes longer than the smallest difference between elements in the list.
Also, sleep() just delegates the sorting responsibility to a scheduler, which likely uses an O(log n) heap to identify the next scheduled event.
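For context, the shell version of sleep sort under discussion looks roughly like this (a sketch; it assumes the values are far enough apart that scheduling jitter can't reorder the wakeups):

```shell
#!/bin/sh
# Sleep sort: spawn one background job per element. Each job sleeps
# for its own value and then prints it, so the kernel scheduler does
# the actual "sorting".
sleepsort() {
  for n in "$@"; do
    ( sleep "$n"; echo "$n" ) &
  done
  wait
}

sleepsort 3 1 2
```

With one-second gaps between values this reliably prints 1, 2, 3; with values microseconds apart, the failure mode described above kicks in.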
Hashed hierarchical timing wheels are a cool, very specialized data structure for making this efficient. The paper claims O(1) to start, stop, and maintain timers, which is much more efficient than O(log n).
Given that there are n timers and O(1) operations, I'm not sure where the O(n log n) fits in here. Possibly in the minimum number of ticks?
> If we can guarantee that all timers are set for periods less than MaxInterval, this modified algorithm takes O(1) latency for START_TIMER, STOP_TIMER, and PER_TICK_BOOKKEEPING.
I think that puts this in the same class as counting sort.
With a linear time O(N) first pass you can identify the minimum element in the list. This reduces M from being the largest element to being the difference between the largest and the smallest element.
If we're playing with this, why not take one pass to find the largest element, then another pass to divide each element in the list by that number, leaving it at 3N + M/M?
Edit:
Why divide by M when you can divide by M*C, where C speeds this portion up by a constant factor!
There is a class of programming languages or execution environments, whose name escapes me at the moment, where the "time" variable/dimension is explicitly woven in as a first-class element (and _really_ first class: it drives the rest of the system).
But I cannot remember what it is, and I'm likely explaining it badly. It had some neat algorithmic or type-level properties too, I believe, and I remember seeing it in context to modelling truly real-time systems.
Does anyone know what I'm talking about? Or is my memory finally deciding to make things up, whole cloth?
Edit: found it with the help of the commenters below!
Okay that looks super interesting, I know what new thing I’m going to fiddle with while my girlfriend grumbles that I’m not paying attention to whatever we’re supposed to be watching this week ;)
Synchronous programming was the term I was thinking of: the concept of logical ticks being first-class in the programming language itself is what SleepSort reminded me of :)
This reminds me of generating signed distance fields from bitmaps by rendering a cone at every target pixel with a vertex color of white at the point and black at the base.
I like the method because not only is it silly, it's also very easy to do on a GPU and good enough that I've actually used it in real products.
When I have a problem I can't find a way to fix at my desk, I take a walk, do something else, watch a movie, etc. Then at least one solution suddenly comes up.
It's difficult to maintain this workflow in the office, as people think you're lazy and procrastinating. In an office setting, taking the problem to bed and showing the solution the next day is one way around that.
The more straws you have, the harder it becomes to discover the largest one, and to actually pick them all up and bang them. At some point you will likely run afoul of fundamental limits related to mass, energy, and the speed of light, and that limit will probably arrive well before 2^32 elements.
Edit: This is usually the problem with analog algorithms: you can easily tell apart 100 items, but scale it up and you find you need so much energy to differentiate that you'd collapse into a black hole before successfully measuring the differences.
From a technical point of view, it simply hands the problem over to the CPU scheduler. Kind of like touching empty files named in a particular way and running ls to get them back sorted. Similar to rc scripts.
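The touch-and-ls version could be sketched like this (a toy, assuming non-negative integers; zero-padding the filenames keeps lexicographic order in agreement with numeric order):

```shell
#!/bin/sh
# "Filesystem sort": create one empty file per element in a temp dir,
# zero-padded so ls's lexicographic listing matches numeric order,
# then strip the padding back off.
fs_sort() {
  dir=$(mktemp -d)
  for n in "$@"; do
    touch "$dir/$(printf '%010d' "$n")"
  done
  # Keep at least one digit so "0" survives the zero-stripping.
  ls "$dir" | sed 's/^0*\([0-9]\)/\1/'
  rm -rf "$dir"
}

fs_sort 30 5 200
```

Here the "scheduler" being delegated to is whatever ls uses to sort directory entries.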
> Seriously. How many times have you crashed out on a knotty problem and awakened with a clear solution?
Glad that my superpower remains secret.