Hacker News | past | comments | ask | show | jobs | submit | StackOverlord's comments | login

> ‘Sometimes you need to do that to get the results for things you think are essential.’


Yeah, which is a quote relayed by a guy who disagreed with what that person was doing and who seems to have a real bee in his bonnet about the topic under discussion, calling it a "lie" when I can't see any dispassionate observer agreeing with that characterisation.

Did the person allegedly quoted actually think it was a lie? It seems amazingly lucky that he got so close to the real figure if he did. It feels like someone saying "sometimes you've got to break the law to get things done" because they jaywalked in the middle of the night while out buying milk. The drama of the comment just doesn't seem plausible at first glance.


I'd like to point out the existence of Chris Staecker's YouTube channel, mainly focused on old analog devices used to compute and measure things.

https://www.youtube.com/watch?v=R2Q-QPsFKa4

He has a great sense of humor!


Then make sure nobody from Eglin Air Force Base posts on here.

https://www.reddit.com/r/Blackout2015/comments/4ylml3/reddit...


I'm not quite sure I get your angle, but HN users from Eglin Air Force Base are as welcome here as users from anywhere else, as long as they're using the site as intended. Why shouldn't they be interested in pedestrian footbridges in Minneapolis? (To mention one darling of the moment.)


Academia selects for conformism first, closely followed by cognitive ability ([paste link to paper here]). But who cares?

I certainly don't. I'm more interested in the groundbreaking implications of this thesis:

"Contribution to the characterization of three texture descriptors: crispness, crunchiness, and brittleness through acoustic and sensory approaches."

https://www.theses.fr/1992DIJOS040


I think it matters a great deal too, and not just for typing code but for reading it as well. This goes beyond simply typing fast: it's also about writing short, concise code, especially for the parts of the code that are closer to defining the program's architecture than to implementing the details. It's better to have everything fit on one page of code than to have it dispersed throughout multiple files.

The reason is that a codebase conditions the complexity, and thus the time, of adding features to it, in a way similar to algorithmic complexity and big-O notation, except that we as humans can't even afford polynomial complexity, and constant factors matter a lot.

Imagine you're developing an API, both server and client. You can cut your time in half by automatically deriving the client code from the server-side specs. Of course, you may have to develop that tool yourself, and that takes time. The point is that the time invested in developing it is repaid each time you implement a new endpoint in your API, cutting development time in half:

n * (t(s) + t(c)) versus t(tool) + n * t(s)
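As a hypothetical sketch of what such a tool could look like (the `Endpoint` spec, `make_client`, and the `transport` callable are all invented here for illustration), client methods can be generated directly from the server-side endpoint list:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical endpoint spec; in practice this would be imported from,
# or generated out of, the server-side route definitions.
@dataclass
class Endpoint:
    name: str
    method: str
    path: str

SPEC = [
    Endpoint("get_user", "GET", "/users/{id}"),
    Endpoint("create_user", "POST", "/users"),
]

def make_client(base_url: str, transport: Callable):
    """Derive a client whose methods mirror SPEC, one per endpoint.

    `transport(method, url)` stands in for a real HTTP call.
    """
    class Client:
        pass

    client = Client()
    for ep in SPEC:
        # Bind ep via a default argument to avoid the late-binding
        # closure pitfall inside the loop.
        def call(_ep=ep, **params):
            url = base_url + _ep.path.format(**params)
            return transport(_ep.method, url)
        setattr(client, ep.name, call)
    return client

# Each new endpoint added to SPEC yields a client method for free:
client = make_client("https://api.example.com", lambda m, u: (m, u))
```

Once the spec is the single source of truth, each new endpoint costs only the server-side work, which is the n * t(s) term above.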



> "additional data available to the Department of Defense" which I assume is classified.

Spot on

> The 2014 meteor was originally identified as an interstellar object by Siraj in 2019 when he and Loeb were studying Oumuamua. The pair posted their findings as a preprint and submitted their results to an astronomy journal, but the paper was not accepted for publication because they used data from a NASA database that used classified information that could not be verified.

> The snag started a three-year process as Siraj and Loeb worked through a bureaucratic logjam to receive government confirmation on their findings, working with scientists and officials at NASA, Los Alamos National Laboratory, and other offices. They eventually connected with Matt Daniels, assistant director for space security at the White House’s Office of Science and Technology Policy, to get an analysis from Shaw and Mozer.

https://www.cfa.harvard.edu/news/scientific-discovery-gets-k...


You can get to the absolute dimensions of the pyramid from the crackpot equations though:

https://pasteboard.co/cIMzn8KktEZy.png

https://tobeornottobe.org/the-great-pyramid-intro/math-const...

https://pastebin.com/TsUtcWAa

I just found a great recap on this topic that also fits nicely with the points your link brings up, shedding better light on the number 43,200 in other cultures (the scaling factor between the pyramid's and the Earth's dimensions, for those who TL;DR).

https://fossana.medium.com/the-pyramids-of-giza-have-propert...

It also addresses the fact that the speed of light in m/s is encoded three times in the pyramid: twice in the dimensions, in cubits and in meters, and a third time via the GPS coordinates (yes, 3 fucking times!). But how would they have known the duration of a second? I don't know! What I do know is that 43,200 × 2 = 86,400, the number of seconds in a day.

The thing goes deeper with this man and his investigations of the first edition of Shakespeare's Sonnets.

https://www.youtube.com/watch?v=0xOGeZt71sg

https://www.youtube.com/watch?v=nIS-hNrr0-c


I'm not convinced until the pyramids encode the Meter and the Kilogram and Mole. ...in Hex.


make enough measurements and you can probably find those "encoded" there, too


Write a program to prove your point.

I'm not being sarcastic. Maybe you could take over my attempts at deriving a custom set of equations for a pyramid, or any other Platonic solid, using symbolic regression / genetic algorithms.

https://pastebin.com/9CEU76K6
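A much simpler brute-force sketch (not the linked code, and not real symbolic regression) already illustrates the grandparent's point: take a handful of published dimensions, try a few arithmetic combinations, and famous constants appear "encoded" by chance. The only robust coincidence, perimeter over twice the height ≈ pi, is a known side effect of specifying the slope as a whole-number ratio (the seked).

```python
import itertools
import math

# Published (approximate) Great Pyramid dimensions: base side and
# original height, in meters and in royal cubits.
base_m, height_m = 230.3, 146.6
base_c, height_c = 440.0, 280.0

# perimeter / (2 * height) = 1760/560 = 22/7, within 0.05% of pi.
assert abs(4 * base_c / (2 * height_c) - math.pi) / math.pi < 0.001

# Now show how cheap further "encodings" are: try small arithmetic
# combinations of the four numbers against other famous constants.
values = [base_m, height_m, base_c, height_c]
targets = {"e": math.e, "golden ratio": (1 + 5 ** 0.5) / 2, "sqrt2": 2 ** 0.5}

hits = []
for a, b in itertools.permutations(values, 2):
    candidates = {"a/b": a / b}
    if a > b:
        candidates["a/(a-b)"] = a / (a - b)
    for expr, v in candidates.items():
        for name, t in targets.items():
            if abs(v - t) / t < 0.02:  # a generous 2% window
                hits.append((name, expr, a, b))
```

Even this tiny search, four numbers and two expression shapes, already "finds" e twice; widen the expression set or the measurement list and the hit count explodes.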


"The pyramids give us the dimensions of our planet on a scale defined by the planet itself." - Hancock

It's a bit as if we had never seen a poem and stumbled upon some text that harbors rhymes. You'd argue there's no intrinsic link between, say, "pickle" and "tickle"; you'd dig into each word's etymology to show they wouldn't have rhymed in their past forms. And beyond that you would deny that the words rhyme because they are not in succession, and when I'd point out that's because the rhymes are crossed, you'd laugh it off.

It's not about convincing you (of what? Of some secret intent? I'm not even sure that's the case). It's about how convincing it is. For that you need to model the cognitive process that interprets these coincidences.

Algorithmic Simplicity and Relevance - Jean-Louis Dessalles

https://telecom-paris.hal.science/hal-03814119/document

4.1 First-order Relevance

Relevance cannot be equated with failure to anticipate [16]: white noise is 'boring', although it is impossible to predict and is thus always 'surprising', even for an optimal learner. Our definition of unexpectedness, given by (1), correctly declares white noise uninteresting, as its value s at a given time is hard to describe but also equally hard to generate (since white noise amounts to a uniform lottery), and therefore U(s) = 0. Following definition (1), some situations can be 'more than expected'. For instance, if s is about the death last week of a 40-year-old woman who lived in a far place hardly known to the observer, then U(s) is likely to be negative, as the minimal description of the woman will exceed in length the minimal parameter settings that the world requires to generate her death. If death is compared with a uniform lottery, then Cw(s) is the number of bits required to 'choose' the week of her death: Cw(s) ≈ log2(52×40) ≈ 11 bits. If we must discriminate the woman among all currently living humans, we need C(s) = log2(7×10^9) ≈ 33 bits, and U(s) = 11 − 33 = −22 is negative. Relevant situations are unexpected situations.

s is relevant if U(s) = Cw(s) – C(s) > 0 (2)

Relevant situations are thus simpler to describe than to generate. In our previous example, this would happen if the dying woman lives in the vicinity, or is an acquaintance, or is a celebrity. Relevance is detected either because the world generates a situation that turns out to be simple for the observer, or because the situation that is observed was thought by the observer to be ‘impossible’ (i.e. hard to generate).

In other contexts, some authors have noticed the relation between interestingness and unexpectedness [9, 16], or suggested that the originality of an idea could be measured by the complexity of its description using previous knowledge ([10], p. 545). All these definitions compare the complexity of the actual situation s to some reference, which represents the observer’s expectations. For instance, the notion of randomness deficiency ([8], ch. 4 p. 280) compares actual situation to the output of a uniform lottery. The present proposal differs by making the notion of expectation (here: generation) explicit, and by contrasting its complexity Cw(s) with description complexity C(s).
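The paper's worked example checks out in a few lines (the constants below are taken straight from the quoted passage):

```python
import math

# Unexpectedness per the excerpt: U(s) = Cw(s) - C(s), where Cw is the
# generation complexity (bits the "world" needs to produce the situation)
# and C is the description complexity for the observer.

# Choosing the week of death of a 40-year-old: 52 weeks x 40 years.
Cw = math.log2(52 * 40)   # ~11 bits

# Discriminating one person among ~7 billion living humans.
C = math.log2(7e9)        # ~33 bits

U = Cw - C                # ~ -22 bits: negative, hence irrelevant
```

A negative U means the situation costs more bits to describe than the world needed to produce it, which is exactly why a stranger's death far away doesn't register as relevant.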


http://homework.uoregon.edu/pub/emj/121/lectures/tycho121.ht...

> Tycho Brahe (1546-1601) proposed an experiment that would determine whether or not the earth goes around the sun. Basically, if the Earth orbits the sun, nearby stars should periodically "move" back and forth in their position with respect to more distant stars every 6 months. If the Earth was stationary (at the center of the Universe), this wouldn't occur.


Yes, but historically, scientists ended up accepting the heliocentric model long before the first stellar parallax was measured.

The heliocentric model's parsimony (explaining many different phenomena, such as the retrograde motion of the planets, with the fewest assumptions) and various supporting observations (such as the phases of Venus, which rule out any strictly geocentric model), combined with a physical theory that grounded the model (Newton's laws), rendered any alternative implausible by the late 1600s at the latest. It would still be nearly 200 years before parallax was first detected.


> harmful and pathological

Don't forget "toxic" too!

