Hacker News | past | comments | ask | show | jobs | submit | pontifier's comments

Here's an actual simulation of the fusion reactor I'm working on: http://www.ddprofusion.com/simulation/

When I worked on autonomous vehicles I realized that if I really wanted the vehicle to end up where I intended, I had to model the actual processes involved. It wasn't actually that difficult. I generated a set of curves, each tracing the path the car would follow if, starting with the wheels pointed straight ahead, the steering wheel were turned at a reasonable rate to a fixed maximum angle, held there for some amount of time, then straightened again. The resulting curve wasn't a simple shape at all, but it didn't matter: it gave the most accurate results. Once the family of curves was generated, I could just select the one that best fit the situation.
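A minimal sketch of that curve generation, using a kinematic bicycle model (every parameter and name here is an illustrative assumption, not the original system's values):

```python
import math

def steering_curve(v=5.0, wheelbase=2.7, steer_rate=math.radians(20),
                   max_steer=math.radians(30), hold_time=1.0, dt=0.01):
    """Trace the path of a kinematic bicycle model as the front wheels
    ramp to max_steer, hold, then straighten again."""
    x = y = heading = steer = 0.0
    path = [(x, y)]
    phase = "ramp_up"
    t_hold = 0.0
    while phase != "done":
        if phase == "ramp_up":
            steer = min(steer + steer_rate * dt, max_steer)
            if steer >= max_steer:
                phase = "hold"
        elif phase == "hold":
            t_hold += dt
            if t_hold >= hold_time:
                phase = "ramp_down"
        else:  # ramp_down
            steer = max(steer - steer_rate * dt, 0.0)
            if steer <= 0.0:
                phase = "done"
        # kinematic bicycle model update
        heading += (v / wheelbase) * math.tan(steer) * dt
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        path.append((x, y))
    return path
```

Sweeping max_steer and hold_time over a grid would yield the family of curves; the planner then picks the member whose endpoint best matches the target.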

The unstructured-input attack surface problem is indeed troublesome. AI right now is a bit gullible, but as systems evolve they will become more robust. Then again, even humans are vulnerable to the inputs we're given.

We might be speed running memetic warfare here.

The Monty Python skit about the deadly joke might be more realistic than I thought. Defense against this deserves some serious contemplation.


What an amazing and strange effect! I'm not even going to attempt to evaluate the reasons or causes because it's brand new to me.

How wonderful to find something new like this!


What if we just aren't doing enough, and we need to use GAN-style techniques with LLMs?

We're at the "lol, ai cant draw hands right" stage with these hallucinations, but wait a couple years.


The thing that blows my mind is: say you start filling the plane with pi. Pi has been proven to contain every finite sequence. That means that somewhere in the plane is a full physics simulation of YOU in the room you are in right now.

Does that you exist any less fully because it's not currently in the memory of a computer being evaluated?


> Pi has been proven to contain every finite sequence

This has not been proven yet: https://math.stackexchange.com/a/216348/575868

(or more generally: https://en.wikipedia.org/wiki/Disjunctive_sequence)

Depending on the scheme used to fill the infinite grid, even these properties may not be sufficient to guarantee that every two-dimensional pattern appears: the grid is two-dimensional, but the number's property is "one-dimensional". A spiral filling, for example, might always line the digits up in a way such that certain 2D patterns are never generated.
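For concreteness, here's a sketch (in Python; the function names are mine) of such a spiral filling scheme, laying the digits of a 1D sequence along a square spiral, one digit per cell:

```python
from itertools import count

def spiral_coords(n):
    """First n cells of a square spiral starting at the origin:
    legs of length 1, 1, 2, 2, 3, 3, ... with a 90-degree turn after each."""
    x = y = 0
    dx, dy = 1, 0
    coords = [(0, 0)]
    if n <= 1:
        return coords[:n]
    for step in count(1):
        for _ in range(2):              # two legs per leg length
            for _ in range(step):
                x, y = x + dx, y + dy
                coords.append((x, y))
                if len(coords) == n:
                    return coords
            dx, dy = -dy, dx            # rotate 90 degrees counterclockwise

def fill_spiral(digits):
    """Map a digit string onto grid cells along the spiral."""
    return dict(zip(spiral_coords(len(digits)), digits))
```

Which 2D patterns appear in such a filling depends on how the 1D digit positions interleave along the spiral, which is exactly the worry here.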


Since it's not proven for pi, we'd have to take a more circuitous route to proving that every finite pattern occurs. Inspired by Champernowne's constant, I propose a "Pontifier Pattern" that is simple and inefficient, but provably contains every finite pattern.

Starting at the origin, mark off rows of squares. The Nth row would contain NxN^2 squares of size N x N. Each square would be filled in left-to-right reading order with successive binary numbers, most significant digit at the top left.

Somewhere in that pattern is the physics simulation of you reading this comment :)


minor correction: 2^(NxN) squares per row, right?


Yeah, what was I thinking?? I really need to slow down sometimes. With that fix, this should contain every finite pattern, right?
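With the corrected count (2^(N*N) squares of size N x N in row N), the construction can be sketched in Python (the function name is mine):

```python
def pontifier_row(n):
    """Row n of the pattern: all 2**(n*n) binary values, each written into
    an n x n square in reading order, most significant bit at the top left."""
    squares = []
    for value in range(2 ** (n * n)):
        bits = format(value, "0{}b".format(n * n))   # zero-padded, MSB first
        square = [[int(bits[r * n + c]) for c in range(n)]
                  for r in range(n)]
        squares.append(square)
    return squares
```

Row k enumerates every possible k x k binary square, so any finite pattern of size k x k appears verbatim somewhere in row k.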


Yes, sounds like it! Though I'm thinking that the relative arrangement of patterns would also make a difference. I wonder whether such a thing as "all (infinitely many) possible arrangements of all patterns" can exist.



Always a fun read :) They turned it into a Futurama episode.


My mother is currently dying in the hospital with breathing problems. I mentioned this to her earlier today... I thought this would have been much farther along than it is. Hurry up with the medical tech already!


I remember reading somewhere that because ternary computing is inherently reversible, ternary computations have a lower theoretical bound on energy usage from an information-theoretic point of view, and as such could be a way to bypass heat-dissipation problems in chips built with ultra-high density, large size, and high computational load.

I wasn't knowledgeable enough to evaluate that claim at the time, and I'm still not.


Here are a couple of sources that back up what I was talking about:

https://ieeexplore.ieee.org/document/9200021

https://en.wikipedia.org/wiki/Landauer%27s_principle
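As a back-of-envelope check of the Landauer limit those sources discuss (room temperature assumed; this is my own arithmetic, not from either source):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
e_bit = K_B * T * math.log(2)    # ~2.9e-21 J per bit erased
# The analogous bound for erasing one ternary digit (trit) is k_B * T * ln(3).
e_trit = K_B * T * math.log(3)   # ~4.6e-21 J per trit erased

# A trit carries log2(3) ~ 1.585 bits, so per bit of information erased the
# bound comes out the same; the savings claimed for reversible logic come
# from avoiding erasure altogether rather than from the radix.
print(e_bit, e_trit, e_trit / math.log2(3))
```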


I recently saw another video about a high accuracy 3d positioning stage. The differences and similarities were very interesting. For instance, both used rigid rods with ball joints for accuracy, but wildly different encoders and testing methods.

https://youtu.be/MgQbPdiuUTw?si=5r0DVsxVT1owyKk6

