I recall that Unicode obfuscators were popular for turning letters into similar-looking symbols to bypass filters/censors, back when forums/websites didn't filter Unicode and the filters were simple.
Leetspeak (https://en.wikipedia.org/wiki/Leet) and similar schemes obfuscate letters with numbers, punctuation, or symbols (or, later on, with Unicode lookalikes).
It's now experiencing a renaissance on YouTube et al. as a way to legitimately refer to terms like murder, suicide, etc., which would otherwise typically get a channel or user demonetized, banned, or buried by internal search engines.
Exactly. So many words are being "censored", as if "k*ll" does not make us think of "kill". I do not see how it helps or solves anything. It is absurd to me.
Crime is mostly individual and income-driven, so it's both not community-policed and inversely proportional to income level.
Laws against crime require police to enforce them effectively; a community cannot "reduce crime" without significant investment and focus on it (police are significantly more effective than vigilantes or citizen patrols).
Whatever this scheme tries to do, it is effectively collective punishment of a community for not allocating resources to policing itself.
The only thing that has passed the test of time so far is specificity: if you ask for multiple things or vague things, you receive half-baked answers trying to cover all bases. If you ask for one specific thing and describe it, answer quality goes up. E.g., LLMs creating multi-part content mix up the parts and their qualities, so asking specifically for "Part 1" will always get a better answer than "list all parts of X" (quality drops with the length of the list).
The macros are fine as a concept; I've used something similar before to reduce code size, e.g. defining hundreds of similar functions.
What is incomprehensible, and puts the entire thing into "Obfuscated C" territory, is the one-letter variables. You'll need to memorize all of them, and you can't reuse them in normal code. If the variables were at least self-descriptive I'd support such a coding style, but as it stands it clearly needs comments.
The only thing that could pop the bubble is an alternative architecture for inference that doesn't need GPU clusters and datacenters to compete within the ecosystem.
AI itself isn't going away anytime soon.
CorticalLabs uses mouse embryonic stem cells, but human cells might be more effective given the superiority of human neurons. Aside from the ethical problems with harvesting them, you could build a hyperscale cluster.