I really would like to see a cost and cooling breakdown. I just can't see how you can do radiative cooling on the scales required, not to mention hardening.
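For a rough sense of the scale problem, the Stefan-Boltzmann law gives the radiator area needed to reject a given heat load. The numbers below (1 MW load, 300 K radiator, emissivity 0.9, two-sided panels, no absorbed sunlight) are illustrative assumptions for a back-of-the-envelope sketch, not figures from any actual proposal:

```python
# Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law.
# Assumptions (illustrative only): the full IT load is rejected by
# radiators at a uniform 300 K, emissivity 0.9, radiating from both
# faces, ignoring absorbed sunlight / Earthshine (which only adds load).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w, temp_k, emissivity=0.9, sides=2):
    # P = sides * emissivity * sigma * A * T^4  ->  solve for A
    return power_w / (sides * emissivity * SIGMA * temp_k**4)

area = radiator_area_m2(1e6, 300.0)
print(f"{area:.0f} m^2 of two-sided radiator per megawatt")
```

Under those assumptions you land in the ballpark of 1,200 m² of deployed radiator per megawatt of IT load, before any margin, and running the radiators hotter to shrink them fights the chips' need to stay cool.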
I thought this was a troll by Elon; now I'm leaning towards not. I don't see how whatever you build wouldn't be dramatically faster and cheaper to do on land, even 100% grid-independent with solar and batteries. Even if the launch cost were just fuel, everything else that goes into putting data centers in space dwarfs the cost of 4x solar plus batteries.
I cannot really tell satire apart from genuine opinions anymore.
(But I do hope it was satire. If not: cooling satellites was/is a big issue, and they only generate very modest heat. A data center would be in a different ballpark entirely.)
To add insult to injury, the obvious path is for studios to switch from Adobe to ToonBoom... which already copied Adobe's playbook by going subscription-only last year.
For context, although Flash Player died a long time ago, the editor lived on in "offline" 2D animation workflows where the end result is rendered out to video. Lots of kids shows are still made with it, and at least some anime studios use it (e.g. Science SARU).
> Although the Flash Player app formally reached its end of life on December 31, 2020, Adobe has allowed a local Chinese company to continue distributing Flash inside China, where the application still remains a large part of the local IT ecosystem and is broadly used across both the public and private sectors.
Sounds like too many big institutional websites depend on Flash.
Wasn't "ChatGPT" itself only supposed to be a research/academic name, until it unexpectedly broke containment and they ended up having to roll with it? The naming was cursed from the start.
> What if you could put in all the inputs and it can simulate real world scenarios you can walk through to benefit mankind e.g disaster scenarios, events, plane crashes, traffic patterns.
This is only a useful premise if it can do any of those things accurately, as opposed to dreaming up something kinda plausible based on an amalgamation of every vaguely related YouTube video.
No, that's just an optimization that saved on computing resources. It effectively allows the party that runs this simulation to have a limited world to simulate. Dark matter is the other half of that trick. Both were invented by one Bebele Zropaxhodb after a particularly interesting party in the universe just above this one...
Isn't this still essentially "vibe simulation" inferred from videos? Surface-level visual realism is one thing, but expecting it to figure out the exact physical mechanics of sailing just by watching boats, and usefully abstract that into a gamified form, is another thing entirely.
Yeah, I have a whole lot of trouble imagining this replacing traditional video games any time soon; we already have very good, performant representations of how physics works, and games are tuned for the player to have an enjoyable experience.
There's obviously something insanely impressive about these Google experiments, and it certainly feels like there's some kind of use case for them somewhere, but I'm not sure exactly where they fit in.
Google has made it clear that Genie doesn't maintain an explicit 3D scene representation, so I don't think hooking in "assists" like that is on the table. Even if it were, the AI layer would still have to infer things like object weight, density, friction and linkages correctly. Garbage in, garbage out.
Google could try to build an actual 3D scene with AI, using meshes or metaballs or something. That would allow for more persistence, but I expect it makes the AI more brittle and limited, and because it doesn't really understand the rules for the 3D meshes it created, it doesn't know how to interact with them. It can only produce fluffy-mushy dream images.
Yes, with yabridge it works very well, for example with all the Valhalla reverb plugins. But then there are others, like FabFilter, that do not work so well. Luckily there are now native Linux FabFilter alternatives, like ToneBoosters EQ Pro, TAL EQ, ZL EQ, ...
They _all_ offer dynamic EQ, all the phase modes (linear, minimum and variants), frequency matching, collision detection, sidechaining, etc...
Absolutely comparable imho. And cheaper. ZL Equalizer is even open source!
I'll put TB Barricade up against Pro-L 2 and TB Reverb against Pro-R for sure. I mostly use other stuff for EQ and compression, but those two are really very similar to the FabFilter offerings.
Yabridge works, and it's frankly incredible that it works at all, but it has some trouble figuring out where I'm clicking on EZDrummer. It's gotten better in the latest version of Linux Mint but it's still a bit off.
You know what's even harder to cool?
> Orbital Data Centers