I would say this is the main benefit of CUDA programming on GPU: you get to control local memory. Maybe Nvidia will bring it to the CPU now that they make CPUs.
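A minimal sketch of what that control looks like, assuming CUDA's `__shared__` qualifier for the per-block on-chip scratchpad (the kernel, its names, and the fixed 256-thread block size are illustrative, and it assumes `n` is a multiple of the block size):

```cuda
// Hypothetical kernel: stage a tile of global memory in fast
// on-chip shared memory, then reverse it within the block.
__global__ void reverse_block(const float *in, float *out, int n) {
    __shared__ float tile[256];        // explicitly managed on-chip memory
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        tile[threadIdx.x] = in[i];     // global -> shared (slow -> fast)
    __syncthreads();                   // wait until the whole tile is staged
    if (i < n)
        out[i] = tile[blockDim.x - 1 - threadIdx.x];  // reuse from fast memory
}
```

The point is the programmer decides what lives in the scratchpad and when it is synchronized, instead of relying on a hardware-managed cache.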
He did. His entire startup is about educational content. Nanochat is way better than llama / qwen as an educational tool. Though it is still missing the vision module.
I think the people who use more than they pay for vastly outnumber those who pay for more than they use. It takes intention to sign up (not the default, like health care) and once you do, you quickly get in the habit of using it.
This move feels poorly timed. Their latest ad campaigns are about not having ads, and the goodwill they'd earned lately in my book was just decimated by this. I'm sure I'm not the only one who's still just dipping their toes into the AI pool, and I'm very much a user who underutilizes what I pay for because of that. I have several clients who are scrambling to get on board with cowork. Eliminating API usage for subscription members right before a potentially large wave of turnover not only chills that motivation, it signals a lack of faith in their marketing, which, from my POV, put out the only AI Super Bowl campaign to escape virtually unscathed.
> the goodwill they'd earned lately in my book was just decimated by this
That sounds absurd to me. Committing to not building in advertising is very important and fundamental to me. Asking people who pay for a personal subscription, rather than paying per API call, to use that subscription themselves sounds to me like it is just clarifying the social compact that was already implied.
I WANT to be able to pay a subscription price. Rather like the way I pay for my internet connectivity with a fixed monthly bill. If I had to pay per packet transmitted, I would have to stop and think about it every time I decided to download a large file or watch a movie. Sure, someone with extremely heavy usage might not be able to use a normal consumer internet subscription; but it works fine for my personal use. I like having the option for my AI usage to operate the same way.
The problem with fixed subscriptions in this model is that the service has an actual consumption cost. For something like internet service, the cost is primarily maintenance, unless the infrastructure is being expanded. But using LLMs is more like using water, where the more you use it, the greater the depletion of a resource (electricity in this case, which is likely being produced with fossil fuel which has to be sourced and transported, etc). Anthropic et al would be setting themselves up for a fall if they allow wholesale use at a fixed price.
They probably have an authorized mechanism for tender offers. Too much hassle to do it outside of that… also, most of their employees' shares probably haven't vested yet. The early employees are already billionaires and don't care to sell, and can probably just use their shares as collateral for loans… you can try a salesperson from one of the secondary platforms like Forge, EquityZen, or Hiive, or one of the syndicators on AngelList who has done past SPVs on Anthropic. Tell them you want $5M plus and give them a bid…
I think if you took a buddy with you to the driver's license test in America and asked your buddy these questions during the test, you and your buddy would both fail. Unless the test was in India over tea and not in a car.
I guess you're saying that because a Waymo car can't walk into the DMV and get a license, it shouldn't be on the road? (Which of course it can't: you need a legal human identity to get a normal driver's license, and we don't currently grant cars personhood.)
Driver's licenses are legal constructs. The DMV certifies self-driving cars as fit for the road through a different process, and sure, those two processes differ.
I really don't get the point you're trying to make here.