
Cloud models will always have an edge over local models. Maybe in 2030 your iPhone will run GPT-4 locally, but cloud GPT-9 will solve all your kids' homework, do 95% of your job, and manage your household.


I'm pretty sure models have already reached their endgame. The law of diminishing returns is now in play.


Transformers still haven't hit their scaling limits with respect to the number of tokens trained on or the number of layers. Model size limits are now set purely by finances ($100M per training run still seems excessive).
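The financial limit mentioned above can be roughly sanity-checked with the common back-of-the-envelope rule that training compute is about 6 × parameters × tokens FLOPs. A minimal sketch, assuming hypothetical hardware numbers (312 TFLOP/s peak per GPU, 40% utilization, $2 per GPU-hour) chosen purely for illustration:

```python
def training_cost_usd(n_params, n_tokens,
                      peak_flops_per_sec=312e12,  # assumed per-GPU peak throughput
                      utilization=0.4,            # assumed fraction of peak achieved
                      usd_per_gpu_hour=2.0):      # assumed rental price
    """Rough training-run cost estimate using the ~6*N*D FLOPs rule of thumb."""
    total_flops = 6 * n_params * n_tokens
    effective_flops_per_sec = peak_flops_per_sec * utilization
    gpu_hours = total_flops / effective_flops_per_sec / 3600
    return gpu_hours * usd_per_gpu_hour

# A 70B-parameter model on 1.4T tokens lands in the low millions of dollars
# under these assumptions, far below the $100M figure discussed above.
cost = training_cost_usd(70e9, 1.4e12)
print(f"${cost:,.0f}")
```

Under these (illustrative) assumptions, reaching a $100M run implies scaling parameters and tokens well beyond current frontier models, which is the point about limits being financial rather than architectural.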


That is true. Those financial limits might change, however.



