Cloud models will always have an edge over local models. Maybe by 2030 your iPhone will run a GPT-4-class model locally, but cloud GPT-9 will solve all your kids' homework, do 95% of your job, and manage your household.
Transformers still haven't hit their scaling limits with respect to the number of training tokens or the number of layers. Model size is now limited purely by finances ($100M per training run still seems excessive).
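To see why finances are the binding constraint, here is a rough back-of-envelope sketch using the standard ~6·N·D FLOPs approximation for training a dense transformer (N parameters, D tokens). The `flops_per_dollar` figure is an assumed cloud-GPU price point, not a quoted rate; adjust it to your hardware.

```python
def training_cost_usd(n_params: float, n_tokens: float,
                      flops_per_dollar: float = 1e17) -> float:
    """Estimate training cost from the ~6*N*D FLOPs rule of thumb.

    flops_per_dollar is an assumption (roughly an A100-class GPU at
    cloud rental prices); real costs vary with utilization and pricing.
    """
    total_flops = 6 * n_params * n_tokens
    return total_flops / flops_per_dollar

# A hypothetical 100B-parameter model trained on 2T tokens:
print(f"${training_cost_usd(100e9, 2e12):,.0f}")  # on the order of $10M
```

Under these assumptions, pushing to trillion-parameter models trained on tens of trillions of tokens lands in the hundreds of millions of dollars per run, which is exactly where budgets rather than architecture become the limit.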