I participated in the beta and was pleasantly surprised by how the entire experience came together. I only ran into one small technical issue. The game runs in real time on Google's servers, so if you have some lag on your network, the client simply drops all the frames during that time. I expected behavior more like YouTube, where you see a small loader and the game then resumes from where it was before the lag started. This was specifically a problem when playing Assassin's Creed: Odyssey, because in cutscenes the characters would say something like "Go to the * lag * and meet * lag *" and then I had no idea what I had to do next. Lag was still rare during my experience; it's just that when it occurred the timing was unfortunate.
How do you picture this hypothetically working on the server side? Naively buffering would make the input stream desync from the output stream. "Input prediction" works well enough when a game supports it, but lag would reappear for titles that don't. A general solution (for games with no input prediction) would require you to run the game in an x86 emulator with save-state + rewind support, such that network stutter could be translated into micro-rewinds of the game's VM; and even that wouldn't work if the game were multiplayer.
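A minimal sketch of what that rewind idea could look like, assuming a hypothetical emulator interface with save_state/load_state/step (none of this is a real API):

```python
# Hypothetical "micro-rewind" server: keep a short ring buffer of save-states
# and roll the emulated game back when the client reports a stall.
from collections import deque

FPS = 60
HISTORY_SECONDS = 2

class MicroRewindServer:
    def __init__(self, emulator):
        self.emu = emulator
        self.history = deque(maxlen=FPS * HISTORY_SECONDS)  # ~2s of save-states

    def tick(self, inputs):
        """Advance one frame, remembering the state we started from."""
        self.history.append(self.emu.save_state())
        return self.emu.step(inputs)  # returns the rendered frame

    def on_client_stall(self, dropped_frames):
        """Client dropped N frames: rewind the VM by roughly that many."""
        rewind = min(dropped_frames, len(self.history))
        if rewind == 0:
            return
        for _ in range(rewind - 1):
            self.history.pop()
        self.emu.load_state(self.history.pop())
```

Even this sketch falls over for multiplayer, as you say, since you can't rewind the other players.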
I could see games being able to signal "there is no input during this cutscene, so it can be buffered at the client." But for everything besides cutscenes, game streaming essentially has to work like a VoIP call (hard-realtime), doesn't it?
For a tangential example, Zoom increases the playback rate after a cutout to catch back up. I can tell when someone jumps to, say, 1.25x, but they still sound okay and you usually don't lose much this way.
It happens occasionally when I drive (bad LTE zones). On bad days (i.e., if I'm tethering at a hotel to join a conference, or using airport wifi), I sometimes hear "robot voice" as the tool attempts to deal with signal attenuation.
This may not work as well in video games, but I would like the option.
> But for everything besides cutscenes, game streaming essentially has to work like a VoIP call (hard-realtime), doesn't it?
I imagine most music, and some during-gameplay dialogue, could be buffered at the client. I'd expect sound effects to be the only audio that needs to be super low latency. Maybe certain dialogue too, if the speaker is visible on screen.
This alone might alleviate a lot of the GP's problems, since most audio wouldn't cut out. That can make a huge perceptual difference.
I think you pretty much explained the solution: you need buffering plus real-time VM suspend/resume. I was thinking the client could keep sending back the last frame it actually displayed, and if a hiccup is detected the server stops the VM's clock, resuming the clock as soon as the buffer is empty. This may lead the user to see a loader for a short time.
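Roughly this kind of loop on the server, I imagine; the vm/encoder/client objects and the frame-ack mechanism are all made up here, just to illustrate the idea:

```python
# Sketch of "pause the VM when the client falls behind, resume when it catches up".
import time

MAX_UNACKED_FRAMES = 6  # ~100 ms at 60 fps before we treat it as a hiccup

def stream_loop(vm, encoder, client):
    sent = 0
    paused = False
    while client.connected():
        acked = client.latest_acked_frame()   # last frame the client displayed
        behind = sent - acked
        if behind > MAX_UNACKED_FRAMES and not paused:
            vm.pause()                        # stop the game clock
            paused = True
        elif behind <= 1 and paused:
            vm.resume()                       # backlog cleared, carry on
            paused = False
        if not paused:
            frame = vm.render_next_frame()
            client.send(encoder.encode(frame))
            sent += 1
        else:
            time.sleep(0.005)                 # wait for the client to drain its buffer
```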
When you're talking to someone on Skype and you notice the other person has stopped moving, you stop talking.
They may already be doing something similar (halt rendering on congestion) but when this kind of lag happened to me it felt like the rendering had continued without pause on their end.
It’s the same experience with all the other forerunners in the field.
In essence, you need a fat cable to a datacenter very nearby for shooters/fast action games.
Regular wi-fi is more or less sufficient for everything less (you might get video degradation from time to time).
OnLive was the first and it was just insanely awesome. Too bad it was shuttered.
I also tried LiquidSky, and it was insanely awesome. At one point you ended up with a virtual PC where you could install and play any game from your Steam library. Too bad they've changed the service and monetisation a few times, and it's unclear if they are still alive.
I was also a participant. It was pretty cool. I ran Wireshark on it and they are using Google's QUIC protocol.
In my experience the game would drop graphics quality before I would experience input lag. There were a handful of times that I did experience input lag. This was on a wired connection, 100 Mbps down / 20 Mbps up, through Xfinity.
I did notice in the Google demo that when the presenter was using the gamepad, he experienced input lag when trying to jump up onto the steeple on top of the building.
Hmm, I think I like those design choices. A dropped frame or two isn't a big deal in my experience, while a delayed input is a far bigger detriment to the gaming experience. QUIC (and UDP in general) seems about right for the technology backend.
I'm still of the opinion that Google shot themselves in the foot here by having a bunch of wireless controllers in one room. It's like they've never talked to a Super Smash Bros. Brawl tournament organizer before: Wiimotes over 2.4GHz Bluetooth have dropped-packet / dropped-input issues once you get to ~20+ participants.
Don't do mass wireless in one room. It always ends poorly. I'd expect that local wireless in a typical living-room setting would be a better experience actually. After all: the major issue is whether or not the wired-connection / fiber backbone of the typical city is up to spec for this kind of thing. (A typical living room user probably doesn't have to worry about clogged 2.4GHz connections unless they're in an apartment I guess...)
So basically the same issues as onlive then? I remember thinking that it was pretty good for a while. Then I tried the same game on a decent PC immediately after using the service.
It was night and day, both in terms of latency and graphics quality. I don't have high hopes for this service.
I feel similarly. I've tried in-home streaming from PS4 to a MacBook— obviously the encoding of the video signal isn't going to be as optimized or speedy as Google's backend, but still, the overall ping is sub-10ms and it's still a way worse experience than playing the same game on the TV, even for relatively slow paced action-adventure titles like HZD or RDR2.
I can't imagine trying to play something like a fighting game this way.
It worked great, in my experience. The most awesome thing about it was playing Dirt Rally 3 I believe, on a Motorola Droid/Milestone. A full PC game on a smartphone, pretty mind blowing at the time, and it was quite good even over 3G.
Yes, that seems about right. I would say lag occurs once every 5 mins and lasts for roughly a second. Not longer than that. But since gaming is such a visual medium, that 1 second seems longer than it actually is. One thing I forgot to mention is that I was on wifi, and the performance may have been better had I plugged in an ethernet cable. Although I would add that wifi is what the typical user is going to opt for.
I used to get these, every 5 minutes, right on the clock. Turns out, OS X location services rescan wifi every 5 minutes, God knows why. Turn it off, lag gone.
> Your approximate location is determined using information from local Wi-Fi networks, and is collected by Location Services in a manner that doesn’t personally identify you.
> I would say lag occurs once every 5 mins and lasts for roughly a second.
As someone who plays through DOOM on Ultra Nightmare, this kills the game. There are times when, if I were to miss even 10 or 15 frames, there would be a good chance that would be the end of my run.
Hello fellow Doom Nightmare player. I think Stadia is not for folks like us who have tasted the full experience on high-end rigs.
It will instead, expand the audience to a lot of casuals who just want a little demon-slaying power-fantasy on a low difficulty. Nothing wrong with that, and it is a damn big addressable audience. Just like how mobile didn't "kill" any other platform, it just added a whole lot of candy crushers.
In my experience, 2.4GHz is congested. An entire room of wireless controllers at 2.4GHz (e.g., Super Smash Bros. setups with Wiimotes) would cause these "dropped packet" issues, even with local WiFi / wireless. The 2.4GHz band is shared by Bluetooth, 802.11n, and many other protocols.
If this were an entire room of WiFi controllers hitting the same WiFi frequencies all at the same time, it could have been a local problem.
All in all, we can't read too much into the demo-conditions. There are too many variables at play here.
No lag at all on my end. I am in the USA, so I have a solid but not amazing connection.
I should note that I have only played for half an hour because AssCreed was extremely boring. The tech itself was very solid in my experience.
I tried to catch some streaming artifacts, like parts of the screen not updating or a visible lag, but I could not see anything. In a blind test, I doubt I would have been able to differentiate this from running the game on my home console.
I'll second that. Standard Xfinity cable connection, decent PC, and wired ethernet to the router. Occasionally in the evening during prime time, when I assume everybody in my building was watching Netflix, the stream would kick into lower quality graphics, but it would recover quickly. No control input lag that I could detect. I put about 30 hours into the game and enjoyed myself (awesome environment design and a vast, vast map to explore; all that fighting is repetitive and tedious, but story mode and running around exploring, taking in the views and the history, is quite fun).
In any event I think that if you are a casual gamer and want to get off the graphics card upgrade train every couple of years, then this is a no brainer. It probably would not be a good match for multiplayer shooters, but for solo games, even AAA titles, it's a great option.
My feedback was that if they paired up with Steam they'd have a killer product. They might still have a killer product if they convince all those developers on Steam to also put their games on this service.
The artifacts were very visible to me when rotating the camera quickly or during fast movement. If the camera isn't moving very much I think it is easier to get a high quality picture at a standard bit rate.
> This was specifically a problem when playing Assassin's Creed: Odyssey, because in cutscenes the characters would say something like "Go to the * lag * and meet * lag *" and then I had no idea what I had to do next.
That's annoying.
Now, as a technicality specific to ACO, the goals should still appear in the mission's description, no matter whether you hear the NPCs stating those goals or not.
Unrealistic for what seems like obvious reasons; no one would use the service if it was lagging enough to make that time profitable by way of ads. If your comment was tongue-in-cheek, which I hope it was, I'd urge you to contribute more substance and originality in these comments. After all, this isn't Reddit.
Do you think these things are not considered? There is an army of people at Google trying to figure out how to show us more relevant ads. It's the perfect scenario for upselling.
Slow connection speed? Run these tests, etc. Or switch to a faster, more secure browser. Sound familiar?
Rewrote this a few times. Overall I think this would be really bad for the gaming world. Not buying consoles would be nice, but the natural business model will be ads, pay per hour, or pay per MB. I think this is going to further drive the industry toward mass-multiplayer and grindy games. More accessible, but worse content.
Indie shops would probably struggle even more under this model.
The cost per user will vary drastically by the amount they play and how computationally demanding it is. I just don’t see it being feasible without personalized cost. Looking forward to paying minimum cost for minimum graphics too.
I disagree. The natural business model will be a mix between subscription services and free to play games with micro transactions. Microsoft (and others like EA) is already moving in that direction with game pass (monthly subscription gets you all first party games plus partner games).
The biggest concern is going to be internet availability and data caps. I tried the beta for Project Stream (now Stadia) this past winter. It worked well and was impressive, but I have a good internet connection with a 1TB cap. I have friends with much worse speeds and harsher caps and I am not sure if this would be viable for them.
I am more concerned about 'physical' gaming being phased out. I doubt Stadia will do this soon, but I like building a new computer every 4 years for playing games. Maybe it's something I won't actually miss (like how I don't miss CDs for music), but it remains to be seen.
For some people, the grind is the main attraction, especially if it's well balanced. Look at Path of Exile, that's one hell of a grindy game and people absolutely adore it.
Or worse, a hybrid like Assassin's Creed Odyssey: a $60+ game that gets super grindy, but reminds you that you can opt into micro-payments to get a little experience or coin boost...
Morrowind was way more grindy than Odyssey (take 50 steps, a monster attacks you, repeat until the end of the game). Odyssey just made all enemies fall within +/-2 levels of your own character, unlike Origins, where levels were preset for each area.
While I am not a fan of many F2P games, their popularity and impact are undeniable.
The trap that you should avoid falling into is assuming that's the only way games can thrive. F2P games are huge and dominate the conversation, but I think indie games are the best they have ever been. My favorite game last year was Into The Breach, and by all accounts it sold well in the same market that Fortnite dominated in.
Games are evolving in weird ways, but it's in a multifaceted and diverse way.
Eh, probably to some degree, but at least anecdotally for me, "new game from FTL dev" got me to watch the trailer, but the concept is 100% what hooked me.
Oh, for sure, I feel exactly the same way. ITB is one of my favorite games in my Steam library, and that's the same way I discovered it. The argument I was trying to make is that a lot of people wouldn't have had the opportunity to be hooked had it not been for the "new game from FTL dev" angle.
FTL was a Kickstarted game from devs with zero pedigree (to my knowledge) and it was also an indie hit. While I agree Into The Breach owes some of its success to the devs now being established, I don't think that negates my point about this being a time when games of all types can thrive.
Game Pass works because you're essentially temporarily unlocking a library of games to be downloaded to your device. It really isn't an indicator they're moving in the same direction as Stadia.
It is kind of another beast altogether for Google, Shadow, Microsoft (xCloud), etc. to dedicate actual hardware to your usage. Shadow and others like it are essentially a remote VM with a GPU that you rent by the hour. For Stadia, xCloud, etc., we don't know what the business model is going to look like. The only positive here is that Microsoft, Google, and Amazon are all cloud infrastructure companies so in theory, they could price their offerings cheaper than a company that is a tenant on their systems.
It's not just Game Pass though. Microsoft has flat out said they are moving towards streaming. They are also doing things like making Xbox Live a service on the Nintendo Switch, iOS, and Android. Game Pass the way it is today is just an indicator of how things will be.
Microsoft has made clear that they want Xbox to be a platform independent of the physical box they sell to people. They are 100% going in the direction of Stadia. The big question is how they are going to do it in regards to their next console.
They have announced xCloud, yes, but have they announced the business model/pricing? So far, pretty sure the answer is no. So it may be a subscription, but it may also end up being akin to renting a VM with GPU in Azure. Time will tell.
> The only positive here is that Microsoft, Google, and Amazon are all cloud infrastructure companies so in theory, they could price their offerings cheaper than a company that is a tenant on their systems.
That would be considered unfair competition and they would probably be fined by the EU, I guess.
I agree with your comments. I'd also add Stadia could operate a traditional platform/store model as well. It'd be a big ask for big publishers to launch their huge-budget flagship content straight into a content bundle. But you could have them sell access to people with Stadia accounts same as they sell physical/digital copies to Xbox/PS/PC owners.
Imagine simply moving all hardware support issues "in-house" or at least trading in all of the pimply teenagers with hardware issues for a single team of engineers working for one of the smartest companies in the world.
The vast majority of your support department can be let go.
Undoubtedly, I would bet on the longevity of a streaming service from Valve or Amazon more than I would one from Google. But Google's history of killing free side services doesn't offer the most relevant precedent. This is a service aimed at a mainstream (and growing) industry that gels with Google's strengths, particularly scalability and AI. YouTube Premium/TV would be a better precedent, though the jury is still out on those. I think there's still a difference, though, in that YT Premium/TV entered a market with already dominant leaders (traditional cable, Netflix, free YouTube). If Stadia can satisfactorily mitigate the problems inherent to streaming, and get it out before Microsoft's rumored Xbox development, it will be in a prominent position in a new field for gaming.
Why on earth would it be pay per hour or per MB? Does Netflix charge that way? How about Spotify?
Time has shown that the best revenue model for these kinds of services is finding a monthly price, or multiple tiers of a monthly price, that adequately cover the costs incurred from heavy users and light users alike.
For end users, I agree. It’s also interesting to consider how content providers will be paid: per user who plays, per user-hour, a flat catalog fee, etc., and what downstream effects this will have on games.
Netflix's and Spotify's cost structures are different. Streaming static content is cheap; rendering a high-end video game gets pricey because you need GPU time. There are a lot of other considerations too. For example, a lot of people will pause a stream and let it sit for a few hours, and that's fine. What if people want to leave a game running for a few hours? The computational burden is still high.
I'm not so sure. To play devil's advocate, this may make games cheaper even without ads. Developers want to not worry about disparate hardware, Google wants you in their ecosystem. Buying google devices, using google services, etc. Content creators want this for better interaction with fans.
> Developers want to not worry about disparate hardware
Most games would be developed for Playstation + XBox + PC + Cloud service to maximise audience (unless one cloud provider gains a monopoly, something so far nobody managed to do in either game platforms or cloud computing).
Of course you can decide to develop only for the cloud platform of your choice, but that's nothing new. Microsoft and Sony already pay you good money to make your game exclusive to their platform (if you're lucky, even if you're indie).
I would not be surprised if Google injected ads into the games as soon as their service reached critical mass, killing immersion.
Maybe I'm super negative here but I don't see any pros to this development. Today's games are already dumbed down and steered towards profits only. The only games I play these days are indie games made by 1 person up to a handful of people.
We really don't need more but we need better. (This applies to many other areas too)
I don't understand the arguments being made above.
Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.
Similarly, monthly subscription where you get access to a list of games actually helps indie games. Being able to jump into an indie game you already own within seconds, compared to having to buy and install a small game you've never heard of before.
> Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.
Pay-per-hour means developers have an incentive to build addictive games that keep you just engaged enough to keep playing for an extended time.
> Similarly, monthly subscription where you get access to a list of games actually helps indie games. Being able to jump into an indie game you already own within seconds, compared to having to buy and install a small game you've never heard of before.
Monthly subscriptions mean that the platform has to choose what games to include and promote based on what is most likely to make users find value in the platform—which means focussing on those with widest appeal, unless their recommender engine can get enough signal to reliably predict niche interest.
I don’t think this is true. It assumes people are economically rational and trying to optimize fun per dollar. But people already should be valuing their time and yet they don’t. If developers must now guarantee players not just try their game but play it for x hours to make a profit, I don’t think they will be pushing for shorter better games.
> Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.
I'm not sure about that. Currently tons of people pay monthly for grindy MMO games. Expanding that sort of revenue scheme to single player games would encourage devs to create more Skinner boxes to keep players "engaged" over the long run.
Shorter games don't happen. We saw this with Steam when they added a no-questions-asked refund policy for short games. Devs get absolutely punished by making short games now so we're going to see them get longer, and the same will hold if Stadia pays by the hour. If Stadia pays a fixed amount per game we will probably see a proliferation of short games designed to bait people into playing.
> Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.
Or it ties money directly to length, so you need your game to be 60+ hours to justify its existence. Or people won't pay that much, so we don't get any new Dark Souls or RDR2-length games.
I'm struggling to understand how fewer casual gamers buying consoles would impact indie developers. Could you expand on your point?
Indie developers don't get a cut of console sales, but they do care about how many people buy/play their games, and if this widens the market it seems like it'd be a win for them to me.
Most indie games are short. If money is allocated to content producers based on play time, I would bet that most would earn less than a typical indie game purchase price per user.
That’s just speculation mind you. Things like Stardew Valley would probably do even better.
Nowhere did it say it's per MB or hour or whatever. Yet it's the top comment here. It's just fear mongering and hearsay on your part when most likely it will be a fixed monthly price.
It’s Google, which is notorious for smothering virtually everything unrelated to search or ads in the crib, especially when it’s something they made in-house. Add to that the fact that streaming is problematic for the two most popular genres by far: huge multiplayer shooters and sports. I see almost no chance of this succeeding long-term, and I’d put the odds of it going EOL within 5 years at over 80%.
These huge companies will keep throwing money at streaming because they see gold at the end of the rainbow, but I’ve seen no indication that they have a plan without giant question marks before the “profit” step. Look at YouTube: Google acquired it in 2006, it is dominant in its space, and it still doesn’t make them money. Yet somehow games, which are more finicky and reliant on universally good internet connections, are going to work for them?
This is trend-chasing, once again without any real new ideas to overcome existing challenges.
> It’s Google, which is notorious for smothering virtually everything unrelated to search or ads in the crib.
It's apparently deeply tied to YouTube, particularly to provide an on-ramp for more (often ad-opportunity-generating) gaming content on YouTube, a service that is both an ad platform and the venue for at least two distinct revenue-generating premium services (the older of which is ad-free), and one that Google has not strangled in the crib.
There's the major difference that you can't "pirate" a single-player game that is only available remotely, and that you can access such games remotely from any hardware.
So it could result in a renaissance of the classic AAA 3D single-player games, perhaps.
Writing "ps4 piracy" and "xbox one piracy" into google is a whole lot shorter than your comment. Then again, it doesn't give karma like a cool retort would.
It's already failed multiple times. OnLive tried & failed. Nvidia's has been in beta for years. Sony has one.
Everyone in this area has tried this. Nobody has seen what could be described as "success", and the cost models so far have been ludicrous. It turns out renting Xeons and Radeon Instincts in a professionally staffed, maintained datacenter is way, way more expensive than a handful of consumer chips in a box in the living room with nobody on-call to monitor it.
The GPU here looks to be basically a slightly cut-down AMD MI25. That'd make a single GPU in this Stadia cloud gaming service cost more than 10 Xbox One Xs. How do you make that price-competitive?
A big difference would be that OnLive had space in 5 colo datacenters in the US. Google has 19 full datacenters around the world and is building more. Plus, Google has its own very large fiber network reaching different POPs and ISPs around the world. The fiber backbone gives them lower, more predictable latency, compared to multiple upstream ISPs with different connections, issues, etc.
On the other hand, everyone on the east coast (therefore using east coast edge nodes) will be playing from 8-11 EST when the new Wolfenstein game comes out, so how is stuff rationed? Do you make people queue until there is a node close enough to them available? Do you sell the spare GPUs to people in GCP to use for their compute on off times to make up the cash? Do you make it $40 per month?
I think this comment is super underrated. If America is asleep, you can't really use that capacity for players in Europe, since the latency would increase. Likewise if Europe is over capacity, you can't really just assign players to a US server.
And (while I realize you're oversimplifying for the sake of example), it's not just per-continent in this case, but something more akin to per-metro-area.
Yeah, if you know where to look, they left clues about using MI25 hardware. (I haven't been an employee for years, this all unfolded afterwards and, ironically, it is just one search away.)
I'm sure they got bulk/promotional pricing from AMD, plus they're very good at both running hardware with low overhead and packing it efficiently.
> plus they're very good at both running hardware with low overhead and packing it efficiently.
You can't really pack the hardware here since it's latency sensitive. It's straight dedicated resources to an array of VMs. Dedicated CPU cache, even, hence the odd 9.5MB L2+L3 number.
Bulk pricing only gets you so far here. You're still talking gear that's categorically way more expensive than similar performance consumer parts. Not to mention all the other costs in play - data center, power, IT staff, etc...
You can't do time slicing, no, but you can definitely reduce time to first frame in many ways. If you don't do that, you need to provision even more hardware. Packing is also part of the capacity planning phases of a service.
The other costs (power, people, etc.) are amortized over Google's array of services.
Last but not least, it would be very dumb of them not to run batch workloads on these machines when the gaming service is idle. I bet $1000 these puppies run under Borg.
> The other costs (power, people, etc.) are amortized over Google's array of services.
Power doesn't really amortize, and neither does heat.
And capacity still had to increase for this. They didn't just find random GPUs under the table they forgot about, and now that they have a massive fleet of GPUs it's not suddenly going to start handling dremel queries.
This all still costs money. A shitload of it. Someone is going to pay that bill. More ads in YouTube won't really fund gaming sessions. So will this be ad breaks in the game? No way that's cost-effective for the resources used. Straight-subscription model? This seems most likely, but how much and how will you get people to pay for it?
Maybe it wasn't AMD, but they already had a massive fleet of GPUs. It wasn't running Dremel, either. Or maybe they found a way to do that, too, I don't know, but there are already enough workloads at Google to keep GPUs well fed.
I know from experience that Google is very cheap. You tell Urs you saved a million dollars and he'll ask you why you didn't save two. Or five.
If this takes off, the pricing of the service will pay for the hardware (assuming they did a reasonable job there of baking it in). Even if it doesn't, organic growth from other, much larger Google services can make use of the idle hardware.
For the record, I was involved in a couple of projects that required a lot of new hardware. One of them even ended up saving the company a lot of money in a very lucky, definitely unintended way.
>They didn't just find random GPUs under the table they forgot about, and now that they have a massive fleet of GPUs it's not suddenly going to start handling dremel queries.
This strikes me as rather amusing. Google was having such trouble getting their hands on enough GPUs that they decided to build custom hardware accelerators (TPUs) to fill the gaps.
It'd be a Vega 56 basis, not Vega 64, but the problem is that "double the RAM" part.
HBM2 memory is super expensive. Like, rumor has it 16GB of HBM2 costs around $320. Toss in anything custom here and there's zero chance this is under $600/GPU.
Even in the hotly contested consumer market, the 16GB HBM2 Radeon VII is $700. And that doesn't have any high-speed interconnects to allow for sharing memory with the CPU or other GPUs.
They have TPUs for their AI stuff, and you still have to dedicate these resources while gaming sessions are active. How much monetary value can they really get out of the idle population here to offset the active usage?
You underestimate how much Google tries to squeeze out of all the machines in its fleet. That includes old ones, sometimes to comical effect.
A colleague at my current job told me about utilization targets at Amazon, where he used to work. At Google you could choose to be that wasteful if you really wanted to, but you'd lose headcount. Be more efficient and you'd get more engineers. I.e. you decide if you'd rather get machines or people.
There's also an old paper by Murray Stokely and co. about the fake market that was created to make the most use of all hardware planetwide.
There's so much that this will hurt in the gaming industry. Some of my thoughts:
1. As subscription services take over, the upfront revenue game studios see will drop. This is just simple math: Xbox Game Pass costs $10/month, which means six months of it equals the total cost of one AAA game. In a traditional model, many gamers would expect to buy, let's say, 2 AAA games per year. In this new model, I can play as many as I want. And even if I only play 1 or 2 every year, I'm almost definitely going to be "dabbling" in the collection for other games I may want to play, ESPECIALLY if they're instant-on like Stadia. Even if they pay out to studios based on some metric derived from time spent in game, there's no way studios will get the same level of income as they did before. (Note: this is exactly why Spotify is having such a hard time, and why they're branching out beyond music. Royalties abstracted behind a subscription service suck for the bottom line.)
2. So upfront revenue drops. How do studios make that up? In-game transactions. They're already huge, and they'll just keep getting bigger.
3. So what, micro transactions (mtx) are the "new normal". Well, the top 5% of games can afford that decreased upfront revenue by making it up in mtx (think: Fortnite, Apex Legends, CoD). The trailing 95% can't (think: Indie titles).
4. Beyond that, you can bet your bottom dollar that Stadia will pay out tons to the AAA studios just to get their names on the platform, given that Google has no first-party studios to speak of. Assassin's Creed gets enough upfront revenue to make it worth their while, meanwhile the next indie darling is left out to dry, further balkanizing the gaming industry.
5. Switching gears: A massive number of software engineers in the industry entered it because of gaming. Games, even back in the 90s, were such a clear application and value of computers that it was obvious, even to children, that they'd be something huge. It inspired a generation, to not only play, but to mod and even make their own. Now, we're moving that all off into the cloud, hidden from the next generation. Google wants you to own a Chromebook and consume their products, not understand how they work.
6. Speaking of modding: it's literally the source of the world's most popular games. Battle Royale? You can trace its roots back to mods for ARMA and Minecraft. MOBA? Dota, a mod for Warcraft III. Creativity happens in environments that large corporations can't recreate, and traditionally a great platform has been starting with a base game, a great game, that some studio created, then exerting your creativity on top of that platform. It benefited everyone, including large AAA studios who could then copy your idea and make millions. Yeah, good luck modding on a blackboxed server a hundred miles away.
7. But fine. I guess we're moving into the future and this is part of it. Except, there are millions, even BILLIONS, of people around the world without the internet capability to even join this service. Google tried to help solve this with Fiber, and gave up. It's fucking hard. They'd rather do easy, cool things, like cloud gaming. Modern consoles are bad enough; my brother, who lives just an hour outside of a top-10 US city, recently told me that he downloaded Fortnite on Xbox for the kids. It took a week of 24/7 downloading. Most games drop with day-1 patches in the dozens of gigabytes, even if you buy the disc in stores. The sheer arrogance of Google, to get up on stage and claim this service is gaming for EVERYONE, the apex of accessibility, is disgusting to me. They're stuck so far up their own ass they've become the ouroboros.
8. Well, streaming games have taken over the world. Let's say you want to compete with Google on this streaming game front. All of the top three cloud providers now want to get into game streaming. So, no way can you compete with them on cost; they own the data centers and give their game streaming divisions nice fat discounts. They all have the pockets to design nice custom silicon with AMD specialized for the task. And, oh by the way, all the latest games are now optimized for this silicon (whether it's the custom AMD chips in the PS4, or XB1, or whatever cloud streaming service we're talking about, they're all custom). You're stuck with off-the-shelf cards. Ha! Nvidia won't let you deploy their cards in a datacenter [1], because they ALSO want in on this big cash pile they've all convinced themselves exists. So, basically, good luck. The world has balkanized, and penetrating it becomes harder every year.
I hate this. I hate it so much. The only saving grace is that it is inevitable that this will fail to realize the results Google wants, and they'll pull the plug. And maybe the rest of the industry is smart enough to recognize how short-sighted a streaming-first/subscription-first strategy is, for literally everyone involved except the people who rent the metal.
I want to go back to the 2000s. This new world sucks.
Honestly, none of this even matters until they can offer a service with input latency comparable to local play, and so far every indication is that they can't. The whole service falls apart if it feels like shit to play.
Unless they've solved the speed-of-light issue, we don't need to wait for reviews. Even using a remote desktop on the other side of the city isn't a pleasant thing to be doing full time, and that's not remotely as twitch-based as gaming.
Google Cloud has regional US DCs in western Iowa and central South Carolina. A midpoint between those two locations roughly lands on Nashville TN, which is ~600 miles away from either. Light could make a roundtrip of that distance in 6ms. Of course, the internet doesn't allow for latency at the speed of light, but that's the physical limit, and that's plenty; a typical internet browser alone has input lag of 10ms [1]. In order to achieve 60fps, frames have 16ms to be rendered.
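For reference, the back-of-the-envelope math behind those numbers (light in fiber travels at roughly two-thirds of c, so the practical floor is a bit higher):

```python
# Quick sanity check on the ~6 ms round-trip figure and the 60 fps frame budget.
MILES_TO_KM = 1.609344
C_VACUUM_KM_S = 299_792                 # speed of light in vacuum
C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3    # light in fiber: roughly 2/3 c

round_trip_km = 2 * 600 * MILES_TO_KM   # ~1931 km there and back

print(round_trip_km / C_VACUUM_KM_S * 1000)  # ~6.4 ms: the physical limit
print(round_trip_km / C_FIBER_KM_S * 1000)   # ~9.7 ms: ideal fiber, no routing
print(1000 / 60)                             # ~16.7 ms: frame budget at 60 fps
```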
But the regional DC is only the worst-case, because they've said they're deploying these things in 7500 locations around the world. That's unprecedented scale for a tier 1 cloud provider at the edge. They know that they have to be close to consumer populations.
Also consider this: once cloud streaming takes off, we're going to see deeper integration into the frameworks and game engines themselves. Imagine a game engine built for streaming. It could do input prediction, doing a "light rendering pass" of frames for the N possible inputs the input buffer might receive on the next frame, before it receives them. These custom chips they use have plenty of headroom to do this at 1080p, and most controllers have, what, 12 buttons plus all of the joystick states? Depending on the game this might be possible (for example, it's hard to do in multiplayer). Combine that with the natural advantage a cloud-hosted multiplayer game would have in networking with other clients to resolve game state, and you can see that it's not just a strict downgrade; we might see improvements in the performance of games beyond the typical "new year, better graphics" cycle.
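A toy version of that speculative-rendering idea, with every interface invented for illustration (not a real engine API, and assuming advance() returns a new state rather than mutating):

```python
# Speculatively render cheap candidate frames for the most likely next inputs,
# then ship whichever one matches the input that actually arrives.
def next_frame(game_state, likely_inputs, render_light, render_full):
    # Pre-render low-cost candidates for the N most likely next inputs.
    candidates = {}
    for inp in likely_inputs:
        predicted_state = game_state.advance(inp)          # hypothetical API
        candidates[inp] = (predicted_state, render_light(predicted_state))

    actual = game_state.wait_for_input()                   # the real player input
    if actual in candidates:
        return candidates[actual]                          # prediction hit: latency hidden
    new_state = game_state.advance(actual)                 # miss: normal render path
    return new_state, render_full(new_state)
```

The obvious catch is that the speculative work multiplies GPU cost per player, which cuts against the economics discussed elsewhere in this thread.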
I watch WebRTC streams from California in Germany with 100ms network latency. Speed of light has never been an issue. Latency within the same continent or within the same city is much lower.
No, they didn't; that was simply a bandwidth issue. We've been streaming video since TV was invented, and with the right equipment you could stream video over the internet before the web even existed. No equipment exists that can alter the fundamental speed limit of the universe.
Yep. This is the kicker. For streaming videos, only bandwidth matters but for video games, both bandwidth and latency matter.
Reminds me a bit of the 30/60 fps fights a few years ago. Sure, 30fps games look more "cinematic" and 60fps movies look uncanny, but 30fps games feel less responsive.
This is such a parochial and short-sighted view. Let's block new technology because "game studios will die". Businesses need to adapt. And if people still like games that don't rely on microtransactions or multiplayer, studios will make them. There is a place for both.
Also, to point 7: guess what, billions of people can't afford a console or an expensive PC rig. Yet in developing countries data is already very cheap, if not free. So this DEFINITELY is a big step closer to unlocking games for them.
Do you really think data is delivered to people in those countries with any sort of latency that would make a service like this remotely enjoyable to use? A massive number of people in the US don't even have internet that could support something like Stadia, let alone developing countries.
By comparison, shipping "edge devices" (aka, uh, COMPUTERS) running a Snapdragon 845 or a Tegra (like the Switch) is cheap and getting cheaper. What makes more sense: asking someone in a developing country to pay $200 one time for a general-purpose computer useful for everything including 1080p gaming, or $10/month in addition to, uh, guess what, some computing device they'd already have to own to access Stadia?
The adage goes: never underestimate the bandwidth of a station wagon full of tapes barrelling down the highway. I mean, the follow up is usually "but never forget the latency", though in this case that doesn't apply. Point being: the internet isn't the answer to everything, but if the only tool Google has ever known is a hammer then every problem is going to look like a nail.
You hit every single point I tried to make to my friend about half an hour ago. On one hand, I'm terrified that all of this will come to pass; on the other, if this is like any of Google's other projects, it's going to evaporate in two years anyways. Whatever happened to OnLive?
It wasn't the right time for OnLive. The tech wasn't there, in networking infrastructure (the internet), compute infrastructure (DC tech), or hardware (graphics cards).
To that last point: it is impossible to overstate how fundamentally important Nvidia's Pascal architecture has been to the development of both gaming and AI. In my mind, it's the most important computing chipset of the 2010s, and belongs among the "world's greatest" chips next to the Intel Core architecture, Apple's A-series, the Pentium, and the 8086. It put Nvidia, quite literally, 5 years ahead of the competition almost overnight; AMD is still catching up, three years later, to the perf-per-dollar and perf-per-watt of the GTX 1080.
That chipset, and the cards made with it (namely the GTX 1080/1080 Ti), were the first indication that datacenter rendering with a stream-to-client architecture was actually possible for video gaming. Before that it was hard to make an economic case for it.
OnLive was purchased by Sony, and it's easy to conclude that they repurposed the tech for their PlayStation Now service. So it lives on.
The whole gaming industry is positioning itself for cloud-only streaming in the next 10 years. With 5G everywhere, reduced latencies, and nearby datacenters full of specialized GPUs, many game studios plan to release streaming-only games at some point, cutting off local players completely (it makes perfect business sense: collecting regular monthly rent for gaming). Unless there is some regulatory pressure, the only way to play AAA games in the future will be via streaming, and local computers will be pretty dumb latency-optimized streaming machines.
I'm not sure how that would affect game developers though, as there won't be an attractive path into programming for young people; before, making your own game was quite an attraction to jump into programming and a motivation to study hard.
Why would they charge per hour or per megabyte? That reduces stickiness. Much more likely is that they'll just get a cut of the game developers' sales, just like the game console vendors do. They'll set the cut so that the net profit pencils out over the long term.
The cost will change per game, so it could be that a game's price is tied to how much compute it needs. I think it will probably be a monthly subscription based on the gym membership model: most people won't use it a lot, and they'll make all their money on them.
I think my concern about this wholly depends on if it completely takes over the business model. I really want to own my games. But, I don't mind if alternate streaming models exist, just if they become seriously required.
This is going to be an alternative for some types of games.
And quite on the contrary, I think it would be awesome for indies, since their games tend to be pretty short. And it is much easier for them to gain customers if they go viral through streaming, as many of them do (horror games, for example).
Why are you sky-is-falling this announcement? Where in their release did they say anything about per-MB or ad based business models? Did they say anything at all about personalized costs in general?
No, they didn't, and every single thing you've suggested about their gaming platform is also true about every other streaming platform, and yet none, literally none, of those platforms have tried any of the things you're complaining about.
You basically just made up a bunch of things to complain about because nothing in the actual release was objectionable. Yours is not a helpful way to react, my friend, though it is a popular one.
I mean, I’ll give you that nothing is confirmed, but nobody has made this work yet. It’s mostly tech testing, so current business models aren’t a good metric.
I’m open to debate about how this could be priced, but I’m pretty comfortable pointing to existing cloud computing business models or streaming services as a precedent.
I would invite you to come up with an alternative business model for serving people who like to play high end graphics games for many hours a day.
> but I’m pretty comfortable pointing to existing cloud computing business models or streaming services as a precedent.
Why are you this comfortable? Netflix, YouTube TV, Twitch, Sling, Amazon Prime Video -- basically all streaming services offer flat rates, not per-MB ones.
Further, all existing game library services tout unlimited gaming as a primary selling point! That's the primary reason you opt into Gamefly or Nvidia Shield, at least according to their own marketing.
And this offering is not for high-end gamers. It's taking the benefits high-end gamers get for their investment into their hardware, and making it available to the millions of more casual gamers. This isn't for high-end gamers, so creating a business model for them using Stadia makes no sense.
Finally, you're not thinking of this at the right layer if you're thinking in terms of things like s3, ec2, lambda, etc.. This is the product that's built on top of those, and the single price problem has been present for hundreds of years. It's a solved one, just ask any current MMO or hell, any clothing manufacturer. You're basically saying that an XL t-shirt is going to cost the same as a S t-shirt, despite tens of thousands of examples to the contrary.
> Netflix, YouTube TV, Twitch, Sling, Amazon Prime Video -- basically all streaming services offer flat rates, not per-MB ones.
YouTube (and Google Play TV & Movies, which appears to carry the same for-sale/rent content in a different storefront) and Amazon Video also both offer purchase of individual content items as well as a common flat rate subscription to certain content.
The variable cost to stream music or video to a user is significantly less than the variable cost to render high-end graphics. The high-end hardware costs money and depreciates. Why would it cost less to rent a GPU hour for generic purposes than it would to rent a GPU hour to play video games?
... no, not everything is scalable, nor does everything benefit from economies of scale.
This does benefit from economies of scale, but it's not something you can just solve with infrastructure and fixed costs. No matter how many computers you have, you're still going to do multiple orders of magnitude more computation to render a high-end game than to stream a song. And you're going to deal with difficult load balancing, because every twelve-year-old gets home from school at the same time (an exaggeration, but the point stands). GPU time costs money.
This benefits massively from economies of scale, and yes it is very much something you can solve with infrastructure and fixed cost. Fundamentally it doesn't matter if you're streaming a song or streaming a game.
GPU time costs money but it's a fixed cost, doesn't matter what the GPU time is being used for, therefore it won't be per-game. The end.
It’s only a fixed cost if the computers must be running at all times. That’s not the case. GPUs consume power, and require energy to cool. They also burn out over time and need to be replaced.
For similar reasons, you’re not just going to rake in the money mining bitcoin because you bought a bunch of computers.
Or, to make the point even stupider, you could make a game about training neural networks, same as you would on a real cloud service provider. If you can understand why Google doesn't charge a simple monthly flat fee for cloud computing of neural nets, you can understand why they can't charge a simple monthly fee for computing neural nets in a game.
I really don't understand why people here aren't talking about the real underlying principle of this service.
Google has captured the education and cheap laptop space with Chromebooks. This is the logical extension of that strategy. Sure, they will eventually combine ChromeOS and Android, but this is one more step in the direction of everything happening inside your web browser.
The entire reason this is happening is because Google has successfully captured what once was the holy grail of markets, education, and is doing things like this to keep people on chromebooks.
ChromeOS could be as important as Android. Especially with Windows 7's end of extended support coming up, Google has less than 4 years to convince people to move to their platform instead of Windows 10. I know it seems heretical to imply that it will happen, but I think this is a good example of Google considering it not only a real possibility but something they might actually have a good chance of pulling off.
The most important thing to realize is that they are focusing on gamers as a trial, the people who are price sensitive and who have to move first off of Windows 7 since they can't pay for extended support even if they wanted it. They can then continue this into the workplace. This is a brilliant move by Google.
Do we know how custom? It might be the same chips with different microcode, or whatever the GPU equivalent is. Is it a custom spin like Intel has been doing for years for Google and Amazon?
What worries me is that I read about this being "Chrome-only".
Google is not just trying to win in console gaming space here - it's trying to establish itself as the browser monopolist and it seems to be doing it without much subtlety.
And of course, only Google can build its game service into its own browser, into its own video sharing site, and into the Play Store. Stadia is Google leveraging multiple of its existing monopolies to enter a new space. It's also a major shot at Twitch, not just at traditional game platforms like PlayStation and Xbox, since streamers will need to be streaming on YouTube to take advantage of things like Crowd Play.
I'm not sure; if you drive everyone else out of the market then you are by elimination the best product in the market (since at that point you are the only product left).
Monopoly laws exist for a reason, born of hard experience, and I'm of the opinion that a single large player having complete domination, even if they also have technical superiority, is something that should probably be broken up.
> I'm not sure; if you drive everyone else out of the market then you are by elimination the best product in the market (since at that point you are the only product left).
I think this can be read in two ways:
1. You're the only product, therefore you're the best (and the worst too, at the same time, I guess). In which case I want to point out you can be a monopoly without controlling 100% of the market. There are usually smaller competitors around; you just happen to have a large enough share, or the assets necessary, to control what happens ¯\_(ツ)_/¯. De Beers, for example, was considered a monopoly when it controlled 90% of the world's diamond production.
2. You're the one that came on top in a market with other competitors. Therefore, you must be the best.
This assumes the only way to eliminate players from the market is by being "better" than them, but that is sadly not the case. For example, in Mexico there's a monopoly over the telecommunications business, and part of the reason it happened and stayed that way was support from the federal government and political corruption. Microsoft has had monopolies over several software markets, not because they were "better", but because if a better product came along, they'd either buy it or build their own version into Windows. IE was pretty bad, in many ways worse than FF, but it came built-in.
Completely agree with both your points. You can also become a monopoly through a less direct form of corruption, regulatory capture; there are lots of ways for smart, rich people in charge of massive companies to bend the system their way.
Democracy and a free press should in theory act as a check, but somehow that's gone off the rails (or I'm just more aware of it than I used to be and it's always been this way). Social media and the internet have changed the landscape. We have a sitting president screaming "fake news" at reporting backed by incontrovertible proof, often his own words from previous speeches and interviews.
The world has gone haywire and at a time when globally we need more unity to address the issues facing us as a global society the very bastions of that global society are getting beaten with a stick.
I wonder what the world is going to look like in 2050. I'll be 70, if I'm still around.
They specifically said they are looking into making the client browser/platform-agnostic, which they'll have to do anyway if they want it on iOS since you aren't allowed to run anything but the Safari engine for web browsers (of course, they could build a native client).
When they say agnostic, it will mean proposing and implementing web standards that they push through and other browsers don't support yet and have to catch up on; look at all the browser API extensions they came up with at light speed, like WebUSB, not to mention that some of them had very serious security flaws.
Google uses open standards and business practices to appear like the good guys, but make no mistake they are abusing their market power.
I mean, when will you start seeing ads for Stadia on the Google homepage? What happens when you search for Assassin's Creed in the future? Oh look, we have it right here at Google, no need to leave our ad platform.
I feel like there's a lot more low-hanging fruit that could be added to the Chrome experience to win people over, rather than building a relatively expensive game streaming platform.
The only Chromebook I've seen anywhere is the nice one I bought my mum for Christmas the year before last. She loves it and I've had no issues supporting it (like, not a single one, and she uses it constantly).
I've got a school-age stepson; nothing there either.
Give it a couple years and they will be everywhere. I used to work for a company that sold a monitoring and filtering product for school computers.
It's only been in the last two or three years that the ecosystem around them has really matured enough that they can compete with Windows machines and iPads on anything other than price. With the way schools' budgets work, we've really only just passed the early adopter stage.
Are you referring to Google's hardware division? If you are that doesn't really mean much.
Most schools aren't buying fleets of Pixelbooks. They're buying chromebooks from companies like Acer and Asus which make devices that retail in the $200-400 range.
Yeah, but someone needs to actually develop ChromeOS, without Pixelbooks management might decide to focus elsewhere.
And right now, from the outside, it feels like there is ramping internal politics going on among the ChromeOS, Android, PWA, Flutter, Fuchsia, Kotlin, and Dart teams, with upper management giving free rein and letting the best one win.
> Yeah, but someone needs to actually develop ChromeOS, without Pixelbooks management might decide to focus elsewhere.
I really, really doubt that. I don't think you understand how ubiquitous Chromebooks are becoming to the education space. Last I heard, in the US, 60%+ of all school provided computers are Chromebooks. School SysAdmins love them because they're dirt cheap and can be provisioned quickly.
From Google's perspective it's great too. Between Google Classroom and the way Chromebook device management works, students have to have a Google account to be able to go to school. There are rules on what data they can collect, but still, kids are forced into the Google ecosystem at a young age.
ChromeOS doesn't need the pixelbook to survive, it provides an enormous amount of value on its own.
> And right now, from the outside, it feels like there is ramping internal politics going on among the ChromeOS, Android, PWA, Flutter, Fuchsia, Kotlin, and Dart teams
100% agree there. I've been hearing for the past 3 or 4 years that Android and ChromeOS were going to be merged, and nothing has come of it yet. It seems like even Google doesn't know what is going on there.
> Being only king of US school system isn't something that holds long term in a product roadmap.
Dude... I'm sorry but you are so wrong. Like I said originally, I literally just left this industry after working in it for years.
The US spends more money on education than pretty much any other nation, both per student and as a total dollar amount. The reason the numbers are so low worldwide is that Chromebooks in education are a relatively new concept. Everyone has been going after the big fish, which is the US.
Additionally, the way you need to handle student data in the US is fairly consistent across state lines, which means you don't need to customize your solution very much to be able to sell to all 60 million students. Once you go overseas, you'd need to sell across multiple country lines to be able to find a pool of students that big (unless you're targeting China, Russia, or India, which all have their own issues).
Even if you want to ignore all of that, I don't think you realize how much of a PR nightmare it would be if Google just stopped supporting ChromeOS right now. Schools have spent hundreds of thousands of dollars buying into this ecosystem. For schools that buy at the district level, it's in the millions. Most schools/districts don't have the budget to just replace all their computers overnight. Shutting down ChromeOS would pretty much fuck all digital learning in a lot of school districts for years to come.
PS. I'm pretty sure iOS's adoption numbers US & worldwide (not in schools, just total consumer adoption) match up pretty closely with Chromebooks, so there goes your idea that only dominating the US market isn't a viable business strategy.
Sure. I'd love a list of all the products Google has canceled (not merged into another offering) after coming out on their official Google blog and saying they are here to stay.
The source that claimed Google was downsizing their hardware division said that they moved "dozens" of employees. Just last year they finalized an acquisition of 2,000 employees from HTC. What difference do a few dozen make?
Furthermore, the source didn't exactly do a great job with their research. This is a direct quote:
>Pixelbook is a Chromebook, meaning it runs on Google's Chrome OS software and is only capable of using internet-based applications.
>>The entire reason this is happening is because Google has successfully captured what once was the holy grail of markets, education, and is doing things like this to keep people on chromebooks.
Care to share how this tidbit is true?
>>The most important thing to realize is that they are focusing on gamers as a trial, the people who are price sensitive and who have to move first off of Windows 7 since they can't pay for extended support even if they wanted it. They can then continue this into the workplace. This is a brilliant move by Google.
What gamers are still on Win7? Most are already on 10. Most legit gamers care about input lag, which a streaming service will never match against hardware wired to your monitor. Most legit gamers use a wired mouse, for god's sake, because of the input lag introduced by wireless mice.
> What gamers are still on Win7? Most are already on 10.
Care to share how this tidbit is true?
Sorry to use your words against you, but I'm not so certain. I'd like to see real statistics from a few sources. I agree that Windows 7 is surely phasing out, but anecdotally (which is where most conjecture like yours and mine comes from), I know a number of people who specifically stayed on Windows 7 for two reasons: 1) compatibility with games; and 2) no telemetry (or at least it's considerably minimal compared to Windows 10).
Maybe, but I would like to see more than the one source. HN is becoming filled with people who tend to make such generalized statements without much to back it up. Your link helps, so thank you! I would love to see more data and less speculation around here :)
Re: the survey - the type of person who remains on Windows 7 due to concerns about 10 would be more likely to opt out of the survey, or not use Steam in the first place. I would put the number for Windows 7 a little higher at maybe 30%.
I expect the number of Windows 10 players to increase as DX 12 becomes a common requirement, but we're not quite there yet.
I suppose, then, that ignorance is bliss. Because once I explained Windows 10 telemetry to my own parents, they opted to run Linux. They're nowhere even close to the penumbra of the nerd sphere. But, it is anecdotal, though it does rebut the notion that ALL consumers outside the nerd sphere don't care.
> doing things like this to keep people on chromebooks.
Care to share how this tidbit is true?
Lots of filthy casuals just care that they can game. If they can, somehow, why shouldn't they buy the Chromebook? Hell, if that's what they had in school, they might have grown up to like precisely the genres of games that are easy to design around the lag.
Most legit gamers care about input lag
The majority of people don't notice < 40ms of round-trip lag, though "legit gamers" often do notice. At < 30ms of round-trip lag, few people outside of hardcore FPS players will notice. Get to < 20ms and you're down to rounding-error levels. Don't just ask me. (Hobbyist game dev.) Ask gamedevs and companies who have conducted testing.
40ms roundtrip lag for most of the world seems pretty achievable to me.
Pinging google.com [172.217.5.110] with 32 bytes of data:
Reply from 172.217.5.110: bytes=32 time=6ms TTL=50
Reply from 172.217.5.110: bytes=32 time=8ms TTL=50
Reply from 172.217.5.110: bytes=32 time=9ms TTL=50
Reply from 172.217.5.110: bytes=32 time=10ms TTL=50
Ping statistics for google.com:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 6ms, Maximum = 10ms, Average = 8ms
--- google.com ping statistics ---
12 packets transmitted, 12 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 74.750/75.566/76.724/0.587 ms
I'm on 1GBit Google Fibre in Orange County, which should be as ideal an experience as could be expected. Now, considering there would be additional time on top of the pure ping, I imagine there would be 80-90ms of latency for me. No way lol.
I'm not sure why your pings are that high. I just asked my friend in an Asian country about his pings, and he averages out at 5ms to Google. Meanwhile, my average is 1ms in the US.
Pretty sure both of us have a Google datacenter nearby, but could the result of your ping be due to a bad configuration on your router?
Unfortunately, any and all trust I had in Google maintaining its services for more than a few years is long gone. I have games from ~10+ years ago I can still download, update, and play via Steam; what are the chances this lasts more than 3 years?
In addition to that obvious concern, I'd also be very hesitant to let Google have any control over my gameplay experience. The downsides of gaming as a service just have no appeal to me, although I'm sure it would be functional in certain genres, for a certain type of gamer.
Regardless of how many streamers are playing Skyrim in their browser, I don't think I'll be joining them.
This is actually something that worries me, long term. There are people who still play 30+ year old games; you can still buy a used NES and play classic Tetris if you wanted.
Mario Kart on the Wii still works. Online play is broken though because Nintendo has no interest in running old servers to keep a service in a long discontinued game running.
If any games are developed solely for Stadia, will they disappear if the platform disappears? Even if it doesn't, will they be playable in perpetuity, or will Google shrug off breaking changes if it's for games that are over N years old and have under K users?
I think most things we buy have a life. When I pay $$ to get shoes, I don't expect them to last forever. Sure, it might be nice, but I _understand_ they might wear out after a while. We expect software to run forever, but I think we need to start associating a lifespan with it.
Now if a shoe only works for a week, I'm not going to buy it. If it works for a year, sure, I'll take it. Similarly, for a digital good (phone, computer, software) we need to come up with a baseline for what the acceptable lifespan is. It can't be infinity, but it should be reasonable.
The big difference here, I think, is that for most things we buy, we're mostly in control of their longevity. I can take care of something and have it last longer.
This all ties into ownership. If I own something I have control over it, so I can re-sell it if I'd ever like to re-coup some value or let someone else make use of it. I can lend it to a friend, and have it back to use or lend again. I can give it as a gift to a friend or family member. I can keep it indefinitely as a memory. All of these are reasons why some people still have a NES, GameCube, etc. It's probably not that they've been playing on the console for 30 years. They either kept the NES they owned as kids, or bought one second-hand for the nostalgia, or got it as a gift from someone who knows how much they like classic games, etc.
There are still tournaments for games on old consoles. Old games are perfectly good games; they don't have the same wow factor, but they are as fun today as they have always been. Will future "classic games" be lost to history, or available only when a publisher decides to monetize the nostalgia of the public with a re-release?
>> When I pay $$ to get shoes, I don't expect them to last forever.
The better analogy is: when you buy a book, do you expect you can read it again in ten years? People who bought 1st pressings of Beatles albums 60 years ago can still play them with their grandkids. I can still play the old SNES with my siblings at Christmas. But imagine a 2019 game being unplayable in 2029. It's reasonable to be concerned.
OnLive was both before the cloud era (they had to manually maintain many points of presence near consumers, and they didn't have the money to do it anywhere near the scale Google, Amazon and Microsoft can) and before GPUs were really mainstream in normal data center servers at the scale this type of service requires.
> the company had deployed thousands of servers that were sitting unused, and only ever had 1,600 concurrent users of the service worldwide
They were burning through all of their money because they highly overestimated the audience. That's one of the main problems the cloud was made to solve.
This particular cloud will only solve the problem if Google finds a way to sell the time on these AMD GPUs to someone else.
Traditional GPGPU customers, and machine learners, usually use CUDA instead of OpenCL / VulkanCompute / DirectCompute. At least from my position it looks that way. I'm a freelance developer and my clients pick CUDA in ~75% of cases. The rest is DirectCompute, if computing on client PCs and hardware compatibility is required.
Right, partially because OnLive couldn't spend the money required to put servers close enough to give users a good experience. It would have taken a monumental amount of money for them to expand in a way that would facilitate that, and to custom-build servers with consumer-grade GPUs in them.
Furthermore, there are other technological advances that have come even in the past 2 years that help facilitate this, like Secure Reliable Transport protocol (https://www.srtalliance.org/).
Things were just too stacked against OnLive at the time to make it work, but most of those barriers have since been removed after they had gone under.
Gaikai was one of OnLive's competitors. It had better management and probably better tech as well. Sony bought it out and it still lives as Playstation Now. I believe Dave Perry is one of the founders.
Speed is not the (main) problem. It's latency and jitter, with jitter being especially bad on mobile networks. To keep the additional delay to at most 1ms you also need a datacenter every 300 km or so.
Google operates tiny little datacenters in every ISP POP in America. That is why the video I just watched on Youtube in Oakland came from some place 2ms away even though Google's nearest real datacenter is in Oregon.
How do you edge cache games, though? Edge caching YouTube requires a giant pile of hard drives; edge caching games requires a giant pile of hard drives and a giant pile of compute power.
It's not the edge cache itself that matters; it's that many ISPs peer directly with Google.
Google Cloud can route the audio/video/keyboard packets mostly over Google's private network and then only use the public internet once it gets to your ISP (or their transit provider). This provides Google with more control over how the packet gets to the end user.
Google provides a similar service to Google Cloud customers as the "Premium Network Service Tier".
Huh, that's actually really interesting. I was going to say, "but even then you're still dealing with the speed of light", but at the limit light covers 1,000 miles in about 5ms in a vacuum (closer to 8ms in fiber), which should be enough to get from most anywhere in the U.S. to a Google data center. The latency dynamics here are a lot less implausible than I was suspecting at first, given peering at the local ISP level.
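For anyone who wants to sanity-check that back-of-the-envelope math, here's a tiny sketch; the speeds are the usual physics rules of thumb and the distances are arbitrary examples:

    # Back-of-the-envelope propagation delay. Speeds are rules of thumb, not measurements.
    C_VACUUM_KM_S = 299_792                 # speed of light in vacuum
    C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3    # light in optical fiber, ~200,000 km/s

    def one_way_ms(distance_km, speed_km_s):
        return distance_km / speed_km_s * 1000

    for miles in (100, 500, 1000):
        km = miles * 1.609
        print(f"{miles:>4} mi: {one_way_ms(km, C_VACUUM_KM_S):4.1f} ms one-way in vacuum, "
              f"{one_way_ms(km, C_FIBER_KM_S):4.1f} ms in fiber, "
              f"{2 * one_way_ms(km, C_FIBER_KM_S):4.1f} ms fiber round trip")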
Video files (even live stream HTTP based protocols like HLS) can be cached very easily on CDN infrastructure.
However, a gaming session means having a dedicated daemon running for you somewhere. I doubt this can be deployed anywhere on the spot, but perhaps they have some amazing technology for that.
I used the precursor of Stadia, Project Stream. There were moments where the video went fuzzy, but for >95% of the time, it ran fantastic. My gaming PC setup is an i5/16GB RAM/GTX 1080 and it stutters more than the stream did (I assume that's on the CPU side). I'm sure Google has the compression algorithm chops to make this all work.
Unfortunately for me at least, what I want is more akin to a cloud VM where they can host my games and I stream them wherever, and this isn't it. I've already got games split between Steam, uPlay, Origin, etc. I have zero interest in buying a game twice just to be able to have more portable play.
I'm pretty sure you won't buy them (again). You will rent them. Anything but subscription-based pricing model for Google Stadia would be a surprise to me.
Not only will you need a really good internet connection and wifi, you'll also need to be close to a Stadia data center. The best-case scenario will probably be a latency of around 50 ms, but I'm guessing the average will be closer to 100 ms, which is too much for many types of games (especially the competitive games the streamers play).
Heck, even streaming in your local network isn't such a great experience these days.
I love the idea, but realistically we are still very far away from streaming games replacing consoles or PCs. Many companies have tried (Nvidia, Sony, etc) but no one has succeeded for the simple reason that latency is not there yet.
I tried this and found it to be pretty satisfying. I played entirely on a MacBook while traveling during my winter vacation. It worked surprisingly well even with mixed wifi; I would compare it to the experience you get from using PS4 Remote Play or Steam Link over ethernet.
The only time I found it lagging was similar to the times when both of the above lag: during particularly complex visual scenes (i.e. you're circle strafing around a target and the entire screen is constantly redrawing). I thought it was great for playing a game casually, i.e. story mode. Lots of people use that phrase as a put-down, but the system is well suited for a game like ACO where you are mostly being tactical, planning, exploring, and moving the story forward.
I think of a game like 2016's Hitman: I hesitated to install 30GB of it on my PS4, but if you told me I could drop into the demo/prelude in less than a minute, even at 720p, it's a very appealing concept for somebody like me who plays video games the way other people watch Netflix while they're eating dinner: basically whenever I have some downtime and want to dip into a story or mechanic I like for ~30 mins.
Note that even Steam Link over Ethernet is too laggy to play KB/M FPS at any serious level - the input lag makes FPS almost unplayable.
I can see the growth of services like these, especially with more gamers being unable to access dedicated hardware, but there will always be a niche for dedicated hardware. The only way I think they could solve that problem with streaming is some sort of hybrid approach where part of the UI is rendered remotely and part locally, with some form of client-side simulation for input.
I perhaps have diminished standards, but I found Steam Link over ethernet to be totally serviceable for playing through most of Hyper Light Drifter on a controller. The input lag was fairly minimal and the computer was already quite long in the tooth, but to your point, I wasn't getting 60fps, that's for sure (realistically it was probably a firm 30fps).
I don't doubt that consoles will remain useful, but I think that services like this will satisfy a pretty legit niche for a vast swath of games that aren't really dependent on low-latency input (i.e. puzzle/turn-based/RPG/simulators/board game conversions) and that are often 'discovered' by people finding let's-plays on YouTube.
For some variety of MMORPGs, I can imagine devs being excited about the reduced surface area for cheating/exploits. For somebody like me who uses a Mac, I'm looking forward to playing a version of Civ that doesn't cause my laptop to sound like it's about ready to take flight. I don't think the idea is meant to replace consoles, though; it seems more like a way to grease the wheels of commerce and get people playing (and buying) games that they've been traditionally priced out of because of the not-insignificant startup cost of building and maintaining a gaming PC/console.
I guess my question is: how would something like this scale? With Netflix and Spotify, the media is the same every time you play it, and even stuff like the Black Mirror CYOA has a limited number of combinations, so it's very easy to cache.
Every game has an extremely high number of potential combinations and outcomes, so it's effectively uncacheable. Fan that out to Steam level popularity and diversity of games and it sounds a bit nuts.
The only way is adding more servers closer to the users, which means Stadia will only offer really low latencies to users living close to the data centers. So probably only people in large cities.
I don't really have a need to stream a full game, but your idea about the demos actually sounds great. I would like to play the demo instantly, evaluate whether the gameplay/controls are good, and if they are and I'm interested, purchase and download the full game to my system.
I was a beta tester for Project Stream. FWIW, I traveled with a Pixelbook and played Project Stream over the hotel WiFi in Seattle. Everything was smooth, a similar (if not the same) experience to playing on my home WiFi.
The latency was good in my experience. There was an occasional resolution drop, but that happened less than once per hour. Mind you, my experience was based on playing ACO, which probably is not sensitive to latency.
I also played Nvidia's GeForce Now on Mac while it was in beta. I would say Project Stream is a much, much better experience (no need to sign up for a new account is a plus, no client required is a HUGE plus).
Agreed. Enthusiast gamers fret/obsess over a small difference in FPS, refresh rate and lag especially when it comes to competitive gaming. Until Google can deliver a service that is indistinguishable, it’s DOA in my view.
> Agreed. Enthusiast gamers fret/obsess over a small difference in FPS, refresh rate and lag especially when it comes to competitive gaming. Until Google can deliver a service that is indistinguishable, it’s DOA in my view.
I work for Google, opinions are my own.
I agree but I don't consider this service to be for hardcore enthusiasts. The enthusiasts will continue to buy their powerful gaming machines because they really care about having the best experience.
I think this is great for someone like me, who likes video games, but doesn't want to spend that much money on a computer. Before I would not be able to play the latest titles because my computer is 6 years old, but this would make it so I could.
There are probably tons of kids out there whose parents would scoff at the idea of giving them a $1000+ computer but now would be able to play the latest games with this technology.
All that assuming it works, of course, which is no guarantee.
> I think this is great for someone like me, who likes video games, but doesn't want to spend that much money on a computer. Before I would not be able to play the latest titles because my computer is 6 years old, but this would make it so I could.
> There are probably tons of kids out there whose parents would scoff at the idea of giving them a $1000+ computer but now would be able to play the latest games with this technology.
Presumably this service is not going to be free, and the overall cost for someone who plays a lot of games would probably be similar to just getting a $1000+ PC and using it for 3-4 years. Of course maybe the pricing will prove me wrong.
How are those parents gonna react when their entire month's data plan gets used up in a weekend? Honestly, I would rather see Google try to cut in on the handheld market by creating peripheral controllers for Android devices. Every smartphone in the world is four buttons and a d-pad away from being the best Game Boy ever made.
This is a very valid point. Adding 100ms of latency to games that already struggle with their built-in ~100-150ms is basically a non-starter.
So you're left with casual games, single player games (with no modding), and maybe some less latency sensitive multi-player games.
Even the best hardware right in front of you can still currently cause a bad user experience for latency-sensitive games. All it can take is one player with high latency or an unreliable connection, and it can detract from the enjoyability of a game.
And games already have latency from input devices, inputs registering, the game calculating the next frame, the monitor displaying it, and the user perceiving what has happened on screen. Every millisecond of network delay gets sandwiched between all of those.
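To make that sandwich concrete, here's a rough illustrative budget; every number below is an assumption picked for the sketch, not a measurement of any real setup:

    # Illustrative end-to-end latency budget; every figure is an assumption for the sketch.
    shared_ms = {
        "input device / USB polling": 4,
        "game simulation (one 60 fps frame)": 16.7,
        "render one frame": 8,
        "display processing + scanout": 20,
    }
    streaming_extra_ms = {
        "video encode (server) + decode (client)": 10,
        "network round trip to the edge": 20,
        "jitter buffer": 8,
    }
    local_total = sum(shared_ms.values())
    streamed_total = local_total + sum(streaming_extra_ms.values())
    print(f"local pipeline:    ~{local_total:.0f} ms")
    print(f"streamed pipeline: ~{streamed_total:.0f} ms")

Swap in your own numbers; the point is just that the streaming additions land on top of a budget that is already far from zero.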
So, who wants to make a bet on how long this "experiment" will last? I'm thinking 2 years before Google gets tired of it and abandons it, and another year before it's taken behind the shed.
The difference is they launch high-profile, promising-the-moon projects and put their full weight behind them, and then give up a couple short years later for inscrutable reasons ([cough] Google Fiber [cough])
Google Fiber wasn't really that inscrutable. It's clearly outside of their core competency, it required collaboration and agreement with local governments, as well as construction projects, and to AT&T's snarkily made point, it's way harder than a typical software project.
This is closer to their existing product portfolio, it doesn't have nearly as difficult an upfront cost as Fiber, and the business models are pretty clear. I think a closer analog would be the rumored reductions in their hardware division, but given that they're still making successful hardware + software, it could be more about reorganizing around priorities than an indication of reduced investment.
They lack the ability to commit to a space and cultivate it, casually changing course on a whim and leaving masses of people spinning in their wake. Just look at how they've managed YouTube and its guidelines, bans, etc. Or the Play Store and all of its debacles. Were I a game developer, I would never want to be beholden to a Google gaming platform.
> They lack the ability to commit to a space and cultivate it
That's because states, entrenched providers and in some cases local governments made every inch of work cost a mile of time. The cost was making Fiber a non-starter... ISP's benefited from nationwide networks largely funded by governments, then used their monopoly influence to prevent any competition, and "hired" state governments to enforce their monopolies.
But we can hand-wave all of that away and claim it was Google's lack of commitment.
I don't deny that ISP's make it far harder than it needs to be. They make it impossible for small upstarts. But Google, of all companies, had plenty of resources to make it happen if they really wanted to.
Blowing millions of dollars on an investment that will take decades to pay off isn't good business. Google decided, rightfully, that the answer was webpass. It's a better revenue generator, and has considerably lower barrier to entry.
The point is they overcommit and then back out instead of doing more careful due-diligence. Google Glass is an example of the right way to explore a new product; most of their ventures aren't so conservative.
It's less about them killing products and more about them following the same frustrating sequence of steps (for products that people actually liked):
1. Release a product, make it decently good.
2. Kill it in a few years after it has gotten widely adopted.
3. Release an alternative product that does the same thing, but, a lot of times, worse than the original and/or with missing functionality from the old one that people actually care about.
4. Bonus step: make it really confusing (looking at you, Hangouts => Allo/Duo).
5. Repeat the cycle.
Notable examples of it that come to mind:
* Messaging: GChat => Hangouts => Allo/Duo => I got tired of keeping up with that they are pushing now
* Music streaming: Google Music => Google Play Music [just a rename, I think] => YouTube Remix => YouTube Music
* First party Android experience: Google Play Edition => [I feel like I am missing something in-between here] => Android One + Android Go
See, that's part of the problem. As an end-user, to me all those sound kinda samey and fill the same niche, but technically are different products.
As for others, I personally stopped using non-essential Google products, keeping only the ones I believe Google is unlikely to screw up without completely destroying faith in the company (like GMail or Maps), so I can't come up with anything. If you are curious, you can check out https://killedbygoogle.com/ and see if there is anything there that might fit the bill.
Disclaimer: I have zero affiliation with that website. I found it by trying to look up a list of killed off Google products, hoping it would bring up some memories. Then I realized that there are way too many products on the list for me to parse through, and the website has a nice and clean UI/UX that I am not ashamed to share here.
There likely would be no offline option for this if (when? Ha!) it gets shut down.
It’s not like a photo site where you can just download your content. Whatever is built to run on Stadia will not run on your home machine or anywhere else.
The failure rate of startups is drastically higher than Google products, so it's only unusual in the sense that they have way more success than the industry as a whole.
It's not ironic at all. VR movies were always experimental and it looks like they realized there isn't a substantial market there, video games on the other hand are a proven industry that they've just entered.
I'd say one promotion cycle for the top executives on the product, 2-3 more for the next tier of engineers/product folks to ship some cool stuff, then a year or two for the product to coast before no one wants to take on the technical debt. So I'll predict its shutdown will be announced by July 2022.
They might roll it up into YouTube or something, but the main components are cloud compute (GCP) and video streaming (YouTube). It's naturally aligned with their other initiatives, so it's pretty safe, I would think.
The reason why these kinds of initiatives fail is that they do not solve a real problem or issue.
The average Joe doesn't need another subscription just to play some games. Having a console where you buy games is more than enough.
I also silently hope that Google will fail hard with this one, since they are the worst publisher on earth, with absolutely no compassion towards their developers.
$400 for a current console and $60 or so for a recent game. Assuming you buy the next gen after 3 years or so there's quite a budget invested in the gamer's part - a streaming service at maybe $20/month that gives you access to recent titles doesn't look that bad in comparison.
It's more like every 5 years, and $60 a game only if you buy brand-new games; if you don't focus only on the most recent titles, you can buy good triple-A games for around $20 when they go on sale.
Don't forget the subscription where you usually get a decent game every month for like $45 a year. So you can actually buy a console once, subscribe and pay like $4 a month and constantly have new games in your catalog to play with.
Even with a very conservative 3 year lifespan of a console (5 years is closer to reality) 20 dollars a month over 3 years is 700+ dollars and at the end of the day you don't own anything. It seems like a pretty bad deal overall.
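For what it's worth, here's the arithmetic from this subthread laid out side by side; the prices are the round numbers people quoted above, and the subscription figure is a guess since Stadia's pricing hasn't been announced:

    # Illustrative arithmetic for this subthread; all prices are assumptions/round numbers.
    console = 400          # up-front console cost
    new_game = 60          # full-price game
    sale_game = 20         # typical sale price for an older AAA title
    sub_per_month = 20     # hypothetical streaming subscription price

    for years in (3, 5):
        streaming = sub_per_month * 12 * years
        console_new = console + 2 * new_game * years     # two new games per year
        console_sale = console + 4 * sale_game * years    # four sale games per year
        print(f"{years} yrs: streaming ${streaming}, "
              f"console + new games ${console_new}, console + sale games ${console_sale}")

Which column wins depends entirely on how many games you actually buy and on whether the subscription includes the games at all, which is still unknown.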
Video games and consoles don't have much lasting value, so it's not a good investment (except for some collectors).
If you mean to keep a copy and be able to play it again anytime in the future, then yes, I'm with you. However, most games are either trash or don't offer much replay value, which means that once you're done with them they will stack up somewhere and collect dust.
One point though is you can trade games that you own, which has some appeal.
People used to bring the same argument for music streaming vs buying a CD or digital copy. See where that brought us.
I have the feeling this is peak microtransaction. They've found out that people are more likely to spend small amounts over time than one big sum at once, and now you don't even own the game. It became the microtransaction.
There's a lot of computer games I can't play at 60fps because my computer isn't good enough.
I would like if someone else handled all the hardware upgrades, so I don't have to spend so much money every year just to keep up.
One of the appealing aspects of South Korea gaming cafes is that all the computers can be rented for a few dollars and tend to be maxed out. That environment creates a great place for everyone to get into gaming.
I'm not sure how that fits into Google's long-term interests, but it definitely fits into mine.
I work at Google on a different project. Opinions are my own.
> The reason why these kinds of initiatives fail is that they do not solve a real problem or issue
> Having a console where you buy games is more than enough.
I think that's the problem though. This opens up "real" gaming to a much wider audience. Even though games are mainstream now, most people are not willing to spend big money on gaming PCs or consoles.
I played with Project Stream during the technical beta and I would say it cleared the hurdle of "good enough." The resolution was good and the input lag was normally small enough to be compensated for or go unnoticed.
Since the majority of this venture's hardware appears to be in-house and, I assume, easily re-purposed for other core services I'm less cynical about the half-life of this product.
I have no intention of buying the controller, but if this lets me stream to my existing hardware with minimal friction, I would be interested. If they structure it as a monthly subscription with a robust catalog, they would probably get my money.
Google is really flexing on people here. With 5G this will be the future of all computing, and only Google and Amazon have the data centers to make it work.
Sundar opened up the event talking about deep learning, I wouldn't be surprised if they used this platform as a way to train their RL models against humans or on data generated by the users.
I'd argue MSFT does, too, and they've already cleared a HUGE hurdle with XBOX (the hardware + game ecosystem) & XBOX Live (social engagement).
The opportunity for Google is huge, but it will require sustained, strong, cross-functional execution that can be challenging for such an engineering driven, compartmentalized organization.
I don't think Fuchsia would be used for this.
Vulkan runs on Linux, Unreal/Unity run on Linux, and AMD has a good open source driver. Everything points to Linux being the right choice here.
The only part about it that looks custom is that it's lower spec than what AMD otherwise sells.
Stadia's specs per Google:
Custom AMD GPU with HBM2 memory and 56 compute units capable of 10.7 teraflops
16 GB of RAM with up to 484 GB/s of performance
Radeon MI25's specs:
16GB of HBM2 memory at 484 GB/s with 64 compute units capable of 12.29 TFLOPs
12.29 * (56 / 64) = 10.7.
So this is almost certainly just MI25 chips with 1 or 2 bad 'cores'. Aka, the reject pile. From last generation, at that, since the current gen MI50 & MI60 are both 1TB/s HBM2 memory.
The multi-gpu part maaaaay be custom using the Instinct's extra interconnects instead of what used to be consumer Crossfire. But multi-gpu in consumer space is almost entirely dead, and nothing seems to suggest any developer would do anything for multi-gpu in this platform, either.
Between this and Valve's push for gaming on Linux, another of Linux' Achilles heels is solved. What else is missing before the Linux desktop is adequate for pretty much everyone?
Developing a game that runs on Stadia and one that runs on the customer's local Linux box are probably going to be two different things, unfortunately. :-(
This isn't going to bring any new local games to Linux, rather it "solves" the gaming problem by simply making these streaming games playable on Linux with Chrome, same as on Windows, Mac, on your TV with ChromeCast, tablets, phones etc.
Much like web apps, YouTube, Netflix etc. today already work flawlessly on Linux, so will gaming too with this development. For most people, the browser is already the only "native" desktop application they use on their computers.
Web apps and web video finally made the Linux desktop viable, but also irrelevant in the bigger picture.
It's running on Linux servers and using Linux tech like Vulkan though. Getting a game to run well on Stadia will probably involve a lot of the same steps that are needed to make them run well on Linux natively.
Linux for everyone will happen when I no longer need to explain what a "package manager" or "dependency" is to my retired mother. So, never, basically.
That is an interesting detail. I don't want the future of gaming to be Google-based, but it would be awesome if this project - in the couple years before Google ditches it - leads to some real headway in Linux gaming.
Other comments have suggested this might fail like OnLive/Nvidia/PlayStation Now because the economics of streaming on the business end don't work out.
This case is different. Stadia is the first video game streaming product made by a cloud computing company, and also a cloud computing company which developed a video codec (VP9) just for high quality/low latency streaming.
The economics are much different on all sides; it all depends on how Google prices the service. (It'll be interesting to see how Microsoft's xCloud competes. Maybe Amazon will get in on the fun too.)
I don't think Google is doing something new here, and I don't think most of the others failed because of the financial side. Most just failed to acquire users, even when giving games and subscriptions away almost for free, or even completely free, like GeForce Now. But I don't use it. For me, gaming via the cloud feels like a bad experience.
Maybe if Google puts the pedal to the metal on PR and advertising, it will attract enough users.
P.S. I just tried GeForce Now again. After playing on a PC, even though the lag is barely detectable or noticeable, it is very irritating. There is a delay between action and reaction.
Developers will put their games where the users are. Right now most developers deploy to PC, XBox and PlayStation (and some to Switch) what's the cost of one more platform to deploy to vs the extra sales it can produce?
I also participated in the beta and was quite impressed. In my case, there was no lag; instead, when the connection was poorer, the stream quality would get lowered (like YouTube, e.g. 1080p down to 480p). I think this can be completely resolved with a solid connection.
Speaking of Google leveraging YouTube for Stadia, I also think this could be an amazing affiliate opportunity for game developers and content creators. Imagine watching your favorite streamer (or I guess game reviewer) and, with a click of a button, without having to download anything, playing the game right from your browser (demo or full game). Great source of income for content creators, great frictionless experience for viewers/consumers, and a great tool for developers.
You mean a great place to lose another 30% of your income. A great place where you are forced to implement Google Play related shit. A place where you can be expelled from the service based on wrong bot decisions. A place where there is no human support. A place where only the big publishers are looked after. A place where indie developers are treated like shit.
A place where the search algorithms are dubious.
Yes, that's what you can expect from a Google streaming service. Just as the Play Store currently is.
They have benchmarks and go into detail on how well it runs. I found this interesting:
> "[It's] a complete port," Google's Phil Harrison told me. "[Ubisoft] built the game completely for Stadia and they're actually doing a talk at GDC about how they got their game up and running."
> In our earlier analysis, our impression was of a game that very, very closely mirrored the PC version running at 1080p resolution, with some elements upgraded beyond the console quality threshold defined by Xbox One X.
> "Correct, but they started from their main line on the consoles, it's not that they took the PC version and ported," Stadia VP Majd Bakar explains. "You can see that as the UI changes according to the controller you connect. I wouldn't call it a console port, I'd recommend going to the talk. It's going to be run jointly between Google and the team that did the work for Project Stream and they will talk about how they did it and the work stream they followed."
> Focusing on developers, Google also unveiled an impressive way for game developers to apply their own design style to titles on Stadia. It’s a machine learning-based style transfer tool that developers can use to simply drop an image into the video frames of games and have it mimic the style throughout.
This is a fascinating testament to the rate at which deep learning can run. If you think back to 2014-2016 (less than 5 years ago), one of the common things people said was that style transfer would be really cool to apply to video games or video, and then the middlebrow dismissal was 'style-transferring a single frame Gatys et al 2014 style takes a Titan hours so [simple math] shows we can't style transfer in realtime for decades, duh'.
But, GPUs got better. Deep learning got better. Style transfer got better. Now, here in 2019, realtime style transfer is not just old hat (Facebook launched a realtime style transfer video app for smartphones years ago), but Google is announcing a service to stream countless hours of style-transferred video games to potentially millions of players.
Does anyone even consider playing on the go anymore? I will be 200 years old by the time you can stream 60fps to a device sitting on a bus on my way to work.
Streaming may have a place in the future, but if it's the only option, then consider me a full-time retro gamer.
I already despise the fact that the masses voted to kill ownership of games, and now it's going straight to killing "rental licenses" as well, putting all of the power of entertainment and commerce in the hands of, essentially, two companies: Google and Amazon.
Y'all voted for these massive monopolies; really looking forward to the dystopia.
5G is on the way, and will enable this stuff anywhere. MWC had tons of demos of 4k, multiplayer, and VR games streaming via 5G. A lot depends on how/when/if it's all rolled out, but unless you're currently 194 years old, it's gonna be a lot sooner than you think.
If 5G is anything like LTE, it will work some of the time, cut out at other times, rapidly degrade into crap quality before coming back to full speed at random intervals. And bear in mind I live in Orange County, CA, hardly the backwoods.
We are a long, long, long way away from having constant, fast, reliable, high-bit rate connections for mobile devices.
Always-online connections are still subject to typical tech issues. We're not living in the world of Ready Player One, and the country is extremely spread out. Add the fact that a few mega corps own all the infrastructure in the country and for business reasons don't want to spend a dime, and I don't really see how any of this is possible, or even practical.
In other words, we already have the technology (and it'll keep getting better) to build handheld gaming devices. Even if they are digital only, it's extremely inefficient to generate video hundreds or thousands of miles away and beam it down to a device.
To me, this sounds like mailing a letter to your next door neighbor, going through the entire US Postal system, vs. just walking over and putting the envelope in their mailbox.
The quality of this is only likely to improve with 5G coming in the next year or so.
This technology consolidating in the hands of only two companies implies there aren't going to be any competitors. Nvidia is already establishing a foothold in this space, and I imagine AMD will also do so eventually. If the technology is actually interesting to consumers, there will be a competitive ecosystem that springs up to serve them. The future isn't as dystopian as you're implying.
It's dystopian in the sense that nothing is actually owned anymore. I grew up collecting games and still collect them now, and a ton of people (the average gamer is in their mid-30s, and that's a massive demographic) still like to collect their games. They like a shelf of titles. Heck, even those who buy digital have little digital collections, and more power to 'em.
But paying a monthly fee and having my gaming time leased out to me by MegaCorp is not anything I'm interested in. I want to pick what I enjoy, buy it, and play it. I'm not crazy for thinking that way, collecting is part of the fun of gaming for me and many others.
Speaking of the 4G/5G, I cannot fathom a world where you can be on a subway underground and somehow have a perfect streaming connection. And subways are only in major cities - what about the rest of the entire country? The current infrastructure, heck, the infrastructure of 10 years from now isn't built to handle this kind of bandwidth for constant streaming. It's not even efficient - always-online services have been vilified by the gaming community, and now somehow we're celebrating being stuck on 1 TV, playing at home.
Sigh. I feel like this whole industry has left me behind, and nothing makes sense anymore.
I'd say the vast majority of PC gamers right now have maybe one or two special edition physical boxes and everything else is digital in their Steam/Origin/Epic accounts. As it is right now, those accounts can be closed down at any point without oversight or warning, and you don't legally own the games in your Steam account. I don't see game streaming fundamentally changing things at this point; the "own your games" ship sailed long ago. Also, like many Steam users, I have a metric ton of games in my account that I will likely never play or use again in my lifetime. A monthly rental covering a catalog of old games and some new ones might honestly end up costing me less than having all those old useless games in my library that I played once for 2 hours, can't return, and will never play again.
As for the subway thing, wifi networks are becoming more and more ubiquitous and advanced. I know my local subway has its own inbuilt wifi; obviously it's not going to be amazing any time soon, but between that, 5G, and hot-swapping connections to keep uptime, I don't know, it can work.
You appear to be somewhat biased by your surrounding infrastructure, which does not reflect the state of the rest of the developed world. You see, a lot of subways have excellent LTE because the carriers deployed their network antennas underground. Besides, for a lot of cities, public transit lives primarily above ground. On top of all that, LTE in the US is terrible compared to other countries, which can easily average 50-100Mbps: http://research.rewheel.fi/
Exactly. Or underground on a subway. Or in a bus next to a mountain. Or in a very congested area with tons of people on their phones dragging on the network. Or in a building that blocks service.
Every day of my life I am in all of those above situations, and I almost exclusively game on the go, either on my Vita or Switch.
5G doesn't make these problems just vanish, the US is a massive country and the middle of the mountains just won't have 60fps streaming capabilities in 5 years, it just won't. Why would any company want to spend the money on that infrastructure outside of a few major cities?
So, I work in the games industry and I don't think this will be successful. This has been tried before (multiple times), gamers haven't wanted streaming services. PC gamers already have Steam and this doesn't offer anything compelling over using Steam, and more casual gamers are playing on Ps4 or Switch or mobile. Also, the hardcore PC gamers are never going to accept the latency associated with playing on remote servers.
Google isn't a name that people associate with gaming or gamers, but they do have a name associated with dropping products after a year or two. I wouldn't trust Google to still be supporting "Stadia" in 5 years' time, whereas I do know Steam will still be there.
Also, Google's reputation for "do no evil" has been rapidly eroded in recent times and people are more and more wary about giving this company any more personal data than they already do. Contrast this with Steam, a private company that gamers trust with their data.
I think Google is once again reaching here into a space too late and without anything really compelling to make people switch away from existing Xbox/Windows/Steam ecosystems.
Edit: One other additional aspect to this: it doesn't really matter how fast Google's servers can stream the data, most people's ISPs do not have great consistent connections. Hell, I have Google Fibre at home and occasionally I get issues streaming 1080p Netflix (stuttering etc). Most people have services with bandwidth caps and do not get consistent enough speeds to support streaming games. At my parent's house in the UK, the situation is even worse with lower caps and very often there are disruptions in the connection and so the Netflix stream will turn into a blocky mess for 10 seconds or so.
I am a "gamer", I have a 2080Ti and I hope it's the last 1500$ gaming equipment I ever have to buy. It sits idle 95%+ of the time and it is tied to only one of the all the screens in my house.
Curious: Why'd you go with a 2080Ti and not just a 2080 or 1080? Did you just want the RTX support or did you just want the absolute best possible GPU for gaming regardless of price?
I can't answer for OP, but I went with a 2080ti because it's cheaper than purchasing a 2060/2070/2080 now and buying a 2080ti later. If price wasn't an issue I'd have gone with the best possible Titan RTX.
That doesn't make much sense. If you buy a 2070 now in a few years the successor 2170 (or whatever) will be more powerful than the current 2080ti and cheaper.
Unless you have unlimited budget I find that buying cheaper cards more often will give you better average performance.
The pace has slowed down somewhat. A 2080 (discounting the DLSS and RTX stuff) is about as fast as a 1080Ti, a 2070 is about as fast as a plain 1080, etc...
I knew someone would mention those... I've had two since they launched. However, they don't do 4K or HDR, and I still need the 95%-of-the-time-idle $1,500 graphics card.
You still have an idle graphics card, but you can enjoy the benefits of it from any laptop in your home network. My gaming desktop is wired, my Linux laptop is wifi, and I get 4K @ 60FPS flawlessly.
However, I would prefer not to have to upgrade in the future and offloading the intensive parts to Google would be highly preferable.
Were you using old wifi? I've only ever heard of latency issues when using wifi and then only when using older than N. Latency when using Ethernet is less than the HDMI latency on consoles.
Steam link games don't have latency compensation since the games are unmodified. I assume that stadia games will have latency compensation, so should be much more playable.
There's no chance whatsoever that Stadia will be lower-latency than Steam's in-home streaming.
No matter how good Google's CDN network is, you can't compete with sub-1ms in-network round trips. There's not really any clever "latency compensation" you can do here.
Even if you rewind state to apply input, which games probably won't support anyway, that still only helps you close the gap added by internet latency. A gap that in-home doesn't even have in the first place.
The server will know what frame you were seeing when you pressed your button. So it rewinds its state machine the appropriate number of frames (hopefully just 1), applies the input, and fast forwards again.
This may result in a visual glitch, but you won't have missed your target.
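A minimal sketch of that rewind-and-replay idea, assuming a deterministic simulation, cheap state snapshots, and inputs tagged with the frame the player was reacting to (all big assumptions; this is not how any shipping engine necessarily does it):

    # Minimal sketch of server-side rewind-and-replay for late input.
    # Assumes a deterministic simulation and cheap snapshots; real engines rarely have both.
    from collections import deque

    class RewindableGame:
        def __init__(self, history_frames=10):
            self.frame = 0
            self.state = {"x": 0}                        # stand-in for real game state
            self.history = deque(maxlen=history_frames)  # (frame, snapshot) pairs

        def step(self, inputs=None):
            self.history.append((self.frame, dict(self.state)))
            if inputs and inputs.get("right"):
                self.state["x"] += 1
            self.frame += 1

        def apply_late_input(self, input_frame, inputs):
            # The client tags the input with the frame it was reacting to. Rewind to
            # that frame, apply the input there, then re-simulate back to the present.
            # (A full version would also replay the other inputs buffered in between.)
            current = self.frame
            for frame, snapshot in self.history:
                if frame == input_frame:
                    self.frame, self.state = frame, dict(snapshot)
                    break
            else:
                return                                   # too old to rewind; drop it in this sketch
            self.step(inputs)
            while self.frame < current:
                self.step()

    game = RewindableGame()
    for _ in range(5):
        game.step()                                      # frames 0-4 with no input
    game.apply_late_input(3, {"right": True})            # input meant for frame 3
    print(game.frame, game.state)                        # back at frame 5, input counted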
I wonder if you could use something like the RetroArch rollback solution[1] - render multiple versions of the stream, then pick the closest one based on player input. With a fast enough Internet connection, you could let the _client_ pick, then there is no lag.
Of course, the hardware you have to throw at it to do that quickly becomes ridiculous, unless you can come up with a way of efficiently rendering the multiple versions.
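As a toy illustration of that speculative approach: the server could advance one copy of the state per plausible input, and only the branch matching the input that actually arrives gets shown. Everything below is a made-up name for the sake of the sketch:

    # Toy sketch of speculative branch rendering; all names are illustrative.
    import copy

    def simulate(state, inp):
        nxt = copy.deepcopy(state)
        nxt["x"] += {"left": -1, "none": 0, "right": 1}[inp]
        return nxt

    def render(state):
        return f"encoded frame where x={state['x']}"   # stand-in for a real encoded frame

    def speculate(state, plausible_inputs=("left", "none", "right")):
        # Render one candidate frame per plausible input before the real input arrives.
        return {inp: render(simulate(state, inp)) for inp in plausible_inputs}

    branches = speculate({"x": 10})
    actual_input = "right"                              # the input that actually shows up
    print(branches[actual_input])                       # only this branch is displayed

The cost concern above is real, though: the number of branches grows with the size of the input space and the number of frames you speculate ahead, which is why the per-branch rendering has to be cheap for this to stay tractable.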
I can't really speak for Steam Link even though I have one, because it was more headache than it was worth to get running.
That said, I had my Bootcamped MBP with a XB1 controller connected over Bluetooth and any input lag was negligible with the Assassin's Creed Odyssey Project Stream demo.
Why is it tied to only 1 screen? You can use Steam or Nvidia Shield to stream to just about any screen or tablet in the house and it works great. My gaming system is headless and I play all my games streaming to one system or another.
Steam and Nvidia don't have the stellar 11 9s kind of reputation Google has. Also, you have to actually buy those games first, so your library is a tiny fraction of what Stadia offers.
> Also, you have to actually buy those games first, so your library is a tiny fraction of what Stadia offers.
What makes you think games would just be included in Stadia?
Competing streaming services here are around $10-20/month without any games included. Origin's game access is $15/month without any game playing hardware included.
So... what will Stadia end up being, and at what price? Would you pay $30/month for this? When a PS4/Xbox is only $300-400 in the first place, and has a lifespan of ~5 years?
Oh okay, in that case there's no real example you've given for the cost, so $30 may or may not be the cost, but considering a high-end rig approaches $3k and is relevant for 5 years, $30 is a great deal, especially if you get access to a large game library, which is what's happening.
$15/month for a streaming service + $15/month for game access service equals, drum roll, $30/month.
What are you not getting here? Origin is relevant as a reference point for what the games themselves cost on a subscription.
Stadia still needs to actually pay for games, after all. Either as a subscription (reference origin's pricing), or you straight up buy them at the normal $60/pop.
True... but they were talking about how their collection was limited to their single gaming system at home which wasn't true. They have options for that. But both of those are technologies to stream your own games, they do nothing to provide the games themselves. So it really is apples/oranges as a general topic.
Yeah but do you really think you're gonna be like, "You could hand build YouTube at home!" and anyone is going to be like, "Oh okay, WTF Google why even try?"
Keep in mind that 2080Ti will give you a way better experience than this will. The actual specs of the hardware here would fall into solidly mid-range. As in, the equivalent entire system would be less than your 2080Ti alone.
So instead of spending $1500 on that piece of gaming equipment, spend $250-300 instead. That's what you're actually getting here: a very slow CPU with a mid-range GPU. It's a budget build that's not going to drive 4k@60fps with the bells and whistles, not remotely close. It'll be fine for 1080p gaming, but that's realistically it.
+1, ex networking+tools gamedev here and have almost the exact same thoughts.
They haven't fundamentally solved the latency issue, 50-100ms of ping is brutal and the people who could be a target for this generally won't have a fiber internet connection.
Network gameplay works via misdirection, predictable simulations, and knowing how to converge a divergent state (including rewinding in time to re-play hit traces and the like). Streaming gets you none of that.
It'll work for singleplayer games that can tolerate high latency, but I'm not seeing it scale out wider.
I'm not super sure, anymore, about latency being as big a deal. Particularly with regards to monitors. Display lag on a large TV can be upwards of 50ms by itself; display lag on a monitor can be as low as 5ms. If you're viewing this as a competitor to low-end, relatively lightweight console play, the total lag of display plus network is actually maybe not that bad. Also, living in Boston I was seeing 16ms pings on Google's service when I tried it out, which might be an adoption factor. 16ms plus a monitor's display lag is going to be competitive with most HDTVs.
That level of lag definitely makes some games implausible. (Part of why I bought a big-ass OLED is that newer TVs, but particularly OLEDs, seem to have generally consistent display lag; being able to play Tekken 7 on a 55" screen with only 21ms of lag is nice, but to do it on my old LCD is like playing underwater.) But I think it's going to be more practical for more games than the current conventional wisdom suggests.
Yeah, but you hit the nail on the head when you said you have a steady, consistent latency on your TV.
When you're stuffing the pipe w/ 4k encoded content that puts a lot more pressure on the connection and added packet loss or delay is brutal in that case.
All of this stuff is subtle but it compounds in the same way that the average gamer won't be able to immediately spot 30 vs 60 fps but they'll know that the game "feels" better at 60.
I fully agree that it's an inferior good, and I still don't think multiplayer games will be much of a thing. But I do think that even pretty involved action titles (they tested with an Assassin's Creed title) will be okay. Not great, and I'd get mad at them, but I'm also a really demanding user. Most folks kinda aren't.
And when all the clients and the server are sitting in the same DC?
Now you can program your clients and server like they’re all on a LAN. And now you can trust the client. (The client, not the inputs.) Those are huge reductions in complexity. This makes delivering good multiplayer easier, not harder.
But I do agree that input/display latency is going to be a thing. Results may be highly variable and you’ll probably know two people in the same city who have very different experiences with it.
The joystick and display device are just the clients now. You have the same problem just moved around a bit. Without the benefit of being able to use techniques to hide latency, such as client side prediction.
No, the client is still the client. It's still a big bunch of stateful code, but now it's running on the LAN with the server and can't have its bits fiddled by the customer. All that code where the server fastidiously checks the client's untrusted state updates goes away, replaced by a much simpler layer that sanity checks the inputs. All that code where the server has to rewind and replay inputs to deal with late-arriving client state updates goes away (assuming your game isn't cross-DC). All that code where the server is protecting clients from other clients, sending down corrections and fixing up the world so it maintains some semblance of causal order, goes away.
At my day job I'm working on a AAA multiplayer game, and making a properly server-authoritative game with significant client-to-server latency mitigation imposes serious costs in dev time and complexity and security review process. And now, at least in theory, they mostly disappear.
The input/display lag problem is different from what we have now. It's new. We don't, of course, compensate for lag from input to client. (Between client and server, yes, but that's actually a different beast.) And yes, you won't be able to do much about it. You'll be relying on Google to keep your latencies down. They said they're deploying to 7500 nodes, so there's hope. Time will tell. I'm hopeful a lot of people will be reporting single-to-low-double-digit latencies.
But this is much, much more than just moving around the same problem.
I'm used to designing systems and games that work in 100ms+ latency environments, since that's what you see in production (or try mobile, where it can be 2-5x that).
We were doing 250ms ping times back in '99 with games like SubSpace and the like, where you had a predictive game model/design. Yes, you lose the overhead of designing for latency, but only because streaming fundamentally won't work in those high-latency scenarios anyway.
I agree that it will all hinge on what kinds of latencies we see. If Google can produce the same kinds of latencies it does to 8.8.8.8, it's definitely got a shot. Mobile's an even bigger challenge, but I've seen wire-competitive latencies from 5G, so I think there's hope there too.
This does, however, leave VR high and dry. Wonder if they have any ideas about that angle.
Getting a consistent latency to their edge servers is going to be tricky.
Wifi for instance is completely out. All it takes is someone turning on the wrong blender/microwave/power adapter, and just the smallest bit of RF pollution will put you into unusable territory. It gets harder as your population density rises.
You can already see this with in-house streaming today. I can stream Netflix but my local LAN XBOne streaming is pretty unusable for things like Monster Hunter.
As your population density lowers, the chance of having sub-30ms latency with 0% loss decreases significantly.
Like I said in the root, they'll be able to get this to work for non-action (~200ms window) singleplayer titles; the idea that you could use this for multiplayer is laughable.
I'm seeing anecdotes elsewhere in this thread that wifi worked very well for some testers.
Will it be a worse experience than a 2080 Ti on your desktop? Sure. But will Stadia be good enough to win over the expense and hassle of the alternative? Maybe!
Regarding your last point, I'm not sure Id would stake their reputation on delivering a game that is going to be "laughable" on Stadia. They might know something we don't.
I predict a lot of game teams developing new IP for Stadia will impose a single-DC-match requirement, because the time/complexity tradeoff of implementing all that stuff won't be considered worth it. Then, if their game fails, maybe they'll blame that decision. :-)
Yeah, this is my opinion, too, almost right on the nose. Especially the part about only supporting a product for a few years.
I also don't like the idea of games-as-a-service in the sense that I only own a license to the game for as long as I pay a monthly fee. I'm okay with licensing content within the game (i.e. DLC), but I shy away from subscription games services like Xbox Games Pass or EA's Origin subscription.
A streaming game product like this doesn't solve that issue. Additionally, the latency issue, like you pointed out, is too much for me to consider _not_ owning my own gaming hardware.
Have they even said anything about the business model? I'm a few minutes behind on the stream, but I don't recall any mention of licenses or subscriptions. The only "practical" example of launching a game they showed was... out of a button on a YouTube video.
EDIT: A "Stadia store" was briefly mentioned shortly after my comment, it looked like it was tied into Google Play.
The "Stadia store" suggested games would likely be sold or subscribed to. But I definitely have to imagine some sort of trial model being available for the launch-from-YouTube options. Perhaps a time-limited option, that won't stop you from joining a streamer with Crowd Play, but would cut you off after a while if you hadn't already purchased it?
I'm very curious about the expenses game developers will face too, which definitely wasn't brought up. Id Software bragged about using just one Stadia instance, but a later demonstration of split-screen gaming "without compromise" was shown running each screen on a different Stadia instance. While Google suggests this makes split-screen gaming possible again, my guess is that it costs developers twice as much in service fees, since it uses twice as much hardware, which probably makes it mostly a nonstarter anyway.
> they do have a name associated with dropping products after a year or two
Totally a concern. When OnLive closed, people lost all their games, no refunds. Whereas if you had gone to the store and bought a physical disc, it would still be yours.
The average console-quality game is about 50 or 60 bucks nowadays. If you are an active gamer, you could lock up and lose hundreds or even a thousand dollars when they decide to close it.
Most of Google's services are free, but I'd hope they wouldn't abandon paid ones as easily.
Would you be buying games for $50-$60 through Stadia? What if they follow a model like Playstation Now or Origin Access, where you pay a monthly fee to have access to the entire library (a la Netflix)?
I haven't bought a game on physical disk for a long time. My impression is that in many cases, you're still going to be downloading a big release day patch upon installation. And for multiplayer-heavy games, it doesn't matter if you have a physical disk when the servers are shut off.
When Project Stream closed down, Google and Ubisoft did give everyone who played a copy of Assassin's Creed Odyssey they could play via Uplay. As much as these companies really don't want to talk about "when we inevitably shut down", I feel like a lot of player trust would be established if there was a clear commitment to that.
Just FYI, but no one outside the tech scene knows about Google's reputation for dropping support of their products or their eroded reputation in regards to do no evil. I have tried talking to people about it and they really just don't give a shit.
Also Microsoft or Sony weren't companies known for gaming when they entered this space and now they are the leaders.
Gaming PCs need to be upgraded on a 2-4 year basis if you want to stay near the top of the line. One of Google's phones can cost $600-$1000, which is significantly more than a new console. So I'm not sure why you even bring that up, as if there were no economic difference between putting $500+ on your credit card and trying out a $15-$30 monthly subscription.
Computer upgrades amortize pretty well. Over the last 8 years I've made 2 upgrades to my computer: a new SSD ($100 for 1TB) and a new GPU which is overkill for many purposes (~$600 for a GTX 1080). That setup can play many games at higher than 60fps, which beats the hypothetical max of Stadia (and I'll believe "4K at 60fps" when I see it; the Project Stream demo seemed to have a much lower framerate than that while still having noticeable artifacts on a 1440p monitor). Compared to that, $30 a month for streaming (if that's the price) seems as predatory as those companies that let you rent-to-own a TV. Although I'm not a fan of renting games vs owning.
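Rough numbers on that comparison, using the figures above and a purely hypothetical $30/month price:

    upgrade_cost = 100 + 600      # SSD + GPU over ~8 years (figures above)
    years = 8
    monthly_sub = 30              # hypothetical subscription price
    print(upgrade_cost / (years * 12))   # ~ $7.3/month amortized for upgrades
    print(monthly_sub * years * 12)      # $2880 spent on streaming over the same 8 years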
Larry Land even started working on Azure-based solutions. They were more stable, but more expensive.
With all that said, there's definitely an interest in being hardware-light. So this move into a hosted service should probably be pioneered by the big three hosts: AWS, Azure and Google. They are the biggest consumers of hardware at the moment, and have a hell of a lot more financial backing than OnLive (who only had one product to sell). Xbox only has to partner with their Azure counterparts. Although, I grant you there's probably little incentive due to them being hardware sellers.
All this hardware could have been used for better purposes, of course. Like AI training or Pharma... but like in Stross' Accelerando - it's all going to be used to run economics 2.0 anyway.
> people are more and more wary about giving this company any more personal data
Having data on every choice and move in every second of every game would be quite useful for building a model of a particular user with a level of accuracy that barely existed before.
Bungie had to build an infrastructure to push analytics from Halo into their system so they could get "heatmaps" of where players were in a multiplayer map. This fed back into their design process and was used to tweak maps after release.
Building something like that might be easier if all the state is living in a cloud server "right next to" the analytics pipeline, with no need to push state over the external connection. It could, hypothetically, even be built as a library Google supports directly.
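A toy sketch of that kind of pipeline (hypothetical names and units), just bucketing recorded death positions into grid cells:

    from collections import Counter

    def build_heatmap(death_positions, cell_size=5.0):
        # Bucket recorded (x, y) death positions into grid cells so designers
        # can see where players die most often.
        counts = Counter()
        for x, y in death_positions:
            counts[(int(x // cell_size), int(y // cell_size))] += 1
        return counts

    # build_heatmap([(12.3, 44.0), (13.1, 42.0), (90.0, 2.0)])
    # -> Counter({(2, 8): 2, (18, 0): 1})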
That's for sure, but Google wouldn't be Google if they didn't invent a way to derive from that data something like "neural heatmaps" of a particular user to use for the advertising part of the business.
Not necessarily. Google doesn't derive advertising (or any other data beyond broad bytecounts, for that matter) from the cloud-customer data stored in their cloud storage infrastructure.
I would definitely try this on some quirky little games that I love to watch streamers play but am too lazy to download. If play could start in minutes, I would try it.
I think we'll have to see what Microsoft is going to do at E3, because they've signalled in the past that they want to take this streaming approach, and they have their own cloud too...
This doesn't offer anything more compelling over using Steam? Excuse me?
At this point I see comments like yours in here (from "games industry" insiders) and I see horse breeders decrying the car. "It takes gas instead of grass, it'll never work!"
The sheer technical complexity of this: distributed computing, HPC, networking, hardware, cloud, partnering with other firms, etc. It's hard for me to even imagine the work and ingenuity behind it.
Even if the venture fails, I am impressed with the tech part.
The problem is Google will never get high-quality content from big publishers; it directly competes with their revenue, so I'm not sure how they're going to get games onto this platform.
Ubisoft's revenues are ~$2B. Google's is $130B+. Google already pays Apple $10B+ annually just to be the default search engine on iOS/macOS. You really think there's no room for negotiation here?
Apparently it's a Goldman Sachs estimate. The last time that number was explicitly listed (in court filings) was $1B in 2014. The more apt metric would be what YouTube pays for original content, though I've only seen estimates/projections (a "few hundred million"):
They didn't talk about monetization. They could presumably sell Stadia games like Steam games (i.e. there's no subscription). They did mention being able to jump into games quickly, but there are plenty of ways to make that work (e.g. giving everyone some free time).
I'm concerned about how this will work for game libraries.
If I buy a console and maintain the hardware, I can still play all the cartridge / CD games I bought for it forever, indefinitely. The hardware target does not change.
Google has a track record with their cloud infrastructure (such as App Engine) of requiring developers to keep updated as the engine changes. Will they attempt a similar model with the system against which Stadia games are built? And if a given developer doesn't want to invest the roll-forward effort (or dies out), does Google maintain an older version of their infrastructure indefinitely to run no-longer-updated-by-developer games, or do they pull those games from their offering to free up resources to run newer and more popular games?
This seems a bad investment for a gamer who is interested in returning to a beloved treasure 5-10 years out. Even my old Steam purchases can be re-installed and run on my own new hardware.
Not to mention the frequent horror stories of people having their account closed by Google, often by mistake. There's even a link about a story like that right now on the front page. [1] Good luck getting in touch with a human and getting your saved games back if that happens to you.
I really hope the latency we've seen in the live demo was due to the venue's poor internet connection because otherwise this will make some games unplayable. They mentioned that Doom Eternal will be available and that it would be a good benchmark for the service's latency but I'm waiting to see some independent reviews to believe it.
Also I don't really understand who's the target audience for this, I doubt people who currently own a console or a gaming PC will care unless the prices (of games or subscription) are extremely competitive.
The only thing that got me excited is that the infrastructure runs on Linux and AMD, this could have a great impact on Linux gaming.
Latency: Project Stream has been in beta since January, and there are a lot of testimonials out there. Other than fighting games and fast pace shooters, the latency seems to be imperceptible, assuming a good connection.
Target audience: Probably people who used to own consoles/gaming rigs, but no longer do for a variety of reasons. The fact that you can play anywhere and on any device (Chromecast, phone, laptop) makes it competitive with devices such as the Nintendo Switch or Nvidia Shield.
> Other than fighting games and fast pace shooters, the latency seems to be imperceptible, assuming a good connection.
There are certainly games out there which would respond poorly to latency and jitter. Play Hollow Knight or Dark Souls, but randomly jitter your inputs by 10ms to 20ms here and there, and it would be a far more frustrating experience.
It's bad enough that those punishing games can kill you quickly with a single mistake. But when you're fighting the controller and the lag, it's that much worse.
I think streaming would be definitely fine for games like RPGs, or slower-paced adventure games (ex: Skyrim). But this will never be acceptable for Cuphead, Overcooked, Hollow Knight, or Touhou (or Jamestown). I'm willing to be proven wrong of course, but I've generally felt that even TV-lag is enough to make me rage-quit some setups when playing these games...
And TV-lag is consistent. WiFi based internet lag is jittery by nature. The question of what to do with dropped packets or delayed packets is an important design choice, and I don't think there's a correct answer for these twitch games like Cuphead.
---------
Fighting game players are so crazy about latency that they don't even use local Bluetooth controllers (!!) The serious fighting game community will likely never accept something like this solution.
The more casual, but still relatively twitchy, "hard games" players (Cuphead) is what I'm curious about. Whether or not the latency is acceptable for that community.
I can't reliably play Dark Souls via Steam Link over a wired connection that's less than ten feet long. Google may have some special sauce that improves the situation but like you I doubt it would be sufficient for all games.
I hear this a lot, and this is typically a network optimization problem. Networking is hard, and "a cable with enough bandwidth" is only part of the issue. How many hops between devices? Are any devices in the chain wireless? What router are you using? How many other devices are connected to it, and what kind of load are they putting on it?
FWIW, I frequently stream PS4 and PC games, wirelessly, over 802.11ac and it's more than playable. Dark Souls III, and even Overwatch, is playable over my network.
Just a hint: TV latency varies between models from 10ms all the way to 200ms (!!) in my experience. 200ms-latency TVs completely wreck almost any video game experience, even if it is "smooth" and non-jittery (a "timing" based game like Guitar Hero can compensate to some degree, but few games work at that level).
It's very possible that your issue is in the TV, as opposed to a network issue. If your TV latency is below 30ms, the TV is probably fine.
-----
For whatever reason, computer monitors consistently score around 10ms, maybe 15ms in the worst case. It's TVs that have all this processing that kicks the latency to 50ms or 100ms+, especially larger TVs.
Well AMD will have to keep writing good Linux drivers which will most likely benefit everyone with an AMD card. They're also partnered with Unity and Epic Games which means these engines will make it easier to target Linux as the release platform and maybe port some DX-only features over to Vulkan.
Which doesn't mean any of those changes will be contributed back upstream, just as happened with many of the PS4 features in clang and AMD drivers, where only what wasn't business-critical from Sony's point of view was contributed back.
Google is really promoting how it's better to develop for than anything else, since they can constantly upgrade the hardware. How long before Stadia-exclusive content?
They aren't the first or the only one doing this. Crackdown 3 already semi-famously uses Microsoft's cloud systems to do some physics processing in the cloud and not on the client.
Stadia might be a lot easier for devs to code against since everything is in the "cloud", rather than having to split where the processing happens specifically, and so we might see more exclusives there because of that, but MS and friends aren't going to just fall over and give up if this ends up working.
> Crackdown 3 already semi-famously uses Microsoft's cloud systems to do some physics processing in the cloud and not on the client.
Is that true though? I remember in the early Don Mattrick days of Microsoft they were heavily pushing the whole cloud physics thing but ever since then that has been swept under the carpet or heavily downplayed.
I haven't played it myself nor have I done a ton of research, but Crackdown 3 just came out recently, and I heard them talking about how important the cloud aspect of it is to enable the dynamic destruction they have in their multiplayer stuff.
The link below from late 2018 talks about it a bit
This is not the first service that tries to do this. And as before, it all comes down to how close the gamer is to the server node, how stable the user's connection is, and whether the connection is metered or not.
I just pinged my google.com server node - and it's over 16ms. Which means no 60fps.
If all the players are in this service, it means lag is much easier to deal with for developers, since it's only input lag. All the players' actual compute/rendering would be on a very low latency network.
You no longer have to deal with all the other networking problems between the client software and the server software.
My point is the following: you have a character standing in a world. You press "w", at which point the character has to move, right? On a local machine, that's what you get. Not so much with streaming - first your input goes to Google, gets processed there, and comes back to you, and only then do you see the character moving.
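Back-of-the-envelope, with made-up per-stage numbers, the round trip looks something like this - the point being that every stage is serial:

    # Entirely hypothetical per-stage numbers.
    stages_ms = {
        "controller / input sampling": 4,
        "uplink to datacenter": 8,
        "game simulation + render": 16,
        "video encode": 5,
        "downlink to client": 8,
        "video decode + display": 10,
    }
    print(sum(stages_ms.values()), "ms from key press to visible movement")
    # -> 51 ms under these assumptions; locally you'd skip the network and codec stages.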
Yes, even with a Steam Link over Ethernet in my house there's a slight but noticeable input lag. Most games are playable but it doesn't feel quite right.
Rocket League is still fun no matter what, even with a little input delay! Turn based games are fine of course. But the only game that has been unplayable for me was Hollow Knight. It requires you to react very quickly to attacks and I just can't get the timing right if I play on the Steam Link.
Surprisingly, a platformer that works decent on Steam Link is N++. It has more fluid controls than Hollow Knight though.
But the server knows which frame you pressed the 'w' on. So when it gets the 'w', it rewinds its state machine X frames (hopefully X==1), applies your input, moves the state machine forward X+1 frames and displays that state.
So there might be a slight visual glitch but you won't be missing your targets.
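A heavily simplified sketch of that rewind-and-replay idea (hypothetical, single player, 1D movement):

    def simulate(state, inputs):
        # One fixed-step tick of a placeholder simulation: move right on "w".
        (x,) = state
        return (x + (1 if "w" in inputs else 0),)

    class RewindingServer:
        def __init__(self, initial_state):
            self.frame = 0
            self.states = {0: initial_state}   # frame -> state snapshot
            self.inputs = {}                   # frame -> set of inputs

        def tick(self, live_inputs=frozenset()):
            self.inputs[self.frame] = set(live_inputs)
            self.states[self.frame + 1] = simulate(self.states[self.frame],
                                                   self.inputs[self.frame])
            self.frame += 1

        def late_input(self, pressed_frame, key):
            # A late input arrives stamped with the frame it was pressed on:
            # merge it into that frame, then replay up to the present.
            self.inputs.setdefault(pressed_frame, set()).add(key)
            for f in range(pressed_frame, self.frame):
                self.states[f + 1] = simulate(self.states[f], self.inputs[f])

    srv = RewindingServer((0,))
    srv.tick(); srv.tick()        # frames 0 and 1 simulated with no input
    srv.late_input(0, "w")        # the "w" actually happened on frame 0
    print(srv.states[srv.frame])  # (1,) - as if the input had never been late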
If that's round-trip latency, then input latency will be 8ms, right? Input-action-to-frame-update latencies are reliably 12-13ms+ with a local console routing through a TV.
So if they can do something fancy with rendering frames to the network to sync with your monitor, we might see only a minimal increase in actual overall latency.
As someone who participated in the beta, it's not as good as they make it sound. The compression of the video stream is quite bad in any slightly intensive scenario, the latency is certainly noticeable, frame drops every once in a while, and in general it's nowhere near as fluid as a local gaming experience.
I have gigabit fiber and quite a good machine, so I'm certain that's not the issue. Maybe some of these issues could be solved by closer edge servers, but it'll be a while till it reaches the level where you can play FPS', etc.
From a preservationist angle, this would be a terrible path for games to go down. Any exclusive games released using Stadia would be impossible to play once the servers shut down because users don't have any access to the software.
Even if MMOs shut down, there is usually an opportunity to write a new server by reverse engineering the client. I can't really support anything about game streaming unless it was tied into an actual game you could install or play without relying on their servers to exist.
I'm skeptical they'll be able to overcome the latency problem. At some point there's a hard cap that only infrastructure can help with - not software - and Google's already given up on rolling out their own infrastructure.
I did the beta, and I think the service is great, but I used 300GB of data in two days of play. There's no way this is going to fly when most people are under a TB cap.
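That usage sounds about right if you assume a high-quality stream around 35 Mbps (my guess, not a published number):

    bitrate_mbps = 35                         # assumed stream bitrate
    gb_per_hour = bitrate_mbps / 8 * 3600 / 1000
    print(round(gb_per_hour, 1), "GB per hour")                 # ~15.8 GB/hour
    print(round(1000 / gb_per_hour), "hours to hit a 1 TB cap") # ~63 hours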
Former PSN eng here.
So, it will be very interesting to see what is going to happen. I feel we were not very successful with PlayStation Now. It requires a lot of money to even start scaling: you need a powerful (read: expensive) machine which can be fully utilized by just one AAA game. And for any competitive gaming experience, latency is a big one, as you're going to have latency to Google's machine and then to the game server.
Anyway, I'm not in the gaming business any more, so I wish G good luck in disrupting the industry. It's just that they were not really that successful in disrupting anything they didn't acquire (read: copied and tried to improve), but maybe this time...
What do you think of Google's controller? Having it bypass the laptop/chromecast and connect directly to wifi seems like it will save some latency, but how much?
The controller was the most exciting part of the announcement, for me. I think it's a really smart move to connect the controller directly to the network and not through the "console". It makes the controller a first class citizen and gives it the ability to do all kinds of neat things.
My guess is they did it because of the browser. The idea is very good and makes total sense - you don't have to integrate your controller with every browser, you just stream its input directly.
Honestly, there's got to be a better solution than full-on streaming. Has anyone seen the demo where there's >2 seconds of input lag? Even with around 0.1s of input lag, what can I play comfortably? Any kind of action game or platformer is out of the question. Ironically, those games are the ones which usually demand high processing power. Most other games are toaster-compatible and I can run them without paying(?) Google a streaming premium.
I also like being able to play games when on the bus or the subway, so the whole always-online thing is a bit of a bummer.
One of the biggest reasons to install windows is gaming. If this takes off (or enough games are released on linux as a byproduct) linux could gain a larger install share.
I participated in the beta. Everything was great except the graphics became super grainy and pixelated when you rotated the camera or during high motion scenes.
My download speed is 50 Mbit and I had about 5ms ping to the Google servers.
I don't think US internet infrastructure is ready for this unless you live downtown with 100+ mbit speeds.
> I don't think US internet infrastructure is ready for this.
This is a very fair point and should be emphasized. It's not only the infrastructure, but the lack of Net Neutrality, the monopolies of many areas to a single ISP, and data caps -- they're all hurdles in the US.
Competitive games already need to compensate for latency in the network. This system technically won't have any "additional" latency, it will just move where it happens.
The amount of time between you hitting a button and a server somewhere registering the command and sending it to other players should be roughly the same (or close enough not to matter all that much for the vast majority).
Sure, some games where local timing is everything won't work as well (Super Smash Bros probably won't be able to be played on this system at the top tier), but the vast majority of players and games won't ever need that kind of millisecond precision.
It really doesn't take much input lag for things to feel sluggish and unresponsive. I've tried to play Rocket League on a Steam Link, and while it was technically _possible_ to do so, there was just enough sluggishness in the controls that it just felt _off_ - like I was at a disadvantage to every other player. Even something as simple as a 2D platforming game can require precise timing of button presses, and it seems unlikely to me that Stadia or any other similar service will be able to fully address this. I'd love to be proven wrong, however.
I'd imagine a big part is that games which are written for a system like this can do some compensation if they know the latency is between controller and the system (vs games which expect the latency between the client and the server).
I'm also curious if there is anything else going on here than just "streaming video to the user", like allowing some "smearing" to compensate like how VR headsets will do some fancy stuff to compensate for head movement faster than the round trip communication time between the sensors on the headset and the PC.
Latency hiding in "traditional" multiplayer games basically works by doing computations for other players locally (even if it's as simple as dead reckoning and extrapolating a few milliseconds into the 'future'), and only apply corrections when this diverges too much from the server because of network hick-ups. Unless Google invented some new magic, that's not possible with a video stream (but hmm, who knows, maybe they have invented their own 'game-streaming video codec' with additional information for some sort of 2D-dead-reckoning for regions of the video frame... but my guess is they simply don't care about latency).
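For anyone unfamiliar, a minimal sketch of that extrapolate-then-correct approach (hypothetical names, 2D positions):

    def extrapolate(last_pos, last_vel, time_since_update):
        # Dead reckoning: keep moving a remote entity along its last known
        # velocity until a fresh server update arrives.
        x, y = last_pos
        vx, vy = last_vel
        return (x + vx * time_since_update, y + vy * time_since_update)

    def correct(predicted, authoritative, blend=0.2):
        # When the server's position diverges from the prediction, blend
        # toward it instead of snapping, so small errors stay invisible.
        px, py = predicted
        ax, ay = authoritative
        return (px + (ax - px) * blend, py + (ay - py) * blend)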
Yeah, i'm really curious to read more about the tech here.
I know things like the Oculus Rift does some fancy processing to kind of "smear" frames to compensate for head movement while the PC renders a new frame. I'm wondering if this service is going to be doing a bit more than just "streaming video to the user". Like including some metadata and allowing the client to make cheap adjustments to various parts of the scene to compensate for lag?
Either way, I really want to see some people play things on this and talk about how it feels, because I can't imagine Google would just release a gaming platform that's nothing more than streaming video. Especially since it's been tried multiple times before.
Yep, if you did something where local controller input directly impacted the motion vectors for macroblocks in a video frame...you could maybe hide some of the latency, but that seems dubious...
Why wouldn't you be able to apply the same type of compensation for games like Super Smash Bros? The system (should) know what frame was being displayed when you press your button, and would use that to calculate whether it was a hit or not, rather than the current position.
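Roughly, assuming the server keeps a short history of position snapshots keyed by frame number (a toy sketch, not any particular game's netcode):

    def check_hit(history, input_frame, shot_pos, hit_radius=1.0):
        # history: frame -> {player_id: (x, y)}. The hit is tested against
        # where targets were on the frame the player actually saw, not where
        # they are "now".
        for player_id, (x, y) in history.get(input_frame, {}).items():
            dist = ((x - shot_pos[0]) ** 2 + (y - shot_pos[1]) ** 2) ** 0.5
            if dist <= hit_radius:
                return player_id
        return None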
Let's start by getting the obvious out of the way - there's no such thing as competitive gaming on a gamepad. Then you have just encode/decode taking longer than some competitive players' whole end-to-end latency.
That's what everyone said about controllers for FPS games, yet there's still console tournaments for Call of Duty and the like. If everyone is playing with the same handicap, it's still a fair match. And sometimes mass appeal is better than playing at the top level.
Some people - strange people - actually prefer a gamepad as an FPS controller, probably because that's what they grew up with. Nobody prefers extra input lag.
It's a smart move for Google to leverage their huge captive audience on YouTube to drive adoption. With Amazon's similar position with AWS and Twitch I'd be surprised if we didn't see a similar offering from them.
Any indication on how this ties in with the Google/Improbable deal for big-world games? Those work better if all the clients are in the same data center, a few microseconds apart.
This only works with Google web clients, such as Chrome, right? It may turn out that all this is just a ploy to force everyone to use a Google web client, giving Google total control over the online user experience. ("Browser" is so last cen.) Then Google can exit games, except for putting ads in them and on top of them.
I sometimes casually play Fortnite over Nvidia GeForce Now via xDSL over wifi.
And while I do enjoy myself, lag hinders the overall experience, at least in FPS games.
My immediate thought is that Google is going to be actually running the games on hardware in edge data centers (to keep latency low). If you own a cloud software business, you're likely looking for any revenue-driving excuse to build out hundreds/thousands/etc of small DCs in every corner of the world. Stadia revenue (or the promise of Stadia revenue) would pay to build out the infra, and extra capacity could be sold off to GCloud customers.
They do already have at least _some_ edge presence. But my understanding is that's largely been for distributing content, like YouTube. I imagine storage/bandwidth at the edge are largely solved problems for Google.
Has Google needed to run code physically closer to the user up until now? Likely not. Gmail/AdSense/etc. probably don't have [many] use cases where putting compute physically closer to the user (versus physically closer to a database replica) adds a ton of value.
With the advent of AI/ML, that's changing. Accurately recognizing speech, recognizing people and objects in streaming video, translating text, etc. likely don't require access to other resources, but the latency between the server and the user is super important. Video games are essentially the epitome of the use case: you're not going to need to bang out many queries across the Google network to play a round of Splatoon, but every millisecond of lag you can shave off, the more valuable your service becomes.
I could see Google selling cloud functions that execute within shouting distance of the user. What would the round trip be? 30ms? This could be a play by Google to take on Cloudflare workers.
In London, the fastest internet available to me at my address in zone 2 is 12Mb, heavily contended in the evening.
This isn't for everyone, clearly a bet for the future. But this is why Google Fiber should have gone wider and international, or governments should pressure telcos to build speeds massively in excess of demand.
This is what I've seen everything trending towards for almost ten years now. The next step is to build hardware that utilizes the full benefits of data center gaming. VR headsets can shrink and are only limited by the method of getting light into your retina. Same goes for any kind of haptic or audio feedback.
A more philosophical question. If they manage it with a hard, computationally and bandwidth intensive problem like games, does this mean that this kind of computing is coming for good for every kind of activity?
Is this the last time we need to upgrade to an expensive graphics card?
I don't understand giving a keynote like this and then not launching the product for months. Apple figured it out long ago - way more power in "it's available now" than the alternative of forcing customers to wait an indefinite amount of time.
I wonder how much of the gaming market consists of games that require low latency.
I would be perfectly fine playing Civilization or turn-based games on this infrastructure, as sub-100ms latency probably isn't that big of a deal there. Perhaps Google is going after that niche?
Services like this are already in operation, see Shadow and GeForce Now. Developers don't need to do anything to have their games supported on these platforms either, so I don't understand what Google are bringing to the table with Stadia.
I used this in beta on a cheap chromebook. It worked great, and since I don't like putting games on my machines or upgrading hardware or using Windows, it was perfect. Realistically, it was the only way I could play this game on this hardware.
This will boost PC gaming significantly. Now everyone can experience PC-quality games on a tablet, phone, or mediocre PC. But it's bad news for mobile games and gaming hardware manufacturers IF Stadia's pricing is affordable enough.
Google is also launching a new Stadia Controller that [...] will work with the Stadia service by connecting directly through Wi-Fi to link it to a game session in the cloud. This will presumably help with latency and moving a game from one device to another.
Sounds like a horrible design choice. My controller now needs to be wifi connected, so I can instantly change my game session from my phone to my TV?
Focusing on developers, Google also unveiled an impressive way for game developers to apply their own design style to titles on Stadia. It’s a machine learning-based style transfer tool that developers can use to simply drop an image into the video frames of games and have it mimic the style throughout.
Sounds like another gimmick that nobody has ever, or will ever need. But "Machine Learning!"
How does it improve latency? The device it is connected to is already internet connected and there is no meaningful latency between the controller and the device. So how does the on-board wifi help?
Adding wifi to a device for no reason is horrible, I don't suppose it needs explaining.
By directly connecting the controller to the service via WiFi it removes any potential issues with the device having to act as a proxy for the controller input. For their dedicated Stadia set top box it isn't a big deal (as they will control the hardware) but when used with phones, third party STBs and such it can be problematic. Google will have no control over the WiFi hardware in all these devices and many cheaper WiFi chipsets are buggy in a multi-device setup.
Adding WiFi to the controller removes all those potential issues and allows them to tweak things to be as good as they can be for latency, power consumption, range, etc. (at least in theory anyway).
Yes, I was thinking along the same lines. It seems like the combined latency of bluetooth, passing that input through the OS, browser, then out to wifi could be considerable, especially if the machine's wifi and bluetooth chips are not optimized for low latency. Since Google makes the hardware of the controller, they can use all the lowest latency chips and drivers possible.
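With entirely made-up per-hop numbers, that comparison looks something like this:

    via_device = {"bluetooth link": 8, "OS input stack": 2,
                  "browser event loop": 4, "device wifi uplink": 3}
    direct_wifi = {"controller wifi uplink": 3}
    print(sum(via_device.values()), "ms vs", sum(direct_wifi.values()), "ms")  # 17 ms vs 3 ms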
I'd be very interested to see some latency benchmarking of the stadia controller vs an xbox controller. I'd also be curious if a future version of this controller could connect directly to an even lower latency 5G connection, bypassing the wifi router and cable/fiber modem and removing yet another latency question mark. Even when playing locally, input lag can be considerable, so this might be a very smart hack to make this streaming service playable with a larger range of games.
"no meaningful latency" --> citation needed. In gaming, every ms counts.
"for no reason": I can think of a variety of ways it will be beneficial to the overall utility and simplicity of setup. Seems like an odd thing to be so negative about.
Isn't this similar to Nvidia's GeForce Now service? I've been impressed with Nvidia's beta; with a good connection you can play a ton of games at maximum settings with just minor input lag.
Yeah, there's quite a few companies that are working on a similar product, including PlayStation Now (Sony) [0], GeForce Now (NVIDIA) [1], and Shadow [2].
Can I buy just the controller? Sure does seem the PS4 controller is about the best out there for most gaming, and I've been hesitant to buy anything else. This controller seems just as well made.
Jade Raymond is at the helm!? Color me very excited, she was running the original Assassin's Creed game back in the day. She's a very talented producer!
Stadia looks like a game changer; I bet it will be massive by 2025 when internet speeds improve. Only companies with major infrastructure (Microsoft and Amazon) will truly be able to compete with Google. The cost of hardware can even be spread between multiple people, lowering prices. Plus it's divided monthly via a subscription, meaning a lower cost of entry versus buying a new console.
I give this 2 years before Google removes all references to "Stadia" from its product lineup. They have never stuck with a product which, relative to the rest of their products, has smaller but dedicated adoption.
For example, Valve will keep updating/supporting/hosting events for CSGO even if it has <1Mil active users on PC, but I can totally see Google pulling the plug on it and burning dedicated followers.
> For example, Valve will keep updating/supporting/hosting events for CSGO even if it has <1Mil active users on PC, but I can totally see Google pulling the plug on it and burning dedicated followers.
Bad example because CSGO has been sitting at #2/#3 on steam's active player charts for years and years. Consistently ~10x the player count of, say, Rainbow Six Siege. Continuing to invest resources into a game with CSGO's player count is just all around solid business strategy, particularly since it has ~500k concurrent users at any given point. Active player count would be way, way over 1Mil.
TF2 would be a better example, but I don't know how much (if anything?) Valve is still doing with the small TF2 community. It still has a respectable 50k concurrent user population, but has Valve been ignoring them? Are there still updates for it?
Particularly if Facebook overtakes them with some sort of streaming Oculus appliance. This is already a killer app for people on the edge about upgrading older PCs. I beta tested this, as did a friend - he was able to play on a Chromebook! But if I could play Elite: Dangerous on a streaming VR console, I'm not sure I'd ever buy another high-end PC again.
Could this be peak attention economy? I fully expect - no, I demand - that the price to play these games be fully compensated for by the data I generate from playing each game.
TLDR: >60ms of added latency (160ms total) while plugged into a connection on Google's campus, in a specially modified and tweaked build of the game, versus an unmodified, typically bad Ubisoft PC port; plus compression artifacts.
I've used most of these applications in the past and I'm pretty salty they died. So yeah. I don't have a lot of faith in Google keeping services it doesn't find profitable; my guess is the cost of compute will exceed the profit from selling games/game streaming/whatever and then, yes, they will shut it down. Don't act like it's something they don't do or wouldn't do. It's certainly not like making fun of the fat kid at the gym. Or are we turning into hail corporate over here?
Given the number products and services that Google has shuttered over the years and their focus on learning all there is about their users, I am going to sit this one out.
There is definitely some cool work here though: being able to have access to a library of games without lugging around a console or computer is an interesting idea with both pros and cons. The pro that I can think of is not having to be responsible for upgrades. The con is that we continue to move towards a world where we rent more and own less.
Probably worth noting: this project appears aligned with Google's Cloud and mobile initiatives. It's unproven, but it's originating from spaces where Google's products have stuck around.