
> and this had to be paged in and out dynamically, without any "hitches"—loading lags where the frame rate would drop below 30 Hz.

This is what gets me. Modern game development seems to say "eh, a little hitching won't hurt anyone", and then we wind up with games that run like shit. Even on consoles.



I work as a programmer in the games industry, and I feel like the problem is made worse by artists and level designers who add more stuff without worrying about performance. I can make a super efficient physics system or model loader, but that only means someone somewhere is going to add more particle effects or lights, or place so many props in the scene that the PS4/X1 can't handle it. In fact, the separation is huge nowadays - I know our engine inside out, but I personally wouldn't really know how to use the editor to remove things from the scene. Likewise, a level designer will know how to put props in the scene, but they will have no idea how the underlying things are wired together or what the performance cost of using them is.

It's a complicated problem, which might have been made worse by the fact that games are simply easier to make nowadays than ever before.


Back when I was somewhat involved in the gaming industry, a very, very smart programmer told me the reason his game, with its fancy new, innovative, advanced engine, didn't work out.

If you make it possible for level designers to make six-square-mile levels, they all make nothing but six-square-mile levels.


The internal Commandos Level Editor had a bug where it reported twice the actual memory footprint for a level. When the boss found out, he ordered the tool programmer NOT to fix that bug or tell anyone about it.


This is eerily similar to the problems we face in the browser space. Make CSS styling faster and developers will just write more complex CSS to take advantage of it. Then people blame the browser (and the Web in general) for being slow.

The difference is, sadly, that we don't control the assets at all. :(


I am not a game developer, just a developer and a gamer, so please forgive what may be a goofy question: have you seen or engineered systems that capped the level designers' resources? For a super simplified example, I think of Forge in the Halo series, and I'm pretty sure Forge had a certain amount of Monopoly money that gamer-designers eventually ran out of. Would such systems be infeasible more for political reasons than technical ones?


The resources are usually capped in a "soft" way. So if the scene uses too much RAM for a console to handle, it will trigger a chain of emails sent to everyone involved, and well... either I will be given a new task to somehow fit all of it in memory as a programmer, or a designer will be given a task to remove some stuff to get below the limit. But framerate can be really tricky to handle, because ultimately it's not up to me whether fps drops are acceptable or not.
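
To make the "soft cap" concrete: it's usually just a budget comparison baked into the asset pipeline or level build. A minimal sketch in C++ - the budget number and every name here are made up, and a real pipeline would notify the team rather than print to stderr:

    #include <cstdio>
    #include <cstdint>

    // Hypothetical per-level texture budget; real numbers come from the
    // console's RAM layout.
    constexpr uint64_t kTextureBudgetBytes = 512ull * 1024 * 1024;

    struct LevelStats {
        const char* levelName;
        uint64_t    textureBytes;
    };

    // "Soft" cap: flag the overage, don't hard-fail the build.
    bool CheckBudget(const LevelStats& stats) {
        if (stats.textureBytes > kTextureBudgetBytes) {
            std::fprintf(stderr, "[budget] %s over texture budget: %llu > %llu\n",
                         stats.levelName,
                         (unsigned long long)stats.textureBytes,
                         (unsigned long long)kTextureBudgetBytes);
            return false;
        }
        return true;
    }

    int main() {
        LevelStats level{"hypothetical_level", 600ull * 1024 * 1024};
        return CheckBudget(level) ? 0 : 1;
    }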


Is there no standard? Like, if fps drops below 30 on a playthrough - red light and fix.


The problem is that fps stays below 30 for 90% of the development cycle. Only in the last few months of a project does everyone rip out useless code, debug code, debug overlays, loggers, and the extra network connections for statistics and performance servers; only then can you produce a "nearly" final version of the game with no debug code in it, and only in that state can you see how it will run on actual hardware and start optimizing from there. So basically you arrive at a situation where you have 3-6 months before release, the game runs at 20fps on a console, and you have to somehow make it run at 30 or 60fps. Obviously profilers help with that a lot, but it's rarely a process you can afford to be doing during most of development.


This is why we see all these Early Access games that run like garbage. They are still in the "develop new features and fix bugs" phase, and have not yet gotten to the "optimize and strip out debug code" phase.


And yet, I wonder if they couldn't release a production build without all that (keeping maybe crash handling/reporting). Early access games are to gaming what agile development is to software - or should be: going to production several times a day without giving up on the application's performance.

Early access game devs pushing unoptimized releases should set up their release system betterer.


This exactly - no reason why you couldn't ifdef everything that is debug only, if your team is consistent with it from the very start.

Backfilling existing code with ifdefs and dealing with compile breaks and other, weirder problems can be intimidating and time-consuming, with hard-to-define ROI, so I can empathize if someone doesn't do it.
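
For anyone who hasn't seen the pattern: debug-only paths just compile away entirely in release builds. A minimal sketch - the macro name GAME_DEBUG is made up, every studio has its own:

    #include <cstdio>

    // Build with -DGAME_DEBUG for internal builds; release builds omit it,
    // so the logging compiles to nothing and costs zero at runtime.
    #ifdef GAME_DEBUG
        #define DEBUG_LOG(...) std::printf(__VA_ARGS__)
    #else
        #define DEBUG_LOG(...) ((void)0)
    #endif

    void UpdateFrame(float dt) {
        DEBUG_LOG("frame start, dt=%f\n", dt);
        // ...actual game update; overlays, loggers, stats-server
        // connections, etc. would sit behind the same #ifdef...
    }

    int main() { UpdateFrame(0.016f); }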


That sounds like a very, very risky software development process. I think best practice is to make sure release builds are done, and used for testing, from day one. QA doesn't test with all this debug code and these debug overlays, do they?

#ifdef DEBUG


They do. The project I'm on has been in development for 5 years, has hundreds of people working on it globally, and not a single #ifdef DEBUG in it. It's currently my job to add that in; it's going to take weeks to finish.


I remember an interview with some guys from Criterion who said they always made sure to maintain 60fps throughout the development of their game.


At least in my own experience, this is not standard throughout the industry.


Oh yeah, I know it isn't. But it seemed to work for them back when they made 60 fps games.


This is also why improving fuel/power efficiency in industry usually doesn't help the environment.



I was recently playing Axiom Verge...

Great game, except it has NES graphics and stutters like hell.

Wasteland 2 also ran really, really badly on my machine - it was unplayable (it was the first time I got pissed off about kickstarting something).

Kerbal Space Program also has some performance issues, but not as bad as the previous two.

Then I go play some graphics-heavy game made by a studio that likes to make good tech, or play emulated Wii or PS2 games, and there are no issues and the games look awesome.


Two of those games use the Unity engine, and Axiom Verge apparently uses MonoGame.

Garbage collection is often a big problem for real-time performance, especially on the old version of Mono that Unity has.
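
The standard mitigation, whatever the language, is to stop allocating during gameplay: preallocate a pool at load time and reuse entries each frame, so the collector (or allocator) never runs mid-frame. A rough sketch of the idea - this particle pool is illustrative, not from any of those games:

    #include <vector>
    #include <cstddef>

    struct Particle { float x, y, vx, vy; bool alive; };

    class ParticlePool {
    public:
        explicit ParticlePool(std::size_t capacity) : pool_(capacity) {}

        // Reuse a dead slot instead of allocating; if the pool is full,
        // drop the effect rather than grow (and stutter).
        Particle* Spawn() {
            for (Particle& p : pool_)
                if (!p.alive) { p.alive = true; return &p; }
            return nullptr;
        }

        void Update(float dt) {
            for (Particle& p : pool_)
                if (p.alive) { p.x += p.vx * dt; p.y += p.vy * dt; }
        }

    private:
        std::vector<Particle> pool_;  // one allocation, at load time
    };

    int main() {
        ParticlePool pool(1024);                     // allocated once, up front
        if (Particle* p = pool.Spawn()) p->vx = 1.0f;
        pool.Update(1.0f / 60.0f);
    }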


The worst thing that ever happened to game development was on-demand updates. There is nothing worse than buying a game on release day only to wait for it to download a patch.


It is both good and bad. It allows delivery of critical fixes and new content, but it also decreases the demand for code quality from the start, as well as encouraging DLC. The bigger issue, though, is for the people who can't get updates at all, like those whose only options for internet are dial-up or satellite (with a 5GB per month limit). Sadly the market doesn't care about this small group enough to matter, and we get left behind.

But even with a worse experience than a day 1 patch (namely, being unable to get the day 1 patch because it is 10GB and you only have 5GB for everything for a month), I wouldn't call it the worst thing ever to happen to game development.


Console games survived for many, many years without the ability to issue critical fixes, and they seemed to do OK. I've played console games since 1991 and I've never run into a game with bugs that made it unplayable. I think the QA cycle would be more complete if developers didn't rely on over-the-air patches.

It shouldn't be acceptable to ship a knowingly subpar product with the attitude "we can always issue an update later."

There is an art to exploiting bugs in old games and glitching your way into worlds you aren't supposed to enter yet; the time and effort that goes into finding these is really amazing. Take a look at this - https://www.youtube.com/watch?v=aq6pGJbd6Iw Skip to 12:15 for the real insanity.


Yup, yup. Running into a game-breaking bug, bringing the game to the store, and being told that you've got a "damaged" disk that they will replace (with a patched game if it's already been rev'd, or with an exactly identical disk otherwise) was so much more fun. Miss those days too.


When we submit games to Sony/MS/Nintendo for publishing, we usually have to do it 2-3 months in advance. What are programmers supposed to do in the time between the "end" of development and the release date? Of course everyone works on the little things that were left - you might as well release them as a patch!


I get the sentiment (they should have tested more thoroughly), but I for one appreciate the on-demand update mechanism. Would you rather play a game with previously unknown bugs, or have them smashed on launch day and get a patch to make your experience more stable?


I'd rather have a game that, if I want to play it 20 years from now when there are no update servers - which I still do with my old consoles - I can pop in and not worry about bugs I have to figure out how to get patched. Games will inevitably have bugs, but developers have become too reliant on the update mechanisms, and games have shipped completely busted.


That's an interesting point. I hadn't considered the impact of on-demand updates on future abandonware - looks like another case where pirated versions might be better preserved than official ones.


Some fans even made patches for bugs in Master of Orion, in the binary.


Command & Conquer: Red Alert 2 and its expansion pack also received community patches (I contributed to them a lot) to their binaries. When EA later released The First Decade bundle with all the old games in it, they removed the old copy protection. Most games in the bundle were simply recompiled to remove it, but for the expansion pack they hex-edited the copy protection out so it stayed compatible with the community patches.


Games were stable enough before, and the stability of an average game in its first few months definitely went down with the introduction of on-demand online updates. Not only that, it's not at all uncommon to see half-gigabyte patches.


Is that so much when the games themselves have ballooned into the double digits of gigabytes? A .5GB patch isn't a whole lot when Dragon Age: Origins takes up 20GB in the first place (and that's a pretty old game).


A lot of that is sounds, textures, models, animation info.


> Modern game development seems to say "eh, a little hitching won't hurt anyone"

Or, just like in the PS1 days, game developers still have to make trade-offs to meet drop-dead dates set by publishers.

Online-enabled patches can allow developers to be a little more cavalier about quality as they push to build more features closer to the ship date.


"Online enabled patches can allow developers to be a little more cavalier ..."

That just means that we get to deal with buggy crap while they tell themselves it's OK because they can ship another update.

Sigh.


I'm going to give game developers the benefit of the doubt by believing they're not OK with shipping bugs.


... yet the bug count keeps rising.


Most Nintendo games still seem to be locked at 60Hz (sometimes 30 maybe?). Look at the fuss when people realised that Mario Kart 8 dropped to 59Hz sometimes ;-)


Nintendo generally picks a framerate and sticks to it. The N64 Zelda games were actually locked at 20Hz.

(It is possible to do 60Hz on the N64, but it's really hard.)


With internet-based games, I wonder if that loading screen/hitch is essentially finding you a server as you move from one zone to another, in case the server you were on was too crowded.

I also wonder if they are doing the equivalent in modern consoles - pushing the limits with graphics, etc. where you simply can't avoid the hitch.

It would be good to hear the perspective of someone who works on these types of games.


It shouldn't be; networking should always happen in the background, and for console games there's no reason why clients should move to another zone - load balancers direct new clients to available servers, they don't (or shouldn't) move existing clients to other servers to make room for new ones.

Chances are they're not doing anywhere near the same thing on modern consoles; maybe in off-the-shelf engine code, to get maximum FPS, but in the games themselves, not likely. Also, modern video games are millions of lines of code - you don't want to multiply that tenfold by squeezing every bit of performance out of everything. Maybe only in the most frequently executed codepaths.


Moving from one server to another can be instantaneous. Just use the cellphone model where they do proper handoffs between base stations. As to loading assets, plenty of games used a streaming model where you can explore huge worlds without issue.

PS: Skyrim is an interesting case where there is a player-made patch to make the cities open-world where the original game has a loading screen. http://www.nexusmods.com/skyrim/mods/8058
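
By "cellphone model" I mean make-before-break: the client connects to the new zone server before dropping the old one, so there's no gap to hide behind a loading screen. A rough sketch - every type and name here is invented, with stubbed-out networking:

    // Make-before-break zone handoff. Connection is a stand-in for
    // whatever the engine's networking layer provides.
    struct Connection {
        bool connected = false;
        bool Connect(const char* /*host*/) { connected = true; return true; } // stub
        void ForwardStateTo(Connection& /*other*/) {}                         // stub
        void Close() { connected = false; }
    };

    bool HandoffToZone(Connection& current, Connection& next, const char* host) {
        if (!next.Connect(host))      // 1. establish the new link first;
            return false;             //    on failure, stay on the old server
        current.ForwardStateTo(next); // 2. old server hands over player state
        current.Close();              // 3. only now drop the old link
        return true;
    }

    int main() {
        Connection oldZone, newZone;
        oldZone.Connect("zone-a.example");                 // hypothetical hosts
        HandoffToZone(oldZone, newZone, "zone-b.example");
    }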


> As to loading assets, plenty of games used a streaming model where you can explore huge worlds without issue.

The streaming model isn't simple, though. You need to decide which assets to load, when, and when to discard them. You also have to consider how your level is designed: e.g. if you've got three tiles and a player travels from tile A to B, then from B to C, you can remove A from memory on the second move - but what if zones B + C together are too big to fit, whereas A + B were fine?
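
A toy version of that bookkeeping for a 1-D strip of tiles: keep the player's tile and its immediate neighbors resident, evict the rest. Real engines budget by bytes and load asynchronously; all names here are made up:

    #include <array>
    #include <cstdio>

    constexpr int kNumTiles = 16;

    struct World {
        std::array<bool, kNumTiles> resident{};  // is tile i in memory?

        void OnPlayerTile(int t) {
            for (int i = 0; i < kNumTiles; ++i) {
                bool wanted = (i >= t - 1 && i <= t + 1);  // t and neighbors
                if (wanted && !resident[i]) {
                    resident[i] = true;   // kick off an async load in a real engine
                    std::printf("load tile %d\n", i);
                } else if (!wanted && resident[i]) {
                    resident[i] = false;  // evict: player is far enough away
                    std::printf("evict tile %d\n", i);
                }
            }
        }
    };

    int main() {
        World w;
        w.OnPlayerTile(4);  // loads tiles 3, 4, 5
        w.OnPlayerTile(5);  // loads 6, evicts 3
    }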

A really interesting presentation on streaming was given at GDC this year by Insomniac Games: http://s3.crashworks.org/gdc15/ElanRuskin_SunsetOverdrive_St...


I agree that it's harder and limits the graphics somewhat. But a lot of things are hard, and we don't give companies a free pass when they mess up pathing.

Also, when you get close, there are many ways to hide the loading going on so it seems cleaner. Like a player going from planet A to planet B through a more limited spaceship: on arrival they see a larger but still limited spaceport, giving the game time to load the new planet. Or even just boosting the glare when someone steps outside.

However, IMO these things can easily be overdone.



