The worst thing that ever happened to game development was on-demand updates. There is nothing worse than buying a game on release day only to wait for it to download a patch.
It is both good and bad. It allows delivery of critical fixes and new content, but it also decreases the demand for code quality from the start and increases reliance on DLC. The bigger issue, though, is for the people who can't get it, like those whose only options for internet are dial-up or satellite (with a 5GB per month limit). Sadly the market doesn't care about this small group enough to matter, and we get left behind.
But even with an experience worse than a day 1 patch (namely, being unable to get the day 1 patch because it is 10GB and you only have 5GB for everything for a month), I wouldn't call it the worst thing ever to happen to game development.
Console games survived for many, many years without the ability to issue critical fixes, and they seemed to do OK. I've played console games since 1991 and I've never run into a game with bugs that made it unplayable. I think the QA cycle would be more thorough if developers didn't rely on over-the-air patches.
It shouldn't be acceptable to knowingly ship a subpar product with the attitude "we can always issue an update later."
There is an art in exploiting bugs in old games and glitching your way into worlds you aren't supposed to enter yet. The time and effort that goes into finding these exploits is really amazing; it's an art form in itself. Take a look at this: https://www.youtube.com/watch?v=aq6pGJbd6Iw Skip to 12:15 for the real insanity.
Yup, yup. Running into a game-breaking bug, bringing the game back to the store, and being told that you've got a "damaged" disk they will replace (with a patched game if it's already been revved, or with an identical disk otherwise) was so much more fun. Miss those days too.
When we submit games to Sony/MS/Nintendo for publishing, we usually have to do it 2-3 months in advance. What are programmers supposed to do in that time between "end" of development and release date? Of course everyone works on little things that were left, you might as well release them as a patch!
I get the sentiment (they should have tested more thoroughly), but I for one appreciate the on-demand update mechanism. Would you rather play a game with previously unknown bugs, or have them squashed on launch day and get a patch to make your experience more stable?
I'd rather play a game that, if I want to play it 20 years from now when there are no update servers — which I still do with my old consoles — I can pop in without worrying about bugs I have to figure out how to get patched. Games will inevitably have bugs, but developers have become too reliant on update mechanisms, and games have shipped completely busted.
That's an interesting point. I hadn't considered the impact of on-demand updates on future abandonware — it looks like another case where pirated versions might have better preservation than official ones.
Command & Conquer: Red Alert 2 and its expansion pack also received community patches to their binaries (I contributed to them a lot). When EA later released The First Decade bundle with all the old games in it, they removed the old copy protection. Most games in the bundle were simply recompiled without it, but for the expansion pack they hex-edited out the copy protection to stay compatible with the community patches.
Games were stable enough before. The stability of the average game in its first few months definitely went down with the introduction of on-demand online updates. On top of that, it's not uncommon to see half-gigabyte patches.
Is that so much when the games themselves have ballooned into the double digits of gigabytes? A 0.5GB patch isn't a whole lot when Dragon Age: Origins takes up 20GB in the first place (and that's a pretty old game).