A studio's logo used to be the first thing to herald the loading of a game. The screen would go black momentarily, you'd spend the customary few seconds wondering whether the game was actually loading or your computer had frozen, and then – there they would be: Origin Systems. Interplay. Bethesda Softworks. Black Isle. Ion Storm. Troika Games. Proud, shiny logos made in-house by people who had spent years putting your game together. Developers stamping their approval on software they had personally made...
Nowadays, more often than not, the first thing you see looks like a farmer's market of tool kits, plugins, engines and extensions – as in the above screen cap from Witcher 2. Graphics, visual middleware, sound propagation, UI and menus, content optimization — even foliage: if there's anything a developer doesn't want to (or can't) do themselves, no worries — there's a tool kit for that.
Which is not surprising: every development cycle has a deadline, few are expansive enough to allot time for making things from scratch, and – at the end of the day – if technology already exists that does exactly what your game needs, why not use it? The ever-expanding market of game development aids is a huge boon to any studio too small or too cash-strapped to absorb the time and cost of finding its own solution.
But convenience is a double-edged sword and over-dependence on shortcuts leads to stunted programming ability, limited gameplay scope, frequent design similarities, poorly constructed software and – you guessed it! – item three on my list of things holding back game development. Namely: plugins.
The steady expansion of middleware and the resultant overspecialization are not unique to game development. Take a look at any business sector that used to demand some degree of technological competence and you will see the same recurring theme. Whether it's Wix for web development, Canva for graphic design, Getty Images for photography or ChatGPT for, well, writing of any sort, tech companies offering ready-made solutions are steadily barging onto a pasture that used to be the sole domain of talent.
Which makes sense: in a business model that values profit above all else, the cheaper solution providing a comparable outcome will win out. And paying a flat fee for something you know works is cheaper than paying a regular salary, insurance and all the rest of it for an employee who only might deliver.
But, unlike those other fields, game development is unique insofar as every project has completely different criteria and there is no overarching, one-size-fits-all solution. Every game is different – pardon – should be different... But, increasingly, that is not the case.
I've touched before on the similarity of gameplay mechanics between titles, and on how, when one company puts in the time to develop a significant and versatile engine, others merely adopt it and tailor their games to its limitations instead of developing their own. I used to think that was merely expediency at play, but nowadays I wonder whether it may not represent an overall decline in technical ability. Think about it.
When a Tanuki I'm fond of finished his programming degree in 1997, his university wasn't quite sure what would come in handy and so decided to throw the book at him. He coded in Assembler, Pascal, C and C++, and was taught physics, electronics, machine construction, sociology (!), network and driver design. Not all of it was useful (I'm guessing the sociology didn't add a ton of value), but it instilled a habit of learning that later saw him come to grips with Java, Scala, neural networks and other advanced technologies that a quaint, barnyard pig like myself has no clue about.
And that's something that a programmer weaned on bento solutions simply isn't capable of doing. I'm not even sure if someone who exclusively uses tool kits or plugins to develop games can still be called a "programmer" (maybe if you keep the quotes in the title). Simply put, if your job mostly consists of adapting plug-and-play pieces to solve a problem, you are only viable so long as the problem can be solved in that way. The moment you stumble onto an issue not covered by some existing middleware, you're stumped.
Which is not to say proper programmers no longer exist. But I think a lot of them, justifiably, followed the money out of game development, leaving it all the worse for it (I used to think Roguelikes were a genuine fad; now I wonder whether permadeath was just a convenient way to cover for not knowing how to code a working save system).
The same could be said for game design: was the ongoing, decades-long spate of copycat titles a Smart Business Decision (devs gauging what sold and following in its tracks), a Clever Management Call (devs catching on to a malleable platform they could tailor to their needs without any extra programming), a Necessary Compromise (devs caving to a lack of programming ability), or all of the above? Whatever the case, the result has been a parade of games all running on the engine of that One Company that spent the time to Make Their Own Thing (suckers!) – games that all work the same, play the same and feel the same.
Which is not all bad news, of course. I mean, am I sad Creative Forge used Firaxis' XCOM engine to breathe life into Phantom Doctrine or Hard West? Of course not. In the hands of capable devs, all these programming aids function as they should: as shortcuts to results the devs could eventually achieve themselves. There are lots of solid titles that wouldn't exist (or at least not in as good a form) were it not for these ready-made solutions... But, equally, there are even more bad games that shouldn't exist, made possible solely by those very same shortcuts.
Speaking of bad games — know what's really hard to do when you don't know how your own engine works? That's right: optimizing the sucker. Games performing poorly on rigs that could comfortably solve the Meaning of Life (still 42; laughably easy at this point) are the third and final side effect of the steady decline in programming ability and the over-dependence on third-party solutions.
The worst offender that readily comes to mind is Unity — a one-and-done 3D development kit responsible for resource hogs like Pillars of Eternity, Wasteland, Pathfinder: Kingmaker, The Long Dark, Tyranny or, of course, BattleTech. In fact, "poor performance" and "Unity" seem to go hand in hand like wine and cheese or bread and butter. Unity games have laughably high overhead in no way justified by the visuals they deliver. And were that not bad enough, developers who use Unity seem to belong to some exclusive, unfortunate club that's unable to improve game performance in any meaningful way... It's a frustrating one-two punch made all the worse by the fact that – performance aside – the games themselves are usually pretty decent.
The existence of tool kits and plugins is not bad in and of itself. Games may keep getting larger and more complex, but project deadlines aren't stretching to keep pace, and it would be unreasonable to expect any studio to make every component of a game from scratch... But making only a scant few and depending on kits for the rest will not – in the long run – lead the industry anyplace good.
Pig Recommends:
- Learning to do things The Hard Way. No matter what you're learning, try to learn it from the ground up, without shortcuts or intermediaries. Then, once you know how to do something properly, by all means — indulge in as many aids as you like, content in the knowledge that, should they fail, disappoint or become unavailable, you'll still be able to get by.