From the article:
One likely culprit is the number of hours kids now spend in front of the TV and playing videogames rather than engaging in creative activities.
By extension, I blame game developers.
But seriously. When I was growing up, we played D&D. On paper! When the uptight and moral types would descend on us with warnings of eternal damnation, we’d trot out the list of good things(tm) that come from this older style of gaming. For one, it developed creativity. Player and DM together imagined how the story would evolve.
Games today don’t engage the imagination. Gameplay is most often rigid and repetitive (and every detail is spelled out with blessed HD graphics). Games expect their players to adapt and submit to the mechanics given, never the other way round. We had to imagine everything: what would be the impact of a magic missile on a mimic, formerly disguised as a disused vambrace, now firmly ensconced on a party member’s forearm (why not!). And if we didn’t like a rule in a PnP game, we changed it.
Game designers don’t want their players to be creative. They do their utmost to suppress unexpected and emergent gameplay, for fear it might overshadow the blessed experience they have bestowed on us.
As evidence I point to the Nerf. ZOMG! Players have found a way to master the ZergMeister class 3.2 days earlier than intended, and the game MUST be experienced as we, the game designers, intend. You know what? Who cares if this doesn’t match canon lore. The real fun in games comes from testing the limits.
And any time a game comes out that encourages player-made content, the glorious ones jump at the chance to point out that 99.9% of anything player-made is shit. That isn’t really the point, though. Sure, my dungeon is shit. But it’s mine, and I probably had fun (and used my brain) while designing it.