Has Motion-Control Failed?

The real question is: how many times a day do you play with your Wii? Be honest...

by Nash Herrington

There was a time when motion-control genuinely seemed like the future of gaming. Back in 2005, when the Wii was still called the “Nintendo Revolution”, we all sat in awe of its grand reveal trailer at the Tokyo Game Show, intrigued by the actors enthusiastically waggling their Wii-motes at us whilst simulating fishing, drumming and sword-swiping.

The concept had its fair share of detractors, of course, but as the demo version of the console made its way into videogame stores and people got the chance to play Wii Sports for themselves, even the greatest of cynics found themselves taken aback by the sheer joy to be had in witnessing their Grandma confusedly flailing her arm around as it transformed into an impromptu tennis racket.

But it’s been six years since we were informed that standard videogame controllers were, like, sooooo 2004, and I don’t know about you, but I’m still slumped on my couch like a gaseous blob of useless DNA, blithely twiddling my thumbs as I blow apart hordes of faceless alien arseholes.

So if it hasn’t become our primary method of gaming, has motion-control failed entirely? Although the initial success of the Wii would firmly place most people in the “No” camp, any console that can cite “Carnival Games” as one of its major releases should automatically be forced to spend a couple of hours on the naughty step. If it hasn’t thought up a mediocre Space Marine-themed FPS in that time, it should be sent straight to bed without dinner.

But while my Wii steadily became nothing but a futuristic-looking ornament, families the world over still seemed to gain some sort of enjoyment out of ritually embarrassing themselves in front of their television screens. My 8-year-old half-brother, for example, was inexplicably amazed by one of its by-the-numbers on-rails shooters. Naturally, the next time he visited me I popped an Xbox controller into his hand and turned on Left 4 Dead 2. Sure, he spent the next month or so stalking the playground attempting to bludgeon his school friends with a ruler, but now he’s experienced what true gaming feels like.

And this is the problem that many of us long-term gamers have: in attempting to please the youngsters, the old farts and the stay-at-home moms who have grown bored of Farmville, the innovative motion-control hardware isn’t actually being matched by any innovative motion-controlled software. So while Microsoft was busy rolling out its Kinect and a bunch of titles enjoyable only if your living room has all the space of a small aircraft hangar, Nintendo went back to the drawing board and came back with an announcement: “Y’know those regular controllers you were playing with before? Yeah, they weren’t so bad after all.”

So the Wii U was announced, and Microsoft began asking itself questions: was motion-control really the future? Was it possible to remove the controller from the player and still grant them the same amount of functionality? Why on earth was Double Fine making a Sesame Street game? Unfortunately, the truth that Microsoft is facing with the Kinect is the same truth that inevitably led Nintendo back to the controller: there’s only so much you can do in a videogame without two analog sticks.

The Kinect’s most ambitious title thus far is Rise of Nightmares, which lets the player control not only their in-game character’s arms, as has been the case until now, but also the direction of their character’s movement, with combat that is almost fluid. However, the fluidity of the in-game character’s movement was nothing compared to the fluidity of the real-life player who, after realising that the Kinect was a waste of his money, promptly made his way to his local GameStop and picked up a copy of Arkham City.

So who wins in all of this? Well, that’s easy: Sony, of course. While Nintendo based a whole console around the concept of motion-control and damaged its credibility because of it, and Microsoft spent much of 2011 trying to sneak knock-offs of mediocre Wii titles into our homes whilst distracting us with Gears of War 3, Sony instead understood the motion-controller for what it is: a neat peripheral. By incorporating the Move into its “It Only Does Everything” ethos, Sony ensured that the PlayStation doesn’t revolve around the motion-controller; the motion-controller simply complements the PlayStation.

Gamers don’t mind kiddies and grandparents waggling their arms and legs in front of the TV if they know there are still going to be games that are playable using only their thumbs and a litre of Mountain Dew. By sating the appetites of both the “casual” and the “hardcore”, Sony has proved that motion-control can thrive, if included as part of a balanced gaming diet.