
Ghost in the Machine: Pt 2

Alex Kipman on hybrid games, what happened to Milo and what we should expect from the Kinect-exclusive Star Wars game

GamesIndustry.biz: I wanted to ask you about something else we saw at E3, which was the scanning of objects into Kinect: people picking up things in front of the camera and saying, now we're going to put this into the game as an object. We haven't seen that since. Is that still something that's possible?
Alex Kipman

It's absolutely possible; you'll see it in the launch titles. I'll give you some examples and then we'll talk about why you're not digitising skateboards.

Kinect Joy Ride. You can create user-generated content by showing the camera real-world objects and having the system digitise those objects and translate them into paint colours for your car. I just came from an interview in New York City, and I was wearing a t-shirt with a British flag on the sleeve.

Now, I had my tricked-out convertible in Kinect Joy Ride - it was a silver car, but I flashed the sensor my little British flag and the next thing I knew I had this Austin Powers car with red, white and blue stripes across it. That feature ships today in a launch title.

Another example, which goes back to what is probably the most technically challenging, deep title we have in the launch line-up, Kinectimals. With Kinectimals you have these adorable little plush animals that exist in the real world. In the beginning you can go over to this magic stone and flash one of these plushies and that real world plushie shows up in the game as an adorable little tiger cub.

So there's two examples at launch. But why can't you digitise a skateboard? It's easy for me to ideate, on the whiteboard, what I think the world would like - and that's the world of the video. But in this world of creating brand-new, never-been-seen, transformational experiences, you have to admit that you're stupid. And I'm talking about myself here.

I told my team, during the process: let's not create religion in our stupid phase. This is a point where none of us knows how to design games for this world; we need to understand what humans want. We need to observe humans in the playtest lab; we need to see and understand what they like to play. Turns out that, in my stupid phase, I thought scanning skateboards was the single most fun thing known to mankind. Thankfully we didn't create religion during my stupid phase, and we started spending time in the playtests, seeing how people enjoyed digitising real-world objects and bringing them into the game.

Turns out, that's cool science fiction, but not fun. From hardcore gamers to mums and dads and casual gamers, they were like, 'So why did I do this again? I spent a good amount of time up front digitising this thing, and it shows up under my guy's feet. Why couldn't I just flash you something, like the colour of my t-shirt, and customise the object that way?' And guess what? We listened, and that shows up in games like Kinectimals and Joy Ride.

Now, let's say for some reason that I did want to create a game where I did digitise a skateboard. Could I do it? Sure, absolutely. The platform supports it in terms of giving you the toolset to create these features within your game.

We spent a lot of time learning together as an ecosystem. We had several Kinect summits where we worked with game designers, both first-party and third-party developers, and we shared practices. This was a collective journey where we were all learning together and there was a ton of sharing in terms of what works and doesn't work between first and third-party. Probably unlike anything I've ever seen before.

Turns out not having stuff in your living room and putting you into the action, and using lighter ways of digitising things, is where the fun lies, and that's what you'll see show up in both first and third-party games.

GamesIndustry.biz: What's happening with the lightsabre game we saw teased at E3? How does Kinect tell which way you're holding an imaginary object like that?
Alex Kipman

Star Wars, unlike Milo, was not a tech demonstration - it was a product. We're really working on it. We haven't disclosed a release date, but we didn't flash it by mistake at E3; you will see a Star Wars game coming soon to Xbox 360, and yes, it will be exclusive to Kinect.

In terms of the precision, I will tell you that we have been spending a tremendous amount of time to ensure that this amazing franchise translates in a super-deep and meaningful way. If you're going to create a lightsabre game, that lightsabre game had better feel like what you always imagined when you dreamed of becoming a Jedi as a kid.

Our level of understanding of the human body transcends precision. It transcends what you can and can't do, and becomes a world where we can create a system that is so science fiction turned science fact that you believe it's magical. This goes beyond how you hold a lightsabre, or how you detect the angle of the lightsabre. That sort of thing we already do in Kinect Sports with the ping pong game. You can top-spin, bottom-spin, move it around - we already have the precision in the system to do that.
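His point about stroke angle and spin can be pictured with a toy calculation. Everything below - the sampling format and the velocity-based spin rule - is a hypothetical illustration, not Microsoft's actual tracking pipeline:

```python
# Toy sketch (not Microsoft's actual technique): given tracked hand
# positions over time, derive the stroke properties he describes -
# swing speed, and whether the stroke brushes upward (top-spin)
# or downward (bottom-spin).

def swing_properties(samples):
    """samples: list of (t, x, y) hand positions (seconds, metres).
    Returns (speed_m_per_s, spin) from first to last sample."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt  # average velocity over the stroke
    speed = (vx ** 2 + vy ** 2) ** 0.5
    spin = "top-spin" if vy > 0 else "bottom-spin" if vy < 0 else "flat"
    return speed, spin

# Hand rises 0.3 m while moving 0.4 m forward over 0.1 s:
print(swing_properties([(0.0, 0.0, 0.0), (0.1, 0.4, 0.3)]))
```

A real system would filter noisy joint positions and work from full 3D skeleton data, but the principle is the same: positions over time become velocities, and velocities become gameplay intent.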

With a lightsabre you want to be much more intentional than that. You want to feel like the guy's stepping correctly, that the weight of his steps is light - the weight of Jedi steps. You want to feel connected with that. You want to make sure that the lightsabre feels natural and feels connected: when you touch your hands together to get that lightsabre from your pocket, you want to feel that connection, that "wwwhhhhhmmmm", as the blade grows.

I can tell you that the answer for how we do it is: very carefully.

GamesIndustry.biz: We've spoken to Blitz Games and Ubisoft recently, and they've told us that they're considering using raw Kinect data instead of processing it through the supplied libraries. What potential advantages could that have?
Alex Kipman

The answer is that you'd have to ask them. I'm the guy who designed the platform, and there's a very good reason I gave them access to the raw data. The reason I allow it is the same reason I allow them access to the CPU and GPU at a very core level.

You, as a game designer, choose where you want to go. You can go all the way from middleware, where you have very little control, all the way to flipping zeroes and ones on my GPU and life goes on. Kinect is no different. I provide what I think is a set of sophisticated algorithms that allow you to turn science fiction into science fact. But, I allow everybody to get access to the raw data.
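To make "access to the raw data" concrete, here is a minimal sketch of the kind of low-level processing a studio might do itself rather than rely on the supplied libraries. The frame dimensions, depth format and capture function are assumptions for illustration, not the real Kinect SDK API:

```python
# Hypothetical sketch of working directly with a raw Kinect-style
# depth frame instead of the platform's skeletal-tracking middleware.
# The frame format (320x240, per-pixel depth in millimetres) and the
# capture function are illustrative assumptions only.

WIDTH, HEIGHT = 320, 240

def capture_depth_frame():
    """Stand-in for a raw sensor read; returns a flat list of
    per-pixel depth values in millimetres (0 = no reading)."""
    return [0] * (WIDTH * HEIGHT)

def nearest_point(frame):
    """Scan the raw frame for the closest valid pixel - e.g. to find
    an outstretched hand without any skeleton model at all."""
    best = None
    for i, depth in enumerate(frame):
        if depth == 0:  # no depth reading at this pixel
            continue
        if best is None or depth < best[0]:
            best = (depth, i % WIDTH, i // WIDTH)
    return best  # (depth_mm, x, y), or None if the frame is empty

frame = capture_depth_frame()
frame[WIDTH * 120 + 160] = 850  # pretend a hand is 85 cm away, mid-frame
print(nearest_point(frame))
```

The trade-off he describes is visible even here: going raw buys control and potentially cheaper processing, at the cost of reimplementing everything the middleware would have given you.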

Again, I'm not trying to create religion in my stupid phase. If Ubisoft or anyone else can come along and create content that's more meaningful, cheaper in terms of processing or more optimised, then God bless them all - they're creating exclusive Kinect content for my platform and I love them to pieces.

GamesIndustry.biz: I wanted to ask you one last thing about the acquisition of Canesta. Will they be manufacturing more sensors for you, or are they going to be working on new tech?
Alex Kipman

Well, neither, really. From a manufacturing point of view, the ways you can perceive the real world boil down to essentially three: one is called structured light, one is called time of flight, and one is called lines radar. Kinect ships based on structured light. Canesta's technology is based on time of flight. These are separate technologies, so it's not to help with manufacturing - it's apples to bananas in terms of how the technologies work.

In terms of the future, I will tell you that our focus is an amazing launch and revolutionising gaming and entertainment in the living room. What Microsoft does, and what Microsoft has done in this case, is make an opportunistic buy of a great company that can help us accelerate, in meaningful ways, our journey in natural input technology.

I speak generally for Microsoft, not necessarily for Kinect on Xbox.

Author

Dan Pearson

Contributor