Post by Frisbone on Jul 31, 2013 12:28:45 GMT -5
I seem to have neglected writing up what the ITC is attempting to achieve, why I think it might work, and why I should be trying it.
Using a fake gun to control a shooting game is not new. It has been most common in arcade games, but older consoles had similar controllers for specialty shooting games. Most of the ones I remember were based on tracking the gun in some way to know its true position relative to the screen.
I was in "wish" mode one day while playing COD with my son. I was wishing there was some way to get some real exercise while playing and inject a bit more reality into it, rather than watching players "bunny hop" all over the place and cheat in various ways through controller mods (rapid fire) or instantly flip around with high controller sensitivity settings.
So I started dreaming up a system comprising a free-moving platform, a VR headset, and a weapon in hand as a controller.
The VR headset had been done before, and in fact I researched who was doing what at the time. There was some hubbub about the Oculus Rift, but it was all just talk - and there were existing solutions that had better specs anyway. The moving platform idea was complex, but I had some thoughts on how it could work. Today there is a Kickstarter project working on a treadmill-based idea that looks promising.
When it came to the controller, I thought I had some solid ideas, and since I had hobby parts lying around, I figured why not take a stab at it. Put all three things together and you would have a very cool first-person simulation. I wouldn't even feel bad playing it online against others, because you would be at a serious disadvantage when all of your movements were based on the physical limitations of the player - if you were willing to take the stat hit, why feel bad about that?
On the controller I thought - boy, it would be nice to make something that could work with pretty much any game without modifying the game itself. I further noted that all the buttons could likely be replaced and the joystick could be emulated. But emulated how? Well, I knew that accelerometers were cheap, in everything now, and seemed to have decent responsiveness - so why not track the physical movement of a gun with an accelerometer (AM) in it and figure out a way to translate the 3D movement into 2D joystick input.
I further developed an algorithm for converting between the coordinate systems. With the body as an anchor at the center of a virtual sphere, the line from the origin through the AM intersects that sphere at an arbitrary radius. When you move the AM in 3D space, you can draw a new line. Accept that a sphere around you is actually a fair "viewing surface" from the perspective of your eyes - then think about how you would represent moving to that new point. I determined that right ascension/declination (celestial positional notation) would be a fair representation - and it's a 2D coordinate system. So ascension becomes left/right (X) and declination becomes up/down (Y), expressed in angular units (radians or degrees).
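The conversion described above can be sketched in a few lines. This is my own illustration, not the actual ITC code - the function names, the axis convention (x = right, y = up, z = forward, body at the origin), and the full-deflection angle are all assumptions:

```python
import math

def to_angles(x, y, z):
    """Convert a 3D point (relative to the body origin) into the two
    spherical angles: azimuth ("ascension", left/right) and elevation
    ("declination", up/down), both in radians."""
    azimuth = math.atan2(x, z)              # angle left/right of straight ahead
    horizontal = math.sqrt(x * x + z * z)   # distance in the horizontal plane
    elevation = math.atan2(y, horizontal)   # angle above/below the horizon
    return azimuth, elevation

def angles_to_stick(azimuth, elevation, full_deflection=math.pi / 2):
    """Map the two angles onto a joystick deflection in [-1, 1] per axis.
    Here 90 degrees off-center is assumed to equal full stick deflection."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(azimuth / full_deflection), clamp(elevation / full_deflection)

# Pointing straight ahead gives a centered stick:
print(angles_to_stick(*to_angles(0.0, 0.0, 1.0)))  # (0.0, 0.0)
# Pointing 45 degrees to the right gives half deflection on X:
print(angles_to_stick(*to_angles(1.0, 0.0, 1.0)))  # (0.5, 0.0)
```

Note that this maps a *position* to an *angle pair* - in practice the AM only reports acceleration, so getting a stable position estimate out of it (double integration, drift, etc.) is its own problem, which is exactly where the calibration concerns below come in.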
This all sounds fascinating but of course there are limitations and concerns.
It has no understanding of game context. The tracking would have to be extremely accurate (and calibrated) so that errors would take a long time to become noticeable. Horizontal errors aren't a big deal with a VR headset - but vertical errors could result in the user physically looking up while the in-game character is looking straight ahead - and then you'd need a way to correct for that.
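One simple way to correct for that kind of drift - again a sketch of my own, not a description of the actual controller - is a "recenter" button: the user aims straight ahead, presses it, and the current angles become the new zero reference:

```python
class Recenter:
    """Holds a reference orientation; readings are reported relative
    to it. Pressing recenter while physically aiming straight ahead
    cancels whatever vertical (or horizontal) error has accumulated."""

    def __init__(self):
        self.az_ref = 0.0
        self.el_ref = 0.0

    def recenter(self, azimuth, elevation):
        """Capture the current angles as the new straight-ahead."""
        self.az_ref, self.el_ref = azimuth, elevation

    def corrected(self, azimuth, elevation):
        """Report angles relative to the captured reference."""
        return azimuth - self.az_ref, elevation - self.el_ref

# Drift has crept in: "straight ahead" now reads as 0.2 rad of elevation.
cal = Recenter()
cal.recenter(0.0, 0.2)
print(cal.corrected(0.0, 0.2))  # (0.0, 0.0) - drift cancelled
```

This doesn't fix the underlying tracking error, it just rezeros it, which is why calibration quality still decides how often the user has to reach for that button.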
So I move forward with the understanding that calibration will be key for this to be a usable controller. I also move forward knowing there is a significant possibility that this could be a colossal failure. But I think no matter what the result, I will have learned a lot and had fun in the process.