How should you catch Pokémon? We tried playing Pokémon GO in VR and MR

Leiphone editor's note: This article was contributed by Vidoo and published by Leiphone.

(Image: Vidoo and E-Impulse trying the VR and MR versions of Pokémon GO together)

The VR and MR version of Pokémon GO that you have been dreaming about day and night? We have built it. This is the right way for Pokémon GO to break through the dimensional wall.

Pokémon GO connects AR, mobile gaming, the streets, and even social interaction. Those keywords alone are attractive enough, and with the century-spanning Pokémon IP on top, its explosive popularity is no surprise. But on a phone, simulating a "throw" of a Poké Ball with a long press and slide on a touchscreen to earn your Pokémon... doesn't that feel a little constrained, even a little silly?

Although the Chinese servers have not yet opened, Chinese players already have plenty of ideas and imagination about Pokémon GO. The game has indeed brought us many surprises, but what players most hope to see is not a showcase of cutting-edge technology for its own sake; it is the most appropriate combination of technical approaches. Judging from the game's design and controls, VR/MR plus bare-hand control could deliver a far more refreshing experience.

| Brainstorming is easy, but what about implementation?

We tried Pokémon GO in VR and MR, lifting the Pokémon off the phone screen. You can grab the Poké Ball of your choice, aim at the Pokémon in front of you, adjust, and throw. If you run into a tough, stubborn Pokémon, you can first toss it some food to avoid being attacked, then wind up and hurl your strongest ball. Played this way, hardly any Pokémon will slip out of your hands.

| About the development tutorial

Veteran readers no doubt understand VR thoroughly already. Besides trying to unlock new ways of catching Pokémon, we also want to share the MR + hand-control approach with everyone. The development tutorial follows; in addition, the SDK can be downloaded for free from the official Vidoo website, and we hope developers will produce more great work after digesting it.

Hand control:

First, we obtain depth images with our binocular depth sensor.

In the depth image, we encode each pixel's distance from the sensor in its RGB value. Viewed this way, the result looks much like a familiar infrared thermal image, except that temperature has been replaced by distance.

In the figure below, the depth image is on the left and the original image on the right. As you can see, the closer a point is, the redder its color; the farther away, the bluer.
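The near-red/far-blue encoding described above can be sketched as a simple false-color mapping. This is only an illustration of the idea; the sensor's actual encoding and working range are not published, so the range values below are assumptions.

```python
import numpy as np

def depth_to_rgb(depth, d_min=0.3, d_max=2.0):
    """Encode a depth map (in meters) as an RGB image: near -> red,
    far -> blue, like a thermal-style false-color image.
    d_min/d_max are an assumed working range for illustration."""
    # Normalize depth into [0, 1], clipping to the working range.
    t = np.clip((depth - d_min) / (d_max - d_min), 0.0, 1.0)
    rgb = np.zeros(depth.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = ((1.0 - t) * 255).astype(np.uint8)  # red channel: near
    rgb[..., 2] = (t * 255).astype(np.uint8)          # blue channel: far
    return rgb

depth = np.array([[0.3, 1.15, 2.0]])  # one near, one mid, one far pixel
img = depth_to_rgb(depth)
```

A near pixel comes out pure red, a far pixel pure blue, and everything in between a blend of the two, which is exactly the gradient visible in the figure.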

Second, we segment out the part of the depth image that matches the characteristics of a human hand, in order to recognize hand pose and motion.

It should be pointed out that the gesture-recognition algorithm works on depth images rather than ordinary images. This makes it much easier to obtain depth information for every point on the hand, and from that to reconstruct the skeletal structure of the entire hand. If we want to interact with virtual objects bare-handed, this skeletal structure is an indispensable technical prerequisite.
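As a minimal sketch of the segmentation step, one common first pass is to threshold the depth image around the nearest object, on the assumption that the user's hand is the closest thing to the sensor. The real pipeline is certainly more sophisticated (shape matching, tracking over time); this only shows why depth makes segmentation easier.

```python
import numpy as np

def segment_hand(depth, band=0.15):
    """Naive hand segmentation by depth thresholding (an illustrative
    first step, not Vidoo's actual algorithm). Assumes the hand is the
    closest object: keep all pixels within `band` meters of the nearest
    valid depth reading. Pixels with depth 0 have no reading."""
    valid = depth > 0
    nearest = depth[valid].min()
    mask = valid & (depth <= nearest + band)
    return mask

# A toy scene: hand at ~0.4 m, body at ~0.8 m, wall at 2 m, one dropout.
depth = np.array([[0.40, 0.42, 0.45, 0.80, 2.00, 0.0]])
mask = segment_hand(depth)
```

With an ordinary RGB image, separating a skin-colored hand from a skin-colored face behind it is hard; with depth, the 0.4 m band falls out of the data almost for free.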


Finally, we algorithmically align three coordinate systems: the sensor coordinate system (containing the hand-skeleton spatial information), the virtual-world coordinate system (containing the spatial information of all virtual objects), and the real-world coordinate system (containing environment and hand information).

In this way, the detected hand skeleton is aligned with the real hand, and virtual objects are aligned with the real world. When the real hand touches a virtual object, an event similar to a mouse click is generated to drive the program. In Pokémon GO specifically, that means feeding the Pokémon and throwing the Poké Ball.
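The alignment and the click-like event can be sketched as follows, assuming the alignment reduces to a rigid transform (rotation R plus translation t) obtained from calibration, and the touch test to a bounding-sphere check. The function names and the identity calibration are illustrative, not the SDK's API.

```python
import numpy as np

def to_world(p_sensor, R, t):
    """Map a point from sensor space into the aligned virtual-world
    space with a rigid transform: p_world = R @ p_sensor + t.
    R and t would come from calibrating the headset/sensor rig."""
    return R @ p_sensor + t

def touch_event(fingertip_world, obj_center, obj_radius=0.05):
    """Fire a 'click-like' touch event when the fingertip enters the
    virtual object's bounding sphere."""
    return np.linalg.norm(fingertip_world - obj_center) <= obj_radius

# Identity calibration for the sketch: sensor and world already aligned.
R, t = np.eye(3), np.zeros(3)
fingertip = to_world(np.array([0.10, 0.00, 0.50]), R, t)
poke_ball = np.array([0.12, 0.00, 0.50])   # a virtual Poké Ball nearby
event = touch_event(fingertip, poke_ball)
```

Once all three spaces share one frame, "the real hand touched the virtual ball" becomes an ordinary distance test that can drive game logic, just like a mouse click drives a UI.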

At this point, the basic hand-control part is complete. Next, the event logic is refined further in Unity3D to finish the Pokémon GO features of throwing balls and food at the Pokémon.

| MR display

Next, what we need to do is display Pokémon GO's game elements in space, making Pikachu and the Poké Balls appear at sensible positions.

We use video see-through technology: the binocular cameras capture and digitize the "real world" you see in real time, and computer algorithms then render the images in real time. This lets us fully superimpose virtual imagery on the scene while still seeing the computer-rendered "real image", satisfying the requirement that virtual objects and the real environment coexist.
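The superposition step at the heart of video see-through can be sketched as alpha-blending a rendered virtual layer over the live camera frame. A real system also handles lens undistortion, latency, and occlusion by real objects; this sketch shows only the compositing, and the pixel values are made up.

```python
import numpy as np

def composite(camera_frame, virtual_rgba):
    """Video see-through compositing sketch: alpha-blend the rendered
    virtual layer (RGBA) over the live camera frame (RGB).
    out = virtual * alpha + camera * (1 - alpha), per pixel."""
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    cam = camera_frame.astype(np.float32)
    virt = virtual_rgba[..., :3].astype(np.float32)
    out = virt * alpha + cam * (1.0 - alpha)
    return out.astype(np.uint8)

# 1x2 toy frame: left pixel shows only the camera, right pixel is fully
# covered by an opaque virtual red pixel (e.g. part of a Poké Ball).
cam = np.full((1, 2, 3), 100, dtype=np.uint8)
virt = np.zeros((1, 2, 4), dtype=np.uint8)
virt[0, 1] = [255, 0, 0, 255]          # opaque red where the ball is
frame = composite(cam, virt)
```

Because the "real world" the user sees is itself a rendered camera image, the virtual layer can be blended in with full control, which is what distinguishes video see-through from optical see-through AR.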

With the core interaction and display technologies solved, add the corresponding SDK in Unity3D and adjust the game-flow logic, and the MR version of Pokémon GO is born.

| The name "MR"

The "MR" in this article is not "Mixed Reality". It refers to a technology between VR and AR called Mediated Reality (MR), a concept proposed by Steve Mann, a professor at the University of Toronto. VR is a purely virtual digital picture; Mixed Reality (which includes AR) is a virtual digital picture plus naked-eye reality; Mediated Reality is a digitized reality plus a virtual digital picture.

MR is not a recent invention. In the 1970s and 1980s, Steve Mann designed the wearable smart hardware "Digital Eye Glass" as a visual aid, to augment vision and let the eyes "see" the surroundings under any conditions. This is regarded as an early exploration of MR.

Mediating human visual perception involves not only digitally superimposing content onto reality, but also other ways of modifying what we see, such as additions, deletions, and alterations. Since the 1990s, MR (Mediated Reality) research has been carried out in the HI Lab at the University of Toronto, from which a group of top talent has emerged.

Editor's note: Leiphone (search for and follow the "Leiphone" public account) published an article years ago by Ai Wei, a CTO who has long followed Steve Mann's work — "The Magic of the Virtual World: Smart Glasses You May Not Yet Understand" — discussing the differences between AR, VR, and MR.

(Pictured: the VMG-PROV MR glasses)

| "Hand control" is not "gesture recognition"

Hand control is not the same as gesture recognition; rather, hand control includes gesture recognition.

Hand control must satisfy the following three basic points:

1. The hand must carry depth information: the sensor has to know how far the hand is from the user.

2. Hand control includes gesture recognition. Gesture recognition uses image algorithms to turn hand poses and hand motions into commands. Only when the hand's depth information is superimposed on the gesture-recognition result — yielding the complete information of both how far the hand is and what the hand is doing — can it be called hand control.

3. In VR, hand control should show the user an image of their own real hand. By synchronizing the algorithm's hand image with the user's proprioceptive sense, the user sees the depth information and gets the most realistic feeling of control. If the user sees a mechanical or simulated hand instead, the friendliness of the interaction drops sharply.
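The distinction drawn by points 1 and 2 — gesture alone is not enough, depth must be fused in — can be sketched as a small decision function. The field names, gesture labels, and working range below are all illustrative, not the Vidoo SDK's API.

```python
from dataclasses import dataclass

@dataclass
class HandState:
    """Hand control = gesture recognition + depth, per the points above.
    Field names are illustrative, not a real SDK's API."""
    gesture: str        # e.g. "pinch", "throw" (from gesture recognition)
    distance_m: float   # hand-to-sensor depth (point 1)

def control_command(state, grab_range=(0.2, 0.8)):
    """Combine 'what the hand is doing' with 'how far the hand is' into
    an actionable command — the superposition described in point 2."""
    near, far = grab_range
    if not (near <= state.distance_m <= far):
        return None                 # hand outside the tracked volume
    if state.gesture == "pinch":
        return "grab_ball"
    if state.gesture == "throw":
        return "release_ball"
    return None

cmd = control_command(HandState(gesture="pinch", distance_m=0.5))
```

The same pinch at 1.5 m produces no command: without depth, gesture recognition alone cannot tell a deliberate grab from a hand waving in the background, which is exactly why the article insists the two are not equivalent.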

Leiphone note: For reproduction, please contact us for authorization, retain the source and the author, and do not alter the content.

If you are also a VR practitioner or have done in-depth research on VR-related technologies, you are welcome to contribute to us.
