Disney Patent Brings Advanced Version of Monsters, Inc. Laugh Floor Technology to Your Pocket

Iain

A patent application published on January 14, 2021, details how Disney could use smartphones as a conduit for virtual puppeteering. An app and the smartphone’s cameras let the user control a virtual puppet. The invention is a consumer version of the technology used in Disney attractions like Monsters, Inc. Laugh Floor and Turtle Talk with Crush.

The patent application states that when children play with dolls or action figures, they imagine the character is interacting with objects in the environment. This is how play is portrayed in the Toy Story films. The application goes on to state that while video games are more interactive than dolls or action figures, a controller limits the direct one-to-one interaction between the user and the character. The virtual puppeteering system allows a child (or adult) to become the toy.

[Patent Fig. 1A]

Fig. 1A provides an overview of how the virtual puppeteering system works. The user (120) holds a portable device (110) with a display screen (118). A doll, figurine, or action figure (122) sits some distance (124) from the device. This is where things get interesting: the user animates a virtual puppet of the toy (132) simply by moving. The user becomes the puppeteer.

The real-world environment could be an indoor or outdoor location. The patent application offers a few examples of outdoor environments, and one is particularly interesting.

Examples of an outdoor environment may include a residential yard, a playground, or a park, such as a theme park, to name a few.

United States Patent Application Publication US 2021/0008461 A1

It is logical to assume that the “theme park” mentioned is a Disney Parks location. The virtual puppeteering functionality could be a standalone app or be integrated into the Play Disney Parks app. For example, guests in Star Wars: Galaxy’s Edge could create a virtual puppet from one of the droids located in the land.

[Patent Fig. 1B]

A projection system can also be used to display the virtual puppet. Fig. 1B shows the virtual puppet (132) being projected on a physical surface (130) such as a screen or wall.

[Patent Fig. 1C]

Disney also included a virtual reality (VR) display option (Fig. 1C). This setup would completely immerse the user (120) in a virtual world.

[Patent Fig. 2]

Fig. 2 gives an overview of the software and hardware needed to make virtual puppeteering happen. The app (212) can also link with a virtual environment database (234) to place the virtual puppet in any number of environments. All of the heavy lifting is done by the device (210). Once again, display options include the device’s display (218), a projector (226), or a VR headset (238).
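For readers who like to see things spelled out, here is one way the pieces in Fig. 2 could be modeled. This is only a sketch of the components the application names; the patent publishes no code, and every class and field name below is ours, not Disney’s.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Optional


class DisplayTarget(Enum):
    """The three output options shown in Fig. 2."""
    DEVICE_SCREEN = auto()  # the device's own display (218)
    PROJECTOR = auto()      # an external projector (226)
    VR_HEADSET = auto()     # a virtual reality headset (238)


@dataclass
class VirtualEnvironment:
    """An entry pulled from the virtual environment database (234)."""
    name: str
    properties: dict = field(default_factory=dict)  # e.g. {"surface": "snow"}


@dataclass
class PuppeteeringSession:
    """State held by the app (212) running on the portable device (210)."""
    display: DisplayTarget = DisplayTarget.DEVICE_SCREEN
    environment: Optional[VirtualEnvironment] = None  # optional, chosen later in step 345
```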

[Patent Fig. 3]

The process detailed in Fig. 3 goes a long way toward explaining how the virtual puppeteering system works. The user activates the mobile device camera from within the app (341). The image is displayed on the device screen (342) and the user selects an object found in the image (343). At that point, the app calculates the distance between the device and the selected object (344). Step 345 is optional. The user can control the virtual puppet in the real world or choose a virtual environment.

The mobile device receives animation inputs when the user moves (346). The app examines the object, the animation input, and any virtual environment to determine how the virtual puppet should move (347). The animation frames are then generated (348) and displayed to the user (349).
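Purely as an illustration, the loop below strings those steps together in Python. Again, the patent application publishes no code, so every function name here is a guess at what such an app would need, keyed to the step numbers from Fig. 3.

```python
def run_virtual_puppeteering(app):
    """A hypothetical walk through the numbered steps of Fig. 3."""
    camera_feed = app.activate_camera()                   # 341: open the camera from within the app
    frame = app.show_on_display(camera_feed)              # 342: show the image on the device screen
    toy = app.wait_for_object_selection(frame)            # 343: the user taps the toy in the image
    distance = app.estimate_distance(toy)                 # 344: device-to-object distance
    environment = app.maybe_select_virtual_environment()  # 345: optional virtual environment

    while app.is_running():
        motion = app.capture_user_motion()                 # 346: the user's movement is the animation input
        # 347: combine the selected object, the motion input, and any virtual
        # environment to work out how the puppet should move
        pose = app.solve_puppet_motion(toy, motion, environment, distance)
        frames = app.generate_animation_frames(pose)       # 348: generate the animation frames
        app.display_frames(frames)                         # 349: show them on the chosen display
```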

Let’s return to the idea of the optional virtual environment. In a virtual environment, the virtual puppet is able to interact with elements in the selected virtual environment. The example provided in the patent places the virtual puppet in a winter wonderland.

[Patent Fig. 6A]

In Fig. 6A we see a virtual puppet (660) rolling “snow” to form the base of a snowman. The virtual puppeteering app understands how snow works. When a user crouches to touch the snow, he or she can roll a snowball. The snowballs can be carried by the virtual puppet and stacked to form a snowman.

It is noted that snowman building animation 632 is generated in part from the virtual environment input received in action 345 identifying the virtual environment as a field of snow. That optionally selected virtual environment is yet another input, analogous to voice and touchscreen gesture, for use in generating the animation of virtual character 660. If there is no snow, building a snowman is simply not possible.

United States Patent Application Publication US 2021/0008461 A1

[Patent Fig. 6B]

In Fig. 6B, the snowman (672) constructed by the virtual puppet (660) in Fig. 6A becomes a new virtual puppet. The user may then abandon the original virtual puppet and take control of the snowman. In this example, the user controls the virtual snowman puppet (672) and builds a second snowman (674). The app is able to properly scale the second snowman based on the size of the first.
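To make the “environment as input” idea concrete, here is a small, invented sketch: the snowman action only becomes available when the selected environment has a snow surface, and a finished snowman can be handed off as the new puppet. None of these names come from the patent; they exist only to illustrate the Fig. 6A and 6B behavior described above.

```python
def available_actions(environment):
    """Gate animations on the optional virtual environment chosen in step 345.
    Here the environment is just a dict, e.g. {"surface": "snow"}."""
    actions = ["walk", "wave", "carry"]
    if environment and environment.get("surface") == "snow":
        actions.append("build_snowman")  # no snow, no snowman
    return actions


def hand_off_control(built_object):
    """After the puppet builds something (Fig. 6B), the finished object can
    itself become the new virtual puppet, and its size becomes the reference
    for scaling whatever gets built next."""
    return {"puppet": built_object, "scale_reference": built_object["height"]}


# Usage: in a snowy environment the snowman action appears, and once built,
# the snowman becomes the puppet used to build (and scale) a second one.
snow_field = {"surface": "snow"}
print(available_actions(snow_field))   # ['walk', 'wave', 'carry', 'build_snowman']
snowman = {"kind": "snowman", "height": 1.2}
print(hand_off_control(snowman)["scale_reference"])  # 1.2
```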

Virtual puppeteering is not new to Disney Parks. However, a consumer implementation as described in the patent application would be a new use of the technology. The addition of an interactive virtual world surpasses the technology used in current attractions. The Monsters, Inc. and Crush characters are able to interact with guests. However, they cannot manipulate objects within the confines of their virtual stages.

WDWNT will continue to monitor Disney patent applications. Be sure to check out our report on Disney’s recent patent that could allow the Expedition Everest yeti to come back to life.