If Disney has its way, augmented reality (AR) systems will not require controllers in the future. Instead, AR users will be able to interact with physical objects using only their hands. Disney plans to pull off this feat by using the user’s skin as a communication network.
In a United States Patent Application Publication for “Systems and Methods to Provide an Interactive Environment in Response to Touch-based Inputs,” Disney details how human skin could serve as a transmission path.
This disclosure presents systems and methods to provide an interactive environment in response to touch-based inputs. A first body channel communication device coupled to a user may transmit and/or receive signals configured to be propagated along skin of the user such that the skin of the user comprises a signal transmission path.
United States Patent Application Publication US 2021/0011542 A1
The patent application abstract goes on to explain that users will see “virtual content specific to the interaction entity… presented to augment an appearance of the interaction entity.” In other words, when the user views an object through the headset, the object can come to life through augmented content created by the system. Imagine picking up a Mickey Mouse toy, looking at Mickey through a headset, and having him talk to you. That’s what this technology can do.
Body channel communication (BCC) technology is embedded in physical objects like toys, allowing two-way communication through touch interaction. Information flows through the user's skin, turning the human body into an extension of a computing network. The patent defines touch interaction as physical contact via the hands, feet, or other parts of the body. The application notes that it is possible to receive BCC signals without touching, from up to five centimeters away.
The invention requires numerous components, each connected to a network or networks.
This illustration (Fig. 1) shows the entire system (100). The patent devotes nearly four pages to describing the components and how they interact. We will do our best to provide a Cliffs Notes version.
A presentation device (102) includes a display (122) and a processor (104) running code (106). Numerous other presentation devices (103) may also be utilized. The presentation device also includes one or more sensors (124) and storage (120).
Like the presentation device (102), the first BCC device (134) includes a processor (136), code (138), and storage (142). Unlike the presentation device (102), the BCC lacks a display and sensors. Note that second (146) and additional (132) BCC devices may also be added.
The entire system has access to servers (101) through a network or networks (130). External resources (131) include sources of information, hosts, and other providers of information outside the system (100). In essence, the system can pull information from outside sources such as the Internet.
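To make the component hierarchy from Fig. 1 easier to follow, here is a minimal sketch of it as Python data structures. The class and field names are our own, not Disney's; the patent's reference numerals are noted in comments:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BCCDevice:                  # first BCC device (134); more may be added (146, 132)
    processor: str                # 136
    storage_mb: int               # 142
    # Unlike the presentation device, no display or sensors.

@dataclass
class PresentationDevice:         # 102; additional devices (103)
    processor: str                # 104
    display: str                  # 122
    sensors: List[str]            # 124
    storage_mb: int               # 120

@dataclass
class System:                     # 100
    presentation_devices: List[PresentationDevice]
    bcc_devices: List[BCCDevice]
    servers: List[str] = field(default_factory=list)             # 101
    external_resources: List[str] = field(default_factory=list)  # 131

# Illustrative instance: one AR headset and one worn BCC device.
headset = PresentationDevice("ARM SoC", "AR display", ["camera", "IMU"], 256)
wrist_bcc = BCCDevice("microcontroller", 16)
system = System([headset], [wrist_bcc],
                servers=["content-server"], external_resources=["internet"])
```

The key structural point the patent makes is captured here: the BCC device is deliberately simpler than the presentation device, carrying no display or sensors of its own.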
This linear flowchart shows how the system in Fig. 1 processes and presents information and images.
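The flow can be sketched as a simple linear pipeline. Everything below is our illustrative reading of the flowchart, not code from the patent; entity IDs and content strings are made up:

```python
# Hypothetical content library the headset might consult
# (locally, or via servers/external resources).
CONTENT_LIBRARY = {
    "mickey_figure": "animated Mickey greeting",
}

def detect_touch(entity_id):
    """Step 1: the user touches the interaction entity; its embedded
    BCC transmitter sends the entity's ID through the user's skin."""
    return {"entity": entity_id}

def relay_to_headset(signal):
    """Step 2: the worn BCC device forwards the received signal
    (wired or wirelessly) to the presentation device."""
    return signal["entity"]

def fetch_virtual_content(entity_id):
    """Step 3: the presentation device looks up virtual content
    for the identified entity."""
    return CONTENT_LIBRARY.get(entity_id, "no content")

def present(content):
    """Step 4: augment the entity's appearance on the display."""
    return f"displaying: {content}"

result = present(fetch_virtual_content(relay_to_headset(detect_touch("mickey_figure"))))
print(result)  # displaying: animated Mickey greeting
```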
Fig. 3 shows an implementation of the system detailed in Fig. 1. Here, the user (301) wears a BCC (302) and a presentation device (304). In this example, the presentation device is a headset, likely one capable of augmented reality. The patent also notes that the BCC (302) may be attached to other parts of the user’s body. An interaction entity (306), in the form of a doll or action figure, stands on a table.
The user (301) is now holding the interaction entity (306). The BCC (302) can now receive signals from the doll or action figure. The signal (402) is transmitted through the user's skin. The BCC (302) is able to send signals to the headset (304) as well. This communication may be wireless (as illustrated) or wired.
Fig. 5 shows how the interaction entity (306) can send virtual content (502) to be presented on the display (304). Multiple instances of virtual content can be displayed at once. This allows the appearance of the interaction entity (306) to be augmented. In this example, the action figure or doll is “talking” to the user.
The example in Fig. 6 adds a second user (601). Both users wear BCCs and AR headsets, and both can interact with an interaction entity (606), represented here by a map.
The users (301 and 601) have performed a concurrent physical touch with the interaction entity (606). The path (702) drawn on the virtual map (606) is visible to the users through their respective headsets. Interestingly, the patent states that some virtual content may only appear when the users simultaneously interact. In other words, Disney is looking to bring collaborative play into the augmented reality space.
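That gating behavior, where some virtual content appears only during a simultaneous touch, can be sketched in a few lines. This is our illustrative model of the idea, not the patent's logic; the content names are made up:

```python
def visible_content(user_a_touching: bool, user_b_touching: bool):
    """Return the virtual content each headset would render,
    given which users are currently touching the entity (606)."""
    content = ["base map"]  # always augments the entity
    if user_a_touching and user_b_touching:
        # Unlocked only by concurrent physical touch,
        # like the drawn path (702) in Fig. 7.
        content.append("shared path overlay")
    return content

print(visible_content(True, False))  # ['base map']
print(visible_content(True, True))   # ['base map', 'shared path overlay']
```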
Disney recently submitted a patent application for technology that turns toys into virtual puppets, using only a user’s smartphone. At first glance, this application may sound similar. The critical difference is that the patent detailed here encourages hands-on and collaborative play.
If you are interested in other recent Disney patent applications, be sure to check out our posts on “smart” merchandise displays and a system that could bring the Expedition Everest yeti back to life.