
At some point the 2.0 Lighthouse sensors will arrive with a new curved design and support for up to four base stations at once (right now it's just two). Meanwhile, the Vive's Trackers let you bring any object into VR, and some developers have already found creative uses for them. These, combined with TPCast's wireless adapter, give the Vive the advantage in the tech battle. However, if you wait a little while you can get your hands on the new Vive Pro Eye, which adds eye-tracking technology (more on that further below).
Independent production of VR images and video has grown with the development of omnidirectional cameras, also known as 360-degree cameras or VR cameras, which can record 360 interactive photography, albeit at low resolutions or in highly compressed formats for online streaming of 360 video.[46] In contrast, photogrammetry is increasingly used to combine several high-resolution photographs to create detailed 3D objects and environments in VR applications.[47][48]
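Most 360 video is stored in an equirectangular layout, where longitude and latitude of a viewing direction map linearly to pixel coordinates. As a rough sketch (the y-up, z-forward axis convention here is an assumption, not tied to any particular camera vendor):

```python
import math

def dir_to_equirect(x, y, z, width, height):
    """Map a unit viewing direction to pixel coordinates in an
    equirectangular 360 frame (longitude -> u, latitude -> v).
    Convention assumed: y is up, z is forward."""
    lon = math.atan2(x, z)                    # -pi..pi, 0 = straight ahead
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2..pi/2, clamped for safety
    u = (lon / (2 * math.pi) + 0.5) * width   # wrap longitude across the width
    v = (0.5 - lat / math.pi) * height        # latitude top-to-bottom
    return u, v

# Looking straight ahead lands in the center of a 4096x2048 frame.
print(dir_to_equirect(0.0, 0.0, 1.0, 4096, 2048))
```

This is why stitched 360 footage looks stretched near the poles: a single pixel row at the top of the frame covers every longitude at once.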
In 1992, Nicole Stenger created Angels, the first real-time interactive immersive movie where the interaction was facilitated with a dataglove and high-resolution goggles. That same year, Louis Rosenberg created the virtual fixtures system at the U.S. Air Force's Armstrong Labs using a full upper-body exoskeleton, enabling a physically realistic mixed reality in 3D. The system enabled the overlay of physically real 3D virtual objects registered with a user's direct view of the real world, producing the first true augmented reality experience enabling sight, sound, and touch.[16][17]

Special input devices are required for interaction with the virtual world. These include the 3D mouse, the wired glove, motion controllers, and optical tracking sensors. Controllers typically use optical tracking systems (primarily infrared cameras) for location and navigation, so that the user can move freely without wiring. Some input devices provide force feedback to the hands or other parts of the body, allowing the user to orient themselves in the three-dimensional world through haptics and to carry out realistic simulations. Additional haptic feedback can come from omnidirectional treadmills (on which walking in virtual space is driven by real walking movements) and from vibration gloves and suits.
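At the software level, the tracked orientation from these sensors ultimately becomes a direction the user looks or moves in. A minimal sketch of that conversion, assuming a y-up, z-forward convention with angles in degrees (no particular SDK's conventions are implied):

```python
import math

def forward_vector(yaw_deg, pitch_deg):
    """Convert a tracked head orientation (yaw/pitch in degrees)
    into a unit forward vector for navigating the virtual world.
    Assumed convention: y up, z forward, yaw turns right."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    x = math.sin(yaw) * math.cos(pitch)
    y = math.sin(pitch)
    z = math.cos(yaw) * math.cos(pitch)
    return (x, y, z)

# Looking straight ahead with no rotation points down +z.
print(forward_vector(0, 0))
```

Real runtimes deliver a full pose (position plus a quaternion orientation) per tracked device every frame; this sketch covers only the orientation-to-direction step.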
In 1968, Ivan Sutherland, with the help of his students including Bob Sproull, created what was widely considered to be the first head-mounted display system for use in immersive simulation applications. It was primitive both in terms of user interface and visual realism, and the HMD to be worn by the user was so heavy that it had to be suspended from the ceiling. The graphics comprising the virtual environment were simple wire-frame model rooms. The formidable appearance of the device inspired its name, The Sword of Damocles.
Alright, I’m going to walk you through my specific process, although some people may do things in a slightly different order. Let’s jump back to the import. The camera I used records to 2 separate Micro SD cards, so I have to import both. In the “Ingested Footage” folder for that day’s shoot, I put a folder for one card, “Front”, and one for the other card, “Back”. Then I use the stitching software and export final stitches for every clip to a third folder I call “Stitched”. Then I drop each of those stitched files into Adobe After Effects and mask myself, others, gear, and the tripod out of the shot. This way I have clean video with only the model in it. I export these as individual files as well.
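The folder layout described above can be set up with a small script. This is only a sketch of the organizational step: the folder names mirror the ones in the text, while the stitching and masking themselves happen in the vendor's stitching software and After Effects, not here.

```python
from pathlib import Path

def organize_shoot(root):
    """Create the per-shoot folder layout described above: an
    'Ingested Footage' folder holding the Front and Back card
    dumps, plus a 'Stitched' folder for exported stitches."""
    base = Path(root)
    folders = {name: base / "Ingested Footage" / name for name in ("Front", "Back")}
    folders["Stitched"] = base / "Stitched"
    for path in folders.values():
        path.mkdir(parents=True, exist_ok=True)  # safe to re-run on an existing shoot
    return folders
```

Running `organize_shoot("2019-05-12 Shoot")` once per shoot day keeps every clip's raw, half-stitched, and final versions separated the same way each time.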
The trade-off, besides the clunky cables, is the price. The least expensive tethered options are currently around $400. And that's before you address the processing issue; the Rift S and the Vive both need pretty powerful PCs to run, while the PS VR requires a PlayStation 4. If the cost isn't a deal breaker but the cables are, HTC offers a wireless adapter for the Vive, but it requires a desktop PC with a free PCIe slot to work. There are also third-party wireless adapters for the Rift, but we can't guarantee how well they work.
Microsoft HoloLens is shaping up to be another formidable competitor in the HMD market. Unlike VR tech, Microsoft based its display on holographic technology. The original HoloLens was more proof of concept than consumer device, and the recently announced HoloLens 2 is again eschewing the public in favor of enterprise and government uses. The HoloLens 2 will feature upgraded visuals, including a much-expanded field of view, which is music to the ears of fans of the original device, who cited the limited FOV as its main drawback.
The newest breed of mobile headsets can also be considered "tethered," because instead of inserting your phone into the headset itself, you physically connect your phone with a USB-C cable. Qualcomm has been emphasizing the VR and augmented reality capabilities of its Snapdragon 855 processor, and is promoting a new ecosystem of XR viewers (including both AR and VR devices). These use the aforementioned USB-C connection to run all processing on the smartphone, while the display technology is built into the VR headset or AR glasses themselves.