Tag Archives: Virtual Reality

Are We Ready for 6G?

More than a simple evolution of 5G, 6G represents a transformation of cellular technology. Just as 4G introduced us to the mobile Internet, and 5G expanded cellular communications beyond the customary cell phone, 6G will take mobile communications to new heights, beyond traditional cellular devices and applications.

6G devices operate at sub-terahertz (sub-THz) frequencies with wide bandwidths. That means 6G opens up the possibility of transferring massive amounts of information compared to what 4G and even 5G carry. These frequencies and bandwidths will enable applications such as immersive holograms, Virtual Reality (VR), and Augmented Reality (AR).
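The link between bandwidth and data rate can be made concrete with the Shannon capacity formula, C = B · log2(1 + SNR). The sketch below compares illustrative channel widths; the 20 MHz, 400 MHz, and 10 GHz figures and the 10 dB SNR are assumptions chosen for comparison, not standardized 6G numbers.

```python
import math

def shannon_capacity_gbps(bandwidth_hz, snr_db):
    """Shannon upper bound on error-free data rate: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e9

# Channel widths chosen for illustration, not standardized figures:
for label, bw_hz in [("4G, 20 MHz", 20e6),
                     ("5G FR2, 400 MHz", 400e6),
                     ("6G sub-THz, 10 GHz", 10e9)]:
    print(f"{label}: {shannon_capacity_gbps(bw_hz, 10):.2f} Gbps at 10 dB SNR")
```

At a fixed SNR, capacity scales linearly with bandwidth, which is why the wide sub-THz tranches matter more than any modulation trick.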

However, working at sub-THz frequencies demands new research into material properties, antennas, and semiconductors, along with new Digital Signal Processing (DSP) technologies. Researchers are working with materials like Silicon Germanium (SiGe) and Indium Phosphide (InP) to develop highly integrated, high-power devices. Many commercial entities, universities, and defense organizations have been researching these compound semiconductor technologies for years. Their goal is to push the upper limits of frequency and performance in areas like linearity and noise. The industry must understand system-level performance before it can commercialize these materials for use in 6G systems.

As the demand for higher data rates increases, the industry moves toward higher frequencies, where larger tranches of bandwidth are available. This has been a continuous trend across all generations of cellular technology. For instance, 5G has expanded into FR2 (Frequency Range 2), and commercial systems already use bands between 24 and 71 GHz. 6G research is likely to take the same path into the sub-THz region. The demand for high data rates is at the root of this trend.

6G devices working at sub-THz frequencies must generate adequate power to overcome higher propagation losses and semiconductor limits. Their antenna design must integrate with both the receiver and the transmitter. The receiver design must offer the lowest possible noise figure. Modulation must maintain high fidelity across the entire available band. Digital signal processing must be fast enough to accommodate high data rates across wide swathes of bandwidth.
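The higher propagation loss follows directly from the free-space path loss formula, which grows by 20 dB per decade of frequency. A minimal sketch, where the 140 GHz carrier is an assumed sub-THz candidate rather than a standardized 6G band:

```python
import math

def fspl_db(freq_hz, distance_m):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss over a 100 m link at representative carriers
for f_hz in (3.5e9, 28e9, 140e9):  # 5G mid-band, 5G mmWave, a sub-THz candidate
    print(f"{f_hz / 1e9:.0f} GHz: {fspl_db(f_hz, 100):.1f} dB")
```

Going from 3.5 GHz to 140 GHz adds roughly 32 dB of loss over the same distance, which is the gap the transmit power, antenna gain, and receiver noise figure together have to close.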

While focusing on the above aspects, it is also necessary to overcome the physical barriers of material properties while reducing noise in the system. This requires new technologies that not only operate at high frequencies but also provide digitization, test, and measurement at those frequencies. For instance, research on sub-THz systems requires wide-bandwidth test instruments.

A working 6G system may require characterization of the channel through which its signals propagate, because the sub-THz region holds novel frequency bands for 6G communications. Such channel-sounding characterization is necessary to create a mathematical model of the radio channel that can encompass reflectors in the environment, such as buildings, cars, and people. This model guides the design of the rest of the transceiver technology, including modulation and encoding schemes for forward error correction and for overcoming channel variations.
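A channel model of this kind is commonly represented as a tapped delay line, where each reflector contributes one delayed, attenuated copy of the transmitted signal. A minimal sketch, with purely illustrative delays and gains rather than measured sub-THz data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative multipath profile: each reflector (building, car, person)
# contributes one delayed, attenuated tap. These numbers are hypothetical.
tap_delays_ns = np.array([0.0, 15.0, 40.0])   # excess delay of each path
tap_gains_db  = np.array([0.0, -6.0, -12.0])  # relative power of each path

def apply_channel(tx, sample_rate_hz):
    """Convolve a transmit signal with a tapped-delay-line impulse response."""
    n_taps = int(round(tap_delays_ns.max() * 1e-9 * sample_rate_hz)) + 1
    h = np.zeros(n_taps, dtype=complex)
    for d_ns, g_db in zip(tap_delays_ns, tap_gains_db):
        idx = int(round(d_ns * 1e-9 * sample_rate_hz))
        h[idx] += 10 ** (g_db / 20) * np.exp(2j * np.pi * rng.random())
    return np.convolve(tx, h)

# An 8-sample pilot through the channel at 1 GSa/s
rx = apply_channel(np.ones(8, dtype=complex), sample_rate_hz=1e9)
print(rx.shape)  # (48,) = 8 pilot samples + 41 channel taps - 1
```

Channel sounding, in effect, estimates those delays and gains from real measurements; the equalizer and error-correction coding are then designed against that impulse response.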

Moving 3-D Sensing Into Smartphones and Vehicles

Chirp Microsystems, a startup based in Berkeley, California, has developed a new Time-of-Flight (ToF) ultrasonic sensing platform for use in wearables and Virtual and Augmented Reality (VR/AR) systems. The company has made its development platform available to a select group of large customers.

At present, high-end VR/AR systems are typically confined to a prescribed space or tethered to a base station. The limitation comes from the additional equipment the space requires to create a better tracking experience. This additional equipment is usually a magnetic sensor or a camera-based system that corrects the drift of the inertial measurement unit (IMU) within the VR/AR head unit.

Chirp has demonstrated that it can embed its miniaturized MEMS ultrasound sensors within the AR/VR head unit. With the sensors in place, the user gets a 360-degree immersive experience, as the tracking system moves along with the user. Supporting inside-out tracking, the ultrasound sensors from Chirp allow controllers or input devices to work with six degrees of freedom, offering 3-D sensing.

VR/AR systems already use optical, camera-based tracking. However, a camera is only a 2-D device, incapable of providing depth information on its own. Even to detect whether objects have shifted from one frame to the next, a camera-based system must process a point cloud with very complicated calculations.

On the other hand, ToF ultrasound sensors can easily detect 3-D movement, because the technology triangulates position directly from range measurements, and its simpler calculations demand much less power.
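The arithmetic behind ToF ranging is simple: distance is half the round-trip time of the echo multiplied by the speed of sound, and ranges from two or more sensors can then be intersected into a position. A minimal 2-D sketch; the sensor positions and ranges are invented for illustration, and Chirp's actual algorithms are not public:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def tof_distance(round_trip_s):
    """The echo travels out and back, so halve the round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2

def trilaterate_2d(p1, r1, p2, r2):
    """Intersect two range circles; sensors p1 and p2 lie on the x-axis."""
    d = p2[0] - p1[0]                           # sensor separation
    x = (d ** 2 + r1 ** 2 - r2 ** 2) / (2 * d)  # from the circle equations
    y = math.sqrt(max(r1 ** 2 - x ** 2, 0.0))   # sign fixed by sensor mounting
    return (p1[0] + x, y)

# A target 1 m away produces a round trip of about 5.8 ms
print(tof_distance(2.0 / SPEED_OF_SOUND))                 # 1.0
print(trilaterate_2d((0.0, 0.0), 1.0, (0.2, 0.0), 1.0))
```

Compared with matching point clouds across camera frames, this is a handful of multiplications per update, which is where the power advantage comes from.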

Infrared technology is another option for 3-D sensing, but it has limited use outdoors, where ambient heat tends to wash out the infrared signal. Ultrasound sensors, by contrast, are robust, consume little power, and perform well in outdoor VR/AR systems, even under a bright sun.

By bringing ToF ultrasound sensors to VR/AR systems, Chirp hopes that low-end systems will gain a more interactive experience, and that smartphones and vehicles can start hosting untethered high-end VR/AR systems.

For instance, smartphones currently use infrared technology as a proximity sensor, which prevents the user's cheek from dialing the phone by itself. However, this requires a tiny hole on the face of the smartphone for the embedded infrared sensor.

According to Chirp, some smartphone vendors have shown interest in replacing infrared with ultrasound. This would improve the aesthetics of the smartphone by removing the tiny hole on the face of the phone. Additionally, ultrasound sensors can add features such as autofocus when taking selfies and simple gesture controls.

At present, vehicles use bulky ultrasound sensors for tasks such as backing up. Chirp hopes to replace them with its ToF ultrasound sensors, which could also serve as a user interface (UI) for in-car infotainment systems. However, as automotive applications tend to have long design-in cycles, Chirp is keeping this a low priority for the time being. Chirp plans to ramp up production of its ultrasound MEMS sensors and accompanying ASICs later this year.

SOLI: The Final Interface is Your Hands

Finally, it is time to say good-bye to buttons and touchscreens. You need only wave your hands in thin air to control your gadgets. This game changer is a breakthrough from Google: its Project Soli. Soli makes hitting the wrong keys with your thumbs a thing of the past, and you can conveniently forget swiping screens. The new gesture technology from Google is very precise and works on the smallest of displays.

Soli uses small chips that generate invisible radar to recognize finger movements. The chips are small enough to be embedded into wearables and other devices. The deciphered finger movements are then translated into commands that computers can understand.

The system identifies delicate finger movements using the radar emitted by tiny microchips, and can use gestures to create virtual touchpads, dials, and more. Although camera-based sensors such as Leap Motion can also capture gestures, they are cumbersome to set up and require special hardware.

The inventor of this technology is Mr. Poupyrev, who heads the team of designers and developers at Google's ATAP (Advanced Technology and Projects) lab in San Francisco. According to the Russian inventor, the beauty of Project Soli lies in a chip that can be embedded into just about anything, and in the invisible radar emanating from it.

Typically, even the smallest radars, such as those police use for speed traps, are the size of a shoebox. The team had to struggle to shrink such a radar to fit inside a microchip. Although it took them 10 months, Mr. Poupyrev and his team shrank all the components of a radar down to millimeter size. They worked with the German chipmaker Infineon and were inspired by advances in WiGig, the next-generation communication protocol for Wi-Fi.

Soli is a simple technology, and the lack of cameras makes it easy to put wherever you want: in a toy, a watch, a wearable computer, a car, furniture, or anywhere else people want to connect with devices. For example, Soli technology makes it possible to interact with objects in Virtual Reality (VR) games.

Since Soli can replace a physical device, it works perfectly for Virtual Reality, where the user's field of vision is limited. The microchip radiates a broad radar beam to recognize the movement, distance, and velocity of fingers. The radar operates in the 60 GHz spectrum and captures about 10,000 frames per second. The chip then translates the movements into commands that computers can understand.
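The velocity part of that measurement comes from the Doppler shift of the reflected 60 GHz signal. A minimal sketch of the standard radar Doppler relation; the 0.1 m/s fingertip speed is just an illustrative value:

```python
C = 3e8  # speed of light, m/s

def doppler_shift(velocity_mps, carrier_hz=60e9):
    """Doppler shift of a radar echo from a target moving at velocity_mps."""
    return 2 * velocity_mps * carrier_hz / C

def doppler_velocity(freq_shift_hz, carrier_hz=60e9):
    """Inverse: radial velocity recovered from a measured Doppler shift."""
    return freq_shift_hz * C / (2 * carrier_hz)

# A fingertip moving toward the chip at 0.1 m/s
shift_hz = doppler_shift(0.1)
print(f"{shift_hz:.0f} Hz")        # 40 Hz
print(doppler_velocity(shift_hz))  # 0.1
```

Because the shift scales with carrier frequency, the 60 GHz band makes even slow, millimeter-scale finger motion produce a measurable frequency change, which is what lets Soli resolve such fine gestures.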

Once Project Soli becomes reality, we will be able to control devices such as fitness trackers and smartwatches with finger movements alone, without needing a smartphone as we do now. Very soon, you may simply snap your fingers to switch on the lights in your room and twirl your fingers to vary their intensity.