Qualcomm introduces the Immersive Home Platform, integrating Wi-Fi 6 and 6E to support low-latency XR applications

Qualcomm Technologies, Inc. today announced the launch of the Qualcomm Immersive Home Platform, the successor to the company’s mesh networking platform. Qualcomm said the devices deliver gigabit wireless performance to every room of the home in a palm-sized form factor, “with sufficient cost-effectiveness to meet lower consumer price points.” The Immersive Home Platform is built on a modular architecture, advances in network packet processing, and the integration of next-generation Wi-Fi 6 and 6E.

Qualcomm noted that, to meet growing home data demand, the Immersive Home Platform comes in four distinct product tiers, offering flexible designs to manufacturers and broadband operators committed to adopting Wi-Fi 6 and 6E mesh architectures across their product portfolios.

“We have launched the Qualcomm Immersive Home Platform, a new approach to high-performance Wi-Fi 6 and 6E home networks, with an architecture tailored to home deployments and designed to deliver gigabit performance and advanced features to every corner of the home,” said Nick Kucharewski, vice president and general manager of wireless infrastructure and networking at Qualcomm Technologies. “Today, with offices, classrooms, cinemas, and other settings moving into the home, high-performance Wi-Fi has transitioned from a luxury to a critical utility.”

Kevin Robinson, senior vice president of marketing at the Wi-Fi Alliance, said: “The popularity of Wi-Fi has created increasingly diverse and densely populated Wi-Fi environments, including home networks that must now support many demanding applications simultaneously.” He added: “Wi-Fi 6E features such as gigabit speeds, low latency, and high capacity will benefit users where they need it most and enable Wi-Fi devices to operate efficiently in the most dynamic home connectivity settings.”

What does this mean for augmented, virtual, and mixed reality (AR/VR/MR, collectively referred to as XR)? Qualcomm said that with the Immersive Home Platform, home networks will be ready to support devices running emerging 6 GHz applications, including VR/XR, real-time video sharing and streaming, and real-time gaming.

Specifically, the Qualcomm Immersive Home 310 Series represents the tri-band Wi-Fi 6 tier of the portfolio. It is designed to use all three bands simultaneously, supporting IoT-class devices on 2.4 GHz and today’s legacy media devices on 5 GHz, while relieving congestion by migrating data traffic from the 5 GHz to the 6 GHz band.
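The tri-band steering behavior described above can be sketched as a simple decision rule. The `steer_client` helper, band labels, and congestion threshold below are illustrative assumptions for this article, not Qualcomm’s actual firmware logic:

```python
# Illustrative sketch of tri-band client steering: keep 2.4 GHz-only IoT
# devices on 2.4 GHz, serve legacy media devices on 5 GHz, and migrate
# 6 GHz-capable clients when the 5 GHz band becomes congested.
# All names and thresholds here are assumptions, not vendor logic.

CONGESTION_THRESHOLD = 0.7  # assumed fraction of 5 GHz airtime in use

def steer_client(supports_6ghz: bool, supports_5ghz: bool,
                 airtime_5ghz: float) -> str:
    """Return the band a newly associating client should be placed on."""
    if supports_6ghz and airtime_5ghz >= CONGESTION_THRESHOLD:
        return "6GHz"        # relieve 5 GHz congestion
    if supports_5ghz:
        return "5GHz"        # default for modern media devices
    return "2.4GHz"          # IoT-class, 2.4 GHz-only clients

print(steer_client(True, True, 0.9))    # congested: steer to 6 GHz
print(steer_client(False, True, 0.9))   # legacy client stays on 5 GHz
print(steer_client(False, False, 0.1))  # IoT device on 2.4 GHz
```

A real controller would also weigh signal strength and per-client throughput, but the same congestion-triggered migration idea applies.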

As a result, thanks to their ultra-low latency, Qualcomm’s Immersive Home platforms will be able to support emerging latency-sensitive applications such as XR, reducing latency by up to 8x in some configurations operating at 6 GHz. According to the company, wireless VR devices and applications can achieve latency below 3 ms even in congested environments.

Squad, a screen-sharing platform, partners with Snap to bring AR filters to the platform and improve interactivity

Squad has announced a partnership with Snap: by using Camera Kit to integrate Snap’s augmented reality features into the Squad camera, it is giving users a new creative outlet during the pandemic.

The COVID-19 pandemic has been disastrous for much of the world. Although everyone’s experience is different, many people have turned to creative applications for relief.

Squad is a screen-sharing platform where users can watch their favorite movies and browse applications together; they can even video chat with up to nine people. Squad’s co-viewing feature supports long-form content and TikTok content, as well as other virtual experiences. By partnering with Snap, however, Squad has taken personalization and creativity to a new level.

Camera Kit, released in June, uses Snapchat’s filter-carousel UX and its Lenses AR platform to provide an integrated user experience. It lets developers bring Snapchat camera functionality into their own applications, along with a range of tools and services for creating augmented reality experiences.

Alston Cheek, director of platform partnerships at Snap, said that Camera Kit is the culmination of nine years of innovation and investment, and that Squad was one of the first partners to integrate it into their platform.

With these augmented reality features in the Squad application, users can apply Snapchat filters while chatting and watching videos, giving them a new way to share and express their personality online.

At the same time, the Lenses AR experience brings Squad closer to Snapchat’s 238 million daily active users. Since most of those users are teenagers and millennials, this is a significant benefit for a Gen Z-oriented application like Squad.

By building a social platform for meaningful interaction, Squad hopes to make people feel less lonely. During a pandemic, creative outlets are essential for managing stress and maintaining mental health.

Now, through its partnership with Snap, Squad can make screen sharing more interactive, using augmented reality experiences and filters to create engaging moments and drive community participation.

Honeywell launches a VR training solution to give workers an immersive, collaborative learning environment

Honeywell recently announced an industrial training solution called the Immersive Field Simulator, a training tool based on virtual reality (VR) and mixed reality. It incorporates a digital twin of the physical plant to deliver targeted, on-demand, skills-based training for workers, combining 3D immersive technology with operator training simulations to create a collaborative learning environment for plant operators and field technicians.

Pramesh Maheshwari, vice president and general manager of services at Honeywell, said: “Facing increasingly complex technologies and the imminent retirement of experienced employees, operators need reliable technical training and development solutions that accurately reflect the real environment. When it comes to helping plant field operators and maintenance technicians do their jobs better, traditional training methods often fall short, and the result can be reliability problems and more operational incidents.”

The Immersive Field Simulator provides virtual walkthroughs and drills to familiarize workers with the plant, including avatars representing virtual team members. As plant operations change, the simulator’s cloud-hosted platform and devices, including flexible 3D models, grow with the user. The simulator can be customized to meet specific instructional needs, and project team members and plant subject-matter experts can easily create custom training modules.

Honeywell’s Immersive Field Simulator has changed how it trains local employees: they learn by doing, building their knowledge while minimizing situations that could lead to operational interruptions, thereby improving capability across a range of areas.

“With our end-to-end solution, console and field operators can practice different operations and safety procedures in a safe simulated environment, including rare but critical emergency situations,” Maheshwari said. “This approach greatly improves on current training tools and methods. VR-based training improves confidence and retention, as well as overall professional skills. Experience has shown that students using VR can master material much faster than in a classroom.”

Honeywell’s competency management program, which includes simulator training, is based on decades of worker experience with integrated control and safety systems. Honeywell has built this experience into its latest training products to improve employee performance and safety.

U.S. Army introduces a soldier decision-support system that populates immersive XR environments with complex data

The U.S. Army today announced that its researchers have developed a system that can populate an immersive virtual environment with complex, heterogeneous data, enabling soldiers to “quickly retrieve relevant information to support mission-critical decisions.”

According to Dr. Mark Dennison, a research psychologist at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory (CCDC ARL), the system is inspired by the potential of extended reality (XR).

According to Dennison, augmented, virtual, and mixed reality (collectively, extended reality, or XR) offer an opportunity to redefine how users interact with computers and information. The laboratory’s focus is creating a common operating environment for XR that enables interoperability between commercial off-the-shelf XR systems and programs of record, heterogeneous sensors, and big-data analytics over a secure, encrypted network.

Dennison added: “This work aims to create a framework for interoperability between XR technologies and big-data analytics, to speed decision-making and enable shared understanding between co-located and distributed users.”

The U.S. Army said that by linking the “Accelerating User Reasoning: Cross-Reality Operations, Research, and Analysis” (AURORA-XR) system with the Elastic Stack, a big-data analytics architecture used by the U.S. Department of Defense, industry, and academia, scientists and engineers can now populate an immersive environment with complex, heterogeneous data.
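To make the AURORA-XR/Elastic Stack link concrete, the sketch below builds an Elasticsearch-style query body that filters a heterogeneous index down to recent, mission-relevant records before they are pushed into an XR scene. The index fields (`mission_id`, `observed_at`) and the `build_mission_query` helper are hypothetical; only the query DSL shape (`bool`/`filter`/`range`) follows standard Elasticsearch conventions:

```python
# Hypothetical sketch: construct an Elasticsearch bool query that keeps
# only records tagged with a given mission ID and observed within the
# decision window, so the XR client is not flooded with irrelevant data.
# Field names ("mission_id", "observed_at") are assumptions, not the
# actual AURORA-XR schema.

def build_mission_query(mission_id: str, window: str = "now-5m") -> dict:
    """Return an Elasticsearch query body selecting recent mission data."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"mission_id": mission_id}},
                    {"range": {"observed_at": {"gte": window}}},
                ]
            }
        },
        "sort": [{"observed_at": "desc"}],  # newest records first
    }

body = build_mission_query("example-mission")
print(body["query"]["bool"]["filter"])
```

In practice the body would be passed to an Elasticsearch client’s search call, with the results mapped onto objects in the immersive environment.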

“These rich data sets enable CCDC ARL scientists to study how to enhance complex decision-making processes in XR environments, especially compared with existing fielded information systems,” Dennison said.

Stormfish Scientific Corporation, which developed the system under contract with the laboratory, contributed to the ARL-funded research.

Dennison continued: “Military decision-making requires fusing large amounts of data from sources across domains. Much of that data may be irrelevant or redundant when trying to filter out the key elements needed to answer a particular question. The Army therefore needs novel information-mediation tools that help human decision-makers identify and retrieve relevant data quickly enough to support mission-critical decisions, where the window of opportunity may be less than a few minutes.”

For example, U.S. Army commanders may need to leverage Air Force intelligence, surveillance, and reconnaissance (ISR) assets while tracking Army ground forces co-located with the Marine Corps.

“Gray-asset data alone, such as social media feeds, can exacerbate this information space, and such data may dynamically change mission plans within complex urban terrain. Technologies that identify, transmit, and display only the information relevant to a specific unit’s mission are critical to Combined Joint All-Domain Command and Control, or CJADC2, decisions,” Dennison said.

The U.S. Army emphasized that its research also helps address the lack of secure XR networking solutions that meet Department of Defense network and security requirements, providing solutions that link basic research with prototype systems. The work also supports the “Adaptive Cross-Reality Information Mediation (AXRIM)” task under the Army Network Cross-Functional Team’s C3I (“RRI C3I”) programs.

The U.S. Army noted that this work is deepening understanding of how an immersive common operational picture can enhance decision-making across echelons in mobile, installation, and post-command computing environments. Dennison concluded: “Researchers are eager to continue working in this field to achieve scientific and transformative overmatch and provide soldiers with unparalleled capabilities on the battlefield.”

Gorilla Glass maker Corning partners with Pixelligent to develop consumer-grade AR headset optics

Corning, the maker of Gorilla Glass, has announced a new partnership with Pixelligent Technologies, a composite materials supplier, to develop optics for consumer-grade AR headsets.

According to a statement, the partnership combines Pixelligent’s optically transparent composites with Corning’s glass-manufacturing technology to “shorten product development time and improve the availability of AR devices.”

Dr. Xavier Lafosse, senior commercial and technology director at Corning, said: “I have full confidence in our ability to keep supporting device manufacturers with high-quality materials, machinery, and expertise as we prepare for the future large-scale adoption of augmented and mixed reality devices.”

Pixelligent will supply its PixClear polymer, which uses zirconia- and titania-based nanocrystals to increase the refractive index of optical components. Refractive index is a measure of how much light bends as it passes through a given medium.
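As a rough illustration of why loading a polymer with high-index nanocrystals raises its refractive index, the sketch below applies a simple linear volume-fraction mixing rule. The index values and the `effective_index` helper are illustrative assumptions, not Pixelligent data; real nanocomposite design uses more refined models (e.g. Maxwell Garnett) and measured dispersion:

```python
# Illustrative first-order (linear volume-fraction) mixing rule for the
# effective refractive index of a nanocomposite:
#     n_eff ~ f * n_particle + (1 - f) * n_matrix
# The index values below are typical textbook numbers for the visible
# range, used only for illustration.

def effective_index(n_particle: float, n_matrix: float, f: float) -> float:
    """Effective refractive index for particle volume fraction f in [0, 1]."""
    if not 0.0 <= f <= 1.0:
        raise ValueError("volume fraction must be in [0, 1]")
    return f * n_particle + (1.0 - f) * n_matrix

n_polymer = 1.50    # typical optical polymer
n_zirconia = 2.10   # approximate index of ZrO2 nanocrystals

# Filling half the volume with zirconia lifts the index from 1.50 to 1.80.
print(round(effective_index(n_zirconia, n_polymer, 0.5), 2))  # 1.8
```

Higher waveguide indices matter for AR optics because they widen the field of view achievable with diffractive waveguide displays.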

Corning has invested heavily in mass-producing ultra-flat, high-index glass wafers and supplying them to the industry’s leading AR device manufacturers (the specific list has not been disclosed). Notably, most of Corning’s post-IPO equity funding has come from Apple, which invested $450 million in the company across two investments in 2017 and 2019.

Apple is undoubtedly developing its own AR headset, so Corning and Pixelligent Technologies may well play an important role in future Apple AR products. Rumor has it that the so-called “Apple Glass” may launch sometime in 2021. But seeing is believing: like all Apple products, nothing is confirmed until it actually ships.

Editor in charge: CC
