The cloud has transformed personal computing - connecting us to each other and delivering an endless supply of media to our fingertips, changing the way we live, work and travel. But as digital content becomes more interactive and data generation becomes increasingly decentralized, we need to reshape our internet and bring the power of cloud computing to the 5G edge. By building this processing power right into the network and bringing it closer to connected devices, we can enable a new class of near real-time applications.
This paradigm shift can't be achieved by the network alone. We need the entire technology ecosystem to work together - designing the infrastructure, hardware, software, and content to utilize the power of the 5G network.
This was our motivation to announce an expansion of our Edge Computing Zone (EC Zone) program, which we launched last week with the Designing the Edge forum at the Ericsson Experience Center in Santa Clara, Calif.
Our discussions during this two-day forum further underscored the disruptive potential of edge computing, as well as the need to work collaboratively to design and re-architect applications for this new computing paradigm. We are excited to engage new collaborators as we work toward the realization of our three core missions: cutting the cord on extended reality (XR) and gaming, enabling real-time sensor analytics for public safety, and orchestrating live HD-3D mapping frameworks for vehicle automation.
We chose to explore these three areas because they are ambitious, extremely worthwhile, and truly push the limits of what 5G hopes to achieve, given their demanding bandwidth, latency, and near real-time computing requirements. We believe that integrating edge computing capabilities into the 5G network edge is key to designing the internet infrastructure of the future, which is why we also expanded our Edge Computing Zone efforts to include a 5G testing environment hosted at the Ericsson campus in Santa Clara.
Though Designing the Edge served as the kickoff for this new lab, we had already worked with five of our early program participants to integrate their applications into our testbed and showcase the potential of 5G and edge computing in time for the launch:
- PlayGiga: PlayGiga is a games-as-a-service (GaaS) company and a pioneer in cloud-based gaming. Since 2017, PlayGiga has deployed its cloud gaming service in four countries with several blue-chip partners. As telecom companies begin to deploy and commercialize their 5G network services, high-quality streaming of video games over mobile networks will become a true possibility. PlayGiga is working with the AT&T Foundry and Ericsson to showcase that 5G performance lives up to the expectations it has generated in recent years.
- HTC: Last month, we discussed our goal to deliver 6DoF interactive content over mobile 5G. By working closely with HTC, we have showcased the ability to remotely render commercially available VR games and deliver them to a mobile head-mounted display (HMD). This experience samples 2880x1600 resolution frames at 75 frames per second and transports them over Ericsson's mmWave 5G to an HTC Focus headset (see the bitrate sketch after this list). The use of 5G and edge computing allows us to bring this experience to our mobile network while still achieving the latency and bandwidth necessary to meet the performance requirements of the wireless HMD and VR application.
- NVIDIA: By working with NVIDIA and their CloudVR software, we were able to play an interactive VR game streamed over the 5G radio from an RTX server. The result was a great end-user experience, with only 5 milliseconds of network delay and no observable performance loss.
- Arvizio: We’ve also been working with Arvizio to showcase the delivery of immersive, mixed-reality applications over a 5G network, using mobile edge computing and powered by an NVIDIA RTX server. Our proof of concept demonstrates how the combination of 5G and edge can optimize complex 3D models for visualization on XR devices and ultimately deliver smooth, multi-user experiences. Arvizio works with a variety of industries, including architecture, engineering and construction (AEC), utilities, energy and advanced manufacturing.
- NGCodec: With NGCodec, we’ve been experimenting with an FPGA-driven approach to server-side encoding and compression for the network edge. Using its real-time RealityCodec video compression, NGCodec compresses the output of the desktop PC by 300:1 with minimal latency. This demo wouldn’t be possible without the low latency of 5G and RealityCodec's sub-frame video encoding at the native resolution of the HTC Vive at 90fps; the sketch after this list shows why that compression ratio matters. In this way, we envision delivering VR from the cloud to mobile headsets with the performance of a high-end gaming PC.
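To put the bandwidth side of these demos in perspective, here is a rough back-of-the-envelope sketch in Python. The resolutions, frame rates, and the 300:1 compression ratio come from the demos above; the 24-bit color depth and the Vive's 2160x1200 combined panel resolution are assumptions, so treat the figures as illustrative rather than measured.

```python
# Rough bitrate estimates for the VR streaming demos described above.
# Assumptions: 24 bits of color per pixel; 2160x1200 combined panel on the HTC Vive.

def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# HTC Focus demo: 2880x1600 frames at 75 fps
focus_raw = raw_bitrate_gbps(2880, 1600, 75)      # ~8.3 Gbps uncompressed

# NGCodec demo: assumed 2160x1200 Vive panel at 90 fps, compressed 300:1
vive_raw = raw_bitrate_gbps(2160, 1200, 90)       # ~5.6 Gbps uncompressed
vive_compressed_mbps = vive_raw * 1000 / 300      # Gbps -> Mbps, then apply 300:1

print(f"HTC Focus stream, uncompressed: ~{focus_raw:.1f} Gbps")
print(f"HTC Vive stream, uncompressed:  ~{vive_raw:.1f} Gbps")
print(f"HTC Vive stream after 300:1:    ~{vive_compressed_mbps:.0f} Mbps")
```

Even after aggressive compression, sustaining tens of megabits per second per headset with sub-frame encoding latency is why these experiences lean on the combination of mmWave 5G and GPU- and FPGA-equipped servers at the network edge.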
These projects are only the beginning of our exploration of the edge and 5G. Over the next few months, we'll continue working with these companies to quantitatively assess how next-generation network capabilities directly impact application performance and the end user experience.
The energy at Designing the Edge was palpable. We were thrilled to host such thoughtful, nuanced conversations and are energized by the new projects that they have already begun to generate. We look forward to working with our technology ecosystem to fully realize the potential of 5G. For more information about our Edge Computing Zone efforts, please visit our AT&T Foundry website.