In the market today, there are several Augmented Reality broadcasting solutions; most of them are built as integrations with existing broadcasting systems. The programs run on high-end PCs that handle colour keying, position tracking, and real-time visual-effect rendering; the output is then encoded with the main video feed and routed into the broadcasting system.

The current solutions are a level up from the virtual newsroom concept of the early 2000s, as they are more interactive and more accurate, using real-time rendered visuals instead of pre-produced CG footage. However, they are not disruptive technologies, and they do not scale down to individual broadcasters.

Today’s mobile camera has evolved into a professional tool, equipped with multiple lenses, time-of-flight (ToF) sensors, optical image stabilisation, larger complementary metal-oxide-semiconductor (CMOS) sensors, and branded lens support. This makes our mobile phones nearly as capable as professional video cameras.

More importantly, mobile devices have become all-in-one minicomputers. They handle capturing, encoding, and graphics processing independently; games like Pokémon Go have shown how accessible mobile AR can be.

In the 4G era, thousands of live video broadcasting platforms were successfully established. Moving into the world of 5G, cloud computing will further power up our mobile devices by giving them near-unlimited computing capability.

In 2020, we are at the dawn of discovering how 5G and cloud computing will deliver broadcast-level interactive experiences for end users, created by end users.

During Chingay 2020, Asia’s largest street parade, Hiverlab collaborated with Huawei and M1 on a 5G augmented reality broadcasting solution, and together a mobile 5G solution was born.

Mobile-based AR Broadcasting program “RealityCast™” by Hiverlab

The mobile program “RealityCast™” is an all-in-one mobile AR broadcasting solution, allowing users to launch and modify AR effects in the live camera feed and cast the result to mainstream video platforms.

The technology behind RealityCast™

Spatial mapping

The AR program actively scans the environment and builds a spatial map of the surroundings on the mobile device, so that AR effects can be anchored accurately within the space.

Thanks to the latest AR technology, spatial tracking focuses on major spatial features rather than individual elements, filtering out randomly moving objects and building consistent spatial data.
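The idea of keeping only stable features can be sketched in a few lines. This is a hypothetical illustration, not RealityCast™’s actual implementation: feature names, thresholds, and the persistence rule (keep only features observed across several frames with low positional jitter) are all assumptions.

```python
# Persistence-based feature filtering for spatial mapping (illustrative).
# Features observed consistently across frames, with low positional jitter,
# are kept as spatial anchors; transient detections (e.g. people walking
# through the scene) are discarded.
from collections import defaultdict
from statistics import pstdev

def stable_features(frames, min_hits=3, max_jitter=0.05):
    """frames: list of dicts mapping feature_id -> (x, y, z) observation."""
    history = defaultdict(list)
    for frame in frames:
        for fid, pos in frame.items():
            history[fid].append(pos)

    stable = {}
    for fid, observations in history.items():
        if len(observations) < min_hits:
            continue  # seen too briefly: likely a moving object
        # jitter = worst-case spread across the x, y, z axes
        jitter = max(pstdev(axis) for axis in zip(*observations))
        if jitter <= max_jitter:
            # average the observations into one stable anchor position
            stable[fid] = tuple(sum(a) / len(a) for a in zip(*observations))
    return stable
```

A wall corner that appears in every frame at roughly the same position survives the filter; a pedestrian detected in a single frame does not, which keeps the spatial map consistent.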

Interactive AR content management system

The program includes a live operation panel for users to load and adjust AR effects in real time, giving them full control over how and when to launch particular AR effects from the library. Further down the road, users will also be able to author and edit effects themselves.
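The load-adjust-launch workflow of such an operation panel can be pictured as a small effect library. The class and method names below are hypothetical, chosen only to illustrate the control flow described above, not RealityCast™’s real API.

```python
# Illustrative model of a live operation panel's effect library:
# effects are loaded with adjustable parameters, tweaked in real time,
# and launched on cue during the broadcast.
class EffectLibrary:
    def __init__(self):
        self._effects = {}

    def load(self, name, **params):
        """Register an AR effect and its adjustable parameters."""
        self._effects[name] = {"params": dict(params), "active": False}

    def adjust(self, name, **params):
        """Change parameters live (e.g. scale, opacity) mid-broadcast."""
        self._effects[name]["params"].update(params)

    def trigger(self, name):
        """Launch the effect into the live camera feed."""
        effect = self._effects[name]
        effect["active"] = True
        return effect["params"]

    def is_active(self, name):
        return self._effects[name]["active"]
```

An operator might load a confetti effect before the show, scale it up as the crowd grows, then trigger it at the right moment, mirroring the "full control of how and when" described above.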

As the full range of AR effects is rendered in real time, RealityCast™ composites the effects with the camera feed and sends the combined output over a live-streaming protocol to the 5G broadcasting server.
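One common way to hand a composited feed to an ingest server is an RTMP push. The sketch below builds an ffmpeg-style command line for such a push; the server URL, stream key, and bitrate are illustrative assumptions, not details of the actual Chingay infrastructure.

```python
# Build an ffmpeg command that encodes a composited feed and pushes it to
# an RTMP ingest point. All endpoint values here are placeholders.
def rtmp_push_command(source, server, stream_key,
                      video_bitrate="6M", codec="libx264"):
    return [
        "ffmpeg",
        "-re", "-i", source,        # read the composited feed in real time
        "-c:v", codec,              # H.264 encode for broad player support
        "-b:v", video_bitrate,      # target bitrate within the uplink budget
        "-f", "flv",                # RTMP carries FLV-muxed streams
        f"{server}/{stream_key}",
    ]

cmd = rtmp_push_command("composited.ts",
                        "rtmp://broadcast.example/live", "parade")
```

The `-re` flag throttles the input to real-time speed, which matters for a live feed rather than a file transcode.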

5G-Powered Network Structure

Huawei and M1 built the world’s first 5G test station to power the online broadcasting experience.

During Chingay 2020, Hiverlab used Huawei’s 5G-enabled mobile devices on M1’s 5G network to send the broadcasting feed.

Concurrently, video footage from a drone was being processed.

The central broadcasting server received multiple feeds from the drone and the on-ground 5G-enabled mobile devices. Peak bandwidth reached up to 100 Mbps during Chingay 2020. Network slicing and MEC (multi-access edge computing) solutions may further optimise the 5G connection in maintaining bandwidth quality.

Finally, the feeds were passed through an encoder box to the web streaming server.

Conclusion

The trial was a success and enhanced key performances during the Chingay parade; hundreds of thousands of viewers watched the augmented reality show on their mobile and PC screens over the two-day parade.