AR Opening Show at Citizen Sports Game 2020
Sep. 2020 - Oct. 2020
Hualien, Taiwan
Role
Unity Engineer
Type
Company Project with OSENSE Technology
Platforms
iOS, Android, HoloLens 2
Tools
Unity/C#, ARKit, Android Studio, Apple Xcode
︎︎︎ Google Play
︎︎︎ App Store
Overview
In this project, we created a stadium-scale indoor AR live opening performance for the Citizen Sports Game 2020 in Hualien City, Taiwan. Spectators could immerse themselves in the show through their mobile devices, with AR content accurately tailored to their seating locations.
Furthermore, the performance was live-streamed on the stadium's large screens and broadcast nationwide. Notably, the President of Taiwan attended the event and experienced the show in person (although she did not wear the HoloLens, as her security team was concerned about placing a device on her head).
Role
As the Unity engineer, I was responsible for developing the AR performance app, available for download on the App Store and Google Play. To accommodate the live broadcasting needs, I also designed a separate version of the app optimized for stability and ease of control during live broadcasts.
Process
Prototype and Iterations
To ensure the accuracy and stability of the AR localization effect, I traveled to the stadium site, which was three hours away from the office, to conduct early testing. I constructed a prototype with an approximate idea of the venue's appearance and calibration process, and then refined it based on data gathered during on-site testing and measurements.
The design and development process also involved frequent communication and collaboration between the government organizer, the animation vendor, and our internal team members. Given the project's tight one-month timeframe, rapid prototyping played a crucial role in gathering prompt feedback and making necessary application adjustments.
![]()
︎︎︎ Testing the AR localization process and accuracy on-site
Designed and Developed Four Versions of the App to Accommodate Different Needs
To meet the live broadcasting requirements of this project, I designed the AR calibration process and control panel. This allowed team members responsible for camera controls to efficiently localize the AR content and make adjustments as necessary. Additionally, I created an audience version of the app, which can be downloaded from the App Store and Google Play, enabling spectators to enjoy the performance from their seating positions on their mobile phones. Lastly, we developed VIP and HoloLens 2 versions of the app to provide an immersive experience for VIP participants on stage.
![]()
On-site Rehearsals and Setup
We dedicated five days to on-site rehearsals and testing before the performance. Our efforts involved close coordination with the lighting, camera control, and live-streaming teams to guarantee the stability of AR localization during the performance. We also conducted tests to synchronize multiple mobile phones, ensuring that the performance could commence simultaneously for all viewers, regardless of the device they were using. This synchronization was controlled by a central backend system.
![]()
︎︎︎ Testing the synchronization of multiple mobile phones
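The backend-controlled synchronization described above can be sketched as a simple clock-offset scheme: each phone estimates the difference between its local clock and the backend's clock, then converts a server-scheduled start timestamp into its own local time. The actual app was built in Unity/C#; the Python below is a hypothetical illustration of the idea, not the production code, and `server_time_fn` stands in for whatever network call returns the server's current time.

```python
import time

def estimate_clock_offset(server_time_fn):
    """Estimate the offset between the local clock and the server clock.

    One round trip: record local send/receive times around a request for
    the server's current time, then assume the server's reading matches
    the midpoint of the round trip (the classic NTP idea).
    """
    t_send = time.time()
    server_now = server_time_fn()   # in the real app, an HTTP call to the backend
    t_recv = time.time()
    midpoint = (t_send + t_recv) / 2.0
    # Add this offset to a local timestamp to approximate server time.
    return server_now - midpoint

def local_trigger_time(server_start_time, offset):
    """Convert a server-scheduled start timestamp into local clock time."""
    return server_start_time - offset
```

Each device would run the estimate a few times, keep a smoothed offset, and then fire the performance when its local clock reaches `local_trigger_time(...)`, so every phone starts within the round-trip error rather than drifting by seconds.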
Challenges
Challenges For This Project
- Integrating the AR performance seamlessly with the entire stadium to create a believable effect.
- Although spectators cannot see the performance with the naked eye, we developed an app for them to view the AR effects on their phones. The main challenge was positioning the AR animation correctly from every angle and height within the stadium, so that all spectators could view the performance from any position and on any AR-enabled device.
- Establishing a clear user flow to guide users who may be unfamiliar with AR technology toward a stable AR effect.
- Ensuring a stable AR live performance in a large indoor venue with few feature points and changing lighting conditions, knowing we had only one shot at the live broadcast.
- Controlling and aligning the start time of the live performance on every device in the stadium.
- Working with very limited setup and preparation time before the actual performance, as other performances were scheduled ahead of the AR live show.
- Creating a stable AR performance using low-budget equipment.
- Maximizing the limited time available for on-site testing and setup: Hualien City is far from Taipei, and the project had to be completed within a short timeframe with multiple stakeholders involved.
Outcome
Performance Clips
![]()
![]()
︎︎︎ The AR content blended with the environment
App Downloads
The app was #1 on the App Store in the sports category, and the performance was broadcast live nationwide by one of Taiwan's biggest public TV channels.
![]()
︎︎︎ App Store and Google Play download pages
Spectator Experience
The President of Taiwan also viewed the performance through the app, while the mayor viewed it with HoloLens 2. The performance was completed smoothly without any incidents or instability and was well-received by the spectators and the media.

︎︎︎ The event mascots stopped in front of the stage to say hi to the VIP guests
![]()
︎︎︎ Cameras capturing the performance from multiple angles

Contributions
I ensured a stable AR live performance by designing and testing the user flow and implementing intuitive, easy-to-follow guidance for users of all ages.
I traveled to Hualien and conducted on-site testing multiple times to quickly adjust the AR animation scale, positioning, and user flow. This ensured seamless integration of the virtual AR performance with the physical stadium.
Implemented features include:
- Live updates of the opening show schedule
- Live updates of the animation assets (via AssetBundle)
- Responsive app UI supporting both iOS and Android
- Backend-triggered start of the AR live performance to align the start time across devices
- AR localization methods and user flow
- A separate version of the app for broadcasting controls
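The live update of animation assets listed above follows a common pattern: compare a remote manifest of content hashes against what the device already has, and download only the bundles that are new or changed. The app itself used Unity's AssetBundle pipeline in C#; this Python sketch is a hypothetical, simplified illustration of the manifest diff, with `fetch` standing in for the actual download call.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Stable fingerprint for a bundle's bytes."""
    return hashlib.sha256(data).hexdigest()

def plan_updates(local_manifest: dict, remote_manifest: dict) -> list:
    """Names of bundles to (re)download: new on the server, or changed hash."""
    return sorted(name for name, digest in remote_manifest.items()
                  if local_manifest.get(name) != digest)

def apply_updates(local_store: dict, local_manifest: dict,
                  remote_manifest: dict, fetch) -> list:
    """Download changed bundles via `fetch(name) -> bytes` and record their hashes."""
    updated = plan_updates(local_manifest, remote_manifest)
    for name in updated:
        data = fetch(name)
        local_store[name] = data
        local_manifest[name] = content_hash(data)
    return updated
```

Because only changed bundles are transferred, the animation vendor could push last-minute asset fixes without forcing spectators to reinstall the app before the show.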
Reflections
Through this experience, I acquired a substantial amount of knowledge about AR localization and learned how to seamlessly integrate a virtual show with the physical world. Staging a performance of this magnitude demands significant teamwork, and I'm deeply grateful to everyone who dedicated their hard work to make it happen.
Follow-up Study: Can we achieve more for large-scale AR live performances?
As a follow-up study to this project, I looked into successful examples of large-scale AR live performances to understand what could be achieved without the constraints on time and budget we faced. Here are two examples I found:
︎ 2018 League of Legends World Championship
(Photo credit: Riot Games)
Combining real performers with virtual characters, this AR live performance brought Riot's virtual K-pop quartet, K/DA, to life. The virtual characters may not be visible to the naked eye, but they can be seen performing alongside the real singers and dancers via the official live stream.
Each of the four real performers has a digital twin within the K/DA group. They recorded their dance moves through motion capture in advance, ensuring that their virtual counterparts closely resemble them, resulting in a seamlessly blended performance.
The reflections of the virtual characters on stage play a crucial role in enhancing the realism of the AR effect.
The League of Legends World Championship has consistently delivered stunning AR performances each year (2017, 2018, 2019, 2020). While AR serves as the primary element in their performances, it alone cannot guarantee success. The inclusion of real singers, dancers, and musicians, combined with remarkable cinematic effects, sophisticated lighting, and seamless camera movements from various angles, all contribute to making their performances a truly spectacular watch.
︎ 2020 Tokyo Olympics Closing Ceremony
![]()
During the Tokyo Olympics 2020 closing ceremony, the live fireworks display was followed by a breathtaking AR light show. The AR light show, while invisible to the athletes inside the stadium, provided a captivating visual spectacle for television viewers.
The key to its captivating appeal lay in its seamless integration with the live fireworks both visually and spatially, making it incredibly challenging to distinguish between the virtual and real effects.
This event serves as yet another outstanding example of how augmented reality, or mixed reality, can achieve a heightened level of believability and enchantment when expertly integrated with physical spaces and tangible objects.