News

Technology and Art in Today’s Creative Industry

Oct 24, 2025 · 5 min read

Author: YSOLIFE

Technology and art have become inseparable in today’s creative landscape. Technology no longer merely supports creators; it directly reshapes how audiences are engaged, how live experiences are delivered, and how content sustains its long tail.

This feature brings together three interviews that represent this transformation in practice:

FANSI GO, a “Web 2.5” ticketing platform linking repurchase incentives, social sharing, and cross-border payments; Funique, an immersive-tech studio combining 8K 3D scanning and proprietary 4D Gaussian Splatting for XR live sessions; and NAXS Studio, an artist-led team bridging music, visual, and performing arts through real-time interactive technology.

Below is a closer look at how each team turns ideas into reality.


FANSI GO

Q1. How does FANSI GO turn a one-time ticket purchase into a long-term relationship?

A: We transform ticketing from a single transaction into an ongoing fan relationship. Fans who attend one show receive small discounts, early-bird access, or better seat numbers for the next one—creating a clear incentive to return. Our “invite-a-friend” bundles (e.g. four-ticket packs) turn social engagement into a product feature, encouraging fans to organize groups on IG or Threads. After checkout, every buyer gets a shareable digital ticket stub—evoking Web3 community pride, but without requiring blockchain use.

Most importantly, organizers gain access to verified fan contact data (e.g. emails), enabling post-event follow-ups, merch drops, and next-round ticketing. In some projects, we’ve seen repurchase rates exceeding 50%, showing how loyalty and data can work hand-in-hand.
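
To make the mechanic above a little more concrete, here is a minimal sketch of how a returning-fan incentive might be looked up at checkout from verified past attendance. The type names, thresholds, and discount tiers are hypothetical illustrations, not FANSI GO’s actual implementation.

```typescript
// Hypothetical sketch of a returning-fan incentive check at checkout.
// Names, thresholds, and discount tiers are illustrative only.
interface FanRecord {
  email: string;              // verified contact collected at purchase
  attendedEventIds: string[];
}

interface Incentive {
  discountPercent: number;
  earlyBirdAccess: boolean;
}

function incentiveFor(fan: FanRecord, organizerEventIds: Set<string>): Incentive {
  // Count how many of this organizer's past events the fan attended.
  const attended = fan.attendedEventIds.filter((id) => organizerEventIds.has(id)).length;
  if (attended === 0) return { discountPercent: 0, earlyBirdAccess: false };
  // Returning fans get a small discount and early-bird access for the next show.
  return { discountPercent: Math.min(5 + attended * 2, 15), earlyBirdAccess: true };
}

// Example: a fan who attended two prior shows sees a 9% discount at checkout.
const fan: FanRecord = { email: "fan@example.com", attendedEventIds: ["show-1", "show-2"] };
console.log(incentiveFor(fan, new Set(["show-1", "show-2", "show-3"])));
```
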

Q2. What kinds of communities are the best fit for FANSI GO?

A: We work best with communities that have strong identity and loyalty—such as Taiwan’s metal scene, self-curated venues with a clear artistic tone, or promoters with a distinct curatorial voice. Their audiences overlap and stay engaged over time, making our repurchase and social-sharing tools more effective.

Conversely, if every show attracts a completely different crowd, the impact is limited. That’s why FANSI GO currently operates by invitation—we assess alignment in vision and audience structure before partnering.

 

Figure 1. FANSI GO on-site QR code ticket verification

*Photo Source: FANSI GO

Q3. What advantages does FANSI GO offer for cross-border ticketing and payments?

A: In our project with Busan International Rock Festival, FANSI GO handled payment collection and disbursement through the platform, helping organizers avoid roughly 18% in cross-border remittance taxes. Clients can choose settlement accounts and currencies (USD / TWD), and even opt for USDC if needed.

We’re also integrated with major Southeast Asian payment aggregator Zendit, supporting local e-wallets and regional payment methods. This allows overseas events to “collect locally, settle globally,” reducing fees, easing reconciliation, and shortening cash-flow cycles.
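
As a rough illustration of “collect locally, settle globally,” the sketch below aggregates local e-wallet collections in regional currencies and settles to the organizer’s chosen account and currency. It does not reflect FANSI GO’s or Zendit’s actual APIs, and the conversion rates are placeholders.

```typescript
// Illustrative-only model of local collection and global settlement.
// Not FANSI GO's or Zendit's actual API; conversion rates are placeholders.
type SettlementCurrency = "USD" | "TWD" | "USDC";

interface LocalPayment {
  amount: number;
  currency: string; // e.g. "THB" collected via a local e-wallet
}

interface SettlementInstruction {
  currency: SettlementCurrency;
  accountId: string;
}

// Placeholder conversion table; a real platform would pull live FX rates.
const toUsd: Record<string, number> = { THB: 0.028, TWD: 0.031, KRW: 0.00072, USD: 1 };

function settle(payments: LocalPayment[], instruction: SettlementInstruction): number {
  const usdTotal = payments.reduce(
    (sum, p) => sum + p.amount * (toUsd[p.currency] ?? 0),
    0,
  );
  // USDC is treated as 1:1 with USD here purely for illustration.
  const factor = instruction.currency === "TWD" ? 1 / toUsd.TWD : 1;
  return usdTotal * factor;
}

// Example: wallet payments collected in THB and KRW, settled to a USD account.
const payout = settle(
  [{ amount: 1500, currency: "THB" }, { amount: 90000, currency: "KRW" }],
  { currency: "USD", accountId: "organizer-001" },
);
console.log(payout.toFixed(2));
```
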

Q4. How do you see e-payment trends in Taiwan and Southeast Asia?

A: E-wallet and mobile payment adoption is skyrocketing across the region—Thailand, for instance, has seen dramatic growth in the past year. Consumers now prefer wallet-based or transfer payments, so we prioritized integrations with regional payment gateways to ensure familiar user experiences. Meanwhile, paid advertising efficiency is declining due to fraud and saturation, making word-of-mouth activation far more valuable. By merging seamless e-payment with built-in fan-sharing mechanics, we turn every checkout into the start of the next wave of organic promotion.


Funique

Q1. What technologies does Funique use in production?

A: Since our founding in 2016, our core production standard has been 8K stereoscopic imaging per eye—the minimum resolution we believe can create a “virtual yet real” digital twin.

Our latest pipeline combines proprietary AI algorithms with 4D Gaussian Splatting (4DGS), enabling highly accurate, re-composable 3D models. From micro-objects to full human scans, everything is captured with real textures and zero synthetic animation artifacts.

Using our in-house capture and reconstruction process, one master file can serve multiple outputs—VR headsets, glasses-free 3D displays, theatrical projections, mobile screens, or mapping installations—without the need to reshoot. This ensures every production remains portable, scalable, and consistently high-quality across platforms.
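
The “one master file, many endpoints” idea can be pictured as mapping a single captured asset onto several output profiles. The profile values and function names below are assumptions for illustration only, not Funique’s actual pipeline.

```typescript
// Hypothetical sketch of deriving multiple delivery formats from one master capture.
// Profiles and names are illustrative; they are not Funique's actual pipeline.
interface MasterAsset {
  id: string;
  resolutionPerEye: string; // e.g. "8K" stereoscopic capture
  format: "4DGS";           // 4D Gaussian Splatting reconstruction
}

interface OutputProfile {
  target: "vr-headset" | "glasses-free-3d" | "theatrical" | "mobile" | "projection-mapping";
  maxResolution: string;
  stereoscopic: boolean;
}

const profiles: OutputProfile[] = [
  { target: "vr-headset", maxResolution: "8K", stereoscopic: true },
  { target: "glasses-free-3d", maxResolution: "4K", stereoscopic: true },
  { target: "mobile", maxResolution: "1080p", stereoscopic: false },
];

// One master asset is transcoded per profile instead of reshooting per platform.
function renderOutputs(master: MasterAsset, targets: OutputProfile[]): string[] {
  return targets.map(
    (p) => `${master.id} (${master.resolutionPerEye}/eye ${master.format}) -> ${p.target} @ ${p.maxResolution}`,
  );
}

console.log(renderOutputs({ id: "live-session-01", resolutionPerEye: "8K", format: "4DGS" }, profiles));
```
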

Q2. How does Funique define the role of “immersive live content” in the market?

A: We position immersive live visuals as part of an OMO (Offline Merge Online) ecosystem. The same XR content is first ticketed offline as an installation or screening, then distributed online via partner OTT platforms. The goal isn’t to replace live shows, but to extend their lifecycle—turning one-off performances into reusable experience assets.

Each production can sync with cinema-grade systems, head-mounted displays, or 3D projection setups. As headset experiences and 5G/6G networks become more common, online streaming will grow more stable while physical events retain their premium value.

Both sides reinforce each other, maximizing the return on every shoot. Our recent project U:NUS XR Dream Party, in collaboration with Taipei Music Center, Rock Records, and Long Hu Men, demonstrates this model in action.

 

Figure 2. U:NUS XR Dream Party exhibition experience, view 1

*Photo Source: Funique VR Studio

Q3. How can music creators collaborate with Funique?

A: Within our Live Session product line, we combine top-tier visuals with flexible monetization.

Our upcoming XR OTT platform integrates donation tools, fan voting, interactive ads, and instant shoppable content—opening new revenue channels for artists and sponsors alike.

Because each master file can be repurposed for multiple endpoints, creators get maximum return on a single production.

Both scanned and animated virtual sets can be used, and we can apply virtual cinematography in post-production to replace costly camera reshoots. This allows artists to output the same session as a traditional MV, an immersive theater experience, or a brand installation. These XR assets can also serve as international showcase demos, offering realistic, high-end visuals for global pitches and sponsorships.


NAXS Studio

Q1. What is the origin story of NAXS Studio?

A: NAXS Studio evolved from NAXS Corp, a tech/new-media art collective that operated on a project-to-project basis. In 2021, we incorporated as NAXS Studio, keeping our experimental, artistic DNA while taking on fuller production, design, and commercial work. We’ve long invested in Unity, real-time engines, WebGL, mobile, and online interactive experiences, all while staying closely connected to music creators.

Q2. Why build MMO (massively multiplayer online game)-style virtual worlds for music performance?

A: Our interest in online immersion came from earlier limitations in physical installations—restricted audience size and geography. We began using the web as a stage, blending gaming and theater to explore new forms of online immersive performance.

Early experiments included the Afterlife online exhibition and Afterlife EV20F1, a cross-border live show. After incorporation, NAXS expanded into commissioned interactive experiences such as Taiwan Now Cloud Pavilion and the National Theater & Concert Hall’s Lunatic Town.

In 2022, a collaboration with Sunset Rollercoaster led to Sunset Town, built in Unity using WebGL and Cloud Rendering. It created a multi-user, free-movement virtual concert world—continuing our exploration of new models connecting music, space, and community.
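
A multi-user, free-movement world of this kind needs, at minimum, continuous position sync between browsers. The sketch below assumes a plain WebSocket relay and a hypothetical endpoint; it is not NAXS Studio’s actual Sunset Town code.

```typescript
// Minimal multi-user position sync for a browser-based virtual concert world.
// Assumes a plain WebSocket relay server; not NAXS Studio's actual implementation.
interface AvatarState {
  userId: string;
  position: { x: number; y: number; z: number };
}

const peers = new Map<string, AvatarState>();
const socket = new WebSocket("wss://example.com/world"); // hypothetical relay endpoint

// Broadcast the local avatar's position a few times per second.
function sendLocalState(state: AvatarState): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(state));
  }
}

// Apply remote avatar updates as they arrive, so everyone moves freely in the same space.
socket.onmessage = (event: MessageEvent) => {
  const state = JSON.parse(event.data as string) as AvatarState;
  peers.set(state.userId, state);
};

// A render loop (e.g. the WebGL engine's update tick) would read `peers`
// each frame and move the corresponding avatar meshes.
setInterval(() => sendLocalState({ userId: "me", position: { x: 0, y: 0, z: 0 } }), 200);
```
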

 

Figure 3. AC Stage

*Photo Source: NAXS Studio

Q3. Your team keeps collaborating internationally. Can you share examples?

A: Body Crysis began our long-term collaboration with Australian artists Sam McGilp and Harrison Hall (introduced through Chunky Move). There are three iterations to date:

  1. 2022: a hybrid performance with Taipei experimental band Prairie WWWW (落差草原wwww) live in Taiwan, dancers live in Australia, and a synchronous online game world;

  2. 2023 at Soft Centre: a lower-spec iteration with live dancers and pre-recorded music playback;

  3. 2023 at Melbourne’s Now or Never: the NAXS team flew in to run real-time Unity visuals on site.

 

Figure 4. Soft Centre x Carriageworks x Vivid (photo: Jordan Munns, @jordankmunns)

*Photo Source: NAXS Studio

More recently, Dark Rooms, a tri-national VR documentary co-produced by Taiwan, Denmark, and Germany, followed an audiovisual/film co-production workflow and was selected for the Venice Immersive Competition, further strengthening our international production network.

 

Figure 5.

*Photo Source: NAXS Studio

Q4. What business models are you developing now?

A: We run two tracks. The first is XR/film-oriented VR co-production: transitioning from media art into the film ecosystem with professional producing, international co-financing, and festival pathways to broaden monetization.

The second is online interaction as a service: we modularize our MMO/multi-user aesthetics and technology into a service called AUTOMETA, which rapidly customizes web-based immersive experiences for brands, fashion, and live culture. Commercial terms vary by scale, from project fees to licensing or revenue share. In parallel, we continue projects like Sunset Town to validate box-office potential and long-term IP-building.

 
