Article

Virtual Production: Real-Time Rendering Pipelines for Indie Studios and the Potential in Different Scenarios

Instituto de Diseño y Fabricación, Universitat Politècnica de València, 46022 Valencia, Spain
* Author to whom correspondence should be addressed.
Submission received: 6 February 2024 / Revised: 12 March 2024 / Accepted: 14 March 2024 / Published: 17 March 2024

Abstract

This work aims to identify and propose a functional pipeline for indie live-action films using Virtual Production with photorealistic real-time rendering game engines. The new production landscape is radically changing how movies and shows are made: productions that were once executed in a linear pipeline can now run multiple tasks in parallel using real-time renderers, with high potential for different types of productions. Four interviews with professionals in the Spanish film and television market were conducted to obtain a broad perspective on the new paradigm. Following those examples, a virtual production set was implemented with an Antilatency tracking system, Unreal Engine (version 5.3), and Aximmetry (version 2023.3.2) as the leading software applications. The results are discussed, showing how pre-production, shooting, and post-production are now closely connected, and the approach's potential in different fields is analyzed.

1. Introduction to Virtual Production with Real-Time Rendering

Virtual Production (VPX) signifies a seismic shift in the filmmaking landscape, merging real-time digital environments with live-action footage, thereby revolutionizing traditional cinematic processes. Utilizing state-of-the-art 3D software, motion capture, and virtual camera systems, VPX enables filmmakers to dynamically interact and modify digital components during the live shooting process, fostering an unprecedented level of creative immediacy and flexibility.
Explorations into VPX’s application in animated filmmaking, as discussed by Bennett and Carter [1], reveal its potential to significantly streamline post-production workflows, enabling instantaneous interactions with virtual environments. This real-time capability not only enhances efficiency but also expands the creative palette available to animators and directors. In live-action contexts, VPX’s integration of sophisticated virtual technologies and software, highlighted by Priadko and Sirenko [2], transforms the entire spectrum of production stages. This transformation not only boosts efficiency but also elevates creative expression, offering filmmakers the ability to visualize and craft complex scenes with newfound adaptability.
Further investigations, such as those by Montes-Romero et al. [3] and Walmsley and Kersten [4], delve into VPX’s broader implications, demonstrating its role in augmenting visual effects, facilitating real-time decision-making, and promoting collaborative production dynamics. These studies emphasize VPX’s instrumental role in reshaping filmmaking, fostering a more interconnected and efficient creative environment.
The impact of VPX extends into the educational realm, particularly in animation training, as evidenced by Hart et al. [5]. This underscores VPX’s growing significance across various sectors, setting new benchmarks in storytelling, visual engagement, and interactive experiences, thus heralding an era where technological innovation and artistic endeavor converge seamlessly.
VPX introduces a paradigm shift from the traditional linear film production pipeline to a more integrated and parallel workflow. This shift diminishes the boundaries between the pre-production, production, and post-production phases, creating a synergistic environment that enhances collaboration and iteration among all creative stakeholders. As illustrated in Figure 1, leveraging animated 3D scenes for pitch and pre-visualization not only economizes resources but also streamlines the shooting process [6]. With the democratization of VPX technology, even indie filmmakers can harness its benefits, reflecting a broader trend where technological advancements make sophisticated filmmaking tools increasingly accessible [7].
As the film industry continually evolves, VPX stands at the forefront of this transformation, reshaping the creation narrative from the ground up. From pre-production visualization techniques to integrating green screens or LED walls during shooting and refining the final touches in post-production, VPX influences each juncture of film creation [2]. In this ever-changing landscape, VPX emerges as the next frontier, poised to redefine the cinematic experience for creators and audiences alike.

1.1. Motivation

The evolution of VPX marks a significant milestone in the landscape of filmmaking and content creation, heralding a new era where technology and creativity converge in unprecedented ways. VPX’s ability to seamlessly integrate digital environments with live-action footage is not just a testament to technological innovation but a paradigm shift that promises to redefine the traditional methodologies of storytelling and film production. This transformative potential of VPX, characterized by its capacity to enhance real-time decision-making, streamline production workflows, and foster immersive storytelling, offers a rich domain for exploration and research.
The motivation for this research paper is rooted in a deep-seated curiosity about how VPX is reshaping the narrative and aesthetic dimensions of filmmaking. It seeks to investigate the myriad ways in which VPX facilitates a more interactive, collaborative, and efficient production process, enabling filmmakers and content creators to push the boundaries of traditional storytelling. By delving into the intricacies of VPX, this paper aims to shed light on its role in democratizing film production, enabling artists and creators across the globe to realize their visions with greater creative freedom and technological empowerment. Through this exploration, this research aspires to chart the future trajectory of VPX, uncovering its potential to revolutionize the way stories are conceived, produced, and experienced in the digital age.

1.2. Related Work

The groundwork for VPX was laid by seminal films such as “Star Wars: Episode I—The Phantom Menace” [8], “Avatar” [9], “The Lord of the Rings: The Fellowship of the Ring” [10], and “The Polar Express” [11], which have significantly influenced the development and understanding of VPX in the film industry. These films set the stage for advanced VPX implementations, as seen in “The Mandalorian” [12], where Jon Favreau’s integration of VPX from visualization to in-camera final-pixel solutions has been groundbreaking, merging the real and the digital in a seamless real-time process.
Kadner elaborates on VPX as a range of technologies that enable practical filmmaking tools to interact with virtual counterparts [13]. This interaction allows a camera in the physical world to control a virtual camera within a game engine, with the real-time final image displayed on an LED wall for capture. The Epic Games company has been at the forefront of this revolution, introducing key methodologies in “The Virtual Production Field Guide” (2019) that encompass Visualization, Chroma Virtual Production, and LED Wall Virtual Production. These techniques have revolutionized narrative development in films, providing invaluable tools for directors and creative teams.
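The coupling Kadner describes, where a physical camera drives a virtual camera inside a game engine, is at its core a per-frame transform relay. The sketch below illustrates that idea only; the `CameraPose` fields, the stage-origin offset, and the `world_scale` parameter are illustrative assumptions, not the API of any specific tracking system such as Antilatency.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """Position (meters) and rotation (degrees) of a camera."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def to_virtual_pose(tracked: CameraPose, stage_origin: CameraPose,
                    world_scale: float = 1.0) -> CameraPose:
    """Map a physical tracker pose into the game-engine world.

    The stage origin anchors the physical capture volume inside the
    virtual set, and world_scale lets a small stage stand in for a
    larger environment. In a real pipeline this runs every frame.
    """
    return CameraPose(
        x=stage_origin.x + tracked.x * world_scale,
        y=stage_origin.y + tracked.y * world_scale,
        z=stage_origin.z + tracked.z * world_scale,
        pan=stage_origin.pan + tracked.pan,  # yaw offset aligns stage and set
        tilt=tracked.tilt,
        roll=tracked.roll,
    )
```

Because this mapping is applied continuously, a dolly move on the physical stage produces the matching parallax in the engine render shown on the LED wall.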
Visualization is pivotal in VPX, serving as the initial step in visualizing and refining creative concepts. It encompasses various phases, including Pitchviz, Shotviz, Previz, Virtual Scouting, Techviz, Stuntviz, and Postviz, each crucial for different stages of the filmmaking process (Figure 2) [14,15,16,17,18,19,20]. Chroma Virtual Production offers a novel approach to green screen filmmaking, allowing for the real-time incorporation of virtual elements into live-action footage, thus broadening narrative possibilities and creative flexibility [20,21]. LED Wall Virtual Production represents a fusion of live action with real-time game engine visuals, presented on large LED screens, providing dynamic lighting and immersive production settings [13,20].
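The real-time incorporation of virtual elements in Chroma Virtual Production rests on pulling a matte from the green backdrop and blending the live plate over the CG background every frame. The NumPy sketch below shows the principle with a simple color-distance keyer; it is a minimal illustration, not the algorithm of any production keyer (e.g., Ultimatte), and the `tolerance` and `softness` values are assumed.

```python
import numpy as np

def chroma_matte(frame: np.ndarray, key_rgb=(0.0, 1.0, 0.0),
                 tolerance: float = 0.4, softness: float = 0.2) -> np.ndarray:
    """Distance-based green-screen matte: 0 = background, 1 = foreground.

    frame: float RGB image in [0, 1], shape (H, W, 3).
    Pixels close to key_rgb are keyed out; a soft ramp avoids hard fringes.
    """
    dist = np.linalg.norm(frame - np.asarray(key_rgb), axis=-1)
    # Ramp from 0 (within tolerance of the key color) to 1 (clearly foreground).
    return np.clip((dist - tolerance) / softness, 0.0, 1.0)

def composite(fg: np.ndarray, bg: np.ndarray, matte: np.ndarray) -> np.ndarray:
    """Blend the live-action foreground over the CG backdrop using the matte."""
    alpha = matte[..., None]
    return fg * alpha + bg * (1.0 - alpha)
```

Running this per frame against a game-engine background render is, in miniature, what the real-time composite monitors on a chroma VPX set display.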
Recent studies further explore the advancements in VPX and real-time rendering technologies. Tewari et al. investigate neural rendering, a method that marries generative machine learning with traditional computer graphics, opening new avenues in virtual and augmented reality applications [22]. Dong discusses the impact of game engines like Unreal Engine and Unity on virtual production, emphasizing their role in real-time rendering and remote collaboration [23].
Fuqiang et al. introduce a neural approach for reconstructing, compressing, and rendering human performances, enhancing the creation of realistic and expressive avatars for VR [24]. Li et al. delve into a VR-based design method for multi-visual animation character 3D models, integrating real-time computer graphics technologies to enrich character immersion [25]. Zhang et al. address the challenges in VR education through a cloud-to-end rendering and storage system, showcasing the system’s ability to deliver high-quality, real-time 3D content [26].
These integrated insights provide a comprehensive overview of the evolution and current state of virtual production and real-time rendering, illustrating their transformative impact on the film industry and beyond and driving innovation in narrative development, creative visualization, and interactive experiences.

1.3. Objectives and Main Contributions

This research article investigates VPX, examining its tools, technologies, and methodologies in comparison to traditional filmmaking practices to make them more accessible for smaller-scale productions. Through a detailed study involving a literature review and interviews with VPX professionals in Spain, this paper analyzes how VPX integrates real-time rendering throughout production phases, linking pre-production, filming, and post-production seamlessly. Additionally, this work introduces a lab prototype implementation that exemplifies VPX’s practical applications, offering insights into its integration in production environments.
This study aims to fill a knowledge gap in the practical use of VPX, particularly within the Spanish film industry, focusing on its impact on production workflows, cost implications, and future directions. It provides empirical evidence and uncovers innovative practices in Spain, highlighting the country’s growing importance in the VPX arena and detailing the adoption of VPX methodologies by industry professionals. This exploration offers a comprehensive perspective on the transformation VPX brings to filmmaking, discussing its challenges and transformative potential.
By investigating VPX’s deployment and effects in Spain’s film and audiovisual sector, this paper sheds light on its regional adoption and examines the broader implications for production pipelines, schedules, and costs. It discusses theoretical and practical contributions to the field, revealing how VPX can revolutionize production processes, enable new storytelling techniques, and highlight the role of technologies like Unreal Engine in modernizing various production stages. The findings point to significant trends, such as department centralization and shifts in animation and rendering practices, showcasing VPX’s evolving impact on the filmmaking landscape.

1.4. Structure

To systematically address the objectives related to the VPX topic, the structure of this paper is as follows: The Introduction sets the stage by outlining the scope and significance of the study. Subsequently, Section 2 elucidates the research design and methodologies utilized, whereas Section 3 delves into the practical application of VPX, evaluating its influence on various filmmaking processes. Next, Section 4 examines the challenges and opportunities associated with VPX, integrating insights from the literature and expert interviews. The paper culminates in Section 5, where the future implications of VPX in the cinematic domain are contemplated and synthesized.

2. Materials and Methods

This section delineates the comprehensive methodology employed in this research to investigate the tools, technologies, and methodologies of VPX and their comparative analysis with traditional filmmaking practices. Aimed at elucidating VPX’s role in enhancing the accessibility and efficiency of production processes for smaller-scale productions, this study adopts a multi-faceted approach encompassing an extensive literature review, interviews with seasoned VPX professionals in Spain, and the development of a lab prototype to exemplify VPX’s practical applications.
Initially, the research methodology is underpinned by a systematic literature review designed to forge a foundational understanding of VPX’s current landscape and its integration into various production phases. Following this, this study engages with firsthand insights through qualitative interviews with an array of VPX professionals across Spain, selected based on stringent criteria to ensure a broad representation of industry perspectives. The culmination of this methodological approach is the creation and analysis of a lab prototype, which serves not only as a tangible representation of VPX’s capabilities but also as a tool to explore its practical implications within production environments.

2.1. Literature Review Process

This literature review was orchestrated to build a comprehensive foundation for VPX, examining its tools, technologies, methodologies, and juxtaposition with conventional filmmaking practices. This review particularly emphasizes the integration of VPX in film production, its adoption across various geographies, and its impacts on the industry.
  • Search Strategy: A systematic search was conducted using keywords related to VPX, focusing on identifying the literature that discusses the technological aspects, application in film production, and comparisons with traditional filmmaking methods.
  • Database Selection: Through databases like Google Scholar and ACM Digital Library, significant works were identified, including the discussion on VPET given by Spielmann et al. as an innovative virtual production tool [27], showcasing its application in enhancing filmmaking processes.
  • Inclusion and Exclusion Criteria: Studies were selected based on their relevance to VPX’s technological advancements, applications, and impacts. For example, Fair examines the implications of virtual production on regional filmmaking, providing insights into the economic and policy dimensions influenced by VPX [28].
  • Screening and Selection: After screening, the selected studies included works such as those by Kavakli and Cremona [29], which analyze the Virtual Production Studio concept, offering a comprehensive view of VPX’s transformative potential in filmmaking.
  • Synthesis and Analysis: The synthesis highlighted the evolution of VPX, its integration into production processes, and its broader industry impacts. This included an examination of Bennett and Carter’s work [1] to delve into how virtual production is adopted in animated filmmaking, contributing to the understanding of VPX’s versatility and adaptability across different production contexts.

2.2. Selection of VPX Professionals

To gain an in-depth perspective on VPX within the Spanish film industry, this study meticulously selected a group of seasoned professionals from Spain’s leading VPX studios: Koru Media Labs, El Ranchito, Orca Studios, and MR Factory. These individuals bring a wealth of experience and expertise, shaping the narrative of VPX’s current state, challenges, and future directions in Spanish filmmaking.
  • Edu Martin: The founder of Koru Media Labs boasts almost 25 years of experience in the computer animation industry. His impressive career includes pivotal roles such as Lighting Sequence Lead for “Planet 51” [30] at Ilion Animation Studios and Lighting Technical Director at Pixar Animation Studios, contributing to animated classics like “Brave” [31], “Monsters University” [32], and “Cars 2” [33]. Additionally, he served as CG Supervisor for “Wonder Park” [34] at Skydance Animation in Madrid. Notably, Edu Martin recently ventured into real-time rendering, creating his short film titled “Out of Sync,” which has garnered recognition, including a Mega-Grant from Epic Games, a key player in the virtual production landscape [35,36].
  • Rafael Lozano: A former researcher and software developer at El Ranchito, a leading VFX facility in Spain, brings over five years of experience in developing pipelines and tools for major applications used in the film industry, including Maya, Unreal Engine, Shotgun, Houdini, and Nuke. His extensive portfolio, as reflected in his IMDb profile, showcases his contributions to notable films and series such as “Hernán” [37], “Westworld” [38], “Game of Thrones” [39], “The First” [40], and “Super Lopez” [41,42,43].
  • Adrian Corsei: Serves as the head of the Spanish Virtual Production Facility, Orca, located in the Canary Islands, which stands as one of the pioneering LED wall VPX studios in Spain. With a rich background in the film industry, Adrian has contributed to several noteworthy films, including “Spider-Man: Homecoming” [44], “Guardians of the Galaxy Vol. 2” [45], “Captain America: Civil War” [46], and “Iron Man 3” [47] at Trixter. His professional journey has seen him progress from Lead Light TD to CG Supervisor, VFX Supervisor, and ultimately, to the position of Head of Studio at one of the major VFX facilities in Germany [48,49].
  • Óscar Olarte: Holds the role of co-founder and Chief Technology Officer (CTO) at MR Factory, a Green Screen Virtual Production studio spanning 500 square meters and situated in Madrid. Embarking on his digital journey approximately 30 years ago, Óscar initially engaged in developing flight simulation software. Over the years, his expertise extended to encompass a diverse range of projects, including audiovisual experiences for museums, documentaries, TV news, films, and series. Some of his recent notable contributions can be observed in productions like “The Goya Murderers” [50] and “Sky Rojo” [51,52,53].

2.3. Interview Methodology

The qualitative component of this study was significantly enriched by conducting in-depth interviews with selected VPX professionals, aiming to gather their expert insights and experiences related to the integration and impact of VPX in Spain. The interviews were structured around four pivotal questions, as detailed below, designed to explore various dimensions of virtual production and its implications for the film industry.
  • Q1—What is Virtual Production?—This question aimed to understand the professionals’ definitions and perspectives on virtual production, establishing a foundational understanding of the concept as seen through the lens of industry experts.
  • Q2—How has virtual production been used in Spain?—This inquiry sought to explore the adoption and application of virtual production technologies within the Spanish film industry, identifying unique practices and the extent of integration in local production environments.
  • Q3—How does VPX change the pipeline, schedule, and cost?—This question was designed to delve into the operational impacts of VPX, examining how it influences production workflows, timelines, and financial aspects of filmmaking.
  • Q4—VPX in the future?—Focusing on forward-looking insights, this question encouraged professionals to speculate on the evolution and future trajectory of VPX in the film industry, capturing their visions and expectations.
The interviews were conducted remotely via Zoom, providing a convenient and flexible platform for the participants. Each session had an average duration of 80 min, ensuring comprehensive coverage and depth of discussion. This format also facilitated the recording of the interviews, with the consent of the participants, to support accurate transcription and detailed analysis.

2.4. Qualitative Analysis Methodology

Initially, detailed transcripts of the interviews were meticulously prepared to ensure an accurate representation of the participants’ responses. This step was crucial for capturing the nuanced perspectives shared during the discussions. Following the transcription process, an open coding method was employed to sift through the data, identifying and generating preliminary codes that encapsulate the key ideas articulated by the interviewees.
Building upon the initial coding, a thorough thematic analysis was undertaken. This analysis aimed to distill the core themes and insights emerging from the interview data, providing a structured understanding of the topics discussed. Through this analytical process, the authors of this work were able to categorize the data into coherent themes that reflect the collective viewpoints and individual experiences of the participants.
To deepen the analysis, a comparative examination of the thematic findings was conducted across all interviews. This comparative analysis was instrumental in uncovering commonalities, differences, and unique perspectives among the participants, offering a multifaceted view of the subject matter.
The culmination of this rigorous analytical process led to the identification of several key themes: the Definition of Virtual Production; Challenges in Traditional Production Methods; Advantages of VPX; Applications of Virtual Production in Spain; Impact on Production Pipeline, Schedule, and Cost; and Future Trends and Anticipated Changes in the field. These themes not only encapsulate the essence of the discussions but also provide a comprehensive overview of the current state and future prospects of VPX in the film industry.

2.5. VPX Prototype Implementation

The development of a lab prototype was informed by insights from our methodology and expert interviews, showcasing practical applications of VPX in the Spanish film industry. This prototype illustrates the integration of VPX technologies into the filmmaking process, reflecting the expert discussions on current trends and future directions.
The prototype’s design incorporates advanced VPX tools identified as crucial or emerging in the industry, aimed at demonstrating their impact on enhancing efficiency, creativity, and cost-effectiveness in film production. The detailed characteristics and outcomes of the prototype, illustrating its potential benefits and challenges, will be elaborated in Section 3, offering a focused view of its application and relevance to real-world production settings.

3. Results

3.1. Case Studies in the Spanish Market

The film and audiovisual market in Spain has experienced significant growth, propelling the country to the sixth rank globally for exporting movies, music, and series. The contributing factors include qualified professionals, state-of-the-art technology, diverse filming locations, and tax incentives. In 2020 and 2021, revenues from fictional series productions amounted to almost EUR 4.3 billion, generating approximately 72 series, 18,443 jobs, and EUR 264 million in tax revenue. The “Spain AVS Hub” plan aims to make Spain the main European audiovisual hub by 2025, with a public investment of EUR 1.6 billion, targeting a 30% increase in film, series, shorts, advertising, videogames, and animation production [54,55].
Spain is evolving into an international hub for films and series, necessitating an understanding of how virtual production is being implemented in the Spanish market. To explore this, four interviews were conducted with production-proven professionals experienced in Virtual Production (VPX).
  • Q1—What is Virtual Production?
Virtual production serves as a pivotal method for bridging the gap between post-production and pre-production, intertwining these two cinematic realms [36]. Through effective pre-production employing Virtual Production (VPX), a film crew can streamline the shooting process across multiple locations, enhancing the narrative compared to traditional production methods. Within a virtual production studio, locations can be seamlessly altered with a simple click, allowing for rapid transitions. The lighting of live-action scenes harmonizes with LED wall backdrops, or it can be aligned using the appropriate lighting setup and laser projectors in a greenscreen backdrop scenario, with the final composite image continuously visible on monitors in real time. In VPX pre-production, cameras, lights, and actors can be staged in a virtual setting, establishing the movie’s mood and efficiently conveying that information to the entire crew without the need to capture any frames.
This synergy empowers directors and film crews to adapt and explore improvements to the story, iterating the original idea when necessary, especially post-pandemic, where filmmakers have adjusted their approaches to comply with restrictions and safety measures. Virtual production allows for the application of digitized physical locations in a controlled environment, such as a TV set [49]. This significantly reduces post-production efforts, as elements like chroma keys are replaced by expansive screens, resulting in recordings that closely resemble the final shot [52]. Budget constraints, travel restrictions, or logistics may force directors or producers to cut scenes, potentially compromising the narrative. However, with VPX, the cast and crew can virtually transport to any location worldwide without leaving the studio [52].
A major challenge lies in integrating a 3D backdrop with an actor on a green screen. A Director of Photography (DOP) can experiment with lighting in various ways, but the traditional process entails waiting for days, weeks, or even months to see the final shot and discover that the two worlds do not align. In traditional shooting, an experienced crew can handle one or two locations a day. In contrast, a virtual production pipeline can manage six or more locations, significantly reducing shooting time [52].
In some instances, a VFX supervisor on set can provide guidance to the director or DOP for a more accurate scenario in post-production. In VPX, the DOP can adjust lights as many times as necessary, observing real-time results, either as an in-camera final using LED walls or as real-time compositing with a green screen to achieve a flawless composite shot. All assets used in previz and during shooting can be reused in post-production as references for final assets [36].
Real-time composition offers creative freedom and various benefits, including cost and scheduling efficiency, providing a profound understanding of the production [39]. This approach enables better decision-making by offering early visibility to the director, DOP, cast, and crew of elements that would typically only be revealed much later. The advantages include enhanced photographic composition, a more immersive experience for actors, and improvements in high-quality indirect lighting [52]. Table 1 summarizes the primary benefits of virtual production below.
  • Q2—How has virtual production been used in Spain?
When inquiring about the timing and process of implementing Virtual Production (VPX) studios, each of the four companies provided unique insights, summarized in Table 2.
MR Factory adopted a Chroma VPX pipeline and workflow, achieving final compositing on set or location while also acquiring all the necessary data and media for a post-produced environment if required by the director. Recorded data includes the original greenscreen plate, the matte using the Ultimatte keyer, the CG backdrop, an FBX file with camera tracking data, and the final composite plate. Achieving optimal lighting for characters and props is simplified as adjustments can be made in real time by observing the composite on the monitor [52]. Reflection challenges in greenscreen scenarios were addressed by recreating reflective objects using a game engine. Techniques such as LIDAR scanning or photogrammetry were employed to model accurate 3D versions of problematic objects, ensuring synchronization and tracking with real-world counterparts in the shot [52].
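The per-take deliverables MR Factory records (green plate, matte, CG backdrop, camera-tracking FBX, final composite) imply a consistent on-set data layout so post-production can locate every element of a shot. The sketch below shows one hypothetical way to scaffold that structure; the folder scheme and file names are the author's assumptions, not MR Factory's actual convention.

```python
from pathlib import Path

# The five deliverables recorded per take (file names are hypothetical).
DELIVERABLES = {
    "plate": "greenscreen_plate.mov",    # original camera recording
    "matte": "ultimatte_matte.mov",      # key pulled on set
    "backdrop": "cg_backdrop.mov",       # game-engine background render
    "tracking": "camera_tracking.fbx",   # camera move for post-production
    "composite": "final_composite.mov",  # real-time comp seen on set
}

def create_take_structure(root: Path, scene: str, take: int) -> dict:
    """Create an empty folder/file layout for one take's deliverables."""
    take_dir = root / scene / f"take_{take:03d}"
    take_dir.mkdir(parents=True, exist_ok=True)
    paths = {}
    for kind, filename in DELIVERABLES.items():
        path = take_dir / filename
        path.touch()  # placeholder; the recorder writes the real file here
        paths[kind] = path
    return paths
```

Keeping all five elements side by side is what lets a director choose, per shot, between the on-set composite and a fully post-produced version without re-shooting.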
Orca Studios targets the film market, offering comprehensive vertical services from concept and pre-production to shooting with LED walls, post-production, and final film delivery. The use of live LED walls for in-camera final-pixel shots enhances shooting efficiency, providing a flexible and controllable environment with real-time illumination and reflections. This real-time visibility allows for on-the-fly creative decisions, eliminating the waiting period for rendering results. Thorough planning is crucial, requiring testing of all locations before shooting [48].
El Ranchito embarked on its VPX journey with the series “Hernán,” recreating the entire island of Tenochtitlán as a virtual environment. The camera could be controlled within this virtual space, obtaining coordinates that guided the real camera placement. This synchronization allowed for a harmonious blend of the real and virtual environments. In HBO’s Westworld series, El Ranchito replicated the City of Arts and Sciences in Valencia, using it inside Unreal Engine and displaying it on a 50 by 22-foot LED wall in Los Angeles for in-camera final pixel shots. A notable development from El Ranchito is CrowdER, a crowd system enabling the integration of a virtual environment filled with digital people into a real setting, creating the illusion of events occurring in a different location. El Ranchito’s focus lies in tool development and CGI asset creation, offering a versatile approach to 3D animation by integrating Unreal Engine into various stages of production [56].
Edu Martín’s approach involves working with editing, animation (independent of the camera view), cameras, and lighting, closely resembling live-action shooting with digital characters [36].
  • Q3—How does VPX change the pipeline, schedule, and cost?
In a traditional pipeline, tasks are typically executed sequentially, progressing from one stage to the next. Conversely, VPX introduces a multi-threaded, parallel approach, emphasizing the importance of meticulous pre-production planning to seamlessly integrate VFX from the final stage into pre-production and shooting. Thorough testing and pre-shoots are essential to identify and address potential issues that may arise during filming.
On-set, incomplete, or subpar production assets can be seamlessly replaced with a greenscreen, recorded using the LED wall as a backdrop. This technique retains lighting and reflections from the digital environment while only replacing the recorded green elements. Directors gain quick access to multiple locations without the need for extensive travel or cast and crew relocation. These efficiencies can result in approximately a 30% budget reduction by expediting processes and consolidating activities in a centralized location [49].
The advantages of VPX with a chroma backdrop harken back to the filmmaking style of Hollywood’s golden age, providing greater control over the shooting environment. Workflows may involve shooting for the final pixel using real-time compositing or opting for a post-processing approach in post-production. The latter provides access to various essential elements such as the original plate, matte, 3D file with camera movement, 3D background render, and the composite plate as a Postviz. While achieving lighting consistency can be challenging with chroma backdrops, VPX allows real-time adjustments by observing the reference monitor. A notable advantage of working with chromas compared to LED walls is the ability to use several cameras simultaneously against the chroma backdrop.
The integration with traditional CGI asset creation is nearly seamless, with the pipeline often involving the use of Maya and Houdini for 3D asset production, followed by insertion into Unreal. The accelerated production schedule eliminates the need to wait for image rendering, providing dynamic feedback to clients. However, the careful coordination and development of assets for Unreal Engine are crucial, demanding adherence to specific requirements. Unreal’s real-time rendering necessitates optimized geometry, UVs, and textures, requiring precision in software like Maya or Houdini. Despite these challenges, the final quality of assets, especially for environments like rocky landscapes, rivals or surpasses classical methods. While close-ups of organic characters with hyper-realistic details may still favor traditional methods, the difference is diminishing, and real-time benefits often outweigh the minor quality distinctions [52].
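The optimization requirements described above can be made concrete with a small pre-import check. The following Python sketch is illustrative only: the `Asset` record and the numeric budgets are hypothetical examples, not values prescribed by Unreal Engine, and real projects would rely on the engine's own profiling tools.

```python
# Illustrative pre-import check for real-time asset budgets.
# The Asset record and the numeric limits are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    triangle_count: int
    texture_sizes: list  # (width, height) pairs

def is_power_of_two(n: int) -> bool:
    """GPU-friendly textures are typically power-of-two sized."""
    return n > 0 and (n & (n - 1)) == 0

def validate(asset: Asset, max_triangles: int = 100_000) -> list:
    """Return a list of human-readable problems; an empty list means the asset passes."""
    problems = []
    if asset.triangle_count > max_triangles:
        problems.append(
            f"{asset.name}: {asset.triangle_count} triangles exceeds budget of {max_triangles}"
        )
    for w, h in asset.texture_sizes:
        if not (is_power_of_two(w) and is_power_of_two(h)):
            problems.append(f"{asset.name}: texture {w}x{h} is not power-of-two")
    return problems
```

A check of this kind would typically run in Maya or Houdini before export, so that assets arrive in Unreal already within budget.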
The workflow for a 3D animated film undergoes a substantial transformation by centralizing all processes within Unreal Engine. However, this restructuring necessitates a well-organized framework to facilitate communication with external Digital Content Creation (DCC) applications, primarily for tasks such as modeling, texturing, animation, and effects. A critical aspect of this transition involves the use of file formats like Alembic or USD to enable seamless iteration and fluid communication between software applications. Another significant evolution in the filmmaking process is the increasing demand for compositing in fully animated movies. This arises from the necessity to address issues that cannot be re-rendered due to constraints such as budget or time limitations. In the context of a movie running on a real-time game engine, corrections and adjustments are made instantaneously, aligning with the director’s vision. Consequently, the storage and management of data within the network undergo changes, as there is no longer a need to export render layers or AOVs for each frame during the compositing phase. Instead, by simply hitting play on the game engine, the final result can be observed, and only the final versions of frames, stored as EXR or DPX files, need to be retained.
This shift introduces a more interactive revision process. In traditional pipelines, directors typically review changes in a cinema setting with supervisors and the production team, hoping for the desired modifications in subsequent revisions. By contrast, in a real-time pipeline, adjustments are made swiftly, as exemplified in the interview [36]: “[…] I don’t approach it shot by shot or element by element. Instead, I hit play and pause on the shot that, for some reason, doesn’t seem to be working. I then adjust the lights and play it again. Rather than stopping at that particular shot, I proceed to the next one, only to discover that the cut doesn’t quite fit. So, I make some modifications and fine-tune it a bit. Afterward, I adjust the camera. […]”.
This innovative approach to 3D animation aligns more closely with the methods employed by live-action directors, allowing for real-time adjustments based on what is observed. In contrast, the traditional approach required meticulous planning and execution due to the high costs associated with frequent changes. Additionally, the real-time environment facilitates the use of smaller teams [36].
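The storage change described above, where only final EXR/DPX frames are retained instead of per-frame render layers and AOVs, can be sketched as a simple retention policy. The `shot_vNNN.FFFF.exr` naming convention below is a hypothetical example for illustration, not a studio standard.

```python
# Illustrative sketch: keep only the highest version of each final frame,
# mirroring the shift away from storing per-frame render layers/AOVs.
# The "shot_vNNN.FFFF.exr" naming convention is a hypothetical example.
import re

FRAME_RE = re.compile(r"^(?P<shot>\w+)_v(?P<version>\d+)\.(?P<frame>\d+)\.(exr|dpx)$")

def frames_to_keep(filenames):
    """Return the subset of filenames holding the highest version per (shot, frame)."""
    latest = {}
    for name in filenames:
        m = FRAME_RE.match(name)
        if not m:
            continue  # ignore anything outside the convention
        key = (m["shot"], m["frame"])
        version = int(m["version"])
        if key not in latest or version > latest[key][0]:
            latest[key] = (version, name)
    return sorted(name for _, name in latest.values())
```

Because the engine can re-render any frame on demand, older versions and intermediate layers become disposable rather than archival.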
  • Q4—VPX in the future?
The future of virtual production (VPX) envisions the establishment of extensive film studios featuring a blend of real and virtual elements. The onset of the COVID-19 pandemic accelerated the integration of VPX into the film industry by two or three years, prompting filmmakers worldwide to swiftly adopt and adapt to these technologies due to travel restrictions and health concerns [52]. To facilitate crew training, starting with smaller, faster projects such as TV spots, music videos, or short films is recommended [48].
Anticipated Changes:
  • Emergence of New Narratives: VPX is expected to bring forth new types of narratives, enhancing production times and overall outcomes. The accessibility of these technologies for home use is likely to spur the emergence of innovative concepts. This methodology is poised to replace the traditional approach involving chroma keys and rendering [52].
  • Unreal Engine’s Role: Unreal Engine [57], in collaboration with the Virtual Production Hub, has introduced a versatile pipeline for virtual production. Four major changes in comparison to traditional linear pipelines were identified, spanning pre-production, shooting, and post-production. Notably, departments like Art, Previz, Virtual Art, and VFX collaborate in the early stages of pre-production to develop tools and techniques for a more compelling story. Real-time rendering engines, such as Unreal Engine, serve as hubs for Digital Content Creation (DCC) applications during pre-production, allowing seamless asset development for use throughout the filmmaking process [58].
  • Diverse Shooting Options: During shooting, three complementary options are available: a traditional live-action workflow, a chroma backdrop, and an LED wall. In the latter two options, VFX is integrated into the shooting process. Post-production involves continuous feedback between editorial and VFX departments, ensuring a cohesive narrative. Assets created early in the process can be used for final pixel shots or serve as starting points for developing final assets, with real-time rendering enabling instant corrections and improvements [56].
  • Centralization of Departments: Real-time game engines can centralize various departments, including Layout, Look Development, Lighting, and Editorial, streamlining decision-making processes. The traditional linear pipeline limitations in revisions are overcome by allowing the director to see the entire work simultaneously, promoting more instinctive decision-making. This shift may lead to a sequence-based approach, reducing the need for specialists and fostering a dynamic collaboration among generalists [56].
  • Animation and Rendering Changes: The animation and rendering of characters and props will resemble on-set procedures, focusing on performance and selecting the perfect camera angle. The need for extensive digital compositing will diminish as real-time rendering allows adjustments before the final film rendering. Smaller, more iterative creative teams are anticipated, with the game engine serving as a hub for multiple departments in animated productions [56].
In addition, advances in generative artificial intelligence (AI) tools for producing 3D assets, materials, and even full environments point to a near future in which complex 3D scenes can be generated from a simple prompt. Generative tools already on the market herald this trend at a smaller scale. Tools like Artbreeder, Promethean AI, RunwayML, and NVIDIA’s GauGAN are at the forefront of this evolution, offering capabilities to automate and enhance the creation of 3D assets, textures, landscapes, and more [59,60,61,62]. These AI-driven platforms enable users to generate detailed, realistic environments, characters, and textures with increased efficiency and reduced manual effort. By allowing developers and artists to describe desired outcomes in natural language or through simple sketches, these tools significantly streamline the development process. This symbiosis between AI and Unreal Engine not only accelerates production timelines but also opens new avenues for creativity and innovation in VPX projects. The rapid evolution of these tools continues to democratize virtual production, making sophisticated filmmaking accessible to indie creators and expanding narrative and aesthetic possibilities across the industry.
These technological advancements are not just reshaping current filmmaking practices but are also signaling a future where storytelling converges with interactive, immersive experiences, marking a new era of efficiency, creativity, and audience engagement in the film industry.
Building on this, the real-world application of VPX in Spanish indie filmmaking vividly illustrates the transformative impact highlighted above. By embracing virtual production, Spanish indie filmmakers are not merely adopting new technologies; they are pioneering a shift in cinematic creation that aligns with the global trend toward more dynamic, cost-effective, and collaborative filmmaking processes. By integrating real-time rendering and LED wall technologies, indie filmmakers can now achieve Hollywood-level visual effects on limited budgets, fostering creative freedom and innovation. The resulting democratization of film production broadens participation, empowering a new generation of filmmakers to bring their visions to life with unprecedented immediacy and flexibility. This technological evolution not only reshapes the roles within film crews but also encourages collaborative creativity, blurring the lines between production stages and facilitating a more iterative approach to storytelling.

3.2. VPX Prototype

The establishment of an indie virtual production set, illustrated in Figure 3, serves as a multifaceted platform catering to both film production and educational purposes. By incorporating real-time rendering game engines like Unreal Engine, filmmakers are afforded the opportunity to immerse themselves in virtual environments, marking a transformative shift in content creation methodologies. This shift is characterized by a move towards a parallel and collaborative pipeline, a significant departure from the traditional sequential task completion process, signaling VPX’s burgeoning influence globally due to the ongoing advancements and increasing accessibility of virtual production technologies.
Central to the virtual production setup are several key technological components. Unreal Engine 5 (UE5) is pivotal, renowned for its real-time rendering capabilities that allow for the creation of lifelike virtual scenes using advanced technologies like Nanite and Lumen [54]. Aximmetry acts as a bridge, integrating UE5 with the studio’s hardware to manage tracking, compositing, and scene synchronization, ensuring fluid interaction between virtual and physical realms [55]. The Antilatency Tracking System provides precise, low-latency tracking essential for aligning virtual and real-world elements seamlessly [56]. Complementing these is the Blackmagic Pocket Cinema Camera 4K, which captures high-quality footage within the virtual environment. Proper studio lighting is crucial to maintain consistent illumination across both realms, while a green chroma backdrop offers the flexibility to substitute the physical environment with any virtual scene dynamically.
In our small VPX prototype, the chroma effect was achieved with a specialized system termed the “Retroreflective Greenscreen Light Ring”: a distinct gray fabric backdrop that reflects the green light emitted by a ring of LEDs surrounding the camera lens. This configuration produced satisfactory results within a constrained space without the extensive lighting equipment normally needed to illuminate a chroma background, keeping the process cost-efficient. The technological setup comprised a single workstation equipped with 64 GB of RAM and an NVIDIA RTX 4090 graphics card. Input signals were handled by an Elgato Cam Link stick, which converted the Blackmagic camera’s output into a 4K webcam feed, integrated into Unreal Engine as the primary video source for virtual production. This straightforward yet effective setup presents a range of advantages and disadvantages when evaluated within the context of virtual production capabilities.
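The keying principle behind this setup can be sketched in a few lines. The following Python example is a didactic approximation using the classic “green dominance” matte; it is a stand-in for illustration, not the actual algorithm used by Unreal Engine or Aximmetry.

```python
# Minimal green-screen keying sketch: alpha drops where the green channel
# dominates the other channels, so green pixels are replaced by the
# virtual background. Didactic only; real-time keyers are more elaborate.
import numpy as np

def green_screen_composite(fg, bg, threshold=0.2):
    """fg, bg: float arrays of shape (H, W, 3) in [0, 1]. Returns the composite."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Green dominance: how much green exceeds the brighter of red/blue.
    dominance = g - np.maximum(r, b)
    # Alpha is 1 (keep foreground) where dominance <= 0, falling to 0 on pure green.
    alpha = np.clip(1.0 - dominance / threshold, 0.0, 1.0)[..., None]
    return alpha * fg + (1.0 - alpha) * bg
```

In the prototype, the equivalent operation runs per frame on the GPU inside the engine, with the camera feed as `fg` and the rendered Unreal scene as `bg`.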
Among the notable advantages is the system’s cost-effectiveness. Economical components, such as the Elgato Cam Link stick paired with a basic workstation outfitted with a high-performance NVIDIA RTX 4090 graphics card, substantially lower the financial barriers to high-quality virtual production. Moreover, the space efficiency of the Retroreflective Greenscreen Light Ring enables chroma effects in limited spaces without elaborate lighting arrangements, making it particularly suitable for smaller studios or residential settings. Conversely, the setup is not without drawbacks. Operating such a system requires a degree of technical acumen, especially in navigating the intricacies of Unreal Engine and integrating the various components, which may present a learning curve for some users. Additionally, despite the efficiency of the RTX 4090, reliance on a single workstation may constrain a project’s scope, particularly for endeavors demanding higher computational resources or advanced graphical fidelity. Furthermore, while the simplicity of this setup benefits small-scale projects, it may face scalability challenges when expanding to larger, more complex virtual production ventures, necessitating further hardware and software adjustments to meet the increased demands.
In essence, the described virtual production setup strikes a balance between accessibility and performance, offering an entry point into sophisticated virtual production techniques for users across a spectrum of resources and technical proficiency levels. Nonetheless, it is imperative to consider the limitations inherent to this system in the context of the project’s scale and complexity.
The physical setup of the proposed VPX set is methodically outlined, starting with the studio’s arrangement to optimize production quality, including the installation of the chroma backdrop and strategic placement of the camera and lights. Following this, software integration involves setting up Unreal Engine 5 and Aximmetry on the studio’s hardware, with the Antilatency system calibrated to provide real-time tracking data. Scene development within UE5 utilizes its vast asset library and rendering capabilities to craft immersive backdrops for live production. The calibration process ensures the camera and tracking system are finely tuned for real-time alignment of physical and virtual elements, a critical step for achieving seamless integration. With all components in place, the virtual production studio is primed for operation, enabling actors and objects to interact with the virtual environment as their actions are captured by the camera in real time, fostering a cohesive blend of reality and virtuality, as developed from the study’s findings.
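The alignment step described above amounts to composing the pose reported by the tracking system with a calibrated rigid offset between the tracker mount and the camera’s optical center. The sketch below expresses poses as 4×4 homogeneous matrices; the specific offset values are hypothetical examples, not calibration data from our set.

```python
# Sketch of the camera/tracker alignment step: the tracking system reports
# the pose of the tracker mount, and a calibrated rigid offset maps it to
# the camera's optical center, which drives the virtual camera in the engine.
# Poses are 4x4 homogeneous matrices; the numeric values are hypothetical.
import numpy as np

def pose(rotation=None, translation=(0.0, 0.0, 0.0)):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    m = np.eye(4)
    m[:3, :3] = np.eye(3) if rotation is None else rotation
    m[:3, 3] = translation
    return m

def virtual_camera_pose(tracker_pose, tracker_to_camera):
    """Compose the tracked pose with the calibrated tracker->camera offset."""
    return tracker_pose @ tracker_to_camera

# Example: tracker mounted 2 m up; camera optical center 10 cm in front of it.
tracker = pose(translation=(0.0, 2.0, 0.0))
offset = pose(translation=(0.0, 0.0, 0.1))
camera = virtual_camera_pose(tracker, offset)
```

In practice, the offset matrix is measured once during calibration and then applied to every tracking sample, so the virtual camera in UE5 follows the physical camera frame by frame.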
Our experiments confirmed the system’s suitability for a foundational independent studio setup. Nonetheless, the quality and stability of the resulting image could be significantly enhanced by upgrading to a more powerful workstation and by integrating a Blackmagic video capture card to prevent frame drops during operation.

4. Discussion

The analysis of the interview transcripts unveils profound insights into the impact of virtual production on independent studios. Edu Martín emphasizes fluid integration across various production phases, which not only enhances flexibility but also drives innovation within the sector. At MR Factory, Óscar Olarte highlights how technological integration streamlines workflows, boosting creative flexibility and output. Rafael Lozano from El Ranchito emphasizes the critical nature of real-time production and post-production capabilities, while Adrian Corsei at Orca Studios points to the strategic advantages of cost reduction and increased operational efficiency, which bolster creative freedom and streamline production processes.
These experts’ perspectives collectively signal a marked transition towards virtual production within independent circles, illustrating its capacity to boost efficiency, ignite creativity, and shift competitive dynamics in the face of industry titans. The long-term benefits of virtual production appear to significantly surpass the initial barriers of tech adoption and investment, signaling a new era for the indie sector.
The discussion among Spanish indie filmmakers centers on the unanimous recognition of virtual production’s benefits. They highlight how this technology enhances creative control, allowing for the real-time visualization and alteration of complex scenes. Cost efficiency emerges as a key advantage, reducing the need for physical sets and on-location shoots. This shift signifies a broader movement in filmmaking, where technological advancements not only streamline processes but also unlock new creative potentials.
Addressing the technical hurdles, Spanish filmmakers acknowledge the steep learning curve associated with new software and technologies. Yet, they point out inventive solutions, like targeted training and tech partnerships, to overcome these challenges. The shift to virtual production necessitates detailed pre-production planning, ensuring creative visions align with technical possibilities and showcasing a broader industry trend where technology and creativity converge.
Regarding creative opportunities, Spanish indie producers recognize virtual production’s transformative effect on storytelling. The ability to instantly visualize and adjust complex scenes not only simplifies production but also expands the creative toolkit for directors and cinematographers, enabling a more dynamic narrative approach that surpasses traditional physical constraints.
Regarding the industry impact, Spanish indie producers underline virtual production’s effect on filmmaking workflows, role definitions, and the democratization of film production. This technology enables smaller teams to produce visually striking content, challenging the established production norms and opening new avenues for indie filmmakers. This evolution is reshaping the industry, fostering innovation, and promoting inclusivity in film production, both in Spain and globally [59,60,61,62].
The collaborative dynamics within virtual production are underscored by pivotal roles such as the Virtual Production Supervisor and the Virtual Art Department (VAD). These entities play a crucial role in unifying various departments—ranging from physical production teams to art and visual effects departments—thereby ensuring a harmonious and efficient production workflow. The Virtual Production Supervisor, in particular, is instrumental in aligning the collective creative vision, serving as a vital conduit among the different stakeholders in the production process [63].
Highlighting successful collaboration models provides tangible insight into this synergy. For instance, the production of the TV series “The Mandalorian” (2019–present) exemplifies how the integration of LED screens and Unreal Engine catalyzes a symbiotic relationship between technology and creativity [12]. This approach not only enhances the actors’ performances by providing them with real-time, immersive environments but also streamlines the post-production process, showcasing a model of efficiency and innovation in filmmaking.
Moreover, the NYU Tandon School of Engineering accentuates the significance of multi-user collaboration techniques in this realm [64]. Such methodologies enable teams to collaborate in real-time across various locations, fostering a flexible and dynamic creative environment. This adaptability is crucial for modern filmmaking, where real-time decision-making and geographical diversity are increasingly prevalent.

5. Conclusions

This article extensively explores the current state of Virtual Production (VPX), with a specific emphasis on the implementation of real-time rendering game engines. The field still lacks extensive documentation, and most available resources are provided by Epic Games for their Unreal Engine; Epic Games nonetheless remains at the forefront, pushing technological boundaries to revolutionize filmmaking. This review reveals that virtual production can be categorized into three main fields: Visualization, Chroma Virtual Production, and LED Virtual Production.
In the Visualization field, seven subtypes (Pitchviz, Shotviz, Previz, Virtual Scouting, Techviz, Stuntviz, and Postviz) serve specific purposes within the film production process. Chroma Virtual Production is divided into Real-time Chroma and Post-produced Chroma, each contributing to various stages from pre-production to post-production. LED Virtual Production, representing a crucial facet, utilizes LED screens to create realistic, immersive environments, enhancing performances and framing during shooting.
Regardless of subtype or field, all VPX forms must deliver imagery or video content, aiding in planning, guiding in-camera VFX, or contributing to the final post-produced result. Bringing VFX from post-production to pre-production emerges as a crucial aspect of effective virtual production.
VPX has created visually stunning experiences, fostering global collaborations, transcending geographical boundaries, and harnessing diverse talent. It offers benefits such as cost-effectiveness and flexibility, revolutionizing film production pipelines, schedules, and costs. The interviewees emphasized the importance of proper planning and time dedication to VFX development for effective VPX utilization.
In Spain, companies like El Ranchito, Orca Studios, and MR Factory, along with independent filmmakers like Edu Martín, have embraced virtual production, implementing VPX pipelines to enhance the filmmaking process. This has resulted in more efficient workflows, enhanced creative possibilities, and cost savings.
VPX’s evolution has transformed the film industry, enabling real-time creative decisions, efficient shooting in multiple locations, and cost savings. Directors, cast, and crew benefit from real-time results, reduced reliance on physical locations, and adaptability to restrictions. Integrating real and virtual elements enhances the visual experience for audiences and opens new storytelling possibilities.
The future of virtual production holds immense potential; as technologies evolve, further advancements can be expected, pushing the boundaries of visual storytelling. The primary objective of implementing VPX is to enhance storytelling while reducing time and cost on specific projects, and the virtual production ecosystem is rapidly advancing toward becoming the norm in filmmaking. By embracing virtual production techniques, filmmakers can create immersive experiences, transcend geographical boundaries, and foster more creative and collaborative film productions. As the industry evolves, continued collaboration between filmmakers, technologists, and content creators, together with a culture of knowledge exchange and continual learning, will be crucial to fully unlock its transformative power for future creative endeavors.
In conclusion, looking towards the future, the potential of virtual production to further revolutionize the film industry, especially within the Spanish independent sector, is substantial. As technology progresses, virtual production is expected to become increasingly accessible and versatile, offering filmmakers unparalleled creative freedom and operational efficiency. This trend portends a promising future for cinematic storytelling, where the integration of technology and artistic innovation leads to the emergence of groundbreaking cinematic experiences.

Author Contributions

Conceptualization, A.M.; Methodology, A.M.-T.; Software, D.S.J.; Formal analysis, A.M.; Writing—original draft, D.S.J.; Writing—review & editing, F.M., J.E.S. and L.G.; Supervision, A.M.-T. and A.M.; Funding acquisition, L.G. All authors have read and agreed to the published version of the manuscript.

Funding

Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and Generalitat Valenciana (Grant INVEST/2022/324).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Bennett, J.S.; Carter, C.P. Adopting Virtual Production for Animated Filmmaking; Global Science and Technology Forum (GSTF): Singapore, 2014. [Google Scholar]
  2. Priadko; Sirenko, M. Virtual production: A new approach to filmmaking. Bull. Kyiv Natl. Univ. Cult. Arts Ser. Audiov. Arts Prod. 2021, 4, 52–58. [Google Scholar]
  3. Montes-Romero, Á.; Torres-González, A.; Montagnuolo, M.; Capitán, J.; Metta, S.; Negro, F.; Messina, A.; Ollero, A. Director Tools for Autonomous Media Production with a Team of Drones. Appl. Sci. 2020, 10, 1494. [Google Scholar] [CrossRef]
  4. Walmsley, A.P.; Kersten, T.P. The Imperial Cathedral in Königslutter (Germany) as an Immersive Experience in Virtual Reality with Integrated 360° Panoramic Photography. Appl. Sci. 2020, 10, 1517. [Google Scholar] [CrossRef]
  5. Hart, P.; Carter, J.; Portmann, G.; Carter, C. QUT: Learn Virtual Production in Our Animation Courses. 2014. Available online: https://eprints.qut.edu.au/102486/ (accessed on 13 March 2024).
  6. Wu, H.-Y.; Palù, F.; Ranon, R.; Christie, M. Thinking Like a Director: Film Editing Patterns for Virtual Cinematographic Storytelling. ACM Trans. Multimed. Comput. Commun. Appl. 2018, 14, 1–23. [Google Scholar] [CrossRef]
  7. Cgsociety. Available online: https://cgsociety.org/news/article/5006/indie-filmmakers-use-xsens-mocap-and-virtual-production-to-emulate-feature-films (accessed on 19 September 2022).
  8. Lucas, G. Star Wars: Episode I—The Phantom Menace; Twentieth Century Fox: Los Angeles, CA, USA, 1999. [Google Scholar]
  9. Cameron, J. Avatar; Twentieth Century Fox: Los Angeles, CA, USA, 2009. [Google Scholar]
  10. Jackson, P. The Lord of the Rings: The Fellowship of the Ring; New Line Cinema: Burbank, CA, USA, 2001. [Google Scholar]
  11. Zemeckis, R. The Polar Express; Warner Bros.: Burbank, CA, USA, 2004. [Google Scholar]
  12. Favreau, J. The Mandalorian; Walt Disney Studios: Burbank, CA, USA, 2019. [Google Scholar]
  13. Kadner, N. The Virtual Production Field Guide Volume 1. 2019. Available online: https://cdn2.unrealengine.com/vp-field-guide-v1-3-01-f0bce45b6319.pdf (accessed on 20 June 2021).
  14. The Third Floor. The Third Floor Virtual Visualization Series—Pitchvis & Previs. Available online: https://thethirdfloorinc.com/3863/virtual-visualization-series-pitchvis-previs/ (accessed on 7 January 2022).
  15. Chetty, D. Physics-Based Shotviz. Available online: https://www.unrealengine.com/en-US/onlinelearning-courses/physics-based-shotviz?sessionInvalidated=true (accessed on 8 June 2020).
  16. Rokoko. What Is Previs? And Why Is it the Most Important Aspect of Movie and Game Production?|Rokoko. Available online: https://www.rokoko.com/insights/previs-what-is-previsualization (accessed on 2 February 2020).
  17. Unreal Engine. Virtual Scouting. Available online: https://docs.unrealengine.com/4.27/en-US/BuildingWorlds/VRMode/VirtualScouting/ (accessed on 3 June 2021).
  18. Leung, L. Previs, Postvis, Techvis & Animation Roles in VFX, Film, Games|Lucy Leung. 7 February 2021. Available online: https://lucylueng.myblog.arts.ac.uk/2021/02/07/previs-postvis-techvis-animation-roles-in-vfx-film-games/ (accessed on 2 February 2022).
  19. The Third Floor. Techvis. Available online: https://thethirdfloorinc.com/what-we-do/film-tv/techvis/#techvis (accessed on 2 February 2022).
  20. Animation World Network. Animatrik Provides Performance Capture Tech for ‘Suicide Squad’. 31 August 2016. Available online: https://www.awn.com/news/animatrik-provides-performance-capture-tech-suicide-squad-stunt-vis (accessed on 2 February 2022).
  21. Kadner, N. The Virtual Production Field Guide Volume 2; Epic Games: Cary, NC, USA, 2021. [Google Scholar]
  22. Tewari, A.; Fried, O.; Thies, J.; Sitzmann, V.; Lombardi, S.; Sunkavalli, K.; Martin-Brualla, R.; Simon, T.; Saragih, J.; Nießner, M.; et al. State of the Art on Neural Rendering. Comput. Graph. Forum 2020, 39, 701–727. [Google Scholar] [CrossRef]
  23. Dong, A. Technology-driven Virtual Production: The Advantages and New Applications of Game Engines in the Film Industry. Rev. FAMECOS 2022, 29, e43370. [Google Scholar]
  24. Fuqiang, Z.; Yuheng, J.; Kaixin, Y.; Jiakai, Z.; Liao, W.; Haizhao, D.; Yuhui, Z.; Yingliang, Z.; Minye, W.; Lan, X.; et al. Human Performance Modeling and Rendering via Neural Animated Mesh. ACM Trans. Graph. (TOG) 2022, 41, 1–17. [Google Scholar]
  25. Li, L.; Zhu, W.; Hu, H. Multivisual Animation Character 3D Model Design Method Based on VR Technology. Complexity 2021, 2021, 9988803. [Google Scholar] [CrossRef]
  26. Zhang, H.; Zhang, J.; Yin, X.-P.; Zhou, K.; Pan, Z. Cloud-to-end Rendering and Storage Management for Virtual Reality in Experimental Education. Virtual Real. Intell. Hardw. 2020, 2, 368–380. [Google Scholar] [CrossRef]
  27. Spielmann, S.; Helzle, V.; Schuster, A.; Trottnow, J.; Götz, K.; Rohr, P. VPET: Virtual production editing tools. In Proceedings of the ACM SIGGRAPH 2018 Emerging Technologies, Vancouver, BC, Canada, 12–16 August 2018. [Google Scholar]
  28. Fair, J. Virtual Production and the potential impact on regional filmmaking: Where do we go from here? DBS Bus. Rev. 2023, 5, 51–58. [Google Scholar] [CrossRef]
  29. Kavakli, M.; Cremona, C. The Virtual Production Studio Concept – An Emerging Game Changer in Filmmaking. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Christchurch, New Zealand, 12–16 March 2022; pp. 29–37. [Google Scholar]
  30. Blanco, J.; Abad, J.; Martínez, M. Planet 51; Ilion Animation Studios: Las Rozas de Madrid, Spain, 2009. [Google Scholar]
  31. Andrews, M.; Chapman, B.; Purcell, S. Brave; Pixar Animation Studios: Emeryville, CA, USA, 2012. [Google Scholar]
  32. Scanlon, D. Monsters University; Pixar Animation Studios: Emeryville, CA, USA, 2013. [Google Scholar]
  33. Lasseter, J.; Lewis, B. Cars 2; Pixar Animation Studios: Emeryville, CA, USA, 2011. [Google Scholar]
  34. Brown, D. Wonder Park; Paramount Animation: Los Angeles, CA, USA, 2019. [Google Scholar]
  35. Martin, E. Edu Martin LinkedIn Profile. Available online: https://www.linkedin.com/in/edumartin/ (accessed on 5 February 2022).
  36. Martín, E. Interviewee, Entrevista a Eduardo Martín Real-Time CG. Available online: https://upvedues-my.sharepoint.com/:b:/g/personal/anmartes_upv_edu_es/EbZ8_cMlWndItmswy--SZTgBW4uxsvdlmNY5aV3Jh9E3Ng?e=S3XWkL (accessed on 5 February 2024).
  37. de Tavira, J.; Jaén, M.; Muruzabal, A.; Royo, C. Hernán; Amazon: Seattle, WA, USA, 2019. [Google Scholar]
  38. Joy, L.; Nolan, J. Westworld; HBO: New York, NY, USA, 2016. [Google Scholar]
  39. Benioff, D.; Weiss, D.B. Game of Thrones; HBO: New York, NY, USA, 2011–2019. [Google Scholar]
  40. Willimon, B. The First; HULU: Santa Monica, CA, USA, 2018. [Google Scholar]
  41. Caldera, J.R. Super Lopez; La Gran Superproducción: Barcelona, Spain, 2018. [Google Scholar]
  42. Lozano Vida, R. Rafael Lozano Vida IMDb Profile. Available online: https://www.imdb.com/name/nm10663299/?ref_=nv_sr_srsg_0 (accessed on 5 February 2022).
  43. Lozano Vida, R. Rafael Lozano Vida LinkedIn Profile. Available online: https://www.linkedin.com/in/rafael-lozano-vida-078ba711b/ (accessed on 5 February 2022).
  44. Watts, J. Spider-Man: Homecoming; Marvel Studios: Burbank, CA, USA, 2017. [Google Scholar]
  45. Gunn, J. Guardians of the Galaxy Vol. 2; Marvel Studios: Burbank, CA, USA, 2017. [Google Scholar]
  46. Russo, A.; Russo, J. Captain America: Civil War; Marvel Studios: Burbank, CA, USA, 2016. [Google Scholar]
  47. Black, S. Iron Man 3; Marvel Studios: Burbank, CA, USA, 2013. [Google Scholar]
  48. Corsei, A. Adrian Corsei LinkedIn Profile. Available online: https://www.linkedin.com/in/acorsei/ (accessed on 5 February 2022).
  49. Corsei, A. Interview with Adrián Corsei (Orca Studios). Available online: https://upvedues-my.sharepoint.com/:b:/g/personal/anmartes_upv_edu_es/Ed_ou7tC_BVJmQSFUb9dY0EBgL-xXB5yj-7Dw50lWNcV6w?e=I5tFyK (accessed on 5 February 2024).
  50. Herrero, G. The Goya Murders; Tornasol Films: Madrid, Spain, 2019. [Google Scholar]
  51. Pina, Á.; Lobato, E.M. Sky Rojo; Netflix: Los Gatos, CA, USA, 2021. [Google Scholar]
  52. Olarte, Ó. Interview with Óscar Olarte (MR Factory). Available online: https://upvedues-my.sharepoint.com/:b:/g/personal/anmartes_upv_edu_es/Eau9QuuDA85HrtB6H0jAIuYBh3SpM7Eq3BmTFRcRgNvupw?e=bjnuKo (accessed on 5 February 2024).
  53. Olarte, Ó.M. Óscar M. Olarte LinkedIn Profile. Available online: https://www.linkedin.com/in/%C3%B3scar-m-olarte-251868a/ (accessed on 5 February 2022).
  54. ICEX. Situación de la industria audiovisual en España. Available online: https://www.investinspain.org/es/sectores/audiovisual (accessed on 5 February 2022).
  55. Ministerio de Asuntos Económicos y Transformación Digital. Plan de Impulso al Sector Audiovisual (Spain Audiovisual Hub). Available online: https://portal.mineco.gob.es/es-es/TID/hub-audiovisual/Paginas/el-plan.aspx (accessed on 5 February 2022).
  56. Lozano, R. Interview with Rafael Lozano (El Ranchito). Available online: https://upvedues-my.sharepoint.com/:b:/g/personal/anmartes_upv_edu_es/EYqBQNrF7ulIkSNkQrj7D1kBXJK-Bm1iHZD_3jdvZpy1TA?e=I3TDzE (accessed on 5 February 2024).
  57. Unreal Engine. Virtual Production Hub. Available online: https://www.unrealengine.com/en-US/virtual-production (accessed on 28 June 2020).
  58. Mayeda, R. HBO’s Westworld Turns to Unreal Engine for In-Camera Visual Effects. 5 August 2020. Available online: https://www.unrealengine.com/en-US/spotlights/hbo-s-westworld-turns-to-unreal-engine-for-in-camera-visual-effects?sessionInvalidated=true (accessed on 18 October 2020).
  59. Artbreeder. About Artbreeder. 2024. Available online: https://www.artbreeder.com/about (accessed on 11 March 2024).
  60. NVIDIA. GauGAN: Changing Sketches into Photorealistic Masterpieces. 2019. Available online: https://www.nvidia.com/en-us/research/ai-playground/ (accessed on 11 March 2024).
  61. Promethean. Promethean AI: Building the Worlds of Tomorrow. 2024. Available online: https://www.prometheanai.com/ (accessed on 11 March 2024).
  62. RunwayML. Runway: Creativity Unleashed. Available online: https://runwayml.com/ (accessed on 11 March 2024).
  63. Frame.io. Virtual Production Essentials: 10 Things to Know Before You Start Shooting. 2023. Available online: https://blog.frame.io/ (accessed on 11 March 2024).
  64. NYU Tandon School of Engineering. NYU Tandon School of Engineering’s Overview on Virtual Production and Collaboration. 2024. Available online: https://engineering.nyu.edu/ (accessed on 11 March 2024).
Figure 1. Traditional Pipeline vs. VPX Pipeline, based on the diagrams in “The Virtual Production Field Guide, Volume 1”.
Figure 2. Visualization Types. (a) Storyboard (static image); (b) ShotViz (static image); (c) PreViz (moving image sequence); (d) Final movie (moving image sequence).
Figure 3. Chroma Virtual Production Set tested at the lab. (a) Chroma Backdrop; (b) Camera with tracking system; (c) Reference Monitor; (d) Talent with PCap system; (e) Lights; (f) Realtime Compositing.
Table 1. Virtual production benefits.

| Advantages for the Director | Benefits for Production |
|---|---|
| Adapt better/faster iterations | Avoid the limitations of physical locations |
| Give guidelines to the director or the DOP | Benefits in costs and scheduling |
| Freedom from a creative point of view | Quicker shooting |
| Actors are more immersed in context | Multiple locations |
| High-quality indirect lighting | Reduced post-production times and costs |
| More creative possibilities | |
Table 2. Virtual production casuistry for each company.

| Producer | Experience in VPX | Type of VPX Developed | Limitations |
|---|---|---|---|
| MR Factory | More than 5 years | Real-time chroma VPX: the recorded data and media include the original greenscreen plate, the matte from the Ultimatte keyer, the CG backdrop, an FBX file with the camera tracking data, and a composite version of the shot | Needs some type of post-processing (real-time or traditional post-production) |
| Orca Studios | 4 years | LED wall VPX: focused on the film market, offering vertical services | At some shooting angles, moiré patterns can appear on the LED walls |
| El Ranchito | More than 5 years | Assets and background creation: the camera can be controlled to obtain the desired shot, as in the photo mode of a video game; their approach focuses on tool development and the creation of CGI assets | Traditional 3D asset creation |
| Koru Media Labs | 1 year | Unreal Engine works as a hub where all artists can connect at some point | New pipelines and software applications need to be learned by CGI artists |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Silva Jasaui, D.; Martí-Testón, A.; Muñoz, A.; Moriniello, F.; Solanes, J.E.; Gracia, L. Virtual Production: Real-Time Rendering Pipelines for Indie Studios and the Potential in Different Scenarios. Appl. Sci. 2024, 14, 2530. https://0-doi-org.brum.beds.ac.uk/10.3390/app14062530


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
