THE REAL-TIME EDITING SYSTEM
FOR
VIRTUAL PRODUCTION AND ANIMATED INTERACTIVE PROJECT
IN REAL-TIME RENDERING GAME ENGINE
By
Wenzhao Li
Master of Fine Arts
Interactive Media & Games Division
School of Cinematic Arts
University of Southern California
May 2019
Acknowledgements
System Engineering team: Wenzhao Li, Yubing Xie, Rong Deng, and Zijian Chen.
Thanks to my thesis Artistic Committee: Sam Roberts, John Brennan, and Mike Zyda;
to Jeff Watson, Carl Schnurr, and Jane Pinckard as my thesis class instructors;
and to Dennis Wixon (Chair and Associate Professor), Tracy Fullerton (Professor), and Jeff Watson
(Assistant Professor) as my official thesis committee.
Table of Contents
Acknowledgements
Table of Contents
List of Figures
Abstract
Introduction
Conceptualize the Real-time Editing System
The Unity Timeline System Analysis
Design The Real-time Editing System
Scene Management Flow Chart
Variation On Animation Clip
Playable Director Manager
Playable Director Manager’s Controller
Variation On Camera Coverage
The Cinemachine Brain
Virtual Camera
Virtual Camera Manager
Virtual Camera Manager’s Controller
The Essence Of Interactions
Standpoint
Gaze Tracking Input With Point Of Interest
Gaze Focus Weight Calculation
The Spontaneous Feedback
Alert System
Camera Offset System
Breaking The Fourth Wall
Conclusions And Future Plan
References
List of Figures
Fig. 1. An overview of the Real-time editing system structure.
Fig. 2. The flow chart of the user interactions and experiences.
Fig. 3. Visual Breakdown of the Playable-Directors Manager in Unity Inspector.
Fig. 4. Visual Breakdown of the Playable-Directors Manager’s controller in Unity Inspector.
Fig. 5. Multi-Virtual Camera system in Unity Scene.
Fig. 6. Visual Breakdown of the Virtual Camera Manager in Unity Inspector.
Fig. 7. The Cinemachine virtual camera timeline: how the Virtual Camera Manager gets the
pacing for editing and determines the current render camera in the multi-virtual-camera
system.
Fig. 8. Visual Breakdown of the Virtual Camera Manager’s Controller in Unity Inspector.
Fig. 9. Visual Breakdown of the Gaze Weight Ratio calculation script in Unity Inspector.
Fig. 10. Keyframe for the story part of Project Heliosphere; this space station scene takes
place in the first half of the story.
Abstract
Every film can be identified as an organic system when it gets to a completion state where every
shot and every scene are bound together to fit a story context. Yet the context itself varies from
person to person when they encounter the original screenplay. In most cases, the final
interpretation falls into the hands of the film editor.
In traditional motion picture production, to capture a variety of actor performances, each shot
will mostly end up with multiple takes. Actors make multiple attempts during the production
phase with performance variations from time to time. Subtle changes in each performance
carry different meanings in different contexts.
This leaves the film editor to compose the film with the ocean of dailies and their best
interpretation. However, a significant amount of performance and emotion is abandoned
during post-production. Boundless possibilities remain hidden behind the scenes as untold stories.
The realm of Interactive media provides a new way of storytelling that preserves a great number
of performances as a pool and can dynamically tailor the audio-visual experience at different
times to different audiences.
Project Heliosphere, my interactive master’s thesis artwork, explores a new frontier by
composing its story timeline based on audiences’ real-time inputs, much like a storyteller who adapts
their speech to suit their audience’s interests each time.
Introduction
Project Heliosphere is intended to deliver the cinematic experience with a newly designed
Interruption-Free interaction system. This thesis project aims to achieve real-time rendered
photorealistic quality graphics in combination with procedurally determined motion picture
editing, all running in the Unity engine, to achieve a dynamic story flow for a non-repetitive
audio-visual experience.
This procedurally determined editing system, or the Real-Time Editing System, allows an
animated film to preserve every animation clip recorded during the virtual production phase via a
multi-virtual-camera system, and then edits the film based on audience interaction.
The fundamental interaction is the audience’s Gaze inputs. These inputs determine editing
outcomes through a heat-map based Gaze weight calculation system in order to tailor the
experience, either by varying the camera coverage (Virtual Camera settings) or by varying
the actor performance (Animation Clip). We also added a physical controller as a potential
interaction device, for audiences who seek more active interaction and feedback to enhance the
emotional experience.
This thesis project also has a mission to explore an artist-friendly and affordable virtual
production pipeline for animated films. Most virtual productions currently on the market were
achieved with big studios’ in-house solutions. Project Heliosphere tries to integrate
off-the-shelf solutions and school equipment to recreate an artist-friendly pipeline for people
trying to use performance capture and dynamic editing in their game or film.
Conceptualize The Real-time Editing System
To make an animated film render and edit itself in real time, it was necessary to split the
project into two parts: the system and the story. This paper focuses on the system part of the
thesis project, the Real-Time Editing system, and specifically on its concept, prototyping,
engineering, and finally its integration with the test scene. The Real-Time Editing system has
proven to be a standalone tool capable of fitting any story developed on the foundation of the
Unity engine.
The concept of Project Heliosphere emerged from my great interest in in-game cinematics, in
combination with the Physical Production experience I gained during my academic life at the
University of Southern California, School of Cinematic Arts. When it comes to in-game
cinematics, many of the ideas here were borrowed directly from Physical Production, which,
objectively speaking, is a more mature discipline, having developed over many decades.
Interactive media software, whether Unity, Unreal, or Maya, uses its own form of timeline or
sequencer system as an editing tool similar to motion picture editing software, in which
performance data, animations, sound, environment, lighting, and particle effects are packed
into clips that can easily be trimmed and spliced. Considering the complexity of the topic, my
working period, and the team’s skill set, the interactive engine I picked for this master’s thesis
is the Unity engine.
The initial thinking was to recompose the video clips through code, based on user-input
calculation results, before the progress bar hits each target frame, with enough lead time to
make a seamless edit. Since all assets are in digital format and Unity 3D is a real-time render
engine, what we tried was to hack into the Unity timeline system source code and change the
arrangement of these digital clips at runtime to achieve the editing behavior. We quickly
built a digital scene with a few simple-shape meshes and assigned each one a unique
animation pattern; this animation data was packed into animation clips by the Unity Timeline
system, allowing us to easily edit a short animation as a testing ground.
The Unity Timeline System Analysis
The Unity Timeline system is the tool interactive media developers use to compose cut-scenes,
cinematics, or gameplay sequences. We started to tackle the Unity Timeline source code at the
beginning of the Fall 2018 semester, trying to emulate the film editing logic and actions of a film
editor through code running in the background. However, our early experiments showed that
changes made to the timeline sequence are not reflected at runtime. After digging deep into the
Unity source code, we found that the operational workflow of the Unity Timeline system is
constructed from three major components: the Playable-Asset, the Playable-Graph, and the
Playable-Director. Simply put, the Playable-Asset, the collection of tracks in a timeline
sequence, needs to be transformed into a Playable-Graph and then passed to the
Playable-Director to direct the digital scene.
In Unity, a single scene can support multiple timeline sequences (Playable-Assets) and
multiple Playable-Directors. At the beginning of each game session, the Unity Timeline system
transfers these Playable-Assets into Playable-Graphs. Changes made to a Playable-Asset have no
immediate impact during the game because the Playable-Graphs have already been generated
prior to the change. This explains why, in our early tests, changes made to the Cinemachine
timeline through code were not reflected in gameplay until the next game session.
Moreover, although the front-end user interface of the Unity Timeline system has been
mapped almost identically to traditional editing tools to make it easily approachable, what
happens on the back end, or under the hood, is completely different. The timeline system in
conventional motion picture software, for both live action and animation, mostly includes only
two types of tracks, the Video Track and the Audio Track. The Video Track holds pixel
information containing character performances, camera work, and lighting. The Audio Track
contains the voice, music, and sound effects. A significant amount of artistic decisions are made
on the stage during the production phase, leaving the film editor to figure out the pacing and
context in post-production.
When editing a film, you edit rendered footage; when editing in the timeline system of an
interactive project or digital animation, you edit before the render. This grants the editor more
versatile control over camera coverage, camera movement, and rendering material properties, as
well as lighting choices. In Unity, each individual object in the digital scene controlled by a
Unity timeline sequence needs its own track. If a clip covers a conversation between two
characters, then the interactive timeline sequence needs two tracks instead of one. This creates
huge obstacles, as we have to manage synchronization between many subjects even if we only
want to change a small clip for a single object. In this case the video track has been divided into
an Animation Track and a Camera Track, and a considerable amount of design choices need to
be made in the post-production phase.
Design The Real-time Editing System
Building on the valuable information and experience we gained from the initial attempt, I decided
to make a few adjustments to the overall design. The system now edits the Playable-Graph
sequences instead of editing the animation clips and camera clips inside the Playable-Asset. A
linear story is divided into a linkage of multiple scenes, and each scene preserves all variations of
the performance. On the implementation level, we prepared multiple Playable-Directors for each
scene in the linkage, each with its own pool of candidate Playable-Assets. Building a script to
manipulate these Playable-Directors was the very first step of our implementation.
This Real-Time Editing (RTE) system is built on the foundation of the virtual production
pipeline and game development technologies: all actor performances, their voice and sound,
and the lighting decisions are recorded into a candidate pool, and the RTE system can then
compose the film from this pool based on the audience's interaction through gaze tracking input
or physical controller input.
In most cases, the Real-Time Editing (RTE) system swaps the Animation Clip and the
current render camera to create a dynamic experience. The RTE system is also capable of
rearranging the structure of the storyline to create a more powerful emotional impact.
The editing behaviors are determined based on the inputs provided by the audience to offer a
customized experience. For instance, if the system judges that the viewer is confused or has
missed important story pieces, the game will try to reintroduce the concept through a
conversation or flashback. If the audience appears to be more interested in a specific character,
the game will make an additional character reveal. Though on the engineering level the system
can support branching narrative with its current features, the design of the Real-time editing
system in Project Heliosphere is still mainly focused on a linear story structure.
Scene Management Flow Chart
Figure 1. Overview of the real-time editing system structure.
To summarize the flow of the Real-Time Editing System: we split the unified story timeline
into many segments of timeline sequences, turning a single unified Playable-Graph into many
precomposed shorter Playable-Graphs. All user inputs collected in the current scene are
calculated and analyzed by the behavior controllers’ manager to determine the performance in
the subsequent scene.

The basic components of a single scene, also called a Node, are the digital scene environment
itself with one Playable-Director and all its candidate performances. An audience behavior
manager, the Gaze Focus Weight Calculation script, the Playable-Director Manager’s controller,
and the Virtual Camera Manager’s controller are responsible for the calculation and editing
activities. Finally, a Playable-Director directs the performance.
Figure 2. The flow chart of the user interactions and experiences.
With a chain of nodes, each containing its Playable-Director and candidate performances, a
complete story can edit itself. Input gathered from the audience is calculated by the Gaze
Focus Weight Calculation script before the story reaches each edit point, enabling a seamless cut.
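A minimal sketch of what one node in this chain might look like (assumed Unity C#; the class and field names are hypothetical, chosen to mirror the flow chart rather than the project's actual code):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// One scene "Node": a Playable-Director plus its pool of candidate performances.
// The controllers choose a candidate before the story reaches the edit point.
[System.Serializable]
public class SceneNode
{
    public PlayableDirector director;                 // directs this scene
    public List<TimelineAsset> candidatePerformances; // all authored variations
    public int selectedIndex;                          // chosen by the controllers

    // Called at the edit point: bind the chosen candidate and play it.
    public void PlaySelected()
    {
        director.playableAsset = candidatePerformances[selectedIndex];
        director.Play();
    }
}
```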
Variation On Animation Clip
These digital resources are packed as Playable-Assets and updated every frame on screen in
the game engine. By rearranging Playable-Assets in the Unity Timeline system, a variation of
animation can be achieved.
In project Heliosphere, animation editing behavior has been classified into 4 categories.
● The variation in Character Performance
● The variation in Length of the Animation (Extended Cut)
● The variation in Performance Perspective (the way the animation delivers the story)
● The variation in Long-Term Outcome (multiple variables stored in the database
determine the final, long-term outcome of the story).
Playable Director Manager
Figure 3. Visual Breakdown of the Playable-Directors Manager in Unity Inspector.
To overcome the obstacles we encountered, the alternative strategy is to let the Real-Time
editing system assemble multiple timeline Playable-Assets in a sequence, each made of fewer
clips yet already synchronized. We constructed a Playable-Directors Manager (PDM) to compose
short timeline assets (playable-graphs). By splitting a long sequence into multiple shorter
timeline Playable-Assets for the Playable-Directors, a single Playable-Graph can be transformed
into many shorter graphs, and the dynamic linkage of these graphs based on user input allows us
to achieve real-time editing behavior in the Unity engine.
The Unity 3D engine uses the Playable-Director to direct a pre-authored timeline playable asset.
An individual Playable-Director is capable of directing multiple timeline playable assets. The
Playable-Directors Manager (PDM) is the structure that links Playable-Directors and manages
their Playable-Asset options.
The Playable-Directors Manager (PDM) holds all Playable-Directors and their potential
timeline-asset candidates. It links each director's performance, via its indicated index number,
into a complete sequence. By authoring many performance clips as timeline candidates during
the production phase, the Playable-Directors Manager (PDM) gains the full capacity to manage
editing behaviors.
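A sketch of how such a manager could be structured (assumed Unity C#; the names are illustrative and the real PDM shown in Figure 3 certainly differs in detail):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// Sketch of a Playable-Directors Manager: each director owns a pool of short,
// pre-synchronized timeline assets; the manager links the selected candidates
// into a complete sequence.
public class PlayableDirectorsManager : MonoBehaviour
{
    [System.Serializable]
    public class DirectorSlot
    {
        public PlayableDirector director;
        public List<TimelineAsset> candidates; // authored performance variations
        public int selectedIndex;              // set by the PDM's Controller
    }

    public List<DirectorSlot> slots = new List<DirectorSlot>();

    // Bind the chosen candidate for the slot that plays next in the linkage.
    public void PrepareSlot(int slotIndex)
    {
        var slot = slots[slotIndex];
        slot.director.playableAsset = slot.candidates[slot.selectedIndex];
    }

    // Start that slot's director; in the real system this happens at the edit point.
    public void PlaySlot(int slotIndex)
    {
        slots[slotIndex].director.Play();
    }
}
```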
Playable Director Manager’s Controller
Figure 4. Visual Breakdown of the Playable-Directors Manager’s controller in Unity Inspector.
The Playable-Directors Manager’s Controller directly controls the Playable-Directors Manager
(PDM). The PDM’s Controller gets and sets the adjustable variables in the Playable-Directors
Manager, in this case the number of Playable-Directors, the index of each Playable-Director,
and their selected performance candidates. Changes made to these variables during the run
allow the system to switch the performances of the subsequent scene before the story
progression reaches the edit point.
The Playable-Directors Manager’s Controller is the core that transforms audience interaction
into editing behavior at the animation-variation level. The PDM’s Controller holds judging
criteria for all of the different enum states discussed earlier. These criteria are directly
associated with dedicated functions, which constantly monitor audience interaction from the
Gaze Focus Ratio script during a scene to see whether the judging criteria of that state have
been fulfilled. On a positive return, the PDM’s Controller picks the alternative performance
instead of the one used in the default reference cut; these alternative enum states in each node
are assigned by the design team through the Unity Inspector.
Variation On Camera Coverage
In addition to the variation of animations, the choices of camera usage, camera properties,
camera movements, and lighting also play a huge role in the art of motion picture
storytelling. This set of artistic choices belongs to the Director of Photography in physical
production. Therefore, on top of animation variations, the Real-Time Editing system needs to
gain control of the camera coverage to emulate the Director of Photography's job as well.
In interactive media and digital animation, we set up a multi-camera system to cover all the
blocking, supported by Cinemachine and its Virtual Cameras. The switch between Virtual
Cameras emulates the cut in conventional post-production.

To guarantee the fluency and pacing of the story, Project Heliosphere uses a pre-authored cut
as guidance: the director of the story provides a default cut, and this pre-authored cut provides
a pacing reference for the Real-time Editing System and guarantees that the film will play as a
complete piece even without the audience's interaction.
The camera editing behavior has been classified into 5 categories.
● Insert on a prop
● Dolly in and out
● Switch between wide angle, medium shot or close up
● The order of appearance on objects
● Rack focus on Gaze
The Cinemachine Brain
Cinemachine in Unity 3D works with the Timeline system to allow the developer to create a
cinematic experience. The Cinemachine track in the Unity timeline specifically manages the
virtual camera operations; this is reflected as the Virtual Cameras' clips in the Playable-Asset.
Cinemachine also gets the start time and duration of each virtual camera clip and can selectively
overwrite camera properties, including the Solo state.
The Solo state, a boolean toggle in the virtual camera's properties built into the Cinemachine
system, gives every virtual camera the power to overwrite the current render behavior of the
current Playable-Graph. As its name suggests, the Solo state allows only one virtual camera to
become the current render camera at any time during the game. Should the Solo state be
disabled, Cinemachine hands control back to the Playable-Graph generated from the
Cinemachine timeline track.

By constructing a script directly in control of the Cinemachine track and the virtual cameras'
Solo state, we can realize the editing behaviors for the variation of the camera's job.
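A minimal sketch of that kind of override (assumed Unity C# with Cinemachine 2.x; field names are illustrative, not the project's actual script):

```csharp
using UnityEngine;
using Cinemachine;

// Sketch: force one virtual camera to render by making it the "solo" camera,
// then release it back to the timeline-driven Cinemachine track.
public class SoloOverrideExample : MonoBehaviour
{
    public CinemachineVirtualCamera overrideCamera;

    // While a solo camera is set, the Cinemachine Brain ignores the
    // blend/priority logic coming from the Playable-Graph.
    public void EnableOverride()
    {
        CinemachineBrain.SoloCamera = overrideCamera;
    }

    // Hand control back to the Playable-Graph generated from the Cinemachine track.
    public void DisableOverride()
    {
        CinemachineBrain.SoloCamera = null;
    }
}
```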
Virtual Camera
Figure 5. Multi-Virtual Camera system in Unity Scene.
The Real-Time editing system in Project Heliosphere is heavily dependent on the Virtual
Camera. Virtual Cameras in a Unity project are basically game objects that serve as proxies for
the main render camera, which minimizes rendering latency and makes the scene easy to manage.
A Virtual Camera contains the same camera property variables as the main render camera,
including Transform.Position, Transform.Rotation, and the physical camera properties. When a
virtual camera clip is in the active state on the Cinemachine track, the render camera takes its
position and is overwritten with the virtual camera's properties to fit the needs of the
storytelling. When a Virtual Camera is installed into a timeline sequence, the system also knows
the start time and duration of each clip, i.e., the pacing.
In the real-time editing system we wrote a Cinemachine Virtual Camera Manager (VCM) on top
of the Unity Cinemachine Brain to operate cameras and manipulate the camera-editing behavior
(camera coverage) during runtime.
Virtual Camera Manager
Figure 6. Visual Breakdown of the Virtual Camera Manager in Unity Inspector.
The design that makes it possible to edit the coverage in real time is a Cinemachine Virtual
Camera Manager (VCM) built on top of the existing Cinemachine Brain. The manager collects,
records, and calculates the weight of user inputs and determines when and which Virtual
Camera to use to overwrite the original camera job.
Figure 7. The Cinemachine virtual camera timeline: how the Virtual Camera Manager gets the pacing for editing
and determines the current render camera in the multi-virtual-camera system.
This Virtual Camera Manager is capable of storing and monitoring all Virtual Cameras assigned
to it in the Inspector and changing their properties. The VCM is also built with functions to
operate those Virtual Cameras: it can switch the current virtual camera coverage to wide,
medium, close-up, or a specific insert, and it is smart enough to pick the target object based on
the previous camera pick.

An overwrite function, with a toggle in the Inspector and the index of the currently selected
Virtual Camera, gives the VCM the jurisdiction to overwrite the camera at any moment. Every
Virtual Camera has its own weight variable, and the VCM picks the dominant one to replace the
default Cinemachine track when the function is set to active.
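A sketch of that weight-based pick (assumed Unity C# with Cinemachine 2.x; the weight list and the overwrite toggle are illustrative stand-ins for the Inspector fields in Figure 6):

```csharp
using System.Collections.Generic;
using UnityEngine;
using Cinemachine;

// Sketch of a Virtual Camera Manager: each virtual camera carries a weight, and
// when the overwrite toggle is active, the dominant camera is soloed so it
// replaces the default Cinemachine track.
public class VirtualCameraManagerSketch : MonoBehaviour
{
    [System.Serializable]
    public class WeightedVcam
    {
        public CinemachineVirtualCamera vcam;
        public float weight; // raised by gaze focus, decayed over time, etc.
    }

    public List<WeightedVcam> cameras = new List<WeightedVcam>();
    public bool overwriteActive; // toggle exposed in the Inspector

    void Update()
    {
        if (!overwriteActive || cameras.Count == 0)
        {
            CinemachineBrain.SoloCamera = null; // fall back to the timeline track
            return;
        }

        // Pick the dominant camera and force it to render.
        WeightedVcam best = cameras[0];
        foreach (var c in cameras)
            if (c.weight > best.weight) best = c;

        CinemachineBrain.SoloCamera = best.vcam;
    }
}
```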
Virtual Camera Manager’s Controller
Figure 8. Visual Breakdown of the Virtual Camera Manager’s Controller in Unity Inspector.
The VCM Controller directly controls the Virtual Camera Manager (VCM). It gets and sets the
adjustable variables in the VCM, in this case the length of the VCM list, the index of each
Virtual Camera, and their camera properties. Changes made to these variables during the run
allow the VCM Controller to switch and swap the Virtual Camera usage in the subsequent
scene.
The VCM Controller is the core that transforms the audience's gazing behavior into editing
behavior at the coverage-decision level. The VCM Controller holds judging criteria for all of the
different enum states discussed earlier; these states are directly associated with dedicated
functions, each of which constantly monitors audience interaction during a scene to see whether
the judging criteria of that state have been fulfilled. On a positive return, the controller picks the
alternative performance candidate instead of the one used in the default reference cut.

The project story sequence is composed of multiple scenes. The design team assigns enum states
to each node through the Unity Inspector during the post-production phase.

The PDM's Controller also communicates with the Gaze Weight Calculation script, mostly for
the audience's attention information, in a different way from the VCM Controller. It interprets
whether the audience is in attention deficit or is interested in a specific topic in the current
scene.
The Essence Of Interactions
Standpoint
Influenced by contemporary story-driven games like Telltale's The Walking Dead and
Life is Strange, I was also trying to implement a dialog system in the project, out of a personal
passion to figure out how such a system is built and how it works for a story-driven project, as
well as to provide an additional, more proactive, way for people to invest in and interact with
the story.
Coming up with an idea like this, I also kept in mind that it needed to fit with the Real-time
Editing system I was working on, which uses a non-branching story structure.
Project Heliosphere was already planned as passive interaction media, so I tried to shape the
dialog system to fit that theme. When I was conceptualizing the idea, I was influenced by
BioWare's Mass Effect and Dragon Age, whose designers explained why the text displayed on
screen for dialogue options is so different from the actual line spoken by the protagonist: in most
cases the human mind carries an expectation before turning a conversation into action, but the
actual line we speak is never quite the same as the one we kept in our mind. It is much like an
interview, where the preparations we make are mostly unlikely to play out during the interview
session.
This clue transformed the dialogue system into a Standpoint system, where the audience makes
choices about what they think about the story taking place instead of choosing a line of dialogue
for the characters to say. Such a design maintains the passive interaction nature and reduces
complexity.

The Standpoint system displays options on the screen that are closely tied to the subconscious
emotions of the characters. The options include an attitude toward the previous scene and an
expectation toward the subsequent scene; every option is linked to a boolean variable planted
inside the dialogue database, set to false by default.
Like every other element in the Real-time Editing system, the Standpoint system and its options
are not mandatory for the game experience to proceed.
Gaze Tracking Input With Point Of Interest
For every frame in every scene, there is a point of interest for the audience; the place they are
currently looking at during the experience represents the protagonist's attention at that moment.
This point of interest can be monitored through a raycast-based or vector-based system and
external hardware such as a Kinect, PlayStation Camera, Tobii eye-tracking camera, or even a
smartphone mono camera. In this thesis project we use the Tobii eye-tracking camera and Unity
raycasting as our solution for maximum accuracy.
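A minimal sketch of the raycast half of this setup (assumed Unity C#; GetGazeScreenPoint() is a hypothetical placeholder for the eye-tracker input, and objects of interest are assumed to carry Colliders):

```csharp
using UnityEngine;

// Sketch: project a ray from the on-screen gaze point into the scene and report
// which object of interest it hits.
public class GazeRaycastSketch : MonoBehaviour
{
    public Camera renderCamera;

    Vector2 GetGazeScreenPoint()
    {
        // Placeholder: substitute the gaze point provided by the eye tracker here.
        return new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);
    }

    void Update()
    {
        Ray ray = renderCamera.ScreenPointToRay(GetGazeScreenPoint());
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // hit.collider.gameObject is the current point of interest.
            Debug.Log("Gazing at: " + hit.collider.gameObject.name);
        }
    }
}
```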
Gaze Focus Weight Calculation
Figure 9. Visual Breakdown of the Gaze Weight Ratio calculation script in Unity Inspector.
The Tobii eye-tracking API was integrated into the project as our gaze tracking solution. The
Tobii API is versatile enough for us to customize its functions to our needs; in this case, the
Gaze Weight Ratio script records the audience's gazing behavior and converts it into usable data
that determines the film editing behavior.
Under the hood, the Tobii API uses raycasting as the mechanism for its gaze tracking system.
The audience's gazing behavior is converted into raycast actions, which project a ray from the
gaze spot on the picture plane toward infinity. The system constantly listens for the raycast
colliding with a game object that has a Collider and a Gaze Aware script attached, and returns
that object's information on a positive result.
The Gaze Weight Ratio script has timers and algorithms built in to calculate the on-gaze focus
weight between and within the categories of characters, props, and the environment. This
effectively allows the system to know the audience's point of interest from moment to moment.
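A sketch of the weight bookkeeping (assumed Unity C#; the categories, accumulation, and normalization here are illustrative, not the actual Gaze Weight Ratio algorithm shown in Figure 9):

```csharp
using UnityEngine;

// Sketch: accumulate on-gaze time per category and expose normalized focus
// weights that a controller can compare against its judging criteria.
public class GazeFocusWeightSketch : MonoBehaviour
{
    public enum Category { Character, Prop, Environment }

    float[] gazeTime = new float[3];

    // Called each frame with the category of the object currently under the gaze.
    public void ReportGaze(Category category)
    {
        gazeTime[(int)category] += Time.deltaTime;
    }

    // Fraction of total gaze time spent on a category within this scene.
    public float GetFocusWeight(Category category)
    {
        float total = gazeTime[0] + gazeTime[1] + gazeTime[2];
        return total > 0f ? gazeTime[(int)category] / total : 0f;
    }

    // Reset at each edit point / scene change.
    public void ResetWeights()
    {
        gazeTime = new float[3];
    }
}
```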
The Spontaneous Feedback
As an interactive media project, one specific piece of feedback I received more than any other is
that audiences tend to want to see the interactions they are making during the experience, and
they would like to see them as an instant form of feedback.
Inspired by story-driven games on the market like Telltale's The Walking Dead series and
Detroit: Become Human, which allow the audience to navigate the screen with controller
input and always show brief notes as text on the user interface, I decided to make my own
version customized to fit the Real-time editing system. An alert system tells the audience when
they have triggered an alternative editing behavior different from the default version of the cut
and summarizes the narrative through a string of text on the user interface. A camera offset
system allows the user to gain limited control of the camera movement through head rotation,
gaze contact, and physical controller input.
Alert System
The Alert system is tightly connected to the Timeline linkage manager's Controller and the
Virtual Camera Manager's Controller, the two types of controllers directly responsible for the
selection of new Playable-Asset candidates or the property changes made on the render camera.
The Alert system constantly listens to these two managers; once an editing behavior has been
made, the Alert system reflects the change on screen with text notes. All notes were written in
advance during the post-production phase and stored in a SerializeField variable tagged to the
responsible Playable-Director.
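A compact sketch of how such a note could be surfaced (assumed Unity C#; the UI text field and timing are illustrative, and in the actual project the notes live in serialized fields tagged to each Playable-Director):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: the controllers call ShowAlert() whenever they commit an editing
// behavior; the alert system displays the pre-authored note for a short time.
public class AlertSystemSketch : MonoBehaviour
{
    public Text alertText;            // UI element in the corner of the screen
    public float displaySeconds = 3f;

    float hideAt;

    public void ShowAlert(string note)
    {
        alertText.text = note;
        alertText.enabled = true;
        hideAt = Time.time + displaySeconds;
    }

    void Update()
    {
        if (alertText.enabled && Time.time >= hideAt)
            alertText.enabled = false;
    }
}
```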
Camera Offset System
The camera offset system is designed to give some unique shots a certain mood, i.e., to give the
audience a chance to explore more about a character, and also to let us place hidden information
outside the default framing of the picture plane so the audience can reveal pieces of information
at will. We hope such a design feature can bind the audience closer to the story's characters.
The Tobii eye-tracking camera's infrared lenses trace the audience's head rotation and gaze
movement, and this information can be used to constrain the Virtual Camera's rotation variables.
We constructed a custom algorithm that transforms this data to drive the render camera: when
the user rotates their head, or when their gaze contact point falls close to the edges of the screen,
the render camera's rotation changes in correspondence with these actions.
The audience may also use the controller joystick to achieve the same effect as an alternative
form of input. Should the player favor physical controller interaction, the physical controller
overrides the depth camera inputs to make sure no additive effect disrupts the camera
movement.
The camera offset system uses two scripts. One is tied to every Virtual Camera in the scene to
gain direct control of their rotation properties; this script talks to the Tobii eye-tracking camera
through its development API to acquire the head rotation, which is constantly monitored by the
depth cameras.

The second script communicates with the Virtual Camera parent group node; it constantly
monitors the input from the controller joystick and converts the input from Vector2 values into
rotation information for the Virtual Camera parent group node.
To avoid the additive effect, only one script is allowed to function at any given time; based on
playtest feedback, we decided to give the Controller script the higher weight. Once an input
value is detected, the controller script immediately takes over and suspends the Gaze script.

The Gaze script, the one with the lower weight, regains control when no input has been detected
from the controller for a period of time. The length of this period is stored in a float variable in
the script and can be customized by the developer based on playtest results.
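A sketch of that arbitration (assumed Unity C# using the legacy input axes; the axis names, timeout value, and the two script references are illustrative):

```csharp
using UnityEngine;

// Sketch: controller input immediately takes over the camera offset; the gaze
// script regains control after the controller has been idle for a set time.
public class OffsetInputArbiterSketch : MonoBehaviour
{
    public MonoBehaviour gazeOffsetScript;       // lower-weight input source
    public MonoBehaviour controllerOffsetScript; // higher-weight input source
    public float regainDelay = 2f;               // tunable float, per playtests

    float lastControllerInputTime = -999f;

    void Update()
    {
        // Any joystick deflection counts as controller input.
        Vector2 stick = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
        if (stick.sqrMagnitude > 0.01f)
            lastControllerInputTime = Time.time;

        bool controllerActive = Time.time - lastControllerInputTime < regainDelay;
        controllerOffsetScript.enabled = controllerActive;
        gazeOffsetScript.enabled = !controllerActive; // suspended while controller is active
    }
}
```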
In addition, a user interface has been designed to indicate the state of the current system: the
gaze state, the controller state, the transition state, and its progression in between. Each state has
its own indicator sprite, shown in the bottom right corner of the screen.
Breaking The Fourth Wall
Considering the nature of this project, in which almost the entire interactive system makes heavy
use of the Unity timeline, I added controller vibration feedback as vibration clips that can be
edited on the Unity timeline. This additional feature takes advantage of the timeline system to
aid the immersive experience I am going for.
We wrote it as a timeline event clip function that allows us to insert and edit the clip on the
timeline; the variables in the function include the direction and intensity, so we can author
vibration events to match emotionally intense moments in the story.
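A sketch of such a clip (assumed Unity C# using the Playables API and the Input System package's gamepad rumble; the thesis mentions direction and intensity fields, so the two motor intensities here stand in for those, and all names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.InputSystem; // assumes the Input System package is installed

// Hypothetical "vibration clip": intensities are authored per clip on the timeline
// and applied while the clip is active.
public class VibrationClip : PlayableAsset
{
    [Range(0f, 1f)] public float lowFrequencyIntensity = 0.5f;
    [Range(0f, 1f)] public float highFrequencyIntensity = 0.5f;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<VibrationBehaviour>.Create(graph);
        var behaviour = playable.GetBehaviour();
        behaviour.low = lowFrequencyIntensity;
        behaviour.high = highFrequencyIntensity;
        return playable;
    }
}

public class VibrationBehaviour : PlayableBehaviour
{
    public float low;
    public float high;

    public override void OnBehaviourPlay(Playable playable, FrameData info)
    {
        // Start rumble when the clip becomes active on the timeline.
        Gamepad.current?.SetMotorSpeeds(low, high);
    }

    public override void OnBehaviourPause(Playable playable, FrameData info)
    {
        // Stop rumble when the clip ends or the timeline pauses.
        Gamepad.current?.SetMotorSpeeds(0f, 0f);
    }
}
```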
Conclusions And Future Plan
The Real-time editing system in Project Heliosphere is an experimental prototype that
conserves performances and camera setups and then edits the story in real time based on
audience inputs, mostly in the form of eye tracking. It is an exploration that takes advantage of a
real-time rendered game engine in the interdisciplinary field of virtual production, allowing the
system to conserve a variety of performances during production and deliver dynamic story
experiences to the audience.
We ran a few tests of the system with our Unity test-ground scene to make sure all the functions
were properly constructed, easy to use, and responsive; the testers did show great interest in the
system and thought most of the interactions made sense. However, it was not possible to find an
existing virtual production film to integrate as story material for further tests before the thesis
deadline, as real-time filmmaking in a game engine is an emerging field.
The future plan is to integrate the Real-time editing system with the story part of the project,
which is already in the production phase. With a complete story to test and iterate on, we can
locate and improve design issues that have not been found at the current stage, since the
test-ground scene does not count as an emotional structure.

The Real-time editing system can also be expanded and explored further; we can test its
performance without a reference cut, purely based on the intensity of the sound, the entrances
and exits of the characters, and the audience's reactions.
Figure 10. Keyframe for the story part of Project Heliosphere; this space station scene takes place in the first half of
the story.
References
1. Fullerton, Tracy. Game Design Workshop. Taylor & Francis, 2014.
2. “Timeline Overview.” Unity - Manual: Timeline, Unity, docs.unity3d.com/Manual/TimelineOverview.html.
3. Majaranta, Päivi, et al. “Gaze Interaction and Applications of Eye Tracking.” 2012, doi:10.4018/978-1-61350-098-9.
4. Bergstrom, Jennifer Romano, and Andrew Jonathan Schall. Eye Tracking in User Experience Design. Elsevier, 2014.
5. “Real-Time: Filmmaking.” 3D World, Nov. 2016, pp. 19–24.