Julian Kantor
Conferred Degree: Master of Fine Arts in Interactive Media
Degree Conferral Month: August 2015
Faculty Advisors of the School of Cinematic Arts: Peter Brinson, Michael Patterson, Vincent
Diamante
University of Southern California
April 1, 2015
Psynchrony: Finding the Way Forward
Julian Kantor
Table of Contents
1. Introduction: Psynchrony as a Synthesis
2. Beginnings: Defining the Central Vision
3. Core Interactions: Creating Meaningful Constraints
4. Form and Function: Building the Animation Pipeline
5. Role Playing: Designing Around the Character
6. Assembling the Pieces: Full Production Begins
7. Conclusion: Towards an Ending
8. Works Cited
1. Introduction: Psynchrony as a Synthesis
“Videogames” is a blanket term for a single medium that could probably be subdivided
into several others. As a player, I find myself drawn equally towards heavily narrative games
with limited interaction, and frantic action games with no narrative content to speak of. I have
always integrated musical composition into my design practice, and I have found that my
approach best lends itself to that second category of game: immediate, intense, but ultimately
lacking substantive meaning. While I aspired to tell stories, I found myself only able to do so in
writing, animation, and other non-interactive media.
Psynchrony is my attempt to integrate these two types of experiences, with the aim of
creating a new sort of narrative immersion. It uses unrelenting interactions that drive the
protagonist’s every action with the aim of deeply focusing the player’s attention, and delivers its
visual narrative along with the sensory power of music in order to create emotionally resonant
moments. While I have not completed the story I set out to tell, I hope that my thesis project
delivers on my vision for this type of experience.
2. Beginnings: Defining the Central Vision
The story of Psynchrony originated with a prototype for the thesis prep class, ironically
enough in response to a prompt to create something that was “not my thesis.” After
experimenting with low-intensity, exploratory gameplay throughout my time as an MFA student,
I made a challenging, action-based game I called Shatterfall. In Shatterfall, players free-fall
through a concrete structure, aligning themselves with glass panels that would shatter on impact,
allowing their fall to continue. If they missed a glass panel and hit concrete, the screen would cut
to black and the sequence would restart. As I finished the prototype, I began imagining a story
for the game – the player steps out onto the glass roof of a skyscraper, which would shatter
below them. After falling for some time, the environment would become inverted – the player
would now fly upwards towards the sky, empowered rather than challenged. As I discussed the
idea with Alexandra Tyng, an accomplished artist (and also my mom), she pointed out the
imagery of shattering glass ceilings, and suggested that I make the protagonist female.
Separately, I had begun to think of my thesis project as a potential “concept album.” I
have long thought of my games as sharing the structure of their soundtracks, and so I imagined a
game that, like a great album, could be replayed over and over. I pictured the experience of
listening to an album that, just as your favorite song reached its climactic moment, suddenly
veered off in a wonderful and unexpected direction, and pairing that with the experience of
finally reaching a hitherto-unexplored world in an endlessly replayable game like Spelunky. The
trouble was, my “concept album” concept had no concept! It was only after some time that the
Shatterfall narrative I had been working out in the back of my mind snapped into focus, and I
merged the two ideas into what would become Psynchrony. Like a concept album, it would tell a
story through music, but using visuals and interactivity rather than lyrics to bolster the player’s
understanding of that story.
Relatively quickly, I began to lay out the key visual moments of the story. But as I
enthusiastically rambled about these ideas to my advisor Peter Brinson, he raised a vital question.
He told me that he enjoyed imagining the scenarios that I described, but was concerned about the
lack of a unifying core interaction that could be applied to the entire project. As he pointed out,
such a mechanic would be essential from a production standpoint, so that a new gameplay
system would not be needed for every moment of the story. But from a player’s standpoint, a
unifying core mechanic would be equally vital: players could be fully immersed in the story,
without constantly being pulled out of the experience to learn a new interaction, and without
needing to question what was expected of them. This felt like an especially important trait for my
project to strive for, since I wanted it to flow seamlessly along with the music, and for players to
approach new challenges with the immediacy and certainty that the character they embodied
would.
At the start of the fall semester, I had a better idea of what this unifying mechanic would
be, and a clearer sense of what the story I wanted to tell was really about. The simple
mechanic I decided upon was “tap anywhere to act.” Players would simply tap anywhere on the
screen of their iPad to take an action as Kay, the game’s protagonist. I initially imagined
Psynchrony as a simple rhythm game – players would tap along to the music to a predefined
rhythm and watch the story unfold in different ways depending on the accuracy of their timing.
But a conversation with my professor Richard Lemarchand got me thinking about rhythm game
mechanics and their relationship to the narrative that would be created. Richard thought of
rhythm game mechanics as demanding, harsh, even fascist. This perspective had never occurred
to me, as a musician who finds nothing but pleasure in mastering complex and difficult rhythms.
After considering my discussion with Richard, however, I began to see his ideas resonating in
the fragments of a story I had been piecing together. The protagonist I imagined became a writer,
whose central struggle was overcoming the uncompromising rhythms of her daily routine, and
seeking out the source of these rhythms in her inner dream world.
3. Core Interactions: Creating Meaningful Constraints
In order to tell a story about rhythms, I realized that I needed something to put rhythm in
contrast to. I developed three interaction paradigms that adhered to the “tap anywhere to act”
constraint. The first was rhythm action, in which players would need to tap along to a predefined
rhythm. Next was free action, in which players were able to tap at any moment they chose.
Finally, rhythm choice would allow players to take different actions depending on which beat
they tapped to, allowing them to make choices in both narrative and action scenes. These three
modes of interaction would allow for diverse gameplay scenarios, but would also allow players a
window into the mindset of Kay, the game’s protagonist. Players should be able to ruminate on
the importance of whether Kay approached a given situation through rhythm action, free action
or rhythm choice, particularly if that action re-appeared in a new gameplay context later in the
story.
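The three modes described above can be sketched as a single tap-resolution routine. This is purely an illustrative sketch, not code from the project; the mode names, timing window, and function signature are my own assumptions.

```python
from enum import Enum

class Mode(Enum):
    RHYTHM_ACTION = 1   # tap must land on a predefined beat
    FREE_ACTION = 2     # any tap is valid
    RHYTHM_CHOICE = 3   # the beat a tap lands on selects a choice

def resolve_tap(mode, tap_time, beat_times, window=0.1):
    """Return what a tap means under each interaction paradigm."""
    if mode is Mode.FREE_ACTION:
        return ("act", None)                  # free action: always valid
    # find the nearest predefined beat
    nearest = min(beat_times, key=lambda b: abs(b - tap_time))
    if abs(nearest - tap_time) > window:
        return ("miss", None)                 # off-rhythm tap
    if mode is Mode.RHYTHM_ACTION:
        return ("act", None)                  # on-beat tap triggers the action
    # rhythm choice: which beat you hit determines which choice you picked
    return ("choose", beat_times.index(nearest))
```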
Having set my goal of establishing three core interaction modes, I needed to start creating
the visual cues that would alert players of the interactions that were available to them. I began to
work with my teammates, second-year MFA students Natalie Gravier and Zach Davis, who had
come aboard the project at the start of the fall semester. I did not want the interaction cues to be
on a timeline (as in Parappa the Rapper) or a note highway (as in Guitar Hero). An obtrusive
user interface overlay would lead players to completely ignore the game’s visual narrative. I
instead wanted to achieve something like what Rhythm Heaven had: completely diegetic rhythm
cues. However, while Rhythm Heaven accomplishes this across a huge variety of unique
mini-games, each mini-game comes with its own training period to teach players what to look
out for. With Psynchrony, players would need to be able to immediately tackle unfamiliar
situations, so I felt there needed to be some sort of subdued, but consistent, overlay that would
indicate points of interaction.
We ended up with the idea of an “interaction silhouette,” a glowing white halo that would
hover around the edges of Kay’s body, which was, in these early stages of development,
represented by a sphere. For rhythm action, this silhouette would appear as a circle, and would
slowly grow brighter, collapsing and wrapping itself around Kay’s outline. The precise moment
to react would be the moment that it touched her. We playtested this visual cue using a prototype
in which players would tap to propel a ball down a colorful environment, and found that it was
immediately understandable to a wide variety of players, with no tutorialization needed. The single point
of negative feedback we received repeatedly was that players wanted to be able to choose where
to go. This, of course, played into our plans for the other interaction modes, and so we decided to
stick to our current course of action. For free action, the second interaction type, we designed an
interaction silhouette that vibrated in and out, loosely matching Kay’s shape, but that never
collapsed or dramatically changed shape. We found that players felt immediately drawn to tap
the screen, without waiting for a specific moment to interact, and so we moved on to rhythm
choice.
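The rhythm-action cue described above, a halo that collapses onto Kay's outline exactly on the beat, amounts to a simple function of time. A minimal sketch with invented parameter names, not the project's implementation:

```python
def halo_radius(now, beat_time, lead_in, outline_r, start_r):
    """Radius of the rhythm-action cue at time `now`, for a beat at
    `beat_time`. The halo appears `lead_in` seconds early at `start_r`
    and collapses to `outline_r` (Kay's outline) exactly on the beat."""
    t = (beat_time - now) / lead_in      # 1.0 at cue start, 0.0 on the beat
    t = max(0.0, min(1.0, t))            # clamp outside the cue window
    return outline_r + t * (start_r - outline_r)
```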
For the third interaction type, rhythm choice, a more complex silhouette had to be
designed. First, the target of interaction was no longer Kay, but multiple objects representing
potential interactions. Secondly, due to the nature of the “tap anywhere to act” mantra, only one
choice would be valid at any given moment. Choices would have to cycle along to a rhythm, but
we did not want players to act upon the first choice that became available simply because they
misunderstood their options. Therefore, the rhythm choice interaction silhouettes would need to
communicate not only when each choice became active, but also which potential interactions
were currently unavailable. After many iterations, we settled upon the idea of using a dashed line
silhouette to represent an inactive choice, which would become solid and coalesce around its
target to indicate when the choice became available. We conducted some informal playtesting of
this mechanic, using primitive shapes to act as “choices.” We found that players were easily able
to find the rhythm and tap to select a choice, but they were confused by the abstracted quality of
their actions. Without narrative context for the mechanic, players did not feel as though they
were making a choice at all. Although we felt we had done a good job of establishing the basic
interactivity of the project, we were still a long way from realizing its narrative
ambitions. In order to start achieving that, we were going to have to make something where
players controlled a person, not a sphere.
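The beat-cycling behavior of rhythm choice can be sketched as a round-robin over the available choices; this is illustrative only, and the names are assumptions:

```python
def active_choice(beat_index, choices):
    """Under 'tap anywhere to act', only one choice is live per beat:
    the active one (drawn as a solid silhouette) advances round-robin
    with the music, while the rest stay dashed."""
    i = beat_index % len(choices)
    return choices[i], [c for j, c in enumerate(choices) if j != i]
```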
4. Form and Function: Building the Animation Pipeline
From the inception of the project, I knew that the primary technical challenge in
developing Psynchrony would be in visually representing Kay, the game’s protagonist. The
entire story revolved around Kay, and I wanted her to feel like a real person, not just an artificial
videogame character. Not only would she need to perform a huge variety of actions throughout
the course of the game, but the lack of dialogue meant that Kay would also need to convey the
story through the subtleties of her body language and facial expressions. Additionally, the
musical nature of the project meant that it was absolutely critical that Kay’s movements were
fluid. I felt as though typical approaches to depicting human characters in games would not
apply, for a variety of reasons.
The big-budget approach to character design, creating complex 3D models, rigs and
animations, was completely infeasible. While I had a great volunteer team that provided as much
help as they could, I felt that I needed to find a means of creating character art that I could
execute on my own, if need be. And even if I could recruit a team filled with Maya experts who
had nothing but free time to devote towards my project, the stiff, puppet-like nature of so many
3D characters in games made me feel as though this approach was not a great fit for the type of
game I wanted to make.
Other, smaller games that tell human stories will often rely on the first person perspective
or 2D sprite animations. First person is a widely-employed convention that has many strengths,
not the least of which is that it entirely removes the need to depict the protagonist at all.
However, I felt that Kay’s story had to be told with specificity and emotion, and that she had to
be shown on-screen. Finally, 2D animations have had a recent resurgence in indie games to
wonderful effect. Many such games use a minimalist, stylized art style to allow players to fill the
details in with their imaginations, and to limit the complexity of asset creation. While this
approach works fantastically for many games, I again felt that an overly-simplified approach
would limit the specificity of Kay’s story, and that it would clash with the graphic novel-inspired
visual style I imagined for the game. Furthermore, I am neither a trained artist nor a trained
animator; while I can draw passably well given enough time, creating convincing movement
from frame to frame seemed like something I definitely could not do. And even if I could
somehow achieve it, hand-drawing each sprite would mean that an inordinate number of assets
would need to be created in order to achieve the scope of the story and the fluidity of the
animation that I imagined. This would be true even if I reconfigured Psynchrony’s visual style to
be as minimalistic as it possibly could be.
One rarely-utilized approach to this problem that has long fascinated me is rotoscoped
animation. Rotoscoping, the act of tracing over frames of video, by its very nature leads to
lifelike movements. It has been used outside of games to stunning effect: one example is the
iconic music video for A-ha’s “Take on Me,” animated by my advisor Michael Patterson. A pair
of games from the early 1990s, Prince of Persia and Another World, used rotoscoped graphics to
create protagonists that felt more cinematic and human than those of their contemporaries.
When I watched GDC classic game postmortems about their development and sought them out
to play, I was intrigued, both as a player and a designer. First, the practicality of this approach
struck me as genius: for Prince of Persia, Jordan Mechner simply recorded his brother
performing the necessary actions, which served as the basis for the eponymous prince’s
animations. As a player, the stylized realism of these games struck a chord with me that not
many others did. Another World was a cinematic game that told a strong, visual story within the
conventions of a sidescrolling platformer, even within the tight technological constraints of the
day.
Having a video reference for each frame of animation alleviated the question of whether
my technical skills would be enough, but I still had the question of scope. In his GDC talk, Éric Chahi,
Another World’s creator, explains the game’s polygonal graphics, a unique innovation, in depth.
idea of using vector graphics intrigued me, and not just because it created such a striking effect
in Another World. Since vector graphics are based on numerical data points and equations, not
bitmaps, they could be interpolated with ease. This meant that I needed to draw only a few
keyframes in order to achieve the fluidity that was needed.
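The scope argument above can be made concrete: since a vector keyframe is just a list of coordinates, an in-between frame is a plain linear blend of numbers. A toy sketch, not the project's code:

```python
def lerp_frame(key_a, key_b, t):
    """Blend two vector keyframes (matching vertex lists) at 0 <= t <= 1."""
    return [(ax + (bx - ax) * t, ay + (by - ay) * t)
            for (ax, ay), (bx, by) in zip(key_a, key_b)]
```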
I had previously put some of these ideas to test in Underneath, a game I made in
collaboration with Alex Mathew in 2013. Our minimalistic protagonist was abstract, but his
movements were sufficiently fluid. Underneath also made me further consider the potential of
real-time vector graphics. Our game’s core mechanic was rotating a three-dimensional cave to
discover hidden outlets and passageways to explore; therefore, our character needed to be 3D.
Since I was already interpolating between keyframes for each animation, I realized that I could
also interpolate between keyframes of separate views of the same animation as the camera
orbited the main character – this had the very convincing effect of making our character look 3D,
even though he was still made up of flat 2D polygons.
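The view-blending trick can be sketched in the same spirit: keyframes are drawn for a few camera angles, and the flat 2D shape shown at any orbit angle is interpolated between the two nearest drawn views. The angle handling here is an assumption, not the Underneath source:

```python
def blend_views(angle, view_angles, views):
    """views[i] is the vertex list drawn for camera angle view_angles[i]
    (sorted, in degrees); returns the flat 2D shape shown at `angle` by
    interpolating between the two nearest drawn views."""
    for i in range(len(view_angles) - 1):
        a0, a1 = view_angles[i], view_angles[i + 1]
        if a0 <= angle <= a1:
            t = (angle - a0) / (a1 - a0)
            return [(x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)
                    for (x0, y0), (x1, y1) in zip(views[i], views[i + 1])]
    raise ValueError("angle outside the drawn views")
```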
For the character animations in Psynchrony, I built off of the ideas I had tested in
Underneath, with several important differences. First, I rewrote the animation system to be able
to support far more complex characters, while performing well on iOS. Next, while the 3D view
blending remained in place in order to realize the heightened reality of Kay’s dream world, I
expanded the ways in which different animations could be blended together. My goal was to cast
a lead actress to play Kay, shoot the entire project like a film, and use the footage as reference to
create the animation keyframes.
This animation system had many affordances which made it well-suited to the project.
The way in which different animations could be blended together meant that even more assets
could be generated from a small set of keyframes – for example, Kay’s “sad walk” and
“confident walk” could be blended in equal parts to produce a “neutral walk.” Furthermore, it
could be tightly choreographed using Metronomaton, the rhythmic scripting framework that I
developed during a directed research course with my advisor Vince Diamante, resulting in
movements that looked at once realistic and expressionistic. I was also able to take the
interaction silhouettes that had been prototyped earlier and automatically fit them to any action
that Kay happened to be taking.
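The "sad walk" plus "confident walk" example above can be sketched as a weighted blend over named poses; the pose data and function are invented for illustration:

```python
def blend_poses(poses, weights):
    """poses: {name: [(x, y), ...]} with matching vertex counts;
    weights: {name: w}, summing to 1.0. Returns the blended pose."""
    n = len(next(iter(poses.values())))
    return [
        (sum(w * poses[k][i][0] for k, w in weights.items()),
         sum(w * poses[k][i][1] for k, w in weights.items()))
        for i in range(n)
    ]

# "sad walk" and "confident walk" blended in equal parts -> "neutral walk"
neutral_walk = blend_poses(
    {"sad": [(0, 4), (1, 2)], "confident": [(0, 8), (3, 2)]},
    {"sad": 0.5, "confident": 0.5},
)
```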
Finally, and most significantly, the real-time nature of the animation system meant that it
could be responsive to player input. In my earliest prototype of Psynchrony, using a primitive
iteration of the animation system, players would simply tap along to Kay’s steps as she walked
automatically through the streets of an unseen city. When I had department chair Tracy Fullerton
try the prototype, one of the things she mentioned was that she wanted to be the protagonist, not
just watch her; this had been my intention all along, but Tracy’s feedback showed me that I
would need to rework my conception of the way that Kay would animate. I designed the new
animation system around the idea that the player’s input would be the impetus for Kay’s every
movement. After I had finished a preliminary version of the animation system, my team and I
recorded some test footage, using ourselves as stand-ins, to prototype some of the scenes from
the game and figure out how the player-character interactions would work.
5. Role Playing: Designing Around the Character
The primary scene I focused on was Kay standing in front of her bathroom mirror,
brushing her teeth. It was a scenario in which players could see a close-up view of the
protagonist, but the perpetual motion of Kay’s hand also allowed the scene to serve as a test bed
for repetitive, action-oriented mechanics. Once I had rotoscoped the scene’s keyframes, I set
about animating the scene in such a way that brought reactivity to the forefront. For the first
variation of the scene, Kay would move the toothbrush with a single stroke every time the player
tapped the screen. The design of the animation system allowed me to easily create multiple
variations of this movement: a tap that was tightly synchronized with the rhythm of the
movement would make Kay quickly fling the toothbrush across her mouth, while a more loosely
synced tap would make her drag it more slowly. A complete “miss” would make her flinch, but
not bring the toothbrush across. For the second variation of the scene, Kay would brush her teeth
automatically, but now the player would control her movements on a higher level. Every time
they tapped, she would change pose to brush a different area of her mouth. Again, the accuracy
of the player’s tap corresponded to the snappiness of her movements, and as the player’s streak
of accurate taps grew longer, Kay’s brushing pace would double successively.
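The accuracy tiers and streak-driven tempo doubling described above can be sketched as follows; the thresholds and the four-hit streak interval are invented, not the game's actual tuning:

```python
def brush_response(offset, tight=0.05, loose=0.15):
    """Classify a tap by |offset| seconds from the beat: a tight tap
    flings the brush, a loose one drags it, a miss only makes Kay flinch."""
    off = abs(offset)
    if off <= tight:
        return "fling"
    if off <= loose:
        return "drag"
    return "flinch"

def tempo_multiplier(streak, max_doublings=2):
    """Each sustained run of accurate taps doubles the brushing pace,
    capped so the tempo stays playable."""
    return 2 ** min(streak // 4, max_doublings)   # e.g. one doubling per 4 hits
```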
I showed these rough prototypes at the fall open studio, and for the most part they were
received enthusiastically. Players enjoyed the sense of feedback that would result from their
interactions – more than once, I was told that the game felt magical, and that players had zoned
out into a trance-like state while playing. Some players felt as though the interaction silhouettes
distracted from their ability to pay attention to the animation, and others felt as though the
interaction silhouettes themselves did not provide enough feedback for timing accuracy. But
overall, the connection between the player and character was vastly strengthened from the
earlier, primitive character-based prototype.
For the winter show, I stitched together the variations of the teeth-brushing scene into a
cohesive sequence with a narrative framework and a musical score. In the scene, a tutorial
explains the concept of rhythm action, and the player taps to brush during an introductory period.
Once they get Kay into rhythm, she brushes automatically, and players start to tap to change pose
as the music builds in intensity. Finally, an alarm on her phone interrupts Kay, who then needs to
hurry and finish brushing her teeth using free action. While it did not have nearly as much
narrative content as the final game would have, I wanted to get a sense of how immersed
players could become from an uninterrupted segment of gameplay with all of their senses engaged.
The reception was again positive – the added tutorial and uninterrupted gameplay allowed almost
all players to advance through the entire experience without any external prompting. Players
would also refer to “missing” the rhythm, which I felt was a good sign that the animations were
providing sufficiently responsive feedback.
While most players were left excited to see what would come next, and some reported
having felt moved and experienced “being” Kay, others questioned how emotionally engaging
the experience could be. One player felt as though the simplicity of interactions was
fundamentally unable to provide an emotionally rich experience. While this concern might very
well turn out to be valid, I wanted to see the project through to see if that response could change.
I felt that final art, based on the performance of a trained actor, and a more complete narrative
sequence was necessary in order to be able to fully answer that question. But in order to begin
finding an answer, I would have to venture out of my comfort zone and begin capturing the
footage that would serve as the basis for the narrative.
6. Assembling the Pieces: Full Production Begins
I entered my second semester with the clear objective of getting the live action shooting
process up and running, but I quickly found that this was easier said than done. While I had made
a few comedic shorts in the past, and had shot footage with friends for my past experiments with
rotoscoping for games, I had never tackled a film project of this scope before. I found myself not
even sure how I should get started, particularly because writing a script for my dialogue-free
game seemed like an exercise in futility. I found that I had done a lot of work on the areas of the
game that felt easy for me to work on – creating environmental art, laying a structural foundation
for the project by building systems, and working out the details of narrative. While I had help
from my small team in these areas, they were mostly tasks that I could manage on my own. But
starting to film meant that I had to take on a more active leadership role, build a larger team and
start relying more heavily on others.
I ended up enrolling Ascot Smith, a second-year MFA student with prior experience in
live action, as producer for the project. With Ascot’s help, I began to break down exactly what
we would need to film into a shot list that wound up being over 120 items long. Any illusions I
had harbored that I would be able to tell the entire story I had envisioned quickly evaporated.
There would be nowhere near enough time to bring each shot into the game as an interactive,
animated scene complete with a 3D background, much less stitch them all together into a
seamless, musical experience. So I refocused my ambitions for my thesis project: I would
capture the entire story on film, but bring only an introductory sequence of the final game to a
full level of completion. This way, the project as displayed at the final thesis show could be
something that, when played, felt like a complete experience, but to me would serve as a
concrete test case of my vision. Could the level of emotional immersion I was seeking be
achieved?
We began to move into crunch mode for the film production portion of the project.
Bureaucracies were navigated, roles were cast, a crew was assembled and shoots were completed.
It was great to work with Laura Wineland, the Drama School MFA student who we cast as Kay.
She was excited by the project and eager to participate in a kind of acting that was radically
different from what she was accustomed to. Rather than running lines over and over, she could
concentrate entirely on her non-verbal communication. The week after our first shoot, I animated
a shot every day, relieved to have the ability to finally get started on art assets that were not just
placeholders. Creating the animations also helped me direct Laura, as I was
gaining an understanding of what was, and was not, working well once it was translated into the
style of the game.
While it was great to finally be in full production mode, I also began to understand just
how limited my time was. While I could fill my weekdays with smaller tasks, like rotoscoping
and animating shots, my weekends were spent on set, obtaining footage for the rest of the game.
But even as I created the content I would need, I had not made any playable progress towards
my thesis show deliverable. And my schedule made it very difficult to get work done on the
larger projects that I would need to complete in order to start editing shots together into playable
sequences. At the same time, I did not have the number of art assets needed to provide the gameplay with
the context it required, which made me feel as though I needed to keep my head down and
proceed with the shooting and animating process.
There was an uncomfortable Catch-22 developing – if I kept working on producing
assets, I would have less and less time to assemble, test and iterate on the interactivity. But if I
switched focus to interactivity, how useful would the testing and iterating be without the context
of the story and the feedback from the dynamic animations informing players’ actions?
What ultimately forced me to switch focus to the interactivity portion was working with
another second-year MFA student, Steve Cha, on the final project for his user research class. We
had agreed that he would conduct research on a playable prototype for my project and he needed
to start testing. The last playable prototype we had assembled was the winter show demo, which
had simple interactions that did not approximate what we were trying to deliver for the thesis
show. While we had tested the three visual cue languages separately much earlier in the process,
we were still unsure about how players would react to them in sequence. We needed to know
whether players would be able to recognize what was expected of them and execute on that
objective at a fast pace. We ultimately felt it was best to spend a few days constructing a
usability test using primitive shapes, without any assets. This approach was highly problematic:
not only would there be a lack of crucial context and feedback for the player, but it also was still
more work I had to complete that would not feed directly into the thesis deliverable. But I had to
confront the fact that at this point, any approach I would have taken would have been highly
problematic. I had been neglecting interactivity for too long, and I had to start paying attention to
that essential element of my project, even if it would create other issues along the way.
7. Conclusion: Towards an Ending
Now, I am entering the final stretch of production. I have nearly all of the character
animations I will need to tell the basic story I want to for the thesis show. With the essential help
of my advisor Michael Patterson, I developed a pipeline to get my teammates working hard to
supply the rest. I am conducting usability tests to refine the mechanics, and the visual and textual
language needed to give players enough information to engage with the game. And I am finally
at the point where I can assemble the sequences that will make up the experience that will be
shown in May.
It took a while for me to reach this point, and I still have a ways to go before the project
will be able to do what I want it to. But part of this process has been figuring out the process
itself. While I have learned valuable lessons about production pipelines and prudent scoping
throughout my time as an MFA student, I truly believe that in order to deliver on my vision for
Psynchrony, I had to get lost in the weeds along the way. So much of the year was spent on
defining, refining and executing on the process of giving interactive life to canned, live-action
performances, and there were many other things I neglected along the way. But making Kay a
living, interactive character was the central challenge of this project. All of the other things that I
still need to do, I have done before. I am optimistic that I will have time to do them again, but
even if I do not, I know I will be able to do them in the future. Ultimately, I will end the semester
and leave graduate school with the ability to tell human stories in my games, something I never
before knew how to do. At the thesis show in May, I will only tell a part of Kay’s story, but after
going through the thesis experience, I have gained the ability to finish it.
8. Works Cited
Yu, Derek, and Andy Hull. Spelunky. Mossmouth, 2012. Xbox 360.
NanaOn-Sha. Parappa the Rapper. Sony Computer Entertainment, 1996. PlayStation.
Harmonix. Guitar Hero. Red Octane, 2005. PlayStation 2.
Nintendo SPD, TNX Music Recordings. Rhythm Heaven. Nintendo, 2008. Nintendo DS.
A-ha. “Take On Me.” Music video, animated by Michael Patterson. Rhino, 1986. Retrieved April 1, 2015, from http://www.youtube.com/watch?v=djV11Xbc914.
Brøderbund. Prince of Persia. Brøderbund, 1989. Apple II.
Delphine Software. Another World. Delphine Software, 1991. Amiga, Atari ST.
Abstract (if available)
Abstract
“Videogames” is a blanket term for a single medium that could probably be subdivided into several others. As a player, I find myself drawn equally towards heavily narrative games with limited interaction, and frantic action games with no narrative content to speak of. I have always integrated musical composition into my design practice, and I have found that my approach best lends itself to that second category of game: immediate, intense, but ultimately lacking substantive meaning. While I aspired to tell stories, I found myself only able to do so in writing, animation, and other non-interactive media. ❧ Psynchrony is my attempt to integrate these two types of experiences, with the aim of creating a new sort of narrative immersion. It uses unrelenting interactions that drive the protagonist’s every action with the aim of deeply focusing the player’s attention, and delivers its visual narrative along with the sensory power of music in order to create emotionally resonant moments. While I have not completed the story I set out to tell, I hope that my thesis project delivers on my vision for this type of experience.