A MEDITATIVE APPLICATION INSPIRED BY EMOTIONAL REGULATION
by
James Leo Bulvanoski
________________________________________________________________________
A Thesis Presented to the
FACULTY OF THE USC SCHOOL OF CINEMATIC ARTS
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
MASTER OF FINE ARTS
(INTERACTIVE MEDIA)
May 2012
Copyright 2012 James Leo Bulvanoski
Epigraph
I am grateful, violins, for this day
of four chords. Pure
is the sound of the sky,
the blue voice of air.
Pablo Neruda
Table of Contents
Epigraph ....................................................................... ii
List of Figures ................................................................ iv
Abstract ........................................................................ v
Introduction .................................................................... 1
Prior Art ....................................................................... 2
Development Process ............................................................ 10
Conclusion ..................................................................... 24
Bibliography ................................................................... 25
List of Figures
Figure 1: A model for interactive meditation experiences 3
Figure 2: Screenshots of Fourface for the iPhone 6
Figure 3: Screenshot of Bloom for the iPhone 7
Figure 4: Screenshot of TrueMonk directing attention between objects 9
Figure 5: Screenshot of TrueMonk Unity prototype development 10
Figure 6: Comparison photos of Fourface on iPhone and iPad 12
Figure 7: Audio analysis summary of duration and Hz 14
Figure 8: Cylinder Zero: 8_20780_digriffin_zen-gong.wav,
Spectrum Analysis using Audacity 15
Figure 9: Cylinder One: 7_23541_loofa_thai-gong3.wav,
Spectrum Analysis using Audacity 15
Figure 10: Cylinder Two: 4_24623_anamorphosis_gmb-kantilan-3.wav,
Spectrum Analysis using Audacity 16
Figure 11: Cylinder Three: 77_48325_monkay_singingbowl.wav
Spectrum Analysis using Audacity 16
Figure 12: Screenshot from TrueMonk user test of digital prototype one 17
Figure 13: Screenshots from TrueMonk user test of digital prototype two 20
Figure 14: Screenshot from TrueMonk user test of digital prototype two 20
Figure 15: Screenshot from TrueMonk digital prototype three – pinch to zoom 21
Figure 16: Screenshots from TrueMonk user test of digital prototype three – pinch to zoom 22
Abstract
Advances in neuroscience and affective psychology have opened new areas of
research in emotion regulation. This paper investigates one area of emotion regulation,
meditation, and how contextualizing different types of meditative applications in terms of
emotional regulation can inform meditative application design.
Introduction
Emotional self-regulation influences the emotions we have, when we have them, and how we have them (Gross, 2008). Research in the fields of social psychology (Wadlinger and Isaacowitz, 2011) and neuroscience (Lutz et al., 2008) suggests that, when treated as attention-training methodologies, meditative practices may be among the most effective strategies for emotion regulation. This project treats meditation as an attention-training methodology, and this thesis focuses on the design challenges inherent to an interactive meditation application: how best to exercise attention using touch, color, shape, movement, and sound. The project aims to provide insight in two ways: 1) by exploring how touch, color, shape, movement, and sound can be used in focused and open meditation, and 2) by exploring how touch, color, shape, movement, and sound can be combined as interactive objects of meditation.
TrueMonk is a meditative application designed to exercise attention using touch,
color, shape, movement, and sound. The application has been inspired by the findings of
Wadlinger and Isaacowitz, which link meditation to emotion regulation through attention
training. TrueMonk was designed to exploit the affordances of meditation in exercising
the emotion-regulation strategies of “selective attention to object” and “disengage
distracters; reengage object.” (Wadlinger and Isaacowitz, 2011, p. 89)
Prior Art
Meditative applications were selected based on personal experience. These
included the rhythm games Halcyon and Pulse; the music applications Bloom, Chimes,
Synth Pond, and Bubble Harp; and the relaxation tools Zen Bound, Zen Garden, Strange
Rain, and Koi Pond. The applications were compared across four qualitative dimensions
(See Figure 1). The first dimension, focused versus open meditation, evaluates the
perceived meditative experience of the application, ranging from focused meditation to
open meditation. Contemplative, the second dimension, goes from not contemplative to
very contemplative and is defined as including counting or repetitive touch or sound
elements. The third dimension, challenge, ranges from low challenge to high challenge
and measures the perceived level of difficulty employed by the application. A fourth qualitative dimension, generative sound, is the perceived level of controlled versus random agency used by the application to produce sound.
(Figure 1: A model for interactive meditation experiences)
A review of prior art linking meditation to emotion regulation through attention
training found the perceived level of focused versus open meditation of applications to
match both the “cognitive effort to acquire” and “cognitive effort to execute” of emotion-regulation strategies (Wadlinger and Isaacowitz, 2011, p. 89). These findings suggest some meditative applications are better suited to one type of meditation than another, depending upon the emotion-regulation strategies exercised.
According to Wadlinger and Isaacowitz’s model of attention-training methods
integrating emotion regulation strategies and attentional networks, the attentional-training
methodologies of meditation employ a number of emotional regulation strategies:
- Sustained attention on object, such as the breath or a spot on the wall.
- Sustained awareness without focus, such as not being bothered by
distraction while still being able to focus on anything you choose.
- Selective attention on object, such as choosing one object among many to
focus on.
- Disengage distracters; reengage object, such as paying attention to one
specific fly while numerous flies are around you.
- Disengage distracters; nonreactive labeling of distracters, such as the
ability to choose and to change what to focus on amongst distracting
sounds or movements, without labeling them as something to be ignored
categorically.
As these emotional regulation strategies correspond with the attentional strategies
of the meditation applications on the previous chart, we can start to gain some insight into
how applications using generative sound and algorithmic music, such as Bloom, Strange
Rain, and Bubble Harp, feel more meditative than those with fixed background music,
such as Zen Bound and Pulse. The emotional regulation strategy of “sustained attention
on object” is employed by many of the focused meditation-oriented applications such as
Zen Bound, Zen Garden, and Pulse. We can also find that Bloom, Koi Pond, and Bubble Harp were perceived by the author to be more open meditative than other applications, making use of the emotional regulation strategies of “disengage distracters; reengage object” and “disengage distracters; nonreactive labeling of distracters.” TrueMonk has been designed to use the “selective attention to object” and “disengage distracters; reengage object” emotion regulation strategies.
According to the model of attention-training methods integrating emotion
regulation strategies and attentional networks, meditative applications making use of the
executive attentional network employ the emotion regulation strategies of “disengage
distracters; reengage object” and “disengage distracters; nonreactive labeling of
distracters.” If we go on to list the emotional regulation strategies employed by
applications listed on the model for interactive meditation experiences, we will find three
of the four works of prior art -- Strange Rain, Bloom, and Bubble Harp -- make use of all
three attentional networks, integrate the use of touch and sound via interactive systems,
and produce generative sound and algorithmic music. Taking a similar approach,
TrueMonk can be a meditation application that provides users with a highly open meditation experience.
Meditative applications were found to use different combinations of direct versus
indirect touch, color, movement, and sound to exercise attention. Relaxation tools use
direct control over sound and movement and indirect control over color. And rhythm
games use direct control over shape and indirect control over sound.
In conceptualizing TrueMonk as a meditative touch interface, I started going
through my iPhone to find touch interfaces I enjoyed. Fourface, by fellow USC IMD
graduate William Carter, is a collection of playful interfaces for the iPhone. Fourface
features a number of visually oriented interfaces (See Figure 2). The Arc interface of
Fourface was a favorite of mine for a number of reasons. Its tone felt meditative. The
color palette was dark but colorful. The design was subversive. So much tapping on the
iPhone had me exhausted, but here was something wholesome. Something you could
drag your fingers across. Its sound evoked a sense of surprise. As you move your touch
from arc to arc and then stay on one arc for a second, the color of the arc changes from
green to blue. When you keep a finger on the arc long enough, the application’s volume
increases and pitch gradually rises. The sound made my touches feel authentic and
meaningful. I was inspired.
(Figure 2: Screenshots of Fourface for the iPhone)
Fourface also offered the advantage of featuring two other creative touch
interfaces. These interfaces threw my interest in the arc interface into relief. One interface
featured a number of playful buttons, while the other used squares. These alternatives to
the arc interface were equally creative but not as meditative. They served to reassure me
my search was over. I wanted to make a meditative touch interface using color, arc
shapes, sound, and movement.
Research into prior art showed music applications used some combination of
color, shape, sound, and movement to direct attention. Music applications used direct
control over sound and movement, with indirect control over color. TrueMonk was
designed to be a meditative music application, using indirect control of sound to focus
attention on touch.
The iPhone application Bloom, by Brian Eno and Peter Chilvers, treats sound in a similar way. In Bloom, users interact with generative sounds via touch to compose ambient music. The initial melodies of these compositions register the order and location of the user's touches. The compositions are then augmented and repeated according to unknown algorithms, resulting in
ambient music. Bloom also gives users indirect control over color. New colors in the
form of small, lightly shaded circles appear wherever users tap the screen. The shapes
grow in diameter while the sound increases in volume and rises in pitch before being
eventually absorbed into the background (See Figure 3). In this way, the color
composition of Bloom changes along with the music composition. The result is an
interactive, generative, ambient music experience. Bloom has been influential to
TrueMonk in the way the application allows users to create interactive music.
(Figure 3: Screenshot of Bloom for the iPhone)
Halcyon is a rhythm game where players drag triangle shapes to trigger harp
sounds. The game times the completion of player touches. As play progresses, challenge
increases with the addition of more shapes. The resulting sounds work well with the
background music. Halcyon directs user attention with shapes and color. When dissimilar
shapes are close to colliding, wavy red lines appear. The red lines direct the user’s
attention to the shapes.
Pulse is a rhythm game using shape to direct attention. Pulse randomly places
small, circular-shaped touch points around a larger circle. When users touch these shapes,
the music and color of the application continue. When players miss the shapes, the
soundtrack is interrupted and the touch point explodes.
TrueMonk intends to use direct control of touch, shape, and movement like a
rhythm game and combine it with indirect control of sound and color like a music
application. Like Halcyon, TrueMonk uses color and movement to direct attention. The
water level rises when you are not interacting with the main chime. As pieces get closer
to the edge of the waterfall, changes in color are used to bring attention to the shape. If
shapes go over the waterfall and hit the chime, rings of the main chime also change color.
Like Pulse, TrueMonk uses shape to direct attention and interact with sound. As
players touch the shapes in the foreground, the sound of the application is not affected
(See Figure 4). When players fail to touch these shapes, the meditative gong-like sound interrupts the silence. Like Bloom, TrueMonk lets users touch to interact with sound and compose music, but the music is not ambient; it is intentional. TrueMonk combines touch and sound in a unique way.
(Figure 4: Screenshot of TrueMonk directing attention between objects)
Development Process
Digital Prototype
The first iteration of TrueMonk was developed as a digital prototype in Unity, a
game development tool for producing touch-based applications for the iPhone and iPad.
The goal of this prototype was to use touch, color, shape, movement, and sound to make
a meditative application for the iPad. The application was visualized as being made of
overlapping circles changing color when touched. My focus at this time was on finding a
way to code the application, like Fourface, so it would change the color of an outer ring
when swiped. With no sample code to work from, the approach was experimental.
I developed a system involving four concentric cylinders. I thought of them as
chimes and enjoyed the way they looked like discs (See Figure 5). This would lead to ideas I returned to later in my design.
(Figure 5: Screenshot of TrueMonk Unity prototype development)
My first attempts at tracking touch were failures. There were many approaches,
including one using four balls moving outward from the center when dragged. At first,
these four balls remained stationary where I left them. Touch felt boring. I fixed the code
so they returned to center. That was much better. I needed only one ball.
While developing this early prototype, a number of deviations from the original
design were made:
- Narrow-spaced arcs were found to be less “meditative” than wider arcs.
- Fewer arcs were used in the interface.
- Rotating text around the arcs was removed, as it seemed distressing.
Also, the larger screen size of the iPad changed the way the arcs of Fourface changed color. On the larger screen, the changes in color were alarming, and the screen-filling arcs felt less intimate (See Figure 6). The increase in arc size also meant the text rotating around the arcs moved faster, which was alarming. With a larger surface, a touch would be less careful and move too fast, creating a distracting sense of movement. The arcs of the TrueMonk interface would need to be smaller in total size but individually wider. TrueMonk would also need to keep the arcs highlighted longer. To do so, the code values for highlighting the arcs were reversed. Rather than highlight the arc only while touched, the arcs would remain highlighted until the touch was released, producing a kind of rubber band-like mechanic.
(Figure 6: Comparison photos of Fourface on iPhone and iPad)
The new mechanic made the application more meditative. It required the user to
repeatedly drag a finger from the center of the main chime to the outer rings. The
repetition of the gesture reminded me of ritual. It required the user to drag slowly and
intentionally. The slowness made the touch feel more authentic. There was less of an
emphasis on immediate feedback and more emphasis on being thoughtful. The touch had
meaning.
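The rubber band-like highlight latch can be sketched in a few lines. The prototype itself was built in Unity; the Python below is a hypothetical, engine-free illustration of the reversed logic described above — arcs latch on when entered, and every arc clears only on release. The class and method names are my own, not those of the actual implementation.

```python
class Arc:
    """One concentric arc of the interface."""
    def __init__(self, name):
        self.name = name
        self.highlighted = False

class RubberBandInterface:
    """Sketch of the reversed highlight logic: instead of highlighting
    an arc only while the finger is on it, each arc latches on when
    entered, and all arcs clear only when the touch is released."""
    def __init__(self, arc_names):
        self.arcs = [Arc(n) for n in arc_names]

    def on_touch_enter(self, index):
        # Latch the highlight as the finger drags outward from center.
        self.arcs[index].highlighted = True

    def on_touch_release(self):
        # Clear everything only on release, producing the rubber-band feel.
        for arc in self.arcs:
            arc.highlighted = False

ui = RubberBandInterface(["center", "first", "second", "third"])
ui.on_touch_enter(0)
ui.on_touch_enter(1)  # finger drags from the center onto the first arc
print([a.highlighted for a in ui.arcs])  # [True, True, False, False]
ui.on_touch_release()
print([a.highlighted for a in ui.arcs])  # [False, False, False, False]
```

The latch is what slows the gesture down: no visual state changes back until the user deliberately lets go.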
Meditative color and sound were now needed to match the meditative gesture.
Fourface used a palette of muted colors, enabling a certain amount of space on the color
wheel for lighter tones to be used when the arc was highlighted. I wanted to find muted
colors that were more calming than the green color used in Fourface. A number of colors
were chosen, including those associated with calmness, such as sky blue. Light teal was originally chosen as the highlight color, but highlighting the arcs in this design was no longer meditative. Dark violet and yellow were also tried, but dark violet felt too intense and yellow created too much contrast with the cool hues of sky blue and teal. The highlighting was therefore reversed: the circles would begin highlighted and be filled with darker tones when touched. In addition, a third color was added to coincide with the
highlighting of the outermost arc. Magenta was chosen as a complementary color to dark
teal for use in the outer arc.
With a touch type and visual design finalized, the next focus was sound. To
determine the effect of sound on the touch and visual experience, a number of different
bell sounds were tried. Initial bell types included bicycle bells, western church bells, and
individual chimes. As the design evolved, these simple bell rings, which were similar in
duration and perceived pitch to those used in Fourface, no longer made sense. The
application needed a more meditative sound to reflect the measured highlighting of the
touch gesture. Looking to the calming touch gesture for inspiration, the most compelling
time to trigger sound was not when the arc was selected but when the touch was released.
Designing the sound this way created the sense of surprise experienced in my initial
Fourface encounter and reinforced the emphasis on slowing down the touch gesture and
being mindful.
To develop a sense of surprise, a number of different sounds would be triggered
with each touch gesture. Searching for meditative bells, and then gongs and singing
bowls, 20 recordings were chosen as candidates for the alpha test. As sounds were
inserted into the project, varying durations were tested. The combination the designer perceived as the most harmonious and surprising was three gongs and one singing bowl (See Figure 7).
Cylinder        Name                                       Duration   Hz    dB (-)
Cylinder Zero   8_20780_digriffin_zen-gong.wav             0:06       325   36
Cylinder One    7_23541_loofa_thai-gong3.wav               0:51       276   24
Cylinder Two    4_24623_anamorphosis_gmb-kantilan-3.wav    0:05       754   22.6
Cylinder Three  77_48325_monkay_singingbowl.wav            1:00       750   26.1
(Figure 7: Audio analysis summary of duration and Hz)
The method used to arrive at the fundamental frequency listed in Figure 7 was a
spectrum analysis. The spectrum analysis, the results of which can be seen in Figures 8
through 11, was conducted using the sound editing and analysis program Audacity. The
fundamental frequencies of these sounds (as measured in hertz) are significant because
they represent the base frequency for the harmonics of those sounds. When sounds have fundamental frequencies close enough to each other, they tend to blend harmoniously. The harmony created when sounds share a similar fundamental frequency, as the sounds do in Figures 8 and 9, and Figures 10 and 11, is often pleasing to the listener.
The frequencies shared by Figures 10 and 11 are more similar than those shared by
Figures 8 and 9. When considered in light of the meditative-like touch gesture, an
interactive composition system emerges where users generate sound based on an
algorithmic design. What is special about the creation of such a design is the integration
of the touch gesture with both the harmony and duration of each sound.
(Figure 8: Cylinder Zero: 8_20780_digriffin_zen-gong.wav, spectrum analysis using
Audacity)
(Figure 9: Cylinder One: 7_23541_loofa_thai-gong3.wav, spectrum analysis using
Audacity)
(Figure 10: Cylinder Two: 4_24623_anamorphosis_gmb-kantilan-3.wav, spectrum
analysis using Audacity)
(Figure 11: Cylinder Three: 77_48325_monkay_singingbowl.wav, spectrum analysis
using Audacity)
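The peak-picking step behind Figures 8 through 11 can be approximated in code. This is not the Audacity analysis itself but a minimal sketch of the same idea, assuming NumPy is available and substituting a synthetic decaying tone for the actual gong recordings.

```python
import numpy as np

def estimate_fundamental(samples, sample_rate):
    """Estimate a sound's fundamental frequency from its magnitude spectrum.

    A simplified stand-in for the spectrum analysis done in Audacity:
    take the FFT of the recording and return the frequency of the
    strongest peak. Real gong recordings have rich partials, so the
    plotted spectrum would be inspected by eye; taking the global peak
    is an approximation.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Synthetic stand-in for the 325 Hz "Cylinder Zero" gong: a decaying
# 325 Hz tone with a quieter partial one octave up.
rate = 44100
t = np.arange(0, 2.0, 1.0 / rate)
tone = np.exp(-t) * (np.sin(2 * np.pi * 325 * t)
                     + 0.3 * np.sin(2 * np.pi * 650 * t))
print(round(estimate_fundamental(tone, rate)))  # 325
```

With two seconds of audio at 44.1 kHz, the FFT bins are 0.5 Hz apart, so the peak lands on the 325 Hz bin exactly.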
The durations of each sound were chosen based on a belief that symmetry would mirror the way users were expected to touch the circles. The idea was that most touch gestures would reverse highlight either the center circle and first arc, or all four arcs. As the goal was not only surprise but also
meditation, the intent was for at least one of the sounds associated with one ring on each
touch gesture to remain sounding after the touch gesture was completed. To accomplish
this, sounds of shorter duration for the center and second arc and of longer duration for
the first and third arcs were chosen. It was also important for each set of rings to be
harmonious. To accomplish this, similarly pitched sounds were chosen for the center and
first arc as well as the second and third arcs.
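The ring-to-sound assignment above can be sanity-checked with a small sketch. The durations and frequencies come from Figure 7; the cutoff values in the two helper functions are my own assumptions, not thresholds stated in the thesis.

```python
# Figure 7 data: duration (seconds) and fundamental (Hz) for each ring.
cylinders = {
    "center": {"duration": 6,  "hz": 325},  # Cylinder Zero: zen gong
    "first":  {"duration": 51, "hz": 276},  # Cylinder One: thai gong
    "second": {"duration": 5,  "hz": 754},  # Cylinder Two: kantilan
    "third":  {"duration": 60, "hz": 750},  # Cylinder Three: singing bowl
}

def lingers_after_release(a, b, threshold=10):
    """True if at least one sound in the pair keeps ringing well after
    the gesture ends (the threshold in seconds is an assumed cutoff)."""
    return max(cylinders[a]["duration"], cylinders[b]["duration"]) > threshold

def similar_pitch(a, b, tolerance=60):
    """True if the pair's fundamentals are close enough to blend
    (the tolerance in Hz is an assumed cutoff)."""
    return abs(cylinders[a]["hz"] - cylinders[b]["hz"]) <= tolerance

# A short gesture (center + first arc) and a full gesture (all four
# rings) each pair a short sound with a long one...
assert lingers_after_release("center", "first")
assert lingers_after_release("second", "third")
# ...and each harmony pair stays close in pitch.
assert similar_pitch("center", "first")   # 325 vs 276 Hz
assert similar_pitch("second", "third")   # 754 vs 750 Hz
print("ring assignment satisfies both design goals")
```

Alternating short and long durations guarantees a lingering tail on every gesture, while the pitch pairing keeps each set of rings harmonious.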
First Digital Prototype Testing
To find out how users would interact with the application, seven people were
observed and videotaped interacting with the application for several minutes. The users
were between the ages of 22 and 50 and had varying experience levels using touch
computing. I was interested in how these users would interact with the rubber-band touch
mechanic and whether the touch, color, shape, movement, and sound of the application
would lead to a meditative experience. All users were instructed to use headphones and to
“touch from center,” but no additional instruction or background regarding the purpose of
the application was provided.
(Figure 12: Screenshot from TrueMonk user test of digital prototype one)
Proof of concept came early. During a number of play tests, users reported having
meditative experiences. The meditative experience with the application consisted of the
following elements: indirect touch, indirect sound, and indirect color. The rubber-band
touch gesture was unfamiliar to all users regardless of age or experience level. The
indirect touch created by this unfamiliar gesture was observed in the way users would
contemplate the workings of the gesture and the resulting sound. One user, who was
familiar with both the use and creation of touch interfaces, wondered whether velocity
was being used to generate sound. No users were able to identify how many different sounds were being triggered by their touch gestures. Only one user checked for potential functionality involving the accelerometer.
Feedback from the test revealed an organic desire by users to employ the
application as a meditation device. Other important feedback included concern over a
perceived lack of control over sound. And some users desired additional activities for the
application.
Second Digital Prototype
For the next prototype, a point system was implemented, with a focus on
expanding the user experience of the application. The point system had three objectives.
The first was to calculate the number of touch gestures made by each user. The point
system measured both the total number of touches and the number of touches of each arc.
The second objective of the point system was to use the number of touches to provide
users with visual feedback. The application was developed to display different visual
elements in the form of various digital numbers, text colors, arc colors, and background
images. The third purpose was to test whether the addition of lower tone sounds would
add depth to the experience.
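The counting objectives of the point system can be sketched simply. This is a hypothetical illustration of the tally described above — total touches plus per-arc touches driving visual feedback — and its names are illustrative, not taken from the actual implementation.

```python
from collections import Counter

class PointSystem:
    """Sketch of the second prototype's point system: it tallies the
    total number of touch gestures and the touches landing on each
    arc, so the counts can drive visual feedback."""
    def __init__(self):
        self.per_arc = Counter()

    def record_touch(self, arc_name):
        self.per_arc[arc_name] += 1

    @property
    def total(self):
        return sum(self.per_arc.values())

points = PointSystem()
for arc in ["center", "first", "center", "third"]:
    points.record_touch(arc)
print(points.total)              # 4
print(points.per_arc["center"])  # 2
```

Both counts were then mapped to on-screen elements such as digital numbers, text colors, arc colors, and background images.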
Second Digital Prototype Testing
In testing the second digital prototype, the goal was to observe how the
modifications affected the user experience. My testing methodology was to observe the
same users who participated in the first play test. I wanted to see how the users would
interact with the point system and whether it would adversely affect the meditative
experience. The users were instructed to use headphones and to “drag from center.”
Again, no additional instruction or background regarding the purpose of the application
was provided.
The results indicated many of the visual elements adversely affected the
meditative experience. Users were alarmed by the point display. They also reported
feeling disrupted by the appearance of text. The participants made no connection between
their touch actions and the appearance of different color text, numbers, or backgrounds.
In short, giving the users more to do was not making the application more meditative.
(Figure 13: Screenshots from TrueMonk user test of digital prototype two)
As a result of this play test, the new text, numbers, arc colors, and backgrounds
were removed -- except for the white sky background image.
(Figure 14: Screenshot from TrueMonk user test of digital prototype two)
Third Digital Prototype
In this prototype, additional touch gesture functionality was tested. My focus was
to expand the user experience. Results from the first user test indicated some users
desired additional things to do, but the previous attempt at providing this experience had
failed. The goal was to explore the potential of the new gesture functions: pinch and
zoom, multi-touch, and drag. Pinch and zoom was implemented to give users an
experience of a larger world. Users would be allowed to zoom into arcs or zoom out for
multiple arc objects. After adding pinch and zoom, however, it was clear that zooming in
was confusing and unintuitive. The application was initially designed to be smaller than
the screen size of the iPad. When zoom was enabled, it broke the meditative agency of
the rubber-band mechanic.
Conversely, zooming out gave the user a window into a larger world. This prompted the question of what there was to do in that larger world. The answer involved multi-touch and drag.
(Figure 15: Screenshot from TrueMonk digital prototype three – pinch to zoom)
Multi-touch was added to allow users to use multiple objects to create sound. The
goal of adding multi-touch and drag was to allow users to place additional objects around
the world, which would affect the triggering of sounds. Adding multi-touch and drag was
problematic. When zoom was enabled, multi-touch would not work. The two functions
were mutually exclusive. Depending on the placement of objects, when one intended to pinch and zoom in or out, objects would often move. As neither multi-touch nor drag was found to add anything to the core experience of meditation, they were removed. It is important to note this prototype was not tested with users, and I intend to return to these functions in future iterations.
(Figure 16: Screenshots from TrueMonk user test of digital prototype three – pinch to
zoom)
Fourth Digital Prototype
The application was redesigned for use as an attention-training experience. The
goal of the second and third prototypes was to extend the experience beyond the
meditative experience of the first prototype, while the goal of this fourth prototype was to
incorporate the previous prototypes into an attention training-inspired meditation
application.
My focus was on integrating the meditative experience of the first prototype into a
more structured experience. Rather than enabling users to explore a meditative world
through zoom, an infinity pool-inspired area was added below the gong-like circles. To
train users to exercise sustained attention on objects, and selective attention to objects,
multi-touch and drag were brought back to this fourth version of the application.
This application is currently in development. It is envisioned to provide additional
challenge to users, increasing their open meditation abilities. Such a design will need to
enable users to attend to any of the system’s sounds or touch objects among multiple
sounds and touch objects. This would be different from existing rhythm games in which
attention outcomes are binary. It is proposed that such a system would need to operate as
a two-step process. Step one would be an open meditation experience, such as Bloom or the first TrueMonk digital prototype, where users are encouraged to find a user-generated sound and/or a shape representing a sound. Step two would then give the user enough agency to choose new sounds and/or sound shapes to focus on.
Conclusion
The investigation of prior art for TrueMonk revealed the use of varying
combinations of touch, color, shape, movement, and sound to exercise attention. Music
applications were found to direct attention by using sound and movement. Rhythm games
were found to primarily use shape and movement to direct attention. By revisiting prior
art, and contextualizing different types of meditation applications in terms of emotional
regulation, the author gained insight into how research in social psychology can be used
to inform design.
The development process explored how touch, color, shape, movement, and
sound could be combined to direct attention. An innovative touch gesture was designed
integrating touch and sound. Development of a fourth prototype is ongoing. It is hoped
that a process of borrowing elements from previous prototypes will continue to inform
efforts to create an interactive meditative application.
Bibliography

Gross, James J. “Emotion and Emotion Regulation.” In Personality Processes and Individual Differences. http://spl.stanford.edu/pdfs/Gross_08_Chapter.pdf

Lutz, Antoine, Heleen A. Slagter, John D. Dunne, and Richard J. Davidson. “Attention Regulation and Monitoring in Meditation.” Cognitive-Emotional Interactions. Waisman Laboratory for Brain Imaging and Behavior, Department of Psychology, University of Wisconsin.

Wadlinger, Heather A., and Derek M. Isaacowitz. “Fixing Our Focus: Training Attention to Regulate Emotion.” Personality and Social Psychology Review 15 (2011): 75.