SOLIDSPACE: A DEVICE FOR REDESIGNING PERCEPTION

by Bryan Patrick Jaycox

A Thesis Presented to the FACULTY OF THE USC SCHOOL OF CINEMATIC ARTS, UNIVERSITY OF SOUTHERN CALIFORNIA, in Partial Fulfillment of the Requirements for the Degree MASTER OF FINE ARTS (INTERACTIVE MEDIA)

May 2010

Copyright 2010 Bryan Patrick Jaycox

Epigraph

"This was supposed to be the future. Where is my jetpack, where is my robotic companion, where is my dinner in pill form, where is my hydrogen-fueled automobile, where is my nuclear-powered levitating house, where is my cure for this disease?" -Unknown

Dedication

For all my family, friends, and professors who inspired me throughout these three years.

Acknowledgements

Advisors
MARK BOLAS (Chair), Associate Professor, University of Southern California
ANDREAS KRATKY, Assistant Professor, University of Southern California
JOSEPH GARLINGTON, Vice President, Interactive Projects, Walt Disney Imagineering
CHRIS ULLRICH, Sr. Director of Research, Immersion Corporation

Team Members
ANDY UEHARA, HYUNG GYU OH, EDMOND YEE, BILL GRANER, SEAN PLOTT, JACOB BOYLE, RAMY SADEK

Special Thanks
ANN PAGE, FOX INTERACTIVE, USC ANNENBERG FELLOWS

Table of Contents

Epigraph
Dedication
Acknowledgements
List of Figures
Abstract
Keywords
1. Project Description
2. Concept/Overview/Objective
   What is "Redesigning Perception"?
   Possible Venues
3. Prior Art
   Haptics – Commercial
   Haptics – Academic
   Implants/Prosthetics/Cybernetics
   Artistic/Aesthetic
4. Conceptualization and Development Process Documentation
   Preceding Research
   Early Conceptualization
   Production
   Iteration
   Other Devices
5. User's Experience Description
6. Evaluation Scenarios and Methods/User Testing
7. Discussion: Innovative Contribution & Implications
8. Conclusion: Summary of Key Points, Suggestions for Additional Work
9. Contributors
10. Glossary
Alphabetized Bibliography

List of Figures

Figure 1: Early conceptualization of the SolidSpace Glove
Figure 2: High concept of the SolidSpace Glove
Figure 3: The Novint Falcon
Figure 4: SensAble Technology's Phantom Omni
Figure 5: TN Games 3rd Space FPS Gaming Vest
Figure 6: Immersion Corp's CyberForce Haptic Hand
Figure 7: University of Tokyo, Shinoda Lab's Touchable Holography
Figure 8: Forschungszentrum Karlsruhe's Tactile Display
Figure 9: Bauhaus-Universität Weimar's shape memory alloy thimble
Figure 10: Oscar Pistorius using Össur's Cheetah prosthetic legs
Figure 11: Cochlear (left) and vision (right) implants
Figure 12: MIT Media Lab's Sixth Sense
Figure 13: Cyberneticist Kevin Warwick
Figure 14: New media artist Danielle Wilde
Figure 15: First prototype of the virtual collision motor
Figure 16: (left) Initial proof of concept; (right) ultrasonic sensor mapping to a single finger with soft haptic feedback arm
Figure 17: (left) Daisy-chained commanded-loop ultrasonic array wiring; (right) ultrasonic array testing
Figure 18: Testing command of multiple stepper motors in series and in parallel
Figure 19: (left) Maya mesh of the chassis design; (right) unassembled 3D printout
Figure 20: Chassis resting on the hand
Figure 21: The completed initial prototype

Abstract

The Redesigning Perception Project is an attempt to create non-invasive wearable technologies that expand aspects of our human perceptual systems.
Our perceptual systems are the gateway to our understanding of the information of the universe, so by expanding our potential to absorb information from the world around us we in turn expand our potential to grow as individuals and as a species. By intelligently analyzing the current limitations of our perceptual systems and removing these barriers through technology, we can surpass the genetic constraints passed down to us through our ancestors and perceive the world in a new and unique way. The SolidSpace Glove is the first of these devices: a haptic feedback device that allows the user to touch at a distance. By sampling the space in front of the user with ultrasonic rangefinders and mapping these contours to the user's fingertips, the user can feel without physical contact.

Keywords

SolidSpace, redesigning perception, touch at a distance, haptic, perceptual, sensory, non-invasive, cybernetics, sensory gateway, touch, wearable technology, wearable, soft haptic feedback, micropiston, glove, ultrasonic, USC, Interactive Media

1. Project Description

In the dead of night a platoon of marines advances across a field under the pitch black of a new moon. The night is too dark for the enemy to see anything a mere five feet ahead of them, but the marines march forward as if they were bathed in the sunlight of high noon. Night vision forever changed the face of the battlefield by redesigning the vision of everyday soldiers. Exploiting the human inability to see light below a given intensity threshold, engineers designed these devices to enhance the human visual system and give soldiers an edge over their competition far beyond their mere skill in firing a weapon. By redesigning the way we see, widening our threshold for viewing visible light, we pushed ourselves past the constraints of our genetic code, and even past our ability to learn and adapt to new situations. We actively redesigned ourselves as a species to grow and become more than our mere genetics.

This thesis project is a device that attempts to do just that for the sensation of touch. It is a way of redesigning touch sensation to allow users to "feel at a distance". It carries with it the higher goal of promoting thinking about how we can use non-invasive, wearable technologies to actively pursue devices built with the specific purpose of redesigning perception. Just as our human vision is restricted to a narrow band of visible light at a specific intensity range, our human sense of touch is limited by its inability to feel objects past the reach of our physical body. This device stretches that ability by allowing us to project our sense of touch across the empty space of a room and feel the contours of an object across this gap.

The SolidSpace Glove itself is a device worn around the wrist. It is built first and foremost as an active haptic feedback device that provides force feedback at each of the fingertips to give the sensation that your fingers are pressing against a physical object. Its secondary purpose is to redesign the human perception of touch, allowing users to feel the contours of a physical space from a distance: the glove scans the space in front of it, creates a rough depth map of that space, and feeds this data back to the glove, which then renders a rough facsimile of these contours as haptic feedback at the user's fingertips.

This device for haptic feedback is novel in several ways that should help it succeed where other haptic gloves have failed. All motion in the glove is grounded at the wrist, so none of the movement of the device is felt in the hand except for the haptic force feedback it is creating at the user's fingertips. The glove also makes use of soft haptic feedback. Rather than trying to create a solid, immovable surface, the soft haptic feedback of the glove provides pressure feedback to the fingers in an amount proportional to the distance of an object. As an object approaches the glove the pressure applied to the fingertips increases; as it recedes, the pressure decreases until no contact is made with the finger. A minimal sketch of this mapping appears below.
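As a concrete illustration of that proportional mapping, here is a minimal Arduino-style C++ sketch. The range, travel constant, and function name are illustrative assumptions, not the glove's actual firmware.

```cpp
// Illustrative sketch of the soft haptic mapping: fingertip pressure rises
// as an object nears the glove and falls to nothing beyond a maximum range.
// MAX_RANGE_IN, MAX_STEPS, and the function name are assumed for illustration.
const float MAX_RANGE_IN = 60.0;  // assumed maximum sensing range, in inches
const int   MAX_STEPS    = 200;   // assumed full travel of a leaf extension

// Map a rangefinder reading to a leaf-extension position: a distant object
// leaves the leaf clear of the finger; a near one presses it harder.
int distanceToLeafSteps(float distanceIn) {
  if (distanceIn >= MAX_RANGE_IN) return 0;            // out of range: no contact
  float pressure = 1.0 - (distanceIn / MAX_RANGE_IN);  // 0..1, rising as the object nears
  return (int)(pressure * MAX_STEPS);                  // steps to advance the leaf
}
```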
Figure 1: Early conceptualization of the SolidSpace Glove.

The bulk of the device is slung from the shoulder to displace as much weight as possible away from the user's hand. The stepper motors and battery packs slung over the shoulder control a device worn on the hand through a series of bicycle cables that run down the length of the arm. A set of springy leaf extensions driven by these stepper motors comes up from the body of the device to make contact with each of the fingertips, providing a soft surface to push against that virtually simulates pressing against an object in the real world. The springy leaf extensions provide soft haptic feedback to the user's fingers, giving the user a range of pressures from as light as the touch of a feather to as hard as a solid wall. This gives the user a sense of the world in front of him such that objects fade into the distance as they recede from the glove, which lets the glove render not only the contours of a room but also the absolute distances between the user and the objects around him.

A set of ultrasonic proximity sensors is turned outwards to read the contours of the physical space in front of the glove. Each outward-facing proximity sensor on the leaf extensions determines the distance to the nearest object in front of it and feeds this data to the glove. This distance is mapped to the haptic virtual collision surface contacting the user's fingers, allowing the user to feel the contours of the object in front of them.

Figure 2: High concept of the SolidSpace Glove.

The ultimate goal of this specific sensory redesign project is to see what it feels like to physically feel the shape of a space around you, breaking away from the standard media sensations of video and audio to attempt to visualize spatial structures at a distance solely through touch.

2. Concept/Overview/Objective

This project redesigns our capabilities of perception to reach past the constraints of our current sensory systems and experience the world more efficiently, with a new richness beyond what is felt by modern mankind. We can only evolve so far and so fast within our limited lifespans. The only way to push ourselves past the physical constraints of our inherited genetic code is to intentionally redesign ourselves to take in and respond to the world more effectively.

What is "Redesigning Perception"?

Redesigning perception is more than just the simple remapping of variables to a visual system. Remapping perception, such as that done in scientific visualizations, simply takes a new variable and translates it into a spectrum that our perceptual systems can understand so that we can better interpret a set of data. This is useful, as it allows us to interpret datasets in a more efficient way that utilizes the strengths of our sensory systems, but it is not technically expanding our core sensory functionality.
Designing perception as interface, for instance a heads-up display (HUD), is also distinct from the idea of redesigning perception. When you design a HUD, you are designing the interface around the functionality of the machine rather than around the functionality of the human. The machine is designed first, then the HUD to interface to the machine, and the human is expected to learn the HUD. This is much different from the process of redesigning perception, which begins with understanding how the human senses work, designing technology to enhance those senses, and designing a machine shell that embodies these human sense amplifications.

Redesigning perception is neither the arbitrary remapping of data points to the sensory systems, nor designing perception as an interface. Perceptual redesign is remapping the sensory systems in a calculated way to extend human capabilities without limiting the existing capabilities of that perceptual system. This serves a higher goal of allowing humans to function at a higher level as a creature. It is effectively conscious self-reinvention, guided evolution.

Possible Venues

There are a number of venues in which redesigned perception has applications. The most direct would be the military. Any augmentation of the standard foot soldier's abilities that gives him a slight advantage over his enemies on the battlefield is desired in war. Just as night vision goggles gave a huge advantage to soldiers, augmenting their sensory systems in other ways could do the same: a smart haptic system that gives soldiers advance warning that they are about to be fired on and from what direction, a vision system that bends their field of view to let them see everything around them in almost a full 360-degree arc, or a guidance system for smart bullets that removes the need to even sight a target down your gun.

There are also applications for those with disabilities. Just as prosthetics for legs, sight, and hearing now bring back a small part of sensory systems that have been lost, a SolidSpace haptic glove could be used as a tool for the blind, allowing them to "see" the space in front of them through touch by simply holding their hand out and feeling the contours of the world.

In an industrial setting, vision redesign could allow engineers to analyze a mechanical construction from multiple viewpoints simultaneously, seeing the device they're working on in an almost cubist fashion. Redesigning vision to detect a wider spectrum of phenomena could allow a mechanic to troubleshoot a broken piece of machinery based on stresses and fractures revealed by the redesigned vision system.

There is purely artistic potential as well. Simply the ability to experience the world in a new and unique way, as an experience guided by an artist, would make an impressive gallery exhibition. The ability and manner in which people choose to shape our sensory processes could become an art unto itself.

3. Prior Art

Prior art has been developed in a variety of fields thus far. Haptics has recently been emerging as a promising field for applications ranging from the entertainment sector to CAD development, and as a result many new devices have been appearing in both the commercial and academic sectors in the hopes of being the first to capture a wide audience and capitalize on this boom.
A number of efforts in the fields of medicine have been pushing to create new and novel interfaces for our perceptual systems as replacements for our own damaged organs, but this development is also pushing in ways that may soon surpass our god-given abilities. Many artists are also currently exploring perception, interaction, and movement as a means of interfacing with technology in a more naturalistic way.

Haptics – Commercial

There has already been quite a bit of research into haptic interfaces, the most notable of which is the Novint Falcon. The Novint Falcon, pictured below, provides three-degree-of-freedom haptic feedback for a single collision point. It has been used in CAD design, 3D modeling, and medical applications such as motor rehab training, and it has recently been released with a new pistol grip for first-person shooter video games.

Figure 3: The Novint Falcon.

SensAble Technology's Phantom Omni is a similar stylus-driven device and a popular choice for artists. It provides haptic feedback along its boom arm and is typically used for 3D sculpting, since its stylus functions much like that of a Wacom tablet. It provides single-point haptic feedback at the tip of the pen, as if you are poking at a virtual surface.

Figure 4: SensAble Technology's Phantom Omni.

TN Games' 3rd Space FPS Gaming Vest is a different approach to haptic feedback, aimed at the consumer-level gamer market. Rather than a pinpoint haptic display targeting artists or niche markets, the FPS Gaming Vest is a pneumatically driven, distributed haptic system. Eight small rubber pistons are distributed across the chest and back, allowing you to localize virtual bullet impacts in a video game through touch alone.

Figure 5: TN Games 3rd Space FPS Gaming Vest.

Immersion Corp's CyberForce Haptic Hand is a full-hand haptic glove with vibrotactile feedback motors. Using it, you can physically manipulate virtual objects in space by grasping them and moving them around. Each of the fingers is physically attached by a set of wires to the assembly that drives their force feedback, meaning you feel the glove and the resistance from the wires even when they are not generating force feedback for the hand.

Figure 6: Immersion Corp's CyberForce Haptic Hand.

Haptics – Academic

"Touchable Holography" (http://www.alab.t.u-tokyo.ac.jp/~siggraph/) is an interesting application of ultrasonic transducers to generate a localized sensation of touch. At its core, the ultrasonic transducers create a small air pressure wave that is felt by the hand but is driven at a frequency outside the human hearing spectrum. Two infrared cameras triangulate the position of the hand by tracking a small reflective band worn around the finger, ensuring that the 2D ultrasonic transducer array only fires when the hand collides with a virtual object in 3D space.

Figure 7: University of Tokyo, Shinoda Lab's Touchable Holography.

The "Tactile Display" of Forschungszentrum Karlsruhe provides spatial tactile feedback from more than one point to simulate a physical surface that you can run your finger across. This is done using a pin system driven by a series of servo motors. It creates a mutable surface with a relatively good degree of fidelity, but the grain of the surface (determined by the number of pins) and the active area of the surface are both limited by the physical size of the servo motors driving the individual pins.
Figure 8: Forschungszentrum Karlsruhe's Tactile Display.

Another method for tactile feedback, developed at the Bauhaus-Universität Weimar, is a shape memory alloy thimble. These thimbles are worn over the fingers and provide a slight sense of pressure to the fingertips by passing a small current through shape memory alloy wires stretched across the finger's surface. This causes the SMA wires to contract and press on the pads of the fingers.

Figure 9: Bauhaus-Universität Weimar's shape memory alloy thimble.

Implants/Prosthetics/Cybernetics

In other fields there has also been research related to the idea of redesigning perception. Night vision goggles and thermal goggles are excellent examples of ways in which we have already redesigned our sense of vision to some extent. Other augmentation systems are used primarily as a means of replacing abilities that have been lost, rather than as a means of augmenting our existing abilities. One excellent example is Össur's Cheetah prosthetic legs used by Paralympic champion Oscar Pistorius. There is some controversy over whether these prosthetic limbs not only bring him up to par with non-amputee athletes, but may even give him an unfair advantage over them.

Figure 10: Oscar Pistorius using Össur's Cheetah prosthetic legs.

Other prosthetic devices include cochlear implants and vision prosthetics for replacing hearing or vision in the deaf and blind. At the moment these devices have fidelity nowhere near that of natural sight and hearing, but someday they may equal or even surpass the abilities of our natural senses, opening up a huge range of possibilities for redesigned electronic sensory perception previously imagined only in science fiction.

Figure 11: Cochlear (left) and vision (right) implants.

Artistic/Aesthetic

Several artists have also experimented in the realm of directly augmenting our perceptions of reality. The "Sixth Sense" project developed at the MIT Media Lab (http://www.pranavmistry.com/projects/sixthsense/) is an augmented reality device worn around the neck that overlays interactive digital information on the real world. It functions much like a framework for a new type of augmented reality system in which the virtual objects are not transparent overlays on the real world, but are projected onto it.

Figure 12: MIT Media Lab's Sixth Sense.

Some artists and scientists have also explored ability augmentation through technology by interfacing directly with the machine. For instance, Kevin Warwick, a professor of cybernetics at the University of Reading, implanted himself with a silicon chip transponder that allowed him to walk the halls of his research lab and have doors, lights, and other devices activate based on his proximity to them.

Figure 13: Cyberneticist Kevin Warwick.

Other artists have explored altering our perceptions through interfaces that force us to interact with the world in new and interesting ways. Danielle Wilde has built a variety of such devices, including projects such as the "Hip Disk", a musical interface that can only be activated by bending the abdomen in ways we are not usually accustomed to. With this device you are not simply making music; you approach a sense of embodying the music you are creating by reflecting it in your own body shape.

Figure 14: New media artist Danielle Wilde.

Although diverse, all of these projects are connected in their attempts to rework who we are as human beings and how we interface with the world.
In this day and age we are free to break out of the realm of the mouse, keyboard, and monitor, to interact with the world in a more natural fashion using full-body motion, and to receive input along all of our natural sensory paths. The virtual no longer needs to be a realm unto itself; it can now be an experience seamlessly integrated into our real-world experience.

4. Conceptualization and Development Process Documentation

Preceding Research

Much of the initial concept for this thesis project arose out of projects I had pursued in my earlier studies at USC. In the year directly preceding my thesis year, I experimented with a variety of haptic feedback approaches in my work with the INIT Immersive Technologies Group under Professor Mark Bolas. I began that year by designing a simple device that taps a user on the shoulder when triggered by a virtual character tapping them on the shoulder in the Panda3D game engine. This was shortly followed by a subtle mixed reality device that uses Peltier junctions to manipulate a user's perception of the ambient temperature of the room around them, making it feel as if the entire room has suddenly become warmer or cooler.

Throughout all of these experiments, however, the project I truly wanted to create was a device that produced high-fidelity active haptic feedback in a one-to-one sense. In the INIT Lab I explored several designs for active haptic feedback in which a user can grab onto a solid object that amorphously changes into different forms based on the shape of the virtual objects it is simulating. I developed several concepts, including a shape-changing wall that was the seed of the idea for the micropistons that currently bridge the gaps between the SolidSpace Glove's leaves. At this point the project was conceptually heading towards a full-body haptic experience of virtual objects, but it eventually settled on the current design for its simplicity and its ability to simulate a grip on virtual objects. Since the hands are one of the most concentrated regions for our perception of touch, it seemed sufficient to focus on creating the device for the hands at this time.

This haptic glove would actively generate physical contours on the user's fingertips to serve as haptic representations of objects in a virtual world. The idea was to have the leaf extensions of the glove gracefully floating about a quarter inch away from the fingertips at all times until a virtual collision in the world was detected, at which point the motor would seize up and create a static surface for the finger to push against. In this concept, the ultrasonic sensors actually pointed inwards towards the user's fingertips, and the proximity detection done by these sensors was what kept the leaf extensions floating a fraction of an inch in front of the user's fingers. The benefit of such a system is that it is a contactless haptic system; whenever the user is not touching anything in the virtual world, he is touching nothing but air in the real world.

Figure 15: First prototype of the virtual collision motor.

To these ends, in late February 2009 I prototyped a simple stepper motor mechanism (pictured above) that simulates virtual haptic collisions and served as an early prototype for testing the functionality of the leaf extension mechanism. This prototype uses a physical switch in place of the proximity sensors to detect when the finger is too close to the leaf extension. When the user pushes against it, the leaf extension provides little to no resistance, but when the leaf extension detects a virtual collision, the stepper motor locks in place, providing a solid surface for the finger to push against. A toy sketch of this behavior appears below.
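The following is only a toy Arduino-style sketch of that lock-or-yield behavior, using the standard Stepper library under assumed pin assignments; it is not the original prototype's code.

```cpp
// Toy sketch of the virtual collision motor: a physical switch stands in for
// the proximity sensor, and the stepper either backs the leaf away (little to
// no resistance) or holds its coils energized (a solid surface). Pins and
// wiring are assumptions for illustration.
#include <Stepper.h>

const int SWITCH_PIN = 2;              // contact switch at the leaf extension
Stepper leafMotor(200, 8, 9, 10, 11);  // 200 steps/rev, assumed driver pins

bool virtualCollision = false;         // would be set by the simulation on a hit

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);   // switch closes (reads LOW) on contact
  leafMotor.setSpeed(60);
}

void loop() {
  bool fingerContact = (digitalRead(SWITCH_PIN) == LOW);
  if (!virtualCollision && fingerContact) {
    leafMotor.step(1);  // retreat from the finger, offering almost no resistance
  }
  // On a virtual collision we simply stop stepping; the energized coils hold
  // the shaft in place, giving the finger a static surface to push against.
}
```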
This prototype actually brought to light the idea of soft haptic feedback. The arm on the virtual collision motor was built out of foamcore as a quick prototyping material, but in doing so I realized that a slightly spongy feel from a haptic feedback device produces a really interesting effect. This little bit of give provides a sense of slight pressure that lets you interpret proximity and light touch in a manner far superior to that of rigid haptic systems.

In practice, the implementation with the ultrasonic sensors pointing inwards towards the user's fingers was short-sighted for a number of reasons. Testing the ultrasonic sensors made clear that their readings were very sensitive to noise: at best they had a granularity of about half an inch, with fidelity varying across environments. In addition, the sensors have a dead zone of about three inches within which all relative distances are read as the same value. These two functionality issues made it infeasible to use the ultrasonic sensors as a proximity sensor for the hand in the way I had originally planned. Simply moving the motor to the fixed position of the virtual object it was representing served effectively the same purpose as the floating leaves that tracked the finger, although the system as it exists now is not quite as responsive to an object quickly approaching the hand as it could have been with the floating leaf system.

Early Conceptualization

As I proceeded with the haptic glove, it became clear that the project needed something more. Other haptic devices already existed that created high-fidelity haptic renderings of virtual objects. My device had unique aspects that other haptic devices lacked (soft haptic feedback, user-determined contact with surfaces, simultaneous multipoint haptic rendering), but the crux of whether the glove would succeed rested on the experience being conveyed through the haptics. As a media device, the hardware itself has no value if the media it conveys is insubstantial.

I knew that I wanted an experience rooted in the real world, something that would draw people into interacting in real space with the physical humans around them rather than into computers and the virtual. Much of my interest in my studies at USC had also centered on changing the way we as humans view the world. In the INIT Lab I had created a prototype of a warping world in which you could bend your line of sight, and in earlier classes I had done several visual representation projects that attempted to give you a cubist view of real-world spaces, trying to let you view the entire 360-degree space around you at once.

The early concept for redesigning perception came out of these three drives in my research: haptics, drawing people into the real world, and expanding who we are as human beings. I took the ultrasonic sensors that were going to do proximity detection on the fingers and pointed them outward to capture a rendering of the world.
As the glove had become a tool for perceiving the real world, it became important that it be an untethered wearable device, so that someone could wander freely with it through the real world, away from power sources and other technology.

Figure 16: (left) Initial proof of concept; (right) ultrasonic sensor mapping to a single finger with soft haptic feedback arm.

Near the end of the Spring 2009 semester I had for the most part settled on and solidified this concept, and as of September 16, 2009, I had created the first workable prototype of a single leaf extension using ultrasonic rangefinding and soft haptic feedback (pictured above). The distance captured by the ultrasonic rangefinder was mapped to a foamcore arm driven directly by the stepper motor. As I moved my hand towards and away from the ultrasonic sensor, users could feel the pressure through the foamcore change in proportion to the distance, which some described as almost touching a shadow of the object they were remotely sensing. To me it felt almost as if you were feeling the world through a three-inch-thick sponge, and as you approached objects the sponge would slowly compress against your fingertips until you were actually gripping the object itself.

It was this prototype that really drove home the importance of soft haptic feedback for this project. The best way for users to get a true sense of distance through an inherently distanceless sensation like touch is to map it to pressure or another modality. By mapping it in this way, users can get a sense of distance and of approaching and receding objects, not just hard contours.

Production

From this initial prototype I gathered interest from several of the other graduate students in the Interactive Media program at USC, and in early Fall 2009 I started production with a team of four: second-years Bill Graner and Sean Plott, and first-years Andy Uehara and Jacob Boyle.

Based on past wearable technology projects with alumna Diana Hughes, I decided to use the Arduino Mini microcontroller and LilyPad wearable electronics (available on sparkfun.com) as a development platform. These components are rugged and small in form factor, designed specifically for wearable technology implementations. The leaves of the glove are driven by off-the-shelf stepper motors, powered by NiMH power supplies. Rangefinding is done using Maxbotix EZ1 ultrasound sensors, daisy-chained in a commanded loop.

Figure 17: (left) Daisy-chained commanded-loop ultrasonic array wiring; (right) ultrasonic array testing. The array is read by the Arduino, with console output to my laptop.

There were several technical challenges to overcome at the start of this project, especially with the input from the ultrasonic sensors. Early testing made clear that while the sensors had very good granularity and range, the signals received were very noisy and heavily influenced by environmental factors. Since the ultrasonic signal is a sound wave, in loud environments with lots of people the signal had far more reliability issues than in the quiet test environments where I had been working. Also, if the ultrasonic pulse hit two large surfaces at very different depths, the returned reading oscillated between the two values. Finally, there was a huge interference problem, since we were sampling from four different ultrasonic sensors simultaneously. The interference problem was quickly solved by chaining the sensors to fire in sequence in a commanded loop, such that only one ultrasonic sensor is actually sending and receiving a pulse at a time. A minimal sketch of this sequencing, with simple smoothing, appears below.
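This sketch assumes the Arduino strobes each sensor's trigger pin in turn and reads its analog output, so only one pulse is in flight at a time; an exponential moving average stands in for the smoothing algorithms discussed next. Pin numbers, delays, and the filter weight are illustrative, not the actual firmware's values.

```cpp
// Sketch of the commanded loop: trigger one ultrasonic sensor at a time so
// the pulses never interfere, then lightly smooth each reading. All pin
// numbers, delays, and the filter constant are assumptions for illustration.
const int NUM_SENSORS = 4;
const int TRIG_PINS[NUM_SENSORS] = {4, 5, 6, 7};      // assumed trigger pins
const int READ_PINS[NUM_SENSORS] = {A0, A1, A2, A3};  // assumed analog outputs

float smoothed[NUM_SENSORS] = {0};
const float ALPHA = 0.3;  // smoothing weight: smaller = steadier but laggier

void setup() {
  for (int i = 0; i < NUM_SENSORS; i++) pinMode(TRIG_PINS[i], OUTPUT);
}

void loop() {
  for (int i = 0; i < NUM_SENSORS; i++) {
    digitalWrite(TRIG_PINS[i], HIGH);  // command this sensor alone to range
    delayMicroseconds(25);             // brief trigger pulse
    digitalWrite(TRIG_PINS[i], LOW);
    delay(50);                         // let the echo window close before reading
    int raw = analogRead(READ_PINS[i]);
    // Exponential moving average: trades a little lag for a stable signal.
    smoothed[i] = ALPHA * raw + (1.0 - ALPHA) * smoothed[i];
  }
}
```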
The other two issues, unfortunately, were problems inherent to the sensor; they had to be mitigated with a number of smoothing algorithms in code (the moving average above is the simplest such filter), which introduced a bit of lag into the system.

Also, since the signal emitted by each ultrasonic sensor spreads in a cone with a relatively large radius, there was a problem of overlap in the sensors' coverage. If all of the ultrasonic sensors point straight ahead, they cover nearly the same field of view, especially at larger distances. To reduce overlap, the sensors needed to be angled outward. This design actually has an added advantage: it gives the user a perspective view of the world much like our vision, so that closer objects are privileged over objects that are further away.

Figure 18: Testing command of multiple stepper motors in series and in parallel.

A second major hurdle was programming motor control on a single Arduino to drive multiple stepper motors. To drive an individual stepper motor, a pulse width modulation (PWM) signal is sent out from the Arduino to tell the motor the speed and number of steps it should rotate. Controlling these stepper motors in serial isn't too bad: you send a PWM signal to one motor at a time, and when one motor completes its movement you move to the next, cycling through the motors one by one. But for the SolidSpace Glove this was far from ideal; we wanted each leaf of the glove updating in real time to create the illusion of a morphing surface. To pull this off we needed motor control in parallel, but every time you pause to update one motor, the PWM signal for the other motors is disrupted, since it is a time-based encoding. To manage parallel control, the PWM signals had to be interleaved in the code and sent to multiple pins simultaneously, using the delay from the processing of the code itself as part of the delay in the PWM signal. A sketch of this interleaving approach appears below.
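This is a sketch of the interleaving idea under assumed names, using a step-pin driver interface for brevity rather than the direct coil signals described above: each pass of the loop visits every motor and pulses only those whose next step is due, so all leaves appear to move at once.

```cpp
// Interleaved stepping: rather than blocking while one motor finishes its
// move, each pass of loop() pulses every motor whose next step is due. The
// loop's own execution time supplies part of the inter-pulse delay, as in
// the approach described above. Pins and timing are illustrative.
const int NUM_MOTORS = 4;
const int STEP_PINS[NUM_MOTORS] = {8, 9, 10, 11};  // assumed step pins

long stepsRemaining[NUM_MOTORS] = {0};        // set elsewhere from sensor data
unsigned long nextStepAt[NUM_MOTORS] = {0};   // micros() time of each next step
const unsigned long STEP_INTERVAL_US = 2000;  // speed: one step every 2 ms

void setup() {
  for (int i = 0; i < NUM_MOTORS; i++) pinMode(STEP_PINS[i], OUTPUT);
}

void loop() {
  unsigned long now = micros();
  for (int i = 0; i < NUM_MOTORS; i++) {
    if (stepsRemaining[i] > 0 && now >= nextStepAt[i]) {
      digitalWrite(STEP_PINS[i], HIGH);  // rising edge: the driver takes a step
      digitalWrite(STEP_PINS[i], LOW);
      stepsRemaining[i]--;
      nextStepAt[i] = now + STEP_INTERVAL_US;
    }
  }
}
```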
Figure 19: (left) Maya mesh of the chassis design; (right) unassembled 3D printout. Components fused when printed as a single chunk.

The chassis itself was an interesting design challenge as well. I wanted to apply the rapid prototyping and iteration techniques used for game design at USC to the production of physical devices, but this is very difficult given the inherent delays in machining objects. However, the recent abundance and accessibility of 3D printing technology is beginning to change this. Using Maya with a ZCorp 3D printer, I was able to take my chassis design from paper, generate a working 3D design of the device in Maya in one day, and print it the next day, reducing the iterative cycle to only two days. Recently MakerBot has released a 3D printer for only $1000, making it possible for the first time in history for the everyday person to have this manufacturing potential in their own home. The potential of this is amazing! Just as YouTube revolutionized the way we create and distribute video today, this 3D printer has the potential to revolutionize material fabrication, taking it out of the control of corporations and making it something the everyman can produce.

Figure 20: Chassis resting on the hand.

A number of design constraints were taken into consideration when building the chassis itself. It needed to be simultaneously functional, ergonomically comfortable, and aesthetically pleasing. The motors needed to be clustered on the arm as close as possible to the bone and in a tight formation, so that the motor weight would not be felt swinging when the arm rotated. This clustering of the motors offset the fulcrums of the lever arms driving the leaf extensions. The leaf extensions were designed to make contact only at the fingertips of the user's hand, resulting in an arching look that eventually gave rise to the steampunk/street-tech aesthetic of the prototype. The leaf extensions also needed to be designed so that the gaps between them could be bridged to form a continuous surface. To achieve this continuous surface I used what I call micropistons: tiny rod-and-shaft devices constructed from acupuncture and syringe needles that create a surface that is flexible along their length but inflexible when pushed perpendicular to the axis of extension.

Figure 21: The completed initial prototype.

Iteration

User testing at a number of exhibitions revealed several problems with the design of the initial prototype that are resolved in the most recent iteration of the glove. The primary problems noted in the initial prototype were weight and mobility issues. Having the motors mounted on the wrist made users feel as if they were lifting a barbell after several minutes of wearing the device. Also, driving the leaves through a rigid contraption mounted on the wrist meant that the rotation of the wrist was constrained by the bulk of the device. Both of these problems severely hindered use of the device, as most users simply ended up cradling it on their arm as if holding a pet or a small child instead of using it to feel the environment around them as a natural extension of their hand.

To solve this, the most recent design displaces the motors and battery packs to a pack slung over the shoulder. A minimal number of components remain on the hand to cut down weight and increase mobility, and everything that remains on the hand is attached below the wrist so it does not hinder the joint. The leaves of the glove are attached to the palm of the hand and are driven by the displaced motors through bicycle cables.

Another issue was the displacement of the ultrasonic sensors from the fingertips. This was initially done so that the ultrasonic sensors would remain stationary and their readings would not be affected by the movements of the leaf extensions, but it caused a problem for users: they could not relate what they were feeling at their fingertips to the signal being read from their wrist. The new design will mount the ultrasonic sensors on the leaves themselves, which should alleviate this disconnect between what is being felt and what the user is trying to feel.

Other Devices

Over winter break, during a two-week period spanning December 5-16, 2009, I worked with Ramy Sadek of USC's Institute for Creative Technologies and three other students, Andy Uehara, Edmond Yee, and Hyung Gyu Oh, to prototype and experiment with several other devices for redesigning perception. This two-week period, the Immersive Technologies Lab's Winter CrHunch headed by Professor Mark Bolas, gave us an opportunity to access the Institute for Creative Technologies' McConnell facility to do blue-sky research and prototyping outside the concerns of other class deadlines and projects. During this period we prototyped one device for augmenting the human visual system and experimented with another device for manipulating the human hearing spectrum.

The visual piece was a projected augmented reality device. Using a combination of webcam and projector aligned along the same axis, similar to that in MIT's Sixth Sense, we were able to do real-time capture of a real-world space and reproject augmented reality onto that space. The projector would strobe a light so that the webcam could capture an image of the scene, the webcam image would be processed (in this initial prototype, just a simple edge detection pass), and the processed image would be projected back onto the world. The result was essentially a flashlight that outlined the objects it passed over. This first implementation was very simple, but the approach could be used in very powerful ways depending on the image processing routines employed: it could highlight objects in the room that users should pay attention to, or otherwise alter our perception of real-world environments. A rough sketch of the capture-process-reproject loop appears below.
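This OpenCV C++ sketch assumes the projector simply displays a full-screen output window; the prototype's strobing and camera-projector alignment are not reproduced here, and the camera index and thresholds are illustrative.

```cpp
// Rough sketch of the edge-detecting "flashlight": capture a webcam frame,
// run an edge detector, and display the result through the projector. The
// window name, camera index, and Canny thresholds are assumptions.
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cam(0);  // webcam aligned along the projector's axis
  if (!cam.isOpened()) return 1;
  cv::namedWindow("projector", cv::WINDOW_NORMAL);  // shown full-screen on the projector

  cv::Mat frame, gray, edges;
  while (true) {
    cam >> frame;  // capture the real-world scene
    if (frame.empty()) break;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::Canny(gray, edges, 50, 150);  // the "simple edge detection pass"
    cv::imshow("projector", edges);   // reproject the outlines onto the world
    if (cv::waitKey(1) == 27) break;  // Esc to quit
  }
  return 0;
}
```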
The audio piece has not yet reached the prototyping phase, but it experiments with shifting sounds outside our normal human hearing spectrum, in the ultrasonic and infrasonic ranges, into audible frequencies. This would potentially give us the ability to hear the low rumble of tectonic plates or a truck approaching from miles away, and simultaneously hear the voice of a mouse or of electronic devices. A speculative sketch of the simplest version of this idea follows.
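Since this piece is still pre-prototype, the following is only a speculative sketch of the simplest shifting scheme: time-stretching samples captured at a high rate so that every frequency divides by the same factor. All names and rates here are invented for illustration.

```cpp
// Speculative sketch: bring ultrasound into the audible band by time-
// stretching. Repeating each sample 'factor' times at a fixed playback rate
// divides every frequency by 'factor' (e.g., a 40 kHz tone becomes 5 kHz at
// factor = 8), at the cost of slowing the sound down. This is a crude
// zero-order hold; a real implementation would interpolate or pitch-shift.
#include <vector>

std::vector<float> shiftDown(const std::vector<float>& ultrasonic, int factor) {
  std::vector<float> audible;
  audible.reserve(ultrasonic.size() * factor);
  for (float s : ultrasonic)
    for (int i = 0; i < factor; i++)
      audible.push_back(s);  // hold each sample, stretching time by 'factor'
  return audible;
}
// Infrasound would go the other way: decimating (keeping every Nth sample)
// multiplies frequencies up into the audible range.
```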
5. User's Experience Description

As you approach the device, it is inviting as an item to be worn. The clear acrylic plastic of the chassis gives the device an air of futurism, with gentle curves that hint at an organic and natural feel. The chassis is stitched to a black cloth glove, signaling that this is not something to be viewed but an item to be worn and experienced. You place it on your hand, sling the small wired pack connected to it over your shoulder, and flip the switch to turn it on. As it turns on, you hear the gentle whir of motors and notice the slightest bit of movement on your hand, almost like the movements of a small creature, as the device calibrates to the room it is sensing.

As you move your fingertips forward you feel the slightest bit of pressure, like a feather. A subtle, almost unnoticeable movement of your hand causes the device to respond with an even slighter movement of pressure against your fingertips. You sweep your hand across the room and the motors whir, pushing gently against your fingers as you sweep across a nearby pedestal. As you push a bit harder into the glove, shapes begin to emerge from the shadows of pressure you feel against your fingertips. You sweep your hand towards the pedestal again, and you can now feel its near edge forming a corner against your fingertips.

You close your eyes and move your hand across the room, realizing that the subtle movements you had only barely felt before are revealing the shape of the room in front of you. You begin to sense the world not as the colors and lines you had once seen, but as a complex shadowy image of depths receding and proceeding from your fingertips.

6. Evaluation Scenarios and Methods/User Testing

The evaluation of success in this project is a twofold problem. First, the device must be evaluated technically. Does it perform as expected from a mechanical standpoint, and is its haptic rendering an accurate representation of the world it is sensing? Do the ergonomics of the device make it usable as wearable technology from a practical standpoint? Second, the project must also be evaluated from a qualitative standpoint. Does the user's perception of the world around them change after using the device? Has their perception been redesigned, and can they learn to re-perceive the world using this wearable technology as an extension of their body and not simply as a tool?

There were several phases of user testing for the various prototypes developed over the course of this project. In the initial single-finger test on September 16, 2009, the scenario involved me moving my hand back and forth within a range of about three feet of the device while users experienced the change as I approached or receded from the ultrasonic sensor. About ten to fifteen people were run through this demo with varying success. There was still a lot of noise in the system at this point, which made it tough for some to understand what was going on, with the motor occasionally swinging about. Some users would not really connect what was happening at their finger to the input they were receiving. However, many other people who tried it would get that "Aha!" moment of surprise and joy at feeling a push against their finger when they looked up and saw my hand was still about ten inches away.

This playtest revealed several things, both positive and negative, that informed my future designs. The negative was that the noise in the ultrasonic sensor would be a huge problem in future iterations and needed to be ironed out for the next prototype. The positives were the revelation of soft haptic feedback and that the springiness of the leaf extensions contributed to the experience of depth in the world by giving you shadows of approaching and receding objects around you.

For the initial prototype, during the Winter Thesis Show on December 3, 2009, the user testing expanded a bit more. I would move my hand in and out over an individual sensor and across the array of sensors to see if the user would respond to and understand the sensation they were experiencing. Would the user connect what was going on in the real world with what they were feeling at their fingertips? For most users there was again a generally favorable response. Most people would respond with sounds of amazement and say that they could feel the pressure of my hand moving over the sensors. I also had a small 3D-printed sculpture of my head on a nearby table that users could touch with the glove, to see if they could feel the changes in shape when moving their hand above a rounded surface. This test brought a few more problems to light. First, the device felt too heavy and bulky, so most users were hesitant to turn it upside down to feel the sculpture on the table, and even when they did, few users were very excited by the shape they felt, simply because there was only one very ordinary-shaped object for them to feel.
Also, the displacement of the ultrasonic sensors from the fingertips presented a problem: people would scan the glove over the sculpture, only to find that they were not feeling anything because they were trying to scan it with their fingertips, not with the ultrasonic sensors on their wrists. Over the course of the Winter Thesis Show, about forty to fifty people ran through this phase of user testing.

Final user testing will take place as a scripted series of events and environments testing the grain and fidelity of the glove's haptic feedback and the range of positions the glove can detect. It will cover both the general aesthetic experience of what the user felt when trying the device for the first time and the technical specification testing listed above. Future user tests will include distinguishing between different shapes such as a cube and a sphere, edge detection, and determining the distance to an object within a range of 1 to 20 feet.

7. Discussion: Innovative Contribution & Implications

The innovative aspects of this project come largely from its approach to design. It is design with evolution as the goal, to push us forward in the way that we experience the world. It is neither a tool nor a commercial product designed for a singular purpose, problem, or job that needs to be accomplished. It is a project that attempts to redesign the very way in which we see life. It is not simply haptics or a heads-up display trying to convey a false reality imposed on the world; it is a new way to experience the reality that already exists around us. It is not simply a remapping of the senses we already have to a new data set, but an expansion of our experience and of the natural mapping we already possess in our sensory systems.

This project is an extension of the human as wearable technology, bending technology to fit the natural mapping of the human body rather than forcing the human to bend to understand the technology. The intent is that the user can just slip on the device and feel the world in exactly the same way they have felt since they were a baby, but now with added superhuman ability. Designing with humanity at the core, rather than as a tool for a specific purpose, should help to broaden us as people. Cybernetics currently has connotations that evoke images of invasive technology or information overload, much like having a computer overlaid on your vision or a wire sticking out of your arm. This device could change that perspective by allowing people to experience what it is to be superhuman in something they can simply take off at the end of the day. This technology gives people the ability to physically be more than the sum of what they were born with; it is technology designed to be an extension of us as human beings.

I can see this project being used in many future applications: as a device for sight for the blind, a way of augmenting reality for military personnel in the field, an entertainment device for augmented reality games, a platform where the ultrasonic sensors are swapped out to sample different elements of the environment, or simply a new way to see. To be as successful as possible in these applications, several things could be done to harden this technology and bring it to its full potential. The current limiting factors of this device are its weight and the lag/noise in the ultrasonic sensors.
To bring this device to its full potential, several things would need to be developed: stronger, smaller motors; lighter power supplies and chassis materials; and faster ultrasonic sensors with less noise. If these technologies were developed and made practical for a glove of this size, I could see this technology becoming portable enough to be used in a practical everyday scenario.

8. Conclusion: Summary of Key Points, Suggestions for Additional Work

In this paper I explained how the Redesigning Perception Project attempts to take our existing sensory systems and push their boundaries, allowing people to experience the world in a new and unique way. The first and current prototype embodying this ideal is the SolidSpace Glove, a device worn on the hand that uses haptic feedback to convey a sense of touch at a distance. The glove reads the shape of a room using ultrasonic sensors and maps this to the fingertips in a way that conveys the overall shape, position, angles, and depth of objects through pressure. Several other projects are also being explored and prototyped, including a device that allows humans to hear an expanded range of frequencies in the ultrasonic and infrasonic range, and a projected augmented reality device that takes in the visual world, processes it, and re-renders it using a projector registered to the webcam's image.

Through this project I hope to expand the way in which we as humans see and experience the world, as a means of conscious self-reinvention that supersedes the evolutionary systems controlling the way we currently view the world. My hope is that by working in this way we can reanalyze the way we perceive the world and look at it as a design problem rather than a fixed fact, reinventing ourselves in a way that gives us the greatest potential to experience the world around us, rather than being confined to the constraints of our evolutionary ancestry.

9. Contributors

Team Members

A number of students and researchers helped significantly on the development of this project from start to finish. My greatest thanks to all of them for their hard work and their contributions in seeing this project through.

ANDY UEHARA (1st-year M.F.A., Interactive Media)
Andy helped in numerous capacities over the entire course of development for both the SolidSpace Glove and our prototypes during the Winter CrHunch, including team management, contacting technical experts, design, and construction of electronic systems.

HYUNG GYU OH (1st-year M.F.A., Interactive Media)
Hyung was our image processing expert on the projected augmented reality prototype developed during the Winter CrHunch at ICT. He developed the code that interfaced the webcam and projection systems and applied filters to the world in real time.

EDMOND YEE (1st-year M.F.A., Interactive Media)
Edmond did a significant amount of work on hardware and electronics development during the Winter CrHunch at ICT, doing much of the construction of the projected augmented reality helmet along with Andy Uehara.

BILL GRANER (2nd-year M.F.A., Interactive Media)
Bill Graner contributed in a number of ways to the development of the SolidSpace Glove prototype, including brainstorming mechanical systems for driving the leaf extensions, consulting on solutions for mounting the device to the arm, and various production and organizational tasks.
SEAN PLOTT (2nd-year M.F.A., Interactive Media)
Sean was a consultant for the noise cancellation code underlying the initial SolidSpace prototype, brainstorming ideas with me to remove the noise, at times almost 50%, coming from the ultrasonic sensors, and to smooth the signal to a reasonable level.

JACOB BOYLE (1st-year M.F.A., Interactive Media)
Jacob Boyle worked with Andy through much of the initial development of the SolidSpace Glove prototype, helping to design and test many of the electrical systems in the glove.

RAMY SADEK (Audio Design Consultant, USC's Institute for Creative Technologies)
Ramy Sadek acted as an audio consultant on the infra/ultrasound project aimed at expanding our hearing frequency range. He provided expertise on a full range of audio hardware, software, and processing topics.

Special Thanks

ANN PAGE (Adjunct Professor, USC's Roski School of the Fine Arts)
For her generosity with the USC Roski School of the Fine Arts 3D printer.

FOX INTERACTIVE
USC ANNENBERG FELLOWS

10. Glossary

Active Haptic Feedback – Any system that tries to render a physical surface for our sensation of touch by moving components that mimic the normal forces of the objects it represents in the virtual world.

Floating Leaf System – A concept for a haptic system in which contact points float millimeters above the fingertips until a virtual collision is triggered and they make contact with the fingertips. This allows the user to feel nothing but air until a haptic force is applied to the fingertips.

Leaf Extensions – Thin, actuated lever arms that extend to the fingertips to provide points of connection for haptic rendering of a contoured shape.

Micropistons – A mesh of thin wire-in-tube structures that allow freedom of movement in one axis while remaining rigid in the other two axes. This allows for the creation of a morphing surface that provides a solid normal force to push against.

Soft Haptic Feedback – A haptic system that provides a bit of play. Rather than attempting to render everything as a rigid surface, soft haptic feedback allows the user to ease into contact with a surface, allowing the system to convey pressure information as well as position, and overall producing a more natural-feeling contact with a virtual surface.

Subtle Mixed Reality – Mixed reality applications that do not try to draw attention to themselves, but instead try to integrate with the real-world environment in a way that privileges the fusion of virtual and real over the fidelity of the virtual information being displayed.

Alphabetized Bibliography

Iwamoto, Takayuki, Mari Tatezono, Takayuki Hoshi, and Hiroyuki Shinoda. Touchable Holography. 2009. University of Tokyo. 1 April 2010 <http://www.alab.t.u-tokyo.ac.jp/~siggraph/09/TouchableHolography/SIGGRAPH09-TH.html>.

Maxbotix, Inc. "LV-MaxSonar-EZ4 Data Sheet." 2005-2007. Maxbotix, Inc. 1 April 2010 <http://www.maxbotix.com/uploads/LV-MaxSonar-EZ4-Datasheet.pdf>.

Mistry, Pranav. SixthSense. 2009. MIT Media Lab. 1 April 2010 <http://www.pranavmistry.com/projects/sixthsense/>.

Norman, Donald A. The Design of Everyday Things. New York, NY: Basic Books, 1988.

Robbin, Tony. Shadows of Reality: The Fourth Dimension in Relativity, Cubism, and Modern Thought. N.p.: Yale University Press, 2006.

Schiffman, Harvey Richard. Sensation and Perception. New York, NY: John Wiley & Sons, Inc., 2001.

Squire, Larry R., Floyd E. Bloom, Nicholas C. Spitzer, Sascha du Lac, Anirvan Ghosh, and Darwin Berg. Fundamental Neuroscience, Third Edition. Burlington, MA: Elsevier, 2008.
Traylor, Ryan, and Hong Z. Tan. "Development of a Wearable Haptic Display for Situation Awareness in Altered-Gravity Environment: Some Initial Findings." Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. IEEE, 2002. 1 April 2010 <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.8808&rep=rep1&type=pdf>.

Van Erp, Jan B. F., Hendrik A. H. C. Van Veen, Chris Jansen, and Trevor Dobbins. "Waypoint Navigation with a Vibrotactile Belt." ACM Transactions on Applied Perception 2.2 (April 2005): 106-117. 1 April 2010 <http://portal.acm.org/citation.cfm?id=1060585>.

Wilde, Danielle. Daniellewilde.com. 2010. 1 April 2010 <http://www.daniellewilde.com/>.