Re/Locating New Media in the Museum
Nateene Diu
Art and Curatorial Practices in the Public Sphere
Master of Arts
University of Southern California
May 13, 2016
ACKNOWLEDGEMENTS
Thank you to the many—my friends, my family, my cohort—who have selflessly lent their time
and energy in support of both me and my project over the past year. My gratitude to Amelia
Jones for her insight and unwavering encouragement. I cannot begin to thank Karen Moss
enough for being a constant source of guidance and for pushing me to hop cross-country flights
in pursuit of my argument (and some adventure). Thank you to Andreas Kratky for helping me to
expand my understanding of new media before this project even began. I must also thank
Rochelle Steiner who never failed to make her support known.
My special gratitude to Christiane Paul, Peter Lunenfeld, Zach Kaplan, Jerry Saltz, Caroline
Jones, Rita Gonzalez, Joel Ferree, and David Familian for being so generous with their expertise,
time, and rule-bending skills. This thesis would have been woefully incomplete without their
contributions.
Finally, an indescribable and eternal thank you to my husband, Don Everhart, for his love,
support, and almost superhuman ability to tolerate my writing process.
TABLE OF CONTENTS
ACKNOWLEDGEMENTS
ABSTRACT
INTRODUCTION
HISTORICAL PROGRAMMING
TERMS AND CONDITIONS
CONTEMPORARY PROGRAMMING
CONCLUSION
BIBLIOGRAPHY
ABSTRACT
Multimedia art, net art, art and technology, new media: all of these terms have at one
time been used to refer to an ever-expanding field of art that continues to become more indistinct
and hybrid in nature. Beginning as art and technology, the relationship between science and
artistic practice changed radically once they entered a museum setting together. Specifically, it
was when artists began to work with corporate industry during the first Art & Technology
program at the Los Angeles County Museum of Art (LACMA), which took place from 1967 to
1971, that this collaborative model became solidified and widely recognized. While artists had
been previously influenced by technology, it was at this time that technology entered into
dialogue with high art within an institutional setting, thus legitimizing the resulting hybrid rather
than pluralizing its identity. As a result, institutions relegated it to the fringes of other, more
established art movements. From this early point in its history, new media has continued to
diversify and evolve alongside technological advancements.
Despite a great deal of interest, new media has not been able to establish a permanent
foothold alongside more traditional mediums in most major American art institutions. This thesis
aims to explain this difficulty by examining three different institutional models of curating new
media practices in large American museums: the Whitney Museum of American Art, the New
Museum, and LACMA. Within each of these institutions, specific attention will be paid to a specialized program or model. By taking careful consideration of these models, I hope to gain a comprehensive understanding of the current state of new media within the museum landscape: to understand not just how and why it is still not considered fine art, but also to analyze the ways in which new media curatorial practice is continuing to evolve.
INTRODUCTION
Multimedia art, net art, art and technology, new media: all of these terms have at one
time been used to refer to an ever-expanding field of art that continues to become more indistinct
and hybrid in nature. Beginning as art and technology, the relationship between science and
artistic practice changed radically once they entered a museum setting together. Specifically, it
was when artists began to work with corporate industry during the first Art & Technology
program at the Los Angeles County Museum of Art (LACMA), which took place from 1967 to
1971, that this collaborative model became solidified and widely recognized. While artists had
been previously influenced by technology, it was at this time that technology entered into
dialogue with high art within an institutional setting, thus legitimizing the resulting hybrid rather
than pluralizing its identity. As a result, institutions relegated it to the fringes of other, more
established art movements. From this early point in its history, new media has continued to
diversify and evolve alongside technological advancements.
The term “art and technology” has become ubiquitous in the art world since the 1960s.
From that point onward, it has been used persistently and rather indiscriminately to describe a
range of artistic practices on the boundaries of science and technology-related art activities. I
believe we must resist the continued use of “art and technology” as a blanket term: as new media
art continues to evolve, the term is less and less appropriate and takes on a more historical
meaning that represents models from the 1960s, in which an artistic entity interacts or
collaborates with a corporate entity. Contemporary “new media”, produced since the 1990s,
refers to a much larger range of practices in which artists adopt different technologies to produce
their work, including intermedia, net art, or bio art.
This thesis will examine three different institutional models of curating new media
practices in large American museums: the Whitney Museum of American Art, the New Museum,
and LACMA. I have chosen to focus on museums in New York City and Los Angeles
because of both cities’ historic relationships with new media and art and technology, as well as
their continually evolving and diverging curatorial models. Within the Whitney, specific
attention will be paid to Artport, an online portion of the museum’s permanent collection
dedicated to net art and new media. Analysis of the New Museum’s model will primarily focus
on Rhizome, a non-profit organization in residence at the museum that is dedicated to Internet
art. Finally, in relation to LACMA, I will begin with original Art & Technology Program and
then examine the current Art + Technology Lab, rebooted in 2014 and ongoing. By taking
careful consideration of these models, I hope to gain a comprehensive understanding of the
current state of new media within the museum landscape: to understand not just how and why it is still not considered fine art, but also to analyze the ways in which new media curatorial practice is continuing to evolve. Why is it a matter of urgency to institutionalize new media art? How are
curators injecting new media into permanent collections? Are museum administrators changing
institutional programming to accommodate a renewed interest in new media?
HISTORICAL PROGRAMMING
Before I delve into these three institutional models, it is important to provide some
historical background to early art and technology programs and to some key exhibitions of the
1960s. Beginning in the 1960s, there was a dramatic uptick in both art and technology collaborations and exhibitions. First was Experiments in Art and Technology (E.A.T.), founded in 1966 by Bell Telephone Laboratories engineer Billy Klüver. E.A.T. built on “the strong belief
that an industrially sponsored, effective working relationship between artists and engineers will lead to new possibilities which will benefit society as a whole.”[1]
E.A.T. members included
fellow engineer Fred Waldhauer and artists Andy Warhol, Robert Rauschenberg, Jean Tinguely,
John Cage, and Jasper Johns. Together they would create a number of deeply influential
hybridized art events, such as 9 Evenings: Theatre and Engineering (1966), which was a series
of performances that brought new technologies together with theater and dance.[2]
E.A.T. was also based on collaboration not just between artists and engineers but also with corporate industry: initially Bell Labs, and later Pepsi-Cola, whose sponsorship of E.A.T.’s 1970 collaboration led to the landmark Pepsi Pavilion (a large dome that hosted a number of kinetic sculptures and interactive environments) at the 1970 World Expo in Osaka. A number of the same artists from E.A.T. would also participate in the LACMA Art and Technology program, further situating this moment within a technology-oriented zeitgeist.[3]
While the E.A.T. programs were primarily performances and installations, in the later
1960s and into the early 1970s, some primary curatorial models situated art and technology
within the white cube spaces of museums. As a background to this examination, we should note
similarities in circumstances between the cultural climate of the 1960s and 1970s and our
contemporary zeitgeist, particularly concerning war, mass media, and the rapid evolution of
creative technological practices. If we better understand the taxonomies of three exhibitions from
the 1960s and 1970s, then we will be able to move forward from them. If we do not, we will
constantly look over our shoulders, unsure of our foundation. The Information show at the Museum of Modern Art (1970), the Software - Information Technology: Its New Meaning for Art show at the Jewish Museum (1970), and the first Art & Technology program at the Los Angeles County Museum of Art (LACMA) (1967–71) provide us with three distinct curatorial models which help us to understand our current curatorial landscape.

[1] Julie Martin, “A Brief History of Experiments in Art and Technology,” IEEE Potentials 34 (6), 2015, 9.
[2] Christiane Paul, Digital Art (London: Thames & Hudson Ltd., 2003), 16.
[3] Maurice Tuchman, A Report on the Art and Technology Program of the Los Angeles County Museum of Art, 1967–1971 (New York: The Viking Press, 1971), 28.
Information, curated by Kynaston McShine at the Museum of Modern Art in New York
City, can be seen as an example of a curatorial model aimed at defining art and technology in
relation to the institution. Meanwhile, Software, curated by Jack Burnham for the Jewish Museum in New York, was an example of an alternate curatorial model that was designed to
explore the movement’s conceptual framework. And, finally, the Art and Technology Program
(known as A&T), curated by Maurice Tuchman, was both a multiyear pioneering initiative that
brought artists and corporations together as well as one of the first art and technology curatorial
strategies based on an artist incubator or residency model. Of these three historical exhibitions,
the curatorial model most similar to the one currently at play in 2016 was Tuchman’s first
iteration of the Art & Technology program.
The Art & Technology program encompassed roughly four years and concluded in 1971 with an exhibition of some of the works it helped its contributors create. Tuchman, who was only 27 at the time, paired artists with a broad spectrum of major American corporations in fields such as aeronautics, industrial manufacturing, and entertainment. The goal was for these corporate entities to offer specialized knowledge, facilities, or materials for artists to utilize so that they might create an interaction or outcome that would not have been possible without one’s influence on the other. Even though, in the end, not every
pairing produced a displayable outcome, there were a number of projects that yielded impressive
results. Many of these were displayed at both LACMA (in May of 1971) and at the 1970 World
Exposition (Expo 70) in Osaka, Japan.
4
While nearly 70 artists took part in the initial corporate matchmaking process, not all of the pairings yielded successful collaborations or displayable end products. Artists (all successful white men, predominantly from New York City) such as Andy Warhol, James Turrell, Robert Irwin, Tony Smith, Claes Oldenburg, Richard Serra, and Robert Rauschenberg were paired with major corporations like IBM, Pan Am, Kaiser Steel, Ford, and General Electric. In contrast to the
situation with the current iteration of A&T, these corporate partners were seen as a catalyst; they were a part of the experimental process rather than acting as protected members of the administration. In the rebooted Lab at LACMA, the corporations became fixed points in the project, and the onus of “success,” in terms of the museum’s project, falls much more on the artist.
A successful pairing between artist and corporate partner in the earlier program could be seen between Andy Warhol and Cowles Communication Inc., a magazine publisher. As an outcome of their collaboration, Warhol created Daisy Waterfall (Rain Machine) (1971), an installation of holographic photographs of daisies behind a wall of continually dripping water. Other collaborations, however, proved to be less smooth. Claes Oldenburg was connected to WED Enterprises, the design and fabrication branch of Disney, to help him create his iconic Giant Ice Bag (1969). However, citing prohibitively high costs and man hours, WED Enterprises withdrew from the partnership after three months. Not long after, Oldenburg was matched with a different partner, Gemini G.E.L., a Los Angeles-based print-making studio and publisher. With a collaborator re-established, Oldenburg was finally able to complete the piece.
[4] Tuchman, 44.
While some partnerships did not require much of an imaginative stretch (Richard
Serra with Kaiser Steel Corporation), others proved to be more surprising. Robert
Rauschenberg, better known for his Combines, ended up creating Mud Muse
(1968-71) with Teledyne, an industrial company. Mud Muse was a sound installation
that drew its inspiration from the bubbling “paint pots” at Yellowstone National Park.[5]
During his three-year collaboration with Teledyne, Rauschenberg created an artwork
that was more reflective of his late-career interest in audience-activated spaces. Mud
Muse was made up of a large glass and aluminum vat of mud, a control console, and a
compressed air system. The mud, a clay and water mixture, was situated over a sound-
activated compressed air system that would push air through the viscous material to
create bubbling, plopping sounds to accompany the sounds of visitors in the gallery.
Despite these somewhat mixed and sometimes unforeseen results, Tuchman never considered A&T a failure. He was satisfied with the program’s success in bringing previously disparate groups and individuals together for observation and experimentation.[6]
Information at the Museum of Modern Art (MoMA) took place from July 2 to September
20, 1970. It included videos, photographs, and installations by one hundred artists, both
European and American. With reference to the name of the show, the works dealt largely with
publicly accessible data and archives. Participating artists included the likes of Dennis
Oppenheim, Robert Smithson, Ed Ruscha, Vito Acconci, Jeff Wall, Daniel Buren, Jan Dibbets,
Art & Language, and Hans Haacke. The artwork mediums submitted ranged from Wall’s Landscape Manual (1969), a cheaply produced guidebook of black-and-white photographs and commentary on the landscape, to Acconci’s Service Area (1970), which was largely conceptual and performative.[7] Service Area involved the artist having his mail forwarded to the museum for three months, where he would go collect it every day. This was still largely a show focused on display, even if what was being displayed was simply documentation. In fact, I think one could argue that this focus on display was the main difference between the Information show and the Software show; Information displayed a result, a collection of data, or documentation of a completed process (Acconci’s ongoing performance was a rare format), while Software’s aim was to bring people to the show in order to interface with the computers and systems installed.

[5] Jennie Waldow, “The Pioneering 1960s Program that Paired Big-Name Artists with Tech Firms,” Hyperallergic, accessed October 24, 2015, http://hyperallergic.com/246521/the-pioneering-1960s-program-that-paired-big-name-artists-with-tech-firms/.
[6] Tuchman, 12.
Software is not specifically a demonstration of engineering know-how, nor for
that matter an art exhibition. Rather in a limited sense it demonstrates the effects
of contemporary control and communication techniques in the hands of artists.
Most importantly it provides the means by which the public can personally
respond to programmatic situations structured by artists. Software makes no
distinctions between art and non-art; the need to make such decisions is left to
each visitor. Hence the goal of Software is to focus our sensibilities on the fastest
growing area in this culture: information processing systems and their devices.[8]
The Software show at the Jewish Museum in New York took place from September 16th
to November 8th, 1970. The show brought together conceptual artists and computers with a
focus on process and interaction. While Burnham did achieve something novel through the
exhibition of computer software (such as Ted Nelson and Ned Woodman’s Labyrinth (1970)
which allowed visitors to interface with a digitized catalogue about the show, do some research,
and then print out their own narrative path or user history[9]), he further set the show apart by situating the software alongside artworks that fell in line with what is more traditionally thought of as conceptual art. The show featured works such as John Baldessari’s Cremation Project (1970), in which the artist burnt all of his early work to ash, which he then interred in one of the gallery walls. Rather than focusing on differences in medium or execution, Burnham was able to create conversations between these “more traditional” conceptual pieces and works like Hans Haacke’s Visitor’s Profile, in which museum visitors were asked to fill out a questionnaire that probed into the identity of the institution and the art world. These findings were then collated and printed in the gallery.

[7] Waldow, “The Pioneering 1960s Program that Paired Big-Name Artists with Tech Firms.”
[8] Jack Burnham, Software - Information Technology: Its New Meaning for Art (New York: Jewish Museum, 1970), 10.
When examining a movement such as art and technology or the broader term of “new
media,” which houses most of its commonalities and connective tissue in its praxes or conceptual
framework rather than its choice of medium, it can be hard to identify its most prevalent tenets.
Whether those principles are most concerned with materiality (or lack thereof), reproducibility,
or functionality, they can be understood best as an overall language. This language is not really
one of spoken colloquialisms so much as a system of understanding, one from which directives can be formed and through which lenses for examining practical structures can be developed. This framework is perhaps easier to understand as a programming language, a set of
keywords and phrases that are used to communicate directives to a machine.
In this sense, it can be said that language is the single most important means of
understanding the similarities and differences between the two New York shows. The
language in question moves beyond the expected conversations of what the proper name for “new media” should be, or the nuances of “intermedia.” The language, in this case, is much closer to how each curator locates the genre of art and technology or new media in the museum landscape. For Kynaston McShine (and MoMA, for that matter), the Information show took the form of a survey exhibition and seemed to convey a sense of art and technology as an identifiable but non-medium-specific genre. For Jack Burnham, however, the Software show was much more about highlighting connections between emerging modes of art production (the taxonomies) and conceptual art. Burnham considered the show to be an attempt to eliminate aesthetics from the equation altogether. To an extent, McShine was trying to identify the true scope of art and technology, which is how the Information exhibition ended up including such a diverse range of works and practices that it eventually resembled a survey show. Meanwhile, Burnham was more interested in seeing just how far the borders of art and technology could and were being pushed at this time.

[9] Edward A. Shanken, “The House That Jack Built: Jack Burnham’s Concept of ‘Software’ as a Metaphor for Art,” Leonardo Electronic Almanac 6:10, last modified November 1998, http://mitpress.mit.edu/e-journals/LEA/ARTICLES/jack.html.
In many ways, the Information show at MoMA was the first time that an art and
technology-centered show was organized in such a fashion. Some one hundred artists
were invited to install and/or create works in the museum. However, unlike the more
residency-based model Tuchman was employing at LACMA, McShine’s version was
much more about the museum, about the way in which MoMA could display this type of new
media. At the end of the day, the Information show was essentially nothing more than a
conceptual art survey show. It simply differed from some of its predecessors in that Information
focused more heavily on the mode of production or the tools that were employed to make an
artist’s point: the language, the collectible data (often created specifically for or about the show),
or the museum as muse. To some degree, the show felt as though the curator had been in a rush
to categorize these new practices, to situate them within MoMA’s rather rigid taxonomies, rather
than to broach or expand on a new development.
Burnham’s intent seemed to be the exact opposite. His Software show was much more about the embrace of developing software technologies
as metaphors for art as well as tools through which to expand on interactivity with the art
viewer. Rather than stage the encounter between art object and viewer as a simple
circuit that could be neatly completed, Burnham’s model embraced artists and works
that created a continual system of feedback through encounters that changed over time.
While it was certainly within the exhibition makers’ grasp, technologically speaking, to have
interactive art that did not necessitate a physical presence (which might have fit better in the
Information show), Burnham purposefully chose artworks that did. He created a thoughtful new
analogue-digital conversation in which visitors could participate. His strategy returned the
physical body to computer systems.
Finally, to understand the full extent of this return to the physical, we must also return to Tuchman. By returning to Tuchman, we return, in a way, to the present, given the
continuing legacy of his efforts. Tuchman was exploring a residency-based model of curation
that involved the placing of artists’ bodies into new physical environments. This model was the
culmination of Burnham’s concept; software and systems had become ingrained in artists’
practice such that the production of potentially non-display-friendly work came to be expected. This
expectation has become widespread with the ubiquity of the internet. However, at the current
point in time, artists have begun to move beyond the creation of non-haptic artwork. They have
begun to apply the systems of understanding related to recent technological advances to more
traditional mediums. They have begun to apply digital concepts to the analogue and have thus,
once again, returned to the physical. This return does not discard technology, but regards it as a
given facet of artists’ practice. The return to physicality and embodiment comes at a high point of technological saturation and advancement, due in part to the rapidly ripening fruit of the first wave of computers in industrial and creative settings.
TERMS AND CONDITIONS
The term “new media art” seems rather straightforward on the surface. It is composed of
three basic words; none of them require a deep knowledge of semantics or philosophy to
comprehend. However, the use of this terminology has been hotly contested for decades (already
an indicator of the fallacious use of “new”) and attempts at defining it have proved no less
contentious. If one breaks down the term, word by word, he or she will find that all three words
are problematic, if not flat out misnomers. As previously mentioned, this field or specialty is
hardly new, first coming into its own in the 1960s and 1970s; however, during this time, it was
better known as “art and technology.” The second word in the term, “media,” is also not as
straightforward as it seems. While one might understand this word to be a plural of the word
“medium” (which in and of itself is not wrong), he or she would be ignoring some of the nuance
of the term as “media” can also indicate a link to mass communication.
This reference to mass media points to one of the main tenets in Lev Manovich’s “Media After Software”: one must consider the artwork itself and the vehicle of its delivery or consumption (magazine, mobile application, internet) equally. And finally, one arrives at what
might seem the least likely to be misunderstood of the three words: “art.” In a more general and
traditional sense, art can manifest itself in a wide variety of forms and activities. The issue with
this understanding of art is that it privileges the visual and the physical object while leaving out the conceptual facet altogether.
In “The House that Jack Built,”[10]
Edward Shanken discusses Jack Burnham’s theory of
software as a metaphor for art, which speaks to this point directly. For Burnham (and Shanken),
the scope of art should not be limited to output. It should also include the internal processing and
building of structures of understanding such that conceptualization and eventual output can
happen. In short, Burnham/Shanken highlight an artist’s internal interaction with his or her own
Bourdieusian habitus, one’s embodied structure of feeling and understanding.
In contrast to the numerous art and technology exhibitions which took place
between 1966-1972, and which focused on the aesthetic applications of
technological apparatus, Software was predicated on the ideas of "software" and
"information technology" as metaphors for art. He conceived of "software" as
parallel to the aesthetic principles, concepts, or programs that underlie the formal
embodiment of the actual art objects, which in turn parallel "hardware." In this
regard, he interpreted "Post-Formalist Art" (his term referring to experimental art
practices including performance, interactive art, and especially conceptual art) as
predominantly concerned with the software aspect of aesthetic production.[11]
New media art can refer to any art that was either created using a form of technology or influenced by technology’s conceptual frameworks or its impact on social culture, the environment, or the way in which we experience art. The application of the term “new media art”
is also quite broad; it can encompass everything from more historic intermedia, to bio art, to net
art.
“Intermedia” is a term that was first used in the 1960s by Fluxus artist Dick Higgins
to describe a wide variety of multidisciplinary art activities.[12] These activities could and did span performance, poetry, painting, and computer graphics. Higgins described “intermedia” as artwork that took place between traditional genres, often occupying more than one genre at a time or even sometimes no one specific genre.[13] These historic roots are quite fitting for new media: not only is it similarly broad, it also speaks to the hybrid nature of so much of what might be considered new media art.

[10] Shanken, “The House That Jack Built.”
[11] Ibid.
[12] Dick Higgins, “Statement on Intermedia,” [KM to provide citation]
The term “bio art” was first used by artist Eduardo Kac in 1997 to describe his piece
Time Capsule, in which the artist implanted a microchip into his own ankle.[14]
The term can be
used to describe an artistic practice that utilizes living organisms, tissue, bacteria, or life
processes. Bio art sometimes involves the application of biotechnological processes such as
genetic engineering, cloning, and tissue culture and can take place in either traditional art making
spaces or those more traditionally associated with scientific research, such as a lab. However,
there is some debate over whether work that does not employ actual living material but rather
scientific imagery or research can be considered bio art.[15]
Internet art (or net art) is made up of work that exists almost exclusively online. It is
predominantly based on interaction and manipulation of the medium’s unique traits (such as its
pseudo form of intimacy, allowing the artist(s) to communicate with individual viewers privately, or its ability to reach a more geographically diverse audience due to its online accessibility and
lack of physical presence) and can be seen as a direct challenge to the established gallery and
museum system. According to art critic Rachel Greene, the term “net art” (alternately spelled
net.art) was coined in 1995 by Slovenian artist Vuk Cosic. Cosic supposedly opened an
anonymous email that had been mangled en route to his inbox, leaving legible only the words “net.art”.[16]

[13] Dick Higgins, “Intermedia,” re-published in Leonardo, vol. 34, 2001, 40.
[14] Olivia Solon, “Bioart: The Ethics and Aesthetics of Using Living Tissue as a Medium,” Wired, July 28, 2011.
[15] Claire Pentecost, “Outfitting the Laboratory of the Symbolic: Toward a Critical Inventory of Bioart,” in Tactical Biopolitics: Art, Activism and Technoscience (The MIT Press), 110.
One of the main reasons that the term “new media art” is so difficult to pin down is that the various other forms of art that fall under this umbrella term are similarly nebulous and vast. In fact, Greene claims that once the term “net.art” caught on, it
began to be applied to a whole host of everyday actions and interactions:
Net.art stood for communications and graphics, e-mail, texts and images,
referring to and merging into one another; it was artists, enthusiasts, and
technoculture critics trading ideas, sustaining one another’s interest through
ongoing dialogue. Net.art meant online detournements, discourse instead of
singular texts or images, defined more by links, e-mails, and exchanges than by
any “optical” aesthetic.[17]
Even after dissecting the term and examining its component parts, the term “new media
art” itself remains stubbornly opaque. While it can be argued that various kinds of art that fall
under new media can be tied together by concept and practice, I do not think that the same can be
said of medium specificity.
Nearly all of the scholarly writing that has been published on new media in the past
twenty years delves into a discussion about terminology. A large portion of said scholarly
literature chooses to define new media in either admittedly oversimplified or overly formalist
terms for the sake of having a working definition, any definition. Despite Beryl Graham and
Sarah Cook, Christiane Paul, and Lev Manovich’s caveats that their definitions of new
media will undoubtedly fall short on one front or another, all end up relying on a form of
medium-specificity to move their work forward. In Rethinking Curating: Art After New Media
(2010), Graham and Cook define new media as “art that is made using electronic media
technology and that displays any or all of the three behaviours of interactivity, connectivity and
computability.”18

16 Rachel Greene, “Web Work: A History of Internet Art,” Artforum (May 2000): 162-190, p. 162.
17 Ibid., 162.
18 Beryl Graham and Sarah Cook, Rethinking Curating: Art After New Media (Cambridge: The MIT Press, 2010), 10.

In New Media in the
White Cube and Beyond: Curatorial Models for Digital Art (2008), Christiane Paul defines new
media as “computational and based on algorithms.”19
In The Death of Computer Art (1996),
Lev Manovich states that new media must be aligned “towards new, state-of-the-art computer
technology.”20
However, at present, medium-specificity seems altogether too
restrictive and dated. As will be discussed later, choice of medium is only partially definitive of
what might constitute “new media art.”
In Digital Art (2003), Christiane Paul creates both a visual and conceptual survey of
digital art since the 1980s. She discusses the ways in which digital technology has impacted art
production and consumption. In addition to her survey, Paul also traces the lineage of the
variable terminology for “technological art forms” back to the 1970s, when it was still
being referred to as “computer art.” After the 1970s, the term for this kind of art became
“multimedia art,” before finally co-opting the term previously applied to film/video and
photography at the beginning of the twenty-first century.21
New Media in the Museum
The continued treatment of new media as a niche or special interest keeps the entire field
from entering the larger institutional conversation. While renewed interest in art and technology
in the 2010s might manifest in the form of increased numbers of travelling new media
exhibitions hosted by museums, it in fact still constitutes a refusal to commit to new media as a
whole. Museums are not buying new media art, preferring to simply open their doors for these
19 Christiane Paul, New Media in the White Cube and Beyond: Curatorial Models for Digital Art (Berkeley: University of California Press, 2008), 3.
20 Lev Manovich, “The Death of Computer Art,” Rhizome, last modified October 23, 1996, http://rhizome.org/discuss/view/28877
21 Christiane Paul, Digital Art (London: Thames & Hudson Ltd., 2003), 7.
travelling exhibitions. By serving as a stop on a traveling show’s tour, the museum does not have
to support a piece in perpetuity.
Historically, most major museums have not developed or maintained a new media
department or even represented the field in a department title. Most often, if “new media” is
represented in a department title, the term will accompany “film,” “photography,” “center,” or
“lab.” The former two connote an older understanding of the term while the latter two labels
imply a purposeful distancing from elevated art that might necessitate curatorial work.
Film/video and photography were the “new media” of the nineteenth and
twentieth centuries and, while a focus on them does not exclude a relationship to more current
versions of new media (such as Audrey Wollen’s ongoing performative self-portraiture on
Instagram), the aligning of media with film and photography implies an established relationship
to traditional gallery space that other forms of new media do not yet have. Should the association
of old and new grow to be too strong, the museum runs the risk of curating new media in the
same way that it curates film or photography. The terms “center” or “lab” suggest that the media
represented in these spaces is somehow other, to be separated from the rest of the museum’s
collected works.
One of the main ways in which an institution can support a piece of new media art is by
acquiring it. While there are a number of problems associated with bringing an artwork into a
museum (potentially denaturing pieces that might otherwise only exist on the web or preserving
otherwise ephemeral work), it is imperative that the rate at which new media art is purchased
picks up soon. Without the support of large institutions, we risk losing a staggering number of
works to obscurity. Whether that obscurity comes in the form of a lack of legitimacy or a lack of
conservation, the risk is imminent. Taking into consideration the rate at which technology
develops on a weekly or even daily basis, the obsolescence of essential materials (hardware,
software, tools, technical expertise) makes conservation of new media works challenging for a
large institution and nearly impossible for an individual or small organization.
Because of years of essentialist appraisal, new media’s narrative history became
truncated and its bleed-overs neatened (excised, really), such that it became ghettoized and
understood as a niche specialty. As such, there is a tendency to assume that the field has an
extremely limited exhibition history or lacks an evolved curatorial strategy. Despite many
decades of art historians’ supposed repudiation of Clement Greenberg, his formalist strategies still
manage to retain a foothold when it comes to new media. By setting new media on a parallel
narrative rather than an integrated one, its history cannot be anything other than narrow. Any
bleed-overs get annexed into other, more traditional mediums. For example, the artist Hans
Haacke is widely known to work with and explore systems (social, biological, algorithmic). This
work, especially given the Burnham/Shanken understanding of software, is categorically new
media. However, there seems to be a greater general willingness to align this work with either
institutional critique or sculpture. Without a degree of rewriting, the field—artistically and
critically—might continue to spread amorphously and grow thinner as it goes. How can our
understanding and conceptualization thrive without a clear understanding of what is and is not
new media? Furthermore, if this problem of canonical proximity or historical separation is not
rectified, the connections and influences between works that do not fall into the same genre will
potentially continue to lack a dimension. A similar situation arises when considering the physical
proximity of artworks. If the museum exists so as to allow visitors to create their own
relationships and conversations - either between themselves or between multiple pieces - then
there is a curatorial duty to make as many conversants available as possible.
Museums of all kinds, and large encyclopaedic museums in particular, are beholden to a
public that has placed its trust in them to safeguard that which is considered communal cultural
history. Beyond the role of caretaking, the public also expects the institution to help represent the
present moment in the greater museum timeline. Large institutions are trusted to make value
judgments on the products of the public sphere that surrounds them. If encyclopaedic museums
do not continue to move the narrative forward, then the encyclopaedic museum must be seen as
a specialized museum that only shows work from a specific segment of history.
New media art has grown on its own at the margins of the art world for decades. Due to
its lack of mainstream museum integration and some of the hacker-ish mentality of its once small
community, bringing new media further towards the center is no small task. While some of new
media’s DIY character is due to an aforementioned distance from the backing of large
institutions and their donors, a considerable number of new media’s qualities can be attributed to
its development in relation to various other non-art-related advancements. In 1960s America, the
era most commonly recognized as that of the birth of contemporary art and technology,
machinery and hardware, computers, and audiovisual equipment had finally begun to become
affordable for average citizens. Up until that point, computers had largely belonged only in
the realms of government and specialized lab research. Thus as hardware and expertise became
more readily available, new media expanded its breadth in leaps and bounds.
We have reached a similar place in a cycle of the art world’s relationship to technology, both
in terms of mass media forms and the evolution of creative practices, to that of the first Art &
Technology program. New media has evolved from being novel in format to being novel in
name alone. However, despite general similarities in art historical moments, there are notable
differences both between the museums in New York City and Los Angeles as well as between
the Art & Technology Program of 1967 and the current Lab.
Within the span of the last decade, the wave of renewed interest in art and technology has
continued to grow in relation to the increasing ubiquity of the internet and mobile technologies.
To associate this renewed interest entirely with a sudden boom or rebirth in the fields of art and
technology or new media would be to willfully ignore a host of net art, kinetic sculpture, video
games, and interactive art made from the 1980s through the early 2000s. However, if one were to
hazard a guess at the specific interests within this wave, based on the exhibition output of major
(mostly American) museums, it might not be surprising to find a rather bookended distribution of
interests.
These bookends are formed by the sociopolitical and technological commonalities
between the first era of institutionalized new media and today’s institutional climate. The
technological developments of both the 1960s and the 2010s mirror each other, but perhaps more
importantly, the impact and level of integration of these developments into everyday life bear
resemblance. To better understand the more contemporary side, I undertake a critical exploration
of the curatorial models employed by the New Museum, The Whitney, and LACMA.
CONTEMPORARY PROGRAMMING
THE NEW MUSEUM: RHIZOME
Rhizome is a small not-for-profit organization housed within the New Museum in New
York City. It bears some resemblance to Eyebeam Art + Technology Center (another small
independent non-profit also located in New York City) in the ways in which it engages and
promotes new media -- that is, it produces publications and scholarship, exhibitions, public
programming, and digital conservation efforts. However, Eyebeam is run out of its own building
in Downtown Brooklyn and its primary concern and point of origin is education. Rhizome, on
the other hand, was originally founded as an email list focused on net art but has since expanded
in practice and in scope to encompass other emerging new media practices. Despite its
affiliation with the New Museum, Rhizome retains its identity and structure as its own
organization. Rhizome exhibits both online and offline. It collaborates with the institution in
order to put on and host events, shows, and residencies. Most recently, it jointly hosted its
inaugural annual conference on January 30, 2016 at the New Museum, “Open Score: Art and
Technology 2016,” which focused on the current state of art and technology. The participants of
the symposium, made up of four panels of artists, writers, curators, and researchers, discussed
such topics as art criticism in the digital age, surveillance, and social media. The conference was
timed to celebrate the fiftieth anniversary of the pioneering initiative Experiments in Art and
Technology (E.A.T.); the conference title is a reference to Robert Rauschenberg’s performance
of the same name during one of E.A.T.’s most well known events, “9 Evenings: Theatre and
Engineering.” Aptly, as noted on the Rhizome website, the symposium was supported not
only by the New Museum but also by the Robert Rauschenberg Foundation.22
The events were
streamed live and a recording was made available through Rhizome’s website. All three aspects of the
conference (the creation of an in-person event, the live real time access online, and the archived
footage to remain available in perpetuity) are representative of the organization’s particular
resources and strategies.
Furthering this sense of hybridity is another annual conference hosted by Rhizome:
“7 on 7.” In truth, “7 on 7” is part conference, part residency, and part creative matchmaking
initiative not wholly dissimilar to the A+T Lab at LACMA. Begun in 2010, “7 on 7” pairs artists
22 “Seven on Seven,” Rhizome, accessed on December 27, 2015, http://rhizome.org/programs/#program-item-2
and technologists and challenges them to create something new; whether that something is an
artwork, application, provocation, or something else entirely is up to them. At the end of their
time together, they present their creation and discuss the process of collaboration and production
at an event that is open to the public.23
The New Museum describes itself as a non-collecting institution.24
While it does actually
have a small permanent collection, it is not the typical museum assemblage. The New Museum’s
collection is not one built with future museum programming in mind. It is based mostly on
artworks that the museum’s administration feels reflect on or are directly tied to the institution’s
identity. Further, the administration’s means of acquisition for their collection is not the process
commonly associated with major museums, which typically consists of a joint effort between
directors, curators, conservation staff, and other governing bodies of the institution. For example,
Ugo Rondinone’s Hell, Yes! (2001) was acquired by a number of museum trustees and then
donated to the museum.25
While the New Museum does not actively add to its permanent collection, neither does
Rhizome in a traditional sense. Its collecting model is much closer to that of an archive, to which
they primarily add works via a commissioning process. The Rhizome Commissions Program takes
three forms: microgrants, the Prix Net Art, and its “curatorially led project commissions.”26 The
microgrants (five of which were awarded in 2015, valued at $500 each) are intended primarily,
but not exclusively, for browser-based works. The Prix Net Art is a cooperative project, started in 2014
with Chronus Art Center and Tsinghua University Art and Media Lab (TASML), that awards a
23 Ibid.
24 “Press F.A.Q.,” New Museum, accessed December 28, 2015, http://www.newmuseum.org/files/nm_press_faq.pdf
25 Ibid.
26 “Rhizome’s Fall/Winter Program,” Rhizome Blog, rhizome.org, last modified October 2014, http://rhizome.org/editorial/2014/oct/2/2014-2015-program/
first place $10,000 prize to one artist and a second distinction prize of $5,000 to another artist,
both chosen by a jury. As an example of the jury’s usual composition, the 2015 jury was
composed of Josephine Bosma, Chrissie Iles, and Domenico Quaranta. It is specifically geared
toward art being made on and for the internet. Finally, Rhizome’s curatorially led project
commissions are major components of the organization’s programming for the upcoming year.
They are chosen by a jury of New Museum and Rhizome staff as well as various other experts
working in the field.27
ArtBase is Rhizome’s free and public digital art archive available online. It was founded
in 1999 as “a home for works that employ materials such as software, code, websites, moving
images, games, and browsers.”28
Previously, ArtBase functioned on an open submission model:
submitted works were considered by Rhizome staff and then selectively added to the repository.
However, in 2008, the policy was changed such that the only work that would be submitted to
the repository from that point onward had to come from artists who were either invited to submit
their work or commissioned by the curatorial staff.29
While the New Museum and Rhizome’s curatorial model may not support new media
artists by acquiring work being created independently of the museum, they are supporting
technologically-engaged art through preservation of past, present, and future net art. Of late, they
have dedicated efforts to creating a stable platform for the conservation of systems-
dependent works in a time and field where the systems continue to change rapidly. Beyond their
conservation work - which places the creation of systems of preservation on almost equal footing
with the works being preserved themselves - they also continue to help inject support into new
27 “Rhizome Commissions,” Rhizome Commissions Archive, accessed on December 27, 2015, http://classic.rhizome.org/commissions/
28 “The ArtBase,” Art, accessed on December 27, 2015, https://rhizome.org/art/artbase/
29 Ibid.
media art production. Rather than bringing it into the museum / organization via collecting, they
are providing financial and systemic support without having to navigate questions of ownership.
They are either providing grant money to support work and to encourage further diversification
of practices or commissioning work that is intended to be exhibited in their space without
necessitating transfer of ownership from the artist(s). Their support of non-commissioned
artwork is limited in that it extends an institutional hand to fewer individual pieces and artists
directly. However, as their online conservation efforts continue to develop and will be made
available for free and wide use, it is unclear as to how great an impact they might end up having
on a larger number of artists. Their most recent project, Webrecorder, just received a $600,000
two-year grant from The Andrew W. Mellon Foundation. Webrecorder is a free service that
allows users to anonymously download or archive web content.30
The technical
research being done for this project will aid in the preservation of older net art that operates on
already outdated systems as well as potentially helping in the generative process for future
artworks.
On the one hand, Rhizome and the New Museum are making impressive strides towards
stabilizing the framework of future access and conservation for art heavily engaged with the
internet. On the other hand, despite the fact that Rhizome has expanded its own focus from
strictly net art to a broader field of new media, there is still a heavy focus on internet artwork
versus work that might produce or require a material footprint. While they do manage to
incorporate an aspect of the material into their model via their work with public programming,
this aspect of their work is more about the physical proximity of people (in the form of their
public programming, conferences, and artistic collaborations) than objects - their version of the
30 “Rhizome Awarded $600,000 by The Andrew W. Mellon Foundation to build Webrecorder,” rhizome.org, last modified January 4, 2016, http://rhizome.org/editorial/2016/jan/04/webrecorder-mellon/
return of new media to physical bodies. The goal of the institution(s)’ public interface is to
bring individuals into the physical space of the New Museum in order to create and have
in-person conversations with one another as much as if not more so than with the pieces on display.
Meanwhile the pieces on display, whether they are brought in for a temporary exhibition or as a
symbol of a collaboration arranged by the institution, do not receive the same kind of support.
These works do not stay in the institution past their allotted time - at least not beyond
documentation or other assorted archival material.
In many ways, the New Museum and its affiliation with Rhizome is a traditional museum
model turned completely on its head: the permanent collection is not central, and thus immaterial
works are privileged over physical objects. In these respects, it is hard to even call this
institutional model a hybrid so much as a new kind of museum altogether.
THE WHITNEY: ARTPORT
The Whitney Museum of American Art has long held a reputation for showing the work
of living and less well-known artists (particularly in its own Whitney Biennial) as well as
maintaining and expanding its vast permanent collection. With these institutional ideologies in
mind, it might come as a bit of a surprise to learn that up until this past year, it only had one
piece of net art in its permanent collection (Douglas Davis' 1994 online project The World’s First
Collaborative Sentence).31
In 2015, the Whitney officially added Artport - created and curated by
Christiane Paul and originally launched in 2001 - in its entirety to its permanent collection.32
Artport is the museum’s own in-house online gallery and archive, housing the museum’s
numerous commissioned artworks as well as documentation of net art and new media
31 Marisa Olson, “Collectible After All: Christiane Paul on net art at the Whitney Museum,” Rhizome, August 10, 2015, http://rhizome.org/editorial/2015/aug/10/artport-interview-christiane-paul/
32 Ibid.
exhibitions. This special collection of net art has now been given equal standing to some of the
more traditional mediums in the Whitney’s permanent holdings. It is important to note the nature
of Artport’s pieces; they are all commissioned artworks. While the Whitney reserves the right to
display the works in perpetuity, the artists themselves retain copies of their work as well as the
right to exhibit their work as they please, so long as it includes a credit line stating its origin as a
Whitney commission. According to Paul, the Artport pieces were not brought into the collection
as traditional acquisitions (i.e., exclusive ownership), as doing so would feel like a violation of
some of the central characteristics of net art and its digital nature:33
We therefore chose to take a hybrid approach that makes Artport an adjunct of
the collection: all the works maintain their non-exclusive status but, at the same
time, Artport as a whole became associated with the collection. The "Artport
collection" is now given the same administrative purview as the Museum's
collections. This means that all of the artists are treated as collection artists and
that we are committed to preserving their work.34
So, while the Whitney follows a more traditional museum model in that it does strongly
focus on exhibiting and expanding its permanent collection (hardly modest at over 22,000 works
by some 3,000 artists35) and does not house an independent curating organization within the
larger institution, it still employs a hybridized model of curating in order to accommodate new
media art. Similar to Rhizome, Artport keeps a rather narrow purview within new media. Artport
exclusively works with and refers to net art. At this point in time, again similar to Rhizome, it
also privileges non-haptic new media (artworks that have no physical footprint or cannot be
interacted with through physical engagement). As discussed above, the Whitney only very
recently brought net art into the fold of its permanent collection and the way it decided to
33 Christiane Paul quoted in Marisa Olson, “Collectible After All: Christiane Paul on net art at the Whitney Museum,” Rhizome, August 10, 2015, http://rhizome.org/editorial/2015/aug/10/artport-interview-christiane-paul/
34 Ibid.
35 “About the Collection,” whitney.org, accessed January 3, 2016, http://whitney.org/Collection/AboutTheCollection
accomplish this was not through a traditional accessioning process that occurred piece by piece
over time. While it is not uncommon for an institution to bring the entirety of a single collection
into its permanent holdings, it is rather different and surprising for a major museum such as the
Whitney to incorporate a collection of artwork that was previously essentially wholly
unrepresented. It might be fair to say that the nature of the collection might evolve over time to
include a wider variety of contemporary new media beyond net art. For the time being, however,
the Whitney’s collection is limited to the virtual and a selection of well-known and well-received
works such as Cory Arcangel’s Super Mario Clouds (2002). If the new media portion of the
permanent collection retains this current structure of largely being molded around Artport - only
making future additions to its new media collection via Artport commissions - then it seems
unlikely that the Whitney’s new media curatorial model will diversify. That said,
Artport might evolve to become a hybrid, with a function similar to that of Rhizome within the
New Museum, rather than completely integrating new media into the Whitney’s permanent
collection. In its present form, it is not so much a new media collection as a net art
collection.
It should be noted that, via its acquisition patterns, the Whitney is making an important
distinction between not just net art and new media but also new media and art and technology. If
we look at the Whitney’s new media holdings as a whole, due to the great number of artworks
within Artport, the demographics of the genre are consequently skewed. This is not to say that the
Whitney has no other new media artworks in its permanent collection; however, the other pieces
that fall into this broader collection category are skewed as well, toward the more historic end of
new media: they represent the art and technology roots of the field.
It is also important to recognize that the limited selection of art and technology holdings is not an
attempt to illustrate the breadth or evolution of this period in the 1960s and 1970s so much as to
represent big names in the art world who happened to be working with art and technology during
that time.
For example, the Whitney owns work by such LACMA A&T artists as James Turrell
and by such New York-based artists as Claes Oldenburg, figures who went on to have very
successful careers.
The Turrell works in their collection are a series of aquatints and etchings from the
artist’s Mapping Spaces project which, if examined based only on their medium, might not seem
like art and technology works at all. However, this project, begun in 1974 and taking place
within a dormant volcano, was an investigation by the artist into geology, space, and, like so
much of his work, light. Turrell modified the volcanic crater, Roden Crater, by adding
observation chambers and tunnels so as to afford visitors new ways of viewing and experiencing
the coming together of these fields of scientific study.36
Turrell was able to acquire Roden Crater
only with the assistance of the Dia Foundation, which is primarily funded by the de Menil family
fortune – a fortune that is built on the corporate oil industry. On the other hand, the Oldenburg
works are altogether more readily identified with art and technology. The Whitney owns one of
the three sculptural variations of the artist’s famous mechanical ice bags, Ice Bag-Scale C
(1971), which, as I noted above, he made during his time at LACMA’S A&T Program in
collaboration with Gemini G.E.L.
With precious little in the Whitney’s new media collection between these older art and
technology works and more contemporary net art, one can only hope that the addition of Artport
is a sign that the Whitney will be moving towards a more well-rounded new media collection
(one that will continue to fill in the gaps between the plainly historical and the very
36 Niamh Coghlan, “New Interpretations of Colour,” Aesthetica Magazine, accessed December 14, 2015, http://www.aestheticamagazine.com/new-interpretations-of-colour/
contemporary) in the not too distant future. Such a move would particularly be of benefit to the
greater new media landscape due to the Whitney’s position as a larger institution. While the
institution does have an area of specialization, namely, American art throughout the twentieth
and twenty-first centuries (beginning with the Ashcan School37), and is therefore not an
encyclopaedic museum, it is big and diverse enough on a cultural level to be able to reach people
who otherwise might not be interested in a smaller museum with a more specific focus like the
New Museum. The Whitney is more likely to bring a bigger crowd due its ability to attract both
art world initiates and more casual (or more conservative) cultural tourists. And because it has a
longer, more established history, it feels like a safer bet to many people. A visitor less familiar
with art might be more inclined to make the trip to see a piece by Willem de Kooning rather than
an artist with a less well established reputation. The visitor might be more likely to supplement
their visit to the Whitney to see an artist like Laura Poitras (whose Academy Award and Pulitzer
Prize-winning work should be no less impressive) than seek her out as a first priority. It seems
even less likely for a casual museumgoer to gamble with their time and money on work that is
potentially entirely unfamiliar (and potentially less accessible) like Cheryl Donegan’s
performative video parodies of commercials and music videos at the New Museum. With any
luck, casual visitors’ supplementary art experience leads them to appreciate both facets of
the collection more than they would have had they only seen the de Kooning. And should that
luck continue, that casual visitor might become a regular one, seeking out the Cheryl
Donegan-type shows thereafter. The larger the institution, the more curators and
administrators need to view it as a gateway, as a space for new conversations to begin.
37 “About the Collection,” whitney.org, accessed January 3, 2016, http://whitney.org/Collection/AboutTheCollection
LACMA: ART + TECHNOLOGY LAB
When the Los Angeles County Museum of Art (LACMA) announced that it would be
reviving its historic Art and Technology Program in December of 2013, it seemed to be making
good on CEO and Director Michael Govan’s continuing pledge to expand the presence of new
media art at the museum.38
However, upon closer examination, the new iteration of the Art +
Technology Lab is ostensibly a public program rather than an expanded integration or
engagement with LACMA’s art collections.
As noted, the first iteration of the Art and Technology Program (which ran from 1967 to
1971) was curated by Maurice Tuchman and aimed to pair artists (such as Andy Warhol, James
Turrell, and Richard Serra) with a broad spectrum of major American corporations (such as those
relating to the aeronautics, steel, and entertainment industries) in the hope that these corporate
entities might offer specialized knowledge, facilities, or materials for artists to utilize en route to
creating a unique collaboration. Even though, as previously mentioned, relatively few
pairings ultimately produced an exhibitable object or outcome, the program succeeded in pioneering
a new model of interaction between artists and major corporate industry within an institution.
The 1967 A&T Program invited roughly 250 companies to participate; in the end, thirty-seven
companies accepted.39 Tuchman and his team contacted sixty-four artists.40
All but three
of them agreed. The initiative was quite open-ended: neither company nor artist was expected to
come into the project with a set plan, and while contracts stated that the artist was to spend no
more than three months with the company, there was near-universal agreement that
38 Jason Edward Kaufman, “Interview with LACMA Director Michael Govan [Part 2]: On New Media, Redefining Contemporary Art, and the Convergence of Film and Art in L.A.,” accessed August 17, 2015, http://www.jasonkaufman.info/index.php/9-inview/39-11interview-with-lacma-director-michael-govan-part-2-on-new-media-redefining-contemporary-art-and-the-convergence-of-film-and-art-in-l-a
39 Tuchman, 12.
40 Ibid., 49.
should the pairing yield interesting work, the artist could stay on as he liked.
41
A&T, through
Tuchman’s initiative, was much more focused on the collaboration between industry and artistic
practice than the production of an object. However, according to Tuchman’s report, a number of
the companies signed on with an interest in receiving artwork in return for their sponsorship.
Tuchman put forth a tiered sponsorship system that saw each company investing either $7,000 or
$14,000. In addition to potential artwork that might be received, these sponsors were afforded
brand placement on each of LACMA’s advertisements; the bigger the investment, the more
prominent the advertisement.
42
Many critics, such as Maxwell Williams, have long taken A&T’s relatively brief lifespan to be a sign of its failure. It certainly fell short in a number of practical and conceptual areas, including a lack of cohesive direction, a lack of budgetary control that led certain companies to drop out of the program, and an eager conflation of technology with corporate industry. Despite all of these missteps, however, the Art and Technology Program has now achieved a rather mythic art historical status. Art historians and theorists such as Pamela Lee, Anne Collins Goodyear, and Peter Lunenfeld have alternately helped to spark or to maintain this wave of interest. It could be argued that some of this continued legacy is due to already well-known artists conceiving of new artistic practices during the program that would lead to some of their best-known work. Richard Serra’s first explorations in large-scale steel forms, for example, came about during his A&T residency at Kaiser Steel.[43]
New media theorist Peter Lunenfeld posits that the biggest problem facing A&T at the time was that “the nation's attitudes toward technology transformed between 1966, when [Tuchman] first proposed A&T, and 1971, when it opened.”[44] He is referring to the American public’s association of technology in general (and participating companies like Lockheed Aircraft Corporation in particular) with government and state power, most blatantly on display during the Vietnam War, which was still being fought. Despite this sizable hurdle, the collaboration between art and technology was always going to strike a lasting note with the public because of the way the region had developed. Southern California has been an ideal location for art and technology for over fifty years because the corporate industries that established themselves in the area have had a profound impact on fabrication, fine art or otherwise.[45] A&T broke the seal on the collaboration between art and science (or, more specifically, corporate industry) not only in regard to artistic practice but also in terms of participation and embodiment. This can be seen as a reflection of A&T’s roots in Dada, Fluxus, and Happenings in that it continued to affect and expand systems of viewing and displaying art. The relationship between art and technology has never truly reverted to the way it was before 1967; these systems of mutual influence have only gained steam, and this symbiotic relationship has once again come to a head in the past decade or so.

41. Ibid, 10.
42. Ibid, 11.
43. Maxwell Williams, “LACMA’s Art and Technology Program Returns,” Artbound, January 16, 2014, http://www.kcet.org/arts/artbound/counties/los-angeles/lacma-art-technology-lab.html
Nearly fifty years later, LACMA announced the revival of its pioneering program to much fanfare and curiosity.[46] Despite the almost half century between the two events, the social conditions are markedly similar, and so is the level of cultural interest. During the late 1960s and early 1970s, the American public was experiencing the social, technological, and political effects of prolonged military engagement overseas as well as a new level of overall technological fluency (long-distance calls, computer games like Pong and Spacewar, pocket calculators). This fluency stemmed not only from what became commercially available to a wider public but from the kinds of technology that became the norm. In the period immediately surrounding the Lab’s relaunch in 2014, we have likewise been experiencing a long-term American military involvement overseas and the near-total ubiquity of social media and mobile technology.

44. Peter Lunenfeld, “Art and Technology,” Artforum, September 2015, https://artforum.com/inprint/issue=201507
45. Ibid
46. “LACMA Introduces Art + Technology Lab,” lacma.org, December 10, 2013, https://www.lacma.org/sites/default/files/A%2BT-Release-FINAL-12.11.13.pdf

While LACMA’s 2014 revival of the Art + Technology Lab yielded a few notable similarities to the original, the overall project diverged quite a bit on some of the most important points. For instance, while both the original and the 2014 Lab were quite open-ended, in that neither was predicated upon completed artworks or even tangible results, the 2014 artist selection process took the form of a Request for Proposals (RFP) -- a PDF on the LACMA Lab website with a description of the program and application guidelines -- and a selection committee. The 1967 A&T Program had instead approached specific artists directly to pitch the idea and recruit them, a process that took over two years.[47] By the end of that process, Tuchman and his team had received seventy-eight unsolicited proposals; none were accepted into the program.[48] Further, where Tuchman’s program ended up including over sixty artists, the 2014 Lab selected only five projects to receive grants: four artists working on their own and one two-person collaborative team.[49]
Another major difference between the 2014 Lab and Tuchman’s program was the inclusion of women. Of the six 2014 participants (Annina Rust, Taeyoon Choi and E Roon Kang, John Craig Freeman, Rachel Sussman, and Tavares Strachan), two were women, and only one artist was a white male. Proportionally, this is a considerable departure from the original program’s group of artists, all of whom were white men. It is also interesting to note that none of the 2014 Lab artists chosen were from the Los Angeles or Southern California area; in fact, the vast majority were based in New York City or abroad.[50]

47. Tuchman, 19.
48. Ibid, 19.
49. “Announcing Art + Technology Lab Artist Grant Awards,” lacma.org, April 9, 2014, http://unframed.lacma.org/2014/04/09/announcing-art-technology-lab-artist-grant-awards
In March of 2015, LACMA announced a new long-term corporate sponsorship with Korean automotive manufacturer Hyundai, simply called The Hyundai Project. This sponsorship (along with sponsorships by Accenture, NVIDIA, DAQRI, SpaceX, Google, and Gensler[51]) pledged to further two major areas of focus within the museum: Korean art scholarship and Art + Technology. The project will take the form of acquisitions, exhibitions, and publications through 2024.[52] The 2015 Lab press release differs from the previous year’s in that it now categorizes the Lab as part of “The Hyundai Project: Art + Technology at LACMA, a joint initiative exploring the convergence of art and technology.”[53] While this sponsorship is a long-term commitment, and additional funding, corporate or not, is a boon to the field in general, it once again creates a sense of distance from the rest of the museum. Tying the work done at the Lab so tightly to that of a major corporate sponsor gives the impression that the Lab is a pet project for the museum rather than a primary concern, and it continues to paint the Lab as a special-interest entity that requires the involvement of a specific kind of company to be funded. Perhaps even more troubling, this distance signals a lack of permanence. This sense of ephemerality is worrisome in that it might be understood as a hedged bet by the museum, which may be keeping the Lab around only until this wave of interest wanes and becomes less bankable once again. While it might be argued that bringing new media art -- much of which has no physical materiality or is intended to live on the web -- into an institutional setting might limit its potency or denature the work, it is still imperative that new media join the larger art historical narrative and conversation. Given prevailing social understandings of culture, the only way to accomplish such legitimacy in the public’s eye is to put the work into a museum setting.

50. Jori Finkel, “Los Angeles Museum Grants to Promote Art and Technology,” New York Times, April 9, 2014, http://mobile.nytimes.com/blogs/artsbeat/2014/04/09/los-angeles-museum-grants-to-promote-art-and-technology/
51. “Announcing Art + Technology Lab Artist Grant Awards,” lacma.org, April 9, 2014, http://unframed.lacma.org/2014/04/09/announcing-art-technology-lab-artist-grant-awards
52. Jessica Youn, “The Hyundai Project,” Unframed, lacma.org, March 26, 2015, https://unframed.lacma.org/2015/03/26/the-hyundai-project
53. Amy McCabe Heibel, “Eight Artists Receive Art + Technology Lab Grants,” Unframed, lacma.org, June 10, 2015, https://unframed.lacma.org/2015/06/10/eight-artists-receive-art-technology-lab-grants
The second round of the Request for Proposals in 2015 yielded eight artists in total, two more than in the inaugural group: Gabriel Barcia-Colombo, Nonny de la Peña, Cayetano Ferrer, Jonathon Keats, Nana Oforiatta-Ayim, Alex Rivera, Matthew Shaw, and William Trossell.[54] Unlike the previous year’s artists, three of the 2015 recipients are Los Angeles based. Another notable change from the inaugural press release announcing the first round of grant recipients is the specific mention of the public engagement aspect of the initiative:

    One of the things we ask the artists to describe in making a proposal to the Lab is the opportunity for the public to see prototypes and works in progress, and to hear the artist talk about the process of exploration and experimentation. Less emphasis is given to completing a finished work during the 12-month Lab program than to purposeful risk-taking and iteration.[55]
The 2015 Lab’s foregrounding of public access and engagement can be seen as an important signal that warrants further consideration. On the one hand, by once again painting the Lab as a place of community events, LACMA restates its stance on the Lab as a public program. It positions the Lab as a series of guest lectures rather than as a newly acquired piece or collection, reaffirming the distance that the LACMA administration has built between the museum itself and the Lab. On the other hand, it can also be read as a sign of the new hybrid qualities of contemporary new media art practices. While older new media practices were instilled with a sense of isolation from other human bodies (either through the creative process of the artist or through the human-to-screen kind of interactivity), newer practices -- like the ones in the Software show, discussed earlier -- have begun to return to the physical. This emphasis on reinstalling human bodies into the process of production and experimentation, rather than only at the end to view a finished work, implies that the proximity and interaction of bodies is a non-negotiable component of every artist’s project. This embodied experience was previously associated with the interactive art and kinetic sculptures being made by artists in the 1980s and 1990s (like Jeffrey Shaw’s The Legible City from 1989), but has been seen less frequently in more recent new media practices, especially as part of the generative process.

54. Ibid
55. Ibid
Further ties between new media and the body might be drawn from LACMA’s choice to exhibit Rain Room (2012) by Random International alongside Daisy Waterfall (Rain Machine), the 1971 piece Andy Warhol created during the first A&T Program. As mentioned above, Daisy Waterfall (Rain Machine) consists of a wall of mounted lenticular hologram daisy prints set behind a double wall of water that rained down between the viewer and the daisies. Rain Room, on the other hand, is a fully immersive environment that allows visitors to enter a room in which water continually falls from the ceiling; special software and sensors, however, keep the area immediately above and around each visitor dry. The viewer is thus able to walk through and admire falling rain without ever getting wet; this is nothing if not an embodied experience. Given the formal similarities between the two artworks and the scheduling of their exhibits, documentation and some of the lenticular daisy prints from Warhol’s Daisy Waterfall (Rain Machine) were shown as part of a small historical show at LACMA about the original A&T Program. That show, From the Archives: Art and Technology at LACMA, 1967-1971, was on view from March 21, 2015 to October 25, 2015, while Random International’s Rain Room opened on November 1, 2015 and will close on April 24, 2016. Given these two initiatives, the implication LACMA is making about the continuing hybridization and the reinsertion of the viewing body in new media is hard to miss.
While I take a critical view of LACMA’s adamant separation of the Lab from the curated art collections, I would be remiss if I did not draw attention to the benefit of the Lab’s focus on process and documentation. This focus does, at the very least, represent a facet of new media that has been increasingly underrepresented in curatorial and institutional conversations: new media as it continues to remanifest in the physical world. The popularity and accessibility of net art -- important though it is -- steers the new media conversation in a direction that would have us believe that the future of new media will not have much of a physical footprint, existing almost entirely in digital form. Beyond this benefit, however, there is a central problem in privileging process without requiring a finished product. This format is not wholly unproblematic, not only because it denies the institution and its visitors an exhibit, but also because it removes a potential dimension of the collaborative process. Arguably, the program would still retain its investigation into artistic process if it required artists to at least attempt to create a piece while allowing for, or even encouraging, failure in a more concrete fashion. Without that extra impetus, this snapshot of creative practice lacks the problem-solving, workaround “kludge”[56] ethic (a quick and rough, usually DIY answer to a specific hardware or software issue) that runs deep in the heart of new media.
LACMA’s continued focus on documentation of the process rather than on the outcome is also a bit misleading. It is true that the Lab makes the “progress reports” for each artist available to the public. However, these progress reports are not as revealing as they could be. At the end of the day, each of these summary reports is a standardized piece of paperwork, not entirely dissimilar to a request for a field trip or a new office printer. This paperwork (available to the public on the Lab’s website) consists of little more than updates on scheduling, project milestones, and budgetary funding in list form; there is not much in the way of reflection on, or description of, the creative process. If anything, these bureaucratic check-ins reveal more about the institution’s process of working with artists than about the artists’ individual practices or methods of engagement with technology. A further lack of clarity surrounds the display of any eventual works created: it is unclear when, or if, this might ever take place.

56. Peter Lunenfeld, “Art and Technology,” Artforum, September 2015, https://artforum.com/inprint/issue=201507
Curated by Jennifer King, From the Archives: Art and Technology at LACMA, 1967-1971 was a mini-retrospective about A&T on view in a very small, three-walled room buried in one of the museum’s innermost sub-galleries. It was surrounded on all sides by the museum’s other modern art galleries, nestled between the Surrealist and Abstract Expressionist collections. The room featured one video monitor showing a film about Claes Oldenburg’s Giant Ice Bag, walls hung with documents and ephemera, and one medium-sized vitrine that housed a couple of small maquettes, proposals, and correspondence from both artists and corporations interested in participating in the program. Immediately outside the room, in a corner of the Surrealist gallery, were two of René Magritte’s most well-known and beloved masterpieces: The Treachery of Images (This Is Not a Pipe) from 1929 and The Liberator from 1947. Certainly, it might be argued that the placement of the rather minuscule A&T exhibition immediately behind these two pieces, such that it sat on the opposite side of the wall from The Liberator, was meant to draw in a wider audience: those visitors who come seeking only what they have seen reproduced in books, on postcards, or tacked up on dormitory walls, and who had never heard of the Art and Technology Program. However, it might also be argued that this is simply a case of a large institution expressing its lack of concern for the subject and hiding the show in an out-of-the-way location. Unfortunately, after having spent an extended period of time haunting the A&T room and the surrounding galleries, I feel that its placement might not have been a terribly successful strategy. Almost without fail, museumgoers would photograph and fawn over the cultural touchstones in front of them, glance briefly at the A&T display (often without crossing the threshold into the small room), and continue on their way. A handful wended their way around the vitrine, and of that handful, maybe half committed to truly investigating the ephemera. During my viewing, this group appeared to consist largely of art students and retirees.
Despite this already somewhat lackluster showing, arguably the museum’s most egregious offense was the complete lack of acknowledgement of, or reference to, the current iteration of the Art + Technology Lab, housed two buildings away. This erasure above all indicates a pointed effort on LACMA’s part to keep the Lab as public programming, to make sure it cannot be conflated with “fine art.”
CONCLUSION
The current landscape of new media art practice has begun to reach a peak of hybridity in terms of the balance between the physical and the immaterial. This most recent generation of new media works is moving beyond the net or the screen and back into physical objects and interpersonal interactions or performances. Far from leaving the immaterial or virtual behind, new media is rematerializing as paintings influenced by social media imagery or sculpture based on app-generated data. Richard Prince’s controversial 2014 show, New Portraits, at Gagosian Gallery in New York is an example of a direct collision between social media and appropriation art reminiscent of Duchamp’s assisted readymades. And yet, in many ways, New Portraits was a rather traditional photography show in a blue-chip gallery. As such, the artwork is, in the purely formal sense, beginning to once again resemble art to which major museums are more accustomed. If anything, this moment seems an ideal one to commit to finally bringing new media into museums on a more permanent basis. However, a couple of issues seemingly continue to plague major institutions: narrative history and economic history.
In regard to narrative history, should a museum decide to invest in and incorporate this “new” new media -- which may constitute a return to traditional collectability -- it must acknowledge the multiple decades’ worth of new media works that have for so long been swept under the rug. Doing so would require a re-examination and rewriting of the canon of contemporary art history. New media as we currently know it has direct ties to numerous movements already well represented in the traditional art historical narrative, such as the Bauhaus, Dada, Fluxus, and Happenings. These movements are linked by more than form or content; they also share theoretical and conceptual groundings. All or most of these genres situate the understanding and production of art in experience and blur the boundaries between art, life, and/or design. Whether that specific understanding of experience is John Dewey’s or Jack Burnham’s version[57] -- both of which engage with an individual’s past interactions or system of comprehension rather than necessitating direct interaction with a machine -- is up for interpretation, but with this kind of lineage, a return to or reinjection of embodied experience into contemporary new media practices can hardly be a surprise. And due to a protracted reluctance by major institutions to integrate new media into their permanent collections, there are gaps in conservation and education departments to be addressed; new expertise, and potentially new equipment, would need to be sought in order to support these neglected pieces in perpetuity. Institutional acknowledgement of these gaps, despite its urgency and necessity, would open a sizable can of worms for any museum that might choose to finally adopt new media. As John Dewey has noted: “[T]he work of art is often identified with the building, book, painting, or statue in its existence apart from human experience. Since the actual work of art is what the product does with and in experience, the result is not favorable to understanding.”[58]

57. Edward A. Shanken, “The House That Jack Built: Jack Burnham's Concept of ‘Software’ as a Metaphor for Art,” Leonardo Electronic Almanac 6:10, last modified November 1998, http://mitpress.mit.edu/e-journals/LEA/ARTICLES/jack.html
This new hybridity, this understanding of new media as being beyond medium-specificity, can be found not only in artistic practices but in curatorial practices as well. Principles, systems of thought, and even buzzwords traditionally associated with new media have become an assumed part of every working curator’s skill set, even for those who do not work in fine art institutions. Framing the visitor experience in the context of social media, emphasizing the digitization of historical archives, and engaging with visitors through virtual media are now commonplace strategies. While this embrace of the digital might be explained away as an obvious development in response to changing marketing practices, I would argue that much of the change in marketing practices (hiring full-time social media engagement specialists, offering exclusively digital goods and services, running online lotteries and competitions) is caused by the same mass cultural technology saturation that is propelling changes in new media, and that the two are therefore linked. And while this new hybridity is hardly a bad development, it does seem to have happened rapidly and without much general recognition or attribution.

58. John Dewey, Art as Experience (New York: Capricorn Books, 1959), 1.
Another major factor in the current dearth of new media in major museums does not necessarily come from the institutions directly but rather from art historical and curatorial educational models. The vast majority of new media art history is taught either in parallel with the traditional canon or as a specialty topic given one or two classes (if that) before the course returns to Art History proper. For example, in H.H. Arnason and Elizabeth C. Mansfield’s History of Modern Art (seventh edition), the only mention of new media art in the table of contents falls under “Chapter 20: Playing by the Rules: 1960s Abstraction,” and it covers only kinetic art and light art. Older works of new media are subsumed into other categories such as Conceptual art or Post-minimalism (Chapters 22 and 23, respectively), while the only real mention of more contemporary internet art comes in the form of Ai Weiwei and the “Arab Spring” in relation to globalization and activist art (Chapter 27).[59] It can hardly be a surprise to find this othering within both traditional art historical and curatorial training reflected in the research and exhibitions that curators bring into myriad institutional settings. Even when curators do choose to put on art and technology shows, they are by and large either historical shows focused on the roots of new media in the 1960s and 1970s or shows of only the most contemporary new media practices -- which often include the more recent work of the same older artists found in the historical shows. From the vantage point of curators and institutions in the 2010s, work from half a century earlier seems to have become historicized enough that it can now be handled safely.
Debate over proper terminology when discussing new media is, like many things about new media, not new. And while “new media” has traditionally been a rather broad catchall that might refer to any genre from video games to internet art to interactive art to bio-art and then some, a new distinction has become more apparent of late: new media versus art and technology. In the past, “art and technology” was used interchangeably with “new media” as another catchall term, but with the new hybridity of art and curatorial practices discussed in this paper, this lack of differentiation is proving less apt. I would argue that, despite the continued confusion over the lack of novelty actually present in new media, “art and technology” as a term is falling out of applicability at a more rapid rate than “new media.” While “new media” continues to feel broad and can refer to practices from multiple periods in art history, “art and technology” feels increasingly appropriate only for describing the era from which the first LACMA Art and Technology Program emerged, as opposed to the 2014 Lab. Even though the new iteration of A+T shares a name and legacy with Tuchman’s program, the name reads more as a reference to the 1960s and 1970s than to the 2014 Lab itself. Arguably, had the current Lab made more of a break from the original model instead of reaffirming the association of “art and technology” with corporate industry, there would still be room for divergence and thus more breadth in the term’s meaning. Had the current Lab changed the nature of the artist-technology dynamic to one of true collaboration -- where the artist and the technology (or technologist) are equally beholden to the generative process -- rather than one where the artist is simply given access to specialists or specialized equipment, the term “art and technology” might still be applicable to an entire field rather than to one outdated practice.
Bibliography
Coghlan, Niamh. “New Interpretations of Colour,” Aesthetica Magazine, accessed December 14, 2015, http://www.aestheticamagazine.com/new-interpretations-of-colour/

Finkel, Jori. “Los Angeles Museum Grants to Promote Art and Technology,” New York Times, April 9, 2014, http://mobile.nytimes.com/blogs/artsbeat/2014/04/09/los-angeles-museum-grants-to-promote-art-and-technology/

Goodyear, Anne Collins. “From Technophilia to Technophobia: The Impact of the Vietnam War on the Reception of ‘Art and Technology,’” Leonardo 41 (2), (Cambridge: The MIT Press, 2008), 169-173, http://www.jstor.org.libproxy2.usc.edu/stable/20206559

Graham, Beryl, editor. New Collecting: Exhibiting and Audiences After New Media Art. Surrey, England: Ashgate, 2014.

Graham, Beryl, and Sarah Cook. Rethinking Curating: Art After New Media. Cambridge: MIT Press, 2010.

Heibel, Amy McCabe. “Eight Artists Receive Art + Technology Lab Grants,” Unframed, lacma.org, June 10, 2015, https://unframed.lacma.org/2015/06/10/eight-artists-receive-art-technology-lab-grants

Higgins, Dick. “Intermedia,” republished in Leonardo, vol. 34, 2001, 49-54.

Lauder, Adam. “Executive Fictions: Revisiting Information,” Master’s thesis, Concordia University, 2010.

Lunenfeld, Peter. “Art and Technology,” Artforum, September 2015, https://artforum.com/inprint/issue=201507

Manovich, Lev. The Language of New Media. Cambridge: MIT Press, 2002.

Martin, Julie. “A Brief History of Experiments in Art and Technology,” IEEE Potentials 34 (6), 2015: 13-19.

Olson, Marisa. “Collectible After All: Christiane Paul on net art at the Whitney Museum,” Rhizome, August 10, 2015, http://rhizome.org/editorial/2015/aug/10/artport-interview-christiane-paul/

Paul, Christiane. Digital Art. New York: Thames & Hudson, 2008.

Paul, Christiane. New Media in the White Cube and Beyond: Curatorial Models for Digital Art. Berkeley: University of California Press, 2008.

Pentecost, Claire. “Outfitting the Laboratory of the Symbolic: Toward a Critical Inventory of Bioart,” in Beatriz da Costa and Kavita Philip, eds., Tactical Biopolitics: Art, Activism, and Technoscience. Cambridge: The MIT Press, 2008.

Shanken, Edward. “The House That Jack Built: Jack Burnham's Concept of ‘Software’ as a Metaphor for Art,” Leonardo Electronic Almanac 6:10, last modified November 1998, http://mitpress.mit.edu/e-journals/LEA/ARTICLES/jack.html

Tuchman, Maurice. A Report on the Art and Technology Program of the Los Angeles County Museum of Art, 1967-1971. New York: The Viking Press, 1971.

Wagley, Catherine. “Closed Circuits: A Look Back at LACMA’s First Art and Technology Initiative,” East of Borneo, May 11, 2015, http://www.eastofborneo.org/articles/closed-circuits-a-look-back-at-lacmas-first-art-and-technology-initiative

Williams, Maxwell. “LACMA’s Art and Technology Program Returns,” Artbound, January 16, 2014, http://www.kcet.org/arts/artbound/counties/los-angeles/lacma-art-technology-lab.html

Youn, Jessica. “The Hyundai Project,” Unframed, lacma.org, March 26, 2015, https://unframed.lacma.org/2015/03/26/the-hyundai-project
Asset Metadata
Creator: Diu, Nateene (author)
Core Title: Re/locating new media in the museum
School: Roski School of Art and Design
Degree: Master of Arts
Degree Program: Art and Curatorial Practices in the Public Sphere
Publication Date: 05/05/2016
Defense Date: 05/13/2016
Publisher: University of Southern California (original); University of Southern California. Libraries (digital)
Tag: art and technology, curate, curating new media, curation, curatorial, LACMA, multimedia art, museum, net art, new media, new museum, OAI-PMH Harvest, Whitney Museum
Format: application/pdf (imt)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Jones, Amelia (committee chair); Kratky, Andreas (committee member); Moss, Karen (committee member)
Creator Email: diu@usc.edu, nateene@gmail.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c40-246130
Unique identifier: UC11276456
Identifier: etd-DiuNateene-4402.pdf (filename); usctheses-c40-246130 (legacy record id)
Legacy Identifier: etd-DiuNateene-4402.pdf
Dmrecord: 246130
Document Type: Thesis
Rights: Diu, Nateene
Type: texts
Source: University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA