Jeff Burke



I am a professor of theater and associate dean, research and technology at the UCLA School of Theater, Film and Television (TFT). I co-direct REMAP, the Center for Research in Engineering, Media, and Performance, a joint effort of TFT and the Samueli School of Engineering at UCLA. 

My research interests are primarily in the use of emerging technologies in performance and other live experiences. I am also increasingly interested in how artists and human creative expression should guide fundamental technology development. 

This site provides an overview and links out to some of my projects, with the most current work towards the top.

jburke - at - ucla - dot - edu

Publications 

Bio, photo


LinkedIn

01 Immersive Theater + Augmented Reality

Recent experimental work using AR on stage includes A Most Favored Nation, set in the world of Amazon Studios’ The Man in the High Castle and described in this Epic Games article, as well as multidisciplinary workshops (pictured). I recently gave a talk and workshop on this topic at the Prague Quadrennial. Work in progress includes an adaptation of China Miéville’s The City and The City.
2014-




02 AI in Live Performance

My research and teaching have incorporated machine learning for computer vision, natural language processing, and style transfer into a variety of performances and experimental workshops. I’m currently collaborating with Jared J. Stein to update our pre-pandemic experimental work Entropy Bound, about a character who must use an AI as their memory after a brain injury, for today’s LLMs.
2012-


03 Named Data Networking (NDN)

NDN is a “data-centric” network technology originally funded by the NSF’s Future Internet Architecture program. At REMAP, we are exploring how it enables applications that are difficult to achieve on the current Internet. New collaborative work with Lixia Zhang and Dirk Kutscher on decentralized approaches to extended reality recently won the Omidyar Network’s Future of Data Challenge.

named-data.net
2010-




04 Learning Environments 

Real-time and interactive technology designed with live experience in mind can also support and draw from research in hands-on learning environments. Previous work in this area includes research on science learning environments with Noel Enyedy and Joshua Danish. That collaboration drove the creation of OpenPTrack, still one of the few open-source, low-latency multi-camera tracking systems. I am now involved in a pilot collaboration with the UCLA Anderson School of Management (pictured) on immersive learning.
2007-




05 Other Theater/Performance Work

The more recent work above builds on experimental performances, workshops, and teaching exploring the use of sensing, real-time media, and computation at a variety of scales, with support from a Google Focused Award, the Trust for Mutual Understanding, and other sources. Many of these projects are described on the REMAP website.
2001-

06 cheLA

Since 2003, I have collaborated with my colleague Fabian Wagmister on his Centro Hipermediático Experimental Latinoamericano (cheLA), organizing workshops and experimental performances and developing and supporting technologies for a variety of projects. The cheLA + UCLA trailer highlights this unique space and community.
2003-




07 Moving Image  

My own research and other responsibilities at UCLA have led to a number of interesting moving image projects, including co-producing Francis Ford Coppola’s Distant Vision workshop, the TFT / Swarovski collaboration on Waterschool, shot by our students across six countries, and Recoding Innovation, a pair of NSF-supported shorts on ethics as a generative force in science and engineering.



08 Installation

At REMAP and the Hypermedia Studio that preceded it, I have created, and acted as systems designer and engineer for, a variety of interactive media installations, most often exploring the use of sensing and real-time media in public art involving community contributions. In 2019, I ran a summer institute where students used AI to conceive and build mixed reality installation concepts inspired by the LA 2028 Olympics.
 

1997-2016
(mostly)




09 Participatory Sensing

At the NSF Center for Embedded Networked Sensing, I collaborated with Deborah Estrin, Mark Hansen, and other faculty, students, and research staff on how to enable people to “see the signals” that could be captured by their smartphones, and use them for personal and community benefit.  
2006-2012