Institute for Creativity, Arts, and Technology (ICAT)
The Institute for Creativity, Arts, and Technology is uniquely partnered with the Center for the Arts at Virginia Tech. By forging a pathway between transdisciplinary research and art, educational innovation, and scientific and commercial discovery, the institute fosters the creative process, opening new possibilities for exploration and expression through learning, discovery, and engagement.
Browsing Institute for Creativity, Arts, and Technology (ICAT) by Content Type "Conference proceeding"
Now showing 1 - 20 of 44
- 3D Time-Based Aural Data Representation Using D⁴ Library’s Layer Based Amplitude Panning Algorithm
  Bukvic, Ivica Ico (Georgia Institute of Technology, 2016-07)
  The following paper introduces a new Layer Based Amplitude Panning algorithm and the supporting D⁴ library of rapid prototyping tools for 3D time-based data representation using sound. The algorithm is designed to scale and support a broad array of configurations, with particular focus on High Density Loudspeaker Arrays (HDLAs). The supporting rapid prototyping tools are designed to leverage oculocentric strategies for importing, editing, and rendering data, offering an array of innovative approaches to spatial data editing and representation through the use of sound in HDLA scenarios. The ensuing D⁴ ecosystem aims to address the shortcomings of existing approaches to spatial aural representation of data and offers unique opportunities for furthering research in spatial data audification and sonification, as well as in transportable and scalable spatial media creation and production.
- Aegis Audio Engine: Integrating a Real-Time Analog Signal Processing, Pattern Recognition, and a Procedural Soundtrack in a Live Twelve-Performer Spectacle With Crowd Participation
  Bukvic, Ivica Ico; Matthews, Michael (Georgia Institute of Technology, 2015-07)
  In the following paper we present Aegis: a procedural networked soundtrack engine driven by real-time analog signal analysis and pattern recognition. Aegis was originally conceived as part of Drummer Game, a game-performance-spectacle hybrid research project focusing on the depiction of a battle portrayed using terracotta soldiers. In it, each of the twelve cohorts, divided into two armies of six, is led by a drummer-performer who issues commands by accurately drumming precomposed rhythmic patterns on an original Chinese war drum. The ensuing spectacle is envisioned to also accommodate large audience participation whose input determines the morale of the two armies. An analog signal analyzer utilizes efficient pattern recognition to decipher the desired action and feed it both into the game and the soundtrack engine. The soundtrack engine then uses this action, as well as messages from the gaming simulation, to determine the most appropriate soundtrack parameters while ensuring minimal repetition and seamless transitions between various clips, accounting for tempo, meter, and key changes. The ensuing simulation offers a comprehensive system for pattern-driven input, holistic situation assessment, and a soundtrack engine that aims to generate a seamless musical experience without having to resort to cross-fades and other simplistic transitions that tend to disrupt a soundtrack’s continuity.
- AffecTech: An Affect-Aware Interactive AV Artwork
  Coghlan, Niall; Jaimovich, Javier; Knapp, R. Benjamin; O’Brien, Donal; Ortiz, Miguel A. (ISEA International, 2009)
  New developments in real-time computing and body-worn sensor technology allow us to explore not just visible gestures using inertial sensors, but also invisible changes in an individual’s physiological state using bio-sensors (Kim & André 2008). This creates an opportunity for a more intimate interaction between the observer and technology-based art (Gonsalves 2008). We present a technical overview of the AffecTech system: a bio-signal based interactive audiovisual installation commissioned as part of the pre-ISEA symposium in November 2008. Observers were invited to sit on one of two sensor-enhanced chairs (Coghlan & Knapp 2008), which transmitted physiological data about the occupant to a central control system. This data was used to control and modulate interactive visuals, live video feeds, and a surround sound score, with events and interactions dependent on the observers’ affective/emotional state and the disparity or similarity between the bio-signals of the chairs’ occupants. This technical overview is followed by an examination of the outcomes of the project, from both the artistic and technical viewpoints, with recommendations for modifications in future implementations.
- Affective Feedback in a Virtual Reality based Intelligent Supermarket
  Saha, Deba Pratim; Martin, Thomas L.; Knapp, R. Benjamin (ACM, 2017)
  The probabilistic nature of the inferences in a context-aware intelligent environment (CAIE) renders them vulnerable to erroneous decisions resulting in wrong services. Learning to recognize a user’s negative reactions to such wrong services will enable a CAIE to anticipate a service’s appropriateness. We propose a framework for continuous measurement of physiology to infer a user’s negative emotions arising from receiving wrong services, thereby implementing an implicit-feedback loop in the CAIE system. To induce such negative emotions, in this paper we present a virtual reality (VR) based experimental platform while collecting real-time physiological data from ambulatory wearable sensors. Results from the electrodermal activity (EDA) data analysis reveal patterns that correlate with known features of negative emotions, indicating the possibility of inferring service appropriateness from a user’s reactions to a service, thereby closing an implicit-feedback loop for the CAIE.
- Cinemacraft: Immersive Live Machinima as an Empathetic Musical Storytelling Platform
  Narayanan, Siddhart; Bukvic, Ivica Ico (University of Michigan, 2016)
  In the following paper we present Cinemacraft, a technology-mediated immersive machinima platform for collaborative performance and musical human-computer interaction. To achieve this, Cinemacraft innovates upon a reverse-engineered version of Minecraft, offering a unique collection of live machinima production tools and a newly introduced Kinect HD module that allows for embodied interaction, including posture, arm movement, facial expressions, and lip syncing based on captured voice input. The result is a malleable and accessible sensory fusion platform capable of delivering compelling live immersive and empathetic musical storytelling that, through the use of low-fidelity avatars, also successfully sidesteps the uncanny valley.
- Contagion of Physiological Correlates of Emotion between Performer and Audience: An Exploratory Study
  Jaimovich, Javier; Coghlan, Niall; Knapp, R. Benjamin (2010)
  Musical and performance experiences are often described as evoking powerful emotions, both in the listener/observer and player/performer. There is a significant body of literature describing these experiences, along with related work examining physiological changes in the body during music listening and the physiological correlates of emotional state. However, there are still open questions as to how and why emotional responses may be triggered by a performance, how audiences may be influenced by a performer’s mental or emotional state, and what effect the presence of an audience has on performers. We present a pilot study and some initial findings of our investigations into these questions, utilising a custom software and hardware system we have developed. Although this research is still at a pilot stage, our initial experiments point towards significant correlation between the physiological states of performers and audiences, and we present here the system, the experiments, and our preliminary data.
- Creating a Network of Integral Music Controllers
  Knapp, R. Benjamin; Cook, Perry R. (NIME, 2006)
  In this paper, we describe the networking of multiple Integral Music Controllers (IMCs) to enable an entirely new method for creating music by tapping into the composite gestures and emotions of not just one, but many performers. The concept and operation of an IMC is reviewed as well as its use in a network of IMC controllers. We then introduce a new technique of Integral Music Control by assessing the composite gesture(s) and emotion(s) of a group of performers through the use of a wireless mesh network. The Telemuse, an IMC designed precisely for this kind of performance, is described and its use in a new musical performance project under development by the authors is discussed.
- Creating Biosignal Algorithms for Musical Applications from an Extensive Physiological Database
  Jaimovich, Javier; Knapp, R. Benjamin (NIME, 2015)
  Previously, the design of algorithms and parameter calibration for biosignal music performances has been based on testing with a small number of individuals - in fact, usually the performer themselves. This paper uses the data collected from over 4000 people to begin to create a truly robust set of algorithms for heart rate and electrodermal activity measures, as well as an understanding of how the calibration of these varies by individual.
- Design of a Mobile Brain Training Tool for Seniors: Motivational Requirements
  O’Brien, Donal; Knapp, R. Benjamin; Thompson, Oonagh; Craig, David; Barrett, Suzanne (ICST, 2012)
  The overall goal of this project is to design, develop, and validate a mobile phone-based ‘brain training’ software suite targeted at senior users, using iterative person-centered design methodologies, that will permit a subsequent clinical trial of cognitive stimulation efficacy known as the SONIC²S Study (Stirling-Oregon-Northern Ireland-Chicago Cognitive Stimulation Study). The SONIC²S Study will represent a long-term (c. 15 year), very large scale (n=12,000), embedded clinical trial that aims to definitively establish whether or not brain training acts to prevent dementia or cognitive decline. It is anticipated that participant compliance in such a study will be a significant concern. This study reports on a series of focus groups, usability studies, and field trials which sought to identify the key motivational factors influencing seniors’ engagement with mobile brain-training technology in order to inform the design of a bespoke tool which is acceptable/enjoyable to target users.
- Disciplinary Influences on the Professional Identity of Civil Engineering Students: Starting the Conversation
  Groen, Cassandra J.; Simmons, Denise Rutledge; McNair, Elizabeth D. (2016-06)
  As the discipline of civil engineering has evolved from an apprentice-based trade to a socially-engaged profession, the role of the civil engineer has responded to shifts within the ever-changing culture of society. These shifts and historical events have directly influenced what is considered to be valued civil engineering knowledge, behaviors, and practices that we teach to students during their undergraduate careers. As part of a larger grounded theory study that is currently being conducted by the authors, the purpose of this paper is two-fold. First, we present the topic of professional identity formation as heavily influenced by unique historical events that shape the civil engineering discipline. To establish the connection between identity formation and the history of civil engineering, we interpret historical events as constituents that create a disciplinary identity that is communicated to and subjectively applied by students during their undergraduate careers. Second, we hope to promote and invoke conversations surrounding the relevancy of civil engineering professional identity formation in engineering education among our colleagues within the technical disciplines. Through this paper, we add to ongoing research exploring the professional formation of engineering identities and promote discussions surrounding this topic at the disciplinary level. While most research conducted on identity formation has been generalized to include all or most engineering disciplines, we focus our discussion solely on professional identity formation within the civil engineering discipline. To reinforce the relationship between the history of the civil engineering profession and students’ professional identity formation, we review the literature on these two areas of inquiry. In particular, we will frame our paper using the following key discussion points: 1) providing a brief overview of key historical events of civil engineering in the United States; 2) discussing the influence of this history on instructor pedagogies and student learning within civil engineering education; and 3) conceptualizing this learning process as a means of professional identity formation. From this work, we will begin to understand how major historical shifts within our discipline maintain the potential to impact its future as we educate the next generation of civil engineering students. To conclude this paper, we will introduce current research that is being conducted by the authors to further understand the nuances of professional identity formation in undergraduate civil engineering students and how instructors may help or hinder that development.
- The Emotion in Motion Experiment: Using an Interactive Installation as a Means for Understanding Emotional Response to Music
  Jaimovich, Javier; Ortiz, Miguel A.; Coghlan, Niall; Knapp, R. Benjamin (NIME, 2012)
  In order to further understand our emotional reaction to music, a museum-based installation was designed to collect physiological and self-report data from people listening to music. This demo will describe the technical implementation of this installation as a tool for collecting large samples of data in public spaces. The Emotion in Motion terminal is built upon a standard desktop computer running Max/MSP and using sensors that measure physiological indicators of emotion and are connected to an Arduino. The terminal has been installed in museums and galleries in Europe and the USA, helping to create the largest database of physiological and self-report data collected while listening to music.
- Emotion in Motion: A Reimagined Framework for Biomusical/Emotional Interaction
  Bortz, Brennon; Jaimovich, Javier; Knapp, R. Benjamin (NIME, 2015)
  Over the past four years, Emotion in Motion, a long-running experiment, has amassed the world’s largest database of human physiology associated with emotion in response to the presentation of various selections of musical works. What began as a doctoral research study of participants in Dublin, Ireland, and New York City has grown to include over ten thousand emotional responses to musical experiences from participants across the world, from new installations in Norway, Singapore, the Philippines, and Taiwan. The most recent iteration of Emotion in Motion is currently underway in Taipei City, Taiwan. Preparation for this installation gave the authors an opportunity to reimagine the architecture of Emotion in Motion, allowing for a wider range of potential applications than were originally possible with the initial development of the tools that drive the experiment. Now more than an experiment, Emotion in Motion is a framework for developing myriad emotional/musical/biomusical interactions with co-located or remote participants. This paper describes the development of this flexible, open-source framework and includes discussion of its various components: hardware-agnostic sensor inputs, refined physiological signal processing tools, a public database of data collected during various instantiations of applications built on the framework, and the web application frontend and backend. We also discuss our ongoing work with this tool, and provide the reader with other potential applications that they might realize in using Emotion in Motion.
- ICAT Creativity + Innovation Day Exhibit and Event Catalog, May 4, 2020
  (Virginia Tech. Institute for Creativity, Arts, and Technology, 2020-05-04)
  A program for the festival held virtually on May 4, 2020.
- ICAT Creativity + Innovation Day, May 6, 2019
  (Virginia Tech, 2019-05-06)
  A program of events for the exposition held at the Moss Arts Center on May 6, 2019.
- ICAT Creativity and Innovation Day program, 2018
  (Virginia Tech, 2018-04-30)
  A program of events for the exposition held at the Moss Arts Center on April 30, 2018.
- ICAT Creativity and Innovation Day: Sensing Place Exhibit Catalog, 2017
  (Virginia Tech, 2017-05-01)
  An exhibit catalog for the event held at the Moss Arts Center on May 1, 2017.
- ICAT Day program, 2016
  (Virginia Tech, 2016-05-02)
  A program of events for the exposition held at the Moss Arts Center on May 2, 2016.
- ICAT Day: Beyond, May 1, 2023
  (Virginia Tech, 2023-05-01)
  A program for the exhibition held on May 1, 2023, at the Moss Arts Center. This year’s theme, Beyond, focused on the kinds of work people do beyond a transdisciplinary education, how disciplines and identity are intertwined, and how that can help and transform.
- Inner-Active Art: An examination of aesthetic and mapping issues in physiologically based artworks
  Coghlan, Niall; Knapp, R. Benjamin (ISEA, 2009)
  Much art seeks to describe or stimulate the feelings and emotions of the viewer, through both abstract and literal representation. With the exponential increase in computing power over recent years, we also seek new ways of interacting with technology and exploring the virtual world. Physiological signals from the human body provide us with a view into the autonomic nervous system, that part of the nervous system largely unmediated by the direct intentions of the viewer. With the appropriate choice of signals and processing, we can even develop systems with the ability to interact with us on an emotional level - machines that know how we feel and can react accordingly (Haag et al., 2004). This gives us the ability to see into and map the interior worlds of artists and viewers through a direct and visceral connection, the human body itself. A key issue in the development of physiologically based artwork is to make the observer-artwork dialogue meaningful to the observer, a question of translating the input bio-signals to visual, auditory, or experiential events. We have yet to develop a suitable language for this dialogue, and so this paper seeks to explore some potential mappings for bio-signal art, illustrated using several case studies from past and current works (Knapp et al., 2008) (Gonsalves, 2009). We also examine some of the other philosophical and artistic issues involved in 'affective' and bio-art, such as monitoring emotion vs. engendering emotion, the involvement of the observer in creating and contributing to bio-signal art, and strategies for effectively developing such works.
- The Integral Music Controller: Introducing a Direct Emotional Interface to Gestural Control of Sound Synthesis
  Knapp, R. Benjamin; Cook, Perry R. (Michigan Publishing, 2005-09)
  This paper describes the concept of the integral music controller (IMC), a controller that combines a gestural interface with direct emotional control of a digital musical instrument. This new controller enables the performer to move smoothly between direct physical interaction with an acoustical musical instrument and gestural/emotional control of the instrument’s physical model. The use of physiological signals to determine gesture and emotion is an important component of the IMC. The design of a wireless IMC using physiological signals is described and possible mappings to sound synthesis parameters are explored. Controlling higher-level musical systems such as conducting and style modelling is also proposed.