Scholarly Works, Institute for Creativity, Arts, and Technology (ICAT)
Research articles, presentations, and other scholarship
Browsing by Issue Date: showing 1 - 20 of 39
- Multimodal Interaction in Music Using the Electromyogram and Relative Position Sensing. Tanaka, Atau; Knapp, R. Benjamin (NIME, 2002). This paper describes a technique of multimodal, multichannel control of electronic musical devices using two control methodologies, the Electromyogram (EMG) and relative position sensing. Requirements for the application of multimodal interaction theory in the musical domain are discussed. We introduce the concept of bidirectional complementarity to characterize the relationship between the component sensing technologies: each control can be used independently, but together they are mutually complementary. This reveals a fundamental difference from orthogonal systems. The creation of a concert piece based on this system is given as an example.
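As an illustration of how two such complementary streams might be fused, the following Python sketch computes an EMG amplitude envelope as an effort proxy and combines it with a relative position value. The scaling constant and the pitch/loudness mapping are illustrative assumptions, not the mapping used in the paper.

```python
import numpy as np

def emg_envelope(emg_frame: np.ndarray) -> float:
    """Root-mean-square amplitude of one EMG frame, a common effort proxy."""
    return float(np.sqrt(np.mean(emg_frame ** 2)))

def map_controls(emg_frame: np.ndarray, position: float) -> tuple[float, float]:
    """Fuse the two streams: each is usable alone, together they complement.

    position: relative position sensor reading scaled to [0, 1].
    Hypothetical mapping: position selects pitch, EMG effort drives loudness.
    """
    effort = min(emg_envelope(emg_frame) / 0.5, 1.0)  # 0.5 = assumed full-effort RMS
    pitch_hz = 220.0 * 2.0 ** (2.0 * position)        # spans two octaves above A3
    return pitch_hz, effort
```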
- The Integral Music Controller: Introducing a Direct Emotional Interface to Gestural Control of Sound Synthesis. Knapp, R. Benjamin; Cook, Perry R. (Michigan Publishing, 2005-09). This paper describes the concept of the integral music controller (IMC), a controller that combines a gestural interface with direct emotional control of a digital musical instrument. This new controller enables the performer to move smoothly between direct physical interaction with an acoustical musical instrument and gestural/emotional control of the instrument’s physical model. The use of physiological signals to determine gesture and emotion is an important component of the IMC. The design of a wireless IMC using physiological signals is described and possible mappings to sound synthesis parameters are explored. Controlling higher-level musical systems such as conducting and style modelling is also proposed.
- Creating a Network of Integral Music Controllers. Knapp, R. Benjamin; Cook, Perry R. (NIME, 2006). In this paper, we describe the networking of multiple Integral Music Controllers (IMCs) to enable an entirely new method for creating music by tapping into the composite gestures and emotions of not just one, but many performers. The concept and operation of an IMC is reviewed, as well as its use in a network of IMC controllers. We then introduce a new technique of Integral Music Control by assessing the composite gesture(s) and emotion(s) of a group of performers through the use of a wireless mesh network. The Telemuse, an IMC designed precisely for this kind of performance, is described and its use in a new musical performance project under development by the authors is discussed.
- Sensory Chairs: A System for Biosignal Research and Performance. Coghlan, Niall; Knapp, R. Benjamin (NIME, 2008-06). Music and sound have the power to provoke strong emotional and physical responses within us. Although concepts such as emotion can be hard to quantify in a scientific manner, there has been significant research into how the brain and body respond to music. However, much of this research has been carried out in clinical, laboratory-type conditions with intrusive or cumbersome monitoring devices. Technological augmentation of low-tech objects can increase their functionality, but may necessitate a form of context awareness from those objects. Biosignal monitoring allows these enhanced artefacts to gauge physical responses and from these extrapolate our emotions. In this paper a system is outlined in which a number of chairs in a concert hall environment were embedded with biosignal sensors, allowing monitoring of audience reaction to a performance, or control of electronic equipment to create a biosignal-driven performance. This type of affective computing represents an exciting area of growth for interactive technology, and potential applications for ‘affect aware’ devices are proposed.
- Inner-Active Art: An examination of aesthetic and mapping issues in physiologically based artworks. Coghlan, Niall; Knapp, R. Benjamin (ISEA, 2009). Much art seeks to describe or stimulate the feelings and emotions of the viewer, through both abstract and literal representation. With the exponential increase in computing power over recent years we also seek new ways of interacting with technology and exploring the virtual world. Physiological signals from the human body provide us with a view into the autonomic nervous system, that part of the nervous system largely unmediated by the direct intentions of the viewer. With the appropriate choice of signals and processing, we can even develop systems with the ability to interact with us on an emotional level: machines that know how we feel and can react accordingly (Haag et al., 2004). This gives us the ability to see into and map the interior worlds of artists and viewers through a direct and visceral connection, the human body itself. A key issue in the development of physiologically based artwork is to make the observer-artwork dialogue meaningful to the observer, a question of translating the input bio-signals to visual, auditory or experiential events. We have yet to develop a suitable language for this dialogue, so this paper seeks to explore some potential mappings for bio-signal art, illustrated using several case studies from past and current works (Knapp et al., 2008; Gonsalves, 2009). We also examine some of the other philosophical and artistic issues involved in 'affective' and bio-art, such as monitoring emotion vs. engendering emotion, the involvement of the observer in creating and contributing to bio-signal art, and strategies for effectively developing such works.
- AffecTech: An Affect-Aware Interactive AV Artwork. Coghlan, Niall; Jaimovich, Javier; Knapp, R. Benjamin; O’Brien, Donal; Ortiz, Miguel A. (ISEA International, 2009). New developments in real-time computing and body-worn sensor technology allow us to explore not just visible gestures using inertial sensors, but also invisible changes in an individual’s physiological state using bio-sensors (Kim & André 2008). This creates an opportunity for a more intimate interaction between the observer and technology-based art (Gonsalves 2008). We present a technical overview of the AffecTech system, a bio-signal based interactive audiovisual installation commissioned as part of the pre-ISEA symposium in November 2008. Observers were invited to sit on one of two sensor-enhanced chairs (Coghlan & Knapp 2008), which transmitted physiological data about the occupant to a central control system. This data was used to control and modulate interactive visuals, live video feeds and a surround sound score, with events and interactions dependent on the observers’ affective/emotional state and the disparity or similarity between the bio-signals of the chairs’ occupants. This technical overview is followed by an examination of the outcomes of the project, from both the artistic and technical viewpoints, with recommendations for modification in future implementations.
- Contagion of Physiological Correlates of Emotion between Performer and Audience: An Exploratory Study. Jaimovich, Javier; Coghlan, Niall; Knapp, R. Benjamin (2010). Musical and performance experiences are often described as evoking powerful emotions, both in the listener/observer and player/performer. There is a significant body of literature describing these experiences, along with related work examining physiological changes in the body during music listening and the physiological correlates of emotional state. However, there are still open questions as to how and why emotional responses may be triggered by a performance, how audiences may be influenced by a performer's mental or emotional state, and what effect the presence of an audience has on performers. We present a pilot study and some initial findings of our investigations into these questions, utilising a custom software and hardware system we have developed. Although this research is still at a pilot stage, our initial experiments point towards significant correlation between the physiological states of performers and audiences, and we present here the system, the experiments and our preliminary data.
- Review of user interface devices for ambient assisted living smart homes for older people. O’Mullane, Brian A.; Knapp, R. Benjamin; Bond, Rod (International Society for Gerontechnology, 2010). Smart homes generally focus on issues to do with security, health, energy savings and entertainment, issues which grow in importance as we age. The sensors, actuators and entertainment devices required to build such a system add significantly to its complexity. Hence, the Man-Machine Interface (MMI) to the smart home system is often acknowledged to be the most sensitive area for acceptance. Smart homes can allow the user to modify the house via a unified control; additionally, assisted living smart homes gather information about the subject's health, information that can be fed back to the user to modify their behaviour via the device. Increasingly, these interface devices present information from the internet, such as weather and news. With the internet fast becoming the first source of information for many services, such as shopping or care worker access, these devices may additionally help bridge the digital divide between the young and old(²), if the principles of universal design are addressed(³)(⁴). The purpose of this study is to examine user interface devices that can perform these tasks and analyse them with regard to the particular requirements of the older user.
- Biosignal-driven Art: Beyond biofeedback. Ortiz, Miguel A.; Coghlan, Niall; Jaimovich, Javier; Knapp, R. Benjamin (CMMAS, 2011). Biosignal monitoring in interactive arts, although present for over forty years, remains a relatively little known field of research within the artistic community as compared to other sensing technologies. Since the early 1960s, an ever-increasing number of artists have collaborated with neuroscientists, physicians and electrical engineers, in order to devise means that allow for the acquisition of the minuscule electrical potentials generated by the human body. This has enabled direct manifestations of human physiology to be incorporated into interactive artworks. This paper presents an introduction to this field of artistic practice and scientific research that uses human physiology as its main element. A brief introduction to the main concepts and history of biosignal-driven art is followed by a review of various artworks and scientific enquiry developed by the authors. This aims at giving a complete overview of the various strategies developed for biosignal-driven interactive art.
- Design of a Mobile Brain Training Tool for Seniors: Motivational Requirements. O’Brien, Donal; Knapp, R. Benjamin; Thompson, Oonagh; Craig, David; Barrett, Suzanne (ICST, 2012). The overall goal of this project is to design, develop, and validate a mobile phone-based ‘brain training’ software suite targeted at senior users, using iterative person-centered design methodologies, that will permit a subsequent clinical trial of cognitive stimulation efficacy known as the SONIC²S Study (Stirling-Oregon-Northern Ireland-Chicago Cognitive Stimulation Study). The SONIC²S Study will represent a long-term (c. 15 year), very large scale (n=12,000), embedded clinical trial that aims to definitively establish whether or not brain training acts to prevent dementia or cognitive decline. It is anticipated that participant compliance in such a study will be a significant concern. This study reports on a series of focus groups, usability studies and field trials which sought to identify the key motivational factors influencing seniors' engagement with mobile brain-training technology, in order to inform the design of a bespoke tool which is acceptable and enjoyable to target users.
- The Emotion in Motion Experiment: Using an Interactive Installation as a Means for Understanding Emotional Response to Music. Jaimovich, Javier; Ortiz, Miguel A.; Coghlan, Niall; Knapp, R. Benjamin (NIME, 2012). In order to further understand our emotional reaction to music, a museum-based installation was designed to collect physiological and self-report data from people listening to music. This demo describes the technical implementation of this installation as a tool for collecting large samples of data in public spaces. The Emotion in Motion terminal is built upon a standard desktop computer running Max/MSP, using sensors that measure physiological indicators of emotion, connected to an Arduino. The terminal has been installed in museums and galleries in Europe and the USA, helping to create the largest database of physiological and self-report data gathered while listening to music.
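The abstract notes that the terminal's sensors are read by an Arduino feeding Max/MSP. Below is a minimal sketch of that acquisition step, written in Python with pyserial for illustration; the port name, baud rate, and comma-separated message format are assumptions, not the installation's actual protocol.

```python
import serial  # pyserial

# Assumed: the Arduino prints one "gsr,pulse" pair per line at 115200 baud.
PORT = "/dev/ttyACM0"  # illustrative port name

with serial.Serial(PORT, 115200, timeout=1) as arduino:
    for _ in range(100):  # read a short burst of samples
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            gsr, pulse = (float(v) for v in line.split(","))
        except ValueError:
            continue  # skip malformed lines
        print(f"GSR={gsr:.1f}  pulse={pulse:.1f}")
```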
- YourWellness: Designing an Application to Support Positive Emotional Wellbeing in Older Adults. Doyle, Julie; O’Mullane, Brian A.; McGee, Shauna; Knapp, R. Benjamin (BISL, 2012). Emotional wellbeing is an important indicator of overall health in adults over 65. For some older people, age-related declines to physical, cognitive or social wellbeing can negatively impact on their emotional wellbeing, as can the notion of growing older, the loss of a spouse, a loss of sense of purpose or general worries about coping, becoming ill and/or death. Yet, within the field of technology design for older adults to support independence, emotional wellbeing is often overlooked. In this paper we describe the design process of an application that supports older adults in monitoring their emotional wellbeing, as well as other parameters of wellbeing they consider important to their overall health. This application also provides informative and useful feedback to support the older person in managing their wellbeing, as well as clinically-based interventions if it is determined that some action or behaviour change is required on the part of the older person. We outline findings from a series of focus groups with older adults that have contributed to the design of the YourWellness application.
- Lantern Field: Exploring Participatory Design of a Communal, Spatially Responsive Installation. Bortz, Brennon; Ishida, Aki; Bukvic, Ivica Ico; Knapp, R. Benjamin (NIME, 2013-05). Lantern Field is a communal, site-specific installation that takes shape as a spatially responsive audio-visual field. The public participates in the creation of the installation, resulting in shared ownership of the work between the artists and participants. Furthermore, the installation takes new shape in each realization, both to incorporate the constraints and affordances of each specific site and to address the lessons learned from the previous iteration. This paper describes the development and execution of Lantern Field over its most recent version, with an eye toward the next iteration at the Smithsonian's Freer Gallery during the 2013 National Cherry Blossom Festival in Washington, D.C.
- Technology helps students transcend part-whole concepts. Norton, Anderson H. III; Wilkins, Jesse L. M.; Evans, Michael A.; Deater-Deckard, Kirby; Balci, Osman; Chang, Mido (2014-02). How would your students make sense of the fraction 5/7? Would they interpret it as 5 parts out of 7 equal parts? Could they also understand it as a piece that is 5 times as large as 1/7? The former interpretation aligns with part-whole conceptions, whereas the latter aligns with partitive conceptions. Steffe and Olive (2010) have made such distinctions in students' fractional knowledge to explain why students experience difficulties with fractions and to help students overcome those difficulties. (For summaries of this work, see Norton and McCloskey 2008 and 2009.) We introduce an educational video game (application, or app) designed to promote students' development of partitive understanding while demonstrating the critical need to promote that development. The app includes essential game features of immediate feedback, incentives, and summary information for reflection and discussion (Evans et al. 2013).
- OPERAcraft: Blurring the Lines between Real and Virtual. Bukvic, Ivica Ico; Cahoon, Cody; Wyatt, Ariana; Cowden, Tracy; Dredger, Katie (University of Michigan, 2014-09). In the following paper we present an innovative approach to coupling gaming, telematics, machinima, and opera to produce a hybrid performance art form and an arts+technology education platform. To achieve this, we leverage a custom Minecraft video game sandbox mod and the pd-l2ork real-time digital signal processing environment. The result is a malleable, telematic-ready platform capable of supporting a broad array of artistic forms beyond its original intent, including theatre and cinema, as well as machinima and other experimental genres.
- Emotion in Motion: A Reimagined Framework for Biomusical/Emotional Interaction. Bortz, Brennon; Jaimovich, Javier; Knapp, R. Benjamin (NIME, 2015). Over the past four years Emotion in Motion, a long-running experiment, has amassed the world's largest database of human physiology associated with emotion in response to the presentation of various selections of musical works. What began as a doctoral research study of participants in Dublin, Ireland, and New York City has grown to include over ten thousand emotional responses to musical experiences from participants across the world, with new installations in Norway, Singapore, the Philippines, and Taiwan. The most recent iteration of Emotion in Motion is currently underway in Taipei City, Taiwan. Preparation for this installation gave the authors an opportunity to reimagine the architecture of Emotion in Motion, allowing for a wider range of potential applications than were originally possible with the initial development of the tools that drive the experiment. Now more than an experiment, Emotion in Motion is a framework for developing myriad emotional/musical/biomusical interactions with co-located or remote participants. This paper describes the development of this flexible, open-source framework and discusses its various components: hardware-agnostic sensor inputs, refined physiological signal processing tools, a public database of data collected during various instantiations of applications built on the framework, and the web application frontend and backend. We also discuss our ongoing work with this tool, and suggest other potential applications that readers might realize using Emotion in Motion.
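As a sketch of what a hardware-agnostic sensor input layer can look like, the following Python fragment defines one interface that any device backend can implement. The class and channel names are invented for illustration and are not the framework's actual API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
import random
import time

@dataclass
class Sample:
    t: float        # timestamp in seconds
    channel: str    # e.g. "eda" or "hr" (channel names are illustrative)
    value: float

class SensorSource(ABC):
    """One interface, many devices, so the application stays hardware agnostic."""
    @abstractmethod
    def read(self) -> list[Sample]:
        ...

class SimulatedEDA(SensorSource):
    """Stand-in backend; a real one would wrap serial or Bluetooth I/O."""
    def read(self) -> list[Sample]:
        return [Sample(time.time(), "eda", 2.0 + random.gauss(0.0, 0.05))]

def poll(sources: list[SensorSource]) -> list[Sample]:
    """Gather one batch of samples from every attached device."""
    return [s for src in sources for s in src.read()]

print(poll([SimulatedEDA()]))
```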
- Creating Biosignal Algorithms for Musical Applications from an Extensive Physiological Database. Jaimovich, Javier; Knapp, R. Benjamin (NIME, 2015). Previously, the design of algorithms and parameter calibration for biosignal music performances has been based on testing with a small number of individuals - in fact, usually the performer themselves. This paper uses the data collected from over 4000 people to begin to create a truly robust set of algorithms for heart rate and electrodermal activity measures, as well as an understanding of how the calibration of these varies by individual.
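Here is a minimal Python sketch of one such measure, heart rate estimated by peak picking with a per-recording threshold; the calibration rule shown is an assumption used for illustration, not an algorithm derived from the paper's database.

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ppg: np.ndarray, fs: float) -> float:
    """Estimate heart rate from a pulse waveform by peak picking.

    The height threshold is calibrated per recording (per individual),
    echoing the paper's point that one fixed setting does not fit everyone.
    """
    height = np.median(ppg) + 0.5 * np.std(ppg)  # assumed calibration rule
    peaks, _ = find_peaks(ppg, height=height, distance=int(0.4 * fs))
    if len(peaks) < 2:
        return float("nan")
    ibi = np.diff(peaks) / fs          # inter-beat intervals in seconds
    return 60.0 / float(np.mean(ibi))
```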
- Aegis Audio Engine: Integrating Real-Time Analog Signal Processing, Pattern Recognition, and a Procedural Soundtrack in a Live Twelve-Performer Spectacle With Crowd Participation. Bukvic, Ivica Ico; Matthews, Michael (Georgia Institute of Technology, 2015-07). In the following paper we present Aegis: a procedural networked soundtrack engine driven by real-time analog signal analysis and pattern recognition. Aegis was originally conceived as part of Drummer Game, a game-performance-spectacle hybrid research project focusing on the depiction of a battle portrayed using terracotta soldiers. In it, each of the twelve cohorts, divided into two armies of six, is led by a drummer-performer who issues commands by accurately drumming precomposed rhythmic patterns on an original Chinese war drum. The ensuing spectacle is envisioned to also accommodate large audience participation, whose input determines the morale of the two armies. An analog signal analyzer utilizes efficient pattern recognition to decipher the desired action and feed it both into the game and the soundtrack engine. The soundtrack engine then uses this action, as well as messages from the gaming simulation, to determine the most appropriate soundtrack parameters while ensuring minimal repetition and seamless transitions between various clips, accounting for tempo, meter, and key changes. The ensuing simulation offers a comprehensive system for pattern-driven input, holistic situation assessment, and a soundtrack engine that aims to generate a seamless musical experience without resorting to cross-fades and other simplistic transitions that tend to disrupt a soundtrack's continuity.
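A toy stand-in for the rhythm-recognition step: match the tempo-normalised inter-onset intervals of a drummed phrase against precomposed patterns. This Python sketch illustrates the idea only; the pattern names and the distance measure are assumptions, not Aegis's actual analog analyzer.

```python
import numpy as np

def classify_rhythm(onsets: list[float], patterns: dict[str, list[float]]) -> str:
    """Match a drummed rhythm to the nearest precomposed pattern.

    Compares tempo-normalised inter-onset intervals, so the same command
    is recognised whether drummed fast or slow.
    """
    played = np.diff(onsets)
    played = played / played.sum()            # tempo-invariant profile
    best, best_dist = "", float("inf")
    for name, ref in patterns.items():
        ref = np.asarray(ref, dtype=float)
        ref = ref / ref.sum()
        if len(ref) != len(played):
            continue
        d = float(np.linalg.norm(played - ref))
        if d < best_dist:
            best, best_dist = name, d
    return best

# Hypothetical command vocabulary:
# classify_rhythm([0.0, 0.5, 1.0, 1.25],
#                 {"advance": [1, 1, 0.5], "retreat": [1, 0.5, 1]})  -> "advance"
```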
- Introducing D⁴: An Interactive 3D Audio Rapid Prototyping and Transportable Rendering Environment Using High Density Loudspeaker Arrays. Bukvic, Ivica Ico (University of Michigan, 2016). With a growing number of multimedia venues and research spaces equipped with High Density Loudspeaker Arrays, there is a need for an integrative 3D audio spatialization system that offers both a scalable spatialization algorithm and a battery of supporting rapid prototyping tools for time-based editing, rendering, and interactive low-latency manipulation. The D⁴ library aims to fill this gap by introducing a Layer Based Amplitude Panning algorithm and a collection of rapid prototyping tools for 3D time-based audio spatialization and data sonification. The ensuing ecosystem is designed to be transportable and scalable, supporting a broad array of configurations, from monophonic to as many channels as the hardware can handle. D⁴'s rapid prototyping tools leverage oculocentric strategies for importing and spatially rendering multidimensional data and offer an array of new approaches to time-based spatial parameter manipulation and representation. The following paper presents the unique affordances of D⁴'s rapid prototyping tools.
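A toy reading of layer-based panning, assuming loudspeakers arranged in horizontal rings (layers) at different elevations: pan within the bracketing rings by angular distance, then crossfade between them by elevation. This is an illustrative sketch, not the published LBAP algorithm.

```python
import numpy as np

def ring_gains(azimuth: float, speaker_azimuths: np.ndarray, width: float = 90.0) -> np.ndarray:
    """Cosine-shaped gains across one loudspeaker ring (all angles in degrees)."""
    diff = np.abs((speaker_azimuths - azimuth + 180.0) % 360.0 - 180.0)
    g = np.cos(np.clip(diff / width, 0.0, 1.0) * np.pi / 2.0)
    norm = np.linalg.norm(g)
    return g / norm if norm > 0 else g

def layer_pan(azimuth: float, elevation: float, layers) -> list[np.ndarray]:
    """Per-speaker gains; layers = [(elev_deg, azimuth array)], sorted low to high."""
    elevs = [e for e, _ in layers]
    if elevation <= elevs[0]:
        lo, hi, w = 0, 0, 0.0
    elif elevation >= elevs[-1]:
        lo, hi, w = len(layers) - 1, len(layers) - 1, 0.0
    else:
        hi = next(i for i, e in enumerate(elevs) if e >= elevation)
        lo = hi - 1
        w = (elevation - elevs[lo]) / (elevs[hi] - elevs[lo])
    gains = [np.zeros(len(az)) for _, az in layers]
    gains[lo] += np.cos(w * np.pi / 2.0) * ring_gains(azimuth, layers[lo][1])
    gains[hi] += np.sin(w * np.pi / 2.0) * ring_gains(azimuth, layers[hi][1])
    return gains
```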
- Cinemacraft: Immersive Live Machinima as an Empathetic Musical Storytelling Platform. Narayanan, Siddhart; Bukvic, Ivica Ico (University of Michigan, 2016). In the following paper we present Cinemacraft, a technology-mediated immersive machinima platform for collaborative performance and musical human-computer interaction. To achieve this, Cinemacraft innovates upon a reverse-engineered version of Minecraft, offering a unique collection of live machinima production tools and a newly introduced Kinect HD module that allows for embodied interaction, including posture, arm movement, facial expressions, and lip syncing based on captured voice input. The result is a malleable and accessible sensory fusion platform capable of delivering compelling live immersive and empathetic musical storytelling, one that, through the use of low-fidelity avatars, also successfully sidesteps the uncanny valley.
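The abstract does not detail how voice input drives the lip sync; an amplitude-envelope mapping is one common approach, sketched below in Python purely for illustration.

```python
import numpy as np

def mouth_openness(audio: np.ndarray, fs: int, frame_ms: float = 30.0) -> np.ndarray:
    """Map voice loudness to a mouth-open parameter in [0, 1], one value per frame."""
    n = max(1, int(fs * frame_ms / 1000.0))
    frames = audio[: len(audio) // n * n].reshape(-1, n)
    rms = np.sqrt((frames ** 2).mean(axis=1))          # loudness per frame
    peak = rms.max() if rms.size and rms.max() > 0 else 1.0
    return np.clip(rms / peak, 0.0, 1.0)               # normalise to [0, 1]
```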