Authors: Wang, Marx; Duer, Zachary; Hardwig, Scotty; Lally, Sam; Ricard, Alayna; Jeon, Myounghoon
Title: Echofluid: An Interface for Remote Choreography Learning and Co-creation Using Machine Learning Techniques
Type: Article - Refereed
Date issued: 2022-10-29
Date deposited: 2023-02-08
Date updated: 2023-01-23
Handle: http://hdl.handle.net/10919/113735
DOI: https://doi.org/10.1145/3526114.3558708
Format: application/pdf
Language: en
Rights: In Copyright
Rights holder: The author(s)

Abstract: Born from physical activity, dance carries meaning beyond mere body movement. Choreographers engage audiences' perceptions through the kinaesthetics, creativity, and expressivity of whole-body performance, inviting them to construct experience, emotion, culture, and meaning together. Computational choreography support can open endless possibilities for this highly experiential and creative artistic form. While various interactive and motion technologies have been developed and adopted to support creative choreographic processes, little work has explored incorporating machine learning into a choreographic system, and few remote dance teaching systems in particular have been proposed. In this exploratory work, we propose Echofluid, a novel AI-based choreographic learning and support system that allows student dancers to compose their own AI models for learning, evaluation, exploration, and creation. In this poster, we present the design, development, and ongoing validation process of Echofluid, and discuss the possibilities of applying machine learning in collaborative art and dance, as well as opportunities for augmenting interactive experiences between performers and audiences with emerging technologies.