Research Projects at ArtFab

Below is a list of active research projects at ArtFab. To learn more about a project, follow its link or contact Ali Momeni. Opportunities for independent studies, research assistantships and summer internships are available; please apply through this form.

SocialVR
We are developing an easy-to-use, browser-based tool that allows users to create branching narratives in virtual reality using drag-and-drop interactions and annotations with sounds, text and images. The resulting VR experiences can be viewed in a mobile app (developed in Unity) that pulls the content from a server and recreates the experience on the device. Potential applications include immersive language acquisition, crowd-based city tours, and many others. Project is led by Ali Momeni (CMU Art; Principal Investigator) and Aparna Wilder (IRL Lab; Community Outreach Director). More info...
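
Below is a minimal sketch of the kind of JSON-serializable data model a browser authoring tool like this might hand off to the Unity viewer; all type and field names are illustrative assumptions, not the project's actual schema.

```typescript
// Hypothetical data model for a branching VR narrative (names are illustrative).

interface Annotation {
  kind: "sound" | "text" | "image";
  url?: string;                         // media location for sound/image annotations
  body?: string;                        // text content for text annotations
  position: [number, number, number];   // placement within the 360-degree scene
}

interface Choice {
  label: string;                        // what the viewer gazes at or taps
  nextSceneId: string;                  // branch target
}

interface Scene {
  id: string;
  panoramaUrl: string;                  // equirectangular image or video for the scene
  annotations: Annotation[];
  choices: Choice[];                    // empty array marks a terminal scene
}

interface Experience {
  title: string;
  startSceneId: string;
  scenes: Record<string, Scene>;
}

// Serialize an authored experience so a mobile client could fetch and rebuild it.
function publish(experience: Experience): string {
  return JSON.stringify(experience, null, 2);
}
```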

Currently Seeking: Browser Developers (html5, javascript, THREE.js, angular.js, css); Unity Developers

Fall 2016 Development Team:
-Benjamin Scott (CMU BFA Art 2017; Research Assistant, Browser Development)
-Ralph Kim (CMU BFA Art 2016; Research Assistant, Unity Developer)
-Hanfei Sun (CMU MCDS 2018; Backend developer)

Summer 2016 Development Team:
-Benjamin Scott (CMU BFA Art 2017; Research Assistant, Browser Development, Unity Developer)
-Ralph Kim (CMU BFA Art 2016; Research Assistant, Unity Developer, Outreach)
-Michelle Ma (CMU BCSA 2018; Design, Curriculum, Outreach)
Ticklers
In the past few decades, a handful of consumer technologies have dramatically transformed our relationship with music consumption. The Sony Walkman and the Apple iPod normalized the intimate listening experience that is now pervasive in public and private spaces alike. iTunes, podcasts, Spotify and the like allowed listeners to develop into collectors (the personal library), curators (the published playlist) and creators (the podcast). This project aims to catalyze an analogous revolution in PLAYING music (as opposed to listening, collecting or distributing) by leveraging advanced fabrication and interactive technologies. The project's four instruments, each based on a distinct mode of gestural interaction (tapping, rubbing, plucking and blowing), will be low-cost, reproducible by anyone with access to a maker space, and built around the most pervasive mobile computing platforms of the day (i.e. iOS and Android phones). Project is led by Ali Momeni (CMU Art; Principal Investigator). More info...
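
As an illustration of the tapping mode only, here is a hedged sketch of how a mobile browser could detect tap onsets from the device's accelerometer; the thresholds and the triggerNote() callback are assumptions for demonstration, not project code.

```typescript
// Illustrative threshold-based tap detector using the browser DeviceMotion API.

const TAP_THRESHOLD = 15;   // m/s^2 above resting gravity; tuned by hand
const REFRACTORY_MS = 120;  // ignore re-triggers for this long after a tap
let lastTap = 0;

function triggerNote(strength: number): void {
  // Placeholder: in an instrument this would start a synth voice or sample.
  console.log(`tap detected, strength ${strength.toFixed(1)}`);
}

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const a = e.accelerationIncludingGravity;
  if (!a || a.x == null || a.y == null || a.z == null) return;

  const magnitude = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
  const now = performance.now();

  if (magnitude > TAP_THRESHOLD && now - lastTap > REFRACTORY_MS) {
    lastTap = now;
    triggerNote(magnitude - TAP_THRESHOLD);
  }
});
```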

Currently Seeking: Electronic musicians and composers; Hardware/Physical computing

Fall 2016 Development Team:
-Sydney Ayers (CMU BFA Art 2018; Hardware Design)
-Steven Krenn (CMU Masters in Music Technology, 2018; Software Design)
-Siddarth Suri (CMU Master of Integrated Innovation for Products, 2018; Product Design)
Dranimate
We are developing an interactive animation system that allows users to rapidly and intuitively rig and control animations based on a still image or drawing, using hand gestures. Dranimate combines two complementary methods of shape manipulation: bone-joint-based physics simulation and the as-rigid-as-possible deformation algorithm. Dranimate also introduces a number of designed interactions that focus the user’s attention on the animated content rather than on the keyboard or mouse. We have formed a startup company and have completed several animation projects for corporate presentations and art festivals. Project is led by Ali Momeni (CMU Art; Principal Investigator) and Zach Rispoli (CMU BFA 2017, Art; Co-investigator). More info...
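
For reference, as-rigid-as-possible deformers of the kind named above minimize the standard energy from Sorkine and Alexa's formulation, summed over mesh vertices i and their neighbors N(i):

\[
E(S') = \sum_{i=1}^{n} \sum_{j \in \mathcal{N}(i)} w_{ij}\, \bigl\| (\mathbf{p}'_i - \mathbf{p}'_j) - \mathbf{R}_i\,(\mathbf{p}_i - \mathbf{p}_j) \bigr\|^2
\]

where p and p' are the rest and deformed vertex positions, R_i is the best-fitting per-vertex rotation and w_ij are edge weights; the energy is typically minimized by alternating a local step that fits each R_i with a global step that solves a sparse linear system for the p'.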

Currently Seeking: Browser Developers (html5, javascript, THREE.js, angular.js, css); Mobile Developers

Fall 2016 Development Team:
-Rob Kotcher (CMU MS Music Technology candidate, 2016; Browser and Mobile Development)
-Nick Coronado (CMU BS Music Technology, 2015; Browser Development)
-Hanfei Sun (CMU MCDS 2018; Backend developer)

Fall 2016 Business Team:
-Saket Bohania (CMU ETIM 2018, Business Development)
-Neha Sharma (CMU ETIM 2018, Business Development)
-Dian Yu (CMU ETIM 2018, Business Development)
-Yu Zhang (CMU ETIM 2018, Business Development)
ArtBytes
We are developing a mobile app for augmented reality (AR) annotation of the real world. The app allows for intuitive image capture (taking imagery from streets, shops, museums and galleries), composition (creating new imagery by remixing, cropping and compositing) and sharing (using augmented reality to attach personalized imagery to real-world objects and images such as road signs, graffiti and store signs). Project is led by Ali Momeni (CMU Art; Principal Investigator) and Anthony Tomasic (CMU SCS LTI; Co-Investigator). More info...
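
A hedged sketch of what one shared annotation record could look like in an app with this capture/compose/share workflow; the type and field names are assumptions for illustration, not the app's actual data model.

```typescript
// Hypothetical record for one shared AR annotation.

interface CaptureLayer {
  sourceImageUrl: string;                               // photo taken from a street, shop, museum or gallery
  crop: { x: number; y: number; w: number; h: number }; // region kept from the source image
  transform: { scale: number; rotationDeg: number; offset: [number, number] }; // compositing placement
}

interface ARAnnotation {
  id: string;
  author: string;
  layers: CaptureLayer[];                               // remixed, cropped and composited imagery
  anchor:
    | { kind: "image-target"; targetImageUrl: string }  // e.g. a road sign, graffiti or store sign
    | { kind: "geo"; lat: number; lon: number; headingDeg: number };
  createdAt: string;                                    // ISO 8601 timestamp
}
```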

Currently Seeking: Unity Developers; Mobile Developers

Fall 2016 Business Development Team:
-Ankur Agrawal (CMU MBA 2018)
-Blake Bonnewell (CMU MBA 2018)
-Jeffrey Bloom (CMU MBA 2018)
-Timothy Brooks (CMU MBA 2018)
-Teginder Kaleka (CMU MBA 2018)
-James McSweeney (CMU MBA 2018)
-Frank Shang (CMU MBA 2018)

Spring 2016 HCI Evaluation Team:
-Sanjana Baldwa (CMU BHCI 2016)
-Helen Hong (CMU BHCI 2016)
-Rachel Jue (CMU BHCI 2016)
-Mina Kim (CMU BHCI 2016)
-Evan Metsky (CMU BHCI 2016)
Shahnameh Remix
This project functions at the intersection of theatrical storytelling, immersive media for performance and gestural interaction with groups of flying robots. We have been working with the aerial robotics lab at CMU to develop a gestural, grammatical language for controlling coordinated group behavior of flying robots in real-time. We have developed a grammar, a gesture recognition system, a tablet-based interface, a choreography design environment, and the corresponding autonomy system for the robotic movements. This project is working towards a live performance with multiple performers, a group of robots, and real-time animation and video projection. More info...
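
Purely as an illustration of the idea of a gestural grammar, the sketch below maps a sequence of recognized gestures to coordinated group behaviors; the gesture names, formations and parameters are assumptions, not the project's actual vocabulary.

```typescript
// Illustrative gesture grammar: recognized gestures become timed group behaviors.

type Gesture = "gather" | "scatter" | "circle" | "hold";

interface GroupBehavior {
  formation: "cluster" | "dispersed" | "ring" | "freeze";
  durationSec: number;
}

const grammar: Record<Gesture, GroupBehavior> = {
  gather:  { formation: "cluster",   durationSec: 4 },
  scatter: { formation: "dispersed", durationSec: 4 },
  circle:  { formation: "ring",      durationSec: 8 },
  hold:    { formation: "freeze",    durationSec: 2 },
};

// Translate a recognized gesture sequence into a choreography plan.
function choreograph(gestures: Gesture[]): GroupBehavior[] {
  return gestures.map((g) => grammar[g]);
}

// Example: gather the robots, send them into a ring, then freeze them.
console.log(choreograph(["gather", "circle", "hold"]));
```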

Currently Seeking: Creative Roboticists, web-developers

Current Team:
Ali Momeni (CMU Art; Principal Investigator)
Nathan Michael (CMU Robotics Institute, RASL Lab; Co-Investigator)
Anthony Tomasic (CMU SCS, LTI; Co-Investigator)
Ellen Cappo (CMU Robotics Institute, PhD Candidate; Co-Investigator)
Machine Learning for Artists
We are working to make machine learning more accessible to artists and designers. These efforts have included the creation of ml.lib, an extensive set of real-time machine learning tools for the popular Max and PureData programming environments. We have also been organizing a multi-part series of workshops on Machine Learning for Artists and Designers, in collaboration with The Frank-Ratchye STUDIO for Creative Inquiry. Ali Momeni is working towards a Manual for Machine Learning for Artists, inspired by his previously published Manual for Urban Projection. Project is led by Ali Momeni (CMU Art; Principal Investigator).

Currently Seeking: Max and PureData developers; OpenFrameworks developers
Listen to your Gut
Extensive research supports the commonplace intuition that there is a strong connection between the gut and the brain, and neural pathways between the brain and the ‘enteric nervous system’ (a network of approximately 100 million neurons embedded in the intestinal wall) have been identified, including, most significantly, the vagus nerve. Yet investigations of the nature of this link have been stymied by the lack of non-intrusive methods of measuring intestinal activity. We are developing a device and a set of machine-learning-powered audio analysis tools that help researchers learn more about the gut, and the mind, through analysis of bowel sounds.
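
As a hedged sketch only, the snippet below computes a short-time energy measure over a recorded buffer to flag candidate bowel-sound events, the kind of first-pass feature an audio analysis pipeline might use before any machine learning; the window size and threshold are assumptions.

```typescript
// Illustrative short-time RMS energy detector over a mono audio buffer.

function detectEvents(
  samples: Float32Array,   // mono recording, e.g. from an abdominal contact microphone
  sampleRate: number,
  windowMs = 25,
  threshold = 0.02         // RMS level above which a frame counts as a candidate event
): number[] {
  const windowSize = Math.floor((sampleRate * windowMs) / 1000);
  const eventTimesSec: number[] = [];

  for (let start = 0; start + windowSize <= samples.length; start += windowSize) {
    let sumSquares = 0;
    for (let i = start; i < start + windowSize; i++) {
      sumSquares += samples[i] * samples[i];
    }
    const rms = Math.sqrt(sumSquares / windowSize);
    if (rms > threshold) {
      eventTimesSec.push(start / sampleRate);   // time of the frame that crossed threshold
    }
  }
  return eventTimesSec;
}
```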

Currently Seeking: Industrial Designers, Music Information Retrieval experts

Current Team:
George Loewenstein (CMU Social and Decision Sciences; Principal Investigator)
Ali Momeni (CMU Art; Co-Principal Investigator)
Max G'Sell (CMU Statistics; Co-Investigator)
Rich Stern (CMU ECE; Co-Investigator)