MUSIC INCORPORATION
One aspect of the project that we had not looked into much was the musical portion. We came into the project knowing that we wanted to leverage music to make an engaging interactive light art piece, but we were not entirely sure how. We looked into various ways of incorporating it and settled on an implementation we are happy with.
Initially we were at a loss as to how the music should react to people interacting with our project. Our first plan was simply to raise the music's volume as more people were detected by the IR sensors. While this would make our piece marginally more entertaining and support the goal of making the space feel more like a party, we thought we could do more with it.
While we were giving one of our first presentations on this project, the idea of making each participant their own instrument came up. Each person would produce the sound of a different instrument, and the instruments would come together to play a full song when everyone is within the art piece. We looked into various ways to achieve this effect.
We went online and found a website that hosts a set of classic songs deconstructed into their parts, so you can select and deselect different instruments to alter what plays. This could work, but it would leave us with only a handful of usable sound files and songs.
Team member Chris also has friends who are active in the on-campus band and orchestra groups. He could ask them to each perform a simple song and record them separately. While this would allow full creative freedom, it would take a lot of coordination and could still introduce human error.
As we were searching, Chris remembered a really cool website he had stumbled upon a couple of years back, related to an artist he follows: Madeon's Adventure Machine. Madeon is an electronic artist, and he put out this website where users can sample small sections of his songs to create their own unique mixes. The intuitiveness of the page as well as its cool features, such as the plethora of song snippets to choose from and the way the page holds a new selection until the next beat, inspired us to base our music section on this general structure.
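To sketch how that beat-holding behavior could translate to our project, here is a rough Arduino-style C++ snippet of the quantization idea, where a requested change waits until the next beat boundary. The tempo, function names, and the serial-command scheme are all placeholders; the actual audio would most likely be played on a connected computer rather than the Uno itself.

#include <Arduino.h>

// Hold a requested stem/sample change until the next beat boundary,
// so new sounds always land on the beat (as the Adventure Machine does).
// kBpm is a placeholder tempo.
const float kBpm = 110.0;
const unsigned long kBeatMs = (unsigned long)(60000.0 / kBpm);

int pendingStem = -1;          // stem change requested but not yet applied
unsigned long lastBeatMs = 0;  // time of the most recent beat

void applyStemChange(int stem) {
  // Placeholder: tell whatever plays the audio to unmute this stem.
  Serial.print("STEM ");
  Serial.println(stem);
}

void setup() {
  Serial.begin(115200);
  pendingStem = 2;  // pretend a participant just walked in
}

void loop() {
  unsigned long now = millis();
  if (now - lastBeatMs >= kBeatMs) {  // crossed a beat boundary
    lastBeatMs += kBeatMs;
    if (pendingStem >= 0) {
      applyStemChange(pendingStem);
      pendingStem = -1;
    }
  }
}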
While we are not done implementing the feature, it should allow for enough variety and entertainment to really shine a light on our project.
CODE RE-STRUCTURED
Last week our team got the communication between the IR camera and the analog LED strip working. This week our digitally addressable LED strip and IR emitters arrived, which allowed us to develop the next iteration. In the process we realized the code was turning into spaghetti: hard to debug and unintuitive. This is crucial since we are working in a team, and each of us should be able to pick up the code individually.

We opted to take a step back and restructure the code along object-oriented lines. We wrote a class, with its own header file, for the IR camera and began using the FastLED library. The IR camera class provides functions that return the individual x and y values of the tracked objects, which has already cut our debugging time and made the code more intuitive. We also migrated to the Atom IDE with PlatformIO to interface with the Arduino Uno; it allows for faster code writing and has useful tools for programming.
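As an illustration (the names here are simplified, not our exact code), the interface of such a camera class header looks roughly like this:

// IRCamera.h -- simplified sketch of the class interface.
#ifndef IR_CAMERA_H
#define IR_CAMERA_H

#include <Arduino.h>

const uint8_t kMaxObjects = 4;  // the camera reports up to four IR points

class IRCamera {
 public:
  void begin();                     // initialize communication with the camera
  void update();                    // fetch and parse the latest frame
  int getX(uint8_t i) const;        // x position of tracked object i
  int getY(uint8_t i) const;        // y position of tracked object i
  bool isVisible(uint8_t i) const;  // was object i seen this frame?

 private:
  int x_[kMaxObjects];
  int y_[kMaxObjects];
  bool visible_[kMaxObjects];
};

#endif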

BIGGEST CHALLENGES SO FAR
As our team was testing the maximum number of trackable objects (four IR emitters), we noticed that only three of the four objects were tracked reliably. The fourth object's reading seemed to be displaced far off into a corner of the camera's field of view.

We also realized that the IR emitters transmit their light in a narrow cone, as seen in the radiation pattern in the datasheet linked below. This is not ideal when people move around, since the camera loses track of an emitter unless it is pointed at the camera fairly precisely. Our team needs to find a way to widen the emission angle so the camera can track the moving objects. We will also have to account in code for an object disappearing briefly. This could be done by adding a counter that tracks how much time has elapsed since the object was last seen and, based on that, decides whether the disappearance is a tracking error or the object has actually left the space.
[Figure: IR333-A emitter radiation pattern, from the datasheet: https://cdn-shop.adafruit.com/datasheets/IR333_A_datasheet.pdf]
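A minimal sketch of that timer idea follows; the 500 ms grace period is a guess we would tune experimentally, and this helper would be called from the main tracking loop.

#include <Arduino.h>

const uint8_t kMaxObjects = 4;
const unsigned long kGraceMs = 500;  // how long a tracking glitch may last

unsigned long lastSeenMs[kMaxObjects] = {0};

// Call once per frame for each object, passing whether the camera
// currently reports it. Returns whether we should treat it as present.
bool objectStillPresent(uint8_t i, bool reportedVisible) {
  unsigned long now = millis();
  if (reportedVisible) {
    lastSeenMs[i] = now;  // refresh the timer while the object is seen
    return true;
  }
  // Missing: keep treating it as present until the grace period expires,
  // so brief tracking errors do not look like someone leaving the space.
  return (now - lastSeenMs[i]) < kGraceMs;
}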
NEXT STEPS
1) Find a way to widen the emission angle of the IR emitters.
2) Account for the IR camera's tracking error (objects disappearing for a short amount of time).
3) Map the LEDs to objects. This can be done by mapping the x and y displacements to the intensity and to the set of LEDs on the strip (see the sketch after this list).
4) Incorporate sound with the LED light colors. Establish how the interaction between the objects will affect the music relative to their positions.
5) Build a structure to hold the LEDs and the IR camera for easier transportation.
6) Type up the script for the final video.
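As a rough illustration of step 3, here is a minimal FastLED sketch that maps an object's x position to a spot on the strip and its y position to brightness. The pin, strip length, LED chipset, and camera coordinate range are placeholders.

#include <Arduino.h>
#include <FastLED.h>

#define LED_PIN  6   // placeholder data pin
#define NUM_LEDS 60  // placeholder strip length

CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, LED_PIN, GRB>(leds, NUM_LEDS);
}

// Light the LED nearest the object's x position, with y driving intensity.
// We assume the camera reports coordinates in roughly 0..1023.
void showObject(int x, int y) {
  FastLED.clear();
  int index = map(x, 0, 1023, 0, NUM_LEDS - 1);
  uint8_t brightness = map(y, 0, 1023, 0, 255);
  leds[index] = CHSV(160, 255, brightness);  // blue hue; y sets brightness
  FastLED.show();
}

void loop() {
  showObject(512, 512);  // stand-in values; real ones come from the camera
}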