EP-491

This is Claire's daily log for her EP-491 project under Berklee College of Music's Electronic Production & Design department. Her mentoring professor is Dr. Richard Boulanger, and her classmates are Andres Vera, Jacob Johnson, Isaac Matus, Dewey McManus, Montez Hardy, and Tal Levy.

Monday, Jan 22nd 2018

Today was the first 491 meeting of the semester for my section with Dr. B. We discussed each of our 491 projects in class, and I'm grateful for the feedback and ideas my classmates threw out.

Some general goals for this week:

  • Decide on venues for weekly development, rehearsals, and the final performance

  • Continue working on the color tracking patch I presented at my alma mater in Singapore - an interesting little experiment, with much room for improvement

    • Better way to parse data? Less bumpy?

    • Synced to tempo (of Ableton Live?) such that MIDI note output is rhythmic and more musically coherent?

    • Tracking accuracy in different lighting environments?

    • Distance from camera?

    • Type of camera?

  • Begin writing music that might serve as the accompanying electronic part of my "human synthesizer" piece

  • Aside from color tracking, seriously consider alternative forms of tracking movement

    • Lasers?

    • Physical controllers e.g. Wiimote, Xbox, Kinect, Leap Motion

    • Electrical paint? (Can't remember the brand name...)

  • Think about scheduling in the 2nd half of the semester i.e. dates for rehearsals leading up to the performance
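On the "less bumpy" question, one direction I might explore is smoothing the tracked coordinates before they become MIDI - for instance, an exponential moving average. A minimal sketch of the idea in Python (the real patch lives in Max, and the coordinate values here are made up):

```python
# Sketch: smoothing jittery tracked (x, y) coordinates with an
# exponential moving average (EMA) before mapping them to MIDI.
# alpha closer to 0 = smoother but laggier; closer to 1 = more responsive.

def smooth(points, alpha=0.3):
    """Return an EMA-smoothed copy of a list of (x, y) points."""
    if not points:
        return []
    sx, sy = points[0]
    out = [(sx, sy)]
    for x, y in points[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

# A jittery horizontal track with a spike at the third frame:
raw = [(100, 50), (102, 51), (160, 90), (104, 52), (103, 50)]
smoothed = smooth(raw, alpha=0.3)
# The spike is damped: the third smoothed x stays well below 160.
```

The trade-off to tune by ear is lag versus responsiveness - too much smoothing and the sound would trail behind the dancer's gesture.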

I've arranged to meet eMBee tomorrow regarding the venue for my piece's performance. Hopefully, I'll be able to present my work in 112, as long as the project stays at the scale I currently anticipate.

Tuesday, Jan 23rd 2018

I met eMBee today, and was grateful for his thoughts and help regarding the logistical execution of my 491 project. We decided that B54 would be a good place for rehearsals prior to juries, given its proximity to the EPD equipment room, and that afterwards I could hold one or two dress rehearsals in 112 before my final performance.

Here's the schedule I proposed, which I also ran by Dr. B:

  • Now until around March 12th (week before Spring Break) or juries, whichever is first: Rehearsals in B54

  • Monday April 23rd: Dress rehearsal in 112

  • Monday April 30th: 2nd dress rehearsal in 112, if necessary

  • Monday May 7th: Final performance in 112

Keeping in mind that I'd like this project to be something moving and meaningful to me, I started on a draft of a musical piece that might become a portion of my accompanying part. It is partially influenced by The Pearl (Harold Budd/Brian Eno) - I remember how moved I was the first time I heard the album in EP-461 with Dr. B. The piece I composed afterwards was one very close to home. The feelings of warmth I experienced from that music are what I'd like others to feel through my art.

Wednesday, Jan 24th 2018

I've booked some project time in B54, which I think will be put to much better use now that I've 1) consolidated my notes from Week 1's 491 and my thoughts from over the break, and 2) heard more of what Nona's project will be like from Dr. B, eMBee, and Nona herself. 

EP-491 Meeting 1:

  • Aim to finish project by midterm/juries

    • Think about how to "sell" your ideas and work

  • Weekly, bring in:

    • Something to work on

    • Something you've done

    • Questions for Dr. B

  • Question(s) to keep asking: How do I make it work for me musically? Where's the "EPD"?

  • Previous 491s to keep in mind: Violin + Kinect, Interactive Video Projections

Accuracy and reliability are of utmost importance - I'll be sure to log the distances and environment settings whenever I test things out.

Thursday, Jan 25th 2018

Today, I made some minor but important changes to my current color tracking patch: better visual organization within the patch, some fine-tuning of the calibration process for selecting a color and subsequently tracking it, and the option to choose among camera inputs when multiple cameras are available. That last change might be significant, especially if I choose to add some processed visuals (either in real time or prepared beforehand, in accordance with the choreography) to my piece.

Dr. B has offered us many resources for experimentation with audio/visuals/movement. Tomorrow, the EPD conference room should be open, and I hope to do a bit more experimenting and research there.

Friday, Jan 26th 2018

I revisited the iCubeX touch gloves in the EPD conference room today, which I was introduced to through the Circuit Bending & Physical Computing class. While I managed to get them to work previously, and remember quite clearly how I did it, I couldn't get the system working this time. This might be due to iCubeX's recent updates (as of Jan 2018) to their digitizers and EditorX software; I'm wondering if my recent upgrade to Sierra (10.12.6) might have something to do with it as well. For now, I'll put that option aside, and perhaps try the Wi-microDig next time if I can locate it.

Saturday, Jan 27th 2018

This was a fruitful day in terms of research and hands-on experimentation! My focus today was on using 1) my small Pico image webcam, and 2) my MacBook's built-in camera, to test the effectiveness of color tracking from various distances in the environment of B54.

As expected, the stronger colors such as red, hot pink, bright blue, and purple were easier to track than yellow and paler colors. That said, I would like to test multiple colors in the near future - will telling red and pink apart still be possible then? Can they be tracked separately and simultaneously with my current resources?
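As a back-of-envelope check on the red-versus-pink question, hue seems like the right axis to compare on. A quick sketch using Python's standard colorsys module (the RGB values are illustrative picks, not readings from my camera):

```python
import colorsys

def hue_degrees(r, g, b):
    """Hue of an RGB color (components in 0-1), in degrees 0-360."""
    h, _s, _v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0

red = hue_degrees(1.0, 0.0, 0.0)         # pure red
hot_pink = hue_degrees(1.0, 0.41, 0.71)  # a typical "hot pink"

# Hue wraps around at 360, so compare circularly:
diff = abs(hot_pink - red)
separation = min(diff, 360 - diff)
# Roughly 30 degrees of hue separate them, so a tracker that
# matches on hue (not just raw RGB distance) should be able to
# tell them apart - lighting permitting.
```

Whether that ~30-degree margin survives B54's lighting and camera noise is exactly what I'll need to test empirically.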

Today's work admittedly dealt more with the technical side - the actual Max patching and Ableton Live routing; tomorrow I hope to be just as productive with my experiments and to wrap up the color tracking testing part of my project.

Sunday, Jan 28th 2018

I completed the draft of ambient music that I started previously for the accompanying electronic part of my piece, and I'm looking forward to receiving feedback from Dr. B and my friends tomorrow. In an attempt to compose with patterns, I recorded several simple musical phrases played on a grand piano, in groups of 5, 7, 11, and 13 notes. I wanted to see if my composition could be influenced by the way these "prime number" note groups might interact with each other and, if so, whether their interactions could be further highlighted by choreographed movements that trigger notes or sequences.
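Out of curiosity, I also worked out how long those phrase groups would take to realign, assuming each group loops continuously at the same note rate - a quick calculation with Python's math.lcm:

```python
import math

# Phrase lengths in notes: the "prime number" groups from above.
groups = [5, 7, 11, 13]

# Number of notes until all four loops start together again.
realign = math.lcm(*groups)  # 5 * 7 * 11 * 13 = 5005

# At, say, 8 notes per second (an illustrative rate, not a tempo
# I've settled on), that's over ten minutes before the full
# pattern repeats - plenty of evolving interplay to choreograph.
seconds = realign / 8
```

Because the lengths are pairwise coprime, the cycle is as long as it can possibly be for phrases of those sizes, which is the appeal of choosing primes in the first place.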

I have done a fair bit of experimenting with colored ribbons in several different environments at this point, and with several different cameras. Having had this experience now, I think the involvement of motion/dance in my piece could potentially be implemented in the following ways:

  • Continuous sound e.g. sweeps, risers, fades in and out (volume control)

  • Gestures e.g. one-shot granular bursts

  • Synced to tempo e.g. smoothing or quantizing events such that they're in time with the music or occur at regular time intervals

  • Changes to sounds after or while they're sounding e.g. cutoff, resonance, delays, modulation
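The tempo-synced idea above boils down to snapping raw trigger times onto a beat grid. A minimal sketch of that quantization step in Python (the tempo and grid values are illustrative, not final choices):

```python
def quantize(t, bpm=120.0, subdivision=4):
    """Snap a trigger time t (in seconds) to the nearest grid point.

    subdivision=4 means a sixteenth-note grid (4 slots per beat).
    """
    grid = 60.0 / bpm / subdivision   # grid spacing in seconds
    return round(t / grid) * grid

# At 120 BPM a sixteenth note lasts 0.125 s, so a slightly early
# gesture at t = 0.61 s lands on the 0.625 s grid point.
snapped = quantize(0.61)
```

In practice I'd likely let Ableton Live's transport supply the tempo, and the open question is how much snapping a gesture can take before it stops feeling connected to the dancer's movement.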

The most important thing at this point is probably to make things musical, and I will keep that in mind as I continue putting my piece together.

Monday, Jan 29th 2018

I was glad to receive feedback about my ambient piece from Dr. B and my classmates today. One thing I will definitely do this week is listen to The Pearl again, as well as some of the other ambient tracks Dr. B recommended in Electroacoustic Composition class. An important thing to keep in mind is not to let the piece come across as "noodling"; despite being ambient in nature, it should be purposeful.

Dr. B made a great suggestion about composing something more rhythmic to contrast with the ambient piece, perhaps as a prelude. I might look at some of the recordings of modulars I've made in the past - certain sounds I created with the various EPD systems were ones I liked very much, so maybe I'll find some inspiration there.

The notion of adding a visual component to my piece was also raised by Jacob. This is something I would definitely like to look into (perhaps after juries, though). Even if I can't get live visuals, it might be nice to have some gestures recorded beforehand that contribute to visuals during my performance (Wekinator?).

Tuesday, Jan 30th 2018

I borrowed Dr. B's Kinect today, and I hope to get it to work on my MacBook. Many forums I've read so far report trouble using the Kinect without Windows. I'm hoping that Boot Camp isn't my only option. OpenKinect looks like a good resource for developing with the Kinect on OS X, but I'll have to build several things before I can get the Kinect to connect.

With regards to color tracking, I've been seriously reconsidering its role as a central part of my piece, mainly because of lighting issues. Unless my experiments continue with neon colors or things that glow in the dark, I'm not sure how effective my color tracking will be. It might be more effective to do something closer to the opposite: have my motions trigger sound, which in turn triggers some kind of colored lights or visuals.

Wednesday, Jan 31st 2018

I set a simple goal for myself today: to revisit some of my previous modular recordings, as I mentioned in a post from some days back. I'd like to see if I can make a new piece (with space, of course) from these older recordings. The live danced part could then perhaps be something percussive or granular in nature - Jacob's web instrument was particularly inspirational in the way it reflected the "path" of a user's mouse as it caused granular sounds to be synthesized.

Perhaps a 10 to 15 minute piece in something like ternary form (ABA') might be how my piece turns out. Something slow and ambient, followed by something percussive, and the return of the ambient as a recapitulation. Dr. B mentioned how I might think of capitalizing on some of my classical training; maybe it's a manifestation of Sonata-esque Form in this case?

Thursday, Feb 1st 2018

A short post today: I spent most of my dedicated daily 491 time looking into how to get these Kinect drivers and software programs running on OS X instead of Windows. Hopefully my time will have been well-spent... On to more Terminal typing!

Friday, Feb 2nd 2018

I'm very glad to say that I finally got the Kinect to work with some open-source OS X tools, namely OpenKinect's Freenect library. Freenect supports RGB and depth imaging, motors, accelerometers, and LEDs, and generates some vibrant images. I will post examples of these images when I find out exactly what they represent - I have a feeling it has to do with distance, or IR sensing.

Later today, or perhaps tomorrow, I will look more closely at the movement examples from previous 491s under Dr. B, specifically at how they implemented motion tracking. Even with my toolkit expanding, I want to remember not to add so many things that my project turns into a spectacle rather than a meaningful piece.

Saturday, Feb 3rd 2018

I looked briefly into the Hot Hand today after hearing suggestions from Ceiling and Allen that it might be a useful performance tool. While it tracks quite well, and I've managed to get it running to and from Ableton Live and Max (e.g. sending MIDI notes and CCs back and forth), I feel that its definition of the XYZ axes doesn't quite suit my needs - especially compared to the Leap Motion, which I've used before (e.g. with Boulanger Labs' Muse, and with my own CC mapping for Max/Ableton Live synthesizer parameter control) and have found more intuitive.

In addition, I worked on some rudimentary Max patches for movement triggers, albeit without the integration of actual physical controllers yet. The patches mainly involve playing sounds, muting, or modulating volume based on one control (for now, the Z-axis/height). I will integrate CC control soon for use with parameters such as LPF cutoff frequency/resonance, HPF cutoff frequency/resonance, and vibrato or tremolo.
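The height-to-parameter mapping is essentially a clamp-and-scale of a reading into MIDI's 0-127 CC range (in Max, a scale and clip object pair would do it). A sketch of the equivalent logic in Python, with placeholder range endpoints rather than measured ones:

```python
def height_to_cc(z, z_min=0.2, z_max=1.8):
    """Map a height reading z (meters, placeholder range) to a MIDI CC value.

    Readings outside [z_min, z_max] are clamped so the CC never
    jumps past 0 or 127 when the tracker glitches.
    """
    z = max(z_min, min(z_max, z))          # clamp to the working range
    norm = (z - z_min) / (z_max - z_min)   # normalize to 0.0 - 1.0
    return round(norm * 127)               # scale to 0 - 127

# e.g. a hand at 1.4 m maps three-quarters of the way up the range
cc = height_to_cc(1.4)
```

The same scaling would serve any of the planned targets (LPF/HPF cutoff, resonance, vibrato, tremolo); only the CC number and the musical range differ.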

Sunday, Feb 4th 2018

Today I continued working on Max patches, looked into a bit more Leap Motion and Hot Hand implementation, and most importantly used the Kinect again. Here are some pictures of monitoring distance with it! Pink and red tones are the closest to the camera, while blue and cyan tones are the furthest away.

I imagine this proximity information could be useful for creating "zones of effectiveness" e.g. if I'm a certain distance away from the camera, only then will _____ kick into action. Excited!
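The "zones of effectiveness" idea should be straightforward to prototype once the Kinect's depth values are in hand: bucket the distance reading into named zones and gate behavior on the zone. A hypothetical sketch in Python (the zone boundaries are invented for illustration - the real values would come from measuring B54):

```python
# Hypothetical depth zones (millimeters, as Kinect depth is
# commonly reported). Boundaries are placeholders, not measurements.
ZONES = [
    (1500, "near"),   # closer than 1.5 m: full control active
    (3000, "mid"),    # 1.5 - 3 m: only continuous/volume control
    (4500, "far"),    # 3 - 4.5 m: triggers disabled
]

def zone_for(depth_mm):
    """Return the zone name for a depth reading, or None if out of range."""
    for limit, name in ZONES:
        if depth_mm < limit:
            return name
    return None  # beyond the last zone: ignore the reading

# e.g. a dancer at 2.2 m falls in the "mid" zone
```

Hysteresis at the boundaries (so a dancer hovering at 1.5 m doesn't flicker between zones) is probably the first refinement this would need.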

Monday, Feb 5th 2018

Some notes from 491 today:

Tuesday, Feb 6th 2018

Some notes from 491 today:

Wednesday, Feb 7th 2018

Recital!

Thursday, Feb 8th 2018

Friday, Feb 9th 2018

Saturday, Feb 10th 2018

Sunday, Feb 11th 2018

Monday, Feb 12th 2018

Tuesday, Feb 13th 2018

Wednesday, Feb 14th 2018

Nona Day!

Thursday, Feb 15th 2018

Nona Day!

Friday, Feb 16th 2018

Nona Day!

Saturday, Feb 17th 2018

Nona Day!

Sunday, Feb 18th 2018

Nona Day!

Monday, Feb 19th 2018

Nona Day!

Tuesday, Feb 20th 2018

Nona Day!

Wednesday, Feb 21st 2018

Nona Day!

Thursday, Feb 22nd 2018

Nona Day!

Friday, Feb 23rd 2018

Nona Day!

Saturday, Feb 24th 2018

Nona Day!