live video processing '03

/** weekly journal **/
// week1

bought max/msp/jitter. Whooooo hooooooooooo!

// week2
Finished Meditation #1. Discovered the headache of file paths in Max/MSP, and the tosymbol and sprintf objects.

// week3
Nothing much. Jitter is cool.

// week4
Played with class patches. I love jit.brcosa with matrix feedback... awesome, blown-out images.

// week5
Finished Meditation #2. Easy :)

// week6
Playing with peek~. Added exponential decay (Meditation #5, I think) to my cross-synthesis patch, along with fiddle~, to make some crazy shit happen.
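The actual patch lives in Max/MSP, but the exponential-decay idea is easy to sketch outside it. Here's a rough Python version; the sample rate and decay time are made-up stand-ins, not values from my patch:

```python
import math

# Assumed parameters -- not from the real Max patch.
SR = 44100          # samples per second
DECAY_TIME = 0.5    # seconds to fall to ~37% (1/e) of the start level

def decay_envelope(n_samples, sr=SR, t=DECAY_TIME):
    """Exponential decay: level(n) = e^(-n / (sr * t))."""
    return [math.exp(-n / (sr * t)) for n in range(n_samples)]

# Applying it to a signal is just a per-sample multiply.
env = decay_envelope(5)
signal = [1.0, 1.0, 1.0, 1.0, 1.0]  # dummy constant signal
shaped = [s * e for s, e in zip(signal, env)]
```

In the patch this multiply happens per-sample in the signal chain; the point is just that each sample gets scaled by a level that falls off exponentially over time.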

Watched Scott Fitz perform, and got very interested in Jitter stuff. Wish I had infinite time to play with it.

// week7
Midterm ideas:

1. Work with Pamela Vitale on a VJ "battle" system, where color changes in one video projection are picked up by another projection, which in turn influences changes in the first.

2. Do some live image analysis of webcam images in Jitter, as a series of "sketches" that compare movement between webcam frames. I'd compare images captured at different intervals (same time on successive days, or captured back-to-back) by dividing each frame into a grid and analyzing the movement in each component square. Then I'd color in each square according to parameters like magnitude of movement and color, which could be altered live through a MIDI controller.
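For idea 2, the grid comparison itself is simple to sketch. This is a toy pure-Python version, not the Jitter patch: frames are 2D lists of grayscale values standing in for jit.matrix data, and the grid size and frame dimensions are assumptions:

```python
GRID = 4  # split each frame into a GRID x GRID set of squares (assumed)

def grid_motion(frame_a, frame_b, grid=GRID):
    """Mean absolute pixel difference per grid square between two frames."""
    h, w = len(frame_a), len(frame_a[0])
    ch, cw = h // grid, w // grid  # cell height / width in pixels
    cells = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            total = 0
            for y in range(gy * ch, (gy + 1) * ch):
                for x in range(gx * cw, (gx + 1) * cw):
                    total += abs(frame_a[y][x] - frame_b[y][x])
            row.append(total / (ch * cw))  # average movement in this square
        cells.append(row)
    return cells

# Two tiny 8x8 "frames"; the second differs only in the top-left corner.
f1 = [[0] * 8 for _ in range(8)]
f2 = [[0] * 8 for _ in range(8)]
f2[0][0] = 255
motion = grid_motion(f1, f2)  # only motion[0][0] is nonzero
```

Each cell's number would then drive the fill color of that square, with the mapping (and maybe the grid size) hanging off MIDI controller knobs.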

// week8
Decided to combine my experiencing numbers project with the live video midterm. The idea (which will also be my thesis, I think) is to take live images from the MTA traffic webcams and turn them into video and sound. You "listen to the traffic" on the radio already, right? But do you really listen to the traffic...

I'm going to split the project into two divisions: all the cool video and sound sketching will be here, under the live video class, while all the conceptual discussion will fall under the experiencing numbers site.

// week9
// week10
// week11
// week12
// week13
// week14
// final project