
CataRT – Real-Time Corpus-Based Concatenative Synthesis

By Diemo Schwarz, IMTR Team, Ircam–Centre Pompidou, and collaborators.

Description

The concatenative real-time sound synthesis system CataRT plays grains from a large corpus of segmented and descriptor-analysed sounds, selected according to their proximity to a target position in the descriptor space. CataRT is implemented in Max/MSP and takes full advantage of the generalised data structures and arbitrary-rate sound-processing facilities of the FTM and Gabor libraries. CataRT allows you to explore the corpus interactively or via a target sequencer, to resynthesise an audio file or live input with the source sounds, or to experiment with expressive speech synthesis and gestural control. CataRT, an interactive implementation of the concept of corpus-based concatenative synthesis, is explained in more detail in this article.

Download

CataRT is offered as free and libre open-source software in the spirit of the GNU GPL.
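The core selection step described above, choosing the grain closest to a target point in descriptor space, can be sketched in a few lines of Python. This is an illustrative simplification, not CataRT's actual implementation; the corpus data and the descriptor names (pitch in Hz, loudness in dB, spectral centroid in Hz) are invented for the example.

```python
import math

# Hypothetical corpus: each grain is a point in descriptor space
# (pitch, loudness, spectral centroid) plus an index into the source audio.
corpus = [
    {"id": 0, "descriptors": (220.0, -12.0, 1500.0)},
    {"id": 1, "descriptors": (440.0, -6.0, 2500.0)},
    {"id": 2, "descriptors": (330.0, -9.0, 2000.0)},
]

def nearest_grain(target, corpus):
    """Return the grain whose descriptor vector is closest (Euclidean) to the target."""
    return min(corpus, key=lambda g: math.dist(g["descriptors"], target))

# A target position, e.g. set by a mouse or sequencer in the descriptor space:
grain = nearest_grain((400.0, -7.0, 2400.0), corpus)
```

In practice the descriptors would be normalised before measuring distance, since they live on very different scales; a raw Euclidean distance as above lets the spectral-centroid axis dominate.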
Tutorial 25: Tracking the Position of a Color in a Movie

There are many ways to analyze the contents of a matrix. In this tutorial chapter we demonstrate one very simple way to look at the color content of an image. We'll consider the problem of how to find a particular color (or range of colors) in an image, and then how to track that color as its position changes from one video frame to the next. This is useful for obtaining information about the movement of a particular object in a video, or for tracking a physical gesture. In a more general sense, this technique is useful for finding the location of a particular numerical value (or range of values) in any matrix of data.

The object we'll use to find a particular color in an image is called jit.findbounds. Here's how jit.findbounds works: you specify minimum and maximum values for each of the four planes of the matrix, and it reports the bounding region of the cells whose values fall within those ranges. In this example we use the jit.qt.movie object to play a movie (actually an animation) of a red ball moving around.

• Click on the toggle to start the metro.
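To make the idea concrete, here is a minimal Python sketch of the computation jit.findbounds performs: given a range of values, find the bounding box of all matching cells. For simplicity this operates on a single-plane matrix represented as a nested list, whereas Jitter video matrices have four planes (alpha, red, green, blue) with one range per plane.

```python
def find_bounds(matrix, lo, hi):
    """Return ((min_x, min_y), (max_x, max_y)) of in-range cells, or None."""
    hits = [(x, y)
            for y, row in enumerate(matrix)
            for x, v in enumerate(row)
            if lo <= v <= hi]
    if not hits:
        return None
    xs, ys = zip(*hits)
    return (min(xs), min(ys)), (max(xs), max(ys))

# A tiny stand-in for one video frame's red plane: a bright blob near the middle.
frame = [
    [0,   0,   0,   0],
    [0, 255, 250,   0],
    [0, 240,   0,   0],
    [0,   0,   0,   0],
]
bounds = find_bounds(frame, 200, 255)
```

Running this per frame and taking the centre of the returned box gives a simple position-tracking signal, which is essentially what the tutorial patch does with the red ball.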
Making an infrared-triggered sampler, using the Johnny Chung Lee method!

You definitely want to use cv.jit.sort to index your blobs, or design your own sorting method. Sort seemed to work well for me, although I never developed a fully functional system. You may run into problems if the movement is too extreme between frames, but for slower gestures it should work well. Another approach (if using normal video rather than IR) would be to colour your blobs in such a way that they can be identified by some Jitter-native calculation of average colour.

I remember that my big problem was with blobs merging (at which point sort will flake out), so that was more of a concern for me than reliable sorting directly. I was thinking a lot about a decent filter to separate blobs, since in the case of LEDs the glare on the camera can create haloing around each bulb. Not sure if that's of any use, but those were my thoughts at the time.

Alex
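The blob-indexing idea Alex describes, keeping each blob's label stable from frame to frame, amounts to matching centroids between frames by proximity. Here is a rough Python sketch of one such "own sorting method" (a greedy nearest-neighbour match; this is not cv.jit.sort's actual algorithm, and like it, it breaks down when blobs merge or jump too far between frames):

```python
import math

def match_blobs(prev, curr):
    """Greedily match each previous-frame centroid to the nearest unclaimed
    current-frame centroid, so blob indices stay stable across frames.
    prev, curr: lists of (x, y) centroids. Returns {prev_index: curr_index}."""
    unclaimed = list(range(len(curr)))
    matches = {}
    for i, p in enumerate(prev):
        if not unclaimed:
            break  # a blob vanished (or merged into another)
        j = min(unclaimed, key=lambda k: math.dist(p, curr[k]))
        matches[i] = j
        unclaimed.remove(j)
    return matches

# Two blobs move slightly; the detector reports them in a different order.
prev = [(10, 10), (50, 50)]
curr = [(52, 48), (12, 11)]
matches = match_blobs(prev, curr)
```

A greedy match like this is cheap but order-dependent; for robustness against the merging problem mentioned above, one would need an explicit check for two previous blobs mapping toward the same current blob.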
Physical Computing with Max

Turn sensor data into something more meaningful. With native support for a variety of hardware protocols, Max connects to your electronics. Just connect a patchcord to visualize the sensor values, and add filters to smooth and scale them.

Get the SensorBox tool · Watch an interview with Ali Momeni

Connect everything with support for all your favorite hardware. Max comes with objects to capture MIDI, serial data (Arduino, etc.), Human Interface Devices like joysticks, and network protocols like Open Sound Control (iOS apps, DMX interfaces, OSC-enabled software).

Connect a joystick tutorial · Read about Herbie Hancock's setup

Capture the scene with a camera and respond. With support for live video inputs and a full range of image-processing tools, you can use information from a camera to drive interactive media.

Make connections with camera data · Try a variety of Jitter recipes

Discover why Max has been the tool of choice for innovative artists for over a decade.
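The "smooth and scale" step mentioned above is the standard first treatment of raw sensor data. As a minimal sketch (illustrative Python, analogous in spirit to Max's [slide] and [scale] objects rather than actual Max code), here is a one-pole low-pass filter followed by a linear range mapping, e.g. from a 10-bit Arduino reading to a MIDI value:

```python
def smooth(values, alpha=0.5):
    """One-pole (exponential) low-pass filter over a stream of readings.
    Smaller alpha = heavier smoothing."""
    out, y = [], values[0]
    for v in values:
        y = y + alpha * (v - y)
        out.append(y)
    return out

def scale(v, in_lo, in_hi, out_lo, out_hi):
    """Linearly map v from [in_lo, in_hi] to [out_lo, out_hi]."""
    return out_lo + (v - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

raw = [0, 1023, 1023, 1023]              # noisy 10-bit sensor values
smoothed = smooth(raw, alpha=0.5)        # eases toward the new value
midi = [scale(v, 0, 1023, 0, 127) for v in smoothed]
```

The filter here smooths rising and falling values equally; Max's [slide] object allows separate up and down slopes, which is often more useful for gestural data.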