dtr

Dieter Vandoren

Here’s some development footage of my forthcoming project Integration.03.

Integration.03 is a body- and motion-controlled audiovisual live performance. The performer plays dynamic light and sound structures with the body as control interface. ‘Structures’ is to be taken quite literally: performer and audience are immersed in voluminous 3D light projections, giving an almost tangible quality to the ephemeral audiovisual media.

Premiere at the Open Dans festival in Rotterdam on the 24th of June.

More info at http://dietervandoren.net/index.php?/project/integration03/

A first little experiment generating geometric shapes from Kinect motion tracking data.

Reminiscent of Oskar Schlemmer’s Pole Dance.


What?! No posts since June 2010? Yep, time flies (like a jet fighter) when you’re having fun being flooded with work…

Anyway, I’ve been messing around with the Kinect as motion tracking input for my upcoming Integration.03 project (see int.02 and int.01). My development platform is Max, at least for now. The jit.freenect.grab external enables the Kinect in Jitter.

I immediately ran into an obstacle: the Kinect’s depth data, when rendered directly as a point cloud or mesh in Jitter, is seriously distorted because of the depth sensing method, lens distortion and the camera perspective. For the depth data to be useful to me I have to undistort it so that it renders in a perspective-correct way (e.g. floor and ceiling parallel, walls perpendicular to the floor, etc.).

I found the math to do this at the OpenKinect wiki (thanks Kyle and the OFx folks!) and developed two methods for getting it done in Jitter: 1) the jit.expr way and 2) the GLSL shader + jit.gl.slab way. The latter is the faster of the two, as the calculation is done on the graphics card’s GPU, yet it’s not as fast as I had hoped for. The bottleneck is probably writing and reading the 640×480 matrix to/from GPU memory. I’d love to hear from people who have ideas for speeding it up!
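For those who just want the numbers: the wiki’s formulas boil down to converting the raw depth readings to meters and then reprojecting each pixel through the depth camera’s intrinsics. Here’s a minimal NumPy sketch of that math, just as a reference next to the patch. The constants are the commonly cited OpenKinect calibration values and the function names are mine; the actual patch does the same thing in jit.expr and GLSL.

import numpy as np

# Commonly cited Kinect depth camera constants from the OpenKinect wiki
# (assumed values; per-device calibration will differ slightly).
FX, FY = 594.21, 591.04   # focal lengths, in pixels
CX, CY = 339.5, 242.7     # principal point (optical center), in pixels

def raw_to_meters(raw):
    """Convert 11-bit raw depth readings to metric depth (meters)."""
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)

def depth_to_pointcloud(raw_depth):
    """Turn a 480x640 raw depth frame into a 480x640x3 array of
    perspective-correct (x, y, z) coordinates in meters."""
    h, w = raw_depth.shape
    z = raw_to_meters(raw_depth.astype(np.float64))
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack((x, y, z))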

I thought I’d share the patch, as getting it to work was an instructive process for me and perhaps others can learn from it too. Here’s the patch (with both methods) and the GLSL shader: freenect-undistortion.zip.

This semester I’ve been tutoring in the Interactive Technology Design master course at TU Delft’s Faculty of Industrial Design Engineering. We’re having the final exhibition on Friday the 2nd of June, presenting all the students’ projects in an open-house-style event from 14.00 till 17.00.

The students worked on six design assignments:
- Social Connectedness with Waag Society and ID-Studiolab
- Media Experience installations with the Image & Sound Institute
- Energy Consumption Awareness Displays with Eneco
- Communicate Corporate Identity with Exact
- Interactive LED lamps with Philips Consumer Luminaires
- Design Tools for designing interactive product concepts with ID-Studiolab

The students are blogging their projects at itd2010.weblog.tudelft.nl.

Everyone’s welcome to take a look at the Faculty of Industrial Design Engineering, Landbergstraat 15, 2628 CE Delft (directions).

Announcing Integration.02

by Dieter Vandoren

I’m happy to confirm that, now that all the organizational puzzle pieces have finally fallen into place, my new project Integration.02 is in the production pipeline. Its premiere is scheduled for the Blikopener Festival on the 5th and 6th of June in Delft (NL).

Integration.02 is the second iteration of my Integration series of audiovisual-installation-meets-live-performance projects. It builds upon Integration.01, expanding into a three-dimensional, half-circle light installation and adding spatialized multichannel audio. This time round I step back behind the scenes and give way to dancers engaging in a motion-tracked, interactive dialogue with the dynamic audiovisual space.

Integration.02 is being developed in collaboration with media artist, composer and researcher MarkDavid Hosale and dancer and choreographer Janne Eraker. More dancers are to be involved soon.

Integration.01

Integration.01 at the SEEDER festival, De Fabriek Rotterdam, 2008

obo64 screenshot

I dusted off my Monome today, eager to check out what goodies Max for Live brought to the community. I got quite excited over the idea of being able to punch in beats in Live from the Monome. Surely someone had done this by now.

Stretta’s obo (from his great Max for Live monome suite) was the closest thing I could find. It’s pretty awesome but has one major flaw for Monome 64/40h users like myself: it’s limited to a pattern length of 8 steps. 2-bar loops get boring very quickly…

obo lets you switch between 16 patterns, but you need to reach for the mouse to do that. An external controller or key mapping could do the job too, but surely there is a way to solve it on the Monome itself? So I delved into Stretta’s patch and modified a couple of things to suit my needs:

- The maximum pattern length is now 32 steps.
- The top row is sacrificed for pattern page selection. The 4 leftmost buttons select the pattern page, pages 1 through 4 for the full 32 steps. It still shows the play position too.
- The default note mapping matches Live’s drum rack mapping. The remaining 7 rows map to the first 7 slots of a drum rack (starting at C1).

My goal was to make a quick beat-sketching tool with a hardware step sequencer feel. Recording the output into MIDI clips lets me edit them in greater detail in Live itself. Possible future enhancements: adding vertical paging for sequencing more than 7 tracks and/or a 64-step pattern length.
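In case it helps anyone adapting the patch further, the paging and note-mapping logic described above boils down to two small calculations. This is just a Python sketch of the idea, not the actual Max patch, and all names and the row orientation are my own assumptions.

# Sketch of the obo64 paging and note-mapping arithmetic (illustrative only).
STEPS_PER_PAGE = 8        # the Monome 64 / 40h grid has 8 columns
NUM_PAGES = 4             # 4 pages x 8 columns = 32-step pattern length
DRUM_RACK_BASE = 36       # C1, the default lowest pad of a Live drum rack

def step_index(page, column):
    """Map the selected page (0-3) and a pressed column (0-7) to the
    absolute step in the 32-step pattern (0-31)."""
    return page * STEPS_PER_PAGE + column

def row_to_note(row):
    """Map grid rows 1-7 (row 0 is the page-selection row) to the first
    seven drum rack pads, assuming rows count up from C1."""
    return DRUM_RACK_BASE + (row - 1)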

Saving to external file and pattern switching seem to work reliably. I haven’t tested the MIDI in feature.

obo64 Max device download: http://dietervandoren.net/filez/obo64.amxd.zip

iPhone/iPod touchscreen interface development is taking a leap forward!

nr74’s c74 app and Max external creates on-the-fly GUI layouts connected to Max (running on a computer). The massive improvement over existing interface layout apps like TouchOSC is that the layout is dynamically controlled by Max. Now we can have the interface adapt to the tasks it has to perform in real time.

Can’t wait to try it out. Looks very much like I’ll switch to it for the further development of DTouchR.

Seen first at CreativeApplications.net

Invitation

After a long semester of workshops, brainstorms, prototyping, tinkering and soldering, the students I’ve co-tutored in the Delft University of Technology Interactive Environments course are presenting their work on Friday. The results are three interactive lounge prototypes. Each applies a number of technologies and principles, including kinetic structures (pistons, servos, inflatables), tactile interface sensors, optical motion tracking, emergent-behavior swarm networks, spatial sound, dynamic lighting, Arduino prototyping, Max scripting, Python coding, etc.

Everyone’s welcome to join the public presentation at 16.00. Have a look at the studio blog for work-in-progress pics and info.