What I’d like to try next time…

A little movie to show you another animation that responds to the music as well as following something…

This is the visualisation I was talking about in the post ‘Thoughts on two sources’. I’ve had a better look at the program now and it seems that the warping doesn’t actually attract the colour from one point to another, but starts from a point and then smudges it in different directions. This means I can’t easily do the thing I was thinking of, with the colour going from one dancer to another. I’m sure it could be done, but not as easily as I had anticipated. Shame.

I’d like to make the MaxMSP patch I am using in this little movie available to download, but WordPress doesn’t support the file format.

Visiting Lecture 1 (written-up notes)

John Bowers from Goldsmiths came on Thursday to give us what was more or less a presentation of his previous work; some really interesting stuff, quite relevant to what a lot of us are thinking about at the moment.

He kicked off by talking about the interactive virtual reality work he was involved with in the 90s, bringing the real world into navigable virtual worlds to model and affect behaviours and spaces. He gave examples of how virtual reality ideas were taken up by the media in TV, fashion and computer gaming; some more successfully than others.
An example of trying and failing to incorporate computers in the fashion industry was virtual garment making (building a garment virtually would take longer than creating a real mock-up, and could not answer the questions it was made for, e.g. Is it comfortable? How does it feel to wear it?). He made the salient point that you need to establish whether the technology is justified, not use it simply because you can.

He said he’d noticed that thinking had reversed somewhat since the 90s desire to bring the real world into a virtual one; the trend now is people thinking about how it is possible to mix the virtual world with the real one, superimposing them in creative and meaningful ways for entertainment and education.

I’m not completely convinced that virtual reality has had its day, what with the success of Habbo Hotel (people living virtual lives on the internet) and online gaming, but I agree that a lot of interesting current work is finding ways of creatively integrating technology with our everyday lives, enabling anybody (not just nerds!) to use it.

Some artists’ projects mentioned or talked about at the lecture… Blast Theory’s Desert Rain & Uncle Roy All Around You, Nottingham Castle, Zgodlocator, Masaki Fujihata. I don’t know whose website this is… http://www.mrl.nott.ac.uk/~sdb/videos/ but it has a neat little description of all the projects John was talking about, with some videos.

Some interesting points on Collaboration… Different parties usually have different goals within the same project. We should establish ‘boundary objects’ – elements that everyone in the collaboration can agree are important. In my case I suppose that would be tango.

Other triggers

My tango friend Adam had a bright idea and e-mailed me this…

Hi!

How did it go with gonzalo and solange?

I was just working on something totally unrelated, and I had an idea…

You know that video you showed me with people holding sensors, and the image changed whenever they turned the sensor upside down or something?

Well, in tango your feet move upside down in linear and circular voleo and gancho, etc. And they are usually decorations accompanied by accents in the music. So wouldn’t it be cool if the image could change with the accents? So… sensors on followers’ feet is the answer.

OK, I’ll go back to work now…

An excellent point! And this visualisation patch is so customisable it’s perfect. If I can add something in the tracking part of the programme to recognise the underside of the feet, I could use it to trigger a change of colour in the animation. Could be really cool.

So, what could I use to track the soles of the feet? I don’t know if the colour recognition is strong enough to pick up what would be such quick flashes. Anyone else got any bright ideas?
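I haven’t built this yet, but the logic I have in mind is simple enough to sketch outside Max. In this little Python sketch everything is made up for illustration (the marker colour, the frames, the thresholds): count how many pixels in a frame match a marker colour on the sole, and only fire a trigger on the rising edge, so one quick flash gives exactly one colour change rather than a burst of them.

```python
# Sketch of sole-flash detection: a trigger fires on the rising edge,
# so a brief flash of the marker colour gives exactly one event.
# The frames and the colour threshold here are invented for illustration.

def is_sole_colour(pixel, target=(255, 0, 0), tol=40):
    """True if an (r, g, b) pixel is close to the marker colour."""
    return all(abs(c - t) <= tol for c, t in zip(pixel, target))

def detect_flashes(frames, min_pixels=3):
    """Return the indices of frames where the sole first becomes visible."""
    triggers = []
    visible_last_frame = False
    for i, frame in enumerate(frames):
        count = sum(1 for px in frame if is_sole_colour(px))
        visible = count >= min_pixels
        if visible and not visible_last_frame:   # rising edge only
            triggers.append(i)
        visible_last_frame = visible
    return triggers

# Tiny synthetic test: the marker shows in frames 2-3 and again in frame 6.
red, grey = (250, 10, 10), (100, 100, 100)
frames = [
    [grey] * 8,              # 0
    [grey] * 8,              # 1
    [red] * 4 + [grey] * 4,  # 2  <- flash starts
    [red] * 4 + [grey] * 4,  # 3
    [grey] * 8,              # 4
    [grey] * 8,              # 5
    [red] * 5 + [grey] * 3,  # 6  <- second flash
]
print(detect_flashes(frames))  # -> [2, 6]
```

Whether the real colour tracking can see the sole for long enough, at dance-floor frame rates, is exactly the open question.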

Thoughts on two sources…

I had thought I would have to adjust the programme to receive two inputs if I wanted the dancers to be tracked individually, but when we were experimenting in the space, it would jump between the two sources of shadow if the dancers were apart, creating the effect I wanted anyway (two sets of ripples).

Now I’m playing with this visualisation patch, it seems to have two points that it uses to create the animation – an origin point and a sort of black hole point that the colours are sucked into… it might be nice to assign the dancers a point each, so they are always joined to each other by a bridge of colour even when they are apart… illustrating the connection that exists between them at all times during the dance.
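I don’t know yet how the patch exposes those two points, but in principle the tracking only needs to hand it two coordinates and stop them from swapping identities. A rough Python sketch of that (all coordinates and names are hypothetical): match each new pair of detected positions to whichever previous point it is nearest to, so one dancer stays the origin and the other stays the sink even when they cross.

```python
# Sketch: keep two tracked points stably assigned to the animation's
# "origin" and "sink", matching each new detection to the nearest
# previous point so the dancers don't swap roles when they cross.
# All coordinates here are invented for illustration.

def dist2(a, b):
    """Squared distance between two (x, y) points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def assign(previous, detected):
    """previous = (origin, sink); detected = two new points in any order.
    Returns (new_origin, new_sink) with identities preserved."""
    p, q = detected
    # Try both pairings and keep the one with the smaller total movement.
    straight = dist2(previous[0], p) + dist2(previous[1], q)
    swapped = dist2(previous[0], q) + dist2(previous[1], p)
    return (p, q) if straight <= swapped else (q, p)

origin, sink = (10, 10), (90, 90)
# Next frame: the tracker happens to report the points in the opposite order.
origin, sink = assign((origin, sink), [(88, 91), (12, 9)])
print(origin, sink)  # -> (12, 9) (88, 91)
```

With the identities held steady like this, the bridge of colour would always run from the same dancer to the same dancer.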

New Toys!

Liam introduced me to a Max patch yesterday that generates visualisations from music. Then I discovered that the centre of the generated animation can be moved, which is ideal! It means it should plug straight into my patch with minimal fiddling, following the coordinates of the dancers. I will have a go… not sure how to post the results though… I guess I could make a little movie of me using it.

Tutorial with Liam

I handed in my statement of intent for this project… collaborative_statement1.doc

He seemed happy enough. He suggested thinking about how sound was involved; perhaps the patterns generated by the programme should affect the playback of the music to make it all circular (music affects the choreography, choreography affects the pattern, pattern affects the music). I can see how this would be pleasing, but I’m not sure yet how it could be implemented. I did have the programme listening to the music and changing the damping of the ripples depending on the volume, but Gonzalo pointed out that visualisations from music are quite common now, and it would be more interesting to work on introducing more elements to be affected by movement. I think I agree. But I will give the sound dimension some more thought.
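For reference, the volume-to-damping idea I had running is simple enough to sketch outside Max (a Python sketch; the window contents and the damping range are placeholder numbers, not what the patch actually uses): take the RMS level of a short window of samples and scale it into a damping range, so loud accents let the ripples ring while quiet passages damp them quickly.

```python
# Sketch of the volume-to-damping idea: the RMS level of a short window
# of audio samples is scaled into a damping range for the ripples.
# The sample windows and the damping range are placeholder values.
import math

def rms(samples):
    """Root-mean-square level of a window of samples in [-1, 1]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def damping_from_level(level, low=0.02, high=0.30):
    """Map an RMS level (0..1) to a damping value: quiet music damps
    the ripples quickly, loud accents let them ring."""
    level = max(0.0, min(1.0, level))
    return high - level * (high - low)

quiet = [0.05 * math.sin(i / 10) for i in range(100)]
loud = [0.9 * math.sin(i / 10) for i in range(100)]
print(damping_from_level(rms(quiet)))  # quiet passage: heavier damping
print(damping_from_level(rms(loud)))   # loud accent: lighter damping
```

If the circular idea ever happens, the interesting part would be running this mapping in reverse as well, with the pattern feeding something back into the playback.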

I had some other ideas on how I could change the imagery generated by the programme. I don’t know if it’s possible, but I thought perhaps I could somehow take the programming for the visualisations you get with Windows Media Player, for example, and introduce it to my programme so that I could use them on the dance floor. Liam suggested looking up max tunes in MaxMSP.

We also discussed using sensors on the dancers (tilt sensors or pressure pads) to enable other things to be triggered within the performance.

Rilke’s Duino Elegies

Frankie (who helped me with the filming at the arts centre) commented that he was reminded of Rilke’s elegy referencing Picasso’s Les Saltimbanques… so I looked it up. This is the fifth elegy, the last six lines are relevant…

But who are they, tell me, these Travellers, even more
transient than we are ourselves, urgently, from their earliest days,
wrung out for whom – to please whom,
by a never-satisfied will? Yet it wrings them,
bends them, twists them, and swings them,
throws them, and catches them again: as if from oiled
more slippery air, so they land
on the threadbare carpet, worn by their continual
leaping, this carpet
lost in the universe.
Stuck on like a plaster, as if the suburban
sky had wounded the earth there.

This elegy is founded on Rilke’s knowledge of Picasso’s painting Les Saltimbanques

saltimbanques1.jpg

(he lived, from June to October 1915, in the house where the original hung, in Munich). Picasso depicts a family of travelling acrobats. Rilke was familiar with such people from his stay in Paris, where he became Rodin’s secretary.

I love the idea that the performers could be healing the earth (or healing themselves, or anyone watching) where they dance. I’d like to think the white plastic sheet I put down to project onto equates to Rilke’s threadbare carpet and has similarly therapeutic properties.

Hybridized Practice

Last term I looked at ways of marrying dance (particularly thinking about Argentine Tango) with technology. I looked at various websites to see what was being done already…

Interactive dance/performance
http://www.igloo.org.uk/main.html
http://www.igloo.org.uk/dotdotdot/Dswmedia/index.htm
http://createdigitalmotion.com/tag/diy/
http://www.essexdance.co.uk/DanceTech/pages/page1.htm
http://troikaranch.org/technology.html
http://www.companyinspace.com/front/cis_fs.htm
http://www.newworknetwork.org.uk/

– Particularly relevant to my project:
http://www.tmema.org/messa/messa.html
http://www.betaminds.com/interactfx/index.htm
http://naturalinteraction.org/files/whitepaper.pdf
http://www.feedtank.com/dfm.html
http://www.elkabong.com/IDC/idc.html
http://www.arcstreamav.com/poolSystem.htm?ppc_agent=google&term=interactive+floors
http://www.luminvision.co.uk/interactivesoftware.htm
http://www.naturalinteraction.org/
http://www.ncl.ac.uk/culturelab/research/amuc.htm
http://domino.watson.ibm.com/comm/research.nsf/pages/r.mobile.innovation.html

http://www.youtube.com/watch?v=NA1P-eI794Q (shoes + comment: The result shown here is an interesting start. It’d be more interesting if there were gyros wired in that would interpret sweeping motions of the legs/feet… perhaps as pitch shifting or other “velocity” based effects. Definitely a lot of potential.)

Interactive Tango video

I emailed Stuart Hobday at the Norwich Arts Centre before Christmas to ask about using the hall there. I wanted to try and set up the project I was working on last term, in a space where I could project onto the floor. I didn’t hear from him, so was pleasantly surprised when I followed up to hear that he was willing to let me use the space to experiment. By fortuitous coincidence, the day we arranged for me to have the auditorium, two professional dancers from Argentina (who set up the ‘experimental tango art centre’ in Buenos Aires, check www.tangoscene.com) were visiting Norfolk to teach at the Tango festival at Bylaugh Hall. When I showed them the video I made last term, testing out the MaxMSP Patch I put together, they were interested in helping me develop the idea.

With the assistance (and camera equipment) of my friend and film maker John Franklin, and VJ Marcus Williams from 2bitTV (see 2bittv.net) who is also on the MADP course, and the two really helpful technicians at the Arts Centre: Paul Osborne (Hall tech) and James Gibbon (multimedia tech) we got it all up and running.

Here it is, a draft edit of a potential performance piece.

A bit too long? I’ve only shown it to people who aren’t very interested in tango so far, so it’s been difficult to gauge its success! Shame the yellow doesn’t show up better, but we had some problems with darker colours. I think I’ll put together an edit of the dancers playing/testing the floor; it will be useful for thinking of ways to progress the project.