




The Most Intriguing Performance in Jazz 2011

Jazz, Math, Tech & Lyle Mays
One of the most intriguing events in jazz this year went largely unnoticed, and I stumbled upon it by accident.  I received an email in early spring from my friend Lyle Mays with a short message that read simply “lots going on,” along with a YouTube link to a video of himself and a group of musicians performing at the TEDxCaltech conference in Pasadena, California this past January (Feynman’s Vision: The Next 50 Years).  Lyle was an invited performer at the conference, and the video he sent me was superb both sonically and visually.  This wasn’t just some jazz fusion performed on a stage in front of an audience but something much more mysterious, with larger creative implications.
When I wrote back to Lyle commending him on the impressive performance, I had to ask: what the heck did I just watch?  He informed me that this was a project he had been working on for some time, combining music composition, jazz improvisation, physics equations, algorithmic composition, speech patterns (courtesy of the late physicist Richard Feynman, whose work was the focus of the conference), live video mixing, and a custom linked laptop network.  You got that?
The project originated with Caltech physicist Michael Roukes: “When I was thinking of who I might invite as the principal musical guest for TEDxCaltech, one of my first thoughts, of course, was of Lyle.  Searching out his biography, I was intrigued to find that he has an affinity for things mathematical…so I took the plunge and attempted to contact him about our then-future event — somewhat on a lark.  I outlined the inchoate (maybe incoherent is more apt) ideas I had for a connection to music and the scientific theme of our event.”
Roukes and Lyle met and outlined the work, as Roukes explains: “Lyle and I got together for lunch a number of times, communicated by e-mail, then spent some afternoons together with several colleagues from each of our ‘worlds’ exploring how nonlinear dynamics, mathematics, and numerical programming could generate interesting motifs as the basis for Lyle’s keyboard improvisations.  This led to a number of afternoon jam sessions — first in my office, increasingly with a larger cast of characters, and later in the Caltech jazz ensemble’s rehearsal room with other musicians.  These sessions were an almost equal mix of algorithms, musical expression, and blue-sky discussion.”  After the initial sessions, Roukes’ own work pulled him away, leaving Lyle and the musicians to finish.  The piece was realized, performed, and captured on video early this year at the TEDxCaltech conference.  Fittingly, Roukes provided the introduction to the performance in the program.
Given the complexity of this project, instead of interviewing Lyle about this unique work, I felt it would be more interesting to ask him to comment on the performance itself, following the time sequence of the performance video.  That way we can follow the details of the musical performance and also gain a deeper understanding of the processes going on as they happen within the context of the piece.  First, watch the performance.  Then, follow Lyle’s commentary.  Fortunately, there will not be a test on the assignment. - JV

Lyle Mays Commentary:
What do you get when the IT Department is the band?  Jimmy Branly (drums) is a recording engineer.  Andrew Pask (woodwinds) is a programmer who works for Cycling ’74 (the company that makes the brilliant Max software), Bob Rice (guitar and sounds) is a sound designer/engineer/synth programmer, Tom Warrington (bass) is a math whiz, Jon 9 (visualizations) designs, builds, and provides content for video installations, and Rich Breen could build (and nearly has built) recording studios MacGyver style.  And what kind of music should one make when Stephen Hawking is in the audience at Caltech?  Jazz alone doesn’t cut it.
The first notes you hear are chosen by one of my apps, which maps the calculus equations describing the motion of the double pendulum to an accompaniment part played by synths.  I wanted to use nonlinear dynamics as a starting point, to illustrate the deep connections between math and music and to make a point about what we call organic.  As Galileo famously observed,
“Mathematics is the language with which God has written the universe.”
The deeper we understand math, and the better we apply those discoveries, the closer our technology gets to what feels right or organic.  If we use the same equations the universe does, why shouldn’t it feel less like tech and more like nature?
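Lyle doesn’t spell out how his mapping works, but the general shape of the technique is easy to sketch.  Here is a minimal, hypothetical version: a simple Euler integration of the standard double-pendulum equations of motion, with one angle mapped onto an arbitrary pentatonic scale.  Every constant and scale choice below is an illustrative assumption, not Lyle’s actual app.

```python
# Minimal sketch (not Lyle's actual app): integrate the double-pendulum
# equations of motion and map one angle onto pitches in a scale.
import math

G = 9.81          # gravity
L1 = L2 = 1.0     # pendulum arm lengths
M1 = M2 = 1.0     # bob masses

def derivs(state):
    """Standard equations of motion for (theta1, omega1, theta2, omega2)."""
    t1, w1, t2, w2 = state
    d = t1 - t2
    den = 2 * M1 + M2 - M2 * math.cos(2 * d)
    a1 = (-G * (2 * M1 + M2) * math.sin(t1)
          - M2 * G * math.sin(t1 - 2 * t2)
          - 2 * math.sin(d) * M2 * (w2**2 * L2 + w1**2 * L1 * math.cos(d))) / (L1 * den)
    a2 = (2 * math.sin(d)
          * (w1**2 * L1 * (M1 + M2) + G * (M1 + M2) * math.cos(t1)
             + w2**2 * L2 * M2 * math.cos(d))) / (L2 * den)
    return (w1, a1, w2, a2)

def step(state, dt=0.01):
    """One Euler step; crude, but fine for generating motifs."""
    ds = derivs(state)
    return tuple(s + d * dt for s, d in zip(state, ds))

# Map theta1 onto a C-minor pentatonic scale across two octaves (MIDI numbers).
SCALE = [60, 63, 65, 67, 70, 72, 75, 77, 79, 82]

def angle_to_note(theta):
    x = (math.sin(theta) + 1) / 2                 # normalize to 0..1
    return SCALE[min(int(x * len(SCALE)), len(SCALE) - 1)]

state = (math.pi / 2, 0.0, math.pi / 2 + 0.01, 0.0)   # initial conditions
for _ in range(8):
    for _ in range(25):                           # advance the system between notes
        state = step(state)
    print(angle_to_note(state[0]))
```

Because the system is chaotic, tiny changes to the initial conditions produce entirely different note streams, which is exactly what makes it useful as a motif generator.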
The video shots for the next few seconds (0:12-0:19) show one of the interfaces I wrote, which displays the computer’s choices both to me and to everyone in the band (through a linked laptop network Andrew and I came up with and he coded), with both pitches on staves and chord names (using another of my apps, which analyzes harmony in real time).  The interface also automatically reconfigures my MIDI world (much like the sadly defunct OMS Setup + Patches software).  The buttons for “Change Low” and “Change High” allow me to “conduct” the computer to choose a new voicing or chord by simply hitting keys on a little Korg Nano.  Think of that synth as an assistant conductor who makes no sound but speaks the computer’s language.  The database from which the app chooses arpeggios is a linked, bifurcated system where the computer “knows” which upper structures are consonant with which lower structures, so the chords can move in leap-frog fashion: top notes can remain when lower notes change (or vice versa) because the moves are guaranteed to make harmonic sense.  The specific note choices were made by the algorithms.
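To make the leap-frog idea concrete, here is a minimal sketch of such a bifurcated database.  The voicings, chord names, and consonance table are invented for illustration; they are not taken from Lyle’s app.

```python
# Hypothetical sketch: lower structures, upper structures, and a table of
# which uppers are "known" consonant over which lowers, so Change Low /
# Change High always land on a valid pair.
import random

LOWERS = {                       # root-position shells, as MIDI pitch sets
    "Cmaj7":  [36, 47, 52],
    "Dmin7":  [38, 48, 53],
    "Ebmaj7": [39, 50, 55],
}
UPPERS = {                       # upper-structure triads
    "D/": [62, 66, 69],
    "G/": [67, 71, 74],
    "F/": [65, 69, 72],
}
CONSONANT = {                    # the "bifurcated" links between the two halves
    "Cmaj7":  ["D/", "G/"],
    "Dmin7":  ["F/", "G/"],
    "Ebmaj7": ["F/"],
}

class VoicingEngine:
    def __init__(self):
        self.low, self.high = "Cmaj7", "D/"

    def change_low(self):
        """Pick a new lower structure consonant with the current upper."""
        ok = [lo for lo, ups in CONSONANT.items() if self.high in ups]
        self.low = random.choice(ok)

    def change_high(self):
        """Pick a new upper structure consonant with the current lower."""
        self.high = random.choice(CONSONANT[self.low])

    def notes(self):
        return LOWERS[self.low] + UPPERS[self.high]

engine = VoicingEngine()
engine.change_high()             # top notes move, bottom stays
print(engine.low, engine.high, engine.notes())
engine.change_low()              # bottom notes move, top stays
print(engine.low, engine.high, engine.notes())
```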
The first “human” notes (played from the MIDI piano) at 0:22 were sent to Frisell-style electric guitar samples, a step closer to actual acoustic music without yet introducing real air vibration caused by a player’s physical motion.  By playing the piano softly and pushing my volume pedal up, I could control the balance between piano and guitar.  Notice how at 0:40 the piano comes into the mix simply because I struck the keys harder.
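A rough sketch of that balance trick, assuming the two sample layers share the same MIDI input and the volume pedal arrives as a continuous controller; the specific curve and controller number are guesses:

```python
# Hypothetical sketch of the velocity/pedal crossfade, not the actual rig.
def layer_gains(velocity, pedal):
    """velocity, pedal: 0-127. Returns (piano_gain, guitar_gain) in 0..1."""
    piano = (velocity / 127.0) ** 2       # soft playing keeps piano out of the mix
    guitar = pedal / 127.0                # pedal up = guitar samples louder
    return piano, guitar

# Playing softly with the pedal up: mostly guitar.
print(layer_gains(velocity=40, pedal=120))   # -> (~0.10, ~0.94)
# Striking harder: the piano comes into the mix.
print(layer_gains(velocity=110, pedal=120))  # -> (~0.75, ~0.94)
```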
Somewhere around 0:50 I hit the “>” key on the computer, which caused the chart-display part of the interface to move to the next section, which in turn displayed a message on everyone’s laptop: Andrew was to join with sparse comments, Jimmy was to bow his cymbals, and Tom was to add bowed efx.  At 1:42 I advanced again, and now Bob got a message to join.  At 2:33 another advance told Tom to take the foreground sans bow (very soulfully imho), Andrew to recede, and Jimmy to switch to mallets.  A little later another message got sent telling Andrew to switch to clarinet, Jimmy to go back to bowing cymbals, and Bob to switch to his sound-design rig mode to prepare for the next section.
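The cue system itself was Andrew’s code, and its internals haven’t been published.  A hypothetical bare-bones version might broadcast each section’s cues as JSON over the laptops’ shared network, something like this:

```python
# Hypothetical sketch of the "conductor" cue network, not Andrew's code:
# the ">" key advances a section counter and broadcasts cues as JSON/UDP.
import json
import socket

CUES = [
    {"Andrew": "join, sparse comments", "Jimmy": "bow cymbals", "Tom": "bowed efx"},
    {"Bob": "join"},
    {"Tom": "foreground, no bow", "Andrew": "recede", "Jimmy": "mallets"},
]

class Conductor:
    def __init__(self, port=9000):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        self.port = port
        self.section = -1

    def advance(self):
        """Move to the next section and tell everyone's laptop."""
        self.section += 1
        msg = json.dumps({"section": self.section, "cues": CUES[self.section]})
        self.sock.sendto(msg.encode(), ("<broadcast>", self.port))

conductor = Conductor()
conductor.advance()   # section 0: Andrew, Jimmy, and Tom get their instructions
```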
The processed speech you hear at 3:48 is Bob playing an excerpt of a Feynman talk through his iPhone into his guitar pickup and then doing realtime sound design on that through his rig (all his ideas, which I loved).  Andrew routed his clarinet through one of his apps that changes the sound or even the actual pitches of his echoes (all his ideas, which I also loved).
At 4:42 I started an app which generated a steady 3/4 rhythm using shekere samples, with equations for compressed fluid dynamics patched to the velocities within the pattern.  The first drum hits you hear at 4:48-4:55 are the same double pendulum equations used earlier (but with different initial conditions), this time mapped to a variety of conga samples: the equations taking a mini solo.  Jimmy then joins with his hybrid drum/percussion/electronic rig for a duet.  Notice how he makes bongos, his electronics, my computer generated parts, and his live drums sound like one natural instrument!
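The exact fluid-dynamics equations aren’t given, but the technique itself (a fixed rhythmic grid whose accent velocities come from iterating an equation) can be sketched with any nonlinear map standing in; the logistic map below is purely a placeholder:

```python
# Hypothetical sketch: a steady 3/4 shaker pattern in eighth notes, with
# velocities driven by a chaotic map instead of a static groove.
def equation_driven_pattern(bars=2, steps_per_bar=6, r=3.9, x=0.5):
    """Fixed grid; the logistic map stands in for Lyle's equations."""
    pattern = []
    for bar in range(bars):
        for step in range(steps_per_bar):
            x = r * x * (1 - x)                  # logistic map iteration
            velocity = int(40 + x * 80)          # scale into a usable MIDI range
            beat = step / 2 + 1                  # eighth-note positions in 3/4
            pattern.append((bar + 1, beat, velocity))
    return pattern

for bar, beat, vel in equation_driven_pattern():
    print(f"bar {bar} beat {beat:.1f}  shaker velocity {vel}")
```

The pulse stays perfectly steady; only the accents breathe, which is what keeps an equation-driven part from sounding like a drum machine.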
Another advance of the conductor system told Andrew to join on penny whistle and Bob to add synth-like efx from the guitar.  These are clear text instructions that appear on the guys’ laptops, so there’s no guesswork about who should be doing what, and when, in music this free.  It felt like I could “play” the band from my rig while improvising.
At 5:27 we started the first bit of conventional music (actual paper on music stands) which was designed to make a (hopefully) seamless transition from the world of interacting with algorithms and equations to the world of musicians interacting with each other (which is my favorite system of nonlinear dynamics). Tom and I were to play three rubato chords while I faded the machine drums out, Andrew switched to soprano, and Jimmy was then free to start the opening pattern of the intro to this tune.
All through this, our sixth band member, Jon 9 on video, was improvising using photos from the Feynman archives at Caltech, live feeds from my computer and MIDI piano, and a robot camera over my left shoulder that he could control with a joystick.  He had his own algorithms running which allowed him to switch between, merge, or fantabulize (yes, he talks like that) it all.
On a compositional note, the rhythm you hear at 8:02 is based on Feynman’s own speech pattern.  One day while we were scheming at my kitchen table, Bob applied a series of filters to a YouTube video of Feynman which left him sounding like a mosquito.  The thought content seemed removed, but an ineffable essence of intent and clarity remained, which floored me.  The man had a rhythm all his own, and that was made crystal clear.  I quantized a bit of what I heard and used that as the rhythmic palette for the whole piece from 6:12-8:08.
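Lyle did this by ear, but the underlying operation, snapping a list of speech-onset times to a rhythmic grid, is simple to sketch; the onset times below are invented:

```python
# Hypothetical sketch of quantizing a speech rhythm to a sixteenth-note grid.
def quantize(onsets, bpm=120, grid=4):
    """Snap onset times (seconds) to the nearest 1/grid-of-a-beat subdivision."""
    beat = 60.0 / bpm
    step = beat / grid                      # sixteenth notes at grid=4
    return [round(t / step) * step for t in onsets]

speech_onsets = [0.02, 0.31, 0.58, 0.94, 1.27, 1.49]   # made-up syllable times
print(quantize(speech_onsets))   # -> [0.0, 0.25, 0.625, 1.0, 1.25, 1.5]
```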
At 8:18 we were supposed to move into a zone where one of my apps displayed the chords I was improvising to the band members, to serve as an organizational tool for the not-quite-free improv section.  The output is actual chord names, not notes on a staff.  It was way cool in rehearsals, but I screwed up one of my zillion moves within the interface that day and it didn’t engage.  The band, bless their hearts, just stayed where we started and made music with it.  The show must go on…
Notice how at 10:25 I reach over to the computer and exactly ten seconds later everyone just stops. Andrew had programmed a slick film-reel-start kind of countdown timer into the network so when I clicked it, we all had a visual display of our ten second countdown. From a musical perspective that meant we could concentrate on what we were playing instead of waving arms or watching, or in any way deviating from our ideas while we stopped on a dime.
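A bare-bones version of that countdown might work by sharing a single wall-clock deadline over the network so every screen agrees on the same instant; this sketch is a guess at the mechanics, since the real implementation was Andrew’s:

```python
# Hypothetical sketch of the synchronized ten-second countdown.
import time

def start_countdown(seconds=10):
    """Click once; everyone stops together when the deadline hits."""
    deadline = time.time() + seconds        # broadcast this timestamp to the band
    while (remaining := deadline - time.time()) > 0:
        print(f"\rstop in {remaining:4.1f}s", end="", flush=True)
        time.sleep(0.1)
    print("\reveryone stops.        ")

start_countdown(10)
```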
The next section uses an app which orchestrates in real time.  It sends notes I improvise to a variety of virtual instruments, creating a kind of chamber orchestra on the fly.  Simultaneously, another app is graphing the performance (like data in a lab experiment) in a stripped-down way where pitch is the y axis and time is the x axis.  At 11:59 I switched the graphing app to “polar view,” where pitches are mapped according to the circle of fifths.  This view came from a suggestion by the incredibly brilliant Caltech grad student Andrew Homyk, who ported my app from RealBasic to C for use here and who also coded all the calculus equations to create the double pendulum animation.  This guy writes code as fast as musicians improvise.  It’s astounding.
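The polar mapping itself is straightforward to sketch, assuming each pitch class is placed around the circle of fifths by multiplying by seven semitones (mod 12) and something like elapsed time drives the radius; everything beyond that assumption is a guess:

```python
# Hypothetical sketch of the "polar view": harmonically close notes land
# near each other because the angle walks the circle of fifths.
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def polar_position(midi_note, t, radius_per_sec=1.0):
    """Map a note to (x, y): angle from the circle of fifths, radius from time."""
    pc = midi_note % 12
    fifths_index = (pc * 7) % 12            # semitones -> circle-of-fifths position
    angle = 2 * math.pi * fifths_index / 12
    r = t * radius_per_sec
    return r * math.cos(angle), r * math.sin(angle)

# C and G (a fifth apart) plot as neighbors; C and C# end up far apart.
for note, t in [(60, 1.0), (67, 1.0), (61, 1.0)]:
    x, y = polar_position(note, t)
    print(f"{NOTE_NAMES[note % 12]}: ({x:+.2f}, {y:+.2f})")
```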
There was an interesting bit of psychological feedback for me watching the polar view as I played: it caused me to interact with the display just for the pure fun of making the dots go in different directions, or creating patterns that the visuals suggested as opposed to patterns that the notes suggested.  Give a person a new perspective and new ideas emerge.
The last section is a reworking of my piece “Before You Go” from the album Street Dreams.  The samba treatment with a new tag provided a great vehicle to showcase Jimmy and let the band open up.  It was fun, and maybe freeing, to do a bit of music at the end that had less tech attached, although Jon was still hard at it and Bob and Jimmy incorporated all their electronics in some very cool ways.  All in all, this had to be a musical performance that stood on its own apart from any technology.  I tried hard to weave the threads together, to design a set that had a narrative flow, to create a different kind of band with both intelligence and playfulness that paid homage to the eccentric and spirited Richard Feynman, and that ultimately would be faithful to the triple pillars of TED: technology, entertainment, and design.
About Lyle Mays:
Lyle Mays has been an integral part of the Pat Metheny Group since its inception in 1977 and has co-written much of the consistently engaging music for the multi-Grammy-winning group’s albums.  Lyle’s sense of melody, crystal-clear virtuosity, and almost cinematic scope of orchestration have clearly distinguished the group’s sound.  Born into a musical family in Wausaukee, Wisconsin, in 1953, he was always encouraged to explore new forms of expression.  As a teenager, Lyle attended jazz summer camps and studied with such talents as Rich Matteson and Marian McPartland.  He then studied composition and arrangement at North Texas State University before touring with Woody Herman’s Thundering Herd.  While appearing at the 1975 Wichita Jazz Festival, Lyle met twenty-year-old guitarist Pat Metheny, and the two formed a musical alliance that has proven to be among the most artistically successful of the past three-plus decades.
More info: TEDxCaltech
Special thanks to Lyle Mays, Michael Roukes and Mary Sikora
