Jean-Baptiste Barrière

Jean-Baptiste Barrière has been a composer and researcher at Ircam since 1981, where he worked within the scientific research projects Chant (computer synthesis of the singing voice) and Formes (computer-aided composition). He also assisted the composers Gérard Grisey, Jonathan Harvey and Sir Harrison Birtwistle in the realization of their works at Ircam. From 1984 to 1987, he directed the Esquisse project (a Lisp environment for computer-assisted composition). In 1987-1988, he was responsible for teaching computer music composition at the CNSM in Paris. During the 1988-1989 academic year, he was invited to the University of California, San Diego. From 1989 to 1998, he was director of the Pedagogy department at Ircam.

Chréode (1983)

Chréode is a term borrowed from morphogenesis and biology (from the Greek chrē, "it is necessary", and hodos, "path": the necessary path). It serves here as a metaphor for a systematic cross-investigation of sonic materials and organizations.
Though the sonic materials have been worked carefully, the attention in this piece bears above all on organization. Chréode is the first step towards a grammar of processes that I want to try to elaborate.
This research on musical processes, their fields of action and their limits, has been conceived as a strategy for approaching the musical territory as it is renewed by the possibilities opened up by the computer.
A very general aim of this project was to experiment with different types of organizations and, at a higher level, to structure them in time and formally. The sonic materials were chosen to allow a certain type of control over timbre, concentrated in a small number of compositionally relevant parameters. The work on timbre is therefore essentially based on the compression and expansion of formants, or spectral envelopes: departing from vocal material and deriving from this model towards others, with or without reference to familiar families of instrumental timbres. Chréode was awarded the First Prize for digital music at the International Electroacoustic Music Contest, Bourges, 1983.
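The compression and expansion of formants mentioned above can be pictured schematically. The snippet below is a minimal illustration, not Barrière's Chant software: the scaling function and the formant values are invented for the example.

```python
# Minimal sketch of formant compression/expansion (illustrative only):
# each formant's centre frequency is scaled by a factor; a factor < 1
# compresses the spectral envelope, a factor > 1 expands it.

def scale_formants(formants_hz, factor):
    """Return the formant centre frequencies scaled by `factor`."""
    return [f * factor for f in formants_hz]

# Rough formant frequencies for a sung vowel "a" (hypothetical values)
vowel_a = [700.0, 1220.0, 2600.0]

compressed = scale_formants(vowel_a, 0.5)  # [350.0, 610.0, 1300.0]
expanded = scale_formants(vowel_a, 2.0)    # [1400.0, 2440.0, 5200.0]
```

Deriving away from the vocal model, as the note describes, would then amount to interpolating such envelopes towards those of instrumental timbres.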


John Cage   

John Cage was born in 1912 in Los Angeles and died on 12 August 1992. Composer and thinker, he was at the center of the American avant-garde for many years. He is without doubt the American composer who has had the greatest impact on all of contemporary music. Continuously exploratory, John Cage's music uses many new means: prepared piano, radios, unusual percussion instruments, electric and electronic sounds, chance, the I Ching, silence... During the sixties, his research aimed at an ever greater integration of life into art, which earned him much virulent criticism.

Atlas Eclipticalis (1961-1962)

This piece, specially commissioned by Pierre Mercure for the International Week of Today’s Music, was premiered under the direction of Cage himself on August 3rd, 1961, at the Théâtre de la Comédie-Canadienne in Montreal (now the TNM). Atlas Eclipticalis represents a map of the stars, evoking in a pointillist manner the multiplicity of voids and pools of light that make up the heavens. The title was taken from a star atlas by the astronomer Antonín Bečvář. Working from Bečvář’s maps and after multiple chance procedures guided by the I Ching, Cage draws his own celestial map: a starry sky, immutably slow but continually in motion. The score is written for 1 to 86 acoustic and electronic instruments; each part has four pages made up of five systems in which Cage draws a multitude of note clusters (the constellations). According to Cage’s instructions, the notes of these clusters are to be played in or out of order; they should be as short as possible, or of a barely perceptible length, never more than a bow-stroke or a breath. Their pitches are specific, but they have no rhythmic value; the tempo, beaten out by the clock-like movements of the conductor, connects them and determines their duration. At any time, the musicians’ performance can lie anywhere between minimum activity (silence) and maximum activity (what appears on the page).


Jean Lochard & Rémi Dury   

Jean Lochard has been a Réalisateur en Informatique Musicale (computer music designer) and teacher at Ircam since 2001. He teaches acoustics, analysis-synthesis techniques and real-time techniques to the young composers of the Computer Music and Composition cursus in the department of pedagogy and cultural action. He also created the Ircamax plug-in for Ableton Live and the Najo Max Interface, which facilitates the use of the Max software. In parallel, he pursues his work as an electronic musician: a remix of Emilie Simon, film-concerts, performances with the Suonare e Cantare company, creation of applications for the Karlax, and computer music realizations for Aril, Pierre Estève, Jean Michel Jarre, Jackson and his Computer Band, Camille…

Rémi Dury studied at the Conservatoire National Supérieur de Musique de Paris in Pierre Schaeffer's and Guy Reibel's classes. In 1990, he co-founded the Puce-Muse studios with Serge de Laubier, which he left 15 years later to found the ARTACTIL association, where he developed the Tubx. In 2003, he created the B.A.Ziques with Bernard Garabédian, an electronic instrumentarium designed for the collective practice of electroacoustic music. He composes and performs for the Nadia Xerri-L theater company, the Spideka dance company and Frédéric Firmin's drum trio. In 2004, he founded the DA FACT company. The Karlax was launched on the market in 2011 and received the Grand Prix de l'Innovation from the City of Paris.

Étude pour deux Karlax (2014)

The composition of this study, stretched over several months, is the fruit of cross-propositions between the two composers. Composed of five tableaux inspired by both electro and musique concrète, its sound elements come from concrete sounds, analog synthesizers and old synthesis programs for Atari computers. This study constitutes the meeting point of two universes, each of which is written in its own computer environment: on the one hand, the Live software, enriched with synthesis and control tools specifically designed for the Karlax thanks to Max for Live; on the other, the KarMaxAudio software, specifically developed for the Karlax and equipped with synthesis, processing and spatialization tools, fully used in the writing of the piece. The composers decided not to impose a common fingering on the Karlax: each of them has therefore built his own ergonomics for optimal comfort and pleasure of playing during the performance. Finally, thanks to its liberty of interpretation, this study makes it possible to create "live" an infinite number of variations depending on the different possible performance situations (club, choreography, mixed).


Yan Maresz   

Born in 1966 in Monaco, Yan Maresz studied jazz at the Berklee College of Music in Boston from 1983 to 1986 and progressively turned to composition. In 1987, he entered the Juilliard School in New York, where he studied composition with David Diamond. In 1994, he attended the composition and computer music cursus at Ircam. He was in residence at the Académie de France in Rome (Villa Medicis) from 1995 to 1997, at the Europäisches Kolleg der Künste in Berlin in 2004, and at the Civitella Ranieri Foundation in 2012. Since 2007, he has taught electroacoustic composition at the CNSMD de Paris, as well as new technologies at the conservatory of Boulogne-Billancourt.

Metallics (1994)

Metallics is a piece written at IRCAM during the 1994 computer music composition cursus. The composer has always been fascinated by the changes of character that mutes offer brass instruments, multiplying their expressive possibilities. Having chosen the trumpet, he undertook a study of the acoustic properties of the principal mutes used with the instrument: cup, straight, harmon, wah-wah and whisper. After analyzing the characteristics of each mute, he recreated the transformations they produce on the trumpet by applying the spectral envelope of each mute in real time (through formant filtering). The trumpet is particularly well adapted to this kind of transformation. He was thus able to simulate the different mutes on the instrument, which itself also uses the mutes during the piece, creating a play between the real sound image and its synthetic shadow. The results of the analysis of the mutes also offered a formal basis, as it occurred that the mutes classified naturally on a harmonicity/inharmonicity scale when their spectral information was compared to that of the ordinary trumpet. This gave a model for a musical course segmented into distinct movements, presenting, starting from the ordinary trumpet, the mutes from the least to the most noisy, with, in between, parentheses of ordinary trumpet during which the formant filtering is operated. The musical character of each movement is determined by the acceptance and incorporation of the sound archetypes and of the inevitable musical references specific to the trumpet and to the different mutes.
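As a rough illustration of formant filtering of this kind, the sketch below imposes a fixed spectral envelope on a signal by weighting its spectrum bin by bin. It is a minimal offline stand-in for the real-time processing used in Metallics, and the envelope values are invented, not measured from any mute.

```python
import numpy as np

def apply_envelope(signal, envelope):
    """Weight the spectrum of `signal` by `envelope` (one gain per
    rFFT bin) and resynthesize the filtered time signal."""
    spectrum = np.fft.rfft(signal)
    return np.fft.irfft(spectrum * envelope, n=len(signal))

sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)   # stand-in for the open trumpet

# Fictitious "mute" envelope: pass the lows, damp everything above 2 kHz
envelope = np.ones(sr // 2 + 1)
envelope[2000:] = 0.1

muted = apply_envelope(tone, envelope)
```

In the piece itself the envelopes were measured from each mute and applied in real time, which this offline sketch does not attempt.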


Tom Mays   

Originally from California, Tom Mays has been established in France for over 20 years. He took part in the creation of the CNCM Césaré studio, spent several years at IRCAM as a musical assistant, composes, performs with electronic instruments, develops real-time programs, and improvises. He is professor of Electroacoustic Creation and Performance at the Conservatory and Academy of Strasbourg, and associate professor of New Technologies applied to composition and computer music for the sound curriculum at the CNSMD de Paris. He is particularly interested in the gestural control of real-time computer music, in written and improvised music, and in the relation between music and image.

Presque rien pour Karlax (2014)


Steve Reich   

Steve Reich, born Stephen Michael Reich on 3 October 1936 in New York, is considered one of the pioneers of minimalist music. To characterize his work, he prefers the term "phasing", which refers to his invention of the phase-shifting technique. From 1976 onwards, beginning with one of his most important works, Music for 18 Musicians, he developed a musical writing based on rhythm and pulsation. Recognized as an essential contemporary composer, he has oriented his compositional work towards the musical setting of speech, in collaboration with his wife Beryl Korot.

Reed Phase (1966)

Reed Phase, also called Three Reeds, is an early work by the American minimalist composer Steve Reich. It was written originally in 1966 for soprano saxophone and two soprano saxophones recorded on magnetic tape, titled at that time Saxophone Phase, and was later published in two versions: one for any reed instrument and tape (titled Reed Phase), the other for three reed instruments of exactly the same kind (in which case the title is Three Reeds). It was Reich's first attempt at applying his "phasing" technique, which he had previously used in the tape pieces It's Gonna Rain (1965) and Come Out (1966), to live performance.
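The phasing technique can be pictured schematically: two players repeat the same short loop, and one of them slips gradually ahead until the patterns realign. The sketch below uses an invented eight-step rhythm, not Reich's actual saxophone pattern.

```python
# Schematic "phasing": voice 1 repeats a fixed loop while voice 2
# drifts one step further ahead on each repetition, so the two copies
# move in and out of alignment.

def phase_cycle(pattern, shift):
    """Return `pattern` rotated left by `shift` steps."""
    s = shift % len(pattern)
    return pattern[s:] + pattern[:s]

pattern = list("x.xx.x..")  # hypothetical rhythm: 'x' = attack, '.' = rest

for cycle in range(3):
    print("".join(pattern), "".join(phase_cycle(pattern, cycle)))
```

In live phasing the shift is continuous (one player speeds up slightly); the step-wise rotation here only captures the successive points of alignment.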


Jean-Claude Risset   

A pioneer of the computer music adventure in France, J.-C. Risset contributed to introducing the computer into French music (in institutions such as IRCAM and the universities of Orsay and Marseille-Luminy). Thanks to his double scientific and artistic training, he was the first French composer to pave the way for computer-produced synthetic sounds. Today he is a major figure in contemporary musical creation and, at the same time, in electronic music research. His contributions have made a striking mark on the aesthetics of the 1970s-1990s.

Voilements (1987)

A fabric in the wind, or a wheel that no longer turns. Strange transformations occur: the range of pitch gets out of control; melodic volutes close in on themselves like a broken record. Then the perspective changes: we move from the telephoto lens to the wide angle. The tape becomes more multiple and distant. Synthesized sounds foreign to the saxophone's universe appear. And, until the end, a more peaceful and distant relation is established between the tape and the various techniques of the soloist. The tape was created in Marseille (Faculty of Sciences of Luminy and the Laboratoire de Mécanique et d’Acoustique of the CNRS). The sounds produced by Daniel Kientzy were transformed using the SYTER digital audio processor, developed at INA-GRM by Jean-François Allouis and industrialized by Digilog. The composer wishes to thank Pierre Dutilleux for his work on the SYTER. The synthetic sounds were produced on an IBM-PC-compatible computer with the version of the MUSIC V program extended by Daniel Arfib; they were specified through instrumental gestures on a MIDI keyboard, then transcribed into MUSIC V code. This transcoding, realized by Frédéric Boyer, makes it possible to combine the resources of synthesis with real-time control.


Kaija Saariaho   

Kaija Saariaho is a prominent member of a group of Finnish composers and performers who are now, in mid-career, making a worldwide impact. She studied composition in Helsinki, Freiburg and Paris, where she has lived since 1982. Her studies and research at IRCAM have had a major influence on her music, whose typically mysterious textures are often created by combining live instruments and electronics. Although her catalogue contains a great number of chamber works, since the mid-1990s she has progressively turned to larger forces and larger-scale structures, such as the operas L'Amour de loin and Adriana Mater, and the oratorio La Passion de Simone.

Noa-Noa (1992)

NoaNoa is a piece for flute and Ircam's Station d'informatique musicale (SIM, based on the NeXT computer). Different types of flute sounds are sampled and stored in the memory of the SIM, then triggered and electronically transformed at given moments in the score, either by score following (tracking the pitch and envelope of the flute) or by a pedal controlled by the performer. In a general manner, the electronic part develops the musical ideas of the writing for the solo instrument. The title refers to a woodcut by Paul Gauguin, NoaNoa. It also refers to a travel diary of the same name, written by Gauguin during his visit to Tahiti in 1891-1893. The fragments of phrases selected for the voice part of the piece come from this book. NoaNoa is also a team effort: many details of the flute part were worked out with the help of Camilla Hoitenga, to whom the piece is dedicated, and the electronic part was developed by Xavier Chabot in the SIM environment, under the supervision of Jean-Baptiste Barrière.
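One way to picture the envelope-based triggering described above is an amplitude follower that fires a cue when the live signal crosses a threshold. The following is a hypothetical sketch, not the SIM patch: the function name, constants and re-arming logic are invented for the example.

```python
def envelope_trigger(samples, threshold=0.5, smoothing=0.9):
    """One-pole envelope follower: return the indices at which the
    smoothed level first rises above `threshold`. A new cue is only
    armed once the envelope has fallen below half the threshold."""
    env, armed, cues = 0.0, True, []
    for i, x in enumerate(samples):
        env = smoothing * env + (1.0 - smoothing) * abs(x)
        if armed and env > threshold:
            cues.append(i)
            armed = False
        elif not armed and env < threshold / 2:
            armed = True
    return cues

# A sustained attack fires exactly one cue, a few samples in.
print(envelope_trigger([1.0] * 20))  # [6]
```

Score following adds pitch tracking on top of such an envelope follower, so that cues can be tied to specific notes in the part rather than to dynamics alone.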



Z'EV

Stefan Joel Weisser, born on 8 February 1951, started to use Z'EV as a "trade mark" name in 1978. Z'EV is an American poet, percussionist and sound artist. After studying various world music traditions at CalArts, and influenced by the Dada, Futurist and Fluxus movements, he began producing visual and sound poetry through exhibitions, then creating his own percussion sounds out of industrial materials. He is regarded as a pioneer of industrial music. His work with text and sound was influenced by the Kabbalah, as well as by African, Afro-Caribbean and Indonesian music and culture.

Electronique élémentaire (2014)

Z‘EV: "My use of the term 'elemental' in the title has a double meaning. On the one hand the term refers to the intention of the rhythmic structures which I'll play. These rhythms are analogues of a variety of energies/rhythms occurring in Nature. The Natural impulses I will be responding to for the performance in Bourges will be geomantically and meteorologically based. And note that my use of the term 'responding' is meant to imply that I am engaging in a dialogue with these energies. On the other hand the use of elemental refers to my utilizing a combination of available technologies and verging-on-vintage digital hardware. The available technology in question refers to the trigger pads of the Guitar Hero World Tour drum set, which were developed in the gaming world for the PlayStation 2, PlayStation 3, Xbox 360, and Nintendo Wii. The verging-on-vintage hardware is the Yamaha DTXpress, which was first marketed in 1999. The unit combines 16-bit AWM2 (Advanced Wave Memory 2) sample-based synthesis with the classic Yamaha patented Frequency Modulation Synthesis (or FM synthesis). FM synthesis can create both harmonic/pitch-specific and non-harmonic/percussive sounds. When synthesizing harmonic sounds, the modulating signal has a harmonic [integer-based multiples] relationship to the carrier signal. When modulators apply non-harmonic frequencies [non-integer-based multiples] to the carrier signal, both the atonal and tonal qualities of the created sounds combine in more complex waveforms that produce the unique-sounding percussion tones that are commonly described as 'fat', 'gritty' and 'thick and dark'."
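The harmonic/inharmonic distinction Z'EV describes can be shown with a minimal two-operator FM sketch. All parameter values below are illustrative and have nothing to do with the DTXpress's actual voicing: an integer carrier-to-modulator ratio yields harmonically related sidebands, a non-integer ratio yields clangorous, percussive spectra.

```python
import math

def fm_tone(carrier_hz, ratio, index, sr=8000, dur=0.5):
    """Two-operator FM: the modulator runs at carrier_hz * ratio and
    `index` is the modulation depth (modulation index)."""
    n = int(sr * dur)
    mod_hz = carrier_hz * ratio
    return [
        math.sin(2 * math.pi * carrier_hz * k / sr
                 + index * math.sin(2 * math.pi * mod_hz * k / sr))
        for k in range(n)
    ]

harmonic = fm_tone(220.0, 2.0, index=3.0)      # integer ratio: pitched
inharmonic = fm_tone(220.0, 1.414, index=3.0)  # non-integer: percussive
```

Raising the modulation index spreads energy into more sidebands, which is one way such "fat" and "gritty" percussion tones get their density.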