Musical Representations

The Musical Representations team explores the formal and dynamic structures of music and mobilizes them in creative environments for musical composition and interaction.

The Musical Representations team (RepMus) focuses on the structures of music (or “musical intelligence”) as they can be approached through computer science in order to analyze, formalize, represent, model, generate, and manipulate them, with the aim of providing broad support for musical creativity in contexts of composition, performance, improvisation, or analysis.


RepMus investigates representations at various scales, from symbolic structures to the signal level, ranging from epistemological and mathematical aspects to computational modeling and the development of technological tools widely used by musicians. These methods and tools apply to both written music and oral traditions, and to both classical and popular repertoires.


Research on high-level representations of musical concepts and structures—supported by the original programming languages and environments developed by the team—leads to the implementation of models that can be applied both to creative processes and to musical analysis. The exploration of interactive and/or synchronous time, in conjunction with new media, opens up opportunities for the development of real-time, interactive, open, collective, improvisational, and distributed works.


The tools and methods employed serve musical formalization and support composition, analysis, performance, and improvisation; they draw in particular on formal computation, domain-specific programming languages, new notations, musical mathematics, creative AI and machine learning, computational (co-)creativity, and autonomous agents.


RepMus research is structured around three thematic areas:

  • Theme (F)O(R)M: formal spaces, computer-assisted composition
  • Theme REACH: human and artificial co-creativity, interaction
  • Theme ECTIS: writing and control of interactive and synchronous time

The team has a long history of intensive collaboration with artists at the national and international levels. The three volumes of the book OM Composer’s Book document part of this work and ensure its international dissemination and long-term preservation.

Team Members

Gerard Assayag

Team Leader

Mikhail Malt

Researcher

Karim Haddad

Researcher

Philippe Esling

Visiting Researcher

Moreno Andreatta

Visiting Researcher

Orian Sharoni

PhD Student

Romain Buguet

PhD Student

Sébastien Li

PhD Student

Marco Fiorini

PhD Student

Heni Zouari

PhD Student

Wander Vieira Rodrigues

Visiting PhD Student

Vasiliki Zachari

Administrative Staff

Theme (F)O(R)M: formal spaces, computer-assisted composition

  • Computer-assisted composition, OpenMusic software
  • Computer-assisted orchestration, outcomes of the international ACTOR project, Orchid* software family
  • Mathematics and music, models and formal computation
  • Musical analysis and computational musicology

Theme ECTIS: writing and control of interactive and synchronous time

  • Writing of time, artificial listening, control of synthesis and spatialization, Antescofo software, open-source system Tzim Tzoum
  • Polytemporal scores, polytemporal autonomous agents
  • Collective synchronization in hybrid systems (human–machine) and machine–machine systems

Theme REACH: human and artificial co-creativity, interaction

  • Dynamics of improvised interaction, creative autonomous agents, real-time interaction: OMax, Somax2, Dicy2, Somax2Collider, Somax4Live, Prosax, Djazz
  • REACH project and its follow-ups: learning interaction, cyber-human systems, models of co-creativity, artificial listening and cognition
  • Generalized integration of writing and improvisation in hybrid systems (human–machine) and machine–machine systems

Computer-assisted composition, analysis, performance, improvisation, and orchestration; computational musicology; creative artificial intelligence; domain-specific programming languages; musical mathematics; formal computation; synchronous real-time languages; listening machines; executable notations; interaction architectures; multi-agent autonomous systems; human–machine co-creativity.

Related Projects

See all projects

(DYCI2) Creative Agents, Improvised Interactions, and “Meta-Composition”

Creative dynamics of improvised interaction

DAFNE+

Decentralized platform for fair creative content distribution empowering creators and communities through new digital distribution models based on digital tokens

Dates : July 2022 to June 2025

ACTOR

Analysis, Creation, and Teaching of Orchestration

Dates : April 2018 to December 2026

Related Software

Software collection

SOL - Instrumental sound datasets

TinySOL, OrchideaSOL, and FullSOL are three datasets of instrumental samples, including a wide range of extended techniques.

Software included in premium subscription

Sound design and processing

Instrument modelling

Software

SOMAX2

Somax2 is an AI-based tool for musical improvisation and composition, handling MIDI/audio, memory, agent control, and offering full Max integration with manual and generative interaction.

Free software

Improvisation, generativity and co-creative interactions

Software

Orchidea

Assisted orchestration searches for the orchestral combination that best matches a target sound. Orchidea performs both static and dynamic orchestration through single-objective optimization, and requires only a few parameters.

Free software

Instrument modelling

Under the supervision of:

Ircam · Sorbonne University · CNRS · Ministry of Culture