
WASABI

Web Audio Semantic Aggregated in the Browser for Indexation

The goal of the WASABI project is to define an optimized methodology for indexing music on the Web at the scale of large databases, by linking metadata from audio analysis, Semantic Web techniques, and the analysis of textual data such as song lyrics using natural language processing, and to put this methodology to the test in case scenarios by developing unique services and applications built on Web Audio technologies.

The project combines algorithms for extracting musical information with Semantic Web techniques to produce more consistent musical knowledge bases for streaming services and music databases. Services built on Semantic Web data, such as Last.fm, MusicBrainz, or DBpedia, rely on the extraction of structured data, connecting works to metadata such as the producer, the name of the recording studio, the composer, the release year, or the subjects addressed in the lyrics. Free-text data such as the lyrics are also analyzed to determine the musical context of a piece. Web Audio technologies then make it possible to explore these musical spaces, enriched with high-level musical indexing: detecting emotion and plagiarism, detecting and characterizing the singing voice, detecting the structure, and separating the different sources.
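As an illustration of the linking step described above, the sketch below builds a SPARQL query that could connect a work to DBpedia metadata such as its producer and release date. This is a minimal, hypothetical example: the property names (`dbo:producer`, `dbo:releaseDate`) and the query shape are assumptions for illustration, not the project's actual schema or code.

```typescript
// Sketch: build a SPARQL query linking an album title to DBpedia metadata.
// Property names below are illustrative assumptions, not WASABI's schema.
function buildAlbumMetadataQuery(albumTitle: string): string {
  // Escape double quotes so the string literal stays well-formed.
  const title = albumTitle.replace(/"/g, '\\"');
  return `
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?album ?producer ?released WHERE {
  ?album a dbo:Album ;
         rdfs:label "${title}"@en .
  OPTIONAL { ?album dbo:producer ?producer . }
  OPTIONAL { ?album dbo:releaseDate ?released . }
} LIMIT 10`.trim();
}

// The query would then be sent to a public SPARQL endpoint, e.g.:
//   fetch("https://dbpedia.org/sparql?query=" +
//         encodeURIComponent(buildAlbumMetadataQuery("Abbey Road")) +
//         "&format=json")
console.log(buildAlbumMetadataQuery("Abbey Road"));
```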

Open-source software building blocks and online “open data” services will be made available at the end of the project for:

  • Visualizing audio metadata and listening to un-mixed (multitrack) recordings in a browser, using the latest Web Audio API technologies (real-time mixing, audio effects)
  • Automatic processing of lyrics: named-entity recognition and merging, collaborative annotation and correction
  • Access to a Web service with an API offering an environment for studying musical similarity based on audio and semantic analyses

These software building blocks will be used to develop demonstrators formalized with our partners and collaborators (journalists and composers), relying on the new Web Audio API standard, which makes it possible to build musical applications accessible to the general public through a Web browser.
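To make the real-time mixing idea concrete, here is a minimal sketch of an equal-power crossfade between two stems of an un-mixed track. The pure function is illustrative, not project code; in a browser its two gains would drive Web Audio `GainNode` parameters, and the variable names in the usage comment (`gainNodeA`, `slider`, etc.) are hypothetical.

```typescript
// Sketch: equal-power crossfade between two stems (A and B).
// position: 0 => only stem A is heard, 1 => only stem B.
function crossfadeGains(position: number): { a: number; b: number } {
  const p = Math.min(1, Math.max(0, position)); // clamp to [0, 1]
  // a^2 + b^2 === 1 for every position, so perceived loudness stays
  // roughly constant while mixing (the "equal power" property).
  return {
    a: Math.cos((p * Math.PI) / 2), // stem A fades out
    b: Math.sin((p * Math.PI) / 2), // stem B fades in
  };
}

// Hypothetical browser usage, assuming two GainNodes in the audio graph:
//   const { a, b } = crossfadeGains(slider.value);
//   gainNodeA.gain.setValueAtTime(a, audioCtx.currentTime);
//   gainNodeB.gain.setValueAtTime(b, audioCtx.currentTime);
```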

Project reference: ANR-16-CE23-0017-02.

Under the supervision of:

Ircam, Sorbonne University, CNRS, Ministry of Culture
