Neon is a browser-based music notation editor written in JavaScript, designed for working with square notation. It uses Verovio to dynamically render symbolic music files in MEI format, updating the file in real time through a graphical interface. Neon is designed as part of an optical music recognition (OMR) workflow: it can be used to create new musical scores, produce ground-truth data for machine learning, or quickly correct pitch and position errors introduced during automated transcription. Every component of our OMR process is designed as an accessible online application, so that correction tasks can be crowdsourced to our partner organizations and community members.


You can try out Neon on our demo page. Begin by selecting a link to a musical document that has undergone OMR, then insert, delete, or pitch-shift notes on the page.

Source Code

Source code is available on the Neon GitHub repository.


Installation instructions are available on the Neon Wiki page.

Contributing Code

Any contributions are welcome! The easiest way to submit code is:

  1. Create a fork of the Neon GitHub repository.
  2. Read through the documentation and familiarize yourself with the code. Look at some of the outstanding issues and feature requests if you need inspiration.
  3. Make your changes in your local repository.
  4. When you’re ready, send us a pull request. We’ll review your code and then merge it in.
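The fork-and-pull-request steps above can be sketched on the command line. This is a minimal sketch, not an official guide: `<your-username>` is a placeholder for your own GitHub account, and the upstream path and branch name shown here are assumptions.

```sh
# 1. Fork the Neon repository on GitHub, then clone your fork.
#    (<your-username> is a placeholder for your GitHub account.)
git clone https://github.com/<your-username>/Neon.git
cd Neon

# Optionally track the upstream repository so you can stay up to date
# (the DDMAL/Neon path is an assumption — use the actual upstream URL).
git remote add upstream https://github.com/DDMAL/Neon.git

# 2–3. Work on a feature branch in your local repository
#      ("my-feature" is an example branch name).
git checkout -b my-feature
# ...edit code, then commit and push to your fork...
git add .
git commit -m "Describe your change"
git push origin my-feature

# 4. Open a pull request from my-feature on GitHub.
```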


If you have any comments, please let us know. If you would like to see a particular feature implemented, open a new issue on the Neon GitHub repository.


Neon is developed by:

  • Gregory Burlet
  • Zoé McLennan
  • Alastair Porter
  • Juliette Regimbal
  • Andrew Tran

Project managers:


Neon is an ongoing project at the Distributed Digital Music Archives and Libraries Lab (DDMAL), at the Schulich School of Music of McGill University. Neon is part of the larger Single Interface for Music Score Searching and Analysis (SIMSSA) project that is generously funded by the Social Sciences and Humanities Research Council of Canada. We’re also grateful for the support provided by the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT).