Hi HN, I built Contrapunk because I wanted to play guitar and hear
counterpoint harmonies generated in real time. It takes audio from your guitar, a MIDI device, or your computer keyboard and generates harmony voices that follow counterpoint rules. You can choose the key you'd like to improvise in, the voice-leading style, and which part of the harmony you'd like to play.
Would love feedback on the DSP approach and the harmony algorithms. I am also looking at training an ML model for better real-time guitar-to-MIDI detection, though I expect that will take some time.
Thank you! The idea is not completely mine; I have to thank Abhinav Arora, who initially had this idea during the ADCx music hackathon. Kudos to him! Also love the phrase Gradus ad Parnassum! Maybe this should be the motto of Contrapunk :)
How are you finding rust for audio development? I have a background in pro audio, and both for the audio and GPU render threads, I used a lot of arena allocators and ring buffers. Do you find yourself fighting with rust's strict semantics?
I've got a few thoughts for features, if you're open to them:
1. Ability to specify where your "played" voice resides in the voicing: As the bass note, as an inner voice, or as the top line.
2. Options for first species, second species, third, florid, etc counterpoint for each of the generated voices. Ex: You play a single note and the upper voice plays two notes for every one of yours, etc, etc.
3. If you want to get real fancy, make the generated voices perform a canon of your played notes.
Have you been able to try it? Would love to hear what you think! Coming back to the features: regarding 1, you can already choose between soprano, alto, tenor, or bass, but I have filed an issue anyway so I remember to vet this feature. Sometimes it's not as strict as it should be, which is also something I need to work on. Regarding 2, it's a good idea, since it helps you control the kind of counterpoint you are doing; I've filed an issue for that too. Please feel free to comment on the issue. 3 feels a little goofy, and I love it. I have filed an issue for this as well; check https://github.com/contrapunk-audio/contrapunk/issues/
"Realtime" as in "while playing guitar" has some pretty challenging latency requirements. Even if your solution is optimal, hardware specs will play a meaningful role. I'd be really interested if you've solved for this e2e.
Yes, latency was the main problem to solve here, which is why I opted for Rust. The pipeline is:
- 128-sample cpal audio buffers (~2.7ms at 48kHz)
- Single-cycle pitch detection
- 2-frame McLeod pitch voting for confirmation
- Entire DSP pipeline is Rust, pre-allocated ring buffers with minimal heap pressure
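The voting step above might look something like this minimal Rust sketch. It confirms a pitch only when two consecutive frame estimates agree within a tolerance; the function names, tolerance value, and MIDI conversion here are my assumptions for illustration, not Contrapunk's actual code:

```rust
// Hypothetical sketch of 2-frame pitch voting: a detected pitch is only
// confirmed (and turned into a MIDI note) once two consecutive frames
// agree within a small tolerance in cents.

/// Returns Some(midi_note) only when `curr_hz` agrees with the previous
/// frame's estimate within `tolerance_cents`; otherwise the frame is
/// treated as jitter and rejected.
fn confirm_pitch(prev_hz: Option<f32>, curr_hz: f32, tolerance_cents: f32) -> Option<u8> {
    let prev = prev_hz?;
    // Interval between the two frame estimates, in cents.
    let cents = 1200.0 * (curr_hz / prev).log2();
    if cents.abs() <= tolerance_cents {
        // Convert frequency to the nearest MIDI note number (A4 = 440 Hz = 69).
        let midi = 69.0 + 12.0 * (curr_hz / 440.0).log2();
        Some(midi.round() as u8)
    } else {
        None
    }
}

fn main() {
    // Two frames agreeing on ~440 Hz confirm MIDI note 69 (A4).
    assert_eq!(confirm_pitch(Some(440.0), 441.0, 20.0), Some(69));
    // A semitone jump between frames is rejected as jitter.
    assert_eq!(confirm_pitch(Some(440.0), 466.16, 20.0), None);
    println!("pitch voting sketch ok");
}
```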
The e2e from pluck to MIDI note-on is under 10ms on an M-series Mac. Hardware matters for sure, so an audio interface with low-latency drivers (I use an Audient iD14) helped a lot. The web version (app.contrapunk.com) adds AudioWorklet latency on top, so the native Mac app is noticeably tighter. I am still working on reducing noise and pitch jitter in the final output. This also works really well for higher notes, but not so much for bass right now; I still need to figure out how to handle harmonics better. I have created an issue for this for now; let me know if you would like to add anything else to it: https://github.com/contrapunk-audio/contrapunk/issues/6
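The buffer arithmetic behind those numbers works out roughly as follows. This is a back-of-envelope sketch, and the 10ms budget split is an assumption for illustration, not a measurement from Contrapunk:

```rust
// Back-of-envelope latency arithmetic: one 128-sample buffer at 48 kHz,
// doubled for 2-frame voting, leaves the rest of a 10 ms budget for
// detection, interface/driver latency, and MIDI dispatch.

/// Latency contributed by one audio buffer, in milliseconds.
fn buffer_latency_ms(buffer_samples: u32, sample_rate_hz: u32) -> f64 {
    buffer_samples as f64 / sample_rate_hz as f64 * 1000.0
}

fn main() {
    let buf_ms = buffer_latency_ms(128, 48_000); // ~2.67 ms per cpal buffer
    // 2-frame pitch voting needs at least two buffers before confirming a note.
    let voting_ms = 2.0 * buf_ms;
    // Whatever remains of a 10 ms pluck-to-note-on budget is the headroom
    // for everything else in the chain.
    let headroom_ms = 10.0 - voting_ms;
    println!("buffer {buf_ms:.2} ms, voting {voting_ms:.2} ms, headroom {headroom_ms:.2} ms");
}
```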
There was a 1970s Indonesian progressive rock band called Contrapunk that released an album called "Putri Mohon Diri" [1]
You can find the recording on YT [2]
They were really unique - blending traditional Indonesian instruments, intense guitar work and classical influences.
Ah, a banger from the get-go. Guess this is what I will be listening to for the rest of the day. I will add this as a fun fact to the website as well. Thank you for sharing :)
I've been thinking of and briefly working on a similar project.
One idea is to analyze timing as well, and "trigger" things after certain sequences (so play 1-3-5 as say eighth notes and then get an in-rhythm arpeggio one octave higher) or detect the beat and play on the upbeat.
I haven't done any Rust, but this might give me a good reason to give a try.
Have you considered making it a plugin? (makes replay easier in my opinion, but brings other pain like relaunching the DAW between builds...)
I had added a metronome and a note generator earlier, but they aren't working well right now. This would definitely increase playability, though. I agree there should be a VST plugin version of this as well; it can live as both a standalone app and a plugin. Could you elaborate on the timing analysis? If you were playing with this, what would you like to hear?
The quickest workaround for this would be running `xattr -cr /Applications/Contrapunk.app`. Are you able to use the web version at app.contrapunk.com, though?
Ah, perfect! Let me know what you think! You can also ping me on Twitter if needed: https://x.com/BobadeVibhav/ Not really a Twitter user, never have been, but it would obviously be an easier way to connect.
What a cool idea. I don't have a music setup capable of running this right now - perhaps in a couple of months - but if you were to post some sample recordings, I'd gladly listen to them.
How do you generate velocity values for the accompaniment notes?
Given that you already have a pitch tracker, it could be interesting to add key detection; just start playing, instead of telling the machine what key you're in, and it starts following along as soon as it catches on.
Thank you! The best part about this project is that you don't actually need an elaborate setup :) All you need is a DAW like GarageBand or Logic and a few IAC buses configured on your Mac. I really like the idea of key detection; it is something I have already thought about as well: based on the song, it should pick up the key, which you can then jam to with Contrapunk, and I think it can act as an educational moment for the player too. I have created a GitHub issue for this: https://github.com/contrapunk-audio/contrapunk/issues/4 For velocity, I currently inherit the onset strength of the input signal for the accompanying notes: the guitar input measures RMS energy in the first ~5ms of each pluck (the attack transient) and maps that to MIDI velocity. I will post more sample recordings on the website soon! Were you able to check the one that was already posted?
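That velocity mapping could be sketched along these lines in Rust. The window size and the linear RMS-to-velocity scaling are assumptions for illustration, not the constants Contrapunk actually uses:

```rust
// Illustrative sketch: RMS energy of the first ~5 ms of a pluck
// (240 samples at 48 kHz) mapped linearly to a MIDI velocity.

/// RMS of the attack window (first `n` samples of the onset).
fn attack_rms(samples: &[f32], n: usize) -> f32 {
    let window = &samples[..n.min(samples.len())];
    let sum_sq: f32 = window.iter().map(|s| s * s).sum();
    (sum_sq / window.len() as f32).sqrt()
}

/// Map RMS (0.0..=1.0 full scale) to MIDI velocity 1..=127.
fn rms_to_velocity(rms: f32) -> u8 {
    (rms * 127.0).round().clamp(1.0, 127.0) as u8
}

fn main() {
    // A loud pluck: constant 0.5 amplitude across the 5 ms attack window.
    let loud = vec![0.5f32; 240];
    assert_eq!(rms_to_velocity(attack_rms(&loud, 240)), 64);
    // A quiet pluck maps to a correspondingly low velocity.
    let quiet = vec![0.05f32; 240];
    assert_eq!(rms_to_velocity(attack_rms(&quiet, 240)), 6);
    println!("velocity sketch ok");
}
```

A perceptual curve (e.g. mapping log energy instead of linear RMS) might track playing dynamics better, but the linear version keeps the sketch simple.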
Thanks for taking a look! Let me know if you run into any issues; it's still at a nascent stage and has a lot of room to grow, especially in the guitar-to-MIDI detection. If you don't have a MIDI controller, you can use your keyboard as well!
macOS DMG: https://github.com/contrapunk-audio/contrapunk/releases/tag/...
Source: https://github.com/contrapunk-audio/contrapunk (please open issues if you run into any)
[1] https://www.discogs.com/release/17424685-Contrapunk-Putri-Mo...
[2] https://www.youtube.com/watch?v=jb1792ZuXcY
You won't be alone :)
Such a cool project (and name)! Thanks for making it open source!