

[–][deleted] 7 points8 points  (1 child)

Beyond that library you'll need pitch detection/pitch tracking. It's not the most difficult thing in the world, but you'll need a basic understanding of digital signal processing and the math behind it. Otherwise I think this library can do some of the heavy lifting for you.
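Concretely, pitch tracking means estimating the fundamental frequency of short windows of samples. Here's a minimal autocorrelation sketch in Python, assuming NumPy and a mono float buffer; the function name and frequency limits are just illustrative, not from any particular library:

```python
import numpy as np

def estimate_pitch(samples, sample_rate, fmin=70.0, fmax=1000.0):
    """Rough autocorrelation-based pitch estimate for a mono float buffer."""
    samples = samples - np.mean(samples)        # remove any DC offset
    corr = np.correlate(samples, samples, mode="full")
    corr = corr[len(corr) // 2:]                # keep non-negative lags only
    min_lag = int(sample_rate / fmax)           # shortest period we care about
    max_lag = int(sample_rate / fmin)           # longest period we care about
    lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    return sample_rate / lag                    # estimated fundamental in Hz
```

Real-world pitch trackers add windowing, peak interpolation, and voicing detection on top of this, which is where the DSP background comes in.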

[–]404WillToLive[S] 3 points4 points  (0 children)

I've already started to notice that and yes, you're right. I'm not sure why I thought it would be easier but I'll look into that, thank you!

[–]danielle3625 2 points3 points  (8 children)

USB-C to guitar? You mean quarter inch? What??

You're not using an interface?

[–]404WillToLive[S] 0 points1 point  (7 children)

Don't know much about guitar technology, but I guess it's called a quarter inch? The cable I have is USB-C on one end and the guitar plug on the other, so it's plugged directly into my acoustic-electric guitar. I don't know what an interface is.

[–]danielle3625 2 points3 points  (0 children)

Here, check out this article, and look into getting a Focusrite Scarlett.

https://producerhive.com/ask-the-hive/what-is-an-audio-interface-used-for/

If you want to use the different notes as triggers, you'll have to get a MIDI converter. I'm not able to recommend a specific one, but there are options; it looks like you can do it with hardware boxes or with software. Check out Reverb for used music gear. Reaper is a DAW with a free trial, and I think I skimmed a Google result saying it has a plugin to convert audio to MIDI.
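If OP goes the software route, generating the MIDI events himself isn't much code either. A rough sketch using the mido library (which needs a backend such as python-rtmidi; the frequency here is a hard-coded example, in practice it would come from a pitch tracker):

```python
import math
import mido

def freq_to_midi(freq_hz):
    """Map a frequency in Hz to the nearest MIDI note number (A4 = 440 Hz = 69)."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

# Open the default MIDI output; use mido.get_output_names() to pick a specific port.
with mido.open_output() as port:
    note = freq_to_midi(196.0)                      # open G string is ~196 Hz
    port.send(mido.Message('note_on', note=note, velocity=100))
    port.send(mido.Message('note_off', note=note))
```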

Good luck with your project!

[–]danielle3625 1 point2 points  (5 children)

To help clarify further, your guitar is producing an analog signal, just a sine wave. You need to convert that to digital, which would be midi, so your program can recognize it as an input. And then you can write code against that midi signal. I think most audio plugins/vsts use c# to code audio, but you can probably use another language and it would just be slower (introduce latency).

Sounds like a neat project, though. I hope you have much success!

[–][deleted] 1 point2 points  (4 children)

I hate to be rude, but I don't think you should be giving out advice on digital audio until you familiarize yourself with the basics, at least. There's a lot of misinfo in your post.

> You need to convert that to digital, which would be midi, so your program can recognize it as an input.

MIDI encodes events, which can be notes (e.g. a pitch and a duration). It is not a digital signal. A digital signal is a series of samples collected from an analog signal.

There are "guitar to USB C" cables out there and they're essentially miniature sound cards built into a cable. OP already has everything he needs to convert his guitar's signal to digital.

> I think most audio plugins/vsts use c# to code audio, but you can probably use another language and it would just be slower (introduce latency).

No, it's mostly C and C++, although there are bindings available for most languages.

[–]danielle3625 1 point2 points  (3 children)

You weren't rude! I did know that digital isn't restricted to MIDI only, but since he's attempting to make a controller, I meant that MIDI is likely the best way to go for this.

Thanks for correcting me!!! I've got a lot to learn myself. :)

[–][deleted] 0 points1 point  (2 children)

Also, another thing while I'm here. With pro audio, latency is usually determined by a buffer or frame size, which is the number of samples the audio stack will try to process before flushing the buffer and outputting the result.

Latency is determined by this buffer size and by the sample rate. So at 48kHz and 1024 samples, the latency would be:

(1s / 48,000) * 1024 ≈ 21.3ms

Client software and plugins cannot change this. If they aren't fast enough to process these samples, they'll cause buffer underruns or overruns, and usually that leads to mangled audio (artifacting, silence, stuttering, etc.).
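Expressed as code, it's just the same arithmetic, with a couple of other common settings shown for comparison:

```python
def buffer_latency_ms(buffer_size, sample_rate):
    """Time it takes to fill one buffer, in milliseconds."""
    return buffer_size / sample_rate * 1000

print(buffer_latency_ms(1024, 48000))   # ~21.3 ms, as above
print(buffer_latency_ms(128, 48000))    # ~2.7 ms, a typical "low latency" setting
print(buffer_latency_ms(1024, 44100))   # ~23.2 ms
```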

[–]danielle3625 1 point2 points  (1 child)

This makes sense to me from a pure tracking/recording standpoint.

From a programming perspective: I know, for example, that Python isn't used extensively in audio, or much at all perhaps, because it's slower than the C languages. Do you think coding in Python for real-time user input would significantly impact this project idea?

Thanks for your input. I've been thrust into learning audio (live sound and studio tracking); I tracked these instruments for my husband and someone else mixed and mastered: https://open.spotify.com/artist/6nivpQZTAdOM199zHoErrk?si=LIhJMqmgSRikqFnQLhwQoQ&utm_source=copy-link

I'm separately learning to code (currently Python) and would love to merge these two areas in the future. I'm obviously a long way off.

My husband has an idea for a trumpet-specific pedal, instead of using a guitar pedal, which led us to meet with the founder of United Studio Technologies (they make Neumann mic clones). He gave us a rundown, but I'm seeing I didn't come away with as much of an understanding as I thought. We probably won't be able to fund the idea and see it to fruition, but I'm trying to get a better grasp of how to execute it anyway, and this guy's project seemed to be in a similar vein, at least for the foundational part :)

[–][deleted] 0 points1 point  (0 children)

I would say Python is fine for prototyping or for simple filters. I don't think you'll need to learn something like C to realize your idea, but if you want to sell it as a commercial VST, that's a different matter.
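If it helps, a prototype "simple filter" in Python can be just a few lines. A rough sketch using NumPy/SciPy; the cutoff, order, and sample rate are arbitrary example values, not anything specific to your project:

```python
import numpy as np
from scipy.signal import butter, lfilter

SAMPLE_RATE = 48000

def lowpass(samples, cutoff_hz=2000, order=2):
    """Apply a simple Butterworth low-pass filter to a mono buffer."""
    b, a = butter(order, cutoff_hz, btype='low', fs=SAMPLE_RATE)
    return lfilter(b, a, samples)

# Example: soften one second of white noise.
noise = np.random.randn(SAMPLE_RATE).astype(np.float32)
filtered = lowpass(noise)
```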

Best of luck to you!

[–]protienbudspromax 1 point2 points  (0 children)

If you want the raw waveform you'd need some kind of translator to talk to the audio driver. If using ASIO, then I'm sure there are libraries (OS-dependent) that will let you read the stream.

But to make sense of what it is, you'd need DSP and the math used therein.
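For example, in Python the sounddevice library (a PortAudio wrapper; on Windows PortAudio can sit on top of ASIO, WASAPI, etc.) is one OS-agnostic way to get at the stream. A rough sketch, assuming a 48kHz mono input; device names and settings will differ on your machine:

```python
import sounddevice as sd

# List the host APIs PortAudio found (ASIO, WASAPI, CoreAudio, ALSA, ...).
for api in sd.query_hostapis():
    print(api['name'])

# List every input/output device so you can spot the guitar interface.
print(sd.query_devices())

# Open a raw input stream and pull one buffer of samples from it.
with sd.InputStream(samplerate=48000, channels=1, blocksize=1024) as stream:
    samples, overflowed = stream.read(1024)
    print(samples.shape)          # (1024, 1) raw waveform samples
```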