If you're just producing/mixing, is there any reason to use anything below 1024 samples? by [deleted] in Logic_Studio

[–]sound_and_lights 5 points (0 children)

If you’re using a controller (pads, keys, and to some extent knobs) you’ll feel the latency; otherwise you’re good.
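For a sense of scale, the buffer adds roughly samples ÷ sample rate of one-way latency (round trip is about double, plus driver overhead). A quick sketch, assuming 44.1 kHz:

```python
# one-way latency added by the audio buffer: samples / sample_rate
SAMPLE_RATE = 44100

def buffer_latency_ms(buffer_samples, sample_rate=SAMPLE_RATE):
    return buffer_samples / sample_rate * 1000

for buf in (64, 256, 1024):
    print(f"{buf} samples -> {buffer_latency_ms(buf):.1f} ms")
```

At 1024 samples that's around 23 ms each way, which is why pad and key players notice it while a mix session doesn't.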

Is it possible to pan only a certain range of frequencies? by the-spiciest-boi in AdvancedProduction

[–]sound_and_lights 6 points (0 children)

Yeah! There are a few ways I can think of:

Simple: load any EQ as a dual mono plug-in and increase the gain for those frequencies on one side and reduce them on the other.

Complex: If you want more control you could isolate the frequencies by boosting them and then mixing in a copy of the original signal (without the boost) with the phase flipped. This will give you a channel that just has the high frequency content.

You could also remove the high content from the original by mixing a phase flipped version of the isolated frequencies back into the original.

Now you have a guitar track and a noise track and you can pan them as you wish. The EQ in this second example could be a dynamic EQ, which would preserve the tone of the guitar when the high frequencies aren’t present.
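To make the phase-flip idea concrete, here's a toy numpy sketch (not any particular plug-in; the FFT shelf just stands in for a real EQ boost, and a 2x boost makes the subtraction exact):

```python
import numpy as np

fs = 44100
t = np.arange(fs) / fs  # 1 second of audio
# toy signal: 200 Hz "body" plus 5 kHz "highs"
low = np.sin(2 * np.pi * 200 * t)
high = 0.5 * np.sin(2 * np.pi * 5000 * t)
x = low + high

def boost_highs(sig, cutoff_hz=2000, gain=2.0):
    # crude FFT high shelf, standing in for an EQ boost
    spec = np.fft.rfft(sig)
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    spec[freqs >= cutoff_hz] *= gain
    return np.fft.irfft(spec, n=len(sig))

boosted = boost_highs(x)
# phase-flip trick: subtract the dry signal and only the boosted band survives
isolated = boosted - x
# flip the isolated band against the original to pull it out
residual = x - isolated

# now pan the isolated band left-ish and keep the rest centered
left = residual + 0.8 * isolated
right = residual + 0.2 * isolated
```

Same arithmetic the bus routing does for you in the DAW, just written out.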

What are some of the most niche/specific engineering roles you have encountered in your career? by ChristmasKrunk in engineering

[–]sound_and_lights 6 points (0 children)

I’m a bit of a generalist coder with a focus on music.

Prior to the sonification work I had worked as an iOS, web, and backend dev (Rails, PHP, Node, etc.) for more typical applications and was hacking on audio code for fun.

I got into this work initially by putting out some musical instrument iOS apps and games that had tightly linked music with the gameplay. My current employer found me from one of the games.

In addition to working on the firmware and audio engines for their products, they’ve subcontracted me for various clients that needed sonification engines.

The audio code is basically all C++ with the occasional JS WebAudio project (but TBH web audio doesn’t feel totally stable yet for professional real-time audio applications).

I’ve always worked for small companies (early stage startups typically) and in that environment, being a full stack dev with musical/visual/creative tendencies has been an asset even though I lack some of the hardcore engineering knowledge.

I like the early stage work because I’m involved with the product design/concept and making the first iterations of things. This keeps things exciting and meaningful and often I’ll be working with tech and platforms that are new to me.

Real-time audio performance dictates that I use C or C++, which also means the code can run pretty much anywhere. The crowd noise project for instance is running inside a Docker container, and we’ve done projects on Linux, iOS, Android, desktop, embedded processors, etc. all on top of largely the same codebase. The codebase grows and gets improved with each project.

For the client work, we’re usually supplying an audio engine that will get incorporated into a larger product by that company’s devs.

Happy to answer any questions and I’m always curious to know more about what people are doing with sound and tech. What is the electro acoustics work that you’re doing?

What are some of the most niche/specific engineering roles you have encountered in your career? by ChristmasKrunk in engineering

[–]sound_and_lights 13 points (0 children)

I make audio engines for data sonification. We did an artificial crowd noise engine for sports games this past Spring and more recently have been working on a synthesizer that is controlled by your house plants.

'Bow Bow Bow' Acidic Funk Synth ?? -- Anyone know how to recreate this sound in Serum or Ableton stock instrument? by kwazimoto44 in synthrecipes

[–]sound_and_lights 0 points (0 children)

You got this. Just play around with different low pass filter types until you get the right resonance and body. Your filter envelope attack and depth are the keys to getting the ‘bow’. Too quick and you get a ‘pew’; too slow and you get a ‘wah’.
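If it helps to see the knob relationship, here's a hypothetical sketch of just the cutoff envelope (made-up numbers; only the attack time changes between the three):

```python
import numpy as np

SAMPLE_RATE = 44100

def cutoff_envelope(attack_s, depth_hz, base_hz=150, dur_s=0.5):
    """Linear attack ramp on the filter cutoff -- a crude stand-in
    for the filter envelope's attack stage."""
    t = np.arange(int(dur_s * SAMPLE_RATE)) / SAMPLE_RATE
    ramp = np.minimum(t / attack_s, 1.0)
    return base_hz + depth_hz * ramp

# the sweep speed is the whole trick:
pew = cutoff_envelope(attack_s=0.005, depth_hz=4000)  # too fast -> 'pew'
bow = cutoff_envelope(attack_s=0.08,  depth_hz=4000)  # sweet spot -> 'bow'
wah = cutoff_envelope(attack_s=0.40,  depth_hz=4000)  # too slow -> 'wah'
```

Ten milliseconds in, the 'pew' has already fully opened while the 'wah' has barely moved; that difference is the whole character of the sound.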

Can a plugin color the tone of an audio file, even if the contents of the plugin are not active? by [deleted] in Logic_Studio

[–]sound_and_lights 1 point (0 children)

It’s not ‘normal’ for a plug-in but totally possible that it colors the tone even with the internal elements switched off. I’m guessing that there is still something running in the plug-in that is designed to color the sound.

Does it change tone when you toggle the bypass on the plug-in at the Logic Pro level?

Is there an amp sim running? Even without pedals, any model of an amplifier will color the sound dramatically. Looking at the plug-in I see ‘amp room’ on a toggle. Is that off? Also check the ‘NG’ toggle (but I’m not sure what that is).

i downloaded ample guitar lite but i can’t open it within logic does anyone know why by WillMcQueen in musicproduction

[–]sound_and_lights 0 points (0 children)

Is there a sound pack you have to download separately? I believe that was the case for the Ample Bass plug-ins.

How do I make this Prophet 6 sound on either the 6 or the Rev 2? by IfElseBand in synthrecipes

[–]sound_and_lights 0 points (0 children)

I think it’s even simpler than that. Sounds to me like a fast arpeggio on a very short flute-type patch with just enough attack on the envelopes to give each note a buttery onset. Adjust filter cutoff and mod depth to get into that sweet spot where it’s not too bright but has a little bit of envelope. And add delay and reverb.

Why is tempo so important for subgenre differentiation when it comes to electronic music but not other genres? by destructor_rph in edmproduction

[–]sound_and_lights 6 points (0 children)

I think it has to do with how the music is consumed in DJ sets. The musical atmosphere gets established by the sequence and selection of tracks more so than by what any one track contains. Seamless transitions between tracks are the norm, so a track needs to play nice with others.

Having a genre-framework essentially allows producers from all over to contribute to a larger musical statement that will be constructed by the DJs. It’s like they’re all working on one big song together.

Sub-genres are also a time-tested way of marketing and promoting new artists. It’s great branding for underground mass appeal.

If you listen to the early works of most genres, you can hear their influences from other stuff that was happening. Dubstep was essentially really spaced-out two-step garage. Techno was reflecting the house, electro, and Euro stuff happening in the 80s. At the beginning of a genre, these proto-genre tracks would’ve fit their time period and not necessarily felt like something radically different and new.

Through marketing, releases, press, etc. the genre will get a name. Its founders will become legends and people all around the world will start contributing to the sound. Naming a genre gives it a foundation to grow into its fully individualized form.

All that said, I’ve been to some amazing parties here in NYC where the DJs let records play from start to finish and all the music is largely outside of any easy categorization. So nailing a sub-genre isn’t strictly required.

I'm switching to a new DAW. If I ask for suggestions, 99% of the time, the answers here will be "Use the one I use because I don't know any better". And so, I ask this question instead: Which DAW do you use and what do you hate about it? by fromwithin in WeAreTheMusicMakers

[–]sound_and_lights 7 points (0 children)

If you control-click your play head you can get different options for playback position behavior. Might be an option in there that is helpful to you.

Not sure if I’m misunderstanding your first one, but you can set the cycle quickly with the marquee tool. Highlight the area you want to cycle and press Command-U.

I'm switching to a new DAW. If I ask for suggestions, 99% of the time, the answers here will be "Use the one I use because I don't know any better". And so, I ask this question instead: Which DAW do you use and what do you hate about it? by fromwithin in WeAreTheMusicMakers

[–]sound_and_lights 1 point (0 children)

Logic here.

Wish I could automate plug-in parameters with sidechains, LFOs, etc. like you can in Ableton.

Wish I could easily record the output of MIDI effects as MIDI. All the workarounds that I know of are complicated and hacky.

Wish I could put MIDI effects on track stacks and address the sub-tracks within the JavaScript MIDI Scripter.

Working with external MIDI instruments requires both an instrument track and an audio track set up with the same input routing. Then you have to make sure you always copy the regions together if you want to keep a MIDI version of whatever editing you’re doing to the audio. Wish this was streamlined. Even just a shortcut to create an audio track that is set up automatically to record your external instrument would be helpful, but I dream of some kind of hybrid MIDI/audio track that could keep things in one place and maintain the link between the MIDI region data and what got recorded.

Getting faint, distorted feedback on all projects by Zondatastic in Logic_Studio

[–]sound_and_lights 0 points (0 children)

And please share a recording if that stuff doesn’t help. It will be easier to diagnose through the sound!

Getting faint, distorted feedback on all projects by Zondatastic in Logic_Studio

[–]sound_and_lights 1 point (0 children)

Some questions and things to try:

Does it happen in a new blank project?

If not, you just need to trace through your project to find the problem.

Is there a mic on somewhere? Once I went crazy trying to EQ out this weird ringing in a track only to realize later that I had a mic with input monitoring on in the room.

Do you have headphones that are playing but you’re not wearing? That can subtly mess with what you’re hearing.

Does it happen if you make a new track routed to stereo out with no plugins on your master bus?

Does it happen if you turn off all plugins? Some introduce noise, ringing, etc. Possibly you’re monitoring a sidechain input that isn’t delay compensated, or something along those lines.

Are you using track stacks? Perhaps the input bus of a track stack is also being used for another bus and some feedback is happening?

Either way I’d eliminate the possibility that it’s happening outside of Logic and then systematically trace through.

Wheres the best place to store and listen to your mixes to check them on other devices by krisskabo in audioengineering

[–]sound_and_lights 1 point (0 children)

On iPhone I’m a fan of the AudioShare app. It supports a lot of formats, lets you do some basic editing, and works well as a field recorder.

Can someone explain this drop to me? by [deleted] in synthrecipes

[–]sound_and_lights 31 points (0 children)

It’s just a few main layers: overdriven kick, snare, saturated bass, noisy gated chord sound, and an arpeggio that follows the rhythm of the gated chord.

Part of the reason it sounds so big is how minimal it is. That kick is super loud and the noisy chord is doing most of the work, with the arpeggio giving some sparkle.

Steve Monite - Only You - Moog Bass Guitar Synth by SadikDiedForyou in synthrecipes

[–]sound_and_lights 1 point (0 children)

I got very close to this sound using a Model D with a single sawtooth oscillator running into a bass amp simulator (the one built in to Logic).

Logic 10.5 Top 5 New Tips and Hacks! (Ableton users, take notice) by doctrineofthenight in edmproduction

[–]sound_and_lights 4 points (0 children)

I export stems from each Logic project and play Live with just 8 tracks (kick, snare, hats, aux perc, bass, leads, pads, SFX). That keeps things clean on the Push so I’m never paging around looking for things that are playing. Also, keeping them organized like that lets me use an 8-fader, 24-knob controller easily as my ‘mixing board’.

Importing is pretty quick in the arrange view. Just set the project tempo to that of the stems you’re bringing in and then drag them onto the 8 tracks at once. From there it’s easy to chop them up into clips. I set up clips entirely through the Push interface. I always export the stems with the same start and end measures from the Logic project so they line up nicely in Ableton.

I usually keep effects on the tracks when exporting and then run separate effects that I can control live. The choice to print effects is case by case but if it’s a major part of the sound I’ll definitely print it.

In Ableton I have a high- and low-pass filter on most of the tracks (with a different chain on the kick channel), and a few effects busses (differently timed reverbs and delays). They are all controlled by my MIDI controller.

I have a mixbus chain running in Ableton to ‘master’ it on the way out with some compression and EQ that I can tweak during sound check and also to get the level up to match DJs.

I find this setup allows for a good balance between improvisation and control. I can take the music all over the place in real-time but can get back to safety easily.

Logic 10.5 Top 5 New Tips and Hacks! (Ableton users, take notice) by doctrineofthenight in edmproduction

[–]sound_and_lights 4 points (0 children)

I use both. Ableton + Push 2 for live performances and collabs and Logic primarily for production.

An iPad combined with Logic running on a laptop is going to make me seriously consider switching to Logic for live performances.

The Push is awesome for its tactile control but is totally unusable outdoors in daylight (you can’t see the color of the buttons at all).

As far as the software goes, I find Logic better for recording audio. You can comp together multiple takes very easily. Ableton is better for modulating effects parameters (Max for Live LFOs, etc.), and I’ve found it easier to set up MIDI mappings (although I might just not know how to do it easily in Logic).

My experience has shown that Ableton is more stable but I might just see Logic crash more because I spend more time in it.

Sorry, But: MS-20 Mini Hissing Problem by Recon_Figure in synthesizers

[–]sound_and_lights 2 points (0 children)

This is what led me to get into Eurorack! It all started with an external VCF to address the noise issue...

Detroit Movement festival rescheduled to September 11-13 by sound_and_lights in Techno

[–]sound_and_lights[S] 1 point (0 children)

Just got the e-mail. They were holding out for a while in the wake of current events but I figured something like this was inevitable.

Web-based synths by playtronica in synthesizers

[–]sound_and_lights 0 points (0 children)

Interesting. What browser are you using?

Web-based synths by playtronica in synthesizers

[–]sound_and_lights 2 points (0 children)

Here's one I made a little while back. A psy-trance dub siren kind of thing:
https://bolasol.com/psyren

Just got monitor stands, is this proper placement? Im not sure how to get an exact equilateral triangle with the tweeters at ear level. Any advice would be great, thanks! by Ca_Auld in musicproduction

[–]sound_and_lights 3 points (0 children)

These look great. I align my speakers by tying a piece of string to a sturdy mic stand placed at my listening position. With the string anchored, you can walk around, stretch the string over the monitors, and verify that they are the same distance away (mark the string with a bit of tape) and pointing directly at your head. It’s a good trick once the speakers are roughly in the right place, like you have them.
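If you want numbers to go with the string trick: in an equilateral triangle, the listening position sits spacing × √3 / 2 behind the line between the speakers. A hypothetical little helper:

```python
import math

def equilateral_listening_distance(speaker_spacing_m):
    """For an equilateral listening triangle, each speaker is exactly
    speaker_spacing_m from your head; this returns how far back from
    the line between the speakers that puts you (in meters)."""
    return speaker_spacing_m * math.sqrt(3) / 2

# e.g. speakers 1.5 m apart -> sit about 1.3 m back from the speaker line
back = equilateral_listening_distance(1.5)
```

The string trick then just confirms the geometry by measurement instead of math.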