Soundscape EQ in Reason


Dafydd

Hello. I've been using Reason for a year or so now after sticking with MIDI all my life, and I still don't get the EQ part of it. As it's been explained to me, if too many instruments are playing simultaneously on the same frequencies, they're going to muddy up the soundscape. What I do now is just boost everything, because, well, everything is being muddied up by something else, and back when I was still making MIDIs the only way to make an instrument more audible was to make it louder.

With Reason, I understand the key to making a song sound good is to EQ every instrument so it has its own range of frequencies that no other instrument is using. The problem here is - how do I know what frequencies each instrument is using, and how do I know which frequencies to fade out in order not to get a muddy soundscape? Also, whenever I fool around with EQ on an instrument, it sounds different than it did to begin with (duh). How do you puzzle the instruments together in the soundscape so that each one keeps to its own range without altering the perceived sound of each instrument?

I'm totally in the dark here. Someone help?

I'd also very much appreciate it if someone would be kind enough to explain how compressors and limiters work, because I think I know what they do but not how to turn the knobs to make them do what I want them to.

frequency relationships.cmb

Use that as an effect device and learn from it. The first vocoder, labeled "before eq", works as a spectrum analyzer and shows you what freqs the raw sound plays in. The second vocoder, labeled "equalizer", works as an EQ; you can adjust the frequency bands by making them taller or shorter to raise or lower the volume of certain freqs the instrument plays. The third and final vocoder, labeled "after eq", shows you how the EQ has modified the sound from the original.

This patch is in itself like a lesson on freq relations. Just go through different patches and play high and low notes and you'll quickly learn how sounds relate to the EQ and the frequency spectrum in general. After experimenting for a bit you SHOULD be capable of listening to a sound and at least identifying whether it contains mostly low, high, or mid frequencies. This could have been a much better teaching aid if the vocoders labeled the freqs like the MClass EQ does, but unfortunately they don't, so it can more or less just help you learn the basics and get the general idea. It's still something though, so enjoy.
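
(If you're curious what a band-based analyzer like this is actually measuring, here's a rough sketch in Python with numpy. It shows the general idea of splitting a sound's energy into frequency bands, not how Reason's vocoder is implemented.)

    import numpy as np

    def band_levels(signal, sample_rate, n_bands=16):
        # magnitude of each FFT bin, and the frequency each bin represents
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), 1 / sample_rate)
        # split the audible range into log-spaced bands, lows on the left,
        # highs on the right, like the vocoder's band meters
        edges = np.geomspace(20, sample_rate / 2, n_bands + 1)
        return [spectrum[(freqs >= lo) & (freqs < hi)].sum()
                for lo, hi in zip(edges[:-1], edges[1:])]

    # a 100 Hz sine only lights up the low bands:
    sr = 44100
    t = np.arange(sr) / sr
    print(band_levels(np.sin(2 * np.pi * 100 * t), sr))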

First, do not EQ everything as a rule. EQ is part of mastering, and like all mastering, it should be subtle (unless you want to create special effects or give the song a certain dynamic, etc.; as a general rule, these kinds of effects should be subtle).

Changing the EQ on real instruments will greatly affect their nature, and thus you can easily end up with undesired results, like real instruments sounding fake or overprocessed. Instead, try something more subtle. Don't make EQ changes of more than about 6 dB in either direction. If your song sounds too cluttered, try attenuating the sounds instead of EQing them. If you really want two instruments that share similar frequency ranges to both be clearly audible, don't completely cut off some frequencies on one or apply an insane boost to the other. Instead, start with one sound and find the freqs that give the instrument more presence and boost them a little, then do the same for the other instrument.
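
(To make "boost them a little" concrete, here's a minimal peaking-EQ filter in Python, built from the well-known RBJ Audio EQ Cookbook biquad formulas. The function name and the 3 dB / 3 kHz numbers are just illustrative.)

    import numpy as np
    from scipy.signal import lfilter

    def peaking_eq(x, sr, freq, gain_db, q=1.0):
        # RBJ cookbook peaking-EQ biquad: a bell-shaped boost/cut at `freq`
        A = 10 ** (gain_db / 40)
        w0 = 2 * np.pi * freq / sr
        alpha = np.sin(w0) / (2 * q)
        b = [1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A]
        a = [1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A]
        return lfilter(b, a, x)   # lfilter normalizes by a[0] itself

    # a gentle +3 dB presence bump around 3 kHz, well under the ~6 dB guideline:
    # y = peaking_eq(x, 44100, freq=3000, gain_db=3)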

As for finding the right frequency, a lot of factors come into play. Pitch, for example, influences which frequencies a sound will occupy, so if you want you can try lowering/raising the desired instrument an octave to fit the frequency range you want (this works better on synths and processed sounds than on real instruments). In the end it all comes down to experience, and the more you mix, the more you will be able to tell which frequencies are predominant in a sound.
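
(The octave trick works because each octave doubles or halves the fundamental frequency. A quick illustration using the standard MIDI-note-to-Hz formula, with A4 = 440 Hz:)

    def note_to_hz(midi_note):
        # equal temperament: 12 semitones per octave, A4 (note 69) = 440 Hz
        return 440.0 * 2 ** ((midi_note - 69) / 12)

    print(note_to_hz(45))   # A2: 110 Hz
    print(note_to_hz(57))   # A3: 220 Hz, same note one octave up, new freq range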

If you need help you can always use a spectrum analyzer (in Reason's case, a vocoder; there's a thread around here on how to use the vocoder as a basic spectrum analyzer). Put one at the end, before the final audio out, and run your instrument. Watch the peaks and you will have a good idea of which frequencies it's occupying.

Again, from the things I've read and my own experience: try not to EQ everything unless you feel it's necessary, and you'll find your mix acquires a more natural feel. In the end, the kind of music you are making will determine whether you use EQ heavily or not. Trying to EQ everything to fit a certain freq range will give your song a noticeably processed and unnatural feel, which you may not want.

About compressors and limiters: that's a whole new topic, and a BIG and complex one. I will try to make some sort of tutorial on this, but suffice it to say that mastering takes a lot of experimentation and trial and error, as you won't find a mastering setting that works for all your songs. There are some "rules" to follow, but when it comes to mastering nothing is set in stone. I'll try to cover this in more detail later; in the meantime, try to apply what I said about EQ to your songs and see if you achieve better results.

Understanding and grasping some of these concepts will surely take time, and there's a learning curve, so be patient, experiment, read, and of course, ask.

Good luck.

What you said about EQ is basically what I said about EQ, only you provided some solutions - ones that I unfortunately didn't really understand. The real issue, I guess, is how to EQ something and make it come out more by boosting its signature frequencies when these obviously change whenever you change the pitch of the instrument, like you said. How do you cope with this? Do you automate the EQ settings and give each knob on it a track of its own?

Looking forward to some basic tutorial on compression and limiting.

Yes, I made the patch.

On compression: http://dnbsecrets.com/node/compression-101-teh-basics-422.html

On compression and limiting: http://www.ocremix.org/forums/showthread.php?t=3784 (scroll down all the way until you get to the MASTERING part, he starts talking about compressing and limiting there. A limiter is basically a compressor with an infinite ratio, you'll get what that means after you read the tutorials.)
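
(To illustrate the "infinite ratio" point before you read them: a compressor's static curve only lets the level rise 1/ratio as fast above the threshold, so as the ratio grows huge the output gets pinned at the threshold, which is exactly what a limiter does. A toy sketch, ignoring attack/release and working straight in dB:)

    def compress_db(level_db, threshold_db=-10.0, ratio=4.0):
        if level_db <= threshold_db:
            return level_db                  # below threshold: untouched
        # above threshold: output rises only 1/ratio dB per input dB
        return threshold_db + (level_db - threshold_db) / ratio

    print(compress_db(0.0, ratio=4.0))   # -7.5 dB: squashed but still rising
    print(compress_db(0.0, ratio=1e9))   # ~-10 dB: pinned at threshold, a limiter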

As I said, EQ and mastering in general should be subtle, for the most part. You can't automate an EQ to follow pitch, because it will cause a filtering effect (EQs are filters, btw), giving your instrument a different behavior and texture. As I said, don't box every sound in your song into a certain frequency range; I doubt that could work. Instead try subtle adjustments to EQ and volume or velocity.

There are several ways to give sounds that share the same frequency range presence and make them audible at similar levels that don't necessarily involve EQing the hell out of everything. Stereo spread, for example, helps a lot. Try panning one instrument to the left and the other to the right, starting with small changes as you listen to them play. Play a bit with the volume of each and make more changes to the panning until you feel they are getting equal space in the song.
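
(For what it's worth, here's what a constant-power pan law looks like in code. This is a common textbook formula, not necessarily the exact law Reason's mixer uses; numpy assumed, pan position from -1 for hard left to +1 for hard right.)

    import numpy as np

    def pan(mono, position):
        # constant-power law: combined loudness stays steady across positions
        angle = (position + 1) * np.pi / 4   # map [-1, 1] onto [0, pi/2]
        return np.cos(angle) * mono, np.sin(angle) * mono   # (left, right)

    # nudge one instrument a little left and the other a little right:
    # bass_l, bass_r = pan(bass, -0.3)
    # keys_l, keys_r = pan(keys, +0.3)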

You can also stereo-spread one and center the other, using a stereo imager on one instrument to spread it and another on the other instrument to center it. You can also try passing the signals of both instruments through a compressor to even out their levels.
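
(One common way a stereo imager does the widening/narrowing is mid/side processing: the mid is what the two channels share, the side is what differs, and scaling the side changes the perceived width. A sketch of the idea, not the MClass Stereo Imager's actual algorithm:)

    def stereo_width(left, right, width):
        # width = 0 collapses to mono (centers the sound), width > 1 spreads it
        mid = (left + right) / 2
        side = (left - right) / 2 * width
        return mid + side, mid - side        # new (left, right)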

If you want to find what the "main" frequencies of an instrument are, I already gave you the solution: use a spectrum analyzer (or in Reason's case, the vocoder), set your instrument on a loop, and watch the peaks on the vocoder; they will tell you which frequencies are most prominent in it.

I think you can EQ the hell out of stuff and get something desirable. And general EQing, like on the synths, is part of the mixing stage I believe, not mastering. Mastering is when you change the character of the entire track as a whole, not the individual instruments. On the instrument level it's still pretty much considered mixing, and from what I've read around several sources, guys seem to say you can EQ quite a bit off an instrument, even enough to make it sound pretty thin and terrible by itself, as long as it sounds good and clean in the mix.

I'll give you an example from a track I'm working on, Dafydd. My bass synth was playing all the frequencies below 625Hz and was pretty friggin loud. The problem is that it interfered with the bass kick in my drums and muddied it, and the kick had most of its prominent frequencies around 250Hz-625Hz. What I did with the EQ was cut a fairly small but deep hole around 500Hz in my bass; this did very little to the sound since I only took out a small proportion of its freq range. Then with the bass kick, I raised the level of the freqs in the 500Hz area to make those stand out more, and bam! Now you can fairly easily hear my bass AND the bass kick, with neither sounding awkward or muddying the other up, because of the EQ.

Of course, if I have a lead synth going over the bass and drums, and it's somewhere between 1kHz-6kHz, then I don't really need to EQ it since it's not in the way of anything. On the other hand, if I had some bells that were around 4kHz-10kHz and were providing a counterpoint melody to my lead, I'd think about lowering their 4kHz-6kHz frequencies so that they don't muddy up the lead. I'd probably even raise or lower the lead a little in the 4kHz-6kHz area to see how it'd sound. So basically... just think about stuff like that when you mix.
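
(In code terms, that kick/bass carve is just two complementary bell filters at the same spot. This reuses the peaking_eq sketch from earlier in the thread; the exact gains and Q values here are illustrative, not the ones from the track.)

    # peaking_eq is the RBJ bell filter sketched earlier in the thread
    # a narrow, deep cut in the bass where the kick lives...
    bass_out = peaking_eq(bass, 44100, freq=500, gain_db=-6, q=4)
    # ...and a modest boost on the kick in the same spot
    kick_out = peaking_eq(kick, 44100, freq=500, gain_db=+3, q=2)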

Thanks. I've read up on Zircon's description of compressors, and it's amazingly satisfying to turn the knobs and suddenly know what they do. His tutorial doesn't mention sidechaining, however. Anyone care to explain what it can be used for? Like, an example I could listen to?

Also, being Swedish, I constantly worry that Propellerheads will make mistakes when writing in English. A lot of the time when reading the manual, I keep thinking to myself "omg, this sentence is so written by a Swede". Does anyone else notice anything strange or am I just being paranoid? (note that I realize my English isn't perfect, either)

Jag håller med! ("I agree!")

But it's kind of charming to me hahaha

EQ is very common in the kick/bass situation; it's almost mandatory. But as I said, it depends on the genre you're mixing. For synth instruments you might want to make big changes to the freqs, but for real instruments and for general mastering it's not recommended. Again, remember that EQ can drastically change the "personality" of your sounds. Personally I EQ pretty much everything, but that's because most of the songs I make are very synth-heavy and the genre requires drastic adjustments to the nature of the drums and instruments. When I go for something like a mix of orchestral elements and electronic aspects, I keep things gentle.

If you want good examples of this, check out the Reason demo songs and see how they apply EQ to instruments. As I said, how you use EQ, like any other effect, depends on the kind of music you're making. But as a general rule, try to keep it gentle and keep your sounds authentic. If you start EQing everything to box every sound into a frequency range, the sounds lose authenticity and character, whereas if you try to cover a good range of frequencies with each instrument in a way that they don't totally overlap each other but also don't sound boxed-in and thin, you will achieve better results.

Sidechaining is a type of compression used to automatically attenuate certain sounds when others are played. It's used a lot for vocals: when the vocals come in, certain instruments get attenuated to give prominence to the vocals. Most of the time this effect is used in a subtle manner, but it can also be used heavily in some styles; for example, a lot of House and Techno songs use this method to create an effect like the one found in rayza's new remix at around 0:45: http://www.ocremix.org/remix/OCR01633/
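
(A bare-bones sketch of the mechanics in Python/numpy: the trigger signal's envelope drives gain reduction on the other sound, so the pad ducks every time the kick hits. Real compressors work in dB with attack/release controls; this keeps the math deliberately simple and assumes both signals are normalized to a peak of 1.)

    import numpy as np

    def sidechain_duck(pad, kick, amount=0.8, release=0.999):
        out = np.empty_like(pad)
        env = 0.0
        for i in range(len(pad)):
            # envelope follower: jump up instantly, decay slowly
            env = max(abs(kick[i]), env * release)
            # the louder the kick right now, the quieter the pad
            out[i] = pad[i] * (1.0 - amount * env)
        return out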

A lot of the time when reading the manual, I keep thinking to myself "omg, this sentence is so written by a Swede". Does anyone else notice anything strange or am I just being paranoid?

No, I notice it too, but that's not because I'm fluent in Swedish (which I'm not) but because I'm fluent in English, and I KNOW when something sounds funny. I kinda envy you and Anso though; I think it's totally badass to know a second language (like English in this case) well enough to go on forum boards and comfortably speak it with native English speakers. If I went on a Swedish board I wouldn't understand anything ._.

Hey, what's the razor tool for? I can't find any mention of it in the help contents...

It's for cutting up clips. Say you record something and you wanna cut the MIDI notes up into different clips so you can keep them categorized and copy and paste them into different sections of the song. The razor tool cuts the clip where you specify.
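
(Under the hood, that cut is nothing fancy; conceptually it just splits the clip's note events at the cut point. A toy sketch with (start_time, note) tuples, not Reason's actual data model:)

    def razor(notes, cut_time):
        left = [n for n in notes if n[0] < cut_time]
        right = [n for n in notes if n[0] >= cut_time]
        return left, right

    clip = [(0.0, "C3"), (0.5, "E3"), (1.0, "G3"), (1.5, "C4")]
    print(razor(clip, 1.0))   # two independent clips you can copy/paste around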

Ahhh... ok. Of course. I couldn't find a way to cut up clips and didn't know what the razor was for - I should have made the connection.

You'd probably understand bits and pieces if you went on a Swedish forum, but not enough to actually be able to participate in the discussion. Anyway, so you do notice there's something funny about the Reason documentation. How embarrassing, haha :oops:

Jag är inte svensk, men norsk :P ("I'm not Swedish, but Norwegian!")

And sometimes it feels like I'm reading Swedish, only with English words...

I made a post on sidechaining in Reason a little while ago on this thread:

http://www.ocremix.org/forums/showthread.php?t=11783

Crappy musical example but I hope it helps lol.

That was what got me into doing sidechain compression! YAY! Anyone who talks to me about mastering knows I always talk about sidechaining. It's super helpful when percussive stuff is getting drowned out by a longer note at the same frequency. It's the best way to get that stuff done.

Thanks for that post, malcos! It has been enormously helpful!
