
REASON - Please Direct Reason questions here


Devvyn

I get pissed about my PowerMac - two months after I bought it, Apple bumped the whole PowerMac line down a price tier, so for the same money I could have gotten a 2.0 GHz instead of a 1.8 GHz. But then I reassure myself by remembering that, if I had waited, I would have gone two months without my computer and wouldn't have cranked out something like three remixes in that time.

So, if you've cranked out a remix or two or three since March 10 with Reason 3.0, then you're good to go.

If you're a registered Reason 3.0 user, don't forget to grab the ElectroMechanical 2.0 ReFill from the Props' site.

Also, I'm not a beta tester. I don't like the Props' beta-testing scheme of only handing out demo builds and the like; it makes it really hard to get anything done. Every tester is already a registered user - it's not like they shouldn't give you the chance to save your work.


4 weeks later...

How can I reduce the delay between when I play a note on my MIDI keyboard and when I actually hear the sound? I'm using the NNXT sampler with Reason's own patches, with the attack turned all the way down to -64, and I still get a delay.


It's latency between the computer and the keyboard - the attack setting isn't what makes the synth lag. Go into Edit > Preferences, then to the MIDI page or something like that; there should be a latency setting for the link between the keyboard and the computer. It's usually because the computer is slow and needs more time to process the incoming notes. Personally, I don't use a keyboard to input notes, so I could be wrong since it's never come up for me, but you could try it.

I have a question of my own... Does anyone know of a good place to find a mastering FAQ? I've looked at the Propellerhead site and a few others and haven't found anything. I'm having problems with my master coming out clipped...


How can I reduce the delay between when I play a note on my MIDI keyboard and when I actually hear the sound? I'm using the NNXT sampler with Reason's own patches, with the attack turned all the way down to -64, and I still get a delay.

Preferences > Audio. Fool around with the driver selection and the buffer size/latency slider.
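To put rough numbers on that slider: the delay you hear is at least one audio buffer's worth of samples, so latency scales directly with buffer size. A quick back-of-the-envelope sketch (generic values, nothing Reason-specific):

```python
# Rough relationship between audio buffer size and output latency.
# Sample rate and buffer sizes below are common example values,
# not anything Reason-specific.

def latency_ms(buffer_samples: int, sample_rate: int = 44100) -> float:
    """One buffer's worth of audio, expressed in milliseconds."""
    return buffer_samples / sample_rate * 1000.0

for buf in (2048, 1024, 512, 256, 128):
    print(f"{buf:>5} samples @ 44.1 kHz ~ {latency_ms(buf):5.1f} ms")

# Dropping the buffer from 2048 to 256 samples takes you from ~46 ms
# (clearly audible lag) to ~6 ms (feels immediate), at the cost of more
# CPU strain and a higher risk of crackles and dropouts.
```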


Bought the upgrade today. Wow, it really was worth the money! They've really refined it - little touches like actual mute and solo buttons instead of that cross-out thing that was always awkward to use. All your sounds feel like they're at your fingertips, and the Combinator has some great presets!

I wasn't sure about getting the update, but I have to hand it to Propellerhead once again. Excellent. :D


One of my old songs, when rendered, produced this nasty screechy sound. I was curious what the issue was.

I finally narrowed the issue down to a combination of three devices.

If I put a Matrix on a Subtractor (gate & CV), another Matrix on the same Subtractor (mod & volume), and then a Scream 4 on the Subtractor, the render has massive feedback that isn't there during playback (in Reason 3.0.3).

I've been able to re-create this in a very simple song. I looked through the first few pages of the forums and didn't see any solution, and I was curious whether anyone else has stumbled across the same problem.

The resulting MP3 sounds like this: http://www.animeremix.org/Carbunk1e/download.php?song=Clank2.mp3


How do I use Reason with other programs? Is there any way for me to use Reason simultaneously with Cool Edit Pro? I would like to add some vocals to my songs!

Without ReWire compatibility there isn't a way to link them, if that's what you're asking. But if you mean running them both at the same time, that shouldn't be a problem, depending on what sound card you have. What most Reason users do is import the audio file into Reason via the NNXT (treat it as a sample) and just hold the note in Reason's piano roll for however long the audio file lasts. It's not the most graceful solution, but it works quite well.

Edit: I'm using Reason 2.5. I'm not sure how Reason 3 changes any of the above, if at all.
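If you go the NNXT route, the only fiddly bit is knowing how long to hold that note. A quick sketch of the arithmetic, assuming 4/4 time (the clip length and tempo below are made-up example values):

```python
# How long to hold the NNXT note so it covers the whole vocal clip,
# given the song tempo. The clip length and BPM are hypothetical.

def beats_to_cover(clip_seconds: float, bpm: float) -> float:
    """Number of beats the clip spans at the given tempo."""
    seconds_per_beat = 60.0 / bpm
    return clip_seconds / seconds_per_beat

clip_seconds = 30.0   # length of the imported vocal take
bpm = 128.0           # song tempo

beats = beats_to_cover(clip_seconds, bpm)
bars = beats / 4      # assuming 4/4 time
print(f"Hold the note for {beats:.1f} beats (~{bars:.1f} bars of 4/4).")
```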


I have a question of my own... Does anyone know of a good place to find a mastering FAQ? I've looked at the Propellerhead site and a few others and haven't found anything. I'm having problems with my master coming out clipped...

Is this pre- or post-rendering?

A lot of the time, if clipping is going on, you need to take down the bass; or, if there isn't much bass, just take the volume down in little bits until it sounds clean.
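To put numbers on "take the volume down in little bits": clipping just means the loudest peak of the mix goes past full scale, so the trim you need is roughly however many dB that peak overshoots by. A rough sketch (the sample values are placeholders):

```python
import math

# How far does the mix overshoot full scale, and how much should the
# master be trimmed? The samples list is a placeholder; in practice
# you'd read the rendered file with an audio library.

samples = [0.4, -0.8, 1.3, -1.1, 0.9]   # hypothetical float samples, full scale = 1.0

peak = max(abs(s) for s in samples)
if peak > 1.0:
    overshoot_db = 20.0 * math.log10(peak)
    print(f"Peak is {overshoot_db:.1f} dB over full scale;"
          f" trim the master by at least {overshoot_db:.1f} dB.")
else:
    headroom_db = -20.0 * math.log10(peak) if peak else float("inf")
    print(f"No clipping; {headroom_db:.1f} dB of headroom left.")
```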


How do I use Reason with other programs? Is there any way for me to use Reason simultaneously with Cool Edit Pro? I would like to add some vocals to my songs!

Without ReWire compatibility there isn't a way to link them, if that's what you're asking. But if you mean running them both at the same time, that shouldn't be a problem, depending on what sound card you have. What most Reason users do is import the audio file into Reason via the NNXT (treat it as a sample) and just hold the note in Reason's piano roll for however long the audio file lasts. It's not the most graceful solution, but it works quite well.

Edit: I'm using Reason 2.5. I'm not sure how Reason 3 changes any of the above, if at all.

You could also use ReCycle to chop it up into beats and import it into Reason as a REX file. That gives you more stylistic mixing possibilities like repeats, stuttering, etc.



On the flip side, you could export the audio from Reason and drop it in as a track in whatever recording program you use. Record your vocals on another track WHILE you hear the song! (I'm assuming the program does multi-track recording. For a single-track program, you would probably want one of the other approaches instead.)

Of course, the best solution would be to use ReWire. If your recording program supports ReWire, open that program first and then open Reason. That way you can not only record vocals but also hear the song being played live - no exporting to other programs every time you want to make a change.

Also, somebody mentioned they were looking for a mastering FAQ. While I don't have one, I recently discovered that the BV512 Vocoder can also be used as an EQ unit if the parametric EQ isn't suiting your needs (the BV512 works great for lo-cuts). There's a knob on the BV512 with two options, Vocoder and EQ. Vocoder is the default; switch it to EQ and you get a 4/8/etc.-band graphic EQ similar to those on certain stereos and computer media players.

Some might think this was obvious, but I didn't think it was, so there you go.
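For what it's worth, a "lo-cut" is just a high-pass filter: everything below the cutoff frequency gets attenuated and the rest passes through. Here's a generic one-pole high-pass sketch to illustrate the idea - it isn't how the BV512 does it internally:

```python
import math

# Minimal one-pole high-pass ("lo-cut") sketch. This is generic DSP to
# illustrate the idea, not how the BV512 is implemented internally.

def high_pass(samples, cutoff_hz, sample_rate=44100):
    """Attenuate content below cutoff_hz; pass the rest mostly untouched."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)
    out = []
    prev_in = prev_out = 0.0
    for x in samples:
        y = alpha * (prev_out + x - prev_in)   # standard RC high-pass recurrence
        out.append(y)
        prev_in, prev_out = x, y
    return out

# e.g. strip rumble below ~80 Hz from a (placeholder) signal
filtered = high_pass([0.0, 0.5, 0.3, -0.2, -0.4], cutoff_hz=80.0)
```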


The MClass Equalizer in Reason 3.0 is pretty decent. Kicks the snot out of the little tiny red EQ.

You know that rumoured ReWire limitation - only the first 16 devices, and only in the order they're created? I don't know who here uses Reason ReWired into a host, but today I was working with Logic and Reason (on a Mac, of course) and I didn't run into the limitation. It does take a bit of forethought, though.

When you ReWire (in Logic, at least), you create a ReWire Instrument object and give it a Bus and a Channel. On the Mac, Bus 6 corresponds to Reason (for reasons I don't know the details of). Once you do this, the Channel setting lists the devices in your rack by name. However, there are only 16 channels per bus (which means 16 devices).

Need to work with more than 16 devices? When you create a new ReWire object, set it to use Bus 7; the next 16 devices will appear in its Channel list.

Presumably this works all the way up to Bus 32 (and maybe even wraps around to Bus 1), which should be WAY more than enough devices.
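Put another way, under the scheme described above the Nth device in the rack lands on a predictable bus/channel pair. A little sketch of that mapping (the "Bus 6 carries the first 16 devices" rule is just what this post observed in Logic on the Mac, so treat it as an assumption):

```python
# Which Logic ReWire bus/channel the Nth Reason device would land on,
# assuming the scheme described above: Bus 6 carries devices 1-16,
# Bus 7 carries devices 17-32, and so on.

FIRST_REASON_BUS = 6      # as observed in Logic on the Mac (per the post above)
CHANNELS_PER_BUS = 16

def bus_and_channel(device_index):
    """device_index is 1-based (1 = first device in the rack)."""
    block, offset = divmod(device_index - 1, CHANNELS_PER_BUS)
    return FIRST_REASON_BUS + block, offset + 1

print(bus_and_channel(1))    # (6, 1)  -> Bus 6, Channel 1
print(bus_and_channel(17))   # (7, 1)  -> first device that spills onto Bus 7
print(bus_and_channel(40))   # (8, 8)
```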



Yeah, that's sweet. Apparently the limitation is in Cakewalk hosts and FL Studio (and maybe other hosts I don't know about). Seems like Logic has it right.


One of my old songs, when rendered, produced this nasty screechy sound. I was curious what the issue was.

I finally narrowed the issue down to a combination of three devices.

If I put a Matrix on a Subtractor (gate & CV), another Matrix on the same Subtractor (mod & volume), and then a Scream 4 on the Subtractor, the render has massive feedback that isn't there during playback (in Reason 3.0.3).

I've been able to re-create this in a very simple song. I looked through the first few pages of the forums and didn't see any solution, and I was curious whether anyone else has stumbled across the same problem.

The resulting MP3 sounds like this: http://www.animeremix.org/Carbunk1e/download.php?song=Clank2.mp3

Anyway, I finally got a response on this.

It's a 3.0.3 problem specifically. The Props are working on fixing this issue, among other reported bugs. For now, composers have to work around this combination of devices (Subtractor -> 2x Matrix -> Scream 4).

-dj carbunk1e


On a Mac you can presumably go from Bus 6 to Bus 32, each of which carries 16 channels. That's 27 times 16 devices, which is over 400 devices (432, to be exact). Chances are Logic would wrap around from Bus 32 to Bus 1 if needed, so you'd have something like 512 devices altogether.

In Reason, the Combinator counts as an object, as does the Hardware Interface. Also, I don't think you can have more than 64 audio connections on the Hardware Interface (as far as I know), which means you can't send more than 64 audio channels to the host sequencer for mixing. There's room for 64 connections, and since each device needs a stereo L/R pair, that means there's potential for 32 stereo audio devices that you'd mix down in Logic.

It's worth noting that in Logic the audio channels count up way past 64 (though they're greyed out), suggesting the ReWire protocol supports something like 256 channels at least.
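Spelling that arithmetic out, under those assumptions (buses 6 through 32, 16 channels per bus, 64 mono outputs on the Hardware Interface):

```python
# Back-of-the-envelope totals under the assumptions above: buses 6-32,
# 16 channels per bus, and 64 mono audio outputs on Reason's
# Hardware Interface.

CHANNELS_PER_BUS = 16

buses = range(6, 33)                        # buses 6 through 32 inclusive
devices_without_wrap = len(buses) * CHANNELS_PER_BUS
devices_with_wrap = 32 * CHANNELS_PER_BUS   # if Logic also wrapped around to buses 1-5

hardware_outputs = 64                       # mono connections on the Hardware Interface
stereo_devices = hardware_outputs // 2      # each device needs a stereo L/R pair

print(devices_without_wrap)   # 27 * 16 = 432
print(devices_with_wrap)      # 32 * 16 = 512
print(stereo_devices)         # 32 stereo devices to mix down in the host
```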

I'm not sure about multiple instances of Reason. On Mac OS X you can't explicitly open more than one instance of an application (though you can open as many windows of that app as you want). Based on my experiments, if you have one Reason song open over ReWire and then open another, Logic takes the ReWire settings from whichever one was opened most recently.

In any case, the point is that ReWire isn't really that limiting at all if your host is properly designed. I'm using Logic as the host and Reason as my massive sound module right now. For $350 I don't think you can get a more complete set of sounds to start out with.


As far as I know, you can't. Reason can only operate as a Rewire Slave. It cannot operate as a Rewire host.

How would one launch Reason 2.5 as a ReWire slave in Madtracker 2.5.0?

The first thing to determine is whether MadTracker can act as a ReWire host. Based on what their page says - "Hook it up inside any ReWire host, and use MadTracker as an expansion with your current musical projects!" - I can only assume that MadTracker, like Reason, acts as a ReWire slave. You'd need a ReWire host, like Tracktion (which is free, or at least used to be).



Shit. My hopes of hooking Reason up to a tracker are dashed. Thanks for the info, regardless.


