
Harmony

Members
  • Posts

    1,081
  • Joined

  • Last visited

Everything posted by Harmony

  1. Thanks, I love videos like these!
  2. Kanthos summed it up. I'll add that aside from having a few extra features that may be helpful for really large or complex projects, the bigger, more expensive sequencers (SONAR, Cubase, Logic, etc.) will potentially give you fewer headaches when trying to work with third-party software and hardware. Does Kontakt (a popular software sampler) work with SONAR? Absolutely. Does it work with Mixcraft? Eh...maybe...it should...right? Everyone's different. I take weeks or months to finish a full song. The length is partially due to the fact that I work with a lot of live instruments, and setting them up to record isn't always a fast process. But I'm also a perfectionist when it comes to my own music, not because I (or anyone else) need to be to get good results, but because for me it's fun. Some people crank out the hits in hours or days. There's a music competition over at ThaSauce called the One Hour Compo in which musicians have a single hour to finish a song. The quality some people get in an hour is sometimes amazing. So yeah, it just depends. No problem, and thanks.
  3. You made a suit out of money!? FOOL! Jade, get away quick while you can. He's going to lead you to financial ruin!
  4. That's essentially how I do it. SONAR has a listing of all assignable parameters for a given effect or instrument, so I usually go that route rather than use whatever built-in learn functionality the plugin may have. But learn is learn, whatever way you look at it I guess.
  5. I'M NOT LARRY. Dave, you've known him for years, he lived in your house, and you still can't tell us apart!?

  6. No need to be nervous, welcome to the boards. Audacity is an audio editor, which means it takes in or records audio files (for example, .wav, .mp3, .aif) and allows you to manipulate them (for example, add effects, change the pitch/volume/speed). Those effects are performed by what are loosely called “plugins”. Some plugins come with Audacity and others (usually in a format called VST) can be bought or freely downloaded. Now, while you can use Audacity for sequencing entire songs, it's not the best tool for the job and it doesn't offer all of the features that a full multi-track sequencer (sometimes referred to as a DAW) will have. Reaper and Mixcraft are two good options that offer free unlimited trials and cost about $60 if you decide to buy. There are more expensive and potentially better options, but these are absolutely fine to get started with. I'd still get Audacity to use as a great audio editor, but for your sequencing, go with a full sequencer. So I'll try to answer your questions with that advice in mind.

1) Audacity is cake to use. Reaper takes a little getting used to if you've never used a sequencer before, but once you get the hang of it, or any sequencer, it's very straightforward. Everything is nicely laid out and it didn't give me any problems when I tried it a while ago. I actually just found out about Mixcraft yesterday when I saw a friend of mine using it, but it looked really good. Mixcraft is modeled after GarageBand, so its goal is ease of use.

2) Reaper has a user's guide and a forum that can be helpful. A YouTube search also pulled up a video that looks helpful for newbies.

3) A soundfont is a collection of audio samples. Those samples are bundled with instructions used to tell a program how to play them (volume/length/pitch/effects) when they are triggered with MIDI data (see the next question if you don't understand MIDI).
Most sequencers, Reaper included, don't natively read soundfonts, but most can load a soundfont player, which in turn loads the soundfont and reads the MIDI data that you send to it, triggering the sounds. They're incredibly useful and I'm a huge fan of soundfonts.

4) There are a couple of ways in general. You have to understand the difference between MIDI and audio data, though. Audio is a digital or analog representation of sound. Your keyboard produces audio. MIDI is a digital instruction that tells hardware or software how and when to play a certain piece of audio. Therefore, MIDI itself has no sound. Your keyboard can also send and receive MIDI data. Knowing this, there are 3 basic ways to use your keyboard in any given sequencer.

You can connect the audio output of the keyboard (e.g. a headphone jack or line-out port) to the audio input of your computer (e.g. a line-in port) and record whatever audio is coming out of your keyboard directly into Reaper/Audacity/Mixcraft as an audio track. You play the performance live, or use one of your recorded sequences stored in the keyboard. Problem is, if you play live and mess up, there's no easy editing. If you use one of your recorded sequences, you could edit the sequence, but then you're not really taking advantage of the sequencer on your computer, are you?

You could connect the MIDI out from your keyboard to the MIDI in of your soundcard. Even if you don't have a MIDI in on your soundcard, you're fortunate enough to have a keyboard that transmits MIDI through USB in addition to the standard dedicated MIDI lines. Take your pick of which to connect. After it's connected, you can send MIDI messages directly to Reaper/Mixcraft (not Audacity though, it's an *audio* editor!). Reaper/Mixcraft are “hosts”, which means they can load lots of other programs (“plugins”) within them. Some of these plugins are instruments and some are effects (both commonly in the VST format).
If you have a VST instrument, for example a VST electric piano, you can load it in Reaper/Mixcraft and then use your keyboard to play that virtual instrument using MIDI. There are lots of advantages to this method. First, your sequencer will record MIDI, not audio. This means that if you make a mistake, you can immediately go in and correct the bad note without having to replay the entire thing. You can quantize your performances, which means that all of the notes start perfectly in time with the tempo. In general, there's just lots of flexibility with editing the performance once it's recorded. Also, there are tens of thousands...maybe more...virtual instruments out there that can be loaded into Reaper/Mixcraft. Many of these will be free and many of them will sound better than the sounds that came with your keyboard! The possibilities are endless.

The final connection method is if you want to use the computer sequencer to sequence the sounds that come with your keyboard. To do so, you hook up the MIDI in/out of your keyboard to the MIDI out/in of your soundcard (again, your keyboard allows you to do it just by using the USB cable). You also hook up the audio out of the keyboard to the audio in of the soundcard, as in the first method. Now Reaper/Mixcraft are receiving AND sending MIDI data, and receiving audio. With this setup you can use a Piano Roll View (see below) to program in MIDI “notes”. That MIDI sequence is then sent to your keyboard, where it is used to play one of the keyboard's sounds. That sound is then sent back to Reaper/Mixcraft and is recorded as an audio track. You could also play the keyboard into Reaper/Mixcraft, record the MIDI sequence, have that MIDI sequence sent back out to the keyboard where it will control one of the keyboard's sounds, then send that audio back into Reaper/Mixcraft and record it as audio.

5) No, that's a separate program for a separate purpose.
Essentially, the functionality of Finale is replaced by the Piano Roll View (PRV) in most sequencers. With the PRV you can use your mouse to directly input MIDI notes which tell the virtual instruments you have loaded (or external instruments, like your keyboard) what to play.

6) As I've said, effects and instruments popularly come in a format called VST (sometimes called VSTi if it's an instrument). Reaper/Mixcraft can load these VSTs onto your audio tracks. So let's say you go to this huge archive of VSTs (and other stuff) and pick up a free modulation VST effect. Let's also say that you used the first method above to record yourself playing the piano into an audio track. Now, once that modulation VST is installed, you can go to Reaper/Mixcraft's list of plugins and select your new modulation effect and place it on the piano audio track. BAM! You've got a modulated piano. Isn't music fun!
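To make the MIDI-versus-audio point above concrete, here's a minimal Python sketch (not tied to any sequencer; the byte values come from the MIDI 1.0 spec) that builds the raw note-on/note-off messages a keyboard transmits over its MIDI or USB out:

```python
# A MIDI "note" is just a few bytes of instruction, not audio. These are
# the actual bytes a keyboard sends when you press and release middle C.
NOTE_ON, NOTE_OFF = 0x90, 0x80  # channel-voice status bytes (channel 1)

def note_on(note, velocity, channel=0):
    """Build the 3-byte note-on message a keyboard would transmit."""
    return bytes([NOTE_ON | channel, note, velocity])

def note_off(note, channel=0):
    """Build the matching 3-byte note-off message."""
    return bytes([NOTE_OFF | channel, note, 0])

msg = note_on(60, 64)  # middle C, medium velocity
print(list(msg))       # [144, 60, 64] -- three bytes, no sound
```

Feed those three bytes to a synth or a VST instrument and *it* produces the sound; the bytes themselves are pure instruction, which is why a recorded MIDI take can be edited note by note after the fact.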
  7. Oh cool, I'm going to have to see if I can set something up like that when I get home. I average around 30 tracks for a full song, so I've never bothered trying to set up my 9 faders for levels. On a sidenote, if SONAR would get its ACT together (Active Controller Technology), then it would be a snap. As it is, ACT never works as it should for me.
  8. I actually looked at that controller since it's SOOOO cheap, but it would be a luxury item for me since I'm pretty comfortable with my current setup. What I'd be concerned with is how effectively you can use the faders with more than 8 tracks. With your setup, what do you have to do to change which tracks each fader/rotary is assigned to?
  9. My Axiom 61 has a "null" function that prevents parameter jumps and it works 90% of the time. The Edirol seems to be in the same class of controllers, but it doesn't have something similar?
  10. For me, it completely depends on what I'm doing. That is, I don't have any painstakingly constructed master assignment scheme -- I assign everything on the fly. If I'm tracking, I will usually assign the volume of the armed recording track to a fader. That way I can adjust the volume without the mic picking up the mouse clicking. I also do volume automation, even simple passes, using the faders. If I'm in dBlue Glitch, I have a standard setup where faders control amounts of a parameter, like modulation speed or bit crusher amount. Knobs control the filter/cutoff of each effect, and one knob controls the global Glitch mix so I can easily turn off the effect. For other effects I usually just set up a single fader for automation, like setting one as a Wah amount. For synths, I have two faders that I almost always use for cut/res, but my favorite thing to do is assign aftertouch or modulation to a pad. You get a different style of control over the effect than you could ever get with a knob/wheel/fader.
  11. My decision to get a corner desk was purely practical. I haven't had a studio space that wasn't in my bedroom since high school, and when I bought this desk it was placed as shown here. There were literally 3 inches between it and the bed and no other desk would fit in that space, so it was actually a perfect choice (a choice made after a trip to WalMart with a measuring tape). Corners have their pros and cons. I don't like the difficulty of setting up dual monitors, but I love the accessibility behind the desk. Because of the triangular space between the back of the desk and the wall corner, I can go under and look behind my computer or hardware without moving anything. I'm going to miss that when I finally get a room big enough for a great desk. Oh. I see. You've had faith, and now it's time to put her to rest. Well ♫ This one's for my homies ♪♫ RIP WalMart desk. OMG, that thing breaks constantly on me. Actually, the worst part for me is that the screws for the metal rails rip out of the particle board of the keyboard tray if I put too much pressure on the tray (say, if I slam my fist against the tray after dying for the 12,000th time in Call of Duty). I've had to re-drill holes in the rails 5 times now.
  12. Why wait? Based on my new understanding of djp's CCR fanboyism, I think if you learned Proud Mary immediately I foresee some direct posts in your future...
  13. Dude, I've gotten 3 moves out of this thing so far, up and down many flights of stairs, and the desk just keeps on tickin. Have faith in the Wal*Mart desk, she can do it! One more move! One more move!
  14. Haha! I like how you FORCED the dual monitor setup to work with it. Oh, and how do you not constantly knock that mousepad off whenever the sliding keyboard tray closes?
  15. Haha, speed-of-sound based mic placement for optimal phase alignment on a djembe? Please, for my sake, let's leave that idea on the drawing board. My job is fluid dynamics; I do music to get AWAY from it. Yeah, I've got one that I like, but I really wouldn't know how to use it to determine phase cancellation effects. It's something I'll look into though. And as you all have said, I'll play with the mic placement and make sure I'm getting the fullest sound possible.
  16. I'm sending the pics for you to post, but I just had to post this one first. LOL Level 99, djp, ..., Liontamer, Harmony
  17. That's a good point. When moving the mic though, I really can't tell how much of the change in sound is due to phase issues, so I guess the best bet is just to move it until it sounds "good".
  18. Haha, yeah I did think about sampling my djembe, but I already piss my neighbors off enough with my yelling (a.k.a. singing) and occasional drum playing. I thought about that too, but I'm not so sure it's the best way to go for a djembe in general, primarily because the frequencies that you end up picking up at the top and the bottom are very different. So, for example, when you have a low frequency (and likely high amplitude) wave recorded at the bottom, the same wave recorded at the top will be of a much lower amplitude and will be contaminated heavily with the high frequency content of the initial hand strike. So the phase cancellation should be minimal. But it could still be a significant issue since the mics are separated by almost 3 feet. With that distance, like you alluded to, I think the two signals are enough out of phase that inverting the phase of one of them may end up hurting the audio sum. Take a look at this: the top wave is the top mic, and its view has been magnified vertically 10X. This is the low strike on the djembe that you hear at the beginning of the loop. You can see the much greater high freq content in the top mic and, judging by the 10X zoom, you can also see how big of a difference in amplitude there is between the two signals. Finally, you can also see the slight phase lag on the bottom mic. Put these together and, as said, I'm not sure that inverting the bottom mic is the right thing to do. Any thoughts?
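For a rough sense of the numbers behind that phase lag, here's a back-of-the-envelope sketch (the 0.91 m mic separation and 343 m/s speed of sound are assumed round values, not measurements from the actual setup):

```python
# How far out of phase do two mics ~3 ft apart get at djembe frequencies?
SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature
distance_m = 0.91       # mic separation in meters (~3 ft)

# Travel time for the wavefront to reach the far mic: ~2.65 ms.
delay_s = distance_m / SPEED_OF_SOUND

for freq_hz in (80, 160, 320):
    # Phase offset accumulated over that delay at each frequency.
    phase_deg = (delay_s * freq_hz * 360.0) % 360.0
    print(f"{freq_hz:>3} Hz: {phase_deg:6.1f} degrees out of phase")
```

With this spacing the two signals approach 180 degrees out of phase around 190 Hz, so blindly inverting one mic would fix the sum at some frequencies while making it worse at others, which matches the "not sure inverting is right" conclusion above.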
  19. While that one does look interesting, I saw this challenge earlier because I had come to KVR while in the market for a free, simple audio file viewer, like "eigenmode". Also, last week I woke up with an idea for a guitar VST that was VERY similar to "Armchair Guitarist". Can I claim that my intellectual property has been plagiarized?
  20. OMGawd, that bass is ridiculous. It takes up the ENTIRE soundstage...which, of course, is awesome. So awesome, in fact, that when the main theme first gets hinted at around 0:52, I get worried that no drop-to-lead in the world is going to have enough punch to follow that bass. I was, sadly, right. The drop into 1:09 seemed anti-climactic to me. The lead just isn't strong enough at that point to keep the energy high imo, especially since the bass scales WAY back. By 1:43, when the bass and the lead (an octave higher) come in, I'm feeling it though. Great stuff with the subdued section leading to a much more effective drop at 2:15. Love the added count in there to shake things up a little from the 4-count omm-tss. Bass solo section. Yes. Yes. Yes. I'm not sad at the ending. I'm a little confused though. I thought everything was going so well; we were getting along great. Our relationship was built on a solid foundation of trust, understanding, and a phat bassline, and then you just drop me like that. Was it something I said?
  21. New Mic. Just picked up an Audix D4 for recording the low end of my djembe, and maybe the bongos. I also picked up a D-Vice mic clamp which I thought would attach nicely to the bottom rim of the djembe, but thanks to a finicky design by Audix, no dice on that one. Although the clamp claims versatility, that thing won't attach to anything except metal-rimmed toms or snares, and with how poorly designed it is, I'd bet that even some of those are difficult to attach to :-/ Anyway, I ended up having to construct a bracket (ugh) that would accommodate the D-Vice, then drill into the djembe (ugh) to attach the bracket. Not happy with the D-Vice at all, but I am happy with what I was able to do with it (second pic). The mic sounds great! I just did a little test run (hope I didn't wake the neighbors banging on my drum at 1am) using the D4 on the bottom and my Rode NT1-A on the top. The sample plays through 4 times:

1) top mic (NT1-A) only; raw audio, no processing.
2) bottom mic (D4) only; raw audio, no processing.
3) top+bottom; raw audio, no processing.
4) top+bottom; compression on each mic, EQ'd out the high end on the D4, EQ'd out the low end on the NT1-A, compressed below 80Hz on the D4 to keep rumble in check, then limited and further compressed the whole thing.
  22. Of course the plugin I want to try the most is Mac only
  23. Search bar >> alpha navigation. Plus the mascots hovering over the right side of it look pretty slick. Still wondering about that odd link on the Workshop page though. Is it supposed to be like that?
  24. Simple example of sampling a hardware synth and turning it into a Kontakt instrument. Nothing amazing, but still relevant to the discussion.