Moseph

Members · 2,040 posts · 3 days won

Everything posted by Moseph

  1. The melody is good. Do more of that, and do it earlier in the piece. If you handle the melody well, you might not even need to do much more with the riff.
  2. Nomad Factory Echoes: $49 -- analog delay
Nomad Factory Echoes + MAGNETIC: $99 -- analog delay + analog tape sim
Through May 15.

I haven't tried the Echoes demo yet, but I already have MAGNETIC and use it as a mastering effect on everything I mix (I got it for ~$50 when it was on sale a few months ago, and it's been worth every penny).
  3. I'm not hearing the problem. The clarinet sounds plenty loud to me. If it still needs to be louder, just pull all the other instruments down.

EDIT: Stop looking at the level meters. Almost the only time the meters matter is when they clip. Other than that, just mix with your ears and don't worry about things looking louder or softer than they sound. There are, as Rozovian mentioned, a number of things that influence how the level meters will look -- things such as transient loudness, bass loudness, how EQ is set up, etc. -- and these things may not affect perceived loudness in the same way.

It may also be that the meters in QuickScore are just borked, because when I look at this in Sonar, the clarinet isn't represented on the meter as being louder than the other instruments. Just the opposite, in fact. When all of the level faders are at the zero position (that is, Sonar's playing exactly what you exported from QuickScore), the clarinet peaks below the guitar and the bodhran and in about the same place as the other instruments. This is what I'm looking at:
  4. re: Miroslav crashing You might have better luck using ASIO drivers for the soundcard if they're available from Creative Labs. (If they're not, the ASIO4ALL drivers might work.) There should be an options screen somewhere in your program that will tell you what type of drivers you're currently using and will let you switch the drivers if others are available on the machine. Google tells me that QuickScore can use either ASIO or DirectX for the audio. My guess is that you're currently using DirectX, which is way inferior to ASIO. I'll download the files and take a look at the piece, too.
  5. I did my undergraduate degree at Western Carolina University which has a music/recording tech hybrid program much like Drexel's and has absolutely the best recording studio I've seen at a university. I don't know how it stacks up against Drexel or Northeastern, but it's a smallish state school and is really cheap, at least for in-state tuition (North Carolina). I think out-of-state is also competitive.
  6. Just take the same basic pattern and move it around until it sounds good. For example (^ indicates the next octave up): a-flat c a-flat b-flat e-flat b-flat c^ d-flat e-flat^ g (Nitpicking: I assume you're composing in a piano roll rather than a score editor, so there's no real distinction among different ways of spelling notes. For notation purposes, though, you seem to be in f minor, so it's easier to read if you follow the key and spell it as: f c f g c g a-flat c f g c -- d-flat f g d-flat g a-flat d-flat f g d-flat. )
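Since a piano roll deals in MIDI note numbers rather than spelled note names, "moving the pattern around" is just adding a semitone offset. A minimal sketch (the note numbers and the `transpose` helper are illustrative, not from the original post):

```python
# Transpose a pattern of MIDI note numbers by a semitone offset.
# Pattern here is f c f g as MIDI numbers (F3=53, C4=60, F4=65, G4=67).

def transpose(pattern, semitones):
    """Return the pattern shifted up (positive) or down (negative) by semitones."""
    return [note + semitones for note in pattern]

pattern = [53, 60, 65, 67]            # f c f g
up_minor_third = transpose(pattern, 3)  # a-flat, e-flat, a-flat, b-flat
print(up_minor_third)                 # [56, 63, 68, 70]
```

Repeating the same call with different offsets gives you each version of the figure to audition by ear.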
  7. One nice feature would be to have the speed measured in bpm instead of ms.
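The bpm/ms relationship is a one-line formula either way: a quarter note at a given tempo lasts 60000 / bpm milliseconds. A quick sketch of both conversions:

```python
def bpm_to_ms(bpm, note_fraction=1.0):
    """Milliseconds per note at the given tempo.
    note_fraction: 1.0 = quarter note, 0.5 = eighth note, 2.0 = half note."""
    return 60000.0 / bpm * note_fraction

def ms_to_bpm(ms):
    """Tempo whose quarter note lasts the given number of milliseconds."""
    return 60000.0 / ms

print(bpm_to_ms(120))       # 500.0 -- a quarter note at 120 bpm is 500 ms
print(ms_to_bpm(500))       # 120.0
```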
  8. Very cool. Here it is in action, running through SONAR with VSL strings.
  9. I'll have to read a little more about the MCI, but I think the type of output I'm talking about would have to be accessed through the more advanced MIDI services. Basically, you'd be querying the system to find out what MIDI output devices are present, then allowing the user to select the MIDI device they want to send to. Sorry I can't be more specific, but I've never coded anything that deals with MIDI beyond simple GlovePIE scripts. If I get a chance, I may look at your code and see if I can implement something.

Edit: To clarify, what the program does right now is output through the default Windows MIDI player -- called the Microsoft GS Wavetable Synth, I think -- which is probably the hardcoded behavior for MCI MIDI output. Depending on how the system is set up, there may be other MIDI output devices to send through, such as a soundcard's MIDI out or a MIDI loopback program, so what would need to happen is for the program to find these other output devices and allow you to select which one to use.
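The selection step described above is plain enumerate-and-choose logic, separate from the OS calls that produce the device list (on Windows that list would come from the multimedia MIDI services, e.g. midiOutGetNumDevs/midiOutGetDevCaps). A minimal sketch of just the selection logic, with a made-up device list:

```python
def choose_midi_output(devices, preferred=None):
    """Pick a MIDI output device by partial name, falling back to the first one.

    devices: list of device-name strings as reported by the OS.
    preferred: substring to match (e.g. a loopback port name); None = default.
    """
    if not devices:
        raise RuntimeError("no MIDI output devices found")
    if preferred is not None:
        for name in devices:
            if preferred.lower() in name.lower():
                return name
    return devices[0]  # the default device, e.g. the GS Wavetable Synth

# Hypothetical device list for illustration only:
devs = ["Microsoft GS Wavetable Synth", "loopMIDI Port", "SB Audigy MIDI Out"]
print(choose_midi_output(devs, "loopmidi"))   # loopMIDI Port
print(choose_midi_output(devs))               # Microsoft GS Wavetable Synth
```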
  10. A Google search turns up several C# MIDI libraries that look capable of providing output, although I haven't looked at any of them in detail.
  11. I'd think it would be fairly straightforward to take drake's version and add output to a MIDI channel, assuming you could find the correct libraries for C#. Then it would just be a matter of using a MIDI yoke/loopback to feed it to a sequencer to record it.
  12. I believe the US copyright registration system only requires that the work exist in a fixed form, which means that it can be registered either as a score or as a recording.
  13. Granted, but getting sued is expensive and a hassle regardless of whether you ultimately win, and using a name that's already widely recognized as being related to someone besides yourself dramatically increases the chance of lawsuits, justified or unjustified.
  14. Not necessarily. For example, the band The Postal Service almost got sued by the US Postal Service, but they came to an out-of-court agreement.
  15. Usually a combination of note velocities and expression control (probably CC11, depending on your sample library) will be your best bet for getting natural-sounding dynamics. Usually I use velocity to control basic level, but since velocity layers don't always transition smoothly, I use expression-controller automation to fine-tune things and glue dynamics together at the phrase level rather than at the individual-note level. I have a default value for the expression controller that I come back to after each phrase -- this helps me keep the overall level of the instrument in generally the same place so the dynamics don't vary too wildly. Some libraries let you link a velocity crossfader to a MIDI CC so you can draw or record CC curves to change velocities instead of editing individual velocities (sometimes this sounds good, sometimes not, since crossfading velocities makes multiple samples play on top of each other -- basically bleeding the velocity layers over one another -- which can sound odd for solo instruments).

For achieving consistency in levels across the entire arrangement, it may be a good idea to come up with a rubric of sorts that tells you what different dynamic levels in the various instruments "look" like. For example, you may decide that mp in the strings is what you get when your velocities are around 20 and the expression controller is around 90, and p is when velocities are around 20 and expression is around 50. This may help you think more consistently about levels over the whole piece, because it will allow you to use objective values to determine dynamic levels rather than simply doing what kind of sounds right and then discovering that it sounds way different from something you did somewhere else.

The unfortunate truth, though, is that no matter how you go about handling dynamics and sample use, the editing process is extremely time-consuming and usually involves some level of attention given to every single note in the piece. It helps to have a clear idea of how you're going to approach it so you don't end up wasting time redoing things, but ultimately you just have to decide how much effort you're willing to put into it and then bite the bullet and do it.
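A rubric like the one described above is easy to keep around as a lookup table. A sketch with made-up values (the actual numbers depend entirely on your sample library; only the p and mp entries echo the example in the post):

```python
# Hypothetical dynamics rubric: dynamic marking -> (note velocity, CC11 value).
# The mf and f values are illustrative placeholders, not from any real library.
RUBRIC = {
    "p":  (20, 50),
    "mp": (20, 90),
    "mf": (60, 90),
    "f":  (90, 110),
}

def dynamic_to_midi(marking):
    """Look up the velocity and expression (CC11) values for a dynamic marking."""
    return RUBRIC[marking]

velocity, expression = dynamic_to_midi("mp")
print(velocity, expression)   # 20 90
```

Keeping one table per instrument section (strings, winds, brass) would extend the same idea across the whole arrangement.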
  16. Yeah. Ideally, you'd be able to turn notes on and off to get the scale you wanted, program behaviors and tempos for the blocks, set each block or note to output on a specific channel and at a specific velocity, and so forth. There are so many things that could be done with this.
  17. It would be really cool to have a plugin like this that generated MIDI notes. GO GO DISTORTION!
  18. And is also finally being included in these deals that they run so often. This makes me happy. Eventually I'll have money.
  19. This is good. I think the arrangement itself works well. Most of my comments have to do with performance, mix, and sample use, which are really difficult things to get right in orchestral music done with sample libraries -- they're often more difficult than the arranging itself. It should help to listen to recordings of live orchestras -- not other people's sampled orchestra pieces -- to get an idea of how the instrument balance and dynamics should work. Anyhow, comments:

The opening notes strike me as a bit odd. The piano melody strikes softly on a beat at the very start, and then the accompaniment comes in way loud off the beat. Throughout the intro, in fact, the accompaniment seems too loud in comparison with the melody.

The added beat (5/4 or 3/4 measure) at 0:23 to 0:26 throws me a bit. I'm not at all opposed to mixed meters, but this seems like a place where taking liberties with the tempo to slow things down at the end of the phrase would feel more natural than adding a beat.

The piano sounds mechanical. For example, all of the bass notes in the section from 0:50 have the same velocity and are exactly the same length, and they sound metronomic.

The string samples have weird swells in them when they're held. See if you can adjust that in the sampler so the sustain is more uniform -- you can automate swells manually with the volume or expression control (probably CC7 and CC11, depending on how your sample library is set up) if you need them in specific places.

At 0:26 and again at 2:02, the strings are rushing one of the melody notes (making the rhythm sixteenth--dotted eighth instead of eighth--eighth). I'm assuming this is a mistake that got copy/pasted, since this rhythmic figure doesn't occur anywhere else.

Related to the mechanical piano and the string issues, the performance could stand to be humanized. There's no dynamic variation anywhere -- velocities sound pretty uniform everywhere, and it sounds like all of the instruments are fighting to be loudest. In general, I think you should back way off on the velocities.

The ending is abrupt, as has already been mentioned.
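Automating a swell with CC7 or CC11, as suggested above, amounts to drawing a ramp of controller values. A minimal sketch that generates a linear ramp (the resolution and the start/end values are arbitrary choices, not from the post):

```python
def cc_ramp(start, end, steps):
    """Generate a linear ramp of MIDI CC values (0-127), inclusive of both ends."""
    if steps < 2:
        return [end]
    span = end - start
    return [round(start + span * i / (steps - 1)) for i in range(steps)]

# Swell from a quiet CC11 value up to a louder one over 9 controller events:
print(cc_ramp(60, 100, 9))   # [60, 65, 70, 75, 80, 85, 90, 95, 100]
```

In practice you'd write these values as CC events spread across the duration of the held note, with more steps for a smoother swell.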
  20. I think one of the advantages of hosting streamable previews of OC ReMixes on YouTube is that listening to them doesn't use OCR's bandwidth the way an embedded mp3 player would.
  21. I'm reading Sartre's Being and Nothingness, mostly while on lunch break at work. There's nothing like dense existential philosophy to take your mind off of a lousy job.
  22. I played around with recording improvisation sessions for a while, and the categorization problem was what finally killed it for me. If I had an idea and wanted to come back to it days later, I had to find it again. I decided it was more straightforward just to write down things I played that seemed like good ideas at the time.
  23. Sweet. Are you going to try to do some kind of custom cabinet art or just leave it as-is?