
Humanizing



Hi all, just wanted some tips on how to humanize a track.

Please listen to this sample of a project I'm working on, and let me know how it sounds.

I have spent a lot of time trying to make it sound as real as it can be.

I'm using Reason 4 and the official Reason Piano package (it seems pretty good so far, though I haven't gone through all of it yet).

If you could leave feedback, advice and so on that'd be great. Thanks for your time!

My main issue is that I'm worried about overdoing the reverb. Is this a problem in the sample?



The first step would be making sure you vary note velocities, which it sounds like you've done here. As far as solo piano pieces go, short tempo changes to emphasize certain parts, flourishes, and less repetitive patterns will go a long way.

Right now, the piece is played at a straight tempo without a lot of expressiveness, which makes it feel robotic. You could probably work on the note velocities even more, too. Also, it's drowning in reverb.
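
If you want to rough in the velocity work programmatically before fine-tuning by ear, here's a minimal sketch of the idea in Python using the mido library; the file name, the downbeat emphasis, and the jitter range are placeholder assumptions, not anything Reason-specific.

```python
# Minimal sketch: vary note velocities so bar downbeats get a little emphasis
# and everything else gets a small random offset. Assumes a MIDI file
# ("piano.mid" is a placeholder name) exported from the sequencer.
import random
import mido

mid = mido.MidiFile("piano.mid")
ticks_per_beat = mid.ticks_per_beat

for track in mid.tracks:
    abs_time = 0
    for msg in track:
        abs_time += msg.time  # delta times -> running absolute time in ticks
        if msg.type == "note_on" and msg.velocity > 0:
            beat_pos = (abs_time % (ticks_per_beat * 4)) / ticks_per_beat
            emphasis = 8 if beat_pos < 0.1 else 0   # nudge bar downbeats up a little
            jitter = random.randint(-6, 6)           # small random variation
            msg.velocity = max(1, min(127, msg.velocity + emphasis + jitter))

mid.save("piano_humanized.mid")
```

A pass like this only gives you a starting point; the emphasis still has to be shaped by hand afterwards.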

Finally, hasn't this already been done?


Something I find myself doing is to record with a midi keyboard and then fix the timings (manually, not quantizing) so I can keep the human qualities of the performance but still make it sound like it's played by someone who can actually play the keyboards. Then again, I try to stick to synths that don't need to emulate a real performance. :D

There are, as Nutri already stated, tempo changes. With solo instrument pieces and pieces without percussion, the tempo wouldn't stay the same throughout. Velocity and timing are never exact in a real performance, and a random humanization function doesn't take into account a performer's tendency to emphasize some notes through timing or velocity (adding to the rhythm). Swing/shuffle sounds more human when quantized because it's not evenly quantized.

Looking out for repeated samples, or anything with a similarly repetitive sound, is also a good idea. You've got a lot of timing edits between the low notes and high notes when they're played near-simultaneously, but the timing difference seems to be the same for every instance, which makes it sound robotic. That timing discrepancy is otherwise a good idea, but doing it the same way every time makes it stand out too much. Just imagine hearing the exact same crash or the exact same strum noise over and over. It doesn't sound realistic.
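
To make the varied-offset idea concrete, here's a minimal sketch in Python with mido that nudges every note-on by a slightly different amount instead of the same fixed offset; the file name and tick amounts are assumptions, and it's deliberately rough.

```python
# Minimal sketch: give each note-on its own small, different delay so no two
# "simultaneous" pairs land with the identical offset. Assumes one track and
# a placeholder file name.
import random
import mido

mid = mido.MidiFile("piano.mid")
track = mid.tracks[0]

# Convert delta times to absolute times so note-ons can be nudged independently.
events = []
abs_time = 0
for msg in track:
    abs_time += msg.time
    nudge = random.randint(0, 10) if msg.type == "note_on" and msg.velocity > 0 else 0
    events.append((abs_time + nudge, msg))

# Re-sort by the new absolute times and convert back to delta times.
# (Rough: very short notes could end up with the off before the on.)
events.sort(key=lambda pair: pair[0])
new_track = mido.MidiTrack()
prev = 0
for when, msg in events:
    msg.time = when - prev
    prev = when
    new_track.append(msg)
mid.tracks[0] = new_track

mid.save("piano_varied_timing.mid")
```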

For some instruments (not the piano, though), the volume changes during the note. Various blown instruments (brass, reeds, woodwinds, whatever) get their volume from how fast/much air is blown through them, so a real performance on those has less to do with velocity and more to do with expression, a MIDI control change (CC#11) that controls the intensity of the note (if routed that way, which many virtual instruments are). Same with bowed strings and other sustained instruments, though then it's not about air pressure but about their respective applicable force. If CC#11 doesn't work for you, CC#7 (volume) or CC#1 (modulation) might. I suggest trying expression first, since volume can interfere with mixer settings and doesn't use the instrument's other intensity-changing features. Blah blah blah.
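
As an illustration of the expression idea, here's a minimal sketch that writes a CC#11 swell under one sustained note using mido in Python; the patch number, tick resolution, and curve shape are all assumptions.

```python
# Minimal sketch: a CC#11 (expression) swell under a single sustained note,
# the kind of intensity shaping described above. The patch and output file
# name are assumptions; whether CC#11 does anything depends on the routing.
import mido

mid = mido.MidiFile(ticks_per_beat=480)
track = mido.MidiTrack()
mid.tracks.append(track)

track.append(mido.Message("program_change", program=73, time=0))   # flute-ish patch
track.append(mido.Message("note_on", note=72, velocity=80, time=0))

# Ramp expression up over one beat, then back down over the next.
steps = 16
for i in range(steps):
    value = int(40 + (87 * i / (steps - 1)))        # 40 -> 127
    track.append(mido.Message("control_change", control=11, value=value,
                              time=480 // steps))
for i in range(steps):
    value = int(127 - (67 * i / (steps - 1)))       # 127 -> 60
    track.append(mido.Message("control_change", control=11, value=value,
                              time=480 // steps))

track.append(mido.Message("note_off", note=72, time=0))
mid.save("expression_swell.mid")
```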

Also, stop hiding your work in reverb. :P It doesn't make the performance more human. (just the sound... a little.)

edit: Sbeast is right, some instruments, like strings, can benefit from some random detuning... just don't overdo it. :D
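
Along the same lines, here's a minimal sketch of the random-detuning idea using pitch-wheel messages via mido; how far a given pitch value actually detunes depends on the instrument's bend range, so the amounts and file name are assumptions.

```python
# Minimal sketch: a slight, different pitch-wheel offset before each note.
# Placeholder notes and amounts; tune the range to the instrument's bend range.
import random
import mido

mid = mido.MidiFile(ticks_per_beat=480)
track = mido.MidiTrack()
mid.tracks.append(track)

for note in (60, 64, 67, 72):                       # a simple placeholder arpeggio
    detune = random.randint(-120, 120)              # a tiny fraction of the bend range
    track.append(mido.Message("pitchwheel", pitch=detune, time=0))
    track.append(mido.Message("note_on", note=note, velocity=90, time=0))
    track.append(mido.Message("note_off", note=note, time=480))

mid.save("detuned_strings.mid")
```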


Thanks guys :)

I should have mentioned that this is just the piano piece of the track; there will be a beat included as well as other instruments. As for this already being done, this is going to be a mash-up of Someday the Dream Will End from FFX as well as the Chrono map theme 8O

After taking a break and listening again, I agree that the reverb should be cut back a bit.

I'll go back and change the velocities a bit more, though most notes have their own separate volume already. I guess I just need to make the differences a bit more noticeable?

Now, about timing: how can I mix this up a bit while keeping within the beat? I assume I'll have to edit each note rather than just tinkering with the tempo? I've never really tried something like this before, and I really would like to thank you all for responding to me so fast ;-)


Maybe it's just me, but it doesn't SOUND like a pianist wrote this piece. It sounds like it would be awkward to play.

The process of humanization is often approached (ironic as it may seem) mechanically or automatically.

Humanization means thinking about the imagined human element. How would a human play this piece? Their imperfection reveals their humanity, yes, but there's more to it. How do they feel about the piece? What do they want to change about it? If it's a different person, maybe their goals for the piece aren't the same as yours. You really need to put yourself in the shoes of an imaginary person to do this to full effect.

There is a flow to the fingers when playing piano. When pianists write piano music, this understanding of finger flow is built in, and the result is more natural.

Imagine the hands, what they are doing, how is this played by the left hand, how by the right? Think about the movement of the arms, the feet, the lean of the player as they move right or left.

That's humanization.


Maybe it's just me, but it doesn't SOUND like a pianist wrote this piece. It sounds like it would be awkward to play.

The process of humanization is often approached (ironic as it may seem) mechanically or automatically.

Humanization means thinking about the imagined human element. How would a human play this piece? Their imperfection reveals their humanity, yes, but there's more to it. How do they feel about the piece? What do they want to change about it? If it's a different person, maybe their goals for the piece aren't the same as yours. You really need to put yourself in the shoes of an imaginary person to do this to full effect.

There is a flow to the fingers when playing piano. When pianists write piano music, this understanding of finger flow is built in, and the result is more natural.

Imagine the hands, what they are doing, how is this played by the left hand, how by the right? Think about the movement of the arms, the feet, the lean of the player as they move right or left.

That's humanization.

It's also a really abstract explanation that would make almost any n00b to piano go:

"Uh... what did you say?"

In my opinion, anyway.


It's also a really abstract explanation that would make almost any n00b to piano go:

"Uh... what did you say?"

In my opinion, anyway.

Writing for an instrument you can't personally play is a process of abstraction--you have to pull yourself back and objectively examine what your music is like on that instrument.

Everything else is about psychology.

Since you're thirsty for an example, I'll give you one, but it's not about playing piano, it's about another piece I was working on that featured a solo flute in an orchestra.

I didn't have a flute, I didn't have an orchestra, just my computer and my trusty multi-sample virtual instruments. Up until the moment where the flute came into the piece, the strings were just playing a sustain, and they were the only instruments playing at the time. When the flute came in, the basses and celli were supposed to come in with a low pizzicato note. What I imagined was happening was that the conductor was holding a fermata, awaiting the flautist's entrance into the piece. In my mind, there would be a breath of anticipation that slowed the flute entrance; the player was psyching themselves up to prepare for the solo, and the conductor was watching the flautist in order to cue the rest of the instruments, who were watching him. When the flute player came in, the conductor was just behind the flute player, making the bass and celli pizzicati fall behind the beat. During the opening part of the solo, the conductor mostly followed the lead and expression of the flautist, thereby making the rest of the instruments late, but as more and more instruments started joining in with the accompaniment, the orchestra became more cohesive. The flute started blending into the ensemble, and eventually everyone was following the conductor together by the end of the section.

Humanization is about you treating your virtual instruments like little players. It is treating them like humans.


Well here. Let me fire off this question.

What would you guys suggest as the easiest way to go about humanizing something?

For example, when I write my music I write it in a sheet music program and import a MIDI into Fruity Loops. I'm not certain that's the most economical way to go about it, but it's worked so far. Were you in my shoes (for the hypothetical scenario's sake), what would you recommend as the most efficient approach? Because currently I'm looking at manually picking each and every individual note and tweaking the duration, volume, and tempo. And on something that's composed for about four harpsichords, that's looking to be an intimidating pain.


Because currently I'm looking at manually picking each and every individual note and tweaking the duration, volume, and tempo. And on something that's composed for about four harpsichords, that's looking to be an intimidating pain.

That's ... pretty much what I was going to recommend doing (I've done this before for an entire orchestra; yes, it's a pain). One thing that may help a lot: I don't know if FL has this feature, but some DAWs let you record a tempo track with MIDI input and then sync the project to that tempo track, which basically lets you "perform" the tempo and then apply that performance to the whole project. On occasion, I've gone as far as to videotape myself conducting the piece and then base the tempo track on the conducting video. It's extremely helpful for humanization.

Also, for workflow in general, I only begin with a sheet music program if I know that I'll need sheet music for the piece sometime in the future. I find that it's much easier and quicker to get the humanization and samples working properly when I write in a piano roll view instead of importing from Finale.


Well, for realism, you can't do much with velocity if you're working with harpsichords. I would suggest playing each of the parts on a midi keyboard and tweaking them to get rid of mistakes and tidying up sloppy timings and stuff.

You could also edit it manually, but you should still record a short bit of it to see what it looks like when performed. See how far apart "simultaneous" notes really are, and see how they differ in length (and preferably, try it on a real harpsichord to hear how it sounds, if you haven't already). See how that sounds, and compare it to how it sounds when it's just raw sequenced notes.

As brought up by dannthr, think of how it would be played for reals. Everything the rest of us said is technique. Figure out the psychology behind it, the different moods you can get from playing the same thing with long or short notes, fast or slow, increasing or decreasing in speed, etc...

TL;DR? Midi keyboard and "damage control". :D


I don't know if FL has this feature, but some DAWs let you record a tempo track with MIDI input and then sync the project to that tempo track, which basically lets you "perform" the tempo and then apply that performance to the whole project.

ACK, NEVER MIND. But I'm pretty sure FL Studio has a feature that syncs the MIDI data to the project tempo. Which is much better in my opinion, because it makes more sense to match the performance to what was intended to be written than matching the writing to the performance.


ACK, NEVER MIND. But I'm pretty sure FL Studio has a feature that syncs the MIDI data to the project tempo. Which is much better in my opinion, because it makes more sense to match the performance to what was intended to be written than matching the writing to the performance.

In SONAR, I can draw curves and dips in the tempo map, which makes accelerating and slowing down easy, as well as rubato-style tempo where the tempo almost becomes part of the performance expression.


In SONAR, I can draw curves and dips in the tempo map, which makes accelerating and slowing down easy, as well as rubato-style tempo where the tempo almost becomes part of the performance expression.

That can be done in FL Studio as well, but it needs to be done manually in the software; it can't change the tempo according to the MIDI data received.


That can be done in FL Studio as well, but it needs to be done manually in the software; it can't change the tempo according to the MIDI data received.

If I understand what you're saying properly then yeah. The midi data will take precedence over FL Studio and all tempo changes will remain true to what the midi tells it to do. And that also applies to volume changes.


Another question. Say I've got something with several chords in it. Should I spend time trying to make those notes not sound at exactly the same time, and instead put an ever-so-minute delay on some of the notes? Or is that a bit too nitpicky? When the concept of humanizing was introduced to me I'd never thought about it, so I listened to harpsichord music over and over, and some people would play chords and hit the lower notes ever so slightly sooner than the higher ones, and some vice versa. What I want to know is: should I be spending time adding that same flair to things as well?


Another question. Say I've got something with several chords in it. Should I spend time trying to make those notes not sound at exactly the same time, and instead put an ever-so-minute delay on some of the notes? Or is that a bit too nitpicky? When the concept of humanizing was introduced to me I'd never thought about it, so I listened to harpsichord music over and over, and some people would play chords and hit the lower notes ever so slightly sooner than the higher ones, and some vice versa. What I want to know is: should I be spending time adding that same flair to things as well?

That's what I do, but usually for the whole orchestra. Very important also on guitar strums, harp runs, etc.

The highest level of achievement for a virtual performance is imperfection, while the highest level of achievement for a human performance is perfection--irony.

PS: There is NO "too nitpicky" when it comes to virtual instrument programming.
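
For anyone scripting this rather than dragging notes by hand, here's a minimal sketch of the chord-rolling idea in Python with mido; the offsets, velocities, and voicings are placeholder assumptions.

```python
# Minimal sketch: stagger the note-ons in each chord by a few ticks,
# sometimes bottom-up, sometimes top-down, so no two chords land identically.
import random
import mido

mid = mido.MidiFile(ticks_per_beat=480)
track = mido.MidiTrack()
mid.tracks.append(track)

def rolled_chord(notes, length=480):
    """Append one chord with small, randomized onset offsets."""
    order = list(notes)
    if random.random() < 0.5:
        order.reverse()                              # occasionally roll top-down
    offsets = sorted(random.randint(0, 12) for _ in order)
    running = 0
    for offset, note in zip(offsets, order):
        track.append(mido.Message("note_on", note=note, velocity=85,
                                  time=offset - running))
        running = offset
    track.append(mido.Message("note_off", note=order[0], time=length - running))
    for note in order[1:]:
        track.append(mido.Message("note_off", note=note, time=0))

for chord in ([48, 55, 60, 64], [50, 57, 62, 65], [43, 55, 59, 62]):
    rolled_chord(chord)

mid.save("rolled_chords.mid")
```

Randomizing both the spread and the direction is what keeps any two chords from sounding like copies of each other.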


ACK, NEVER MIND. But I'm pretty sure FL Studio has a feature that syncs the MIDI data to the project tempo. Which is much better in my opinion, because it makes more sense to match the performance to what was intended to be written than matching the writing to the performance.
In SONAR, I can draw curves and dips in the tempo map, which makes accelerating and slowing down easy, as well as rubato-style tempo where the tempo almost becomes part of the performance expression.
That can be done in FL Studio as well, but it needs to be done manually in the software; it can't change the tempo according to the MIDI data received.
If I understand what you're saying properly then yeah. The midi data will take precedence over FL Studio and all tempo changes will remain true to what the midi tells it to do. And that also applies to volume changes.

I'm no longer sure exactly what everyone's talking about.

To be more specific, I'm talking about Sonar's Fit Improvisation feature which lets you record a click track by hitting a key on the MIDI keyboard over and over again to mark the beat. Sonar then uses the MIDI track you recorded to generate an automated tempo for the project, so the actual click track for the project will reflect the tempo changes that you entered with the MIDI keyboard. The result of this process can also be edited manually, since it just shows up as tempo automation in the tempo view.
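
Outside Sonar, the same idea can be roughed out by converting the gaps between tapped beats into tempo events. Here's a minimal sketch in Python with mido; the tap times and file name are made-up assumptions.

```python
# Minimal sketch: take the times of tapped beats (here just a list of seconds,
# e.g. measured from recorded note-ons) and turn each gap into a set_tempo event.
import mido

tap_times = [0.0, 0.52, 1.01, 1.55, 2.12, 2.66]     # placeholder tap times in seconds

mid = mido.MidiFile(ticks_per_beat=480)
tempo_track = mido.MidiTrack()
mid.tracks.append(tempo_track)

delta = 0
for earlier, later in zip(tap_times, tap_times[1:]):
    beat_seconds = later - earlier
    bpm = 60.0 / beat_seconds
    tempo_track.append(mido.MetaMessage("set_tempo",
                                        tempo=mido.bpm2tempo(bpm),
                                        time=delta))
    delta = 480                                       # next change lands one beat later

mid.save("tapped_tempo_map.mid")
```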


I'm no longer sure exactly what everyone's talking about.

To be more specific, I'm talking about Sonar's Fit Improvisation feature which lets you record a click track by hitting a key on the MIDI keyboard over and over again to mark the beat. Sonar then uses the MIDI track you recorded to generate an automated tempo for the project, so the actual click track for the project will reflect the tempo changes that you entered with the MIDI keyboard. The result of this process can also be edited manually, since it just shows up as tempo automation in the tempo view.

I prefer to just edit manually in tempo view--though I also do a lot of mouse work for my instrumental performances as I usually get my performance tweaks down as I write.


That's what I do, but usually for the whole orchestra. Very important also on guitar strums, harp runs, etc.

The highest level of achievement for a virtual performance is imperfection, while the highest level of achievement for a human performance is perfection--irony.

PS: There is NO "too nitpicky" when it comes to virtual instrument programming.

Guitar isn't a concern for me since I actually play the guitar. As for other instruments, while I do play them, I'm certainly incapable of playing what I write, unless I decide all those other notes I hit along the way need to be present for artistic expression's sake.

To be more specific, I'm talking about Sonar's Fit Improvisation feature which lets you record a click track by hitting a key on the MIDI keyboard over and over again to mark the beat. Sonar then uses the MIDI track you recorded to generate an automated tempo for the project, so the actual click track for the project will reflect the tempo changes that you entered with the MIDI keyboard. The result of this process can also be edited manually, since it just shows up as tempo automation in the tempo view.

I'm pretty new to Sonar so I'm not certain what you're referring to. But I'll look for it next time I boot the program up.

In any case I do appreciate the info, guys.
