Everything posted by timaeus222

  1. Ironically, I'm pretty set in my belief that this wasn't as loud as "mainstream" dubstep (AKA SoundCloud dubstep and YouTube dubstep), and that's part of why I said the execution wasn't super strong. That said, I liked the arrangement (the rap performance in particular) despite the straightforward approach, and the production gripes I had were actually very minor. Rap and dubstep *can* go together, because guess what? CHALLENGE ACCEPTED. It's possible; it just takes the right person to tackle it, plus my constant nudging of my sister to go request the stems from her band. Also, it's kinda funny you liked the first few seconds, because those *were* the sampled parts, so you'd effectively be saying original > this remix. And about the standards... they tapered off in 2007, last I heard, but they were rising before then, so 10 years ago this really would have passed. Highly probable. Yes, old OC ReMixes *generally* were more captivating for me in terms of the overall package, but hey, maybe the "more likable ones" (MUCH emphasis on those quotes) are still in the queue.
  2. Yeah, I'll take you up on being co-director/assistant director if you'd like.
  3. They advertise the size based on the uncompressed samples, perhaps to demonstrate how good their compression algorithm is going from WAV to NKI, and taking the time to compress your own WAV samples into an NKI for Kontakt really does help the load time.
  4. Technically, something from 13 years ago doesn't quite match today's standards, but as for the 2~7 minute length for a submission, that really is just a recommendation based on attention spans and file sizes for the sake of bandwidth.
  5. I technically agree with the heart of this, but I would rather take every interview seriously than treat it as mere experience, and then politely decline whichever extra offers I prefer less if more than one happens to go well.
  6. I actually meant to remix one of my sister's band's rap songs with some dubstep, but these guys beat me to the punch! (I'm completely serious) Ironically, that song was called "Break Ya Spine", IIRC (read the lyrics; you'll see why it's kinda ironic). Not sure I'm much of a fan of this though. It's not like I'm against these genres, but the execution felt predictable and a bit underwhelming.
  7. Some clipping here and there, though still as great and detailed as I remember.
  8. So then, does it have anything to do with the variable (i.e., unpredictable) speakers used with TVs, or...? I mean, you don't really know what speakers people will use with their TV. o.o; At this point, we're just asking you questions about your question.
  9. The rhythm guitar sounds rather narrow. Did you think about recording a second take and double-tracking it? (A real second take is the proper fix, but see the quick widening sketch at the end of this list for a placeholder trick.) Some other instruments in the background sound buried and off-rhythm in a weird way. Also, the lead guitar seems casually played, as if you were just playing around. o.o;
  10. Sounds great! I think you've improved since I last heard your orchestral work.
  11. Great analogy. Yeah, that's what I was suggesting. You could try making it less full than usual, and you and Nase (and ecto, who agreed and made a similar topic about this earlier) have a correspondingly logical reason for it: leaving room for sound effects. The challenge would then be to scoop the right frequencies so that the music doesn't sound like it's overcompensating on its own and also sits complementarily with the sound effects. That just takes careful locating of the frequencies that create the particular qualities of each sound you want (there's a small EQ-scoop sketch at the end of this list). So for you, Pete, I think grandeur can still be done; it would then just be about finding a good balance in the scooping gain once you find those frequencies. Since you mainly use FL Studio, I believe, it should be easier than in some other DAWs thanks to the visual EQ you have.
  12. This could just be because of the audio system, though. At least for me, my headphones hardly give me any ear fatigue. If your system is supposedly minimizing ear fatigue and it's not doing that, either it actually isn't functioning as intended or, yeah, it's whatever you're working on that's causing it. (My guess is that thin resonances that are there, whether or not you consciously hear them, are a cause; there's a resonance-spotting sketch at the end of this list. Oftentimes I tame resonances in FM bell sounds, for example, and their waveforms go from looking 'uncompressed' [dynamically all over the place] to 'controlled', yet no actual compressors were used.)
  13. I believe zircon did something kinda like this for Fittest, after listening closely to the mixing. For example, the strings sound great in Photosynthesis and the bass sounds great in Morsecode, though a little reserved in loudness, I think. So maybe whatever reason they have has something to do with not "distracting the listener" from the game, which could mean holding back on the fullness of the mixing?
  14. One-day Easter sale this Sunday: 15% off everything on ADSR Sounds, including the Zebra soundbank I made a while back.
  15. Textures were a tad grating, and I do agree with the repetition, but overall I did like the energy and the harmonic variations from the arpeggios.
  16. It would be quite RAM-intensive to just have a new instrument instance per note. If you can, sure? It'd be a lot of work, though. I would actually just rhythmically offset each note in the chord, write corresponding velocities depending on the pitch of each note, and layer on articulations (there's a small MIDI offset/velocity sketch at the end of this list). So, for example, perhaps I'd layer contrabass harmonics with contrabass legato sustains, since the harmonics can go as low as 40~60 Hz and the sustains can be high-passed above that range and still sound strong at around 60~200 Hz or so, forming a strong sustaining bass presence (you can route both to a single bus and EQ each before it reaches the bus. Signal chains!).
      Sidenote regarding my own laziness: although I wouldn't always recommend the following, sometimes I link the expression controllers (CC11 in EWQL and many other orchestral libraries) of similar articulations of similar instruments to the same automation clip so they stay in sync; that way the swells are quickly editable and not wildly different. The rhythmic offset of the notes would offset where each instrument sits within that automation, but the automation itself would still be consistent. The reason I'd otherwise recommend against it is a small thing: for more realism, you'd write separate automation for each instrument's expression controller, because real players aren't synchronized robots. But if the difference in the results seems small to you, either way could be fine; it's up to you and the context (i.e. if the strings are not exposed, it's less noticeable). It's like manipulating statistical data to make reasonable approximations and simplify your calculations.
      As for writing with a MIDI keyboard, it really depends on which one you have and how much memory you have. Its velocity response and your computer's response (i.e. the delay between your note and the recording of the note on-screen) are rather interrelated. I usually just sequence strings, but if you have the memory, yeah, a good MIDI keyboard is worth getting for this.
  17. I can see what you mean. The full-section patches are already set up to give a limited set of timbral variations, but combining a bunch of separate related patches could even triple the variations in timbre, more realistically emulating the interaction of each instrument's sound waves (similar to sympathetic resonance between the vibrating bodies of string instruments).
  18. What filetype is it? If it's AVI... https://answers.yahoo.com/question/index?qid=20100402010303AA5eRT0
  19. 1. What was your first encounter with VGM? What kind of effect do you remember it having on you? I had many before I actually really paid attention to it. The first one I paid attention to was MegaMan Battle Network 6. I wasn't a remixer yet, so all it did was inspire me to do game music modding to import external songs into that game (and tweak instruments to compensate for the newly assigned voicegroups).
      2. At what point do you remember considering yourself a VGM fan (or OCR fan) in relation to your first VGM experience? I was truly a VGM fan when I heard my first OC ReMix from DarkeSword, called "Beamsabre Beat ZERO v2", in 2007.
      3. How did you hear about OCR? In 2009 I went back to OCR after hearing DarkeSword's remix, to download two remixes per week (this was back when OCR published remixes often!) to use in a Pokemon Crystal playthrough on YouTube until I got through all 39 parts. So I "re-heard" about OCR, but I don't think I really thought much about it the first time I was there for, like, a minute.
      4. What do you feel would be some non-musical examples (fan art, videos, interpretive dance, horse racing, rock-throwing) of the OCR society? There's a lot... TheGuitahHeroe golfs; I do martial arts, graphic design, web design, and video production; and we have plenty of graphic artists. I dunno, there's a bunch.
      5. How has VGM and OCR affected your life? Everything. VGM got me much more attuned to every type of music, and OCR helped me and everyone here improve their music production and arrangement skills, oh, let's say, a LOT. I don't think I'd be listening to interesting music without OCR.
  20. That's because your brain melted and your memory went kaput. My brain's only partially melted from the head-melting guitars (past the face-melting) cause I haven't even heard the whole album yet. I'm a bad boi. >.<
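
Sketch for #9 (stereo widening as a stand-in for a real double-track): a minimal, hypothetical example, assuming Python with the numpy and soundfile libraries and a mono take named rhythm_guitar.wav (a made-up filename). It fakes the width of a second take by delaying a copy ~15 ms and panning the two copies apart; an actual second performance is still the better fix.

```python
import numpy as np
import soundfile as sf

# Load a mono rhythm guitar take (hypothetical filename).
take, sr = sf.read("rhythm_guitar.wav")
if take.ndim > 1:
    take = take.mean(axis=1)  # fold to mono if the file happens to be stereo

delay = int(0.015 * sr)  # ~15 ms offset between the two "takes"
left = np.concatenate([take, np.zeros(delay)])    # dry copy
right = np.concatenate([np.zeros(delay), take])   # delayed copy

# Hard-pan the copies and leave some headroom to avoid clipping.
stereo = 0.8 * np.column_stack([left, right])
sf.write("rhythm_guitar_widened.wav", stereo, sr)
```

Because the delayed copy is identical to the original, this can collapse oddly in mono, which is one more reason a genuinely separate take wins.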
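Sketch for #11 (frequency scooping to leave room for sound effects): a minimal sketch, assuming Python with numpy, scipy, and soundfile, plus a hypothetical music_stem.wav. It builds a standard RBJ-cookbook peaking EQ with negative gain (a gentle cut) around 3 kHz, the same idea as pulling a band down in FL's visual EQ, just expressed in code.

```python
import numpy as np
import soundfile as sf
from scipy.signal import lfilter

def peaking_eq(f0, gain_db, q, fs):
    """RBJ Audio EQ Cookbook peaking filter; negative gain_db gives a scoop/cut."""
    a_gain = 10 ** (gain_db / 40.0)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * a_gain, -2 * np.cos(w0), 1 - alpha * a_gain])
    a = np.array([1 + alpha / a_gain, -2 * np.cos(w0), 1 - alpha / a_gain])
    return b / a[0], a / a[0]

# Hypothetical music stem: cut ~4 dB around 3 kHz to leave that band for the SFX.
music, sr = sf.read("music_stem.wav")
b, a = peaking_eq(f0=3000.0, gain_db=-4.0, q=1.2, fs=sr)
scooped = lfilter(b, a, music, axis=0)
sf.write("music_stem_scooped.wav", scooped, sr)
```

The center frequency, cut depth, and Q here are placeholders; the point of the post is that you first locate the band your sound effects actually occupy, then scoop just enough there.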
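Sketch for #12 (spotting thin resonances): a minimal sketch, assuming Python with numpy, scipy, and soundfile and a hypothetical fm_bell.wav. It looks for narrow spectral peaks that stand well above their surroundings, which is one way to find the resonances worth taming before reaching for a compressor.

```python
import numpy as np
import soundfile as sf
from scipy.signal import find_peaks

# Hypothetical bell sample; look for narrow peaks poking out of the spectrum.
audio, sr = sf.read("fm_bell.wav")
if audio.ndim > 1:
    audio = audio.mean(axis=1)

windowed = audio * np.hanning(len(audio))
mag_db = 20 * np.log10(np.abs(np.fft.rfft(windowed)) + 1e-12)
freqs = np.fft.rfftfreq(len(audio), d=1.0 / sr)

# Peaks that stand ~12 dB above their surroundings are resonance candidates.
peaks, props = find_peaks(mag_db, prominence=12)
strongest = np.argsort(props["prominences"])[::-1][:5]
for i in strongest:
    print(f"possible resonance near {freqs[peaks[i]]:.0f} Hz "
          f"({props['prominences'][i]:.1f} dB above its surroundings)")
```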
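Sketch for #16 (rhythmic offsets and pitch-dependent velocities): a minimal sketch, assuming Python with the mido library; the chord, tick offsets, and velocity curve are made up for illustration. It writes a single "humanized" chord where each note starts a few ticks apart and lower notes land slightly softer, which is the sequencing trick described in the post (the CC11 linking would be done in the DAW, not here).

```python
from mido import Message, MidiFile, MidiTrack

mid = MidiFile(ticks_per_beat=480)
track = MidiTrack()
mid.tracks.append(track)

# Hypothetical chord and humanization values.
chord = [36, 48, 55, 60]      # C across a few octaves
offsets = [0, 10, 18, 25]     # small tick offsets so the attacks aren't robotic

def velocity_for(note):
    """Made-up velocity curve: lower notes land a bit softer than higher ones."""
    return 60 + (note - 36) // 2

# mido uses delta times, so convert each absolute offset into time-since-last-event.
last = 0
for note, start in zip(chord, offsets):
    track.append(Message("note_on", note=note, velocity=velocity_for(note),
                         time=start - last))
    last = start

# Release everything roughly one beat after the first attack.
track.append(Message("note_off", note=chord[0], velocity=0, time=480 - last))
for note in chord[1:]:
    track.append(Message("note_off", note=note, velocity=0, time=0))

mid.save("humanized_chord.mid")
```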