Search the Community

Showing results for tags 'mixing'.



Found 6 results

  1. Hey everyone, I'm currently working on a track that starts off with a solo piano and slowly builds from there, adding a whole lot of new instruments as it progresses. I'm currently running into two problems that I'm having difficulty fixing:

     1. The piano gets 'too loud'. At the beginning, the piano starts out real quiet, peaking at around -18 dB, and in the most intense parts, which may get even more intense later on, it goes up to +5 dB. These values make sense given how much dynamic contrast the piano part has, but I was wondering if this could be a problem, since anything above 0 dB will clip. I've tried compressing it, but I feel like that flattens the dynamics and makes the part lose its emotional power.

     2. My violins are clashing with the piano. There's a part where the piano and the violins rise together, getting louder and louder, and I can hear the violins being completely smothered and drowned out by the piano, which gets much louder than they do. I've read online that these instruments occupy very similar frequency ranges, which is probably the cause, so I was wondering about the best way to deal with this without sacrificing the sound of either of them.

     If my explanation isn't clear, I can provide a sample of the track so it's easier to understand. Thank you for the help :)
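A note on the clipping question above: digital audio clips above 0 dBFS, so the headroom problem comes down to simple gain arithmetic. This is a minimal Python sketch; only the -18 dB and +5 dB figures come from the post, everything else is illustrative:

```python
import math

def db_to_linear(db):
    """Convert a level in dB to a linear amplitude ratio."""
    return 10 ** (db / 20)

def headroom_db(peak_db, ceiling_db=0.0):
    """Signed headroom (in dB) between a peak level and the clipping ceiling."""
    return ceiling_db - peak_db

# The post describes piano peaks moving from about -18 dB up to +5 dB.
quiet_peak, loud_peak = -18.0, 5.0

# Anything above 0 dBFS clips, so a +5 dB peak needs at least 5 dB of trim.
trim_db = min(headroom_db(loud_peak), 0.0)   # -5.0 dB of trim required
print(trim_db)

# A static trim preserves the full dynamic range: quiet parts drop to
# -23 dB, loud parts peak right at 0 dBFS, and the 23 dB spread between
# them is unchanged -- no compression involved.
print(loud_peak + trim_db, quiet_peak + trim_db)
```

The point of the sketch is that pulling the whole channel fader down is not compression: every moment gets the same gain change, so the dynamics the poster wants to protect are untouched.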
  2. Pichu's Dad

    iOS Production

    Hi, I wanted to let everyone know what kind of rig I've been using and ask other members of the community for some advice/feedback regarding iOS recording and production. Does anyone have any experience with this? I'd love to hear about mobile music production from people other than YouTube. lol

    As far as my rig goes, I have an iPad mini running through a Focusrite iTrack Dock (anyone familiar with Focusrite will want to know that it's a cousin of the Scarlett, as it uses the same mic preamps) with a Lightning connector, letting me record at 24-bit/96 kHz and get some really good sounds. If there are any other users on the forums who produce with iOS tech, I'd love to hear about it! I'll list my setup (with links) as it stands below so you can get a better idea of what I'm working with.

    Tracking/Audio Capture: 2 Samson CL8 Studio Condenser Microphones (I use other mics too, but these are the ones that get used the most), Focusrite iTrack Dock, iPad mini, Harmonicdog MultiTrack DAW, Final Touch
    Guitar Amp Simulation: BIAS Amp, BIAS FX
    Synthesizers: SoundPrism
    Drum Machine/Loops: Caustic

    Now, at this point I could get pretty redundant, as there are many different ways to use each of these apps to get different results, but the main thing I wanted to share is that I use the inter-app audio support of the Positive Grid apps (BIAS FX and BIAS Amp) within MultiTrack DAW to get my amp sounds, and I often use the effects bus in the DAW for reverb/delay, as they are relatively high quality. I can use BIAS for both guitar and bass guitar. SoundPrism has neither inter-app audio support nor AudioBus support (I don't know about SoundPrism PRO, though... haven't used it), so I run it on an external device (iPhone 4, baby!) via a 3.5mm-to-1/4" cable into the line-in port of the iTrack Dock. This comes in handy for a lot of things, like my chaos pad (SynthPad, which also lacks AudioBus support).

    Those of you familiar with recording via iOS will know how valuable AudioShare can be, and that is the main way I take drum loops/samples from Caustic or other DAWs (like GarageBand) and import them into MultiTrack, as it is a much better DAW with good EQ, reverb, delay, compression, and editing options in a very simple, user-friendly setting. It's not the most complex or feature-packed DAW on iOS (not like Cubasis or Auria), but it does a good job for very cheap.

    Once my project is done, I bounce the track, export it to Final Touch, start with one of their mastering presets, and tweak it to my specs. I can export the whole track in many different file types and bitrates, and upload directly to Dropbox, SoundCloud, email, etc. BUT Final Touch is also great for mastering tracks BEFORE they hit the DAW. Sometimes I'll send over my drum track from Caustic (which may include live automation, synths, drum samples, etc.) because I want it to sound a certain way before I mix in my live tracks (guitars, vocals, saxophones, hand percussion, etc.); that gives me a better idea of what to listen for and how to edit properly. THEN I'll bounce that, master the whole track with less-drastic edits to my master, and end up with a more coherent-sounding track.

    This is just scratching the surface of what's available to artists via iOS, and I may have repeated myself, but I wanted to share what I do with others who haven't experimented with iOS music production before, and get some ideas from others who have been doing it longer or differently than I have. Thanks!
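For context on the 24-bit/96 kHz figure in the post above, the raw PCM data rate is easy to work out. This is illustrative arithmetic only (container/file-format overhead is ignored, and the two-channel count is just an assumption matching the pair of condenser mics mentioned):

```python
# Raw PCM data rate for a 24-bit/96 kHz capture.
bit_depth = 24          # bits per sample
sample_rate = 96_000    # samples per second per channel
channels = 2            # assumption: one stereo pair of mics

# bytes/s = (bits per sample / 8) * samples per second * channels
bytes_per_second = bit_depth // 8 * sample_rate * channels
megabytes_per_minute = bytes_per_second * 60 / 1_000_000

print(bytes_per_second)        # 576000 bytes/s
print(megabytes_per_minute)    # 34.56 MB per minute of stereo audio
```

So a mobile rig recording at this quality writes roughly 35 MB per stereo minute, which is worth knowing when budgeting storage on an iPad.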
  3. Hello. I have uploaded a short excerpt from an original song which I am attempting to mix. Link: https://dl.dropboxusercontent.com/s/ftbejny6abxvvfh/S02T05 test.mp3 Here's a slightly different version: https://dl.dropboxusercontent.com/s/v96hp02tq9zaip0/S02T05 test - ver 2.mp3 I plan on eventually having the full song mixed by a professional, but for now I'd like to know whether my mix is of acceptable quality for demonstration/preview purposes. Is there anything in particular that should be changed in terms of EQ or volume levels? Does it sound noticeably amateurish? All the instruments are programmed (including the guitars), since I'm a keyboard player, not a guitarist. I used Toontrack's Meshuggah EZdrummer demos as reference mixes. Edit: I compared it to some more tracks and I'm not really satisfied with the sound; I think the guitar and drum tones could still be improved a lot. I'm currently busy with another music-related project (not metal), but I may return to this later.
  4. Hello, I'm Ronald Poe and I write electronic music and remixes. I use FL Studio and Audacity for my music and MuseScore for writing/editing MIDI. I mix/produce the music myself, and it seems to hinder the music itself. Do you have any tips on mixing/production? Here are a couple of examples of my work: my character theme for Axel (KH), and my remix of "King Ghidorah" (Godzilla NES) from that contest. Please give both your opinion and some mixing advice. Thanks
  5. Hey peeps, I barely know anything about mixing, so this might be a bit of an easy question, but I truly don't know what to do about it, so I'll just drop it here anyway. I like to record different instruments and put them all together, but whenever there are more than two instruments, everything gets muddy and just sounds unclear. When I pull things apart by panning them left or right it sounds better, but I can't imagine that being the whole solution. Does anyone here know how this sort of thing works?
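The panning trick described above is in fact a standard technique, usually implemented with a constant-power pan law so the overall loudness doesn't change as a source moves across the stereo field. A minimal Python sketch of one common law (the sine/cosine form; the function name and pan-range convention are just this example's choices):

```python
import math

def equal_power_pan(sample, pan):
    """Pan a mono sample with a constant-power (sin/cos) law.

    pan: -1.0 = hard left, 0.0 = center, +1.0 = hard right.
    Returns the (left, right) output amplitudes.
    """
    theta = (pan + 1.0) * math.pi / 4.0   # map [-1, 1] onto [0, pi/2]
    return sample * math.cos(theta), sample * math.sin(theta)

# At center, both channels get ~0.707 of the signal (about -3 dB each),
# so the summed power matches a hard-panned position.
left, right = equal_power_pan(1.0, 0.0)
print(round(left, 3), round(right, 3))   # 0.707 0.707
```

Panning only separates instruments spatially, though; when sources share a frequency range, the usual companion move is EQ carving, cutting each instrument where another one needs to be heard.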
  6. Hi guys, I'm working on an original piece at the moment. It's a sad, solo piano track, only a minute or so in length, and it has to work as a seamless loop. I'm not at all confident in my own abilities, so what I want to know is: what do you think, and what can I do to make it better? I use Logic Pro 9, and this was created with the Ivy Piano in 162 soundfont. I wish I could get a live player to perform it, but alas, I lack both a piano and the ability to play one. https://www.dropbox.com/sh/np1ahu61yc034f4/AABQn7x16LRh6_aIrtsVl2vSa?dl=0