Welcome to The Definition: a series of articles designed to clear up some of the confusion and arguments inside and outside of the music industry. Sometimes the wording may be boring, but even boring information is useful.
The Definition of an Audio Engineer
Last week, we discussed what defines a music producer, and the most common question I’ve received since then has been, “Then what does an engineer do, if it’s not production?” Luckily, the roles of an audio engineer are much easier to outline, which makes the job much easier to define.
The Roles of an Audio Engineer
First and foremost, audio engineers got their name because, back in the day, they actually had to be engineers to make everything work. While a knowledge of electronics and how they interact is still an incredible boon to have, it’s no longer necessary for filling all of the roles of an engineer. Sometimes these tasks are delegated to assistants, and sometimes they’re performed by the engineer themselves.
- Setup – Audio engineers are responsible for making sure everything sounds as good as it can. The more control someone has over sound at its source, the easier the technical work becomes later in the recording process. At a recording session (or at a live venue, where the roles are the same), the engineer helps musicians set up their gear. They help tune the instruments. They decide which microphones best suit each instrument or vocalist, and where to place them so every part serves the whole.
- Recording (“Tracking” to professionals) – Once recording actually begins, the engineer (with the help of the producer, if present) monitors the technical aspects of the session. They watch and listen to the input levels throughout the tracking process. Does the kick drum attack harder than the compressor can handle? Did the vocalist sing off-key during the performance? Is the microphone too close to the guitar cabinet? Can it be fixed in the mix, or will another take be called for? These are the questions an engineer has to answer before the tracks can be mixed.
- Mixing – Mixing is a balancing act between the technical and artistic sides of music creation. Left brain, right brain, so to speak. The common approach to mixing is to remove the negative aspects of a recording and emphasize the positive ones. Examples would be equalizing out the lowest frequencies of a vocal track, below where the voice actually carries useful information (freeing up speaker energy for more bass from the instruments), or adding reverb to give the impression that a vocalist is singing in a large cathedral. Through the use of effects (both hardware and software) and control of the output levels, the engineer puts together a piece of music that is ready to be mastered.
- Mastering – Mastering is the process by which audio is prepared for, and transferred to, its storage medium. Before audio tape became a thing, engineers would record straight from a horn to a disc, usually made of metal or wax. These discs were called master recordings, and they were replicated for distribution to consumers. This is where the terms “mastering” and “put it on wax” both come from.
After magnetic tape was invented and normalized for audio work, the quality of sound recordings advanced drastically. Even though final albums would still be pressed on vinyl for years to come, there had to be a process for preparing the audio on tape (which had MUCH more room for dynamic range) to be put on vinyl. To deal with the technical limitations of vinyl while still keeping the music enjoyable to the ear, mastering engineers used specialized hardware to adjust the audio to fit the medium.
Ever since digital audio became prevalent, and with the advent of Digital Audio Workstations (such as Pro Tools), mastering has been a slightly different beast. While still conforming to the guidelines of analog media, engineers have also been mastering for digital release. The technical limitations of digital audio are far fewer than those of analog, so digital mastering doesn’t really have a single set standard at the moment. Instead of mastering only for the type of device a person listens on, engineers are also mastering for the specific internet platforms the music will be distributed on. For example, Spotify will compress audio that comes in louder than its -14 LUFS / -1 dB true peak standard (though they recommend -2 dB true peak for their loud normalization setting). YouTube uses a -13 LUFS standard, and Apple Music uses -16 LUFS.
So, in essence, a mastering engineer has MORE work cut out for them now than 100 years ago, but the job has always been the same: adjusting the audio to suit its intended medium.
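The low-cut EQ move described under Mixing can even be sketched in a few lines of code. Below is a minimal one-pole high-pass filter in Python, a crude stand-in for a real EQ (the function name and parameters are my own illustration, not any particular plugin’s API):

```python
import math

def high_pass(samples, cutoff_hz, sample_rate):
    """One-pole high-pass filter: attenuates content below cutoff_hz.
    A rough sketch of the low-cut an engineer might apply to a vocal."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)   # filter time constant
    dt = 1.0 / sample_rate                  # time between samples
    alpha = rc / (rc + dt)
    out = [samples[0]]
    for i in range(1, len(samples)):
        # Each output keeps only the *change* in the input, scaled by alpha,
        # so slow (low-frequency) content decays away.
        out.append(alpha * (out[i - 1] + samples[i] - samples[i - 1]))
    return out

# A constant signal is pure 0 Hz content; the filter drains it toward zero,
# just like rolling the rumble out of a vocal track.
dc = [1.0] * 2000
filtered = high_pass(dc, cutoff_hz=100, sample_rate=44100)
```

Real EQs use steeper, more surgical filters, but the principle is the same: anything below the cutoff gets pushed down so the bass instruments can own that range.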
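Those platform loudness targets boil down to simple arithmetic: the difference between a master’s measured loudness and the platform’s target is the gain the platform will apply, and the true peak has to survive that gain. Here is a hypothetical sketch (the function name and defaults are mine; the figures are the Spotify numbers quoted above):

```python
def gain_for_target(measured_lufs, measured_true_peak_db,
                    target_lufs=-14.0, peak_ceiling_db=-1.0):
    """Compute the static gain (in dB) that moves a master from its
    measured integrated loudness to a platform target, and report
    whether the resulting true peak stays under the ceiling.
    Defaults follow the Spotify figures quoted above; other platforms
    (YouTube, Apple Music) would swap in their own numbers."""
    gain_db = target_lufs - measured_lufs
    resulting_peak_db = measured_true_peak_db + gain_db
    return gain_db, resulting_peak_db <= peak_ceiling_db

# A quiet master at -18 LUFS peaking at -6 dBTP: +4 dB brings it to
# -14 LUFS, and the peak lands at -2 dBTP, safely under -1 dBTP.
gain, fits = gain_for_target(-18.0, -6.0)
print(gain, fits)  # 4.0 True
```

This is why mastering engineers leave true-peak headroom: a track that measures quiet but peaks hot can’t be turned up to the target without slamming into the ceiling.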
The engineers (or their delegated staff) are the people who handle all of the technical aspects of a recording, from setting up the microphones for tracking all the way to the master copy.
Too Long; Didn't Read
Engineers are the people in charge of the science behind making music sound good.
If you’re still unsure about the definition, feel free to drop your questions in the comments below. Also, if there are any other definitions in the music industry that you’re unsure of, request them for an article!