I’ve always been drawn to music videos. After doing a lot of concert photography, I wanted to do something more substantial. I first toyed with the idea of making some behind-the-scenes videos of bands, but after doing just the basic research, I realized that video is much more complicated than I thought.
As I scoured YouTube to learn more about video, I came to the conclusion that making a music video would be a great project to get started in video work. First of all, the sound is already taken care of. Also, the production takes place in a controlled environment where you can adjust the lighting and the position of the subject(s), and do as many takes as you need.
You can watch the finished product below and read on to learn more about how it was made:
I approached my friends from the band Trash Room and pitched the idea of making a music video for their song Words Like Daggers. They very quickly said yes and we were on our way.
Since it was winter, we needed an indoor location and ended up using a large workshop that we had access to. This brought up another issue: lighting. Shooting inside meant that we needed more artificial lighting than I had. I ended up acquiring two Aputure Light Storm LS C120d units and rigged up a set of two lights using LED work lights from Lowe’s (see below).
My first big mistake was in overestimating how much I could get done. We had scheduled about 6 hours of filming. I planned to film a whole intro (complete with dialogue), the song performance with isolated performances in front of a green screen and an ending filmed outside. In reality, it took me over 2 hours just to get the equipment set up. I quickly abandoned the idea of the intro and focused all of my energy on getting the performance shots.
Our first challenge was getting the band to play in sync with the prerecorded song. We had an iPad mini connected to a PA system for playback; however, the live drums were so loud that it was hard for the band to hear the track, and they would get out of sync. After some volume adjustments and a little bit of practice, the band was able to lock in.
My biggest challenge was wearing so many hats. I was in charge of directing the band, operating two SLR cameras, setting up and adjusting the lighting, and overall time management. Apart from being very stressful, I found myself making mistake after mistake, such as forgetting to start the cameras before a take… very frustrating. I also found myself so occupied with the technical side of things that I wasn’t properly directing the band. Luckily for me, they were able to nail their performances without much direction.
We pushed through all obstacles and managed to film the footage we needed. Other than a few test shots, I had never filmed a subject in front of a green screen before, and I made some mistakes that cost me a lot of time in the editing process. The main issues were not lighting the green screen evenly, not straightening out the wrinkles and not keeping the subject within the boundaries of the green screen.
Once I got home and imported all of the footage, the first step was to start a new project in Adobe Premiere and assemble all of the clips. As I added each clip, I synchronized it with the song. The playback track we used during filming had four synchronization beeps before the song started. I was able to use the sound from each clip to match the initial beep from the clip to the same beep in the audio track for the song.
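For the curious: I matched those beeps by hand in Premiere, but the same idea can in principle be automated by sliding the clip’s audio along the reference track and keeping the offset where the waveforms line up best (a cross-correlation). Here’s a toy sketch with made-up synthetic signals — a hypothetical illustration, not my actual workflow:

```python
# Hypothetical sketch: find where a clip's audio lines up with a reference
# track by cross-correlation. A "beep" is modeled as a burst of 1.0 samples
# inside silence (0.0 samples).

def best_offset(reference, clip_audio):
    """Return the sample offset in `reference` where `clip_audio` matches best."""
    best, best_score = 0, float("-inf")
    for offset in range(len(reference) - len(clip_audio) + 1):
        # Dot product of the clip against this slice of the reference:
        # the higher the score, the better the overlap.
        score = sum(r * c for r, c in zip(reference[offset:], clip_audio))
        if score > best_score:
            best, best_score = offset, score
    return best

# Toy example: the reference beep starts at sample 20, and the clip
# begins right at its own beep.
reference = [0.0] * 20 + [1.0] * 4 + [0.0] * 20
clip = [1.0] * 4 + [0.0] * 5

print(best_offset(reference, clip))  # → 20
```

Real audio would need resampling and normalization first, but the principle — maximize the overlap score — is the same.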
Once all the clips were in place, I spent the morning evaluating the footage and choosing what clips to use for each part of the song. I considered this to just be a first draft, but it ended up staying fairly constant through the whole process. I think what happens is that you get used to seeing the video in a certain way and it gets stuck in your mind.
Once the basic video was edited, I set about adding the special effects. Being a video game developer, I naturally wanted to be able to combine the 2D and 3D objects that I use in my games with the film that I shot for the video. I ended up writing my own video effects software using Unity — the development tool I use to make video games. The details of this custom system are beyond the scope of this blog post, but in a nutshell, my effects system allows me to import one or more video clips and place them in three dimensional space. I’m then free to add 3D objects to the same scene and have it all combine in a realistic way. For instance, the video clips can cast shadows onto the 3D objects and, likewise, the 3D objects can cast shadows onto the video clips.
In this video, the main special effects element was the flocks of crows flying around the band. The crows were basic 3D objects that I had already used in my last video game. I had to choreograph the crows to enter the room from the outside, fly around in various ways, circle the lead singer as she grew in size and then finally exit the room in fear.
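If you’re wondering how flock motion like this is usually generated, a classic approach is Craig Reynolds’ “boids” rules: each bird steers toward the flock’s center (cohesion), away from neighbors that get too close (separation), and toward the flock’s average heading (alignment). Here’s a minimal 2D sketch of those rules — a hypothetical illustration, not the actual code from my effects system:

```python
# Hypothetical 2D "boids" flocking step: cohesion, separation, alignment.
# Positions and velocities are lists of (x, y) tuples.

def step(positions, velocities, dt=0.1):
    """Advance the flock by one time step and return (positions, velocities)."""
    n = len(positions)
    new_vel = []
    for i, ((x, y), (vx, vy)) in enumerate(zip(positions, velocities)):
        # Cohesion: steer toward the center of the other boids.
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        vx += 0.05 * (cx - x)
        vy += 0.05 * (cy - y)
        # Separation: push away from any boid that is too close.
        for j, (ox, oy) in enumerate(positions):
            if j != i and abs(ox - x) + abs(oy - y) < 1.0:
                vx += 0.1 * (x - ox)
                vy += 0.1 * (y - oy)
        # Alignment: nudge velocity toward the flock's average heading.
        avx = sum(v[0] for v in velocities) / n
        avy = sum(v[1] for v in velocities) / n
        vx += 0.05 * (avx - vx)
        vy += 0.05 * (avy - vy)
        new_vel.append((vx, vy))
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel
```

Run this in a loop and the birds clump, spread and wheel around in a surprisingly lifelike way; scripted waypoints (fly in, circle the singer, flee the room) can then be layered on top as extra steering forces.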
The other main special effect element was the various 3D objects that were constantly falling to add to the feeling of chaos in the room. These objects, including saws, barrels, buckets and more, were placed high up on each side of the “room”. I programmed my system to drop them at various times in the song when I knew they would be in the shot. I used Unity’s 3D physics engine to ensure that they moved in a realistic way when falling and bouncing off the floor.
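To give a feel for what the physics engine is doing under the hood, here’s a hypothetical back-of-the-envelope version of a single drop: gravity integration plus a coefficient of restitution at the floor. Unity does this in 3D with proper collision shapes; this 1D sketch just shows the principle.

```python
# Hypothetical 1D sketch of a dropped prop: constant gravity, and a bounce
# that keeps only a fraction of the speed (coefficient of restitution).

G = -9.81          # gravity, m/s^2
RESTITUTION = 0.5  # fraction of speed kept after each bounce

def simulate_drop(height, duration, dt=0.001):
    """Release an object from `height` meters and return its height after `duration` seconds."""
    y, vy = height, 0.0
    for _ in range(int(duration / dt)):
        vy += G * dt       # accelerate downward
        y += vy * dt       # move
        if y < 0.0:        # hit the floor: reflect and damp the velocity
            y = 0.0
            vy = -vy * RESTITUTION
    return y

final = simulate_drop(5.0, 5.0)  # after a few bounces it settles near the floor
```

Each bounce returns the object to only a quarter of its previous height (energy scales with the square of speed), which is why dropped props come to rest so quickly.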
Unfortunately, I was using a 5-year-old MacBook Pro to do most of the video work. A single render of the video took over an hour for the most complicated setups, so much of the process of making the video was spent waiting for renders to complete. I’ve since upgraded my computer system and have optimized the code, so it’s much faster now. Luckily, the song was only a minute and a half long!
Overall, I’m very happy with the finished product and definitely want to make another music video in the future.