After Effects: Clocks and Countdown

I've had Adobe Creative Suite for a long time now. Though my primary video editor is Sony Vegas (now MAGIX Vegas Pro), I've had Premiere and used it in the past; I've just found that the workflow wasn't as fast as in Vegas. Because I use Vegas Pro as my primary editor, I hadn't spent much time with After Effects. For most things, I didn't need After Effects, so I didn't need to learn it. However, After Effects does two things I needed. First, it lets you animate scenes. Second, through expressions, it lets you control that animation programmatically.

This allowed me to address two needs that I had in very rapid succession: a stopwatch, and a countdown timer.

The Stopwatch

A watch or stopwatch has a very consistent motion. The second hand sweeps around with a predictable rotation, so you can create an animated stopwatch by taking a second hand and manipulating its rotation over time. Putting this together in After Effects wasn't particularly hard, as there are several demonstrations on the Internet. However, there are a few challenges with the demos, not the least of which is that many of them are clocks rather than stopwatches. Since I wanted to show 30 seconds elapsing, I needed a stopwatch.

Finding a Stopwatch

Most of the stock photography and illustrations I buy I get from BigStockPhoto.com because it’s relatively inexpensive and they’ve got good content. I found a good-looking stopwatch and purchased a vector version of it. There are two reasons for the vector version. First, it’s scalable so I can get to whatever resolution I’d like. Second, it made it easy to break apart for my video.

The watch was great, but a few things about it needed tweaking. There were glass effects that look great in a static image but could be distracting once I scaled the watch down and put it in motion. So, I took the scalable EPS, opened it in Illustrator, removed the layers of effects I didn't need, and saved a new file.

Animating a Stopwatch

I also needed to separate the second hand and minute hand from the stopwatch so I could animate them individually. I ultimately ended up with three EPS files: one for the second hand, one for the minute hand, and one for the rest of the stopwatch. I could have saved them in a raster format instead, but keeping them as vectors meant that After Effects could render them rather than interpolating pixels, making for a better final product.

I brought in each of the image pieces, converted them to outlines in After Effects, and then deleted the originals. This gave me versions without a background and something native to After Effects.

I scaled the stopwatch face to full size. I then used expressions to tie the scale of the second and minute hands to the scale of the face, so that if I went in later to adjust sizes, the hands would adjust proportionally. This was as simple as Alt-clicking the stopwatch icon next to each hand's Scale property and then dragging the expression pick whip (the little spiral) to the Scale property of the watch face.
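For reference, the expression the pick whip writes is just a reference to the other layer's Scale property. Here's a minimal sketch, assuming the watch face layer is named "Face Outlines" (your layer name will likely differ):

// Expression on the Scale property of the second-hand and minute-hand layers.
// "Face Outlines" is a placeholder for whatever the watch face layer is called.
thisComp.layer("Face Outlines").transform.scale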

The hardest part of the work was centering the hands and then setting the anchor point – which is the center around which the item rotates. This isn’t as simple as it seems, because the second hand and the minute hand don’t rotate around an edge. After a bit of trial and error, I found good anchor points on both.

Then I keyframed the second hand's rotation so that it indicated zero on the stopwatch at the start and, one minute later, was back at zero with one full rotation added. The angle wasn't zero (it happened to start at 317 degrees), but it showed zero on the stopwatch. Then I added an expression (Alt-click the stopwatch icon next to Rotation) and entered loopOut("continue") to make the second hand keep rotating for the length of the composition.
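Put another way, the second hand's Rotation property ends up with two keyframes and a one-line expression. A sketch of that setup (the 317-degree starting angle is specific to my artwork):

// Keyframe at 0:00 -> rotation = 317 degrees (hand reads zero on the dial)
// Keyframe at 1:00 -> rotation = 677 degrees (317 + 360, one full sweep)
// Expression on the Rotation property:
loopOut("continue"); // keeps rotating past the last keyframe at the same rate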

Animating the minute hand used a similar process, except the minute hand was smaller and was drawn in black in the original illustration. Because I had converted it to outlines, I was able to change the color to red so that, even though it was smaller, it would stand out against the rest of the illustration. That was just a matter of adjusting the color on each of the contents of the outline. Then I added an expression to the minute hand's rotation which read:

thisComp.layer("SecondHand Outlines").transform.rotation/30 + value

What's happening is that the expression takes the second hand's rotation and divides it by 30. The minute dial only goes to 30, not 60, so each minute moves the minute hand twice as far as it would on a 60-minute dial; at the same time, the minute hand moves at 1/60th of the second hand's rate, because it advances only one tick per minute. Dividing by 60 and multiplying by 2 is the same as dividing by 30. The + value at the end adds whatever rotation is set directly on the property, which lets me rotate the minute hand into its zero position.
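Written out with comments, the same expression breaks down like this (the layer name matches my project; yours may differ):

// Rotation expression on the minute-hand layer.
var secondRotation = thisComp.layer("SecondHand Outlines").transform.rotation;
// One minute is 360 degrees of second hand but only 12 degrees of minute hand
// on a 30-minute dial, so divide by 30. "value" is the static rotation set on
// this property, which nudges the hand into its zero position.
secondRotation / 30 + value;

As a quick sanity check, after 5 minutes the second hand has turned 5 * 360 = 1,800 degrees, and 1,800 / 30 = 60 degrees, which is five 12-degree ticks on the 30-minute dial.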

Final Touches

With the stopwatch animating correctly, I needed to take care of a few things. Since I wasn't going to use the video on its own, I needed a transparent background. Technically, it's possible to render transparency, but the file formats that support an alpha channel are large and awkward. I elected to set the composition's background color to pure green, since there's no green in the stopwatch, and use chroma keying on the Vegas side to drop out the background. This meant I could use an .MP4 file and keep things small and simple. The resulting watch is very smooth and super clean.

The Countdown

The approach used for the stopwatch is fine – except it doesn’t lend itself to a countdown very well. Even with the minute hand rendered in red, it’s easy to miss. What I wanted was a countdown that would change text so that it shows the remaining minutes and seconds. Finding a demonstration that showed how to do this was impossible. I assume this is because designers don’t want to learn development and very few developers are interested in doing graphics or video. So, while the solution turns out to be very simple and elegant, it required a bit of thinking from both camps.

Coding

The starting point for this project was to place a text layer in the composition and add an expression to its Source Text property. That let me write a bit of code to output the countdown I wanted. To do this, I needed two input variables. First, I needed to know how long the composition is, so I'd know where zero falls. Second, I needed to know where we were in the composition. Luckily, both are available: thisComp.duration gives the total duration of the composition, and time gives the current position. From those, I could calculate the time remaining.

The next challenge was getting the output to look right. The time remaining is a floating-point value, because both source values are floating point, and I needed integers. So I used Math.floor() to get whole numbers for the minutes and seconds remaining.

The next bit was building this into a string: the minutes, then a colon, then a leading zero if the seconds were less than 10, and finally the seconds.

To call this from After Effects I had to put the bulk of the code in a function and then call that function. The result is this code (which you can use):

function TimeRemaining() {
    // Seconds left in the composition, rounded down to a whole number.
    var timeRemaining = thisComp.duration - time;
    var timeSeconds = Math.floor(timeRemaining);
    var timeMinutes = Math.floor(timeSeconds / 60);
    var timeSecondsOnly = timeSeconds - (timeMinutes * 60);
    // Build the M:SS display, padding the seconds to two digits.
    var timeDisplay = "" + timeMinutes + ":";
    if (timeSecondsOnly < 10) timeDisplay += "0";
    timeDisplay += timeSecondsOnly;
    return timeDisplay;
}
TimeRemaining();

Text

With that out of the way, I could position the text, set the font, color, and so on. Wherever I went in the timeline, I'd get the right remaining time. The one key with the text was to select a monospace font; without one, the text shifted on screen as the width of the digits changed, which was distracting. I elected to use the free font Digital-7 in my composition so I could make it look like an old digital clock.

I could have composited this onto a background scene of a bedroom with a digital clock, but for my purposes I needed a different look, so I just overlaid the text on my background image.

Wrapping Up

Because this project doesn't contain any licensed images, I can provide it. You'll find it in a ZIP file here. By changing the composition length, you can change the length of the countdown, so this should be suitable whether you need a 1-minute countdown or a 20-minute countdown. I do ask that, if you find this valuable, you drop me an email and let me know.

Video Studio 2.5 – The Streaming Upgrades

My last update about my studio covered the 2.1 upgrades, which followed the 2.0 post about the previous set of major upgrades. This round of upgrades adds live streaming support, something that wasn't urgent when the studio was used only for recording. However, with more virtual conferences happening and a desire to do webinars with a more interactive feel, it was time to take the plunge.

Adding Audio

Before I get into the upgrades, it’s important to rewind and remind everyone about what I already had. I had purchased a Blackmagic Design ATEM Television Studio for the 2.1 upgrades to clean up some loose ends on the Chroma keying front. It plays a key role in the streaming, as it’s the device I get my output video from.

The one challenge it has is that it doesn't accept audio input other than from cameras. (Though the more expensive models of the ATEM do.) I wanted to use the Rode NTG1 and NTG2 shotgun microphones in the office, which meant getting them into the ATEM. The route to get there wasn't direct. The first step was rewiring them from the Zoom H4n Handy Recorder into my Behringer Eurorack Pro RX1202FX. This is a rackmount mixer that sits with the ATEM and the rest of the audio gear in a rack underneath the preview monitors.

The Eurorack Pro is paired with two Behringer Multicom Pro-XL MDX4600 4-channel compressor/gates. These give me signal indicators as well as the ability to gate and/or compress each channel. I do want to stop and point out that having a signal meter or indicator for each channel makes these useful even if I never use the compressor; when I run high-end boards for live production, I always have signal meters on every channel, and it's indispensable. This solved the audio problem with a minimal investment and added some capabilities at the same time. The MDX4600s give me basic signal cleanup on the eight mic inputs in an inexpensive package.

Each channel on the Eurorack has a direct insert that I use to feed the MDX4600, and I split that to send the raw signal to the Focusrite Saffire 40 and Focusrite OctoPre MkII. The first six channels of the mixer go to the Saffire, and the OctoPre MkII gets the remaining two. The channels from the Eurorack are wired to inputs 3-8 on the Saffire and the OctoPre because the first two ports on each are front accessible. So I get four direct inputs for recording plus the eight channels with the compressor. In a pinch, the output of the Saffire is routed back into channels 11/12 of the mixer, so I can even pick up the extra inputs and add them to the output if I need to.

The Eurorack outputs analog audio signals, and the ATEM needs a digital input. For that, I added a Behringer Ultramatch Pro SRC2496, which converts analog to digital and vice versa. The Ultramatch takes the main output from the Eurorack as its input and outputs the AES/EBU signal that feeds the ATEM. The only hitch was that the Ultramatch has an RCA connector for its output and the ATEM expects a bayonet connector, so I got an RCA cable and an RCA-to-bayonet adapter and was all set.

Capturing Video

The ATEM has an H.264 encoder on it that can be connected via USB. I've had that connection to the primary video machine for some time, but I never could leverage it. As I did more research, it seemed that almost no one had developed support for this output, so very few things other than the Blackmagic software could talk to it. As a result, I decided to pick up a dedicated capture card to get the signal into the computer. That card is a Blackmagic Design DeckLink Mini Recorder 4K, which has both an SDI and an HDMI input. That meant I could run the SDI output from the ATEM into the DeckLink, eliminating any concerns about distance or the cable coming loose.

It took the last slot in the computer, but it fit. Once I had it installed, I could see the output from the ATEM, and I had it on a platform that was more common and therefore better supported. However, having the video on a capture card didn't solve the whole problem.

Broadcast

Live streaming and web conferencing don't typically expect video coming in on a capture card; they expect a web camera to be providing the video. So I needed software with two capabilities. First, I needed software that would stream to the live streaming targets you'd expect (Facebook, YouTube, etc.). Second, I needed something that would let the video conferencing platform see the output as a web camera. Those capabilities come from Telestream's Wirecast software.

The software isn't the most intuitive, and the documentation leaves more than a little to be desired, but it seems very stable and I was eventually able to figure it out. It gives me the ability to live stream to the typical targets. It also has a virtual web camera that you can start; when you do, you set your favorite webinar or web conferencing software to use the virtual web camera, and you're streaming through your platform of choice.

In my setup, I now had all the capabilities of the ATEM for video switching, including live Chroma keying plus audio from production microphones. The Wirecast software technically has more capabilities than I’m using, including live Chroma keying; however, the ATEM does such a great job of this, I wanted to handle it in the ATEM.

So: I'm shooting in front of my green screen and I need a moving background, but the ATEM Television Studio won't play video media. I needed a simple solution, and for that I grabbed a simple media player.

Media Player

While I was doing the setup for the Kin-to-Kid Connection booth, I purchased some inexpensive media players. The Incredisonic Vue Series IMP150+ are great little media players: they can be configured to automatically play whatever media is connected, they have a remote, and they're USB powered. I borrowed one of these and a Decimator MD-HX to convert the output from HDMI to SDI for the ATEM (which I didn't technically need to do, since HDMI would have worked). I then put three video files on the USB stick. The first, named 01-*, was my pre-roll. The second, named 02-*, was the prerecorded segment. The third, named 03-*, was my moving background for the live keying. When the media player is plugged in, it starts playing the first video, transitions seamlessly between files, and loops.

When the prerecorded program ended, we pushed the fader on the ATEM and moved from the media player alone to keying on top of the media player input, so we could enter the scene with a moving background.