I recently worked on a project where I was asked to design an Aperture workflow for a live shoot. It worked out really well, so I wanted to share it here.

Setup

  • Studio shoot; two simultaneous setups, each with its own camera and photographer
  • Two days, 11+ hours each, 150+ setups, thousands of photos

Objectives

  • Photo-edit in near real time: rating, tagging, and keywording (aka “tag and bag”); cropping and white balancing; color and other image adjustments as needed
  • Provide twice-daily web previews for remote client

Tech

  • Two Canon 1Ds Mark IIs and an assortment of lenses and lights
  • Two Power Mac G5s with 23” displays; one Intel quad-core Mac Pro with dual 30” displays and an internal 1.5TB RAID; fast network; G-Tech G-RAID for backup
  • All pictures shot RAW

The goods

All made possible with the power of AppleScript, Aperture, and the Mac.

First step was setting up a tethered shooting solution for the two cameras. Each Canon was USB tethered to a PowerMac G5. Canon’s EOS Utility was used to copy the images from the camera to the computer in real-time. The software was set to leave a copy of the image on the CF card as well as copy it to the computer, so an instant after the shutter was depressed, there were two copies of the image — one on the CF card and one in the watched destination folder on the Mac.

The destination folder on each Mac was watched by Aperture Hot Folder, so each image was added to the Aperture library the moment it was captured. The current version of Aperture Hot Folder lets Aperture run in full-screen mode while still capturing in the background, automatically advancing to the just-imported frame. While in full-screen, the photographers opted to keep the HUD (Heads Up Display) open with the RGB histogram displayed and the loupe loaded. This meant that seconds after each shot was taken, they could watch the screen for their image, check levels on the histogram, and check sharpness in the loupe.

The destination folder on each Mac was simultaneously watched by a second script, which simply copied each image across the network to the Mac Pro, where it landed in one of two folders: Camera1 or Camera2. At this point, again only seconds after the shutter was depressed, three copies of the image existed.
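The watcher itself was an AppleScript, which I won’t reproduce here; the same poll-and-copy logic can be sketched in Python. Only the folder roles come from the workflow above; the function name, the set-based bookkeeping, and the polling approach are my own assumptions:

```python
import shutil
from pathlib import Path

def sync_new_images(src: Path, dst: Path, seen: set) -> list:
    """Copy any file in src not yet copied into dst; return the new names.

    `seen` persists across polling passes so each frame is copied
    exactly once (the real script watched the tethered-capture folder
    and copied frames across the network to Camera1 or Camera2).
    """
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for image in sorted(src.iterdir()):
        if image.is_file() and image.name not in seen:
            shutil.copy2(image, dst / image.name)  # copy2 preserves timestamps
            seen.add(image.name)
            copied.append(image.name)
    return copied
```

In practice a loop would call this every second or so, with `dst` being a network-mounted folder on the Mac Pro.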

Meanwhile, on the Mac Pro (where I sat), I kept the Camera1 and Camera2 folders open in icon view, with previews on and the icons set to the largest size. This let me (and the producers watching over my shoulder) monitor the progress of both cameras simultaneously, and any image in question could be quickly opened in Preview for a fast check. As soon as either camera finished a particular setup, the photographers would notify me so I could import the batch into Aperture. At that point I simply ran my Camera1_import or Camera2_import AppleScript, as appropriate.

Each AppleScript would import the appropriate folder of content into Aperture, then move the images from the Camera1 or Camera2 folder into a matching Camera1_archive or Camera2_archive folder. These archive folders would later be backed up for a fourth and final copy of each image. Because Aperture maintains links to its referenced images so well, the images could be moved to another folder on the drive even after being imported (by reference), and Aperture never lost track of a single link. The scripts also did more than simply import the images: they applied a series of metadata strings, including copyright data and a camera identifier, which allowed the images to be sorted by camera later on if needed.
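The AppleScripts drove Aperture directly, which can’t be shown without Aperture itself; the Python sketch below models the same import-tag-archive pattern, with the library reduced to a plain list of records. The function name, record shape, and copyright string are all placeholders, not from the original scripts:

```python
import shutil
from pathlib import Path

def import_batch(camera_folder: Path, archive: Path,
                 library: list, camera_id: str) -> int:
    """Register each image by reference with stamped metadata, then move
    the file into the archive folder; return the number imported."""
    archive.mkdir(parents=True, exist_ok=True)
    imported = 0
    for image in sorted(camera_folder.iterdir()):
        if not image.is_file():
            continue
        target = archive / image.name
        shutil.move(str(image), str(target))  # archive the master file
        library.append({
            "path": target,                      # a reference, not a copy
            "copyright": "2006 Example Studio",  # placeholder copyright string
            "camera": camera_id,                 # lets us sort by camera later
        })
        imported += 1
    return imported
```

The camera identifier on every record is what made the later sort-by-camera possible.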

Once each batch was imported, I would use Aperture’s Find command to isolate the most recent import session, so I was looking only at the newest batch of photos. From there I would stack the pictures by hand (an easy task with the studio setup, and one not well suited to auto-stacking, since the time between shots varied so much). Once stacked, I would select large groups of images and batch-apply keywords as needed. We had about a dozen keywords across four categories; each image required at least one keyword from each category to be properly identified later. (I also set up smart albums to call out any images that had not yet been keyworded, meaning they were missed in the initial keyword run. This proved useful: occasionally images did get missed, and the smart albums let them be quickly identified and corrected.)

With the batch keyworded, I would select the reference color-chart image (one was shot for each studio setup), define a white balance point, and lift-and-stamp that white balance across all the other images from that setup. In a matter of seconds, Aperture could white balance hundreds of photos to the exact same specification.
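The smart-album check for missed keywords amounts to a simple set test: an image is flagged unless it carries at least one keyword from every category. A minimal Python sketch, with hypothetical categories and keywords standing in for the real ones:

```python
# Hypothetical categories and keywords; the real shoot used about a
# dozen keywords spread across four categories.
CATEGORIES = {
    "setup":   {"setA", "setB"},
    "subject": {"widget", "gadget"},
    "angle":   {"front", "detail"},
    "use":     {"hero", "alternate"},
}

def not_fully_keyworded(images: dict) -> list:
    """Return the names of images lacking a keyword from any category,
    i.e. the same test the smart albums performed to flag missed images.

    `images` maps an image name to its set of applied keywords.
    """
    return sorted(
        name for name, keywords in images.items()
        if any(not (keywords & allowed) for allowed in CATEGORIES.values())
    )
```

Anything this returns would show up in the smart album for a quick fix.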

At this point I went to full-screen mode to compare stacked images and pick a best shot. For some stacks, it was as easy as picking the last or second-to-last photo. For others, Aperture’s Stack Mode was used extensively, comparing the current pick to the next shot in succession. In yet others, all the images in a stack were thrown across the dual 30” displays and individually duck-hunted out of the selection until it was narrowed down to a final pick. The method really depended on the content and quantity of each stack, and the seamless flow between all of these picking methods in Aperture meant we were able to comb through hundreds, and ultimately thousands, of photos in record time.

Once the images were tagged and bagged, some required cropping or other image adjustments, which Aperture handled quickly and effectively. On a final pass through the current selection, favorites were tagged with a 3-star rating, which would come in handy later.

Meanwhile, the next shoot on either camera setup continued unabated. The instant we were notified that a setup was complete, its images were imported via the two AppleScripts, whether or not I was ready for them in Aperture. As soon as one set was completely photo-edited, I would move on to the next.

Earlier on, several smart web galleries had been set up to isolate particular keyword criteria combined with a 3-star match. This meant that publishing updates to the website was only one click away, since each smart web gallery was automatically populated with all of the favorite picks.
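In spirit, each smart web gallery was just a saved query over the library. A Python sketch of that selection, using a hypothetical record shape (`name`, `keywords`, `rating`) rather than Aperture’s actual data model:

```python
def smart_gallery(library: list, required_keyword: str,
                  min_rating: int = 3) -> list:
    """Select images carrying a given keyword that also meet the 3-star
    favorite threshold, as the smart web galleries did."""
    return sorted(
        image["name"] for image in library
        if required_keyword in image["keywords"]
        and image["rating"] >= min_rating
    )
```

Because the query re-evaluates against the live library, newly rated favorites appear in the gallery with no extra work, which is what made publishing a one-click step.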

Finally, twice a day (at lunch and at end of day), a backup routine was run, copying the master image files and the Aperture Library to the G-RAID and providing the fourth and final copy of each photo. Only at that point were the original images deleted from the CF cards.
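A dedicated backup tool handled this in the original workflow; as a rough Python stand-in, a twice-daily mirror only needs to walk the source folders and skip files already present, here using a crude same-size check as the incremental test (all names are hypothetical):

```python
import shutil
from pathlib import Path

def backup_to_raid(sources: list, raid: Path) -> list:
    """Mirror each source folder onto the backup volume; return the
    names of files copied. Files already present with the same size
    are skipped, so repeat runs only copy what is new."""
    copied = []
    for src in sources:
        for f in src.rglob("*"):
            if not f.is_file():
                continue
            target = raid / src.name / f.relative_to(src)
            if target.exists() and target.stat().st_size == f.stat().st_size:
                continue  # already backed up
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)
            copied.append(target.name)
    return copied
```

Running this over the two archive folders plus the Aperture Library folder would approximate the lunch and end-of-day backup passes.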

That is how we shot thousands of pictures in two days… and picked out the best before we went home.
