
Network Drives (NAS) and Aperture

August 4, 2010 - 3:17am

I’ve seen this question pop up in the forums before, and recently I had a conversation with a user who had issues storing their master files on a Time Capsule (which technically is a NAS, or “Network Attached Storage,” drive). The attraction of using a NAS (for anything, not just Aperture) is obvious: simultaneous access to your files from any computer on the network, versus single-computer access off a local hard drive.

I’ve always, without hesitation, recommended against storing anything Aperture related on any kind of NAS. There are many reasons I don’t like this idea.

  1. Speed. The fastest wired Ethernet connection (gigabit; 1000 Mbps) and the fastest wireless connection (802.11n; 300 Mbps) are both slower in practice than FireWire 800 (800 Mbps). OK, technically gigabit has higher raw bandwidth than FW800, but with protocol overhead and the way data is moved over the network, FW800 is invariably faster. (See the rough numbers right after this list.)
  2. Reliability. Wired and wireless networks (especially wireless) are unreliable and subject to contention from all the other traffic on the network. In the world of high-bandwidth video, in a large post house for example, data is moved over a dedicated fibre network, which provides the convenience of Ethernet with massive speed. However, this requires a dedicated controller computer, fibre cards in every computer, and a lot of money.
  3. Who’s got my file? On a network, someone else could see — and change or move — your file while you’re working on it. That’s a recipe for disaster.
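
To put rough numbers on the speed point in item 1, here’s a quick back-of-the-envelope comparison you can paste into Terminal. These are theoretical ceilings only (the assumption being 8 bits per byte and zero protocol overhead); real-world throughput over AFP or SMB typically lands well below them, which is why FW800 so often wins in practice.

    # Theoretical throughput ceilings, ignoring protocol overhead
    echo "Gigabit Ethernet: $((1000 / 8)) MB/s"   # 125 MB/s
    echo "FireWire 800:     $((800 / 8)) MB/s"    # 100 MB/s
    echo "802.11n:          $((300 / 8)) MB/s"    # ~37 MB/s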

Also if I dig back into the cobwebs of my mind, I recall that in the early days of Aperture, using a NAS was specifically NOT supported. That’s still the case—for the Aperture library itself. More on that in a moment.

It turns out, though, that technically, using a NAS for storing your referenced master files is supposed to work. The drive has to be a Mac OS X Extended formatted volume, but even if it’s on a network, Aperture can use it. The only real risk is that an interruption (as outlined above) while writing to the master file (e.g. when you use the Write IPTC Data to Master feature) can result in data corruption or data loss. You don’t need me to tell you that that’s a Bad Thing™. However, it’s a distinctly rare possibility.

Yet I was just helping a reader who did lose files in Aperture while moving them from one drive to a NAS drive (his Time Capsule). Bizarrely, what he experienced was that if the originating Finder window was open while he moved the photos, the photos never made it to their destination. If the Finder window was closed, it worked fine. His findings weren’t scientific and I haven’t tried to repeat them, but that’s pretty scary. The same user also encountered a situation where the Finder reported the Time Capsule as being online (and showed it mounted on the desktop), yet Aperture insisted that the volume was not mounted.

Frankly, these problems point back to the reasons that I don’t think network storage is a good idea for your Aperture masters. Backup—yes. Originals—not so much. Call me paranoid, but… I am.

What about the Aperture Library itself? That is specifically not supported. Actually, Apple’s wording on the topic is:

Also, it is strongly recommended that the Aperture library be stored on a locally mounted hard drive. Storing the Aperture library on a network share can also lead to poor performance, data corruption, or data loss.

The Knowledge Base article this comes from is “Aperture: Use locally mounted Mac OS X Extended volumes for your Aperture library” (http://support.apple.com/kb/TS3252). Notice that it’s talking about the Library here, not the referenced masters.

Where does this leave us? Again, technically, NAS should work for the referenced masters. But you won’t see me switching any time soon. I’m gonna stick with my trusty Drobo for now.

What’s your experience with NAS and Aperture? Sound off in the comments… good or bad experiences, let’s hear ’em all.

App: Apple Aperture
Platform: macOS
Author: PhotoJoseph

The problem described, where the Finder shows the Time Capsule as available but Aperture doesn’t see it, probably means that it got disconnected while being used and, when it was remounted, ended up at a different path in the file system.

For example, if the user goes into Terminal and does a cd /Volumes followed by an ls, they will likely see something like myDriveName and myDriveName 1 listed as directories. If they look in the myDriveName directory they will likely find the files that they are missing.

Just an idea, but I’ve had similar problems with not being able to find the drive in Aperture, and that was what it turned out to be.

To fix the issue they need to eject the network drive, move the myDriveName directory somewhere else (the Desktop would be a good place), and then connect to the network drive again. They should then see only myDriveName in the /Volumes directory, and no myDriveName 1.
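
A minimal Terminal sketch of that diagnosis and cleanup (myDriveName is the placeholder name from the comment above; check the output of mount before moving anything, since you only want to relocate the stale local folder, not a live mount point):

    # See what's sitting in /Volumes; a leftover folder plus a live "myDriveName 1" mount is the telltale sign
    ls /Volumes
    # Confirm which entries are real mounts; stale local folders won't show up here
    mount | grep myDriveName
    # Eject the network share (Finder eject works too); prepend sudo if the volume is busy or permissions complain
    umount "/Volumes/myDriveName 1"
    # Move the leftover local folder, and any orphaned files inside it, out of the way
    mv /Volumes/myDriveName ~/Desktop/myDriveName-recovered
    # Reconnect to the share; it should now mount cleanly as /Volumes/myDriveName again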

That same support document mentions “Referenced images stored on FAT32 volumes may sporadically go offline.” My NAS uses some kind of UNIX format, so I’m not sure if that format is acceptable for Aperture referenced images. I had problems with Mac OS X “fast user switching”: depending on which user logged in first, the user running Aperture had problems because the volume sometimes mounted as “myDriveName 1” and Aperture could not find the referenced images (oddly, iTunes had no trouble finding its “referenced” mp3s even when the volume was mounted as “myDriveName 1”). NAS drives that are not formatted as Mac OS X Extended volumes and not connected via AirPort don’t seem to be a good idea for Macs, other than for backups.

Still using Apple Aperture

Well, there are NASes and then there are NASes, and there are networks and then there are networks. Almost all of the problems outlined above come down to those factors, not to any fundamental flaw in NAS technology; the right NAS on the right network is up to the task.

By the way, the FAT32 issue is only relevant for directly attached devices, not for a NAS. The format of the disks inside the NAS is completely irrelevant; what is relevant is how the volumes are presented over the network: NFS, CIFS, or AFP. I find AFP and NFS extremely reliable; CIFS is “OK”.

On point #1: the bottleneck for access speed to a NAS is the wireless connection (if applicable), followed closely by the CPU speed of the NAS itself. This is why a Time Capsule is such a lousy NAS; it has a hopelessly underpowered CPU for anything but the slowest backup duties. A proper NAS is one with an x86 processor; these were roughly $450+ without drives when I last checked. With a wired GigE connection and a proper NAS, 60 MB/s is achievable. That is faster than FW400 and close to FW800 speeds, and far faster than USB 2.0.

On point #2: wired networks are not unreliable, though wireless is. Any wired-network failures are typically related to how the local OS handles the remote volume, and since Snow Leopard I’ve found it to be absolutely rock solid. Certainly more reliable than a locally mounted USB device.

Yes, fibre connections are fast, but even large enterprises are moving away from fibre-attached storage for cost reasons. Where performance is needed, enterprises are implementing 10GigE networks to NAS devices, and these are even faster than Fibre Channel. So there’s nothing fundamentally wrong with NAS technology.

On point #3: sorry, but that is completely incorrect. The NAS locks the file to safeguard against exactly what you describe.

So the moral of the story is: if you buy a proper NAS, have it present its volumes over NFS or AFP, and connect to it on a proper GigE network running Snow Leopard, you would have no problems using referenced masters on it. That’s a lot of ifs and buts, and probably fewer than 1% of homes satisfy these requirements, which is why Apple recommends against it.
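
For what it’s worth, a minimal Terminal sketch of mounting a share that way might look like this, assuming a NAS reachable as nas.local that exports a share called Photos (the hostname, share name, and export path are placeholders; your NAS’s own instructions take precedence):

    # Mount the share over AFP; the Finder creates the mount point under /Volumes for you
    open "afp://nas.local/Photos"

    # Or mount the same share over NFS (the NAS has to export it via NFS first)
    sudo mkdir -p /Volumes/Photos
    sudo mount -t nfs -o resvport nas.local:/volume1/Photos /Volumes/Photos

    # Keep the Aperture library on a local drive and point the referenced masters at /Volumes/Photos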

I am going through a rework of my archiving system right now because of problems experienced using Aperture on a NAS. Obviously there were a few things I did not read in the instructions and recommendations, but it was all good learning in the end.
So instead of a badly performing Aperture library stored on the NAS, I am now using well-defined libraries on my computer’s hard drive or on a FireWire external drive, referencing masters located on the NAS.
It is working a lot better; the performance is many times better.
However, I still feel a little vulnerable given the problems some people have experienced.

SC — thanks for your input.

On point #3, my understanding is that what you’re saying is accurate for a managed (controlled) NAS (and perhaps I’m using the wrong terms here), such as Final Cut Server. However, if you just mount a drive over a network and start accessing files on it (which is what we’re talking about here), then it’s not controlled, and anyone could read or write to the same files at the same time. Even if the network were smart enough to see you accessing the file 0001.cr2 and lock others out of it, someone could easily change 0002.cr2 just before you load it, and they wouldn’t have been locked out of it because you weren’t accessing it—yet.

Of course the nature of Aperture is such that those files will never be altered, so in theory even an unmanaged NAS should work, but we’ve already discussed why it’s a Bad Idea™ ;-)

thanks again, and if you still feel I’m misunderstanding, please do let me know.

cheers
-Joseph @ApertureExpert

@PhotoJoseph

Hi Joseph,

I was researching this subject and found your post. I’m in limbo trying to use my 2011 MBP, with its single Thunderbolt port, and needing to connect both a high-speed RAID (not the Promise Pegasus, which allows daisy-chaining) holding my Libraries, and a 30” monitor. Until someone comes out with a TB hub/splitter, I’m kinda stuck. So in the meantime, I’ve got my RAID attached to my Lion Server (an older MBP with an ExpressCard slot). It is a very small network setup and everything is connected with gigabit Ethernet.

RAID is formatted as HFS+ and I am the only one with permission to access the RAID volume. I’m getting very acceptable performance opening the library over the network.

I am NOT using “Write IPTC Data to Master”.

Do you still feel this is a dangerous situation? I’ve got a mirror of the RAID running twice a day and also another backup to a Drobo. But I would like to avoid getting into trouble in the first place. I’ve got an older Mac Pro that I could use for Aperture, but it is getting a bit long in the tooth. Two Thunderbolt ports would solve a lot of problems, but that isn’t going to happen anytime soon.

Any thoughts?

Many thanks,
Matthew

"There is nothing worse than a sharp photograph of a fuzzy concept." Ansel Adams

Matthew,

Wow, I guess I wasn’t aware that not all Thunderbolt drives come with multiple Thunderbolt ports. That seems… silly.

I think you’re safe here, since the problems are associated with multiple users. I’d be happier having the Library on your internal drive and the masters on the network drive, which is a supported configuration, but if you can’t, you can’t. You’re doing this about as cleanly as you can, so it may be the only viable solution for you.

Your regular backups are definitely good, but also be sure that you can go back to a backup from a few days ago. If a problem creeps in and gets pushed to your backup, it may be too late to recover. You didn’t say what kind of backup you’re running to the Drobo (Time Machine, or some other third-party solution?), but even the occasional Vault backup on a staggered basis (so you keep older ones around for a while) might not be a bad idea.

@PhotoJoseph

Hi Joseph,

I wasn’t “subscribed” to this page, so I didn’t see your reply till now. (Squarespace’s handling of subscriptions is not obvious; perhaps a note below the post field might help…)

My RAID is eSATA, not Thunderbolt. I bought a Sonnet ExpressCard-to-Thunderbolt adapter for an ExpressCard eSATA card, but the adapter only has one TB port. That setup did give me amazingly fast speeds connected to the MBP, but then I couldn’t use the 30” monitor.

So now the RAID is attached to a Mac Pro server with an eSATA card, and throughput is plenty fast. I don’t want to keep the library on my local drive because I plan to have other people come in and work on the Library from another machine. Keeping it local would also make for a more complicated backup strategy.

Re: Backup - I’ve got CCC doing a daily clone of the RAID, and then another CCC clone with archiving to the Drobo. As the Drobo fills up I prune the archive. I’m also looking into implementing CrashPlan to an external backup server, as my current offsite backups are only weekly or less often. Since CrashPlan only backs up changed bits, it seems like it should work across the net even though the Libraries themselves are massive. Any experience with CrashPlan?

Thanks,
Matthew

"There is nothing worse than a sharp photograph of a fuzzy concept." Ansel Adams

Matthew,

Good point on the subscribe thing. I can’t seem to edit that window but I’ve asked Squarespace support. Thanks.

I don’t have any experience with CrashPlan, sorry. I use Backblaze for my offsite backup, and while the original backup took quite a bit of time, now that it’s done it’s a wonderful thing. I actually back up three systems to it now… it’s nice to know everything is backed up offsite, all the time :)

@PhotoJoseph
