
Moving files from a PC to a Mac

Richard Rickard

New member
I purchased an iMac 24" and I am in the process of moving my picture files to the Mac from a PC.
My question: is copying as good as moving a file? Do you lose any quality when you copy compared to moving a file?

Thanks, Rick Rickard
 

Don Lashier

New member
Moving is identical to copying except that after making the copy, move will delete the original. My preference is to bulk copy, then bulk delete the originals if desired. This way you can verify that everything's ok before deleting the originals.

- DL
 
Don Lashier said:
Moving is identical to copying except that after making the copy, move will delete the original. My preference is to bulk copy, then bulk delete the originals if desired. This way you can verify that everything's ok before deleting the originals.

I agree. Moving is dangerous. You can lose files doing that if anything goes wrong. Copy the files, validate them, then only delete them off the PC if you need the space and have them backed up elsewhere.

At a high level, a move is simply a copy followed by a delete of the source files. In practice, copies of large numbers of files can and do fail for some of them. This makes a copy, verify/validate, then delete process much safer for your data, and even a simple copy and verify much safer.
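The copy, verify, then delete sequence can be sketched in the shell. The directories and the fake image file below are invented stand-ins for the demo, not anyone's actual photo paths:

```shell
# Copy -> verify -> delete, using throwaway demo directories.
rm -rf /tmp/opf_demo1 && mkdir -p /tmp/opf_demo1/src
echo "fake pixel data" > /tmp/opf_demo1/src/img_0001.raw

cp -R /tmp/opf_demo1/src /tmp/opf_demo1/dst          # bulk copy
if diff -r /tmp/opf_demo1/src /tmp/opf_demo1/dst; then
    rm -r /tmp/opf_demo1/src                         # delete originals only after the verify passes
    echo "verified, originals deleted"
fi
```

diff -r walks both trees and exits non-zero if any file differs, so the originals are only removed after a clean comparison.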

my $0.02,

Sean (a bit more dramatic with the same message)
 

Asher Kelman

OPF Owner/Editor-in-Chief
O.K., Sean,

What are the best copy-validate routines?

I open a series of files and check they work. OTOH, I've found to my chagrin, after the fact, that some files are indeed damaged!

So for Macs, what validation software do you use? We touched on this recently, but it's worth covering again here as it's so important.

Asher
 
Asher Kelman said:
O.K., Sean,

What are the best copy-validate routines?

I open a series of files and check they work. OTOH, I've found to my chagrin, after the fact, that some files are indeed damaged!

So for Macs, what validation software do you use? We touched on this recently, but it's worth covering again here as it's so important.

Asher
On an OS X or *nix box, or using the Cygwin environment on XP, open a bash command shell. Next, let's define our directories:

/home/mydir/source_images - The directory the copy was done from.

/home/otherdrive/destination_images - The directory the images were copied to.

Please note that the above two values will differ from system to system, and it is assumed that the /home/otherdrive/ directory is actually a mount point for a different physical hard drive.

Then you only need one command, using the version of diff I have in my Cygwin install:
bash> diff /home/mydir/source_images /home/otherdrive/destination_images
And it will list what differs. Not knowing exactly how diff compares the files (does it check only file name and size, or do a bit-by-bit comparison?), I would finally suggest doing a hash-based check.


i.e., if your system supports sha1sum, use sha1sum; else if it has md5sum, use md5sum; else fall back on cksum. Note the recommended hash functions are simply more reliable, being newer (and slower) algorithms.
bash> cd /home/mydir/source_images
bash> for i in $(ls); do
bash> sha1sum $i >> source.txt
bash> done
bash> cd /home/otherdrive/destination_images
bash> for i in $(ls); do
bash> sha1sum $i >> destination.txt
bash> done

bash> diff /home/mydir/source_images/source.txt /home/otherdrive/destination_images/destination.txt
And if, and only if, the final diff comes up clean would I feel safe deleting the source files, provided they were backed up elsewhere.

Finally, it should be noted this could take a significant amount of time for many GB of data. But these are our photos we are talking about so hopefully you feel it is worth it.
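As a refinement to the two-list diff above: GNU sha1sum has a built-in check mode (-c) that re-verifies a recorded list of hashes, so you do not need to create and diff two text files. A small self-contained sketch, with invented demo paths (on systems without sha1sum, shasum plays a similar role):

```shell
rm -rf /tmp/opf_demo2 && mkdir -p /tmp/opf_demo2/src /tmp/opf_demo2/dst
echo "raw bytes" > /tmp/opf_demo2/src/img_0001.raw
cp /tmp/opf_demo2/src/img_0001.raw /tmp/opf_demo2/dst/

# Record hashes in the source directory...
(cd /tmp/opf_demo2/src && sha1sum * > /tmp/opf_demo2/checksums.txt)
# ...then re-check the same list against the copies.
(cd /tmp/opf_demo2/dst && sha1sum -c /tmp/opf_demo2/checksums.txt)
```

sha1sum -c prints one "OK" line per matching file and exits non-zero if any file fails to verify.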

enjoy,

Sean
 

Asher Kelman

OPF Owner/Editor-in-Chief
Sean,

That's helpful. I should be more comfortable getting into command line solutions.

At least I will work with that. However, I need to refresh on hash, bash and other such terms!

Or else we need ready-to-use software front ends that hide the real work but are user friendly.

Probably I can do it from the command line in OS X.

Asher
 
Hi Asher,

Asher Kelman said:
Sean,

That's helpful. I should be more comfortable getting into command line solutions.

You can find a reasonable looking introduction at:

http://www.macdevcenter.com/pub/ct/51

Start at the bottom and work up. I would start with:

http://www.macdevcenter.com/pub/a/mac/2001/12/14/terminal_one.html

Asher Kelman said:
At least I will work with that.

The command line can be daunting at first, but it is how I learned to use computers to do things. There are things I can do much faster with a command line than with GUI tools, and vice versa. With OS X one can always fall back on the command line, which is a real plus over earlier Mac OS versions, although there was once a Mac-based Unix (A/UX) out there prior to OS X.

Myself, once I had the Cygwin *nix tools on Windows I was happy as I had the tools I needed when I needed them and a reasonable GUI the rest of the time.

Asher Kelman said:
However, I need to refresh on hash, bash and other such terms!

First, I should note that bash (the GNU Bourne-Again Shell) is a modern variant of the classic Unix Bourne shell (sh). And when I use a fixed-width font with bash> I am simply denoting a command-line prompt. All fixed-width text should be considered to exist within a terminal/shell window. This fixed-width text convention is very common.

By hash I refer to a cryptographic hash function, checksum, or cyclic redundancy check (CRC): a function which takes an input string and outputs a fixed-length, random-looking string that is, for all practical purposes, unique for every input and should be greatly dissimilar even for nearly identical inputs (i.e., inputs differing by exactly one bit).

For instance, consider two text files containing the alphabet.

a.txt
abcdefghijklmnopqrstuvwxyz
and b.txt
aBcdefghijklmnopqrstuvwxyz
The two files differ almost trivially, with the letter 'b' capitalized in the second.

Now note that the output of the SHA-1 hash algorithm is a 160-bit number, which is represented by a 40-digit string of hexadecimal characters.
bash> sha1sum a.txt b.txt
8c723a0fa70b111017b4a6f06afe1c0dbcec14e3 *a.txt
fcee15a6afb615e50c6eecce0447cc7047cc9178 *b.txt
The hash value is on the left in the output followed by a space, an asterisk, and then the filename and a newline.

Note how greatly the value generated by the sha1sum command varies between two almost identical files. This variance makes hash values reasonably reliable for verifying whether two files differ. It is not quite as reliable as a direct comparison, but I am not sure what the diff command actually does, so I recommend double-checking; these are photos we are talking about, not mp3s one can rip again from a CD collection.
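For the "direct comparison" mentioned above, the standard byte-by-byte tool is cmp: it compares two files byte for byte and reports the first difference. Recreating the a.txt/b.txt example in a throwaway directory:

```shell
rm -rf /tmp/opf_demo3 && mkdir -p /tmp/opf_demo3
printf 'abcdefghijklmnopqrstuvwxyz\n' > /tmp/opf_demo3/a.txt
printf 'aBcdefghijklmnopqrstuvwxyz\n' > /tmp/opf_demo3/b.txt

# cmp exits non-zero on any difference and reports where it occurred
cmp /tmp/opf_demo3/a.txt /tmp/opf_demo3/b.txt || echo "files differ"
cmp /tmp/opf_demo3/a.txt /tmp/opf_demo3/a.txt && echo "files identical"
```

Unlike a hash comparison, cmp leaves no (however tiny) chance of two different files looking the same, at the cost of reading both files in full.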

I should also note that some systems have limits on how many files they can handle in an argument list and may fail on thousands of files. In this case one can use wildcards in the ls statement to limit the list, e.g., ls 20060825_*, then ls 20060826_*, et cetera, assuming you rename your files by date using big-endian dates (YYYYMMDD: year, month, day) to ensure alphabetical sorts also sort temporally.
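The argument-list limit (and the fragile parsing of $(ls) when filenames contain spaces) can also be sidestepped with find piped to xargs. This is a generic *nix idiom rather than anything from the thread; the demo directory and filenames are invented:

```shell
rm -rf /tmp/opf_demo4 && mkdir -p /tmp/opf_demo4
echo one > "/tmp/opf_demo4/20060825_001.raw"
echo two > "/tmp/opf_demo4/20060826_with space.raw"

# -print0 / -0 pass filenames NUL-separated, so spaces are safe,
# and xargs batches the arguments to stay under the system limit.
find /tmp/opf_demo4 -type f -name '*.raw' -print0 \
  | xargs -0 sha1sum > /tmp/opf_demo4/hashes.txt

wc -l < /tmp/opf_demo4/hashes.txt    # one hash line per file found
```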


Asher Kelman said:
Or else, we need ready to use software front ends that hide the real work, but are user friendly.

That is what I use on XP. But I do not know of any to recommend for OS X or *nix systems, hence falling back to the command line.


Asher Kelman said:
Probably, I can do it in command line in OSX.

You can do it. It is not hard once you get some of the basics down. But it is also not necessarily something a photographer should spend their time learning unless they enjoy tinkering with computers. Still, it is our safety net for getting work done on computers.

enjoy,

Sean
 

Asher Kelman

OPF Owner/Editor-in-Chief
Sean,

The first reference says that the explanations are no good with OS 10.2 and beyond! Is there a new version?

I think learning to classify in Latin the underwater sea creatures you photograph would be easier!

However, at least in command line work, one does not run out of oxygen, and wine is always available!

Asher
 
Asher Kelman said:
Sean,

The first reference says that the explanations are no good with OS 10.2 and beyond! Is there a new version?

I think learning to classify in Latin the underwater sea creatures you photograph would be easier!

However, at least in command line work, one does not run out of oxygen, and wine is always available!

A note I found says:

Note: tcsh is the default shell in OS X Jaguar (10.2)
Panther (10.3) and Tiger (10.4) default to bash - many commands do work in either shell, but this page will soon be updated to list the bash syntax.
That update will fix the shell syntax issue, since the syntax I chose is bash and will not work in tcsh, which uses different loop syntax (I use bash as it is the most common default I run into anymore).

I hate to admit this, but a 15 minute search is only turning up info for programmers rather than users. The best I found is:

http://developer.apple.com/documentation/OpenSource/Conceptual/ShellScripting/index.html

Which is also available as a pdf. You will only be interested in the beginning part of anything you find. After the first 3 or 4 chapters things will go too far into programming for most.

But the real trick is to take the above code, turn it into a shell script, and enhance it so that it copies files from a CF card to your drive, validates them, renames them, copies them to a backup drive, validates the backup copy, and then moves the new files to the directory they belong in. You could enhance this further by having the script call Adobe's DNG Converter to validate each file. Then you put this script in a file and simply double-click it on your desktop and have it all done for you, your way.
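A minimal skeleton of such an ingest script, purely as a sketch: the "card" and archive paths and the date-prefix renaming are invented for the demo, and the DNG Converter step is left as a comment since its invocation depends on your install:

```shell
#!/bin/bash
# Hypothetical ingest sketch: copy from a "card", verify, rename by date.
set -e                                   # stop at the first failed step

CARD=/tmp/opf_demo5/card                 # stand-in for a mounted CF card
ARCHIVE=/tmp/opf_demo5/archive
rm -rf /tmp/opf_demo5
mkdir -p "$CARD" "$ARCHIVE"
echo "raw data" > "$CARD/IMG_0001.CR2"   # fake capture for the demo

cp -R "$CARD/." "$ARCHIVE/"              # 1. bulk copy
diff -r "$CARD" "$ARCHIVE"               # 2. verify (aborts the script on mismatch)

for f in "$ARCHIVE"/IMG_*.CR2; do        # 3. big-endian date prefix (YYYYMMDD)
    mv "$f" "$ARCHIVE/$(date +%Y%m%d)_$(basename "$f")"
done
# 4. (optional) call Adobe's DNG Converter here to validate each file
echo "ingest complete"
```

Note the card contents are never deleted by the script; that decision stays with you, after the verify has passed.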

But my real point here is that the OS has the tools to do what is needed for safe file copies if needed. But GUI tools can be much more convenient unless you need/desire specific customizations. To copy CF cards to my box I actually use the command line to call a script that executes a series of commands. Perhaps I should enhance it and share it out.

As to learning the Latin names of the intertidal species, I have a first-tier book for the local ecosystem which makes it pretty easy. I also have the author's (Eugene Kozloff) book for plants and animals of the Pacific Northwest, and that too eclipses anything else I have for ease of identifying species. Terrestrial arthropods (insects, spiders, ...) and fungi are much harder, and I do not currently bother with the fungi. Perhaps if I find a good website for fungi IDs I might, but it is unlikely after my first wormy, larva-laden spore print experiment.

As to wine, well, my camera backpack is a CamelBak and it stocks 100 oz of H2O (~3 liters) and has room for my camera bag/s and some survival basics like food. As to the command line, I find it does not mix well with wine.

enjoy,

Sean
 