
Upgrading video cards for image processing!

Asher Kelman

OPF Owner/Editor-in-Chief
One can spend a lot of money upgrading Apple's video cards. Consider needs for image processing only; set aside 3D modeling, architecture, games, and so forth for the moment.

For what reason should we upgrade? Which card is the best value?

Let's say we need to process massive files, run batch jobs, work with multiple layers, or drive larger and/or multiple monitors.

Apple offers upgrades at additional cost: what is really needed?

Then which cards meet these needs? Let's assume we already have at least 1-4 GB of RAM.

Asher
 
With dual displays, any of the default 256 MB cards should work fine. You might need a 512 MB card for dual 30-inch Cinema Displays. Photoshop uses a small amount of video processing power to run the user interface (i.e., Aqua on Macs), but the video card only displays images and windows; it does not accelerate the image processing.

Hence, I would doubt that upgrading a card will do much for you except perhaps make screen display at higher resolutions a touch smoother.

Everyone, please feel free to correct any errors of fact or assumption I have made.

enjoy,

Sean
 

Stan Jirman

New member
One example of the need for brute-force GPU power is Aperture. All of its real-time processing (while you're making adjustments) is done on the GPU. Add more speed and RAM there and you'll quickly see the difference. Motion or FCP would be another example.

In "normal" life, you will see almost no difference. For instance, PS doesn't use the GPU at all beyond what it does through the OS (moving windows, etc.). Specifically, PS does its own color management, so it will draw more slowly than most other apps that use Apple's ColorSync engine (which can in some instances run on the GPU).

You will see a difference with gadgets such as Exposé, multiple desktops, etc., where more GPU RAM is a huge benefit.

Of the current line-up, the 7300 is pretty sad, especially as soon as you connect two monitors to it. The ATI card is a bargain, as the Quadro card offers barely any speed benefit at a far higher price. However, ATI cards have less accuracy than NV cards, so for applications like Aperture you may be better off with NV cards. It's a long story to explain all the nitty-gritty, but that's the way things are :) Still, of the current line-up I'd buy the ATI card and not the Quadro, simply because the Quadro carries a sick price tag, and the whole Aperture angle is overblown anyway, since the "final render" (export to TIFF, printing) is always done on the CPU.

As for your final comment about "already having at least 1-4 GB RAM" - that's quite a spectrum :) Anything below 2 GB is simply bad; my recommendation has always been a minimum of 1 GB per processor (since typically you want the processors to be doing something sensible).
 

Asher Kelman

OPF Owner/Editor-in-Chief
Thanks Stan,

A lot of great information. The reason for the wide range, and for setting up the unintended conundrum, is that some PowerBooks are RAM-limited, and others can't even run Aperture but can at least run PS 7.0. So there was a wide array of applications to cover.

It's great you can shed light on this. What about Lightzone: does that also use the GPU? If you don't know, I'll ask them.

Now, for the payoff in Aperture: what is the multiplication factor in speed when going from the standard graphics card to the ATI card?

Asher
 

Stan Jirman

New member
Asher Kelman said:
It's great you can shed light on this. What about Lightzone: does that also use the GPU? If you don't know, I'll ask them.

I don't even know what Lightzone is :) You can look at it another way, for every application: if it uses Core Image (introduced in 10.4 Tiger), it uses the GPU. If it's not using Core Image, it's virtually impossible for it to use the GPU, because that's an OS-managed resource. The software could use OpenGL directly to get GPU support, but realistically that would be way too much pain.

Asher Kelman said:
Now, for the payoff in Aperture: what is the multiplication factor in speed when going from the standard graphics card to the ATI card?

I don't actually own or use the ATI card. However, I do have the Quadro 4500 (long story), and there the performance difference is quite noticeable. I don't have a firm number, since you're looking at a difference in frame rate (smoothness of updates), which is pretty hard to measure, but I'd say close to 10x for complex filters (sharpen, highlight/shadow). The ATI card averages about 90% of the 4500's performance, so it's still a huge difference.
 

Mike Sisk

New member
I think a better way to put it is "if it uses Core Image, it MAY (emphasis mine) use the GPU".

The beauty of Apple's Core Image is that the programmer of an application doesn't have to worry about the graphics hardware -- the system will use the GPU if possible, but if you're on something like a Mac Book that essentially doesn't have a GPU, it'll default to doing the graphics processing in software.

Of course, the programmer has to use the correct API to use Core Image. And there is an override -- a programmer can tell Core Image to NOT use the GPU. [Apple has a tech note about this on the developer site. I'm not linking to it because I don't know if it's a publicly available resource, but a search on the kCIContextUseSoftwareRenderer flag should find it.]
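For anyone curious what that override looks like in practice, here is a minimal sketch using today's Swift Core Image API (the 2006-era equivalent was an Objective-C options dictionary; this is illustrative only, not from the thread):

```swift
import CoreImage

// kCIContextUseSoftwareRenderer is the flag Mike mentions: when set,
// Core Image renders on the CPU even if a capable GPU is present.
let softwareContext = CIContext(options: [
    .useSoftwareRenderer: true  // opt out of GPU rendering
])

// The default context lets the system choose the GPU when available.
let defaultContext = CIContext()
```

Leaving the option unset gives the behavior described above: the system uses the GPU if it can, and falls back to software otherwise.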

As for Lightzone: as I understand it, it's a cross-platform Java application and likely isn't able to use Core Image. OTOH, Adobe's Lightroom probably can, and I'd imagine the next Photoshop will, too.
 

Asher Kelman

OPF Owner/Editor-in-Chief
Mike Sisk said:
I think a better way to put it is "if it uses Core Image, it MAY (emphasis mine) use the GPU".

The beauty of Apple's Core Image is that the programmer of an application doesn't have to worry about the graphics hardware -- the system will use the GPU if possible, but if you're on something like a Mac Book that essentially doesn't have a GPU, it'll default to doing the graphics processing in software.

Wow! Does this mean that Aperture won't be able to pump out images as fast as expected on a PBPro? Can one add a graphics card externally, or BTO with a higher-level card?

Asher
 

Mike Sisk

New member
There's no build-to-order option or possibility of replacing the graphics card on any of the current Apple laptops. However, I expect Apple will at some point offer a better graphics card option on the Mac Book Pro, as they're doing with the 24" iMac.

As for the current Mac Book Pro (and even the G4 PowerBooks), they do have a GPU that Aperture can use -- not as good as the desktop systems, but better than nothing.

Now, the Mac Books (e.g. the 13-inch non-Pro models) do have a GPU, but it's not much of one and, as I understand it, doesn't have Core Image hardware support.
 

Asher Kelman

OPF Owner/Editor-in-Chief
Mike,

I really don't understand why Apple doesn't offer a special version of each of its laptops LOADED with the finest options for an extra premium.

Imagine a 13" Powerbook Pro with 6GB RAM and high end graphics card and a 200GB drive!

Asher
 

Mike Sisk

New member
Asher Kelman said:
I really don't understand why Apple doesn't offer a special version of each of its laptops LOADED with the finest options for an extra premium.

Imagine a 13" Powerbook Pro with 6GB RAM and high end graphics card and a 200GB drive!

Market cannibalization.

This is something nearly any company that sells both low-end and high-end products has to cope with. The problem is that if you make the low-end product too good you'll drive sales away from the more-profitable high-end.

Canon has the same problem. If they make the 5D too good it'll eat into sales of the 1Ds. Hence the 5D lacks weather seals and a few other things pros desire. Same with Apple.

Of course, none of these products exist in a vacuum so this internal dance has to co-exist with external competition. Apple has changed the tune of the dance somewhat with the 24" iMac by adding Firewire 800 and better graphics cards. I expect the next update of the Mac Book Pro to continue the trend.

True story: about 10 years back I worked for a large software company that wrote CAD software. They decided to come out with a low-end product and had the same concerns about cannibalization. One of the features we decided to add but "dumb-down" was a programming interface for customization. The trick was to limit it so the customer could do simple things like macros but keep people from developing full-blown custom applications (which could eat into sales of the much more expensive "pro" product).

So I spent a year on this project team only to be told just weeks before shipping that the executives decided to drop the whole programming interface. It sure sucks to see a whole year of your work just flushed away.
 