Is a digital content creation card the same as a rendering card?
No, unless some marketing guy has his stuff mixed up.
Rendering cards are the super expensive add-in cards that help the CPU render.
"DCC cards" usually refers to workstation-class graphics cards (Quadro or FireGL).
Video cards do not help with rendering. It's all CPU and RAM, unless you have the add-in card discussed by mamamamamannamnamnamna.
Okay... so a card like a FireGL or Quadro FX only helps with content creation in software like Max, but I need to get a separate rendering card to handle the rendering, is that it?
You don't need a card for rendering. Any computer will render; it's just a matter of time. I imagine if you were making a ton of money off of your renderings, some type of accelerator (or a render farm) may be a helpful investment, but for MOST people (not ILM or WETA) you can just get a nice robust PC and do renderings pretty quickly. A dual-processor or dual-core machine with 2GB RAM would do nicely. I render on my AMD Athlon 3000+ with 768 MB and it's fine. Not as fast as my dual Opteron 244 with 2GB RAM, though. Short answer: no, you don't need a rendering card.
Well... if you render a highly detailed image, even your test renders take a lot of time. I need to turn out renders at a commercially viable rate. I think there's a rendering card called PURE something... I was wondering if there are any others available.
i have a business card. lots of them. would you like one? ten? fifty?
You'd have to be making decent $$ to justify buying something like the PURE card (not to mention they only support mental ray, as far as I know).
How much are you going to spend? I would think it'd be much more cost effective to buy one great workstation with a nice Quadro/FireGL, then add a few rack-mount dual-processor machines with tiny drives and cheap graphics as a render farm.
This is pretty big bucks, though. I assume if you are thinking about the PURE solution, you realize that $10k will be gone in the blink of an eye for a complete setup?
GPUs will be the future, but not yet.
Also, if you have a tight deadline, consider an online render farm:
http://www.respower.com
http://www.rendercore.com
Does anyone know how to make a render farm? I have to render high-res images with a high level of detail. Doing it on a single machine takes too much time... and if I make a mistake I lose precious time.
Oh, and does a content creation card affect rendering speeds in any way?
All I can tell you about the render farm they have at Penn (which mostly goes unused) is that it is 12 dual-Opteron machines with 2 GB RAM each in a rack. They don't have video cards and I don't think they have hard drives. I've actually never used it, because the way it's set up it is best suited to animations, and it only works with Max. For animations it's good because you can have 12 machines each rendering a frame at a time. For single frames, one machine renders the frame, so it's actually slower than my machine because it's talking over the network. Plus, the new Dell workstations that the school promoted are dual dual-core, so they actually render with 8 buckets as opposed to the farm's 2. But for animations, it's saved people a ton of time.
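For what it's worth, the frame-per-machine scheme described above is easy to picture in code. Here's a minimal sketch (the node names are made up, and in practice a queue manager like Backburner does this assignment for you):

```python
# Sketch of frame-per-machine distribution: each node in the farm is handed
# one frame of the animation at a time, round-robin.
def assign_frames(frames, nodes):
    """Map each frame number to a render node, round-robin."""
    return {frame: nodes[i % len(nodes)] for i, frame in enumerate(frames)}

nodes = [f"opteron-{n:02d}" for n in range(1, 13)]  # 12 dual-Opteron boxes
jobs = assign_frames(range(1, 101), nodes)          # a 100-frame animation
print(jobs[1], jobs[13])  # frames 1 and 13 land on the same node
```

This also shows why the farm is no help for a single frame: with one frame in the queue, only one node ever gets work.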
Second question, no.
You can send a single frame render to more than one machine - split frame or distributed rendering (depends on the software you are using to render). This is a huge time saver for high res images.
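The split-frame idea above boils down to cutting one big frame into tiles ("buckets") and farming the tiles out. A rough sketch, just to illustrate the geometry (real renderers handle the splitting and reassembly internally):

```python
# Split-frame ("bucket") rendering: a single high-res frame is cut into
# tiles, and each tile can be rendered by a different machine.
def make_tiles(width, height, tile):
    """Yield (x, y, w, h) regions covering a width x height frame."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (x, y, min(tile, width - x), min(tile, height - y))

tiles = list(make_tiles(4000, 3000, 512))
print(len(tiles))  # 8 columns x 6 rows = 48 tiles for a 4000x3000 frame
```

With 48 tiles, even a modest farm keeps every machine busy on a single high-res frame, which is exactly where it saves time.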
To make a render farm you just need to network the computers and install Max/Maya/whatever on each machine (you can have one licensed copy installed on your machine and still install it on the farm, you can do this with any software that can distribute the rendering, like After Effects).
When you want to use the farm, you'll send the job specifically to the farm. For testing, use your own machine (it can take some time to send the maps out to the machines, so low-res images can actually take longer on the farm).
There are a bunch of things you need to make sure are in order (like the locations of the maps, from texture maps to GI maps). Once it's all in order, though, it should be 'click and go'.
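A cheap way to catch the map-location problem before a job dies on the farm is a pre-flight check. Purely illustrative sketch (the paths and file names here are made up; the point is just to verify that every map sits somewhere the render nodes can reach, e.g. a shared network path):

```python
# Hypothetical pre-flight check: before submitting to the farm, confirm that
# every map path (texture maps, GI maps, etc.) actually resolves.
import os

def missing_maps(map_paths):
    """Return the map paths that don't exist (these would fail on the farm)."""
    return [p for p in map_paths if not os.path.exists(p)]

maps = ["//server/share/textures/brick.jpg", "//server/share/gi/photons.pmap"]
bad = missing_maps(maps)
if bad:
    print("fix these before submitting:", bad)
```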
As a side note, most offices already have everyone's workstations networked together. You can set all of these machines up as a render farm. It'll significantly slow the entire system down if you try to use the machines while rendering (if someone wants to open Acad while it's rendering in the background, it'll perform like a dog), but you could at least render after people have left or over the weekend.
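If you go the after-hours route, the scheduling part is trivial to script. A minimal sketch (the render command name is a placeholder, not a real renderer invocation):

```python
# Compute how long to sleep before kicking off an overnight render.
import datetime

def seconds_until(hour, now=None):
    """Seconds from `now` until the next time the local clock reads hour:00."""
    now = now or datetime.datetime.now()
    start = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if start <= now:
        start += datetime.timedelta(days=1)  # already past it today; go tomorrow
    return (start - now).total_seconds()

# e.g. sleep until 7pm, then launch the render (placeholder command name):
# import subprocess, time
# time.sleep(seconds_until(19))
# subprocess.run(["render_overnight.bat"], check=True)
```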
This is your low budget render farm.
Go talk with folks at CGTalk or CGArchitect, you'll find more specific responses.
I think the Penn farm is just set up in a strange way that doesn't actually do distributed rendering properly. I know it can be done and should work that way, but ours doesn't at the moment.
trace is spot on with everything he's said, but I'll elaborate a bit more since I just got my farm together.
I scouted a lot of forums like 2cpu.com and the already-mentioned CGTalk and CGArchitect to figure out what (at the time) would give me the most bang for the buck, and it turned out to be some genetically blessed 1.6GHz Xeons that can be [extremely] easily overclocked to 3.2 on air cooling, or even more with water on certain motherboards...
In short, it increases productivity tremendously: 1) you don't have to wait all night for a render; 2) said renderings become better; 3) the burden on your workstation is lightened, which lets you keep modeling/drafting/Illustrator-ing... etc., which tremendously helps your digital workflow.
Right now, if you're considering building your own farm out of several machines, you couldn't go wrong with an overclocked Core 2 setup... plus you'd have the potential to go quad-core if you felt so inclined.
Once you start talking about render farms, software licenses start to become a concern if you're using anything other than the default renderer. FinalRender, Brazil, and VRay all have limits on how many processing cores you can use with one license... just food for thought.
Here's my build-up thread; it may answer some of your questions:
http://forums.2cpu.com/showthread.php?t=75174&highlight=pc-dl
for a wealth of info, search "renderfarm" in the technical and hardware forum @ CGTalk.com
cheers