
Adobe Premiere Render Farm Mac

When you work with 3D or video rendering, you quickly realize—excuse the tacky, car-salesman tone of this cliché—that time is actually money. Under a tight deadline with very demanding tasks, you need more than just a faster machine. You need a lot more power, and one of the main ways of getting it is to divide your task among networked machines.

Imagine having five or six Mac Minis on 10-gigabit Ethernet, connected to a Jellyfish shared-storage server and working as a render farm: taking raw footage and making ProRes proxy footage, or H.264 footage, for use in Premiere, Final Cut, or Resolve, with multiple machines churning through clips at once.
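To make that concrete, here is a minimal sketch of the kind of job each node in a proxy farm like that might run: it walks a folder of raw clips on shared storage and transcodes each one to a ProRes proxy with ffmpeg. The mount point, folder layout, and script name are hypothetical placeholders; only the ffmpeg invocation itself is standard.

    # proxy_node.py -- sketch of a per-node ProRes proxy pass.
    # Assumes ffmpeg is on the PATH; /Volumes/Jellyfish is a hypothetical
    # shared-storage mount, so adjust the paths for your own setup.
    import subprocess
    from pathlib import Path

    RAW_DIR = Path("/Volumes/Jellyfish/raw")        # assumed source folder
    PROXY_DIR = Path("/Volumes/Jellyfish/proxies")  # assumed output folder

    def make_proxy(clip: Path) -> None:
        """Transcode one clip to a ProRes 422 Proxy .mov with ffmpeg."""
        out = PROXY_DIR / (clip.stem + "_proxy.mov")
        if out.exists():
            return  # another node already handled this clip
        subprocess.run(
            ["ffmpeg", "-y", "-i", str(clip),
             "-c:v", "prores_ks", "-profile:v", "0",  # profile 0 = Proxy
             "-c:a", "pcm_s16le",
             str(out)],
            check=True,
        )

    if __name__ == "__main__":
        PROXY_DIR.mkdir(parents=True, exist_ok=True)
        for clip in sorted(RAW_DIR.glob("*.mov")):
            make_proxy(clip)

A real farm would claim jobs through a proper queue instead of the naive "skip it if the output already exists" check, but the shape of the work is the same.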

In my recent Mac Pro review, I mentioned that a workstation computer (whether from Apple, HP, or Dell) is designed to be good at a lot of tasks that are demanding on the CPU and, increasingly, on the GPU. But a workstation can be too much machine for some things, so you often want to free it up for asset production instead of saturating it with render tasks. Workstations are overkill for what I call 'dumb muscle.'

Dumb muscle machines are like mafia button men: they lie dormant until the boss calls them in for a job, and then they go beat the hell out of something—in this case, a set of 3D frames or video composites. Afterward, they go back to sleep and wait for the next call. Dumb muscle essentially has one job: keep pummeling. If you put all your money and CPU cores into one box, you are going to be in a really bad spot if that machine goes down. But if your workstation suddenly needs servicing and you have a lot of networked muscle, you can instantly work from a laptop and use a render server to keep your efficiency relatively high while your main machine is MIA.

I opted for the 3.0GHz 8-core Xeon E5 v2 machine because, for anything that doesn't stress more than eight cores—almost everything but 3D rendering or certain video encoders—the 8-core performs better than the lower-clocked 12-core. So I skipped the $1,500 upgrade from the 8-core to the 12-core E5 v2, since the 8-core gives better performance on tasks that can't saturate all the cores. Instead, I spent a bit of extra money to get another 10-core 3.0GHz Xeon E5 v2 and put it in a machine that runs only during V-Ray for Maya renders. If you use a 3D renderer that has network rendering support, you can use it on the host machine and the dumb muscle to quickly take down render jobs, and the resulting combo should be much faster than a single workstation jacked to the hilt with tons of cores or dual-socket CPUs.
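As a rough illustration of that divide-and-conquer idea (this is not V-Ray's actual distributed-rendering mechanism, which is handled by its own standalone and spawner tools), here is a sketch that splits a frame range into equal chunks and launches one chunk per machine over ssh. The hostnames and the render command template are placeholders; substitute whatever command-line front end your renderer provides.

    # split_frames.py -- sketch of naive frame-range splitting across nodes.
    # The hostnames and RENDER_CMD template are hypothetical; swap in your
    # renderer's real command-line invocation.
    import subprocess

    NODES = ["workstation.local", "muscle01.local"]  # assumed hostnames
    START, END = 1, 240                              # frames to render
    RENDER_CMD = "render --scene /mnt/projects/shot.scn --start {s} --end {e}"

    def chunks(start: int, end: int, parts: int):
        """Yield (start, end) frame ranges of roughly equal size."""
        total = end - start + 1
        size = -(-total // parts)  # ceiling division
        for i in range(parts):
            s = start + i * size
            e = min(s + size - 1, end)
            if s <= e:
                yield s, e

    procs = []
    for node, (s, e) in zip(NODES, chunks(START, END, len(NODES))):
        cmd = RENDER_CMD.format(s=s, e=e)
        # Fire and forget: each node chews through its own slice of frames.
        procs.append(subprocess.Popen(["ssh", node, cmd]))

    for p in procs:
        p.wait()

A real render manager does smarter things (per-frame queues, retries, priorities), but this is the basic shape of what it automates.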

Now, there are a lot of variables and questions to answer for anyone considering this route. If you're not much of a system builder, it's hard to know what CPU to buy, what OS to run, how to set up your host machine to talk to the muscle, and what, if any, render management software to use. While I'm not an IT guy for Pixar, I've used networked 3D renderers for a number of years and have built some personal render nodes, both Xeon and desktop-chip based. If you are looking to build a room full of HPC nodes and are concerned about heat, performance, reliability, etc.—this guide is not for you. This guide is intended for people who just want to get set up with some added networked power, to maybe learn the basics of Linux and remote administration with ssh, and to understand the foundation of networked rendering and workflows.
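If the remote-administration part sounds intimidating, the core of it is just this: once key-based ssh login is set up, you can run any command on a node from your host machine and script around the result. Here is a minimal sketch, with hypothetical hostnames, that polls the load average on each node:

    # node_status.py -- sketch: check load averages on each render node over ssh.
    # Assumes passwordless (key-based) ssh is already configured; the hostnames
    # are placeholders.
    import subprocess

    NODES = ["muscle01.local", "muscle02.local"]

    for node in NODES:
        try:
            # 'uptime' prints the 1/5/15-minute load averages on Linux and macOS.
            out = subprocess.run(
                ["ssh", "-o", "ConnectTimeout=5", node, "uptime"],
                capture_output=True, text=True, timeout=10, check=True,
            )
            print(f"{node}: {out.stdout.strip()}")
        except (subprocess.CalledProcessError, subprocess.TimeoutExpired):
            print(f"{node}: unreachable")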


A home build vs. retail units

While I personally wouldn't go near a home build for a workstation, we're demanding relatively little of our render workhorse. We don't need sound to work, we don't need the latest video drivers, and we don't need an OpenGL-driven user interface. It just has to render, and it can't crash while doing it. A home-built box should be perfectly fine. For the build, there are two main routes you can take: a beefy Xeon with lots of cores, or a cheaper desktop CPU for the budget-minded. A Xeon chip is designed to give you a number of things you won't see in a desktop processor, and it has certain advantages:

  • You don't need to buy a video card for server motherboards because they have built-in VGA output (yes, VGA)
  • Lower power consumption for more rendering power
  • Most Xeon boards come with dual NICs for Ethernet link aggregation
  • ECC RAM for dependable simulations, if you’re a scientific user who needs added data insurance
  • Potentially a lot more cores and more cache
  • More PCIe lanes for stuff like link-aggregated Ethernet, USB3, PCI devices, and fast RAID arrays. If you’re doing GPU rendering with more than one PCIe x16 card, you will need a Xeon to accommodate all that bandwidth

While a Xeon chip can’t be overclocked, it will give you more render power per Watt. You pay a premium for that energy efficiency, but, if you are planning on getting more of these systems, it makes a difference. Overclocking a desktop chip usually comes at the expense of energy efficiency, and you still won’t get close to the rendering power of something like a 10- or 12-core Xeon E5 v2. (That is, unless you are one of those guys with a closet that doubles as a dry ice fortress of solitude.)

Advantages of a desktop processor

If you have a tighter budget and just want a bit more help for an upcoming gig, a desktop chip like an i7 can be fine for getting some added oomph on the cheap. These have certain perks:

  • Cheaper than Xeon and you can squeeze out more performance with overclocking
  • Can double as a gaming rig if you add a decent GPU
  • Cheaper non-ECC RAM
  • Faster for things that don't scale well across many CPU threads. For a render machine, this is less advantageous since rendering scales well across more CPUs and cores
  • More likely to get support for suspend to RAM with a gaming board than a server-oriented Xeon motherboard that might only support hibernate

The speed advantage of the desktop parts is compounded by the fact that, by the time Intel releases a Xeon variant of a chip like the Ivy Bridge i7, the next, faster iteration of the desktop part has already come out. This means it's smartest to consider the Xeon only if you know you want more than six CPU cores, ECC memory, a ton of fast PCI Express lanes, or fewer machines to manage. You can typically build two faster i7 machines for the price of a single 10- or 12-core Xeon E5 v2, but that's more RAM that can fail, more CMOSes to reset, and more potential headaches that will bite you in the ass during a deadline. I prefer to spend a bit more to get the added insurance of a Xeon, since server parts are built to run maxed out around the clock. I am a firm believer in the saying “the miser pays twice,” and I've unfortunately been proven right about this on a number of occasions. Also bear in mind that many 3D renderers charge per render node, so more machines also means more licensing cost if you're using a renderer like Arnold or V-Ray 3.


Potential builds

Ars already has very good system-building guides, and we have a lot of ground to cover, so I'll only briefly cover the build itself; there really isn't one specific way to go about this. There are just a few main things to keep in mind when building some dumb render muscle:

CPU

If you're going to get a desktop CPU, opt for at least the i7-4930K. With six cores at 3.4GHz and a $610 price tag, it offers the best balance of speed and cost for a desktop part. The i7-4960X will get you slightly more speed (3.6GHz), but it costs about $450 more. Remember that turbo clock speeds aren't of much concern to us, because we're loading all available threads, which keeps the chip near its base clock rate.

Disk

Since the bottleneck in our setup is the network and not the disk, we don't need to put a super fast PCIe SSD in the muscle machines. Since my render-only builds tend to get used only for CPU-heavy ops, I usually throw my older SSDs in them and that’s perfectly OK. If you’re a video editor, this would be different for you, and you’ll likely want to look into larger, faster disks and quicker networking hardware. You’ll be pushing more data, where a 3D renderer is just crunching numbers most of the time (and, if it’s an efficient renderer, it's better about caching things like textures).

Networking

For my network, I'm not using link aggregation, since that requires special routers and switches. However, I can give one simple tip for speeding up your network: put your machines on a dedicated Gigabit Ethernet switch. This separates your Internet traffic from your local traffic and guarantees better bandwidth between the render machines and the host. Even a cheap switch will do this; my old, cheap D-Link Green Ethernet switch maxes out gigabit's roughly 100MB/s transfer rate just fine. Don't use wireless unless every machine is on something really fast, like the newer 802.11ac spec.
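If you want to sanity-check what your switch actually delivers between two boxes, a tool like iperf will measure it, or you can improvise with a short script. Below is a rough sketch (the port number and payload size are arbitrary choices) that pushes 500MB over a raw TCP socket and reports MB/s; on a healthy dedicated gigabit switch you should see numbers in the ballpark of the 100MB/s mentioned above.

    # net_speed.py -- crude point-to-point throughput test over TCP.
    # Run 'python3 net_speed.py server' on one machine, then
    # 'python3 net_speed.py client <server-ip>' on another.
    import socket, sys, time

    PORT = 5201          # arbitrary port
    TOTAL = 500 * 2**20  # 500 MB payload
    CHUNK = 2**20        # send/receive in 1 MB chunks

    def server() -> None:
        with socket.create_server(("", PORT)) as srv:
            conn, addr = srv.accept()
            received = 0
            start = time.time()
            with conn:
                while True:
                    data = conn.recv(CHUNK)
                    if not data:
                        break
                    received += len(data)
            secs = time.time() - start
            print(f"received {received / 2**20:.0f} MB from {addr[0]} "
                  f"at {received / 2**20 / secs:.1f} MB/s")

    def client(host: str) -> None:
        payload = b"\0" * CHUNK
        start = time.time()
        with socket.create_connection((host, PORT)) as conn:
            sent = 0
            while sent < TOTAL:
                conn.sendall(payload)
                sent += len(payload)
        secs = time.time() - start
        print(f"sent {sent / 2**20:.0f} MB at {sent / 2**20 / secs:.1f} MB/s")

    if __name__ == "__main__":
        server() if sys.argv[1] == "server" else client(sys.argv[2])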

Case and cooling

For my 10-core Xeon E5 v2 case, I went with a dirt cheap and smallish Antec VSK-4000, because all it does is sit in my painting studio in the back of the house. But I didn't cut corners on the fan. I don't want it to overheat or be too loud if I need to paint in that room while my renders run. I ultimately picked up a Noctua NH-L12, which made for a pretty quiet and powerful machine. By contrast, I bought a Corsair 550D for a gaming rig a while ago and hate it because it's so stupidly large.

This is a notice about changes to GPU and DV/HDV support on macOS. These changes apply to the 14.0 releases of Premiere Pro and Adobe Media Encoder.

CUDA support is no longer available on macOS, and Adobe is deprecating support for OpenCL. We recommend transitioning to Apple Metal, including on systems running NVIDIA graphics.

Starting with the 14.0 release, Premiere Pro and Adobe Media Encoder default to Apple Metal graphics rendering on macOS. This applies to new and existing projects. Apple Metal provides a modern and unified render pipeline for all users on that platform and will be the focus of our development on macOS going forward.

Note:

  • If you still need to use CUDA, for example on older hardware, use versions 12.x or 13.x of Premiere Pro on macOS 10.13.6 (High Sierra).
  • On Windows, Premiere Pro and Adobe Media Encoder continue to use CUDA rendering with NVIDIA graphics cards and OpenCL rendering with AMD and Intel graphics.
  • Software Only rendering is available as an option on both macOS and Windows.

Starting with macOS 10.15 Catalina, Premiere Pro no longer supports DV and HDV capture over FireWire.

  • This change does not impact other forms of tape capture.
  • You can still edit DV/HDV files that have previously been captured.
  • DV/HDV capture is still available with Premiere Pro on Windows.

Users who still need access to DV/HDV ingest have the following options:

  • On macOS, you can use Premiere Pro 12.x and 13.x on macOS 10.13.6 (High Sierra) or 10.14 (Mojave).
  • On Windows you can continue to use the current versions of Premiere Pro.

As with all major updates to operating systems and production software, we recommend that you test before transitioning. Don't update in the middle of a project, and always make backups before updating.

For more information and help, visit our Adobe Community Support.




