February 2014

Overseen by Framestore VFX Supervisor Rob Duncan, our London and Montreal offices combined to deliver a full range of photo-real visual effects for José Padilha’s RoboCop reboot, encompassing more than 600 robot and bullet-filled shots.

Our major work included CG augmentation of both RoboCop’s traditional silver suit and its new black incarnation, creating negative space to give the feeling that there couldn’t possibly be a man inside, while full CG suits were created for stunt work. We also gave a more formidable spin to the original’s ED-209s and created the all-new humanoid EM-208s, as well as crafting huge battle sequences, vastly extending sets, blowing up the car that injures good-cop Alex Murphy and building OmniCorp’s headquarters – a colossal collection of work completed in just over a year.

“Having a reliable and creative VFX vendor is key for movies like RoboCop. I was very lucky this way, as MGM introduced me to Framestore. One of the best things that happened to me on RoboCop.” José Padilha – director.

Warning: spoilers ahead

The film opens with a live broadcast from Tehran, showcasing the pacifying police role that OmniCorp’s range of robots carry out in the field. Not quite everything goes to plan, however. On set in Toronto there was a 300ft section of street, built two storeys high. It was our job to top up the buildings, extend the set out into the distance, add smoke, explosions and general destruction, and populate the sequence with fully CG ED-209s – the towering robots familiar from the original film – and the humanoid EM-208s. All of that meant we touched pretty much every shot in what is a very long sequence.

The original idea for the EM-208s was to have stunt actors on set to aid with choreography and composition. They were placed in motion capture suits in case we could use their movement to inform the animation, but that idea was abandoned because each EM-208 needed to move in the same robotic way. “If you had a group of robots all off the same assembly line with the same software in them, they would do more or less the same thing, rather than using motion capture of six different people,” says Rob Duncan. “Even with heavy choreographing there would have been too many differences.”

To give them that robotic feel we developed our own generic walk cycle, walk-to-stop cycle, scan cycle and so on. In comp we gave a clue to the EM-208s’ different states of action through their visors – a red LED in the centre during stand-by mode, a barcode-like set of flowing blips when they are scanning and the visor fully lit up when in combat mode.

In many of the shots, especially the wide ones, there was a lot of top-up and extension required to make it look believably like Tehran. “It was just about getting the right detail built into either the fully-CG build, or the 2.5D projection, or the matte painting, they all had to have enough detail in them to sell the complexity of the outskirts of a big city” adds Rob.

Murphy Injured

In its search for a human face to make its robots acceptable to the American public, OmniCorp comes across Alex Murphy (Joel Kinnaman), a police officer mortally wounded by a car bomb. The car is fully CG throughout the shot, with Murphy taken over by a digi-double as the explosion happens. A lot of work was done to make sure he could be seen in the resulting fireball.

“It’s a very long shot, we were back into Gravity land!” says FX Supervisor Johannes. “It was complicated because there are flames coming out of the car, a lot of debris and interaction with him, in the end it required several simulations (done with Flush, Framestore’s in-house version of Naiad) in different directions because fire comes out of the front, back, through the doors and from underneath the car. There was no way we could just hit a button and watch it work – it was quite an assembly of different simulations that in the end worked really nicely.”

In the ensuing surgery shots we see Murphy’s exposed brain. “It was a really fun shot” says Compositing Supervisor Adrian. “There was a certain playful element of adding bits of goo that came off the tweezers, it’s all CG and it just looks gross.”

Murphy then learns the true extent of what has been done to him when, strapped to a docking station, he watches himself being disassembled in a mirror as each mechanical part is removed one by one until all that’s left are his head, lungs and throbbing trachea. Obviously only a full CG takeover of the suit and dock would suffice!

“We built it to be as realistic as possible, even to the extent of building a layer of goo above the brain so the light would refract correctly,” adds Adrian. “There was quite a lot of detailing in how the 3D was put together to make it look as realistic as possible. The first shot in that sequence is particularly hard because of the sheer length of it, the amount of motion and the camera move – it needed all aspects of compositing.”

Suited and rebooted

After the big reveal we see RoboCop in action, wandering disorientated through a lab. A big part of our work was in augmenting the brilliant physical suit made by Legacy Effects to make it seem more robotic in ways that wouldn’t be possible without VFX. A section between the end of the shoulder and the beginning of the upper arm was gouged out and the abdomen was slimmed down to create negative space in the hips and the pelvis. We also took out areas around the elbows and his neck. “It was all to confuse the audience and make them wonder how they could have possibly got a man in that suit” says Rob. Parts of the physical suit were made out of rubber to allow for movement so certain elements had to be replaced to make them more metallic.

The black RoboCop suit was done slightly differently. The shoulders were modified as before, but with the black suit’s slimmer look the abdomen only needed to be warped, rather than replaced. The stomach would be tracked to work out where it would be in the shot, then the compositors would take that geometry and use that as a basis to warp the plate to produce the desired silhouette.

For some sequences and particularly stunt-work, a full-CG RoboCop was needed, which meant the real thing had to be matched completely photo-realistically. In other scenes a partial suit was used to allow greater movement for the stunt actors and then completed in CG. “We’re really proud of how good we made the suit look in light and comp” says Montreal CG Supervisor JP.

To make sure the full CG RoboCop sat believably in his environment we took the geometry of the scans taken on set and used them to recreate the environments in CG, then re-rendered RoboCop and the environments together to make sure the lighting interaction between the two was perfect.

Making a run for it

Murphy’s disorientated flight through the labs leads him through several rooms, many of them massively extended beyond what was shot. There is a moment when he sprints through a massive warehouse full of endless lines of pink lab-coated workers – taken from four rows on set to countless in the final shot. From there RoboCop bursts through a door, scales a wall, and sees paddy fields sprawling out before him. On set there was a short strip of field with a few real rice plants for him to run through, but in post we extended this to include thousands of plants and rolling hills in the distance. RoboCop, filmed with a partial suit to allow him to run, needed to be heavily augmented and sometimes replaced throughout these sequences.

“Off in the very distance they are digital matte paintings, but the rice fields are a simulation” says Rob. “We had to put in hundreds of plants, sometimes closer to the camera than the real ones. From shot to shot on the day the lighting and the wind would change so you couldn’t just match one shot and clone it to the others because the real plants might be moving differently or leaning over more. You start off with a generic ‘does this plant look right? Does it behave correctly? Does it look like it’s blowing in a breeze?’ Then once that’s right you tailor it to each shot. Then of course you have to build in randomness. It was tricky but I think it’s very successful.”

“It worked out nicely that we could use all the plants he interacted with directly, but then having to match those and create something with the same dynamics as a plant blowing in the wind, and put it right next to the real one without anyone being able to tell the difference, was quite a task,” adds FX Supervisor Johannes. “There are also hundreds of different strands of grass that we simulated, and then I think we made about 20 different plants that are instanced all over the place.”
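The idea of building a whole field from a small library of simulated plants can be sketched in a few lines. This is purely illustrative – the function names, counts and jitter ranges are assumptions, not Framestore’s actual scattering tool – but it shows how roughly 20 variants, each given a random rotation, scale and wind-phase offset, stop neighbouring plants from ever moving in lockstep:

```python
import random

NUM_VARIANTS = 20  # the article mentions roughly 20 simulated plant builds

def scatter_plants(rows, cols, spacing=0.5, seed=42):
    """Return instance records: which plant variant goes where, and how
    its wind animation is offset. All parameter values are illustrative."""
    rng = random.Random(seed)  # seeded, so a shot re-renders identically
    instances = []
    for r in range(rows):
        for c in range(cols):
            instances.append({
                "variant": rng.randrange(NUM_VARIANTS),
                # jitter positions so the rows don't read as a perfect grid
                "pos": (c * spacing + rng.uniform(-0.1, 0.1),
                        r * spacing + rng.uniform(-0.1, 0.1)),
                "rotation": rng.uniform(0.0, 360.0),
                "scale": rng.uniform(0.85, 1.15),
                # offset each plant's wind cycle so they never sway in sync
                "wind_phase": rng.uniform(0.0, 1.0),
            })
    return instances

# 50 x 200 grid: 10,000 plant instances built from only 20 source assets
field = scatter_plants(rows=50, cols=200)
```

The seeded generator is the important design choice: the randomness looks organic, but every re-render of the shot produces exactly the same field.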

RoboCop’s training pits him against an EM-208, with the sequence beginning in a sterile, colourless environment, obviously a virtual reality world, which then morphs into a realistic but still CG environment. “RoboCop had to stay realistic, as the background changes from the white and blends into a photo-realistic set” says JP. “That was quite a big job, especially for the Environment TD. In the end we had a really high density mesh and we could control all of the different points that would blend and time them accordingly.”

The camera tracks out to reveal the action happening on giant screens behind a static RoboCop and an EM-208 – both of which needed to reflect the light of the screens, which were modelled in Nuke. “Rendering was a big thing for that sequence, because you’ve got a metal environment reflecting a metal robot and vice versa” says JP. “To get the lighting interaction between RoboCop and the EM-208 and the screens behind them we made a pre-comp of what the insert would be like, then passed that to lighting and they would use it to generate several passes so we would get a feel of the robots on set interacting with what is on screen behind them at the same time.”

Our new Arnold-based rendering pipeline was a big part of getting everything to look so realistic. “I was really impressed with the results we would get early on in the process when you’re seeing the first pass of lighting” says Rob. “It looked pretty much there the first time you saw it. We were always striving for photorealism because we know we would have to mix and match practical suits with CG pieces of suits or render CG full suits for stunt work, so it was very important that the lighting was good and we were able to do that very successfully. I think that was the biggest innovation on the show really.”

Bringing out the big guns

Utter mayhem. That’s what happens in the film’s destructive final battle. Taking place between RoboCop and multiple full CG ED-209s in an environment that was unsuitable for large scale practical effects (actually the Vancouver Conference Centre, in which events were taking place around the shoot), the scene is a medley of thousands of bullets, millions of shards of glass and plenty of explosions. “It’s a bit like the Tehran sequence – we pretty much touched every shot” says Rob. “In one way it’s great because you’ve got control over everything, but in another way it’s really tough to keep the continuity going. If there’s an explosion you have to keep the residue rolling over into the next few shots.”

The sequence begins with RoboCop approaching on his bike, shot variously with Kinnaman on a low loader and with a stunt rider on the road, who needed to wear a proper motorbike helmet that we then replaced with RoboCop’s slim-line one. He pulls up to the OmniCorp building, simply jumps over the awaiting squad of soldiers and through a window to fight the ED-209s inside, and a firestorm of bullets and breaking glass ensues.

“On set they weren’t allowed to set off many practical effects, although a fake column was brought in and blown up, so we basically had to destroy the whole of the Vancouver Conference Centre digitally. We needed a lot of imagination to reach what you see in the final sequences” adds Rob.

The breaking glass, as RoboCop crashes into the building and as an ED-209 stumbles through a panel to the level below, was done in FX. New software was developed to make sure it broke in an impressive way – exploding into thousands of pieces on impact that break again when they hit the ground.
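That two-stage breakup – a pane exploding into thousands of pieces on impact, each shard fracturing again when it lands – can be sketched as a toy model. Everything here is an assumption for illustration (the real tool works on 3D geometry, not scalar areas), but the structure of the cascade is the same:

```python
import random

def fracture_once(area, rng, min_pieces=2, max_pieces=6):
    """Split one piece of glass of the given area into smaller shards.
    The shard areas always sum to the original, so no glass is lost."""
    n = rng.randint(min_pieces, max_pieces)
    cuts = sorted(rng.random() for _ in range(n - 1))
    bounds = [0.0] + cuts + [1.0]
    return [area * (b - a) for a, b in zip(bounds, bounds[1:])]

def shatter(pane_area, seed=1):
    """Two-stage breakup: the pane explodes on impact, then every
    airborne shard fractures again when it hits the ground."""
    rng = random.Random(seed)
    airborne = fracture_once(pane_area, rng, 50, 200)   # stage 1: impact
    settled = [piece for shard in airborne
               for piece in fracture_once(shard, rng)]  # stage 2: landing
    return settled

pieces = shatter(10.0)
```

Conserving total area (or volume, in 3D) across each fracture stage is what keeps the debris looking physically plausible from shot to shot.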

The destruction is so intense that it required the development of a completely new in-house bullet system to handle it. “We needed it because there was going to be so much gunfire and you needed to work out the effects of each bullet. I suppose you could animate each one and work out where it hit then apply a reaction to that, but that would have been so labour intensive that we would still be doing it now!”

The system saw the animation, FX and compositing teams all working together on interacting elements – the muzzle flashes, the tracer fire and the effect of each bullet. “The problem was that we needed to decide who was firing, when, and at what in each frame, and we needed that information in comp, lighting and FX as everyone needed to do a bit of it. Lighting needed it to do the interactive light for the gunfire, comp needed it for the muzzle flashes on the guns and to place textures where damage had occurred, and FX needed it too, so none of us could actually drive it,” says Johannes.

As a result it was driven by animation and picked up by FX first of all, so one frame after a bullet was fired the system would draw a straight line through the set and leave a marker where it first hit, which would in turn drive the effects for that particular bullet type. It was then expanded to include separate systems for bullets, lasers and muzzle flashes.
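The core of that idea – cast a ray from the muzzle the frame after firing, and drop a marker at the first surface hit for every department to pick up – is straightforward to sketch. This is a minimal illustration, with set geometry simplified to axis-aligned boxes and names that are assumptions rather than the in-house tool’s API:

```python
def ray_vs_box(origin, direction, box):
    """Slab test: distance along the ray to an axis-aligned box
    ((min_xyz), (max_xyz)), or None if the ray misses it."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        o, d = origin[axis], direction[axis]
        lo, hi = box[0][axis], box[1][axis]
        if abs(d) < 1e-9:
            if not (lo <= o <= hi):
                return None      # ray parallel to slab and outside it
        else:
            t0, t1 = (lo - o) / d, (hi - o) / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_near, t_far = max(t_near, t0), min(t_far, t1)
            if t_near > t_far:
                return None
    return t_near

def place_marker(muzzle, direction, set_geo, bullet_type):
    """Draw a straight line through the set and leave a marker at the
    first hit – the record that FX, comp and lighting all read from."""
    hits = [t for box in set_geo
            if (t := ray_vs_box(muzzle, direction, box)) is not None]
    if not hits:
        return None
    t = min(hits)  # nearest surface wins
    point = tuple(m + t * d for m, d in zip(muzzle, direction))
    return {"position": point, "type": bullet_type}

# a single wall five units down the +x axis from the shooter
wall = ((5.0, -10.0, 0.0), (5.5, 10.0, 6.0))
marker = place_marker((0.0, 0.0, 1.5), (1.0, 0.0, 0.0), [wall], "tracer")
```

Because the marker carries the bullet type as well as the hit position, each downstream department can react appropriately without re-deriving the trajectory.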

The muzzle flashes were added in comp. “It was complicated, because muzzle flashes from different angles look very different,” says Compositing Supervisor Kate. “If you look directly at them they almost look like a flower, but from the side they look longer. We had to work out a way to tell Nuke what angle the camera was at from the gun. We had a range of muzzle flash elements, enough to complete a 360, and we wanted a way to place that on the gun to make it fire and to make it look right. We wrote a lot of stuff, working very closely with FX, animation and rigging, and it ended up becoming such a powerful tool that we could have complete control over what kind of muzzle flash was in each shot, and there were different looks to each flash for each gun.”
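The angle-selection step Kate describes reduces to a small calculation: find the angle between the barrel direction and the direction from the gun to the camera, then pick the filmed element shot closest to that angle. The sketch below is a hypothetical stand-in for the Nuke tool, and the eight-element library is an assumption:

```python
import math

def viewing_angle(camera_pos, gun_pos, barrel_dir):
    """Angle in degrees between the barrel direction and the gun-to-camera
    direction: 0 = looking straight down the barrel (flower-like flash),
    90 = side-on (long flash)."""
    to_cam = [c - g for c, g in zip(camera_pos, gun_pos)]
    dot = sum(a * b for a, b in zip(to_cam, barrel_dir))
    mag = math.dist(camera_pos, gun_pos) * math.hypot(*barrel_dir)
    # clamp before acos to guard against floating-point drift
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def pick_flash_element(angle_deg, elements):
    """Choose the photographed element shot nearest to this angle."""
    return min(elements, key=lambda e: abs(e[0] - angle_deg))

# hypothetical library of flash elements filmed every 45 degrees
library = [(a, f"flash_{a:03d}.exr") for a in range(0, 181, 45)]

# camera looking straight down the barrel -> the head-on "flower" element
angle = viewing_angle((0, 0, 10), (0, 0, 0), (0, 0, 1))
element = pick_flash_element(angle, library)
```

In production the same lookup would run per frame, so the flash element follows the gun even as the camera or the weapon moves through the shot.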

Even with the bullets placed the job wasn’t done. “It still needed manipulating – even though the bullet system was designed to automate the destruction we still wanted the ability to art direct it,” says Rob, “so we would sometimes have to ignore some of the bullets that were fired (at the animation stage) in terms of where they hit and what their reactions were and it was much more time efficient if we didn’t have to go all the way back to animation for that. You couldn’t just have a constant fire though, sometimes you wanted to fire a burst and then re-aim and fire again, sometimes you would want a strafing pattern.”

The Richard Hammond Builds a Planet Effect

As the battle goes on the ED-209s literally bring out the big guns. We wanted to give the Gatling gun a signature look to make sure the audience would appreciate the increase in firepower. The inspiration for it came from a strange source, however: “I happened to be watching Richard Hammond Builds a Planet on BBC2,” says Rob, “and they showed what happens when you fire a high calibre machine gun at a steel plate – it super-heats, and in slow motion you see these little licks of flame which you wouldn’t really notice when you’re viewing in real time. So that became the reference point to distinguish it from normal impacts, and the Richard Hammond effect became the shorthand for describing when we should see that type of impact – everyone found it hilarious.”

RoboCop fights his way up to the top of the OmniCorp headquarters, which was shot on green screen and required a great deal of extension, as well as adding the Detroit cityscape in the background.

It’s a fittingly frenetic end to an action-packed remake and we’re glad to have been on board to bring the year 2028 to life with an incredibly varied body of work – completed for the first time across Framestore’s Montreal and London studios. “Overall it was a very smooth show,” says Rob Duncan. “The Arnold pipeline was a big part of that, meaning the lighting wasn’t an issue at all, and our adoption of Shot Review Plus meant working across two sites, and with a new team in Montreal, went really well. It didn’t matter where the work was done – the quality was superb.”
