MachinePix Weekly #24: Jeff Linnell, founder of Bot & Dolly
Jeff Linnell talks about robots for cinematography, synesthetic experiments, and almost vibrating a building apart. This week's top post was a three-language typewriter.
Note: this is issue #24, the URL says #23 because I don't know how to count.
This week's interview features one of the coolest people I know, Jeff Linnell. He is currently the CEO of Formant, and previously made a big splash in the robotics industry as the creator of Bot & Dolly, which pioneered robotic camera control for a series of blockbuster films. Jeff went on to be Director of Robotics for Google X.
The most popular post this week was a mechanical typewriter with Japanese, Chinese, and English characters. As always, the entire week's breakdown is below the interview.
I'm always looking for interesting people to interview, have anyone in mind?
—Kane
Interview with Jeff Linnell
You started in video production, how did you end up in robotics?
Oh Jeez, the origin story. So it's funny, before that I was a Dot Com 1.0 person. I had this company that was a little bit bifurcated—which I find is often the formula for me. A company where you can't quite pin it down. Bot & Dolly was like that. Was it a creative agency? A robotics company?
At the beginning of my career, my first company was half computer animation and half building websites. Thatās the start, the New York years. I was working a lot with Silicon Graphics Computers and doing post production.
The real thing that got my brain cooking was working on programs like Softimage Eddie, which was compositing before Adobe After Effects. I was fascinated by combining different imagery to pull off what people hadn't seen yet. When I think about that, what kind of sparked me then is these powerful tools, and arguably Bot & Dolly was a tool company. Interfaces for people to do what they couldn't do otherwise. I was exposed to editing programs on the post production side: how do I take a bunch of b-roll and turn it into a video with editing tools, which was like a sledgehammer. You can use it to bash a story into shape. Then you have the scalpels like After Effects. You can get magic from a mere 30 frames, build one-second gems.
I saw all these insane tools in my early twenties and it was really enabling. As I advanced in my career I got out of New York, sold my graphics agency. I took a couple years off and got really into photography and portraiture. I had a bunch of time on my hands, and a new tool. I went deep for a year. I was also flying model airplanes; I got bored with that so I switched to helicopters. Way harder, more adrenaline, easier to crash. As I was staring at one of these model helicopters, I started thinking about how all these parts got made. So I bought my first CNC mill.
I took my still camera and bolted it to my CNC mill. I wrote just enough code to manage this as a stop motion rig with a Canon SLR and this tiny mill. It was also really useful for doing high precision photography of small things. I could program camera moves in Maya. Honestly it came out of all these interests I had, and I mashed them together in a venture.
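A stop-motion rig like the one described boils down to a simple loop: move the axis, let vibration settle, fire the shutter, repeat. Here is a minimal Python sketch of that loop; `move_axis` and `trigger_shutter` are hypothetical stand-ins for whatever actually drives the mill and the SLR (this is not Bot & Dolly's actual code):

```python
import time

def stop_motion_sequence(waypoints, move_axis, trigger_shutter, settle_s=0.5):
    """Step an axis through waypoints, firing the camera at each stop.

    move_axis and trigger_shutter are placeholders for the real hardware
    hooks (e.g. G-code over serial, a remote-shutter cable).
    """
    frames = []
    for pos in waypoints:
        move_axis(pos)                    # move to the next programmed position
        time.sleep(settle_s)              # let vibration settle before exposing
        frames.append(trigger_shutter())  # capture one stop-motion frame
    return frames

# Dry run with logging stand-ins instead of real hardware:
log = []
frames = stop_motion_sequence(
    [0.0, 1.5, 3.0],
    move_axis=lambda p: log.append(("move", p)),
    trigger_shutter=lambda: len(log),
    settle_s=0.0,
)
```

The settle delay is the important detail: exposing while the rig is still ringing blurs the frame, so every move is followed by a pause before the shutter fires.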
I built a little test rig, and I became the person in San Francisco to shoot tabletop cinematography and food. All these glamour shots of food, like grapes falling on a table, I got really good at that.
I remember discovering Bot & Dolly when the Box video went viral. How did that video come about? What inspired it?
We were playing with projection mapping, and I don't think there's any secret origin story, at least not one I recall. Projection mapping was just starting to happen and we were just poking around. Similarly, I look around my studio, and I have a bunch of tools, some of which are robots. I was wondering what I could do with this, and it dawned on me that we could do moving projection mapping in a way that hadn't been done before. There's a reason you don't usually move the projector in projection mapping—but I had two robots where I could move both the projector and the camera on a predefined path, and I was thinking "how can I best demonstrate this new technique?"
We would always try at our studio to mix our contractual with our creative work, and this was the biggest leap we'd made to date. We were so slammed with advertising and effects work that we didn't really have time to do anything epic. This was a chance for us to commit to something big. It was a big financial and time investment which was scary to us as a small business, but we pulled it off and it paid off in spades.
It also was a way to wrap up so many different technologies we were dabbling with into one project: there were projection screens, motion-controlled treadmills, camera-synchronizing software, robotics, and DMX lighting control—all the effects there are practical. When the room lights up, that's all driven by the animation software and synced to the millisecond. There were new realtime 3D technologies emerging in the DJ and electronic music scene that we employed. Everything we had, software and physical, got used on that project. We never got to do that for clients. This was an excuse to prove it was possible. I'm flattered that this is still talked about in design schools; in many ways it's dear to my heart.
It's actually really important to talk about the restrictions that were there. As successful as it was, it was very intentionally constrained. We picked a theme: Magic. So we had something fundamental to build off of, it gave us some fodder. There was no color. We didn't project onto 3D surfaces; we had done some 3D projection mapping for Warner Brothers, but I think we made a bigger point by keeping it black and white and extremely graphic. We only projected onto 4x8 white panels to show how far we could push the technique. We showed that we could provide all this dimensionality through only light and shadow. We had a lot of technology at our disposal, but a lot of self-imposed constraints on how we wielded it. By taking a two-dimensional thing and making it seem three-dimensional and opening up the fourth, fifth wall in this case, where the floor opens up, we really pushed the frontier.
It became a bit contentious on the team actually. We had it in the can, and it was starting to get super polished, and it started to feel really Hollywood, too polished. Like a commercial. We were really good at commercials, and that's not what we wanted. So we dialed it back a little. We showed the studio and setting more. You can hear the robots moving. We spent a few weeks making it grittier, dumbing down the polish, and it was the right call. I still fight my perfectionism though.
Who named the robots Iris and Scout?
You're bringing me back, I hadn't thought about these guys in a while. It's funny, Iris and Scout. Scout is from To Kill a Mockingbird, which fits because Scout's a nimble little robot and it's quick and stealthy. Scout we used when we had something super fast, like tracking the shoes of an athlete running by. When we filmed an explosion we used Scout. Names are important, and To Kill a Mockingbird was always close to me. Iris I like because it's literal, technical, film-related, but also feminine and humanizing. I don't think it's very interesting to have masculine names for these giant hulking machines. Iris is our main production camera. The majority of Gravity was essentially filmed on three Irises.
The original robots were named after Shakespeare characters, Gilda and Puck. We had an editor who was a big Shakespeare fan and named those robots—a reference to Rosencrantz and Puck Fair.
I know Gravity used the Bot & Dolly system in filming; how did they find you?
There was an article in Wired on Bot & Dolly, and someone from Warner Brothers was reading Wired on an airplane and saw it. They wanted a zero-G effect. The easy way would have been to do it all CGI, but there was no way they would accept that for the level of cinematography they wanted. The cinematographer, Emmanuel Lubezki, demanded it be done optically. They experimented with Steadicams and swivel chairs, and there was far too much complexity to the shots to do with a classic motion control rig. We showed them a demo in SF, and it was clear this was the only way to do it. The rest is history. About half that movie is filmed with our robots. Most of the exterior and a fair amount of the interior shots.
There weren't traditional motion control systems that could get the shot count they had. You used to get one or two shots every four days on a standard motion control rig. By the time we hit pace, we were getting six, seven shots a day. The previsualization was done in Maya and we wrote software to translate that directly into motion control.
The camera moves in Gravity are incredibly complicated. The opening shot, one continuous 13-minute shot, had insane camera moves. It was impossible to do with a crane. We had a 6-axis robot with a 3-axis mount for the camera, plus the zoom and focus. And the whole thing was on a track. That's 11 axes, with the actors in a 3-axis pan-tilt-roll rig. The scene had 14 axes synchronized to the millisecond and millimeter.
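The previs-to-motion-control translation described above amounts to resampling each axis's animation curve onto one shared clock, so that every axis hits its target on the same tick. A rough Python illustration of the idea, where the axis names, keyframes, and sample rate are all invented for the example and the real pipeline was far more involved:

```python
import numpy as np

def sample_axes(keyframes, times):
    """Resample per-axis keyframes onto one shared master clock.

    keyframes: dict mapping axis name -> (key_times, key_values)
    times: the master timeline (seconds) shared by every axis, so all
           axes are commanded in lockstep on the same ticks.
    """
    return {
        axis: np.interp(times, key_times, key_values)
        for axis, (key_times, key_values) in keyframes.items()
    }

# Two of the many axes, keyed independently but sampled together:
clock = np.arange(0.0, 1.0, 0.01)  # hypothetical 100 Hz master clock
targets = sample_axes(
    {
        "track": ([0.0, 1.0], [0.0, 2.5]),        # meters along the rail
        "pan":   ([0.0, 0.5, 1.0], [0, 45, 30]),  # degrees
    },
    clock,
)
```

Because every axis is interpolated against the same `clock`, a controller that steps through `targets` index by index keeps all axes synchronized by construction, which is the essence of driving many axes from one previsualized animation.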
What other projects did B&D work on?
We worked with all sorts of clients. Facebook, Chevrolet, Sony, BMW, Google, Sonos, I mean, you name it. We had robots performing with the Blue Man Group and Beck. Shooting Annie Leibovitz, Mikhail Baryshnikov, Buzz Aldrin… All sorts of entertainment or visual projects.
The more interesting answer to that question, the thing that people don't know, is that we did a lot with digital parametric fabrication. We built giant 3D printers with the robots. We also did subtractive work with robots with milling heads doing giant sculptural work. Probably 40% of what we did was R&D fabrication work. That work was not very public, but we really pushed the edge of scale and precision. There was a lot of cross-pollination between the fabrication side of the shop and the film production efforts. Each pushed the tools in new ways, but invariably informed the other.
What's the wildest story from your time at B&D?
I gave a talk on my view of what makes a company successful, also thinking about where in my career I've been most successful—and generally it's been in finding edges. I'm pretty good at getting to the point of things not working. We were doing a project for Google's I/O conference, and we got to the razor's edge and nearly went over. They were launching the Nexus Q (a spherical music player) at the time. We wanted to do something sensational to show the device. We thought "why don't we put it on a giant robot, with this huge sphere, like 20 feet in diameter, and allow people to control it in real time." The sphere would mix music depending on its three-dimensional position in the room. We hooked it up to Ableton Live, and set everything up in Moscone Center for Google I/O.
We thought the challenge would be in creating this giant fiberglass sphere. What we found out, a day before we were supposed to bolt this thing in, was that when you moved it around, the entire building would resonate in this really scary way. You have this two-ton robot with a giant sphere on it vibrating the entire building. Taking down the Moscone Center would not be good. It was the very end of the project, and we thought we might have to call Google and take away this huge presentation that was part of their premiere show.
We got on the phone with a bunch of people and realized it was a resonance problem. It happened to be the year the America's Cup was in SF, so there were a lot of boat builders in town. We called a bunch of people who were here working on the Oracle boat, and they helped us out with the fiberglass sphere and how to reinforce it. We learned a lot about boat building, but it didn't solve the problem. At the last minute, we had a notion that just might work: if we introduced Brownian motion to the path of the robot, we might just break up the resonance. We wrote an algorithm that introduced noise to the path of the robot, and it worked! We intentionally messed up this otherwise perfect robot. You couldn't even tell by looking at it, but it worked beautifully. This was the only time I thought things might not work out.
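The fix described above can be sketched in a few lines: perturb the planned waypoints with small random noise so the motion is no longer perfectly periodic, while pinning the endpoints so the shot still starts and ends exactly where intended. This is a hypothetical illustration of the general idea, not Bot & Dolly's actual algorithm; the amplitude and path are made up:

```python
import numpy as np

def add_path_noise(path, amplitude=0.002, seed=0):
    """Add small random perturbations to a planned path.

    Deliberately imperfect waypoints break up the periodic excitation
    that drives structural resonance, at the cost of a millimeter-scale
    deviation the eye can't detect. Endpoints are kept exact.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, amplitude, size=path.shape)
    noise[0] = 0.0   # keep the start point exact
    noise[-1] = 0.0  # keep the end point exact
    return path + noise

# A perfectly periodic sweep, the kind that could excite a building mode:
t = np.linspace(0.0, 2.0 * np.pi, 500)
path = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])  # x, y, z waypoints
jittered = add_path_noise(path)
```

A repetitive move pumps energy into the structure at one frequency; smearing the trajectory with noise spreads that energy across many frequencies, so no single structural mode gets driven hard enough to ring.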
Any side projects youāre working on right now?
Yeah, so all sorts of stuff! You should check out novel.af: that's a Bot & Dolly-esque design group that I started up. It shows a couple of my passions. I'm pretty interested in commingling automation and augmented reality. Novel.af is one expression of that.
There's also science.af, a fake science lab working with objects it didn't discover! What happens when I take a non-material object, like a sphere in Unity or Unreal? How well can I synchronize it with a robot to make it convincing in AR? I'm interested in what can be done with a game engine, extending it to the real world in a high-frequency, closed-loop way.
I'm also extremely interested in game engines and audio gear: sequencers, synthesizers, etc. I'm currently rebuilding some of those inspired interfaces inside Unreal. I'm also experimenting with interfaces in an augmented setting that back out to the real world via MIDI. I believe game engines are a tool that will radically transform multiple industries over the next decade. There's something very inspiring about the real-time and generative aspects emerging from the digital technologies I've been playing with my entire career.
Any favorite books or books youāre reading?
Not sure it's a recommendation—actually, I'm quite sure it's not! But I'm currently reading Stranger Than Fiction: The True Time Travel Adventures of Steven L. Gibbs. It's anti-educational. All that said, I was listening to an old Coast to Coast AM radio show; I used to listen to it all the time when driving across the country late at night. There's a great interview with Steven, and you can't tell if he's a con artist or a madman. I'm still trying to formulate an opinion.
What's your favorite simple (or not so simple) tool or hack that you think is under-appreciated?
I don't have an answer for that. All I have is my formula: get interested in something and go really deep. Find something else to apply it to. Allow yourself to dabble and tinker. You've got to have the right amount of reagents.
The Week in Review
Given the size of this blade, itās likely a sawmill bandsaw blade. A downside of plasma cutting is that it creates a heat-affected zone at the cut which will have to be removed or re-treated.
I tried really hard to identify this specific factory but was unable to. Let me know if you have a lead.
Cable robots are very light and can achieve high accelerations over very large spaces. Theyāre often used for cameras in stadiums. They have their own unique dynamics and control challenges because cables canāt exert a pushing force, only a pulling force.
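The pull-only constraint has a concrete consequence: a pose is usable only if static equilibrium can be satisfied with every cable tension non-negative. A toy 2D check in Python, where the anchor layout and payload mass are invented for illustration:

```python
import numpy as np

def cable_tensions(point, anchors, weight):
    """Solve static equilibrium for a point mass hung from cables in 2D.

    Returns the tension in each cable. A negative result would mean a
    cable must push, which is impossible, so the pose would lie outside
    the robot's feasible workspace.
    """
    # Unit direction vectors from the payload toward each anchor,
    # stacked as columns of the equilibrium matrix.
    dirs = np.array(
        [(a - point) / np.linalg.norm(a - point) for a in anchors]
    ).T
    # Cable forces must balance gravity: dirs @ tensions = [0, weight].
    tensions, *_ = np.linalg.lstsq(dirs, np.array([0.0, weight]), rcond=None)
    return tensions

# Hypothetical setup: a 2 kg camera hung midway between two high anchors.
point = np.array([0.0, 0.0])
anchors = [np.array([-1.0, 1.0]), np.array([1.0, 1.0])]
tensions = cable_tensions(point, anchors, weight=9.81 * 2.0)
feasible = bool(np.all(tensions >= 0.0))
```

With the payload centered between symmetric anchors, both tensions come out equal and positive, so the pose is feasible; move the payload above the anchor line and no non-negative solution exists, which is exactly the pulling-only limitation the caption describes.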
Neat. Final print was 2.3 metric tons.
Postscript
When I created @machinepix six years ago to post cool pictures I found online, I never expected it to come as far as it has. Thank you all for subscribing and giving me an excuse to talk to really cool people. Happy New Year ❤️🤖
If you enjoyed this newsletter, forward it to friends (or interesting enemies). I am always looking to connect with interesting people and learn about interesting machines—reach out!
—Kane