Alien: Covenant Concept Art/Behind-the-Scenes Thread *spoilers*

Started by Corporal Hicks, May 17, 2017, 10:11:32 AM


Corporal Hicks

Definitely! Muthur is very thoughtful and dedicated to the series, especially when it comes to the prequels. Worth a look!


KiramidHead

Frank from Hellraiser ain't doing so well.

NetworkATTH

Quote from: KiramidHead on Mar 16, 2018, 03:44:42 PM
Frank from Hellraiser ain't doing so well.
(Space) Jesus Wept

NetworkATTH

#559
Raw, early, and fairly rough renders of the Engineer City's overall layout, from an insider at 20th Century Fox.





The Engineer City is approximately 2 kilometers (1.2 miles) wide. The only things that seem to be missing are the full number of buildings around the city's center and the Heads carved into the mountains. With a width of 2 km, you would think this would be more of a town or an outpost than an actual "city". Then again, our pale friends seem to be practically immortal, so who needs more space, right? It just seems pretty small for a city.

I hope this helps anyone who wants to make a basic map of the Engineers' city/town/outpost/general store/bar/tavern/rest stop.

Note: I did not obtain this.

MU-TH-UR 6000

#560
Dane Hallett is prepping something for Alien Day. Would be pretty cool if it was the artbook with all of David's sketches.


https://www.facebook.com/hallett.dane/videos/1746812345369179/

Also the clickbaiting is getting out of control with these fellas haha: http://www.alien-covenant.com/news/an-exciting-announcement-coming-alien-day-2018


Baron Von Marlon

#561
Quote from: NetworkATTH on Mar 21, 2018, 03:52:34 PM
Raw, early, and fairly rough renders of the Engineer City's overall layout, from an insider at 20th Century Fox.





The Engineer City is approximately 2 kilometers (1.2 miles) wide. The only things that seem to be missing are the full number of buildings around the city's center and the Heads carved into the mountains. With a width of 2 km, you would think this would be more of a town or an outpost than an actual "city". Then again, our pale friends seem to be practically immortal, so who needs more space, right? It just seems pretty small for a city.

I hope this helps anyone who wants to make a basic map of the Engineers' city/town/outpost/general store/bar/tavern/rest stop.

Note: I did not obtain this.

Thanks for posting.
I read that there were plans for this, but I hadn't seen it in images before.

SM

Quote
Also the clickbaiting is getting out of control with these fellas haha: http://www.alien-covenant.com/news/an-exciting-announcement-coming-alien-day-2018

"getting"?  ;D

NetworkATTH

From Odd Studio:

Quote
Full scale adult Neomorph collectible bust coming soon from CoolProps @coolprops_pr .
Lots of fun revisiting this guy a year and a half after the film! Sculpture by Adam Johansen @adman855
#neomorph #aliencovenant #aliens #alien #horror #creaturedesign #creatureeffects #scifi #models #collectables #sculpture

https://www.instagram.com/p/BgndHfmnCzg/?taken-by=odd_studio

MU-TH-UR 6000

Quote from: SM on Mar 23, 2018, 10:07:32 PM
Quote
Also the clickbaiting is getting out of control with these fellas haha: http://www.alien-covenant.com/news/an-exciting-announcement-coming-alien-day-2018

"getting"?  ;D

You mean to tell me it was never in control in the first place? Well, I would never...

Corporal Hicks

Quote from: MU-TH-UR 6000 on Mar 23, 2018, 04:31:06 PM
Dane Hallett is prepping something for Alien Day. Would be pretty cool if it was the artbook with all of David's sketches.

I imagine that is what this'll be.  :) And I can't wait!

Quote from: MU-TH-UR 6000 on Mar 23, 2018, 11:55:50 PM
Quote from: SM on Mar 23, 2018, 10:07:32 PM
Quote
Also the clickbaiting is getting out of control with these fellas haha: http://www.alien-covenant.com/news/an-exciting-announcement-coming-alien-day-2018

"getting"?  ;D

You mean to tell me it was never in control in the first place? Well, I would never...

:laugh:

NetworkATTH

Don't know if anyone has posted this, but this was an interview from the Canadian website Derivative that focuses on the UI of Alien: Covenant and how much of it was actually functional on set. They developed a new system to make everything work the way Ridley liked and sheepishly called it "Mother".

I haven't seen anyone post this interview here before, but if anyone has, feel free to give me lashings.

Quote
Sep.26.17 Soma CG Drives All Screen Visuals On-Set for Ridley Scott's Alien Covenant


Quote
Covenant Radiography. Screen graphics driven by Soma CG's MOTHER

Quote
Ridley Scott on set of Alien Covenant

Quote
All the 120+ screens you see in Alien Covenant were simultaneously driven on-set by Soma CG's application MOTHER, made entirely in TouchDesigner. This enabled Director Ridley Scott to craft and adapt the timing, sequencing, effects and look of the screen graphics while setting up each shot with the actors, art director and cinematographer. MOTHER was even put to uses that were not originally intended. This is what happens when TouchDesigner is used on-set, on-stage or (virtually) in-orbit. Boris Morris Bagattini (aka Chris Wilson) lays it out here.

Quote
"MOTHER" (interface below) was the remarkable (and very beautiful) TouchDesigner application designed for the task by Wilson. The application is based on replicated TouchDesigner Components, each having its own interface with all relevant settings for that screen. Further to the director's brief, the system feeding each screen needed to be able to:

  •     Composite and control up to 8 layers of screen graphics videos, HUD (heads-up display) overlays and live camera feeds.


  •     Apply and control several effects to different layers on each screen and adjust to the director's requests.


  •     Drive precisely-cued playback from producer calls through to the end of the shoot.


  •     Quickly wrap around screens as per the director's instruction.


  •     Quickly load and play newly requested content as it was provided.


  •     Enable the team to rapidly develop director-requested features, and distribute back to the screens.


  •     Save out entire scenes so they could be recalled and used on request.


  •     Color and level-adjust all or individual screens independently.
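
To make that brief a little more concrete, here's a minimal sketch in plain Python of how the per-screen state might be modelled: up to eight layers, each either footage, a live feed or a generative component, with per-layer effects and per-screen level adjustments, all reducible to a plain dict that a master could push out to a slave. This is purely illustrative; none of the class or field names come from Soma CG's actual code.

Code:
# Illustrative sketch only -- not Soma CG's MOTHER. Class and field names are invented.
from dataclasses import dataclass, field, asdict
from typing import List, Dict

MAX_LAYERS = 8  # the brief calls for up to 8 composited layers per screen

@dataclass
class Layer:
    source: str                    # "footage", "live", or "generative" (.tox-style component)
    path: str = ""                 # clip path, capture-device id, or component file
    opacity: float = 1.0
    effects: Dict[str, float] = field(default_factory=dict)  # e.g. {"glitch": 0.3}

@dataclass
class ScreenState:
    name: str                      # production's name for the screen, e.g. "BRIDGE_L3"
    layers: List[Layer] = field(default_factory=list)
    brightness: float = 1.0        # per-screen level adjustment
    gamma: float = 1.0
    cue: str = "idle"              # currently-armed cue

    def add_layer(self, layer: Layer) -> None:
        if len(self.layers) >= MAX_LAYERS:
            raise ValueError("screen already has the maximum of 8 layers")
        self.layers.append(layer)

    def to_message(self) -> dict:
        """Flatten the whole screen state into a plain dict, ready to be
        serialised and sent to whichever slave machine drives this screen."""
        return asdict(self)

# Usage: build the state the UI would edit, then ship it to the slave.
bridge = ScreenState(name="BRIDGE_L3")
bridge.add_layer(Layer(source="footage", path="clips/radiography_loop.mov"))
bridge.add_layer(Layer(source="live", path="blackmagic:1", opacity=0.8,
                       effects={"scanlines": 0.5}))
print(bridge.to_message())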



Quote
Derivative: Chris, the design brief was daunting and the functionality required was extensive! How did you go about building this system?

Chris Wilson: I received a call from Martin Crouch of PXL, a motion design studio based in Sydney. He had been engaged to lead the Screen Graphics Department and was looking at options to fulfil the requirements of the software and hardware brief for Alien Covenant. Ridley Scott was unhappy with the level of control he was getting from existing systems he had used on previous productions and had a long wish list of functionality that wasn't currently offered. Martin had experimented with TouchDesigner on a few smaller productions and had seen what was possible. Knowing the number of feeds required and the level of control needed, it was Martin's idea that TouchDesigner was going to be the best platform to handle all of the on-set playback requirements.

The first obstacle was how to structure the entire system. I began by thinking about each screen as a stack of layers, each layer being either a piece of footage, a live video feed or a custom .tox component, with the UI simply being a method to make choices and provide feedback, and all processing ultimately handled by the slave nodes out on the set. I had built an initial sketch in TouchDesigner and then started to focus on the daunting hardware that would be involved. Martin Crouch had already amassed a collection of bare LCD screens of various sizes and types, and was working with a Japanese partner on sourcing driver boards. These would ultimately be assembled by Martin and also Steven Paul, our on-set hardware technician. We initially had a lot of problems with EM (electro-magnetic) interference due to the sea of radio and electrical noise that flows through a set during shooting, and it was a battle that continued and was only finally won later in the filming schedule. I am sure a film set is the most hostile environment on earth for integrating sensitive electronics!

While I was battling with hardware and signal paths, I handed my initial sketch over to the excellent Peter Walker, who began to write a class-based Python framework to unify all control information into a single base component that could be called from anywhere in the network. This was an elegant and efficient structure that worked well, though not without some stressful debugging on our first set. Ultimately, from what we learned, I decided to rewrite MOTHER from the ground up to maximise its stability and flexibility, and this is the version that went on to finish Alien and then moved on to other films, Pacific Rim Uprising and Aquaman.
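
For readers curious what a "class-based Python framework to unify all control information into a single base component" might look like in miniature, here's a hedged sketch. It is not Peter Walker's actual framework and every name is invented; the idea is simply one shared controller object that every component registers with, so any part of the network can read or push settings through the same place.

Code:
# Toy illustration of a single shared control component -- invented, not the real framework.
from typing import Callable, Dict, Any, List

class ControlBase:
    """One central object holding all control state; components anywhere in
    the network register callbacks and get notified when a value changes."""
    def __init__(self) -> None:
        self._state: Dict[str, Any] = {}
        self._listeners: Dict[str, List[Callable[[Any], None]]] = {}

    def set(self, key: str, value: Any) -> None:
        self._state[key] = value
        for callback in self._listeners.get(key, []):
            callback(value)

    def get(self, key: str, default: Any = None) -> Any:
        return self._state.get(key, default)

    def watch(self, key: str, callback: Callable[[Any], None]) -> None:
        self._listeners.setdefault(key, []).append(callback)

# Any component -- a UI panel, a screen slave, a cue player -- talks to the same instance.
mother = ControlBase()
mother.watch("screen.BRIDGE_L3.brightness",
             lambda v: print(f"slave: re-rendering bridge screen at brightness {v}"))
mother.set("screen.BRIDGE_L3.brightness", 0.65)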


Quote
MOTHER V2.0

Quote
The current version of MOTHER is based on replicated Components, each having its own interface with all relevant settings for that screen. The number of screens is set in the master configuration, which automatically addresses the slaves based on the remote computers' physical video outputs and the corresponding TouchDesigner instance running under GPU affinity. Monitors are accessible in groups of eight in the interface, and settings can be linked across monitors and cues for arbitrary flexibility.

Ultimately all interface settings are condensed into a single Touch Out DAT that is received by the slaves on an assigned UDP port. The Touch Out DAT dynamically increases and decreases in size to scale additional information such as layer-types and specific .tox settings that may be loaded on the fly for that screen (A "tox" is a TouchDesigner component file that gets loaded on demand into a TouchDesigner session.)

All footage, layer types, effects and tox extensions are automatically copied to the slave computers in the background throughout the network of servers. This ensures any imagery or generative elements are local on the slave computers for minimum latency and direct processing, regardless of the load on the master. All elements across the system are synced through a master clock.
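
The master-to-slave path described above (interface settings condensed into a single, dynamically sized table and received by each slave on an assigned UDP port) can be sketched with ordinary sockets. This is an assumption-laden stand-in, not how MOTHER actually serialises its data: the real system uses TouchDesigner's Touch Out DAT, and the port number and JSON payload here are invented for illustration.

Code:
# Minimal stand-in for the master->slave settings push over UDP.
# Invented for illustration; MOTHER uses TouchDesigner's Touch Out DAT, not raw JSON.
import json
import socket

SLAVE_PORT = 7400  # assumed port number, one per slave instance

def push_settings(slave_host: str, settings: dict) -> None:
    """Master side: flatten the current UI settings and fire them at a slave."""
    payload = json.dumps(settings).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (slave_host, SLAVE_PORT))

def run_slave() -> None:
    """Slave side: listen on the assigned port and apply whatever arrives.
    The payload can grow or shrink freely, mirroring the dynamically sized table."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", SLAVE_PORT))
        while True:
            data, _addr = sock.recvfrom(65507)
            settings = json.loads(data.decode("utf-8"))
            print("applying", len(settings), "screen settings")  # would drive the layers here

if __name__ == "__main__":
    push_settings("127.0.0.1", {"BRIDGE_L3": {"brightness": 0.65, "cue": "purge"}})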


Quote
Lander Interior

Quote
D: Can you give us a little background on your work history and how you came to adopt TouchDesigner and have the kind of trust in the software that led you to use it in such a critical scenario on Alien Covenant?
CW: I come from a visual effects and motion graphics background and started to move into live events. I originally began using Resolume as my main platform but discovered TouchDesigner 077 and it was a revelation. The unique interface and visual feedback were addictive and played into the established node-based creative processes I was familiar with in VFX.

I have been using TouchDesigner since 077 for large-scale theatre, projection mapping and interactive artworks. I'm a bona fide Touch nut. I have toured TouchDesigner-based dance projects in Australia, the UK and Canada. Some of these projects have been very complex and I have always worked in theatre productions with real-time generative graphics and lighting control. I like to keep everything live, being able to react and perform with the on-stage performers so each show is slightly different. To me it makes the whole experience both as an audience member and as an artist more engaging. Theatre is even more critical than film as if something goes wrong you can't go for another take. After doing many shows using TouchDesigner with some crazy set-ups and never having a problem with the stability of the platform or a creative challenge TouchDesigner could help solve, I had a lot of confidence going in. I was totally terrified though as well.

Quote
Ridley Scott on set of Alien Covenant

Quote
D: What are some of the functions you designed that make MOTHER distinctive and original?
CW: MOTHER works as a distributed media server and real-time graphics generator tailored to on-set screen graphics. A unique aspect is that it runs robustly on a variety of standard hardware, from multi-Quadro workstations to laptops and Microsoft Surface Pros. The latest version integrates with cheap and small single-board computers, providing flexible hardware installation options. It can mix multiple screen sizes from multiple hardware vendors. Access through a unified EDID process results in no tearing or camera-strobing artifacts at any frame rate the director wants to shoot at.

It is totally customizable through the tox system and scene/script specific animations.

Procedural or pre-rendered scenes can be built on the fly even during shooting and then loaded transparently without affecting the on-set imagery, even during a take. Master and slave are protected as all updates and control are on-demand, and procedural animations are locally processed.
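
As a rough analogue of loading new content "transparently without affecting the on-set imagery": the trick is to prepare the incoming scene fully off the render path and only swap the reference once it is ready. The sketch below uses a plain Python thread and invented Scene/ScenePlayer classes; the real system does this with .tox components inside TouchDesigner, so treat this as a sketch of the idea rather than the actual mechanism.

Code:
# Illustrative double-buffered scene swap -- invented names, not the TouchDesigner API.
import threading
import time

class Scene:
    def __init__(self, path: str) -> None:
        self.path = path
        time.sleep(0.5)          # pretend to parse/load heavy content here

    def render_frame(self) -> str:
        return f"frame from {self.path}"

class ScenePlayer:
    def __init__(self, initial: Scene) -> None:
        self._active = initial   # the scene currently feeding the screens

    def render(self) -> str:
        return self._active.render_frame()

    def load_async(self, path: str) -> None:
        """Build the new scene in the background, then swap atomically,
        so playback never stalls mid-take."""
        def _worker() -> None:
            new_scene = Scene(path)      # heavy work happens off the render path
            self._active = new_scene     # single reference assignment = the swap
        threading.Thread(target=_worker, daemon=True).start()

player = ScenePlayer(Scene("scenes/idle_loop.scene"))
player.load_async("scenes/purge_warning.scene")
for _ in range(5):
    print(player.render())               # keeps rendering the old scene until the swap
    time.sleep(0.2)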


Quote
Lifter Installation

Quote
D: Can you explain why TouchDesigner was an ideal choice for creating the platform?

CW: TouchDesigner was ideal in that it enabled:

  • the development of a unified media-server platform that could be controlled over a network, with each media server running 3 instances of the software and each instance outputting up to 4 screens, each with 8 layers and multiple effects.

  • deployment of the same centrally controlled system on floating laptops and Surface Pros that could be moved to any location where floating screens might be requested.

  • tailoring of effects, playback control and generative content, and deploying them instantly across the network.

  • creation of unique layer types that might within themselves composite, apply effects and generate content.

  • customization of a UI and system to reflect the unique needs of the project.

  • creation of our own routines to quickly identify the production team's name for a screen and locate it in the server network.

  • on-the-fly customization of the resolution and orientation of screens.

  • centralization of all configurations back to a master cue player so that a complex scene could be controlled with simple UI interactions.

  • deployment of multiple instances in a single machine to keep hardware costs minimized while maintaining performance.

  • efficient routing of multiple live video feeds to all instances on the network.
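
To put numbers on the topology in that first point (each server running 3 instances, each instance driving up to 4 outputs, each output compositing up to 8 layers), here's a small sketch that maps a named screen back to the server/instance/output that drives it. The addressing scheme and all names are invented for illustration; only the per-server figures come from the interview.

Code:
# Invented addressing sketch for a servers x instances x outputs topology.
INSTANCES_PER_SERVER = 3   # figures quoted in the interview
OUTPUTS_PER_INSTANCE = 4
LAYERS_PER_OUTPUT = 8

def build_screen_map(servers: list[str], screen_names: list[str]) -> dict[str, tuple[str, int, int]]:
    """Assign each named screen to (server, instance, output) in order."""
    capacity = len(servers) * INSTANCES_PER_SERVER * OUTPUTS_PER_INSTANCE
    if len(screen_names) > capacity:
        raise ValueError(f"need {len(screen_names)} outputs, only {capacity} available")
    mapping = {}
    for i, name in enumerate(screen_names):
        server = servers[i // (INSTANCES_PER_SERVER * OUTPUTS_PER_INSTANCE)]
        instance = (i // OUTPUTS_PER_INSTANCE) % INSTANCES_PER_SERVER
        output = i % OUTPUTS_PER_INSTANCE
        mapping[name] = (server, instance, output)
    return mapping

# e.g. six servers give 6 * 3 * 4 = 72 outputs, enough for the 70+ monitor sets mentioned later.
screens = [f"MON_{n:02d}" for n in range(70)]
layout = build_screen_map([f"server{k}" for k in range(6)], screens)
print(layout["MON_00"], layout["MON_69"])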


Quote
First Tests

Quote
D: What were some of the incentives for creating MOTHER from scratch when there are existing software solutions you could have deployed?
CW: We decided to deploy a TouchDesigner system that used readily available hardware such as gaming motherboards and Quadro cards rather than a turnkey solution with overpriced licensing that was not tailored to our needs and could not be arbitrarily customized for both hardware and features. The need to prototype and deploy fast turnaround requests was the ultimate reason. And TouchDesigner's procedural nature and high efficiency meant we could balance time constraints and loads on the graphics department and enact procedurally generated layers that mixed seamlessly with pre-generated content.

There were also a number of directly interactive elements that ran on Surface Pros, plus the need to pipe GoPro and Alexa Mini footage directly into the system through Blackmagic capture cards on the fly, along with multi-layer overlays and color correction. No other software solution provided such flexibility. Whatever crazy requests came through from the art director or director, we were confident we would be able to find a solution in TouchDesigner in a short amount of time!


Quote
TouchDesigner on set

Quote
D: Because we are curious, can you give us an example of a "crazy request" from Ridley Scott?
CW: I remember one incident on the Lifter set. This was a 30-ton spaceship set constructed in an old storm-water reservoir. The whole set was on a massive hydraulic gimbal that allowed it to move freely and violently in any way, so we had rigged the TouchDesigner servers underneath the ship, with umbilical cords and a bungee system that ran up through the set's centre of mass to reduce strain on the cabling.

We explored the idea of placing the servers on the ship but the size and violence of the hydraulics made it impossible. We had rigged up a gyroscope to feed into TouchDesigner to give us relative positional and rotational data to drive some of the instrumentation on the ship. It was all working well, with me set up at control behind a massive wall of shipping containers that served as a giant blue screen. We had fulfilled the brief for the set, but Ridley was not happy with the actors not knowing what they were supposed to be reacting to, so suddenly we had the camera department, visual effects and the GoPro team asking us to coordinate a mocked-up pre-visualisation of the scene in real time across the monitors in the ship. We had footage coming in from video assist of Alexa material that had just been shot of a big stunt where the ship crashes into a statue and Katherine Waterston is thrown over the side of the ship, we had live video feeds of the ship exterior, and we had all this pre-vis footage from VFX of the alien and the ship in this big showdown.

The monitors in the ship went from screen graphics to a kind of live switching studio where we would be cueing segments and sources across various screens, with Ridley calling edits during the take so the actors understood what was going on in the scene and had something to react to. I think it was great that the system could actually make it all happen in between takes, and for a completely different purpose than what it was intended for. I think it improved the timing and engagement of the actors, which is why Ridley tries to build as much as possible practically and not in post during his films; he's not a fan of people acting blindly in front of a green screen.

Quote
GLSL Monitor Network

Quote
Corridors Control

Quote
D: Can you discuss some of the critical factors that make deploying a system like MOTHER on a major motion picture valuable and cost-effective?
CW: I think scale and the broad mix of hardware used on a film set make MOTHER unique. We had screens the size of iPhones through to 70-inch 4K monitors, all running in sync and off the same unified system. These were spread through complex sets with up to 70+ monitors that integrated thousands of practical LED lights, SFX such as squibs, sparks and fire, and, in the case of the Covenant and many other sets, were suspended on a multi-ton hydraulic system to make the whole set move. The pace and cost of shooting also make robustness critical. With each take costing around 12,000 USD, you don't want the system failing mid-take, eliminating a significant portion of the visual impact of the set, disrupting the actors and director and requiring costly delays or screen replacement in VFX.

Flexibility and rapidity of deployment are also hallmarks of the system. On the film Pacific Rim Uprising we had a 72-monitor set to install from scratch in 2 days. Most of this time is purely the installation of the screen hardware, with only a few hours to deploy the software and get everything up and running on all screens with the correct imagery, effects, layering and level controls. Sometimes a director may have a complex new visual idea once he has seen the set, and this may need to go from brainstorming a solution to a final cue-able scene within an hour. While shooting scenes with a single operator, rapid changes to imagery and content over large numbers of screens are where the system shines.

GLSL Monitor Network

Quote
Layer Network

Quote
D: What advantages do you see in using real-time graphics on film sets vs. pre-rendered VFX or effects added in post-production?
CW: For the purposes of screen graphics, real-time, in-camera work is superior to VFX. It simplifies post-production enormously, reducing the need to track and rotoscope large portions of a film set and burn in imagery, especially with high depth of field and lots of actors occluding portions of the screens. It also provides realistic ambient light from the screens, which is always a concern for the director of photography, who needs to balance very subtle light levels. And with many scenes in modern movies driven by screen-based plot points, where actors are looking at a screen and reacting to its content, it helps enormously for that to be a real thing happening in front of them rather than a green square.

I think other areas of VFX, such as set extension, realistic creatures and destructive elements, are a while off, but there are great advantages in utilizing real-time techniques on set to pre-visualise post-produced environments or characters, allowing the director and the actors to better understand the virtual aspects of the scene.

Quote
Monitor Network

Quote
D: What is your prediction in terms of real-time effects becoming more common in the film industry? What do you see as 'the path'?
CW: I think technically we are not there yet, despite the enormous advances being made in game engine development. We will come ever closer, however I still see films using non-real-time techniques as primary for a long time to come. I do think that more and more cameras will integrate hardware and software solutions that move many aspects of the filming process that were traditionally post-based into real-time, in-camera capture, eliminating the need for green screen, rotoscoping, lightfield capture and tracking. All the information needed will be captured by the camera itself, which will speed up both shooting and post-production enormously.

Quote
Network

Quote
Root A

Quote
Massive thanks to Chris Wilson for taking the time to talk to us, and congratulations on this great achievement. We look forward to your next endeavors!

You can read the article here: http://derivative.ca/events/2017/AlienCovenant/

𝔗𝔥𝔢 𝔈𝔦𝔤𝔥𝔱𝔥 𝔓𝔞𝔰𝔰𝔢𝔫𝔤𝔢𝔯

Quote from: NetworkATTH on Mar 27, 2018, 07:49:42 AM


That flight control stick (HOTAS) is from a real combat aircraft (possibly an F-16). It even has the combat radar cursor hat switch, red bomb pickle button and ECM programming switches still on it.

Note, though, the screen on the top left, which reads: ENVIRON CTR PURGE 24556 DR5

That of course can also be seen in the Narcissus in Alien and inside the Spinner in Blade Runner.


NetworkATTH

Never noticed that, heh.
Ridley never gets tired of that 24556 DR 5 purge.

MU-TH-UR 6000

https://www.instagram.com/p/BhC3ThklGSf/?taken-by=matthatt0n

Another tease for David's sketchbook (?), this time from Matt Hatton. Getting pretty hype about it right now.
