Cinema’s head is in the cloud

Shifting our processing power to the internet is the next natural step in cinema’s technological revolution. Saul Mahoney, from online production specialists Sundog Media Toolkit, explains the direction of travel.

Words: Saul Mahoney


The motion picture industry has spent the best part of the past 20 years transitioning from an analogue to an almost entirely digital workflow. This digitisation could be considered complete, given that there are now many well-known examples where content travels ‘glass to glass’, that is, from camera lens to projector lens, in a purely digital environment. This area is still growing, predominantly in Europe and the US, and adoption is far from global given the current level of network delivery infrastructure deployed.

The digital transformation has enabled innovation and has generated a healthy landscape where technology is flourishing — but it has also brought with it a version explosion, as distributors and exhibitors exploit content in any language and in a growing number of formats. Typically a ‘blockbuster’ will require a minimum of 300 individual versions of the same film, all to be created, mastered and distributed in a short space of time. This rising level of complexity and volume of content across the cinema platform has, up to now, been handled in a fairly linear fashion by a mixture of traditional, well-known stakeholders and new entrants to the industry.

More and more companies are turning to cloud-based solutions in order to facilitate this rapid expansion of content complexity, and “the cloud” is now an often-raised subject in the technology circles of the industry. So what is this “Cloud” of which we speak?

Nebular networking…

Quite simply, cloud computing is a network of remote servers hosted on the internet. These servers are typically held in secure data centres: warehouses with huge banks of machines blipping and whirring away, doing what they are designed to do.

The use of cloud platforms in the motion picture industry is in its relative infancy, though there are a few well-established uses already, notably the electronic delivery platforms distributing content to cinemas, such as those from Gofilex and UniqueX. Both utilise data centres and cloud services.

In the film and television industry, however, adoption is growing apace, with emerging technologies and applications arriving as cloud services: Aspera and Signiant offer fast, secure file delivery; BeBop and Aframe provide professional editing platforms; GrayMeta uses AI and machine learning to extract and create metadata; Foundry (Nuke) offers powerful VFX rendering; and our own Sundog Media Toolkit brings DCP versioning and mastering to the growing party of solutions hosted in the cloud.

Adaptable to suit the scale required

Cloud computing is useful for a number of reasons, the primary one being the ability to ramp up scale as demand requires. This is a no-brainer for an industry such as ours, where the product is, in essence, a large amount of data that needs to be processed and then delivered. Secondary to this is speed: if more machines can be harnessed to achieve scale, then more machines can also be utilised to speed up a workflow. An example of this is Sundog Media Toolkit’s DCP Engine, which can wrap a new DCP in a substantially reduced timeframe compared to on-premise machines. It doesn’t stop there: any versioning job can be run hundreds of times simultaneously, with negligible loss of time overall. In other words, cloud services allow the production pipeline to expand when the demand for all the exotic flavours of the version explosion arrives, rather than having to extend a deadline or farm out work due to capacity issues.
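The scaling idea above can be sketched in a few lines of Python. With elastic capacity, N versioning jobs finish in roughly the time of one, because each job gets its own worker. The `wrap_dcp` function here is a stand-in for real DCP wrapping; the names, version count and timings are illustrative assumptions, not Sundog’s actual API.

```python
# Minimal sketch: parallel versioning jobs on elastic capacity.
# wrap_dcp is a stand-in for real DCP wrapping work (assumed, illustrative).

from concurrent.futures import ThreadPoolExecutor
import time

def wrap_dcp(version):
    time.sleep(0.1)            # stand-in for the real wrapping work
    return f"{version}.dcp"

versions = [f"feature_v{n:03d}" for n in range(1, 9)]

start = time.monotonic()
# One worker per job: all eight versions are wrapped simultaneously.
with ThreadPoolExecutor(max_workers=len(versions)) as pool:
    results = list(pool.map(wrap_dcp, versions))
elapsed = time.monotonic() - start

# Eight jobs in parallel finish in roughly the time of one.
print(f"{len(results)} versions in {elapsed:.2f}s")
```

Run serially, the same eight jobs would take around eight times as long; that ratio is the whole argument for elastic scale.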

Cloud computing is an efficient resource and its costs reflect this. When companies migrate to cloud computing and storage, capital expenditure budgets can be reduced significantly in favour of an on-demand op-ex model.

When we look at cinema and its growing relationship with the cloud, we can see some highly compelling cases for further enterprise and collaboration. Adoption across the post-production and mastering worlds will see timelines shrink and content delivered quickly and dynamically, with a higher level of accuracy than currently exists and, potentially, on demand. Metadata will provide more intelligence on content, or in content, which in turn will allow more automation, localised advertising, ‘augmented reality’ and marketing opportunities.

Opportunities & challenges of adoption

The first and most obvious opportunity of the cloud is the apparently “infinite scale” of available resource, which can equate to infinite speed. Of course the reality is that neither is infinite, but the resource available in the public cloud far exceeds that available to any single organisation, so it offers the opportunity to take data processing from weeks and days down to hours and minutes.

Collaboration becomes far easier in a public cloud environment, as team members across territories can easily work together on projects, and the required connectivity is easily handled by the data centre. The tools required will be a factor here, but most cloud-native services support multiple users working on the same content or project simultaneously. Look at Google Sheets as a simple example.

Another advantage of the data centres’ global reach and connectivity is that file delivery becomes easier and no longer ties up the bandwidth of your facility. Delivering to multiple parties is more efficient, and if the requirement is to deliver the same files to high volumes of customers, most public cloud operators also run content delivery networks (CDNs), which improve file delivery by moving copies of the files to local hubs that serve high demand in a particular region. CDNs are how services such as BBC iPlayer and Netflix cope with high local demand or peaks in a given territory.

Despite the advantage of file delivery, connectivity will still likely be a challenge to adoption of cloud services — you still need to move your data into the cloud in the first place. The cost of broadband connectivity is falling constantly, but when working with content in the cloud, enterprise-class connectivity is a must. At the small-operation end, 1 gigabit per second will typically suffice, but for a larger-scale operation, one or more 10 gigabit pipes are now typical. With the ability of the cloud to scale on demand, some media-specialist telecoms providers now offer features which can make cloud services a much more attractive proposition: “burst” pricing for short-term additional bandwidth, direct connect (dedicated bandwidth into a particular public cloud), and layered services such as security, asset management and collaboration tools are all becoming features of connectivity offerings.
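The arithmetic behind those link sizes is straightforward. The sketch below estimates transfer times for a feature-length DCP over 1 and 10 gigabit links; the 250 GB file size and the ~70% effective throughput are illustrative assumptions, not measured figures.

```python
# Rough transfer-time arithmetic for moving content into the cloud.
# File size and link efficiency are illustrative assumptions.

def transfer_hours(size_gb, link_gbps, efficiency=0.7):
    """Hours to move size_gb gigabytes over a link_gbps link,
    assuming real-world throughput at a fraction of line rate."""
    size_gigabits = size_gb * 8          # 1 gigabyte = 8 gigabits
    effective_gbps = link_gbps * efficiency
    return size_gigabits / effective_gbps / 3600

dcp_gb = 250  # assumed size of a feature-length DCP
for link in (1, 10):
    print(f"{dcp_gb} GB over {link} Gbit/s: "
          f"~{transfer_hours(dcp_gb, link):.1f} hours")
```

On these assumptions a single feature moves in under an hour on a 1 gigabit pipe, but a slate of hundreds of versions is what pushes larger facilities towards 10 gigabit links.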

In the case of a one-time large-scale move to the cloud (for instance a content archive), the idea of shifting an entire repository of content could be daunting, even with huge connectivity. However, this can be handled by utilising physical migration services: putting the data onto large portable hard drive arrays and physically shipping them to the data centre. Off-the-shelf services such as Amazon’s Snowball and Snowmobile are able to handle migrations running into exabytes of data (an exabyte is 10¹⁸ bytes — a big number). At the low-cost end, a few hundred dollars will move up to 80 terabytes in a “Snowball”, while you can move up to 100 petabytes at a time in a “Snowmobile”.
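To see why shipping drives can beat the wire, it helps to put numbers on it. The sketch below compares a hypothetical 2 PB archive moved via 80 TB Snowball appliances against pushing it over a 10 gigabit pipe; the archive size and the ~70% link efficiency are illustrative assumptions.

```python
# Ship-vs-wire comparison for a one-time archive migration.
# Archive size and link efficiency are illustrative assumptions.

import math

archive_tb = 2000          # a 2 PB content archive (assumed)
snowball_tb = 80           # capacity per Snowball, as quoted in the text

# Number of appliances needed, rounding up to whole units.
appliances = math.ceil(archive_tb / snowball_tb)

# Network alternative: a 10 Gbit/s link at ~70% effective throughput.
effective_gbps = 10 * 0.7
days_over_wire = (archive_tb * 8_000) / effective_gbps / 86_400

print(f"{appliances} Snowballs, or ~{days_over_wire:.0f} days over the wire")
```

Weeks of saturated bandwidth versus a stack of shipped appliances is the trade-off at the heart of physical migration services.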

A changing financial model

Cost saving is an interesting part of the equation. At first glance, a cloud service can appear to be very low cost compared to an equivalent fully managed service. However, moving to the cloud will typically change the nature of an operation, and that has to be accounted for. Most major cloud providers now offer “TCO” (Total Cost of Ownership) calculators to help work this out. A simple example would be to bring in-house a service that was previously entirely outsourced. There is no capital cost on the service itself; the cloud cost here is all operational cost, which should typically be in the region of 10-20% of the legacy service cost. But you may now need to staff up to run the service in-house, and you will almost certainly need to upskill the existing workforce or bring in new expertise. You may need to increase your connectivity (see the previous section) and buy in additional cloud-based supporting systems (e.g. file transfer, asset management, missing service pieces). So it is not unusual, when this and other factors are all accounted for, that what looked at face value to be an 80-90% saving actually reduces to 30-40% in hard savings. This doesn’t attribute value to the operational agility and resilience that the cloud service may bring. Nevertheless, 30% is typical of the TCO cost savings in cloud migration seen across all industries.
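The TCO example above can be made concrete with a back-of-envelope calculation. All figures below are illustrative assumptions chosen to match the ranges in the text (cloud compute at 10-20% of the legacy cost, overheads eroding the saving to 30-40%), not real pricing.

```python
# Back-of-envelope TCO comparison for bringing an outsourced
# service in-house on cloud infrastructure. All figures are
# illustrative assumptions, not real pricing.

legacy_annual = 1_000_000              # fully outsourced service cost
cloud_compute = 0.15 * legacy_annual   # 10-20% of legacy; take midpoint

staffing     = 300_000   # new in-house expertise (assumed)
connectivity = 100_000   # upgraded pipes, direct connect (assumed)
supporting   = 100_000   # file transfer, asset management etc. (assumed)

cloud_total = cloud_compute + staffing + connectivity + supporting
saving = 1 - cloud_total / legacy_annual

print(f"Face-value saving:  {1 - cloud_compute / legacy_annual:.0%}")
print(f"Hard saving after overheads: {saving:.0%}")
```

On these numbers the headline 85% saving erodes to 35% once staffing, connectivity and supporting systems are counted — squarely in the 30-40% range the text describes.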

If you already run in-house services using owned infrastructure, moving to an operational expenditure model will change the nature of your accounting, and this can sometimes be a challenging transition. Furthermore, there is an expectation with owned infrastructure to squeeze as much lifetime out of the hardware as possible, often well beyond the accounting write-off period. In this case, cloud migration is more likely to happen piecemeal as equipment reaches end-of-life or is no longer supported. This has the upside of a gentle transition (avoiding operational shock) but the downside of supporting two models, potentially over a number of years. The transition is much easier to undertake when moving from a wholly outsourced service to an in-house, cloud-backed service. The net result is the same, though: the cloud allows infrastructure cost to vary with the peaks and troughs of demand. Being able to scale instantly when busy and minimise cost when quiet has enormous intrinsic value to any business.

As safe as houses?

Security is a major factor when taking the decision to move into the cloud, since it is widely accepted that the physical infrastructure of large public cloud providers is far more secure than most operations — even many governments — can reasonably afford to achieve. The fact remains, however, that the services within sit in a “public” environment, so the attack surface is naturally large. This area is far too complex for a single-point approach to security, and it is impossible to guess what vulnerabilities or attack vectors may present themselves over time. It is therefore essential to be diligent in your choice of service provider. Ask whether they are certified by any industry bodies, whether their security is well documented, whether they undergo regular independent security testing, and whether they have passed any major security audits. In the end, the vendor security level an organisation requires will reflect its own stance on security, but bear in mind that in the connected world, security means more than protecting content: it means protecting internal systems, business continuity, and customer and personal data. The cloud can be just as secure as owned infrastructure, and in most cases far more secure — but ensure the applications you choose don’t compromise that.

Can the cinema industry avoid the cloud? Not really. We’re heading in this direction for a reason; it is the only sustainable solution and will be ubiquitous soon. But the challenges for anyone heading to the cloud in the near future are not trivial and should not be underestimated.

The different types of cloud deployment

There are three main types of cloud deployment. Public cloud (e.g. Amazon Web Services, Microsoft Azure, Google Cloud) operates globally, with public access to tools and services provided by the operator. Private (local) clouds are bespoke builds, i.e. not open to public use, and can be costly to set up and maintain. Finally, the hybrid cloud offers a bridge between the private and the public, and can be a good model for businesses not wanting to jump in at the deep end.

The cloud is probably best known as a data storage provider: most telcos offer some cloud storage to consumers so that we don’t all have to carry around a hard drive containing a lifetime of photography and music archives.

Cloud services are used in familiar applications such as Netflix, Facebook and LinkedIn, where the cloud is used not just for storage but also for processing. Further examples are Dropbox and Google Drive for storage and file sharing; Google Docs and LibreOffice for collaborative work on documents; Tumblr and Instagram for photo editing and sharing; and YouTube for video storage and publishing. The list of use-cases illustrates the extent to which the cloud is already embedded in everyday life.