IS HOLLYWOOD COOKED, OR IS AI HALF-BAKED?

Like a lot of people trying to make a living in the creative arts, I've become increasingly aware of the ever-creeping presence of AI in discussions around the future of film.

I say "aware" specifically - a lot of people might use the word "concerned" instead - but I can't say I feel particularly concerned about it. Not because I'm one of those who welcome the idea of using generative AI to replace traditional film-making, but because the prevailing discourse around AI, and the constant hyperbole inescapably woven into it, can't mask the strange sense of desperation that seems to be bubbling just under the surface of it all. Being relatively enthusiastic about technology I tend to read a lot about AI, and more than anything I'm struck by just how many of these tech CEOs seem to have put an awful lot of eggs - about a trillion eggs, I'm led to believe - into one basket.

As a result, they really, really need you to buy that AI-powered fridge, or that AI-powered lawnmower, or one of the thousands of other products and services that have integrated AI without any real understanding of how it actually makes them better. To say these companies have a lot riding on your wholesale adoption of AI-powered technology is a colossal understatement - a lot of financial analysts suggest they're gambling another catastrophic global recession on it.

I'm reminded of the obsession with nuclear energy in the 1950s, fuelled by the end of World War 2 and the Cold War's grip on modern culture - everything was suddenly "atomic" in some way as we entered a new technological age, and this mysterious, unfathomable technology that had won us the war forced its way into the public consciousness. You could go to Las Vegas and order an Atomic Cocktail, do home experiments with an Atomic Energy Home Chemistry Kit, or wear an Atomic Bomb Ring and watch real atoms collide on your finger for some reason.

Nobody outside of the nuclear scientists who built it really understood the science behind atomic energy; all they knew was that it could bring death and devastation on an unprecedented level to our enemies, and that was cool - so naturally we should be putting it in children's breakfast cereal.

I feel like the cultural feeling towards AI is kind of the same. Few people seem to really understand the technology powering it, but we're repeatedly assured that it's somehow going to make everything better.

You'll have to forgive my skepticism on that last point, when raising concerns about the long-lasting environmental damage AI data centres might be causing (or the psychological damage AI chatbots are already inflicting on vulnerable members of society, or the fact that AI adoption is giving thousands of large-scale employers the green light to start laying off workers en masse before the technology has really proved itself a worthy replacement) is met with accusations of being a "Luddite" from pro-AI adopters, along with claims that "you can't put the genie back in the bottle" or "technological change is inevitable, embrace it or die". Call me overly cynical, but none of this sounds like it's making anything better, per se.

Even the name is deliberately misleading: "Artificial Intelligence" - there's nothing actually intelligent about it. LLMs are essentially 'roided-up autocomplete algorithms that predict what the next word in a statement should be, based on the statistical probability of it appearing in a data source. It doesn't think. It doesn't rationalise. It doesn't contextualise decision making. It just carries out an - albeit very impressive - mathematical function based on the data you put into it and the data it has to cross-reference. But ordinary people - the kind who don't stay up until 3:15am writing blog posts about why the talk around AI is kind of silly - seem to singularly fail to grasp this concept.
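To make the "autocomplete" comparison concrete, here's a toy sketch in Python - my own illustration, not how any real LLM is actually implemented. It counts which word follows which in a tiny made-up corpus and always predicts the most statistically likely successor. Real models replace the counting with enormous neural networks trained on billions of documents, but the underlying task is the same: predict the next token from the statistics of the training data.

```python
from collections import Counter, defaultdict

# A tiny made-up "training corpus" for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    # Return the most frequently observed successor of `word` -
    # pure statistics, no understanding involved.
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # prints "cat" - it follows "the" most often here
```

The model has no idea what a cat is; it has simply observed that "cat" follows "the" more often than anything else in its data. Scale that idea up by many orders of magnitude and you have the gist of the criticism.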

Not that I blame them; to come back full circle, the modern obsession with AI is driven by marketing and gimmickry carried out by enormously wealthy tech companies that have banked heavily on AI being the "atomic energy" of the future. An awful lot of time and effort has gone into trying to manifest that idea into reality, and nowhere is this more evident than in social media discourse where, within seconds of finding a pro-AI digital space, you will uncover claims that "Hollywood is cooked", with the "proof" being some sort of uncanny, plastic-looking mashup of John Wick fighting Arsene Wenger on top of the Empire State Building while Pope Pius XII watches on from his helicopter.

Don't get me wrong, the fact that the technology to create this even exists is impressive - I think it's fair to say that none of us saw it coming - but the massive ocean of AI "slop" videos that have sprung forth from the technology strongly suggests two things to me: 

1) That the creative mental development of 99% of people who use generative video services stopped when they were about 6 years old and their idea of peak entertainment was smashing action figures together. 

2) Tech companies are still kind of flailing when it comes to figuring out what the specific applications of this technology actually are.

This is why I sense that strange air of desperation around the subject. Let me put it this way: if you invented a car that ran on tomato sauce instead of petrol and did 600 miles to the gallon, and you wanted to sell it to the public, you'd buy a few jars of Dolmio, stick it in the tank, drive round the block a few times, and then tell everyone that yes, you do accept cheques.

But there isn't an AI equivalent of that. No killer, undeniable application that every household simply can't do without. Sure, there are lots of day-to-day use cases where AI provides convenience - quickly summarising an email, drawing up a to-do list, providing a basic overview of a topic - but I'm going to go out on a limb and suggest that gambling a trillion dollars on it, and completely obliterating the creative arts industry in the process, isn't worth it if all the end result yields is saving me the two minutes it would take to draw up a shopping list.

When the technology first arrived it was all "this could cure every disease known to man!" and "this will help us colonise Mars!", but as it has developed further, and more and more money has been invested in it, tech firms seem to have quietly drifted away from such altruistic applications and replaced them with "err, do you want to know what it would look like if Spider-Man went skiing with OJ Simpson?"

...And I'll admit, the urge to ask ChatGPT to generate that image for the purposes of illustrating this blog post is enormous, but I'm going to stand by my principles and not do it...

Instead, we just have podcast clips and interviews of Sam Altman, Dario Amodei, and Elon Musk making unfounded, outlandish claims and telling everyone that AI is going to replace all doctors in 5-10 years, or that AI will become self-aware and kill everyone on planet Earth in 5-10 years, or that AI will come into your Mum's house, unmake her bed, and tip an entire bag of flour onto the floor of her kitchen in 5-10 years.

But back in the real world, Goldman Sachs recently announced that AI contributed "basically zero" to America's GDP growth last year. Similarly, MIT conducted a report into AI-led business and concluded that the majority of AI-based companies do not turn a profit. Nobody - including the AI companies themselves - seems to really know what AI is for.

The most recent industry it seems to have forced itself into is coding and, as I type this, the term "vibe coder" has entered modern internet parlance as a way of describing people who have no coding skills whatsoever, but are able to describe what they want to at least a GCSE English standard. There is currently an ongoing war of words between traditional programmers and these "vibe coders", with one side claiming that AI makes the other redundant, and the other claiming that AI is so bad at complex coding that it makes talented humans more important than ever. I sympathise with the latter, and suspect they feel much the same emotional response to these claims as I do when I see the tenth video that day of The Very Hungry Caterpillar waterboarding Tony Hawk on board the Death Star.

The point is, I'm not worried, mostly because while I do believe there is a place for AI in film - God knows I'd love a rotobrush tool that could cut people out of a 15-second VFX plate with a single click of a button - making anything with emotional substance, intelligence, and a compelling narrative still completely relies on the talent of human beings, and the types of human beings who dedicate themselves to developing those talents... aren't too keen on AI.

The real concern shouldn't be that AI is coming to replace us all, but that these companies are investing a trillion dollars in a technology they don't yet have a genuinely good use for, and in lieu of one are simply trying to force us all into living with it, making the mass adoption of their technology some kind of Orwellian inevitability.

If they had the honesty to admit that's what they want, it'd be the one prediction they'd be completely right about.