As automation becomes more common, people are starting to worry about its effects on their livelihoods. Even governments are drawing up plans to manage the potential job losses. They’re starting to take ideas like “Universal Basic Income” seriously, something most of us laughed at a decade ago. As marketers and blog writers, we’re now hearing talk of the impact of computer-generated writing on our jobs…and it’s worrying.
People working within the creative industries tend to assume their jobs are safe from the effects of automation and Artificial Intelligence (AI). Computers, the reasoning goes, lack the capacity to capture emotion and depth. We assume that content writers, copywriters, and opinion journalists are safe…
…or at least we did. Now that computer-generated writing is advancing rapidly, we’re beginning to wonder how it will affect our work. Will we lose our jobs to AI? It’s a provocative question.
What Is Computer-Generated Writing?
Specialist software programmes create computer-generated writing using large data sets, analysis, and algorithms. The programme takes a large amount of written data (e.g. books, internet comments, etc.), analyses it with algorithms that look for recurring patterns, and then produces new text that follows the same patterns as the content it was fed.
The patterns analysed vary between programmes. They might include things like the relationship between verbs and nouns; the number of emotionally loaded words; and the use of positive, neutral, and negative language. They can analyse writing in many different ways.
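The commercial programmes don’t publish their internals, but the core idea of pattern-following generation can be illustrated with a toy Markov-chain text generator: it records which word tends to follow each pair of words in the source text, then chains those statistics together into new sentences. Everything below (the corpus, the function names) is invented for illustration; it’s a sketch of the general technique, not how any specific product works.

```python
import random

def build_model(text, order=2):
    """Map each run of `order` consecutive words to the words seen after it."""
    words = text.split()
    model = {}
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model.setdefault(key, []).append(words[i + order])
    return model

def generate(model, length=10, seed=0):
    """Start from a random key, then repeatedly pick a recorded follower."""
    rng = random.Random(seed)
    key = rng.choice(list(model))
    out = list(key)
    for _ in range(length - len(key)):
        followers = model.get(tuple(out[-len(key):]))
        if not followers:
            break  # dead end: this word pair never continued in the corpus
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the wizard opened the door and the wizard closed the door again"
model = build_model(corpus)
print(generate(model, length=8))
```

Because the generator only ever emits word sequences it has seen, the output is locally fluent but has no plan behind it, which is exactly the failure mode the Harry Potter excerpt below demonstrates at a much larger scale.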
Thousands of these programmes exist and some are highly advanced. But have they reached a level where they might replace people? The answer is yes and no.
The Software Is Getting Smarter
With most examples of computer-generated writing, it’s easy to think creative jobs are safe. But the writing is improving all the time.
A look at some examples of AI-generated writing might help clarify the issues. Botnik Studios is a good example. They used AI to create a chapter based on Harry Potter’s wizarding world. They fed the programme all seven of the Harry Potter books. It then analysed the texts and created a new story. Here’s an excerpt:
“I think it’s closed,” he noticed.
“Locked,” said Mr. Staircase, the shabby-robed ghost. They looked at the door, screaming about how closed it was and asking it to be replaced with a small orb. The password was “BEEF WOMEN,” Hermione cried.
“I think it’s okay if you like me,” said one Death Eater.
“Thank you very much,” replied the other. The first Death Eater confidently leaned forward to plant a kiss on his cheek.
While it doesn’t make much sense, the sentences are coherent. It’s also funny (worth a read if you haven’t already). It just isn’t a story in the traditional sense: there’s no plotline and the ideas don’t follow logically. The art of creative writing seems safe…
…but, what about factual writing? Here’s a news piece written by AI:
“FedEx shares have climbed 9 percent since the beginning of the year, while the Standard & Poor’s 500 index has increased nearly 5 percent. In the final minutes of trading on Tuesday, shares hit $162.65, a climb of 11 percent in the last 12 months.”
Unless you’re some sort of forensic linguist, it’s unlikely you’ll notice anything to indicate that it was written by a computer. That means at least some computer-generated writing is indistinguishable from human writing.
Why We Don’t Have to Panic Just Yet
Computer-generated writing is fantastic when reporting on facts, e.g. weather reports and specific types of news. Indeed, some news agencies already use it.
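Reports like the FedEx example above are often produced by slotting live data into pre-written sentence templates, a simple form of data-to-text generation. Here’s a minimal sketch; the function name and ticker are invented for illustration, and the figures merely echo the example quoted earlier.

```python
def stock_report(ticker, price, ytd_change, index_change):
    """Fill market data into a fixed sentence template."""
    direction = "climbed" if ytd_change >= 0 else "fallen"
    return (
        f"{ticker} shares have {direction} {abs(ytd_change):.0f} percent "
        f"since the beginning of the year, while the index has moved "
        f"{index_change:.0f} percent. Shares last traded at ${price:.2f}."
    )

print(stock_report("EXMPL", 162.65, 9, 5))
```

The template guarantees grammatical, factually faithful prose for any input numbers, which is why this approach works so well for earnings reports and weather bulletins: there is no opinion to form, only blanks to fill.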
Fortunately, creative writers can rest easy for now. AI still struggles with some aspects of language. It can’t form new opinions on the facts it’s reporting. It could infer a few things based on data that it already has, but it can’t create new ways of interpreting facts – humans can.
Computers don’t have emotions, so any emotional response comes from a predetermined set of variables. A programme will give similar responses each time, and its use of emotional language will come to feel formulaic and flat. Once we get used to it, the writing will feel boring.
The programmes don’t understand the emotions they express, so they can get them wrong. In the Harry Potter extract, does it make sense that the characters scream at a locked door? Especially when contrasted with Mr. Staircase’s flat, one-word reply. The emotional register lurches from calm observation to screaming…all from looking at a door.
This lack of insight is what makes the writing funny: it’s unexpected. The programmes aren’t purposefully comical. They just don’t understand emotions, so the depth is wrong and the emotion doesn’t fit the context.
For AI to create meaningful associations, it needs a huge amount of data and feedback (a human to correct its errors). Right now, that’s prohibitively expensive. This means that when AI encounters a unique or unexpected event, it has nothing to tell it how to react appropriately. And because of the lack of data, it’s all the more likely to encounter events that are, to it, unique or unexpected.
So, AI writing is limited in unpredictable situations: it’s limited by its emotional range and by its ability to form opinions. Technology hasn’t yet conquered this way of thinking.
It might appear to have conquered it – there are examples of AI art that people prefer over human-made art – but the art is nothing more than an image. There’s no story or idea behind it. It looks pretty, but it’s vacuous.
What’s the Conclusion? Are We Safe?
Computer-generated writing has its place. It works well with factual information, making it good for breaking news articles and weather reports. But for writing that has depth and meaning, e.g., opinion pieces and blogs, it’s not quite there yet.
For now, our jobs are safe. For how much longer, only time will tell. But just remember this: during times of technological advancement (e.g. the Industrial Revolution), people have always feared their own displacement. What actually happens is that we adapt and transform the way we work. Perhaps we need to think of ways to adapt to these changes, instead of fearing them?