The landscape of generative artificial intelligence (AI) applications has expanded exponentially in a very short time. AI is now being used to generate content including audio, code, images (including the one above), text, simulations and videos, amongst many others, all with the potential to dramatically change the way we approach our marketing and digital communications – and ultimately how we create content.
With the rise and rise of generative AI technology, almost everyone will have heard of ChatGPT by now. But there are many more players in the marketplace, including Jasper and Copy.ai, as well as Google’s Bard, Microsoft’s Copilot and Amazon’s Rekognition. And earlier this year Apple brought out its first books narrated by algorithms, adding to its suite of AI tools.
The capabilities of AI are increasing all the time, and GPT-4, OpenAI’s most recent model, released in early 2023, is now ‘multimodal’, meaning it works with both text and images. While it can’t produce images (yet), GPT-4 can analyse and describe them. And while many of us will have started using the likes of ChatGPT to create content (disclaimer: AI was not used to write this article), the more cautious amongst us may be wondering whether it will end up replacing us.
A 2022 McKinsey survey showed that AI adoption had more than doubled over the previous five years, and investment in AI is ever increasing. AI was created to simulate human intelligence and perform tasks that usually require human-like reasoning, perception and decision-making, and today it’s used in a wide range of industries, from education and healthcare to finance and legal. Statista, a leading provider of market and consumer data, has predicted that investment in AI technology will reach almost a trillion dollars by 2024.
However, AI tools will only ever be as good as the data and prompts that go into them. There are many concerns around the use of AI, including ethics, creativity and confidentiality, with a recent report highlighting that more than 4% of employees have put sensitive corporate data into ChatGPT, raising concerns that its popularity may result in massive leaks of proprietary information.
And a further note of caution when using ChatGPT. It has been trained on a vast amount of text from across the internet and draws on over 175 billion parameters, but its training data only runs up to September 2021, so don’t rely on the content being up to date or factually correct. Bard, on the other hand, continually draws information from the internet, so it’s more likely to be pulling from the latest information. Tools like ChatGPT work by predicting the strings of words that best match your query, but they lack the reasoning to apply logic or catch factual inconsistencies, so they can state false facts and answer questions with made-up replies as though they were true. These are known as AI hallucinations, and they have been hitting the news in recent weeks.
So how can we harness AI for marketing and digital communications? Greg Brockman, co-founder of OpenAI, believes ChatGPT helps overcome “the blank page problem”. From generating initial content ideas to keeping news and articles up to date, AI tools should be able to speed things up and improve productivity. And for many aspects of the creative process, they might provide a starting point that leads to new ideas.
Today, generative AI is already being used across the marketing space in a growing number of ways.
Recently, Coca-Cola announced plans to use it to help create new marketing content. And this is just the beginning.
But all these options still require some level of human intervention to avoid your marketing communications looking and sounding like your competitors’. AI needs to be used in the right way, with strong creative thinking, to get the best results. By ignoring the importance of strategic and creative human input from your marketing and communications experts, you risk compromising your brand, which will ultimately be bad for business.
And what about governance? According to Tad Roselund, Managing Director & Senior Partner at BCG, responsible AI (or ‘RAI’) governance is becoming more and more important, so having it in place to mitigate risk is key. “Even if your company has good values and principles – but does nothing – you’re likely to encounter these ethical issues when you use AI. The greater attention paid to purpose and ESG also makes it hard to ignore responsible AI.”
So, with the right creative, strategic and human input, you can harness the power of AI without causing harm and help ensure it’s used as intended within your organisation.
Ultimately, AI is not a replacement for humans, but rather a tool for enablement. The entire landscape has changed and will continue to change, but for now, AI won’t replace those areas of work that are very much human-centric. It can’t create something genuinely new; it can only do what it’s programmed to do. In the end, it’s how we use it that’s key, so ensuring that the content generated is drawn from credible, up-to-date sources, is on brand and in the right tone of voice will be crucial.
There are business efficiencies that can be driven by AI, but we should use those efficiencies to create breathing space to think up new ideas and ways of working, rather than simply increase profit margins and drive growth.