Content, Clones and ChatGPT

[Image made with AI: Cara Bartlett transformed into a cartoon character – blonde hair, blue eyes, pink top, smiling face]

Cara Bartlett

Marketing Consultant - Lark Marketing

Over the last year, generative AI tools like ChatGPT have rapidly become a cornerstone and support tool in the world of content creation.

Recently, I took part in a conversation with a fellow marketing professional exploring how we, as content creators, are adapting (or resisting) this change. The discussion left me reflecting not only on how I currently use these tools, but also on what’s at risk if we rely on them too heavily.

Let’s start with the positives. I do use AI, mainly as a research and ideation tool. At the start of a research project – with a well-informed prompt – AI helps me get something on the page quickly. It’s a bit like brainstorming with a robot: it alerts me to angles I hadn’t thought of or highlights phrasing that’s surprisingly useful. I’ve also found value in tools like “Keywords Everywhere,” which work well alongside ChatGPT to support keyword research and generate metadata. But that’s where the benefits end, for me.

Would I use AI-generated content verbatim on a firm’s website or client-facing materials? Absolutely not. One of my biggest concerns is that content is becoming too homogenised – everything’s starting to sound the same. If we all use the same AI tools, and we’re all inputting similar prompts to write about something very factual, particularly in professional services (e.g. tax deadlines or business grants), how do we differentiate one firm from another? Worse still, how do we maintain our own professional and personal voices?

To me, this is both a brand problem and a personal one. AI can’t replicate the unique insights, tone, and opinions that the professionals I work alongside bring to the table. When I read AI-generated posts, particularly on LinkedIn, they stand out for all the wrong reasons. They lack authenticity. They’re slick, but a bit…soulless? And that’s not how I want my own brand or my clients’ brands to come across.

We risk becoming just one big ‘blob’ of indistinguishable accountancy or law firms. We’ll lose our personality, we’ll lose our credibility, and, ultimately, we’ll lose the trust of our clients and of those who rely on us to advise them.

Yes, there’s a place for AI. But the idea that it will or should replace content professionals is something I push back against hard. I’ve seen content roles drying up lately, only to be replaced by ‘AI Trainer’ roles. And there’s a worrying assumption in the market that AI is “good enough” to do the job. I suspect that in a year or two, those firms will realise that AI-produced content lacks the human touch and nuance that only a marketing and communications professional with years of experience in the sector can bring to engage ‘real’ people. When everyone sounds the same, that’s when we’ll proactively start to seek out distinctive, genuinely human content, and it will become valuable again.

There are other barriers too, trust being a big one. We know AI can hallucinate – it tells us that itself. So when dealing with technical subjects (especially in regulated industries like accountancy, law and compliance), you still need someone who can apply professional judgement, cross-check facts, and write with nuance. That comes from experience and expertise, not just a prompt. And then there are ethical concerns: shouldn’t readers be told when content has been AI-generated? I think we’ll see more regulation coming down the track, and rightly so, so that readers can make their own decisions about whether to rely on your content, seek more authoritative sources, or even pick up the telephone (remember them?!) and speak to their adviser.

Another worry I can’t shake off is whether we’re walking ourselves out of a job – and I don’t just mean marketing and communications professionals here. There’s an uneasy irony in teams streamlining their processes with AI, only to find they’re no longer needed. In fact, with so many AI trainer roles popping up in job ads, it feels like we’re heading for a weird cycle where humans train AI to train other humans to use AI… and so on, endlessly.

It’s all a bit dystopian. I joked in my recent chat that it’s like trying to think about how big the universe is – it hurts my brain. But behind that humour is a genuine concern: are we accelerating toward a future where creative thought, individuality, and human judgement are sidelined in favour of efficiency and scale?

I hope not.

So what’s my main takeaway?

Generative AI is a brilliant tool if you use it wisely. It can help speed up processes, generate ideas, and support research. But it can’t replace authenticity, experience, or human judgement. As marketers in professional services, we have a duty to inject our voice, our values, and our firm’s personality into every piece of content we publish. I wholeheartedly agree that AI can support that, but it shouldn’t lead it.

If we don’t maintain that balance, we’ll find ourselves in a future where our content is clean, optimised… and completely forgettable.

So, if you’d like to chat to the human Cara rather than this AI-generated version, give me a shout!
