
Yomi Tejumola: Looking forward with AI


There’s a problem with the infusion of generative AI into marketing: finding people with the skills to use it.

This isn’t just a problem for marketing, of course. Some 90% of businesses are facing a shortage of AI professionals, according to a study by IBM’s Institute for Business Value. And it isn’t just because AI is suddenly showing up in all facets of business. The fact is, getting the most out of AI goes well beyond mastering the UI. It requires knowledge of data analytics and mathematical concepts, and the ability to think critically.

Yomi Tejumola is CEO and founder of Algomarketing, a U.K.-based martech and operations consultancy. We talked to him about marketing’s AI skills shortage and what can be done. (Interview edited for length and clarity.)

Q: You’re a data scientist, but most marketers aren’t. You know how to think about data and ask the questions that make the best use of the AI. How can they learn those sorts of skills?

A: So, I’ll give an answer based on an example case we were working on recently. We built a sort of ChatGPT for insights, a generative business intelligence tool. It enables marketers to ask questions about their data, like which campaigns drove the most leads last quarter, and it will give them answers.
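
The pattern he describes, natural-language questions answered from campaign data, typically works by translating the question into a query, running it, and summarizing the result. Here is a minimal sketch of that flow in Python; the schema, the `call_llm` placeholder and the file name are assumptions for illustration, not details from the interview.

```python
# Minimal sketch of a "generative BI" layer: a natural-language question is
# turned into SQL over campaign data, executed, and summarized back for the
# marketer. All names here (call_llm, campaigns.db, SCHEMA) are hypothetical.
import sqlite3

SCHEMA = "campaigns(name TEXT, channel TEXT, quarter TEXT, leads INTEGER)"

def call_llm(prompt: str) -> str:
    """Placeholder for whatever LLM endpoint the team actually uses."""
    raise NotImplementedError

def answer_question(question: str, db_path: str = "campaigns.db") -> str:
    # Translate the marketer's question into SQL over a known schema.
    sql = call_llm(
        f"Given the table {SCHEMA}, write one SQLite query that answers: {question}. "
        "Return only the SQL."
    )
    rows = sqlite3.connect(db_path).execute(sql).fetchall()
    # Turn the raw result back into plain language plus a suggested next step.
    return call_llm(
        f"Question: {question}\nQuery result: {rows}\n"
        "Summarize the answer for a marketer and suggest one follow-up question."
    )

# Example: answer_question("Which campaigns drove the most leads last quarter?")
```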

We initially presented the tool to the marketers as a blank slate and said, “This is a genie. You can ask it any questions on your data and you get the answers and recommendations.” When we did that, there wasn’t as much adoption as we wanted. Nobody knew what to ask. It’s like in Aladdin when the genie comes and says, “What are your three wishes?” And people said, “I don’t know.” 


What we needed to do, instead of just giving it to them as a blank slate, was a few things.

One was to provide already-generated prompts that they could use, but those prompts needed to be relevant to that particular marketer. We had to create personalized prompts, which meant we had to engineer it so that it takes user data and knows who the user is, what their job title is, and what their previous and current campaigns are. It takes all that information and generates a prompt that is relevant to them.
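
As a rough illustration of what that personalization could look like, here is a sketch that builds starter prompts from a simple user-profile record. The field names and prompt wording are illustrative assumptions, not Algomarketing’s actual implementation.

```python
# Sketch: generate personalized starter prompts from what we know about the user.
# The profile fields and prompt templates are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class UserProfile:
    job_title: str
    current_campaigns: list[str]
    previous_campaigns: list[str]

def starter_prompts(user: UserProfile) -> list[str]:
    prompts = [
        f"As a {user.job_title}, which of my current campaigns "
        f"({', '.join(user.current_campaigns)}) is pacing behind on leads?"
    ]
    if user.current_campaigns and user.previous_campaigns:
        prompts.append(
            f"Compare {user.current_campaigns[0]} with {user.previous_campaigns[0]} "
            "on cost per lead and conversion rate."
        )
    return prompts

# Example:
# starter_prompts(UserProfile("Partner Marketing Manager",
#                             ["Q3 Webinar Series"], ["Q2 Webinar Series"]))
```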

It also gives another view that enables them to explore insights the AI has already found. Then when they click on one of those insights, it recommends prompts that they could use to explore further. So rather than just giving them a blank slate, they needed some guidance on what to ask and what to do next. 

Q: It’s kind of like a two-dimensional person being presented with a three-dimensional world. You have to think about the data now in other directions. How should companies be thinking about training people around this? I run into this assumption in some places, where they think, “Oh, it’s natural language, you don’t need training.” Should there be more training of people around the prompts and thinking about the data?

A: Yes. Yes, absolutely there should be. That’s the only way we saw it working. Otherwise, the adoption rate was just low. When we started to train people on basic prompt engineering, we then started to see an uptick in the adoption rate.

For the adoption of these tools, it’s essential that marketers are trained on how to prompt them. You must also ensure that the tools themselves guide the marketers, that they already have existing prompts or templates they can click on.

We rolled out some prompt engineering courses to the marketers so that they know how to effectively prompt these AI chatbots to get the answers they need.

We have an AI trainer who comes in and trains our clients on how to prompt within their specific area, whether that’s marketing broadly or, within marketing, the analytical side or the operational side. We found having a trainer come in and tailor the sessions more effective than a general prompt engineering course.

We also create prompt templates so that people don’t have to rely only on training. They can actually pull from a prompt library. We’ve organized a prompt library for each type of team. If a team focuses on partner marketing, for example, we have a prompt library specific to that team, and we are continuously building on it and updating it to make sure it has the most relevant prompts.

There are collaboration elements as well, where people contribute to the prompt library: they create their own prompt templates and upload them to say, “Hey, I’ve got this new prompt for that.” We’re seeing that develop organically as people use and adopt these tools more.
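
One way to picture a team-scoped library that both curators and end users contribute to is sketched below; the class, field names and example teams are assumptions for illustration, not the actual product.

```python
# Sketch of a team-scoped prompt library that accepts user contributions.
# Names and structure are illustrative assumptions.
from collections import defaultdict

class PromptLibrary:
    def __init__(self):
        self._by_team = defaultdict(list)

    def add(self, team: str, title: str, template: str, author: str = "curator"):
        # Curators and end users contribute through the same path.
        self._by_team[team].append(
            {"title": title, "template": template, "author": author}
        )

    def for_team(self, team: str) -> list[dict]:
        # Templates surfaced to one team, e.g. partner marketing.
        return list(self._by_team[team])

library = PromptLibrary()
library.add("partner marketing", "Partner-sourced leads",
            "Which partners drove the most leads in {quarter}?")
library.add("partner marketing", "Co-marketing ROI",
            "Compare spend vs. pipeline for co-marketing campaigns in {quarter}.",
            author="user:jane")  # a user-contributed template
print([p["title"] for p in library.for_team("partner marketing")])
```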

Q: And that gets them to learn and explore?

A: Sometimes. If, after every prompt, it gives recommendations on what the next prompt could be, you actually find that the marketer doesn’t type their own. When we look at the usage stats and how people are using the tools, we find that 60% of the time they don’t actually type anything into the prompt box. They just move from one recommended prompt to the next to the next to get those insights. That’s pretty important as well. It’s like people don’t want to do any work; they just want to be guided by the right prompts.

Q: Can you run into the problem of people saying, “Oh, that’s what the AI said to do,” and going with that? 

A: Yes, if there are no controls in place. If you don’t have a policy of validating these AI results, then you run the risk of your team lazily relying on the AI’s recommendations and just going with them. You need to have an AI validation team in place, so that if you’re following an AI recommendation, it goes through some sort of validation. You need somebody to manually review it and ask whether this really is the only option, or whether there are other options the AI did not see.
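
A minimal sketch of that kind of control, assuming a simple approve-before-act workflow: an AI recommendation cannot be executed until a reviewer confirms they considered options the AI may have missed. The fields and rules are illustrative, not a prescribed process.

```python
# Sketch of a validation gate for AI recommendations. All names and rules
# here are assumptions for illustration.
from dataclasses import dataclass, field

@dataclass
class Recommendation:
    text: str
    source: str = "ai"
    approved: bool = False
    notes: list[str] = field(default_factory=list)

def validate(rec: Recommendation, reviewer: str, alternatives_checked: bool) -> Recommendation:
    # The reviewer must confirm they looked for options the AI did not see.
    if not alternatives_checked:
        rec.notes.append(f"{reviewer}: sent back, alternatives not reviewed")
        return rec
    rec.approved = True
    rec.notes.append(f"{reviewer}: approved after checking alternatives")
    return rec

def act_on(rec: Recommendation) -> None:
    # Block execution of unvalidated AI output.
    if rec.source == "ai" and not rec.approved:
        raise PermissionError("AI recommendation requires human validation first")
    print(f"Executing: {rec.text}")
```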
