We talk with our clients every day about the ramifications of generative AI. Many conversations focus on leveraging generative AI for productivity gains and—for those more confident in their AI abilities—using it for customer-facing interactions to drive sales and service.
However, there’s also an opportunity to examine its impact on people: the new skills they’ll need and the ways it will shape our interactions with each other. As our research with Oxford Economics shows, nearly all jobs will be affected by generative AI in 10 years’ time, and as many as 46% of businesses will adopt the technology in the next decade. The ramifications for the workforce—and for needed workforce skills—will be enormous.
Consider that in the most mainstream use case, generative AI acts as an assistant—curating and analyzing a vast amount of knowledge to share insights and generate recommendations. An example is a sidekick to a call center agent. Typically, agents have to put clients on a lengthy hold while they search for information to answer a question; with generative AI serving up quick summaries and recommendations, that wait time is reduced.
But, as we all have learned, generative AI is fallible. It can hallucinate feelings, make up information, respond with bias and be tricked into sharing information it shouldn't. It requires a second set of eyes to elevate its thinking or guide it back on track, much like a new graduate making their first foray into the professional world. Or as the Board of Innovation puts it, generative AI will turn many of us from creators into editors.
In this way, these AI tools will act as our automated digital interns, requiring humans—even those entry-level new grads—to become effective “editors” or managers.
To manage the shift, enterprises will need to prioritize change management and employee education far more than they have in the past. Here are a few of the human skills the workforce will need to hone as AI solutions become more pervasive across the digital ecosystem.
Critical thinking: a key workforce skill for prompt engineering
The rise of generative AI has caused a land rush of prompt-engineering tips and advice, typically comprising formulas or patterns of questions aimed at getting the best results from large language models. Engineering a prompt essentially means workers are learning to manage their AI interns (such as Microsoft 365 Copilot, Miro’s AI Assist and developer productivity tools). Doing so requires the same type of critical thinking and problem-solving skills managers use to effectively elevate work products.
To engage with AI, users need to break down a question, or prompt, into its component parts: the sources from which information might come, the lens or persona through which that information should be interpreted, and the parameters that should shape the output.
Once a response is delivered, the user will need to craft additional prompts to refine or improve the quality of the response. This kind of critical thinking is now table stakes, especially for new graduates, who lack the experience to instinctively recognize when a response is good enough (i.e., "what good looks like").
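To make that decomposition concrete, here is a minimal sketch in Python of the pattern described above: a first prompt assembled from sources, a persona and output parameters, followed by a refinement prompt that carries the human editor's feedback. The function names and the call-center scenario are illustrative assumptions, not tied to any particular product or API.

```python
# Illustrative sketch of the decompose-then-refine prompting pattern.
# Function names and the call-center scenario are hypothetical.

def build_prompt(persona: str, sources: list[str], parameters: str, question: str) -> str:
    """Assemble a prompt from its component parts: persona, sources and output parameters."""
    source_list = "\n".join(f"- {s}" for s in sources)
    return (
        f"Act as {persona}.\n"
        f"Base your answer only on these sources:\n{source_list}\n"
        f"Output requirements: {parameters}\n"
        f"Question: {question}"
    )

def refine_prompt(previous_answer: str, critique: str) -> str:
    """Follow-up prompt asking the model to improve its earlier response."""
    return (
        f"Here is your previous answer:\n{previous_answer}\n"
        f"Revise it to address this feedback: {critique}"
    )

if __name__ == "__main__":
    first = build_prompt(
        persona="an experienced contact-center agent",
        sources=["the product FAQ", "the current billing policy document"],
        parameters="three bullet points, plain language, no speculation",
        question="Why was the customer charged a late fee this month?",
    )
    print(first)
    # After reviewing the model's output, the human editor sends a refinement:
    print(refine_prompt("<model response here>", "Cite which source supports each bullet."))
```

In practice, this loop plays out conversationally inside tools like Microsoft 365 Copilot; the point is that the human remains the editor who judges whether the output clears the bar.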
As we think about future skills and how to cultivate them, the following questions come to mind:
- How will curricula evolve to build the AI management skills graduates can bring to the workplace?
- How can immersive experiences augment today's standard testing to reinforce future skills like critical thinking and applied problem-solving?
- How can apprenticeship programs be structured to hone critical thinking—typically derived from experience—faster?