In the wake of the generative AI (GenAI) revolution, UK businesses find themselves at a crossroads between unprecedented opportunities and inherent challenges.
Paul O’Sullivan, Senior Vice President of Solution Engineering (UKI) at Salesforce, sheds light on the complexities of this transformative landscape, urging businesses to tread cautiously while embracing the potential of artificial intelligence.
Generative AI has stormed the scene with remarkable speed. ChatGPT, for example, amassed 100 million users in a mere two months.
“If you put that into context, it took 10 years to reach 100 million users on Netflix,” says O’Sullivan.
This rapid adoption signals a seismic shift, promising substantial economic growth. O’Sullivan estimates that generative AI has the potential to contribute a staggering £3.5 trillion ($4.4 trillion) to the global economy.
“Again, if you put that into context, that’s about as much tax as the entire US takes in,” adds O’Sullivan.
One of its key advantages lies in driving automation, with the prospect of automating up to 40 percent of the average workday—leading to significant productivity gains for businesses.
The AI trust gap
However, amid the excitement, there looms a significant challenge: the AI trust gap.
O’Sullivan acknowledges that, despite AI being a top priority for C-suite executives, over half of customers remain sceptical about the safety and security of AI applications.
Addressing this gap will require a multi-faceted approach, including grappling with issues of data quality and ensuring that AI systems are built on reliable, unbiased, and representative datasets.
“Companies have struggled with data quality and data hygiene. So that’s a key area of focus,” explains O’Sullivan.
Safeguarding data privacy is also paramount, with stringent measures needed to prevent the misuse of sensitive customer information.
“Both customers and businesses are worried about data privacy—we can’t let large language models store and learn from sensitive customer data,” says O’Sullivan. “Over half of customers and their customers don’t believe AI is safe and secure today.”
AI also raises ethical considerations. Concerns about hallucinations, where AI systems generate inaccurate or misleading information, must be addressed meticulously.
Businesses must confront biases and toxicities embedded in AI algorithms, ensuring fairness and inclusivity. Striking a balance between innovation and ethical responsibility is pivotal to gaining customer trust.
“A trustworthy AI should consistently meet expectations, adhere to commitments, and create a sense of dependability within the organisation,” explains O’Sullivan. “It’s crucial to address the limitations and the potential risks. We’ve got to be open here and lead with integrity.”
As businesses embrace AI, upskilling the workforce will also be imperative.
O’Sullivan advocates for a proactive approach, encouraging employees to master the art of prompt writing. Crafting effective prompts is vital, enabling faster and more accurate interactions with AI systems and enhancing productivity across various tasks.
Moreover, understanding AI terminology is essential to foster open conversations and enable informed decision-making within organisations.
A collaborative future
Crucially, O’Sullivan emphasises a collaborative future where AI serves as a co-pilot rather than a replacement for human expertise.
“AI, for now, lacks cognitive capability like empathy, reasoning, emotional intelligence, and ethics—and these are absolutely critical business skills that humans need to bring to the table,” says O’Sullivan.
This collaboration fosters a sense of trust, as humans act as a check and balance to ensure the responsible use of AI technology.
By addressing the AI trust gap, upskilling the workforce, and fostering a harmonious collaboration between humans and AI, businesses can harness the full potential of generative AI while building trust and confidence among customers.
You can watch our full interview with Paul O’Sullivan below.

O’Sullivan will also feature on a panel on day one of the event.