Randy Bean, Heidi Lanford and Ash Gupta
The emergence of generative AI poses a pressing challenge for corporate boards of directors. Will generative AI disrupt businesses and entire sectors? Some estimates indicate that generative AI will automate more than 40% of business tasks and create business value worth more than $400 billion. ChatGPT, the public generative AI application, attracted over 100 million users within weeks of its release. The potential impact extends to job displacement: what if a vast majority of white-collar tasks could be accomplished more efficiently using AI?
I recently attended the Wall Street Journal Technology Live event and wrote about artificial general intelligence (AGI) and the coming wave. At the WSJ event, venture capitalist Vinod Khosla predicted that “AI will be able to replace 80% of all jobs within 10 to 20 years.” Author and AI pioneer Mustafa Suleyman noted, “In the coming years, AI will become as ubiquitous as the Internet,” and asked, “Will AI unlock the secrets of the universe, or will it create systems beyond our control?”
What is the responsibility of boards of directors when it comes to generative AI? Are board members sufficiently equipped to weigh the opportunities as well as the risks, and to guide companies in their responsibilities to shareholders and stakeholders? While generative AI has the potential to revolutionize the way we do business, that potential can be realized for good or for ill, on a massive scale. These are the risk and reward factors that board members need to consider.
Generative AI and the responsibilities of corporate boards was the topic of discussion at a November 1 meeting of the New York chapter of the National Association of Corporate Directors (NACD). The discussion was moderated by Ash Gupta, longtime former Global President of Risk Management and Information for American Express. I was a guest panelist alongside Heidi Lanford, former global head of data and analytics for the Fitch Group, which includes Fitch Ratings and Fitch Ventures and is wholly owned by Hearst Corporation. The NACD discussion focused on the steps and actions boards should take to adopt generative AI safely. These include:
1. Strategic implications of introducing AI into business
2. The role of boards as facilitators
3. Legal, fairness and transparency considerations
4. Monitoring, learning and accelerating progress
The potential risk to any business will depend on the nature of the business problem that generative AI is used to solve. Examples include creating operational efficiencies, improving cross-selling to customers, strengthening risk management, or driving product and service innovation. The recommended course of action will depend on factors such as industry regulations, the skills of the organization, and whether safeguards are in place to mitigate potential risks. Heidi Lanford notes: “Monitoring and governance are necessary. However, for use cases on the ‘offensive’ side of AI, I prefer to put guardrails in place rather than authoritarian governance.”
How prepared are corporate board members for generative AI? Author Tom Davenport, in a recent Forbes article, Are boards of directors laughing at generative AI?, raises a warning flag. Davenport notes that 67% of board members surveyed in a recent industry survey described their knowledge of generative AI as “expert” (28%) or “advanced” (39%). Davenport expresses skepticism, noting, “This level of expertise among board members seems rather unlikely. I doubt that 28% of computer scientists, even formally trained ones, fully understand the underlying transformer models that make generative AI work. I’ve been studying them for a few years now and I wouldn’t put myself in the expert category.” Board directors may want to take this into account.
The limits of board understanding are not unique to generative AI. A recent Wall Street Journal article headlined “Boards of directors still lack cybersecurity expertise” noted that “only 12% of S&P 500 companies have directors with relevant cyber qualifications,” referencing a November 2022 WSJ Research study showing that only 86 of 4,621 directors of S&P 500 companies had relevant cybersecurity experience. One might expect that, given the novelty of generative AI, the level of relevant experience would be even lower.
One solution could be to recruit new board members with skills in this area. Inderpal Bhandari, previously Global Chief Data Officer at IBM, recently joined the Walgreens Boots Alliance board of directors. Bhandari notes, “Cybersecurity threats and the reinvention of business models and products through technology are the need of the hour. Today’s board of directors must possess not only technical knowledge, but perhaps even technical instinct, to ensure effective governance.” He adds: “While well-established cybersecurity training opportunities are readily available to board directors, this is not the case for mission-critical technologies such as data and AI. There is an urgent need to strengthen boardroom literacy in this direction.”
Ash Gupta suggests: “It will be the responsibility of board members to understand the extent to which each company is prepared to leverage generative AI in ways that create competitive excellence, as well as mitigate commercial and stakeholder risks.” He notes that while more than 95% of board members believe in the need for AI, only 28% of companies have made realistic progress. He continues: “Boards need personal engagement to develop a deep understanding of how GenAI works, how it can revolutionize business and, perhaps most importantly, what it can’t do.” Gupta adds: “This commitment must combine one-time formal education with ongoing learning.”
To this end, Gupta outlines a series of steps that companies are taking to prepare boards for a generative AI future. These steps include:
1. Create essential training for the company’s board of directors and management so that they have an informed understanding of what is possible and the limitations of generative AI.
2. Create a culture of testing and learning that recognizes that many ideas that initially seem promising may not be of much use to the organization.
3. Consider how best to extend the knowledge of company teams through external collaborations. These may include data sources relevant to your industry, control mechanisms, talent and tools.
4. Prioritize discussing progress and providing updates at least as frequently as every other board meeting.
5. Follow how industry-leading companies are using generative AI.
Gupta and Lanford agree that companies and their board members need to remain vigilant. Lanford warns: “AI must be a team sport. Boards of directors should seek broad participation. Expect AI ideas to be solicited from all staff, not just technical experts.” And while there is broad consensus on the need to regulate generative AI, the panelists noted that the technology evolves week to week while legislation often takes years. Gupta adds, “As a CEO and board member, delegate to your CDO, CAO, or CIO, but don’t abdicate your authority.”
Lanford concludes: “Boards can ensure a greater chance of success with generative AI if there is a culture of experimentation and failure that balances how use cases are moved into production.” Gupta echoes this sentiment, commenting: “Catastrophic incidents can occur if your people and processes are not properly trained. Effective implementation will require both technical and managerial understanding.” He adds: “Most likely, the first ideas will not produce the expected results. A deep commitment to creating a culture of testing and learning will!”