Support for AI progress goes far beyond risk management
In AI risk management, the biggest risk to many organizations might be failing to take advantage of AI’s transformative capabilities.
This risk should be top of mind for boards as they perform their governance and oversight duties related to AI. An organization's savviest competitors are almost certainly adopting AI, driving efficiency and generating insights that can make their businesses more profitable.
Many boards initially saw their AI-related responsibility as urging management to develop guardrails to protect organizations from the risks related to bias, data security and transparency that are associated with the technology. But it’s long past time to go beyond a protective stance related to AI and take a proactive approach toward the technology.
“The board can and should encourage management to proceed with AI initiatives and be comfortable with being uncomfortable once they have strong risk management and controls in place related to AI,” said Grant Thornton Growth Advisory Services Managing Director Joe Ranzau.
AI, in particular, has the potential to significantly improve the effectiveness and efficiency of human capital at organizations across the industry spectrum. Employees can use AI to multiply their impact many times over while also eliminating repetitive, low-value tasks that don’t require human judgment. However, optimal use of AI requires a commitment to user training, and boards have a responsibility to monitor AI’s effects on employee engagement.
Ranzau, who moderated a recent NACD webcast, “Empowering private company boards: AI and workforce,” provides a deeper dive on AI’s impact on human capital in this Q&A, with an emphasis on the board’s role as an AI champion.
How can boards make sure management provides employees with enough room to experiment with and pilot AI while maintaining appropriate guardrails?
In some cases, management isn’t especially supportive of AI use, and boards need to push leadership forward.
Although 60% of finance leaders in Grant Thornton’s CFO survey for the fourth quarter of 2024 said their organizations are using generative AI, that means 40% weren’t using it yet, even though most of them are exploring potential use cases. Especially when management is reluctant to implement AI, boards need to set expectations for moving forward with AI, establishing a supportive environment with proper resources for AI initiatives that align with the organization’s strategy and ethical standards.
Management must be ready for AI use to advance throughout the business environment — and it’s advancing quickly. It’s the board’s role to make sure that management provides employees with AI tools and training needed to succeed with the right controls in place to remain within the guardrails.
These elements sometimes aren’t in place. In our most recent CFO survey, two-thirds of respondents who are using generative AI said they have clearly defined acceptable use policies for the technology. That leaves fully one-third of these organizations whose people can’t be certain when they can use generative AI — and when it’s off-limits.
Boards that allow management to be complacent on this issue should be forewarned: If the companies don’t provide guidance, there’s a good chance their employees are using AI anyway — and they might even be paying for their own AI tools. Boards must make sure that management is providing employees with clear direction on when it’s OK to use AI and when it’s not. Proper oversight by the board is required to make sure organizations provide employees with an organizational framework for AI — and abundant opportunities within that framework for employees to experiment with AI.
How can boards verify that management takes the right steps to upskill the workforce related to AI?
There is certainly work to be done here. Our latest CFO survey shows that 57% of finance leaders do not plan to increase their spending on training, learning and development in the coming year, while 65% of finance leaders expect to increase their spending on IT and digital transformation.
“There’s not one answer to this because organizations are in different phases of maturity related to AI,” Ranzau said. “The 20% that are early adopters have been using generative AI for a while, and the 40% that are the fast followers also are using it. But that last 40% is still in the adoption phase, where they’re just starting to figure out how to use the technology.”
Although it’s difficult to generalize because of the wide range of maturity levels, it’s important for companies throughout the AI maturity curve to make a concerted effort to upskill their people as they implement generative AI. Training should be thorough and include everyone who will need to use the technology. And it should include detailed instructions for employees to incorporate AI into the use cases that already have been identified as well as a framework for experimenting with AI.
Boards need to make sure that management is allocating resources to upskilling and is defining the skills that will be needed for success with AI. Management should be reporting back to the board on the effectiveness of upskilling efforts, as well as on the workforce’s preparedness to make the most of this important new technology.
How should boards approach oversight of culture related to AI governance? Employees in some situations are concerned that AI is going to take their jobs, and it seems that it would be important for boards to make sure management is transparent with employees with frequent communication on why AI is being adopted and how it will affect their roles.
AI is having a real impact on roles. In the most extreme examples, the need for accounts payable employees and frontline call center representatives has been substantially reduced as AI emerges. It’s important for boards to work with management to set the right tone at the top and be transparent about the way jobs are being affected by AI, even though widespread job displacement is not yet happening.
There are some standard tools that boards can encourage management to use to monitor the culture of an organization. Boards can make sure that management is constantly measuring the pulse of an organization through employee surveys, town halls and focus groups. Boards also can conduct skip-level meetings, where they drop down beneath the levels of the organization they might ordinarily meet with, to see how AI is affecting an organization’s culture.
However, employee concerns about AI aren’t occurring in a vacuum, and they are part of a broader awareness that management and boards need to have about culture in the current work environment. Grant Thornton’s State of Work in America survey for 2024 showed that 51% of the U.S. workers who responded said they had suffered burnout in the previous 12 months — a 15-percentage-point increase over those who reported burnout in 2023.
Burnout is high and engagement has fallen below historical lows, partly because of the erosion of boundaries between work and home over the past several years — a trend that predates the pandemic. Even people who work all day in an office have laptops and smartphones that demand their attention outside of working hours, and the separation between work and home continues to erode.
In addition, factors outside of work are causing burnout. Employees are experiencing anxiety over economic disruption, political disruption, global conflicts, and climate change. The stress on individuals is incredibly high, and now AI’s emergence is requiring employees to learn how to do their jobs differently, too. Boards might suggest to management that AI:
- Has the potential to alleviate some employee burnout by creating efficiency in day-to-day tasks that can make work less overwhelming.
- Can increase employee engagement as they find creative ways to interact with the technology and experience its benefits.
Oversight of these cultural issues is part of the board’s risk management duties, as a healthy workforce is critical to an organization’s success regardless of the extent of AI adoption. It’s important for boards to closely monitor employee engagement statistics and urge management to take the necessary steps to keep morale high.
How can boards make sure that management responds appropriately to employee resistance to AI?
One thing boards can do is make sure management and labor collaborate on a continuing basis to evolve the organization in a healthy manner. For example, the recent labor negotiations with longshoremen typify an issue that some organizations will face in the coming years.
AI has the potential to take longshoremen’s jobs. But if the organizations don’t embrace AI, they might not be able to stay competitive — and then all the jobs will be gone, leaving no viable roles for labor to fill. It’s going to take a collaborative effort to solve these issues.
As AI increases productivity, another area where boards will need to get involved is related to compensation.
“From a compliance perspective, certain federal, state and local laws require minimum wages and compensation by the hour,” Ranzau said. “But as productivity increases thirtyfold or three hundredfold, hourly billing might no longer be practical.”
In these situations, boards, management and labor might all need to work together to find a different model for compensation — one that is no longer tied to the number of hours that an employee works. Including employees in the process helps organizations gain buy-in on those collaborative solutions.
How can boards appropriately monitor the risks related to integration of AI into workforce management processes? For example, in recruiting, AI models might show bias against hiring members of certain groups. How can boards appropriately monitor risk management to prevent organizations from getting hit with fines in some of these hazardous areas?
This is an opportunity for audit committees to have conversations with management on how they’re redesigning their strategies and adopting tools that monitor for these types of risks on an ongoing basis.
There’s a lot that’s about to change in the internal audit and compliance space. Some percentage of internal audit’s work will continue to be compliance, but another part is going to be devoted to helping improve the performance of the organization. These duties will be related to understanding the technologies that exist and making sure the organization is aware of the potential risks or liability.
The board’s role in this is understanding that it’s not enough to make sure the CIO or the chief technology officer is upskilled with AI’s capabilities in mind. The audit committee needs to make sure the chief audit executive is upskilled and has people on the internal audit team who understand what they need to be looking at related to AI and how they should be thinking about it.
They need to make sure internal audit is prepared to support the organization’s use of AI in an effective, ethical and legal manner that brings value to the organization. For a lot of management and internal audit teams, that’s a substantial shift from where they have focused in the past.
How can boards make sure return on investment is being measured appropriately for human capital initiatives?
Some board members have said that measurement of AI isn’t necessarily different from a board perspective from how they would measure the return or success of any other transformation or technology initiative.
Typically, that means boards should require management to report on key performance indicators that might include cost savings, productivity gains and even revenue growth. And these metrics should be reviewed regularly and presented to the board. But that’s not always easy with AI.
From management’s perspective, the challenge is that while AI remains experimental, it’s difficult to pinpoint where its impact can be measured. For example, when employees use a large language model to assist with brainstorming, it’s probably quite difficult to measure the return on that brainstorming facilitation.
On the other hand, if you’re using AI, for example, to monitor for fraud in the HR space, whether it’s labor-related, overtime abuse, or shadow employees, you can measure those results specifically compared with how your old compliance programs delivered.
Some of these outcomes are easier to measure than others, and management is still in the experimental phase for a lot of these use cases. Companies are trying to identify the potential benefits and the value AI might bring. As the technology matures and organizations get closer to full implementation, the precision around ROI measurement will improve.
Board oversight needs to be flexible here. Where results are easy to measure, boards should demand to see them. They also should push management to be diligent in its pursuit of metrics that will show the value that AI is bringing. But in situations where accurate ROI data is unavailable, boards need to be patient as the measurement capabilities improve.
Contact:
Joe Ranzau
Managing Director, Business Consulting
Grant Thornton Advisors LLC
Joe is a back-office transformation leader focused on performance and profitability improvement through strategy design, operating model redesign, cross-functional process improvement, post-merger integration, and organizational change management.
Austin, Texas
Industries
- Technology, Media and Telecommunications
- Healthcare
- Manufacturing, Transportation and Distribution
- Not-for-profit and Higher Education
Service Experience
- Advisory Services
- Business Consulting
Content disclaimer
This content provides information and comments on current issues and developments from Grant Thornton Advisors LLC and Grant Thornton LLP. It is not a comprehensive analysis of the subject matter covered. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC and Grant Thornton LLP. All relevant facts and circumstances, including the pertinent authoritative literature, need to be considered to arrive at conclusions that comply with matters addressed in this content.
For additional information on topics covered in this content, contact a Grant Thornton professional.
Grant Thornton LLP and Grant Thornton Advisors LLC (and their respective subsidiary entities) practice as an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable law, regulations and professional standards. Grant Thornton LLP is a licensed independent CPA firm that provides attest services to its clients, and Grant Thornton Advisors LLC and its subsidiary entities provide tax and business consulting services to their clients. Grant Thornton Advisors LLC and its subsidiary entities are not licensed CPA firms.
Tax professional standards statement
This content supports Grant Thornton Advisors LLC’s marketing of professional services and is not written tax advice directed at the particular facts and circumstances of any person. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC. If you are interested in the topics presented herein, we encourage you to contact a Grant Thornton Advisors LLC tax professional. Nothing herein shall be construed as imposing a limitation on any person from disclosing the tax treatment or tax structure of any matter addressed herein.
The information contained herein is general in nature and is based on authorities that are subject to change. It is not, and should not be construed as, accounting, legal, tax, or professional advice provided by Grant Thornton Advisors LLC. This material may not be applicable to, or suitable for, the reader’s specific circumstances or needs and may require consideration of tax and nontax factors not described herein. Contact a Grant Thornton Advisors LLC tax professional prior to taking any action based upon this information. Changes in tax laws or other factors could affect, on a prospective or retroactive basis, the information contained herein; Grant Thornton Advisors LLC assumes no obligation to inform the reader of any such changes. All references to “Section,” “Sec.,” or “§” refer to the Internal Revenue Code of 1986, as amended.