Updated joint statement by the AICD and the Governance Institute on minute-taking and the use of AI
The Australian Institute of Company Directors (AICD) and the Governance Institute of Australia (GIA) have released an update to the 2019 Joint Statement on Board Minutes (Updated Joint Statement). The Updated Joint Statement was published in response to heightened interest in the use of AI tools for board minute-taking and is informed by refreshed advice from legal counsel Dominique Hogan-Doran SC and Douglas Gration, who provided a legal opinion for the purposes of the original 2019 statement (our 2019 statement overview).
The Updated Joint Statement reaffirms the core principles and legal requirements of minute-taking under the Corporations Act 2001 (Cth) (Corporations Act), outlines considerations for the use of AI to prepare draft minutes (including associated risks) and suggests measures to consider to protect the integrity of minute-taking when using AI.
In short, there is no prohibition on a company using generative AI tools in the preparation of minutes. However, before using any AI tools to assist with minute-taking, companies and Boards should assess the associated risks and available safeguards, and consider whether the use of AI tools for minute-taking is appropriate in the circumstances of the company. The key principles for effective minute-taking, and compliance with minute-taking and record-keeping obligations, should not be compromised by the use of AI tools.
In other words, the key question for boards and companies is not whether AI 'can' be used to assist with minute preparation, but whether and in what circumstances it 'should' be used for that purpose.
Key issues and questions for Boards and companies to consider in relation to AI-assisted minute preparation are outlined below.
- Before adopting AI to assist with Board minute-taking, directors and companies should pause to consider a fundamental question: is this a valuable and appropriate use case for AI?
- As noted below, there are material risks associated with using AI to assist with minute preparation and appropriate risk management controls may be complex and time-intensive to implement. These risks, complexities and costs need to be weighed against the potential administrative benefits.
- Given this, directors and companies should consider whether AI-assisted minute-taking is worth the effort and risk – or whether there are other, lower-risk opportunities for AI to support governance and corporate administration that may offer a better starting point.
- As with the use of any AI tools, there needs to be an appropriate level of human oversight and evaluative judgment applied so as to preserve the integrity and evidentiary value of Board minutes.
- Beyond the question of the potential use case of AI in Board minutes, the Updated Joint Statement signals a clear direction of travel: AI governance is an increasing focus for industry bodies and regulators alike. The Updated Joint Statement reflects the need for standalone AI governance frameworks that align with existing obligations and protocols. Given this increasing focus, now is the time for organisations to get ahead and implement practices that enable AI value while minimising risk.
The value proposition: How can AI assist with minute preparation?
With the advent of virtual meetings and AI, questions are being asked about how AI, and in particular large language models, might be used to assist with minute preparation, for example:
- Should AI be used to record Board meetings and then produce a transcript of Board meeting discussions?
- Should AI be used to generate draft minutes using meeting notes, a recorded transcript and/or Board papers as an input?
Risks associated with using AI tools for minute preparation
The particular risks will depend on how AI is used in Board minute preparation. The key use cases are to record transcripts and to generate minutes.
Key risks of using AI to record transcripts of Board meeting discussions include:
- Any recording or transcript retained will be discoverable and may be admissible in court as evidence.
- Legal advice or privileged information discussed during the meeting may be compromised, and there is a risk privilege could be waived by disclosure.
- Material inaccuracies in the recorded transcript (eg where audio or microphone quality is poor, Board discussions are dynamic and speakers cannot be identified, or key words such as acronyms and company- or industry-specific terminology are not interpreted correctly).
- Inability to recognise tone, intent and non-verbal cues of speakers, which often relies on professional human judgment.
- Potential impact on free-flowing and candid discussions around the Board table if Board members are mindful a transcript is being made.
- Vulnerabilities within third-party AI providers’ security systems that can increase susceptibility to cyber attacks and data breaches.
- Technical issues (eg internet connectivity) that may result in failure to record all relevant information during the meeting.
- Privacy risks, including if the organisation does not have appropriate consent to input personal information discussed during a Board meeting into the AI tool, as well as data retention risks.
- Intellectual property risks, including if copyrighted material is discussed during a Board meeting and the AI tool terms of use do not align with the organisation's intellectual property rights and licences in relation to that material.
Key risks of using AI to generate minutes of Board meetings (including by using recorded transcripts, Board papers, and uploaded participant notes as inputs) include the following:
- Material inaccuracies in the AI-generated output, including the scope for ‘hallucinations’ and fabricated information, namely content that appears plausible and coherent but is inaccurate, one-sided or false. A high level of attention to detail is required to identify errors in plausible-looking AI-generated draft minutes. While there is already a need to carefully review draft Board minutes to ensure they are an accurate record of the proceedings, the risk of errors or inaccuracies may be heightened where AI tools are used to generate draft Board minutes. These potential inaccuracies create a risk that the company may breach its obligation to keep minutes of the proceedings and resolutions of meetings, which may in turn represent a failure by the directors to act with appropriate care and diligence.
- Bias in representing the Board's collective discussion and/or individual director contributions, resulting in crucial details or nuances in the rationale for decisions not being captured. For example, AI may defer to the content of Board papers (including management's recommendations) as the central input as opposed to directors' perspectives and may not accurately reflect questions or constructive challenge of management assumptions that may take place during a meeting.
- Inability to capture the key points of discussion and the rationale for decisions.
- Inability to appropriately reflect specific organisational context or sensitive or confidential matters, which often relies on professional human judgment.
- If technology is used to record meetings or generate meeting minutes, and the records are stored in the cloud and able to be downloaded, the Updated Joint Statement cautions that making such a recording does not, of itself, fulfil the company's obligation to keep a record of a meeting in a minute book: the requirement for the chair to sign the minutes indicates that a written record is necessary. Board records should therefore be downloaded to the minute book and signed.
Key considerations in assessing whether to utilise AI for minute preparation
Boards need to have confidence and trust in the minute-taking process. This requires implementation of appropriate safeguards (including to ensure that minutes are recorded accurately and without bias). AI may not be suitable for all companies and Boards.
In assessing whether to use AI tools to assist with minute preparation, companies should perform a risk assessment and follow the AI approval and oversight processes set out in their company-wide Governance Framework. In particular, consideration should be given to:
- The appropriateness of using the proposed AI tools in the context of the particular company, including the sensitivity and confidentiality of matters before the Board and any applicable regulatory obligations.
- The storage, security and discoverability of a recording or transcript of a meeting.
- How personal information will be input into, stored and managed by the AI tool, including whether a Privacy Impact Assessment is required, and whether changes are needed to the company's Privacy Policy and relevant Collection Notices.
- The disclosure of privileged information to third-party AI providers which might jeopardise claims of legal professional privilege for legal advice included in, or attached to, Board papers.
- The consent and comfort level of the Board.
- Policies around when AI will not be used, and how the company will manage challenges to AI outputs.
Mitigating the risks of using AI
Measures that may assist to mitigate the risks of using AI tools are described below. The availability, feasibility and effectiveness of these measures should be considered as part of any risk assessment regarding the use of AI tools to assist with minute-taking.
- Establish clear policies and processes for the use of AI to generate minutes, including:
- following the AI tool approval process and risk mitigation strategies set out in your company governance framework;
- selecting approved and trusted AI tools for use in relation to Board minutes, and setting a clear policy as to what information must not be uploaded to an AI tool for Board minutes;
- assigning accountabilities within the company (eg company secretariat and CEO) for detailed review, verification and correction of AI output (in addition to the Board's usual practices and processes for reviewing and finalising the minutes), and deciding whether the organisation will still require 'back-up' manual minutes to be taken for verification purposes;
- determining how legal advice or privileged information will be handled (the Updated Joint Statement recommends that, where minutes refer to privileged advice, those minutes should not be provided to third-party AI providers without legal advice);
- establishing contingency plans in the event of AI system failure; and
- specifying how recordings, transcripts and draft minutes generated by AI will be stored in line with applicable document retention policies.
- Implement robust data security measures, including encryption and regular audits.
- As with any Board minutes, ensure that AI-generated outputs are reviewed and refined by the directors (particularly the chair) and experienced professionals to identify errors and provide context.
- If Board artefacts (e.g. Board papers, transcripts of meetings and draft minutes) are retained by third-party AI providers, understand where those artefacts are stored (including to ensure this aligns with the company's cross-border data policy), who has access to those artefacts, what security / encryption measures are in place and whether there is any ability for third parties to use the relevant artefacts.
- Provide training on AI use and risks, including how information is collected and stored, how directors can support effective AI use for minute preparation through clear communication and chairing, and how governance professionals can instruct AI tools to focus on key Board decisions and actions within the meeting agenda.
- Regularly monitor, review, audit and test AI processes used to generate minutes, including for compliance with applicable laws and the changing regulatory environment.
If a Board opts to use AI as part of the preparation of draft minutes, AI should not be the only tool relied on. There should be appropriate controls in place, including human oversight and evaluative judgment, to preserve the integrity, quality and accuracy of the minutes.
Our Corporate Advisory specialists have deep experience advising clients on the conduct of Board meetings and minute-taking practices, to ensure compliance with legal requirements and support appropriate risk management.
In conjunction with our AI Advisory Team, who assist clients in designing and implementing responsible AI solutions, we are uniquely placed to help you develop an AI governance framework that supports appropriate risk management and compliance with your organisation's specific regulatory obligations.