An increasing number of people are using artificial intelligence (AI) tools. Popular tools include ChatGPT, Gemini and Copilot.
AI can help you with:
- summarising complex information
- writing first drafts
- explaining work topics in plain English
But these tools are not perfect, so you should not rely on them alone.
Good practice when using AI tools
There are some things you can do to get the most from using AI.
Review legal information given by AI
If you are using AI to explain or summarise information, you should check to make sure it's accurate.
You can do this by:
- checking it against advice from recognised organisations, for example, advice from Acas or GOV.UK
- contacting the Acas helpline
AI tools may not be trained on UK employment law. This means they can give inaccurate information, such as referencing laws or cases from outside the UK.
They can also make mistakes known as 'hallucinations'. For example, AI might reference laws or cases that do not exist.
Check writing that has been created or edited by AI
If you use AI to help you write or edit emails and letters, you should check that it:
- reflects your knowledge and experiences
- does not include irrelevant laws, for example laws from other countries
- does not include legal jargon or phrases you do not fully understand
When communicating with Acas, it is important that you explain your situation in your own words. This helps us to give you the most relevant advice.
Data privacy and AI
You should avoid entering sensitive or personal information into public AI tools. For example, you should not enter documents from a conciliation case into ChatGPT.
These tools might retain this information or use it to train and improve the tool.