The Independent UK
Maira Butt

Woman who used nine ‘fabricated’ AI cases in court loses appeal

A woman who cited nine “fabricated” cases generated by ChatGPT in an appeal against a capital gains tax penalty has had her case rejected by a court.

Felicity Harber was charged £3,265 after she failed to pay tax on a property she sold. She appeared in court to appeal against the decision and cited cases which the court found were “fabrications” generated by artificial intelligence tools such as ChatGPT.

She was asked whether AI had been used and confirmed it was “possible”.

When confronted with the reality, Mrs Harber told the court that she “couldn’t see it made any difference” if AI had been used as she was confident there would be cases where mental health or ignorance of the law were a reasonable excuse in her defence.

She proceeded to ask the tribunal how they could be confident that any of the cases used by HMRC were genuine.

The tribunal informed Mrs Harber that cases were publicly listed along with their judgments on case law websites, which she said she had not been aware of.

Judge Anne Redston said that the use of artificial intelligence in court was a “serious and important issue”
— (PA)

Mrs Harber said the cases had been provided to her by “a friend in a solicitor’s office”.

Her appeal was dismissed, although the judge noted that the same outcome would have been reached even without the fabricated cases.

Judge Anne Redston added: “But that does not mean that citing invented judgments is harmless... providing authorities which are not genuine and asking a court or tribunal to rely on them is a serious and important issue.”

The court accepted that Mrs Harber did not know the cases were not genuine and that she did not know how to check their validity using legal search tools.

It comes after a UK judge admitted to using “jolly useful” ChatGPT when writing judgments.

According to the Law Society Gazette, Lord Justice Birss said of AI: “It’s there and it’s jolly useful. I’m taking full personal responsibility for what I put in my judgment – I am not trying to give the responsibility to somebody else.

“All it did was a task which I was about to do and which I knew the answer and could recognise an answer as being acceptable.”

Earlier this year, two American lawyers were penalised for using fake court citations generated by artificial intelligence in an aviation injury claim.

ChatGPT and other artificial intelligence tools are known to have a “hallucination problem”, in which they invent false information and present it as fact in response to users’ questions.

The Solicitors Regulation Authority (SRA) has highlighted the opportunities AI offers firms, but has also warned of the risks, saying: “All computers can make mistakes. AI language models such as ChatGPT, however, can be more prone to this.

“That is because they work by anticipating the text that should follow the input they are given, but do not have a concept of ‘reality’. The result is known as ‘hallucination’, where a system produces highly plausible but incorrect results.”
