A Barrister's Blog

The Lighter Side of Law


by Paul Cutler

Chat GPT and the keynote address

Chat GPT was recently taken for a test run by Justice Leeming as he prepared a paper to present to the Aviation Law Association of Australia & NZ.

In collaboration with his son, his Honour initially thought he would use it to find a title for his paper. Some of the candidates were “Navigating the legal skies”, “Turbulent times in aviation law” and “Landing on the legal tarmac”. No problems so far.

But then the thought occurred to him to use it to write a paragraph about Sydney Seaplanes Pty Ltd v Page [2021] NSWCA 204 (the case arising from the Cottage Point to Rose Bay seaplane crash in 2017). Initially, Chat GPT referred to Sydney Seaplanes v Page 2021 HCA 56 (15 Dec 2021). The problem is that the citation is completely wrong: there were only 44 High Court judgments that year, and none was delivered on 15 December. Asked the same question a second time, it wrote a paragraph about trade marks and the use of confidential information on termination of employment, which was obviously not the subject of the case.

When challenged with “I don’t think that’s right”, Chat GPT “apologized for the confusion” and the previous “inaccurate response”. Ultimately his Honour asked “what case was [2021] NSWCA 204?” Although Chat GPT came up with the correct case name, it described the case as a debt recovery matter (which is also clearly not correct).

His Honour concluded:

  • Our jobs are safe from generative artificial intelligence at least for the next week or so;
  • Chat GPT doesn’t make mistakes or tell falsehoods – it just gets confused and apologises profusely; and
  • He was going to have to produce the substance of his keynote address the old-fashioned way.

Here’s a link to his Honour’s paper (which doesn’t use any of the titles suggested by Chat GPT).

You may also have heard media reports about a New York lawyer who used Chat GPT to draft his submissions to the court. When the court staff couldn’t find the cases he (or rather Chat GPT) had cited, he was asked to provide copies of them. Instead of going to the law reports, he again used Chat GPT, which promptly generated the (non-existent) judgments, and he provided these to the court. Needless to say, the Judge was unimpressed. That case was discussed recently on the RN Law Report podcast, but you’ll find it easily enough with a Google search.

Creative commons acknowledgement for the photograph.
