Investigation finds that the Department of Families, Fairness and Housing failed to take reasonable steps to ensure the accuracy of personal information and to protect personal information from unauthorised disclosure

The report: https://ovic.vic.gov.au/regulatory-action/investigation-into-the-use-of-chatgpt-by-a-child-protection-worker/

Investigation into the use of ChatGPT by a Child Protection worker

In December 2023, the Department of Families, Fairness and Housing (DFFH) reported a privacy incident to the Office of the Victorian Information Commissioner (OVIC), explaining that a Child Protection worker had used ChatGPT when drafting a Protection Application Report (PA Report). The report had been submitted to the Children’s Court for a case concerning a young child whose parents had been charged in relation to sexual offences.

Despite their popularity, generative artificial intelligence tools such as ChatGPT carry a range of privacy risks. Most relevant in the present circumstances are the risks of inaccurate personal information and unauthorised disclosure of personal information.

After conducting preliminary inquiries with DFFH, the Privacy and Data Protection Deputy Commissioner commenced an investigation under section 8C(2)(e) of the Privacy and Data Protection (PDP) Act with a view to deciding whether to issue a compliance notice to DFFH under section 78 of that Act.

OVIC’s investigation considered whether the Department took reasonable steps to ensure the accuracy of personal information and to protect the personal information it holds from misuse and unauthorised disclosure, as required by Information Privacy Principles 3.1 and 4.1 under the Privacy and Data Protection Act 2014 (Vic).

The full investigation report:

https://ovic.vic.gov.au/wp-content/uploads/2024/09/DFFH-ChatGPT-investigation-report-20240924.pdf

  • Eevoltic@lemmy.dbzer0.com (OP) · 1 month ago
    Some highlights from the report after briefly skimming through it.

    1. CPW1 indicated that they used ChatGPT regularly for around one month prior to the PA Report incident. CPW1 claimed that they used ChatGPT to save time and to present work more professionally.
    2. DFFH also interviewed staff in CPW1’s team. It was indicated that CPW1’s use of ChatGPT had been well known within the team for a period of around 3–4 months and possibly longer, and that they had demonstrated to other team members how it could be used. There were no admissions that other members of the team had used ChatGPT.
    3. Ultimately, DFFH found there were 100 cases with indicators that ChatGPT may have been used to draft child protection documents. The types of documents involved court reports, case notes, case plans and risk assessments… (an illustrative sketch of that kind of indicator scanning follows below)
    4. … DFFH identified that nearly 900 employees had accessed the ChatGPT website within this period. This represents almost 13 per cent of DFFH’s workforce of around 7,000 employees.

    Copyright State of Victoria (Office of the Victorian Information Commissioner) CC BY 4.0
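
    The report does not set out exactly how DFFH searched for those indicators. Purely as an illustration, here is a minimal Python sketch of flagging documents that contain phrases commonly seen in unedited chatbot output; the phrase list, folder name and function are my assumptions, not details from the investigation.

    ```python
    # Hypothetical sketch only: the phrases and file layout are illustrative,
    # not taken from the OVIC report or DFFH's actual review.
    from pathlib import Path

    # Example phrases that often appear in unedited LLM output.
    INDICATOR_PHRASES = [
        "as an ai language model",
        "regenerate response",
        "i hope this helps",
    ]

    def flag_documents(folder: str) -> list[tuple[str, str]]:
        """Return (filename, phrase) pairs for documents containing an indicator phrase."""
        hits = []
        for path in Path(folder).glob("*.txt"):
            text = path.read_text(encoding="utf-8", errors="ignore").lower()
            for phrase in INDICATOR_PHRASES:
                if phrase in text:
                    hits.append((path.name, phrase))
        return hits

    if __name__ == "__main__":
        for name, phrase in flag_documents("exported_case_documents"):
            print(f"{name}: matched '{phrase}'")
    ```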

    I find this information very disturbing.

  • Salvo@aussie.zone · 1 month ago
    The corporate revolt against LLMs has started. The lawyers are starting to realise that outsourcing to an unaccountable third party is too much of a liability. Business leaders are also starting to realise that it isn’t good for their business models.

    I was ecstatic to find that the Adobe AI feature was missing from Acrobat Reader when I started my work computer the other day. Our IT department is also progressively excising Microsoft’s Copilot and Bing Chat features from our work computers.
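
    Not something from the report, but as a rough illustration of the sort of blocking an IT department can apply: the Python sketch below just prints hosts-file entries that sinkhole an assumed list of generative-AI domains. The domain list is illustrative only, and a real rollout would more likely use endpoint management, DNS filtering or proxy policy than hand-edited hosts files.

    ```python
    # Illustrative sketch only: the domain list is an assumption, not a
    # complete or endorsed inventory of generative-AI services.
    GENAI_DOMAINS = [
        "chat.openai.com",
        "chatgpt.com",
        "copilot.microsoft.com",
        "gemini.google.com",
    ]

    def hosts_block_entries(domains: list[str]) -> str:
        """Build hosts-file lines that point each domain at 0.0.0.0."""
        return "\n".join(f"0.0.0.0 {d}" for d in domains)

    if __name__ == "__main__":
        # Append the output to the system hosts file (needs admin rights),
        # or feed the same list into a DNS filter or proxy blocklist instead.
        print(hosts_block_entries(GENAI_DOMAINS))
    ```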

    More companies need to do this.