The Government’s Robodebt Scandal
With more and more data being generated, governments and businesses worldwide have expanded their use of automated decision-making tools. Artificial intelligence (AI) is commonly used in defence and law enforcement agencies, for example, with governments deploying deep neural networks to detect suspicious activity in large data sets. Civil and health agencies are also leveraging AI in areas such as trade surveillance and climate and economic analysis. Whilst AI has many benefits, we also know that automated decision-making can create even larger problems than those it was employed to remedy. In particular, where AI is left unchecked, and efficiency and cost savings are prioritised over operational transparency, we see just how badly individuals can be hurt by the consequences.
An Australian Case Study:
An example of the damaging impacts of AI can be seen in Centrelink’s Online Compliance Intervention Program, more colloquially known as Robodebt. Launched in 2015, the Robodebt scheme was an automated debt assessment and recovery program that raised debts by ‘income averaging’ tax office data against welfare payments. In essence, the government calculated a recipient’s fortnightly income by averaging their annual tax office income evenly across the year, rather than establishing what they actually earned in each fortnight. This method of calculation was ultimately deemed unlawful, and it meant that many people with irregular incomes were issued incorrect Centrelink debts.
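To make the problem with income averaging concrete, the sketch below works through a hypothetical casual worker’s year in Python. The payment rate, income-free threshold and taper rate are invented illustrative figures, not Centrelink’s actual rules; the point is only to show how smearing an annual income evenly across fortnights can manufacture a debt for someone who reported their irregular income correctly.

```python
# A rough, illustrative sketch of why 'income averaging' raised false debts.
# The payment rate, income-free area and taper below are hypothetical values
# chosen to show the shape of the problem; they are not Centrelink's real rules.

FORTNIGHTS = 26


def entitlement(fortnightly_incomes, payment=500.0, free_area=300.0, taper=0.5):
    """Total payment a recipient is entitled to across the year, given their
    income in each fortnight: the full payment, reduced by 50 cents for every
    dollar earned over the income-free area (never below zero)."""
    return sum(
        max(0.0, payment - max(0.0, income - free_area) * taper)
        for income in fortnightly_incomes
    )


# A hypothetical casual worker who earned $26,000, all of it in 8 fortnights
# of work, and reported that income accurately at the time.
actual_incomes = [3250.0] * 8 + [0.0] * 18

# What the worker was actually (and correctly) paid, based on reported income.
amount_paid = entitlement(actual_incomes)

# Robodebt-style assumption: smear the annual tax office figure evenly
# across every fortnight, then recompute the 'correct' entitlement.
averaged_incomes = [26_000 / FORTNIGHTS] * FORTNIGHTS
deemed_entitlement = entitlement(averaged_incomes)

# The difference is raised as a 'debt', even though nothing was overpaid.
print(f"Paid (based on actual, reported income): ${amount_paid:,.2f}")
print(f"Deemed entitlement under averaging:      ${deemed_entitlement:,.2f}")
print(f"Debt raised by averaging:                ${amount_paid - deemed_entitlement:,.2f}")
# With these hypothetical figures: paid $9,000, deemed entitlement $3,900,
# so a $5,100 'debt' is raised against someone who owed nothing.
```

Someone with a perfectly steady income would see no difference between the two calculations; the damage falls precisely on people with irregular earnings, such as casual and seasonal workers.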
This led to a cycle of distress for many people receiving welfare payments. With the government having raised more than $1.76 billion in debts against 443,000 people and unlawfully recovered $751 million from them, it was clear that legal action had to be taken.
The Class Action:
In 2020, Gordon Legal brought a class action on behalf of the victims of the Robodebt scheme. In response, the government, with the approval of the Federal Court, agreed to repay the $751 million it took and to completely clear all debts, totalling $1.76 billion. A further $112 million in interest was added to the settlement, to be shared among the 394,000 victims according to the size of the debt raised against them and how long they went without their money.
Justice Bernard Murphy of the Federal Court of Australia stated that it should have been ‘obvious’ to government ministers that the debt-raising method was unlawful. Bill Shorten has said that this government mistake marked a ‘shameful chapter’ in the administration of the Commonwealth social security system. What is perhaps most concerning is how such a program was left unchecked and undetected for so long.
To be owed compensation, individuals have had to prove that their debts were ‘tainted with illegality’, a claim Justice Murphy acknowledged is difficult to establish. As a result, despite the class action, 200,000 people will not receive any benefit from the government settlement.
The Royal Commission:
Ultimately, despite Centrelink’s AI program being launched to prioritise cost savings and efficiency, Robodebt has only exacerbated wealth inequality in Australia. Furthermore, it has eroded societal trust in the government’s ability to manage social services and to care for the most vulnerable in the community.
A royal commission into Robodebt is to be established this year, with the aim of determining exactly who and what was responsible for the unlawful scheme. Knowing the truth about Robodebt will hopefully not only hold the government to account, but also ensure that a mistake like this never happens again.