An alliance of digital rights groups has urged the Morrison Government to fill in obvious gaps in the development of the tracing technology to give it its best chance of winning public trust.
Today, the Morrison Government released the COVIDSafe tracing app, asking all Australians to download the technology, which is designed to record every person a user has been in contact with.
Human rights and privacy experts have called on Federal Health Minister Greg Hunt to explain privacy and surveillance issues arising from the Federal Government’s recently launched Coronavirus Australia app. The app has been downloaded over 500,000 times in Australia, yet there is little publicly available information about what data is being collected from people and how that private information is being used and kept safe.
New technologies that use artificial intelligence should be assessed for their social impact on citizens before they are deployed, according to The Australia Institute’s Centre for Responsible Technology.
In its submission to the Australian Human Rights Commission (AHRC) discussion paper on human rights and technology, the Centre argues that a formal regulatory regime, rather than voluntary ethical codes, is required to protect the public interest in this time of rapid change.
The hack of Clearview AI, an Australian-founded start-up that harvests billions of photos from social media and bundles the information for law enforcement agencies, reinforces the need to place a moratorium on facial recognition technology.
The Australia Institute’s Centre for Responsible Technology is supporting the Australian Human Rights Commission’s proposal for a moratorium on facial recognition technology until a framework that protects the rights of individuals is developed.
Clearview AI is a clear case study of the issues surrounding facial recognition technology, which are yet to be confronted, including:
- The level of consent users should have over their information being captured and on-sold, either to businesses or state authorities;
- The protection of an individual’s personal information and how it is interpreted and used, particularly by the government;
- The security standards required for the storage and disposal of such information, especially when an image has been removed by a user from the public domain.
The Australia Institute’s Centre for Responsible Technology has called for an extension of the way computer games are classified, to capture design architecture that exposes children to addictive, gambling-based content in many common games.
In a submission to the Department of Communications review into the classification system, the Centre for Responsible Technology argues that the spread of ‘addiction by design’ in many games means children risk being groomed to become the problem gamblers of the future.
Disinformation is the real winner in the government’s light-touch response to the ACCC Digital Platforms Inquiry.
“If the ACCC Digital Platforms Review was, as reported at the time, world’s best practice on regulating Big Tech, the government’s response shows Big Tech has secured world’s best practice in slowing down meaningful reform,” said Peter Lewis, Director of the Centre for Responsible Technology at the Australia Institute.
“Hardly anything from the ACCC has survived untouched with the Big Tech companies avoiding the big-ticket reform of limiting their power to take over competitors, with the government opting for voluntary compliance and incremental reviews over regulation.
“This is a shame as the Morrison Government had the opportunity to lead a re-think of how platforms should operate and challenge the conceit that a platform is not a publisher.
“Perhaps the creation of a new section of the ACCC with oversight of the platforms will have an impact, but this model falls well short of the Prime Minister’s position that the rules that exist in the real world need to exist in the digital world,” Mr Lewis said.
The Australian public supports tighter regulation of political advertising on social media platforms, ranging from truth-in-advertising requirements and limits on micro-targeting to bans on political advertising on social media altogether.
The findings, based on public polling conducted by Essential Research in November, show:
- 73 per cent support requiring social media platforms to ensure political ads are factual;
- 70 per cent support requiring social media platforms to confirm that organisations advertising politically are registered locally;
- 66 per cent support preventing platforms from ‘micro-targeting’;
- 60 per cent support a ban on political advertising on social media altogether.
The figures were released as part of a report, "Distorting the Public Square", by The Australia Institute’s Centre for Responsible Technology associate fellow, Jordan Guiao.
The majority of Australians are not comfortable with the way government and companies collect and use their personal information, according to new research.
To coincide with its launch, the Australia Institute’s new Centre for Responsible Technology today released new research showing high levels of discomfort with the way personal information is collected, repurposed and stored.