11am Friday 28 January 2022
House of Representatives Select Committee on Social Media and Online Safety
Inquiry into Social Media and Online Safety
Opening Remarks – Peter Lewis, Director of the Australia Institute’s Centre for Responsible Technology:
We are an independent thinktank under the auspices of the Australia Institute with a commitment to promoting and supporting the development of sensible regulation of network technology.
With your indulgence I will speak briefly to our submission.
In essence we make three simple points:
First, that the challenge of addressing the myriad online harms that are a consequence of Big Tech’s current business models requires a systemic response.
Second, that much of this thinking has already been done and is awaiting government action – most notably via the ACCC’s Digital Platforms Inquiry and the Australian Human Rights Commission’s ‘Human Rights and Technology Report’.
And finally, that we need to start imagining alternative digital infrastructure to these global advertising monopolies as the anchor to our public square.
This has been a significant Parliament in terms of beginning this journey and I do want to recognise the refreshingly bi-partisan efforts to shift the balance of power back towards the public interest.
The News Media Bargaining Code which passed this Parliament just under a year ago with bi-partisan endorsement was a globally significant intervention in support of public interest journalism which has had a positive impact on the viability of Australia’s media.
The Centre for Responsible Technology found itself in the position of supporting the Morrison Government’s efforts to implement the code in the face of extreme pressure from the Big Tech giants that culminated in Facebook’s takedown of Australian civic content.
However, it was only one of 23 recommendations from the ACCC that are slowly working their way through the system and that, if fully implemented, would create an evidence-based framework for dealing with the market power of the Big Tech platforms.
These include measures to update Privacy Laws, actionable platform responsibility for harmful disinformation and a national effort to increase Australia’s digital literacy – including our understanding of how these networks make their dough.
Likewise, former Commissioner Ed Santow’s report into the human rights implications of the development of Artificial Intelligence, which explicitly calls for a moratorium on the development of facial recognition technology until public interest safeguards are in place, is awaiting a formal response from the Attorney-General.
The committee should also note there is a coordinated national research effort underway to chart the broader implications of technological change through the ARC Centre of Excellence for Automated Decision-Making and Society that should inform better policy-making over the next Parliament.
What is clearly required is an all-of-government approach – not one-off interventions – to create safer online environments.
The most significant finding from the ACCC Inquiry was that the digital platforms should be regarded as ‘advertising monopolies’.
I think it is worth bearing this in mind as you work your way through these particular terms of reference. Whenever you read the words ‘social media’, change them to ‘advertising monopoly’.
Through this lens, efforts to attribute platform responsibility take on a real urgency.
Through this lens, the case for transparency is self-evident.
And through this lens, the onus really is on you, as our elected representatives, to actively disrupt their business models when they threaten the well-being of our children and of our democracy.
These platforms are gathering mind-boggling amounts of information about us and have developed a business model that exploits this information by selling our attention to the highest bidder.
In the pursuit of eye-watering profits, these platforms will always go as close to the line as they can. In fact, to quote the GOAT, they ‘headbutt’ the line.
Regulation of these platforms will always be complex and contested – it needs clear evidence-based thinking with a clear articulation of the public’s interest.
I’ve been somewhat sceptical of this inquiry because of its narrow time frame – but I do recognise the importance of stepping these challenges out.
Drawing on the work that is already in train, there is a clear opportunity to chart a journey towards safer social network infrastructure that works in the public’s interest.