Why did English football fans fill social media with racial slurs? Because it’s good for business

Peter Lewis reflects on why fans took to social media to hurl racist abuse at English football players who missed key penalties in the Euro 2020 final:

"If I were to stand on the hill at my local football ground and hurl racist abuse at players after they missed a vital goal there would rightly be real-world consequences.

If a fellow fan or player reported me, I would be ejected from the ground and given a long-term ban on my membership. I would be banished from my flock.

If the player I had verbally attacked chose to lodge a complaint, I could be subject to a racial vilification complaint to the Australian Human Rights Commission. In NSW I would also be liable to criminal prosecution for a hate crime if it were deemed to be inciting violence.

But these measures would likely be unnecessary, because the idea of shouting out such obscenity is so far outside the social norms of modern society that I would be unlikely to even consider this as a response to my disappointment.

The reason that thousands of shattered English soccer fans felt entitled to take to social media with racist slurs against their own players this week was that on social media this sort of emotion-fuelled discourse is the norm."

Read the full article on the Sydney Morning Herald here.


Tech Check Digest - July 8, 2021

The world of technology moves fast, and there are always interesting things happening in technology news, policy and events.

So we've decided to regularly curate a list of the most interesting articles, links and media for your pleasure.

Explore our first Tech Check Digest below, and let us know if you come across anything great that we may have missed!

READ:

Rod Sims on the pivotal Digital Platforms Inquiry, two years on, and how it will continue to shape the digital landscape for the next few years (via InnovationAus)

Former US President Trump files suit against Facebook, Twitter and YouTube, claiming he had been wrongfully censored (via AP)

Twitter no longer has liability protection over user-generated content in India (via Reuters)

WATCH:

‘Digital feudalism: The future of data capitalism’. A conversation between Shoshana Zuboff (author of The Age of Surveillance Capitalism), Tim O’Reilly (Founder & CEO of O’Reilly Media) and Mariana Mazzucato (Founding Director of the UCL Institute for Innovation & Public Purpose)

LISTEN:

‘AI And Humans: Collaboration Rather Than Domination’, an interview with Jeannie Paterson, Professor of Law and Co-director of the Centre for AI and Digital Ethics at Melbourne Law School, via the University of Melbourne

EVENTS:

Facial Recognition webinar featuring Human Rights Commissioner Ed Santow and Digital Rights Watch chair Lizzie O’Shea, hosted by Monash University


The Australian Search Experience Project

Centre for Responsible Technology Associate, Professor Axel Bruns, is leading a project through the ARC Centre of Excellence for Automated Decision-Making and Society investigating whether our search results are customised based on the different profiles Google has defined.

The project, called The Australian Search Experience, is a crowdsourcing initiative that asks Australians to lend their search engine activity to this pivotal study.

Learn more here


On public health and the digital platforms

Moves this week by the Therapeutic Goods Administration (TGA) to stymie Clive Palmer’s latest foray into political advertising highlight the different rules that apply to traditional media and to the new social media platforms.

Whereas the TGA has warned that Palmer and the regional radio station running his anti-vax ads are breaching their responsibilities as advertiser and broadcaster, in the online environment it is up to the platforms to make their own call.

On Facebook and other social networks, this sort of disinformation is circulating in groups and targeted networks, far away from the gaze of health professionals.

When dangerous misinformation does come to attention, platforms can be prompted to act – Facebook to its credit has taken down content from MP Craig Kelly. But such actions remain at the discretion of the platform.

Free of enforceable rules and driven by a business model that preferences content that excites and enrages users so as to keep them producing behavioural data for longer, these digital platforms have become their own public health problem.

Efforts to mitigate disinformation have been minimalist. In Australia and abroad, the preference has been for voluntary industry codes and protocols that set down feel-good statements of intent without sheeting home legal responsibility.

Read the full article by Peter Lewis and Jordan Guiao on Croakey News here.


Ed Santow discusses landmark AI report with Centre for Responsible Technology

Human Rights Commissioner Ed Santow joins the Australia Institute's Centre for Responsible Technology director Peter Lewis and Digital Rights Watch chair Lizzie O'Shea for this important discussion about the rules for the future, presented by the Public Square Project.


New AI rules will put Australia ahead

Peter Lewis writes about the new Human Rights & Technology report from the Australian Human Rights Commission:

"As the Swiss have their watches and the Danes their furniture, maybe Australia could have its AI, built with fairness baked in, delightfully designed, rigorously engineered, embedding all that is good about us in the algorithm.

It could be exported to governments and businesses around the globe to create more robust algorithms and help avoid having to make the choice between systems anchored in state surveillance or in surveillance capitalism.

The instinct of business is always to push back against government regulation as red tape that will stifle innovation. But the Santow Report should be seen more as the guardrails that could turbocharge Australian technology onto the world stage."

Read the full article on The Australian here.


Tech Talk - Secret Algorithms, Data Protection and Surveillance Cities

In this fortnight's Tech Talk we review the US Senate hearings into social media algorithms, evaluate competing models for data protection and take a walk through Melbourne's not-so-secret city.

Read more

Eli Pariser launches new report 'The Public Square Project'

Eli Pariser, co-founder of Avaaz and Upworthy and author of 'The Filter Bubble', joins our director Peter Lewis and Digital Rights Watch chair Lizzie O'Shea for a conversation on what healthy digital public spaces could look like.

Pariser: "If Elon Musk can imagine whatever crazy scheme to put robots on the moon, we can imagine better community infrastructure". Watch the full webinar below:


Tech Talk - Identity Theft, Chipageddon and Facebook's 'Supreme Court'

A lot of new developments in the world of technology, including policy updates from the European Commission, a global shortage of computer chips, and the thorny problem of content moderation.

Read more

False choice: Why the default surveillance model is no choice at all

In our submission to the latest report of the ACCC's Digital Platform Services Inquiry, we welcome the ACCC's examination of the default product options on Android that have contributed to Google's anti-competitive market dominance.

The issue of Google’s default choice on Android has already resulted in a significant antitrust fine in Europe, and a choice screen has been presented as a solution.

More than a year after its introduction, the European choice screen has proven inadequate, with Google’s design continuing to preference its own products and many of the ‘alternative’ products remaining part of Google’s ecosystem.

The Centre for Responsible Technology recommends that the ACCC:

1) Following the €4 billion fine for antitrust violations in Europe, consider whether Google is in breach of similar competition laws in Australia.

2) Define the parameters and specifications of the choice screen solution upfront as part of this inquiry rather than allowing vested interests to develop the choice screen themselves.

3) Monitor the effect of the choice screen on a quarterly basis to determine its effectiveness and performance, and adjust the design accordingly.

4) Recommend that the government develop incentives for competitors like DuckDuckGo to establish a more developed local presence and regional strategy in the Australian market.

 


