Week Five: Algorithms

The confronting article “When algorithms control information, what of democracy?” is eye-opening for anyone who is not aware of the presence of algorithms in their own lives and in the wider world. The idea that the public’s emotions and actions are being manipulated to serve an agenda (Arvanitakis, 2017) is especially thought-provoking. Studies suggest that many people do not know or understand the extent to which algorithms may be affecting their lives (Susarla, 2019).

There seems to be a widespread opinion that current regulations do not do enough, or are not keeping up with these developments in artificial intelligence and the way it relates to people’s data and privacy (Weisbaum, 2018; Ehrenmann, cited in Susarla, 2019). A succinct set of suggestions from the 2017 Pew Research Center report includes stricter access protocols, ethical codes for digital stewardship, and no third-party sale of data without consent (Rainie & Anderson, 2017). These sorts of safeguards can only be enforced by law, as companies will not generally implement them voluntarily, even whilst claiming to support them. For example, Zuckerberg has called for regulation to protect privacy while, at the same time, Facebook has been fined multiple times for various infringements (Privacy International, 2019).

I do wonder about the place of universities in teaching data ethics. Not only do they stand to benefit from algorithms and artificial intelligence (for example, by identifying at-risk students, as noted in the article), but the digital divide described by Susarla (2019) may widen if universities become the key place to learn about these issues. University is expensive: not everyone gets to attend. And what of other groups, such as the elderly? Perhaps libraries, including school libraries, can step in to bridge this gap; the collection of data and targeting of content is not restricted to adults.

References

Arvanitakis, J. (2017, August 11). If Google and Facebook rely on opaque algorithms, what does that mean for democracy? ABC News. Retrieved from https://www.abc.net.au/news/2017-08-10/ai-democracy-google-facebook/8782970?pfmredir=sm

Privacy International. (2019). Cambridge Analytica, GDPR – 1 year on – a lot of words and some action. Retrieved from https://privacyinternational.org/news-analysis/2857/cambridge-analytica-gdpr-1-year-lot-words-and-some-action

Rainie, L., & Anderson, J. (2017, February 8). Code-dependent: Pros and cons of the algorithm age. Pew Research Center. Retrieved from https://www.pewresearch.org/internet/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/

Saraga, D. (2017). Opinion: Should algorithms be regulated? Phys.org. Retrieved from https://phys.org/news/2017-01-opinion-algorithms.html

Susarla, A. (2019, April 17). The new digital divide is between people who opt out of algorithms and people who don’t. The Conversation. Retrieved from https://theconversation.com/the-new-digital-divide-is-between-people-who-opt-out-of-algorithms-and-people-who-dont-114719.

Weisbaum, H. (2018). Trust in Facebook has dropped by 66 percent since the Cambridge Analytica scandal. NBC News. Retrieved from https://www.nbcnews.com/business/consumer/trust-facebook-has-dropped-51-percent-cambridge-analytica-scandal-n867011
