Released in 2020, the documentary-drama ‘The Social Dilemma’ offers a thought-provoking and alarming depiction of our reality today. The film exposes the ruthless nature of tech giants seeking to refine marketing algorithms for monetary gain, and the consequences that have emerged as a result: from fuelling mental health issues and nurturing addictions to promoting the spread of fake news and threatening democracy.
Whilst it certainly offers a somewhat biased, one-sided take on the social media phenomenon, the film nevertheless raises a number of important concerns that are worth addressing.
As part of Eskenzi PR’s latest initiative, the Eskenzi Cyber Book & Film Club, cybersecurity and cyberpsychology experts were invited to take part in a Tweet Chat to discuss some of these very issues. Specifically, we were joined by Brian Higgins, Director at ARCO Cyber Security and Security Specialist at Comparitech; Anete Poriete, UX Researcher and Cyber Psychologist at CyberSmart; Madeline Howard, Director at Cyber Cheltenham (CyNam); and Neil Stinchcombe, co-founder of Eskenzi PR.
To read up on all of their insights, check out the Eskenzi Twitter or look under the hashtag #EskenziClubSD!
What is the biggest problem with social media?
In the same way the documentary began, the event kicked off with a rather broad question:
“What do you think is the biggest problem with social media today? Is there a problem?”
A general consensus suggested that a lack of regulation and ownership of responsibility has played a central role in the failings of social media.
Absolutely agree!! Facebook are launching Instagram for children and you can bet it’s not to safeguard them from online harms. What better way to know which Ads they’ll like when they finally get some independent spending power. pic.twitter.com/j7NMMfSc7g
— Brian Higgins (@brian_cyber) March 26, 2021
#EskenziClubSD A1: Social media gives us great freedom, but with great freedom comes great responsibility. The lack of regulation and communal irresponsibility are perhaps the biggest problems with it, although its effect on our collective cyber psychology is also important.
— CyberSmart (@CyberSmartUK) March 26, 2021
For Brian Higgins, part of the problem can be attributed to our ignorance. Indeed, if we are unaware that we are in the matrix, how can we then solve the issue, let alone recognise the problem in the first place?
A1. It’s definitely the Matrix problem and it’s generational. I was born in 1968 and remember life before tech. Younger people definitely can’t escape from the Matrix because they have no idea they’re in it! pic.twitter.com/bxv5XcK7Hs
— Brian Higgins (@brian_cyber) March 26, 2021
Social Media: Tool or Manipulation Instrument?
During the film, Tristan Harris, former design ethicist at Google and co-founder of the Center for Humane Technology, suggested that we had “moved away from a tools-based technology environment, to an addiction and manipulation based technology environment. Social media isn’t a tool waiting to be used. It has its own goals, and it has its own means of pursuing them by using your psychology against you.”
The argument suggests that algorithms and artificial intelligence are increasingly adept at understanding who we are, and are leveraging this knowledge to curate our reality as well as influence our thoughts and decisions.
A2 I think social is driving an agenda of addiction after all only two industries call their customers USERS, perhaps it did not start out as intentional, but if the platform is built on getting your attention to exert influence on you it is not surprising #EskenziClubSD pic.twitter.com/VnJgIevbnh
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
A2: In many ways social media is still a tool – to connect with individuals, do business, share thoughts, etc. BUT through algorithms which shape our individually tailored content and adverts our human psychology is being manipulated. #EskenziClubSD
— Madeline (@MadzzHoward) March 26, 2021
In addition to algorithms, however, there is the platform offered to ‘influencers’.
#EskenziClubSD A2: Besides this, one of the key psychological features that cyberspace presents is the loss of authority and equalised status, therefore anyone can be seen as an influencer as long as they have the skill and means to communicate to their target audience.
— CyberSmart (@CyberSmartUK) March 26, 2021
Unfortunately, it seems our habit of consuming bite-size information has also left us susceptible to manipulation, as both our attention spans and critical thinking are negatively impacted.
If we decrease our attention span and constantly occupy our brains with content that distracts and seduces us, then we lose the ability to question if something is true or relevant, do not feed your demons #EskenziClubSD pic.twitter.com/AeEEv0ONT2
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
#EskenziClubSD A3: People take less than a minute to view the majority of their on-screen content and it takes less than 30 seconds for people to switch across different content as that is more exciting.
— CyberSmart (@CyberSmartUK) March 26, 2021
A3: The endless streams of communication & connection are changing the way we think and absorb information.
Dopamine is stimulated by unpredictability, by small bits of information, and by reward cues. This is pretty much the exact conditions of social media. #EskenziClubSD
— Madeline (@MadzzHoward) March 26, 2021
Bite-size info has removed the ability to apply critical thinking to the things we read and see. Immediacy of information and expected response are the reasons why phenomena like Fake News have taken such a foothold. pic.twitter.com/5lZXeMkrZ8
— Brian Higgins (@brian_cyber) March 26, 2021
To Intervene or Not to Intervene
Recognising the imperfect nature of social media design, then, we wondered whether intervention by tech giants is required, particularly with regard to disinformation and misinformation.
A4: 100% intervention is required through fact checking teams / bots at the social media orgs. Newspapers are held accountable and penalised if they publish things that have negative repercussions…..social media platforms and users should be no different. #EskenziClubSD
— Madeline (@MadzzHoward) March 26, 2021
Social Media needs rules and accountability to protect the truth, rather than fake news with is more popular and spreads faster, it is no longer a case of you cannot handle the truth, it is hard to know what is the truth #EskenziClubSD https://t.co/Xp1awMeGei pic.twitter.com/E3Dmw3nAFL
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
It’s fairly obvious that SM companies have a very set agenda and aren’t willing to self-regulate. Mark Z was in the film saying that the best way to combat all of the problems caused by their AD-spewing algorithms is to write more algorithms.#EskenziClubSD#Definitionofmadness pic.twitter.com/xp7eJG0TMG
— Brian Higgins (@brian_cyber) March 26, 2021
#EskenziClubSD A4: Uses and gratifications theory explains that what technology was designed vs how people use technology can be very different. I think the design and policies should be adjusted to the actual usage of social media and support the common good for us as a society.
— CyberSmart (@CyberSmartUK) March 26, 2021
Yet, the issue of misinformation is not always clear cut. In fact, a recent study conducted by Facebook suggests that it is not necessarily false information that creates problems but content that doesn’t “outright break the rules”.
The study sought to understand how the spread of ideas on social media was affecting Covid-19 vaccine hesitancy. Despite the platform banning false and misleading statements about the vaccine, many posts, including expressions of concern or doubt, are too ambiguous to be removed, yet have been found to play a harmful, contributing role in hesitancy. This is especially true when the message is promoted by influencers and concentrated within like-minded communities that act as echo chambers.
Anete Poriete explains this further:
#EskenziClubSD A5: Likewise, we’re often more inclined to favour opinions and information that comes from sources we identify with – the so-called ‘echo chamber effect’. The best way to address this is with critical thinking and fact-checking before sharing the content further.
— CyberSmart (@CyberSmartUK) March 26, 2021
To address the issue, Madeline Howard believes proactive engagement is necessary.
A5: (continued!)….Work needs to be done proactively engage with these communities and direct more authoritative information towards them. #EskenziClubSD pic.twitter.com/Ot2xLqPQBD
— Madeline (@MadzzHoward) March 26, 2021
This then led us to question whether it is ever okay to amplify a message.
A6. Given the fact (if it is a fact!) that it’s apparently impossible to discern between messages of truth and those that are in some way false, then allowing algorithms to propagate the information to billions of tech-addicted internet sheep is probably not the best idea. pic.twitter.com/BdLyCqyshp
— Brian Higgins (@brian_cyber) March 26, 2021
A6: I don’t think it is OK to amplify messages regardless of who you are or the organisation. It’s created a deceitful online environment where individuals, organisations & governments are playing the algorithm game to gain the most traction, engagement and ultimately influence.. pic.twitter.com/39FeXREAI1
— Madeline (@MadzzHoward) March 26, 2021
A6 Not MONEY It will depend on who/what defines the rules of the algorithm, good AI will have its ethics set by society so if the process used to decide, is similar to society influencing who a self driving car crashes into it could be fine #EskenziClubSD https://t.co/kRDjdLNbTp pic.twitter.com/70zHqBc4jv
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
The Privacy Paradox
The news is full of concern about privacy, and we all think of it as very important, yet the way we act in reality is often contradictory. There appears to be a cognitive dissonance: we claim to value our privacy, and yet we continue to engage with services such as Facebook that undermine it. Moreover, we often choose to overshare details of ourselves and our lives on such platforms.
#EskenziClubSD A7: Unfortunately, this doesn’t always lead to according behaviours – a so called privacy paradox. There are multiple reasons why people would not act accordingly, firstly, they lack knowledge of how to manage their privacy settings as they are complicated….
— CyberSmart (@CyberSmartUK) March 26, 2021
#EskenziClubSD A7: In social media, ‘close circle illusion’ plays a role as people see their information and data as only being shared with their friends and acquaintances.
— CyberSmart (@CyberSmartUK) March 26, 2021
A7. The answer to this is dehumanisation. Online activity allows us to post and troll and bully and sextort and groom all alone at home. Add even one other human presence and all of these behaviours would either be altered dramatically or not happen in the first place pic.twitter.com/Hee5zfMqOL
— Brian Higgins (@brian_cyber) March 26, 2021
A7 Does privacy even exist anymore? And do people still care about it? The more you have to lose the more you care about it, in a social media world there is a trade-off between privacy and fame #EskenziClubSD https://t.co/9zuzRJtfeD pic.twitter.com/lCQeSfZRr2
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
A7: Work needs to be done to raise awareness more effectively to the lay person. When the news is talking about privacy & the issues surrounding it, why aren’t they ending with ‘and here are some ways to ensure you’re living and working securely online’…. #EskenziClubSD pic.twitter.com/MdOOov9Ta0
— Madeline (@MadzzHoward) March 26, 2021
Interestingly, our online behaviours also make us susceptible to cybercrime.
Q8: Cybercriminals are leveraging the very nature of social media – barrier-less connectivity.
Enabling them to connect, share and rapidly amplify their scams across social media to exploit our already heightening emotions when on the platforms. #EskenziClubSD
— Madeline (@MadzzHoward) March 26, 2021
A8. FUD.
Fear, Uncertainty and Doubt. Basically all of the things that the modern SM landscape has made impossible to avoid.
Hear all about when I joined the @TheBeerFarmers a while ago. Don’t watch it with your kids!!! https://t.co/iDqvogbL0C #EskenziClubSD
— Brian Higgins (@brian_cyber) March 26, 2021
A8 one of the biggest drivers for humans is being appreciated so that is one, greed is another, if something seems too good to be true then do not click that link #EskenziClubSD https://t.co/35dauSDseS pic.twitter.com/VwjGSolwNS
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
….which means that people are more open and judgement impaired online, much like being drunk. People can become more bold, confident and disclose a level of private and intimate information they would not necessarily disclose when offline. #EskenziClubSD
— CyberSmart (@CyberSmartUK) March 26, 2021
Recommendations and Solutions
To conclude the Tweet Chat, we asked the experts what they thought about the use of verified ID in helping to make us safer online and the concept of ethical-by-design.
In response to verified ID, the verdict was clear: it would encourage accountability. Nevertheless, as Anete points out, anonymity can also serve as a safety measure, so ID verification should be a matter of choice. Neil added that the security of one’s identification data should also be considered before ID verification is implemented on a wider scale.
A9: Yes. Individuals should be accountable for their comments and actions online. If you’re not prepared for your ID to be associated with an account…..why not…… #EskenziClubSD pic.twitter.com/M7I4uFdz5k
— Madeline (@MadzzHoward) March 26, 2021
A9. Absolutely it would. It would also severely restrict the user base of every SM platform and force them to be more accountable and less irresponsible. So I can’t see it happening any time soon.
#EskenziClubSD pic.twitter.com/CttQA70FLz
— Brian Higgins (@brian_cyber) March 26, 2021
#EskenziClubSD A9: As cyberspace has introduced a lot of space for identity flexibility, complete anonymity in combination with disinhibition effects can help people to engage in more negative behaviours.
— CyberSmart (@CyberSmartUK) March 26, 2021
… cybercriminal groups. Therefore, I would agree that verification could make us safer online, but I believe these should be subject to choice. There are many instances where anonymity can be a safety measure. #EskenziClubSD
— CyberSmart (@CyberSmartUK) March 26, 2021
A9 Yes of course, as long as it is not used to identify where vulnerable people live or attack them, the process for identification needs to be secured and used ethically if you live in a totalitarian state you might have a different view #EskenziClubSD https://t.co/LYUwc0DmAu pic.twitter.com/uZ2rLjGZt8
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
With respect to the concept of ‘ethical-by-design’, it was agreed that ethics are ever-evolving and subjective, and should therefore be regularly re-evaluated. The key is ensuring that technological design works in the user’s best interest and operates with transparency.
A10: Platforms need to continue to evolve at pace to ensure they’re continually ethical in their design. Societal and industry trends, human behaviour, scientific research, political and economic issues all need to be considered and monitored so platforms can adapt #EskenziClubSD
— Madeline (@MadzzHoward) March 26, 2021
A10. Ethics are subjective, both socially and culturally. Since SM platforms are designed with a single, commercial purpose, ethics seem kind of defunct.#EskenziClubSD
— Brian Higgins (@brian_cyber) March 26, 2021
A10 Ethical by design is an interesting concept, the devil is in the detail and execution of course, who/what will decide what is and is not ethical, we have seen large groups in society develop polarised views and the center has almost disappeared #EskenziClubSD https://t.co/TiVPGEfJnH pic.twitter.com/2FnCM7z6kZ
— Neil Stinchcombe (@neilstinchcombe) March 26, 2021
#EskenziClubSD A10: Transparency is also an important element of ethical design. People need to fully understand what they are signing up for and this information should be easily accessible, digestible and understandable. Safety and security is the basis for ethical design.
— CyberSmart (@CyberSmartUK) March 26, 2021
A Concluding Note
While the Tweet Chat focused mainly on the negative consequences of social media, it is important to recognise that it has also brought us many benefits, which cannot and should not be neglected. We hope this discussion provided you with some food for thought.
A10: The Social Dilemma was indeed an interesting watch but it was disappointing it didn’t share a balanced view.
Social media does bring benefits to our global society. It’s important to acknowledge the negatives & try to change them, but always remember the positives. pic.twitter.com/UZ7FdAIMEr
— Madeline (@MadzzHoward) March 26, 2021