Categories
Issue 2

Sitting inside the black mirror and peeking at the world beyond

Social media is all around us. One can argue that the very way we communicate today, and conceptualize our interactions with the world at large, has been fundamentally altered by it. Television shows such as Black Mirror and the Netflix documentary The Social Dilemma have drawn public attention to the ramifications of human-computer interaction. While there is no denying that social media has made staying in touch with friends and family a breeze, no matter where they are, and has aided technological progress, there is, unfortunately, a flip side. The problem is twofold: first, we often believe our social media feeds are an accurate representation of reality; second, we spend too much time on our devices, which makes the first problem worse.

Why we keep spending increasing amounts of time on our smart devices is tied to how internet companies such as Google, Facebook, and Instagram make money while providing their services for free.

A company, by definition, exists to generate a profit. That holds true for internet companies as well. While we are not charged for Facebook, Instagram, and the like, they monetise through ad revenue. They therefore depend on algorithms to detect patterns in our browsing behaviour, so that they can match us to the best possible advertisers; if we look at an ad long enough, we might be prompted to spend money.

Two corollaries follow:

First, the more accurately the algorithm predicts our patterns of behaviour on the internet, the better the company can tailor the content of our feed.

Second, the longer we spend on our screens, the more ads we see, and the more money the company makes by charging the vendors.

To achieve the first goal, one of the main strategies companies use is AI-based smart algorithms. In machine learning, you give the algorithm a goal, and it figures out how to achieve it by itself. AI is also only as good as the data it is trained on. Companies like Google and Facebook have huge data sets at their disposal, because vast numbers of users from different countries spend long hours online, generating an unfathomable quantity of data. Modern algorithms tailor social media feeds accurately based on these patterns. By frequently showing us content we like, they ensure we stay on our devices longer. Knowing this is important, because it helps us resist believing that our social media feed is an accurate representation of the world. Once that false belief takes hold, it makes us more partisan, to the point where we cannot even consider having a discussion with people harbouring contrarian viewpoints. This unwillingness to engage rationally with the other side is dangerous for public discourse, and it lies at the centre of exclusion, discrimination, hate speech, and hate crimes based on gender, class, caste, ethnicity, and religion.

Additionally, most social media apps are designed around the psychology of persuasion and, more dangerously, addiction. In the 1930s, B.F. Skinner demonstrated what we describe today as operant conditioning: animals repeat behaviours and learn a task when given a reward, and stop when the reward is taken away. However, when Skinner changed the schedule of reward delivery, he found something striking. When rewards follow a variable-ratio schedule (a food pellet delivered after an uncertain number of lever presses), so that the animals expect a reward without knowing when it will arrive, they learn to repeat the task behaviour fastest. More importantly, even if the rewards are stopped entirely, they keep pressing the lever: this type of learned behaviour is extremely difficult to extinguish. This is exactly the principle on which gambling and slot machines work. They keep the gambler on tenterhooks, expecting a win, and in the process keep them playing and betting.
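
The difference between the two reward schedules Skinner compared can be sketched in a few lines of code. This is a toy simulation written for illustration (the function names and the mean ratio of 5 are my assumptions, not from the article): both schedules pay out at roughly the same long-run rate, but under the variable schedule the subject can never tell whether the next press will pay.

```python
import random

def fixed_ratio(presses: int, ratio: int = 5) -> int:
    """Count rewards when every `ratio`-th lever press pays out (predictable)."""
    return presses // ratio

def variable_ratio(presses: int, mean_ratio: int = 5, seed: int = 0) -> int:
    """Count rewards when each payout arrives after a random number of
    presses, uniform on 1..2*mean_ratio-1, so the average gap is mean_ratio."""
    rng = random.Random(seed)
    rewards = 0
    next_payout = rng.randint(1, 2 * mean_ratio - 1)
    for press in range(1, presses + 1):
        if press == next_payout:
            rewards += 1
            next_payout = press + rng.randint(1, 2 * mean_ratio - 1)
    return rewards

# Roughly the same number of rewards over 1000 presses; the behavioural
# difference lies entirely in the unpredictability of the next payout.
print(fixed_ratio(1000), variable_ratio(1000))
```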

Both the brain pathways and the neurotransmitters that underlie such addictive behaviour have been characterized in detail over the years. Deep in our midbrain and brainstem sits a group of neurons that release dopamine, the pleasure chemical. This area is the ventral tegmental area (VTA). VTA neurons talk to another set of neurons hidden under the cerebral cortex, the nucleus accumbens, which in turn talks to the frontal part of the brain, where the most important executive controls and decision-making reside. Natural rewards, such as food, pleasurable sex, or satiety, cause dopamine release in the nucleus accumbens, which, through the cortex, produces the sensation of pleasure and reward. This is why we like to repeat what makes us feel good. Addiction hijacks this pathway, whether it is a chemical addiction such as cocaine or a behavioural addiction like gambling. These addictions cause a massive upsurge in dopamine, much larger than physiological dopamine release. In seasoned addicts, the anticipation of a reward releases twice as much dopamine as the reward itself. This biological phenomenon makes it hard to abstain successfully.

Image Courtesy: drugabuse.gov

This is the mechanism most tech companies have targeted. Since the time spent on devices is directly proportional to the ad revenue the companies earn, it is in their interest to make social media usage addictive. Hence, app design now leans on endless notifications, ever-better personalization of our feeds, and features like the typing bubble with moving dots that appears when someone is composing a reply, which most apps did not have before. This is a classic example of exploiting the dopamine upsurge of anticipation: if the person typing is a romantic interest or a recruitment manager, the anticipation of the reward can be more addictive than the reward itself.

The fact that you pick up your phone and 25 minutes whoosh past isn't random; it isn't you. The anticipation of receiving curated content is arguably similar to the dopamine rush a gambling addict gets.

If this sounds far-fetched, take a simple test. Go device-free for 24-48 hours: lock your devices away. Track your mood changes, cravings, general wellbeing, and distress during this time. How irritable or uneasy are you? How much do you fear you are missing out, or crave your device? Once you get your device back, chart how long you use it every day (all smartphones and tablets can report your daily screen time). Now compare your usage in the 48 hours after abstinence to your regular usage. The mood chart shows how strong your social media habits are.

We survived fine till 2007, when the modern smartphone was introduced. While our brains have evolved little over the past millennia, our environment has exploded over the past decades, especially in the online space. Biological evolution cannot keep up with the exponential evolution of technology. Therefore, spending too long on your devices makes you vulnerable to a range of health problems: poor eyesight, postural pains, lack of exercise, and more. Constantly engaging with deeply disturbing content, even when it concerns social justice issues, will also inevitably start affecting your mental health and wellbeing. The world around us is brutal and unfair; it is rife with discrimination and atrocities. While we should be aware of such inequalities, engaging with the news all day can lead to a sense of losing control over your life. All of these are good reasons to give yourself a periodic detox from social media.

Given how all-pervasive social media is, where and how do we draw the line? Here are a few tips:

  1. Know that this is a world of your own creation. 

If you subscribe to viewpoint A, the apps curate your feed with everything that reinforces A and negates all other viewpoints. You gravitate towards atrocities committed by those who do not subscribe to A, and you behave as if the only people who understand reality are those who subscribe to A, while all others cannot be debated or even conversed with. Ultimately, this makes society more polarized; arguably, the world is far more polarized today than it was 40 years ago. By falling prey to believing that the version of reality on your screens is absolute reality, you open yourself up to being manipulated into hate speech, insensitivity, and sometimes physical violence towards people whose views contradict yours. Listen to contrarian viewpoints. Don't allow one ideology to wholly dictate what you trust.

  2. Give yourself a digital detox every now and then. 

When you're not busy with work, limit your screen time. When you go out with your friends for a much sought-after coffee, engage in conversation. Mutually agree to restrict mobile usage to 3-5 photos for the entire duration of the meet. Write in a physical journal every day. Exchange letters. Indulge in hobbies and activities that do not involve screens. When you are on vacation, switch off your phone entirely, and activate an automatic email reply stating the date you will return to work and naming an interim contact for urgent matters.

  3. Resist the temptation to document each moment of your life on social media. 

Besides adding to your digital footprint, this leads to unhealthy comparisons. Most people put their best foot forward on social media: everyone posts pictures of themselves doing something fun. Very few people post unhappy pictures on Instagram or write on Facebook about a bad day at work, yet negative things happen to all of us, every day. If we believe everything we see on our feeds to be a true reflection of others' lives, we buy into the idea that everyone has a perfect life except, well, ourselves. This is not true. No one has a perfect life, and what people project on social media is often different from their real lives. So do not compare yourself to anyone on social media. Live your life as you want, without telling everyone about every moment of it. The "likes" only activate short dopamine loops that provide instant gratification and are addictive. No number of likes determines self-worth, so actively stop tying your sense of self-worth to likes and followers.

  4. Be a conscious consumer, not prey to the influencer phenomenon. 

Social media can be used constructively; collaborations, products, and partnerships have evolved to its credit. Since you are curating your own feed, use this awareness, and the powerful AI behind the apps, to curate a feed that is good for you: pages and channels that deliver creative content, amplify positive messaging, and promote mindfulness and healthy living.

In the same vein, be picky about whom you choose to follow, as their content will make up your curated feed and likely attract more of the same. We live in a country where influencers with millions of followers routinely promote misogynistic, crass, classist, casteist, and majoritarian views. When you decide to follow an account, try to determine the veracity of its claims. Does it cite data? What is the source? If you look at the data, does the conclusion make sense? Never amplify something you have not fact-checked. There is a tremendous amount of misinformation in the post-truth era.

  5. General principles of sensible social media use

Research has shown that constant social media usage leads to restlessness and an inability to focus, which directly affect professional and educational performance. A slew of productivity apps based on the pomodoro technique (25 minutes of work followed by a 5-minute break) and on restricting social media usage are available on all platforms and can be used to structure the work day. The break can be used for any activity that does not involve phones or computers. Outside work hours, make some rules for social media usage and stick to them. Turn notifications off for most apps except your calendar or reminders.
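
The work-and-break cycle described above is simple enough to sketch as a script. This is a minimal, hypothetical illustration (real pomodoro apps add notifications and app-blocking, which this does not):

```python
import time

def pomodoro(work_min: int = 25, break_min: int = 5, cycles: int = 4) -> None:
    """Run a simple work/break timer: focus, then step away from the screen."""
    for i in range(1, cycles + 1):
        print(f"Cycle {i}: focus for {work_min} minutes.")
        time.sleep(work_min * 60)  # work period
        print(f"Cycle {i}: take a {break_min}-minute break, away from screens.")
        time.sleep(break_min * 60)  # break period
```

Calling `pomodoro()` runs four standard 25/5 cycles; the durations are parameters so the schedule can be adjusted.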

Another effective rule is "no phone at dinner and beyond". Do not reach for your phone before sleep or as soon as you open your eyes the next morning; try holding off at least until after breakfast. These simple rules go a long way towards keeping our internet usage under control.

While social media can foster a sense of community, it can also crowd out face-to-face interactions, which raises real concerns about physical and mental health. Being conscious of this matters, precisely because we cannot avoid social media altogether.

Simantini Ghosh is an Assistant Professor and PhD Coordinator for the Department of Psychology at Ashoka University.

We publish all articles under a Creative Commons Attribution-NoDerivatives license. This means any news organisation, blog, website, newspaper or newsletter can republish our pieces for free, provided they attribute the original source (OpenAxis). 


To End or Not to End Privacy

Imagine, if you will, a murder. Some letters are found, all written in a strange language. In Conan Doyle’s “The Adventure of the Dancing Men,” it took Sherlock Holmes to decipher such a script and find the murderer.

Inventing a secret language is rather difficult, except that we now have standardized ways to do it: encryption algorithms. Essentially, we have language-inventing software, which can create different languages based on a secret password. If you know the password, you can translate the language back into plain English. Today’s techniques produce incredibly secure ciphers that would leave even Holmes clueless. 
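
To make the "language-inventing software" idea concrete, here is a toy cipher keyed by a password, using only Python's standard library. This is an illustrative sketch of the concept, deliberately insecure; the function names are mine, and real systems use vetted algorithms such as AES.

```python
import hashlib

def keystream(password: str, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the password by chained hashing."""
    out = b""
    block = password.encode()
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def toy_encrypt(password: str, message: str) -> bytes:
    """XOR the message with the password-derived keystream."""
    data = message.encode()
    ks = keystream(password, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

def toy_decrypt(password: str, ciphertext: bytes) -> str:
    """The same XOR undoes the encryption, but only with the right password."""
    ks = keystream(password, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks)).decode()

secret = toy_encrypt("hunter2", "meet at dawn")
print(toy_decrypt("hunter2", secret))  # recovers "meet at dawn"
```

Each password yields a different keystream and hence a different "language"; without the password, the ciphertext is gibberish, which is exactly the property governments find inconvenient.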

This has led governments to try to subvert or weaken cryptography, and every time an atrocity occurs, we hear the same argument anew. Donald Trump has stated that the US should be able to “penetrate the Internet and find out exactly where ISIS is and everything about ISIS.” It was perhaps David Cameron who best articulated this sentiment: “In our country, do we want to allow a means of communication between people, which even in extremis, with a signed warrant from the home secretary personally, that we cannot read? … are we going to allow a means of communication where it simply isn’t possible to do that? My answer is no, we are not.” The justification, of course, is that these powers are needed by “intelligence agencies and security agencies and policing in order to keep our people safe.”

The “deal”, then, is this: You can communicate securely, as long as you make the encryption easy enough for The Government to decipher. This “easy enough” requirement is currently being enforced by various means, including the infiltration and bribery of companies that produce commercial cryptographic software. Many activists and technologists have written about the ethical problems with having a government that is capable of snooping on all of our communications. I argue that legalising this is not only unethical, but operationally impossible.

I am sure you can already spot the problem — if something is easy enough for one person to decipher, then it is easy enough for many others. You cannot have one and not the other, since our government employees are not magically cleverer than their US, Chinese, or Russian counterparts, or the many cyber-criminals that prowl the internet. Broken security renders us vulnerable to anyone with the expertise, not just some government agencies. Mathematical laws care little for the laws of any country.

A commonly proposed solution is for the government to have some kind of “backdoor,” such as a master key. This is difficult to do, both technically and operationally. Given that we have substantial problems implementing and deploying our current (comparatively simple) systems, shifting to such a complicated new technology would inevitably lead to more security holes.

Even if one government has a master key for a certain set of encryption systems, we still have problems. What if the master key gets stolen? We are artificially introducing a critical weakness — such a key would certainly be a prime target for any adversary, and having the key stolen is not a negligible possibility. Over the past few years, hackers have been able to steal everything from the blueprints of the F-35 fighter jet, to financial data from credit rating agencies, to healthcare data from hospitals. Trusting governments with master keys when they haven’t been able to safeguard their own military technology seems like a terrible idea.

Further, if a criminal knows that the government has a master key to software #420, she’s not going to use it. She’ll find a system with no master key (these, of course, already exist). So, the people suffering from weak encryption are mostly going to be law-abiding citizens, who will now be more vulnerable to hackers.

The global nature of the internet adds yet another layer to this. Other governments are not going to sit around and use compromised (from their point of view) communication systems – they’ll build their own software, probably with their own master keys, and stop trusting software made by residents of other countries, essentially creating import control on software. How would multinational companies secure their data? Would they be required to provide keys to every government in the world, or, perhaps a branch of the UN? The creation of a global body to govern these master keys presents a herculean challenge. Further, nothing prevents the governments from adding their own backdoors to subvert that body as well.

Practically every expert in the field believes that subverting cryptosystems (and the bulk surveillance that inevitably accompanies it) is foolish, immoral, and dangerous.

This is why companies like Apple, Facebook, Google, and Microsoft are supporting stronger encryption. Some people who don’t really understand how encryption works have come up with many good reasons for exceptional access backdoors and opined that regulators and legislators must find a way to provide some privacy while allowing law enforcement access. This won’t work. Yes, there are many good reasons for having backdoors (roll-down windows on airplanes might have many advantages), but the numerous fatal problems that they create should have obviated this discussion long ago. Governments should stop trying to build backdoors and support strong, end-to-end security and privacy.

Debayan Gupta is currently an Assistant Professor of Computer Science at Ashoka University, where he teaches a course on security and privacy as well as an introductory programming class. He is also a visiting professor and research affiliate at MIT and MIT-Sloan.
