On the 15th of August, Prime Minister Modi announced the new National Digital Health Mission and its pilot launch in six union territories. According to the policy draft, the mission will radically change healthcare – lowering costs, increasing transparency, and bringing healthcare services even to the most remote corners of the country. Ever since its announcement, however, many have been voicing privacy concerns.
While some believe this will be a revolutionary change to India's healthcare system, making all things healthcare easier than ever before, others argue that such a program is unnecessary. According to the Indian Medical Association, the plan may distract us from real problems such as the lack of sufficient and proper medical infrastructure. The association has also said that strengthening public health infrastructure and addressing social determinants of health should be our priority right now, and that the new plan could divert funds from these issues, further jeopardizing public healthcare. The biggest concern, however, is confidentiality and privacy.
The confidentiality and health privacy of patients is one of the most fundamental principles of healthcare. This is not only to protect the patient from stigma and discrimination but also to establish trust between health providers and patients. Another important reason is the idea of self-ownership and bodily integrity: as rightful owners of our bodies, only we have the right to own our biological data and decide with whom it is shared. The National Digital Health Mission's policy document repeatedly emphasizes its stringent security and consent-based data-sharing guidelines, but many still believe the mission could pose a grave threat to the privacy and security of citizens.
One reason for this is the numerous privacy breaches of the past. With data as sensitive as this, how can we trust that the same will not happen again? Aadhaar, for example, has been the subject of data leaks on far too many occasions, yet there is still no accountability from the government. Furthermore, if our Health IDs are linked to our Aadhaar numbers, the risk of our personal health data being accessed and misused by an unauthorized person or entity increases further. Similarly, with the Aarogya Setu app, there was no transparency on whether collected data was being deleted after patients recovered, and, because the code was not open source, no way to verify how the app functions on the phone.
Another area of concern is the possibility of our health data being sold off to private companies. According to the National Health Data Management Policy, anonymized, de-identified, and aggregated data may be made available to organizations (page 21). In the past year alone, there has been an explosion in the number of cases of big tech and pharma companies collecting personal health data. The NHS was found to be providing data to Amazon, Google bought the health records of 50 million Americans from insurance companies, and the pharmaceutical company GlaxoSmithKline paid the DNA-testing company 23andMe $300 million for customer data. Big companies are chasing after our data, and with the implementation of the National Digital Health Mission, India might become a new market for it.
Big tech companies like Google, Amazon, and Apple are collecting health data to build their own AI health technologies. While these technologies could prove highly useful and beneficial in the future, there are still concerns about whether these companies are using the data ethically. For example, last year, when Google bought personal data from insurance companies and medical institutions in the USA, it emerged that even though those records were stripped of identifying details such as names and contact information, Google could combine them with the information it already held, including personal data collected from smartphones, which could easily establish the identity behind a patient's medical records. This is not just true for big tech companies: research has shown that even anonymized and aggregated information can be easily re-identified. How do we know that big tech companies will use this data only for research, as they claim, and not for profit? Although tech companies say the data will only be used for research, an anonymous whistleblower has claimed that Google does use it to mine patient information, run analytics, and then sell the results to third parties so that healthcare ads can be targeted based on a patient's medical history.
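The re-identification risk described above is often demonstrated with a so-called linkage attack: an "anonymized" dataset that still carries quasi-identifiers (such as PIN code, birth year, and gender) can be joined with a public dataset that contains names. The sketch below is a toy illustration with entirely fabricated data and hypothetical field names, not a description of any real dataset:

```python
# Toy illustration of a linkage attack. An "anonymized" health dataset that
# retains quasi-identifiers (PIN code, birth year, gender) is matched against
# a public directory that contains names. All records here are fabricated.

anonymized_health_records = [
    {"pin": "110001", "birth_year": 1985, "gender": "F", "diagnosis": "diabetes"},
    {"pin": "400002", "birth_year": 1990, "gender": "M", "diagnosis": "asthma"},
]

public_directory = [
    {"name": "Person A", "pin": "110001", "birth_year": 1985, "gender": "F"},
    {"name": "Person B", "pin": "400002", "birth_year": 1990, "gender": "M"},
]

def reidentify(health_records, directory):
    """Match records on the quasi-identifiers both datasets share."""
    matches = []
    for record in health_records:
        key = (record["pin"], record["birth_year"], record["gender"])
        candidates = [p for p in directory
                      if (p["pin"], p["birth_year"], p["gender"]) == key]
        # A unique match means the "anonymous" record is re-identified.
        if len(candidates) == 1:
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymized_health_records, public_directory))
```

When the combination of quasi-identifiers is unique, as it often is in practice, each supposedly anonymous diagnosis is linked straight back to a name, which is why removing names and contact details alone is not considered sufficient de-identification.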
The truth is that there is no guarantee of how this data will be used. Although personalized healthcare ads might be a relatively minor issue, we cannot even comprehend how this data could be used, and what harm it could cause, in the worst-case scenarios. The Cambridge Analytica scandal is one example of how our data was used against us to sway elections. With sensitive health data involved and big tech companies beginning to work on AI-based healthcare, the implications of possible data misuse could be even graver.
Besides big tech companies, pharmaceutical and insurance companies are chasing after our medical data. Pharmaceutical companies have been found to use digital record-keeping systems in hospitals to gather information and use it to sell drugs. There are also findings that insurance companies are beginning to gather data on race, marital status, how much TV you watch, and even the size of clothing you wear. With our entire medical history available, insurance companies would have more power than ever before: patients could be denied health coverage or charged higher premiums based on their medical history or even their genetic data.
While it is true that digitizing India's healthcare ecosystem could help streamline healthcare, the privacy concerns are difficult to ignore. Along with bringing transparency and accessibility, it also opens new doors for data misuse and possible stigma and discrimination. Without a data protection law and robust technological infrastructure in place to protect our data, the implementation of this project is dangerous.
Aradhya is a psychology major at Ashoka University. In her free time you’ll find her reading books, drinking chai and cycling at odd hours.
We publish all articles under a Creative Commons Attribution-Noderivatives license. This means any news organisation, blog, website, newspaper or newsletter can republish our pieces for free, provided they attribute the original source (OpenAxis).