I used my phone camera to take a better picture of the image inside it. This allowed me to draw conclusions about the person’s age and social standing. As a journalist with access to modern technology, I am confident I could identify this individual. However, pursuing such an investigation solely to illustrate this article and highlight the importance of our data would be borderline intrusive.
Please don’t mistake my enthusiasm; I deeply value investigative journalism and would gladly immerse myself in it under different circumstances. However, since I’m using this anecdote solely to illustrate my story on privacy, I’ll stop the investigation there and encourage you to weigh my insights for yourself.
I can’t say for sure, but I’d bet the person in the photo had no privacy concerns when the picture was taken. It’s improbable that they ever imagined this View-Master, bearing their image, would find its way to a journalist’s home in Berlin, Germany, by 2024, let alone feature in an article discussing digital health data.
Did the merchant who sold me this device obtain the consent of the person in the photograph to distribute it? Since I paid for the View-Master, do I now have rights over this person’s image? Do we really understand the value of the data we generate about ourselves today, and the value it will have in the future?
These questions have been swirling in my mind since I purchased this device. Now, as I focus on reviewing wearable devices like smartwatches, chest straps, and even Peloton treadmills, I find myself wondering if we are truly prepared to share our most valuable information with companies: our health data.
To be clear, I’m not a specialist in privacy and security, so I reached out to my network and gathered insights from professionals working directly in the wellness and health industry. Moreover, this is not an attempt to educate anyone on the importance of data protection; to be honest, the deeper I dive into the topic, the less I understand.
However, I do believe there is value in understanding the information we produce about ourselves and how it can be used to benefit us or cause damage. So, here, I’ll share with you some best practices to ensure you’re protecting your health data while still generating useful metrics about yourself.
To raise awareness among our audience about health data privacy and security, I asked two specialists a simple question: If you could offer just one piece of advice to people about protecting their health data, what would it be? I spoke with Martha Dörfler, a developer and maintainer at the Berlin-based Drip Collective, which prioritizes privacy in its open-source menstrual cycle tracking app, Drip; as well as with Vincent Chartier, a cybersecurity engineer at Withings, a company I trust with my personal data.
Tip #1: Do not fall for the “take it or leave it” model
You may have already noticed that many companies try to shift the responsibility for our data onto us, the users. Nowadays, however, privacy laws protect people in most places.
For example, nextpit’s audience is mainly from the United States and the European Union, two major markets where health privacy is taken seriously. In the US, health data protection is governed by HIPAA (Health Insurance Portability and Accountability Act), and in the EU, it’s protected under GDPR (General Data Protection Regulation).
Additionally, international companies serving US or EU citizens must follow those local laws, and since compliance is often implemented globally, even people in regions like Latin America, India, and China can benefit from these privacy protections.
What I want to emphasize is that you shouldn’t have to accept anything that feels invasive to use a service. Forget the idea that you must ‘take it or leave it.’ Instead, exercise your right to protect your personal data: report issues, voice complaints, and push companies to change their policies. Remember, rights only exist if we assert them when necessary.
But where to begin? A great starting point for me to understand the current state of privacy issues was engaging with the ‘None of Your Business’ (NOYB) organization. Founded by Max Schrems, NOYB works to ensure that companies adhere to regulations protecting your personal information online. They employ clever strategies and collaborate with various groups to safeguard your privacy, especially in Europe.
In the United States, there isn’t an organization directly comparable to NOYB in terms of visibility and specific focus. However, organizations such as the Electronic Frontier Foundation (EFF) and the American Civil Liberties Union (ACLU) fulfill similar roles. They advocate for privacy rights and undertake legal actions to defend those rights.
Tip #2: Don’t be lazy; read the privacy policy
According to Vincent Chartier, the Withings cybersecurity specialist, the most important thing you can do is read the privacy policy of a service before using it:
If you find the privacy policy too complex to understand, it should raise concerns about the company’s privacy practices.
Yes, you should go through the entire privacy policy. To help, Vincent shared some key questions to focus your attention on while reading; they are critical for identifying potential red flags and understanding how your data is managed and protected. Here is a cheat sheet:
- How does the company collect and share your data?
- For what purposes?
- Who actually hosts the data?
Allow me to break this down further with two examples.
Example 1: Understanding consent
Meta and its business model: Recently, Meta introduced the ‘Pay or Okay’ system in the EU in order to ‘comply with GDPR.’ The company now offers users a choice: allow Meta to use their online information for business purposes, or pay to keep that information private. This notable option could change how online privacy is managed and raises questions about its fairness; Meta’s decision might lead other companies to turn online privacy into a paid service.
Meta is evidently manipulating the concept of consent. According to Vincent, understanding consent is the key here, especially for navigating data privacy and protection:
Consent can only exist when it’s unambiguous. You can give it freely, meaning you know exactly what’s behind this consent, what process is behind this consent.
Let’s now shift the focus from privacy in general to health data. As we digitize health records, use wearables and fitness apps, and integrate technology into healthcare, securing our personal health information has emerged as a top concern. Moreover, according to the IDC Data Age report, approximately 30% of the world’s data volume is generated by the healthcare industry, and that share is growing at an annual rate of 36%.
This shift toward digital health records and tech-based health monitoring tools has undoubtedly made healthcare services more efficient and accessible. We can now track health metrics, access medical records from anywhere, and interact more effectively with medical professionals. Take my menstrual cycle history as an example: when I’m at the gynecologist, I can easily export it from my iPhone’s Health app as a PDF document so they can analyze it in a matter of seconds.
Nonetheless, this convenience imposes a significant obligation on technology companies, healthcare providers, and patients to safeguard vital health information. Health data is incredibly personal and sensitive, containing details we might not share even with those closest to us. Should this data fall into the wrong hands, it could lead to privacy violations, identity theft, and discrimination in employment or insurance.
And this is why consent is paramount here. This means ensuring that our information is shared and used only with our explicit agreement!
In the digital age, the concept of consent has become a cornerstone of data protection and privacy. Under GDPR, consent isn’t just a formality; it’s a significant and complex requirement designed to empower individuals over their personal data.
Consent must be freely given, specific, informed, and unambiguous, which means individuals must actively ‘opt-in’ for their data to be processed. This approach seeks to eliminate any ambiguity, ensuring that consent is a clear affirmative action that cannot be derived from silence, pre-ticked boxes (opt-out), or inactivity.
Moreover, GDPR emphasizes that individuals have the right to withdraw their consent as easily as they gave it, reinforcing the principle that consent is not a one-time notification, but an ongoing choice.
This shift not only places more power in the hands of the individual, making consent a genuine choice, but also challenges organizations to adopt transparent and user-friendly practices, marking a significant step forward in data privacy and protection.
This means companies cannot merely present a series of options within a lengthy paragraph that explains (or fails to explain) the reasons for data collection, sharing, and storage. They must break it down into options that are easy to read and understand, requiring simple ‘yes’ or ‘no’ responses.
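To make this concrete, here is a minimal sketch of what that kind of granular, opt-in consent could look like as a data structure. This is my own illustration, not any company’s actual implementation, and the purposes listed (“sync_to_cloud” and so on) are hypothetical. Notice that every option defaults to ‘no’, each purpose must be opted into individually, and withdrawing is as easy as granting:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical purposes a health app might request consent for.
PURPOSES = ("sync_to_cloud", "share_with_partners", "personalized_ads")

@dataclass
class ConsentRecord:
    """GDPR-style consent: explicit opt-in, per purpose, revocable."""
    # Every purpose starts as False: no pre-ticked boxes, no opt-out defaults.
    choices: dict = field(default_factory=lambda: {p: False for p in PURPOSES})
    history: list = field(default_factory=list)  # audit trail of every change

    def opt_in(self, purpose: str) -> None:
        self._set(purpose, True)

    def withdraw(self, purpose: str) -> None:
        # Withdrawing must be as easy as granting: one call, same mechanism.
        self._set(purpose, False)

    def _set(self, purpose: str, granted: bool) -> None:
        if purpose not in self.choices:
            raise ValueError(f"Unknown purpose: {purpose}")
        self.choices[purpose] = granted
        self.history.append((datetime.now(timezone.utc), purpose, granted))

    def allows(self, purpose: str) -> bool:
        # Silence or inactivity never counts as consent.
        return self.choices.get(purpose, False)

consent = ConsentRecord()
print(consent.allows("personalized_ads"))  # False until explicitly granted
consent.opt_in("sync_to_cloud")
consent.withdraw("sync_to_cloud")
```

The design choice worth noticing is the default: nothing is processed until the user actively says so, mirroring GDPR’s ban on pre-ticked boxes and inferred consent.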
However, as emphasized by Vincent Chartier, this is where companies tend to push the boundaries to the limit. To avoid future nightmares, he suggests digging into a company’s background to understand its so-called legitimate interest (at Meta, for instance, the business model is advertising):
There are a number of legal bases applicable to personal data. The best known is consent. Legitimate interest represents another legal basis that allows companies to process data without obtaining consent. It’s a challenging path because it somehow bypasses the need for consent. […] Unfortunately, companies often exploit this because they anticipate little to no pushback from regulatory authorities, given the vast number of businesses involved. Therefore, to protect your privacy, it’s crucial to comprehend and adeptly navigate issues like the misuse of legitimate interest. For example, it seems quite unfair for companies to share my data based on legitimate interest.
Example 2: Understanding where your data is being processed
Consider the rules applicable to your data: you may be using a service from a German company, but the data might be stored by a US company, such as Amazon’s AWS, the leading cloud provider. Now, who actually hosts the data?
Cloud storage is at its peak right now, so much so that the IDC Data Age report predicts that by 2025, 49% of the world’s stored data will reside in public cloud environments such as Google Cloud Platform, Amazon Web Services, and Microsoft Azure.
Vincent explains that “at its core, cloud computing involves running programs on someone else’s computer, which immediately presents issues of control and security.” So it’s clear why there is notable concern about these services’ compliance with European regulations, especially GDPR. And don’t get me wrong: people in the US face the same issues with European cloud companies; however, since those companies have a smaller market share, the issue rarely receives the same attention.
So, as highlighted by the Withings cybersecurity specialist, “the dominance of American companies in the cloud market brings legal and privacy concerns.” US laws such as the CLOUD Act allow American authorities to access data stored by these companies regardless of the data’s physical location, which directly conflicts with GDPR principles.
But how does your health data fit into all of this?
In the realm of health data security, technological measures play a pivotal role in safeguarding user information against unauthorized access, breaches, and cyber threats. The keyword here is ‘on-device processing.’
When we say health data is processed “on-device,” it means that everything from gathering to analyzing your health information happens right on your personal gadget, like a smartwatch or smartphone, rather than being sent off to a faraway server or the cloud for processing. This is a big plus for keeping your health details private and secure: the information stays with you, cutting down the risk of it falling into the wrong hands or being mishandled.
On top of boosting privacy, handling health data on-device can make things work faster and more smoothly, especially for apps that need up-to-the-minute health updates to function correctly. It’s also handy when you’re out of range of good internet service, letting these apps keep running without interruption.
Plus, since the data doesn’t have to travel back and forth over networks, it could mean your device’s battery will last longer. However, doing all this right on the device might mean the device needs to be a bit more powerful to manage these tasks, which could be a trade-off in some cases.
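To illustrate the difference, here is a simplified sketch of what on-device processing means in practice. The sample numbers are made up and this is my own illustration, not any vendor’s actual code: the raw minute-by-minute readings never leave your device, and only a small derived summary would ever be a candidate for syncing, ideally with your explicit consent.

```python
from statistics import mean

# Hypothetical raw sensor data: one heart-rate sample per minute (bpm).
raw_samples = [62, 65, 71, 118, 142, 135, 90, 74, 68, 63]

def summarize_on_device(samples: list[int]) -> dict:
    """Reduce raw readings to a coarse daily summary, locally.

    The raw minute-by-minute trace stays on the device; only this
    small aggregate would be a candidate for syncing, and even that
    should require explicit consent.
    """
    return {
        "resting_hr": min(samples),
        "max_hr": max(samples),
        "avg_hr": round(mean(samples)),
        "samples_kept_local": len(samples),
    }

summary = summarize_on_device(raw_samples)
print(summary)
# A cloud-first design would instead upload `raw_samples` wholesale,
# moving control and security to someone else's computer.
```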
In conclusion, data security cannot be ignored; when it is, privacy suffers.
Tip #3: Be aware of the laws in your region
As a developer of a menstrual cycle application, Martha Dörfler emphasizes the importance of understanding the data you create about yourself and, even more importantly, how that data can and will be used by companies and public institutions. She believes that the key to keeping your health data private and secure is, above all, being aware of the laws in your region.
It greatly depends on the person’s threat model. For example, if you reside in a country where abortion is highly illegal, you need to consider different alternatives for collecting and storing data about yourself.
As you can see, the implications extend beyond the laws surrounding your privacy. The data you generate about your health could also become evidence of violating more serious laws. This concern isn’t limited to abortion, which is legal in some US states but not in others; it also applies, for instance, to the use of illegal substances in the context of health issues and fitness.
Example 3: Collecting data means creating facts
Police testing women for abortion drugs: In 2023, the news outlet Tortoise Media reported that British police were testing women for abortion medication and requesting information from period tracking apps following unexplained miscarriages. So, depending on where you live, even something as simple as recording your menstrual cycle could potentially be used against you.
As stated initially, privacy regulations vary between Europe and the US. In Europe, there is significant distrust of companies, whereas Americans are primarily worried about government intrusion into their privacy. So, as emphasized by Martha, what you share should also be weighed against the culture you are part of and the region you are in.
In the United States, HIPAA is fundamental to health data security laws, setting strict privacy and security standards for protected health information (PHI). This affects healthcare providers, tech companies like Apple and Google, and users. Compliance is crucial for any entity handling PHI, including health records and billing information.
For tech companies entering the health sector—via wearables, apps, or data management—HIPAA compliance is crucial for handling U.S. citizens’ health data. This requires protecting user privacy and ensuring patient data’s confidentiality, integrity, and availability across all data handling stages.
Apple, for instance, has delved deeply into health with its Health app, Apple Watch, and other health-related technologies and services. The same goes for Google with Health Connect, the Pixel Watch, and Fitbit. These products and services are designed with privacy and data security at their core, often emphasizing a commitment to user privacy that aligns with HIPAA’s objectives.
When Apple’s or Google’s products and services handle PHI, they must ensure HIPAA compliance to protect this sensitive information from breaches and unauthorized access. The same compliance requirement applies to European citizens’ data, but under the GDPR.
What if you’re not covered by HIPAA or the GDPR? To understand how your data is handled where you live, I suggest consulting the Morrison Foerster Privacy Library. This comprehensive tool provides access to privacy laws, regulations, reports, multilateral agreements, and details from government authorities for over 150 countries worldwide.
Why should I care?
Where I live, privacy falls under the GDPR, and there is no hierarchy among types of health data. However, the protection of health information is taken very seriously due to its highly private nature. After all, the EU introduced these rules to safeguard our most private details.
As I begin to take a more serious approach to digital health in both the private and professional aspects of my life, my concern about generating data about myself (and, at a different scale, about proposing that others do or don’t do the same) has reached a peak that inspired me to write this article.
I want to access and understand my health information to make better life choices, but keep my data private. I don’t want my information used for ads or product segmentation, fed into AI training (LLMs, for example), or left at the mercy of cyberattacks. These are my rights, and I believe they should be yours too.
The good news is that most companies in the health and wellness sector, especially those we discuss at nextpit, understand data protection and comply with regulatory standards. When there are lapses, privacy organizations hold these companies accountable.
For example, Drip is an open-source app with privacy as a central part of its philosophy, as shown in its privacy policy. Companies like Withings carefully choose their partners to ensure data processing meets European laws.
Similarly, in the U.S., firms like Apple focus on processing data on-device, a practice also adopted by Google, although its business model introduces some complexities. Brands like Amazfit process data on-device too and, in Germany, host their data servers within the country, enhancing privacy.
Last but not least, protecting our health data from misuse begins with understanding its value and responsibly choosing whom to share it with. It’s essential to hold companies accountable for how they handle our data. Failing to do so could see our information lost in the vast global data market, a situation no one wants, and one that can begin with something as innocuous as a View-Master purchase.