Healthcare exchanges in four states shared users' sensitive health data with companies like Google
The exchange websites ask users to answer a series of questions, including some about their health histories, to find them the most relevant information on plans. But in some cases, when visitors responded to sensitive questions, invisible trackers sent that information to platforms like Google, LinkedIn and Snapchat.
The Markup and CalMatters audited the websites of all 19 states that independently operate their own online health exchange. While most of the sites contained advertising trackers of some kind, The Markup and CalMatters found that four states exposed visitors' sensitive health information.
Nevada's exchange, Nevada Health Link, asks visitors what prescriptions they use, including the names and dosages of the drugs, to help them find their best options for health insurance. When visitors start typing, it suggests specific medications, including antidepressants, birth control and hormone therapies.
As visitors answered the questions, their responses were sent to LinkedIn and Snapchat, according to tests conducted by The Markup and CalMatters in April and May.
When an individual indicated that they took Fluoxetine, commonly known as Prozac, on Nevada Health Link, the information was sent to LinkedIn. (The Markup/CalMatters)
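The mechanics behind that kind of leak are unremarkable: tracking pixels are just JavaScript loaded into the page, and anything a form handler can read, that JavaScript can forward. The sketch below is a hypothetical illustration, not Nevada Health Link's actual code; the field ID, event names and conversion ID are invented, and the lintrk and snaptr calls stand in for whatever the exchange's real tag configuration did.

```typescript
// Hypothetical sketch -- not Nevada Health Link's actual code -- of how a
// prescription autocomplete wired to ad-platform pixels can leak what a
// visitor types. The field ID, event names and conversion ID are invented.

// Stand-ins for the globals that the LinkedIn and Snap tag scripts expose
// once a page has loaded them; the real tags differ in their details.
declare function lintrk(action: string, payload: Record<string, unknown>): void;
declare function snaptr(action: string, event: string, payload?: Record<string, unknown>): void;

const drugInput = document.querySelector<HTMLInputElement>("#prescription-search");

if (drugInput) {
  drugInput.addEventListener("change", () => {
    const drugName = drugInput.value; // e.g. "Fluoxetine 20mg"

    // First-party use the visitor expects: match the drug against available plans.
    void fetch("/api/plan-matches?drug=" + encodeURIComponent(drugName));

    // The privacy problem: the same value is copied into tracking events, which
    // travel to the platforms alongside cookies and device identifiers.
    lintrk("track", { conversion_id: 123456, drug: drugName });
    snaptr("track", "CUSTOM_EVENT_1", { item_category: drugName });
  });
}
```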
On the other side of the country, Maine's exchange, CoverME.gov, sent information on drug prescriptions and dosages to Google through an analytics tool. It also sent the names of doctors and hospitals that people had previously visited.
Rhode Island's exchange, HealthSource RI, sent prescription information, dosages, and doctors' names to Google.
Massachusetts Health Connector, another exchange, told LinkedIn whether visitors said they were pregnant, blind, or disabled.
After being contacted by The Markup and CalMatters, Nevada's health exchange stopped sending visitors' data to Snapchat, and Massachusetts stopped sending data to LinkedIn. Additionally, The Markup and CalMatters found that Nevada stopped sending data to LinkedIn in early May, while our testing was still under way.
The Markup and CalMatters discovered the sharing after finding that California's exchange, Covered California, had been sending visitors' sensitive health information to LinkedIn.
Experts said state health exchanges' use of advertising trackers was troubling if not entirely surprising. Such tools can help organizations to reach visitors and tailor ads for them. Google Analytics allows website operators to better understand who is coming to their site and to optimize ad campaigns. The LinkedIn and Snap trackers, like a similar offering from Meta, help companies target their social media ads.
Nevada uses the trackers to help target marketing at uninsured residents, according to Russell Cook, executive director of the Silver State Health Insurance Exchange, the state agency that operates Nevada's exchange.
But health care services need to be especially careful with those tools, said John Haskell, a data privacy attorney who has previously worked as an investigator for the Department of Health and Human Services.
'It doesn't surprise me that organizations that have these massive tech stacks that rely on third-party resources don't have a full understanding of what the configuration is, what the data flows are, and then once they go to somebody, what that data is being used for,' Haskell said. 'It's something that needs to be addressed.'
How was state exchange data tied to users' identities?
After reporting on Covered California, The Markup and CalMatters examined websites operated by 18 states other than California, as well as Washington, D.C., to see what information they shared as users navigated them. The sites were established under the Affordable Care Act, which requires states to offer health insurance either through their own exchanges or through one operated by the federal government.
To test them, we first ran the sites through Blacklight, The Markup's automated privacy-inspection tool.
The results showed that 18 used some sort of tracker. Some were filled with them; Nevada, for example, used nearly 50. By contrast, Blacklight found no trackers of any kind on Washington, D.C.'s exchange. Popular websites use seven trackers on average, according to Blacklight scans.
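Blacklight is The Markup's real scanning tool; the sketch below is not Blacklight itself but a much simplified illustration of the same idea, assuming Playwright as the headless browser, a short hand-picked list of tracker hostnames, and a placeholder URL. A real scan also looks for behaviors like session recording and fingerprinting, not just known domains.

```typescript
// Greatly simplified sketch of a Blacklight-style scan (not The Markup's
// actual tool): load a page in a headless browser and list any network
// requests that go to a few well-known tracker domains.
import { chromium } from "playwright";

const TRACKER_HOSTS = [
  "google-analytics.com",
  "googletagmanager.com",
  "ads.linkedin.com",
  "tr.snapchat.com",
  "connect.facebook.net",
];

async function scan(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  const hits = new Set<string>();

  // Watch every outgoing request the page makes, including ones fired by
  // third-party scripts, and record any tracker hostnames we recognize.
  page.on("request", (request) => {
    const host = new URL(request.url()).hostname;
    if (TRACKER_HOSTS.some((tracker) => host.endsWith(tracker))) {
      hits.add(host);
    }
  });

  await page.goto(url, { waitUntil: "networkidle" });
  console.log(`${url}: ${hits.size} tracker host(s)`, [...hits]);
  await browser.close();
}

scan("https://exchange.example.gov").catch(console.error); // placeholder URL
```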
Many of the sites used trackers in relatively innocuous ways, like counting page views.
The four exchanges we found sharing sensitive health data gave varied responses to questions about the tracking.
Cook said in a statement that trackers placed by his Nevada agency were 'inadvertently obtaining information regarding the name and dosage of prescription drugs' and sending it to LinkedIn and Snapchat.
Cook acknowledged such data was 'wholly irrelevant to our marketing efforts' and said the agency had disabled the tracking software pending an audit.
Jason Lefferts, a spokesperson for Massachusetts Health Connector, said in a statement that 'personally identifiable information is not part of the tool's structure and no personally identifiable information, not even the IP addresses of users of the tool, has ever been shared with any party in any way via this tool.' But LinkedIn's own documentation says its tracking tag collects metadata such as IP addresses, timestamps, and page views.
Spokespeople for the Rhode Island and Maine health exchanges said that they pay a vendor, Consumers' Checkbook, to run a separate site that allows visitors to explore what plans are available to them through their states' exchanges. It was from these sites that sensitive information was shared with Google. Consumers' Checkbook's sites sit at different web addresses than the exchange sites, but they are prominently linked from the exchange sites and carry identical branding, including the state health exchange's logo, making it unlikely that an average visitor would realize they were no longer on a state-run domain.
Christina Spaight O'Reilly, a spokesperson for HealthSource RI, said the exchange uses Google Analytics to study trends but not to serve ads, and that it 'disables Google Signals Data Collection, ensuring that no data is shared with Google Ads for audience creation or ad personalization, and no session data is linked to Google's advertising cookies or identifiers.' HealthSource RI's terms of use mention the use of Google Analytics, she noted. A spokesperson for CoverME.gov made similar points, saying that the agency 'does not collect or retain any data entered into the tool.'
When an individual selected a doctor on HealthSource RI, the doctor's name was sent to Google Analytics. (The Markup/CalMatters)
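For illustration, here is roughly how an analytics event can end up carrying a provider's name. The event name and parameters below are invented, not HealthSource RI's or Consumers' Checkbook's actual configuration; gtag is the function Google Analytics' standard snippet exposes, and it forwards whatever parameters a page attaches.

```typescript
// Hypothetical GA4 event -- invented names, not HealthSource RI's real
// configuration. gtag() forwards whatever parameters a page attaches, so a
// free-text value like a provider's name rides along with the analytics hit.
declare function gtag(...args: unknown[]): void; // provided by the GA snippet

function onDoctorSelected(doctorName: string): void {
  gtag("event", "provider_selected", {
    provider_name: doctorName, // e.g. "Dr. Jane Smith" -- sensitive in this context
    page_location: window.location.href,
  });
}
```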
Consumers' Checkbook declined to comment beyond the exchanges' statements.
All of the exchanges said that individually identifiable health information, like names and addresses, wasn't sent to third parties. But the point of such trackers is to combine the information sent about a user with data the platforms already hold on that user, and every tracker found by The Markup and CalMatters logged details about individual visitors, such as their operating system, browser, device, and times of visit.
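As a rough illustration, these are the kinds of signals a tracking script can read from any browser and attach to its requests; the object below is illustrative only, not taken from any exchange's code.

```typescript
// Illustration of the per-visit metadata any tracking script can read from
// the browser and attach to its requests -- enough to help match hits to an
// existing profile even when no name or address is sent.
const visitContext = {
  userAgent: navigator.userAgent,                  // browser, OS and device family
  language: navigator.language,
  screen: `${window.screen.width}x${window.screen.height}`,
  timezoneOffsetMinutes: new Date().getTimezoneOffset(),
  visitedAt: new Date().toISOString(),
  pageUrl: window.location.href,                   // may itself contain answers or queries
};
console.log(visitContext);
```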
In response to requests for comment, the tech companies whose trackers we examined uniformly said they do not want organizations sending them potentially sensitive health data, and that doing so is against their terms of use.
Steve Ganem, director of product management for Google Analytics, said that 'by default any data sent to Google Analytics does not identify individuals, and we have strict policies against collecting Private Health Information or advertising based on sensitive information.' A spokesperson for LinkedIn, Brionna Ruff, said that advertisers are not allowed 'to target ads based on sensitive data categories,' such as health issues. A spokesperson for Snapchat's parent company, Snap, said the same, noting that sending information about purchases such as prescriptions would run afoul of the company's rules about sensitive data.
Google's documentation for Analytics also puts responsibility on website operators. 'It is important to ensure that your implementation of Google Analytics and the data collected about visitors to your properties satisfies all applicable legal requirements,' the page reads.
More incidents
State exchanges aren't the only health sites that have sent medical information to social media companies.
In 2022, The Markup found that a third of the country's top hospitals had installed the Meta Pixel on their websites, sending patients' sensitive health information to Facebook.
In 2023, a New York hospital agreed to pay a $300,000 fine for violations of the Health Insurance Portability and Accountability Act, or HIPAA.
In response to a series of incidents, the U.S. Department of Health and Human Services' Office for Civil Rights issued guidance warning that the use of online tracking tools by HIPAA-covered entities can violate the law.
Some plaintiffs have used state laws, like those in California, to argue that they should be compensated for having their health data sent to third parties without consent. Others have argued that this kind of tracking runs afoul of federal and state wiretapping laws.
'Organizations aren't investing enough time and resources into properly vetting everything,' said Haskell, who advises clients to be very careful about the information they track on their sites. 'When organizations are saying, 'we didn't understand that there's a certain configuration of this tool that we're using,' well, I can't really not put that on you.'
