Why your dreams deserve privacy: the case for on-device AI

I study sleep for a living. When I looked into what happens to dream data in most apps, I started paying a lot more attention to where mine goes.

Jacob Lowe · Founder of Sandman

Published on Apr 17th · 6 min read

When I was building Sandman, one of the first decisions I had to make was whether to process dream content in the cloud or on the device. To answer that, I started reading privacy policies for every dream app I could find. What I found was unsettling enough that it shaped the entire architecture of what I was building.

I'd been keeping a dream journal for years at that point. But I'd never seriously considered what would happen if someone else got access to it.

Dream data isn't sleep data

My sleep tracker knows when I went to bed, how long I was in REM, and how many times I woke up. That's biometric data, and it's already sensitive enough that people worry about it. But dream content is different.

A dream journal is basically everything your brain does when you're not filtering it. Fears you haven't told anyone about. People you're angry at. Situations that stress you out. Sexual content. Grief. The weird symbolic stuff your brain generates when the prefrontal cortex goes quiet and the emotional centers take over.

The research I've read lines up with this. Dieter Riemann and colleagues at the University of Freiburg found that dream content in depressed patients mirrors their waking emotional state. Negative dreams, hopeless themes, the same heaviness during the day showing up at night. Matthew Walker's work at UC Berkeley shows that REM sleep is basically emotional triage, stripping the charge off difficult memories so you can function the next morning. And a 2024 study in Scientific Reports gave the first real empirical evidence that dreaming actively participates in that process.

Your dream journal is a running log of what your brain considers emotionally important. That's not the same as your step count.

I looked at what apps do with this data

When I started digging into the privacy policies of dream journal apps, I expected boilerplate. What I found was worse.

Of the dozen or so policies I read, several apps collect your dream descriptions, mood tags, and usage patterns, then share "anonymized and aggregated" data with third-party partners, researchers, or companies. Some share data with advertising platforms. That phrasing, "anonymized and aggregated," sounds reassuring until you realize that dream content is inherently identifying. If I write about sprinting through an airport and missing my flight, that's not generic data. That's my specific recurring dream, tied to my specific anxiety about deadlines.

The health app track record is bad. In 2023, the FTC took action against BetterHelp for sharing therapy intake data, including answers to mental health questionnaires, with Facebook and Snapchat for ad targeting; BetterHelp paid $7.8 million. The FTC also went after Flo, the period-tracking app, for sending health data to Facebook's and Google's analytics services after promising users their data was private. Across 2023, the FTC reached settlements totaling nearly $9.5 million with health apps that had been sharing sensitive data, changing privacy policies retroactively, or violating the Health Breach Notification Rule.

Dream journals weren't part of those cases. But the data they hold is at least as personal as a therapy questionnaire or a menstrual cycle log.

The data broker problem

This is the part that made me genuinely uncomfortable. A researcher named Joanne Kim at Duke University's Sanford School contacted 37 data brokers and asked to buy mental health data. Twenty-six responded. Eleven were willing to sell.

The data included lists of people categorized by condition: depression, anxiety, ADHD, bipolar disorder, insomnia. Sorted by age, zip code, gender, net worth. Some brokers charged as little as $275 for 5,000 records. Others charged $75,000 to $100,000 for annual access. The buyers faced minimal vetting, and Kim found few apparent controls on what happened to the data after purchase.

None of this requires a data breach. It's just the normal flow of information in an industry where personal data is a product.

A dream journal has all of that and more. Anxiety themes, depressive content, trauma responses, sleep disturbances. Run that through any classification model and you'd have a mental health profile more detailed than what those data brokers are selling.

What on-device AI actually means

When an app says it uses "AI" to analyze your dreams, there's really only one question worth asking: where does that processing happen?

Cloud-based AI means your dream text leaves your phone, travels to a server, gets processed by a third-party system, and depending on the service, may be stored, logged, or used for model training. Even apps that promise encryption in transit still decrypt your data on their servers to run the analysis. Your dream about your ex or your dead parent lives on someone else's computer now.

On-device AI means the model runs locally on your phone. Your dream text never leaves. There's no server to breach and no third party with a copy. If the company goes under or changes its privacy policy, your data isn't sitting in some S3 bucket waiting to be dealt with.

Forty percent of organizations have already reported an AI-related privacy incident. Regulators enforcing the GDPR are starting to question whether sending personal data to cloud AI systems without explicit consent is legal at all. And once your dream text sits on a server, it's subject to whatever the company decides to do with it later: sell it, get acquired, change the privacy policy, comply with a subpoena.

On-device processing avoids all of that. The tradeoff is that the models are smaller than what runs in the cloud. But for dream journaling, you don't need GPT-scale reasoning. You need something that can spot recurring themes and tag emotions over weeks and months. A modern phone handles that fine.
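To make "spot recurring themes and tag emotions" concrete, here is a deliberately simplified sketch. A real app would run a small local language model rather than keyword matching, and the theme lists below are my own invention, but the key property is the same: every line runs locally, and nothing touches a network.

```python
from collections import Counter
import re

# Illustrative theme keywords. A real on-device app would use a small
# local model instead; the point is that no entry leaves the phone.
THEMES = {
    "anxiety": {"late", "missing", "exam", "chased", "falling"},
    "grief": {"funeral", "goodbye", "gone", "grave"},
    "work": {"office", "boss", "deadline", "meeting"},
}

def tag_entry(text: str) -> set[str]:
    """Return the themes whose keywords appear in one journal entry."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {theme for theme, keys in THEMES.items() if words & keys}

def recurring_themes(entries: list[str], min_count: int = 2) -> dict[str, int]:
    """Count themes across entries and report the ones that recur."""
    counts = Counter(t for e in entries for t in tag_entry(e))
    return {t: n for t, n in counts.items() if n >= min_count}

journal = [
    "Sprinting through an airport, missing my flight again.",
    "Back in school, late for an exam I never studied for.",
    "A quiet dream about my grandmother's funeral.",
]
print(recurring_themes(journal))  # {'anxiety': 2}
```

Even this toy version shows why local processing is enough for journaling: the value is in patterns over weeks, not in any single entry, and pattern counting is cheap.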

Why I changed how I journal

I'd spent months reading about what apps do with personal data before building Sandman, and I still hadn't thought to check what the dream apps were doing.

When I finally looked, I didn't like what I found. Not because any specific app had done something terrible, but because the default across the industry is to send your most private thoughts to a server and hope the company's incentives stay aligned with yours. That's a bet I'm not comfortable making with data this personal.

I built Sandman around on-device AI specifically because of what I found. The dream entries don't leave your phone. I'm obviously biased, so take that as context rather than a pitch. But the privacy architecture is the reason I built it this way, and it's the thing I'd check first in any dream journal app.

What to look for

If you're using a dream journal app, or thinking about starting, here's what I'd actually check:

Where does the AI processing happen? On your phone, or on a server? If it's on a server, your dream text is leaving your device.

Does the app work offline? If it can't function without an internet connection, your data is going somewhere.

Read the privacy policy. Not the summary, the full thing. Look for "anonymized and aggregated data," "third-party partners," and "may share." Those are the phrases that mean your data isn't staying put.

Check if the app has been updated recently. Privacy policies change. A company that was privacy-focused in 2024 might have a new business model by 2026.
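The phrase-hunting step above can be sketched in a few lines. This is purely illustrative: the phrase list and the helper are my own, and no keyword scan substitutes for actually reading the policy.

```python
# Scan a saved privacy-policy text for the hedging phrases mentioned
# above. Illustrative only; a match means "read this section closely."
RED_FLAGS = [
    "anonymized and aggregated",
    "third-party partners",
    "third party partners",
    "may share",
]

def flag_phrases(policy_text: str) -> list[tuple[str, int]]:
    """Return each red-flag phrase found, with its occurrence count."""
    lower = policy_text.lower()
    hits = [(p, lower.count(p)) for p in RED_FLAGS]
    return [(p, n) for p, n in hits if n > 0]

sample = (
    "We may share anonymized and aggregated data "
    "with third-party partners for research."
)
print(flag_phrases(sample))
# [('anonymized and aggregated', 1), ('third-party partners', 1), ('may share', 1)]
```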

I'm not saying cloud-based apps are all bad. Some handle data responsibly. But with dream content, the stakes are higher than with most personal data, and the default should be that nothing leaves your device unless you decide otherwise.

My journal has hundreds of entries. I'd rather keep that on my phone.

About the Author

Jacob Lowe

Founder of Sandman

Jacob is a web developer with over a decade of experience in the field. His passion for coding and open-source technologies drives his desire to create and innovate. He believes that through technology, we have the power to increase access to new experiences and make a positive impact in the world. At the heart of his work lies a love for nature and the beauty of the natural world. He finds solace in the stillness of nature and the abstractions of code.

Remember your dreams