The Anxiety of Optimization: Is Your Smartwatch Making You Sick?


I woke up the other morning feeling surprisingly refreshed. The Pasadena sun was streaming in, the birds were loud, and I felt ready to tackle a new Python script I’d been working on.

Then, I made a mistake. I looked at my wrist.

According to the sleek black sensor strapped to my arm, my “Readiness Score” was a dismal 45. My “Sleep Efficiency” was down 12% from my baseline. My Heart Rate Variability (HRV) was “unbalanced.”

Almost immediately, that feeling of freshness evaporated. I started scanning my body for fatigue. Am I tired? I must be tired. The algorithm says I’m tired. By the time I got to the kitchen to make my coffee, I felt sluggish and anxious, convinced my body was on the verge of collapse.

This is the Quantified Self Paradox. We are living in the golden age of bio-hacking and health data, yet for many of us, this constant surveillance isn’t making us healthier—it’s making us neurotic.


Welcome to Orthosomnia

I’m not just speaking anecdotally. There is a genuine clinical term for this phenomenon: Orthosomnia.

Derived from “ortho” (correct) and “somnia” (sleep), it refers to an unhealthy obsession with achieving “perfect” sleep data. It is a perfectionist quest that ironically leads to insomnia: the stress of tracking sleep ruins the sleep itself.

We have gamified our biology. We want to close the rings, get the badges, and see the green checkmarks. But the human body is not a machine running code that executes perfectly every time. It is a biological organism subject to hormonal fluctuations, stress, and environmental shifts.

When we treat our bodies like software that needs debugging based on a dashboard, we invite a specific type of anxiety: the fear that we are failing at simply existing.


The Nocebo Effect of Data

In psychology, we often talk about the Placebo Effect (believing something will help makes it help). But its evil twin is the Nocebo Effect.

The Nocebo Effect occurs when negative expectations cause negative outcomes. When my watch told me I had a “low readiness” score, my brain accepted that as truth. My cortisol likely spiked in response to the “bad news,” creating the very stress the watch was warning me about.

I essentially hallucinated fatigue because an algorithm told me to.

This is where the technology becomes dangerous for our mental health. We are outsourcing our interoception—our internal sense of the physiological condition of our body—to a third-party app. We stop asking, “How do I feel?” and start asking, “How does the app say I feel?”


The Loss of Intuition

As someone who walks the line between hard science and spirituality, this disconnection worries me the most.

In esoteric practices, intuition and body awareness are everything. We are taught to listen to the subtle signals of the body—hunger, rest, energy shifts. This is the foundation of mindfulness.

When we rely entirely on data to tell us if we are okay, we sever that connection with our own intuition. We silence the body’s voice in favor of the screen’s voice. We are trading wisdom for raw data, and they are not the same thing.


Reclaiming the Narrative

Does this mean I’m throwing my smartwatch in the trash? No. I love technology, and I love data (I analyze datasets for fun, after all).

But I am changing my relationship with it. I’m moving from “Obedience” to “Observation.”

Data should be a rearview mirror, not a GPS. It’s useful to look back and see trends over a month (e.g., “Oh, I tend to sleep poorly during the full moon” or “My resting heart rate drops when I do yoga”). But it should never dictate how I feel in the present moment.
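Since I’m a Python person, here’s what that “rearview mirror” habit looks like in practice: a minimal sketch that rolls hypothetical nightly sleep scores into weekly averages, so you see the trend over a month instead of obsessing over any single night. The data and function names here are made up for illustration, not from any real tracker API.

```python
# A sketch of "rearview mirror" analysis: review nightly sleep scores
# as weekly trends, not daily verdicts. Data below is hypothetical.
from statistics import mean

# One made-up sleep score per night for a 30-day month (scale 0-100).
sleep_scores = [72, 65, 80, 58, 77, 70, 63, 85, 69, 74,
                61, 79, 68, 73, 66, 81, 59, 75, 71, 64,
                78, 67, 82, 60, 76, 69, 73, 62, 80, 70]

def weekly_averages(scores, days_per_week=7):
    """Group nightly scores into weekly averages to surface trends."""
    return [round(mean(scores[i:i + days_per_week]), 1)
            for i in range(0, len(scores), days_per_week)]

print(weekly_averages(sleep_scores))
```

A weekly average smooths out one rough night, which is exactly the point: a single low score tells you almost nothing, but a month of drifting averages might be worth a conversation with your body (or your doctor).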

If you wake up feeling good, and the watch says you’re bad? Trust the body. The body has millions of years of evolutionary engineering behind it. The watch is running firmware v2.0.

We need to stop trying to “optimize” ourselves into oblivion. Sometimes, a bad night’s sleep is just a bad night’s sleep, not a system failure.


I’m curious: Have you ever felt “data guilt” from your health tracker? Does seeing a low score ruin your day, or does it help you adjust? Let’s discuss in the comments.

