Sleep staging has a measurement problem. The gold standard is effective, but nobody would describe it as normal. A full polysomnography study can involve EEG leads on the scalp, belts around the chest and abdomen, a nasal cannula, a pulse oximeter, limb sensors, and a technologist scoring the night in 30-second epochs. That is fine when the goal is definitive diagnosis. It is a lousy fit for repeat monitoring, large-scale screening, or anyone who already sleeps badly in unfamiliar settings.
That burden is why camera-based sleep staging has moved from curiosity to a serious research track. If an overnight camera can recover enough pulse variability, breathing information, and movement context to estimate sleep architecture, the economics of sleep monitoring change. The question is no longer whether video can see something at night. It is whether overnight video and remote photoplethysmography (rPPG) can estimate the parts of sleep architecture that matter, with enough consistency to earn a place alongside lower-burden sleep tools.
> "The camera-based remote PPG algorithm achieved an average kappa of 0.58 and accuracy of 81% for 3-class sleep staging." — Fokke B. van Meulen and colleagues, "The HealthBed Study" (2023)
Why the market keeps looking for a lower-burden sleep staging method
Sleep staging matters because sleep disorders are not only about whether someone stops breathing. Clinicians care about how long a patient spends awake, how much REM sleep they get, whether deep sleep is fragmented, and how physiology changes over the course of the night. The trouble is that the cleanest measurement stack is also the most intrusive.
That tradeoff has created a crowded middle ground. Home sleep apnea tests reduce complexity but still rely on body-worn sensors. Consumer wearables estimate sleep stages from pulse and motion, but many lack the validation depth needed for clinical workflows. Camera-based monitoring sits in between. It aims to collect some of the same physiological clues as wearables, without asking the sleeper to wear anything.
Linas Saikevičius, Vidas Raudonis, Gintaras Dervinis, and Virginijus Baranauskas laid out the broader opportunity in their 2024 systematic review in Sensors. Their conclusion was blunt: camera-based heart rate and respiratory rate measurement has matured quickly, but translation into robust real-world monitoring still depends on better datasets, preprocessing, and validation. Sleep staging is where those issues collide most clearly, because the signal has to hold up for six to eight hours, in darkness, through movement, blankets, and partial face occlusion.
What overnight video sleep staging is actually measuring
Camera-based sleep staging does not work by "watching" sleep in the human sense. It reconstructs physiology from indirect signals.
- rPPG captures tiny color fluctuations linked to blood volume changes in exposed skin.
- Pulse rate variability acts as a proxy for autonomic shifts that differ across wake, REM, and non-REM sleep.
- Breathing rate and breathing variability can be derived from facial, chest, or upper-body motion.
- Body movement helps identify transitions, awakenings, and broad sleep state changes.
- Temporal modeling matters because sleep stages are not isolated moments. They unfold in patterns.
That last point is easy to miss. Sleep staging is not just a classification problem. It is a sequence problem. A model that looks only at one short window of vital signs misses the fact that REM typically arrives in recurring cycles, deep sleep tends to cluster earlier in the night, and awakenings leave a physiological trace before and after the event.
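The first ingredient in that pipeline, the rPPG pulse itself, is easier to grasp with a sketch. The toy below assumes the camera has already been reduced to a per-frame mean green-channel value over a facial region at 30 fps (the frame rate, the pulse band limits, and the function names are illustrative, not from any published system): the pulse rate is simply the dominant spectral peak inside the plausible human heart rate band.

```python
import numpy as np

FS = 30.0  # assumed camera frame rate in Hz (hypothetical)

def pulse_rate_bpm(green_means, fs=FS, lo=0.7, hi=3.0):
    """Estimate pulse rate from a trace of mean green-channel values.

    lo..hi Hz (42-180 bpm) is a typical band for human pulse; the dominant
    spectral peak inside that band is taken as the pulse frequency.
    """
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                                  # remove DC brightness
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Synthetic 30-second window: a 1.2 Hz (72 bpm) "pulse" buried in noise,
# standing in for the tiny color fluctuations a real camera would see.
t = np.arange(0, 30, 1.0 / FS)
rng = np.random.default_rng(0)
trace = 0.02 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.01, t.size)
print(pulse_rate_bpm(trace))  # ≈ 72 bpm
```

Real systems add skin-region tracking, motion rejection, and beat-to-beat interval extraction for pulse rate variability, but the core idea is this small: a periodic color change becomes a spectral peak.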
How camera-based sleep staging compares with other approaches
| Approach | Contact required | Core signals | Typical strength | Main limitation | Best role today |
|---|---|---|---|---|---|
| In-lab polysomnography | Yes | EEG, EOG, EMG, airflow, belts, SpO2, ECG | Full clinical reference | Expensive, obtrusive, poor fit for repeat use | Diagnosis |
| Home sleep apnea test | Yes | Airflow, effort, SpO2, pulse | Lower cost than PSG | Limited staging detail, sensor drop-off | Home diagnostic support |
| Wearable PPG sleep staging | Yes | Pulse, pulse variability, motion | Scalable, familiar form factor | Adherence, placement, comfort | Home tracking and screening |
| Bedside radar | No | Respiration, motion | Works in darkness, no wearables | Less mature ecosystem | Contactless longitudinal monitoring |
| Camera + near-infrared + rPPG | No | Pulse variability, respiration, activity, position | Rich multimodal sensing from common optics | Sensitive to occlusion, lighting, sleep position | Screening, research, longitudinal monitoring |
The comparison matters because camera-based systems are not trying to beat EEG at being EEG. They are trying to win on burden. If they can offer better physiology than a simple consumer tracker while asking less of the patient than PSG, they become useful even before they are perfect.
Current research and evidence
The clearest camera-based result so far still comes from the HealthBed study. Fokke B. van Meulen, Angela Grassi, Leonie van den Heuvel, Sebastiaan Overeem, Merel M. van Gilst, Johannes P. van Dijk, Henning Maass, Mark J. H. van Gastel, and Pedro Fonseca evaluated a contactless camera-based remote PPG setup against manual polysomnography scoring in 46 healthy participants. Using three cameras mounted above the bed, the team reported 81% accuracy with a kappa of 0.58 for 3-class sleep staging and 68% accuracy with a kappa of 0.49 for 4-class staging. Those are not PSG-grade numbers, but they are good enough to make the field hard to dismiss.
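Cohen's kappa, the headline metric in these studies, is worth unpacking: it is agreement between the algorithm's epoch-by-epoch scoring and the human scorer's, corrected for the agreement two scorers would reach by chance. A minimal illustration (the toy hypnogram strings are invented for the example, not HealthBed data):

```python
from collections import Counter

def cohens_kappa(reference, predicted):
    """Chance-corrected agreement between two epoch-by-epoch scorings."""
    n = len(reference)
    observed = sum(r == p for r, p in zip(reference, predicted)) / n
    ref_counts = Counter(reference)
    pred_counts = Counter(predicted)
    # Chance agreement: probability both scorers pick the same class at random,
    # given each scorer's own class frequencies.
    expected = sum(ref_counts[c] * pred_counts.get(c, 0) for c in ref_counts) / n**2
    return (observed - expected) / (1 - expected)

# Toy 3-class hypnogram fragments (W = wake, N = non-REM, R = REM).
ref  = list("WWNNNNRRNNWW")
pred = list("WWNNNRRRNNNW")
print(round(cohens_kappa(ref, pred), 2))  # 0.73
```

This is why a kappa of 0.58 at 81% accuracy is reported as moderate agreement: raw accuracy is inflated by the fact that most overnight epochs belong to the majority class, and kappa discounts exactly that.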
Another signal came from Jonathan Carter, João Jorge, Bindia Venugopal, Oliver Gibson, and Lionel Tarassenko, who reported in 2023 that a near-infrared video camera combined with deep learning reached 73.4% accuracy and a Cohen's kappa of 0.61 for four-class sleep staging in 50 healthy volunteers. Their system used heart rate, breathing rate, and activity from overnight near-infrared video. That matters because it shows the field is moving away from single handcrafted features and toward multimodal temporal models.
It is also useful to compare contactless video against the wearable PPG literature, since both approaches depend heavily on pulse variability. In 2024, Clémentine Aguet, Loris Constantin, Florent Baty, Maximilian Boesch, Philippe Renevey, Damien Ferrario, Mathieu Lemay, Martin Brutsche, and Fabian Braun evaluated deep learning sleep staging in 134 patients with suspected sleep apnea using wearable PPG. Their wrist-based model reached a median accuracy of 80.8% with a Cohen's kappa of 0.70. That is a strong benchmark for the contactless field. It suggests that pulse-derived sleep staging is already useful, but also shows how far camera-based systems still need to go when the patient population gets sicker and messier.
What the evidence says right now is simple: overnight video can estimate coarse sleep architecture with moderate agreement to PSG, especially when the system has good face visibility, low-light support, and enough temporal context. The evidence does not yet support claims that cameras can replace full laboratory staging across broad clinical populations.
Clinical applications
Longitudinal sleep monitoring at home
This is the most obvious opening. A contactless camera system can be repeated night after night without new consumables, reattachment, or charging rituals on the body. That matters for insomnia follow-up, behavioral sleep programs, and remote care models where trends are more valuable than one heavily instrumented night.
Sleep medicine triage before formal diagnostics
Many patients do not need immediate full PSG. They need a better screening layer that can tell a clinic who likely has meaningful sleep disruption and who can wait. Contactless overnight monitoring could widen that front door, especially in settings where labor and sleep lab capacity are limited.
Monitoring in higher-acuity environments
Hospitals, rehab units, and post-acute facilities already care about sleep fragmentation, respiratory decline, nighttime restlessness, and autonomic instability. A camera that can track heart rate, respiratory rate, and broad sleep state trends may be more realistic in those settings than asking already fragile patients to tolerate extra wearables.
Where the technology still breaks down
The hardest problems are not glamorous.
- Side sleeping can remove the face from view and degrade rPPG quality.
- Blankets may cover the chest, making respiratory motion harder to track.
- Overnight monitoring requires stable performance across hours, not minutes.
- Deep sleep remains harder to classify accurately than broad wake-versus-sleep distinctions.
- Clinical populations introduce arrhythmias, comorbid breathing disorders, and medication effects that distort the physiological patterns models learn on healthy volunteers.
The wearable literature reinforces this caution. Aguet and colleagues found performance drops in patients with cardiac arrhythmia. That is a useful warning for camera-based systems too, because the autonomic rhythm patterns that help classify sleep stages are exactly the patterns most likely to become noisy in medically complex patients.
Privacy is the other obvious hurdle. An overnight bedroom camera will always raise questions that a wrist wearable does not. The field likely depends on local processing, minimal raw video retention, and workflows that preserve physiological outputs rather than long-term identifiable recordings.
The future of camera-based sleep staging
The near-term future is not full replacement of PSG. It is a lower-burden layer that sits below it.
That layer could be valuable on its own. If overnight video can reliably separate wake from sleep, estimate REM versus non-REM trends, identify fragmented nights, and combine those outputs with contactless heart rate and respiratory rate, it becomes useful for screening, remote follow-up, and population-level monitoring. That is enough to matter.
The larger shift is that sleep staging is becoming part of a broader overnight vital signs story. Once a single camera can track pulse dynamics, breathing patterns, sleep-related movement, and recovery trends through the night, the line between "sleep tech" and "contactless physiologic monitoring" starts to blur. Circadify is building camera-based vital sign capabilities for exactly that kind of environment, where the value comes from passive, repeated measurement rather than one isolated snapshot.
The research still needs larger datasets, more clinical diversity, and harder real-world testing. But the direction is clear. Sleep staging is no longer confined to wires and a sleep lab bed. Cameras are beginning to produce enough physiology to make that old assumption look temporary.
Frequently Asked Questions
Can a camera replace polysomnography for sleep staging today?
Not yet. Polysomnography remains the clinical reference standard because it measures brain activity, eye movement, muscle tone, airflow, and oxygen saturation together. Camera-based systems are better viewed today as lower-burden screening or longitudinal monitoring tools that may complement, not replace, formal sleep studies.
How does rPPG help with sleep staging?
rPPG extracts pulse-related color changes from facial skin using a camera. From that signal, software can estimate heart rate and pulse rate variability, which shift across wakefulness, REM sleep, and non-REM sleep. Those changes can be combined with breathing and movement signals to classify sleep stages.
Why are near-infrared cameras used for overnight sleep monitoring?
Most bedrooms and sleep labs are dark. Near-infrared cameras can capture facial and body motion without visible light, which lets the system monitor pulse, respiration, and activity overnight without disturbing the sleeper.
What is the biggest barrier to camera-based sleep staging?
The biggest challenge is reliability in real bedrooms. Blankets, side sleeping, low light, motion artifacts, variable skin visibility, and differences across age and health status all affect signal quality. Strong results in controlled datasets still need broader real-world validation.
Related Articles
- Camera-Based Sleep Apnea Screening: Overnight Video, rPPG, and Contactless Respiratory Event Detection — Sleep staging and apnea screening rely on many of the same overnight respiratory and pulse signals.
- Sleep Quality Assessment via Camera — A broader look at how cameras can estimate overnight physiology beyond formal sleep staging labels.
- Contactless Respiratory Rate Detection — Respiratory sensing remains one of the most important inputs in any overnight contactless monitoring stack.