Most research on camera-based cardiac monitoring has focused on one arrhythmia: atrial fibrillation. That makes sense given the clinical stakes — AFib affects over 37 million people globally and increases stroke risk fivefold, according to Lippi, Sanchis-Gomar, and Cervellin (2021). But the heart produces many abnormal rhythms beyond AFib, and the same facial video technology that spots irregular atrial activity can, in principle, identify premature ventricular contractions, tachycardia, bradycardia, and other rhythm disturbances. The field is starting to move in that direction.
This report examines where facial video arrhythmia detection stands in 2026 — what's been validated, what's still experimental, and where the research gaps remain.
"Cardiac arrhythmias other than atrial fibrillation are also commonly detected with smartwatches. Smartwatches have an important potential besides traditional clinical diagnostics." — Manninger et al., Journal of Clinical Medicine (2023)
From single-arrhythmia to multi-class detection
The earliest camera-based arrhythmia work treated the problem as binary: AFib or not AFib. Yan et al. (2018) showed that Poincaré plot features from facial video could classify AFib with sensitivity above 95%. Couderc et al. (2015) at the University of Rochester demonstrated smartphone-camera AFib detection in controlled settings. These were proof-of-concept studies for a single arrhythmia.
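The Poincaré features mentioned above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: it computes the standard SD1/SD2 descriptors of a Poincaré plot (each inter-beat interval plotted against the next), where AFib's "irregularly irregular" beats scatter widely and sinus rhythm clusters tightly. The function name and sample intervals are invented for the example.

```python
# Illustrative Poincaré-plot feature extraction from an inter-beat
# interval (IBI) series, in the spirit of Yan et al. (2018).
import numpy as np

def poincare_features(ibi_ms):
    """SD1/SD2 of the Poincaré plot (IBI[n] vs. IBI[n+1]), in ms."""
    ibi = np.asarray(ibi_ms, dtype=float)
    x, y = ibi[:-1], ibi[1:]
    # SD1: dispersion perpendicular to the identity line (beat-to-beat variability)
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)
    # SD2: dispersion along the identity line (longer-term variability)
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)
    return sd1, sd2

# Sinus rhythm yields a tight cluster (small SD1); AFib scatters widely.
regular = [800, 805, 798, 802, 801, 799, 803]
afib = [620, 910, 540, 1100, 700, 480, 950]
sd1_reg, _ = poincare_features(regular)
sd1_af, _ = poincare_features(afib)
```

In practice, classifiers threshold or combine SD1, SD2, and their ratio rather than eyeballing the plot.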
The next step — classifying multiple arrhythmia types from a single recording — is harder. A 2021 study published in Scientific Reports by Yan et al. at En Chu Kong Hospital in Taiwan tackled this directly. They recorded 10-minute facial videos from patients with confirmed AF, normal sinus rhythm, and other ECG abnormalities including premature atrial contractions and ventricular pacing rhythms. Using a 12-layer deep convolutional neural network fed with 30-second rPPG segments, the model achieved 93.3% sensitivity and 98.3% specificity for AF detection, with an overall accuracy of 95.8%. The study specifically trained on patients with various abnormal ECG patterns to reduce false positives from non-AF arrhythmias.
A 2024 paper presented at Computing in Cardiology by researchers working on PPG-based multi-arrhythmia classification took this further, developing models that distinguish between normal rhythm, AFib, bradycardia, and ventricular tachycardia from photoplethysmographic signals. While this study used contact PPG, the IBI extraction and classification methods transfer to rPPG with minimal modification.
How the detection pipeline works
The technical approach follows a consistent pattern across studies. A standard RGB camera captures facial video at 30 to 84 frames per second. Face detection algorithms isolate a region of interest — typically the forehead and cheeks, where subcutaneous blood flow produces the strongest color fluctuations. The average RGB values from this region are extracted frame by frame, then filtered (commonly with a Chebyshev II bandpass filter between 0.5 and 3 Hz) to isolate the cardiac pulse signal from noise caused by motion and lighting changes.
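The filtering step above can be sketched with SciPy. This is a minimal illustration under stated assumptions, not any study's published pipeline: a 30 fps camera, a 4th-order Chebyshev type II bandpass with 20 dB stopband attenuation between 0.5 and 3 Hz, and a synthetic green-channel trace standing in for the real facial signal.

```python
# Sketch of the rPPG cleaning step: Chebyshev II bandpass (0.5-3 Hz)
# applied zero-phase to the mean color trace of the face region.
import numpy as np
from scipy.signal import cheby2, filtfilt

def bandpass_rppg(trace, fs=30.0, low=0.5, high=3.0, order=4, rs=20):
    """Isolate the cardiac band from a raw rPPG trace (zero-phase)."""
    nyq = fs / 2.0
    b, a = cheby2(order, rs, [low / nyq, high / nyq], btype="bandpass")
    return filtfilt(b, a, trace)  # filtfilt avoids phase distortion of peaks

# Synthetic trace: a 1.2 Hz pulse (72 bpm) buried under slow lighting drift.
fs = 30.0
t = np.arange(0, 10, 1 / fs)
raw = np.sin(2 * np.pi * 1.2 * t) + 2.0 * np.sin(2 * np.pi * 0.05 * t)
clean = bandpass_rppg(raw, fs=fs)

# After filtering, the dominant spectral peak sits at the pulse frequency.
freqs = np.fft.rfftfreq(len(clean), 1 / fs)
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(clean)))]
```

Zero-phase filtering matters here because the next stage depends on accurate peak timing, which a causal filter would shift.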
From the cleaned rPPG signal, beat-to-beat timing is extracted. The intervals between successive pulse peaks form an inter-beat interval (IBI) time series. Different arrhythmias produce different IBI signatures:
- AFib generates randomly irregular intervals — the classic "irregularly irregular" pattern where no two consecutive intervals are predictable from the previous ones
- Premature ventricular contractions produce a characteristic short-long pattern: an early beat (short interval) followed by a compensatory pause (long interval)
- Tachycardia shows consistently shortened intervals (heart rate above 100 bpm) while maintaining relative regularity
- Bradycardia shows consistently lengthened intervals (heart rate below 60 bpm)
- Premature atrial contractions create early beats similar to PVCs but with smaller morphological disruption to the pulse waveform
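The IBI signatures above can be turned into a crude rule-based screen. This is a toy sketch, not a validated classifier: the 100/60 bpm rate bounds come from the list above, while the RMSSD irregularity cutoff and function name are illustrative assumptions.

```python
# Toy rule-based rhythm screen over an inter-beat interval series (ms).
import numpy as np

def screen_rhythm(ibi_ms, irregularity_cutoff_ms=120):
    """Map an IBI series to a coarse rhythm label via simple rules."""
    ibi = np.asarray(ibi_ms, dtype=float)
    mean_hr = 60000.0 / ibi.mean()               # bpm from the mean interval
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # beat-to-beat irregularity
    if rmssd > irregularity_cutoff_ms:           # randomly irregular -> AFib-like
        return "irregular (AFib-like)"
    if mean_hr > 100:                            # fast but regular
        return "tachycardia"
    if mean_hr < 60:                             # slow but regular
        return "bradycardia"
    return "normal"
```

Real classifiers replace these hand-set thresholds with learned decision boundaries, but the input features are the same.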
Classification models take these IBI sequences — sometimes combined with frequency-domain features like power spectral density or time-frequency representations — and output arrhythmia labels. Deep learning architectures (CNNs, RNNs, and transformer-based models) have generally outperformed hand-crafted statistical features for multi-class classification.
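One frequency-domain feature of the kind mentioned above can be sketched as follows: band power of the IBI series via Welch's method. The even 4 Hz resampling grid, the band edges, and the helper name are illustrative assumptions, not a specific study's parameters.

```python
# Illustrative spectral feature: power of the IBI series in a chosen band,
# after resampling the unevenly spaced beats onto a uniform time grid.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def ibi_band_power(ibi_ms, fs=4.0, band=(0.04, 0.4)):
    """Interpolate IBIs to an even grid, then sum Welch PSD in `band`."""
    ibi = np.asarray(ibi_ms, dtype=float)
    beat_times = np.cumsum(ibi) / 1000.0                 # beat times in seconds
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    even = interp1d(beat_times, ibi)(grid)               # uniform-rate IBI signal
    f, pxx = welch(even - even.mean(), fs=fs, nperseg=min(64, len(even)))
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].sum() * (f[1] - f[0])               # approximate band power

rng = np.random.default_rng(0)
p_regular = ibi_band_power([800.0] * 120)                # constant rhythm
p_irregular = ibi_band_power(800 + rng.normal(0, 60, 120))  # noisy rhythm
```

Features like this are typically concatenated with time-domain statistics before being fed to the classifier.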
Arrhythmia detection methods compared
| Method | Contact required | Arrhythmia types detected | Typical accuracy | Equipment cost | Best application |
|---|---|---|---|---|---|
| 12-lead ECG | Yes | All known arrhythmias | Gold standard | $2,000-$10,000 | Clinical diagnosis |
| Holter monitor | Yes | Most arrhythmias | High (24-48h window) | $300-$500 per use | Extended monitoring |
| Implantable loop recorder | Yes (surgical) | AFib, bradycardia, pauses | Highest for rare events | $5,000-$15,000 | Cryptogenic stroke workup |
| Smartwatch PPG | Yes (wrist) | AFib, tachycardia, bradycardia | 92-97% for AFib | $250-$800 | Consumer passive screening |
| Single-lead ECG patch | Yes (adhesive) | AFib, PVCs, SVT, VT | High | $200-$400 per use | 7-14 day monitoring |
| Facial video rPPG | No | AFib (validated); PVCs, tachy, brady (emerging) | 93-98% for AFib | Camera only | Population screening, telehealth |
Sources: Yan et al. (2021), Couderc et al. (2022), Perez et al. (Apple Heart Study, 2019), Manninger et al. (2023).
The tradeoff pattern is consistent: more invasive or expensive methods detect more arrhythmia types with greater reliability, but reach fewer patients. Camera-based methods sit at the far end of the accessibility spectrum, where marginal cost per screening is near zero.
Clinical scenarios where multi-arrhythmia screening matters
Post-hospital discharge monitoring
Patients discharged after cardiac events need rhythm monitoring, but Holter monitors are typically limited to 24-48 hours and require clinic visits to set up. Daily facial video checks during telehealth follow-ups could flag new arrhythmias — a PVC burden increase, new tachycardia episodes, or AFib recurrence — without additional hardware.
Medication effect monitoring
Antiarrhythmic drugs can paradoxically cause new arrhythmias (proarrhythmic effects). QT-prolonging medications increase risk of torsades de pointes. Frequent rhythm checks through a phone camera could catch these drug-induced rhythm changes earlier than scheduled clinic visits.
Pre-surgical cardiac screening
Patients scheduled for non-cardiac surgery need cardiac clearance. A camera-based arrhythmia screen during a preoperative telehealth visit could identify rhythm abnormalities that warrant further workup before surgery, reducing same-day cancellations from unexpected ECG findings.
Population-level cardiac screening programs
Public health programs in regions with limited ECG access could deploy smartphone-based arrhythmia screening at community health events. The barrier to screening drops from "a trained technician with an ECG machine" to "a phone with a front-facing camera."
Research evidence and current limitations
Yan et al. (2021, Scientific Reports) conducted the most rigorous facial video arrhythmia study to date. Their prospective study at En Chu Kong Hospital enrolled patients from neurology and cardiology departments, using an industrial camera at 84 fps with simultaneous 12-lead ECG and 3-lead monitoring as ground truth. The 12-layer DCNN processed 30-second rPPG segments. The study specifically included patients with non-AF abnormalities (premature atrial contractions, pacing rhythms, other abnormal patterns) in the training set, which is important because earlier AFib detectors had high false positive rates when encountering non-AF arrhythmias.
Couderc et al. (2022, Heart Rhythm O2) at the University of Rochester tested their HealthKam VPG technology on 60 patients with confirmed AF diagnosis, specifically evaluating performance across the full range of skin complexion. With ambient illumination above 100 lux (standard indoor lighting), the system maintained sensitivity and specificity above 90% across all skin tones. Pulse rate error stayed below 1 bpm compared to ECG reference.
Bashar et al. (2019, University of Connecticut) developed a real-time AFib detection algorithm combining time-domain irregularity metrics with frequency-domain features, achieving 98% sensitivity and 97% specificity from PPG signals. Their multi-feature approach outperformed single-metric detection.
Solosenko et al. demonstrated that PPG-based premature beat detection is feasible, training neural networks on the adaptive low-pass filtered waveform morphology of premature beats. Their work showed that PVC and PAC detection from pulse signals is possible, though accuracy is lower than AFib detection because the morphological differences are subtler.
Several limitations remain. Most studies use controlled environments with patients sitting still in adequate lighting. Real-world conditions — variable lighting, head movement, cosmetics, glasses — degrade signal quality. Multi-arrhythmia classification beyond AFib is still early-stage, with most studies using small cohorts. And the fundamental constraint of any intermittent screening method applies: a 30-60 second video can only detect arrhythmias occurring during that window.
Where this goes next
The trajectory is clear: from binary AFib classification toward multi-arrhythmia detection, and from controlled lab settings toward real-world conditions. Several research directions are active.
Morphological analysis of the rPPG waveform itself (not just beat timing) could improve PVC and PAC detection. AFib detection relies primarily on interval irregularity, but distinguishing PVCs from PACs requires examining the shape of individual pulse waves — something that higher-resolution cameras and better signal processing may enable.
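A crude version of that morphological idea can be sketched: premature beats often show a reduced pulse amplitude relative to their neighbors. The 0.6 ratio threshold, the helper name, and the sample amplitudes below are all illustrative assumptions, not a published method.

```python
# Toy morphology check: flag beats whose pulse amplitude falls well
# below the local median, as a premature beat's truncated wave often does.
import numpy as np

def flag_low_amplitude_beats(peak_amplitudes, ratio=0.6):
    """Return indices of beats with amplitude below ratio x median."""
    amps = np.asarray(peak_amplitudes, dtype=float)
    cutoff = ratio * np.median(amps)
    return [i for i, a in enumerate(amps) if a < cutoff]

# Beat 3 has a markedly smaller pulse wave than its neighbors.
flags = flag_low_amplitude_beats([1.0, 0.98, 1.02, 0.45, 1.01, 0.99])
```

Distinguishing PVCs from PACs would need far richer shape features than amplitude alone, which is exactly why higher-resolution capture is a prerequisite.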
Continuous passive monitoring through laptop or tablet cameras during normal use is another direction. Instead of a deliberate 60-second scan, a system running in the background during video calls or screen time could accumulate hours of rhythm data without any user action.
Circadify has developed contactless cardiac rhythm analysis capabilities and is bringing them to market for integration into telehealth platforms and remote monitoring systems. The company's rPPG technology extracts pulse waveform data from standard cameras, providing the signal foundation that arrhythmia classification models require.
The gap between what facial video can detect today (primarily AFib with high accuracy) and what the cardiac monitoring field needs (reliable multi-arrhythmia screening at population scale) will likely narrow over the next several years as training datasets grow and deep learning architectures improve. For a technology that requires nothing more than a camera, the clinical ceiling is still a long way from being reached.
Frequently asked questions
What types of arrhythmias can facial video detect?
Current research has demonstrated detection of atrial fibrillation, premature ventricular contractions, premature atrial contractions, tachycardia, and bradycardia from facial video signals. AFib detection is the most mature application, with newer studies expanding into multi-class arrhythmia classification from the same rPPG signals.
How accurate is camera-based arrhythmia detection?
Published studies report sensitivity above 93% and specificity above 98% for AFib detection from 10-minute facial video recordings. Multi-arrhythmia classification is earlier-stage, with accuracy varying by arrhythmia type and recording conditions. All results are from research settings with clinical ECG as reference.
Can facial video replace a cardiac monitor?
No. Camera-based arrhythmia detection is a screening tool, not a diagnostic replacement. Any detected abnormality should be confirmed with clinical-grade ECG. The value is in frequent, low-barrier screening that catches arrhythmias between clinic visits — especially for patients who wouldn't otherwise be monitored.
Does skin tone affect detection accuracy?
Couderc et al. (2022) tested videoplethysmography across the full spectrum of skin complexion and found that with adequate illumination above 100 lux, the technology maintained sensitivity and specificity above 90% across all skin tones. Melanin concentration affects signal strength but can be compensated for algorithmically.
Related articles
- Contactless AFib Detection — A deeper look at the specific algorithms and evidence for atrial fibrillation detection through rPPG.
- Contactless Heart Rate Monitoring — Heart rate extraction provides the beat-by-beat timing data that arrhythmia classifiers depend on.
- Contactless HRV Analysis — Heart rate variability metrics overlap with many of the irregularity measures used in arrhythmia classification.