
Mobile & Tablet Builds

Per-Device Custom rPPG Model Training

We train device-specific rPPG models for your exact smartphone or tablet camera hardware — front-facing selfie cameras, depth sensors, and multi-lens arrays. Every model is built from scratch for the sensor in your target device.

Custom preprocessing handles auto-exposure, rolling shutter, and OIS artifacts unique to each device. Generic rPPG SDKs treat all mobile cameras the same — we build models that account for the specific signal characteristics of yours.

Talk to Our Engineering Team →


What We Build

1

Per-Device Model Training

Individual rPPG models trained on data from your target device hardware. An iPhone 15 Pro camera behaves differently from a Samsung Galaxy S24 — we train for yours specifically.

2

Auto-Exposure Compensation

Custom preprocessing that handles the aggressive auto-exposure algorithms on mobile cameras, which cause brightness fluctuations that corrupt rPPG signals in generic SDKs.
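One common way to suppress exposure-driven brightness drift (a simplified sketch, not our production pipeline) is to divide the per-frame mean skin-pixel intensity by a moving-average baseline, leaving only the fast pulsatile fluctuation:

```python
import numpy as np

def detrend_exposure(mean_intensity: np.ndarray, fps: float,
                     win_s: float = 1.0) -> np.ndarray:
    """Suppress slow auto-exposure drift by dividing the per-frame mean
    skin-pixel intensity by a moving-average baseline, then centering at 0.

    Illustrative sketch: window length and normalization are assumptions,
    not the deployed preprocessing.
    """
    win = max(1, int(round(win_s * fps)))
    kernel = np.ones(win) / win
    # Centered moving average serves as the slow exposure baseline.
    baseline = np.convolve(mean_intensity, kernel, mode="same")
    # Dividing out the baseline leaves a signal fluctuating around 1.0;
    # subtract 1 to center it at 0 (edge frames are distorted by padding).
    return mean_intensity / np.maximum(baseline, 1e-8) - 1.0
```

The residual signal is then band-pass filtered in the cardiac frequency range before pulse extraction.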

3

Rolling Shutter Correction

Preprocessing that compensates for rolling shutter artifacts in CMOS sensors — the temporal skew across frame rows introduces systematic error that we model and remove.
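The core of the correction can be sketched with a simple linear readout model: a row captured partway down the sensor is exposed later than the frame timestamp suggests, so ROI samples are re-timed before resampling onto a uniform grid. The ~10 ms readout time below is an illustrative assumption, not a measured value:

```python
def row_timestamp(frame_time_s: float, row: int, sensor_rows: int,
                  readout_time_s: float = 0.010) -> float:
    """Effective capture time of one sensor row under a rolling-shutter
    model: rows read out top-to-bottom over `readout_time_s` (assumed)."""
    if not 0 <= row < sensor_rows:
        raise ValueError("row out of range")
    return frame_time_s + (row / sensor_rows) * readout_time_s

def retime_roi_samples(frame_times, roi_center_rows, sensor_rows,
                       readout_time_s=0.010):
    """Replace per-frame timestamps with per-ROI effective timestamps so
    the pulse signal can later be resampled on a uniform time grid."""
    return [row_timestamp(t, r, sensor_rows, readout_time_s)
            for t, r in zip(frame_times, roi_center_rows)]
```

For a forehead ROI centered mid-frame on a 1080-row sensor, each sample is effectively 5 ms later than the nominal frame time.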

4

OIS Artifact Removal

Optical image stabilization creates micro-movements that generic rPPG algorithms interpret as physiological signal. Our preprocessing isolates and removes OIS-induced artifacts.
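A minimal sketch of one standard artifact-removal technique (motion-reference regression, here with hypothetical gyroscope traces as the reference; the deployed method may differ): fit the raw trace against the motion channels by least squares and keep the residual.

```python
import numpy as np

def remove_motion_artifacts(signal: np.ndarray,
                            motion_refs: np.ndarray) -> np.ndarray:
    """Project motion-correlated components out of a raw pulse trace.

    Fits signal ~= M @ beta by least squares, where M's columns are an
    intercept plus motion reference traces (e.g. gyro axes), and returns
    the residual, which retains the motion-independent pulse component.
    """
    M = np.column_stack([np.ones(len(signal)), motion_refs])
    beta, *_ = np.linalg.lstsq(M, signal, rcond=None)
    return signal - M @ beta
```

Anything linearly explained by the motion channels, including OIS-induced micro-movement, is removed; the physiological component, being uncorrelated with the references, passes through.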

5

Native iOS & Android SDKs

Production-ready Swift (iOS) and Kotlin (Android) SDKs with full API documentation, sample apps, and integration guides. Not cross-platform wrappers — true native performance.

6

White-Label UI Components

Pre-built measurement screens, progress indicators, and results displays that match your app's design system. Fully customizable colors, fonts, animations, and layout.

Technical Specifications

Built for Mobile Hardware

Supported Platforms

iOS 15+, Android 10+, iPadOS 15+, Android tablets

Camera Requirements

Front-facing camera, minimum 720p, 24fps+

SDK Languages

Swift for iOS, Kotlin for Android, TypeScript for React Native bridge

Measurement Time

30 seconds by default, configurable from 15 to 60 seconds

SDK Size

Under 15MB for both iOS and Android

Output

Native objects, JSON, FHIR R4 Observation resources
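For illustration, a heart-rate reading as a minimal FHIR R4 Observation resource might look like the following (a sketch of the standard resource shape using LOINC 8867-4 and the UCUM unit /min; the SDK's actual serialization may include additional fields):

```python
def heart_rate_observation(bpm: float, measured_at_iso: str) -> dict:
    """Minimal FHIR R4 Observation for a heart-rate reading.

    Illustrative only: uses the standard vital-signs category, LOINC code
    8867-4, and UCUM unit /min.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/observation-category",
                "code": "vital-signs",
                "display": "Vital Signs",
            }]
        }],
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "8867-4",
                "display": "Heart rate",
            }]
        },
        "effectiveDateTime": measured_at_iso,
        "valueQuantity": {
            "value": bpm,
            "unit": "beats/minute",
            "system": "http://unitsofmeasure.org",
            "code": "/min",
        },
    }
```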

Mobile & Tablet rPPG FAQ

Common questions about custom rPPG for mobile and tablet devices

Why does per-device training matter for mobile?

Every smartphone camera has different sensor characteristics, ISP processing, auto-exposure behavior, and OIS implementation. A model trained on iPhone data performs measurably worse on Samsung hardware, and vice versa.

Can you support our specific device fleet?

We train models for any device. Enterprise customers with specific device fleets (e.g., company-issued iPads, specific Android tablets) get models tuned exactly for those devices.

Is the SDK a React Native wrapper or truly native?

Truly native. Swift for iOS, Kotlin for Android. We offer a React Native bridge for cross-platform apps, but the core inference runs natively for maximum performance.

How does white-label work?

We deliver customizable UI components (SwiftUI/Kotlin Compose) that you style with your brand assets. Measurement screens, progress animations, and results displays — all configurable via a theming API.

What vitals can be measured on mobile?

Heart rate, respiratory rate, HRV (SDNN, RMSSD), blood pressure estimation, SpO2, and stress level. All from the front-facing camera in a 30-second measurement.
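The two HRV metrics named above have standard definitions: SDNN is the standard deviation of the NN (beat-to-beat) intervals, and RMSSD is the root mean square of successive interval differences. A small sketch (population standard deviation used here; some tools use the sample variant):

```python
import math

def sdnn_rmssd(nn_intervals_ms):
    """Return (SDNN, RMSSD) in milliseconds for a list of NN intervals.

    SDNN: population standard deviation of the intervals.
    RMSSD: root mean square of successive interval differences.
    """
    n = len(nn_intervals_ms)
    mean = sum(nn_intervals_ms) / n
    sdnn = math.sqrt(sum((x - mean) ** 2 for x in nn_intervals_ms) / n)
    diffs = [b - a for a, b in zip(nn_intervals_ms, nn_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd
```

For example, intervals of 800, 810, 790, and 805 ms give an SDNN of roughly 7.4 ms and an RMSSD of roughly 15.5 ms.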


Request A Demo

See how contactless vitals can transform your healthcare delivery.