Passive Smartphone Sensing on Centralive: What iPhone Can Tell Us When Nobody’s Asking

The smartphone has been called the most thoroughly instrumented device most people will ever own. It tracks where we go, how we move, how we type, how we speak, how we sleep, how often we reach for it, and how we interact with the people in our lives, all as a byproduct of normal use.

For digital health research, that's an enormous opportunity, and one that has historically been hard to capture. Survey burden is real. Self-report is noisy. And the richest behavioral signals tend to live in places researchers can't easily reach.

The Centralive iPhone integration changes that. Through Apple’s SensorKit framework, our platform can now collect a wide range of passive smartphone sensing signals from participants’ iPhones, with explicit per-data-type consent and Apple’s approval of the study protocol. No diaries. No active logging. The phone observes what the phone already observes, and ships those observations into the research pipeline.
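
To make the consent mechanics concrete, here is a minimal sketch of the per-data-type authorization step at the SensorKit layer. The sensor set is illustrative, and a real deployment also needs Apple's SensorKit entitlement and the associated usage-description entries in the app's Info.plist:

```swift
import SensorKit

// Request participant authorization for each study data type individually.
// Every sensor here maps to one row in the consent flow; SensorKit presents
// Apple's own per-sensor permission UI on top of the study's consent.
let studySensors: Set<SRSensor> = [
    .messagesUsageReport,
    .phoneUsageReport,
    .keyboardMetrics,
    .visits,
    .deviceUsageReport,
]

SRSensorReader.requestAuthorization(sensors: studySensors) { error in
    if let error = error {
        // The participant declined, or the app lacks the SensorKit entitlement.
        print("SensorKit authorization failed: \(error)")
    } else {
        // Authorized sensors can now start recording on-device.
        studySensors.forEach { SRSensorReader(sensor: $0).startRecording() }
    }
}
```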

What we mean by passive smartphone sensing

Passive sensing is what happens when meaningful behavioral and physiological data gets collected without requiring the participant to actively do anything beyond living their normal life with their phone. It’s the opposite of an EMA prompt. It’s data that accumulates whether or not the participant remembers the study is running.

Done right, it gives researchers something self-report can’t: continuous, ecologically valid, low-burden ground truth about real-world behavior. Done wrong, it’s a privacy disaster. SensorKit was designed specifically to thread that needle, and Centralive operationalizes it for research teams.

The iPhone signals Centralive can now collect

Once a participant authorizes each data type in the study app, Centralive can ingest the following from iPhone.

| Signal | Category | What it captures | What's excluded | Example research uses |
| --- | --- | --- | --- | --- |
| Messages usage | Communication | Number of messages sent and received, count of distinct individuals messaged | Message content, identities of contacts | Social withdrawal, depression, loneliness phenotyping |
| Phone usage | Communication | Time on calls, number of calls, distinct individuals called | Audio, contact information | Social engagement patterns, mood research |
| Siri speech metrics | Speech | Speaking rate, pitch, jitter, shimmer, pauses, and speech expression features during Siri interactions | Raw audio, retained transcripts | Vocal biomarker research, mood and cognition tracking |
| Telephony speech metrics | Speech | Same vocal-feature measurements during phone and VoIP calls | Raw audio, conversation content | Mood disorders, Parkinson's disease, cognitive decline |
| Keyboard metrics | Typing behavior | Words typed, typing speed, accuracy, autocorrect rates, inter-key timings, emoji counts, sentiment-categorized word counts | The actual content typed | Fatigue, mood, motor function, cognitive performance |
| Visits | Mobility | Anonymized location identifiers, distance from home, arrival and departure windows, location category (home, work, school, gym) | GPS coordinates, specific addresses | Mobility entropy, time-at-home, routine disruption |
| Device usage | Device interaction | Phone and app use frequency and duration, screen wakes, unlocks, notifications, app activity by category | Specific app and website names, notification content | Circadian behavior, sleep displacement, engagement patterns |
| Ambient light | Environmental | Lux and chromaticity readings with sensor placement | Camera images | Indoor versus outdoor exposure, circadian context |
| Ambient pressure | Environmental | Barometric pressure in hPa and elevation change with associated temperature | | Altitude exposure, weather correlation |
| Face metrics | Vision and attention | Facial expression features, gaze direction, blend shapes during device unlock or messaging app use | Images, video | Affect research, attention studies |
| Media events | Media interaction | Timestamps of when video or images appear on screen during messaging app use | Image or video content | Screen time content context, attention research |
| Accelerometer | Movement | Three-axis acceleration (in g) from the iPhone IMU | | Activity recognition and gait research when no watch is worn |
| Rotation rate | Movement | Three-axis gyroscope data from the iPhone IMU | | Phone handling patterns, posture and orientation |
| Pedometer data | Movement | Steps, distance, pace, floors when the phone is being carried | | Daily activity, mobility tracking |
| Odometer | Movement | Distance, speed, slope, altitude change | | Travel and movement profiling |
| Sleep sessions | Sleep | Detected sleep timing and duration when no watch is present | | Sleep duration tracking in phone-only deployments |

A closer look at each grouping below, starting with a sketch of the read path all of these streams share.
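
Every data type follows the same pattern: authorize, record, then fetch in windows through a reader delegate. A minimal sketch, using visits as the example; SensorKit holds recorded data on-device for roughly 24 hours before it becomes fetchable, so ingestion is a periodic fetch-and-upload loop rather than a live stream, and `uploadToCentralive` is a hypothetical stand-in for the platform's ingestion call:

```swift
import SensorKit

final class VisitsIngester: NSObject, SRSensorReaderDelegate {
    let reader = SRSensorReader(sensor: .visits)

    func fetchLastWeek() {
        reader.delegate = self
        // SensorKit timestamps use SRAbsoluteTime; convert from CFAbsoluteTime.
        let now = CFAbsoluteTimeGetCurrent()
        let request = SRFetchRequest()
        request.from = SRAbsoluteTime.fromCFAbsoluteTime(now - 7 * 24 * 3600)
        request.to = SRAbsoluteTime.fromCFAbsoluteTime(now)
        // This sketch assumes the current device; production code would
        // enumerate reader.fetchDevices() and issue one request per device.
        reader.fetch(request)
    }

    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        if let visit = result.sample as? SRVisit {
            uploadToCentralive(visit)   // hypothetical ingestion call
        }
        return true                     // true keeps results coming
    }

    func sensorReader(_ reader: SRSensorReader,
                      didCompleteFetch fetchRequest: SRFetchRequest) {
        // Window fully delivered; safe to record it as ingested.
    }

    private func uploadToCentralive(_ visit: SRVisit) { /* platform-specific */ }
}
```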

Communication and social behavior (iPhone only)

  • Messages usage. Number of messages sent and received, number of distinct individuals messaged. Content and identities are excluded. This is one of the most studied digital phenotyping signals in the literature on depression, social withdrawal, and loneliness.
  • Phone usage. Time on calls, number of calls, number of distinct individuals called. Again, no audio, no contact information. A rollup sketch for both reports follows this list.
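
Both reports reduce to a handful of integers per window, which is what makes them so cheap to phenotype on. A sketch of a daily rollup, with the reader and delegate wiring elided and `DailySocial` as a hypothetical container:

```swift
import SensorKit

/// Hypothetical per-day rollup of communication signals.
struct DailySocial {
    var messagesIn = 0, messagesOut = 0, messageContacts = 0
    var calls = 0, callSeconds: TimeInterval = 0, callContacts = 0
}

func accumulate(_ sample: AnyObject, into day: inout DailySocial) {
    switch sample {
    case let m as SRMessagesUsageReport:
        day.messagesIn += m.totalIncomingMessages
        day.messagesOut += m.totalOutgoingMessages
        day.messageContacts = max(day.messageContacts, m.totalUniqueContacts)
    case let p as SRPhoneUsageReport:
        day.calls += p.totalIncomingCalls + p.totalOutgoingCalls
        day.callSeconds += p.totalPhoneCallDuration
        day.callContacts = max(day.callContacts, p.totalUniqueContacts)
    default:
        break
    }
}
```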

Speech characteristics (iPhone only)

  • Siri speech metrics. Speaking rate, pitch, jitter, shimmer, and pauses during Siri interactions. No audio, no transcripts retained for sharing.
  • Telephony speech metrics. The same vocal-feature measurements during phone and VoIP calls. These signals have a growing evidence base in mood disorders, Parkinson's disease, and cognitive decline research; a feature-extraction sketch follows.
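
For a sense of what the derived features look like in code, here is a sketch that pulls per-segment voice analytics out of a speech-metrics sample. The acoustic features surface through the Speech framework's SFVoiceAnalytics type, and their availability varies by OS version:

```swift
import SensorKit
import Speech

/// Sketch: average pitch and jitter across the recognized segments of one
/// speech-metrics sample. Only derived features are exposed; never audio.
func vocalFeatures(from metrics: SRSpeechMetrics) -> (meanPitch: Double, meanJitter: Double)? {
    guard let segments = metrics.speechRecognition?.bestTranscription.segments else {
        return nil
    }
    var pitch: [Double] = []
    var jitter: [Double] = []
    for segment in segments {
        guard let analytics = segment.voiceAnalytics else { continue }
        pitch += analytics.pitch.acousticFeatureValuePerFrame.map(\.doubleValue)
        jitter += analytics.jitter.acousticFeatureValuePerFrame.map(\.doubleValue)
    }
    guard !pitch.isEmpty, !jitter.isEmpty else { return nil }
    return (pitch.reduce(0, +) / Double(pitch.count),
            jitter.reduce(0, +) / Double(jitter.count))
}
```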

Typing and language behavior (iPhone only)

  • Keyboard metrics. Words typed, typing speed, accuracy, autocorrect and retro-correction rates, emoji use, inter-key timings, and even sentiment-related word counts in aggregate. Excludes the actual content typed. Typing dynamics have been linked to fatigue, mood, and motor function; a rollup sketch follows.
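
A sketch of the kind of per-report rollup this enables; the property names come from SRKeyboardMetrics, while the derived rates are illustrative ratios of our own choosing:

```swift
import SensorKit

/// Sketch: reduce one keyboard-metrics report to a few illustrative features.
func typingFeatures(from report: SRKeyboardMetrics)
    -> (words: Int, alteredRate: Double, emojiRate: Double) {
    let words = report.totalWords
    // Share of words the participant went back and changed (illustrative).
    let alteredRate = words > 0 ? Double(report.totalAlteredWords) / Double(words) : 0
    let emojiRate = words > 0 ? Double(report.totalEmojis) / Double(words) : 0
    return (words, alteredRate, emojiRate)
}
```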

Mobility and location patterns (iPhone only)

  • Visits. Frequently visited locations represented as anonymized identifiers, plus distance from home, approximate arrival and departure times, and location category (home, work, school, gym). Crucially, no GPS coordinates and no specific addresses. This gives researchers mobility entropy, time-at-home, and routine-disruption signals without ever knowing where the participant actually is; a sketch follows.
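
A sketch of two of those derived measures, computed from one day of visit samples. Because arrivals and departures are reported as windows rather than instants, the dwell time below is approximated from window midpoints:

```swift
import SensorKit

/// Sketch: time at home and number of distinct places visited in one day.
func mobilitySummary(for visits: [SRVisit]) -> (homeSeconds: TimeInterval, places: Int) {
    var homeSeconds: TimeInterval = 0
    for visit in visits where visit.locationCategory == .home {
        // Approximate the dwell with the span between window midpoints.
        let arrived = visit.arrivalDateInterval.start
            .addingTimeInterval(visit.arrivalDateInterval.duration / 2)
        let departed = visit.departureDateInterval.start
            .addingTimeInterval(visit.departureDateInterval.duration / 2)
        homeSeconds += max(0, departed.timeIntervalSince(arrived))
    }
    // Identifiers are anonymized UUIDs, so distinct places can be counted
    // without knowing where any of them are.
    let places = Set(visits.map(\.identifier)).count
    return (homeSeconds, places)
}
```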

Device interaction (iPhone only)

  • Device usage. Frequency and duration of phone and app use, screen wakes and unlocks, notification patterns, and app activity grouped by category rather than by named app. Strong signal for circadian behavior, sleep displacement, and engagement patterns; a sketch follows.
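
A sketch of reducing one report to circadian-friendly features; note the category keys are opaque identifiers, so the code never learns which specific apps were used:

```swift
import SensorKit

/// Sketch: screen-interaction features from one device-usage report.
func usageFeatures(from report: SRDeviceUsageReport)
    -> (unlocks: Int, wakes: Int, secondsByCategory: [String: TimeInterval]) {
    var byCategory: [String: TimeInterval] = [:]
    for (category, usages) in report.applicationUsageByCategory {
        // Categories only; SensorKit never reports which app was used.
        byCategory[category.rawValue, default: 0] +=
            usages.reduce(0) { $0 + $1.usageTime }
    }
    return (report.totalUnlocks, report.totalScreenWakes, byCategory)
}
```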

Environmental context (iPhone only)

  • Ambient light. Lux and chromaticity readings from the iPhone's ambient light sensor, useful as a proxy for indoor versus outdoor exposure and circadian context; see the sketch after this list.
  • Ambient pressure. Barometric pressure and elevation change from the iPhone’s barometer.
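
A sketch of the indoor-versus-outdoor proxy mentioned above. The 1,000-lux cutoff is an illustrative assumption, not a validated threshold:

```swift
import SensorKit

/// Sketch: crude indoor/outdoor classification from an ambient light sample.
/// Typical indoor lighting sits well below 1,000 lux; daylight well above.
func looksOutdoor(_ sample: SRAmbientLightSample) -> Bool {
    sample.lux.converted(to: .lux).value > 1_000
}
```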

Vision and attention (iPhone only)

  • Face metrics. Facial expression features such as brow raise and gaze direction, captured during device unlock or messaging app use via the TrueDepth camera. No images, no video. Promising signal for affect research.
  • Media events. Timestamps (only) of when video or images appear on screen during messaging app use; a pairing sketch follows.
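
A sketch of what timestamps alone support: pairing on-screen and off-screen events per media identifier to estimate dwell. The tuples are assumed to be pre-extracted upstream from fetch results:

```swift
import SensorKit

/// Sketch: estimate on-screen dwell per media item by pairing events.
/// Each tuple's time is assumed to come from the fetch result's timestamp,
/// converted to Date upstream.
func dwellByMedia(_ events: [(event: SRMediaEvent, time: Date)]) -> [String: TimeInterval] {
    var shownAt: [String: Date] = [:]
    var dwell: [String: TimeInterval] = [:]
    for (event, time) in events.sorted(by: { $0.time < $1.time }) {
        switch event.eventType {
        case .onScreen:
            shownAt[event.mediaIdentifier] = time
        case .offScreen:
            if let start = shownAt.removeValue(forKey: event.mediaIdentifier) {
                dwell[event.mediaIdentifier, default: 0] += time.timeIntervalSince(start)
            }
        default:
            break
        }
    }
    return dwell
}
```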

Movement when the phone is the sensor

  • Accelerometer and rotation rate. The iPhone's IMU streams. For participants who carry their phone but don't wear a watch, these become the primary movement signals for activity recognition and gait research; see the sketch after this list.
  • Pedometer data. Steps, distance, pace, and floors when the phone is being carried.
  • Odometer. Distance, speed, and altitude change during movement.
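
A sketch of a simple movement-intensity index over the raw stream; the epoch-based mean deviation from 1 g is an illustrative choice, not a named algorithm. SensorKit delivers these samples as CoreMotion recorded-data objects:

```swift
import CoreMotion

/// Sketch: mean deviation of acceleration magnitude from 1 g per epoch,
/// a crude proxy for movement intensity while the phone is carried.
func activityIndex(epoch: [CMRecordedAccelerometerData]) -> Double {
    guard !epoch.isEmpty else { return 0 }
    let deviations = epoch.map { sample -> Double in
        let a = sample.acceleration
        let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
        return abs(magnitude - 1.0)   // gravity contributes ~1 g at rest
    }
    return deviations.reduce(0, +) / Double(deviations.count)
}
```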

Sleep

  • Sleep sessions. Detected sleep timing and duration, which can be derived from the phone alone when no watch is present.

What this enables that wasn’t really possible before

A few research directions where rich passive iPhone data shifts what’s feasible.

Digital phenotyping for mental health. Combining mobility entropy from visits, communication frequency from messages and phone usage, vocal features from telephony speech metrics, and circadian behavior from device usage produces a multidimensional behavioral fingerprint. Changes in that fingerprint have been studied as leading indicators of depressive episodes and mania, at a temporal resolution that weekly questionnaires can’t approach.
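
One way to make "changes in that fingerprint" concrete is to score each day's feature vector against a rolling personal baseline. A sketch, with the three features and the baseline window as illustrative choices:

```swift
import Foundation

/// Hypothetical daily feature vector drawn from the streams above.
struct DayFeatures {
    var mobilityEntropy: Double     // from visits
    var outgoingMessages: Double    // from messages usage
    var nightScreenMinutes: Double  // from device usage
}

/// Sketch: per-feature z-scores against a rolling personal baseline
/// (for example, the trailing 14 days). Large absolute scores flag a
/// shift in the behavioral fingerprint.
func zScores(today: DayFeatures, baseline: [DayFeatures]) -> [String: Double] {
    guard baseline.count > 1 else { return [:] }
    func z(_ value: Double, _ history: [Double]) -> Double {
        let mean = history.reduce(0, +) / Double(history.count)
        let variance = history.reduce(0) { $0 + ($1 - mean) * ($1 - mean) }
                     / Double(history.count)
        let sd = variance.squareRoot()
        return sd > 0 ? (value - mean) / sd : 0
    }
    return [
        "mobilityEntropy":    z(today.mobilityEntropy,    baseline.map(\.mobilityEntropy)),
        "outgoingMessages":   z(today.outgoingMessages,   baseline.map(\.outgoingMessages)),
        "nightScreenMinutes": z(today.nightScreenMinutes, baseline.map(\.nightScreenMinutes)),
    ]
}
```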

Cognitive and motor decline research. Typing speed and accuracy, vocal acoustic features, gait from accelerometer data, and routine stability from visit patterns all carry signal relevant to neurodegenerative conditions. Capturing them passively means the people who most need to be monitored, including older adults and individuals with early symptoms, aren’t burdened by the monitoring itself.

Just-in-time interventions. Centralive’s closed-loop architecture means passive sensing isn’t just a measurement strategy. It’s a trigger. A spike in nighttime device usage, a drop in mobility, a shift in vocal features during calls: any of these can be wired up to deliver an intervention precisely when a participant might benefit from one.
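
The wiring can be as simple as named predicates over those same daily scores. A sketch, where `scheduleIntervention` is a hypothetical stand-in for Centralive's delivery call:

```swift
import Foundation

/// Sketch: a just-in-time trigger as a named predicate over daily z-scores
/// (see the fingerprint sketch above).
struct Trigger {
    let name: String
    let fires: ([String: Double]) -> Bool
}

let triggers = [
    Trigger(name: "night-usage-spike") { $0["nightScreenMinutes", default: 0] > 2 },
    Trigger(name: "mobility-drop")     { $0["mobilityEntropy",    default: 0] < -2 },
]

func evaluate(_ dailyScores: [String: Double]) {
    for trigger in triggers where trigger.fires(dailyScores) {
        scheduleIntervention(reason: trigger.name)  // hypothetical delivery call
    }
}

func scheduleIntervention(reason: String) { /* platform-specific */ }
```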

Context for everything else. Even when passive sensing isn’t the headline measure, it’s powerful as context. An EMA response means something different at home than it does at work. A heart rate reading from the watch means something different during a sedentary day than after a long walk. Passive sensing gives every other data point a context to live in.

A note on what’s deliberately excluded

It’s worth being explicit about what these signals don’t include, because the design is intentional. SensorKit excludes message content, the identities of people contacted, audio recordings, GPS coordinates, app and website names (categories are reported instead), notification content, and images or video. Researchers get behavioral structure without behavioral surveillance, and Apple reviews every study application before approval. Centralive surfaces these constraints transparently in the participant-facing consent flow, and each data type is authorized individually before any collection begins.

Privacy and research utility usually get framed as a tradeoff. The SensorKit design, and Centralive's implementation of it, is a working example that they don't have to be.

Bringing it together

Apple Watch gives us the body. iPhone gives us the behavior. Together, on the Centralive platform, they form one of the most comprehensive passive monitoring stacks available for real-world research, one that respects participants, satisfies institutional review, and surfaces signals at a depth that conventional digital health tools simply weren’t built to reach.

If you’re designing a study where passive iPhone data could carry weight, we’d be glad to walk through what your protocol could look like.