iOS SENSOR LAB
Web API Proof of Concept
1
Tap Request Motion Permission. iOS requires the permission call to run synchronously inside a direct tap handler; it cannot follow any async call.
⏳
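The constraint in step 1 can be sketched as below: `DeviceMotionEvent.requestPermission()` must be invoked in the same synchronous tick as the tap, before any `await` or timer. Function names and the `motion-btn` element id are illustrative, not taken from this page.

```javascript
// Must run inside the tap handler itself: iOS rejects the call if any
// async work (fetch, await, setTimeout) happens before it.
function requestMotionPermission() {
  // Older iOS and non-iOS browsers have no permission gate at all.
  if (typeof DeviceMotionEvent === 'undefined' ||
      typeof DeviceMotionEvent.requestPermission !== 'function') {
    return Promise.resolve('granted');
  }
  // The call happens synchronously; the returned promise can be awaited later.
  return DeviceMotionEvent.requestPermission();
}

// Hypothetical wiring -- the "motion-btn" id is an assumption.
if (typeof document !== 'undefined') {
  document.getElementById('motion-btn')?.addEventListener('click', () => {
    requestMotionPermission().then((state) => console.log('motion:', state));
  });
}
```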
2
Tap Request Camera + Mic + Location. iOS will show a system prompt for each — tap Allow on all three.
⏳
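Step 2 can be sketched as one async flow: a single `getUserMedia` request covers camera and mic, and geolocation's callback API is wrapped in a promise. Constraint values and the function name are assumptions.

```javascript
// Each API below triggers its own iOS system prompt.
async function requestAvAndLocation() {
  // Camera + mic in one getUserMedia call; iOS may still prompt per device.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment' },
    audio: true,
  });
  // Geolocation is callback-based; wrap it so it can be awaited.
  const position = await new Promise((resolve, reject) =>
    navigator.geolocation.getCurrentPosition(resolve, reject, {
      enableHighAccuracy: true,
      timeout: 10000,
    })
  );
  return { stream, position };
}
```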
WAITING
📐 ACCELEROMETER
— awaiting permission —
G-force — g
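The G-force readout above can be derived from the `devicemotion` event: take the magnitude of `accelerationIncludingGravity` (m/s²) and divide by standard gravity. A device at rest reads about 1 g; free fall reads about 0 g. Function names are illustrative.

```javascript
const STANDARD_GRAVITY = 9.80665; // m/s^2

// Magnitude of acceleration (including gravity) expressed in g.
function gForce(x, y, z) {
  return Math.hypot(x, y, z) / STANDARD_GRAVITY;
}

// Browser wiring, after permission is granted.
function startAccelerometer(onReading) {
  window.addEventListener('devicemotion', (e) => {
    const a = e.accelerationIncludingGravity;
    if (a) onReading({ x: a.x, y: a.y, z: a.z, g: gForce(a.x, a.y, a.z) });
  });
}
```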
🌀 GYROSCOPE
— awaiting permission —
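A sketch of the gyroscope readout, assuming it uses the same `devicemotion` event as the accelerometer (so one permission covers both): `rotationRate` reports angular velocity in degrees per second.

```javascript
// rotationRate axes: alpha is rotation around z, beta around x, gamma around y.
function startGyroscope(onReading) {
  window.addEventListener('devicemotion', (e) => {
    const r = e.rotationRate;
    if (r) onReading({ alpha: r.alpha, beta: r.beta, gamma: r.gamma });
  });
}

// Handy for feeding the values into trig-based visualizations.
function degToRad(deg) {
  return (deg * Math.PI) / 180;
}
```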
📍 GEOLOCATION (GPS)
— awaiting permission —
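The GPS panel can be fed with `watchPosition`, which streams position fixes (unlike the one-shot `getCurrentPosition` used at permission time). Field names in the callback object are my own shorthand.

```javascript
// Returns a watch id; pass it to navigator.geolocation.clearWatch to stop.
function startGeolocationWatch(onFix, onError) {
  return navigator.geolocation.watchPosition(
    (pos) => onFix({
      lat: pos.coords.latitude,
      lon: pos.coords.longitude,
      accuracyM: pos.coords.accuracy, // 95% confidence radius, meters
      speedMps: pos.coords.speed,     // null when the device cannot tell
    }),
    onError,
    { enableHighAccuracy: true, maximumAge: 0 }
  );
}
```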
📷 CAMERA + LIGHT SPECTRUM HISTOGRAM
— awaiting permission —
▸ RGB CHANNELS
▸ LUMINANCE
▸ DOMINANT COLOR
▸ RGB HISTORY
R
G
B
▸ LUMINANCE HISTORY
Avg Lum
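The camera panel's statistics can be computed from raw canvas pixels. The sketch below draws the video frame to a canvas, reads back RGBA data, and computes per-channel histograms plus Rec. 709 luminance. Treating the mean channel values as the "dominant color" is an assumption; the page may bucket hues differently.

```javascript
// Pure statistics over RGBA pixel data, as returned by
// CanvasRenderingContext2D.getImageData(...).data.
function frameStats(rgba) {
  const hist = {
    r: new Array(256).fill(0),
    g: new Array(256).fill(0),
    b: new Array(256).fill(0),
  };
  let sumLum = 0, sumR = 0, sumG = 0, sumB = 0;
  const pixels = rgba.length / 4;
  for (let i = 0; i < rgba.length; i += 4) {
    const r = rgba[i], g = rgba[i + 1], b = rgba[i + 2];
    hist.r[r]++; hist.g[g]++; hist.b[b]++;
    sumR += r; sumG += g; sumB += b;
    sumLum += 0.2126 * r + 0.7152 * g + 0.0722 * b; // Rec. 709 luminance
  }
  return {
    hist,
    avgLum: sumLum / pixels,
    // Mean color as a "dominant color" proxy -- an assumption.
    dominant: {
      r: Math.round(sumR / pixels),
      g: Math.round(sumG / pixels),
      b: Math.round(sumB / pixels),
    },
  };
}

// Browser sampling: draw the <video> frame to a canvas, then read pixels.
function sampleFrame(video, canvas) {
  const ctx = canvas.getContext('2d', { willReadFrequently: true });
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  return frameStats(ctx.getImageData(0, 0, canvas.width, canvas.height).data);
}
```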
🎙️ MICROPHONE + SOUND SPECTRUM
— awaiting permission —
▸ OSCILLOSCOPE (TIME DOMAIN)
▸ SPECTROGRAM (TIME × FREQUENCY)
Frequency axis: 0 Hz to ~22 kHz (ticks at 2k, 4k, 8k, 16k)
▸ FFT FREQUENCY SPECTRUM
Frequency axis: 0 Hz to ~22 kHz (ticks at 2k, 4k, 8k, 16k)
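All three audio views can come from one Web Audio `AnalyserNode`: `getByteTimeDomainData` feeds the oscilloscope, and `getByteFrequencyData` feeds both the FFT bars and each spectrogram column. The `fftSize` value and function names are assumptions.

```javascript
// Mic stream -> AnalyserNode; onFrame receives fresh time/frequency arrays.
async function startAudioAnalysis(onFrame) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new (window.AudioContext || window.webkitAudioContext)();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048; // yields 1024 frequency bins
  ctx.createMediaStreamSource(stream).connect(analyser);

  const time = new Uint8Array(analyser.fftSize);
  const freq = new Uint8Array(analyser.frequencyBinCount);
  (function tick() {
    analyser.getByteTimeDomainData(time);  // oscilloscope
    analyser.getByteFrequencyData(freq);   // FFT bars / spectrogram column
    onFrame(time, freq);
    requestAnimationFrame(tick);
  })();
}

// Bin index -> frequency in Hz. At a 44.1 kHz sample rate the top bin is
// ~22 kHz (Nyquist), matching the panel's frequency axis.
function binToHz(bin, sampleRate, fftSize) {
  return (bin * sampleRate) / fftSize;
}
```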
📱 SCREEN ORIENTATION
— detecting —
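Orientation detection can be sketched with the standard `screen.orientation` object, falling back to the legacy `window.orientation` angle on older iOS versions. The function name is illustrative.

```javascript
function watchOrientation(onChange) {
  const report = () => {
    if (screen.orientation) {
      onChange({ type: screen.orientation.type, angle: screen.orientation.angle });
    } else {
      // Legacy iOS fallback: window.orientation is 0 / 90 / -90 / 180 degrees.
      onChange({ type: 'unknown', angle: window.orientation || 0 });
    }
  };
  window.addEventListener('orientationchange', report);
  screen.orientation?.addEventListener('change', report);
  report(); // emit the initial state immediately
}
```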
🔲 VISUAL VIEWPORT
— detecting —
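The visual viewport panel likely uses the `window.visualViewport` API, which tracks the part of the page actually visible: it shrinks when the on-screen keyboard opens and reports the pinch-zoom scale, neither of which `window.innerWidth`/`innerHeight` reflect. A minimal sketch:

```javascript
function watchVisualViewport(onChange) {
  const vv = window.visualViewport;
  if (!vv) return; // unsupported browser
  const report = () => onChange({
    width: vv.width,
    height: vv.height,
    scale: vv.scale,          // pinch-zoom factor
    offsetTop: vv.offsetTop,  // scroll offset of the visual viewport
  });
  vv.addEventListener('resize', report);
  vv.addEventListener('scroll', report);
  report();
}
```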
💬 SPEECH RECOGNITION
— awaiting mic permission —
IDLE
Tap Start and speak…
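Speech recognition on iOS Safari goes through the prefixed `webkitSpeechRecognition` constructor and reuses the mic permission granted earlier. A sketch with streaming partial transcripts (the function name is an assumption):

```javascript
function startSpeech(onTranscript) {
  const SR = window.SpeechRecognition || window.webkitSpeechRecognition;
  if (!SR) return null; // API unavailable
  const rec = new SR();
  rec.continuous = true;      // keep listening across pauses
  rec.interimResults = true;  // stream partial transcripts while speaking
  rec.onresult = (e) => {
    let text = '';
    for (const result of e.results) text += result[0].transcript;
    onTranscript(text, e.results[e.results.length - 1].isFinal);
  };
  rec.start();
  return rec; // call rec.stop() to end the session
}
```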
🌐 NETWORK / IP
— fetching —
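The Network card's data source, `/cdn-cgi/trace`, is a Cloudflare endpoint that returns plain `key=value` lines (e.g. `ip=…`, `loc=…`). Because it is same-origin, no CORS setup is needed. A sketch of fetching and parsing it:

```javascript
// Parse "key=value" lines into an object.
function parseTrace(text) {
  const out = {};
  for (const line of text.trim().split('\n')) {
    const i = line.indexOf('=');
    if (i > 0) out[line.slice(0, i)] = line.slice(i + 1);
  }
  return out;
}

async function fetchNetworkInfo() {
  const res = await fetch('/cdn-cgi/trace');
  return parseTrace(await res.text());
}
```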
🖥️ CLIENT / DEVICE
— detecting —
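The client/device card needs no permission at all; everything can come from standard `navigator` and `screen` properties. Which exact fields the page shows is an assumption.

```javascript
function clientInfo() {
  return {
    userAgent: navigator.userAgent,
    language: navigator.language,
    cores: navigator.hardwareConcurrency,
    touchPoints: navigator.maxTouchPoints,
    screen: `${screen.width}x${screen.height} @${devicePixelRatio}x`,
    online: navigator.onLine,
  };
}
```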
⚠ Requires HTTPS: Camera and Mic (getUserMedia) are blocked on file:// URLs and other insecure origins.
All sensor processing stays local in the browser. The Network card fetches the same-origin /cdn-cgi/trace endpoint. Requires iOS Safari 13+.