Touch trails
React Native only. Android is fully supported; iOS support is on the roadmap.
How it works
A native touch listener is attached to the root view at init time. Every touch action maps to one of three event types:
| Native action | SDK event | Throttle |
|---|---|---|
| ACTION_DOWN | press | — |
| ACTION_MOVE | move | ~30 Hz (once per 33 ms) |
| ACTION_UP / ACTION_CANCEL | release | — |
All coordinates are normalized to [0,1] against the root view's width and height, so the replay player doesn't need to know anything about device pixel ratios.
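The mapping, throttling, and normalization above can be sketched as follows. This is an illustrative TypeScript model of the logic, not the SDK's actual implementation (the real listener lives in native Android code, and `createTouchMapper` and its types are hypothetical names):

```typescript
// Hypothetical model of the native capture pipeline: maps raw touch
// actions to SDK events, normalizes coordinates to [0, 1], and
// throttles move events to ~30 Hz (at most one per 33 ms).
type SdkTouchEvent = {
  type: "press" | "move" | "release";
  x: number; // normalized against root view width
  y: number; // normalized against root view height
  t: number; // timestamp in ms
};

const MOVE_THROTTLE_MS = 33; // ~30 Hz

function createTouchMapper(viewWidth: number, viewHeight: number) {
  let lastMoveAt = -Infinity;
  return function map(
    action: "down" | "move" | "up" | "cancel",
    px: number, // raw pixel coordinates from the native listener
    py: number,
    t: number
  ): SdkTouchEvent | null {
    const x = px / viewWidth;
    const y = py / viewHeight;
    switch (action) {
      case "down":
        return { type: "press", x, y, t };
      case "move":
        // Drop moves that arrive within the throttle window.
        if (t - lastMoveAt < MOVE_THROTTLE_MS) return null;
        lastMoveAt = t;
        return { type: "move", x, y, t };
      case "up":
      case "cancel":
        return { type: "release", x, y, t };
    }
  };
}
```

Because coordinates are divided by the view's own dimensions, the same event stream replays correctly on any screen size.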
What you see in the replay
| Element | Purpose | Lifetime |
|---|---|---|
| Purple stroke | Follows the finger as playback scrubs forward | Live while the gesture is in progress |
| Fading tail | Stroke stays visible after release and fades out | 800ms |
| Direction arrow (↑↓←→) | Appears on swipes that move more than 5% of the screen in either axis | With the fading tail |
| Ripple ring | Single taps with no movement | 600ms |
Classification
On release, the player computes `dx = releaseX - pressX` and `dy = releaseY - pressY`:
| Condition | Classified as |
|---|---|
| `max(abs(dx), abs(dy)) ≤ 0.05` | Tap |
| `abs(dy) > abs(dx)` and `dy < 0` | Swipe up |
| `abs(dy) > abs(dx)` and `dy > 0` | Swipe down |
| `abs(dx) > abs(dy)` and `dx < 0` | Swipe left |
| `abs(dx) > abs(dy)` and `dx > 0` | Swipe right |
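The table above translates directly into a small classifier. A minimal TypeScript sketch (the function name is illustrative; inputs are the normalized `[0, 1]` coordinates, and the exact-tie case `abs(dx) === abs(dy)` is not specified by the table, so this sketch falls through to horizontal):

```typescript
// Release-time gesture classification from press/release coordinates.
type Gesture = "tap" | "swipe-up" | "swipe-down" | "swipe-left" | "swipe-right";

const TAP_THRESHOLD = 0.05; // 5% of the screen in either axis

function classify(
  pressX: number,
  pressY: number,
  releaseX: number,
  releaseY: number
): Gesture {
  const dx = releaseX - pressX;
  const dy = releaseY - pressY;
  // Small total displacement: a tap, regardless of direction.
  if (Math.max(Math.abs(dx), Math.abs(dy)) <= TAP_THRESHOLD) return "tap";
  // Dominant axis wins; sign picks the direction (y grows downward).
  if (Math.abs(dy) > Math.abs(dx)) return dy < 0 ? "swipe-up" : "swipe-down";
  return dx < 0 ? "swipe-left" : "swipe-right";
}
```

Note that because coordinates are normalized, the 5% threshold means 5% of the root view's width or height, not a fixed pixel distance.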
Turning it off
```js
Galacha.init({
  projectKey: "...",
  captureTouches: false,
});
```
Event size impact
A 10-second scroll produces ~300 move events (30 Hz × 10 s). Each event is ~40 bytes serialized, so roughly 12 KB is added per 10 seconds of active gesturing. For most apps this is noise compared to frame data.
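The arithmetic behind those figures, using only the numbers stated above (30 Hz move rate, ~40 bytes per serialized event):

```typescript
// Back-of-envelope estimate of touch-event payload size.
const moveHz = 30;        // throttled move-event rate
const seconds = 10;       // duration of active gesturing
const bytesPerEvent = 40; // approximate serialized size

const events = moveHz * seconds;           // 300 move events
const totalBytes = events * bytesPerEvent; // 12,000 bytes ≈ 12 KB
```

Press and release events add a few dozen bytes more per gesture, which does not change the order of magnitude.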