Touch trails

React Native only. Android is fully supported. iOS is on the roadmap.

How it works

A native touch listener is attached to the root view at init time. Every touch action maps to one of three event types:

| Native action | SDK event | Throttle |
| --- | --- | --- |
| ACTION_DOWN | press | |
| ACTION_MOVE | move | ~30Hz (once per 33ms) |
| ACTION_UP / ACTION_CANCEL | release | |
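The ~30Hz move throttle can be sketched like this. `makeMoveThrottle` and the injected clock are illustrative names, not the SDK's actual internals; the real implementation presumably reads the native event timestamp.

```typescript
// Minimal sketch of the move throttle: forward at most one move
// event per 33ms window, dropping the rest.
const MOVE_INTERVAL_MS = 33;

// `now` is an injected monotonic clock in milliseconds, which also
// makes the throttle easy to test without real timers.
function makeMoveThrottle(now: () => number): () => boolean {
  let lastEmit = -Infinity;
  return () => {
    const t = now();
    if (t - lastEmit < MOVE_INTERVAL_MS) return false; // drop this move
    lastEmit = t;
    return true; // forward this move to the SDK
  };
}
```

Injecting the clock rather than calling `Date.now()` directly is just a testability choice for the sketch.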

All coordinates are normalized to [0,1] against the root view's width and height, so the replay player doesn't need to know anything about device pixel ratios.
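The normalization step is simple division by the root view's dimensions. A minimal sketch, assuming the raw coordinates and the view size are in the same pixel units (`normalize` is a hypothetical name, not the SDK's API):

```typescript
interface NormalizedPoint {
  x: number; // in [0, 1], relative to root view width
  y: number; // in [0, 1], relative to root view height
}

// Hypothetical helper: map raw pixel coordinates to the [0, 1] range
// the replay player consumes, independent of device pixel ratio.
function normalize(
  rawX: number,
  rawY: number,
  viewWidth: number,
  viewHeight: number
): NormalizedPoint {
  // Clamp so touches reported slightly outside the view stay in range.
  const clamp01 = (v: number) => Math.min(1, Math.max(0, v));
  return {
    x: clamp01(rawX / viewWidth),
    y: clamp01(rawY / viewHeight),
  };
}
```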

What you see in the replay

| Element | Purpose | Lifetime |
| --- | --- | --- |
| Purple stroke | Follows the finger as playback scrubs forward | Live while the gesture is in progress |
| Fading tail | Stroke stays visible after release and fades out | 800ms |
| Direction arrow (↑↓←→) | Appears on swipes > 5% of the screen | With the fading tail |
| Ripple ring | Single taps with no movement | 600ms |

Classification

On release, the player computes `dx = releaseX - pressX` and `dy = releaseY - pressY`:

| Condition | Classified as |
| --- | --- |
| `max(abs(dx), abs(dy)) ≤ 0.05` | Tap |
| `abs(dy) > abs(dx)` and `dy < 0` | Swipe up |
| `abs(dy) > abs(dx)` and `dy > 0` | Swipe down |
| `abs(dx) > abs(dy)` and `dx < 0` | Swipe left |
| `abs(dx) > abs(dy)` and `dx > 0` | Swipe right |
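The table above translates directly into a small classifier. This is a sketch of the logic as documented, not the player's actual code; names like `classifyGesture` are illustrative:

```typescript
type Gesture = "tap" | "swipe up" | "swipe down" | "swipe left" | "swipe right";

const TAP_THRESHOLD = 0.05; // 5% of the screen, per the table above

function classifyGesture(
  pressX: number,
  pressY: number,
  releaseX: number,
  releaseY: number
): Gesture {
  const dx = releaseX - pressX;
  const dy = releaseY - pressY;
  if (Math.max(Math.abs(dx), Math.abs(dy)) <= TAP_THRESHOLD) return "tap";
  // Vertical movement dominates: up if dy is negative (screen y grows downward).
  if (Math.abs(dy) > Math.abs(dx)) return dy < 0 ? "swipe up" : "swipe down";
  // Otherwise horizontal. The table leaves abs(dx) === abs(dy) undefined;
  // this sketch arbitrarily treats that tie as horizontal.
  return dx < 0 ? "swipe left" : "swipe right";
}
```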

Turning it off

```js
Galacha.init({
  projectKey: "...",
  captureTouches: false,
});
```

Event size impact

A 10-second scroll produces ~300 move events (30Hz × 10s). Each event serializes to ~40 bytes, so active gesturing adds roughly 12KB per 10 seconds. For most apps this is noise compared to frame data.
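As a back-of-envelope check, the figures above multiply out directly (all constants taken from this section):

```typescript
// Rough payload estimate for a sustained gesture.
const MOVE_HZ = 30;          // throttled move rate
const BYTES_PER_EVENT = 40;  // approximate serialized size
const DURATION_S = 10;

const events = MOVE_HZ * DURATION_S;     // 300 move events
const bytes = events * BYTES_PER_EVENT;  // 12,000 bytes ≈ 12KB
```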