Experimentation

A/B Testing & Experiments

Ran multiple hypothesis-driven experiments across no-shows, cancellations, and booking flows

6.9% → 2.5%
no-show rate
Company
Intellect
Role
Senior Data Analyst

5 Experiments

Hypothesis In, Evidence Out

Each experiment below followed the same discipline: observe something in the data, form a hypothesis, design a test, and measure the outcome. No gut feelings — just data-backed decisions.

Experiment 1

The Calendar Fix

Observation

No-show rates were running at ~7%. Users who booked multiple sessions at once were forgetting them.

Hypothesis

Adding "Add to Calendar" functionality would reduce no-shows as a lightweight behavioural nudge.

Test

Implemented calendar integration and measured no-show rates before/after.

Result
6.9% → 2.5% no-show rate

Takeaway

Sometimes the simplest intervention has the biggest impact.
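A before/after comparison like this can be sanity-checked with a two-proportion z-test. The session volumes below are hypothetical placeholders; only the 6.9% and 2.5% rates come from the case study.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z-statistic for the difference between two proportions, using a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical volumes: 1,000 sessions in each period,
# matching the reported 6.9% -> 2.5% no-show rates.
z = two_proportion_z(69, 1000, 25, 1000)
print(f"z = {z:.2f}")  # comfortably above the 1.96 threshold for p < 0.05
```

With these assumed volumes the drop is far larger than sampling noise would explain, which is the kind of check that separates a real effect from a lucky week.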

Experiment 2

Reschedule vs Cancel

Observation

30% of users who cancelled cited "scheduling conflict" — they wanted a different time, not to quit.

Hypothesis

Offering a reschedule-forward flow instead of a cancellation button would retain users who just need flexibility.

Test

Replaced the straight cancellation flow with a "reschedule first" prompt.

Result
-15%
cancellations
+8%
reschedules

Takeaway

Users were telling us why they cancelled — we just had to listen to the data.
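A change like this is easiest to trust when the test is sized up front. Below is a minimal power calculation for two proportions using the normal approximation; the 20% baseline cancellation rate is an assumed placeholder, and the 15% relative reduction mirrors the observed result.

```python
import math

Z_ALPHA = 1.959964  # two-sided alpha = 0.05
Z_BETA = 0.841621   # power = 0.80

def n_per_arm(p1, p2):
    """Users needed per arm to detect a shift from p1 to p2 (normal approximation)."""
    p_bar = (p1 + p2) / 2
    a = Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
    b = Z_BETA * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))
    return math.ceil(((a + b) / (p1 - p2)) ** 2)

# Assumed 20% baseline cancellation rate; a 15% relative drop lands at 17%.
print(n_per_arm(0.20, 0.17))  # a few thousand users per arm
```

The takeaway from the sketch: relative effects in the 10-15% range need thousands of users per arm, so runtime has to be planned around traffic.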

Experiment 3

Provider Selection Cohort

Observation

Auto-matched users showed low commitment — 45% match-to-booking, 57.6% session completion.

Hypothesis

Letting users pick their own provider creates psychological ownership and stronger commitment.

Test

Cohort experiment replacing auto-match with user-selected provider listing.

Result
45% → 82%
match-to-booking
57.6% → 70%
session completion
19.6% → 13.4%
repeat-matching

Takeaway

User agency drives commitment. Auto-matching optimises for speed; user-selection optimises for outcome.
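The cohort deltas above translate into relative lifts. A quick sketch using only the numbers reported in this experiment:

```python
# Reported cohort metrics: (auto-match, user-selected)
metrics = {
    "match_to_booking": (0.45, 0.82),
    "session_completion": (0.576, 0.70),
    "repeat_matching": (0.196, 0.134),  # a drop is the good direction here
}

lifts = {name: (selected - auto) / auto for name, (auto, selected) in metrics.items()}

for name, lift in lifts.items():
    print(f"{name}: {lift:+.0%} relative change")
```

Framing the change as relative lift (+82% on match-to-booking, roughly -32% on repeat-matching) is usually the clearest way to communicate cohort results to a non-technical audience.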

Experiment 4

Combined Match + Booking Flow

Observation

30% drop between matching with a provider and actually booking.

Hypothesis

Combining match and booking into a single committed flow removes the friction gap.

Test

Single-step flow where users commit to booking at point of provider selection.

Result
+5%
higher booking rate when users committed at point of selection

Takeaway

Reducing steps reduces drop-off. Every extra click is a chance to lose the user.

Experiment 5

Session Assignment Impact

Observation

10% of providers proactively sent session assignments (homework/exercises) to users.

Hypothesis

Provider-initiated engagement between sessions improves retention.

Finding

Users with assigned providers had 12% higher retention.

Result
+12%
higher retention for users with session assignments

Recommendation

Surfaced this as a product recommendation to expand assignment adoption platform-wide.

Takeaway

The best insights sometimes come from observing outlier provider behaviour.
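Because assignment usage was provider-initiated rather than randomised, this finding is observational, and a confidence-interval check is a reasonable first gate before recommending a rollout. The retention counts below are hypothetical illustrations; the helper implements the standard Wilson score interval for a binomial proportion.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion at ~95% confidence."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical counts: 56% vs 50% retention, a 12% relative gap.
with_lo, with_hi = wilson_ci(1120, 2000)      # users whose provider sent assignments
without_lo, without_hi = wilson_ci(1000, 2000)  # users without assignments

print(f"with assignments:    ({with_lo:.3f}, {with_hi:.3f})")
print(f"without assignments: ({without_lo:.3f}, {without_hi:.3f})")
```

Non-overlapping intervals at these assumed volumes suggest the gap is not noise, though only a randomised test would rule out selection effects, since providers who send assignments may simply be more engaged overall.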

Let's work together.

Looking for a data person who can go from SQL to boardroom? I'd love to chat.