Why tennis AI gets confused on clay (and how we handle it)

Clay courts break most tennis AI ball trackers. Here's why — orange ball on orange surface, faint lines, dust — and how AceSense's court keypoint model adapts.

If you play on clay and you've tried a tennis AI app, you've probably had this experience: the app works fine on the local hard courts, you take it to clay for a club match, and suddenly the heatmap is wrong, the line calls are wrong, or the shot count is half what you actually hit. You're not imagining it. Clay courts are genuinely harder for AI than hard courts, and most apps don't talk about why.

This post explains the why, walks through the specific failure modes, and shows how AceSense handles them. I'm the founder, so I'll be honest about where we still have weak spots.

TL;DR

  • Clay-court tennis AI is harder because line contrast is lower, the ball-court color delta is smaller, and dust/footprint marks add visual noise.
  • A model trained mostly on hard courts will under-detect on clay — App Store reviews of competing apps document this directly.
  • AceSense's court keypoint model includes annotated clay footage in training and evaluation. Per-surface accuracy is on /accuracy.
  • Our weakest cases are faded club clay with poorly maintained lines, and green-clay (Har-Tru) under specific lighting.
  • If your clay video fails, send it to us — we want the edge cases.

The clay complaint that keeps showing up

Read SwingVision's App Store reviews (source) and you'll find the same complaint repeated:

"on clay it doesn't understand where the lines of the court are."

This isn't a one-off review. It's a pattern. And it's not because the SwingVision team built a bad product — they didn't. It's because clay-court tennis AI has specific technical challenges that don't show up if your training data is biased toward American hard courts.

Three things go wrong on clay, and you can fix some of them with better data, some of them with better models, and some of them only with better-quality video.

What's actually different about clay video

1. Line contrast is lower

A white line on a green hard court has a luminance delta of roughly 60–80 in standard 8-bit video (depending on the lighting). A white line on a freshly swept red clay court is closer to 30–45. After two hours of play, with footprint marks and dust, that drops further. Some club clay courts have lines that are barely 10–15 luminance points different from the surface.
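To make those luminance numbers concrete, here's a small sketch computing the BT.601 luma delta. The pixel values are illustrative assumptions chosen to land in the ranges above, not measurements from real footage:

```python
def luma(rgb):
    """BT.601 luma of an 8-bit RGB pixel, the Y' most video pipelines store."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

# Illustrative pixel values (assumptions, not measured from real footage):
white_line = (200, 200, 200)   # painted line under typical daylight
green_hard = (90, 150, 110)    # green acrylic hard court
red_clay   = (200, 150, 110)   # freshly swept red clay

print(luma(white_line) - luma(green_hard))  # hard-court delta, roughly 70
print(luma(white_line) - luma(red_clay))    # clay delta, roughly 40
```

Halve the line brightness delta (dust, late-day shadow) and the clay number falls into the 10–15 range where detection gets genuinely hard.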

Court keypoint detection models (which find the four corners and key intersections of the court) rely on edge detection at some layer. When the edge signal weakens, detection weakens. This is why a model that hits 95% line accuracy on hard courts can drop to 70% on dusty clay.
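A rough way to see the mechanism: on a synthetic scanline, the peak gradient response scales directly with the line-surface contrast. This is a toy stand-in for real learned edge filters, not AceSense's model:

```python
def gradient_peak(scanline):
    """Peak absolute central-difference gradient: a crude stand-in for the
    edge signal a keypoint model's early layers respond to."""
    return max(abs(scanline[i + 1] - scanline[i - 1]) / 2
               for i in range(1, len(scanline) - 1))

def court_scanline(surface_luma, line_luma, width=40, line_at=20, line_w=4):
    """Synthetic row of pixels: flat surface with a painted line in the middle."""
    row = [surface_luma] * width
    for i in range(line_at, line_at + line_w):
        row[i] = line_luma
    return row

hard = court_scanline(surface_luma=130, line_luma=200)
clay = court_scanline(surface_luma=160, line_luma=200)
print(gradient_peak(hard), gradient_peak(clay))  # → 35.0 20.0
```

Same line, same filter; the clay edge response is almost half as strong before you even add dust or footprints.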

2. Ball-court color similarity

A yellow tennis ball against a blue hard court is a high-contrast object — the ball pops in almost any light. The same ball against an orange clay court is a much lower-contrast target. The HSV color delta is small enough that, in the wrong lighting (overcast afternoons, low golden-hour sun), the ball can blur visually into the surface.
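To put numbers on that, here's a sketch using Python's stdlib colorsys. The colors are assumed for illustration, not sampled from footage; the point is the relative gap, which is several times smaller on clay than on a blue hard court:

```python
import colorsys

def hue_deg(rgb):
    """Hue of an 8-bit RGB color, in degrees on the standard color wheel."""
    h, _, _ = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    return h * 360

# Illustrative colors (assumptions, not measured from real footage):
ball_yellow = (220, 230, 60)
clay_orange = (200, 110, 70)
hard_blue   = (40, 90, 180)

print(abs(hue_deg(ball_yellow) - hue_deg(clay_orange)))  # small gap: ball blends in
print(abs(hue_deg(ball_yellow) - hue_deg(hard_blue)))    # large gap: ball pops
```

Shrink saturation and value too (overcast light, golden-hour shadow) and the remaining hue gap is all the model has left.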

This isn't a hypothetical. We've seen videos where the per-frame ball detection rate drops 10–15 percentage points on clay relative to hard, with the same player and the same camera setup, just because the contrast is worse.

3. Dust, marks, and visual noise

Hard courts are visually clean. Clay courts have:

  • Footprint patterns that look like ball-shaped marks to a vision model.
  • Bounce marks (a useful signal for human players, but easy to confuse with the ball itself).
  • Dust accumulation along lines that breaks edge detection.
  • Lighter and darker patches where the surface has been re-rolled.

A ball-detection model that's only seen clean hard-court training data will fire false positives on bounce marks. Even purpose-built models like TrackNet need clay-specific training data to suppress these.

How AceSense handles it

Three concrete decisions in the pipeline:

1. Clay in the training set

Our court keypoint model is trained on a mix that explicitly includes European red clay, Har-Tru green clay, and indoor clay. We didn't bolt clay on at the end — we trained for it from the start because we're a European product and roughly a quarter of the tennis our users film is on clay.

This is the single biggest reason we don't see the line-detection failures that show up in SwingVision App Store reviews. Not because our model is fundamentally smarter, but because it saw the right pictures during training.


2. Per-surface evaluation

Our /accuracy page lists per-surface F1 scores for ball detection, shot detection, and court keypoints. The clay numbers are slightly behind the hard-court numbers — that's the honest truth — but the gap is small enough that the rec-player experience is consistent across surfaces.

We publish the gap rather than hiding it. If clay numbers were dramatically worse, you'd want to know before you uploaded a clay match.

3. Bounce detection on clay benefits from the surface

Here's a slightly counter-intuitive point: clay is harder for ball tracking, but easier for bounce detection, because the ball leaves a visible mark. Our pipeline doesn't use bounce marks directly (we detect bounces from the ball trajectory hitting the court plane), but the surface texture means the visual evidence of where the ball landed lines up with the model's prediction in a way that hard courts don't always provide.
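Trajectory-based bounce detection can be sketched as follows. This is an illustrative simplification with assumed thresholds, not AceSense's actual pipeline:

```python
def detect_bounces(heights, eps=0.05):
    """Find frames where the ball's height above the court plane dips to
    roughly zero and its vertical motion flips from falling to rising.
    heights: per-frame ball height in metres above the court plane.
    eps: tolerance (assumed) for 'close enough to the surface'.
    """
    bounces = []
    for i in range(1, len(heights) - 1):
        falling = heights[i] < heights[i - 1]
        rising = heights[i + 1] > heights[i]
        if falling and rising and heights[i] < eps:
            bounces.append(i)
    return bounces

# Synthetic trajectory: the ball drops, bounces, rises, and bounces again.
heights = [1.0, 0.6, 0.3, 0.1, 0.0, 0.2, 0.35, 0.3, 0.15, 0.02, 0.1]
print(detect_bounces(heights))  # → [4, 9]
```

Note that nothing here looks at surface color at all, which is why bounce quality holds up on clay even when per-frame ball detection dips.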

This matters because the heatmap your players see is largely a bounce heatmap. Even if per-frame ball tracking drops a few points on clay, the bounces — the things you actually use — are typically still good.

Where AceSense still struggles on clay

I said I'd be honest. Two known weak spots:

Faded, poorly maintained club clay

If you play on a club court where the lines are washed out and the surface hasn't been rolled in weeks, the model's court keypoint detection can fail entirely. There's no signal there for the model to lock onto, because there's barely a signal a human umpire could lock onto either. We retry with relaxed thresholds; sometimes that works, sometimes it doesn't.
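The retry-with-relaxed-thresholds step can be sketched like this. The threshold values and the detector interface are assumptions for illustration; the real pipeline's values aren't public:

```python
def detect_court(frame, detector, thresholds=(0.8, 0.6, 0.4)):
    """Run keypoint detection at progressively relaxed confidence thresholds.
    detector(frame, threshold) should return a list of keypoints, or an
    empty list / None when it can't lock onto the court at that threshold.
    (Sketch only; thresholds here are illustrative assumptions.)"""
    for t in thresholds:
        keypoints = detector(frame, t)
        if keypoints:
            return keypoints, t
    return None, None

# Stub detector that only succeeds once the threshold is relaxed to 0.6:
stub = lambda frame, t: ["corner"] * 4 if t <= 0.6 else None
print(detect_court("frame", stub))  # → (['corner', 'corner', 'corner', 'corner'], 0.6)
```

The trade-off is the obvious one: each relaxation admits more false keypoints, so on a truly washed-out court the retry can still come back empty, which is the honest failure mode described above.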

If this is your home court, please send us a sample video. We retrain on real edge cases.

Green clay (Har-Tru) in specific lighting

Har-Tru, common in the US South, has different visual properties than European red clay. Some of our color-based heuristics were tuned on red and behave a little oddly on green clay under certain lighting (especially early-morning or late-afternoon shadow-heavy conditions). The pipeline still produces a report; the heatmap may have slightly more noise than usual.

This is on the model retraining roadmap. It's solvable; it's just not solved today.

What you can do as a player

Three things help on clay regardless of which app you use:

  1. Film when the court is freshly swept. First match of the morning, or right after the groundskeeper rolls it. Lines are at their crispest, dust is minimal.
  2. Get your phone higher. Clay's lower contrast means the model needs more pixels on the ball to detect it reliably at a given distance. Mounting your phone at head-height or slightly above (3–4 feet, not eye-level on a chair) gives the ball more pixels to work with.
  3. Use natural light, not floodlights, when possible. Clay under floodlights is the worst combination: uneven lighting amplifies the contrast problem. Daylight on clay is fine for the model; floodlit indoor clay is the hardest case.
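On the pixels-per-ball point in tip 2, a pinhole-camera sketch shows how strongly distance drives the pixel budget. The camera parameters are rough assumptions, not any specific phone:

```python
def ball_pixels(distance_m, focal_mm_equiv=26, image_width_px=3840):
    """Approximate on-screen diameter (px) of a 6.7 cm tennis ball under a
    pinhole model, using the 35mm-equivalent focal length (36 mm frame width).
    All parameters are rough assumptions, not tuned to a specific device."""
    focal_px = focal_mm_equiv / 36 * image_width_px
    return 0.067 * focal_px / distance_m

print(round(ball_pixels(12)))  # ball near the far baseline: ~15 px in 4K
print(round(ball_pixels(6)))   # ball nearer the camera: exactly twice that
```

Every meter closer (or every step up in resolution) buys the detector pixels it can spend compensating for clay's weaker contrast.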

The Talk Tennis "Any Free Video Analysis Apps?" thread has people swapping setup tips — most of them apply double on clay.

Why this matters for buying decisions

If you primarily play on clay, your tennis-AI shortlist is shorter than the global "best tennis app" list. The vendors that explicitly evaluate on clay are the ones to start with. The ones whose App Store reviews flag clay as a weak spot are the ones to skip — or use only on hard courts.

This is one advantage of AceSense being built in Europe. Roughly half the tennis our team plays is on clay; we had to make it work or our own product would be unusable. American-built vendors shipping from a region with limited clay exposure don't have the same forcing function.

The bigger picture

Clay courts aren't going to get easier for tennis AI. Players who play on clay will do one of three things:

  • Choose a vendor that explicitly tests on clay (like AceSense, like a few others).
  • Accept that their hard-court app gives them noisier data on clay days.
  • Manually verify the heatmap on their first few clay sessions and decide whether the noise is acceptable.

We've made the bet that publishing per-surface accuracy and including clay in the training data is what serious clay players want from a tennis-AI vendor. The App Store complaints suggest that bet is correct.

FAQ

Does AceSense work on clay? Yes. Per-surface F1 numbers are on /accuracy. There's a slight gap versus hard courts, but it's small enough that the rec-player experience is consistent.

Why does SwingVision struggle on clay? App Store reviews report it directly — the line-detection model under-performs on clay surfaces. Likely cause: training-data bias toward hard courts.

What kind of clay is hardest? Faded, dusty club clay where the lines are barely visible. Some Har-Tru green clay under low-angle light is also a known weak spot.

Can I help improve clay support? Yes — send your clay-court videos that didn't work to [email protected]. Real edge cases are how the model improves.

Will my hard-court setup work on clay? Yes — same camera angle, same height, same tripod. If anything, raise the camera slightly and shoot in natural light.


Try AceSense free on hard, clay, or indoor — same pipeline, same accuracy methodology. Start free · How AceSense works · Tennis ball tracking accuracy explained · Doubles support in AI tennis apps