Troubleshooting

Common issues and how to fix them.


Device not discovered

Symptom: The Haptix daemon doesn't see your device, or the agent gets "no devices found" from start_session.

Fixes:

  1. USB cable connected — Your device must be connected to your Mac with a USB cable. Haptix uses USB-only device discovery.

  2. Trusted computer — When you first connect a device, iOS asks "Trust This Computer?" You must tap Trust and enter your device unlock code.

  3. Haptix.start() called — Make sure Haptix.start(license:) is in your app's init() and is actually executing. If it's inside #if DEBUG, verify you're building the Debug configuration.

  4. Developer Mode — On the device: Settings → Privacy & Security → Developer Mode → toggle on. Requires a restart.

  5. Try a different cable/port — Some USB cables are charge-only and don't support data. Try a different cable or USB port.
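Fix 3 above can be sketched as follows, assuming a SwiftUI app. Haptix.start(license:) is the call named in this doc; the license string, app name, and content view are placeholders:

```swift
import SwiftUI
#if canImport(HaptixKit)
import HaptixKit
#endif

@main
struct MyApp: App {
    init() {
        // Debug-only: this call is compiled out of Release builds entirely.
        #if DEBUG && canImport(HaptixKit)
        Haptix.start(license: "YOUR-LICENSE-KEY")
        #endif
    }

    var body: some Scene {
        WindowGroup { Text("Hello") }
    }
}
```

If discovery still fails, set a breakpoint on the Haptix.start line to confirm it actually executes in the build configuration you're running.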


License mismatch

Symptom: The agent connects but commands return licensing error messages.

Fix: The license key in Haptix.start(license:) in your iOS app must match the one entered in the Haptix Mac app settings.


Agent can't connect to MCP server

Symptom: Your AI agent reports that it can't reach the Haptix MCP server.

Fixes:

  1. Haptix running — Make sure the Haptix Mac app is running (check your menu bar). The daemon starts automatically with the app.

  2. Correct URL — The MCP endpoint is http://localhost:4278/mcp. Check your agent's config file for typos. If you have an older config using /sse, update it to /mcp.

  3. Port conflict — If something else is using port 4278, the daemon won't start. Check with: lsof -i :4278.

  4. Config file location — Make sure the MCP config is in the right place for your agent. See MCP Setup for the exact paths.

  5. Restart agent — Some agents (Cursor, VS Code) require a restart or window reload after adding MCP config.
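For fix 2, a typical agent config entry looks like the following. This is a sketch: the exact shape varies by agent, and the "haptix" server name and "url" field are assumptions, not prescribed by this doc — only the endpoint URL itself comes from above:

```json
{
  "mcpServers": {
    "haptix": {
      "url": "http://localhost:4278/mcp"
    }
  }
}
```

Check your agent's own MCP documentation for its exact schema and file location.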


Taps not working on specific controls

Symptom: Taps work on some elements but not others.

Possible causes:

  1. Context menus — If the control uses UIContextMenuInteraction or SwiftUI .contextMenu, tap won't work on menu items. This is a known issue.

  2. System-presented UI — Alerts, action sheets, and share sheets are system-owned and may reject synthetic touches. See Compatibility.

  3. Menu-style pickers — SwiftUI Picker with .menu style uses context menus internally. See Compatibility.

  4. Keyboard covering element — If the software keyboard is visible, it covers the bottom ~40% of the screen. Elements behind it can't be tapped. Dismiss the keyboard first by tapping a non-interactive area above it.

  5. Element off-screen — The element might be scrolled out of view. Use scroll to bring it into the viewport first.
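For cause 4, if you control the app under test, you can also dismiss the keyboard programmatically rather than relying on a dismissal tap. This is a standard UIKit idiom, not a Haptix API:

```swift
import UIKit

extension UIApplication {
    /// Resigns first responder app-wide — a common UIKit pattern for
    /// dismissing the software keyboard without knowing which field has focus.
    func dismissKeyboard() {
        sendAction(#selector(UIResponder.resignFirstResponder),
                   to: nil, from: nil, for: nil)
    }
}

// Usage: UIApplication.shared.dismissKeyboard()
```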


Gestures not working as expected

Symptom: Swipe, drag, pinch, or rotate doesn't produce the expected result.

Possible causes:

  1. Swipe on pickers — Vertical swipes are redirected to programmatic scrolling, which doesn't work on picker wheels or custom scroll-driven controls. Use small swipe movements directly on the picker wheel column.

  2. Drag not working — The drag tool currently has limitations. See Compatibility.

  3. Pinch/rotate broken — Multi-touch gestures (pinch, rotate) are currently broken. Gesture recognizers reject synthetic multi-touch events. See Compatibility.

  4. Use scroll for page navigation — For basic up/down/left/right page scrolling, use the scroll tool instead of swipe. It's more reliable for that purpose.


Simulator vs. physical device differences

Simulator advantages

  • No Developer Mode or UI Automation settings required
  • No USB cable needed (localhost connection)
  • Faster screenshot capture
  • No battery or charging concerns

Simulator limitations

  • No haptic feedback
  • Performance differs from real hardware
  • Some hardware features unavailable (camera, certain sensors)

Physical device requirements

Physical devices require two settings:

  1. Developer Mode — Settings → Privacy & Security → Developer Mode → toggle on (requires restart)
  2. UI Automation — Settings → Developer → Enable UI Automation → toggle on

If either setting is off, the device may still connect, but gestures can fail or behave unreliably.


Slow first interaction

Symptom: The first tap or screenshot takes 1–2 seconds longer than subsequent ones.

Cause: If the Xcode debugger is attached (debuggerAttached: true in device_info), there's a one-time warmup delay on the first UI automation interaction. Subsequent interactions are fast.

Fix: This is expected behavior. The delay only happens once per debug session.


App crashes on launch with HaptixKit

Possible causes:

  1. Release build — If HaptixKit is included in a Release build and the app can't find development frameworks, it may crash. Always wrap in #if DEBUG and link only in the Debug configuration.

  2. iOS version — HaptixKit requires iOS 16.0+. On earlier iOS versions, the app will crash at launch when the framework loads.
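Cause 1 can be guarded with a compile-time check so no code references HaptixKit outside Debug builds. A sketch, assuming the framework is also linked only for the Debug configuration in your target's build settings (the license string is a placeholder):

```swift
#if DEBUG && canImport(HaptixKit)
import HaptixKit

func startHaptix() {
    Haptix.start(license: "YOUR-LICENSE-KEY")
}
#else
func startHaptix() {
    // No-op in Release builds, or when HaptixKit isn't linked.
}
#endif
```

Call startHaptix() unconditionally from your app's init(); the #else branch guarantees Release builds never touch the framework.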


Screenshots are blank or black

Possible causes:

  1. App in background — The app must be in the foreground for screenshots to capture content.

  2. Secure content — Some system UI (password fields, certain system sheets) may render as black in screenshots for security.

  3. GPU rendering — Rarely, Metal-rendered content may not capture correctly. Try mode: "full" instead of mode: "app".


Still stuck?

If none of the above resolves your issue:

  • Email support@haptix.dev with:

    • Your macOS version and device/iOS version
    • The Haptix app version (from the menu bar → About)
    • What you tried and what happened
    • Any error messages from the Xcode console
  • Check the Compatibility page for known issues with specific controls or gestures.