[How-to] AI-Enhance Your Fitness App with Minimal Coding: The Burpee Example

December 7, 2023
Image powered by AI | DALL-E

If you're the proud owner of a fitness app, you've likely curated an impressive collection of exercises, encompassing beloved classics such as squats, high knees, and burpees. However, in the ever-evolving landscape of health and fitness, there's always room for innovation. Imagine taking your users' workout experience to a whole new level by infusing your app with the cutting-edge power of AI technology. The result? Workouts that are not only effective but also interactive, engaging, and downright thrilling.

And here's the best part: this transformation isn't reserved for tech wizards or algorithm gurus. In fact, it's as simple as adding just a few lines of code! How, you ask? Stay with us as we unveil the thrilling secret in the paragraphs to come.

1. The perfect Burpee exercise: the role of Motion Tracking

Mastering the perfect burpee involves engaging a complex interplay of muscles and joints, making it a comprehensive full-body exercise. It recruits the quadriceps, hamstrings, glutes, and calves during the squat and jump phases, while the push-up component activates the chest, shoulders, triceps, and core. The challenge lies in executing this multifaceted movement seamlessly, as poor form can lead to injury or reduced effectiveness.

This is where AI motion tracking shines. By integrating AI into your fitness app, you can offer users real-time guidance and feedback on their burpee form. The AI system can assess alignment, depth, and timing, ensuring each burpee is safe and efficient. It alerts users to potential form errors, encouraging corrections for optimal results while minimizing the risk of strain or injury. AI motion tracking transforms the burpee from a challenging exercise into a safer, more effective, and highly engaging fitness experience.

2. AI-Powered Exercise: Your Comprehensive How-To Guide

Have you ever wondered how to take your standard Burpee exercise and turn it into a thrilling AI-powered fitness experience? With companies like Sency, the transformation is as simple as plug-and-play.

Thanks to our user-friendly SDK, SMFit, we provide you with step-by-step instructions to seamlessly integrate AI technology into your fitness app with just a few lines of code.

A- What's in it for you?

Before delving into a more technical explanation, let's define what SMFit is and what it allows you to add to your app:

  • Rep Counting: Say goodbye to manual tracking! AI can accurately count your repetitions, ensuring you stay on target and meet your fitness goals.
  • Range of Motion: Achieve the perfect form by receiving real-time feedback on your movement. AI helps you maintain the right range of motion for each exercise, maximizing effectiveness and safety.
  • Real-Time Instruction: Get instant guidance as you work out. AI provides real-time instructions, helping you perfect your technique and push your limits.
  • Skeleton Tracking: Experience a whole new level of precision with AI-powered skeleton tracking. It ensures your movements are spot on, helping you to fine-tune your Burpee exercise like never before.

With Sency SDK, your fitness journey will never be the same. Say hello to precision, efficiency, and motivation like never before.

B- SMFit documentation

SMFit is Sency's core-level SDK, which gives you direct access to SencyMotion's fitness-domain functionality and features. Using this SDK, you get full flexibility to create your own workout structures through easy-to-use APIs and to build your own UI on top of them.

Installing SMFit

Install via Cocoapods:
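A typical Podfile entry might look like the following. Note that the pod name and the `use_frameworks!` requirement are assumptions based on the SDK name; check Sency's documentation for the exact podspec:

```ruby
# Podfile — pod name 'SMFit' is an assumption; confirm against Sency's docs
target 'YourFitnessApp' do
  use_frameworks!
  pod 'SMFit'
end
```

Then run `pod install` from your project directory and open the generated `.xcworkspace`.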

Configuring SMFit

Configure the SDK on app launch:
Call SMSessionManager to configure the SDK with your auth key. This process may take a few seconds, and if you call any other SMFit API before it completes, you will receive an error.
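A minimal sketch of this step is shown below. The exact SMSessionManager method name and completion signature are assumptions; consult the full SMFit documentation for the real API:

```swift
import UIKit
import SMFit  // assumed module name

@main
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Hypothetical configure call — method name and parameters are assumptions.
        SMSessionManager.configure(authKey: "YOUR_AUTH_KEY") { error in
            if let error = error {
                print("SMFit configuration failed: \(error)")
                return
            }
            // Only call other SMFit APIs after configuration completes.
        }
        return true
    }
}
```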

Using SMFit

InitSMExerciseFlowManager

First, init the SMExerciseFlowManager and make sure to conform to SMFitSessionDelegate:

  • captureSessionDidSet - called once the camera data is available, with an AVCaptureSession you can then use to present the camera output
  • captureSessionDidStop - called if the camera session stops
  • handleDetectionData - called every frame with a MovementFeedbackResultData struct that holds all the feedback data:
      ◦ didDetectRep - (dynamic exercise) whether the user performed a repetition
      ◦ isGoodRep - (dynamic exercise) true if the repetition was good
      ◦ isInPosition - (static/isometric exercise) true if the user is in position
      ◦ currentRomValue - the user's current range of motion
      ◦ specialParams - some dynamic exercises have special params; for example, the exercise "Jumps" has "JumpPeakHeight" and "currHeight"
  • handlePositionData - called every frame with:
      ◦ poseData - a dictionary of joints and their positions (can be empty if the user was not detected in the camera's field of view)
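The delegate conformance described above might be sketched as follows. The initializer and the exact delegate method signatures are assumptions based on the descriptions in this list, not the verified SDK API:

```swift
import UIKit
import AVFoundation
import SMFit  // assumed module name

class WorkoutViewController: UIViewController, SMFitSessionDelegate {

    var flowManager: SMExerciseFlowManager?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Assumed initializer taking the delegate.
        flowManager = SMExerciseFlowManager(delegate: self)
    }

    func captureSessionDidSet(session: AVCaptureSession) {
        // Present the camera output, e.g. via a preview layer.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        view.layer.insertSublayer(previewLayer, at: 0)
    }

    func captureSessionDidStop() {
        // Camera session stopped — update the UI accordingly.
    }

    func handleDetectionData(data: MovementFeedbackResultData?) {
        guard let data = data else { return }
        if data.didDetectRep {
            // Increment the on-screen rep counter; flag bad form via data.isGoodRep.
        }
    }

    func handlePositionData(poseData: [String: CGPoint]?) {
        // poseData maps joints to positions; empty when no user is detected.
    }
}
```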


Start the exercise session

Initiate the session:
Start detection:
Stop detection:
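The three lifecycle calls above might look like this in code, assuming an SMExerciseFlowManager instance named `flowManager`; all method names here are assumptions based on the step titles:

```swift
// Hypothetical session lifecycle — method names are assumptions.
func runBurpeeSet() {
    // 1. Initiate the session.
    flowManager?.startSession()

    // 2. Start detecting a specific exercise (name is illustrative).
    flowManager?.startDetection(exercise: "Burpees")

    // ... the user performs the set ...

    // 3. Stop detection; the results arrive packed in an SMExerciseInfo.
    let info = flowManager?.stopDetection()
}
```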
When you stop exercise detection, you will get all detected data packed in a struct of type SMExerciseInfo:

  • sessionId - the current session ID
  • startTime - the exercise start time
  • endTime - the exercise end time
  • totalTime - time in seconds from start to end of detection

In addition, you will get different data according to the type of the workout:

SMExerciseDynamicInfo
  • numberOfPerformedReps - the number of performed reps
  • repsTechniqueScore - the exercise score
  • performedReps - an array of RepData

SMExerciseStaticInfo
  • timeInActiveZone - the time the user was in position
  • positionTechniqueScore - the exercise score

Stop session:
When you stop session detection, you will get all of the session's detected data packed in a struct of type DetectionSessionResultData, which is the summary of the entire session:

  • exercises - all SMExerciseInfo of the session
  • startDate - the start date of the session
  • endData - the end date of the session
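Consuming the session summary might look like the sketch below. Field names follow the lists above (including `endData` as documented), but the concrete Swift types and the dynamic/static subtype casts are assumptions:

```swift
// Hypothetical results handler — types and casts are assumptions.
func handleResults(_ session: DetectionSessionResultData) {
    print("Session ran from \(session.startDate) to \(session.endData)")
    for exercise in session.exercises {
        print("Exercise \(exercise.sessionId): \(exercise.totalTime)s")
        if let dynamicInfo = exercise as? SMExerciseDynamicInfo {
            print("Reps: \(dynamicInfo.numberOfPerformedReps), score: \(dynamicInfo.repsTechniqueScore)")
        } else if let staticInfo = exercise as? SMExerciseStaticInfo {
            print("Time in active zone: \(staticInfo.timeInActiveZone)s, score: \(staticInfo.positionTechniqueScore)")
        }
    }
}
```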


Calibration observers

Device motion observer:
Body positioning observer:
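These observers report calibration problems such as a shaky device or a user standing out of frame. A sketch of registering them is below; the observer method names and callback shapes are entirely assumptions, so treat this only as an illustration of the pattern:

```swift
// Hypothetical observer registration — names and signatures are assumptions.
flowManager?.observeDeviceMotion { isStable in
    // Warn the user if the device is moving and tracking may be unreliable.
}
flowManager?.observeBodyPositioning { isInFrame in
    // Prompt the user to step back until their whole body is visible.
}
```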

>> Access to the full Sency Documentation here.

Ella Binder

Head of Marketing
