Analyze walking, running, and functional tests—instantly.
Assess hand function and range of motion with a video.
Advanced head and trunk kinematics using dual AI models.
Validated PT tests—sit-to-stand, balance, and more.
Visual, easy-to-understand results for you and your clients.
Test the platform with pre-recorded demo videos—no upload needed.
BR Butler, B Gholami, BZW Low, Q Mei, D Hollinger, Z Altai, DW Evans, et al. Innovative machine learning approach for automatic detection of ACL reconstruction history using knee kinematic features.
Y Li, J He, B Liew, DS Hollinger, Q Mei, B Gholami, M Fasli, et al. Advanced self-supervised learning techniques improving joint moment estimation accuracy and reducing data requirements for gait analysis.
Ongoing research developing infrastructure and protocols for real-time motion analysis in telehealth applications using 5G networks, machine learning, and computer vision technologies for clinical assessment.
Published Papers
Citations
Collaborating Institutions
Research Funding
World-class researchers and engineers pushing the boundaries of healthcare technology
Process multiple videos at once with our desktop application
Desktop application for bulk video processing • Simple drag-and-drop • Automatic CSV export
Process dozens of videos in minutes, not hours
No complex setup - just drag your videos and go
Automatic export ready for Excel or statistical analysis
Download and run pose_extractor.exe - no installation needed. The application will open instantly.
Click "Select Videos" or drag and drop your video files directly into the application window. Select multiple files at once.
Select where you want to save the results. CSV files will be automatically created - one for each video analyzed.
That's it! Click "Start Analysis" and watch the progress as each video is processed.
No! EZBatch processes everything locally on your computer. Your videos never leave your device, ensuring complete privacy and GDPR compliance.
There's no hard limit! You can select as many videos as you want. Processing time depends on your computer's performance and video file sizes. Each video is processed sequentially.
Each CSV contains: frame number, timestamp, joint angles (hip, knee, ankle, elbow, shoulder), keypoint coordinates (x, y), and confidence scores for each detection. Perfect for biomechanical analysis!
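As a rough illustration of working with an export like this, here is a minimal Python sketch that filters rows by confidence before analysis. The column names (`frame`, `timestamp`, `knee_angle`, `knee_confidence`) are assumptions for illustration, not the exact EZBatch schema.

```python
import csv
import io

# Illustrative EZBatch-style export; column names are assumptions,
# not the application's exact schema.
sample = """frame,timestamp,knee_angle,knee_confidence
0,0.000,12.5,0.97
1,0.033,13.1,0.95
2,0.066,45.0,0.41
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Drop low-confidence detections before computing any summary statistics
reliable = [r for r in rows if float(r["knee_confidence"]) >= 0.5]
angles = [float(r["knee_angle"]) for r in reliable]

print(f"{len(reliable)} reliable frames, peak angle {max(angles)} deg")
```

The same pattern scales to a real export opened with `csv.DictReader(open(path))` or a spreadsheet tool.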
No internet required! Once downloaded, EZBatch works completely offline. Process videos anywhere - in the lab, clinic, or field.
Secure • Privacy-first • Free forever
Share your experience with the EZCap platform
Be the first to leave a review!
Answers to practical and technical questions about using EZCap in clinic, research, and remote settings.
Screening and trend tracking with consistent protocols.
Single-view capture is sensitive to angle and occlusion.
Decision support, not a standalone diagnostic replacement.
Interpret outputs alongside confidence and protocol consistency.
It estimates body keypoints from standard video without reflective markers or multi-camera lab hardware. EZCap uses AI models to convert that video into biomechanical features.
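To make the keypoints-to-features step concrete, here is a minimal sketch of how a joint angle can be derived from three 2D keypoints (hip, knee, ankle). This is a generic geometric illustration, not EZCap's internal implementation, and the coordinates are made up.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by 2D keypoints a-b-c,
    e.g. hip-knee-ankle for knee flexion/extension."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from knee to hip
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from knee to ankle
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical pixel coordinates for a nearly straight leg
hip, knee, ankle = (100, 100), (105, 200), (108, 300)
print(round(joint_angle(hip, knee, ankle), 1))  # close to 180 deg
```

Repeating this per video frame yields the joint-angle time series that downstream biomechanical features are built from.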
No. EZCap improves accessibility and scale for many use cases, but high-precision volumetric biomechanics still favors dedicated multi-camera laboratory setups.
Repeated functional assessment, remote follow-up, and trend monitoring over time when setup is standardized between sessions.
Accuracy varies with camera angle, movement plane, lighting, clothing, and occlusion. In-plane movements are generally more robust than out-of-plane rotations.
Small setup changes in distance, height, framing, speed, and subject orientation can change keypoint estimates. Use the same protocol each session.
In most practical workflows, trends over time are more reliable than one-off absolute values, unless setup conditions are tightly controlled.
Use a stable mount, keep full body regions of interest visible, align camera to the main movement plane, and avoid hand-held recording.
A lot. Even lighting and fitted clothing usually improve tracking confidence. Backlighting, shadows, and loose garments can degrade measurements.
Yes. Canes, walkers, braces, or limb overlap may obscure landmarks and lower confidence. Document these conditions and compare like-for-like sessions.
Typically 2 to 3 clean trials per task. If variability is high, collect more and use confidence-based filtering before interpretation.
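The confidence-based filtering mentioned above can be sketched like this; the trial names, confidence scores, and threshold are all illustrative assumptions.

```python
# Hypothetical mean keypoint confidence per recorded trial
trial_confidence = {"trial_1": 0.91, "trial_2": 0.58, "trial_3": 0.88}

THRESHOLD = 0.7  # illustrative cutoff; tune to your own data and task

# Keep only clean trials before computing session-level metrics
clean = {name: c for name, c in trial_confidence.items() if c >= THRESHOLD}

if len(clean) < 2:
    print("High variability or low confidence: collect additional trials")
else:
    print(f"Using {len(clean)} clean trials: {sorted(clean)}")
```

Applying the same threshold across sessions keeps trend comparisons like-for-like.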
Yes. Remote use is a key strength, but data quality depends on clear participant setup guidance and repeatable test conditions.
No. It is a decision-support tool and should be interpreted alongside history, examination, and established clinical measures.
Follow your organization's policies for consent, retention, storage location, access controls, and compliance workflows. Governance is deployment-specific.
Personalized treatment recommendations powered by the deployed R Shiny app
This app was developed from the CASINO (Surgical or Nonsurgical Treatment for Cervical Radiculopathy) study, which investigated outcomes of surgical versus conservative treatment in patients with cervical radiculopathy (paper link). In collaboration with Yunlong Liang, University of Essex Institute of Social & Economic Research (ISER), we created personalized treatment effect models using baseline demographic, clinical, and MRI characteristics.
The core protocol was a multicentre patient-randomised trial with follow-up over two years, comparing surgery with prolonged conservative care and focusing on VAS arm pain, neck pain, and Neck Disability Index outcomes.
The models estimate whether an individual patient is more likely to benefit from surgery or conservative care, generating patient-specific benefit scores for neck-related disability, neck pain intensity, and arm pain intensity at 52 weeks and 104 weeks.
The CAS Recommender is optimized for larger screens. For the best experience, open it in a full browser tab or visit on a desktop or laptop.
Open CAS Recommender
We're revolutionizing healthcare through cutting-edge motion analysis technology, making advanced biomechanical assessment accessible to clinicians worldwide.
Funded by Innovate UK, our platform combines state-of-the-art computer vision, machine learning, and clinical expertise to deliver precise, real-time motion analysis. From gait assessment to hand function evaluation, we provide healthcare professionals with the tools they need to make better clinical decisions.
1.71 billion people worldwide affected by musculoskeletal conditions (WHO, 2023)
Leading cause of disability globally - 149 million DALYs lost annually (WHO)
Accessible, validated technology enabling early detection and remote monitoring of movement disorders
Join leading healthcare institutions using our motion analysis platform
Get in touch with our team - we'd love to hear from you
University of Essex
Wivenhoe Park
Colchester, CO4 3SQ
United Kingdom
+44 (0)1206 873333