Title: Computer vision-based accurate identification system for damaged parts in athletes' high-intensity sports injury images

Authors: Guoyang Huang

Addresses: College of Physical Education and Health, Guilin Institute of Information Technology, Guilin 541004, Guangxi, China

Abstract: Computer vision (CV), an important branch of artificial intelligence (AI), aims to understand the content of images, and its applications continue to expand with the development of society. Because different parts of an athlete's body sustain varying degrees of damage during high-intensity exercise, image recognition and analysis must be carried out during treatment; however, existing techniques identify and process such injuries with very low accuracy and efficiency. To address this problem, this paper proposed a high-intensity sports-injury image recognition method based on the fish swarm algorithm and applied it to greyscale conversion and injury identification. The experiment compared particle swarm optimisation (PSO), the genetic algorithm (GA) and the proposed algorithm in terms of recognition rate and recognition time. According to the experimental data, when the number of recognised images was 50 and the number of experiments was 50, the recognition rates of PSO, GA and the proposed algorithm were 64.33%, 66.86% and 94.57% respectively; when the number of recognised images was 35, the recognition times were 0.768 s, 0.807 s and 0.532 s respectively. These results show that the proposed method performed well in both recognition rate and recognition time, and the system designed in this paper therefore merits further promotion and application.
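The abstract does not detail how the fish swarm algorithm is configured for injury recognition, so the following is only an illustrative sketch of the generic artificial fish swarm optimiser (prey behaviour on a one-dimensional search space), not the authors' method; the function name `afsa_maximise` and all parameter values are assumptions for demonstration.

```python
import random

def afsa_maximise(f, lo, hi, n_fish=30, visual=0.5, step=0.2,
                  try_number=5, iters=100, seed=0):
    """Minimal artificial fish swarm optimiser (prey behaviour only)
    maximising a 1-D objective f over the interval [lo, hi]."""
    rng = random.Random(seed)
    fish = [rng.uniform(lo, hi) for _ in range(n_fish)]
    best = max(fish, key=f)
    for _ in range(iters):
        for i, x in enumerate(fish):
            moved = False
            # Prey behaviour: sample points within the visual range and
            # step towards the first one that improves the objective.
            for _ in range(try_number):
                y = min(max(x + rng.uniform(-visual, visual), lo), hi)
                if f(y) > f(x):
                    direction = 1.0 if y > x else -1.0
                    x = min(max(x + step * direction * rng.random(), lo), hi)
                    moved = True
                    break
            if not moved:
                # No better point seen: take a small random move.
                x = min(max(x + rng.uniform(-step, step), lo), hi)
            fish[i] = x
            if f(x) > f(best):
                best = x
    return best

# Toy objective with a single maximum at x = 2; in the paper's setting the
# objective would instead score candidate injury-region segmentations.
peak = afsa_maximise(lambda x: -(x - 2.0) ** 2, 0.0, 5.0)
```

In a full system, the full algorithm also includes swarming and following behaviours, and the objective would be defined over the greyscale-converted image rather than a toy function.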

Keywords: sports injury; computer vision; CV; sports athletes; accurate identification of image damaged parts; fish swarm algorithm; high-intensity sports.

DOI: 10.1504/IJCSYSE.2026.151348

International Journal of Computational Systems Engineering, 2026 Vol.10 No.1/2/3/4, pp.217 - 225

Received: 27 Oct 2023
Accepted: 30 Nov 2023

Published online: 26 Jan 2026
