CAMSHIFT IMPROVEMENT WITH MEAN-SHIFT SEGMENTATION, REGION GROWING, AND SURF METHOD
The CAMSHIFT algorithm has been widely used in object tracking. CAMSHIFT uses
color features as the object model; consequently, the original CAMSHIFT may fail when the object color is
similar to the background color. In this study, we propose a CAMSHIFT tracker combined with
mean-shift segmentation, region growing, and SURF to improve tracking accuracy.
Mean-shift segmentation and region growing are applied in the object localization phase to extract
the important parts of the object. Hue-distance, saturation, and value are used to compute the
Bhattacharyya distance, which judges whether the tracked object has been lost. Once the object is judged lost,
SURF is used to find the lost object, and CAMSHIFT can then resume tracking it. The object tracking
system is built with OpenCV. Accuracy measurements were performed using frame-based
metrics on the BoBoT (Bonn Benchmark on Tracking) dataset. The results demonstrate that CAMSHIFT combined with mean-shift segmentation, region
growing, and the SURF method achieves higher accuracy than previous methods.
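The lost-object test above can be sketched as follows. This is a minimal illustration of comparing normalized hue-distance, saturation, and value histograms with the Bhattacharyya distance; the histogram layout and the decision threshold are assumptions for illustration, not values from the paper.

```python
import math

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two normalized histograms.

    Returns 0.0 for identical histograms and 1.0 for fully
    non-overlapping ones.
    """
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))  # Bhattacharyya coefficient
    return math.sqrt(max(0.0, 1.0 - bc))

def is_object_lost(model_hists, current_hists, threshold=0.6):
    """Judge the object lost when the mean Bhattacharyya distance over
    the hue-distance, saturation, and value histograms exceeds a
    threshold (0.6 here is an illustrative assumption)."""
    d = sum(bhattacharyya_distance(p, q)
            for p, q in zip(model_hists, current_hists)) / len(model_hists)
    return d > threshold
```

When `is_object_lost` returns true, the tracker would switch from CAMSHIFT to SURF-based matching to relocate the object before resuming tracking.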