A new method for the optical flow estimation and segmentation of moving objects 'NMES'
by Amina Ourchani; Zine-Eddine Baarir; Abdelmalik Taleb-Ahmed
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 17, No. 1/2, 2018

Abstract: Segmentation of moving objects in a video sequence is an essential task in computer vision. This paper develops a new method for discriminating moving objects from a static background, based on a combination of motion, colour and texture features. First, we use block-matching to compute the optical flow, and we take the result of frame differencing into consideration to improve the quality of the flow. We then apply the k-means clustering algorithm to group pixels with similar features. Second, the clustering result is used as input to the Chan-Vese model, in order to attract the evolving contour towards the boundaries of the moving objects. To evaluate the performance of the proposed method, we test it on challenging sequences; the experiments show that our method provides improved segmentation results.
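
To make the pipeline in the abstract concrete, here is a minimal, hypothetical Python sketch. It is not the authors' implementation: Farneback dense optical flow stands in for their block-matching estimator, scikit-image's Chan-Vese stands in for their level-set model, the texture features are omitted, and all parameter values are illustrative.

```python
# Hypothetical sketch of the abstract's pipeline: optical flow + frame
# difference -> per-pixel features -> k-means clustering -> Chan-Vese
# contour evolution initialised from the cluster map.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from skimage.segmentation import chan_vese

def segment_moving_objects(prev_bgr, curr_bgr, k=2):
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)

    # Dense optical flow (a stand-in for the paper's block-matching step),
    # plus the frame difference the paper uses to refine the flow estimate.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    diff = cv2.absdiff(curr_gray, prev_gray).astype(np.float32)

    # Per-pixel feature vector: motion (u, v), frame difference, colour.
    # (The paper also uses texture features, omitted in this sketch.)
    h, w = curr_gray.shape
    feats = np.dstack([flow[..., 0], flow[..., 1], diff,
                       curr_bgr.astype(np.float32)]).reshape(h * w, -1)

    # k-means groups pixels with similar features.
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(feats)

    # Pick the "moving" cluster as the one with the largest mean flow
    # magnitude (an assumed heuristic, not specified by the paper).
    mag = np.linalg.norm(flow, axis=2).reshape(-1)
    moving = max(range(k), key=lambda c: mag[labels == c].mean())

    # The cluster map initialises the Chan-Vese evolving contour.
    init = np.where(labels.reshape(h, w) == moving, 1.0, -1.0)
    return chan_vese(curr_gray.astype(float) / 255.0, init_level_set=init)
```

Initialising the level set from the clustering output, rather than from a generic checkerboard, is what lets the contour start near the moving objects; the flow-magnitude heuristic for choosing the moving cluster is one plausible reading of the abstract, not a detail it states.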

Online publication date: Tue, 08-May-2018
