Feature binding pulse-coupled neural network model using a double colour space
by Hongxia Deng; Han Li; Sha Chang; Jie Xu; Haifang Li
International Journal of Computational Science and Engineering (IJCSE), Vol. 16, No. 2, 2018

Abstract: The feature binding problem is one of the central issues in cognitive science and neuroscience. To achieve a bound identification of colour and shape in a colour image, a double-space vector feature binding PCNN (DVFB-PCNN) model was proposed based on the traditional pulse-coupled neural network (PCNN). In this model, combining the RGB colour space with the HSI colour space solves the problem that not all colours can be separated completely in a single colour space. The first pulse emission time of the neurons is used to separate the different features, and the colour sequence produced in this process binds together the features that belong to the same perceived object. Experiments showed that the model successfully separates and binds image features, making it a valuable tool for applying PCNN to the feature binding of colour images.

Online publication date: Mon, 19-Mar-2018
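
As a rough sketch of the firing-time idea described in the abstract (a minimal illustration, not the authors' DVFB-PCNN), the Python snippet below runs a simplified pulse-coupled neural network over a single-channel image and records each neuron's first pulse emission time; pixels with similar intensity tend to fire at the same iteration, which is the property the separation and binding steps rely on. The function name first_firing_times, all parameter values (alpha_F, alpha_L, alpha_T, beta, V_F, V_L, V_T) and the 3x3 linking kernel are illustrative assumptions, not values taken from the paper.

import numpy as np
from scipy.ndimage import convolve

def first_firing_times(channel, n_iter=30,
                       alpha_F=0.1, alpha_L=1.0, alpha_T=0.5,
                       beta=0.2, V_F=0.5, V_L=0.2, V_T=20.0):
    """Minimal PCNN sketch: return the iteration at which each neuron first fires.

    channel is a 2-D array normalised to [0, 1]; every constant here is an
    illustrative assumption, not a value from the DVFB-PCNN paper.
    """
    S = channel.astype(float)
    F = np.zeros_like(S)                 # feeding input
    L = np.zeros_like(S)                 # linking input
    T = np.ones_like(S)                  # dynamic threshold
    Y = np.zeros_like(S)                 # pulse output of the previous step
    first_fire = np.full(S.shape, -1, dtype=int)   # -1 means "never fired"

    W = np.array([[0.5, 1.0, 0.5],       # assumed 3x3 linking kernel
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])

    for n in range(1, n_iter + 1):
        link = convolve(Y, W, mode='constant')
        F = np.exp(-alpha_F) * F + V_F * link + S     # feeding channel
        L = np.exp(-alpha_L) * L + V_L * link         # linking channel
        U = F * (1.0 + beta * L)                      # internal activity
        Y = (U > T).astype(float)                     # pulse where activity exceeds threshold
        T = np.exp(-alpha_T) * T + V_T * Y            # raise threshold of neurons that fired
        newly_fired = (Y > 0) & (first_fire < 0)
        first_fire[newly_fired] = n                   # record first pulse emission time
    return first_fire

# Example: compute first firing times for a stand-in intensity (I) channel;
# in the paper's setting, the RGB and HSI channels would then supply the
# colour labels that are bound to each group of co-firing neurons.
img = np.random.rand(64, 64)
times = first_firing_times(img)

In such a sketch, neurons that share a first-firing iteration form one separated feature, and reading their colour back from the combined RGB/HSI representation would yield the colour sequence used for binding.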
