Optimisation of computer virtual image reconstruction method based on feature point matching
by Chiyu Pan
International Journal of Information and Communication Technology (IJICT), Vol. 20, No. 3, 2022

Abstract: Traditional computer virtual image reconstruction methods suffer from slow reconstruction speed and insufficient image clarity. To address these problems, an optimisation method for computer virtual image reconstruction based on feature point matching is proposed. The SURF operator is used to extract image feature points, which are then matched under the TZNCC constraint. The virtual image is reconstructed by a sparse method: the high-resolution depth image in virtual vision is represented as a sparse linear combination of dictionary atoms. The sparse coefficients are solved with the alternating direction method of multipliers, transforming virtual image reconstruction into a sparse-signal recovery problem and thereby yielding a better reconstruction effect. Experimental results show that the proposed method achieves high reconstruction speed and image clarity.
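The sparse-coefficient step described in the abstract can be illustrated with a minimal sketch. The following is not the paper's implementation: it assumes a known, fixed dictionary `D` (the paper's dictionary construction and TZNCC matching are not reproduced here) and solves the standard l1-regularised least-squares (lasso) form of sparse coding with a textbook ADMM loop, which matches the abstract's description of recovering a sparse signal via the alternating direction multiplier algorithm. All variable names and parameter values are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_sparse_code(D, y, lam=0.05, rho=1.0, iters=300):
    """Solve min_x 0.5*||D x - y||^2 + lam*||x||_1 by ADMM.

    D: (n, k) dictionary matrix; y: (n,) observed signal.
    Returns the sparse coefficient vector (the z iterate, exactly sparse).
    """
    n, k = D.shape
    # Factor the x-update system (D^T D + rho I) once, outside the loop.
    L = np.linalg.cholesky(D.T @ D + rho * np.eye(k))
    Dty = D.T @ y
    x = np.zeros(k)
    z = np.zeros(k)
    u = np.zeros(k)
    for _ in range(iters):
        # x-update: quadratic subproblem via the cached Cholesky factor.
        x = np.linalg.solve(L.T, np.linalg.solve(L, Dty + rho * (z - u)))
        # z-update: proximal (soft-threshold) step enforcing sparsity.
        z = soft_threshold(x + u, lam / rho)
        # u-update: dual ascent on the consensus constraint x = z.
        u = u + x - z
    return z

# Demo on synthetic data: recover a 3-sparse code from a random dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((60, 20))
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]
y = D @ x_true
x_hat = admm_sparse_code(D, y)
```

With a well-conditioned random dictionary and a small regularisation weight, the recovered coefficients closely match the ground-truth sparse signal; in the paper's setting, `y` would be a degraded depth-image patch and `D` the learned dictionary whose sparse combination represents the high-resolution patch.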

Online publication date: Thu, 07-Apr-2022
