Characteristic point recognition method based on neural network

A neural network recognition technology, applicable to neural learning methods, biological neural network models, and the input/output processing of data. It addresses the low accuracy and efficiency of existing positioning methods, and achieves accurate recognition, improved calculation speed, and a simple positioning process.

Pending Publication Date: 2017-05-31
Owner: VR TECH HLDG LTD


Problems solved by technology

[0003] In order to solve the defect that the accuracy and efficiency of determining the projection ID (Identity, serial number) are not high in current virtual reality space positioning methods, the present invention provides a neural-network-based feature point recognition method that improves both.




Embodiment Construction

[0025] In order to solve the defect that the accuracy and efficiency of determining the projection ID in the current virtual reality space positioning method are not high, the present invention provides a feature point recognition method based on a neural network that can improve the accuracy and efficiency of the projection ID.

[0026] In order to have a clearer understanding of the technical features, purposes and effects of the present invention, the specific implementation manners of the present invention will now be described in detail with reference to the accompanying drawings.

[0027] See figure 1 and figure 2. The system used by the neural-network-based feature point recognition method of the present invention includes a virtual reality helmet 10, an infrared camera 20 and a processing unit 30, with the infrared camera 20 electrically connected to the processing unit 30. The virtual reality helmet 10 includes a front panel 11, and a plurality of infrared point light sources 13 are disposed on the front panel 11.



Abstract

The invention provides a characteristic point recognition method based on a neural network. The method comprises the following steps: training the neural network with pre-processed pictures; keeping the infrared point light sources of a virtual reality helmet switched on while an infrared camera takes photographs; pre-processing each captured image; and inputting the pre-processed image into the neural network to obtain the ID (Identity) of the infrared point light source corresponding to each light spot. Compared with the prior art, the neural network algorithm is introduced into the virtual reality space positioning method, providing an accurate and efficient way of determining the ID of each light spot. Both training and testing images are pre-processed, so that the diversity of the pictures does not degrade recognition accuracy; by standardizing the diverse pictures, the success rate and accuracy of ID recognition are greatly improved.
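The steps in the abstract (standardize a captured light-spot image, then feed it to a trained network to obtain the spot's ID) can be sketched as follows. This is a minimal illustrative sketch, not the patent's actual implementation: the patch size, network shape, class names (`SpotIDNet`, `preprocess`) and the untrained random weights are all assumptions introduced here for illustration.

```python
import numpy as np

def preprocess(patch):
    # Standardization step from the abstract: resize/crop is assumed done upstream;
    # here we flatten the spot patch and normalize to zero mean, unit variance so
    # that variation in brightness and exposure does not affect recognition.
    p = patch.astype(np.float32).ravel()
    return (p - p.mean()) / (p.std() + 1e-8)

class SpotIDNet:
    # Hypothetical one-hidden-layer classifier mapping a light-spot patch
    # to the ID of the infrared point light source that produced it.
    def __init__(self, n_in, n_hidden, n_ids, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_ids))
        self.b2 = np.zeros(n_ids)

    def forward(self, x):
        h = np.maximum(0.0, x @ self.W1 + self.b1)   # ReLU hidden layer
        logits = h @ self.W2 + self.b2
        e = np.exp(logits - logits.max())
        return e / e.sum()                            # softmax over candidate IDs

    def predict_id(self, patch):
        # Inference step: pre-process the captured spot, return the most likely ID.
        return int(np.argmax(self.forward(preprocess(patch))))

# Usage: classify one synthetic 8x8 infrared spot patch into one of 16 source IDs.
net = SpotIDNet(n_in=64, n_hidden=32, n_ids=16)
patch = np.zeros((8, 8))
patch[3:5, 3:5] = 255.0   # synthetic bright spot
spot_id = net.predict_id(patch)
```

In practice the network weights would come from the training step described in the abstract (supervised training on pre-processed pictures labeled with the true light-source IDs), not from random initialization as in this sketch.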


