Posture capturing method and device based on virtual reality

A technology of virtual reality and posture capture, applied in character and pattern recognition, data-processing input/output, and user/computer interaction input/output. It addresses the complex production process of virtual avatar materials, their lack of realism, and their low precision, achieving the effects of improving the gesture recognition rate and recognition accuracy, facilitating capture, and enhancing the sense of reality.

Pending Publication Date: 2020-08-28
广州市大湾区虚拟现实研究院

Problems solved by technology

[0005] The embodiment of the present invention provides a gesture capture method and device based on virtual reality, which solves the technical problems in the prior art that the production process of virtual avatar materials is complicated, the sense of reality is insufficient, and the precision is low.


Examples

Experimental program
Comparison scheme
Effect test

Example Embodiment

[0053] Example one

[0054] Figure 1 is a schematic flowchart of a virtual reality-based gesture capture method in an embodiment of the present invention. As shown in Figure 1, the embodiment of the present invention provides a virtual reality-based gesture capture method, which includes:

[0055] Step 110: Obtain first hand posture information.

[0056] Specifically, in order to realize gesture recognition, the first step is to realize finger tracking and recognition based on depth information, which is a key technology for realizing non-contact human-computer interaction. The body reconstruction in the embodiments of the present application is further divided into the reconstruction of the limbs and of the hands. For body reconstruction, infrared optical positioning technology is used: multiple infrared transmitters cover the indoor positioning space, and an optical tracker placed on the tracked user receives the infrared signals emitted by the transmitters.
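The depth-based finger tracking described above starts from segmenting a candidate hand region out of a depth frame. A minimal sketch of that first stage is shown below; the depth band limits (`near_mm`, `far_mm`) and the toy frame are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def segment_hand(depth_map, near_mm=300, far_mm=800):
    """Segment a candidate hand region from a depth map (values in mm)
    by keeping only pixels inside a near/far depth band."""
    return (depth_map >= near_mm) & (depth_map <= far_mm)

def bounding_box(mask):
    """Return (row_min, row_max, col_min, col_max) around the segmented
    region, or None when no pixel falls inside the band."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return int(rows.min()), int(rows.max()), int(cols.min()), int(cols.max())

# Toy depth frame: background at 2000 mm, a "hand" patch at 500 mm.
frame = np.full((8, 8), 2000.0)
frame[2:5, 3:6] = 500.0
mask = segment_hand(frame)
print(bounding_box(mask))  # (2, 4, 3, 5)
```

In a real system the band would follow the tracked user's distance rather than being fixed, but the thresholding idea is the same.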

Example Embodiment

[0070] Example two

[0071] Based on the same inventive concept as the virtual reality-based gesture capture method in the foregoing embodiment, the present invention also provides a virtual reality-based gesture capture device. As shown in Figure 2, the device includes:

[0072] A first obtaining unit 11, configured to obtain first hand posture information;

[0073] A first training unit 12, configured to input the first hand posture information into a training model, wherein the training model is obtained by training on multiple sets of training data, and each set of training data in the multiple sets includes: the first hand posture information and a depth image;

[0074] A second obtaining unit 13, configured to obtain output information of the training model, wherein the output information includes hand region contour information;

[0075] A third obtaining unit 14, configured to perform offline training and real-time recognition on the hand region contour information according to a random forest to obtain hand skeleton information.
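The random-forest step above (offline training followed by real-time recognition) can be sketched with a standard classifier. The per-pixel depth features, the two hand-part labels, and the scikit-learn usage below are illustrative assumptions, since the patent does not specify the exact feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Offline phase: fit a random forest mapping simple per-pixel depth
# features to hand-part labels (0 = palm, 1 = finger). The two feature
# clusters are synthetic stand-ins for real depth statistics.
rng = np.random.default_rng(0)
palm = rng.normal(loc=600.0, scale=10.0, size=(200, 2))    # farther from camera
finger = rng.normal(loc=450.0, scale=10.0, size=(200, 2))  # closer to camera
X = np.vstack([palm, finger])
y = np.array([0] * 200 + [1] * 200)

forest = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)

# Real-time phase: classify new per-pixel features frame by frame.
probe = np.array([[600.0, 600.0], [450.0, 450.0]])
labels = forest.predict(probe)
print(labels)  # [0 1]
```

Per-pixel labels of this kind can then be grouped into joint hypotheses, which is one common way to derive skeleton information from a classified hand region.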

Example Embodiment

[0094] Example three

[0095] Based on the same inventive concept as the virtual reality-based gesture capture method in the foregoing embodiment, the present invention also provides a virtual reality-based gesture capture device. As shown in Figure 3, it includes a memory 304, a processor 302, and a computer program stored on the memory 304 and runnable on the processor 302. When executing the program, the processor 302 implements the steps of any of the foregoing virtual reality-based posture capture methods.

[0096] In the bus architecture of Figure 3 (represented by the bus 300), the bus 300 may include any number of interconnected buses and bridges, linking together one or more processors, represented by the processor 302, and various memories, represented by the memory 304. The bus 300 may also link various other circuits, such as peripheral devices, voltage regulators, and power management circuits, which are well known in the art and are not described further herein.


Abstract

The invention provides a virtual reality-based posture capture method and device, relating to the technical field of artificial intelligence. The method comprises the steps of: obtaining first hand posture information and inputting it into a training model, wherein the training model is obtained by training on multiple sets of training data, and each set of training data in the multiple sets comprises the first hand posture information and a depth image; obtaining output information of the training model, the output information comprising hand region contour information; performing offline training and real-time recognition on the hand region contour information according to a random forest to obtain hand skeleton information; performing hand tracking on the hand skeleton information to obtain a continuous first hand motion sequence; and obtaining first gesture recognition information according to the first hand motion sequence and first gesture feedback information, thereby achieving the technical effects of conveniently capturing gesture posture data, improving the gesture recognition rate and recognition precision, and enhancing the sense of reality of the virtual avatar.
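The hand-tracking step in the abstract turns per-frame skeleton estimates into a continuous motion sequence. One way to sketch that is with exponential smoothing over joint positions; the smoothing approach and the `alpha` value are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def track_joints(frames, alpha=0.5):
    """Turn per-frame skeleton joint estimates (lists of [x, y] points)
    into a continuous motion sequence via exponential smoothing."""
    smoothed = []
    state = None
    for joints in frames:
        joints = np.asarray(joints, dtype=float)
        # Blend the new observation with the running state to suppress jitter.
        state = joints if state is None else alpha * joints + (1 - alpha) * state
        smoothed.append(state.copy())
    return smoothed

# Three frames of a single 2-D joint moving right.
frames = [[[0.0, 0.0]], [[2.0, 0.0]], [[4.0, 0.0]]]
seq = track_joints(frames)
print(seq[-1].tolist())  # [[2.5, 0.0]]
```

A production tracker would more likely use a Kalman filter or model-based fitting, but the goal is the same: a temporally continuous sequence rather than independent per-frame detections.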

