Object recognition and positioning method and device and terminal equipment

A technique for object recognition and positioning, applied to three-dimensional object recognition, character and pattern recognition, and image data processing, achieving improved efficiency, good robustness, and improved accuracy.

Active Publication Date: 2020-05-19
SHENZHEN YUEJIANG TECH CO LTD


Problems solved by technology

[0004] In view of this, the embodiments of the present invention provide an object recognition and positioning method, device, and terminal equipment, to solve the problem in the prior art of how to improve the efficiency and accuracy of 3D object recognition and positioning.


Examples


Embodiment 1

[0032] Figure 1 shows a schematic flowchart of the first object recognition and positioning method provided by an embodiment of the present application; the details are as follows:

[0033] In S101, acquire the two-dimensional image and point cloud data of the area to be measured.

[0034] The area to be measured is an area containing one or more objects to be identified; multiple objects to be identified may exist in this area at the same time. For ease of description, an object to be identified is hereinafter referred to as a target object. The two-dimensional image of the area to be measured can be collected by a depth camera capable of capturing two-dimensional images (such as an RGBD camera), or acquired by an ordinary camera; the two-dimensional image contains the two-dimensional shape information of the objects in the area to be measured. The point cloud data of the area to be measured can be directly collected by the depth camera, or can be converted ...
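The conversion from a depth map to point cloud data mentioned above can be sketched as a pinhole back-projection. This is a minimal illustration, not the patent's implementation; the intrinsics `fx`, `fy`, `cx`, `cy` and the function name are assumed for the example:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) to an N x 3 point cloud
    using the pinhole camera model: x = (u - cx) * z / fx, etc."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy 2x2 depth map with one invalid pixel, and assumed intrinsics.
depth = np.array([[1.0, 1.0],
                  [0.0, 2.0]])
pts = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
print(pts.shape)  # (3, 3): the zero-depth pixel was dropped
```

In practice the intrinsics come from camera calibration, and libraries such as Open3D provide equivalent RGBD-to-point-cloud conversion routines.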

Embodiment 2

[0071] Figure 2 shows a schematic flowchart of the second object recognition and positioning method provided by an embodiment of the present application; the details are as follows:

[0072] In S201, acquire the two-dimensional image and point cloud data of the area to be measured.

[0073] S201 in this embodiment is the same as S101 in the previous embodiment. For details, please refer to the relevant description of S101 in the previous embodiment, and details are not repeated here.

[0074] In S202, the two-dimensional image is detected by using a pre-trained deep learning model, and the two-dimensional target area and geometric shape type corresponding to the target object in the two-dimensional image are identified.

[0075] S202 in this embodiment is the same as S102 in the previous embodiment. For details, please refer to the relevant description of S102 in the previous embodiment, and details are not repeated here.

[0076] In S203, the two-dimensional target ...
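When the point cloud is pixel-aligned with the two-dimensional image (one 3D point per pixel, as an organized point cloud from a depth camera typically is), mapping a 2D target area to the point cloud reduces to an array crop. A minimal sketch under that assumption, with hypothetical names:

```python
import numpy as np

def crop_point_cloud(points_hw3, bbox):
    """Select the 3D points whose pixels fall inside a 2D detection box.

    points_hw3: organized point cloud of shape (H, W, 3), pixel-aligned
                with the 2D image (one 3D point per pixel).
    bbox: (x_min, y_min, x_max, y_max) in pixel coordinates.
    Returns a rough first three-dimensional area as an N x 3 array,
    keeping only points with valid (positive) depth.
    """
    x0, y0, x1, y1 = bbox
    region = points_hw3[y0:y1, x0:x1].reshape(-1, 3)
    return region[region[:, 2] > 0]

# Toy 4x4 organized cloud with a 2x2 patch of valid depth.
pc = np.zeros((4, 4, 3))
pc[1:3, 1:3, 2] = 1.0
roi = crop_point_cloud(pc, (1, 1, 3, 3))
print(roi.shape)  # (4, 3)
```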

Embodiment 3

[0106] Figure 3 shows a schematic structural diagram of an object recognition and positioning device provided by an embodiment of the present application. For convenience of description, only the parts related to this embodiment are shown:

[0107] The object recognition and positioning device includes: a first acquisition unit 31, an identification unit 32, a rough segmentation unit 33, and a positioning unit 34, wherein:

[0108] The first acquisition unit 31 is configured to acquire the two-dimensional image and point cloud data of the region to be measured.

[0109] Optionally, the first acquisition unit includes a first acquisition module and a point cloud data generation module:

[0110] The first acquisition module is used to acquire a color map and a depth map of the area to be measured, wherein the color map is the two-dimensional image of the area to be measured;

[0111] A point cloud data generating module, configured to generate po...



Abstract

The invention is suitable for the technical field of machine vision, and provides an object recognition and positioning method and device and terminal equipment. The method comprises the steps: obtaining a two-dimensional image and point cloud data of a to-be-detected region; detecting the two-dimensional image through a pre-trained deep learning model, and identifying a two-dimensional target area and a geometrical shape type corresponding to a target object in the two-dimensional image; mapping the two-dimensional target area to the point cloud data, and determining a first three-dimensional area of the target object according to a mapping result; and according to the geometrical shape type and the first three-dimensional area, determining a second three-dimensional area of the target object and positioning the target object. According to the embodiment of the invention, the 3D object recognition and positioning efficiency and accuracy can be improved.
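The four steps of the abstract can be sketched as a single pipeline. This is only an illustrative skeleton: the detector and the shape-specific refinement below are hypothetical placeholders standing in for the pre-trained deep learning model and the geometry-type-aware refinement the patent describes.

```python
import numpy as np

def detect_2d(image):
    # Placeholder: a real system runs a pre-trained detector here and
    # returns the 2D target area plus a geometric shape type.
    return {"bbox": (10, 10, 50, 50), "shape_type": "cylinder"}

def map_bbox_to_points(points_hw3, bbox):
    # Step 3: map the 2D target area to the pixel-aligned point cloud
    # to obtain the first (rough) three-dimensional area.
    x0, y0, x1, y1 = bbox
    region = points_hw3[y0:y1, x0:x1].reshape(-1, 3)
    return region[region[:, 2] > 0]

def refine_by_shape(points, shape_type):
    # Placeholder for step 4: keep points within two standard deviations
    # of the centroid distance; a real system would fit the named
    # geometric primitive (e.g. a cylinder) to get the second 3D area.
    centroid = points.mean(axis=0)
    dist = np.linalg.norm(points - centroid, axis=1)
    return points[dist <= dist.mean() + 2 * dist.std()]

def locate(image, points_hw3):
    det = detect_2d(image)
    rough = map_bbox_to_points(points_hw3, det["bbox"])
    refined = refine_by_shape(rough, det["shape_type"])
    return refined.mean(axis=0)  # object position estimate

# Synthetic data: a flat patch of valid depth under the detected box.
img = np.zeros((100, 100, 3))
pc = np.zeros((100, 100, 3))
pc[10:50, 10:50] = [0.0, 0.0, 1.0]
pos = locate(img, pc)
print(pos)  # approximately [0, 0, 1]
```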

Description

Technical Field

[0001] The invention belongs to the technical field of machine vision, and in particular relates to an object recognition and positioning method, device, and terminal equipment.

Background Technique

[0002] In industrial production or robot applications, it is often necessary to recognize and position objects through machine vision, so as to facilitate subsequent grasping or other processing steps.

[0003] For existing three-dimensional (3 Dimensions, 3D) objects, a 3D model matching algorithm is usually used: according to a pre-built 3D model of the target object, model matching is performed on the object to be measured, and the target object is identified from the result. However, existing 3D model matching algorithms are less robust to occlusions and noisy backgrounds and are prone to mis-matching, resulting in low efficiency in 3D object recognition.

Contents of the Invention

[0004] In view of this, the embodimen...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06K9/00; G06T7/70
CPC: G06T7/70; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06V20/647
Inventor 刘培超徐培郎需林刘主福
Owner SHENZHEN YUEJIANG TECH CO LTD