Robot repositioning method, device and equipment

A robot relocation technology, applied in the computer field. It addresses problems such as task interruption, time-consuming and labor-intensive manual repositioning, and disruption of the robot's normal task execution, and achieves simple and accurate relocation.

Pending Publication Date: 2020-12-18
DJANGO ROBOTICS SHANGHAI CO LTD
Cites: 0 · Cited by: 3

AI Technical Summary

Benefits of technology

The described system pre-builds a visual map recording the position of each feature point in the workspace where the robot operates. When the robot needs to be repositioned, it collects a nearby environment image, matches the image's feature points against the visual map, and computes its current position from the matched points, allowing accurate relocation without manual readjustment.

Problems solved by technology

The technical problem addressed is that when a robot loses track of its position (for example after being moved or restarted), its task is interrupted, and manually repositioning it is time-consuming and labor-intensive, affecting the normal execution of the robot's tasks.

Method used



Examples


Example 1

[0067] Example 1. A rangefinder configured on the mapping robot can measure the distance between the robot and feature points of the photographed object, such as the feature points of a captured two-dimensional code or of the photographed ceiling. Based on this distance and the camera's viewing angle, the relative position between the robot and each feature point is determined. Then, from the robot's current position during the collection operation (such as its three-dimensional coordinates in a three-dimensional initial map, or its two-dimensional coordinates in a two-dimensional initial map) and those relative positions, the position of each feature point in the initial map is calculated.
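The geometry described above can be sketched in the two-dimensional case as follows. This is a minimal illustration, not the patent's implementation; the function name `feature_position` and the pose convention `(x, y, theta)` are assumptions for the example.

```python
import math

def feature_position(robot_pose, distance, bearing):
    """Estimate a feature point's position in the initial map.

    robot_pose: the robot's (x, y, theta) in the 2-D initial map,
                with theta (heading) in radians.
    distance:   range to the feature measured by the rangefinder.
    bearing:    angle to the feature relative to the robot's heading,
                derived from the feature's pixel position and the
                camera's viewing angle.
    """
    x, y, theta = robot_pose
    # Project the measured range along the absolute direction to the feature.
    fx = x + distance * math.cos(theta + bearing)
    fy = y + distance * math.sin(theta + bearing)
    return (fx, fy)

# Robot at the map origin facing +x; a feature seen 2 m straight ahead
# lands at (2.0, 0.0) in the map.
print(feature_position((0.0, 0.0, 0.0), 2.0, 0.0))
```

The same projection extends to three dimensions by adding an elevation angle for ceiling features.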

[0068] Example 2: Extract the feature points in the environmental image collected in step 404, and parse the location distribution data between the feature points.

Example 2

[0089] Example 2. When the robot needs to be repositioned, a QR-code image is collected first, because relocation can be completed quickly through QR-code identification; if there is no QR code nearby, a ceiling image is collected instead.

[0090] Example 3: When the target environment image is collected, first identify whether it is a two-dimensional code image. If so, trigger the step of identifying the two-dimensional code's identification information; otherwise, trigger the step of feature matching between the first feature point set in the target environment image and the pre-established visual map, together with the subsequent steps.

[0091] In other words, in this example the two-dimensional code image is used preferentially, and other types of environmental images have a lower priority than the two-dimensional code image.
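The priority scheme of Examples 2 and 3 amounts to a simple dispatch: try the QR-code branch, and fall back to feature matching otherwise. The sketch below illustrates only that control flow; the callback names (`decode_qr`, `lookup_qr_pose`, `match_features`) are hypothetical stand-ins, not APIs from the patent.

```python
def relocalize(image, decode_qr, lookup_qr_pose, match_features):
    """Prefer QR-code relocation; fall back to feature matching.

    decode_qr(image)      -> QR identifier string, or None if no QR code.
    lookup_qr_pose(qr_id) -> position recorded for that QR code in the map.
    match_features(image) -> position estimated by matching the image's
                             feature points against the visual map.
    """
    qr_id = decode_qr(image)
    if qr_id is not None:
        # QR branch: an identity lookup is fast and unambiguous.
        return lookup_qr_pose(qr_id)
    # No QR code in view: fall back to feature-point matching.
    return match_features(image)
```

In practice the first branch corresponds to collecting a QR-code image and the fallback to collecting a ceiling image, as paragraph [0089] describes.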

[0092] Example 4. When an image of the target environment is collected, th



Abstract

An embodiment of the invention provides a robot repositioning method, device and equipment. The method comprises the following steps: a visual map is pre-constructed to record the position of each feature point in the working space where the robot is located. When the robot needs to be repositioned, a nearby target environment image is collected and a first feature point set is extracted from it; the first feature point set is matched with the visual map to obtain a matched second feature point set and its positions; and the current position of the robot is calculated on the basis of the second feature point set and its positions. Simple and accurate repositioning can thereby be achieved.
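The matching-and-positioning step can be sketched as follows, assuming each observed feature carries a descriptor and an offset relative to the robot. Brute-force nearest-descriptor matching and averaging are simplifications chosen for the example; the function name `estimate_pose` and the data layout are assumptions, not the patent's specified algorithm.

```python
import numpy as np

def estimate_pose(observed, visual_map):
    """Match observed features to the visual map and estimate the robot's position.

    observed:   list of (descriptor, relative_offset) pairs, where
                relative_offset is the feature's (dx, dy) as seen from the robot.
    visual_map: list of (descriptor, map_position) pairs built in advance.
    """
    map_desc = np.array([d for d, _ in visual_map])
    map_pos = np.array([p for _, p in visual_map])
    estimates = []
    for desc, offset in observed:
        # Second feature point set: the nearest map descriptor per observation.
        idx = np.argmin(np.linalg.norm(map_desc - np.asarray(desc), axis=1))
        # The robot sits at the matched map position minus the relative offset.
        estimates.append(map_pos[idx] - np.asarray(offset))
    # Average the per-feature estimates into one position.
    return np.mean(estimates, axis=0)
```

A real system would use a robust matcher and outlier rejection rather than a plain mean, but the data flow (extract, match against the map, compute position from the matched set) follows the abstract.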

Description


Claims


Application Information

Owner: DJANGO ROBOTICS SHANGHAI CO LTD