Pedestrian tracking method and device, computer equipment and storage medium

A pedestrian-tracking and robot technology, applied in the field of computer vision; it solves the problem that indoor robots lack an active real-time following function and achieves the effect of improving the robot's level of intelligence.

Pending Publication Date: 2022-05-10
AGRICULTURAL BANK OF CHINA


Problems solved by technology

[0004] The embodiments of the present invention provide a pedestrian tracking method, device, computer equipment and storage medium, which solve the problem that indoor navigation robots lack an active real-time following function.


Examples


Example Embodiment

[0025] Example 1

[0026] Figure 1 is a flowchart of a pedestrian tracking method provided in Embodiment 1 of the present invention. This embodiment is applicable to situations in which an indoor navigation robot actively follows a target in real time. The method of this embodiment may be executed by a pedestrian tracking device; the device may be implemented in software and/or hardware and may be configured in a server.

[0027] Correspondingly, the method specifically includes the following steps:

[0028] S110. If it is determined that the indoor navigation robot and the target pedestrian have established a following navigation relationship, a detection and acquisition image is acquired through an indoor environment camera each time a tracking time point is reached.
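Step S110 can be sketched as a gated acquisition loop: no images are collected until the following navigation relationship exists, and thereafter one image is captured per tracking time point. The camera class and its `capture` method below are hypothetical stand-ins, not an API named in the patent.

```python
import itertools

class IndoorEnvironmentCamera:
    """Hypothetical stand-in for the indoor environment camera."""
    def __init__(self):
        self._frame_id = itertools.count()

    def capture(self):
        # A real camera would return pixel data; here we return a frame id
        # so the control flow can be illustrated without hardware.
        return {"frame": next(self._frame_id)}

def acquire_detection_images(camera, relationship_established, n_time_points):
    """Sketch of step S110: once a following navigation relationship is
    established, acquire one detection and acquisition image per tracking
    time point; otherwise acquire nothing."""
    if not relationship_established:
        return []
    return [camera.capture() for _ in range(n_time_points)]

images = acquire_detection_images(IndoorEnvironmentCamera(), True, 3)
print([im["frame"] for im in images])  # [0, 1, 2]
```

In a deployed system the loop would of course run against a wall-clock schedule rather than a fixed count; the count merely keeps the sketch deterministic.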

[0029] The following navigation relationship may be established when the indoor navigation robot collects an initial image of the target pedestrian and a following task relationship is established with the target pedestrian.

Example Embodiment

[0055] Embodiment 2

[0056] Figure 2a is a flowchart of another indoor robot navigation control method provided in Embodiment 2 of the present invention. This embodiment is refined on the basis of the above-mentioned embodiments: it further refines the calculation, according to the pedestrian image collected by the indoor navigation robot for the target pedestrian and the set following navigation route, of the first fusion comparison feature corresponding to each detection target and the second fusion comparison feature corresponding to the tracking prediction target. Referring to Figure 2a, the method specifically includes the following steps:
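The patent does not specify how the appearance information and the route information are fused. One plausible reading, sketched below, is to concatenate a normalized appearance embedding with the target's normalized offset from the set following navigation route; both the normalization and the concatenation are assumptions for illustration.

```python
def l2_normalize(v):
    """Scale a vector to unit length (unchanged if it is all zeros)."""
    norm = sum(x * x for x in v) ** 0.5
    return [x / norm for x in v] if norm else list(v)

def fusion_comparison_feature(appearance_feature, position, route_point):
    """Sketch of a fusion comparison feature: an appearance embedding
    (e.g., from a pedestrian re-identification network) concatenated with
    the detection's offset from the nearest point of the set following
    navigation route. The concrete fusion rule is an assumption."""
    offset = [p - r for p, r in zip(position, route_point)]
    return l2_normalize(appearance_feature) + l2_normalize(offset)

feat = fusion_comparison_feature([3.0, 4.0], (2.0, 1.0), (1.0, 1.0))
print(feat)  # [0.6, 0.8, 1.0, 0.0]
```

The same function would produce the first fusion comparison feature for each detection target and, applied to the tracking prediction target, the second fusion comparison feature, so the two live in the same feature space and can be compared directly.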

[0057] S210. If it is determined that the indoor navigation robot and the target pedestrian have established a following navigation relationship, a detection and acquisition image is acquired through an indoor environment camera each time a tracking time point is reached.

[0058] S220: Identify at least one detection target in the detection and acquisition image, and identify a tracking prediction target for the target pedestrian in the detection and acquisition image.

Example Embodiment

[0078] Embodiment 3

[0079] Figure 3 is a flowchart of another indoor robot navigation control method provided in Embodiment 3 of the present invention. This embodiment is refined on the basis of the above-mentioned embodiments: it further refines the identification of the target pedestrian among the detection targets according to the similarity between each first fusion comparison feature and the second fusion comparison feature. Referring to Figure 3, the method specifically includes the following steps:
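The similarity-based identification can be sketched as follows: score each detection's first fusion comparison feature against the second fusion comparison feature and keep the best match above a threshold. Cosine similarity and the threshold value are assumptions for illustration; the patent does not name a specific similarity measure.

```python
def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (0.0 for a zero vector)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def identify_target_pedestrian(first_features, second_feature, threshold=0.5):
    """Sketch of the identification step: return the index of the
    detection target whose first fusion comparison feature is most
    similar to the second fusion comparison feature, or None if no
    detection exceeds the (assumed) threshold."""
    best_idx, best_sim = None, threshold
    for i, feature in enumerate(first_features):
        sim = cosine_similarity(feature, second_feature)
        if sim > best_sim:
            best_idx, best_sim = i, sim
    return best_idx

detections = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
prediction = [0.6, 0.8]
print(identify_target_pedestrian(detections, prediction))  # 2
```

Returning `None` when no detection clears the threshold gives the robot a natural "target lost" signal at that tracking time point.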

[0080] S310. If it is determined that the indoor navigation robot and the target pedestrian have established a following navigation relationship, a detection and acquisition image is acquired through an indoor environment camera each time a tracking time point is reached.

[0081] S320: Identify at least one detection target in the detection and acquisition image, and identify a tracking prediction target for the target pedestrian in the detection and acquisition image.
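The patent does not state how the tracking prediction target is singled out among the detections. A common choice in pedestrian tracking, used here purely as an illustrative assumption, is to associate the target's last known bounding box with the current detections by intersection-over-union (IoU):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def tracking_prediction_target(detections, last_target_box):
    """Sketch of the second half of S320: among the detected boxes, pick
    the one that best overlaps the target pedestrian's last known box as
    the tracking prediction target (IoU association is an assumption;
    the patent does not name a specific association rule)."""
    return max(detections, key=lambda box: iou(box, last_target_box))

dets = [(0, 0, 10, 10), (40, 40, 60, 60)]
print(tracking_prediction_target(dets, (42, 41, 61, 62)))  # (40, 40, 60, 60)
```

A motion model such as a Kalman filter could replace the raw last-known box here without changing the surrounding flow.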



Abstract

The embodiments of the invention disclose a pedestrian tracking method and device, computer equipment and a storage medium. The method comprises: if it is determined that a following navigation relationship has been established between an indoor navigation robot and a target pedestrian, acquiring a detection and acquisition image through an indoor environment camera each time a tracking time point is reached; identifying at least one detection target in the image and identifying a tracking prediction target for the target pedestrian; calculating a first fusion comparison feature corresponding to each detection target and a second fusion comparison feature corresponding to the tracking prediction target, according to a pedestrian image acquired by the indoor navigation robot for the target pedestrian and a set following navigation route; and identifying the target pedestrian among the detection targets according to the similarity between each first fusion comparison feature and the second fusion comparison feature, so as to track the target pedestrian. This solves the problem that indoor navigation robots lack an active real-time following function, realizes real-time following of customers, and improves the intelligence level of the robot.
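The abstract's per-time-point flow can be condensed into one function: extract a first fusion comparison feature per detection, compare each with the prediction target's second fusion comparison feature, and report which detection (if any) is the target pedestrian. The feature extractor `feature_of` and the threshold are stand-in assumptions, not elements named by the patent.

```python
def track_step(detections, predicted_feature, feature_of, threshold=0.5):
    """End-to-end sketch of one tracking time point, per the abstract:
    compute a first fusion comparison feature for each detection target,
    compare each with the tracking prediction target's second fusion
    comparison feature, and return the index of the detection identified
    as the target pedestrian (or None if no similarity clears the
    assumed threshold)."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    firsts = [feature_of(d) for d in detections]
    sims = [cos(f, predicted_feature) for f in firsts]
    best = max(range(len(sims)), key=sims.__getitem__, default=None)
    return best if best is not None and sims[best] >= threshold else None

# Toy usage: the "detections" are already feature vectors, so the
# extractor is the identity function.
print(track_step([[1, 0], [0, 1]], [0.1, 0.9], lambda d: d))  # 1
```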

Description


Claims


Application Information

Owner: AGRICULTURAL BANK OF CHINA