Long-time visual target tracking method based on continuous learning

A long-term visual target tracking technology, applicable to neural learning methods, image data processing, image enhancement, and related fields. It addresses problems such as the difficulty of adapting to online changes in a video, frequent tracking failure, and accelerated tracking drift, achieving long-term stable tracking of targets and improved adaptability and reliability.

Pending Publication Date: 2020-01-24
BEIJING UNIV OF TECH


Problems solved by technology

However, tracking models trained only offline usually have difficulty adapting to online changes in a video, and simply updating …


Image

Drawings (3): Long-time visual target tracking method based on continuous learning


Embodiment Construction

[0033] The embodiments of the present invention are described in detail below with reference to the accompanying drawings:

[0034] A long-term target tracking method based on continuous learning; the overall flow is shown in Figure 1. The algorithm is divided into three parts: model initialization, online tracking, and model updating. Model initialization: for the initial frame, a superpixel segmentation method first produces a segmented image of the initial frame that contains only the foreground. The original initial-frame image and the segmented image are then both passed through the network to extract convolutional-layer features, and the two sets of features are fused by element-wise addition. The fused features pass through the fully connected layer and the classification layer to produce classification scores, from which the classification loss is computed; the optimal initial model is then obtained by backpropagating the classification loss.
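The initialization step above can be sketched in miniature. This is a hedged illustration, not the patent's implementation: random arrays stand in for the convolutional features of the raw frame and the foreground-only segmented frame, the fusion is the element-wise addition the text describes, and the fully-connected-plus-classification stage is reduced to a single linear layer trained by backpropagating a softmax cross-entropy loss. All names and shapes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def init_model(feat_raw, feat_seg, labels, lr=0.1, steps=100):
    """Model-initialization sketch: fuse features of the raw initial frame
    and its foreground-only segmented version by element-wise addition,
    then fit a linear classification layer by backpropagating the
    softmax cross-entropy (classification) loss."""
    fused = feat_raw + feat_seg              # feature fusion by addition
    n, d = fused.shape
    k = labels.max() + 1
    W = np.zeros((d, k))                     # classification-layer weights
    onehot = np.eye(k)[labels]
    for _ in range(steps):
        p = softmax(fused @ W)               # classification scores
        grad = fused.T @ (p - onehot) / n    # gradient of the loss w.r.t. W
        W -= lr * grad                       # gradient-descent update
    return W

# Toy stand-ins for conv features (hypothetical shapes) and fg/bg labels.
feat_raw = rng.normal(size=(64, 8))
feat_seg = rng.normal(size=(64, 8))
labels = (feat_raw[:, 0] + feat_seg[:, 0] > 0).astype(int)

W = init_model(feat_raw, feat_seg, labels)
acc = (softmax((feat_raw + feat_seg) @ W).argmax(1) == labels).mean()
```

In the real method the classifier sits on top of deep convolutional features and is optimized by a full backpropagation pass; the single linear layer here only illustrates the fuse-score-loss-backprop loop of the initialization stage.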



Abstract

The invention relates to a long-term visual target tracking method based on continuous learning. The method comprises the steps of network model design, model initialization, online tracking, and model updating. A deep neural network structure is designed for long-term visual target tracking; an initialized network model is obtained through model initialization, online tracking is then conducted with the initialized model, and long-term or short-term model updates are performed during tracking through a continuous learning method, so that the method adapts to the various changes a target undergoes while being tracked. The method converts the online updating process of a traditional visual tracking model into a continuous learning process and builds a complete appearance description of the target from all historical data of the video, effectively improving the robustness of long-term visual tracking. The method can provide an effective solution for long-term visual target tracking in applications such as intelligent video surveillance, human-computer interaction, and visual navigation.
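The long-term/short-term update schedule described in the abstract can be sketched as a control loop. This is a minimal sketch under assumed interfaces (`predict`, `update`, the intervals, and the confidence gate are all hypothetical), not the patent's continual-learning procedure: short-term updates draw on a recent window of reliable frames, while long-term updates draw on the full accumulated history of the video.

```python
from collections import deque

class DummyTracker:
    """Stand-in for the tracking network (hypothetical interface)."""
    def __init__(self):
        self.updates = []                    # sample counts per update call
    def predict(self, frame):
        return (0, 0, 32, 32), 0.9           # (bounding box, confidence)
    def update(self, samples):
        self.updates.append(len(samples))    # record each model update

def track(frames, model, short_interval=10, long_interval=100,
          conf_thresh=0.5):
    short_mem = deque(maxlen=20)   # recent reliable samples (short-term)
    long_mem = []                  # all historical samples (long-term)
    boxes = []
    for t, frame in enumerate(frames, start=1):
        box, conf = model.predict(frame)
        boxes.append(box)
        if conf > conf_thresh:               # store only confident results
            short_mem.append((frame, box))
            long_mem.append((frame, box))
        if t % long_interval == 0:
            model.update(long_mem)           # long-term: full history
        elif t % short_interval == 0:
            model.update(list(short_mem))    # short-term: recent window
    return boxes

tracker = DummyTracker()
boxes = track(range(100), tracker)
```

The design point this illustrates is the one the abstract makes: instead of overwriting the model with only the latest frames, the long-term updates keep the complete appearance history in play, which is what resists drift over long sequences.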


