Multi-modal neural image feature selection method based on sample weight and low-rank constraint

A feature selection method using low-rank constraint technology, applied in the field of multimodal neuroimaging feature selection. It addresses problems such as the inability to fully exploit the complementary information in multimodal data, overly simple feature representations, and restricted model learning performance, thereby improving diagnostic accuracy and classification performance.

Pending Publication Date: 2020-11-10
Owner: WENZHOU UNIVERSITY

AI Technical Summary

Benefits of technology

This patented technology helps create an improved way of diagnosing disease by analyzing multiple sources, such as images obtained from magnetic resonance imaging (MRI) scans. It can identify specific patterns within these MRI datasets that help researchers better interpret a subject's health condition. By combining the analysis across multiple modalities, it provides greater precision than any single modality alone while still taking advantage of other available imaging techniques. Overall, the method improves the efficiency of medical care by helping identify potential causes early, before symptoms appear.

Problems solved by technology

This patent discusses different ways to extract useful structural characteristics from images collected through various techniques such as magnetic resonance imaging (MRI), positron emission tomography (PET), X-ray fluorescence tests, and ultrasound, but current approaches lack effective tools for analyzing and understanding complex multi-modality datasets containing diverse types of data.

Embodiment Construction

[0031] Referring to figure 1, the feature selection method based on sample weights and low-rank constraints disclosed by the present invention comprises the following steps:

[0032] (1) Multimodal brain imaging data collection;

[0033] (2) After the data of the multiple modalities are obtained, perform feature selection by collaborative multi-modal feature analysis: establish a regression model from each modality's data to the classification label and apply a group sparsity constraint to the regression vectors, so as to obtain a common feature subset that is relevant to all tasks;
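
The patent page does not reproduce the objective of step (2) explicitly. As a minimal sketch of the group-sparsity idea, assuming an L2,1 (row-sparse) penalty over per-modality regression vectors stacked into a matrix W, and with illustrative names (X_list, y, lam) that do not come from the patent:

```python
import numpy as np

def l21_prox(W, t):
    """Row-wise soft-thresholding: proximal operator of t * ||W||_{2,1}."""
    norms = np.maximum(np.linalg.norm(W, axis=1, keepdims=True), 1e-12)
    return W * np.maximum(0.0, 1.0 - t / norms)

def group_sparse_feature_selection(X_list, y, lam=0.1, lr=1e-3, n_iter=500):
    """Jointly regress the label from every modality with a shared row-sparse
    weight matrix W (rows = features, columns = modalities).

    X_list : list of (n_samples, d) arrays, one per modality (same d assumed)
    y      : (n_samples,) class labels, e.g. coded as +/-1
    """
    d = X_list[0].shape[1]
    W = np.zeros((d, len(X_list)))
    for _ in range(n_iter):
        # Gradient of the summed least-squares loss, one column per modality.
        grad = np.column_stack(
            [X.T @ (X @ W[:, m] - y) for m, X in enumerate(X_list)]
        )
        # Proximal gradient step: the L2,1 penalty zeroes out whole rows,
        # so a feature is kept or discarded jointly across modalities.
        W = l21_prox(W - lr * grad, lr * lam)
    selected = np.flatnonzero(np.linalg.norm(W, axis=1) > 1e-8)
    return W, selected
```

Rows of W that survive the penalty correspond to the "common feature subset" that is relevant to the task across all modalities.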

[0034] (3) Multimodal data feature modeling;

[0035] (4) Rewrite the objective function in augmented Lagrangian form, so that it becomes a convex function;
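
The objective itself sits in the paywalled portion of the page. Purely as an illustration of step (4), an objective combining a weighted least-squares loss, a group-sparse term, and a low-rank (nuclear-norm) term can be split with an auxiliary variable Z = W and written in augmented Lagrangian form; the sample-weight matrices S_m, the trade-off parameters λ1 and λ2, and the penalty ρ are assumptions, not values taken from the patent:

```latex
\min_{W,Z}\ \sum_{m=1}^{M} \bigl\| S_m\,(y - X_m w_m) \bigr\|_2^2
  + \lambda_1 \|W\|_{2,1} + \lambda_2 \|Z\|_*
  \quad \text{s.t.}\ W = Z,

\mathcal{L}_\rho(W, Z, \Lambda) =
  \sum_{m=1}^{M} \bigl\| S_m\,(y - X_m w_m) \bigr\|_2^2
  + \lambda_1 \|W\|_{2,1} + \lambda_2 \|Z\|_*
  + \langle \Lambda,\ W - Z \rangle + \tfrac{\rho}{2}\,\|W - Z\|_F^2 .
```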

[0036] (5) Use the alternating direction method of multipliers (ADMM) to solve for each variable in the formula of step (4) by gradient descent, obtaining the weight matrix W of each modality for each sample and the weight of each feature;
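
Step (5) then alternates updates over the split variables. A minimal ADMM sketch under the illustrative augmented Lagrangian above (scaled dual variable U, singular value thresholding for the nuclear-norm block; the hyperparameters and variable names are assumptions, not taken from the patent):

```python
import numpy as np

def l21_prox(W, t):
    """Row-wise soft-thresholding (prox of t * ||W||_{2,1})."""
    norms = np.maximum(np.linalg.norm(W, axis=1, keepdims=True), 1e-12)
    return W * np.maximum(0.0, 1.0 - t / norms)

def svt(A, tau):
    """Singular value thresholding (prox of tau * nuclear norm)."""
    Ua, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Ua @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def admm_low_rank_group_sparse(X_list, y, lam1=0.1, lam2=0.1, rho=1.0,
                               lr=1e-3, n_outer=100, n_inner=20):
    """ADMM over the split W = Z: W carries the group-sparse term,
    Z carries the low-rank (nuclear-norm) term, U is the scaled multiplier."""
    d = X_list[0].shape[1]
    W = np.zeros((d, len(X_list)))
    Z = np.zeros_like(W)
    U = np.zeros_like(W)
    for _ in range(n_outer):
        # W-step: a few proximal gradient-descent passes on the smooth part
        # (least-squares loss plus the quadratic ADMM coupling term).
        for _ in range(n_inner):
            grad = np.column_stack(
                [X.T @ (X @ W[:, m] - y) for m, X in enumerate(X_list)]
            ) + rho * (W - Z + U)
            W = l21_prox(W - lr * grad, lr * lam1)
        # Z-step: closed-form singular value thresholding.
        Z = svt(W + U, lam2 / rho)
        # Dual (scaled multiplier) update.
        U = U + W - Z
    return W, Z
```

In this sketch the feature weights of step (5) can be read off the rows of W (e.g., their L2 norms); the sample-weighting part of the objective is not reproduced on this page and is therefore not modeled here.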


Abstract

The invention discloses a multi-modal neural image feature selection method based on sample weights and a low-rank constraint. The method comprises the following steps: (1) collect multi-modal brain image data; (2) after the data of the multiple modalities are obtained, perform feature selection by collaborative multi-modal feature analysis, establish a regression model from each modality's data to the classification label, and apply a group sparsity constraint to the regression vectors, thereby obtaining a common feature subset relevant to all tasks; (3) model the multi-modal data features; (4) rewrite the objective function in augmented Lagrangian form so that it becomes a convex function; (5) obtain the weight matrix of each modality for each sample and the weight of each feature; and (6) use a multi-kernel support vector machine to fuse the multi-modal sample features for classification. This technical scheme takes into account both the importance of samples across modalities and the correlation among modalities, finds more meaningful features, and improves classification and prediction accuracy.
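
Step (6) fuses the selected per-modality features with a multi-kernel support vector machine. scikit-learn does not ship a multiple-kernel SVM, so one way to sketch the fusion is to pass a precomputed convex combination of per-modality kernels to an ordinary SVC; the kernel weights `betas` and the RBF kernel choice are assumptions, not taken from the patent:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

def multi_kernel_svm_predict(X_train_list, X_test_list, y_train,
                             betas, gamma=1.0, C=1.0):
    """Fuse modalities by a weighted sum of per-modality RBF kernels and
    train an SVM on the combined (precomputed) Gram matrix."""
    K_train = sum(b * rbf_kernel(Xtr, Xtr, gamma=gamma)
                  for b, Xtr in zip(betas, X_train_list))
    K_test = sum(b * rbf_kernel(Xte, Xtr, gamma=gamma)
                 for b, (Xtr, Xte) in zip(betas, zip(X_train_list,
                                                     X_test_list)))
    clf = SVC(kernel="precomputed", C=C).fit(K_train, y_train)
    return clf.predict(K_test)
```

Equal weights (betas = [1/M] * M) are a common baseline; in practice the kernel weights are typically tuned by cross-validation or learned with a dedicated multiple-kernel learning solver.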

