CN120086654A - Battlefield casualty status recognition method and system based on multi-dimensional data - Google Patents


Publication number
CN120086654A
Authority
CN
China
Prior art keywords
data
sensing data
physiological
wounded
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202510251961.8A
Other languages
Chinese (zh)
Other versions
CN120086654B (en)
Inventor
Li Yun (李云)
Zheng Zhiyuan (郑致远)
Current Assignee
Guangdong Yun Zhao Medical Technology Co ltd
Original Assignee
Guangdong Yun Zhao Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Yun Zhao Medical Technology Co ltd filed Critical Guangdong Yun Zhao Medical Technology Co ltd
Priority to CN202510251961.8A
Publication of CN120086654A
Application granted
Publication of CN120086654B
Legal status: Active
Anticipated expiration

Classifications

    • G06F18/24: Pattern recognition; classification techniques
    • G06F18/214: Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06N3/08: Neural networks; learning methods
    • G06Q50/26: ICT specially adapted for government or public services
    • G16H50/30: ICT for medical diagnosis, simulation or data mining; calculating health indices; individual health risk assessment
    • G16H50/70: ICT for medical diagnosis, simulation or data mining; mining of medical data, e.g. analysing previous cases of other patients


Abstract

The invention discloses a battlefield wounded state identification method and system based on multi-dimensional data. The method comprises: acquiring physiological sensing data and multi-dimensional environment sensing data of a wounded to be identified; correcting the physiological sensing data according to at least one environment sensing data to obtain corrected physiological data; determining, based on a state recognition algorithm, state recognition results respectively corresponding to the corrected physiological data and the environment sensing data; and determining the wounded state of the wounded to be identified according to those state recognition results. By fully combining multi-dimensional data with comprehensive recognition, the invention achieves more accurate and efficient wounded state identification, enabling more intelligent and automated identification of the wounded and improving rescue effect and efficiency.

Description

Battlefield wounded state identification method and system based on multidimensional data
Technical Field
The invention relates to the technical field of data processing, in particular to a battlefield wounded state identification method and system based on multidimensional data.
Background
In modern disaster and wartime rescue work, intelligent data processing technology plays an important strategic role in improving the quality of care and the efficiency of rescuing the wounded. With the rapid development of artificial intelligence and Internet-of-Things sensing technology, improving the accuracy of battlefield wounded state identification by algorithmic means has become an important technical problem. However, most existing technologies perform state identification based only on sensor devices on the wounded person's body, without fully combining multi-dimensional data for interactive correction and further identification, so the accuracy and efficiency of wounded state identification are markedly low, and both the degree of intelligence and the degree of automation are deficient. The prior art therefore has defects that need to be addressed.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a battlefield wounded state identification method and system based on multi-dimensional data that can fully combine multi-dimensional data with comprehensive recognition to achieve more accurate and efficient wounded state identification, thereby enabling more intelligent and automated identification of the wounded and improving rescue effect and efficiency.
To solve the above technical problems, a first aspect of the present invention discloses a battlefield wounded state recognition method based on multidimensional data, the method comprising:
acquiring physiological sensing data and multidimensional environmental sensing data of a wounded to be identified;
correcting the physiological sensing data according to at least one environmental sensing data to obtain corrected physiological data;
based on a state recognition algorithm, determining state recognition results respectively corresponding to the corrected physiological data and the environment sensing data;
and determining the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data.
As an optional embodiment, in the first aspect of the present invention, the physiological sensing data includes at least one of blood pressure data, body temperature data, blood glucose data, speed data, acceleration data, respiration data, and pulse data.
As an optional implementation manner, in the first aspect of the present invention, the environmental sensing data includes at least one of an environmental image, an environmental sound, an environmental light reflection data, an environmental temperature, and an environmental humidity.
As an optional implementation manner, in the first aspect of the present invention, correcting the physiological sensing data according to at least one environmental sensing data to obtain corrected physiological data includes:
For each physiological sensing data, determining at least one relevant environment sensing type corresponding to the sensing data type of the physiological sensing data according to a preset data type relevant rule;
Screening out relevant environment sensing data corresponding to the relevant environment sensing type from all the environment sensing data;
Determining reference physiological prediction data based on all the relevant environment sensing data and a preset prediction neural network;
and calculating the average value of the reference physiological prediction data and the physiological sensing data to obtain corrected physiological data corresponding to the physiological sensing data.
As an optional implementation manner, in the first aspect of the present invention, the determining the reference physiological prediction data based on all the relevant environmental sensing data and a preset prediction neural network includes:
Inputting each relevant environment sensing data to a physiological relevant prediction neural network corresponding to the sensing data type to obtain physiological prediction data corresponding to each relevant environment sensing data, wherein the physiological relevant prediction neural network is obtained through training a training data set comprising a plurality of training relevant environment sensing data and training physiological data labels of the corresponding sensing data type;
and calculating a weighted average of the physiological prediction data corresponding to all the relevant environment sensing data to obtain the reference physiological prediction data, wherein the weight applied to the physiological prediction data corresponding to each relevant environment sensing data comprises a first weight and a second weight; the first weight is in direct proportion to the proportion, within the training data set, of training relevant environment sensing data of the same data type as that relevant environment sensing data, and the second weight is in direct proportion to the proportion of that relevant environment sensing data in the total data quantity of all the relevant environment sensing data.
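The correction and weighting rules above can be sketched in a few lines. This is an illustrative reading of the claim, not the patent's implementation: the per-sensor prediction values stand in for the outputs of the trained physiological-correlation prediction neural networks, and every function and variable name is an assumption.

```python
# Hedged sketch of the correction step: combine per-sensor physiological
# predictions with two proportional weights, then average the result
# with the raw physiological reading.

def reference_prediction(predictions, train_fracs, data_fracs):
    """Weighted average of per-sensor physiological predictions.

    predictions : predicted physiological value per relevant env sensor
    train_fracs : first weight, proportional to the fraction of training
                  data whose type matches that sensor's data type
    data_fracs  : second weight, proportional to that sensor's share of
                  the total volume of relevant environment sensing data
    """
    weights = [w1 * w2 for w1, w2 in zip(train_fracs, data_fracs)]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, predictions)) / total

def corrected_physiological(sensed_value, ref_prediction):
    # Per the claim, the correction is the mean of the reference
    # prediction and the raw sensed value.
    return (sensed_value + ref_prediction) / 2.0

# Example: a body-temperature reading corrected by two env-based predictions.
ref = reference_prediction([36.2, 36.8], [0.6, 0.4], [0.7, 0.3])
corrected = corrected_physiological(37.4, ref)
```

The two weights multiply into a single normalized weight per sensor; any other combination rule would be an equally valid reading of the claim.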
As an optional implementation manner, in the first aspect of the present invention, determining, based on a state recognition algorithm, the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data includes:
for each sensing data, determining the data type corresponding to the sensing data, wherein the sensing data is the corrected physiological data or the environment sensing data;
inputting the sensing data into a trained wounded state recognition neural network corresponding to the data type to obtain a state recognition result corresponding to the sensing data, wherein the wounded state recognition neural network is obtained by training on a training data set comprising a plurality of training sensing data of the corresponding data type and corresponding wounded state labels, and the state recognition result comprises an injury type, an injured part, an injury duration and a consciousness state.
In a first aspect of the present invention, the wounded state recognition neural network is a first network when the sensing data is the corrected physiological data, and is a second network when the sensing data is the environmental sensing data, wherein:
The first network comprises a data corresponding part identification network and a part state identification network, wherein the data corresponding part identification network is used for identifying a human body part corresponding to the corrected physiological data, and the part state identification network is used for predicting the corrected physiological data based on a state identification network obtained by training a training data set related to the human body part so as to obtain a corresponding state identification result;
The second network comprises a wounded part identification network and a wounded state identification network, wherein the wounded part identification network is used for identifying a data part related to the wounded to be identified in the environment sensing data, and the wounded state identification network is used for predicting the data part to obtain a corresponding state identification result.
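The per-type recognition described above can be sketched as a simple dispatch table: each piece of sensing data is routed to the model trained for its data type, which returns the four fields named in the claim. The model below is a stand-in callable with assumed names and thresholds, not the patent's trained first or second network.

```python
# Illustrative dispatch of sensing data to per-type recognition models.

def recognize_all(sensing_data, models):
    """sensing_data: {data_type: value}; models: {data_type: callable}."""
    results = {}
    for data_type, value in sensing_data.items():
        model = models[data_type]          # network trained for this type
        results[data_type] = model(value)  # -> one state recognition result
    return results

def dummy_pulse_model(bpm):
    # Stand-in for a trained wounded-state recognition network; returns
    # the four fields of a state recognition result per the claim.
    return {"injury_type": "hemorrhage" if bpm > 120 else "none",
            "injured_part": "unknown",
            "injury_duration_min": 0,
            "consciousness": "alert" if bpm < 150 else "impaired"}

results = recognize_all({"pulse": 135}, {"pulse": dummy_pulse_model})
```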
As an optional implementation manner, in the first aspect of the present invention, the determining, according to the state identification results corresponding to the corrected physiological data and the environmental sensing data, the wounded state of the wounded to be identified includes:
calculating intersection sets of state recognition results corresponding to all the corrected physiological data to obtain a first reference state recognition result;
for the state recognition result corresponding to each environmental sensing data, calculating the average similarity between that state recognition result and the state recognition results corresponding to each of the other environmental sensing data, so as to obtain a similarity parameter of the state recognition result corresponding to that environmental sensing data;
Screening out state recognition results with similarity parameters larger than a parameter threshold value from the state recognition results corresponding to all the environmental sensing data to obtain a plurality of preferred state recognition results;
Calculating the intersection of all the preferred state recognition results to obtain a second reference state recognition result;
and calculating the intersection of the first reference state recognition result and the second reference state recognition result to obtain the wounded state of the wounded to be identified.
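The fusion steps above can be sketched with state recognition results modelled as sets of status labels. The similarity measure is an assumption (Jaccard similarity; the patent does not fix one), as are the empty-set guard and every name below.

```python
# Hedged sketch of the intersection-based fusion of recognition results.

def jaccard(a, b):
    # Assumed similarity measure between two result sets.
    return len(a & b) / len(a | b) if (a | b) else 1.0

def fuse(phys_results, env_results, threshold):
    # First reference result: intersection of all physiological results.
    first = set.intersection(*phys_results)
    # Similarity parameter: mean similarity to every other env result;
    # keep only results whose parameter exceeds the threshold.
    preferred = []
    for i, r in enumerate(env_results):
        others = env_results[:i] + env_results[i + 1:]
        mean_sim = sum(jaccard(r, o) for o in others) / len(others)
        if mean_sim > threshold:
            preferred.append(r)
    if not preferred:   # guard added here; the patent assumes at least
        return first    # one preferred result exists
    # Second reference result: intersection of the preferred results.
    second = set.intersection(*preferred)
    # Final wounded state: intersection of the two reference results.
    return first & second

state = fuse(
    [{"fracture", "bleeding"}, {"bleeding"}],         # physiological
    [{"bleeding", "burn"}, {"bleeding"}, {"shock"}],  # environmental
    threshold=0.2,
)
```

Here the outlier result {"shock"} is filtered out by the similarity parameter before the second intersection is taken.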
The second aspect of the embodiment of the invention discloses a battlefield wounded state identification system based on multidimensional data, which comprises:
The acquisition module is used for acquiring physiological sensing data and multidimensional environment sensing data of the wounded to be identified;
the correction module is used for correcting the physiological sensing data according to at least one piece of environment sensing data to obtain corrected physiological data;
the identification module is used for determining state identification results respectively corresponding to the corrected physiological data and the environment sensing data based on a state identification algorithm;
And the determining module is used for determining the wounded state of the wounded to be identified according to the state identification results respectively corresponding to the corrected physiological data and the environment sensing data.
As an optional embodiment, in the second aspect of the present invention, the physiological sensing data includes at least one of blood pressure data, body temperature data, blood glucose data, speed data, acceleration data, respiration data, and pulse data.
As an optional embodiment, in the second aspect of the invention, the environmental sensing data includes at least one of an environmental image, an environmental sound, an environmental light reflection data, an environmental temperature, and an environmental humidity.
As an optional implementation manner, in the second aspect of the present invention, the specific manner in which the correction module corrects the physiological sensing data according to at least one piece of environmental sensing data to obtain corrected physiological data includes:
For each physiological sensing data, determining at least one relevant environment sensing type corresponding to the sensing data type of the physiological sensing data according to a preset data type relevant rule;
Screening out relevant environment sensing data corresponding to the relevant environment sensing type from all the environment sensing data;
Determining reference physiological prediction data based on all the relevant environment sensing data and a preset prediction neural network;
and calculating the average value of the reference physiological prediction data and the physiological sensing data to obtain corrected physiological data corresponding to the physiological sensing data.
As an optional implementation manner, in the second aspect of the present invention, the specific manner in which the correction module determines the reference physiological prediction data based on all the relevant environment sensing data and a preset prediction neural network includes:
Inputting each relevant environment sensing data to a physiological relevant prediction neural network corresponding to the sensing data type to obtain physiological prediction data corresponding to each relevant environment sensing data, wherein the physiological relevant prediction neural network is obtained through training a training data set comprising a plurality of training relevant environment sensing data and training physiological data labels of the corresponding sensing data type;
and calculating a weighted average of the physiological prediction data corresponding to all the relevant environment sensing data to obtain the reference physiological prediction data, wherein the weight applied to the physiological prediction data corresponding to each relevant environment sensing data comprises a first weight and a second weight; the first weight is in direct proportion to the proportion, within the training data set, of training relevant environment sensing data of the same data type as that relevant environment sensing data, and the second weight is in direct proportion to the proportion of that relevant environment sensing data in the total data quantity of all the relevant environment sensing data.
As an optional implementation manner, in the second aspect of the present invention, the specific manner in which the identification module determines, based on a state recognition algorithm, the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data includes:
for each sensing data, determining the data type corresponding to the sensing data, wherein the sensing data is the corrected physiological data or the environment sensing data;
inputting the sensing data into a trained wounded state recognition neural network corresponding to the data type to obtain a state recognition result corresponding to the sensing data, wherein the wounded state recognition neural network is obtained by training on a training data set comprising a plurality of training sensing data of the corresponding data type and corresponding wounded state labels, and the state recognition result comprises an injury type, an injured part, an injury duration and a consciousness state.
In a second aspect of the present invention, the wounded state recognition neural network is a first network when the sensing data is the corrected physiological data, and is a second network when the sensing data is the environmental sensing data, wherein:
The first network comprises a data corresponding part identification network and a part state identification network, wherein the data corresponding part identification network is used for identifying a human body part corresponding to the corrected physiological data, and the part state identification network is used for predicting the corrected physiological data based on a state identification network obtained by training a training data set related to the human body part so as to obtain a corresponding state identification result;
The second network comprises a wounded part identification network and a wounded state identification network, wherein the wounded part identification network is used for identifying a data part related to the wounded to be identified in the environment sensing data, and the wounded state identification network is used for predicting the data part to obtain a corresponding state identification result.
As an optional implementation manner, in the second aspect of the present invention, the specific manner in which the determining module determines the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data includes:
calculating intersection sets of state recognition results corresponding to all the corrected physiological data to obtain a first reference state recognition result;
for the state recognition result corresponding to each environmental sensing data, calculating the average similarity between that state recognition result and the state recognition results corresponding to each of the other environmental sensing data, so as to obtain a similarity parameter of the state recognition result corresponding to that environmental sensing data;
Screening out state recognition results with similarity parameters larger than a parameter threshold value from the state recognition results corresponding to all the environmental sensing data to obtain a plurality of preferred state recognition results;
Calculating the intersection of all the preferred state recognition results to obtain a second reference state recognition result;
and calculating the intersection of the first reference state recognition result and the second reference state recognition result to obtain the wounded state of the wounded to be identified.
In a third aspect, the invention discloses another battlefield wounded state recognition system based on multidimensional data, the system comprising:
a memory storing executable program code;
A processor coupled to the memory;
The processor invokes the executable program code stored in the memory to perform some or all of the steps in the multi-dimensional data based battlefield wounded state recognition method disclosed in the first aspect of the present invention.
A fourth aspect of the invention discloses a computer storage medium storing computer instructions which, when invoked, are adapted to perform part or all of the steps of the method for identifying a battlefield wounded state based on multi-dimensional data disclosed in the first aspect of the invention.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
According to the invention, the physiological sensing data can be corrected according to at least one environment sensing data to obtain accurate corrected physiological data; the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data are then determined based on the state recognition algorithm, and the wounded state of the wounded to be identified is determined comprehensively from them. By fully combining multi-dimensional data with comprehensive recognition, more accurate and efficient wounded state identification can be achieved, enabling more intelligent and automated identification of the wounded and improving rescue effect and efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of a method for identifying the status of a battlefield wounded based on multi-dimensional data according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a battlefield wounded state recognition system based on multi-dimensional data according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of another battlefield wounded state recognition system based on multi-dimensional data according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort are intended to fall within the scope of the invention.
The terms first, second and the like in the description, in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have", as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, article, or device that comprises a list of steps or elements is not limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, apparatus, article, or device.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The invention discloses a battlefield wounded state identification method and system based on multi-dimensional data. The physiological sensing data can be corrected according to at least one environment sensing data to obtain accurate corrected physiological data; the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data are then determined based on a state recognition algorithm, and the wounded state of the wounded to be identified is determined comprehensively from them. By fully combining multi-dimensional data with comprehensive recognition, more accurate and efficient wounded state identification can be achieved, enabling more intelligent and automated identification of the wounded and improving rescue effect and efficiency. A detailed description follows.
Example 1
Referring to fig. 1, fig. 1 is a flow chart of a method for recognizing the status of a battlefield wounded based on multi-dimensional data according to an embodiment of the present invention. The method for identifying the state of the battlefield wounded based on the multidimensional data, which is described in fig. 1, can be applied to a data processing system/a data processing device/a data processing server (wherein the server comprises a local processing server or a cloud processing server). As shown in fig. 1, the method for recognizing the state of a battlefield wounded based on multi-dimensional data may include the following operations:
101. Physiological sensing data and multidimensional environmental sensing data of the wounded to be identified are obtained.
102. And correcting the physiological sensing data according to the at least one environmental sensing data to obtain corrected physiological data.
103. Based on the state recognition algorithm, determining state recognition results respectively corresponding to the corrected physiological data and the environment sensing data.
104. And determining the wounded state of the wounded to be identified according to the state identification results respectively corresponding to the corrected physiological data and the environment sensing data.
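Operations 101 to 104 can be sketched end to end, with each stage supplied by the caller as a stand-in callable (the patent realizes them with trained neural networks and the correction and fusion rules of the later embodiments); all names, lambdas, and thresholds below are illustrative assumptions.

```python
# Minimal end-to-end sketch of operations 101-104.

def identify_wounded_state(physio, env, correct, recognize, fuse):
    # 101: physio and env are the acquired sensing data.
    # 102: correct each physiological reading using the environment data.
    corrected = {k: correct(v, env) for k, v in physio.items()}
    # 103: per-source state recognition on corrected physio and env data.
    phys_results = [recognize(v) for v in corrected.values()]
    env_results = [recognize(v) for v in env.values()]
    # 104: fuse the per-source results into a single wounded state.
    return fuse(phys_results, env_results)

state = identify_wounded_state(
    physio={"pulse": 140},
    env={"ambient_temp": 45.0},
    # Toy correction: discount pulse readings taken in extreme heat.
    correct=lambda v, env: v - 5 if env.get("ambient_temp", 20) > 40 else v,
    # Toy recognizer (one for both sources, a simplification of the
    # patent's per-type networks): any reading above 40 flags critical.
    recognize=lambda v: {"critical"} if v > 40 else {"stable"},
    fuse=lambda p, e: set.intersection(*(p + e)),
)
```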
It can be seen that the embodiment of the invention can correct the physiological sensing data according to at least one environment sensing data to obtain accurate corrected physiological data, then determine the state recognition results respectively corresponding to the corrected physiological data and the environment sensing data based on the state recognition algorithm, and comprehensively determine the wounded state of the wounded to be identified from them. By fully combining multi-dimensional data with comprehensive recognition, more accurate and efficient wounded state identification can be achieved, enabling more intelligent and automated identification of the wounded and improving rescue effect and efficiency.
As an alternative embodiment, in the above step, the physiological sensing data includes at least one of blood pressure data, body temperature data, blood glucose data, speed data, acceleration data, respiration data, and pulse data.
It can be seen that this optional embodiment defines the types of physiological sensing data so as to comprehensively characterize the physiological features of the wounded and facilitate subsequent accurate identification of the wounded state, achieving more accurate and efficient wounded state identification through the full combination of multi-dimensional data and comprehensive recognition, enabling more intelligent and automated identification of the wounded, and improving rescue effect and efficiency.
As an alternative embodiment, in the above step, the environmental sensing data includes at least one of an environmental image, environmental sound, environmental light reflection data, environmental temperature, and environmental humidity.
Therefore, this optional embodiment limits the types of the environmental sensing data so that the relevant characteristics of the rescue environment are comprehensively represented, which facilitates accurate correction of the physiological data and accurate recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an optional embodiment, in the step, correcting the physiological sensing data according to the at least one environmental sensing data to obtain corrected physiological data includes:
For each physiological sensing data, determining at least one relevant environment sensing type corresponding to the sensing data type of the physiological sensing data according to a preset data type relevant rule;
Screening out relevant environment sensing data corresponding to the relevant environment sensing type from all the environment sensing data;
determining reference physiological prediction data based on all relevant environmental sensing data and a preset prediction neural network;
and calculating the average value of the reference physiological prediction data and the physiological sensing data to obtain corrected physiological data corresponding to the physiological sensing data.
Therefore, this optional embodiment determines the reference physiological prediction data based on the environmental sensing data related to the data type of each physiological sensing data, and corrects the physiological sensing data by averaging it with that reference, obtaining more accurate corrected physiological data for subsequent recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
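The correction steps above can be sketched as follows, assuming dictionary-shaped sensing data and a caller-supplied `predictor` standing in for the preset prediction neural network; both of these shapes are illustrative assumptions, not fixed by the disclosure.

```python
def correct_physio(physio_value, physio_type, env_data, type_rules, predictor):
    # Relevant environment sensing types for this physiological data type,
    # per the preset data-type relevance rule.
    relevant_types = type_rules[physio_type]
    # Screen out the relevant environmental sensing data.
    relevant = {t: v for t, v in env_data.items() if t in relevant_types}
    # Reference physiological prediction derived from the relevant
    # environmental sensing data (here delegated to the predictor).
    reference = predictor(physio_type, relevant)
    # Corrected value: mean of the reference prediction and the raw reading.
    return (reference + physio_value) / 2.0
```

For instance, a body-temperature reading of 38.0 with a reference prediction of 37.0 yields a corrected value of 37.5.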
As an alternative embodiment, in the step, determining the reference physiological prediction data based on all the relevant environmental sensing data and the preset prediction neural network includes:
Inputting each piece of relevant environmental sensing data into a physiology-related prediction neural network corresponding to its sensing data type, to obtain physiological prediction data corresponding to that piece of relevant environmental sensing data. Optionally, the physiology-related prediction neural network is trained on a training data set comprising a plurality of pieces of training relevant environmental sensing data and training physiological data labels of the corresponding sensing data types.
Calculating a weighted average of the physiological prediction data corresponding to all the relevant environmental sensing data to obtain the reference physiological prediction data. Optionally, the weight of the physiological prediction data corresponding to each piece of relevant environmental sensing data comprises a first weight and a second weight, wherein the first weight is proportional to the proportion of training data of the same data type as that piece of relevant environmental sensing data in the training relevant environmental sensing data, and the second weight is proportional to the proportion of that piece of relevant environmental sensing data among all the relevant environmental sensing data.
Therefore, this optional embodiment predicts the physiological data corresponding to each piece of relevant environmental sensing data with a prediction neural network trained on a training data set of the corresponding data type, and computes a weighted average of the multiple predictions using weights related to the data proportions, yielding more accurate reference physiological prediction data for correcting the physiological data and recognizing the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
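One plausible reading of the weighting scheme is sketched below. The disclosure does not fix how the first and second weights combine; multiplying them and normalising is an assumption made here for illustration, and the per-type proportion dictionaries are hypothetical inputs.

```python
def weighted_reference(predictions, train_share, runtime_share):
    # predictions: per-environment-type physiological predictions.
    # train_share[t]: proportion of type t in the training data (first weight).
    # runtime_share[t]: proportion of type t among all relevant environmental
    #                   sensing data at run time (second weight).
    weights = {t: train_share[t] * runtime_share[t] for t in predictions}
    total = sum(weights.values())
    # Normalised weighted average of the per-type predictions.
    return sum(predictions[t] * weights[t] for t in predictions) / total
```

With two types sharing the training data equally but one dominating at run time, the dominant type's prediction is pulled toward proportionally.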
As an optional embodiment, in the step, determining, based on the state recognition algorithm, the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data includes:
For each piece of sensing data, determining the data type corresponding to the sensing data; optionally, the sensing data is corrected physiological data or environmental sensing data;
Inputting the sensing data into a trained wounded state recognition neural network corresponding to its data type to obtain a state recognition result corresponding to the sensing data. Optionally, the wounded state recognition neural network is trained on a training data set comprising training sensing data of a plurality of data types and corresponding wounded state labels, and the state recognition result comprises an injury type, an injured part, an injury duration, and a consciousness state.
Therefore, this optional embodiment determines the state recognition results corresponding to the corrected physiological data and the environmental sensing data respectively, using a wounded state recognition neural network matched to the data type of each piece of sensing data, which facilitates subsequent accurate comprehensive recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
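The per-type dispatch can be sketched as follows. The result fields come from the text above (injury type, injured part, injury duration, consciousness state); the routing table of callables standing in for the trained networks is an illustrative assumption.

```python
from typing import Any, Callable, Dict, NamedTuple

class StateResult(NamedTuple):
    injury_type: str
    injured_part: str
    injury_duration_min: float
    consciousness: str

def recognize_state(datum: Any, data_type: str,
                    networks: Dict[str, Callable[[Any], StateResult]]) -> StateResult:
    # Dispatch the sensing datum to the trained wounded-state recognition
    # network registered for its data type.
    return networks[data_type](datum)
```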
As an alternative embodiment, in the above steps, the wounded state recognition neural network is a first network when the sensing data is corrected physiological data, and is a second network when the sensing data is environmental sensing data, wherein:
the first network comprises a data corresponding part identification network and a part state identification network, wherein the data corresponding part identification network is used for identifying a human body part corresponding to the corrected physiological data;
The second network comprises a wounded part identification network and a wounded state identification network, wherein the wounded part identification network is used for identifying a data part related to a wounded to be identified in the environment sensing data, and the wounded state identification network is used for predicting the data part to obtain a corresponding state identification result.
Therefore, this optional embodiment clarifies the network architecture and functions of the state recognition neural networks corresponding to different data types, so that the corresponding state recognition results are accurately predicted, facilitating subsequent accurate comprehensive recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an optional embodiment, in the step, determining the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data includes:
calculating the intersection of the state recognition results corresponding to all corrected physiological data to obtain a first reference state recognition result;
for the state recognition result corresponding to each piece of environmental sensing data, calculating the average similarity between that state recognition result and the state recognition results corresponding to all other environmental sensing data, to obtain a similarity parameter of that state recognition result;
Screening state recognition results with similarity parameters larger than a parameter threshold from state recognition results corresponding to all environmental sensing data to obtain a plurality of preferred state recognition results;
calculating the intersection of all the preferred state recognition results to obtain a second reference state recognition result;
and calculating the intersection of the first reference state recognition result and the second reference state recognition result to obtain the wounded state of the wounded to be identified.
Therefore, according to this optional embodiment, the first reference state recognition result is obtained as the intersection of the state recognition results corresponding to all corrected physiological data; because predictions based on physiological data are more accurate, their intersection can serve directly as a reference state. Predictions based on environmental data are less accurate, so the state recognition results of the environmental sensing data are first screened by similarity, and the intersection of the screened results yields the second reference state recognition result. The intersection of the two reference results then gives a more accurate wounded state of the wounded to be identified. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
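The fusion steps above can be sketched as follows, assuming each state recognition result is a set of state labels and that the similarity function is supplied by the caller (for example, Jaccard similarity); both shapes are illustrative assumptions rather than the disclosed implementation.

```python
def fuse_results(physio_results, env_results, sim, threshold):
    # First reference: intersection of all corrected-physiology results.
    first = set.intersection(*physio_results)
    # Keep only environmental results whose mean similarity to the other
    # environmental results exceeds the parameter threshold.
    preferred = []
    for i, r in enumerate(env_results):
        others = env_results[:i] + env_results[i + 1:]
        mean_sim = sum(sim(r, o) for o in others) / len(others) if others else 1.0
        if mean_sim > threshold:
            preferred.append(r)
    # Second reference: intersection of the preferred results.
    second = set.intersection(*preferred) if preferred else set()
    # Final casualty state: intersection of the two references.
    return first & second
```

An outlier environmental result (low similarity to the rest) is dropped before the second intersection, so it cannot veto labels the consistent results agree on.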
Embodiment Two
Referring to fig. 2, fig. 2 is a schematic structural diagram of a battlefield wounded state recognition system based on multi-dimensional data according to an embodiment of the present invention. The battlefield wounded state recognition system based on multi-dimensional data described in fig. 2 can be applied to a data processing system, a data processing device, or a data processing server (where the server may be a local processing server or a cloud processing server). As shown in fig. 2, the battlefield wounded state recognition system based on multi-dimensional data may include:
An acquisition module 201 is configured to acquire physiological sensing data and multidimensional environmental sensing data of a wounded to be identified.
The correction module 202 is configured to correct the physiological sensing data according to the at least one environmental sensing data, so as to obtain corrected physiological data.
The recognition module 203 is configured to determine a state recognition result corresponding to the corrected physiological data and the environmental sensing data, respectively, based on a state recognition algorithm.
The determining module 204 is configured to determine a wounded state of the wounded to be identified according to the state identification results corresponding to the corrected physiological data and the environmental sensing data, respectively.
Therefore, the embodiment of the invention corrects the physiological sensing data according to at least one piece of environmental sensing data to obtain accurate corrected physiological data, then determines, based on the state recognition algorithm, the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data, and comprehensively determines from these results the wounded state of the wounded to be identified. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an alternative embodiment, the physiological sensing data includes at least one of blood pressure data, body temperature data, blood glucose data, speed data, acceleration data, respiration data, and pulse data.
Therefore, this optional embodiment limits the types of the physiological sensing data so that the physiology-related characteristics of the wounded are comprehensively represented, which facilitates subsequent accurate recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an alternative embodiment, the environmental sensing data includes at least one of an environmental image, environmental sound, environmental light reflection data, environmental temperature, and environmental humidity.
Therefore, this optional embodiment limits the types of the environmental sensing data so that the relevant characteristics of the rescue environment are comprehensively represented, which facilitates accurate correction of the physiological data and accurate recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an alternative embodiment, a specific way in which the correction module 202 corrects the physiological sensing data according to the at least one piece of environmental sensing data to obtain the corrected physiological data includes:
For each physiological sensing data, determining at least one relevant environment sensing type corresponding to the sensing data type of the physiological sensing data according to a preset data type relevant rule;
Screening out relevant environment sensing data corresponding to the relevant environment sensing type from all the environment sensing data;
determining reference physiological prediction data based on all relevant environmental sensing data and a preset prediction neural network;
and calculating the average value of the reference physiological prediction data and the physiological sensing data to obtain corrected physiological data corresponding to the physiological sensing data.
Therefore, this optional embodiment determines the reference physiological prediction data based on the environmental sensing data related to the data type of each physiological sensing data, and corrects the physiological sensing data by averaging it with that reference, obtaining more accurate corrected physiological data for subsequent recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an alternative embodiment, a specific way in which the correction module 202 determines the reference physiological prediction data based on all the relevant environmental sensing data and the preset prediction neural network includes:
Inputting each piece of relevant environmental sensing data into a physiology-related prediction neural network corresponding to its sensing data type, to obtain physiological prediction data corresponding to that piece of relevant environmental sensing data. Optionally, the physiology-related prediction neural network is trained on a training data set comprising a plurality of pieces of training relevant environmental sensing data and training physiological data labels of the corresponding sensing data types.
Calculating a weighted average of the physiological prediction data corresponding to all the relevant environmental sensing data to obtain the reference physiological prediction data. Optionally, the weight of the physiological prediction data corresponding to each piece of relevant environmental sensing data comprises a first weight and a second weight, wherein the first weight is proportional to the proportion of training data of the same data type as that piece of relevant environmental sensing data in the training relevant environmental sensing data, and the second weight is proportional to the proportion of that piece of relevant environmental sensing data among all the relevant environmental sensing data.
Therefore, this optional embodiment predicts the physiological data corresponding to each piece of relevant environmental sensing data with a prediction neural network trained on a training data set of the corresponding data type, and computes a weighted average of the multiple predictions using weights related to the data proportions, yielding more accurate reference physiological prediction data for correcting the physiological data and recognizing the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an optional embodiment, a specific way in which the recognition module 203 determines, based on the state recognition algorithm, the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data includes:
For each piece of sensing data, determining the data type corresponding to the sensing data; optionally, the sensing data is corrected physiological data or environmental sensing data;
Inputting the sensing data into a trained wounded state recognition neural network corresponding to its data type to obtain a state recognition result corresponding to the sensing data. Optionally, the wounded state recognition neural network is trained on a training data set comprising training sensing data of a plurality of data types and corresponding wounded state labels, and the state recognition result comprises an injury type, an injured part, an injury duration, and a consciousness state.
Therefore, this optional embodiment determines the state recognition results corresponding to the corrected physiological data and the environmental sensing data respectively, using a wounded state recognition neural network matched to the data type of each piece of sensing data, which facilitates subsequent accurate comprehensive recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an alternative embodiment, the wounded state recognition neural network is a first network when the sensed data is corrected physiological data, and a second network when the sensed data is ambient sensed data, wherein:
the first network comprises a data corresponding part identification network and a part state identification network, wherein the data corresponding part identification network is used for identifying a human body part corresponding to the corrected physiological data;
The second network comprises a wounded part identification network and a wounded state identification network, wherein the wounded part identification network is used for identifying a data part related to a wounded to be identified in the environment sensing data, and the wounded state identification network is used for predicting the data part to obtain a corresponding state identification result.
Therefore, this optional embodiment clarifies the network architecture and functions of the state recognition neural networks corresponding to different data types, so that the corresponding state recognition results are accurately predicted, facilitating subsequent accurate comprehensive recognition of the wounded state. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
As an optional embodiment, a specific way in which the determining module 204 determines the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data includes:
calculating the intersection of the state recognition results corresponding to all corrected physiological data to obtain a first reference state recognition result;
for the state recognition result corresponding to each piece of environmental sensing data, calculating the average similarity between that state recognition result and the state recognition results corresponding to all other environmental sensing data, to obtain a similarity parameter of that state recognition result;
Screening state recognition results with similarity parameters larger than a parameter threshold from state recognition results corresponding to all environmental sensing data to obtain a plurality of preferred state recognition results;
calculating the intersection of all the preferred state recognition results to obtain a second reference state recognition result;
and calculating the intersection of the first reference state recognition result and the second reference state recognition result to obtain the wounded state of the wounded to be identified.
Therefore, according to this optional embodiment, the first reference state recognition result is obtained as the intersection of the state recognition results corresponding to all corrected physiological data; because predictions based on physiological data are more accurate, their intersection can serve directly as a reference state. Predictions based on environmental data are less accurate, so the state recognition results of the environmental sensing data are first screened by similarity, and the intersection of the screened results yields the second reference state recognition result. The intersection of the two reference results then gives a more accurate wounded state of the wounded to be identified. By fully combining multi-dimensional data with comprehensive recognition, more accurate, efficient, intelligent, and automatic wounded state recognition is achieved, improving rescue effect and efficiency.
Embodiment Three
Referring to fig. 3, fig. 3 is a schematic diagram illustrating a battlefield wounded state recognition system based on multi-dimensional data according to an embodiment of the present invention. The battlefield wounded state recognition system based on multi-dimensional data depicted in fig. 3 is applied to a data processing system, a data processing device, or a data processing server (where the server may be a local processing server or a cloud processing server). As shown in fig. 3, the battlefield wounded state recognition system based on multi-dimensional data may include:
a memory 301 storing executable program code;
A processor 302 coupled with the memory 301;
Wherein the processor 302 invokes the executable program code stored in the memory 301 to perform the steps of the battlefield wounded state recognition method based on multi-dimensional data described in Embodiment One.
Embodiment Four
The embodiment of the invention discloses a computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to execute the steps of the battlefield wounded state recognition method based on multi-dimensional data described in Embodiment One.
Embodiment Five
The embodiment of the invention discloses a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to execute the steps of the battlefield wounded state recognition method based on multi-dimensional data described in Embodiment One.
The foregoing describes certain embodiments of the present disclosure, other embodiments being within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. Furthermore, the processes depicted in the accompanying drawings do not necessarily have to be in the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that the present description may be provided as a method, system, or computer program product. Accordingly, the present specification embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description embodiments may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase-change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments.
Finally, it should be noted that the method and system for recognizing the state of a battlefield wounded based on multi-dimensional data disclosed above are merely preferred embodiments of the present invention, used only to illustrate the technical scheme of the present invention rather than to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical schemes described in the foregoing embodiments may still be modified, or some of the technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the technical schemes of the embodiments of the present invention.

Claims (10)

1. A method for identifying a battlefield wounded state based on multi-dimensional data, the method comprising:
acquiring physiological sensing data and multidimensional environmental sensing data of a wounded to be identified;
correcting the physiological sensing data according to at least one environmental sensing data to obtain corrected physiological data;
based on a state recognition algorithm, determining state recognition results respectively corresponding to the corrected physiological data and the environment sensing data;
and determining the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data.
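Purely as an illustrative sketch (not part of the claims), the four steps of claim 1 can be read as the following pipeline. All function and variable names here are assumptions, and `correct`, `recognize`, and `fuse` are stand-ins for the claimed correction step, state recognition algorithm, and result fusion, respectively.

```python
def identify_wounded_state(physio_readings, env_readings,
                           correct, recognize, fuse):
    """physio_readings / env_readings: dicts of sensor type -> value.
    correct, recognize, fuse: callables standing in for the claimed
    correction step, state recognition algorithm, and result fusion."""
    # Step 2: correct each physiological reading using the environmental data.
    corrected = {k: correct(v, env_readings)
                 for k, v in physio_readings.items()}
    # Step 3: run state recognition separately on the corrected
    # physiological data and on each environmental sensing data.
    results = [recognize(v) for v in corrected.values()]
    results += [recognize(v) for v in env_readings.values()]
    # Step 4: fuse the per-sensor results into one wounded state.
    return fuse(results)
```

The later claims fill in what `correct` (claim 4), `recognize` (claims 6-7), and `fuse` (claim 8) might look like.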
2. The method for recognizing the state of a battlefield wounded based on multi-dimensional data according to claim 1, wherein the physiological sensing data comprises at least one of blood pressure data, body temperature data, blood glucose data, speed data, acceleration data, respiration data, and pulse data.
3. The method of claim 1, wherein the environmental sensing data comprises at least one of an environmental image, an environmental sound, environmental light reflection data, an environmental temperature, and an environmental humidity.
4. The method of claim 1, wherein correcting the physiological sensing data according to at least one environmental sensing data to obtain corrected physiological data comprises:
for each physiological sensing data, determining at least one relevant environmental sensing type corresponding to the sensing data type of the physiological sensing data according to a preset data type relevance rule;
screening out the relevant environmental sensing data corresponding to the relevant environmental sensing type from all the environmental sensing data;
determining reference physiological prediction data based on all the relevant environmental sensing data and a preset prediction neural network;
and calculating the average value of the reference physiological prediction data and the physiological sensing data to obtain the corrected physiological data corresponding to the physiological sensing data.
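As an illustrative sketch only (not part of the claims), the screening and averaging steps of claim 4 could look as follows. The `RELEVANCE` table, the sensor type names, and the `predict` callable (standing in for the preset prediction neural network) are all hypothetical.

```python
# Hypothetical preset data type relevance rule: which environmental
# sensing types are considered relevant to each physiological type.
RELEVANCE = {
    "body_temperature": {"ambient_temperature", "ambient_humidity"},
}

def correct_reading(sensor_type, reading, env_data, predict):
    """Correct one physiological reading per claim 4's steps."""
    # Screen out the environmental readings relevant to this sensor type.
    relevant = {t: v for t, v in env_data.items()
                if t in RELEVANCE.get(sensor_type, set())}
    # predict() stands in for the preset prediction neural network that
    # maps relevant environmental data to a reference physiological value.
    reference = predict(relevant)
    # Corrected value = plain average of reference prediction and reading.
    return (reference + reading) / 2.0
```

Note the averaging pulls each raw reading halfway toward the environment-derived reference, which is one way to damp sensor noise caused by battlefield conditions.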
5. The method of claim 4, wherein determining the reference physiological prediction data based on all the relevant environmental sensing data and a preset prediction neural network comprises:
inputting each relevant environmental sensing data into the physiological relevance prediction neural network corresponding to its sensing data type to obtain the physiological prediction data corresponding to each relevant environmental sensing data, wherein the physiological relevance prediction neural network is trained on a training data set comprising a plurality of training relevant environmental sensing data and training physiological data labels of the corresponding sensing data type;
and calculating a weighted average of the physiological prediction data corresponding to all the relevant environmental sensing data to obtain the reference physiological prediction data, wherein the weight of the physiological prediction data corresponding to each relevant environmental sensing data comprises a first weight and a second weight, the first weight being proportional to the proportion, in the training data set, of training relevant environmental sensing data of the same data type as the relevant environmental sensing data, and the second weight being proportional to the proportion of the relevant environmental sensing data in the total data amount of all the relevant environmental sensing data.
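As an illustrative reading only (not part of the claims), the weighted averaging of claim 5 could be sketched as below. The claim does not specify how the first and second weights are combined; this sketch multiplies them and normalizes, which is one plausible interpretation, and all names are assumptions.

```python
def reference_prediction(predictions, training_type_share, data_amounts):
    """predictions: {env_type: physiological value predicted from that
        environmental sensing data}
    training_type_share: {env_type: share of same-type samples in the
        training set} -- the first weight is proportional to this share
    data_amounts: {env_type: data amount from this sensor} -- the second
        weight is proportional to this amount over the total"""
    total = sum(data_amounts.values())
    # Combine the two claimed weights (combination rule is an assumption).
    weights = {t: training_type_share[t] * (data_amounts[t] / total)
               for t in predictions}
    norm = sum(weights.values())
    # Normalized weighted average of the per-sensor predictions.
    return sum(predictions[t] * w for t, w in weights.items()) / norm
```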
6. The method for recognizing the state of a battlefield wounded based on multi-dimensional data according to claim 1, wherein determining the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data based on a state recognition algorithm comprises:
for each sensing data, determining the data type corresponding to the sensing data, the sensing data being the corrected physiological data or the environmental sensing data;
and inputting the sensing data into a trained wounded state recognition neural network corresponding to the data type to obtain the state recognition result corresponding to the sensing data, wherein the wounded state recognition neural network is trained on a training data set comprising a plurality of training sensing data of the corresponding data type and corresponding wounded state labels, and the state recognition result comprises an injury type, an injured part, an injury duration, and a consciousness state.
7. The method of claim 6, wherein the wounded state recognition neural network is a first network when the sensing data is the corrected physiological data, and a second network when the sensing data is the environmental sensing data, wherein:
the first network comprises a data-corresponding-part recognition network and a part state recognition network, the data-corresponding-part recognition network being used for identifying the human body part corresponding to the corrected physiological data, and the part state recognition network, trained on a training data set related to that human body part, being used for predicting from the corrected physiological data to obtain the corresponding state recognition result;
and the second network comprises a wounded part recognition network and a wounded state recognition network, the wounded part recognition network being used for identifying the data portion related to the wounded to be identified in the environmental sensing data, and the wounded state recognition network being used for predicting from that data portion to obtain the corresponding state recognition result.
8. The method for recognizing the state of a battlefield wounded based on multi-dimensional data according to claim 1, wherein determining the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data comprises:
calculating the intersection of the state recognition results corresponding to all the corrected physiological data to obtain a first reference state recognition result;
for the state recognition result corresponding to each environmental sensing data, calculating the average similarity between that state recognition result and the state recognition results corresponding to each of the other environmental sensing data to obtain a similarity parameter for that state recognition result;
screening out the state recognition results whose similarity parameters are greater than a parameter threshold from the state recognition results corresponding to all the environmental sensing data to obtain a plurality of preferred state recognition results;
calculating the intersection of all the preferred state recognition results to obtain a second reference state recognition result;
and calculating the intersection of the first reference state recognition result and the second reference state recognition result to obtain the wounded state of the wounded to be identified.
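As an illustrative sketch only (not part of the claims), the fusion of claim 8 could be read as below, treating each state recognition result as a set of state labels and using Jaccard similarity. Both of those choices are assumptions; the claim fixes neither the result representation nor the similarity measure.

```python
def jaccard(a, b):
    """Assumed similarity measure between two label sets."""
    return len(a & b) / len(a | b) if a | b else 1.0

def fuse_results(physio_results, env_results, threshold=0.5):
    """physio_results / env_results: lists of sets of state labels."""
    # First reference result: intersection over all corrected-physio results.
    first = set.intersection(*physio_results)
    # Similarity parameter: mean similarity of each environmental result
    # to the results of all other environmental sensing data.
    preferred = []
    for i, r in enumerate(env_results):
        others = env_results[:i] + env_results[i + 1:]
        sim = sum(jaccard(r, o) for o in others) / len(others)
        if sim > threshold:          # keep only the preferred results
            preferred.append(r)
    # Second reference result: intersection over the preferred results.
    second = set.intersection(*preferred)
    # Final wounded state: intersection of the two reference results.
    return first & second
```

The similarity screening discards outlier environmental results (e.g. from an occluded camera) before intersecting, so one bad sensor does not empty the final state.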
9. A battlefield wounded state recognition system based on multi-dimensional data, the system comprising:
an acquisition module for acquiring physiological sensing data and multidimensional environmental sensing data of the wounded to be identified;
a correction module for correcting the physiological sensing data according to at least one environmental sensing data to obtain corrected physiological data;
a recognition module for determining, based on a state recognition algorithm, state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data;
and a determining module for determining the wounded state of the wounded to be identified according to the state recognition results respectively corresponding to the corrected physiological data and the environmental sensing data.
10. A battlefield wounded state recognition system based on multi-dimensional data, the system comprising:
a memory storing executable program code;
a processor coupled to the memory;
wherein the processor invokes the executable program code stored in the memory to perform the method for identifying a battlefield wounded state based on multi-dimensional data of any one of claims 1-8.
CN202510251961.8A 2025-03-05 2025-03-05 Method and system for identifying battlefield casualty status based on multi-dimensional data Active CN120086654B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510251961.8A CN120086654B (en) 2025-03-05 2025-03-05 Method and system for identifying battlefield casualty status based on multi-dimensional data

Publications (2)

Publication Number Publication Date
CN120086654A true CN120086654A (en) 2025-06-03
CN120086654B CN120086654B (en) 2025-09-05

Family

ID=95856975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510251961.8A Active CN120086654B (en) 2025-03-05 2025-03-05 Method and system for identifying battlefield casualty status based on multi-dimensional data

Country Status (1)

Country Link
CN (1) CN120086654B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120087699A (en) * 2025-03-05 2025-06-03 广东云曌医疗科技有限公司 Method and system for determining rescue strategy for multiple injured persons based on dynamic programming

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110378394A (en) * 2019-06-26 2019-10-25 广西大学 Neural-network-based multi-physiological-data fusion analysis method
JP2019213888A (en) * 2019-08-05 2019-12-19 パラマウントベッド株式会社 Watching system
CN112504461A (en) * 2020-09-21 2021-03-16 江苏超数信息科技有限公司 Intelligent temperature measurement system and temperature measurement method
CN114052670A (en) * 2021-11-17 2022-02-18 深圳市盛景基因生物科技有限公司 Health risk assessment method and disease early warning system
US20240415446A1 (en) * 2021-11-30 2024-12-19 Streamlined Forensic Reporting Limited System for wound analysis
CN119344702A (en) * 2024-12-23 2025-01-24 吉林大学第一医院 Intelligent monitoring system and method for patient status

Also Published As

Publication number Publication date
CN120086654B (en) 2025-09-05

Similar Documents

Publication Publication Date Title
CN120086654B (en) Method and system for identifying battlefield casualty status based on multi-dimensional data
CN108920654B (en) Question and answer text semantic matching method and device
CN112055878B (en) Adjusting a machine learning model based on the second set of training data
CN109034371B (en) Deep learning model reasoning period acceleration method, device and system
WO2021031817A1 (en) Emotion recognition method and device, computer device, and storage medium
CN109902588B (en) Gesture recognition method and device and computer readable storage medium
CN108937866B (en) Sleep state monitoring method and device
CN113361381B (en) Human body key point detection model training method, detection method and device
CN116611006A (en) Fault identification method and device of electric kettle based on user feedback
CN118965120B (en) Vehicle defect misjudgment and identification method and system based on model rules
CN118536018B (en) Intelligent ring data processing method and system based on multi-sensor data
CN118394280B (en) Sea chart data resource management method and system based on data verification
CN115578591A (en) Plant pot changing detection method, device, equipment and storage medium
CN118368606A (en) Finger ring communication method and device based on history record
CN111079560A (en) Tumble monitoring method and device and terminal equipment
CN113408692A (en) Network structure searching method, device, equipment and storage medium
CN118570122B (en) Method and system for identifying nail fungus abnormalities based on multiple models
CN120087699A (en) Method and system for determining rescue strategy for multiple injured persons based on dynamic programming
CN119650074A (en) Pet health monitoring method and system based on prediction model
CN119691625A (en) Pet feeding control method and system based on gesture recognition
CN119920473B (en) Wounded Positioning Method and System Based on Neural Network Prediction
CN119066353B (en) Data center data processing method and system for energy-saving control
CN119917880B (en) Wounded sensing data transmission method and system based on clustering grouping
CN112767350A (en) Method, device, equipment and storage medium for predicting maximum interval of thromboelastogram
CN118789825B (en) Nozzle flow data processing method and system based on guide rail motion comparison

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant