Linking research of biomedical datasets

Brief Bioinform. 2022 Nov 19;23(6):bbac373. doi: 10.1093/bib/bbac373.

Abstract

Biomedical data preprocessing and efficient computing can be as important as the statistical methods used to fit the data; data processing must consider application scenarios, data acquisition, and individual rights and interests. We review common principles, knowledge and methods of integrated research according to a whole-pipeline processing mechanism that is diverse, coherent, sharing, auditable and ecological (DCSAE). First, neuromorphic and native algorithms integrate diverse datasets, providing linear scalability and high visualization. Second, we summarize how preprocessing, analysis and transaction methods are chosen, from raw data to neuromorphic representations, on node and coordinator platforms. Third, the combination of node, network, cloud, edge, swarm and graph builds an ecosystem for integrated cohort research and for clinical diagnosis and treatment. Looking forward, it is vital to combine deep computing, mass data storage and massively parallel communication.
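The swarm learning mentioned above lets hospital nodes train jointly without pooling raw patient data. The review does not specify an implementation, so the following is only a minimal illustrative sketch of the core idea, parameter averaging among peer nodes, with hypothetical function names and toy scalar data standing in for real cohort records:

```python
from statistics import fmean

def local_step(w, data, lr=0.1):
    # One gradient step minimizing mean squared error on a node's
    # local data; raw records never leave the node.
    grad = fmean(2 * (w - x) for x in data)
    return w - lr * grad

def swarm_round(weights, node_data):
    # Each node updates its parameter locally, then the peers
    # exchange and average parameters only (no central data pool).
    updated = [local_step(w, d) for w, d in zip(weights, node_data)]
    merged = fmean(updated)
    return [merged] * len(updated)

# Toy cohorts held by three separate nodes (hypothetical values).
node_data = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
weights = [0.0, 0.0, 0.0]
for _ in range(200):
    weights = swarm_round(weights, node_data)
# The shared parameter converges near the global mean (3.5)
# even though no node ever saw another node's raw data.
```

In a real swarm-learning deployment the averaging step would run over a permissioned peer-to-peer network with auditable membership, which is what allows the "sharing" and "auditable" properties of the DCSAE pipeline to coexist.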

Keywords: BDIIT (biology, data, information and intelligence technologies) fusion; DCSAE (diverse, coherent, sharing, auditable and ecological); co-normalization; neuromorphic graph computing; swarm learning.

Publication types

  • Review
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Ecosystem*
  • Humans
  • Knowledge