Browsing by Author "Wong, Kenneth H."
- Artificial Intelligence for the Future Radiology Diagnostic Service
  Mun, Seong K.; Wong, Kenneth H.; Lo, Shih-Chung B.; Li, Yanni; Bayarsaikhan, Shijir (Frontiers Media, 2021-01-28)
  Radiology historically has been a leader of digital transformation in healthcare. The introduction of digital imaging systems, picture archiving and communication systems (PACS), and teleradiology transformed radiology services over the past 30 years. Radiology is again at a crossroads for the next generation of transformation, possibly evolving into a one-stop integrated diagnostic service. Artificial intelligence and machine learning promise to offer radiology powerful new digital tools to facilitate the next transformation. The radiology community has been developing computer-aided diagnosis (CAD) tools based on machine learning (ML) over the past 20 years. Among various AI techniques, deep-learning convolutional neural networks (CNN) and their variants have been widely used in medical image pattern recognition. Since the 1990s, many CAD tools and products have been developed. However, clinical adoption has been slow due to a lack of substantial clinical advantages, difficulties integrating into existing workflows, and uncertain business models. This paper proposes three pathways for AI's role in radiology beyond current CNN-based capabilities: 1) improve the performance of CAD, 2) improve the productivity of radiology services through AI-assisted workflow, and 3) develop radiomics that integrates data from radiology, pathology, and genomics to facilitate the emergence of a new integrated diagnostic service.
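As context for the CNN-based CAD tools the abstract mentions, here is a minimal sketch of a convolutional image classifier; the architecture, input size, and PyTorch framing are illustrative assumptions, not the paper's actual model.

```python
import torch
import torch.nn as nn

class TinyCAD(nn.Module):
    """Minimal CNN classifier of the kind used in CAD research (illustrative only)."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)  # assumes 64x64 inputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One forward pass over a batch of 8 single-channel 64x64 "scans"
logits = TinyCAD()(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 2])
```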
- Blockchain-enabled Secure and Trusted Personalized Health Record
  Dong, Yibin (Virginia Tech, 2022-12-20)
  A longitudinal personalized electronic health record (LPHR) provides a holistic view of health records for individuals and offers a consistent patient-controlled information system for managing the health care of patients. Except for patients in the Veterans Affairs health care service, however, no LPHR is available for the general population in the U.S. that can integrate a patient's existing electronic health records throughout the life of care. This gap may be attributed mainly to the fact that existing patients' electronic health records are scattered across multiple health care facilities and often not shared due to privacy and security concerns from both patients and health care organizations. The main objective of this dissertation is to address these roadblocks by designing a scalable and interoperable LPHR with patient-controlled and mutually-trusted security and privacy. Privacy and security are complex problems. Specifically, without a set of access control policies, encryption alone cannot secure patient data due to insider threats. Moreover, in a distributed system like LPHR, a so-called race condition occurs when access control policies are centralized while decision-making processes are localized. We propose a formal definition of a secure LPHR and develop a blockchain-enabled next generation access control (BeNGAC) model. The BeNGAC solution focuses on patient-managed secure authorization for access; NGAC operates in open access environments where users can be centrally known or unknown. We also propose a permissioned blockchain technology, Hyperledger Fabric (HF), to ease the race-condition shortcoming in NGAC, which in return enhances the weak confidentiality protection in HF. Built upon BeNGAC, we further design a blockchain-enabled secure and trusted (BEST) LPHR prototype in which data are stored in a distributed yet decentralized database. The unique feature of the proposed BEST-LPHR is the use of blockchain smart contracts allowing BeNGAC policies to govern security, privacy, confidentiality, data integrity, scalability, sharing, and auditability. Interoperability is achieved by using a health care data exchange standard called Fast Healthcare Interoperability Resources (FHIR). We demonstrated the feasibility of the BEST-LPHR design through use case studies. Specifically, a small-scale BEST-LPHR sharing platform was built for a patient and health care organizations. In the study setting, patients raised additional ethical concerns related to consent and granular control of the LPHR. We engineered a Web-delivered BEST-LPHR sharing platform with patient-controlled consent granularity, security, and privacy realized by BeNGAC. Health care organizations holding the patient's electronic health records (EHRs) can join the platform based on validation by the patient. Mutual trust is established through a rigorous validation process by both the patient and the built-in HF consensus mechanism. We measured system scalability and showed millisecond-range performance of LPHR permission changes. In this dissertation, we report the BEST-LPHR solution to electronically sharing and managing patients' electronic health records from multiple organizations, focusing on privacy and security concerns.
  While the proposed BEST-LPHR solution cannot, expectedly, address all problems in LPHR, this prototype aims to increase the EHR adoption rate and reduce LPHR implementation roadblocks. In the long run, the BEST-LPHR will contribute to improving health care efficiency and the quality of life for many patients.
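To make the patient-controlled, policy-based access decisions described above more concrete, here is a minimal attribute-based check in the spirit of NGAC. All names and the policy structure are hypothetical; the dissertation's actual BeNGAC policies run as Hyperledger Fabric smart contracts, not in plain Python.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Grant:
    """One patient-issued permission: who may do what to which records."""
    user_attr: str      # e.g., "cardiology-clinic" (hypothetical attribute)
    operation: str      # e.g., "read"
    record_attr: str    # e.g., "imaging-reports"

class PatientPolicy:
    """Hypothetical NGAC-style policy store controlled by a single patient."""
    def __init__(self):
        self.grants: set[Grant] = set()

    def grant(self, user_attr, operation, record_attr):
        self.grants.add(Grant(user_attr, operation, record_attr))

    def revoke(self, user_attr, operation, record_attr):
        self.grants.discard(Grant(user_attr, operation, record_attr))

    def is_allowed(self, user_attrs, operation, record_attrs) -> bool:
        # Deny by default; allow only if some (user attr, op, record attr)
        # triple has been explicitly granted by the patient.
        return any(
            Grant(u, operation, r) in self.grants
            for u in user_attrs
            for r in record_attrs
        )

policy = PatientPolicy()
policy.grant("cardiology-clinic", "read", "imaging-reports")
print(policy.is_allowed({"cardiology-clinic"}, "read", {"imaging-reports"}))   # True
print(policy.is_allowed({"cardiology-clinic"}, "write", {"imaging-reports"}))  # False
```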
- Brain Signal Quantification and Functional Unit Analysis in Fluorescent Imaging Data by Unsupervised Learning
  Mi, Xuelong (Virginia Tech, 2024-06-04)
  Optical recording of various brain signals is becoming an indispensable technique for biological studies, accelerated by the development of new or improved biosensors and microscopy technology. A major challenge in leveraging the technique is to identify and quantify the rich patterns embedded in the data. However, existing methods often struggle, either due to their limited signal analysis capabilities or poor performance. Here we present Activity Quantification and Analysis (AQuA2), an innovative analysis platform built upon machine learning theory. AQuA2 features a novel event detection pipeline for precise quantification of intricate brain signals and incorporates a Consensus Functional Unit (CFU) module to explore interactions among potential functional units driving repetitive signals. To enhance efficiency, we developed the BIdirectional pushing with Linear Component Operations (BILCO) algorithm to handle propagation analysis, a step that is time-consuming with traditional algorithms. Furthermore, for user-friendliness, AQuA2 is implemented as both a MATLAB package and a Fiji plugin, complete with a graphical interface. AQuA2's validation through both simulation and real-world applications demonstrates its superior performance compared to its peers. Applied across various sensors (calcium, NE, and ATP), cell types (astrocytes, oligodendrocytes, and neurons), animal models (zebrafish and mouse), and imaging modalities (two-photon, light sheet, and confocal), AQuA2 consistently delivers promising results and novel insights, showcasing its versatility in fluorescent imaging data analysis.
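As a rough illustration of what "event detection" means for a fluorescence recording, here is a minimal 1-D sketch that flags above-threshold runs in a dF/F trace. This is a stand-in under simple assumptions (percentile baseline, z-score threshold); AQuA2's actual pipeline is spatiotemporal and propagation-aware, and is implemented in MATLAB/Fiji rather than Python.

```python
import numpy as np

def detect_events(trace, fs=10.0, z_thresh=3.0, min_dur=0.5):
    """Return (start_s, end_s) intervals where the dF/F trace exceeds a z-score threshold."""
    f0 = np.percentile(trace, 20)          # crude baseline fluorescence estimate
    dff = (trace - f0) / f0                # dF/F
    z = (dff - dff.mean()) / dff.std()
    above = z > z_thresh
    # Pad with False so rises and falls always come in pairs.
    padded = np.concatenate(([False], above, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, ends = edges[::2], edges[1::2]
    min_len = int(min_dur * fs)            # discard events shorter than min_dur seconds
    return [(s / fs, e / fs) for s, e in zip(starts, ends) if e - s >= min_len]

rng = np.random.default_rng(0)
trace = 100 + rng.normal(0, 1, 600)
trace[200:240] += 30                       # synthetic 4 s event at fs = 10 Hz
print(detect_events(trace))                # ~[(20.0, 24.0)]
```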
- A Novel Methodology for Iterative Image Reconstruction in SPECT Using Deterministic Particle Transport
  Royston, Katherine (Virginia Tech, 2015-04-30)
  Single photon emission computed tomography (SPECT) is used in a variety of medical procedures, including myocardial perfusion, bone metabolism, and thyroid function studies. In SPECT, the emissions of a radionuclide within a patient are counted at a gamma camera to form a 2-dimensional projection of the 3-dimensional radionuclide distribution within the patient. This unknown 3-dimensional source distribution can be reconstructed from many 2-dimensional projections obtained at different angles around the patient. The reconstruction can be improved by properly modeling the physics in the patient, i.e., particle absorption and scattering. Currently, such modeling is done using statistical Monte Carlo methods, but deterministic codes have the potential to offer fast computation speeds while fully modeling particle interactions within the patient. Deterministic codes are not susceptible to statistical uncertainty, but have been overlooked for applications in nuclear medicine, most likely due to their own limitations, including discretization and large memory requirements. A novel deterministic reconstruction methodology for SPECT (DRS) has been developed to apply the advantages of deterministic algorithms to SPECT iterative image reconstruction. Using a maximum likelihood expectation maximization (ML-EM) algorithm, a deterministic code can fully model particle transport in the patient in the forward projection step, without the need for a large system matrix. The TITAN deterministic transport code has a SPECT formulation that allows for fast simulation of SPECT projection images and has been benchmarked in this dissertation through comparison with results from the SIMIND and MCNP5 Monte Carlo codes. The TITAN SPECT formulation has been improved through a modified collimator representation and full parallelization. The DRS methodology has been implemented in the TITAN code to create TITAN with Image Reconstruction (TITAN-IR). The TITAN-IR code has been used to successfully reconstruct the source distribution from SPECT data for the Jaszczak and NCAT phantoms. Extensive studies have been conducted to examine the sensitivity of TITAN-IR image quality to deterministic parameter selection as well as to collimator blur and noise in the projection data being reconstructed. TITAN-IR reconstruction has also been compared with other reconstruction algorithms. This novel image reconstruction methodology has been shown to reconstruct images in short computation times, demonstrating its potential in a clinical setting with further development.
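The ML-EM update named in this abstract is a standard iterative scheme, sketched below with a dense system matrix on a toy problem. Note the key difference from the dissertation: there, the forward projection is computed by the TITAN deterministic transport code rather than an explicit system matrix, which this toy code does not model.

```python
import numpy as np

def ml_em(A, y, n_iter=50):
    """Maximum likelihood expectation maximization for emission tomography.

    A : (m, n) system matrix mapping source voxels to detector bins
    y : (m,)  measured counts
    Classic multiplicative update: x <- x / (A^T 1) * A^T (y / (A x)).
    """
    m, n = A.shape
    x = np.ones(n)                     # flat initial source estimate
    sens = A.T @ np.ones(m)            # sensitivity (normalization) image
    for _ in range(n_iter):
        proj = A @ x                   # forward projection step
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Tiny synthetic check: recover a 2-voxel source from 3 detector bins.
A = np.array([[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]])
x_true = np.array([4.0, 1.0])
y = A @ x_true                         # noiseless projections
print(ml_em(A, y).round(2))            # ~[4.0, 1.0]
```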
- Novel Preprocessing and Normalization Methods for Analysis of GC/LC-MS Data
  Nezami Ranjbar, Mohammad Rasoul (Virginia Tech, 2015-06-02)
  We introduce new methods for preprocessing and normalization of data acquired by gas/liquid chromatography coupled with mass spectrometry (GC/LC-MS). Normalization is desired prior to subsequent statistical analysis to adjust for variabilities in ion intensities that are not caused by biological differences. There are different sources of experimental bias, including variabilities in sample collection, sample storage, poor experimental design, and noise. Also, instrument variability in experiments involving a large number of runs leads to a significant drift in intensity measurements. We propose new normalization methods based on bootstrapping, Gaussian process regression, non-negative matrix factorization (NMF), and Bayesian hierarchical models. These methods model the bias by borrowing information across runs and features. Another novel aspect is utilizing scan-level data to improve the accuracy of quantification. We evaluated the performance of our methods using simulated and experimental data. In comparison with several existing methods, the proposed methods yielded significant improvement. Gas chromatography coupled with mass spectrometry (GC-MS) is one of the technologies widely used for qualitative and quantitative analysis of small molecules. In particular, GC coupled to single quadrupole MS can be utilized for targeted analysis by selected ion monitoring (SIM). However, to our knowledge, there are no software tools specifically designed for analysis of GC-SIM-MS data. We introduce SIMAT, a new R package for quantitative analysis of the levels of targeted analytes. SIMAT provides guidance in choosing fragments for a list of targets. This is accomplished through an optimization algorithm that can select the most appropriate fragments from overlapping peaks based on a pre-specified library of background analytes. The tool also allows visualization of the total ion chromatogram (TIC) of runs and the extracted ion chromatogram (EIC) of analytes of interest. Moreover, retention index (RI) calibration can be performed, and raw GC-SIM-MS data can be imported in netCDF or NIST mass spectral library (MSL) formats. We evaluated the performance of SIMAT using several experimental data sets. Our results demonstrate that SIMAT performs better than AMDIS and MetaboliteDetector in terms of finding the correct targets in the acquired GC-SIM-MS data and estimating their relative levels.
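One of the normalization ideas named above, Gaussian process regression, can be sketched as smoothing a single feature's intensity drift over run order and dividing it out. The kernel choice, synthetic drift, and single-feature framing below are illustrative assumptions; the dissertation's actual models borrow information across runs and features, and SIMAT itself is an R package rather than Python.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
run_order = np.arange(60).reshape(-1, 1)
# Synthetic log-intensities of one feature: slow instrument drift plus noise.
log_i = 10.0 - 0.02 * run_order.ravel() + rng.normal(0, 0.1, 60)

# Smooth the drift across run order with a GP, then subtract the trend
# (on the log scale, subtraction corresponds to dividing out the drift).
gp = GaussianProcessRegressor(kernel=RBF(length_scale=20.0) + WhiteKernel(0.01))
gp.fit(run_order, log_i - log_i.mean())
drift = gp.predict(run_order) + log_i.mean()
normalized = log_i - drift + drift.mean()   # remove trend, keep overall level

print(np.polyfit(run_order.ravel(), normalized, 1)[0])  # residual slope ~0
```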
- Topic Model-based Mass Spectrometric Data Analysis in Cancer Biomarker Discovery Studies
  Wang, Minkun (Virginia Tech, 2017-06-14)
  Identification of disease-related alterations in molecular and cellular mechanisms may reveal useful biomarkers for human diseases, including cancers. High-throughput omic technologies for identifying and quantifying multi-level biological molecules (e.g., proteins, glycans, and metabolites) have facilitated advances in biological research in recent years. Liquid (or gas) chromatography coupled with mass spectrometry (LC/GC-MS) has become an essential tool in such large-scale omic studies. Appropriate LC/GC-MS data preprocessing pipelines are needed to detect true differences between biological groups. Challenges exist in several aspects of MS data analysis. Specifically for biomarker discovery, a fundamental challenge in the quantitation of biomolecules arises from the heterogeneous nature of human biospecimens. Although this issue has been a subject of discussion in cancer genomic studies, it has not yet been rigorously investigated in mass spectrometry based omic studies. Purification of mass spectrometric data is highly desired prior to subsequent differential analysis. In this dissertation, we mainly address the purification problem through probabilistic modeling. We propose an intensity-level purification model (IPM) to computationally purify LC/GC-MS based cancerous data in biomarker discovery studies. We further extend IPM to a scan-level purification model (SPM) by considering information from the extracted ion chromatogram (EIC, a scan-level feature). Both IPM and SPM belong to the category of topic modeling approaches, which aim to identify the underlying "topics" (sources) and their mixture proportions in composing the heterogeneous data. Additionally, a denoising deconvolution model (DDM) is proposed to capture the noise signals in samples based on purified profiles. Variational expectation-maximization (VEM) and Markov chain Monte Carlo (MCMC) methods are used to draw inference on the latent variables and estimate the model parameters. Beyond purification, other research topics related to mass spectrometric data analysis for cancer biomarker discovery are also investigated in this dissertation. Chapter 3 discusses methods developed for the differential analysis of LC/GC-MS based omic data, specifically for preprocessing LC-MS profiled glycan data. Chapter 4 presents the assumptions and inference details of IPM, SPM, and DDM. A latent Dirichlet allocation (LDA) core is used to model the heterogeneous cancerous data as mixtures of topics consisting of a sample-specific pure cancerous source and non-cancerous contaminants. We evaluated the capability of the proposed models in capturing mixture proportions of contaminants and cancer profiles on LC-MS based serum and tissue proteomic and GC-MS based tissue metabolomic datasets acquired from patients with hepatocellular carcinoma (HCC) and liver cirrhosis. Chapter 5 elaborates on these applications in cancer biomarker discovery, covering both typical single-omic analyses and integrative analysis of multi-omic studies.
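The LDA-core idea described in this abstract, recovering per-sample mixture proportions of a cancerous source and contaminants, can be illustrated with off-the-shelf LDA on synthetic count data. This is a minimal sketch under invented assumptions (two topics, multinomial counts); it is not IPM or SPM, which are purpose-built models with VEM/MCMC inference.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(2)
n_features = 200

# Two hypothetical source profiles ("topics"): pure cancer and contaminant.
cancer = rng.dirichlet(np.ones(n_features) * 0.1)
contam = rng.dirichlet(np.ones(n_features) * 0.1)

# Each heterogeneous sample mixes the two sources in unknown proportions.
true_purity = rng.uniform(0.3, 0.9, size=30)
counts = np.stack([
    rng.multinomial(5000, p * cancer + (1 - p) * contam) for p in true_purity
])

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)                  # per-sample topic proportions
theta /= theta.sum(axis=1, keepdims=True)
# Topic order is arbitrary, so one column tracks purity up to sign of correlation.
print(np.corrcoef(theta[:, 0], true_purity)[0, 1])  # near +1 or -1
```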