An Enhanced Visualization Method to Aid Behavioral Trajectory Pattern Recognition Infrastructure for Big Longitudinal Data
UMass Chan Affiliations
Department of Quantitative Health Sciences
Document Type
Journal Article
Publication Date
2018-06-01
Keywords
UMCCTS funding
Enhanced projection pursuit
Longitudinal data
Pattern recognition
Visualization
Computer Sciences
Library and Information Science
Translational Medical Research
Abstract
Big longitudinal data provide more reliable information for decision making and are common in many fields. Trajectory pattern recognition is urgently needed to discover important structures in such data, and developing a better, more computationally efficient visualization tool is crucial to guide this technique. This paper proposes an enhanced projection pursuit (EPP) method to better project and visualize the structures (e.g., clusters) of big high-dimensional (HD) longitudinal data on a lower-dimensional plane. Unlike classic PP methods potentially useful for longitudinal data, EPP is built upon nonlinear mapping algorithms to compute its stress (error) function, balancing paired weights for between- and within-structure stress while preserving the original structure membership in the high-dimensional space. Specifically, EPP solves an NP-hard optimization problem by integrating gradual optimization and nonlinear mapping algorithms, and it automates the search for an optimal number of iterations to display a stable structure across varying sample sizes and dimensions. Using public UCI datasets, real longitudinal clinical trial datasets, and simulations, EPP demonstrates better performance in visualizing big HD longitudinal data.
Source
IEEE Trans Big Data. 2018 Jun;4(2):289-298. doi: 10.1109/TBDATA.2017.2653815. Epub 2017 Jan 16.
DOI
10.1109/TBDATA.2017.2653815
Permanent Link to this Item
http://hdl.handle.net/20.500.14038/50314
PubMed ID
29888298