
    Unsupervised ensemble ranking of terms in electronic health record notes based on their importance to patients

    Authors
    Chen, Jinying
    Yu, Hong
    UMass Chan Affiliations
    Department of Quantitative Health Sciences
    Document Type
    Journal Article
    Publication Date
    2017-04-01
    Keywords
    Electronic health record
    Information extraction
    Natural language processing
    Unsupervised ensemble ranking
    Artificial Intelligence and Robotics
    Bioinformatics
    Computer Sciences
    Health Information Technology
    Health Services Administration
    Information Literacy
    
    Link to Full Text
    https://doi.org/10.1016/j.jbi.2017.02.016
    Abstract
    BACKGROUND: Allowing patients to access their own electronic health record (EHR) notes through online patient portals has the potential to improve patient-centered care. However, EHR notes contain abundant medical jargon that can be difficult for patients to comprehend. One way to help patients is to reduce information overload and help them focus on medical terms that matter most to them. Targeted education can then be developed to improve patient EHR comprehension and the quality of care.

    OBJECTIVE: The aim of this work was to develop FIT (Finding Important Terms for patients), an unsupervised natural language processing (NLP) system that ranks medical terms in EHR notes based on their importance to patients.

    METHODS: We built FIT on a new unsupervised ensemble ranking model derived from the biased random walk algorithm to combine heterogeneous information resources for ranking candidate terms from each EHR note. Specifically, FIT integrates four single views (rankers) for term importance: patient use of medical concepts, document-level term salience, word co-occurrence based term relatedness, and topic coherence. It also incorporates partial information of term importance as conveyed by terms' unfamiliarity levels and semantic types. We evaluated FIT on 90 expert-annotated EHR notes and used the four single-view rankers as baselines. In addition, we implemented three benchmark unsupervised ensemble ranking methods as strong baselines.

    RESULTS: FIT achieved 0.885 AUC-ROC for ranking candidate terms from EHR notes to identify important terms. When including term identification, the performance of FIT for identifying important terms from EHR notes was 0.813 AUC-ROC. Both performance scores significantly exceeded the corresponding scores from the four single rankers (P < 0.001). FIT also outperformed the three ensemble rankers for most metrics. Its performance is relatively insensitive to its parameter.

    CONCLUSIONS: FIT can automatically identify EHR terms important to patients. It may help develop future interventions to improve quality of care. By using unsupervised learning as well as a robust and flexible framework for information fusion, FIT can be readily applied to other domains and applications.
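
    The core idea in the METHODS paragraph, combining several single-view importance scores through a biased random walk, can be illustrated with a minimal personalized-PageRank-style sketch. This is not the authors' FIT implementation; the graph weights, fusion of the prior, and all names below are illustrative assumptions, with term relatedness as edge weights and the fused single-view scores as the walk's teleportation bias.

    ```python
    import numpy as np

    def biased_random_walk(adjacency, bias, damping=0.85, tol=1e-8, max_iter=200):
        """Rank nodes with a biased (personalized) random walk.

        adjacency : (n, n) nonnegative term-relatedness matrix
                    (e.g. word co-occurrence weights between candidate terms).
        bias      : (n,) nonnegative prior importance scores, e.g. a fusion
                    of several single-view rankers; the walker teleports to
                    terms in proportion to this vector.
        Returns a stationary score vector that sums to 1.
        """
        n = adjacency.shape[0]
        # Column-normalize the adjacency into a transition matrix;
        # dangling columns (all zeros) are left as uniform-safe divisors.
        col_sums = adjacency.sum(axis=0).astype(float)
        col_sums[col_sums == 0] = 1.0
        P = adjacency / col_sums
        # Normalize the prior into a teleportation distribution.
        b = bias / bias.sum()
        r = np.full(n, 1.0 / n)
        for _ in range(max_iter):
            r_next = damping * (P @ r) + (1 - damping) * b
            if np.abs(r_next - r).sum() < tol:
                return r_next
            r = r_next
        return r

    # Toy example: four candidate terms with symmetric relatedness weights
    # and a prior (hypothetical fused single-view scores) favoring term 2.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 1],
                  [1, 1, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    prior = np.array([0.1, 0.2, 0.6, 0.1])
    scores = biased_random_walk(A, prior)
    ranking = np.argsort(scores)[::-1]  # term indices, most important first
    ```

    The damping parameter trades off graph structure against the prior; the abstract's note that FIT is "relatively insensitive to its parameter" suggests results were stable across such settings, though the actual model and its parameterization are more involved than this sketch.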
    Source
    J Biomed Inform. 2017 Apr;68:121-131. doi: 10.1016/j.jbi.2017.02.016. Epub 2017 Mar 4.
    DOI
    10.1016/j.jbi.2017.02.016
    Permanent Link to this Item
    http://hdl.handle.net/20.500.14038/29181
    PubMed ID
    28267590
    Related Resources

    Link to Article in PubMed

    Collections
    UMass Chan Faculty and Researcher Publications

