    An investigation of the impacts of different generalizability study designs on estimates of variance components and generalizability coefficients

    Authors: Keller, L. A.; Mazor, Kathleen M.; Swaminathan, H.; Pugnaire, Michele P.
    UMass Chan Affiliations: Meyers Primary Care Institute; Department of Family Medicine and Community Health; Office of Medical Education
    Document Type: Journal Article
    Publication Date: 2000-10-01
    Keywords: Analysis of Variance; *Clinical Competence; Computer Simulation; Confidence Intervals; Data Collection; Educational Measurement; Humans; Students, Medical; Life Sciences; Medicine and Health Sciences; Women's Studies
    
    Link to Full Text: http://journals.lww.com/academicmedicine/Fulltext/2000/10001/An_Investigation_of_the_Impacts_of_Different.7.aspx
    Abstract
    In recent years, performance assessments have become increasingly popular in medical education. While the term “performance assessment” can be applied to many different types of assessments, in medical education this term usually refers to some sort of simulated patient encounter, such as an objective structured clinical examination (OSCE) or a computer simulation of an encounter. These types of assessments appeal to many educators because the tasks or items used are often seen as more realistic than items on multiple-choice examinations. However, this increased “realism” or apparent authenticity comes at a cost—performance examinations are typically more time-consuming and expensive both to administer and to score. On an OSCE, each encounter with a standardized patient is typically scored as a single item, often resulting in an examinee's completing only four to eight items in a two-hour testing period. In contrast, an examinee might complete 100 to 150 items during a two-hour multiple-choice examination. The fact that performance examinations are typically relatively short means that test users must pay particular attention to the reliability and validity of test scores. Generalizability theory provides a framework for estimating the relative magnitudes of various sources of error in a set of scores. In most performance assessments, both items and raters are potential sources of error. Generalizability theory allows estimation of the error associated with each of these sources separately, as well as the relevant interaction effects. In a generalizability study (G study), the variance in a set of scores is partitioned in a manner similar to that used in the analysis of variance. However, in a G study the emphasis is not on testing for statistical significance, but rather on assessing the relative magnitudes of the variance components. Depending on the study design, different variance components can be estimated. 
Once the variance components are estimated, additional analyses can be conducted. The purpose of the present study was to examine the impacts of different G-study designs.
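    The G-study logic the abstract describes — partitioning score variance as in ANOVA, estimating components for each error source, and combining them into a generalizability coefficient — can be illustrated with a minimal sketch for the simplest crossed persons × items (p × i) design. The function names and the simulated data below are illustrative, not taken from the paper, and the sketch covers only one of the designs the study compares.

    ```python
    import numpy as np

    def g_study_p_x_i(scores):
        """Estimate variance components for a crossed persons x items (p x i)
        G-study from a score matrix (rows = persons, columns = items), using
        the expected mean squares of a two-way ANOVA without replication."""
        n_p, n_i = scores.shape
        grand = scores.mean()
        person_means = scores.mean(axis=1)
        item_means = scores.mean(axis=0)

        ss_p = n_i * ((person_means - grand) ** 2).sum()
        ss_i = n_p * ((item_means - grand) ** 2).sum()
        ss_res = ((scores - grand) ** 2).sum() - ss_p - ss_i  # p x i interaction + error

        ms_p = ss_p / (n_p - 1)
        ms_i = ss_i / (n_i - 1)
        ms_res = ss_res / ((n_p - 1) * (n_i - 1))

        # Solve the expected-mean-square equations; negative estimates are
        # conventionally truncated at zero.
        var_res = ms_res
        var_p = max((ms_p - ms_res) / n_i, 0.0)
        var_i = max((ms_i - ms_res) / n_p, 0.0)
        return var_p, var_i, var_res

    def g_coefficient(var_p, var_res, n_items):
        """Relative generalizability coefficient for a decision study that
        averages over n_items items: person variance over person variance
        plus relative error."""
        return var_p / (var_p + var_res / n_items)
    ```

    A decision study then reuses these estimates: increasing `n_items` in `g_coefficient` shrinks the relative error term, which is how test developers project the reliability of a longer examination from a short pilot.
    
    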
    Source: Acad Med. 2000 Oct;75(10 Suppl):S21-4.
    Permanent Link to this Item: http://hdl.handle.net/20.500.14038/50738
    PubMed ID: 11031163
    Related Resources: Link to article in PubMed
    Collections: UMass Chan Faculty and Researcher Publications

