
    An investigation of the impacts of different generalizability study designs on estimates of variance components and generalizability coefficients

Keller, L. A.; Mazor, Kathleen M.; Swaminathan, H.; Pugnaire, Michele P. (2000-10-01). Academic Medicine: Journal of the Association of American Medical Colleges.
In recent years, performance assessments have become increasingly popular in medical education. While the term "performance assessment" can be applied to many different types of assessments, in medical education this term usually refers to some sort of simulated patient encounter, such as an objective structured clinical examination (OSCE) or a computer simulation of an encounter. These types of assessments appeal to many educators because the tasks or items used are often seen as more realistic than items on multiple-choice examinations. However, this increased "realism" or apparent authenticity comes at a cost: performance examinations are typically more time-consuming and expensive both to administer and to score. On an OSCE, each encounter with a standardized patient is typically scored as a single item, often resulting in an examinee's completing only four to eight items in a two-hour testing period. In contrast, an examinee might complete 100 to 150 items during a two-hour multiple-choice examination. The fact that performance examinations are typically relatively short means that test users must pay particular attention to the reliability and validity of test scores.

Generalizability theory provides a framework for estimating the relative magnitudes of various sources of error in a set of scores. In most performance assessments, both items and raters are potential sources of error. Generalizability theory allows estimation of the error associated with each of these sources separately, as well as the relevant interaction effects. In a generalizability study (G study), the variance in a set of scores is partitioned in a manner similar to that used in the analysis of variance. However, in a G study the emphasis is not on testing for statistical significance, but rather on assessing the relative magnitudes of the variance components. Depending on the study design, different variance components can be estimated. Once the variance components are estimated, additional analyses can be conducted. The purpose of the present study was to examine the impacts of different G-study designs.
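The variance partitioning the abstract describes can be sketched for the simplest case: a fully crossed persons-by-items (p x i) design with one score per cell. The sketch below is illustrative, not the study's actual analysis; the function names and the small score matrix are assumptions, and the expected-mean-square solutions are the standard ones for a two-way crossed design without replication.

```python
import numpy as np

def g_study_p_x_i(scores):
    """Estimate variance components for a fully crossed persons x items
    (p x i) design from a 2-D score matrix (rows: persons, cols: items).

    Returns (var_person, var_item, var_pi) where var_pi is the
    person-by-item interaction confounded with residual error.
    """
    n_p, n_i = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    item_means = scores.mean(axis=0)

    # Sums of squares, as in a two-way ANOVA without replication
    ss_p = n_i * ((person_means - grand) ** 2).sum()
    ss_i = n_p * ((item_means - grand) ** 2).sum()
    resid = scores - person_means[:, None] - item_means[None, :] + grand
    ss_pi = (resid ** 2).sum()

    # Mean squares
    ms_p = ss_p / (n_p - 1)
    ms_i = ss_i / (n_i - 1)
    ms_pi = ss_pi / ((n_p - 1) * (n_i - 1))

    # Solve the expected-mean-square equations for the variance components
    var_pi = ms_pi                    # interaction + residual error
    var_p = (ms_p - ms_pi) / n_i      # universe-score (person) variance
    var_i = (ms_i - ms_pi) / n_p      # item variance
    return var_p, var_i, var_pi

def g_coefficient(var_p, var_pi, n_items):
    """Generalizability coefficient for relative decisions based on
    the mean over n_items items in the decision (D) study."""
    return var_p / (var_p + var_pi / n_items)

# Hypothetical 3 persons x 3 items score matrix (purely additive,
# so the interaction/error component comes out zero)
scores = np.array([[1, 2, 3],
                   [2, 3, 4],
                   [5, 6, 7]], dtype=float)
var_p, var_i, var_pi = g_study_p_x_i(scores)
print(var_p, var_i, var_pi, g_coefficient(var_p, var_pi, n_items=3))
```

The same partitioning extends to designs with raters (e.g. p x i x r), where each additional facet contributes its own main-effect and interaction components; which components are separately estimable depends on the study design, which is exactly the question the study examines.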
