Evaluation of educational software.

Titus Schleyer, Lynn A. Johnson

Research output: Contribution to journal › Article

15 Citations (Scopus)

Abstract

Evaluation is an important component of developing educational software. Ideally, such evaluation quantifies and qualifies the effects of a new educational intervention on the learning process and outcomes. Conducting meaningful and rigorous educational evaluation is difficult, however. Challenges include defining and measuring educational outcomes, accounting for media effects, coping with practical problems in designing studies, and asking the right research questions. Practical considerations that make the design of evaluation studies difficult include confounding, potentially small effect sizes, contamination effects, and ethics. Two distinct approaches to evaluation are objectivist and subjectivist. These two complement each other in describing the whole range of effects a new educational program can have. Objectivist demonstration studies should be preceded by measurement studies that assess the reliability and validity of the evaluation instrument(s) used. Many evaluation studies compare the performance of learners who are exposed to either the new program or a more traditional approach. However, this method is problematic because test or exam performance is often a weak indicator of competence and may fail to capture important nuances in outcomes. Subjectivist studies are more qualitative in nature and may provide insights complementary to those gained with objectivist studies. Several published examples are used in this article to illustrate different evaluation methods. Readers are encouraged to contemplate a wide range of evaluation study designs and explore increasingly complex questions when evaluating educational software.
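The abstract refers to measurement studies that assess instrument reliability, and to potentially small effect sizes when comparing a new program against a traditional approach. As an illustrative sketch only, not taken from the article, the snippet below computes two quantities commonly used for those purposes: Cronbach's alpha (internal-consistency reliability) and Cohen's d (standardized effect size). All data, variable names, and numbers are hypothetical.

```python
# Illustrative sketch (not from the article): two statistics often used in
# the kinds of evaluation studies the abstract describes.
from statistics import pvariance, mean, stdev

def cronbach_alpha(item_scores):
    """Internal-consistency reliability of a multi-item instrument.
    item_scores: one list of scores per item, all over the same respondents."""
    k = len(item_scores)
    respondents = list(zip(*item_scores))        # rows = respondents
    totals = [sum(r) for r in respondents]       # total score per respondent
    item_var_sum = sum(pvariance(item) for item in item_scores)
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

def cohens_d(group_a, group_b):
    """Standardized mean difference using a pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_sd = (((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical data: a 3-item quiz answered by 5 learners, plus exam scores
# for a software group vs. a traditional-lecture group.
items = [[3, 4, 5, 2, 4], [2, 4, 4, 2, 5], [3, 5, 5, 3, 4]]
software = [78, 85, 82, 90, 74]
lecture = [70, 76, 80, 72, 69]
print(round(cronbach_alpha(items), 2))   # → 0.91
print(round(cohens_d(software, lecture), 2))  # → 1.55
```

A high alpha (commonly ≥ 0.7) suggests the instrument's items measure a common construct; the effect size then expresses the group difference in standard-deviation units, independent of the test's scale.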

Original language: English (US)
Pages (from-to): 1221-1228
Number of pages: 8
Journal: Journal of Dental Education
Volume: 67
Issue number: 11
State: Published - 2003
Externally published: Yes


Cite this

Evaluation of educational software. / Schleyer, Titus; Johnson, Lynn A.

In: Journal of Dental Education, Vol. 67, No. 11, 2003, p. 1221-1228.


@article{c40e3d23c712490cb0934f3913bdf22a,
title = "Evaluation of educational software.",
author = "Schleyer, Titus and Johnson, {Lynn A.}",
year = "2003",
language = "English (US)",
volume = "67",
pages = "1221--1228",
journal = "Journal of Dental Education",
issn = "0022-0337",
publisher = "American Dental Education Association",
number = "11",
}

TY - JOUR

T1 - Evaluation of educational software.

AU - Schleyer, Titus

AU - Johnson, Lynn A.

PY - 2003

Y1 - 2003



UR - http://www.scopus.com/inward/record.url?scp=0642343538&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0642343538&partnerID=8YFLogxK

M3 - Article

VL - 67

SP - 1221

EP - 1228

JO - Journal of Dental Education

JF - Journal of Dental Education

SN - 0022-0337

IS - 11

ER -