Evaluation and Educational Research
The project will make a major effort to provide evaluation
data of value to policy decisions. Because this is a pilot study,
its value to policy-makers lies in its implications for other,
similar situations. We therefore need to know not only how
well students mastered these materials, but also whether
improvements based on atomic-scale modeling might be expected
in other subjects and contexts. Questions that address these
issues can be divided into four areas: student performance, materials
design, implementation considerations, and curriculum implications.
Student Performance
The questions addressed by this part of the study include:
How well do students learn using this approach? How well do students
achieve the stated learning goals? Can the use of atomic-scale
models be traced to student learning?
Our design is open-ended, without tight controls.
The evaluation study will generate learning goals
for each WISE activity that, according to practicing teachers,
are appropriate. Student evaluation will center on measuring
the extent to which students in the study achieve those goals.
Pre- and post-tests for each project will be administered to
determine the change in knowledge that can be attributed to the
materials. We will also give the post-tests to a representative
group of comparable students at each participating institution
who are not taking part in the project. This will provide a comparison
group that indicates how well students typically master the project's
learning goals.
We will develop a subset of pre- and post-test items
that measure general understanding of atoms and molecules.
This subtest will provide evidence about the impact of this
approach on student understanding of the atomic world and its
connection to macroscopic phenomena.
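For illustration only, the sketch below shows the kind of gain and comparison-group analysis this pre/post design supports. The scores, the 0-100 scale, and the use of a Welch t-test are assumptions made for the example, not the project's actual instrument or analysis plan.

# Hypothetical sketch of the pre/post comparison described above.
# Scores, scale, and the choice of a Welch t-test are illustrative
# assumptions, not the project's instrument or analysis plan.
from statistics import mean
from scipy.stats import ttest_ind

# Each project-student record: (pre-test score, post-test score), 0-100 scale.
project_students = [(42, 71), (55, 80), (38, 64), (60, 83)]
comparison_post = [58, 61, 49, 66]  # comparison group takes the post-test only

# Raw and normalized gains for students who used the materials.
gains = [post - pre for pre, post in project_students]
norm_gains = [(post - pre) / (100 - pre) for pre, post in project_students]
print(f"mean raw gain: {mean(gains):.1f} points")
print(f"mean normalized gain: {mean(norm_gains):.2f}")

# Compare post-test scores of project students against the comparison group.
project_post = [post for _, post in project_students]
t_stat, p_value = ttest_ind(project_post, comparison_post, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")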
Materials Design
The questions addressed by this part of the study include:
Was the underlying model sufficiently accurate? Did the hypermodels
capture features of science and technology that are important
to this audience? Were the projects designed at an appropriate
level for this audience? Was there the right balance of open-ended
exploration and exposition? Did the instructional materials address
the learning goals?
Evidence for these questions will come from an analysis of the
XML-tagged data returned by Pedagogica and the student responses
collected by WISE. These records tell us how long students spent
on each page, the predictions and observations they made, and
their ongoing reflections on their own learning.
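The sketch below illustrates the kind of processing such logs support: summing the time spent on each page and pulling out prediction and reflection text. The XML structure shown is an invented stand-in; the actual Pedagogica and WISE formats are defined by those systems, not here.

# Minimal sketch of the log analysis described above.
# The XML structure is a hypothetical stand-in for the real
# Pedagogica/WISE data formats.
import xml.etree.ElementTree as ET
from collections import defaultdict

log_xml = """
<session student="s001" activity="gas-laws">
  <page id="intro" enter="0" exit="95"/>
  <page id="model-run" enter="95" exit="410"/>
  <prediction page="model-run">Pressure will double.</prediction>
  <page id="reflection" enter="410" exit="560"/>
  <reflection page="reflection">The model matched my prediction.</reflection>
</session>
"""

root = ET.fromstring(log_xml)

# Sum time on each page (seconds from the start of the session in this sketch).
time_on_page = defaultdict(int)
for page in root.iter("page"):
    time_on_page[page.get("id")] += int(page.get("exit")) - int(page.get("enter"))
for page_id, seconds in time_on_page.items():
    print(f"{page_id}: {seconds} s")

# Student predictions and reflections travel in the same log.
for tag in ("prediction", "reflection"):
    for node in root.iter(tag):
        print(f"{tag} ({node.get('page')}): {node.text.strip()}")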
We will also present the materials to panels of experts for analysis.
Different panels will look at the accuracy and appropriateness
of the software, the fit of the materials in core science courses,
and the applicability of the activities in technical areas.
Implementation Considerations
The questions addressed by this part of the study include:
Are there technical or organizational problems in implementing
the materials? Did students have adequate access to computers
to complete the materials at home, school, and work? How reliable
was Internet access? Were students able to take advantage of
the ability to run models locally? How difficult was it to make
the materials available to students? Were there problems with
downloading applications, uploading responses, and caching material
locally?
These questions will be addressed through questionnaires administered
to a sample of participating students and to the technology coordinators
at all participating colleges. Most students will respond online,
but a paper version will be available for students with serious
connectivity problems.
Curriculum Implications
The questions addressed by this part of the study include:
Are there places in the typical two-year college curriculum for
materials like these? What subjects and topics are most applicable?
Is there too large a gap between the abstractions of the models
and the exigencies of the short curriculum? Do standards, license
exams, and other requirements make it difficult to fit in new
materials such as these?
These questions will be addressed through phone interviews and
online questionnaires with participating faculty.
This material is based upon work supported by the National Science Foundation under Grant No. EIA-0219345. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.