Formative Evaluation of a Web-based Multimedia Intervention to
Support Learning of Statistics

Natalya Koehler¹, Ana-Paula Correia², Nimet Alpay¹ and Carolyn LeVally¹
¹International Institute of Innovative Instruction, Franklin University, 236 E Town St, Columbus, Ohio, U.S.A.
²Educational Studies Department, The Ohio State University, 29 W Woodruff Ave, Columbus, Ohio, U.S.A.
Keywords: Formative Evaluation, Multimedia, Statistics.
Abstract: Tailoring information to the needs of the learner is an important strategy in teaching complex concepts. Web-based learning modules informed by multimedia theory for teaching declarative and procedural knowledge (statistical terminology, concepts, and procedures) were introduced into an undergraduate mathematics course. The formative evaluation of these modules was conducted over a period of two trimesters (August to December 2015 and January to April 2016) with a total of 187 students and six instructors. Students’ perceptions of the modules’ usability features (e.g., pace of audio presentations, ease of navigation, and layout), as well as of the cognitive support and effectiveness of the modules, were analyzed. Students’ and instructors’ reflections on their experiences with the modules were also gathered and analyzed. Both sets of participants were overwhelmingly positive about their online learning and teaching experiences of statistics. Suggestions for improvement included having more multimedia lectures, showing more examples as well as calculator tutorials, and asking more comprehension questions in the modules.
1 INTRODUCTION
Teaching and learning mathematics has often been identified as challenging and as a barrier to students’ graduation in higher education (Complete College America, 2011). For this reason, a mathematics course was redesigned with the purpose of increasing student retention and the number of students receiving passing grades. The redesign also needed to offer better support for students whose schedules prevented them from attending synchronous online sessions.
In the face of these challenges, the redesign was applied to a mathematics course offered at a large non-profit private university with a wide selection of online courses and programs, named Franklin University for the purpose of this study. The data analysis prior to the redesign, based on faculty members’ interviews and student course evaluations, revealed a need to better support the undergraduate students taking the course. Most of the students were adult learners who had to juggle work, family responsibilities, and financial commitments along with their academic programs. The majority of the students who took the mathematics course as an online section were not able to attend the live synchronous sessions because of schedule conflicts.
To better address the needs of these students, a series of web-based learning modules (referred to here as the Software) was created. It consisted of weekly web-based multimedia interactive lectures and aimed to help students synthesize statistical knowledge and check their understanding of terminology, concepts, and procedures. An example can be viewed at http://video.franklin.edu/Franklin/statistics/1a/story.html
This paper focuses on the evaluative efforts to judge the learning value of the Software. It describes the two rounds of formative evaluation conducted between August 2015 and April 2016.
2 LEARNING OF STATISTICS
AND COMPUTER ASSISTED
LEARNING
Repetitive practice through carefully selected and managed examples is considered an important way of learning statistics: it allows students to construct generalizations and see patterns (Petty, 2012). An efficient way of engaging students in repetitive practice is to use auto-graded assignments, which can be provided by a textbook publisher or an in-house system.
Larwin and Larwin (2011), in their meta-analysis, found that Computer Assisted Instruction (CAI) has a moderate impact on student achievement. Some researchers emphasized that technology needs to be used not only for computing numbers but also for concept exploration (Moore, 1997; Garfield et al., 2000). Others stated that technology has the potential to expand the range of visualization techniques that can help learners understand concepts (Chance et al., 2007). Embedding comprehension questions in CAI can be considered an additional benefit (Sklar and Zwick, 2009). Also, animated presentations of statistical concepts can help non-statistics majors learn statistics (Sklar and Zwick, 2009). Wender and Muehlboeck (2003) found that computer-animated graphics were more effective than static images in teaching statistics, resulting in better student understanding of statistical concepts and retention of the material.
Kirschner et al. (2006) advocated a strong emphasis on guidance of the student learning process. The Cognitive Theory of Multimedia Learning describes the cognitive processes that occur during multimedia learning (Moreno and Mayer, 2000): presenting information through both the auditory and the visual channel can enhance student learning from multimedia instruction. Keller’s attention, relevance, confidence, and satisfaction (ARCS) model features strategies for enhancing and sustaining motivation. For example, instructional designers can build a menu-driven program structure into the software interface, which enhances learners’ confidence by giving them control over access to different parts of the courseware (Song and Keller, 2001).
3 METHODS
3.1 The Intervention
According to the report from Complete College America (2011), the chance that a part-time student graduates with a 4-year bachelor’s degree within 8 years is 24.3%. Moreover, graduation rates are especially low for poor, older students (Complete College America, 2011). Since Franklin University undergraduate students largely fall into these categories, it was important to support the success of every student. Most Franklin University students are first-generation college students who work part-time and balance work, classes, and family responsibilities. The majority of them are financial aid recipients. The average student age is 33.27 years. Table 1 summarizes the undergraduate student population at Franklin University.
Table 1: Statistics of the Undergraduate Student Population at Franklin University.

Category                    Percentage
Race                        Caucasian 66.7%; African American 21.6%; Asian 3.2%; Hispanic 3.2%; Other 5.3%
Student Status              Full-time 33.6%; Part-time 66.4%
Financial Aid Recipients    70%
Gender                      Male 43.5%; Female 56.5%
3.2 The Software
Twenty-eight web-based multimedia modules were
created targeting the weekly learning outcomes.
Topics included sampling, types of data, graphs,
variance and standard deviation, percentiles, rules of
probabilities, confidence intervals, hypothesis
testing, tests with qualitative data, correlation
coefficient and least-squares regression.
The Software was designed not only to provide additional information but also to increase students’ understanding of key concepts through multimedia instruction and periodic checks of student comprehension. At Franklin University, the Software targeted students in both sections of the mathematics course, online and face-to-face. Details on the design of the Software are described in Alpay et al. (2014).
3.3 Study Design
This study explores students’ and instructors’ perceptions of the benefit and value of the Software. Students’ perceptions of and instructors’ feedback on the intervention were collected over a period of two trimesters (August to December 2015 and January to April 2016) at Franklin University.
The focus of the first formative evaluation, which took place between August and December of 2015, was to collect preliminary information on the Software’s effectiveness and usability. The instrument for collecting student perceptions consisted of four parts, as follows:
Part 1: a table was made available for reporting any bugs that students might encounter in any of the modules.
Part 2: students were asked to rate 14 statements (see Table 3) about the Software assigned for their review, on a 5-point scale (5 = strongly agree; 1 = strongly disagree). Students were asked to evaluate:
1. three usability features, such as pace of the audio
presentations, ease of navigation, and layout of
the Software.
2. overall experience with the Software.
3. sufficiency of cognitive support provided by the
Software.
4. helpfulness of Software features, such as:
a. explanations
b. real-life examples
c. summary slides
d. transcript of lectures
e. self-assessment questions
f. supplemental content (tips for how to use a
calculator, topic videos, etc.)
g. program feedback to self-assessment questions
Part 3: students answered two open-ended
questions about what they liked about the Software
and what changes they would like to make, if any.
Part 4: students were asked to write a paragraph
about their learning experience with the Software and
provide examples to illustrate which specific program
features and/or instructional strategies helped/did not
help them learn the new material.
The second round of formative evaluation assessed the Software after minor bugs had been fixed. Examples of these bugs: (1) a few buttons on some screens did not work as expected; (2) a few pop-up windows did not open as expected. Since no major problems with the software design were found in the first round, the data collection instruments used to seek additional student feedback on the value of the Software remained the same; thus, the design of the second round of the evaluation in the second trimester (January to April 2016) did not change.
In both rounds of formative evaluation, instructors’ feedback was collected through an anonymous online survey on SurveyMonkey. The instructors answered six open-ended questions focused on how the Software was used in the course by students and instructors.
3.4 Data Collection Process
Data were collected from four sources: (1) students’ responses to two open-ended questions, (2) students’ survey ratings of the Software features, (3) students’ reflections on their experiences, and (4) instructors’ responses to six open-ended questions via an online survey.
Data were analyzed for recurring themes related to the helpfulness of the Software features and the instructional strategies used in the mathematics course. The data were triangulated to support the reliability and validity of the instruments. The data collection instruments corresponding to the research questions are presented in Table 2.
Table 2: Research Questions and Data Collection Instruments.

Research questions 1-4:
1. What were the students’ perceptions of how the Software supported their cognitive processing of the target concepts?
2. What was the overall student experience with the Software?
3. What unanticipated problems did students and instructors experience with the Software?
4. Which features and instructional strategies implemented in the Software were the most and least helpful to students and instructors?

Data collection instrument for questions 1-4: online survey consisting of 14 statements, 2 open-ended questions, and a reflection summary. Students could take the survey at their convenience within the allocated time for response. Students rated their experience with the Software on a 5-point scale (5 = strongly agree, 1 = strongly disagree).

Research questions 5-7:
5. What were the instructors’ perceptions of student use of the Software?
6. What was the instructors’ overall experience with the use of the Software in this course?
7. What effect did the Software have on the role of the instructor and the effectiveness of instruction in the course?

Data collection instrument for questions 5-7: online survey consisting of six open-ended questions. Instructors could take the survey at their convenience within the allocated time for response.
The Franklin University Institutional Review Board approved this study. Informed consent was obtained from instructors before they completed an anonymous online survey. Students who participated in the study were given bonus credits; students who did not participate were given alternative course activities to earn an equivalent number of credits.
After reviewing their assigned modules, student participants filled out the survey form. Because students sent their signed consent forms to the research assistant in a random order, the research assistant could assign modules to students sequentially, with the goal of obtaining the same proportion of evaluations for each module. This task assignment scheme ensured that the multimedia modules were assigned to students in an effectively random fashion. The research assistant kept track of the assigned and unassigned multimedia lectures, the students’ responses, their bonus points, and the survey results.
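The sequential assignment described above amounts to a simple round-robin pass over the module list. The following minimal sketch illustrates the bookkeeping (hypothetical Python; the module identifiers and tracking structure are illustrative assumptions, not the research assistant’s actual tooling):

    from itertools import cycle

    # Hypothetical identifiers for the 28 multimedia modules.
    modules = [f"module_{i:02d}" for i in range(1, 29)]
    assigner = cycle(modules)  # round-robin iterator over the modules

    assignments = {}  # student id -> assigned module

    def assign_module(student_id: str) -> str:
        """Assign the next module in round-robin order.

        Because consent forms arrive in a random order, cycling
        through the module list spreads evaluations evenly across
        modules while keeping the student-to-module pairing random.
        """
        module = next(assigner)
        assignments[student_id] = module
        return module

    # Consent forms arrive one at a time, in random order.
    for student in ["s001", "s002", "s003"]:
        print(student, "->", assign_module(student))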
3.5 Participants
All Franklin University students who were enrolled in the mathematics course were eligible to participate in the study. Of the 236 students enrolled in the course between August and December of 2015, 147 (62%) took the course online and 89 (38%) were enrolled in the face-to-face section. Eighty-six of the enrolled students (36%) agreed to participate in the first round of formative evaluation of the Software. From January to April of 2016, of the 209 students enrolled in the course, 131 (63%) took it online and 78 (37%) took it face-to-face. Eighty-one students (39%) agreed to participate in the second round of the formative evaluation.
Six instructors out of 13 (46%) participated in the study between August and December of 2015; two of them taught face-to-face sections and the other four taught online sections. Six instructors out of 13 (46%) participated in the study between January and April of 2016; three of them taught face-to-face sections and the other three taught online sections.
4 RESULTS
4.1 Summary of Students’ Quantitative
Data
As reflected in Table 3 below, students’ experiences with the Software were overwhelmingly positive.
Students rated the usability of the Software very highly. The majority of the formative evaluation participants strongly agreed that they liked the layout, navigation, and pace of the instruction (see items 1, 2, and 13 in Table 3). Most participants strongly agreed that the Software design supported their cognitive processing of the new information (see items 6, 7, and 8 in Table 3).
The vast majority of the students found the Check Your Learning questions embedded in the Software helpful. They also highly valued the program feedback explaining the correct answers, and they found the feedback sufficient (see items 9, 10, and 11 in Table 3).
Table 3: Students’ experiences with the Software.

Survey        Aug-Dec 2015   Jan-Apr 2016
question #    Mean (SD)      Mean (SD)
1.            4.7 (0.83)     4.7 (0.59)
2.            4.7 (0.73)     4.7 (0.53)
3.            4.6 (1.2)      4.7 (0.67)
4.            4.5 (0.94)     4.5 (0.76)
5.            4.1 (1.06)     4.3 (0.89)
6.            4.6 (0.79)     4.7 (0.55)
7.            4.4 (0.79)     4.5 (0.71)
8.            4.5 (0.85)     4.4 (0.69)
9.            4.5 (1.35)     4.5 (0.78)
10.           4.5 (0.72)     4.6 (0.73)
11.           4.3 (0.79)     4.4 (0.91)
12.           4.3 (0.96)     4.4 (0.84)
13.           4.6 (0.72)     4.7 (0.62)
14.           4.5 (0.84)     4.6 (0.61)
Survey questions (5 = strongly agree; 1 = strongly disagree): 1. I liked the layout of the sections in the lecture; 2. I liked the navigation in the sections in the lecture; 3. I liked the explanations and the examples used in the informational slides of the lecture sections; 4. I liked the summary slides used throughout the sections; 5. I found the transcript option useful; 6. The sections helped me understand this week’s material; 7. The section design helped me retain the new information; 8. The section design helped me maintain my attention; 9. I found the self-assessment feature (Check Your Learning questions) helpful; 10. The answer feedback to the Check Your Learning questions (explanations of the correct answers) was helpful; 11. The answer feedback to the Check Your Learning questions was sufficient; 12. I liked the supplemental content (if any), such as calculator tips and topic videos; 13. The pace of the content was good; 14. I enjoyed my experience using the lectures.
Features such as access to audio transcripts and supplemental materials (for example, help on how to use the calculator for specific tasks) were highly valued by some students, but not by all of them (see items 5 and 12 in Table 3). This can be attributed to differences in students’ prior knowledge of the topics: transcripts allowed students with high prior knowledge to skim familiar material quickly, and the calculator help was used only by those students who needed it.
The overwhelming majority of the study participants enjoyed their experience with the Software (see item 14 in Table 3).
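The descriptive statistics in Table 3 are ordinary means and sample standard deviations of the 5-point ratings. A minimal sketch of the computation (hypothetical ratings; Python’s standard statistics module):

    import statistics

    # Hypothetical ratings for one survey statement (5 = strongly agree).
    ratings = [5, 4, 5, 5, 3, 4, 5, 5, 4, 5]

    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)  # sample standard deviation

    # Formatted like a Table 3 entry; prints "4.5 (0.69)" for these data.
    print(f"{mean:.1f} ({sd:.2f})")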
4.2 Summary of Students’ Qualitative
Data
Thematic content analysis was used to summarize themes from student reflections on their learning experiences with the Software, supported by illustrative examples. The themes fell under four categories:
- Effectiveness of multimedia design: software usability and visual design
- Effectiveness of multimedia instructional design: helpfulness of software features
- Effectiveness of support for cognitive processing of information provided by the Software
- Overall satisfaction with the Software
“Composite” responses reflecting the content of all the responses in each category were created and tabulated, with the most frequent themes presented at the top of each list. Eighty-six students provided comments, and some students commented on more than one theme. The results are presented in Tables 4 to 7.
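The tallying behind Tables 4 to 7 is simple frequency counting over coded comments. The following minimal sketch illustrates the idea (hypothetical theme codes in Python; the actual coding was done manually by the researchers):

    from collections import Counter

    # Hypothetical coded comments: each student comment may carry
    # several theme codes assigned during thematic content analysis.
    coded_comments = [
        ["easy_navigation", "self_paced"],
        ["easy_navigation"],
        ["appropriate_pace", "self_paced"],
    ]

    # Flatten and count: one tally per theme mention, most frequent first.
    theme_counts = Counter(code for comment in coded_comments for code in comment)

    for theme, count in theme_counts.most_common():
        print(f"{theme}: {count}")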
Table 4: Themes from students’ comments about the effectiveness of the multimedia design.

Theme                                         Aug-Dec 2015   Jan-Apr 2016
                                              # comments     # comments
Supports self-paced learning well             25             30
The pace is appropriate                       17             26
The quality of narration is appropriate       14             12
Easy to navigate                              44             39
Colors are visually appealing                 10             24
Color coding supports information processing  7              10
The length of mini-lectures was appropriate   16             25
Overall, there were no negative comments; students were happy with all aspects of the Software. They thought that the lectures gave a classroom feel to a non-classroom approach to learning, and they found the lectures very user-friendly and effective in supporting cognitive processing of the new information.
Table 5: Themes from students’ comments about the effectiveness of the multimedia instructional design (helpfulness of software features).

Theme                                                Aug-Dec 2015   Jan-Apr 2016
                                                     # comments     # comments
Summary slides are helpful                           25             36
Real-life examples are helpful                       23             17
Explanations are helpful in guiding student learning 34             32
Self-assessments at the end of each chapter
reinforce student learning                           39             44
Program feedback on student answers to quiz
questions is helpful and sufficient                  46             50
Supplemental materials were helpful                  24             21
The calculator tutorial was helpful                  19             25
Inclusion of audio transcripts is helpful            20             21
Graphics and tables were helpful                     18             12
Animated examples helped understand concepts         20             11
Students used the Software to prepare for homework assignments (27 comments) and exams (9 comments). Students also highly recommended building similar lectures for other courses in which complex concepts are taught.
Table 6: Themes from students’ comments about the effectiveness of the support for cognitive processing of information provided by the multimedia lectures.

Theme                                         Aug-Dec 2015   Jan-Apr 2016
                                              # comments     # comments
Helped students understand the material       32             34
Helped students remember the material         37             23
Helped students maintain their attention      24             37
Helped students organize information for
encoding into long-term memory                13             6
There were a few suggestions for improvement. Twenty students asked that more self-assessment questions be included in the Software. Nine students recommended three tries instead of two on the self-assessment questions. Four students would like a single section containing all the videos on how to use calculators. Four students would like references to the specific pages in the textbook; this last suggestion would help students go directly to the textbook, if extra study is required, without spending time searching for the relevant passage.

Table 7: Themes from students’ comments about overall satisfaction.

Theme                                         Aug-Dec 2015   Jan-Apr 2016
                                              # comments     # comments
Reduces phobia of math                        7              6
Recommendation to build similar lectures
for other courses                             10             12
Satisfaction with the Software                25             32
Based on the bug reports, students’ suggestions for corrections were collected and tabulated by the research assistant, and the reported problems were fixed by the multimedia designer.
During the second trimester (January to April 2016), most of the themes emerging from students’ comments remained the same as in the first trimester (August to December 2015). The major new theme that emerged concerned having student work with the Software graded. One of the students wrote the following comment:
“Because these interactive lectures were not
mandatory, I did not complete them for every section.
When I felt like I was stuck on a problem or having a
hard time remembering the material, I would use
these lectures. I wish I had gone about the class
differently and used these lectures each week. My
reasoning was that because the amount of work in this
class is so large, on top of the other classes I was
taking, and my personal life, I did not have time to
complete something if it was not part of the grade. It
would be nice if these lectures counted for a portion
of our grade.”
4.3 Instructors’ Feedback
All of the instructor participants (two face-to-face instructors and four online instructors in August to December 2015, and three face-to-face instructors and three online instructors in January to April 2016) provided favorable comments about the Software. Most of them recommended that their students use the Software before coming to class or before synchronous online sessions. All the instructors believed that the Software was a good resource for helping students understand complex statistical concepts. Half of the instructors considered the Software a tool that provided helpful ideas for teaching statistical concepts. Composite responses reflecting the content of all the instructor responses in each category were created and tabulated (see Table 8).
Table 8: Instructor comments about the perceived effectiveness of the Software.

Area of inquiry (based on      Composite response                  # instructor
research questions)                                                comments

Use of the Software by the     The web-based multimedia modules    6
instructor for teaching the    were highly recommended for
course                         students to watch before class.

Effect of the Software on      It frees instructor time for        6
the instructor role, if any    discussing and clarifying complex
                               concepts in class and during
                               synchronous online sessions; a
                               good addition to the course.

Use of the Software during     a) Some screens in the Software     5
online synchronous sessions    were used during the synchronous
or in class, if any            online sessions to help students
                               visualize complex concepts.
                               b) The videos in the Software       1
                               were played instead of lecturing;
                               this was not a very successful
                               experience because the instructor
                               lost students’ interest.

Perceived student reactions    Positive comments from students     6
to the Software                because it helps clarify the
                               weekly material.

Perceived effect of the        Excellent resource; supports        6
Software on student            understanding of the material
understanding of the topics    because it is chunked into
                               easy-to-understand pieces.

Use of the Software for        a) The Software provides helpful    3
professional development,      suggestions to the instructor on
if any                         how to help students understand
                               the concepts.
                               b) The Software did not have any    3
                               impact on professional
                               development.
5 DISCUSSION AND
CONCLUSIONS
Most of the students who participated in the evaluation of the Software found it highly valuable for their learning. The web-based multimedia modules provided invaluable instructional support for online students as well as face-to-face students. Students favored the self-paced instruction; the dual coding of information supported by multimedia lectures; the systematic instruction of complex subject matter for online students; the opportunity to review the material at a later time; the clear, concise, and effective instruction; the multiple examples; the comprehension questions with explanatory program feedback; and many other aspects of the Software.
Interestingly, students used the Software in ways that their instructors did not anticipate. It was used not only for initial learning before classroom instruction but also for reviewing the course material and for test preparation.
Surprisingly, one of the most valued features of the Software turned out to be the comprehension questions augmented with explanatory feedback. Several students also mentioned that they would like to have similar instruction in other courses. Others regretted that student work with the Software was not part of their course grade; they thought that if the Software experience were graded, more students would use it for their benefit. One unanticipated consequence of the formative evaluation study was that exposing students to the Software increased the number of students who used it on a regular basis, because they realized that it was a more effective and efficient way of learning statistics.
The following comments were chosen as the best representations of student experience with the Software:
“I have done much with online classes and various
interactive programs such as Pearson Math XL,
Webassign, and McGrawHill Math connect. The
interactive lectures on the Franklin course site are
far better than any of the others I have used.”
“I really enjoy the interactive lectures. I think it
gives a classroom feel to a non-classroom
approach to learning.”
“At the beginning of the math course, I didn’t look
at the interactive lecture and opted to read the
book and assist the SLA meetings on Thursdays.
One day I decided to listen to the interactive
lecture to see if it was helpful. I really liked it
because I was able to hear someone explain the
course and give concrete examples I could relate
to. It really helped me understand the idea behind
each chapter and what each formula and equation
was used for.”
“My experiences with this interactive were
completely positive. Having this experience with
the interactive lectures really helped bring the
lesson plan alive; in class the professor can go
over and show example but with the interactive
series it allows you to stop and replay whichever
sections you may not understand.”
5.1 Suggestions for Improvement
The major suggestions for improvement were about building additional interactive multimedia lectures for this course and similar resources for other math courses. Some students wanted more multimedia presentations of the new material, more examples, more calculator tutorials, and more comprehension questions in the Software.
5.2 Study Strengths and Limitations
Only 36% and 39% of the students enrolled in the course between August 2015 and April 2016 participated in the evaluation. Thus, it is likely that the respondents were those most inclined to benefit from using the Software. Although not representative of the entire cohort of students, this formative evaluation provides useful information for the design and development of multimedia educational materials that teach statistics and support the needs of online students. The results of the study demonstrated that students valued the principles of sound educational practice applied to both online and face-to-face instruction. Student comments also provided preliminary evidence of the effectiveness of the web-based multimedia modules. This study benefits the field of multimedia instructional design by reporting on specific strategies employed to produce effective instructional solutions.
REFERENCES
Alpay, N., Reid, A., Koehler, N., Prendergast, M., &
Thomas, S., 2014. New Products and Improvements in
Math 215. Workshop conducted at Conference on
Teaching and Learning, Columbus, OH, June 13, 2014.
Chance, B., Ben-Zvi, D., Garfield, J., & Medina, E., 2007. The role of technology in improving student learning of statistics. Technology Innovations in Statistics Education, vol. 1, no. 1. http://repositories.cdlib.org/uclastat/cts/tise/vol1/iss1/art2.
Complete College America, 2011. Time is the Enemy of Graduation. Retrieved from http://completecollege.org/resources/
Garfield, J., Chance, B., & Snell, J. L., 2000. Technology
in college statistics courses. In D. Holton (Ed.), The
teaching and learning of mathematics at university
level: An ICMI study. (pp. 357-370). Dordrecht, the
Netherlands: Kluwer Academic Publishers.
Kirschner, P. A., Sweller, J., & Clark, R., 2006. Why
minimal guidance during instruction does not work: An
analysis of the failure of constructivist, discovery,
problem-based, experiential and inquiry-based
teaching. Educational Psychologist, vol. 41, no.2, pp.
75–86.
Larwin, K., Larwin, D., 2011. A meta-analysis examining
the impact of computer-assisted instruction on
postsecondary statistics education: 40 years of research.
Journal of Research on Technology in Education, vol.
43, no. 3, pp. 253–278.
Moore, D. S., 1997. New pedagogy and new content: The case of statistics. International Statistical Review, vol. 65, no. 2, pp. 123-165.
Moreno, R., Mayer, R. E., 2000. A learner-centered approach to multimedia explanations: Deriving instructional design principles from cognitive theory. Interactive Multimedia Electronic Journal of Computer Enhanced Learning, vol. 2, no. 2, pp. 12-20.
Petty, N. W., 2012. Drill and Rote in Teaching LP and
Hypothesis Testing [Web log post]. Retrieved from
https://learnandteachstatistics.wordpress.com/2012/02/
Sklar, J. C., & Zwick, R., 2009. Multimedia presentations
in educational measurement and statistics: Design
considerations and instructional approaches. Journal of
Statistics Education, vol. 17, no. 3.
Song, S. H., & Keller, J. M., 2001. Effectiveness of
motivationally adaptive computer-assisted instruction
on the dynamic aspects of motivation. Educational
Technology Research and Development, vol. 49, no. 2,
pp. 5-22.
Wender, K. F., & Muehlboeck, J. S., 2003. Animated
diagrams in teaching statistics. Behavior Research
Methods, Instruments, & Computers, vol. 35, no. 2, pp.
255-258.