The following article has been published in *Research
in Collegiate Mathematics Education V*, edited by Selden, A.,
Dubinsky, E., Harel, G., and Hitt, F., Providence, RI: American Mathematical
Society (2003). This research was
funded by National Science Foundation grant # DUE-9752421. The authors would also like to acknowledge
the support for this project from the Carnegie Academy for the Scholarship of
Teaching and Learning, a program of The Carnegie Foundation for the Advancement
of Teaching.

**The Nature of Learning in Interactive Technological Environments**


Jack Bookman and David
Malone

It is clear that technology is fundamentally changing the way we live, work, and learn, but it is not clear exactly how. In particular, it is not clear how the Internet and sophisticated computer algebra systems are changing, and will change, how we teach and learn mathematics. Smith (2000) has succinctly summed up the situation:

Technology is a fact of life for our students -- before, during, and after college. Most students entering college now have experience with a graphing calculator, because calculators are permitted or required on major standardized tests. A large and growing percentage of students have computer experience as well – at home, in the classroom, or in a school or public library. ''Surfin' the Net'' is a way of life -- whether for good reasons or bad. Many colleges require computer purchase or incorporate it into their tuition structure. Where the computer itself is not required, the student is likely to find use of technology required in a variety of courses. After graduation, it is virtually certain that, whatever the job is, there will be a computer close at hand. And there is no sign that increase in power or decrease in cost will slow down any time in the near future. We know these tools can be used stupidly or intelligently, and intelligent choices often involve knowledge of mathematics, so this technological environment is our business. Since most of our traditional curriculum was assembled in a pre-computer age, we have a responsibility to rethink whether this curriculum still addresses the right issue in the right ways -- and that is exactly what has motivated some reformers.

Nonetheless, the move to integrate technology into teaching has not been without its detractors. Krantz (2000) raises some important concerns when he states that: (1) distance education and products promoted by publishers for profit “describe a dangerous trend”; (2) “Provosts and deans have dollar signs in their eyes. They envision teaching more students with fewer faculty”; and (3) “The important question is whether students are internalizing and retaining the material.” But he presents an extreme either/or view of technology, lumping together all uses of technology in the classroom. He asserts, providing no evidence, that “Traditional education ... enables students to master the ideas and retain them for future use,” that “traditional methods ... have had - and continue to have - great success,” and that traditional classrooms produce “interaction of first rate minds.” He then claims (again with no evidence) that there is no measurable benefit to employing technology in the classroom. He is not the only mathematician with these concerns. The issue of how, or if, to introduce technology into the classroom is one of the most divisive and emotionally charged issues in education.

In this paper, we propose an agenda for research that will move the discussion of the use of technology in undergraduate mathematics classes from the coffee lounge and soap box to the seminar room. We will discuss some preliminary results of careful observations of student learning using computer algebra systems with lessons delivered via the Internet. Based on these observations, we propose a set of research questions whose answers will help us to understand how best to use these new technologies to improve the teaching and learning of mathematics.

In recent years, consensus among organizations concerned about mathematics education, such as the National Council of Teachers of Mathematics and National Research Council, may have emerged regarding the essential steps in reforming mathematics and science education (Chambers & Bailey, 1996; Battista, 1999; National Council of Teachers of Mathematics, 1991; National Research Council, 1991; National Science Foundation, 1996). Bailey and Chambers (1996) summarized several of these reform reports and concluded that six overarching recommendations have emerged: (1) integrate the teaching of science and mathematics; (2) emphasize cooperative learning; (3) focus on application and relevant problem solving; (4) teach primarily through active learning as opposed to lecture; (5) attend to the motivation of learners; and (6) use technology in meaningful ways.

The Connected Curriculum Project ^{1}
is an innovative instructional effort which addresses each of these six
recommendations. The Connected Curriculum Project (CCP) has developed a
collection of learning materials designed to create interactive learning
environments for students in the first two years of college mathematics courses
(Colvin et al., 1999; Coyle et al., 1998). The materials combine the
interactivity, accessibility, and connectivity of the Web with the power of
computer algebra systems. These materials may be used by groups of students as
an integrated part of a course, by individuals as independent projects, or as
supplements to classroom discussions.
Lawrence Moore and David Smith, who began their collaboration in 1988 at
the beginning of the calculus reform movement, lead this project.

The CCP is a direct extension of the experience gained from the calculus reform movement in general and, in particular, Project CALC: Calculus As a Laboratory Course, supported by the NSF Calculus Reform Initiative. The key features of that course are real-world problems, hands-on activities, discovery learning, writing and revision of writing, teamwork, and intelligent use of available tools. The stated goals for the course are that students should: (1) be able to use mathematics to structure their understanding of and investigate questions in the world around them; (2) be able to use calculus to formulate problems, to solve problems, and to communicate the solution of problems to others; (3) be able to use technology as an integral part of this process of formulation, solution, and communication; and (4) learn to work cooperatively (Bookman & Blake, 1996; Smith & Moore, 1990; Smith & Moore, 1991).

A
subsequent NSF grant in 1993 (Interactive Modules for Courses Following
Calculus, Duke University, NSF DUE-9352889, 1993-97) supported development of
modular lab activities for courses beyond calculus: linear algebra,
differential equations, and engineering mathematics. These modules were created
as interactive texts in specific computer algebra systems. CCP was devised to extend the usefulness of
these modules by capitalizing on the interactivity and availability provided by
the Internet. The CCP modules include
hypertext links, Java applets, sophisticated graphics, a computer algebra
system, realistic scenarios, and questions that require written answers. The
materials used for this study were single-topic units that can be completed in
one to two hours with students working in two-person teams in a computer lab
environment.

The CCP is
based in part on ideas of cognitive psychology that examine the ways in which
students take in, organize, and represent knowledge internally. An underlying principle from this research
is that students cannot simply be given knowledge; they must construct
knowledge in their own minds. This perspective on learning, known as
constructivism, is rooted in the earlier work of cognitive theorists such as
Piaget, Bruner, and Vygotsky (Piaget, 1952; Bruner, 1966; Vygotsky, 1978).

Learning
from the constructivist perspective is seen as a “self-regulated process of
resolving inner cognitive conflicts that often become apparent through concrete
experience, collaborative discourse, and reflection” (Fosnot, 1993).
Constructivist theorists maintain that active learning in a socially
interactive environment is a necessary condition for meaningful understanding.
The primary role of the teacher is to structure learning situations in which
students experience a sense of cognitive disequilibrium. Students take on more
responsibility for monitoring and regulating their own thinking and
learning. The job of the teacher is to
create learning environments that place students in a position of constructing
or building meaning and understanding for themselves. Thus, from this
viewpoint, learning is viewed as a transformative process involving conceptual
change, not merely a process in which students recite back information that
they have passively accumulated.

In the
traditional mathematics class, the instructor might explain a concept, demonstrate
several examples of how to solve problems involving that particular concept,
and then ask students individually to work through problem sets. The emphasis
is on learning computational procedures. In a cognitively oriented or
constructivist mathematics class, after an introductory discussion led by the
teacher, students are given complex and engaging problems which are situated
and embedded in a meaningful context. These contextualized problems require
students to engage not only in computational procedures, but also in sustained
mathematical reasoning. Students might be asked to write about the problem,
create graphs and drawings, manipulate objects, and in other ways actively make
sense of the problem. Working collaboratively with peers, they pose hypotheses,
justify ideas, formulate solutions, and explain their personal understandings
in their own words. An emphasis in a constructivist mathematics class is on
making sense of mathematical ideas.

Because
constructivist approaches to instruction attempt to engage students in solving
meaningful problems using real world data, the use of technology in mathematics
instruction is seen as holding enormous promise. Portela (1999) noted that,
“Although not new, constructivism has more relevance in education today because
the dawn of the Information Age has rapidly increased the amount of, and
accessibility to, information.” He stated that there is a scarcity of studies
about how students learn in technology-based environments and described the
results of his case study of a mathematical communication and technology course
for mathematics graduate students. He
reported that the focus of teaching and learning shifted from knowledge
transmission to knowledge building and he credits the Internet with aiding this
shift. He also indicated that being
connected to the Internet in the classroom provided opportunities for more
active learning by encouraging students to learn by doing, concentrate on the
subject matter rather than simply copying notes from the board, participate in
class discussions, work at their own pace, receive individual help from the
instructor without holding back the rest of the class, and access related sites
“right on the spot.”

Papert
(1980) has written extensively about ways technology can be used in mathematics
classrooms to promote “agency” or student ownership of mathematical thinking.
Papert indicated that technology-rich environments which utilize inquiry-based
approaches to learning math have the potential of significantly altering the dynamics
between instructor and student, as well as between the student and the
mathematical content being studied. Papert also noted that in such learning
environments students exercise more authority over their own thinking and
develop a deeper intuition for mathematical problem solving.

Cooper (1999) stated that, “In its use as an educational medium in a carefully structured learning environment based on the principles of cognitive research, the computer may serve as a strong mechanism for reorganizing mental processes, aiding students in developing the hierarchical structure for their new knowledge.” She also noted that although original uses of the computer focused on drill, practice, and individualized learning, use of the computer as an instructional tool is most effective in collaborative learning environments.

The nature of learning mathematics interactively and in technology rich environments has been the focus of several researchers. Dubinsky and Schwingendorf (1997) investigated the effectiveness of teaching calculus using small cooperative learning groups in a computer laboratory setting. They concluded that the “use of computer activities and small group problem solving to implement a theory of learning mathematics has shown itself to be a very promising direction” (p. 241). Dubinsky and Schwingendorf (1997) also noted concerns such as the need for computer software to be “as easy as possible to use” (p. 235) and the need for instructors to develop mechanisms for ensuring that each student is held individually accountable for participating in the collaborative lab environment.

Asiala and Dubinsky (2000) investigated the attitudes and academic performance of students enrolled in college math courses that utilized “innovative pedagogical strategies” (p.1). These non-traditional teaching approaches included the use of computers and cooperative learning. Asiala and Dubinsky (2000) concluded that these innovative approaches led to improvement in student learning. The researchers also reported a tendency for students who had successfully completed an innovative math course to take more mathematics courses in future semesters than were taken by students who had completed traditionally taught math courses.

Other researchers, including Davidson (1990) and Dubinsky and Fenton (1996), have examined in depth issues surrounding the use of active learning strategies and student collaboration in college mathematics classrooms. Furthermore, Project CLUME has focused on ways collaborative learning strategies can be used effectively to foster deep understanding of mathematical concepts (Rogers et al., 2001). Research in this area has addressed issues of significance to our current study, such as the question of how to get students to reflect on the quality of their social interactions in collaborative learning situations.

We agree with Pea (1987) that “the computer can serve as a fundamental mediational tool for promoting dialogue and collaboration on mathematical problem solving” (p. 125). Research such as that of Pea and Dubinsky (Dubinsky & Fenton, 1996; Asiala & Dubinsky, 2000; Dubinsky & Schwingendorf, 1997) has raised many of the questions we raise here. But much of this previous work predates HTML. Since HTML is a new and potentially powerful educational tool, it is necessary to reexamine previous work in light of this technology. Although many of the questions we raise are old (important and still open) questions about cooperative learning and technology, we also raise some important new questions particularly concerning issues about students’ use of HTML. For example (as we will document later in this paper) the following questions arose from observing students working in this environment: What cognitive conditions prompt students to use hot links? What are the ways students seek and receive help in the computer learning environment? How can we get students to use help tools built into computer environments so they spend less time floundering and getting frustrated on syntactical problems? Our research also differs from previous work in that, instead of making observations of students working as a group in a classroom, we collected data in such a way as to allow for very close examination and documentation of students' behaviors in a way that has not appeared in the research literature.

Many of the questions raised by the current study on interactive, technology-rich approaches to learning are relevant to active learning environments in general. A significant research literature exists on active learning and cooperative learning in mathematics as well as on technology in mathematics education. However, significantly less research has been conducted which closely examines the nature of the interaction of active, collaborative learning environments with technology-rich mathematics learning environments. More research is necessary to understand the ways in which socially interactive approaches to teaching interact with the use of computers in mathematics classrooms. It seems important to identify characteristics which are unique to technology-rich classrooms.

Heid et al. (1998) reviewed the empirical research on mathematics learning using computer algebra systems (CAS). Reporting only on those studies that involved systematic data collection and analysis, Heid et al. identified 64 studies from journal articles, conference proceedings, and dissertations that addressed CAS. They examined five sets of outcomes – achievement, affect, behavior, strategies, and understanding – and concluded that the research justifies incorporating CAS into the established mathematics curriculum. In particular, the researchers noted that, “The majority of studies examined indicate that there is no loss in proficiency in computational skills and these results are obtained in the absence of a CAS on the research instrument. Cumulatively, these studies suggest that use of a CAS in the learning of mathematics… can result in higher overall achievement in terms of both procedural and conceptual items.” They concluded, “CAS research is now ready to enter a new phase. Researchers must no longer focus their efforts on corroborating the ‘no-harm-done’ conclusion. They must no longer be satisfied to establish that conceptual understanding is better. They must, like some of the more recent pioneers, investigate the very nature of learning with CAS.”

**Methods.** The purpose of this study is to develop, based on observations of
students’ work, a set of research questions that will help us understand the
nature of learning in these more interactive technological environments. Particularly since little research has been
done in this area, this phase of the research must be exploratory in nature. We
feel the most appropriate research method for this type of research is Glaser
and Strauss’s notion of *grounded theory*,
which they described as “the discovery of theory from data systematically
obtained from social research,” which they contrasted with “theory generated by
logical deduction from *a priori*
assumptions” (Glaser & Strauss, 1967). In the first stage of building a
grounded theory, researchers examine their data with the purpose of
establishing categories and/or constructs unbiased by prior conceptions. The
data are studied "to identify significant phenomena, and then determine
which phenomena share sufficient similarities that they can be considered
instances of the same concept. … You will need to define the category, give it
a label and specify guidelines that you and others can use to determine whether
each segment in the database is or is not an instance of the
category." (Gall, et al., 1996)
Our data gathering methods can be described using Romberg’s (1992) method of
clinical observations where “the details of what one observes shift from
predetermined categories to new categories, depending upon initial
observations.”

**Subjects.** The subjects studied were college students taking a mathematics
course (at a level beyond calculus) at a major research university. The
students had been using CCP modules for at least several weeks and were
somewhat familiar with *Maple* (the
computer algebra system) and the format of the modules. CCP modules were
required for their current mathematics coursework and, on average, these
students had completed one module per week.
For all but one pair of the subjects, the particular module used in the
study was a specific requirement for the course in which they were
enrolled. The subjects volunteered to
be videotaped for the purposes of this study (and were each paid $25). Their participation in the study consisted
of working through one of the CCP modules with a partner. The students working together were
videotaped and, simultaneously, their computer output was collected on a
separate videotape. Each session was 1-2 hours in length and data were
collected from a total of 10 pairs of students.

**Data Collection.** The data were collected in a quiet office
where the pair of students could work comfortably. Also in the room (though not always for the entire time) was one
of the investigators. On the table was
a computer with *Maple*; students were
also given paper and pencil. Also in
the room was a video camera to record their work and a scan converter connected
to a VCR and television to record their computer output. When the students arrived the investigator
explained the general purpose of the research and asked the subjects to sign
consent forms and forms to be paid. He
then helped them find the URL for the module and asked them to begin their
work.

Although (as discussed above) other researchers have investigated issues similar to those addressed in this study, the method of the current study is somewhat different. In the current study, not only were subjects videotaped, their computer output was simultaneously recorded and the focus of the video camera remained on the two students sitting at the computer for the entire session (as opposed to videotaping of a larger classroom). These aspects of the current study’s methodology provide the researchers with an opportunity to closely examine and document student behavior.

**Analysis.** A principle of grounded theory is that one generates conceptual
categories from evidence (Glaser & Strauss, 1967). The authors of this paper (one mathematics
educator and one educational psychologist) observed each of the tapes several
times and noted those issues that appeared to facilitate or inhibit learning or
that appeared to be important factors in understanding the process of learning
taking place. Another principle of
grounded theory is that the categories that “emerged from the data are
constantly being selectively reformulated by them. The categories, therefore, will fit the data, be understood both
to sociologists and to laypeople who are knowledgeable in the area, and make
the theory usable for theoretical advance as well as for practical application”
(Glaser & Strauss, 1967). The following three categories emerged: (1) the role of the teacher; (2) types of
behavior, thinking processes and self-monitoring as students engage in
collaborative interaction; and (3) issues raised directly by the
technology. We have selected several
excerpts from the videotapes that illustrate these concepts. For each of the vignettes, we discuss
aspects of these categories that are reflected in the data. We then discuss researchable questions
raised by our analysis.

In order to help the reader place these vignettes in the context in which these students would normally be working, we present the following description of the beginning of a typical class in which these students would be working on these CCP modules:

On a day when CCP computer-based modules are being used, the learning environment looks significantly different from a traditional mathematics class. Typically, after gaining the attention of students and taking care of administrative announcements, the instructor gives a brief overview of and introduction to the lesson. The purpose of this initial teacher-directed overview is to activate students’ prior knowledge, introduce new terminology and procedures, and to provide students with a conceptual anchor. Students are then assigned to pairs (or get with a previously assigned partner). For most CCP modules, students work cooperatively with a single partner, but each pair of students belongs to a larger support team made up of four students. Each pair works collaboratively on a single computer. Roles are not assigned, so students must decide between themselves who will control the keyboard, point the mouse, read the problem, and take primary responsibility for the variety of tasks required by the learning activity. Once settled in, each of the two students typically reads the introduction on the computer screen and then they collaboratively engage in problem solving.

**Vignette 1.** In this first vignette, Jim and Mary worked on the Equiangular Spiral module. The subjects noted that the graph of the data seemed to be exponential. They read the next instruction, “Experiment with logarithmic plotting of the data to determine the type of growth,” but did not act on it (or perhaps did not understand what it was asking). They then went on to the next instruction, “Find a formula for a continuous function *r = R(t)* such that *R(n)* reasonably approximates the *n*th measured radius, *r _{n}*.”
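The logarithmic-plotting step the students skipped can be sketched for readers outside Maple. The following Python sketch is our illustration, not part of the module, and the radius data are invented: if measured radii grow exponentially, their logarithms grow linearly, so the successive differences of log *r _{n}* are roughly constant and estimate *k* in *R(t) = r _{0} e^{kt}*.

```python
import math

# Invented radius "measurements" following r_n = r0 * e^(k*n);
# the module's actual shell data are not reproduced here.
r0, k = 1.0, 0.18
radii = [r0 * math.exp(k * n) for n in range(1, 11)]

# Logarithmic plotting idea: if growth is exponential, log(r_n) is
# linear in n, so successive differences of the logs are constant.
logs = [math.log(r) for r in radii]
diffs = [logs[i + 1] - logs[i] for i in range(len(logs) - 1)]

# The common difference estimates k; log(r_1) - k estimates log(r0).
k_est = sum(diffs) / len(diffs)
r0_est = math.exp(logs[0] - k_est)
print(round(k_est, 3), round(r0_est, 3))  # → 0.18 1.0
```

On a semilog plot the same fact shows up visually: exponential data fall on a straight line, which is what “experiment with logarithmic plotting to determine the type of growth” asks the students to notice.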

*Discussion.* In this vignette, the
students struggled with the software, but were not frustrated and did succeed
in accomplishing what was asked. Much
of the subjects’ effort in this vignette involved getting *Maple* to do the computations that they wanted done. Because Mary and Jim were hesitant to ask
for help, they proceeded through the module at a slow pace. This raises certain questions: Was this an
efficient use of the students’ time? If
this was a classroom and the investigator was the instructor, should he have
intervened more often to help the students move along more quickly? How does
the instructor know when and in what situation to intervene?

We also
noticed in this vignette, as in others, that students are willing to persist
for a long time trying different things on the computer. In particular, trial and error can be much
quicker on a computer than with pencil and paper and because of this, students
are less likely to demonstrate self-regulatory behavior.

This vignette illustrates three issues: (1) as in any active classroom, there are the questions of when and how the teacher should intervene, (2) the computer learning environment seems to affect students’ ability to manage their time efficiently and (3) whereas computers solve certain pedagogical problems (such as time consuming calculations), they create others (such as learning the nuances of the software).

**Vignette 2.** This vignette also involved Jim and Mary and began several
minutes after Vignette 1. They were
asked to “construct a function *r = r
(theta)* that describes the shell radius as a function of polar angle.” They typed in:

*x := theta -> R * cos (theta); y := theta
-> R * sin (theta)*

Because *R*
was defined earlier as a function (not as a constant), *Maple* would not plot the parametric equations. After the subjects struggled with this for
about 4 minutes, the following conversation took place:

Investigator: Did
you run into a problem?

Jim: When
we graphed it, we didn’t get anything.

Investigator: What
is *R*? You have *x := theta -> R
cos (theta)*. *R* is the radius, right?

Mary: Yeah.

Investigator: Isn’t
that changing as a function of theta?

Jim: Oh. So we have our *R* formula.

Investigator: You
can just say *R (theta)*. Does that make sense?

Jim and Mary: Yeah.

They tried that and it worked. They were then able
to proceed through the next parts of the module.
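The error behind this vignette is easy to restate in programming terms. The sketch below is our Python analogy (the module itself used Maple, and the values of *r _{0}* and *k* are invented): because *R* is a function of *theta* rather than a constant, it must be applied to *theta* before being multiplied by *cos(theta)* or *sin(theta)*.

```python
import math

r0, k = 1.0, 0.18          # invented spiral parameters

def R(theta):
    """Shell radius as a function of polar angle (a function, not a constant)."""
    return r0 * math.exp(k * theta)

# The subjects' first attempt amounted to using the function object R where
# a number was needed (R * math.cos(theta) would raise a TypeError here,
# much as Maple silently refused to plot). The fix is to apply R:
def x(theta):
    return R(theta) * math.cos(theta)

def y(theta):
    return R(theta) * math.sin(theta)

print(x(0.0))              # → 1.0  (R(0) * cos(0) = r0)
```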

*Discussion.* In this vignette, the
subjects are confronted with a mathematical problem that they may not have been
able to solve on their own, and without solving it, they could not proceed
through the module. Whereas in the
first vignette the investigator/instructor helped move things along, in this
vignette it seems that the intervention of the instructor was essential. This was not an isolated instance of this
sort; we noticed such problems often as we reviewed the data. This raises the
question of what would happen here in the absence of an instructor, e.g., in a
distance learning situation. It also
raises the issue of whether and how such problems can be anticipated by
curriculum developers and what possible technological solutions (such as links
to hints) can be built into the module.

**Vignette 3**. Again we look at Jim and Mary working on the Equiangular Spiral
module, this time towards the end of the module. They read the instructions: “Find derivatives of *x* and *y* with respect to *theta*,
and then combine the results to find *dy/dx*
in terms of *theta*. You may want to
use your helper application for this.” They could not remember how to get *Maple *to compute derivatives and began trying
to find the derivative of *x = r _{0} e^{k theta} cos (theta)* by hand.

Mary: O.K.

Jim: Shut
up.

Mary: (laughs)
Here, it’s just the product rule.

Jim: Yeah. It would be nice if *Maple* would do it for us.

Mary: It
will.

Jim: Yeah.
I want it to do it.

A minute later, working together they got *Maple* to do the calculation.

Mary: You
see, it’s exactly what I just did.

Jim: Yeah
but your way is stupid.

Mary: But
it was quicker.

The instructions asked them to divide *dy/d(theta)* by *dx/d(theta)* to get a formula for *dy/dx*.
This would have been quite tedious with pencil and paper but they were
now able to use *Maple *to do this
computation in a couple of seconds. The
instruction then directed them to evaluate an even more complicated expression
that reduced to *1/k*. They got this result with *Maple* (by now, they were using *Maple *correctly), and Jim said, “Wow. I
want to work this out on paper. I don’t believe that.”
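The computation in this vignette can be checked in any CAS. The sketch below uses sympy rather than the students’ Maple; the final expression is our reconstruction of a quantity that reduces to *1/k* for an equiangular spiral (the tangent of the angle between the tangent line and the radius vector), since the module’s actual expression is not quoted above.

```python
import sympy as sp

theta, k, r0 = sp.symbols('theta k r_0', positive=True)
r = r0 * sp.exp(k * theta)           # equiangular spiral: r = R(theta)
x = r * sp.cos(theta)
y = r * sp.sin(theta)

dx = sp.diff(x, theta)
dy = sp.diff(y, theta)
dydx = sp.simplify(dy / dx)          # dy/dx in terms of theta; r0 cancels

# Our candidate for the "complicated expression that reduced to 1/k":
# the tangent of the angle between the tangent line and the radius vector.
expr = (sp.cos(theta) * dy - sp.sin(theta) * dx) / \
       (sp.cos(theta) * dx + sp.sin(theta) * dy)
print(sp.simplify(expr))             # → 1/k
```

Jim’s disbelief is understandable: nothing in the intermediate expressions suggests that everything but *k* will cancel.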

*Discussion.* Note that Mary first suggested using pencil
and paper but then wanted to use the CAS, whereas Jim first insisted on using
the CAS but in the end did not believe the results without checking using
pencil and paper. The main issue raised
by this vignette is how and why students choose one tool or another. As we’ve seen in other videotaped sessions,
the students used the CAS and hand-held calculators, as well as pencil and
paper. Some students (like Mary and Jim
above) seem to believe their pencil and paper calculations more than the
results of a CAS application. This situation may have been exacerbated by
getting the surprising result of the complicated expression reducing to *1/k*. This issue of what tools to use to
solve a particular problem raises several questions: Does familiarity and
comfort with the tool affect how readily one accepts the results produced with
that tool? Do some students
(particularly those who have been successful in school) receive some
ritualistic satisfaction from doing pencil and paper calculations? Or,
alternatively, in situations like this, where solutions seem to be pulled out of
a hat, do students want to do the hand calculations because of an intrinsic
motivation to understand the surprising results? How do students
check their work, given multiple technological tools?

**Vignette 4.** In this vignette, we examine the work of Andy and Larry working
on a module called “Correlation and Linear Regression” ^{3} with Andy
at the keyboard and Larry to his left.
In this module, they learn about correlation and use that notion to
develop an understanding of linear regression and least squares estimates for
linear data. At the beginning of the
module, the students are given the scores of 15 students on two tests and are
asked to plot the scores on test 1 vs. the scores on test 2. The first thing they did was follow a link
to another file that showed them how to use Maple to make a scatter plot. Andy correctly cut and pasted the zip
command that tells Maple to create coordinates from two sets of variables. They then needed to decide what the x and y
should represent (the correct answer being test1 and test2). The following conversation then
occurred. During this time they were in
the room by themselves; the instructor had stepped out for a few minutes.

Andy: (types
and says out loud): ourdata = zip (x, y) -> [x, y]

Larry: What
does the zip do? Is that just something
in the definition?

Andy: I
have no idea. It just says it right
there.

Larry: You just copied it?

Andy: Yeah.
Why not.

Larry: I thought it was a cool function, it
sounded interesting. I guess just test
1 test 2

Andy: No.
No. We’re not planning on plotting them against each other.

Larry: Yeah
we were. Weren’t we?

Andy: No.

Larry: I thought we were plotting test 1
vs. test 2

Andy: No.
We’re plotting test 1 and test 2 so we want to do these against, just like, the
one, so each number represents a 1.

Larry: Oh wait, we’re just putting the
plots on the same graph and not plotting them against each other?

Andy: Yeah.

Larry: OK.

Andy: I
don’t know if this is going to work.

Since what
Andy said made no sense, they made very little progress. At first, they typed “plot (test 1)” which
produced nothing. Andy then spent a
couple of minutes poking around in the help menu, thinking that he was having
syntax problems rather than a problem understanding the mathematics. After about 4 minutes, Andy said, “You know
what we can do?” and plotted the test1 data vs. the set {1,1,1, … ,1}. Larry said, “That can’t be the normal way to
do it. Interesting though.” Periodically, throughout this process, Larry
politely and without being assertive asked whether Andy was sure that they were
not supposed to be plotting test 1 vs. test 2.
We should note that the module has a link to the glossary for the word
“versus,” yet neither Larry nor Andy suggested following that link. Finally, they looked further down on the
worksheet and realized that they should have been plotting test 1 vs. test
2. They typed this in and got the
correct scatter plot. At this point the
investigator entered and verified for them that they were on the right
track. Interestingly, Andy didn’t seem
embarrassed by his refusal to take Larry’s advice and Larry didn’t seem to
blame him.

*Discussion:* Several issues surface in this vignette.
Perhaps foremost among these issues is the ongoing tension between the desire
of one student to understand the conceptual ideas embedded in the mathematics
problem and the other student’s push to solve the problem and to get through
the assignment as quickly as possible. For example, Larry asks, “What does the
zip do?” Andy indicates that he has no idea what the zip does, seemingly
implying that the primary goal is to finish the problem, not necessarily
understand the underlying ideas and processes. This raises the question: How
can computer learning environments be designed so that they foster learning for
understanding while also using students’ time efficiently, avoiding student
frustration, and preventing those students who want to get the work done
with a minimum of time and effort from missing the point of the lesson?
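As background on the command that puzzled Larry: zip simply pairs corresponding entries of two lists into coordinate points. A minimal sketch of the same idea in Python (the students were using Maple, whose syntax differs; the score lists here are invented for illustration):

```python
# Pair two lists of test scores into [x, y] coordinate points,
# the job Maple's zip((x, y) -> [x, y], test1, test2) performs.
# These score lists are made up for illustration.
test1 = [85, 72, 90, 64]
test2 = [80, 75, 88, 70]

ourdata = [[x, y] for x, y in zip(test1, test2)]
print(ourdata)  # [[85, 80], [72, 75], [90, 88], [64, 70]]
```

Plotting these pairs is exactly what “test 1 vs. test 2” asks for: each student contributes one point.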

A
second issue raised by this vignette is similar to an issue in Vignette 1. Students struggle with the tools (*Maple* software commands) as much as they
do with the mathematics concepts. Andy spends much of his time in this vignette
trying to correct what he perceives to be a *Maple*
syntax problem, when the real problem is his lack of understanding of the
concept of “versus.” Perhaps with
greater metacognitive awareness, Andy might be able to ask himself whether he
is having a problem with the tool or the concept. This raises the following question: In a computer learning
environment is it possible to build into lessons ways to help students develop
metacognitive skills? For example, in
this case, why did neither Larry nor Andy suggest following the link to the
glossary for the word “versus”? What
can instructors and module writers do to strengthen students’ abilities to
recognize and differentiate between tool problems and conceptual
misunderstanding? Can the learning of metacognitive processes be embedded and
situated in computer based modules without significant costs in terms of
instructional time?

Another
issue that surfaces in this vignette concerns the role of the teacher. In
this particular scenario, the investigator permitted the two students to
struggle independently for quite some time before interacting with them.
This did allow Andy and Larry, after a significant expenditure of
time, to discover on their own how to plot the data. However, was this an
efficient use of instructional time? Would more learning have occurred if the
instructor had intervened sooner? What is the cost-benefit ratio of
letting students discover solutions versus intervening and more directly
guiding them through the computer based modules? Again, as was discussed in Vignette 1,
these are questions in any active classroom.

A final
aspect of Vignette 4 concerns the role of student-to-student dialogue in
mathematics computer learning environments. In this particular vignette, Andy
and Larry appear to be sharing ideas, but Andy is not seriously considering
Larry’s questions and Larry is not asserting himself. Andy continues to plod down the wrong path, despite Larry’s
early suggestion that they should plot test 1 versus test 2. In Vignettes 1, 2,
and 3, genuine dialogue seemed to exist between the pairs of students. The
students shared their provisional hypotheses and provided one another useful
feedback. In the case of Andy and Larry, Andy emerges as an assertive but
conceptually mistaken leader. Larry, who correctly understands the mathematics
problem, remains for the most part a passive follower. This raises several
questions about the quality of interactions in any active collaborative
learning situation, and in particular in the types of technology rich
environments described here: In an interactive computer learning environment
where meaningful student dialogue is essential for developing understanding,
what steps can the instructor take to facilitate dialogue? How can mechanisms be built into the computer
modules (or other active learning materials) to get students to reflect on the
quality of their interactions? How
does the instructor structure the lesson to minimize the problem of one student
taking over the learning situation? Can interdependence and shared
responsibility, as well as other aspects of cooperative learning, be more
effectively built into computer modules?

**Vignette 5. **In this vignette, Carl and Kevin are working on the
module called “Correlation and Linear Regression” and after examining formulas
for the correlation coefficient are asked to compute the correlation
coefficient for a data set consisting of four points: {(1,2), (2,3), (3,4),
(4,3)}. They plunged into the exercise
using pencil and paper but when (three and a half minutes later) they were
faced with trying to calculate the square root of 2, Carl asked Kevin, “Do you
have a calculator?” After Kevin looked around and couldn’t find one, the
investigator said, “What are you looking at?” (referring to the computer). The investigator expected that the subjects
would bring up *Maple* but instead Carl
brought up the computer’s scientific calculator (a standard accessory in MS
Windows). Five minutes later they
finished the calculation. The following
conversation occurred (Note: computing the standard deviation is a step in
using the given formula for the correlation coefficient.):

Investigator:
Would it have been easier or
harder to figure out how to get *Maple*
to do the standard deviation?

Carl: If
we had a nice little equation like that it would be fairly easy I think.

Investigator: Actually
didn’t we do that before?

Carl: Up
here? Oh the coefficient thing. Yeah.

Kevin: Oh
yeah that would have been easier.

*Discussion.* In
almost every videotaped lab session, including this one, the researchers noted
three aspects of students' use of tools in computer based learning
environments: (1) the perceptions students have of the array of
"tools" available to them to use to solve a problem; (2) students'
notions of when and where it is appropriate to use a particular tool; and (3)
the degree to which students believe or "trust" that a certain
tool is a reliable means of producing the correct solution to a problem. For example, after beginning their work
using pencil and paper, Carl and Kevin then used the computer's
calculator. By the end of the session,
with the guidance of the instructor, Carl and Kevin began to reflect on their
choices of tools and seemed to realize that *Maple* is a powerful tool they also have at their disposal.

This
vignette raises several potential research questions. For example, even though Carl and Kevin had been using *Maple* for two or three months, the idea
of using the computer algebra system to calculate a standard deviation did not
occur to them. This raises the
following questions. Prior to putting
students into a computer based learning environment, can we design ways to
introduce students to computer algebra systems that help them feel comfortable
using them? Can we design computer
modules in ways that get students to think more reflectively about their choice
of tools (pencil/paper, calculators, computer calculator, CAS)? What factors underlie students' perceptions
of the accuracy, reliability and efficacy of a particular tool?
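To make concrete how little the CAS step would have cost them: the standard deviation Carl and Kevin computed by hand is a single call in most computing environments. A minimal sketch in Python (a stand-in for the students' Maple), checking the definition against a library routine on the test-1 values from the four-point data set:

```python
import math
import statistics

# x-values from the module's data set {(1,2), (2,3), (3,4), (4,3)}
x = [1, 2, 3, 4]

# Sample standard deviation computed from the definition...
xbar = sum(x) / len(x)
s_manual = math.sqrt(sum((xi - xbar) ** 2 for xi in x) / (len(x) - 1))

# ...agrees with the built-in routine, just as a careful pencil-and-paper
# computation should agree with a CAS result.
print(s_manual, statistics.stdev(x))  # both about 1.291
```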

**Vignette 6. **Again, we describe two students working on
the module “Correlation and Linear Regression”. The students, Alex and Neil, are roommates and are both taking
linear algebra. After a couple of
minutes of work, the following conversation occurs:

Alex: Why
don’t you type?

Neil: Are
you sure?

Alex: Yeah.

Neil: Why
don’t you want to type?

Alex: You’re
more familiar with the commands.

Investigator: Who
usually types?

Alex: He
did before because he knew *Maple* and
then I did the last couple.

Neil: We take turns.

Alex: Yeah,
it’s his turn anyway.

After
this, they returned to work on the module.
During this period, they sometimes thought aloud and sometimes talked to
each other (but they looked at the screen rather than each other) and they
often pointed at objects on the screen as they talked.

Soon after
they switched seats, Neil asked the investigator, who was still overseeing
their work, “How do we turn radicals into decimals?” (*Maple *output is exact
unless you ask for the decimal approximation.) The instructor/investigator
responded, “you go evalf” (evalf is the command to compute a decimal
approximation). Several minutes later,
after the instructor/investigator left and was sitting in the adjacent room,
they acknowledged in their discussion that they did not know how to interpret
the word “versus” (as in “test 1 versus
test 2”). Though the word “versus” was
highlighted as a hot link, they did not click on the link. Eventually, they figured out what versus
meant using the context of questions that came up later.
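The exact-versus-decimal behavior behind Neil's question can be illustrated with Python's fractions module (a loose analogue only; Maple's evalf also converts symbolic quantities such as radicals):

```python
from fractions import Fraction

# Like Maple, Fraction keeps rational arithmetic exact...
exact = Fraction(1, 3) + Fraction(1, 6)
print(exact)         # 1/2

# ...until a decimal approximation is requested, the role evalf plays in Maple.
print(float(exact))  # 0.5
```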

About 10
minutes into the tape, when they executed the command to plot the scatter plot
of test 1 versus test 2, the plot was displayed incorrectly. This was not the result of any error on the
students’ part but was due to some technical problems concerning the memory of
the machine and *Maple*’s interaction
with the hardware. The students’ first
reaction was to assume that the output was correct and they tried to construct
some meaning out of the incorrect output (obviously this was a challenge). After realizing that the output was
incorrect, Neil and Alex tried all sorts of things such as changing the format
and display options hoping to get the graphs to come out right.

*Discussion.* One issue that appeared in many of the lab
sessions concerned which roles the lab partners would assume;
for instance, who would take responsibility for typing, using the mouse,
and offering initial ideas. While in
most cases the roles were not explicitly discussed, in this case Neil and Alex
directly addressed the issue of who would use the keyboard. This raises the questions of whether roles
should be assigned, how students work out roles in the absence of assignments,
and what impact, if any, the roles have on learning.

This ties
in again to the question of the role of the instructor: Should the instructor
assign roles to students? Although
assignment of roles has been discussed in the literature on cooperative
learning (Slavin, 1995), do different
issues arise in the interactive computer environment?

Another
issue in this vignette concerns the different ways that students seek help.
Neil and Alex encountered difficulty in interpreting “versus,” yet they either
failed to recognize that versus was a hot link in the HTML document, or they
were reluctant to use it. Why do students sometimes use links and other times
ignore them? What cognitive conditions prompt students to use hot links? This vignette also re-emphasizes issues seen
in earlier vignettes: What are the ways students seek and receive help in the
computer learning environment? What is the role of the instructor in providing
support and guidance?

A related
issue that appears in this vignette has to do with the intellectual dialogue
between the students during the session. One of the strongest features of the
interactive computer lab approach seems to be the way that the labs foster
collaborative discourse. This type of dialogue is seen in this vignette when
Neil and Alex think aloud, offer hypotheses, and make predictions. However, the
question arises: Is this type of dialogue typical of this computer learning
environment? Does this environment foster meaningful collaborative discourse in
ways that would not occur without the computer, and why?

A final issue that surfaced in this vignette was the technical problem with the interaction of the software and hardware. The hardware memory problem was very common and frustrating to many students (and the investigator!). This was not consciously built into the study, but it points out the serious difficulties that can arise when technical problems occur. Typically, students think they did something wrong, as in the reaction of Carl in a different session: “Oh! What did I change? The computer hates me.” Often, even the instructor (as in this case) cannot solve the problem, and this can seriously upset the flow of the lesson, especially if it depends on the computer output. This raises the same question we discussed in the fifth vignette concerning the impact that these technical problems have on the sense of credibility and trust that the students have in the computer as a tool. Another question concerns a metacognitive issue: How can we help students learn how to check the reasonableness of an answer and determine whether discrepancies are due to mathematical or technical errors?

**Vignette 7. **In this last vignette, we describe Dan
and Aaron working on the module “Correlation and Linear Regression.” Dan and Aaron are among the best students in
a linear algebra class. Dan is an
electrical engineering major and Aaron is a mathematics/economics major. Dan
was working comfortably at the keyboard while Aaron was thinking critically and
actively about the questions being asked. They had just examined the formula
for the correlation coefficient and were asked to compute the correlation
coefficient for a data set consisting of the points: {(1,2), (2,3), (3,4),
(4,3)}.

Aaron: Could
you go [scroll] up so I can see the formula?

Dan: There’s
a way to do this in *Maple*. I don’t remember how.

Aaron: To
find *r*?

Dan: To
sum a list. I don’t remember how right
now.

Aaron: You
can do the standard deviation, can’t you? (meaning on *Maple*)

Dan: Yeah.

Dan enters “X1 := [1,2,3,4]; Y1 := [2,3,4,7];” and tries to remember the syntax for standard deviation.

Aaron: It’s at the top if you can’t remember.

Dan: (mutters)
I can’t remember

Dan
checked the syntax and computed the means and standard deviations of X1 and Y1
using *Maple. * Aaron began to input that information, using
pencil and paper, into the formula for the correlation coefficient. While Aaron got involved in the pencil and
paper computation, Dan, intently viewing the computer screen, tried to remember
how to sum a list. He used trial and
error but did not use the help menu.

Aaron: I can do these by hand (referring to
summing the products of *x_{i}* and *y_{i}*).

Aaron
completed the computation, making some errors, and got 3/80 * sqrt(10). Dan had *Maple*
evaluate this and got approximately 0.12.

Aaron: That’s really low.

Dan: Yeah,
these two are correlated really well.
It’s got to be higher than that.

Dan tried
to create a scatter plot, while Aaron and the investigator tried to find the
arithmetic error in his calculation. Dan
ran into trouble getting *Maple* to
plot the scatter plot.

Dan: It
doesn’t seem to show for some reason.

Investigator: That’s
strange.

Dan: The
graph shows 3 points, not 4.

Investigator: Where
would the fourth point be?

Dan: There
should be one right here but it [the correlation] still should be better than
0.1

Aaron
continued looking for the computational error during this interchange and the
investigator returned to helping him, suggesting that instead of looking for
the error, he simply redo the calculation.
Meanwhile Dan, quietly and persistently, tried to get *Maple* to compute the correlation
coefficient by using the sum command but still couldn’t get it to work.

Dan: I do this all the time in my other class. (He uses *Maple*
in his electrical engineering class.)

Investigator: We’ll
check it with *Maple* later.

Aaron: I think it’s 2/sqrt(10). I think that’s right. I made two algebra mistakes and I found both of them.

Dan: (evaluating the expression on *Maple*) Yeah, that’s reasonable.

In writing up the answer and explanation, Aaron made suggestions. Dan listened to Aaron’s suggestions and, though saying nothing, incorporated both Aaron’s input and his own ideas. After reading what Dan typed, Aaron said, “I think that explains it.”

*Discussion.* In this vignette we see
several of the issues we observed in other vignettes. For example, the two students struggle with computer software. In
this case they can't remember the *Maple*
syntax needed to sum a list. This raises the question: How do we provide
students with adequate training in computer algebra software without taking
away from instructional time? A second issue raised in this vignette is the
problem students appear to encounter with self-regulation and time management.
Dan inefficiently used trial and error to discover how to sum a list. They
devoted too much of the lab time to trying to create a scatter plot. This raises
several questions: Are students more likely to have difficulty managing their
time and regulating their thought processes in a computer based mathematics
lab? As anyone who uses a computer knows, sometimes a task which at first would
appear to take a few minutes on a computer can end up taking much longer,
perhaps because it is so easy to explore options on the computer.

This vignette also raises the question of the role of the instructor in an interactive computer environment. How directive should the instructor be? In this case, the instructor finally intervened and directed Aaron to repeat the calculation, instead of inefficiently looking for his error. What responsibility did the instructor have in this vignette for making sure the lesson progressed at a reasonable pace?

Aaron and
Dan engaged in productive collaborative discourse and cooperative problem
solving. There are several good
examples of the in-depth cognitive processing that conceptually based
approaches to mathematics are designed to elicit. For example, when Aaron and Dan compute a correlation coefficient
of 0.12, they experience a sense of
cognitive disequilibrium. “It’s got to
be higher than that,” Dan remarked. The
two students appear to have made a prediction or estimate of a higher correlation
coefficient based on their initial analysis of the data. When they computed a lower number, they
began to question its reasonableness.
Is questioning the reasonableness of an answer more or less likely to
occur in a computer-based program? Will
some students tend to accept as reasonable what the computer produces, because
they view the computer as infallible?
How can we design lab experiences which encourage students to make
predictions, estimate reasonable answers, and then compare their estimates to
computer generated products? How can we build into computer lesson modules
tasks which elicit the high level mathematical cognitive processing we want
students to engage in?
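As a check on the arithmetic in this vignette: the Pearson correlation coefficient for the data set {(1,2), (2,3), (3,4), (4,3)} can be computed in a few lines. A sketch in Python (a stand-in for the Maple computation the students attempted):

```python
import math

x = [1, 2, 3, 4]
y = [2, 3, 4, 3]
xbar, ybar = sum(x) / len(x), sum(y) / len(y)

# Pearson r = sum((xi-xbar)(yi-ybar)) / sqrt(sum (xi-xbar)^2 * sum (yi-ybar)^2)
num = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
den = math.sqrt(sum((xi - xbar) ** 2 for xi in x) *
                sum((yi - ybar) ** 2 for yi in y))
r = num / den
print(r)  # about 0.632, i.e. 2/sqrt(10)
```

The result agrees with Aaron's corrected answer of 2/sqrt(10) and confirms the students' intuition that 0.12 was too low for data this strongly correlated.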


**Summary and Conclusions:**

The
research presented here is the first stage of a process whose purpose is to
develop an understanding of how students learn in technology-rich environments
for the mathematics classroom. By
carefully observing student work in a particular technology-rich environment
(i.e., the Connected Curriculum Project), we generated a set of questions for
further study. These questions were not
derived a priori from a theoretical perspective but were derived from the data.
Using grounded theory as a methodological approach, the data served as a first
step in constructing a theory that will explain learning in this environment.

The analysis of the data led to the formulation of three categories of research questions: (1) What is the role of the instructor in this environment? (2) What types of behavior and thinking processes are students engaged in as they work together in front of the computer, and how can the modules be written to facilitate students' self-monitoring and effective collaborative interaction? and (3) What opportunities and obstacles are raised by the technology itself? Research in each of these areas has important implications for curriculum developers, mathematics instructors, and students.

For each of these
categories, we will summarize some of the issues and questions that arose from
our observations:

1.
*The role of the instructor*. As in any active
classroom, the instructor must confront questions of when and how to provide
support and guidance to students who are engaged in complex problem
solving. We observed incidents where
the teacher’s intervention was critical for the students’ progress. In other cases, we observed students
floundering and making little progress because of the lack of help. We also saw
students struggle, perhaps inefficiently, with a problem, but eventually solve
it on their own. The questions raised
here include: What should be the role of the instructor in a computer based
mathematics class? How does the instructor make decisions about intervening? In
an interactive computer mathematics lab, how does the instructor ensure that
instructional time is used efficiently? Is learning more likely to occur if the
instructor intervenes whenever students encounter difficulties? Should the
instructor allow students ample time to discover solutions on their own? How, if
the course is being delivered via the Internet, could software developers
simulate the role of the instructor in the lab? For example, what kinds of hints (or nested sequence of hints)
could be provided and how can this be done intelligently (in the sense of
artificial intelligence)?

2.
*Types of behavior, thinking
processes and self-monitoring as students engage in collaborative interaction*. Much of
the focus of our observations was on the ways students chose to work with one
another as they engaged in collaborative problem solving. For example, we
observed students making decisions about which tools to use to solve the
problems presented to them on the computer. One of the recurring themes
concerned how, when, and why students chose one tool or another (e.g., pencil
and paper, a calculator and/or CAS).
Analysis of the data raised many questions concerning the perceptions
students have of the array of tools available to them. We also observed
students struggling with what they thought were tool problems, when in fact
they had conceptual misunderstandings. How can students recognize and differentiate
between tool problems and conceptual misunderstanding, and how can instructors
help them?

We also observed how, as
students worked in different situations, lab partners would assume certain
roles such as hypothesizer, verifier, and recorder. We also saw students
deciding who would take responsibility for typing, using the mouse, and
offering initial ideas. At times,
decisions of this sort were consciously made; at other times, the students
seemed to choose roles without discussion. These observations raise the
question: Would students benefit from assigned roles? Should roles be
structured to minimize the problem of one student taking over the learning
situation?

We saw some students trying to understand the underlying
concepts while others were trying to get through the lab as quickly as
possible. In an interactive computer learning environment where meaningful
dialogue is essential for developing understanding, can mechanisms be built
into the computer modules to get students to reflect on the quality of their
interactions? Can interdependence and shared responsibility, as well as other
aspects of cooperative learning, be built into computer modules?

We also noticed students using (or not using) links in many
different ways. Some of these hot links
were labeled as hints; others were links to a glossary or other resources. Some students clicked on the hint hot links
immediately, some waited and clicked if necessary and some didn’t link to the
hint even when they were stuck. What cognitive conditions prompt students to
use hot links? What are the ways students seek and receive help in the computer
learning environment? How can we get
students to use help tools built into computer environments so they spend less
time floundering and getting frustrated on syntactical problems? These questions about links are questions of
both cognition (how students are thinking about the problems at hand) and metacognition
(their thoughts about their thinking).

We saw instances of productive dialogue as students tried to solve the problems presented to them. What produces this type of constructive dialogue: the computer learning environments, the content presented, the group environment or project-oriented nature of the work? In other words, how do the physical, intellectual, social and pedagogical environments interact to produce learning? Does the computer lab environment foster meaningful collaborative discourse in ways that would not occur without the computer, and, if so, why? What steps can be taken to facilitate dialogue in computer-based mathematics classes?

In terms of self-regulation, the computer learning environment seemed to affect the students’ ability to manage their time efficiently. Is there a positive side to students becoming so involved in problem solving that they lose track of time? Where should the balance lie between open-ended discovery and a focus on getting the correct answer in a set amount of time? We have noticed that students, because they must start doing something even before they've understood the problem, often do not reflect before doing a calculation. In fact, this is true of students using hand calculators or pencil and paper as well as students using computer algebra systems. Are students more likely to jump into difficult computations or use guess and check strategies when they have access to a computer algebra system than when they are using hand calculators or pencil and paper? To what extent does this strategy promote learning even when they are going down the wrong path entirely? Can the learning of time management, self-regulation, and metacognitive processes be embedded and situated in computer based modules without significant costs in terms of instructional time? And (relating to problems with the technology), how can we help students learn how to check the reasonableness of an answer and determine whether discrepancies are due to mathematical or technological errors?

3.
*The technology itself*.
Whereas computers solve certain pedagogical problems (such as time consuming
calculations), they create others (such as learning the
nuances of the software). Students
struggled with the tools (*Maple* software
commands) as much as they did with the mathematics concepts. And the technical problems that occurred
with the interaction of the software and hardware raise questions of how
students and teachers react to such problems.

Clearly, not all of these questions fall neatly into a single category. For example, the question of whether the instructor should assign roles to students depends on understanding what roles students assume, how they assume those roles, and the effect those roles have on learning. This is not an exhaustive list of issues. We can imagine many other questions, but, consistent with the principles of grounded theory, we are limiting the discussion to the evidence presented in the data.

In interpreting these data, it is important to realize that
these students were talented students doing mathematics at a level beyond
calculus and using specific software in a laboratory setting. It is not our purpose here to generalize
these results to a larger population but to use these observations to suggest
areas for future study. It is also
important to note that each entering class of students brings more familiarity,
more comfort, and more sophistication with using educational technology. It is not clear which of the problems faced by the
subjects in this study will still be problems for students several years from
now.

There were many issues not addressed in this study, such as method of instruction and assessment, the physical environment of the classroom, and affective issues. These are certainly areas for future study. And as the categories above are reformulated and refined as we collect more data and develop a theory, the need to triangulate our observations (that is, to verify our observations by interviewing subjects and collecting other sources of data) will be crucial.

Above, we
have raised many questions that are suitable for immediate and more focused
study. In conclusion, we suggest
several follow-up studies based on the following questions:

1.
What
are the implications of these observations for distance education? What would be different when students work
alone as opposed to working with a partner?
Will they learn the “right” lessons? Can problems such as the
ones identified in this study be anticipated by curriculum developers of
distance learning materials, and what technological solutions can be
built into these materials?

2.
Which
of these issues arise in a real classroom setting versus an experimental
setting? Which don’t? What other issues arise?

3.
Initially,
we thought the person who had control of the keyboard might have been the more
active learner, but we have seen instances where the person who was not
burdened with keyboarding was free to think more about the mathematical
content. We would suggest follow-up studies that focus on particular aspects of
the way students work together. These studies should include clinical
observations along with interviews and other data collection.

4.
How
does the speed, allure and stimulation of computers affect the ways students
solve problems? We would suggest
focusing in great detail (perhaps with think aloud protocols) on a comparison
of students working on a task with pencil and paper as opposed to computers. Among other things, such a study could
document how computers affect student time management.

5.
What
can we learn from existing research in other areas (e.g., cooperative learning,
problem solving approaches to instruction)?

6.
How
are the questions raised here different in an active learning environment
without computer technology?

We find this last question to be particularly important. Many of the questions and issues raised by the current study on interactive technology rich learning are relevant to active learning environments in general. For example, the question of how an instructor motivates students to reflect on the quality of social interactions in a collaborative work environment is important in all cooperative learning situations, not just technology-rich paired learning activities. Throughout our work on this research project, we found ourselves asking whether a particular instance of behavior was unique to a computer-based learning environment or common to all active learning situations. As Dubinsky and other researchers (Asiala & Dubinsky, 2000; Dubinsky & Schwingendorf, 1997) have pointed out, it is difficult to differentiate the impact of technology from the impact of active or cooperative learning on student understanding. This is one of the first questions we plan to address in future research.

As we acknowledged, many of the important and interesting questions concerning cooperative learning in interactive technological environments have been examined by previous researchers. In our grounded approach, many of these same issues emerged. Another important study that needs to be done is a comprehensive review of the research literature that documents what is known about each of these questions and what remains to be learned.

Our long-term research program involves the creation and validation of a model of the learning and teaching of mathematics in a technology-rich environment. The model will examine the nature and importance of the relationships among the following components:

- student
- teacher
- content and context
- materials (software, text)
- method of instruction and assessment
- physical environment of the classroom
- affective environment (e.g., classroom atmosphere)

We plan to identify sub-components of these factors. For example, student issues might include prerequisite knowledge, attitudes, motivation, and learning style. We hypothesize that there will be significant interactions among the sub-components, both within and between the larger components of the model. We realize that this is a very bold and ambitious agenda. We plan, working with others over a period of years, to make progress toward an understanding of the relationships among these components.

^{3} http://www.math.duke.edu/education/modules2/materials/test/test/

**References**

Asiala, M., & Dubinsky, E. (2000). Evaluation of Research Based on Innovative Pedagogy in Several Math Courses. Unpublished report.

Battista, M.T. (1999). The Mathematical Miseducation of America's Youth. *Phi Delta Kappan, 80*(6), 425-433.

Bookman, J. & Blake, L.D. (1996). Seven Years of Project CALC at Duke
University- Approaching a Steady State?
*PRIMUS, 6* (3), 221-234.

Bruner, J.S. (1966). *Toward a Theory of
Instruction*. New York: Norton.

Chambers, J., & Bailey, C. (1996). Interactive Learning and Technology in the US Science and Mathematics Reform Movement. *British Journal of Educational Technology, 27*, 123-133.

Colvin, M.R., Moore, L., Mueller, W., Smith, D., & Wattenberg, F. (1999). Design, Development, and Use of Web-Based Interactive Instructional Materials. In G. Goodell (Ed.), *Proceedings of the Tenth Annual International Conference on Technology in Collegiate Mathematics*. Reading, MA: Addison-Wesley.

Cooper, M.A. (2000). Cautions and Considerations: Thoughts on the Implementation and Evaluation of Innovation in Science Education. In Kelly, A., & Lesh, R. (Eds.), *Handbook of Research Design in Mathematics and Science Education* (pp. 859-876). Mahwah, NJ: Lawrence Erlbaum.

Coyle, L., Moore, L., Mueller, W., & Smith, D. (1998). Web-Based Learning Materials: Design, Usage, and Resources. In *Proceedings of the International Conference on the Teaching of Mathematics* (pp. 71-73). Somerset, NJ: Wiley.

Davidson, N. (1990). *Cooperative Learning in Mathematics: A Handbook for Teachers.* Menlo Park, CA: Addison-Wesley.

Dubinsky, E., & Fenton, W. (1996). *Introduction to Discrete Mathematics with ISETL.* New York, NY: Springer-Verlag.

Dubinsky, E., & Schwingendorf, K. (1997). Constructing
Calculus Concepts: Cooperation in a Computer Laboratory. In Dubinsky, E.,
Mathews, D., & Reynolds, B. (Eds.), *Readings
in Cooperative Learning for Undergraduate Mathematics*. (pp. 225-246).
Washington, DC: Mathematical Association of America.

Fosnot, C.T. (1993). In Brooks, J.G., & Brooks, M.G., *In Search of Understanding: The Case for Constructivist Classrooms* (p. vii). Alexandria, VA: Association for Supervision and Curriculum Development.

Gall, M.D., Borg, W.R., & Gall, J.P. (1996). *Educational Research: An Introduction. *White
Plains, NY: Longman.

Glaser, B.G. & Strauss, A.L. (1967). *The Discovery of Grounded Theory.* Chicago:
Aldine Publishing.

Heid, M.K. (1988). Resequencing Skills and Concepts in Applied Calculus Using the Computer as a Tool. *Journal for Research in Mathematics Education, 19*(1), 3-25.

Heid, M.K., Blume, G., Flanagan, K., Iseri, L., Deckert, W., & Piez, C. (1998). Research on Mathematics Learning in CAS Environments. In G. Goodell (Ed.), *Proceedings of the Eleventh Annual International Conference on Technology in Collegiate Mathematics* (pp. 156-160). Reading, MA: Addison-Wesley.

Krantz, S. (2000). Imminent Danger: From a Distance. *Notices of the AMS, 47* (May), 533.

National Council of Teachers of Mathematics. (1991). *Professional Standards for Teaching Mathematics.* Reston, VA: Author.

National Research Council. (1991). *Moving Beyond Myths: Revitalizing Undergraduate Mathematics.* Washington, DC: National Academy Press.

National Science Foundation. (1996). *Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology.* Washington, DC: Author.

Papert, S. (1980). *Mindstorms: Children, Computers, and Powerful Ideas.* New York, NY: Basic Books.

Pea, R. (1987). Cognitive Technologies for Mathematics Education. In Schoenfeld, A. (Ed.), *Cognitive Science and Mathematics Education*. Hillsdale, NJ: Lawrence Erlbaum Associates.

Piaget, J. (1952).
*The Origins of Intelligence in
Children.* New York: International Universities Press.

Portela, J. (1999). Communicating Mathematics Through
the Internet: A Case Study. *Educational Media International, 36* (1),
48-67.

Rogers, E., Reynolds, B., Davidson, N., & Thomas, A. (2001). *Cooperative Learning in Undergraduate Mathematics: Issues That Matter and Strategies That Work*. Washington, DC: Mathematical Association of America.

Romberg, T.A. (1992). Perspectives on Scholarship and Research Methods. In D.A. Grouws (Ed.), *Handbook of Research on Mathematics Teaching and Learning* (pp. 49-64). New York: Macmillan.

Slavin, R.E. (1995). *Cooperative Learning: Theory, Research, and Practice.* Boston: Allyn
& Bacon.

Smith, D.A. (2000). Renewal in Collegiate
Mathematics Education: Learning from Research. In Ganter, S.L. (Ed.), *Calculus Renewal: Issues for Undergraduate
Mathematics Education in the Next Decade* (pp. 23-40). New York, NY: Kluwer
Academic/Plenum Publishers.

Smith, D.A., & Moore, L.C. (1990). Project CALC. In T.W. Tucker (Ed.), *Priming the Calculus Pump: Innovations and Resources* (pp. 51-74). Washington, DC: Mathematical Association of America.

Smith, D.A., & Moore, L.C. (1991). Project CALC: An Integrated Lab Course. In Leinbach, C. et al. (Eds.), *The Laboratory Approach to Teaching Calculus* (pp. 81-92). Washington, DC: Mathematical Association of America.

Vygotsky, L.S. (1978). *Mind in Society*.
Cambridge, MA: Harvard University Press.

DEPARTMENT OF MATHEMATICS, DUKE UNIVERSITY, DURHAM, NC 27708

*E-mail
address*: bookman@math.duke.edu

PROGRAM IN EDUCATION, DUKE UNIVERSITY, DURHAM, NC 27708

*E-mail
address*: dmalone@duke.edu