Using Formative Assessment and Self-Regulated Learning
to Help Developmental Mathematics Students Achieve:
A Multi-Campus Program
John Hudesman, Sara Crosby, Niesha Ziehmke, Howard Everson,
Sharlene Isaac, Bert Flugman, and Barry Zimmerman
City University of New York
Adam Moylan
Rockman et al.
The authors describe an Enhanced Formative Assessment and Self-Regulated Learning (EFA-SRL) program designed to improve the achievement of community college students enrolled in developmental mathematics courses. Their model includes the use of specially formatted quizzes designed to assess both the students' mathematics and metacognitive skill levels. When the corrected quizzes are returned, students are required to demonstrate how they used both the mathematics and metacognitive feedback to improve on their errors. Results indicate that program students earned higher mean grades and achieved higher pass rates than students enrolled in baseline classes.
Hudesman, J., Crosby, S., Ziehmke, N., Everson, H., Isaac, S., Flugman, B., Zimmerman, B., & Moylan, A. (2014). Using formative assessment and self-regulated learning to help developmental math students achieve. Journal on Excellence in College Teaching, 25(2), x-x.

More than 40% of high school graduates entering two-year colleges require mathematics remediation, at a cost of between $1.85 and $2.35 billion annually (Strong American Schools, 2008). As a result of this overwhelming demand, developmental mathematics courses comprise over half of the mathematics offerings at many of these colleges (see Lutzer, Rodi, Kirkman, & Maxwell, 2007). Even more disturbing is the finding by the Carnegie Foundation (2009) that between 60% and 70% of the students who enroll in developmental mathematics do not successfully complete these required courses. The situation continues to deteriorate when we consider what happens to those students who do pass developmental mathematics. For example, at a large urban comprehensive college, only between 45% and 67% of the students passed an introductory college-level mathematics course after successfully completing developmental mathematics (Cummings, personal communication, May 3, 2011).
The Contribution of Academic Content Formative Assessment
to Student Learning
The impact of formative assessment on learning has been well documented. In a series of landmark review articles, Black and Wiliam (1998a, 1998b, 2009) found that achievement gains generated by using formative assessment across a range of academic content areas were among the largest ever reported for education interventions, with the largest gains realized among low achievers.
Implementing an effective formative assessment program involves having teachers receive feedback from the classroom assessments and then using this information to make changes in their instructional practices. Instructors are also expected to provide feedback to students about how they can improve their own learning. Students, in turn, are expected to use this feedback to make constructive changes in how they learn (Sadler, 1989). In this way the formative assessment process is integrated into classroom instruction as an ongoing process, and can promote mastery learning and curriculum-based measurement (Fuchs, 1995; Zimmerman & DiBenedetto, 2008).
In conceptualizing the formative assessment process, feedback is a key element in how students develop their learning-how-to-learn skills. As such, formative assessment merges with theories of metacognition in general and self-regulation in particular (see, for example, Nicol and Macfarlane-Dick's 2006 discussion of formative assessment and self-regulated learning).
The SRL Component of Formative Assessment
The majority of formative assessment interventions have emphasized content competency, for example, by providing feedback on incorrectly applied mathematics strategies, such as the steps needed to solve a quadratic equation by factoring. We believe that in order to optimize student learning, it is necessary to include metacognitive, self-regulatory strategies as part of classroom instruction and assessment.
The SRL approach guiding our work is based on models of self-regulated learning developed by Zimmerman (2000, 2002, 2006) and Grant (2003, 2008). It is a psycho-educational model characterized by continuous feedback cycles, with each cycle divided into three main phases. The first is a planning phase, in which students review their past efforts, conduct academic task analyses, choose content and metacognitive strategies that best address their specific learning challenges, set identifiable goals, and make self-efficacy and self-evaluation judgments to assess the accuracy of their level of understanding and content mastery. Next is a practice phase, in which students implement their plans, monitor their progress, and make real-time adjustments to their learning plans. This is followed by an evaluation phase, in which students assess each strategy's effectiveness based on instructor feedback, for instance, classroom assessments. Students then build on the successful strategies and/or modify or replace less effective ones. The students' responses from the evaluation phase become the basis for the planning phase of the next SRL cycle. This model has also been described in Hudesman, Millet, Niezgoda, Han, and Flugman (2013) and Hudesman, Crosby, Flugman, Isaac, Everson, and Clay (2013).
The SRL intervention derives much of its success from its cyclical nature; each time students complete a cycle, they acquire more feedback and, therefore, come closer to achieving their learning goals. Students begin to understand that learning is directly related to experimenting with different content and metacognitive strategies, a notable shift from the more common notion that achievement is simply a function of innate ability or some other external factor (Zimmerman, 2002).
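The plan-practice-evaluate cycle described above can be sketched schematically in code. This is an illustrative model only, not the authors' implementation: the strategy names, the effectiveness estimates, and the simple averaging update are all invented for the sketch.

```python
# Schematic sketch of the three-phase SRL feedback cycle: plan,
# practice, evaluate, then feed the evaluation into the next plan.
# All names and numbers here are hypothetical illustrations.

def srl_cycle(strategies, feedback_fn, cycles=3):
    """Run `cycles` plan-practice-evaluate loops.

    `strategies` maps strategy name -> current effectiveness estimate
    (0.0-1.0); `feedback_fn(strategy)` returns instructor feedback as
    a score in the same range.
    """
    for _ in range(cycles):
        # Planning phase: choose the strategy judged most effective so far.
        plan = max(strategies, key=strategies.get)
        # Practice phase: apply the strategy and obtain feedback.
        score = feedback_fn(plan)
        # Evaluation phase: update the estimate; this informs the next plan.
        strategies[plan] = 0.5 * strategies[plan] + 0.5 * score
    return strategies

# A student who gets strong feedback on practice problems keeps
# refining that strategy across cycles.
result = srl_cycle(
    {"flashcards": 0.5, "practice_problems": 0.6},
    lambda s: 0.9 if s == "practice_problems" else 0.2,
)
print(result)
```

The key design point the sketch captures is that each cycle's evaluation becomes the input to the next cycle's plan, so strategy choice improves with accumulated feedback rather than staying fixed.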
The power of SRL competence is highlighted in a classic study in social learning theory by Zimmerman and Bandura (1994). They demonstrated that students' SRL skill levels are more highly correlated with their college grade point average than are their scores on standardized tests such as the SAT. More recently, Dignath and Buettner (2010) reported a meta-analysis of 84 K-12 studies that found overall mean effect sizes of .69 for SRL interventions, with the largest effect sizes in mathematics achievement. In addition, a review of the efficacy of SRL strategies by the National Mathematics Advisory Panel (NMAP) (2008) concluded there was significant evidence to support the further development and research of this approach in mathematics instruction. The panel specifically suggested encouraging further research on the relationship of SRL strategies to a wide range of mathematics knowledge and skill areas.
The Application of SRL-Enhanced Formative Assessment Programs
The program described in this article represents a portion of our ongoing research on formative assessment and self-regulation. The focus of our work is to improve students' academic performance by developing both their academic and metacognitive competencies. Because the EFA-SRL program is not reliant on any specific course content, it is possible to apply the SRL model in a wide variety of academic coursework, including developmental mathematics and other STEM-related courses (see Blank, Hudesman, Morton, Armstrong, Moylan, White, and Zimmerman, 2007; Hudesman, Zimmerman, and Flugman, 2005, 2010; Zimmerman, Moylan, Hudesman, White, and Flugman, 2011).
The Main Features of the EFA-SRL Program
In keeping with Heritage's (2010) view that student assessments should be viewed as part of an ongoing instructional process, the EFA-SRL program incorporates mathematics content and metacognitive assessments as an integral part of classroom instruction. It is designed to change how instructors teach and how students learn. As such, the program cycle consists of the following five major components:
1. Instructors administer specially constructed quizzes that assess both the students’ mathematics and SRL competencies.
2. Instructors review and grade the quizzes to provide constructive feedback about both the content and SRL competencies with which students struggle. They also use quiz feedback to adjust their instruction.
3. Students complete a specially constructed self-reflection and mastery learning form for each incorrectly answered quiz question. This process affords students an opportunity to reflect on, and then improve, both the mathematics content and SRL processes that were incorrectly applied.
4. Instructors review the completed self-reflection and mastery learning forms to determine the degree to which students have mastered the appropriate mathematics and SRL skills. Based on the instructor's evaluation of their work, students can earn up to the total value of the original quiz question. Based on the reflection and mastery learning form data, instructors also have an additional opportunity to make changes to the mathematics content and SRL topics to be covered in upcoming lessons.
5. Instructors use the feedback provided by the students’ quizzes and self-reflection/mastery learning forms as the basis for ongoing class discussions and exercises, during which students discuss the relationship between their mathematics content and SRL skills. The students use these discussions to develop their future plans.
The Research Question
The research question addressed in this article is this: Can an Enhanced Formative Assessment Program featuring SRL improve the mathematics achievement of students enrolled in developmental mathematics courses at two different two-year colleges?
Method
Participants and Design
The EFA-SRL developmental mathematics program was part of a larger three-year grant from the Fund for the Improvement of Post-Secondary Education (FIPSE) (Hudesman et al., 2010), which included almost 3,000 students in 14 different academic disciplines. It was implemented at one high school, two 2-year colleges, and two 4-year colleges in three different states. During the third year of the grant, we were able to study the impact of the EFA-SRL program on students enrolled in developmental mathematics courses at the two participating community colleges. One of the community colleges is located in a major northeastern city, and the majority of the students are either Black or Hispanic. The other community college, which is located in a rural Midwestern city, has a student population that is mostly white. Although there were somewhat different curricula used at the different colleges, they all focused on pre-algebra and algebra topics. Furthermore, all classes were taught using a lecture/demonstration format.
This section reports on the work of three instructors at two 2-year colleges who carried out the program with 125 students enrolled in seven sections of developmental mathematics. The EFA-SRL procedure and materials described here have also been described in Zimmerman et al. (2011); Hudesman, Carson, Flugman, Clay, and Isaac (2011a, 2011b); and Hudesman et al. (2013).
Course Materials
Mathematics Quizzes

Students enrolled in the EFA-SRL course sections completed specially formatted quizzes that were administered at least once a week. Each quiz consisted of a maximum of five questions. The procedure for each quiz was as follows. When completing the top portion of the quiz, students were asked to predict their quiz grade and to enter the amount of time they spent preparing for it. Once they started the quiz, the students were asked to read each question, but before answering it, they were asked to make a self-efficacy judgment indicating how confident they were that they could correctly solve the problem. After attempting to solve the problem, students were asked to make a second self-evaluation judgment, indicating how confident they were that they had correctly solved the problem. A sample quiz formatted with the self-efficacy and self-evaluation judgments is illustrated in Appendix A and has also appeared in Zimmerman et al. (2011), Hudesman et al. (2013), and Hudesman et al. (in press).
Instructors used the students' quiz information to provide feedback to students and to make changes in their instruction. The quiz data also allowed instructors to study the relationship between the students' mathematics and metacognitive skill levels. This information is significant, because struggling students frequently make more optimistic predictions about their knowledge than are warranted by their actual quiz scores, indicating that they often do not recognize the difference between "what they think they know" and "what they don't know" (Tobias & Everson, 2002). As a result of this false belief, these students do not feel any need to remedy the situation by changing their "learning how to learn" behaviors. Without proper feedback, students may continue a destructive cycle of poor planning and disappointing academic outcomes. Being able to provide students with ongoing feedback about the relationship between their actual performances (their quiz scores) and their metacognitive judgments (their predicted scores, and the relationship between their preparation time and their self-efficacy and self-evaluation judgments) is critical to improving students' mathematics and SRL skill sets.
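The calibration feedback described here, comparing predicted with actual quiz scores to surface overconfidence, can be sketched in a few lines. This is a hypothetical illustration: the function name, the percentage-point threshold, and the labels are our invention, not part of the program's materials.

```python
# Sketch of calibration feedback: flag students whose predicted quiz
# score (a metacognitive judgment) diverges from their actual score.
# Threshold and labels are assumptions for illustration.

def calibration_report(predicted, actual, threshold=10):
    """Return (predicted, actual, gap, flag) tuples per student.

    Scores are percentages; a positive gap means overconfidence.
    """
    report = []
    for p, a in zip(predicted, actual):
        gap = p - a
        if gap > threshold:
            flag = "overconfident"
        elif gap < -threshold:
            flag = "underconfident"
        else:
            flag = "calibrated"
        report.append((p, a, gap, flag))
    return report

# A student who predicts 90 but scores 60 is flagged; one who
# predicts 70 and scores 72 is well calibrated.
print(calibration_report([90, 70], [60, 72]))
```

The flagged gap is exactly the "what they think they know" versus "what they don't know" discrepancy that Tobias and Everson (2002) describe, made visible to the student as a number.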
The SRL Mathematics Self-Reflection and Mastery Learning Form

For each incorrectly answered quiz question, students were expected to complete a separate self-reflection and mastery learning form. This form was designed to further assist students in assessing the relationship between their content knowledge and their ability to use critical SRL tools. In the first section of this form, students were asked to compare their predicted and actual quiz scores, explain any significant discrepancy, and evaluate the accuracy of their academic confidence judgments, that is, their self-efficacy and self-evaluation judgments. They were then required to discuss the relationship of these judgments to their actual quiz score. Based on the instructor's written feedback and/or prior class discussions, students needed to indicate which of the mathematics strategies they incorrectly applied on the quiz. Implicit in scoring this section of the reflection/mastery learning form is that we are looking for indications that students are acting more strategically. A constructive response would indicate that students understand how their actions were directly related to their quiz score, for example, that their preparation time was inadequate. Furthermore, these students would need to indicate how they will remedy this situation in preparing for the next quiz, for example, by working on more practice problems or meeting with the tutor a specific number of times. By contrast, a less effective student response would be much more general and not include any significant description of appropriate metacognitive strategies, for instance, students who respond that they will "try harder" the next time.
In the second section of the EFA-SRL Reflection and Mastery Learning Form, students again had to solve the original mathematics problem and include a written description of the specific mathematics strategies and procedures involved in their work. Students were also required to use these same mathematics strategies to solve a similar problem. A sample EFA-SRL Reflection and Mastery Learning Form is shown in Appendix B and has also appeared in Zimmerman et al. (2011), Hudesman et al. (2013), and Hudesman et al. (in press).
Scoring the Self-Reflection and Mastery Learning Form

The EFA-SRL Reflection and Mastery Learning Form is based on a mastery learning approach in which students are given multiple opportunities to use feedback in order to improve their performance. By completing the form, students had an opportunity to demonstrate how well they could constructively use feedback to master both the mathematics and metacognitive competencies necessary to solve the problem. Students who demonstrated complete mastery on this reflection form could earn up to 100% of the original credit for a problem. Instructors again used the information from the self-reflection and mastery learning form to plan lessons that demonstrated the relationship between mathematics content and EFA-SRL competencies.
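The regrade rule above, recovering up to 100% of the original credit on demonstrated mastery, can be expressed as simple arithmetic. The sketch below is an assumed formalization: the article does not specify how partial mastery was quantified, so the `mastery_fraction` parameter is our illustrative stand-in for the instructor's judgment.

```python
# Illustrative sketch of the mastery regrade rule: a student who
# demonstrates mastery on the reflection form recovers up to the full
# original credit for a missed question. `mastery_fraction` (0.0-1.0)
# is a hypothetical encoding of the instructor's evaluation.

def regraded_score(original_points, earned_points, mastery_fraction):
    recovered = (original_points - earned_points) * mastery_fraction
    return earned_points + recovered

# A 10-point question initially scored 4/10:
print(regraded_score(10, 4, 1.0))  # full mastery restores all 10 points
print(regraded_score(10, 4, 0.5))  # partial mastery recovers half the loss
```

The point of the rule is incentive design: because the full original value remains attainable, the reflection form is rewarded work rather than an after-the-fact exercise.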
Classroom Discussions and Exercises

The quiz/self-reflection and mastery process is considered to be a major classroom priority. Instructors are expected to have ongoing class discussions that focus on the relationship between how students effectively learn mathematics content and how they enhance their self-regulation skills. One example of such an activity involves having students create individual graphs illustrating the relationship between their SRL judgments and their quiz scores. In another exercise, instructors might ask students to compare the time they spent preparing for the quiz and their quiz grades. The students' responses are then listed on the board. The results often reveal an obvious correlation between the students' preparation time and their quiz scores. Students are then asked to use the feedback from this exercise to design a revised plan for improving their work.
Instructor Selection, Training, and Support
EFA-SRL instructors were chosen from among a group of faculty who volunteered to participate. Instructors initially attended a two-day on-site training program that involved discussions on the theory and practice of EFA-SRL. Throughout the semester, instructors provided ongoing electronic logs detailing their classroom experiences. A sample log, which is included in Appendix C, was completed over a three-week period by one of the most experienced SRL instructors and illustrates how she applied the SRL model to both her students and her instruction. The log form has four main sections:
1. What was your goal for your students and/or yourself this period?
2. What were the mathematics content and metacognitive strategies your students and/or you were planning to use to achieve this goal?
3. What actually happened?
4. How will the results of this cycle inform your instruction in the future? What worked that you will continue? What adjustments do you need to make?
Program staff responded to these logs by e-mail, telephone, and/or video conferencing sessions. In addition, classroom observations were conducted by either visiting program staff or on-site EFA-SRL coordinators, who were trained by program staff. These observations were recorded using the SRL Observation form (see Appendix C). The observers and instructors met after the class to review the observations, with the goal of assisting instructors to further improve their implementation of the program. These meetings also provided an important venue for gathering information on how to develop the program further.
Performance Measures for the Developmental Mathematics Course
Grade Distributions

During the third year of the grant, we were able to obtain baseline and post-intervention academic outcome data for three developmental mathematics instructors at two community colleges. The baseline data included grade distributions and pass rates from the semester before the instructor began using the EFA-SRL program. The grade distribution is based on a 13-point scale that ranges from 0 (for a grade of F) to 12 (for a grade of A+). These pre-program grade distributions were compared with grade distributions from students enrolled in the same courses taught by the same instructors after the implementation of the EFA-SRL program. Based on these grade distributions, we were able to compare the mean grades and pass rates earned by students before and after the implementation of the program.
Instructor Satisfaction Survey

A brief three-question survey was conducted six months after the end of program funding. Instructors were asked if they continued to use the EFA-SRL program in their classrooms "not at all," "somewhat," or "a lot."
Results
The data in Table 1 indicate that student achievement in developmental mathematics courses improved after the introduction of the EFA-SRL program. The percentage of students passing the course increased from 63.5% to 79.2% (χ2 = 6.95; p = .008). Similarly, the mean grade for students in the EFA-SRL program courses was 5.2, compared to a mean grade of 4.1 for students enrolled in baseline developmental mathematics courses (F = 4.84; p = .029). As part of the overall FIPSE program evaluation, we also surveyed all of the instructors, regardless of their academic department, six months after funding for the program ended. As mentioned, all instructors were asked if they continued to use the EFA-SRL in their classrooms "not at all," "somewhat," or "a lot." Only one instructor (not a mathematics instructor) out of the total group of 38 instructors was no longer using the program. About half of the instructors indicated that they continued to use EFA-SRL in their classrooms "somewhat," and the rest of the instructors reported that they were still using the EFA-SRL program in their classrooms "a lot." These findings support the portability of the EFA-SRL model to other colleges.
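The pass-rate comparison above is a 2x2 chi-square test (pass/fail by baseline/program group). A minimal sketch of Pearson's formula without continuity correction follows; the cell counts below are illustrative reconstructions from the reported percentages (the article does not report exact counts), so the resulting statistic need not match the published value of 6.95.

```python
# Sketch of a 2x2 Pearson chi-square (no continuity correction) for a
# pass-rate comparison. Cell counts are assumed for illustration:
# 125 program students (79.2% pass -> 99/26) and an assumed baseline
# group of 126 students (63.5% pass -> 80/46).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]],
    rows = baseline/program, columns = pass/fail."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

print(round(chi_square_2x2(80, 46, 99, 26), 2))
```

Under these assumed counts the statistic exceeds 3.84, the .05 critical value for one degree of freedom, consistent with the significant result the article reports.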
Discussion
This article describes the implementation of an EFA-SRL program in developmental mathematics at two 2-year colleges. Program students, some of whom were enrolled in classes that were 500 miles away, were more successful in their developmental mathematics coursework, as indicated by improved grades and pass rates. These findings are in keeping with those reported by Hudesman et al. (in press) and Zimmerman et al. (2011). Those authors found that students enrolled in EFA-SRL sections of developmental mathematics achieved higher pass rates in the course and higher pass rates on the COMPASS mathematics placement test (ACT, 1997, 2006). There was also some evidence that these students were able to carry over the skills they learned in the EFA-SRL class to a subsequent credit-level mathematics course.
Instructors also seemed pleased with their participation, as indicated by their continued use of the program six months after the conclusion of the grant funding. It should also be noted that the positive academic outcomes reported in this paper are in keeping with the findings reported in several other studies carried out by the project team, for example Zimmerman et al. (2011) and Blank et al. (2007).
Limitations and Directions for Future Research
Because the program staff had limited access to the two community college campuses, there were concerns about the fidelity of the program's implementation. While faculty at both sites received their initial on-site training from program staff, most of the follow-up monitoring was done via electronic logs and video conferencing, with "surrogate" observers conducting the classroom observations and follow-up meetings.
As previously mentioned, the EFA-SRL program was implemented in 14 different academic areas at five different high school and college campuses. The developmental mathematics courses implemented at the two 2-year campuses accounted for only three of the instructor participants. Because of administrative and financial constraints, our pre/post evaluation design was limited to this (developmental mathematics) instructor group. Because no similar evaluation was carried out for other academic disciplines and campuses, we have no way of knowing if our results would carry over to these areas.
Another concern is that the EFA-SRL program is labor intensive for both students and instructors. It takes time for students to complete the various requirements of the program. This raises a question as to whether the students' improvement is due to the additional time-on-task or whether the improvement is due to an increase in the students' metacognitive skill level. Our belief is that program students were expected to take on more responsibility for their learning, which, in turn, would result in increasing their time-on-task.
The EFA-SRL program also expects that instructors will put in the extra time needed to administer more quizzes, provide ongoing feedback regarding mathematics content and SRL strategies, review self-reflection/mastery learning forms, and conduct classroom discussions relating SRL to the students' learning and achievement. All of this must be included within the already crowded required mathematics curriculum. Under the circumstances, it is not surprising that some instructors expressed an initial concern about completing the semester's work. However, these concerns usually disappeared by the end of the semester. In fact, we are unaware of any mathematics instructor who was unable to complete the course curriculum. In order to address this concern and make the program delivery more efficient, we have begun to explore ways to transition from a paper-and-pencil delivery system to a computerized system (see Hudesman et al., 2011a, 2011b).
Despite the program limitations discussed here, we believe that these results, when taken together with other similar work, suggest that instructors can consider the EFA-SRL program to be part of their "tool box" of approaches to be combined with other content-specific approaches and instructional methodologies.
References
ACT. (1997). Computer-adaptive placement assessment and support system in mathematics (COMPASS). American College Testing manual. Iowa City, IA: ACT.
ACT. (2006). COMPASS reference manual. Iowa City, IA: ACT.
Black, P., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5, 7-71.
Black, P., & Wiliam, D. (1998b). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80, 139-148.
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5-31.
Blank, S., Hudesman, J., Morton, E., Armstrong, R., Moylan, A., White, N., & Zimmerman, B. J. (2007, October). A self-regulated learning assessment system for electromechanical engineering technology students. In Proceedings of the National STEM Assessment Conference (pp. 37-45). Washington, DC: Drury University and the National Science Foundation.
The Carnegie Foundation for the Advancement of Teaching. (2009). Developmental math. Retrieved from http://www.carnegiefoundation.org/problem-solving/developmental-math
Dignath, C., & Buettner, G. (2010). Components of fostering self-regulated learning among students. A meta-analysis on intervention at primary and secondary school level. Metacognition and Learning, 3, 231-264.
Fuchs, L. S. (1995). Connecting performance assessment to instruction: A comparison of behavioral assessment, mastery learning, curriculum-based measurement, and performance assessment. ERIC Digest E530.
Grant, A. M. (2003). The impact of life coaching on goal attainment, metacognition, and mental health. Social Behavior and Personality, 31(3), 253-264.
Grant, A. M. (2008). Personal life coaching for coaches-in-training enhances goal attainment, insight, and learning. Coaching: An International Journal of Theory, Research, and Practice, 1(1), 54-70.
Heritage, M. (2010). Formative assessment and next-generation assessment systems: Are we losing an opportunity? (Report prepared for the Council of Chief State School Officers). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing, University of California - Los Angeles.
Hudesman, J., Carson, M., Flugman, B., Clay, D., & Isaac, S. (2011a). The computerization of the self-regulated learning assessment system: A demonstration program in developmental mathematics. Paper presented at the 9th annual meeting of the Education and Information Systems, Technologies, and Applications (EISTA) Conference, Orlando, FL.
Hudesman, J., Carson, M., Flugman, B., Clay, D., & Isaac, S. (2011b). The computerization of the self-regulated learning assessment system: A demonstration program in developmental mathematics. The Interna-tional Journal of Research and Review, 6, 1-18.
Hudesman, J., Crosby, S., Flugman, B., Isaac, S., Everson, H., & Clay, D. (2013). Using an enhanced formative assessment program and self-regu-lated learning to improve student achievement. Journal of Developmental Education, 13, 2-13.
Hudesman, J., Millet, N., Niezgoda, G., Han, H., & Flugman, B. (2013). The use of self-regulated learning, formative assessment, and mastery learning to assist students enrolled in developmental mathematics: A demonstration project. The International Journal of Research and Review, 10, 1-17.
Hudesman, J., Zimmerman, B., & Flugman, B. (2005). A comprehensive cognitive skills academy for associate degree students (FIPSE No. P116B010127). New York, NY: City University of New York.
Hudesman, J., Zimmerman, B., & Flugman, B. (2010). The replication and dissemination of the self-regulated learning model to improve student performance in high schools, two-year, and four-year colleges (FIPSE No. P116B060012). New York, NY: City University of New York.
Lutzer, D. J., Rodi, S. B., Kirkman, E. E., & Maxwell, J. W. (2007). Statistical abstract of undergraduate programs in the mathematical sciences in the United States. Providence, RI: American Mathematical Society. Retrieved from http://www.ams.org/cbms/cbms2005.html
National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education.
Nicol, D., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31, 199-218.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-140.
Strong American Schools. (2008). Diploma to nowhere. Retrieved from http://www.strongamericanschools.org/files/SAS_Diploma_To_Nowhere_v11_FINAL.pdf
Tobias, S., & Everson, H. T. (2002). Knowing what you know and what you don’t know (College Board Report, 2002-04). New York, NY: [The?] College Board.
Zimmerman, B. J. (1990). Self-regulating academic learning and achievement: The emergence of a social cognitive perspective. Educational Psychology Review, 2, 173-201.
Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13-39). San Diego, CA: Academic Press.
Zimmerman, B. J. (2002). Achieving self-regulation: The trial and triumph of adolescence. Academic Motivation of Adolescents, 2, 1-27.
Zimmerman, B. J. (2006). Integrating classical theories of self-regulated learning: A cyclical phase approach to vocational education. In D. Euler, G. Patzold, & M. Lang (Eds.), Self-regulated learning in vocational education (pp. 7-48). Stuttgart, Germany: Franz Steiner Verlag.
Zimmerman, B. J., & Bandura, A. (1994). Impact of self-regulatory in-fluences on writing course attainment. American Educational Research Journal, 31, 845-862.
Zimmerman, B. J., & DiBenedetto, M. K. (2008). Mastery learning and assessment: Implications for students and teachers in an era of high-stakes testing. Psychology in the Schools, 45(3), 206-216.
Zimmerman, B. J., Moylan, A. R., Hudesman, J., White, N., & Flugman, B. (2011). Enhancing self-reflection and mathematics achievement of at-risk urban technical college students. Psychological Test and Assessment Modeling, 53(1), 108-127.
Author Note
This work was supported by a grant from the Fund for the Improvement of Postsecondary Education (P116B060012). Correspondence concerning this article should be addressed to jhudesman@gc.cuny.edu.
John Hudesman is a senior principal investigator at the Center for Advanced Study in Education at the CUNY Graduate School and an adjunct professor at the New York City College of Technology, CUNY. Sara Crosby is the director of First Year Programs at Brooklyn College, CUNY. Niesha Ziehmke is a first-year experience consultant. Howard Everson is the director of the Center for Advanced Study in Education at the CUNY Graduate School. Sharlene Isaac is an SRL staff researcher and administrator at New York City College of Technology, CUNY. Bert Flugman is a senior research fellow at the Center for Advanced Study in Education at the CUNY Graduate School. Barry Zimmerman is a distinguished professor emeritus of educational psychology at the CUNY Graduate School. Adam Moylan is a senior researcher at Rockman et al.
Appendix A
A Sample EFA-SRL Math Quiz

Name: ____________________  Date: __________  Quiz #: ______  Predicted Score: ______  Preparation Time: ______ minutes

Before solving each problem, circle how confident you are that you can solve it correctly; after you have solved each problem, circle how confident you are that you solved it correctly (0%  25%  50%  75%  100%).

REMEMBER: Show all of your work. Simplify all of your answers.

1. Factor completely: 10x^2y^2 + 4xy^3 - 2y =
2. Divide: (8a^2b^2 - 12a^3b^2c + 4ab^2) / 4ab^2
3. Express answer in scientific notation: (a) 6700000  (b) 0.000015
4. Compute and express in scientific notation: (3.6 x 10^-5)(6 x 10^3) / (12 x 10^3)
5. Multiply: (5x - 3)^2
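As an illustrative check (not part of the original quiz form), item 4 of the sample quiz can be verified numerically, assuming the item is read as the product (3.6 x 10^-5)(6 x 10^3) divided by 12 x 10^3:

```python
# Numeric check of sample quiz item 4:
# (3.6 x 10^-5)(6 x 10^3) / (12 x 10^3)
numerator = 3.6e-5 * 6e3   # multiply coefficients, add exponents: 2.16e-1
result = numerator / 12e3  # divide coefficients, subtract exponents
print(f"{result:.1e}")     # prints 1.8e-05
```

In scientific notation, the expected simplified answer is 1.8 x 10^-5.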
Appendix B
Sample EFA Revision Form

SRL Math Revision Sheet | Quiz #____ | Item #____ | Date: _______ | Student: ______________________ | Instructor: ______________________

Now that you have received your corrected quiz, you have the opportunity to improve your score. Complete all sections thoroughly and thoughtfully. Use a separate revision sheet for each new problem.

PLAN IT

1. a. How much time did you spend studying for this topic area? __________
   b. How many practice problems did you do in this topic area in preparation for this quiz? (circle one) 0-5 / 5-10 / 10+
   c. What did you do to prepare for this quiz? (use study strategy list to answer this question)

2. After you solved this problem, was your confidence rating too high (i.e., 4 or 5)? (circle one) yes / no

3. Explain what strategies or processes went wrong on the quiz problem.
Appendix B (continued)
Sample EFA Revision Form

PRACTICE IT

4. Now re-do the original quiz problem [on the left?] and write the strategy you are using on the right.

5. How confident are you now that you can correctly solve this similar item? (circle one)
   Definitely not confident / Not confident / Undecided / Confident / Very confident
   1 / 2 / 3 / 4 / 5

6. Now use the strategy to solve the alternative problem. [Should there be more space here for this?]

7. How confident are you now that you can correctly solve a similar problem on a quiz or test in the future? (circle one)
   Definitely not confident / Not confident / Undecided / Confident / Very confident
   1 / 2 / 3 / 4 / 5
Appendix C
Observation Form

SRL Observation Form

After observing the instructor for at least one hour, circle the appropriate value to represent the frequency and quality of each behavior.

Instructor: ________________  Observer: ________________  Class: ____________  Date: __________

For each target behavior below, the observer writes a check mark each time the behavior is seen, rates how often the behavior was carried out (1 = Never to 5 = Very Often), and notes whether it was carried out systematically (Yes / No).

1. Teacher models specific strategies at each step of the problem
2. Teacher writes down strategies clearly on the board in words
3. Teacher explains to the students that they need to write down strategies
4. Teacher encourages students to monitor strategy use during problem solving
5. Teacher makes deliberate errors during presentation
Appendix C (continued)
Observation Form

SRL Observation Form

Each behavior is rated for how often it was carried out (1 = Never to 5 = Very Often) and whether it was carried out systematically (Yes / No).

6. Problem-solving errors are used as a departure point for analysis (i.e., teachers don't just start over or quickly correct errors themselves)
7. Teacher refers to strategy steps for correcting errors
8. Teacher encourages individual practice of strategies for problem solving and error detection
9. Teacher encourages students to go to the board to demonstrate problem-solving / error-detection strategies
10. Teacher encourages students to verbalize error-detection / problem-solving strategies while reviewing practice problems
11. Teacher leads discussions about goal setting, self-monitoring, or other definable parts of the SRL model
12. Students have the opportunity to check their understanding (discuss answers to problems and errors) with peers in pairs or groups
13. Teacher calls on students that are avoiding participation, inspiring a healthy academic tension in the classroom
14. Teacher aims to get students who are struggling with the material involved in some way (e.g., calling them to the board for demonstration, asking them to explain strategies, pushing them to discuss errors and strategies during individual practice)
15. Teacher moves around the classroom during practice and attempts to give feedback to each student

Notes:
Appendix D
Sample Self-Monitoring Instructor Log
for the EFA-SRL FIPSE Program

Name: CS    Log submission date: 10/12    Log time period: 9/19-10/12

1. What was your SRL goal for your students and/or yourself this period?

Students will:
• Reflect on their past performance in math classes and set a goal for improving their math performance.
• Plan and monitor their math strategy usage on a regular basis.
• Review their work.

2. What were the strategies your students and/or you were planning to use to achieve this goal?

• Gave pre-test of arithmetic skills and diagnostic test of algebra skills.
• Had students volunteer activities/hobbies they were good at and how they got to be good at them.
• Gave students goal-setting worksheet (for credit).
• Gave students weekly monitoring form, which allows them to set a grade goal for upcoming quizzes/exams and plan, monitor, and evaluate their math strategy usage each week (for credit).
• Copies of form are available on Blackboard.
• Planning form for the current week collected each Monday.
• Discussion on what's working and what's not each Friday.
• Had 3 quizzes, which are composed of homework questions. Students are encouraged to use their homework while taking the quiz. Quiz has follow-up questions asking if HW was done and what students predict their scores will be.
3. What actually happened?

A. Did you/your students achieve the goal? (Be as specific as possible; give illustrations of outcomes, including numbers or percentages of students that achieved various outcomes.)

• Percent of students that submitted goal-planning worksheets: 94%
• Percent of students who claimed they had their homework done, or mostly done, for each quiz: Q1: 43/46 = 93%; Q2: 38/43 = 88%; Q3: 36/44 = 82%
• Interesting that this percentage has been falling. In place of an in-class Quiz 4, I'll be collecting HW instead.
• Percent of students submitting weekly planning forms: Week 1 = 87%; Week 2 = 60%

B. Have you and your students discussed their goal achievement in relation to their strategy use?

• Friday discussions have been very encouraging. On the 2nd Friday, I asked them to write out their answers to the following: Have you noticed any changes in your math ability since the beginning of the quarter? Do you feel more/less/same confident in your math ability? Why do you think that is?
• Had them share their answers. Very positive. Even those that needed to be prodded to respond had noticed changes, even though they were reluctant to attribute them to SRL methods. Quite a few responded that they were doing homework more consistently and it was helping (some said they were simply continuing behaviors they knew from high school, while others admitted this was the first time they did math HW on a regular basis). One said she created flashcards that helped; another said the planning form was helpful and made her adhere to a schedule. Another student recognized he sometimes is over-confident and makes silly errors and needs to slow down. Others mentioned strategies they used for the 1st time, such as doing practice exercises on Blackboard, watching instructional DVDs that came with the book, and making up practice quizzes, taking them, and having someone check their work. Someone else mentioned checking her work against the exercises we discussed together in class.
• Was encouraging that it seemed that no one felt they were less confident about their math ability.
Appendix D (continued)
Sample Self-Monitoring Instructor Log
for the EFA-SRL FIPSE Program

Name: CS    Log submission date: 10/12    Log time period: 9/19-10/12

4. How will the results of this cycle inform your instruction in the future? What worked that you will continue? What adjustments do you need to make?

• I think the weekly planning forms are a hit with many. Although the number submitting their forms dipped in the 2nd week, I think much of this can be attributed to the fact that I gave them a copy of the form for the 1st week, but thereafter, they are responsible for printing out a copy from Blackboard. There is also a permanent announcement on Blackboard that states that a new weekly plan is due EVERY Monday, with a link to the forms and directions. After 2 rounds, we're ironing out the kinks and the excuses.
• After collection of the 2nd planning form, noticed several left the goal questions blank, so I reminded them of the ongoing cycles of the SRL model. Also said that to receive full credit, these goals must be set.
• Also, anecdotally, it seems that I have a greater degree of participation and "buy-in" on the part of my students than I did last spring.
• I'm encouraged by this beginning, but next week I need to specifically target their goal achievement. I'm thrilled they are trying new strategies and/or being consistent in their strategy use, but are they meeting their goals? Are their goals realistic? (For example, setting a goal of 100% on quizzes when the highest grade so far has been 70%.)
• There are a handful of students who aren't engaged during SRL discussions. The next time I ask them to answer some questions and share their responses, I'll have them discuss them in groups.
• HW participation is falling (but still quite high at over 80%). I will collect the HW in place of an in-class quiz to count for Quiz 4. Will continue HW quizzes when not pressed for time.