Dallas Independent School District
EVALUATION OF THE REASONING MIND MATHEMATICS PROGRAM
2012-2013
EA13-514-2
DEPARTMENT OF EVALUATION
AND ASSESSMENT
Mr. Mike Miles Superintendent of Schools
Joan Bush, Ph.D. Myoungsook Kim, Ph.D.
Dallas Independent School District
Approved Report of the Department of Evaluation and Assessment
Nancy Kihneman, Ph.D. Director Program Evaluation
Cecilia Oakeley, Ph.D. Executive Director Evaluation and Assessment
Dallas, Texas September 2013
Table of Contents
Section Page
ABSTRACT .............................................................................................................................. 1
PROGRAM DESCRIPTION .................................................................................................... 2
PURPOSE AND SCOPE OF THE EVALUATION ................................................................... 3
MAJOR EVALUATION QUESTIONS AND RESULTS ............................................................ 3
2.1 What were the sources and amounts of funding for Reasoning Mind? ............................ 3
2.2 What were key findings from the literature review related to technology-based supplemental instruction and scaling up educational initiatives? ..................................... 4
2.3 What were the demographic characteristics of students and teachers involved in Reasoning Mind? .............................................................................................................. 8
2.4 Did staff members participate in Reasoning Mind training as planned? ........................... 10
2.5 Was the Reasoning Mind program used by students as planned? .................................. 23
2.6 What were teacher and administrator perceptions of Reasoning Mind? .......................... 53
2.7 What were student mathematics achievement outcomes? .............................................. 82
SUMMARY .............................................................................................................................. 101
RECOMMENDATIONS ........................................................................................................... 104
REFERENCES ........................................................................................................................ 105
List of Tables
Table Page
1 Demographic Characteristics of 2012-13 Reasoning Mind Students ............................ 9
2 Number and Percentage of Teachers that Completed the Qualification Course by Time Period ............................................................................................................... 11
3 Number and Percentage of Teachers that Completed the Qualification Course by
Division ...................................................................................................................... 12
4 Teacher Observation Ratings Based on the RM Rubric ................................................ 16
5 Teacher Observation Ratings by Indicator for Observation 3 ........................................ 22
6 Teacher Observation Rating Changes from Observation 1 to Observation 3 by Indicator .................................................................................................................... 23
7 Number of Hours Students Spent on RM in Fall and Spring by Grade and Division .... 26
8 Descriptive Statistics for Reasoning Mind Hours by Grade and Division ...................... 27
9 School Information by RM Hour Categories in the Fall and Spring by Grade and Division ...................................................................................................................... 29
10 RM Second-Grade Fall Hour Information with Schools Rank Ordered by Division
and Mean Hours Online ............................................................................................ 30
11 RM Third-Grade Fall Hour Information with Schools Rank Ordered by Division and Mean Hours Online ................................................................................................... 35
12 RM Second-Grade Spring Hour Information with Schools Rank Ordered by Division
and Mean Hours Online ............................................................................................ 40
13 RM Third-Grade Spring Hour Information with Schools Rank Ordered by Division and Mean Hours Online ............................................................................................ 44
14 Number of RM Objectives Students Completed in Fall and Spring by Grade and
Division ...................................................................................................................... 50
15 Descriptive Statistics for Reasoning Mind Objectives Completed by Grade and Division ...................................................................................................................... 51
16 Mean Accuracy Rates in Fall and Spring by Grade and Division .................................. 52
17 Results of Campus Administrator Overall Agreement Items ......................................... 55
18 Results of Campus Administrator Survey Satisfaction Items ......................................... 57
19 Results of Campus Administrator Survey Program Coordinator Items ......................... 58
20 Results of Campus Administrator Survey Frequency of Use Items ............................... 58
21 Results of Campus Administrator Survey Technology Implementation Items ............... 59
22 Results of Campus Administrator Survey RM Awareness and Use Items .................... 60
23 Campus Administrator Survey Comments ..................................................................... 62
24 Supported and Non-Supported Teacher Respondent Characteristics .......................... 64
25 Results of Teacher Survey Satisfaction Items ............................................................... 65
26 Results of Teacher Survey Collaboration Items ............................................................ 66
27 Results of Teacher Survey RM Support Items .............................................................. 67
28 Results of Teacher Survey Professional Development Items ....................................... 68
29 Results of Teacher Survey RM Program Coordinator Items ......................................... 69
30 Results of Teacher Survey RM Resource Items ............................................................ 70
31 Results of Teacher Survey Frequency of Use Items ..................................................... 71
32 Results of Teacher Survey Student Change Items ........................................................ 72
33 Results of Teacher Survey Frequency of Technology Issue Items ............................... 73
34 RM Supported and Non-Supported Teacher Survey Comments .................................. 75
35 Percentage of District Second-Grade Students At or Above the 40th Percentile on ITBS Mathematics Total from Spring 2012 to Spring 2013 ...................................... 85
36 Percentage of District Second-Grade Students At or Above the 80th Percentile on
ITBS Mathematics Total from Spring 2012 to Spring 2013 ...................................... 86
37 Percentage of District Second-Grade Students At or Above the 40th Percentile on ITBS Mathematics Total by Student Group from Spring 2010 to Spring 2013 ......... 87
38 Percentage of District Third-Grade Students that Met Satisfactory on STAAR
Mathematics in Spring 2012 and Spring 2013 .......................................................... 88
39 Percentage of District Third-Grade Students that Met Advanced on STAAR Mathematics in Spring 2012 and Spring 2013 .......................................................... 89
40 Correlations between RM Use and Mathematics Achievement Test Scores ................ 91
41 Multiple Regression Results for Second-Grade 2013 ITBS Math Scores and Various Predictors ..................................................................................................... 93
42 Multiple Regression Results for Second-Grade 2013 ITBS Math Scores and Three Main Predictors ......................................................................................................... 93
43 Multiple Regression Results for Third-Grade 2013 STAAR Scores and Various
Predictors .................................................................................................................. 95
44 Multiple Regression Results for Third-Grade 2013 STAAR Scores and Three Main Predictors .................................................................................................................. 95
45 Multiple Regression Results for Third-Grade 2013 ACP Scores and Various
Predictors .................................................................................................................. 97
46 Multiple Regression Results for Third-Grade 2013 ACP Scores and Three Main Predictors .................................................................................................................. 97
47 RM Student ITBS and STAAR Results by Hours Online and Level A Accuracy
Rates ......................................................................................................................... 100
List of Figures
Figure Page
1 Percentage of Supported and Non-Supported RM Teachers that Completed the
Qualification Course by Time Period ........................................................................ 11
2 Percentage of RM Teachers that Completed the Qualification Course by Division ...... 13
3 Percentage of Supported Teachers that Met RM Training Requirements ..................... 14
4 Percentage Distribution of Observation Ratings: Data Driven Decisions ...................... 17
5 Percentage Distribution of Observation Ratings: Lesson Planning ............................... 17
6 Percentage Distribution of Observation Ratings: Instructional Methods ....................... 18
7 Percentage Distribution of Observation Ratings: Learning Modes ................................ 18
8 Percentage Distribution of Observation Ratings: Teacher Engagement ....................... 19
9 Percentage Distribution of Observation Ratings: Procedures ....................................... 19
10 Percentage Distribution of Observation Ratings: Incentive Systems ............................ 20
11 Percentage Distribution of Observation Ratings: Notebooks ........................................ 20
12 Percentage Distribution of Observation Ratings: Independent Learning ....................... 21
13 Percentage Distribution of Observation Ratings: Student Engagement ........................ 21
14 Mean Hours Students Spent on RM in Fall and Spring by Grade ................................. 25
15 Percentage of Students in RM Hour Categories by Grade and Semester .................... 25
16 Percentage of Schools in RM Hour Categories by Grade and Semester ...................... 28
17 Mean RM Objectives Students Completed in Fall and Spring by Grade ....................... 49
18 Percentage of RM Objectives Completed by Grade and Semester .............................. 49
19 Percentage of Campus Administrators, Supported Teachers, and Non-Supported Teachers that “Agreed” or “Strongly Agreed” with Selected Survey Items ............... 77
20 Percentage of Campus Administrators, Supported Teachers, and Non-Supported
Teachers that “Agreed” or “Strongly Agreed” with Selected Support Items ............. 78
21 Percentage of Campus Administrators, Supported Teachers, and Non-Supported Teachers that “Agreed” or “Strongly Agreed” with Technology Issues ..................... 79
22 Number of Staff Survey References to Improved Student Learning and
Engagement .............................................................................................................. 80
23 Number of Staff Survey References to Technology-Related Challenges ...................... 80
24 Number of Staff Survey References to Scheduling Issues ............................................ 81
25 Number of Staff Survey References to Suggestions for Improving Technology ........... 81
26 Percentage of Second-Grade Students At or Above the 40th Percentile on Spring 2013 ITBS Mathematics by Level A Accuracy and Total Hours Online .................... 99
27 Percentage of Third-Grade Students that Met Level 2 Satisfactory on Spring 2013
STAAR Mathematics by Level A Accuracy and Total Hours Online ......................... 99
EVALUATION OF THE REASONING MIND MATHEMATICS PROGRAM: 2012-2013 Evaluators: Joan Bush and Myoungsook Kim, Dallas Independent School District
ABSTRACT
During 2012-13, the Reasoning Mind (RM) supplemental mathematics curriculum was provided in grades two and three at all elementary schools except for Allen and Dealey. The district budgeted Title I, Part A funds ($969,500) to pay for student accounts along with Title II, Part A funds ($525,000) to cover teacher training for a total of $1,494,500. Overall, 26,151 students were enrolled in RM schools; this included 13,398 (98.8%) of the district’s 13,563 second-grade students and 12,753 (98.8%) of the district’s 12,907 third-grade students. Whereas all RM teachers were supported in 2011-12, the district allocated one supported teacher per campus in 2012-13 due to cost. A total of 584 second- and third-grade teachers were trained, including 129 supported teachers, 402 non-supported teachers, and 53 previous RM teachers. Fewer than half (48%) completed the RM Qualification Course by the end of the first six weeks, which delayed use of RM with students. Most supported teachers (92%) completed the required Best Practice and Curriculum Study workshops. Student use of RM improved from fall to spring; however, most students did not meet the 30-hour-per-semester requirement in fall (98% to 99%) or spring (68% to 69%). The majority completed Level A (easiest) learning mode problems in the fall and both Level A (easiest) and B (medium difficulty) learning mode problems in the spring; however, notably fewer completed Level C (hardest) learning mode problems, review mode problems (A, B, C), and test mode problems. School comparisons mirrored student data. Per the surveys, most campus administrators and teachers wanted to continue using RM, believed students benefited, and were satisfied with district and RM support. Supported teachers were more positive than administrators and non-supported teachers. The majority of campus administrators and supported teachers were positive toward RM training and resources; non-supported teacher responses were mixed.
The most-cited success was increased student learning and engagement. The chief barriers were technology and scheduling problems; the main suggestion was to improve technology. Overall analyses provided a big picture of how district students progressed in mathematics over time; however, due to low implementation of RM, the use of RM as a supplemental program, and the lack of a control group, findings were limited and did not show the true impact of RM on student math achievement. The percentage of second-grade students that scored at or above the 40th percentile on ITBS Mathematics Total slightly decreased from 2012 (57.9%) to 2013 (56.2%), whereas the percentage of third-grade students that met the STAAR Satisfactory mathematics standard marginally increased from 2012 (55.2%) to 2013 (57.3%). Results of correlation analyses revealed that mastery of objectives and accuracy rates on Level A (easiest) learning mode problems were more strongly related to math achievement than time spent online. Per multiple regression analyses, three predictors (prior 2012 ITBS mathematics achievement, mastery of objectives, learning mode Level A accuracy rates) explained 64 percent of the variance of second-grade spring 2013 ITBS scores. Similarly, three predictors (learning mode Level A accuracy rates, prior 2012 ITBS mathematics achievement, STAAR test mode Level A accuracy rates) explained 68 percent of the variance in third-grade STAAR scores and 61 percent of the variance of spring third-grade ACP scores. Follow-up frequency analyses showed that time is important; however, students with accuracy rates at 75 percent or higher did notably better on ITBS and STAAR.
PROGRAM DESCRIPTION

The Reasoning Mind (RM) technology-based mathematics curriculum program was provided to
Dallas Independent School District (ISD) second-grade students in 2011-12 and expanded to include
second- and third-grade students in 2012-13. Students at all but two district elementary schools (Allen,
Dealey) were enrolled in 2012-13. RM developed the adaptive, online mathematics curriculum to be used
as a supplement to the regular classroom teachers’ mathematics instruction in grades 2 through 4 and as
the core mathematics curriculum for grades 5 and 6. The district chose to use RM in second- and third-
grade classrooms due to a clear need to improve districtwide math achievement.
Per the RM web site, “A Reasoning Mind classroom is a hybrid of online and face-to-face
instruction, where the teacher gives each child individual help and attention.” During class time, students
use individual computers to log into the online RM program and to work through lessons and
corresponding problems at their own pace. The RM system develops a personalized path for each child
based on an assessment that identifies strengths and weaknesses. When a student struggles with a
problem, a request for the “Genie Solution” can be made. The “Genie Solution” provides a thorough
explanation for computing the problem. While students are working online, the teacher can view an RM
administrator screen to see how students are progressing and to note which students are having
difficulties. At that point, the teacher provides one-on-one interventions or small-group tutorials.
The support model for RM includes supported and non-supported teachers. In 2011-12, all
teachers were supported; however, in 2012-13, the district opted to pay for one supported teacher per
school for financial reasons, as the cost per supported teacher was $3,500. As a result, the remaining
second- and third-grade teachers were non-supported. Through the RM support model, the supported
teachers worked closely with an RM program coordinator and were to collaborate with the non-supported
teachers in the building. Both supported and non-supported teachers were required to take the RM
Qualification Course and pass the course exam before using RM with their students. Supported teachers
received six additional professional development courses (12 hours), three formal program coordinator
observations, and periodic program coordinator visits and communication. In addition, the RM program
coordinators were to set up periodic meetings that included the principal and all RM teachers. Program
coordinators were available to assist both supported and non-supported teachers as time allowed;
however, the focus was on the supported teachers.
Program Goals. The goal and action steps to achieve it were summarized in a November 2012
memorandum from Superintendent Miles to the Board of Trustees and presented at the
November 8, 2012 School Board Briefing. The overall goal of RM in 2012-13 was to increase
mathematics achievement for second- and third-grade students. As for action steps, the district agreed to
ensure that all campuses implemented the program with fidelity to the RM model, that all teachers
completed the training, and that every campus had a schedule that provided each student with the
required amount of time for the program. RM promised to provide training, on-site coaching, support, and
weekly summaries of student time spent on RM, so the district could make adjustments as needed.
PURPOSE AND SCOPE OF THE EVALUATION
The purpose of this report is to summarize the context, implementation, and outcomes of the RM
Program. The evaluation included interviews with RM staff members as well as with Dallas ISD program staff
members assigned to the grant, a review of internal documents, and analyses of RM database files,
campus administrator and teacher survey data, and math standardized assessment data.
MAJOR EVALUATION QUESTIONS AND RESULTS
2.1 What were the sources and amounts of funding for Reasoning Mind?
Methodology
The workscopes for Title I, Part A and Title II, Part A were reviewed to note sources and amounts
of funding for RM. The Title I workscope was reviewed to determine the cost per student. Likewise, the
Title II workscope was studied to find out the cost per teacher trained.
Results
The total district budget for RM during 2012-13 was $1,494,500, which included $969,500 of
Title I, Part A funds and $525,000 of Title II, Part A funds. Title I funds were allotted to pay for 27,700
individual student accounts at a cost of $35 per student account; this included 14,200 second-grade and
13,500 third-grade student accounts. As for Title II, funds were allocated to pay for 150 supported
teachers’ professional development at a cost of $3,500 per teacher.
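The budget figures reported above can be cross-checked with simple arithmetic; the short sketch below (illustrative only, using the amounts stated in this section) confirms that the per-unit costs multiply out to the budgeted totals.

```python
# Sanity check of the reported RM budget figures (illustrative only).
student_accounts = 27_700      # Title I: budgeted student accounts
cost_per_account = 35          # dollars per student account
supported_teachers = 150       # Title II: budgeted supported teachers
cost_per_teacher = 3_500       # dollars per teacher trained

title_i = student_accounts * cost_per_account      # $969,500
title_ii = supported_teachers * cost_per_teacher   # $525,000
total = title_i + title_ii                         # $1,494,500

print(f"Title I: ${title_i:,}; Title II: ${title_ii:,}; Total: ${total:,}")
```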
2.2 What were key findings from the literature review related to technology-based supplemental
instruction and scaling up educational initiatives?
Methodology
A literature review was conducted to summarize strengths and challenges of RM implementation
found in other settings and to review best practices related to computer-based instruction and the process
of “scaling up” initiatives. The review was not comprehensive but was meant to summarize findings that
can guide future implementation of RM in Dallas ISD.
Results
Reasoning Mind Studies
Five evaluation studies were found in the literature for RM. No studies were based on the second-
and third-grade supplemental programs used in Dallas ISD. Rather, studies were conducted in grades
four through seven. Three of the five studies took place in Houston ISD, whereas the other two were
conducted in Angleton and Beaumont.
Weber (2003) evaluated a small pilot project in Houston ISD that involved an experimental
(N=30) and control group (N=26). He found meaningful differences between seventh graders in the two
groups and concluded that the positive RM results were “far beyond reasonable given the focus and
duration of the project.” The study showed that most students and the two teachers that participated in
RM were positive about their RM experiences.
In 2006, Weber published findings from a study conducted in 10 Houston middle schools. Results
showed that “the implementation and evaluation” of RM “were fraught with problems.” For example, many
schools did not use RM until the spring semester. The comparison group outperformed the RM group in
most comparisons. Further outcome analyses showed that “student achievement in reading and student
performance on prior measures of mathematics achievement” were better predictors of success in
mathematics than RM. Weber noted that the field test accomplished its purpose by identifying problems
and possible solutions. Implementation issues included confusing theory presentations, “bugs” in certain
mathematics problems, “overly wordy solutions,” solutions skipped by students due to length, problems
becoming too difficult too quickly, and so forth. The “glitches” led to some teacher and student frustration.
Weber suggested that RM further investigate a variety of areas such as whether students with low
reading skills could be successful using RM, whether a sufficient number of teachers could be
appropriately trained, what level of teacher support was required to ensure the success of the program,
and the scalability of RM and the Deployment Coordinator model.
In a study of fifth graders at three schools in Angleton ISD, Waxman and Houston (January 2008)
found that RM students outperformed control students on an RM-developed pre- and posttest but not on
the Texas Assessment of Knowledge and Skills (TAKS). Results of teacher and student surveys were
positive.
Waxman and Houston (2012) conducted a study that included fifth-grade students at 16 schools
(eight treatment and eight control) in Beaumont ISD. Results showed that the higher TAKS scores of RM
students versus comparison students were statistically and practically significant. In addition, results
revealed that the “percentage of correct answers” was the best predictor of RM students’ performance on
the math TAKS and that “percentage of correct answers” had a greater effect on math TAKS than
students’ prior year performance. Teacher and student survey results revealed positive perceptions
toward RM.
Houston ISD (2011) conducted an internal evaluation of the RM fourth-grade supplemental
program used in 21 schools. Outcome results for the RM and matched comparison group were mixed. On
surveys, most students reported that they enjoyed RM and felt RM “helped them understand math better.”
Teacher surveys were positive for the most part; however, some teachers had concerns about the
alignment of RM and district curriculum and about students “missing other instruction while participating in
RM.”
In summary, the relationship between RM and student achievement was very positive in some
studies but mixed in others. Weber’s external evaluation of the Houston ISD field study identified
implementation issues that should be considered in future studies such as “bugs” in the program, teacher
training and support, scalability of the RM model, and scalability of the Deployment Coordinator model.
Educational Technology and Scalability Studies
Cisco Systems commissioned the Metiri Group (2006 and 2009) to summarize findings in the field
of educational technology. In reports, the authors state that technology advocates “over-promised”
student learning outcomes due to “underestimating the critical need for system changes required to use
technologies effectively in learning.” Based on their literature review, they found that technology provided
“a small, but significant, increase in learning when implemented with fidelity and accompanied by
appropriate pedagogical shifts.” Barriers to effective use of instructional technology include lack of access
due to unreliable or outdated technology along with lack of vision, absence of an innovative school
culture, and/or limited resources. As a result, the authors emphasize the need to address challenges that
may be unique to specific schools and to take into account the importance of “leadership development,
professional development, school culture, curricular redesign, and teacher preparation.”
Cheung and Slavin (2011) conducted a meta-analysis of math-related educational technology
applications. Across the studies reviewed, educational technology programs produced a small, positive
effect on mathematics achievement. Supplemental computer-assisted instruction (CAI) had the largest
effect on students’ math achievement. Studies with small sample sizes produced “twice the effect sizes of
those with large sample sizes;” they suggested that one of the likely reasons was that small-scale studies
could be more tightly controlled than large-scale studies. They also found differences related to program
intensity (i.e., larger effects were found for programs that required more student time). They conjectured
that though some attribute the small effect of supplemental programs to low implementation, the limited
time given to implementation could be part of a larger problem. That is, it is possible that “separate CAI
programs are not well accepted or seen as central to instruction by teachers, so teachers may not make
sure that students get the full amount of time on technology recommended by vendors.”
In Levin’s recent study on what it takes to scale up innovations, he notes that innovations carry “significant
additional costs” that make them difficult to replicate without “significant additional resources.” Specifically, Levin
described large-scale change in American schools as a “very daunting proposition” due to “a fundamental
tension between replicating a program or practice exactly and adapting it to meet different local
circumstances.” He proposes comparing an innovation to the “standard” (original) model in five areas:
cost, human capacity, tools and infrastructure, political support, and non-school factors. The “more the
innovation differs from the standard model, the harder it will be to scale.” Brief descriptions of the five
areas follow.
Cost. It is important to understand the costs of carrying out an innovation in comparison to the
standard model to determine “how much more is required per student or per school.”
Human capacity. Innovators must determine if the innovation demands a “significantly higher
level of skill or commitment than is found in the system now.” For example, an innovation could require a
higher level of competence, a higher level of time commitment, or a “behavior that people cannot
currently do.” Also, an innovation could require “particularly skilled leaders” and “key support people.” In
summary, “the more complex an innovation, and the further it is away from current practice in most
schools, the higher the human capital demands.”
Tools and infrastructure. If an innovation requires supports that are not typically available in
schools, they must be identified. Examples include types of facilities, technology, training materials,
additional time for training or development, and so forth.
Political support. Internal and external supports are important and include support from elected
leaders, school and district leaders, teachers, students, and parents. He pointed out that people’s
perceptions are “not always well informed, but they are real.” Also, he noted, “Innovations that do not
meet the public-acceptability test are unlikely to succeed at scale.”
Non-school factors. “Factors outside the school can affect the scaling of any initiative.” Outside
factors that differ from the model can include student demographic characteristics, mobility rates, home
technology access, parent time or commitment, and so forth.
Levin cautions that “it cannot be assumed that the difficulty of any of these challenges will change
in lockstep with an increase in scale of application.” While some issues may become easier with “wider”
implementation, others could become more difficult.
Thus, implementation fidelity is of utmost importance and requires systemic changes including
added support and attention to issues that can preclude successful implementation such as lack of staff
buy-in, limited support at all levels, the absence of pedagogical shifts, variation across schools, and so
forth. For example, the Metiri Group emphasized the importance of developing visionary leaders,
providing high-quality professional development, ensuring a healthy school culture, redesigning the
curricula to take into account technology use, and adequately preparing teachers to teach using
technology and/or technology-based programs. Cheung and Slavin note larger effect sizes for small-scale
than large-scale programs due to tighter control of small-scale programs and the possibility that some
teachers may not fully accept or implement supplemental programs; this could certainly be a possibility in
a district as large as Dallas ISD. Similarly, Levin suggests the importance of considering five areas when
“scaling up” a program: cost, human capacity, tools and infrastructure, political support, and non-school
factors. Some of these areas were suggested by Weber’s 2006 study in Houston ISD as well and
certainly could be applicable to the large-scale rollout of RM in Dallas ISD.
2.3 What were the demographic characteristics of students involved in Reasoning Mind?
Methodology
RM student demographic data were extracted from the Dallas ISD Public Education Information
Management System (PEIMS) database. Specifically, data for enrolled students were pulled from the
October 29, 2012 fall snapshot file, which is used for state accountability ratings. During the 2012-13
school year, second- and third-grade students at all but two Dallas ISD elementary schools (Allen,
Dealey) were enrolled in RM. Frequency analyses were computed for student demographic
characteristics by district division and grade level.
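The frequency analyses described above can be sketched in Python. This is a minimal illustration with hypothetical field names and sample records, not the evaluators' actual code or the PEIMS schema.

```python
from collections import Counter

# Hypothetical PEIMS-style records; the field names and values are illustrative only.
students = [
    {"division": 1, "grade": 2, "gender": "M", "eco_dis": True},
    {"division": 1, "grade": 2, "gender": "F", "eco_dis": True},
    {"division": 2, "grade": 3, "gender": "M", "eco_dis": False},
]

def frequency(records, field):
    """Count each value of `field` and express it as (N, percent of records)."""
    counts = Counter(r[field] for r in records)
    total = len(records)
    return {value: (n, round(100 * n / total)) for value, n in counts.items()}

# Frequencies by grade level, as in Table 1.
second_graders = [r for r in students if r["grade"] == 2]
print(frequency(second_graders, "gender"))  # {'M': (1, 50), 'F': (1, 50)}
```

The same helper can be applied per division and per demographic flag to build each cell of a table like Table 1.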
Results
Based on October 2012 PEIMS data, 26,151 second- and third-grade students were enrolled in
RM schools during the 2012-13 school year. This included 13,398 (98.8%) of the district’s 13,563
second-grade students and 12,753 (98.8%) of the district’s 12,907 third-grade students. Table 1 displays
demographic characteristics of RM students by division and grade level. Across divisions and grade
levels, there were slightly more male than female students in all comparisons; the major ethnic groups
were Hispanic and African American. The majority of RM students (92%) were economically
disadvantaged, and half (51%) were limited English proficient (LEP). Ten percent were identified as
gifted, and six percent received special education services. When reviewed by district division, there was
some demographic and grade-level variation.
Table 1
Demographic Characteristics of 2012-13 Reasoning Mind Students
Characteristic   Division 1    Division 2    Division 3    Division 4    Division 5    All
                 N (%)         N (%)         N (%)         N (%)         N (%)         N (%)

RM Second-Grade Students (N=13,398)
Gender
  Male           1,457 (51)    1,394 (53)    1,218 (51)    1,180 (53)    1,672 (51)    6,921 (52)
  Female         1,396 (49)    1,233 (47)    1,185 (49)    1,042 (47)    1,621 (49)    6,477 (48)
Ethnicity
  Af. Am.        555 (19)      618 (24)      375 (16)      842 (38)      742 (23)      3,132 (23)
  Hispanic       2,233 (78)    1,813 (69)    1,914 (80)    1,130 (51)    2,297 (70)    9,387 (70)
  White          47 (2)        111 (4)       80 (3)        202 (9)       215 (6)       655 (4)
  Other1         18 (1)        85 (3)        34 (1)        48 (2)        39 (1)        224 (4)
Eco. Dis.        2,650 (93)    2,399 (91)    2,267 (94)    1,889 (85)    3,098 (94)    12,303 (92)
Gifted           194 (7)       211 (8)       257 (11)      244 (11)      283 (9)       1,189 (9)
Spec. Ed.        153 (5)       164 (6)       146 (6)       130 (6)       173 (5)       766 (6)
LEP              1,589 (56)    1,461 (56)    1,363 (57)    766 (35)      1,631 (50)    6,810 (51)
Total            2,853 (100)   2,627 (100)   2,403 (100)   2,222 (100)   3,293 (100)   13,398 (100)

RM Third-Grade Students (N=12,753)
Gender
  Male           1,423 (51)    1,311 (52)    1,120 (52)    1,106 (50)    1,630 (52)    6,590 (52)
  Female         1,349 (49)    1,211 (48)    1,031 (48)    1,088 (50)    1,484 (48)    6,163 (48)
Ethnicity
  Af. Am.        541 (19)      572 (23)      289 (13)      848 (39)      739 (24)      2,989 (23)
  Hispanic       2,166 (78)    1,762 (70)    1,782 (83)    1,091 (50)    2,175 (70)    8,976 (70)
  White          47 (2)        110 (4)       46 (2)        213 (10)      152 (5)       568 (5)
  Other1         18 (1)        78 (3)        34 (2)        42 (1)        48 (1)        220 (2)
Eco. Dis.        2,573 (93)    2,304 (91)    2,033 (95)    1,839 (84)    2,891 (93)    11,640 (91)
Gifted           280 (10)      295 (12)      284 (13)      302 (14)      336 (11)      1,497 (12)
Spec. Ed.        185 (7)       172 (7)       130 (6)       124 (6)       221 (7)       832 (7)
LEP              1,520 (55)    1,416 (56)    1,270 (59)    774 (35)      1,551 (50)    6,531 (51)
Total            2,772 (100)   2,522 (100)   2,151 (100)   2,194 (100)   3,114 (100)   12,753 (100)

All RM Students (N=26,151)
Gender
  Male           2,880 (51)    2,705 (53)    2,338 (51)    2,286 (52)    3,302 (52)    13,511 (52)
  Female         2,745 (49)    2,444 (47)    2,216 (49)    2,130 (48)    3,105 (48)    12,640 (48)
Ethnicity
  Af. Am.        1,096 (19)    1,190 (23)    664 (15)      1,690 (38)    1,481 (23)    6,121 (23)
  Hispanic       4,399 (78)    3,575 (69)    3,696 (81)    2,221 (50)    4,472 (70)    18,363 (70)
  White          94 (2)        221 (4)       126 (3)       415 (9)       367 (6)       1,223 (5)
  Other1         36 (1)        163 (3)       68 (1)        90 (2)        87 (1)        444 (2)
Eco. Dis.        5,223 (93)    4,703 (91)    4,300 (94)    3,728 (84)    5,989 (94)    23,943 (92)
Gifted           474 (8)       506 (10)      541 (12)      546 (12)      619 (10)      2,686 (10)
Spec. Ed.        338 (6)       336 (7)       276 (6)       254 (6)       394 (6)       1,598 (6)
LEP              3,109 (55)    2,877 (56)    2,633 (58)    1,540 (35)    3,182 (50)    13,341 (51)
Total            5,625 (100)   5,149 (100)   4,554 (100)   4,416 (100)   6,407 (100)   26,151 (100)
Source. PEIMS district database (10/29/2012) for all second- and third-grade students except for Allen and Dealey.
Note. Some percentages for ethnicity may not add to 100 due to rounding.
1Other included Asian, American Indian or Alaska Native, Native Hawaiian or Other Pacific Islander, two or more races, and not available.
2.4 Did staff members participate in Reasoning Mind training as planned?
Methodology
RM staff members extracted professional development completion data and classroom
observation data from RM databases. Frequency analyses were conducted to note the number and
percentage of teachers that completed various training courses. Qualification data were reviewed overall,
by teacher type, by time period of course completion, and by district division. Other professional
development data were reviewed overall and in some cases by teacher type and district division.
Classroom observation data were analyzed by rubric indicator.
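The grouping of course-completion dates into reporting periods can be sketched as follows. The cut-off dates come from the report's time-period categories, but the function and data layout are illustrative assumptions, not the evaluators' code.

```python
from datetime import date

# Cut-off dates for the reporting periods used in the report (see Table 2).
periods = [
    ("Before 8/27/2012", date(2012, 8, 27)),
    ("8/28/2012-10/4/2012", date(2012, 10, 4)),
    ("10/05/2012-1/18/2013", date(2013, 1, 18)),
    ("1/19/2013-3/08/2013", date(2013, 3, 8)),
    ("3/09/2013-5/17/2013", date(2013, 5, 17)),
]

def classify(completed_on):
    """Map a completion date to its reporting period; None means not completed."""
    if completed_on is None:
        return "Not completed"
    for label, cutoff in periods:
        if completed_on <= cutoff:
            return label
    return "Not completed"  # completed after the mid-May cut-off

print(classify(date(2012, 9, 15)))  # 8/28/2012-10/4/2012
```

Tallying the classified dates by teacher type then yields the counts and percentages reported in the tables that follow.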
Results
RM provided training for supported teachers, non-supported teachers, and campus
administrators. All first-year RM supported and non-supported teachers were required to complete the RM
Qualification Course before they could use RM with their students; as part of the course, teachers had to
pass an end-of-course assessment. In addition, supported teachers were expected to complete two Best
Practice Workshops and four Curriculum Study Workshops; both online and in-person training options
were provided. Non-supported teachers could participate in the Best Practice and Curriculum Study
training as well, but participation was not a requirement. All online courses were designed to take two
hours but could be completed flexibly. That is, teachers could begin a course, sign out, and return later to
pick up where they left off; teachers could sign in and out as many times as necessary without penalty.
Qualification Course for Teachers
As mentioned above, both supported and non-supported teachers were required to complete the
RM Qualification Course before they could use RM with the students. Table 2 shows that a total of 584
teachers completed the course; this included 129 supported teachers, 402 non-supported teachers, and
53 inactive teachers. One supported teacher and 25 non-supported teachers never completed the course.
Table 2
Number and Percentage of Teachers that Completed the Qualification Course by Time Period

                      Supported   Non-Supported   Inactive*   Total
                      N (%)       N (%)           N (%)       N (%)
Before 8/27/2012      73 (56)     45 (11)         24 (45)     142 (23)
8/28/2012-10/4/2012   21 (16)     123 (29)        11 (21)     155 (25)
10/05/2012-1/18/2013  33 (25)     198 (46)        17 (32)     248 (41)
1/19/2013-3/08/2013   2 (2)       28 (7)          1 (2)       31 (5)
3/09/2013-5/17/2013   0 (0)       8 (2)           0 (0)       8 (1)
Not completed         1 (1)       25 (6)          0 (0)       26 (4)
Total                 130 (100)   427 (100)       53 (100)    610 (100)
Note. Some percentages may not add to 100 due to rounding.
*Inactive RM teachers received RM training but left the district or were reassigned to another position at some point during the school year.
Less than half of the teachers (48%) completed the RM Qualification Course by October 4, 2012,
which meant over half of the students could not use RM during the first six weeks of the school year. As
seen in Table 2 and Figure 1, notably more supported (72%) than non-supported (40%) teachers
completed the course by the end of the first six weeks. Almost half of the non-supported teachers (46%)
finished the course between October 5, 2012 and January 18, 2013 (end of first semester). Two (2%)
supported teachers and 36 (9%) non-supported teachers completed the course during the spring
semester; most likely, this was related to teacher turnover. There were 26 teachers that did not meet the
course requirement by the mid-May cut-off date; most (25 of 26) were non-supported teachers.
Figure 1. Percentage of Supported and Non-Supported RM Teachers that Completed the Qualification Course by Time Period
Table 3 and Figure 2 show the number of RM teachers who completed the RM Qualification
Course by district division during 2012-13. For supported teachers, a sizeable proportion in
Division 1 (58%), Division 2 (63%), and Division 3 (69%) completed the course before school started, but
less than half did so in Divisions 4 (46%) and 5 (44%); rather, most supported teachers in Divisions 4 and
5 finished the course during the first semester. The percentage of non-supported teachers that completed
the course by the end of the first six weeks ranged from 20 percent (Division 4) to 48 percent (Division 5)
versus 60 percent (Division 4) to 80 percent (Division 2) of supported teachers.
Table 3
Number and Percentage of Teachers that Completed the Qualification Course by Division
                   Division 1            Division 2            Division 3            Division 4            Division 5
                   Supp.     Non-Supp.   Supp.     Non-Supp.   Supp.     Non-Supp.   Supp.     Non-Supp.   Supp.     Non-Supp.
                   N (%)     N (%)       N (%)     N (%)       N (%)     N (%)       N (%)     N (%)       N (%)     N (%)
Before 8/27/12     18 (58)   7 (7)       15 (63)   10 (12)     18 (69)   8 (10)      10 (46)   7 (10)      12 (44)   13 (15)
8/28/12-10/4/12    3 (10)    30 (30)     4 (17)    29 (35)     2 (8)     28 (34)     3 (14)    7 (10)      9 (33)    29 (33)
10/05/12-1/18/13   9 (29)    43 (43)     5 (21)    37 (45)     4 (15)    38 (46)     9 (41)    48 (67)     6 (22)    32 (36)
1/19/13-3/08/13    1 (3)     7 (7)       0 (0)     4 (5)       1 (4)     6 (7)       0 (0)     6 (8)       0 (0)     5 (6)
3/09/13-5/17/13    0 (0)     1 (1)       0 (0)     1 (1)       0 (0)     2 (2)       0 (0)     1 (1)       0 (0)     3 (3)
Not completed      0 (0)     12 (12)     0 (0)     2 (2)       1 (4)     1 (1)       0 (0)     3 (4)       0 (0)     7 (8)
Note. Some percentages may not add to 100 due to rounding.
Figure 2. Percentage of RM Teachers that Completed the Qualification Course by Division
Best Practice and Curriculum Study Workshops for Teachers
In addition to the RM Qualification Course, supported teachers were required to earn six credits
(12 hours) of RM professional development. Teachers that completed the requirement received a $500
stipend. First-year RM teachers were expected to complete four Curriculum Study Workshops and two
Best Practice Workshops. Teachers that had implemented RM for more than a year could choose any
combination of Best Practice and Curriculum Study Workshops to meet the six-credit requirement.
Teachers could also earn credits beyond the required credits if they chose to do so. Teachers had the
choice of numerous in-person sessions during the fall and spring semesters along with online options.
The in-person Curriculum Study and Best Practice Workshops were held at RM’s Dallas office.
Overall, 92 percent of supported teachers (133 out of 144) attained the six-credit professional
development requirement; 85 percent met the Best Practice Workshop attendance expectation, and 83
percent did so for the Curriculum Study Workshop requirement. Very few non-supported teachers
(N=11; 2%) participated in workshops, and none completed six credits. Figure 3 shows the percentage of
supported teachers that met the six-credit training requirement by division. Over 90 percent of the
teachers in Division 1 (94%), Division 4 (96%) and Division 5 (94%) met the requirement, and 89 percent
of teachers in Divisions 2 and 3 did so.
Figure 3. Percentage of Supported Teachers that Met RM Training Requirements
Professional Development for Campus Administrators

At the request of campus administrators, training sessions were provided during July and August
2012. The purpose of the training was to give campus administrators an overview of the RM program. In
all, 55 campus administrators from 51 elementary schools attended training. Attendees included
principals (N=44), assistant principals (N=3), former principals (N=3), instructional coaches (N=4), and
other (N=1). When reviewed by divisions, there were 10 from Division 1, 13 from Division 2, 15 from
Division 3, 7 from Division 4, and 10 from Division 5. There were four schools represented by two
administrators each; this included one school in Division 2, two in Division 4, and one in Division 5.
RM Teacher Observations
In addition to formal professional development sessions, RM program coordinators formally
observed each supported teacher at least three times a year. Non-supported teachers were not formally
observed. During the 45-minute observations, RM coordinators used an implementation rubric to rate the
teachers in 10 areas. The rubric was used to monitor implementation fidelity as well as to help teachers
identify strengths and areas that could be improved. Besides formal observations, program coordinators
informally observed and visited with supported teachers throughout the school year to provide feedback,
ideas, and suggested resources. Program coordinators also visited and provided ideas to non-supported
teachers as time allowed. For example, the program coordinators helped teachers think through ways to
overcome technology issues and classroom management challenges. Teachers also had access to
tutorials and RM-hosted symposiums.
The classroom observation rubric included ten indicators and four possible ratings (not
established, established, proficient, advanced). Teachers were to aim for achieving at least “proficient” on
each indicator. However, the main goal was to see improvement over the course of the year. For
example, if a teacher began as “not established,” the program coordinator provided feedback to help the
teacher move up to “established.” Program coordinators tried to spread observations out across the year
to assess for growth over time. However, per one program coordinator, many of the observations could
not begin until the second semester due to teacher turnover or late program launches.
Of the 144 supported teachers included in the database, most (97%) were observed three times,
while the remaining teachers were observed four times. As would be hoped, across the indicators, the
number of teachers who received ratings of “not established” and “established” tended to decrease over
time, whereas the number of teachers who attained ratings of “proficient” or “advanced” increased. (See
Table 4 and Figures 4 to 13.) Data from the fourth observation were not included in the figures due to the
small number of teachers that were observed four times.
Table 4
Teacher Observation Ratings Based on the RM Rubric
                     Observation 1   Observation 2   Observation 3   Observation 4
                     (N=144)         (N=144)         (N=140)         (N=4)
Indicator / Rating   N     %         N     %         N     %         N     %
Data Driven Decisions
  Not Established    35    24        19    13        9     6         0     0
  Established        60    42        23    16        26    19        0     0
  Proficient         42    29        72    50        44    31        1     25
  Advanced           7     5         30    21        61    44        3     75
Lesson Planning
  Not Established    118   82        51    35        12    9         0     0
  Established        22    15        70    49        85    61        2     50
  Proficient         4     3         21    15        30    21        2     50
  Advanced           0     0         2     1         13    9         0     0
Instructional Methods
  Not Established    109   76        61    42        26    18        2     50
  Established        24    17        42    29        42    30        1     25
  Proficient         8     5         30    21        46    33        1     25
  Advanced           3     2         11    8         26    19        0     0
Learning Modes
  Not Established    119   83        67    47        41    29        1     25
  Established        17    12        40    28        42    30        1     25
  Proficient         5     3         25    17        23    17        2     50
  Advanced           3     2         12    8         34    24        0     0
Teacher Engagement
  Not Established    24    17        15    10        7     5         0     0
  Established        32    22        26    18        21    15        0     0
  Proficient         33    23        33    23        32    23        2     50
  Advanced           55    38        70    49        80    57        2     50
Procedures
  Not Established    15    10        10    7         8     6         0     0
  Established        37    26        25    17        20    14        1     25
  Proficient         69    48        68    47        58    41        1     25
  Advanced           23    16        41    29        54    39        2     50
Incentive Systems
  Not Established    67    46        22    15        6     4         0     0
  Established        73    51        88    61        71    51        2     50
  Proficient         4     3         24    17        35    25        2     50
  Advanced           0     0         10    7         28    20        0     0
Notebooks
  Not Established    96    67        62    43        28    20        1     25
  Established        39    27        61    42        75    54        3     75
  Proficient         9     6         18    13        30    21        0     0
  Advanced           0     0         3     2         7     5         0     0
Independent Learning
  Not Established    85    59        40    28        14    10        0     0
  Established        54    37        98    68        112   80        2     50
  Proficient         4     3         3     2         5     4         0     0
  Advanced           1     1         3     2         9     6         2     50
Student Engagement
  Not Established    17    12        10    7         5     3         0     0
  Established        35    24        35    24        26    19        1     25
  Proficient         12    8         13    9         8     6         0     0
  Advanced           80    56        86    60        101   72        3     75
Figure 4. Percentage Distribution of Observation Ratings: Data Driven Decisions
Figure 5. Percentage Distribution of Observation Ratings: Lesson Planning
Figure 6. Percentage Distribution of Observation Ratings: Instructional Methods
Figure 7. Percentage Distribution of Observation Ratings: Learning Modes
Figure 8. Percentage Distribution of Observation Ratings: Teacher Engagement
Figure 9. Percentage Distribution of Observation Ratings: Procedures
Figure 10. Percentage Distribution of Observation Ratings: Incentive Systems
Figure 11. Percentage Distribution of Observation Ratings: Notebooks
Figure 12. Percentage Distribution of Observation Ratings: Independent Learning
Figure 13. Percentage Distribution of Observation Ratings: Student Engagement
Table 5 shows rating results from the third observation by indicator. The majority of teachers
received “proficient” or “advanced” for data driven decisions (75%), teacher engagement (80%),
procedures (80%), and student engagement (78%). Thus, teachers used data to guide individual and
small group instruction, worked with students during most of RM time, and had good procedures in place
to ensure students logged in quickly and so forth. About half (52%) attained “proficient” or “advanced” for
instructional methods, which means about half of the teachers differentiated student instruction during
observed interventions.
Less than half scored “proficient” or “advanced” for lesson planning (30%), learning modes (40%),
incentive systems (45%), notebooks (26%), and independent learning (10%). The majority scored
“established” (61%) for lesson planning because they chose the students and objectives to focus on
during the lesson rather than before. Many teachers did not achieve “proficient” for learning modes
because their students did not spend enough time in review mode; this means low-performing students
were given fewer opportunities to review basic computation questions, and high-performing students
missed chances to work on more rigorous problems. In the case of incentive systems, teachers must set
both class and individual goals to reach “proficient”; thus, over half (55%) did not set both types of goals.
Sometimes it takes teachers longer to reach “proficient” for the notebook indicator because taking notes
is a new skill for many students. To receive a rating of “proficient” on independent learning, no more than
three students can skip the Genie Solution (instructional feedback) when they miss a question; the
purpose of this indicator is to ensure that students learn from their mistakes rather than moving forward
without an understanding of why they missed a question.
Table 5
Teacher Observation Ratings by Indicator for Observation 3
                        Not Established   Established   Proficient   Advanced
Indicator               N (%)             N (%)         N (%)        N (%)
Data Driven Decisions   9 (6)             26 (19)       44 (31)      61 (44)
Lesson Planning         12 (9)            85 (61)       30 (21)      13 (9)
Instructional Methods   26 (19)           42 (30)       46 (33)      26 (19)
Learning Modes          41 (29)           42 (30)       23 (16)      34 (24)
Teacher Engagement      7 (5)             21 (15)       32 (23)      80 (57)
Procedures              8 (6)             20 (14)       58 (41)      54 (39)
Incentive Systems       6 (4)             71 (51)       35 (25)      28 (20)
Notebooks               28 (20)           75 (54)       30 (21)      7 (5)
Independent Learning    14 (10)           112 (80)      5 (4)        9 (6)
Student Engagement      5 (4)             26 (19)       8 (6)        101 (72)
Note. Some percentages may not add to 100 due to rounding.
Table 6 shows the percentage of teachers that had decreased or increased ratings from
Observation 1 to 3 as well as the percentage that had no change. The majority of teachers had increased
ratings for data driven decisions (71%), lesson planning (81%), instructional methods (69%), learning
modes (64%), incentive systems (71%), notebooks (59%), and independent learning (58%). In contrast,
half (51%) had no change in ratings for student engagement; this is likely because the majority of
teachers were “proficient” or “advanced” from the start. Similarly, results were mixed for teacher
engagement and procedures; again, over half were “proficient” or “advanced” from the first observation
on. In general, there was a pattern of increase across time even if teachers did not reach “proficient” in all
areas.
Table 6
Teacher Observation Rating Changes from Observation 1 to 3 by Indicator

                        Decreased   No Change   Increased
Indicator               N (%)       N (%)       N (%)
Data Driven Decisions   11 (8)      30 (21)     99 (71)
Lesson Planning         1 (1)       25 (18)     114 (81)
Instructional Methods   5 (4)       38 (27)     97 (69)
Learning Modes          6 (4)       45 (32)     89 (64)
Teacher Engagement      20 (14)     60 (43)     60 (43)
Procedures              19 (14)     53 (38)     68 (49)
Incentive Systems       1 (1)       39 (28)     100 (71)
Notebooks               8 (6)       50 (36)     82 (59)
Independent Learning    8 (6)       51 (36)     81 (58)
Student Engagement      24 (17)     72 (51)     44 (31)
Note. Some percentages may not add to 100 due to rounding.
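The change classification reported in Table 6 amounts to an ordinal comparison of two rubric ratings. A minimal sketch follows; the rating scale comes from the report, while the function itself is an illustrative assumption rather than the evaluators' code.

```python
# Rubric ratings ordered from lowest to highest, per the RM observation rubric.
SCALE = ["Not Established", "Established", "Proficient", "Advanced"]

def rating_change(obs1, obs3):
    """Classify a teacher's movement on one indicator between two observations."""
    delta = SCALE.index(obs3) - SCALE.index(obs1)
    if delta > 0:
        return "Increased"
    if delta < 0:
        return "Decreased"
    return "No Change"

print(rating_change("Not Established", "Proficient"))  # Increased
```

Applying this function to each teacher-indicator pair and tallying the three outcomes reproduces the structure of Table 6.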
2.5 Was the Reasoning Mind program used by students as planned?
Methodology
RM provided second- and third-grade student data files for the fall and spring semesters. The
data files included students’ time spent online using RM, number of objectives completed, and math
problem accuracy rates. Data were aggregated to prepare for school-level analyses. The evaluators
conducted frequency and descriptive analyses to determine student- and school-level implementation
related to time online, objectives mastered, and accuracy rates within various modes of the RM program.
The evaluators dealt with two challenges related to RM data files. First, RM did not receive full
district data files except at the very beginning of the year and did not have access to student identification
numbers until March. As a result, RM did not know when students were not included and/or moved in and
out of the district. Upon teacher request to add student accounts, the RM district coordinator worked with
district technology staff to get the information to RM; however, no ongoing, automatic updates were in
place. Second, the fall files used in this report differ from the fall files used in the interim report because
RM provided updated data that included many students that were missing in the previous fall files. The
evaluators used the updated fall files to be as accurate as possible.
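The hour categories used in the results below can be sketched as a simple binning function. The category boundaries come from the report; the function name and sample values are hypothetical.

```python
def hour_category(hours):
    """Bin a student's semester hours online into the report's usage categories."""
    if hours >= 30:
        return "30+ hours"
    if hours >= 20:
        return "20-29.99 hours"
    if hours >= 10:
        return "10-19.99 hours"
    return "0-9.99 hours"

# Hypothetical semester totals for a handful of students.
hours = [4.5, 12.0, 24.19, 31.2]
mean_hours = round(sum(hours) / len(hours), 2)
categories = [hour_category(h) for h in hours]
```

The same binning, applied per student and then averaged per school, yields both the student-level and school-level distributions reported below.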
Results
Student Hour Results
To meet RM’s two-hour-per-week requirement, the number of hours students spent online should
be approximately 35 hours each semester; the evaluators used 30 hours or more per semester as the
actual goal to compensate for instructional time lost due to holidays or school events such as assemblies
or field trips. In general, average implementation in terms of hours online was about 35 percent of what it
should have been in the fall and about 80 percent of what it should have been in the spring. As seen in
Figures 14 and 15, mean hours of student use increased from 10.71 (fall) to 24.19 (spring) for
second-grade students and from 10.52 (fall) to 24.33 (spring) for third-grade students. Whereas a few
second- (2%) and third-grade (1%) students met the 30-hour goal in the fall, about a third of second-
(31%) and third-grade (32%) students did so in the spring. In the fall, about half of second- (50%) and
third-grade (52%) students logged less than 10 hours. As for spring, approximately 60 percent of second-
(63%) and third-grade (61%) students logged 20 or more hours. In comparison to 2011-12, the second-grade
fall 2012 mean of 10.71 hours was lower than the fall 2011 mean (13.40); however, the spring 2013
mean (24.19) was higher than the spring 2012 mean (17.20).
Figure 14. Mean Hours Students Spent on RM in Fall and Spring by Grade
Figure 15. Percentage of Students in RM Hour Categories by Grade and Semester

Tables 7 and 8 display overall and division-level student hour information by semester and grade.
In general, there were increases from fall to spring for all divisions. Almost half of Division 1 second-
(49%) and third-grade students (48%) spent 30 or more hours online in the spring; percentages varied for
other second- (21% to 34%) and third-grade (19% to 34%) spring division comparisons. The percentages
of students with 20 or more hours ranged from 55 percent (Division 3) to 80 percent (Division 1) for
second-grade students and from 48 percent (Division 3) to 80 percent (Division 1) for third-grade
students.
Table 7
Number of Hours Students Spent on RM in Fall and Spring by Grade and Division

                    Second Grade               Third Grade
                    Fall          Spring       Fall          Spring
                    N (%)         N (%)        N (%)         N (%)
All
  0-9.99 hours      6,770 (50)    1,715 (13)   6,676 (52)    1,739 (14)
  10-19.99 hours    5,063 (37)    3,238 (24)   4,606 (36)    3,235 (25)
  20-29.99 hours    1,425 (11)    4,312 (32)   1,366 (11)    3,763 (29)
  30+ hours         228 (2)       4,188 (31)   182 (1)       4,084 (32)
Division 1
  0-9.99 hours      1,385 (48)    158 (5)      1,668 (58)    171 (6)
  10-19.99 hours    1,277 (44)    447 (15)     805 (28)      399 (14)
  20-29.99 hours    232 (8)       883 (31)     329 (12)      901 (32)
  30+ hours         11 (<1)       1,408 (49)   55 (2)        1,380 (48)
Division 2
  0-9.99 hours      1,234 (50)    252 (10)     1,055 (43)    295 (12)
  10-19.99 hours    904 (36)      621 (25)     1,090 (45)    620 (26)
  20-29.99 hours    260 (10)      762 (31)     234 (10)      679 (28)
  30+ hours         97 (4)        852 (34)     49 (2)        836 (34)
Division 3
  0-9.99 hours      1,500 (61)    441 (18)     1,288 (60)    413 (19)
  10-19.99 hours    783 (32)      676 (28)     794 (37)      685 (32)
  20-29.99 hours    152 (6)       825 (34)     61 (3)        629 (29)
  30+ hours         21 (1)        506 (21)     3 (<1)        415 (19)
Division 4
  0-9.99 hours      1,257 (55)    364 (16)     1,450 (64)    302 (13)
  10-19.99 hours    787 (35)      590 (26)     578 (26)      619 (27)
  20-29.99 hours    216 (9)       671 (30)     198 (9)       580 (26)
  30+ hours         12 (<1)       644 (28)     21 (1)        764 (34)
Division 5
  0-9.99 hours      1,394 (41)    500 (15)     1,215 (38)    558 (18)
  10-19.99 hours    1,312 (39)    904 (27)     1,339 (43)    912 (29)
  20-29.99 hours    565 (17)      1,171 (35)   544 (17)      974 (31)
  30+ hours         88 (3)        778 (23)     54 (2)        689 (22)
Note. Some percentages may not add to 100 due to rounding.
Table 8
Descriptive Statistics for Reasoning Mind Hours by Grade and Division
                       Second Grade                Third Grade
                       Fall         Spring         Fall         Spring
All
  Number of students   13,486       13,453         12,830       12,821
  Range of hours       0.00-68.93   0.00-131.91    0.00-72.91   0.00-159.56
  Mean hours           10.71        24.19          10.52        24.33
  Standard deviation   7.96         12.73          7.76         13.26
Division 1
  Number of students   2,904        2,896          2,857        2,851
  Range of hours       0.00-42.87   0.00-131.91    0.00-50.61   0.00-103.45
  Mean hours           10.36        28.62          10.24        29.47
  Standard deviation   6.83         11.32          7.99         12.51
Division 2
  Number of students   2,495        2,487          2,428        2,430
  Range of hours       0.00-68.93   0.00-124.46    0.00-57.56   0.00-148.02
  Mean hours           11.45        25.67          11.43        24.92
  Standard deviation   9.18         14.00          7.53         13.33
Division 3
  Number of students   2,456        2,448          2,146        2,142
  Range of hours       0.00-37.83   0.00-85.95     0.00-49.33   0.00-159.56
  Mean hours           8.97         21.29          8.46         20.64
  Standard deviation   6.60         12.50          6.29         12.94
Division 4
  Number of students   2,272        2,269          2,247        2,265
  Range of hours       0.00-46.91   0.00-69.44     0.00-57.60   0.00-110.45
  Mean hours           9.49         22.75          8.92         24.53
  Standard deviation   7.78         12.72          7.57         13.30
Division 5
  Number of students   3,359        3,353          3,152        3,133
  Range of hours       0.00-65.22   0.00-87.00     0.00-72.91   0.00-97.49
  Mean hours           12.55        22.35          12.63        21.57
  Standard deviation   8.48         11.91          8.15         12.49

School Hour Results
As would be expected, school-level data mirrored student-level data. The percentage of schools
that averaged 30 or more hours increased from fall to spring for both second (1% to 29%) and third
grade (0% to 29%). Even so, less than a third (29%) of the RM schools averaged at least 30 hours in the
spring. (See Figure 16.) Most RM schools (66%) averaged 20 hours or more in the spring. When viewed
by division, Division 1 had the highest level of implementation in the spring, whereas Divisions 3 and 5
had the lowest. (See Table 9.)
Figure 16. Percentage of Schools in RM Hour Categories by Grade and Semester
Table 9
School Information by RM Hour Categories in the Fall and Spring by Grade and Division

                    Second Grade           Third Grade
                    Fall        Spring     Fall        Spring
                    N (%)       N (%)      N (%)       N (%)
All
  0-9.99 hours      67 (46)     8 (6)      67 (46)     8 (6)
  10-19.99 hours    65 (45)     41 (28)    67 (46)     40 (28)
  20-29.99 hours    12 (8)      53 (37)    10 (7)      54 (37)
  30+ hours1        1 (1)       42 (29)    0 (0)       42 (29)
Division 1
  0-9.99 hours      14 (45)     0 (0)      17 (55)     1 (3)
  10-19.99 hours    15 (48)     5 (16)     11 (36)     4 (13)
  20-29.99 hours    2 (7)       11 (36)    3 (10)      11 (35)
  30+ hours         0 (0)       15 (48)    0 (0)       15 (48)
Division 2
  0-9.99 hours      12 (43)     1 (4)      12 (43)     1 (4)
  10-19.99 hours    12 (43)     6 (21)     15 (54)     8 (29)
  20-29.99 hours    3 (11)      11 (39)    1 (4)       9 (32)
  30+ hours         1 (4)       10 (36)    0 (0)       10 (36)
Division 3
  0-9.99 hours      14 (52)     3 (11)     14 (52)     4 (15)
  10-19.99 hours    11 (41)     9 (33)     13 (48)     8 (30)
  20-29.99 hours    2 (7)       11 (41)    0 (0)       11 (41)
  30+ hours         0 (0)       4 (15)     0 (0)       4 (15)
Division 4
  0-9.99 hours      14 (56)     1 (4)      15 (60)     0 (0)
  10-19.99 hours    11 (44)     8 (32)     8 (32)      7 (28)
  20-29.99 hours    0 (0)       10 (40)    2 (8)       10 (40)
  30+ hours         0 (0)       6 (24)     0 (0)       8 (32)
Division 5
  0-9.99 hours      13 (38)     3 (9)      10 (29)     2 (6)
  10-19.99 hours    16 (47)     13 (38)    20 (59)     14 (41)
  20-29.99 hours    5 (15)      11 (32)    4 (12)      13 (38)
  30+ hours         0 (0)       7 (21)     0 (0)       5 (15)

Note. Some percentages may not add to 100 due to rounding.
1If the 100+ missing second-grade students were included for Titche, there would be no schools in the highest implementation category in the fall. There were 15 students in the RM files versus 134 in the fall PEIMS file.
Schools within each hour category were rank ordered by division and average RM hours (see Tables 10 to 13). Note that many Titche students were not included in the RM files: there were 15 Titche second-grade students in the fall and spring RM files versus 134 in the October 2012 PEIMS file, and 42 third-grade students in the fall RM file and 43 in the spring RM file versus 116 in the October PEIMS file. As a result, the Titche averages for time online are misleading and would be much lower if the missing students, who logged no time in RM, were included.
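The school-level summaries in Tables 10 to 13 can be reproduced from per-student usage logs. The following is a minimal sketch using hypothetical per-student hour records and made-up school names (the actual analysis merged RM usage files with PEIMS enrollment data, which is not shown here):

```python
from statistics import mean, pstdev

# Hypothetical per-student online hours, keyed by (division, school).
hours = {
    (1, "School A"): [0.0, 1.2, 0.3],
    (1, "School B"): [20.1, 24.4, 22.6],
    (2, "School C"): [2.0, 3.1, 1.4],
}

def category(avg):
    """Map a school's mean hours to the report's implementation bands."""
    if avg < 10:
        return "0-9.99 hours"
    if avg < 20:
        return "10-19.99 hours"
    if avg < 30:
        return "20-29.99 hours"
    return "30+ hours"

rows = []
for (division, school), hrs in hours.items():
    avg = mean(hrs)
    rows.append({
        "division": division, "school": school, "n": len(hrs),
        "mean": round(avg, 2), "min": min(hrs), "max": max(hrs),
        "sd": round(pstdev(hrs), 2), "category": category(avg),
    })

# Rank order within each hour category by division and mean hours,
# mirroring the layout of Tables 10 to 13.
rows.sort(key=lambda r: (r["category"], r["division"], r["mean"]))
```

Schools with no usage files (such as the missing Titche records) would simply be absent from `hours`, which is why the published Titche means overstate implementation there.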
Table 10
RM Second-Grade Fall Hour Information with Schools Rank Ordered by Division and Mean Hours Online
Division   School   Number of Students   Mean Hours   Minimum Hours   Maximum Hours   Standard Deviation
0-9.99 Hours 1 Twain 62 0.84 0.00 13.74 2.13 1 Foster 123 0.91 0.00 9.62 1.06 1 Weiss 84 4.21 0.00 9.46 1.70 1 Rosemont 181 4.41 0.00 19.87 1.98 1 Alexander 72 5.52 0.00 17.09 4.16 1 Saldivar 151 6.07 0.00 12.49 2.19 1 Carpenter 58 6.26 0.00 31.55 7.78 1 Webster 123 6.44 0.00 23.30 3.25 1 Terry 61 7.22 0.00 14.08 4.10 1 Kahn 103 8.33 0.00 22.62 4.70 1 Turner 58 8.49 0.00 22.42 4.03 1 Hooe 75 9.12 0.00 24.50 7.28 1 Field 69 9.26 0.00 18.67 3.90 1 U. Lee 90 9.95 0.00 26.84 6.67 2 DeGolyer 54 0.00 0.00 0.00 0.00 2 San Jacinto 98 0.25 0.00 7.57 1.27 2 Gooch 65 2.16 0.00 31.12 5.35 2 Bryan 80 4.51 0.00 16.91 3.35 2 Blanton 94 4.99 0.00 14.15 2.13 2 McShan 97 5.37 0.00 12.66 2.72 2 J. Adams 97 6.33 0.00 16.89 4.55 2 Mills 78 7.36 0.00 18.96 4.69 2 H. Meadow 136 7.39 0.00 16.79 3.56 2 Cabell 106 7.62 0.00 23.39 4.14 2 Lowe 115 8.01 0.00 27.41 4.62 2 Caillet 109 9.81 0.00 52.34 6.26 3 Salazar 127 1.42 0.00 10.21 1.70 3 Kramer 89 1.65 0.00 13.57 1.62 3 Frank 179 3.09 0.00 7.36 1.60 3 Stevens 113 3.60 0.00 15.95 2.46 3 Zaragoza 87 3.71 0.00 10.12 3.02 3 Lanier 84 3.73 0.00 6.13 1.56 Continued
3 Soto 118 6.32 0.00 11.34 1.48 3 Carr 81 6.66 0.00 18.61 2.56 3 Pershing 67 7.01 0.00 15.86 2.87 3 Ray 74 8.30 0.00 15.90 4.65 3 E. Medrano 99 8.45 0.00 12.88 2.77 3 Houston 47 8.68 0.00 21.55 6.10 3 Preston Hollow 62 8.83 0.00 16.10 4.56 3 Bethune 125 9.11 0.00 14.91 2.91 4 Callejo 83 0.00 0.00 0.27 0.03 4 Young 115 0.59 0.00 20.46 2.65 4 Guzick 114 0.73 0.00 6.07 1.01 4 Marsalis 79 1.22 0.00 26.48 3.13 4 Lipscomb 88 3.25 0.00 9.98 1.83 4 Thornton 53 3.95 0.00 7.48 1.90 4 Truett 157 4.79 0.00 18.83 4.00 4 Mata 59 4.94 0.00 8.90 1.55 4 Bushman 85 6.34 0.00 21.77 6.89 4 Pease 95 6.68 0.00 20.49 3.87 4 Conner 91 8.32 0.00 22.41 4.94 4 Urban Park 115 8.56 0.00 14.38 3.34 4 Lisbon 52 8.86 0.00 15.79 3.03 4 H. Stone 49 9.05 4.69 16.25 2.83 5 Douglass 109 2.91 0.00 23.14 3.86 5 Sanger 76 3.07 0.00 15.07 2.67 5 Ervin 103 3.52 0.00 23.73 4.28 5 Reagan 79 4.37 0.00 29.34 3.66 5 Wilmer-Hutchins 117 4.64 0.00 14.90 3.86 5 Casa View 114 6.15 0.00 10.38 1.93 5 Bayles 110 6.56 0.00 12.86 3.80 5 Lagow 106 6.86 0.00 21.73 3.82 5 Botello 88 7.60 0.93 16.57 1.99 5 Moseley 109 8.26 0.00 19.32 3.00 5 Smith 169 8.87 0.00 19.25 5.73 5 Reilly 84 8.99 0.00 21.20 3.10 5 Reinhardt 109 9.54 0.00 16.48 3.56 Continued
10-19.99 Hours 1 Brashear 93 10.01 0.00 26.51 8.95 1 McNair 147 11.03 0.00 22.58 3.58 1 Burnet 187 11.13 0.00 32.90 4.34 1 Moreno 80 11.52 0.00 19.98 4.69 1 Knight 106 11.56 0.00 21.64 4.86 1 Cigarroa 99 12.60 0.00 24.49 5.30 1 Donald 67 12.94 0.00 19.02 3.70 1 Hall 88 13.25 0.00 22.31 4.39 1 Winnetka 113 14.11 0.00 40.69 3.96 1 Williams 45 14.12 0.00 22.96 6.41 1 Peabody 85 14.30 0.00 22.35 4.41 1 Stemmons 124 15.90 0.00 26.21 4.34 1 Jones 122 16.08 0.00 33.66 5.32 1 Walnut Hill 45 17.14 0.00 24.70 4.82 1 Tolbert 67 19.45 0.00 42.87 7.56 2 Ireland 106 10.15 0.00 26.22 6.33 2 Runyon 92 10.77 0.00 17.41 3.63 2 Bush 117 11.48 0.00 30.52 6.11 2 Marcus 156 12.74 0.00 24.41 4.09 2 Hotchkiss 153 14.20 0.00 37.64 6.32 2 J. Stone 50 14.56 0.00 21.88 5.36 2 N. Adams 75 15.94 0.00 29.83 5.25 2 Budd 79 17.27 0.00 28.11 5.63 2 Withers 75 17.85 10.31 33.29 4.63 2 Junkins 105 18.23 0.00 59.40 10.89 2 Hawthorne 67 18.76 0.00 25.95 4.31 2 Starks 51 18.77 0.00 41.97 11.19 3 Earhart 47 10.47 0.00 16.98 5.32 3 Martinez 85 10.63 0.00 21.84 5.15 3 Arcadia Park 125 10.71 0.00 26.61 5.08 3 DeZavala 58 11.04 1.85 24.28 7.00 3 Carver 76 11.08 0.00 20.34 5.44 3 Chavez 107 11.53 0.00 23.87 4.45 3 Cochran 96 12.01 0.00 26.28 5.00 Continued
3 Kennedy 115 12.76 0.00 30.39 4.27 3 Hernandez 65 13.02 0.00 31.38 6.38 3 Milam 49 13.37 0.00 19.85 3.26 3 Cowart 97 14.86 0.00 25.76 5.15 4 R. Lee 52 10.40 0.00 27.05 4.40 4 Tatum 111 12.53 0.00 36.18 6.04 4 Lakewood 131 13.08 0.00 45.74 6.37 4 Oliver 49 14.35 0.00 22.92 4.26 4 Dunbar 100 14.97 0.00 25.81 6.47 4 Mt. Auburn 138 15.34 0.00 23.80 4.43 4 Rowe 65 15.71 5.51 23.18 3.69 4 King 74 15.86 0.00 41.66 8.13 4 S. Jackson 114 16.29 0.00 46.91 6.47 4 Jordan 86 19.22 0.00 29.13 7.55 4 Russell 117 19.64 0.00 36.61 6.35 5 Blair 95 10.24 0.00 20.76 6.33 5 Kiest 114 10.74 0.00 17.24 4.18 5 Rhoads 87 11.24 0.00 38.91 8.60 5 Gonzalez 128 11.55 0.00 26.24 6.23 5 Cuellar 132 11.57 0.00 20.08 4.27 5 Hexter 100 13.44 0.00 27.40 5.44 5 Silberstein 116 14.65 4.18 18.22 2.30 5 Hogg 45 15.85 0.00 41.76 8.61 5 Central 83 16.17 0.00 52.21 9.98 5 W. Anderson 107 16.54 0.37 30.74 6.23 5 Rice 87 16.57 0.00 41.46 8.89 5 Bowie 63 16.90 0.00 22.37 4.70 5 Burleson 104 17.06 0.00 39.05 6.61 5 Macon 79 18.05 0.00 30.35 6.28 5 Seagoville North 112 19.53 0.00 47.73 6.28 5 Dorsey 63 19.99 0.00 28.81 5.03
20-29.99 Hours 1 Henderson 67 21.90 0.00 31.73 5.29 1 Polk 59 22.38 8.43 36.16 3.75 2 Pleasant Grove 99 21.66 0.00 56.61 16.11 Continued
2 Johnston 75 22.90 0.00 41.87 8.22 2 Miller 51 22.98 5.79 46.00 5.69 3 Rogers 88 20.28 0.00 37.83 9.58 3 Maple Lawn 96 20.88 0.00 31.24 4.59 5 Seagoville 86 20.33 0.00 39.29 5.34 5 Kleberg 92 20.43 0.00 32.70 5.92 5 Gill 153 23.21 0.00 35.61 7.24 5 Peeler 53 27.01 0.00 65.22 9.15 5 Halliday 87 29.03 7.46 36.63 3.88
30+ Hours   2   Titche¹   15   46.94   6.24   68.93   13.70
¹Many Titche students were missing in the RM file; there were 134 second-grade students in the PEIMS file. The mean of 46.94 is misleading and was not the true level of implementation at Titche.
Table 11
RM Third-Grade Fall Hour Information with Schools Rank Ordered by Division and Mean Hours Online
Division   School   Number of Students   Mean Hours   Minimum Hours   Maximum Hours   Standard Deviation
0-9.99 Hours 1 Alexander 58 2.67 0.00 19.92 3.36 1 Weiss 83 3.08 0.00 9.38 1.84 1 Moreno 110 3.18 0.00 9.67 1.64 1 Rosemont 173 3.26 0.00 23.53 2.22 1 McNair 132 3.60 0.00 17.15 3.34 1 Saldivar 153 3.78 0.00 14.63 2.16 1 Carpenter 52 4.90 0.00 10.74 3.24 1 Burnet 159 5.75 0.00 12.10 3.13 1 Brashear 93 6.13 0.00 16.60 2.03 1 Twain 56 6.53 0.00 15.26 4.75 1 Foster 121 7.14 0.00 13.39 2.55 1 Turner 63 8.29 0.00 15.31 2.46 1 Winnetka 124 8.32 0.00 14.61 2.16 1 Hooe 63 8.49 0.00 18.05 3.76 1 Cigarroa 83 9.40 0.00 24.18 3.80 1 Terry 69 9.46 0.00 17.08 4.05 1 Tolbert 76 9.60 0.00 21.55 4.61 2 San Jacinto 102 0.24 0.00 8.24 1.14 2 Gooch 70 0.63 0.00 5.45 0.99 2 DeGolyer 61 2.59 0.00 7.30 1.34 2 Mills 71 4.74 0.00 12.91 3.47 2 Blanton 108 5.40 0.00 8.79 2.11 2 N. Adams 73 7.54 0.78 16.21 4.67 2 Withers 66 7.64 0.00 15.30 3.47 2 H. Meadow 146 8.56 0.00 16.12 3.90 2 Ireland 108 8.61 0.00 21.59 5.64 2 Caillet 105 8.93 4.00 20.23 2.88 2 Marcus 137 9.72 0.00 26.97 3.88 2 Cabell 73 9.83 0.00 28.18 3.98 3 Stevens 97 0.00 0.00 0.00 0.00 3 Pershing 83 1.34 0.00 6.19 1.63 Continued
3 Zaragoza 54 1.66 0.00 9.00 1.59 3 Kramer 95 1.94 0.00 5.08 1.03 3 Preston Hollow 19 3.12 0.00 4.97 1.18 3 Frank 169 3.46 0.00 11.54 1.74 3 Soto 92 4.65 0.00 10.08 1.84 3 Carver 74 4.91 0.00 25.36 6.17 3 Lanier 90 5.11 0.00 9.43 2.12 3 E. Medrano 91 5.90 0.00 21.43 4.64 3 Martinez 78 6.61 0.00 11.20 2.21 3 Chavez 70 7.57 0.00 19.91 4.02 3 Earhart 30 8.97 0.00 15.97 4.19 3 Carr 77 8.99 0.00 29.76 4.40 4 Young 108 0.74 0.00 19.51 2.74 4 Thornton 73 0.84 0.00 2.19 0.62 4 Guzick 104 1.16 0.00 44.30 5.17 4 Lipscomb 88 2.50 0.00 4.37 0.88 4 R. Lee 56 2.55 0.00 6.27 1.34 4 Truett 177 4.88 0.00 18.35 2.93 4 Marsalis 96 5.36 0.00 11.17 3.08 4 Pease 94 5.54 0.00 17.46 3.55 4 Lakewood 136 6.14 0.00 11.45 2.39 4 Mata 80 7.35 0.00 13.88 2.82 4 H. Stone 48 7.63 1.27 17.20 2.99 4 Tatum 112 7.83 0.00 25.14 3.69 4 Mt. Auburn 110 9.44 0.00 34.58 4.28 4 Rowe 73 9.57 0.00 16.81 5.00 4 Urban Park 102 9.76 0.00 17.89 2.56 5 Moseley 120 0.38 0.00 23.15 2.38 5 Douglass 102 4.55 0.00 23.56 3.76 5 Reagan 93 4.77 0.00 13.58 1.92 5 Wilmer-Hutchins 123 5.01 0.00 26.73 4.22 5 Bayles 100 5.10 0.00 17.48 6.78 5 Lagow 93 6.44 0.00 16.03 4.10 5 Sanger 77 6.54 0.00 17.97 4.02 5 Rhoads 77 8.68 0.48 17.53 4.34 5 Rice 78 8.88 3.26 29.09 3.84 5 Botello 76 9.40 0.00 21.97 5.32
10-19.99 Hours 1 Webster 85 11.31 0.00 23.30 5.54 1 Field 77 11.59 0.00 20.17 3.07 1 U. Lee 109 12.05 0.00 21.17 4.28 1 Jones 120 12.47 0.00 19.32 4.65 1 Hall 85 12.84 0.00 21.42 4.80 1 Knight 105 13.46 0.00 24.83 5.90 1 Kahn 100 14.43 0.00 47.91 12.49 1 Peabody 73 14.82 9.04 21.37 2.98 1 Walnut Hill 43 18.08 0.00 36.73 4.86 1 Donald 70 18.88 0.00 27.13 5.07 1 Williams 47 19.86 0.00 30.30 4.74 2 Bush 96 10.04 0.00 23.62 4.07 2 McShan 75 10.34 0.00 20.10 4.34 2 J. Adams 81 12.30 0.00 26.29 5.79 2 Bryan 62 14.07 5.08 31.23 3.44 2 Miller 64 14.15 3.23 40.40 4.92 2 Junkins 110 14.58 0.00 37.16 6.26 2 Runyon 111 15.08 0.00 26.98 3.93 2 Hawthorne 72 15.12 0.00 25.59 5.49 2 Johnston 61 16.09 0.00 28.51 4.40 2 Hotchkiss 148 16.11 0.00 34.47 6.92 2 Budd 108 16.18 0.00 31.26 5.45 2 Lowe 96 18.49 0.00 36.71 6.52 2 Titche1 42 18.84 0.00 22.59 4.63 2 Starks 48 19.27 0.00 27.21 6.06 2 Pleasant Grove 86 19.86 0.00 57.56 11.43 3 Rogers 57 10.13 0.00 16.28 2.95 3 Salazar 128 10.31 0.00 19.79 3.43 3 Maple Lawn 73 11.42 0.00 21.71 3.74 3 Arcadia Park 98 11.72 0.00 25.49 4.63 3 Bethune 115 11.76 0.00 21.37 3.70 3 DeZavala 59 12.03 0.00 26.30 4.62 3 Cochran 85 12.03 0.00 26.15 5.62 3 Ray 64 12.73 0.00 21.19 5.30 3 Milam 40 13.11 0.00 21.37 5.54 3 Cowart 106 15.49 0.00 25.43 4.96
3 Hernandez 60 16.65 0.88 31.92 5.62 3 Kennedy 104 16.71 2.49 49.33 5.59 3 Houston 38 16.94 4.40 19.93 3.62 4 Bushman 80 10.49 0.00 26.33 5.47 4 Jordan 75 10.86 0.00 18.43 4.39 4 Lisbon 44 11.50 0.00 24.79 6.44 4 Oliver 50 11.64 0.00 18.27 3.71 4 S. Jackson 101 11.67 0.00 24.89 4.16 4 Callejo 96 13.01 0.00 30.47 4.80 4 Conner 89 15.67 0.00 57.60 9.73 4 Dunbar 77 16.68 0.00 26.31 6.68 5 Casa View 120 10.27 0.00 24.38 6.01 5 Reilly 84 10.46 0.00 45.22 5.61 5 Kiest 109 11.09 0.00 17.39 4.40 5 Ervin 91 11.14 0.00 22.10 4.61 5 Silberstein 116 11.15 0.00 28.37 4.14 5 Central 71 11.45 0.00 23.56 4.43 5 Gonzalez 107 12.67 0.00 34.85 4.75 5 Blair 91 12.89 0.00 41.17 6.30 5 Hexter 98 14.29 1.69 50.96 5.21 5 Seagoville North 109 14.67 0.00 28.83 6.76 5 Seagoville 103 14.83 0.00 35.71 9.30 5 Hogg 30 14.98 0.00 21.63 5.63 5 Reinhardt 91 16.12 0.00 22.86 4.91 5 Smith 133 16.62 0.00 36.32 6.82 5 Burleson 108 16.87 0.00 41.63 7.71 5 Bowie 67 17.00 0.00 25.15 6.86 5 Halliday 88 18.37 0.00 32.62 6.22 5 W. Anderson 109 18.51 0.00 43.36 5.69 5 Gill 99 19.71 0.00 72.91 7.35 5 Cuellar 102 19.77 0.00 41.00 6.72
20-29.99 Hours 1 Henderson 72 24.05 0.00 34.63 5.14 1 Stemmons 130 24.76 0.00 40.29 7.16 1 Polk 73 27.34 0.00 50.61 5.69 2 J. Stone 48 24.95 0.00 37.10 9.00 4 King 58 21.98 0.00 35.30 6.02
4 Russell 120 24.91 0.00 54.23 7.15 5 Macon 77 20.22 0.00 30.11 4.73 5 Peeler 62 21.21 0.00 37.63 4.90 5 Dorsey 74 22.40 9.70 40.44 5.78 5 Kleberg 74 24.71 0.00 65.37 10.92
¹Many Titche students were missing in the RM file; there were 116 third-grade students in the PEIMS file. The mean of 18.84 is misleading and was not the true level of implementation at Titche.
Table 12
RM Second-Grade Spring Hour Information with Schools Rank Ordered by Division and Mean Hours Online
Division   School   Number of Students   Mean Hours   Minimum Hours   Maximum Hours   Standard Deviation
0-9.99 Hours 2 Gooch 64 3.26 0.00 28.21 5.33 3 Pershing 67 2.46 0.00 15.17 2.86 3 Stevens 108 4.17 0.00 22.50 2.20 3 Kramer 89 4.97 0.00 12.72 2.28 4 Callejo 82 0.00 0.00 0.00 0.00 5 Reagan 79 4.58 0.00 21.43 2.51 5 Douglass 109 6.21 0.00 30.58 7.34 5 Lagow 106 7.17 0.00 15.96 3.87
10-19.99 Hours 1 Donald 67 11.27 0.00 18.47 5.66 1 Weiss 84 14.51 0.00 27.46 6.04 1 McNair 134 14.73 0.00 36.06 7.01 1 Alexander 72 18.42 0.00 30.65 9.25 1 Carpenter 60 18.62 0.00 53.27 10.95 2 H. Meadow 138 10.72 0.00 17.80 4.32 2 Bryan 80 11.67 1.49 32.50 7.35 2 McShan 98 12.71 0.00 30.35 4.93 2 Mills 80 13.60 0.00 29.98 6.68 2 Caillet 109 14.08 1.55 47.59 4.58 2 Blanton 99 18.75 0.00 35.27 8.00 3 Lanier 85 10.24 0.00 13.83 3.65 3 Carr 81 13.95 0.00 32.42 6.11 3 Ray 75 15.30 0.00 32.06 8.86 3 Rogers 88 17.26 0.00 40.75 9.92 3 Frank 180 17.58 0.00 35.83 6.68 3 Salazar 126 17.84 0.00 48.27 9.80 3 Earhart 47 18.31 0.00 37.01 9.44 3 Cochran 95 18.81 0.00 29.79 5.15 3 Houston 47 19.98 1.04 30.06 5.04 4 Truett 157 10.29 0.00 30.21 7.81 4 Conner 91 10.41 0.00 32.31 5.70 4 Young 114 13.37 0.00 35.84 8.50 4 S. Jackson 114 14.85 0.00 41.62 5.16 4 Guzick 111 16.03 0.14 41.05 9.06
4 Lisbon 54 18.60 2.85 39.42 6.31 4 Marsalis 80 19.67 0.00 35.45 8.05 4 Pease 93 19.90 0.00 30.51 8.22 5 Ervin 102 10.70 0.00 24.41 4.59 5 Wilmer-Hutchins 117 13.85 0.00 34.70 7.89 5 Hexter 100 14.15 0.00 32.49 4.97 5 Reilly 84 15.03 5.21 34.42 3.71 5 Central 83 15.65 1.10 58.89 14.08 5 Smith 169 17.90 0.00 43.03 11.60 5 Casa View 114 18.43 0.89 34.04 4.38 5 Rhoads 86 18.59 6.86 42.45 4.97 5 Bayles 109 18.91 0.00 39.64 9.00 5 Silberstein 116 18.94 0.00 49.96 4.30 5 Hogg 45 19.03 7.26 32.19 5.81 5 Rice 87 19.39 0.00 32.41 6.63 5 Blair 96 19.74 0.00 47.18 5.98 5 Kiest 114 20.00 0.00 44.25 6.55
20-29.99 Hours 1 Stemmons 124 21.55 0.00 45.15 9.57 1 Turner 57 23.46 7.86 35.01 4.69 1 Hall 88 23.54 14.41 36.30 4.96 1 Tolbert 65 24.45 10.10 41.53 7.70 1 Burnet 186 25.81 0.00 42.26 5.12 1 Knight 106 25.96 0.00 38.52 9.20 1 Hooe 75 26.59 0.00 34.81 8.84 1 Twain 62 26.84 0.00 122.86 22.80 1 Moreno 80 28.00 0.00 49.93 10.54 1 Webster 124 28.95 13.83 59.82 6.62 1 Winnetka 113 29.87 18.77 47.35 5.48 2 Bush 117 22.09 10.72 47.87 6.05 2 Lowe 104 22.76 0.00 48.63 8.81 2 Marcus 156 22.84 5.05 48.91 5.85 2 Ireland 106 23.79 1.88 40.52 7.87 2 San Jacinto 98 23.92 0.00 46.14 7.05 2 N. Adams 75 26.13 0.00 40.19 6.28 2 Budd 78 26.29 2.82 57.19 9.04 2 Hotchkiss 153 26.51 0.00 47.94 8.70 2 Starks 49 28.62 8.07 38.19 6.96
2 Withers 75 29.79 15.77 71.35 10.98 2 DeGolyer 55 29.84 7.31 64.99 7.90 3 Arcadia Park 127 20.81 0.00 44.24 8.57 3 Hernandez 61 21.48 1.74 47.41 11.86 3 DeZavala 58 23.64 0.00 40.33 7.34 3 Bethune 123 23.68 5.85 42.65 7.73 3 Kennedy 116 24.22 0.54 46.74 6.42 3 Zaragoza 87 25.31 0.00 42.47 6.78 3 Soto 117 25.42 4.19 45.04 6.68 3 Carver 75 25.79 0.00 44.81 9.71 3 Milam 49 27.22 14.80 34.72 3.81 3 Preston Hollow 63 28.10 0.00 62.58 18.46 3 Maple Lawn 96 28.93 13.57 36.96 3.79 4 H. Stone 49 20.30 6.34 56.01 8.58 4 Lipscomb 88 21.00 0.00 40.39 8.11 4 Thornton 54 21.01 5.63 30.89 5.17 4 Tatum 111 22.39 0.00 63.19 6.80 4 Rowe 65 23.04 14.92 33.85 4.22 4 Bushman 85 24.46 4.52 38.15 6.20 4 Lakewood 131 26.89 0.00 66.49 11.86 4 King 76 27.01 0.31 53.38 9.15 4 Urban Park 115 28.30 0.00 46.54 9.91 4 Dunbar 101 29.31 4.20 54.27 12.38 5 Moseley 106 21.45 6.47 35.25 4.81 5 Sanger 78 22.90 3.54 47.48 6.90 5 Dorsey 63 25.06 9.72 30.66 3.31 5 Botello 88 25.43 0.00 63.54 7.02 5 Macon 79 25.76 0.00 42.88 6.65 5 Gill 153 26.95 0.00 48.67 7.07 5 Seagoville 85 28.32 17.20 65.64 6.15 5 Burleson 105 28.70 0.00 46.92 7.46 5 W. Anderson 107 29.10 0.41 63.96 7.39 5 Reinhardt 109 29.40 0.00 44.10 9.51
30+ Hours 1 Brashear 93 31.05 2.17 51.99 6.33 1 Kahn 105 31.43 5.01 73.38 13.46 1 Terry 61 32.47 0.00 131.91 18.25 1 Cigarroa 98 32.58 0.00 50.57 9.85
1 Rosemont 182 33.23 1.80 54.56 6.88 1 Henderson 67 33.24 20.81 44.06 4.50 1 Foster 121 34.29 13.77 53.98 5.79 1 Jones 122 34.33 0.00 46.11 5.86 1 U. Lee 92 35.23 7.72 53.68 9.42 1 Walnut Hill 45 35.77 12.22 46.76 5.35 1 Peabody 85 36.97 0.00 46.65 7.84 1 Williams 46 37.22 0.00 48.31 13.57 1 Field 72 37.85 0.00 54.92 10.54 1 Polk 57 37.99 17.85 47.60 3.91 1 Saldivar 153 38.07 0.00 50.96 8.02 2 J. Stone 49 31.09 8.81 57.01 8.08 2 Miller 49 32.28 17.04 46.04 7.08 2 Junkins 105 32.95 4.99 63.16 8.89 2 Cabell 106 34.98 0.00 84.03 16.88 2 Johnston 73 37.42 8.15 70.66 17.29 2 J. Adams 96 39.36 23.38 89.43 7.62 2 Runyon 88 39.37 3.45 60.40 6.61 2 Hawthorne 72 42.76 7.79 66.77 11.07 2 Pleasant Grove 100 46.12 0.00 124.46 20.88 2 Titche1 15 55.87 40.61 71.77 7.50 3 Cowart 97 30.60 0.00 45.68 6.78 3 E. Medrano 100 31.25 0.00 53.97 10.52 3 Martinez 85 37.66 0.00 85.95 21.26 3 Chavez 106 38.39 0.00 74.67 12.71 4 R. Lee 52 31.11 1.29 56.25 11.55 4 Mt. Auburn 138 31.68 0.00 49.88 7.32 4 Jordan 85 31.91 6.05 44.46 6.78 4 Mata 60 33.96 6.60 46.76 7.55 4 Oliver 46 41.66 25.78 69.44 6.55 4 Russell 117 43.93 0.00 67.59 9.36 5 Halliday 87 30.39 10.99 74.20 6.18 5 Bowie 63 31.04 0.00 43.69 6.58 5 Gonzalez 130 32.98 4.32 72.77 10.83 5 Cuellar 128 33.77 3.83 83.70 10.03 5 Peeler 53 37.18 14.58 55.68 7.03 5 Seagoville North 113 38.16 7.82 87.00 18.87 5 Kleberg 90 38.86 0.00 58.58 9.50
¹Many Titche students were missing in the RM file; there were 134 second-grade students in the PEIMS file. The mean of 55.87 is misleading and was not the true level of implementation at Titche.
Table 13
RM Third-Grade Spring Hour Information with Schools Rank Ordered by Division and Mean Hours Online
Division   School   Number of Students   Mean Hours   Minimum Hours   Maximum Hours   Standard Deviation
0-9.99 Hours 1 Twain 56 8.50 0.00 18.53 5.65 2 Gooch 70 4.49 0.00 16.18 3.43 3 Stevens 99 1.59 0.00 15.06 1.62 3 Kramer 95 4.60 0.00 20.65 2.68 3 Preston Hollow 20 7.48 0.81 12.22 2.85 3 Carver 75 8.19 0.00 16.31 3.72 5 Wilmer-Hutchins 127 5.08 0.00 31.38 6.38 5 Reagan 93 6.95 0.00 14.33 2.27
10-19.99 Hours 1 McNair 131 13.37 0.00 55.11 7.64 1 Alexander 57 14.25 0.00 25.04 6.74 1 Tolbert 78 15.17 0.00 37.67 6.33 1 Turner 63 17.04 0.00 25.75 4.94 2 Mills 71 11.28 0.00 26.32 8.10 2 H. Meadow 149 11.99 0.00 33.48 7.93 2 Ireland 110 13.97 3.45 32.82 5.49 2 Withers 66 14.40 0.00 23.90 4.78 2 N. Adams 70 14.55 5.52 148.02 16.58 2 Budd 106 16.69 0.00 31.81 6.29 2 McShan 76 17.42 0.00 30.65 7.34 2 Marcus 138 17.69 0.00 36.26 6.33 3 Rogers 56 10.70 0.00 18.73 4.26 3 Lanier 92 12.03 0.00 20.68 5.03 3 Arcadia Park 98 14.23 0.00 27.09 5.58 3 Frank 170 14.85 0.00 34.46 4.93 3 Cochran 78 16.79 0.00 37.37 8.77 3 DeZavala 59 17.36 0.00 37.68 5.60 3 Zaragoza 54 18.81 2.68 29.49 4.47 3 Bethune 115 19.96 0.00 39.23 5.44 4 Truett 178 10.23 0.00 28.32 7.18 4 Tatum 112 10.36 0.00 23.74 3.87 4 S. Jackson 101 12.44 4.84 32.20 4.01
4 Bushman 81 15.68 0.00 38.52 6.93 4 Lakewood 135 16.61 0.00 37.67 5.56 4 R. Lee 56 16.76 0.00 30.58 7.46 4 Lipscomb 88 16.97 0.00 43.33 7.73 5 Central 72 10.76 1.94 24.99 6.02 5 Ervin 89 11.14 0.31 27.19 5.43 5 Rice 78 11.50 1.61 33.23 4.39 5 Rhoads 77 12.80 0.00 37.38 12.16 5 Douglass 100 13.82 2.59 30.57 4.61 5 Lagow 85 14.26 0.00 26.63 5.19 5 Hogg 30 15.46 0.00 20.42 5.27 5 Silberstein 117 16.38 2.70 27.96 4.69 5 Reilly 83 17.45 0.00 78.15 12.20 5 Gill 98 18.36 4.28 32.81 6.63 5 Halliday 89 18.40 0.00 35.87 6.02 5 Kiest 111 18.53 0.00 28.55 7.29 5 Casa View 121 19.10 0.98 45.24 5.71 5 Bayles 100 19.94 0.00 41.65 8.57
20-29.99 Hours 1 Cigarroa 84 22.62 0.00 38.03 6.62 1 Winnetka 124 24.22 15.07 35.09 4.07 1 Donald 70 25.21 4.44 37.44 5.29 1 Foster 121 25.28 0.00 35.84 7.72 1 Weiss 83 25.49 0.00 76.86 13.09 1 Rosemont 173 26.39 0.00 51.59 6.86 1 Hall 85 26.93 0.00 48.92 6.45 1 Carpenter 53 28.11 8.50 42.62 6.02 1 Kahn 98 28.73 0.00 62.05 14.50 1 Jones 121 28.90 0.00 50.02 11.43 1 Terry 68 29.72 12.60 42.58 5.91 2 Blanton 110 20.97 0.00 33.26 5.63 2 Bryan 60 23.24 6.27 50.35 12.02 2 Junkins 110 26.24 4.51 57.50 11.19 2 Hotchkiss 147 26.29 0.00 56.99 7.46 2 Caillet 105 27.36 2.86 80.62 10.08 2 Pleasant Grove 85 28.41 0.00 66.14 15.15
2 Lowe 96 29.47 11.55 66.28 10.51 2 San Jacinto 102 29.66 11.41 38.07 4.51 2 Bush 96 29.71 4.20 43.39 6.37 3 Martinez 78 20.12 0.00 36.91 9.49 3 Carr 76 21.22 0.33 44.26 10.69 3 Maple Lawn 75 21.56 2.75 42.04 8.17 3 Cowart 106 22.66 0.00 45.61 7.59 3 Houston 38 22.83 17.49 49.28 6.01 3 Pershing 83 23.98 0.00 55.24 8.69 3 Earhart 29 25.97 1.76 44.21 12.21 3 Salazar 128 26.56 0.00 53.39 7.38 3 Soto 91 27.18 14.40 40.55 5.44 3 Ray 63 27.73 8.92 46.28 7.80 3 Milam 39 29.41 20.05 37.44 4.12 4 Mt. Auburn 111 20.54 0.00 44.14 5.95 4 Guzick 108 22.39 0.00 47.22 9.65 4 Marsalis 96 22.42 0.00 40.02 9.57 4 Conner 93 22.47 0.00 62.38 12.81 4 H. Stone 48 23.21 0.00 67.90 10.68 4 Rowe 73 25.48 0.00 44.74 8.36 4 Lisbon 48 28.08 0.00 54.07 12.42 4 Urban Park 101 28.39 0.00 47.21 7.43 4 Pease 92 28.42 0.00 64.41 12.98 4 Callejo 96 28.88 0.00 61.14 9.34 5 Blair 88 20.36 0.00 37.93 5.86 5 Seagoville North 109 20.44 0.00 47.70 10.10 5 Hexter 97 20.52 0.00 33.39 7.15 5 Moseley 120 22.62 0.00 67.17 17.65 5 Dorsey 74 23.72 4.99 38.13 4.17 5 Botello 76 23.76 8.02 43.06 8.56 5 Sanger 78 23.94 5.33 63.50 11.52 5 Macon 75 24.56 0.00 33.92 3.93 5 Smith 133 24.89 0.00 55.42 10.30 5 W. Anderson 110 25.63 4.25 56.46 11.90 5 Reinhardt 91 26.13 0.00 43.01 5.36 5 Burleson 102 26.61 3.03 66.78 7.50 5 Bowie 67 29.29 0.00 40.22 7.00
30+ Hours 1 Hooe 63 31.30 0.00 45.25 9.35 1 Moreno 109 31.64 0.00 49.15 8.94 1 Brashear 93 32.48 21.51 44.19 3.81 1 Henderson 72 33.28 0.00 47.35 7.81 1 Williams 47 33.99 18.97 46.91 5.54 1 U. Lee 111 34.10 0.00 75.14 14.76 1 Webster 85 34.12 1.90 103.45 24.47 1 Field 74 35.87 21.88 50.17 7.01 1 Burnet 160 36.12 0.00 59.79 6.78 1 Peabody 73 37.30 27.65 53.76 4.23 1 Saldivar 153 37.64 0.00 88.75 10.15 1 Walnut Hill 43 38.17 21.56 57.31 5.17 1 Polk 73 38.19 18.98 72.03 6.98 1 Knight 100 38.76 0.00 51.79 10.05 1 Stemmons 130 45.09 6.29 82.29 11.56 2 DeGolyer 61 30.09 0.00 37.90 6.95 2 Miller 64 30.23 6.83 58.03 10.00 2 Starks 49 30.30 0.00 40.70 7.49 2 J. Stone 45 30.81 15.10 47.82 6.46 2 Johnston 62 32.11 11.52 57.63 7.95 2 Hawthorne 72 34.68 0.00 80.46 11.48 2 Cabell 73 36.66 0.00 64.18 10.43 2 Titche1 43 40.16 1.63 55.50 7.61 2 Runyon 112 40.25 0.00 56.49 7.64 2 J. Adams 82 50.42 3.20 84.03 13.67 3 Hernandez 61 31.71 0.00 43.59 8.46 3 Chavez 71 33.18 0.00 62.24 12.58 3 Kennedy 106 39.77 8.31 159.56 18.23 3 E. Medrano 87 40.14 0.00 84.78 16.73 4 Young 109 31.84 0.00 75.40 14.14 4 Thornton 73 33.53 17.76 48.25 6.68 4 Dunbar 78 33.62 0.28 51.06 8.51 4 Jordan 75 34.60 0.78 48.46 7.79 4 Oliver 50 37.82 11.42 48.08 6.79 4 Mata 83 38.49 0.69 61.92 14.66
4 King 58 39.69 22.98 62.94 8.96 4 Russell 122 43.77 7.81 110.45 10.41 5 Cuellar 99 36.58 13.21 74.73 8.56 5 Peeler 62 36.85 2.89 49.24 6.02 5 Seagoville 103 37.76 6.07 67.48 13.46 5 Kleberg 72 40.38 0.00 71.91 10.62 5 Gonzalez 107 42.40 0.00 97.49 11.31
¹Many Titche students were missing in the RM file; there were 116 third-grade students in the PEIMS file. The mean of 40.16 is misleading and was not the true level of implementation at Titche.
To study how students progressed within the RM system, the evaluators reviewed the number of
objectives students completed each semester. Figures 17 and 18 display student objective completion
rates. The mean number of objectives that students completed increased from fall to spring for second-
(7.74 to 14.01) and third-grade (4.06 to 10.02) students. In both fall and spring, the mean number of
objectives completed was higher for second-grade students than for third-grade students. When reviewed
by division, results varied and showed no clear trends. (See Tables 14 and 15.)
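The objective-completion bands used in Figure 18 and Table 14 amount to bucketing each student's count of completed objectives. The following is an illustrative sketch with hypothetical counts (the band boundaries come from the report; the data do not):

```python
from collections import Counter

def band(n):
    """Bucket a student's completed-objective count into the report's categories."""
    if n == 0:
        return "No objectives met"
    if n <= 5:
        return "1-5 objectives"
    if n <= 10:
        return "6-10 objectives"
    if n <= 15:
        return "11-15 objectives"
    if n <= 20:
        return "16-20 objectives"
    return "21+ objectives"

# Hypothetical objective counts for one grade/semester.
completed = [0, 3, 7, 12, 22, 8]
dist = Counter(band(c) for c in completed)

# Percentage of students per band, as charted in Figure 18.
pct = {b: round(100 * n / len(completed)) for b, n in dist.items()}
```

Because the percentages are rounded independently, a column built this way may not sum to exactly 100, matching the rounding note on Table 14.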
Figure 17. Mean RM Objectives Students Completed in Fall and Spring by Grade
[Bar chart: second grade — fall 7.74, spring 14.01; third grade — fall 4.06, spring 10.02]

Figure 18. Percentage of RM Objectives Completed by Grade and Semester
[Stacked bar chart; percentage of students in each objective band:]

                      2nd Grade   2nd Grade   3rd Grade   3rd Grade
                      Fall        Spring      Fall        Spring
  21+ objectives         2           17           0          10
  16-20 objectives       9           21           1          11
  11-15 objectives      19           29           7          18
  6-10 objectives       33           22          22          30
  1-5 objectives        28            9          43          29
  No objectives met      9            2          27           2
Table 14
Number of RM Objectives Students Completed in Fall and Spring by Grade and Division

                     Second Grade                 Third Grade
                 Fall N (%)   Spring N (%)    Fall N (%)   Spring N (%)
All
  None¹         1,083 (9)      205 (2)       3,152 (27)     303 (2)
  1-5           3,426 (28)   1,160 (9)       5,071 (43)   3,676 (29)
  6-10          4,027 (33)   2,911 (22)      2,553 (22)   3,681 (30)
  11-15         2,246 (19)   3,790 (29)        858 (7)    2,289 (18)
  16-20         1,101 (9)    2,758 (21)         76 (1)    1,350 (11)
  21+             269 (2)    2,177 (17)         27 (<1)   1,199 (10)
Division 1
  None¹           279 (10)      23 (1)         786 (28)      15 (1)
  1-5             840 (31)     128 (5)       1,258 (47)     536 (19)
  6-10            885 (32)     472 (17)        471 (18)     770 (28)
  11-15           546 (20)     807 (28)        156 (6)      653 (23)
  16-20           172 (6)      718 (25)         16 (<1)     412 (15)
  21+               9 (<1)     687 (24)          5 (<1)     406 (14)
Division 2
  None¹           127 (6)       22 (1)         485 (22)      36 (2)
  1-5             549 (25)     176 (7)         954 (43)     721 (31)
  6-10            776 (36)     536 (22)        558 (25)     665 (28)
  11-15           401 (18)     692 (29)        199 (9)      387 (16)
  16-20           234 (11)     437 (18)         15 (<1)     269 (11)
  21+              90 (4)      544 (23)          5 (<1)     286 (12)
Division 3
  None¹           263 (11)      54 (2)         665 (34)      90 (4)
  1-5             815 (36)     306 (13)        910 (47)     756 (36)
  6-10            739 (33)     574 (24)        303 (16)     656 (31)
  11-15           289 (13)     671 (28)         62 (3)      337 (16)
  16-20           152 (7)      504 (21)          8 (<1)     152 (7)
  21+              12 (<1)     266 (11)          0 (0)       97 (5)
Division 4
  None¹           230 (12)      44 (2)         675 (34)      50 (2)
  1-5             457 (24)     178 (8)         834 (42)     696 (32)
  6-10            626 (33)     459 (22)        380 (19)     621 (28)
  11-15           362 (19)     636 (30)         99 (5)      373 (17)
  16-20           171 (9)      478 (22)         10 (<1)     237 (11)
  21+              33 (2)      334 (16)          5 (<1)     230 (10)
Division 5
  None¹           184 (6)       62 (2)         541 (19)     112 (4)
  1-5             765 (25)     372 (11)      1,115 (39)     967 (32)
  6-10          1,001 (32)     870 (27)        841 (29)     969 (32)
  11-15           648 (21)     984 (30)        342 (12)     539 (18)
  16-20           372 (12)     621 (19)         27 (1)      280 (9)
  21+             125 (4)      346 (11)         12 (<1)     180 (6)

Note. Some percentages may not add to 100 due to rounding.
¹In addition, the RM database included 1,334 second-grade students in the fall, 452 second-grade students in the spring, 1,093 third-grade students in the fall, and 323 third-grade students in the spring that did not use RM at all. These students were not included in the "None" category above.
Table 15
Descriptive Statistics for Reasoning Mind Objectives Completed by Grade and Division
                          Second Grade          Third Grade
                         Fall      Spring      Fall      Spring
All
  Number of students¹   12,152    13,001     11,737    12,498
  Range of objectives     0-41      0-94       0-37      0-84
  Mean objectives         7.74     14.01       4.06     10.02
  Standard deviation      5.73      7.36       3.95      7.44
Division 1
  Number of students¹    2,731     2,835      2,692     2,792
  Range of objectives     0-27      0-94       0-24      0-53
  Mean objectives         6.99     15.91       3.55     12.25
  Standard deviation      5.15      7.10       3.77      7.64
Division 2
  Number of students¹    2,177     2,407      2,216     2,364
  Range of objectives     0-41      0-63       0-37      0-84
  Mean objectives         8.64     15.12       4.60     10.52
  Standard deviation      6.04      8.16       4.03      8.02
Division 3
  Number of students¹    2,270     2,375      1,948     2,088
  Range of objectives     0-26      0-67       0-20      0-56
  Mean objectives         6.28     12.71       2.94      8.13
  Standard deviation      5.02      7.34       3.23      6.34
Division 4
  Number of students¹    1,879     2,129      2,003     2,207
  Range of objectives     0-29      0-56       0-27      0-84
  Mean objectives         7.63     13.85       3.45     10.06
  Standard deviation      5.66      6.95       3.69      7.74
Division 5
  Number of students¹    3,095     3,255      2,878     3,047
  Range of objectives     0-41      0-47       0-33      0-51
  Mean objectives         8.90     12.58       5.28      8.86
  Standard deviation      6.16      6.70       4.26      6.67

¹In addition, the RM database included 1,334 second-grade students in the fall, 452 second-grade students in the spring, 1,093 third-grade students in the fall, and 323 third-grade students in the spring that did not use RM at all. These students were not included in this table.

To study how students spent their time online, the evaluators reviewed the types of problems
students completed and the accuracy rates of the problems completed. The RM system includes math
problems in learning mode, review mode (Wall of Mastery), and test mode (STAAR preparation for third
grade). In the learning and review modes, problems increase in difficulty from Level A (easiest) to Level C
(hardest). As seen in Table 16, student participation was highest for Level A learning mode in the fall and spring and for Level B learning mode in the spring; noticeably fewer students completed Level C learning mode, Wall of Mastery (Level A, B, or C), or test mode items. Even so, the number of students who attempted each type of problem increased from fall to spring. A review of semester accuracy levels showed that mean scores, which range from 1 to 100, decreased as the problem level increased, likely because of the greater challenge of the Level B and C problems.
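The N and mean-accuracy pairs reported in Table 16 reduce to grouping attempt records by mode and level and averaging the scores. The following is a sketch under assumed, hypothetical attempt records (the report does not describe its actual aggregation pipeline):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-student records: (mode, level, accuracy score 0-100).
attempts = [
    ("LM", "A", 90), ("LM", "A", 70),
    ("LM", "B", 40), ("WM", "C", 35),
]

by_group = defaultdict(list)
for mode, level, score in attempts:
    by_group[(mode, level)].append(score)

# (mode, level) -> (number of students, mean accuracy), the layout of Table 16.
summary = {g: (len(s), round(mean(s), 2)) for g, s in by_group.items()}
```

A group with no attempts simply never appears in `summary`, which is how the N/A cells for second-grade STAAR mode would arise.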
Table 16
Mean Accuracy Rates in Fall and Spring by Grade and Division

                       Second Grade                        Third Grade
                  Fall              Spring             Fall              Spring
                  N / Mean          N / Mean           N / Mean          N / Mean
All
  LM Level A     12,132 / 76.81    12,969 / 72.53    11,707 / 69.43    12,458 / 76.51
  LM Level B      6,719 / 40.69    11,680 / 40.06     5,785 / 50.73    10,767 / 46.26
  LM Level C      5,116 / 27.70     9,378 / 27.56     4,166 / 32.56     7,094 / 21.10
  WM Level A      2,463 / 74.86     7,113 / 75.40     3,268 / 61.99     6,951 / 68.41
  WM Level B      2,064 / 41.95     6,181 / 47.41     2,168 / 51.88     6,936 / 49.26
  WM Level C      1,057 / 40.55     4,179 / 38.77     2,040 / 32.29     4,511 / 39.27
  STAAR Mode          N/A               N/A             563 / 45.33     8,267 / 52.38
Division 1
  LM Level A      2,725 / 77.26     2,835 / 71.73     2,692 / 69.43     2,788 / 76.69
  LM Level B      1,442 / 39.77     2,691 / 40.12     1,214 / 54.28     2,519 / 46.75
  LM Level C      1,112 / 26.42     2,293 / 27.51       937 / 34.38     1,794 / 21.18
  WM Level A        500 / 76.24     1,352 / 75.60       638 / 62.11     1,527 / 68.98
  WM Level B        411 / 40.60     1,121 / 49.21       582 / 51.84     1,458 / 48.39
  WM Level C        179 / 39.37       754 / 37.98       374 / 33.65       896 / 38.49
  STAAR Mode          N/A               N/A              201 / 48.88     2,185 / 52.44
Division 2
  LM Level A      2,174 / 75.64     2,399 / 71.69     2,214 / 68.70     2,364 / 77.45
  LM Level B      1,333 / 39.17     2,204 / 38.68     1,265 / 48.13     2,119 / 48.27
  LM Level C      1,063 / 26.57     1,820 / 25.68       892 / 31.74     1,488 / 19.77
  WM Level A        554 / 72.74     1,503 / 74.51       753 / 61.59     1,463 / 68.86
  WM Level B        486 / 39.23     1,340 / 45.38       728 / 51.07     1,465 / 49.22
  WM Level C        268 / 40.36       896 / 37.84       455 / 31.57       975 / 40.09
  STAAR Mode          N/A               N/A               89 / 37.56     1,523 / 52.00
Division 3
  LM Level A      2,268 / 77.64     2,372 / 73.31     1,943 / 70.06     2,084 / 75.22
  LM Level B      1,057 / 37.91     1,991 / 39.64       739 / 52.21     1,675 / 41.42
  LM Level C        776 / 27.84     1,527 / 26.05       544 / 31.07       913 / 20.14
  WM Level A        391 / 71.99     1,367 / 74.91       484 / 59.95     1,152 / 66.43
  WM Level B        317 / 37.98     1,207 / 45.03       480 / 54.10     1,185 / 49.46
  WM Level C        152 / 38.07       818 / 36.79       293 / 28.87       792 / 37.47
  STAAR Mode          N/A               N/A              129 / 46.50     1,300 / 50.89
Division 4
  LM Level A      1,877 / 76.77     2,128 / 72.59     1,983 / 69.58     2,185 / 76.38
  LM Level B      1,025 / 45.76     1,917 / 41.11       904 / 53.04     1,861 / 47.02
  LM Level C        763 / 31.04     1,521 / 29.20       668 / 34.97     1,249 / 22.00
  WM Level A        425 / 75.26     1,205 / 75.39       538 / 62.19     1,222 / 68.88
  WM Level B        382 / 49.27     1,091 / 49.91       549 / 52.95     1,250 / 49.77
  WM Level C        232 / 43.66       791 / 41.04       345 / 33.92       797 / 39.21
  STAAR Mode          N/A               N/A              114 / 43.89     1,210 / 55.97
Division 5
  LM Level A      3,088 / 76.66     3,235 / 73.23     2,875 / 69.44     3,037 / 76.58
  LM Level B      1,862 / 41.26     2,877 / 40.66     1,663 / 48.18     2,593 / 46.74
  LM Level C      1,402 / 27.66     2,217 / 29.07     1,125 / 30.99     1,650 / 22.07
  WM Level A        593 / 77.29     1,686 / 76.45       855 / 63.30     1,587 / 68.53
  WM Level B        468 / 42.68     1,422 / 48.00       829 / 50.64     1,578 / 49.54
  WM Level C        226 / 40.19       920 / 40.16       573 / 32.73     1,051 / 40.56
  STAAR Mode          N/A               N/A               30 / 45.03     2,049 / 51.41

Note. LM = Learning mode; WM = Wall of Mastery review mode; N/A = Not applicable
2.6 What were teacher and administrator perceptions of Reasoning Mind?
Methodology
Three RM staff surveys were developed and administered online during a two-week window in
May 2013. The surveys were developed to gather insights from RM campus administrators, supported
teachers, and non-supported teachers. All campus administrators and teachers involved in RM during
2012-13 were sent an email with the survey link along with periodic reminders. Response rates were high
for campus administrators (72%), supported teachers (83%), and non-supported teachers (69%). The
total number of responses included 215 campus administrators, 121 supported teachers, and 425
non-supported teachers.
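As a rough arithmetic check on these figures, the approximate number of staff invited to each survey can be back-computed from the respondent counts and response rates above. The derived invited counts below are estimates only (the published rates are rounded to whole percents), not figures from the report.

```python
# Estimate invited counts as respondents / response rate.
# Respondent counts (215, 121, 425) and rates (72%, 83%, 69%) come from the
# report; the derived invited counts are rough approximations.
surveys = {
    "campus administrators": (215, 0.72),
    "supported teachers": (121, 0.83),
    "non-supported teachers": (425, 0.69),
}

for group, (respondents, rate) in surveys.items():
    invited = round(respondents / rate)
    print(f"{group}: about {invited} invited, {respondents} responded")
```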
Survey items related to whether staff would like to continue using RM in the future, whether staff
would recommend RM to others, staff support for using RM, collaboration between administrators and
staff, collaboration between RM supported and non-supported teachers, support from RM and Dallas ISD
staff, quality and use of RM training and resources, and technology-related challenges. The supported
and non-supported teacher surveys were the same with the exception of a few items related to the RM
support model.
The administrator and supported teacher surveys contained three open-ended items to allow for
feedback related to successes, barriers to implementation, and suggestions for improvement. One
open-ended item was included on the non-supported teacher survey to allow teachers to make general
comments at the end of the survey. The comments were categorized into overall themes (main codes),
and when applicable, main codes were broken down into sub codes. Some open-ended responses
referenced more than one theme, and as a result, some comments were coded into more than one main
code and/or sub code. Although only one open-ended item was included on the non-supported teacher
survey, most of the non-supported teacher survey comments fell within the themes found on the campus
administrator and supported teacher surveys; as a result, the same main codes and sub codes were used
across the three surveys as applicable. Invalid and blank responses were removed from analyses.
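The coding scheme described above, in which a single open-ended comment may be tallied under more than one main code and sub code and invalid or blank responses are dropped, can be sketched as a multi-label tally. The comments and code names below are hypothetical illustrations, not actual survey data.

```python
from collections import Counter

# Each comment carries zero or more (main code, sub code) pairs; a sub code of
# None means the comment fit a main theme only. Examples are hypothetical.
coded_comments = [
    [("Technology issues", "Connection issues"), ("Scheduling issues", None)],
    [("Technology issues", "Login issues")],
    [],  # an invalid or blank response carries no codes
]

main_counts = Counter()
sub_counts = Counter()
for codes in coded_comments:
    if not codes:
        continue  # invalid and blank responses are removed from analyses
    for main in {m for m, _ in codes}:  # count each main code once per comment
        main_counts[main] += 1
    for main, sub in codes:
        if sub is not None:
            sub_counts[(main, sub)] += 1

print(main_counts["Technology issues"], main_counts["Scheduling issues"])
```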
Results
Campus Administrator Survey
As mentioned above, a total of 215 campus administrators responded to the online questionnaire.
The questionnaire was sent to all principals, assistant principals, and staff members that were designated
as RM campus administrator contacts. Respondents included 50 (23%) from Division 1, 47 (22%) from
Division 2, 36 (17%) from Division 3, 33 (15%) from Division 4, and 49 (23%) from Division 5. There were
between 1 and 4 administrator respondents across 128 elementary campuses. When reviewed by
position, there were 110 (51%) principals, 66 (31%) assistant principals, and 39 (18%) who served as
campus administrator contacts. Most respondents indicated that the primary campus administrator
contact for RM was the principal (63%); others reported the primary administrator contact as an assistant
principal (9%), a mathematics instructional coach (9%), or another staff member (19%).
As seen in Table 17, between 55 percent and 88 percent marked “agree” or “strongly agree” for
11 of the 12 overall agreement items. Mean scores ranged from 3.27 to 4.09, which rounded to the
“neither agree nor disagree” and “agree” categories, respectively. In general, results were mixed. As
might be expected, campus administrators were most positive (88%) toward teachers having “adequate
campus administrator support to effectively implement the program.” Similarly, over 60 percent “agreed”
or “strongly agreed” that they would like to have RM on the campus next year (64%), that students
benefited from RM (68%), that the RM supported teachers worked closely with non-supported teachers
(63%), and that RM supported teachers had adequate support from RM staff (72%). In contrast, less than
half (48%) “agreed” or “strongly agreed” that RM helped teachers be more effective in the classroom.
Although results were positive overall, a sizeable proportion (between 41% and 45%) marked “neither
agree nor disagree,” “disagree,” or “strongly disagree” for the school having adequate technology
resources to effectively implement RM (45%), recommending RM to other administrators (45%), teachers
being positive toward RM (42%), having adequate support from district technology staff members (42%),
and having adequate RM support for non-supported teachers to effectively implement RM (41%).
Table 17
Results of Campus Administrator Survey Overall Agreement Items
Statement | Mean | Strongly Agree N (%) | Agree N (%) | Neither Agree Nor Disagree N (%) | Disagree N (%) | Strongly Disagree N (%)
I would like to have Reasoning Mind (RM) at my campus next year. | 3.70 | 47 (22) | 90 (42) | 50 (23) | 17 (8) | 9 (4)
RM helps teachers on my campus to be more effective in the classroom. | 3.37 | 27 (13) | 74 (35) | 68 (32) | 39 (18) | 5 (2)
Students at my school benefit from RM. | 3.79 | 40 (19) | 104 (49) | 57 (27) | 10 (5) | 3 (1)
Teachers in my building are positive toward RM. | 3.50 | 26 (12) | 98 (46) | 54 (25) | 29 (14) | 7 (3)
The RM supported teacher works closely with RM non-supported teachers on my campus. | 3.67 | 36 (17) | 98 (46) | 57 (26) | 19 (9) | 4 (2)
I work closely with supported and non-supported RM teachers on my campus. | 3.93 | 42 (20) | 123 (57) | 42 (20) | 6 (3) | 1 (<1)
RM supported teachers have adequate support from RM staff members to effectively implement RM. | 3.85 | 45 (21) | 109 (51) | 42 (20) | 16 (8) | 1 (<1)
RM non-supported teachers have adequate support from RM staff members to effectively implement RM. | 3.57 | 31 (15) | 95 (44) | 57 (27) | 28 (13) | 3 (1)
Teachers have adequate campus administrator support to effectively implement RM. | 4.09 | 56 (26) | 132 (62) | 18 (8) | 6 (3) | 2 (1)
My school has adequate technology resources (i.e., computers, etc.) to effectively implement RM. | 3.27 | 38 (18) | 80 (37) | 21 (10) | 52 (24) | 23 (11)
My school has adequate technology support from district technology staff members to effectively implement RM. | 3.33 | 30 (14) | 93 (44) | 26 (12) | 46 (22) | 18 (8)
I would recommend RM to other administrators. | 3.51 | 37 (17) | 81 (38) | 60 (28) | 27 (13) | 9 (4)
Note. Some percentages may not add to 100 due to rounding.
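The mean column in Table 17 appears to be a frequency-weighted average of the five response categories scored from 5 ("strongly agree") down to 1 ("strongly disagree"). A minimal sketch of that calculation, using the counts from the table's first item:

```python
# Counts for "I would like to have Reasoning Mind (RM) at my campus next
# year.", ordered strongly agree (5) down to strongly disagree (1).
counts = [47, 90, 50, 17, 9]
scores = [5, 4, 3, 2, 1]

mean = sum(s * c for s, c in zip(scores, counts)) / sum(counts)
print(round(mean, 2))  # matches the 3.70 reported in Table 17

# Rounding the mean to the nearest score maps it back to a response
# category, as in the narrative ("rounded to ... 'agree'").
labels = {5: "strongly agree", 4: "agree", 3: "neither agree nor disagree",
          2: "disagree", 1: "strongly disagree"}
print(labels[round(mean)])
```

The same roll-up applies to the satisfaction tables that follow, with "extremely satisfied" through "extremely dissatisfied" taking the 5-to-1 scores.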
In the next part of the questionnaire, campus administrators were asked to note the frequency
that they met with RM teachers on the campus. Most (74%) administrator responses were split between
three categories: once per month (29%), once every other week (17%), and once per week (28%).
Another nine percent indicated meeting with teachers more than once a week. Thus, 83 percent reported
meeting with their teachers at least once a month. In contrast, six percent met with teachers every other
month, and seven percent met less than once every other month. The remaining four percent noted never
meeting with teachers.
Table 18 shows that between 51 percent and 76 percent marked “extremely satisfied” or
“satisfied” to the nine satisfaction items. Mean scores ranged from 3.45 to 4.04, which were closest to the
“neither satisfied nor dissatisfied” and “satisfied” categories, respectively. Over 60 percent of campus
administrators chose “satisfied” or “extremely satisfied” for support from the RM program
coordinator (76%), other RM staff (67%), RM technology staff (63%), and district technology staff (61%).
Likewise, the majority of campus administrators (67%) were “satisfied” or “extremely satisfied” with the
RM interface and reports. In contrast, close to half (49%) marked “neither satisfied nor dissatisfied,”
“dissatisfied,” or “extremely dissatisfied” for support from district central administrators (49%), RM
professional development opportunities for teachers (49%), and RM administrator training (47%).
Likewise 43 percent did not give a positive response for being satisfied with “student progress as a result
of using RM.”
Table 18
Results of Campus Administrator Survey Satisfaction Items
How satisfied are you with the following related to RM implementation at your campus this year?
Item | Mean | Extremely Satisfied N (%) | Satisfied N (%) | Neither Satisfied Nor Dissatisfied N (%) | Dissatisfied N (%) | Extremely Dissatisfied N (%)
Support from your RM Program Coordinator | 4.04 | 63 (30) | 97 (46) | 45 (21) | 5 (2) | 0 (0)
Support from RM technology staff | 3.71 | 35 (17) | 94 (46) | 60 (29) | 11 (5) | 4 (2)
Support from other RM staff | 3.79 | 34 (17) | 100 (50) | 61 (30) | 6 (3) | 1 (<1)
Support from district central administrators | 3.50 | 20 (10) | 83 (41) | 78 (39) | 17 (9) | 3 (1)
Support from district technology staff | 3.64 | 27 (13) | 100 (48) | 61 (30) | 16 (8) | 3 (1)
Student progress as a result of using RM | 3.53 | 22 (11) | 96 (46) | 63 (30) | 24 (12) | 3 (1)
RM administrator interface and reports | 3.79 | 36 (18) | 101 (49) | 58 (28) | 8 (4) | 2 (1)
RM administrator training | 3.49 | 24 (12) | 85 (42) | 65 (32) | 26 (13) | 4 (2)
RM professional development opportunities for teachers | 3.45 | 23 (11) | 83 (40) | 70 (34) | 29 (14) | 3 (1)
Note. Some percentages may not add to 100 due to rounding.

Most campus administrators gave positive responses to the program coordinator items with
between 80 percent and 93 percent choosing “agree” or “strongly agree” to the seven items. (See Table
19.) Mean responses ranged from 4.21 to 4.44, which rounded to the “agree” category. Approximately
half “strongly agreed” that the program coordinator acted professionally (51%), responded efficiently to
email and voicemail (51%), honored appointments (49%), presented suggestions in a supportive way
(47%), and knew or found answers to questions (47%).
Table 19
Results of Campus Administrator Survey Program Coordinator Items
My Program Coordinator…. | Mean | Strongly Agree N (%) | Agree N (%) | Neither Agree Nor Disagree N (%) | Disagree N (%) | Strongly Disagree N (%)
Acts in a professional manner | 4.44 | 106 (51) | 86 (42) | 15 (7) | 0 (0) | 0 (0)
Presents suggestions in a positive and supportive manner | 4.35 | 96 (47) | 88 (43) | 21 (10) | 1 (<1) | 0 (0)
Honors appointments | 4.38 | 101 (49) | 82 (40) | 21 (10) | 1 (<1) | 0 (0)
Responds to emails and voicemails in a timely manner | 4.41 | 106 (51) | 80 (39) | 18 (9) | 2 (1) | 0 (0)
Knows or finds answers to my questions | 4.31 | 96 (47) | 77 (37) | 33 (16) | 0 (0) | 0 (0)
Tailors his/her support based on my needs | 4.21 | 87 (43) | 75 (37) | 40 (20) | 2 (1) | 0 (0)
Conducts beneficial in-person meetings | 4.24 | 88 (43) | 81 (40) | 32 (16) | 2 (1) | 1 (<1)
Note. Some percentages may not add to 100 due to rounding.
Per survey results, most campus administrators reviewed RM emails more often than RM reports.
(See Table 20.) Over half (57%) indicated that they viewed RM administrator update emails at least once
per week; an additional 30 percent of responses were split between “once every other week” and “once
per month.” As for viewing RM reports, responses varied more but most reported that they viewed the RM
Metrics Report (86%) and RM Objective Spreadsheet Report (79%) at least once a month; over a third
indicated reviewing the reports at least once a week.
Table 20
Results of Campus Administrator Survey Frequency of Use Items
How often do you review the following RM updates and reports?
Item | More than once a week N (%) | Once per week N (%) | Once every other week N (%) | Once per month N (%) | Once every other month N (%) | Less than once every other month N (%) | Never N (%)
RM administrator update emails | 28 (14) | 89 (43) | 31 (15) | 31 (15) | 3 (1) | 7 (3) | 16 (8)
RM Metrics Report | 18 (9) | 65 (33) | 30 (15) | 57 (29) | 4 (2) | 13 (6) | 13 (6)
RM Objective Spreadsheet Report | 14 (7) | 52 (26) | 29 (15) | 62 (31) | 8 (4) | 15 (7) | 19 (10)
Note. Some percentages may not add to 100 due to rounding.
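The "at least once per week/month" figures cited in the narrative appear to be sums of the rounded column percentages in Table 20. A small sketch of that roll-up, with the percentages transcribed from the table:

```python
# Table 20 column percentages, ordered most to least frequent:
# [>once/week, once/week, every other week, once/month,
#  every other month, < every other month, never]
rows = {
    "RM administrator update emails": [14, 43, 15, 15, 1, 3, 8],
    "RM Metrics Report": [9, 33, 15, 29, 2, 6, 6],
    "RM Objective Spreadsheet Report": [7, 26, 15, 31, 4, 7, 10],
}

def at_least(percentages, k):
    """Combined percent across the k most frequent categories."""
    return sum(percentages[:k])

# "At least once per week" spans the first two categories;
# "at least once a month" spans the first four.
print(at_least(rows["RM administrator update emails"], 2))
print(at_least(rows["RM Metrics Report"], 4))
print(at_least(rows["RM Objective Spreadsheet Report"], 4))
```

Because each category percent is already rounded, sums built this way can drift a point from sums computed on the raw counts.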
In the next section of the survey, administrators were asked to indicate how often
technology-related issues precluded implementation of RM with students as planned. Approximately half
of campus administrators noted that the campus “frequently” or “very frequently” experienced Computer
on Wheels (COW) issues (53%) and wireless connectivity issues (49%). (See Table 21.) Likewise, about
40 percent reported “frequently” or “very frequently” having network issues (39%) or scheduling issues
due to limited campus access to computers (42%). Approximately a third “frequently” or “very frequently”
dealt with student log in issues (35%). Fewer administrators (20%) felt computer lab certification delays
often impeded implementation.
Table 21
Results of Campus Administrator Survey Technology Implementation Items
How often have the following issues kept teachers on your campus from implementing RM with students as planned?
Issue | Very Frequently N (%) | Frequently N (%) | Occasionally N (%) | Rarely N (%) | Never N (%)
Student log in issues | 31 (15) | 40 (20) | 73 (36) | 52 (25) | 9 (4)
Computer lab certification delays | 15 (8) | 24 (12) | 65 (34) | 64 (33) | 26 (13)
COW (Computer on Wheels) issues | 63 (31) | 45 (22) | 51 (25) | 33 (16) | 13 (6)
Wireless connectivity issues | 60 (29) | 41 (20) | 62 (30) | 31 (15) | 10 (5)
Network issues | 41 (20) | 39 (19) | 73 (36) | 40 (20) | 8 (4)
Scheduling issues/limited access to computers (difficulty scheduling laptop carts/computer lab time) | 48 (24) | 37 (18) | 47 (23) | 51 (25) | 18 (9)
Note. Some percentages may not add to 100 due to rounding.

Campus administrators were asked to indicate ways they increased awareness and usage of RM
on their campuses. As seen in Table 22, administrators used a variety of approaches to communicate
with staff, parents, and others. They most frequently informed staff about RM during campus faculty
meetings (54%) and opened the computer lab before or after school to allow students to work on RM
(50%). More than a quarter informed parents about RM through PTA presentations (37%) and the school
newsletter (27%). A third educated the site-based committee about RM (34%), sent weekly notes to
teachers about RM usage (33%), and gave student or class awards and/or incentives (32%). Other
means of communication varied.
Table 22
Results of Campus Administrator Survey RM Awareness and Use Items
Activity | N (%)
Inform staff about RM in campus faculty meetings | 117 (54)
Open computer lab before or after school to allow students to work on RM | 108 (50)
Inform parents about RM through presentations at PTA meetings | 79 (37)
Inform site-based committee about RM | 72 (34)
Send weekly notes to teachers about RM usage | 70 (33)
Give student or class awards and/or incentives acknowledging student achievement in RM | 68 (32)
Inform parents about RM through the school newsletter | 57 (27)
Inform parents about RM through the school web site | 28 (13)
Inform parents about RM through the school’s Community Liaison | 27 (13)
Inform parents about RM through messaging system | 21 (10)
Inform parents about RM through social media | 9 (4)

The final part of the campus administrator survey included three open-ended items. Specifically,
administrators were asked to comment on RM successes, challenges or barriers to implementation, and
suggestions for improvement. As mentioned above, some comments fit into multiple categories. The
most-cited successes were related to improved student learning and engagement (N=86) including
improved math performance and problem-solving abilities (N=34), increased student engagement (N=23),
and so forth. (See Table 23.) There were 26 references to RM being a helpful teacher and student
resource, and 19 mentions of improved school-level implementation due to better scheduling (N=9),
increased use of RM (N=7), generally improved implementation (N=2), and improved campus support
(N=1). Five referenced the helpful, supportive RM staff. In addition, there were five generally positive
comments about RM, and three negative comments.
The most common challenges or barriers to implementation included technology issues (N=104)
and scheduling issues (N=94). Of the 104 technology-related comments, 28 did not explain exact
technology problems, whereas the others referenced connection issues (N=34), computer malfunctions
(N=31), login issues (N=9), and browser/RM program issues (N=2). Among the scheduling issues (N=94), the most frequently cited were difficulty meeting mandated time requirements (N=45), lack of computer access (N=25),
conflicts due to events such as field trips and testing (N=11), and loss of instructional time (N=10). There
were 16 references to the lack of alignment between RM and state/district curriculum requirements.
Fifteen comments related to RM training challenges; these included staff not trained (N=12) and the
learning curve for teachers to use RM (N=3). Other challenges referenced four times each were student
issues and limited staff buy-in. Three administrators noted no RM implementation challenges in 2012-13.
There were seven negative comments toward RM.
The most frequent campus administrator suggestion was to improve technology (N=48); this
included increasing the number of computers (N=24), improving the network connection (N=10), repairing
or replacing computers (N=6), improving technology support (N=3), and improving technology in general
(N=5). There were eighteen references to improving RM training; these included increasing the amount
and quality of training (N=7), expanding training to other staff members (N=7), conducting summer/earlier
training (N=2), and improving online training or replacing online training with in-person training (N=2).
Twelve noted a need to align RM to district and state curriculum requirements. Nine suggested ways to
improve RM implementation, which included designating one staff member to teach all RM classes (N=3),
increasing parental involvement (N=3), exempting or making adjustments for special needs or below
average students (N=2), and limiting RM as an enrichment program (N=1). Another nine related to ways
to improve RM time and scheduling issues; these included creating a more efficient computer lab/COW
schedule (N=4), allowing more flexibility in the number of hours required per week (N=3), and creating a
master schedule that designates time for all RM classes (N=2). Other suggestions included improved
campus leadership, communication, and promotion of RM (N=4) as well as expansion of RM (N=3).
Table 23
Campus Administrator Survey Comments
Main codes are listed with their totals; sub codes are indented beneath each main code.

Successes
Improved student learning and engagement (N=86)
  Improved math performance, problem solving abilities, conceptual understanding (N=34)
  Increased student engagement (N=23)
  Improved vocabulary, reading, and note-taking skills (N=8)
  General student improvement (N=8)
  Improved technology skills (N=5)
  More independent learning (N=4)
  Increased academic self confidence (N=2)
  Challenged students (N=1)
  Connections made between RM and classroom instruction (N=1)
Helpful teacher and student resource (N=26)
  Differentiated, interactive instructional supplement (N=9)
  Versatile program (N=7)
  Helpful data monitoring tool (N=7)
  Allows teachers to work one-on-one with struggling students (N=3)
Improved school-level implementation (N=19)
  Better scheduling (N=9)
  Increased use of RM (N=7)
  Generally improved implementation (N=2)
  Improved campus support (N=1)
RM staff helpful, supportive, and friendly (N=5)
Generally positive comments (N=5)
Experienced no successes (N=3)

Challenges
Technology issues (N=104)
  Connection issues (N=34)
  Computer/hardware/accessories malfunctions (N=31)
  Login issues (N=9)
  Browser/RM program issues (N=2)
  General technology issues (specific issues not defined) (N=28)
Scheduling issues (N=94)
  Difficulty meeting mandated time requirements (N=45)
  Lack of computer access (N=25)
  Conflicts with field trips, testing, holidays, and other programs (N=11)
  Loss of instructional time (N=10)
  General scheduling issues (N=3)
RM not aligned with state/district curriculum requirements (N=16)
Training issues (N=15)
  Staff not trained/staff turnover (N=12)
  Learning curve to use RM (N=3)
General negative comments (N=7)
Student issues (N=4)
  Limited technology skills/computers at home (N=2)
  Classroom management challenges (N=1)
  Difficult for special needs, below average, or LEP students (N=1)
Limited staff buy in (N=4)
Experienced no challenges (N=3)
Continued
Table 23 Continued
Main codes are listed with their totals; sub codes are indented beneath each main code.

Suggestions for Improvement
Improve technology (N=48)
  Increase the number of computers (N=24)
  Improve network connection (N=10)
  Improve technology (general) (N=5)
  Repair or replace computers (N=6)
  Improve or increase technology support (N=3)
Improve RM training (N=18)
  Increase the amount and quality of training (N=7)
  Expand training to other staff (N=7)
  Conduct earlier or summer training (N=2)
  Replace online training with in-person training/improve online training (N=2)
Align RM to district/state curriculum requirements (N=12)
Improve RM implementation (N=9)
  Designate one staff member to teach all RM classes (N=3)
  Increase parental involvement (N=3)
  Exempt or make adjustments for special needs/below average students (N=2)
  Limit RM as an enrichment program (N=1)
Solve RM time/scheduling issues (N=9)
  Create a more efficient computer lab/COW schedule (N=4)
  Allow more flexibility in the number of hours required per week (N=3)
  Create a master schedule that designates RM time for all RM classes (N=2)
Improve campus leadership, communication, and promotion of RM (N=6)
Expand RM program (N=3)
Note. Some respondents gave responses that fit into more than one category.

Supported and Non-Supported Teacher Survey Results

As mentioned above, there were 121 supported teacher survey responses and 425
non-supported teacher responses. Table 24 displays teacher respondent characteristics. More schools
were represented on the non-supported (94%) than the supported (81%) teacher survey. The
percentages of supported and non-supported teacher respondents were similar when reviewed by
division. The majority of supported teachers (79%) taught third grade only, whereas most non-supported
teachers (61%) taught second grade only; there were some supported (12%) and non-supported (5%)
teachers that taught both second and third grade. The original plan for 2012-13 was for all supported
teachers to be third-grade teachers; however, some schools opted to place second-grade teachers in the
supported teacher role instead. Most supported (80%) and non-supported (60%) teachers taught with RM
two semesters or less. More non-supported (40%) than supported (20%) teachers taught with RM for
more than a year.
Table 24
Supported and Non-Supported Teacher Respondent Characteristics
Group | Supported N (%) | Non-Supported N (%)
Schools Represented | 118 (81) | 136 (94)
Division
  1 | 27 (22) | 99 (23)
  2 | 21 (17) | 86 (20)
  3 | 23 (19) | 73 (17)
  4 | 20 (17) | 74 (18)
  5 | 30 (25) | 93 (22)
Grades Taught
  Second only | 11 (9) | 260 (61)
  Third only | 96 (79) | 145 (34)
  Second and third | 14 (12) | 20 (5)
Number of Semesters Taught with RM
  1 | 9 (7) | 51 (12)
  2 | 88 (73) | 203 (48)
  3 | 2 (2) | 13 (3)
  4 | 10 (8) | 132 (31)
  5 or more | 12 (10) | 26 (6)
Note. A total of 121 supported teachers and 425 non-supported teachers responded to the surveys. There were 145 campuses with students in RM.
Responses were positive for seven of the nine teacher satisfaction items with over half choosing
“agree” or “strongly agree.” Overall mean scores for the nine items ranged from 3.33 to 3.93; when
rounded, the means were closest to the “neither agree nor disagree” and “agree” categories, respectively.
Responses were more positive for supported than non-supported teachers. (See Table 25.) Most
teachers marked “agree” or “strongly agree” for RM being beneficial for students in their
classrooms (75%), receiving adequate support from RM staff members to implement the program
effectively (75%), having adequate campus administrator support to implement RM (72%), wanting to
continue teaching with RM next year (66%), and having adequate technology resources to effectively
implement RM (67%). Also, 63 percent of supported (75%) and non-supported (60%) teacher
respondents “agreed” or “strongly agreed” that they would recommend RM to other teachers. Responses
were mixed for the other two satisfaction items; about half "agreed" or "strongly agreed" that teachers in
their building were positive toward RM (47%) and that RM helped them to be more effective in the
classroom (50%).
Table 25
Results of Teacher Survey Satisfaction Items
Statement / Group | Mean | Strongly Agree N (%) | Agree N (%) | Neither Agree Nor Disagree N (%) | Disagree N (%) | Strongly Disagree N (%)
I would like to continue teaching with Reasoning Mind (RM) next year.
  Supported | 3.99 | 47 (39) | 44 (37) | 14 (12) | 8 (7) | 6 (5)
  Non-Supported | 3.64 | 133 (31) | 133 (31) | 68 (16) | 54 (13) | 36 (9)
  All | 3.72 | 180 (33) | 177 (33) | 82 (15) | 62 (11) | 42 (8)
RM helps me to be more effective in the classroom.
  Supported | 3.58 | 26 (21) | 43 (36) | 33 (27) | 13 (11) | 6 (5)
  Non-Supported | 3.32 | 80 (19) | 122 (29) | 109 (26) | 78 (18) | 34 (8)
  All | 3.34 | 106 (20) | 165 (30) | 142 (26) | 91 (17) | 40 (7)
Students in my classroom benefit from RM.
  Supported | 4.14 | 44 (37) | 58 (48) | 12 (10) | 3 (2) | 3 (2)
  Non-Supported | 3.85 | 116 (27) | 193 (46) | 67 (16) | 31 (7) | 17 (4)
  All | 3.90 | 160 (29) | 251 (46) | 79 (15) | 33 (6) | 20 (4)
Teachers in my building are positive toward RM.
  Supported | 3.34 | 16 (13) | 43 (36) | 33 (28) | 20 (17) | 7 (6)
  Non-Supported | 3.33 | 61 (14) | 138 (33) | 134 (31) | 64 (15) | 28 (7)
  All | 3.33 | 77 (14) | 181 (33) | 167 (31) | 84 (16) | 35 (6)
I have adequate support from RM staff members to effectively implement RM.
  Supported | 4.42 | 66 (55) | 43 (36) | 8 (7) | 2 (2) | 1 (<1)
  Non-Supported | 3.80 | 96 (23) | 201 (48) | 78 (19) | 30 (7) | 14 (3)
  All | 3.93 | 162 (30) | 244 (45) | 86 (16) | 32 (6) | 15 (3)
I have adequate campus administrator support to effectively implement RM.
  Supported | 4.07 | 42 (35) | 58 (48) | 11 (9) | 7 (6) | 3 (2)
  Non-Supported | 3.82 | 101 (24) | 193 (46) | 91 (22) | 26 (6) | 11 (3)
  All | 3.86 | 143 (26) | 251 (46) | 102 (19) | 33 (6) | 14 (3)
I have adequate technology resources (i.e., computers, etc.) to effectively implement RM.
  Supported | 3.75 | 37 (31) | 48 (40) | 11 (9) | 19 (16) | 6 (5)
  Non-Supported | 3.60 | 98 (23) | 179 (42) | 58 (14) | 55 (13) | 33 (8)
  All | 3.64 | 135 (25) | 227 (42) | 69 (12) | 74 (14) | 39 (7)
I have adequate technology support from district technology staff members to effectively implement RM.
  Supported | 3.69 | 28 (23) | 54 (45) | 17 (14) | 17 (14) | 5 (4)
  Non-Supported | 3.45 | 71 (17) | 158 (37) | 105 (25) | 65 (15) | 23 (6)
  All | 3.50 | 99 (18) | 212 (39) | 122 (23) | 82 (15) | 28 (5)
I would recommend RM to other teachers.
  Supported | 3.94 | 41 (34) | 50 (41) | 20 (16) | 2 (2) | 8 (7)
  Non-Supported | 3.55 | 101 (24) | 150 (36) | 87 (21) | 51 (12) | 34 (8)
  All | 3.65 | 142 (26) | 200 (37) | 107 (20) | 53 (10) | 42 (7)
Note. Some percentages may not add to 100 due to rounding.
On the surveys, both supported and non-supported teachers were asked if they worked closely
with each other. Noticeably more supported (68%) than non-supported (52%) teachers “agreed” or
“strongly agreed” that they worked closely together. Slightly more than half of non-supported teachers
“agreed” or “strongly agreed” that they worked closely with supported (52%) and non-supported (54%)
teachers. Survey results further showed that the frequency with which RM teachers collaborated varied
widely; however, responses were similar for supported and non-supported teachers. About half of both
supported (55%) and non-supported (49%) teachers reported that they collaborated with other RM
teachers at least once a week. (See Table 26.) Remaining responses varied.
Table 26
Results of Teacher Survey Collaboration Items
How often do you collaborate with RM teachers on your campus?
Group | More than once a week N (%) | Once per week N (%) | Once every other week N (%) | Once per month N (%) | Once every other month N (%) | Less than once every other month N (%) | Never N (%)
Supported | 31 (26) | 35 (29) | 12 (10) | 16 (13) | 3 (2) | 14 (12) | 10 (8)
Non-Supported | 95 (22) | 116 (27) | 56 (13) | 47 (11) | 15 (4) | 48 (11) | 48 (11)
All | 126 (23) | 151 (28) | 68 (12) | 63 (12) | 18 (3) | 62 (11) | 58 (11)
Note. Some percentages may not add to 100 due to rounding.

Table 27 displays teacher responses to items related to support for implementing the program. In
general, supported teachers indicated a higher level of satisfaction than non-supported teachers.
Supported teachers were especially positive toward support from their RM program coordinators.
Specifically, noticeably more supported (68%) than non-supported (27%) teachers indicated being
“extremely satisfied” with support from their assigned RM program coordinators. Almost all supported
teachers (96%) marked “extremely satisfied” or “satisfied” for the item. This makes sense because RM
program coordinators were assigned to work closely with the supported teachers and to help
non-supported teachers as time allowed. Over half of supported and non-supported teachers marked
“extremely satisfied” or “satisfied” for support from RM technology staff (79% versus 61%), other RM staff
(82% versus 59%), the principal (83% versus 72%), and campus/district technology staff (59% versus
51%); in these four cases, supported teacher responses were noticeably higher than those of
non-supported teachers. Responses were mixed between “extremely satisfied,” “satisfied,” and “neither
satisfied nor dissatisfied” for support from the math instructional coach and another campus administrator.
Table 27
Results of Teacher Survey RM Support Items
How satisfied are you with the following as you implement RM this year?
Item / Group | Mean | Extremely Satisfied N (%) | Satisfied N (%) | Neither Satisfied Nor Dissatisfied N (%) | Dissatisfied N (%) | Extremely Dissatisfied N (%)
Support from your RM Program Coordinator
  Supported | 4.63 | 82 (68) | 45 (28) | 4 (3) | 1 (1) | 0 (0)
  Non-Supported | 3.92 | 113 (27) | 197 (47) | 89 (21) | 12 (3) | 11 (3)
  All | 4.08 | 195 (35) | 242 (44) | 93 (17) | 13 (2) | 11 (2)
Support from RM technology staff
  Supported | 4.18 | 49 (41) | 45 (38) | 22 (19) | 3 (2) | 0 (0)
  Non-Supported | 3.69 | 77 (19) | 174 (42) | 129 (31) | 24 (6) | 9 (2)
  All | 3.80 | 126 (24) | 219 (41) | 151 (28) | 27 (5) | 9 (2)
Support from other RM staff
  Supported | 4.18 | 45 (38) | 52 (44) | 21 (18) | 1 (1) | 0 (0)
  Non-Supported | 3.68 | 69 (17) | 171 (42) | 149 (36) | 13 (3) | 8 (2)
  All | 3.81 | 114 (22) | 223 (42) | 170 (32) | 14 (3) | 8 (1)
Support from your principal
  Supported | 4.13 | 43 (36) | 57 (47) | 16 (13) | 4 (3) | 1 (1)
  Non-Supported | 3.91 | 109 (26) | 192 (46) | 93 (22) | 18 (4) | 6 (1)
  All | 3.93 | 152 (28) | 249 (46) | 109 (20) | 22 (4) | 7 (1)
Support from your math instructional coach
  Supported | 3.54 | 25 (22) | 22 (20) | 57 (51) | 5 (4) | 3 (3)
  Non-Supported | 3.38 | 53 (13) | 103 (26) | 199 (50) | 29 (7) | 15 (4)
  All | 3.39 | 78 (15) | 125 (24) | 256 (50) | 34 (7) | 18 (4)
Support from another campus administrator (assistant principal, etc.)
  Supported | 3.66 | 25 (22) | 32 (28) | 49 (44) | 4 (4) | 2 (2)
  Non-Supported | 3.56 | 64 (16) | 131 (33) | 176 (44) | 16 (4) | 10 (3)
  All | 3.60 | 89 (18) | 163 (32) | 225 (44) | 20 (4) | 12 (2)
Support from campus/district technology staff
  Supported | 3.61 | 22 (19) | 47 (40) | 32 (28) | 10 (9) | 5 (4)
  Non-Supported | 3.45 | 52 (13) | 154 (38) | 144 (36) | 33 (8) | 21 (5)
  All | 3.49 | 71 (14) | 201 (39) | 176 (34) | 43 (8) | 26 (5)
Note. Some percentages may not add to 100 due to rounding.

Teacher responses to RM professional development items were noticeably more positive for
supported than non-supported teachers; however, many non-supported teachers answered items related
to Best Practice and Curriculum Study training even though very few attended. This revealed confusion
about RM training for non-supported teachers. Between 80 percent and 84 percent of supported teachers
rated the in-person courses as “very helpful,” and between 50 percent and 60 percent did so for the
online version of the courses. (See Table 28.) As for non-supported teachers, those who chose “very
helpful” ranged from 46 percent to 53 percent for the in-person training and from 25 percent to 28 percent
for the online sessions. In general, both groups of teachers found the in-person sessions to be more
helpful than the online sessions; however, as noted above, the high number of non-supported teacher
responses to these items revealed confusion on the part of non-supported teachers as very few
non-supported teachers participated in the Curriculum Study sessions or Best Practice workshops.
Table 28
Results of Teacher Survey Professional Development Items
How helpful was the following?
Item / Group | Mean | Very Helpful N (%) | Somewhat Helpful N (%) | Not Helpful N (%)
RM Qualification Course (in person)
  Supported | 2.80 | 70 (82) | 13 (15) | 2 (2)
  Non-Supported | 2.47 | 130 (53) | 99 (41) | 15 (6)
  All | 2.56 | 200 (61) | 112 (34) | 17 (5)
RM Qualification Course (online)
  Supported | 2.44 | 39 (50) | 34 (44) | 5 (6)
  Non-Supported | 2.12 | 89 (28) | 181 (56) | 51 (16)
  All | 2.18 | 128 (32) | 215 (54) | 56 (14)
Curriculum Study Sessions (in person)
  Supported | 2.77 | 58 (80) | 13 (18) | 2 (3)
  Non-Supported | 2.41 | 92 (46) | 97 (49) | 10 (5)
  All | 2.51 | 150 (55) | 110 (41) | 12 (4)
Curriculum Study Sessions (online)
  Supported | 2.46 | 50 (52) | 40 (42) | 6 (6)
  Non-Supported | 2.12 | 65 (26) | 153 (61) | 34 (13)
  All | 2.22 | 115 (33) | 193 (56) | 40 (11)
Best Practice Workshops (in person)
  Supported | 2.83 | 63 (84) | 11 (15) | 1 (1)
  Non-Supported | 2.42 | 95 (48) | 89 (45) | 13 (7)
  All | 2.53 | 158 (58) | 100 (37) | 14 (5)
Best Practice Workshops (online)
  Supported | 2.56 | 56 (60) | 33 (36) | 4 (4)
  Non-Supported | 2.11 | 59 (25) | 144 (61) | 34 (14)
  All | 2.24 | 115 (35) | 177 (54) | 38 (11)
Note. Some percentages may not add to 100 due to rounding. Teachers that marked “did not attend” were removed from the analyses.
Next, teacher perceptions of the RM program coordinator were assessed. As seen in Table 29,
supported teacher responses were strongly positive as between 67 percent and 80 percent marked
“strongly agree” on all 11 items; combined “strongly agree” and “agree” responses ranged from 94
percent to 100 percent. Non-supported teachers also responded positively with most (68% to 91%)
choosing the “strongly agree” or “agree” categories.
Table 29
Results of Teacher Survey RM Program Coordinator Items
My Program Coordinator… (columns: Mean; then, each N (%): Strongly Agree; Agree; Neither Agree Nor Disagree; Disagree; Strongly Disagree)

Acts in a professional manner
  Supported      4.80  97 (80)   24 (20)   0 (0)    0 (0)   0 (0)
  Non-Supported  4.41  216 (52)  161 (39)  30 (7)   3 (1)   3 (1)
  All            4.53  313 (59)  185 (35)  30 (6)   3 (<1)  3 (<1)
Presents suggestions in a positive and supportive manner
  Supported      4.79  96 (79)   24 (20)   1 (1)    0 (0)   0 (0)
  Non-Supported  4.30  198 (48)  157 (38)  42 (10)  10 (2)  4 (1)
  All            4.40  294 (55)  181 (34)  43 (8)   10 (2)  4 (1)
Honors appointments
  Supported      4.73  93 (78)   22 (18)   2 (2)    2 (2)   0 (0)
  Non-Supported  4.24  186 (46)  142 (35)  61 (15)  8 (2)   4 (1)
  All            4.35  279 (54)  164 (31)  63 (12)  10 (2)  4 (1)
Responds to emails and voicemails in a timely manner
  Supported      4.73  90 (75)   27 (23)   3 (2)    0 (0)   0 (0)
  Non-Supported  4.33  206 (50)  155 (38)  31 (8)   9 (2)   7 (2)
  All            4.43  296 (56)  182 (35)  34 (6)   9 (2)   7 (1)
Knows or finds answers to my questions
  Supported      4.73  93 (77)   24 (20)   3 (2)    1 (1)   0 (0)
  Non-Supported  4.30  199 (49)  156 (38)  38 (9)   10 (2)  6 (2)
  All            4.40  292 (55)  180 (34)  41 (8)   11 (2)  6 (1)
Tailors his/her support based on my needs
  Supported      4.70  90 (76)   23 (19)   5 (4)    1 (1)   0 (0)
  Non-Supported  4.19  180 (45)  135 (34)  72 (18)  6 (1)   7 (2)
  All            4.32  270 (52)  158 (31)  77 (15)  7 (1)   7 (1)
Is knowledgeable about the areas in which my class could improve
  Supported      4.69  88 (74)   26 (22)   4 (3)    1 (1)   0 (0)
  Non-Supported  4.15  174 (43)  137 (34)  73 (18)  11 (3)  6 (2)
  All            4.28  262 (50)  163 (32)  77 (15)  12 (2)  6 (1)
Continually encourages me to improve my RM class
  Supported      4.71  87 (73)   29 (24)   3 (3)    0 (0)   0 (0)
  Non-Supported  4.03  152 (39)  135 (34)  85 (22)  14 (3)  9 (2)
  All            4.17  239 (46)  164 (32)  88 (17)  14 (3)  9 (2)
Is invested in my personal success as an RM teacher
  Supported      4.64  85 (72)   26 (22)   6 (5)    0 (0)   1 (1)
  Non-Supported  3.99  151 (38)  129 (33)  89 (22)  16 (4)  11 (3)
  All            4.15  236 (46)  155 (30)  95 (19)  16 (3)  12 (2)
Helps my RM class succeed
  Supported      4.58  80 (67)   32 (27)   7 (6)    0 (0)   1 (1)
  Non-Supported  3.92  133 (34)  135 (34)  101 (26) 16 (3)  10 (2)
  All            4.08  213 (41)  167 (32)  108 (21) 16 (3)  11 (2)
Conducts beneficial in-person meetings
  Supported      4.71  90 (76)   25 (21)   3 (2)    0 (0)   1 (1)
  Non-Supported  3.96  148 (38)  117 (30)  100 (25) 18 (5)  9 (2)
  All            4.15  238 (47)  142 (28)  103 (20) 18 (3)  10 (2)
Note. Some percentages may not add to 100 due to rounding.
Next, teachers were asked to rate the helpfulness of RM resources. A large percentage of
supported (83%) and non-supported (59%) teachers believed that the RM reports were “very helpful.”
(See Table 30.) Notably more supported than non-supported teachers indicated the RM teacher resource
(76% versus 49%) and professional development (63% versus 35%) web sites were “very helpful.”
Table 30
Results of Teacher Survey RM Resource Items
How helpful were the following RM resources? (columns: Mean; then, each N (%): Very Helpful; Somewhat Helpful; Not Helpful)

RM Teacher Resource Website
  Supported      2.76  92 (76)   29 (24)   0 (0)
  Non-Supported  2.44  178 (49)  171 (47)  16 (4)
  All            2.53  270 (56)  200 (41)  16 (3)
RM Professional Development Website
  Supported      2.62  76 (63)   44 (36)   1 (1)
  Non-Supported  2.27  118 (35)  189 (57)  27 (8)
  All            2.37  194 (43)  233 (51)  28 (6)
RM Reports
  Supported      2.83  101 (83)  19 (16)   1 (1)
  Non-Supported  2.56  243 (59)  152 (37)  15 (4)
  All            2.62  344 (65)  171 (32)  16 (3)
Note. Some percentages may not add to 100 due to rounding. All of the non-supported teachers that marked “did not use” were removed from the analyses; all of the supported teachers indicated use of the three resources.
As seen in Table 31, about half or more of the supported (49% to 94%) and non-supported (48%
to 79%) teacher survey respondents indicated using all of the assessed resources at least once a week.
In general, supported teachers used most resources somewhat more frequently than non-supported
teachers. The majority of supported teachers marked “more than once a week” for notifications (71%), my
students menu (69%), objective spreadsheet report (69%), activity logs (67%), and metrics report (70%);
most non-supported teacher responses were split between “once per week” and “more than once a week”
for the items. Even so, over half (54%) of non-supported teachers viewed notifications “more than once a
week.” There was more variation in frequency of use for both supported and non-supported teachers for
the remaining reports.
Table 31
Results of Teacher Survey Frequency of Use Items
How often do you review the following? (columns, each N (%): More than once a week; Once per week; Once every other week; Once per month; Once every other month; Less than once every other month; Never)

Notifications
  Supported      85 (71)   28 (23)   5 (4)    2 (2)   0 (0)   0 (0)   0 (0)
  Non-Supported  223 (54)  105 (25)  36 (9)   32 (8)  1 (<1)  7 (2)   12 (3)
  All            308 (58)  133 (25)  41 (8)   34 (6)  1 (<1)  7 (1)   12 (2)
My Students Menu
  Supported      84 (69)   28 (23)   6 (5)    1 (1)   0 (0)   0 (0)   2 (2)
  Non-Supported  182 (44)  133 (32)  33 (8)   36 (9)  4 (1)   6 (2)   16 (4)
  All            266 (50)  161 (30)  39 (7)   37 (7)  4 (1)   6 (1)   18 (3)
Objective Spreadsheet Report
  Supported      83 (69)   27 (23)   3 (2)    5 (4)   0 (0)   0 (0)   2 (2)
  Non-Supported  152 (37)  145 (35)  51 (12)  44 (10) 5 (1)   7 (2)   11 (3)
  All            235 (44)  172 (32)  54 (10)  49 (9)  5 (1)   7 (1)   13 (2)
Activity Logs
  Supported      80 (67)   29 (24)   6 (5)    2 (2)   0 (0)   1 (1)   1 (1)
  Non-Supported  178 (43)  127 (31)  51 (12)  33 (8)  4 (1)   7 (2)   14 (3)
  All            258 (48)  156 (29)  57 (11)  35 (7)  4 (1)   8 (1)   15 (3)
Metrics Report
  Supported      84 (70)   28 (23)   5 (4)    1 (1)   1 (1)   0 (0)   1 (1)
  Non-Supported  165 (40)  106 (26)  44 (11)  41 (10) 5 (1)   15 (4)  31 (8)
  All            249 (47)  134 (25)  49 (9)   42 (8)  6 (1)   15 (3)  32 (6)
Student Progress Report
  Supported      50 (41)   35 (29)   14 (12)  9 (7)   3 (3)   4 (3)   56 (5)
  Non-Supported  147 (35)  146 (35)  45 (11)  47 (11) 12 (3)  6 (1)   14 (3)
  All            197 (34)  181 (31)  59 (10)  56 (9)  15 (2)  10 (2)  70 (12)
Student Study History Report
  Supported      32 (27)   35 (29)   19 (16)  7 (6)   1 (1)   7 (6)   19 (16)
  Non-Supported  88 (21)   123 (30)  72 (17)  53 (13) 11 (3)  16 (4)  49 (12)
  All            120 (23)  158 (30)  91 (17)  60 (11) 12 (2)  23 (4)  68 (13)
Grading Report
  Supported      31 (26)   37 (32)   20 (17)  8 (7)   2 (2)   3 (2)   16 (14)
  Non-Supported  99 (24)   131 (32)  54 (13)  50 (12) 14 (3)  17 (4)  49 (12)
  All            130 (24)  168 (32)  74 (14)  58 (11) 16 (3)  20 (4)  65 (12)
Test Categories Report
  Supported      28 (24)   29 (25)   18 (15)  19 (16) 4 (3)   8 (7)   11 (9)
  Non-Supported  70 (17)   128 (31)  52 (13)  66 (16) 13 (3)  20 (5)  63 (15)
  All            98 (19)   157 (30)  70 (13)  85 (16) 17 (3)  28 (5)  74 (14)
Note. Some percentages may not add to 100 due to rounding.
Teachers were next asked to assess changes in student performance as a result of RM use.
Combined “improved” and “significantly improved” percentages ranged from 61 percent to 89 percent for
supported teachers and from 53 percent to 82 percent for non-supported teachers, which showed that
over half of all teachers believed students made positive changes in the areas assessed. (See Table 32.)
Combined means for the student change items ranged from 3.64 to 4.06, which were closest to the
“improved” response category. The mean ratings were slightly higher for supported than non-supported
teachers. Although responses were positive overall, a third or more of respondents chose "no change" for
"reading comprehension" and "progress in other subjects."
Table 32
Results of Teacher Survey Student Change Items
To what extent did students change in the following areas as a result of using RM? (columns: Mean; then, each N (%): Significantly Improved; Improved; No Change; Regressed; Significantly Regressed)

Reasoning skills
  Supported      4.00  19 (16)   78 (68)   17 (15)  1 (1)   0 (0)
  Non-Supported  3.86  40 (10)   260 (67)  81 (21)  4 (1)   2 (1)
  All            3.87  59 (12)   338 (67)  98 (19)  5 (1)   2 (<1)
Non-standard thinking abilities
  Supported      3.91  17 (15)   67 (61)   25 (23)  1 (1)   0 (0)
  Non-Supported  3.77  36 (9)    231 (61)  108 (28) 4 (1)   2 (1)
  All            3.82  53 (11)   298 (61)  133 (27) 5 (1)   2 (<1)
Independence in learning
  Supported      4.09  25 (21)   79 (68)   12 (10)  1 (1)   0 (0)
  Non-Supported  3.91  61 (15)   246 (62)  83 (21)  4 (1)   2 (1)
  All            3.93  86 (17)   325 (63)  95 (18)  5 (1)   2 (<1)
Confidence in mathematical ability
  Supported      3.99  19 (16)   80 (68)   18 (15)  1 (1)   0 (0)
  Non-Supported  3.90  59 (15)   249 (63)  81 (20)  3 (1)   4 (1)
  All            3.91  78 (15)   329 (64)  99 (19)  4 (1)   4 (1)
Enjoyment of mathematics
  Supported      4.26  46 (38)   60 (50)   13 (11)  1 (1)   0 (0)
  Non-Supported  4.04  94 (24)   228 (58)  64 (16)  6 (2)   1 (<1)
  All            4.06  140 (27)  288 (56)  77 (15)  7 (1)   1 (<1)
Reading comprehension
  Supported      3.76  11 (10)   62 (57)   35 (32)  1 (1)   0 (0)
  Non-Supported  3.68  29 (8)    196 (53)  139 (38) 4 (1)   0 (0)
  All            3.69  40 (8)    258 (54)  174 (37) 5 (1)   0 (0)
Progress in other subjects
  Supported      3.74  12 (13)   44 (48)   36 (39)  0 (0)   0 (0)
  Non-Supported  3.60  25 (7)    157 (46)  158 (46) 1 (<1)  1 (<1)
  All            3.64  37 (9)    201 (46)  194 (45) 1 (<1)  1 (<1)
Note. Some percentages may not add to 100 due to rounding.

Teachers were next asked to indicate how often they experienced technology issues that
precluded them from using RM with students as planned. A noticeable percentage of supported and
non-supported teachers marked “very frequently” or “frequently” for Computer on Wheels (50%), wireless
connectivity (44%), and network (38%) issues; over half of both groups noted “occasionally,” “frequently,”
or “very frequently” having student log in issues. (See Table 33.) More non-supported (32%) than
supported (16%) teachers reported “very frequently” or “frequently” having scheduling issues or limited
access to computers. Also, more non-supported (41%) than supported (28%) teachers indicated at least
occasionally having lab delays. In general, over half of both supported and non-supported teachers chose
“occasionally,” “frequently,” or “very frequently” on four of the six technology-related survey items.
Table 33
Results of Teacher Survey Frequency of Technology Issue Items
How often have the following issues kept you from implementing RM with your students as planned? (columns, each N (%): Very Frequently; Frequently; Occasionally; Rarely; Never)

Student log in issues
  Supported      13 (11)   16 (14)   36 (30)   33 (28)  20 (17)
  Non-Supported  57 (14)   77 (19)   120 (29)  112 (27) 46 (11)
  All            70 (13)   93 (18)   156 (29)  145 (27) 66 (12)
Computer lab certification delays
  Supported      6 (5)     6 (5)     20 (18)   24 (21)  57 (50)
  Non-Supported  29 (7)    46 (12)   86 (22)   103 (26) 128 (33)
  All            35 (7)    52 (10)   106 (21)  127 (25) 185 (37)
COW (Computer on Wheels) issues
  Supported      28 (24)   27 (23)   25 (22)   16 (14)  20 (17)
  Non-Supported  121 (31)  80 (21)   69 (18)   49 (12)  72 (18)
  All            149 (29)  107 (21)  94 (19)   65 (13)  92 (18)
Wireless connectivity issues
  Supported      26 (22)   24 (20)   31 (26)   19 (16)  18 (15)
  Non-Supported  99 (24)   82 (20)   110 (27)  70 (17)  47 (11)
  All            125 (24)  106 (20)  141 (27)  89 (17)  65 (12)
Network issues
  Supported      21 (18)   21 (18)   33 (28)   29 (24)  15 (12)
  Non-Supported  76 (19)   80 (20)   128 (32)  84 (21)  30 (8)
  All            97 (18)   101 (20)  161 (31)  113 (22) 49 (9)
Scheduling issues/limited access to computers (i.e., difficulty scheduling laptop carts/computer lab time)
  Supported      7 (6)     12 (10)   34 (29)   25 (21)  40 (34)
  Non-Supported  58 (15)   66 (17)   120 (31)  78 (20)  63 (16)
  All            65 (13)   78 (16)   154 (31)  103 (20) 103 (20)
Note. Some percentages may not add to 100 due to rounding.

As mentioned previously, supported teachers were asked specifically about successes,
challenges, and suggestions for improvement, whereas non-supported teachers were given the
opportunity to provide general comments. Analyses of the data for the open-ended survey items showed
that most of the non-supported teacher comments fit into the same categories as the supported teacher
responses to the three open-ended items (successes, challenges, suggestions). As a result, supported
and non-supported teacher comments are presented together in Table 34. Like the administrator survey,
the most-cited successes were in the student learning and engagement category for both supported
(N=129) and non-supported (N=38) teachers. For example, there were 41 supported teacher and 8
non-supported teacher references to increased math performance, problem-solving abilities, and
conceptual understanding. Likewise, there were 32 supported and 20 non-supported teacher citations
related to student engagement. In addition, there were over 10 supported teacher remarks related to
increased independent learning (N=19), connections made between RM and classroom instruction
(N=14), and improved vocabulary, reading, and note-taking skills (N=12). Besides student learning and
engagement, there were 18 supported and 14 non-supported teacher references to RM being a helpful
teacher and student resource. Other success-related comments varied.
Most teacher-cited challenges fell into the main categories of technology and scheduling issues.
(See Table 34.) For example, there were 56 supported teacher mentions of technology issues that
included connection issues (N=21), computer malfunctions (N=21), login issues (N=5), browser/RM
issues (N=3), and non-specified technology problems (N=6). Likewise, supported teacher remarks related
to scheduling issues included difficulty meeting mandated time requirements (N=13), lack of computer
access (N=17), conflicts on field trip and testing days (N=4), loss of instructional time (N=11), and general
scheduling issues (N=2). There were over 10 citations for both supported and non-supported teachers for
lack of alignment between RM and state/district curriculum requirements, training issues, and student
issues. Ten supported teachers indicated that they experienced no challenges. Other challenges varied.
Suggestions for improvement were similar but varied somewhat when reviewed by teacher group.
(See Table 34.) For example, the most common supported teacher suggestions related to improving
technology (N=32), refining RM training (N=17), solving RM time/scheduling issues (N=17), and aligning
RM to district/state curriculum requirements (N=13). The most-mentioned non-supported teacher
references related to aligning RM and district/state curriculum (N=16) and improving technology (N=9).
Eighteen of the valid non-supported teacher comments were general in nature and did not fit into
the success, challenge, or suggestion categories. Specifically, there were 15 generally positive comments
that referred to RM as “great,” “awesome,” “helpful,” and so forth. In addition, three comments were
negative toward the program.
Table 34
RM Supported and Non-Supported Teacher Survey Comments
Comment (main categories flush left; sub-categories indented): Supported N / Non-Supported N

Successes
Improved student learning and engagement: 129 / 38
  Increased math performance, problem-solving abilities, conceptual understanding: 41 / 8
  Increased student engagement: 32 / 20
  Improved vocabulary, reading, and note-taking skills: 12 / 0
  Increased technology skills: 1 / 0
  Increased independent learning: 19 / 4
  Improved academic self confidence: 4 / 0
  Challenged students: 6 / 2
  Connections made between RM and classroom instruction: 14 / 4
Helpful teacher and student resource: 18 / 14
  Differentiated, interactive instructional supplement: 5 / 8
  Versatile program: 4 / 1
  Helpful data monitoring tool: 6 / 3
  Allows teachers to work one-on-one with struggling students: 3 / 2
Improved school-level implementation: 2 / 0
  Improved campus support: 2 / 0
RM staff helpful, supportive, and friendly: 3 / 3
Generally positive comments: 2 / 0
Experienced no successes: 3 / 0

Challenges
Technology issues: 56 / 18
  Connection issues: 21 / 4
  Computer/hardware/accessories malfunctions: 21 / 7
  Login issues: 5 / 4
  Browser/RM program issues: 3 / 0
  General technology issues (specific issues not defined): 6 / 3
Scheduling issues: 47 / 17
  Difficulty meeting mandated time requirements: 13 / 2
  Lack of computer access: 17 / 4
  Conflicts with field trips, testing, holidays, and other programs: 4 / 1
  Loss of instructional time: 11 / 10
  General scheduling issues: 2 / 0
RM not aligned with state/district curriculum requirements: 17 / 23
Training issues: 12 / 11
  Staff not trained/staff turnover: 2 / 7
  Learning curve to use RM: 9 / 0
  Inadequate training/communication from Program Coordinator: 1 / 4
  General negative comments: 0 / 0
Student issues: 18 / 14
  Limited technology skills/lack of computers at home: 0 / 2
  Classroom management challenges: 9 / 1
  Difficult for special needs, below average, or LEP students: 9 / 8
  General student issues (specific to RM use): 0 / 3
Limited staff buy in: 2 / 3
Experienced no challenges: 10 / 0

Suggestions for Improvement
Improve technology: 32 / 9
  Increase the number of computers: 9 / 3
  Improve network connection: 2 / 2
  Repair or replace computers: 19 / 3
  Improve or increase technology support: 2 / 1
Improve RM training: 17 / 4
  Increase the amount and quality of training: 8 / 0
  Expand training to other staff: 4 / 1
  Conduct earlier or summer training: 3 / 1
  Replace online training with in-person training/improve online training: 2 / 2
Align RM to district/state curriculum requirements: 13 / 16
Improve RM implementation: 7 / 6
  Designate one staff member to teach all RM courses: 3 / 0
  Implement RM earlier in the school year: 2 / 0
  Exempt or make adjustments for special needs/below average students: 1 / 2
  Limit RM as an enrichment program: 1 / 4
Solve RM time/scheduling issues: 17 / 5
  Create a more efficient computer lab/COW schedule: 5 / 0
  Allow more flexibility in the number of hours required per week: 8 / 5
  Create a master schedule that designates RM time for all RM classes: 4 / 0
Improve campus leadership, communication, and promotion of RM: 4 / 0
Expand RM program: 2 / 2

General Comments
Positive comment: N/A / 15
Negative comment: N/A / 3
Note. Some respondents gave responses that fit into more than one category. N/A=Not Applicable

Comparisons Across Surveys

Overall satisfaction with RM and impact on students. Results of selected overall satisfaction items
showed that over half of the three groups of staff “agreed” or “strongly agreed” that they would like to
continue using RM next year (62% to 76%), that they believed students benefited from RM (68% to 85%),
and that they would recommend RM to others (55% to 75%). For all four items, the percentages of
supported teacher responses were higher than the percentages of administrator and non-supported
teacher responses. Staff results were somewhat mixed toward perceptions of RM benefits for teachers;
that is, whereas over half of supported teachers believed RM was beneficial for teachers (57%), less than
half (48%) of campus administrators and non-supported teachers "agreed" or "strongly agreed"; this is
likely a result of supported teachers receiving more training and support from RM program coordinators.
Although more positive for supported than non-supported teachers, over half of both supported (61% to
89%) and non-supported teachers (53% to 82%) believed students improved in areas assessed such as
reasoning skills, non-standard thinking abilities, independence in learning, confidence in mathematical
ability, and so forth.
Figure 19. Percentage of Campus Administrators, Supported Teachers, and Non-Supported Teachers that "Agreed" or "Strongly Agreed" with Selected Survey Satisfaction Items

Communication between RM teachers. To assess staff communication, both supported and
non-supported teachers were asked if they worked closely with each other and how often they
collaborated; campus administrators were also asked how often they communicated with RM teachers.
Noticeably more supported (68%) than non-supported (52%) teachers “agreed” or “strongly agreed” that
they worked closely together. Results of all three surveys revealed wide variation in frequency of
communication. At least half of administrator (54%), supported teacher (65%), and non-supported
teacher (62%) respondents indicated that they communicated at least once every other week, and the
majority reported communicating at least once a month (73% to 83%).
[Figure 19 data. Administrators: 64, 48, 68, 55; Supported Teachers: 76, 57, 85, 75; Non-Supported Teachers: 62, 48, 73, 60, for would like RM on campus next year, RM helps teachers be more effective, students benefit from RM, and would recommend RM to others, respectively.]
Perceptions of RM and district support. As seen in Figure 20, over half of staff members were
“satisfied” or “extremely satisfied” with support from the RM program coordinator, RM technology staff,
other RM staff, and district technology staff. Supported teachers were noticeably more positive than
campus administrators and non-supported teachers toward support for implementing RM. Again, this is
likely due to the extra assistance that RM supported teachers received in 2012-13. As shown in Table 27,
most supported (83%) and non-supported (72%) teachers were also “satisfied” or “extremely satisfied”
with principal support.
Figure 20. Percentage of Campus Administrators, Supported Teachers, and Non-Supported Teachers that Were “Satisfied” or “Extremely Satisfied” with Selected Support Items
RM professional development and resources. Over half of administrators were “satisfied” or
“extremely satisfied” with RM administrator training (54%) as well as with the RM administrator interface
and reports (67%). As for teachers, noticeably more supported than non-supported teachers viewed the
in-person (82% versus 53%) and online (50% versus 28%) RM Qualification Course training as "very
helpful.” Similarly, markedly more supported than non-supported teachers rated the RM teacher resource
website (76% versus 49%), professional development website (63% versus 35%), and RM reports (83%
versus 59%) as “very helpful.”
[Figure 20 data. Administrators: 76, 63, 67, 61; Supported Teachers: 96, 79, 82, 83; Non-Supported Teachers: 74, 61, 59, 72, for RM Program Coordinator support, RM technology staff support, other RM staff support, and district technology staff support, respectively.]
Technology issues. As for technology issues, across the three surveys, a notable percentage
marked “frequently” or “very frequently” for experiencing COW issues (47% to 53%), wireless connectivity
issues (42% to 49%), and network issues (36% to 39%). (See Figure 21.) Noticeably more administrators
(42%) and non-supported teachers (32%) than supported teachers (16%) believed scheduling issues
and/or limited access to computers interfered with RM implementation.
Figure 21. Percentage of Administrators, Supported Teachers, and Non-Supported Teachers that Chose “Frequently” or “Very Frequently” for Technology Issues
Open-ended items (successes, challenges, suggestions). On all three surveys, the most
frequently mentioned success (N=253) was increased student learning and engagement. In addition,
there were 58 references to RM being a helpful resource and 21 mentions of increased campus-level
implementation. As seen in Figure 22, the sub-categories with more than 20 references described
increased student learning and engagement in terms of improved math performance (N=83), increased
engagement (N=75), and more independent learning (N=27).
35
20
5349
3942
25
10
4742
36
16
33
19
52
4439
32
0
20
40
60
80
100
Student log inissues
Lab certificationdelays
COW issues Wirelessconnectivity issues
Network issues Schedulingissues/limited
access
Percentage
Administrators Supported Teachers Non-Supported Teachers
Figure 22. Number of Staff Survey References to Improved Student Learning and Engagement
Technology issues (N=178) and scheduling problems (N=158) were the most referenced barriers
to implementation. Other barriers with over 30 references included lack of alignment between RM and
state/district curriculum (N=56), training issues (N=38), and student issues (N=36). Of the 178 references
to technology issues, some comments were general in nature but others were specific such as
connection issues (N=59) and computer/accessory malfunctions (N=59). (See Figure 23.) The 158
references to scheduling issues included difficulty meeting mandated time requirements (N=60),
inadequate computer access (N=46), loss of instructional time (N=31), and so forth. (See Figure 24.) In
particular, COW carts were described as problematic because they were not easily accessible to the
teachers housed in portable buildings.
Figure 23. Number of Staff Survey References to Technology-Related Challenges
[Figure 22 data: 83, 75, 27, 20, 19, 9, 8, 6, 6 references across the student learning and engagement sub-categories.]
[Figure 23 data: connection issues (59), computer/accessory malfunctions (59), general technology issues (37), student log in issues (18), browser/RM program issues (5).]
Figure 24. Number of Staff Survey References to Scheduling Issues
The most common administrator and teacher suggestion was to improve technology (N=89).
Other suggestions referenced 20 times or more were related to aligning RM with district/state
requirements (N=41), improving RM professional development issues (N=39), solving time/scheduling
challenges (N=31), and improving RM implementation (N=22). Figure 25 displays staff survey references
for suggestions to improve technology. The most common remarks included increasing the number of
computers (N=36) and repairing or replacing computers (N=28); these comments were closely related to
the mentions of scheduling issues due to inadequate computer access above.
Figure 25. Number of Staff Survey References to Suggestions for Improving Technology
[Figure 24 data: meeting time requirements (60), inadequate computer access (46), loss of instructional time (31), conflicts with field trips and testing (16), general scheduling issues (5).]
[Figure 25 data: increase number of computers (36), repair or replace computers (28), improve network connection (14), improve/increase technology support (6), improve technology, general (5).]
2.7 What were student mathematics achievement outcomes?
Methodology
The evaluators analyzed student mathematics achievement data including ITBS, STAAR, and
district semester ACPs. Overall district data was extracted from the Dallas ISD Data Packet for 2013-14
Planning (Data Packet). In addition, data from RM files, PEIMS files, and test data files were merged to
prepare for RM analyses. The evaluators first reviewed overall district data to see if achievement
improved for second- and third-grade students from spring 2012 to spring 2013. Due to low
implementation of RM, the use of RM as a supplemental program, and the lack of a control group,
findings are limited and do not show the true impact of RM on student math achievement. Second,
correlation analyses were computed to determine the strength of relationship between RM variables (time
online, objectives completed, accuracy rates) and mathematics achievement measures. Third, multiple
regression analyses were conducted to note the relationship between RM use and student achievement
and to determine the strongest predictors of student achievement. Finally, frequency analyses were used
to examine patterns among achievement, hours online, and learning mode Level A accuracy rates.
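The correlation step above can be sketched in a few lines. The variable names and values below are illustrative stand-ins, not the evaluators' actual data files or field names.

```python
# Illustrative sketch of the Pearson correlation step; all values and
# variable names are hypothetical stand-ins for the merged RM/test data.
import numpy as np

hours_online = np.array([12.3, 20.5, 28.9, 35.0, 41.8])
objectives = np.array([30, 45, 60, 72, 88])
accuracy = np.array([0.64, 0.71, 0.78, 0.83, 0.90])
itbs_nce_2013 = np.array([35.0, 42.0, 51.0, 58.0, 66.0])

# Pearson product-moment correlation (r) of each RM variable with the
# achievement measure; np.corrcoef returns the 2x2 correlation matrix.
correlations = {
    name: np.corrcoef(values, itbs_nce_2013)[0, 1]
    for name, values in [
        ("hours_online", hours_online),
        ("objectives", objectives),
        ("accuracy", accuracy),
    ]
}
for name, r in correlations.items():
    print(f"{name}: r = {r:.2f}")
```

Each r falls between -1 and 1; values near 1 indicate that students with more RM use tended to score higher on the achievement measure.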
Evaluators pulled Data Packet information to note how many second-grade students scored at or
above the 40th and 80th percentiles on ITBS Mathematics Total and how many third-grade students
scored at the Level 2 Satisfactory and Level 3 Advanced levels on STAAR mathematics. Data were
reviewed using spring 2012 and spring 2013 data for all Dallas ISD students; data were also examined by
subgroup and district division. It should be noted that four elementary schools (Allen, Dealey, H. Stone,
Stevens Park) did not use RM in 2011-12, and two (Allen, Dealey) did not utilize RM in 2012-13; however,
analyses included all students because the main goal of RM use was to improve districtwide mathematics
achievement.
Correlation and multiple regression analyses were conducted to determine the level of
relationship between various components of RM and student achievement. Pearson product-moment
correlation coefficients (r) were computed to assess the strength of relationships between RM variables (hours online,
objectives completed, accuracy rates of problems completed) and student achievement (ITBS, STAAR,
ACP) measures. The enter selection multiple regression procedure was used to determine the strongest
predictors of 2013 mathematics achievement (ITBS, STAAR, ACP). RM variables were entered as a
group followed by prior year mathematics performance (ITBS 2012). The pairwise deletion method was
used to handle missing values. RM variables entered into the model included (1) total hours online, (2)
total hours in STAAR test mode (third grade only), (3) objectives mastered, (4) Level A accuracy rates of
learning mode items, (5) Level A accuracy rates of Wall of Mastery review mode items, (6) overall
accuracy rates of STAAR test mode items (third grade only), and (7) prior mathematics achievement.
Spring ITBS Mathematics Total with Computation 2012 scores were used as the prior year variable in all
analyses. Based on regression results, frequency analyses were conducted to look at student
achievement results by hours online and Level A accuracy rates of learning mode items. Descriptions of
ITBS, STAAR, and ACP follow.
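First, though, a minimal sketch of the enter-selection regression described above, using simulated data. The predictor names are hypothetical, and for brevity ordinary least squares on complete data stands in for the pairwise-deletion procedure the evaluators used.

```python
# Sketch of an enter-method multiple regression: all RM usage variables
# are entered as a block alongside prior-year achievement (ITBS 2012 NCE).
# Data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
hours_online = rng.uniform(5, 60, n)   # total hours in RM
objectives = rng.uniform(10, 100, n)   # objectives mastered
accuracy = rng.uniform(0.5, 1.0, n)    # Level A accuracy rate
itbs_2012 = rng.normal(50, 15, n)      # prior-year NCE score

# Simulated 2013 outcome loosely tied to the predictors plus noise.
outcome_2013 = (5 + 0.1 * hours_online + 0.05 * objectives
                + 10 * accuracy + 0.4 * itbs_2012 + rng.normal(0, 5, n))

# "Enter" selection: every predictor goes into the model at once.
X = np.column_stack([np.ones(n), hours_online, objectives, accuracy, itbs_2012])
beta, *_ = np.linalg.lstsq(X, outcome_2013, rcond=None)

predicted = X @ beta
r_squared = 1 - (((outcome_2013 - predicted) ** 2).sum()
                 / ((outcome_2013 - outcome_2013.mean()) ** 2).sum())
print("coefficients:", np.round(beta, 2), "R^2:", round(r_squared, 2))
```

Comparing coefficients (or, in practice, standardized betas) shows which predictors carry the most weight; prior achievement typically dominates in analyses of this kind.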
ITBS. ITBS, a nationally normed achievement battery, is administered to district students in
grades K-2 each spring. Percentile ranks were utilized in the review of districtwide data and in follow-up
frequency analyses, whereas normal curve equivalent (NCE) scores were used in the correlation and
regression analyses. Percentile scores, which range from 1 to 99, denote how a student’s score
compares to other students in the normed comparison group. For example, a student at the 40th
percentile performed better than 40 percent of the students in the comparison group. NCE scores range
from 1 to 99 with a mean of 50; as the name suggests, NCE scores are spaced in equal intervals along the
normal curve, which allows them to be averaged and compared in ways percentile ranks cannot.
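The relationship between percentile ranks and NCE scores can be sketched with the standard conversion, in which the constant 21.06 makes percentiles 1, 50, and 99 coincide with NCEs 1, 50, and 99; this is the general formula, not code used in the evaluation.

```python
# Standard percentile-rank-to-NCE conversion: an NCE is the normal
# deviate (z) of the percentile rank, rescaled so that percentiles
# 1, 50, and 99 map to NCEs 1, 50, and 99 (21.06 is roughly 49
# divided by the z-score at the 99th percentile).
from statistics import NormalDist

def percentile_to_nce(percentile_rank: float) -> float:
    z = NormalDist().inv_cdf(percentile_rank / 100)
    return 50 + 21.06 * z

for pr in (1, 40, 50, 99):
    print(f"percentile {pr} -> NCE {percentile_to_nce(pr):.1f}")
```

Because the transformation is nonlinear, mid-range percentiles (such as the 40th) shift by several points when expressed as NCEs, while the anchor points 1, 50, and 99 stay fixed.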
STAAR. STAAR, the state’s mandated criterion-referenced assessment, is administered each
spring for students in grades 3-8. STAAR was administered for the first time in spring 2012. The percentages of
students that met the Level 2 (Satisfactory) and Level 3 (Advanced) standards were reviewed overall, by
subgroup, and by district division. Third-grade students’ raw mathematics scores were used in the
correlation and regression analyses. The Level 2 (Satisfactory) standard was employed in the follow-up
frequency analyses.
ACP. Third-grade students took a mathematics ACP at the end of each semester. The
district-developed criterion-referenced assessment is aligned to the Texas Essential Knowledge and
Skills using the third-grade mathematics Curriculum Planning Guide. The percentage of items correct is
scaled each year to maintain a similar passing rate for the district. A scale score of 70 is passing.
Results
Overall District Mathematics Achievement
The percentages of second-grade students that performed at or above the 40th and 80th
percentiles on ITBS Mathematics Total in spring 2012 and 2013 were reviewed overall, by subgroup, and
by division. (See Tables 35 and 36.) As mentioned above, due to low implementation of RM, the use of
RM as a supplemental program, and the lack of a control group, findings are limited and do not indicate
the actual impact of RM on student math achievement. The percentage that scored at or above the 40th
percentile decreased slightly from spring 2012 (57.9%) to spring 2013 (56.2%), contrary to the desired
trend. Similarly, the percentage of students that scored at or above the 80th percentile
decreased from spring 2012 (21.6%) to spring 2013 (19.8%). When reviewed by district subgroup, the
percentage of students at or above the 40th percentile decreased from 2012 to 2013 in all comparisons,
and the percentage at or above the 80th percentile decreased from 2012 to 2013 in all but one; there was
a slight increase for special education students. A review by ethnicity revealed that notably more White
(79.6%) than Hispanic (57.6%) or African American (46.4%) students scored at or above the 40th
percentile in 2013; over half of economically disadvantaged (54.5%) and LEP (55.7%) students did so.
A look at ITBS Mathematics Total results for students at or above the 40th percentile over the past
four years revealed that second-grade ITBS results increased from 2010 to 2011 and from 2011 to 2012
but decreased from 2012 to 2013. Thus, the pattern of small increases from 2010 to 2012 reversed in
2013. (See Table 37.) This pattern was similar for all subgroups except for one (African American) in
which there were decreases from 2011 to 2012 and from 2012 to 2013.
Table 35
Percentage of District Second-Grade Students At or Above the 40th Percentile on ITBS Mathematics Total from Spring 2012 to Spring 2013
Group            2012 N Tested  2012 % At/Above 40th  2013 N Tested  2013 % At/Above 40th  Change (% pts.)

District
  All Students   13,188         57.9                  13,275         56.2                  -1.7
  White             618         82.7                     663         79.6                  -3.1
  African Amer.   3,010         48.9                   3,038         46.4                  -2.5
  Hispanic        9,340         59.0                   9,362         57.6                  -1.4
  Econ. Dis.     12,096         56.2                  12,320         54.5                  -1.7
  Spec. Ed.         650         26.0                     619         23.9                  -2.1
  LEP             6,790         57.1                   6,770         55.7                  -1.4

Division 1
  All Students    2,810         56.9                   2,824         55.4                  -1.5
  White              58         77.6                      49         85.7                   8.1
  African Amer.     539         49.4                     545         52.3                   2.9
  Hispanic        2,200         58.1                   2,210         55.3                  -2.8
  Econ. Dis.      2,622         55.8                   2,660         54.4                  -1.4
  Spec. Ed.         136         23.5                     121         25.6                   2.1
  LEP             1,567         56.0                   1,578         53.7                  -2.3

Division 2
  All Students    2,575         55.3                   2,523         54.9                  -0.4
  White             113         78.8                     113         74.3                  -4.5
  African Amer.     567         46.4                     568         44.0                  -2.4
  Hispanic        1,826         56.6                   1,765         57.6                   1.0
  Econ. Dis.      2,398         53.8                   2,357         53.2                  -0.6
  Spec. Ed.         129         17.8                     121         24.8                   7.0
  LEP             1,482         54.6                   1,400         56.0                   1.4

Division 3
  All Students    2,361         56.5                   2,410         54.0                  -2.5
  White              48         72.9                      74         74.3                   1.4
  African Amer.     318         45.9                     370         40.5                  -5.4
  Hispanic        1,954         57.5                   1,937         55.7                  -1.8
  Econ. Dis.      2,227         55.6                   2,290         53.3                  -2.3
  Spec. Ed.         122         26.2                     123         19.5                  -6.7
  LEP             1,382         55.4                   1,376         54.2                  -1.2

Division 4
  All Students    2,123         59.3                   2,132         55.7                  -3.6
  White             201         90.0                     193         90.7                   0.7
  African Amer.     814         51.1                     798         45.9                  -5.2
  Hispanic        1,074         59.6                   1,105         55.8                  -3.8
  Econ. Dis.      1,832         55.2                   1,871         51.7                  -3.5
  Spec. Ed.         107         25.2                     115         26.1                   0.9
  LEP               753         56.3                     760         51.1                  -5.2

Division 5
  All Students    3,198         60.2                   3,228         58.5                  -1.7
  White             170         78.8                     202         70.3                  -8.5
  African Amer.     737         48.3                     710         45.5                  -2.8
  Hispanic        2,242         62.6                   2,281         61.5                  -1.1
  Econ. Dis.      2,966         58.9                   3,064         57.8                  -1.1
  Spec. Ed.         153         34.0                     134         21.6                 -12.4
  LEP             1,595         62.3                   1,632         60.8                  -1.5

Magnet
  All Students      106         86.8                     116         92.2                   5.4
  White              28         96.4                      29        100.0                   3.6
  African Amer.      27         70.4                      30         86.7                  16.3
  Hispanic           37         89.2                      45         91.1                   1.9
  Econ. Dis.         36         83.3                      43         88.4                   5.1
  Spec. Ed.           *            *                       *            *                     *
  LEP                 7        100.0                      13         84.6                 -15.4
Source. Dallas ISD Data Packet ITBS Mathematics Total (Combined Mathematics Concepts, Problems, and Computation subtest results). Magnet and alternative schools are included in total but not in division statistics. *Statistics are not reported for groups smaller than six.
Table 36
Percentage of District Second-Grade Students At or Above the 80th Percentile on ITBS Mathematics Total from Spring 2012 to Spring 2013
Group            2012 N Tested  2012 % At/Above 80th  2013 N Tested  2013 % At/Above 80th  Change (% pts.)

District
  All Students   13,188         21.6                  13,275         19.8                  -1.8
  White             618         48.5                     663         47.7                  -0.8
  African Amer.   3,010         16.5                   3,038         15.2                  -1.3
  Hispanic        9,340         21.2                   9,362         19.0                  -2.2
  Econ. Dis.     12,096         19.7                  12,320         17.7                  -2.0
  Spec. Ed.         650          7.7                     619          8.2                   0.5
  LEP             6,790         20.0                   6,770         17.8                  -2.2

Division 1
  All Students    2,810         19.9                   2,824         18.1                  -1.8
  White              58         48.3                      49         32.7                 -15.6
  African Amer.     539         16.0                     545         18.3                   2.3
  Hispanic        2,200         20.0                   2,210         17.5                  -2.5
  Econ. Dis.      2,622         18.6                   2,660         17.0                  -1.6
  Spec. Ed.         136          8.1                     121          8.3                   0.2
  LEP             1,567         19.5                   1,578         15.7                  -3.8

Division 2
  All Students    2,575         21.4                   2,523         21.1                  -0.3
  White             113         47.8                     113         48.7                   0.9
  African Amer.     567         15.7                     568         14.1                  -1.6
  Hispanic        1,826         21.3                   1,765         21.7                   0.4
  Econ. Dis.      2,398         19.7                   2,357         19.5                  -0.2
  Spec. Ed.         129          4.7                     121          9.1                   4.4
  LEP             1,482         19.6                   1,400         20.6                   1.0

Division 3
  All Students    2,361         16.6                   2,410         14.9                  -1.7
  White              48         31.3                      74         32.4                   1.1
  African Amer.     318         12.9                     370         10.0                  -2.9
  Hispanic        1,954         16.7                   1,937         14.8                  -1.9
  Econ. Dis.      2,227         16.2                   2,290         13.9                  -2.3
  Spec. Ed.         122          5.7                     123          4.9                  -0.8
  LEP             1,382         15.8                   1,376         14.1                  -1.7

Division 4
  All Students    2,123         24.2                   2,132         22.9                  -1.3
  White             201         61.2                     193         68.9                   7.7
  African Amer.     814         17.7                     798         14.8                  -2.9
  Hispanic        1,074         22.1                   1,105         19.8                  -2.3
  Econ. Dis.      1,832         19.8                   1,871         17.2                  -2.6
  Spec. Ed.         107          6.5                     115          8.7                   2.2
  LEP               753         18.3                     760         17.1                  -1.2

Division 5
  All Students    3,198         24.1                   3,228         20.6                  -3.5
  White             170         37.6                     202         31.7                  -5.9
  African Amer.     737         16.8                     710         15.8                  -1.0
  Hispanic        2,242         25.3                   2,281         21.2                  -4.1
  Econ. Dis.      2,966         22.9                   3,064         19.7                  -3.2
  Spec. Ed.         153         11.8                     134          9.0                  -2.8
  LEP             1,595         25.1                   1,632         20.5                  -4.6

Magnet
  All Students      106         52.8                     116         52.6                  -0.2
  White              28         57.1                      29         79.3                  22.2
  African Amer.      27         44.4                      30         36.7                  -7.7
  Hispanic           37         54.1                      45         40.0                 -14.1
  Econ. Dis.         36         50.0                      43         34.9                 -15.1
  Spec. Ed.           *            *                       *            *                     *
  LEP                 7         42.9                      13         38.5                  -4.4
Source. Dallas ISD Data Packet ITBS Mathematics Total (Combined Mathematics Concepts, Problems, and Computation subtest results). Magnet and alternative schools are included in total but not in division statistics. *Statistics are not reported for groups smaller than six.
Table 37
Percentage of District Second-Grade Students At or Above the 40th Percentile on ITBS Mathematics Total by Student Group from Spring 2010 to Spring 2013
Year    All Students Tested N    District %    White %    African American %    Hispanic %    Econ. Disadv. %    Special Educ. %    LEP %
2010 13,231 54.6 76.1 49.7 54.7 53.1 21.8 52.7
2011 13,134 56.8 78.0 50.6 57.5 55.6 24.8 56.2
2012 13,188 57.9 82.7 48.9 59.0 56.2 26.0 57.1
2013 13,275 56.2 79.6 46.4 57.6 54.5 23.9 55.7
Source. Dallas ISD Data Packet ITBS Mathematics Total (Combined Mathematics Concepts, Problems, and Computation subtest results).
Next, the percentages of third-grade students that met the Level 2 Satisfactory and Level 3
Advanced standards on the STAAR math subtest in 2012 and 2013 were examined for all district
students, by subgroup, and by district division. (See Tables 38 and 39.) Again, due to low implementation
of RM, the use of RM as a supplemental program, and the lack of a control group, findings are limited and
do not reveal the true impact of RM on student math achievement. For all district third-grade students,
57.3 percent met the Level 2 Satisfactory standard in 2013, up 2.1 percentage points from 2012,
when 55.2 percent met the standard. Similarly, there was a slight increase (1.2 percentage points) in the percentage of
students that met the Level 3 Advanced standard. A review by ethnicity showed that the percentage of
students that met the Level 2 Satisfactory standard was higher for White students (80.6%) than for
Hispanic (60.5%) and African American (42.1%) students. Over half of economically disadvantaged
(55.6%) and LEP (59.7%) students met the Level 2 Satisfactory standard.
Table 38
Percentage of District Third-Grade Students that Met Satisfactory on STAAR Mathematics in Spring 2012 and Spring 2013
Group              2012 N Tested  2012 % Met Satisfactory  2013 N Tested  2013 % Met Satisfactory  Change (% pts.)

District
  All Students     11,991         55.2                     12,029         57.3                      2.1
  White               481         78.0                        571         80.6                      2.6
  African American  2,794         42.0                      2,772         42.1                      0.1
  Hispanic          8,513         58.0                      8,472         60.5                      2.5
  Econ. Dis.       11,064         53.8                     11,000         55.6                      1.8
  Spec. Ed.           408         29.2                        410         39.0                      9.8
  LEP               6,195         57.4                      6,053         59.7                      2.3

Division 1
  All Students      2,570         56.0                      2,680         58.1                      2.1
  White                28         71.4                         45         84.4                     13.0
  African American    504         44.0                        527         46.7                      2.7
  Hispanic          2,016         58.7                      2,092         60.3                      1.6
  Econ. Dis.        2,406         55.2                      2,501         57.2                      2.0
  Spec. Ed.            77         24.7                        110         35.5                     10.8
  LEP               1,442         57.4                      1,466         59.3                      1.9

Division 2
  All Students      2,354         55.4                      2,297         56.0                      0.6
  White                86         70.9                        108         78.7                      7.8
  African American    505         41.2                        512         41.2                      0.0
  Hispanic          1,709         58.4                      1,613         58.7                      0.3
  Econ. Dis.        2,176         54.2                      2,114         54.6                      0.4
  Spec. Ed.            77         22.1                         67         37.3                     15.2
  LEP               1,352         58.7                      1,274         58.7                      0.0

Division 3
  All Students      2,180         50.8                      2,042         57.8                      7.0
  White                48         68.8                         40         62.5                     -6.3
  African American    329         40.7                        291         39.2                     -1.5
  Hispanic          1,770         51.9                      1,668         61.0                      9.1
  Econ. Dis.        2,056         49.9                      1,935         57.1                      7.2
  Spec. Ed.            80         37.5                         74         37.8                      0.3
  LEP               1,232         50.3                      1,164         59.7                      9.4

Division 4
  All Students      1,933         55.1                      1,978         55.5                      0.4
  White               161         90.1                        194         87.1                     -3.0
  African American    733         40.7                        764         38.7                     -2.0
  Hispanic          1,000         59.3                        985         61.8                      2.5
  Econ. Dis.        1,675         50.9                      1,686         51.1                      0.2
  Spec. Ed.            90         35.6                         50         52.0                     16.4
  LEP                 706         57.2                        665         61.5                      4.3

Division 5
  All Students      2,845         57.4                      2,916         57.3                     -0.1
  White               138         71.0                        155         74.8                      3.8
  African American    689         41.9                        647         43.0                      1.1
  Hispanic          1,978         61.8                      2,074         60.5                     -1.3
  Econ. Dis.        2,702         56.7                      2,713         56.2                     -0.5
  Spec. Ed.            80         23.8                        105         38.1                     14.3
  LEP               1,451         62.0                      1,475         60.1                     -1.9

Magnet
  All Students        109         67.9                        115         82.6                     14.7
  White                20         90.0                         29         93.1                      3.1
  African American     34         64.7                         31         67.7                      3.0
  Hispanic             40         55.0                         39         84.6                     29.6
  Econ. Dis.           49         57.1                         50         76.0                     18.9
  Spec. Ed.             *            *                          *            *                        *
  LEP                  12         58.3                          9         77.8                     19.5
Source. Dallas ISD Data Packet first administration of STAAR. Magnet and alternative schools are included in total but not in division statistics. STAAR M (modified) and STAAR L (linguistically accommodated) are not included. *Statistics are not reported for groups smaller than six.
Table 39
Percentage of District Third-Grade Students that Met Advanced on STAAR Mathematics in Spring 2012 and Spring 2013
Group              2012 N Tested  2012 % Met Advanced  2013 N Tested  2013 % Met Advanced  Change (% pts.)

District
  All Students     11,991          7.9                 12,029          9.1                  1.2
  White               481         26.6                    571         28.4                  1.8
  African American  2,794          2.9                  2,772          4.3                  1.4
  Hispanic          8,513          8.2                  8,472          9.2                  1.0
  Econ. Dis.       11,064          6.8                 11,000          7.9                  1.1
  Spec. Ed.           408          3.2                    410          2.7                 -0.5
  LEP               6,195          8.1                  6,053          8.8                  0.7

Division 1
  All Students      2,570          7.7                  2,680          9.6                  1.9
  White                28         17.9                     45         35.6                 17.7
  African American    504          3.6                    527          6.8                  3.2
  Hispanic          2,016          8.5                  2,092          9.6                  1.1
  Econ. Dis.        2,406          7.3                  2,501          9.2                  1.9
  Spec. Ed.            77          0.0                    110          0.9                  0.9
  LEP               1,442          8.9                  1,466          9.5                  0.6

Division 2
  All Students      2,354          7.8                  2,297          9.3                  1.5
  White                86         19.8                    108         24.1                  4.3
  African American    505          2.4                    512          3.3                  0.9
  Hispanic          1,709          8.4                  1,613          9.7                  1.3
  Econ. Dis.        2,176          7.2                  2,114          8.2                  1.0
  Spec. Ed.            77          1.3                     67          4.5                  3.2
  LEP               1,352          7.9                  1,274          8.9                  1.0

Division 3
  All Students      2,180          6.2                  2,042          7.1                  0.9
  White                48         16.7                     40         25.0                  8.3
  African American    329          3.6                    291          4.8                  1.2
  Hispanic          1,770          6.4                  1,668          6.8                  0.4
  Econ. Dis.        2,056          5.4                  1,935          6.4                  1.0
  Spec. Ed.            80          3.8                     74          0.0                 -3.8
  LEP               1,232          5.6                  1,164          7.0                  1.4

Division 4
  All Students      1,933          9.5                  1,978          9.4                 -0.1
  White               161         42.2                    194         34.5                 -7.7
  African American    733          2.6                    764          1.6                 -1.0
  Hispanic          1,000          8.5                    985         10.5                  2.0
  Econ. Dis.        1,675          5.7                  1,686          6.5                  0.8
  Spec. Ed.            90          5.6                     50          6.0                  0.4
  LEP                 706          8.2                    665          9.6                  1.4

Division 5
  All Students      2,845          8.1                  2,916          9.1                  1.0
  White               138         19.6                    155         21.3                  1.7
  African American    689          2.8                    647          5.3                  2.5
  Hispanic          1,978          9.2                  2,074          9.3                  0.1
  Econ. Dis.        2,702          7.6                  2,713          8.3                  0.7
  Spec. Ed.            80          3.8                    105          2.9                 -0.9
  LEP               1,451          9.6                  1,475          9.1                 -0.5

Magnet
  All Students        109         11.0                    115         23.5                 12.5
  White                20         15.0                     29         34.5                 19.5
  African American     34          5.9                     31         16.1                 10.2
  Hispanic             40          5.0                     39         23.1                 18.1
  Econ. Dis.           49          6.1                     50         14.0                  7.9
  Spec. Ed.             *            *                      *            *                    *
  LEP                  12          8.3                      9         22.2                 13.9
Source. Dallas ISD Data Packet first administration of STAAR. Magnet and alternative schools are included in total but not in division statistics. STAAR M (modified) and STAAR L (linguistically accommodated) are not included. *Statistics are not reported for groups smaller than six.
Correlations between RM Components and Math Achievement
Table 40 shows Pearson product-moment correlation coefficients (r) between RM components (time,
objectives completed, accuracy rates) and student mathematics achievement measures (ITBS, fall ACP,
spring ACP, STAAR). The purpose of the correlation analyses was to determine the strength of the
relationship between student achievement and various RM variables. Correlation coefficients range
from -1 to 1, with zero indicating no correlation and 1 or -1 indicating a perfect correlation. Absolute values of
.1, .3, and .5 are considered small, medium, and large, respectively (Cohen, 1988). The largest
correlations were between accuracy rates on learning mode level A problems and achievement measures
(.594 to .725) as well as between accuracy rates on STAAR test mode problems and achievement tests
for third grade (.534 to .720); remaining accuracy rate and achievement correlations ranged from .065
(small) to .383 (medium). Correlations between objectives completed and achievement ranged from .211
(small) to .315 (medium), whereas the correlations between hours online and achievement (.058 to .137)
and between hours online in STAAR test mode and achievement (-.021 to .134) were small. These
results showed that students’ mastery of objectives and accuracy rates on basic learning mode problems
were more strongly related to mathematics achievement than time spent online.
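The computation behind Table 40 can be sketched in a few lines: calculate r for a pair of variables, then label its magnitude against Cohen's benchmarks. The data below are invented stand-ins for an RM component and a test score, not district records:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cohen_size(r):
    """Label |r| using Cohen's (1988) small/medium/large benchmarks."""
    a = abs(r)
    if a >= 0.5:
        return "large"
    if a >= 0.3:
        return "medium"
    if a >= 0.1:
        return "small"
    return "negligible"

# Hypothetical paired values: hours online vs. a raw test score.
hours = [12, 25, 31, 18, 40, 22, 35, 28]
scores = [41, 52, 60, 44, 71, 50, 63, 58]

r = pearson_r(hours, scores)
print(f"r = {r:.3f} ({cohen_size(r)})")
```

Note that r describes the strength of a linear relationship only; it does not by itself establish that RM use caused higher scores, which is why the report repeatedly cautions about the lack of a control group.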
Table 40
Correlations between RM Use and Mathematics Achievement Test Scores
                                 ITBS 2013 NCE   ACP Fall 2012   ACP Spring 2013   STAAR 2013
                                 (2nd Grade)     (3rd Grade)     (3rd Grade)       (3rd Grade)
Component                        N        r      N        r      N        r        N        r

Time Variables (Number of Hours Online)
Hours Online (LM and WM)
  Semester 1                     12,244   .137   11,512   .074   11,429   .123     11,750   .117
  Semester 2                     12,244   .118   11,512   .058   11,429   .109     11,750   .099
Hours Online in Test Modes
  Semester 1                     -        -      548      .089   536      .134     546      .132
  Semester 2                     -        -      7,803    -.021  7,795    .013     7,987    .033

Objectives Variables (Objectives Completed)
  Semester 1                     11,565   .315   11,126   .211   10,897   .265     11,208   .254
  Semester 2                     12,114   .300   11,419   .226   11,385   .304     11,685   .283

Accuracy Variables (Accuracy When Completing Math Problems)
Learning Mode Level A
  Semester 1                     11,548   .594   11,097   .644   10,869   .622     11,179   .672
  Semester 2                     12,088   .632   11,392   .639   11,360   .698     11,657   .725
Learning Mode Level B
  Semester 1                     6,466    .349   5,609    .110   5,522    .073     5,617    .065
  Semester 2                     11,108   .339   10,122   .359   10,109   .369     10,328   .379
Learning Mode Level C
  Semester 1                     4,929    .282   4,044    .099   3,971    .096     4,038    .087
  Semester 2                     9,015    .296   6,768    .308   6,762    .301     6,886    .301
WM Review Mode Level A
  Semester 1                     2,363    .309   3,156    .288   3,109    .307     3,179    .320
  Semester 2                     6,734    .262   6,504    .353   6,488    .378     6,657    .383
WM Review Mode Level B
  Semester 1                     1,973    .399   3,064    .173   3,010    .144     3,077    .152
  Semester 2                     5,855    .422   6,540    .350   6,527    .347     6,667    .356
WM Review Mode Level C
  Semester 1                     1,026    .144   1,977    .122   1,936    .110     1,979    .125
  Semester 2                     3,984    .239   4,323    .249   4,301    .263     4,390    .279
Accuracy in STAAR Test Mode
  Semester 1                     -        -      540      .547   528      .534     538      .564
  Semester 2                     -        -      7,776    .664   7,768    .671     7,958    .720

Note. Correlation absolute values of .1, .3, and .5 are considered small, medium, and large, respectively (Cohen, 1988). Correlations with medium and large effect sizes are in bold. WM=Wall of Mastery
Multiple Regression Analyses
Multiple regression analyses were conducted to determine the strongest predictors of student
mathematics achievement on ITBS (second grade), ACP (third grade), and STAAR (third grade). RM
variables in the analyses included (1) total hours spent online, (2) total hours spent in STAAR test mode
(third grade only), (3) number of objectives completed, (4) accuracy rates of Level A learning mode
problems, (5) accuracy rates of Level A review mode (Wall of Mastery) items, (6) accuracy rates of
STAAR test mode problems (third grade only), and (7) prior achievement (2012 ITBS NCE scores).
Tables 41 through 46 display the inter-correlations among the predictor variables, the overall variance of
the test scores explained by the predictor variables included in the model (R2), and the standardized
regression coefficients (β) and t-values for each predictor variable.
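To make the R-squared and standardized coefficient (beta) columns concrete, here is a minimal sketch of how standardized betas arise: z-score the outcome and each predictor, then fit ordinary least squares. The data are simulated stand-ins for the RM variables, and the helper name is hypothetical, not part of the evaluation's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two correlated predictors and an outcome built from them
# plus noise, standing in for prior achievement, accuracy, and a test score.
n = 500
prior = rng.normal(size=n)
accuracy = 0.6 * prior + 0.8 * rng.normal(size=n)
outcome = 0.5 * prior + 0.4 * accuracy + 0.7 * rng.normal(size=n)

def standardized_betas(X, y):
    """OLS on z-scored variables; the coefficients are standardized betas."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)  # no intercept: means are 0
    r2 = 1.0 - ((yz - Xz @ beta) ** 2).sum() / (yz ** 2).sum()
    return beta, r2

betas, r2 = standardized_betas(np.column_stack([prior, accuracy]), outcome)
print("betas:", np.round(betas, 2), "R^2:", round(r2, 3))
```

Because the variables are standardized, each beta expresses the expected change in the outcome, in standard deviations, for a one-standard-deviation change in that predictor with the others held constant, which is what allows the report to compare predictors on a common scale.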
The first multiple regression analysis was conducted to determine whether second-grade
students’ RM program use (hours online, objectives completed, learning mode Level A accuracy rates,
review mode Level A accuracy rates) and prior math performance (ITBS 2012) significantly predicted
achievement on spring 2013 ITBS. Results revealed that the five predictors explained about 64 percent of
the variance in ITBS test scores (R2=.644, F[5, 6,648]=2,403.91, p<.001). (See Table 41.) Three variables
accounted for most of the variance; these included prior 2012 ITBS math achievement (β=.42, p<.001),
learning mode Level A accuracy rates (β=.37, p<.001), and objectives completed (β=.27, p<.001). The
Wall of Mastery review mode Level A accuracy variable was statistically significant; however, the
magnitude was very small (β=.02, p<.01). Total hours students spent online had a suppressor or very
small, negative effect (β=-.06, p<.001) within the prediction model. In other words, when the other
variables were accounted for, students with higher hours online were expected to have lower 2013 ITBS
math scores.
A follow-up regression analysis was conducted that included the three main predictors found in
the original model. Results showed that the overall variance explained (R2) using three predictors
(R2=.642, F[3, 11,154]=6,668.243, p<.001) was about the same as the original model with five predictors
(R2=.644, F[5, 6,648]=2,403.91, p<.001); that is, both models predicted 64 percent of the variance in
spring 2013 ITBS math scores. (See Table 42.) Consequently, as values increase for the three key
variables (prior math achievement, objectives completed, learning mode Level A accuracy rates), ITBS
math scores are likely to increase.
Table 41
Multiple Regression Results for Second-Grade 2013 ITBS Math Scores and Various Predictors
Correlations among Variables and Regression Results (R2=0.644, Adjusted R2=0.644)

Variable                     1      2      3      4      5      6      β       t-value
1. ITBS 2013 Scores          1.00
2. LM Level A Accuracy       0.65   1.00                               0.37    38.18***
3. Objectives Completed      0.38   0.07   1.00                        0.27    24.86***
4. WM Level A Accuracy       0.28   0.34   0.12   1.00                 0.02     2.60**
5. Total Hours Online        0.15  -0.08   0.71   0.06   1.00         -0.06    -5.34***
6. Prior Year ITBS Scores    0.72   0.61   0.29   0.25   0.09   1.00   0.42    43.03***

M     48.60   74.70   11.20   75.10   36.70   47.90
SD    22.00   12.40    5.40   22.60   17.10   21.00
N    13,250  12,257  12,264   7,238  12,340  11,902
Note. LM=Learning mode; WM=Wall of Mastery review mode **p<.01 ***p<.001
Table 42
Multiple Regression Results for Second-Grade 2013 ITBS Math Scores and Three Main Predictors
Correlations among Variables and Regression Results (R2=0.642, Adjusted R2=0.642)

Variable                     1      2      3      4      β       t-value
1. ITBS 2013 Scores          1.00
2. LM Level A Accuracy       0.65   1.00                 0.38    53.10***
3. Objectives Completed      0.38   0.07   1.00          0.23    39.21***
4. Prior Year ITBS Scores    0.72   0.61   0.29   1.00   0.42    56.32***

M     48.60   74.70   11.20   47.90
SD    22.00   12.40    5.40   21.00
N    13,250  12,257  12,264  11,902

Note. LM=Learning mode ***p<.001
Table 43 displays the results of the next multiple regression analysis, which examined the
relationship between third-grade students’ spring 2013 STAAR scores and seven predictors. Results
revealed that the seven predictors explained about 68 percent of the variance in STAAR test scores
(R2=.679, F[7, 5,169]=1,559.835, p<.001). The largest predictors were accuracy rates for learning mode
Level A problems (β=.33, p<.001), prior year (2012) ITBS scores (β=.29, p<.001), and STAAR test mode
accuracy rates (β=.28, p<.001). When other variables were taken into account, students’ total time online
using RM had a small, negative (suppressor) effect (β=-.05, p<.001). The remaining two variables
(number of objectives completed, hours in STAAR test modes) were statistically significant but had very
small betas (β=.06, p<.001 and β=.03, p<.01, respectively). As a result, a multiple regression analysis
was conducted using the three main predictors only. Results showed that overall variance explained (R2)
was about the same with the three main predictors (R2=.677, F[3, 7,211]=5,037.074, p<.001) as with
seven (R2=.679, F[7, 5,169]=1,559.835, p<.001). (See Table 44.) Thus, the higher the values of the major
predictors (prior achievement, accuracy levels of learning mode Level A problems, accuracy levels of
STAAR test mode Level A problems), the higher students’ STAAR scores are likely to be at the end of the
school year.
Table 43
Multiple Regression Results for Third-Grade 2013 STAAR Scores and Various Predictors
Correlations among Variables and Regression Results (R2=0.679, Adjusted R2=0.678)

Variable                     1      2      3      4      5      6      7      8      β       t-value
1. STAAR 2013 Scores         1.00
2. LM Level A Accuracy       0.75   1.00                                             0.33    24.36***
3. STAAR TM Accuracy         0.72   0.71   1.00                                      0.28    23.79***
4. Hours in STAAR TM         0.03   0.03   0.07   1.00                               0.03     3.02**
5. Objectives Completed      0.30   0.38   0.24   0.03   1.00                        0.06     4.62***
6. WM Level A Accuracy       0.38   0.48   0.37   0.04   0.21   1.00                 0.01     1.19
7. Total Hours Online        0.13   0.24   0.10   0.21   0.74   0.14   1.00         -0.05    -4.01***
8. Prior Year ITBS Scores    0.73   0.71   0.66  -0.04   0.28   0.36   0.06   1.00   0.29    23.85***

M     28.20   73.30   52.40   1.70    7.30   66.80   36.70   51.10
SD     9.50   12.70   19.10   1.70    5.10   25.10   17.20   22.10
N    12,699  11,734   8,045   8,073  11,737  7,314  11,750  11,099
Note. LM=Learning mode; TM=Test mode; WM=Wall of Mastery review mode **p<.01 ***p<.001
Table 44
Multiple Regression Results for Third-Grade 2013 STAAR Scores and Three Main Predictors
Correlations among Variables and Regression Results (R2=0.677, Adjusted R2=0.677)

Variable                     1      2      3      4      β       t-value
1. STAAR 2013 Scores         1.00
2. LM Level A Accuracy       0.75   1.00                 0.34    31.84***
3. STAAR TM Accuracy         0.72   0.71   1.00          0.29    28.67***
4. Prior Year ITBS Scores    0.73   0.71   0.66   1.00   0.30    29.77***

M     28.20   73.30   52.40   51.10
SD     9.50   12.70   19.10   22.10
N    12,699  11,734   8,045  11,099

Note. LM=Learning mode; TM=Test mode ***p<.001
Next, a multiple regression analysis was conducted to examine third-grade RM students’ spring
2013 ACP scores and various predictors. (See Table 45.) Results indicated that the seven predictors
explained about 61 percent of the variance in ACP test scores (R2=.612, F[7, 5,054]=1,136.957, p<.001).
Like the STAAR regression model above, the three largest predictors were accuracy of learning mode
Level A problems (β=.30, p<.001), prior 2012 ITBS math scores (β=.29, p<.001), and accuracy of STAAR
test mode Level A problems (β=.24, p<.001). When other variables were accounted for in the model, total
hours online had a very small suppressor effect (β=-.06, p<.001) and did not meaningfully contribute to
the regression model. Two RM variables (number of objectives completed and accuracy of Wall of
Mastery review mode Level A problems) were statistically significant but very small in magnitude (β=.11,
p<.001 and β=.03, p<.01, respectively). When the evaluators conducted a multiple regression model
limited to the three main predictors above, the overall variance explained (R2) was about the same with
three predictors (R2=.606, F[3, 7,069]=3,620.161, p<.001) as with seven (R2=.612, F[7, 5,054]=1,136.957,
p<.001). (See Table 46.) That is, 61 percent of the variance in spring 2013 third-grade math ACP scores
could be accounted for with three predictor variables (accuracy of learning mode Level A problems, prior
2012 ITBS math scores, accuracy of STAAR test mode Level A problems) and precluded the need to
include all seven variables into the model. Again, the higher the values of the three major predictor
variables (accuracy of learning mode Level A problems, prior 2012 ITBS math scores, accuracy of
STAAR test mode Level A problems), the better the students are likely to score on the second semester
third-grade math ACP.
Table 45
Multiple Regression Results for Third-Grade 2013 ACP Scores and Various Predictors
Correlations among Variables and Regression Results (R2=0.612, Adjusted R2=0.611)

Variable                     1      2      3      4      5      6      7      8      β       t-value
1. ACP 2013 Scores           1.00
2. LM Level A Accuracy       0.71   1.00                                             0.30    19.84***
3. STAAR TM Accuracy         0.67   0.70   1.00                                      0.24    18.04***
4. Hours in STAAR TM         0.01   0.02   0.06   1.00                               0.01     1.22
5. Objectives Completed      0.32   0.38   0.24   0.03   1.00                        0.11     7.83***
6. WM Level A Accuracy       0.38   0.48   0.37   0.04   0.20   1.00                 0.03     2.73**
7. Total Hours Online        0.14   0.23   0.09   0.21   0.74   0.14   1.00         -0.06    -4.09***
8. Prior Year ITBS Scores    0.70   0.71   0.66  -0.04   0.27   0.35   0.05   1.00   0.29    21.47***

M     15.00   73.70   52.80   1.70    7.40   67.00   36.90   51.50
SD     6.00   12.50   19.00   1.70    5.10   25.10   17.20   21.90
N    12,351  11,416   7,854   7,880  11,419  7,126  11,429  10,828
Note. LM=Learning mode; TM=Test mode; WM=Wall of Mastery review mode **p<.01 ***p<.001
Table 46
Multiple Regression Results for Third-Grade 2013 ACP Scores and Three Main Predictors
Correlations among Variables and Regression Results (R2=0.606, Adjusted R2=0.606)

Variable                     1      2      3      4      β       t-value
1. ACP 2013 Scores           1.00
2. LM Level A Accuracy       0.71   1.00                 0.33    27.88***
3. STAAR TM Accuracy         0.67   0.70   1.00          0.24    21.40***
4. Prior Year ITBS Scores    0.70   0.71   0.66   1.00   0.30    27.25***

M     15.00   73.70   52.80   51.50
SD     6.00   12.50   19.00   21.90
N    12,351  11,416   7,854  10,828

Note. LM=Learning mode; TM=Test mode ***p<.001
Comparisons of Math Achievement, Hours Online, and Level A Accuracy Rates
Because learning mode Level A accuracy rates were strong predictors of student achievement in
all regression models, the evaluators conducted frequency analyses to examine the relationship between
time, accuracy, and student achievement. Figure 26 and Table 47 show the percentage of second-grade
students that scored at or above the 40th percentile on ITBS by hour and accuracy rate categories.
Although overall mean scores increased somewhat as time online increased, second-grade students that
achieved accuracy rates of 75 percent or higher did notably better than those with lower accuracy rates
no matter how much time was spent online. Less than half of second-grade students with accuracy rates
between 50 percent and 74 percent scored at or above the 40th percentile (21.8% to 48.9%), whereas
most students with accuracy rates at or above 75 percent did so (62.6% to 92.5%). The percentages of
students that scored at or above the 40th percentile and achieved accuracy rates of 75 percent or higher
ranged from 62.6 percent (0-19.99 hours) to 79.9 percent (20-39.99 hours) to 85.3 percent (40-59.99
hours) to 92.5 percent (60 or more hours). Thus, for second-grade students, the amount of time spent
online was important, but accuracy rates at or above 75 percent best predicted higher ITBS achievement
as over 60 percent in all time categories scored at or above the 40th percentile.
Figure 27 and Table 47 display the percentage of third-grade students that met the Level 2
Satisfactory standard on STAAR by hour and accuracy comparisons. For all four time categories, over 80
percent of students with accuracy rates at 75 percent or higher met the third-grade Level 2 Satisfactory
STAAR math standard. Like ITBS, the percentages of students that met the Level 2 Satisfactory standard
were dramatically different for students with accuracy rates between 50 percent and 74 percent (27.1% to
34.7%) and for those with accuracy rates at 75 percent or higher (83.8% to 86.3%). When looking at time
alone, the percentage that met the standard increased as time online increased and ranged from 50.2
percent (0-19.99 hours) to 55.9 percent (20-39.99 hours) to 62.1 percent (40-59.99 hours) to 64.7 percent
(60 or more hours). Even so, accuracy proved to be a much more important predictor than time because
over 80 percent of students in all time categories that achieved accuracy rates of 75 percent or higher
met the third-grade 2013 STAAR Level 2 Satisfactory mathematics standard.
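The hour-by-accuracy breakdown described above amounts to a two-way frequency table: band students by hours online and by Level A accuracy, then compute the percentage meeting the standard in each cell. A minimal sketch with a handful of invented records (the bands mirror the report's categories; none of these values are district data):

```python
# Hypothetical records: (hours online, Level A accuracy %, met standard?).
students = [
    (15, 80, True), (15, 60, False), (35, 78, True), (35, 55, False),
    (45, 90, True), (45, 72, False), (65, 85, True), (65, 76, True),
]

def hour_band(h):
    """Assign hours online to the report's four time categories."""
    if h < 20: return "0-19.99"
    if h < 40: return "20-39.99"
    if h < 60: return "40-59.99"
    return "60+"

def accuracy_band(a):
    """Assign a Level A accuracy rate to the report's four accuracy categories."""
    if a < 25: return "0-24"
    if a < 50: return "25-49"
    if a < 75: return "50-74"
    return "75+"

# Cross-tabulate: per (hours, accuracy) cell, count tested and count passing.
cells = {}
for hours, acc, met in students:
    key = (hour_band(hours), accuracy_band(acc))
    total, passed = cells.get(key, (0, 0))
    cells[key] = (total + 1, passed + int(met))

for key in sorted(cells):
    total, passed = cells[key]
    print(key, f"{100 * passed / total:.1f}% met standard (n={total})")
```

Reading across a row of such a table shows the effect of time at a fixed accuracy level, and reading down a column shows the effect of accuracy at a fixed amount of time, which is how the report separates the two influences.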
Figure 26. Percentage of Second-Grade Students At or Above the 40th Percentile on Spring 2013 ITBS Mathematics by Level A Accuracy and Total Hours Online

Level A Accuracy    0-19 Hrs   20-39 Hrs   40-59 Hrs   60+ Hrs
0-24%                  0.0        0.0         0.0        0.0
25-49%                 8.1        7.3         9.8        0.0
50-74%                21.8       31.9        38.6       48.9
75+%                  62.6       79.9        85.3       92.5

Figure 27. Percentage of Third-Grade Students that Met Level 2 Satisfactory on Spring 2013 STAAR Mathematics by Level A Accuracy and Total Hours Online

Level A Accuracy    0-19 Hrs   20-39 Hrs   40-59 Hrs   60+ Hrs
0-24%                 14.3        0.0         0.0        0.0
25-49%                13.7       13.9        13.8       15.4
50-74%                34.7       28.9        30.4       27.1
75+%                  84.9       86.3        86.2       83.8
Table 47
RM Student ITBS and STAAR Results by Hours Online and Level A Accuracy Rates
                          Second Grade                             Third Grade
Level A Accuracy (%)      N Tested   % At/Above 40th on ITBS       N Tested   % Met Level 2 Satisfactory on STAAR

0-19.99 Hours
  0-24                    4          0.0                           7          14.3
  25-49                   86         8.1                           211        13.7
  50-74                   586        21.8                          967        34.7
  75+                     1,126      62.6                          662        84.9
  All                     1,802      46.6                          1,847      50.2

20-39.99 Hours
  0-24                    4          0.0                           6          0.0
  25-49                   165        7.3                           281        13.9
  50-74                   2,316      31.9                          2,498      28.9
  75+                     2,938      79.9                          2,629      86.3
  All                     5,423      57.1                          5,414      55.9

40-59.99 Hours
  0-24                    2          0.0                           0          0.0
  25-49                   102        9.8                           80         13.8
  50-74                   1,916      38.6                          1,343      30.4
  75+                     1,941      85.3                          1,933      86.2
  All                     3,961      60.7                          3,356      62.1

60 Hours or More
  0-24                    0          0.0                           0          0.0
  25-49                   24         0.0                           13         15.4
  50-74                   456        48.9                          361        27.1
  75+                     465        92.5                          743        83.8
  All                     945        69.1                          1,117      64.7
Note. Some percentages may not add to 100 due to rounding.
SUMMARY
The RM technology-based math curriculum program was provided as a supplement for
Dallas ISD second-grade students in 2011-12 and expanded to second- and third-grade students in
2012-13. During 2012-13, students at all but two elementary schools (Allen, Dealey) were enrolled in RM.
This included 13,398 (98.8%) of the district’s 13,563 second-grade students and 12,753 (98.8%) of the
district’s 12,907 third-grade students for a total of 26,151 students. In addition, 584 second- and
third-grade teachers were trained including 129 supported teachers, 402 non-supported teachers, and 53
inactive teachers that left the district or were assigned to another position within the district.
The district budget for RM was $1,494,500, which was funded through Title I, Part A ($969,500)
and Title II, Part A ($525,000). Title I funds were allocated for 27,700 individual student accounts at $35 a
student; this included 14,200 second-grade and 13,500 third-grade accounts. Likewise, Title II funds were
allotted to pay for RM professional development and support for one teacher per RM campus at a cost of
$3,500 per supported teacher.
Whereas all RM teachers were “supported” teachers in 2011-12, when RM was expanded to
second and third grade in 2012-13, the district limited the number of supported teachers to one per
campus for financial reasons; the remaining RM second- and third-grade mathematics teachers were
“non-supported” teachers, which means they received less support and training from RM. Whereas all
teachers (supported and non-supported) completed the RM Qualification Course before using RM with
students, supported teachers received six additional professional development courses (12 hours), formal
classroom observations and follow up, and ongoing support from the assigned RM program coordinator.
Supported teachers were expected to collaborate with non-supported teachers on their campuses, to
share information learned in training, and to serve as the “go to” person as needed. RM program
coordinators assisted non-supported teachers as time allowed, but per the RM support model, most
assistance went to the supported teachers.
RM provided both in-person and online professional development in 2012-13. Fewer than half of
the teachers (48%) completed the RM Qualification Course by the end of the first six weeks, which
delayed use of RM with students. Notably more supported (72%) than non-supported (40%) teachers
completed the course by the end of the first six weeks of the school year. Twenty-six teachers (one
supported and 25 non-supported) did not complete the course requirement by the mid-May cut-off date;
most likely, these were replacements for teachers who had left the district or been reassigned. Most
supported teachers (92%) met the twelve-hour training requirement by completing Best Practice and
Curriculum Study Workshops during 2012-13.
Implementation of RM improved from fall to spring, but the district did not meet RM fidelity
expectations either semester. Few second- or third-grade students met the 30 hours per semester
requirement in fall (1% to 2%), whereas almost a third (31% to 32%) did so in the spring. A review by
school level showed that 42 of the 145 elementary campuses (29%) averaged 30 hours or more in the
spring. Mean hours of student time online increased from 10.71 (fall) to 24.19 (spring) for second-grade
students and from 10.52 (fall) to 24.33 (spring) for third-grade students. In the fall, about half of second-
(50%) and third-grade (52%) students logged less than 10 hours, while approximately 60 percent of
second- (63%) and third-grade (61%) students logged 20 or more hours in the spring. Compared to
the previous school year, the second-grade fall 2012 mean (10.71) was lower than the fall 2011 mean
(13.40); however, the spring 2013 mean (24.19) was higher than the spring 2012 mean (17.20). It should
be noted that the fall files in this report differ from the fall files used in the interim report because RM
provided updated data that included students that were missing in the previous fall files.
To study how students spent their time online using RM, the evaluators reviewed the number of
objectives students completed, the types of problems students completed, and the accuracy rates of the
problems completed. Like time online, the mean number of objectives that students completed increased
from fall to spring for both second (7.74 to 14.01) and third (4.06 to 10.02) graders. Notably more
students completed Level A (easy) learning mode problems in fall and spring and Level B (medium
difficulty) learning mode problems in the spring than Level C (difficult) learning mode problems, Wall of
Mastery review mode problems (A to C), or test mode items (third grade only). Accuracy rates were
higher for Level A learning mode items than for Level B or C items, which suggests that students found
the Level B and C problems more challenging.
Results of the three campus staff surveys showed that the majority of campus administrators and
teachers wanted to continue using RM next year, believed students benefited from RM, and were
satisfied with RM and district support. Supported teacher responses were noticeably more positive than
campus administrator and non-supported teacher responses on most survey items. Other key findings
follow.
• The majority of campus administrators and supported teachers were pleased with RM training and
resources; non-supported teacher responses were mixed.
• More supported (68%) than non-supported (52%) teachers agreed that they worked closely
together.
• When asked about technology issues that precluded effective implementation, a sizeable
percentage of respondents marked “frequently” or “very frequently” for COW (computer-on-wheels
cart) issues (47% to 53%), wireless connectivity issues (42% to 49%), and network issues (36% to 39%).
• Across the three surveys, the most frequently mentioned success (N=253) was increased student
learning and engagement, whereas the most-referenced barriers to implementation were
technology issues (N=178) and scheduling problems/limited access to functioning computers
(N=158). The most-referenced campus administrator and teacher suggestion was to improve
technology (N=89).
Review of district achievement data provided an overall picture of how students progressed in
mathematics over time. Due to low implementation of RM, the use of RM as a supplemental program, and
the lack of a control group, the findings are limited and cannot isolate the true impact of RM on district
mathematics achievement. For second-grade students, the percentage that scored at or above the 40th
percentile on ITBS slightly decreased from spring 2012 (57.9%) to spring 2013 (56.2%). Similarly, the
percentage of second-grade students that scored at or above the 80th percentile on ITBS decreased from
spring 2012 (21.6%) to spring 2013 (19.8%). In the case of third-grade students, there were small
increases from 2012 to 2013 in the percentage of students that met the STAAR Satisfactory Level 2
(55.2% to 57.3%) and Advanced Level 3 (7.9% to 9.1%) standards.
Correlation analyses showed that RM students’ mastery of objectives (.211 to .315) and accuracy
rates (.594 to .725) on basic learning mode problems were more strongly related to math achievement
than time spent online (.058 to .137). Results of the regression analyses revealed similar patterns across
grade levels and tests used. For second grade, three main predictors explained 64 percent of the
variance in ITBS scores: prior math achievement, mastery of objectives, and learning mode Level A
accuracy rates. For third grade, three major predictors explained 68 percent of the variance in STAAR
scores and 61 percent of the variance in spring ACP scores; the predictors were
learning mode Level A accuracy rates, prior 2012 ITBS math scores, and STAAR test mode Level A
accuracy rates. In other words, for both the second- and third-grade regression models, the higher the
value a student had on the three main variables, the higher the student’s test score was expected to be.
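The kind of regression model described above can be sketched on synthetic data; the predictor names and values below are hypothetical illustrations, not district records, and the R-squared the sketch prints is the "percent of variance explained" statistic reported in the paragraph above.

```python
# Illustrative sketch (hypothetical data, NOT district records): fitting a
# three-predictor ordinary least squares regression and reporting R-squared.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical predictors analogous to those named in the report.
prior_score = rng.normal(50, 10, n)    # prior math achievement
objectives = rng.normal(12, 4, n)      # objectives mastered
accuracy_a = rng.uniform(0.4, 1.0, n)  # Level A learning mode accuracy rate

# Hypothetical outcome: a linear combination of the predictors plus noise.
y = 0.8 * prior_score + 1.5 * objectives + 20 * accuracy_a + rng.normal(0, 6, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), prior_score, objectives, accuracy_a])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# R-squared: the proportion of outcome variance explained by the model.
resid = y - X @ coef
r2 = 1 - resid.var() / y.var()
print(f"R-squared: {r2:.2f}")
```

Higher coefficients on each predictor correspond to the report's finding that higher values on the main predictor variables were associated with higher expected test scores.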
Frequency analyses were computed to further study patterns related to time and accuracy. For
both second and third grades, the analyses showed that spending adequate time online using RM was
essential, but high accuracy was most important. Student accuracy rates at 75 percent or above best
predicted student success on ITBS (over 60% scored above the 40th percentile in all time categories) and
STAAR (over 80% met the STAAR Level 2 Satisfactory standard in all time categories).
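The frequency analyses described above amount to cross-tabulating accuracy bands against time-online categories and computing the percent of students meeting a standard in each cell. A minimal sketch on hypothetical data follows; the cutoffs mirror those in the report (75% accuracy, 30 hours), but the data and pass rule are illustrative only.

```python
# Illustrative sketch (hypothetical data): cross-tabulating accuracy bands
# against time-online categories and computing pass rates per cell.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
hours = rng.uniform(0, 40, n)        # hours online per semester
accuracy = rng.uniform(0.3, 1.0, n)  # learning mode accuracy rate

# Hypothetical pass indicator, driven mostly by accuracy (as in the report).
passed = (accuracy + rng.normal(0, 0.1, n)) > 0.7

acc_band = np.where(accuracy >= 0.75, ">=75%", "<75%")
time_band = np.where(hours >= 30, ">=30h", "<30h")

for a in (">=75%", "<75%"):
    for t in (">=30h", "<30h"):
        cell = passed[(acc_band == a) & (time_band == t)]
        print(f"accuracy {a}, time {t}: {100 * cell.mean():.0f}% met standard")
```

On data generated this way, the high-accuracy band shows high pass rates across all time categories, mirroring the pattern the report describes for ITBS and STAAR.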
RECOMMENDATIONS
RM Monitoring Data
• Two challenges precluded accurate tracking of student and teacher RM implementation. First, RM received an upload of district data at the beginning of the fall semester, but no automatic system was in place to send RM campus-level changes (students moving in and out of a school) after that first upload. Updates were sent only upon teacher request, and many students were missing from the files because some teachers did not request that missing and/or new students be added. Second, RM did not receive student identification numbers as part of the district data, which precluded accurate matching of students to district records. The evaluators updated identification numbers as much as possible after the fact, but many students still could not be matched. Going forward, it will be important that RM receive data files on an automatic, ongoing basis and that the files include student identification numbers, teacher identification numbers, and school numbers.
• A few teachers were not included in RM data files or weekly reports due to never implementing RM as expected. It will be important to ensure that the weekly reports include all second- and third-grade teachers so that school- and teacher-specific implementation can be monitored on a consistent basis.
RM Teacher Training
• In 2012-13, 28 percent of supported teachers and 60 percent of non-supported teachers did
not complete the RM Qualification Course by the end of the first six weeks, which means many teachers could not use RM with their students at the beginning of the school year. As a result, it will be important to monitor teacher training closely and ensure campus-level deadlines are set within the first few weeks of the school year. Executive Directors and principals should work closely together to make sure teachers complete the course at the beginning of the school year.
RM Supported Teacher Model
• RM supported teachers were noticeably more positive than non-supported teachers on most survey items. Most likely, the difference was due to supported teachers receiving more training, encouragement, and monitoring from RM. Given the size of the district, the RM support model may need to be altered to better assist non-supported teachers and to ensure the district can reach high fidelity to the RM model without increasing supported-teacher costs. The 2013-14 evaluation of RM should include additional measures to study the effectiveness of the RM support model within the district.
RM Technology and Scheduling Issues
• Per survey results, the most mentioned barriers to implementation were technology and
scheduling issues. Some of the scheduling issues were related to limited computer access and/or challenges related to sharing computers. Executive Directors should review campus RM technology schedules to ensure adequate time is being allotted to RM; also, principals should work with Executive Directors and central technology administrators to resolve computer access issues.
RM Student Time Online and Accuracy
• Results of outcome analyses showed that Level A accuracy rates on learning mode problems
(second and third grade) as well as accuracy rates on STAAR test mode problems (for third grade) were strong predictors of student math achievement. That is, students that had accuracy rates of 75 percent or higher on Level A learning mode and test mode problems had a higher likelihood of having higher test scores than those with lower accuracy rates. During 2013-14, it will be important to monitor student accuracy rates in addition to time online to ensure students are mastering math concepts with a high level of accuracy (75% or higher).
• Based on RM hour/accuracy analyses and RM teacher observation data, most students did not spend much time completing Wall of Mastery review mode math problems, and a sizeable proportion of third-grade students did not spend time in STAAR test mode to prepare for STAAR. Rather, most student time was spent in learning mode, largely confined to Level A (easiest) problems in the fall and to Level A and Level B (medium difficulty) problems in the spring. Teachers should make sure students spend time in learning mode, review mode, and test mode (third grade only) during 2013-14, so that students make full use of the RM program.
REFERENCES
Bush, J., & Kim, M. (March 2013). Preliminary evaluation findings for the 2012-13 third-grade Reasoning Mind Mathematics Program: First semester implementation and outcome data. Dallas, TX: Dallas Independent School District.

Cheung, A. C. K., & Slavin, R. E. (July 2011). The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Best Evidence Encyclopedia. www.bestevidence.org.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Costello, R. J. (July 2012). Evaluation of the Reasoning Mind Program. Dallas, TX: Dallas Independent School District.
Dallas ISD (July 30, 2013). Data packet for 2013-14 planning: District summary. Dallas ISD Evaluation and Assessment Office of Institutional Research. http://mydata.dallasisd.org.

Daniel, L. (Fall 1998). Statistical significance testing: A historical overview of misuse and misinterpretation with implications for the editorial policies of educational journals. Research in the Schools, 5(2), 23-32.

Houston Independent School District (September 2011). Reasoning Mind Program evaluation 2010-2011. Houston, TX: Houston ISD Department of Research and Accountability.

Lemke, C. (2006). Technology in schools: What the research says. Culver City, CA: Metiri Group. Commissioned by Cisco Systems Inc.

Lemke, C., Coughlin, E., & Reifsneider, D. (2009). Technology in schools: What the research says: An update. Culver City, CA: Metiri Group. Commissioned by Cisco Systems Inc.

Levin, B. (March 2013). What does it take to scale up innovations? An examination of Teach for America, the Harlem Children's Zone, and the Knowledge Is Power Program. Boulder, CO: National Education Policy Center. http://nepc.colorado.edu/files/pb-scaling innovation-levin.pdf

Waxman, H. C., & Houston, W. R. (January 2008). An evaluation of the 2006-2007 Reasoning Mind Program. Produced for Reasoning Mind, Inc.

Waxman, H. C., & Houston, W. R. (February 2012). Evaluation of the 2010-2011 Reasoning Mind Program in Beaumont ISD. Produced for Reasoning Mind, Inc.

Weber, W. A. (August 2003). An evaluation of the Reasoning Mind pilot program at Hogg Middle School. Produced for Reasoning Mind, Inc.

Weber, W. A. (December 2006). An evaluation of the 2005-06 Reasoning Mind Project. Produced for Reasoning Mind, Inc.