Report No: AUS0000650
Samoa Early Grade Reading Assessment (SEGRA) Results Report
2017
EDU

© 2017 The World Bank
1818 H Street NW, Washington DC 20433
Telephone: 202-473-1000; Internet: www.worldbank.org

Some rights reserved

This work is a product of the staff of The World Bank. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of the Executive Directors of The World Bank or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

Rights and Permissions

The material in this work is subject to copyright. Because The World Bank encourages dissemination of its knowledge, this work may be reproduced, in whole or in part, for noncommercial purposes as long as full attribution to this work is given.

Attribution: Please cite the work as follows: "World Bank. 2018. Samoa Early Grade Reading Assessment (SEGRA) Results Report. © World Bank."

All queries on rights and licenses, including subsidiary rights, should be addressed to World Bank Publications, The World Bank Group, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522-2625; e-mail: pubrights@worldbank.org.

Table of Contents

Table of Contents ... 3
List of Tables ... 5
List of Figures ... 7
Acknowledgement ... 8
Executive Summary ... 9
Key SEGRA findings from sub-tests ... 10
Factors contributing to greater fluency and comprehension in Sāmoa ... 11
Conclusions ... 12
Recommendations ... 12
Chapter 1 / Introduction ... 19
1.1 Country Context ... 19
1.2 Why Testing Early Grade Reading Abilities is important ... 19
Chapter 2 / Evaluation Design and Methodology ... 21
2.1 Objectives of Study ... 21
2.2 Research Questions ... 21
2.3 Overview of EGRA instrument and Snapshot of School Management Effectiveness (SSME) tools ... 22
Chapter 3 / Instrument Development and Implementation ... 23
3.1 Instrument Adaptation ... 23
3.1.1 SEGRA ... 23
3.1.2 Survey Tools ... 25
3.1.3 Instruments Piloting ... 26
3.2 Sample Design ... 26
3.3 Enumerator Training ... 27
3.4 Data Collection ... 28
Chapter 4 / EGRA Results ... 29
4.1 Results by Subtest ... 29
4.1.1 Sub-test 1 – Automatic Letter Name Recognition/Symbols ... 29
4.1.2 Sub-test 2 – Initial Sounds Identification ... 30
4.1.3 Sub-test 3 – Letter Sounds ... 32
4.1.4 Sub-test 4 – Familiar Words ... 33
4.1.5 Sub-test 5 – Non-words Reading ... 35
4.1.6 Sub-test 6 – Oral Passage Reading ... 37
4.1.7 Sub-test 7 – Reading Comprehension ... 38
4.1.8 Sub-test 8 – Listening Comprehension ... 40
4.1.9 Sub-test 9 – Dictation ... 42
9a Dictation Orthography ... 42
9b Dictation Convention ... 44
4.2 Summary of findings ... 45
Chapter 5 / Performance in reading fluency and comprehension ... 50
Chapter 6 / SSME findings and analysis of factors associated with reading fluency ... 54
6.1 Teacher Training ... 54
6.2 Instructional materials and professional development ... 55
6.3 Print rich materials ... 56
6.4 Time on-task ... 57
6.5 Teacher tongue ... 57
6.6 Student tongue ... 58
6.7 Formative reading assessment ... 59
6.8 Home environment ... 59
6.9 Association of student characteristics to student reading performance ... 60
6.10 Association of teacher characteristics to student reading performance ... 61
6.11 Association of teacher training and teaching guides to student reading performance ... 63
6.12 Association of Classroom Environment to student reading performance ... 63
6.13 Association of Teacher Instructional and Assessment Methods to Student Performance ... 65
Chapter 7 / Conclusions and Next Steps ... 69
Recommendations ... 70
Annex 1 / Tables ... 77
Annex 2 / Instruments ... 85
Annex 3 / Test Reliability Measures ... 105

List of Tables

Table 1: SEGRA Sample by Region, Year and Gender ... 27
Table 2: Sub-test 1 Letter Names Results by Year Level ... 29
Table 3: Sub-test 1 Letter Names Results by Gender ... 30
Table 4: Sub-test 2 Initial Sounds of Words Results by Year Level ... 31
Table 5: Sub-test 2 Initial Sounds of Words Results by Gender ... 31
Table 6: Sub-test 3 Letter Sounds Results by Year Level ... 33
Table 7: Sub-test 3 Letter Sounds Results by Gender ... 33
Table 8: Sub-test 4 Familiar Words Results by Year Level ... 34
Table 9: Sub-test 4 Familiar Words Results by Gender ... 34
Table 10: Sub-test 5 Non-words Results by Year Level ... 36
Table 11: Sub-test 5 Non-words Results by Gender ... 36
Table 12: Sub-test 6 Oral Passage Reading Fluency Results by Year Level ... 37
Table 13: Sub-test 6 Oral Passage Reading Fluency Results by Gender ... 38
Table 14: Sub-test 7 Reading Comprehension Results by Year Level ... 39
Table 15: Sub-test 7 Reading Comprehension Results by Gender ... 40
Table 16: Sub-test 8 Listening Comprehension Results by Year Level ... 41
Table 17: Sub-test 8 Listening Comprehension Results by Gender ... 41
Table 18: Dictation Orthography Results by Year Level ... 43
Table 19: Dictation Orthography Results by Gender ... 43
Table 20: Dictation Convention Results by Year Level ... 44
Table 21: Dictation Convention Results by Gender ... 44
Table 22: Distribution of correct scores in percentages for reading comprehension by Year Level ... 50
Table 23: Percentage of students reading with 80% or more comprehension by Year Level and Gender ... 51
Table 24: ORF Score by Reading Comprehension Score ... 52
Table 25: Distribution of ORF Scores for students meeting and not meeting the 80% benchmark ... 53
Table 26: Theme 1 - Teacher training Key Indicators ... 55
Table 27: Theme 2 - Instructional materials & PD Key Indicators ... 55
Table 28: Theme 3: Print Rich Environment Key Indicators ... 56
Table 29: Theme 3: Print Rich Environment Key Indicators ... 57
Table 30: Theme 4: Time on Task ... 57
Table 31: Theme 5: Language and Instruction – Sāmoan Reading Instruction for Teachers ... 58
Table 32: Theme 6: Language and Instruction – Sāmoan Reading Instruction for Students ... 58
Table 33: Theme 7: Reading Assessment ... 59
Table 34: Theme 8: Home environment ... 59
Table 35: Theme 8: Home environment ... 60
Table 36: Student background characteristics ... 60
Table 37: Association of student characteristics to Oral Reading Fluency (ORF) scores ... 61
Table 38: Profiles of Teachers in SEGRA ... 62
Table 39: Association of teachers' characteristics to Oral Reading Fluency Score (ORF) ... 62
Table 40: Teacher Training and Teaching Guides ... 63
Table 41: Association of training and guides to student oral reading fluency (ORF) scores ... 63
Table 42: Average Number of Classroom Displays and Materials Observed ... 64
Table 43: Frequency and Type of Classroom Displays/Resources Available ... 64
Table 44: Association of Classroom Environment to Student ORF Scores ... 64

List of Figures

Figure 1: Letter Names Results by Region and Year Level ... 30
Figure 2: Initial Sounds Results by Region and Year Level ... 32
Figure 3: Letter Sounds Results by Region and Year Level ... 33
Figure 4: Familiar Words Results by Region and Year Level ... 35
Figure 5: Non-words Results by Region and Year Level ... 36
Figure 6: Oral Passage Reading Results by Region and Year Level ... 38
Figure 7: Reading Comprehension Results by Region and Year Level ... 40
Figure 8: Listening Comprehension Results by Region and Year Level ... 42
Figure 9: Dictation Orthography Results by Region and Year Level ... 43
Figure 10: Dictation Convention Results by Region and Year Level ... 45
Figure 11: Percentage of Students Scoring Zero by Subtest and Year Level ... 46
Figure 12: Summary of Results: Number of correct responses for timed sub-tests by Year Level ... 46
Figure 13: Summary of Results: Number of correct answers for untimed sub-tests by Year Level ... 47
Figure 14: Summary of Results: Percentage of Correct Responses for Timed sub-tests by Gender ... 48
Figure 15: Summary of Results: Number of correct answers for untimed sub-tests by Gender ... 48
Figure 16: Summary of Results: ORF Distribution by Reading Comprehension Level ... 52
Figure 17: Distribution of ORF scores for students reading with at least 80% Comprehension ... 53

Acknowledgement

This report was prepared by Education Technology for Development. The World Bank monitored and advised on the overall research quality, and the work was funded through the Global Partnership for Education. The authors would like to express their gratitude to the Samoa Ministry of Education, Sports and Culture. Sincere thanks are given to the supervisors and enumerators who helped support data collection, and to the teachers and students who participated in the survey.

Executive Summary

Early Grade Reading Assessment (EGRA) is a simple instrument that measures the foundational reading skills of early primary school students in Grades (Years) 1-3. The results are used to identify progress towards achieving reading fluency and comprehension, which are essential skills for learning and completing primary education. The overall purpose of the Sāmoa Early Grade Reading Assessment (SEGRA) is therefore to provide an initial measurement of how well children are learning to read and write in their local language in the first three year levels of primary schooling.

This report summarizes the results of the SEGRA conducted in Sāmoa from August 21 to September 7, 2017. With funding from the Global Partnership for Education (GPE), the World Bank and Education Technology for Development (Et4D) carried out the assessment in collaboration with the South Pacific Community (SPC) and the Sāmoa Ministry of Education, Sports and Culture (MESC). The findings are expected to assist policy makers in designing effective early grade reading intervention strategies to improve early grade reading instruction and assessment.
This activity is part of the Pacific Early Age and Readiness Program (PEARL), which was established to support Pacific Island countries in improving policy and programming decisions around early grade literacy and school readiness.

The SEGRA was administered to a nationally representative sample of students enrolled in Years 1, 2 and 3 [1]. A total of 1,196 students (596 girls and 600 boys) participated in the assessment. The SEGRA tool consisted of seven reading skills tests and two reading-related tests (listening comprehension and dictation). Unlike most EGRAs, which primarily test reading and listening skills, the SEGRA included a short dictation exercise to assess early writing skills. In addition, head teacher, teacher and student surveys were administered to collect information on characteristics associated with reading outcomes and to identify factors contributing to reading fluency. The assessors also carried out a classroom observation in each school visited to appraise the classroom environment and the teaching resources available.

The analysis of SEGRA data included descriptive statistics (means and standard deviations) to measure average levels of basic reading skills; analysis of variance to determine the statistical significance of gender and regional differences; and regression analysis to estimate the association between a given teacher, student or classroom characteristic and reading fluency outcomes (an illustrative sketch of this type of regression follows the key findings below).

[1] In the Sāmoan education system, grade levels are called years. Year 1 is a synonym for Grade 1.

The key findings, factors associated with reading performance and recommendations are detailed below.

Key SEGRA findings from sub-tests

• Early reading achievement in Sāmoa is low. Findings of this study show that, overall, students in Sāmoa are not yet able to read with fluency and accuracy even after three full years of schooling, which prevents them from reading with comprehension. The basic skills required to read and comprehend are not being developed; as a result, students are not learning to read. Research [2] suggests a minimum fluency rate of 45-60 correct words per minute (cwpm), depending on the complexity of the language. Overall, Oral Reading Fluency [3] (ORF) scores averaged 16 cwpm across the years. Year 3 students read at an average of 29 cwpm, and only 16% of them could comprehend 80% to 100% of the text. These achievement levels are well below the international ORF benchmark of 45-60 cwpm and the international reading comprehension benchmark of 80%.

• Students show progression in word reading skills from Years 1 to 3. As children develop reading skills, they begin by recognizing letters and sounds and then proceed to reading sight words, which leads to connected text and reading fluency. Each of these skills builds upon the previous one, and all are crucial to the development of young readers. In order to assess progression in these skills across all three years, the same SEGRA instrument was applied to students in Years 1-3. While the degree of progress varied by subtest, the SEGRA results showed measurable progress between Years 1-3 on the oral reading fluency and familiar word sub-tests. Mean scores for the ORF subtest increased by 13 words between Years 1-2 (from a mean score of 4 cwpm in Year 1 to a mean score of 17 cwpm in Year 2), and by an additional 12 words between Years 2-3 (from a mean score of 17 cwpm to a mean score of 29 cwpm).
The familiar word subtest also showed progression in learning from Years 1-3, with an increase of 10 words from Year 1 to Year 2 and a further 8 words from Year 2 to Year 3 (mean scores of 14 correct familiar words per minute (cfwpm) in Year 2 and 22 cfwpm in Year 3). For the other subtests, average differences between years were rather modest.

• Students lack decoding skills. To read text fluently, students must be able to decode unfamiliar words by sounding out individual letters and syllables. Students scored low across the three sub-tests that measure these skills: letter sounds, initial sounds, and non-words. Overall, students correctly identified an average of 23 letter sounds, 3 initial sounds and 7 non-words. It is important to note that a significant number of students scored zero across the three subtests. For the non-word subtest, 81% of Year 1 students, almost half of Year 2 students (46%) and 27% of Year 3 students could not correctly identify a single non-word. Analysis of the initial sounds subtest revealed that more than half of Year 1 students (58%), 29% of Year 2 students and 21% of Year 3 students scored zero. Data for the letter sounds subtest showed that 15% of all students could not correctly identify a single letter sound, including 20% of Year 1 students and 12% of Year 2 and 3 students.

• Reading comprehension levels are well below the international benchmark. An analysis of zero scores revealed that over 96% of students in Year 1, 72% in Year 2 and 50% in Year 3 could not correctly answer a single comprehension question about the oral reading passage. The majority of students in the sample across the three years are achieving well below the internationally accepted reading comprehension benchmark of 80%. Only 6% of all students tested met or exceeded the 80% benchmark.

• Girls have higher reading fluency and comprehension. There were important differences between boys and girls, with girls outperforming boys in all of the subtests except listening comprehension. The subtest with the largest performance gap was oral reading fluency, with a mean score of 13 cwpm for boys and 19 cwpm for girls, a difference of 6 cwpm. As well, 8% of girls but only 4% of boys met the 80% reading comprehension benchmark.

• Regional differences were significant. Students in the Savai’i region consistently performed better than students in the other regions in 9 out of 10 subtests. Apia Urban students scored the lowest in all subtests.

[2] Abadzi, H. (2011). Reading Fluency Measurements in EFA FTI Countries: Outcomes and Improvement Prospects. Education for All Fast Track Initiative.
[3] Oral Reading Fluency (ORF) refers to the Oral Passage Reading subtest.
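For readers interested in how the factor associations reported in the next subsection were obtained, the following is a minimal sketch of the type of regression described in the methodology paragraph above, relating a student's ORF score to binary factor indicators. It is purely illustrative: the column names, toy data and use of Python with statsmodels are assumptions made for the example, not the project's actual analysis code, which also accounted for the survey design.

```python
# Illustrative sketch: estimating the average cwpm gain associated with a factor.
# Column names and data are hypothetical; the real SEGRA analysis differs.
import pandas as pd
import statsmodels.formula.api as smf

# Toy student-level data: ORF in cwpm plus two 0/1 factor indicators.
data = pd.DataFrame({
    "orf_cwpm":      [4, 10, 17, 25, 29, 35, 12, 20],
    "books_at_home": [0, 0, 1, 1, 1, 1, 0, 1],
    "reads_at_home": [0, 1, 0, 1, 1, 1, 0, 0],
})

# Each coefficient is read as the average cwpm difference associated with that
# factor, holding the other factor constant.
model = smf.ols("orf_cwpm ~ books_at_home + reads_at_home", data=data).fit()
print(model.params)
```

In the report itself, coefficients of this kind are what underlie statements such as "average increase of 8 cwpm" in the list that follows.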
Factors contributing to greater fluency and comprehension in Sāmoa

The following statistically significant factors were associated with average gains in ORF:

Student characteristics:
• Student has books, newspapers or other reading materials at home in Sāmoan (average increase of 8 cwpm)
• Student has a literate sister (average increase of 6 cwpm)
• Student reads to himself/herself at home (average increase of 6 cwpm)
• Student reads aloud to someone at home (average increase of 5 cwpm)

Classroom environment:
• Spelling/vocabulary displayed on charts and posters (average increase of 7 cwpm)

Teaching pedagogy and assessment:
• Students reading on their own silently daily (average increase of 9 cwpm)
• Teacher works on word building with students often (average increase of 12 cwpm)
• Students writing sentences (average increase of 16 cwpm)
• Students work on spelling words in exercise books often (average increase of 19 cwpm)

Conclusions

• Despite the noted progression in learning between the years, the overall low achievement in oral reading fluency and familiar word reading, even for Year 3 students, suggests that students have not mastered automaticity in letter and word identification and are therefore unable to read with fluency.

• Given the low reading comprehension scores and the high percentage of zero scores across all three years, the breakthrough point for reading fluency and comprehension will most likely come much later than Year 3 for the majority of students, compromising their ability to cope with the content and curriculum of the upper grades.

• Of all the factors assessed, those related to teaching pedagogy and assessment had the strongest influence on reading fluency scores. Classroom activities focused on writing, word building and spelling exercises had the greatest association with improved ORF scores.

Recommendations

Based on the findings, the following recommendations are presented for consideration as means to improve the quality of early grade reading instruction for Sāmoan students:

1. Teaching and Learning

1a. Teachers should provide explicit and systematic instruction in decoding and reading comprehension skills for students in early grades.

Results show that most students have not mastered decoding skills. Overall, students correctly identified an average of only 23 letter sounds out of a possible score of 100, 3 initial sounds out of 10, and 7 out of 50 non-words. Given students' low level of decoding skills, it is not surprising that only 6% of them were able to comprehend 80% or more of grade level text. If students are taught specific decoding strategies, they will be able to read faster and more accurately, leaving sufficient working memory for comprehension. Students who cannot identify letter sounds and decode words cannot read and therefore cannot comprehend. To become good readers, most students require explicit, intensive, and persistent instruction. It is therefore suggested that explicit instruction in decoding and reading comprehension skills be practiced in schools to improve students' reading achievement. Research has shown that the most effective reading comprehension strategies include activating prior knowledge/making predictions, questioning, visualizing, drawing inferences, and summarizing or retelling a text in one's own words. Teachers should use various strategies to develop students' reading comprehension skills from as early as preschool or Year 1 [4].

1b. Provide remedial instruction for non-readers.
Results show that 60% of students in Year 1, 32% in Year 2 and 20% in Year 3 were unable to read a single word of the oral reading passage. These non-readers are unlikely ever to learn to read without remedial instruction. Teachers should be trained and empowered to conduct reliable classroom-level assessments to identify non-readers in Years 1-3, diagnose the causes, and design and implement specific activities to address deficiencies. For instance, teachers may group students according to ability and provide remedial activities and appropriately levelled text. This "catch-up approach" is being used by UNICEF in Zambia, based on J-PAL's research in India, which demonstrated that grouping students by ability is more effective than mixed-ability grouping. In this approach, UNICEF assesses students in all three grades and groups them according to reading ability. Grouping students by reading ability rather than by grade has produced dramatic results in India, Kenya and Ghana. Teachers and school administrators should further determine whether non-readers have learning disabilities (e.g., dyslexia) and design relevant intervention strategies for special needs students.

1c. Develop and implement activities that specifically focus on raising boys' abilities and interest in reading.

The results illustrate that boys consistently performed lower than girls except on the listening comprehension subtest, where boys scored slightly better than girls. These differences in performance should not be overlooked because they are consistent and can be taken as a sign of systematic, yet not well understood, differences in the learning opportunities and experiences offered to boys and girls. There may be cultural or gender barriers that affect boys' interest and engagement in reading activities. Offering a rich and varied mix of materials and being mindful of boys' reading preferences can go a long way towards building an engaging and inviting reading environment for boys. Successful strategies that have worked in other countries include developing gender-sensitive materials that attract boys' attention (such as sports, science fiction, fantasy, comic books, digital text, and stories that are humorous), increasing the use of graphics, pictures and storyboards in class and for homework, and integrating reading into extracurricular activities (e.g., sports, health clubs).

2. Teacher Training (In-service and Pre-service)

[4] Sample reading comprehension activities can be found in the following guides: Ontario Ministry of Education. (2003). A Guide to Effective Instruction in Reading: Kindergarten to Grade 3. Available at: http://eworkshop.on.ca/edu/resources/guides/Reading_K_3_English.pdf; Institute of Education Sciences (IES). (2010). Improving Reading Comprehension in Kindergarten through 3rd Grade. What Works Clearinghouse. Available at: https://education.ohio.gov/getattachment/Topics/Early-Learning/Third-Grade-Reading-Guarantee/Third-Grade-Reading-Guarantee-Teacher-Resources/Improving-Reading-Comprehension-in-Kindergarten-Through-3rd-Grade.pdf.aspx

2a. Train Years 1-3 teachers in reading instruction with a focus on vocabulary, decoding skills, reading comprehension and writing.

Only 38% of teachers reported being trained in reading instruction in the last two years, and yet one of the minimum service standards [5] for schools in Sāmoa is for teachers to have continuous professional development.
This finding also reconfirms one of the key development issues noted in the Sāmoa Education Sector Plan 2013-2018: many teachers in primary schools have not had adequate training (at pre-service and in-service levels) and ongoing professional support to ensure they have the content, pedagogical and assessment knowledge needed to implement effective literacy and numeracy programmes [6]. Student results indicate that students are weak in recognizing letter sounds and unfamiliar words and have very low reading comprehension skills. Hence, instruction in these areas should be strengthened to increase overall reading scores.

A recommended training package for teachers can be divided into two parts. The first part of the training can focus on general principles, or techniques, of effective teaching. Examples might include ways of increasing student engagement during lessons, methods for leading effective classroom discussions of text, or ways of effectively correcting student word reading errors during shared reading activities. Teachers can receive this type of professional development through workshop series, reading study groups, or coaching. The most effective professional development always includes follow-up in the classroom to ensure that teachers fully understand new instructional approaches and can apply them in their classrooms. The second part can be the program-specific component, which includes a core reading program containing systematic lessons to support the growth of critical reading skills, along with practice and teacher support activities that are aligned with instruction. It is important to ensure that Year 1 teaching enables students to quickly develop the alphabetic and phonological knowledge that builds decoding skills and therefore contributes to text reading and writing. The training should put emphasis on both reading and writing development and on the use of guided reading/writing, shared reading and independent reading approaches. The training program should also allow participation by members of the school community.

2b. Ensure that pre-service course content provides new teachers with essential knowledge and skills related to improving reading and literacy outcomes.

One of the main objectives of primary teacher education courses should be to prepare pre-service teachers to teach reading. Pre-service training programs therefore need to assist teacher trainees in understanding and using various strategies to develop Years 1 to 3 students' foundational literacy skills, and in delivering explicit teaching about phonemic awareness, phonics and the alphabetic principle. The training package discussed in the previous recommendation for in-service training can also be incorporated into the pre-service program if deemed necessary.

3. Formative Assessment

[5] Samoa Schools Minimum Service Standards, 2010.
[6] Samoa Education Sector Plan, July 2013-June 2018, p. 24.

3a. Ensure support for teachers on formative assessment.

SEGRA results, as self-reported by teachers, show that 92% of teachers conduct formative reading assessments and 94% modify their instruction based on assessment information. About 86% of teachers reported receiving training on reading assessments, but only 5% of this number received the relevant tools for conducting assessment. It is possible that although training was conducted, there were still gaps in teachers' understanding of how to apply the new knowledge and skills in the classroom and how to utilize the results effectively for reflection and lesson planning.
Additionally, considering that 95% of teachers did not receive tools, many were unable to apply what they had learned. Refresher training on formative assessment tools and the provision of tools for teachers and principals are therefore recommended, as well as follow-up coaching in the classroom to ensure that all teachers receive the assessment instruments and are able to apply them appropriately. The assessment tools should be aligned with the newly established fluency and comprehension benchmarks (see recommendation 7).

4. Time on Task

4a. Develop strategies to ensure that students spend more time reading in school and at home.

The more time children spend reading, the better and more fluent readers they become. Results show that students reading silently on their own on a daily basis was associated with 9 more cwpm on the ORF subtask. The largest group of teachers interviewed (44%) spends around 11-15 minutes of teaching time on reading, and only 13% spend 26-35 minutes, which is the recommended amount of time for reading instruction and practice. Results indicate that only 16% of students have books or other reading materials in Sāmoan at home and 10% have access to a computer or mobile device. SEGRA results therefore indicate that the majority of teachers are not allocating sufficient time to reading instruction and that the majority of students do not have access to reading materials at home, both of which are potential contributing factors to students' low levels of reading achievement. To increase students' reading fluency, teachers should ensure that students have access to a variety of grade-appropriate reading materials and spend sufficient time reading every day, at school and at home, through teacher-led, parent-led or self-guided reading activities, and that there is sufficient practice material at home to encourage fluency.

4b. Provide daily time for students to write.

Reading affects writing and writing affects reading. SEGRA results have shown that students have very low writing skills, as evidenced by the dictation orthography and dictation convention subtest results. Almost half of the students scored zero in dictation orthography (43%) and dictation convention (49%), the two subtests that assessed students' writing skills. Providing adequate time for students to write is one essential element of an effective writing instruction program. Students need dedicated instructional time to learn the skills and strategies necessary to become effective writers, as well as time to practice what they learn. As teachers observe the way students write, they can identify difficulties and assist students with learning and applying the writing process. Given the relationship between reading, writing and vocabulary, it is recommended to review the language/reading programs in the early grades in Sāmoa and incorporate sufficient instructional tasks and activities to develop all three skills (reading, writing and vocabulary).

5. Reading Materials

5a. Ensure that every classroom has a library/reading corner and that the books are used during reading instruction.

Only 12% of classrooms have reading corners, and yet regression analysis showed that having a reading corner was associated with 4 more correct words per minute on the ORF subtask. The classroom observations also revealed that 84% of classrooms have adequate space for reading activities.
It is therefore recommended to develop reading corners in classrooms with a levelled book collection that balances familiar favourites with new material, fiction with nonfiction, and easy-to-read books with more challenging material for experienced readers. Teacher guides for reading should be available to assist teachers in using these books effectively for reading instruction. The International Reading Association (IRA) recommends that classroom libraries start with at least seven books per child and add two new books per year. The optimal number of books in a classroom library is 300-600, depending on the grade level and number of copies [7]. The number of books teachers should expect children to read during the school year is 100-125 picture books by the end of Grade 1 and 50-75 chapter books by the end of Grade 2. The Sāmoa school fee grant scheme can assist schools with the procurement of a sufficient number and variety of grade-appropriate Sāmoan reading materials for students. A low-cost option is e-readers, provided that schools can make available the required materials to support student access to such resources. E-readers allow students and teachers to choose from a variety of genres, are portable so that students can read at home or at school, and have read-aloud features that provide additional support for emergent readers [8][9]. In addition to providing an increased number and variety of graded hard-copy and soft-copy books, teachers should be trained on how to better integrate materials into their instruction using their teacher guides for reading, and on how to develop attractive reading corners.

[7] Neuman, S. (undated). The Importance of the Classroom Library. Available at: http://teacher.scholastic.com/products/paperbacks/downloads/library.pdf
[8] Adams, A. & van der Gaag, J. (2011). First Step to Literacy: Getting Books in the Hands of Children. Available at: https://www.brookings.edu/research/first-step-to-literacy-getting-books-in-the-hands-of-children/
[9] UNESCO (2014). Reading in the Mobile Era: A Study of Mobile Reading in Developing Countries. http://unesdoc.unesco.org/images/0022/002274/227436E.pdf

6. School Leadership

6a. Train School Principals to serve as Literacy Leaders (Directors/Guides).

The role of the school principal is to guide, support, and monitor classroom reading instruction. School principals should ensure that teachers have effective ongoing professional development programs and adequate materials to support high quality instruction, and they should observe classes to identify areas that need to be strengthened in order to achieve results. Based on information collected through the teachers' questionnaire, only a third of the teachers received training in early grade reading, and only 12% of them said they had reading corners in their classrooms. As well, about a third of the teachers reported that they had teaching guides (37%) and textbooks and resources (36%), and less than a third (27%) stated that they had curriculum statements. These findings indicate that teachers are not well supported to deliver reading programs effectively. It is therefore suggested that early grade reading professional development programs for school principals focus on updating principals' understanding of early grade reading and literacy, the quality of instruction, school organisation, and monitoring and evaluation.

7. Establishing Benchmarks
7a. Define early grade reading and fluency benchmarks to provide teachers and policymakers with a means to track early grade reading performance.

It is important to establish norms for reading performance, especially in mother tongue languages. The wealth of data obtained from this study and the 2017 nationwide collection of baseline data for literacy in Years 1 through 3 provide sufficient evidence for the MESC to determine the rates of fluency, comprehension and word skills that are necessary at each year level. Equally important is for the MESC to ensure that, if a benchmark system is introduced, it includes adequate mechanisms to identify struggling readers and non-readers so that they can receive the necessary support to reach the grade standards before the end of the school year. In fact, a benchmarking system provides critical evidence to redirect the education system as a whole towards getting as many students as possible to achieve the approved standards. This means improving classroom instruction, ensuring a culture of shared accountability for learning at the school level, and strengthening practices and support outside the school.

In developing the benchmarks, stakeholders should decide on the level of comprehension required to understand grade level text (e.g., 80% is the internationally accepted standard) and then review the fluency scores that fall within that range, as was done in this report (an illustrative sketch of this computation is provided at the end of these recommendations). If stakeholders agree with the 80% benchmark, then an acceptable fluency range may be 70-74 cwpm. The current mean Oral Reading Fluency (ORF) score is 17 cwpm, and based on this, policymakers may decide to lower or raise the benchmark from the 80% point. Once the benchmark is decided, the next step is to consider the targets and agree on the percentage of students who should be meeting the benchmark within a particular time period. Currently, only 6% of students meet or exceed the 80% benchmark. Once the benchmark and targets are set, the MESC can then inform all stakeholders of the new benchmarks and regularly monitor and report progress towards achieving the targets at all levels (national, regional and school). The approved benchmarks can also be integrated into the pre-service programs at the National University of Sāmoa (NUS).

8. Additional research on findings not well understood

8a. Identify the causes of differences in performance across regions and develop context-relevant interventions.

Data analysis shows that the Savai’i region had the best average scores on all subtests. The differences between Savai’i and the other two regions across the subtests ranged from 0.2 to 4 points. Information in the MESC Statistical Digest 2016 indicates overcrowding in urban schools, which could be one reason why Savai’i students are performing better. Every classroom that houses more children than the optimum number encounters special difficulties in instruction, guidance and supervision, which means that the teacher can give less individual attention to each student. An investigation into the causes of differences in performance between the regions is suggested. As well, targeted interventions for students and for teachers, such as the professional development programs suggested above, are highly recommended for Upolu Urban in particular.
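As flagged under recommendation 7a, the following is a minimal sketch of how a fluency benchmark can be read off the ORF distribution of students who already meet a comprehension cutoff. It is illustrative only: the column names, toy data and use of Python with pandas are assumptions made for the example, not the analysis code used for this report.

```python
# Illustrative sketch for recommendation 7a: summarize the ORF distribution of
# students who meet the chosen comprehension cutoff (e.g., 80%).
# Column names ("orf_cwpm", "comprehension_pct") are hypothetical.
import pandas as pd

def suggest_fluency_benchmark(df: pd.DataFrame,
                              comprehension_cutoff: float = 80.0) -> dict:
    """Describe the ORF scores of students reading at or above the cutoff."""
    meeting = df[df["comprehension_pct"] >= comprehension_cutoff]
    orf = meeting["orf_cwpm"]
    return {
        "n_students_meeting_cutoff": int(len(meeting)),
        "share_meeting_cutoff": round(len(meeting) / len(df), 3),
        "orf_25th_percentile": float(orf.quantile(0.25)),
        "orf_median": float(orf.median()),
        "orf_75th_percentile": float(orf.quantile(0.75)),
    }

# Example with made-up data; stakeholders would run this on the full SEGRA dataset.
sample = pd.DataFrame({
    "orf_cwpm":          [5, 12, 29, 45, 58, 71, 74, 80],
    "comprehension_pct": [0, 20, 40, 60, 80, 80, 100, 100],
})
print(suggest_fluency_benchmark(sample))
```

Stakeholders would examine the resulting quartiles for the full dataset and agree on a defensible cwpm range (such as the indicative 70-74 cwpm mentioned above), together with target percentages of students expected to reach it over time.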
Chapter 1 / Introduction

1.1 Country Context

The Independent State of Sāmoa, known as Western Sāmoa until 1997, consists of the two main islands of Upolu and Savai’i and seven smaller islands. Upolu is home to Sāmoa's capital city and nearly three-quarters of Sāmoa's total population of 193,000 people. Sāmoa is located south of the equator in the Polynesian region of the Pacific Ocean, about halfway between Hawaii and New Zealand.

Primary education is free and covers an eight-year cycle from Years 1-8. Under the Education Act, education is compulsory for children aged 5-14 or until completion of Year 8. The enrolment rate of official school-aged children attending primary education has shown steady movement over the past five years, with 100% participation in 2016 [10]. The gross enrolment rate in primary education has remained above 100% over the last five years, at 107% in 2016, reflecting a high degree of participation and the presence of over-aged students in primary education.

1.2 Why Testing Early Grade Reading Abilities is important

Teaching students to read at a young age is the cornerstone of improving educational outcomes. Students who do not learn to read in the early years of schooling often struggle and repeat grades. Unable to understand printed information, follow written instructions or communicate well in writing, these children risk falling further and further behind those who can read effectively in later grades [11]. Without effective interventions, the literacy gap between good and poor readers increases over time. Effective readers absorb increasing amounts of written information, enhancing their vocabularies and improving their comprehension, while ineffective readers lose motivation, read a fraction of the amount and remain unable to comprehend more complex information; this is referred to as the Matthew Effect [12].

Research has found that students who are unable to read proficiently by the end of Year 3 (Grade 3) are four times more likely to leave school without a diploma than proficient readers [13]. Research also suggests that poor readers are more likely to experience behavioral and social problems in subsequent grades and are more likely to repeat grades [14]. This affects the social and economic wellbeing of these children in later life.

Justification for the EGRA survey in Sāmoa falls under Goals 1 and 2 of the Sāmoa Education Sector Plan (2013-2018) [15]. A core element of the quality focus of the plan comprises government priorities to improve the quality of teaching and learning at all levels. Following the progress made in curriculum and teacher professional development under the 2012-2016 sector plan, the MESC has extended these components into the current sector plan. The Curriculum Reform and Assessment System aims to 'ensure that all students have the opportunity to learn and acquire the knowledge, skills and attitudes specified in the national curriculum', while the "Developing Effective Teachers" component aims to ensure that 'teachers are at the core of efforts to improve the quality of education and the level of student learning'.
In order to reach the sector's indicators of achievement for quality of education at all levels, the MESC considers it crucial to work with teachers and students in the early grades to ensure that progress is made in reducing the proportion of students categorized as "At Risk" in English and Sāmoan Literacy at Years 4 and 6. To make significant progress on this indicator, teachers should have the competences and knowledge to support beginning readers in the first grades of primary education. In this sense, any intervention using the data from the 2017 Sāmoa EGRA assessment will aim to support MESC's efforts to achieve sector outcomes 01.1 (Improved literacy and numeracy outcomes at all levels, with boys and girls each achieving to agreed National Benchmarks) and 01.3 (Professionally more competent teaching force at all levels, especially in the teaching of literacy and numeracy) [16].

Despite many positive achievements in education since 1995, the quality of education as measured by test results and functional literacy remains disappointing. The Sāmoa primary education literacy level one (Year 4) and level two (Year 6) tests, which assess students at risk of not reaching literacy and numeracy standards, as well as the Year 8 examinations, indicate unsatisfactory results. Sāmoa's 2015 PILNA results also showed that, in literacy, only three in every ten students are meeting or exceeding the expected proficiency level in Year 4, and similarly in Year 6.

The Pacific Early Age and Readiness Program (PEARL) was established to improve policy and programming decisions for school readiness and early grade literacy. The SEGRA results provide an initial measure of students' reading abilities in Years 1-3.

[10] The Sāmoa Education Statistical Digest 2016 is an annual publication of the MESC based on data collected through the school census.
[11] Gove, A. and A. Wetterberg, 2011. The Early Grade Reading Assessment: Applications and Interventions to Improve Basic Literacy. Research Triangle Park, NC, USA: Research Triangle Institute.
[12] Stanovich, K. 1986. Matthew Effects in Reading: Some Consequences of Individual Differences in the Acquisition of Literacy. Reading Research Quarterly, 11(4).
[13] Annie E. Casey Foundation. 2010. Early Warning: Why Reading By the End of Third Grade Matters. Baltimore, MD, USA: Annie E. Casey Foundation.
[14] Hernandez, D. 2011. Double Jeopardy: How Third-Grade Reading Skills and Poverty Influence High School Graduation. New York, USA: The Annie E. Casey Foundation.
[15] Samoa Education Sector Plan (July 2013-June 2018), p. 29: GOAL 1: Enhanced quality of education at all levels; GOAL 2: Enhanced educational access and opportunities at all levels.
[16] Sāmoa Education Sector Plan, July 2013-June 2018: Improved Focus on Access to Education and Training and Quality Outcomes.

Chapter 2 / Evaluation Design and Methodology

2.1 Objectives of Study

The EGRA tool was adapted to the Sāmoan context in June 2017 and administered nationally between August 21 and September 7, 2017. The Sāmoa Ministry of Education is committed to utilizing the findings of the SEGRA assessment to:

• understand the strengths and weaknesses in the reading skills of Years 1-3 students, to inform improvements in curriculum and teacher professional development;
• develop modules for early grade reading that can be streamed through existing in-service teacher training mechanisms, underway and funded by MESC; and
• develop an early grade reading formative assessment that teachers can use in the classroom and that curriculum officers (who serve as pedagogical advisors) can use to reinforce pedagogy and classroom management activities and to monitor progress in reading at the classroom level.
2.2 Research Questions

The overall purpose of SEGRA is to provide an initial measurement of how well children in Sāmoa are learning to read and write in their local language in the first three year levels of primary schooling. The findings will therefore assist the Curriculum Development and Materials and Teacher Development Divisions of MESC to develop teacher training modules for early grade reading, as well as formative assessment tools for reading that teachers can use in the classroom. As well, the SEGRA results can assist Sāmoa MESC policymakers with decisions regarding the ongoing capacity building of teachers and teacher trainers such as the curriculum officers, the allocation of budgetary and human resources, and the development of reading indicators and benchmarks.

The ultimate goal of the SEGRA is to ensure that evidence-based decisions are made so that all children develop the skills needed to become fluent readers. To achieve this goal, SEGRA was designed to respond to the following questions:

1. What are children's reading fluency and comprehension levels in grades 1-3?
2. What is the difference in performance between girls and boys?
3. What reading skills need to be strengthened?
4. What student, teaching and classroom environmental factors are associated with reading achievement?
   a. Are teachers adequately trained and supported to teach early grade reading? (teach)
   b. Does the classroom environment enhance reading (e.g., student work displayed, reading corners, visual aids)?
   c. Are there sufficient reading and teaching materials to support student learning (at least one book per student)? (text)
   d. Do teachers spend sufficient time on reading activities according to the lesson plan and teacher guide? (time)
   e. Are teachers and students fluent in the language of instruction? (tongue)
   f. Do teachers assess students' reading skills and modify instruction according to students' reading levels? (test)
   g. Are students supported to read at home? Do they have access to reading materials in the language of instruction?
5. What additional resources, training, policies or support are required to improve overall reading achievement?

2.3 Overview of EGRA instrument and Snapshot of School Management Effectiveness (SSME) tools

EGRA is not a high-stakes accountability measure to determine whether a child should advance to the next grade. It is a diagnostic tool to assess early grade reading outcomes. The EGRA tasks measure predictors of early grade reading, the skills found to be the most reliable evidence of later reading success. As a formative assessment tool, teachers can either use EGRA in its entirety or select subtests to monitor classroom progress, determine trends in performance, and adapt instruction to meet children's learning needs.

EGRA's theoretical framework serves to support its adaptation to other languages, and SEGRA was therefore adapted to suit the Sāmoan language and orthography. Considerable work was done in the instrumentation workshop to develop, pilot, and validate nine EGRA sub-tasks for Sāmoa.

The questionnaire component of an assessment includes the collection of background and contextual data at different levels. The information gathered can be a powerful tool in explaining the outcomes of an assessment's cognitive component. This information enables a more in-depth understanding of the observed test outcomes and the implications of these outcomes for designing improvement strategies.
SEGRA included the use of SSME tools provided by the World Bank PEARL project, which are consistent with those used in other countries to provide a general snapshot of school management effectiveness.

Chapter 3 / Instrument Development and Implementation

3.1 Instrument Adaptation

3.1.1 SEGRA

An instrument adaptation workshop, facilitated by language and education experts working with Et4D, was conducted from Monday 11th to Friday 17th June 2017. The workshop was attended by staff of the MESC and selected early grade teachers. Prior to instrument development, a language analysis was conducted to gain a better understanding of the orthography and language issues that needed to be considered when developing the SEGRA tools. The Sāmoan language uses diacritical marks (glottal stops and macrons). It was therefore agreed that Task 1 be renamed Letter/Symbol Names; it assesses a combination of the alphabet letters and the additional symbols used in written Sāmoan. A letter frequency list, a word frequency list and a potential non-word list were also developed.

The more difficult components of the assessment tool are the four sub-tasks that utilise short texts in Sāmoan: reading fluency, reading comprehension, listening comprehension, and writing (using a dictation sentence). The workshop participants developed a number of potential listening and reading texts in accordance with specified criteria relating to length, text-type, discourse structure, and level of language. For each text, five questions were developed, most querying information directly accessible from the text and one requiring some level of inferential thinking. The group also developed sentences as dictation prompts. Each of the texts and dictation sentences was shared and discussed, and by consensus two texts (with questions) were identified as suitable for the reading fluency and reading comprehension tasks, two texts (with questions) were identified as suitable for the listening comprehension task, and two sentences were chosen as the preferred dictation prompts.

During the instrument development phase, an active feedback loop operated, whereby immediate modifications and edits were made to the instruments based on input from all participants, with the intention of perfecting the tool as much as possible. The open atmosphere and the freedom that teachers and workshop participants demonstrated in providing ideas and feedback made preparation of the tools a genuinely cooperative activity. Teachers and other Ministry staff involved were also constantly exposed to concepts and ideas around the development of students' reading skills and classroom evaluation of reading, so capacity was built in these areas.
The SEGRA instruments consisted of the following nine subtests:

Test | Skills Measured | Timed
Letter Name Knowledge | Ability to read alphabet letters with accuracy and fluency | Yes
Initial Sound Identification | Phonemic awareness – the ability to identify sounds in spoken words | No
Letter Sound Knowledge | Phonics – the ability to identify sounds of letters with accuracy and fluency | Yes
Familiar Word Reading | Ability to read familiar words with fluency and accuracy | Yes
Nonword Reading | Ability to decode linguistically sound invented words | Yes
Oral Passage Reading | Ability to read a short passage with fluency and accuracy | Yes
Reading Comprehension | Ability to respond to several comprehension questions based on the passage | No
Listening Comprehension | Ability to comprehend a short story read aloud | No
Dictation | Orthography and convention skills | No

Another significant timetabling issue for this workshop was that, unfortunately, the week of the workshop coincided with exam week in schools. It was not possible to find a school that could accommodate pilot testing on a Friday, so the pilot tests were carried out on Wednesday and Thursday. This created some pressures, but the workshop team was able to overcome them.

3.1.2 Survey Tools

In order to get a broader picture of the contextual environment around students' development of reading skills, additional tools were developed to collect background information from students, teachers and head teachers, as well as classroom observations. Rather than create these tools from scratch, the ET4D team drew on existing EGRA SSME17 tools and, importantly, on versions of this tool recently validated and employed in the Pacific region that were shared by the World Bank. The use of the SSME tools in conjunction with EGRA was therefore important to produce a comprehensive picture of school-related factors that may influence students' literacy achievements. As with the reading assessment tasks, the SSME tools were revised where necessary as a result of feedback after the trials.

17 The Snapshot of School Management Effectiveness (SSME) is an instrument that yields a multifaceted picture of school management practice in a country or region (RTI International, 2010). Information that can be collected through the SSME includes pedagogical approaches used; time on task; interactions among students, teachers, administrators, district officials, and parents; record keeping; discipline; availability and condition of school infrastructure; availability of pedagogical materials; and safety.

Participants therefore played a major role in reviewing and modifying the contextual instruments to be used in:
a. recording demographic and other literacy-related information about students (Student Questionnaire);
b. interviewing teachers (Teacher Questionnaire);
c. interviewing Principals (Head Teacher Questionnaire); and
d. carrying out classroom observations (Classroom Observation Tool).

3.1.3 Instruments Piloting

Pilot-testing of the instruments was conducted over two days - Wednesday 14th (Day 1) and Thursday 15th (Day 2) - from 9.30 a.m. to 12.30 p.m. The main purpose of the pilot was to undertake a field test of the instruments and to locate places where modifications would need to be made, based on observations of the process reported back by enumerators and on an initial analysis of the pilot results. Ten workshop participants took on the role of 'assessors' on Day 1 and seven on Day 2. The pilot testing was done at a relatively large primary school close to the Apia area.
A total of 30 students participated in the trial on Day 1 (10 Year 1, 10 Year 2 and 10 Year 3) and 21 on Day 2 (7 Year 1, 7 Year 2 and 7 Year 3). There was a balance of gender in the students selected for the pilot. The students were not sampled using rigorous randomizing methods; however, it is unlikely that teachers deliberately selected students based on their abilities, since the findings showed quite a spread of results across the students drawn from each year level.

The pilot also provided the first opportunity to field-test the Student Questionnaire with the 30 students on Day 1. The questionnaire is designed to provide a picture of each student's wider reading environment. The Principal (Head Teacher) questionnaire was piloted on Day 2. The Teacher questionnaire was also piloted on the afternoon of Day 2 with teachers who were workshop participants. Minor amendments were made to the Teacher Questionnaire as a result of this trial; for instance, teachers preferred to use the term Teacher Guide instead of syllabus. The classroom observation tool was piloted in three classrooms on Day 3.

The extra element of diacritical marks provided an additional challenge. However, these are features of the Sāmoan language, so students would have been taught them. In piloting the instruments, many students were able to produce these correctly.

3.2 Sample Design

With guidance from the World Bank, the MESC chose a nationally representative sample of students in Years 1, 2, and 3. The target population was defined as students enrolled in Years 1 to 3 in primary schools implementing the official national curriculum in Sāmoa, regardless of their ages. A sample of 40 schools across the country was selected based on enrolment data from the MESC, their location and language of instruction.

To randomly select classrooms and/or students, supervisors were instructed to use a random number generator app, "R – True Random." In each selected school, one classroom for each of Years 1-3 was selected, unless there was a multigrade classroom. If there was more than one classroom for a year, the supervisor assigned each class a number starting from 1 up to the total number of classes and selected a class using the random number generator. Ten students were to be selected from each classroom (5 boys and 5 girls), for a total of 30 students per school. A total of 1,196 students, 596 girls and 600 boys, were assessed (see Table 1).

Table 1: SEGRA Sample by Region, Year and Gender
Region | Year 1 Girls | Year 1 Boys | Year 1 Total | Year 2 Girls | Year 2 Boys | Year 2 Total | Year 3 Girls | Year 3 Boys | Year 3 Total | Total
Upolu Urban | 79 | 78 | 157 | 75 | 76 | 151 | 77 | 77 | 154 | 462
Upolu Other | 83 | 83 | 166 | 73 | 70 | 143 | 67 | 73 | 140 | 449
Savai'i | 53 | 53 | 106 | 45 | 45 | 90 | 44 | 45 | 89 | 285
Total | 215 | 214 | 429 | 193 | 191 | 384 | 188 | 195 | 383 | 1,196

3.3 Enumerator Training

The Enumerator Training Workshop for SEGRA was held in Apia from Tuesday 15th to Friday 18th August. Participants included staff of the MESC, ET4D personnel and 13 potential enumerators who were pre-selected to participate in the training. Over the four days of training, enumerators were exposed to basic concepts concerning literacy instruction and basic principles of EGRA assessments. Much of the time was spent on understanding EGRA assessment administration, practicing and familiarizing the enumerators with the tools to be used and Tangerine-based data collection techniques, as well as the different questionnaires to be administered.
Enumerators were trained for two days in a classroom environment, followed by two days of practice at a nearby primary school. This primary school was not the one in which the instruments had been previously piloted. AAM (Assessor Accuracy Measurement) simulations were conducted three times in order to familiarize the enumerators with the process and examine their accuracy level. Overall AAM results were positive.

Getting the enumerators to work immediately in Tangerine, rather than training them on the paper instruments, proved effective in reducing technology-related anxiety and in familiarizing them with the way Tangerine works from the start. All of the trainees were younger than 25 and had no teaching experience. They proved to be well suited to this type of assignment, where familiarity with technology is essential; and because none of them had teaching experience, they followed the script strictly and did not attempt to correct the children.

3.4 Data Collection

SEGRA was administered between August 21 and September 7, 2017, by a group of supervisors and enumerators managed by ET4D, with local assistance from the MESC. A total of 1,196 students were assessed across the islands of Upolu and Savai'i. In addition to the SEGRA assessment, the team administered one student questionnaire to each assessed pupil. Supervisors conducted the head teacher and teacher interviews and also performed the classroom observations, providing a total of 39 head teacher questionnaires, 102 teacher questionnaires, and 105 classroom observations. The success of the data collection was due to the assistance of the MESC and the work of the enumerators.

Chapter 4 / EGRA Results

This section provides information on how students performed in each of the nine EGRA subtests. Most importantly, it gives comparative information on performance by year level, gender and region, because the same tool was used for all grades. The average scores of the sub-tests are provided and disaggregated by class and gender. Average scores are presented for the entire sample; the overall results therefore reflect the actual reading performance of the entire population, including those who can and cannot read.

4.1 Results by Subtest
4.1.1 Sub-test 1 – Automatic Letter Name Recognition/Symbols

The test of automatic letter name recognition is the most basic assessment of reading skills. It measures students' ability to identify the names of letters accurately and automatically. Automaticity and fluency of letter name knowledge is a predictive skill for later reading success. During the SEGRA, students were given a page of 100 randomly distributed upper- and lowercase letters and asked to say the names of as many letters as possible within one minute. The test was scored by the number of letters that students correctly named in one minute (correct letters per minute, clpm).

The overall scores for sub-test 1 by year are presented in Table 2. Overall results show that students were able to correctly identify 37 clpm, with or without the diacritical marks. Year 1 students identified 27 clpm, Year 2 students 39 clpm and Year 3 students 46 clpm. The greatest improvement in letter name recognition was in Year 2, where students correctly named an additional 12 letters on average. Compared to Year 2 students, those in Year 3 correctly named an average of 7 additional letters. Out of 429 students assessed in Year 1, 11% could not name a single letter.
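The timed sub-tests are all reported on the same per-minute scale, and each results table below reports the same summary statistics (mean, SD, min, max and share of zero scores). As a rough illustration of how these figures can be produced, the sketch below computes a per-minute rate from the raw tallies - scaling the rate up when a child finishes before the time limit, consistent with the note accompanying Table 24 - and then summarizes scores by year level. This is a minimal sketch; the function and column names are illustrative assumptions, not names taken from the SEGRA data files.

```python
import pandas as pd

def fluency_rate(correct_items: int, seconds_remaining: float, time_limit: float = 60.0) -> float:
    """Correct items per minute; the rate is scaled up if the child finished early."""
    seconds_used = time_limit - seconds_remaining
    if seconds_used <= 0:  # guard against a bad timer value
        return 0.0
    return correct_items * (time_limit / seconds_used)

def subtest_summary(df: pd.DataFrame, score_col: str, group_col: str = "year") -> pd.DataFrame:
    """Mean, SD, min, max and % zero scores for one subtest, grouped by year level."""
    grouped = df.groupby(group_col)[score_col]
    summary = grouped.agg(N="count", Mean="mean", SD="std", Min="min", Max="max")
    summary["% zero scores"] = 100 * grouped.apply(lambda s: (s == 0).mean())
    return summary.round(1)

# Illustrative usage with made-up records (not SEGRA data):
# a child who names 69 letters correctly with 20 seconds left scores 69 * 60/40 = 103.5 clpm.
print(fluency_rate(69, 20))

records = pd.DataFrame({
    "year": [1, 1, 1, 2, 2, 3, 3, 3],
    "clpm": [0.0, 12.0, 31.0, 0.0, 45.0, 38.0, 52.0, 60.0],
})
print(subtest_summary(records, score_col="clpm"))
```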
Table 2: Sub-test 1 Letter Names Results by Year Level
Subtest 1: Letter Name (clpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 36.7 | 18.7 | 0 | 167.6 | 5.4%
Year 1 | 429 | 26.5 | 16.5 | 0 | 167.6 | 10.9%
Year 2 | 384 | 39.4 | 16.4 | 0 | 96 | 2.1%
Year 3 | 382 | 45.5 | 17.2 | 0 | 102.4 | 2.4%

Table 3 shows letter recognition results by gender. Overall, girls scored higher than boys, correctly identifying an average of 39 clpm compared to 34 clpm for boys.

Table 3: Sub-test 1 Letter Names Results by Gender
Subtest 1: Letter Name (clpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 36.7 | 18.7 | 0 | 167.6 | 5.4%
Girls | 600 | 39.3 | 18.7 | 0 | 102.4 | 3.7%
Boys | 596 | 34.2 | 18.3 | 0 | 167.6 | 7.0%

Figure 1 below presents results by region and year level. The region with the highest overall mean score of 41 clpm was Savai'i; the lowest was Upolu Urban with 34 clpm, slightly lower than Upolu Other with 37 clpm.

Figure 1: Letter Names Results by Region and Year Level (average correct letters per minute)
Year level | Savai'i | Upolu Other | Upolu Urban
Year 1 | 30.3 | 24.8 | 22.7
Year 2 | 41.1 | 40.3 | 38.5
Year 3 | 50.7 | 47.2 | 41.8
Overall | 40.7 | 37.3 | 34.3

4.1.2 Sub-test 2 – Initial Sounds Identification

Phonemic awareness is an important precursor to both reading and writing, in which students learn to identify the sounds in words and match the sounds to the corresponding letters. Being able to correctly identify the sound made by the first letter of each word helps children figure out what the word says. Children are able to hear and isolate beginning sounds in a word before they can hear sounds in the middle or at the end of a word. At its core, phonemic awareness is a listening and speaking skill rather than a reading skill, so in this test the assessor read aloud a word twice and then asked the student to identify the first sound in the word. The test comprised 10 words and was not timed. Students' scores were based on the number of initial sounds they could correctly identify out of 10 items.

Data show that a significant proportion of students could not identify a single phoneme - more than half of Year 1 students (58%), and at least a quarter of Year 2 (29%) and a fifth of Year 3 (21%) students. Overall, students correctly sounded an average of 3 initial sounds out of 10 items. Year 1 students correctly identified an average of 2 initial sounds, Year 2 students an average of 4 (an increase of 2 correct initial sounds from Year 1) and Year 3 students an average of 5 correct initial sounds out of 10. Results show development of this skill through the three year levels. As this is a foundational phonics skill, it should be performed better and mastered in the first two grades. However, it appears that the teaching of letter sounds is not taking place in all classes, and this may account for some students still mixing the name and the sound of letters and for the large share of zero scores in this subtest.

Table 4: Sub-test 2 Initial Sounds of Words Results by Year Level
Subtest 2: Initial Sounds of Words | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 3.3 | 3.2 | 0 | 10 | 36.6%
Year 1 | 429 | 1.8 | 2.5 | 0 | 10 | 57.5%
Year 2 | 384 | 3.7 | 3.2 | 0 | 10 | 28.8%
Year 3 | 382 | 4.6 | 3.3 | 0 | 10 | 20.7%

Table 5 shows the results of the initial sounds subtest by gender. Girls correctly identified 4 initial sounds compared to 3 correct out of 10 for the boys. Over 40% of the boys assessed (42%) scored zero in this subtest.
Table 5: Sub-test 2 Initial Sounds of Words Results by Gender
Subtest 2: Initial Sounds of Words | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 3.3 | 3.2 | 0 | 10 | 36.6%
Girls | 600 | 3.6 | 3.2 | 0 | 10 | 31.3%
Boys | 596 | 3.1 | 3.3 | 0 | 10 | 41.8%

Analysis by region shows very little variation in student performance across the three regions. Students correctly identified an average of 3 to 4 initial sounds out of 10 in all three regions.

Figure 2: Initial Sounds Results by Region and Year Level (average number of correct sounds)
Year level | Savai'i | Upolu Other | Upolu Urban
Year 1 | 1.8 | 1.7 | 1.8
Year 2 | 3.5 | 3.9 | 3.9
Year 3 | 5.2 | 5.2 | 4.5
Overall | 3.5 | 3.6 | 3.4

4.1.3 Sub-test 3 – Letter Sounds

Letter sound knowledge, or graphemic knowledge, is an awareness of the letters or groups of letters which represent the individual speech sounds in a language. Knowledge of how letters correspond to sounds, and the ability to match sound and symbol with automaticity, are critical skills students must master to become successful readers. The letter sound knowledge test was administered similarly to the letter name knowledge subtest. Students were provided with a page of 100 randomly distributed upper- and lowercase letters of the Sāmoan alphabet and asked to provide the sounds (not the names) of as many letters as they could identify within a one-minute period. Diacritical marks were also used randomly on some of the vowels to test students' knowledge of short and long vowel sounds. The test was scored by the number of letter sounds that students correctly identified within one minute, out of a total of 100 items provided.

On average, students were able to correctly identify 23 correct letter sounds per minute (clspm). Year 1 students correctly identified 18 clspm. Students in Year 2 could accurately sound an additional 6 clspm, for a total average score of 24 clspm. Scores increased by 5 clspm in Year 3, from 24 in Year 2 to 29 clspm in Year 3. The increase in average scores from Year 1 to Year 3 is an indication of learning progression from one year to the next. However, average scores for each year, and overall for all students tested, were lower than on the letter name subtask.

Table 6: Sub-test 3 Letter Sounds Results by Year Level
Subtest 3: Letter Sounds | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 23.2 | 16.8 | 0 | 219.1 | 14.6%
Year 1 | 429 | 17.5 | 12.9 | 0 | 80.7 | 19.5%
Year 2 | 384 | 24.0 | 15.4 | 0 | 84 | 12.2%
Year 3 | 382 | 28.7 | 19.7 | 0 | 219.1 | 11.5%

Similar to the results for letter names, analysis by gender (Table 7 below) indicates that girls outperformed the boys, correctly identifying an additional 4 letter sounds on average.

Table 7: Sub-test 3 Letter Sounds Results by Gender
Subtest 3: Letter Sounds | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 23.2 | 16.8 | 0 | 219.1 | 14.6%
Girls | 600 | 25.0 | 15.9 | 0 | 84 | 12.2%
Boys | 596 | 21.4 | 17.4 | 0 | 219.1 | 17.0%

Figure 3 below presents the results by region, which show slight variation in scores between the regions. Savai'i students recorded the highest average score of 27 clspm. Upolu Urban and Upolu Other students scored an average of 22 and 23 clspm respectively.

Figure 3: Letter Sounds Results by Region and Year Level

4.1.4 Sub-test 4 – Familiar Words

The familiar word sub-test measures students' ability to read familiar words with fluency and accuracy, both of which are necessary to become fluent readers.
In administering this reading test, students were given a list of 50 familiar words (selected from high-frequency words in Years 1-3 Sāmoan readers available in classrooms) with instructions to read as many as they could in one minute. Familiar word reading is a timed test scored by the number of correct familiar words read per minute (cfwpm).

Table 8 shows the results of the familiar word reading sub-test by year. The overall mean score was 13 correct familiar words per minute (cfwpm). Results show that the test was most difficult for Year 1 students, who could only correctly read an average of 4 cfwpm and 67% of whom scored zero. Reading levels improved by 10 cfwpm in Year 2, in which students correctly read an average of 14 cfwpm. Year 3 students read an additional 9 cfwpm, with an overall average of 23 cfwpm, and 20% scored zero. Despite this measurable progress between years, it is important to note that overall performance in this subtest is a concern, given that by Year 3, 20% of students could not read a single familiar word.

Table 8: Sub-test 4 Familiar Words Results by Year Level
Subtest 4: Familiar Words (cfwpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 13.0 | 16.0 | 0 | 166.7 | 39.2%
Year 1 | 429 | 3.8 | 8.4 | 0 | 166.7 | 66.7%
Year 2 | 384 | 14.0 | 14.3 | 0 | 83.3 | 27.1%
Year 3 | 382 | 22.5 | 18.3 | 0 | 96.8 | 20.4%

Boys recorded a mean score of 11 cfwpm compared to 15 cfwpm for the girls, an average difference of 4 cfwpm.

Table 9: Sub-test 4 Familiar Words Results by Gender
Subtest 4: Familiar Words (cfwpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 13.0 | 16.0 | 0 | 166.7 | 39.2%
Girls | 600 | 15.1 | 17.7 | 0 | 166.7 | 34.3%
Boys | 596 | 10.8 | 13.9 | 0 | 81.2 | 44.0%

As shown in Figure 4 below, Savai'i students scored the highest with an average of 16 cfwpm, compared to 14 cfwpm for Upolu Other and 12 cfwpm for Upolu Urban.

Figure 4: Familiar Words Results by Region and Year Level (average correct familiar words per minute)
Year level | Savai'i | Upolu Other | Upolu Urban
Year 1 | 5.1 | 4.0 | 2.7
Year 2 | 14.9 | 13.1 | 13.6
Year 3 | 26.8 | 24.0 | 19.7
Overall | 15.6 | 13.6 | 12.0

4.1.5 Sub-test 5 – Non-words Reading

The non-word fluency subtest measures a student's ability to decode individual phonemes (use of the alphabetic principle) and then blend the sounds together to read words. It is a measure of students' automaticity and accuracy in matching letters to sounds in order to read the word given. Unlike familiar word reading, in which students can read from memory or sight recognition, the non-words reading sub-test requires students to sound out individual letters and syllables to decode a word. It is a purer measure of decoding ability because the students cannot recognize the words by sight. Students were provided with a table of 50 made-up words and instructed to read as many as they could within one minute. The test was timed and measured by the number of correct non-words read per minute (cnwpm).

Table 10 below shows an overall average of 7 cnwpm. A significant proportion of students were unable to decode a single word across all three years, with the majority (81%) being in Year 1, nearly half (46%) in Year 2, and about a quarter (27%) in Year 3. Mean scores increased by about 6 cnwpm from Year 1 to Year 2 and by a further 5 cnwpm from Year 2 to Year 3. The mean scores do indicate progress from Years 1 to 3; however, an overall score of 7 cnwpm clearly shows a critically low level of decoding ability.
As decoding is a key predictor of fluency, low performance on this subtest helps explain the low fluency and comprehension levels in the subsequent subtests.

Table 10: Sub-test 5 Non-words Results by Year Level
Subtest 5: Non-words (cnwpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 7.0 | 10.5 | 0 | 92.0 | 52.5%
Year 1 | 429 | 1.8 | 5.3 | 0 | 81.2 | 80.9%
Year 2 | 384 | 7.6 | 10.9 | 0 | 92.0 | 46.4%
Year 3 | 382 | 12.3 | 11.6 | 0 | 55.0 | 26.7%

On average, girls read 2 more non-words correctly than boys (8 and 6 respectively). It is also important to note that almost half of the girls (49%) and more than half of the boys (56%) could not sound out a single non-word correctly.

Table 11: Sub-test 5 Non-words Results by Gender
Subtest 5: Non-words (cnwpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 7.0 | 10.5 | 0 | 92.0 | 52.5%
Girls | 600 | 8.2 | 11.3 | 0 | 81.2 | 48.5%
Boys | 596 | 5.8 | 9.4 | 0 | 92.0 | 56.4%

As shown in Figure 5, results by region indicate that students from Savai'i read an average of 9 cnwpm, Upolu Other students 8 cnwpm, and Upolu Urban students the lowest at 7 cnwpm.

Figure 5: Non-words Results by Region and Year Level (average correct non-words per minute)
Year level | Savai'i | Upolu Other | Upolu Urban
Year 1 | 2.2 | 2.9 | 1.0
Year 2 | 8.5 | 7.1 | 7.5
Year 3 | 15.6 | 13.3 | 11.5
Overall | 8.8 | 7.7 | 6.7

4.1.6 Sub-test 6 – Oral Passage Reading

Oral passage reading fluency assessments have become a common methodology for measuring reading proficiency and growth.18 Speed and accuracy are measurable elements of Oral Reading Fluency (ORF), and these can be measured as correct words per minute (cwpm). ORF encompasses all of the previous reading skills plus the skills needed for comprehension - the ability to translate letters into sounds, unify sounds into words, process connections, relate text to meaning, and make inferences.19 ORF has been shown to be a powerful predictor of overall reading competence and comprehension.20 It is the actual timed rate at which an individual reads, measured as a raw score of the total number of correct words per minute. In order for students to understand a simple passage, they must be able to read it fast enough to retain the words in short-term memory. Research21 suggests a minimum fluency rate of 45-60 words per minute, depending on the complexity of the language.

In this sub-test for SEGRA, students were asked to read a very short story comprising 69 words in one minute. After one minute, the assessor stopped students and recorded the number of words read correctly. If the child could not read any words correctly in the first line, the assessor stopped the test early and the child received a score of zero.

The overall mean score for this sub-test was 17 cwpm, which is well below the 45-60 cwpm fluency standard. The bulk of the low scores were in Year 1, where students read an average of only 4 cwpm and about two thirds (66%) could not read a single word correctly. Students improved as they progressed to Year 2, reading an average of 18 cwpm, although 32% scored zero. Year 3 students could read an average of 31 cwpm, and one fifth of them scored zero, which is a relatively high proportion given that they had spent three years in school. Overall, all three year levels scored well below the expected international fluency standard.
Table 12: Sub-test 6 Oral Passage Reading Fluency Results by Year Level
Subtest 6: Oral Reading Fluency (cwpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 17.3 | 23.6 | 0 | 170 | 40.0%
Year 1 | 429 | 4.4 | 9.0 | 0 | 80 | 65.5%
Year 2 | 384 | 17.9 | 21.2 | 0 | 138 | 31.6%
Year 3 | 382 | 31.2 | 28.6 | 0 | 170 | 19.5%

18 Wolf & Katzir-Cohen, 2001.
19 Hasbrouck, J., & Tindal, G. A. (2006). "Oral reading fluency norms: A valuable assessment tool for reading teachers." The Reading Teacher, 59(7), 636–644.
20 Ibid.
21 Abadzi, H. (2011). Reading Fluency Measurements in EFA FTI Countries: Outcomes and Improvement Prospects. Education for All Fast Track Initiative.

Girls correctly read 7 more cwpm than boys (20.8 cwpm compared to 13.8 cwpm).

Table 13: Sub-test 6 Oral Passage Reading Fluency Results by Gender
Subtest 6: Oral Reading Fluency (cwpm) | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 17.3 | 23.6 | 0 | 170 | 39.9%
Girls | 600 | 20.8 | 26.1 | 0 | 170 | 35.7%
Boys | 596 | 13.8 | 20.1 | 0 | 138 | 44.2%

Analysis by region shows little variation in the overall average scores between the three areas. Students from Savai'i had the highest fluency rate of 19 cwpm, while Upolu Other and Upolu Urban regions recorded mean scores of 17 cwpm. However, there was a marked difference in performance for Year 3 students between the three regions. Year 3 students in the Upolu Urban region had an average score of 23.6 cwpm, which was 7 cwpm less than students in Upolu Other and 10 cwpm less than those from Savai'i.

Figure 6: Oral Passage Reading Results by Region and Year Level

4.1.7 Sub-test 7 – Reading Comprehension

The reading comprehension subtask measures the ability to answer comprehension questions based on the passage read. A total of five questions were provided for this sub-test, consisting of direct, fact-based questions and at least one question requiring inference from the passage read. The questions were developed during the instrument development workshop. Students were asked questions only up to the point where they had stopped reading. For instance, if the child read the first sentence (10 words), s/he was asked one question. If s/he read half of the text (34 words), s/he was asked three questions; and if s/he read all five sentences (69 words), s/he was asked all five comprehension questions. Similarly, if learners did not read any of the text, they were not asked any questions and received a score of zero. This sub-test is scored by the number of questions answered correctly, out of a possible five points.

Table 14 shows that significant proportions of students scored zero in this subtest across all three years: almost all students in Year 1 (96%), nearly three quarters of Year 2 students (72%) and half of those in Year 3 (50%). Given that almost all Year 1 students scored zero, their overall average score was 1.2 percent correct, which means they could not correctly answer a single comprehension question. Students in Year 2 and Year 3 were only able to correctly respond to one comprehension question (raw means of 0.5 (11%) and 1.3 (26%) respectively). The international reading comprehension benchmark is 80% (4 or more correct responses out of 5), so students in Sāmoa are performing well below the desirable level of comprehension, at 0% for Year 1 (no correct response) and only 20% for those in Years 2 and 3 (1 correct response). It is also important to note that progression of learning between the years is minimal for this subtest.

Comprehension is the ultimate goal of reading.
It enables students to make meaning out of what they read and to use that meaning not only for the pleasure of reading but also to learn new things, especially other academic content. It is also important to note that higher order skills like comprehension build on lower order skills (e.g., phonemic awareness, letter sound knowledge and decoding). It is not surprising, therefore, that comprehension scores were so low for this group of learners, since their lower order skills were also very low.

Table 14: Sub-test 7 Reading Comprehension Results by Year Level
Subtest 7: Reading Comprehension | N | Mean | % Correct | SD | Min | Max | % zero scores
Overall | 1,195 | 0.6 | 12.1 | 1.2 | 0 | 5 | 73.5%
Year 1 | 429 | 0.1 | 1.2 | 0.4 | 0 | 4 | 96.0%
Year 2 | 384 | 0.5 | 10.5 | 1.0 | 0 | 5 | 72.3%
Year 3 | 382 | 1.3 | 25.8 | 1.6 | 0 | 5 | 49.5%

Figures in Table 15 show that girls outperformed the boys in reading comprehension, scoring an average of 16% (1 correct response). The boys' average score was 9%, which means that, on average, they could not answer a single question correctly. It is also important to note that significant proportions of both girls and boys scored zero (68% of girls and 79% of boys).

Table 15: Sub-test 7 Reading Comprehension Results by Gender
Subtest 7: Reading Comprehension | N | Mean | % Correct | SD | Min | Max | % zero scores
Overall | 1,196 | 0.6 | 12.1 | 1.2 | 0 | 5 | 73.5%
Girls | 600 | 0.8 | 15.5 | 1.4 | 0 | 5 | 67.8%
Boys | 596 | 0.4 | 8.6 | 1.0 | 0 | 5 | 79.2%

Figure 7 below shows the average percentage of questions answered correctly for each of the three regions. All three regions scored an average of 12% correct.

Figure 7: Reading Comprehension Results by Region and Year Level

4.1.8 Sub-test 8 – Listening Comprehension

The purpose of the listening comprehension assessment is to measure whether the student can listen to a short passage being read aloud and then answer several questions correctly with a word or a simple statement. Poor performance on a listening comprehension tool would suggest that students simply do not have the basic vocabulary, or that they have difficulty processing what they hear. It may also mean that the students have not learned the different structures of stories that help them predict the key messages in a story (e.g., who, what, when, where, how) and make inferences (e.g., why), which are necessary for responding to comprehension questions.

In SEGRA, the assessor read a short story to students and then asked five comprehension questions. Students had 15 seconds to respond to each question. As this was an untimed test, all students heard the entire story and responded to all five questions. For this reason, scores are based on the percentage of questions answered correctly.

The overall mean score for listening comprehension was 30% (raw mean of 1.5), or about 2 correct responses. By the end of Year 1, students achieved 18% on average (raw mean of 1.0), meaning they could correctly answer 1 comprehension question; Year 2 and Year 3 students averaged about 2 correct responses (30% (raw mean of 1.5) and 42% (raw mean of 2.1) respectively). Almost half of Year 1 students (43%), a quarter of Year 2 students (25%) and 13% of Year 3 students could not answer a single listening comprehension question correctly.
Table 16: Sub-test 8 Listening Comprehension Results by Year Level
Subtest 8: Listening Comprehension | N | Mean | % Correct | SD | Min | Max | % zero scores
Overall | 1,195 | 1.5 | 29.6 | 1.3 | 0 | 5 | 27.6%
Year 1 | 429 | 1.0 | 17.9 | 1.0 | 0 | 5 | 42.8%
Year 2 | 384 | 1.5 | 30.3 | 1.3 | 0 | 5 | 25.3%
Year 3 | 382 | 2.1 | 41.9 | 1.4 | 0 | 5 | 12.9%

Analysis by gender shows that boys performed better in listening comprehension than their female counterparts (2 correct responses (33%) compared to 1 correct response (26%)). This is the only subtest in which boys outperformed the girls.

Table 17: Sub-test 8 Listening Comprehension Results by Gender
Subtest 8: Listening Comprehension | N | Mean | % Correct | SD | Min | Max | % zero scores
Overall | 1,196 | 1.5 | 29.6 | 1.3 | 0 | 5 | 27.6%
Girls | 600 | 1.3 | 25.7 | 1.2 | 0 | 5 | 33.7%
Boys | 596 | 1.7 | 33.4 | 1.3 | 0 | 5 | 21.5%

Figure 8 presents performance by region for the three years. There was very little variation between the three regions, with average scores of 1 to 2 correct responses. Savai'i and Upolu Other scored an average of 2 correct responses (31% and 32% respectively) and Upolu Urban an average of 1 correct response (27%).

Figure 8: Listening Comprehension Results by Region and Year Level (average percentage of correct responses)
Year level | Savai'i | Upolu Other | Upolu Urban
Year 1 | 18.6 | 19.7 | 15.5
Year 2 | 29.0 | 33.3 | 28.2
Year 3 | 46.6 | 44.7 | 36.8
Overall | 30.6 | 31.8 | 26.7

4.1.9 Sub-test 9 – Dictation

The dictation sub-test measures students' alphabet knowledge and their ability to hear and distinguish individual letter sounds in words and to spell words correctly. This subtest was untimed: the assessor read aloud a short sentence and asked students to write down what they had heard. The assessor read the sentence three times, once before students began writing and twice while they were writing. Students were given 15 seconds to complete their writing after the third reading.

The analysis for dictation was in two parts. The first part was orthography, which looked at the number of correctly written words, including the use of diacritics. The second part looked at whether the student capitalized the initial word in the sentence, capitalized proper nouns, used a full stop to indicate the end of the sentence, and put the correct spacing between the words.

9a Dictation Orthography

Scores for dictation orthography were based on the number of correctly written words. The total number of orthography items was 13 and each item had a maximum score of 3; hence the total possible raw score was 39. The average score for the entire sample was 14.2 out of 39. Year 1 students scored an average of 4.0. Performance increased by 12.6 points for Year 2 students compared to those in Year 1, with a further increase of 6.7 points for Year 3 compared to Year 2. Although there is progress between the years for dictation orthography, students in Years 1 and 2 scored less than half (4 points and 16.6 points respectively) of the total possible score of 39. The average score for Year 3 students was slightly over half (23.3 points) of the possible total score. It is also important to note that significant proportions of students across the years scored zero in this subtest - nearly three quarters (73%) of Year 1 students, about a third of Year 2 students and one fifth of Year 3 students.
Table 18: Dictation Orthography Results by Year Level
Subtest 9: Dictation Orthography | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 14.2 | 15.0 | 0 | 39 | 42.8%
Year 1 | 429 | 4.0 | 8.7 | 0 | 37 | 73.0%
Year 2 | 384 | 16.6 | 14.7 | 0 | 39 | 31.7%
Year 3 | 382 | 23.3 | 14.0 | 0 | 39 | 19.9%

Table 19 presents the overall mean scores for girls and boys; the difference in performance is minimal. Girls recorded an average score of 15.6, which is 2.7 points higher than the boys' average score of 12.9.

Table 19: Dictation Orthography Results by Gender
Subtest 9: Dictation Orthography | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 14.2 | 15.0 | 0 | 39 | 42.8%
Girls | 600 | 15.6 | 15.3 | 0 | 39 | 39.0%
Boys | 596 | 12.9 | 14.6 | 0 | 39 | 46.5%

Performance by region is shown in Figure 9. The mean score in dictation orthography for the Savai'i region was 16.1 points out of a possible 39; Upolu Other and Upolu Urban scored 14.1 and 13.2 respectively. For Year 3 students, Upolu Urban scored an average of 7 to 9 points less than Upolu Other and Savai'i.

Figure 9: Dictation Orthography Results by Region and Year Level

9b Dictation Convention

Analysis of results based on students' ability to apply spacing, capitalization and a full stop to the dictation sentence is presented in Table 20 for the three years. The maximum raw score for dictation convention was 6 points. Year 1 recorded a mean score of 0.4 out of 6, Year 2 an increase of 1.3 points with an average score of 1.7, and Year 3 a further increase of 0.7 with an average score of 2.4. Progression of learning from Year 1 to Year 3 is minimal, and an overall average of 1.5 out of a possible 6 points is very low. It is also important to note that significant proportions of students across the years could not correctly indicate the start and end of a simple sentence, use capitalization for proper nouns, or put spaces between words. For Year 1, 82% of students scored zero, compared with 40% of Year 2 students and 23% of Year 3 students.

Table 20: Dictation Convention Results by Year Level
Subtest 9b: Dictation Convention | N | Mean | SD | Min | Max | % zero scores
Overall | 1,195 | 1.5 | 1.7 | 0 | 6 | 49.2%
Year 1 | 429 | 0.4 | 1.0 | 0 | 6 | 80.2%
Year 2 | 384 | 1.7 | 1.7 | 0 | 6 | 40.4%
Year 3 | 382 | 2.4 | 1.7 | 0 | 6 | 23.1%

There is very little difference between boys' and girls' performance in this component of dictation. Girls scored an average of 0.3 points higher than their male counterparts (1.6 and 1.3 respectively). The data also indicate that more than half of the boys (53%) and almost half of the girls (46%) scored zero - significantly high proportions of students who have not mastered the mechanics of writing or the conventions of print that do not exist in oral language.

Table 21: Dictation Convention Results by Gender
Subtest 9b: Dictation Convention | N | Mean | SD | Min | Max | % zero scores
Overall | 1,196 | 1.5 | 1.6 | 0 | 6 | 49.2%
Girls | 600 | 1.6 | 1.8 | 0 | 6 | 45.5%
Boys | 596 | 1.3 | 1.6 | 0 | 6 | 52.9%

Similar to all other subtests, Savai'i had the highest mean score of 1.8 points, while Upolu Other and Upolu Urban had the same mean score of 1.4 out of a possible 6 points. Year 3 students from the Savai'i region had the highest mean score of 3.0 out of 6, those from the Upolu Other region scored 2.5, and Upolu Urban students had a mean score of 2.

Figure 10: Dictation Convention Results by Region and Year Level

4.2 Summary of findings

The overall SEGRA results showed that the majority of Sāmoan students are not yet fluent readers and as a result cannot comprehend grade level text.
Figure 11 presents the percentage of students with zero scores by subtest and year level. Overall, Year 1 recorded the highest percentage of students with zero scores for all subtests, particularly for non-words (81%), reading comprehension (96%) and dictation convention (80%). More than half of Year 1 students had zero scores for six out of the nine subtests. The percentage of Year 2 students who scored zero across the subtests ranged from 2% to 72%, and Year 3 students had the lowest percentages of zero scores, ranging from 2% to 50%. Reading comprehension had the highest percentage of students with zero scores for all years compared to the other subtests.

Figure 11: Percentage of Students Scoring Zero by Subtest and Year Level
Subtest | Year 1 | Year 2 | Year 3
Letter Name | 10.9 | 2.1 | 2.4
Initial Sound | 57.5 | 28.8 | 20.7
Letter Sound | 19.5 | 12.2 | 11.5
Familiar Word | 66.7 | 27.1 | 20.4
Nonword | 80.9 | 46.4 | 26.7
Oral Passage | 65.5 | 31.6 | 19.5
Read. Comp. | 96.0 | 72.3 | 49.5
List. Comp. | 42.8 | 25.3 | 12.9
Orthography | 73.0 | 31.7 | 19.9
Convention | 80.2 | 40.4 | 23.1

For the timed reading fluency sub-tests, students scored highest in letter name knowledge. Year 1 students had an average score of 27 clpm, Year 2 students 39 clpm and Year 3 students 46 clpm. The biggest difference in performance between each year was in the oral reading fluency subtest, where Year 2 students scored 13 more words than Year 1 and Year 3 students gained an additional 13 words compared to Year 2 students. Students struggled most with decoding familiar words and non-words. Overall, students could only identify an average of 13 familiar words (out of a possible 50) and 7 correct non-words in a minute (out of a possible 50). ORF scores were higher, averaging 17 cwpm, which suggests that students were better able to read words in context than in isolation. The majority of students in Years 1-2 were unable to comprehend the reading passage, and only about half of Year 3 students could correctly respond to at least one reading comprehension question.

Figure 12: Summary of Results: Number of correct responses for timed sub-tests by Year Level (average correct per minute)
Sub-test | Year 1 | Year 2 | Year 3
Letter Name Knowledge | 26.5 | 39.4 | 45.5
Letter Sound Identification | 17.5 | 24.0 | 28.7
Familiar Word Reading | 3.8 | 14.0 | 22.5
Nonword Reading | 1.8 | 7.6 | 12.3
Oral Passage Reading | 4.4 | 17.9 | 31.2

For the untimed subtests (Figure 13), listening comprehension skills were higher than reading comprehension for all years. The reading comprehension sub-test had the highest percentage of zero scores across all years (see Figure 11): over 90% of Year 1 students, over 70% of Year 2 students and 50% of Year 3 students could not correctly respond to a single comprehension question. Students did, however, show improvements as they progressed through the years. By the end of Year 3, students could read an average of 23 familiar words correctly per minute and comprehend about 26% of text (see Tables 8 and 14). However, this is very low considering that Year 3 students should be reading with at least 80% comprehension, according to the internationally accepted reading comprehension benchmark.
Figure 13: Summary of Results: Number of correct answers for untimed sub-tests by Year Level (average correct answers out of 5)
Sub-test | Year 1 | Year 2 | Year 3
Initial sound | 1.8 | 3.7 | 4.6
Reading comprehension | 0.1 | 0.5 | 1.3
Listening comprehension | 1.0 | 1.5 | 2.1

For all of the timed subtests, girls outperformed their male counterparts. The subtest with the biggest difference in performance was oral passage reading, with girls scoring 7 more cwpm than boys. For letter name, letter sound and familiar word reading, girls scored 4-5 more points than boys, on average.

Figure 14: Summary of Results: Correct responses per minute for timed sub-tests by Gender
Sub-test | Girls | Boys
Letter Name Knowledge | 39.3 | 34.2
Letter Sound Identification | 25.0 | 21.4
Familiar Word Reading | 15.1 | 10.8
Nonword Reading | 8.2 | 5.8
Oral Passage Reading | 20.8 | 13.8

Girls outperformed boys on two of the three untimed subtests; however, the differences were small. Girls scored 0.5 points higher on initial sound and 0.4 points higher in reading comprehension. Boys scored slightly higher than girls on listening comprehension, by 0.4 points.

Figure 15: Summary of Results: Number of correct answers for untimed sub-tests by Gender
Sub-test | Girls | Boys
Initial sound | 3.6 | 3.1
Reading comprehension | 0.8 | 0.4
Listening comprehension | 1.3 | 1.7

In terms of regional performance, students in the Savai'i region consistently performed better than those in the Upolu Other and Upolu Urban regions. Upolu Urban students scored the lowest in all subtests. The subtest with the biggest variance in performance was letter names, with students in the Savai'i region correctly naming an average of 41 letters, Upolu Other students an average of 37 clpm and Upolu Urban students 34 clpm. Although the variance in overall performance between the three regions is minimal for the letter sounds subtest, there is an average difference of 6 clspm between Upolu Other and Upolu Urban and 11 clspm between Upolu Urban and Savai'i for Year 3 students. Similarly, for the non-words subtest, overall performance across the three regions differs by an average of 1-3 cnwpm, but for Year 3 only, there is an average difference of 4-9 cnwpm. For the oral passage reading subtest, the difference in Year 3 performance across the three regions was an average of 8-11 cwpm. For reading comprehension, all three regions had a mean score of 1 correct response; for listening comprehension, Upolu Urban averaged 1 correct response while Upolu Other and Savai'i averaged 2.

Chapter 5 / Performance in reading fluency and comprehension

According to the second edition of the EGRA toolkit22, there are two steps to identifying a reading fluency benchmark. The first is to identify the level of reading comprehension that is expected for the grade level. In most countries, students in Years 1-3 are expected to read at 80% comprehension or higher (4 correct answers out of 5 questions), which is also the international standard for reading comprehension. This benchmark reflects countries' commitment to the importance of "readers" being those who can comprehend most of the text. The 80% threshold for reading comprehension will be used for the purposes of this SEGRA study.
Once the reading comprehension benchmark is set, the second step is to use EGRA data to show the range of ORF scores obtained by students against the desired level of comprehension. Since students may be able to achieve the 80% comprehension benchmark at different fluency rates, and there could be a wide range of scores, a third step was added to the analysis: identifying the range of scores with the highest proportion of students meeting the comprehension benchmark. With this information, stakeholders may decide on the value within the fluency range that should be put forward as the reading fluency benchmark.

Table 22 shows the actual distribution of correct scores in percentages for reading comprehension. Overall, 4% of students scored 80% correct and 2% achieved 100%. Thus, the total percentage of students who achieved 80% or more in reading comprehension was only 6%.

Table 22: Distribution of correct scores in percentages for reading comprehension by Year Level
Comp. Score | Year 1 | Year 2 | Year 3 | Overall Mean
0% | 96% | 72% | 49% | 74%
20% | 3% | 14% | 15% | 11%
40% | 0.2% | 8% | 12% | 7%
60% | 0.2% | 3% | 8% | 3%
80% | 0.5% | 2% | 10% | 4%
100% | 0% | 1% | 6% | 2%
Total | 100% | 100% | 100% | 100%

As mentioned above, it was decided to set the level of comprehension at 80% as an indicator that students demonstrate full understanding of the text. Table 23 shows the percentage of students, by year and gender, who comprehended 80% or more of the text read.

22 RTI International. (2016). Early Grade Reading Assessment (EGRA) Toolkit, Second Edition. Washington, DC: United States Agency for International Development.

The distribution of scores by year is: 0.5% for Year 1 students, 3% for Year 2 and 16% for Year 3 students. In terms of gender, 8% of girls and 4% of boys reached the 80% benchmark in reading comprehension.

Table 23: Percentage of students reading with 80% or more comprehension by Year Level and Gender
80% or more reading comprehension | Mean (%) | SD (%) | N
Overall | 6% | 24.3% | 1,195
Year 1 | 0.5% | 7.4% | 429
Year 2 | 3% | 20.9% | 384
Year 3 | 16% | 32.2% | 382
Female | 8% | 27.4% | 600
Male | 4% | 20.3% | 596

The second step looks at the distribution of fluency scores of students who reached the threshold. Figure 16 shows the distribution of ORF scores for each level of reading comprehension. In general, an increase in the level of reading comprehension is associated with an increase in ORF scores. Table 24 confirms that mean ORF scores were higher when students had higher levels of reading comprehension. However, it also shows that some students who did not reach the 80% reading comprehension benchmark achieved higher ORF scores than students who did reach the benchmark. For example, a student with 80% in reading comprehension could have a fluency score as low as 44 correct words per minute, while some students who were unable to answer a single reading comprehension question correctly could read up to 80 correct words per minute.23

23 Note that ORF scores may exceed the total number of words in a passage since it is a timed test. If a student reads the entire passage in less than the 60 seconds allowed, the fluency score takes into account the time remaining, which explains why some students have values as high as 136 correct words per minute although the passage had only 69 words.

Figure 16: Summary of Results: ORF Distribution by Reading Comprehension Level
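As a concrete illustration of the second and third steps, the sketch below tabulates, for each band of ORF scores, the share of students who meet the 80% comprehension benchmark - the kind of calculation summarized in Tables 24 and 25. This is only a sketch: the column names (orf, comp_pct) and the example records are illustrative assumptions, not the variable names or values used in the SEGRA data files.

```python
import pandas as pd

def benchmark_by_orf_band(df: pd.DataFrame, benchmark: float = 80.0) -> pd.DataFrame:
    """Share of students meeting the comprehension benchmark within each ORF band (cf. Table 25)."""
    bands = [45, 50, 55, 60, 65, 70, 75, float("inf")]
    labels = ["45-49", "50-54", "55-59", "60-64", "65-69", "70-74", "75 and more"]
    sub = df[df["orf"] >= 45].copy()                     # same score bands as Table 25
    sub["orf_band"] = pd.cut(sub["orf"], bins=bands, labels=labels, right=False)
    sub["meets"] = sub["comp_pct"] >= benchmark          # True if the 80% benchmark is met
    return sub.groupby("orf_band", observed=True)["meets"].agg(
        n_students="size",
        pct_meeting=lambda s: round(100 * s.mean(), 1),
    )

# Illustrative usage with made-up student records (not SEGRA data):
students = pd.DataFrame({
    "orf":      [46, 52, 58, 63, 68, 72, 90, 48, 77],
    "comp_pct": [40, 80, 60, 80, 100, 80, 100, 20, 80],
})
print(benchmark_by_orf_band(students))
```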
Table 24: ORF Score by Reading Comprehension Score
Reading Comp. Score | Mean Fluency Score | SD | Min | Max | Sample (n)
0% | 7.9 | 11.3 | 0 | 80 | 872
20% | 26.7 | 12.6 | 4 | 68.9 | 114
40% | 42.9 | 15.3 | 16 | 80 | 83
60% | 58.5 | 23.0 | 32 | 170 | 44
80% | 71.8 | 21.7 | 44 | 136 | 52
100% | 80.9 | 21.0 | 56 | 138 | 30

Distribution of ORF scores for students who reach 80% in reading comprehension

Figure 17 shows the distribution of ORF scores for the 52 students who achieved the level of 80% correct in reading comprehension. The majority of the students who met the benchmark read between 65 and 75+ correct words per minute, although scores ranged from 45 to 75 or more words per minute. Scores above 69 indicate that students finished the passage in less than one minute, while lower scores imply that students who did not finish the passage were able to infer enough from what they read to reach the comprehension benchmark.

Figure 17: Distribution of ORF scores for students reading with at least 80% Comprehension

Table 25 shows that at fluency rates of 70 cwpm and above, a higher proportion of students meet the 80% benchmark than do not. Policymakers may decide on a range (e.g., 70-74 cwpm) that can be considered proficient or acceptable. If the benchmark for reading comprehension were lowered to 60%, the corresponding fluency scores would also be lower.

Table 25: Distribution of ORF Scores for students meeting and not meeting the 80% benchmark
Correct number of words per minute (ORF) | % meeting 80% RC benchmark | % not meeting 80% RC benchmark
45-49 | 28 | 72
50-54 | 30 | 70
55-59 | 47 | 53
60-64 | 50 | 50
65-69 | 41 | 59
70-74 | 55 | 45
75 and more | 80 | 20

In conclusion, greater oral reading fluency is associated with higher levels of reading comprehension. Only 6% of students were identified as being able to comprehend 80% of what they read, and more females than males achieved the benchmark level of comprehension. ORF scores for students achieving the benchmark show great variability. Stakeholders should discuss and decide on the number of correct words per minute that would define a student as a fluent reader. The decision should be based on the distribution of scores for students who reached the reading comprehension benchmark.

Chapter 6 / SSME findings and analysis of factors associated with reading fluency

The collection of contextual information is central to an assessment. The relationships between contextual factors and achievement, and not the learning outcomes data alone, are essential to decision-making. This section reports the results of regression analysis conducted to explore the effect of student, teacher and head teacher factors and the classroom environment (independent variables) on Oral Reading Fluency scores (dependent variable). The SSME tools provided by the World Bank PEARL Project were used to yield a quick but rigorous and multifaceted picture of school management and pedagogical practice. Where country-specific indicators are available, these have been provided; where they do not exist, the information is presented to inform policymakers of the current situation in their schools and classrooms and to assist with efforts to make these schools more effective. There is a national standard on teacher development programs, and this is referred to in the discussion of that particular theme.

The factors are organized into eight themes relating to the research questions: 1) teacher training; 2) instructional materials and professional development; 3) print rich environment; 4) time on task; 5) teacher tongue; 6) student tongue; 7) reading assessment; and 8) home environment. An illustrative sketch of this type of regression is shown below, before the theme-by-theme discussion.
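The following is a minimal sketch of the kind of regression described above: ORF regressed on a set of student, teacher and classroom indicators, using the statsmodels OLS formula interface. The variable names (orf, teacher_trained, has_reading_corner, minutes_on_reading, read_to_at_home) and the records are illustrative placeholders, not the actual SEGRA/SSME variable names, and the real analysis may have used survey weights or clustered standard errors that are not shown here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative student-level records merged with teacher/classroom indicators (made-up values).
data = pd.DataFrame({
    "orf":                [4, 12, 31, 0, 45, 38, 52, 60, 22, 17, 9, 28],
    "year":               [1, 1, 2, 1, 3, 2, 3, 3, 2, 2, 1, 3],
    "female":             [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "teacher_trained":    [0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0],
    "has_reading_corner": [0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1],
    "minutes_on_reading": [10, 10, 15, 5, 30, 20, 30, 25, 15, 15, 10, 20],
    "read_to_at_home":    [0, 1, 1, 0, 1, 1, 1, 1, 0, 0, 0, 1],
})

# OLS of ORF on student, teacher and classroom factors, controlling for year level.
model = smf.ols(
    "orf ~ C(year) + female + teacher_trained + has_reading_corner"
    " + minutes_on_reading + read_to_at_home",
    data=data,
).fit()
print(model.summary())
```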
The results of the analysis directly respond to the research questions categorized under each theme. Although the main focus is on the correlation between the above factors and ORF, Tables 47-52 in Annex D detail the factors associated with statistically significant increases in reading comprehension performance.

6.1 Teacher Training

For effective teaching of early grade reading, it is critical that teachers are adequately trained. The teacher training theme therefore investigates teachers' level of training in specific skills for teaching reading. Specific indicators of teacher quality within this theme include the percentage of teachers who received preservice training in early grade literacy and in teaching English as a second language, as well as the percentage of teachers who received training within the last two years of teaching.

Table 26 shows the results for these key indicators of the teacher training theme. The figures indicate that 39% of teachers had received training in early grade reading and 38% had received training within the last two years. This shows that more than half of the teachers surveyed have not received adequate training in teaching early grade reading, especially in the last two years. This could be one of the main contributing factors to the low level of reading abilities of the students assessed.

Table 26: Theme 1 - Teacher Training Key Indicators
Indicators | Percentage
Teachers trained in early grade reading | 39%
Teachers trained within past two years | 38%

6.2 Instructional materials and professional development

It is also fundamental that there are sufficient support materials for teachers and students to use in the classroom for the teaching and learning of reading. Based on the teachers' and head teachers' responses to the questionnaires, Table 27 shows the percentage of teachers with the relevant instructional materials as well as ongoing professional development support, which are the key indicators for the theme of instructional materials and professional development.

Data show that less than half of the teachers had manuals (37%) and curriculum statements (27%). The curriculum statement outlines the planned and structured learning experiences that schooling provides and is the basis of the teaching and learning programs provided by schools. The Teachers' Manual extends teachers' understanding of how they can use the curriculum to create, deliver and assess effective teaching and learning programmes for students. These are therefore two key documents for every teacher to have.24 In terms of professional development, 87% of teachers and 80% of head teachers attended in-service training for early grade reading. Almost all of the teachers reported receiving monthly feedback and coaching from the head teacher or the MESC (92%). This was reconfirmed by the head teachers, 95% of whom said they provide feedback and coaching in the classroom. One interesting finding is that, according to the head teachers, 69% of them provide teaching and learning materials, and yet only 27% of teachers have curriculum statements and 37% have a teachers' manual. Similarly, more than half of teachers said they had not received any training in early grade reading, yet 74% of head teachers said they send their teachers to in-service training and 87% said they organise training workshops for their teachers.
Table 27: Theme 2 - Instructional Materials & PD Key Indicators
Indicators | Percentage
Teachers with teaching manual | 37%
Teachers with curriculum statement | 27%
Teachers with textbooks and resources | 36%
Teachers who attended in-service training for early grade reading | 87%
Teachers receiving monthly feedback and coaching by head teacher or MESC | 92%
Head Teachers who have been trained in EGR | 80%
Head Teachers who provide feedback and coaching in classroom | 95%
Head Teachers who assist with lesson planning | 90%
Head Teachers who organise training workshops | 87%
Head Teachers who send teachers to in-service trainings | 74%
Head Teachers who provide teaching and learning materials | 69%
Head Teachers who equip schools with libraries/reading corners | 67%
Head Teachers who manage school funds to support literacy strategy | 59%

24 Minimum Service Standards Guidelines 2010, 2.2 Standard: Teaching and Learning Materials – Schools have all the required curriculum materials. The MESC provides each teacher with a Curriculum Statement and a Teachers' Manual for all subjects.

6.3 Print rich materials

This theme focuses on the physical learning materials and supplies that are available within classrooms for instruction, including reading books, textbooks, charts, journals, student work displays, student profiles, posters and other relevant print resources. The key indicators for this theme therefore include displays of recent students' work, posters and charts, student profiles, sufficient space for organizing reading activities, reading corners/libraries, textbooks per student, instructional readers and materials, and big books.

The results in Table 28 indicate that more than half of the classrooms observed had a variety of teaching aids, including readers, big books and games. A high percentage (84%) of classrooms had sufficient space for organizing reading activities, 88% had student profiles, and 84% had recent students' work displayed. However, less than half of classrooms displayed posters and charts (36%), and 66% of classrooms had teaching aids, posters and charts with phonics. With the exception of the display of posters and charts, the majority of classrooms seemed to support the importance of a print-rich environment for developing reading.

Table 28: Theme 3 - Print Rich Environment Key Indicators
Indicator | Percentage
Teaching aids: readers | 73%
Teaching aids: big books | 75%
Teaching aids: posters/charts with poems | 67%
Teaching aids: posters/charts with songs | 55%
Teaching aids: posters/charts with phonics | 66%
Classrooms with recent students' work displayed | 84%
Classrooms with posters/charts displayed | 36.2%
Classrooms with student profiles | 88%
Classrooms with sufficient space for organizing reading activities | 84%

To answer the question of whether there are sufficient reading and teaching materials to support learning, the data in Table 29 show that classrooms are not adequately resourced. All schools visited had libraries; however, almost half of classrooms had no instructional materials and/or big books, only 9% had 11 or more instructional readers, and only 11% had more than 20 student textbooks. Also alarming is the finding that only 12% of classrooms had reading corners, and yet 84% of classrooms had adequate space for reading activities (refer to Table 28). Overall, the classrooms observed are not well resourced to support the teaching of reading.
Table 29: Theme 3 - Print Rich Environment Key Indicators
Indicator                                                 Percent
Number of student textbooks: 1-5                          17%
Number of student textbooks: 11-20                        18%
Number of student textbooks: more than 20                 11%
Classrooms with reading corner                            12%
Schools with libraries                                    100%
Classrooms with 11 or more instructional readers          9%
Classrooms with 20 or more instructional materials        19%
Classrooms with no instructional materials                41%
Classrooms with 11 or more Big Books, posters or charts   16%
Classrooms with 20 or more Big Books, posters or charts   11%
Classrooms with no Big Books                              40%

6.4 Time on-task

In response to the research question of whether teachers allocate enough time to reading activities, the largest share of teachers (44%) reported that they spend 11-15 minutes of teaching time on reading. Only 13% of classes spend 26-35 minutes on reading, which is the preferred amount of time.

Table 30: Theme 4 - Time on Task
Indicator                                                          Percent
Classrooms with 5-10 minutes of reading instruction and reading    17%
Classrooms with 11-15 minutes of reading instruction and reading   44%
Classrooms with 16-25 minutes of reading instruction and reading   24%
Classrooms with 26-35 minutes of reading instruction and reading   13%

6.5 Teacher tongue

This theme explores the level of basic reading and comprehension skills of teachers in the language of instruction. The measures for this theme included the number of teachers who were able to read grade-level passages in English fluently and with comprehension, write stories in English, teach in English, and those who switch between English and Sāmoan while teaching. Teachers were asked to rate themselves on these measures using pre-determined rating levels (not at all, poor, fair, good, very good, fluent, native/excellent).

Data from the teacher survey show that nearly all teachers are either native speakers (64%) or fluent (32%) in the Sāmoan language, which is the language of instruction (refer to Table 31). Nearly all teachers (98%) stated that they teach in Sāmoan, and 16% stated that they teach in English. Interestingly, almost a quarter of the teachers (22%) said that they use code switching (between English and Sāmoan). Code switching is not always detrimental to proficiency in the learning of a language and may be a useful strategy in classroom interaction when the aim is to make meaning clear. However, this practice should be minimal to ensure that teaching and learning of the target language is given the level of importance it requires. Most teachers (95%) reported that they were able to read and comprehend grade-level Sāmoan passages, and 89% considered themselves to have excellent skills in writing stories in Sāmoan. These data are self-reported and would need to be validated through actual assessment, but they provide a picture of teachers' efficacy and comfort in using the language of instruction.
Table 31: Theme 5 - Language and Instruction: Sāmoan Reading Instruction for Teachers
Indicator                                                                                   Percent
Teachers who can speak Sāmoan fluently                                                      32%
Teachers who are native speakers of the Sāmoan language                                     64%
Teachers who are able to read and write in Sāmoan fluently                                  32%
Teachers who are native speakers in reading and writing in Sāmoan                           63%
Teachers who are able to read grade-level Sāmoan passages fluently and with comprehension   95%
Teachers who are able to write stories in Sāmoan (excellent)                                89%
Teachers teaching in English                                                                16%
Teachers teaching in Sāmoan                                                                 98%
Teachers who switch between Sāmoan and English in class                                     22%

6.6 Student tongue

Teachers were also asked to rate their students' language skills by assessing their ability to carry out specific language tasks. According to the teachers interviewed, about 60% of students are native speakers of the Sāmoan language and about one-third speak Sāmoan fluently. However, according to teachers, only one-third of students are able to read Sāmoan passages appropriate to their year level fluently. This might mean that a significant number of students speak other languages. In terms of letter recognition, teachers stated that 38% of their students are able to recognize and name letters correctly.

Table 32: Theme 6 - Language and Instruction: Sāmoan Reading Instruction for Students
Indicator                                                  Percent
Students understand Sāmoan (fluently)                      32%
Students who are native speakers of the Sāmoan language    59%
Read grade-level Sāmoan passages fluently (excellent)      33%
Recognize and say Sāmoan letter names (excellent)          38%

6.7 Formative reading assessment

During teacher interviews, teachers were also asked whether they conduct formative reading assessments for their students and use the results to inform changes in their instructional practices. Almost all teachers (92%) reported that they conduct formative reading assessments, and 94% said they use the findings to inform teaching practice. The percentage of teachers who reported receiving training on reading assessments is 86%, but only 5% said they also received the relevant tools for conducting assessment.

Table 33: Theme 7 - Reading Assessment
Indicator                                          Percent
Teachers who conduct formative reading assessment  92%
Teachers who modify instruction                    94%
Teachers who have received training only           86%
Teachers who have received training and tools      5%

6.8 Home environment

Helping students develop reading skills is a responsibility shared by the family and the school. Students' exposure to various reading materials at home and family support for students' literacy efforts play a critical role in students' growth as readers. Key information on students' home environment was collected through the student questionnaire. Findings show that a significant percentage of students do not have a supportive home environment. Slightly more than half of the students (54%) receive assistance with their homework from their mothers and 33% from their fathers. At home, almost half of the students (42%) read independently, 44% have someone they can read to, and more than half (59%) are read to.
Table 34: Theme 8 - Home environment
Indicator                                             Percent
Students who receive help with homework from mother   54%
Students who receive help with homework from father   33%
Students who read to someone at home                  44%
Students who read independently at home               42%
Students who are read to at home                      59%

Numerous studies have demonstrated the benefits of increasing students' exposure to literacy materials in their homes, especially for lower-achieving students.25 Students were asked about the presence of reading books, computers and mobile devices in their homes that they can access and use for reading. As shown in Table 35, only 25% of students have books or other reading materials at home and 10% have access to a computer or mobile device.

25 Goldenberg et al. 1992; Koskinen et al. 1995

Table 35: Theme 8 - Home environment
Indicator                                                             Percent
Students who have books or other reading materials to read at home    25%
Students who have access to a computer or mobile device               10%

6.9 Association of student characteristics to student reading performance

General background information and reading activities were collected in the student questionnaire. The factors, as shown in Table 36, include whether the student attended preschool, ate breakfast before arriving at school, language spoken at home, family literacy, whether students receive help with homework, availability of reading materials, and whether students read or are read to at home. Nearly three in five students surveyed (59%) attended preschool before Year 1. Most students (90%) speak Sāmoan at home and 82% have at least one person who can read in their homes. Results also show that about four in five students (79%) eat breakfast in the morning before going to school. Only 16% of students have books, newspapers and other reading materials in Sāmoan at home, and 7% have these available in English. Less than half of students read by themselves or to someone at home, and 59% said someone reads to them at home. Notably, only 5% of the students surveyed said they like reading; the overwhelming majority do not.

Table 36: Student background characteristics
Student Characteristics                                                     % of cases  SE      N
Student attends preschool before Year 1                                     59%         0.4%    1195
Student speaks Sāmoan at home                                               90%         0.1%    1195
Student eats before arriving to school                                      79%         0.3%    1195
Student has someone who can read at home                                    82%         0.6%    1195
Student receives help with homework                                         87%         5.2%    1195
Someone asks student about what he/she did in school                        49%         0.8%    1195
Student tells someone at home when he/she gets good marks                   56%         0.8%    1195
Student has books, newspapers or other things to read at home               25%         0.8%    1195
Student has books, newspapers or other things to read at home in Sāmoan     16%         0.01%   1195
Student has books, newspapers or other things to read at home in English    7%          0.01%   1195
Someone reads to student at home                                            59%         0.1%    1195
Student reads aloud to someone at home                                      44%         0.01%   1195
Student reads to himself/herself at home                                    42%         0.1%    1195
Student reads on a computer or mobile device at home                        10%         0.01%   1195
Student likes to read                                                       5%          8.9%    1195

The most significant factors associated with high levels of ORF (an increase of 6 or 7 correct words per minute) were whether the students had a literate sister and whether they had books, newspapers or other reading materials in Sāmoan at home.
Other factors that were statistically significant, each associated with an increase of about 5 correct words per minute, were reading aloud to someone at home and reading by themselves. Students who had a literate father could read an additional 2 words per minute, and students who received help with homework from their mothers read 3 more words per minute; however, these differences in ORF were not statistically significant.

Table 37: Association of student characteristics to Oral Reading Fluency (ORF) scores
Student Characteristics                                                     Change in ORF Score (+/-)   SE
Student attends preschool before Year 1                                     0.01     0.04
Student eats breakfast before arriving to school                            -0.09    0.05
Student speaks Sāmoan at home                                               -0.50    2.19
Student has a literate mother                                               -0.23    1.50
Student has a literate father                                               2.46     1.59
Student has a literate sister                                               6.29*    1.51
Student has a literate brother                                              1.97     1.62
Student receives help with homework from the mother                         3.80     1.53
Student receives help with homework from father                             -1.60    1.62
Student receives help with homework from sister                             4.39     1.78
Student receives help with homework from brother                            1.84     2.27
Someone asks student about what he/she did in school                        -0.09    0.03
Student tells someone at home when he/she gets good marks                   -0.04    0.02
Student has books, newspapers or other things to read at home               -0.02    0.02
Student has books, newspapers or other things to read at home in Sāmoan     7.64*    1.91
Student has books, newspapers or other things to read at home in English    1.46     2.64
Student reads aloud to someone at home                                      5.28*    1.49
Student reads to himself/herself at home                                    5.88*    1.44
Someone reads to student at home                                            -0.91    1.39
Student reads on a computer or mobile device at home                        0.002    0.002
Student likes to read                                                       0.02     0.07
*The statistically significant factors associated with high levels of ORF are marked with an asterisk.
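Table 37 above and the regression tables that follow flag statistically significant associations with an asterisk. The report does not state the exact test behind the asterisks, but one common way to read a coefficient together with its standard error is sketched below; the 1.96 cut-off (roughly a 95% confidence level under a normal approximation) is an assumption made here for illustration only.

    # Illustrative only: one way the asterisks in Tables 37, 39 and 46 can be read.
    # The 1.96 threshold (normal approximation, ~5% level) is an assumption of this
    # sketch, not a documented detail of the SEGRA analysis.

    def is_significant(coef: float, se: float, z_crit: float = 1.96) -> bool:
        """Flag a regression coefficient whose magnitude exceeds z_crit standard errors."""
        return abs(coef / se) > z_crit

    # Example values taken from Table 37:
    print(is_significant(6.29, 1.51))   # literate sister -> True  (flagged with *)
    print(is_significant(2.46, 1.59))   # literate father -> False (not flagged)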
6.10 Association of teacher characteristics to student reading performance

Research shows that effective teachers are the most important school-based factor contributing to student achievement. Although curricula, reduced class size, district funding, and family and community involvement all contribute to school improvement and student achievement, the most influential factor is the teacher. The association between teachers' characteristics and student reading performance was explored using data collected in the teacher interview questionnaire.

The profiles of the teachers surveyed are presented in Table 38. Teachers have an average of 18.8 years of experience, most of it on the island where they are currently teaching. Of the 102 participating teachers, nearly all (98%) were female and 43% have a primary teaching certificate. Only 15% of them have a reading corner in their classroom, and 72% have met with their students' parents. Over half of the teachers have not been absent from school in the last term.

Table 38: Profiles of Teachers in SEGRA
Teacher characteristics                            % of cases   N
Female teachers                                    98%          102
Has a primary teaching certificate                 43%          102
Has a reading corner in the classroom              15%          102
Has not been absent from school in the last term   60%          102
Has met with parents of his/her students           72%          102

Teacher demographics                               Mean (SD)    N
Age of the teacher                                 44.5 (11.6)  102
Number of years of teaching experience             18.8 (12.8)  102
Number of years spent teaching on the island       18.1 (12.8)  102
Number of minutes from home to school              22.2 (21.0)  102

In order to identify the teacher characteristics associated with better student reading outcomes, a separate regression analysis was conducted for each teacher characteristic (see Table 39); a schematic sketch of this per-characteristic approach is shown after Table 41. The dependent variable was the average number of words read per minute in the oral reading passage (scores from sub-test 6) and the independent variables were the teacher characteristics described above (Table 38). The results in Table 39 indicate that none of the teacher characteristics listed had a statistically significant association with ORF scores. Having a reading corner in the classroom appeared to have a positive influence on ORF scores, but this factor was not statistically significant.

Table 39: Association of teachers' characteristics to Oral Reading Fluency (ORF) scores
Teacher Characteristics                             Change in ORF Score (+/-)   SE
Has a primary teaching certificate                  -5.71    3.90
Has a reading corner in the classroom               4.26     4.52
Has not been absent from school in the last term    -0.19    3.64
Has met with the parents of his/her students        -2.35    7.79
Age of the teacher                                  0.22     0.26
Number of years of experience in teaching           -0.18    0.24
Number of minutes from home to school               0.03     0.76

6.11 Association of teacher training and teaching guides to student reading performance

Information was also collected to determine the relationship between teaching resources and student reading performance. Teachers were asked whether they had a teaching syllabus and whether they had received any training on how to teach reading in the last two years. As shown in Table 40, 70% of teachers had a syllabus for teaching reading and only 37% had a teacher guide. Less than half of teachers had received training on reading instruction.

Table 40: Teacher Training and Teaching Guides
Indicator                                                                   % of cases   N
Teacher has a syllabus                                                      70%          102
Teacher has a teacher guide                                                 37%          102
Teacher has received training on how to teach reading in the last two years 39%         102

Results of the regression analysis for teaching resources and their association with student reading performance, measured by the change in ORF score, are presented in Table 41. Overall, neither training nor teacher guides showed a statistically significant association with ORF scores.

Table 41: Association of training and guides to student oral reading fluency (ORF) scores
Indicator                                                                      Change in ORF score   SE
Teacher has a Curriculum Statement for teaching Sāmoan                         0.74    3.80
Teacher has a Manual for teaching Sāmoan                                       0.21    3.54
Teacher has received training on how to teach reading in the last three years  0.28    3.28
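To make the per-characteristic regressions described above more concrete, a minimal sketch follows. It assumes, purely for illustration, a dataset with one row per surveyed teacher, a column holding the mean ORF of that teacher's students, and one 0/1 indicator per characteristic; the column names, the use of Python's statsmodels package, and the plain (unadjusted) standard errors are assumptions, not details taken from the SEGRA analysis files, whose estimates may also reflect the survey's sampling design.

    # A minimal sketch of the per-characteristic regressions (cf. Tables 39 and 41),
    # under the assumptions described in the text above.
    import pandas as pd
    import statsmodels.formula.api as smf

    def characteristic_effects(df, characteristics):
        """Fit one OLS model per characteristic, with mean ORF as the outcome,
        and collect the coefficient ('change in ORF score') and its SE."""
        rows = []
        for var in characteristics:
            fit = smf.ols(f"mean_orf ~ {var}", data=df).fit()
            rows.append({"characteristic": var,
                         "change_in_orf": fit.params[var],
                         "se": fit.bse[var]})
        return pd.DataFrame(rows)

    # Invented example with two characteristics.
    example = pd.DataFrame({
        "mean_orf":            [12, 25, 8, 30, 17, 21, 14, 28],
        "has_reading_corner":  [0, 1, 0, 1, 0, 1, 0, 1],
        "primary_certificate": [1, 1, 0, 0, 1, 0, 1, 0],
    })
    print(characteristic_effects(example, ["has_reading_corner", "primary_certificate"]))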
6.12 Association of Classroom Environment to student reading performance

Information on the classroom environment, especially the types of reading resources available, was collected through classroom observations. Assessors recorded whether they observed the following classroom displays: spelling/vocabulary charts/posters, songs/hymns on the blackboard, charts and posters, student work, space for reading activities, a reading corner, and student profiles. Classrooms observed had an average of 3.36 classroom displays. The observation also covered the types of printed materials used in instruction, such as newspapers, magazines, flashcards, food wrappers and packaging, prepaid cards, objects in treasure boxes and any other materials. An average of 2.71 printed materials were used in the classroom.

Table 42: Average Number of Classroom Displays and Materials Observed
Classroom environment                  Mean (SD)    SE     N
Classroom displays                     3.36 (0.91)  0.09   105
Print materials used in instruction    2.71 (1.25)  0.12   105

As shown in Table 43, 89% of classrooms had spelling/vocabulary words displayed, only 8% had songs/hymns/stories written on the blackboard, 57% had them written on charts or posters, and 84% had student work displayed. Additional factors related to the classroom environment that were observed included whether there was space for organizing group activities, whether there was a reading corner in the classroom, whether teachers maintained folders with students' work (student profiles), and the seating arrangement. The results showed that a very small percentage (12%) of classrooms had a reading corner, 84% had sufficient space for organizing group work, 88% had student profiles and 84% displayed student work.

Table 43: Frequency and Type of Classroom Displays/Resources Available
Types of classroom displays/resources available                % of classes   SE     N
Spelling/vocabulary displayed on charts/posters                89%            0.03   105
Songs/hymns displayed on blackboard                            8%             0.04   105
Songs/hymns displayed on charts/posters                        57%            0.03   105
Student work displayed                                         84%            0.04   105
Sufficient classroom space for organized group activities      84%            0.04   105
Reading corner in the classroom                                12%            0.03   105
Student profiles (folder with student work and student info)   88%            0.03   105

The classroom environment variable with the most positive relationship with ORF scores was spelling and vocabulary displayed on charts and posters: students read an average of 7 more words per minute where these charts and posters were displayed in the classroom. However, this factor was not statistically significant.

Table 44: Association of Classroom Environment to Student ORF Scores
Classroom Environment                                           Change in ORF Score (+/-)   SE
Classroom displays                                              1.81    1.72
Spelling/vocabulary displayed                                   6.76    5.90
Songs/hymns/stories displayed on blackboard                     -0.36   3.50
Songs/hymns/stories displayed on charts/posters                 1.29    3.15
Student work displayed                                          0.23    4.62
Print materials used in instruction                             0.02    1.25
Sufficient classroom space for organized group activities       -3.18   4.21
Reading corner in the classroom                                 -0.55   4.79
Student profiles (folder with student work and student info)    1.24    5.28

6.13 Association of Teacher Instructional and Assessment Methods to Student Performance

The final set of regression analyses examined the relationship between instructional and assessment methods and student performance in ORF. The analysis covered the frequency with which students and teachers performed seventeen instructional and assessment methods within the course of the week (Table 45) and the association of each method with ORF scores (Table 46). The dependent variable is the mean ORF score of the students and the independent variables are the instructional methods. The mean ORF score of students who were never exposed to an instructional or assessment method (not in the last 5 days) was compared with the mean ORF scores of students who were exposed daily, 3-4 days, or 1-2 days; a schematic sketch of this exposure-level comparison follows.
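As an illustration of the exposure-level comparison just described, the sketch below dummy-codes one instructional method's reported frequency against the "Never" reference level and adds gender and Year controls (the controls are noted in the next paragraph). The column names, the invented data and the coding choices are assumptions made for illustration; they are not taken from the SEGRA analysis files.

    # An illustrative sketch of regressing ORF on exposure frequency to one
    # instructional method (cf. Table 46), under the assumptions stated above.
    import pandas as pd
    import statsmodels.formula.api as smf

    def exposure_effects(df, method, reference="Never"):
        """Coefficients are differences in ORF from the reference exposure level,
        controlling for gender and Year."""
        formula = (f"orf ~ C({method}, Treatment(reference='{reference}'))"
                   " + C(gender) + C(year)")
        return smf.ols(formula, data=df).fit().params

    demo = pd.DataFrame({
        "orf":    [5, 12, 20, 35, 18, 9, 27, 14, 22, 7, 31, 16],
        "gender": ["F", "M", "F", "F", "M", "M", "F", "M", "F", "M", "F", "M"],
        "year":   [1, 2, 3, 3, 2, 1, 3, 2, 2, 1, 3, 1],
        "silent_reading": ["Never", "Rarely", "Sometimes", "Daily", "Often", "Never",
                           "Daily", "Sometimes", "Often", "Rarely", "Daily", "Never"],
    })
    print(exposure_effects(demo, "silent_reading"))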
Results from those regression analyses were examined to ascertain whether exposure to a specific method has a positive or negative association with ORF scores and, if so, whether the effect is the same for all rates of exposure (e.g., 1-2 days, daily, etc.). All regression models include gender and Year as controls for those two characteristics.

Table 45 shows the teaching methods used during reading instruction and the frequency with which each strategy was conducted. The two most frequently used instructional methods were the teaching of listening comprehension and children learning meanings of new words/vocabulary. The most common classroom activities were shared reading, students spelling words in their exercise books and students writing sentences.

Table 45: Frequency of methods used during reading instruction
Method                                                                        Never (not in last 5 days)  Rarely (1-2 days)  Sometimes (3-4 days)  Often  Daily  N
Teaching of Listening Comprehension                                           4%    3%    20%   7%    67%   102
Children practice letter names                                                3%    6%    18%   12%   62%   102
Children orally retell a story that they have read                            5%    10%   30%   10%   45%   102
Children learn new letter sounds                                              4%    4%    23%   8%    61%   102
Children sound out unfamiliar words using knowledge of letter sounds          13%   9%    22%   9%    47%   102
Children learning meanings of new words/vocabulary                            3%    8%    14%   9%    67%   102
Shared reading                                                                3%    5%    17%   12%   64%   102
Group guided reading                                                          4%    7%    21%   15%   54%   102
Listening to a child read aloud                                               12%   8%    22%   13%   45%   102
Students reading on their own silently                                        35%   9%    26%   8%    22%   102
Reading comprehension activities                                              3%    8%    20%   10%   60%   102
Children take books home to read with their parents                           18%   7%    27%   12%   36%   102
Evaluating students' oral reading with running records or any other method    3%    3%    40%   14%   40%   102
Teacher works on word building with students                                  3%    5%    21%   13%   59%   102
Students read and draw                                                        51%   3%    22%   3%    22%   102
Students working on spelling words in exercise books                          3%    5%    18%   10%   65%   102
Students writing sentences                                                    3%    7%    21%   5%    65%   102

Results of the regression analysis presented in Table 46 below show that some instructional methods used in reading classrooms correlate positively with students' ORF scores; those discussed below were statistically significant. Students who often spelled words in exercise books showed the most positive relationship with ORF scores, reading an average of 18.67 more words per minute, while their peers who never did this activity averaged 10.12 more words per minute. When students often wrote sentences, they read an average of 15.67 more words per minute, whereas those who never wrote sentences read about one word less per minute (-0.69). Teachers often working on word building with students was associated with ORF scores 12.19 words per minute higher; when this activity was never done, the difference was 4.6 words per minute. Students reading on their own silently on a daily basis also showed a positive association with ORF scores of 9.22 more words per minute, while those who rarely read silently on their own scored 4.07 words per minute lower.

However, several teaching methods showed a negative association with students' ORF scores. Daily teaching of the meaning of new vocabulary words was associated with an average of 8.44 fewer words per minute, and students who never or rarely received group guided reading had ORF scores around 6 words per minute lower. As well, rarely evaluating students' oral reading was associated with lower scores than never evaluating it.
Among these negative associations, the only statistically significant factor was never receiving reading comprehension activities, associated with 19.7 fewer words per minute.

Table 46: Association of teacher instructional and assessment methods to student performance
Teaching instructional and assessment methods          Change in ORF score   SE
Teaching of Listening Comprehension
  Never     4.61     8.36
  Rarely    -4.30    9.45
  Often     -1.41    6.70
  Daily     -0.77    3.88
Teaching Letter Names
  Never     4.58     9.52
  Rarely    0.68     7.20
  Often     -2.30    5.70
  Daily     -0.84    4.08
Asking children to orally retell a story that they have read
  Never     -4.27    7.24
  Rarely    4.99     5.46
  Often     3.12     5.46
  Daily     -3.32    3.49
Teaching new letter sounds
  Never     11.15    8.10
  Rarely    -1.45    8.10
  Often     1.36     6.12
  Daily     -3.16    3.61
Asking children to sound out unfamiliar words using knowledge of letter sounds
  Never     3.07     5.22
  Rarely    -4.78    5.92
  Often     5.85     5.92
  Daily     -1.77    3.82
Teaching meaning of new vocabulary words
  Never     1.63     9.49
  Rarely    -8.45    6.61
  Often     -3.60    6.37
  Daily     -8.44    4.38
Shared reading
  Never     -4.48    9.55
  Rarely    4.63     7.76
  Often     -0.01    5.75
  Daily     -0.16    4.16
Group Guided reading
  Never     -6.57    8.27
  Rarely    -6.26    6.62
  Often     1.63     5.12
  Daily     -0.02    3.89
Listening to a child read aloud
  Never     -1.83    5.36
  Rarely    -4.40    6.18
  Often     6.63     5.22
  Daily     1.05     3.84
Students reading on their own silently
  Rarely     -4.07    5.47
  Sometimes  5.70     3.74
  Often      2.28     5.74
  Daily      9.22*    3.98
Reading comprehension activities
  Never     -19.70*  9.09
  Rarely    -9.53    6.14
  Often     2.60     5.68
  Daily     -5.44    3.78
Children take books home to read with their parents
  Never     -5.86    4.56
  Rarely    -2.04    6.38
  Often     -2.66    5.21
  Daily     0.78     3.78
Evaluating students' oral reading with running records or any other method
  Never     -7.10    8.94
  Rarely    -18.30   8.94
  Often     -3.89    4.63
  Daily     -1.94    3.30
Teacher works on word building with students
  Never     4.60     9.12
  Rarely    13.23    7.35
  Often     12.19*   5.21
  Daily     4.14     3.75
Students read and draw
  Rarely     0.01     9.06
  Sometimes  0.96     3.88
  Often      2.28     9.06
  Daily      -2.58    3.88
Students work on spelling words in exercise books
  Never     10.12    9.04
  Rarely    2.26     7.32
  Often     18.67*   5.71
  Daily     5.49     3.85
Students writing sentences
  Never     -0.69    9.20
  Rarely    -0.03    6.51
  Often     15.67*   7.42
  Daily     4.43     3.74

Chapter 7 / Conclusions and Next Steps

SEGRA was administered in Sāmoa to provide evidence to the MESC on the current state of early grade reading abilities and to elicit critical dialogue among stakeholders that would effectively inform the direction and next steps for improving reading in the early grades. The SEGRA study showed that students' reading fluency and comprehension levels in Years 1-3 are very low when mapped against international standards. There is evidence of learning progression, especially between Years 1 and 2, but the data show very little or no learning between Year 2 and Year 3, particularly in reading comprehension. The progress that does occur between years is not at the rate students need to become fluent readers or to read with comprehension.

Results indicate that students struggled most with identifying letter sounds and especially with decoding non-words. As a result, the majority of students are unable to read fluently with comprehension. Almost all students in Year 1 (96%) and nearly three-quarters (72%) of Year 2 students scored zero on reading comprehension. Even at Year 3, 50% of students still cannot comprehend a grade-level text and only 8% of them could comprehend at least 60% of the text. Overall, 16% of Year 3 students met the international reading comprehension benchmark of 80% and above.
Across all sub-tests, girls tended to perform better than boys. More girls than boys met the 80% reading comprehension benchmark, and girls had a fluency rate of 19 cwpm compared to 13 cwpm for boys. Students in the Savai'i region consistently performed better than students in the other regions.

The analysis also identified a number of student characteristics, teaching practices and classroom environment factors that are associated with better reading outcomes. Students who had Sāmoan reading materials at home, or who read to someone at home or by themselves, scored an average of 5-8 more correct words per minute on the ORF sub-test. Students spelling words in exercise books, writing sentences and working with teachers on word building were found to be critical activities for increasing students' fluency rates. As expected for the classroom environment, the availability of reading materials in the classroom or used in instruction was positively related to ORF scores. Results also show that the majority of classrooms observed had sufficient space for reading activities, but this space was not fully utilised.

Spending enough time on reading activities improves students' reading abilities. The preferred amount of time to be spent on reading is 26-35 minutes, yet the data show that only 13% of classes spend 26-35 minutes on reading. As well, only 5% of the students surveyed said they like reading, and this is evident in the low level of reading achievement.

Reading comprehension activities, teaching the meaning of new vocabulary, and evaluating students' oral reading with running records or any other assessment method were negatively related to student ORF scores. One reason could be that teachers are not closely following the teaching guide: the data show that less than half of the teachers had manuals (37%) and curriculum statements (27%). Overemphasizing decoding skills or assessment, and not using results to inform instruction, can also affect student performance. More research is necessary to identify the real causes behind these results.

Recommendations

Based on the findings, the following recommendations are presented for consideration as means to improve the quality of early grade reading instruction for Sāmoan students:

1. Teaching and Learning

1a. Teachers should provide explicit and systematic instruction in decoding and reading comprehension skills for students in the early grades.

Results show that most students have not mastered decoding skills. Overall, students correctly identified an average of only 23 letter sounds out of a possible score of 100, 3 initial sounds out of 10, and 7 out of 50 non-words. Given students' low level of decoding skills, it is not surprising that only 6% of them were able to comprehend 80% or more of grade-level text. If students are taught specific decoding strategies, they will be able to read faster and more accurately, leaving sufficient working memory for comprehension. Students who cannot identify letter sounds and decode words cannot read, and therefore cannot comprehend. To become good readers, most students require explicit, intensive, and persistent instruction. It is therefore suggested that explicit instruction in decoding and reading comprehension skills be practised in schools to improve students' reading achievement. Research has shown that the most effective reading comprehension strategies include activating prior knowledge/making predictions, questioning, visualizing, drawing inferences, and summarizing or retelling a text in one's own words.
Teachers should use various strategies to develop students' reading comprehension skills from as early as preschool or Year 1.26

1b. Provide remedial instruction for non-readers

Results show that 60% of students in Year 1, 32% in Year 2 and 20% in Year 3 were unable to read a single word of an oral reading passage. These non-readers are unlikely to ever learn to read without remedial instruction. Teachers should be trained and empowered to conduct reliable classroom-level assessments to identify non-readers in Years 1-3, diagnose the causes, and design and implement specific activities to address deficiencies. For instance, teachers may group students according to ability and provide remedial activities and appropriately levelled text. This "catch-up approach" is being used by UNICEF in Zambia, based on J-PAL's research in India, which demonstrated that grouping students by ability is more effective than mixed-ability grouping. In this approach, UNICEF assesses students in all three grades and groups them according to reading ability. Grouping students by reading ability rather than by grade has produced dramatic results in India, Kenya and Ghana. Teachers and school administrators should further determine whether non-readers have learning disabilities (e.g., dyslexia) and design relevant intervention strategies for special needs students.

26 Sample reading comprehension activities can be found in the following guides: Ontario Ministry of Education. (2003). A Guide to Effective Instruction in Reading: Kindergarten to Grade 3. Available at: http://eworkshop.on.ca/edu/resources/guides/Reading_K_3_English.pdf; Institute of Education Sciences (IES). (2010). Improving Reading Comprehension in Kindergarten through 3rd Grade. What Works Clearinghouse. Available at: https://education.ohio.gov/getattachment/Topics/Early-Learning/Third-Grade-Reading-Guarantee/Third-Grade-Reading-Guarantee-Teacher-Resources/Improving-Reading-Comprehension-in-Kindergarten-Through-3rd-Grade.pdf.aspx

1c. Develop and implement activities that specifically focus on raising boys' abilities and interest in reading.

The results illustrate that boys consistently performed lower than girls except on the listening comprehension subtest, where boys scored slightly better than girls. These differences in performance should not be overlooked because they are consistent and can be taken as a sign of systematic, yet not well understood, differences in the learning opportunities and experiences offered to boys and girls. There may be cultural or gender barriers that affect boys' interest and engagement in reading activities. Offering a rich and varied mix of materials and being mindful of boys' reading preferences can go a long way towards building an engaging and inviting reading environment for boys. Successful strategies that have worked in other countries include developing gender-sensitive materials that attract boys' attention (such as sports, science fiction, fantasy, comic books, digital text, and stories that are humorous), increasing the use of graphics, pictures and storyboards in class and for homework, and integrating reading into extracurricular activities (e.g., sports, health clubs).

2. Teacher Training (In-service and Pre-service)
2a. Train Years 1-3 teachers in reading instruction with a focus on vocabulary, decoding skills, reading comprehension and writing

Only 38% of teachers reported being trained in reading instruction in the last two years, yet one of the minimum service standards27 for schools in Sāmoa is for teachers to have continuous professional development. This finding also reconfirms one of the key development issues noted in the Sāmoa Education Sector Plan 2013-2018: that many teachers in primary schools have not had adequate training (at pre-service and in-service levels) and ongoing professional support to ensure they have the content, pedagogical and assessment knowledge needed to implement effective literacy and numeracy programmes.28 Student results indicate that they are weak in recognizing letter sounds and unfamiliar words, and have very low reading comprehension skills. Hence, instruction in these areas should be strengthened to increase overall reading scores.

A recommended training package for teachers can be divided into two parts. The first part of the training can focus on general principles, or techniques, of effective teaching. Examples might include ways of increasing student engagement during lessons, methods for leading effective classroom discussions of text, or ways of effectively correcting student word-reading errors during shared reading activities. Teachers can receive this type of professional development through workshop series, reading study groups, or coaching. The most effective professional development always includes follow-up in the classroom to ensure that teachers fully understand new instructional approaches and apply them in their classrooms. The second part can be the program-specific component, which includes a core reading program containing systematic lessons to support the growth of critical reading skills, along with practice and teacher support activities that are aligned with instruction. It is important to ensure that Year 1 teaching enables students to quickly develop alphabetical and phonological knowledge that builds decoding skills and therefore contributes to text reading and writing. The training should emphasise both reading and writing development and the use of guided reading/writing, shared reading and independent reading approaches. The training program should also allow participation by members of the school community.

27 Samoa Schools Minimum Service Standards, 2010
28 Samoa Education Sector Plan, June 2013-July 2018, p. 24

2b. Ensure that pre-service course content provides new teachers with essential knowledge and skills related to improving reading and literacy outcomes

One of the main objectives of primary teacher education courses should be to prepare pre-service teachers to teach reading. Pre-service training programs therefore need to help teacher trainees understand and use various strategies to develop Years 1 to 3 students' foundational literacy skills, and to deliver explicit teaching about phonemic awareness, phonics and the alphabetic principle. The training package discussed in the previous recommendation for in-service training can also be incorporated into the pre-service program if deemed necessary.

3. Formative Assessment

3a. Ensure support for teachers on formative assessment

SEGRA results as self-reported by teachers show that 92% of them conduct formative reading assessments and 94% modify their instruction based on assessment information.
About 86% of teachers reported receiving training on reading assessments, but only 5% said they also received the relevant tools for conducting assessments. It is possible that, although training was conducted, there were still gaps in teachers' understanding of how to apply the new knowledge and skills in the classroom and how to utilize the results effectively for reflection and lesson planning. Additionally, considering that 95% of teachers did not receive tools, many were unable to apply what they had learned. Refresher training on formative assessment tools and provision of the tools to teachers and principals are therefore recommended, as well as follow-up coaching in the classroom to ensure that all teachers receive the assessment instruments and are able to apply them appropriately. The assessment tools should be aligned with the newly established fluency and comprehension benchmarks (see recommendation 7).

4. Time on Task

4a. Develop strategies to ensure that students spend more time reading in school and at home.

The more time children spend reading, the better and more fluent readers they become. Results show that students reading on their own silently on a daily basis was associated with 9 more cwpm on the ORF subtask. The largest share of teachers interviewed (44%) spend around 11-15 minutes of teaching time on reading, and only 13% spend 26-35 minutes, which is the recommended amount of time for reading instruction and practice. Results also indicate that only 16% of students have books or other reading materials in Sāmoan at home and 10% have access to a computer or mobile device. SEGRA results therefore indicate that the majority of teachers are not allocating sufficient time to reading instruction and the majority of students do not have access to reading materials at home, both of which are potential contributing factors to students' low levels of reading achievement. In order to increase students' reading fluency, teachers should ensure that students have access to a variety of grade-appropriate reading materials and spend sufficient time reading every day, at school and at home, through teacher-led, parent-led or self-guided reading activities, with sufficient practice and materials at home to encourage fluency.

4b. Provide daily time for students to write.

Reading affects writing and writing affects reading. SEGRA results have shown that students have very low writing skills, as is evident in the dictation orthography and dictation convention subtest results: almost half of the students scored zero in dictation orthography (43%) and dictation convention (49%), the two subtests that assessed students' writing skills. Providing adequate time for students to write is one essential element of an effective writing instruction program. Students need dedicated instructional time to learn the skills and strategies necessary to become effective writers, as well as time to practise what they learn. As teachers observe the way students write, they can identify difficulties and assist students with learning and applying the writing process. Given the relationship between reading, writing and vocabulary, it is recommended to review the language/reading programs in the early grades in Sāmoa and incorporate sufficient instructional tasks and activities to develop all three skills (reading, writing and vocabulary).

5. Reading Materials

5a. Ensure that every classroom has a library/reading corner and that the books are used during reading instruction.
Only 12% of classrooms have reading corners, yet the regression analysis showed that having a reading corner was associated with 4 more correct words per minute on the ORF subtask, and classroom observations revealed that 84% of classrooms have adequate space for reading activities. It is therefore recommended to develop reading corners in classrooms with a varied collection of levelled books that balances familiar favourites and new material, fiction and nonfiction, and easy-to-read texts with more challenging material for experienced readers. Teacher guides for reading should also be available to help teachers use these books effectively for reading instruction.

The International Reading Association (IRA) recommends that classroom libraries start with at least seven books per child and add two new books per year. The optimal number of books in a classroom library is 300-600, depending on the grade level and number of copies.29 The number of books teachers should expect children to read during the school year is 100-125 picture books by the end of Grade 1 and 50-75 chapter books by the end of Grade 2. The Sāmoa school fee grant scheme can assist schools with the procurement of sufficient and grade-appropriate Sāmoan reading materials for students. A low-cost option is e-readers, provided that schools can make available the required materials to support student access to such resources. E-readers allow students and teachers to choose from a variety of genres, are portable so students can read at home or at school, and their read-aloud features provide additional support for emergent readers.30,31 In addition to providing an increased number and variety of graded hard-copy and soft-copy books, teachers should be trained on how to better integrate materials into their instruction using their teacher guides for reading, and on how to develop attractive reading corners.

6. School Leadership

6a. Train School Principals to serve as Literacy Leaders (or Directors/Guides).

The role of the school principal is to guide, support, and monitor classroom reading instruction. School principals should ensure that teachers have effective ongoing professional development programs and adequate materials to support high-quality instruction, and they should observe classes to identify areas that need to be strengthened in order to achieve results. Based on information collected through the teachers' questionnaire, only a third of teachers received training in early grade reading and only 12% said they had reading corners in their classrooms. As well, only about a third of teachers reported having teaching guides (37%) or textbooks and resources (36%), and less than a third (27%) stated they had curriculum statements. These findings indicate that teachers are not well supported to deliver reading programs effectively. It is therefore suggested that early grade reading professional development programs for school principals focus on updating principals' understanding of early grade reading and literacy, the quality of instruction, school organisation, and monitoring and evaluation.

29 Neuman, S. (undated). The importance of the classroom library. Available at: http://teacher.scholastic.com/products/paperbacks/downloads/library.pdf
30 Adams, A. & van der Gaag, J. (2011). First Step to Literacy: Getting Books in the Hands of Children. Available at: https://www.brookings.edu/research/first-step-to-literacy-getting-books-in-the-hands-of-children/
31 UNESCO (2014). Reading in the mobile era: A study of mobile reading in developing countries. Available at: http://unesdoc.unesco.org/images/0022/002274/227436E.pdf
7. Establishing Benchmarks

7a. Define early grade reading and fluency benchmarks to provide teachers and policymakers with a means to track early grade reading performance.

It is important to establish norms for reading performance, especially in mother tongue languages. The wealth of data obtained from this study and the 2017 nationwide collection of baseline data for literacy in Years 1 to 3 provide sufficient evidence for the MESC to determine the rates of fluency, comprehension and word skills that are necessary at each year level. Equally important is for the MESC to ensure that, if a benchmark system is introduced, it includes adequate mechanisms to identify struggling readers and non-readers so that they receive the necessary support to reach grade standards before the end of the school year. A benchmarking system provides critical evidence to redirect the education system as a whole towards getting as many students as possible to achieve the approved standards. This means improving classroom instruction, ensuring a culture of shared accountability for learning at the school level, and strengthening practices and support outside the school.

In developing the benchmarks, stakeholders should decide on the level of comprehension required to understand grade-level text (e.g., 80% is the internationally accepted standard) and then review the fluency scores that fall within that range (as was done in this report). If stakeholders agree with the 80% benchmark, then an acceptable fluency range may be 70-74 cwpm. The current mean Oral Reading Fluency (ORF) score is 17 cwpm, and on this basis policymakers may decide to lower or raise the benchmark from the 80% point. Once the benchmark is decided, the next step is to consider the targets and agree on the percentage of students who should be meeting the benchmark within a particular time period. Currently, only 6 percent of students meet the 80% and above benchmark. Once the benchmark and targets are set, the MESC can inform all stakeholders of the new benchmarks and regularly monitor and report progress towards achieving the targets at all levels (national, regional and school). The approved benchmarks can also be integrated into the pre-service programs at the National University of Sāmoa (NUS). An illustrative calculation of this benchmark-setting exercise is sketched below.
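The benchmark-setting arithmetic described above can be illustrated with a small, entirely hypothetical calculation: given per-student comprehension and fluency scores, compute the share of students at or above the 80% comprehension threshold and look at the fluency values observed among them. The data and column names below are invented for illustration and do not reproduce SEGRA results.

    # A hypothetical illustration of the benchmark-setting steps described above.
    import pandas as pd

    students = pd.DataFrame({
        "comprehension_pct": [0, 20, 40, 80, 100, 60, 80, 90, 10, 0],  # sub-test 7 style scores
        "cwpm":              [0,  9, 22, 71,  86, 48, 70, 74,  5, 2],  # sub-test 6 fluency
    })

    meets = students["comprehension_pct"] >= 80            # 80% comprehension benchmark
    share_meeting = meets.mean()                           # share of students at/above it
    fluency_band = (students.loc[meets, "cwpm"].min(),     # fluency observed among those
                    students.loc[meets, "cwpm"].max())     # students: a candidate cwpm range

    print(f"{share_meeting:.0%} of students meet the 80% benchmark")
    print(f"Fluency among them ranges from {fluency_band[0]} to {fluency_band[1]} cwpm")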
8. Additional research on findings not well understood

8a. Identify the causes of differences in performance across regions and develop context-relevant interventions.

Data analysis shows that the Savai'i region had the best average scores on all subtests. The differences between Savai'i and the other two regions across the subtests ranged from 0.2 to 4 points. Information in the MESC Statistical Digest 2016 indicates overcrowding in urban schools, which could be one reason why Savai'i students are performing better: every classroom that houses more children than the optimum number encounters particular difficulties in instruction, guidance and supervision, which means the teacher can give less individual attention to each student. An investigation into the causes of differences in performance between the regions is suggested. As well, targeted interventions for students and for teachers, such as the professional development programs suggested above, are highly recommended for Upolu Urban in particular.

Annex 1 / Tables

Annex A: Correlations between tasks
Annex B: 95% Confidence intervals for EGRA tasks
Annex D: Sāmoa - Regression Analysis with Reading Comprehension as Outcome

Annex A: Correlations between tasks

Tasks: Task 1 Letter Name Knowledge (clpm); Task 2 Initial Sound Identification (init_sound_pcnt); Task 3 Letter Sound Knowledge (clspm); Task 4 Familiar Words (cwpm); Task 5 Non-words (cnwpm); Task 6 Oral Reading Fluency (orf); Task 7 Reading Comprehension (readcomp_pcnt); Task 8 Listening Comprehension (listcomp_pcnt); Task 9 Dictation Orthography (orthography_pcnt); Task 10 Dictation Convention (convention_pcnt).

          Task 1   Task 2   Task 3   Task 4   Task 5   Task 6   Task 7   Task 8   Task 9   Task 10
Task 1    1
Task 2    0.431**  1
Task 3    0.541**  0.390**  1
Task 4    0.645**  0.519**  0.514**  1
Task 5    0.582**  0.494**  0.459**  0.854**  1
Task 6    0.590**  0.507**  0.473**  0.904**  0.856**  1
Task 7    0.471**  0.421**  0.400**  0.777**  0.753**  0.827**  1
Task 8    0.327**  0.391**  0.225**  0.379**  0.337**  0.361**  0.370**  1
Task 9    0.610**  0.480**  0.445**  0.770**  0.684**  0.729**  0.602**  0.420**  1
Task 10   0.586**  0.461**  0.422**  0.747**  0.662**  0.703**  0.581**  0.380**  0.894**  1
**Correlation is significant at the 0.01 level (2-tailed).

Annex B: 95% Confidence intervals for mean EGRA task scores by Year and Gender (lower bound - upper bound)

Task                               Overall      Year 1      Year 2      Year 3      Girls       Boys
Task 1: Letter Names               35.7-37.8    24.9-28.1   37.8-41.1   43.6-47.3   37.8-40.8   32.7-35.6
Task 2: Initial Sounds             31.5-35.2    16.1-20.8   33.8-40.2   43.1-49.8   33.5-38.7   28.0-33.1
Task 3: Letter Sounds              22.3-24.2    16.3-18.8   22.5-25.6   26.7-30.7   23.7-26.3   20.0-22.8
Task 4: Familiar Words             12.1-13.9    3.0-4.6     12.5-15.4   20.7-24.3   13.9-16.7   9.6-11.9
Task 5: Non-words                  6.4-7.6      1.3-2.3     6.5-8.7     11.1-11.3   7.3-9.1     5.1-6.6
Task 6: Oral Passage Reading       16.0-18.6    3.5-5.3     15.8-20.0   28.3-34.1   18.7-22.9   12.2-15.4
Task 7: Reading Comprehension      10.7-13.4    0.5-1.9     8.4-12.6    22.6-29.0   13.3-17.7   7.0-10.3
Task 8: Listening Comprehension    28.1-31.1    16.1-19.7   27.7-32.9   39.2-44.7   23.7-27.7   31.2-35.5
Task 9a: Dictation - Orthography   34.3-38.7    8.2-12.5    38.8-46.3   56.1-63.4   36.8-43.1   30.1-36.1
Task 9b: Dictation - Convention    25.8-29.4    5.9-9.4     28.5-34.9   42.7-49.0   27.4-32.7   22.6-27.6
Annex D: Sāmoa - Regression Analysis with Reading Comprehension as Outcome

Table 47: Impact of student characteristics on Reading Comprehension percent scores
Student Characteristics                                                     Change in RC Score (+/-)
Student attends preschool before Year 1                                     0.01
Student eats breakfast before arriving to school                            -0.06
Student speaks Sāmoan at home                                               1.64
Student has a literate mother                                               -1.22
Student has a literate father                                               1.12
Student has a literate sister                                               3.20*
Student has a literate brother                                              1.67
Student receives help with homework from the mother                         3.11
Student receives help with homework from father                             -1.41
Student receives help with homework from sister                             5.38*
Student receives help with homework from brother                            0.87
Someone asks student about what he/she did in school                        -0.07
Student tells someone at home when he/she gets good marks                   -0.07
Student has books, newspapers or other things to read at home               -0.02
Student has books, newspapers or other things to read at home in Sāmoan     6.70*
Student has books, newspapers or other things to read at home in English    6.42*
Student reads aloud to someone at home                                      5.12*
Student reads to himself/herself at home                                    5.41*
Someone reads to student at home                                            1.06
Student reads on a computer or mobile device at home                        0.003
Student likes to read                                                       0.04

Table 48: Effect of teachers' characteristics on Reading Comprehension scores
Teacher Characteristics                              Change in RC Score (+/-)
Has a primary teaching certificate                   -0.08
Has a reading corner in the classroom                0.08
Has not been absent from school in the last term     -0.14
Has met with the parents of his/her students         -0.36
Age of the teacher                                   0.01
Number of years of experience in teaching            -0.006
Number of minutes from home to school                0.002

Table 49: Effect of training and guides on Reading Comprehension scores
Indicator                                                                       Change in RC score
Teacher has a syllabus/curriculum in Sāmoan                                     -0.02
Teacher has a manual for teaching Sāmoan                                        -0.12
Teacher has received training on how to teach reading in the last three years   -0.07

Table 50: Effect of Classroom Environment on Reading Comprehension scores
Classroom Environment                                          Change in RC Score (+/-)
Classroom displays                                             -0.05
Spelling/vocabulary displayed                                  -0.16
Songs/hymns/stories displayed on blackboard                    0.10
Songs/hymns/stories displayed on charts/posters                -0.10
Student work displayed                                         0.16
Print materials used in instruction                            0.14*
Sufficient classroom space for organized group activities      -0.22
Reading corner in the classroom                                -0.18
Student profiles (folder with student work and student info)   0.03

Table 51: Effect of Reading Instructional Resources on Reading Comprehension scores
Language use in classroom                            Change in RC Score (+/-)
Reading instructional materials in classroom         0.03

Table 52: Effect of teacher instructional and assessment methods on Reading Comprehension scores
Teaching instructional and assessment methods          Change in RC score
64. Teaching of Listening Comprehension
  Never    -0.36
  Rarely   -0.10
  Often    0.02
  Daily    -0.12
65. Teaching Letter Names
  Never    -0.04
  Rarely   -0.22
  Often    -0.21
  Daily    -0.18
66. Asking children to orally retell a story that they have read
  Never    -0.21
  Rarely   -0.10
  Often    0.42*
  Daily    -0.16
67. Teaching new letter sounds
  Never    0.06
  Rarely   -0.45
  Often    0.18*
  Daily    -0.29
68. Asking children to sound out unfamiliar words using knowledge of letter sounds
  Never    -0.13
  Rarely   0.16
  Often    0.26
  Daily    -0.10
69. Teaching meaning of new vocabulary words
  Never    1.63
  Rarely   -8.45
  Often    -3.60
  Daily    -8.44
70. Shared reading
  Never    -0.36
  Rarely   -0.04
  Often    0.09
  Daily    -0.10
71. Group Guided reading
  Never    -0.57
  Rarely   -0.14
  Often    0.18
  Daily    0.14
72. Listening to a child read aloud
  Never    -0.20
  Rarely   0.07
  Often    0.31
  Daily    0.15
73. Students reading on their own silently
  Rarely     0.41
  Sometimes  0.39*
  Often      0.30
  Daily      0.42*
74. Reading comprehension activities
  Never    -0.55
  Rarely   -0.09
  Often    0.55*
  Daily    -0.10
75. Children take books home to read with their parents
  Never    -0.26
  Rarely   -0.22
  Often    -0.05
  Daily    0.002
76. Evaluating students' oral reading with running records or any other method
  Never    -0.36
  Rarely   -0.52
  Often    -0.10
  Daily    -0.22
77. Teacher works on word building with students
  Never    0.06
  Rarely   0.26
  Often    0.67*
  Daily    0.20
78. Students read and draw
  Rarely     0.37
  Sometimes  0.10
  Often      0.06
  Daily      -0.02
79. Students work on spelling words in exercise books
  Never    0.14
  Rarely   -0.14
  Often    0.92*
  Daily    0.27*
80. Students writing sentences
  Never    -0.25
  Rarely   0.16
  Often    0.51
  Daily    0.13

Annex 2 / Instruments

Annex 2.A: EGRA Instrument
Annex 2.B: Student Questionnaire
Annex 2.C: Head Teacher Questionnaire
Annex 2.D: Teacher Questionnaire
Annex 2.E: Classroom Observation

Annex 3 / Test Reliability Measures

Annex Table 1 presents indicators of test reliability for the SEGRA. The first indicator of reliability is the "item-test" correlation, which is the correlation between each sub-domain and a composite measure. The composite measure is the sum of the standardized scores of each sub-domain, following the Cronbach's Alpha methodology. The second indicator is the "item-rest" correlation, which is the correlation of each sub-domain with a composite measure that excludes that sub-domain. The composite measure for this second indicator is the sum of the standardized scores of all sub-domains excluding the sub-domain in question. These two indicators help identify sub-domains that are less correlated with the EGRA test as a whole, in order to flag potential outlier sub-domains. Finally, Cronbach's Alpha is calculated for the sub-domains that are not timed; these are scored as percent correct in EGRA. RTI (2009:82) does not recommend the Cronbach's Alpha test for timed sub-domains as it may inflate the measure of test reliability.32 This indicator provides an overall measure of the correlation between the non-timed sub-domains; a typical benchmark in research studies is 0.7. A schematic sketch of how these indicators can be computed follows Annex Table 1.

For the SEGRA, listening comprehension stands out as being less correlated with the other domains. Its item-test correlation is 0.56, compared to a range of 0.64 to 0.91 for the other domains, and its item-rest correlation is 0.45, compared to a range of 0.54 to 0.87 for the other domains. However, Cronbach's Alpha for the non-timed sub-domains, which includes listening comprehension, is 0.74 and exceeds the typical reliability benchmark of 0.7.

32 RTI (2009). Early Grade Reading Assessment Toolkit. Research Triangle Park, N.C.: RTI International
Annex Table 1. Reliability measures
                                                Item-test correlation   Item-rest correlation
Correct letters per minute                      0.75                    0.67
Correct letter sounds per minute                0.64                    0.54
Correct words per minute                        0.91                    0.87
Correct invented words per minute               0.85                    0.80
Oral reading fluency                            0.90                    0.87
Initial sounds (percent correct)                0.69                    0.60
Reading comprehension (percent correct)         0.82                    0.76
Listening comprehension (percent correct)       0.56                    0.45
Dictation (percent correct)                     0.83                    0.77
Cronbach's alpha (percent correct items only)   0.74
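As a schematic illustration of how the indicators in Annex Table 1 can be computed, the sketch below standardizes each sub-domain, correlates it with the composite score (item-test) and with the composite excluding that sub-domain (item-rest), and computes Cronbach's Alpha over untimed, percent-correct sub-domains only. The data and column names are invented, and the use of Python's pandas/numpy is an assumption of this sketch; it does not reproduce the SEGRA analysis files.

    # A schematic sketch of the reliability indicators defined in Annex 3.
    import numpy as np
    import pandas as pd

    def item_test_rest(scores):
        """Item-test and item-rest correlations over standardized sub-domain scores."""
        z = (scores - scores.mean()) / scores.std(ddof=1)   # standardize each sub-domain
        total = z.sum(axis=1)                               # composite measure
        rows = []
        for col in z.columns:
            rows.append({"subtest": col,
                         "item_test": z[col].corr(total),            # item vs full composite
                         "item_rest": z[col].corr(total - z[col])})  # composite excluding item
        return pd.DataFrame(rows)

    def cronbach_alpha(scores):
        """Cronbach's Alpha over the untimed, percent-correct sub-domains."""
        k = scores.shape[1]
        return k / (k - 1) * (1 - scores.var(ddof=1).sum() / scores.sum(axis=1).var(ddof=1))

    rng = np.random.default_rng(0)
    demo = pd.DataFrame(rng.integers(0, 101, size=(50, 4)),
                        columns=["initial_sounds", "reading_comp", "listening_comp", "dictation"])
    print(item_test_rest(demo))
    print(round(cronbach_alpha(demo), 2))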