The Former Yugoslav Republic of Macedonia
SABER Country Report
STUDENT ASSESSMENT 2012

Key Policy Areas for Student Assessment

1. Classroom Assessment
In FYR Macedonia, there are several formal system-level documents that provide guidelines for classroom assessment. While there are some system-wide resources and materials available to teachers for carrying out classroom assessment activities, there are limited opportunities available to them for learning about or developing more effective classroom assessment practices. In general, classroom assessment practices are considered to be weak, and there are limited systematic mechanisms in place to monitor their quality.

2. Examinations
The State Matura has been administered every year since 2008 to grade 12 students. Results of the State Matura are used for certifying grade completion, determining admission to university and other higher education institutions, monitoring education quality levels, and planning education policy reforms. Funding for the State Matura is provided by the government to the National Examination Centre. Currently, there are no up-to-date courses or workshops on the State Matura available to classroom teachers.

3. National Large-Scale Assessment (NLSA)
The External Assessment of Students' Achievement in Primary and Secondary Education ("External Assessment") was piloted in different grades and subjects in 2010 and 2011. The Ministry of Education and Science announced that the External Assessment would be formally launched in the 2012-2013 school year. The official purpose of the External Assessment is to evaluate the objectivity of teachers' grading.

4. International Large-Scale Assessment (ILSA)
FYR Macedonia has participated in a number of ILSA exercises, including PIRLS (2001, 2006), TIMSS (1999, 2003, 2011), PISA (2000), and PISA Plus (2001). However, there is no policy document that systematically addresses the country's participation in international assessments. Most of the funding for FYR Macedonia's participation in ILSA exercises has been provided by donors, including the World Bank. Opportunities to learn about ILSA are available only to individuals working directly on a specific ILSA exercise.

THE WORLD BANK

Introduction

In 2001, FYR Macedonia embarked on a new education policy agenda. One of the goals of this agenda was to improve the country's student assessment system. In order to gain a better understanding of the strengths and weaknesses of its existing assessment system, FYR Macedonia decided to benchmark this system using standardized tools developed under The World Bank's Systems Approach for Better Education Results (SABER) program. SABER is an evidence-based program to help countries systematically examine and strengthen the performance of different aspects of their education systems.

What is SABER-Student Assessment?

SABER-Student Assessment is a component of the SABER program that focuses specifically on benchmarking student assessment policies and systems. The goal of SABER-Student Assessment is to promote stronger assessment systems that contribute to improved education quality and learning for all.

National governments and international agencies are increasingly recognizing the key role that assessment of student learning plays in an effective education system. The importance of assessment is linked to its role in:
(i) providing information on levels of student learning and achievement in the system;
(ii) monitoring trends in education quality over time;
(iii) supporting educators and students with real-time information to improve teaching and learning; and
(iv) holding stakeholders accountable for results.

SABER-Student Assessment methodology

The SABER-Student Assessment framework is built on the available evidence base for what an effective assessment system looks like. The framework provides guidance on how countries can build more effective student assessment systems. The framework is structured around two main dimensions of assessment systems: the types/purposes of assessment activities and the quality of those activities.

Assessment types and purposes

Assessment systems tend to be comprised of three main types of assessment activities, each of which serves a different purpose and addresses different information needs. These three main types are: classroom assessment, examinations, and large-scale, system-level assessments.

Classroom assessment provides real-time information to support ongoing teaching and learning in individual classrooms. Classroom assessments use a variety of formats, including observation, questioning, and paper-and-pencil tests, to evaluate student learning, generally on a daily basis.

Examinations provide a basis for selecting or certifying students as they move from one level of the education system to the next (or into the workforce). All eligible students are tested on an annual basis (or more often if the system allows for repeat testing). Examinations cover the main subject areas in the curriculum and usually involve essays and multiple-choice questions.

Large-scale, system-level assessments provide feedback on the overall performance of the education system at particular grades or age levels. These assessments typically cover a few subjects on a regular basis (such as every 3 to 5 years), are often sample based, and use multiple-choice and short-answer formats. They may be national or international in scope. Appendix 1 summarizes the key features of these main types of assessment activities.

Quality drivers of an assessment system

The key considerations when evaluating a student assessment system are the individual and combined quality of assessment activities in terms of the adequacy of the information generated to support decision making. There are three main drivers of information quality in an assessment system: enabling context, system alignment, and assessment quality.

Enabling context refers to the broader context in which the assessment activity takes place and the extent to which that context is conducive to, or supportive of, the assessment. It covers such issues as the legislative or policy framework for assessment activities; institutional and organizational structures for designing, carrying out, or using results from the assessment; the availability of sufficient and stable sources of funding; and the presence of trained assessment staff.

System alignment refers to the extent to which the assessment is aligned with the rest of the education system. This includes the degree of congruence between assessment activities and system learning goals, standards, curriculum, and pre- and in-service teacher training.

Assessment quality refers to the psychometric quality of the instruments, processes, and procedures for the assessment activity. It covers such issues as design and implementation of assessment activities, analysis and interpretation of student responses to those activities, and the appropriateness of how assessment results are reported and used.

Crossing the quality drivers with the different assessment types/purposes provides the framework and broad indicator areas shown in Table 1. This framework is a starting point for identifying indicators that can be used to review assessment systems and plan for their improvement.

Table 1: Framework for building an effective assessment system, with indicator areas

The indicators are identified based on a combination of criteria, including:
• professional standards for assessment;
• empirical research on the characteristics of effective assessment systems, including analysis of the characteristics that differentiate between the assessment systems of low- versus high-performing nations; and
• theory — that is, general consensus among experts that it contributes to effective assessment.

Levels of development

The World Bank has developed a set of standardized questionnaires and rubrics for collecting and evaluating data on the three assessment types and related quality drivers.

The questionnaires are used to collect data on the characteristics of the assessment system in a particular country. The information from the questionnaires is then applied to the rubrics in order to judge the development level of the country's assessment system in different areas.

The basic structure of the rubrics for evaluating data collected using the standardized questionnaires is summarized in Appendix 2. The goal of the rubrics is to provide a country with some sense of the development level of its assessment activities compared to best or recommended practice in each area. For each indicator, the rubric displays four development levels—Latent, Emerging, Established, and Advanced.

These levels are artificially constructed categories chosen to represent key stages on the underlying continuum for each indicator. Each level is accompanied by a description of what performance on the indicator looks like at that level.

• Latent is the lowest level of performance; it represents absence of, or deviation from, the desired attribute.
• Emerging is the next level; it represents partial presence of the attribute.
• Established represents the acceptable minimum standard.
• Advanced represents the ideal or current best practice.

A summary of the development levels for each assessment type is presented in Appendix 3.

In reality, assessment systems are likely to be at different levels of development in different areas. For example, a system may be Established in the area of examinations, but Emerging in the area of large-scale, system-level assessment, and vice versa. While intuition suggests that it is probably better to be further along in as many areas as possible, the evidence is unclear as to whether it is necessary to be functioning at Advanced levels in all areas. Therefore, one might view the Established level as a desirable minimum outcome to achieve in all areas, but only aspire beyond that in those areas that most contribute to the national vision or priorities for education. In line with these considerations, the ratings generated by the rubrics are not meant to be additive across assessment types (that is, they are not meant to be added to create an overall rating for an assessment system; they are only meant to produce an overall rating for each assessment type). The methodology for assigning development levels is summarized in Appendix 4.

Education in FYR Macedonia

FYR Macedonia is an upper-middle-income country in Eastern Europe. GDP per capita (current US$) is $4,925, with annual growth of about 2.9 percent. While unemployment is very high at 31.4 percent (2011), education is one of the main sectors experiencing employment growth, particularly in the second quarter of 2012.

Primary and secondary education is compulsory in FYR Macedonia. Beginning in the 2007-2008 school year, children start school at the age of 6. Also since the 2007-2008 school year, the length of the primary education cycle increased from eight to nine years. As of 2009, school enrollment at the primary level is 90 percent and the completion rate is 92 percent.

Investments have been made to improve the physical infrastructure for education, to increase the number of teachers (particularly in primary and lower-secondary schools), and to modernize the curricula. Additionally, FYR Macedonia has focused on improving education outcomes. FYR Macedonia's performance on international large-scale assessments (TIMSS, PISA, and PIRLS) in 2000 and 2001 prompted several education reforms in this regard. As a result, between 2004 and 2008, enrollment in secondary schools increased from 85 to 95 percent, and dropout rates in primary school decreased to below 2 percent. To improve teacher performance, training opportunities have been made available, and specifications for the accreditation and monitoring of teacher training have been introduced. These and other education reforms have led to school managers and teachers reporting improvements in key areas of education. Specifically, from 2004 to 2007, almost half of school managers and teachers reported improvements in student achievement, while nearly two thirds observed improvements in planning and assessment processes.

Detailed information was collected on FYR Macedonia's student assessment system using the SABER-Student Assessment questionnaires and rubrics. It is important to remember that these tools primarily focus on benchmarking a country's policies and arrangements for assessment activities at the system or macro level. Additional data would need to be collected to determine actual, on-the-ground practices in FYR Macedonia, particularly by teachers and students in schools. The following sections discuss the findings for each assessment type, accompanied by suggested policy options. The suggested policy options were determined in collaboration with key local stakeholders based on FYR Macedonia's immediate interests and needs. Detailed, completed rubrics for each assessment type in FYR Macedonia are provided in Appendix 5.

Classroom Assessment
Level of development

There are several formal system-level documents that provide guidelines for classroom assessment. The Teaching Curricula by Subject and Grades (2001-2010) series are official documents authorized by the Bureau for Development of Education in the Ministry of Education that specify what students are expected to learn. However, these and other available documents do not specify the level of performance that students are expected to demonstrate in relation to what they have learned.

There are some system-wide resources/materials available to teachers for carrying out classroom assessment activities. For example, the Teaching Curricula by Subject and Grades documents outline what students are expected to learn in different subject areas at different grade/age levels, and the Assessment Criteria document provides the level(s) of performance that students are expected to reach in different subject areas at different grade/age levels. The Ministry of Education website provides examples of good assessment practices and case studies on classroom assessment. An on-line item bank, containing some items that were previously administered on the State Matura and on the National Assessment, was recently added to this website (posted in June 2011), although the number of items in the item bank is very limited.

There are limited opportunities available to teachers for learning about or developing more effective classroom assessment practices. A small number of courses on classroom assessment are offered during pre-service teacher training. In the case of in-service teacher training, some courses on classroom assessment have been offered on an ad-hoc basis.

Classroom assessment practices are generally considered weak, with an overemphasis on their use as an administrative tool rather than as a pedagogical resource. It is apparently very common to observe uneven application of standards for grading students' work, and grade inflation is considered a serious problem. Reports also indicate that classroom assessment activities tend to be mainly about recalling information and that teachers do not use explicit or a priori criteria for scoring or grading students' work. Additionally, it is reportedly common to observe errors in the scoring or grading of students' work. On the positive side, it appears that parents are well informed about students' grades, classroom assessment information is seen as providing useful feedback to students, and classroom assessment practices are viewed as being aligned with the pedagogical and curricular framework.

Apart from classroom assessment being a required component of a teacher's performance evaluation and of school inspection, there are limited systematic mechanisms in place to monitor the quality of classroom assessment practices.

Suggested policy options:

1. Introduce required courses on classroom assessment for pre- and in-service teachers; develop and make available additional resources for teachers to carry out classroom assessment.

2. Introduce more systematic mechanisms to monitor the quality of classroom assessment practices.

3. Develop additional system-level documents to guide classroom assessment in secondary education.

4. Emphasize the quality of classroom assessment during external evaluation of schools, and introduce regular mechanisms to support teachers in improving assessment practices.

Examinations
Level of development

The State Matura has been administered every year since 2008 to grade 12 students. Results are used for certifying secondary education completion, determining admission to university and other higher education institutions, monitoring education quality levels, and planning education policy reforms.

Almost all stakeholders support the State Matura. Some stakeholders have made efforts to further improve the examination. For example, policy makers and educators have encouraged the development of a State Matura exam for art schools, the introduction of two levels of the mathematics exam, and the introduction of a second examination session for students who failed the examination during the first session.

The National Examination Centre (NEC) is responsible for the State Matura. Funding for the State Matura is provided to the NEC by the government, and covers all core examination activities (design, administration, data processing, and reporting), as well as staff training, but does not cover research and development.

The NEC has state-of-the-art facilities to carry out the examination, including computers for all technical staff, a secure building, secure storage facilities, access to adequate computer servers, the ability to back up data, and adequate communication tools.

The State Matura results are officially recognized both in FYR Macedonia and by certification and selection systems abroad, including in Slovenia, the UK, Bulgaria, Albania, and Serbia.

Some documentation about the technical aspects of the State Matura exists, but it is not in a formal report format. Comprehensive technical reports were prepared only during the piloting of the State Matura.

While there is a general understanding of what the State Matura measures, there are no regular, up-to-date courses or workshops on the exam available to classroom teachers. Focused workshops on administration and marking procedures for the exam are organized every year, but these are available only to those serving as test administrators and markers. Workshops related to specific subject areas (content, requirements) are organized from time to time. University professors and certified training providers carry out some of these workshops. Schools have to pay for their teachers' participation in these workshops, and only a limited number of schools are able to provide access to them for their teachers. Courses for teachers of centrally-developed Matura subjects are offered for free by the NEC and the Bureau for Development of Education (BDE).

Suggested policy options:

1. Introduce mechanisms for monitoring the impact of the State Matura on the quality of teaching and learning.

2. Improve the quality of documentation on the technical aspects of the State Matura.

3. Introduce regular funding for research and development, and for improving the technical quality of the State Matura.

4. Introduce workshops or courses for teachers on the State Matura that would address, for example, test item development, data analysis, and use of data to improve teaching and learning.

National Large-Scale Assessment (NLSA)
Level of development

The External Assessment of Students' Achievement in Primary and Secondary Education (External Assessment) was piloted in 2010 (grade 9 Mathematics) and 2011 (grade 4 Social Sciences, grade 7 Chemistry, grade 10 Mother Tongue, grade 11 Business). Laws for primary and secondary education regulate the External Assessment. At the time of this review, the External Assessment was still in the pilot phase and the Ministry of Education and Science had announced that it would be formally launched in the 2012-2013 school year.

From 1998 to 2006, the National Assessment, the main purposes of which were to monitor education quality at the system level; support schools and teachers; and support policy design, evaluation, and decision making, was considered the most important national large-scale assessment in FYR Macedonia. The National Assessment is no longer administered and, according to education policy, the External Assessment is now viewed as the most important national large-scale assessment in the country.

The main purpose of the External Assessment is to assess the objectivity of teachers' grading. Although not provided for in official documents, there also has been some discussion about formally reporting External Assessment results for individual students and using them to make decisions about their selection to the next level in the education cycle.

While policymakers strongly support the External Assessment, educators, students, and parents, as well as some donors, oppose it. Stakeholders have challenged the validity of the External Assessment given its intended use, and expressed concern that its high-stakes nature, as well as the format of the assessment instrument (only multiple-choice items), will result in teaching to the test.

As a result of stakeholders' independent efforts to shape the External Assessment, regulation of the External Assessment shifted from the Bureau for Development of Education, which is responsible for curriculum development and supporting teachers, to the National Examination Centre, an institution established for the purpose of administering the External Assessment. Additionally, while it was initially intended that students' results on the External Assessment would influence their final annual (summative) grades, given that the Law on Education provides that only teachers are responsible for grading students, a new bylaw has been introduced that specifies that External Assessment results cannot influence students' final annual (summative) grades.

Currently, no teacher training courses, workshops, or presentations on the External Assessment are offered in FYR Macedonia.

While some documentation on the technical aspects of the External Assessment exists, it is not in the form of a formal report and it is not publicly available.

There are no mechanisms in place to ensure that the External Assessment is used in a way that is consistent with its intended purposes and technical characteristics, or to monitor its consequences.

Suggested policy options:

1. Clearly identify and communicate to key stakeholders the purposes, intended uses, and characteristics of the External Assessment.

2. Introduce regular training courses for NEC staff on the development and administration of the External Assessment.

3. Introduce in- and pre-service teacher training courses, workshops, or presentations on the External Assessment.

4. Introduce mechanisms to (1) ensure that the External Assessment is used consistently with its purposes and technical characteristics, and (2) monitor its consequences or impact.
International Large-Scale Assessment (ILSA)

Level of development

FYR Macedonia has participated in a number of ILSA exercises, including PIRLS (2001, 2006), TIMSS (1999, 2003, 2011), and PISA (PISA Plus in 2001). However, there is no policy document that addresses, in a systematic manner, the country's participation in international assessments. Although there is no official document or plan for future participation in an ILSA, FYR Macedonia has taken concrete steps to participate in PISA 2015: the Ministry of Education and Science sent an official letter of interest to the OECD. The Ministry also has informally expressed its intention to participate in TIMSS 2015.

Most of the funding for FYR Macedonia's participation in ILSA exercises, including the recently concluded TIMSS 2011, has been provided by donors, including The World Bank, USAID, and UNICEF. Funding covered international participation fees, implementation of the assessment exercise in the country, and attendance at international expert meetings. Funding did not cover research and development activities.

The task of processing and analyzing data from the TIMSS 2011 exercise, as well as reporting and disseminating the results in the country, was carried out by NEC staff (as part of their official responsibilities).

Opportunities to learn about ILSA are available only to individuals working directly on a specific ILSA, and are provided by the Ministry of Education and the donor community (e.g., UNICEF, Dutch embassy).

Results from PIRLS 2001, TIMSS 1999, TIMSS 2003, and PISA Plus were printed as national reports, disseminated to stakeholders (schools, universities, educational authorities), and publicly presented at conferences. Reports are available in print and on line.

PIRLS 2006 results were presented only to representatives from the MoES and educational agencies within the MoES, primarily due to a lack of financial resources. There is a plan for the results from TIMSS 2011 to be presented in a country report.

The results of the ILSA exercises have been used by policy makers and education leaders to improve education quality in the country. For example, PIRLS and TIMSS results have been used as a basis for developing the concept for the nine-year primary education cycle, teacher training programs (programs initiated by donors have been designed to reduce gaps in student achievement that were identified by the national reports for the various ILSA exercises), and other assessment activities in the country. TIMSS and PIRLS frameworks and technical standards have been used to inform the development of the External Assessment, and released items have been used as materials for teacher training on classroom assessment.

Suggested policy options:

1. Prepare a formal policy document that addresses FYR Macedonia's participation in ILSA.

2. Introduce more regular government funding, in combination with donor funding, for carrying out ILSA activities.

3. Introduce mechanisms to capture and evaluate the impact of decisions based on ILSA results on student achievement levels.

4. Use ILSA results to inform decision making on adjusting existing mechanisms or introducing new mechanisms to improve teaching and learning.
Appendix 1: Assessment Types and Their Key Differences

Purpose
• Classroom assessment: To provide immediate feedback to inform classroom instruction.
• Large-scale survey assessment, national: To provide feedback on the overall health of the system at particular grade/age level(s), and to monitor trends in learning.
• Large-scale survey assessment, international: To provide feedback on the comparative performance of the education system at particular grade/age level(s).
• Exit examinations: To certify students as they move from one level of the education system to the next (or into the workforce).
• Entrance examinations: To select students for further educational opportunities.

Frequency
• Classroom assessment: Daily.
• Large-scale survey assessment, national: For individual subjects offered on a regular basis (such as every 3-5 years).
• Large-scale survey assessment, international: For individual subjects offered on a regular basis (such as every 3-5 years).
• Exit examinations: Annually, and more often where the system allows for repeats.
• Entrance examinations: Annually, and more often where the system allows for repeats.

Who is tested?
• Classroom assessment: All students.
• Large-scale survey assessment, national: Sample or census of students at a particular grade or age level(s).
• Large-scale survey assessment, international: A sample of students at a particular grade or age level(s).
• Exit examinations: All eligible students.
• Entrance examinations: All eligible students.

Format
• Classroom assessment: Varies from observation to questioning to paper-and-pencil tests to student performances.
• Large-scale survey assessment, national: Usually multiple choice and short answer.
• Large-scale survey assessment, international: Usually multiple choice and short answer.
• Exit examinations: Usually essay and multiple choice.
• Entrance examinations: Usually essay and multiple choice.

Coverage of curriculum
• Classroom assessment: All subject areas.
• Large-scale survey assessment, national: Generally confined to a few subjects.
• Large-scale survey assessment, international: Generally confined to one or two subjects.
• Exit examinations: Covers main subject areas.
• Entrance examinations: Covers main subject areas.

Additional information collected from students?
• Classroom assessment: Yes, as part of the teaching process.
• Large-scale survey assessment, national: Frequently.
• Large-scale survey assessment, international: Yes.
• Exit examinations: Seldom.
• Entrance examinations: Seldom.

Scoring
• Classroom assessment: Usually informal and simple.
• Large-scale survey assessment, national: Varies from simple to more statistically sophisticated techniques.
• Large-scale survey assessment, international: Usually involves statistically sophisticated techniques.
• Exit examinations: Varies from simple to more statistically sophisticated techniques.
• Entrance examinations: Varies from simple to more statistically sophisticated techniques.

Appendix 2: Basic Structure of Rubrics for Evaluating Data Collected on a Student Assessment System

Each dimension of the rubric is rated against four development levels—Latent (absence of, or deviation from, the attribute), Emerging (on the way to meeting the minimum standard), Established (acceptable minimum standard), and Advanced (best practice)—together with a justification for the rating. The dimensions are:

EC—Enabling Context: EC1—Policies; EC2—Leadership, public engagement; EC3—Funding; EC4—Institutional arrangements; EC5—Human resources.
SA—System Alignment: SA1—Learning/quality goals; SA2—Curriculum; SA3—Pre-, in-service teacher training.
AQ—Assessment Quality: AQ1—Ensuring quality (design, administration, analysis); AQ2—Ensuring effective uses.

Appendix 3: Summary of the Development Levels for Each Assessment Type

The four development levels are: Latent (absence of, or deviation from, the attribute), Emerging (on way to meeting minimum standard), Established (acceptable minimum standard), and Advanced (best practice).

CLASSROOM ASSESSMENT
• Latent: There is no system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
• Emerging: There is weak system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
• Established: There is sufficient system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
• Advanced: There is strong system-wide institutional capacity to support and ensure the quality of classroom assessment practices.

EXAMINATIONS
• Latent: There is no standardized examination in place for key decisions.
• Emerging: There is a partially stable standardized examination in place, and a need to develop institutional capacity to run the examination. The examination typically is of poor quality and is perceived as unfair or corrupt.
• Established: There is a stable standardized examination in place. There is institutional capacity and some limited mechanisms to monitor it. The examination is of acceptable quality and is perceived as fair for most students and free from corruption.
• Advanced: There is a stable standardized examination in place and institutional capacity and strong mechanisms to monitor it. The examination is of high quality and is perceived as fair and free from corruption.

NATIONAL (OR SYSTEM-LEVEL) LARGE-SCALE ASSESSMENT
• Latent: There is no NLSA in place.
• Emerging: There is an unstable NLSA in place and a need to develop institutional capacity to run the NLSA. Assessment quality and impact are weak.
• Established: There is a stable NLSA in place. There is institutional capacity and some limited mechanisms to monitor it. The NLSA is of moderate quality and its information is disseminated, but not always used in effective ways.
• Advanced: There is a stable NLSA in place and institutional capacity and strong mechanisms to monitor it. The NLSA is of high quality and its information is effectively used to improve education.

INTERNATIONAL LARGE-SCALE ASSESSMENT
• Latent: There is no history of participation in an ILSA nor plans to participate in one.
• Emerging: Participation in an ILSA has been initiated, but there still is need to develop institutional capacity to carry out the ILSA.
• Established: There is more or less stable participation in an ILSA. There is institutional capacity to carry out the ILSA. The information from the ILSA is disseminated, but not always used in effective ways.
• Advanced: There is stable participation in an ILSA and institutional capacity to run the ILSA. The information from the ILSA is effectively used to improve education.

Appendix 4: Methodology for Assigning Development Levels

1. The country team or consultant collects information about the assessment system in the country.

2. Based on the collected information, a level of development and score is assigned to each dimension in the rubrics:
• Latent = 1 score point
• Emerging = 2 score points
• Established = 3 score points
• Advanced = 4 score points

3. The score for each quality driver is computed by aggregating the scores for each of its constituent dimensions. For example: the quality driver 'Enabling Context,' in the case of ILSA, has 3 dimensions on which a hypothetical country receives the following scores: Dimension A = 2 points; Dimension B = 2 points; Dimension C = 3 points. The hypothetical country's overall score for this quality driver would be: (2+2+3)/3 = 2.33.

4. A preliminary level of development is assigned to each quality driver.

5. The preliminary development level is validated using expert judgment in cooperation with the country team and The World Bank Task Team Leader. For scores that allow a margin of discretion (i.e., to choose between two levels of development), a final decision has to be made based on expert judgment. For example, the aforementioned hypothetical country has an 'Enabling Context' score of 2.33, corresponding to a preliminary level of development of 'Emerging or Established.' Based on qualitative information not captured in the rubric, along with expert judgment, the country team chooses 'Emerging' as the most appropriate level.

6. Scores for certain key dimensions under 'Enabling Context' (in the case of EXAM, NLSA, and ILSA) and under 'System Alignment' (in the case of CLASS) were set as ceiling scores, i.e., the overall mean score for the particular assessment type cannot be greater than the score for these key dimensions. These key variables include formal policy, regular funding, having a permanent assessment unit, and the quality of assessment practices.
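To make the aggregation in steps 2-6 concrete, the short sketch below works through the hypothetical example above. The one-to-four score points, the level labels, and the ceiling rule are taken from Appendix 4; the Python code itself, including the function and variable names, is illustrative only and is not part of the SABER toolkit.

```python
import math

# Score points and level labels from Appendix 4; everything else here
# (function names, example inputs) is illustrative.
LEVELS = {1: "Latent", 2: "Emerging", 3: "Established", 4: "Advanced"}


def driver_score(dimension_scores, key_dimension_scores=()):
    """Step 3: average the dimension scores for one quality driver.
    Step 6: cap the result at the lowest score among the key ('ceiling')
    dimensions, such as formal policy, regular funding, a permanent
    assessment unit, or the quality of assessment practices."""
    mean = sum(dimension_scores) / len(dimension_scores)
    if key_dimension_scores:
        mean = min(mean, min(key_dimension_scores))
    return mean


def preliminary_level(score):
    """Step 4: map a driver score to a preliminary development level.
    A score that falls between two whole numbers is reported as a range,
    to be resolved by expert judgment in step 5."""
    lower, upper = math.floor(score), math.ceil(score)
    if lower == upper:
        return LEVELS[lower]
    return f"{LEVELS[lower]} or {LEVELS[upper]}"


# Worked example from step 3: three dimensions scoring 2, 2, and 3 points.
score = driver_score([2, 2, 3])
print(round(score, 2), "->", preliminary_level(score))
# 2.33 -> Emerging or Established
```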
Appendix 5: SABER-Student Assessment Rubrics for the Former Yugoslav Republic of Macedonia

This appendix provides the completed SABER-Student Assessment rubrics for each type of assessment activity in FYR Macedonia. In each rubric row below, the relevant selection is marked with an asterisk (*). The selection may be followed by a bracketed number that refers to the justification or explanation for the selection; the justification text can be found in the "Development-level rating justifications" section at the end of each rubric. If a row includes a bracketed number but no asterisk, this means that insufficient information was available to determine the relevant selection in the row.

THE FORMER YUGOSLAV REPUBLIC OF MACEDONIA
Classroom Assessment

ENABLING CONTEXT AND SYSTEM ALIGNMENT
Overall policy and resource framework within which classroom assessment activity takes place in a country or system, and the degree to which classroom assessment activity is coherent with other components of the education system.

ENABLING CONTEXT AND SYSTEM ALIGNMENT 1: Setting clear guidelines for classroom assessment
• Latent: There is no system-level document that provides guidelines for classroom assessment.
• Emerging: There is an informal system-level document that provides guidelines for classroom assessment.
• Established: There is a formal system-level document that provides guidelines for classroom assessment. *[1]
• Advanced: This option does not apply to this dimension.

• Latent: This option does not apply to this dimension.
• Emerging: This option does not apply to this dimension.
• Established: The availability of the document is restricted.
• Advanced: The document is widely available. *[2]

ENABLING CONTEXT AND SYSTEM ALIGNMENT 2: Aligning classroom assessment with system learning goals
• Latent: There are no system-wide resources for teachers for classroom assessment.
• Emerging: There are scarce system-wide resources for teachers for classroom assessment.
• Established: There are some system-wide resources for teachers for classroom assessment. *[3]
• Advanced: There are a variety of system-wide resources available for teachers for classroom assessment.

• Latent: There is no official curriculum or standards document.
• Emerging: There is an official curriculum or standards document, but it is not clear what students are expected to learn or to what level of performance.
• Established: There is an official curriculum or standards document that specifies what students are expected to learn, but the level of performance required is not clear. *[4]
• Advanced: There is an official curriculum or standards document that specifies what students are expected to learn and to what level of performance.

ENABLING CONTEXT AND SYSTEM ALIGNMENT 3: Having effective human resources to carry out classroom assessment activities
• Latent: There are no system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment.
• Emerging: This option does not apply to this dimension.
• Established: There are some system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment. *[5]
• Advanced: There are a variety of system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment.
ASSESSMENT QUALITY
Quality of classroom assessment design, administration, analysis, and use.

ASSESSMENT QUALITY 1: Ensuring the quality of classroom assessment
• Latent: Classroom assessment practices suffer from widespread weaknesses or there is no information available on classroom assessment practices.
• Emerging: Classroom assessment practices are known to be weak. *[6]
• Established: Classroom assessment practices are known to be of moderate quality.
• Advanced: Classroom assessment practices are known to be generally of high quality.

• Latent: There are no mechanisms to monitor the quality of classroom assessment practices.
• Emerging: There are ad hoc mechanisms to monitor the quality of classroom assessment practices.
• Established: There are limited systematic mechanisms to monitor the quality of classroom assessment practices. *[7]
• Advanced: There are varied and systematic mechanisms in place to monitor the quality of classroom assessment practices.

ASSESSMENT QUALITY 2: Ensuring effective uses of classroom assessment
• Latent: Classroom assessment information is not required to be disseminated to key stakeholders.
• Emerging: This option does not apply to this dimension.
• Established: Classroom assessment information is required to be disseminated to some key stakeholders.
• Advanced: Classroom assessment information is required to be disseminated to all key stakeholders. *[8]

• Latent: There are no required uses of classroom assessment to support student learning.
• Emerging: There are limited required uses of classroom assessment to support student learning.
• Established: There are adequate required uses of classroom assessment to support student learning, excluding its use as an input for external examination results. *[9]
• Advanced: There are adequate required uses of classroom assessment to support student learning, including its use as an input for external examination results.

Classroom Assessment: Development-level rating justifications

1. System-level documents providing guidelines for classroom assessment include:
a. Assessment Standards document (authorized by the Bureau for Development of Education (BDE), Ministry of Education, 2008).
b. Assessment Criteria document (authorized by the Bureau for Development of Education (BDE), Ministry of Education, 2008). This document consists of a general description of marks (from 1-5) based on the percentage of outcomes expected to be achieved and the level of their complexity based on Bloom's taxonomy.
c. Teaching curricula documents by subject and grades (authorized by the Bureau for Development of Education (BDE), Ministry of Education, 2001-2010). These documents include teaching syllabi (for a subject and a grade) that provide short (half-page to one-page) guidelines related to assessment format, principles, and methods. In the documents developed after 2007 (syllabi for grades 1-9 of primary education), the guidelines are more detailed and address formative and summative assessment.
d. Guidelines for assessment in primary schools (authorized by the Bureau for Development of Education (BDE), Ministry of Education, 2008). These guidelines contain information on school-based assessment principles, standards for classroom assessment, and requirements and expectations of teacher competences.
e. Indicators for assessing quality of schools work (authorized by the State Educational Inspectorate (SEI), Ministry of Education, 2009).

2. The documents are available in public libraries, in teacher training colleges, and in in-service courses for teachers. The documents are also publicly available on the Ministry of Education websites:
a. Assessment Standards - http://www.bro.gov.mk/?q=standardi
b. Assessment Criteria - http://app.bro.gov.mk/dokumenti/kriteriumi/Kriteriumi_za_ocenuvanje.pdf
c. Teaching curricula by subject and grades - http://www.bro.gov.mk/?q=nastavni-programi
d. Guidelines for assessment in primary schools - http://toolbox.pep.org.mk/Files/ASSESSMENT%20STANDARDS%20-%20Guide.pdf (English version)
e. Indicators for assessing quality of schools work - http://www.mon.gov.mk/DPI/download/Indikatori_mk.pdf
Schools were informed by letters and/or during professional meetings and workshops when the documents were made available.

3. Resources for teachers for classroom assessment include:
a. A document that outlines what students are expected to learn in different subject areas at different grade/age levels (for example, the "Teaching curricula by subject and grades" documents).
b. A document that outlines the level(s) of performance that students are expected to reach in different subject areas at different grade/age levels (for example, the "Assessment Criteria" document).
c. Some textbooks or workbooks that provide support for classroom assessment, which were provided to teachers during a workshop that was part of a project to improve school-based assessment (2007-2011).
d. Scoring criteria or rubrics for students' work (which are available for some subjects or topics and have been developed as a result of projects related to improving classroom assessment).
e. Item banks or pools with examples of questions - there is a limited number of released items from different types of assessments (national or international) available in printed collections published by the NEC. An online item bank was recently developed (posted in June 2011). The number of items in this item bank is very limited: at the moment, there are 390 items in 12 primary subjects, 273 items from exams in 8 subjects in the 2010 State Matura, and 86 released items from the 2001 National Assessment in two subjects (mother tongue and mathematics).
f. Online assessment resources - there are some examples of good assessment practices or case studies on classroom assessment available on the Ministry of Education website.

4. The "Teaching curricula by subject and grades" documents specify what students are expected to learn, but do not specify to what performance level.

5. During pre-service teacher training, future teachers have minimal coursework on classroom assessment. In-service teacher training on classroom assessment has been administered on an ad-hoc basis. These trainings were organized by certified training providers, and schools and/or teachers could select a particular training from the training catalogue in an area of interest. Assessment has been frequently chosen, with 4,139 out of 26,938 teachers selecting classroom assessment as a topic. Ten training topics have been offered, with classroom assessment being one of the four most popular topics selected. From 2003 to 2005, most lower primary teachers (grades 1-3) were trained in classroom assessment, prompted by the new regulation on descriptive assessment (students' achievement is described using verbal/written descriptions instead of the numerical marks that were used before). From 2007 to 2011, as part of a project for improving school-based assessment, the majority (85 percent) of subject teachers (teachers who teach separate subjects in upper primary and secondary schools) participated in a series of workshops on classroom assessment that lasted four to six days. The trainings were organized once during 2007-2011 and consisted of two workshops: four days on formative assessment and two days on teachers' summative assessment.

6. It is very common to observe uneven application of standards for grading students' work, and grade inflation is a serious problem. Classroom assessment activities are commonly mainly about recalling information, teachers do not use explicit or a priori criteria for scoring or grading students' work, errors in the scoring or grading of students' work are common, and classroom assessment is mainly used as an administrative or control tool rather than as a pedagogical resource. On the other hand, it is not common for classroom assessment activities to rely mainly on multiple-choice, selection-type questions; parents are well informed of students' grades; classroom assessment information provides useful feedback to students; and classroom assessment practices are aligned with the pedagogical and curricular framework.

7. Mechanisms include that classroom assessment is a required component of a teacher's performance evaluation and of school inspection or teacher supervision.

8. Teachers are obligated to report on an individual student's performance to parents and the student at least four times during the school year. Schools are obligated to report aggregated data about students' performance to the school district and the Ministry of Education at the end of the school year.

9. The required uses of classroom assessment activities to promote and inform student learning include: diagnosing student learning issues, providing feedback to students on their learning, informing parents about their child's learning, planning next steps in instruction, and grading students for internal classroom uses.
THE FORMER YUGOSLAV REPUBLIC OF MACEDONIA
Examinations

ENABLING CONTEXT
Overall framework of policies, leadership, organizational structures, fiscal and human resources in which assessment activity takes place in a country or system and the extent to which that framework is conducive to, or supportive of, the assessment activity.

ENABLING CONTEXT 1: Setting clear policies
• Latent: No standardized examination has taken place.
• Emerging: The standardized examination has been operating on an irregular basis.
• Established: The examination is a stable program that has been operating regularly. *[1]
• Advanced: This option does not apply to this dimension.

• Latent: There is no policy document that authorizes the examination.
• Emerging: There is an informal or draft policy document that authorizes the examination.
• Established: There is a formal policy document that authorizes the examination. *[2]
• Advanced: This option does not apply to this dimension.

• Latent: This option does not apply to this dimension.
• Emerging: The policy document is not available to the public.
• Established: The policy document is available to the public. *[3]
• Advanced: This option does not apply to this dimension.

• Latent: This option does not apply to this dimension.
• Emerging: This option does not apply to this dimension.
• Established: The policy document addresses some key aspects of the examination.
• Advanced: The policy document addresses all key aspects of the examination. *[4]

ENABLING CONTEXT 2: Having strong leadership
• Latent: All stakeholder groups strongly oppose the examination or are indifferent to it.
• Emerging: Most stakeholder groups oppose the examination.
• Established: Most stakeholder groups support the examination. *[5]
• Advanced: All stakeholder groups support the examination.

• Latent: There are no attempts to improve the examination by stakeholder groups.
• Emerging: This option does not apply to this dimension.
• Established: There are independent attempts to improve the examination by stakeholder groups. *[6]
• Advanced: There are coordinated attempts to improve the examination by stakeholder groups.

• Latent: Efforts to improve the examination are not welcomed by the leadership in charge of the examination.
• Emerging: This option does not apply to this dimension.
• Established: Efforts to improve the examination are generally welcomed by the leadership in charge of the examination. *[7]
• Advanced: This option does not apply to this dimension.

ENABLING CONTEXT 3: Having regular funding
• Latent: There is no funding allocated for the examination.
• Emerging: There is irregular funding allocated for the examination.
• Established: There is regular funding allocated for the examination. *[8]
• Advanced: This option does not apply to this dimension.

• Latent: This option does not apply to this dimension.
• Emerging: Funding covers some core examination activities: design, administration, data processing or reporting.
• Established: Funding covers all core examination activities: design, administration, data processing and reporting. *[9]
• Advanced: This option does not apply to this dimension.

• Latent: This option does not apply to this dimension.
• Emerging: Funding does not cover research and development. *[10]
• Established: This option does not apply to this dimension.
• Advanced: Funding covers research and development.

ENABLING CONTEXT 4: Having strong organizational structures
• Latent: The examination office does not exist or is newly established.
• Emerging: The examination office is newly established.
• Established: The examination office is a stable organization. *[11]
• Advanced: This option does not apply to this dimension.

• Latent: The examination office is not accountable to an external board or agency.
• Emerging: This option does not apply to this dimension.
• Established: The examination office is accountable to an external board or agency. *[12]
• Advanced: This option does not apply to this dimension.

• Latent: Examination results are not recognized by any certification or selection system.
• Emerging: Examination results are recognized by a certification or selection system in the country.
• Established: Examination results are recognized by one certification or selection system in another country.
• Advanced: Examination results are recognized by two or more certification or selection systems in another country. *[13]

• Latent: The examination office does not have the required facilities to carry out the examination.
• Emerging: The examination office has some of the required facilities to carry out the examination.
• Established: The examination office has all of the required facilities to carry out the examination.
• Advanced: The examination office has state of the art facilities to carry out the examination. *[14]
ENABLING CONTEXT 5: Having effective human resources
• Latent: There is no staff to carry out the examination.
• Emerging: The examination office is inadequately staffed to effectively carry out the examination; issues are pervasive.
• Established: The examination office is adequately staffed to carry out the examination effectively, with minimal issues. *[15]
• Advanced: The examination office is adequately staffed to carry out the assessment effectively, with no issues.

• Latent: The country does not offer opportunities that prepare for work on the examination.
• Emerging: This option does not apply to this dimension.
• Established: The country offers some opportunities that prepare for work on the examination. *[16]
• Advanced: The country offers a wide range of opportunities that prepare for work on the examination.

SYSTEM ALIGNMENT
Degree to which the assessment is coherent with other components of the education system.

SYSTEM ALIGNMENT 1: Aligning examinations with learning goals and opportunities to learn
• Latent: It is not clear what the examination measures.
• Emerging: This option does not apply to this dimension.
• Established: There is a clear understanding of what the examination measures. *[17]
• Advanced: This option does not apply to this dimension.

• Latent: What the examination measures is questioned by some stakeholder groups.
• Emerging: This option does not apply to this dimension.
• Established: What is measured by the examination is largely accepted by stakeholder groups. *[18]
• Advanced: This option does not apply to this dimension.

• Latent: Material to prepare for the examination is minimal and it is only accessible to very few students.
• Emerging: There is some material to prepare for the examination that is accessible to some students.
• Established: There is comprehensive material to prepare for the examination that is accessible to most students. *[19]
• Advanced: There is comprehensive material to prepare for the examination that is accessible to all students.

SYSTEM ALIGNMENT 2: Providing teachers with opportunities to learn about the examination
• Latent: There are no courses or workshops on examinations available to teachers.
• Emerging: There are no up-to-date courses or workshops on examinations available to teachers. *[20]
• Established: There are up-to-date voluntary courses or workshops on examinations available to teachers.
• Advanced: There are up-to-date compulsory courses or workshops on examinations for teachers.
• Latent: Teachers are excluded from all examination-related tasks.
• Emerging: Teachers are involved in very few examination-related tasks.
• Established: Teachers are involved in some examination-related tasks. *[21]
• Advanced: Teachers are involved in most examination-related tasks.

ASSESSMENT QUALITY
Degree to which the assessment meets quality standards, is fair, and is used in an effective way.

ASSESSMENT QUALITY 1: Ensuring quality
• Latent: There is no technical report or other documentation.
• Emerging: There is some documentation on the examination, but it is not in a formal report format. *[22]
• Established: There is a comprehensive technical report but with restricted circulation.
• Advanced: There is a comprehensive, high quality technical report available to the general public.

• Latent: There are no mechanisms in place to ensure the quality of the examination.
• Emerging: This option does not apply to this dimension.
• Established: There are limited systematic mechanisms in place to ensure the quality of the examination. *[23]
• Advanced: There are varied and systematic mechanisms in place to ensure the quality of the examination.

ASSESSMENT QUALITY 2: Ensuring fairness
• Latent: Inappropriate behavior surrounding the examination process is high.
• Emerging: Inappropriate behavior surrounding the examination process is moderate.
• Established: Inappropriate behavior surrounding the examination process is low. *[24]
• Advanced: Inappropriate behavior surrounding the examination process is marginal.

• Latent: The examination results lack credibility for all stakeholder groups.
• Emerging: The examination results are credible for some stakeholder groups.
• Established: The examination results are credible for all stakeholder groups. *[25]
• Advanced: This option does not apply to this dimension.

• Latent: The majority of the students (over 50%) may not take the examination because of language, gender, or other equivalent barriers.
• Emerging: A significant proportion of students (10%-50%) may not take the examination because of language, gender, or other equivalent barriers.
• Established: A small proportion of students (less than 10%) may not take the examination because of language, gender, or other equivalent barriers.
• Advanced: All students can take the examination; there are no language, gender or other equivalent barriers. *[26]

ASSESSMENT QUALITY 3: Using examination information in a fair way
• Latent: Examination results are not used in a proper way by all stakeholder groups.
• Emerging: Examination results are used by some stakeholder groups in a proper way.
• Established: Examination results are used by most stakeholder groups in a proper way.
• Advanced: Examination results are used by all stakeholder groups in a proper way.
[27] Insufficient information was available to determine the relevant selection in this row.

• Latent: Student names and results are public.
• Emerging: This option does not apply to this dimension.
• Established: Students' results are confidential. *[28]
• Advanced: This option does not apply to this dimension.

ASSESSMENT QUALITY 4: Ensuring positive consequences of the examination
• Latent: There are no options for students who do not perform well on the examination, or students must leave the education system.
• Emerging: There are very limited options for students who do not perform well on the examination.
• Established: There are some options for students who do not perform well on the examination. *[29]
• Advanced: There is a variety of options for students who do not perform well on the examination.

• Latent: There are no mechanisms in place to monitor the consequences of the examination. *[30]
• Emerging: This option does not apply to this dimension.
• Established: There are some mechanisms in place to monitor the consequences of the examination.
• Advanced: There is a variety of mechanisms in place to monitor the consequences of the examination.
30. LATENT: There are no mechanisms in place to monitor the consequences of the examination. EMERGING: This option does not apply to this dimension. ESTABLISHED: There are some mechanisms in place to monitor the consequences of the examination. ADVANCED: There is a variety of mechanisms in place to monitor the consequences of the examination.

Examinations: Development-level rating justifications

1. The State Matura has been administered every year since 2008 (four times). It is administered at grade 12 and is used for certifying students' grade completion, for selection to university and other higher education institutions, for monitoring education quality levels, and for planning education policy reforms. Starting in 2007, students have had to take four exams: mother tongue and three subjects selected from a list of 12 subjects. Two of the exams are administered at the central level, and two are administered by a student's school. Currently, five subjects in total are administered at the central level, and seven are administered by schools. Under the next phase of the Matura exam, which was expected to begin in 2010 but has been postponed to 2013, all of the exams on the list of 12 optional subjects will be administered at the central level.

2. Documents include:
a. Rulebook for taking Matura exams and assessment of students' results on the exams on Matura for Gymnasium and Secondary Vocational Education (authorized by the Ministry of Education and Science, 2007 and 2010 (revised)).
b. Concept paper for Matura and Final Exam in four years Secondary Education (authorized by the State Matura Committee and the Ministry of Education and Science, 2005 and 2010 (revised)).
c. Exams' syllabi. For each teaching subject, these documents provide the purpose of the exam, the exam content, the types of items, and the test specification.
d. Procedures for the organization and administration of the external exams in the State Matura (an internal and confidential NEC document).

3. All documents have been published and distributed to all secondary schools. They are also available online. Documents include:
a. Rulebook for taking Matura exams and assessment of students' results on the exams on Matura for Gymnasium and Secondary Vocational Education: http://www.slvesnik.com.mk/Issues/58AC759457D37C428A95285484B3C4CA.pdf p.23
b. Concept paper for Matura and Final Exam in four years Secondary Education: http://www.matura.gov.mk/data_files/state_graduate/mk/5115_Koncepcija.pdf (in Macedonian)
c. Exams' syllabi: http://www.matura.gov.mk/documents.aspx?language=MK&page=O6dtQQpiV3o=
4. The policy documents cover various aspects of the examination. The Law for the National Examination Centre, the Rulebook for taking Matura exams and assessment of students' results on the exams on Matura for Gymnasium and Secondary Vocational Education, the Guidance for the Matura exam, the Guidance for the local coordinators, and the Guidance for the test administrator outline governance and the distribution of powers and responsibilities among key entities. The Concept paper for Matura and Final Exam in four years Secondary Education and the Amendments to the Law for Secondary Education (Article 27) describe the purpose of the examination. The Concept paper and the Law on High Education describe the authorized uses of results. The Rulebook specifies who can sit for the examination, outlines procedures for special/disadvantaged students, and outlines procedures to investigate and address security breaches, cheating, or other forms of inappropriate behavior. The procedures for the organization and administration of the external exams in the State Matura (an internal and confidential NEC document) identify rules about preparation, and the Exams' syllabi explain the alignment with curricula and standards and the format of the examination questions.

5. Policymakers and employers strongly support the Matura exams. Introducing the Matura exams was the result of a joint effort by policymakers and professionals. Educators, students, parents, and universities support the Matura exams as well. BDE advisors have taken part in the development of the documents for the Matura (the Concept for Matura and the exam syllabi), the development of test items, and the monitoring of Matura administration. Most teachers support the Matura exam because it motivates students to learn more. As communicated by the media, teachers and school principals support the Matura. While students initially strongly opposed the examination (there were student demonstrations against the Matura and negotiations with student associations), they are now supportive of it or neutral. Parents were initially confused about the Matura or opposed it; they now support it because the Matura is lower stakes than the university entrance exams used to be. Universities were consulted and involved in the process of developing the Concept for Matura, and they agreed to replace their entrance exams with the results of the Matura. The presidents of all subjects' Matura committees are university professors. The universities have not raised issues about the validity of the Matura exam. The media is generally neutral toward the Matura but is usually critical of the Matura results. Teacher unions, think tanks, and NGOs have not expressed an opinion on the Matura.

6. During the development of the Concept for the Matura examination, universities submitted proposals to improve the Concept for the examination. After approval of the Concept, educators and policymakers took the initiative to:
a. develop a State Matura exam for art schools;
b. introduce two levels of the exam in mathematics for the State Matura;
c. introduce a second examination session for students who failed the State Matura in the first session (the first examination session is in June, the second in August); and
d. harmonize the content of the Mother Tongue exam in VET schools with the exam in general secondary schools.

7. There were no attempts to improve the Matura exam that were not well received by the leadership in charge of the exam.

8. Funding for the State Matura is provided by the government to the National Examination Centre.

9. Funding also covers staff training.

10. Funding does not cover research and development.

11. The National Examination Centre (NEC) was established as a semi-independent agency in 2009. The agency used to be a unit within the Bureau for Development of Education (BDE).

12. NEC is accountable to the State Matura Board, an external body established by the Ministry of Education. The Board consists of representatives from the Ministry of Education and Science, the universities, the National Examination Centre, the Bureau for Development of Education, and the VET Centre.

13. The examination results are officially recognized in the country and by more than one certification and selection system abroad, including in Slovenia, the UK, Bulgaria, Albania, and Serbia.

14. The examination office has computers for all technical staff, a secure building, secure storage facilities, access to adequate computer servers, the ability to back up data, and adequate communication (telephone, email, internet) tools.

15. There is permanent or full-time staff, but it is insufficient to meet the needs of the examination. Specifically, NEC lacks subject specialists for some mother tongue (Turkish) and foreign (Russian, French) languages, but this issue has been addressed by using BDE subject specialists. Subject specialists are members of the subject's Matura committee, provide technical support to test developers and test markers, and are responsible for the timely and secure preparation of exam papers. There have been some indications of poor exam monitoring: some test administrators did not follow the test administration rules and were not sanctioned for allowing students to engage in improper behavior.

16. Non-university training courses or workshops on educational measurement and evaluation were provided as part of projects related to establishing the examination system, mainly by donors. While these opportunities are still in place, they are rarely used due to limited financing. University professors and certified training providers offer courses on educational measurement and evaluation, and schools have to pay for their teachers' participation or the teachers themselves have to pay. A limited number of schools can provide these trainings for their teachers.

17. The Matura exam is tied to the national curricula and standards.

18. The results from the Matura exam are used by universities for admission as well as by employers.

19. The materials needed to prepare for the examination are widely accessible to all students (over 90 percent) in a variety of learning contexts. The materials needed to prepare for the Matura exams are the regular textbooks used during secondary education. Textbooks, exam items and marking schemes from previous exams, and user-friendly guidance for taking the Matura are available for free to all students in all languages of instruction for compulsory and elective subjects.
Tests from previous exams are available online, information on how to prepare for the examination is available in the form of a booklet distributed to students, and the framework documents explaining what is measured on the examination (the "Exams' syllabi") are also made available.

20. There are voluntary courses or workshops, but they are not regularly updated. Workshops for State Matura administration and test marking are organized every year and are regularly updated; however, these courses are available only to the test administrators (approximately 2,000 people) and test markers (approximately 320 people). Workshops related to separate subjects (content, requirements) are organized from time to time, and university professors and certified training providers carry out some of these workshops. Schools have to pay for their teachers' participation or the teachers themselves must pay. A limited number of schools can provide access to the workshops for their teachers. Courses for teachers of centrally developed Matura subjects are offered by NEC and BDE for free.

21. Teachers create examination questions for an item pool for each centrally developed compulsory and elective exam. Test item developers are usually invited based on their previous experience in curriculum and exam syllabi development, their experience as test markers, and their reputation as good teachers. Teachers serve as test administrators in schools other than their own, according to a schedule developed by NEC. Teachers also score the exam.

22. There is some documentation about the technical aspects of the examination, but it is not in a formal report format. Comprehensive technical reports were prepared only during the piloting phase.

23. Items and administration procedures were piloted during the preparatory phase of the Matura exam. Technical personnel from NEC review item pools and select the best items. The president of the Matura committee for each exam (a university professor) makes the final selection of the items that will be included in the exam. Selection is based on the quality of the test items and their relevance to the test specification (which is part of the exam syllabi). There are attempts to maintain the difficulty levels of the Matura exams from one year to the next.

24. Inappropriate behaviors that diminish the credibility of the examination and typically occur during the examination process include copying from other candidates; collusion among candidates via mobile phones, passing of papers, or the equivalent; and the provision of external assistance via the supervisor, mobile phone, etc. According to the rulebook on test administration, anyone caught cheating is warned, and if the same person continues cheating, that person is disqualified. If copying is proven on open-ended and essay-type items, those items are marked with 0 points. The new rule of conduct (2010) identifies penalties for test administrators and supervisors who engage in or support students' improper behavior.

25. All universities (state and private) use exam results as a selection criterion. Employers also accept the results as valid and reliable.

26. All students may take the examination, regardless of background (e.g., gender, ethnic group), location (e.g., urban, rural), ability to pay (e.g., transportation, fees), or the like.

27. There is no systematic evidence of improper use of the results.
28. Students' results on the Matura are displayed at the school, and everyone can see them. Making the results public is considered part of the strategy for the transparency of the examination.

29. Students who do not perform well on the examination may retake it once in the same school year (in August) and in subsequent school years in two terms (June and August). Students may also opt for less selective schools/universities/tracks and can repeat the grade.

30. There are no mechanisms in place to monitor the consequences of the examination.

THE FORMER YUGOSLAV REPUBLIC OF MACEDONIA
National (or System-Level) Large-Scale Assessment (NLSA)

ENABLING CONTEXT
Overall framework of policies, leadership, organizational structures, fiscal and human resources in which NLSA activity takes place in a country or system and the extent to which that framework is conducive to, or supportive of, the NLSA activity.

ENABLING CONTEXT 1: Setting clear policies for NLSA

1. LATENT: No NLSA exercise has taken place. EMERGING: The NLSA has been operating on an irregular basis. ESTABLISHED: The NLSA is a stable program that has been operating regularly. ADVANCED: This option does not apply to this dimension.

2. LATENT: There is no policy document pertaining to NLSA. EMERGING: There is an informal or draft policy document that authorizes the NLSA. ESTABLISHED: There is a formal policy document that authorizes the NLSA. ADVANCED: This option does not apply to this dimension.

3. LATENT: This option does not apply to this dimension. EMERGING: The policy document is not available to the public. ESTABLISHED: The policy document is available to the public. ADVANCED: This option does not apply to this dimension.

4. LATENT: There is no plan for NLSA activity. EMERGING: This option does not apply to this dimension. ESTABLISHED: There is a general understanding that the NLSA will take place. ADVANCED: There is a written NLSA plan for the coming years.

ENABLING CONTEXT 2: Having strong public engagement for NLSA

5. LATENT: All stakeholder groups strongly oppose the NLSA or are indifferent to it. EMERGING: Some stakeholder groups oppose the NLSA. ESTABLISHED: Most stakeholder groups support the NLSA. ADVANCED: All stakeholder groups support the NLSA.

ENABLING CONTEXT 3: Having regular funding for NLSA

6. LATENT: There is no funding allocated to the NLSA. EMERGING: There is irregular funding allocated to the NLSA. ESTABLISHED: There is regular funding allocated to the NLSA. ADVANCED: This option does not apply to this dimension.

7. LATENT: This option does not apply to this dimension. EMERGING: Funding covers some core NLSA activities: design, administration, analysis, and reporting. ESTABLISHED: Funding covers all core NLSA activities: design, administration, analysis, and reporting. ADVANCED: This option does not apply to this dimension.

8. LATENT: This option does not apply to this dimension. EMERGING: Funding does not cover research and development activities. ESTABLISHED: This option does not apply to this dimension. ADVANCED: Funding covers research and development activities.

ENABLING CONTEXT 4: Having strong organizational structures for NLSA

9. LATENT: There is no NLSA office, ad hoc unit, or team. EMERGING: The NLSA office is a temporary agency or group of people. ESTABLISHED: The NLSA office is a permanent agency, institution, or unit. ADVANCED: This option does not apply to this dimension.
10. LATENT: This option does not apply to this dimension. EMERGING: Political considerations regularly hamper technical considerations. ESTABLISHED: Political considerations sometimes hamper technical considerations. ADVANCED: Political considerations never hamper technical considerations.

11. LATENT: This option does not apply to this dimension. EMERGING: The NLSA office is not accountable to a clearly recognized body. ESTABLISHED: The NLSA office is accountable to a clearly recognized body. ADVANCED: This option does not apply to this dimension.

ENABLING CONTEXT 5: Having effective human resources for NLSA

12. LATENT: There is no staff allocated for running an NLSA. EMERGING: The NLSA office is inadequately staffed to effectively carry out the assessment. ESTABLISHED: The NLSA office is adequately staffed to carry out the NLSA effectively, with minimal issues. ADVANCED: The NLSA office is adequately staffed to carry out the NLSA effectively, with no issues.

13. LATENT: The country does not offer opportunities that prepare individuals for work on the NLSA. EMERGING: This option does not apply to this dimension. ESTABLISHED: The country offers some opportunities to prepare individuals for work on the NLSA. ADVANCED: The country offers a wide range of opportunities to prepare individuals for work on the NLSA.

SYSTEM ALIGNMENT
Degree to which the NLSA is coherent with other components of the education system.

SYSTEM ALIGNMENT 1: Aligning the NLSA with learning goals

14. LATENT: It is not clear if the NLSA is based on curriculum or learning standards. EMERGING: This option does not apply to this dimension. ESTABLISHED: The NLSA measures performance against curriculum or learning standards. ADVANCED: This option does not apply to this dimension.

15. LATENT: What the NLSA measures is generally questioned by stakeholder groups. EMERGING: This option does not apply to this dimension. ESTABLISHED: What the NLSA measures is questioned by some stakeholder groups. ADVANCED: What the NLSA measures is largely accepted by stakeholder groups.

16. LATENT: There are no mechanisms in place to ensure that the NLSA accurately measures what it is supposed to measure. EMERGING: There are ad hoc reviews of the NLSA to ensure that it measures what it is intended to measure. ESTABLISHED: There are regular internal reviews of the NLSA to ensure that it measures what it is intended to measure. ADVANCED: This option does not apply to this dimension.

SYSTEM ALIGNMENT 2: Providing teachers with opportunities to learn about the NLSA

17. LATENT: There are no courses or workshops on the NLSA. EMERGING: There are occasional courses or workshops on the NLSA. ESTABLISHED: There are some courses or workshops on the NLSA offered on a regular basis. ADVANCED: There are widely available, high quality courses or workshops on the NLSA offered on a regular basis.

ASSESSMENT QUALITY
Degree to which the NLSA meets technical standards, is fair, and is used in an effective way.

ASSESSMENT QUALITY 1: Ensuring the quality of the NLSA

18. LATENT: No options are offered to include all groups of students in the NLSA. EMERGING: This option does not apply to this dimension. ESTABLISHED: At least one option is offered to include all groups of students in the NLSA. ADVANCED: Different options are offered to include all groups of students in the NLSA.
19. LATENT: There are no mechanisms in place to ensure the quality of the NLSA. EMERGING: This option does not apply to this dimension. ESTABLISHED: There are some mechanisms in place to ensure the quality of the NLSA. ADVANCED: There are a variety of mechanisms in place to ensure the quality of the NLSA.

20. LATENT: There is no technical report or other documentation about the NLSA. EMERGING: There is some documentation about the technical aspects of the NLSA, but it is not in a formal report format. ESTABLISHED: There is a comprehensive technical report but with restricted circulation. ADVANCED: There is a comprehensive, high quality technical report available to the general public.

ASSESSMENT QUALITY 2: Ensuring effective uses of the NLSA

21. LATENT: NLSA results are not disseminated. EMERGING: NLSA results are poorly disseminated. ESTABLISHED: NLSA results are disseminated in an effective way. ADVANCED: This option does not apply to this dimension.

22. LATENT: NLSA information is not used or is used in ways inconsistent with the purposes or the technical characteristics of the assessment. EMERGING: This option does not apply to this dimension. ESTABLISHED: NLSA results are used by some stakeholder groups in a way that is consistent with the purposes and technical characteristics of the assessment. ADVANCED: NLSA information is used by all stakeholder groups in a way that is consistent with the purposes and technical characteristics of the assessment.

23. LATENT: There are no mechanisms in place to monitor the consequences of the NLSA. EMERGING: This option does not apply to this dimension. ESTABLISHED: There are some mechanisms in place to monitor the consequences of the NLSA. ADVANCED: There are a variety of mechanisms in place to monitor the consequences of the NLSA.

National (or System-Level) Large-Scale Assessment (NLSA): Development-level rating justifications

1. The External Assessment of Students' Achievement in Primary and Secondary Education was piloted in 2010 (in 27 of 95 secondary schools, assessing Mathematics at grade 9) and in 2011 (in 14 of 343 primary schools, assessing Social Sciences at grade 4 and Chemistry at grade 7, and in 11 of 95 secondary schools, assessing Mother Tongue (Macedonian, Albanian, Turkish) at grade 10 and Business at grade 11).

2. Formal policy documents include:
a. Law on Primary Education and its amendments, art. 54 and 71 (authorized by the Ministry of Education and Science, 2008 (Official Gazette No. 103, 19.08.2008, p. 2) and amendments 2010 (Official Gazette No. 33, 09.03.2010, p. 10))
b. Law on Secondary Education and its amendments, art. 45a, 45b, 45v, 56, and 70 (authorized by the Ministry of Education and Science, 2008 (Official Gazette No. 92, 22.07.2008, p. 11) and amendments 2010 (Official Gazette No. 33, 09.03.2010, p. 2))
c. Rule on Organization and Administration of the External Assessment of Students in Primary Schools, Establishing and Work of the School Committees, Confidentiality of the Testing Materials, Verification of the Testing Materials by the School Committee and the Template and Content of the Report (2010)
d. Rule on Organization and Administration of the External Assessment of Students in Secondary Schools, Establishing and Work of the School Committees, Confidentiality of the Testing Materials, Verification of the Testing Materials by the School Committee and the Template and Content of the Report (2010)

3. The laws and rules are available online and in hard copy.
4. While there is no large-scale assessment plan for the coming years or future assessment rounds, and the External Assessment of Students' Achievement in Primary and Secondary Education is still in the pilot phase, the MoES has announced that the assessment will be launched in the next school year (2012/13).

5. Independent efforts have been made by different stakeholder groups to reform the External Assessment of Students' Achievement in Primary and Secondary Education. Individuals have made motions to the Constitutional Court, challenging particular articles in the laws (provisions related to the division of responsibilities between the education institutions and to the idea that the grades students receive on the External Assessment would influence their summative final/annual grades). As a result, the MoES has revised the laws. Specifically:
a. A new institution, the National Examination Center, which is responsible for the administration of the assessment, was established.
b. In the new bylaw, there is no longer the option of using External Assessment results to influence students' grades.
While policymakers strongly support the External Assessment of Students' Achievement in Primary and Secondary Education, educators, students, and parents oppose it, and donors strongly oppose it. For example, the World Bank team has advised that the Ministry of Education should reconsider the decision to introduce a massive external assessment, noting that data from assessments of students' learning should be used primarily to direct responses toward improving teaching and learning. Similar suggestions were given by the USAID-financed Primary Education Project, which noted that the stakes of the assessment for teachers and the format of the assessment instruments (only multiple-choice items) force teaching to the test and hamper the Project's efforts to improve assessment for learning.

6. There is regular (continuous and predictable) funding allocated by the government. Test item development and administration of the pilot have been funded so far. The Ministry will allocate sufficient funds in the budget for the next fiscal year.

7. Because the assessment is still in the pilot phase, the funds allocated so far have been for the piloting of the assessment. Funding for full implementation will cover assessment administration, data analysis, data reporting, staff training, the development of software, and the establishment of a system for online testing.

8. Funding will not cover research and development activities.

9. According to the Laws on Primary and Secondary Education and the Law on the National Examination Center (NEC), the NEC is responsible for the administration of the assessment, while the Bureau for Development of Education (BDE) and the VET Center are responsible for item development.

10. Policymakers are concerned about grade inflation and the low reliability of students' grades, and they intend to use data on the reliability of teachers' grading as an indicator for teacher evaluation in order to push teachers to improve the reliability of their assessment practices. Professionals have shared with the Ministry their suggestions on the purpose of the External Assessment, the test format, sampling, and the issue of validity, particularly given the intended use of the assessment results (for evaluating students and teachers).
Policymakers have not taken most of the professionals' suggestions into consideration.

11. The legal framework for the External Assessment does not regulate the accountability of the NEC, and it is not clear to whom the NEC is accountable.

12. There is permanent or full-time staff, but it will be insufficient to meet the needs of the External Assessment once it is implemented after the pilot phase. The Ministry plans to finance part-time staff during the administration and data processing periods every year. Issues identified with the performance of the human resources responsible for the External Assessment include the omission of curricular topics and weaknesses in test design (only multiple-choice items have been introduced on the External Assessment, and therefore some domains, such as writing, practical experimentation, and oral exams, were not included in the test design; the External Assessment also does not measure higher-order cognitive skills).

13. No opportunities have been offered on an annual basis. NEC staff received high quality training in item development, test design, and exam administration from CITO and Anglia Assessment experts. This technical assistance was provided during the period when NEC was being established (2002-2006). Some training was organized for item writers before the first pilot of the External Assessment. According to participants, the courses were of satisfactory quality, but participants expressed a need for more hands-on training. Trainings have also been made available through projects for establishing the external assessment system and through ad hoc trainings for test item developers.

14. The External Assessment measures the objectivity of teachers' grading; the main purpose of the assessment is to evaluate that objectivity. It is intended that each student's results on the External Assessment will be reported in a Diploma Supplement (DS). The DS will be used as an additional criterion for the selection of students into the next cycle of the education system, i.e., from primary to secondary education. This is not yet regulated in any official document.

15. Education professionals have challenged the validity of the External Assessment given its intended use. The majority of teachers have expressed doubts that this assessment will contribute to enhancing the quality of teaching and learning.

16. There are no mechanisms in place to ensure that the External Assessment accurately measures what it is supposed to measure.

17. There are no teacher training courses, workshops, or presentations on the External Assessment (e.g., domains measured, how to read and use results) offered in the country. The External Assessment is currently in the pilot phase.

18. It is planned for the External Assessment to be offered in all four languages of instruction in FYR Macedonia.

19. Mechanisms to ensure the quality of the External Assessment include a standardized manual for assessment administrators, and a pilot is conducted before the main data collection takes place.

20. There is some documentation about the technical aspects of the assessment, but it is not in a formal report format and is not available to the public.

21. Results are featured in newspapers, magazines, radio, or television. The External Assessment is currently in the pilot phase.
22. The External Assessment is currently in the pilot phase.

23. Mechanisms to monitor the consequences of the External Assessment have not yet been developed.

THE FORMER YUGOSLAV REPUBLIC OF MACEDONIA
International Large-Scale Assessment (ILSA)

ENABLING CONTEXT
Overall framework of policies, leadership, organizational structures, fiscal and human resources in which ILSA takes place in a country or system and the extent to which that framework is conducive to, or supportive of, ILSA activity.

ENABLING CONTEXT 1: Setting clear policies for ILSA

1. LATENT: The country/system has not participated in an ILSA in the last 10 years. EMERGING: This option does not apply to this dimension. ESTABLISHED: The country/system has participated in at least one ILSA in the last 10 years. ADVANCED: The country/system has participated in two or more ILSA in the last 10 years.

2. LATENT: The country/system has not taken concrete steps to participate in an ILSA in the next 5 years. EMERGING: This option does not apply to this dimension. ESTABLISHED: The country/system has taken concrete steps to participate in at least one ILSA in the next 5 years. ADVANCED: This option does not apply to this dimension.

3. LATENT: There is no policy document that addresses participation in ILSA. EMERGING: There is an informal or draft policy document that addresses participation in ILSA. ESTABLISHED: There is a formal policy document that addresses participation in ILSA. ADVANCED: This option does not apply to this dimension.

LATENT: This option does not apply to this dimension. EMERGING: The policy document is not available to the public. ESTABLISHED: The policy document is available to the public. ADVANCED: This option does not apply to this dimension.

ENABLING CONTEXT 2: Having regular funding for ILSA

4. LATENT: There is no funding for participation in ILSA. EMERGING: There is funding from loans or external donors. ESTABLISHED: There is regular funding allocated at discretion. ADVANCED: There is regular funding approved by law, decree, or norm.

5. LATENT: This option does not apply to this dimension. EMERGING: Funding covers some core activities of the ILSA. ESTABLISHED: Funding covers all core activities of the ILSA. ADVANCED: Funding covers all core activities of the ILSA.

6. LATENT: Funding does not cover research and development activities. EMERGING: This option does not apply to this dimension. ESTABLISHED: This option does not apply to this dimension. ADVANCED: Funding covers research and development activities.

ENABLING CONTEXT 3: Having effective human resources for ILSA

7. LATENT: There is no team or national/system coordinator to carry out the ILSA activities. EMERGING: There is a team or national/system coordinator to carry out the ILSA activities. ESTABLISHED: There is a team and national/system coordinator to carry out the ILSA activities. ADVANCED: This option does not apply to this dimension.

8. LATENT: This option does not apply to this dimension. EMERGING: The national/system coordinator or other designated team member may not be fluent in the language of the assessment. ESTABLISHED: The national/system coordinator is fluent in the language of the assessment. ADVANCED: This option does not apply to this dimension.
9. LATENT: This option does not apply to this dimension. EMERGING: The ILSA office is inadequately staffed or trained to carry out the assessment effectively. ESTABLISHED: The ILSA office is adequately staffed or trained to carry out the ILSA effectively, with minimal issues. ADVANCED: The ILSA office is adequately staffed and trained to carry out the ILSA effectively, with no issues.

SYSTEM ALIGNMENT
Degree to which the ILSA meets technical quality standards, is fair, and is used in an effective way.

SYSTEM ALIGNMENT 1: Providing opportunities to learn about ILSA

10. LATENT: The ILSA team has not attended international workshops or meetings. EMERGING: The ILSA team attended some international workshops or meetings. ESTABLISHED: The ILSA team attended all international workshops or meetings. ADVANCED: This option does not apply to this dimension.

11. LATENT: The country/system offers no opportunities to learn about ILSA. EMERGING: This option does not apply to this dimension. ESTABLISHED: The country/system offers some opportunities to learn about ILSA. ADVANCED: The country/system offers a wide range of opportunities to learn about ILSA.

12. LATENT: This option does not apply to this dimension. EMERGING: This option does not apply to this dimension. ESTABLISHED: Opportunities to learn about ILSA are available to the country's/system's ILSA team members only. ADVANCED: Opportunities to learn about ILSA are available to a wide audience, in addition to the country's/system's ILSA team members.

ASSESSMENT QUALITY
Degree to which the ILSA meets technical quality standards, is fair, and is used in an effective way.

ASSESSMENT QUALITY 1: Ensuring the quality of ILSA

13. LATENT: Data from the ILSA has not been published. EMERGING: The country/system met sufficient standards to have its data presented beneath the main display of the international report or in an annex. ESTABLISHED: The country/system met all technical standards required to have its data presented in the main displays of the international report. ADVANCED: The country/system met all technical standards required to have its data presented in the main displays of the international report.

14. LATENT: The country/system has not contributed new knowledge on ILSA. EMERGING: This option does not apply to this dimension. ESTABLISHED: This option does not apply to this dimension. ADVANCED: The country/system has contributed new knowledge on ILSA.

ASSESSMENT QUALITY 2: Ensuring effective uses of ILSA

15. LATENT: If any, country/system-specific results and information are not disseminated in the country/system. EMERGING: Country/system-specific results and information are disseminated irregularly in the country/system. ESTABLISHED: Country/system-specific results and information are regularly disseminated in the country/system. ADVANCED: Country/system-specific results and information are regularly and widely disseminated in the country/system.

16. LATENT: Products to provide feedback to schools and educators about the ILSA results are not made available. EMERGING: This option does not apply to this dimension. ESTABLISHED: Products to provide feedback to schools and educators about the ILSA results are sometimes made available. ADVANCED: Products to provide feedback to schools and educators about ILSA results are systematically made available.

17. LATENT: There is no media coverage of the ILSA results. EMERGING: There is limited media coverage of the ILSA results. ESTABLISHED: There is some media coverage of the ILSA results. ADVANCED: There is wide media coverage of the ILSA results.
18. LATENT: If any, country/system-specific results and information from the ILSA are not used to inform decision making in the country/system. EMERGING: Results from the ILSA are used in a limited way to inform decision making in the country/system. ESTABLISHED: Results from the ILSA are used in some ways to inform decision making in the country/system. ADVANCED: Results from the ILSA are used in a variety of ways to inform decision making in the country/system.

19. LATENT: It is not clear that decisions based on ILSA results have had a positive impact on students' achievement levels. EMERGING: This option does not apply to this dimension. ESTABLISHED: This option does not apply to this dimension. ADVANCED: Decisions based on the ILSA results have had a positive impact on students' achievement levels.

International Large-Scale Assessment (ILSA): Development-level rating justifications

1. FYR Macedonia has participated in PIRLS (2001, 2006), TIMSS (1999, 2003, and 2011), and PISA (2000, and PISA Plus in 2001). This rubric is completed with regard to FYR Macedonia's participation in TIMSS 2011.

2. There is no official document or plan for future participation in an ILSA, although the Ministry of Education and Science (MoES) has expressed an intention to participate in PISA 2015 and TIMSS 2015.

3. The decision for FYR Macedonia to participate in an ILSA has been made on an ad hoc basis.

4. Funding for the participation fee was provided by international agencies (USAID, UNICEF) and by the MoES using World Bank loan proceeds (under the "Education Modernization Project"). Most of the work related to administration and marking was done by teachers on a voluntary basis. Test desktop preparation, data entry, and data cleaning were done by the National Examination Centre (these activities are part of its official responsibilities). Translation and printing were paid for by the MoES. Participation in the international expert meetings was covered by the MoES from the Ministry's discretionary funding and by UNICEF.

5. Funding covered international participation fees, implementation of the assessment exercise in the country, and attendance at international expert meetings for the assessment exercise. Processing and analyzing the data collected from the assessment exercise and reporting and disseminating the assessment results in the country were conducted by NEC (as part of its official responsibilities).

6. Funding does not cover research and development.

7. The team comprises the national coordinator, an employee of NEC assigned to that role by the Ministry of Education and Science, and other NEC staff. All of the team members have gained experience working on ILSAs over the last ten years. The national coordinator is the only person officially nominated to the ILSA team; other members of the team work on ILSA in addition to their regular jobs in NEC. Subject specialists for the sciences are teachers who are engaged when necessary, at times on a voluntary basis.

8. The national coordinator is fluent in the language of the assessment.

9. Only the national coordinator participated in meetings from the time that FYR Macedonia joined TIMSS 2011. The national coordinator is fluent in the language in which the international-level meetings are conducted and related documentation is available.
Additionally, the team has previous experience working on international assessments and the necessary training and experience to carry out the required assessment activities effectively. No issues have been identified with carrying out the international assessment in the country.

10. Only the national coordinator attended international meetings and workshops. During previous ILSA administrations, national coordinators and members of the ILSA team participated in almost all workshops on international assessments and on using international assessment databases offered by the IEA.

11. Opportunities to learn about international assessments are provided by the Ministry of Education and the donor community (UNICEF, the Dutch embassy) and cover attendance at international workshops or training on international assessments.

12. Opportunities to learn about the ILSA are available to individuals working directly on the specific international assessment exercise.

13. FYR Macedonia met all technical standards required to have its data presented in the main displays of the international report.

14. FYR Macedonia's contribution of new knowledge on international large-scale assessment includes:
a. PIRLS 2001 Encyclopedia http://timss.bc.edu/pirls2001i/pdf/encyclopedia.pdf
b. PIRLS 2006 Encyclopedia http://timssandpirls.bc.edu/PDF/P06Encyclopedia.pdf
c. TIMSS 2011 Encyclopedia (in preparation)
d. NC Anica Aleksova was a member of the Item Scaling and Anchoring Committee for TIMSS 1999 and a member of the International Expert Panel for Mathematics, the Mathematics Item Development Task Force, and the Mathematics Item Review Committee for TIMSS 2003.
e. NRC Bojana Naceva was a member of the PIRLS 2006 Questionnaires Development Group http://timssandpirls.bc.edu/pirls2006/framework.html
f. Bojana Naceva and Gorica Mickovska, "Impact of PIRLS 2001 in Republic of Macedonia," in The Impact of PIRLS 2001 in 13 Countries, Studies in International Comparative and Multicultural Education, Germany, 2007 http://books.google.mk/books?id=HF9Pvd7sOvAC&pg=PA193&lpg=PA193&dq=The+impact+of+PIRLS+2001+in+13+Countries&source=bl&ots=gERf4T1JDL&sig=xqUOPJ91xYShL7XB6EG5s9Cd4B0&hl=mk&redir_esc=y#

15. There is a plan for the results from TIMSS 2011 to be presented in a country report. Results from PIRLS 2001, TIMSS 1999, TIMSS 2003, and PISA Plus were printed as national reports, disseminated to stakeholders (schools, universities, educational authorities), and publicly presented at conferences. The reports are available in print and online. Results from PIRLS 2006 were not presented in a national report because the national coordinator for PIRLS left NEC and the management staff of NEC changed, so writing a PIRLS national report was no longer a priority. Released TIMSS and PIRLS items were published and distributed to schools; released TIMSS items are available online: http://itembank.dic.edu.mk/Documents.aspx TIMSS 2003 and PIRLS 2006 results have been presented only to representatives from the MoES and educational agencies within the MoES. The main reason for not sharing the results and other relevant information widely was primarily the lack of finances and of interest on the part of the BDE in using ILSA results to improve teaching practice.
Results from ILSA have been distributed through national reports online, copies of the national and international reports were distributed to key stakeholders, and country results were communicated through press releases.

16. TIMSS 1999 and PIRLS 2001 results were fed back to representatives from all primary schools in FYR Macedonia during regional workshops, the goals of which were to share ILSA results, to familiarize schools with these assessments (it was the first time that FYR Macedonia had participated in TIMSS and PIRLS) and with the types of test items used in ILSA, and to connect this information to expected outcomes in teaching practice in FYR Macedonia. As noted above, TIMSS 2003 and PIRLS 2006 results were presented only to representatives from the MoES and educational agencies within the MoES.

17. Media coverage is limited to a few small articles. The media has always been invited to all press conferences related to the dissemination of ILSA results. The media has covered only basic information, and the focus was usually on the results.

18. The results of the international assessment exercises have been used by policymakers and education leaders to improve education quality in the country by informing curriculum improvement (PIRLS and TIMSS results were used as a basis for developing the concept for nine-year primary education (2007)), teacher training programs (teacher training programs initiated by donors have been designed to reduce gaps identified in the national reports on students' achievement in ILSA), and other assessment activities in the country (the TIMSS and PIRLS frameworks and technical standards have been used for the development of the concept for the National Assessment, and released items have been used as materials for teacher training on classroom assessment).

19. There has not been a regular national assessment or any impact evaluation or research related to student achievement since 2006.

Acknowledgements

This report was prepared by The World Bank SABER-Student Assessment team in collaboration with Bojana Naceva, World Bank Senior Education Specialist and Task Team Leader for education projects in FYR Macedonia, and Gorica Mickovska, Project Manager at the Macedonian Civic Education Centre.

References

Clarke, M. 2012. "What Matters Most for Student Assessment Systems: A Framework Paper." READ/SABER Working Paper Series. Washington, DC: World Bank.

European Commission. 2012. The Former Yugoslav Republic of Macedonia 2012 Progress Report. Brussels: European Commission.

International Monetary Fund. 2012. Public Information Notice (PIN) No. 12/58: IMF Executive Board Concludes 2011 Article IV Consultation with Former Yugoslav Republic of Macedonia. Washington, DC: International Monetary Fund.

State Statistical Office of the Republic of Macedonia. 2012. "Primary, Lower and Upper Secondary Schools at the Beginning of the School Year 2011/2012." Skopje: State Statistical Office of the Republic of Macedonia.

United Nations Educational, Scientific, and Cultural Organization - International Bureau of Education. 2011. World Data on Education, VII Ed. 2010/11: "The former Yugoslav Republic of Macedonia." Geneva: UNESCO-IBE.

World Bank. 2010. Country Partnership Strategy for Former Yugoslav Republic of Macedonia for the Period FY11-FY14. Report No. 54928-MK. Washington, DC: World Bank.

———. 2012. "Macedonia, FYR at a glance." Washington, DC: World Bank. Data retrieved from http://devdata.worldbank.org/AAG/mkd_aag.pdf on January 29, 2013.

———. FYR Macedonia Country Indicator Data. Washington, DC: World Bank. Data retrieved from http://data.worldbank.org/ on January 29, 2013.

———. The World Bank's Strategy in FYR of Macedonia: 2011-2014. Washington, DC: World Bank. Data retrieved from http://go.worldbank.org/QS6GA93YY0 on January 29, 2013.
www.worldbank.org/education/saber

The Systems Approach for Better Education Results (SABER) initiative produces comparative data and knowledge on education policies and institutions, with the aim of helping countries systematically strengthen their education systems. SABER evaluates the quality of education policies against evidence-based global standards, using new diagnostic tools and detailed policy data. The SABER country reports give all parties with a stake in educational results, from administrators, teachers, and parents to policymakers and business people, an accessible, objective snapshot showing how well the policies of their country's education system are oriented toward ensuring that all children and youth learn. This report focuses specifically on policies in the area of student assessment.

This work is a product of the staff of The World Bank with external contributions. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

THE WORLD BANK