The World Bank
Human Development Network - Education
System Assessment and Benchmarking for Education Results (SABER)

THE COMMONWEALTH OF DOMINICA
Education Management Information System (EMIS) COUNTRY REPORT

Emilio Porta, Jennifer Klein, Gustavo Arcia and Harriet Nannyonjo
February 2012

Acknowledgements

This report was prepared by a team led by Emilio Porta, Senior Education Specialist at the Human Development Network/Education at the World Bank, and consisting of Gustavo Arcia, Consultant to the Human Development Network/Education of the World Bank and Senior Economist at Analítica LLC in Miami, Florida; Jennifer Klein, Consultant to the Human Development Network/Education at the World Bank; and Harriet Nannyonjo, Senior Education Specialist, LCSHE, World Bank. The report was prepared under the guidance of Elizabeth King, Robin Horn and Chingboon Lee. The views expressed here are those of the authors and should not be attributed to the World Bank Group. All data contained in this report are the result of collaboration between the authors, the Organization of Eastern Caribbean States, and participants in the benchmarking exercise. All errors are our own.

This benchmarking study arose from an active partnership between the Education Reform Unit of the Organization of Eastern Caribbean States (OECS) and the World Bank. The benchmarking exercise was done during an OECS workshop conducted in Castries, St. Lucia, from January 23 to January 28, 2011, with the participation of government officials from Antigua & Barbuda, the Commonwealth of Dominica, Grenada, St. Kitts & Nevis, St. Lucia, and St. Vincent & the Grenadines. A delegate from Montserrat also attended as an observer. The workshop and benchmarking exercise were done under the invaluable leadership of Marcellus Albertin, Head of the Education Reform Unit (OERU) at the OECS. His unflagging support, enthusiasm, and institutional supervision were fundamental for the cooperation of all participants and for the success of the workshop. To him we owe a great deal of gratitude. We would like to thank the OERU staff who helped us with workshop logistics, especially Emma Mc Farlane-Jouavel and Beverly Pierre. We would also like to thank the workshop participants: Doristeen Etinoff, Priscilla Nicholas, and Patricia George from Antigua & Barbuda; Ted Serrant, Robert Guiste, and Weeferly Jules from Dominica; Pauleen Finlay, Michelle Peters, and Imi Chitterman from Grenada; Gregory Julius from Montserrat; Quinton Morton, Ian Gregory, and Laurence Richards from St. Kitts & Nevis; Kendall Khodra, Nathalie Elliott, Sisera Simon, Evariste John, and Valerie Leon from St. Lucia; Dixton Findlay, Keith Thomas, and Junior Jack from St. Vincent & the Grenadines; Darrel Montrope, Jacqueline Massiah, Sean Mathurin, and Loverly Anthony-Charles from the OECS.
Abbreviations

EMIS    Education Management Information System
MOE     Ministry of Education
OECD    Organization for Economic Cooperation and Development
OECS    Organization of Eastern Caribbean States
SABER   System Assessment and Benchmarking for Education Results
SEAT    SABER EMIS Assessment Tool
UIS     UNESCO Institute for Statistics
UNESCO  United Nations Educational, Scientific and Cultural Organization

THE COMMONWEALTH OF DOMINICA: ESTABLISHED

Aspect of Data Quality      Benchmark
Prerequisites of Quality    Established
Assurances of Integrity     Emerging
Methodological Soundness    Mature
Accuracy and Reliability    Established
Serviceability              Established
Accessibility               Emerging

BACKGROUND

Education Data in Dominica

Dominica is transitioning from a paper-based Education Management Information System (EMIS) to a fully computerized system. The new, fully computerized system will:
- facilitate information-based planning and decision-making
- reduce the costs associated with the paper-based EMIS
- provide multipoint access for data users
- facilitate easy transmission of data to regional and international agencies
- foster efficient collection, analysis and reporting.

FACILITIES AND EQUIPMENT. The current EMIS repository is located in the Ministry of Education's Planning Unit, which manages a central EMIS application server and an EMIS database server. It is accessible to school staff and to parents and students. Computers are password protected and network sharing is also password protected, but there is no clear policy on data security. Data files are usually stored on the statisticians' computers. As a result, automatic archiving remains a major issue and no remote storage exists in the event of a disaster. Dominica is evaluating the use of cloud computing but, for now, uses openSIS (www.os4ed.com) integrated with Moodle, the learning management software.

EMIS STAFF. There are six Information and Communications Technology (ICT) officers at the national level for 65 schools and one EMIS coordinator in each secondary school. There are also ICT personnel from the Ministry of Education's Human Resources Department who train primary school staff in EMIS administration. The system has five short-term workers who conduct the initial bulk data input and a statistical assistant/data administrator who manages and maintains the system's database and report generation.

EMIS DATA. Educational data used by the EMIS include:
- Education data: School Enrolment and Attendance; Assessments; Teachers; Social Assistance
- Social assistance data: Textbooks; School Feeding; Education Trust; Transportation
- Finance data: Departmental Accounts of the Ministry of Education; Budget information from the Ministry of Finance

DATA COLLECTION. Data are collected through questionnaires mailed or delivered to all schools. The questionnaires ask for data on students, teachers, and school finances. Data collection day is October 15 of each year and the deadline to return the questionnaires is October 30.

DATA PROCESSING. Once received, questionnaires are vetted for errors and omissions, and the data are entered into an annual spreadsheet. A statistician aggregates school, district and national data and later validates the data with a planning officer. Planning officers, statisticians, or education officers conduct on-site verifications and any errors are corrected.
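The aggregation and checking steps just described can be illustrated with a minimal sketch. It assumes a flat CSV file of questionnaire returns with hypothetical column names (school_id, district, enrolment) and an arbitrary plausibility bound; it illustrates the workflow only and is not the Ministry's actual software.

    # Minimal sketch of the aggregation step: school questionnaire returns are
    # rolled up to district and national totals, and simple range checks flag
    # records for on-site verification. Column names are hypothetical.
    import csv
    from collections import defaultdict

    def aggregate_enrolment(path):
        district_totals = defaultdict(int)
        flagged = []  # (school_id, reason) pairs for follow-up by officers
        with open(path, newline="") as f:
            for row in csv.DictReader(f):  # columns: school_id, district, enrolment
                try:
                    enrolment = int(row["enrolment"])
                except (ValueError, TypeError):
                    flagged.append((row["school_id"], "missing or non-numeric enrolment"))
                    continue
                if not 0 <= enrolment <= 2000:  # arbitrary plausibility bound
                    flagged.append((row["school_id"], f"implausible enrolment: {enrolment}"))
                district_totals[row["district"]] += enrolment
        return dict(district_totals), sum(district_totals.values()), flagged

    if __name__ == "__main__":
        districts, national, issues = aggregate_enrolment("school_census_2011.csv")
        print("National enrolment:", national)
        for name, total in sorted(districts.items()):
            print(f"  {name}: {total}")
        for school_id, reason in issues:
            print("Check", school_id, "-", reason)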
The Senior Planning Officer verifies all data before releasing them for publication, and the statistical unit uses the approved data for the biennial indicator reports.

PUBLICATIONS. The office in charge of education statistics produces 1) a biennial publication of education indicators, circulated to all key stakeholders and available to others on demand; 2) an annual publication of Quick Facts, which is a snapshot of the education system for dissemination; and 3) the Annual State of Education Report, which is submitted to Parliament. Dominica also submits EMIS data to UNESCO, the OECS Education Reform Unit (OERU), and the International Monetary Fund (IMF).

EMIS in Dominica and the OECS: ESTABLISHED

Figure 1. SABER EMIS Scores in the OECS

In January 2011, Dominica's EMIS was assessed using the SABER-EMIS Assessment Tool (SEAT) and overall the EMIS was categorized as ESTABLISHED (0.65). Among the six Organization of Eastern Caribbean States (OECS) countries, Dominica tied with St. Kitts and Nevis for the highest overall score (Table 1). Dominica outperformed the OECS average on most of the SEAT's Aspects of Quality (Figure 1) and, with a score of 0.83 on Methodological Soundness, was one of only two countries to achieve a MATURE benchmark on any Aspect of Quality. The only Aspect of Quality that was not above the OECS average was Accessibility (0.47), which had three LATENT (0.00) sub-components that brought the average score down into the EMERGING range. Assurances of Integrity also had three LATENT sub-components and was the only other Aspect of Quality categorized as EMERGING.

The next sections of this country report analyze Dominica's performance on the sub-components of each Aspect of Quality in order to present a detailed portrait of the strengths and weaknesses of Dominica's EMIS and concrete actions that the country can take to improve education data quality.

Table 1. SABER EMIS Scores in the OECS Countries (2011)

Aspect of Quality          Dominica  Antigua  Grenada  St. Kitts  St. Vincent  St. Lucia  OECS Average
Pre-Requisites of Quality    0.70      0.52     0.68     0.66       0.45         0.64       0.61
Assurances of Integrity      0.58      0.53     0.61     0.44       0.50         0.64       0.55
Methodological Soundness     0.83      0.67     0.67     0.83       0.67         0.50       0.69
Accuracy and Reliability     0.70      0.48     0.58     0.75       0.53         0.58       0.60
Serviceability               0.61      0.29     0.50     0.79       0.43         0.68       0.55
Accessibility                0.47      0.47     0.69     0.61       0.36         0.56       0.53
Overall                      0.65      0.46     0.62     0.65       0.52         0.63       0.59

Benchmark scale: Latent 0 - 0.30; Emerging 0.31 - 0.59; Established 0.60 - 0.79; Mature 0.80 - 1.00.
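Read mechanically, the benchmark scale is simple arithmetic: Dominica's six aspect scores average to 0.65, which falls in the ESTABLISHED band, and the published aspect scores are likewise consistent with simple averages of their sub-component scores in Tables 2 through 7. The short sketch below illustrates this reading of the scale; it is only an illustration of the arithmetic implied by Table 1 and its legend, not the official SEAT scoring tool.

    # Sketch of the SEAT scale as read from Table 1's legend: scores are averaged
    # and mapped to the four benchmark bands. Illustrative only, not the official tool.
    def benchmark(score):
        if score <= 0.30:
            return "LATENT"
        elif score <= 0.59:
            return "EMERGING"
        elif score <= 0.79:
            return "ESTABLISHED"
        return "MATURE"

    # Dominica's six Aspect-of-Quality scores from Table 1
    aspects = {
        "Prerequisites of Quality": 0.70,
        "Assurances of Integrity": 0.58,
        "Methodological Soundness": 0.83,
        "Accuracy and Reliability": 0.70,
        "Serviceability": 0.61,
        "Accessibility": 0.47,
    }
    overall = sum(aspects.values()) / len(aspects)
    print(f"Overall: {overall:.2f} -> {benchmark(overall)}")   # 0.65 -> ESTABLISHED
    for name, score in aspects.items():
        print(f"{name}: {score:.2f} -> {benchmark(score)}")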
PREREQUISITES OF QUALITY: ESTABLISHED

Figure 2. Prerequisites of Quality

Dominica has ESTABLISHED (0.70) the Prerequisites of Quality and had the highest average score of the OECS countries (Figure 2). On three sub-components, Dominica's EMIS was MATURE: 1) responsibility for collecting and disseminating education data is clearly specified by the law on data collection, which designates the Education Planning Unit at the Ministry of Education as responsible for collecting educational data (Table 2, 0.1); 2) individual/personal data are kept confidential and used for statistical purposes only, as required by a clause of the 1997 Education Act (0.3); and 3) quality procedures are in place and enforced by management (0.8). While there is no formal agreement, the sharing of data among different levels of administration exists and is consistent (0.2). Measures for statistical reporting (0.4) are also not ensured through a formal legal mandate, but the Education Act (1997) makes provision for an annual state of education report. Creating more formal regulations for data sharing and reporting could help the country create a more robust system. Staff, facilities, technology, and financing are somewhat sufficient, but as the EMIS develops, it will require additional financing, staff statisticians, and measures for efficient management of human and physical resources (0.5/0.6).

Table 2. Prerequisites of Quality: Subcomponents (Dominica score / OECS average / benchmark)
0.1   Responsibility for collecting and disseminating education data is clearly specified: 1.00 / 0.75 / Mature
0.2   Data sharing and coordination among different agencies are adequate: 0.50 / 0.50 / Emerging
0.3   Individual/personal data are kept confidential and used for statistical purposes only: 1.00 / 0.79 / Mature
0.4   Statistical reporting is ensured through legal mandate and/or measures to encourage response: 0.75 / 0.58 / Established
0.5   Staff, facilities, computing resources, and financing are commensurate with the activities: 0.75 / 0.63 / Established
0.6   Processes and procedures are in place to ensure that resources are used efficiently: 0.50 / 0.63 / Emerging
0.7   Education statistics meet user needs and those needs are monitored continuously: 0.75 / 0.75 / Established
0.8   Processes are in place to focus on quality: 1.00 / 0.63 / Mature
0.9   Processes are in place to monitor the quality of data processes: 0.75 / 0.33 / Established
0.10  Processes are in place to deal with quality considerations in planning the statistical program: 0.25 / 0.58 / Emerging
0.11  Mechanisms exist for addressing new and emerging data requirements: 0.50 / 0.54 / Emerging

ASSURANCES OF INTEGRITY: EMERGING

Figure 3. Assurances of Integrity in the OECS

The Assurances of Integrity in Dominica's EMIS are still EMERGING (0.58) overall, but a closer look at the sub-components reveals two sets of scores at either end of the spectrum. More than half of the sub-components are MATURE. A law is in force protecting the professional independence of the data-producing institution, which can ensure that statistics are produced on an impartial basis (Table 3, 1.1). Choices of sources, statistical techniques and decisions on dissemination are sound (1.3), and advance notice is given of major changes in methodology, source data, and statistical techniques (1.8). In addition, the General Orders of the Public Service outline the guidelines for staff behavior (1.9). However, three LATENT sub-components lower the overall average score. The terms and conditions under which statistics are collected, processed, and disseminated are not available to the public (1.5), nor are the products of statistical agencies/units clearly identified (1.7). Further, the public is not informed about internal access to preliminary data (1.6), and professional credentials are only sporadically considered for recruitment and promotion (1.2).
Table 3. Assurances of Integrity: Subcomponents (Dominica score / OECS average / benchmark)
1.1  Statistics are produced on an impartial basis: 1.00 / 0.38 / Mature
1.2  Professionalism of staff is actively promoted: 0.25 / 0.42 / Emerging
1.3  Choices of data sources and statistical techniques are made solely by statistical considerations: 1.00 / 0.83 / Mature
1.4  Agency is entitled to comment on erroneous interpretation and misuse of statistics: 1.00 / 0.58 / Mature
1.5  Terms and conditions are available to the public: 0.00 / 0.33 / Latent
1.6  Public is aware of internal governmental access to statistics prior to their release: 0.00 / 0.38 / Latent
1.7  Products of the education statistics agency are clearly identified: 0.00 / 0.50 / Latent
1.8  Advance notice is given of major changes in methodology, source data, and statistical techniques: 1.00 / 0.71 / Mature
1.9  Guidelines for staff behavior are in place and are well known to the staff: 1.00 / 0.83 / Mature

METHODOLOGICAL SOUNDNESS: MATURE

Figure 4. Methodological Soundness in the OECS countries

In terms of Methodological Soundness, Dominica's EMIS is MATURE (0.83). Dominica was one of only two OECS countries to achieve a MATURE benchmark on any of the Aspects of Quality and was 0.14 above the OECS average score (Figure 4). Dominica's scores on Methodological Soundness were high because of its use of internationally accepted standards and guidelines for structure, concepts and definitions established by the UNESCO Institute for Statistics (UIS) and the OECS Education Reform Unit (OERU). In addition, Dominica uses the International Standard Classification of Education (ISCED) to classify all education data, including expenditure data (Table 4, 2.3). Currently, Dominica's EMIS produces between 71 and 90 percent of UIS indicators, which results in an EMERGING benchmark on the scope-of-statistics sub-component (2.2). Expanding the scope of statistics produced to 100 percent of UIS and OECD indicators would be ideal and would enable additional domestic, regional, and international education policy analysis.

Table 4. Methodological Soundness: Subcomponents (Dominica score / OECS average / benchmark)
2.1  Overall structure, concepts and definitions follow regionally and internationally accepted standards, guidelines, and good practices: 1.00 / 0.83 / Mature
2.2  Scope is in accordance with international standards, guidelines, or good practices: 0.50 / 0.42 / Emerging
2.3  Classification systems are consistent with international standards, guidelines, or good practices: 1.00 / 0.83 / Mature

ACCURACY AND RELIABILITY: ESTABLISHED

Figure 5. Accuracy and Reliability

The Accuracy and Reliability of Dominica's EMIS data is ESTABLISHED (0.70) (Figure 5), and on six of ten sub-components, Dominica's EMIS was MATURE (Table 5). Source data are timely (3.3), and data compilation, editing, transformation and other statistical procedures employ sound statistical techniques (3.5/3.6). Intermediate results are routinely validated (3.7) and statistical discrepancies in intermediate data and statistical outputs are always investigated (3.8/3.9). Despite scoring higher than the OECS average on this Aspect of Quality, Dominica's EMIS was LATENT in two areas, which lowered the overall average score. Source data are not audited, and information on sampling errors and imputed data is not documented (3.4). Further, revisions to methodology are rarely made (3.10). Source data also do not yet fully comply with the standards and scope of education statistics (3.2).
The gap between strengths and weaknesses on this Aspect of Quality is large in Dominica. By focusing on improving the LATENT and EMERGING sub-components, Dominica could greatly improve the Accuracy and Reliability of its EMIS.

Table 5. Accuracy and Reliability: Subcomponents (Dominica score / OECS average / benchmark)
3.1   Source data are obtained from comprehensive data collection that takes into account country-specific conditions: 0.75 / 0.58 / Established
3.2   Data are reasonably confined to the definitions, scope, classifications, and time of recording required: 0.25 / 0.50 / Emerging
3.3   Source data are timely (6 months after event): 1.00 / 0.46 / Mature
3.4   Other data sources, such as censuses, surveys, and administrative records, are routinely assessed: 0.00 / 0.42 / Latent
3.5   Data compilation employs sound statistical techniques to deal with data sources: 1.00 / 0.79 / Mature
3.6   Other statistical procedures (data editing, transformations, and analysis) employ sound statistical techniques: 1.00 / 0.63 / Mature
3.7   Intermediate results are validated against other information where applicable: 1.00 / 0.67 / Mature
3.8   Statistical discrepancies in intermediate data are assessed and investigated: 1.00 / 0.92 / Mature
3.9   Statistical discrepancies and other potential indicators of problems in statistical outputs are investigated: 1.00 / 0.71 / Mature
3.10  Studies and analyses of revisions are carried out routinely and used internally to inform the processes: 0.00 / 0.33 / Latent

SERVICEABILITY: ESTABLISHED

Figure 6. Serviceability in the OECS

The Serviceability of Dominica's EMIS data is ESTABLISHED (0.61) and is above the OECS average (0.55). Dissemination periodicity and timeliness and the consistency of statistics within the dataset are Dominica's main strengths under this Aspect. The country meets the benchmark for producing an annual census of enrolments, teachers, schools, and financial data (Table 6, 4.1). Administrative school census data are available two months after the start of the school year (4.2), and consistency checking and cross-checking are done on a regular basis (4.3). Time series are available for more than 10 years and inconsistencies are explained (4.4), but there are no transparent procedures for revisions: revisions follow a regular schedule but are conducted only internally within the unit (4.6). Preliminary and revised data are not clearly identified (4.7). The weakest sub-component under Serviceability is the consistency and reconcilability of data when compared with other data sources (4.5). Dominica's EMIS was categorized as LATENT because school-reported figures were not compared with other data sources to verify the validity and consistency of the final results. Without sufficient verification, inconsistencies and errors could damage the credibility of the EMIS.
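Sub-component 4.5 essentially asks that school-reported totals be reconciled against an independent source. The fragment below is a purely illustrative sketch of such a cross-check, comparing EMIS enrolment totals against a second source (for example a household survey or population register) and flagging large relative gaps; the figures, level names, and 5 percent tolerance are all assumptions made for illustration.

    # Illustrative cross-check of EMIS enrolment against an independent source,
    # of the kind called for by sub-component 4.5. All figures are hypothetical.
    def cross_check(emis_totals, other_totals, tolerance=0.05):
        """Return (level, emis, other, relative_gap) where the sources disagree
        by more than the tolerance."""
        discrepancies = []
        for level, emis_value in emis_totals.items():
            other_value = other_totals.get(level)
            if other_value is None:
                continue  # no comparator available for this level
            gap = abs(emis_value - other_value) / other_value
            if gap > tolerance:
                discrepancies.append((level, emis_value, other_value, gap))
        return discrepancies

    # Hypothetical enrolment totals by education level
    emis = {"primary": 7300, "secondary": 6900}
    survey = {"primary": 8000, "secondary": 6950}
    for level, e, o, gap in cross_check(emis, survey):
        print(f"{level}: EMIS={e}, comparator={o}, gap={gap:.1%}")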
Table 6. Serviceability: Subcomponents (Dominica score / OECS average / benchmark)
4.1  Periodicity follows dissemination standards: 1.00 / 0.96 / Mature
4.2  Timeliness follows international dissemination standards: 1.00 / 0.63 / Mature
4.3  Statistics are consistent within the dataset: 1.00 / 0.71 / Mature
4.4  Statistics are consistent or reconcilable over a reasonable period of time: 0.75 / 0.54 / Established
4.5  Statistics are consistent or reconcilable with those obtained through other data sources and/or statistical frameworks: 0.00 / 0.33 / Latent
4.6  Revisions follow a regular and transparent schedule: 0.25 / 0.21 / Emerging
4.7  Preliminary and/or revised data are clearly identified: 0.25 / 0.46 / Emerging

ACCESSIBILITY: EMERGING

Figure 7. Accessibility in the OECS

Accessibility was Dominica's lowest score (0.47, EMERGING) and the only score that fell below the OECS average (0.53). There are two notable strengths under this Aspect: all data are released at the same time to all users (Table 7, 5.4), and there are procedures in place for releasing non-published, non-confidential data upon users' request (5.5). Currently, metadata are not available (5.6) and data are not released on a pre-announced schedule (5.3). All statistical releases identify a contact person in case assistance is required, but assistance is limited, hard to obtain, and not monitored (5.8). While data are clearly presented and charts have the underlying data available, disaggregated data are not presented (5.1) and levels of detail are not adapted to the needs of the intended users. Catalogs of data, publications, or other services are not available (5.7/5.9). Accessibility is one of the key missions of an EMIS because it creates and maintains the public image of the EMIS and enables greater accountability. Accessibility could improve in Dominica by 1) continuing to work toward the fully computerized EMIS, 2) creating data and publication catalogs, 3) improving assistance for users, and 4) disseminating data and reports on a pre-announced schedule with detailed metadata and contact points.

Table 7. Accessibility: Subcomponents (Dominica score / OECS average / benchmark)
5.1  Statistics are presented to facilitate proper interpretation and comparisons (layout, clarity of texts, tables, and charts): 0.75 / 0.96 / Established
5.2  Dissemination media and format are adequate: 0.75 / 0.54 / Established
5.3  Statistics are released on a pre-announced schedule: 0.25 / 0.38 / Emerging
5.4  Statistics are made available to all users at the same time: 1.00 / 0.79 / Mature
5.5  Statistics not routinely disseminated are made available upon request: 1.00 / 0.75 / Mature
5.6  Documentation on concepts, scope, classifications, basis of recording, data sources, and statistical techniques is available, and differences from internationally accepted standards, guidelines, or good practices are annotated: 0.00 / 0.58 / Latent
5.7  Levels of detail are adapted to the needs of the intended users: 0.00 / 0.38 / Latent
5.8  Contact points for each subject field are publicized: 0.50 / 0.38 / Emerging
5.9  Catalogs of publications and other services, including information on any charges, are widely available: 0.00 / 0.00 / Latent