Dear Editor,
The editorial in Stabroek News of July 1, headlined 'Assessment', is presented in such a way as to cast doubt among the populace regarding the integrity of the process. An informed editorial on the subject would not have been couched in such a display of ignorance. Editorial presentations should be grounded in good research, so that quality analyses educate rather than misinform. There are sections of the editorial that demand the presentation of alternative empirical data to warrant the questions posed, which seek to challenge the figures presented by the Ministry of Education.
Firstly, the number of children who wrote the National Grade Six Assessment in 2010 fell below the 2009 figure. All children in the age group in the public schools have been accounted for. It must be noted that in 2000 the number that sat the examination was 15,729; in 2001 it was 14,692; in 2004 it was 16,427; in 2006 it was 18,540; and in 2008 it was 17,630. Such fluctuation over time is not uncommon. However, it is worth restating that, by regulation, all children in the age cohort are presented for the examination.
Next, the scores as presented are raw scores converted to standard scores, a process that is statistically robust and allows for the aggregation of scores across subjects. The maximum raw score for each of the four subjects examined is 60, and this has been so over the last ten years. The raw scores, when standardized for each subject, yield a maximum standard score for that subject. The maximum standard scores for the four subjects are added to give the maximum standard score for the examination, the highest mark possible. The process of converting raw scores to standard scores accounts for the year-to-year variation in the maximum score possible. It is important to note that the raw score rank order of the children remains the same after the standard scores are obtained. This process would not usually result in a fixed maximum standard score from year to year. This system of using standardized scores is not unique to Guyana; it is a tool used by all examining bodies processing results for ranking children.
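For readers who wish to see the arithmetic, the standardization described above can be sketched as follows. This is an illustration only: the scaling constants (a target mean of 50 and standard deviation of 10) are assumptions for the example, not the Ministry's actual parameters.

```python
def standardize(raw_scores, target_mean=50.0, target_sd=10.0):
    """Convert one subject's raw scores into standard scores.

    Each raw score is expressed as its distance from the cohort mean in
    units of the cohort's standard deviation, then rescaled. The target
    mean and s.d. here are illustrative assumptions.
    """
    n = len(raw_scores)
    mean = sum(raw_scores) / n
    variance = sum((x - mean) ** 2 for x in raw_scores) / n
    sd = variance ** 0.5
    return [target_mean + target_sd * (x - mean) / sd for x in raw_scores]

# A candidate's examination total is the sum of the four subject
# standard scores. Because the conversion is a linear transformation,
# it preserves each candidate's rank order within a subject, and the
# maximum total naturally varies from year to year with the cohort.
```

Because the cohort mean and spread differ each year, the maximum standard score differs too, which is the point the letter makes about year-to-year variation.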
Five schools are competed for nationally. They are, in rank order, Queen's College, The Bishops' High School, St. Stanislaus College, St. Rose's High School and St. Joseph High School. The cut-off score for any school is determined by the number of places available and the scores obtained in that year. For example, Queen's College has 140 places, so the top 140 students fill that school. The score of the student awarded the last place in the school is its cut-off score, which for Queen's College happens to be 532 this year. Bear in mind that many students would have obtained marks identical to those of that top 140 cohort. All other schools are then awarded cut-off scores in a similar fashion.
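The placement and cut-off procedure just described can be sketched as follows. This is a simplified illustration: it ignores the handling of tied scores and the residence criterion mentioned later, and the names and numbers in the usage example are invented.

```python
def assign_places(candidates, schools):
    """Assign candidates to schools in national rank order.

    candidates: list of (name, total_score) pairs.
    schools: list of (school, capacity) pairs in rank order.
    Returns (placements, cutoffs), where each school's cut-off is the
    score of the last student awarded a place in it. Ties and place of
    residence are not modelled in this sketch.
    """
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    placements, cutoffs = {}, {}
    i = 0
    for school, capacity in schools:
        cohort = ranked[i:i + capacity]
        placements[school] = [name for name, _ in cohort]
        cutoffs[school] = cohort[-1][1]  # score of the last place awarded
        i += capacity
    return placements, cutoffs
```

Run on a toy cohort of three candidates and two schools with capacities of two and one, the highest scorers fill the first school and its cut-off is the score of the second-placed student, just as the 140th place at Queen's College fixes that school's cut-off.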
Other secondary schools are categorized as A, B and C based on their CXC track record over the last three years. Secondary schools not writing CXC are labelled D, while the secondary departments of primary schools (the primary tops) are labelled E. Placement in these schools is based on two criteria: student performance and place of residence. Usually, category A schools are filled first, then category B, and so on.
In response to the query about statistical manipulation to present a picture of educational progress, it must be emphatically stated that there is no such manipulation. The procedure for finding the mean score and standard deviation of performance in any subject is no different from that found in any statistics textbook.
It is not true that the entry requirements of the University of Guyana and the Cyril Potter College of Education have been reduced. The current entry requirements of the Cyril Potter College of Education have been upgraded to five CXC subjects at Grades 1–3, including English and Mathematics. This, by no stretch of the imagination, can ever be seen as a lowering of standards.
The Grades 2 and 4 Assessments are marked at the school level by senior teachers trained for the purpose. The markers are provided with a comprehensive Marking Scheme to guide the process. At the end of this exercise, a defined sample of scripts from the assessment is sent to NCERD for moderation, to ensure compliance with the Marking Scheme. The Grades 2 and 4 results are known to the school. These scores are kept in a database and are processed simultaneously with the Grade 6 scores, as outlined earlier.
It is hoped that the explanations provided herein will dispel any suspicion still lurking in the minds of members of the public.
Yours faithfully,
Mohandatt Goolsarran
National Centre for Educational
Resource Development
Ministry of Education