Are 1 in 6 Canadians illiterate or 10th best readers in the world?

During the last year, we have been given conflicting information about how well Canadians can read. First to be released, in May of 2010, was a Statistics Canada report claiming that 1 in 6 Canadians were functionally illiterate — 14.6% of the 42% who were considered semi-illiterate. For specific information on those statistics, read this CBC story. It is titled "Canada's Shame" and is clearly an attempt to justify increased funding and programs for adults with literacy difficulties.

Now, I am all for helping people who need it. I operated my own private reading clinic for a decade or more to do just that. But something is wrong when StatsCan has to use twelve-year-old numbers to make its case. Here, for example, is how the joint StatsCan/OECD International Adult Literacy Survey (IALS) study is explained: the first round of IALS surveys was conducted in 1995, followed by second and third rounds in 1996 and 1998, with the final report, covering 23 countries or regions, released in 2000.

So, given how out of date that data was, why was it released in 2010? And why was it discussed in the absence of other studies, particularly since both studies involved the OECD, and data on the reading competency of fifteen-year-olds was already available for 2000, 2003 and 2006? Then, just last week (December 2010), the OECD's 2009 school-based international test scores in reading were released. Involving 70 countries, that report found that Canadian youth ranked tenth overall for reading, having slipped from 7th in 2006.

Now, here is the Canadian dilemma: Given the latest OECD school-based reading tests and the explanation in the CBC column, how can we have 42% of our population semi-illiterate while our high school graduates are performing well above the norm? I mean, we can't have it both ways. We can't be drowning in illiteracy while our kids are excelling.

So, which is it?

Was Canada’s high school drop-out rate halved by lowering standards?

Yesterday, the news came out that Statistics Canada had found that the high school drop-out rate in Canada has been cut in half over the last twenty years. While print and television news reports were all overflowing with praise (e.g., see this Ottawa Citizen report by Mark Iype), my immediate reaction was "why?" Why did the drop-out rate go down so dramatically? Unfortunately, StatsCan's numbers don't tell us anything about why more young people are staying in school long enough to graduate.

Well, as a former university teacher, I can tell you it isn't because students are being better prepared. In my last few years of teaching, for example (early in this decade), I found that spelling and grammar in written assignments were worse than they had ever been — and remember, like now, that was the age of computer spell checkers.

In my opinion, and it is the opinion of an awful lot of people in this country, the reason for the decrease in drop-outs is provincially directed "no-fail" and social promotion policies. In fact, I was on the Dave Rutherford Radio Show last May, talking to Roy Green and his callers about that very subject. For those who are interested, here is my blog's archive on that topic.

Now, I am all for motivating students to stay in school. But lowering standards to do it is, in the long run, counterproductive to society as a whole. Yet, Andrew Parkin, Director General of the Council of Ministers of Education, doesn't seem to question why the rate has been reduced by half, just that: "It's a dramatic change over time, and hopefully that means we can keep it going…I think it shows that the value of education and the recognition of that value has been increasing."

I wish I could say I think it shows the recognition of the value of a high school graduation diploma, but I rather think it simply shows that: (a) more kids are promoted who shouldn’t be, particularly from elementary school into high school, and (b) academic standards have been reduced in order to accommodate those who would otherwise have dropped out. Is that such a bad thing? I don’t know. I guess only time will tell.

Perhaps parents and today’s employers can leave a comment here and tell me whether they think their grown children, or staff, were adequately prepared for working in the real world. Because it is only with that kind of information that we can know the reason Canada’s high school drop-out rate has dropped so dramatically.

Pashler study claims learning style models lack validity

The politics of research

Be wary of the way researchers present their conclusions, as they are always political. Here, for example, is an excellent YouTube presentation on the various types of research design and the paradigms they represent, as well as a related print source.

In the traditional corner, you have the quantitative experts, or what some would refer to as true scientific inquiry. They tend to assume that only experimental, objective and value-free methods have practical utility, as only those methods are able to answer "what" questions.

In the other corner, what I will refer to as the progressive corner, you have the qualitative experts. They prefer questions that result in "understanding" and answers to questions that ask "why" and "how." For example, qualitative or action researchers might want to understand how a student's environment and cultural diversity affect the way they learn — their learning styles — and thus their academic outcomes.

Continue reading

Iowa study shows ON “no-fail” policies fail

Although this article was first published a few days ago, I felt it was worth repeating. And, even though it is directed primarily at the Ontario Government, it would be relevant to any of Canada’s education ministries because the Iowa study is about how boards or districts of education (from the trustee down to the classroom teacher) can influence student achievement in either a positive or negative way.

The Lighthouse Study

The Iowa Association of School Boards’ (IASB) “Lighthouse Study” is about quantifiable, reliable measures of student achievement and school renewal versus simply accepting limitations and managing the system.

As a result, it can teach Ontario and other Canadian provincial governments and their school districts or boards a thing or two about how to improve student outcomes without lowering expectations or academic standards — which is exactly what is happening in Ontario right now.

Though reported in the fall of 2000, the study is timeless. The IASB's underlying message was a belief in students and their innate potential, combined with expectations of achievement regardless of a student's circumstances. It was also about how a school district's internal culture of improvement could affect student achievement in a positive way.

In other words, student success is NOT about pushing students through the system or a “no-fail” policy that allows substandard work. The only group who benefits from that kind of policy is the government in question because they can claim reduced dropout statistics and increased graduation rates.

The students, on the other hand, would not benefit for very long because they would have to face their difficulties once they are in employment situations or working at the post-secondary level.

Background to the study

Some background for those readers who haven’t the time to read the study itself. The IASB research group wanted to find out if they could identify “links” between what school boards do and the achievement of students.

To look into that possible link, the researchers worked with six school districts in Georgia, using census data to make sure they were comparing comparable districts in terms of enrollment, percentage of children living in poverty, spending per student, household income and other related factors. They also used verifiable data such as the Iowa Test of Basic Skills, which is administered in Grades 3, 5 and 8, as well as a high school graduation test. So, there was little in the way of subjective data used.

As well, they used seven key conditions for school renewal as the basis of questions they used to interview a cross section of board members, superintendents and school personnel. Those seven conditions were:  

  1. Shared leadership
  2. Continuous improvement and shared decision making
  3. Ability to create and sustain initiatives
  4. Supportive workplace for staff
  5. Staff development
  6. Support for school sites through data and information, and
  7. Community involvement.

What we can learn from the study

The results are fascinating. In the high achieving boards, student achievement was very high and there was a culture of belief in the students and the system. And, perhaps just as important, everyone was geared to renewal and able to move ahead on all fronts.

In the low-achieving boards, the reverse apparently happened. Rather than look at student potential and possible improvement, they looked at limitations. Moreover, while renewal goals were on paper, everyone got “stuck” at each stage of the process and simply spent most of their time “managing the system.”

Sound familiar? Ontario Minister of Education Kathleen Wynne and her officials need to read this study because "managing" student achievement and graduation rates is precisely what they are trying to do. As a result, students are leaving the school system ill prepared for the future. And, far too many Ontario boards of education (see also this link) seem to be dysfunctional because they are "stuck" on managing their systems.

Something to think about.

H/T to a regular reader for the link to the Lighthouse study.

Do Fraser rankings “really” reflect the quality of schools?

When families move into new communities, what is one of the first things parents ask their real estate agent?  You guessed it. They ask: Where are the best schools and how do you know they are the best?  And, on the basis of the answer, the parents decide then and there where they want to rent or purchase housing.

Now, just how do people find out where the best schools are located? In the past, they would speak to everyone they knew who lived in the community they were moving to, and then make an informed decision. Now, it seems, the Fraser Institute's school rankings are the primary source parents are using.

But, is that all there is to a school? Do the rankings alone “really” reflect the quality of a school?  Or, should other criteria be used as well? For example:

  • Is there a strong emphasis on academics?
  • Is there a good sports program?
  • Are there extra-curricular activities in the arts?
  • Is there a school choir or band?
  • Is there a strong school spirit?
  • Do children like attending?
  • Do the teachers communicate well with the parents?
  • Are the staff dedicated?
  • Do the staff undertake professional development?
  • Is the principal approachable?
  • Does the principal treat parents with respect?
  • Are there a lot of parent volunteers?
  • Is the school council effective?

And so on. Or, do the rankings alone mean enough, as in: if the children do well in the annual tests, does that mean there are good teachers and the school is good? Is that a fair analysis? Or, is this whole process a self-fulfilling prophecy?

Continue reading

ON school rankings confirm skill development

To all Ontario taxpayers — which of course includes parents — the school rankings that came out yesterday are a good thing. They not only indicate how some schools are struggling, they also show how some are gradually improving year-over-year across a five-year period. And, of course, they indicate those schools that are doing exceptionally well.

So, not only are the rankings giving parents and educators a snapshot of what is going on in the publicly funded system overall, they are also giving parents a choice.  For example, some boards of education have open boundaries and IF — and that is a big IF — there is room in a school that is ranked high, children from other neighbourhoods can attend. Similarly, if parents are about to relocate, they have an idea where they might want to move.

Of course, for the same reasons, many of those within the education system itself don't like the rankings. First and foremost, the criticism is that all teachers are doing is "teaching to the test." Well, what is wrong with that if the students are all learning the same knowledge and skills?

Another criticism is that the testing is arbitrary and may seem unfair when two schools in completely different communities are being compared to one another. For example, you could have one school community that is full speed ahead with block scheduling and the balanced school day approach to curriculum with literacy and numeracy tasks taking up half the day — and — where lots of parent volunteers are available. Whereas, in another school community, the teaching staff are doing their best dealing with a high population of ESL and special needs students and there are few parent volunteers because the majority of parents work outside the home.

Nevertheless, while there are limitations to the rankings, as Moira MacDonald writes in today’s Toronto Sun, they are really the only comparison and accountability tools taxpayers have.  So, if for no other reason than that, they are a good thing.

However, let’s not lose track of the rationale behind the testing in the first place — that students learn, and be tested on, the SAME literacy and numeracy skills they all MUST have, no matter where they attend school.

That is the bottom line.