Monday, 15 June 2015

The revolution is upon us! (2015 Term 2 Week 9)

What have been the most important developments in human history so far?

The list of possible candidates is lengthy. The domestication of animals? The rise and fall of one or more empires? The appearance of a 'great man' historical figure and his ideas, such as Confucius, Socrates, or Karl Marx? The emergence of a world religion, such as Judaism, Christianity or Islam? The development of the written word? The rule of law, enshrined in the Magna Carta (sealed 800 years ago today!)? The discovery of the New World? The use of antibiotics?

The answer: It depends! What criteria do we use? I recently came across a graph that attempts to answer the question, identifying a development that 'bent the curve of human history' in a way that nothing else has. The graph is below:

The central idea here is that the steam engine, and the associated industrial technologies, underlies the sudden exponential progress in human social development (and human population). This invention was the point at which the limitations of muscle power were overcome and massive amounts of useful energy became accessible, which in turn led to compounding technological innovation and progress. Factories, transportation, urbanisation ... everything else that constituted the Industrial Revolution erupted from the fact that muscle power had been surpassed.

The graph comes from Ian Morris' book Why the West Rules - For Now: The Patterns of History, and What They Reveal About the Future. However, I came across it in one of the more exhilarating and disturbing books that I have read in the last twelve months: The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies by Erik Brynjolfsson and Andrew McAfee.



The authors argue that we have entered a second machine age, whereby computers "are doing for mental power - the ability to use our brains to understand and shape our environment - what the steam engine and its descendants did for muscle power." It is their contention that the vast and unprecedented boost to mental power provided by computing will be as transformative of the human story as the boost to muscle power was.

Brynjolfsson and McAfee acknowledge that wide-eyed optimism about computing has been around for a while and that we have all grown a little jaded in thinking about the wonders of the future. However, their book makes a cogent and persuasive case that computing really has turned an extraordinary corner in the last little while. Diagnosing diseases, listening and speaking to us, writing and evaluating high-quality prose, robots scurrying around warehouses ... all these things are present realities.

To outline just one example from their book: in 2004 it was a given within the field of computing that computers would never be able to drive cars, because computers were good at following rules but no good at the complex pattern recognition involved in driving. The state of driverless cars in 2004 entirely validated this conviction.

By 2012, Google had developed a fleet of driverless cars that had logged more than 500 000 kilometres on urban roads. According to Wikipedia, Google plans to make these cars available to the public from 2020 - which is around when our Year 6 students will be getting their licences!

I won't go on to summarise the whole book, but it makes a good case that the rate and extent of change that we and our children are about to experience are unimaginable.

The challenge for education becomes all the more pointed. What education will our children need to equip them for this 'second machine age' into which they are moving? I remain convinced that the personal capabilities and characteristics of the Inaburra Learner Profile will be invaluable to them. The question is, how does education need to change in order to cultivate these capabilities?

(Oh, and for what it is worth, my own conviction is that the life, death and resurrection of Jesus Christ remains the most important event in human history. But that is a story for another day!)


Tuesday, 9 June 2015

How do we know how we are going? (2015 Term 2 Week 8)

Following on from last week's blog about our recent HSC results, I want to make the point that I am entirely in favour of evaluating progress. Every school needs to ask "How are we going?" If we don't assess our impact, then we can't know if our efforts are well-directed or effective. The very idea of an improvement agenda, or a quest for success, assumes some means of measurement. 

The challenge, of course, lies in identifying the right measures! Inaburra School has access to, and weighs, a huge number of measures in evaluating our progress in any one year. 

Evaluating cognitive learning
Assessing the cognitive learning of students is obviously a core aspect of the school's work. While assessment can be done of learning, for learning and as learning, the following comments are directed towards assessment of learning.



Most assessment of learning at school is designed and administered internally by the school. The reports that are issued at the end of each semester summarise these assessments, providing feedback on a student's progress over time. This feedback includes the student's work habits and engagement, participation in school life, progress according to syllabus outcomes and more general comments.

The School is also engaged with externally-developed assessments of student learning. Many of them will be familiar to parents; the main ones are NAPLAN (literacy and numeracy) and the HSC (the end of Year 12 exit credential in NSW). However, there are a number of other tools that we use for assessing the progress of individual students, including the Progressive Achievement Tests, the International Competitions and Assessments for Schools (ICAS), the Essential Secondary Science Assessment (ESSA) and others.

Evaluating non-cognitive learning
However, given our interest in the non-cognitive capabilities of our young people, particularly as expressed in the Inaburra Learner Profile, we are also exploring how to assess student progress in these areas. Non-cognitive capabilities and characteristics are harder to gauge with an objective, verifiable and consistent tool, but we believe that it ought to be possible to make defensible judgments about individual students and their progress. In 2015 we are making use of two externally-developed tools that explore some of these issues.

The Gallup Student Poll measures student hope, engagement and wellbeing, as well as the effect of the school on the spiritual journey of the student, and the entrepreneurial disposition of students. Our students in Years 5-12 participated in this online poll earlier in the year. The school will receive a report from Gallup in the near future; this report will provide us with a baseline for our student population by year group. It does not report on individual students.

Inaburra School has also joined ten other NSW independent schools in trialling the Australian use of the Mission Skills Assessment (MSA). This instrument assesses students in Years 6-8 on the non-cognitive skills and attributes of teamwork, creativity, ethics, resilience, curiosity and time-management. An online survey, which is part of this assessment, will be completed by students in these years during the course of this week. We expect a report from this assessment late in the year that will serve as a baseline for us in developing these attributes. Again, the MSA does not report on individual students.

The challenge of assessing non-cognitive learning is substantial, but crucial. 



Evaluating other aspects of school life
Obviously, in addition to assessing learning, there are many other aspects of organisational life that we assess. We evaluate key measures, such as enrolment demand, finances and staff/student ratios, against historical data and industry benchmarks. Staff participate in an annual engagement survey conducted by Voice Project. Students provide feedback to teachers in small-scale surveys at various points in the year. We conduct satisfaction surveys every year for students in Years 6 and 12, as well as for their parents. Every three years we conduct a K-12 parent satisfaction survey; this was completed by 60% of parents earlier this term.

All of which is to say, the School places a high value on evaluating our progress, identifying and bolstering areas of strength and intervening to improve in areas of weakness. Onward and upward!

Monday, 1 June 2015

Evaluating Year 12 results (2015 Term 2 Week 7)

One of the key influences on families in their choice of school is the final academic result. We all want our children to do well, in order to maximise their opportunities, primarily with reference to university entrance. It is a legitimate goal for families, and for schools, to want students to achieve strong results. For many, the HSC results constitute a verdict on the quality of the education at a school.

At this point, I find myself in an awkward position. As Principal I obviously have a vested interest in putting a positive spin on Inaburra's results, and any comments I make regarding evaluation may be read with suspicion! Yet as an educator (and a parent), I want to be sure that the results can be evaluated effectively and fairly. Apart from anything else, rigorous evaluation is a necessary step in any improvement agenda!

In 2014 Inaburra School students attained fewer Band 6 results in their HSC than they had in the previous few years. This was disappointing to us, but not surprising. We know our students well and we understood the level of their performance throughout their time at school. We rejoiced with them at many of the results that they achieved; we also shared the disappointment that some of them experienced.

However, I do not accept the proposition, which I have heard expressed, that our students' academic performance fell. Nor is it valid to suggest that the academic standards of the school are sliding. Let me explain.



An easy way to evaluate?
In the newspapers each year, and on a number of websites, it is possible to find tables that purport to rank and compare schools. The measure that they use is to divide the total number of Band 6 results (marks of 90 or above) by the total number of HSC exams sat. This ratio is used to build a simple ranking table. The method appears to have the great benefits of being simple, clear and objective. Moreover, it seems relatively simple to compare a school's ranking in one year with its ranking in another, so it is possible to gauge a school's trajectory - up or down!
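To make the arithmetic concrete, here is a minimal sketch in Python of how such a ranking table is built. The school names and figures are invented purely for illustration; the published tables use the official HSC data.

# Illustrative only: invented schools and counts.
schools = {
    "School A": {"band6": 120, "entries": 800},
    "School B": {"band6": 45, "entries": 500},
    "School C": {"band6": 30, "entries": 450},
}

def band6_rate(counts):
    """Band 6 results as a percentage of HSC exam entries."""
    return 100 * counts["band6"] / counts["entries"]

# Rank from highest to lowest Band 6 rate, as the media tables do.
for name, counts in sorted(schools.items(), key=lambda kv: band6_rate(kv[1]), reverse=True):
    print(f"{name}: {band6_rate(counts):.1f}% of entries were Band 6")

As the next paragraphs argue, the simplicity of this single ratio is precisely its weakness.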

Whilst an apparently simple, clear and objective ranking table may be easy to understand, it is a very superficial reading of limited data and, as such, it can be a poor guide to the academic performance of a school.

The popular method of ranking schools used by the media and websites (described above) is the only way they are legally allowed to make comparisons. Section 18A of the Education Act 1990 prohibits the public comparison of HSC results with reference to anything other than marks of 90 or above; that is, Band 6 results. Comparing schools publicly on any other data is prohibited. This is a political decision, not an educational or statistical one.



What other options are there?
Using only the number of Band 6s as the basis for comparison is a very narrow measure. It would be analogous to evaluating the weather only with reference to the number of clear days. It is valid to count the number of clear days, but an evaluation should also include the rainfall, temperature range, wind patterns, trends, averages, atypical weather events and whatever other measures are available and helpful.

There are many other ways to evaluate HSC results. For example:
  • the average mark received by students at the school in each subject, compared to the State average;
  • the profile of Band results compared to the State (what percentage got Band 6 in a specific subject, what percentage got Band 5, what percentage got Band 4, and so on);
  • the distribution of Band 6 results across a cohort; that is, whether the Band 6 results are concentrated amongst a small group of students or spread more widely across the year group;
  • the distribution of results by subject, which indicates whether some areas are stronger or weaker than others;
  • the value-add, whereby we identify each student's performance at a prior point, project their expected results and then measure whether those predictions are reached, exceeded or unfulfilled (a rough sketch of this calculation, and of the band-profile comparison, follows below).
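As a rough illustration of two of these measures - the band-profile comparison and the value-add - here is a short Python sketch. All the figures are invented, and the real analysis uses official data and proper statistical projections rather than this simplified arithmetic.

# Illustrative only: invented proportions and marks.
school_bands = {6: 0.15, 5: 0.40, 4: 0.30, 3: 0.10, 2: 0.04, 1: 0.01}  # share of entries in each band
state_bands = {6: 0.10, 5: 0.30, 4: 0.35, 3: 0.15, 2: 0.07, 1: 0.03}

# Band profile: how the school's spread of results compares with the State.
for band in sorted(school_bands, reverse=True):
    diff = school_bands[band] - state_bands[band]
    print(f"Band {band}: school {school_bands[band]:.0%}, state {state_bands[band]:.0%}, difference {diff:+.0%}")

# Value-add: compare each student's actual mark with a mark projected
# from an earlier point in their schooling (here, simply an invented figure).
students = [
    {"name": "Student 1", "predicted": 78, "actual": 84},
    {"name": "Student 2", "predicted": 90, "actual": 88},
]
for s in students:
    print(f"{s['name']}: value-add of {s['actual'] - s['predicted']:+d} marks")
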
The HSC results are subject to a lot of scrutiny. The school publishes some summary information at the time the results are released; more data is published in the Annual Report, which is released by the end of June each year. I encourage people who are interested in our results to look at this data.

However, it is staff who spend the most time poring over and analysing the results, exploring all the methods outlined above, and more. The data is explored subject by subject, class by class, student by student, and compared to previous years and published State norms. A report developed by both staff and an external consultant (who is used by nearly every comparable school in the State) is provided to the Board of ICL, the School's governors. 

Over the years in which I have been examining HSC results, I have become convinced that the best overall measure of our results is the Grand School Average (GSA). This figure, established by our external consultant, is calculated from the relationship between the HSC result achieved in every subject by every student and the ATARs that those students receive. This measure accounts for the relative scaling of subjects, recognising the merits of students attempting more challenging courses.

The GSA at Inaburra has been rising since we began to measure it in 2008. This long-term trend is the best indication that we have that academic achievement at Inaburra is strengthening. Each student is being helped in his or her pursuit of excellence. 

Two final points:
Success is ultimately measured by individual students and their families in different ways. For some students, Band 6 results are a valid goal and object of celebration. Other students and their families, who may have walked a very different path in aptitude and circumstance, may well rejoice with different numbers. Many of our families also measure success more widely than the results in the HSC. 



In reflecting on the HSC and ATAR, I do not resile for a second from my conviction that a strong ATAR, by itself, is not an adequate outcome of secondary schooling. We need to be aiming for the ATAR plus the character, capabilities and capacities of the Inaburra Learner Profile if our children are to be equipped to survive and thrive in the world that lies ahead of them. At Inaburra, we will continue to work hard to help our students achieve strong results, but we recognise that the long-term benefits of their education will not be expressed in numbers.