Friday, October 7, 2011

Is it working? CAN we move from 'I think' to 'we know' in education?

Conor Bolton recently posted to another community I am a member of, reflecting on what he has been trialling with his students this year and what he has observed to date. He concludes: "In spite of the challenges, I think my practice is becoming more evidence based; it is based upon sound theory and research and while the implementation has a way to go and the evidence is shaky the process is starting and the teaching as inquiry cycle is becoming more evident (I hope)". Conor also refers to Greg Whitby's post, "From I Think to We Know".

It really started me mulling over the implications of research-informed educational practice, as well as the 'benchmark' approach many Ministries of Education require in order to 'prove' that national initiatives have been effective and have increased student achievement of learning outcomes. I also tend to agree with Darcy Moore, who replied to Greg Whitby's post - I believe there is no way of categorically demonstrating a direct cause-and-effect relationship whereby a specific intervention has a positive or negative effect on learning.

I am now going to scab something from the oracle, Wikipedia ;-) around Design-based research theory:
"Methodologically, the Learning Sciences is distinguished from other fields that study learning in humans in its methodological treatment of the subjects of its study, learners, their localities, and their communities. The Design-Based Research methodology is often employed by Learning Scientists in their inquiries because this methodological framework considers the subject of study to be a complex system involving emergent properties that arise from the interaction of more variables than are initially known to researchers, including variables stemming from the researchers themselves (Brown, 1992). As such rather than attempting to isolate all the various factors that impact learning as in traditional research, the learning sciences employ design based research methodologies which appeal to an approach to the study of learning – in particular human learning both inside and outside of school – that embraces the complex system nature of learning systems. Learning Scientists often look at the interactions amongst variables as key components to study yet, acknowledge that within learning environments the interactions are often too complex to study all or completely understand. This stance has been validated by the findings of Cronbach and Snow (1977) which suggest that Aptitude-Treatment Interactions, where variables are isolated in effort to determine what factors “most” influence learning, will not be informative but rather inaccurate and potentially misleading if used as a ground for educational decisions or educational research of complex learning situations such as those characteristic of human beings in their lived experiences." (emphasis mine).

(For more 'academic' resources than Wikipedia about Design-based research, this bibliography is useful, and if you have a preference for videos, here are some interviews with folk who use and have developed the theory and model further).

Design-based research theory offers a way of acknowledging the unavoidable 'biases' of the person designing the data collection tools and collecting the data - the very fact that decisions have to be made around the what, where, when and how of data collection, and the influence that even an impartial observer, for example, can have on the dynamics of a group. The theory encourages reflection on these design decisions, and acknowledges that findings gathered along the way are likely to influence further design decisions as well as what is happening in the learning situation.


I also feel it is worth reflecting on Whitby's assertion: "Doctors can make assumptions about a patient’s health but unless the assumptions are tested, you cannot diagnose and treat. It’s the same rigour that must be applied to the practice of teaching. The relational aspect of teaching will not be subverted by the use of data but enhanced by it". The analogy isn't comparing eggs with eggs - a doctor may assume you have anaemia because you are pale, tired and dizzy, and can confirm this through a blood test, but who is to say the anaemia is not itself a symptom of a far bigger cause? Also, as design-based research theory suggests, there is no equivalent of a 'blood test' in education.

Anyhow (climbs off soap box) - I guess the thing to bear in mind is Whitby's reply to Moore: "it becomes the conclusion rather than the beginning of insightful and reflective discussions. What is critical is not the data per se but the quality of questions that arise from examining the data and feedback without prejudice or judgement."


I would add, though, that any thinking about how shifts in teaching practice affect students' learning experiences is more likely to be productive if it is informed by 1) what the teacher is aiming to do and what the students themselves are aiming to do, and 2) reflection on data collected in several different ways - surveys, emails (elicited or not), LMS usage data, and community feedback, for example. In this way the picture is likely to be much richer and more informative - although the educator must always look at themselves as the collector and interpreter of that data, and ask how they are being influenced by their own assumptions as an educator, as well as a social human being with a history, a number of communities, interests, relationships and so on.
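As a purely hypothetical illustration (not something from Conor's or Whitby's posts), here is a minimal sketch of what sitting several of those partial data sources side by side might look like; the student names, survey scores, LMS counts and notes are all invented:

```python
# Toy sketch: triangulating several invented data sources about one class --
# survey scores, LMS usage counts, and free-text feedback -- into one view
# per student, without collapsing them into a single 'result'.

from collections import defaultdict

survey_scores = {"ana": 4, "ben": 2, "caro": 5}          # 1-5 self-reported confidence
lms_logins_per_week = {"ana": 3, "ben": 9, "caro": 1}    # usage data exported from an LMS
feedback_notes = {
    "ana": "found the group task useful",
    "ben": "prefers working alone",
}

def triangulate(*sources):
    """Merge several (label, {student: observation}) pairs into one view per student."""
    combined = defaultdict(dict)
    for label, source in sources:
        for student, value in source.items():
            combined[student][label] = value
    return dict(combined)

picture = triangulate(
    ("survey", survey_scores),
    ("lms_logins", lms_logins_per_week),
    ("feedback", feedback_notes),
)

# Record the collector's own framing alongside the data, since the argument here
# is that the educator's assumptions shape what gets collected and how it is read.
picture["_collector_notes"] = "Surveys were voluntary; quieter students may be under-represented."

for student, view in picture.items():
    print(student, view)
```

The design choice worth noting is that the different sources stay visible side by side rather than being averaged into one number, and the collector's own framing is recorded alongside the data rather than hidden behind it.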
