Archives by date


  • Navigating research: A guide for teachers and school leaders


    Guest post by Dr. Richard Farrow 

    Part of the problem with the current rush to research has been that, in making things accessible, well-meaning organisations such as the Sutton Trust have divorced teachers even further from actual research by publishing guides summarising findings. This is all well and good for many people, but when a result is counter-intuitive, as in the case of the reported non-impact of teaching assistants (TAs), many have questioned the findings. To question findings properly, consulting the original sources is a wise move. It is a perfectly natural and valid thing to do, and something people in a research-rich environment should encourage.

     

    The question is how to do that. Our first problem is that many academic papers sit behind a paywall because they form part of a journal that must be paid for, whether online or in print. Academic journals have traditionally been expensive to buy, and many of them rely on charging a lot for a small number of copies to keep publishing. I would like to see the EEF pay for every registered teacher to have access to a system where research papers can be read and downloaded for free (something that already happens in Scotland). So, assuming you can access a paper, what are the other barriers for lay people reading research papers? I have made a list:

     

    1. They don’t know where to look for the evidence or what they need
    2. They have no idea about the method being used and whether it is valid or not
    3. The academic language used sends them to sleep after a few seconds
    4. It simply doesn’t make sense

     

    Points 3 and 4 are easy to address. Academic language is a must for people who work in academia. However, if, even when you have deciphered the hieroglyphics and think you know what something means, you still can’t make head nor tail of it, it is potentially rubbish. Often a conceptual issue provides the boundary, because some concepts are interpreted differently across disciplines, but the key is to make sure that the language amounts to something you recognise and makes sense to you as a practitioner. There is a push towards developing “research literacy” amongst teachers. Membership of the National Teacher Enquiry Network (NTEN) allows access to academic papers and provides support. If you are looking at this as an avenue for improvement, further information is available here: tdtrust.org/nten/home/

     

    More pressing are points 1 and 2, and, further clouding the water, they may be language-dependent too. Evidence is a hard concept to pin down, and many research papers use different types of evidence to make their points. A quick list of where evidence could come from:

    1. Survey/questionnaire data – written or verbal responses, generally designed to give you statistical values from which to make judgements.
    2. Statistical data – i.e. exam results, at school, authority, or national/supranational level (think PISA).
    3. Interviews – with practitioners, children and others – designed to add “flesh to the bones” of statistical data.
    4. Observations – of anything really: children, classes, teachers, etc.
    5. Other literature – as in the case of meta-analyses.
    6. Philosophical tools – such as analysing and rewriting something from a Marxist (or other) viewpoint.

     

    There are more, but most fall into one of these categories.

     

    In the past certain methods were considered more valid than others, and this continues to evolve as the requirements of research change. There are two main ways of carrying out social research: quantitative and qualitative. There are pitfalls in both approaches and no one would pretend otherwise. A combination of the two is also used, referred to as “mixed methods,” an approach that is growing in popularity in educational research. In my own opinion, a research area that people want to investigate properly needs both approaches, either together or separately. The more evidence you can get, the better. People often dismiss qualitative research as mere opinion, and done badly it can deserve that charge. However, the same allegation can be made against a purely statistical approach, which can miss the point and flatten out nuance in large data sets.

     

    This issue is not addressed by meta-analyses, which attempt to review a number of papers and make value judgements from them. Researchers gather papers on a certain topic, take the data and findings from them, and synthesise them. The problem is that unless the papers are analysing the same thing, comparisons can be useless. Repeat research is vital for understanding to improve, and a general rule is that the more studies exist on something, the better. Crucially, education is a field where very little repeat research is done. The figure given is that less than 1% of research papers actually repeat something done before; in other disciplines this is much higher. Meta-analysis as a tool looks great, but in reality it can give false results when it does not compare like-for-like studies. This is the main issue I have with the Sutton Trust toolkit. While its aim is noble, its way of arriving at results is full of holes so big you could drive a truck through them. The further fact that it gives a numerical value AND an increase/decrease in months of learning makes it hard to take seriously. It is a shame that so many people appear to take its findings at face value. Perhaps this is the point of this blog: a call for practitioners to ignore the overall analysis and get back to the roots of the results. Developing that “research literacy” will help.
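
    To make the mechanics concrete, here is a minimal sketch (in Python, with invented study labels and numbers, not real findings) of the inverse-variance pooling at the heart of most meta-analyses. It shows how studies of quite different uses of TAs can still be averaged into one tidy effect size – exactly the like-for-like problem described above.

    ```python
    # Minimal fixed-effect meta-analysis sketch. All labels and figures
    # below are invented for illustration; they are not real studies.

    studies = [
        # (what was actually studied, effect size d, standard error)
        ("TAs deliver a structured intervention", 0.30, 0.10),
        ("TAs give general classroom support",   -0.05, 0.08),
        ("TAs work one-to-one with set pupils",   0.15, 0.12),
    ]

    # Inverse-variance weighting: more precise studies count for more.
    weights = [1 / se**2 for _, _, se in studies]
    pooled = sum(w * d for (_, d, _), w in zip(studies, weights)) / sum(weights)

    print(f"Pooled effect size: {pooled:.2f}")
    # The single pooled number hides the fact that the three studies asked
    # different questions, so acting on the average may suit none of them.
    ```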

     

    What a research paper has/should have:

     

    A paper will start with an abstract, which summarises the research in around 100-150 words. This is an extremely difficult thing to do, so it is probably better to skip this bit if you have committed to reading the whole thing.

     

    Next should come a short paragraph introducing and summarising the aims and findings of the research. In a paper written for press attention this is called an executive summary and can be much longer. Often an executive summary is somewhat skewed and attempts to present findings that back up the ideological view of whoever commissioned the research. Again, if you have committed to reading the whole paper, I’d skip this bit.

     

    Following this you should have a methodology section. This explains why the research was carried out and where it fits in the current literature. It should link to a method of analysis used in other papers. This section is often missing in educational research. That is not because the research isn’t valid; it is because education often does not have many universally recognised tools of analysis. It is, however, a weakness and a symptom of the lack of repeat studies.

     

    Next comes the research design/method section. This explains how the research project was carried out. If it is a survey paper, it will present the terms of the survey and what is being looked for. If it is statistical, it will give a sample size and an indication of where that data came from. In education research this section is often inadequate. As a general rule, the larger the sample size, the better the research. In some forms of statistical analysis there is a magic number, around 1,000, beyond which you can draw reasonably firm conclusions: this is roughly where the margin of error falls to about +/- 3% and further gains become marginal. The smaller the sample, the wider the confidence interval, which is bad for drawing conclusions. In a qualitative paper this is a complicated section to put together, but again, the more people that have been interviewed/observed/questioned, the better.
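
    As a rough sketch of where that “magic number” comes from (my illustration, assuming a simple random sample, a 95% confidence level and the worst-case 50/50 split – assumptions real surveys rarely meet exactly), the margin of error shrinks with the square root of the sample size:

    ```python
    # Margin of error for an estimated proportion from a simple random
    # sample at 95% confidence (z = 1.96), using the worst case p = 0.5.
    import math

    def margin_of_error(n, p=0.5, z=1.96):
        return z * math.sqrt(p * (1 - p) / n)

    for n in (100, 400, 1000, 5000):
        print(f"n = {n:>5}: +/- {margin_of_error(n):.1%}")

    # n =   100: +/- 9.8%
    # n =   400: +/- 4.9%
    # n =  1000: +/- 3.1%
    # n =  5000: +/- 1.4%
    ```

    Quadrupling the sample only halves the margin of error, which is why gains beyond a thousand or so respondents come slowly.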

     

    After this, a data section, often called ‘data collection’, should be present. It will give you the raw data gained from the project. For statistical projects, you can look at whether the data comes from a fair range of sources. In a qualitative paper this section may include quotes or statements that address the key findings of the research. If a class has been observed, details of its composition and the research undertaken will be included. This section is absolutely vital for finding out whether the research is relevant to you. If you are looking at TA impact and this section does not address a point that applies in your school, it is pointless to consult. If your TAs work with a different group every day, while the study only looked at TAs who stay with the same group, whether or not the TAs studied made an impact is of no relevance to you. You will need to search for the paper that looks at TAs working with different groups to assess their impact. This is the issue with meta-analyses: the papers they draw on often do not address the same point, yet a general conclusion is drawn anyway. If you take this conclusion and make changes to your school as a result, you may be making a large mistake.

     

    Next comes the analysis, where the data is interpreted and conclusions are drawn. The author will be looking for ways to make a point, and in reading this section you must make sure the conclusions drawn are actually backed up by the data. One mistake education papers often make is overreaching from their findings. The researchers will carry out a very small-scale research project involving two or three classes, then attempt to say that it could have a national impact. What they should be saying is the following: “more research is needed in this area to find out if these results are valid.” What they are likely to say is more like: “our findings show this intervention/thing works and we expect that this will continue as the project expands.” If these conclusions are not backed up by the raw data, then you have a problem.

     

    Finally you should have a conclusion, where the main findings are summarised. This should give anyone wishing to build on the paper points to consider when carrying out their own research. Again, the conclusions should be backed up by the data and the analysis section.

     

    In sum, if the paper you are reading does not contain these sections, it is possibly not worth reading. That a huge amount of educational research omits one or more of them leaves us with a problem. This is where I would always urge you to focus on the data section. Assuming the people who carried out the research are competent researchers, the data should be useful. You are capable of analysing this type of thing yourself, and I would urge you to do it.

     

    Specific advice for school leaders looking to move staff towards a research-based model of school improvement:

    Join NTEN

    Get involved in researchED and related activities

    Make links with local universities that have educational research departments

    Join existing projects through making contacts via the above

    Make sure you have the time to embed this culture in your school

    Be enthusiastic: it will be difficult to drive this forward if you are not committed.

     

    Any questions? Get me on Twitter: @farrowmr

  • Why are the English so bad at Languages?


    By Sam Owen

    Might one type of ‘immersion’ be an answer?

    [Image: immersion dictionaries]

    Somehow it always seems to come up at drinks parties: why are English students so bad at languages? As a Spanish teacher I feel the burden of responsibility, and the need to defend my profession. I want to say that language teaching has come on hugely, that the training and research is excellent, that it is by far the hardest subject to teach, and that there are some extraordinary people out there doing it brilliantly. But it’s hard to forget that even Dutch service station attendants all speak English better than we do, or that everyone in Borgen speaks English (and probably three other languages) fluently. “Why are we so bad?” the person holding the glass of wine muses, and I look at my shoes, weighed down by the shame of our collective failings as language teachers.

    There are many answers, but I am convinced that one of them is to do with immersion – or CLIL (Content and Language Integrated Learning), as it is known by the cognoscenti. It is described as “teaching of non-language subjects through a foreign language, with both subject matter and language learning as goals.”[i] The Dutch (along with the Welsh, Catalans, Canadians and others) have schools where students study some of their subjects in another language. The language surrounds them on the radio and television; they need it; and they do the business of learning through it.


    The research says that if your brain believes something is important it will retain it better.[ii] The same goes for when the information is couched in a memorable or interesting context.[iii] One of the greatest motivations to speak to other people is to share one’s thoughts and opinions.[iv] And that’s the challenge language teachers face. How do you make La plume de ma tante important, contextually relevant, and something about which students can share their thoughts?

    If you teach them Geography or Biology in another language, however, all of that changes. They want to understand how the world works, so the language is necessary and important. They remember the words because they are tied to the world they know, to pictures, and concepts. They want to speak – because they have an opinion about what the solution or cause of something might be. The research on CLIL says that motivation, language retention, achievement, and desire to speak should all go up.[v]

    So we wanted to see if all that could really be true for us as well, and came up with the format of ‘Immersion Trips’: one to France, and one to Spain. Languages, Geography, and Biology working together in a place where we could expose the students to as much of the language, day to day, as possible. To tie in with our other principles we included some Harkness discussion, and tried to bring the subjects together by focussing the learning and research on the central question: ‘How sustainable is human activity in the Regional Park we are staying in?’ Finally, to include a physical challenge, we incorporated the Bronze DofE.

    It was a fantastic experience. It is enriching working with colleagues in other departments and learning from them (even if virtually impossible to timetable). There is also a fantastic sense of collaboration when involving students in the enquiry. Their feedback was extremely valuable and led to some immediate changes and adaptations; the sense of teamwork and partnership which arose from this was phenomenal.

    [Image: immersion interview]

    Research Questions

    To go about testing the efficacy of the trip we roughly followed the NTEN (National Teacher Enquiry Network) Lesson Study Model. These were the main questions we wanted to answer:

    1. Are students more confident speaking as a result of CLIL?
    2. Do they find the lesson more engaging/motivating/enjoyable when it focusses on content from other subjects?
    3. Does their retention of new language and vocabulary improve?
    4. Does the quality of their language production (speaking and writing) improve?
    5. Does learning the Geography and Biology content in a foreign language have a negative impact on understanding/retention of subject matter? If so, how much? Conversely, are there advantages?
    6. Does the incorporation of Harkness discussions lead to greater speaking, and better language?

    Methods

    In order to provide some objective measurements we marked the levels of the students’ final presentation (using MYP levels for Spanish, Geography, and Biology), and compared them to assessments done before the trip. Students were also given a surprise vocabulary test before and after the trip.

    In order to gauge levels of confidence and the students’ perception of their learning, we conducted two questionnaires – one before and one after the trip. The questionnaires involved a mixture of Likert-scale questions and descriptive answers.
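
    As a purely illustrative sketch of how a headline figure such as “30% more confident” might be derived from paired questionnaires (my own example with invented responses, not the project’s actual analysis), one can compare mean Likert scores before and after:

    ```python
    # Invented Likert responses (1-5) to a confidence question from the
    # same students before and after the trip. Illustrative only.
    before = [2, 3, 2, 4, 3, 2, 3, 3]
    after  = [3, 4, 3, 5, 4, 3, 4, 4]

    mean_before = sum(before) / len(before)
    mean_after  = sum(after) / len(after)

    change = (mean_after - mean_before) / mean_before
    print(f"Mean confidence before: {mean_before:.2f}")  # 2.75
    print(f"Mean confidence after:  {mean_after:.2f}")   # 3.75
    print(f"Relative change: {change:+.0%}")             # +36%
    ```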

    As part of the lesson study we also followed three students throughout the trip – trying to assess their reactions to the teaching, and conducting three filmed interviews during the trip with each student.

    Due to the logistics of getting it all off the ground some of the baseline data is patchy, but when combined with the students’ descriptive comments it gave us some very strong indicators of progress.

    Findings

    [Image: immersion river study]

    The findings were extremely encouraging. The students were more confident speaking (+30%), and said things like: “I found it much more interesting actually speaking to Spanish people, as this provided a much more legitimate experience than you would find within a classroom,” “Learning it like this was more useful because you had to understand and be able to speak the language to get anything done”, or “Last night I had a dream in Spanish… I’m really enjoying hearing it around being spoken.. it’s around all the time.” As it happens, their confidence went up in all four skills (speaking, reading, writing, and listening).

    They also enjoyed the style of learning a great deal, and were 45% more positive about it than about normal lessons.

    Their retention of vocabulary also went up. In a specific vocab test their marks rose by 14%, and they could freely recall more than twice as many words from the trip as before.

    The quality of their language production improved by 10%, and they were 36% more positive about their improvement than normal.

    Surprisingly, their Geography and Biology assessment marks also went up. However, the baseline data was patchy, and the nature of the group work may have meant that many piggy-backed on the most able. My general impression is that there were significant gaps in the understanding of some. However, I would hope that with some more practice, and perhaps some support after the trip in English, we might get to the point where we feel confident they are learning Biology and Geography as well as in normal lessons.

    The Harkness discussions needed some adjustments out there and challenged the students. However, 75% felt they spoke more using Harkness than in traditional speaking activities, and I think that with some more time incorporated for feedback and corrections, we can improve the quality of their language here further.

    It was hard work, but extremely rewarding – working together with staff from other departments to find out what works and what doesn’t is not only stimulating, but it also makes for an atmosphere of undefendedness as you learn alongside your colleagues and students. Paradoxically this is very empowering. And who knows – perhaps, just perhaps, it might be part of the answer as to how we can catch up with the Dutch.

    The full report is here:

    https://wellingtoncloud-my.sharepoint.com/personal/saso_wellingtoncollege_org_uk/_layouts/15/WopiFrame.aspx?sourcedoc={BCE2AC70-5E90-4770-B60C-4A891F3B06FA}&file=Findings%20from%20immersion%20trip_report.docx&action=default

    [i] Montet, M. and Morgan, C. (2001). Teaching geography through a foreign language: How to make text accessible to learners at different levels. Language Learning Journal, 24, 4-11

    [ii] Ellis, R. (2005). Task-based language learning and teaching. Oxford: OUP, p. 21

    [iii] Coyle, D. (1999). Supporting students in content and language integrated contexts: planning for effective classrooms. In Masih, J. (Ed.), Learning through a foreign language: models, methods and outcomes (pp. 46-62). Lancaster: CILT, especially pp. 49-50.

    [iv] Prabhu (1987) – cited in Do Coyle (1999)

    [v] See for instance:

    1. Bennett, N. and Dunne, E. (1990). Talking and learning in groups. Routledge.
    2. Williams, M. and Burden, R. (1997). Psychology for language teachers. Cambridge University Press.
  • Improving student writing by modelling bad writing.


    Tom Wayman, Head of English, Wellington College

    “Let’s write the WORST story ever written” was a much more appealing invitation to my mixed-ability Year 11 class than previous offers to collaborate on writing the best one ever. Putting together an unholy montage of cliché, malapropism, comma splices firing off in all directions and gushing logorrhoea was not only liberatingly hilarious but deeply educational.


    As each pupil threw down their worst into a shared Drive folder, they were able to gasp and chuckle as they sent up the very things which, unthinkingly, they had pretty much been doing two weeks earlier. We projected a few onto the board; some were read out – it was all great fun. The set were able to atomise clearly what made, in this case, a short story poor. If we use exemplar material to showcase the best, why not put together wilfully weak work to highlight the obverse?

    I did a similar exercise with my Year 9s last week. Using some of the excellent resources of the British Library timeline, around which we base the first seven weeks of our Year 9 curriculum, we looked at Victorian Freak Show posters to sharpen our analytical writing. In response to the question, ‘What do these posters reveal about the society at the time?’, I typed and projected live the following glorious glossolalia:

    The societe was probably quiet nice becaue they licked to look at strange people and it was funny but they might have felt a but  bad which revelas that they like posters. Also they think tall people are weird. Which made them laugh. The thought it was good. The main attrecation of the show is the smaller perond to the right and it is very old which is shown bauise the paper is brown. I think that Amazons were strange, I think that they kind of through ti was quite nice.

    Although this emerges as a parody of careless writing and editing, the key points of learning were a pleasure for the class to identify and guard against as they set off, for homework, to produce the real thing – to great success. I’d like to call this strategy ‘Modelling c**p’ – but that’s clearly not appropriate. It does catch, however, a little of the useful anarchy of the exercise….

    [Image: “Excellense”]

  • Grit Resources


     

    “Promoting Grit, Tenacity, and Perseverance: Critical Factors for Success in the 21st Century”, US Department of Education meta-review, 2013 (pretty massive, but nothing beats this source for a comprehensive look at the most cutting-edge info on grit and mindsets)

     

    Building Emotional Intelligence: Techniques to Cultivate Inner Strength in Children by Linda Lantieri & Daniel Goleman

    “An innovative educator and the pioneer of emotional intelligence team up to present a groundbreaking program for building resilience and inner strength in children.”

     

    How Children Succeed: Grit, Curiosity, and the Hidden Power of Character

    http://www.paultough.com/the-books/how-children-succeed/

    “The story we usually tell about childhood and success is the one about intelligence: success comes to those who score highest on tests, from preschool admissions to SATs. But in How Children Succeed, Paul Tough argues that the qualities that matter most have more to do with character: skills like perseverance, curiosity, conscientiousness, optimism, and self-control.

     

    How Children Succeed introduces us to a new generation of researchers and educators who, for the first time, are using the tools of science to peel back the mysteries of character. Through their stories—and the stories of the children they are trying to help—Tough traces the links between childhood stress and life success. He uncovers the surprising ways in which parents do and do not prepare their children for adulthood. And he provides us with new insights into how to help children growing up in poverty.”

     

    Fostering Grit: How do I prepare my students for the real world? by Thomas R. Hoerr

    http://www.ascd.org/Publications/Books/Overview/Fostering-Grit.aspx

    “Grit is a combination of tenacity and perseverance—a willingness to take risks even if it means sometimes failing and starting again. Knowing how to respond to frustration and failure is essential whether a student struggles or excels. Veteran school leader and popular Educational Leadership columnist Thomas R. Hoerr shows what teaching for grit looks like and provides a sample lesson plan and self-assessments, along with a six-step process applicable across grade levels and content areas to help students build skills they need to succeed in school and in life.”

    [PRINT ARTICLES – NON-RESEARCH]

    “Resilience and Grit: Resource Roundup”

    http://www.edutopia.org/resilience-grit-resources

    Edutopia’s carefully curated collection of blogs, articles, interviews, and videos with information for parents and educators about the associated concepts of resilience and grit.

     

    “Teaching Grit: Social and Emotional Truth”

    http://www.edutopia.org/blog/grit-social-emotional-truth-vicki-zakrzewski

    “Close examination of both the construct of grit and research reveals that teaching grit requires more than we may think. Grit involves the interplay of thoughts and emotions, demanding a wellspring of inner resources to overcome the inevitable obstacles that arise when going full-force after a goal. Thus, teachers who want their students to be more “gritty” need to be aware of those students’ inner lives as well as the outer steps being taken to reach their dreams.”

     

    “5 Steps to Foster Grit in the Classroom”

    http://www.edutopia.org/blog/foster-grit-in-classroom-andrew-miller

    “The character traits of determination, adaptability and reflection add up to a critical 21st century skill.”

     

    “True Grit: The Best Measure of Success and How to Teach It”

    http://www.edutopia.org/blog/true-grit-measure-teach-success-vicki-davis

     

    A Three Part Series on Grit and Resilience from UC Berkeley’s Greater Good Science Center

    “How to Help Kids Overcome Fear of Failure” – http://greatergood.berkeley.edu/article/item/how_to_help_kids_overcome_fear_of_failure

    “What’s Wrong With Grit?” – http://greatergood.berkeley.edu/article/item/whats_wrong_with_grit

    “Two Ways to Foster Grit” – http://greatergood.berkeley.edu/article/item/two_ways_to_foster_grit

    “According to the research on failure, students may need more than just grit to succeed. To help students learn to overcome obstacles in pursuit of long-term goals, educators should focus on developing cognitive and emotional skills.”

    [PRINT ARTICLES – RESEARCH]

    Duckworth, A.L., Peterson, C., Matthews, M.D., & Kelly, D.R. (2007). “Grit: Perseverance and passion for long-term goals”. Journal of Personality and Social Psychology, 92 (6), pp. 1087-1101.

    http://www.sas.upenn.edu/~duckwort/images/Grit%20JPSP.pdf

    [VIDEOS & LECTURES]

    Angela Duckworth’s Video Series on Grit

    “What is grit?” – http://www.youtube.com/watch?v=Rkoe1e2KZJs

    “Are there virtues that are precursors or closely associated with grit?” – http://www.youtube.com/watch?v=EFlflmC8K08

    “What role does humility play, if any, in cultivating grit?” – http://www.youtube.com/watch?v=-lshCXcZj00

    “How has your personal story been a window into your research on grit?” – http://www.youtube.com/watch?v=TNWV6B2kI48

    “What advice would you give to parents who wanted to cultivate grit in their children?” – http://www.youtube.com/watch?v=di7u-4gTUxY

    “What is psychological distancing and how does it relate to self control?” – http://www.youtube.com/watch?v=DiaPzrUbu2Q

     

     


     

  • Lesson-study, design based research and professional development.



    Dr. Chris Brown, Institute of Education. 

    In this article Chris Brown reports on the Haverstock primary to secondary transition project. Designed to improve the experience of transition to secondary school for vulnerable pupils in Camden (London), the project uses Lesson Study to help primary and secondary practitioners work together to develop effective cross-phase pedagogical approaches to teaching English/literacy and science.

    INTRODUCTION 

    Connecting research to practice is notoriously problematic. Much has been written, for example, regarding whether the ‘evidence-informed’ movement serves to work against practitioners’ professional judgment (e.g. see Biesta, 2007; Brown, 2014). Likewise, there exist issues in relation to how formal academic knowledge and professional or tacit knowledge might be effectively combined (Stoll, 2009; Brown, 2013). Furthermore, there are the still very active and virulent disputes surrounding some of the methods commonly associated with enhancing evidence use (e.g. randomized controlled trials and the process of systematic review – see MacLure, 2005; Nutley et al., 2007; Brown, 2013). A further aspect of note centers on how practitioners’ capacity to engage with academic research might be enhanced (e.g. Hargreaves, 1996; Cooper et al., 2009). This issue is also intertwined with the notion that much academic research is inaccessible to teachers, both in terms of where it is published and the language that is typically used within such publications (Hillage et al., 1998; Tooley and Darby, 1998).

    The idea of Design Based Research (DBR) (sometimes referred to as Design-Based Implementation Research) has been positioned as one way to overcome many of these issues (Penuel et al., 2011; Coburn et al., 2013), with Vanderlinde and van Braak (2010) explicitly suggesting that DBR approaches should be used to close the research practice gap. Described by Anderson and Shattuck (2012: 16) as an approach “designed by and for educators that seeks to increase the impact, transfer, and translation of education research into improved practice”; DBR in addition “stresses the need for theory building and the development of design principles that guide, inform and improve both practice and research in educational contexts” (ibid). Anderson and Shattuck (2012: pp.16-17) go on to suggest a number of important definitional Design Based Research attributes:

    That DBR must be situated in a real educational context;

    DBR should focus on the design and testing of a significant intervention;

    It will involve iterative refinement of that intervention to improve its operation and to build on/iron out past mistakes;

    DBR must involve a collaborative partnership between researchers and practitioners; and

    The process of DBR leads to the development of design principles reflecting the conditions within which the intervention operates.

    Coburn et al., (2013) suggest that DBR may be differentiated from other approaches that seek to connect research to practice because: 1) with DBR, the distinction between the roles of practitioners and researchers is blurred (i.e. practitioners and researchers are mutually responsible for developing theory and embedding solutions); and 2) collaboration occurs in real time throughout the project rather than concentrated at the beginning (scoping) and end (debrief). DBR thus represents a shift from the traditional perspective of research and practice being two distinct activities, with the former being able to unambiguously influence the latter (Vanderlinde and van Braak, 2010), and towards the simultaneous build and study of solutions. As Coburn et al., (2013: 8) suggest: “[DBR has] two goals of equal importance… develop materials and instructional approaches that can be implemented in classrooms, schools and districts. At the same time… to advance research and theory”. 

    LESSON STUDY 

    Lesson Study has been described as a ‘teaching improvement process’ that has origins in Japanese elementary education, where it is a widely used professional development practice (Dudley, 2014). As a process, Lesson Study involves teachers collaborating, normally in groups of three, to progress cycles of iterative practice development. Such cycles typically involve the following steps: 1) a discussion of student learning goals and the identification of a teaching strategy that might meet these; 2) planning an actual classroom lesson (called a ‘research lesson’) that employs this strategy; 3) observing how the lesson works in practice; and 4) discussing and embedding revisions to enable improvement. In the Japanese model, teachers also report on and often hold public demonstrations of the lesson so that other teachers can benefit from their learning (Dudley, 2014). In addition three pupils, who represent wider groups of interest, will be observed and their progress monitored as case studies of the impact of the approach (ibid). In itself, Lesson Study can be considered a form of Joint Practice Development (JPD), a process described by Fielding et al., (2005) as involving developing ways of working through collaborative engagement that, as a result, opens up and shares practices with others. And whilst Lesson Study does have a number of distinctive characteristics, its underpinning mechanism, as with other JPD approaches, involves a process that is truly mutual, rather than one-way, with the practice concerned being improved rather than simply moved from one person or place to another.

    THE CAMDEN PARTNERSHIP FOR EDUCATIONAL EXCELLENCE

    The Camden Partnership for Educational Excellence (CPEE) was set up in April 2012 with the vision of making the London Borough of Camden “the best borough for education”. The CPEE aims to drive forward the recommendations of the Camden Education Commission (an independent commission jointly established by Camden schools and Camden Council to take a long-term perspective on strengthening education in the borough), which highlighted key issues and opportunities for Camden schools in the light of changes to the English education landscape. In 2013, the CPEE Board invited schools, colleges, partners and stakeholders to bid for funds from a £2 million pot set up to support innovative ways of working and projects to raise achievement and attainment – in particular, to find ways of improving outcomes for the most vulnerable groups of students. A key requirement of the CPEE’s bid call was that school improvement projects should be based on the Lesson Study approach. This followed the appointment to Camden Local Authority/the CPEE Board of a staunch Lesson Study advocate who had been involved in the process for a number of years, both in the UK and abroad (see Dudley, 2014).

    THE HAVERSTOCK PRIMARY TO SECONDARY TRANSITION PROJECT

    A key finding from the Camden Education Commission’s final report was that, particularly for vulnerable students, “transition arrangements [within Camden] at present are not consistently good enough” (2011: 5); correspondingly, it argued that improving these should be a central focus of improvement efforts moving forward. In particular, it suggested that there should be a better understanding between year 6 and year 7 teachers (teachers of students aged 11-12) of the pedagogy and practice of teaching and learning in each other’s institutions, which would assist them both in preparing students for success and in supporting students to flourish in their new environments (2011: 36). In response to the report and the invitation by the CPEE board for organizations to bid for funding for projects, colleagues from Haverstock school (Camden) and the Institute of Education, University of London teamed up to develop a project that might serve to address some of the commission’s concerns in relation to transition.

    Our first step was to undertake a review of the international literature on the issue of primary to secondary transition, enabling us to situate our proposal within a suitable theory of learning (Penuel et al., 2011). The review led us to centre our proposed bid to the CPEE around the following three notions:

    That transitions are at their strongest when: “the social, emotional, curricular and pedagogical aspects of learning are managed in order to enable pupils to remain engaged with, and have control of, their learning” (DCSF, 2008: 5; also see McGee et al., 2004; Evangelou et al., 2008). 

    That whilst a myriad of initiatives have been undertaken to tackle the social and emotional aspects of learning (especially for more vulnerable pupils: e.g. see Shepherd and Roker, 2005), Galton et al., (1999: 6) argue that: “there are still problems at transfer with curriculum continuity”; and that 

    Such continuity serves, however, to maintain pupils’ interest in learning, allows them to progress in their learning, and so helps them avoid the internationally observed learning hiatus that seems to accompany transition (McGee et al., 2004; Evangelou et al., 2008). 

    As a result, the Haverstock Primary/Secondary Transitions project was conceived with the purpose of bringing together primary and secondary teachers from the London Borough of Camden in order that they might employ Lesson Study to develop effective cross-phase pedagogical approaches/strategies for teaching English/literacy and Science to support the transition of year 5 to year 8 students: in other words, to provide the hitherto lacking consistency/continuity in the way students across the phases were taught these two subjects. In particular, the project focused on two vulnerable groups: i.e. those students most at risk in terms of their progress post-transition. Within this project we consider ‘vulnerable’ as contingent on pupils’ ability to make a successful academic, social and emotional transition from Year 6 to Year 7. Our scope thus includes pupils entitled to Free School Meals (FSM) “and pupils from some ethnic groups (which ones depending on the particular subject being assessed)” (Galton et al., 1999: 22; Tree, 2011): in this case, white British students (closing the gap for white working-class students is a high priority both within Camden Local Authority and within the English context; for example, the Camden Partnership for Educational Excellence highlighted this group as being the cohort with the most underachievement overall). We also sought to include more able pupils not fulfilling their potential.

    Jointly directed by colleagues from Haverstock school and the Institute of Education, the specific aims of the project were to improve student and teacher outcomes in relation to:

    Improved rates of progress and attainment for ‘vulnerable’ pupils within each of years 3-8 (ages 8 through to 13).

    More robust, challenging and innovative but also consistent pedagogic practice at National Curriculum levels 1-8 in English and science (levels represent how pupils progress in relation to England’s National Curriculum) and assessment practices through years 5, 6, 7 and 8 (ages 10 through to 13).

    Shared teacher confidence in using these practices, as well as in planning work and moderating levelling in English and science from levels 1 to 8 in order to assess progress.

    A group of teachers able to use Lesson Study approaches to improve classroom practice and impact on standards, thus building transferable capacity.

    The project (whose main phase commenced in September 2014) comprises a pilot and a main phase, with the latter involving 18 schools engaged in:

    Nine Lesson Study sessions throughout the course of the academic year.

    A workshop to examine both interim impact and to look at cross-phase approaches to assessment/moderation.

    A workshop to examine both interim impact and cross-phase approaches to data transfer.

    Sessions bringing participating schools together to design final resources for the project, with the aim that these resources can be used by schools within the borough and more widely.

    A knowledge mobilisation and end of project workshop.

    A DBR APPROACH TO LESSON STUDY

    Primary and secondary schools have their own particular ways of working, and these are not especially well suited to fostering cross-phase collaboration. For instance, primary teachers will teach all subjects to one cohort of pupils for an entire year. In contrast, secondary school teachers will specialise by subject area and so will teach that one subject to a number of different classes. In addition, using Lesson Study is a new phenomenon in English schools, and using it in a cross-phase way (to tackle issues of transition) is rarer still. Bearing in mind the particular ways of working of each phase, and that neither the researchers nor the practitioners involved in the project had engaged in Lesson Study activity before, it was decided that a pilot phase of five months with a small group of schools be run in order that researchers and practitioners could collaborate in trialling the approach and ascertaining how it might be made fit for practice: in other words, to enable a Design Based Research approach to the development and implementation of Lesson Study. That is, in keeping with Anderson and Shattuck (2012), we (i.e. participating teachers from these schools, the Assistant Head project lead from Haverstock school, and researchers and facilitators from the Institute of Education) sought, as a collaborative partnership, to design, test and refine cross-phase Lesson Study in a real educational context, with a view to establishing a basis for its future roll-out.

    DEVELOPING A THEORY OF ACTION FOR LESSON STUDY

    A key aspect of this approach was the establishment of a theory of action for Lesson Study: in other words, to determine which aspects of Lesson Study were an integral part of a logical chain leading to improved student outcomes and which were more open to contextual manipulation (Argyris and Schön, 1996; Cherney and Head, 2011). A mutually developed theory of action has been shown to have significantly positive impacts on the effectiveness of the interventions it relates to (Lemons et al., 2014) and so is a vital aspect of DBR. As noted above, it is argued that, as a form of Joint Practice Development, Lesson Study involves collaborative engagement that serves to open up and share practices. As such, the development of our theory of action for Lesson Study centred on how adults can learn from and build upon the best practice of their peers through interaction. This led us to the constructivist mode of learning, which posits that, in instances of effective learning, new understanding is not simply digested but engaged with as part of a social process of adoption, accommodation and assimilation (Black and Wiliam, 2001; James et al., 2007). This means that new information will be filtered through past experience, knowledge and understanding, implying that the process of new learning must start via the exploration of existing ideas as well as the encouragement of those holding them to express and defend them. As a result, these ideas can be surfaced and made explicit, the importance of which is illustrated by James et al., who suggest that: “unless learners make their thinking explicit to others, and so to themselves, they cannot become aware of the need for conceptual modification” (2007: 17). The social constructivist approach further augments the constructivist notion of learning by emphasizing the interactive element of learning: i.e. the idea that learning proceeds via interaction within a social milieu.

    In order to facilitate this learning, we turned to the literature on Professional Learning Communities. Whilst there is no universal definition of a Professional Learning Community (PLC), they are usually depicted as a situation in which people involved with and concerned about teaching and learning work collaboratively to learn how they can improve pupil learning or outcomes (Stoll, 2008; Harris and Jones, 2012). In particular we looked at the nature and structure of the ‘learning conversations’ that take place as part of PLC activity. Described as “deep, sustained conversations among teachers about matters of teaching and learning” (Stoll and Louis, 2007: 79) and “the way that educators make meaning together and jointly come up with new insights and knowledge that lead to intentional change to enhance their practice and student learning” (Stoll, 2012: 6), learning conversations comprise considered, thoughtful (rather than superficial) discussion and challenge, focused on matters of teaching practice, that considers evidence of actual and potential forms of practice and is undertaken with a view to developing both improved practice (i.e. new solutions to issues) and, as a result, outcomes for students. Moving deeper into this area, Stoll (2012: 6-11) suggests that the following features are characteristic of high-quality learning conversations between adults:

    Focus on evidence and/or ideas. Learning conversations are focused, with the specific focus reflecting one of two important perspectives. The first centres on existing and effective practice within the school/network. The second reflects ideas about innovation and transformation where, for example, the conversation explores creative ways to engage learners and extend learning. Because the second focus will require elements of the first, many conversations weave the two perspectives together.

    Experience and external knowledge/theory. Access to outside expertise deepens learning conversations. Whether delivered personally, through writing, or via other media, independent ideas are injected to stimulate reflection, challenge the status quo and extend thinking. Such ideas can help promote greater depth in conversations. 

    Protocols and tools. Learning conversations can often be framed more clearly when supported by frameworks and guidelines that help participants structure their dialogue and interrogate evidence or ideas.

    Facilitation. Facilitation isn’t the same as external expertise. It can come from inside or outside the group, but it’s needed to elicit and support intellectual exchange, as well as maintaining open dialogue and, sometimes, injecting new energy into the conversation. Skilful facilitation can often lead to a productive balance of comfort and challenge.

    OPERATIONALIZING LESSON STUDY

    These four elements (plus the four steps outlined above) thus formed the basis for how we initially thought to structure and operationalize Lesson Study activity. As a result, it was decided by the project team that the pilot phase should commence with a one-day facilitated workshop in which practitioners held data-informed discussions about the key issues their vulnerable students faced in relation to English/literacy and Science. Before the workshop, the Assistant Head project lead from Haverstock school and researchers and facilitators from the Institute of Education spent a day developing protocols and tools to facilitate learning conversations and planning activity within the workshop (based on approaches used by Stoll: e.g. see Stoll and Brown, forthcoming). Using these, participants worked through a series of activities designed to help them decide upon one focus area (a topic being taught that encapsulated the issue) and to think about a common approach to teaching the topic in relation to the issue that triads could adopt, implement and iteratively improve. Following this, participants were asked to identify three students within each school who represent the focus (vulnerable) students and then to collaboratively plan the first research lesson that would be taught/observed.

    In keeping with the notion that it is expertise with respect to a given intervention that enables practitioners to tailor interventions to their specific situation, and that the development of expertise involves both aspects of effective learning and sustained skill refinement (practice) (see Bryk et al., 2011; Penuel et al., 2012; Lemov et al., 2013; Brown and Rogers, 2014), the pilot phase then involved three full Lesson Study days. These involved practitioners: 1) revisiting the purpose of the lesson and the focus area that it linked to; 2) being talked through each phase of the lesson and what its aims and goals were; 3) observing how the lesson worked in practice (with a focus on the case children); 4) interviewing the case children for their perspectives on the issues; 5) a facilitated discussion to evaluate the lesson (what went well and what would have made the teaching activity even better); and 6) building on what had happened (i.e. collaboratively establishing how to improve the pedagogic approach) and planning for the next Lesson Study research lesson. Again, before the first Lesson Study day, the project team spent a day together collaboratively developing protocols, tools and an outline for the day to facilitate the Lesson Study process. The Lesson Study activity was also observed by the project team in order to give us an understanding of how it was being enacted.

    COLLABORATIVELY REVIEWING AND IMPROVING LESSON STUDY ACTIVITY

    Throughout the pilot phase, time and space were created to enable researchers and practitioners to deliberate and discuss what they had learned and their experiences in relation to Lesson Study. Through this dialogic process we were able to construct common understanding and meaning with regard both to aspects of the process and to the use of tools and protocols to facilitate it. As a result, we were then able to understand which aspects of the approach were successful in helping participants develop their practice and improve outcomes for the most vulnerable, and which appeared to provide limited value. In other words, as Gutierrez and Penuel (2014: 20) suggest: “[s]tudying the ‘social life of interventions’ [helped us] move away from imagining interventions as fixed packages of strategies with readily measurable outcomes and towards more open-ended social or socially embedded experiments that involve ongoing mutual engagement”. To capture the learning from the pilot phase, a one-day workshop was held at the end of the three Lesson Study days so that the main phase could be collaboratively developed. Aspects here included the nature of the ‘kick-off’ event that would introduce ‘main stage’ participants to the process and the types of exercises and tools to be used as part of this event; the grouping and sequencing of Lesson Study days throughout the year (bearing in mind the distinct ways of working of each phase of schooling); the nature (running order) of each Lesson Study day; the nature of the tools and protocols to be employed as part of each main-phase Lesson Study session; and how impact should be conceived of and measured (see below).

    What was also viewed as important by both participants and the project team, however, was that, as we scaled the project up from pilot to main phase, the dialogic process that enabled us to understand and iteratively improve the operation of Lesson Study could continue at scale. Perhaps one of the main issues of the DBR approach as currently conceived is that it is very researcher-intensive: in other words, it requires researchers working intensively with small numbers of practitioners. The failure of evidence to make a widespread and sustained impact on the practices of teachers is, however, an international phenomenon (Bryk et al., 2011; Taylor, 2013) and, despite considerable activity, system-wide processes for effectively connecting and applying research and professional knowledge remain underdeveloped (Gough et al., 2011). An imperative, then, must be to find ways of examining how DBR approaches can impact on maximal numbers of teachers. To overcome this, practitioners and researchers jointly agreed on the need for distributed ownership: if DBR at scale is an unmanageable task for researchers alone, then researchers cannot be the only actors involved in creating meaning – practitioners experienced in the deliberative process should also be able to move beyond their traditional roles and engage in this way too (Coburn et al., 2013). This agreed approach to capacity building meant that we were able to use the original pilot group members as practitioner-researchers who could form new triads and engage with practitioners involved in the project’s main stage. This freed time for the researchers to work with other groups of ‘main stage’ practitioners, and both sets of researchers could then meet periodically to consider ongoing improvements and changes that needed to be made to the Lesson Study methodology.

    FINDINGS SO FAR

    Interim impact data illustrates that the DBR approach to Lesson Study has been successful in enabling us to build practitioners’ capacity in relation to: 1) understanding and evaluating impact; 2) cross-phase collaboration; and 3) engaging in, managing and tailoring a process of cross-phase Lesson Study. Such capacity building is a direct result of the DBR approach: in other words, engaging in DBR activity has enabled practitioners to develop an understanding of the theory of action underpinning Lesson Study: that is, how we could ensure the effective enactment of learning conversations so that best practice could be established and iteratively improved. Capacity building is also a vital precursor to the project’s ongoing sustainability: i.e. its propensity to be self-sustaining once funding ends and the research team can no longer be involved (Coburn, 2003; Gutierrez and Penuel, 2014). It is only when practitioners take ownership of an initiative that it will continue and, it is argued, feelings of ownership will flourish if practitioners have been involved in the creation and development of an initiative. The approach has also provided much-needed scalability: with each new round of Lesson Study, those involved in the previous round can work with two new practitioners while continuing to develop and refine the approach.

    In terms of the wider group of participants, impact data shows that teachers are better able to identify ways of improving practice that impact on the outcomes of their students, and to receive feedback so that their improvement efforts can develop iteratively. To ascertain impact, participants were surveyed each term and asked to respond in relation to their baseline data, their desired outcomes, and the Lesson Study observation and other data available to them. Specifically, impact was assessed via two main questions:

    What difference has Lesson Study activity made to your practice? How do you know?

    What difference has Lesson Study activity made in terms of pupil outcomes? How do you know?

    Examples of some of the responses here include:

    “My pupils are able to be more specific about skills in writing.  I now have a clear structure and success criteria for how they collaborate to identify successes and next steps. They can use it to apply grammar knowledge. All pupils have made at least one point progress”.

    “When we have used the process developed in the lesson study, the writing has always been improved by the pupils. Also, the process has helped pupils with their communication skills, ability to describe specific aspects of writing and range of ideas used in improved writing”.

    “The impact of this has been evident in lessons where I have used the improvement marking strategy that we explored as part of LS and many students in my classes have made good progress with the introduction of this technique in my practice”.

    “Many pupils have displayed increased confidence when discussing and revising work. I have noticed a marked improvement in second drafts of assessments”.

    “I have developed a greater awareness of strategies that can be used to enable pupils to learn scientific vocabulary”.

    Results from the main phase will be available in late 2015. Watch this space.

    REFERENCES

    Anderson, T. and Shattuck, J. (2012) Design-Based Research: A Decade of Progress in Education Research, Educational Researcher, 41, 1, pp. 16-25.

    Argyris, C. and Schön, D. (1996), Organizational learning II:  Theory, method, and practice, (Reading, MA, Addison-Wesley Publishing Company). 

    Biesta, G. (2007) Why ‘What Works’ Won’t Work: Evidence-based Practice and the Democratic Deficit in Educational Research, Educational Theory, 57, 1, pp. 1-22.

    Black, P. and Wiliam, D. (2001) Inside the Black Box: Raising standards through classroom assessment, available at: http://weaeducation.typepad.co.uk/files/blackbox-1.pdf, accessed on 14 July 2014.

    Brown, C. (2013) Making Evidence Matter: A New Perspective on Evidence-Informed Policy Making in Education, (London, IOE Press).

    Brown, C. (2014) Evidence Informed Policy and Practice in Education. A Sociological Grounding, (London, Bloomsbury).

    Brown, C. and Rogers, S. (2014) Knowledge creation as an approach to lead evidence informed practice in early years settings: Examining ways to measure the success of using this method with early years practitioners in Camden, London. Presented at the American Educational Research Association, Philadelphia, PA, 3-7 April, 2014.

    Bryk, A., Gomez, L. and Grunow, A. (2011) Getting ideas into action: Building Networked Improvement Communities in Education, in Hallinan, M. (ed) Frontiers in Sociology of Education, Frontiers in Sociology and Social Research, (Dordrecht, NL, Springer).

    Camden Education Commission (2011) Camden Education Commission: Final report, (London, Camden Children’s Trust Partnership Board and London Borough of Camden).

    Cartwright, N. (2013) Knowing what we are talking about: why evidence doesn’t always travel, Evidence & Policy, 9, 1, pp. 97-112.

    Cherney, A. and Head, B. (2011) Supporting the knowledge-to-action process: a systems-thinking approach, Evidence & Policy, 7, 4, pp. 471-488. 

    Coburn, C. (2003) Rethinking scale: moving beyond numbers to deep and lasting change, Educational Researcher, 32, 6, pp. 3-12.

    Coburn, C., Penuel, W. and Geil, K. (2013) Research-Practice Partnerships: A Strategy for Leveraging Research for Educational Improvement in School Districts, (New York, NY, William T. Grant Foundation).

    Cooper, A., Levin, B. and Campbell, C. (2009) The growing (but still limited) importance of evidence in education policy and practice, Journal of Educational Change, 10, 2-3, pp. 159-171.

    Department for Children Schools and Families (2008) The National Strategies. Strengthening transfers and transitions: partnerships for progress, available at: http://dera.ioe.ac.uk/7464/1/str_tt_prtnshp_prgrss08308.pdf, accessed 5 September, 2013.

    Dudley, P. (2014) Lesson Study: A handbook, available at: http://lessonstudy.co.uk/wp-content/uploads/2014/01/new-handbook-early-years-edition2014-version.pdf, accessed on 21 July 2014.

    Earley, P. and Porritt, V. (2013) Evaluating the impact of professional development: the need for a student-focused approach, Professional Development in Education, DOI: 10.1080/19415257.2013.798741

    Easton, J. (2013) Using measurement as leverage between developmental research and education practice. Talk given at the Center for the Advanced Study of Teaching and Learning, Curry School of Education, University of Virginia, Charlottesville. 

    Erickson, F. and Gutierrez, K. (2002) Culture, rigor, and science in educational research, Educational Researcher, 31, 8, pp. 21-24.

    ESTYN (2004) Recommendations on implementation of transition provisions in the Education Act 2002, (Cardiff, ESTYN).

    Evangelou, M., Taggart, B., Sylva, K., Melhuish, E., Sammons, P., and Siraj-Blatchford, I. (2008) What makes a successful transition from primary to secondary school, (Nottingham, Department for Children Schools and Families).

    Fielding, M., Bragg, S., Craig, J., Cunningham, I., Eraut, M., Gillinson, S., Horne, M., Robinson, C. and Thorp, J. (2005) Factors influencing the transfer of good practice, available at: http://webarchive.nationalarchives.gov.uk/20130401151715/http://www.education.gov.uk/publications/eOrderingDownload/RR615.pdf.pdf, accessed on 21 July, 2014.

    Galton, M., Gray, J. and Rudduck, J. (1999) The impact of school transitions and transfers on pupil progress and attainment, (Nottingham, Department for Education and Employment).

    Gough, D., Tripney, J., Kenny, C. and Buk-Berge, E. (2011) Evidence informed policymaking in education in Europe: EIPPEE final project report summary, available at: www.eipee.eu/LinkClick.aspx?fileticket=W6vkqDjbiI%3d&tabid=2510&language=en-GB, accessed on 11 September 2012.

    Gutierrez, K. and Penuel, W. (2014) Relevance to Practice as a Criterion for Rigor, Educational Researcher, 43, 1, pp. 19-23.

    Hargreaves, D. (1996) The Teacher Training Agency Annual Lecture 1996: Teaching as a research-based profession: possibilities and prospects, available at: http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/TTA%20Hargreaves%20lecture.pdf, accessed on 14 January 2013.

    Harris, A. and Jones, M. (2012) Connect to Learn: Learn to Connect, Professional Development Today, 14, 4, pp. 13-19.

    Hillage, J., Pearson, R., Anderson, A. and Tamkin, P. (1998) Excellence in Research on Schools, (London, DfEE).

    Howe, A. (2011) Managing primary-secondary transfer: lessons learned?, in Howe, A. and Richards, V. (eds) Bridging the transition from primary to secondary school, (Abingdon, Routledge).

    Husbands, C. and Pearce, J. (forthcoming) Great pedagogy: nine claims from research, (Nottingham, National College for School Leadership).

    James, M., McCormick, R., Black, P., Carmichael, P., Drummond, M.J., Fox, A., MacBeath, J., Marshall, B., Pedder, D., Procter, R., Swaffield, S., Swann, J. and Wiliam, D. (2007) Improving learning how to learn: classrooms, schools and networks, (London, Routledge).

    Lemons, C., Fuchs, D., Gilbert, G. and Fuchs, L. (2014) Evidence-Based Practices in a Changing World: Reconsidering the Counterfactual in Education Research, Educational Researcher, 43, 5, pp. 242-252.

    Lemov, D., Woolway, E. and Yezzi, K. (2013) Practice Perfect: 42 Rules for Getting Better at Getting Better, (San Francisco, CA, Jossey-Bass).

    Lipsky, M. (1980) Street-level bureaucracy: Dilemmas of the individual in public services, (New York, Russell Sage Foundation).

    MacLure, M. (2005) ‘Clarity bordering on stupidity’: where’s the quality in systematic review?, Journal of Education Policy, 20, 4, pp. 393-416.

    McGee, C., Ward, R., Gibbons, J. and Harlow, A. (2004) Transition to secondary school: a literature review, (Hamilton, University of Waikato).

    Moss, G. (2013) Research, policy and knowledge flows in education: what counts in knowledge mobilisation, Contemporary Social Science: Journal of the Academy of Social Sciences, DOI: 10.1080/21582041.2013.767466.

    Nutley, S.M., Walter, I. and Davies, H.T.O. (2007) Using evidence: How research can inform public services, (Bristol, The Policy Press).

    Penuel, W., Fishman, B., Haugan, C. and Sabelli, N. (2011) Organizing Research and Development at the Intersection of Learning, Implementation and Design, Educational Researcher, 40, 7, pp. 331-337.

    Penuel, W., Sun, M., Frank, K. and Gallagher, A. (2012) Using Social Network Analysis to Study How Interactions Can Augment Teacher Learning from External Professional Development, American Journal of Education, 119, 1, pp. 103-136.

    Shepherd, J. and Roker, D. (2005) An evaluation of a ‘transition to secondary school’ project run by the National Pyramid Trust, available at: www.youngpeopleinfocus.org.uk/_assets/php/report.php?file=37, accessed on 4 September 2013. 

    Stoll, L. (2008) Leadership and policy learning communities: promoting knowledge animation, in: Chakroun, B. and Sahlberg, P. (eds) Policy learning in action: European Training Foundation Yearbook 2008, (Torino, Italy, European Training Foundation).

    Stoll, L. (2009) Knowledge Animation in Policy and Practice: Making Connections, Paper presented at the Annual Meeting of the American Educational Research Association as part of the symposium Using Knowledge to Change Policy and Practice, available at: www.oise.utoronto.ca/rspe/UserFiles/File/Publications%20Presentations/AERA%2009%20knowledge%20animation%20paper%20Stoll.pdf, accessed on 23 January, 2014.

    Stoll, L. (2012) Stimulating Learning Conversations, Professional Development Today, 14, 4, pp. 6-12.

    Stoll, L. and Louis, K. S. (2007) Professional Learning Communities: Divergence, Depth and Dilemmas, (Maidenhead, Open University Press).

    Sutherland, R., Ching Yee, W., McNess, E. and Harris R. (2010) Supporting learning in the transition from primary to secondary schools, available at: http://www.bris.ac.uk/cmpo/publications/other/transition.pdf, accessed on 5 September, 2012.

    Taylor, M. (2013) Social science teachers’ utilisation of best evidence synthesis research, New Zealand Journal of Educational Studies, 48, 2, pp. 35-49.

    Tooley, J. and Darby, D. (1998) Educational Research: a Critique, (London, Ofsted).

    Tree, J. (2011) What helps students with challenging behaviour make a successful transfer to secondary school? (D.Ed.Psy Dissertation, University of London, Institute of Education).

    Vanderlinde, R. and van Braak, J. (2010) The gap between educational research and practice: views of teachers, school leaders, intermediaries and researchers, British Educational Research Journal, 36, 2, pp. 299-316.

    Witt, M. (2011) Mathematics and transition, in Howe, A. and Richards, V. (eds) Bridging the transition from primary to secondary school, (Abingdon, Routledge).

  • A ‘Liturgical Laboratory’

    A ‘Liturgical Laboratory’

    by Tim Novis.

    As the Senior Chaplain at Wellington College, I am responsible for ensuring that worship is meaningful and relevant for the students, over a thousand in number, who enter the Chapel on a weekly basis. The most obvious expectation in an independent school, often driven by impressive fees, is that whatever we offer must be of the highest quality.

    Fr. Tim Novis

    The question must be asked: what, then, must worship look like in the independent sector, where chapel attendance is mandatory yet those who attend are by no means strict adherents of any one particular faith or religion, although most select ‘Church of England’ as at least their affiliation on entrance applications?

    How, then, can we achieve this goal with integrity, and not simply resort to ‘plug and play’ liturgies that are really just indoctrination: the mumbling back of hollow responses printed in a booklet, or the gusty, rugby-pitch singing of 18th-century hymns that are vaguely patriotic but have little to nothing to do with the interior life?

    I am beginning qualitative, evidence-based research into what 21st century, mandated, adolescent, public worship should be, to ensure that the full benefits of spiritual wellbeing are realised within the institutional setting.

    Utilising focus-group research and case studies, I also wish to trace the life stories of groups of pupils over a three-year period of ‘exposure’ to chapel attendance.

    Further, what spaces within an independent school does spirituality inhabit? Within the liminal, epistemological, temporal and physical realms, where does spirituality have an impact upon the wellbeing of students? What effect does it have, and how can we measure this, to capture what we should be offering more of, and what we should be letting go of as archaic?

    In regard to wellbeing, in a webpage entitled, ‘Spirituality and Mental Health,’ from the Royal College of Psychiatrists, the question is asked: 

    ‘What difference can spirituality make?

    Service users tell us that they have gained:

    • better self-control, self-esteem and confidence

    • faster and easier recovery (often through healthy grieving of losses and through recognising their strengths)

    • better relationships – with self, others and with God/creation/nature

    • a new sense of meaning, hope and peace of mind. This has allowed them to accept and live with continuing problems.’

    Worship, or liturgy, is literally the ‘work of the people’. We also need to ask and answer ‘what works for people’, if we take as truth that spirituality is a crucial component of any successful programme of education. The Wellington College Chapel will become a ‘liturgical laboratory’ where research with a lofty end in mind will be completed.

  • Taming the Wild West of Educational Research

    Taming the Wild West of Educational Research

    By Dr. Simon P Walker 

    Dissolving facts

    Let’s start with a basic fact. All facts are constructs. As Martin Heidegger would have put it, no construct is a description of the thing in itself. It is a proposal, a representation, an attempt at a description. 

    Take a supposed ‘solid fact’. Take, for instance, the fact of gravity. Gravity is a fact, surely. As sure as eggs is eggs, everything is subject to gravity.

    Well, it’s certainly the case that an attractional force between objects appears to act across the detected universe, but what that force precisely is, is still not agreed. From (apocryphally) seeing apples falling on heads, Newton described this universal force in the way we all learned up to GCSE. Einstein took Newton’s model of gravity and picked holes in it, describing gravity in the way students learn it now at A Level. Meanwhile, post-docs explain to undergraduates that the accepted Einsteinian models are themselves only inadequate constructs. A better construct is being sought: a unifying theory of forces, still not described by Hawking and co.

    Physicists accept readily that they are merely dealing in representations of reality, not reality itself. Perhaps that is why physicists such as Fritjof Capra and David Bohm pioneered what they call a ‘new science’, open to the non-material, the ambiguous, the paradoxical, the spiritual. Mystery. They know that the hard knowns of real matter actually dissolve, before you can touch them, into states of energy and proximity.


    By contrast, it is biologists (I confess, my first degree) who tend to be religiously convinced their ideas are unarguable, utterly certain and right. Think of the militant evangelicalism of Richard Dawkins, that great campaigner for the truth that the world is nothing more than nuts and bolts. His epistemological absolutism is, I suspect, a little odd to physicists who know all versions of reality are vague approximations, mental constructs designed to get us slightly closer to the things we can never actually claim, or touch, or control. 

    Biologists tend to miss this because their world sits somewhere between the empirical, experimental machinery of the particle physicist at CERN and the narrative-writing social scientist at the RSA. Biologists dismiss physicists as dealing with forces at a level ‘unable to explain complex, emergent properties of organic life, reproduction and speciation’. At the same time, they pooh-pooh the ‘loose, shoddy, subjective, non-empirical pseudo-science’ of the social scientist, such as the educational researcher.

    The lot of the educational researcher

    This, in fact, is the lot of the poor social scientist, of whom the educational researcher is one: to be kicked in the face by the ‘hard science’ bully. And perhaps that contributes to the endless existential crisis that afflicts social science research; it certainly seems to be part of the lively energy surrounding the debate about educational research methods and validity that schools are becoming alert to.

    The core problem for the educational researcher, as a social scientist, is this: how can I do valid research? He faces methodological challenges unknown to the natural scientist: human beings can’t be put in test tubes. They can’t be dissected. They have wills. They have to be asked permission to take part in his project, and they slip out of the constraints of the experimental designs set for them. He sets up his case-study-based project. Five students are absent from the first recorded lesson, skewing his data. He watches ten lessons (my, what a lot of observational data he now has to analyse!), yet those lessons were just 3% of the three hundred taught that week. How can he generalise from that sample size? What claims can he make that would possibly be valid beyond the limits of that particular experience? Thus the educational researcher feels constantly anxious, terrified that his research doesn’t really count or, worse, will be counted as ‘bad research’.

    Bad science or bad customers?

    Some educational researchers retreat to empiricist methods. Quantitative studies are commissioned on huge sample sizes. Claims are made, but how valid are those claims to the real life of the classroom? For example, suppose one study examines 5,000 students to see whether they turn left rather than right after being shown red ‘left’ signs. Yes, we now know with confidence that students turn left when shown red signs. But so what? What can we extrapolate from that? How much weight can that finding bear when predicting human behaviour in complex, real-world situations where students make hundreds of decisions to turn left and right moment by moment? The finding is valid, but is it useful?

    Generalising applications from limited, circumscribed data has been a route to poor educational product development. In the rarefied, abstracted confines of a lab or campus, looking at only a few factors among millions of interacting ones, a researcher publishes a claim. Maybe it is a claim about the way the brain processes different kinds of data (just as a random example...). The researcher states that their claim is a proposal, a construct, an artifice to describe a small phenomenon they observed. It is a way of putting that finding into language. It is not the truth.

    But teachers don’t like constructs; they like tools, they like ideas that they can use, can put into practice.

    This desire for practical tools creates a market for someone to take that small, limited, circumscribed, tentative claim and turn it into a tool that teachers can use. Hence:

    Brain Gym, learning styles, VAK, EQ, thinking hats, mindset...

    These tools are created because there is a demand from teachers for them. Teachers are in the business of doing, of getting results in a classroom. They want practical things that can help them do that, not tentative research constructs. So a company creates those certainties for them. Sometimes the original researcher is involved in the tool development and sometimes not. Sometimes the researcher is in despair at what the teaching profession goes on to do with their subtle, qualified claims and constructs.

    Neuroscientific abuse

    Neuroscience has been most susceptible to this kind of poor adoption. Sometimes the neuroscience itself has been bad science. More often, the application of the science by teachers has been bad practice. Neuroscience has that seductive appeal: the promise of unlocking the kernel of what learning actually is. But neuroscience does not and, indeed, cannot achieve that. Peering into the neural activity of thirty teenagers rampaging in science, lesson three on a Monday morning, is currently beyond the scope of fMRI scanners. Teaching may draw on bits of hard neuroscience, but in the end classroom teaching is a social, collective experience. Neuroscience does not adequately deal with collective cognitive-affective phenomena. No, teaching is informed by studies inside the brain, but it will never be fully described by them. Teaching is a live happening, a collective event.

    Proper confidences of educational research

    That is why the appropriate discipline for measuring teaching and learning must remain social science, and teachers must be confident social scientists when they research the efficacy of their methods. They must be confident in what social science can claim and accepting of the limits of their social science research.

    So what are the limits and confidences of educational research?

    Limits:

    Because studies take place in real schools, and schools are particular rather than abstract instances of education, conclusions from educational research are always open and revisable. This applies to quantitative studies, but especially to qualitative ones.

    When teachers act as researchers into the efficacy of their own schools, they must acknowledge their own impact on the study outcome. There is a major ‘Hawthorne Effect’ when teachers lead internal school research, whereby the presence of the observer, as well as the sheer fact of conducting the observations, will by itself improve results. Hattie’s comparative ‘effect size’ model, explained in Visible Learning, is one way to mitigate exaggerated results (a sketch of the basic calculation follows below).
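
    By way of illustration, here is a minimal sketch of the standardised mean difference that such effect-size comparisons rest on: Cohen’s d with a pooled standard deviation, a common textbook form rather than Hattie’s exact synthesis procedure. All of the marks below are invented.

```python
# Minimal sketch of a standardised mean difference ("effect size"):
# (post mean - pre mean) / pooled standard deviation. Marks are invented.
from statistics import mean, stdev

def effect_size(pre_scores, post_scores):
    """Cohen's d with a simple pooled standard deviation (equal group sizes)."""
    pooled_sd = ((stdev(pre_scores) ** 2 + stdev(post_scores) ** 2) / 2) ** 0.5
    return (mean(post_scores) - mean(pre_scores)) / pooled_sd

pre = [42, 55, 48, 61, 50, 45, 58, 52]   # hypothetical baseline marks
post = [49, 60, 55, 66, 58, 50, 64, 57]  # hypothetical post-intervention marks
print(round(effect_size(pre, post), 2))  # prints 0.96; Hattie's 'hinge point' is 0.4
```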

    Confidences:

    Because studies are done ‘in the field’ rather than in the lab, there is a good chance the findings will be relevant and useful rather than valid but useless.

    Social science methodologies (case study, interview, participant survey, observation) are all valid. Quantitative methods (tests, controlled assessments) have some advantages in reliability (sample size, limitation of confounding factors, reduction of noise) but disadvantages in applicability (how useful are the conclusions?). Other things being equal, bigger samples are better than smaller ones whatever the method, because sampling error shrinks as the sample grows (see the sketch below).
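
    To make that last claim concrete, here is a minimal sketch (with an invented spread value) of how the standard error of a sample mean shrinks with the square root of the sample size:

```python
# Why bigger samples help, other things being equal: the standard error of a
# mean falls with sqrt(n), so quadrupling the sample halves the uncertainty.
import math

population_sd = 12.0  # hypothetical spread of pupil test scores

for n in (10, 40, 160, 640):
    se = population_sd / math.sqrt(n)
    print(f"n = {n:4d}  standard error of the mean = {se:.2f}")
```

    Of course, a big sample does not rescue a badly framed question; the point is only that, for a given method, more data narrows the uncertainty around the estimate.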

    The researcher must know which methodological tools he is using, not claiming to be screwing in a bolt when he is bashing in a nail. Beware the researcher who does not know which methodology he is using, why he is using it, and what evaluation of the data it will enable him to do, and by what method.

    Beware the heavy boot of higher education. In the perceived academic hierarchy of the British mind, universities look down on secondary schools, which look down on primary schools. For their own reasons (to do with funding, amongst other things), universities are piling into school research right now. Their presence is welcome and their expertise in research design and analysis is vital. However, they generally have slightly less interest in actually improving school education than in measuring the things designed to improve it. Teachers want to improve things; this is a noble and arguably more important goal, and we need to ensure the ‘doers of teaching’ retain control over the ‘measurers of education’.

    Research should be as non-intrusive as possible. As noted above, academics love hefty research designs. Schools will quickly become sick of research projects if they intrude too much upon the actual functioning of the school day. Design light projects which are lean and clean.

    The real opportunity of a school research project is for teachers to become learners again. A school should adopt an agreed ‘active-learning cycle’ (e.g. Kolb, Lewin, Honey and Mumford) as the structure through which all its research activities are driven. By adopting a common format for eliciting research questions, implementing studies, and applying and evaluating study data, schools will engage staff and pupils with the project’s goals, stages and benefits. Research projects will be coordinated and heuristic, not random and whimsical.

    Research should enrich the story. Teachers are social constructionists, telling stories that will capture the deeper participation of students. Live, ongoing research enriches that story, makes it fresh, open, edgy. I was asked recently by one school Head participating in one of our Human Ecology Education studies, ‘So presumably what you expect to find is X and Y...’ My reply was, ‘Not at all! I’m not assuming anything. This is why we are doing the research: to find out!’ Research actually discovers new things, which is what makes it exciting as part of a school’s life. The null result may be just as interesting and important as the finding we expected. Heads should be sharing the current research questions and findings with their students all the time; it’s better than watching Spirogyra bubble oxygen in test tubes.

    Someone wise said, ‘The unexamined life is not worth living’. Educational research is about finding out what works, but it is also about much more than that. It is not, in the end, about collecting more data or obtaining more knowledge; it is about regaining wisdom: the wisdom to derive pleasure, care and pride from the crafting of education.

    A score chart to plan your educational research study

    At Human Ecology Education, we have designed a rough rule-of-thumb set of criteria against which experimental designs can be scored. The higher the score, the stronger the study data are likely to be. I have suggested that a score above 40 is an acceptable minimum. I hope it will help school-based researchers identify aspects of a proposed study that they can strengthen, giving them confidence in their ability to do research of genuine value. Feel free to take and use it as you see fit!
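
    Since the chart itself is not reproduced in this text, the sketch below illustrates only the mechanics described above: scoring a proposed design against weighted criteria and comparing the total to the suggested minimum of 40. Every criterion name, weight and rating below is a hypothetical placeholder, not taken from the Human Ecology Education chart.

```python
# Hypothetical study-design score chart: criteria and weights are invented
# placeholders; only the summed-score-above-40 idea comes from the post.
CRITERIA = {
    "sample size adequate for the claim": 10,
    "baseline or comparison data collected": 10,
    "observer effects acknowledged and mitigated": 10,
    "method matched to the research question": 10,
    "minimal intrusion on the school day": 10,
}

def score_design(ratings):
    """Sum per-criterion ratings, capping each at its maximum weight."""
    return sum(min(ratings.get(name, 0), weight) for name, weight in CRITERIA.items())

# Example self-assessment of a proposed study (ratings also invented):
ratings = {
    "sample size adequate for the claim": 6,
    "baseline or comparison data collected": 9,
    "observer effects acknowledged and mitigated": 7,
    "method matched to the research question": 10,
    "minimal intrusion on the school day": 9,
}
total = score_design(ratings)
print(total, "acceptable" if total > 40 else "strengthen the design first")
```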

     

    About Simon P Walker

    Dr Simon P Walker is a man who has done too many degrees. After teaching in the humanities, science and social sciences across several research institutions he is now Director of Research at Human Ecology Education. He is a Visiting Fellow at Bristol Graduate School of Education and his specialist field is the regulation of Cognitive Affective Social (CAS) state in students and its impact on educational outcomes.

    www.humanecology-education.co.uk 

    https://twitter.com/simonw762