Archives by date

You are browsing the site archives by date.

  • Improving student writing by modelling bad writing.


Tom Wayman, head of English, Wellington College

“Let’s write the WORST story ever written” was a much more appealing invitation to my mixed-ability Year 11 class than previous offers to collaborate on writing the best one ever. Putting together an unholy montage of cliché, malapropism, comma splices firing off in all directions and gushing logorrhoea was not only liberatingly hilarious but deeply educational.


As each pupil threw down their worst into a shared Drive folder, they were able to gasp and chuckle as they sent up the very things which, unthinkingly, they had pretty much been doing two weeks earlier. We projected a few onto the board; some were read out – it was all great fun. The set were able to atomise clearly what made, in this case, a short story poor. If we use exemplar material to showcase the best, why not put together wilfully weak work to highlight the obverse?

I did a similar exercise with my Year 9s last week. Using some of the excellent resources of the British Library timeline, around which we base the first seven weeks of our Year 9 curriculum, we were using Victorian Freak Show posters to sharpen our analytical writing. In response to the question, ‘What do these posters reveal about the society at the time?’, I typed and projected live the following glorious glossolalia:

    The societe was probably quiet nice becaue they licked to look at strange people and it was funny but they might have felt a but  bad which revelas that they like posters. Also they think tall people are weird. Which made them laugh. The thought it was good. The main attrecation of the show is the smaller perond to the right and it is very old which is shown bauise the paper is brown. I think that Amazons were strange, I think that they kind of through ti was quite nice.

Although this emerges as a parody of careless writing and editing, the key points of learning were a pleasure for the class to identify and guard against as they set off, for homework, to produce the real thing – to great success. I’d like to call this strategy ‘Modelling c**p’ – but that’s clearly not appropriate. It does catch, however, a little of the useful anarchy of the exercise…


  • Grit Resources



    – “Promoting Grit, Tenacity, and Perseverance: Critical Factors for Success in the 21st Century” US Department of Education, meta-review, 2013 (pretty massive but nothing beats this source for a comprehensive look at the most cutting-edge info on grit and mindsets)


    Building Emotional Intelligence: Techniques to Cultivate Inner Strength in Children by Linda Lantieri & Daniel Goleman

    “An innovative educator and the pioneer of emotional intelligence team up to present a groundbreaking program for building resilience and inner strength in children.”


How Children Succeed: Grit, Curiosity, and the Hidden Power of Character by Paul Tough

    “The story we usually tell about childhood and success is the one about intelligence: success comes to those who score highest on tests, from preschool admissions to SATs. But in How Children Succeed, Paul Tough argues that the qualities that matter most have more to do with character: skills like perseverance, curiosity, conscientiousness, optimism, and self-control.


    How Children Succeed introduces us to a new generation of researchers and educators who, for the first time, are using the tools of science to peel back the mysteries of character. Through their stories—and the stories of the children they are trying to help—Tough traces the links between childhood stress and life success. He uncovers the surprising ways in which parents do and do not prepare their children for adulthood. And he provides us with new insights into how to help children growing up in poverty.”


    Fostering Grit: How do I prepare my students for the real world? by Thomas R. Hoerr

    “Grit is a combination of tenacity and perseverance—a willingness to take risks even if it means sometimes failing and starting again. Knowing how to respond to frustration and failure is essential whether a student struggles or excels. Veteran school leader and popular Educational Leadership columnist Thomas R. Hoerr shows what teaching for grit looks like and provides a sample lesson plan and self-assessments, along with a six-step process applicable across grade levels and content areas to help students build skills they need to succeed in school and in life.”


    “Resilience and Grit: Resource Roundup”

    Edutopia’s carefully curated collection of blogs, articles, interviews, and videos with information for parents and educators about the associated concepts of resilience and grit.


    “Teaching Grit: Social and Emotional Truth”

    “Close examination of both the construct of grit and research reveals that teaching grit requires more than we may think. Grit involves the interplay of thoughts and emotions, demanding a wellspring of inner resources to overcome the inevitable obstacles that arise when going full-force after a goal. Thus, teachers who want their students to be more “gritty” need to be aware of those students’ inner lives as well as the outer steps being taken to reach their dreams.”


    “5 Steps to Foster Grit in the Classroom”

    “The character traits of determination, adaptability and reflection add up to a critical 21st century skill.”


    “True Grit: The Best Measure of Success and How to Teach It”


    A Three Part Series on Grit and Resilience from UC Berkeley’s Greater Good Science Center

“How to Help Kids Overcome Fear of Failure”

“What’s Wrong With Grit?”

“Two Ways to Foster Grit”

    “According to the research on failure, students may need more than just grit to succeed. To help students learn to overcome obstacles in pursuit of long-term goals, educators should focus on developing cognitive and emotional skills.”


Duckworth, A.L., Peterson, C., Matthews, M.D., & Kelly, D.R. (2007). “Grit: Perseverance and passion for long-term goals”. Journal of Personality and Social Psychology, 92 (6), pp. 1087-1101.


    Angela Duckworth’s Video Series on Grit

“What is grit?”

“Are there virtues that are precursors or closely associated with grit?”

“What role does humility play, if any, in cultivating grit?”

“How has your personal story been a window into your research on grit?”

“What advice would you give to parents who wanted to cultivate grit in their children?”

“What is psychological distancing and how does it relate to self control?”





  • Lesson-study, design based research and professional development.



    Dr. Chris Brown, Institute of Education. 

    In this article Chris Brown reports on the Haverstock primary to secondary transition project. Designed to improve the experience of transition to secondary school for vulnerable pupils in Camden (London), the project uses Lesson Study to help primary and secondary practitioners work together to develop effective cross-phase pedagogical approaches to teaching English/literacy and science.


Connecting research to practice is notoriously problematic. Much has been written, for example, regarding whether the ‘evidence-informed’ movement works against practitioners’ professional judgment (e.g. see Biesta, 2007; Brown, 2014). Likewise, there exist issues in relation to how formal academic knowledge and professional or tacit knowledge might be effectively combined (Stoll, 2009; Brown, 2013). Furthermore, there are the still very active and virulent disputes surrounding some of the methods commonly associated with enhancing evidence use (e.g. randomized control trials and the process of systematic review – see Maclure, 2005; Nutley et al., 2007; Brown, 2013). A further aspect of note centers on how practitioners’ capacity to engage with academic research might be enhanced (e.g. Hargreaves, 1996; Cooper et al., 2009). This issue is also intertwined with the notion that much academic research is inaccessible to teachers, both in terms of where it is published and the language typically used within such publications (Hillage et al., 1998; Tooley and Darby, 1998).

The idea of Design Based Research (DBR) (sometimes referred to as Design-Based Implementation Research) has been positioned as one way to overcome many of these issues (Penuel et al., 2011; Coburn et al., 2013), with Vanderlinde and van Braak (2010) explicitly suggesting that DBR approaches should be used to close the research–practice gap. Described by Anderson and Shattuck (2012: 16) as an approach “designed by and for educators that seeks to increase the impact, transfer, and translation of education research into improved practice”, DBR in addition “stresses the need for theory building and the development of design principles that guide, inform and improve both practice and research in educational contexts” (ibid). Anderson and Shattuck (2012: pp. 16-17) go on to suggest a number of important definitional attributes of Design Based Research:

    That DBR must be situated in a real educational context;

    DBR should focus on the design and testing of a significant intervention;

It will involve iterative refinement of that intervention to improve its operation and iron out past mistakes;

    DBR must involve a collaborative partnership between researchers and practitioners; and

    The process of DBR leads to the development of design principles reflecting the conditions within which the intervention operates.

    Coburn et al., (2013) suggest that DBR may be differentiated from other approaches that seek to connect research to practice because: 1) with DBR, the distinction between the roles of practitioners and researchers is blurred (i.e. practitioners and researchers are mutually responsible for developing theory and embedding solutions); and 2) collaboration occurs in real time throughout the project rather than concentrated at the beginning (scoping) and end (debrief). DBR thus represents a shift from the traditional perspective of research and practice being two distinct activities, with the former being able to unambiguously influence the latter (Vanderlinde and van Braak, 2010), and towards the simultaneous build and study of solutions. As Coburn et al., (2013: 8) suggest: “[DBR has] two goals of equal importance… develop materials and instructional approaches that can be implemented in classrooms, schools and districts. At the same time… to advance research and theory”. 


Lesson Study has been described as a ‘teaching improvement process’ that has origins in Japanese elementary education, where it is a widely used professional development practice (Dudley, 2014). As a process, Lesson Study involves teachers collaborating, normally in groups of three, to progress cycles of iterative practice development. Such cycles typically involve the following steps: 1) a discussion of student learning goals and the identification of a teaching strategy that might meet these; 2) planning an actual classroom lesson (called a ‘research lesson’) that employs this strategy; 3) observing how the lesson works in practice; and 4) discussing and embedding revisions to enable improvement. In the Japanese model, teachers also report on and often hold public demonstrations of the lesson so that other teachers can benefit from their learning (Dudley, 2014). In addition, three pupils, who represent wider groups of interest, will be observed and their progress monitored as case studies of the impact of the approach (ibid). In itself, Lesson Study can be considered a form of Joint Practice Development (JPD), a process described by Fielding et al., (2005) as involving developing ways of working through collaborative engagement that, as a result, opens up and shares practices with others. And whilst Lesson Study does have a number of distinctive characteristics, its underpinning mechanism, as with other JPD approaches, involves a process that is truly mutual, rather than one-way, with the practice concerned being improved rather than simply moved from one person or place to another.


The Camden Partnership for Educational Excellence (CPEE) was set up in April 2012 with the vision to make the London Borough of Camden “the best borough for education”. The CPEE aims to drive forward the recommendations of the Camden Education Commission (an independent commission jointly established by Camden schools and Camden Council to take a long term perspective regarding strengthening education in the borough), which highlighted key issues and opportunities for Camden schools in the light of the changes to the English education landscape. In 2013, the CPEE Board invited schools, colleges, partners and stakeholders to bid for funds from a £2 million pot set up to support innovative ways of working and projects to raise achievement and attainment – in particular, to find ways of improving outcomes for the most vulnerable groups of students. A key requirement of the CPEE’s bid call was that school improvement projects should be based on the Lesson Study approach. This followed the appointment to Camden Local Authority/the CPEE Board of a staunch Lesson Study advocate who had been involved in the process for a number of years both in the UK and abroad (see Dudley, 2014).


A key finding from the Camden Education Commission’s final report was that, particularly for vulnerable students, “transition arrangements [within Camden] at present are not consistently good enough” (2011: 5); correspondingly it argued that improving these should be a central focus of improvement efforts moving forward. In particular, it suggested that there should be a better understanding between year 6 and 7 teachers (teachers of students aged 11-12) of the pedagogy and practice of teaching and learning in each other’s institutions, which would both assist them in preparing students for success and in supporting students to flourish in their new environments (2011: 36). In response to the report and the invitation by the CPEE board for organizations to bid for funding for projects, colleagues from Haverstock school (Camden) and the Institute of Education, University of London teamed up to develop a project that might serve to address some of the commission’s concerns in relation to transition.

Our first step was to undertake a review of the international literature on the issue of primary to secondary transition, enabling us to situate our proposal within a suitable theory of learning (Penuel et al., 2011). The review enabled us to centre our proposed bid to CPEE around the following three notions:

    That transitions are at their strongest when: “the social, emotional, curricular and pedagogical aspects of learning are managed in order to enable pupils to remain engaged with, and have control of, their learning” (DCSF, 2008: 5; also see McGee et al., 2004; Evangelou et al., 2008). 

    That whilst a myriad of initiatives have been undertaken to tackle the social and emotional aspects of learning (especially for more vulnerable pupils: e.g. see Shepherd and Roker, 2005), Galton et al., (1999: 6) argue that: “there are still problems at transfer with curriculum continuity”; and that 

    Such continuity serves, however, to maintain pupils’ interest in learning, allows them to progress in their learning, and so helps them avoid the internationally observed learning hiatus that seems to accompany transition (McGee et al., 2004; Evangelou et al., 2008). 

As a result, the Haverstock Primary/Secondary Transitions project was conceived with the purpose of bringing together primary and secondary teachers from the London Borough of Camden in order that they might employ Lesson Study to develop effective cross-phase pedagogical approaches/strategies to teaching English/literacy and science to support the transition of year 5 to year 8 students: in other words, to provide hitherto lacking consistency/continuity in the way students across the phases were taught these two subjects. In particular, the project focused on two vulnerable groups: i.e. those students most at risk in terms of their progress post transition. Within this project we consider ‘vulnerable’ as contingent on pupils’ ability to make a successful academic, social and emotional transition from Year 6 to Year 7. Our scope thus includes pupils entitled to Free School Meals (FSM); “and pupils from some ethnic groups (which ones depending on the particular subject being assessed)” (Galton et al., 1999: 22; Tree, 2011): in this case, white British students (closing the gap for White working class students is a high priority both within Camden Local Authority and within the English context; for example, the Camden Partnership for Educational Excellence highlighted this group as being the cohort with the most underachievement overall). We also sought to include more able pupils not fulfilling their potential.

    Jointly directed by colleagues from Haverstock school and the Institute of Education, the specific aims of the project were to improve student and teacher outcomes in relation to:

    Improved rates of progress and attainment for ‘vulnerable’ pupils within each of years 3-8 (ages 8 through to 13).

More robust, challenging and innovative but also consistent pedagogic practice at National Curriculum levels 1-8 in English and science (levels represent how pupils progress in relation to England’s National Curriculum) and assessment practices through years 5, 6, 7, & 8 (ages 10 through to 13).

    Shared teacher confidence using these practices as well as in planning work and moderation of leveling in their subject in English and science from levels 1 to 8 in order to assess progress.

    A group of teachers able to use Lesson Study approaches to improve classroom practice and impact on standards, thus building transferable capacity.

The project comprises a pilot phase and a main phase (the latter commencing in September 2014), with 18 schools in the main phase engaged in:

    Nine Lesson Study sessions throughout the course of the academic year.

    A workshop to examine both interim impact and to look at cross-phase approaches to assessment/moderation.

    A workshop to examine both interim impact and cross-phase approaches to data transfer.

A session bringing participating schools together to design final resources for the project, with the aim that these can be used by schools within the borough and more widely.

    A knowledge mobilisation and end of project workshop.


Primary and secondary schools have their own particular ways of working and these are not especially well suited to fostering cross-phase collaboration. For instance, primary teachers will teach all subjects to one cohort of pupils for an entire year. In contrast, secondary school teachers will specialise by subject area and so will teach that one subject to a number of different classes. In addition, using Lesson Study is a new phenomenon in English schools and using Lesson Study in a cross-phase way (to tackle issues of transition) rarer still. Bearing in mind the particular ways of working of each phase, and that neither the researchers nor the practitioners involved in the project had engaged in Lesson Study activity before, it was decided that a pilot phase of five months with a small group of schools be run in order that researchers and practitioners could collaborate in trialling the approach and ascertaining how it might be made fit for practice: in other words, to enable a Design Based Research approach to the development and implementation of a Lesson Study approach. That is, in keeping with Anderson and Shattuck (2012), we (i.e. participating teachers from these schools, the Assistant Head project lead from Haverstock school, and researchers and facilitators from the Institute of Education) sought, as a collaborative partnership, to design, test and refine cross-phase Lesson Study in a real educational context, with a view to establishing a basis for its future roll out.


A key aspect of this approach was the establishment of a theory of action for Lesson Study: in other words, to determine which aspects of Lesson Study were an integral part of a logical chain leading to improved student outcomes and which were more open to contextual manipulation (Argyris and Schön, 1996; Cherney and Head, 2011). A mutually developed theory of action has been shown to have significantly positive impacts on the effectiveness of the interventions it relates to (Lemons et al., 2014) and so is a vital aspect of DBR. As noted above, it is argued that, as a form of Joint Practice Development, Lesson Study involves collaborative engagement that serves to open up and share practices. As such, the development of our theory of action for Lesson Study centred on how adults can learn from and build upon the best practice of their peers through interaction. This led us to the constructivist mode of learning, which posits the idea that, in instances of effective learning, new understanding is not simply digested but engaged with as part of a social process of adoption, accommodation and assimilation (Black and Wiliam, 2001; James et al., 2007). This means that new information will be filtered through past experience, knowledge and understanding, implying that the process of new learning must start via the exploration of existing ideas as well as the encouragement of those holding them to express and defend them. As a result, these ideas can be surfaced and made explicit, the importance of which is illustrated by James et al., who suggest that: “unless learners make their thinking explicit to others, and so to themselves, they cannot become aware of the need for conceptual modification” (2007: 17). The social constructivist approach further augments the constructivist notion of learning by emphasizing the interactive element of learning: i.e. the idea that learning proceeds via interaction within a social milieu.

    In order to facilitate this learning, we turned to the literature on Professional Learning Communities. Whilst there is no universal definition of a Professional Learning Community (PLC), they are usually depicted as a situation in which people involved with and concerned about teaching and learning, work collaboratively to learn about how they can improve pupil learning or outcomes (Stoll, 2008; Harris and Jones, 2012). In particular we looked at the nature and structure of the ‘learning conversations’ that take place as part of PLC activity. Described as “deep, sustained conversations among teachers about matters of teaching and learning” (Stoll and Louis, 2007: 79) and “the way that educators make meaning together and jointly come up with new insights and knowledge that lead to intentional change to enhance their practice and student learning” (Stoll, 2012: 6), learning conversations comprise considered thoughtful (rather than superficial) discussion and challenge, focused on matters of teaching practice, that consider evidence of actual and potential forms of practice and that are undertaken with a view to developing both improved practice (i.e. new solutions to issues) and, as a result, outcomes for students. Moving deeper into this area, Stoll (2012: pp. 6-11) suggests that the following are features characteristic of high quality learning conversations between adults: 

Focus on evidence and/or ideas. Learning conversations are focused, with the specific focus reflecting one of two important perspectives. The first centres on existing and effective practice within the school/network. The second reflects ideas about innovation and transformation where, for example, the conversation explores creative ways to engage learners and extend learning. Because the second focus will require elements of the first, many conversations weave these two perspectives together.

    Experience and external knowledge/theory. Access to outside expertise deepens learning conversations. Whether delivered personally, through writing, or via other media, independent ideas are injected to stimulate reflection, challenge the status quo and extend thinking. Such ideas can help promote greater depth in conversations. 

    Protocols and tools. Learning conversations can often be framed more clearly when supported by frameworks and guidelines that help participants structure their dialogue and interrogate evidence or ideas.

    Facilitation. Facilitation isn’t the same as external expertise. It can come from inside or outside the group, but it’s needed to elicit and support intellectual exchange, as well as maintaining open dialogue and, sometimes, injecting new energy into the conversation. Skilful facilitation can often lead to a productive balance of comfort and challenge.


These four elements, plus the four steps outlined above, thus formed the basis for how we initially thought to structure and operationalize Lesson Study activity. As a result, it was decided by the project team that the pilot phase should commence with a one-day facilitated workshop in which practitioners held data-informed discussions about the key issues their vulnerable students faced in relation to English/literacy and science. Before the workshop, the Assistant Head project lead from Haverstock school, and researchers and facilitators from the Institute of Education, spent a day developing protocols and tools to facilitate learning conversations and planning activity within the workshop (based on approaches used by Stoll: e.g. see Stoll and Brown, forthcoming). Using these, participants worked through a series of activities designed to help them decide upon one focus area (a topic being taught that encapsulated the issue) and to also think about a common approach to teaching the topic in relation to the issue that triads could adopt, implement and iteratively improve. Following this, participants were asked to identify three students within each school that represent the focus (vulnerable) students and to then collaboratively plan the first research lesson that would be taught/observed.

    In keeping with the notion that it is expertise with respect to a given intervention that enables practitioners to tailor interventions to their specific situation, and that the development of expertise involves both aspects of effective learning and sustained skill refinement (practice) (see Bryk et al., 2011; Penuel et al., 2012; Lemov et al., 2013; Brown and Rogers, 2014) the pilot phase then involved three full Lesson Study days. These involved practitioners: 1) revisiting the purpose of the lesson and the focus area that it linked to; 2) being talked through each phase of the lesson and what its aims and goals were; 3) observing how the lesson worked in practice (with a focus on the case children); 4) interviewing the case children for their perspectives on the issues; 5) a facilitated discussion to evaluate the lesson (what went well and what would have made the teaching activity even better); and 6) building on what had happened (i.e. collaboratively establishing how to improve the pedagogic approach), planning for the next Lesson Study research lesson. Again, before the first Lesson Study day, the project team spent a day together collaboratively developing protocols, tools and an outline for the day to facilitate the Lesson Study process. The Lesson Study activity was also observed by the project team in order to give us an understanding of how it was being enacted.


Throughout the pilot phase, time and space was created to enable researchers and practitioners to deliberate and discuss what they had learned and their experiences in relation to Lesson Study. Through this dialogic process we were able to construct common understanding and meaning with regards to both aspects of the process and the use of tools and protocols to facilitate it. As a result we were then able to understand which aspects of the approach were successful in helping participants develop their practice and improve outcomes for the most vulnerable, and which appeared to provide limited value. In other words, as Gutierrez and Penuel (2014: 20) suggest: “[s]tudying the ‘social life of interventions’ [helped us] move away from imagining interventions as fixed packages of strategies with readily measurable outcomes and towards more open-ended social or socially embedded experiments that involve ongoing mutual engagement”. To consolidate the learning from the pilot phase, at the end of the three Lesson Study days, a one-day workshop was held so that the main phase could be collaboratively developed. Aspects here included the nature of the ‘kick-off’ event that would introduce ‘main stage’ participants to the process and the types of exercises and tools to be used as part of this event; the grouping and sequencing of Lesson Study days throughout the year (bearing in mind the distinct ways of working that each phase of schooling has); the nature (running order) of each Lesson Study day; the nature of the tools and protocols to be employed as part of each main phase Lesson Study session; and how impact should be conceived of and measured (see below).

What was also viewed as important by both participants and the project team, however, was that, as we scaled the project up from pilot to main phase, the dialogic process that enabled us to understand and iteratively improve the operation of Lesson Study could continue at scale. Perhaps one of the main issues of the DBR approach as currently conceived is that it is very researcher intensive: in other words, it requires researchers working intensively with small numbers of practitioners. The failure of evidence to make a widespread and sustained impact on the practices of teachers is an international phenomenon, however (Bryk et al., 2011; Taylor, 2013) and, despite considerable activity, the development of system-wide processes for effectively connecting and applying research and professional knowledge remains underdeveloped (Gough et al., 2011). An imperative then must be to find ways of examining how DBR approaches can impact on maximal numbers of teachers. To overcome this, practitioners and researchers jointly agreed on the need for distributed ownership: if DBR at scale is an unmanageable task for researchers alone, then researchers cannot be the only actors involved in creating meaning – practitioners experienced in the deliberative process should also be able to move beyond their traditional roles and engage in this way too (Coburn et al., 2013). This agreed-upon approach to capacity building meant that we were able to use the original pilot group members as practitioner-researchers who could form new triads and engage with practitioners involved in the project’s main stage. This freed time for the researchers to work with other groups of ‘main stage’ practitioners, and both sets of researchers could then meet periodically to consider ongoing improvements and changes that needed to be made to the Lesson Study methodology.


Interim impact data illustrates that the DBR approach to Lesson Study has been successful in enabling us to build practitioners’ capacity in relation to: 1) understanding and evaluating impact; 2) cross-phase collaboration; and 3) engaging in, managing and tailoring a process of cross-phase Lesson Study. Such capacity building is a direct result of the DBR approach: in other words, engaging in DBR activity has enabled practitioners to develop an understanding of the theory of action underpinning Lesson Study: that is, how to ensure the effective enactment of learning conversations so that best practice could be established and iteratively improved. Capacity building is also a vital precursor to the project’s ongoing sustainability: i.e. its propensity to be self-sustaining once funding ends and the research team can no longer be involved (Coburn, 2003; Gutierrez and Penuel, 2014). It is only when practitioners take ownership of an initiative that it will continue and, it is argued, feelings of ownership will flourish if practitioners have been involved in the creation and development of an initiative. It has also provided much needed scalability: with each new round of Lesson Study, those involved in the previous round can now work with two new practitioners but still continue to develop and refine the approach.

    In terms of the wider group of participants, impact data shows that teachers are better able to identify ways of improving practice that impact on the outcomes of their students, and to receive feedback so that their improvement efforts can develop iteratively. To ascertain impact, participants were surveyed each term and asked to reflect on their baseline data, their desired outcomes, Lesson Study observations and other data available to them. Specifically, impact assessment focused on two main questions:

    What difference has Lesson Study activity made to your practice? How do you know?

    What difference has Lesson Study activity made in terms of pupil outcomes? How do you know?

    Examples of some of the responses here include:

    “My pupils are able to be more specific about skills in writing.  I now have a clear structure and success criteria for how they collaborate to identify successes and next steps. They can use it to apply grammar knowledge. All pupils have made at least one point progress”.

    “When we have used the process developed in the lesson study, the writing has always been improved by the pupils. Also, the process has helped pupils with their communication skills, ability to describe specific aspects of writing and range of ideas used in improved writing”.

    “The impact of this has been evident in lessons where I have used the improvement marking strategy that we explored as part of LS and many students in my classes have made good progress with the introduction of this technique in my practice”.

    “Many pupils have displayed increased confidence when discussing and revising work. I have noticed a marked improvement in second drafts of assessments”.

    “I have developed a greater awareness of strategies that can be used to enable pupils to learn scientific vocabulary”.

    Results from the main phase will be available in late 2015. Watch this space.


    Anderson, T. and Shattuck, J. (2012) Design-Based Research: A Decade of Progress in Education Research, Educational Researcher, 41, 1, pp. 16-25.

    Argyris, C. and Schön, D. (1996), Organizational learning II:  Theory, method, and practice, (Reading, MA, Addison-Wesley Publishing Company). 

    Biesta, G. (2007) Why ‘What Works’ Won’t Work: Evidence-based Practice and the Democratic Deficit in Educational Research, Educational Theory, 57, 1, pp. 1-22.

    Black, P. and Wiliam, D. (2001) Inside the Black Box: Raising standards through classroom assessment, available at:, accessed on 14 July 2014.

    Brown, C. (2013) Making Evidence Matter: A New Perspective on Evidence-Informed Policy Making in Education, (London, IOE Press).

    Brown, C. (2014) Evidence Informed Policy and Practice in Education. A Sociological Grounding, (London, Bloomsbury).

    Brown, C. and Rogers, S. (2014) Knowledge creation as an approach to lead evidence informed practice in early years settings: Examining ways to measure the success of using this method with early years practitioners in Camden, London. Presented at the American Educational Research Association annual meeting, Philadelphia, PA, 3-7 April 2014.

    Bryk, A., Gomez, L. and Grunow, A. (2011) Getting ideas into action: Building Networked Improvement Communities in Education, in Hallinan, M. (ed) Frontiers in Sociology of Education, Frontiers in Sociology and Social Research, (Dordrecht, NL, Springer).

    Camden Education Commission (2011) Camden Education Commission: Final report, (London, Camden Children’s Trust Partnership Board and London Borough of Camden).

    Cartwright, N. (2013) Knowing what we are talking about: why evidence doesn’t always travel, Evidence & Policy, 9, 1, pp. 97-112.

    Cherney, A. and Head, B. (2011) Supporting the knowledge-to-action process: a systems-thinking approach, Evidence & Policy, 7, 4, pp. 471-488. 

    Coburn, C. (2003) Rethinking scale: moving beyond numbers to deep and lasting change, Educational Researcher, 32, 6, pp. 3-12.

    Coburn, C., Penuel, W. and Geil, K. (2013) Research-Practice Partnerships: A Strategy for Leveraging Research for Educational Improvement in School Districts, (New York, NY, William T. Grant Foundation).

    Cooper, A., Levin, B. and Campbell, C. (2009) The growing (but still limited) importance of evidence in education policy and practice, Journal of Educational Change, 10, 2-3, pp. 159-171.

    Department for Children Schools and Families (2008) The National Strategies. Strengthening transfers and transitions: partnerships for progress, available at:, accessed 5 September, 2013.

    Dudley, P. (2014) Lesson Study: A handbook, available at:, accessed on 21 July 2014.

    Earley, P. and Porritt, V. (2013) Evaluating the impact of professional development: the need for a student-focused approach, Professional Development in Education, DOI: 10.1080/19415257.2013.798741

    Easton, J. (2013) Using measurement as leverage between developmental research and education practice. Talk given at the Center for the Advanced Study of Teaching and Learning, Curry School of Education, University of Virginia, Charlottesville. 

    Erickson, F. and Gutierrez, K. (2002) Culture, rigor, and science in educational research, Educational Researcher, 31, 8, pp. 21-24.

    ESTYN (2004). Recommendations on implementation of transition provisions in the Education Act 2002, (Cardiff: ESTYN). 

    Evangelou, M., Taggart, B., Sylva, K., Melhuish, E., Sammons, P., and Siraj-Blatchford, I. (2008) What makes a successful transition from primary to secondary school, (Nottingham, Department for Children Schools and Families).

    Fielding, M., Bragg, S., Craig, J., Cunningham, I., Eraut, M., Gillinson, S., Horne, M., Robinson, C. and Thorp, J. (2005) Factors influencing the transfer of good practice, available at:, accessed on 21 July, 2014.

    Galton, M., Gray, J. and Rudduck, J. (1999). The impact of school transitions and transfers on pupil progress and attainment, (Nottingham, Department for Education and Employment).

    Gough, D., Tripney, J., Kenny, C., Buk-Berge, E. (2011) Evidence informed policymaking in education in Europe: EIPPEE final project report summary, available at:, accessed on 11 September 2012. 

    Gutierrez, K. and Penuel, W. (2014) Relevance to Practice as a Criterion for Rigor, Educational Researcher, 43, 1, pp. 19-23.

    Hargreaves, D. (1996) The Teacher Training Agency Annual Lecture 1996: Teaching as a research based profession: possibilities and prospects, available at:, accessed on 14 January 2013.

    Harris, A. and Jones, M. (2012) Connect to Learn: Learn to Connect, Professional Development Today, 14, 4, pp. 13-19.

    Hillage, L., Pearson, R., Anderson, A. and Tamkin, P. (1998) Excellence in Research on Schools, (London: DfEE).

    Howe, A. (2011) Managing primary-secondary transfer: lessons learned?, in Howe, A. and Richards, V (eds) Bridging the transition from primary to secondary school, (Abingdon, Routledge).

    Husbands, C. and Pearce, J. (forthcoming) Great pedagogy: nine claims from research, (Nottingham, National College for School Leadership).

    James, M., McCormick, R., Black, P., Carmichael, P., Drummond, M.J., Fox, A., MacBeath, J., Marshall, B., Pedder, D., Procter, R., Swaffield, S., Swann, J. and Wiliam, D. (2007) Improving learning how to learn: classrooms, schools and networks, (London, Routledge).

    Lemov, D., Woolway, E. and Yezzi, K. (2013) Practice Perfect: 42 Rules for Getting Better at Getting Better, (San Francisco CA, Jossey Bass).

    Lipsky, M. (1980) Street-level bureaucracy: Dilemmas of the individual in public services, (New York, Russell Sage Foundation).

    MacLure, M. (2005) ‘Clarity bordering on stupidity’: where’s the quality in systematic review?, Journal of Education Policy, 20, 4, pp. 393-416.

    McGee, C., Ward, R., Gibbons, J and Harlow, A. (2004) Transition to secondary school: a literature review, (Hamilton, the University of Waikato).

    Moss, G. (2013) Research, policy and knowledge flows in education: what counts in knowledge mobilisation, Contemporary Social Science: Journal of the Academy of Social Sciences, DOI: 10.1080/21582041.2013.767466.

    Nutley, S.M., Walter, I. and Davies, H.T.O. (2007) Using evidence: How research can inform public services, (Bristol, The Policy Press).

    Penuel, W., Fishman, B., Haugan, C. and Sabelli, N. (2011) Organizing Research and Development at the Intersection of Learning, Implementation and Design, Educational Researcher, 40, 7, pp. 331-337.

    Penuel, W., Sun, M., Frank, K. and Gallagher, A. (2012) Using Social Network Analysis to Study How Interactions Can Augment Teacher Learning from External Professional Development, American Journal of Education, 119, 1, pp. 103-136.

    Lemons, C., Fuchs, D., Gilbert, G. and Fuchs, L. (2014) Evidence-Based Practices in a Changing World: Reconsidering the Counterfactual in Education Research, Educational Researcher, 43, 5, pp. 242-252.

    Shepherd, J. and Roker, D. (2005) An evaluation of a ‘transition to secondary school’ project run by the National Pyramid Trust, available at:, accessed on 4 September 2013. 

    Stoll, L. (2008) Leadership and policy learning communities: promoting knowledge animation, in: Chakroun, B. and Sahlberg, P. (eds) Policy learning in action: European Training Foundation Yearbook 2008, (Torino, Italy, European Training Foundation).

    Stoll, L. (2009) Knowledge Animation in Policy and Practice: Making Connections, Paper presented at the Annual Meeting of the American Educational Research Association as part of the symposium Using Knowledge to Change Policy and Practice, available at:, accessed on 23 January, 2014.

    Stoll, L., (2012) Stimulating Learning Conversations, Professional Development Today, 14, 4, pp. 6-12.

    Stoll, L. and Louis, K. S. (2007) Professional Learning Communities: Divergence, Depth and Dilemmas, (Maidenhead, Open University Press).

    Sutherland, R., Ching Yee, W., McNess, E. and Harris R. (2010) Supporting learning in the transition from primary to secondary schools, available at:, accessed on 5 September, 2012.

    Taylor, M. (2013) Social science teachers’ utilisation of best evidence synthesis research, New Zealand Journal of Educational Studies, 48, 2, pp. 35-49.

    Tooley, J. and Darby, D. (1998) Educational Research: a Critique, (London, Ofsted).

    Tree, J. (2011) What helps students with challenging behaviour make a successful transfer to secondary school? (D.Ed.Psy Dissertation, University of London, Institute of Education).

    Vanderlinde, R. and van Braak, J. (2010) The gap between educational research and practice: views of teachers, school leaders, intermediaries and researchers, British Educational Research Journal, 36, 2, pp. 299-316.

    Witt, M. (2011) Mathematics and transition, in Howe, A. and Richards, V (eds) Bridging the transition from primary to secondary school, (Abingdon, Routledge).

  • A ‘Liturgical Laboratory’

    A ‘Liturgical Laboratory’

    by Tim Novis.

    As the Senior Chaplain at Wellington College, I am responsible for ensuring that worship is meaningful and relevant for the students, over a thousand in number, who enter the Chapel on a weekly basis.  The most obvious expectation in an independent school, often driven by impressive fees, is that whatever we offer must be of the highest quality.


    Fr. Tim Novis

    The question must be asked: what, then, must worship look like in the independent sector, where chapel attendance is mandatory, yet where those who attend are not by any means strict adherents of any one particular faith or religion, although most select ‘Church of England’ as at least their affiliation on entrance applications?

    How, then, can we achieve this goal with integrity, and not simply resort to ‘plug and play’ liturgies that are really just indoctrination: the mumbling back of hollow responses printed in a booklet, or the gusty, rugby-pitch-style singing of 18th-century hymns that are remotely patriotic and have little to nothing to do with the interior life?

    I am beginning qualitative, evidence-based research into what 21st century, mandated, adolescent, public worship should be, to ensure that the full benefits of spiritual wellbeing are realised within the institutional setting.

    Utilising focus-group research and case studies, I also wish to trace the life stories of groups of pupils over a three-year period of ‘exposure’ to chapel attendance.

    Further, what spaces within an independent school does spirituality inhabit?  Within the liminal, epistemological, temporal and physical realms, where does spirituality have an impact upon the wellbeing of students?  What effect does it have, and how can we measure this, to capture what we should be offering more of, and what we should be letting go of as archaic?

    In regard to wellbeing, in a webpage entitled, ‘Spirituality and Mental Health,’ from the Royal College of Psychiatrists, the question is asked: 

    ‘What difference can spirituality make?

    Service users tell us that they have gained:

    • better self-control, self-esteem and confidence

    • faster and easier recovery (often through healthy grieving of losses and through recognising their strengths)

    • better relationships – with self, others and with God/creation/nature

    • a new sense of meaning, hope and peace of mind. This has allowed them to accept and live with continuing problems.’

    Worship, or liturgy, is literally the ‘work of the people’.  We need also to ask and answer, ‘what works for people’, if we take as truth that spirituality is a crucial component in any successful program of education.  The Wellington College Chapel will become a ‘liturgical laboratory’ where research with a lofty end in mind will be completed.

  • Taming the Wild West of Educational Research

    Taming the Wild West of Educational Research

    By Dr. Simon P Walker 

    Dissolving facts

    Let’s start with a basic fact. All facts are constructs. As Martin Heidegger would have put it, no construct is a description of the thing in itself. It is a proposal, a representation, an attempt at a description. 

    Take a supposed ‘solid fact’. Take, for instance, the fact of gravity. Gravity is a fact, surely. As eggs is eggs, everything is subject to gravity. 

     Well, it’s certainly the case that an attractional force between objects appears to act across the detected universe, but what that force precisely is, is still not agreed. From (apocryphally) seeing apples falling on heads, Newton described this universal force in the way we all learned up to GCSE. Einstein took Newton’s model of gravity and picked holes in it, describing gravity in the way students learn it now at A Level. Meanwhile, post-docs explain to undergraduates that accepted Einsteinian models are themselves only inadequate constructs. A better construct is being sought: a unifying theory of forces, still not described by Hawking and co.

    Physicists accept readily that they are merely dealing in representations of reality, not reality itself. Perhaps that is why physicists such as Fritjof Capra and David Bohm pioneered what they call a ‘new science’, open to the non-material, the ambiguous, the paradoxical, the spiritual. Mystery. They know that the hard knots of real matter actually dissolve into states of energy and proximity before you can touch them.

    By contrast, it is biologists (I confess, my first degree) who tend to be religiously convinced their ideas are unarguable, utterly certain and right. Think of the militant evangelicalism of Richard Dawkins, that great campaigner for the truth that the world is nothing more than nuts and bolts. His epistemological absolutism is, I suspect, a little odd to physicists who know all versions of reality are vague approximations, mental constructs designed to get us slightly closer to the things we can never actually claim, or touch, or control. 

    Biologists tend to miss this because their world sits somewhere between the empirical experimental machinery of the particle physicist at CERN and the narrative-writing social scientist at the RSA. Biologists dismiss physicists as dealing with forces at a level ‘unable to explain complex, emergent properties of organic life, reproduction and speciation’. At the same time, they pooh-pooh the ‘loose, shoddy, subjective, non-empirical pseudo-science’ of the social scientist, such as the educational researcher.

    The lot of the educational researcher

    This, in fact, is the lot of the poor social scientist, of whom the educational researcher is one: to be kicked in the face by the ‘hard science’ bully.  And perhaps that contributes to the endless existential crisis that afflicts social science research; it certainly seems to be part of the lively energy surrounding the debate around educational research methods and validity that schools are becoming alert to.

     The core problem for the educational researcher, as a social scientist, is this: how can I do valid research? He faces methodological challenges unknown to the scientist. Human beings can’t be put in test tubes. They can’t be dissected. They have wills. They have to be asked for permission to take part in his project; they slip out of the constraints of the experimental designs set for them. He sets up his case-study-based study. Five students are absent from the recorded first lesson, skewing his data. He watches ten lessons… (my! what a lot of observational data he now has to analyse…) yet those lessons were just 3% of the three hundred taught that week. How can he generalise from that sample size? What claims can he make that would possibly be valid beyond the limits of that particular experience?  Thus the educational researcher feels constantly anxious, terrified that his research doesn’t really count or, worse, will be counted as ‘bad research’.

    Bad science or bad customers?

    Some educational researchers retreat to empiricist methods. Quantitative studies are commissioned on huge sample sizes. Claims are made, but how valid are those claims to the real life of the classroom? For example, suppose one study examines 5,000 students to see if they turn right rather than left after being shown red ‘turn left’ signs. Yes, we now know with confidence that students turn left when shown red signs. But so what?  What can we extrapolate from that?  How much weight can that finding bear when predicting human behaviour in complex, real-world situations where students make hundreds of decisions to turn left and right moment by moment? The finding is valid, but is it useful?

    Generalising applications from limited, circumscribed data has been a route to poor educational product development. In the rarefied, abstracted confines of a lab or campus, looking at only a few factors among millions of interacting ones, a researcher publishes a claim. Maybe it is a claim about the way the brain processes different kinds of data (just as a random example…). The researcher states that their claim is a proposal, a construct, an artifice to describe a small phenomenon they observed. It is a way of putting that finding into language. It is not the truth.

    But teachers don’t like constructs; they like tools, they like ideas that they can use, can put into practice.

    This desire for practical tools creates a market for someone to translate that small, limited, prescribed, tentative claim and turn it into a tool that teachers can use… hence:

     Brain gym, learning styles, VAK, EQ, Thinking hats, mindset….

    These tools are created because there is a demand from teachers for them. Teachers are in the business of doing, of getting results in a classroom.  They want practical things that can help them do that, not tentative research constructs. So a company creates those certainties for them. Sometimes the original researcher is involved in the tool development and sometimes not. Sometimes the researcher is in despair at what the teaching profession goes on to do with their subtle, qualified claims and constructs.

    Neuroscientific abuse

     Neuroscience has been most susceptible to this kind of poor adoption. Sometimes the neuroscience itself has been bad science. More often, the application of the science by teachers has been bad practice. Neuroscience has that seductive appeal: the promise of unlocking the kernel of what learning actually is. But neuroscience does not and, indeed, cannot achieve that. Peering into the neural activity of thirty teenagers rampaging in science, lesson three, Monday morning, is currently beyond the scope of fMRI scanners. Teaching may draw on bits of hard neuroscience but, in the end, classroom teaching is a social, collective experience. Neuroscience does not adequately deal with collective cognitive-affective phenomena. Teaching is informed by studies inside the brain, but it will never be fully described by them. Teaching is a live happening, a collective event.

    Proper confidences of educational research

    That is why the appropriate discipline for measuring teaching and learning must remain social science. And teachers must be confident social scientists when they research the efficacy of their methods. They must be confident in what social science can claim and accepting of the limits of their social science research.

    So what are the limits and confidences of educational research?


    Because studies take place in real schools, and schools are particular rather than abstract instances of education, conclusions from educational research are always open and revisable. This applies to quantitative studies, but especially to qualitative ones.

    When teachers act as researchers into the efficacy of their own schools, they must acknowledge their own impact on the study outcome. There is a major ‘Hawthorne Effect’ when teachers lead internal school research, whereby the presence of the observer, as well as the sheer fact of conducting the observations, will by themselves improve results. Hattie’s comparative ‘effect size’ model, explained in Visible Learning, is one way to guard against exaggerating results.
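    The idea behind an effect size is to express a gain as a fraction of the spread in scores, so that improvements are judged against a common yardstick rather than taken at face value. A minimal sketch of the underlying calculation (standard Cohen’s d with a pooled standard deviation, not Hattie’s own tooling; the scores below are invented for illustration):

    ```python
    # Minimal effect-size sketch: mean gain divided by a pooled
    # standard deviation (Cohen's d). The pre/post scores are
    # invented for illustration, not real study data.
    from statistics import mean, stdev

    def effect_size(pre, post):
        """Cohen's d for pre/post scores: (mean gain) / pooled SD."""
        pooled_sd = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
        return (mean(post) - mean(pre)) / pooled_sd

    pre = [42, 55, 48, 60, 51, 47]   # hypothetical baseline marks
    post = [50, 61, 52, 66, 58, 53]  # hypothetical marks after the intervention
    d = effect_size(pre, post)
    # Hattie treats d of roughly 0.4 as the 'hinge point' above which
    # an intervention merits attention.
    ```

    A gain that looks large in raw marks can still yield a modest d if scores vary widely, which is exactly the check against exaggerated results that the comparative model provides.
    
    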


    Because studies are done ‘in the field’ rather than in the lab, there is a good chance the findings will be relevant and useful rather than valid but useless.

    Social science methodologies (case study, interview, participant survey, observation) are all valid. Quantitative methods (tests, controlled assessments) have some advantages in reliability (sample size, limitation of conflicting factors, reduction of noise) but disadvantages in applicability (how useful are the conclusions?). Bigger samples are always better than smaller samples, whatever the method.

    The researcher must know which methodological tools he is using, not claim to be screwing in a bolt when he is bashing in a nail. Beware the researcher who does not know which methodology he is using, why, and what evaluation of the data it will enable him to do, by what method.

    Beware the heavy boot of higher education. In the perceived academic hierarchy of the British mind, universities look down on secondary schools, which look down on primary schools. For their own reasons (to do with funding, amongst other things), universities are piling into school research right now. Their presence is welcome and their expertise in research design and analysis is vital. However, they generally have slightly less interest in actually improving school education than in measuring things designed to improve it. Teachers want to improve things; this is a noble and arguably more important goal, and we need to ensure the ‘doers of teaching’ retain control over the ‘measurers of education’.

    Research should be as non-intrusive as possible. As above, academics love hefty research designs. Schools will quickly become sick of research projects if they intrude too much upon the actual functioning of the school day. Design light projects which are lean and clean.

    The real opportunity of a school research project is for teachers to become learners again. A school should adopt an agreed ‘active-learning cycle’ (e.g. Kolb, Lewin, Honey and Mumford) as the structure through which all its research activities are driven. By adopting a common format for eliciting research questions, implementing studies, and applying and evaluating study data, schools will engage staff and pupils with a project’s goals, stages and benefits. Research projects will be coordinated and heuristic, not random and whimsical.

    Research should enrich the story. Teachers are social constructionists, telling stories that will capture the deeper participation of students. Live, ongoing research enriches that story, makes it fresh, open, edgy… I was asked recently by one school Head participating in one of our Human Ecology Education studies, ‘So presumably what you expect to find is X and Y…’ My reply was, ‘Not at all! I’m not assuming anything… this is why we are doing the research – to find out!’ Research actually discovers new things, which is what makes it exciting as part of a school’s life. The null hypothesis may be just as interesting and important as the result we expected to find. Heads should be sharing their current research questions and findings with their students all the time… it’s better than watching Spirogyra bubble oxygen in test tubes.

     Someone wise said ‘The unexamined life is a life not worth living’. Educational research is about finding out what works, but it is also about much more than that. It is not, in the end, about collecting more data, or obtaining more knowledge; it is about regaining wisdom, the wisdom to derive pleasure, care and pride in the crafting of education. 

    A score chart to plan your educational research study

    At Human Ecology Education, we have designed a rough rule-of-thumb set of criteria against which experimental designs can be scored. The higher the score, the stronger the study data is likely to be. I have suggested a score above 40 is an acceptable minimum. I hope it will help school-based researchers identify aspects of a proposed study that they can strengthen, giving them confidence in their ability to do research of genuine value. Feel free to take and use it as you see fit!


    About Simon P Walker

    Dr Simon P Walker is a man who has done too many degrees. After teaching in the humanities, science and social sciences across several research institutions he is now Director of Research at Human Ecology Education. He is a Visiting Fellow at Bristol Graduate School of Education and his specialist field is the regulation of Cognitive Affective Social (CAS) state in students and its impact on educational outcomes. 


  • Modern foreign languages in the primary and secondary school: Teaching the new National Curriculum

    Modern foreign languages in the primary and secondary school: Teaching the new National Curriculum

    The University of Reading is delighted to be able to offer free and extensive CPD for primary and secondary school language teachers through funding from the DfE, co-delivered by experienced local teachers and University tutors.

    The programme will begin with a whole day of input and activities, followed by monthly twilight sessions, and will end with a half-day event. In addition, French language upskilling sessions will be provided for primary teachers each month.

    DfE funding allows us to make a substantial contribution to supply costs for teachers attending the first and last events plus a minimum of three twilight sessions.

    The programme will be of benefit for all those teaching languages, especially colleagues leading languages provision in schools, as well as those new to delivering primary languages. The language of focus for primary colleagues will be French; for secondary colleagues, sessions will include examples in French, German and Spanish.

    Sessions will include:

    •   Creating a joined up KS2-3 curriculum for languages
    •   Developing core oral skills, including accurate pronunciation and spontaneous oral interaction
    •   Literacy skills in the foreign language – including reading for comprehension, appreciation and vocabulary development
    •   Developing grammatical competence across Key Stage 2 and 3
    •   Developing learners’ listening skills
    •   Assessment
    •   Primary-secondary transition

    Session 1: Whole day, Friday 10 October, 09.30-15.30, Institute of Education: The new National Curriculum across Key Stages 2-3 and principles of effective teaching and learning; developing learners’ listening skills; assessment and evaluation; transition issues.

    Further details of subsequent sessions can be found overleaf.

    As well as gaining a wealth of practical ideas, participants on the programme will also enhance their understanding of the principles that underpin effective language learning. On-line support activities for use between sessions will be available.

    To reserve your place, please go to the following link: or phone 0118 378 2612. Further information will be emailed to you a week before the first event.

    Twilight Sessions
    Session 2 (Speaking): 16.30-18.30, Thursday 6 November, The Willink School RG7 3XJ
    Session 3 (Speaking): 16.30-18.30, Wednesday 3 December, The Willink School RG7 3XJ
    Session 4 (Reading): 16.30-18.30, Thursday 15 January, The Piggott School RG10 8DS
    Session 5 (Reading/writing): 16.30-18.30, Wednesday 4 February, The Piggott School RG10 8DS
    Session 6 (Grammar): 16.30-18.30, Thursday 5 March, The Piggott School RG10 8DS

    Session 7 (Sharing practice; transition): Half day, Wednesday 25 March, 13.30-16.30, IoE
    Additional French language tuition will be offered on the following dates for primary teachers, with all sessions held at the University:
    Thursday, 23 October, 17.00-19.00
    Thursday, 20 November, 17.00-19.00
    Thursday, 11 December, 17.00-19.00
    Thursday, 22 January, 17.00-19.00
    Thursday, 12 or 19 February (TBC), 17.00-19.00

    We will pay for one day’s supply cover for teachers who attend the first and last event plus a minimum of three twilight sessions, with schools asked to fund the remaining half-day. There are no further costs for the CPD. Further details will be emailed out with joining instructions before the first event.

    This CPD is being delivered as a consortium led by the University of Reading and involves the following partners:

    •   Bartholomew School, Eynsham
    •   Cherwell School, Oxford
    •   Fair Oak Junior School
    •   Keep Hatch Primary School
    •   Oxford University Department of Education
    •   Radstock Primary School
    •   The Willink School, Burghfield Common
    •   Wellington College Teaching School Partnership
    •   Wokingham Secondary Federation 


  • Access to free research journals, reports and papers

    Access to free research journals, reports and papers

    Access to Research (via SUPER Network)

    The aim of this page is to post links to any freely available research that is not hidden behind paywalls, e.g. journal articles, conference papers, reports, research digests etc. (presented A-Z).



    Assessment and Learning: State of the Field Review (via Oxford University Centre for Educational Assessment)

    Access to Research –  Public Library Initiative

    BELMAS Publications (free access via free first year membership)

    BERA Research Intelligence

    BERA Insights and Briefings

    BERA Why Educational Research Matters

    BERA Research and Teacher Education

    Best Evidence in Brief (University of York, Institute for Effective Education)

    BES (Iterative Best Evidence Synthesis) Programme – What Works Evidence Hei Kete Raukura (New Zealand)

    Cambridge Primary Review Publications

    CUREE Publications

    EPPI Centre Systematic Reviews


    Educational Neuroscience: a teacher’s guide to the good, the bad and the irrelevant (slides and materials from Dorothy Bishop, University of Oxford)

    Forum: Qualitative Research

    Usable Knowledge (Harvard Graduate School of Education)

    Institute of Education Research News Bulletin

    Institute of Education Research Briefings

    The Internet and Education – free chapter by Professor Neil Selwyn, Monash University, Australia

    Jotter: Journal of Trainee Educational Research (Faculty of Education, University of Cambridge)

    Learning Landscapes (current issue: Spring 2014) – a peer-reviewed, open-access, themed journal published twice a year in Quebec, Canada. It works to bridge theory and practice by publishing submissions from across the wider educational community: teachers, principals, students, parents, university academics and community leaders.

    MESH Guides – supporting professional judgement with evidence (connecting educators with summaries and sources of educational research)

    NFER Direct – free email newsletters on NFER’s latest research etc (National Foundation for Educational Research)

    Research in Teacher Education (University of East London)

    Routledge Education Arena

    Routledge Education Class of 2013 (free access to most downloaded papers until 31st Dec 2013)

    Routledge: Primary & Secondary Education (free access until 30th Sept 2014)

    Routledge: September Spotlight Journal – Professional Development in Education (free access until 30th Sept 2014)

    Routledge: Education in Extreme Poverty, Conflict & Disaster (free access until 31st Dec 2014)

    Sandagogy – Sandringham School Learning Journal

    Teacher Leadership (free access to ECER 2014 Conference Papers by university lecturers/academics and school teachers)

    TLRP Publications

    Teaching & Learning Together in Higher Education Blog

    The Sutton Trust EEF Toolkit

    World Bank Research Digest


    Read the SUPER network blog here.

  • ResearchED 2014: Why every school needs a research champion.

    ResearchED 2014: Why every school needs a research champion.





    Carl Hendrick, Head of Learning and Research, Wellington College

    For too long the classroom practitioner has been the researched as opposed to the researcher. Teachers have been given answers to questions they didn’t ask, provided with solutions to problems that never existed, and assailed by counterintuitive theory when practical advice was more appropriate. Where there has been good research, it is often sidelined by short-termism, a near fetish for data or the Sword of Damocles of a looming inspection.

    By the same token, school staff rooms are often dominated by teachers whose only serious reflection on their practice comes from their own limited experience and confirmation biases, and whose only measures of success are exam results and league tables. This attitude is often typified by a deep and open antipathy to anything labelled ‘evidence-based’ or ‘research’. This is a professional practice that amounts to what Professor John West-Burnham terms ‘long-term self-indulgence.’

    Clearly, the division between education academia, situated almost exclusively in universities, and the classroom practitioner hacking away ‘at the coalface’ has not worked as well as it might have. Teachers are typically given a whistle-stop tour of education research during their PGCE year and are then furnished with very poor in-house support in terms of evidence-based training and approaches to teaching. All of us have sat through turgid CPD sessions where we have been given impressionistic, superficial droplets of education theory from an ocean of research that is often neither relevant nor resonant with our own practice.

    Some teachers do go on to do research in the form of an MA or PhD (usually the ones with the vacant stare and severe caffeine addiction), but how often is this funded and embedded in their schools? How often do schools harness this expertise in their school improvement plans and CPD? Again, the issue is the disjunction between research and practice, which has traditionally marginalised the difficult path of evidence-based approaches to school improvement and privileged the easier route of fads and quick fixes.

    So how do we emerge from this primordial soup? The role of head of research or research champion in schools should ultimately be about mobilising the wider evidence base and making it easy for classroom teachers to be more informed about what they do. I would like to see the following developments in the way schools engage with research:

    1. A researcher in residence located in schools.

    At Wellington College, we will be working closely with various HEIs, including the Harvard Graduate School of Education, on a number of projects, but I hope to appoint a full-time researcher in residence who will be located in our school on a regular basis. The researcher in residence should be a recognised academic who has published research in the field and can help ensure interventions are carried out properly. They should work cheek by jowl with teachers to improve their practice by helping with robust design methodology, literature review and evaluation. This in-house approach to research is widely used to great effect in other fields and could potentially transform teacher practice, leadership and whole-school policy.

    2. Schools need to ‘own’ their research questions

    Schools need to identify what it is they want to know, how they are going to ‘know’ it and what they are going to do to measure the impact and applicability of this new knowledge. At all of these stages the researcher in residence is key. Schools need to be asking their own questions about improvement based on specific CPD and school improvement plans. What may be appropriate for one school may not be appropriate for another and we should have the flexibility to frame our own point of enquiry and direct resources accordingly.

    3. Capacity: Research should be embedded in the life of the school

    This is a central problem. Most schools simply don’t have the time or capacity to resource something that is as yet unproven. Typically, research in schools is an ‘add-on’; it is not central to the day-to-day business of school practice, and this needs to change. For CPD and professional review there should be an area for research, with time allocated accordingly for staff to engage in it and further their knowledge.

    4. Better measures of success and impact

    Education research is often about trying to measure the immeasurable. Hattie’s now-seminal work is titled ‘Visible Learning’ for good reason: quality learning is often invisible to us and resists classification and categorisation. There is also the problem of what we use to measure success. The phrase ‘what works’ gets a lot of use at the moment, but if ‘what works’ is simply a proxy for ‘what creates good exam passers’, then we have a very jaundiced and impoverished barometer of what success actually is. More complex and elegant ways of evaluating the impact of interventions are needed.

    5. ‘Reflective’ practice not ‘best’ practice.

    We need to get rid of the phrase ‘best practice’ and replace it with ‘reflective’ or ‘informed’ practice. There should be no single ‘best’ way of teaching but rather a deeply reflective approach, informed by high-quality evidence and appropriate to the specific context in which it is applied. An evidence-based approach to education does not mean a uniform, ‘one size fits all’, axiomatic approach to what good teaching should look like. Whenever I have seen excellent teaching, it is almost totally unique, characterised by something idiosyncratic and deeply personal to that teacher. We need to resist the homogenising forces of the past that insist, for example, that too much teacher talk is bad, that kids learn best only in groups, or that learning how to ‘play the exam game’ is an acceptable outcome of the process of learning.

    6. ‘Standing on the shoulders of giants.’

    The first port of call for any school should be to look at the wider evidence base, specifically robust, methodologically sound and peer reviewed research and begin their inquiry from there. The work of the EEF and Rob Coe in the production of their Toolkit is one of the most significant developments in education and should be the point of departure for any school’s development plan.

    7. Schools producing their own publications.

    Dissemination of findings has to be a central part of the process. We plan to publish a learning and research journal here at Wellington next year which will showcase the work we have been doing in engaging with research and evidence-based practice. This will range from individual teacher enquiry in the form of MA/PhD work to whole-school research such as our two-year partnership with the Harvard Graduate School of Education. The publication will not be an academic, peer-reviewed journal (although I want academics to contribute) but rather a readable, accessible one that can be easily digested by staff, parents and, hopefully, students.

    The professionalisation of teaching

    Education research has provided teachers with enlightening and elegant ways of approaching their practice. There is an ever-growing and robust evidence base in a wide range of areas that has improved standards and enriched both teacher practice and student achievement. However, there has also been a history of ideologically driven, methodologically unsound and politically entrenched dogma in the name of education research that has compromised the very teachers and students it was intended to empower.

    The role of research champion is only strengthened by the emergence of researchED, which represents a hugely significant step forward in the professionalisation of teaching. It is a movement that has brought together some of the greatest thinkers in education and provided a vibrant and exciting platform for debate. This rich seam should be mined by every school in the country and utilised as a point of departure for all areas of school improvement.

    Read more about ResearchED at 

  • Photos from Harvard GSE Partnership launch

    Photos from Harvard GSE Partnership launch

