The University of Reading is delighted to offer free, extensive CPD for primary and secondary school language teachers through funding from the DfE, co-delivered by experienced local teachers and University tutors.
The programme will begin with a whole day of input and activities, continue with monthly twilight sessions and end with a half-day event. In addition, monthly French language upskilling sessions will be provided for primary teachers.
DfE funding allows us to make a substantial contribution to supply costs for teachers attending the first and last events plus a minimum of three twilight sessions.
The programme will be of benefit to all those teaching languages, especially colleagues leading languages provision in schools, as well as those new to delivering primary languages. The language of focus for primary colleagues will be French; for secondary colleagues, sessions will include examples in French, German and Spanish.
Sessions will include:
Creating a joined-up KS2-3 curriculum for languages
Developing core oral skills, including accurate pronunciation and spontaneous oral interaction
Literacy skills in the foreign language – including reading for comprehension, appreciation and vocabulary development
Developing grammatical competence across Key Stage 2 and 3
Developing learners’ listening skills
Session 1: Whole day, Friday 10 October, 09.30-15.30, Institute of Education: The new National Curriculum across Key Stages 2-3 and principles of effective teaching and learning; developing learners’ listening skills; assessment and evaluation; transition issues.
Further details of subsequent sessions can be found overleaf.
As well as gaining a wealth of practical ideas, participants on the programme will also enhance their understanding of the principles that underpin effective language learning. On-line support activities for use between sessions will be available.
To reserve your place, please go to the following link: http://store.rdg.ac/UoR-MFLITPSS. Further information will be emailed to you a week before the first event.
Session 2 (Speaking): 16.30-18.30, Thursday 6 November, The Willink School RG7 3XJ
Session 3 (Speaking): 16.30-18.30, Wednesday 3 December, The Willink School RG7 3XJ
Session 4 (Reading): 16.30-18.30, Thursday 15 January, The Piggott School RG10 8DS
Session 5 (Reading/writing): 16.30-18.30, Wednesday 4 February, The Piggott School RG10 8DS
Session 6 (Grammar): 16.30-18.30, Thursday 5 March, The Piggott School RG10 8DS
Session 7 (Sharing practice; transition): Half day, Wednesday 25 March, 13.30-16.30, IoE
Additional French language tuition will be offered on the following dates for primary teachers, with all sessions held at the University:
Thursday, 23 October, 17.00-19.00
Thursday, 20 November, 17.00-19.00
Thursday, 11 December, 17.00-19.00
Thursday, 22 January, 17.00-19.00
Thursday, 12 or 19 February (TBC), 17.00-19.00
We will pay for one day’s supply cover for teachers who attend the first and last event plus a minimum of three twilight sessions, with schools asked to fund the remaining half-day. There are no further costs for the CPD. Further details will be emailed out with joining instructions before the first event.
This CPD is being delivered as a consortium led by the University of Reading and involves the following partners:
Learning Landscapes (current issue: Spring 2014) – a peer-reviewed, open-access, themed journal produced in Quebec, Canada and published twice a year. It works to bridge theory and practice by publishing submissions from across the wider educational community: teachers, principals, students, parents, university academics and community leaders.
For too long the classroom practitioner has been the researched as opposed to the researcher. Teachers have been given answers to questions they didn’t ask, provided with solutions to problems that never existed, and assailed by counterintuitive theory when practical advice was more appropriate. Where there has been good research, it is often sidelined by short-termism, a near fetish for data or the Sword of Damocles of a looming inspection.
By the same token, school staff rooms are often dominated by teachers whose only serious reflection on their practice comes from their own limited experience and confirmed biases, and whose only measures of success are exam results and league tables. This attitude is often typified by a deep and open antipathy to anything too reflective, anything labelled ‘evidence-based’ or ‘research’. This is a professional practice that amounts to what Professor John West-Burnham terms ‘long-term self-indulgence’.
Clearly, the division between education academia, situated almost exclusively at the university, and the classroom practitioner hacking away ‘at the coalface’ has not worked as well as it might have. Teachers are typically given a whistlestop tour of education research during their PGCE year and are then furnished with very poor in-house support in terms of evidence-based training and approaches to teaching. All of us have sat through turgid CPD sessions where we have been given impressionistic, superficial droplets of education theory from an ocean of research that is often neither relevant nor resonant with our own practice.
Some teachers do go on to do research in the form of an MA or PhD (usually the ones with the vacant stare and severe caffeine addiction) but how often is this funded and embedded into their schools? How often do schools harness this expertise into their school improvement plans and CPD? Again, the issue is this disjunction between research and practice which has traditionally marginalised the difficult path of evidence-based approaches to school improvement and privileged the easier route of fads and quick fixes.
So how do we emerge from this primordial soup? The role of head of research or research champion in schools should ultimately be about mobilising the wider evidence base and making it easy for classroom teachers to be more informed about what they do. I would like to see the following developments in the way schools engage with research:
1. A researcher in residence located in schools.
At Wellington College, we will be working closely with various HEIs, including Harvard faculty of education, on a number of projects, but I hope to appoint a full-time researcher in residence who will be located in our school on a regular basis. The researcher in residence should be both a recognised academic and someone who has published research in the field and can help ensure interventions are carried out properly. They should work ‘cheek by jowl’ with teachers to improve their practice by helping with robust design methodology, literature review and evaluation. This ‘in-house’ approach to research is widely used to great effect in other fields and could potentially transform teacher practice, leadership and whole-school policy.
2. Schools need to ‘own’ their research questions
Schools need to identify what it is they want to know, how they are going to ‘know’ it and what they are going to do to measure the impact and applicability of this new knowledge. At all of these stages the researcher in residence is key. Schools need to be asking their own questions about improvement based on specific CPD and school improvement plans. What may be appropriate for one school may not be appropriate for another and we should have the flexibility to frame our own point of enquiry and direct resources accordingly.
3. Capacity: Research should be embedded in the life of the school
This is a central problem. Most schools simply don’t have the time or capacity to resource something that is as yet unproven. Typically, research in schools is an ‘add-on’; it is not central to the day-to-day business of school practice, and this needs to change. For CPD and professional review there should be an area for research, with time allocated accordingly for staff to engage in it and further their knowledge.
4. Better measures of success and impact
Education research is often about trying to measure the immeasurable. Hattie’s now seminal work is titled ‘visible’ learning for good reason: often quality learning is invisible to us and resists classification and categorisation. There is also the problem of what we use to measure success. The phrase ‘what works’ is much in use at the moment; if ‘what works’ is simply a proxy for ‘what creates good exam passers’, then we have a very jaundiced and impoverished barometer of what success actually is. More complex and elegant ways of evaluating the impact of interventions are needed.
5. ‘Reflective’ practice not ‘best’ practice.
We need to get rid of the phrase ‘best practice’ and replace it with ‘reflective’ or ‘informed’ practice. There should be no single, ‘best’ way of teaching but rather a deeply reflective approach informed by high-quality evidence and appropriate for the specific context in which it is applied. An evidence-based approach to education does not mean a uniform ‘one size fits all’, axiomatic approach to what good teaching should look like. Whenever I have seen excellent teaching, it is almost totally unique and characterised by something idiosyncratic and deeply personal to that teacher. We need to resist the homogenising forces of the past that insist for example that too much teacher talk is bad, or that kids only learn best in groups or that learning how to ‘play the exam game’ is an acceptable outcome to the process of learning.
6. ‘Standing on the shoulders of giants.’
The first port of call for any school should be to look at the wider evidence base, specifically robust, methodologically sound and peer reviewed research and begin their inquiry from there. The work of the EEF and Rob Coe in the production of their Toolkit is one of the most significant developments in education and should be the point of departure for any school’s development plan.
7. Schools producing their own publications.
Dissemination of findings has to be a central part of the process. We plan to publish a learning and research journal here at Wellington next year which will showcase the work we have been doing engaging with research and evidence-based practice. This will encompass everything from individual teacher enquiry in the form of MA/PhD work to whole-school research such as our two-year partnership with Harvard faculty. The publication will not be an academic, peer-reviewed journal (although I want academics to contribute) but rather a readable, accessible one that can be easily digested by staff, parents and, hopefully, students.
The professionalisation of teaching
Education research has provided teachers with enlightening and elegant ways of approaching their practice. There is an ever-growing and robust evidence base in a wide range of areas that have improved standards and enfranchised both teacher practice and student achievement. However there has also been a history of ideologically driven, methodologically unsound and politically entrenched dogma in the name of education research that has compromised the very teachers and students it was intended to empower.
The role of research champion is only strengthened by the emergence of researchED, which represents a hugely significant step forward in the professionalisation of teaching. It is a movement that has brought together some of the greatest thinkers in education and provided a vibrant and exciting platform for debate. This rich seam should be mined by every school in the country and utilised as a point of departure for all areas of school improvement.
Identifying the factors that may influence success inside and beyond the classroom
Dr Simon Walker, Lead Researcher, Human Ecology Education
Previous studies have shown that the Affective (i.e. emotional) and Social factors (AS factors) measured in this study develop over childhood. Young children (in years 3-6) show AS scores indicating that they seek novelty and change and exhibit high trust in themselves, coupled with low trust of others and a low willingness to disclose. As children develop through adolescence and into adulthood, these AS biases diminish, indicating a growing ability to self-regulate appropriately for the situation rather than simply be driven by internal drives.
CAS development over childhood
This is true across different populations of same-age children. For example, here are AS profiles of four different year 10 cohorts from four different UK schools. Each school was in a different part of the country and was of a different type (independent, day, boarding, academy).
In fact, what is true of Affective and Social scores also appears to be true of Cognitive scores. In the above chart, we have added perspective, processing and planning – Cognitive factors – to form a CAS profile. The students’ CAS profile is just as remarkably consistent as their AS profile.
However, this does not mean that school is having no impact on children’s CAS development. In 2013 we tested what impact schools have on student CAS scores.
Our Footprints CAS technology can measure the CAS profile of children when they are OUT of school and also measure it when they are IN school. This can give us an indication of the impact the school is having on the Cognitive, Affective and Social development of its students.
For example, this chart shows the dark blue CAS profile of one of the above schools when its year 10 students are OUT of school. The light blue profile, which is overlaid, then shows the profile of those SAME students when they are IN school.
Clearly this school is having a big impact on its students’ trust of others, trust of themselves, their desire to embrace change and their planning!
And different schools appear to have different impacts. Here’s a second school in which the OUT of school highs and lows of the seven factors are more or less levelled out when IN school.
Contrast this with a third school, in which the OUT of school and the IN school profile are almost identical.
So, our research suggests that different schools have a different impact on CAS (Cognitive, Affective Social) development of their students.
The question is, what’s the significance of those impacts? Does it matter that different schools have different impacts? Do those different impacts contribute to different student outcomes? Might those different impacts have a bearing, for example, on students’ academic outcomes, or their ability to manage social situations, or their aspirations?
Previous studies suggest they might. For example, we have already shown that a student’s ability to adjust their CAS state to an optimal state as they move from curriculum lesson to curriculum lesson (i.e. from maths to English to science) correlates with higher academic outcomes. See http://www.footprintsschoolsprogramme.co.uk/#/research/4574561474. Results suggest that this ability may in fact account for up to 10-15% of secondary students’ academic outcomes.
If CAS scores have an impact on academic outcomes, it seems likely that they may also have an impact on non-academic outcomes. For example, employability, or attractiveness to universities, or leadership qualities.
The CAS profile was originally developed by Human Ecology Education’s parent company in the 2000s as a measure of leadership mobility. It is used successfully to help employers select and train their internal leaders by indicating whether candidates have the requisite skills to adapt into the varied tasks, contexts and relationships required in the work place. It seems likely, therefore, that student CAS scores could have a bearing not just for success in the classroom but beyond school in the workplace.
We hope that this current study across state and independent schools will shed further light on CAS and its impact in determining student educational outcomes. We hope it will help us understand whether different models of schooling have different impacts on CAS development. We hope it will highlight whether there are gender differences in CAS development and which educational experiences are most likely to boost CAS development, so that our country’s education can be as effective as it possibly can be.
From September 2014 Wellington College will enter a two year partnership with Research Schools International led by Harvard Graduate School of Education Faculty to explore the broader topic of independent learning, specifically the areas of Growth Mindsets, Resilience, Grit and Active Learning. We will also be working closely alongside partner schools from our Teaching School Alliance in this process.
The initial direction of the project was decided in the Summer term 2014 through consultation with all staff at the College by a survey conducted by Harvard.
The project has three broad stages:
1. A comprehensive literature review of main areas and dissemination to staff.
This will be presented by Harvard GSE faculty to all staff and will provide the starting point for our enquiry. Strands and emerging themes from this review will be used to facilitate discussion at both a school and network level. Engaging with the wider evidence base is a vital part of this process and means examining not only what has been written in the field to date promoting independent learning but also examining its criticism, and alternative perspectives.
2. Collection of baseline data from all students, detailing exactly where students are in terms of the four areas outlined above. This will take the form of a quantitative and qualitative survey designed by Harvard to capture students’ attitudes and mindsets towards independent learning. The survey will be trialled with a group of student research fellows to test efficacy and appropriateness.
It is important that we have a big enough sample size, so we will collect baseline data from three additional schools from our Teaching School Alliance.
The findings of this research will be analysed by Harvard GSE and delivered to schools in Summer 2015 to inform choice of interventions in year 2.
3. Trial and evaluate interventions.
Based on the baseline data and what we have learned about independent learning, we will decide in consultation with Harvard GSE and our partner schools what interventions we might trial in year 2 to facilitate independent learning, to inform teaching practice and to improve student outcomes. These interventions will be then trialled and evaluated for impact and efficacy. It is planned to use multiple approaches including a randomised controlled trial.
There will be a launch event for this partnership on September 10th at Wellington College with a presentation from Harvard GSE faculty including Christina Hinton and Bruno Della Chiesa. All are welcome.
You can read more on the project in an interview with Christina Hinton here and Carl Hendrick here.
This conference drew delegates from around the world, for an analysis of what is rapidly becoming a global movement. With hundreds of people in the room, John Hattie introduced his 3 themes: understanding learning, measuring learning and promoting learning.
Throughout the day the reality was that there were other pervading ideas: the SOLO taxonomy was extolled as the holy grail (as a way of moving learning from ‘surface’ to ‘deep’), Dweck’s growth mindset received its fair share of positive press, and the benefits of making students struggle (in ‘the learning pit’) were mentioned time and again. In contrast, ideas like VAK were wholeheartedly lambasted.
In his keynote speech, Hattie made it clear that the job of the teacher is to facilitate the process of developing sufficient surface knowledge to then move to conceptual understanding. And this is teachable. The structure that this hangs off is the SOLO taxonomy: one idea, many ideas, relate ideas, extend ideas (the first two are surface knowledge, the latter two are deep). Another way of looking at this is that students should be able to recall and reproduce, apply basic skills and concepts, think strategically and then extend their thinking (by hypothesizing etc.)
So that’s surface and deep. Next Hattie described knowledge in terms of the ‘Near’ and the ‘Far’, i.e. closely related contexts or further afield relations – he proposed that our classrooms are almost always focused around near transfer. Hattie finished his keynote speech by briefly outlining 6 of the most effective learning strategies:
Backward design and success criteria. ES=0.54 (with ‘Outlining and Transforming’ the most striking at 0.85, although he didn’t really say what this actually meant). More straightforwardly, worked examples are at 0.57 – for me, as a Physics teacher, this is critical. Finally, concept mapping entered the hit parade with an ES of 0.64. Hattie then went on to discuss flipped learning, which he seemed quite positive about, perhaps because the effect size of homework in primary schools is zero – which he spun to be a positive: “What an incredible opportunity to improve it”.
Investment and deliberate practice. ES=0.51. Top of the table here was ‘practice testing’ (even when there is limited feedback). Hattie thinks that the key to this is that students are investing in effort. “We need to get rid of the language of talent”, including setting etc. Dweck’s mindset work was repeatedly referenced during the day, including an interesting idea about the dangers of putting final work on the walls – perhaps we should decorate our rooms with works in progress? But how do we make the practice that students do ‘deliberate’? Another author repeatedly referenced was Graham Nuthall and his work on needing three opportunities to see a concept before we learn it. I thought it was interesting that Nuthall was given such a glowing report when his book ‘The Hidden Lives of Learners’ includes relatively little in the way of attempting to measure and quantify his conclusions. Hattie’s conclusion to this section was the catchphrase: “How do we teach kids to know what to do when they don’t know what to do?”
Rehearsal and highlighting. ES=0.40. Some strategies here: rehearsal and memorization, summarization, underlining, re-reading, note-taking, mnemonics, matching style of learning (in order of effect size, with the latter at ES=0.17). The key here is to get kids to get sufficient surface knowledge so they can use their (limited) working memory to do the far learning. I thought it was interesting that matching learning styles gets such a bad press when it does, according to this, have at least a small positive impact.
Teaching self-regulation. ES=0.53. Reciprocal teaching – not just knowing, but checking that they know why.
Self-talk. ES=0.59. Self-verbalization and self-questioning.
Social Learning. ES=0.48. The top effect is via classroom discussion (at 0.82); Hattie stressed that this should not be a Q&A, but an actual discussion. “When you are learning something and you’re still not sure, then reinforcement from classroom discussion is the biggest effect” … but if the discussion is of something wrong, then people are more likely to remember it. The most memorable quote here was that “80% of the feedback in the classroom is from peers … and 80% is wrong”.
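The effect sizes (ES) quoted above are standardised mean differences. As a rough illustration only, and not Hattie's own method (his figures come from meta-analyses of many studies), here is a minimal sketch of how a single study's effect size (Cohen's d) is computed; all the scores below are invented:

```python
import math

def cohens_d(treated, control):
    """Standardised mean difference: (mean1 - mean2) / pooled standard deviation."""
    n1, n2 = len(treated), len(control)
    m1 = sum(treated) / n1
    m2 = sum(control) / n2
    # Sample variances (divide by n - 1).
    v1 = sum((x - m1) ** 2 for x in treated) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical test scores: a class taught with worked examples vs. one without.
with_worked_examples = [62, 70, 68, 75, 71, 66, 73, 69]
without_worked_examples = [58, 64, 61, 69, 63, 60, 67, 62]
print(round(cohens_d(with_worked_examples, without_worked_examples), 2))
```

On this reading, an ES of 0.5 means the average student in the intervention group scores half a standard deviation above the average student in the control group.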
What about direct instruction? ES=0.6. The important thing is sitting down with colleagues and planning a series of lessons, and then jointly discussing how you are going to assess. “If you go out and buy the script, you’ve missed the point”. Constructivist teaching only has an effect size of 0.17. The ‘guide on the side’ approach leaves behind the kids without self-regulation. This resonated with the work of David Didau (the learning spy). Interestingly, ‘problem solving’ has a negligible effect size, but ‘problem-based teaching’ has a large ES.
And what about IT? Technology is the revolution that’s been around for 50 years and has an ES=0.3. Teachers use technology for consumption purposes, e.g. using a phone instead of a dictionary. That’s why the ES is so low. If you use technology in pairs, then the ES goes up. Why? Because they communicate and problem solve; i.e. use it for knowledge production. Three linked concepts were mentioned: the power of two. Dialogue not monologue. The power of listening. Compare this to the quip: “Kids learn very quickly that they come to school to watch you work”.
Feedback? The question of feedback is not about how much you give, but how much you receive. Most of the feedback is given, but not received. Students want to know “Where to next?”, so we should show another way, giving direction. This is incredibly powerful. “How do teachers listen to the student feedback voice, to understand what has been received?” This is at the vanguard of Hattie’s current research.
Error management? Typically errors are seen as maladaptive … and teachers create that climate: solving the error, redirecting to another student, returning the correction to the student who made the mistake, or (although hardly ever) ignoring the error. Hattie sees errors as the essence of learning. He mentioned the teaching of resilience as an example of best practice.
Session 1: the Visible Learner (with Deb Masters)
In her work with John, Deb has developed a model for measuring the effect of feedback, asking the question: how do you take the research and put it into a process in schools? She calls this ‘Visible Learning plus’. We were asked to come up with our ideal pupil characteristics: questioning, resilient, reflective, risk-takers. And the least ideal: not proactive, defeatist. No surprises there, then.
Deb defined visible learning as “when teachers SEE learning through the eyes of the student and when students SEE themselves as their own teachers.” So the job is to collect feedback about how the students are learning.
We also need to develop assessment-capable learners (ES=1.44). What does this mean? Students should know the answers to the questions… Where am I going? How am I doing? Where to next? Students should be able to tell you what they will get in upcoming assessments.
This workshop slightly lost its way towards the end as time ran out. We quickly looked at the use of rubrics to develop visible learners, and I was struck by the links with the MYP assessment structure.
Session 2: SOLO taxonomy (with Craig Parkinson – lead consultant for Visible Learning in the UK)
This is based on the work of Biggs and Collis (1982) and was an interesting and practical session. Much of it was based on the ‘5 minute lesson plan’ (which I remain unconvinced about, despite liking the idea of focusing on a big question). The key is to design and plan for questions that will move students from surface to deep learning (one idea, several ideas, relate, expand). SOLO was the preferred model here, over the well-established Bloom’s taxonomy. I was sitting next to Peter DeWitt, whose blog ‘Finding Common Ground’ expands on this.
Session 3: Effective feedback (Deb Masters)
“If feedback is so important, how can we make sure that we get it right?” For feedback to be heard the contention was that you need “relational trust and clear learning intention”. I agreed with the former, but am less convinced by the latter. What do students say about effective feedback? “It tells me what to do next”. Nuthall was mentioned again – 80% is from other kids, and 80% is wrong. Why is there such a reliance on peer feedback? Students say that the best feedback is “Just in time and just for me” … and interaction with their peers is a good way of getting this.
Deb used the golf analogy to discuss the levels of feedback:
Self … praise (“cheerleading does not close the gap in performance”).
Task … holding the club etc. This is often where teacher talk features the most.
Process … what do you think you could do to hit the ball straighter?
Self-regulation … what do you need to focus on to improve your score?
The idea is to pick the right level at which to give the feedback.
Can we use the model to help the pupils to give each other and us feedback? I was particularly struck when one delegate from a large school in Bahrain suggested that they are experimenting with the use of Twitter to get instant feedback about the teaching in real time!
Keynote 2: James Nottingham: Visible Learning as a new paradigm for progress
James started with a critique of the current labelling practices that occur in schools. For example, every single member of the Swedish parliament is a first-born child, and 71% of September births get into top sets compared with only 25% of August births: “Labelling has gone bananas … if you label pupils then you affect their expectation of their ability to learn”.
Eccles (2000): Application = Value x Expectation
Again, progress should be valued rather than achievement. How do we go about getting this … what is the process involved?
The ‘learning pit’ was discussed (Challenging Learning, 2010). Often teachers try to make things easier and easier…the ‘curling’ teacher (push the stone in the right direction and then desperately clean the ice to make it easier for it to go further). I liked that analogy. James (rightly in my view) said that our job is to make things difficult for pupils, after all “Eureka” means “I’ve found it”. I’m sure his book will expand on this, but his basic structure was:
Conflict and cognitive dissonance
Some thoughts from the day
The key message that came through from the whole conference was that everything has to hang off the learning objectives / the learning intentions. Is this just because their research requires a measurement of outcome? This is performance, but not necessarily learning. The question is whether the interventions that Hattie has found apply to effective classroom performance and learning…or just performance? I was struck by the contrast between this and what Didau talks about.
Throughout the day there was an interesting use of instant feedback – point to one corner of the room if you know about x and the other corner if you don’t.
Hattie recognizes that we are extremely good at the transfer of ‘near’ knowledge, but not good at the ‘far’ … and that is okay: we shouldn’t throw out the baby with the bathwater.
“It’s a sin to go into a class and watch them teach … because all you do is end up telling them how to teach like you”. You should go into the class to watch the impact that you have.
Should we stop the debate about privileging teaching?
Can we plot a graph of achievement against progress for our students? This can allow you to make interventions with the drifters.
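One minimal way to operationalise that idea: place each student on both axes and flag those whose progress lags. The sketch below is purely illustrative; the names, scores, scales and the 0.3 threshold are all invented, and a real school would need defensible measures of both quantities:

```python
def flag_drifters(students, progress_threshold=0.3):
    """Return the names of students whose progress falls below the threshold,
    regardless of their absolute achievement."""
    return [name for name, (achievement, progress) in students.items()
            if progress < progress_threshold]

# Hypothetical (achievement, progress) per student, both on a 0-1 scale.
cohort = {
    "A": (0.8, 0.7),
    "B": (0.6, 0.2),   # solid achievement, little recent progress: a drifter
    "C": (0.4, 0.5),
    "D": (0.7, 0.1),   # another drifter
}
print(sorted(flag_drifters(cohort)))
```

Plotted as a scatter graph, the same data makes the "drifters" visible as the band of points low on the progress axis, wherever they sit on achievement.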
How do we measure progress?
Do we have enough nuancing of assessment levels?
Hattie: “What does it mean to have a year’s growth / progress? We have to show what excellence looks like. Proficiency, sure, but the key is the link with progress.”
And one final thought: “Visible learning into action” will be out April – June next year to show how this might be put into practice in schools.