Guest post by Dr. Richard Farrow
Part of the problem with the current rush to research has been that, in making things accessible, well-meaning organisations such as the Sutton Trust have divorced teachers even further from actual research by publishing guides summarising findings. This is all well and good for many people, but when a result is non-intuitive, as in the case of the non-impact of TAs (teaching assistants), many have questioned the findings. In order to do this, consulting the original sources is probably a wise move. This is a perfectly natural and valid thing to do, and something people in a research-rich environment should encourage.
The question is how to do that. Our first problem is that many academic papers sit behind a paywall, because they usually form part of a journal that is paid for, either online or in print. Academic journals have traditionally been expensive to buy, and many rely on charging a lot for a small number of copies to keep publishing. I would like to see the EEF pay for every registered teacher to have access to a system where research papers can be read and downloaded for free (something that already happens in Scotland). So, assuming you can access a paper, what are the other barriers for lay people reading research papers? I have made a list:
- They don’t know where to look for the evidence or what they need
- They have no idea about the method being used and whether it is valid or not
- The academic language used sends them to sleep after a few seconds
- It simply doesn’t make sense
Points 3 and 4 are easy to address. Academic language is a must for people who work in academia. However, even when you have deciphered the hieroglyphics and think you know what something means, if you still can't make head nor tail of it, it is potentially rubbish. Often a conceptual issue provides the boundary, because some concepts are interpreted differently across disciplines, but the key is to make sure the language amounts to something you recognise and makes sense to you as a practitioner. There is a push towards developing "research literacy" amongst teachers. Membership of the National Teacher Enquiry Network (NTEN) gives access to academic papers and provides support. If you are looking at this as an avenue for improvement, further information is available here: tdtrust.org/nten/home/
More pressing are points 1 and 2, and, further clouding the water, they may be language-dependent too. Evidence is a hard concept to pin down, and research papers use different types of evidence to make their points. A quick list of where evidence could come from:
- Survey/Interview data – written or verbal responses, generally designed to yield quantifiable results from which to make judgements
- Statistical data – e.g. exam results, at school, local authority, or national/supranational level (think PISA)
- Interviews – with practitioners, children, and others – designed to add "flesh to the bones" of statistical data
- Observations – of anything really: children, classes, teachers, etc.
- Other literature – as in the case of meta-analyses
- Philosophical tools – such as analysing and rewriting something from a Marxist (or whatever) viewpoint.
There are more, but most fall into one of these categories.
In the past certain methods were considered more valid than others, and this continues to evolve and change as the requirements of research become different. There are two main ways of carrying out social research: quantitative and qualitative. There are pitfalls in both approaches, and no one would pretend otherwise. A combination of the two is also used, referred to as "mixed methods," an approach that is growing in popularity in educational research. In my own opinion, a research area that people want to investigate properly needs both approaches, together or separately. The more evidence you can get, the better. People often dismiss qualitative research as mere opinion, and done badly it can deserve that. However, the same allegation can be made against a purely statistical approach, which can miss the point and flatten out nuance across large data sets.
This issue is not addressed by meta-analyses, which attempt to review a number of papers and make value judgements from them. Researchers gather papers on a certain topic, take their data and findings, and synthesise them. The problem is that unless the papers are analysing the same thing, the comparisons can be useless. Repeat research is vital for understanding to improve, and a general rule is that the more studies exist on something, the better. Education is a field where, crucially, very little repeat research is done: the figure given is that less than 1% of research papers actually repeat something done before. In other disciplines this is much higher. Meta-analysis as a tool looks great, but in reality it gives false results when it does not compare like-for-like studies. This is the main issue I have with the Sutton Trust toolkit. While its aim is noble, the way the results are derived is full of holes so big you could drive a truck through them. The further fact that it gives a numerical value AND an increase/decrease in months of learning makes it hard to take seriously. It is a shame that so many people appear to take the findings at face value. Perhaps this is the point of this blog: a call for practitioners to look past the overall analysis and get back to the roots of the results. Developing that "research literacy" will help.
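To make concrete what "synthesising" means, here is a minimal sketch of the standard fixed-effect, inverse-variance pooling used in many meta-analyses. The effect sizes and standard errors below are made up for illustration, not taken from real studies:

```python
import math

# Hypothetical (effect size, standard error) pairs, one per study --
# e.g. standardised mean differences. Illustrative numbers only.
studies = [(0.20, 0.10), (0.35, 0.15), (0.05, 0.08)]

# Fixed-effect model: weight each study by the inverse of its variance,
# so more precise studies count for more in the pooled estimate.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(round(pooled, 3), round(pooled_se, 3))
```

Note the model's built-in assumption: every study is estimating the same underlying effect. As argued above, when the pooled papers are actually measuring different things, this produces a number that looks precise but means very little.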
What a research paper has/should have:
A paper will start with an abstract, which summarises the research in around 100-150 words. This is an extremely difficult thing to do, so it is probably better to skip this bit if you have committed to reading the whole thing.
Next should come a short paragraph introducing and summarising the aims and findings of the research. In papers written for press attention this is called an executive summary, and it can be much longer. Often an executive summary is skewed somewhat, attempting to present findings that back up the ideological view of whoever commissioned the research. Again, if you have committed to reading, I'd skip this bit.
Following this you should have a methodology section. This explains why the research was carried out and where it fits in the current literature. It should link to a method of analysis used in other papers. This section is often missing in educational research. That is not because the research isn't valid; it is because education often lacks universally recognised tools of analysis. It is, however, a weakness and a symptom of the lack of repeat studies.
Next comes the research design/method section. This explains how the research project was carried out. If it is a survey paper, it will present the terms of the survey and what is being looked for. If it is statistical, it will state the sample size and indicate where the data came from. In education research this section is often inadequate. As a general rule, the larger the sample size, the better the research. In some forms of statistical analysis there is a commonly cited sample size, around 1,000, at which the margin of error falls to roughly +/- 3%. The smaller the sample, the wider the confidence interval, which is bad for drawing conclusions. In a qualitative paper this is a complicated section to put together, but again, the more people who have been interviewed/observed/questioned, the better.
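The "+/- 3% at around 1,000" figure can be checked with a quick calculation. A minimal sketch, assuming a simple random sample and the standard normal approximation for a proportion at 95% confidence:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    p=0.5 is the worst case (widest interval); z=1.96 is the
    critical value for 95% confidence under the normal approximation.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Margin of error in percentage points for a few sample sizes.
for n in (100, 500, 1000, 2000):
    print(n, round(margin_of_error(n) * 100, 1))
```

At n = 1,000 this comes out at about 3.1 percentage points, matching the rule of thumb; at n = 100 it is nearly 10 points, which is why conclusions from small samples are so fragile.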
After this a data section, often called 'data collection', should be present. It gives you the raw data gained from the project. For statistical projects, you can check whether the data comes from a fair range of sources. In a qualitative paper this section may include quotes or statements that address the key findings of the research. If a class has been observed, details of its composition and the research undertaken will be included. This section is absolutely vital for finding out whether the research is relevant to you. If you are looking at TA impact and this section does not address a point you have in your school, it is pointless to consult. If your TAs work with a different group every day, while the study only has TAs who stay with the same group, whether or not they make an impact is of no relevance to you. You will need to search for a paper that looks at TAs working with different groups to assess their impact. This is the issue with meta-analyses: the included papers do not often address the same point, yet a general conclusion is drawn anyway. If you take that conclusion and make changes to your school as a result, you may be making a large mistake.
Next comes the analysis, where the data is interpreted and conclusions drawn. The author will be looking for ways to make a point, and in reading this section you must make sure the conclusions drawn are actually backed up by the data. One mistake education papers often make is overreaching from their findings: the researchers carry out a very small-scale project involving two or three classes, then attempt to say it could have a national impact. What they should be saying is: "more research is needed in this area to find out if these results are valid." What they are likely saying is more like: "our findings show this intervention works and we expect that this will continue as the project expands." If the conclusions are not backed up by the raw data, then you have a problem.
Finally you should have a conclusion, where the main findings are summarised and conclusions drawn. This should give anyone who is wishing to build on your paper, points to consider when carrying out their own research. Again, the conclusions should be backed up by the data and the analysis section.
In sum, if the paper you are reading does not have these sections, it is possibly not worth reading. The fact that a huge amount of educational research misses out one or more of them leaves us with a problem. This is where I would always urge you to focus on the data section. Assuming the people who carried out the research are competent, the data should be useful. You are capable of analysing this sort of thing yourself, and I would urge you to do so.
Specific advice for school leaders looking to move staff towards a research based model of school improvement:
Get involved in researchED and related activities
Make links with local universities who have educational research departments
Join existing projects through making contacts via the above
Make sure you have the time to embed this culture in your school
Be enthusiastic; it will be difficult to drive this forward if you are not committed.
Any questions get me on twitter @farrowmr