Collaboration, Lesson Study, Practitioner Enquiry

What impact will the use of ‘Socratic questioning and evaluation of the lesson questions’ have on the outcomes of B/A-targeted students in terms of written responses to exam-style questions? Flynn, L., Maher, C., Murphy, C. (2016)


Subject Areas:  Maths, English, Business Studies

This research was conducted under the guidance of S. Gunessee by C. Maher, C. Murphy and L. Flynn. The aim of this lesson study was to examine the effect that Socratic questioning and evaluation had on the written responses of a specific group of Year 11 students who had B/A targets but were found to be underperforming. Each teacher (CMH, CMU and LFL) selected their own set of five target students based on progress in that teacher's subject. The research was conducted by CMH, CMU and LFL over the course of nine months, with SGU reviewing the work at points during the year via discussions and IRIS observations.

CMH, CMU, LFL and SGU spent a couple of weeks trying out and testing different ideas and teaching pedagogies that each individual teacher was interested in. These included Socratic questioning (which we ultimately decided was best suited to our students' needs) along with coaching questioning. The use of Socratic questioning required some further reading to ensure we were confident with what needed to be done; this came in the form of a range of published reports, along with attending the Thursday Morning Thrillers that OHS has established, where teachers share their experiences and skills with the teaching staff of the school.

Planning

What did existing evidence suggest could improve outcomes?

  • Students take more responsibility for being active in their learning and for diagnosing weaknesses when answering exam questions
  • Students can break down the question to the core principles and build and tailor their answer to the question being asked
  • Feedback can be given more frequently but importantly more effectively to challenge individual problems in written work
  • More time to develop bespoke skills in Business, English and Maths, such as knowledge of concepts and application to exams
  • Improved discussion which adds depth to essay writing
  • Students become self-learners and lifelong learners by understanding how to answer the question being asked
  • Improved feedback cycle between teacher and student

 Had any colleagues tried anything similar?

I was aware that many teachers at Oriel High School use Socratic questioning and write out a list of questions to ask students; however, no one had done previous research into this area at Oriel High School. Socratic questioning was inspired by the Greek philosophers and can be used to “create a critical atmosphere in your classroom that probes thinking and gets students questioning in a structured way” (Neill, Rosie, 2015).

A brief summary of our approach as originally planned:

The research was originally intended to examine the impact that Socratic questioning would have on creating better outcomes for exam responses. The focus was very much rooted in something which could be quantitatively measured; however, it became clear that the ‘knock-on’ effects of conducting this research were also having an impact on levels of independence, which is particularly important at KS4 in order to prepare students for KS5 and further education, as staff and students often take for granted students' ability to apply their knowledge to any question being asked and write an exam response to it.

The study was focused on Y11 students across three departments (English, Maths and Business), primarily as a way to examine the impact Socratic questioning would have on their ability to sit their GCSE PPEs and exams confidently.

Y11 students were informed at the beginning of the course that the teaching approach would be changing. They were very positive about the rationale behind the move and eager to engage with all the techniques used throughout the year to examine its impact. Through discussions between the students and SGU, LFL, CMU and CMH, it was established that employing a coaching method in conjunction with Socratic questioning would allow us to investigate whether this strategy was having an impact on students who are more able but not meeting their targets. Nonetheless, we decided that the impact of this coaching tool would need to be assessed across the whole class, as this would be more beneficial in the long run. It was also made clear to the Y11s that this method of questioning would involve less teacher-led modelling of answers in class and more deconstruction of knowledge for further discussion, exploring essay writing beyond a uniform approach.

Through in-depth discussion and further secondary research, it was concluded that the way to examine the impact of Socratic questioning across three departments would be through IRIS self-reflections, student perception questionnaires and snapshots of progress throughout the academic year.

Enquiry question

What was your enquiry question?

“What impact will Socratic questioning and evaluative questioning have on the outcomes of B/A-targeted Y11 students in terms of written responses to exam-style questions?”

Our main objectives as a group were to ensure that by the end of the study our pupils would have new confidence, a better understanding of the marking scheme and the ability to question their own prior knowledge. We as teachers ask the important questions that open and start discussions and get our pupils' minds flowing. We are the catalyst that encourages pupils to dive into the prior knowledge they have stored away. But what if pupils didn't need us and could ask the important questions themselves? These Socratic questions could be taught and nurtured so that when pupils sit that final exam they can ask themselves those vital questions. Over the course of the year, videos, documentaries, textbook reading and articles were used as stimuli for discussion at the beginning of lessons.

It started with all of us using the same template and style of Socratic questions: where, why, how would, etc. From here we tested and used IRIS to video our lessons to see the impact that implementing this template would have on the pupils. It took a lot of scaffolding to help the pupils use the question template correctly. We began our lessons by showing an exam question to the pupils, then showed them a template that we had already filled in with questions related to answering that exam question. Modelling the questions and layout was vital for pupils to fully understand and access the task. These were questions that we as teachers ask all the time, questions which open up the minds of our pupils and get them thinking. When I showed the pupils my sample answer, they recognised how these questions had helped me answer my exam question. We all discussed the types of questions that we could create to help us answer the next exam question. The pupils realised that they too could create and ask these questions. This experiment allowed the pupils to come up with questions that perhaps they hadn't thought of before. We used IRIS during our lessons on a few occasions, giving pupils sufficient time to discuss and create questions related to a particular exam question.

How was it modified?

As the research continued it became apparent that Maths needed to alter the style of its question template, as the questions didn't access the topics or help the pupils uncover their prior learning of mathematics. From this, an individual question generator was created with questions that related to all areas of maths, such as: what equipment do you need for this question? What formula could you use for this question? What do you know about this topic already? The change in questions had a huge impact on my pupils' ability to assess and expand their knowledge through questioning. We practised and used IRIS to video a lesson using this different template; the video showed a pupil clearly explaining each step of solving his exam question. We saw pupils work collaboratively to create questions, share ideas and comprehend the mark scheme.

Evaluation

How did you evaluate the impact of your enquiry project?

After each IRIS video, CMH, CMU and LFL met with SGU to discuss how they were getting on and possible ways to take the study forward in the next trimester. We looked at many different indicators that could demonstrate student improvement and deeper understanding, such as:

  • Student perception
  • Student understanding of each question and what would be required of them in an answer to achieve the maximum possible marks
  • Academic progress (professional predictions)
  • Refining the quality of activities experimented with over the research
  • Quality of essay writing in English and Business Studies and quality of development of answers in Maths

Summary of Questionnaire Findings for Yr 11 Business, English and Maths Action Research 2015-2016

If you require a copy of the questionnaire do not hesitate to contact me at lflynn@oriel.w-sussex.sch.uk , sgunessee@oriel.w-sussex.sch.uk , cmaher@oriel.w-sussex.sch.uk, cmurphy@oriel.w-sussex.sch.uk

Y11 Business had 21 students in September 2015. The table below represents the students' grade band distribution from their external Year 10 PPE, their Oriel target grades and professional predictions as of 2016.

| Student – Business Studies | Oriel Target | Professional Prediction (November 2015) | Current Level (November 2015) | PPE Result (January 2016) | Professional Prediction (Jan 2016) | Professional Prediction (March 2016) | Current Level (March 2016) | Final Professional Prediction (May 2016) |
|---|---|---|---|---|---|---|---|---|
| Student 1 | B+ | C- | B+ | D | D | C- | B+ | C- |
| Student 2 | A- | C | A+ | C- | B | B- | A+ | A- |
| Student 3 |  |  |  |  |  |  |  |  |
| Student 4 | B+ | C | A+ | D | C | C+ | A+ | B+ |
| Student 5 | A- | C | A+ | A* | A | A | A+ | A |

Y11 Business Studies Analysis – 18 out of 21 Y11 students completed the questionnaire

1.        Did you find the use of the question generator/grid useful in your lesson?
Yes – 4 No – 7 Sometimes – 7
2.        If yes, how did you find the question generator impacted on your ability to understand the lesson?
•               It helps me to understand aspects of the topic which im not to sure about

•               Helped me to identify simple and complex questions an how I could answer them.

•               some of them were pointless

•               It helped because it helped to frame exam questions that could come up which I found useful

•               some of them were pointless

3.        Has the question generator been useful in preparing you for your exams?
Very useful – 5 Useful – 5 Satisfactory – 5 Not useful – 2 Not very useful – 1
4.        Are you more confident in approaching exam questions?
Very confident – 0 Confident – 2 Satisfactory – 9 Unconfident – 6 Very unconfident – 1
5.        What weaknesses have you identified in your exam responses as a result of the question generator?
·         9 mark conclusion paragraphs

·         I find that the question generator is not useful when it comes to exam questions as it isn’t in the same format as the exam.

·         Sometimes I go off track

·         I think my answers have become brief when answering exam questions.

·         Not all are very specific to the sort of exam questions you would get in an exam

·         need to remember to expand my answer

·         nothing

6.        What can your teacher do to improve the question generator task for future year groups?
·         Help me to answer all simple and complex questions and to help identify what needs to be spoken about in the answer

·         To start to use them at the beginning of year 10 and use them more frequently

·         Get the to make us answer them around a certain topic

·         Possibly make them the same type of format as the exam

·         Put them into groups so they can all come up with questions for the generator

·         less questions so that its easier to go into depth

·         Nope it’s a good idea but not as effective as other methods

7.        How confident are you with exam layout and understanding the mark scheme due to the question generator?
Very confident – 2 Confident – 1 Satisfactory – 8 Unconfident – 6 Very unconfident – 1
8.        If you answered very confident or confident for question 7, what have you learned exactly in relation to exam layout and/or mark schemes?
·         How to structure an argument

·         9 mark structure

·         That the break down of what the question means is actual quite simple and faintly outlined in the question

·         I know how to answer them better

·         From doing exam questions continuously i have picked it up

 

9.        Do you feel the question generator would be useful at the beginning of Y10 and throughout Y11?
Yes – 7 No – 11
10.     If you answered yes to question 9, why?
·          So that you are using sophisticated language in both years

·         Gives time to get used to the question generator and it helps to answer 9 mark questions in more detail

·         It gets them ready for the exams

·         Give students a chance to understand topics they found difficult

·         It made me elaborate more

·         I think it has helped me to become a bit more confident in what questions could come up.

·         gets you thinking about the questions from early

·         If you were to use the question generator I would recommend you use it in year 10 as then in year 11 the students could focus more on the exact format of the questions in the exams.

Y11 Maths contained 30 pupils; below are the five pupils who were chosen for analysis throughout the lesson study. The table below represents these five students' grade band distribution from their external KS4 assessments, their Oriel target grades and professional predictions as of May 2016.

| Student – Maths | Oriel Target | Professional Prediction (Nov 2015) | Current Level (Nov 2015) | PPE Result (January 2016) | Professional Prediction (Jan 2016) | Professional Prediction (March 2016) | Current Level (March 2016) | Final Professional Prediction (May 2016) |
|---|---|---|---|---|---|---|---|---|
| Student 1 | B | C | D+ | D | C- | C- | C- | C |
| Student 2 | B | C | D | D | C- | D+ | C- | C- |
| Student 3 | B | B- | B | C | C+ | B- | B- | B |
| Student 4 | B | C- | C- | C | C- | C- | D+ | C |
| Student 5 | B | C | D+ | D | D- | D+ | D+ | C- |

Y11 Maths Analysis – 20 out of 30 Y11 students completed the questionnaire.

11.     Did you find the use of the question generator/grid useful in your lesson?
Yes – 5 No- 10 Sometimes  – 5
12.     How did you find the question generator impacted on your ability to understand the lesson?

·         Depending on the topic, it helped significantly or a just a bit.

·         Very well

·         Helps by given picture of what could be in the exam

·         Things like ‘what is my answer supposed to look like?’ was helpful but I could never think of any other questions?

·         If I didn’t know the answer it was be helpful to know at least what it looks like.

·         I learnt what the question was asking

·         I understood it more.

·         The only question that really helped me was- what should the answer look like? Which made me think of the steps between?

·         It gave foundations for me to work off and gave me a god idea of where I may need to improve.

13.     Has the question generator been useful for preparing you for exams?
Very useful – 0 Useful – 2 Satisfactory – 4 Not useful – 9 Not very useful – 5
14.     Are you more confident in approaching exam questions?
Very confident – 1 Confident – 4 Satisfactory – 10 Unconfident – 4 Very Unconfident – 1
15.     What weaknesses have you identified in your exam responses as a result of the question generator?

·         I’m not good at understanding what the question us asking

·         Memorising formulas

·         Quadratic equations

·         I think more about the question

·         If I didn’t know what the question was meant to look like I wouldn’t be able to use it.

·         I need to write more to get full marks

·         That I didn’t know how many marks were for each question.

16.     What can your teacher do to improve the question generator task for future year groups?

·         Nothing

·         Explain it better and use it more

·         Select easy to hard questions so that people who find the topic easy can go for the hard and people who find them hard can go for the easy.

·         Use exam questions from a range of past papers.

·         Ask students to find what everyone else is weak at and go through that.

·         Give as homework

·         Have hints/ideas to help think of better questions.

·         Make it more interactive.

·         To use it whilst going through the completed PPE’s

17.     Are you confident with exam layout and understanding the mark scheme due to the question generator?
Very confident – 4 Confident – 3 Satisfactory- 6 Unconfident- 6 Very Unconfident – 1
18.     If you answered very confident or confident for question 17, what have you learnt exactly in relation to exam layout and/or mark schemes?
·         How to achieve all or majority of the marks available.

·         Not sure

·         How much information you need to include to get each mark required

·         How it works

·         I look at the marks to understand what is needed for each question.

·         That I need to write enough and show every part of my working out.

·         By thinking about the end answer I know which steps to take to getting there.

·         I have learned all the different steps of working out needed to answer 3,4 and 5 mark questions

19.     Do you feel the question generator would have been useful at the beginning of Y10 and throughout Y11?
    Yes – 10 No – 10
20.     If you answered yes to question 19, how?
·         It’s a quick and simple task to educate you through a topic.

·         Because it prepares you for what any kind of exam question might look like.

·         Because it is revision.

·         I answered no because I struggle with thinking of questions.

·         It gives you a guideline of how to answer questions.

·         To start off earlier recognising what questions are meant to look like.

·         If you continue using it throughout the two year the students are more likely to relate back to it.

·         As by the time you got into year 11 you wouldn’t need them as much as you’d have more practice.

·         Because the student can get used to it and the longer it is used the more effective it will be in improving student’s exam performance.

21.     Would you find the question generator helpful in other subjects?
Yes – 7 No – 14
22.     If you answered yes to question 21, how?
·         You could use it for any contextual subjects.

·         Help understand topics.

·         Because it prepares you for what your exam question might look like and therefore you can have more of an idea of how to answer it before you go into the exam.

·         I said no because maths has only one answer (unless it is quadratic equations) but with other lessons everyone will have different answers making it harder to mark.

·         In other subjects there can be more complex things to understand so by creating questions it can break it down into little bits to help you work things out easier.

·         Help me to understand what the question is.

·         It would help me understand the question.

·         The only other subject that it would be helpful for would be history as I find out the layout of exam styled questions very hard.

·         It could be in some subjects such as science, but only certain science related questions. Other subjects often require too much writing and too much time for it to be fully effective.

English consisted of 29 students at the beginning of the study and throughout; however, only 12 students completed the questionnaire. As it was set for homework, many of the students may not have remembered to do it, which may limit how well the responses reflect the actual effect of the question generator.

| Student – English Lit | Oriel Target (English Lang) | Professional Prediction (November 2015) | Current Level – Language (November 2015) | English Lang Exam Result (January 2016) | Professional Prediction (Jan 2016) | Professional Prediction (March 2016) – English Lit | Current Level (May 2016) – English Lit | Final Professional Prediction (May 2016) |
|---|---|---|---|---|---|---|---|---|
| Student 1 | A | C | C | C | C | C | D | C- |
| Student 2 | A | C | C | B | C | B- | C- | B- |
| Student 3 | B | C | C | C | C | C | D+ | C |
| Student 4 | B | C | E | B | C | C | D | C |
| Student 5 | B+ | B | B | A | B+ | B+ | B- | B+ |

Y11 English Analysis – 12 out of 29 Y11 students completed the questionnaire

23.     Did you find the use of the question generator/grid useful in your lesson?
Yes – 1 No – 6 Sometimes – 5
24.     How did you find the question generator impacted on your ability to understand the lesson?

·         They enable you to isolate the areas where you need the most improvement

·         To understand the question, you have to understand the topic clearly so it helps in that sense

·         Helped me see where I had to improve

25.     Has the question generator been useful for preparing you for exams?
Very useful – 1 Useful – 7 Satisfactory – 3 Not useful – 0 Not very useful – 1
26.     Are you more confident in approaching exam questions?
Very confident – 1 Confident – 3 Satisfactory – 6 Unconfident – 2 Very Unconfident – 0
27.     What weaknesses have you identified in your exam responses as a result of the question generator?

·         Recalling content

·         How to analyse and justify my responses

·         That I don’t properly answer the question

·         Sticking to relevant info

·         I don’t answer the question fully

28.     What can your teacher do to improve the question generator task for future year groups?

·         Use it to recap knowledge at the beginning of the lesson

·         Help model exam questions

·         Add more possible questions

·         Make them more complex

·         Make it simpler

29.     Are you confident with exam layout and understanding the mark scheme due to the question generator?
Very confident – 0 Confident – 0 Satisfactory- 4 Unconfident- 7 Very Unconfident – 1
30.     If you answered very confident or confident for question 29, what have you learnt exactly in relation to exam layout and/or mark schemes?
N/A
31.     Do you feel the question generator would have been useful at the beginning of Y10 and throughout Y11?
    Yes – 3 No – 9
32.     If you answered yes to question 31, how?
·         It’s a quick and simple task to educate you through a topic.

·         Because it prepares you for what any kind of exam question might look like.

·         Because it is revision.

33.     Would you find the question generator helpful in other subjects?
 YES – 6 NO – 6
34.     If you answered yes to question 33, how?
·          Learning and revision content in history

·         Let us do harder questions , test our knowledge

·         I find questions difficult to answer in Sociology

·         they help you understand the lesson and questions better

·         To make sure you know key-words e.g. in geography

Did you have a parallel or historical group to compare against?

This lesson study is loosely a development of a previous one completed in 2015 by the History department; whilst this enquiry examines different activities and their impact on a specific group, it has focused broadly on the pedagogical technique of questioning. The outcomes cannot be measured against last year's classes, but the study has taken on board the areas for development identified then, which included encouraging more teaching to the top. Whilst the activity focused on B/A-grade targeted students, the wider impact can also be seen in how it has benefitted lower-ability students, as reinforced by the analysis breakdown above for each subject area of English, Maths and Business Studies.

What was your experience of the enquiry process?

How would you improve it next time?

  • For next time I feel that this needs to be undertaken in Year 10, as Year 11 students were already in the mindset of preparing for exams and were reluctant to try something new in case it did not help their learning.
  • Use questionnaire feedback to inform planning and teaching.
  • Use IRIS outcomes to inform future planning and teaching.
  • Make sure that it is differentiated and tailored to topics and exam structure more clearly so all students feel they can access it.
  • Carry out an interview with the five students the study focused on, to find out if it had a direct impact on their learning and understanding
  • Pick the same students across departments, where possible, so a direct comparison and analysis can be conducted.

Further work, sharing & dissemination

What does your work suggest could be investigated in the future? 

  • Increase the level of stretch and challenge in the case studies
  • Use this at A level, as it breaks down content and exam questions and pares the information down to a level everyone can access
  • Examine specific groups of students across a range of subjects and the impact Socratic questioning has on them
  • The question generator works most effectively when embedded from the beginning of the academic year
  • To make students take more responsibility and ownership of their learning, this resource could be used in other lessons; students could keep a folder for each subject and consult it when needed.
  • IRIS self-reflection on a half-termly basis
  • Modelling and templates of questions that could be used with pupils.
  • High expectations which promote the pursuit of scholarly excellence
  • Building a relationship with the class and sharing that it must be a two-way relationship for it to work
  • Engagement – creating a culture of effort, challenge, interaction and scaffolding.
  • A question generator for each individual pupil.

How did you disseminate your findings to colleagues?

  • Findings have been disseminated to colleagues through formal and informal discussions in CPL sessions
  • Formal dissemination to the whole school and beyond via this blog
  • S. Gunessee will be presenting the findings at the Educated 2016 conference

 

 

 

 
