A Guide to this Blog


Tuesday, October 16, 2018

Pushing the Paragraph

Write! Write like the wind!
A while ago I was marking NAPLAN and my desk buddy told me about something she does with her senior classes to help them prepare for the HSC. I can't recall all the details, but the idea was that students are given a paragraph (or essay) prompt and only six minutes to write in response to it. Why six minutes? Someone, somewhere, crunched the numbers and worked out averages to determine that this is roughly how long it should take per paragraph to generate enough output to qualify an essay as a 'sustained' response in an HSC setting.

I love this idea because, well, other things I'd tried weren't really working. I would give classes practice essay questions and the results would be middling at best; the high-performing students would jump into the questions and it was great practice for them, but the students with much more potential for growth would find it too overwhelming. A different someone, somewhere, once told me, "You don't run a marathon as practice to run a marathon," and I liked that; it resonated with me. The HSC is a massive undertaking that requires a highly-condensed explosion of writing unrivalled in other stages of education - the vast majority of students will never go through anything even remotely like the HSC examinations again, so it's quite understandable that the less academically-primed students would be reluctant to write essays as practice before they're asked to write essays in the HSC.

It's much less confronting for most Year 12 students to write paragraphs instead. Yes, some still try to avoid it; however, when I'm collecting data every single time and letting students see their progress, it becomes difficult for the reluctant to keep opting out. But I'm getting ahead of myself.

The Goal
In getting students to write paragraphs I want them to be mindful of what they're writing rather than simply trying to write as much as possible. Part of keeping them focused is to make them aware of the lexical density of their writing. 

Lexical density refers to the number of lexical items found in a piece of writing. A lexical item is any word or word-group that carries meaning on its own, i.e. a noun, a verb, an adjective, or an adverb. These are words that we can easily define or swap with synonyms. If a word isn't a lexical item then it's a grammatical one - these are the joining or connecting terms, some examples being 'with', 'what', 'it', 'the', 'however', 'in other words', etc. What we want to do is have students write paragraphs and then test how dense they are with lexical items.

Testing the lexical density of a paragraph shows how much 'content' is in a piece of writing. The key here is not to think of a higher lexical density as equalling better writing; it's more a measurement tool that can help pick up patterns and identify trends in writing on repeated examination.

Some things to keep in mind:
  • A high lexical density (say, a piece of writing that is 80% made up of lexical items) might indicate a piece of writing that is too jargonistic, or convoluted, or loaded with words that the writer doesn't necessarily understand. 
  • A low lexical density (below 50%) might demonstrate too much colloquialism, informality in sentence construction, or a lack of appropriate vocabulary.
  • I'd probably say that the 'sweet spot' would be anywhere between 55% and 75%, but I'm not completely nailed down on this yet. Ask me again after I've collected data from a few more classes over the next 5 years!
The goal is to have students produce something within the 'sweet spot' bracket but, more importantly, to generate a word count of 180 words or so within the 7 minutes given. The lexical density testing is a byproduct of building speed; having students focus on vocabulary in this way ensures they keep some kind of standard in mind, so that quality isn't sacrificed just to increase speed.

The Process
Written language tends to be more lexically dense and less grammatically intricate than spoken language. Part of testing lexical density is to teach students to switch codes when writing and ensure that they aren't getting too conversational.

To calculate lexical density there needs to be a bit of maths in play (sorry!). The result is a percentage, calculated with the following formula:

Lexical Density = (Number of Lexical Items ÷ Total Number of Words) × 100

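For anyone who'd rather not do the counting by hand, the calculation can be sketched in a few lines of code. This is a rough illustration only: the GRAMMATICAL_WORDS set below is a small sample of function words chosen for the example, not a definitive list, so treat the output as approximate.

```python
# Rough lexical-density calculator.
# GRAMMATICAL_WORDS is a small illustrative sample of function
# words, not an exhaustive list - extend it for real use.
GRAMMATICAL_WORDS = {
    "the", "a", "an", "and", "but", "or", "so", "if", "that",
    "this", "these", "those", "it", "its", "he", "she", "they",
    "we", "you", "i", "of", "in", "on", "at", "to", "with", "by",
    "for", "from", "as", "is", "are", "was", "were", "be", "been",
    "have", "has", "had", "do", "does", "did", "not", "however", "what",
}

def lexical_density(text: str) -> float:
    """Return lexical density as a percentage of total words."""
    # Split on whitespace and strip surrounding punctuation.
    words = [w.strip(".,;:!?'\"()").lower() for w in text.split()]
    words = [w for w in words if w]  # drop bare punctuation tokens
    if not words:
        return 0.0
    lexical = [w for w in words if w not in GRAMMATICAL_WORDS]
    return len(lexical) / len(words) * 100

sample = "The poet uses vivid imagery to convey a deep sense of loss."
print(f"{lexical_density(sample):.0f}%")  # 8 lexical items out of 12 words
```

Students could paste their paragraph in as the sample text and read off the percentage, rather than tallying words by hand.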
Get students to give you their lexical density percentage and word count after they've written a paragraph in response to an essay question. You want to do this a couple of times a term so you can start to build a reliable data set.

The Data
This is the cool bit, and it comes especially in handy during parent-teacher interviews when parents are interested in seeing how their child's writing skills are progressing. By using the chart function in Word you can start to assemble a visual representation of student growth in word count under timed conditions. It also lets both the teacher and the student identify patterns in lexical density, which helps students meet Outcome 9 (the reflection outcome).

Record the information with pen and paper each time the students undertake paragraph writing in this fashion. It's best if you have a separate sheet for each student - that way you can record the information in front of the students without them seeing what their peers are doing (not that it's really a secret, students are usually okay sharing numbers related to this stuff because it's not formally assessed). The reason this is useful is that if particular students don't partake in the activity, or they make a series of excuses, they can see how this looks on paper over time. The realisation that you are collecting this information on a regular basis will prompt a lot of these students into action.
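If the pen-and-paper records ever graduate to a spreadsheet or script, the per-student log could be sketched like this; the names, dates, and numbers below are entirely hypothetical, made up for the example:

```python
# Hypothetical log of timed paragraph attempts, one list per student.
# Each entry: (session date, word count, lexical density %).
records = {
    "Student A": [("2018-02-12", 150, 82.0), ("2018-03-19", 172, 74.0), ("2018-05-07", 195, 68.0)],
    "Student B": [("2018-02-12", 0, 0.0), ("2018-03-19", 90, 61.0), ("2018-05-07", 130, 63.0)],
}

def progress_summary(attempts):
    """Summarise growth in word count across recorded sessions."""
    counts = [wc for _, wc, _ in attempts]
    return {"counts": counts, "net_growth": counts[-1] - counts[0]}

for student, attempts in records.items():
    summary = progress_summary(attempts)
    print(f"{student}: word counts {summary['counts']}, net growth {summary['net_growth']:+d}")
```

A log like this makes the non-attempts (Student B's opening zero) just as visible as the growth, which is the point of keeping the records in the first place.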

Student A shows growth in word count over time. They were already performing at a high level in terms of generating long paragraphs in short amounts of time; however, seeing this information presented in this way helped them push themselves further. The lexical density was way too high at the outset - a lot of overstretching of vocabulary. We took a look at some of the vocabulary used and worked on making the language more concise, which had the added bonus of increasing the amount of analysis they could get across in the time given.
Student B struggled with the concept of these paragraphs at first as they were nervous about not performing well. Eventually they joined in after realising that I wasn't going to stop running the activity, and they actually did quite well.

Student C showed phenomenal growth in word count whilst maintaining lexical density.

Once Student D joined in they started to show growth as well.

Student E really pushed themselves to extend the word counts they were achieving in 7 minutes.
At the end of each semester I give the students a copy of the data (such as the graphs above) to make sure that they're partners in the process. These graphs are also what I give to the parents. It doesn't all have to be done the way I've described it, but the main point is that students are practising writing in timed conditions without having to commit to an entire essay every time.

Sunday, April 15, 2018

Differentiation in Stage 6 Assessment

UNSW, where Ignite the Spark 2018 took place
About a week ago I attended the Ignite the Spark 2018 Conference. This Professional Learning was mainly focused on assessment in relation to giftedness, but what was most interesting across the conference was that, underneath a rhetoric built around excitement for assessment and the need for teacher quality, the real subtext of most of the sessions seemed to be that the HSC exams have become obsolete, and that the requirements handed down from NESA and the DoE aren't represented by this increasingly irrelevant process.

But, like it or not, the HSC exams remain largely unchanged and this means that Stage 6 has become an increasingly challenging transition phase between all the requirements of 21st century learning covered by Years K-10 and the need to perform well in a 19th century-styled examination at the end of Year 12. 

Enter Lisa O'Neill, the Head Teacher of HSIE at St Mary's Senior High, who presented a brilliant seminar at Ignite the Spark that centred on how we can better meet the challenges set forth by the demands of the HSC in 2018 and beyond by differentiating our assessment of Years 11 and 12. In her presentation Lisa identified our tendency as teachers to focus on preparing students for exams, and the way that our assessment tasks lead towards the HSC exams, and questioned the capacity of these assessments to actually help students learn. We can't eradicate the exam but there is a concerted need to shift our thinking so that students become more skill-focused. NESA seems to be acknowledging this through the mandating of only one formalised examination per subject, per year. There is an undeniable tension, though, between the push for assessment in Stage 6 to become less focused on preparing for the HSC exams and the HSC exam-focused culture that already exists in NSW schools.

Here are some things we need to consider:
  • NESA syllabuses include a requirement that each KLA assesses their students using assessment as, of, and for learning. I've tended to interpret this as meaning that not all assessment tasks should be based on establishing data that assesses student ability (assessment of learning).
  • About 60% of domestic university student enrolments from 2014 to 2017 were non-ATAR, which suggests that universities are becoming less concerned with the HSC examination when evaluating who they admit into their courses.
Getting back to Lisa O'Neill's Differentiation in Stage 6 Assessment presentation, she takes her lead from John Hattie in approaching differentiation as something that "relates more to addressing students' different phases of learning from novice to capable to proficient rather than merely providing different activities to different (groups of) students" (from Hattie's Visible Learning for Teachers). O'Neill also piqued my interest by using this to interpret the as, of and for triad of assessment in a more holistic sense - that each individual assessment task should encapsulate all three of these forms of assessment.

I think I'm down with that. 

Linking her work to the Professional Standards, O'Neill highlighted Standard 1, i.e. 'Know Your Students', and clarified her interpretation of differentiation as having one broad task that allows access points for all students. In a nutshell, assessment:
  • Should provide students with multiple ways to achieve.
  • Can provide access points for students moving through a spectrum of novice to capable to proficient.
O'Neill presents on the Assessment Cycle of Love
Here are some other recommendations made by O'Neill:
  1. Use what she refers to as the 'Assessment Cycle of Love', a diagrammatic tool for designing assessment, to identify gaps in your tasks. Are the outcomes, marking criteria, feedback, and process of assessment all as per NESA advice? Are assessment as, of and for learning all functional within the task?
  2. Staff evaluation should be an integral part of the process, with consideration of whether a task has allowed students to set learning goals, and what skills students were taught while undertaking the task.  
  3. Know the data - use SMART, SCOUT, our own assessment tasks, and the Educator Calculator (a data tool for assessment that can be found at the DoE Centre for Education on Statistics and Evaluation).
  4. Consider the need for access points in tasks so students don't become disheartened when an assessment task has shown them that they can't do anything in preparation for the HSC exams. They might be able to access a band in the marking criteria, but can they access a skill set?
  5. Include feedback in the marking criteria. Feedback is pivotal for Stage 6 assessment and all criteria should be a form of assessment for learning (i.e. what can teachers and students alike gain from reading the criteria in preparing for future learning?)
  6. Can we shift the student mindset away from marks and towards skills? Have students make an appointment for feedback conferencing and use this time for students to annotate their own response. Identify their areas of weakness ahead of time and get them to look for these weaknesses in the task so that they can establish learning goals.
Some Useful Links
"Crunching the Numbers" - A report from the Mitchell Institute on the use and usefulness of the ATAR.
Student Conferencing and Feedback - providing students with 1:1 feedback to facilitate personal growth.
Assessment For Learning - an alternative approach in using separate diagnostic assessment with students.