A new TD programme and its rationale

Ever since I studied a module for my MA on teacher professional development, I have retained a solid interest in how teachers can improve their practice. Those readers with long memories may recall the first programme that the module engendered: a somewhat optimistic project to help my then colleagues expand their own language knowledge.

That one died a death; as I came to realise in the comments underneath the post, the project lacked coherent aims and collective will. I probably could have persevered with it, but I did get the sense that I was teaching them their own language, and that attendance was using up my goodwill credit at a rate of knots.

I did, however, proceed to observe my colleagues more as my responsibilities grew in that job. I can but hope that I was of use to those whom I watched teach and gave feedback to – I was still learning how to give that feedback, and it is a very difficult skill to master, given all the sensitivities potentially involved. I like to believe that in one or two cases I passed on some constructive tips, and in others I delivered one or two uncomfortable truths. But the major difference I noticed, post-observation, was to my own confidence and skill set when I taught my next lesson. I was almost involuntarily using a number of techniques and skills that I had seen the teachers I observed using. On top of that, I felt an added sense of confidence as a teacher. I now had a better idea of what was a good stage of a lesson and what was not, and of what really helped students and what didn't; I had a sort of insight that my own reflection-on-action had not hitherto provided.

Fast forward three years, and I am at a different school. The pedagogical challenges we face are in some ways similar, but in others completely different. As always, the main question is – how can we as teachers deliver the most effective learning? The answers are theoretical and practical.

The problem is that, with experience, and particularly after a long term, reference to the theoretical can become tiresome. I know what I am supposed to do. I know how I am supposed to react in that scenario. I know what I should be doing in terms of progress and assessment. But it can all be superseded by the daily grind of classroom management, administration, fatigue and the never-ending quest to engage learners effectively (so that the teacher's role is reduced). I personally have one such issue with one class, and a different issue, with its own minutiae, in another class. I don't have the time or energy to peruse four manuals, often of varying quality, to find the answer.

So in a sense, my quest for improvement this time is purely practical. I want to know what other teachers do and how they do it. I want to learn from them. I also, selfishly perhaps, want that burst of confidence from knowing a technique that I use has concrete advantages over my colleague’s methodology in that one micro-situation. When put like this, it does seem very selfish. But if they get the same schadenfreude boost from watching me suffer in any given circumstance, and then they feel good during feedback because they can pass on one of those practical tips, and then I learn from that, who loses out? No one.

I guess what I want is that bad feedback. I read once, and it may be apocryphal, that the boss of a Silicon Valley multinational had hired someone whose job it was, purely, to deliver bad news. Without bad news, the thought process goes, you cannot improve. But it is so hard to get bad news about your teaching. I know that I am always on the lookout for perceived slights against my practice – and ready with a long list of defences in case of attack. A sure sign of paranoia, and a sure sign that deep down, I know something needs some work. Most teachers have this from time to time – a bunker mentality and defensiveness that go with knowing that not all is going right in the classroom, but the kids are to blame, or the curriculum, or whatever – it's not the teacher's fault. I reject this. I want things to be my fault, so that I can work on them.

In that spirit I am working on a project of general teacher development for the next academic year. It will consist in the first instance of a round of peer observations, during which the focus will be on three points: firstly, what the observee or department think they need to work on; secondly, the requirements of the school and the senior management; and thirdly, the requirements of external inspectors, e.g. Ofsted or ISI. Then, the observee will be given a minimum of 20 minutes of one-to-one feedback. Finally, any outstanding issues from the observations will be collated and will inform the composition of remedial TD sessions, to take place in the winter or spring term. It is hoped that there will be follow-up observations the following term.

I intend to persevere with this project – and I think it has clearer aims and will be of more use than my first attempt. Stay tuned for updates as it progresses in September.

The most effective way to progress in your second language…?

Reflecting on my own situation, and more importantly that of my students, I've had cause lately to think a lot about the way second languages are learned and automatised. Are my students learning to use English better, and if so, are there any correlations between their location and activity on the one hand, and their rate of progress on the other?

It sounds like a terribly simple question at first. Of course there must be correlations between where they are and their rate of progress. Chief among these would be the well-known and research-supported idea that study-abroad immersion in the target language context is generally conducive to progress in that language. Within that, though, other questions present themselves.

The students in my two classes of low-level English learners came to this country in September 2016. They were generally A1-A2 level, with one or two complete beginners. The context in which they live is artificial to some extent – although nominally international, the school is overwhelmingly composed of kids from their mother country, so their L1 is used not just in the corridor and the playground, but also, by policy, in something between 70 and 80% of their lessons. Invariably they live with parents or family who also speak the L1 at home, not English.

This means that my kids are not benefiting, perhaps, from the purest and most consistent exposure to English. (They are also supremely proficient with technology, and often access videos and games in their mother tongue as diversion.) This must have a quantifiable penalty effect on their rate of progress, one would imagine. Yet they are living in England, in an area where very few people outside the school speak their L1. They will at least hear English a lot more than their compatriots who have stayed at home. It's an interesting and, dare I say, perhaps unusual environment for language acquisition.

So what has happened, as far as I can tell? Well, I would say that the exposure to English has had the most dramatic effect on their listening in general. The vast majority have no problem in understanding very slightly graded language now, as we come to the end of the year. For some, their speaking has improved drastically; for the majority, it has improved considerably. Other skills vary, and it becomes difficult firstly to generalise, and secondly to establish causal links between environment and progress. It's impossible to say whether there is any correlation between progress in reading and writing, say, and being in England but in this bubble.

What about those speakers? Well, the ones that have improved the most have either had tutors supplementing their 5 hours a week of English, or have been in host families. This would lead one to the tentative conclusion that exposure to language in a pure or natural form is superior to classroom-based instruction, even if that instruction is task-based and learner-centred. You might then also consider ideas like the Critical Period Hypothesis in child language acquisition, and wonder whether there is any point whatsoever in teaching children form and function, when the evidence seems to suggest that immersive environments trigger intrinsic learning in their programmed-to-learn-language, high-plasticity child brains.

But then you'd be jumping to a conclusion that conflated correlation with causation. You'd be saying that this kid has a tutor and has got better because of that, when the improvement might also be connected to work rate, parental enthusiasm and support, extra-curricular activities, aptitude, or literally any other factor or combination thereof. You'd be subject to confirmation bias too: any improvement the kid demonstrates must of course be down to the tutor, when in fact it only takes a little think to realise that although this may be true, it is not ineluctably so.

In conclusion, I do think my teaching makes a difference. In particular, I think it has made more of a difference since I moved away from a presentation-practice-production approach and towards a task-based, error-correction model. I think this has been one consequence of learning to teach the child brain rather than the adult brain.

I also think that, despite the acknowledged existence of other factors, an immersive environment or context (resulting in consistent exposure to the target language) has a demonstrable positive correlation with progress. The question then becomes one of motivation – how do we as educators make that environment a desirable one for each and every language learner, child and adult?

How English sounds…or practise your listening!

An article in the ‘Indy 100’ online Independent caught my eye the other week. Its title was ‘How English sounds to people who don’t speak English’ and here is the link:


The video to which it refers is a YouTube film project of Brian Fairbairn and Karl Eccleston called Skwerl. The video is embedded in the article and you can see the script here:

The video is an extraordinary one: the viewer feels as though she has a complete understanding of what is being said, but simultaneously none. The words used in the script are plausibly English in their sounds and indeed spelling, but nonsensical. Combined with the real nuts and bolts of English – common articles, prepositions and genuine grammatical structure – they have this curious effect. (I imagine some viewers of the recent BBC programme SS-GB, widely criticised for its mumbling protagonists, will be familiar with the odd sensation.)

The Indy 100 article, and indeed the title of the YouTube clip, seem to sell this a bit short to me. This isn't really about 'how English sounds to people who don't speak English', nor indeed about lazy stereotypes of the sounds made by people from a certain part of the world. It is rather about how the viewer's brain, through context and a focus on non-verbal communication, can fill in the gaps to make semi-logical that which is not.

But of course, being a non-native speaker in any given language is the best way to practise and see this. I speak passable French (B2-C1 or so), but my listening has always lagged well behind speaking or indeed reading. I would say my listening is now at B1 level after a good degree of recent exposure. I always found it easier to listen in Spanish, despite my overall level being much lower.

Is this because some people’s brains attune to the rhythm of some languages more than others? Is it that context is easier to grasp in Spanish for me? Or is it perhaps that non-verbal communication in Spanish is more obvious than French?

Ooh, I hear from some quarters – you're straying dangerously close to stereotype there. We all know the Spanish or Latin stereotype of gesticulation as non-verbal communication, and I imagine if you asked the average British person what the French did to communicate non-verbally, the 'Gallic shrug' might make an appearance. If you then went on to ask how certain languages sound, uglier things yet may spout forth – the 'honhe hon' perhaps, or even the camp Nazi.

But anyway. The transposition of what non-natives hear is also of interest. One of my current students, who is French, loves to wear a t-shirt with the following emblazoned on it:

‘Douillou spique Ingliche?’

Which, to a French speaker, reads as ‘do you speak English?’ with a heavy French accent, rendered in French-like spelling. I have pointed out to him that to an English speaker, the above would make no sense at all, and if asked to read it, they might say something like ‘dooeeloo speakyou Englychee?’

Another example comes from Facebook, where a Colombian (so Spanish-speaking) ex-student of mine wished her English friend a happy birthday in the following way:

‘Japi verdei!’

Which, again, reads as 'happy birthday' would sound to a Spanish speaker with a Spanish accent, rendered phonetically in Spanish-like spelling. As with the last example, an English speaker might say something like 'jappy verday' if asked to read the above. The paradoxical element comes when one realises that my italicised rendering also depends on the presumption of a specific relationship between sounds and spelling, one which emerges exclusively from my being a native English speaker.
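The mechanism behind these t-shirt and birthday-message respellings can even be sketched programmatically. Below is a toy illustration – the mapping is hand-built for this one phrase and entirely hypothetical, not a real transliteration scheme – of how English sounds end up re-spelled under Spanish orthographic assumptions:

```python
# Toy respelling: render the sounds of an English phrase using
# Spanish-style spelling conventions. The correspondences below are
# hand-picked for this single phrase and purely illustrative.
SPANISH_STYLE = {
    "ha": "ja",     # Spanish <j> approximates English /h/
    "ppy": "pi",    # Spanish avoids doubled consonants; /i/ is <i>
    "bir": "ver",   # <b> and <v> represent the same sound in Spanish
    "thday": "dei"  # no /th/ in most Spanish varieties; final /eɪ/ as <ei>
}

def respell(phrase: str, mapping: dict[str, str]) -> str:
    """Apply each spelling correspondence in order to the phrase."""
    out = phrase
    for english, target in mapping.items():
        out = out.replace(english, target)
    return out

print(respell("happy birthday", SPANISH_STYLE))  # japi verdei
```

Reading 'japi verdei' back with English spelling-to-sound habits is exactly the reverse trap described above: the letters only recover the original phrase if you already know which orthography they assume.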

That’s why, you may say, we need the phonetic chart. We do, yes. But we also need practice in other languages to build our skills of reading body language and non-verbal communication, and finally, and probably most importantly, we need to look beyond our own culturally-informed stereotypes of which cultures make which noises, and concentrate on understanding their essential humanity.

Learner styles

The Guardian reported last week that a number of éminences grises – scientists, mostly – have signed a letter denouncing the concept of learner styles. For the uninitiated, learner styles can be described as follows: each learner learns best through her preferred receptive method, which can be auditory, linguistic or kinaesthetic, amongst others. The scientists argue that there is no basis in research for such an assumption, and, further, that promoting education along these lines may actually interfere with learning.

I have no qualification or research on which to base the following – it is merely an opinion, informed exclusively by my own experience.

Learners, in my view, do learn differently. They process and absorb information in different ways and at different speeds; they show differing rates of retention, and differing degrees of accuracy in reproduction. They also manifest in my experience differing abilities to analyse and evaluate, to comment and to link. I have had learners who seemed to benefit best from games, and others who benefited from texts, and others who benefited from structures and rules, and others who just wanted to listen to me tell them the answers.

They are, in short, varied; each is unique, informed by their nature and their nurture.

Does this mean that the idea of learner styles is borne out by my experience? Not exactly, I would say. I continuously (though not always successfully) attempt to make my lessons varied in terms of activity and pace. This is mainly to prevent my getting bored. It also, I hear, helps prevent the students getting bored, which is always good (I reiterate that it is not always successful!). I also try to make my lessons student-centred – one of my fundamental assumptions about learning is that, as a cognitive process, it must involve the learner finding, understanding or using something that she couldn't prior to the lesson. That's not to say this can't happen if you stand at the front and lecture, in the old style, but I have tried both ways extensively, as a teacher and as a learner, and I consistently find better results with student-centredness. I also bore myself if I talk too much.

That said, I do like to hold court on occasion. My colleague James Castleden and I have often discussed the importance of extended feedback (the post-task stage when the answers to the questions are elicited, boarded and checked by the teacher). Although this is more difficult with younger learners, given their shorter concentration spans, I think lengthy feedback is of huge importance – this is when, in James' words, 'the teaching actually happens'. It's the time when the teacher can hurry or pace the lesson; can check on a given student and their needs; can find out much about their effort and relative success; can demand high, or break it down; can pursue tangents of interest to the students. So much of this chimes with my core understanding of my role in students' learning – but is it teacher-centred, or student-centred? And which is better for different learners?

The trouble with objections to learner styles as a concept is that they may throw the baby out with the bathwater. Just the other week I read an article in the Times quoting a Tory education minister (I forgot to note his name) who pre-empted the scientists' letter but cited their beliefs as evidence that education as a whole should return to a bygone era of the 'sage on the stage'. 'God forbid,' his sarcastic quote went, 'that children should learn something from a knowledgeable adult.'

Now there may be no evidence for differing learner styles, but to my mind that doesn’t mean disregarding all the apparent progress pedagogues have made in terms of accepting the primacy of the learner in the process of learning. Variation in pace, activity, task type; use of colour and technology to engage and provoke; evidence of an outcome rather than ‘do you understand?’; use of task-based learning; collaboration and involvement; differentiation; awareness of inherent differences: all of these are vital and intrinsic to teaching in my view.

That said, I never saw the point of Cuisenaire rods.

You immigrant

To my 1ère (Year 12) classroom, this month, where the French curriculum theme of 'espaces et échanges' has given us a semester's worth of work on immigration and integration. We are a week in, and the students' obvious interest in the subject is encouraging and motivating for the teacher.

I thought I’d start by giving students some words describing people who move from country to country. Here they are:


Illegal immigrant
Immigrant
Migrant worker
Refugee
Asylum seeker
Colonist

My questions to the class were firstly whether the definition of each was positive, negative or neutral, and thereafter whether the usage or interpretation of each was positive, negative or neutral. (The general idea was to get the students thinking critically about why one term might be used instead of another, and what inference or implication the choice of that term might give about the writer or the text.)

Interestingly, the class had different ideas from what I had expected. While we were in agreement about the overtly marked ones like ‘illegal immigrant’, ‘refugee’ or even ‘colonist’, whose definitions or cultural context suggest positivity or negativity, we were not on the same page regarding ‘immigrant’. The students as a whole insisted they were ‘immigrants’, though for some reason I would never label them as such.

I guess the fact is that they are. They have come to this country from another, at the behest of their parents, often both to give to and to benefit from what it currently offers in terms of opportunity. To me, the word sounds harsh. It seems to connote some sort of discontent – that the original place where you were was somehow lacking and you had to move. It also raises the question of why someone would be defined by that action. But quite apart from that very subjective, reactive and instinctive idea, I think 'immigrant' is negative to me because it has been used so often negatively in what I have read and what I have heard. The students, coming from a different cultural background, had no such 'colour' on the word itself.

And this is the point – in an era of fake news and alternative facts, we would do well to remember that well before all that, the writer or speaker's choice of words, and their connotations in context, had far-reaching implications that could affect our understanding. In this case it is a simple word describing a simple act, and it could all just be in my head. But I think in general that raising one's awareness of the associations that can be taken (by the reader or listener) may enable one to critically analyse the motivations of the writer or speaker. I don't know how we will find the time to do it these days, but hey ho…

G up – or why should He get a capital letter?

To nowhere, for this month’s post – nowhere firstly in the sense that I have zero evidence for the point I am about to make, and secondly insofar as there is no evidence for the existence of the subject of the post!

I talk of 'God' – the man upstairs (is that moniker a little creepy, guys?). First, a disclaimer: I am an atheist, though I don't like the term – I forget who made the point originally, but it seems rather vexing to define yourself by something you don't believe in. Although I make best efforts to keep my own political and religious views out of a notionally impartial and semi-scientific blog, a careful reader may detect a hint of them nonetheless.

But despite my professed lack of interest in any public acknowledgement of a supreme deity, I have noticed this: ‘God’ (he of the Judaeo-Christian variety, one assumes) seems to be becoming ‘god’ in certain forums. I’m thinking of the Guardian, here, as it is the one I read most regularly, but as I say I don’t have evidence and it could well have been the Independent or the Times (though I think the last unlikely, for reasons to be explained below).

Research indicates the received wisdom: capitalised God refers to that Judaeo-Christian lead character who has at one time or another appeared in a number of books and motion pictures. Lower case god refers to the supporting cast, as it were: characters who must share their privileged status with other supermortals, one of many.

So is it too much of a jump to state that received wisdom, Western-centric no doubt, has been using orthography to suggest that there is one true God, he of the capital letter? Far stranger things have happened. I can certainly imagine the particularly imposing effect of the capital letter in early calligraphy and printing, and any lower-casing of the 'g' being seen as a grievous error.

Why, then, might this now be in abeyance? Why might certain writers be using lower case ‘g’ even to refer to the bearded chap who was rather cross in the Old Testament but had mellowed out by the New? The first explanation that springs to mind is that it reverses the above trend – that by defenestrating that guy, it puts him firmly in his place as one of many. No better, no worse, and to boot worshippers of alternative gods get to feel their man is finally on something of an equal footing. In orthographical terms, at least.

I'm reminded here of a prominent atheist (Richard Dawkins, Ricky Gervais or Derren Brown, perhaps) who once made the point that Christians are almost as atheistic as atheists themselves: Christians don't believe in 398 gods, whereas atheists don't believe in 399. It's a neat thought, skewering the absurdity of believing that the god of your home culture just so happens to be the only true one. While that makes sense to me, the choice not to capitalise God may not be so much an emasculative act towards the big fella (motivated by militant atheism) as an earnest attempt to promote equality amongst deities and their followers in general.

But critical thinking compels me to add another potential explanation for the G becoming merely a g – that perhaps with the growth of the internet and instant messaging, capital letters have simply become too time-consuming or fiddly to tap out, especially in an informal register. See, for example, the oft-quoted 'omg' – to capitalise the last letter for the sake of respect to God would seem ridiculous, even amongst believers perhaps.

But out of fear of divine retribution, or more probably fear of the retribution of Disgusted of Tunbridge Wells, the BBC and broadsheets will presumably continue to capitalise the ‘g’ when they are referring to the Judaeo-Christian fellow. It does help Him stand out amongst the invisible crowd.

(Hey look, I just capitalised the 'h' in Him, so you knew who I was referring to! Now there's another PhD – a longitudinal corpus investigation into the use of capital letters in proper nouns and pronouns referring to Judaeo-Christian deities…)
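A first prototype of that corpus investigation would be simple enough. Here is a minimal sketch – the function name and the sample sentence are my own inventions, and a real study would of course need an actual newspaper corpus – counting capitalised versus lower-case uses of the word:

```python
import re

def count_deity_casing(text: str) -> dict[str, int]:
    """Count standalone occurrences of 'God'/'Gods' vs 'god'/'gods'.

    Word boundaries keep us from matching e.g. 'godless' or 'Godfrey'... almost:
    a real study would need a more careful token filter.
    """
    tokens = re.findall(r"\b[Gg]ods?\b", text)
    counts = {"capitalised": 0, "lower-case": 0}
    for token in tokens:
        key = "capitalised" if token[0] == "G" else "lower-case"
        counts[key] += 1
    return counts

sample = "God created the gods? My god, the God of the Old Testament..."
print(count_deity_casing(sample))  # {'capitalised': 2, 'lower-case': 2}
```

Run over dated article archives, a count like this could show whether the lower-case 'g' really is gaining ground in certain publications, which is all the original hunch above amounts to.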

The reflective teacher in action (or at least in words…)

As promised, another post in December (just)! This time a little navel-gazing, justified perhaps by the catch-all concept of the ‘reflective teacher’…

So my career has changed gear since early summer 2016, when I left my post as a senior teacher at a large central London language school. I moved for the start of the new academic year to a large international school in north London. The change has been considerable: there has been something of a personal overhaul necessary in terms of managing a different workload, patterns of work, not to speak of fresh challenges like classroom management and new colleague relationships. The school is itself relatively new, and as with any organisation of that scale and with that ambition, relies on a few tremendously gifted and visionary individuals to overcome the trials and tribulations that are inherent in the journey. It is an incredible and fulfilling endeavour, setting up a school in its first few years, and I am proud to be part of it.

With the scale of the potential reflection becoming clear, I need to confine myself to contemplation of my lessons and my learners. In terms of the lessons, I have needed not only to adjust to using different curriculums, but also to being an 'English teacher' (here in its conventional sense, as opposed to its English-as-a-foreign-language sense). While the distinction seems clear in this country, it doesn't seem to travel – the majority of the pupils and staff are from a country where there really isn't much of a difference. If you teach English, you're an English teacher. The fact that most English teachers in English state schools wouldn't have a clue about teaching pronunciation, for example, or that many EFL teachers would struggle to teach a lesson on pathetic fallacy, seems to have passed them by.

The concrete consequences are that for half my lessons (the other half are conventional EFL classes), I have had to modify my teaching. I have had to rely more on the part of my experience that deals with interpreting texts and looking at tone and implication and other skills required by (near or) native speakers to pass exams. I have had to readjust my conceptions of assessment. I have had to read texts and use textbooks I have never looked at before, and use them in ways that I would not previously have thought of. As you may imagine, this was quite a task: some lessons went distinctly awry, and there’s nothing quite like the feeling you get as a teacher when you realise you are effectively teaching people to speak their own bleeding language. But with the help of some excellent and diligent colleagues, who have answered my questions with a patience that I fear I won’t show to new teachers next year, the lessons have felt as though they have improved. My two classes of (near or) native speakers have got to grips with some texts, progressed through them, seem to have had some fun, and produced some amazing written and spoken work.

As for the learners – well…the first thing to say is that they are young. The next thing to say is that the vast majority are absolutely lovely (and even the lazy or obstreperous ones are charming out of class). The final point of note is that 90% or so of them have the same mother tongue, which is not English. Some are very good at English, as noted above, but some appear to have suffered at the hands of the education system in their country of origin.

Their youth has presented me with probably my biggest professional challenge. I had taught kids before – for a year in Spain, and Argentinian kids for about a month once a year thereafter in London – but Spain was a long time ago, and the Argentinians were invariably perfect, model, Victorian-style children. As such, my lack of familiarity with the cognitive speed and ability of children at various stages (Year 8, say, versus Years 9, 11 and 12) was a significant point to tackle. My colleague Oliver Hipkins found the same – it is a tightrope act, it seems, to strike a balance between what the kids can do linguistically and what they can do cognitively. In many cases we have inadvertently pitched the former too low and the latter too high, given our background in adult EFL. Increasing the linguistic or literary challenge without making the task cognitively impossible for brains at that developmental stage has been quite something.

Also, classroom management has taken centre stage, in one of my classes in particular. Having strategies to deal with any issue has assumed more importance than it ever did before in my career, and ensuring that all learners are learning, and that differentiation is maintained during those lessons, remains a job to work on. I haven't fallen out with many learners in the first year so far, and I would go as far as to say that I think I am liked (even by the ones whose company I wouldn't necessarily seek out!), but that is nowhere near good enough. What will be good enough is when the interest and engagement in the lesson content is there; when the pace is kept up and varied according to learner skill set and ability; when the lesson is student-centred and memorable; and when progress is made by all the learners in the class.

I have enjoyed very much my first term in this environment, and I relish the steep learning curve it puts me on as a professional. There are moments when it’s tough, for sure, but as someone I admire said to me recently, ‘if it isn’t tough it’s either boring or not worth doing’. As we continue into the new year and new term I need to keep the same ideals in mind as I had when I first started – love for the subject, patience, creativity and a positive attitude. On that note, happy new year!