I’m on a mission to have better 360 discussions. It’s top of my priorities right now. In asking colleagues for feedback myself, I’ve discovered just how easy (or not) it is for people to give feedback and have those 360 discussions, and how easy it’s been for the HR and L&D teams to manage the 360 process.
Why is this important? Well, I want to improve my own performance and leadership, but also I want others to see how easy it can be for us all to improve – by helping each other – so we can grow together.
Feedback is my focus this week. I’m taking a deep dive into 360 multi-rater feedback tools, because I want to give the best possible support to my leadership and coaching clients by reflecting back what observers have said in their feedback. But given that it’s experienced leaders and managers on the receiving end of 360 feedback tools, I have a burning question: why do we call the people who give the feedback ‘raters’?
I’m sure you’ve got more experience than me, and your comments are most welcome. But hear me out:
‘Raters’ can’t be the right word, can it?
Quick dictionary definition (thank you merriam-webster.com) of rater: “a person who estimates or determines a rating”. So far, so clear. But there’s also ‘rater reliability’. Yes, apparently it’s a thing: the degree of agreement, or consistency, between raters.
We’re not talking Olympic skating, or Strictly Come Dancing, here. Those judges are experts in their field. When they hold up a score, they also get to explain the mark they’ve given, based on years of experience in the subject.
My point is that we have little or no confidence in the knowledge and experience of ‘raters’ in the subject matter about which they’re being asked, unless they’ve been prepared, or had training, first. We certainly can’t expect consistency without it.
Participants are going to get feedback which may be challenging. They’re going to be asked to invest time and effort in a personal development plan. The least we can do is improve the experience for them…
my360plus doesn’t call them ‘raters’: they’re Observers.
Let’s take presentation skills. Sure, I know if a colleague bores me when they stand up and speak to a PowerPoint. Phrases like “I know you can’t see the detail here but…” are the kiss of death.
I remember a presentation where the boss stood up dressed in a hi-vis jacket to make a safety point. But if the question asks how good a communicator my boss is, does a rater base their response on that one memorable time?
In the my360plus behaviours, based on the Schroder high-performance model, presentation skills are about getting your ideas over. Showing you’ve done the research. Presenting different options. Influencing, not just railroading, the final decision.
So asking how often we see these behaviours makes perfect sense.
And that’s why “Why do we call them raters?” was my number one question when we invested in the my360plus feedback tool. I’m grateful that no-one mocked me for it. And I learned three things:
- The first was that my360plus don’t call them ‘raters’; they’re Observers
- The second was that the my360plus questions ask how often a behaviour is observed
- The third was that they’re Observers because the questions ask about what they observe in others, not how they rate others
Let’s reduce bias
‘Observer’ reduces bias and takes out preconceived notions of what’s ‘good’. It makes you think. By starting from an objective viewpoint, it reduces reliance on rater reliability. It’s a shift from how we rate people, and what we think about them, to what we observe about them.
My first step is to experience my360plus from a participant’s point of view.
I answered the questions, and colleagues are giving me feedback right now. Then I get to see the report and experience someone giving me feedback, before I plan a year’s leadership development goals supported by the my360plus social feedback system and some 1-1 coaching. As I say: scary and exciting.
Let’s prepare observers
From a customer point of view, this experience will lead to training programmes for observers and my360plus administrators, so that everyone feels confident giving reflective feedback based on observation, and the participant experience improves. We’ve got videos dotted around this site to support ongoing development plans, and other ‘how to’ content, but I know we can do better.
So if you’ve had similar experiences, whether from being rated, or giving coaching and feedback to leaders using these 360 multi-rater feedback tools, or you want to help create the best possible training programme for my360plus participants and observers, I’d love to hear and learn from you.
You’ve invested in your 360 degree feedback tool; set up the process and briefed the participants on how it all works. Your HR team stand ready to coach your leaders and managers on their reports. So what could possibly go wrong?
Let’s be blunt: with any 360, it’s rubbish in, rubbish out. So it’s important to get the best from your observers and coaches, not just from the participants themselves.