One of the trickier parts of designing a 360 review process is coming up with good questions to prompt peers with. You want to ask questions at the right level of detail and give reviewers enough context, without introducing bias. You also want to strike a balance between questions that are easy to answer and questions that are thorough enough to collect meaningful feedback.
We recommend that you tailor your reviews to the roles and requirements within your organization. As I have mentioned in previous posts, your reviews will be far more meaningful if they analyze the specific skills required for success within your company. This is particularly true when evaluating technical roles, such as software developers, which require deeper evaluation of technical skills.
That said, if you are implementing your first review, it helps to have some initial content to get you started. Below is a generic performance review, designed to evaluate skills that are important to most core roles within an organization, such as effective communication, teamwork, and the ability to execute on goals.
The review is broken up into sections, each designed to evaluate a key skill set. Each section is made up of 1-4 multiple-choice questions with varying response types. I have chosen a variety of response formats to provide a few different examples for reference. These include frequency (never, sometimes, always), expectations (exceeds, meets, below) and a typical Likert scale (agree, disagree, etc.). Each question set is followed by an optional field, prompting reviewers for additional context and feedback.
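If you plan to collect responses with your own tooling rather than an off-the-shelf survey platform, that structure maps naturally onto a small data model. The sketch below is purely illustrative and makes its own assumptions: the section names, prompts, and scale labels are placeholders, not the actual Worklytics review content.

```python
# Illustrative sketch of the review structure described above.
# All section names, prompts, and scale labels are examples only.

FREQUENCY = ["Never", "Sometimes", "Always"]
EXPECTATIONS = ["Below expectations", "Meets expectations", "Exceeds expectations"]
LIKERT = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

review_template = [
    {
        "section": "Communication",
        "questions": [
            {"prompt": "Communicates project status clearly and on time.", "scale": FREQUENCY},
            {"prompt": "Keeps teammates informed of blockers.", "scale": FREQUENCY},
        ],
        # Optional free-text field that closes each question set.
        "comments": "Any additional context or feedback? (optional)",
    },
    {
        "section": "Execution",
        "questions": [
            {"prompt": "Delivers on agreed goals.", "scale": EXPECTATIONS},
            {"prompt": "I can rely on this person to follow through on commitments.", "scale": LIKERT},
        ],
        "comments": "Any additional context or feedback? (optional)",
    },
]
```

Keeping each section to a handful of questions plus one optional comment field keeps the review quick to complete while still surfacing qualitative feedback.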
I plan on publishing a series of posts to share examples of other reviews, for roles such as software developers, UI/UX designers and product managers. If you have any additional requests, feel free to reach out to us on Twitter at @worklytics.