Level of Agreement Calculations (360)


360 listening events use level of agreement calculations. The level of agreement is the extent to which respondents agree or disagree in their ratings on each item and competency. A high level of agreement indicates that respondents provided similar ratings, while a low level of agreement suggests respondents provided inconsistent ratings. The level of agreement can help a user reviewing a report get a better sense of how much they may want to consider taking action on an item or competency area.


There are two methods for calculating level of agreement: item-level and competency-level.

In the competency-level method, converting item standard deviations to bounded agreement values controls for outliers, preventing a single item from disproportionately affecting the overall competency result.



This article walks through:

  • Item-Level Method

  • Competency-Level Method


Item-Level Method

This section addresses the item-level calculation method, the steps for using it, examples, and inclusion requirements.



Calculation Method


  • Calculate the standard deviation (SD) of responses to measure spread.

    • High agreement: SD < 1

    • Medium agreement: 1 ≤ SD ≤ 2

    • Low agreement: SD > 2



Steps


  1. Collect ratings for an item from respondents.

  2. Compute the SD for these ratings.

  3. Assign an agreement level based on the SD.
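
For illustration, here is a minimal Python sketch of the item-level method. It assumes the sample standard deviation (the article does not specify sample versus population SD), and the names MIN_RESPONSES and item_agreement are hypothetical, not part of the product.

    import statistics

    # Adjustable threshold; self-ratings are excluded before counting responses.
    MIN_RESPONSES = 5

    def item_agreement(ratings):
        """Return 'High', 'Medium', or 'Low', or None if too few ratings."""
        if len(ratings) < MIN_RESPONSES:
            return None  # no agreement score is provided
        sd = statistics.stdev(ratings)  # sample SD (assumption)
        if sd < 1:
            return "High"    # high agreement: SD < 1
        if sd <= 2:
            return "Medium"  # medium agreement: 1 <= SD <= 2
        return "Low"         # low agreement: SD > 2

For example, item_agreement([4, 4, 5, 4, 5]) returns "High" (five ratings, SD ≈ 0.55), while item_agreement([1, 5, 3, 2]) returns None because only four ratings were collected.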



Examples


  • Item 1: five ratings -> an agreement score is provided.

  • Item 2: four ratings -> no agreement score is provided.



Inclusion Requirements


  • An item must have at least five responses (excluding self-ratings) to receive an agreement score.


Note: The threshold for the number of responses can be adjusted.



Competency-Level Method

This section addresses the competency-level calculation method, the steps for using it, examples, and inclusion requirements.



Calculation Method


  • Convert item SDs to agreement values:

    • High: 0.9 (SD < 1)

    • Medium: 1.9 (1 ≤ SD ≤ 2)

    • Low: 2.9 (SD > 2)


  • Average these values to determine the overall competency agreement:

    • High: mean < 1

    • Medium: 1 ≤ mean < 2

    • Low: mean ≥ 2



Steps


  1. Compute item-level SDs within a competency.

  2. Assign agreement values based on SDs.

  3. Average these values to determine the competency agreement level.
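
Building on the item-level sketch above, a hypothetical competency_agreement function could convert each qualifying item's SD into an agreement value and average the results; again, the names and the sample-SD choice are assumptions for illustration.

    def competency_agreement(item_ratings):
        """item_ratings: one list of ratings per item in the competency."""
        values = []
        for ratings in item_ratings:
            if len(ratings) < MIN_RESPONSES:
                continue  # items with too few responses are excluded
            sd = statistics.stdev(ratings)
            if sd < 1:
                values.append(0.9)   # High
            elif sd <= 2:
                values.append(1.9)   # Medium
            else:
                values.append(2.9)   # Low
        if not values:
            return None  # no item met the response threshold
        mean = sum(values) / len(values)
        if mean < 1:
            return "High"
        if mean < 2:
            return "Medium"
        return "Low"

Averaging the bounded values (0.9, 1.9, 2.9) rather than the raw SDs caps each item's influence, which is what prevents a single high-variance item from dominating the competency result.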



Examples


  • Competency 1: all items have five or more ratings -> all items are included in the agreement score.

  • Competency 2: some items have fewer than five ratings -> only items with five or more ratings are included.



Inclusion Requirements


  • At least one item within the competency must have five or more responses for a competency agreement score to be calculated. Items with fewer responses are excluded.


Note: The threshold for the number of responses can be adjusted. Competency agreement can also be suppressed if any items lack valid data.




