IRIS incident reporting volumes – addressing the misconceptions

Whenever a new system is introduced, a slow initial uptake is expected. Users typically require time to adapt to new processes, and usage grows as confidence builds. This is a natural part of the change journey.

In this article, I'll explore this journey on the Incident Reporting and Investigation System (IRIS), including how the COVID-19 pandemic impacted its early adoption. I'll also address a persistent, yet outdated, misconception that IRIS underperforms compared to the previous system.

The Data

IRIS was launched in June 2020, at the height of the COVID-19 pandemic. As a result, footfall on our sites - and therefore the number of incidents that needed to be reported - was greatly reduced at the time, slowing user engagement and extending the time staff needed to become familiar with the system. This led to early concerns that fewer incidents were being reported on IRIS than under the old paper-based process. At the time, these concerns were valid, as shown in the graph below.

To provide context:

  • In the final two full academic years of the paper-based system (2017–18 and 2018–19), 1,034 and 1,055 incidents were reported respectively
  • In IRIS’s first full academic year (2020–21), only 722 incidents were reported - during a year still heavily impacted by the pandemic
  • However, in 2021–22, the first full year of IRIS reporting largely unaffected by COVID, incident reporting levels returned to those seen under the paper-based system and have since continued to grow annually
  • By 2023–24, the most recent complete academic year, reporting levels were 60% higher than during the paper-based years, as shown below

Why is it important we address this misconception?

People's perceptions of a system reflect their confidence in it. Why would someone who believes a system is ineffective invest their already stretched time into using it properly? In all likelihood, they wouldn't. Worse still, these opinions are often shared and repeated, shaping wider attitudes.

We’ve seen this first-hand. The Safety Office has fielded repeated questions from senior University stakeholders about low reporting levels on IRIS and why the system isn't working, based on early figures that are no longer representative. This is not only frustrating - it risks undermining trust in a system that has demonstrably improved.

IRIS is not perfect, but it is evolving, and we're constantly making strides to improve it. The IRIS Product Ownership Group (POG), made up of users from across the University, has led continuous improvement efforts over the past three years. Through taking on board feedback and identifying improvements, over 40 system enhancements have been delivered. The rise in reporting numbers, shown in the above graph, is in large part a result of their commitment and collaboration. Spreading the misconception does a disservice to colleagues who have worked hard to make IRIS a success.

What now?

As safety professionals, it's important that people have confidence in our systems - both the IT systems themselves and the broader arrangements we help put in place. We should challenge and address misconceptions about not only IRIS and its usage, but all aspects of health and safety. Please share the graphs above with your stakeholders in departmental safety advisory committees or other relevant meetings.

The IRIS POG will continue to drive improvements to the system, with several enhancements already planned for the coming year. Your feedback remains crucial; please continue to share it with POG members so we can investigate what can be done to make IRIS the best it can be. Thank you.

Chris Sanders, Business Change Analyst – HR Systems – People Department