Assessing agreement between examiners, measurements, and instruments is always of interest to health-care providers, as the treatment of patients depends heavily on medical reports. To date, several agreement statistics have been developed, and all of them have certain limitations. In 2002, Kilem Gwet introduced a more robust and less biased agreement statistic named "Gwet's AC1 statistic". Various researchers have shown that the AC1 statistic has the best statistical properties among the existing agreement statistics. Although it has been reported to be a better estimator, several inconsistencies still exist in this agreement statistic. In this paper, the author aims to develop a new formula that can overcome the inconsistencies and dependencies of inter-rater agreement statistics.
Keywords: Inter-rater agreement, Agreement statistics, Kappa statistics, AC1 statistics
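For readers unfamiliar with the statistic discussed above, the standard form of Gwet's AC1 for two raters assigning nominal categories can be sketched as follows. This is an illustrative implementation of the published AC1 formula, not the new formula proposed in this paper; the function name and sample data are hypothetical.

```python
def gwet_ac1(ratings_a, ratings_b):
    """Gwet's AC1 agreement coefficient for two raters, nominal categories.

    AC1 = (Pa - Pe) / (1 - Pe), where Pa is the observed proportion of
    agreement and Pe is Gwet's chance-agreement probability:
    Pe = (1 / (K - 1)) * sum_k pi_k * (1 - pi_k),
    with pi_k the mean marginal proportion of category k across both raters.
    """
    n = len(ratings_a)
    # Observed agreement: fraction of subjects both raters classified identically.
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Categories observed by either rater (K of them).
    categories = sorted(set(ratings_a) | set(ratings_b))
    k_count = len(categories)
    # Gwet's chance-agreement term.
    pe = 0.0
    for cat in categories:
        pi_k = (ratings_a.count(cat) + ratings_b.count(cat)) / (2 * n)
        pe += pi_k * (1 - pi_k) / (k_count - 1)
    return (pa - pe) / (1 - pe)


# Hypothetical binary ratings from two raters on four subjects.
print(round(gwet_ac1([1, 1, 0, 1], [1, 0, 0, 1]), 4))  # 0.5294
```

Unlike Cohen's kappa, whose chance term grows with skewed marginals, this chance term shrinks as the category distribution becomes more extreme, which is why AC1 avoids the well-known kappa paradoxes under high agreement.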