Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups

Abstract

Background: Although classification systems are of great value in the management of lateral clavicle fractures, substantial variation exists among them. We performed a retrospective study to assess the inter- and intraobserver reliability of three classification systems for lateral clavicle fractures.

Methods: Radiographs of 20 lateral clavicle fractures representing the full spectrum of adult fracture patterns were graded by five experienced radiologists and five experienced trauma surgeons according to the Orthopaedic Trauma Association (OTA), Neer, and Jäger/Breitner classification systems. Grading was repeated at two time points 3 months apart. Observer agreement was measured with the Fleiss kappa coefficient (κ) and interpreted according to the grading of Landis and Koch.

Results: Overall interobserver reliability was fair for all three classification systems. For the OTA system, the combined interobserver κ was 0.338 (radiologists 0.350, trauma surgeons 0.374); for the Neer system, 0.278 (radiologists 0.276, trauma surgeons 0.238); and for the Jäger/Breitner system, 0.330 (radiologists 0.382, trauma surgeons 0.306). Overall intraobserver reliability was moderate for the OTA and Jäger/Breitner systems and fair for the Neer system. Intraobserver κ values varied widely within every system: 0.086 to 0.634 for OTA, 0.137 to 0.448 for Neer, and 0.154 to 0.625 for Jäger/Breitner.

Conclusions: The low inter- and intraobserver agreement shown by both specialist groups suggests that the tested classification systems for lateral clavicle fractures are unreliable and therefore of limited value. Physicians classify lateral clavicle fractures with considerable inconsistency, so any conclusions drawn from such classifications should be regarded as partly subjective.
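The study quantifies agreement with Fleiss' kappa and grades the result on the Landis and Koch scale. As a minimal illustrative sketch of that statistic (the rating counts below are invented example data, not from the study), Fleiss' κ for a fixed number of raters classifying a set of cases can be computed as:

```python
def fleiss_kappa(counts):
    """counts[i][j] = number of raters assigning case i to category j.
    Assumes the same number of raters graded every case."""
    n_cases = len(counts)
    n_raters = sum(counts[0])
    # Mean per-case observed agreement P_i = (sum_j n_ij^2 - n) / (n(n-1))
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_cases
    # Chance agreement from the marginal category proportions
    totals = [sum(row[j] for row in counts) for j in range(len(counts[0]))]
    p_e = sum((t / (n_cases * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

def landis_koch(kappa):
    """Verbal agreement grade per Landis & Koch (1977)."""
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

# Hypothetical example: 5 raters grading 4 fractures into 3 categories
ratings = [
    [5, 0, 0],
    [3, 2, 0],
    [0, 4, 1],
    [1, 1, 3],
]
k = fleiss_kappa(ratings)
print(round(k, 3), landis_koch(k))  # prints: 0.331 fair
```

A κ near 0.33, as in this toy example, falls in the 0.21–0.40 "fair" band, the same band reported for the interobserver results of all three classification systems in the study.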


Bibliographic Details
Main Authors: Thomas Rauer, Matthias Boos, Valentin Neuhaus, Prasad Ellanti, Robert Alexander Kaufmann, Hans-Christoph Pape, Florin Allemann
Affiliations: Department of Trauma Surgery, University Hospital Zurich (Rauer, Boos, Neuhaus, Ellanti, Pape, Allemann); Department of Orthopaedic Surgery, University of Pittsburgh (Kaufmann)
Format: Article
Language: English
Published: BMC, 2020-01-01
Series: Patient Safety in Surgery
ISSN: 1754-9493
Subjects: Lateral clavicle fracture; Reliability; Classification systems; Inter- and intraobserver agreement; Fleiss' kappa value
Online Access:https://doi.org/10.1186/s13037-019-0228-y