Journal Article
Research Support, Non-U.S. Gov't
Carotid plaque characterization by duplex scanning: observer error may undermine current clinical trials.
Stroke; a Journal of Cerebral Circulation 1999 January
BACKGROUND AND PURPOSE: Clinical studies currently in progress are using subjective methods to characterize plaque morphology from ultrasound images. However, there are few studies on the intraobserver and interobserver variability of these classifications. This study was designed to assess these variables.
METHODS: Plaque morphology was graded from ultrasound images, stored both digitally and on hard copy, using 2 classification schemes. Interobserver agreement was determined across 4 observers. Within-observer agreement was assessed at intervals over a period of up to 6 months. The accuracy of the 2 methods was determined by comparison with histology.
RESULTS: Within- and between-observer agreement was moderate to good for full-color digital image analyses, with pooled kappa values of κp=0.49±0.10 and κp=0.62±0.07 for the 2-category method and κp=0.53±0.06 and κp=0.52±0.05 for the 4-category method, respectively. Hard copy data analyses gave lower kappa values. The more experienced observers produced higher within-observer agreement and higher correlation with histology.
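The agreement statistics reported above are kappa coefficients, which correct raw percent agreement for agreement expected by chance. As an illustration only (this is a generic sketch of Cohen's kappa for a single pair of observers, not the study's pooled-kappa procedure; the rating data below are invented):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters
    who assign categorical grades to the same set of items."""
    assert len(rater1) == len(rater2) and len(rater1) > 0
    n = len(rater1)
    # Observed agreement: fraction of items both raters graded identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: from each rater's marginal category frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 2-category plaque gradings (e.g. 1 = echolucent, 0 = echogenic)
obs_a = [1, 1, 0, 0, 1]
obs_b = [1, 0, 0, 0, 1]
print(cohens_kappa(obs_a, obs_b))
```

By a common rule of thumb, kappa values of 0.41-0.60 indicate moderate agreement and 0.61-0.80 substantial agreement, which is consistent with the "moderate to good" characterization above.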
CONCLUSIONS: Reproducible grading of ultrasound images is not consistently achievable even among experienced observers, and within-observer agreement may vary over time. The current subjective ultrasound characterization of carotid plaque morphology used in clinical trials may therefore be associated with unacceptably poor reproducibility in some centers. Variability between observers may be reduced by using the simpler 2-category grading of plaque morphology applied to full-color digitally stored images. Observer agreement should be audited regularly.