ICC, reliability coefficient
0 < reliability = (true variance) / (true variance + error variance) < 1
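A quick numeric sketch of this definition; the variance components below are hypothetical numbers, chosen only to show that the ratio stays between 0 and 1:

```python
# Hypothetical variance components, purely for illustration.
true_var = 8.0    # variance due to real differences between subjects
error_var = 2.0   # variance due to measurement error

reliability = true_var / (true_var + error_var)
print(reliability)  # 0.8; shrinking error_var pushes this toward 1
```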
An ICC report should include three things: model, type, and definition (see the sketch after this list).
(1) model: 1-way random, 2-way random, 2-way mixed
(2) type: a single rating (type 1) vs. the mean of k ratings (type k)
(3) definition: consistency or absolute agreement
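As a rough sketch of how these three report items surface in software, the Python package pingouin returns all six ICC forms from one call; the column names subject/rater/score and the scores themselves are hypothetical:

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format data: 5 subjects, each scored by the same 2 raters.
df = pd.DataFrame({
    'subject': [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    'rater':   ['A', 'B'] * 5,
    'score':   [7, 8, 5, 6, 9, 9, 4, 5, 6, 7],
})

icc = pg.intraclass_corr(data=df, targets='subject', raters='rater', ratings='score')
# ICC1/ICC2/ICC3 are single-rater forms; ICC1k/ICC2k/ICC3k use the mean of k raters,
# covering the 1-way random, 2-way random, and 2-way mixed models.
print(icc[['Type', 'Description', 'ICC', 'CI95%']])
```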
MODEL:
1-way, 2-way:
(1)Do we have the same set of raters for all subjects?
the same set: from the subjects' point of view, the raters are set up the same way for everyone. For example, draw 2 raters at random and let one do the pretest and the other do the posttest, with the same pair used for every subject; this is 2-way. Conversely, if a rater is drawn at random for each measurement, it is 1-way random.
random, mixed:
(2)Do we have a sample of raters randomly selected from a larger population or a specific sample of raters?
Are the raters we use specific people who will keep doing the rating? If so, it is a mixed model; for example, the institution has this one professional rater doing the job for an entire career.
In practice, job rotation could arguably count as a form of random selection as well (see the data sketch below).
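A small data sketch of the two designs, with hypothetical subject/rater/score values: in a 2-way layout the same raters score every subject, so rater is a crossed factor; in a 1-way layout each subject is scored by whichever raters happened to be drawn, so rater identity is not comparable across subjects:

```python
import pandas as pd

# 2-way design: the SAME raters (A and B) score every subject.
two_way = pd.DataFrame({
    'subject': [1, 1, 2, 2, 3, 3],
    'rater':   ['A', 'B', 'A', 'B', 'A', 'B'],
    'score':   [7, 8, 5, 6, 9, 9],
})

# 1-way design: raters are drawn at random per subject, so the rater column
# cannot be treated as a factor crossed with subject.
one_way = pd.DataFrame({
    'subject': [1, 1, 2, 2, 3, 3],
    'rater':   ['A', 'C', 'B', 'D', 'E', 'A'],
    'score':   [7, 8, 5, 6, 9, 9],
})
```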
DEFINITION:
consistency: e.g., ranking applicants; essentially a ranking concept (systematic differences between raters are acceptable)
absolute agreement: rated above or below a preset absolute cut-off score; when a cut-off score must be identified, the raters' scores should follow the same "formula", i.e., agree in absolute terms
ICC (consistency) = subject variability / (subject variability + measurement error)
ICC (absolute agreement) = subject variability / (subject variability + variability in repetition + measurement error)
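A minimal sketch that computes both quantities from two-way ANOVA mean squares (the standard McGraw & Wong single-rater formulas); the function name and the example matrix are assumptions made for illustration:

```python
import numpy as np

def two_way_icc(X):
    """Return (ICC consistency, ICC absolute agreement) for an
    n-subjects x k-raters matrix X, via two-way ANOVA mean squares."""
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)   # per-subject means
    col_means = X.mean(axis=0)   # per-rater means

    ss_subjects = k * np.sum((row_means - grand) ** 2)
    ss_raters = n * np.sum((col_means - grand) ** 2)
    ss_error = np.sum((X - grand) ** 2) - ss_subjects - ss_raters

    ms_s = ss_subjects / (n - 1)            # subject variability
    ms_r = ss_raters / (k - 1)              # rater (repetition) variability
    ms_e = ss_error / ((n - 1) * (k - 1))   # measurement error

    consistency = (ms_s - ms_e) / (ms_s + (k - 1) * ms_e)
    agreement = (ms_s - ms_e) / (ms_s + (k - 1) * ms_e + (k / n) * (ms_r - ms_e))
    return consistency, agreement

# Example: rater B scores exactly 1 point higher than rater A on every subject,
# so the ranking is preserved (consistency = 1) but absolute agreement drops.
X = np.array([[7., 8.], [5., 6.], [9., 10.], [4., 5.], [6., 7.]])
print(two_way_icc(X))   # approximately (1.0, 0.88)
```

The systematic offset between the two raters is exactly the "variability in repetition" term that only the absolute-agreement denominator keeps.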
Q:
Caveats or drawbacks of ICC: