Anglia Ruskin Research Online (ARRO)

Beyond relevance and recall: testing new user-centred measures of database performance
journal contribution
posted on 2023-08-30, 13:36 authored by Peter Stokes, Allen Foster, Christine Urquhart
Background: Measures of database effectiveness have traditionally focused on recall and precision, with some debate on how relevance should be assessed, and by whom. New measures of database performance are needed now that users are familiar with search engines and expect full-text availability.

Objectives: This research ascertained which of four bibliographic databases (BNI, CINAHL, MEDLINE and EMBASE) could be considered most useful to nursing and midwifery students searching for information for an undergraduate dissertation.

Methods: Title searches were performed for dissertation topics supplied by nursing students (n = 9), who made the relevance judgements. Measures of recall and precision were combined with additional factors to provide measures of effectiveness; efficiency combined measures of novelty and originality; and accessibility combined measures of availability and retrievability, based on obtainability.

Results: There were significant differences among the databases in precision, originality and availability, but other differences were not significant (Friedman test). Odds ratio tests indicated that BNI, followed by CINAHL, was the most effective, CINAHL the most efficient, and BNI the most accessible.

Conclusions: The methodology could help library services in purchase decisions, as the measure of accessibility and odds ratio testing helped to differentiate database performance.
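To make the underlying measures concrete, the sketch below shows standard precision and recall, plus a simple odds ratio over a 2x2 table, as commonly used to compare retrieval performance. This is an illustrative sketch only, assuming the textbook definitions; the record sets, database names and numbers are hypothetical, and the paper's composite measures (effectiveness, efficiency, accessibility) are not reproduced here.

```python
def precision(retrieved: set, relevant: set) -> float:
    """Fraction of retrieved records judged relevant."""
    return len(retrieved & relevant) / len(retrieved) if retrieved else 0.0

def recall(retrieved: set, relevant: set) -> float:
    """Fraction of all relevant records that were retrieved."""
    return len(retrieved & relevant) / len(relevant) if relevant else 0.0

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:
    a = database 1 relevant hits,     b = database 1 non-relevant hits,
    c = database 2 relevant hits,     d = database 2 non-relevant hits."""
    return (a * d) / (b * c)

# Hypothetical relevance judgements for one dissertation topic.
relevant = {1, 2, 3, 4, 5}
db1_hits = {1, 2, 3, 9}       # illustrative result set for database 1
db2_hits = {2, 4, 8, 9, 10}   # illustrative result set for database 2

print(precision(db1_hits, relevant), recall(db1_hits, relevant))  # 0.75 0.6
print(precision(db2_hits, relevant), recall(db2_hits, relevant))  # 0.4 0.4
print(odds_ratio(3, 1, 2, 3))  # 4.5 -> database 1 has higher odds of a relevant hit
```

An odds ratio above 1 favours the first database's odds of retrieving a relevant record; the paper applies such tests across the composite measures rather than raw hit counts.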

History

Refereed

  • Yes

Volume

26

Issue number

3

Page range

220-231

Publication title

Health Information and Libraries Journal

ISSN

1471-1842

Publisher

Wiley

File version

  • Accepted version

Language

  • eng

Legacy posted date

2011-12-20

Legacy creation date

2022-02-18

Legacy Faculty/School/Department

Support Services
