Between August and December 2001 we conducted a pilot project at the Charité Medical Center in Berlin,1 extending the results of a similar study in the primary care setting by Swinglehurst et al,2 who described and evaluated an information service providing evidence-based answers to questions posed by GPs and nurse practitioners in Great Britain.
All clinicians at our university hospital were contacted by email, and we offered to address any medical question they might have over a period of 6 weeks by conducting a rigorous literature search and critically appraising the sources identified. All questions submitted were answered by two internal medicine specialists trained in evidence-based medicine. We always consulted the evidence-based databases Clinical Evidence, Cochrane Library, and ACP Journal Club. In contrast to the search cascade used by Swinglehurst et al, our two informaticists also consulted the TRIP database and PubMed for every question to ensure that the most up-to-date studies were considered.
A standardised format was used for answers, consisting of (1) the original question, (2) the search methodology (including search terms and hits), (3) a critical appraisal of the evidence found, and (4) a classification of this evidence according to the Oxford Centre for Evidence-based Medicine levels of evidence table, May 2001 (http://cebm.jr2.ox.ac.uk). One to two weeks after delivering our answer we used questionnaires to evaluate the recipient's satisfaction with our service and its influence on his or her medical decision making.
Overall, we received 34 questions (2.3% of all clinicians contacted), 31 (91%) of which could be answered. Of these, 24 questions were related to treatment, three to aetiology, two to prognosis, one to diagnosis, and one to side effects.
According to the levels of evidence table mentioned above, the evidence found was classified as level 1a for one question, level 1b for two, level 2b for seven, level 4 for seven, and level 5 for two. The evidence provided for the remaining 12 questions could not be classified because of insufficient data. The median time taken for the answering process was 7 hours (range 3–32), which is considerably longer than in other studies.2,3 We attribute this primarily to the search for, and critical appraisal of, primary literature. According to our estimates, approximately 40% of the time was spent on literature searching, a further 40% on critical appraisal, and the final 20% on preparing the written answers.
Of the clinicians sent questionnaires to assess their satisfaction with our service, 61% responded. Almost 90% of respondents expressed a high level of satisfaction, rating the service as good to very good, and 68% of the responding physicians indicated that our information had answered their question entirely. Most respondents also considered our answers exemplary and felt that the information we provided would help them in their own further literature research and critical appraisal. Interestingly, however, the choice of treatment considered before submitting the question was only minimally or not at all influenced by the answer we provided.
Because evidence-based guidelines and quick access to evidence-based literature are still lacking in Germany, we feel that there is a strong need for an external information service even in the academic setting, despite the time and effort involved.