
Five shared decision-making tools in 5 months: use of rapid reviews to develop decision boxes for seniors living with dementia and their caregivers

Overview of attention for article published in Systematic Reviews, March 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (84th percentile)
  • High Attention Score compared to outputs of the same age and source (81st percentile)

Mentioned by

25 X users

Citations

17 Dimensions

Readers on

138 Mendeley
Title
Five shared decision-making tools in 5 months: use of rapid reviews to develop decision boxes for seniors living with dementia and their caregivers
Published in
Systematic Reviews, March 2017
DOI 10.1186/s13643-017-0446-2
Authors

Moulikatou Adouni Lawani, Béatriz Valéra, Émilie Fortier-Brochu, France Légaré, Pierre-Hugues Carmichael, Luc Côté, Philippe Voyer, Edeltraut Kröger, Holly Witteman, Charo Rodriguez, Anik M. C. Giguere

Abstract

Decision support tools build on comprehensive and timely syntheses of the literature. Rapid reviews, which omit certain components of traditional systematic reviews, may support their development. We therefore aimed to describe a rapid review approach underlying the development of decision support tools, i.e., five decision boxes (DB) for shared decision-making between seniors living with dementia, their caregivers, and healthcare providers.

We included studies based on PICO questions (Participant, Intervention, Comparison, Outcome) describing each of the five specific decisions. We gave priority to higher-quality evidence (e.g., systematic reviews). For each DB, we first identified secondary sources of literature, namely clinical summaries, clinical practice guidelines, and systematic reviews. After an initial extraction, we searched for primary studies in academic databases and grey literature to fill gaps in the evidence. We extracted study designs, sample sizes, populations, and probabilities of benefits/harms of the health options. A single reviewer conducted the literature search and study selection. The data extracted by one reviewer were verified by a second, experienced reviewer. Two reviewers assessed the quality of the evidence. We converted all probabilities into absolute risks for ease of understanding. Two to five experts validated the content of each DB. We conducted descriptive statistical analyses of the review processes and the resources required.

The approach allowed screening of a limited number of references (range: 104 to 406 per review). For each review, we included 15 to 26 studies, 2 to 10 health options, and 11 to 62 health outcomes, and conducted 9 to 47 quality assessments. A team of ten reviewers with varying levels of expertise was supported at specific steps by an information specialist, a biostatistician, and a graphic designer. The time required to complete a rapid review varied from 7 to 31 weeks per review (mean ± SD, 19 ± 10 weeks). Data extraction required the most time (8 ± 6.8 weeks). The average estimated cost of a rapid review was C$11,646 (SD = C$10,914).

This approach enabled the development of clinical tools more rapidly than a traditional systematic review would allow. Future studies should evaluate the applicability of this approach to other teams and tools.
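The abstract notes that all extracted probabilities were converted into absolute risks, but does not show the arithmetic. As an illustration only, here is a minimal Python sketch of the standard conversion from a reported relative risk to an absolute risk; the function names and figures below are hypothetical and are not taken from the paper:

```python
def absolute_risk(baseline_risk: float, relative_risk: float) -> float:
    """Absolute risk in the intervention group, given the control-group
    (baseline) risk and a reported relative risk (RR)."""
    return baseline_risk * relative_risk

def risk_difference(baseline_risk: float, relative_risk: float) -> float:
    """Change in absolute risk attributable to the intervention
    (negative = fewer events)."""
    return absolute_risk(baseline_risk, relative_risk) - baseline_risk

# Hypothetical numbers: a 20% baseline risk and a reported RR of 0.75
ar = absolute_risk(0.20, 0.75)    # 0.15 -> 15 per 100 people
rd = risk_difference(0.20, 0.75)  # -0.05 -> 5 fewer per 100 people
print(f"absolute risk: {ar:.0%}; {abs(rd) * 100:.0f} fewer per 100 people")
```

Expressing results as events per 100 (or per 1,000) people in this way is what lets a decision box present the benefits and harms of each option side by side.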

X Demographics

The data shown below were collected from the profiles of 25 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 138 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Spain          2     1%
Unknown      136    99%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                          23    17%
Student > Master                    22    16%
Student > Ph.D. Student             11     8%
Student > Bachelor                  10     7%
Other                                9     7%
Other                               23    17%
Unknown                             40    29%

Readers by discipline                                  Count   As %
Medicine and Dentistry                                    23    17%
Nursing and Health Professions                            16    12%
Psychology                                                12     9%
Social Sciences                                           10     7%
Pharmacology, Toxicology and Pharmaceutical Science        4     3%
Other                                                     26    19%
Unknown                                                   47    34%
Attention Score in Context

This research output has an Altmetric Attention Score of 14. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 11 June 2018.
All research outputs: #2,328,041 of 23,191,112 outputs
Outputs from Systematic Reviews: #421 of 2,015 outputs
Outputs of similar age: #46,371 of 308,446 outputs
Outputs of similar age from Systematic Reviews: #12 of 58 outputs
Altmetric has tracked 23,191,112 research outputs across all sources so far. Compared to these, this one has done well: it is in the 89th percentile, placing it in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,015 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.8. This one has done well, scoring higher than 79% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 308,446 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 84% of its contemporaries.
We're also able to compare this research output to 58 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 81% of its contemporaries.
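The percentiles quoted above can be roughly reproduced from the ranks. A minimal sketch, assuming (the page does not document this) that the percentile is simply the share of tracked outputs ranked below this one:

```python
def percentile_from_rank(rank: int, total: int) -> float:
    """Percentage of `total` outputs that an output at `rank` outscores
    (rank 1 = the most-mentioned output)."""
    return (total - rank) / total * 100

# Ranks quoted on this page:
print(percentile_from_rank(2_328_041, 23_191_112))  # ~89.96 -> "89th percentile"
print(percentile_from_rank(46_371, 308_446))        # ~84.97 -> "84th percentile"
print(percentile_from_rank(421, 2_015))             # ~79.11 -> "higher than 79% of its peers"
```

The same-source, similar-age figure (#12 of 58, reported as the 81st percentile) comes out at roughly 79% under this formula; per the note above, the percentiles were fixed when the output was last mentioned, so the ranks shown may have shifted since then.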