INTRODUCTION

Social media continues to transform the way that organizations and consumers interact. Facebook®, the world’s largest social networking site, has 1.2 billion monthly users, indicating that the scope of social media extends beyond traditional marketing platforms.1 Healthcare lags behind other industries in the use of social media, which is partly attributable to unique ethical and legal issues, including controlling the flow of information, patient privacy, and protection of the patient–provider relationship.2–7 Nevertheless, healthcare social media has evolved from simple information-sharing functions to addressing complex public health problems such as healthcare quality and safety, disaster preparedness, and pandemic surveillance.8–12

The Office of Health Information Technology within the U.S. Department of Health and Human Services acknowledges the importance of tracking and responding to social media’s growing role in healthcare.13 Shared experiences about satisfaction and quality of healthcare organizations and providers posted online may influence consumers’ healthcare decision-making in the coming years. The simplicity of social media as a healthcare information resource, compared with the often challenging and conflicting formats used for public reporting of healthcare quality data, may add value for consumers seeking to make sense of complex information.14,15 In fact, public reporting of healthcare outcomes is largely ignored by consumers, at least in part because of accessibility and comprehensibility issues.16,17 In 2008, fewer than 10 % of Americans used information comparing the quality of health insurance plans (9 %), hospitals (7 %), or doctors (6 %) to make healthcare-related decisions, and only 6 % were aware of Hospital Compare.18 Ensuring the accuracy of quality metrics for healthcare organizations and providers on social media sites is therefore important to all stakeholders.15

In November 2013, technology Web sites began reporting that Facebook was providing organizations the option of allowing users to post ratings (1–5 stars) on their Facebook pages.19 The extent to which these ratings are related to hospital quality or patient satisfaction is unknown. Additionally, the potential impact of user-generated metrics on healthcare consumer decision-making and market share is unclear. However, a study from Harvard Business School demonstrated that among Seattle area restaurants, a one-star increase in Yelp® ratings was associated with a 5–9 % increase in revenue.20 Average star ratings may serve as a simplifying heuristic to help consumers learn about quality in the face of complex information.20

To date, the majority of literature related to social media in healthcare has been qualitative, primarily focused on implications and applications.2 Few studies have provided quantitative analysis of social media utilization by healthcare organizations or of the relationship between end-user metrics and hospital quality and/or outcomes.11,21,22 However, emerging studies suggest that feedback and ratings on social media and other online rating tools may be correlated with patient satisfaction and objective measures of hospital quality and safety.23

Our study investigates the use of Facebook among hospitals performing above or below the confidence interval of the national average for 30-day readmissions as reported on Hospital Compare. Endorsed by the National Quality Forum, 30-day readmissions are also the focus of healthcare cost and quality initiatives, including the Hospital Readmissions Reduction Program.24–26 The purpose of this study was to determine whether hospitals with lower readmission rates were more likely to have higher ratings on Facebook than hospitals with higher readmission rates.

METHODS

Publicly available cross-sectional data from Hospital Compare were accessed to identify performance measurements for all Medicare-certified hospitals on 30-day hospital-wide all-cause unplanned readmission rates (HWR).25 Data collection for readmission rates encompassed the period from July 2011 through June 2012, the most recently available data at the time of study initiation.27 Only hospitals reported as performing “outside the expected national average” on 30-day HWR were included, both to evaluate the maximum potential differences between groups and to reduce the administrative burden and time-associated bias related to collection of Facebook ratings, which can change rapidly. The statistical model used to calculate HWR and 95 % confidence interval estimates is reviewed elsewhere.28
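As a rough illustration of this selection step (not the authors' code), a Hospital Compare extract could be filtered as in the sketch below; the file name, measure code, and comparison-category labels are assumptions about the export format.

```python
# Minimal sketch: selecting outlier hospitals from a Hospital Compare readmissions
# extract. The file name, measure code, and label strings are assumptions.
import pandas as pd

hc = pd.read_csv("hospital_compare_readmissions.csv")  # hypothetical export

# Keep only the hospital-wide all-cause unplanned readmission (HWR) measure.
hwr = hc[hc["Measure ID"] == "READM_30_HOSP_WIDE"]  # assumed measure code

# Hospital Compare flags each hospital as better than, no different from, or
# worse than the national rate based on the 95 % interval estimate.
low_hwr = hwr[hwr["Compared to National"] == "Better than the National Rate"]
high_hwr = hwr[hwr["Compared to National"] == "Worse than the National Rate"]

print(f"{len(low_hwr)} low-HWR hospitals; {len(high_hwr)} high-HWR hospitals")
```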

Each hospital name, as listed in Hospital Compare, was entered into an Internet search tool (Google®) to locate its Web page, which was then searched for a hyperlink to Facebook. If the link was present, it was followed to the Facebook page and data were collected. Only Facebook Pages, which are designed to “enable public figures, businesses, organizations and other entities to create an authentic and public presence on Facebook,” were included.29 Additional options for Facebook presence, including groups, check-in pages, community pages, and personal profiles, were excluded.

If there was no visible link to Facebook on the hospital Web page, additional methods were used, as follows: 1) searching the Web page’s HTML code for the terms “Facebook,” “connect,” “follow,” or “social”; 2) entering “Facebook” in the hospital Web page’s native search tool; 3) an Internet search of the hospital name followed by “Facebook”; and 4) entering the hospital name in Facebook’s native search tool. If a dedicated Facebook page was not found through these methods, the hospital was considered not to have a Facebook page. Because some hospital names are duplicated, Facebook pages were confirmed by address. Facebook data were obtained from January 26, 2014, through February 2, 2014.
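The first fallback step could look roughly like the following sketch (purely illustrative; the function name and regular expression are ours, and manual confirmation of each page by hospital address would still be required):

```python
# Illustrative sketch of the HTML-code fallback search described above.
# The helper name and regular expression are hypothetical, not the authors' code.
import re
import requests

SEARCH_TERMS = ("facebook", "connect", "follow", "social")

def find_facebook_link(hospital_url):
    """Return the first facebook.com URL found in the hospital's homepage HTML, if any."""
    html = requests.get(hospital_url, timeout=10).text.lower()
    if not any(term in html for term in SEARCH_TERMS):
        return None
    match = re.search(r'href="(https?://(?:www\.)?facebook\.com/[^"]+)"', html)
    return match.group(1) if match else None
```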

Data from hospital/health system Facebook pages included 1) the presence, number, and average of ratings (1–5), if available; 2) the number of likes; 3) the availability and number of check-ins (“were here,” in which a user indicates having been physically present at that location); and 4) the date the hospital joined Facebook, as listed under the timeline data. If one or more hospitals within a health system shared the same Facebook page, all such hospitals were deemed to utilize Facebook, but only one occurrence of the page was included in the Facebook data analyses to prevent double-counting.
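For illustration only, the fields collected for each page could be represented as a simple record such as the one below; the field names are ours, since the authors do not describe a data structure.

```python
# Hypothetical record of the Facebook variables collected per hospital page;
# field names are illustrative, not taken from the study.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class FacebookPageData:
    hospital_name: str
    has_ratings: bool                   # whether the 1-5 star rating feature is enabled
    n_ratings: Optional[int] = None     # number of user ratings, if available
    avg_rating: Optional[float] = None  # average star rating (1-5), if available
    n_likes: int = 0                    # number of page likes
    n_checkins: Optional[int] = None    # "were here" count, if check-ins are enabled
    joined: Optional[date] = None       # date the hospital joined Facebook (timeline data)
    shared_system_page: bool = False    # True if the page is shared across a health system
```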

Data on hospital size (beds), teaching status (major, minor, non-teaching), location (urban/rural), and total expenditures were collected from publicly available data (American Hospital Association DataViewer)30 from February 20, 2014, through March 6, 2014.

Statistical analyses were performed using SPSS software (SPSS for Windows, Version 16.0; SPSS Inc., Chicago, IL, USA). Continuous variables are expressed as mean ± standard deviation or median [interquartile range], as appropriate. Normality testing was performed using the Kolmogorov-Smirnov test. Between-group comparisons were performed with the independent-samples t-test, Mann–Whitney U test, or Chi-square test, as appropriate. Multiple logistic regression analysis was used to assess the association between Facebook ratings and HWR performance when controlling for key variables. A p value < 0.05 was considered statistically significant.
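The analyses were run in SPSS; a hedged re-implementation of the same steps in Python might look like the sketch below, where the merged dataset file and column names are assumptions.

```python
# Sketch of the analysis pipeline (the study used SPSS 16.0); the input file
# and column names are assumptions, not the authors' materials.
import pandas as pd
from scipy import stats
import statsmodels.api as sm

df = pd.read_csv("hospital_facebook_data.csv")  # hypothetical merged dataset

low = df.loc[df["group"] == "low_HWR", "fb_rating"].dropna()
high = df.loc[df["group"] == "high_HWR", "fb_rating"].dropna()

# Normality testing (Kolmogorov-Smirnov) guides the choice of between-group test.
_, p_norm = stats.kstest(low, "norm", args=(low.mean(), low.std()))
if p_norm > 0.05:
    stat, p = stats.ttest_ind(low, high)      # approximately normal: t-test
else:
    stat, p = stats.mannwhitneyu(low, high)   # non-normal: Mann-Whitney U

# Multiple logistic regression: odds of belonging to the low-HWR group,
# controlling for hospital characteristics and Facebook-related variables.
predictors = ["fb_rating", "beds", "admissions", "personnel", "outpatient_visits"]
X = sm.add_constant(df[predictors])
y = (df["group"] == "low_HWR").astype(int)
model = sm.Logit(y, X, missing="drop").fit()
print(model.summary())
```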

RESULTS

Characteristics of hospitals included in the study are presented in Table 1. Among all hospitals in the Hospital Compare 30-day HWR dataset (n = 4805), 79 % (3813) were reported as having a readmission rate that was “no different than the U.S. national rate” (15.4 ≤ HWR ≤ 16.9), 315 hospitals (6.6 %) were performing better than the national average on HWR (HWR < 15.4, low-HWR group), and 364 (7.6 %) were performing worse than the national rate (HWR > 16.9, high-HWR group). Data on HWR were “not available” for 196 hospitals (4.1 %), and the number of patients/cases was too few for evaluation for 117 hospitals (2.4 %).31

Table 1 Characteristics of the Studied Hospitals

No significant difference was found between the high- and low-HWR hospitals with respect to the number of beds, admissions, outpatient visits, personnel, total expenses, or ownership status. Major teaching hospitals were more likely to be in the high-HWR hospital group (p < 0.001).

Table 2 presents data on Facebook use and related variables by group. Among the low-HWR hospitals, 93.3 % had a Facebook page (either a unique page or a page shared with a larger healthcare organization), as compared to 82.4 % of high-HWR hospitals (p < 0.001). Among the unique Facebook pages, the prevalence of allowing users to provide ratings on the five-star system was 81 % in the high-HWR group and 82 % in the low-HWR group. Figure 1 presents an example of a hospital Facebook page that utilized the five-star rating system. The number of ratings for low-HWR hospitals (349 [159–569]) was significantly (p = 0.01) higher than that for high-HWR hospitals (248 [116–532]).

Table 2 Facebook Characteristics of the Studied Hospitals
Figure 1. Massachusetts General Hospital Facebook page. *Accessed on 2/4/2015 at www.facebook.com/massgeneral.

Among hospital Facebook pages utilizing the five-star rating system, the average Facebook rating for low-HWR hospitals (mean: 4.15 ± 0.31, range: 3.1–5.0, median/interquartile range: 4.2 [4.0–4.3]) was significantly (p < 0.01) higher than that for high-HWR hospitals (mean: 4.05 ± 0.41, range: 2.5–5.0, median/interquartile range: 4.1 [3.8–4.3]). There was no significant difference between groups for the following Facebook variables: number of likes, number of check-ins, and number of months the Facebook page existed.

Figure 2 presents an in-group comparison of 30-day HWR stratified by the presence or absence of a Facebook page. Among high-HWR hospitals, those with a Facebook page (n = 300) had a lower HWR than those without a Facebook page (n = 64) (17.96 ± 0.73 vs. 18.5 ± 1.04, p < 0.01). Among low-HWR hospitals, those with a Facebook page (n = 294) had a significantly lower HWR than hospitals in this group without a Facebook page (n = 21) (14.17 ± 0.53 vs. 14.94 ± 0.52).

Figure 2. Comparison of 30-day hospital-wide unplanned readmission rates, stratified by use of Facebook.

In the multivariate logistic regression analysis (Table 3), a one-star increase in Facebook rating was associated with a 5.1-fold increase in the odds that a hospital belonged to the low-HWR group rather than the high-HWR group, when controlling for hospital characteristics (admissions, number of beds, number of personnel, number of outpatient visits) and Facebook-related variables. A major teaching affiliation was associated with increased odds of belonging to the high-HWR hospital group (OR = 14.3, CI: 5.6–33.3, p < 0.01).
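For readers less familiar with interpreting odds ratios from logistic regression, the relationship between the model coefficient and the reported odds ratio is the standard one shown below; the coefficient value is simply back-calculated from OR ≈ 5.1 for illustration and is not reported in the study.

```latex
% Standard logistic-regression interpretation of the reported odds ratio;
% the coefficient is back-calculated from OR ~ 5.1 for illustration only.
\[
\log\!\left(\frac{P(\text{low-HWR})}{1 - P(\text{low-HWR})}\right)
  = \beta_0 + \beta_{\text{rating}}\, x_{\text{rating}} + \boldsymbol{\beta}^{\top}\mathbf{z},
\qquad
\mathrm{OR}_{+1\ \text{star}} = e^{\beta_{\text{rating}}} \approx 5.1
\;\Longrightarrow\; \beta_{\text{rating}} \approx \ln 5.1 \approx 1.63
\]
```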

Table 3 Multivariate Regression Analysis for Prediction of Performance on 30-Day Hospital Readmissions (Better or Worse than National Average)

DISCUSSION

This study examined the extent to which hospital ratings on Facebook were related to 30-day hospital-wide unplanned readmission rates among hospitals with readmission rates outside the confidence interval for the national average. Among hospitals with Facebook pages, those with lower 30-day readmission rates had higher ratings on Facebook’s five-star rating scale than hospitals with higher readmission rates, after controlling for hospital characteristics and Facebook-related variables. These findings add support to the small but growing body of literature suggesting that unsolicited feedback on social media and hospital ratings sites corresponds to patient satisfaction and objective measures of hospital quality.11,23,31

In addition, 88 % of hospitals in this study had a Facebook page, which is similar to estimates published in other studies.32 This finding suggests that hospital leaders find value or necessity in maintaining a social media presence. Healthcare organizations are using social media sites like Facebook for a variety of functions, including patient education, marketing, and sharing information with staff.33 Healthcare organizations may find that allowing ratings on social media sites warrants increased attention.

There are several plausible explanations for the finding that Facebook ratings were associated with 30-day HWR. First, consumer ratings of hospitals on social media may actually reflect hospital quality. The concept that user-generated and unsolicited feedback reflects quality is not novel, having been demonstrated in the automotive and restaurant industries.20,34 Alternatively, ratings could reflect confirmation bias, as users may be inclined to provide higher ratings to hospitals already perceived to be of high quality.

While Facebook users are now empowered to provide hospital ratings, the factors that coalesce into a final rating may vary widely. Other rating sites/systems, such as U.S. News & World Report® and the Leapfrog hospital survey, offer more granularity, as they provide information by organ system, disease process, or aggregates of multiple quality measures. Another benefit of these more granular systems is periodic updating. In contrast, social media ratings may be more difficult to change once a critical mass of feedback has accumulated. Thus, Facebook ratings could reflect historical rather than current trends in quality.

Low-HWR hospitals were more likely to have a Facebook page than high-HWR hospitals, which could be attributable to the fact that hospitals with high HWR may have more to lose from potentially negative feedback. The impact of negative ratings on social media may be more detrimental than the influence of potentially positive ratings/feedback.36 Alternatively, lower-quality hospitals may have a perceived or real lack of need to maintain a social media presence, as they may be situated in regions where competition for patients or insurance contracts is limited. Finally, maintaining a coherent social media strategy requires resources that hospital leaders may decide are better spent elsewhere, such as on programs to improve quality and safety.

Quantitative study of the relationship between user-generated ratings/feedback on social media and hospital quality measures is relatively new within healthcare.11 However, interest in the role of social media in assessing patient satisfaction and hospital quality is growing.11 The findings in our study are similar to those of others in both the U.S. and UK, suggesting that unsolicited ratings and feedback are associated with patient satisfaction and objective measures of quality. One study evaluated the relationship between hospital ratings on Yelp and scores on the Hospital CAHPS® (HCAHPS; Hospital Consumer Assessment of Healthcare Providers and Systems) survey.23 Over 950 hospitals had at least five ratings on Yelp, and Yelp ratings were positively correlated with HCAHPS scores. Similarly, Yelp ratings were negatively correlated with mortality measures for myocardial infarction and pneumonia and with 30-day readmissions.23

In 2008, the UK National Health Service (NHS) established the Web site “NHS Choices,” allowing patients to provide unsolicited ratings of their experiences with healthcare providers.37 Patients can rate several aspects of quality and can leave free-text comments on particular aspects of care. A study of hospital ratings within NHS Choices found that positive recommendations were significantly associated with lower standardized mortality ratios and lower readmission rates.32 Similarly, higher ratings of hospital cleanliness were associated with lower rates of methicillin-resistant Staphylococcus aureus (MRSA) and Clostridium difficile infection.32

In our study, there was no association between the number of likes and hospital-wide 30-day unplanned readmission rates. This is in contrast to a study of New York City area hospitals that found a strong negative association between the number of Facebook “likes” and 30-day mortality rates, and a positive association with patient recommendation measures.38 One possible explanation is that mortality may have a stronger influence on whether users “like” a hospital Facebook page. However, within the social media landscape, ratings are now considered more representative of consumer perceptions of quality than “likes” are.19 Facebook’s five-star rating system combines ease of use, quantitative information, and digestibility, offering more granularity with regard to how users feel about companies and products.19

Evaluating the relationship between user-generated ratings of hospitals on social media and objective measures of hospital quality is important given the potential impact on consumer healthcare decisions. Rothberg notes that although even the best public reports of hospital quality do not seem to affect market share or consumer choices, this may change as consumers become more aware of rating services and as high-deductible plans drive patients to seek care beyond their local hospital.15

Ratings on social media represent another tool that consumers can employ in making healthcare decisions about hospitals. Along the spectrum of online tools that provide hospital ratings/rankings, social media is likely among the easiest to use and most readily accessible. Continuing to develop quality measures that are understandable and accessible to patients is important in order to ensure that measures on social media do not become overvalued, given inherent biases related to online ratings. Alternatively, incorporating social media and other online tools that allow consumers to provide feedback within existing quality measure platforms, similar to NHS Choices in the United Kingdom, may represent a reasonable next step.

LIMITATIONS

A number of limitations and confounders are present in the current study. First, the study design is cross-sectional and correlative, which limits assigning causality to the findings. Reporting of 30-day readmission measures is significantly delayed, and the most recently available data collection period at the time of the study (July 2011–June 2012) preceded Facebook’s implementation of the rating system. Therefore, Facebook ratings may not be pertinent to the time period of readmission data collection.

The Centers for Medicare & Medicaid Services (CMS)/Yale 30-day all-cause unplanned readmission rate (HWR) used in this study has inherent limitations related to case-mix adjustment, sample size, and applicability to non-Medicare populations.39 Another important limitation is that quality measures such as mortality or patient satisfaction were not used in addition to readmission rates. However, 30-day HWR was the only relevant hospital-level all-cause quality measure reported as a continuous variable on Hospital Compare. Mortality measures on Hospital Compare are condition-specific (e.g., 30-day mortality rate for heart failure or acute myocardial infarction). Patient satisfaction measures from HCAHPS are reported as ordinal data (e.g., percentage of patients rating the hospital as a 0–6, 7–8, or 9–10), which limits quantitative discrimination among outliers in performance.

Excluding hospitals performing within the expected range of the national average for 30-day HWR has some weaknesses, including limiting the ability to evaluate a potential correlation between Facebook ratings and readmission rates across a representative sample of all hospitals. However, as a pilot study, the present evaluation of potential differences among outliers in performance may inform and direct a more comprehensive evaluation of Facebook ratings for a larger sample of hospitals that includes additional quality and satisfaction indicators.

User-generated feedback on Facebook may be biased and not reflective of patient experiences, and it could also be subject to fraud. Further, users providing feedback may not represent patients or families who received care at a particular institution. Individuals who use social media are also not necessarily representative of the overall population, since the elderly and ethnic minorities are underrepresented in Internet use.40 Moreover, Facebook ratings do not provide a time scale, so a critical mass of older ratings may drown out more recent ratings that better reflect current quality or satisfaction. Lastly, while the difference in Facebook ratings between the high- and low-HWR groups was statistically significant, how meaningful the difference is remains unclear and may warrant further study.

CONCLUSIONS

The use of social media, particularly Facebook, is prevalent among U.S. hospitals. Further, hospitals with lower 30-day hospital-wide unplanned readmission rates had higher ratings on Facebook than hospitals with higher readmission rates. The potential impact of social media ratings on healthcare consumer decision-making should not be underestimated in a changing healthcare environment with increased attention to cost and quality.