Background Two strategies for rating the quality of nursing homes (NHs) in the USA are exemplified by the Nursing Home Compare (NHC) website, launched by the federal Centers for Medicare and Medicaid Services in 1998, and Yelp.com, an online consumer review site that has grown in popularity since its founding in 2004. Both sites feature a 5-star rating system. While much is known about NHC ratings, little is known about NH Yelp ratings. This study examines Yelp ratings for NHs in California and compares these ratings with NHC ratings. Understanding how these ratings relate can inform efforts to empower consumers and enhance NH decision-making.
Methods We collected NHC and Yelp ratings for all California NHs between September and November 2016. For each NH with a Yelp review, we recorded the star rating and the total number of reviews. We also recorded the NHC 5-star rating and NHC ratings for inspections, staffing and quality measures (QMs). We conducted descriptive statistics and frequencies for these variables. We conducted correlations and tested means to compare the ratings.
Results A total of 675 NHs (68.1% of the total sample of 991 NHs) had both Yelp ratings and NHC ratings. Correlations between the Yelp and NHC ratings were relatively weak. The Yelp rating was significantly lower than the 5-star NHC rating and the NHC ratings for staffing and QMs. It was significantly higher than the NHC inspection rating.
Conclusions This study found that when consumers rate NHs on Yelp, their ratings differ considerably from NHC ratings, a finding in keeping with similar studies of NH quality measurement. Further research is needed to analyse the content of Yelp reviews and compare the themes and topics consumers identify with those reported on NHC.
- healthcare quality improvement
- nursing homes
- patient-centred care
- patient satisfaction
Two strategies for rating the quality of nursing homes (NHs) in the USA have developed over the past two decades. The first is best exemplified by the Nursing Home Compare (NHC) website, launched by the federal Centers for Medicare and Medicaid Services (CMS) in 1998. The second is best exemplified by Yelp.com, an online consumer review site that has grown in popularity since its founding in 2004. Over the years, the two sites have evolved such that they now feature a similar 5-star rating system, but the systems are based on data collected from very different sources.
NHC was designed to help consumers compare and make informed decisions about NHs. Early versions reported numerous care quality measures (QMs) for virtually every NH in the nation. The QMs, derived from resident assessment information collected by NH staff, largely captured resident changes in physical, functional, psychosocial and cognitive well-being. The site drew scant attention from consumers, in part because users found it difficult to understand the various measures.1 In 2009, NHC introduced its more intuitive 5-star rating system. It also added new metrics for health inspections, which are conducted by independent state surveyors, and staffing levels, which NHs report to CMS. Presently, NHC summarises all measures in a single NHC 5-star rating. This rating itself comprises 5-star ratings for the site’s three quality domains: health inspections, staffing levels and QMs.2
Like NHC, Yelp also features 5-star ratings. The site allows consumers to post narrative reviews of businesses they have used and to rate these organisations on a 5-star scale. In addition to the individual ratings, Yelp also reports an average, summary 5-star rating of all reviews for each business. Yelp reviews generally describe a consumer’s personal experience of a particular business or service. Research in hospitals has found that Yelp reviewers tend to focus on subjective experiences of healthcare, such as their personal assessments of staff attitudes, the physical setting and the cost of care.3
Consumers increasingly are using Yelp to review and choose healthcare providers, while NHC still struggles to attract consumer attention. Between 2008 and 2016, the cumulative number of healthcare-related reviews on Yelp jumped from 160 000 to 7.26 million (Yelp does not categorise healthcare providers by setting).4 That number is expected to continue rising because Yelp recently directed additional resources to this review sector. In 2015, it partnered with ProPublica, an investigative journalism company, to incorporate healthcare statistics into the Yelp business pages of healthcare providers, including NHs.5 These additional data are intended to further empower consumers to make informed decisions.5
NHC has not seen similar growth in use. Although families report using the internet to obtain information about NHs, they continue to report that they are not aware of the NHC website. Konetzka and Perraillon,1 for instance, found that none of 65 caregivers recognised the NHC site when shown it, although many had used the internet to research NHs for their loved one. Similarly, Schapira et al 6 found that less than a quarter of their 35 study participants (caregivers and NH residents) consulted the NHC site, although half sought information about NHs online.
Do NHC and Yelp ratings point consumers to the same NHs? Despite substantial differences in their data sources (differences in who reports what, when, where and how), it is unclear whether the two systems ultimately yield similar or different NH ratings.
On the one hand, one might reasonably assume that the NH attributes that residents and families reportedly care about—for example, responsive staff—will relate positively to the NH attributes that NHC assesses (eg, pressure sore prevention), even if the two sets of attributes differ from each other. Indeed, the NH culture change movement rests on such an assumption, for it posits that organisational changes aimed at improving residents’ quality of life will also improve their quality of care. This assumption is supported by studies showing that NH culture change adopters have fewer survey deficiencies than non-adopters.7
On the other hand, recent studies have consistently found that residents’ and their family members’ NH ratings do not correlate well with NHC 5-star ratings. The US General Accounting Office, for instance, found that NHs with higher overall star ratings did not have higher resident satisfaction scores or fewer complaints.8 Another study found a high level of inconsistency between NHC overall star ratings and consumer satisfaction scores.9 A study of Facebook ratings for NHs in two states found the Facebook rates were not significantly correlated with the NHC 5-star rating or survey-based resident and family satisfaction ratings.10
While much is known about NHC ratings, little is known about NH Yelp ratings. This is unfortunate given the growing popularity of Yelp.com as a source for consumer information about healthcare providers. This study examines Yelp ratings for NHs in California. It also compares ratings from both NHC and Yelp. Understanding how these ratings relate can inform efforts to empower consumers and enhance NH decision-making.
We collected data from several sources to conduct cross-sectional analyses comparing NH Yelp ratings with NHC ratings.
We used data from the California Office of Statewide Health Planning and Development (OSHPD) to identify all NHs in the state. Long-term care facilities, including skilled nursing facilities, group (or congregate) living facilities and stand-alone hospices, submit annual utilisation reports to OSHPD, which then compiles them into a complete data set.11 We used the 2014 data set (data for 2015 had not yet been released). These data provided the initial list of licensed NHs, for which we later retrieved additional information. We excluded group living facilities and stand-alone hospices from our analysis. Other information retrieved from this data set included each NH’s reported occupancy rate and ownership status.
We also collected NH data from Yelp.com. Although other websites report consumer reviews of NHs, we focused on Yelp.com because, as noted earlier, it is among the most popular consumer review websites in the nation, and by some measures the most popular.12 Between September and November of 2016, we searched on Yelp for each NH on our list (not all NHs are reviewed on Yelp). For each NH found, Yelp reported one overall rating, ranging from a low of 1 to a high of 5, that represents the average score across all consumer reviews posted for that NH. This average score is rounded to the nearest half-point. For each reviewed NH, we recorded the overall 5-star rating and the total number of reviews.
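The rounding behaviour described above can be sketched as follows. This is an illustrative approximation only, not Yelp's actual code, and the function name is our own:

```python
def yelp_display_rating(review_scores):
    """Average the individual star scores and round to the nearest
    half-point, as Yelp's summary rating is described above.
    Illustrative sketch only, not Yelp's actual implementation."""
    mean = sum(review_scores) / len(review_scores)
    return round(mean * 2) / 2  # e.g. a mean of 4.33 displays as 4.5
```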
We also collected ratings from NHC for each NH. Two NHC summary ratings—for QMs and staffing levels—are derived from data self-reported by nursing homes. The QM rating is a composite of 16 QMs (eg, percentage of patients reporting pain, exhibiting depressive symptoms, receiving an influenza shot) derived from NH-reported clinical data.2 The staff rating takes into account registered nurse (RN) hours per resident per day and the number of staffing hours per resident per day. Staffing hours include hours worked by RNs, licensed practical nurses or licensed vocational nurses, and certified nurse aides. The inspection rating is not self-reported by NHs, but is determined by state surveyors, who conduct annual health and fire safety inspections of NHs. This rating is based on the three most recent annual inspections and any inspections that occurred in the last 3 years because of complaints. More emphasis is placed on recent inspections.2 The overall NHC 5-star rating for each NH takes into account all three subscores using methods previously published by the CMS.2 Rather than report an average of the three subscores, the NHC 5-star rating starts with the inspection rating and adds or subtracts stars depending on the two other subscores.
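The stepwise composition described above, in which the overall rating starts from the inspection rating and is adjusted by the staffing and QM subscores, can be sketched as follows. The specific adjustment thresholds below are hypothetical simplifications, shown only to illustrate the structure; the exact rules are published by CMS.2

```python
def overall_nhc_rating(inspection, staffing, qm):
    """Simplified sketch of the NHC 5-star composite.

    Starts from the inspection rating and adds or subtracts stars
    based on the staffing and quality-measure (QM) subscores. The
    thresholds here are hypothetical stand-ins for the published
    CMS rules, included only to illustrate the structure.
    """
    rating = inspection
    if staffing >= 4 and staffing > inspection:
        rating += 1   # strong staffing can add a star
    elif staffing == 1:
        rating -= 1   # the lowest staffing score removes one
    if qm == 5:
        rating += 1   # a top QM score can add a star
    elif qm == 1:
        rating -= 1   # the lowest QM score removes one
    return max(1, min(5, rating))  # clamp to the 1-5 star scale
```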
We conducted descriptive statistics and frequencies for all NH variables, including the NHC and Yelp ratings. We compared NHs with Yelp ratings with those without ratings using a Mann-Whitney U test for continuous variables (ie, NHC ratings, number of occupied beds) and Pearson Χ2 test for categorical variables (ie, reviewed on Yelp, ownership status). We used the Wilcoxon signed-rank test for differences between Yelp and all NHC ratings, and a two-tailed Pearson correlation to compare Yelp ratings with each NHC rating (ie, overall, QM, health inspection and staffing). We cross-tabulated Yelp and NHC 5-star ratings to determine the percentage of NHs for which both ratings matched. For this analysis, Yelp ratings with half-point scores (eg, 3.5) were rounded up to achieve comparable data points between the two rating systems.
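As a minimal sketch, the core paired comparisons described above can be run with scipy. The data below are synthetic stand-ins, and the variable names are our own:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-ins for paired ratings of the same 100 NHs:
# Yelp summary ratings come in half-point steps (1.0, 1.5, ..., 5.0),
# NHC ratings in whole stars (1-5).
yelp = rng.integers(2, 11, size=100) / 2
nhc = rng.integers(1, 6, size=100).astype(float)

# Wilcoxon signed-rank test for paired differences between the systems
w_stat, w_p = stats.wilcoxon(yelp, nhc)

# Two-tailed Pearson correlation between the two ratings
r, r_p = stats.pearsonr(yelp, nhc)

# Round half-point Yelp scores up before cross-tabulation (3.5 -> 4)
yelp_whole = np.ceil(yelp)
match_rate = np.mean(yelp_whole == nhc)  # share of NHs where ratings agree
```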
We identified 1092 NHs certified by Medicare and/or Medicaid (Medi-Cal in California) in California’s 2014 long-term-care facility utilisation report. For our total sample, we included only NHs with complete data for all CMS measures. Of the 1092 NHs in the state file, 991 (90.8%) had complete data for CMS measures (overall, inspection, staffing, QM). Their number of occupied beds ranged from 14 to 365 (M=88.1; SD=43.8). Their NHC 5-star rating ranged from 1 to 5 (M=3.6; SD=1.3). Most (88.8%) were for-profit facilities.
NHs with Yelp reviews
Of the 991 NHs with complete CMS data, 675 (68.1%) were reviewed on Yelp. Occupied beds for these NHs ranged from 14 to 365 (M=94.5; SD=45.0). NHC 5-star ratings ranged from 1 to 5 (M=3.6; SD=1.3). Most were for-profit NHs (89.5%).
There were no significant differences between NHs with Yelp reviews and those without with respect to NHC 5-star ratings (P=0.92) and ownership status (χ2=0.991; P=0.32). However, NHs with Yelp reviews reported significantly more occupied beds (M=94) than NHs without (M=75.5; P<0.001).
The numbers of reviews per NH ranged from 1 to 57, with a median of 5 (M=7.0; SD=6.7). Yelp 5-star ratings for reviewed NHs ranged from 1 to 5 (M=3.1; SD=1.2).
We used five reviews (the median number of Yelp reviews for this sample) as the minimum number of Yelp reviews needed for a meaningful comparison with NHC ratings, an approach consistent with a study of Yelp hospital reviews10 and in keeping with a requirement among most public reporting sites that each measure must include a minimum number of eligible patients to be reported. We conducted sensitivity analyses by rerunning correlations with the NHC ratings after increasing the number of required Yelp reviews (to nine or more). Results varied little. Thus, we report results for just the correlations using five or more Yelp reviews. Of the 675 NHs with Yelp reviews, 354 (52.4%) had five or more Yelp reviews.
Table 1 reports the descriptive statistics for the Yelp and NHC ratings for NHs with five or more Yelp reviews. Ranges were the same for all ratings (1–5). The Yelp mean, median and mode ratings were lower than all comparable NHC ratings, with the exception of the NHC inspection ratings, for which the mean (2.6) and mode (2.0) ratings were lower than the comparable Yelp ratings (3.0 and 3.0, respectively).
Table 2 reports the frequencies for the Yelp and NHC ratings for NHs with five or more Yelp reviews. More than half (57.9%–84.2%) of NHC 5-star, staff and QM ratings, all of which are self-reported by NHs in whole or in part, are 4s and 5s. By comparison, 40.4% of Yelp ratings and 26.0% of NHC inspection ratings, which are independently assessed by state surveyors, were 4s and 5s.
Comparisons of Yelp and NHC scores
A Wilcoxon signed-rank test showed the Yelp rating was significantly lower than (1) the NHC 5-star rating (P<0.001); (2) the staff rating (P<0.001); and (3) the QM rating (P<0.001). It was significantly higher than the inspection rating (P<0.001). Table 3 presents correlations between Yelp ratings and each NHC rating. While all but the staff rating correlation were statistically significant, all correlations were relatively weak (r=0.09–0.32). We also conducted a side-by-side comparison of Yelp and NHC 5-star ratings for each NH. The Yelp and NHC 5-star ratings were the same for 156 NHs (23.1% of NHs with 5+ Yelp reviews). The Yelp rating was greater than the NHC 5-star rating for 207 NHs (30.7%), and was less than the NHC 5-star rating for 312 NHs (46.2%).
This study sheds light on how NHs are rated on Yelp, one of the most popular online review sites. It also provides insight into how Yelp ratings compare with NHC ratings.
We found that about two of every three NHs in California had at least one Yelp review, with an average of seven reviews. These findings are in keeping with prevalence trends for online reviews of healthcare providers. While more consumers are reviewing these providers on Yelp, such reviews account for just 6% of all reviews on the site.12 13 Thus, there is both growth in this review segment and room for more.
We also found that consumers posting Yelp reviews rate NHs differently than NHC rates the same facilities. The correlations between the Yelp rating and the NHC 5-star rating and each NHC subscore were relatively weak. Additionally, the Yelp rating was significantly lower than the NHC 5-star rating and the ratings for QMs and staffing. It was significantly higher than the inspection rating. Finally, a conservative calculation found that an NH’s Yelp rating equalled its NHC 5-star rating just 23% of the time. These findings differ from results reported in similar studies of Yelp hospital ratings. In these studies, Yelp ratings were positively associated with ratings on the Hospital Consumer Assessment of Healthcare Providers and Systems survey, which CMS uses to evaluate patients’ experiences following hospitalisation.14 15 Our findings are in keeping, however, with the previously described studies reporting discrepancies between NHC and consumer ratings of NHs.8–10 Several explanations may account for why we found differences between Yelp and NHC ratings.
Differences in content
Yelp consumers likely evaluate different aspects of NH care than NHC evaluates. As previously discussed, research involving hospitals has shown that Yelp reviewers tend to focus on their subjective experiences of healthcare, in contrast to the standardised assessments of clinical, staffing and inspection attributes that NHC reports.15 More research is needed to identify the topics addressed in Yelp reviews of NHs.
Differences in data sources
Our findings also may be explained by differences in information sources. NHC ratings are derived from resident assessment, staffing and inspection data. By contrast, it is likely that family members, not residents, write the bulk of NH Yelp reviews. Previous research suggests residents and family members rate NHs differently, with family members’ ratings aligning more closely with NHC 5-star ratings.9
Differences due to gaming
A final explanation concerns gaming, whereby ratings are artificially manipulated. As discussed below, both NHC and Yelp have faced charges of gamed ratings.
Estimates of faked reviews and ratings on Yelp range from 16% to 40%.16 17 The company, however, filters out many of these reviews using automated software as well as alerts from Yelp users. Perhaps as a result of this vetting, users say they trust Yelp more than other online review sites or traditional advertisers.18 Users also appear to trust their own ability to filter out exaggerated or unreliable reviews.19
Worth noting is that gamed Yelp ratings may occupy either or both extremes of the 5-point rating scale. Businesses, for instance, reportedly have tried posting fake 1-star reviews on their competitors’ Yelp pages, and fake 5-star reviews on their own.20 While both practices are lamentable, because they run in both directions (and may partly cancel each other out), few systematic biases favouring one direction over the other have been detected in Yelp ratings.20
While some researchers have raised concern that reviews of NHs on Yelp or Facebook may direct consumers to NHs that score poorly on ‘objective’ measures of NH quality, such as those reported on NHC, we raise an opposing concern.10 Several studies have found evidence in NHC ratings of potential systematic bias in favour of inflated ratings.21–23 As background, bias may arise because some NHC measures are self-reported by NHs (ie, the QMs and staffing measures), thus allowing for possible manipulation of the data.21 Some NHs, for instance, reportedly have increased their staffing rating by adding workers just before their facility’s annual federal inspection.21
A 2014 investigation by the New York Times (NYT) reported that when the NHC 5-star rating system started in 2009, 37% of NHs received 4- or 5-star ratings; by 2013, nearly half did.21 The NYT article, titled ‘Medicare Star Ratings Allow Nursing Homes to Game the System’, reported that even NHs with a history of providing poor care achieved high scores on measures derived from self-reported data.21 A 2014 report found similar results.23 A 2016 analysis concluded that the marked increase between 2009 and 2013 in NH ratings for staffing and QMs was due to purposefully inflated data reports.24 The researchers also found that NH profits increased with higher ratings. Thus, they concluded that the rating system provides a financial incentive for gaming.24
In 2014, CMS modified its NHC rating system in an attempt to discourage gaming. Among other changes, CMS implemented a staff reporting system that is ‘auditable back to payrolls to verify staffing information’.25 NHs, however, continue to self-report their QM and staff data. A 2016 analysis of NHC data for NHs with particularly poor ratings (called Special Focus Facilities) found evidence that high scores (mostly 4s and 5s) in these facilities’ QMs and staff ratings boosted their NHC 5-star rating, a finding the authors concluded ‘indicates that data manipulation continues’.22
If California NHs gamed some of the data they reported to CMS, that practice could explain why Yelp ratings in our study were significantly lower than the NHC ratings for QMs and staffing, which are derived from data self-reported by NHs.
There are limitations to this study. We collected data for only California NHs. NHC and Yelp ratings may vary by state, and so our results may not be generalisable to other states. That noted, we are aware of no evidence that such variation occurs. We also included in our sample only NHs with both Yelp and CMS ratings. Because not all NHs have Yelp ratings, our sample is neither random nor necessarily representative of all California NHs. Our analysis found, however, that NHs with Yelp ratings differed from NHs without Yelp ratings only in that they reported more occupied beds. This difference may be due more to the fact that NHs serving more individuals have more customers who can rate them on Yelp than to any care quality attribute distinguishing larger NHs from smaller NHs. Finally, as we discussed earlier, each data source used in this study—NHC and Yelp—has limitations. However, these are two of the most predominant sources of NH information available and—according to our findings—are used by consumers. Thus, they should not be disregarded. Rather, acknowledgement of these limitations is what drives the search for improvement. Our study results highlight the need for further research to improve and strengthen the rating systems used by NH consumers.
Choosing an NH for oneself or a loved one is a complex, challenging, often emotionally charged task. To help NH consumers make informed decisions, CMS launched an online rating system—NHC—in 1998. Nearly 20 years later, many NH consumers still report they are unaware of the ratings website.1 6 26 Meanwhile, online review sites, and in particular Yelp.com, have grown increasingly popular as places for consumers to express their views about myriad healthcare providers.13
This study found that when consumers rate NHs on Yelp, their ratings differ considerably from NHC ratings, a finding in keeping with similar studies of NH quality measurement. Further research is needed to analyse the content of Yelp reviews and compare the themes and topics consumers identify with those reported on NHC. Longer term, as research into social media reviews of NHs advances, consideration should be given to analysing data across consumer review websites (eg, Yelp, Facebook, Twitter and others) and testing whether the amalgamated results can help identify high-risk NHs for onsite inspections, much as Griffiths and Leaver27 recently proposed for use in English hospitals.
Handling editor Kaveh G Shojania
Competing interests None declared.
Ethics approval This study is exempt from institutional review board approval because the research did not use human subjects data.
Provenance and peer review Not commissioned; externally peer reviewed.