Nursing home Facebook reviews: who has them, and how do they relate to other measures of quality and experience?
Jennifer Gaudet Hefele,1 Yue Li,2 Lauren Campbell,2 Adrita Barooah,1 Joyce Wang1

1 Department of Gerontology, University of Massachusetts Boston, Boston, Massachusetts, USA
2 Department of Public Health Sciences, University of Rochester Medical Center, Rochester, New York, USA

Correspondence to Jennifer Gaudet Hefele, Department of Gerontology, University of Massachusetts Boston, Boston, Massachusetts 02125, USA; jennifer.hefele{at}


Background The growing use of social media creates opportunities for patients and families to provide feedback and rate individual healthcare providers. Whereas previous studies have examined this emerging trend in hospital and physician settings, little is known about user ratings of nursing homes (NHs) and how these ratings relate to other measures of quality.

Objective To examine the relationship between Facebook user-generated NH ratings and other measures of NH satisfaction/experience and quality.

Methods This study compared Facebook user ratings of NHs in Maryland (n=225) and Minnesota (n=335) to resident/family satisfaction/experience survey ratings and the Centers for Medicare & Medicaid Services (CMS) 5-star NH report card ratings.

Results Overall, 55 NHs in Maryland had an official Facebook page, of which 35 provided the opportunity for users to rate care in the facility. In Minnesota, 126 NHs had a Facebook page, of which 78 allowed for user ratings. NHs with higher aide staffing levels, those not affiliated with a chain and those located in higher income counties were more likely to have a Facebook page. Facebook ratings were not significantly correlated with the CMS 5-star rating or survey-based resident/family satisfaction ratings.

Conclusions Given the disconnect between Facebook ratings and other, more scientifically grounded measures of quality, concerns about the validity and use of social media ratings are warranted. However, it is likely consumers will increasingly turn to social media ratings of NHs, given the lack of consumer perspective on most state and federal report card sites. Thus, social media ratings may present a unique opportunity for healthcare report cards to capture real-time consumer voice.

  • nursing homes
  • patient satisfaction
  • report cards


The shift in focus to a more person-centred approach in nursing homes (NHs) increases the need for documenting first-hand evaluations of care and experience, information that is difficult, if not impossible, to convey through standard quality measures. The national report card for NHs, Nursing Home Compare (NHC), does not include measures of patient experience. Just a small handful of state-based report cards include resident/family satisfaction and experience measures, leaving the overwhelming majority of consumers in the dark about resident experience in NHs.1 Additionally, traditional survey methods are not administered in real time, and there can be substantial time lags between when the experience occurred, when the person is surveyed and when the results are reported.2 3 Thus, existing efforts may not be as informative as consumers would like.

An innovative approach to capturing patient/resident experience is through the use of social media.2 4 Social media platforms share a common focus on social interaction and a two-way exchange of information and feedback. Platforms like Facebook, Twitter and Yelp allow users to provide feedback or comment in real time on their experiences with all kinds of businesses, products and service providers. Social media has been increasingly used by businesses such as restaurants,5 non-profits such as Red Cross6 7 and even by the government.8 Healthcare providers have also entered these waters and numerous hospitals now use Facebook, Twitter, Yelp and Foursquare.3 4 9 10 Indeed, a recent study found that approximately 94% of hospitals in the USA had a Facebook page.10 The potential impact of these ratings is unknown, but previous research suggests social media ratings like these do indeed impact consumer choice.5 11

Relying on social media sites for user-generated ratings of healthcare providers raises several questions, including the representativeness of results and whether social media ratings are related to other measures of quality, like experience survey results and clinical quality. Several other studies have examined these questions in the context of hospital care, and results have been mixed; several have found positive associations between user-generated ratings and hospital quality,9 12–14 while others have found no or only weak associations.3 15 However, these questions remain unanswered within the context of NH care in the USA. They are particularly salient in NH care, as quality problems are widespread, long-documented and can severely impact residents’ lives.16 If social media ratings direct consumers to poor-quality NHs, a great deal is at stake, as individuals live out all aspects of their lives within the NH and transition to another facility is difficult and often dangerous for residents.17–20 Our study seeks to address some of these gaps. Specifically, we seek to (1) identify which NHs are more or less likely to have a Facebook page; and (2) determine whether Facebook user-generated ratings correlate with other measures of NH quality.

Methods

We combined data from several sources to conduct cross-sectional analyses in order to examine the relationship between Facebook adoption and user ratings, NH characteristics, and other measures of quality and experience.

Data sources

Nursing Home Compare

NHC is a publicly available online report card site that provides information on all Medicare and Medicaid certified NHs in the USA. From this data set we used variables related to NH quality and facility characteristics (see table 1 for a listing and description of variables and sources).

Table 1

List of key variables and their sources

Facebook data

Since 2013, Facebook business pages have allowed user-generated ratings and reviews on the page using a 5-star rating system and an open-ended commenting field. Although most people use Facebook primarily as a way to connect with their social network, rating businesses and reading reviews within such a context have great potential, given that Facebook is the most popular social media platform and one that most users visit daily.21 Further, one-third of Facebook users are aged 45 and older, which falls within the range of the most likely seekers of NH care, whether for themselves or a loved one.21

In Fall 2015, we collected data on Facebook ratings of NHs in two states: Maryland and Minnesota. These states were chosen because we had access to NH resident experience survey data (described below). Using the list of NHs obtained from the NHC data, we located Facebook pages through Facebook and Google searches. We verified names and addresses, and included only publicly available Facebook pages officially associated with the business. If clicking on an NH’s Facebook page redirected us to a parent company’s Facebook page, we did not collect data from that page. From the Facebook pages, we collected information on the number of times an NH was rated and the average number of stars it received.

Experience data

The Minnesota experience data come from standardised face-to-face interviews and surveys with residents. These are conducted by an independent research firm hired by the state. The purpose of the interviews and surveys is to collect and publicly report information on resident experience around quality of life, as well as report results back to the NHs for awareness and quality improvement. Experience survey results have been used in the state’s performance incentive models as well.22–24 All NHs in the state are included in the sample, and respondents include both long-stay and short-stay residents. The tool was developed by the University of Minnesota and is described in full elsewhere.25 26 There are a total of 11 broad domains and one additional domain on mood. The Cronbach’s alpha scores of the domains range from 0.52 to 0.77,27 and the scales had good concurrent validity. In our analyses, we examined both the overall experience score and the individual domain scores.

The Maryland experience data come from the Maryland Nursing Facility Family Survey, which has been conducted annually by the state since 2005. All NHs in Maryland that have had any residents with 100 or more days of stay are part of the survey process, and respondents are limited to family members with a resident having at least 100 days of stay. The survey contains two overall experience measures (rating of overall care and likelihood of recommending to others), as well as 17 items related to staff and administration, care, food and meals, autonomy and residents’ rights, and the physical environment. Both the composite scores and the individual items have been found to have a high internal consistency and concurrent validity.28 29 We use both the overall and component experience measures in our analyses.

Online Survey Certification and Reporting System

The Online Survey Certification and Reporting System (OSCAR) data set contains information collected during the federally mandated, state-implemented on-site inspections of all certified NHs conducted approximately annually. We used several variables from the OSCAR to describe NHs and their markets.

Area Health Resource File

The Area Health Resource File (AHRF) is a publicly available data set that compiles health data from several sources, including the US Census, the American Hospital Association and the Bureau of Labor Statistics. We used AHRF to characterise the counties in which NHs are located.

Analysis

We merged all data sets on provider identification number to construct the analytic file. We excluded hospital-based facilities from analyses, as these types of facilities operate in very different ways from other NHs. We examined descriptive statistics and distributions of all key variables. To identify which NHs were more likely to have a Facebook page, we pooled data from both states and ran a logit regression to identify which key independent variables increased the probability that an NH would have a Facebook page, controlling for unobserved differences between states. To determine whether Facebook ratings of NHs are associated with other measures of satisfaction/experience, we conducted correlation analysis using Spearman’s rank correlation coefficient between the Facebook rating and the experience ratings. Correlation analysis was chosen over regression analysis because of the small number of NHs in each state that allow users to provide Facebook ratings. In the correlation analysis, we limited the sample to NHs with five or more Facebook reviews, thus eliminating facilities with very few reviews. To determine whether Facebook ratings are related to other measures of quality, we pooled the data from the two states and ran logit models to estimate the association between Facebook ratings and the overall NHC 5-star rating, controlling for other factors related to quality. This analysis was also limited to NHs with five or more Facebook ratings and used a generalised linear model with frequency weights to adjust for the number of ratings.
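As an illustration, the correlation step described above can be sketched as follows. This is a hedged reconstruction, not the authors' code: the column names (`fb_rating`, `fb_n_reviews`, `experience`) and the toy values are hypothetical, and only the five-or-more-reviews restriction and the use of Spearman's rank correlation come from the text.

```python
# Sketch of the correlation step: Spearman's rank correlation between
# Facebook ratings and an experience score, restricted to facilities
# with five or more Facebook reviews (column names are hypothetical).
import pandas as pd
from scipy.stats import spearmanr

nh = pd.DataFrame({
    "fb_rating":    [4.8, 4.2, 5.0, 3.9, 4.6, 4.1],        # mean Facebook stars
    "fb_n_reviews": [12,  7,   3,   9,   15,  6],           # number of reviews
    "experience":   [0.84, 0.79, 0.91, 0.80, 0.86, 0.77],   # survey score
})

# Limit to NHs with five or more Facebook reviews, as in the paper
eligible = nh[nh["fb_n_reviews"] >= 5]

rho, p_value = spearmanr(eligible["fb_rating"], eligible["experience"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
```

Spearman's coefficient operates on ranks rather than raw values, which suits ordinal star ratings and is robust to the high skew reported later in the Results.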

For these analyses, we standardised some of the variables for ease of interpretation. We recoded the per cent Medicare-pay and per cent Medicaid-pay variables to represent 10% increases in the variables, and we standardised the median household income into $1000 increments. We recoded the Herfindahl-Hirschman Index (HHI) as (1−HHI)×100. In the regression models, we recoded the star ratings into binary variables. Specifically, the NHC 5-star dependent variable was recoded such that 1=the NH received 5 stars and 0=the NH received 1, 2, 3 or 4 stars; the Facebook ratings independent variable was recoded as 1=the NH received 5 stars and 0=the NH received 1, 2, 3 or 4 stars. This was done because the distributions of both ratings variables skew high. Additionally, we wanted to be able to distinguish between ‘good’ and ‘bad’ as opposed to identifying distinctions between the effects of 3 and 4 stars, for example. However, we tested alternate specifications of both the NHC and Facebook ratings variables as well. We also combined the nurse staffing variables into one variable (total nurse staffing), as we were unable to obtain stable estimates, likely due to the small sample size. Analyses were performed using Stata V.11 SE.
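The recoding steps above can be sketched as follows. The column names and values are illustrative assumptions, since the authors' actual dataset layout is not reported; the transformations themselves ((1−HHI)×100, $1000 income increments, and the binary 5-star recodes) come from the text.

```python
# Illustrative sketch of the variable recodes described in the Methods
# (hypothetical column names and values).
import pandas as pd

df = pd.DataFrame({
    "hhi":           [0.15, 0.60, 0.32],     # Herfindahl-Hirschman Index, 0-1
    "median_income": [54000, 71000, 48500],  # county median household income, $
    "nhc_stars":     [5, 3, 4],              # NHC overall 5-star rating
    "fb_stars":      [5, 4, 5],              # Facebook rating, rounded to stars
})

# Competition: recode HHI as (1 - HHI) * 100 so higher = more competitive market
df["competition"] = (1 - df["hhi"]) * 100

# Median household income in $1,000 increments for easier OR interpretation
df["income_1k"] = df["median_income"] / 1000

# Binary star recodes: 1 = the NH received 5 stars, 0 = received 1-4 stars
df["nhc_top"] = (df["nhc_stars"] == 5).astype(int)
df["fb_top"] = (df["fb_stars"] == 5).astype(int)
```

Dichotomising at 5 stars, as the authors explain, separates 'good' from 'bad' facilities rather than estimating separate effects for each star level in a highly skewed distribution.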

Results

Table 2 describes the sample of Minnesota and Maryland NHs used in this study. The data show that NHs in the two states differ along several dimensions. Minnesota NHs tend to be smaller, non-profit and have lower proportions of residents who are Medicare-pay and Medicaid-pay. Maryland NHs have more inspection deficiencies and are more often located in urban counties and counties with younger populations. Minnesota NHs tend to have higher levels of certified nursing assistant (CNA) and registered nurse staffing, whereas Maryland NHs tend to have higher levels of licensed practical nurse staffing.

Table 2

Descriptive statistics for independent variables

Of the 549 NHs across the two states, we found one-third (181) to have a Facebook page. Of those NHs with a Facebook page, the majority (62%) offered users the opportunity to provide a rating. The proportions differed slightly between the two states: 38% of Minnesota NHs had a Facebook page (126 of 328 facilities), whereas 25% of Maryland NHs did (55 of 221 facilities). Further, 60% of the Minnesota facilities with a Facebook page allowed users to provide ratings, as did 69% of the Maryland facilities with a Facebook page.

The mean Facebook rating for those NHs that had them was 4.49, with no significant difference between the two states. Of those NHs across both states with Facebook ratings, the distribution of ratings was skewed high: 0.9% and 0.0% had 1 and 2 stars, respectively, whereas 13.3%, 54.9% and 31% had 3, 4 and 5 stars, respectively. Similarly, the NHC ratings are skewed high: the mean NHC rating was 3.61 stars out of 5, with no significant difference between the two states. Across all NHs, the distribution of NHC ratings, in order from 1 through 5 stars, was 5.2%, 19%, 18.1%, 25.8% and 32%.

Table 3 provides descriptions of the satisfaction/experience survey results from Minnesota and Maryland. Most survey items are rated highly in both states. The overall proportion of positive responses across all items in the Minnesota survey is 0.826, indicating that, on average, respondents provide a ‘yes’ answer to nearly 83% of survey items. In Maryland, 88.9% of respondents report that they would recommend their facility to others.

Table 3

Mean, median and percentiles for measures of experience/satisfaction from Minnesota and Maryland

Our regression analysis of factors associated with having a Facebook page indicates that NHs with greater CNA staffing were more likely to have a Facebook page (OR 2.005, p=0.004; see table 4). NHs located in more competitive markets were also more likely to have a Facebook page (OR 1.018, p=0.021). NH payer-mix also appears to have an effect on likelihood of having a Facebook page; a 10% increase in the proportion of Medicaid residents is associated with a 2% decrease in likelihood (OR 0.98, p=0.004). We also see that a 10% increase in the proportion of Medicare residents is associated with a 2% decrease in likelihood of having a Facebook page (OR 0.979, p=0.059), with marginal significance. Chain affiliation decreased the likelihood of having a Facebook page (OR 0.404, p=0.001), as did being a for-profit facility (OR 0.636, p=0.065), although this estimate was marginally significant. NHs located in Maryland were less likely to have a Facebook page (OR 0.471, p=0.014). Our results also show that having a Facebook page is not associated with quality, as the coefficients on the number of deficiencies and the indicator for being in the highest tier of NHC ratings (5 stars) are insignificant. Results were consistent when using the NHC rating as a five-level categorical variable.

Table 4

Logistic regression results to identify factors associated with a greater likelihood of having a Facebook page, Minnesota and Maryland (n=535)

In examining Facebook ratings, we saw that the average number of ratings was 10 per facility in both Minnesota and Maryland, with a mean rating of 4.5 (on a scale of 1–5, with 5 being the best) in both states. Our correlation analysis between Facebook ratings and other measures of quality shows essentially no meaningful relationships (see table 5). The association between Facebook ratings and the overall satisfaction score measured in the Minnesota survey, although statistically significant, can be considered a negligible relationship.30 There were no significant relationships found between Facebook ratings and Maryland survey items.

Table 5

Correlations between Facebook ratings and other measures of experience/satisfaction, Minnesota and Maryland

Our logit model to estimate the association between Facebook ratings and NHC ratings found no significant association: being a 5-star Facebook rated facility had no impact on being a 5-star NHC rated facility (see table 6). These findings were robust to alternate specifications of both the independent Facebook variable and the dependent NHC variable.

Table 6

Logit model to estimate the association between Facebook ratings and NHC star ratings in NHs with five or more Facebook reviews (dependent variable: 1=NH has 5 NHC stars, 0=NH has 1–4 stars) (model n=73)

Discussion

Summary and interpretation of results

Our study sought to identify NH characteristics associated with likelihood of having a Facebook page and to determine whether user-generated Facebook ratings were associated with other measures of NH quality. We found that indeed Facebook adoption varies across NH characteristics. This is in line with several previous studies of social media in healthcare. Three separate studies found that large, urban and non-profit hospitals were more likely to adopt social media.10 13 14 31 Our study similarly found that urban NHs were more likely to have a Facebook page and that for-profit homes were less likely to have one. However, we found no significant relationship with size.

Our findings suggest the likelihood of having a Facebook page may be related to the customer base of the NH. Specifically, it appears that NHs serving a more affluent customer base are more likely to have a Facebook page, as such NHs were less likely to be for-profit or chain-affiliated and had lower proportions of Medicare-pay and Medicaid-pay residents. Research on NHs consistently shows that non-profit facilities tend to serve more affluent customers and have higher proportions of private-pay and lower proportions of Medicaid-pay and Medicare-pay residents.32–35 Facebook may be a strategy these non-profit homes use to reach their more affluent customer base. Market competition may also help spur Facebook adoption among NHs, further suggesting the use of Facebook as part of an overall marketing strategy.

Our study found that the user-generated Facebook ratings were not in sync with other NH measures of quality and experience. This aligns with some previous studies of social media in healthcare: two studies of hospital Tweets found no or poor associations with traditional quality and patient experience measures.3 15 Perhaps the lack of an association between user-generated ratings and traditional clinical measures of quality is not surprising, given the lack of consistency in association between clinical quality and patient experience/satisfaction; whereas some studies have found no, poor or negative associations,36–39 others have found positive associations.40–44

On the other hand, several previous studies have found positive associations between user-generated ratings and hospital quality and experience.9 12–14 Our contradictory results may be due to the fact we examined NHs and not hospitals, or that our analysis was limited to two states, or that we examined Facebook ratings as opposed to Yelp ratings or Tweets. A larger study of user-generated ratings and NH quality is certainly warranted. However, there are no national data on NH experience available currently to use for comparison. While there is a survey tool for collecting resident experience (the Consumer Assessment of Healthcare Providers and Systems for NHs surveys) that was tested in a national sample, it has yet to be implemented on a national scale for reporting purposes.45

The inconsistency in findings may also be related to differences in propensity to adopt social media among healthcare providers, noted in this study and others as discussed above. These differences certainly can skew the results from one study to the next, as we are not examining the same sample across social media platforms or even time periods. Further, the bias in social media adoption may skew the results in favour of finding a relationship; Glover and colleagues12 found that hospitals with Facebook pages had lower readmission rates compared with those without Facebook pages. Thus, the sample used in that study is biased in favour of higher performing providers. We did not find an association between having a Facebook page and quality (ie, number of deficiencies or being a 5-star NH) in our study.

Significance of results

The results of this study are significant for several reasons. First, consumers are increasingly looking to social media to help shape their decision making, and it is likely that we will see this behaviour grow in the context of healthcare, even among middle-aged and older adults. The use of social networking sites among adults 50–64 years increased from 33% to 51% between 2010 and 2015; usage among adults 65 and older tripled from 11% to 35% in the same time period.46 47 The use of social media for healthcare specifically is fairly substantial, with as many as one-third of all consumers using social media for health-related discussions.48 This is in line with a 2006 survey of NH residents’ family members, in which 31% of respondents indicated they had used the internet in their NH selection process.49 Many patients and their families face a short time frame in finding and choosing an NH after a hospital stay, making it a challenge to conduct extensive research and in-person visits. Using the internet and social media ratings to learn about NH choices eases these difficulties. Of course, this depends on the availability of reviews on NHs within the consumer’s set of options, which can depend on bed supply and payment sources. Currently, self-pay and Medicare-pay consumers are likely to have a greater set of NH options and are thus more likely to find social media ratings useful; NH options for Medicaid-pay consumers are likely to be more limited because of Medicaid’s lower payment levels. However, as the use of social media among NH-aged persons and their families continues to grow, we are likely to see growth in both the generation of user reviews and the use of such ratings to support decisions regardless of bed supply or payer source.

Second, studies outside of healthcare show that social media ratings help to drive business.5 11 One examination of Yelp restaurant reviews found that every 1-star increase in the Yelp rating of a restaurant leads to a 5%–9% increase in revenues.5 It is not surprising that user reviews have such a big impact, as narratives can help support and clarify decision making.50 This is particularly true when given complex choices, like those in healthcare decisions.50 51 Thus, understanding how social media ratings of healthcare facilities relate to other, more objective measures of quality is important, given the potential for the user-generated ratings to drive consumer choice of providers. That our study found weak or absent associations with experience and clinical ratings of quality is concerning; the user-generated ratings may potentially encourage consumers to choose providers who rate poorly on the more objective measures of quality. Indeed, our data show that only 33% of NHs that receive 5 NHC overall stars also receive 5 Facebook stars, and conversely 20% of NHs with 1, 2 or 3 NHC overall stars receive 5 Facebook stars (results not shown). This suggests that reliance on Facebook ratings does not necessarily point consumers to high-tier facilities, as one would hope. As noted earlier, our analyses show no significant relationship between Facebook and NHC ratings. Choosing a poor-quality NH can have serious negative implications for residents, making it all the more important for user-generated ratings to consistently point consumers towards high-quality facilities.

Third, our findings on the relationship between social media ratings and traditional quality and experience ratings are particularly salient in the NH context. While hospital report card sites provide at least some information on patient experience, the consumer perspective is glaringly absent on NHC, the federal report card site. In the most recent examination, only six state-sponsored NH report card sites provided any sort of resident satisfaction information.1 Yet it appears that this is the sort of information that consumers want when looking for an NH; a recent study found that consumers want to hear about current and prior resident/family experiences when deciding on an NH for themselves or a loved one.52 This confirms earlier findings in which focus group participants consistently identified reputation, recommendations and the experiences of family and friends as a top factor for NH selection.53 Indeed, participants in both studies independently suggested that report cards include user-generated star ratings of NHs. For consumers who do seek out online information to support their NH decision, it is likely they will increasingly turn to social media sites to get the consumer perspective. However, it may be problematic to rely on social media ratings to provide this insight to consumers, given the lack of relationship between Facebook ratings and other measures of satisfaction/experience and quality.

Implications: opportunities and limitations of social media NH ratings

Given the disconnect between the desire to see consumer experience ratings on report card sites and the overall lack of availability of that information, social media may present report card makers with a unique opportunity. Sites like Facebook and Yelp allow consumers to generate and evaluate experience data in real time. It may be possible to feed social media ratings into existing NH report cards like NHC, thus delivering the standard, often clinically based measures of quality alongside user-generated ratings. Two recent studies found that the content of social media user reviews of hospitals differed from the domains collected through CAHPS and has the potential to provide useful additional information for consumers.54 55 We also see evidence of information flowing in the opposite direction: Yelp now pulls in information on hospital patient–provider communication quality and NH deficiencies to help guide its users.56

Where the state and federal report cards are falling short, the private market is providing consumers with the decision support information they want. However, there are limitations to relying on social media ratings of NHs. Our study and others show that not all providers are equally likely to use social media, thereby limiting reach to consumers wishing to incorporate ratings into their decision-making process. More importantly, not all families/residents are providing ratings, and those who do are not part of a randomly selected sample, potentially biasing results. This may be why the user-generated ratings in our analysis did not line up with our more scientifically grounded measures of experience. There are also substantial opportunities for providers to game the system and ‘stuff the ballot box’, so to speak, which few studies have been able to address.5 Thus, the social media ratings should not be considered reliable sources of experience or satisfaction, and they have the potential to direct consumers to NHs that perform more poorly on traditional measures of quality. Presenting the social media ratings alongside the more robust measures can serve to reduce that threat.

Study limitations

Our study is limited in several ways. First, our examination is limited to just two states. Maryland and Minnesota were chosen because these are the states for which we had access to NH experience data. Indeed, very few states have experience data on all NHs.1 However, the findings may differ in other states, and further examinations should be conducted where possible.

As a result of examining just two states, our sample size is fairly small. In our correlation analysis, we had 80% power to detect a correlation of 0.25 and above (at p=0.05) in Minnesota. We were similarly powered to detect significant correlations at only 0.37 and above in Maryland. Although this does limit our ability to detect significant but weak correlations, we were powered to detect moderate and strong correlations (0.40 and above) in both states. Our regression analysis had 80% power to detect effect sizes of 0.04 and above at p=0.05. This may have had some impact on our ability to detect smaller effects.
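The power figures quoted above can be sanity-checked with the standard Fisher z approximation for a two-sided test of a correlation coefficient. This is an independent back-of-the-envelope check, not the authors' own calculation.

```python
# Approximate sample size needed to detect a correlation r with a
# two-sided test at the given alpha and power (Fisher z approximation:
# n = ((z_alpha + z_beta) / atanh(r))^2 + 3).
import math
from scipy.stats import norm

def n_for_correlation(r, alpha=0.05, power=0.80):
    """Sample size to detect correlation r at the given alpha and power."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ((z_alpha + z_beta) / math.atanh(r)) ** 2 + 3

print(round(n_for_correlation(0.25)))  # roughly 123 facilities
print(round(n_for_correlation(0.37)))  # roughly 55 facilities
```

The implied sample sizes (roughly 123 and 55 facilities) are consistent with detecting correlations of 0.25 and above in the larger Minnesota sample and only 0.37 and above in the smaller Maryland sample.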

Not all social media websites are alike, and neither are the users of each platform. It is conceivable that there is something systematically different about those who rate an NH on Facebook versus Yelp, or those who Tweet about NH quality and experience. Examinations of other platforms in the context of NH care and experience are warranted to better understand these differences and their implications for ratings. Future studies should examine the differences in the content of reviews, as well as the relationship between user ratings on other platforms and traditional measures of quality and experience. Such studies may be able to determine whether some platforms align better than others with traditional measures and therefore more consistently point consumers towards better quality NHs.

It is important to note that social media reviews of NH care are likely often completed by family members of NH residents rather than residents themselves. This differs from typical reviews of restaurants and hotels, for example. However, in NH care, it is often the case that satisfaction/experience surveys are completed by family members, as the resident may not be able to complete such a survey, or they are completed with assistance from NH staff/consultants.57 58 Thus, this is an issue unique to NH care and consistent across all types of reviews/surveys, online or otherwise.

The use of social media in general, and of Facebook specifically, is ever-evolving. Indeed, our analysis is truly an early look at what is happening with NHs and social media. Our findings showed that certain types of NHs were more likely to have adopted Facebook, and it may be that our results are only valid for these early adopters. It is likely that the number of facilities with Facebook pages in our two states has increased since we collected data for this study. Continued examinations are warranted.

Conclusion

User-generated ratings on social media represent a growing and potentially important opportunity to provide consumer perspectives on healthcare providers in real time. This information is largely absent in NH care, yet it is something consumers want to help support their decision making. Our study highlights several concerns related to NH adoption of Facebook ratings: Facebook adoption varies systematically across facility types and may be related to customer base, thereby limiting access to this information to only subsets of consumers, and Facebook ratings have poor associations with more objective experience and provider quality measures. Ratings from Facebook and other social media platforms have the potential to close an important gap in NH report cards, and our early look at this nascent phenomenon suggests policy makers may wish to consider the opportunity to use social media ratings in light of these limitations and biases. Alternatively, policy makers can pursue a path to collect more scientifically grounded user ratings and reviews to complement existing report cards.



  • Contributors All listed authors have made substantial contributions to the conceptualisation, data collection/analysis and/or writing of this manuscript.

  • Funding The authors received no external funding for the research, authorship or publication of this project.

  • Competing interests None declared.

  • Ethics approval This study is exempt from review as no human subjects data were used.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement The data on the Facebook variables were collected by the authors. Persons should contact the corresponding author if they are interested in accessing the data.
