Preliminary Results of an 18-Month Client Satisfaction Study of the Six START Programs
William Hawthorne, PhD, Beth Green, PhD, Linda Hammond, PhD
Between 1980 and 1996, Community Research Foundation (CRF) developed six short-term acute residential treatment (START) programs located regionally throughout San Diego County. These programs provide a cost-effective alternative to acute psychiatric hospitalization for voluntary adults who would otherwise be at high risk for hospitalization.
The Client Satisfaction Scale (CRF-CSS) was developed in 1981 by clinical and administrative staff of CRF as an in-house instrument. It is a 15-item questionnaire that includes an open-ended question for additional comments. The CRF-CSS has been in continuous use since 1981, and the results for each START program have been routinely included in annual reports to San Diego County Mental Health Services (SDMHS) and in periodic reports to program directors. However, this is the first focused, comparative analysis to examine all six START programs together. The following discussion summarizes pertinent findings of the report submitted to SDMHS in 1998.
Between July 1, 1995 and December 31, 1996, 3675 clients were discharged from the six START programs. At the time of discharge, each client was given the CRF-CSS and was asked to complete and return the survey prior to departure. Clients were given time and a private space to complete the survey. We collected 2190 surveys yielding an overall response rate of 60%. An abbreviated summary of findings follows.
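The response-rate arithmetic above can be verified directly (a trivial sketch; the figures are taken from the summary in the text):

```python
# Response-rate arithmetic from the survey summary:
# 2190 completed surveys out of 3675 discharged clients.
surveys_returned = 2190
clients_discharged = 3675

response_rate = surveys_returned / clients_discharged
print(f"Response rate: {response_rate:.1%}")  # roughly 60%, as reported
```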
Question 1 asked where the client would have gone if the program were not available. Options included: (a) "County Mental Health Hospital", (b) "Private Hospital", and (c) "Other", with a space that requested the respondent to "please specify." There were 1619 responses to Question 1, including 921 responses to "Other" which were coded and collapsed into categories. The results are presented in the table below.
[Table not reproduced: coded "Other" response categories included "Other START program" and "Other hospital program."]
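The coding-and-collapsing of free-text "Other" responses described above might be sketched as follows. The keyword lists here are illustrative assumptions, not the coders' actual scheme:

```python
# Hypothetical sketch of coding free-text "Other" responses into
# collapsed categories. Keyword lists are illustrative assumptions.
CATEGORY_KEYWORDS = {
    "Other START program": ["start"],
    "Other hospital program": ["veterans hospital", "va hospital", "hospital"],
}

def categorize(response: str) -> str:
    """Assign a free-text response to the first category whose keywords match."""
    text = response.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "Uncategorized"

print(categorize("the Veterans Hospital"))  # Other hospital program
```

In practice such coding was done by hand; a keyword pass like this would only be a first step, with ambiguous responses still reviewed individually.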
Question 2 requested the name of the service or agency the client utilized prior to contact with the START program. As these data are available from SDMHS Information Systems, and are routinely collected and reported elsewhere, they were not included in the review.
Questions 3 and 4 asked for the age and gender of the client. The mean age of respondents was 36 (SD = 9.9). Fifty-two percent (n = 1114) of the respondents were male, and 48% (n = 1047) were female.
Question 5 asked how soon the client saw a therapist after he or she first contacted the program. There were 2068 responses. Variation in response proportions among programs was unremarkable.
[Table not reproduced: response choices ranged from "Within one hour" to "Not soon enough."]
Responses to Questions 6 through 12 were grouped into "favorable" and "unfavorable" responses. The questions are paraphrased, and favorable responses and proportions are presented in the table below. Variation among START programs was unremarkable (effect size < .02).
6. Was the staff kind and helpful?
7. How much did you help plan your treatment?
8. How satisfied were you with the services...?
9. Were you treated with respect and dignity...?
10. Would you return if you had similar problems?
11. Would you recommend this program to a friend?
12. Did the services help you with your problem?
Question 13 asked what the client felt was most important about the therapist. There were 1836 responses (354 were left blank). There were four response choices. Sixty-seven percent (n = 1226) selected "is helpful, kind and concerned about you". Twenty percent (n = 367) selected "is highly trained". Nine percent (n = 166) selected "speaks your native language", and 4% (n = 77) selected "has a background like yours". Variation in proportions among programs was unremarkable.
Is helpful, kind and concerned about you    67% (n = 1226)
Is highly trained                           20% (n = 367)
Speaks your native language                  9% (n = 166)
Has a background like yours                  4% (n = 77)
Question 14 asked the client to compare the services received with those of a hospital. There were five response choices and a "Not Applicable" option for those without hospital experience. Two hundred twenty-four clients did not respond to the question, and 278 selected "Not Applicable." The remaining 1688 responses are presented in the table below.
Several technical difficulties were encountered in analyzing these data, and the reader should take the following limitations into account when interpreting or drawing conclusions from this report. The five-point scale used in most of the questions placed a "Don't Know" response as the third choice, between two positive choices (favorable to the program) that preceded it and two negative choices (unfavorable to the program) that followed it. Ultimately, we decided to include it in the analysis, both to increase the variability of responses and because this approach was supported by the distribution of the data.
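A minimal sketch of collapsing the five-point scale described above, assuming choices 1-2 are favorable, choice 3 is "Don't Know," and choices 4-5 are unfavorable (the response data are invented for illustration):

```python
from collections import Counter

# Collapse the five-point scale: choices 1-2 favorable, choice 3
# "Don't Know", choices 4-5 unfavorable. The example responses
# below are invented, not actual survey data.
def collapse(choice: int) -> str:
    if choice in (1, 2):
        return "favorable"
    if choice == 3:
        return "don't know"
    return "unfavorable"

responses = [1, 1, 2, 3, 2, 5, 4, 1, 2, 3]  # invented example data
groups = Counter(collapse(c) for c in responses)
print(groups)
```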
Another difficulty arose with Question 1, which asked, "If we were not available for service, where would you have gone?" There were three response choices: "County Mental Health Hospital," "Private Hospital," and "Other," the last followed by a request to "Please Specify." Forty-five percent (n = 921) of respondents chose "Other." A review of the written responses to "Please Specify" supported our suspicion that too few choices had been offered. For example, a number of responses specified the Veterans Hospital, which is not a "Private Hospital." The original intent of the question was to ascertain whether the client believed he or she would have needed a hospital had the START program not been available. Clearly, the response choices provided were inadequate. While we were able to aggregate and categorize the responses, doing so was time-consuming and not an optimal example of data-gathering technique.
Another problem was the lack of a comparison group. As this was a "home-grown" instrument, we did not have the benefit of standardization data with which to make comparisons; consequently, we were limited to comparisons within the existing data. Attkisson and Zwick (1982) and others (Nguyen, Attkisson, & Stegner, 1983) have reported a widespread, even ubiquitous, tendency of clients and other consumers of services to report high levels of satisfaction. Clearly, in the absence of comparative data, one must take this tendency into account.
Another potential source of positive bias was that, in many cases, clients who either left the program against medical advice or were transferred to a hospital did not complete a survey. Although the proportion of these discharges is small (<10%), these clients may well have given less favorable responses had they completed a survey.
Despite these substantive technical limitations, the information generated by these surveys, especially specific comments, complaints, and suggestions, has been useful in our ongoing quality improvement efforts. Our START programs have been incorporating this information into program improvement efforts for many years. As our name implies, Community Research Foundation firmly believes in attempting to quantify different aspects of care for research and quality improvement purposes. Who better to ask to evaluate our services than the clients we serve?
Attkisson, C. C., & Zwick, R. (1982). The Client Satisfaction Questionnaire: Psychometric properties and correlations with service utilization and psychotherapy outcome. Evaluation and Program Planning, 5, 233-237.
Nguyen, T. D., Attkisson, C. C., & Stegner, B. L. (1983). Assessment of patient satisfaction: Development and refinement of a service evaluation questionnaire. Evaluation and Program Planning, 6, 299-314.