Claims of High SIS Satisfaction?
Written by CRAAG
As part of redesigning the Developmental Disability (DD) waiver system to comply with the Department of Justice (DOJ) Settlement Agreement, the Departments of Behavioral Health and Developmental Services (DBHDS) and Medical Assistance Services (DMAS) adopted the Supports Intensity Scale (SIS) assessment tool to determine individual service needs, and the ABE algorithm to establish provider reimbursement tiers for waiver services.
In response to concerns by General Assembly (GA) legislators regarding the effectiveness of the SIS process, the GA mandated the formation of the “Supports Intensity Scale (SIS) Stakeholders Work Group” and required DMAS to report annually to the GA regarding the status and effectiveness of the process.
CRAAG was invited to attend the SIS Stakeholders Work Group meetings starting in the spring of 2019. Until the end of 2019, ASCEND was Virginia’s exclusive SIS assessment contractor. We received ASCEND’s quarterly Satisfaction Reports and Activity Report and DMAS’s 2017 and 2018 reports to the GA. Links to all sources cited here can be found at the end of this article.
In the December 2018 annual report to the GA, Dr. Jennifer Lee, then DMAS Director, wrote: “The ‘Satisfaction Report’ revealed satisfaction levels in the upper 90s – 100% range, as reported by individuals, family members, support coordinators.”
This article explains why CRAAG believes this claim is misleading.
- CRAAG questions the statistical validity of ASCEND’s data due to the low percentage of people responding to the survey. The response rate was between 8% and 12% (907 total responses from 8,101 total surveys completed), meaning the “non-response rate” was between 88% and 92%. Of those returning the surveys, fewer than one-quarter came from the people most impacted by the results: the individuals themselves and their family members. The overwhelming majority of respondents were professionals (case managers, providers, and support team members).
- CRAAG questions why so few people responded to the survey. The seven Likert-scale survey questions seem easy enough to complete, so difficulty answering the questions can be ruled out as the cause. CRAAG has asked providers who have attended many SIS assessments for their thoughts on the low response rate, and they mentioned these inconsistencies and barriers to participation:
- The satisfaction survey asks for identifying information for the recipient, respondent, and interviewer, which is inappropriate for such a survey.
- It is a pen-and-paper survey, and respondents are instructed to fax it to ASCEND. This is a barrier for most families.
- Interviewers didn’t consistently distribute the surveys in the same way or uniformly explain the importance of returning the survey.
- Some interviewers left a stack of surveys on the table without encouraging anyone to fill one out. Others didn’t have enough copies for every participant, even though everyone who takes part in the interview is eligible to complete a survey.
- CRAAG wonders about the 23 total, mostly positive “representative examples” of feedback that ASCEND selected to share with the state. Given the widespread unpopularity of the SIS among families in the autism community and others, the examples weren’t what CRAAG would have expected. Only one example mentioned autism: “The tool does not capture the nuances of an autistic higher functioning individual.” In the article, SIS Feedback to DBHDS and DMAS, cited below, CRAAG shares all 23 examples the state received from ASCEND. We are asking our readers to email their feedback on the SIS assessment to CRAAG. The state needs more feedback.
- CRAAG is dismayed that DBHDS and DMAS rely exclusively on a small, narrow set of data from their own SIS contractor (ASCEND) that claims almost 100% satisfaction with the SIS assessment process. Meanwhile, who is assessing the ABE algorithm, a less well understood but critical tool in the SIS process? These two tools work hand in glove. From what we hear, there is a lot of dissatisfaction with both the process and its outcomes.
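As a quick arithmetic check of the aggregate figures cited above (907 responses out of 8,101 completed surveys; the 8% to 12% range presumably reflects quarter-to-quarter variation), the overall rates work out as follows:

```python
# Aggregate figures cited above: 907 responses out of 8,101 completed surveys.
responses = 907
surveys = 8101

response_rate = responses / surveys * 100  # about 11.2%
non_response_rate = 100 - response_rate    # about 88.8%

print(f"Overall response rate: {response_rate:.1f}%")
print(f"Overall non-response rate: {non_response_rate:.1f}%")
```

In other words, roughly nine out of every ten surveys distributed were never returned.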
In conclusion, CRAAG can’t be sure but suspects that most of the 907 respondents in the ASCEND data set answered the survey before learning the outcome of their SIS assessment and their funding level. For the professionals and families who learned that a lower-than-expected funding tier would not qualify their person for services they absolutely need, the inaccuracy of the SIS process was highly unsatisfactory.
Isn’t it time for the state to contract with a third party to comprehensively evaluate every aspect of the DD system, instead of relying on ASCEND’s misleading claim of almost 100% satisfaction with the SIS assessment process? We need a complete evaluation of the DD Waiver system to understand its impact on individuals and their families.
ASCEND’s Selected Representative Examples of SIS Feedback to DBHDS
In 2018-19 ASCEND, Virginia’s SIS assessment contractor, selected 23 “representative examples” of feedback from the satisfaction surveys to give to DBHDS and DMAS. We don’t know how many comments ASCEND received in total.
CRAAG wants to give the state much more feedback on the SIS process. If you, your family member, or your client has been through a SIS assessment and wants to share feedback with the agencies, please email [email protected] in confidence.
January to March 2018
“The tool asks about situations/hypothetical that aren’t relevant to many clients and is highly subjective, resulting in frustration on behalf of the support team”
“Some parts did not apply and were painful to the family”
“Accurate and thorough”
“Was very thorough, but some questions were irrelevant and difficult to answer”
“I felt handing out to each person the scoring system helped a great deal. Having an interviewer that clearly understands this test and is willing to take time with individuals month and explain certain questions was very helpful”
July to August 2019
“A bit cumbersome in some areas especially when relating to this specific individual”
“shorten it please”
“The assessment tool was effective and was used to clarify confusing questions”
“I feel it is unfair that a service provider pay is reduced when their client shows improvement”
“Some of the questions appear to be redundant”
“Please don’t waste/take up time reading to families. This used valuable time”
“Assessment tool is what we have to work with. It has flaws, but it is what it is”
“Send questions in advance to family point of contact-unable to read complex questions in advance”
September to December 2019
“Some repetition but better to repeat than not to capture the full level of support”
“Though it takes quite a long time, we felt it was helpful”
“Some of the questions are too board in scope to give a clear and correct answer”
“Certain questions require more clarification to understand”
“The assessment tools were designed and structure very well such that they capture the important assessment data elements”
“This is a good tool when it comes to determining what supports are needed to help an individual live a good quality of life as independently as possible”
“The assessment was lengthy, but it was also nicely detailed”
“The tool does not capture the nuances of an autistic higher functioning individual”
“Tool continues to be sufficient”
CRAAG is interested in your comments on this article. Email your thoughts to [email protected].
Want to stay connected with CRAAG? Follow them on Facebook.