By Elizabeth Sites, CQL Director of Organizational Excellence
What if your agency data showed that people supported in one home have fewer rights restrictions than those in other homes? There may be lessons you could learn to reduce restrictions more universally. What if you launched a new supported employment program and, a year later, found that employment-related outcomes had improved? That might shed light on the effectiveness of your program. What if you surveyed staff and found that the majority didn’t feel prepared during their first weeks of employment? You could use that data to explore additional support and training.
In this Capstone, we’re taking a deeper dive into collecting, assessing, and understanding data to evaluate services, drive change, and improve quality of life. After you’ve gone through the initial planning stages – which were covered in our April 2023 edition ‘How To Use Data For Informed Decision-Making’ – this article covers the practical next steps. In addition, you’ll hear from our partners at Community Ventures In Living (CVL), who detail specific examples of how they’ve leveraged data to enhance their services, programs, and operations.
Key Areas To Address
Sources For Collecting Data
As we described in last year’s Capstone, the first steps are to develop your strategy for data collection and analysis and to ensure the right people are part of it. The next step is to identify your data sources. We couldn’t list out every possible source, so here are a few examples to help spark some ideas:
- Health/Medical – illnesses, hospitalizations, census, survey feedback, etc.
- Rights – rights assessments, rights restrictions, human rights committees, survey feedback, etc.
- Staffing & Staff Development – staff turnover, vacancy, absenteeism, overtime, training participation, survey feedback, etc.
- Respect, Autonomy, Person-Directed Planning – complaints/concerns, decision-making authority, representative payee status, employment, survey feedback, etc.
- Positive Behavior Supports – psychotropic medication use (PRN/scheduled), physical interventions, behavior supports, person-centered planning, goals/outcomes, etc.
- Relationships – Personal Outcome Measures® interviews, social connections, survey feedback, etc.
- Abuse, Neglect, Mistreatment, Exploitation (ANME) – incident management reporting, allegations/investigations of ANME, etc.
- Safety – internal/external safety reviews and audits, incident management reporting, etc.
- Financial Management – expenditures (line item, program, departmental, organization-wide), volunteer usage, fundraising, marketing & communications, etc.
Creating A Centralized System
After you’ve identified sources and figured out which data would be most meaningful, you’ll need to pull it together in one place. This ‘place’ can be as simple as an Excel document or as complex as an online survey platform, electronic records management system, regulatory reporting system, etc. Regardless, the principles are similar and there is no one right way!
No matter what you use, the most essential part is that it offers valuable information to guide decision-making. To help ensure success, it should be well-organized and relatively easy to operate for all audiences. Keep in mind that your data system will not be perfect at the outset but will evolve over time.
“At CQL we always talk to our partners about the importance of progress over perfection.”
As your system evolves, consider opportunities for automation to build efficiencies, such as linking data dynamically within an Excel document, developing recurring reports, providing the ability to enter data via a mobile device, setting up dashboards for people to view, etc.
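If your sources are exported as spreadsheets or CSV files, even a short script can handle that ‘pulling together’ on a recurring basis. Below is a minimal sketch of the idea, assuming two hypothetical exports (incidents.csv and turnover.csv) with made-up column names; adapt it to whatever your actual sources contain.

```python
# Minimal sketch: combine two hypothetical exports into one recurring report.
import pandas as pd

incidents = pd.read_csv("incidents.csv", parse_dates=["date"])   # one row per incident
turnover = pd.read_csv("turnover.csv", parse_dates=["month"])    # one row per program per month

# Roll incidents up to a program-by-month count so it lines up with the staffing data.
incident_summary = (
    incidents
    .assign(month=incidents["date"].dt.to_period("M").dt.to_timestamp())
    .groupby(["program", "month"], as_index=False)
    .size()
    .rename(columns={"size": "incident_count"})
)

# One combined table that a recurring report or dashboard can read from.
combined = turnover.merge(incident_summary, on=["program", "month"], how="left")
combined["incident_count"] = combined["incident_count"].fillna(0)
combined.to_csv("monthly_quality_report.csv", index=False)
```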
Identifying Your Collection Team
No matter what you use to ‘house’ your data, you will want to assign each data set to the person responsible for collecting it. This is important for accountability and for identifying who people can turn to if they have questions or need assistance. Each person can also keep their data organized so your system doesn’t become cluttered, confusing, or siloed. The main takeaway here is that you need to pull in the right people and ensure that no single person is responsible for everything – this should be a team effort.
Analyzing Your Data
When you start looking into analysis, you must have a plan. It should list your (achievable) goals, the data you’ve been collecting, timelines for tracking progress, and the actions that will be taken as a result.
Along with that, you’ll want to figure out who will be involved in the analysis and the resulting decision-making. This might be a group beyond the people who are gathering the data. It may include diverse stakeholders such as support staff, people receiving services, executive leadership, and others. As we shared in the April 2023 Capstone article, you want to avoid data silos, since your data-driven decisions don’t happen in a vacuum and affect numerous people.
Another solid first step is to review the data you already have before looking at brand new data. This will give you a good starting point and keep everything manageable. You will also learn some valuable lessons along the way that will help you avoid any pitfalls, prior to expanding your analysis.
Developing Action Steps
Accountability is key not only with analysis, but also action steps. You should set clear expectations for everyone, specifying their roles and responsibilities. It’s very helpful to set deadlines for deliverables and timelines for check-ins to maintain momentum.
As mentioned earlier, during the planning stages you will want to set achievable goals. But how will you know when that goal has been reached? What’s your measure of success? And how can you flag when a decision made as a result of the analysis was correct or incorrect? In establishing your intended outcomes for analysis and action steps, it’s critical to get very specific about what you hope to accomplish. Here’s an example:
Goal: Decrease restrictions/intrusive interventions by 5% across the organization by the end of the calendar year.
- Action #1: Collect baseline data on the number and type of restrictions/intrusive interventions placed on people.
- Action #2: Identify blanket restrictions that can be immediately removed.
- Action #3: Identify restrictive/intrusive interventions that are in place due to legal and/or regulatory requirements.
- Action #4: Educate all employees and people receiving services on what constitutes a restriction/intrusive intervention.
Using this example, an organization might complete action step #1 and realize that people receiving services have far more restrictions placed on them than was originally thought. The amount and types of restrictions placed on people can then be used to develop education for staff to help them understand what exactly constitutes a restriction and how they may be unknowingly restricting people’s rights. It can also lead an organization to look at the type of lease agreements it uses, as those documents may have built-in restrictions that are not legally enforceable.
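To make the measure of success concrete, here is a minimal sketch of an ‘are we on track?’ check for the example goal above. The baseline and quarterly counts are hypothetical and only illustrate the arithmetic.

```python
# Hypothetical check against the example goal: a 5% reduction by year end.
baseline = 120                      # restrictions counted in Action #1 (hypothetical)
target = baseline * (1 - 0.05)      # a 5% reduction means 114 or fewer by year end

quarterly_counts = {"Q1": 122, "Q2": 117, "Q3": 113}   # hypothetical tracking data
for quarter, count in quarterly_counts.items():
    status = "on track" if count <= target else "behind target"
    print(f"{quarter}: {count} restrictions ({status}, target {target:.0f})")
```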
Last but certainly not least, you will want to communicate effectively about your analysis, action steps, and outcomes. Yes, charts and graphs can definitely be a part of it. But real-life stories, testimonials, and profiles can often be even more impactful. They help get across ‘the why’ of what you’re doing, demonstrate the real-life benefits, and build investment in future initiatives.
Before we move on to discovering how Community Ventures In Living (CVL) is using data, there’s just one final tip to keep in mind. If you ever hit a roadblock during your data journey, you can revisit your organization’s mission and dig deeper into the data, asking yourself “why is this important?”
CVL’s Insights Into Using Data
By Thomas Dukes, MSW, MS, EdD, Director of Quality, Community Ventures in Living
Richard Brown, Executive Director, Community Ventures in Living
Community Ventures In Living (CVL) is dedicated to service excellence. For us that means maintaining a culture of continuous improvement and innovation. Our use of data fits within this vision. The point of collecting and analyzing data is to ensure our work has the impact we think it does. To that end, we have created a system that ties together service exception tracking with other forms of data like satisfaction surveys and Personal Outcome Measures® (POM) interviews.
We refuse to merely collect data. We strive to utilize data, and the insights and understanding it provides, to drive the positive impact we can have in the lives of the individuals who receive our services. These factors keep our quality improvement process driven by person-centered outcomes rather than based merely on the inspection and compliance functions of quality assurance systems.
An Evolving System
Our ‘system’ is more of an evolving process that is constantly undergoing refinement. At this time, this process involves automated system evaluations followed by regular human analysis and evaluation. Our system allows data to be tracked and aggregated across regions, programs, and individuals, while also providing the ability to drill down to specific individual instrument data so that similarities and differences can be identified and evaluated. This provides both individual-level and aggregate tracking, response, and follow-up opportunities for the QA team and the agency.
The system dashboard displays data including incident reports, POM interviews, and satisfaction surveys. We break down the data and display it in a variety of ways to foster deeper analysis and meaningful insights. For example, the system displays the average number of service exceptions per month, tracks a lateness quotient related to knowledge and submission of exceptions, and displays a breakdown of exception categories in pie chart form.
The system also flags individuals whose submissions surpass predetermined thresholds, allowing us to efficiently review and examine these cases more deeply. We regularly search previous reports, track and plot trends, identify concerning trajectories, and provide support teams with timely information, consultation, and support in identifying and responding to assessed needs.
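For readers who want a concrete picture of this kind of per-person threshold flagging, here is a generic sketch – not a description of CVL’s actual system – that computes each person’s average service exceptions per month from a hypothetical export and lists anyone above a predetermined threshold.

```python
# Generic sketch of per-person threshold flagging; file and column names are hypothetical.
import pandas as pd

exceptions = pd.read_csv("service_exceptions.csv", parse_dates=["date"])

# Average number of service exceptions per person per month.
monthly_avg = (
    exceptions
    .assign(month=exceptions["date"].dt.to_period("M"))
    .groupby(["person_id", "month"])
    .size()
    .groupby("person_id")
    .mean()
    .rename("avg_exceptions_per_month")
)

# Flag anyone whose average exceeds a predetermined threshold for deeper review.
THRESHOLD = 4
flagged = monthly_avg[monthly_avg > THRESHOLD].sort_values(ascending=False)
print(flagged)
```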
Examples of Analysis
For example, there was a time when our team all shared the perception that one of the individuals we support was having a significant uptick in the frequency of missed medications. We dug into the data over the previous two years and discovered our perception was not accurate. In fact, the pattern we thought was escalating was actually leveling off and beginning to trend downward. It was a good lesson on examining and confirming our perceptions with the facts.
In another case, we knew that an individual experienced seizures that often resulted in falls and injuries. When we looked at the frequency of such occurrences over time, it became evident that the seizures were increasing in frequency and the injuries were increasing in severity. We were able to work with the support team and provide this evidence of escalating risk and danger to the individual. It was quite compelling and added substance to the conversation they could have with the individual’s parent in planning for services proportionate to the need.
On the aggregate level, we also identify trends, consider root causes, and lay out quality improvement plans to address the system-wide issues we identify. For instance, aggregate POM data represented in bar graphs reveals that many of those we support do not have the robust natural support network they desire. That data compels us to continue being creative in addressing the skill gaps and lack of opportunities for relationship development.
Addressing Challenges
One of our ongoing challenges is to make the data we collect meaningful in ways that impact those we serve. We’re not interested in data for data’s sake. Our approach involves regular, weekly analysis, team discussion, and direct application of what we learn to decision-making and intervention planning when needed. To facilitate this, our system utilizes a basic form of AI to flag individuals with data that falls outside of expected ranges. The QA team then reviews these cases, considering historical and contextual features, and acts as the situation warrants.
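One common way this kind of ‘outside of expected ranges’ flagging is implemented is a simple statistical check against each person’s own history. The sketch below is purely illustrative – a generic z-score check on a hypothetical file of monthly event counts – and is not CVL’s actual system.

```python
# Illustrative out-of-range flag: compare each person's latest month to their own history.
# File and column names are hypothetical (columns: person_id, month as "YYYY-MM", events).
import pandas as pd

counts = pd.read_csv("monthly_event_counts.csv")

def flag_latest_month(group: pd.DataFrame, z_cutoff: float = 2.0) -> bool:
    """Flag the most recent month if it sits well outside the person's prior history."""
    history, latest = group["events"].iloc[:-1], group["events"].iloc[-1]
    if len(history) < 3 or history.std(ddof=0) == 0:
        return False                                 # not enough history to judge
    z = (latest - history.mean()) / history.std(ddof=0)
    return abs(z) > z_cutoff

flags = counts.sort_values("month").groupby("person_id").apply(flag_latest_month)
print(flags[flags].index.tolist())                   # person_ids needing QA review
```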
Conveniently, our system dashboard also contains a project management capacity so that Remediation Plans for individuals and Quality Improvement Plans for the organization are directly linked to the sources of data that spawned them. The system “reminds” us at regular intervals to revisit those projects and provide updates on progress and continuing challenges.
Another challenge is ensuring the data we obtain is of sufficient quality and depth to provide actual insights. For example, our annual satisfaction survey produced strongly positive results. However, seeing the same positive response checked for every question, with no additional comments or suggestions, gave us little to work with for improvement. First, we added a comment section after each survey item in the hope of obtaining qualitative data to enhance the quantitative data and shed more light on people’s perceptions, desires, and needs. When this failed to provide much additional actionable information, we piloted Satisfaction Focus Groups. These meetings added significantly to the depth of our understanding of people’s lived experience of their interactions with the agency. They strengthened our understanding of, and ability to respond to, what people want and need: what really matters to them.
Communicating About Data
Data collection and analysis is critical, but it is only the beginning. Importantly, our process includes publishing and disseminating concerns, trends, and issues as well as soliciting reactions, feedback, and insights from those providing the supports. If our efforts don’t include ongoing communications with service teams, then our work can’t impact the lives of those we support in the way we envision. Our efforts so far include monthly QA summaries, POM feedback forms, direct consultation with support teams, and ad hoc staff communications.
Another important aspect of our approach to the use of data, and Quality Assurance activities in general, is continually referencing the “why” of what we do. How does the story the data tells fit into the bigger picture of an individual’s life? How do we understand the individual within their family, natural support network, and larger community? How can we better understand this person within their unique context to provide more effective support in helping them accomplish the goals they have for themselves? Finding answers to questions like these is the means for applying our data analysis to enhance service provision.
The Impact of Data & Next Steps
We pride ourselves on knowing people well as the basis for providing person-centered supports. The ability to collect, review, and analyze data, and plan and track change efforts in one visually appealing system, allows us to know people, on the individual level and the aggregate level, much better. It allows us to be accurate in our assessments. And it allows us to be flexible, adaptive, and responsive to changing needs and conditions in real time. It also lets our teams know that we have their backs, and that they have a resource in the QA department to support them in their work.
Next steps for us include aligning our various data sources with the Basic Assurances® as a means of assessing desired outcomes. We also have plans to create analysis tools that sort data by life domain. This will allow us to look for indications of successful outcomes in the lives of those we support, and to celebrate these successes accordingly. With advances in AI technology, we anticipate increasingly efficient analysis by the system itself, highlighting and prioritizing areas in need of increased human attention. This will free our time and energy for decision-making and action around the most pressing issues.
We continue striving to be innovative in the ways we collect, examine, analyze, strategize, and drive change using data. This impacts every aspect of our operations and service provision. It contributes to our culture of continuous improvement. Most importantly, it helps us check our perceptions, guard against complacency, and do the best job we can providing support to individuals that is meaningful to them and addresses their unique needs.
Additional Resources for Utilizing Data
By Elizabeth Sites, CQL Director of Organizational Excellence
Hopefully, between these Capstone articles, you now have a much better sense of why data is important and how it can help drive decision-making. Remember – data is simply information! We have created a number of resources for your data journey, covering many topics:
- Guide – 12 Reasons Why Data Is Important
- Webinar – Integrated Quality Management System
- Research – Provider Quality & Personal Outcomes
- Capstone – Demystifying Factor 10
- Webinar – Factor 10
- Report – Quality Outcomes & Measurement Metrics
- Webinar – Accreditation & Data
- Capstone – Transformation Strategies
- Webinar – Reliability & Personal Outcome Measures® Data
Putting Organizational Data Into Action
In this free webinar, Putting Organizational Data Into Action, we’re sharing guidance that agencies can use for collecting, analyzing, and leveraging data to improve the quality of services and the quality of people’s lives.
View The Webinar