
Riverbend City ® Activity

Data Analytics Internship Introduction

Introduction

Welcome to your virtual internship at the Riverbend Community Action Center (RCAC)! Located in Riverbend City, a midsized city in the Midwest, this organization provides a variety of human service functions. In your internship, you will be focused on the Ruby Lake Teen Homelessness Task Force, which is centered in the neighborhood of Ruby Lake. You will be learning about the role of data analytics in a human services setting.

Three years ago, RCAC received a seven-figure grant from the Helping Hands Foundation to support the center's work with homeless teens. Now Helping Hands would like to know what RCAC has done and whether its programs have been effective. Your focus will be on understanding how analytics could be applied to evaluate these programs and make the case for their effectiveness.

Mentor Talk

Riverbend City Community Action Center: Mentor's Office

Check in with your CAC Mentor, Brenda.

Hi! And welcome to RCAC! I'm really looking forward to working with you and helping dig into the landscape of data analytics in the human services.

Like your acceptance letter said, the specific thing I'd like you to look at while you're here is an analytics-based evaluation of the effectiveness of our grant-funded work on teen homelessness in the Ruby Lake neighborhood. Well, that's all well and good, but what does that really mean? Let's jump in and talk about the data analytics process itself.

To do that, we’re going to focus on the first step of the lifecycle.

There are a variety of ways to visualize the data analytics lifecycle. Some models will have more steps than others… some will parse those steps in slightly different ways. That said, there is some commonality in terms of basic stages or steps. For our purposes, let’s work with the model SAS uses.

The first stage of the data analytics lifecycle addresses understanding the problem. Some models call this the discovery phase; others talk about identification or understanding the problem. The important thing to remember is that this is the point where you identify the business problem and ask the questions that will guide your analysis of it. You might think of it as narrowing the gap between yourself as a data analyst and the business owner.

The business owner may not be describing the problem in ways that make it clear what you’re trying to measure in your analysis, so a significant part of the first stage of the lifecycle may be identifying what data to look at. Later stages will center on acquiring the data, cleaning it, building models and so forth, but at this initial stage, the goal is to acquire as much clarity as possible in understanding the core business problem.

OK! That's the lifecycle in a nutshell. Next, I'd like you to hit the ground running on understanding the problem. I'd like you to go talk to some of the internal stakeholders to help you get a well-rounded view of exactly what we're trying to do here. I've arranged for you to talk to Richard Agin, our CEO; Eduardo Alvarez, the director of our Homeless Teen program; and Heather Adams, one of our case managers in the program. That way, you should get a good diversity of perspectives and concerns that will help you really identify the questions being asked.

The Analytics Lifecycle
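One way to make that first stage concrete is to keep a simple worksheet that pairs each stakeholder question with the metrics and data sources that might answer it. The Python sketch below is only an illustration of that idea; the questions, metrics, and source names are hypothetical placeholders, not actual RCAC records.

    # A minimal discovery-phase worksheet: each business question is paired with
    # the data that might be needed to answer it. Everything here is illustrative.
    discovery_worksheet = [
        {
            "business_question": "Has the grant-funded teen homelessness program been effective?",
            "stakeholder": "Helping Hands Foundation / CEO",
            "candidate_metrics": ["cost per participant", "placement rate",
                                  "housing stability at 6 and 12 months"],
            "candidate_sources": ["grant budget reports", "case management records",
                                  "follow-up surveys"],
        },
        {
            "business_question": "Does living situation at intake predict outcomes?",
            "stakeholder": "Program director",
            "candidate_metrics": ["staff hours per case by intake status",
                                  "placement rate by intake status"],
            "candidate_sources": ["intake forms", "case notes"],
        },
    ]

    for item in discovery_worksheet:
        print(item["business_question"])
        for metric in item["candidate_metrics"]:
            print("  -", metric)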

Interviews

Riverbend City Community Action Center Offices

Talk to some RCAC staffers to get their perspectives on what kind of questions you should be trying to answer with data.

Richard Agin

CEO of Community Action Center

Thank you for asking! I'm excited to see this question being taken seriously.

I'll tell you one thing: more than anything else, I want to emphasize process. Since this program is essentially my little kingdom, I often find myself in direct contact with donors, granting agencies, and accrediting bodies. And I hate having to answer their questions about how things are going with vague, anecdotal evidence. I can see their eyes roll, and I shudder when I think of what this is doing to our long-term relationships. So if you're here to bring some rigor to the question of evaluating how we're doing, you're making my life a lot easier no matter what you find. For my sake, I'm begging you: Please be thorough with everything, please be clear, and please document everything you do, so that I can leverage these results to convince people to give us more money for future activities.

OK, that said, if you're looking for the things you need to be tracking, I'm happy to give you my perspective. As much as I like to focus on individual success stories, in my role as CEO I have to focus on big-picture elements. So: How much funding did we direct towards this program? How much of it was from dedicated grants and how much from unrestricted funds? What were our cost centers in executing the program? How much staff time and energy was devoted to this program, that could have been devoted to other things? Same question in terms of our non-staff resources—what did we allocate here?

You get that stuff and you have one side of the ledger…then we have to figure out what our specific concerns are in evaluating benefits. The number of teens that have been involved with the program seems like a good place to start. Maybe as a baseline, the overall number of teens in the region who'd be eligible to be helped by the program, so that we can see what percentage of the populace we're targeting. Also worth looking at cost and resource allocation per participant — how efficient are we being? We've been doing this for a while now, so it's good to look at year-to-year figures, see if we're getting better both in terms of absolute numbers and efficiency. What else? If possible, I'd love to have data from other orgs that are doing the same kind of work, so that we can compare our effectiveness to theirs. Oh, and I'd like to have longer-term data about what happens to participants after they leave the program, so that we can see if we're solving things or just applying a band-aid. How are our participants doing after six months? After a year? And so on.

I'm sure I could come up with more, but that's what comes to mind first. Thanks again for asking me about this, and I'm really looking forward to seeing what you come up with.
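As a rough illustration of the year-to-year, per-participant efficiency comparison Richard raises, a short pandas sketch could put cost per participant alongside its year-over-year change. The column names and dollar figures below are invented for demonstration; real values would come from RCAC's financial and case records.

    import pandas as pd

    # Hypothetical program-year totals (grant plus unrestricted funds).
    program_years = pd.DataFrame({
        "year": [2019, 2020, 2021],
        "total_program_cost": [410_000, 455_000, 430_000],
        "teens_served": [120, 150, 165],
    })

    # Cost per participant is one simple efficiency measure.
    program_years["cost_per_participant"] = (
        program_years["total_program_cost"] / program_years["teens_served"]
    )

    # Year-over-year change shows whether efficiency is improving.
    program_years["yoy_change_pct"] = (
        program_years["cost_per_participant"].pct_change() * 100
    )

    print(program_years.round(1))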

Eduardo Alvarez

Homeless Teen Program Director

So, OK. We're looking for ways to use analytics to evaluate the teen homelessness program? I need a moment to switch my brain out of day-to-day mode and into long-term-thinking mode.

Which, now that I think about it, is a good thing for you to keep in mind as you're working on this. We have this tactical day-to-day reality of "how can I help Jenny, the 15-year-old who’s currently sleeping under an overpass, find a place to stay with some permanence?" And your project is to help us look at the big picture so that we can tell how well we're doing in our endless series of day-to-day realities.

All right. Sorry for the philosophy there. One thing I'd like to have a handle on with analytics is what the circumstances are for clients on intake, so that we can correlate that with success rates. Like, are we more likely to be able to help kids who have been living on their friends' couches than the kids who have been sleeping out of doors? Is there a substantial, consistent difference in how much staff time needs to be devoted, depending on what the person's intake status was? Is there a difference in how stable we can help them become? As far as that goes, what *is* the range of possible living statuses that kids enter the program with? We don't really know in any systematic way.

As a program director, I'd also be interested in data on which of my case workers are particularly effective — I guess for that to work, we'd need to find a way to quantify effectiveness, of course. Some tangle of time spent per case versus placement rate for the client versus stability of outcome, I'm not sure. But if we had an answer for that, we could look at the most effective case workers and see what it is they're doing right. And from there, maybe try to work up some best practices that we could share with case workers who aren't as effective. Lots of pitfalls and cans of worms working that way, of course, but I think it could be really helpful if we did it intelligently and carefully.
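To illustrate the kind of intake-status comparison Eduardo describes, one could group case records by living situation at intake and compare average staff time and placement rates. The records below are hypothetical; in practice they would come from intake forms and case notes.

    import pandas as pd

    # Hypothetical case records.
    cases = pd.DataFrame({
        "intake_status": ["couch surfing", "unsheltered", "couch surfing",
                          "shelter", "unsheltered", "shelter"],
        "staff_hours": [14, 32, 11, 18, 40, 20],
        "placed_in_stable_housing": [1, 0, 1, 1, 1, 0],
    })

    # Average staff time, placement rate, and case count by intake status.
    summary = cases.groupby("intake_status").agg(
        avg_staff_hours=("staff_hours", "mean"),
        placement_rate=("placed_in_stable_housing", "mean"),
        n_cases=("intake_status", "size"),
    )

    print(summary)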

Heather Adams

Case Manager, Homeless Teen Program

It's hard for me to think about my job in terms of analytics. Because to me that just means cold data, and this job is all about people and their stories. But I've heard people talk about how data tells stories of its own, so I guess that's the way I need to look at this.

So, if I try to think about things I'd like to know about…I guess I'd like it if we could try to figure out what individual things are important about the teens we work with, in terms of making a difference in how they wind up in the program and how able we are to help them. Does that make sense? Like, the question I'm trying to answer is "what do these teens have in common or not in common, and how does that affect them?"

We gather a lot of basic information now that we don’t do enough with: name, age, DOB, last permanent address, stuff like that, but also a lot of detailed background information that goes deeper into their family situation before they came into the program. But then the data just sits there and we don’t do anything with it, don’t use it to dig into the bigger questions. Like, what if we looked at the data and saw that, say, Latino kids with divorced parents are more likely to come into the program? I could see that being useful for us in the day-to-day working of the program, and it seems like that would turn around and help you figure out how effective the program as a whole was for different populations. I think.

On the other hand, I can see some privacy risks there, if we're gathering and storing that much information. And as case managers, we'd need to be very careful not to be intrusive or cold or off-putting as we gathered that information. But these seem like solvable problems to me.
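As a sketch of the kind of subgroup breakdown Heather is imagining, a cross-tabulation can show how outcomes vary across the background factors captured at intake. The column names and values here are invented for illustration, and any real analysis would need the privacy safeguards she mentions.

    import pandas as pd

    # Hypothetical participant records mirroring the intake fields Heather describes.
    participants = pd.DataFrame({
        "family_situation": ["divorced parents", "two-parent home", "divorced parents",
                             "single parent", "single parent", "two-parent home"],
        "outcome": ["stable housing", "returned to street", "stable housing",
                    "stable housing", "returned to street", "stable housing"],
    })

    # Proportion of each outcome within each background group.
    breakdown = pd.crosstab(
        participants["family_situation"],
        participants["outcome"],
        normalize="index",
    )

    print(breakdown.round(2))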

Conclusion

You have completed the Riverbend City: Data Analytics Internship Introduction activity.

Reflection Questions

What big-picture questions should the RCAC attempt to answer using data analytics to evaluate their programs?


What specific data should be collected in pursuit of these questions?

