What the heck is an externality, and why do we care about them? An externality is a result or consequence of an activity that is experienced by unrelated third parties. Externalities can be either positive or negative; however, within the not-for-profit sector we tend to focus more energy on positive outcomes.
Most commonly, externalities come up when an organization makes a claim: "With the help of our financing workshops, the median income in this neighborhood improved 5%."
In reality, a number of external variables affect this improvement. A new business may have moved into town and employed a few dozen residents. Perhaps a new program was created to provide access to small business loans. These are examples of externalities, or inputs, that would otherwise affect the median income of residents in this example.
How do we account for externalities and draw a direct line between our organization and community impact? I've seen organizations completely ignore the existence of externalities, and others simply refuse to make any claims whatsoever. Neither path is helpful.
Start by documenting other programs, businesses, or variables that may affect your data. Be open and honest about their ability to also affect the outcomes of your sample population. Attempt to perform joint data collection between your organization and other community partners. A broader base of support helps capture outlying externalities. A simple frame shift also helps: rather than take sole credit, adopt the mentality of "a rising tide lifts all boats." Play nice in the sandbox.
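One way to make "documenting other variables" concrete is to include them in the analysis itself. The sketch below is a minimal, hypothetical illustration (invented data, not a Collective Metrics method): it compares a naive before/after estimate of a workshop's effect on income against an estimate that controls for one externality, a new employer hiring residents.

```python
import numpy as np

# Hypothetical data: 200 residents, some attended a financing workshop,
# some were hired by a new employer in town (the externality).
rng = np.random.default_rng(0)
n = 200
attended = rng.integers(0, 2, n)   # 1 = attended the workshop
hired = rng.integers(0, 2, n)      # 1 = hired by the new employer
# True (simulated) effects: workshop adds 2 units, hiring adds 5, plus noise.
income_change = 2.0 * attended + 5.0 * hired + rng.normal(0, 1, n)

# Naive estimate: difference in means, ignoring the externality entirely.
naive = income_change[attended == 1].mean() - income_change[attended == 0].mean()

# Adjusted estimate: include the externality as a regressor.
X = np.column_stack([np.ones(n), attended, hired])
coef, *_ = np.linalg.lstsq(X, income_change, rcond=None)

print(f"naive workshop effect:    {naive:.2f}")
print(f"adjusted workshop effect: {coef[1]:.2f}")
```

The adjusted coefficient lands near the simulated workshop effect, while the naive difference absorbs whatever share of the hiring effect happens to fall unevenly across the two groups.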
At Collective Metrics, our business depends on the ability to identify externalities. Our goal is to systematically plot all the existing variables on an easy-to-use map (or GIS). In this way, we can see what other programs are in proximity and visualize the relationships between partners as they collaborate on projects. Over time we can track which programs are being accessed, and what reach they have.
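The proximity idea can be sketched without any GIS software at all. Below is a minimal, self-contained example (the program names and coordinates are invented) that flags which other programs sit within a couple of kilometers of a given one, using the standard haversine great-circle distance.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical program locations: (name, latitude, longitude).
programs = [
    ("financing workshop", 42.331, -83.046),
    ("small-business loans", 42.336, -83.050),
    ("job training center", 42.420, -83.100),
]

# Programs within 2 km of the workshop get flagged as nearby externalities.
base = programs[0]
nearby = [p[0] for p in programs[1:]
          if haversine_km(base[1], base[2], p[1], p[2]) <= 2.0]
print(nearby)
```

A real GIS layer adds visualization on top, but the underlying "what's in proximity" question is just this distance check applied across the dataset.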
I spend a great deal of time working one-on-one with customers. Direct feedback is still the best way to improve your game, and oftentimes we lose sight of that.
Most recently my team completed an aggressive cold-calling spree to hundreds of clients. This was a simple fact-finding mission: a series of basic questions followed up with a more in-depth survey via email.
Consistently, I’m surprised by the candidness of the impromptu 5-minute conversations. I tend to walk away thinking, “People are good, they want to be connected and to be informed”. It’s that simple. Seldom do we receive pushback from ornery clients, refusing to provide information in a stoic stance of defiance. They understand the mantra, “Help us, help you.”
Two of our company values at Collective Metrics are transparency and accountability, and we try to exercise these values when working collaboratively with our community partners. Aggregate data is shared amongst the partners in a very open format. We talk about client progress, success stories, and also the barriers that are currently in their way. Oftentimes our partners will offer resources and support for customers we haven't had much luck with.
This type of knowledge sharing is mutually beneficial to everyone involved. It also has the added benefit of fostering trust, a core component of any collective impact strategy.
Despite the effectiveness of such transparency, the room is always sharply divided 50/50 on the topic of information sharing. The moment the words pipeline or client funnel are mentioned, one group will crack open laptops, and another will cross their legs and gradually slouch lower and lower into their chairs. Most of the time I forget they're even present.
This isn’t the dentist’s office. There is certainly no reason to be squeamish… or is there?
For the chair-slouchers, transparency is an inherent threat, the equivalent of awkwardly falling on your own sword. Instead of engaging, they most often hide behind a veil of privacy. Common retorts are:
“This is sensitive data, I can’t talk about confidential client information.”
“There are regulations in place, have you heard of FERPA?”
Of course we have heard of FERPA, and YES, client confidentiality is important. But it’s not to be used as a shield to limit accountability. This is the problem with the social service sector in general: we don’t call BS when a partner bows out or doesn’t participate.
In the end, the lack of transparency, and quality data, exposes an organization for what it truly is: underperforming and poorly led. And in case you haven’t caught on, your privacy mandates and aggressive knee-crossing aren’t fooling anyone.
In theory, collective impact is a thought-provoking concept. Like a mixing bowl, you blend a common agenda, shared goals, and frequent communication… and BAM, collective impact. It reminds me of Jurassic Park: “Bingo, Dino-DNA!”
You’re probably thinking, isn’t this methodology obvious? Why wouldn’t you collaborate with other organizations, share data, and set mutual long-term goals? Isn’t that a common sense recipe for success?
In order to benchmark an entire community against a set of shared outcomes, the group needs to embrace radical transparency. Every participant must be willing to share information on clients, capital resources and internal processes. Unfortunately, few organizations are willing to collaborate at this level. The initiative will fail without mutual participation from all the partners.
Technology can also be an Achilles’ heel for new initiatives, especially when decision makers are two to three degrees removed from daily client interaction. This prevents them from understanding technology needs as they relate to digital communication, data entry, and user experience. To make matters worse, there are few platforms designed to facilitate transparency across multiple industries and sectors, which is a critical component for analyzing trends and identifying redundant services.
Perhaps the largest issue stems from a lack of capacity. In the nonprofit community especially, organizations are surviving grant-to-grant. When you’re trying to keep the doors open, managing process improvements becomes extremely difficult. And when it comes to prioritizing short-term gains, nonprofit directors can be just as guilty as corporate America CEOs. Participation cannot always be forced.
For a collective impact initiative to have teeth it needs to be well funded and powered by “social capital”. In other words, the initiative needs to be supported financially and verbally by key community stakeholders, whether that be local foundations, municipalities or prominent philanthropists.
At Collective Metrics we believe strongly in collective impact strategies, so much so that we built it into our name. But the process can’t be half-assed. As with most high-risk opportunities: you’re either all in, or all out.
As you scroll through Facebook over lunch, pause a moment and think about the history you’ve compiled over the years. The pictures, videos and comments on your feed are snapshots of thoughts and actions during that brief moment in time.
Imagine categorizing these snapshots into a series. For example, Jane went to coffee at 9am, then to school at noon, and finished her day with a hot yoga session at the downtown YMCA. Now imagine comparing Jane’s day to her peers, or those in proximity to her. Can we see any consistencies between Jane and the peer group?
What happens if Jane changes her habits, and stops drinking coffee? Can we predict a change in her daily routine? Is she more likely to have health problems down the road?
These big questions can be answered by comparing data points on a timeline. At Collective Metrics, our goal is to gather these pieces of information and use the findings to help our users understand the lifetime value of their programs. For example: out of 100 participants how many will go on to receive degrees? What percentage will start new businesses? How many will have kids?
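The cohort questions above ("out of 100 participants, how many will go on to receive degrees?") reduce to grouping event records by participant and counting milestones. Here is a minimal sketch with invented records; the participant IDs, milestone names, and the `years_after_intake` field are all hypothetical, not a description of any real schema.

```python
from collections import Counter

# Hypothetical event log: (participant_id, milestone, years_after_intake).
events = [
    (1, "enrolled", 0), (1, "degree", 4),
    (2, "enrolled", 0), (2, "started_business", 2),
    (3, "enrolled", 0),
    (4, "enrolled", 0), (4, "degree", 5), (4, "started_business", 6),
]

# The cohort is everyone who enrolled; count later milestones per person.
cohort = {pid for pid, milestone, _ in events if milestone == "enrolled"}
milestones = Counter(m for pid, m, _ in events
                     if pid in cohort and m != "enrolled")

# Share of the cohort reaching each milestone over their recorded history.
rates = {m: count / len(cohort) for m, count in milestones.items()}
print(rates)
```

With richer timelines, the same grouping extends naturally to "within N years of intake" filters using the third field.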
Over time, have our efforts dramatically improved quality of life, or resulted in another outcome? We can only begin to understand lifetime value if we collect data points throughout a user’s history. In the future, organizations will lean more heavily on social media timelines in order to compile and analyze correlations.
Open access to these data sources, from a simple user interface, will provide a new level of sophistication to the masses and help us understand the true lifetime value of our work.
Everyone has an opinion, and everyone’s a critic.
When an organization releases a new standard for measuring success, we instinctively want to call… BS.
We feel this way for a number of reasons: limited inclusion in the decision-making process, new standards that don’t match our own, or a fear of missing out (they have a special sauce we don’t).
The truth is, there is nothing wrong with setting or implementing a new standard. But it’s unacceptable to stick a finger in the air and call it how you see it. This isn’t golf; the winds shouldn’t be changing all that frequently.
Standards need to be based on consistent trends. Within your organization, pick a number of data points and begin establishing a baseline. Are you seeing any correlations in the data? Give it time, and perhaps expand the number of inputs you’re collecting.
When you feel like you’ve collected a statistically meaningful data set, do the analysis, and move forward with a simple set of metrics you can rely on. Sometimes we don’t have the time or capacity to include the entire community in the process. There are instances where it’s okay to lead, and be the gold standard. If you’ve done your research, use the data to silence the skeptics.
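"Are you seeing any correlations in the data?" has a direct computational answer. The sketch below computes a Pearson correlation coefficient by hand over two invented monthly baselines (attendance and loan applications are hypothetical example series, not real figures).

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical six-month baseline: workshop attendance vs. loan applications.
attendance = [12, 18, 15, 22, 30, 28]
applications = [3, 5, 4, 7, 9, 8]

r = pearson(attendance, applications)
print(f"correlation: {r:.2f}")
```

A value near +1 or -1 suggests a trend worth tracking as a candidate metric; values near zero suggest the input isn't (yet) telling you anything, and the baseline needs more time or more inputs. Correlation alone still doesn't establish causation, which is where the externalities discussion earlier comes back in.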
Performance reporting is a necessary evil. That’s probably the friendliest way of describing it.
Reporting has become a time-consuming part of our lives, whether you’re producing an annual report or a weekly staff update. In fact, reports tend to feel more like a sales pitch. Reports should be viewed as a brief update, a snapshot, on what’s working and what’s not. They should be data-driven, and demonstrate causal relationships using factual evidence.
Yes, this may show that programs are underperforming, or aren’t producing the results you expected. That’s why timely reporting is so vitally important.
It’s no secret that the nonprofit sector is deceptively competitive, especially when the relevancy of an organization hinges on the next grant award. However, most performance issues can be solved by moving towards “real-time” reporting. It’s a simple concept, in theory. Data is collected, recorded and then immediately pulled into an application that generates a series of charts, graphs and dashboards.
One of the biggest issues we see is “static reporting,” where an organization collects information in Excel and stores it in Dropbox for the next nine months. Isn’t this counterintuitive? How can we derive insight from data that is neither timely nor relevant?
Data should immediately be displayed in a visual format as it’s collected. There are a number of great online applications that will completely automate this process for you, including Collective Metrics.
Data should not be collected on a monthly, quarterly or annual basis. Instead, approach data collection as a daily, repetitive, on-going affair. Focus on standardizing your data intake process in all facets of your business. Stay consistent with the types of questions being asked, and the fields being created.
Eventually you will select a few key metrics as primary indicators. Build those metrics into a dashboard and review the numbers daily. The best thing you can do for your organization is make the dashboard transparent and accessible. Reference the dashboard at staff meetings, during events and with key stakeholders on a routine basis. If the data is positioned as a public resource, it will become a roadmap and alibi.
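The "few key metrics, reviewed daily" idea can be sketched as a simple roll-up from daily intake records into headline indicators. Everything here is hypothetical: the record fields, the metric names, and the figures are invented for illustration, not a real dashboard schema.

```python
from datetime import date

# Hypothetical daily intake records: (date, clients_served, follow_ups_completed).
records = [
    (date(2024, 3, 1), 14, 9),
    (date(2024, 3, 2), 11, 8),
    (date(2024, 3, 3), 17, 12),
]

def dashboard(records):
    """Roll daily records up into a few headline indicators."""
    total = sum(r[1] for r in records)
    followups = sum(r[2] for r in records)
    return {
        "clients_served": total,
        "follow_up_rate": round(followups / total, 2),
        "last_updated": max(r[0] for r in records).isoformat(),
    }

print(dashboard(records))
```

The `last_updated` field matters as much as the numbers: a dashboard that visibly shows when data was last entered is what separates real-time reporting from the nine-months-in-Dropbox pattern described above.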
Performance measurement is an expectations game. Transparency and consistency will drive better decision making, and keep the wolves at bay.
Seth Godin recently published an article titled “Numbers, and the magic of measuring the right thing”. In the piece, he makes a compelling argument for why we need to spend more time figuring out what exactly to keep track of, and less time obsessing over the numbers that we are already measuring.
Too often our methodology for data collection is flawed, incorporating our biases into the questions we ask and the reports we create.
Sometimes our methods are dictated by donors, board members, or perhaps the unrelenting status quo, but that doesn’t mean we can’t look at our own numbers with a wary eye.
One barrier that prevents us from improving internal methods is a lack of data transparency. At best, we tend to guard our information like the Bush’s Baked Beans recipe. Instead, we should make our logic models, and assessment tools, open and accessible. Openly report on your value stream (inputs, outputs, outcomes) and let the chips fall as they may. If your data come under scrutiny, use the feedback as an opportunity to improve your processes.
If you’re trying to shelter your organization from critical feedback on performance, then you’re missing an opportunity to become more efficient, and ultimately… more competitive.
Many organizations are attempting to incorporate diversity and inclusion into their vernacular. Despite their good intentions, much of the dialogue is merely a half-hearted attempt at “doing the right thing”.
The problem is that talking about diversity and inclusion isn’t enough; when it comes time to take action, we stare blankly across the room at one another, hoping someone has a plan. I recall a poignant remark made by a colleague during a recent board meeting. “I’m sorry,” he said, “but we need to be far more intentional about what steps we’re going to take.”
Running a business or program with intentionality is difficult. Being intentional about diversity means our thoughts, beliefs, and desires are woven into the very fabric of our core values. It’s not a topic to be talked about; it’s something you live, every day. And most of us aren’t there yet.
For starters, we can begin by setting a series of tasks that can be accomplished in a short period of time (a month or less). Give the initiative teeth by assigning ownership, setting deadlines and holding staff accountable to performance measures. At every step, engage the community. Be intentional about traveling TO people, and hosting meetings in THEIR environment.
Most importantly, don’t assume that you’re ever running things correctly… Like any good entrepreneur, let your clients show you the way. If you seek the community’s input on how to build diverse programs, you’re actually practicing inclusion. As with most things, intentionality drives results.
It’s really easy to take on new projects. Too often we use shiny new toys as a smokescreen, hiding the fact that our core business has become stale, stagnant or is underperforming.
Fight the urge to cast a wide net; instead, reexamine your core competencies. Why was your organization created, and what basic services need to be delivered? We should be looking for greater depth, expertise and competency in our core service offerings. It’s time to double down on the “special sauce”.
We see many organizations drift away from their bread and butter. If you’d like to bring in new clients or new users, look at alternative marketing channels. Is there a new audience that can be nurtured with a slightly different message? Do you need better-trained personnel? Maybe you will find that your business model has become irrelevant, and it’s a good time to walk away.
If you spread yourself too thin, you’ll end up doing nothing well. Expanding product/service offerings is never a substitute for improving core competencies. Do a couple of things really well.