
Center for Effective Philanthropy surveys Kresge grantees to assess the foundation’s overall effectiveness, identify areas for improvement

A Grantee Perception Report produced for The Kresge Foundation by the Center for Effective Philanthropy told us much of what we knew about our performance – and a good deal of what we didn’t.

The first time we enlisted the Center for Effective Philanthropy to produce a Grantee Perception Report was in 2007. It was a tricky time. We were largely perceived as a funder of facilities challenge grants, yet we had already embarked on a path of wholesale change – developing a very different approach to grantmaking rooted in six distinct program strategies. The CEP survey gauged how well we understood, and positively affected, both the fields within which we worked and the local communities we served; our ratings on both measures were quite low. In many ways, we were being assessed for the work we aspired to do rather than for the work we had been doing quite successfully for decades.

Now we have a full report that measures the perceptions of grantee organizations that received awards in 2010 and early 2011 – at the very time we were coming to the end of our programmatic and operational transition. Program teams were seeking board approval of, or refining, their strategies. We agreed to take our collective pulse again – or, perhaps more exactly, to have others take it – even though we knew we weren’t quite yet “done.” Caveats aside, the numbers tell a compelling story.

What the numbers say

On nearly every measure, we ranked higher than we had in 2007: understanding of and impact on fields, communities and organizations; clarity of written guidelines and program goals; helpfulness during the grantee selection process. And our overall ranking was “higher than typical” of the other foundations that CEP surveys.

Whereas in 2007 we were evaluated against CEP’s full data set, this time, CEP ranked us two ways: against the full data set of 269 large, medium and small foundations and against a smaller cohort of 21 peer foundations – large, national or regional foundations such as Annenberg, Atlantic Philanthropies, The California Endowment, Ford, MacArthur, Knight, Robert Wood Johnson, Surdna and others. I’ll focus on the comparisons that emerge from this second cohort.

Three findings to boast about:

  • Grantees say our impact on their respective fields is above 70 percent of all foundations in the data set and above 70 percent of our 21 peer foundations. This is a dramatic change from our rating of just four years ago, which was below 50 percent of the full data set.
  • Grantees say our impact on their communities is above 75 percent of our peer foundations. Four years ago, we ranked below the 25th percentile.
  • Grantees say our direct impact on their organizations is above 80 percent of our peer foundations. Four years ago, we stood at the midpoint.

For an organization new to the practice of strategic philanthropy, how did this happen – and so quickly?

The survey data suggest that two broad factors – both tied to intentional aspects of our transition – have come together to elevate the measures listed above: (1) meeting our grantees at their point of need by providing them with flexible funding; and (2) trusting that our grantees know best how to serve their local communities or constituencies. A word of explanation about each.

First, the findings show that Kresge is making more awards of operating support (23 percent) than our peer foundations (15 percent). CEP concludes that foundations have the greatest impact on grantee organizations when they award multiyear operating-support grants of $25,000 or more. We tend strongly in that direction, although we certainly offer program/project support and make program-related investments – part of our desire to expand our grantmaking toolbox.

In a similar vein, 40 percent of grantees believe that the primary effect of a Kresge grant is “enhanced capacity” – a larger percentage than the average of both our peer foundations and all funders in the data set. This emphasis, in turn, translates into giving grantee organizations the kind of flexibility they need to chart their own course to enhanced effectiveness and impact.

Second, CEP found that Kresge puts less pressure on grantees than 83 percent of peer funders to “modify their priorities to create a proposal that was likely to receive funding.” Our grantees know best how to execute their missions; we’re proud to think that our grantmaking practices respect that.

A large red flag

The findings aren’t universally complimentary, however. The trouble spots emerge in the realm of staff interactions.

We were ranked below 70 percent of our peer foundations and below 74 percent of all funders for our responsiveness to grantees. In a related measure, the frequency with which our grantees interact with their program officers is low – below that of 70 percent of our peer foundations – with many grantees reporting contact once a year or less often.

This is hard to hear and disappointing. I have to believe these numbers were driven in part by the pace of change underway at the foundation during our multiyear transition.

We are committed to improving on this essential dimension. And we have a positive starting point: We are rated above 65 percent of our peer foundations for grantees’ comfort in approaching us if a problem arises. We are also rated above 90 percent of our peer cohort in grantees’ perceptions of being treated fairly. So the grantees who get through to us feel good about their interactions.

This would suggest that our challenge is two-pronged: improving the manner in which we treat all applicants and increasing the frequency of contacts with grantees overall. We’ve begun conversations across the foundation and within our six program teams about how to address both. A couple of immediate steps seem clear.

Plans for improvement

First, we will continually improve the website. This is most often the first encounter a grantseeker has with the foundation, and it can make his or her life immeasurably easier or more difficult.

The survey was conducted before the launch of Kresge’s new website. Even on the old site, however, grantees rated the clarity of content above 75 percent of our peer foundations and the consistency of our information above 78 percent. The expanded content found on the new site should bump these already strong numbers even higher. Perhaps more important, the information on the new site should answer many of the questions that arose for past grantseekers – questions that required a phone conversation with a Kresge staffer to resolve.

Second, we will expand the variety and frequency of electronic communications we broadcast to our fields of interest – program updates, news stories, tweets and, eventually, team-focused e-newsletters and blogs. By more proactively shaping the information we share with potential grantees, we increase the likelihood that people will understand what we are doing, why and how they might fit.

Third, we will assess how we handle incoming phone and email inquiries. It may be that we need to expand our current response process to include a triage function that enables us to connect grantseekers to program staff more quickly when needed. As you would expect, our program staff possess deep knowledge of their teams’ programmatic goals and objectives and can respond easily to the nuances of their grantmaking strategies. There may be other issues as well; we’ll find out.

Fourth, and in a related vein, we are actively exploring how to connect grantseekers directly to program staff on a more regular basis. I’ve asked each of the program teams to consider ways this might be done – through more regular check-ins by program associates and/or program officers, more frequent electronic contacts or other means.

The challenge of assessing outcomes

We understand that we have to determine what success looks like – within each of our programs and across the foundation. We have only recently moved from strategy development and early implementation to the articulation of preliminary outcomes we might measure over time. The CEP findings underscore that we have a long way to go: we rank dead last among our peers when grantees are asked whether our reporting and evaluation processes help to strengthen their organizations.

Just as we knew in 2007 that our measures of field, community and organizational impact would be low, we knew in 2011 that our ratings on learning and evaluation would be low. We are committed to making this a priority in the years ahead.

Final thoughts

Through the CEP survey, our grantees and declined applicants have told us much of what we knew and a good deal of what we didn’t. We think your feedback affirms our fundamental course and points to two necessities: improving our practices related to applicant and grantee interactions, and intensifying our commitment to develop ways of assessing whether our strategies are advancing our aims.

The motivating goal for our programmatic and operational transformation was a desire to execute our mission in the ways most relevant to the challenges of the 21st century. We believe we are helping our grantees – imperfectly, to be sure – improve the economic, social and environmental conditions of low-income and underserved communities. With you, our partners, we are doing our best to fulfill Sebastian Kresge’s charge to “promote human progress.”

To all of you who participated on our behalf in this important CEP exercise: Thank you.

Note: In an earlier version, the words “higher than” were used in describing some findings when the correct term is “above.” This language has been modified to reflect this distinction.
