The U.S. Department of Health and Human Services Office of Adolescent Health posted four new teen pregnancy prevention funding opportunities last weekend (January 10) and one just before the holiday break on December 23rd. These are major funding opportunities for organizations working in teen pregnancy prevention. Each of the five opportunities has a different focus and funds very different activities. Please be sure to read them carefully. Each also requires a letter of intent as well as a full application, and each has a different timeline for receipt of letters of intent and applications.
To learn how to access detailed information on each of the opportunities, click here or on the “Funding Alert” link above. To learn more about how we can help, click here or on the link above to “Evaluation Research.”
I have a tolerate/hate relationship with community needs assessments. I really wanted to write that I have a love/hate relationship with them but that would be dishonest. I do not love them at all, but I do understand their importance and will tolerate them…barely. In fact, I have designed them, conducted them, and used them to inform and guide work on community projects and initiatives.
So, what’s my problem with community needs assessments? In part, it has to do with my research orientation. I am a qualitative researcher at heart and by preference, and too many community needs assessments focus on just the numbers. I can do the numbers, but they do not “chat to me” as they do to my quantitative researcher spouse, who becomes positively giddy over statistics. Even more, I think the importance of numbers is overblown, since they do not tell the whole story. They are good for describing a situation or issue, but not for explaining it, which is really the key to community change.
Until we understand why or how something is happening in a community, we usually cannot influence and change it.
Okay, so I have owned my part in my problem with community needs assessments, and I have come to accept that community needs assessments are a necessary evil. Yet there are still other problems with them that have little or nothing to do with my research orientation or preference. I believe they can be improved and made more useful, both to those who must conduct them and to the residents of the communities that are subjected to them. Here are my recommendations.
First, I propose we lose the name “community needs assessment” and replace it with “community understanding study.” The name “community needs assessment” has come to connote weaknesses, problems, negatives, and deficits in the community. By using it, we are imposing our belief that a community has problems and biasing the outcome from the outset. We need to consider that our perspective may not be what the residents of the community see at all. They may, in fact, see strengths, benefits, positives, and assets in the community. They may see their communities as incredibly, infinitely resilient and able to overcome any challenge. Community “needs assessments” assume communities are doing poorly with regard to one or more issues. As a result, some needs assessments effectively double bind residents into responding to the assessment questions in ways that only reveal deficits. Recently I participated in a community needs assessment that had several of these types of questions on it. For example, it asked me to choose from a variety of responses with regard to a health-related issue, without first asking me if I actually had that health issue. To respond at all was to admit to an issue that I did not have. (In fact, although I did not have the issue, I might have developed it had I allowed myself to dwell too long on such a poorly constructed survey.)
Second, I recommend we pay closer attention to how and why we do studies of communities in order to be more thoughtful and intentional. Funding often drives community studies. Funders may require a community needs assessment to justify an “investment” in the community. Recipients of funds, even when they are not obligated to conduct a needs assessment, may include a study in their work plan merely to assure the funder that they know what they are doing and to establish their credibility with the funder. These real or perceived expectations too often produce hastily undertaken studies that may, for example, use poorly designed survey or interview questions, rely on convenience samples that are not representative of the community (often even excluding residents with valuable lived experience with the issue being studied), or engage in a wild flurry of busy data collection activity (aka “going through the motions” to create the impression of a “good” study), all of which result in a poor-quality effort overall. Studies done in this way benefit neither the funder nor the grantee, and they certainly provide little value to the community. At worst, a poorly done needs assessment may turn up community “problems” that are completely unrelated to the real issues facing a community, sending both the funder and grantee off on a chase to fix “problems” that are either insignificant or nonexistent.
Third, I suggest we more forthrightly and clearly admit our findings are based upon assumptions that may not be correct. Whether a community study uses a quantitative, qualitative, or mixed methods approach, the information gathered is always limited by the core assumption that we are asking the right questions, in the right way, of the right people to get an accurate picture of the community. Good researchers are always aware of such assumptions and worry about the limitations of their research. They will take care to describe the limitations and hope that they are thoughtfully considered before the findings are applied to an unsuspecting community by the Sledgehammer of Helpfulness: “You need this, see? And you’re going to get it whether you want it or not!” I have been guilty of wielding the Sledgehammer of Helpfulness myself as a leader who did not always understand and respect the limitations of data collection and analysis. More fully aware of my own limitations today, I find it painful to see others who still swing the sledgehammer at communities.
Fourth, I suggest we find a better way to study the life of a community continuously in real time. Community needs assessments are too often a “still photo” or “snapshot” in time that fails to provide ongoing “real time” updates. Snapshots become dated very quickly, though we may cling to them as if they really do represent the present. Even worse, I have been involved in some large community needs assessments that took so long to produce findings (sometimes more than a year) that by the time they were delivered, the conditions they originally found were no longer present. As a technical assistance provider who was supposed to use that data to tailor my assistance, I found it an absurd, crazy-making requirement, useless both to me and to the community initiatives it was supposed to serve. I think my colleagues involved in community-based participatory research (CBPR) are trying to figure out how to study a community in real time, and I appreciate their effort. I would still ask them to look beyond just the numbers and to shift their focus from community problems and deficits to positives and possibilities.
Fifth, I strongly suggest that we professional do-gooders (PDGs) who conduct the studies stop trying to be experts in others’ communities. One of the biggest problems I have with community studies of any kind is that they shift the balance of “expert” power from the community residents to the PDGs who are doing the study. Here is a very hard truth: we PDGs have too often used our community studies for two terrible deceptions. The first is our own self-deception. Some of our community studies result in such massive amounts of data on communities that we conclude we must be the true experts. I have been in meetings with PDGs who have asserted (one even pounded the table for emphasis) that they were the experts in the community they were serving, not the people who lived in it. Unfortunately, the deception does not stop there. We, who have claimed expert status by virtue of our reams of data, too often commit a second deception on the residents with lived experience in the community we are studying. We use our new self-declared expert status and data (see the Sledgehammer of Helpfulness above) to convince residents that we know them better than they know themselves. When this happens, community engagement work becomes a process by which we convince community residents what their needs really are and get them to agree to let us do an intervention to them – an intervention we have often designed just for them without their input.
Finally, I recommend that our community studies be expanded beyond an examination of “needs,” to include an assessment of community “wants” and “will.” It is just as valid to ask residents of a community what is wanted as it is to ask what is needed. Some will argue that it is hard to trust that people will want what is best for them. To that I ask, “Could we possibly make that sound any more condescending?” Others will argue that people may not even know what they want. To that I say, “So what? Will it kill us to find out?” I think we will be surprised what happens when we actually trust people to tell us what is important to them. Okay, maybe people will tell us they want a new car or a new cell phone or something else that seems ludicrous to us given our “expert” observation of the many other greater “needs” in the community. However, what if we then ask them why it is important to have a new car or new cell phone? Maybe we will learn that they need transportation to take a chronically ill child to a hospital for regular treatment or they are unemployed and need a contact phone number to list on job applications.
Both a wants assessment and a will assessment require us to go beyond surveys, questionnaires, and interviews to engage people differently and gain a deeper understanding. A community “will” assessment is a bit more complex and requires the most creative engagement strategy. What are residents of a community actually willing to do? People do not always act in the best interest of their needs. For example, I may need to maintain my weight, but I still enjoyed my share of that large Polynesian pizza from the Lost Dog Pizza Cafe last Friday night. People may also describe their wants but then do something entirely different, including something that meets a need that is more important to them than their wants (e.g., I still want Polynesian pizza for lunch today, but I will have yogurt and granola instead). How do we know what people are willing to do? One of the best indicators is discovered through an Appreciative Inquiry process. Through Appreciative Inquiry, people identify the most positive moments and experiences that they are not only willing to experience again but will intentionally plan to experience again. This process can be used to help us better understand what residents and communities are willing to do.
If community change is going to be effective, we need to align community needs, community wants, and community will with our understanding of how these are interconnected. No assessment is ever perfect, whether it is a needs, wants, or will assessment. Communities are complex adaptive systems, dynamic and in constant flux, which is all the more reason to create community understanding studies that allow us to remain aware of the fluctuations, both great and small.
An Update on For Barbara: The Power of One: On May 5, 2014 I posted a blog about my longtime friend and colleague, Barbara Huberman, that generated many comments from readers. Barbara passed away in hospice care on Saturday, May 17th. Barbara had asked me in March if I would assist her family in planning a celebration of her life. Before she passed away, Barbara got to read that blog and I got to have one last visit with her. On June 3rd, in Washington, DC, more than 150 people from around the United States attended the celebration and memorial for Barbara Huberman. It was one of the most profound honors of my life to lead that celebration. We miss you, Barbara. Rest well.
I can chat anytime, anywhere, with anyone. I am infamous for my chatting ability and inclination. I’ve written a little bit about my proclivity for chatting in an earlier blog (Movies, Wavers, & Client Love). The airplane is my favorite place to chat with people for the obvious reasons of boredom on long flights, the unwillingness of airlines these days to provide distractions (e.g., food, movies, flight crew with a sense of humor, etc.), and the absence of space to move any other part of your body but your mouth, which, at least, supports the activity of chatting.
This past week I was on a flight when my seatmate surprised me with my own opening gambit when he turned to me and asked, “Going away or going home?” I was stunned that a) he beat me to the question and b) he used nearly the exact question I use to start many fascinating conversations. Needless to say, we chatted the entire flight. He had an entirely more fascinating life than my own – he is a movie director and had worked on many of my favorite films (including some in the Star Trek series and a movie with one of my favorite actresses, Jodie Foster). He was en route to begin shooting the remake of a very well-known movie series. However, we mostly chatted about the joys and challenges of launching our 20-something children. This kind of chatting is a lot of fun, and it certainly passes the time in the most entertaining way when one does not have many other options.
I have come to realize, though, the value of chatting in relation to research. Chatting with a purpose, which I am now calling by the more scientific-sounding name of elicited chat, is a useful qualitative research strategy. An elicited chat is one that calls forth or draws out information in an informal act of talking in a familiar way with another person. To be clear, I am proposing that elicited chat is different from elicited conversation, a more structured qualitative research strategy used in some other fields. Elicited conversation in these fields appears to refer to a conversation that is staged in order to gain research data. Elicited chat is differentiated by an even more informal interaction in which the researcher follows openings to collect data in the natural flow of the chat as the openings appear. Of course, before a researcher engages in any data gathering activity, he or she needs to follow the guidance of the appropriate Institutional Review Board (IRB) to ensure the ethical treatment of research participants. Assuming IRB approval, an elicited chat with research participants has the potential for mining some very useful qualitative research data in a less contrived way than traditional qualitative interviewing.
My journey to discovering for myself the value of elicited chatting began when I was doing my dissertation research on leaders of sexual health organizations using a constructivist grounded theory approach. Though I used a semi-structured qualitative interview process, I noticed that nearly all of the interviews changed into something different at some point. They stopped being formal interviews directed by my carefully constructed interview guide and became, instead, chats that were merely informed by the interview guide. When this change occurred, I could feel it and, presumably, so could the other person. The tone of our talk changed, the sense of connection changed, and the conversation grew warmer. As a result, we became more open, more genuine, and more revealing with each other. I have no doubt my research participants shared things with me after the change that they would never have shared with me if the change from an interview to an elicited chat had not occurred. Before your imagination runs ahead of you, please remember we are talking about “elicited chats,” not “illicit chats.”
There are several reasons why I believe elicited chats can be more effective in gathering rich data, in some situations, than interviewing.
The concept of “chatting” connotes a level of informality that is lost when a research participant knows he or she is about to be interviewed. The informality creates a more relaxed environment and that can, in turn, result in more entry points in the talk to access the data being sought.
While there is still a necessity for informed consent and some structure to assure confidentiality in an elicited chat, an elicited chat can be done in a way that is less contrived. It can be done in a variety of settings, even while doing other things, thus allowing the conversation to more naturally flow between the researcher and the participant.
It is an approach that changes the power relationship between the researcher and the participant. They become two people in an interesting chat about something instead of being an “expert” trying to learn more from a “subject.” They are bound, in the moment, as two people by mutual curiosity and the joy of conversation.
This bonding allows two people to communicate across socio-demographic (e.g., age, race, socio-economic position, etc.) and ideological barriers that might otherwise restrict their interaction. Elicited chat, by virtue of the human connection it creates, can quickly facilitate a trust and confidence between two people.
Recently I wrote on the challenge of community engagement on issues that were perceived as being difficult to address (see Community Engagement and Touchy Topics). My experience of interviewing sexual health leaders, who represented a very wide spectrum of ideologies in the debate over comprehensive sexuality education and abstinence-only education, convinced me that an elicited chat has considerable value when trying to learn from another person who has a very different ideology than my own. When community issues being researched are less controversial, elicited chat can work well because it more closely resembles the informality and familiarity that characterizes how neighbors and members of the same community typically talk to one another.
As I have continued to think about elicited chat and become more convinced of its value, I am also considering several limitations to its use. First, a researcher needs to be naturally curious about the topic and genuinely care about it. Chats are richest in those magic moments when both parties are connected with interest and sincerity. Second, a researcher needs to be a “people person.” The richest chats are between two people who enjoy connecting with others. For this reason, my seatmate and I started talking the moment we sat down in the plane and did not stop until we were walking off the plane together. Third, at the risk of sounding ageist, elicited chat may work better for the more mature (i.e., older) researcher. The art of conversation requires a large frame of reference that may not yet be available to less mature (i.e., younger) researchers. Finally, a researcher needs to be comfortable with chatting as a complex, though informal, process. Chatting is a complex process in that it is often messy, by which I mean it has the properties of complexity – it is dynamic, entangled, emergent, and robust. (More about this in a future blog.)
I am continuing to think about the integration of my love of chatting with qualitative research. I would be pleased if you would think about it with me and join the conversation.