HOW I WORK
Your Questions, Answered
What to expect when we work together — from first conversation to final deliverable.
Have a question that's not here? Just ask.
Organizational Services
How does a project typically start?
Most projects begin with a discovery call — usually 30–45 minutes — where I learn about your organization, the question you're trying to answer, what you've already tried, and what success looks like. From there I put together a brief proposal that outlines scope, timeline, deliverables, and cost.
Once a project is underway, I work collaboratively but efficiently. I don't disappear for weeks and resurface with a report. You'll hear from me regularly, and I'll flag anything that changes along the way — including if the data is pointing somewhere you didn't expect.
How do you price your work?
I price most work as fixed-fee projects rather than billing by the hour — clients get a predictable cost, and I can focus on producing the best outcome rather than tracking time. Rates vary by scope, complexity, and timeline.
I also offer hourly advisory work ($150–$200/hr) for organizations that need consultation or strategic guidance rather than a full project. Subcontracting arrangements are available at $100–$150/hr.
If budget is a constraint — particularly for nonprofits or community organizations — I'm open to scoping conversations about what's feasible with the resources you have.
Do you work with both nonprofits and private-sector organizations?
Yes. I've worked across the full spectrum — from community nonprofits and public agencies to mid-size employers and HR teams in private-sector contexts. The methods adapt to the setting; the standard of rigor doesn't.
Nonprofits and public-sector clients often need findings that will hold up to funder scrutiny or inform policy decisions. Private-sector clients often need findings that will land with executive leadership who are skeptical of "soft" data. I'm comfortable working in both settings.
Can you work with data we've already collected?
Absolutely. Some of the most useful engagements start with an organization that has data sitting underanalyzed or underused. I can pick up existing survey data, focus group transcripts, program records, or prior reports and help you figure out what's actually there — and what's still missing.
How do you handle confidentiality?
All client work is covered by a consulting agreement that includes confidentiality provisions. I don't share client names, findings, or organizational details without explicit permission. Case studies on this site have been modified to protect identifying information.
For research involving individual participants — employees, community members, clients — I apply appropriate protocols to protect their privacy and ensure data is used responsibly.
Research
What does "mixed methods" actually mean?
Mixed methods means using both quantitative (numbers, surveys, statistical analysis) and qualitative (interviews, focus groups, open-ended responses) approaches, combined deliberately so that together they answer a specific question better than either approach could alone.
In practice, it usually means: surveys tell you what is happening and how widespread it is. Qualitative work tells you why, and captures the experiences and context that numbers can't. Done well, the two inform each other throughout the research process — not just at the end.
How do you actually measure something like well-being?
This is the right question, and it's one I take seriously. A lot of well-being measurement defaults to whatever's available — a validated scale that may not fit the population, a single-item satisfaction question, or whatever the survey vendor includes in their standard template. That's not the same as measuring what matters.
My approach starts from the construct: what dimension of well-being are we actually trying to understand, for whom, and in what context? From there, I select or develop instruments that fit — not the other way around. When validated scales are appropriate, I use them. When they're not, I build from the ground up using measurement development principles from my training in community psychology.
What do deliverables look like?
That depends on what you actually need. For organizational clients, the most common deliverable is an executive report — structured, clearly written, with findings presented in a way that supports decision-making rather than demonstrating methodological sophistication. I also provide technical appendices for teams that want the full analytic detail.
For nonprofit or public-sector clients reporting to funders, I can produce evaluation reports formatted to common funder requirements. For teams that need findings in a presentation format, I can deliver that too.
I don't produce deliverables that sit on a shelf. If a report isn't usable by the people it's meant for, it hasn't done its job.
How long does a project take?
It varies by scope. A focused survey design and analysis project might be completed in 4–6 weeks. A full needs assessment with qualitative components typically runs 8–14 weeks from kickoff to final report. A program evaluation timeline depends heavily on the data collection phase.
I'm clear about timelines in proposals and don't commit to deadlines I can't keep. If something changes — on either side — I'd rather surface it early than deliver late.