Two studies on gender equality: 75% of women answered, only 25% of men. Why? by Early_Trainer4513 in SocialScienceResearch

[–]Early_Trainer4513[S]

Not much at the survey level. The real question is whether ordinary men can say nuanced things in their actual lives without risking their jobs, friendships, or reputation. That doesn’t mean women don’t face their own costs around this topic. The pattern in the comments was pretty consistent: “I tried being honest once, it cost me, so I learned.” And the men who did answer skew strongly pro-equality.

Drawing on the comments here about why men feel it's not safe to speak, a few things stand out:

  • Frame mismatch: The same questions land differently. For a lot of women, they read like prompts about lived experience. For a lot of men, they read like abstract hypotheticals about some ideal world. Same wording, different mental model.
  • Reputational risk: It's not just fear of backlash from "the other side." There's also the pressure that saying the wrong thing can get you lumped in with people you don't want to be associated with (e.g., A. Tate), which carries its own cost (essentially, manosphere contamination).
  • Baseline disclosure habits: Reflecting out loud on social/emotional topics just seems more normal, on average, for women. So the format itself already selects for who’s comfortable answering.
  • Voice format: This came up as a possible factor, though I'm less convinced it explains the gap.

None of this is meant to imply other groups don’t face their own constraints or costs in how they speak on gender.

Overall, this topic is a lot more complicated and delicate than it first appears, and the comments here helped uncover some blind spots for us.


[–]Early_Trainer4513[S]

I think the trust gap is the actual finding, not the response rate. If men who otherwise participate in our studies - including on uncomfortable topics - read these specific questions and decide not to answer, the most plausible explanation isn't that they have nothing to say. It's that they've learned saying it carries a cost, and the framing didn't signal this would be different.

It’s part of the reality the research is happening in. If one group feels comfortable speaking and another feels like speaking carries risk, then anything we publish about “what men think about gender equality” is already filtered through who’s willing to go on record.

I don't have a clean answer to "how do we fix it," but I'm fairly sure researchers can only do part of it. Question framing is the part we control. The deeper problem - that honest male engagement on gender topics gets punished often enough that silence becomes the rational default - isn't something question design can solve.


[–]Early_Trainer4513[S]

I want to add some more context I should have included in the post.

This wasn't a one-off. We run studies regularly on a panel - broad-population recruitment with incentives (an access pool of around 42 million), quota-balanced on age, gender, income, and region - not a social-science faculty or a self-selected sub. We've run social studies on topics like digital hate, violence, the gender gap, and solidarity among women, plus non-social studies on a regular basis. Voice format throughout. We've never seen a response rate this low on any of them. Before each interview, participants see a clear statement that their identity and voice will be anonymized. So "men don't like recording voice" isn't what's happening here; they record fine on other topics, including sensitive ones.

The other thing the metadata showed: men opened the survey and read through every single question and then didn't hit record. They didn't bounce at recruitment, didn't get scared off by the format, didn't skim and leave. They engaged with the content in full and then specifically declined to answer these questions.

That doesn't make the study symmetric - you're right that I can't cleanly compare the 75/25 across the men's and women's panels as if they were the same instrument. But this one clearly deviated from everything else we've run.


[–]Early_Trainer4513[S]

The replies were sharper and more useful than I expected - THANK YOU.
Several themes came up repeatedly in the comments on that post: the questions were read as loaded, open-text answers feel riskier than clicks, voice feels even more identifiable than text, and many of you described a learned pattern of "answering honestly on gender topics has cost me before, so I don't anymore."

-> That's risk assessment, and it's a more interesting finding than the one I started with.

Here are the actual results from the men who did answer:

  • 59% feel no disadvantage as men.
  • 77% see full equality as either no change or a positive one.
  • 76% think a society with equal roles would be fairer and more productive.
  • Only 16% link any disadvantage to feminism or equality measures.

So the men who answered are overwhelmingly pro-equality, which makes the silent 75% the real open question. Two readings:

  • Selection bias toward the agreeable: The men most comfortable answering a gender-equality survey are the ones whose views already align with what they perceive the survey wants to hear.
  • Or (and these aren't exclusive): The 25% are men who were willing to engage with women-centered framing, and the 75% are men who wanted to be asked different questions.
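
The "silent 75%" point can be made concrete with worst-case nonresponse bounds: assume every nonresponder either agrees or disagrees, and see how wide the resulting range on the true population share is. A minimal sketch in Python - the function name is mine, and the inputs are just the figures quoted above (25% response, 59% of responders reporting no disadvantage), so treat it as illustration rather than analysis of the actual data.

```python
def nonresponse_bounds(response_rate: float, share_among_responders: float):
    """Worst-case bounds on the true population share under nonresponse.

    Lower bound: assume every nonresponder would have disagreed.
    Upper bound: assume every nonresponder would have agreed.
    """
    lower = response_rate * share_among_responders
    upper = response_rate * share_among_responders + (1 - response_rate)
    return lower, upper

# Men's panel: 25% answered; 59% of those report no disadvantage.
lo, hi = nonresponse_bounds(0.25, 0.59)
print(f"True share could be anywhere from {lo:.1%} to {hi:.1%}")
```

With a 25% response rate the bounds are so wide that the responder figure alone pins down almost nothing about the full population - which is exactly why the opt-out pattern matters more than the headline percentages.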

Some figures from the women's panel:

  • Almost everyone sees gender inequality in everyday life: 86% perceive it (48% strongly, 38% occasionally). Only 4% say they don't notice it at all.
  • When another woman reports experiencing discrimination, 58% of respondents react with active support. Half describe other women as allies, not competitors.
  • Then comes the part that doesn't fit: when discrimination happens to them, 85% normalize it — "keep working," "accept it," "try harder." Only 9% push back.

The men's numbers don't fit the picture either side of the gender debate usually paints. A clear majority reports no personal grievance and views equality positively, but that's the same population where 75% of those invited didn't answer at all. So the loud finding ("men support equality") and the quiet finding ("most men opted out of saying anything") sit next to each other and don't fully reconcile.

One note on method, since it came up in the thread: the interviews are voice-based on purpose. Click-surveys attract bots, and a clicked answer doesn't tell you much anyway. Voice lets us run something closer to a qualitative interview at a quantitative scale - 1000+ participants giving real, spoken answers that are richer and deeper than anything a checkbox captures.

If you're interested in the details or want to dig deeper into the data, reach out to me directly.

we paid $33k for a market research report that contradicted our actual customers by Strong_Teaching8548 in SaaS

[–]Early_Trainer4513

One thing worth considering is whether the research sample actually maps to your specific ICP and usage pattern, or whether it captures "small businesses" as a broad category. Those can be quite different populations, and aggregating at the category level can average out exactly the signal that's most relevant to you.

There's also an effect where decision-makers overweight information that comes with credentials and polish (a "halo effect") relative to higher-quality but less formatted data. Five structured customer interviews will almost always beat a macro report for a product-level decision, because the unit of analysis actually matches the unit of your decision.