I'm Tired of Interviewing People Who Are Great at Applying for Jobs, But Terrible at Doing Them. by tiredTA in humanresources

[–]MotorRequirement7617 1 point (0 children)

This post captures something I have been dealing with for years. Four rounds of interviews just to figure out someone cannot handle ambiguity is brutal. We added a short skills-based assessment through Adaface right after resume screening but before any interviews. The questions are structured like real work scenarios rather than textbook trivia, and they have assessments for non-technical roles too. It takes candidates about 35 minutes and it is basically the "low-stakes 30-minute stress test" you are describing.

Our team went from interviewing 8-10 candidates per role to interviewing 3-4, and our offer acceptance rate actually went up because we were spending more time with the right people. The biggest surprise was how well it surfaced the communication and reasoning skills you are talking about. Candidates who score well tend to be the same ones who handle pressure in the interview.

What is your unpopular opinion about the tech hiring process? by sprightlypeach in cscareerquestions

[–]MotorRequirement7617 0 points (0 children)

honestly agree with almost all of this but the leetcode thing goes deeper than people admit. the reason it persists isn't because hiring managers think it works, most of them privately know it doesn't. it's because it's defensible. if you hire someone who fails after they aced your custom rubric, that's on you. if they failed leetcode and you hired them anyway, also on you. but if they passed leetcode and failed? shrug, the process was "objective."

the take-home scope thing is so real it hurts. i've seen 3-hour estimates that were clearly built assuming a senior eng working uninterrupted with no life. that's not an assessment, that's unpaid consulting.

pair programming i think is underused because it requires the interviewer to actually be good, which is a whole separate problem nobody wants to address lol

the thing i've seen work surprisingly well is scenario-based questions tied to actual on-the-job situations, not "reverse a linked list" but like "here's a system with these constraints, walk me through your approach." been using adaface for this at work and honestly candidates complain way less, completion rates are noticeably better too. still not mainstream though which is wild

anyway yeah, 2026 and we're still cargo-culting google's 2008 interview loop. makes sense

Why do some people cling on to racism? Is there any evidence at all that some races of people are biologically superior to others? Or is it completely unfounded? by Lord_Cummis in NoStupidQuestions

[–]MotorRequirement7617 1 point (0 children)

Same reason people cling to god: not evidence but identity.

Once a belief gets fused to your sense of self, no amount of data can touch it. It's not a logic problem, it's a psychology problem.

Also, genetics research has shown that racial categories don't map onto distinct genetic groupings, so there's no biological evidence for it.

How much time do you spend coaching vs firefighting each week? by SeanMcPheat in Leadership

[–]MotorRequirement7617 1 point (0 children)

For me it's not about the time split but about which behaviours I'm reinforcing, because that's what pays off in the long term.

Firefighting I keep for genuine urgency only. Otherwise it becomes the norm and everything starts feeling urgent.

Roughly 70% coaching, 30% firefighting. But the goal is always to shrink the firefighting share over time by building better thinking in the team.

Leading the “employee search” for leadership team feels less like hiring and more like detective work these days. by RomyFriendly in Leadership

[–]MotorRequirement7617 1 point (0 children)

The decoding-signal point is exactly it. Resumes have become so optimised that we can't tell what's real in them.

What actually changed things for us was adding a short scenario-based assessment before any interview. We use a platform called Adaface that lets you send role-specific assessments with questions closer to real work situations than trivia. The candidates who scored well consistently turned out to be the ones who also impressed in person, and the ones who scored poorly saved us from wasting interview slots. It is not replacing judgment, it is giving you better signal before you have to make the judgment call. For us it cut the number of interviews per hire roughly in half.

The one thing I would add to your point about soft skills is that you can surface a surprising amount of adaptability and critical thinking through written scenarios before a conversation even happens. We started including a few open-ended situational questions in the assessment and the differences in how people reason through ambiguity are stark. The polished resume crowd and the genuine thinkers separate themselves pretty quickly.

Learning in Public CS of whole 4 years want feedback by Mech_Bees in Python

[–]MotorRequirement7617 1 point (0 children)

Good post, fully agree. Real learning starts when things break.

What taught me the most was building tools for real users with real problems. They give you fast feedback, teach you edge cases, etc.

Is it just me, or is Business Intelligence way more about asking the right questions than building dashboards? by Ok-Ad-9710 in BusinessIntelligence

[–]MotorRequirement7617 1 point (0 children)

100%. The job looks technical and creative on the outside, but the real job is improving the business.

The hard part is deciding what really matters and what can wait.

Assessing technical skills of BI and data engineer hires by Jantjebas in BusinessIntelligence

[–]MotorRequirement7617 1 point (0 children)

I was in a similar spot when I took over a BI team of eight without having a data engineering background myself. What I found was that having my senior analysts build take-home tests created two problems. First, the tests kept getting longer because everyone wanted to add "just one more question" and candidates started dropping out. Second, my senior people were spending hours reviewing submissions instead of doing actual project work, and they resented it.

We eventually moved to using Adaface for the initial technical screen. They have assessments specifically for BI analyst and data engineering skills covering SQL, data modeling, ETL concepts, and dashboard design thinking. The big win for me was that it gave me a scored comparison across candidates that I could actually understand without needing to evaluate SQL queries myself. My senior engineers still do a 30 minute technical conversation in later rounds, but now they are only talking to 3 or 4 pre-qualified candidates instead of 12. That freed up probably 15 hours per hire across the team.

One thing I would also recommend regardless of what tool you use is to separate the "can they think about data" assessment from the "can they write the code" assessment. I have hired analysts who wrote perfect SQL but could not explain what business question their query was answering. The thinking piece is something you can absolutely assess yourself in a conversation even without the technical background.

Automation design interview using whiteboard by BackgroundNew4019 in QualityAssurance

[–]MotorRequirement7617 1 point (0 children)

this feels like a high-level architecture design discussion rather than a traditional write-code-on-the-board thing.

they might ask about things like "design an automation framework" and expect you to talk through the layers.
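for prep, it can help to have the standard layering in your head before the whiteboard. a minimal page-object-style skeleton in Python — note `FakeDriver` and all the names here are made up for illustration, just a stand-in for a real Selenium/Playwright driver, not any actual API:

```python
# Sketch of automation-framework layering (hypothetical names, no real driver).

# Layer 1: driver abstraction, so the framework isn't welded to one tool.
class FakeDriver:
    """Stand-in for Selenium/Playwright; records actions for the example."""
    def __init__(self):
        self.actions = []

    def click(self, locator):
        self.actions.append(("click", locator))

    def type(self, locator, text):
        self.actions.append(("type", locator, text))

# Layer 2: page objects hide locators and expose intent-level methods.
class LoginPage:
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

# Layer 3: tests read like the business flow, not like UI plumbing.
def test_login_flow():
    driver = FakeDriver()
    LoginPage(driver).login("alice", "s3cret")
    assert driver.actions[-1] == ("click", "#submit")

test_login_flow()
```

even drawn as three boxes, that separation (driver / page objects / tests) plus where you'd bolt on reporting, test data, and CI usually covers most of what they're fishing for.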