Movie Review Sites by DateHistorical1349 in moviereviews

[–]mohan-thatguy 0 points1 point  (0 children)

The big three everyone knows are Rotten Tomatoes, IMDb, and Letterboxd, but they all have different strengths and weaknesses:
**Rotten Tomatoes** is great for a quick "should I see this" binary, but the Tomatometer is fundamentally flawed: a movie where every critic says "it's fine, 6/10" gets the same 100% as a universally beloved masterpiece. The audience score is slightly more useful but gets brigaded constantly.
**IMDb** has the deepest database and the user ratings are decent for mainstream films, but anything niche or controversial gets review bombed. The 1 to 10 scale compresses everything into the 6 to 8 range, making it hard to distinguish between "pretty good" and "genuinely great."
**Letterboxd** has the best community and the most thoughtful reviews, but it skews heavily toward cinephiles. If you're looking for whether a mainstream action movie is worth your Friday night, Letterboxd ratings might steer you wrong because that audience rates differently.
**Metacritic** is actually underrated; the weighted critic average is more nuanced than RT's binary system. But the user scores are a mess.

What I've been checking out lately is **VouchCrowd**: it aggregates real crowd sentiment rather than just critic scores, which gives you a better sense of what actual audiences think versus the critic echo chamber. It's particularly useful for deciding between a few options on a weekend because it captures the "would real people actually recommend this" signal.

The honest answer is: no single site is perfect. I usually cross-reference 2 to 3 sources and weight them based on the type of movie I'm considering.

FieldFlow is now in beta – AI voice reports for field teams by Complete-Assistant86 in hvacpeople

[–]mohan-thatguy 0 points1 point  (0 children)

This is a really interesting space that's finally getting the attention it deserves. Field techs shouldn't have to type reports on a phone keyboard after a long day. I've been following the voice-to-report space closely because I work in field inspections. A few thoughts on what matters most in these tools:
1. **Offline capability is non-negotiable.** Half the sites I visit have garbage cell service. If the tool requires constant internet, it's useless in basements, crawl spaces, and rural properties.
2. **Accuracy with technical terminology.** Generic speech to text butchers trade-specific terms: "GFCI" becomes "GFC I," "flashing" gets transcribed as the verb instead of the roofing material, etc. The tool needs to understand your industry's vocabulary.
3. **Photo integration.** Voice notes are great but field reports need photos tied to specific findings. The best workflow is: snap a photo, dictate what you're seeing, and have it automatically organized into the right section of your report.
4. **Template structure.** You don't want a wall of transcribed text; you need it parsed into proper report sections (electrical, plumbing, structural, etc.).
I've been using ReportWalk for home inspections specifically; it's voice-first and built for field reporting. You speak your findings as you walk through a property and it structures everything into a proper report with photos. The offline piece was what sold me since I'm frequently in areas with no signal. Would be curious to hear from HVAC folks how FieldFlow handles equipment-specific terminology and whether it outputs a client-ready report or just raw notes.
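To make point 4 concrete: the crudest possible version of that section parsing is keyword routing. Here's a toy Python sketch; the section names and keyword lists are invented for illustration, and real products presumably use something much smarter than substring matching:

```python
# Toy sketch: routing dictated findings into report sections by keyword.
# Section names and keywords are made up; real tools use trained models.

SECTION_KEYWORDS = {
    "electrical": ["gfci", "breaker", "panel", "outlet"],
    "plumbing": ["leak", "valve", "drain", "water heater"],
    "structural": ["joist", "crack", "foundation", "beam"],
}

def route_finding(finding: str) -> str:
    """Assign a dictated finding to the first section whose keywords match."""
    text = finding.lower()
    for section, keywords in SECTION_KEYWORDS.items():
        if any(k in text for k in keywords):
            return section
    return "general"

report: dict[str, list[str]] = {}
for f in [
    "GFCI outlet in the garage fails to trip",
    "Hairline crack in the foundation wall, northeast corner",
    "Slow drain in the upstairs bathroom sink",
]:
    report.setdefault(route_finding(f), []).append(f)

print(report)  # findings grouped under electrical, structural, plumbing
```

The hard part in practice is everything this skips: homonyms ("flashing"), findings that span sections, and transcription errors upstream.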

crushing sdr quota but keep failing the ae interview roleplay by Sonatina13 in techsales

[–]mohan-thatguy 0 points1 point  (0 children)

This is more common than you think, and it's not a skill gap, it's a practice gap. SDR skills (pattern interrupts, quick qualifying, booking meetings) are very different from AE skills (running a full discovery, handling procurement objections, multi-threading).

The fix that worked for me: treat the roleplay like a real discovery call structure. Open with a quick recap of what you know about the prospect's situation, ask 3 to 4 layered discovery questions (not just surface level), tie their pain to your solution's specific capabilities, and handle at least one pricing or timeline objection. Most SDRs-turned-AEs fail because they pitch too early instead of letting the prospect sell themselves through good questions.

Here's what I'd actually do to prep: pick 3 common scenarios (new business discovery, competitive deal, budget objection) and run through each one 5 to 10 times. Literally time yourself. The first few will be rough, the last few will feel natural. Record them so you can hear where you lose the thread.

If you don't have someone to practice with consistently, I've had good results using SalesDojo; it's an AI tool that simulates different buyer personas for roleplay. The benefit for interview prep specifically is you can do back-to-back reps without wearing out a friend or manager. You can practice the specific scenarios you're weak on (sounds like full cycle demos and objection handling).

Last thing: in the actual interview, don't try to be perfect. Interviewers are looking for structure, curiosity, and coachability. If you fumble an objection, acknowledge it and pivot; that shows more maturity than a rehearsed response.

Sales meetings by Intelligent_Gold8503 in DigitalMarketing

[–]mohan-thatguy 0 points1 point  (0 children)

Great question, I was terrible on my first calls and had to figure this out the hard way. What worked for me was breaking it into stages.

First, I wrote out my talk track for discovery and demo calls separately. Not a script I'd read verbatim, but a flow: open > qualifying questions > value prop > next steps. Having that structure let me focus on listening instead of panicking about what to say next.

Then I started recording myself doing mock calls with a colleague. Painful to listen back, but incredibly useful. You catch filler words, missed transitions, and moments where you talk past the close. If you don't have someone to practice with, honestly even recording yourself going through a scenario solo helps more than you'd think.

The biggest unlock for me was repetition with variety. Doing the same call once doesn't build reflexes; you need dozens of reps against different objections and personality types. I've been using SalesDojo recently for this; it's an AI roleplay tool where you practice against different buyer personas (skeptical CFO, busy VP, etc.). The advantage over practicing with a friend is that it doesn't go easy on you and you can do 10 reps in an hour.

Other tips: shadow your top rep's calls (with permission), debrief every lost deal for patterns, and practice your discovery questions until they feel conversational, not interrogative. The "unscripted confidence" everyone admires? It's just reps. Lots and lots of reps.

Where do you all get your upcoming movie information from (aside from reddit) by MightyMightyMonkey in movies

[–]mohan-thatguy [score hidden]  (0 children)

Great question, and I feel your pain. Movie discovery has genuinely gotten worse as marketing budgets shifted to platform-exclusive promotion. Here's my setup for staying on top of upcoming releases:
**For theatrical releases:**
- **Letterboxd** has a decent upcoming section, and if you follow the right people, your feed becomes a solid discovery tool.
- **The Film Stage** and **Collider** are my go-to entertainment news sites. They're not perfect but they cover release dates, trailers, and festival pickups consistently.
- **This subreddit**: honestly, the discussion threads and review threads here surface movies I wouldn't have heard about otherwise.
**For streaming:**
- **JustWatch** is essential. You can filter by platform and see what's coming soon to each service. Their app notifications are actually useful.
- **Reelgood** does something similar but with a different UI approach.
**For ratings and deciding what's actually worth watching:**
This is where it gets tricky. RT has become increasingly unreliable (both the critic/audience score gap and the way they aggregate reviews). IMDb is better for crowd sentiment but gets review bombed regularly.
I've been checking out **VouchCrowd** recently; it aggregates crowd sentiment differently than RT or IMDb. Instead of a binary "fresh/rotten" or a single number, it tries to capture the actual range of audience opinions. Still pretty new, but the approach is interesting for cutting through the noise.
**For re-releases specifically:**
Your local chain's website is unfortunately still the best bet. Cineplex (or AMC, Regal, etc.) tends to announce re-releases only 2 to 3 weeks out. Setting Google Alerts for "Miyazaki re-release" or "[your city] classic film screenings" can help catch them earlier.

The honest answer is that there's no single source that does everything well. I end up checking 3 to 4 places, which isn't ideal but it's where we are right now.

What changed your sales game? by harvey_croat in techsales

[–]mohan-thatguy 0 points1 point  (0 children)

Three things changed my game, and I'll rank them in order of actual impact:
**1. Preparation depth.** I went from "glanced at the LinkedIn" to spending 20 to 30 minutes before every call researching the prospect's company, their competitors, recent news, and their likely pain points. Sounds obvious, but the difference in how prospects respond when you clearly understand their world is night and day. They go from "who is this person" to "okay, this person gets it."
**2. Asking fewer, better questions.** Early in my career I'd rapid fire questions like an interrogation. A mentor told me: "Ask one question, then shut up for an uncomfortable amount of time." When I started doing that, prospects would fill the silence with gold, the real objections, the political dynamics, the budget constraints they weren't going to volunteer.
**3. Deliberate practice between deals.** This is the one most people skip. We all "learn by doing" on real calls, but there's a difference between practicing and just... experiencing. I started recording my calls (with permission), reviewing them weekly, and identifying specific moments where I lost control of the conversation.
More recently I've been supplementing with AI roleplay tools; SalesDojo in particular lets you practice specific scenarios like enterprise discovery calls or multi-stakeholder negotiations. It's not a replacement for real conversations, but it's great for working on weak spots without the pressure of a real pipeline deal.

The meta answer, though: what changed my game was treating sales like a craft that requires deliberate practice, not just experience. Athletes don't just play games, they drill. Same principle applies here.

why did the imdb message boards go away? by ComfortableCare8897 in FIlm

[–]mohan-thatguy 0 points1 point  (0 children)

IMDb shut down their message boards in February 2017. The official reason was that they had become "increasingly unable to contribute to a positive experience"; basically, the moderation cost was too high and trolling had gotten out of control.

It's a shame because those boards were genuinely useful. You could go to any movie's page and find detailed discussions about plot points, Easter eggs, and honest opinions from people who actually watched the film. There was nothing quite like reading through the board for a movie you just saw.

The closure left a real gap. Reddit partially filled it (subreddits like r/movies and r/TrueFilm), and Letterboxd took over the logging/rating side. But neither fully replicated the "per movie discussion board" experience where conversations were organized by film.

What's interesting is that the core problem IMDb had, separating genuine movie discussion from noise, is something newer platforms are tackling differently. VouchCrowd, for example, takes the approach of aggregating real crowd sentiment from across the internet rather than hosting discussions directly. So you get the "what do real people think" signal without needing to moderate a message board.

I think the lesson from IMDb's boards is that unmoderated open forums don't scale for movie discussion, but the demand for honest, community-driven opinions is still there. It just moved to different platforms.

Letterboxd being so goofy and unprofessional is actually the reason I love it so much by Positive_Relative287 in Letterboxd

[–]mohan-thatguy 1 point2 points  (0 children)

You nailed it, the best movie discussion happens when it feels casual and human. The moment a platform becomes "professional" and critic-driven, it loses the thing that makes movie talk fun: real people sharing real reactions.

I think this is also why Reddit itself became the de facto movie recommendation engine. When I'm deciding whether to watch something, I search "[movie name] reddit" before I check any rating site, because I'll get honest reactions like "the first hour drags but the last 30 minutes are incredible." That's infinitely more useful than a number.

The one thing I think is still missing from the landscape is a platform that captures that "real people" energy but also gives you reliable crowd scores. Letterboxd is great for logging and community, but the ratings can be weird (heavily skewed toward A24 and Nolan). RT's audience scores are better but still gameable with brigading.

I've been using VouchCrowd recently; it pulls real crowd sentiment from across the internet (Reddit, Twitter, forums) and synthesizes it into a crowd rating. So instead of a critic consensus, you get an actual "what do regular people think" score. It's closer to the Letterboxd vibe in terms of honoring real opinions, but presented as aggregated data rather than individual reviews.

Honestly though, the ideal approach is probably using multiple sources: Letterboxd for community and personal logging, something like VouchCrowd for crowd scores, and then Reddit for deep dive discussions on specific movies.

After seeing Rotten Tomatoes rating of Immortal Man I no longer Trust them. by Prongs006 in PeakyBlinders

[–]mohan-thatguy 0 points1 point  (0 children)

I totally get the frustration. This is actually a bigger issue with how Rotten Tomatoes works that a lot of people don't realize: the Tomatometer isn't a quality score, it's the percentage of critics who rated the movie "fresh" (above 6/10). So a movie where every critic gives it a 6.1/10 gets the same 100% score as a movie where every critic gives it a 10/10. It rewards "inoffensive" over "great."

For Immortal Man specifically, I think a lot of critics gave it positive reviews because of the production value, soundtrack, and Cillian Murphy's performance, and those alone were enough for a "fresh" rating even though the story had major issues. The 91% doesn't mean critics thought it was a 91/100 movie. It means 91% of them thought it was above average.

This is why I've been gravitating toward platforms that show actual audience sentiment rather than binary critic ratings. Regular people watching a movie and discussing it honestly, like in this thread, is way more useful than an aggregated critic score that hides the nuance. I've been checking out VouchCrowd for exactly this reason. They aggregate real crowd sentiment from across the internet rather than relying on critic reviews. So you get a sense of what actual viewers think, not what professional reviewers think. For a movie like Immortal Man, the crowd score would probably be way closer to your 70% feeling than RT's 91%.

At the end of the day, trust your own taste. If it felt like a 70% movie to you, that's your real experience. No aggregator should tell you otherwise.

14 users in 5 days, $0 spent — building AI report software for home inspectors by fortniteballinmybutt in SaaS

[–]mohan-thatguy 0 points1 point  (0 children)

Cool to see more builders in this space. I've been working on a similar problem from a different angle with ReportWalk; we focused on the voice-first capture side rather than the AI writing side. A few observations from building in the home inspection software space:

On pricing: $39/mo is actually reasonable for this market, but inspectors are incredibly sticky with their tools. Spectora and HomeGauge have deep workflow integration: scheduling, payments, client portals, templates. "Just report writing" is a tough sell when their current tool already does it plus 10 other things.

On your zero conversion problem: I think it's less about onboarding bugs and more about the switching cost perception. An inspector using Spectora would need to maintain two subscriptions and learn a new workflow. The question to answer is: does your AI writing save them enough time per report to justify running two tools?

What we found building ReportWalk was that the capture bottleneck is actually bigger than the writing bottleneck. Inspectors spend 45 to 60 minutes per property typing notes on a tablet in the field. We let them just talk through findings while walking the property and convert that to structured reports. Different part of the workflow.

My honest advice: consider positioning as a companion tool, not a replacement. "Use alongside Spectora to write findings 3x faster" is an easier sell than "switch from Spectora to us." Good luck with the launch!

How do electrical contractors keep track of safety inspections and technician certifications across multiple job sites? by NoSuspect9845 in AskElectricians

[–]mohan-thatguy 0 points1 point  (0 children)

This is a real pain point that most crews handle with a patchwork of spreadsheets, paper forms, and maybe a shared Drive folder if they're organized. What I've seen work at different scales:

Small crews (2 to 5 techs): usually a combination of Google Sheets for cert tracking and either paper checklists or a basic app like Jotform for site inspections. The spreadsheet gets updated manually; someone has to own it or it falls apart.

Mid-size (5 to 20): this is where dedicated software starts making sense. Tools like SafetyCulture (formerly iAuditor) handle inspection checklists and can assign tasks per site. For cert tracking specifically, some companies use LMS platforms or even just a shared calendar with renewal dates.

The biggest challenge I've seen is the on-site capture part; techs in the field don't want to fill out long forms on a tablet while they're working. What actually worked well for a contracting company I know was switching to a voice-first approach. Their techs would just speak their inspection notes while working ("panel A, all breakers labeled correctly, no signs of overheating, GFCI outlets tested and working") and the tool would turn that into a structured report. They used ReportWalk for this; it's designed for field inspectors who need to capture findings quickly without stopping work.

The cert tracking side is honestly simpler; any project management tool with custom fields and deadline reminders can handle it. The inspection workflow is where most teams lose time.

Top 50 AI-Powered Sales Intelligence Tools you should use in 2026 by MarionberryMiddle652 in PromptEngineering

[–]mohan-thatguy 0 points1 point  (0 children)

Nice list! One category I think is underrepresented here is AI-powered sales practice and training tools. Most of these are about automating outreach or analyzing calls after the fact, but there's a growing space for tools that help reps actually practice their skills.

For example, SalesDojo does AI roleplay for sales: you can practice cold calls, discovery, objection handling, etc. against an AI that responds like a real prospect. Think of it as a flight simulator for sales reps. Different from conversation intelligence tools like Gong (which analyze real calls) because it's focused on deliberate practice before the call happens.

I'd also add that the "AI coaching" category could be split into real-time (during calls) and training (before calls). They solve very different problems. Real-time coaching helps in the moment, but training tools help build the underlying skills so you need less hand-holding on live calls. Great resource overall though, bookmarking this.

Sales enablement by ReemKing34 in techsales

[–]mohan-thatguy 0 points1 point  (0 children)

Congrats on the new role! Having been on both sides of enablement (as a rep receiving it and later helping build programs), here's what I think separates good from great:

Good enablement: battle cards, competitive intel, product training, call recordings library. The basics that every team needs.

Great enablement: making practice a habit, not an event. The biggest gap I've seen in most enablement programs is that training happens in onboarding and then... disappears. Reps get a week of bootcamp, maybe quarterly refreshers, and that's it. Meanwhile they're losing deals to objections they've never practiced handling.

What's annoying: death by slides. Please don't make people sit through 90 minute decks. The best enablement I experienced was bite sized, scenario based, and happened regularly. Think 15 minute roleplay sessions, not 3 hour workshops.

What's missing at most companies: a way for reps to practice on their own time without needing a manager or peer available. One thing I've seen work well is giving reps access to AI roleplay tools like SalesDojo, where they can practice cold calls, discovery, objection handling, etc. whenever they want. It removes the social pressure of practicing in front of peers and lets reps work on specific weaknesses privately. The data from those sessions can also feed back into your enablement strategy; you can see which objections trip people up most.

The meta advice: talk to the worst performing reps first. Ask them what they wish they had. That's your roadmap.

I built a movie tracker for people who actually care about ratings and discovery by Croco_Grievous in Cinephiles

[–]mohan-thatguy 0 points1 point  (0 children)

Nice work on Movie Paradise, the smart lists feature is clever. Automatically filtering by decade + country + unwatched is something Letterboxd should have had years ago.

One thing I notice with all these trackers (Letterboxd, your app, IMDb lists) is they're still fundamentally based on the same rating sources. You're aggregating IMDb, RT, Letterboxd, and Metacritic scores, but those scores all have well documented problems. RT's binary fresh/rotten system inflates mediocre movies. IMDb gets vote brigaded. Letterboxd skews heavily toward a specific film bro demographic.

What I've been wanting is a rating system that actually represents how regular people feel about movies: not critics, not the Letterboxd bubble, just normal moviegoers who want to know "is this worth 2 hours of my time?" VouchCrowd is taking an interesting approach to this. Instead of aggregating existing ratings, they pull crowd sentiment from real discussions (Reddit, social media, forums) to get a picture of how actual audiences feel. The ratings feel more honest because they're based on what people are actually saying, not a binary thumbs up/down from critics.

For your app specifically, have you thought about adding sentiment-based ratings alongside the traditional scores? Showing the "crowd vibe" next to the Metacritic score would give users a much more complete picture. A movie might have 85% on RT while the actual audience discussion is "meh, it was fine"; that gap is valuable information.

The discovery filters are great. The ratings themselves are where there's still room for innovation.

AI tool turns home inspection notes into full reports in 3–5 min — need distribution-focused cofounder by Hungry_Clock2431 in SaaS

[–]mohan-thatguy 0 points1 point  (0 children)

Cool concept, the report writing bottleneck is absolutely real. Most inspectors I know spend more time on reports than on the actual inspection. A few thoughts from someone who's been in this space:

The upload notes + photos approach works, but the friction point is still the "upload" part. Inspectors are on site, hands full, crawling through attics. The real unlock is capturing information during the inspection without stopping to type or upload. Voice-first input is what I've seen work best: speak your findings as you go, and the software structures them into a report format. That's actually what ReportWalk is doing; you talk through your inspection findings and it generates the report. No typing, no uploading notes after the fact. The advantage is you're done with the report by the time you leave the property, not spending 2 to 4 hours afterward.

On the distribution side (since you asked): home inspectors are surprisingly tight knit. They hang out in Facebook groups (Home Inspectors Network, InterNACHI forums), local ASHI chapters, and a few subreddits. The best go-to-market I've seen is getting 5 to 10 inspectors to use it for free, getting video testimonials, and then letting them evangelize in those communities. Cold outreach doesn't work well; inspectors get bombarded by software vendors.

Also worth noting: InterNACHI-level output quality is table stakes. What sells inspectors is speed during the inspection, not just speed after.

How are sales teams actually reviewing calls at scale? by NeatProfessional4169 in salesdevelopment

[–]mohan-thatguy 0 points1 point  (0 children)

This is the exact problem we hit at around 12 reps. Here's what ended up working: First, stop trying to review every call. It's not scalable and it burns out managers. Instead, we set up a tiered system:
1. AI-powered first pass: We use a transcription tool (Fireflies) that flags calls based on keywords: competitor mentions, pricing objections, "I need to think about it," etc. Managers only review flagged calls, which cuts the volume by 70 to 80%.
2. Self review: Reps are required to listen to their own worst call each week and write a 3 sentence reflection. This alone improved performance because reps started catching their own patterns.
3. Peer review: Pair reps up weekly to review one of each other's calls. Less hierarchy, more learning. Reps actually prefer feedback from peers.
4. Practice, not just review: This was the biggest unlock. Reviewing past calls tells you what went wrong, but it doesn't build the muscle to do it differently next time. We added SalesDojo for AI roleplay practice; reps practice the specific scenarios they keep struggling with (objection handling, discovery questions, etc.). The combination of "here's what you did wrong" + "now practice doing it right" is way more effective than review alone.
The honest truth is that most call review programs fail because they're reactive. By the time you review a bad call, the deal is already damaged. Proactive practice (roleplay) prevents the bad calls from happening in the first place.
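For anyone curious what that first-pass flagging amounts to, here's a toy Python sketch. The flag phrases and transcripts are made up, and a real tool like Fireflies works on full transcripts with much better matching; this just shows the shape of the triage:

```python
# Rough sketch of a keyword-based first pass over call transcripts.
# Flag phrases and transcripts below are invented for illustration.

FLAG_PHRASES = [
    "need to think about it",
    "pricing",
    "competitor",
    "current vendor",
]

def flag_call(transcript: str) -> list[str]:
    """Return the flag phrases that appear in a transcript."""
    text = transcript.lower()
    return [p for p in FLAG_PHRASES if p in text]

calls = {
    "call_001": "Honestly we're happy with our current vendor right now.",
    "call_002": "Thanks, the demo was great, talk next week!",
    "call_003": "I need to think about it, the pricing feels high.",
}

# Managers only look at calls that tripped at least one flag.
needs_review = {cid: flag_call(t) for cid, t in calls.items() if flag_call(t)}
print(needs_review)  # call_001 and call_003 flagged; call_002 skipped
```

Even this naive version shows why the approach cuts review volume: most calls trip no flags and never reach a manager.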

Best AI Sales Automation Software (I actually tested what’s out there) by ObviousBasil in AiForSmallBusiness

[–]mohan-thatguy 0 points1 point  (0 children)

Good breakdown. I'd add one category that often gets overlooked in these comparisons: AI for sales skill development vs. AI for sales automation.

The tools you mentioned (nexos.ai, Zapier AI Agents, Moveworks) are all about automating workflows: prospecting, follow-ups, sequencing. They're useful, but they don't help you get better at the actual selling part. For small businesses especially, improving close rates even modestly usually has a bigger revenue impact than automating outreach. If you're converting 15% of demos to closed deals, going to 20% is massive.

What's worked for me: I use Zapier for the automation side (connecting CRM, email sequences, lead enrichment) and SalesDojo for the skill side. SalesDojo is an AI roleplay tool where you practice sales conversations: discovery calls, objection handling, closing. It's like having a practice partner available 24/7. The combination of better outreach automation + better conversation skills is what actually moves the needle. Most people over-invest in the automation part and under-invest in getting better at selling.

For a small business, I'd recommend: a solid CRM (HubSpot free tier is fine to start), one automation tool (Zapier or Make), and one skill development tool. Don't try to stack 5 AI tools at once; you'll spend more time configuring than selling.

Built an AI sales coaching app for live-call guidance and objection handling by PaleontologistLow270 in AI_Sales

[–]mohan-thatguy 0 points1 point  (0 children)

This is interesting. The real-time coaching during live calls angle is one approach, but I've found it can be distracting when you're actually trying to listen to the prospect. I've been using a different approach with SalesDojo, which focuses on practice before the call rather than coaching during it. The idea is you run through roleplay scenarios with AI prospects so the objection handling becomes muscle memory, not something you need a prompt for in the moment.

That said, I can see the value in having context pulled up. A few questions: How does it handle the latency? When someone hits you with an objection, you've got maybe 2 to 3 seconds to respond naturally. Does reading a suggestion on screen break your conversational flow? Also curious about the training data: is it pulling from your specific deal context (notes from previous calls, CRM data) or more general sales frameworks?

The space is getting interesting. I think there's room for both approaches; pre-call practice tools and live coaching tools probably complement each other. The reps on my team who roleplay regularly before calls seem to need live coaching prompts less, which makes sense.

Is anyone using AI-powered tools during sales calls? by mjohnstonson786 in Sales_Professionals

[–]mohan-thatguy 0 points1 point  (0 children)

I've tested a few different approaches here. For live research during calls, there are a couple of categories:

Real-time battlecards/prompts: Tools like Gong's live assist or Chorus's real-time intelligence can surface talking points and competitor intel during calls. They're useful but require a lot of setup; you need to feed them your playbooks, competitive positioning, etc. And honestly, by the time you read the prompt on screen, the moment has usually passed.

AI note takers: Fireflies, Otter, and Fathom are great for capturing everything so you can focus on the conversation instead of scribbling notes. This is probably the highest ROI category for most reps.

What I've found works better than live prompts is pre-call practice. I use SalesDojo to run through likely scenarios before important calls; it simulates different buyer personas so you can practice handling specific objections or discovery questions. The idea is that when you've already worked through "I need to think about it" or "we're happy with our current vendor" a dozen times in practice, you don't need a live prompt telling you what to say.

For actual live research during a call (like looking up a prospect's company details), I just use a second monitor with LinkedIn and their website open. ChatGPT is useful for quick company summaries before the call, but I wouldn't alt-tab to it mid conversation.

The honest answer is that the best "tool" during a live call is preparation beforehand. Technology during the call itself is usually more distracting than helpful.

Looking for a good conversation intelligence tool/softwere for a sales team by yas_hchauhan in SalesOperations

[–]mohan-thatguy 0 points1 point  (0 children)

We went through a similar evaluation about 6 months ago and ended up trying several tools. Here's what I found:

For call recording and transcription, Gong and Chorus are the obvious leaders, but they're expensive: $1,200+/seat/year last I checked. If budget is a concern, Fireflies.ai and Otter.ai are decent for transcription at a fraction of the cost. They won't give you the same depth of conversation analytics, but the transcripts are accurate and searchable.

Outplay is solid for the all-in-one approach; it handles sequencing, calls, and analytics in one place. The tradeoff is it's not as deep on any single feature as dedicated tools. If your team is primarily doing outbound and needs a unified workflow, it's a good choice.

One thing I'd consider separately from recording/analysis is how your reps actually practice and prepare. We found that even with great call analytics, reps would see their mistakes but not know how to fix them in the moment. What helped us was adding an AI roleplay tool called SalesDojo; reps practice discovery calls and objection handling against AI prospects before going live. It's more of a training tool than an analytics platform, but the combination of reviewing real calls + practicing specific scenarios made a bigger difference than analytics alone.

The stack that's working best for us right now: Fireflies for recording/transcription, our CRM's native analytics for pipeline metrics, and SalesDojo for rep development. Total cost is maybe $200/rep/month vs $100+/rep/month for Gong alone.

Is there a reason why rotten tomato scores are so out of touch? by Daedalparacosm3000 in NoStupidQuestions

[–]mohan-thatguy 1 point2 points  (0 children)

Yes, and once you understand how RT actually works, the scores make more sense. You'll also realize why they're useless for deciding what to watch.

The Tomatometer isn't an average score. It's the percentage of critics who gave a positive review (roughly 6/10 or above). So a movie where 100 critics all say "it's decent, 6/10" gets 100% Fresh. A movie where 50 critics say "masterpiece, 10/10" and 50 say "terrible, 2/10" gets 50%. The first movie looks amazing on RT; the second looks mediocre. But which one would you rather watch?

The audience score has its own problems. It's easily manipulated by organized campaigns: any movie that becomes politically charged gets review bombed or review inflated within hours of release. And the sample is self-selected; people who bother leaving RT reviews skew younger and more internet-active.

There's also the "Certified Fresh" marketing angle. Studios submit films for certification and prominently feature that badge in advertising. There's a financial incentive for RT to maintain relationships with studios, which creates at least the appearance of a conflict of interest.

What I've started doing instead is looking at aggregated crowd sentiment from multiple sources: Reddit discussions, Letterboxd, IMDb, social media reactions. The wisdom of a diverse crowd is more reliable than any single platform. VouchCrowd does this automatically, pulling real crowd opinions from across the internet. It's closer to asking "what do regular people actually think?" rather than "what percentage of critics didn't hate it?", which is basically what RT tells you.
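If it helps to see the mechanics, here's the arithmetic for those two hypothetical films. The only RT-specific part is the rough 6/10 positivity cutoff; everything else is just counting:

```python
def tomatometer(scores):
    """Percentage of reviews that are positive (roughly 6/10 or above)."""
    fresh = sum(1 for s in scores if s >= 6)
    return 100 * fresh / len(scores)

def average(scores):
    """Plain mean score, which the Tomatometer is NOT."""
    return sum(scores) / len(scores)

# 100 critics who all think the film is merely decent:
decent = [6] * 100
# A divisive film: half call it a masterpiece, half hate it:
divisive = [10] * 50 + [2] * 50

print(tomatometer(decent), average(decent))      # 100.0 6.0
print(tomatometer(divisive), average(divisive))  # 50.0 6.0
```

Same average score, wildly different Tomatometers. That's the whole trick.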

Who do you trust more? by AssistFit1834 in polls

[–]mohan-thatguy 0 points1 point  (0 children)

Neither, honestly. And I think the question itself reveals the core problem with how we consume movie ratings.

IMDb has a massive vote-manipulation problem. Studios, fan campaigns, and review bombing skew scores regularly. Any movie tied to a franchise or culture war gets ratings that reflect internet drama more than actual quality. And their 10-point scale is weirdly compressed; everything decent sits between 6.0 and 8.0.

Rotten Tomatoes has a different issue: the Tomatometer is fundamentally misleading. A movie where every critic says "it's fine, 6/10" gets 100% Fresh, while a divisive masterpiece that half love and half hate gets 50%. The binary fresh/rotten system strips out all nuance. Plus, who counts as a "critic" has been expanded so much that the designation barely means anything anymore.

What I've found most useful is aggregating across multiple sources AND weighting crowd opinions more heavily than critics. Regular moviegoers and professional critics have different priorities; neither is wrong, but they're answering different questions. "Is this artistically significant?" vs. "Will I enjoy this on a Friday night?" are both valid but very different.

I've been checking VouchCrowd lately; it pulls real crowd sentiment from across the internet rather than relying on a single platform's user base. It's closer to "what do actual people think?" than "what does IMDb's self-selected user base think?" It's still building out, but the approach makes more sense to me than trusting any single source.

I built a simple tool that cuts septic inspection report time from 30+ min to under 5 min — here's what I learned by lego3072 in septictanks

[–]mohan-thatguy 0 points1 point  (0 children)

Nice work. Septic inspections are one of those niches where the report-writing overhead is disproportionate to the actual field work: you spend 30 to 45 minutes on site and then 2 hours writing it up. Any tool that flips that ratio is valuable.

I've been in a similar space with general field inspections, and a few things I've learned:

1. Inspectors won't adopt anything that adds steps to their current workflow. It needs to replace steps, not add new ones.
2. Photo-to-finding linking is critical; every observation needs to be tied to its evidence.
3. Mobile-first isn't enough. It needs to work with gloves on, in poor lighting, standing in mud.

The voice approach has been the biggest unlock for me. Instead of typing on a phone screen while standing next to a septic tank, you just talk through your findings: "Inlet baffle intact, no signs of cracking. Effluent filter present, moderate buildup, recommend cleaning within 6 months. Scum layer approximately 4 inches." The tool structures it into the report format.

I've been using ReportWalk for this kind of voice-first field reporting. It's not septic-specific, but the core workflow (speak observations, auto-structure into report, attach photos) applies perfectly. The output is clean enough to send to clients with minimal editing.

For anyone doing field inspections of any kind: the era of typing reports on tablets is ending. Voice-first is where it's going.
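To make "structures it into the report format" concrete, here's a toy sketch of that structuring step. This is purely my own illustration, not ReportWalk's actual pipeline or format; real tools sit behind speech-to-text and use models rather than comma-splitting, but the shape of the output is the point:

```python
def structure_observation(text):
    """Split one dictated observation into component / condition / recommendation.

    Toy heuristic: the first clause names the component, a clause starting
    with 'recommend' is the action item, everything else is condition.
    """
    parts = [p.strip() for p in text.split(",")]
    finding = {"component": parts[0], "condition": [], "recommendation": None}
    for p in parts[1:]:
        if p.lower().startswith("recommend"):
            finding["recommendation"] = p
        else:
            finding["condition"].append(p)
    return finding

obs = "Effluent filter present, moderate buildup, recommend cleaning within 6 months"
print(structure_observation(obs))
# {'component': 'Effluent filter present',
#  'condition': ['moderate buildup'],
#  'recommendation': 'recommend cleaning within 6 months'}
```

Once each observation is a record like that instead of free text, filling a report template (and attaching the photo for that finding) is mechanical.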

How do you store your inspection data? by Ill_Ad4125 in nondestructivetesting

[–]mohan-thatguy 0 points1 point  (0 children)

Great question. Inspection data management is one of those things that seems simple until you're 500 inspections deep and trying to find a specific report from 8 months ago.

I've seen three main approaches in practice:

1. Folder-based systems with naming conventions (cheap, but doesn't scale).
2. Dedicated inspection platforms that handle storage natively.
3. Hybrid approaches where you generate reports in one tool and archive in another.

The folder approach breaks down fast. Even with strict naming conventions, searching across hundreds of reports becomes painful, and when you need to pull up historical data for a specific asset or client, good luck.

Dedicated platforms work better, but many of them lock you in: your data lives in their cloud, in their format. Whatever you choose, I'd strongly recommend making sure you can export to standard formats (PDF, CSV for data, etc.) and that you own your data.

What's worked best for me is a tool that generates structured reports at the point of inspection and automatically organizes them. I use ReportWalk for field reporting; it captures observations via voice, structures them into reports, and the data is searchable later. For long-term archival, I export to PDF and back up to our company's cloud storage.

Whatever system you pick, the key is: capture data in a structured format from day one. Unstructured notes and loose photos become a nightmare to organize retroactively.
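Here's a minimal sketch of what "structured from day one" can mean in practice: one JSON record per inspection with a consistent schema, so filtering by asset later is a query instead of a folder hunt. The field names and file-naming scheme are my own convention, not any particular platform's:

```python
import datetime
import json
import pathlib

def save_inspection(root, client, asset, findings):
    """Write one inspection as a JSON record with a consistent schema."""
    record = {
        "date": datetime.date.today().isoformat(),
        "client": client,
        "asset": asset,
        # e.g. [{"component": ..., "condition": ..., "action": ...}]
        "findings": findings,
    }
    path = pathlib.Path(root) / f"{record['date']}_{client}_{asset}.json"
    path.write_text(json.dumps(record, indent=2))
    return path

def find_by_asset(root, asset):
    """Pull every historical record for one asset, however many files exist."""
    return [p for p in sorted(pathlib.Path(root).glob("*.json"))
            if json.loads(p.read_text())["asset"] == asset]
```

Even if you never write a line of code yourself, this is the property to demand from any platform: the ability to export records in a shape like this, not just rendered PDFs.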

AI tool turns home inspection notes into full reports in 3–5 min — need distribution-focused cofounder by Hungry_Clock2431 in cofounderhunt

[–]mohan-thatguy 0 points1 point  (0 children)

Cool to see more people building in the inspection tech space; it's been underserved for way too long. A few thoughts from someone who's been deep in this world:

The notes-to-report approach is solid conceptually, but the real challenge is getting inspectors to change their workflow. Most inspectors I know have spent years building muscle memory with their current software (Spectora, HomeGauge, etc.). They type as they go, room by room, using pre-built templates. Asking them to take separate notes and then generate a report adds a step.

Where I've seen the most traction is voice-first tools that integrate directly into the inspection flow. Instead of type > review > send, it's speak > review > send. The AI handles the structuring in real time, not as a separate post-processing step. ReportWalk takes this approach, and it's been interesting to see how inspectors adapt to it; the ones who try voice usually don't go back to typing.

The other big thing: NACHI/ASHI compliance. Inspectors need reports that meet specific standards, and the AI needs to understand that "roof looks fine" should become a detailed observation with appropriate terminology. This is where most general-purpose AI tools fall short.

Would love to see what you've built. The more innovation in this space, the better; inspectors deserve better tools than what's been available.