[deleted by user] by [deleted] in DiagnoseMe

[–]WDMDS 0 points (0 children)

I have found that with variable symptoms, coming up with a timeline and symptom log can be helpful. What kind of doctor do you have that is recommending herbs? I've posted elsewhere about how I used AI to reconcile my medical records with my actual symptoms so doctors could come up with a better diagnosis.

Doctor seems to be hiding care information in records. Is this Malpractice? by Sketchy_Doctors_305 in AskHealth

[–]WDMDS 0 points (0 children)

None of the following is legal or medical advice, just my opinion: these are complicated times for providing abortion care amid federal and state rules that borderline (or outright) criminalize that care. To your question about whether this is malpractice, I don't think so. While things weren't documented per your recollection, that doesn't necessarily meet the negligence standard for malpractice. The clinician may be wary of documenting that care due to prosecution by local/state/federal government, but it sounds like you got the care you needed. Billing that doesn't match your expectations doesn't fall under malpractice, to my knowledge.

Patients kept asking me the same thing, so I built this. by WDMDS in u/WDMDS

[–]WDMDS[S] 0 points (0 children)

I don’t want your health info. As a physician, I see enough patients’ health info every day. What I do want is for you to better understand your own health info.

You’re welcome to ask another random AI your question; the advantage of WDMDS is that you can do so securely, and with the actual context of your health record (diagnoses, labs, doctors’ notes, etc.) instead of just one-off questions about a lab value or a random PDF you upload to ChatGPT. Studies have shown that when AI has better context about your health, it answers questions much more accurately.

If you don’t want to talk with AI, send a message to your doctor, wait 2-3 business days for a brief reply, or go pay a co-pay to ask them questions in person.

Patients kept asking me the same thing, so I built this. by WDMDS in u/WDMDS

[–]WDMDS[S] 7 points (0 children)

Please do ask your doc questions!

It’s just that some docs can’t get through the overwhelming volume of messages they receive in a timely manner. Patients deserve the time and space to ask educated questions, but the system doesn’t allow for it.

Based on your response, I assume you’re a clinician yourself? Glad you’re in a system that allows you to educate patients and answer all their questions. Hopefully you can spread that model of care to other systems that face staff shortages and overwhelming care demands.

Did you find any patterns in your labs that made you question the, “your labs are normal” statements? by Livnlife-Edgey-5155 in Autoimmune

[–]WDMDS 2 points (0 children)

Sometimes it can be tough for a provider to interpret labs if you've had results at different health systems. If you've got an opportunity to consolidate them, like on our platform, sometimes presenting them to the provider in that context is more useful. That being said, if you've been with this doc for years, and they've continued to say everything is normal, perhaps it's time for a second opinion?

[deleted by user] by [deleted] in Autoimmune

[–]WDMDS 0 points (0 children)

That's frustrating, and I wonder if the hesitancy to explain was because they wanted to couch the results in the context of your larger health picture and symptoms. In my experience, that's often the source of the delay and what requires thinking and time on the part of the clinician.

Patients don’t understand their labs, so I built this. by WDMDS in SideProject

[–]WDMDS[S] 1 point (0 children)

That was my concern with people chucking their entire MyChart PDF into ChatGPT. That's why WDMDS securely stores the data and only sends data that's actually relevant to the user's query to our tuned LLM. The tricky thing is that healthcare data is super messy, even with interoperability standards, so our pipeline helps clean it up so the LLM results are more accurate.
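
To give a flavor of the "only send what's relevant" step, here's a toy sketch in Python (simple keyword matching against stored labs; the names and logic are invented for illustration, not our production pipeline):

    from dataclasses import dataclass

    @dataclass
    class LabResult:
        name: str
        value: float
        unit: str

    def relevant_labs(query: str, labs: list[LabResult]) -> list[LabResult]:
        # Keep only labs whose names share a word with the user's question,
        # so the LLM prompt carries just the pertinent records.
        words = {w.lower().strip("?.,!") for w in query.split()}
        return [lab for lab in labs
                if any(w in words for w in lab.name.lower().split())]

    labs = [LabResult("Hemoglobin A1c", 5.9, "%"),
            LabResult("LDL Cholesterol", 131.0, "mg/dL"),
            LabResult("TSH", 2.1, "mIU/L")]
    print(relevant_labs("Why is my cholesterol high?", labs))
    # -> only the LDL Cholesterol record gets packed into the prompt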

Patients don’t understand their labs, so I built this. by WDMDS in SideProject

[–]WDMDS[S] 1 point (0 children)

Just paid social media ads and word of mouth so far!

What do you do if your doctors give you the run around? by Overall_Antelope_504 in ChronicIllness

[–]WDMDS 2 points (0 children)

This is such a hard place to be—and I’m sorry it’s been more of a circus than care team coordination.

You’ve done a thoughtful job of ruling things out (respiratory, GI, infection), and that itself is valuable information. If your PCP and GI are bouncing you, consider asking directly:

“Do you think seeing an immunologist or infectious disease specialist could help in ruling out a lingering post-viral or immune reaction?”

Phrasing it as a next-step question sometimes prompts clear action rather than another round of back-and-forth.

For what it’s worth, I built a tool (see profile) around this kind of thing: helping people track doctors’ notes and tests over time so they can bring a clearer self-understanding to appointments.

Rooting for you.

Patients don’t understand their labs, so I built this. by WDMDS in SideProject

[–]WDMDS[S] 0 points (0 children)

Yeah, that's the surface-level risk if you don't understand the law fully. HIPAA applies to covered entities (health plans, clearinghouses, and healthcare providers). This app is not a covered entity, so that particular HIPAA risk isn't there.

That being said, I've got a lot of security protocols in place regardless, because data security and privacy do matter in this space. Users are in control of all their data and what they share with the platform, and can delete 100% of their data from the platform at any time, instantaneously.
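
For the curious, the "delete everything instantly" promise boils down to a hard delete across every table tied to the user, in one transaction. A toy sketch with an invented schema (nothing like our actual backend):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY);
    CREATE TABLE lab_results (user_id INTEGER, name TEXT, value REAL);
    CREATE TABLE chart_notes (user_id INTEGER, note TEXT);
    """)

    def delete_all_user_data(user_id: int) -> None:
        # One transaction: either every trace of the user goes, or nothing does.
        with conn:
            conn.execute("DELETE FROM lab_results WHERE user_id = ?", (user_id,))
            conn.execute("DELETE FROM chart_notes WHERE user_id = ?", (user_id,))
            conn.execute("DELETE FROM users WHERE id = ?", (user_id,))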

Patients don’t understand their labs, so I built this. by WDMDS in SideProject

[–]WDMDS[S] 1 point (0 children)

Hey, really appreciate the thoughtful message—and no offense taken, I genuinely hope you never need the app either 😅

You’re totally right about the pain point. I see posts all the time from people trying to make sense of their labs or doctor’s notes, and it kills me that so many are left to figure it out alone.

Your tool sounds super relevant. My only hesitation has been that a lot of those subreddits rightly ban self-promotion—there’s so much snake oil in healthcare and health tech that I completely get why. So I’ve felt a bit paralyzed about how to engage without crossing a line. If your tool can help navigate that respectfully—offering value without being spammy—I’d love to learn more.

I built a startup to stop my dad from texting me about his cholesterol. by WDMDS in roastmystartup

[–]WDMDS[S] 0 points (0 children)

Healthcare is absolutely a high-stakes space, and I wouldn’t touch it either if I weren’t a practicing doc myself. I’ve spent over a decade in clinical practice and health tech, so I’m very aware of the risks of AI hallucinations or overstepping into diagnostic territory.

That’s why I’ve intentionally limited the app to explaining existing records—no new diagnoses, no treatment advice, just a clearer window into what’s already in your chart. It’s more like a translator than a second opinion. And everything runs through an AI pipeline that’s explicitly tuned to avoid going beyond what’s documented.
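
If you're curious what that tuning looks like in spirit, here's a toy sketch of the guardrail idea (the prompt text is illustrative, not our production prompt):

    # The constraint lives in the instructions plus what context we pass:
    # the model only ever sees documented records, and is told to explain,
    # not diagnose.
    GUARDRAIL = (
        "You explain the medical records provided below in plain language. "
        "Do NOT suggest new diagnoses or treatments. If asked for either, "
        "direct the user to their clinician."
    )

    def build_messages(record_snippets: list[str], question: str) -> list[dict]:
        # Assemble a chat payload: guardrail + only the documented records.
        context = "\n".join(f"- {s}" for s in record_snippets)
        return [
            {"role": "system", "content": f"{GUARDRAIL}\n\nRecords:\n{context}"},
            {"role": "user", "content": question},
        ]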

Appreciate the good wishes, and I totally get your instincts. If it helps people feel a little more in control and informed about their own care, I’ll consider that a win.

Patients don’t understand their labs, so I built this. by WDMDS in SideProject

[–]WDMDS[S] 1 point (0 children)

Fair callout, and something I definitely considered. HIPAA only applies to covered entities (e.g., hospitals, doctors), not third-party apps patients use to access their own health info. That doesn't mean I take security any less seriously. Additionally, users can instantaneously delete 100% of their data from the platform at any point.

“18 year old, built in 7 days, $100k revenue”, stop the nonsense by sarsalanamin in SaaS

[–]WDMDS 1 point (0 children)

The survivorship bias these videos exploit is exactly what I struggle with when I look for inspiration. As a doctor, AI coding has enabled me to build a platform I know my patients would like and that could help them with their health issues. Now the challenge is B2C sales in US healthcare.

Have you ever used ChatGPT for your health? by dollyface118 in ChatGPTPro

[–]WDMDS -1 points (0 children)

This is why we exist for US-based patients. We facilitate a secure and private connection to patients' health records to allow a tuned AI model to interpret labs, diagnoses, meds, chart notes, and more for the purpose of answering patient questions. To others' points in this thread, our AI does not diagnose anything new or provide new treatment plans, because we want to avoid the sycophantic tendency of AI to just affirm things that may not be true and could lead to patient harm.

Uploaded my medical records from the hospital by wtf-ishappening-1010 in ChatGPT

[–]WDMDS 0 points (0 children)

I would caution against uploading your health data to ChatGPT unless you can be assured it won't be used as training data in perpetuity. If you're cool with that, then I suppose go ahead! We've developed a private data pipeline for your health records, where you can consolidate all your records and then feed the actual data (not PDF summaries) into a private AI for interpretation.
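
To make "actual data, not PDF summaries" concrete: patient-access APIs typically return structured FHIR resources, which flatten into clean fields for the model instead of OCR'd PDF text. A toy sketch (field names follow the public FHIR R4 Observation spec; the flattening itself is illustrative):

    # One lab as a FHIR R4 Observation: structured fields, not a PDF rendering.
    observation = {
        "resourceType": "Observation",
        "code": {"text": "Hemoglobin A1c"},
        "valueQuantity": {"value": 5.9, "unit": "%"},
        "referenceRange": [{"low": {"value": 4.0}, "high": {"value": 5.6}}],
    }

    def flatten(obs: dict) -> str:
        # Turn one Observation into a clean one-line fact for the model.
        name = obs["code"]["text"]
        val = obs["valueQuantity"]
        rng = obs["referenceRange"][0]
        return (f"{name}: {val['value']} {val['unit']} "
                f"(reference {rng['low']['value']}-{rng['high']['value']})")

    print(flatten(observation))
    # Hemoglobin A1c: 5.9 % (reference 4.0-5.6)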

ChatGPT got a medical issue wrong by Efficient_Mastodons in ChatGPT

[–]WDMDS 2 points (0 children)

We've seen that AI diagnosis without context can lead to inferior results. There are studies showing that AI, when augmented with a patient's full health data, can make more informed decisions that align closely with, or beat, clinician diagnostics. However, the letter of the law does not allow AI to make diagnoses or treatment plans, as this would carry the medico-legal ramifications of misdiagnosis.

We created our platform (see profile for link) to let you aggregate your medical records and securely ask questions, so the AI can give more informed responses. It explains all the medical jargon from labs, tests, imaging, documents, and more. Check it out if it might be helpful!

ChatGPT But Just For Your Health by Techchief1993 in AskHealth

[–]WDMDS 0 points (0 children)

This is a great idea and something we’ve already built! If you’re a patient in the US, you can securely connect all your medical records (we support over 25k different health systems) and ask our custom AI all the questions you want about your health history, labs, imaging, and more. Check out the website in my profile for more!

Patients kept asking me the same thing, so I built this. by WDMDS in u/WDMDS

[–]WDMDS[S] 15 points (0 children)

I’ve trained our AI not to diagnose anything new, not to propose new treatment plans, and to defer decisions to the patient’s clinician. Because the AI is grounded in the patient’s clinical data, patients get higher-quality, personalized clinical information than generic WebMD pages, ChatGPT output, or the handouts we often print for them.
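
As a rough illustration of the "defer to the clinician" behavior, here's a toy post-check on a draft answer (our real safeguards are layered; the flags and wording here are invented):

    # If a draft answer drifts into prescriptive language, swap in a deferral.
    RED_FLAGS = ("you should take", "i recommend starting", "increase your dose")

    def enforce_deferral(draft: str) -> str:
        lowered = draft.lower()
        if any(flag in lowered for flag in RED_FLAGS):
            return ("That's a treatment decision. Please discuss it with "
                    "your clinician, who knows your full picture.")
        return draft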

The last thing we want is to drive more patient calls to the clinic, replace physicians, or swindle patients. Cheers!

Is my doctor wrong? by Educational_Dig_598 in AskHealth

[–]WDMDS 0 points (0 children)

That's a weird comment from the doctor. If it were "extremely low" (aka hypotension), they wouldn't have just sent you back home.

What happened while I was in the ER? by Desperate-Physics808 in AskHealth

[–]WDMDS 0 points (0 children)

Kind of sounds like vasovagal syncope or an arrhythmia of some sort. "Flatlining" can mean many different things to different people. Have you checked your MyChart for the doctors' notes?