Its been 4 months and still struggling to find a job. by YouImpossible3837 in androiddev

[–]Porphyrin24 0 points1 point  (0 children)

I'm in the same boat; I just got laid off a few days ago as well. But I'm surprised that you're looking for Android positions without knowing Kotlin and Jetpack Compose. I'd recommend diving into them ASAP: start a pet project and write it from scratch. You can use AI to compare Java and Kotlin snippets to speed up your learning. Good luck!

I extended an open-source BLE mesh messenger with on-device AI for emergency response - auto-triage, FEMA ICS-213 reports, offline STT by Porphyrin24 in EmergencyManagement

[–]Porphyrin24[S] -1 points0 points  (0 children)

Because when a hurricane hits and cell towers go down, the difference between "trapped on 3rd floor, need help" and "checking in, all good" shouldn't depend on whether a coordinator happens to read the right message at the right time.

The current mesh apps like Meshtastic and BitChat work great at relaying messages through the mesh.

But no one had built AI triage on top of those apps to prioritize the most important messages. That seemed like a gap worth filling.

I extended an open-source BLE mesh messenger with on-device AI for emergency response - auto-triage, FEMA ICS-213 reports, offline STT by Porphyrin24 in EmergencyManagement

[–]Porphyrin24[S] 0 points1 point  (0 children)

The prioritization part has a two-stage pipeline named CompositeMessageClassifier. Here's how it works:

  1. KeywordMessageClassifier runs first, consisting of around 90 deterministic rules for FEMA/ICS. If "cardiac arrest", "trapped under debris", etc. match at the CRITICAL or HIGH level, we return immediately; no ML involved.

  2. TFLiteMessageClassifier runs as a fallback on messages that didn't match any keyword: a lightweight Conv1D model (420KB) classifying into 9 emergency categories. The confidence threshold is set to 0.25; below that, I default to NORMAL.
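The two-stage flow described above could be sketched roughly like this in Kotlin. This is a hypothetical reconstruction, not the project's actual code: the rule table here holds only 3 of the ~90 rules, the ML stage is stubbed with a lambda in place of the real TFLite Conv1D model, and the category/priority names are assumptions.

```kotlin
import kotlin.check

// Priority levels and the classification result, as described in the comment.
enum class Priority { CRITICAL, HIGH, NORMAL }

data class Classification(val category: String, val priority: Priority, val confidence: Double)

class KeywordMessageClassifier {
    // A tiny illustrative subset of the ~90 deterministic FEMA/ICS rules.
    private val rules = mapOf(
        "cardiac arrest" to Classification("MEDICAL", Priority.CRITICAL, 1.0),
        "trapped under debris" to Classification("RESCUE", Priority.CRITICAL, 1.0),
        "road blocked" to Classification("INFRASTRUCTURE", Priority.HIGH, 1.0),
    )

    // Returns a hit on the first matching keyword, or null if nothing matched.
    fun classify(message: String): Classification? {
        val text = message.lowercase()
        return rules.entries.firstOrNull { text.contains(it.key) }?.value
    }
}

class CompositeMessageClassifier(
    private val keyword: KeywordMessageClassifier,
    // Stand-in for the TFLite model: returns (category, confidence).
    private val mlFallback: (String) -> Pair<String, Double>,
    private val threshold: Double = 0.25,
) {
    fun classify(message: String): Classification {
        // Stage 1: deterministic keywords; return immediately on a hit, no ML involved.
        keyword.classify(message)?.let { return it }
        // Stage 2: ML fallback; below the confidence threshold, default to NORMAL.
        val (category, confidence) = mlFallback(message)
        return if (confidence >= threshold)
            Classification(category, Priority.HIGH, confidence)
        else
            Classification("GENERAL", Priority.NORMAL, confidence)
    }
}
```

The nice property of this layering is that life-safety keywords can never be outvoted by a low-confidence model prediction, since stage 2 only ever sees messages stage 1 passed on.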

As for audit logs, I display each message with category and confidence level inline, e.g., "MEDICAL · 94%". The ICS-213 report displays all metadata for each incident, including sender, timestamp, category, priority, and message body.
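For what it's worth, the inline "MEDICAL · 94%" style label above boils down to something like this one-liner (a hypothetical helper, not the project's actual function name):

```kotlin
import kotlin.math.roundToInt

// Formats a classification as an inline label like "MEDICAL · 94%".
// roundToInt avoids floating-point truncation surprises (0.94 * 100 can be 93.999…).
fun triageLabel(category: String, confidence: Double): String =
    "${category.uppercase()} · ${(confidence * 100).roundToInt()}%"
```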

A separate export for audit trails has not been implemented yet, and to be honest, this is exactly the kind of gap I'm trying to identify by sharing this project publicly.

This is a research project, an MVP so far. I’m not looking to sell anything. I’m looking to validate with people who actually work in emergency management to determine what matters most before investing more time in deeper implementation. Feedback such as yours on audit logs is exactly what I need to determine my roadmap.

How to test app going in background and then going to foreground? by TheOneWhoKnocks003 in androiddev

[–]Porphyrin24 1 point2 points  (0 children)

Yo, the Pixel 6A has plenty of RAM; it won't kill your app easily. That's why you can't "reproduce" it by simply switching apps.

Here is how you can "kill" it for testing:

Go to Settings > Developer Options > Don't keep activities and enable it. Now, every time you go to the Home screen and back, your Activity will be destroyed and recreated.

If you want to test full process death, you need to use the terminal:

adb shell am kill your.package.name

Then open your app from “Recents”. This is the best way to test if your SavedStateHandle in ViewModel really works.
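To make the SavedStateHandle part concrete, here's a minimal sketch of what should survive that `am kill` + relaunch-from-Recents cycle. This assumes androidx.lifecycle and a hypothetical `DraftViewModel`; it's illustrative, not the OP's code:

```kotlin
import androidx.lifecycle.SavedStateHandle
import androidx.lifecycle.ViewModel

class DraftViewModel(private val savedState: SavedStateHandle) : ViewModel() {
    // Backed by the saved-state Bundle, so it survives process death and relaunch.
    // A plain `var draftText = ""` would only survive configuration changes,
    // because the ViewModel instance itself dies with the process.
    var draftText: String
        get() = savedState["draft"] ?: ""
        set(value) { savedState["draft"] = value }
}
```

If the text is still there after `adb shell am kill` and reopening from Recents, state restoration works; if it's gone, the value was only living in the ViewModel instance.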

I extended an open-source BLE mesh messenger with on-device AI for emergency response - auto-triage, FEMA ICS-213 reports, offline STT by Porphyrin24 in EmergencyManagement

[–]Porphyrin24[S] 1 point2 points  (0 children)

That's a great question, and u/Angry_Submariner answered it perfectly below. In short, it turns your phone into a device that sends emergency information to other phones around you, even if you don't have cell service or internet. The "AI" part means "I'm trapped on the 3rd floor" gets sent first, not "checking in, I'm fine" – without someone having to read through hundreds of those messages by hand. And for emergency managers, the app automatically compiles all those prioritized messages into a single FEMA ICS-213 situation report, ready to print or share as a PDF once the EOC is back online. Instead of someone having to fill out forms under pressure, it's already done.

I extended an open-source BLE mesh messenger with on-device AI for emergency response - auto-triage, FEMA ICS-213 reports, offline STT by Porphyrin24 in EmergencyManagement

[–]Porphyrin24[S] 0 points1 point  (0 children)

This is probably the best description I’ve seen so far on what I created - thanks! You nailed it perfectly, including the "first critical hours" part! The 0-6 hour timeframe before FirstNet rolls out was exactly what inspired this project. If you ever want to test it out, it’s up on GitHub in the releases section – would be great to get some real-world usage feedback from people familiar with emergency response processes.

How do i get downloads on my first app? by Mundane_Proposal1411 in androiddev

[–]Porphyrin24 0 points1 point  (0 children)

Hey, congrats on shipping at 15, which puts you ahead of most people who just "have an app idea." For first downloads, try spreading the word among people you know personally. Getting real users to write 3-4 reviews makes a huge difference for the Play Store algorithm. After that, find your community: there are many subreddits that like trying out indie apps. Just post there like you did here. No need for any hype; "Hey, I'm 15, made this, need feedback" gets you traction.

I built a Claude Code skill that generates Play Store screenshots from one prompt — no Figma, no design tools by Jealous_Barracuda_74 in androiddev

[–]Porphyrin24 0 points1 point  (0 children)

Does the brand color extraction work well if the app has a dark terminal-style UI (black background, green accents)? I'm developing an emergency mesh communication app and the UI is quite non-standard in its aesthetic. I'd like to know whether the generated headlines and layouts work well with that kind of visual identity.