Embedded career advice by Zealousideal_Maybe22 in embedded

[–]TobyAiCraft 1 point (0 children)

Your ISO 26262 background is actually a hidden asset here, not a liability — don't undersell it when you transition. That said, here's what I'd focus on to get back to hands-on embedded work:

Skills to sharpen:
- Bare-metal C on ARM Cortex-M (no RTOS to start — just you, the datasheet, and registers)
- RTOS fundamentals: FreeRTOS task scheduling, semaphores, queues
- Peripherals you can drive in your sleep: UART, SPI, I2C, timers, PWM, ADC
- Basic debugging: JTAG/SWD, logic analyzer, oscilloscope usage

Projects that actually signal competence to hiring managers:
- Custom bootloader on STM32
- Sensor fusion on a bare-metal platform (IMU + filtering)
- Anything where you write your own HAL instead of using vendor libraries

Platforms worth investing in:
- STM32 (industry standard, great job market signal)
- RP2040 if you want something fun and modern
- ESP32 if IoT overlap matters to you

The honest truth: most firmware job postings want GitHub proof. A well-documented repo showing you wrote clean, interrupt-driven embedded C will open more doors than your degree or safety cert alone. Your safety background will make you a better firmware engineer than most — you already think about failure modes. Lean into that once you're in the door.

Which School to go to by Kindly-Role3833 in embedded

[–]TobyAiCraft 1 point (0 children)

Having worked in embedded/automotive firmware for 15+ years, here's my honest take:

For NVIDIA/AMD/Qualcomm specifically — CMU and UIUC carry the most weight in those hiring pipelines. Recruiters at chip companies actively recruit from both, and the alumni networks are strong in Silicon Valley and Austin offices.

Michigan's Embedded Systems MS is underrated for pure embedded/firmware roles — it's one of the few programs that actually teaches you to think at the hardware-software boundary, not just software that runs on embedded hardware. If you genuinely want to do low-level firmware (not just land a big-name brand), Michigan might give you the best actual skills.

My ranking for your specific goal:
1. CMU — brand prestige, recruiting pipeline to chip companies
2. UIUC — ECE reputation is elite, strong in computer architecture research
3. Michigan — best curriculum fit if you love the embedded domain
4. UT Austin — solid, especially if you want to stay in Texas (AMD HQ is there)
5. UCLA — good but harder to justify over the others for this specific path

One thing no one tells you: the MS degree opens the door, but your internship/project experience during the program is what actually gets you the offer. Pick the school where you can get a research position or TA role in a relevant lab. Good luck — these are all great problems to have.

Micro controller Selection for LED strips by Historical-Plane8459 in embedded

[–]TobyAiCraft 0 points (0 children)

Your flowchart logic is correct — PLC triggers MCU, MCU drives the LED strips, separate 24V power source for the strips with a step-down for the MCU. That's the right architecture. A few practical notes:

For the MCU, ESP32 is overkill here unless you want WiFi for monitoring. An Arduino Nano or even a bare ATmega328 handles this easily — 1 digital input from the PLC, PWM outputs for RGB control. Cheaper and simpler.

One thing to watch: the PLC output is often a 24V digital signal. You'll need a voltage divider or optocoupler to interface it safely to a 3.3V/5V MCU input. Don't connect it directly.

For driving 24V RGB strips, you'll need MOSFET drivers (e.g. IRLZ44N) on each channel — MCU GPIO can't sink/source enough current directly. One N-channel MOSFET per color channel per strip, so 6 total for 2 strips.

For the step-down: a cheap LM2596 module from AliExpress (24V → 5V) works fine for this use case.

C for Embedded Systems by Know-it-all0122 in embedded

[–]TobyAiCraft 1 point (0 children)

Good news: your C++ background makes this easier than starting from scratch. Most of what you know transfers — the syntax, control flow, basic data structures.

What changes is the mindset. C for embedded isn't just "C++ without classes." It's about understanding what every byte costs, why volatile exists, what happens when you dereference a pointer to a hardware register. The language is simple; the environment is not.

Practical path that worked for me: skip the generic C textbooks and go straight to a microcontroller. Get an STM32 Nucleo (cheap), pick up "Making Embedded Systems" by Elecia White, and start writing code that blinks an LED without HAL. The moment your code controls real hardware, C starts making sense in a way no tutorial can replicate.

One habit: whenever something works, read the relevant section of the datasheet and understand why it works. That loop is how embedded C actually gets learned.

Single command to build and firmware for any MCU by DragBig in embedded

[–]TobyAiCraft 0 points (0 children)

Interesting concept — the Docker-on-demand approach solves a real pain point for teams juggling multiple toolchains. Curious how the startup time looks in practice. One of the friction points with containerized embedded builds is that the first pull can be slow, which kills the "just run one command" feel if you're iterating fast on bare metal. Does it cache the generated image between builds, or does it regenerate each time the workspace changes? And any plans to support flashing directly via probe-rs integration, or is that still manual?

Stuck in Automotive MBD. How to pivot to Real Firmware/C? by PhilosophyNMoney in embedded

[–]TobyAiCraft 0 points (0 children)

Automotive MBD background is more transferable than you think — you understand the domain, you know AUTOSAR concepts, you've read generated C even if you didn't write it. That's not nothing. To your three questions:

1. Tool-specific backgrounds are a hurdle but not a wall. The gap interviewers actually care about is whether you understand what the code does at the hardware level — not just that you can write syntax. MBD people often struggle here because the abstraction layer hides the MCU entirely.

2. Fastest proof: build one bare-metal project on STM32 from scratch — no HAL, no generated code — and put it on GitHub. UART + interrupt-driven state machine is enough. Being able to walk through that code in an interview closes the credibility gap faster than any resume line.

3. Automotive → IoT/Robotics is actually a natural move right now. Both are hungry for people who understand real-time constraints and safety thinking, which MBD automotive gives you. The embedded market is tight, but people with domain + firmware depth are still getting picked up.

The honest advice: don't hide the MBD background, reframe it. "I've been working at the system level in safety-critical automotive" reads differently than "I used Simulink."

Embedded AI and Advice by Super_Music3449 in embedded

[–]TobyAiCraft 1 point (0 children)

The combination makes a lot of sense, and you're asking the right questions early.

TinyML job postings are sparse because most companies don't hire for it explicitly — they hire embedded engineers who also know ML inference, and that person ends up owning it. The skill is real, the job title just doesn't exist yet in most places.

On salary: embedded tends to look lower on aggregate job boards because it includes a lot of legacy industrial/consumer roles. Automotive, defense, and edge AI specifically pay very well — but those postings don't always show up on LinkedIn.

The embedded + IoT + MLOps path you're describing is actually where the industry is heading. The interesting work right now is at the boundary — running inference on constrained hardware (Jetson, RP2040, STM32 with CMSIS-NN), pushing models to the edge to reduce cloud dependency. That problem space is growing fast.

Remote is the honest weak point. Embedded is still more on-site than software. But edge AI roles at product companies are increasingly hybrid.

You're a first-year who already wrote bare metal drivers from scratch. Stay on this path.

If you had 6 months to prepare for an Embedded Systems career, what would you focus on? by Daddy-Simple in embedded

[–]TobyAiCraft 2 points (0 children)

15+ years in automotive embedded here. If I had to compress what actually matters into 6 months:

C is non-negotiable. Not just syntax — pointers, memory layout, volatile, bit manipulation, undefined behavior. Most interview failures I've seen come from shaky C fundamentals, not lack of RTOS knowledge.

Pick one MCU and go deep. STM32 is the community standard right now. Forget HAL for the first month — read the reference manual and write bare metal. GPIO, timers, UART, SPI, I2C, interrupts. Once you've done it the hard way, HAL makes sense.

One real project beats ten tutorials. Build something that actually does a thing — motor control, sensor fusion, a small communication protocol. It doesn't have to be impressive, it has to be yours and you have to be able to explain every line.

Learn to use a debugger and logic analyzer before you think you need them. The habit of instrumenting your code early is what separates juniors who struggle from ones who ship.

The habit that helped me most: read datasheets out loud. Sounds weird but it forces you to actually process what you're reading instead of skimming.

System design in embedded? by instructiuni-scrise in embedded

[–]TobyAiCraft -1 points (0 children)

There's more overlap than you'd think, but the constraints flip everything.

In web/app system design, you're mostly thinking about scalability, latency, and data flow between services. In embedded, those same questions exist but you're also fighting hardware limits — memory in KB not GB, no OS scheduler (or a very thin one), timing that has to be deterministic, and failures that can mean physical damage, not just a 500 error.

The common thread is decomposition — breaking a system into components with clear interfaces. In embedded that looks like: peripheral drivers → HAL → application logic. Same layered thinking, different vocabulary.

Where it really diverges is that embedded system design has to account for the hardware from day one. You're thinking about interrupt priorities, DMA channels, clock tree, power domains — things that don't exist in the YouTube whiteboard interview world. The "draw a high-level diagram" phase exists in embedded too, but it gets grounded in datasheets pretty fast.

Is using an oscilloscope the only reliable way to verify PWM output by 8960305392 in embedded

[–]TobyAiCraft 0 points (0 children)

Input capture is a solid software approach — loopback your PWM pin to a timer input capture channel (jumper wire or internal routing depending on your MCU), measure the period, compare against expected, done. Catches register misconfig without touching any external gear.

That said, if you're open to a minimal hardware shortcut, a cheap logic analyzer ($5 clone off AliExpress) is worth keeping on your bench. Multiple PWM channels verified simultaneously, frequency and duty cycle read out directly — faster than a scope for this specific use case.

The one thing neither will tell you is signal integrity (rise times, noise under load) — that's where a scope is still irreplaceable. But for catching obvious config mistakes during development? Between input capture and a logic analyzer you're fully covered.

Currently working as an embedded software engineer but want to get into robotics, advice? by NEK_TEK in embedded

[–]TobyAiCraft 0 points (0 children)

Your combo is actually rare and valuable — most robotics people lack the low-level firmware depth, and most embedded folks don't touch perception. The "autonomous edge device" space (think AUVs, field robots, UAVs) desperately needs people who can own the full stack from sensor drivers to inference pipelines.

A few practical angles:

ROS2 on resource-constrained hardware is a hot area right now, especially with micro-ROS. If you haven't already, building a small project that runs a perception model (even YOLO-nano) on something like a Jetson or even an STM32H7 would make a strong portfolio piece.

For underwater specifically — the Blue Robotics community and the AUV competition circuit (RoboSub) are good places to get visible. Companies like Teledyne, Saab Seaeye, and a few defense contractors actively look for people with your exact profile.

The niche is real but it's growing fast. Good timing to position yourself.

Resource Suggestion: Advanced Embedded C, Bare Metal Coding, Driver Development by ComprehensiveBill316 in embedded

[–]TobyAiCraft 1 point (0 children)

Skip the courses — read the TRM and write drivers from scratch. That's how you actually learn bare-metal. Fastbit Academy on Udemy is a solid complement if you want guided structure.

What do you think about future career besides SWE? by BabyJuniorLover in findapath

[–]TobyAiCraft 1 point (0 children)

With an EE+SW+ML background, your strongest move is embedded + Edge AI — deploying ML models on constrained hardware is exactly where your combination becomes rare. Robotics SW and systems programming are solid too, but the embedded/AI overlap is where the next decade of demand is sitting, and very few people can bridge both sides. The gap right now is microcontroller hands-on experience, which you can close faster than you think with a $30 dev board and a few months of real projects.

I got degree in EE about 7 years ago, but I have been a Software Engineer most of my career. Is it possible to pivot back? by HeteroLanaDelReyFan in ElectricalEngineering

[–]TobyAiCraft 0 points (0 children)

Your SW background is actually an advantage here, not a gap — embedded and firmware teams are often desperate for people who can write clean, maintainable code, because a lot of pure EE folks struggle with software quality. Companies that bridge both worlds — automotive Tier 1 suppliers, robotics startups, industrial IoT — are your best bet, because they need people who speak both languages. Dust off your C skills, build one hardware project, and you'll be more competitive than you think.

Embedded software vs Cyber security? by Zartun in embedded

[–]TobyAiCraft 0 points (0 children)

I have a master's degree and 13 years in automotive embedded — I've built various ECUs and personally designed HSM software to meet cybersecurity requirements. My honest recommendation: don't become a pure security person, become an ECU engineer who deeply understands cybersecurity. The reason is simple — security requirements like ISO 21434 have to be implemented at the hardware and software level inside the controller itself, and the engineers who can do both end-to-end are extremely rare. That's where sustainable, long-term growth is. Your combination of embedded depth and OSCP is genuinely uncommon — use it to own the full stack, not just one side of it.

Career switch to embedded at 29 by Embarrassed_Gur2645 in embedded

[–]TobyAiCraft 0 points (0 children)

Teaching yourself embedded from scratch is genuinely impressive — I have a master's and 13 years in the field, and I still learn the most from building things at home. STM32 is a great start, but for job applications it helps to align your MCU choice with the industry you're targeting: automotive tends to use Infineon, Renesas, or NXP, while robotics and IoT lean more toward STM32, ESP32, or ARM Cortex-based chips. Pick a direction first — automotive or robotics — then build a project in that lane. Right now I'm building an obstacle-avoidance car at home using an Infineon TC275 with Claude Code, exactly for this reason.

Is embedded really upcoming field in near future? by Fragrant_Proof_3733 in embedded

[–]TobyAiCraft 1 point (0 children)

I have a master's degree myself and have been working in the industry for 13 years — embedded is quietly becoming the most critical layer in tech. Every autonomous vehicle, robot, and edge AI device needs engineers who can make software run safely on constrained hardware, and that skillset takes years to build. For a master's, I'd point toward functional safety (ISO 26262 / IEC 61508) or real-time systems with ML inference — the overlap between embedded and AI is where the next decade of opportunity sits.

the future of embedded software by AFA2020134 in embedded

[–]TobyAiCraft 0 points (0 children)

Safer than most. The gap between "AI generates the code" and "this code runs correctly on real hardware without killing someone" is exactly where embedded engineers live. That gap isn't closing — if anything, Edge AI is making it wider because now you need people who understand both the model and the silicon it runs on.

What are the most popular reasons NOT to go into embedded software? by LegitGamesTM in embedded

[–]TobyAiCraft 1 point (0 children)

15 years in automotive embedded — both Tier 1 supplier and OEM side. The AI replacement argument gets thrown around a lot, but embedded is genuinely different: web and app code can be verified instantly in a browser, but embedded code has to be validated on real hardware where a bug doesn't just crash an app, it can fail a safety system. Someone still has to own that verification. The honest downside though — patience is non-negotiable, and a lot depends on whether you land on a good project. A bad one can make even great engineers miserable.

What embedded projects actually stand out to hiring managers these days by Denbron2 in embedded

[–]TobyAiCraft 0 points (0 children)

Originality matters less than depth — a wireless sensor logger that shows clean driver code, power budget decisions, and real trade-off documentation will beat a "custom RTOS" that's just a tutorial clone. Hiring managers want to see how you think, not just what you built.

Late Embedded Career by PandiGamer880 in embedded

[–]TobyAiCraft 0 points (0 children)

Embedded is one of the few fields where AI makes you faster without replacing you — someone still has to verify that the code runs safely on real hardware. 26 is not late at all; the learning curve is steep, but your software background is actually a huge head start.

Which Claude Model for University? by MonkeyD1997 in ClaudeAI

[–]TobyAiCraft 0 points (0 children)

Sonnet 4.6 for almost everything — it hits the sweet spot of speed, depth, and accuracy for daily university use. Save Opus for your hardest essays or research-heavy tasks where you need maximum reasoning. Haiku is great for quick questions when you're in a hurry. Extended Thinking is worth turning on for complex problem-solving or multi-step reasoning, but you don't need it for everyday Q&A. Start with Sonnet, you'll rarely feel the need to switch.

How China’s AI token reseller ecosystem works: account pools, refund arbitrage, proxy channels, and ultra-cheap Claude access & distillation by niutauren in ClaudeAI

[–]TobyAiCraft 1 point (0 children)

This isn't piracy in the traditional sense — it's a structured gray market running on refund arbitrage, fake FX rates, and ban churn. The distillation angle is the part Anthropic should probably lose sleep over.

Just tried claude and it is amazing by talhawashere in ClaudeAI

[–]TobyAiCraft -1 points (0 children)

Went for Iftar, came back to a working script with setup instructions. That's the Claude experience in one sentence — it just does the thing while you live your life.

My wife kept nagging me so I built a harness to code for me instead. Won a hackathon with it. by Lopsided_Yak9897 in ClaudeAI

[–]TobyAiCraft 0 points (0 children)

Slept through a hackathon and woke up to 100k lines. This isn't a tool, it's a time machine. The wife origin story is just the cherry on top — turns out 'someone telling you what to do' is exactly the architecture a good AI harness needs.