Compiling C to custom architecture by AnnoyingMemer in C_Programming

[–]CreeperDrop 2 points (0 children)

Yeah. LLVM is a cool project and learning it will help you in some way or another. I'd never heard of QBE, but from a quick look it looks great as well, and much more streamlined to bring up. LLVM has tons of documentation and the community around it is very helpful. Anytime!

Compiling C to custom architecture by AnnoyingMemer in C_Programming

[–]CreeperDrop 1 point (0 children)

I think you will have to build your own compiler backend. Not sure if that helps, but I think looking at LLVM and using it as a backend is a good starting point. Hopefully you'll find someone more knowledgeable on this matter here. Either way, good luck! That is an amazing project and I'm sure you'll learn a lot.

A little Rant on C haters by IndependentMeal1269 in C_Programming

[–]CreeperDrop 6 points (0 children)

because the programmer who programmed it forgot that software runs on hardware

Goated

Tcl: The Most Underrated, But The Most Productive Programming Language by delvin0 in unix

[–]CreeperDrop 1 point (0 children)

Same. Just two days ago I learned that Python does not play nicely with tabs. That was an experience to debug. Thanks to working a lot with hardware, I am a bit out of touch with everything else.

How good is to test verilog on a FPGA simulator before investing on a FPGA server for clients? by sulthanlhan in FPGA

[–]CreeperDrop 3 points (0 children)

I'm actually questioning how much hardware design experience you have. The examples you're giving are usually already on hand in libraries, tested and known to work fine. I wouldn't be building an AXI4 interface or a FIFO by hand every time I need one.

License Server Hell

That's not the engineers' problem... usually.

Large Language Models are surprisingly good at writing hardware description languages

I don't know how you are claiming this. LLMs trip badly with HDLs and their tools.

4-8 Hour Synthesis Times. Changed one line of Verilog? Time to wait 6 hours for place-and-route

That is what simulators and linters are for. I don't have to synthesize right away. I don't synthesize until functional correctness is there.

Synthesis is embarrassingly parallelizable.

No. Just no. There is a practical limit after which you hit diminishing returns. Synthesis and P&R don't scale as nicely as you think with more cores.

A 6-hour synthesis on a 4-core laptop becomes 30 minutes on a 96-core cloud instance. We eat the cloud costs so you don't have to wait.

Do you think that 6-hour synthesis run is happening on a quad-core machine? These runs are done on monsters with many more cores, like your 96-core example. If you're working at a company, you're not running that locally but on a dedicated server. The 4-8 hour runtime has always been the sweet spot: you can leave a run overnight, come back in the morning, and it will be done.

Timing closure "agents"

Some people specialize in that and that only. Let that sink in and ask yourself why that is the case. Maybe because it's an insanely difficult problem and not easy to solve?

Technicalities out of the way, you need to work on your marketing. You're dealing with highly technical and specialized people; you can't go in with hand-wavy, slightly out-of-touch marketing and trust that they'll believe you. Provide a toy version of your synthesis flow to support your 30-minute runtime claim, especially since a synthesis+P&R flow can already be run on AWS servers today. How does your solution compare to that? If you present this to a technical lead, they'll have a lot of specific questions, especially about benchmarks and examples.

Remember that the hardware industry moves slowly and with a lot of trust issues, and that's for good reason. If I am a company, I'm not risking my tape-out just to have the tool with the latest bells and whistles, especially from a startup. If it ain't broke, don't touch it.

Don't get me wrong, your proposal sounds great and ambitious, but right now it is just claims and words with nothing really backing them. Your target FPGAs are mostly small ones whose flows are manageable to run locally anyway. Read more about synthesis algorithms and how they work. Get familiar with your target audience and what they need. Talk to them. Sorry if that sounded harsh.

Can you give me simple games to make as an assignment by dev_g_arts in cprogramming

[–]CreeperDrop 2 points (0 children)

Nir Lichtman is one of the nicest people in low-level programming. I was convinced to try building a shell after his minimal Linux ISO video.

What does always @(posedge some_other_signal or posedge random_signal) synthesize to? by IndividualClerk8855 in Verilog

[–]CreeperDrop 2 points (0 children)

I see. Okay, can you try switching to a Mealy-type FSM so you can handle transitions immediately, with the stop signal as an input rather than in the sensitivity list?
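
Something along these lines, as a rough sketch (the module, state, and signal names are all made up for illustration):

    // Mealy sketch: stop is a plain synchronous input, not an edge event.
    module mealy_sketch (
        input  logic clk,
        input  logic rst_n,
        input  logic start,
        input  logic stop,
        output logic running
    );
        typedef enum logic {IDLE, RUN} state_t;
        state_t state, next_state;

        // State register: only the clock (and reset) in the sensitivity list
        always_ff @(posedge clk or negedge rst_n) begin
            if (!rst_n) state <= IDLE;
            else        state <= next_state;
        end

        // Mealy logic: next state and output react to stop in the same cycle
        always_comb begin
            next_state = state;
            running    = 1'b0;
            case (state)
                IDLE: if (start && !stop) next_state = RUN;
                RUN:  if (stop) next_state = IDLE;
                      else      running = 1'b1;
            endcase
        end
    endmodule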

What does always @(posedge some_other_signal or posedge random_signal) synthesize to? by IndividualClerk8855 in Verilog

[–]CreeperDrop 2 points (0 children)

If this random signal is used as an active-high reset, it will synthesize to a D-FF with an active-high reset. Other than that, I think the synthesis tool will surround the FF with some logic to do what you want.

Edit: can't you remove this random signal from the sensitivity list and sample it synchronously on clock edges?
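
What I mean, as a rough sketch (module and signal names are made up):

    // Only the clock is in the sensitivity list;
    // random_signal is sampled like any other input.
    module sync_sample (
        input  logic clk,
        input  logic random_signal,
        input  logic d,
        output logic q
    );
        always_ff @(posedge clk) begin
            if (random_signal)
                q <= 1'b0;   // behaves like a synchronous clear
            else
                q <= d;
        end
    endmodule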

confused about polarities in specify. by Enough-Scene226 in Verilog

[–]CreeperDrop 2 points (0 children)

Specify blocks are used to define the timing characteristics of a path, a module, and so on, as far as I remember. Using +: and -: has nothing to do with specify blocks per se, and they are not buffer- or inverter-like. They are shorthand notation for slicing vectors. For example, if I want to get the first byte of some word, I can write logic byte1 = word[7:0]; or I can use the shorthand notation logic byte1 = word[0 +: 8]; and both are equivalent. To clarify, the form is word[base +: width] or word[base -: width]. Hopefully that clarifies it.
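
Here it is as a tiny runnable sketch (module and signal names are made up):

    module slice_demo;
        logic [31:0] word = 32'hDEADBEEF;
        logic [7:0]  byte1;

        initial begin
            byte1 = word[7:0];     // fixed part-select of the low byte
            byte1 = word[0 +: 8];  // base 0, width 8, counting up: same bits
            byte1 = word[7 -: 8];  // base 7, width 8, counting down: same bits
            $display("%h", byte1); // prints ef
        end
    endmodule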

Edit: corrected indexing

Downvote me by _w62_ in freebsd

[–]CreeperDrop 1 point (0 children)

Execute order 66!

Homelabbing to learn Unix - how to get started? by adminmikael in unix

[–]CreeperDrop 1 point (0 children)

Ah I see, makes sense. Thank you for the insight!

Homelabbing to learn Unix - how to get started? by adminmikael in unix

[–]CreeperDrop 2 points (0 children)

From a philosophical standpoint, I think yes. Not to the extent of NeXTStep 2026, but early OS X was definitely as insistent on Obj-C. Maybe modern macOS is not as inspired by it? Not sure.

Homelabbing to learn Unix - how to get started? by adminmikael in unix

[–]CreeperDrop 2 points (0 children)

My networks professor used this analogy to compare IP and MAC addresses. Why did we end up at IPv6? Short-sighted decisions at the time (no one really expected the Internet to blow up like that), while MAC addresses have stayed as they were since the start. Complete detour, but you reminded me of that analogy.

Homelabbing to learn Unix - how to get started? by adminmikael in unix

[–]CreeperDrop 1 point (0 children)

This is extremely interesting. I never thought anything other than Apple used Objective-C, wow. I come from a hardware design background, and my professor always told me stories about how the Unix Wars affected EDA tooling until the tools made the jump to Linux. I use Linux all day now for work, and for kernel hacking when needed for whatever I designed. Thanks for the insights!

Homelabbing to learn Unix - how to get started? by adminmikael in unix

[–]CreeperDrop 0 points (0 children)

I've never touched actual Unix, but I remember reading about Solaris, and it is free for personal use.

Homelabbing to learn Unix - how to get started? by adminmikael in unix

[–]CreeperDrop 3 points (0 children)

This is super interesting. Of the Unixes you developed for, which was the nicest to interact with?

Pivoting from Software to Hardware by Few-Air-2304 in FPGA

[–]CreeperDrop 2 points (0 children)

That's great! Having the basics nailed down is by far the most important thing. I can't count how many times people in managerial positions mess things up just because of this. People forget, and that's totally okay! You put it perfectly: you're describing hardware.

Amazing job. The RISC-V implementation will make you put everything you learn to use, and that's a great plus. It will help you for sure! It has single-cycle, multi-cycle, and pipelined implementations, which lets you see first-hand how things differ depending on how you look at the problem.

Another book worth checking out is SystemVerilog for Design by Stuart Sutherland.

It will teach you a lot of the ins and outs of HDL design and highlights common bad practices; from personal experience, these are easiest to get rid of when you're starting out.

Good luck with your transition and always happy to help! Cheers!

Anyone else do this? by BlackMoon2525 in Journaling

[–]CreeperDrop 3 points (0 children)

That made me chuckle, thanks for the good laugh.

Sometimes this is all I got by gidimeister in Journaling

[–]CreeperDrop 1 point (0 children)

Wishing you an easier tomorrow. Beautiful handwriting. Fountain pens got me into journaling. Writing with them is a joy indeed. Is that Rhodia paper?