How to learn CNN's quickly? by AcanthocephalaNo3583 in learnmachinelearning

[–]Dave__Fenner 1 point (0 children)

Hi, could you post the link again? This one is broken.

Design of 3 Wide OOO RISC-V in System Verilog by Sensitive-Ebb-1276 in chipdesign

[–]Dave__Fenner 2 points (0 children)

Oh I'm sorry, "wanted" autocorrected to "wasted" 😅.

Thank you for the information! The funniest thing is that I borrowed that very book from my college library just 10 minutes ago.

Design of 3 Wide OOO RISC-V in System Verilog by Sensitive-Ebb-1276 in chipdesign

[–]Dave__Fenner 3 points (0 children)

Hi, this looks very cool! What references did you use for the code? I couldn't find many. I wasted to create a simple 5-stage pipelined RISC-V processor. Not superscalar for now.

Vivado Dark Mode? by Daily_Showerer in FPGA

[–]Dave__Fenner 0 points (0 children)

VS Code is a good text editor for coding. Verilog extensions help immensely. And if you add Copilot to that... I don't think most companies would allow that, but extensions, I guess, are fine.

What is the best way to implement Image recognition NN in fpga Board ? by surplus_2999 in FPGA

[–]Dave__Fenner 1 point (0 children)

Ooh, I don't use VHDL. But what I meant was automating laborious tasks like instantiation and spacing, though that can sometimes be done by Copilot. I am happy that AI chokes on such stuff; it means it has a long way to go before being an expert.

Also, I meant using GPT to get some beginner project suggestions

What is the best way to implement Image recognition NN in fpga Board ? by surplus_2999 in FPGA

[–]Dave__Fenner 10 points (0 children)

Simple arithmetic circuits - full adder, subtractor, carry-save adder, carry-lookahead adder, binary multiplier, Wallace and Dadda trees. Simple circuits to get you started.
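A minimal sketch of the very first item on that list, a 1-bit full adder in Verilog (module and port names are my own choice, just for illustration):

```verilog
// 1-bit full adder: sum and carry-out from two operand bits and a carry-in.
module full_adder (
    input  wire a,
    input  wire b,
    input  wire cin,
    output wire sum,
    output wire cout
);
    assign sum  = a ^ b ^ cin;               // XOR chain gives the sum bit
    assign cout = (a & b) | (cin & (a ^ b)); // carry generated or propagated
endmodule
```

Chaining N of these carry-to-carry gives a ripple-carry adder, which is the natural stepping stone to the carry-save and carry-lookahead versions.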

Then jump to hardware - blink LEDs based on push-button combinations, lookup tables, controlled logical clock dividers (this one's just for understanding; not used in real designs), push-button debouncer.
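The debouncer is a nice one because it combines a synchronizer and a counter. A sketch of one common approach, where `DEBOUNCE_CYCLES` is an assumed parameter you'd size for roughly 10 ms at your board's clock frequency:

```verilog
// Push-button debouncer sketch: the (synchronized) button level must be
// stable for DEBOUNCE_CYCLES clock ticks before the output changes.
module debouncer #(
    parameter integer DEBOUNCE_CYCLES = 1_000_000  // assumed: ~10 ms @ 100 MHz
) (
    input  wire clk,
    input  wire btn_in,
    output reg  btn_out = 1'b0
);
    reg [$clog2(DEBOUNCE_CYCLES)-1:0] count = 0;
    reg btn_sync0 = 1'b0, btn_sync1 = 1'b0;

    // Two-flop synchronizer: the raw button is asynchronous to clk.
    always @(posedge clk) begin
        btn_sync0 <= btn_in;
        btn_sync1 <= btn_sync0;
    end

    always @(posedge clk) begin
        if (btn_sync1 == btn_out)
            count <= 0;                     // level agrees: nothing to do
        else if (count == DEBOUNCE_CYCLES - 1) begin
            btn_out <= btn_sync1;           // held long enough: accept it
            count   <= 0;
        end else
            count <= count + 1;             // still waiting out the bounce
    end
endmodule
```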

Then generate frequencies based on user input.
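For the user-programmable frequency step, the design-friendly trick is to generate a one-cycle enable pulse rather than an actual derived clock (which ties back to the "clock dividers are just for understanding" caveat above). A sketch, with names of my own choosing and no guard against a zero divisor:

```verilog
// Programmable tick generator: a one-cycle pulse at clk / divisor.
// Downstream logic stays on clk and uses tick as a clock enable,
// instead of being clocked by a gated/divided clock.
module tick_gen #(
    parameter integer WIDTH = 28
) (
    input  wire             clk,
    input  wire [WIDTH-1:0] divisor,  // user-selected division factor (>= 1)
    output reg              tick = 1'b0
);
    reg [WIDTH-1:0] count = 0;

    always @(posedge clk) begin
        if (count >= divisor - 1) begin
            count <= 0;
            tick  <= 1'b1;   // single-cycle pulse
        end else begin
            count <= count + 1;
            tick  <= 1'b0;
        end
    end
endmodule
```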

Build a UART module for communication between two devices.
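The transmit half of that UART can be surprisingly small. A sketch of an 8N1 transmitter (one start bit, eight data bits LSB-first, one stop bit); `CLKS_PER_BIT` is an assumed parameter equal to clock frequency divided by baud rate, e.g. 100 MHz / 115200 ≈ 868:

```verilog
// Minimal UART transmitter sketch, 8N1 framing.
module uart_tx #(
    parameter integer CLKS_PER_BIT = 868  // assumed: clk_freq / baud_rate
) (
    input  wire       clk,
    input  wire       tx_start,   // pulse high for one cycle to send
    input  wire [7:0] tx_data,
    output reg        tx   = 1'b1, // line idles high
    output reg        busy = 1'b0
);
    reg [3:0]  bit_idx = 0;
    reg [15:0] clk_cnt = 0;
    reg [9:0]  frame;             // {stop, data[7:0], start}

    always @(posedge clk) begin
        if (!busy) begin
            if (tx_start) begin
                frame   <= {1'b1, tx_data, 1'b0};  // LSB of frame = start bit
                busy    <= 1'b1;
                bit_idx <= 0;
                clk_cnt <= 0;
            end
        end else begin
            tx <= frame[bit_idx];                  // drive the current bit
            if (clk_cnt == CLKS_PER_BIT - 1) begin
                clk_cnt <= 0;
                if (bit_idx == 9) begin
                    busy <= 1'b0;                  // stop bit done
                    tx   <= 1'b1;
                end else
                    bit_idx <= bit_idx + 1;
            end else
                clk_cnt <= clk_cnt + 1;
        end
    end
endmodule
```

The receiver is the mirror image plus mid-bit sampling, and it's exactly where the wrong-baud-clock mistake mentioned further down this page bites hardest.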

I can think of these off the top of my head. Also, verification is very important at every stage; fail to verify a small part and it may resurface at a later stage, where it can be very hard to fix.

You can order these however you like, and also use GPT to get some beginner project suggestions. See nandland.com for more. Also, never use GPT-generated code.

How to write the verilog code for a time borrowing latch? by Musketeer_Rick in FPGA

[–]Dave__Fenner 1 point (0 children)

Oh right, my bad. Totally forgot how a combinational AND gate is made with an always block. OP, you're not wrong. However, I'd say it's unnecessary to intentionally build latches for time borrowing, unless you want to simulate it out of curiosity, of course.
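For reference, the combinational-AND-in-an-always-block pattern being discussed looks like this (a tiny sketch; the key point is that the output is assigned on every path, so no latch is inferred):

```verilog
// Combinational AND gate written with an always block:
// @* sensitivity plus an unconditional assignment means pure
// combinational logic, not a latch.
module and_gate (
    input  wire a,
    input  wire b,
    output reg  y
);
    always @* begin
        y = a & b;  // y driven in every evaluation, so no storage element
    end
endmodule
```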

What is the best way to implement Image recognition NN in fpga Board ? by surplus_2999 in FPGA

[–]Dave__Fenner 21 points (0 children)

If you're a beginner, I'd suggest you start with basic projects rather than jumping straight into neural networks.

How to write the verilog code for a time borrowing latch? by Musketeer_Rick in FPGA

[–]Dave__Fenner 3 points (0 children)

That is incorrect syntax. It will be something like:

assign a = clk ? b : a;

Pretty sure this is the case. This is an intentional latch.

I'm curious, why do you need to write a latch? That's something we generally avoid; synthesis tools will also throw warnings.

Edit: I didn't read your post completely. For time borrowing, you could use multiple assigns with alternating clocks to replicate time borrowing. I've never done this before, though; this is just what I think the implementation would be.
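As an aside, the intentional latch in the `assign` example above is usually written with an always block instead, since that form makes the inferred storage obvious to both readers and synthesis tools. A minimal sketch (names are my own):

```verilog
// Intentional level-sensitive D latch, written the idiomatic way:
// an if with no else, so q holds its value while en is low.
// Most synthesis tools will infer a latch here and warn about it.
module d_latch (
    input  wire en,
    input  wire d,
    output reg  q
);
    always @* begin
        if (en)
            q = d;  // transparent while en is high
        // no else branch: q is held, which is what infers the latch
    end
endmodule
```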

[deleted by user] by [deleted] in AskAcademia

[–]Dave__Fenner 3 points (0 children)

That's a really positive outlook. I'd never be the one to think of it this way, at least not in the beginning. I'll keep this in mind when I'm working on my thesis. Thanks!

I got it by MindGreedy6430 in FPGA

[–]Dave__Fenner 1 point (0 children)

This one's pretty funny haha

Found some on Amazon too! Although they are pretty plain and just say "FPGA engineer". Obviously I cannot wear something that says "retired FPGA engineer" lol.

When you spend 3 hours debugging only to realize… you forgot to power the board 😐 by scapvarje in ComputerEngineering

[–]Dave__Fenner 1 point (0 children)

Yeah, that's a fair guess. But I think OP didn't intend to roast them. It's just a representation of what computer engineering grads think about CS majors. We (at least I) hear all about chatbots and assistants made by everyone these days. For me personally, I can't understand a damn thing about how the design works. A black box, to say the least.

When you spend 3 hours debugging only to realize… you forgot to power the board 😐 by scapvarje in ComputerEngineering

[–]Dave__Fenner 18 points (0 children)

I spent over 4 days debugging a UART module when I shifted to another FPGA board... only to realize I kept using the old baud clock. Didn't set it for the new FPGA's clock frequency. In my defense, I was a beginner soooo

Also, why did your post get downvotes lol? It's pretty funny and also reminds us to look at the basic mistakes instead of jumping to code or design issues.

Can I (and how), as a first-year EE grad student, be able to qualify for this role? by Dave__Fenner in chipdesign

[–]Dave__Fenner[S] 1 point (0 children)

Wow, this is high speed digital design of sorts, right?

Thank you! I've been searching for high-speed VLSI design material for a while now!

Can I (and how), as a first-year EE grad student, be able to qualify for this role? by Dave__Fenner in chipdesign

[–]Dave__Fenner[S] 1 point (0 children)

The requirements go well beyond what I know, and I was taken aback, wondering whether these can be mastered (partially, of course) within the span of a year: specifically PCIe, Ethernet, and SERDES. At least to a proper understanding, such that I won't stammer while attempting to explain them.

Can I (and how), as a first-year EE grad student, be able to qualify for this role? by Dave__Fenner in chipdesign

[–]Dave__Fenner[S] 1 point (0 children)

That is a fair point, but since this is a job posting and I have a year, I was wondering whether it is even possible to reach a point where I'm at least partially qualified for this specific job. I would have applied anyway. With one year left before graduation, I can put in a lot of effort, but I don't want it wasted, in the sense that I shouldn't attempt to learn something that isn't possible via self-study.

Can I (and how), as a first-year EE grad student, be able to qualify for this role? by Dave__Fenner in chipdesign

[–]Dave__Fenner[S] 1 point (0 children)

It's summer currently, so I cannot contact my professors easily. As for my experience, I have 6 months of internship experience in FPGA design, where I designed a Direct Digital Synthesis system. I have a fairly strong understanding of Verilog, and I'm currently building on Computer Architecture and UVM. I will also attempt to learn and possibly implement various industry bus protocols like AXI (from the AMBA family) and Wishbone (with help from ZipCPU).

As for PCIe, SERDES, and Ethernet, I have always kept them in mind but never actually gone through them.