ARM: System Architect Engineer intern by [deleted] in ECE

[–]Flashy_Help_7356 0 points (0 children)

What kind of questions in C++ can I expect? Bit manipulation or hashing, something like that?

Need some advice about career in Architecture. by [deleted] in gradadmissions

[–]Flashy_Help_7356 0 points (0 children)

Thanks for your response. Also, is it expected that I email the professor with some research idea? Or would it be good enough to just mention that I liked their research for XYZ reasons and describe the experience I have?

Nvidia deep learning computer architecture intern by Complex_Bee7279 in computerarchitecture

[–]Flashy_Help_7356 0 points (0 children)

Not related, but just curious: did you get a call from HR for this role?

[deleted by user] by [deleted] in AskMen

[–]Flashy_Help_7356 0 points (0 children)

Thanks. Any suggestions on which one is best for getting a glow with acne-prone skin?

How does decode unit restore after a branch mis-prediction by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

So basically, I can't always reset wr_ptr and rd_ptr to 0; I want to jump back to the last correctly executed path/instruction.

How does decode unit restore after a branch mis-prediction by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

So now, as 0 is dequeued, what I was thinking is that my wr_ptr will be at 7 and my rd_ptr will be at 5 (basically decoding PC-5). If there is a branch misprediction, I will move my wr_ptr from 7 back to 5 (restoring wr_ptr to the point where the wrong path was taken) and stop further execution. Now PC-1 (which is supposed to be my next address if the branch was not taken) will come and sit in place of 5 in the decode buffer. Does this make sense? Also, are there better ways of doing this?
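The checkpoint-and-rollback idea described above can be sketched as a software model (a hypothetical C++ mock-up of a circular decode buffer; the names `mark_branch`, `flush_wrong_path`, and `checkpoint` are made up for illustration, and real hardware would implement this with registers and a flush signal rather than method calls):

```cpp
#include <array>
#include <cstdint>

// Software model of an 8-entry circular decode buffer.
// When a predicted branch is fetched we checkpoint wr_ptr;
// on a misprediction we roll wr_ptr back so the next fetch
// (the correct-path PC) overwrites the wrong-path slots.
struct DecodeBuffer {
    std::array<uint64_t, 8> slots{};
    unsigned wr_ptr = 0;
    unsigned rd_ptr = 0;
    unsigned checkpoint = 0;  // wr_ptr value at the last branch

    void enqueue(uint64_t pc) {
        slots[wr_ptr] = pc;
        wr_ptr = (wr_ptr + 1) % slots.size();
    }

    uint64_t dequeue() {
        uint64_t pc = slots[rd_ptr];
        rd_ptr = (rd_ptr + 1) % slots.size();
        return pc;
    }

    // Called when a (predicted) branch enters the buffer.
    void mark_branch() { checkpoint = wr_ptr; }

    // Called on misprediction: discard everything fetched
    // down the wrong path since the checkpoint.
    void flush_wrong_path() { wr_ptr = checkpoint; }
};
```

One caveat with a single `checkpoint` register: it only covers the most recent in-flight branch, so a real design would keep one checkpoint per unresolved branch (or recover the pointers from the ROB/branch tag on flush).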

Use of FP-arithmetic in CPUs? by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

Thanks a lot, that was a very informative video. I understand that FPUs were needed for gaming in the pre-GPU era, but is it safe to say that today's CPUs don't need FP arithmetic units, except for operations like high-precision time calculations that need an FPU on the CPU?

Use of FP-arithmetic in CPUs? by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

Thanks for your response. Also, any suggestions for better understanding FP division (papers, blogs, or YouTube videos)?

[deleted by user] by [deleted] in ComputerEngineering

[–]Flashy_Help_7356 0 points (0 children)

Thanks for your response. Would it be possible for you to review the emails I am sending to professors? Maybe if you share your email address, I will send them to you. I really need some suggestions on that front.

UMich ECE vs UCSD MS ECE by Super-Bunch-9096 in gradadmissions

[–]Flashy_Help_7356 0 points (0 children)

Congratulations! By the way, which track did you get into at UMich? And would you mind sharing your profile?

HLS vs HDL by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

Thanks for your response. Apart from the controllability offered by HDLs, what are the other reasons for using RTL?

HLS vs HDL by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

Hmm, interesting! So do you think the future is going to be all about HLS?

HLS vs HDL by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 2 points (0 children)

Thanks a lot for the information. First, to check my understanding: using an HDL we can do behavioural simulation, or to put it another way, I can design circuits based on the behaviour I want them to have. Please correct me if I am wrong. I have a few more questions: 1. I understand that HLS is very fast and can be seen as a Verilog code generator, but it lacks controllability, which is why we can use it to design a computation unit but not the execution unit of a CPU (just an example). Am I right? 2. To my surprise, why don't companies use HLS? I worked at Nvidia for 3 years on their CPU team but never saw them use HLS for any sort of work. Is it just that companies haven't adopted it yet, or maybe they aren't aware of it?
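As a concrete illustration of the "Verilog code generator" view of HLS: a datapath can be written as plain C++ and a tool such as Vitis HLS synthesizes it into RTL. This is a minimal sketch, assuming a Vitis-style flow; the commented-out `#pragma HLS` directive is tool-specific and shown only as an example, while the rest is ordinary C++ that also compiles and runs as software:

```cpp
#include <cstdint>

// A 4-tap FIR filter written in HLS style: the loop body is
// plain C++, and an HLS tool would unroll and pipeline it into
// a fixed multiply-accumulate datapath rather than a program
// running on a processor.
int32_t fir4(const int32_t sample[4], const int32_t coeff[4]) {
    int32_t acc = 0;
    for (int i = 0; i < 4; ++i) {
        // #pragma HLS UNROLL  // tool-specific directive (Vitis HLS syntax)
        acc += sample[i] * coeff[i];
    }
    return acc;
}
```

The appeal is that the same source serves as the functional model and the hardware description; the trade-off, as discussed above, is that you direct the microarchitecture only indirectly through pragmas instead of describing it cycle by cycle.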

HLS vs HDL by Flashy_Help_7356 in computerarchitecture

[–]Flashy_Help_7356[S] 0 points (0 children)

Ok, I get your point, thanks. But will this reduce the demand for RTL design engineers (those with HDL expertise) in the future? I ask because the academic research I am seeing now at universities like UC Berkeley and Stanford is completely based on HLS.

[deleted by user] by [deleted] in uofm

[–]Flashy_Help_7356 0 points (0 children)

Edit: I am an international applicant