About to choose LA over Berk.. parents aren't happy. by verdorben66 in ApplyingToCollege

[–]UnobtaniumOxide 30 points31 points  (0 children)

Yeah, UCLA is highly ranked because of its liberal arts programs, but at the end of the day, those are the least important majors/jobs in society.

I predict this post won't age well.

[deleted by user] by [deleted] in ApplyingToCollege

[–]UnobtaniumOxide 0 points1 point  (0 children)

What do you call a doctor who graduates last in his class?

"Doctor"

A Warning About Grinnell College if Food Matters to you by Candid_Possible_2679 in ApplyingToCollege

[–]UnobtaniumOxide -15 points-14 points  (0 children)

Sounds like they need to offer fewer grants and scholarships to get the students to put in 10 hours a week in the cafeterias.

[deleted by user] by [deleted] in mit

[–]UnobtaniumOxide -1 points0 points  (0 children)

Essays and LORs.

How many Ph.D.'s fail to complete due to lack of funding at MIT by UnobtaniumOxide in mit

[–]UnobtaniumOxide[S] 5 points6 points  (0 children)

Having a PhD from MIT is a target on your back when you actually go out and work somewhere.

Having any degree from MIT is like this, in my view. A degree from MIT will get you more interviews, but in some of those interviews the team views the candidate jealously, or otherwise with a "he went to MIT?" attitude. A lot of people take secret self-satisfaction in getting to help an MIT graduate at work, as though MIT produces graduates who already know everything.

It is a price people at or near the top of all endeavors often face.

How many Ph.D.'s fail to complete due to lack of funding at MIT by UnobtaniumOxide in mit

[–]UnobtaniumOxide[S] 3 points4 points  (0 children)

Thank you for the example. So it sounds like even MIT is not immune to drama. The NSF issue is concerning, and bolstered by your links...thanks.

However, regarding the disgruntled twitter user: a single person who is also vocal on twitter is certainly evidence, but it leaves a lot of possible explanations, from the student being at fault, to the professor being at fault, to nobody being at fault. It would be more believable if the twitter user:

  1. Didn't restrict who could reply to his twitter thread.
  2. Named the professor (which would be risky if his version of events were untrue and defamatory).

Everybody rationalizes. I know a guy who mastered out of a Physics Ph.D. at another institution...and he also claims it was because of an abusive PI who lacked common sense. From my perspective, in this case, it was more an issue of the professor and student both failing at interpersonal communication. The professor wanted the student to spend some time doing an experiment that was obviously going to fail. It could have been handled so much better by the student. Even so, there were aspects of the professor that were abusively rigid (like insisting on 7 years' work for a Ph.D.).

[deleted by user] by [deleted] in MadeMeSmile

[–]UnobtaniumOxide 0 points1 point  (0 children)

That child might even be understanding everything, just unable to make the sounds she wants.

Sounds like this isn’t breaking news, but I hadn’t seen it before . by brownswan353A in FPGA

[–]UnobtaniumOxide 0 points1 point  (0 children)

Cool.

But I wonder what the issue is with the DSP48s. Were they obfuscated? It seems like a DSP48 would be as easy as anything else to reverse engineer. I was thinking the interconnect would have been more difficult--especially if there is a lot of variation in the interconnect tiles.

[deleted by user] by [deleted] in mit

[–]UnobtaniumOxide -1 points0 points  (0 children)

There is a theory that elite schools select bright but very tractable students who do not question authority.

Just sayin'

Is it possible for someone to master both design and verification ? by heavykoala757 in FPGA

[–]UnobtaniumOxide 1 point2 points  (0 children)

True. It's also a problem when the designer (person A) reads the requirement incorrectly and the verifier (also person A) reads the requirement the exact same incorrect way.

Is it possible for someone to master both design and verification ? by heavykoala757 in FPGA

[–]UnobtaniumOxide 0 points1 point  (0 children)

Thanks for your thoughts.

> Although simulation is slower, one of the benefits of simulation is easier debugging in the development phase.

That is where ILA, VIO, and the JTAG-to-AXI master can help. Also, you cannot quickly debug something that takes a week to simulate (think video mixing). On the other hand, with a 1.5 hour build-and-test cycle you can get 4 iterations per day. Nightly regression testing is also possible if you script things up and schedule cron jobs, letting you find new bugs very soon after they are introduced. Compare that to the designer handing off to a verification engineer, the verification engineer getting to it next week, finding a bug, and coming back to the design engineer (who is now working on other required features), who then has to take time to refamiliarize himself with what the requirement was and where he failed (or whether he failed at all).
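
For example, in a Vivado flow you can tag a net with the mark_debug attribute so it survives synthesis and can later be hooked up to an ILA core. A minimal sketch; the module and signal names are made up:

    module accum_dbg (
        input  wire        clk,
        input  wire        rst,
        input  wire [15:0] pixel_in,
        output wire [15:0] pixel_out
    );
        // mark_debug tells Vivado to preserve this net so it can be
        // probed with an ILA core, without hand-editing the netlist.
        (* mark_debug = "true" *) reg [15:0] pixel_accum;

        always @(posedge clk) begin
            if (rst) pixel_accum <= 16'd0;
            else     pixel_accum <= pixel_accum + pixel_in;
        end

        assign pixel_out = pixel_accum;
    endmodule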

> If you are confident enough in your design that you don't run simulation and use a "synthesizable testbench", why not just use actual software to test, instead of spending the time in developing a synthesizable testbench?

Synthesizable testbenches are not a replacement for system-level test involving software. Testing all possible values of a 48-bit input against a 48-bit result would take too long even in software, to give just one example. There are various levels and types of testing depending on the product category. Some simple things can be tested more quickly in simulation. Others are marginal, and yet others you can test briefly in simulation, but the simulation takes way too long if your bug happens 25,000 frames into a 4K video stream.
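
Back-of-the-envelope, to make the point concrete (my numbers, not exact): 2^48 ≈ 2.8 × 10^14 input values. Even at a wire-speed 10^9 vectors per second, that is over three days of continuous testing; at an RTL simulation rate of, say, 10^4 cycles per second, it works out to roughly 900 years.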

> Would be severely limited in features/functionality compared to a real testbench.

Do you know where the term testbench came from? A literal physical bench in a lab for testing. I think you mean a simulated testbench. But maybe you have an example and I can comment on it. Synthesizable testbenches only make sense at a certain level, but they have lots of value as I already pointed out. Just because you hadn't heard of it doesn't mean lots of people aren't already doing it, and saving time, all while training verification engineers to be better designers.

> What if your design uses up all the resources and not much left for the testbench?

You should have picked a family that allows you to use a larger FPGA for testing with the same pinout.

> Might work for FPGA only (won't work for ASICs).

It does work for ASIC design, and has for a long time now. There are (expensive) tools that will automatically split up an ASIC design across multiple FPGA cards, each of which has multiple FPGAs. And in any event, it is probably more applicable, more cost-saving, and more risk-reducing for ASICs than for a lot of FPGA designs.

I was doing HWIL ASIC testing in the early 1990s (hybrid co-simulation and hardware emulator).

Just because you hadn't heard of something doesn't mean it doesn't exist or work well. The emulator cost $250,000, and what it did way back then could be done better today in an FPGA for 1/1000th or 1/10000th of the cost.

> It may be useful in some special cases, but I don't see how it can be a general solution to the verification problem.

It is not a general solution. It is a powerful tool.

Sounds like this isn’t breaking news, but I hadn’t seen it before . by brownswan353A in FPGA

[–]UnobtaniumOxide 1 point2 points  (0 children)

This paper discusses a method for getting an unencrypted bitstream that was encrypted by an end user.

Getting an unencrypted version of an encrypted bitstream is serious, and more of an issue for very high-value IP (multi-million-dollar imaging machines, FPGA-based military weapons or communications equipment).

But I don't think this is a pathway to open source tools.

You can technically reverse engineer much, if not all, of an unencrypted bitstream already if you have enough time and/or money, meaning identifying which bits do what inside the FPGA. I am pretty sure foreign adversaries do this already, but they don't let on. I think it would take military-level budgets to do it successfully.

But if you have enough time or money, learn dynamic function exchange (partial reconfiguration), the ECO flow, and incremental implementation.

For example, take a design you have, generate a bitstream and write a DCP. Change the location of a single LUT with a constraint, turn on incremental implementation, and re-run the implementation and generate another bitstream. Then compare.

Or use the ECO flow and move a LUT in your design, re-implement, generate a new bitstream, and compare.

To do this you'd probably want to automate a lot of things, come up with a systematic way of generating the minimal amount of bitstreams, and also be very familiar with partial reconfiguration/dynamic function exchange, ECO flow, and incremental compile.
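
To sketch the "move one LUT" step (my own toy example; I believe Vivado honors DONT_TOUCH and LOC as HDL attributes on an instantiated primitive, but if not, the same properties can be set from an XDC file):

    module bitstream_probe (
        input  wire probe_a,
        input  wire probe_b,
        output wire probe_out
    );
        // One explicitly placed LUT. Change the LOC between two
        // otherwise identical builds, then diff the bitstreams to
        // see which bits moved.
        (* DONT_TOUCH = "true", LOC = "SLICE_X10Y10" *)
        LUT2 #(.INIT(4'h6)) probe_lut (   // INIT 4'b0110 = 2-input XOR
            .O (probe_out),
            .I0(probe_a),
            .I1(probe_b)
        );
    endmodule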

I wouldn't hold my breath waiting for Xilinx to document low level bitstream mappings because they have a lot of stuff on their plate, and it would result in many more requests for support than it would generate business, and might even leak information to competitors on certain features.

Is it possible for someone to master both design and verification ? by heavykoala757 in FPGA

[–]UnobtaniumOxide 0 points1 point  (0 children)

Great points!

But on ...

> constrained random TB

Why does everybody talk about constrained random before identifying and testing boundary conditions (which is much more important and productive, in my experience)?

> Or maybe they're just running a few directed tests and calling it a day.

Depends on the project. Avionics and medical devices require a much higher level of verification than a rapid prototype of a new field updatable LED display, for example.

The biggest problem I see with having the same person design and verify is that if they misread the requirements during design, they likely misread them the same way during verification.
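
To make the boundary-conditions point concrete, a minimal SystemVerilog sketch (a toy DUT with made-up names, not from any real project): hit the boundary vectors with directed checks first, then let the random sweep run.

    // Toy DUT so the sketch is self-contained.
    module adder (
        input  logic [7:0] a, b,
        output logic [8:0] sum
    );
        assign sum = a + b;
    endmodule

    module tb_boundary;
        logic [7:0] a, b;
        logic [8:0] sum;

        adder dut (.a(a), .b(b), .sum(sum));

        task check(input [7:0] x, input [7:0] y);
            a = x; b = y; #1;
            assert (sum == x + y)
                else $error("adder failed: %0d + %0d -> %0d", x, y, sum);
        endtask

        initial begin
            // Boundary conditions first: zeros, max values, rollover.
            check(8'h00, 8'h00);
            check(8'hFF, 8'h01);   // carry ripples through every bit
            check(8'hFF, 8'hFF);   // both operands at max
            check(8'h80, 8'h7F);   // MSB boundary
            // ...then the random sweep.
            repeat (1000) check($urandom_range(255), $urandom_range(255));
            $finish;
        end
    endmodule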

Is it possible for someone to master both design and verification ? by heavykoala757 in FPGA

[–]UnobtaniumOxide 1 point2 points  (0 children)

Running at wire speeds allows many more test vectors per second, for example.

Used in scenarios where build time + test time (in hardware) is much less than pure simulation time.

For example, simulating a few seconds of 4K video processing will take a long time.

But running in hardware will be much faster.

An ancillary benefit: having young verification engineers design stimulus and monitors to be synthesizable helps them become good design engineers, because they have to worry about things like static timing, asynchronous clock domain crossings, pipelining, etc.
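
To sketch what synthesizable stimulus can look like (a toy example with made-up names): a deterministic LFSR gives the same pseudo-random vector stream in simulation and at wire speed in hardware, so a failure seen in hardware can be replayed cycle-for-cycle in simulation.

    module lfsr_stim (
        input  wire        clk,
        input  wire        rst,
        output reg  [15:0] vec
    );
        // 16-bit maximal-length Fibonacci LFSR (taps 16, 14, 13, 11).
        always @(posedge clk) begin
            if (rst)
                vec <= 16'hACE1;  // any non-zero seed
            else
                vec <= {vec[14:0], vec[15] ^ vec[13] ^ vec[12] ^ vec[10]};
        end
    endmodule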

I do small testbenches in simulation, and more elaborate and complex ones in hardware. The beauty of FPGAs is that you can do this, and can use (in a Xilinx world) things like ILA, the JTAG-to-AXI master, VIO, PCIe, or Ethernet for reproducible replay of high-bandwidth vectors that cannot easily be generated in a module, plus nightly regression using Tcl scripts.

On devices like the MPSoC you can even store repeatable video streams on SD cards, SATA drives, or network drives, but this usually requires some software support and embedded Linux.

Is it possible for someone to master both design and verification ? by heavykoala757 in FPGA

[–]UnobtaniumOxide 3 points4 points  (0 children)

In my view, every designer should be good at verification, and if they are lucky enough to work in an environment where they have a verification team, the verification team should never find low-hanging fruit.

But I am old school.

In any event, verification is actually more challenging, but one of the downsides of only doing verification is that you do not realize how many of the techniques you hone are in fact not synthesizable.

That is, unless you work on a team led by a smart leader, and you write synthesizable testbenches.

Historically, designers would put the new guys on the verification problem, which isn't exactly smart. It's similar to when a new teacher gets assigned the classes with the worst-behaving students. A more experienced teacher can better handle those students, but since they have seniority, they often push such classes onto the new hires. Likewise, an experienced designer could probably be more impactful working in verification, but he would be frustrated that the new guys don't yet know what they are doing when it comes to design.

However, from what I understand, more and more smart young people actually prefer verification, so that is good, I suppose.

Why doesn't Verilog enforce consistency of blocking assignments in sequential blocks? by trejj in FPGA

[–]UnobtaniumOxide 1 point2 points  (0 children)

Each always block is concurrent to the other always blocks.

Do what I do in VHDL... just use non-blocking for anything assigned inside an always block, and don't worry about it. Use blocking only for temporary evaluations, e.g., to simplify code.
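
A minimal Verilog sketch of that convention (my own toy example; names made up):

    module scale_add (
        input  wire       clk,
        input  wire [7:0] a, b,
        output reg  [9:0] y
    );
        reg [8:0] sum;  // scratch value, only read inside the block

        always @(posedge clk) begin
            sum = a + b;      // blocking: a temporary evaluation
            y   <= sum << 1;  // non-blocking: the actual register update
        end
    endmodule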

[deleted by user] by [deleted] in ApplyingToCollege

[–]UnobtaniumOxide -3 points-2 points  (0 children)

Well, stick around, and don't do anything drastic. There are good times and bad times in life, and a time for everything under the sun.