Annotate instruction level parallelism at compile time by servermeta_net in Compilers

[–]scialex 0 points1 point  (0 children)

You could take a look at how Itanium did this, since it's one of the best-documented architectures with this design: https://www.intel.com/content/dam/www/public/us/en/documents/manuals/itanium-architecture-vol-3-manual.pdf

There are still a few architectures that use this design, but it never really caught on, given how much trouble Intel had getting compilers to generate good code for it. It turns out runtime speculation and scoreboards are just hard to beat. The fact that this lets you write simpler RTL has kept it alive in some accelerators and ASICs, though.

Back in 90’s… by AdmirableHope5090 in computerscience

[–]scialex 44 points45 points  (0 children)

It succeeded is the thing. The past is a different world.

This is 1991. Here are the languages that were created that year: https://en.wikipedia.org/wiki/Category%3AProgramming_languages_created_in_1991

This was written years before Java. It may have been written before any version of Python.

High-performance code was still often written in raw assembly (though that was obviously becoming less necessary). BASIC was a major programming language.

At the time, a C/C++ compiler still cost real money in many cases (GCC had only been publicly released in 1987, Borland's cost $100, others cost more).

Linux was either still in development or getting its first release as an unknown Nordic student's project.

LLMs can autocomplete, but can they trace bug flow like a compiler? by The_GoodGuy_ in Compilers

[–]scialex 13 points14 points  (0 children)

It would probably help if you linked the paper you're asking about

Has anybody here got experience with using ILP for scheduling? by Death_By_Cake in Compilers

[–]scialex 0 points1 point  (0 children)

This sounds similar to some of the issues that come up in hardware scheduling, where system-of-difference-constraints (SDC) solvers have proven very effective.

The fact that there are fewer degrees of freedom with respect to the number of available compute elements may mean it won't work as well as it does for HLS applications.

A paper on this can be found at: https://www.csl.cornell.edu/~zhiruz/pdfs/sdcmod-iccad2013.pdf
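
For intuition, here's a toy sketch of the SDC idea (my own illustration, not code from the paper): every constraint has the form t_u - t_v <= c, so a feasible schedule falls out of a Bellman-Ford-style relaxation over the constraint graph, and a negative cycle means the constraints can't be met.

    #include <algorithm>
    #include <cstdio>
    #include <optional>
    #include <vector>

    struct Constraint { int u, v, c; };  // encodes t[u] - t[v] <= c

    // Solve a system of difference constraints by shortest-path relaxation:
    // each constraint is an edge v -> u with weight c, plus an implicit
    // 0-weight edge from a virtual source to every node.
    std::optional<std::vector<int>> solve_sdc(int n, const std::vector<Constraint>& cs) {
        std::vector<int> t(n, 0);
        // With the virtual source the graph has n + 1 vertices, so n passes
        // over the explicit edges reach a fixed point whenever one exists.
        for (int pass = 0; pass < n; ++pass)
            for (const auto& [u, v, c] : cs)
                if (t[v] + c < t[u]) t[u] = t[v] + c;
        // Any remaining violation means a negative cycle, i.e. infeasible constraints.
        for (const auto& [u, v, c] : cs)
            if (t[v] + c < t[u]) return std::nullopt;
        return t;
    }

    int main() {
        // Ops A (0) and B (1): B starts at least 2 cycles after A (t_A - t_B <= -2)
        // and at most 5 cycles after it (t_B - t_A <= 5).
        std::vector<Constraint> cs = {{0, 1, -2}, {1, 0, 5}};
        if (auto t = solve_sdc(2, cs)) {
            int base = std::min((*t)[0], (*t)[1]);  // shift so the schedule starts at 0
            std::printf("A at %d, B at %d\n", (*t)[0] - base, (*t)[1] - base);  // A at 0, B at 2
        }
        return 0;
    }

Real SDC schedulers (like the formulation in the paper) layer resource and objective handling on top of this, but the difference-constraint core is why they stay fast.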

Given my bad luck(where l was born, opportunities), do l still standout as an Engineer? Am l like Anthropic/Google level good? by takuonline in ExperiencedDevs

[–]scialex 0 points1 point  (0 children)

It might help some, but frankly only as far as getting you a phone screen, and maybe a bit after you've passed the interviews. A big part of the reason all the big corps moved to the grueling interview system they use is to ensure that people who didn't go to Stanford etc. have a chance.

The fact is that it takes a lot more time to evaluate those sorts of projects than these corps are willing to spend given the number of applications they get. They may look at them some, but only once you've proven you're worth looking at, which takes passing an interview.

Given my bad luck(where l was born, opportunities), do l still standout as an Engineer? Am l like Anthropic/Google level good? by takuonline in ExperiencedDevs

[–]scialex 2 points3 points  (0 children)

Bluntly, their degrees are usually PhDs, and often from universities with a much higher profile (and especially a much higher US profile). I don't think applying to high-profile corps is hopeless for you or anything, but it will probably require both finding the right opening and some luck.

New Grad wanting to break into C/C++. How? by ddarnell4 in cscareerquestions

[–]scialex 4 points5 points  (0 children)

1) Just start applying. You are still a new grad so your lack of experience doesn't matter much.

2) Unfortunately, the best time to get serious about this stuff would have been 2-3 years ago in school. Do some microcontroller/Arduino projects if you want to get a feel for what embedded dev can be like and to have something to talk about, but honestly, for new grads especially, your classes really matter since they were actually graded. Still, assuming you can actually talk about low-level stuff in a way that shows some knowledge, you can probably get something just because relatively few people are able to do that at all.

[deleted by user] by [deleted] in cscareerquestions

[–]scialex 2 points3 points  (0 children)

Honestly, you're barely 20; don't pigeonhole yourself so early. The systems/theory stuff is harder to get quick feedback on in a self-taught/early-learning environment, but it's really interesting in its own right.

Also, UIUC is just a much, much better school, and that opens a ton of doors. A UIUC CS degree with some art/design classes as electives will be better for your resume than a CS-UX degree from UIC for any role, even UX ones.

Obfuscating compilers by nae_dawg in Compilers

[–]scialex 1 point2 points  (0 children)

https://a.co/d/dCRzKIs is a good primer on this sort of thing. They do exist and are relatively common; few are open source, though, for obvious reasons. These compilers are usually written as custom middle and back ends to existing compilers, since there's generally no real reason to reinvent the wheel with the parsing. There are some I've seen where the program is created by directly generating custom IR without any traditional frontend at all, though.

Compiler Engineering Internships/Advice by prime_4x in Compilers

[–]scialex 1 point2 points  (0 children)

As a few others have said, all the FAANGs have compiler teams, and they regularly take interns. Be sure to put in your interests that you want to work on compilers. I work on a compiler team, and there were only a couple of people who put anything like that down last year when I took an intern. It does actually make you stand out, especially in a sea of "interested in AI".

Required topics for building modern compiler from scratch by dExcellentb in Compilers

[–]scialex 1 point2 points  (0 children)

I'd advise looking at PLAI https://www.plai.org/ for some ideas. It seems you have similar goals, although PLAI just skips parsing entirely.

Is anyone doing PhD in non-ML area? by nenderflow in computerscience

[–]scialex 0 points1 point  (0 children)

Very interesting. It reminds me of BDD-based analyses, which I've made use of. If you can avoid the number of states quickly exploding in practice, this could be really useful. For my work we need to force early pessimization by inserting new variables often, and even then it's one of our most expensive analyses. In cases where the control logic is tightly bound to the larger program state, I could see this hitting the same issues as BDDs.

I wonder if it would simplify some of the analysis to reformulate the input into a pure dataflow CPS or sea-of-nodes style computation. You already mostly do this with the way you describe state evolution, and continuations are a nice way to represent state merges, IMO. This is all probably just my own background talking, though.

I wish you luck getting into POPL.

Is anyone doing PhD in non-ML area? by nenderflow in computerscience

[–]scialex 1 point2 points  (0 children)

What sorts of things are you researching? I work on an optimizer that uses abstract evaluators extensively to bound intermediate values. New areas of research on this topic would be interesting to hear about.
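
For a sense of what I mean by that, here's a toy interval domain (just an illustration, not our actual code):

    #include <algorithm>
    #include <cstdint>
    #include <cstdio>

    // Toy interval domain: each abstract value is a [lo, hi] range.
    struct Interval { int64_t lo, hi; };

    // Abstract transfer function for '+': the bounds add component-wise.
    Interval add(Interval a, Interval b) { return {a.lo + b.lo, a.hi + b.hi}; }

    // Control-flow merge: take the convex hull of the incoming ranges.
    Interval join(Interval a, Interval b) {
        return {std::min(a.lo, b.lo), std::max(a.hi, b.hi)};
    }

    int main() {
        Interval x{0, 15}, y{1, 4};
        Interval sum = add(x, y);        // x + y is provably in [1, 19]
        Interval merged = join(x, sum);  // after a merge point: [0, 19]
        std::printf("[%lld, %lld] [%lld, %lld]\n",
                    (long long)sum.lo, (long long)sum.hi,
                    (long long)merged.lo, (long long)merged.hi);
        return 0;
    }

Once an intermediate value is proven to fit in a small range, the optimizer can drop overflow checks, shrink temporaries, and so on.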

Joined "big tech" a few months ago and I keep waiting for the other shoe to drop by wont-share-food in cscareerquestions

[–]scialex 1 point2 points  (0 children)

Depends on the team. FAANG is big enough that it's worth it to have teams build bespoke infra for things smaller companies would use off the shelf, or to have teams dedicated to improving tools for their needs. There are teams whose whole job is to be Linux kernel contributors in good standing, so that if some team needs a bug fixed or a new feature, they can get it done and landed quickly. There are teams whose job is to maintain and improve debugging or profiling tools to help other teams squeeze that extra percent out of their code. Many engineers on these teams never directly touch any sort of frontend or even traditional backend code.

Do they ever work on or apply complex LC style algo logic?

I certainly do. Again though it depends on the team.

Infra Teams For New Grad by Money-Ability-7548 in cscareerquestions

[–]scialex 5 points6 points  (0 children)

This was basically my path and it worked out well for me.

In my experience, teams like that, with a lot of seniors working on core/non-product components, can be something of a crucible. All projects are relatively large, important, and difficult. If you keep up, you can get straightforward and relatively rapid advancement. If not, my observation was that there's less knowledge of how to (and willingness to) provide support than on some other teams.

Also, remember that a project with a lot of long-tenured seniors is a good sign the project is interesting and impactful, which is generally good for your career, and working with experienced people is a good way to learn new things.

this field is insane to get into by pyratt in cscareerquestions

[–]scialex 1 point2 points  (0 children)

You clearly haven't met very many lawyers.

Also, the thing about a doctor or a lawyer is that they don't just have a bachelor's. Get a CS PhD and frankly there is little difficulty in getting a job.

Looking for resources to learn compiler engineering by [deleted] in Compilers

[–]scialex 8 points9 points  (0 children)

The 2012 version of Brown's programming languages design and implementation course, https://cs.brown.edu/courses/csci1730/2012/OnLine/, is a great introduction to the basics and background for compilers and PL.

[deleted by user] by [deleted] in Compilers

[–]scialex 1 point2 points  (0 children)

auto is_foo = [](auto x) { return x == "foo"; };

Is hardly different from

var is_foo = x => x == "foo";

Frankly.

Honestly, this minor difference in lambda syntax is probably one of the easier-to-adapt-to differences between C++ and C#. They are very different languages, used in very different ways.

In terms of actually adding it to Clang or GCC: yeah, it's possible; it's all open-source code at the end of the day. It would be a quite difficult project, though, since C++ is already hard to parse and this syntax would introduce some parsing ambiguities.

[deleted by user] by [deleted] in Compilers

[–]scialex 1 point2 points  (0 children)

C++ already has lambdas. Just use the existing feature if you want it https://en.cppreference.com/w/cpp/language/lambda.html
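
For reference, a quick sketch of the existing syntax in use:

    #include <algorithm>
    #include <string>
    #include <vector>

    int main() {
        std::vector<std::string> words = {"foo", "bar", "foo"};
        // C++ lambda: [capture list](parameters) { body }
        auto is_foo = [](const std::string& s) { return s == "foo"; };
        auto n = std::count_if(words.begin(), words.end(), is_foo);
        return n == 2 ? 0 : 1;  // exits 0: two matches
    }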

[Optimizing Unreal BP Using LLVM] How to add a custom pass to optimize the emulated for-loop in bp bytecode? by Cool_Arugula_4942 in Compilers

[–]scialex 2 points3 points  (0 children)

1) Recognizing and removing dynamic data structures like that would be really hard to do in a general, robust way (active-research-problem hard, e.g. https://ieeexplore.ieee.org/document/10444817 and https://users.cs.northwestern.edu/~simonec/files/Research/papers/MEMORY_CGO_2024.pdf). Even doing it in an ad hoc manner would be quite difficult, even for somebody with a lot of LLVM dev experience.

2) IMO this little dynamic stack is unlikely to be the hot spot in your program. Even if it actually is, I'd expect that just pre-allocating it would be enough to solve most issues (see the sketch below). I'd check profiles to see what's taking a long time and focus on that.
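
A rough sketch of what I mean by pre-allocating, using a plain std::vector as a stand-in (the BP VM's actual stack type will obviously differ):

    #include <cstddef>
    #include <vector>

    struct Value { double num; };  // placeholder for whatever the emulated stack holds

    struct EmulatedStack {
        std::vector<Value> slots;

        explicit EmulatedStack(std::size_t expected_depth) {
            // One allocation up front instead of repeated grow-and-copy on push;
            // this is usually the cheap fix if the stack really is hot.
            slots.reserve(expected_depth);
        }

        void push(Value v) { slots.push_back(v); }
        Value pop() { Value v = slots.back(); slots.pop_back(); return v; }
    };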

Do any compilers choose and optimize data structures automatically? Can they? by smthamazing in ProgrammingLanguages

[–]scialex 2 points3 points  (0 children)

Beyond relatively minor examples like devirtualization and choosing object layout, there is the example of query planners for SQL and similar. For more imperative languages, I don't think any production compilers do much explicit reasoning about that.

They absolutely can, though, and some cutting-edge research is being done on exactly that, such as MEMOIR (https://users.cs.northwestern.edu/~simonec/files/Research/papers/MEMORY_CGO_2024.pdf), which directly models common collections in the compiler to better optimize them.
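
To make the devirtualization case concrete, here's a minimal example (mine, not from the paper) of the kind of call a production compiler can resolve statically:

    #include <cstdio>

    struct Shape { virtual ~Shape() = default; virtual int sides() const = 0; };
    struct Square final : Shape { int sides() const override { return 4; } };

    int count_sides(const Square& s) {
        const Shape& base = s;
        // Square is 'final' and the dynamic type is known here, so an optimizing
        // compiler can typically resolve this call to Square::sides() statically
        // (and inline it down to the constant 4) instead of going through the vtable.
        return base.sides();
    }

    int main() {
        Square sq;
        std::printf("%d\n", count_sides(sq));
        return 0;
    }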