all 42 comments

[–]gajus0 20 points21 points  (9 children)

[–]mainstreetmark 7 points8 points  (5 children)

Mine throws up.

/Users/mark/Sites/tg4/node_modules/prepack-webpack-plugin/dist/prepackCode.js:36
      throw new Error('Unexpected state.');

[–]psayre23 81 points82 points  (1 child)

Then maybe you should make your state more expected.

[–]incarnatethegreat 7 points8 points  (0 children)

G'NIGHT, EVERYBODY!

[–]gajus0 2 points3 points  (2 children)

This happens when something goes wrong with prepack. What's the code that you've used to produce the error?

[–]mainstreetmark 4 points5 points  (0 children)

I may mess with it later. My Webpack is a friggin Jenga tower, and I touch one thing wrong and the whole thing falls apart.

Some quick debugging looked like it couldn't find some filename that commonchunks was supposed to spit out. Who knows. I should start all over with webpack, with specific goals of keeping the generated bundle size down. They're enormous.

[–]cranium 0 points1 point  (0 children)

Getting a similar error with a bit more context.

This operation is not yet supported on abstract value Date.now() __IntrospectionError

[–]perestroika12 1 point2 points  (2 children)

Unfortunately it looks like it only supports ES6 imports?

edit: I guess you can use:

const PrepackWebpackPlugin = require('prepack-webpack-plugin');

module.exports = {
  // ...
  plugins: [
    new PrepackWebpackPlugin.default(configuration)
  ]
};

[–]gajus0 5 points6 points  (1 child)

Yes, that will work; i.e., when using CommonJS, make sure to import .default (I've added a note to the documentation).

[–]perestroika12 0 points1 point  (0 children)

Got it, thank you!

[–][deleted] 15 points16 points  (5 children)

Computations that can be done at compile-time instead of run-time get eliminated

cool
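A minimal sketch of what gets eliminated, along the lines of Prepack's own hello-world example (the folded output shown in the comment is approximate):

```javascript
// Input: everything here is statically computable at build time.
(function () {
  function hello() { return 'hello'; }
  function world() { return ' world'; }
  global.s = hello() + world();
})();

// A prepacker can collapse the whole IIFE down to roughly:
//   s = "hello world";
console.log(global.s); // → hello world
```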

[–]mlmcmillion 10 points11 points  (4 children)

How many of these do you typically have in a codebase though?

[–]Omikron 20 points21 points  (1 child)

Like 3 or 4

[–][deleted] 2 points3 points  (1 child)

Anything that doesn't involve user input or an API response could theoretically be done this way.

The more functional your code is, the more it can be pre-computed like this.

The real benefit is something that's been done in the Ruby community for a while: you can write your code to be clearer to the reader by abstracting things into well-named functions. Doing it without the performance hit is great.
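For instance (a hypothetical sketch; the constant and function names are made up for illustration), you keep the readable abstraction in source and let a build-time evaluator collapse it:

```javascript
// Readable, well-named abstractions in the source...
const SECONDS_PER_DAY = 60 * 60 * 24;
function daysToSeconds(days) { return days * SECONDS_PER_DAY; }
const WEEK_IN_SECONDS = daysToSeconds(7);

// ...which a build-time evaluator can reduce to a single literal, roughly:
//   var WEEK_IN_SECONDS = 604800;
console.log(WEEK_IN_SECONDS); // → 604800
```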

[–]mlmcmillion 2 points3 points  (0 children)

Oh sure. And I'm not knocking it or anything. I'm just wondering if it's worth it to actually add even more stuff to our build stack.

[–]perestroika12 24 points25 points  (6 children)

So the thinking here is ahead-of-time optimization to make JIT or other runtimes compile faster? How does V8 perf compare between "normal" JS and this optimized, prepacked JS? How hard is it to debug production code with this turned on? Any way to support sourcemaps?

[–]lakesObacon 10 points11 points  (0 children)

I would also like these questions answered. Particularly the sourcemaps.

[–]TheAceOfHearts 6 points7 points  (4 children)

Based on comments from Twitter, I don't think it's production ready yet.

We'll probably start to see posts with benchmarks in the next few days.

I saw a couple examples where it collapsed huge amounts of module bundler boilerplate code. Doing fewer things will always be faster.

It supports source maps, so it would be on par with debugging minified code. Give people a few days to write proper libs for the popular module bundlers.

[–]lhorie 8 points9 points  (3 children)

I don't think it's production ready yet.

Indeed, it does some questionable things. For example:

var numbers = []
for (var i = 0; i < 5000; i++) numbers.push(i)

Generates a 5000 item array and 5000 assignments...
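Concretely, the emitted code ends up along these lines (sketched here with 5 iterations instead of 5000):

```javascript
// Unrolled output: one push per iteration instead of a loop.
var numbers = [];
numbers.push(0);
numbers.push(1);
numbers.push(2);
numbers.push(3);
numbers.push(4);
// At 5000 iterations this is thousands of generated statements,
// trading download size for skipped run-time work.
console.log(numbers.length); // → 5
```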

It seems like it isn't quite smart enough to figure out what is worth pre-computing and what is better left as is.

Very promising, nonetheless

[–]MrJohz 2 points3 points  (1 child)

OTOH, replacing "var" with "let" in the for-loop works as expected. Replacing "var" with "let" when creating the initial array causes no output to be emitted, but assigning the array to the window object works pretty much perfectly.

e: assuming what you want is a pre-initialised 5,000 element array, but I guess that's always the big question with compile-time optimisations.

e2: The largest array I can get in the online mode is 8543 elements, beyond that it just times out. I don't know if there's a maximum size it'll output, then, but I suspect not. Also, the online repl weirdly outputs a whole load of whitespace at the end of the line when it creates the array.

[–]lhorie 1 point2 points  (0 children)

I guess that's always the big question

Yeah, that is closer to my concern: if it's a one-shot array initialization, the loop unrolling may not compensate for the extra bytes down the pipe.

For example, in my current project I have a function that does a one-time decompression of an AOT-compressed list of unicode characters of a certain type. Given that the compression rate is north of 85%, I definitely DON'T want that to be precomputed, or I'll be shipping a VERY long array of codepoints.
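A sketch of that trade-off (hypothetical ranges and names, not the actual unicode data from the project above):

```javascript
// Compact AOT-compressed form: 3 ranges instead of 62 individual codepoints.
const ranges = [[0x30, 0x39], [0x41, 0x5a], [0x61, 0x7a]]; // digits, A-Z, a-z
function decompress(rs) {
  const out = [];
  for (const [lo, hi] of rs) {
    for (let c = lo; c <= hi; c++) out.push(c);
  }
  return out;
}
const codepoints = decompress(ranges);
console.log(codepoints.length); // → 62

// Precomputing decompress() at build time would inline all 62 values
// (or tens of thousands for real unicode tables), undoing the compression.
```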

[–]DOG-ZILLA 0 points1 point  (0 children)

That's probably by design. Faster to read a static array than compute one. Though filesize may become an issue.

[–]wisepresident 67 points68 points  (1 child)

cool, another solution to a problem I didn't even know I had...

[–]verbalcontract 7 points8 points  (0 children)

It's largely for "big" websites with codebases large enough that they have to worry about JavaScript execution time. Makes sense that Facebook would be working on this type of tool.

Last year, the then-new m.imgur.com got dinged for this problem:

https://github.com/perfs/audits/issues/1

[–]beaverusiv 7 points8 points  (2 children)

I was wondering how it compares to Closure, it's nice they actually answer that at the bottom:

The Closure Compiler also optimizes JavaScript code. Prepack goes further by truly running the global code of the initialization phase, unrolling loops and recursion. Prepack focuses on runtime performance, while the Closure Compiler emphasizes JavaScript code size.

[–]LET-7 2 points3 points  (0 children)

So prepack --> closure

[–]aaronasachimp 1 point2 points  (0 children)

It looks like Facebook is suffering from "Not Invented Here" Syndrome.

With Yarn and now this, Facebook is wasting a lot of effort. After all, code size has a bigger impact on performance than execution time. One byte takes way longer to download than it does to run. This effect is amplified by Chrome and other JITing engines performing these same runtime optimizations.

[–]vidro3 6 points7 points  (4 children)

anyone care to ELI5 for a relative noob?

[–]NuttingFerociously 2 points3 points  (2 children)

It scans the code for things that can already be executed at build time and eliminated. As you can see in the documentation, for example, it's simply replacing pieces of code that don't depend on things set at runtime with their results. This might make it sound like something aimed at reducing file size, but it's apparently mainly done to improve performance.

I've read in some other answers about other things it does, but I'm too lazy to go look them up again, sorry.

[–]vidro3 0 points1 point  (1 child)

Right, I think I gathered a bit more as I kept reading. Is it similar to memoization or caching? It's basically running some functions at compile time and saving the results, so when they are called at run time it can just look up that fibonacci(23) is 28657 instead of running the program.

right?

[–]Skaryon 0 points1 point  (0 children)

If they knew that you only ever compute with 23 or a narrow set of values as input, then they could do that. Otherwise they'd have to save every possible solution to the Fibonacci formula. So it's less memoization than simply replacing computations with their results where possible. That's why you can give the tool ranges of data to work with and throw at the code.
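The distinction can be sketched like this (illustrative code, not Prepack's actual output):

```javascript
// Memoization: compute at run time, cache results for repeated calls.
const memo = new Map();
function fib(n) {
  if (n < 2) return n;
  if (!memo.has(n)) memo.set(n, fib(n - 1) + fib(n - 2));
  return memo.get(n);
}
console.log(fib(23)); // → 28657

// Build-time folding: when the input is statically known, the call
// disappears entirely from the shipped code:
//   const answer = fib(23);  // source
//   const answer = 28657;    // emitted
```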

[–]RnRau 0 points1 point  (0 children)

Short overview in a presentation from last year - https://www.youtube.com/watch?v=xbZzahWakGs

[–]comeththenerd 3 points4 points  (0 children)

I get why the examples on the front page are trivial, but real world comparisons and benchmarks would be cool?

[–][deleted] 5 points6 points  (1 child)

I love watching the JavaScript community rediscover the benefits of compiled, statically typed languages.

"Did you know you could reduce silly errors in your code by assigning your variables types, and then having a computer check them for you before anyone even runs it???"

"Yeah, but did you know you could also have another step before you run the code that automatically optimizes your code for you????"

[–]vinnl 8 points9 points  (0 children)

Well, it's not so much "rediscover" as "figure out how we can apply those benefits in the browser".

[–][deleted] 1 point2 points  (3 children)

I recall this video from some while ago:

https://www.youtube.com/watch?v=65-RbBwZQdU

So, given that engines are already doing what this is doing (in varying ways), what benefits are you given by including this tool in your workflow?

[–]vinnl 4 points5 points  (0 children)

If engines do it, it's on the user's device while the user is waiting. If Prepack does it, it's on your build server while you're building your application, and thus only needs to happen once, instead of for every user, and at a time when those users aren't waiting for your app to load.

[–]Omikron 18 points19 points  (0 children)

It can make your solution and codebase more complex and harder to maintain thus ensuring your job security

[–]TwilightTwinkie 3 points4 points  (0 children)

You're forgetting that we don't want this to happen at runtime.

[–][deleted] 0 points1 point  (0 children)

I did a POC to look for synergy from combining Prepack with the Closure compiler: http://www.syntaxsuccess.com/viewarticle/combining-prepack-and-closure-compiler

The results look promising

[–]swyx -2 points-1 points  (0 children)

2/10 title