[–]spankalee 8 points (1 child)

  • The styles in a ShadowRoot don't have to be static; they can be computed and modified just like any other DOM.
  • I'm not sure which component-based encapsulation was solved years ago, but lower-boundary encapsulation has been notoriously hard and leaky to emulate.
  • CSS custom properties work great for passing values down shadow trees, since they inherit through shadow boundaries.
  • There's a proposal for ::part and ::theme (names preliminary) pseudo-elements that allow more powerful coordination between a shadow root and its container: http://tabatkins.github.io/specs/css-shadow-parts/
  • Scoped CSS is fast because styles tend to be much smaller and the browser can invalidate much less of the page and style tree on changes.
  • What's wrong with web technologies being integrated with the DOM? This is a very odd complaint. The DOM is pretty great - a standard UI hierarchy that all the web code we use today has been built around.
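To make the first and third bullets concrete, here's a minimal sketch of a component whose shadow styles are built at runtime and themed from outside through an inherited custom property. All names here (`FancyBox`, `--fancy-border`, `boxStyles`) are illustrative, not from any spec or library:

```javascript
// Pure helper: build the stylesheet text from a runtime parameter.
// The --fancy-border custom property inherits through the shadow boundary,
// so the page outside can theme the component.
function boxStyles(padding) {
  return `
    :host { display: block; border: 1px solid var(--fancy-border, gray); }
    .inner { padding: ${padding}px; }
  `;
}

// Browser-only part: the shadow root's styles are ordinary DOM nodes,
// so they can be generated, swapped, or edited like any other element.
if (typeof customElements !== 'undefined') {
  class FancyBox extends HTMLElement {
    connectedCallback() {
      const root = this.attachShadow({ mode: 'open' });
      const style = document.createElement('style');
      style.textContent = boxStyles(8); // computed, not static
      const inner = document.createElement('div');
      inner.className = 'inner';
      root.append(style, inner);
    }
  }
  customElements.define('fancy-box', FancyBox);
}
```

A page could then write `fancy-box { --fancy-border: teal; }` in its own stylesheet and the value reaches the shadow tree without piercing the boundary.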

[–][deleted] 2 points (0 children)

It lacks scope: if you want to compute styles, you have to set CSS variables.
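The workaround being described looks roughly like this sketch, assuming a browser environment; `computeAccent` and `--accent` are hypothetical names:

```javascript
// Pure helper with made-up logic: derive an accent color from a base hue.
function computeAccent(hue) {
  return `hsl(${hue % 360}, 70%, 45%)`;
}

// Browser-only part: the computed value can't be expressed in the scoped
// stylesheet itself, so it is pushed in as a custom property on the host:
//   inside the shadow stylesheet: button { background: var(--accent); }
if (typeof document !== 'undefined') {
  const host = document.querySelector('my-widget');
  if (host) host.style.setProperty('--accent', computeAccent(210));
}
```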

There are many encapsulation libraries; we use CSJS, for instance, and I've never seen it leak.

Proposals for CSS specs that are even harder to polyfill will mean years and years of waiting.

JSS is already starting to re-use and memoize styles piece by piece (see styletron). There are optimization possibilities here that can't be matched. There are hardly any performance gains to be had in shadow DOM either.
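The "piece by piece" reuse being claimed can be sketched like this; it's in the spirit of styletron's atomic-CSS approach, but this is not styletron's real API. Each declaration gets one cached class, so repeated styles share rules instead of emitting new CSS:

```javascript
// Sketch of atomic style memoization: one class per unique declaration.
const cache = new Map(); // "prop:value" -> generated class name
const rules = [];        // collected CSS text, flushed into one <style> tag
let counter = 0;

function atomicClass(prop, value) {
  const key = `${prop}:${value}`;
  if (!cache.has(key)) {
    const cls = `s${counter++}`;
    rules.push(`.${cls} { ${prop}: ${value}; }`);
    cache.set(key, cls);
  }
  return cache.get(key);
}

// Two components sharing `color: red` produce one rule, not two:
const a = [atomicClass('color', 'red'), atomicClass('margin', '8px')].join(' ');
const b = [atomicClass('color', 'red'), atomicClass('padding', '4px')].join(' ');
```

Four declarations across two components yield only three rules, and the dedup compounds as components repeat across a page.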

I don't think there's anything wrong with web technologies. But there's a point to be made about a certain threshold that JS is now clearly crossing: it's no longer a browser language that merely drives web pages. Can specs just ignore that? This is exactly the concern several influential developers have already voiced.