Parcel CSS: A new CSS parser, compiler, and minifier (parceljs.org)
345 points by Lapz on Jan 12, 2022 | 122 comments


This is amazing! I've seen a lot of work around Rust and JS / JSX / TS / TSX (notably, SWC[0] is becoming a Rust-based version of Babel), but not so much around CSS.

My genuine hope is that this will replace PostCSS sooner rather than later. PostCSS isn't the most performant thing I've worked with, and a lot of PostCSS plugins rely on doing multiple passes over the AST, so they can slow down significantly as a result.

For that, however, we will need some kind of plugin system. I know the SWC[0] project has had some struggles with this, as their current JS API relies on serializing the AST back and forth. I wonder if this forgoes a JS API entirely, or if that is something planned for, say, 1.5?

[0]: https://swc.rs/


Hi, author of Parcel CSS here.

I have been thinking about implementing the CSS OM spec as the JS API. This is the same API that browsers expose for manipulating stylesheets. The advantage of this is that we don't need to invent something custom. Still thinking through options. https://github.com/parcel-bundler/parcel-css/issues/5
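For anyone unfamiliar with it, CSSOM is the live, object-based API browsers already expose for stylesheets. A minimal browser-side sketch (the `.warning` selector is just an example; whether we'd mirror this exactly is still an open question):

  // Standard CSSOM: stylesheets are live objects whose rules can be
  // inspected and edited in place.
  const sheet = document.styleSheets[0] as CSSStyleSheet;

  // Append a rule at the end of the sheet.
  sheet.insertRule(".warning { color: red; }", sheet.cssRules.length);

  // Walk the existing rules and tweak a declaration.
  for (const rule of Array.from(sheet.cssRules)) {
    if (rule instanceof CSSStyleRule && rule.selectorText === ".warning") {
      rule.style.setProperty("font-weight", "bold");
    }
  }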

That said, I think I'd want to keep the use cases for JS plugins limited, because it will affect performance significantly. So, if it's something that exists in an official CSS spec somewhere (even if draft), or a really common transformation, it should be implemented in Rust. An example of this is the support for the CSS nesting spec. Ideally, we'd try to keep the amount of custom syntax being invented around CSS to a minimum and stick to standard CSS syntax wherever possible though.
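For context, the nesting transform looks roughly like this (an illustrative input/output pair; the class names are made up):

  // CSS Nesting draft syntax as input...
  const nested = `
  .card {
    color: black;
    & .title { font-weight: bold; }
  }`;

  // ...and the flat CSS a transpiler emits for browsers that
  // don't support nesting natively.
  const flattened = `
  .card { color: black; }
  .card .title { font-weight: bold; }`;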


Separate question that may seem unrelated, but I'm curious.

How did you "learn" to read these specs effectively? I can read them, and I think I can reasonably understand them, but I have little confidence in saying "yes, I get this."

Is there anything you point to that helps? Tips or recommendations?

I'm trying to learn how to parse these standards docs better myself.


For a spec about a browser feature, "getting it" can mean a few different things.

1. Understanding the purpose of the feature ("why/when would I use this?")

2. Understanding how to implement the feature

3. Understanding how to use the feature

4. Understanding the feature's "corner cases" (surprising implications, cases where it doesn't do what you'd expect, etc.)

5. Understanding why the feature works the way it does (instead of some other way)

Most web specs really only explain how to implement a feature, and even then they're not great at it, because they do such a poor job of explaining the feature's purpose.

Assuming that you, like most of us, aren't working on implementing a browser, that means that web specs are mostly unhelpful to you. It's almost completely beyond the purpose of a spec to teach you how to use a feature, what its corner cases would be (which are often unknown at the time a spec was written), and why the specification says what it says.

This is an area where the web spec community has made some improvements in recent years. Nowadays, it's understood that new proposed specifications shouldn't just provide a specification, but also a separate "explainer" document to explain the purpose of the feature, and also persuade the other browser vendors to implement the feature. ("This will be really cool, and here's why…")

At a minimum, specs nowadays often include a non-normative "Motivation" section, as the CSS Nesting spec does (https://www.w3.org/TR/css-nesting-1/). I think you'll find that you can "get" that spec much better than you can the CSS OM spec (https://www.w3.org/TR/cssom-1/), which is old enough to buy alcohol and doesn't include a "Motivation" section.

You can often find explainer docs linked off of https://chromestatus.com/ e.g. https://github.com/MicrosoftEdge/MSEdgeExplainers/blob/main/... I think you'll find that explainers are 10000% better for learning features than specs are. (They typically even cover #3, #4, and #5, since they usually discuss alternative rejected approaches.)


In addition to dfabulich's excellent comment, I'd also suggest: read specs of tech you're most familiar with first, rather than specs of new cutting edge stuff you're curious about. Doing that will give you huge amounts of extra context because you're just reading something you already know expressed in a different format.


Is this something that can run in browser Workers via wasm or other means?

Most of CSSOM is already implemented in JavaScript, which could come in handy for this project:

https://github.com/NV/CSSOM

https://github.com/jsdom/cssstyle

(jsdom uses both internally; cssstyle is a kind of updated version of some parts of CSSOM)
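For reference, the first package parses a stylesheet into CSSOM-shaped objects. A minimal Node sketch, assuming the `cssom` npm package's documented `parse` entry point:

  import * as CSSOM from "cssom"; // the NV/CSSOM package on npm

  const sheet = CSSOM.parse("body { color: black; }");
  const rule = sheet.cssRules[0];
  console.log(rule.selectorText); // "body"
  console.log(rule.style.color);  // "black"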


Yes, there is a WASM build used by the demo: https://parcel-css.vercel.app

Not sure if we could reuse these packages though, since we'd need to expose an API from Rust to JS anyway.


Does this include Dart Sass (SCSS) processing?


When a site's CSS is well written, by a capable web developer who knows what they are doing, such a tool would not be necessary. People pump out megabytes of JavaScript, but then worry about saving a few kilobytes of CSS by compressing it, at the same time making it less readable by minifying it. (We are not yet shipping hundreds of kilobytes of CSS, are we?!)

When there is a need for a tool that minifies CSS, people seriously need to ask themselves how it can be that they have that much CSS. How much redundant CSS can you accumulate? Was there no coherent styling idea or strategy, so that everything on the page needs separate styling?

What many forget is that data is often sent gzip-compressed anyway, which naturally takes advantage of repeated parts. Text usually compresses pretty well, especially something like a description language with many repeating parts. It is great that Parcel CSS is faster than some other tool. However, for me, these kinds of tools are merely treating the symptoms of not doing styling properly. I'm glad that I got to know the web when you could simply look at all the source of a page and learn from it, instead of facing a multi-megabyte "minified" script and "minified" CSS.


> We are not yet shipping hundreds of kilobytes of CSS, are we?

Well, Bootstrap 5 is ~200K when not minified, and a website usually ships additional CSS resources as well.


Oh my ... Does one need all of that? Isn't there a process of reducing it to only what one needs? I think there was something like that in the past. Not sure how well that works.


> Isn't there a process of reducing it to only what one needs?

One such process is what you're commenting on here in this thread.


I am more talking about "only ever include the CSS rules we need" instead of "let's take the whole of Bootstrap! Oh damn, we have to somehow reduce it!".

If memory serves me right, you could compile the Bootstrap library and choose the parts you needed. Not sure if that is still true for Bootstrap 5. If it is not possible, it stands to reason that perhaps one should not use Bootstrap 5 and should instead write good styling oneself, without all the stuff that will never be used on the site anyway.

CSS can be tricky, but in the end, reasonable styling is usually not that hard to do. Even responsiveness can mostly be done without media queries these days, unless you want content to change when the layout changes. It used to be much harder, when browser standards were not as far along. Nowadays we have flex and grid and more standardized interpretations of CSS rules across browsers. We can make navigations, menus, tiles, and whatever else responsive without that much work.


From the Parcel CSS page: it does "tree shaking", which automatically eliminates unused code. No details on what that implies, though. Tools like PurgeCSS (used by Tailwind pre-v3) comb through the HTML for matching selectors, but I'm not sure if Parcel does the same thing.

Anyway, as other comments said, these kinds of tools nowadays are more "transformers" than mere "minifiers".


That's really nice, but to me it still sounds like the wrong way to go. Instead of reducing something that is too much, that which is too much should not have been added in the first place. We are doing work here, removing what we previously added.

I wonder how much ineffective styling still remains after "tree shaking". I guess the fancy tree shaking is out of luck if we add unnecessary styling to elements that are indeed used. It seems a really hard problem to solve automatically: transforming CSS in a way that takes care of redundancies and unnecessary styling, while still maintaining the readability that well-crafted CSS can have.


> that which is too much should not have been added in the first place.

The problem is, with CSS you don't know if you've added too much. As the project evolves and code and styles change, there's literally no way of knowing if a certain CSS rule is still in use. Well, except regexp'ing the whole project looking for matches.


> regexp'ing the whole project looking for matches

That's basically what these tree-shaking tools do.
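Roughly like this, in fact. A crude sketch of the idea (not how any particular tool is implemented): scan the markup for word-like tokens, then drop any rule whose class selectors match none of them.

  // Keep only CSS rules whose class selectors appear in the markup.
  function purge(css: string, html: string): string {
    const used = new Set(html.match(/[\w-]+/g) ?? []);
    // Naive "selector { body }" matcher: no nesting or at-rules.
    return css.replace(/([^{}]+)\{[^{}]*\}/g, (rule, selector: string) => {
      const classes = selector.match(/\.([\w-]+)/g) ?? [];
      const live = classes.every((c) => used.has(c.slice(1)));
      return live ? rule : "";
    });
  }

  // purge(".btn{color:red}.unused{color:blue}", "<a class='btn'>hi</a>")
  //   => ".btn{color:red}"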


> Isn't there a process of reducing it to only what one needs?

Yes there is:

https://github.com/uncss/uncss

https://github.com/purifycss/purifycss


and https://github.com/leeoniya/dropcss, if you're feeling adventurous ;)


You can switch to WindiCSS or Tailwind and use the optimized output. That is usually a lot shorter.

Tailwind actually has its own JIT compiler for development, since the old, full CSS was extremely long. Windi always generates small CSS, as far as I remember.


This is not a good argument, because no matter how well you write anything, reducing its size is beneficial for everyone.


As someone who regularly adds custom CSS to sites to fix various UI issues/annoyances, I can say that I'm definitely not a fan of minified CSS classes.


What a load of crock. You should always use a CSS minifier to, at minimum, remove whitespace and trailing semicolons.


I doubt there's a noticeable difference after applying any "dumb" file compression scheme, e.g. brotli/gzip.


What does it mean to compile CSS? Pack multiple files into one? Resolve and inline variables? Convert nested rules into flat CSS? Something else?

I'm really not sure.


Its GitHub repo calls it a "transformer" rather than a compiler.

https://parceljs.org/features/plugins/#transformers


> In addition to minification, Parcel CSS handles compiling CSS modules, tree shaking, automatically adding and removing vendor prefixes for your browser targets, and transpiling modern CSS features like nesting, logical properties, level 4 color syntax, and much more.


Elitism about the term "compiler" aside, I've been reading through The History of the FORTRAN Programming Language[0], and funnily enough, "compile" in those historic CS contexts just meant what "to compile" means in English: merging things together. It's an interesting read whatever the case.

[0] https://www.goodreads.com/book/show/52320048-abstracting-awa...


This is not about elitism over terminology, but about sloppiness in using well-established words in a specific context to mean something new. It happens rather frequently in the frontend world, which I'm sometimes a part of, and it's incredibly confusing to all of us.


I write frontend code and backend code and do compiler stuff for fun, and I don't find it confusing. "Compiler" means completely different things in so many contexts that I get the general picture and always need to look deeper to truly understand. Just because it's a generic term doesn't mean people shouldn't be allowed to use it, IMO.


You are right and I phrased my comment the wrong way.

The confusing part is not calling this process compilation; the confusing part is now calling something a "compiler" that the consensus previously called a "pre-processor". Now I've started to wonder if there is something new that I wasn't aware of, like a browser-supported, optimized binary style format that we can compile into.


It's in the first couple paragraphs

> Parcel CSS handles compiling CSS modules, tree shaking, automatically adding and removing vendor prefixes for your browser targets, and transpiling modern CSS features like nesting, logical properties, level 4 color syntax, and much more.


Had you taken the time to open the link, you'd have seen that the 2nd paragraph states its capabilities succinctly:

> Parcel CSS has significantly better performance than existing tools, while also improving minification quality. In addition to minification, Parcel CSS handles compiling CSS modules, tree shaking, automatically adding and removing vendor prefixes for your browser targets, and transpiling modern CSS features like nesting, logical properties, level 4 color syntax, and much more.


I've obviously read that. Most of those features are what we used to call CSS pre-processors, but I'll happily call that compilation if that's a thing now.


It wasn't obvious to me that you had read that paragraph.


Looking forward to an SCSS compiler written in Rust, compiled to WASM, and exported via a node module that no longer requires Node-version- or platform-specific binaries to be either downloaded or compiled.

Oh cool: https://github.com/connorskees/grass

Oh, node-sass is written in C++. I wonder why they aren't shipping the code as WASM rather than using native Node bindings.


`node-sass` is deprecated [0]. Use the `sass` [1] package instead.

[0] https://www.npmjs.com/package/node-sass

[1] https://www.npmjs.com/package/sass


The actual source language is Dart, and this library is basically a transpilation of https://sass-lang.com/dart-sass


Well I'll be...


I was just looking at Parcel a week or so ago to replace the stagnant Snowpack project for our frontend.

Should I just stop trying to avoid it and just use Webpack 5, or can I actually rely on Parcel in a way I couldn't Snowpack?

I just want to bundle my React app, I'm not trying to do anything special...


Use Vite instead; it has a very nice developer experience. Webpack is a "legacy" bundler: while it is still supported, it shouldn't really be used in new projects if you value your sanity.


Webpack is under active development though, nothing legacy about it as far as I'm aware. Are you referring to abandoned third-party plugins maybe?

Vite and all the other bundlers are really awesome, but Webpack's strength (and I think this may also be the root of the issues you seem to be having with the tool) is that there's almost no magic and barely any handholding. Freedom and reliability at the cost of having to build your own configuration (it's just dumping plugin config objects in an array, really not difficult).


I have done manual configs from scratch in Rollup, and even wrote a bunch of plugins for stuff like custom CSS class minifying and web worker support, so no, I don't need handholding :) Setting it all up was even fun; everything was well documented and clear. With Webpack, on the other hand, you only have fun if you like pain.


> there's almost no magic

That depends entirely on your definition of “magic”. Needing a specific incantation of seven different plugins and rules, often conflicting and incomprehensible, just to get a basic project to build is the same as magic to me.

I assume you mean magic == conventions or implicit behavior. I’ll take that over the Webpack mess any day.


Magic as in runtime hacks and precompiled semi-proprietary Rust binaries.

Neither of which are necessarily negatives of course!

The only thing is that for every minute gained from those speedy Rust builds, I lost five more debugging stuff that used to work fine with Babel, plus having to learn this new language to write replacements for now-unsupported plugins :)


> it's just dumping plugin config objects in an array, really not difficult

Until you eject a CRA :)


Which is the cherry on top for those types of self-chastisement fetishists who willingly put themselves at the mercy of CRA ;-)


Check out the CRACO project. It allows configuring CRA without ejecting.


Vite is so much better, IMO. Quicker too, since it uses esbuild in dev mode.


I seriously don't get the hatred Webpack receives. Calling it legacy when it introduced stuff like module federation is unfair, to say the least. Just because a lot of people use CRA or some "zero config" bundler, or because esbuild etc. are picking up momentum, doesn't mean it's bad. The documentation is actually really good, IMO.


The problem with using something like webpack is that you can't use it for something quick.

You can't dip into the documentation for the one thing you're trying to do; you have to spend time reading a large chunk of the documentation, then go to forums and such to try to understand how that one thing has changed over webpack's history.

Then 5 months later when you're doing it for another project, you have to re-read all that documentation again because one other thing is different.

Compare that waste of time and effort to:

A developer is now unavailable and you are taking over the project. The project uses a modern build-system/module-loader/etc. You need to change one thing about the way the project is built.

You don't have to learn an uber-nested spaghetti config file that looks like a domain-specific language. You just read the minimal configuration the previous developer left you, and now you know what to change, or how to add the one thing you need. You've now saved days of work.


I agree that the hatred is probably somewhat unwarranted, and the flexibility and support Webpack has are immense, but based on my experience, I understand some of the people and where it's coming from.

I just replaced Webpack with Vite in a 2-year-old React project with a relatively standard webpack config (SCSS, fonts, asset copying, minification, etc.). I was able to remove 24 devDependencies in total, including the whole of Babel and its related packages, and a bunch of webpack plugins that broke our build on every package upgrade (e.g. to Webpack 4, then again for Webpack 5).

We now have nothing to do with Babel or with hard-to-understand plugins that someone added because it solved a build error, nothing to do with webpack; just one almost non-existent config file, instant HMR, and a 3x faster build...


One thing that always put me off about webpack is the default way it compiles your code. One of the modes is (was?) compiling code as _eval_(!!) statements, with the code as a string. It is absolutely impossible to debug such code.

You were meant to rely on sourcemaps to get something on your debugger, but despite using the latest Chrome and developer tools at the time, I could never get it to work to actually debug sites.

I know Rollup, Vite, etc. had a much easier time providing a better developer experience because they rely on the browser's native ESM support, but I never could understand why webpack decided to mangle the code so badly.
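For reference, that behavior is controlled by webpack's `devtool` option; in development mode it defaults to `eval`. A minimal config sketch:

  // webpack.config.js (sketch). The `eval` family wraps each module in
  // an eval("...") string for fast rebuilds; `source-map` emits plain
  // code plus a separate .map file that debuggers handle more reliably.
  module.exports = {
    mode: "development",
    devtool: "source-map", // instead of the development default, "eval"
  };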


> Webpack is a "legacy" bundler

How did you end up with this assessment? I'm not a fan either, but it's still very popular, I doubt it will go away in the next 5 years.


It won't go away; it will still be supported and used for many years, but the majority of new projects today won't use it. Brief history: Rollup dramatically improved the bundler plugin and ES modules situation, then came new super-fast bundlers like esbuild, and finally solutions combining both, like Vite. Webpack is still playing catch-up with that. There are literally zero reasons to choose clunky, slow Webpack configs while alternatives exist, unless you have a legacy codebase and migrating is not an option.


> literally zero reasons

What about module federation? Has any other bundler implemented that?


It's a relatively minor feature used by relatively few people.


Do you know any success stories with that?


I haven't heard any, but I haven't gone looking either. I'm just saying there's at least literally one reason to use webpack for new code.


I'd argue that Vite's still pretty immature. It's great if you can stick within what works with it, but webpack is still the gold standard in terms of compatibility and features, and I suspect that will be the case for another couple of years while the native code bundlers (esbuild, swc, etc) catch up with their JS counterparts.


> Webpack is a "legacy" bundler

What a weird thing to say. I feel like people say this because they don't understand its uses and its versatility.


Parcel has been around for a while, and the project currently has a lot of activity. I've made the call to migrate an old create-react-app project to Parcel 2, and in general it is going well.

The nice thing about Parcel is that it is intentionally low-config with relatively intuitive defaults, which tells me that it wouldn't be terribly difficult to move to a different bundler in the future. Webpack has gotten better, but you can quickly end up with a _lot_ of Webpack-specific config that makes future migrations hard.

That said, optimizing for future migrations shouldn't generally be your #1 priority - I also like Parcel's speed, and new things like ParcelCSS just validate that further.


Snowpack, ESBuild, ..., all promised us nice frontend compiles, yet I still turn to Webpack every time I actually want everything to work; you always hit roadblocks you can't solve in these other tools.

That'd be my recommendation to you, still, sadly.


Not my experience. Both esbuild and vite have been great for my usecases.


I recommend Vite. I use ESBuild in some libraries but it has a learning curve. Vite's experience is similar to Webpack.


Parcel is becoming my go-to choice most of the time now. I’ve migrated some apps out of CRA, pure Webpack and Snowpack to it and I need to say it was incredible.

Some minor issues with HMR on one of those projects that I still can't figure out. Anyway, having something that works, with JSX, with TS, really fast, in just two or three really simple steps is amazing! You don't even need to turn your brain on.

My two cents.


Just a stab in the dark, but I was having HMR issues as well and it turned out to be an obscure transitive dependency conflict - details and solutions here on the outside chance it's the same thing affecting your project https://github.com/parcel-bundler/parcel/issues/6685


I'm also on that issue with you :)

Sadly in my use case it is not a dependency conflict, but I'm failing to replicate it (and actually not even trying very hard). I guess it will automagically fix itself on some minor update.


I have been using esbuild for a while, and I can say that it's looking like a huge improvement over webpack, snowpack, vite, etc.

I made a web app starter that uses esbuild to bundle a react-redux app [0], and my experience with the bundler was very positive.

[0] https://github.com/samhuk/tree-starter


Vite and snowpack are built on top of esbuild, so I’m not sure your comment makes sense.


The point being that, IMHO, extensions over esbuild tend not to improve the developer experience, because they have a tendency to bloat and steer towards being "webpack but esbuild", which defeats the point of esbuild being the "anti-webpack".


Tools such as Vite aren't designed to improve the developer experience of ESBuild. They exist separately and leverage ESBuild's strong qualities: JS, TS and CSS bundling and minification.


I used Parcel for about a week a month ago. Couldn't get the HTTPS server to use my certificate and key and would have needed to write a plugin so that it wouldn't rename favicons and delete files not mentioned in the HTML file.

So: esbuild for "no configuration", as it is significantly faster than Parcel, and Webpack for more complex stuff.


> Parcel CSS is based on the cssparser[0] Rust crate, a browser-grade CSS tokenizer created by Mozilla and used in Firefox. This provides a solid foundation, including tokenization and basic parsing. However, it does not interpret any CSS properties or at rules. That's where Parcel CSS comes in. It handles parsing each individual rule and property value, as well as minification, compilation, and printing back to CSS.

https://github.com/servo/rust-cssparser


It's really nice to see browser components being able to be reused easily rather than a bunch of half-assed parsers.


Nice too that it's a compiled language, so you get the end tool in a nice static binary. As a non-Node dev, I hate the experience of hacking on some project and having to install a giant pool of NPM stuff just to run some minifier or linter. Hound is an example of this: the guts of the project are Go, but it has a frontend that uses webpack, jest, etc.: https://github.com/hound-search/hound

Which is fine, I guess; definitely use the right tool for the job. And maybe Node developers hate finding my Python projects and needing to set up a virtualenv to run them in. But all the same, I approve of a direction where more of this kind of tooling is available without a build-time Node dependency.


The bigger win here is projects that increasingly do more and do it faster. You can essentially replace Babel's hydra with TypeScript (a single dependency) and do it faster too. TS also essentially supports "preset-env" via the target property (however, you must pick the ES version, not a browser list).

The important part now is ensuring that these projects don’t just die and disappear like, say, Rich Harris’ “buble” tool (a very old “fast Babel alternative”)


Agreed. If cssparser blows chunks on your CSS, then at least you know it will probably break some of your user agents too, rather than playing the game of "is my tool broken, or am I?"


I feel stupid asking this question, but isn't it the browser that parses CSS? I understand minifying CSS, but what does Parcel's CSS parsing refer to?


To do the job of minifying, the minifier first reads the CSS file into an internal data structure. That process is called parsing.

All the minifying operations are done on that internal datastructure.

Then it writes it out again into a text file, ready for the browser to parse again; hopefully the minifying step made it a bit easier for the browser to parse this minified CSS.
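As a toy illustration of that round trip (a deliberately naive sketch that only handles a flat declaration list, nothing like a real CSS parser):

  interface Declaration { property: string; value: string }

  // Parse "color: red; margin: 4px" into structured declarations.
  function parse(css: string): Declaration[] {
    return css.split(";").filter((d) => d.trim()).map((d) => {
      const [property, value] = d.split(":").map((s) => s.trim());
      return { property, value };
    });
  }

  // Print back without optional whitespace: minification at its simplest.
  function print(decls: Declaration[]): string {
    return decls.map((d) => `${d.property}:${d.value}`).join(";");
  }

  // print(parse("color: red;  margin: 4px")) => "color:red;margin:4px"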


It parses the CSS so it can improve the minification, for example converting 'margin: 4px 4px 4px 4px' into 'margin:4px'.
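That rewrite is a pure value-level transform. Something along these lines (a simplified sketch covering only the four-equal-values case; real minifiers also handle the 2- and 3-value forms, units, and many other properties):

  // Collapse "4px 4px 4px 4px" to "4px" when all four sides match.
  function collapseShorthand(value: string): string {
    const parts = value.trim().split(/\s+/);
    if (parts.length === 4 && parts.every((p) => p === parts[0])) {
      return parts[0];
    }
    return parts.join(" ");
  }

  // collapseShorthand("4px 4px 4px 4px") => "4px"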


For packages that have images and CSS, it reads the CSS and extracts image files to be bundled.


It would be really great for me if any of these CSS preprocessors offered a standalone command-line version that didn't need npm to install. My developer blog (https://ajxs.me) is built by a homegrown static-site generator written in Python, which reads data from a simple SQLite database. I've been looking for a suitable CSS preprocessor for a while; however, the ones I can find are all totally overcooked for my needs, and nearly exclusively designed to be integrated with Node.js.


For some reason I trust minifier / compiler benchmarks way more than I trust DB benchmarks. I wonder why that is?

Maybe I expect DB workloads to be much more varied, so benchmarks are less representative. Whereas with minifying, you can run it against some large project and expect it'll reflect your real-world experience.


Is it just me, or is Parcel starting to lose focus and performance?

Back in the v1 days, things just worked without config. With v2, I need some config, even a basic glob pattern to include all files in a directory is now a plugin, and for some reason a Parcel process takes 3GB of memory to compile a smallish project. I don't know how to fix it or pinpoint the cause.


Unrelated, but regarding SCSS/SASS: I am curious why we invented a new syntax for SCSS instead of writing a library in Python or Go that maps data objects to CSS classes. You'd have full access to a proper programming language instead of this new DSL we need to learn. I tried searching for it but had no luck; any reason why we don't do this?


I can't speak for all features in SCSS/SASS. I'd prefer JS (for reusability) to Python, Go, and DSLs, but why open the Turing-complete Pandora's box in the first place when declarative works so incredibly well?

Regular CSS didn't have variables for a long time, which was the killer use case, IMO. I haven't been in the loop for a while, but I just found out that there is a CSS Custom Properties[1] standard, which would solve most of my use cases. Heck, they're even scoped! Front-end folks have an unhealthily low threshold for taking on dependencies for things that are already supported natively, IMO.

[1]: https://developer.mozilla.org/en-US/docs/Web/CSS/Using_CSS_c...
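For anyone who hasn't used them: custom properties are set and read through the same standard style APIs. A small browser-side sketch (the `.button` selector is made up):

  // Define a custom property on the root element, equivalent to
  // declaring `:root { --accent: #ff0; }` in a stylesheet...
  document.documentElement.style.setProperty("--accent", "#ff0");

  // ...and consume it anywhere below via var(), with an optional fallback.
  const el = document.querySelector<HTMLElement>(".button")!;
  el.style.color = "var(--accent, black)";

  // Reading the resolved value back:
  const accent = getComputedStyle(document.documentElement)
    .getPropertyValue("--accent"); // "#ff0"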


It's called interoperable CSS [0], or CSS modules [1].

[0] https://github.com/css-modules/icss

[1] https://github.com/css-modules/css-modules


> I am curious, why did we invent a new syntax for scss instead of writing a library in Python or Go that maps data-objects to CSS classes?

I mean, if using JS/TS is not beneath you, I'm pretty sure any CSS-in-JS solution fits the bill.


Haven't read it yet, but: does minification include summering? Like, if a child has a property that it doesn't actually need, because it will inherit it anyway since it is set on the parent too, we could remove that property and save a line. Does that make sense?


Summarizing*


The syntax lowering and modules handling might be really nice. Maybe even the tree shaking. Minification, though, especially with this level of parsing, just seems like an elaborate waste of effort if the web server is set to gzip the final CSS file anyway.


"Parcel CSS is based on the cssparser Rust crate, a browser-grade CSS tokenizer created by Mozilla and used in Firefox."

Fantastic work, I believe cssparser could benefit from Parcel CSS if they contributed back.


We actually already have! I contributed hwb() color support as part of my work on Parcel CSS, and it shipped in Firefox 96! https://github.com/servo/rust-cssparser/commit/62d63fea751df...


Wonderful, thanks for your contributions!


After I saw the title, I thought, "oh, it seems to occupy roughly the same space as esbuild, but for CSS. I wonder if the devs gave any thought to performance?" Then I clicked the link and saw that there is a direct comparison with esbuild, with Parcel being 3x faster on a large real-world benchmark (Bootstrap 4).

This is really impressive. Although Rust tooling is rather suboptimal, Rust programs seem to have quite the performance edge. I'll take the RESF any day as long as it means getting away from ultra-heavy webtech everywhere.


> Rust tooling is rather suboptimal

In what way? People seem to really like cargo.


The Rust compiler is both dog-slow and massive (from both a source and a binary perspective), doesn't have a working incremental compilation mode yet, and doesn't support in-process hot-patching. There's no Rust REPL (hacks like papyrus don't count). Poor structural editing support. Integration with various editors/IDEs is lacking (e.g. there's no support for reporting possible performance issues with code constructs in any editor that I'm aware of, no in-editor per-function disassembly or LLVM IR inspection, no borrow-checker lifetime annotation, and no extract-function refactoring).

Cargo being good is necessary, but not sufficient.


I hope this can support SugarSS or other indented syntaxes easily. As far as I can tell, you'd still need to slap in PostCSS to transform to CSS first, which defeats the purpose. Semicolons and brackets, as with JSON, make for a condensed format for compressing, but they have an awful writing experience and contribute nothing to readability.


Parcel is a great project that has made it much quicker to develop a SPA. This only furthers my sense of joy for it.


Very nice! Always great to see these new tools, especially if they can speed up build times.


In the playground, why does ‘yellow’ get converted to ‘#ff0’ but ‘green’ remains ‘green’?


Because the hex for green (#00ff00) needs more letters than using the named color.


Wouldn't that shorten to #0f0? But "green" is #008000, which doesn't shorten?


Hmm, I wonder which approach is actually better overall when it comes to content encoding like their own site uses (Brotli compression), or client-side parsing performance. It's all probably a bit off into the weeds over something like 0.05% performance, though.



Ah thanks, I assumed '#0f0' as well. I guess it is length-based then.
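Presumably the rule is just "emit whichever spelling is shortest". A sketch of that decision (the name table here is a tiny made-up subset of the real CSS named-color list):

  const NAMES: Record<string, string> = {
    "#ffff00": "yellow", // but "#ff0" is shorter still
    "#008000": "green",  // no 3-digit form exists for this one
  };

  function shortestColor(hex6: string): string {
    const candidates = [hex6];
    // #rrggbb collapses to #rgb only when each channel repeats its digit.
    const [r1, r2, g1, g2, b1, b2] = hex6.slice(1);
    if (r1 === r2 && g1 === g2 && b1 === b2) {
      candidates.push(`#${r1}${g1}${b1}`);
    }
    const name = NAMES[hex6];
    if (name) candidates.push(name);
    return candidates.reduce((a, b) => (b.length < a.length ? b : a));
  }

  // shortestColor("#ffff00") => "#ff0"
  // shortestColor("#008000") => "green"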


That's an... interesting optimization, and one that might make sense if you only care about byte size, but intuition (which might be wrong!) tells me that this will be more expensive (especially if it saves only one or two bytes).

I'd bet that browsers can more quickly parse a string like '#00ff00' into their internal color representation than they can parse the string 'green'. It's probably faster to check for a '#' prefix and convert hex-encoded ASCII to u8 values than it is to normalize the string and do a lookup in a dictionary of 147 CSS3 color names, even when taking into account the extra two bytes that need to be transferred.


More importantly, I would expect #00ff00 to compress better than green, because it would usually make the CSS file more predictably repetitive. Reducing network bytes is usually very important for speeding up loading (at least it is in the boondocks of the internet - out at the rim of the world).


A related thing that I realised a few days ago about compression algorithms:

  <!doctype html><meta charset=utf-8>
  <!DOCTYPE html><meta charset="utf-8">
Compress these with gzip, and the first is smaller than the second (56 and 58 bytes): lowercase doctype because you’re using very few uppercase letters in your document (on slightly larger samples it tends to save a byte or two), and omit the quotes as unnecessary. On larger documents there will be some places where you need quotes around attribute values, but it’s still worth omitting them when you can.

LZMA, similar: 60 and 63 bytes.

But then compress these with Brotli, and it’s the other way around by a larger margin, 29 and 19 bytes, because Brotli ships a dictionary primed on arbitrary web content. And so it becomes a popularity contest, and an inferior but vastly more popular technique compresses better.

In the case of #008000/green/#00ff00/#0f0/lime/#ffff00/#ff0/yellow, the dictionary doesn’t look to be tainted, so traditional length and repetition wisdom still applies.
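The sizes are easy to reproduce with Node's built-in zlib (a small sketch; exact byte counts can vary with zlib/Brotli versions and settings):

  import { gzipSync, brotliCompressSync } from "node:zlib";

  const variants = [
    '<!doctype html><meta charset=utf-8>',
    '<!DOCTYPE html><meta charset="utf-8">',
  ];

  for (const html of variants) {
    const gz = gzipSync(Buffer.from(html)).length;
    const br = brotliCompressSync(Buffer.from(html)).length;
    console.log(`gzip=${gz}B brotli=${br}B  ${html}`);
  }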


Someone else mentioned this already, but "green" is not #00ff00. It's #008000.


I don't get the sense that runtime CSS parsing is really much of a concern; it's more about build speed and asset size. Since 'green' might be used hundreds of times in a large CSS bundle, the optimization might make sense even with an imperceptible speed cost in the browser.


> one that might make sense if you only care about byte size

A minimizer should care a lot about size. I'm not sure which would be faster, but the difference must be minimal anyway, so it's safe to focus on your default concern.


I doubt the difference it takes to parse 'green' vs '#00ff00' is at all significant. It should only happen once per page load.


> That's an... interesting optimization, and one that might make sense if you only care about byte size, but intuition (which might be wrong!) tells me that this will be more expensive (especially if it saves only one or two bytes).

This is an interesting premise. Do you know of any tools that optimize web pages for parsing and rendering speed as opposed to size? I wrote a tool once that bundles and archives webpages for offline viewing, and in those cases network throughput is not an issue, since files are stored locally. It could be interesting to see what performance benefits I might be able to get from optimizing for parsing speed in this case, although I suspect the performance differences will be so negligible that it won't make much of a difference. Still, it would be an interesting experiment nonetheless.


I don't know, but I wouldn't be surprised if the browser first tried a dict lookup (which is super fast) and only then tried to actually parse the hex string.


I don't have intuition here, but perhaps transferring two bytes over a slow network takes longer than the string normalization and dictionary lookup?


> Because the hex for green (#00ff00) needs more letters than using the named color.

But "#0f0" is fewer letters than "green"?


Is it supporting glob patterns?


That’s awesome, great job!


Why does a product written in Rust have "JS" (JavaScript) in its URL?

I nearly didn't look at this because I didn't want another JavaScript dependency in my pipeline.


It seems Parcel CSS is a project from the creators of Parcel JS, which is a build tool. The URL isn't linking to the top-level page on the domain.



