Fastest Front End Tooling for Humans and AI (cpojer.net)
conartist6 1 days ago [-]
It's funny to me that people should look at this situation and say "this is OK".

The upshot of all these projects to make JS tools faster is a fractured ecosystem. Who, given the choice, would honestly want to try to maintain JavaScript tools written in a mixture of Rust and Go? Already we've seemingly committed to having a big schism in the middle. And the new tools don't replace the old ones, so to own your tools you'll need to make Rust, Go, and JS all work together, using a mix of clean modern technology and shims into horrible legacy technology. We have to maintain everything, old and new, because it's all still critical; engineers have to learn everything, old and new, because it's all still critical.

All I really see is an explosion of complexity.

dfabulich 1 days ago [-]
So, what's your counterproposal?

Each of these tools provides real value.

* Bundlers drastically improve runtime performance, but it's tricky to figure out what to bundle where and how.

* Linting tools and type-safety checkers detect bugs before they happen, but they can be arbitrarily complex, and benefit from type annotations. (TypeScript won the type-annotation war in the marketplace against other competing type annotations, including Meta's Flow and Google's Closure Compiler.)

* Code formatters automatically ensure consistent formatting.

* Package installers are really important and a hugely complex problem in a performance-sensitive and security-sensitive area. (Managing dependency conflicts/diamonds, caching, platform-specific builds…)

As long as developers benefit from using bundlers, linters, type checkers, code formatters, and package installers, and as long as it's possible to make these tools faster and/or better, someone's going to try.

And here you are, incredulous that anyone thinks this is OK…? Because we should just … not use these tools? Not make them faster? Not improve their DX? Standardize on one and then staunchly refuse to improve it…?

conartist6 1 days ago [-]
I'm being a little coy because I do have a very detailed proposal.

I want the JS toolchain to stay written in JS, but I want to unify the design and architecture of all those tools you mentioned so that they can all use a common syntax tree format and share data, e.g. between the linter and the formatter, or the bundler and the type checker.
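
A minimal sketch of the shape of that, using acorn as a stand-in estree parser; lint/format/check are hypothetical consumers, not real APIs:

  // Parse once into a shared estree-style tree, then hand the same
  // tree to every tool instead of letting each one re-parse.
  import { readFile } from "node:fs/promises";
  import { parse } from "acorn"; // any estree-producing parser works here

  const source = await readFile("app.js", "utf8");
  const ast = parse(source, { ecmaVersion: "latest" });

  lint(ast);   // hypothetical: the linter walks the shared tree
  format(ast); // hypothetical: the formatter prints from the same tree
  check(ast);  // hypothetical: the type checker annotates it in place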

notnullorvoid 1 days ago [-]
Yeah, it's a shame that few people realize that running 3 (or more) different programs, each with its own parser and AST, is the bigger problem.
conartist6 1 days ago [-]
Not just because of perf (though the perf aspect is annoying) but because of how often the three will get out of sync and produce bizarre results
nicoburns 24 hours ago [-]
Hasn't that already been tried (10+ years ago) with projects like https://github.com/jquery/esprima ? Which have since seen usage dramatically reduced for performance reasons.
conartist6 23 hours ago [-]
Yeah, you are correct. But that means I have the benefit of ten years of development in the web platform, as well as hindsight on the earlier effort.

I would say the reason the perf costs feel bad there is that the abstraction was unsuccessful. Throughput isn't all that big a deal for a parser if you only need to parse the parts of the code that have actually changed.

9dev 24 hours ago [-]
You can rip fast builds from my cold, dead hands. I'm not looking back to JS-only tooling, and I've been there since the gulp days.
conartist6 23 hours ago [-]
All I can say for sure is that the reason the old tools were slow was not that the JS runtime is impossible to build fast tools with.

And anyway, these new tools tend to have a "perf cliff": you get all the speed of the new tool as long as you stay away from the JS integration API used to support the "long tail" of use cases. Once you fall off the cliff, though, you're back to the old slow-JS cost regime...

9dev 12 hours ago [-]
> […] the reason the old tools were slow was not that the JS runtime is impossible to build fast tools with.

I don't have them at hand right now, but there are various detailed write-ups from the maintainers of Vite, oxc, and more that address this specific argument and point out that the JavaScript runtime was indeed a hard limitation on the throughput they could achieve, making Rust a necessity for improving build speeds.

conartist6 11 hours ago [-]
Why do you need high throughput though? Isn't that a metric of how fast a batch processing system is?

Why are we still treating batch processing as the controlling paradigm for tools that work on code? If we fully embraced incremental recomputation and shifted the focus to avoiding re-doing the same work over and over, batch processing speed would become largely irrelevant as a metric.
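
A minimal sketch of the idea, assuming nothing about any particular tool: key parse results by content hash so unchanged files are never re-parsed.

  import { createHash } from "node:crypto";

  const cache = new Map(); // content hash -> previously computed AST

  function parseIncremental(source, parse) {
    const key = createHash("sha256").update(source).digest("hex");
    if (!cache.has(key)) cache.set(key, parse(source)); // work only on change
    return cache.get(key); // unchanged input costs a hash, not a parse
  }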

dcre 1 days ago [-]
I look at it and don't really have an issue with it. I have been using tsc, vite, eslint, and prettier for years. I am in the process of switching my projects to tsgo (which will soon be tsc anyway), oxlint, and oxfmt. It's not a big deal and it's well worth the 10x speed increase. It would be nice if there was one toolchain to rule them all, but that is just not the world we live in.
philipwhiuk 24 hours ago [-]
How do you plan to track CVEs flagged on tsgo's native dependencies?
dcre 23 hours ago [-]
I only use it for typechecking locally and in CI. I don’t have it generating code. Of course, what is generating my code is esbuild and soon Rolldown, so same issue maybe. If CVEs in tsgo’s deps are a big risk to run locally, I would say I have much bigger problems than that — a hundred programs I run on my machine have this problem.
TheAlexLichter 1 days ago [-]
The good part is that the new tools do replace the old ones, while being compatible. The pattern is:

* Rolldown is compatible with Rollup's API and can use most Rollup plugins

* Oxlint supports JS plugins and is ESLint compatible (can run ESLint rules easily)

* Oxfmt plans to support Prettier plugins, in turn using the power of the ecosystem

* and so on...

So you get better performance and can still work with your favorite plugins and extend tools "as before".
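
For example, an ordinary ESLint-style rule, the same shape the JS plugin support runs; the rule name and message here are made up:

  // no-console-log.js
  module.exports = {
    meta: { messages: { avoid: "Avoid console.log in committed code." } },
    create(context) {
      return {
        CallExpression(node) {
          const c = node.callee;
          if (c.type === "MemberExpression" &&
              c.object.name === "console" &&
              c.property.name === "log") {
            context.report({ node, messageId: "avoid" });
          }
        },
      };
    },
  };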

Regarding the "mix of technology" or tooling fatigue: I get that. We have to install a lot of tools, even for a simple application. This is where Vite+[0] will shine, bringing the modern and powerful tools together, making them even easier to adopt and reducing the divide in the ecosystem.

[0] https://viteplus.dev/

notnullorvoid 20 hours ago [-]
As far as I'm aware, oxlint only supports plugins for non-type-aware rules, and type-aware rules themselves aren't fully stable because they rely on a fork of tsgo.
TheAlexLichter 14 hours ago [-]
That is correct: every rule with a custom parser (e.g. vue/svelte/astro templates) and also type-aware rules can't be used as a JS plugin.

Type-aware rules are indeed not marked as stable but work like a charm. tsgolint is indeed tsgo + shims + some work, but that won't change soon, as tsgo won't have a JS API for a while.

conartist6 22 hours ago [-]
So you really think everyone in JS should have to learn Rust or else be excluded from sharing in the ownership of their critical infra..?
TheAlexLichter 15 hours ago [-]
1) This is not what I said, no

2) With AI, languages and syntax matter even less nowadays.

3) There have been a good number of contributors (e.g. for Oxc) that came out of the JS world, so it isn't impossible

4) Realistically, the avg. web dev does not contribute to tooling internals, at most custom rules or similar. The concepts are a bigger "hurdle" than the lang.

conartist6 12 hours ago [-]
That still leaves you admitting that only a small fraction of the served community can really contribute. You'll need to keep all the best benefits of your work for the Plus users or else there would be no reason to buy Plus and no way to keep paying the few to do all the work for the many.

You're stuck telling people what they can't have (and shouldn't want) while I'm now in a position to just give people what they want. I admire the people who work there, but you need a new business model and fast because I am unequivocally going to collapse your current one.

TheAlexLichter 11 hours ago [-]
> That still leaves you admitting that only a small fraction of the served community can really contribute.

Not really (see above).

> You'll need to keep all the best benefits of your work for the Plus users or else there would be no reason to buy Plus and no way to keep paying the few to do all the work for the many.

No, won't happen that way.

> You're stuck telling people what they can't have (and shouldn't want) while I'm now in a position to just give people what they want.

Didn't see any software of yours yet, only big talk so far sadly! Besides that, VoidZero will also be in a position to just give people what they want

conartist6 10 hours ago [-]
VoidZero is betting I'm just talk, ok, that's fair. But the way you say it makes me think that your evaluation of me is mostly on hearsay, because if you had actually tried to find out how serious I am I suspect you'd be less flippant. You're welcome in the Discord!
lelandfe 1 days ago [-]
e: ahhh frick this is just stupid AI spam for this dude’s project.

Supports… some ESLint rules. It is not "easy" to add support to Oxlint for the rules it does not support.

The projects at my work that "switched" to it now use both ESLint and Oxlint. It sucks, but at least a subset of errors are caught much faster.

dcre 23 hours ago [-]
Vite+ is not “this dude’s project”, it’s made by the team that makes all the tools discussed in this article.
TheAlexLichter 23 hours ago [-]
Yeah, no. Real human here.

Oxlint does support core rules out of the box but has support for JS plugins[0] as mentioned. If you don't rely on a custom parser (for Svelte or Vue components, for example), things just work. Even React compiler rules[1].

[0] https://oxc.rs/docs/guide/usage/linter/js-plugins.html [1] https://github.com/TheAlexLichter/oxlint-react-compiler-rule...

lelandfe 9 hours ago [-]
Definitely read AI tonality into the earlier comment, noticed it didn't call out your relationship to it, then saw that you had a comment history plugging it, and made assumptions.

My apologies. I'll follow through to the links next time.

conartist6 11 hours ago [-]
So as long as you only need the pre-installed software it's a great device, eh? I'm the PC to your game console here. Parser extension? Piece of cake for us. Heck, just to showboat, we actually extended our es6 parser from our es3 parser, and then implemented each later standard as an extension of the earlier one. We're able to run parsers for pretty much any programming language, and we make them super easy to write. We can do cross-language transforms with ease. We can be our own system of version control! We're going to be a real threat to GitHub. VoidZero is not even trying to do this stuff. Your vision is just so... small.
TheAlexLichter 11 hours ago [-]
As said in another comment: curious to see what you come up with! Talk is cheap
CodingJeebus 1 days ago [-]
> We have to maintain everything, old and new, because it's all still critical, engineers have to learn everything, old and new, because it's all still critical.

I completely agree, but maintenance is a maintainer problem, not a problem for the consumer or user of the package, at least according to the average user of open source nowadays. One of two things will come of this: either the wheels start falling off once the community can no longer maintain this fractured tooling, as you point out, or companies pick up the slack and start stewarding it (likely looking for opportunities to capture tooling and profit along the way).

Neither outcome looks particularly appealing.

NewsaHackO 1 days ago [-]
Yes, this just sounds like the run-of-the-mill specialization issue that is affecting every industry (and has been affecting every industry before AI). Web devs learn Javascript/Typescript/frameworks, "middleware" developers learn Rust/Go/C++/etc. to build the web development frameworks, lower-level devs build that, etc. There shouldn’t be a strict need for someone who wants to make websites or web technology to learn Rust or Go unless they want to break into web framework development or WASM stuff. But again, this is just over-specialization that has been happening since forever (or at least since the Industrial revolution).
riskable 1 days ago [-]
> All I really see is an explosion of complexity.

I thought this was the point of all development in the JavaScript/web ecosystem?

co_king_5 1 days ago [-]
In retrospect, the tolerance for excess complexity in the JS/npm/yarn/web framework ecosystem was an important precursor to the wanton overconsumption of today's LLM ecosystem.
cod1r 1 days ago [-]
It's definitely an explosion of complexity but also something that AI can help manage. So :shrug: ...

Based on current trends, I don't think people care about knowing how all the parts work (even before these powerful LLMs came along) as long as the job gets done and things get shipped and it mostly works.

fsmedberg 1 days ago [-]
I'm very surprised the article doesn't mention Bun. Bun is significantly faster than Vite & Rolldown, if it's simply speed one is aiming for. More importantly, Bun allows for simplicity: install Bun and you get a bundler included, TypeScript just works, and it's blazing fast.
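
e.g. bundling for the browser is one command (a sketch; see Bun's docs for the full flag set):

  bun build ./src/index.ts --outdir ./dist --target browser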
yurishimo 1 days ago [-]
IMO Bun and Vite are best suited for slightly different things. Not to say that there isn't a lot of overlap, but if you don't need many of the features Bun provides, it can be a bit overkill.

Personally, I write a lot of Vue, so using a "first party" environment has a lot of advantages for me. Perhaps if you are a React developer, the swap might be even more straightforward.

I also think it's important to take into consideration the other two packages mentioned in this post (oxlint & oxfmt) because they are first class citizens in Vite (and soon to be Vite+). Bun might be a _technically_ faster dev server, but if your other tools are still slow, that might be a moot point.

Also, TypeScript "just works" in Vite as well. I have a project at work that is using `.ts` files without even a `tsconfig` file in the project.

https://vite.dev/guide/features#typescript

squidsoup 1 days ago [-]
Worth mentioning that both oxfmt/oxc are in alpha. I would put money on them replacing prettier and eslint, but they're not ready for production yet.
kevinfiol 1 days ago [-]
It's been a while since I've tried it, but post-1.0 release of Bun still seemed like beta software and I would get all sorts of hard to understand errors while building a simple CRUD app. My impression from the project is the maintainers were adding so many features that they were spread too thin. Hopefully it's a little more stable now.
dcre 1 days ago [-]
Bun and Vite are not really analogous. Bun includes features that overlap with Vite but Vite does a lot more. (It goes without saying that Bun also does things Vite doesn't do because Bun is a whole JS runtime.)
canadiantim 1 days ago [-]
Bun can replace vite?
netghost 1 days ago [-]
Bun ships with lots of tools built in. It has support for bundling js, html, etc for the browser.

I suspect that if you want the best results or to hit all the edge cases you'd still want vite, but bun probably covers most needs.

TheAlexLichter 15 hours ago [-]
Not really.
gaoshan 1 days ago [-]
This smells of "I like to solve puzzles and fiddle with things" and reminds me of hours spent satisfyingly tweaking my very specific and custom setups for various things technical.

I, too, like to fiddle with optimizations and tool configuration puzzles but I need to get things done and get them done now. It doesn't seem fast, it seems cumbersome and inconsistent.

ssgodderidge 1 days ago [-]
> It doesn't seem fast, it seems cumbersome and inconsistent

I think the point of this project is to provide an opinionated set of templates aimed at shipping instead of tinkering, right? "Don't tinker with the backend frameworks, just use this and focus on building the business logic."

conradkay 1 days ago [-]
It seems like all you have to do is paste 2-3 prompts
Narretz 1 days ago [-]
Kinda crazy that ts-node is still the recommendation when it hasn't been updated since 2023. And likewise crazy that no other lib has emerged that has typescript compilation and typechecking. Of course if it works, don't fix it, but typescript has evolved quite a bit since 2023.
dcre 1 days ago [-]
I like tsx for this, and it's actively maintained. The author may not know about it. https://github.com/privatenumber/tsx
TheAlexLichter 11 hours ago [-]
Love the fact that you don't need anything ts-node/tsx-like if you have erasable syntax only. Other than that, there is https://github.com/oxc-project/oxc-node too.
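
A minimal sketch, assuming Node 23.6+ (where type stripping is on by default) and TypeScript 5.8+ for the flag:

  // greet.ts: erasable syntax only, i.e. plain type annotations with no
  // enums, namespaces, or parameter properties, so Node can strip and run it.
  function greet(name: string): string {
    return `Hello, ${name}!`;
  }
  console.log(greet("world"));

  // tsconfig.json excerpt to have tsc reject non-erasable syntax:
  //   { "compilerOptions": { "erasableSyntaxOnly": true } }
  // Then run it directly: node greet.ts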
EvgheniDem 1 days ago [-]
The bit about strict guardrails helping LLMs write better code matches what we have been seeing. We ran the same task in loose vs strict lint configurations and the output quality difference was noticeable.

What was surprising is that it wasn't just about catching errors after generation. The model seemed to anticipate the constraints and generated cleaner code from the start. My working theory is that strict, typed configs give the model a cleaner context to reason from, almost like telling it what good code looks like before it starts.
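
For the curious, the kind of "strict, typed config" I mean (illustrative tsconfig options, not our exact setup):

  {
    "compilerOptions": {
      "strict": true,
      "noUncheckedIndexedAccess": true,
      "exactOptionalPropertyTypes": true,
      "noImplicitOverride": true,
      "noFallthroughCasesInSwitch": true
    }
  }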

The piece I still haven't solved: even with perfect guardrails per file, models frequently lose track of cross-file invariants. You can have every individual component lint-clean and still end up with a codebase that silently breaks when components interact. That seems like the next layer of the problem.

takeaura25 1 days ago [-]
We've been building our frontend with AI assistance and the bottleneck has shifted from writing code to reviewing it. Faster tooling helps, but I wonder if the next big gain is in tighter feedback loops — seeing your changes live as the AI generates them, rather than waiting for a full build cycle.
EvgheniDem 23 hours ago [-]
Exactly this. And what makes it compound is that you cannot build muscle memory for patterns you have already reviewed. Same prompt, different output every time, so every generation is a fresh read even if you have seen similar code before.

The feedback loop angle is interesting. Real-time linting during generation rather than after could help catch issues earlier, but I think the deeper problem is the non-determinism. Even with instant feedback, if the output changes on each run you are still starting from scratch each time.

Have you found anything that actually reduces the review time per component, or is it mostly about finding issues faster?

simonbw 1 days ago [-]
Are your frontend builds actually so slow that you're not seeing them live? I've gotten used to most frontend builds being single digit seconds or less for what feels like a decade now.
EvgheniDem 23 hours ago [-]
Not build speed, the human review cycle. When the AI generates a component, I still need to read through it manually to make sure it does what I intended, handles edge cases, and fits the existing patterns. That takes 8-12 minutes per component regardless of how fast the build is.

The slow part is not the computer. It is me reading AI-generated code line by line before I trust it enough to ship.

austin-cheney 1 days ago [-]
Any method for front end tooling is potentially the fastest. It always comes down to what you measure and how you measure it. If you don't have any measures at all, then your favorite method is always the fastest no matter what, because you live in a world without evidence.

Even after consideration of measurements, radical performance improvements are more typically the result of the code's organization and the techniques employed than of the language it's written in. But, of course, that cannot be validated without evidence from a comparison of measurements.

The tragic part of all this is that everybody already knows this, but most front end developers do not measure things and may become hostile when measurements do occur that contradict their favorite techniques.

codingdave 1 days ago [-]
I have yet to meet a front-end dev that gets hostile when you show them how their code can be improved. On the contrary, the folks I have worked with are thrilled to improve their craft.

Unless of course you are not showing them improvements and are instead just shitting on their work. Yes, people do get hostile to that approach.

simonw 1 days ago [-]
I take it you've never suggested to a front-end dev that maybe their contact form doesn't need 1MB+ of JavaScript framework and could just be HTML that submits to a backend.
philipwhiuk 24 hours ago [-]
Live-form validation? Auto-complete? Any of these ringing a bell?

It's almost like there are genuine UX improvements being done

simonw 22 hours ago [-]
That should be a few dozen lines of vanilla JavaScript.
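
Something like this sketch (the element id is made up):

  const input = document.querySelector("#email"); // hypothetical field
  input.addEventListener("input", () => {
    const ok = /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(input.value);
    input.setCustomValidity(ok ? "" : "Please enter a valid email.");
    input.reportValidity(); // surfaces the browser's native error UI
  });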
Zardoz84 16 hours ago [-]
You know that pure HTML has it? And if you need more complex validation, a few lines of js do the magic. Same if you need live autocomplete.
simonw 8 hours ago [-]
Good point, HTML with no JavaScript at all has pretty good validation built in now. Try entering an incomplete email address on this demo page: https://tools.simonwillison.net/html-validation-demo
austin-cheney 1 days ago [-]
Then you and I are talking to different people. Fortunately, I don't work in JavaScript for employment any more. As a frame of reference, just the mere mention that a site could be 50-200x faster by dumping React creates conflicts of interest for the impacted developers, and the reaction is typically not immediately welcoming. That isn't shitting on anybody's work, especially if you provide guidance for improvement, but if a large group of developers cannot function without React, their perception of "shitting on their work" will be less objective.
johnfn 1 days ago [-]
It doesn't surprise me that you got a lot of people upset at you. "Dumping React" is not a viable strategy for the large majority of organizations. This would be like saying that you could improve performance by rewriting the backend into Rust.
austin-cheney 1 days ago [-]
Perfect example of what I am talking about.

People want faster software... until they are confronted by challenging decisions. JavaScript can be very fast. JavaScript, in the browser, reports a page load of about 0.06 seconds for my large personal SPA, and that includes state restoration. That is determined by using performance.getEntries()[0].duration in the browser.
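
For anyone reproducing the measurement, a slightly more explicit form of the same call:

  // Navigation Timing: duration spans navigation start to loadEventEnd.
  const [nav] = performance.getEntriesByType("navigation");
  console.log(`page load: ${nav.duration.toFixed(1)} ms`);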

When conflicts arise, people more frequently become emotional and complain about the situation than make any decision towards resolution one way or the other. That is a psychological problem called cognitive conservatism[1]. About half the time that emotional output is some form of deflection, such as hostility. Cognitive conservatism can only persist when there is insufficient pressure on the thought leaders to impose a resolution.

It's okay to say you don't really want to be faster.

[1] https://en.wikipedia.org/wiki/Conservatism_(belief_revision)

See also cognitive complexity: https://en.wikipedia.org/wiki/Cognitive_complexity#In_psycho...

agos 1 days ago [-]
Insinuating the person you’re discussing with has a psychological problem is also not a great way to win minds
austin-cheney 1 days ago [-]
I did no such thing. That you see such is an example of front end developers seeing everything through emotionally tinted glasses. If you want to talk numbers we can talk numbers, but it doesn't matter if the first question is whether or not the numbers offend you.
johnfn 22 hours ago [-]
> When conflicts arise people most frequently become emotional and complain about the situation than make any decision towards resolution one way or the other. That is a psychological problem called cognitive conservatism[1]. About the half the time that emotional output is some form of deflection, such as hostility. Cognitive conservatism is only allowed to exist when there is insufficient pressure on the thought leaders to impose a resolution.

Please consider that it is your own behavior which leads to these responses. When you are getting repeated "hostile" reactions from everyone you try to state your case to, instead of insisting that everyone else is wrong, it might be wise to look inwards and ask yourself if you are the problem.

It is interesting that the only reason you can think of for people disagreeing with you is that they are emotional. Have you considered that you might be wrong?

austin-cheney 16 hours ago [-]
It’s not about right or wrong. It’s about the numbers. You are either faster or not. People who are easily offended really want it to be all about right or wrong. The only reason I can think of for abandoning or discarding evidence is emotion. Whether that is right or wrong, I don’t care.
johnfn 15 hours ago [-]
The success of an organization is very rarely dependent solely on website performance. Speed is just one dimension in a vast and multi-dimensional optimization space. Spending time improving performance means you are necessarily not spending time improving one of those other vectors. It is a question of priorities - and suggesting that others who say that other priorities are more important are "emotional" is failing to grapple with that reality.

I don't doubt you have been correct to say performance can be improved. Performance always can be improved. It just likely doesn't matter.

austin-cheney 11 hours ago [-]
Yes. Everything else matters too, but that is still not an excuse to throw away evidence.

If you don’t want to be faster it’s okay.

johnfn 5 hours ago [-]
I am not throwing away evidence. I am saying "You are correct that removing React would make most websites faster. It would also be a strategy that few places should pursue."
insin 1 days ago [-]
Any plans to create a combined server + web app template using @hono/vite-dev-server for local development, with both sides of auth preconfigured, with the server serving up the built web app in production?

I've used this setup for my last few projects and it's so painless, and with recent versions of Node.js which can strip TypeScript types I don't even need a build step for the server code.
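
The wiring is roughly this (from memory; check the plugin's README for exact options):

  // vite.config.ts
  import { defineConfig } from "vite";
  import devServer from "@hono/vite-dev-server";

  export default defineConfig({
    plugins: [devServer({ entry: "src/server.ts" })], // your Hono app entry
  });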

Edit: oops, I didn't see nkzw-tech/fate-template, which has something like this, but running client and server separately instead

Exoristos 1 days ago [-]
All y'all need more RAM in your development laptops. Maybe. At least, I've never been bothered by the performance of standard tooling like prettier, ESLint, and npm.
agos 1 days ago [-]
ESLint on medium/big projects can be pretty slow and if you use type aware rules it opts out of caching
squidsoup 1 days ago [-]
On a large codebase, eslint is quite slow.
sunaookami 1 days ago [-]
Oxfmt!? I just switched from ESLint and Prettier to Biome!
sibeliuss 23 hours ago [-]
Biome missed the bus by not getting mixed language formatting in soon enough. Oxfmt got it, and then won.
kevlened 19 hours ago [-]
> Oxfmt got it, and then won.

This doesn't align with the data.

  - 650k npm downloads for oxfmt this week [0]
  - 4.3m npm downloads for biome this week [1]

  - 19.1k gh stars for oxfmt [2]
  - 23.7k gh stars for biome [3]
  - gap widening in favor of biome [4]

  [0] https://www.npmjs.com/package/oxfmt
  [1] https://www.npmjs.com/package/@biomejs/biome
  [2] https://github.com/oxc-project/oxc
  [3] https://github.com/biomejs/biome
  [4] https://www.star-history.com/#oxc-project/oxc&biomejs/biome&type=date&legend=top-left
nimonian 1 days ago [-]
I really liked biome but it kept murdering my .vue files
h4ch1 1 days ago [-]
dprint ftw, works very well with Svelte as far as I've seen.

Biome and oxc* never worked properly with Svelte, but I haven't tried them in the past 9 or so months, since I switched from prettier to dprint.

bingobongodev 1 days ago [-]
You can omit tsc with https://oxc.rs/docs/guide/usage/linter/type-aware.html#type-..., so that's one less script to run in parallel
the_harpia_io 1 days ago [-]
the ecosystem fragmentation thing hit me pretty hard when i was trying to set up a consistent linting workflow across a mono-repo last year. half the team already using biome, half still on eslint+prettier, and adding any shared tooling meant either duplicating config or just picking a side and upsetting someone

i get why the rust/go tools exist - the perf gains are measurable. but the cognitive overhead is real. new engineer joins, they now need 3 different mental models just to make a PR. not sure AI helps here either honestly, it just makes it easier to copy-paste configs you don't fully understand

e10jc 1 days ago [-]
Very cool list but why no mention of biome? I’ve been using that on a recent project and it’s been pretty great. Also bun test instead of vitest.
huksley 1 days ago [-]
One nitpick is Claude Code on the web does not do linting by default, so you need to run lint for its changes manually.
_pdp_ 1 days ago [-]
This is a good list. Bookmarked.
philipwhiuk 24 hours ago [-]
Can't wait for the first crypto-attack on a front-end JS library that's caused by a Go package vuln. God knows how `pnpm audit` will handle Go-module dependencies.

(I opened an issue against typescript-go to flag this https://github.com/microsoft/typescript-go/issues/2825 )

dejli 1 days ago [-]
It looks more functional. I like it.
vivzkestrel 1 days ago [-]
get rid of both Oxfmt and Oxlint and use biome OP
loevborg 1 days ago [-]
what are the pros and cons of oxlint vs biome?
vivzkestrel 20 hours ago [-]
biome does the work of all 3 of those libraries for starters
elxr 1 days ago [-]
For what reason?
vivzkestrel 20 hours ago [-]
Because it is a single tool that does linting, formatting, and checking sort order. You really don't need 3 libraries in its place.
fullstackchris 1 days ago [-]
anyone have any insight as to why microsoft chose go? I feel like with rust it could have been even faster!
steveklabnik 1 days ago [-]
They said at the time that Go let them keep the overall structure of the code, that is, they weren't trying to do a re-implementation from scratch, more of a port, and so the port was more straightforward with Go.
sublinear 1 days ago [-]
I'm confused by this, but also curious what we mean by "fastest".

In my experience, the bottleneck has always been backend dev and testing.

I was hoping "tooling" meant faster testing, not yet another layer of frontend dev. Frontend dev has been pretty fast even when done completely by hand for the last decade or so. I have livecoded, and have seen others livecode, on 15 minute calls with stakeholders or QA to mock some UI or debug. I've seen people deliver the final results from that meeting just a couple of hours later. I mean that's what goes to prod, minus some very specific edge case bugs that might even get argued away and never fixed.

Not trying to be defensive of pure human coding skills, but sometimes I wonder if we've rolled back expectations in the past few years. All this recent stuff seems even more complicated and more error prone, and frontend is already those things.

whstl 1 days ago [-]
It's about raw performance. The tools mentioned mostly optimize for fast parsing, fast compilation/transpilation, etc.