Rethinking the dev stack: Web Components vs React, Dependabot vs govulncheck
Lately I keep seeing variations of “React became the default because the web was immature.” In the same week, I ran into similarly structured claims in two very different ecosystems. On the web frontend: “With Custom Elements + Shadow DOM we no longer need React.” In the Go world: “Turn Dependabot off and switch to govulncheck.” Both share the same structure: reexamining the tools we’ve taken for granted.
Are Web Components really making React unnecessary?
Every time I heard “Web Components mean you don’t need a framework,” my reaction used to be “not this again.” But after reading Stephan Schwab’s piece, the framing “the browser itself has become the framework” felt surprisingly accurate.
Hidden costs of frameworks
The cost of sticking with React or Vue isn’t just “learning cost.” There’s major-version churn, rewrites for deprecated patterns, and dependency management across the build toolchain. Many teams struggled with the move to concurrent rendering in React 18, or the Vue 2 → 3 shift to the Composition API.
By contrast, code based on web standards comes with backwards compatibility guaranteed by browser vendors. Custom Elements written 10 years ago still run on today’s browsers. Framework maintainers can stop maintenance when budgets, staffing, or priorities change; browser vendors can’t. Once a feature lands in the web platform, it’s effectively supported forever.
The three APIs
Custom Elements
A mechanism to register behavior on custom HTML tags. There are four lifecycle callbacks.
```javascript
class TaskCard extends HTMLElement {
  // Declare which attributes to observe
  static observedAttributes = ['title', 'description', 'status'];

  connectedCallback() {
    // Called when the element is added to the DOM (React's mount equivalent)
    this.render();
  }

  disconnectedCallback() {
    // Called when the element is removed from the DOM (cleanup)
    this.cleanup();
  }

  attributeChangedCallback(name, oldValue, newValue) {
    // Called when an attribute listed in observedAttributes changes
    if (oldValue !== newValue) this.render();
  }

  adoptedCallback() {
    // Called when the element is moved to another document via
    // document.adoptNode() (e.g. moving between iframes)
  }

  render() {
    this.innerHTML = `
      <div class="task ${this.getAttribute('status') || ''}">
        <h3>${this.getAttribute('title') || ''}</h3>
        <p>${this.getAttribute('description') || ''}</p>
      </div>
    `;
  }

  cleanup() {
    // Remove event listeners, cancel timers, etc.
  }
}

customElements.define('task-card', TaskCard);
```
Only attributes listed in observedAttributes will trigger attributeChangedCallback. Attributes not listed won’t be observed. Attributes are strings, so when you want to pass objects or arrays, assign them directly as properties instead.
```javascript
// Attribute (strings only)
document.querySelector('task-card').setAttribute('title', 'Task name');

// Property (any type)
document.querySelector('task-card').data = { id: 1, tags: ['urgent'] };
```
This duality—attributes are strings, complex data goes through properties—differs from React props. If you don’t internalize it early, it’s easy to stumble.
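To make the property side concrete, here is a minimal sketch of a component that accepts complex data through a property and re-renders on assignment. The `DataCard` name and its render output are invented for illustration, and the fallback base class exists only so the sketch also runs outside a browser (in a real page the base class is always `HTMLElement`).

```javascript
// Fallback keeps the sketch runnable outside a browser (e.g. in Node);
// in an actual page, HTMLElement is always defined.
const Base = typeof HTMLElement !== 'undefined' ? HTMLElement : class {};

class DataCard extends Base {
  #data = null;

  // Complex values (objects, arrays) arrive through the property...
  set data(value) {
    this.#data = value;
    this.render(); // ...and assignment triggers a re-render
  }

  get data() {
    return this.#data;
  }

  render() {
    // A real component would write to this.innerHTML or a shadow root;
    // here we just build the string so the behavior is observable.
    this.lastRender = `tags: ${(this.#data?.tags ?? []).join(', ')}`;
  }
}

// In a browser you would also register it:
// customElements.define('data-card', DataCard);
```

Pairing a setter like this with `attributeChangedCallback` gives you both channels: attributes for simple, serializable values and properties for everything else.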
Shadow DOM
Provides encapsulation for structure and styles.
```javascript
class StyledCard extends HTMLElement {
  constructor() {
    super();
    this.attachShadow({ mode: 'open' });
  }

  connectedCallback() {
    this.shadowRoot.innerHTML = `
      <style>
        :host { display: block; }
        :host([variant="highlighted"]) { border-left: 3px solid blue; }
        .card { padding: 1rem; border-radius: 8px; background: var(--card-bg, #f5f5f5); }
      </style>
      <div class="card"><slot></slot></div>
    `;
  }
}
```
Use the :host selector to style the component itself, and :host([attr]) to vary by attributes. The <slot> element enables composition patterns.
CSS variables (custom properties) flow through Shadow DOM boundaries. As in var(--card-bg, #f5f5f5) above, you can inject themes from the outside. This is the only officially supported way to tweak styles across a Shadow DOM boundary.
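As a sketch of that theming mechanism, the page-level rule below overrides the card background from outside the shadow boundary. It assumes the `StyledCard` class above was registered under the tag name `styled-card` (that registration isn’t shown in the example).

```css
/* Anywhere in page-level CSS, outside the component */
styled-card {
  /* flows through the shadow boundary into var(--card-bg, #f5f5f5) */
  --card-bg: #e8f0fe;
}
```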
Native event system
DOM’s native events remove the need for global state managers or prop drilling.
```javascript
// Child component dispatches an event
this.dispatchEvent(new CustomEvent('item-selected', {
  detail: { itemId: this.selectedId, metadata: this.itemData },
  bubbles: true,
  composed: true // propagate across Shadow DOM boundaries
}));

// A parent listens
document.addEventListener('item-selected', (e) => {
  document.querySelectorAll('[data-filterable]').forEach(panel => {
    panel.applyFilters(e.detail);
  });
});
```
bubbles: true propagates up the DOM tree; composed: true crosses Shadow DOM boundaries. With composed: false, the event stays inside the shadow root, which you can use for private, internal events.
Shadow DOM gotchas
Shadow DOM is powerful, but there are a few traps.
Form participation
Inputs inside a shadow tree don’t submit values to an outer <form>. Use the ElementInternals API to explicitly participate.
```javascript
class FormInput extends HTMLElement {
  static formAssociated = true;

  constructor() {
    super();
    this.internals = this.attachInternals();
    this.attachShadow({ mode: 'open' });
  }

  connectedCallback() {
    this.shadowRoot.innerHTML = `<input type="text" />`;
    this.shadowRoot.querySelector('input').addEventListener('input', (e) => {
      this.internals.setFormValue(e.target.value);
    });
  }
}
```
You need both static formAssociated = true and attachInternals(). Forget either and the form value won’t be submitted.
Limitations of ::slotted()
The ::slotted() pseudo-element only styles direct children inserted into a slot. Grandchildren and deeper descendants are not targeted.
```css
/* Works: direct children passed to the slot */
::slotted(p) { color: red; }

/* Doesn’t work: styling descendants of slotted content */
::slotted(p span) { font-weight: bold; }
```
To work around this, pass styles via CSS variables or pre-style the content before it’s slotted.
Accessibility and ARIA
IDs inside a shadow tree aren’t addressable from the outside. You can’t point aria-labelledby or aria-describedby at elements inside another shadow root, so use ElementInternals properties such as ariaLabel, or set aria-label directly.
Global CSS reset
Reset/normalize styles don’t penetrate Shadow DOM. Either write local resets per component, @import a shared stylesheet, or use Constructable Stylesheets.
```javascript
const sharedStyles = new CSSStyleSheet();
sharedStyles.replaceSync(`*, *::before, *::after { box-sizing: border-box; }`);

class MyComponent extends HTMLElement {
  constructor() {
    super();
    const shadow = this.attachShadow({ mode: 'open' });
    shadow.adoptedStyleSheets = [sharedStyles];
  }
}
```
Using adoptedStyleSheets lets multiple components share a single stylesheet instance, which is memory-efficient.
Declarative Shadow DOM
Traditionally, Shadow DOM could only be created from JavaScript. To address criticism that it plays poorly with SSR and progressive enhancement, Declarative Shadow DOM was introduced.
```html
<my-card>
  <template shadowrootmode="open">
    <style>
      .card { padding: 1rem; background: var(--card-bg, #f5f5f5); }
    </style>
    <div class="card"><slot></slot></div>
  </template>
  <p>Card contents</p>
</my-card>
```
When the HTML parser encounters <template shadowrootmode="open">, it creates the shadow root without waiting for JS execution. If you assemble HTML on the server, the page renders with Shadow DOM applied before JS loads. Supported in Chrome, Firefox, and Safari.
When to use React vs Web Components
| Perspective | React | Web Components |
|---|---|---|
| Component communication | props, Context, state libraries | DOM events, CSS variables |
| Style scoping | CSS Modules, CSS‑in‑JS | Shadow DOM |
| Lifecycle | Hooks like useEffect | connectedCallback, etc. |
| Build tools | Required | Not required (native execution) |
| Backwards compatibility | Version‑dependent | Guaranteed by browser vendors |
| Bundle size | Runtime overhead | Zero |
| SSR | Needs Next.js, etc. | Declarative Shadow DOM |
| Form integration | Conventional | Requires the ElementInternals API |
| Testing | Rich Testing Library ecosystem | Testable with standard DOM APIs |
Where React fits: teams already fluent in React and optimizing for velocity, complex SPA state management, or when you want to leverage its vast ecosystem.
Where Web Components fit: sharing an internal design system across multiple products, minimizing framework lock‑in for long‑term maintenance, or using them as boundary components in a micro‑frontend setup.
This isn’t an either‑or, all‑in choice. You can mix Web Components into a React app with no technical problem. Because Web Components operate at the HTML boundary of frameworks, you can adopt them incrementally.
Adoption at scale
This is no longer just “good on paper.” From 2025 into 2026, large products accelerated their adoption of Web Components.
Shopify is the most symbolic case. In October 2025 it released Polaris Web Components as stable and moved the React version of Polaris to maintenance mode. New Shopify apps and extensions default to the Web Components version. UI extensions moved to Preact with a 64 KB bundle‑size cap. It’s an organization‑level pivot driven by the “hidden costs of frameworks.”
Adobe built Spectrum 2 on Web Components to provide a framework‑agnostic design system that unifies diverse teams under one brand. Salesforce continues to push Lightning Web Components (LWC); in 2026 Lightning Out 2.0 reached GA, and an AI assistant called AgentForce Vibes even appeared to generate LWC from natural language. ING Bank was an early adopter, building the Lion component library that meets financial compliance requirements.
Meanwhile, GitHub is moving in the opposite direction in some areas. Despite pioneering Web Components internally via Catalyst and github-elements, as of February 2026 it put Primer ViewComponents into maintenance mode and recommends migrating to Primer React. Reality isn’t “Web Components everywhere”—it depends on use case.
By the way, Chrome page‑load stats show Web Components usage rising from 10% to 18% (as of 2024). Full browser support without polyfills clearly helped.
Framework choice is less about “which is technically superior” and more about “what fits this project’s lifespan and org structure.” If you ask whether a five‑page static site needs React 19’s Server Components, the answer is obvious (I wrote about making exactly that call and migrating to Astro in this post). Conversely, for a large SPA with complex state, React’s ecosystem is still compelling.
Original article: Web Components: The Framework‑Free Renaissance
Turn Dependabot off and switch to Go vulnerability checking
Filippo Valsorda wrote “Turn Dependabot Off.” The title is provocative, and the content lands just as hard.
The noise Dependabot creates
Dependabot is GitHub’s automatic dependency‑update tool. It can detect vulnerabilities and open PRs, but its detection logic is the problem.
The edwards25519 incident on filippo.io is a clear example. There was an issue in the MultiScalarMult method, but almost no projects actually called it. Even so, Dependabot opened PRs across thousands of repos. Maintainers had to spend real time reviewing “fixes” for code paths they never invoked. This is the essence of false positives at the package‑version level.
Reachability analysis vs. version matching
govulncheck doesn’t just flag “a vulnerable version X.Y.Z is present in your dependency tree.” Instead, it builds a call graph and reports only vulnerabilities in code you actually call. Findings are split into vulnerabilities in symbols your code calls (“Called”) and ones in packages you merely import (“Imported”), and only the former demand immediate action.
This translates into operational differences:
- You won’t get PR floods for vulnerabilities in functions you don’t use.
- You can prioritize remediation for findings that are actually reachable.
- You reduce reviewer fatigue and the chance of rubber‑stamping noisy PRs.
With Dependabot, teams often end up distinguishing three piles—“critical,” “maybe later,” and “ignore”—and the border between them constantly shifts. With govulncheck, you can focus on “Called.”
Moving to scheduled tests
Filippo proposes combining two GitHub Actions.
One is a scheduled govulncheck run:
```yaml
# .github/workflows/govulncheck.yml
name: govulncheck

on:
  schedule:
    - cron: '0 9 * * 1' # every Monday at 09:00 UTC
  push:
    branches: [main]

jobs:
  govulncheck:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: golang/govulncheck-action@v1
```
The other runs your test suite against the latest dependency versions. It bumps everything with go get -u ./... and then runs tests.
```yaml
name: Test with latest deps

on:
  schedule:
    - cron: '0 9 * * 1'

jobs:
  test-latest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
        with:
          go-version: stable
      - run: go get -u ./...
      - run: go mod tidy
      - run: go test ./...
```
If tests fail, you fix them. Instead of generating piles of PRs and exhausting reviewers, you only detect changes that actually break you. It’s a different philosophy from Dependabot’s, where updating version numbers is itself the goal.
Beyond the Go ecosystem
govulncheck’s symbol‑level analysis is closely tied to Go’s compiler and type system, but tools with similar ideas are emerging elsewhere:
- Rust: `cargo audit` works at the crate level; combine it with `cargo-vet` for supply‑chain attestation.
- Node.js: `npm audit` operates at the package‑version level. Reachability analysis isn’t mainstream yet, but Socket.dev analyzes dependency behavior.
- Python: `pip-audit` + the OSV database.
The core idea—“don’t just look at package versions; look at the code paths you actually call”—is language‑agnostic.
What’s new in govulncheck
In the v1.1 line, govulncheck expanded its output formats:
- SARIF output (`-format sarif`): enables direct integration with GitHub Code Scanning.
- OpenVEX output (`-format openvex`): a standardized vulnerability‑exchange format.
- Stable API (the `golang.org/x/vuln/scan` package): programmatic access with feature parity to the govulncheck CLI; easier to integrate into custom tools.
- Up to 15% faster on large programs.
For practitioners integrating with CI pipelines or security dashboards, SARIF support is especially valuable.
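As a sketch, the scheduled workflow above could be extended to upload SARIF results to Code Scanning. The `output-format`/`output-file` inputs and the `upload-sarif` step reflect my reading of `golang/govulncheck-action` and `github/codeql-action`; verify them against those actions’ documentation before relying on this.

```yaml
jobs:
  govulncheck:
    runs-on: ubuntu-latest
    permissions:
      security-events: write # needed to upload SARIF to Code Scanning
    steps:
      - uses: actions/checkout@v4
      - uses: golang/govulncheck-action@v1
        with:
          output-format: sarif
          output-file: govulncheck.sarif
      - uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: govulncheck.sarif
```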
The shift away from Dependabot
Filippo’s post isn’t a lone opinion; by February 2026 it was becoming a trend. Andrew Nesbitt’s “16 Best Practices for Reducing Dependabot Noise” lists 16 tactics to cut the noise. Notably, some are half‑tongue‑in‑cheek: “put lockfiles in .gitignore,” “merge into a monorepo so Dependabot analysis times out,” and similar hacks are being seriously discussed. On February 20, 2026, Mehdi.cc published “No more dependabot in these repositories,” reporting on developers who disabled Dependabot after reading Filippo’s article.
GitHub itself recognizes the issue, adding auto‑triage rules, preset filtering, and automatic dismissal of npm false positives. But there’s still no reachability analysis for Go, so govulncheck keeps its advantage.
Connection to supply‑chain attacks
There’s another often‑missed risk in a flow where “upgrading versions” becomes the goal. If you enable auto‑merge, you can automatically ingest malicious package updates. In February 2026 we saw the SANDWORM_MODE worm targeting the npm ecosystem, with 19 typo‑squatted packages discovered. In Clinejection, which abused the Cline AI bot, attackers even weaponized GitHub Actions cache poisoning.
When adopting the GitHub‑Actions approach Filippo proposes, harden the workflows themselves too: be mindful of the pull_request_target trigger and secret‑exposure surfaces.
govulncheck’s call‑graph analysis reduces false positives by looking only at code paths you actually invoke. The same direction is emerging elsewhere. Anthropic’s Claude Code Security relies on AI‑based understanding rather than purely rule‑based SAST to detect vulnerabilities spanning multiple components. Different languages and approaches, same push toward “understanding code beyond pattern matching.”