
JPEG-XL revival, PQC migration via Merkle Tree Certificates, and other changes in Chrome 145-146

Ikesan

Following the changes introduced between Chrome 145 and 146, I found the directions so varied that it was hard to believe they all came from a single browser: the revival of JPEG-XL, the PQC transition, AI agent APIs, and zero-days. I dug into each of them.

JPEG-XL revival (Chrome 145)

JPEG-XL, whose removal from Chrome in 2022 caused a lot of backlash, is back in Chrome 145 (stable release on February 10, 2026). The stated reason for removal was "not enough ecosystem interest," but that changed when Apple's Safari 17 added support in 2023.

In its revival, it uses the Rust-based decoder “jxl-rs” instead of C++ libjxl. It’s part of a growing trend toward Rust adoption across Chromium.

However, currently it cannot be used unless the flag is enabled with chrome://flags/#enable-jxl-image-format. A timeline for default enablement has not been announced.

Format comparison

DebugBear benchmark (990KB JPEG photo):

| Format | Size | vs. JPEG | Encoding speed | Progressive |
| --- | --- | --- | --- | --- |
| WebP | 700KB | -29% | Fastest | No |
| AVIF | 507KB | -49% | Slow | No |
| JPEG XL | 472KB | -52% | Medium to fast | Yes |

Cloudinary’s large-scale tests (over 40,000 images) show that JPEG XL is 20% smaller than AVIF and encodes 2.5 times faster. Another strength is lossless recompression from existing JPEGs, which can reduce data by 20-30% with no quality loss.

Convert with cjxl

It can be converted using the cjxl command included in libjxl. Install with brew install jpeg-xl on macOS and apt install libjxl-tools on Ubuntu.

# Visually lossless (distance 1.0)
cjxl -d 1.0 input.png output.jxl

# For the web (slightly size-oriented)
cjxl -d 2.0 input.png output.jxl

# Fully lossless
cjxl -d 0 input.png lossless.jxl

# Lossless recompression of an existing JPEG (zero quality loss, fully reversible to the original JPEG)
cjxl input.jpg output.jxl

# Maximum compression (effort 9: slow, but smallest size)
cjxl -d 1.0 -e 9 input.png output.jxl

For distance, 0 is lossless, 1.0 is visually lossless, 1.0 to 2.0 suits web use, and 2.0 to 3.0 favors smaller files.

GNU parallel is useful for batch conversion: it is more efficient to process many images at once with one thread each than to allocate multiple threads to a single image.

parallel -j16 cjxl -e 9 --num_threads=0 -d 1.0 {} {.}.jxl ::: *.png

Browser support and actual operation

| Browser | Support status |
| --- | --- |
| Safari | Native support since 17 (animation/progressive not supported) |
| Chrome | Rust decoder shipped in 145; flag required |
| Firefox | jxl-rs integration in progress; no stable timeline |

JXL support currently reaches roughly 12% of browser users (mainly Safari's share). At Interop 2026, Apple, Google, Microsoft, and Mozilla are all participating with JPEG XL as a "research item," showing cross-vendor intent to address it.

If you want to use it on the web right now, create a fallback with the <picture> element.

<picture>
  <source srcset="image.jxl" type="image/jxl">
  <source srcset="image.avif" type="image/avif">
  <source srcset="image.webp" type="image/webp">
  <img src="image.jpg" alt="description">
</picture>

Node.js sharp supports JXL experimentally, but it is not included in the pre-built binaries; you have to build libvips with libjxl yourself, which rules it out for most production setups. If JXL conversion is required, the realistic option is to call cjxl via child_process.

WebP/AVIF will be the mainstay for actual operations until Chrome’s default enablement and sharp’s pre-built support are available.

TLS key exchange: from RSA to PQC via elliptic curves

RSA era

RSA has long been responsible for TLS (SSL) key exchange. The client generates a random premaster secret, encrypts it with the server’s public key, and sends it. The server decrypts it with its private key, and both parties derive a symmetric key (such as AES).

Key sizes have grown over time. At the end of the 1990s, 512-bit RSA was standard, but 512-bit RSA was publicly factored in 1999, and 768-bit RSA followed in 2009. The current standard is 2048 bits (256 bytes); 4096 bits makes the TLS handshake 6 to 7 times slower, so 2048 bits remains the overwhelming majority in practice.

The fatal weakness of RSA key exchange is the lack of forward secrecy. If the server’s private key is leaked, all previously recorded communications can be decrypted.

Transition to Elliptic Curve (ECC)

Elliptic curve cryptography provides the same security strength as RSA with a much shorter key.

| Security strength | RSA | ECC |
| --- | --- | --- |
| 128-bit | 3,072 bits (384 bytes) | 256 bits (32 bytes) |
| 192-bit | 7,680 bits | 384 bits |
| 256-bit | 15,360 bits | 521 bits |

A 256-bit ECC key fits in just 32 bytes, 1/12 the size of RSA-3072's 384 bytes. In Cloudflare's benchmarks, ECC-256 is also 20 to 116 times faster than RSA-3072.

In 2011, Google enabled ECDHE (Elliptic Curve Diffie-Hellman Ephemeral) by default on gmail.com and docs.google.com. ECDHE generates a fresh key pair for every connection, so past traffic stays safe even if the server's long-term private key later leaks (forward secrecy). Its importance became widely recognized after the 2013 Snowden disclosures, and TLS 1.3 (RFC 8446, 2018) removed RSA key exchange entirely.

Quantum computers break both

Elliptic curves may look like the endgame, but Shor's algorithm on a quantum computer can efficiently solve the hard problems underlying both RSA (integer factorization) and ECC (the discrete logarithm problem).

The irony is that ECC is more vulnerable to quantum attacks than RSA. The number of qubits required for Shor’s algorithm depends on the bit length of the key.

| Algorithm | Logical qubits required |
| --- | --- |
| RSA-2048 | 6,190 |
| ECC P-256 | 2,619 |
| ECC P-384 | 3,901 |

ECC-256 and RSA-3072 are equally secure against classical computers, but the advantage of ECC’s short keys becomes a weakness against quantum attacks.

As of 2025, no quantum computer capable of breaking these ciphers exists; the problem is "Harvest Now, Decrypt Later (HNDL)." Encrypted traffic recorded today could be decrypted by a future quantum computer. The US DHS, UK NCSC, and EU ENISA have all officially warned that hostile actors are already collecting data. This is the main reason to move to PQC before such a machine is built.

Chrome's PQC support for key exchange

PQC for key exchange is already well underway. Since Chrome 131 (November 2024), the NIST-standardized ML-KEM (formerly CRYSTALS-Kyber) has been enabled by default in a hybrid configuration with X25519. ML-KEM is a lattice-based scheme; its security rests on the lattice shortest vector problem, to which Shor's algorithm does not apply.

Cloudflare observes that, as of October 2025, the majority of human-originated TLS traffic is protected by PQC key exchange. The added handshake time is approximately 4%.

However, an ML-KEM-768 public key is 1,184 bytes, a significant jump from X25519's 32 bytes. Key exchange can absorb this because the keys are sent only once per handshake; certificates pose a different problem.

PQC HTTPS Migration with Merkle Tree Certificates

Chrome announced on the Google Security Blog that it plans to phase out its X.509 certificate-based TLS PKI and move to Merkle Tree Certificates (MTCs).

Reasons why it is difficult to comply with PQC for certificates

PQC for key exchange was realized in Chrome 131, but replacing certificate signatures with PQC is another challenge. The signature of ML-DSA (PQC signature algorithm) is 3,293 bytes. Since the TLS handshake sends the entire certificate chain (root → intermediate → end entity), multiple signatures accumulate.

Half of QUIC connections have a total transfer volume of less than 8KB, with existing certificates alone accounting for 3-4KB. If a PQC signature is added here, the bandwidth cost for small-scale communications will become serious. Google decided that the problem could not be solved within the framework of X.509, and adopted a different architecture called MTCs.

How MTCs work

In traditional X.509, a CA signs each domain's certificate individually, and the browser receives and verifies the entire chain. In MTCs, a single CA signature covers a "Tree Head" under which millions of certificates sit in a Merkle tree structure. The browser doesn't need the whole tree, only a lightweight "proof-of-inclusion". Since that proof grows only logarithmically with the tree size, the bandwidth impact stays small even with PQC algorithms.

Additionally, Certificate Transparency (CT) is structurally internalized since it cannot function as a certificate unless it is included in the Merkle Tree. In current CTs, issuance and recording are separate operations, but in MTCs, “certificates that are not recorded in the CT log” cannot exist in principle.

Migration roadmap

  • Phase 1 (currently in progress): testing on real traffic in a pilot with Cloudflare, with X.509 fallback maintained
  • Phase 2 (Q1 2027): invite CT log operators to bootstrap the public MTC infrastructure
  • Phase 3 (Q3 2027): establish the Chrome Quantum-resistant Root Store (CQRS), providing downgrade protection to prevent regression to X.509

Even the fastest plan spans a year and a half, and long-term parallel operation with X.509 is assumed. It is unclear whether Firefox or Safari will follow suit, but Let's Encrypt staff commented that "MTCs are the preferred approach for implementing quantum-resistant HTTPS."

There's nothing special for web developers to do right now. With TLS 1.3, the browser and server negotiate PQC key exchange automatically. The certificate-side MTC migration is a matter for CAs; if Let's Encrypt supports it, the switch will happen automatically.

WebMCP declarative API

Google released an early preview of WebMCP on February 10, 2026. It is a standard protocol that lets AI agents perform actions on websites: sites declaratively describe what they can do, giving agents an integration path that does not rely on scraping. It can be enabled from chrome://flags in Chrome 146 Canary.

Declarative APIs work by simply adding attributes to existing HTML forms.

<form
  toolname="search_products"
  tooldescription="Search and filter products"
>
  <input
    type="text"
    name="query"
    toolparamdescription="Search keywords"
  />
  <select
    name="category"
    toolparamdescription="Product category: 'electronics', 'books', 'clothing'"
  >
    <option value="electronics">Electronics</option>
    <option value="books">Books</option>
    <option value="clothing">Clothing</option>
  </select>
  <button type="submit">Search</button>
</form>

toolname is the tool identifier, tooldescription describes the tool to the agent, and each field's toolparamdescription describes its parameter. The browser reads these and exposes them to the AI agent as a structured tool schema.

When an agent submits a form, the agentInvoked property is added to SubmitEvent, allowing you to determine whether it is via an agent.

form.addEventListener("submit", (e) => {
  e.preventDefault();
  if (e.agentInvoked) {
    // Submission from an AI agent
    const data = new FormData(e.target);
    e.respondWith(`Search results: ${data.get("query")} in ${data.get("category")}`);
    return;
  }
  // Normal form submission
});

There is also an imperative API for complex interactions that require JavaScript execution. This is a method to dynamically register a tool using navigator.modelContext.registerTool().
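For illustration only, registration might look roughly like this. The tool-object shape is an assumption modeled on MCP conventions and may differ from the Chrome 146 preview, and the /api/search endpoint is made up; check the current WebMCP docs before relying on it.

```javascript
// Hypothetical sketch of the imperative WebMCP API; the tool-object
// shape below is an assumption, not the confirmed preview signature.
const searchTool = {
  name: "search_products",
  description: "Search and filter products",
  inputSchema: {
    type: "object",
    properties: { query: { type: "string", description: "Search keywords" } },
    required: ["query"],
  },
  // Called when an agent invokes the tool (endpoint is illustrative).
  async execute({ query }) {
    const res = await fetch(`/api/search?q=${encodeURIComponent(query)}`);
    return { content: [{ type: "text", text: await res.text() }] };
  },
};

// Feature-detect before registering: the API only exists behind the flag.
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(searchTool);
}
```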

On HN it drew 253 points and 137 comments, sparking debate over "Google's move to mediate access to the web through the browser." Some argued the existing accessibility standard (ARIA) is sufficient; others that server-side MCP (Anthropic's proposal) is more appropriate. Although WebMCP borrows the MCP name, it is a specification independent of Anthropic's protocol.

Gemini Auto Browse

Chrome + Gemini integration is also accelerating. Auto Browse, announced on January 28, 2026, is an agent feature that lets Chrome autonomously scroll, click, type, and navigate, performing multi-step tasks on your behalf such as booking travel, filling out forms, and collecting tax documents. It requires a Google AI Pro/Ultra subscription and is available in the US first.

If WebMCP is the declaration infrastructure on the site side, Auto Browse is the agent implementation on the Google side.

In Chrome 146, the Gemini side panel arrives on macOS/Windows, with summaries of and Q&A about the page you are currently viewing. The ability to reference Google Drive files as context has also been added.

CVE-2026-2441 and Chrome's zero-day situation

In February 2026, CVE-2026-2441, a use-after-free in the CSS engine, was confirmed. Rated high severity (CVSS 8.8), it allowed remote code execution inside the sandbox via a specially crafted HTML page. It was reported on February 11 and fixed in Chrome 145.0.7632.75/76, then added to CISA's KEV catalog on February 17 with a remediation deadline of March 10.

This CVE has appeared twice on this blog in the February CISA KEV roundups: "CISA adds four vulnerabilities actively exploited to KEV catalog" covers the basics of the Chrome UAF, while "CISA 4 critical vulnerabilities added to KEV" describes the structural problem, in which dynamic state changes from cascading, inheritance, and animation complicate object lifetime management in Blink's CSS engine, so use-after-frees keep recurring.

A zero-day in Mojo (Chrome's inter-process communication system) was also confirmed in March, exploited in sophisticated attacks targeting Russian organizations. With the CSS engine, V8 (an integer overflow, CVE-2026-2649, was also fixed in February), and Mojo all hit, the attack surface is wide. Chrome zero-days continue to surface several times a year, so make sure automatic updates are enabled.

Customizable select element (Chrome 134)

A feature web developers have wanted for years arrived in Chrome 134. Setting appearance: base-select in CSS lets you customize the <select> dropdown into a rich UI that includes images and animations.

The traditional <select> relies on OS-native widgets, making CSS styling nearly impossible, so the common workaround was building custom dropdowns out of divs at the cost of accessibility.

Basic usage

On the HTML side, place <button> and <selectedcontent> inside <select>. <selectedcontent> is an element that mirrors the content of the selected option inside the button.

<select id="country">
  <button>
    <selectedcontent></selectedcontent>
  </button>
  <option value="jp">
    <img src="/flags/jp.png" width="20" height="15" alt="" aria-hidden="true" />
    Japan
  </option>
  <option value="us">
    <img src="/flags/us.png" width="20" height="15" alt="" aria-hidden="true" />
    United States
  </option>
</select>

Opt in to customization mode by specifying appearance: base-select on the CSS side.

select,
::picker(select) {
  appearance: base-select;
}

/* Dropdown styling */
::picker(select) {
  border: 1px solid #e0e0e0;
  border-radius: 12px;
  padding: 8px;
  box-shadow: 0 8px 32px rgba(0, 0, 0, 0.12);
}

/* Arrow icon */
select::picker-icon {
  transition: rotate 0.3s;
}
select:open::picker-icon {
  rotate: 180deg;
}

/* Option styling */
option {
  display: flex;
  align-items: center;
  gap: 10px;
  padding: 10px 14px;
  border-radius: 8px;
}
option:hover {
  background: #f3e8ff;
}

The new pseudo-elements and pseudo-classes are ::picker(select) (the dropdown body), ::picker-icon (the arrow), ::checkmark (the selected mark), and :open (the open state). The decisive difference from a hand-rolled div dropdown is that you change only the appearance while keeping native keyboard operation and screen reader compatibility.

In unsupported browsers, <button> and <selectedcontent> are ignored and fall back to normal <select>. <select multiple> is not supported.
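If you also need to branch in JavaScript (say, to skip optional custom-picker logic), CSS.supports can probe the new value; a small sketch:

```javascript
// Returns true in browsers that accept the base-select appearance value;
// safely returns false where the CSS object doesn't exist (e.g. Node).
function supportsBaseSelect() {
  return typeof CSS !== "undefined" && CSS.supports("appearance", "base-select");
}
```

No JS fallback is strictly required, though: unsupported browsers simply render the native <select>.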