All four of these silently corrupt or reject valid tokens after May 2026.

1. Database column too narrow.

```sql
CREATE TABLE app_tokens (
    installation_id BIGINT,
    token VARCHAR(40),  -- silently truncates to 40 chars on insert
    expires_at TIMESTAMP
);
```

2. Regex validator pinned to the old length.

```python
TOKEN_RE = re.compile(r'^ghs_[A-Za-z0-9]{36}$')  # rejects new tokens
if not TOKEN_RE.match(token):
    raise ValueError("invalid token")
```

3. Length assertion in middleware.

```python
assert len(token) == 40, f"expected 40-char token, got {len(token)}"
```

4. Fixed-size buffer in C/Go/Rust FFI.

```c
char token_buf[64];  // overflows when memcpy'd, or strncpy truncates
```
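The failure mode in case 1 is easy to reproduce without a database. A minimal Python sketch; the token values are made up, and 380 characters is an illustrative post-change length, not an announced size:

```python
# Simulate what a VARCHAR(40) column in a non-strict database does to a
# post-change token. Both token values below are fabricated for illustration.
old_token = "ghs_" + "a" * 36    # 40 chars, pre-change format
new_token = "ghs_" + "b" * 376   # 380 chars, a plausible post-change length

def store_in_varchar40(value: str) -> str:
    """Mimic silent truncation on insert."""
    return value[:40]

assert store_in_varchar40(old_token) == old_token   # old tokens round-trip fine
assert store_in_varchar40(new_token) != new_token   # new tokens come back corrupted

# The truncated value still looks like a syntactically plausible token,
# so nothing fails until GitHub returns 401 on the next API call.
```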
```shell
# 1. DB column types
git grep -nE "token.*VARCHAR\(([0-9]+)\)" -- '*.sql' '*.ts' '*.py' '*.rb' '*.go'
git grep -nE "varchar\(40\)|VARCHAR\(40\)|String\(40\)" --

# 2. Regex validators
git grep -nE 'ghs_\[A-Za-z0-9\]\{[0-9]+\}|ghs_\\\w\{[0-9]+\}' --
git grep -n 'ghs_' -- '*.py' '*.ts' '*.go' '*.rb' '*.java' | grep -E 'match|regex|RE|Pattern'

# 3. Length assertions
git grep -nE 'len\(token\)\s*==\s*40|token\.length\s*==\s*40' --
git grep -nE 'fixed.*40|token\[0:40\]|substring\(0, 40\)' --

# 4. Fixed-size buffers
git grep -nE 'char token\[[0-9]+\]|\[40\]byte|token\[64\]' -- '*.c' '*.cpp' '*.go' '*.rs'
```
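If you'd rather run the audit from a script than from a shell, the same four pattern families can be swept with a short Python walker. This is a sketch, not a drop-in tool; the patterns mirror the greps above and will need tuning for your codebase:

```python
import re
from pathlib import Path

# Patterns corresponding to the four grep categories above.
SUSPECT_PATTERNS = [
    re.compile(r"VARCHAR\(40\)", re.IGNORECASE),                     # 1. narrow DB columns
    re.compile(r"ghs_\[A-Za-z0-9\]\{\d+\}"),                         # 2. fixed-length regexes
    re.compile(r"len\(token\)\s*==\s*40|token\.length\s*==\s*40"),   # 3. length assertions
    re.compile(r"char token\[\d+\]|\[40\]byte"),                     # 4. fixed-size buffers
]

def scan(root: str) -> list[tuple[str, int, str]]:
    """Return (path, line_number, line) for every suspect line under root."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip
        for lineno, line in enumerate(text.splitlines(), start=1):
            if any(p.search(line) for p in SUSPECT_PATTERNS):
                hits.append((str(path), lineno, line.strip()))
    return hits
```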
```python
# Wrong — pins to current new ceiling
TOKEN_RE = re.compile(r'^ghs_[A-Za-z0-9]{36,520}$')

# Right — accepts anything matching prefix and charset
TOKEN_RE = re.compile(r'^ghs_[A-Za-z0-9]+$')
```
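To see the difference concretely, here's a quick check. The token values are fabricated, and the 380- and 600-character lengths are illustrative, not announced sizes:

```python
import re

PINNED_RE = re.compile(r"^ghs_[A-Za-z0-9]{36,520}$")  # wrong: new ceiling, same trap
OPEN_RE   = re.compile(r"^ghs_[A-Za-z0-9]+$")         # right: prefix and charset only

legacy = "ghs_" + "a" * 36    # 40 chars, pre-change format
longer = "ghs_" + "b" * 376   # 380 chars, seen during brownout
future = "ghs_" + "c" * 600   # hypothetical later expansion

# Both patterns handle today's tokens.
for token in (legacy, longer):
    assert PINNED_RE.match(token) and OPEN_RE.match(token)

# The pinned pattern breaks again the next time the ceiling moves.
assert not PINNED_RE.match(future)
assert OPEN_RE.match(future)
```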
- Type system unchanged. The token is still a string. Static analysis, schema validation, OpenAPI spec, type guards — all still pass. No compile error, no schema drift alarm. Octokit, PyGithub, go-github all return string from their token endpoints today and tomorrow.
- Runtime unchanged at the issuance call. POST /app/installations/{installation_id}/access_tokens still returns 201 with a token field. Your code reads the field, uses it, and gets a 401 on the next call — far from the issuance frame. A naive retry-on-401 hides it briefly, until the new token of the new size also fails to fit.
- Brownout is intermittent. Per the GitHub announcement, the change rolls out as a brownout starting mid-May before full cutover in late June. During the brownout window, the same token endpoint can return a 40-char token at 9:00am and a 380-char token at 9:30am. Tests written against a recorded fixture pass. CI passes. Production hits the brownout at unpredictable times.

Less obvious places that assume a 40-character token:

- Caching layers. Redis with MAXLEN, Memcached with item-size limits (the default 1MB is fine, but custom-tuned smaller installs are not).
- Audit logs. A token-redaction filter that masks the first 36 chars after ghs_ still leaks the last hundred characters of the new tokens.
- Webhook signature verification. Code that feeds an installation token into a downstream service's HMAC but hashes only a fixed-length prefix.
- Environment variable storage. Some platforms truncate env vars over a certain length. Heroku, Vercel, and Cloud Run all have their own limits; check your platform's.
- JWT claims. Apps that embed an installation token in a custom claim and the verifier asserts a fixed claim size.
- Test fixtures. Recorded VCR cassettes, JSON fixtures, mock objects with hardcoded 40-char strings.

Database truncation modes that fail silently:

- MySQL with non-strict mode (the historic default before 5.7, and still common in older Docker images): silently truncates and emits only a warning.
- SQL Server with ANSI_WARNINGS OFF: silently truncates.
- Some ORMs with default string conversion: truncate client-side before the value is even sent.

Boundaries that can drift out of sync:

- Reverse proxies and middleware. A Cloudflare Worker or Express middleware that strips or validates tokens. If you update the App but not the Worker, the Worker rejects valid tokens.
- Cross-service token forwarding. Service A fetches a token, hands it to Service B, and B logs and validates the format. Updating only A means B starts dropping tokens it gets from A.
- CI/CD secret scanning. Internal secret scanners that pattern-match on ghs_[A-Za-z0-9]{36} will stop catching leaked tokens after the format change, because the regex no longer matches the new format. Update the scanners or you have a leak-detection regression at the same time as the migration.

What is not changing:

- Prefix. Still ghs_. (Personal access tokens use ghp_, OAuth tokens gho_, app-to-server ghu_. None of these are announced as changing in this rollout, but the same lessons apply if they do later — strip your fixed lengths now.)
- Issuance API. POST /app/installations/{installation_id}/access_tokens returns the same JSON shape.
- Revocation API. DELETE /installation/token still works the same way.
- Token TTL. Still 1 hour from issuance.
- Permissions model. Unchanged.

The migration checklist:

- git grep the four patterns above (column type, regex, length assertion, fixed buffer). Inventory every match.
- Update DB columns to TEXT (or an equivalent unbounded string type). For MySQL specifically, drop any indexes on the token column before changing the type: MySQL cannot index a TEXT column without an explicit prefix length.
- Replace fixed-length regexes with prefix-only validation (^ghs_[A-Za-z0-9]+$) or remove the validation entirely (tokens are opaque; your code shouldn't be parsing them).
- Remove length assertions in middleware and FFI boundaries. Resize fixed-size buffers to a generous ceiling (1024 or 4096).
- Update internal secret scanners first — before any token-handling change — so leak detection doesn't regress mid-migration.
- Add a real integration test against POST /app/installations/{id}/access_tokens and assert that the returned token round-trips through your storage layer without modification (use a length-comparison check, not a string-equality check, to keep the test stable across token issuances).
- If you operate GitHub Enterprise Server, plan for the format change in your next GHES upgrade — the brownout schedule for GHES typically lags the GitHub.com schedule by one or two minor versions.
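The round-trip test from the checklist can be sketched like this, with an in-memory SQLite table standing in for the real storage layer and a stubbed `fetch_installation_token` (a hypothetical name; wire it to POST /app/installations/{id}/access_tokens) in place of the live issuance call:

```python
import sqlite3

def fetch_installation_token() -> str:
    # Stub: stand-in for the live issuance call. 380 chars is an
    # illustrative post-change length, not an announced size.
    return "ghs_" + "x" * 376

def test_token_round_trips() -> None:
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE app_tokens (token TEXT)")  # TEXT, not VARCHAR(40)
    issued = fetch_installation_token()
    db.execute("INSERT INTO app_tokens (token) VALUES (?)", (issued,))
    (stored,) = db.execute("SELECT token FROM app_tokens").fetchone()
    # Length comparison, per the checklist: stable across token issuances.
    assert len(stored) == len(issued)

test_token_round_trips()
```

Against the real endpoint, the only change needed is replacing the stub with the actual issuance call; the storage assertion stays the same.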