Part 0.0: k-step Gram-Laplacian — full correlation matrix as Jensen gap of log-compliance (#94)
Extends Cor 14.4 (adjacent-pair Gram-Laplacian identity) to all layer
pairs. For any (d_i, d_j) with k = |d_j - d_i| >= 1:
log C^2_{ij} = -(1/2) Delta_k^2 log alpha |_{d_i+d_j+1}
= log alpha(midpoint) - (1/2)[log alpha(left) + log alpha(right)]
where left = 2 d_i + 1, right = 2 d_j + 1, midpoint = d_i + d_j + 1
(the arithmetic mean of left and right). k=1 reproduces Cor 14.4.
Reading: every entry of the full correlation matrix (Theorem on the
Gram matrix C in the supplement's setup) is a Jensen gap of the
log-compliance log(alpha). Cauchy-Schwarz C^2_{ij} <= 1 is then
equivalent to log-convexity of alpha on the cascade tower: for convex
log alpha the midpoint value sits below the endpoint average, so the
gap is negative and C^2_{ij} < 1. This follows from R(d) ~ sqrt(2/(d+1)),
so log alpha(d) = -log d + O(1/d) is strictly convex.
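A minimal numeric sketch of the two forms and the sign. It assumes the asymptotic form R(d) = sqrt(2/(d+1)) exactly, so log alpha(d) = log(2/(d+1)) - 2 log 2 via the relation log alpha = 2 log R - 2 log 2 quoted below; the paper's exact alpha may differ at small d. The repo's verifier works at 50-digit mpmath precision; this double-precision sketch only illustrates the structure.

```python
import math

def log_alpha(d):
    # Assumed asymptotic form: log alpha = 2 log R - 2 log 2 with
    # R(d) = sqrt(2/(d+1)), giving log(2/(d+1)) - 2 log 2.
    return math.log(2.0 / (d + 1)) - 2.0 * math.log(2.0)

def log_C2(d_i, d_j):
    # Jensen-gap form: log alpha(mid) - (1/2)[log alpha(left) + log alpha(right)]
    left, right, mid = 2 * d_i + 1, 2 * d_j + 1, d_i + d_j + 1
    return log_alpha(mid) - 0.5 * (log_alpha(left) + log_alpha(right))

def log_C2_laplacian(d_i, d_j):
    # Equivalent k-step Laplacian form: -(1/2) Delta_k^2 log alpha |_{d_i+d_j+1}
    k, mid = abs(d_j - d_i), d_i + d_j + 1
    return -0.5 * (log_alpha(mid - k) - 2 * log_alpha(mid) + log_alpha(mid + k))

for d_i, d_j in [(1, 2), (3, 7), (10, 45)]:
    # The two forms coincide (mid - k = left, mid + k = right), and
    # convexity of log alpha makes every off-diagonal entry satisfy C^2 < 1.
    assert math.isclose(log_C2(d_i, d_j), log_C2_laplacian(d_i, d_j))
    assert log_C2(d_i, d_j) < 0
```

The step-k second difference centred at the midpoint hits exactly left = mid - k and right = mid + k, which is why the Laplacian and Jensen-gap readings are the same identity.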
Item 8 (cross-observable correlations from off-diagonal C_{ij}) was
also explored: shared-layer cancellation in observable ratios is real
and clean (Gram corrections to A and B sharing layers cancel
identically, leaving only the symmetric-difference path correction)
but is mechanical additivity rather than a new structural prediction.
Not landed in the paper.
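The Item 8 cancellation can be illustrated with a toy additive model (hypothetical names; the per-layer term c(d) is a placeholder, not the paper's form): if the Gram correction to the log of an observable is a sum of per-layer terms over its layer set, the correction to a ratio of two observables depends only on the symmetric difference of their layer sets, with shared layers cancelling term by term.

```python
# Toy sketch of the shared-layer cancellation: additive per-layer Gram
# corrections (hypothetical model; c(d) is an arbitrary placeholder).
def correction(layers, c):
    # Total correction to log(observable), additive over its layer set.
    return sum(c(d) for d in layers)

c = lambda d: 1.0 / (2.0 * (d + 1))  # placeholder per-layer term

A = {1, 2, 3, 5}   # layers entering observable O_A
B = {2, 3, 7}      # layers entering observable O_B (shares 2 and 3 with A)

# Correction to log(O_A / O_B): shared layers drop out identically,
# leaving only the symmetric-difference path correction.
ratio_corr = correction(A, c) - correction(B, c)
sym_diff_corr = correction(A - B, c) - correction(B - A, c)
assert abs(ratio_corr - sym_diff_corr) < 1e-15
```

This is exactly the "mechanical additivity" referred to above: the cancellation holds for any additive correction, which is why it was not promoted to a structural prediction.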
Verifier tools/research/knockon_higher_order_laplacian.py confirms
both items at mpmath precision.
https://claude.ai/code/session_01KEi216CArEazmn38JExPwK
Summary
Following the user's "anymore knock-ons?" prompt, this PR investigates two consequences of the recent Part 0.0 work:
Item 7 (LANDED): k-step Gram-Laplacian generalisation. For any cascade layer pair $(d_i, d_j)$ with $k = |d_j - d_i| \geq 1$:
$$\log C^2_{ij} = -\tfrac{1}{2}\,\Delta_k^2 \log\alpha\big|_{d_i+d_j+1} = \log\alpha(\text{mid}) - \tfrac{1}{2}\bigl[\log\alpha(\text{left}) + \log\alpha(\text{right})\bigr]$$
where left $= 2d_i+1$, right $= 2d_j+1$, mid $= d_i+d_j+1 = (\text{left}+\text{right})/2$.
Reading: every entry of the full correlation matrix is a Jensen gap of the log-compliance $\log\alpha$. Cauchy–Schwarz $C^2_{ij} \leq 1$ for $i \neq j$ is then equivalent to log-convexity of $\alpha$ on the cascade tower (which follows from $\log\alpha(d) = -\log d + O(1/d)$ being strictly convex).
The adjacent case $k=1$ recovers Cor 14.4. The general case extends the cascade-native Laplacian-of-compliance reading from the adjacent diagonal to the entire correlation matrix.
Verified to mpmath 50-digit precision across $(d, k)$ samples spanning cascade-physics scales: tools/research/knockon_higher_order_laplacian.py.

Item 8 (NOT LANDED): cross-observable correlations. Tested the shared-layer cancellation in observable ratios. The result is clean: Gram corrections to two observables sharing layers cancel identically in their ratio, leaving only the symmetric-difference path correction. But this is mechanical additivity of the Gram correction rather than a structurally new prediction. Verified in the script but not promoted to the paper.
What this adds

- A new corollary (cor:gram-laplacian-k-step) in Part 0.0
- A research verifier script (knockon_higher_order_laplacian.py)

Test plan

python3 tools/research/knockon_higher_order_laplacian.py runs to completion (verifies item 7 to mpmath precision)

Honest framing
This is a modest result, not a major one. The structural identity is real and clean (and was previously undocumented for non-adjacent pairs), but it follows mechanically from the Beta-function form of $C^2_{ij}$ already in the supplement plus $\log\alpha = 2\log R - 2\log 2$. Worth landing as a brief Corollary; not worth a full Theorem.
Generated by Claude Code