
Part 0.0: k-step Gram-Laplacian — full correlation matrix as Jensen gap of log-compliance#94

Draft
dickie81 wants to merge 1 commit into main from claude/research-higher-order-knockons

Conversation

@dickie81
Owner

Summary

Following the user's "anymore knock-ons?" prompt, this PR investigates two consequences of the recent Part 0.0 work:

Item 7 (LANDED): k-step Gram-Laplacian generalisation. For any cascade layer pair $(d_i, d_j)$ with $k = |d_j - d_i| \geq 1$:
$$\log C^2_{ij} = -\tfrac{1}{2}\,\Delta_k^2 \log\alpha\big|_{d_i+d_j+1} = \log\alpha(\text{mid}) - \tfrac{1}{2}\bigl[\log\alpha(\text{left}) + \log\alpha(\text{right})\bigr]$$
where left $= 2d_i+1$, right $= 2d_j+1$, mid $= d_i+d_j+1 = (\text{left}+\text{right})/2$.

Reading: every entry of the full correlation matrix is a Jensen gap of the log-compliance $\log\alpha$. Cauchy–Schwarz $C^2_{ij} \leq 1$ for $i \neq j$ is then equivalent to midpoint log-convexity of $\alpha$ on the cascade tower (which follows from $\log\alpha(d) = -\log d + O(1/d)$ being strictly convex).
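
Exponentiating the identity makes the bound transparent: each entry is a geometric-mean ratio of the compliance,
$$C^2_{ij} = \frac{\alpha(\text{mid})}{\sqrt{\alpha(\text{left})\,\alpha(\text{right})}},$$
so $C^2_{ij} \leq 1$ is exactly $\alpha(\text{mid})^2 \leq \alpha(\text{left})\,\alpha(\text{right})$ at the arithmetic midpoint. The additive constant in $\log\alpha = 2\log R - 2\log 2$ drops out of the Jensen gap, so only the shape of $\log\alpha$ enters.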

The adjacent case $k=1$ recovers Cor 14.4. The general case extends the cascade-native Laplacian-of-compliance reading from the adjacent diagonal to the entire correlation matrix.
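
Explicitly, setting $k=1$ (so $d_j = d_i+1$) gives left $= 2d_i+1$, right $= 2d_i+3$, mid $= 2d_i+2$, and the identity reduces to the unit-step second difference
$$\log C^2_{i,i+1} = -\tfrac{1}{2}\,\Delta_1^2 \log\alpha\big|_{2d_i+2} = \log\alpha(2d_i+2) - \tfrac{1}{2}\bigl[\log\alpha(2d_i+1) + \log\alpha(2d_i+3)\bigr],$$
the adjacent Laplacian-of-compliance form.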

Verified to mpmath 50-digit precision across $(d, k)$ samples spanning cascade-physics scales: tools/research/knockon_higher_order_laplacian.py.
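
For orientation, here is a minimal sketch of what such a check looks like. It assumes a stand-in compliance $\alpha(n) = 1/(2n)$ chosen only to match the quoted asymptotics; it is not the repository verifier.

```python
# Illustrative sketch only, not tools/research/knockon_higher_order_laplacian.py.
# Assumes a stand-in compliance alpha(n) = 1/(2n), so that
# log alpha(n) = -log n - log 2 matches the quoted asymptotics.
from mpmath import mp, mpf, log

mp.dps = 50  # 50-digit working precision, as in the PR's verification


def log_alpha(n):
    # Stand-in (assumed) log-compliance; strictly convex in n.
    return -log(mpf(n)) - log(mpf(2))


def log_C2(d_i, d_j):
    # k-step Gram-Laplacian identity: Jensen gap of log_alpha at the midpoint.
    left, right = 2 * d_i + 1, 2 * d_j + 1
    mid = d_i + d_j + 1  # arithmetic mean of left and right
    return log_alpha(mid) - (log_alpha(left) + log_alpha(right)) / 2


# Cauchy-Schwarz reading: every off-diagonal Jensen gap is negative, i.e. C^2 < 1.
for d_i in (1, 2, 5, 10, 50):
    for k in (1, 2, 3, 10):
        gap = log_C2(d_i, d_i + k)
        assert gap < 0, (d_i, k, gap)
print("C^2_{ij} < 1 on the sampled (d_i, k) grid for the stand-in alpha")
```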

Item 8 (NOT LANDED): cross-observable correlations. Tested the shared-layer cancellation in observable ratios. The result is clean — Gram corrections to two observables sharing layers cancel identically in their ratio, leaving only the symmetric-difference path correction — but this is mechanical additivity of the Gram correction rather than a structurally new prediction. Verified in the script but not promoted to the paper.
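
As a toy illustration of that additivity (the per-layer correction c(d) and the layer sets below are hypothetical, chosen only to show the shared-layer cancellation):

```python
# Sketch of the Item 8 cancellation under the additivity assumption described
# above: each observable's log Gram correction is a sum of per-layer terms c(d).
from mpmath import mp, mpf, log

mp.dps = 50


def c(d):
    # Hypothetical per-layer correction term (stand-in functional form).
    return log(mpf(1) + mpf(1) / (2 * d + 1) ** 2)


def correction(layers):
    # Additive log-correction for an observable built on the given layer set.
    return sum(c(d) for d in layers)


A = {1, 2, 3, 5, 8}   # layers feeding observable A (illustrative)
B = {2, 3, 5, 13}     # layers feeding observable B (illustrative)

# Correction to log(A/B): shared layers cancel, the symmetric difference survives.
ratio_corr = correction(A) - correction(B)
symdiff_corr = correction(A - B) - correction(B - A)
assert abs(ratio_corr - symdiff_corr) < mpf(10) ** (-45)
print("shared-layer terms cancel; only the symmetric-difference correction remains")
```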

What this adds

  • One new Corollary 14.4.1 (cor:gram-laplacian-k-step) in Part 0.0
  • One verifier script (knockon_higher_order_laplacian.py)

Test plan

  • CI builds Part 0.0 cleanly
  • python3 tools/research/knockon_higher_order_laplacian.py runs to completion (verifies item 7 to mpmath precision)
  • LaTeX warning check passes

Honest framing

This is a modest result, not a major one. The structural identity is real and clean (and was previously undocumented for non-adjacent pairs), but it follows mechanically from the Beta-function form of $C^2_{ij}$ already in the supplement plus $\log\alpha = 2\log R - 2\log 2$. Worth landing as a brief Corollary; not worth a full Theorem.

https://claude.ai/code/session_01KEi216CArEazmn38JExPwK


Generated by Claude Code

… gap of log-compliance

Extends Cor 14.4 (adjacent-pair Gram-Laplacian identity) to all layer
pairs.  For any (d_i, d_j) with k = |d_j - d_i| >= 1:

    log C^2_{ij}  =  -(1/2) Delta_k^2 log alpha |_{d_i+d_j+1}
                  =  log alpha(midpoint) - (1/2)[log alpha(left) + log alpha(right)]

where left = 2 d_i + 1, right = 2 d_j + 1, midpoint = d_i + d_j + 1
(the arithmetic mean of left and right).  k=1 reproduces Cor 14.4.

Reading: every entry of the full correlation matrix (Theorem on the
Gram matrix C in the supplement's setup) is a Jensen gap of the
log-compliance log(alpha).  Cauchy-Schwarz C^2_{ij} <= 1 is then
equivalent to midpoint log-convexity of alpha on the cascade tower --
which follows from R(d) ~ sqrt(2/(d+1)) so log alpha(d) = -log d +
O(1/d) is strictly convex.

Item 8 (cross-observable correlations from off-diagonal C_{ij}) was
also explored: shared-layer cancellation in observable ratios is real
and clean (Gram corrections to A and B sharing layers cancel
identically, leaving only the symmetric-difference path correction)
but is mechanical additivity rather than a new structural prediction.
Not landed in the paper.

Verifier tools/research/knockon_higher_order_laplacian.py confirms
both items at mpmath precision.

https://claude.ai/code/session_01KEi216CArEazmn38JExPwK
