Marlin

Sumcheck

  • Aim: Given an \(m\)-variate polynomial \(g: \mathbb{F}^m \rightarrow \mathbb{F}\), compute the sum
\begin{aligned} H = \sum_{(x_1, \dots, x_m) \in \{0,1\}^m} g(x_1, x_2, \dots, x_m) \end{aligned}
  • Intuition: evaluation on a boolean hypercube
  • Naively, a verifier would require \(2^m\) evaluations of \(g(.)\)
  • Sumcheck protocol requires only \(\mathcal{O}(m + \lambda)\) verifier work
  • Here \(\lambda\) is the cost of a single evaluation of \(g(.)\) at a random \(r \in \mathbb{F}^{m}\)
  • Prover's work is only a constant factor more than mere computation of \(H\)

(Illustration: surface plot of \(g(x,y) = \frac{-4x}{x^2+y^2+1}\))

Sumcheck

  • Honest prover starts by computing the claimed sum \(C = \sum_{(x_1, \dots, x_m) \in \{0,1\}^m}g(x_1, x_2, \dots, x_m)\)

Round 1: the prover sends \(g_1\), obtained by summing out all variables but the first; the verifier checks it against the claimed sum and replies with a random challenge \(\textcolor{green}{r_1}\):

\(g_1(\textcolor{orange}{X_1}) := \sum_{(x_2, \dots, x_m) \in \{0,1\}^{m-1}}g(\textcolor{orange}{X_1},x_2, \dots, x_m), \qquad C \stackrel{?}{=} g_1(0) + g_1(1)\)

Round 2: the first variable is fixed to \(\textcolor{green}{r_1}\) and the same game repeats on the remaining variables:

\(g_2(\textcolor{orange}{X_2}) := \sum_{(x_3, \dots, x_m) \in \{0,1\}^{m-2}}g(\textcolor{green}{r_1}, \textcolor{orange}{X_2}, x_3, \dots, x_m), \qquad g_1(\textcolor{green}{r_1}) \stackrel{?}{=} g_2(0) + g_2(1)\)

Round 3:

\(g_3(\textcolor{orange}{X_3}) := \sum_{(x_4, \dots, x_m) \in \{0,1\}^{m-3}}g(\textcolor{green}{r_1}, \textcolor{green}{r_2}, \textcolor{orange}{X_3}, x_4, \dots, x_m), \qquad g_2(\textcolor{green}{r_2}) \stackrel{?}{=} g_3(0) + g_3(1)\)

Round \(m\):

\(g_m(\textcolor{orange}{X_m}) := g(\textcolor{green}{r_1}, \textcolor{green}{r_2}, \dots, \textcolor{green}{r_{m-1}}, \textcolor{orange}{X_m}), \qquad g_{m-1}(\textcolor{green}{r_{m-1}}) \stackrel{?}{=} g_m(0) + g_m(1)\)

Final check, costing one oracle query to \(g\):

\(g_{m}(\textcolor{green}{r_{m}}) \stackrel{?}{=} g(\textcolor{green}{r_1}, \textcolor{green}{r_2}, \dots, \textcolor{green}{r_m})\)

(Protocol diagram: the prover \(\mathcal{P}\) sends \(g_1, g_2, g_3, \dots, g_m\) in successive rounds; the verifier \(\mathcal{V}\) replies with challenges \(r_1, r_2, \dots, r_{m-1}\).)
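The rounds above can be sketched in code. A minimal sketch over a small prime field with an example 3-variate \(g\); all names (`sumcheck`, `partial_sum`) are illustrative, not from any library.

```python
# Toy sumcheck over F_P: the prover sends each round polynomial g_i as (D+1)
# evaluations; the verifier does O(m) work plus one final evaluation of g.
import itertools
import random

P = 97  # small prime field F_97
M = 3   # number of variables
D = 2   # per-variable degree bound of g

def g(x):
    # example polynomial: g(x1, x2, x3) = x1*x2 + 2*x3
    x1, x2, x3 = x
    return (x1 * x2 + 2 * x3) % P

def partial_sum(prefix, xi):
    # round polynomial g_i(xi): fix earlier challenges in `prefix`, the current
    # variable to xi, and sum g over boolean settings of the remaining variables
    rest = M - len(prefix) - 1
    return sum(g(list(prefix) + [xi] + list(b))
               for b in itertools.product([0, 1], repeat=rest)) % P

def lagrange_eval(ys, x):
    # evaluate the unique degree-<len(ys) polynomial through (i, ys[i]) at x
    acc = 0
    for i, yi in enumerate(ys):
        num = den = 1
        for j in range(len(ys)):
            if j != i:
                num = num * (x - j) % P
                den = den * (i - j) % P
        acc = (acc + yi * num * pow(den, P - 2, P)) % P
    return acc

def sumcheck():
    H_sum = sum(g(list(b)) for b in itertools.product([0, 1], repeat=M)) % P
    claim, challenges = H_sum, []
    for _ in range(M):
        gi = [partial_sum(challenges, t) for t in range(D + 1)]  # prover's message
        assert (gi[0] + gi[1]) % P == claim    # verifier: g_i(0) + g_i(1) = claim
        r = random.randrange(P)                # verifier's random challenge
        claim = lagrange_eval(gi, r)
        challenges.append(r)
    assert claim == g(challenges)              # single oracle query to g
    return H_sum
```

Here `sumcheck()` returns \(H = 10\): \(x_1x_2\) contributes \(2\) over the cube and \(2x_3\) contributes \(8\).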

Low-degree Extensions

  • An \(m\)-variate polynomial \(\hat{f}\) over \(\mathbb{F}\) is an extension of a function \(f: \{0,1\}^m \rightarrow \mathbb{F}\) if \(\hat{f}(x) = f(x)\) on the boolean hypercube \(\{0,1\}^m\)
  • A low-degree extension can be thought of as an error-correcting encoding of \(f\): two distinct functions on \(\{0,1\}^m\) have low-degree extensions that differ almost everywhere
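In the multilinear case the extension is easy to compute explicitly. A sketch (names are illustrative) evaluating the multilinear extension via the Lagrange basis on the hypercube:

```python
# Multilinear extension: the unique multilinear polynomial f_hat with
# f_hat(b) = f(b) for all b in {0,1}^m, evaluated at an arbitrary point.
import itertools

P = 97

def mle_eval(f_vals, r):
    # f_vals: dict mapping boolean m-tuples to F_P elements; r: point in F_P^m
    acc = 0
    for b in itertools.product([0, 1], repeat=len(r)):
        # chi_b(r) = prod_i (r_i if b_i = 1 else 1 - r_i): 1 at b, 0 elsewhere on the cube
        chi = 1
        for ri, bi in zip(r, b):
            chi = chi * (ri if bi else (1 - ri)) % P
        acc = (acc + f_vals[b] * chi) % P
    return acc

# f counts the ones in its input; its multilinear extension is x1 + x2 + x3
f = {b: sum(b) % P for b in itertools.product([0, 1], repeat=3)}
assert all(mle_eval(f, b) == f[b] for b in f)  # agrees with f on the hypercube
```

Off the cube, e.g. at \((2,3,5)\), the extension gives the redundant encoding mentioned above.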

Univariate Sumcheck

  • On a multiplicative subgroup \(H\) of \(\mathbb{F}\), any polynomial \(f\) with \(\text{deg}(f) < |H|\) satisfies

\(\sum_{a \in H}f(a) = f(0) \cdot |H|\)

Writing \(H = \{a, a^2, \dots, a^n\}\) for a generator \(a\) of \(H\) (so \(n = |H|\)):

\begin{aligned} f(a^1) &= c_0 + c_1a^1 + c_2a^2 + \dots + c_da^d \\ f(a^2) &= c_0 + c_1a^2 + c_2a^4 + \dots + c_da^{2d} \\ f(a^3) &= c_0 + c_1a^3 + c_2a^6 + \dots + c_da^{3d} \\ &\ \ \vdots \\ f(a^n) &= c_0 + c_1a^n + c_2a^{2n} + \dots + c_da^{nd} \end{aligned}

Each non-constant column sums to \(c_i \sum_{b \in H} b^i = 0\) for \(0 < i < |H|\), so only the constant term survives:

\(\sum_{a \in H}f(a) =c_0\cdot |H|\)

  • If \(\text{deg}(f) \ge |H|\), we can write \(f\) in terms of polynomials \(g \in \mathbb{F}^{<|H|-1}[X]\), \(h \in \mathbb{F}^{<\text{deg}(f)-|H|+1}[X]\)

\(f(X) = Xg(X) + v_H(X)h(X) + \sigma/|H|\)

  • Here \(v_H(X) = \prod_{a \in H}(X - a)\) is the vanishing polynomial on \(H\)
  • To prove that the sum of \(f\) over \(H\) is \(\sigma\), the prover sends \(g, h\) and the alleged sum \(\sigma\)
  • The verifier can check the equality at a random \(r \in \mathbb{F}\)

\(\sum_{a\in H}f(a) = \sigma\)

\(\iff\) (with high probability over the choice of \(r\))

\(f(r) \stackrel{?}{=} rg(r) + v_H(r)h(r) + \frac{\sigma}{|H|}\)
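The subgroup-sum identity can be checked numerically. A sketch over \(\mathbb{F}_{17}\), whose multiplicative group contains a subgroup of order 8 generated by 9; the coefficients are arbitrary:

```python
# Sum of a polynomial over a multiplicative subgroup H: every monomial a^i with
# 0 < i < |H| sums to zero over H, so only the constant term c_0 survives.
P = 17
H = [pow(9, i, P) for i in range(8)]    # 9 has order 8 mod 17: H = {1,9,13,15,16,8,4,2}
coeffs = [5, 3, 11, 0, 7, 2, 1, 16]     # some f with deg(f) = 7 < |H| = 8

def f(x):
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

assert len(set(H)) == 8                                      # H is a subgroup of order 8
assert sum(f(a) for a in H) % P == coeffs[0] * len(H) % P    # = f(0) * |H|
```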

Algebraic Holographic Proof

  • An AHP is a tuple \(\textsf{AHP} = (\textsf{k, s, d}, \ \textbf{I},\ \textbf{P},\ \textbf{V})\): an indexer \(\textbf{I}\), a prover \(\textbf{P}\), and a verifier \(\textbf{V}\), together with functions \(\textsf{k, s, d}: \{0,1\}^{\ast} \rightarrow \mathbb{N}\) giving the number of rounds, the number of polynomials per round, and their degree bounds

  • Offline phase: Indexer \(\textbf{I}\) encodes the given index \(\textmd{i}\) in round \(0\)

\(\textbf{I}(\mathbb{F}, \textmd{i}) \longrightarrow \mathbb{I} = \left(p_{0,1} \in \mathbb{F}^{<d(0,1)},\ \dots,\ p_{0,s(0)} \in \mathbb{F}^{<d(0,s(0))}\right)\)

  • Online phase: in round \(i \in [k]\), \(\mathbf{V}^{\mathbb{I}}(\mathbb{F}, \textmd{x})\) sends a random challenge \(r_i\), and \(\mathbf{P}(\mathbb{F}, \textmd{i}, \textmd{x}, \textmd{w})\) replies with oracle polynomials

\(\left(p_{i,1} \in \mathbb{F}^{<d(i,1)},\ \dots,\ p_{i,s(i)} \in \mathbb{F}^{<d(i,s(i))}\right)\)

AHP for Lincheck

  • Given polynomials \(f_1, f_2 \in \mathbb{F}^{<d}[X]\), we need to verify if for each \(a \in H\), we have

\(f_1(a) = \sum_{b \in H}M_{a,b}f_2(b)\)

  • Matrix \(M \in \mathbb{F}^{|H| \times |H|}\) encodes the linear relationship between \(f_1\) and \(f_2\), with rows and columns indexed \(0, 1, \dots, 7\):
\begin{bmatrix} 0 & 0 & 0 & 0 & \nu_1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & \nu_2 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & \nu_3 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & \nu_4 & 0 & 0 & \nu_5 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & \nu_6 & 0 & 0 & 0 & 0\\ \end{bmatrix}

  • Since \(M\) is sparse, it is stored as \((\textsf{row}, \textsf{col}, \textsf{val})\) triples, one per non-zero entry:

\begin{array}{c|c|c} \textsf{row} & \textsf{col} & \textsf{val} \\ \hline 0 & 4 & \nu_1 \\ 2 & 2 & \nu_2 \\ 3 & 6 & \nu_3 \\ 5 & 1 & \nu_4 \\ 5 & 4 & \nu_5 \\ 7 & 3 & \nu_6 \end{array}
AHP for Lincheck

  • Given polynomials \(f_1, f_2 \in \mathbb{F}^{<d}[X]\), we need to verify if for each \(a \in H\), we have

\(f_1(a) = \sum_{b \in H}M_{a,b}f_2(b)\)

  • Matrix \(M \in \mathbb{F}^{|H| \times |H|}\) encodes the linear relationship between \(f_1\) and \(f_2\)
  • The indexer outputs the polynomials \(\hat{\textsf{row}}, \hat{\textsf{col}}, \hat{\textsf{val}} \in \mathbb{F}^{< |K|}\)
  • A low-degree extension of \(M\) over a subgroup \(K \subseteq \mathbb{F}\) with \(|K| \ge \|M\| > 0\) (where \(\|M\|\) is the number of non-zero entries of \(M\)) is:
\begin{aligned} \hat{M}(X,Y) &= \sum_{\kappa \in K}u_H(X, \hat{\textsf{row}}(\kappa)) \cdot u_H(Y, \hat{\textsf{col}}(\kappa)) \cdot \hat{\textsf{val}}(\kappa) \\ &= \sum_{\kappa \in K}\frac{v_H(X) - v_H(\hat{\textsf{row}}(\kappa))}{X - \hat{\textsf{row}}(\kappa)} \cdot \frac{v_H(Y) - v_H(\hat{\textsf{col}}(\kappa))}{Y - \hat{\textsf{col}}(\kappa)} \cdot \hat{\textsf{val}}(\kappa) \\ &= \sum_{\kappa \in K}\frac{v_H(X)}{X - \hat{\textsf{row}}(\kappa)} \cdot \frac{v_H(Y)}{Y - \hat{\textsf{col}}(\kappa)} \cdot \hat{\textsf{val}}(\kappa) \end{aligned}

AHP for Lincheck

  • Given polynomials \(f_1, f_2 \in \mathbb{F}^{<d}[X]\), we need to verify if for each \(a \in H\), we have

\(f_1(a) = \sum_{b \in H}M_{a,b}f_2(b)\)

  • Sampling a random weight \(\textcolor{orange}{r_a}\) for each \(a \in H\) reduces the \(|H|\) point equalities to a single sum:

\begin{aligned} &\implies \ \sum_{a\in H} \textcolor{orange}{r_a} \Big( f_1(a) - \sum_{b \in H} \hat{M}(a,b) f_2(b) \Big) = 0 \\ &\implies \ \sum_{a\in H} \textcolor{orange}{r_a} f_1(a) - \sum_{a\in H} \textcolor{orange}{r_a}\Big(\sum_{b \in H} \hat{M}(a,b) f_2(b) \Big) = 0 \\ &\implies \ \sum_{a\in H} \textcolor{orange}{r_a} f_1(a) - \sum_{a\in H} \Big(\sum_{b \in H} \textcolor{orange}{r_b}\hat{M}(b,a) \Big)f_2(a) = 0 \qquad \text{(swap summation indices)} \\ &\implies \ \sum_{a\in H} \Big( \textcolor{orange}{r_a} f_1(a) - \Big(\sum_{b \in H} \textcolor{orange}{r_b}\hat{M}(b,a) \Big) f_2(a) \Big) = 0 \end{aligned}

  • Taking \(\textcolor{orange}{r_a} = \textcolor{orange}{r(\alpha, a)}\) for a random \(\alpha\), define

\(q_1(X) := \textcolor{orange}{r(\alpha, X)} f_1(X) - r_M(\alpha, X) f_2(X), \qquad \sum_{a \in H} q_1(a) = 0\)

  • Converted lincheck to sumcheck! Note that \(r_M(X,Y) = \sum_{b \in H} \textcolor{orange}{r(X, b)}\hat{M}(b,Y)\)
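The reduction rests on two elementary facts, both easy to check numerically: a random combination of the \(|H|\) equalities vanishes, and the summation indices can be swapped. A sketch with hypothetical toy values:

```python
# Lincheck -> sumcheck: if f1 = M*f2 holds entrywise on H, the randomly
# weighted sum is zero, and reindexing moves the weights onto M's rows.
import random

P, n = 97, 8
M = [[random.randrange(P) for _ in range(n)] for _ in range(n)]
f2 = [random.randrange(P) for _ in range(n)]
f1 = [sum(M[a][b] * f2[b] for b in range(n)) % P for a in range(n)]  # lincheck holds
r = [random.randrange(P) for _ in range(n)]                          # random weights r_a

# one sumcheck claim in place of |H| separate point equalities
combined = sum(r[a] * (f1[a] - sum(M[a][b] * f2[b] for b in range(n)))
               for a in range(n)) % P
assert combined == 0

# swapping summation order: sum_a r_a (M f2)(a) = sum_a (r^T M)(a) f2(a)
left = sum(r[a] * M[a][b] * f2[b] for a in range(n) for b in range(n)) % P
right = sum(sum(r[b] * M[b][a] for b in range(n)) * f2[a] for a in range(n)) % P
assert left == right
```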

AHP for Lincheck

q_1(X) := r(\alpha, X) f_1(X) - r_M(\alpha, X) f_2(X), \qquad \sigma_1 := 0

Round 1: \(\mathcal{V}\) samples \(\alpha \leftarrow \mathbb{F}\). \(\mathcal{P}\) decomposes \(q_1\) and sends \(g_1, h_1\):

q_1(X) = Xg_1(X) + v_H(X)h_1(X)

\(\mathcal{V}\) samples \(\beta_1 \leftarrow \mathbb{F}\) and checks \(\textsf{lhs}_1 \stackrel{?}{=} \textsf{rhs}_1\), where the red term is not directly computable by \(\mathcal{V}\):

\textsf{lhs}_1 = r(\alpha, \beta_1)f_1(\beta_1) - \textcolor{red}{r_M(\alpha, \beta_1)}f_2(\beta_1)
\textsf{rhs}_1 = \beta_1g_1(\beta_1)+v_H(\beta_1)h_1(\beta_1)

Round 2: the red term is itself a sum over \(H\), so \(\mathcal{P}\) runs another sumcheck, sending \(\sigma_2, g_2, h_2\):

q_2(X) := r(\alpha, X)\hat{M}(X, \beta_1)
\textcolor{red}{\sigma_2} := \sum_{b \in H} r(\alpha, b)\hat{M}(b,\beta_1)
q_2(X) = Xg_2(X) + v_H(X)h_2(X) + \frac{\sigma_2}{|H|}

\(\mathcal{V}\) samples \(\beta_2 \leftarrow \mathbb{F}\) and checks \(\textsf{lhs}_2 \stackrel{?}{=} \textsf{rhs}_2\):

\textsf{lhs}_2 := r(\alpha, \beta_2)\textcolor{red}{\hat{M}(\beta_2,\beta_1)}
\textsf{rhs}_2 = \beta_2g_2(\beta_2) + v_H(\beta_2)h_2(\beta_2) + \frac{\sigma_2}{|H|}

Round 3: the remaining red term \(\hat{M}(\beta_2,\beta_1)\) is a sum over \(K\); \(\mathcal{P}\) sends \(\sigma_3, g_3, h_3\):

\textcolor{red}{\sigma_3} := \sum_{k \in K} \frac{v_H(\beta_2)v_H(\beta_1) \hat{\textsf{val}}(k) }{(\beta_2 - \hat{\textsf{row}}(k))(\beta_1 - \hat{\textsf{col}}(k))}
q_3(X) := \frac{v_H(\beta_2)v_H(\beta_1) \hat{\textsf{val}}(X) }{(\beta_2 - \hat{\textsf{row}}(X))(\beta_1 - \hat{\textsf{col}}(X))} = \frac{\textsf{num}(X)}{\textsf{den}(X)}
q_3(X) = Xg_3(X) + \frac{\sigma_3}{|K|}
\textsf{num}(X) - \textsf{den}(X)q_3(X) = v_K(X)h_3(X)

\(\mathcal{V}\) samples \(\beta_3 \leftarrow \mathbb{F}\) and checks \(\textsf{lhs}_3 \stackrel{?}{=} \textsf{rhs}_3\):

\textsf{lhs}_3 := \textsf{num}(\beta_3) - \textsf{den}(\beta_3)\Big(\beta_3 g_3(\beta_3) + \frac{\sigma_3}{|K|}\Big)
\textsf{rhs}_3 = v_K(\beta_3)h_3(\beta_3)

Towards Marlin

  • Rank-1 Constraint System: \((\textmd{i} = (\mathbb{F}, H,K,A, B, C), \textmd{x} = x, \textmd{w} = w)\)  
  • We have \(A, B, C \in \mathbb{F}^{|H|\times |H|},\) \(|K| \ge \text{max}\{ \|A\|, \|B\|, \|C\| \}\) and \(z := (x, w) \in \mathbb{F}^{|H|}\) s.t.
Az \circ Bz = Cz
  • Prover \(\textbf{P}\) defines \(z_M := Mz\) for all \(M \in \{A,B,C\}\) and needs to prove:
    • Entrywise product:  \(\forall a \in H, \quad \hat{z}_A(a)\hat{z}_B(a) - \hat{z}_C(a) = 0\)
    • Linear relationship: \(\forall M \in \{A,B,C\}, \ \forall a \in H, \quad \hat{z}_{M}(a) = \sum_{b\in H} M[a,b]\hat{z}(b) \)
  • Offline phase: Indexer \(\textbf{I}\) outputs \(\{\hat{\textsf{row}}_M, \hat{\textsf{col}}_M, \hat{\textsf{val}}_M\}_{M \in \{A,B,C\}}\)
  • \(\textbf{P}\) starts by computing the shifted witness \(\bar{w}: H[>|x|] \rightarrow \mathbb{F}\) and low-degree extensions:
\begin{aligned} \bar{w}(\gamma) := \frac{w(\gamma) - \hat{x}(\gamma)}{v_{H[\le |x|]}(\gamma)}, \ \hat{w} \in \mathbb{F}^{<|w|+\textsf{b}}[X], \ (\hat{z}_A, \hat{z}_B, \hat{z}_C) \in \mathbb{F}^{<|H|+\textsf{b}}[X] \end{aligned}
  • Note that \(\hat{z}(X) = \hat{w}(X)v_{H[\le |x|]}(X) + \hat{x}(X)\)
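An R1CS instance in miniature may make the shape concrete. A sketch with hypothetical values, with a single constraint enforcing \(x_3 = x_1 \cdot x_2\):

```python
# Rank-1 Constraint System: Az o Bz = Cz, where o is the entrywise product.
P = 97
# z = (1, x1, x2, x3); one constraint row per matrix
A = [[0, 1, 0, 0]]   # selects x1
B = [[0, 0, 1, 0]]   # selects x2
C = [[0, 0, 0, 1]]   # selects x3

def matvec(Mx, z):
    return [sum(m * v for m, v in zip(row, z)) % P for row in Mx]

z = [1, 6, 7, 42]    # satisfying assignment: 6 * 7 = 42
Az, Bz, Cz = matvec(A, z), matvec(B, z), matvec(C, z)
assert all((a * b - c) % P == 0 for a, b, c in zip(Az, Bz, Cz))  # Az o Bz = Cz
```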

Towards Marlin

Round 1: \(\mathcal{P}\) sends \(\sigma_1, \hat{w}, s\) and \(\hat{z}_A, \hat{z}_B, \hat{z}_C, h_0\), where \(s\) is a masking polynomial for zero-knowledge:

s(X) \leftarrow \mathbb{F}^{<2|H|+\textsf{b}-1}, \ \sigma_1 := \sum_{a\in H}s(a)
\hat{z}_A(X) \hat{z}_B(X) - \hat{z}_C(X) = v_H(X)h_0(X)

\(\mathcal{V}\) samples \(\alpha, \eta_A, \eta_B, \eta_C \leftarrow \mathbb{F}\); the three linchecks are batched into one via the \(\eta_M\):

\hat{z}_{M}(X) = \sum_{b\in H} \hat{M}(X,b)\hat{z}(b) \quad \forall M \in \{A,B,C\}
f_1(X) := \sum_{M \in \{A,B,C\}} \eta_M \hat{z}_M(X)
q_1(X) := s(X) + r(\alpha, X) f_1(X) - r_1(\alpha, X) \hat{z}(X), \qquad r_1(\alpha, X) := \sum_{M \in \{A,B,C\}} \eta_M r_M(\alpha,X)

\(\mathcal{P}\) sends \(g_1, h_1\) with

q_1(X) = Xg_1(X) + v_H(X)h_1(X) + \frac{\sigma_1}{|H|}

\(\mathcal{V}\) samples \(\beta_1 \leftarrow \mathbb{F} \setminus H\) and checks the entrywise product and the sumcheck decomposition:

\hat{z}_A(\alpha) \hat{z}_B(\alpha) - \hat{z}_C(\alpha) \stackrel{?}{=} v_H(\alpha)h_0(\alpha)
\textsf{lhs}_1 = s(\beta_1) + r(\alpha, \beta_1)f_1(\beta_1) - \textcolor{red}{r_1(\alpha, \beta_1)}\hat{z}(\beta_1)
\textsf{rhs}_1 = \beta_1g_1(\beta_1)+v_H(\beta_1)h_1(\beta_1) + \frac{\sigma_1}{|H|}

Round 2: \(\mathcal{P}\) proves the red term via a second sumcheck, sending \(\sigma_2, g_2, h_2\):

q_2(X) := r(\alpha, X) \Big( \sum_M \eta_M \hat{M}(X,\beta_1) \Big)
\textcolor{red}{\sigma_2} := \sum_{b \in H} r(\alpha, b) \Big( \sum_M \eta_M \hat{M}(b,\beta_1) \Big)
q_2(X) = Xg_2(X) + v_H(X)h_2(X) + \frac{\sigma_2}{|H|}

\(\mathcal{V}\) samples \(\beta_2 \leftarrow \mathbb{F} \setminus H\) and checks:

\textsf{lhs}_2 := r(\alpha, \beta_2)\Big( \textcolor{red}{\sum_M \eta_M \hat{M}(\beta_2,\beta_1)} \Big)
\textsf{rhs}_2 = \beta_2g_2(\beta_2) + v_H(\beta_2)h_2(\beta_2) + \frac{\sigma_2}{|H|}

Round 3: the remaining red term is a sum over \(K\); \(\mathcal{P}\) sends \(\sigma_3, g_3, h_3\):

\textcolor{red}{\sigma_3} := \sum_{k \in K} \sum_{M} \eta_M \frac{v_H(\beta_2)v_H(\beta_1) \hat{\textsf{val}}_M(k) }{(\beta_2 - \hat{\textsf{row}}_M(k))(\beta_1 - \hat{\textsf{col}}_M(k))}
q_3(X) := \sum_M \eta_M \frac{v_H(\beta_2)v_H(\beta_1) \hat{\textsf{val}}_M(X) }{(\beta_2 - \hat{\textsf{row}}_M(X))(\beta_1 - \hat{\textsf{col}}_M(X))}
q_3(X) = Xg_3(X) + \frac{\sigma_3}{|K|}
\textsf{num}(X) - \textsf{den}(X)q_3(X) = v_K(X)h_3(X)

\(\mathcal{V}\) samples \(\beta_3 \leftarrow \mathbb{F}\) and checks:

\textsf{lhs}_3 := \textsf{num}(\beta_3) - \textsf{den}(\beta_3)\Big(\beta_3 g_3(\beta_3) + \frac{\sigma_3}{|K|}\Big)
\textsf{rhs}_3 = v_K(\beta_3)h_3(\beta_3)

Optimizations

  • Removing \(h_0, \hat{z}_C:\) replace \(\hat{z}_C\) with \((\hat{z}_A \cdot \hat{z}_B)\)  
  • Minimal query bound: set \(\textsf{b} = 1\)
  • Eliminating \(\sigma_1\): sample \(s(X)\) such that \(\sum_{a\in H}s(a) = 0\)
  • More efficient holographic lincheck from Fractal

\(r_M(X,Y) = \hat{M}^{\star}(Y,X), \quad \text{where} \quad M^{\star}_{a,b} := M_{b,a}\,u_H(b,b)\)

  • Linear combinations of matrices \(A^{\star}, B^{\star}, C^{\star}\) to get a single \(\hat{\textsf{row}}, \hat{\textsf{col}}\)
  • Reducing the number of hiding commitments: \(\hat{w}, \hat{z}_A, \hat{z}_B, \hat{z}_C, s, g_1, h_1\)
  • Batching pairing equations
  • Linearisation trick: to check \(p_1(X) + p_2(X)p_3(X) = p_4(X)\) at a challenge \(z\), open only \(p_2\):

\(p_2(z) = v_2, \qquad p_5(X) := p_1(X) + v_2p_3(X) - p_4(X), \qquad p_5(z) \stackrel{?}{=} 0\)
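The linearisation trick in code: a sketch with random toy polynomials (coefficient lists, low-order first). Only \(p_2\) is opened at \(z\); \(p_5\) is then a linear combination of already-committed polynomials.

```python
# Linearisation: check p1 + p2*p3 = p4 at z by opening p2(z) = v2 and then
# testing the *linear* combination p5 := p1 + v2*p3 - p4 at z.
import itertools
import random

P = 97

def poly_eval(cs, x):
    return sum(c * pow(x, i, P) for i, c in enumerate(cs)) % P

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

p1 = [random.randrange(P) for _ in range(4)]
p2 = [random.randrange(P) for _ in range(3)]
p3 = [random.randrange(P) for _ in range(3)]
prod = poly_mul(p2, p3)
p4 = [(x + y) % P for x, y in itertools.zip_longest(p1, prod, fillvalue=0)]

z = random.randrange(P)
v2 = poly_eval(p2, z)                      # the only product term opened at z
p5 = [(a + v2 * b - c) % P
      for a, b, c in itertools.zip_longest(p1, p3, p4, fillvalue=0)]
assert poly_eval(p5, z) == 0               # holds since p4 = p1 + p2*p3
```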

Marlin

\mathcal{P}(\mathbb{F}, H, K, A, B, C, x, w)
\mathcal{V}^{\hat{\textsf{row}}, \hat{\textsf{col}}, \hat{\textsf{rowcol}}, \hat{\textsf{val}}_A, \hat{\textsf{val}}_B, \hat{\textsf{val}}_C}(\mathbb{F}, H, K, x)

Round 1: \(\mathcal{P}\) computes the witness vectors and sends the commitments \(\textsf{cm}_{\hat{w}}, \textsf{cm}_{\hat{z}_A}, \textsf{cm}_{\hat{z}_B}, \textsf{cm}_{s}\):

z:= (x,w), \ z_A = Az, \ z_B = Bz
s(X) \leftarrow \mathbb{F}^{<3|H|+2\textsf{b}-2} \text{ s.t. } \sum_{a\in H}s(a) = 0

\(\mathcal{V}\) samples \(\alpha \leftarrow \mathbb{F} \setminus H\) and \(\eta_A, \eta_B, \eta_C \leftarrow \mathbb{F}\). \(\mathcal{P}\) replaces \(\hat{z}_C\) by the product \(\hat{z}_A \cdot \hat{z}_B\), computes

\hat{z}_C(X) := \hat{z}_A(X) \hat{z}_B(X)
t(X) := \sum_{M} \eta_M r_M(\alpha,X)
q_1(X) = s(X) + u_H(\alpha, X)\left(\sum_{M}\eta_M \hat{z}_M(X)\right) - t(X) \hat{z}(X)
q_1(X) = Xg_1(X) + v_H(X)h_1(X)

and sends \(\textsf{cm}_{t},\textsf{cm}_{g_1}, \textsf{cm}_{h_1}\).

Round 2: \(\mathcal{V}\) samples \(\beta \leftarrow \mathbb{F} \setminus H\); \(\mathcal{P}\) sends the evaluations \(v_{g_1} = g_1(\beta), \ v_{\hat{z}_B} = \hat{z}_B(\beta), \ v_{t} = t(\beta)\) needed for the linearised check

\textsf{lhs}_1 = s(\beta) + u_H(\alpha, \beta)\left(\sum_{M}\eta_M \hat{z}_M(\beta)\right) - \textcolor{red}{t(\beta)}\hat{z}(\beta)
\textsf{rhs}_1 = \beta g_1(\beta)+v_H(\beta)h_1(\beta)

Round 3: the red term is proved via a univariate sumcheck over \(K\); \(\mathcal{P}\) sends \(\textsf{cm}_{g_2}, \textsf{cm}_{h_2}\):

\textcolor{red}{t(\beta)} := \sum_{k \in K}\sum_{M} \eta_M \frac{v_H(\beta)v_H(\alpha) \hat{\textsf{val}}_{M^{\star}}(k) }{(\beta - \hat{\textsf{row}}(k))(\alpha - \hat{\textsf{col}}(k))}
q_2(X) := \sum_M \eta_M \frac{v_H(\beta)v_H(\alpha) \hat{\textsf{val}}_{M^{\star}}(X) }{(\beta - \hat{\textsf{row}}(X))(\alpha - \hat{\textsf{col}}(X))}
q_2(X) = Xg_2(X) + \frac{t(\beta)}{|K|}
\textsf{num}(X) - \textsf{den}(X)q_2(X) = v_K(X)h_2(X)

\(\mathcal{V}\) samples \(\gamma \leftarrow \mathbb{F}\) and checks:

\textsf{lhs}_2 = \textsf{num}(\gamma) - \textsf{den}(\gamma)\Big(\gamma g_2(\gamma) + \frac{t(\beta)}{|K|}\Big)
\textsf{rhs}_2 = v_K(\gamma)h_2(\gamma)

Summary

  • Marlin AHP coupled with a polynomial commitment scheme gives a zk-SNARK
  • The core idea is simple but the math is heavy
  • A lot more to learn from analysing the prover and verifier work
  • Lots of straightforward optimisations used in practice

Marlin & More

By Suyash Bagad
