
Verifiable Random Hash

Kurier exposes a verifiable pseudo-random hash endpoint on testnet. Each request returns:

  • A random hash (hash)
  • A Groth16 proof (proof) that the hash was computed correctly from public inputs
  • A job id (jobId) you can use to track verification status

Endpoint

POST https://api-testnet.kurier.xyz/api/v1/random-hash/<API_KEY>

  • Authentication is done via the <API_KEY> path parameter.
  • Request body: empty

Example:

curl -X POST "https://api-testnet.kurier.xyz/api/v1/random-hash/<API_KEY>"

Response

The response has the following shape:

{
  "jobId": "6d79265c-bfd9-11f0-b4be-a621194d52f0",
  "hash": "0x2a0f...e9",
  "optimisticVerify": "success",
  "proof": {
    "proofType": "groth16",
    "vkRegistered": false,
    "proofOptions": { "library": "snarkjs", "curve": "bn254" },
    "proofData": {
      "proof": { "pi_a": [], "pi_b": [], "pi_c": [], "protocol": "groth16", "curve": "bn254" },
      "publicSignals": ["..."],
      "vk": { "...": "..." }
    }
  }
}
  • jobId: A Kurier job UUID you can use with /job-status to track verification state.
  • hash: The random output (hex string, 0x...). This is the circuit output R[0].
  • optimisticVerify: success if Kurier’s optimistic verifier accepted the proof, otherwise failed.
  • proof: The proof bundle (proof + public signals + verification key) needed to verify the computation.
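
For example, a minimal sketch of calling the endpoint from JavaScript and reading these fields (assumes a global fetch, i.e. Node 18+ or a browser; replace <API_KEY> with your key):

// Minimal sketch: request a verifiable random hash and read the main fields.
const API_KEY = "<API_KEY>"; // your Kurier testnet API key

const res = await fetch(
  `https://api-testnet.kurier.xyz/api/v1/random-hash/${API_KEY}`,
  { method: "POST" }
);
const body = await res.json();

console.log("jobId:", body.jobId);
console.log("hash:", body.hash);                          // 0x... hex string
console.log("optimisticVerify:", body.optimisticVerify);  // "success" or "failed"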

How the randomness is generated

Each random hash is derived from public entropy and then proven correct.

Entropy sources (unpredictability)

  • Latest zkVerify chain block hash: Kurier fetches the latest block header hash from zkVerify RPC and uses it as the main entropy source.
  • Per-request nonce (userNonce): Kurier also mixes in a server-generated nonce derived from the request’s database job id (a UUID created at request time). This is primarily for uniqueness / domain separation, so multiple requests won’t collide even if they land on the same block hash. The nonce is not relied on as the primary entropy beacon; the main unpredictability still comes from the chain blockHash.

How much entropy is used

  • The block hash is a 32-byte value. For circuit compatibility, Kurier truncates inputs to 31 bytes = 248 bits when mapping them into the BN254 field.
  • The output space is therefore 248 bits wide; the returned hash is a 248-bit value.

Note: the proof guarantees correct computation of the output from the public inputs; the unpredictability depends on zkVerify’s block production.
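
For intuition, here is a minimal sketch of that truncation (illustration only, not Kurier’s server code; which end of the hash is kept is an implementation detail and an assumption here):

// Illustration: reduce a 32-byte hex hash to 31 bytes (248 bits) so it fits
// below the BN254 scalar field modulus. Keeping the first 31 bytes is an
// assumption for illustration; the actual circuit input mapping may differ.
function toCircuitInput(hashHex) {
  const hex = hashHex.replace(/^0x/, "");
  const truncated = hex.slice(0, 62); // 31 bytes = 62 hex characters
  return BigInt("0x" + truncated);    // a value < 2^248
}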

“Pseudo-random” vs “true random” (what users should assume)

The returned hash is pseudo-random: it is the output of a deterministic function (Poseidon) applied to public inputs (blockHash, userNonce) inside a ZK circuit.

  • Pseudo-random: looks random and is hard to predict without knowing what the next blockHash will be, but it is still fully determined by the inputs.
  • True random: comes from a physical entropy source (e.g., hardware noise) and is not a deterministic function of public inputs.

In Kurier, the unpredictability comes from the chain’s evolving blockHash; the circuit/proof provides verifiable correctness of how that input was transformed into hash.

Processing: raw entropy → random hash

Kurier uses a circom circuit which computes:

R[0] = Poseidon(blockHash, userNonce) mod N

Where:

  • blockHash and userNonce are converted into field elements (truncated to 31 bytes if needed)
  • N = 2^248 - 1 (the public modulus / upper bound)
  • the returned hash is R[0] formatted as hex (0x...)

The circuit exposes blockHash, userNonce, and N as public inputs, and returns R[0] in the public signals alongside those inputs.

Because the circuit computes mod N, the output is always in the range [0, N) — meaning N effectively sets the maximum possible output value (exclusive).

The circuit implementation used by Kurier can be found in the zkVerify/randomgen repository.

In Kurier’s implementation, the public signals are interpreted as:

  • publicSignals[0] = R[0] (the returned hash)
  • publicSignals[1] = blockHash
  • publicSignals[2] = userNonce
  • publicSignals[3] = N
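
As a quick sanity check, you can confirm that the returned hash matches publicSignals[0]. A sketch, assuming the public signals are decimal strings (as snarkjs typically emits) and using the resp convention from the verification example below:

// Sketch: cross-check the returned hash against the circuit output R[0].
// resp = JSON.parse(httpResponseBody)
const { publicSignals } = resp.proof.proofData;

const rFromSignals = BigInt(publicSignals[0]); // R[0] as emitted by the circuit
const rFromHash = BigInt(resp.hash);           // the 0x... hash from the response

console.log("hash matches R[0]:", rFromSignals === rFromHash);
console.log("blockHash (hex): 0x" + BigInt(publicSignals[1]).toString(16));
console.log("userNonce (hex): 0x" + BigInt(publicSignals[2]).toString(16));
console.log("N         (hex): 0x" + BigInt(publicSignals[3]).toString(16));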

How to check proof verification

You have two practical options:

1) Poll the job status endpoint

Use the returned jobId to poll Kurier’s job status endpoint:

curl -X GET "https://api-testnet.kurier.xyz/api/v1/job-status/<API_KEY>/<jobId>"

Look for statuses like Valid, IncludedInBlock, and Finalized (see Job Statuses).
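
A minimal polling sketch (the status field name is an assumption about the job-status response shape; adjust to the actual payload):

// Sketch: poll the job status until the proof is finalized.
// Assumes the response contains a `status` field; adjust if it differs.
const API_KEY = "<API_KEY>";

async function waitForFinalized(jobId, intervalMs = 5000) {
  for (;;) {
    const res = await fetch(
      `https://api-testnet.kurier.xyz/api/v1/job-status/${API_KEY}/${jobId}`
    );
    const job = await res.json();
    console.log("job status:", job.status); // e.g. Valid, IncludedInBlock, Finalized
    if (job.status === "Finalized") return job;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}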

2) Verify the proof locally (snarkjs)

You can also verify the returned Groth16 proof locally using snarkjs:

import * as snarkjs from "snarkjs";

// resp = JSON.parse(httpResponseBody)
const { proof, publicSignals, vk } = resp.proof.proofData;
const ok = await snarkjs.groth16.verify(vk, publicSignals, proof);
console.log("proof valid:", ok);

Why this is fair / reliable

  • Correctness is provable: the returned proof (and included verification key) lets anyone verify that hash was computed as specified from the public inputs.
  • Public entropy anchor: the primary entropy comes from a public on-chain value (the latest zkVerify block hash), so all users get randomness derived from the same public beacon at that moment.
  • Uniqueness: the per-request nonce (derived from the request’s job UUID) ensures distinct outputs even across rapid repeated requests, including multiple calls against the same blockHash.

Example: random number in a range from the random hash

If you need a random integer in 1..n (inclusive), you can treat the returned hash as a large random integer r, then scale it into your range.

This mirrors the classic “scale uniform bytes into a range” approach described on Stack Overflow: Generate random number in given range from random bytes.

JavaScript (BigInt)

/**
 * Map a 248-bit hash into [1..n] (inclusive) by scaling.
 * Returns a BigInt so it works for large n.
 */
export function randomIntFromRandomHash(hashHex, n) {
  const r = BigInt(hashHex); // "0x..."
  const nn = BigInt(n);
  const k = 248n;
  const max = 1n << k; // 2^248
  return (r * nn) / max + 1n;
}

If you need strictly unbiased sampling for arbitrary n, use rejection sampling. If a rejection occurs, you’ll need more pseudorandom material (e.g., call the endpoint again, or deterministically expand hash with a hash/KDF + counter).
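
A sketch of the rejection approach (a hypothetical helper, not part of the Kurier API): reject values in the biased tail above the largest multiple of n, and fetch fresh randomness when that happens.

/**
 * Sketch: unbiased integer in [1..n] (inclusive) via rejection sampling.
 * Returns null when the hash falls in the biased tail; the caller should then
 * obtain fresh randomness (e.g., call the endpoint again).
 */
export function unbiasedIntFromRandomHash(hashHex, n) {
  const r = BigInt(hashHex);
  const nn = BigInt(n);
  const max = 1n << 248n;          // size of the 248-bit output space
  const limit = max - (max % nn);  // largest multiple of n not exceeding 2^248
  if (r >= limit) return null;     // rejected: retry with new randomness
  return (r % nn) + 1n;
}

For small n the rejection probability is below n / 2^248, so in practice a retry is almost never needed.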

Note: hash/KDF/PRG expansion does not add entropy — it only stretches the existing hash into more bytes.