Commit a97ed3bd authored by Ethen Pociask's avatar Ethen Pociask

Merge branch 'develop' of https://github.com/epociask/optimism into indexer.client

parents bc4974ec e8205113


v1.10.25
---
'@eth-optimism/fault-detector': patch
---
Bump contracts-bedrock version
{
-  "$schema": "https://unpkg.com/@changesets/config@1.6.0/schema.json",
+  "$schema": "https://unpkg.com/@changesets/config@2.3.0/schema.json",
-  "changelog": "@changesets/cli/changelog",
+  "changelog": ["@changesets/changelog-github", { "repo": "ethereum-optimism/optimism" }],
  "commit": false,
-  "linked": [],
+  "fixed": [],
+  "linked": [["@eth-optimism/contracts-bedrock", "@eth-optimism/contracts-ts"]],
  "access": "public",
-  "baseBranch": "master",
+  "baseBranch": "develop",
  "updateInternalDependencies": "patch",
  "ignore": []
}
---
'@eth-optimism/sdk': minor
---
Fixes issue with legacy withdrawal message status detection
---
'@eth-optimism/fault-detector': minor
---
Remove pre-bedrock support from fault detector.
---
'@eth-optimism/contracts-bedrock': patch
'@eth-optimism/core-utils': patch
'@eth-optimism/sdk': patch
---
Delete dead typescript https://github.com/ethereum-optimism/optimism/pull/6148.
---
'@eth-optimism/sdk': patch
---
Update the addresses of the bridges on optimism and optimism goerli for the ECO bridge adapter
# Legacy codebases # Packages
/packages/chain-mon @ethereum-optimism/devxpod
/packages/common-ts @ethereum-optimism/typescript-reviewers /packages/common-ts @ethereum-optimism/typescript-reviewers
/packages/contracts @ethereum-optimism/contract-reviewers
/packages/contracts-bedrock @ethereum-optimism/contract-reviewers /packages/contracts-bedrock @ethereum-optimism/contract-reviewers
/packages/contracts-periphery @ethereum-optimism/contract-reviewers
/packages/core-utils @ethereum-optimism/legacy-reviewers /packages/core-utils @ethereum-optimism/legacy-reviewers
/packages/chain-mon @smartcontracts
/packages/fault-detector @ethereum-optimism/devxpod
/packages/hardhat-deploy-config @ethereum-optimism/legacy-reviewers
/packages/replica-healthcheck @ethereum-optimism/legacy-reviewers
/packages/sdk @ethereum-optimism/devxpod /packages/sdk @ethereum-optimism/devxpod
/packages/atst @ethereum-optimism/devxpod
# Bedrock codebases # Bedrock codebases
/bedrock-devnet @ethereum-optimism/go-reviewers /bedrock-devnet @ethereum-optimism/go-reviewers
/cannon @ethereum-optimism/go-reviewers
/op-batcher @ethereum-optimism/go-reviewers /op-batcher @ethereum-optimism/go-reviewers
/op-bootnode @ethereum-optimism/go-reviewers
/op-chain-ops @ethereum-optimism/go-reviewers /op-chain-ops @ethereum-optimism/go-reviewers
/op-challenger @ethereum-optimism/go-reviewers
/op-e2e @ethereum-optimism/go-reviewers /op-e2e @ethereum-optimism/go-reviewers
/op-exporter @ethereum-optimism/go-reviewers
/op-heartbeat @ethereum-optimism/go-reviewers
/op-node @ethereum-optimism/go-reviewers /op-node @ethereum-optimism/go-reviewers
/op-node/rollup @protolambda @trianglesphere /op-node/rollup @protolambda @trianglesphere
/op-proposer @ethereum-optimism/go-reviewers /op-preimage @ethereum-optimism/go-reviewers
/op-bootnode @ethereum-optimism/go-reviewers
/op-program @ethereum-optimism/go-reviewers /op-program @ethereum-optimism/go-reviewers
/op-proposer @ethereum-optimism/go-reviewers
/op-service @ethereum-optimism/go-reviewers /op-service @ethereum-optimism/go-reviewers
/op-signer @ethereum-optimism/go-reviewers
/op-wheel @ethereum-optimism/go-reviewers
/ops-bedrock @ethereum-optimism/go-reviewers /ops-bedrock @ethereum-optimism/go-reviewers
# Ops # Ops
/.circleci @ethereum-optimism/infra-reviewers /.circleci @ethereum-optimism/infra-reviewers
/.github @ethereum-optimism/infra-reviewers /.github @ethereum-optimism/infra-reviewers
...@@ -37,3 +37,11 @@ ...@@ -37,3 +37,11 @@
/infra @ethereum-optimism/infra-reviewers /infra @ethereum-optimism/infra-reviewers
/specs @ethereum-optimism/contract-reviewers @ethereum-optimism/go-reviewers /specs @ethereum-optimism/contract-reviewers @ethereum-optimism/go-reviewers
/endpoint-monitor @ethereum-optimism/infra-reviewers /endpoint-monitor @ethereum-optimism/infra-reviewers
# Don't add owners if only package.json is updated
/packages/*/package.json
/*/package.json
# JavaScript Releases
/packages/*/CHANGELOG.md @ethereum-optimism/release-managers
/*/CHANGELOG.md @ethereum-optimism/release-managers
name: Setup
description: Common setup steps used by our workflows
runs:
using: composite
steps:
- name: Setup pnpm
uses: pnpm/action-setup@v2
with:
version: 8.6.5
- name: Setup node 16.x
uses: actions/setup-node@v3
with:
node-version-file: .nvmrc
registry-url: https://registry.npmjs.org
cache: pnpm
- name: Setup foundry
uses: foundry-rs/foundry-toolchain@v1
- name: Install node dependencies
shell: bash
run: pnpm install --frozen-lockfile
- name: Derive appropriate SHAs for base and head for `nx affected` commands
uses: nrwl/nx-set-shas@v3
with:
main-branch-name: "develop"
- run: |
echo "nx using following shas:"
echo "BASE: ${{ env.NX_BASE }}"
echo "HEAD: ${{ env.NX_HEAD }}"
shell: bash
version: 2
updates:
- package-ecosystem: docker
directory: "/ops-bedrock"
schedule:
interval: daily
open-pull-requests-limit: 10
labels:
- dependabot
- package-ecosystem: github-actions
directory: "/"
schedule:
interval: daily
open-pull-requests-limit: 10
labels:
- dependabot
- package-ecosystem: npm
directory: "/"
schedule:
interval: daily
time: "16:30"
timezone: "America/New_York"
open-pull-requests-limit: 10
versioning-strategy: auto
labels:
- dependabot
- package-ecosystem: gomod
directory: "/"
schedule:
interval: daily
open-pull-requests-limit: 10
labels:
- dependabot
---
C-Protocol-Critical:
- 'packages/contracts-bedrock/**/*.sol'
...@@ -13,18 +13,6 @@ pull_request_rules: ...@@ -13,18 +13,6 @@ pull_request_rules:
- "label!=multiple-reviewers" - "label!=multiple-reviewers"
- "label!=mergify-ignore" - "label!=mergify-ignore"
- "base=develop" - "base=develop"
- or:
- and:
- "label!=SR-Risk"
- "label!=C-Protocol-Critical"
- and:
- "label=SR-Risk"
- "approved-reviews-by=maurelian"
- and:
- "label=C-Protocol-Critical"
- or:
- "approved-reviews-by=tynes"
- "approved-reviews-by=smartcontracts"
actions: actions:
queue: queue:
name: default name: default
...@@ -47,28 +35,6 @@ pull_request_rules: ...@@ -47,28 +35,6 @@ pull_request_rules:
label: label:
remove: remove:
- on-merge-train - on-merge-train
- name: Handle security critical PRs
conditions:
- "label=SR-Risk"
actions:
request_reviews:
users:
- "maurelian"
comment:
message: |
Hey there @{{author}}! You flagged this PR as security critical. To make review easier, please add a comment describing
1. The risks present in this PR.
2. The mitigations you have added to try and reduce those risks.
- name: Request protocol critical reviewers
conditions:
- label=C-Protocol-Critical
actions:
request_reviews:
users:
- tynes
- smartcontracts
random_count: 1
- name: Ask to resolve conflict - name: Ask to resolve conflict
conditions: conditions:
- conflict - conflict
......
---
name: "Pull Request Labeler"
on:
- pull_request_target
jobs:
pr-labeler:
runs-on: ubuntu-latest
steps:
- uses: actions/labeler@main
with:
repo-token: "${{ secrets.GITHUB_TOKEN }}"
configuration-path: .github/labeler.yml
name: Publish Docker images (canary)
on:
# enable users to manually trigger with workflow_dispatch
workflow_dispatch:
inputs:
customImageName:
description: 'Custom Docker Image Tag (keep empty for git hash)'
required: false
default: '0.0.0-rc-0'
jobs:
canary-publish:
name: Publish Packages (canary)
runs-on: ubuntu-latest
# map the step outputs to job outputs
outputs:
fault-mon: ${{ steps.packages.outputs.fault-mon }}
balance-mon: ${{ steps.packages.outputs.balance-mon }}
drippie-mon: ${{ steps.packages.outputs.drippie-mon }}
wd-mon: ${{ steps.packages.outputs.wd-mon }}
replica-mon: ${{ steps.packages.outputs.replica-mon }}
canary-docker-tag: ${{ steps.docker-image-name.outputs.canary-docker-tag }}
op-exporter: ${{ steps.packages.outputs.op-exporter }}
endpoint-monitor: ${{ steps.packages.outputs.endpoint-monitor }}
steps:
- name: Check out source code
uses: actions/checkout@v2
with:
# This makes Actions fetch all Git history so that Changesets can generate changelogs with the correct commits
fetch-depth: 0
- name: Docker Image Name
id: docker-image-name
run: |
if [ "${CUSTOM_IMAGE_NAME}" == '' ]
then
echo "::set-output name=canary-docker-tag::${GITHUB_SHA::8}"
else
echo "::set-output name=canary-docker-tag::prerelease-${CUSTOM_IMAGE_NAME}"
fi
env:
CUSTOM_IMAGE_NAME: ${{ github.event.inputs.customImageName }}
fault-mon:
name: Publish fault-mon Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.fault-mon != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./ops/docker/Dockerfile.packages
target: fault-mon
push: true
tags: ethereumoptimism/fault-mon:${{ needs.canary-publish.outputs.canary-docker-tag }}
balance-mon:
name: Publish Balance Monitor Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.balance-mon != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./ops/docker/Dockerfile.packages
target: balance-mon
push: true
tags: ethereumoptimism/balance-mon:${{ needs.canary-publish.outputs.canary-docker-tag }}
drippie-mon:
name: Publish Drippie Monitor Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.drippie-mon != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./ops/docker/Dockerfile.packages
target: drippie-mon
push: true
tags: ethereumoptimism/drippie-mon:${{ needs.canary-publish.outputs.canary-docker-tag }}
wd-mon:
name: Publish Withdrawal Monitor Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.wd-mon != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./ops/docker/Dockerfile.packages
target: wd-mon
push: true
tags: ethereumoptimism/wd-mon:${{ needs.canary-publish.outputs.canary-docker-tag }}
replica-mon:
name: Publish replica-mon Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.replica-mon != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./ops/docker/Dockerfile.packages
target: replica-mon
push: true
tags: ethereumoptimism/replica-mon:${{ needs.canary-publish.outputs.canary-docker-tag }}
op-exporter:
name: Publish op-exporter Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.op-exporter != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Set build args
id: build_args
run: |
echo ::set-output name=GITDATE::"$(date +%d-%m-%Y)"
echo ::set-output name=GITVERSION::$(jq -r .version ./op-exporter/package.json)
echo ::set-output name=GITCOMMIT::"$GITHUB_SHA"
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./op-exporter/Dockerfile
push: true
tags: ethereumoptimism/op-exporter:${{ needs.canary-publish.outputs.op-exporter }}
build-args: |
GITDATE=${{ steps.build_args.outputs.GITDATE }}
GITCOMMIT=${{ steps.build_args.outputs.GITCOMMIT }}
GITVERSION=${{ steps.build_args.outputs.GITVERSION }}
endpoint-monitor:
name: Publish endpoint-monitor Version ${{ needs.canary-publish.outputs.canary-docker-tag }}
needs: canary-publish
if: needs.canary-publish.outputs.endpoint-monitor != ''
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN_SECRET }}
- name: Set build args
id: build_args
run: |
echo ::set-output name=GITDATE::"$(date +%d-%m-%Y)"
echo ::set-output name=GITVERSION::$(jq -r .version ./endpoint-monitor/package.json)
echo ::set-output name=GITCOMMIT::"$GITHUB_SHA"
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
file: ./endpoint-monitor/Dockerfile
push: true
tags: ethereumoptimism/endpoint-monitor:${{ needs.canary-publish.outputs.endpoint-monitor }}
build-args: |
GITDATE=${{ steps.build_args.outputs.GITDATE }}
GITCOMMIT=${{ steps.build_args.outputs.GITCOMMIT }}
GITVERSION=${{ steps.build_args.outputs.GITVERSION }}
name: Release snapshot
# Releases a snapshot release when new commits merge to develop
# This ensures the release process is working as expected and always gives us a release.
# We should move to a single branch in the near future.
on:
workflow_dispatch:
push:
branches:
- develop
jobs:
snapshot-snapshot:
name: Publish snapshot release to npm
if: github.repository == 'ethereum-optimism/optimism'
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
with:
submodules: recursive
fetch-depth: 0
- name: Setup
uses: ./.github/actions/setup
- name: Set deployment token
run: npm config set '//registry.npmjs.org/:_authToken' "${NPM_TOKEN}"
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish snapshots
uses: seek-oss/changesets-snapshot@v0
with:
pre-publish: pnpm nx run-many --target=build --skip-nx-cache
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
name: Release name: Release and version
on: on:
workflow_dispatch:
push: push:
branches: branches:
- master - develop
# Always wait for previous release to finish before releasing again # Always wait for previous release to finish before releasing again
concurrency: ${{ github.workflow }}-${{ github.ref }} concurrency: ${{ github.workflow }}-${{ github.ref }}
...@@ -12,41 +13,38 @@ jobs: ...@@ -12,41 +13,38 @@ jobs:
release: release:
name: Release name: Release
runs-on: ubuntu-latest runs-on: ubuntu-latest
if: github.repository == 'ethereum-optimism/optimism'
# map the step outputs to job outputs # map the step outputs to job outputs
outputs: outputs:
fault-detector: ${{ steps.packages.outputs.fault-detector }} fault-mon: ${{ steps.packages.outputs.fault-mon }}
balance-mon: ${{ steps.packages.outputs.drippie-mon }} balance-mon: ${{ steps.packages.outputs.drippie-mon }}
drippie-mon: ${{ steps.packages.outputs.drippie-mon }} drippie-mon: ${{ steps.packages.outputs.drippie-mon }}
wd-mon: ${{ steps.packages.outputs.wd-mon }} wd-mon: ${{ steps.packages.outputs.wd-mon }}
replica-healthcheck: ${{ steps.packages.outputs.replica-healthcheck }} replica-mon: ${{ steps.packages.outputs.replica-mon }}
op-exporter: ${{ steps.packages.outputs.op-exporter }} op-exporter: ${{ steps.packages.outputs.op-exporter }}
endpoint-monitor: ${{ steps.packages.outputs.endpoint-monitor }} endpoint-monitor: ${{ steps.packages.outputs.endpoint-monitor }}
# Permissions necessary for Changesets to push a new branch and open PRs
# (for automated Version Packages PRs), and request the JWT for provenance.
# More info: https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#adding-permissions-settings
permissions:
contents: write
pull-requests: write
id-token: write
steps: steps:
- name: Checkout Repo - name: Checkout Repo
uses: actions/checkout@master uses: actions/checkout@v3
with: with:
# This makes Actions fetch all Git history so that Changesets can generate changelogs with the correct commits # This makes Actions fetch all Git history so that Changesets can generate changelogs with the correct commits
fetch-depth: 0 fetch-depth: 0
ref: "develop"
- name: Set up pnpm - name: Setup
uses: pnpm/action-setup@v2 uses: ./.github/actions/setup
with:
version: 8.6.5
- name: Setup Node.js 16.x
uses: actions/setup-node@master
with:
node-version: 16.x
cache: pnpm
- name: Install Dependencies
run: pnpm install --frozen-lockfile
- name: Install Foundry - name: Set deployment token
uses: foundry-rs/foundry-toolchain@v1 run: npm config set '//registry.npmjs.org/:_authToken' "${NPM_TOKEN}"
with: env:
version: nightly NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
# Makes a pr to publish the changesets that when # Makes a pr to publish the changesets that when
# merged will publish to npm # merged will publish to npm
...@@ -56,7 +54,8 @@ jobs: ...@@ -56,7 +54,8 @@ jobs:
id: changesets id: changesets
with: with:
createGithubReleases: false createGithubReleases: false
publish: pnpm release publish: pnpm release:publish
version: pnpm release:version
env: env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }} NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
...@@ -106,10 +105,10 @@ jobs: ...@@ -106,10 +105,10 @@ jobs:
GITCOMMIT=${{ steps.build_args.outputs.GITCOMMIT }} GITCOMMIT=${{ steps.build_args.outputs.GITCOMMIT }}
GITVERSION=${{ steps.build_args.outputs.GITVERSION }} GITVERSION=${{ steps.build_args.outputs.GITVERSION }}
fault-detector: fault-mon:
name: Publish Fault Detector Version ${{ needs.release.outputs.fault-detector }} name: Publish fault-mon Version ${{ needs.release.outputs.fault-mon }}
needs: release needs: release
if: needs.release.outputs.fault-detector != '' if: needs.release.outputs.fault-mon != ''
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
...@@ -129,9 +128,9 @@ jobs: ...@@ -129,9 +128,9 @@ jobs:
with: with:
context: . context: .
file: ./ops/docker/Dockerfile.packages file: ./ops/docker/Dockerfile.packages
target: fault-detector target: fault-mon
push: true push: true
tags: ethereumoptimism/fault-detector:${{ needs.release.outputs.fault-detector }},ethereumoptimism/fault-detector:latest tags: ethereumoptimism/fault-mon:${{ needs.release.outputs.fault-mon }},ethereumoptimism/fault-mon:latest
wd-mon: wd-mon:
name: Publish Withdrawal Monitor Version ${{ needs.release.outputs.wd-mon }} name: Publish Withdrawal Monitor Version ${{ needs.release.outputs.wd-mon }}
...@@ -214,10 +213,10 @@ jobs: ...@@ -214,10 +213,10 @@ jobs:
push: true push: true
tags: ethereumoptimism/drippie-mon:${{ needs.release.outputs.drippie-mon }},ethereumoptimism/drippie-mon:latest tags: ethereumoptimism/drippie-mon:${{ needs.release.outputs.drippie-mon }},ethereumoptimism/drippie-mon:latest
replica-healthcheck: replica-mon:
name: Publish Replica Healthcheck Version ${{ needs.release.outputs.replica-healthcheck }} name: Publish Replica Healthcheck Version ${{ needs.release.outputs.replica-mon }}
needs: release needs: release
if: needs.release.outputs.replica-healthcheck != '' if: needs.release.outputs.replica-mon != ''
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
...@@ -237,9 +236,9 @@ jobs: ...@@ -237,9 +236,9 @@ jobs:
with: with:
context: . context: .
file: ./ops/docker/Dockerfile.packages file: ./ops/docker/Dockerfile.packages
target: replica-healthcheck target: replica-mon
push: true push: true
tags: ethereumoptimism/replica-healthcheck:${{ needs.release.outputs.replica-healthcheck }},ethereumoptimism/replica-healthcheck:latest tags: ethereumoptimism/replica-mon:${{ needs.release.outputs.replica-mon }},ethereumoptimism/replica-mon:latest
endpoint-monitor: endpoint-monitor:
name: Publish endpoint-monitor Version ${{ needs.release.outputs.endpoint-monitor}} name: Publish endpoint-monitor Version ${{ needs.release.outputs.endpoint-monitor}}
......
...@@ -18,12 +18,15 @@ on: ...@@ -18,12 +18,15 @@ on:
required: true required: true
type: choice type: choice
options: options:
- ci-builder
- fault-detector
- indexer
- op-node - op-node
- op-batcher - op-batcher
- op-proposer - op-proposer
- op-ufm
- proxyd - proxyd
- indexer - indexer
- fault-detector
- ci-builder - ci-builder
prerelease: prerelease:
description: Increment major/minor/patch as prerelease? description: Increment major/minor/patch as prerelease?
......
...@@ -17,11 +17,6 @@ dist ...@@ -17,11 +17,6 @@ dist
artifacts artifacts
cache cache
packages/contracts-periphery/coverage*
packages/contracts-periphery/@openzeppelin*
packages/contracts-periphery/hardhat*
packages/contracts-periphery/forge-artifacts*
packages/contracts-bedrock/deployments/devnetL1 packages/contracts-bedrock/deployments/devnetL1
packages/contracts-bedrock/deployments/anvil packages/contracts-bedrock/deployments/anvil
......
[submodule "packages/contracts-periphery/lib/multicall"] [submodule "packages/contracts-bedrock/lib/openzeppelin-contracts-upgradeable"]
path = packages/contracts-periphery/lib/multicall path = packages/contracts-bedrock/lib/openzeppelin-contracts-upgradeable
url = https://github.com/mds1/multicall url = https://github.com/OpenZeppelin/openzeppelin-contracts-upgradeable
[submodule "lib/multicall"] [submodule "packages/contracts-bedrock/lib/openzeppelin-contracts"]
branch = v3.1.0 path = packages/contracts-bedrock/lib/openzeppelin-contracts
url = https://github.com/OpenZeppelin/openzeppelin-contracts
[submodule "packages/contracts-bedrock/lib/solmate"]
path = packages/contracts-bedrock/lib/solmate
url = https://github.com/transmissions11/solmate
[submodule "packages/contracts-bedrock/lib/clones-with-immutable-args"]
path = packages/contracts-bedrock/lib/clones-with-immutable-args
url = https://github.com/Saw-mon-and-Natalie/clones-with-immutable-args
[submodule "packages/contracts-bedrock/lib/forge-std"]
path = packages/contracts-bedrock/lib/forge-std
url = https://github.com/foundry-rs/forge-std
// This function does not modify the lockfile. It asserts that packages do not use SSH
// when specifying git repository
function afterAllResolved(lockfile, context) {
const pkgs = lockfile['packages'];
for (const [pkg, entry] of Object.entries(pkgs)) {
const repo = entry.resolution['repo'];
if (repo !== undefined) {
if (repo.startsWith('git@github.com')) {
throw new Error(`Invalid git ssh specification found for package ${pkg}. Ensure that the dependencies do not reference SSH-based git repos before installing them`);
}
}
}
return lockfile
}
module.exports = {
hooks: {
afterAllResolved
}
}
...@@ -16,17 +16,9 @@ ...@@ -16,17 +16,9 @@
"directory": "packages/contracts", "directory": "packages/contracts",
"changeProcessCWD": true "changeProcessCWD": true
}, },
{
"directory": "packages/contracts-periphery",
"changeProcessCWD": true
},
{ {
"directory": "packages/chain-mon", "directory": "packages/chain-mon",
"changeProcessCWD": true "changeProcessCWD": true
},
{
"directory": "packages/fault-detector",
"changeProcessCWD": true
} }
], ],
"eslint.nodePath": "./node_modules/eslint/bin/", "eslint.nodePath": "./node_modules/eslint/bin/",
......
...@@ -68,6 +68,7 @@ You'll need the following: ...@@ -68,6 +68,7 @@ You'll need the following:
* [Docker Compose](https://docs.docker.com/compose/install/) * [Docker Compose](https://docs.docker.com/compose/install/)
* [Go](https://go.dev/dl/) * [Go](https://go.dev/dl/)
* [Foundry](https://getfoundry.sh) * [Foundry](https://getfoundry.sh)
* [go-ethereum](https://github.com/ethereum/go-ethereum)
### Setup ### Setup
...@@ -80,7 +81,7 @@ cd optimism ...@@ -80,7 +81,7 @@ cd optimism
### Install the Correct Version of NodeJS ### Install the Correct Version of NodeJS
-Install node v16.16.0 with [nvm](https://github.com/nvm-sh/nvm)
+Install the correct node version with [nvm](https://github.com/nvm-sh/nvm)
```bash ```bash
nvm use nvm use
...@@ -112,10 +113,11 @@ Use the above commands to recompile the packages. ...@@ -112,10 +113,11 @@ Use the above commands to recompile the packages.
### Building the rest of the system ### Building the rest of the system
If you want to run an Optimism node OR **if you want to run the integration tests**, you'll need to build the rest of the system. If you want to run an Optimism node OR **if you want to run the integration tests**, you'll need to build the rest of the system.
Note that these environment variables significantly speed up build time.
```bash ```bash
-cd ops
+cd ops-bedrock
-export COMPOSE_DOCKER_CLI_BUILD=1 # these environment variables significantly speed up build time
+export COMPOSE_DOCKER_CLI_BUILD=1
export DOCKER_BUILDKIT=1
docker-compose build docker-compose build
``` ```
...@@ -124,7 +126,7 @@ Source code changes can have an impact on more than one container. ...@@ -124,7 +126,7 @@ Source code changes can have an impact on more than one container.
**If you're unsure about which containers to rebuild, just rebuild them all**: **If you're unsure about which containers to rebuild, just rebuild them all**:
```bash ```bash
-cd ops
+cd ops-bedrock
docker-compose down docker-compose down
docker-compose build docker-compose build
docker-compose up docker-compose up
......
...@@ -16,6 +16,9 @@ build-ts: submodules ...@@ -16,6 +16,9 @@ build-ts: submodules
pnpm build pnpm build
.PHONY: build-ts .PHONY: build-ts
ci-builder:
docker build -t ci-builder -f ops/docker/ci-builder/Dockerfile .
submodules: submodules:
# CI will checkout submodules on its own (and fails on these commands) # CI will checkout submodules on its own (and fails on these commands)
if [ -z "$$GITHUB_ENV" ]; then \ if [ -z "$$GITHUB_ENV" ]; then \
...@@ -56,6 +59,15 @@ op-program: ...@@ -56,6 +59,15 @@ op-program:
make -C ./op-program op-program make -C ./op-program op-program
.PHONY: op-program .PHONY: op-program
cannon:
make -C ./cannon cannon
.PHONY: cannon
cannon-prestate: op-program cannon
./cannon/bin/cannon load-elf --path op-program/bin/op-program-client.elf --out op-program/bin/prestate.json --meta op-program/bin/meta.json
./cannon/bin/cannon run --proof-at '=0' --stop-at '=1' --input op-program/bin/prestate.json --meta op-program/bin/meta.json --proof-fmt 'op-program/bin/%d.json' --output /dev/null
mv op-program/bin/0.json op-program/bin/prestate-proof.json
mod-tidy: mod-tidy:
# Below GOPRIVATE line allows mod-tidy to be run immediately after # Below GOPRIVATE line allows mod-tidy to be run immediately after
# releasing new versions. This bypasses the Go modules proxy, which # releasing new versions. This bypasses the Go modules proxy, which
...@@ -74,12 +86,15 @@ nuke: clean devnet-clean ...@@ -74,12 +86,15 @@ nuke: clean devnet-clean
.PHONY: nuke .PHONY: nuke
devnet-up: devnet-up:
$(shell ./ops/scripts/newer-file.sh .devnet/allocs-l1.json ./packages/contracts-bedrock)
if [ $(.SHELLSTATUS) -ne 0 ]; then \
make devnet-allocs; \
fi
PYTHONPATH=./bedrock-devnet python3 ./bedrock-devnet/main.py --monorepo-dir=. PYTHONPATH=./bedrock-devnet python3 ./bedrock-devnet/main.py --monorepo-dir=.
.PHONY: devnet-up .PHONY: devnet-up
-devnet-up-deploy:
-	PYTHONPATH=./bedrock-devnet python3 ./bedrock-devnet/main.py --monorepo-dir=. --deploy
+# alias for devnet-up
+devnet-up-deploy: devnet-up
.PHONY: devnet-up-deploy
devnet-down: devnet-down:
@(cd ./ops-bedrock && GENESIS_TIMESTAMP=$(shell date +%s) docker-compose stop) @(cd ./ops-bedrock && GENESIS_TIMESTAMP=$(shell date +%s) docker-compose stop)
...@@ -93,6 +108,9 @@ devnet-clean: ...@@ -93,6 +108,9 @@ devnet-clean:
docker volume ls --filter name=ops-bedrock --format='{{.Name}}' | xargs -r docker volume rm docker volume ls --filter name=ops-bedrock --format='{{.Name}}' | xargs -r docker volume rm
.PHONY: devnet-clean .PHONY: devnet-clean
devnet-allocs:
PYTHONPATH=./bedrock-devnet python3 ./bedrock-devnet/main.py --monorepo-dir=. --allocs
devnet-logs: devnet-logs:
@(cd ./ops-bedrock && docker-compose logs -f) @(cd ./ops-bedrock && docker-compose logs -f)
.PHONY: devnet-logs .PHONY: devnet-logs
...@@ -130,4 +148,9 @@ update-op-geth: ...@@ -130,4 +148,9 @@ update-op-geth:
.PHONY: update-op-geth .PHONY: update-op-geth
bedrock-markdown-links: bedrock-markdown-links:
docker run --init -it -v `pwd`:/input lycheeverse/lychee --verbose --no-progress --exclude-loopback --exclude twitter.com --exclude explorer.optimism.io --exclude-mail /input/README.md "/input/specs/**/*.md" docker run --init -it -v `pwd`:/input lycheeverse/lychee --verbose --no-progress --exclude-loopback \
--exclude twitter.com --exclude explorer.optimism.io --exclude linux-mips.org \
--exclude-mail /input/README.md "/input/specs/**/*.md"
install-geth:
go install github.com/ethereum/go-ethereum/cmd/geth@v1.12.0
...@@ -52,11 +52,8 @@ Refer to the Directory Structure section below to understand which packages are ...@@ -52,11 +52,8 @@ Refer to the Directory Structure section below to understand which packages are
├── <a href="./packages">packages</a> ├── <a href="./packages">packages</a>
│ ├── <a href="./packages/common-ts">common-ts</a>: Common tools for building apps in TypeScript │ ├── <a href="./packages/common-ts">common-ts</a>: Common tools for building apps in TypeScript
│ ├── <a href="./packages/contracts-bedrock">contracts-bedrock</a>: Bedrock smart contracts. │ ├── <a href="./packages/contracts-bedrock">contracts-bedrock</a>: Bedrock smart contracts.
│ ├── <a href="./packages/contracts-periphery">contracts-periphery</a>: Peripheral contracts for Optimism
│ ├── <a href="./packages/core-utils">core-utils</a>: Low-level utilities that make building Optimism easier │ ├── <a href="./packages/core-utils">core-utils</a>: Low-level utilities that make building Optimism easier
│ ├── <a href="./packages/chain-mon">chain-mon</a>: Chain monitoring services │ ├── <a href="./packages/chain-mon">chain-mon</a>: Chain monitoring services
│ ├── <a href="./packages/fault-detector">fault-detector</a>: Service for detecting Sequencer faults
│ ├── <a href="./packages/replica-healthcheck">replica-healthcheck</a>: Service for monitoring the health of a replica node
│ └── <a href="./packages/sdk">sdk</a>: provides a set of tools for interacting with Optimism │ └── <a href="./packages/sdk">sdk</a>: provides a set of tools for interacting with Optimism
├── <a href="./op-bindings">op-bindings</a>: Go bindings for Bedrock smart contracts. ├── <a href="./op-bindings">op-bindings</a>: Go bindings for Bedrock smart contracts.
├── <a href="./op-batcher">op-batcher</a>: L2-Batch Submitter, submits bundles of batches to L1 ├── <a href="./op-batcher">op-batcher</a>: L2-Batch Submitter, submits bundles of batches to L1
...@@ -79,11 +76,8 @@ Refer to the Directory Structure section below to understand which packages are ...@@ -79,11 +76,8 @@ Refer to the Directory Structure section below to understand which packages are
~~ Pre-BEDROCK ~~ ~~ Pre-BEDROCK ~~
├── <a href="./packages">packages</a> ├── <a href="./packages">packages</a>
│ ├── <a href="./packages/common-ts">common-ts</a>: Common tools for building apps in TypeScript │ ├── <a href="./packages/common-ts">common-ts</a>: Common tools for building apps in TypeScript
│ ├── <a href="./packages/contracts-periphery">contracts-periphery</a>: Peripheral contracts for Optimism
│ ├── <a href="./packages/core-utils">core-utils</a>: Low-level utilities that make building Optimism easier │ ├── <a href="./packages/core-utils">core-utils</a>: Low-level utilities that make building Optimism easier
│ ├── <a href="./packages/chain-mon">chain-mon</a>: Chain monitoring services │ ├── <a href="./packages/chain-mon">chain-mon</a>: Chain monitoring services
│ ├── <a href="./packages/fault-detector">fault-detector</a>: Service for detecting Sequencer faults
│ ├── <a href="./packages/replica-healthcheck">replica-healthcheck</a>: Service for monitoring the health of a replica node
│ └── <a href="./packages/sdk">sdk</a>: provides a set of tools for interacting with Optimism │ └── <a href="./packages/sdk">sdk</a>: provides a set of tools for interacting with Optimism
├── <a href="./indexer">indexer</a>: indexes and syncs transactions ├── <a href="./indexer">indexer</a>: indexes and syncs transactions
├── <a href="./op-exporter">op-exporter</a>: A prometheus exporter to collect/serve metrics from an Optimism node ├── <a href="./op-exporter">op-exporter</a>: A prometheus exporter to collect/serve metrics from an Optimism node
...@@ -118,7 +112,7 @@ The primary development branch is [`develop`](https://github.com/ethereum-optimi ...@@ -118,7 +112,7 @@ The primary development branch is [`develop`](https://github.com/ethereum-optimi
`develop` contains the most up-to-date software that remains backwards compatible with the latest experimental [network deployments](https://community.optimism.io/docs/useful-tools/networks/). `develop` contains the most up-to-date software that remains backwards compatible with the latest experimental [network deployments](https://community.optimism.io/docs/useful-tools/networks/).
If you're making a backwards compatible change, please direct your pull request towards `develop`. If you're making a backwards compatible change, please direct your pull request towards `develop`.
**Changes to contracts within `packages/contracts-bedrock/contracts` are usually NOT considered backwards compatible and SHOULD be made against a release candidate branch**. **Changes to contracts within `packages/contracts-bedrock/src` are usually NOT considered backwards compatible and SHOULD be made against a release candidate branch**.
Some exceptions to this rule exist for cases in which we absolutely must deploy some new contract after a release candidate branch has already been fully deployed. Some exceptions to this rule exist for cases in which we absolutely must deploy some new contract after a release candidate branch has already been fully deployed.
If you're changing or adding a contract and you're unsure about which branch to make a PR into, default to using the latest release candidate branch. If you're changing or adding a contract and you're unsure about which branch to make a PR into, default to using the latest release candidate branch.
See below for info about release candidate branches. See below for info about release candidate branches.
...@@ -126,7 +120,7 @@ See below for info about release candidate branches. ...@@ -126,7 +120,7 @@ See below for info about release candidate branches.
### Release candidate branches ### Release candidate branches
Branches marked `release/X.X.X` are **release candidate branches**. Branches marked `release/X.X.X` are **release candidate branches**.
Changes that are not backwards compatible and all changes to contracts within `packages/contracts-bedrock/contracts` MUST be directed towards a release candidate branch. Changes that are not backwards compatible and all changes to contracts within `packages/contracts-bedrock/src` MUST be directed towards a release candidate branch.
Release candidates are merged into `develop` and then into `master` once they've been fully deployed. Release candidates are merged into `develop` and then into `master` once they've been fully deployed.
We may sometimes have more than one active `release/X.X.X` branch if we're in the middle of a deployment. We may sometimes have more than one active `release/X.X.X` branch if we're in the middle of a deployment.
See table in the **Active Branches** section above to find the right branch to target. See table in the **Active Branches** section above to find the right branch to target.
......
import time
DEV_ACCOUNTS = [
'3c44cdddb6a900fa2b585dd299e03d12fa4293bc',
'70997970c51812dc3a010c7d01b50e0d17dc79c8',
'f39fd6e51aad88f6f4ce6ab8827279cfffb92266'
]
GENESIS_TMPL = {
'config': {
'chainId': 900,
"homesteadBlock": 0,
"eip150Block": 0,
"eip150Hash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"eip155Block": 0,
"eip158Block": 0,
"byzantiumBlock": 0,
"constantinopleBlock": 0,
"petersburgBlock": 0,
"istanbulBlock": 0,
"muirGlacierBlock": 0,
"berlinBlock": 0,
"londonBlock": 0,
"arrowGlacierBlock": 0,
"grayGlacierBlock": 0,
"shanghaiBlock": None,
"cancunBlock": None,
'clique': {
'period': 3,
'epoch': 30000
}
},
'nonce': '0x0',
'timestamp': '{:#x}'.format(int(time.time())),
'extraData': '0x0000000000000000000000000000000000000000000000000000000000000000ca062b0fd91172d89bcd4bb084ac4e21972cc4670000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000',
'gasLimit': '0xE4E1C0',
'difficulty': '0x1',
'mixHash': '0x0000000000000000000000000000000000000000000000000000000000000000',
'coinbase': '0x0000000000000000000000000000000000000000',
'alloc': {
'{:x}'.format(i).ljust(40, '0'): {
'balance': '0x1'
} for i in range(0, 255)
},
'number': '0x0',
'gasUsed': '0x0',
'parentHash': '0x0000000000000000000000000000000000000000000000000000000000000000',
'baseFeePerGas': '0x3B9ACA00'
}
GENESIS_TMPL['alloc'].update({
d: {
'balance': '0x200000000000000000000000000000000000000000000000000000000000000'
} for d in DEV_ACCOUNTS
})
...@@ -9,5 +9,7 @@ example/bin ...@@ -9,5 +9,7 @@ example/bin
contracts/out contracts/out
state.json state.json
*.json *.json
*.json.gz
*.pprof *.pprof
*.out *.out
bin
GITCOMMIT := $(shell git rev-parse HEAD)
GITDATE := $(shell git show -s --format='%ct')
VERSION := v0.0.0
LDFLAGSSTRING +=-X main.GitCommit=$(GITCOMMIT)
LDFLAGSSTRING +=-X main.GitDate=$(GITDATE)
LDFLAGSSTRING +=-X github.com/ethereum-optimism/optimism/op-program/version.Version=$(VERSION)
LDFLAGSSTRING +=-X github.com/ethereum-optimism/optimism/op-program/version.Meta=$(VERSION_META)
LDFLAGS := -ldflags "$(LDFLAGSSTRING)"
cannon:
env GO111MODULE=on GOOS=$(TARGETOS) GOARCH=$(TARGETARCH) go build -v $(LDFLAGS) -o ./bin/cannon .
clean:
rm -rf bin
elf:
make -C ./example elf
test: elf
go test -v ./...
lint:
golangci-lint run -E goimports,sqlclosecheck,bodyclose,asciicheck,misspell,errorlint --timeout 5m -e "errors.As" -e "errors.Is"
fuzz:
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateSyscallBrk ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateSyscallClone ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateSyscallMmap ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateSyscallExitGroup ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateSyscallFnctl ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateHintRead ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 20s -fuzz=FuzzStatePreimageRead ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 10s -fuzz=FuzzStateHintWrite ./mipsevm
go test -run NOTAREALTEST -v -fuzztime 20s -fuzz=FuzzStatePreimageWrite ./mipsevm
.PHONY: \
cannon \
clean \
test \
lint \
fuzz
...@@ -26,11 +26,11 @@ make op-program # build
# Switch back to cannon, and build the CLI
cd ../cannon
-go build -o cannon .
+make cannon

# Transform MIPS op-program client binary into first VM state.
# This outputs state.json (VM state) and meta.json (for debug symbols).
-./cannon load-elf --path=../op-program/bin/op-program-client.elf
+./bin/cannon load-elf --path=../op-program/bin/op-program-client.elf

# Run cannon emulator (with example inputs)
# Note that the server-mode op-program command is passed into cannon (after the --),
...@@ -39,30 +39,30 @@ go build -o cannon .
# Note:
# - The L2 RPC is an archive L2 node on OP goerli.
# - The L1 RPC is a non-archive RPC, also change `--l1.rpckind` to reflect the correct L1 RPC type.
-./cannon run
-  --pprof.cpu
-  --info-at '%10000000'
-  --proof-at never
-  --input ./state.json
-  --
-  ../op-program/bin/op-program
-  --l2 http://127.0.0.1:8745
-  --l1 http://127.0.0.1:8645
-  --l1.trustrpc
-  --l1.rpckind debug_geth
-  --log.format terminal
-  --l2.head 0xedc79de4d616a9100fdd42192224580daee81ea3d6303de8089d48a6c1bf4816
-  --network goerli
-  --l1.head 0x204f815790ca3bb43526ad60ebcc64784ec809bdc3550e82b54a0172f981efab
-  --l2.claim 0x530658ab1b1b3ff4829731fc8d5955f0e6b8410db2cd65b572067ba58df1f2b9
-  --l2.blocknumber 8813570
-  --datadir /tmp/fpp-database
-  --server
+./bin/cannon run \
+  --pprof.cpu \
+  --info-at '%10000000' \
+  --proof-at never \
+  --input ./state.json \
+  -- \
+  ../op-program/bin/op-program \
+  --network goerli \
+  --l1.trustrpc \
+  --l1.rpckind debug_geth \
+  --l1 http://127.0.0.1:8645 \
+  --l2 http://127.0.0.1:8745 \
+  --l1.head 0x204f815790ca3bb43526ad60ebcc64784ec809bdc3550e82b54a0172f981efab \
+  --l2.head 0xedc79de4d616a9100fdd42192224580daee81ea3d6303de8089d48a6c1bf4816 \
+  --l2.claim 0x530658ab1b1b3ff4829731fc8d5955f0e6b8410db2cd65b572067ba58df1f2b9 \
+  --l2.blocknumber 8813570 \
+  --datadir /tmp/fpp-database \
+  --log.format terminal \
+  --server

# Add --proof-at '=12345' (or pick other pattern, see --help)
# to pick a step to build a proof for (e.g. exact step, every N steps, etc.)
-# Also see `./cannon run --help` for more options
+# Also see `./bin/cannon run --help` for more options
```
## Contracts

...@@ -72,7 +72,7 @@ The Cannon contracts:
- `PreimageOracle.sol`: implements the pre-image oracle ABI, to support the instruction execution pre-image requests.

The smart-contracts are integrated into the Optimism monorepo contracts:
-[`../packages/contracts-bedrock/contracts/cannon`](../packages/contracts-bedrock/contracts/cannon)
+[`../packages/contracts-bedrock/src/cannon`](../packages/contracts-bedrock/src/cannon)

## `mipsevm`
......
package cmd package cmd
import ( import (
"compress/gzip"
"encoding/json" "encoding/json"
"errors" "errors"
"fmt" "fmt"
"io" "io"
"os" "os"
"strings"
) )
func loadJSON[X any](inputPath string) (*X, error) { func loadJSON[X any](inputPath string) (*X, error) {
if inputPath == "" { if inputPath == "" {
return nil, errors.New("no path specified") return nil, errors.New("no path specified")
} }
var f io.ReadCloser
f, err := os.OpenFile(inputPath, os.O_RDONLY, 0) f, err := os.OpenFile(inputPath, os.O_RDONLY, 0)
if err != nil { if err != nil {
return nil, fmt.Errorf("failed to open file %q: %w", inputPath, err) return nil, fmt.Errorf("failed to open file %q: %w", inputPath, err)
} }
defer f.Close() defer f.Close()
if isGzip(inputPath) {
f, err = gzip.NewReader(f)
if err != nil {
return nil, fmt.Errorf("create gzip reader: %w", err)
}
defer f.Close()
}
var state X var state X
if err := json.NewDecoder(f).Decode(&state); err != nil { if err := json.NewDecoder(f).Decode(&state); err != nil {
return nil, fmt.Errorf("failed to decode file %q: %w", inputPath, err) return nil, fmt.Errorf("failed to decode file %q: %w", inputPath, err)
...@@ -33,6 +43,11 @@ func writeJSON[X any](outputPath string, value X, outIfEmpty bool) error { ...@@ -33,6 +43,11 @@ func writeJSON[X any](outputPath string, value X, outIfEmpty bool) error {
} }
defer f.Close() defer f.Close()
out = f out = f
if isGzip(outputPath) {
g := gzip.NewWriter(f)
defer g.Close()
out = g
}
} else if outIfEmpty { } else if outIfEmpty {
out = os.Stdout out = os.Stdout
} else { } else {
...@@ -48,3 +63,7 @@ func writeJSON[X any](outputPath string, value X, outIfEmpty bool) error { ...@@ -48,3 +63,7 @@ func writeJSON[X any](outputPath string, value X, outIfEmpty bool) error {
} }
return nil return nil
} }
func isGzip(path string) bool {
return strings.HasSuffix(path, ".gz")
}
package cmd
import (
"encoding/json"
"os"
"path/filepath"
"testing"
"github.com/stretchr/testify/require"
)
func TestRoundTripJSON(t *testing.T) {
dir := t.TempDir()
file := filepath.Join(dir, "test.json")
data := &jsonTestData{A: "yay", B: 3}
err := writeJSON(file, data, false)
require.NoError(t, err)
// Confirm the file is uncompressed
fileContent, err := os.ReadFile(file)
require.NoError(t, err)
err = json.Unmarshal(fileContent, &jsonTestData{})
require.NoError(t, err)
var result *jsonTestData
result, err = loadJSON[jsonTestData](file)
require.NoError(t, err)
require.EqualValues(t, data, result)
}
func TestRoundTripJSONWithGzip(t *testing.T) {
dir := t.TempDir()
file := filepath.Join(dir, "test.json.gz")
data := &jsonTestData{A: "yay", B: 3}
err := writeJSON(file, data, false)
require.NoError(t, err)
// Confirm the file isn't raw JSON
fileContent, err := os.ReadFile(file)
require.NoError(t, err)
err = json.Unmarshal(fileContent, &jsonTestData{})
require.Error(t, err, "should not be able to decode without decompressing")
var result *jsonTestData
result, err = loadJSON[jsonTestData](file)
require.NoError(t, err)
require.EqualValues(t, data, result)
}
type jsonTestData struct {
A string `json:"a"`
B int `json:"b"`
}
...@@ -88,6 +88,13 @@ type Proof struct { ...@@ -88,6 +88,13 @@ type Proof struct {
Pre common.Hash `json:"pre"` Pre common.Hash `json:"pre"`
Post common.Hash `json:"post"` Post common.Hash `json:"post"`
StateData hexutil.Bytes `json:"state-data"`
ProofData hexutil.Bytes `json:"proof-data"`
OracleKey hexutil.Bytes `json:"oracle-key,omitempty"`
OracleValue hexutil.Bytes `json:"oracle-value,omitempty"`
OracleOffset uint32 `json:"oracle-offset,omitempty"`
StepInput hexutil.Bytes `json:"step-input"` StepInput hexutil.Bytes `json:"step-input"`
OracleInput hexutil.Bytes `json:"oracle-input"` OracleInput hexutil.Bytes `json:"oracle-input"`
} }
...@@ -298,7 +305,7 @@ func Run(ctx *cli.Context) error { ...@@ -298,7 +305,7 @@ func Run(ctx *cli.Context) error {
} }
if snapshotAt(state) { if snapshotAt(state) {
if err := writeJSON[*mipsevm.State](fmt.Sprintf(snapshotFmt, step), state, false); err != nil { if err := writeJSON(fmt.Sprintf(snapshotFmt, step), state, false); err != nil {
return fmt.Errorf("failed to write state snapshot: %w", err) return fmt.Errorf("failed to write state snapshot: %w", err)
} }
} }
...@@ -314,6 +321,8 @@ func Run(ctx *cli.Context) error { ...@@ -314,6 +321,8 @@ func Run(ctx *cli.Context) error {
Step: step, Step: step,
Pre: preStateHash, Pre: preStateHash,
Post: postStateHash, Post: postStateHash,
StateData: witness.State,
ProofData: witness.MemProof,
StepInput: witness.EncodeStepInput(), StepInput: witness.EncodeStepInput(),
} }
if witness.HasPreimage() { if witness.HasPreimage() {
...@@ -322,8 +331,11 @@ func Run(ctx *cli.Context) error { ...@@ -322,8 +331,11 @@ func Run(ctx *cli.Context) error {
return fmt.Errorf("failed to encode pre-image oracle input: %w", err) return fmt.Errorf("failed to encode pre-image oracle input: %w", err)
} }
proof.OracleInput = inp proof.OracleInput = inp
proof.OracleKey = witness.PreimageKey[:]
proof.OracleValue = witness.PreimageValue
proof.OracleOffset = witness.PreimageOffset
} }
if err := writeJSON[*Proof](fmt.Sprintf(proofFmt, step), proof, true); err != nil { if err := writeJSON(fmt.Sprintf(proofFmt, step), proof, true); err != nil {
return fmt.Errorf("failed to write proof data: %w", err) return fmt.Errorf("failed to write proof data: %w", err)
} }
} else { } else {
...@@ -334,7 +346,7 @@ func Run(ctx *cli.Context) error { ...@@ -334,7 +346,7 @@ func Run(ctx *cli.Context) error {
} }
} }
if err := writeJSON[*mipsevm.State](ctx.Path(RunOutputFlag.Name), state, true); err != nil { if err := writeJSON(ctx.Path(RunOutputFlag.Name), state, true); err != nil {
return fmt.Errorf("failed to write state output: %w", err) return fmt.Errorf("failed to write state output: %w", err)
} }
return nil return nil
......
package cmd
import (
"fmt"
"os"
"github.com/ethereum-optimism/optimism/cannon/mipsevm"
"github.com/ethereum/go-ethereum/crypto"
"github.com/urfave/cli/v2"
)
var (
WitnessInputFlag = &cli.PathFlag{
Name: "input",
Usage: "path of input JSON state.",
TakesFile: true,
Required: true,
}
WitnessOutputFlag = &cli.PathFlag{
Name: "output",
Usage: "path to write binary witness.",
TakesFile: true,
}
)
func Witness(ctx *cli.Context) error {
input := ctx.Path(WitnessInputFlag.Name)
output := ctx.Path(WitnessOutputFlag.Name)
state, err := loadJSON[mipsevm.State](input)
if err != nil {
return fmt.Errorf("invalid input state (%v): %w", input, err)
}
witness := state.EncodeWitness()
h := crypto.Keccak256Hash(witness)
if output != "" {
if err := os.WriteFile(output, witness, 0755); err != nil {
return fmt.Errorf("writing output to %v: %w", output, err)
}
}
fmt.Println(h.Hex())
return nil
}
var WitnessCommand = &cli.Command{
Name: "witness",
Usage: "Convert a Cannon JSON state into a binary witness",
Description: "Convert a Cannon JSON state into a binary witness. The hash of the witness is written to stdout",
Action: Witness,
Flags: []cli.Flag{
WitnessInputFlag,
WitnessOutputFlag,
},
}
...@@ -44,48 +44,17 @@ There are 3 types of witness data involved in onchain execution:
### Packed State

-The following state is packed together, and provided in every executed onchain instruction:
-```solidity
-struct State {
-    bytes32 memRoot;
-    bytes32 preimageKey;
-    uint32 preimageOffset;
-    uint32 pc;
-    uint32 nextPC;
-    uint32 lo;
-    uint32 hi;
-    uint32 heap;
-    uint8 exitCode;
-    bool exited;
-    uint64 step;
-    uint32[32] registers;
-}
-```
+The Packed State is provided in every executed onchain instruction.
+See [Cannon VM Specs](../../specs/cannon-fault-proof-vm.md#state) for
+details on the state structure.

The packed state is small! The `State` data can be packed in such a small amount of EVM words,
that it is more efficient to fully provide it, than to create a proof for each individual part involved in execution.

-The memory is represented as a merkle-tree root, committing to a binary merkle tree of all memory data,
-see [memory proofs](#memory-proofs).
-The `State` covers all general purpose registers, as well as the `lo`, `hi` and `pc` values.
-`nextPC` helps pre-schedule the program counter change, to emulate delay-slot behavior of MIPS
-after branch and jump instructions.

The program stops changing the state when `exited` is set to true. The exit-code is remembered,
to determine if the program is successful, or panicked/exited in some unexpected way.
This outcome can be used to determine truthiness of claims that are verified as part of the program execution.

-The `heap` value is a special value, used to emulate a growing heap, to map new memory upon `mmap` calls by the program.
-No memory reads/writes are actually illegal however, mmap-ing is purely to satisfy program runtimes that
-need the memory-pointer result of the syscall to locate free memory.
-The `preimageKey` and `preimageOffset` are backing the in-flight communication of [pre-image data](#pre-image-data).
-The VM `stateHash` is computed as `keccak256(encoded_packed_state)`,
-where `encoded_packed_state` is the concatenation of all state-values (all `uint` values are big-endian).
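For intuition, here is a minimal Go sketch of that encoding and hashing (an illustration, not code from this change): the field order follows the `State` struct shown above, and the canonical implementation is `State.EncodeWitness` in `mipsevm`, which the `witness` command earlier in this diff hashes with `crypto.Keccak256Hash`.

```go
package main

import (
	"encoding/binary"
	"fmt"

	"github.com/ethereum/go-ethereum/crypto"
)

// packedState mirrors the State struct shown above (a hypothetical local copy,
// not the mipsevm.State type used elsewhere in this commit).
type packedState struct {
	MemRoot        [32]byte
	PreimageKey    [32]byte
	PreimageOffset uint32
	PC             uint32
	NextPC         uint32
	LO             uint32
	HI             uint32
	Heap           uint32
	ExitCode       uint8
	Exited         bool
	Step           uint64
	Registers      [32]uint32
}

// encodeWitness concatenates all state values, big-endian, as described above.
func encodeWitness(s packedState) []byte {
	out := make([]byte, 0, 32+32+6*4+1+1+8+32*4)
	out = append(out, s.MemRoot[:]...)
	out = append(out, s.PreimageKey[:]...)
	out = binary.BigEndian.AppendUint32(out, s.PreimageOffset)
	out = binary.BigEndian.AppendUint32(out, s.PC)
	out = binary.BigEndian.AppendUint32(out, s.NextPC)
	out = binary.BigEndian.AppendUint32(out, s.LO)
	out = binary.BigEndian.AppendUint32(out, s.HI)
	out = binary.BigEndian.AppendUint32(out, s.Heap)
	out = append(out, s.ExitCode)
	if s.Exited {
		out = append(out, 1)
	} else {
		out = append(out, 0)
	}
	out = binary.BigEndian.AppendUint64(out, s.Step)
	for _, r := range s.Registers {
		out = binary.BigEndian.AppendUint32(out, r)
	}
	return out
}

func main() {
	// stateHash = keccak256(encoded_packed_state), as in cmd/witness.go above.
	h := crypto.Keccak256Hash(encodeWitness(packedState{}))
	fmt.Println(h.Hex())
}
```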
### Memory proofs

...@@ -133,11 +102,7 @@ Pre-image data is accessed through syscalls exclusively.
The OP-stack fault-proof [Pre-image Oracle specs](../../specs/fault-proof.md#pre-image-oracle)
define the ABI for communicating pre-images.

-This ABI is implemented by the VM by intercepting the `read`/`write` syscalls to specific file descriptors:
-- `hint client read = 3`: used by the client to send hint data to the host. Optional, implemented as blocking.
-- `hint client write = 4`: used by the client to wait for the host. Always instant, since the above is implemented as blocking.
-- `preimage client read = 5`: used by the client to read pre-image data, starting from the latest pre-image reading offset.
-- `preimage client write = 6`: used by the client to change the pre-image key. The key may change a little bit at a time. The last 32 written bytes count as key for retrieval when reading the pre-image. Changing the key also resets the read offset to 0.
+This ABI is implemented by the VM by intercepting the `read`/`write` syscalls to specific file descriptors. See [Cannon VM Specs](../../specs/cannon-fault-proof-vm.md#io) for more details.

The data is loaded into `PreimageOracle.sol` using the respective loading function based on the pre-image type.
And then retrieved during execution of the `read` syscall.
......
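To make the descriptor-based interception concrete, here is a small Go sketch (an illustration, not code from this change) of how a `write` syscall could be routed by file descriptor, using the descriptor numbers from the list above; the real dispatch lives in `mipsevm/mips.go` (see the syscall constants added near the end of this diff) and also handles reads, partial writes, and hint framing.

```go
package main

import "fmt"

// Pre-image oracle file descriptors, as listed above (assumed values).
const (
	fdHintRead      = 3
	fdHintWrite     = 4
	fdPreimageRead  = 5
	fdPreimageWrite = 6
)

// vmIO is a hypothetical stand-in for the VM state backing the oracle ABI.
type vmIO struct {
	preimageKey    [32]byte
	preimageOffset uint32
}

// handleWrite routes a write syscall by file descriptor.
func (v *vmIO) handleWrite(fd uint32, data []byte) (uint32, error) {
	switch fd {
	case fdHintWrite:
		// Hint data is forwarded to the host; the client then blocks on fdHintRead.
		return uint32(len(data)), nil
	case fdPreimageWrite:
		// The written bytes become (part of) the pre-image key;
		// changing the key resets the read offset to 0.
		copy(v.preimageKey[:], data) // simplified: assumes a single 32-byte write
		v.preimageOffset = 0
		return uint32(len(data)), nil
	default:
		return 0, fmt.Errorf("unexpected write fd %d", fd)
	}
}

func main() {
	v := &vmIO{}
	key := make([]byte, 32)
	key[31] = 0x01
	if _, err := v.handleWrite(fdPreimageWrite, key); err != nil {
		panic(err)
	}
	fmt.Printf("key=%x offset=%d\n", v.preimageKey, v.preimageOffset)
}
```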
...@@ -5,8 +5,8 @@ go 1.20 ...@@ -5,8 +5,8 @@ go 1.20
require github.com/ethereum-optimism/optimism v0.0.0 require github.com/ethereum-optimism/optimism v0.0.0
require ( require (
golang.org/x/crypto v0.8.0 // indirect golang.org/x/crypto v0.12.0 // indirect
golang.org/x/sys v0.7.0 // indirect golang.org/x/sys v0.11.0 // indirect
) )
replace github.com/ethereum-optimism/optimism v0.0.0 => ../../.. replace github.com/ethereum-optimism/optimism v0.0.0 => ../../..
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c= github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM= github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/stretchr/testify v1.8.1 h1:w7B6lhMri9wdJUVmEZPGGhZzrYTPvgJArz7wNPgYKsk= github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
golang.org/x/crypto v0.8.0 h1:pd9TJtTueMTVQXzk8E2XESSMQDj/U7OUu0PqJqPXQjQ= golang.org/x/crypto v0.12.0 h1:tFM/ta59kqch6LlvYnPa0yx5a83cL2nHflFhYKvv9Yk=
golang.org/x/crypto v0.8.0/go.mod h1:mRqEX+O9/h5TFCrQhkgjo2yKi0yYA+9ecGkdQoHrywE= golang.org/x/crypto v0.12.0/go.mod h1:NF0Gs7EO5K4qLn+Ylc+fih8BSTeIjAP05siRnAh98yw=
golang.org/x/sys v0.7.0 h1:3jlCCIQZPdOYu1h8BkNvLz8Kgwtae2cagcG/VamtZRU= golang.org/x/sys v0.11.0 h1:eG7RXZHdqOJ1i+0lgLgCpSXAp6M3LYlAo6osgSi0xOM=
golang.org/x/sys v0.7.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= golang.org/x/sys v0.11.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA= gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
module hello module hello
go 1.20
...@@ -20,6 +20,7 @@ func main() { ...@@ -20,6 +20,7 @@ func main() {
app.Description = "MIPS Fault Proof tool" app.Description = "MIPS Fault Proof tool"
app.Commands = []*cli.Command{ app.Commands = []*cli.Command{
cmd.LoadELFCommand, cmd.LoadELFCommand,
cmd.WitnessCommand,
cmd.RunCommand, cmd.RunCommand,
} }
ctx, cancel := context.WithCancel(context.Background()) ctx, cancel := context.WithCancel(context.Background())
......
...@@ -24,9 +24,10 @@ import ( ...@@ -24,9 +24,10 @@ import (
) )
var ( var (
StepBytes4 = crypto.Keccak256([]byte("Step(bytes,bytes)"))[:4] StepBytes4 = crypto.Keccak256([]byte("step(bytes,bytes)"))[:4]
CheatBytes4 = crypto.Keccak256([]byte("cheat(uint256,bytes32,bytes32,uint256)"))[:4] CheatBytes4 = crypto.Keccak256([]byte("cheat(uint256,bytes32,bytes32,uint256)"))[:4]
LoadKeccak256PreimagePartBytes4 = crypto.Keccak256([]byte("loadKeccak256PreimagePart(uint256,bytes)"))[:4] LoadKeccak256PreimagePartBytes4 = crypto.Keccak256([]byte("loadKeccak256PreimagePart(uint256,bytes)"))[:4]
LoadLocalDataBytes4 = crypto.Keccak256([]byte("loadLocalData(uint256,bytes32,uint256,uint256)"))[:4]
) )
// LoadContracts loads the Cannon contracts, from op-bindings package // LoadContracts loads the Cannon contracts, from op-bindings package
...@@ -111,9 +112,17 @@ func NewEVMEnv(contracts *Contracts, addrs *Addresses) (*vm.EVM, *state.StateDB) ...@@ -111,9 +112,17 @@ func NewEVMEnv(contracts *Contracts, addrs *Addresses) (*vm.EVM, *state.StateDB)
env := vm.NewEVM(blockContext, vm.TxContext{}, state, chainCfg, vmCfg) env := vm.NewEVM(blockContext, vm.TxContext{}, state, chainCfg, vmCfg)
// pre-deploy the contracts // pre-deploy the contracts
env.StateDB.SetCode(addrs.MIPS, contracts.MIPS.DeployedBytecode.Object)
env.StateDB.SetCode(addrs.Oracle, contracts.Oracle.DeployedBytecode.Object) env.StateDB.SetCode(addrs.Oracle, contracts.Oracle.DeployedBytecode.Object)
env.StateDB.SetState(addrs.MIPS, common.Hash{}, addrs.Oracle.Hash())
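// Deploy the MIPS contract properly instead of injecting its code, passing the oracle address as the constructor argument (an address ABI-encodes as a left-padded 32-byte word).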
var mipsCtorArgs [32]byte
copy(mipsCtorArgs[12:], addrs.Oracle[:])
mipsDeploy := append(hexutil.MustDecode(bindings.MIPSMetaData.Bin), mipsCtorArgs[:]...)
startingGas := uint64(30_000_000)
_, deployedMipsAddr, leftOverGas, err := env.Create(vm.AccountRef(addrs.Sender), mipsDeploy, startingGas, big.NewInt(0))
if err != nil {
panic(fmt.Errorf("failed to deploy MIPS contract: %w. took %d gas", err, startingGas-leftOverGas))
}
addrs.MIPS = deployedMipsAddr
rules := env.ChainConfig().Rules(header.Number, true, header.Time) rules := env.ChainConfig().Rules(header.Number, true, header.Time)
env.StateDB.Prepare(rules, addrs.Sender, addrs.FeeRecipient, &addrs.MIPS, vm.ActivePrecompiles(rules), nil) env.StateDB.Prepare(rules, addrs.Sender, addrs.FeeRecipient, &addrs.MIPS, vm.ActivePrecompiles(rules), nil)
......
...@@ -6,6 +6,16 @@ import ( ...@@ -6,6 +6,16 @@ import (
"io" "io"
) )
const (
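// MIPS32 Linux o32 ABI syscall numbers (the o32 syscall table starts at 4000).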
sysMmap = 4090
sysBrk = 4045
sysClone = 4120
sysExitGroup = 4246
sysRead = 4003
sysWrite = 4004
sysFcntl = 4055
)
func (m *InstrumentedState) readPreimage(key [32]byte, offset uint32) (dat [32]byte, datLen uint32) { func (m *InstrumentedState) readPreimage(key [32]byte, offset uint32) (dat [32]byte, datLen uint32) {
preimage := m.lastPreimage preimage := m.lastPreimage
if key != m.lastPreimageKey { if key != m.lastPreimageKey {
...@@ -43,7 +53,7 @@ func (m *InstrumentedState) handleSyscall() error { ...@@ -43,7 +53,7 @@ func (m *InstrumentedState) handleSyscall() error {
//fmt.Printf("syscall: %d\n", syscallNum) //fmt.Printf("syscall: %d\n", syscallNum)
switch syscallNum { switch syscallNum {
case 4090: // mmap case sysMmap:
sz := a1 sz := a1
if sz&PageAddrMask != 0 { // adjust size to align with page size if sz&PageAddrMask != 0 { // adjust size to align with page size
sz += PageSize - (sz & PageAddrMask) sz += PageSize - (sz & PageAddrMask)
...@@ -56,15 +66,15 @@ func (m *InstrumentedState) handleSyscall() error { ...@@ -56,15 +66,15 @@ func (m *InstrumentedState) handleSyscall() error {
v0 = a0 v0 = a0
//fmt.Printf("mmap hint 0x%x size 0x%x\n", v0, sz) //fmt.Printf("mmap hint 0x%x size 0x%x\n", v0, sz)
} }
case 4045: // brk case sysBrk:
v0 = 0x40000000 v0 = 0x40000000
case 4120: // clone (not supported) case sysClone: // clone (not supported)
v0 = 1 v0 = 1
case 4246: // exit_group case sysExitGroup:
m.state.Exited = true m.state.Exited = true
m.state.ExitCode = uint8(a0) m.state.ExitCode = uint8(a0)
return nil return nil
case 4003: // read case sysRead:
// args: a0 = fd, a1 = addr, a2 = count // args: a0 = fd, a1 = addr, a2 = count
// returns: v0 = read, v1 = err code // returns: v0 = read, v1 = err code
switch a0 { switch a0 {
...@@ -98,7 +108,7 @@ func (m *InstrumentedState) handleSyscall() error { ...@@ -98,7 +108,7 @@ func (m *InstrumentedState) handleSyscall() error {
v0 = 0xFFffFFff v0 = 0xFFffFFff
v1 = MipsEBADF v1 = MipsEBADF
} }
case 4004: // write case sysWrite:
// args: a0 = fd, a1 = addr, a2 = count // args: a0 = fd, a1 = addr, a2 = count
// returns: v0 = written, v1 = err code // returns: v0 = written, v1 = err code
switch a0 { switch a0 {
...@@ -144,7 +154,7 @@ func (m *InstrumentedState) handleSyscall() error { ...@@ -144,7 +154,7 @@ func (m *InstrumentedState) handleSyscall() error {
v0 = 0xFFffFFff v0 = 0xFFffFFff
v1 = MipsEBADF v1 = MipsEBADF
} }
case 4055: // fcntl case sysFcntl:
// args: a0 = fd, a1 = cmd // args: a0 = fd, a1 = cmd
if a1 == 3 { // F_GETFL: get file descriptor flags if a1 == 3 { // F_GETFL: get file descriptor flags
switch a0 { switch a0 {
...@@ -170,6 +180,10 @@ func (m *InstrumentedState) handleSyscall() error { ...@@ -170,6 +180,10 @@ func (m *InstrumentedState) handleSyscall() error {
} }
func (m *InstrumentedState) handleBranch(opcode uint32, insn uint32, rtReg uint32, rs uint32) error { func (m *InstrumentedState) handleBranch(opcode uint32, insn uint32, rtReg uint32, rs uint32) error {
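// A non-sequential NextPC here means the current instruction already sits in a branch/jump delay slot;
// MIPS leaves a control-flow instruction in a delay slot undefined, so refuse to emulate it.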
if m.state.NextPC != m.state.PC+4 {
panic("branch in delay slot")
}
shouldBranch := false shouldBranch := false
if opcode == 4 || opcode == 5 { // beq/bne if opcode == 4 || opcode == 5 { // beq/bne
rt := m.state.Registers[rtReg] rt := m.state.Registers[rtReg]
...@@ -236,6 +250,9 @@ func (m *InstrumentedState) handleHiLo(fun uint32, rs uint32, rt uint32, storeRe ...@@ -236,6 +250,9 @@ func (m *InstrumentedState) handleHiLo(fun uint32, rs uint32, rt uint32, storeRe
} }
func (m *InstrumentedState) handleJump(linkReg uint32, dest uint32) error { func (m *InstrumentedState) handleJump(linkReg uint32, dest uint32) error {
if m.state.NextPC != m.state.PC+4 {
panic("jump in delay slot")
}
prevPC := m.state.PC prevPC := m.state.PC
m.state.PC = m.state.NextPC m.state.PC = m.state.NextPC
m.state.NextPC = dest m.state.NextPC = dest
......
...@@ -8,10 +8,9 @@ md = Cs(CS_ARCH_MIPS, CS_MODE_32 + CS_MODE_BIG_ENDIAN) ...@@ -8,10 +8,9 @@ md = Cs(CS_ARCH_MIPS, CS_MODE_32 + CS_MODE_BIG_ENDIAN)
def maketest(d, out): def maketest(d, out):
with tempfile.NamedTemporaryFile() as nf: with tempfile.NamedTemporaryFile() as nf:
path = "/Users/kafka/fun/mips/mips-gcc-4.8.1/bin/"
print("building", d, "->", out) print("building", d, "->", out)
# which mips is go # which mips is go
ret = os.system("%s/mips-elf-as -defsym big_endian=1 -march=mips32r2 -o %s %s" % (path, nf.name, d)) ret = os.system("mips-linux-gnu-as -defsym big_endian=1 -march=mips32r2 -o %s %s" % (nf.name, d))
assert(ret == 0) assert(ret == 0)
nf.seek(0) nf.seek(0)
elffile = ELFFile(nf) elffile = ELFFile(nf)
...@@ -35,4 +34,4 @@ if __name__ == "__main__": ...@@ -35,4 +34,4 @@ if __name__ == "__main__":
for d in os.listdir("test/"): for d in os.listdir("test/"):
if not d.endswith(".asm"): if not d.endswith(".asm"):
continue continue
maketest("test/"+d, "test/bin/"+(d.replace(".asm", ".bin"))) maketest("test/"+d, "test/bin/"+(d.replace(".asm", ".bin")))
\ No newline at end of file
.section .test, "x"
.balign 4
.set noreorder
.global test
.ent test
test:
li $v0, 4045
syscall
lui $t0, 0x4000
subu $v0, $v0, $t0
sltiu $v0, $v0, 1
# save results
lui $s0, 0xbfff # Load the base address 0xbffffff0
ori $s0, 0xfff0
ori $s1, $0, 1 # Prepare the 'done' status
sw $v0, 8($s0) # Set the test result
sw $s1, 4($s0) # Set 'done'
$done:
jr $ra
nop
.end test
.section .test, "x"
.balign 4
.set noreorder
.global test
.ent test
test:
li $v0, 4120
syscall
li $t0, 0x1
subu $v0, $v0, $t0
sltiu $v0, $v0, 1
# save results
lui $s0, 0xbfff # Load the base address 0xbffffff0
ori $s0, 0xfff0
ori $s1, $0, 1 # Prepare the 'done' status
sw $v0, 8($s0) # Set the test result
sw $s1, 4($s0) # Set 'done'
$done:
jr $ra
nop
.end test
.section .test, "x"
.balign 4
.set noreorder
.global test
.ent test
test:
li $a0, 1
li $v0, 4246
syscall
# Unreachable ....
# set test result to fail.
# Test runner should short-circuit before reaching this point.
li $v0, 0
# save results
lui $s0, 0xbfff # Load the base address 0xbffffff0
ori $s0, 0xfff0
ori $s1, $0, 1 # Prepare the 'done' status
sw $v0, 8($s0) # Set the test result
sw $s1, 4($s0) # Set 'done'
$done:
jr $ra
nop
.end test
.section .test, "x"
.balign 4
.set noreorder
.global test
.ent test
test:
# fcntl(0, 3)
li $v0, 4055
li $a0, 0x0
li $a1, 0x3
syscall
sltiu $v0, $v0, 1
# save results
lui $s0, 0xbfff # Load the base address 0xbffffff0
ori $s0, 0xfff0
ori $s1, $0, 1 # Prepare the 'done' status
sw $v0, 8($s0) # Set the test result
sw $s1, 4($s0) # Set 'done'
$done:
jr $ra
nop
.end test
.section .test, "x"
.balign 4
.set noreorder
.global test
.ent test
test:
li $v0, 4090
lui $a0, 0x3000
li $a1, 4096
syscall
lui $t0, 0x3000
subu $v0, $v0, $t0
sltiu $v0, $v0, 1
# save results
lui $s0, 0xbfff # Load the base address 0xbffffff0
ori $s0, 0xfff0
ori $s1, $0, 1 # Prepare the 'done' status
sw $v0, 8($s0) # Set the test result
sw $s1, 4($s0) # Set 'done'
$done:
jr $ra
nop
.end test
...@@ -69,6 +69,10 @@ $readloop: ...@@ -69,6 +69,10 @@ $readloop:
addiu $t0, $t0, -1 addiu $t0, $t0, -1
bnez $t0, $readloop bnez $t0, $readloop
nop nop
# reading the pre-image stream at EOF should have no effect
li $a1, 0x31000008
li $v0, 4003
syscall
# length at 0x31000000. We also check that the lower 32 bits are zero # length at 0x31000000. We also check that the lower 32 bits are zero
lui $s1, 0x3100 lui $s1, 0x3100
......
.section .test, "x"
.balign 4
.set noreorder
.global test
.ent test
# load hash at 0x30001000
# 0x47173285 a8d7341e 5e972fc6 77286384 f802f8ef 42a5ec5f 03bbfa25 4cb01fad = keccak("hello world")
# 0x02173285 a8d7341e 5e972fc6 77286384 f802f8ef 42a5ec5f 03bbfa25 4cb01fad = keccak("hello world").key
test:
lui $s0, 0x3000
ori $s0, 0x1000
lui $t0, 0x0217
ori $t0, 0x3285
sw $t0, 0($s0)
lui $t0, 0xa8d7
ori $t0, 0x341e
sw $t0, 4($s0)
lui $t0, 0x5e97
ori $t0, 0x2fc6
sw $t0, 8($s0)
lui $t0, 0x7728
ori $t0, 0x6384
sw $t0, 0xc($s0)
lui $t0, 0xf802
ori $t0, 0xf8ef
sw $t0, 0x10($s0)
lui $t0, 0x42a5
ori $t0, 0xec5f
sw $t0, 0x14($s0)
lui $t0, 0x03bb
ori $t0, 0xfa25
sw $t0, 0x18($s0)
lui $t0, 0x4cb0
ori $t0, 0x1fad
sw $t0, 0x1c($s0)
# preimage request - write(fdPreimageWrite, preimageData, 32)
li $a0, 6
li $a1, 0x30001000
li $t0, 8
li $a2, 4
$writeloop:
li $v0, 4004
syscall
addiu $a1, $a1, 4
addiu $t0, $t0, -1
bnez $t0, $writeloop
nop
# preimage response to 0x30002000 - read(fdPreimageRead, addr, count)
# read preimage length to unaligned addr. This will read only up to the nearest aligned byte so we have to read again.
li $a0, 5
li $a1, 0x31000001
li $a2, 4
li $v0, 4003
syscall
li $a1, 0x31000004
li $v0, 4003
syscall
li $a1, 0x31000008
li $a2, 1
li $v0, 4003
syscall
# read the preimage data
li $a1, 0x31000009
li $t0, 11
$readloop:
li $v0, 4003
li $a2, 4
syscall
addu $a1, $a1, $v0
subu $t0, $t0, $v0
bnez $t0, $readloop
nop
# length at 0x31000001. We also check that the lower 32 bits are zero
li $s1, 0x31000001
lb $t0, 0($s1)
lb $t2, 1($s1)
sll $t2, $t2, 8
or $t0, $t0, $t2
lb $t2, 2($s1)
sll $t2, $t2, 16
or $t0, $t0, $t2
# assert len[0:3] == 0
sltiu $v0, $t0, 1
# assert len[4:8] == 0
addiu $s1, $s1, 3
lw $t1, 0($s1)
sltiu $v1, $t1, 1
and $v0, $v0, $v1
# assert len[8:9] == 11
addiu $s1, $s1, 4
lb $t2, 0($s1)
li $t4, 11
subu $t5, $t2, $t4
sltiu $v1, $t5, 1
and $v0, $v0, $v1
# data at 0x31000009
addiu $s1, $s1, 1
lb $t0, 0($s1)
lb $t2, 1($s1)
sll $t0, $t0, 8
or $t0, $t0, $t2
lb $t2, 2($s1)
sll $t0, $t0, 8
or $t0, $t0, $t2
lb $t2, 3($s1)
sll $t0, $t0, 8
or $t0, $t0, $t2
#lw $t0, 0($s1)
lui $t4, 0x6865
ori $t4, 0x6c6c
subu $t5, $t0, $t4
sltiu $v1, $t5, 1
and $v0, $v0, $v1
# save results
lui $s0, 0xbfff # Load the base address 0xbffffff0
ori $s0, 0xfff0
ori $s1, $0, 1 # Prepare the 'done' status
sw $v0, 8($s0) # Set the test result
sw $s1, 4($s0) # Set 'done'
$done:
jr $ra
nop
.end test
...@@ -35,6 +35,9 @@ func TestState(t *testing.T) { ...@@ -35,6 +35,9 @@ func TestState(t *testing.T) {
if strings.HasPrefix(f.Name(), "oracle") { if strings.HasPrefix(f.Name(), "oracle") {
oracle = staticOracle(t, []byte("hello world")) oracle = staticOracle(t, []byte("hello world"))
} }
// Short-circuit early for exit_group.bin
exitGroup := f.Name() == "exit_group.bin"
// TODO: currently tests are compiled as flat binary objects // TODO: currently tests are compiled as flat binary objects
// We can use more standard tooling to compile them to ELF files and get remove maketests.py // We can use more standard tooling to compile them to ELF files and get remove maketests.py
fn := path.Join("open_mips_tests/test/bin", f.Name()) fn := path.Join("open_mips_tests/test/bin", f.Name())
...@@ -57,14 +60,24 @@ func TestState(t *testing.T) { ...@@ -57,14 +60,24 @@ func TestState(t *testing.T) {
if us.state.PC == endAddr { if us.state.PC == endAddr {
break break
} }
if exitGroup && us.state.Exited {
break
}
_, err := us.Step(false) _, err := us.Step(false)
require.NoError(t, err) require.NoError(t, err)
} }
require.Equal(t, uint32(endAddr), us.state.PC, "must reach end")
// inspect test result if exitGroup {
done, result := state.Memory.GetMemory(baseAddrEnd+4), state.Memory.GetMemory(baseAddrEnd+8) require.NotEqual(t, uint32(endAddr), us.state.PC, "must not reach end")
require.Equal(t, done, uint32(1), "must be done") require.True(t, us.state.Exited, "must set exited state")
require.Equal(t, result, uint32(1), "must have success result") require.Equal(t, uint8(1), us.state.ExitCode, "must exit with 1")
} else {
require.Equal(t, uint32(endAddr), us.state.PC, "must reach end")
done, result := state.Memory.GetMemory(baseAddrEnd+4), state.Memory.GetMemory(baseAddrEnd+8)
// inspect test result
require.Equal(t, done, uint32(1), "must be done")
require.Equal(t, result, uint32(1), "must have success result")
}
}) })
} }
} }
......
...@@ -26,13 +26,21 @@ func uint32ToBytes32(v uint32) []byte { ...@@ -26,13 +26,21 @@ func uint32ToBytes32(v uint32) []byte {
} }
func (wit *StepWitness) EncodeStepInput() []byte { func (wit *StepWitness) EncodeStepInput() []byte {
abiStateLen := len(wit.State)
if abiStateLen%32 != 0 {
abiStateLen += 32 - (abiStateLen % 32)
}
// pad state to 32 byte multiple per ABI
abiState := make([]byte, abiStateLen)
copy(abiState, wit.State)
var input []byte var input []byte
input = append(input, StepBytes4...) input = append(input, StepBytes4...)
input = append(input, uint32ToBytes32(32*2)...) // state data offset in bytes input = append(input, uint32ToBytes32(32*2)...) // state data offset in bytes
input = append(input, uint32ToBytes32(32*2+32+uint32(len(wit.State)))...) // proof data offset in bytes input = append(input, uint32ToBytes32(32*2+32+uint32(len(abiState)))...) // proof data offset in bytes
input = append(input, uint32ToBytes32(uint32(len(wit.State)))...) // state data length in bytes input = append(input, uint32ToBytes32(uint32(len(wit.State)))...) // state data length in bytes
input = append(input, wit.State[:]...) input = append(input, abiState[:]...)
input = append(input, uint32ToBytes32(uint32(len(wit.MemProof)))...) // proof data length in bytes input = append(input, uint32ToBytes32(uint32(len(wit.MemProof)))...) // proof data length in bytes
input = append(input, wit.MemProof[:]...) input = append(input, wit.MemProof[:]...)
return input return input
...@@ -49,18 +57,19 @@ func (wit *StepWitness) EncodePreimageOracleInput() ([]byte, error) { ...@@ -49,18 +57,19 @@ func (wit *StepWitness) EncodePreimageOracleInput() ([]byte, error) {
switch preimage.KeyType(wit.PreimageKey[0]) { switch preimage.KeyType(wit.PreimageKey[0]) {
case preimage.LocalKeyType: case preimage.LocalKeyType:
// We have no on-chain form of preparing the bootstrap pre-images onchain yet. if len(wit.PreimageValue) > 32+8 {
// So instead we cheat them in. return nil, fmt.Errorf("local pre-image exceeds maximum size of 32 bytes with key 0x%x", wit.PreimageKey)
// In production usage there should be an on-chain contract that exposes this, }
// rather than going through the global keccak256 oracle.
var input []byte var input []byte
input = append(input, CheatBytes4...) input = append(input, LoadLocalDataBytes4...)
input = append(input, uint32ToBytes32(wit.PreimageOffset)...)
input = append(input, wit.PreimageKey[:]...) input = append(input, wit.PreimageKey[:]...)
preimagePart := wit.PreimageValue[8:]
var tmp [32]byte var tmp [32]byte
copy(tmp[:], wit.PreimageValue[wit.PreimageOffset:]) copy(tmp[:], preimagePart)
input = append(input, tmp[:]...) input = append(input, tmp[:]...)
input = append(input, uint32ToBytes32(uint32(len(wit.PreimageValue))-8)...) input = append(input, uint32ToBytes32(uint32(len(wit.PreimageValue)-8))...)
input = append(input, uint32ToBytes32(wit.PreimageOffset)...)
// Note: we can pad calldata to 32 byte multiple, but don't strictly have to // Note: we can pad calldata to 32 byte multiple, but don't strictly have to
return input, nil return input, nil
case preimage.Keccak256KeyType: case preimage.Keccak256KeyType:
......
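For context on the padding change above, a small illustrative sketch (the 226/896 byte sizes are made up, not taken from the diff) of how the `step(bytes,bytes)` calldata is laid out once the state blob is padded to a 32-byte multiple:

```go
package witness

// pad32 rounds a byte length up to the next multiple of 32, as the Solidity
// ABI expects for the tail of a dynamic `bytes` argument.
func pad32(n int) int {
	if r := n % 32; r != 0 {
		return n + 32 - r
	}
	return n
}

// With a hypothetical 226-byte state and 896-byte memory proof, the calldata is:
//
//	4   bytes  selector keccak256("step(bytes,bytes)")[:4]
//	32  bytes  offset of the state bytes  = 0x40
//	32  bytes  offset of the proof bytes  = 0x40 + 32 + pad32(226) = 352
//	32  bytes  state length (unpadded)    = 226
//	256 bytes  state data, zero-padded    = pad32(226)
//	32  bytes  proof length               = 896
//	896 bytes  proof data
```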
...@@ -7,8 +7,8 @@ ignore: ...@@ -7,8 +7,8 @@ ignore:
- "**/*.t.sol" - "**/*.t.sol"
- "op-bindings/bindings/*.go" - "op-bindings/bindings/*.go"
- "packages/contracts-bedrock/contracts/vendor/WETH9.sol" - "packages/contracts-bedrock/contracts/vendor/WETH9.sol"
- "packages/contracts-bedrock/contracts/echidna"
- "packages/contracts-bedrock/contracts/cannon" # tested through Go tests - "packages/contracts-bedrock/contracts/cannon" # tested through Go tests
- 'packages/contracts-bedrock/contracts/EAS/**/*.sol'
coverage: coverage:
status: status:
patch: patch:
...@@ -34,9 +34,6 @@ flag_management: ...@@ -34,9 +34,6 @@ flag_management:
- name: common-ts-tests - name: common-ts-tests
- name: contracts-tests - name: contracts-tests
- name: core-utils-tests - name: core-utils-tests
- name: contracts-periphery-tests
- name: dtl-tests - name: dtl-tests
- name: chain-mon-tests - name: chain-mon-tests
- name: fault-detector-tests
- name: replica-healthcheck-tests
- name: sdk-tests - name: sdk-tests
# The OP Stack Docs
[![Discord](https://img.shields.io/discord/667044843901681675.svg?color=768AD4&label=discord&logo=https%3A%2F%2Fdiscordapp.com%2Fassets%2F8c9701b98ad4372b58f13fd9f65f966e.svg)](https://discord.gg/optimism)
[![Twitter Follow](https://img.shields.io/twitter/follow/optimismPBC.svg?label=optimismPBC&style=social)](https://twitter.com/optimismPBC)
The OP Stack is an open, collectively maintained development stack for blockchain ecosystems.
This repository contains the source code for the [OP Stack Docs](https://stack.optimism.io).
## Development
### Serving docs locally
```sh
yarn dev
```
Then navigate to [http://localhost:8080](http://localhost:8080).
If that link doesn't work, double check the output of `yarn dev`.
You might already be serving something on port 8080 and the site may be on another port (e.g., 8081).
### Building docs for production
```sh
yarn build
```
You probably don't need to run this command, but now you know.
### Editing docs
Edit the markdown directly in [src/docs](./src/docs).
### Adding new docs
Add your markdown files to [src/docs](./src/docs).
You will also have to update [src/.vuepress/config.js](./src/.vuepress/config.js) if you want these docs to show up in the sidebar.
### Updating the theme
We currently use an ejected version of [vuepress-theme-hope](https://vuepress-theme-hope.github.io/).
Since the version we use was ejected from the original theme, you'll see a bunch of compiled JavaScript files instead of the original TypeScript files.
There's not much we can do about that right now, so you'll just need to make do and edit the raw JS if you need to make theme adjustments.
We're planning to move away from VuePress relatively soon anyway so we won't be fixing this.
{
"name": "@eth-optimism/op-stack-docs",
"version": "0.0.2",
"description": "The OP Stack Docs",
"main": "index.js",
"scripts": {
"dev": "vuepress dev src",
"build": "vuepress build src",
"preview": "yarn build && serve -s src/.vuepress/dist -p 8080"
},
"license": "MIT",
"devDependencies": {
"@vuepress/plugin-medium-zoom": "^1.8.2",
"@vuepress/plugin-pwa": "^1.9.7",
"serve": "^14.2.0",
"vuepress": "^1.8.2",
"vuepress-plugin-plausible-analytics": "^0.2.1",
"vuepress-theme-hope": "^1.22.0"
},
"dependencies": {
"check-md": "^1.0.2",
"dayjs": "^1.11.7"
}
}
const { description } = require('../../package')
const path = require('path')
module.exports = {
title: 'OP Stack Docs',
description: description,
head: [
['link', { rel: 'manifest', href: '/manifest.json' }],
['meta', { name: 'theme-color', content: '#3eaf7c' }],
['meta', { name: 'apple-mobile-web-app-capable', content: 'yes' }],
['meta', { name: 'apple-mobile-web-app-status-bar-style', content: 'black' }],
['meta', { property: 'og:image', content: 'https://stack.optimism.io/assets/logos/twitter-logo.png' }],
['meta', { name: 'twitter:image', content: 'https://stack.optimism.io/assets/logos/twitter-logo.png' }],
['meta', { name: 'twitter:title', content: 'OP Stack Docs' }],
['meta', { property: 'og:title', content: 'OP Stack Docs' }],
['meta', { name: 'twitter:card', content: 'summary' } ],
['link', { rel: "icon", type: "image/png", sizes: "32x32", href: "/assets/logos/favicon.png"}],
],
theme: path.resolve(__dirname, './theme'),
themeConfig: {
"twitter:card": "summary",
contributor: false,
hostname: 'https://stack.optimism.io',
logo: '/assets/logos/logo.png',
docsDir: 'src',
docsRepo: 'https://github.com/ethereum-optimism/opstack-docs',
docsBranch: 'main',
lastUpdated: false,
darkmode: 'disable',
themeColor: false,
blog: false,
iconPrefix: 'far fa-',
pageInfo: false,
pwa: {
cacheHTML: false,
},
activeHash: {
offset: -200,
},
algolia: {
appId: 'O9WKE9RMCV',
apiKey: '00cf17cba30b374d08d7f7afead974be',
indexName: 'optimism'
},
nav: [
{
text: 'Home',
link: 'https://www.optimism.io/'
},
{
text: 'OP Stack Docs',
link: '/'
},
{
text: 'Optimism Docs',
link: 'https://community.optimism.io/'
},
{
text: 'Governance',
link: 'https://community.optimism.io/docs/governance/'
},
{
text: 'Community',
items: [
{
icon: 'discord',
iconPrefix: 'fab fa-',
iconClass: 'color-discord',
text: 'Discord',
link: 'https://discord.optimism.io',
},
{
icon: 'github',
iconPrefix: 'fab fa-',
iconClass: 'color-github',
text: 'GitHub',
link: 'https://github.com/ethereum-optimism/optimism',
},
{
icon: 'twitter',
iconPrefix: 'fab fa-',
iconClass: 'color-twitter',
text: 'Twitter',
link: 'https://twitter.com/optimismFND',
},
{
icon: 'twitch',
iconPrefix: 'fab fa-',
iconClass: 'color-twitch',
text: 'Twitch',
link: 'https://www.twitch.tv/optimismpbc'
},
{
icon: 'medium',
iconPrefix: 'fab fa-',
iconClass: 'color-medium',
text: 'Blog',
link: 'https://optimismpbc.medium.com/'
},
{
icon: 'computer-classic',
iconClass: 'color-ecosystem',
text: 'Ecosystem',
link: 'https://www.optimism.io/apps/all',
},
{
icon: 'globe',
iconClass: 'color-optimism',
text: 'optimism.io',
link: 'https://www.optimism.io/',
}
]
}
],
searchPlaceholder: 'Search the docs',
sidebar: [
{
title: "OP Stack",
collapsable: false,
children: [
'/',
[
'/docs/understand/design-principles.md',
'Design Principles'
],
'/docs/understand/landscape.md',
'/docs/understand/explainer.md'
]
},
{
title: "Releases",
collapsable: false,
children: [
'/docs/releases/',
{
title: "Bedrock",
collapsable: true,
children: [
'/docs/releases/bedrock/',
'/docs/releases/bedrock/explainer.md',
'/docs/releases/bedrock/differences.md'
]
}
]
},
{
title: "Building OP Stack Rollups",
collapsable: false,
children: [
'/docs/build/getting-started.md',
'/docs/build/conf.md',
'/docs/build/operations.md',
'/docs/build/explorer.md',
'/docs/build/sdk.md',
{
title: "OP Stack Hacks",
collapsable: true,
children: [
'/docs/build/hacks.md',
'/docs/build/featured.md',
'/docs/build/data-avail.md',
'/docs/build/derivation.md',
'/docs/build/execution.md',
'/docs/build/settlement.md',
{
title: "Sample Hacks",
children: [
"/docs/build/tutorials/add-attr.md",
"/docs/build/tutorials/new-precomp.md",
"/docs/build/tutorials/predeploys.md"
]
} // End of tutorials
],
}, // End of OP Stack hacks
],
}, // End of Building OP Stack Rollups
{
title: "Contributing",
collapsable: false,
children: [
'/docs/contribute.md',
]
},
{
title: "Security",
collapsable: false,
children: [
'/docs/security/faq.md',
'/docs/security/policy.md',
'/docs/security/pause.md',
'/docs/security/forced-withdrawal.md',
]
},
], // end of sidebar
plugins: [
"@vuepress/pwa",
[
'@vuepress/plugin-medium-zoom',
{
selector: ':not(a) > img'
}
],
"plausible-analytics"
]
}
}
// module.exports.themeConfig.sidebar["/docs/useful-tools/"] = module.exports.themeConfig.sidebar["/docs/developers/"]
import event from '@vuepress/plugin-pwa/lib/event'
export default ({ router }) => {
registerAutoReload();
router.addRoutes([
{ path: '/docs/', redirect: '/' },
])
}
// When new content is detected by the app, this will automatically
// refresh the page, so that users do not need to manually click
// the refresh button. For more details see:
// https://linear.app/optimism/issue/FE-1003/investigate-archive-issue-on-docs
const registerAutoReload = () => {
event.$on('sw-updated', e => {
e.skipWaiting().then(() =>
{
if (typeof location !== 'undefined')
location.reload(true);
}
)
})
}
{
"name": "OP Docs",
"short_name": "OP Docs",
"description": "The official OP Docs",
"icons": [
{
"src": "/assets/logos/icon-192x192.png",
"sizes": "192x192",
"type": "image/png"
},
{
"src": "/assets/logos/icon-256x256.png",
"sizes": "256x256",
"type": "image/png"
},
{
"src": "/assets/logos/icon-384x384.png",
"sizes": "384x384",
"type": "image/png"
},
{
"src": "/assets/logos/icon-512x512.png",
"sizes": "512x512",
"type": "image/png"
},
{
"src": "/assets/logos/icon-1020x1020.png",
"sizes": "any",
"type": "image/png"
}
],
"start_url": "/index.html",
"display": "standalone",
"background_color": "#ffffff",
"theme_color": "#ff0420"
}
@import url('https://fonts.googleapis.com/css2?family=Open+Sans:wght@400;600&display=swap');
@import url('https://fonts.googleapis.com/css2?family=Rubik:ital,wght@0,400;0,600;0,700;1,600;1,700&display=swap');
@import 'https://pro.fontawesome.com/releases/v5.15.4/css/all.css';
main, body, html {
font-family: 'Open Sans', sans-serif;
}
p {
font-size: 16px;
line-height: 24px;
}
aside.sidebar {
background-color: #F1F4F9;
border-right: none;
}
p.sidebar-heading {
color: #323A43 !important;
font-family: 'Open Sans', sans-serif;
font-weight: 600 !important;
font-size: 14px !important;
line-height: 24px !important;
min-height: 36px;
margin-left: 20px;
padding: 8px 16px !important;
width: calc(100% - 60px) !important;
border-radius: 8px;
}
a.sidebar-link {
font-family: 'Open Sans', sans-serif;
font-size: 14px !important;
line-height: 24px !important;
min-height: 36px;
margin-top: 3px;
margin-left: 20px;
padding: 8px 16px !important;
width: calc(100% - 60px) !important;
border-radius: 8px;
}
section.sidebar-group a.sidebar-link,
section.sidebar-group p.sidebar-heading.clickable {
margin-left: 32px;
width: calc(100% - 60px) !important;
}
.sidebar-links:not(.sidebar-group-items) > li > a.sidebar-link {
font-weight: 600 !important;
color: #323A43 !important;
}
.sidebar-links:not(.sidebar-group-items) > li > a.sidebar-link.active {
border-left-color: #F1F4F9 !important;
background-color: #FFDBDF !important;
color: #FF0420 !important;
}
a.sidebar-link.active {
border-left-color: #F1F4F9 !important;
background-color: #FFDBDF !important;
color: #FF0420 !important;
}
h1 {
font-size: 50px;
}
h2 {
font-size: 40px;
}
h3 {
font-size: 28px;
}
h4 {
font-size: 20px;
}
h1 {
font-family: 'Rubik', sans-serif;
font-weight: 700;
font-style: italic;
border-bottom: none;
color: #202327 !important;
}
h2, h3, h4 {
font-family: 'Rubik', sans-serif;
border-bottom: none;
font-weight: 400;
}
#search-form {
@media (min-width $MQNormal) {
margin-left: 2rem;
}
}
.search-box {
@media (min-width $MQNormal) {
order: 1;
margin-right: 0;
margin-left: 1rem;
.suggestions {
left: auto !important;
right: 0 !important;
}
}
input {
border-color: #CBD5E0 !important;
border-radius: 100px !important;
background-color: #FFFFFF !important;
}
}
header.navbar {
border-bottom: none;
box-shadow: 0px 6px 8px -6px rgba(20, 23, 26, 0.06), 0px 8px 16px -6px rgba(20, 23, 26, 0.04);
--bgcolor-blur: hsla(0,0%,100%,0.9);
}
span.site-name {
display: none !important;
}
a.nav-link,
div.nav-item span.title {
font-weight: 600;
}
a.nav-link:not(.router-link-active),
div.nav-item span.title {
color: #68778D;
}
.theme-default-content:not(.custom) > p {
text-align: inherit !important;
}
.sidebar {
box-shadow: none !important;
}
div.hero-info {
color: white;
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
display: flex;
align-items: center;
}
div.hero-info .description {
display: none;
}
div.hero-info #main-title {
color: white !important;
}
header.hero > img {
border-radius: 16px;
max-height: none !important;
}
header.hero {
margin-top: 20px;
position: relative;
justify-content: space-between !important;
}
.home .features {
border-top: none !important;
justify-content: space-between !important;
margin: 0 !important;
padding-top: 0.5rem !important;
}
.home .features h2 {
font-family: 'Open Sans', sans-serif;
font-style: normal;
font-size: 16px !important;
font-weight: 600 !important;
color: #202327 !important;
}
.home .features p {
font-family: 'Open Sans', sans-serif;
font-size: 14px !important;
color: #68778D !important;
}
.home .features .icon-container {
height: 44px;
width: 44px;
background-color: #FFF0F1;
border-radius: 8px;
display: flex;
justify-content: center;
align-items: center;
color: #FF0420;
}
.home .features .feature {
background-color: #FFFFFF !important;
box-shadow: 0px 6px 8px -6px rgba(20, 23, 26, 0.12), 0px 8px 16px -6px rgba(20, 23, 26, 0.08);
border-radius: 16px !important;
margin: 0 !important;
margin-bottom: 2rem !important;
padding: 1.5rem !important;
}
.features-header {
margin-top: 30px;
margin-bottom: 0px;
}
div.theme-container:not(.has-sidebar) {
background-color: #F1F4F9;
}
.anchor-header {
color: #202327;
font-weight: 600;
font-size: 14px;
font-height: 20px;
margin-bottom: 10px;
}
.anchor-support {
margin-top: 20px;
}
.anchor-support-links i {
width: 20px;
text-align: center;
margin-right: 5px;
}
.anchor-support-links a {
color: #68778D;
}
.anchor-support-links a div {
height: 30px;
font-size: 14px;
}
.anchor-support-links a:hover {
color: #FF0420;
}
#anchor {
width: 15rem !important;
}
#anchor .anchor {
line-height: 27.2px !important;
}
#anchor .anchor div {
color: #68778D !important;
}
#anchor .anchor.active div {
font-weight: 600 !important;
}
.theme-default-content code {
top: 1px;
line-height: 22.4px !important;
vertical-align: middle !important;
}
.nav-dropdown svg.icon.outbound {
display: none;
}
.nav-dropdown .dropdown-item i {
width: 20px;
}
.color-discord {
color: #5865F2;
}
.color-github {
color: #121212;
}
.color-twitter {
color: #1DA1F2;
}
.color-twitch {
color: #6441A5;
}
.color-medium {
color: #000000;
}
.color-optimism {
color: #FF0420;
}
.color-ecosystem {
color: #ea94db;
}
$textColor = #000000
$accentColor = #f01a37
$backgroundColor = #272934
$lightBackgroundColor = #f3f3f3
$sidebarWidth = 320px
import Vue from "vue";
import type { AlgoliaOption } from "@mr-hope/vuepress-types";
declare const _default: import("vue/types/vue").ExtendedVue<Vue, {
placeholder: string;
}, {
initialize(userOptions: AlgoliaOption, lang: string): void;
update(options: AlgoliaOption, lang: string): void;
}, unknown, {
options: AlgoliaOption;
}>;
export default _default;
{"version":3,"file":"Dropdown.js","sourceRoot":"","sources":["Dropdown.ts"],"names":[],"mappings":"AAAA,OAAO,GAAG,MAAM,KAAK,CAAC;AAKtB,eAAe,GAAG,CAAC,MAAM,CAAC;IACxB,IAAI,EAAE,uBAAuB;IAE7B,KAAK,EAAE;QACL,OAAO,EAAE,EAAE,IAAI,EAAE,MAAiC,EAAE,QAAQ,EAAE,IAAI,EAAE;KACrE;IAED,IAAI,EAAE,GAAG,EAAE,CAAC,CAAC;QACX,WAAW,EAAE,EAAE;KAChB,CAAC;IAEF,KAAK,EAAE;QACL,KAAK,CAAC,QAAgB;YACpB,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;QACtC,CAAC;QAED,OAAO,CAAC,QAAuB;YAC7B,IAAI,CAAC,MAAM,CAAC,QAAQ,EAAE,IAAI,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;KACF;IAED,OAAO;QACL,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,KAAK,CAAC,CAAC;QAC1C,IAAI,CAAC,WAAW;YACb,IAAI,CAAC,KAAK,CAAC,WAAW,CAAC,iBAA4B,IAAI,EAAE,CAAC;IAC/D,CAAC;IAED,OAAO,EAAE;QACP,UAAU,CAAC,WAA0B,EAAE,IAAY;YACjD,KAAK,OAAO,CAAC,GAAG,CAAC;gBACf,MAAM;gBACJ,mCAAmC,CAAC,wCAAwC,CAC7E;gBACD,MAAM;gBACJ,mCAAmC,CAAC,yCAAyC,CAC9E;aACF,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,EAAE,EAAE;gBACtB,2BAA2B;gBAC1B,SAAiB,CAAC,OAAO,iCACrB,WAAW,KACd,aAAa,EAAE,uBAAuB;oBACtC,8CAA8C;oBAC9C,cAAc,EAAE;wBACd,YAAY,EAAE,CAAC,QAAQ,IAAI,EAAE,CAAC,CAAC,MAAM;wBACnC,2BAA2B;wBACzB,WAAmB,CAAC,YAAyB,IAAI,EAAE,CACtD;qBACF,EACD,cAAc,EAAE,CACd,MAAwB,EACxB,MAAa,EACb,UAA2B,EAC3B,EAAE;wBACF,MAAM,EAAE,QAAQ,EAAE,IAAI,EAAE,GAAG,IAAI,GAAG,CAAC,UAAU,CAAC,GAAG,CAAC,CAAC;wBACnD,MAAM,SAAS,GAAG,QAAQ,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC;wBAEzD,IACE,IAAI,CAAC,OAAO,CAAC,SAAS,EAAE,CAAC,IAAI,CAAC,CAAC,KAAK,EAAE,EAAE,CAAC,KAAK,CAAC,IAAI,KAAK,SAAS,CAAC;4BAElE,KAAK,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,GAAG,SAAS,GAAG,kBAAkB,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;;4BAC/D,MAAM,CAAC,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,CAAC;oBACnC,CAAC,IACD,CAAC;YACL,CAAC,CAAC,CAAC;QACL,CAAC;QAED,MAAM,CAAC,OAAsB,EAAE,IAAY;YACzC,IAAI,CAAC,GAAG,CAAC,SAAS;gBAChB,wDAAwD,CAAC;YAC3D,IAAI,CAAC,UAAU,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;QACjC,CAAC;KACF;CACF,CAAC,CAAC"}
\ No newline at end of file
import Vue from "vue";
import type { AlgoliaOption } from "@mr-hope/vuepress-types";
declare const _default: import("vue/types/vue").ExtendedVue<Vue, unknown, {
initialize(userOptions: AlgoliaOption, _lang: string): void;
resolveRoutePathFromUrl(absoluteUrl: string): string;
update(options: AlgoliaOption, lang: string): void;
}, unknown, {
options: AlgoliaOption;
}>;
export default _default;
{"version":3,"file":"Full.js","sourceRoot":"","sources":["Full.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,aAAa,EAAE,MAAM,QAAQ,CAAC;AACvC,OAAO,GAAG,MAAM,KAAK,CAAC;AACtB,6DAA6D;AAC7D,mCAAmC;AACnC,OAAO,SAAS,MAAM,eAAe,CAAC;AAMtC,eAAe,GAAG,CAAC,MAAM,CAAC;IACxB,IAAI,EAAE,mBAAmB;IAEzB,KAAK,EAAE;QACL,OAAO,EAAE,EAAE,IAAI,EAAE,MAAiC,EAAE,QAAQ,EAAE,IAAI,EAAE;KACrE;IAED,KAAK,EAAE;QACL,KAAK,CAAC,QAAgB;YACpB,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;QACtC,CAAC;QAED,OAAO,CAAC,QAAuB;YAC7B,IAAI,CAAC,MAAM,CAAC,QAAQ,EAAE,IAAI,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;KACF;IAED,OAAO;QACL,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,KAAK,CAAC,CAAC;IAC5C,CAAC;IAED,OAAO,EAAE;QACP,6DAA6D;QAC7D,UAAU,CAAC,WAA0B,EAAE,KAAa;YAClD,2BAA2B;YAC1B,SAAqE,+BACpE,SAAS,EAAE,YAAY,EACvB,WAAW,EAAG,IAAI,CAAC,KAAK,CAAC,WAAW,CAAC,iBAA4B,IAAI,EAAE,IACpE,WAAW,KACd,gBAAgB,EAAE,WAAW,CAAC,gBAAgB,IAAI,EAAE;gBAEpD,mCAAmC;gBACnC,cAAc,EAAE,CAAC,KAAK,EAAE,EAAE,CACxB,KAAK,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,EAAE,CAAC,iCACf,IAAI;oBACP,wDAAwD;oBACxD,gDAAgD;oBAChD,GAAG,EAAE,IAAI,CAAC,uBAAuB,CAAC,IAAI,CAAC,GAAG,CAAC,IAC3C,CAAC;gBAEL,yDAAyD;gBACzD,YAAY,EAAE,CAAC,EAAE,GAAG,EAAE,QAAQ,EAAE,EAAE,EAAE,CAClC,aAAa,CACX,GAAG,EACH;oBACE,IAAI,EAAE,GAAG,CAAC,GAAG;oBACb,OAAO,EAAE,CAAC,KAAY,EAAQ,EAAE;wBAC9B,oDAAoD;wBACpD,yDAAyD;wBACzD,sCAAsC;wBACtC,IAAI,IAAI,CAAC,MAAM,CAAC,QAAQ,KAAK,GAAG,CAAC,GAAG;4BAAE,OAAO;wBAE7C,MAAM,QAAQ,GAAG,GAAG,MAAM,CAAC,QAAQ,CAAC,MAAM,GAAG,GAAG,CAAC,GAAG,EAAE,CAAC;wBACvD,MAAM,EAAE,QAAQ,EAAE,WAAW,EAAE,GAAG,IAAI,GAAG,CAAC,QAAQ,CAAC,CAAC;wBAEpD,wEAAwE;wBACxE,8CAA8C;wBAC9C,IAAI,IAAI,CAAC,MAAM,CAAC,IAAI,KAAK,WAAW;4BAAE,KAAK,CAAC,cAAc,EAAE,CAAC;wBAE7D,IACE,IAAI,CAAC,OAAO;6BACT,SAAS,EAAE;6BACX,IAAI,CACH,CAAC,KAAK,EAAE,EAAE,CACR,KAAK,CAAC,IAAI,CAAC,OAAO,CAAC,cAAc,EAAE,EAAE,CAAC,KAAK,WAAW,CACzD;4BAEH,KAAK,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;;4BAC7B,MAAM,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;oBAC7B,CAAC;iBACF,EACD,QAAQ,CACT,EAEH,SAAS,EAAE;oBACT,QAAQ,EAAE,CAAC,EAAE,OAAO,EAAE,EAAQ,EAAE;wBAC9B,MAAM,QAAQ,GAAG,GAAG,MAAM,CAAC,QAAQ,CAAC,MAAM,GAAG,OAAO,EAAE,CAAC;wBACvD,MAAM,EAAE,QAAQ,EAAE,WAAW,EAAE,GAAG,IAAI,GAAG,CAAC,QAAQ,CAAC,CAAC;wBAEpD,2DAA2D;wBAC3D,yDAAyD;wBACzD,IAAI,IAAI,CAAC,MAAM,CAAC,IAAI,KAAK,WAAW;4BAClC,MAAM,CAAC,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;6BAC9B,IACH,IAAI,CAAC,OAAO;6BACT,SAAS,EAAE;6BACX,IAAI,CAAC,CAAC,KAAK,EAAE,EAAE,CAAC,KAAK,CAAC,IAAI,KAAK,WAAW,CAAC;4BAE9C,KAAK,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;;4BAC7B,MAAM,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;oBAC7B,CAAC;oBACD,cAAc,CAAC,EAAE,OAAO,EAAE;wBACxB,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;oBACvB,CAAC;oBACD,iBAAiB,CAAC,EAAE,OAAO,EAAE;wBAC3B,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;oBACvB,CAAC;iBACF,IACD,CAAC;QACL,CAAC;QAED,uBAAuB,CAAC,WAAmB;YACzC,MAAM,EAAE,QAAQ,EAAE,IAAI,EAAE,GAAG,IAAI,GAAG,CAAC,WAAW,CAAC,CAAC;YAEhD,OAAO,GAAG,QAAQ,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,GAAG,CAAC,GAAG,IAAI,EAAE,CAAC;QAC5D,CAAC;QAED,MAAM,CAAC,OAAsB,EAAE,IAAY;YACzC,IAAI,CAAC,GAAG,CAAC,SAAS,GAAG,4BAA4B,CAAC;YAClD,IAAI,CAAC,UAAU,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;QACjC,CAAC;KACF;CACF,CAAC,CAAC"}
\ No newline at end of file
import Vue from "vue";
import { SidebarHeader } from "@theme/utils/sidebar";
declare const _default: import("vue/types/vue").ExtendedVue<Vue, {}, {}, {}, {
items: SidebarHeader[];
}>;
export default _default;
{"version":3,"file":"Anchor.js","sourceRoot":"","sources":["Anchor.ts"],"names":[],"mappings":"AAAA,OAAO,GAAG,MAAM,KAAK,CAAC;AAEtB,OAAO,EAAE,QAAQ,EAAE,MAAM,mBAAmB,CAAC;AAW7C,MAAM,UAAU,GAAG,CACjB,CAAgB,EAChB,EAAE,IAAI,EAAE,IAAI,EAAE,KAAK,EAAc,EAC1B,EAAE,CACT,CAAC,CACC,YAAY,EACZ;IACE,KAAK,EAAE;QACL,EAAE,EAAE,IAAI;QACR,WAAW,EAAE,EAAE;QACf,gBAAgB,EAAE,EAAE;KACrB;IACD,KAAK,EAAE;QACL,aAAa,EAAE,IAAI;QACnB,CAAC,KAAK,CAAC,CAAC,CAAC,UAAU,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,KAAK;KACxC;CACF,EACD,CAAC,CAAC,CAAC,KAAK,EAAE,EAAE,EAAE,CAAC,IAAI,CAAC,CAAC,CAAC,CACvB,CAAC;AAOJ,MAAM,cAAc,GAAG,CACrB,CAAgB,EAChB,EAAE,QAAQ,EAAE,KAAK,EAAyB,EACnC,EAAE,CACT,CAAC,CACC,IAAI,EACJ,EAAE,KAAK,EAAE,aAAa,EAAE,EACxB,QAAQ,CAAC,GAAG,CAAC,CAAC,KAAoB,EAAE,EAAE;IACpC,MAAM,MAAM,GAAG,QAAQ,CAAC,KAAK,EAAE,GAAG,KAAK,CAAC,IAAI,IAAI,KAAK,CAAC,IAAI,EAAE,CAAC,CAAC;IAE9D,OAAO,CAAC,CAAC,IAAI,EAAE,EAAE,KAAK,EAAE,EAAE,MAAM,EAAE,IAAI,EAAE,MAAM,EAAE,EAAE,EAAE;QAClD,UAAU,CAAC,CAAC,EAAE;YACZ,IAAI,EAAE,KAAK,CAAC,KAAK;YACjB,IAAI,EAAE,GAAG,KAAK,CAAC,IAAI,IAAI,KAAK,CAAC,IAAI,EAAE;YACnC,KAAK,EAAE,KAAK,CAAC,KAAK;SACnB,CAAC;KACH,CAAC,CAAC;AACL,CAAC,CAAC,CACH,CAAC;AAEJ,eAAe,GAAG,CAAC,MAAM,CAAC;IACxB,IAAI,EAAE,QAAQ;IAEd,UAAU,EAAE,IAAI;IAEhB,KAAK,EAAE;QACL,KAAK,EAAE;YACL,IAAI,EAAE,KAAkC;YACxC,OAAO,EAAE,GAAG,EAAE,CAAC,EAAE;SAClB;KACF;IAED,MAAM,CAAC,CAAC,EAAE,EAAE,KAAK,EAAE,MAAM,EAAE,EAAE,KAAK,EAAE,MAAM,EAAE,EAAE;QAC5C,OAAO,CAAC,CAAC,KAAK,EAAE,EAAE,KAAK,EAAE,EAAE,KAAK,EAAE,qBAAqB,EAAE,EAAE,EAAE;YAC3D,CAAC,CAAC,OAAO,EAAE,EAAE,KAAK,EAAE,EAAE,EAAE,EAAE,QAAQ,EAAE,EAAE,EAAE;gBACtC,CAAC,CAAC,KAAK,EAAE,EAAE,KAAK,EAAE,gBAAgB,EAAE,EAAE;oBACpC,KAAK,CAAC,KAAK,CAAC,MAAM;wBAChB,CAAC,CAAC,cAAc,CAAC,CAAC,EAAE;4BAChB,QAAQ,EAAE,KAAK,CAAC,KAAK;4BACrB,KAAK,EAAE,MAAM;yBACd,CAAC;wBACJ,CAAC,CAAC,KAAK,CAAC,OAAO;4BACf,CAAC,CAAC,cAAc,CAAC,CAAC,EAAE;gCAChB,QAAQ,EAAE,KAAK,CAAC,OAAO;gCACvB,KAAK,EAAE,MAAM;6BACd,CAAC;4BACJ,CAAC,CAAC,IAAI;iBACT,CAAC;aACH,CAAC;SACH,CAAC,CAAC;IACL,CAAC;CACF,CAAC,CAAC"}
\ No newline at end of file
{"version":3,"file":"ArticleInfo.js","sourceRoot":"","sources":["ArticleInfo.ts"],"names":[],"mappings":"AAAA,OAAO,GAAG,MAAM,KAAK,CAAC;AACtB,OAAO,EAAE,UAAU,EAAE,MAAM,0BAA0B,CAAC;AACtD,OAAO,UAAU,MAAM,kEAAkE,CAAC;AAC1F,OAAO,YAAY,MAAM,oEAAoE,CAAC;AAC9F,OAAO,YAAY,MAAM,8DAA8D,CAAC;AACxF,OAAO,OAAO,MAAM,yDAAyD,CAAC;AAC9E,OAAO,SAAS,MAAM,iEAAiE,CAAC;AAKxF,eAAe,GAAG,CAAC,MAAM,CAAC;IACxB,IAAI,EAAE,aAAa;IAEnB,UAAU,EAAE;QACV,UAAU;QACV,YAAY;QACZ,YAAY;QACZ,OAAO;QACP,SAAS;KACV;IAED,KAAK,EAAE;QACL,OAAO,EAAE,EAAE,IAAI,EAAE,MAAgC,EAAE,QAAQ,EAAE,IAAI,EAAE;KACpE;IAED,QAAQ,EAAE;QACR,MAAM;YACJ,OAAO,CACL,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,MAAM;gBAC/B,CAAC,IAAI,CAAC,YAAY,CAAC,MAAM,IAAI,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,MAAM,KAAK,KAAK;oBACpE,CAAC,CAAC,IAAI,CAAC,YAAY,CAAC,MAAM;oBAC1B,CAAC,CAAC,EAAE,CAAC,CACR,CAAC;QACJ,CAAC;QAED,IAAI;YACF,MAAM,EAAE,IAAI,EAAE,IAAI,GAAG,IAAI,EAAE,GAAG,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC;YAEvD,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;gBAC5B,IAAI,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC,EAAE;oBAC5B,MAAM,CAAC,UAAU,EAAE,IAAI,CAAC,GAAG,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;oBAC3C,MAAM,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;oBAEhC,OAAO,GAAG,UAAU,IAAI,KAAK,KAAK,UAAU,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;iBAC7D;gBAED,OAAO,IAAI,CAAC;aACb;YAED,OAAO,IAAI,CAAC,OAAO,CAAC,UAAU,IAAI,EAAE,CAAC;QACvC,CAAC;QAED,IAAI;YACF,MAAM,EAAE,GAAG,EAAE,IAAI,GAAG,GAAG,EAAE,GAAG,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC;YAErD,IAAI,OAAO,IAAI,KAAK,QAAQ;gBAAE,OAAO,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC,CAAC;YAExD,IAAI,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC;gBAAE,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,EAAE,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC,CAAC;YAErE,OAAO,EAAE,CAAC;QACZ,CAAC;QAED,kBAAkB;YAChB,OAAO,KAAK,IAAI,CAAC,GAAG,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,KAAK,CAAC,WAAW,CAAC,OAAO,CAAC,EAAE,CAAC,CAAC,GAAG,CAAC;QACzE,CAAC;QAED,WAAW;YACT,MAAM,EAAE,MAAM,EAAE,IAAI,EAAE,GAAG,iBAAiB,CAAC,IAAI,CAAC,WAAW,IAAI,GAAG,CAAC,CAAC;YAEpE,OAAO,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,OAAO,GAAG,CAAC;gBACzC,CAAC,CAAC,MAAM;gBACR,CAAC,CAAC,IAAI,CAAC,OAAO,CACV,OAAO,EACP,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC,QAAQ,EAAE,CACxD,CAAC;QACR,CAAC;QAED,UAAU;YACR,OAAO,cAAc,CAAC,IAAI,CAAC,WAAW,IAAI,GAAG,CAAC,CAAC,MAAM,CAAC;QACxD,CAAC;QAED,QAAQ;YACN,OAAO,cAAc,CAAC,IAAI,CAAC,WAAW,IAAI,GAAG,CAAC,CAAC,IAAI,CAAC;QACtD,CAAC;QAED,OAAO;YACL,OAAO,cAAc,CAAC,IAAI,CAAC,WAAW,IAAI,GAAG,CAAC,CAAC,GAAG,CAAC;QACrD,CAAC;QAED,eAAe;YACb,OAAO,cAAc,CAAC,IAAI,CAAC,WAAW,IAAI,GAAG,CAAC,CAAC,WAAW,CAAC;QAC7D,CAAC;KACF;CACF,CAAC,CAAC"}
\ No newline at end of file
import Vue from "vue";
import type { PageComputed } from "@mr-hope/vuepress-types";
declare const _default: import("vue/types/vue").ExtendedVue<Vue, unknown, unknown, {
isEncrypted: boolean;
excerpt: string;
}, {
article: PageComputed;
}>;
export default _default;