exchain / nebula · Commit ed234a8d (Unverified)
Authored Feb 08, 2022 by smartcontracts; committed by GitHub on Feb 08, 2022
Merge pull request #2158 from ethereum-optimism/sc/rm-regenesis-surgery

maintenance: remove regenesis-surgery package

Parents: 69779a27, 3e31c2f1
Showing 35 changed files with 16 additions and 3576 deletions (+16 −3576)
- ops/Makefile (+0 −4)
- ops/docker/Dockerfile.monorepo (+0 −1)
- ops/docker/Dockerfile.packages (+0 −1)
- ops/docker/Dockerfile.regenesis-surgery (+0 −30)
- packages/regenesis-surgery/.env.example (+0 −10)
- packages/regenesis-surgery/.eslintrc.js (+0 −4)
- packages/regenesis-surgery/.gitignore (+0 −8)
- packages/regenesis-surgery/.lintstagedrc.yml (+0 −2)
- packages/regenesis-surgery/.prettierrc.js (+0 −3)
- packages/regenesis-surgery/CHANGELOG.md (+0 −33)
- packages/regenesis-surgery/LICENSE (+0 −22)
- packages/regenesis-surgery/README.md (+0 −34)
- packages/regenesis-surgery/package.json (+0 −70)
- packages/regenesis-surgery/scripts/classifiers.ts (+0 −109)
- packages/regenesis-surgery/scripts/constants.ts (+0 −171)
- packages/regenesis-surgery/scripts/data.ts (+0 −209)
- packages/regenesis-surgery/scripts/handlers.ts (+0 −443)
- packages/regenesis-surgery/scripts/solc.ts (+0 −260)
- packages/regenesis-surgery/scripts/surgery.ts (+0 −128)
- packages/regenesis-surgery/scripts/types.ts (+0 −143)
- packages/regenesis-surgery/scripts/utils.ts (+0 −319)
- packages/regenesis-surgery/test/beforeall.spec.ts (+0 −5)
- packages/regenesis-surgery/test/delete.spec.ts (+0 −53)
- packages/regenesis-surgery/test/eoa.spec.ts (+0 −126)
- packages/regenesis-surgery/test/erc20.spec.ts (+0 −37)
- packages/regenesis-surgery/test/predeploy.spec.ts (+0 −248)
- packages/regenesis-surgery/test/provider.spec.ts (+0 −122)
- packages/regenesis-surgery/test/provider.ts (+0 −311)
- packages/regenesis-surgery/test/setup.ts (+0 −183)
- packages/regenesis-surgery/test/uniswap.spec.ts (+0 −244)
- packages/regenesis-surgery/test/utils.ts (+0 −34)
- packages/regenesis-surgery/test/verified.spec.ts (+0 −48)
- packages/regenesis-surgery/tsconfig.build.json (+0 −12)
- packages/regenesis-surgery/tsconfig.json (+0 −3)
- yarn.lock (+16 −146)
ops/Makefile

```diff
@@ -43,7 +43,3 @@ ps-metrics:
 		-f docker-compose-metrics.yml \
 		ps
 .PHONY: ps
-
-regenesis-surgery:
-	docker build -f ./docker/Dockerfile.regenesis-surgery \
-		-t ethereumoptimism/regenesis-surgery:latest ..
```
ops/docker/Dockerfile.monorepo

```diff
@@ -32,7 +32,6 @@ COPY packages/data-transport-layer/package.json ./packages/data-transport-layer/
 COPY packages/batch-submitter/package.json ./packages/batch-submitter/package.json
 COPY packages/message-relayer/package.json ./packages/message-relayer/package.json
 COPY packages/replica-healthcheck/package.json ./packages/replica-healthcheck/package.json
-COPY packages/regenesis-surgery/package.json ./packages/regenesis-surgery/package.json
 COPY integration-tests/package.json ./integration-tests/package.json
 RUN yarn install --frozen-lockfile
```
ops/docker/Dockerfile.packages

```diff
@@ -21,7 +21,6 @@ COPY packages/data-transport-layer/package.json ./packages/data-transport-layer/
 COPY packages/batch-submitter/package.json ./packages/batch-submitter/package.json
 COPY packages/message-relayer/package.json ./packages/message-relayer/package.json
 COPY packages/replica-healthcheck/package.json ./packages/replica-healthcheck/package.json
-COPY packages/regenesis-surgery/package.json ./packages/regenesis-surgery/package.json
 COPY integration-tests/package.json ./integration-tests/package.json
 RUN yarn install --frozen-lockfile
```
ops/docker/Dockerfile.regenesis-surgery (deleted, 100644 → 0)

```dockerfile
ARG LOCAL_REGISTRY=docker.io
ARG BUILDER_TAG=latest
FROM ${LOCAL_REGISTRY}/ethereumoptimism/builder:${BUILDER_TAG} AS builder

FROM node:16-alpine
RUN apk add --no-cache curl bash jq
WORKDIR /opt/optimism

# copy top level files
COPY --from=builder /optimism/*.json /optimism/yarn.lock ./
COPY --from=builder /optimism/node_modules ./node_modules

# copy deps (would have been nice if docker followed the symlinks required)
COPY --from=builder /optimism/packages/core-utils/package.json ./packages/core-utils/package.json
COPY --from=builder /optimism/packages/core-utils/dist ./packages/core-utils/dist
COPY --from=builder /optimism/packages/common-ts/package.json ./packages/common-ts/package.json
COPY --from=builder /optimism/packages/common-ts/dist ./packages/common-ts/dist
COPY --from=builder /optimism/packages/contracts/package.json ./packages/contracts/package.json
COPY --from=builder /optimism/packages/contracts/deployments ./packages/contracts/deployments
COPY --from=builder /optimism/packages/contracts/dist ./packages/contracts/dist
COPY --from=builder /optimism/packages/contracts/artifacts ./packages/contracts/artifacts

# copy the service
WORKDIR /opt/optimism/packages/regenesis-surgery
COPY --from=builder /optimism/packages/regenesis-surgery/package.json ./
COPY --from=builder /optimism/packages/regenesis-surgery/scripts ./scripts
COPY --from=builder /optimism/packages/regenesis-surgery/node_modules ./node_modules
```
packages/regenesis-surgery/.env.example (deleted, 100644 → 0)

```
REGEN__STATE_DUMP_FILE=
REGEN__ETHERSCAN_FILE=
REGEN__GENESIS_FILE=
REGEN__OUTPUT_FILE=
REGEN__L2_PROVIDER_URL=
REGEN__ETH_PROVIDER_URL=
REGEN__L1_TESTNET_PROVIDER_URL=
REGEN__L1_TESTNET_PRIVATE_KEY=
REGEN__START_INDEX=
REGEN__END_INDEX=
```
packages/regenesis-surgery/.eslintrc.js (deleted, 100644 → 0)

```js
module.exports = {
  extends: '../../.eslintrc.js',
  ignorePatterns: ['/data', '/solc-bin', '/solc-cache'],
}
```
packages/regenesis-surgery/.gitignore (deleted, 100644 → 0)

```
node_modules/
build/
solc-bin/
outputs/
etherscan/
state-dumps/
data/
solc-cache/
```
packages/regenesis-surgery/.lintstagedrc.yml (deleted, 100644 → 0)

```yaml
"*.{ts,js}":
  - eslint
```
packages/regenesis-surgery/.prettierrc.js (deleted, 100644 → 0)

```js
module.exports = {
  ...require('../../.prettierrc.js'),
};
```
packages/regenesis-surgery/CHANGELOG.md (deleted, 100644 → 0)

# @eth-optimism/regenesis-surgery

## 0.2.3

### Patch Changes

- ba14c59d: Updates various ethers dependencies to their latest versions

## 0.2.2

### Patch Changes

- 8e634b49: Fix package JSON issues

## 0.2.1

### Patch Changes

- 243f33e5: Standardize package json file format

## 0.2.0

### Minor Changes

- 8148d2fb: Add regenesis-surgery package and event-indexer script
- 81ccd6e4: `regenesis/0.5.0` release

### Patch Changes

- f9ea95bd: Fixes the compiler cache to prevent collisions between EVM and OVM outputs.
- b70ee70c: upgraded to solidity 0.8.9
- c38e4b57: Minor bugfixes to the regenesis process for OVM_ETH
- a98a1884: Fixes dependencies instead of using caret constraints
packages/regenesis-surgery/LICENSE (deleted, 100644 → 0)

(The MIT License)

Copyright 2020-2021 Optimism

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
packages/regenesis-surgery/README.md (deleted, 100644 → 0)

# @eth-optimism/regenesis-surgery

Scripts used to perform the transition process between OVMv1 and OVMv2.

## Installation

```sh
git clone git@github.com:ethereum-optimism/optimism.git
yarn clean
yarn install
yarn build
```

## Usage

1. Open `.env` and add values for all environment variables listed below.
2. Run `yarn start` to start the surgery process.
3. Grab a coffee or something.

## Environment Variables

| Variable                      | Description                                                    |
| ----------------------------- | -------------------------------------------------------------- |
| `REGEN__STATE_DUMP_FILE`      | Path to the state dump file                                    |
| `REGEN__ETHERSCAN_FILE`       | Path to the etherscan dump file                                |
| `REGEN__GENESIS_FILE`         | Path to the initial genesis file                               |
| `REGEN__OUTPUT_FILE`          | Path where the output genesis will be saved                    |
| `REGEN__L2_PROVIDER_URL`      | RPC provider for the L2 network being upgraded                 |
| `REGEN__ETH_PROVIDER_URL`     | RPC provider for Ethereum mainnet                              |
| `REGEN__ROPSTEN_PROVIDER_URL` | RPC provider for the Ropsten testnet                           |
| `REGEN__ROPSTEN_PRIVATE_KEY`  | Private key of an account that has Ropsten ETH                 |
| `REGEN__STATE_DUMP_HEIGHT`    | Height at which the state dump was taken                       |
| `REGEN__START_INDEX`          | Start index to begin processing the regenesis at (do not set)  |
| `REGEN__END_INDEX`            | End index to finish processing the regenesis at (do not set)   |
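As a sketch of how a script might consume the `REGEN__` variables above: the package's real loader lives in `scripts/utils.ts` (`loadConfigs`, not shown in this diff), so `requireEnv` below is a hypothetical fail-fast helper, not the actual implementation.

```typescript
// Hypothetical fail-fast reader for the REGEN__ environment variables.
// The real loadConfigs() in scripts/utils.ts is not part of this excerpt.
const requireEnv = (name: string): string => {
  const value = process.env[name]
  if (value === undefined || value === '') {
    throw new Error(`missing required env var: ${name}`)
  }
  return value
}

// Demonstration only: set a value, then read it back.
process.env.REGEN__STATE_DUMP_FILE = './state-dump.json'
console.log(requireEnv('REGEN__STATE_DUMP_FILE'))
```

Failing early on an empty variable is preferable here because the surgery process is long-running; a missing provider URL should abort before any state is read from disk.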
packages/regenesis-surgery/package.json (deleted, 100644 → 0)

```json
{
  "private": true,
  "name": "@eth-optimism/regenesis-surgery",
  "version": "0.2.3",
  "description": "[Optimism] Tooling for managing the OVM 1.0 to OVM 2.0 upgrade",
  "main": "dist/index",
  "types": "dist/index",
  "files": ["dist/*"],
  "scripts": {
    "clean": "rimraf ./dist ./tsconfig.build.tsbuildinfo",
    "lint": "yarn run lint:fix && yarn run lint:check",
    "lint:fix": "yarn lint:check --fix",
    "lint:check": "eslint . --max-warnings=0",
    "pre-commit": "lint-staged",
    "test:surgery": "ts-mocha --timeout 50000000 test/*",
    "start": "ts-node ./scripts/surgery.ts"
  },
  "keywords": ["optimism", "ethereum", "regenesis", "surgery"],
  "homepage": "https://github.com/ethereum-optimism/optimism/tree/develop/packages/regenesis-surgery#readme",
  "license": "MIT",
  "author": "Optimism PBC",
  "repository": {
    "type": "git",
    "url": "https://github.com/ethereum-optimism/optimism.git"
  },
  "devDependencies": {
    "@discoveryjs/json-ext": "^0.5.3",
    "@eth-optimism/core-utils": "0.7.6",
    "@ethersproject/abi": "^5.5.0",
    "@ethersproject/abstract-provider": "^5.5.1",
    "@ethersproject/bignumber": "^5.5.0",
    "@ethersproject/properties": "^5.5.0",
    "@ethersproject/providers": "^5.5.3",
    "@types/node": "^15.12.2",
    "@types/node-fetch": "^3.0.3",
    "@typescript-eslint/eslint-plugin": "^4.26.0",
    "@typescript-eslint/parser": "^4.26.0",
    "@uniswap/sdk-core": "^3.0.1",
    "@uniswap/v3-core": "^1.0.0",
    "@uniswap/v3-sdk": "^3.5.1",
    "babel-eslint": "^10.1.0",
    "byline": "^5.0.0",
    "chai": "^4.3.4",
    "chai-as-promised": "^7.1.1",
    "dotenv": "^10.0.0",
    "eslint-config-prettier": "^8.3.0",
    "eslint-plugin-import": "^2.23.4",
    "eslint-plugin-jsdoc": "^35.1.2",
    "eslint-plugin-prefer-arrow": "^1.2.3",
    "eslint-plugin-prettier": "^3.4.0",
    "eslint-plugin-react": "^7.24.0",
    "eslint-plugin-unicorn": "^32.0.1",
    "ethereum-waffle": "^3.4.0",
    "ethereumjs-util": "^7.1.3",
    "ethers": "^5.5.4",
    "lint-staged": "11.0.0",
    "mocha": "^9.1.2",
    "node-fetch": "2.6.7",
    "solc": "0.8.7-fixed",
    "ts-mocha": "^8.0.0",
    "ts-node": "^10.0.0"
  }
}
```
packages/regenesis-surgery/scripts/classifiers.ts (deleted, 100644 → 0)

```ts
import {
  EOA_CODE_HASHES,
  UNISWAP_V3_FACTORY_ADDRESS,
  UNISWAP_V3_NFPM_ADDRESS,
  UNISWAP_V3_CONTRACT_ADDRESSES,
  UNISWAP_V3_MAINNET_MULTICALL,
  PREDEPLOY_WIPE_ADDRESSES,
  PREDEPLOY_NO_WIPE_ADDRESSES,
  PREDEPLOY_NEW_NOT_ETH_ADDRESSES,
  OLD_ETH_ADDRESS,
  NEW_ETH_ADDRESS,
  ONEINCH_DEPLOYER_ADDRESS,
  DELETE_CONTRACTS,
} from './constants'
import { Account, AccountType, SurgeryDataSources } from './types'
import { hexStringEqual, isBytecodeERC20 } from './utils'

export const classifiers: {
  [key in AccountType]: (account: Account, data: SurgeryDataSources) => boolean
} = {
  [AccountType.ONEINCH_DEPLOYER]: (account) => {
    return hexStringEqual(account.address, ONEINCH_DEPLOYER_ADDRESS)
  },
  [AccountType.DELETE]: (account) => {
    return DELETE_CONTRACTS.some((addr) => {
      return hexStringEqual(account.address, addr)
    })
  },
  [AccountType.EOA]: (account) => {
    // Just in case the account doesn't have a code hash
    if (!account.codeHash) {
      return false
    }
    return EOA_CODE_HASHES.some((codeHash) => {
      return hexStringEqual(account.codeHash, codeHash)
    })
  },
  [AccountType.PRECOMPILE]: (account) => {
    return account.address
      .toLowerCase()
      .startsWith('0x00000000000000000000000000000000000000')
  },
  [AccountType.PREDEPLOY_NEW_NOT_ETH]: (account) => {
    return PREDEPLOY_NEW_NOT_ETH_ADDRESSES.some((addr) => {
      return hexStringEqual(account.address, addr)
    })
  },
  [AccountType.PREDEPLOY_WIPE]: (account) => {
    return PREDEPLOY_WIPE_ADDRESSES.some((addr) => {
      return hexStringEqual(account.address, addr)
    })
  },
  [AccountType.PREDEPLOY_NO_WIPE]: (account) => {
    return PREDEPLOY_NO_WIPE_ADDRESSES.some((addr) => {
      return hexStringEqual(account.address, addr)
    })
  },
  [AccountType.PREDEPLOY_ETH]: (account) => {
    return hexStringEqual(account.address, NEW_ETH_ADDRESS)
  },
  [AccountType.PREDEPLOY_WETH]: (account) => {
    return hexStringEqual(account.address, OLD_ETH_ADDRESS)
  },
  [AccountType.UNISWAP_V3_FACTORY]: (account) => {
    return hexStringEqual(account.address, UNISWAP_V3_FACTORY_ADDRESS)
  },
  [AccountType.UNISWAP_V3_NFPM]: (account) => {
    return hexStringEqual(account.address, UNISWAP_V3_NFPM_ADDRESS)
  },
  [AccountType.UNISWAP_V3_MAINNET_MULTICALL]: (account) => {
    return hexStringEqual(account.address, UNISWAP_V3_MAINNET_MULTICALL)
  },
  [AccountType.UNISWAP_V3_POOL]: (account, data) => {
    return data.pools.some((pool) => {
      return hexStringEqual(pool.oldAddress, account.address)
    })
  },
  [AccountType.UNISWAP_V3_OTHER]: (account) => {
    return UNISWAP_V3_CONTRACT_ADDRESSES.some((addr) => {
      return hexStringEqual(account.address, addr)
    })
  },
  [AccountType.UNVERIFIED]: (account, data) => {
    const found = data.etherscanDump.find(
      (c) => c.contractAddress === account.address
    )
    return found === undefined || found.sourceCode === ''
  },
  [AccountType.VERIFIED]: (account, data) => {
    return !classifiers[AccountType.UNVERIFIED](account, data)
  },
  [AccountType.ERC20]: (account) => {
    return isBytecodeERC20(account.code)
  },
}

export const classify = (
  account: Account,
  data: SurgeryDataSources
): AccountType => {
  for (const accountType in AccountType) {
    if (!isNaN(Number(accountType))) {
      if (classifiers[accountType](account, data)) {
        return Number(accountType)
      }
    }
  }
}
```
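The `classify` loop above filters keys with `!isNaN(Number(accountType))`. A standalone sketch of why that filter is needed (using a hypothetical two-member `Kind` enum, not the package's `AccountType`): numeric TypeScript enums compile to objects that contain reverse mappings, so iterating the enum object yields both the numeric values and the member names.

```typescript
// A numeric enum compiles to { 0: 'EOA', 1: 'DELETE', EOA: 0, DELETE: 1 },
// so for...in sees both numeric keys and name keys.
enum Kind {
  EOA,    // = 0
  DELETE, // = 1
}

const keys: string[] = []
for (const k in Kind) {
  keys.push(k)
}

// Keep only the numeric keys, exactly as classify() does.
const numericKeys = keys.filter((k) => !isNaN(Number(k)))
console.log(JSON.stringify(numericKeys)) // ["0","1"]
```

Without the filter, `classifiers['EOA']` would be looked up by name and return `undefined`, crashing the loop; filtering to numeric keys restricts iteration to the enum's values.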
packages/regenesis-surgery/scripts/constants.ts (deleted, 100644 → 0)

```ts
import path from 'path'

// Codehashes of OVM_ECDSAContractAccount for 0.3.0 and 0.4.0
export const EOA_CODE_HASHES = [
  '0xa73df79c90ba2496f3440188807022bed5c7e2e826b596d22bcb4e127378835a',
  '0xef2ab076db773ffc554c9f287134123439a5228e92f5b3194a28fec0a0afafe3',
]

export const UNISWAP_V3_FACTORY_ADDRESS =
  '0x1F98431c8aD98523631AE4a59f267346ea31F984'
export const UNISWAP_V3_NFPM_ADDRESS =
  '0xC36442b4a4522E871399CD717aBDD847Ab11FE88'
export const UNISWAP_V3_CONTRACT_ADDRESSES = [
  // PoolDeployer
  '0x569E8D536EC2dD5988857147c9FCC7d8a08a7DBc',
  // UniswapV3Factory
  '0x1F98431c8aD98523631AE4a59f267346ea31F984',
  // ProxyAdmin
  '0xB753548F6E010e7e680BA186F9Ca1BdAB2E90cf2',
  // TickLens
  '0xbfd8137f7d1516D3ea5cA83523914859ec47F573',
  // Quoter
  '0xb27308f9F90D607463bb33eA1BeBb41C27CE5AB6',
  // SwapRouter
  '0xE592427A0AEce92De3Edee1F18E0157C05861564',
  // NonfungiblePositionLibrary
  '0x42B24A95702b9986e82d421cC3568932790A48Ec',
  // NonfungibleTokenPositionDescriptor
  '0x91ae842A5Ffd8d12023116943e72A606179294f3',
  // TransparentUpgradeableProxy
  '0xEe6A57eC80ea46401049E92587E52f5Ec1c24785',
  // NonfungibleTokenPositionManager
  '0xC36442b4a4522E871399CD717aBDD847Ab11FE88',
  // UniswapInterfaceMulticall (OP KOVAN)
  '0x1F98415757620B543A52E61c46B32eB19261F984',
]
export const UNISWAP_V3_KOVAN_MULTICALL =
  '0x1F98415757620B543A52E61c46B32eB19261F984'
export const UNISWAP_V3_MAINNET_MULTICALL =
  '0x90f872b3d8f33f305e0250db6A2761B354f7710A'

export const PREDEPLOY_WIPE_ADDRESSES = [
  // L2CrossDomainMessenger
  '0x4200000000000000000000000000000000000007',
  // OVM_GasPriceOracle
  '0x420000000000000000000000000000000000000F',
  // L2StandardBridge
  '0x4200000000000000000000000000000000000010',
  // OVM_SequencerFeeVault
  '0x4200000000000000000000000000000000000011',
]
export const PREDEPLOY_NO_WIPE_ADDRESSES = [
  // OVM_DeployerWhitelist
  '0x4200000000000000000000000000000000000002',
  // OVM_L2ToL1MessagePasser
  '0x4200000000000000000000000000000000000000',
]
export const PREDEPLOY_NEW_NOT_ETH_ADDRESSES = [
  // L2StandardTokenFactory
  '0x4200000000000000000000000000000000000012',
  // OVM_L1BlockNumber
  '0x4200000000000000000000000000000000000013',
]
export const OLD_ETH_ADDRESS = '0x4200000000000000000000000000000000000006'
export const NEW_ETH_ADDRESS = '0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000'
export const ONEINCH_DEPLOYER_ADDRESS =
  '0xee4f7b6c39e7e87af01fb9e4cee0c893ff4d63f2'

export const DELETE_CONTRACTS = [
  // 1inch aggregator
  '0x11111112542D85B3EF69AE05771c2dCCff4fAa26',
  // OVM_L1MessageSender
  '0x4200000000000000000000000000000000000001',
  // OVM v1 System Contract
  '0xDEADDEaDDeAddEADDeaDDEADdeaDdeAddeAd0005',
  // OVM v1 System Contract
  '0xDEADdeAdDeAddEAdDEaDdEaddEAddeaDdEaD0006',
  // OVM v1 System Contract
  '0xDeaDDeaDDeaddEADdeaDdEadDeaDdeADDEad0007',
  // Uniswap Position
  '0x18F7E3ae7202e93984290e1195810c66e1E276FF',
  // Uniswap Oracle
  '0x17b0f5e5850e7230136df66c5d49497b8c3be0c1',
  // Uniswap Tick
  '0x47405b0d5f88e16701be6dc8ae185fefaa5dca2f',
  // Uniswap TickBitmap
  '0x01d95165c3c730d6b40f55c37e24c7aac73d5e6f',
  // Uniswap TickMath
  '0x308c3e60585ad4eab5b7677be0566fead4cb4746',
  // Uniswap SwapMath
  '0x198dcc7cd919dd33dd72c3f981df653750901d75',
  // Uniswap UniswapV3PoolDeployer
  '0x569e8d536ec2dd5988857147c9fcc7d8a08a7dbc',
  // Uniswap NFTDescriptor
  '0x042f51014b152c2d2fc9b57e36b16bc744065d8c',
]

export const WETH_TRANSFER_ADDRESSES = [
  // Rubicon Mainnet bathETH
  '0xB0bE5d911E3BD4Ee2A8706cF1fAc8d767A550497',
  // Rubicon Mainnet bathETH-USDC
  '0x87a7Eed69eaFA78D30344001D0baFF99FC005Dc8',
  // Rubicon Mainnet bathETH-DAI
  '0x314eC4Beaa694264746e1ae324A5edB913a6F7C6',
  // Rubicon Mainnet bathETH-USDT
  '0xF6A47B24e80D12Ac7d3b5Cef67B912BCd3377333',
  // Rubicon Mainnet exchange
  '0x7a512d3609211e719737E82c7bb7271eC05Da70d',
  // Rubicon Mainnet bathUSDC
  '0xe0e112e8f33d3f437D1F895cbb1A456836125952',
  // Rubicon Mainnet bathDAI
  '0x60daEC2Fc9d2e0de0577A5C708BcaDBA1458A833',
  // Rubicon Mainnet bathUSDT
  '0xfFBD695bf246c514110f5DAe3Fa88B8c2f42c411',
  // Rubicon Kovan bathETH
  '0x5790AedddfB25663f7dd58261De8E96274A82BAd',
  // Rubicon Kovan bathETH-USDC
  '0x52fBa53c876a47a64A10F111fbeA7Ed506dCc7e7',
  // Rubicon Kovan bathETH-DAI
  '0xA92E4Bd9f61e90757Cd8806D236580698Fc20C91',
  // Rubicon Kovan bathETH-USDT
  '0x80D94a6f6b0335Bfed8D04B92423B6Cd14b5d31C',
  // Rubicon Kovan market
  '0x5ddDa7DF721272106af1904abcc64E76AB2019d2',
  // Hop Mainnet AMM Wrapper
  '0x86cA30bEF97fB651b8d866D45503684b90cb3312',
  // Hop MainnetSwap
  '0xaa30d6bba6285d0585722e2440ff89e23ef68864',
  // Hop Kovan AMM Wrapper
  '0xc9E6628791cdD4ad568550fcc6f378cEF27e98fd',
  // Hop Kovan Swap
  '0xD6E31cE884DFf44c4600fD9D36BcC9af447C28d5',
  // Synthetix Mainnet WETHWrapper
  '0x6202a3b0be1d222971e93aab084c6e584c29db70',
]

export const COMPILER_VERSIONS_TO_SOLC = {
  'v0.5.16': 'v0.5.16+commit.9c3226ce',
  'v0.5.16-alpha.7': 'v0.5.16+commit.9c3226ce',
  'v0.6.12': 'v0.6.12+commit.27d51765',
  'v0.7.6': 'v0.7.6+commit.7338295f',
  'v0.7.6+commit.3b061308': 'v0.7.6+commit.7338295f',
  'v0.7.6-allow_kall': 'v0.7.6+commit.7338295f',
  'v0.7.6-no_errors': 'v0.7.6+commit.7338295f',
  'v0.8.4': 'v0.8.4+commit.c7e474f2',
}

export const SOLC_BIN_PATH = 'https://binaries.soliditylang.org'
export const EMSCRIPTEN_BUILD_PATH = `${SOLC_BIN_PATH}/emscripten-wasm32`
export const EMSCRIPTEN_BUILD_LIST = `${EMSCRIPTEN_BUILD_PATH}/list.json`
export const LOCAL_SOLC_DIR = path.join(__dirname, '..', 'solc-bin')
export const EVM_SOLC_CACHE_DIR = path.join(__dirname, '..', 'solc-cache', 'evm')
export const OVM_SOLC_CACHE_DIR = path.join(__dirname, '..', 'solc-cache', 'ovm')
```
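The `COMPILER_VERSIONS_TO_SOLC` table above normalizes the OVM-specific compiler tags seen in verified-contract metadata (for example `v0.7.6-allow_kall`) to canonical upstream solc builds. A standalone sketch of that lookup (`resolveSolcVersion` is a hypothetical helper; the package's real compiler handling lives in `scripts/solc.ts`, which is not shown in this excerpt):

```typescript
// Map of non-standard / OVM compiler tags to canonical solc builds,
// copied from COMPILER_VERSIONS_TO_SOLC above.
const COMPILER_VERSIONS_TO_SOLC: { [tag: string]: string } = {
  'v0.5.16': 'v0.5.16+commit.9c3226ce',
  'v0.5.16-alpha.7': 'v0.5.16+commit.9c3226ce',
  'v0.6.12': 'v0.6.12+commit.27d51765',
  'v0.7.6': 'v0.7.6+commit.7338295f',
  'v0.7.6+commit.3b061308': 'v0.7.6+commit.7338295f',
  'v0.7.6-allow_kall': 'v0.7.6+commit.7338295f',
  'v0.7.6-no_errors': 'v0.7.6+commit.7338295f',
  'v0.8.4': 'v0.8.4+commit.c7e474f2',
}

// Hypothetical helper: resolve a tag or fail loudly on an unknown compiler.
const resolveSolcVersion = (tag: string): string => {
  const resolved = COMPILER_VERSIONS_TO_SOLC[tag]
  if (resolved === undefined) {
    throw new Error(`unknown compiler version: ${tag}`)
  }
  return resolved
}

console.log(resolveSolcVersion('v0.7.6-allow_kall')) // v0.7.6+commit.7338295f
```

Several OVM forks collapse onto one upstream build because, for recompilation purposes, they emit compatible EVM output.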
packages/regenesis-surgery/scripts/data.ts (deleted, 100644 → 0)

```ts
import { ethers } from 'ethers'
import {
  computePoolAddress,
  POOL_INIT_CODE_HASH,
  POOL_INIT_CODE_HASH_OPTIMISM,
  POOL_INIT_CODE_HASH_OPTIMISM_KOVAN,
} from '@uniswap/v3-sdk'
import { Token } from '@uniswap/sdk-core'
import { UNISWAP_V3_FACTORY_ADDRESS } from './constants'
import { downloadAllSolcVersions } from './solc'
import {
  PoolHashCache,
  StateDump,
  UniswapPoolData,
  SurgeryDataSources,
  EtherscanContract,
  SurgeryConfigs,
  GenesisFile,
} from './types'
import {
  loadConfigs,
  checkStateDump,
  readDumpFile,
  readEtherscanFile,
  readGenesisFile,
  getUniswapV3Factory,
  getMappingKey,
} from './utils'

export const getUniswapPoolData = async (
  l2Provider: ethers.providers.BaseProvider,
  network: 'mainnet' | 'kovan'
): Promise<UniswapPoolData[]> => {
  if (!network) {
    throw new Error('Must provide network "mainnet" or "kovan"')
  }
  const UniswapV3Factory = getUniswapV3Factory(l2Provider)
  const pools: UniswapPoolData[] = []
  const poolEvents = await UniswapV3Factory.queryFilter('PoolCreated' as any)
  for (const event of poolEvents) {
    // Compute the old pool address using the OVM init code hash.
    const oldPoolAddress = computePoolAddress({
      factoryAddress: UNISWAP_V3_FACTORY_ADDRESS,
      tokenA: new Token(0, event.args.token0, 18),
      tokenB: new Token(0, event.args.token1, 18),
      fee: event.args.fee,
      initCodeHashManualOverride:
        network === 'mainnet'
          ? POOL_INIT_CODE_HASH_OPTIMISM
          : POOL_INIT_CODE_HASH_OPTIMISM_KOVAN,
    }).toLowerCase()
    // Compute the new pool address using the EVM init code hash.
    const newPoolAddress = computePoolAddress({
      factoryAddress: UNISWAP_V3_FACTORY_ADDRESS,
      tokenA: new Token(0, event.args.token0, 18),
      tokenB: new Token(0, event.args.token1, 18),
      fee: event.args.fee,
      initCodeHashManualOverride: POOL_INIT_CODE_HASH,
    }).toLowerCase()
    pools.push({
      oldAddress: oldPoolAddress,
      newAddress: newPoolAddress,
      token0: event.args.token0,
      token1: event.args.token1,
      fee: event.args.fee,
    })
  }
  return pools
}

export const makePoolHashCache = (pools: UniswapPoolData[]): PoolHashCache => {
  const cache: PoolHashCache = {}
  for (const pool of pools) {
    for (let i = 0; i < 1000; i++) {
      cache[getMappingKey([pool.oldAddress], i)] = {
        pool,
        index: i,
      }
    }
  }
  return cache
}

const getChainId = async (
  provider: ethers.providers.JsonRpcProvider
): Promise<number> => {
  const response = await provider.send('eth_chainId', [])
  return ethers.BigNumber.from(response).toNumber()
}

export const loadSurgeryData = async (
  configs?: SurgeryConfigs
): Promise<SurgeryDataSources> => {
  // First download every solc version that we'll need during this surgery.
  console.log('Downloading all required solc versions...')
  await downloadAllSolcVersions()

  // Load the configuration values, will throw if anything is missing.
  if (configs === undefined) {
    console.log('Loading configuration values...')
    configs = loadConfigs()
  }

  // Get a reference to an ETH (mainnet) provider.
  console.log('Connecting to ETH provider...')
  const ethProvider = new ethers.providers.JsonRpcProvider(
    configs.ethProviderUrl
  )
  const mainnetChainId = await getChainId(ethProvider)
  if (mainnetChainId !== 1) {
    throw new Error(
      `Mainnet chain id incorrect, got ${mainnetChainId} and expected 1`
    )
  }

  // Get a reference to the L2 provider so we can load pool data.
  // Do validation on the chain id before reading data from disk
  // because that is slow
  console.log('Connecting to L2 provider...')
  const l2Provider = new ethers.providers.JsonRpcProvider(configs.l2ProviderUrl)
  const l2ChainId = await getChainId(l2Provider)
  if (l2ChainId === 10) {
    configs.l2NetworkName = 'mainnet'
  } else if (l2ChainId === 69) {
    configs.l2NetworkName = 'kovan'
  } else {
    throw new Error(`Unknown l2 chain id: ${l2ChainId}`)
  }
  console.log(`Using network ${configs.l2NetworkName}`)

  // Load and validate the state dump.
  console.log('Loading and validating state dump file...')
  const dump: StateDump = await readDumpFile(configs.stateDumpFilePath)
  checkStateDump(dump)
  console.log(`${dump.length} entries in state dump`)

  // Load the genesis file.
  console.log('Loading genesis file...')
  const genesis: GenesisFile = await readGenesisFile(configs.genesisFilePath)
  if (genesis.config.chainId !== l2ChainId) {
    // Don't throw here because we might need to do a staging environment with a different chain ID
    console.log(
      `WARNING: Genesis File at ${configs.genesisFilePath} has chain id mismatch with remote L2 node` +
        ` got ${genesis.config.chainId} locally and ${l2ChainId} remotely`
    )
  }
  const genesisDump: StateDump = []
  for (const [address, account] of Object.entries(genesis.alloc)) {
    genesisDump.push({
      address,
      ...account,
    })
  }
  console.log(`${genesisDump.length} entries in genesis file`)

  // Load the etherscan dump.
  console.log('Loading etherscan dump file...')
  const etherscanDump: EtherscanContract[] = await readEtherscanFile(
    configs.etherscanFilePath
  )
  console.log(`${etherscanDump.length} entries in etherscan dump`)

  // Load the pool data.
  console.log('Loading Uniswap pool data...')
  const pools: UniswapPoolData[] = await getUniswapPoolData(
    l2Provider,
    configs.l2NetworkName
  )
  console.log(`${pools.length} uniswap pools`)
  console.log('Generating pool cache...')
  const poolHashCache = makePoolHashCache(pools)

  // Get a reference to the ropsten provider and wallet, used for deploying Uniswap pools.
  console.log('Connecting to ropsten provider...')
  const ropstenProvider = new ethers.providers.JsonRpcProvider(
    configs.ropstenProviderUrl
  )
  const ropstenWallet = new ethers.Wallet(
    configs.ropstenPrivateKey,
    ropstenProvider
  )
  const ropstenChainId = await ropstenWallet.getChainId()
  if (ropstenChainId !== 3) {
    throw new Error(
      `Ropsten chain id incorrect, got ${ropstenChainId} and expected 3`
    )
  }

  return {
    configs,
    dump,
    genesis,
    genesisDump,
    pools,
    poolHashCache,
    etherscanDump,
    ropstenProvider,
    ropstenWallet,
    l2Provider,
    ethProvider,
  }
}
```
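The `getChainId` helper above converts the hex quantity that `eth_chainId` returns into a number, and `loadSurgeryData` then maps chain id 10 to `mainnet` and 69 to `kovan`. A minimal standalone sketch of that conversion and mapping, with `parseInt` standing in for `ethers.BigNumber.from(...).toNumber()` (the names `toChainId` and `networkName` are illustrative, not from the package):

```typescript
// eth_chainId returns a 0x-prefixed hex quantity, e.g. '0xa' for chain 10.
// parseInt(hex, 16) stands in for ethers.BigNumber.from(hex).toNumber().
const toChainId = (response: string): number => parseInt(response, 16)

// Mirror of the chain-id switch in loadSurgeryData.
const networkName = (chainId: number): string => {
  if (chainId === 10) {
    return 'mainnet'
  } else if (chainId === 69) {
    return 'kovan'
  }
  throw new Error(`Unknown l2 chain id: ${chainId}`)
}

console.log(networkName(toChainId('0xa')))  // mainnet
console.log(networkName(toChainId('0x45'))) // kovan
```

Validating the chain id up front is deliberate: it fails fast before the expensive step of reading the multi-gigabyte state dump from disk.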
packages/regenesis-surgery/scripts/handlers.ts (deleted, 100644 → 0; file continues beyond this excerpt)

```ts
import { ethers } from 'ethers'
import linker from 'solc/linker'
import {
  POOL_INIT_CODE_HASH_OPTIMISM,
  POOL_INIT_CODE_HASH_OPTIMISM_KOVAN,
} from '@uniswap/v3-sdk'
import { sleep, add0x, remove0x, clone } from '@eth-optimism/core-utils'
import {
  OLD_ETH_ADDRESS,
  WETH_TRANSFER_ADDRESSES,
  UNISWAP_V3_KOVAN_MULTICALL,
} from './constants'
import {
  findAccount,
  hexStringIncludes,
  transferStorageSlot,
  getMappingKey,
  getUniswapV3Factory,
  replaceWETH,
} from './utils'
import { compile } from './solc'
import {
  Account,
  AccountType,
  SurgeryDataSources,
  ImmutableReference,
} from './types'

export const handlers: {
  [key in AccountType]: (
    account: Account,
    data: SurgeryDataSources
  ) => Account | Promise<Account>
} = {
  [AccountType.ONEINCH_DEPLOYER]: (account, data) => {
    return {
      ...handlers[AccountType.EOA](account, data),
      nonce: 0,
    }
  },
  [AccountType.DELETE]: () => {
    return undefined // delete the account
  },
  [AccountType.EOA]: (account) => {
    return {
      address: account.address,
      nonce: account.nonce,
      balance: account.balance,
    }
  },
  [AccountType.PRECOMPILE]: (account) => {
    return account
  },
  [AccountType.PREDEPLOY_NEW_NOT_ETH]: (account) => {
    return account
  },
  [AccountType.PREDEPLOY_WIPE]: (account, data) => {
    const genesisAccount = findAccount(data.genesisDump, account.address)
    return {
      ...account,
      code: genesisAccount.code,
      storage: genesisAccount.storage,
    }
  },
  [AccountType.PREDEPLOY_NO_WIPE]: (account, data) => {
    const genesisAccount = findAccount(data.genesisDump, account.address)
    return {
      ...account,
      code: genesisAccount.code,
      storage: {
        ...account.storage,
        ...genesisAccount.storage,
      },
    }
  },
  [AccountType.PREDEPLOY_ETH]: (account, data) => {
    // Get a copy of the old account so we don't modify the one in dump by accident.
    const oldAccount = clone(findAccount(data.dump, OLD_ETH_ADDRESS))
    // Special handling for moving certain balances over to the WETH predeploy.
    // We need to transfer all statically defined addresses AND all uni pools.
    const addressesToXfer = WETH_TRANSFER_ADDRESSES.concat(
      data.pools.map((pool) => {
        return pool.oldAddress
      })
    )
    // For each of the listed addresses, check if it has an ETH balance. If so, we remove the ETH
    // balance and give WETH a balance instead.
    let wethBalance = ethers.BigNumber.from(0)
    for (const address of addressesToXfer) {
      const balanceKey = getMappingKey([address], 0)
      if (oldAccount.storage[balanceKey] !== undefined) {
        wethBalance = wethBalance.add(add0x(oldAccount.storage[balanceKey]))
        // Remove this balance from the old account storage.
        delete oldAccount.storage[balanceKey]
      }
    }
    const wethBalanceKey = getMappingKey([OLD_ETH_ADDRESS], 0)
    return {
      ...account,
      storage: {
        ...oldAccount.storage,
        ...account.storage,
        [wethBalanceKey]: wethBalance.toHexString(),
      },
    }
  },
  [AccountType.PREDEPLOY_WETH]: async (account, data) => {
    // Treat it like a wipe of the old ETH account.
    account = await handlers[AccountType.PREDEPLOY_WIPE](account, data)
    // Get a copy of the old ETH account so we don't modify the one in dump by accident.
    const ethAccount = clone(findAccount(data.dump, OLD_ETH_ADDRESS))
    // Special handling for moving certain balances over from the old account.
    for (const address of WETH_TRANSFER_ADDRESSES) {
      const balanceKey = getMappingKey([address], 0)
      if (ethAccount.storage[balanceKey] !== undefined) {
        // Give this account a balance inside of WETH.
        const newBalanceKey = getMappingKey([address], 3)
        account.storage[newBalanceKey] = ethAccount.storage[balanceKey]
      }
    }
    // Need to handle pools in a special manner because we want to get the balance for the old pool
    // address but we need to transfer the balance to the new pool address.
    for (const pool of data.pools) {
      const balanceKey = getMappingKey([pool.oldAddress], 0)
      if (ethAccount.storage[balanceKey] !== undefined) {
        // Give this account a balance inside of WETH.
        const newBalanceKey = getMappingKey([pool.newAddress], 3)
        account.storage[newBalanceKey] = ethAccount.storage[balanceKey]
      }
    }
    return account
  },
  [AccountType.UNISWAP_V3_FACTORY]: async (account, data) => {
    // Transfer the owner slot
    transferStorageSlot({
      account,
      oldSlot: 0,
      newSlot: 3,
    })
    // Transfer the feeAmountTickSpacing slot
    for (const fee of [500, 3000, 10000]) {
      transferStorageSlot({
        account,
        oldSlot
```
:
getMappingKey
([
fee
],
1
),
newSlot
:
getMappingKey
([
fee
],
4
),
})
}
// Transfer the getPool slot
for
(
const
pool
of
data
.
pools
)
{
// Fix the token0 => token1 => fee mapping
transferStorageSlot
({
account
,
oldSlot
:
getMappingKey
([
pool
.
token0
,
pool
.
token1
,
pool
.
fee
],
2
),
newSlot
:
getMappingKey
([
pool
.
token0
,
pool
.
token1
,
pool
.
fee
],
5
),
newValue
:
pool
.
newAddress
,
})
// Fix the token1 => token0 => fee mapping
transferStorageSlot
({
account
,
oldSlot
:
getMappingKey
([
pool
.
token1
,
pool
.
token0
,
pool
.
fee
],
2
),
newSlot
:
getMappingKey
([
pool
.
token1
,
pool
.
token0
,
pool
.
fee
],
5
),
newValue
:
pool
.
newAddress
,
})
}
return
handlers
[
AccountType
.
UNISWAP_V3_OTHER
](
account
,
data
)
},
[
AccountType
.
UNISWAP_V3_NFPM
]:
async
(
account
,
data
)
=>
{
for
(
const
pool
of
data
.
pools
)
{
try
{
transferStorageSlot
({
account
,
oldSlot
:
getMappingKey
([
pool
.
oldAddress
],
10
),
newSlot
:
getMappingKey
([
pool
.
newAddress
],
10
),
})
}
catch
(
err
)
{
if
(
err
.
message
.
includes
(
'
old slot not found in state dump
'
))
{
// It's OK for this to happen because some pools may not have any position NFTs.
console
.
log
(
`pool not found in NonfungiblePositionManager _poolIds mapping:
${
pool
.
oldAddress
}
`
)
}
else
{
throw
err
}
}
}
return
handlers
[
AccountType
.
UNISWAP_V3_OTHER
](
account
,
data
)
},
[
AccountType
.
UNISWAP_V3_POOL
]:
async
(
account
,
data
)
=>
{
// Find the pool by its old address
const
pool
=
data
.
pools
.
find
((
poolData
)
=>
{
return
poolData
.
oldAddress
===
account
.
address
})
// Get the pool's code.
let
poolCode
=
await
data
.
ropstenProvider
.
getCode
(
pool
.
newAddress
)
if
(
poolCode
===
'
0x
'
)
{
console
.
log
(
'
Could not find pool code, deploying to testnet...
'
)
const
UniswapV3Factory
=
getUniswapV3Factory
(
data
.
ropstenWallet
)
await
UniswapV3Factory
.
createPool
(
pool
.
token0
,
pool
.
token1
,
pool
.
fee
)
// Repeatedly try to get the remote pool code from the testnet.
let
retries
=
0
while
(
poolCode
===
'
0x
'
)
{
retries
++
if
(
retries
>
50
)
{
throw
new
Error
(
`unable to create pool with data:
${
pool
}
`
)
}
poolCode
=
await
data
.
ropstenProvider
.
getCode
(
pool
.
newAddress
)
await
sleep
(
5000
)
}
}
return
{
...
account
,
address
:
pool
.
newAddress
,
code
:
poolCode
,
}
},
[
AccountType
.
UNISWAP_V3_MAINNET_MULTICALL
]:
async
(
account
,
data
)
=>
{
// When upgrading mainnet, we want to get rid of the old multicall contract and introduce a new
// multicall contract at the OP Kovan address (also the ETH mainnet address). By changing the
// address here and piping into the UNISWAP_V3_OTHER handler, we:
// (1) Get the state of the old multicall but with the new address
// (2) Query the code using the new address (required)
return
handlers
[
AccountType
.
UNISWAP_V3_OTHER
](
{
...
account
,
address
:
UNISWAP_V3_KOVAN_MULTICALL
,
},
data
)
},
[
AccountType
.
UNISWAP_V3_OTHER
]:
async
(
account
,
data
)
=>
{
let
code
=
await
data
.
ethProvider
.
getCode
(
account
.
address
)
if
(
code
===
'
0x
'
)
{
throw
new
Error
(
`account code is empty:
${
account
.
address
}
`
)
}
// Replace references to L1 WETH address with the L2 WETH address.
code
=
replaceWETH
(
code
)
return
{
...
account
,
code
,
}
},
[
AccountType
.
UNVERIFIED
]:
()
=>
{
return
undefined
// delete the account
},
[
AccountType
.
VERIFIED
]:
(
account
:
Account
,
data
:
SurgeryDataSources
)
=>
{
// Find the account in the etherscan dump
const
contract
=
data
.
etherscanDump
.
find
((
acc
)
=>
{
return
acc
.
contractAddress
===
account
.
address
})
// The contract must exist
if
(
!
contract
)
{
throw
new
Error
(
`Unable to find
${
account
.
address
}
in etherscan dump`
)
}
const
evmOutput
=
compile
({
contract
,
ovm
:
false
,
})
// Pull out the bytecode, exact handling depends on the Solidity version
let
bytecode
=
evmOutput
.
evm
.
deployedBytecode
if
(
typeof
bytecode
===
'
object
'
)
{
bytecode
=
bytecode
.
object
}
// Make sure the bytecode is 0x-prefixed.
bytecode
=
add0x
(
bytecode
)
// Handle external library references.
if
(
contract
.
library
)
{
const
linkReferences
=
linker
.
findLinkReferences
(
bytecode
)
const
libStrings
=
contract
.
library
.
split
(
'
;
'
)
const
libraries
=
{}
for
(
const
[
i
,
libStr
]
of
libStrings
.
entries
())
{
const
[
name
,
address
]
=
libStr
.
split
(
'
:
'
)
let
key
:
string
if
(
Object
.
keys
(
linkReferences
).
length
>
i
)
{
key
=
Object
.
keys
(
linkReferences
)[
i
]
}
else
{
key
=
name
}
libraries
[
key
]
=
add0x
(
address
)
}
// Inject the libraries at the required locations
bytecode
=
linker
.
linkBytecode
(
bytecode
,
libraries
)
// There should no longer be any link references if linking was done correctly
if
(
Object
.
keys
(
linker
.
findLinkReferences
(
bytecode
)).
length
!==
0
)
{
throw
new
Error
(
`Library linking did not happen correctly:
${
contract
.
contractAddress
}
`
)
}
}
// Make sure the bytecode is (still) 0x-prefixed.
bytecode
=
add0x
(
bytecode
)
// If the contract has immutables in it, then the contracts
// need to be compiled with the ovm compiler so that the offsets
// can be found. The immutables must be pulled out of the old code
// and inserted into the new code
const
immutableRefs
:
ImmutableReference
=
evmOutput
.
evm
.
deployedBytecode
.
immutableReferences
if
(
immutableRefs
&&
Object
.
keys
(
immutableRefs
).
length
!==
0
)
{
// Compile using the ovm compiler to find the location of the
// immutableRefs in the ovm contract so they can be migrated
// to the new contract
const
ovmOutput
=
compile
({
contract
,
ovm
:
true
,
})
const
ovmImmutableRefs
:
ImmutableReference
=
ovmOutput
.
evm
.
deployedBytecode
.
immutableReferences
// Iterate over the immutableRefs and slice them into the new code
// to carry over their values. The keys are the AST IDs
for
(
const
[
key
,
value
]
of
Object
.
entries
(
immutableRefs
))
{
const
ovmValue
=
ovmImmutableRefs
[
key
]
if
(
!
ovmValue
)
{
throw
new
Error
(
`cannot find ast in ovm compiler output`
)
}
// Each value is an array of {length, start}
for
(
const
[
i
,
ref
]
of
value
.
entries
())
{
const
ovmRef
=
ovmValue
[
i
]
if
(
ref
.
length
!==
ovmRef
.
length
)
{
throw
new
Error
(
`length mismatch`
)
}
// Get the value from the contract code
const
immutable
=
ethers
.
utils
.
hexDataSlice
(
add0x
(
account
.
code
),
ovmRef
.
start
,
ovmRef
.
start
+
ovmRef
.
length
)
const
pre
=
ethers
.
utils
.
hexDataSlice
(
bytecode
,
0
,
ref
.
start
)
const
post
=
ethers
.
utils
.
hexDataSlice
(
bytecode
,
ref
.
start
+
ref
.
length
)
// Make a note of the original bytecode length so we can confirm it doesn't change
const
bytecodeLength
=
bytecode
.
length
// Assign to the global bytecode variable
bytecode
=
ethers
.
utils
.
hexConcat
([
pre
,
immutable
,
post
])
if
(
bytecode
.
length
!==
bytecodeLength
)
{
throw
new
Error
(
`mismatch in size:
${
bytecode
.
length
}
vs
${
bytecodeLength
}
`
)
}
}
}
}
// Handle migrating storage slots
if
(
account
.
storage
)
{
for
(
const
[
key
,
value
]
of
Object
.
entries
(
account
.
storage
))
{
for
(
const
pool
of
data
.
pools
)
{
// Turn into hex string or hexStringIncludes will throw
const
val
=
add0x
(
value
)
if
(
hexStringIncludes
(
val
,
pool
.
oldAddress
))
{
console
.
log
(
`found unexpected reference to pool address
${
val
}
in
${
account
.
address
}
`
)
const
regex
=
new
RegExp
(
remove0x
(
pool
.
oldAddress
).
toLowerCase
(),
'
g
'
)
account
.
storage
[
key
]
=
value
.
replace
(
regex
,
remove0x
(
pool
.
newAddress
).
toLowerCase
()
)
console
.
log
(
`updated to
${
account
.
storage
[
key
]}
`
)
}
if
(
hexStringIncludes
(
val
,
POOL_INIT_CODE_HASH_OPTIMISM
))
{
throw
new
Error
(
`found unexpected reference to mainnet pool init code hash`
)
}
if
(
hexStringIncludes
(
val
,
POOL_INIT_CODE_HASH_OPTIMISM_KOVAN
))
{
throw
new
Error
(
`found unexpected reference to kovan pool init code hash`
)
}
}
if
(
data
.
poolHashCache
[
key
])
{
const
cached
=
data
.
poolHashCache
[
key
]
console
.
log
(
`fixing single-level mapping in contract`
,
`address=
${
account
.
address
}
`
,
`pool=
${
cached
.
pool
.
oldAddress
}
`
,
`slot=
${
key
}
`
)
transferStorageSlot
({
account
,
oldSlot
:
key
,
newSlot
:
getMappingKey
([
cached
.
pool
.
newAddress
],
cached
.
index
),
})
}
}
}
return
{
...
account
,
code
:
bytecode
,
}
},
[
AccountType
.
ERC20
]:
async
(
account
)
=>
{
throw
new
Error
(
`Unexpected ERC20 classification, this should never happen:
${
account
.
address
}
`
)
},
}
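The Uniswap handlers above repeatedly move values between storage slots via `transferStorageSlot`. The core pattern, independent of any chain data, can be sketched as follows; `Storage`, `toSlotKey`, and `moveSlot` are illustrative names for this sketch, not the package's actual exports:

```typescript
// Simplified sketch of the slot-migration pattern: move a value from one
// storage slot key to another inside a plain { slot: value } map, optionally
// overwriting the value (as the factory handler does with pool addresses).

type Storage = { [slot: string]: string }

// Left-pad a slot number to a 64-hex-char (32-byte) slot key.
const toSlotKey = (slot: number): string =>
  '0x' + slot.toString(16).padStart(64, '0')

const moveSlot = (
  storage: Storage,
  oldSlot: number,
  newSlot: number,
  newValue?: string
): Storage => {
  const oldKey = toSlotKey(oldSlot)
  const newKey = toSlotKey(newSlot)
  const oldVal = storage[oldKey]
  if (oldVal === undefined) {
    throw new Error(`old slot not found: ${oldKey}`)
  }
  // Copy so the input object is left untouched.
  const next = { ...storage }
  next[newKey] = newValue ?? oldVal
  delete next[oldKey]
  return next
}

// Example: migrate the "owner" value from slot 0 to slot 3.
const before = { [toSlotKey(0)]: 'abcd' }
const after = moveSlot(before, 0, 3)
console.log(after[toSlotKey(3)]) // 'abcd'
```

The real `transferStorageSlot` mutates the account in place and also accepts pre-hashed mapping keys as string slots; this sketch only shows the move-and-delete mechanics.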
packages/regenesis-surgery/scripts/solc.ts deleted 100644 → 0
/* eslint @typescript-eslint/no-var-requires: "off" */
import fs from 'fs'
import path from 'path'
import fetch from 'node-fetch'
import { ethers } from 'ethers'
import { clone } from '@eth-optimism/core-utils'
import setupMethods from 'solc/wrapper'
import {
  COMPILER_VERSIONS_TO_SOLC,
  EMSCRIPTEN_BUILD_LIST,
  EMSCRIPTEN_BUILD_PATH,
  LOCAL_SOLC_DIR,
  EVM_SOLC_CACHE_DIR,
  OVM_SOLC_CACHE_DIR,
} from './constants'
import { EtherscanContract } from './types'

const OVM_BUILD_PATH = (version: string) => {
  return `https://raw.githubusercontent.com/ethereum-optimism/solc-bin/9455107699d2f7ad9b09e1005c7c07f4b5dd6857/bin/soljson-${version}.js`
}

/**
 * Downloads a specific solc version.
 *
 * @param version Solc version to download.
 * @param ovm If true, downloads from the OVM repository.
 */
export const downloadSolc = async (version: string, ovm?: boolean) => {
  // TODO: why is this one missing?
  if (version === 'v0.5.16-alpha.7') {
    return
  }

  // File is the location where we'll put the downloaded compiler.
  let file: string
  // Remote is the URL we'll query if the file doesn't already exist.
  let remote: string

  // Exact file/remote will depend on if downloading OVM or EVM compiler.
  if (ovm) {
    file = `${path.join(LOCAL_SOLC_DIR, version)}.js`
    remote = OVM_BUILD_PATH(version)
  } else {
    const res = await fetch(EMSCRIPTEN_BUILD_LIST)
    const data: any = await res.json()
    const list = data.builds

    // Make sure the target version actually exists
    let target: any
    for (const entry of list) {
      const longVersion = `v${entry.longVersion}`
      if (version === longVersion) {
        target = entry
      }
    }

    // Error out if the given version can't be found
    if (!target) {
      throw new Error(`Cannot find compiler version ${version}`)
    }

    file = path.join(LOCAL_SOLC_DIR, target.path)
    remote = `${EMSCRIPTEN_BUILD_PATH}/${target.path}`
  }

  try {
    // Check to see if we already have the file
    fs.accessSync(file, fs.constants.F_OK)
  } catch (e) {
    console.error(`Downloading ${version} ${ovm ? 'ovm' : 'solidity'}`)
    // If we don't have the file, download it
    const res = await fetch(remote)
    const bin = await res.text()
    fs.writeFileSync(file, bin)
  }
}

/**
 * Downloads all required solc versions, if not already downloaded.
 */
export const downloadAllSolcVersions = async () => {
  try {
    fs.mkdirSync(LOCAL_SOLC_DIR)
  } catch (e) {
    // directory already exists
  }

  // Keys are OVM versions.
  await Promise.all(
    // Use a set to dedupe the list of versions.
    [...new Set(Object.keys(COMPILER_VERSIONS_TO_SOLC))].map(
      async (version) => {
        await downloadSolc(version, true)
      }
    )
  )

  // Values are EVM versions.
  await Promise.all(
    // Use a set to dedupe the list of versions.
    [...new Set(Object.values(COMPILER_VERSIONS_TO_SOLC))].map(
      async (version) => {
        await downloadSolc(version)
      }
    )
  )
}

export const getMainContract = (contract: EtherscanContract, output) => {
  if (contract.contractFileName) {
    return clone(
      output.contracts[contract.contractFileName][contract.contractName]
    )
  }
  return clone(output.contracts.file[contract.contractName])
}

export const getSolc = (version: string, ovm?: boolean) => {
  return setupMethods(
    require(path.join(
      LOCAL_SOLC_DIR,
      ovm ? version : `solc-emscripten-wasm32-${version}.js`
    ))
  )
}

export const solcInput = (contract: EtherscanContract) => {
  // Create a base solc input object
  const input = {
    language: 'Solidity',
    sources: {
      file: {
        content: contract.sourceCode,
      },
    },
    settings: {
      outputSelection: {
        '*': {
          '*': ['*'],
        },
      },
      optimizer: {
        enabled: contract.optimizationUsed === '1',
        runs: parseInt(contract.runs, 10),
      },
    },
  }

  try {
    // source code may be one of 3 things
    // - raw content string
    // - sources object
    // - entire input
    let sourceCode = contract.sourceCode

    // Remove brackets that are wrapped around the source
    // when trying to parse json
    if (sourceCode.substr(0, 2) === '{{') {
      // Trim the first and last bracket
      sourceCode = sourceCode.slice(1, -1)
    }

    // If the source code is valid json, and
    // has the keys of a solc input, just return it
    const json = JSON.parse(sourceCode)

    // If the json has language, then it is the whole input
    if (json.language) {
      return json
    }

    // Add the json file as the sources
    input.sources = json
  } catch (e) {
    //
  }

  return input
}

const readCompilerCache = (
  target: 'evm' | 'ovm',
  hash: string
): any | undefined => {
  try {
    const cacheDir = target === 'evm' ? EVM_SOLC_CACHE_DIR : OVM_SOLC_CACHE_DIR
    return JSON.parse(
      fs.readFileSync(path.join(cacheDir, hash), {
        encoding: 'utf-8',
      })
    )
  } catch (err) {
    return undefined
  }
}

const writeCompilerCache = (
  target: 'evm' | 'ovm',
  hash: string,
  content: any
) => {
  const cacheDir = target === 'evm' ? EVM_SOLC_CACHE_DIR : OVM_SOLC_CACHE_DIR
  fs.writeFileSync(path.join(cacheDir, hash), JSON.stringify(content))
}

export const compile = (opts: {
  contract: EtherscanContract
  ovm: boolean
}): any => {
  try {
    fs.mkdirSync(EVM_SOLC_CACHE_DIR, {
      recursive: true,
    })
  } catch (e) {
    // directory already exists
  }
  try {
    fs.mkdirSync(OVM_SOLC_CACHE_DIR, {
      recursive: true,
    })
  } catch (e) {
    // directory already exists
  }

  let version: string
  if (opts.ovm) {
    version = opts.contract.compilerVersion
  } else {
    version = COMPILER_VERSIONS_TO_SOLC[opts.contract.compilerVersion]
    if (!version) {
      throw new Error(
        `Unable to find solc version ${opts.contract.compilerVersion}`
      )
    }
  }

  const solcInstance = getSolc(version, opts.ovm)
  const input = JSON.stringify(solcInput(opts.contract))
  const inputHash = ethers.utils.solidityKeccak256(['string'], [input])
  const compilerTarget = opts.ovm ? 'ovm' : 'evm'

  // Cache the compiler output to speed up repeated compilations of the same contract. If this
  // cache is too memory intensive, then we could consider only caching if the contract has been
  // seen more than once.
  let output = readCompilerCache(compilerTarget, inputHash)
  if (output === undefined) {
    output = JSON.parse(solcInstance.compile(input))
    writeCompilerCache(compilerTarget, inputHash, output)
  }

  if (!output.contracts) {
    throw new Error(`Cannot compile ${opts.contract.contractAddress}`)
  }

  const mainOutput = getMainContract(opts.contract, output)
  if (!mainOutput) {
    throw new Error(
      `Contract filename mismatch: ${opts.contract.contractAddress}`
    )
  }

  return mainOutput
}
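The `compile` function above caches compiler output on disk, keyed by a hash of the full solc input JSON, so identical inputs are only compiled once. The caching strategy can be sketched in isolation; this sketch keeps the cache in memory and uses Node's sha256 in place of `solidityKeccak256`, and `expensiveCompile`/`cachedCompile` are stand-in names, not real solc calls:

```typescript
// Content-addressed caching: hash the input, look up the hash, compile only
// on a cache miss.
import { createHash } from 'crypto'

const cache = new Map<string, string>()
let compileCalls = 0

// Stand-in for the expensive solc compile step.
const expensiveCompile = (input: string): string => {
  compileCalls++
  return `output-for:${input}`
}

const cachedCompile = (input: string): string => {
  // sha256 here as a stand-in for the keccak256 hash used by the real code.
  const key = createHash('sha256').update(input).digest('hex')
  let output = cache.get(key)
  if (output === undefined) {
    output = expensiveCompile(input)
    cache.set(key, output)
  }
  return output
}

cachedCompile('{"language":"Solidity"}')
cachedCompile('{"language":"Solidity"}') // second call is a cache hit
console.log(compileCalls) // 1
```

Keying on a hash of the complete input (sources plus settings) rather than the contract address means a change to optimizer settings or source text correctly invalidates the cache.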
packages/regenesis-surgery/scripts/surgery.ts deleted 100644 → 0
import fs from 'fs'
import { ethers } from 'ethers'
import { add0x, remove0x, clone } from '@eth-optimism/core-utils'
import { StateDump, SurgeryDataSources, AccountType } from './types'
import { findAccount } from './utils'
import { handlers } from './handlers'
import { classify } from './classifiers'
import { loadSurgeryData } from './data'

const doGenesisSurgery = async (
  data: SurgeryDataSources
): Promise<StateDump> => {
  // We'll generate the final genesis file from this output.
  const output: StateDump = []

  // Handle each account in the state dump.
  const input = data.dump.slice(data.configs.startIndex, data.configs.endIndex)

  // Insert any accounts in the genesis that aren't already in the state dump.
  for (const account of data.genesisDump) {
    if (findAccount(input, account.address) === undefined) {
      input.push(account)
    }
  }

  for (const [i, account] of input.entries()) {
    const accountType = classify(account, data)
    console.log(
      `[${i}/${input.length}] ${AccountType[accountType]}: ${account.address}`
    )
    const handler = handlers[accountType]
    const newAccount = await handler(clone(account), data)
    if (newAccount !== undefined) {
      output.push(newAccount)
    }
  }

  // Clean up and standardize the dump. Also performs a few tricks to reduce the overall size of
  // the state dump, which reduces bandwidth requirements.
  console.log('Cleaning up and standardizing dump format...')
  for (const account of output) {
    for (const [key, val] of Object.entries(account)) {
      // We want to be left with the following fields:
      // - balance
      // - nonce
      // - code
      // - storage (if necessary)
      if (key === 'storage') {
        if (Object.keys(account[key]).length === 0) {
          // We don't need storage if there are no storage values.
          delete account[key]
        } else {
          // We can remove 0x from storage keys and vals to save space.
          for (const [storageKey, storageVal] of Object.entries(account[key])) {
            delete account.storage[storageKey]
            account.storage[remove0x(storageKey)] = remove0x(storageVal)
          }
        }
      } else if (key === 'code') {
        // Code MUST start with 0x.
        account[key] = add0x(val)
      } else if (key === 'codeHash' || key === 'root') {
        // Neither of these fields are necessary. Geth will automatically generate them from the
        // code and storage.
        delete account[key]
      } else if (key === 'balance' || key === 'nonce') {
        // At this point we know that the input is either a string or a number. If it's a number,
        // we want to convert it into a string.
        let stripped = typeof val === 'number' ? val.toString(16) : val
        // Remove 0x so we can strip any leading zeros.
        stripped = remove0x(stripped)
        // We can further reduce our genesis size by removing leading zeros. We can even go as far
        // as removing the entire string because Geth appears to treat the empty string as 0.
        stripped = stripped.replace(/^0+/, '')
        // We have to add 0x if the value is greater than or equal to 10 because Geth will throw an
        // error otherwise.
        if (stripped !== '' && ethers.BigNumber.from(add0x(stripped)).gte(10)) {
          stripped = add0x(stripped)
        }
        account[key] = stripped
      } else if (key === 'address') {
        // Keep the address as-is, we'll delete it eventually.
      } else {
        throw new Error(`unexpected account field: ${key}`)
      }
    }
  }

  return output
}

const main = async () => {
  // Load the surgery data.
  const data = await loadSurgeryData()

  // Do the surgery process and get the new genesis dump.
  console.log('Starting surgery process...')
  const finalGenesisDump = await doGenesisSurgery(data)

  // Convert to the format that Geth expects.
  console.log('Converting dump to final format...')
  const finalGenesisAlloc = {}
  for (const account of finalGenesisDump) {
    const address = account.address
    delete account.address
    finalGenesisAlloc[remove0x(address)] = account
  }

  // Attach all of the original genesis configuration values.
  const finalGenesis = {
    ...data.genesis,
    alloc: finalGenesisAlloc,
  }

  // Write the final genesis file to disk.
  console.log('Writing final genesis to disk...')
  fs.writeFileSync(
    data.configs.outputFilePath,
    JSON.stringify(finalGenesis, null, 2)
  )

  console.log('All done!')
}

main()
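The balance/nonce compaction performed during cleanup above can be sketched in isolation: hex-encode the value, strip leading zeros (dropping the whole string for zero, which Geth treats as 0), and restore the `0x` prefix only for values of 10 or more. This sketch uses `BigInt` instead of `ethers.BigNumber`, and `compactValue` is an illustrative name:

```typescript
// Compact a balance/nonce value for the genesis alloc, mirroring the cleanup
// logic: bare hex, no leading zeros, 0x prefix only when needed.
const compactValue = (val: number | string): string => {
  // Normalize to a bare hex string.
  let stripped = typeof val === 'number' ? val.toString(16) : val
  stripped = stripped.replace(/^0x/, '')
  // Drop leading zeros; the empty string stands in for 0.
  stripped = stripped.replace(/^0+/, '')
  // Re-add 0x for values >= 10 so Geth parses the string unambiguously.
  if (stripped !== '' && BigInt('0x' + stripped) >= 10n) {
    stripped = '0x' + stripped
  }
  return stripped
}

console.log(compactValue(0)) // ''
console.log(compactValue('0x0009')) // '9'
console.log(compactValue(255)) // '0xff'
```

Applied across hundreds of thousands of accounts, trimming these few bytes per field meaningfully shrinks the genesis file, which is why the cleanup pass bothers with it.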
packages/regenesis-surgery/scripts/types.ts deleted 100644 → 0
import { ethers } from 'ethers'

export interface SurgeryConfigs {
  stateDumpFilePath: string
  etherscanFilePath: string
  genesisFilePath: string
  outputFilePath: string
  l2NetworkName?: SupportedNetworks
  l2ProviderUrl: string
  ropstenProviderUrl: string
  ropstenPrivateKey: string
  ethProviderUrl: string
  stateDumpHeight: number
  startIndex: number
  endIndex: number
}

export interface Account {
  address: string
  nonce: number | string
  balance: string
  codeHash?: string
  root?: string
  code?: string
  storage?: {
    [key: string]: string
  }
}

export type StateDump = Account[]

export interface GethStateDump {
  [address: string]: {
    nonce: number
    balance: string
    codeHash: string
    root: string
    code?: string
    storage?: {
      [key: string]: string
    }
  }
}

export enum AccountType {
  ONEINCH_DEPLOYER,
  DELETE,
  EOA,
  PRECOMPILE,
  PREDEPLOY_NEW_NOT_ETH,
  PREDEPLOY_WIPE,
  PREDEPLOY_NO_WIPE,
  PREDEPLOY_ETH,
  PREDEPLOY_WETH,
  UNISWAP_V3_FACTORY,
  UNISWAP_V3_NFPM,
  UNISWAP_V3_MAINNET_MULTICALL,
  UNISWAP_V3_POOL,
  UNISWAP_V3_OTHER,
  UNVERIFIED,
  VERIFIED,
  ERC20,
}

export interface UniswapPoolData {
  oldAddress: string
  newAddress: string
  token0: string
  token1: string
  fee: ethers.BigNumber
}

export interface EtherscanContract {
  contractAddress: string
  code: string
  hash: string
  sourceCode: string
  creationCode: string
  contractFileName: string
  contractName: string
  compilerVersion: string
  optimizationUsed: string
  runs: string
  constructorArguments: string
  library: string
}

export type EtherscanDump = EtherscanContract[]

export type SupportedNetworks = 'mainnet' | 'kovan'

export interface SurgeryDataSources {
  configs: SurgeryConfigs
  dump: StateDump
  genesis: GenesisFile
  genesisDump: StateDump
  pools: UniswapPoolData[]
  poolHashCache: PoolHashCache
  etherscanDump: EtherscanContract[]
  ropstenProvider: ethers.providers.JsonRpcProvider
  ropstenWallet: ethers.Wallet
  l2Provider: ethers.providers.JsonRpcProvider
  ethProvider: ethers.providers.JsonRpcProvider
}

export interface GenesisFile {
  config: {
    chainId: number
    homesteadBlock: number
    eip150Block: number
    eip155Block: number
    eip158Block: number
    byzantiumBlock: number
    constantinopleBlock: number
    petersburgBlock: number
    istanbulBlock: number
    muirGlacierBlock: number
    clique: {
      period: number
      epoch: number
    }
  }
  difficulty: string
  gasLimit: string
  extraData: string
  alloc: GethStateDump
}

export interface ImmutableReference {
  start: number
  length: number
}

export interface ImmutableReferences {
  [key: string]: ImmutableReference[]
}

export interface PoolHashCache {
  [key: string]: {
    pool: UniswapPoolData
    index: number
  }
}
packages/regenesis-surgery/scripts/utils.ts deleted 100644 → 0
/* eslint @typescript-eslint/no-var-requires: "off" */
import { createReadStream } from 'fs'
import * as fs from 'fs'
import * as assert from 'assert'
import { ethers } from 'ethers'
import { abi as UNISWAP_FACTORY_ABI } from '@uniswap/v3-core/artifacts/contracts/UniswapV3Factory.sol/UniswapV3Factory.json'
import { Interface } from '@ethersproject/abi'
import { parseChunked } from '@discoveryjs/json-ext'
import byline from 'byline'
import * as dotenv from 'dotenv'
import { reqenv, getenv, remove0x } from '@eth-optimism/core-utils'
import {
  Account,
  EtherscanContract,
  StateDump,
  SurgeryConfigs,
  GenesisFile,
} from './types'
import { UNISWAP_V3_FACTORY_ADDRESS } from './constants'

export const findAccount = (dump: StateDump, address: string): Account => {
  return dump.find((acc) => {
    return hexStringEqual(acc.address, address)
  })
}

export const hexStringIncludes = (a: string, b: string): boolean => {
  if (!ethers.utils.isHexString(a)) {
    throw new Error(`not a hex string: ${a}`)
  }
  if (!ethers.utils.isHexString(b)) {
    throw new Error(`not a hex string: ${b}`)
  }

  return a.slice(2).toLowerCase().includes(b.slice(2).toLowerCase())
}

export const hexStringEqual = (a: string, b: string): boolean => {
  if (!ethers.utils.isHexString(a)) {
    throw new Error(`not a hex string: ${a}`)
  }
  if (!ethers.utils.isHexString(b)) {
    throw new Error(`not a hex string: ${b}`)
  }

  return a.toLowerCase() === b.toLowerCase()
}

export const replaceWETH = (code: string): string => {
  return code.replace(
    /c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2/g,
    '4200000000000000000000000000000000000006'
  )
}

/**
 * Left-pads a hex string with zeroes to 32 bytes.
 *
 * @param val Value to hex pad to 32 bytes.
 * @returns Value padded to 32 bytes.
 */
export const toHex32 = (val: string | number | ethers.BigNumber) => {
  return ethers.utils.hexZeroPad(ethers.BigNumber.from(val).toHexString(), 32)
}

export const transferStorageSlot = (opts: {
  account: Account
  oldSlot: string | number
  newSlot: string | number
  newValue?: string
}): void => {
  if (opts.account.storage === undefined) {
    throw new Error(`account has no storage: ${opts.account.address}`)
  }

  if (typeof opts.oldSlot !== 'string') {
    opts.oldSlot = toHex32(opts.oldSlot)
  }

  if (typeof opts.newSlot !== 'string') {
    opts.newSlot = toHex32(opts.newSlot)
  }

  const oldSlotVal = opts.account.storage[opts.oldSlot]
  if (oldSlotVal === undefined) {
    throw new Error(
      `old slot not found in state dump, address=${opts.account.address}, slot=${opts.oldSlot}`
    )
  }

  if (opts.newValue === undefined) {
    opts.account.storage[opts.newSlot] = oldSlotVal
  } else {
    if (opts.newValue.startsWith('0x')) {
      opts.newValue = opts.newValue.slice(2)
    }
    opts.account.storage[opts.newSlot] = opts.newValue
  }

  delete opts.account.storage[opts.oldSlot]
}

export const getMappingKey = (keys: any[], slot: number) => {
  // TODO: assert keys.length > 0
  let key = ethers.utils.keccak256(
    ethers.utils.hexConcat([toHex32(keys[0]), toHex32(slot)])
  )
  if (keys.length > 1) {
    for (let i = 1; i < keys.length; i++) {
      key = ethers.utils.keccak256(
        ethers.utils.hexConcat([toHex32(keys[i]), key])
      )
    }
  }
  return key
}

// ERC20 interface
const iface = new Interface([
  'function balanceOf(address)',
  'function name()',
  'function symbol()',
  'function decimals()',
  'function totalSupply()',
  'function transfer(address,uint256)',
])

// PUSH4 should prefix any 4 byte selector
const PUSH4 = 0x63
const erc20Sighashes = new Set()
// Build the set of erc20 4 byte selectors
for (const fn of Object.keys(iface.functions)) {
  const sighash = iface.getSighash(fn)
  erc20Sighashes.add(sighash)
}

export const isBytecodeERC20 = (bytecode: string): boolean => {
  if (bytecode === '0x' || bytecode === undefined) {
    return false
  }
  const seen = new Set()
  const buf = Buffer.from(remove0x(bytecode), 'hex')
  for (const [i, byte] of buf.entries()) {
    // Track all of the observed 4 byte selectors that follow a PUSH4
    // and are also present in the set of erc20Sighashes
    if (byte === PUSH4) {
      const sighash = '0x' + buf.slice(i + 1, i + 5).toString('hex')
      if (erc20Sighashes.has(sighash)) {
        seen.add(sighash)
      }
    }
  }
  // create a set that contains those elements of set
  // erc20Sighashes that are not in set seen
  const elements = [...erc20Sighashes].filter((x) => !seen.has(x))
  return !elements.length
}

export const getUniswapV3Factory = (signerOrProvider: any): ethers.Contract => {
  return new ethers.Contract(
    UNISWAP_V3_FACTORY_ADDRESS,
    UNISWAP_FACTORY_ABI,
    signerOrProvider
  )
}

export const loadConfigs = (): SurgeryConfigs => {
  dotenv.config()
  const stateDumpFilePath = reqenv('REGEN__STATE_DUMP_FILE')
  const etherscanFilePath = reqenv('REGEN__ETHERSCAN_FILE')
  const genesisFilePath = reqenv('REGEN__GENESIS_FILE')
  const outputFilePath = reqenv('REGEN__OUTPUT_FILE')
  const l2ProviderUrl = reqenv('REGEN__L2_PROVIDER_URL')
  const ropstenProviderUrl = reqenv('REGEN__ROPSTEN_PROVIDER_URL')
  const ropstenPrivateKey = reqenv('REGEN__ROPSTEN_PRIVATE_KEY')
  const ethProviderUrl = reqenv('REGEN__ETH_PROVIDER_URL')
  const stateDumpHeight = parseInt(reqenv('REGEN__STATE_DUMP_HEIGHT'), 10)
  const startIndex = parseInt(getenv('REGEN__START_INDEX', '0'), 10)
  const endIndex = parseInt(getenv('REGEN__END_INDEX', '0'), 10) || Infinity

  return {
    stateDumpFilePath,
    etherscanFilePath,
    genesisFilePath,
    outputFilePath,
    l2ProviderUrl,
    ropstenProviderUrl,
    ropstenPrivateKey,
    ethProviderUrl,
    stateDumpHeight,
    startIndex,
    endIndex,
  }
}

/**
 * Reads the state dump file into an object. Required because the dumps get quite large.
 * JavaScript throws an error when trying to load large JSON files (>512mb) directly via
 * fs.readFileSync. Need a streaming approach instead.
 *
 * @param dumppath Path to the state dump file.
 * @returns Parsed state dump object.
 */
export const readDumpFile = async (dumppath: string): Promise<StateDump> => {
  return new Promise<StateDump>((resolve) => {
    const dump: StateDump = []
    const stream = byline(fs.createReadStream(dumppath, { encoding: 'utf8' }))

    let isFirstRow = true
    stream.on('data', (line: any) => {
      const account = JSON.parse(line)
      if (isFirstRow) {
        isFirstRow = false
      } else {
        delete account.key
        dump.push(account)
      }
    })

    stream.on('end', () => {
      resolve(dump)
    })
  })
}

export const readEtherscanFile = async (
  etherscanpath: string
): Promise<EtherscanContract[]> => {
  return parseChunked(createReadStream(etherscanpath))
}

export const readGenesisFile = async (
  genesispath: string
): Promise<GenesisFile> => {
  return JSON.parse(fs.readFileSync(genesispath, 'utf8'))
}

export const readGenesisStateDump = async (
  genesispath: string
): Promise<StateDump> => {
  const genesis = await readGenesisFile(genesispath)
  const genesisDump: StateDump = []
  for (const [address, account] of Object.entries(genesis.alloc)) {
    genesisDump.push({
      address,
      ...account,
    })
  }
  return genesisDump
}

export const checkStateDump = (dump: StateDump) => {
  for (const account of dump) {
    assert.equal(
      account.address.toLowerCase(),
      account.address,
      `unexpected upper case character in state dump address: ${account.address}`
    )
    assert.ok(
      typeof account.nonce === 'number',
      `nonce is not a number: ${account.nonce}`
    )
    if (account.codeHash) {
      assert.equal(
        account.codeHash.toLowerCase(),
        account.codeHash,
        `unexpected upper case character in state dump codeHash: ${account.codeHash}`
      )
    }
    if (account.root) {
      assert.equal(
        account.root.toLowerCase(),
        account.root,
        `unexpected upper case character in state dump root: ${account.root}`
      )
    }
    if (account.code) {
      assert.equal(
        account.code.toLowerCase(),
        account.code,
        `unexpected upper case character in state dump code: ${account.code}`
      )
    }
    // All accounts other than precompiles should have a balance of zero.
    if (
      !account.address.startsWith('0x00000000000000000000000000000000000000')
    ) {
      assert.equal(
        account.balance,
        '0',
        `unexpected non-zero balance in state dump address: ${account.address}`
      )
    }
    if (account.storage !== undefined) {
      for (const [storageKey, storageVal] of Object.entries(account.storage)) {
        assert.equal(
          storageKey.toLowerCase(),
          storageKey,
          `unexpected upper case character in state dump storage key: ${storageKey}`
        )
        assert.equal(
          storageVal.toLowerCase(),
          storageVal,
          `unexpected upper case character in state dump storage value: ${storageVal}`
        )
      }
    }
  }
}
packages/regenesis-surgery/test/beforeall.spec.ts deleted 100644 → 0
import { env } from './setup'

before('initializing test environment', async () => {
  await env.init()
})
packages/regenesis-surgery/test/delete.spec.ts deleted 100644 → 0
import { KECCAK256_RLP_S, KECCAK256_NULL_S } from 'ethereumjs-util'
import { add0x } from '@eth-optimism/core-utils'
import { ethers } from 'ethers'
import { expect, env } from './setup'
import { AccountType } from '../scripts/types'

describe('deleted contracts', () => {
  let accs
  before(async () => {
    await env.init()
    accs = env.getAccountsByType(AccountType.DELETE)
  })
  it('accounts', async () => {
    for (const [i, acc] of accs.entries()) {
      describe(`account ${i}/${accs.length} (${acc.address})`, () => {
        it('should not have any code', async () => {
          const code = await env.postL2Provider.getCode(acc.address)
          expect(code).to.eq('0x')
        })
        it('should have the null code hash and storage root', async () => {
          const proof = await env.postL2Provider.send('eth_getProof', [
            acc.address,
            [],
            'latest',
          ])
          expect(proof.codeHash).to.equal(add0x(KECCAK256_NULL_S))
          expect(proof.storageHash).to.equal(add0x(KECCAK256_RLP_S))
        })
        it('should have a balance equal to zero', async () => {
          // Balance after can come from the latest block.
          const balance = await env.postL2Provider.getBalance(acc.address)
          expect(balance).to.deep.eq(ethers.BigNumber.from(0))
        })
        it('should have a nonce equal to zero', async () => {
          // Nonce after can come from the latest block.
          const nonce = await env.postL2Provider.getTransactionCount(acc.address)
          expect(nonce).to.deep.eq(0)
        })
      })
    }
  })
})
packages/regenesis-surgery/test/eoa.spec.ts deleted 100644 → 0
import { KECCAK256_RLP_S, KECCAK256_NULL_S } from 'ethereumjs-util'
import { add0x } from '@eth-optimism/core-utils'
import { expect, env } from './setup'
import { AccountType, Account } from '../scripts/types'

describe('EOAs', () => {
  describe('standard EOA', () => {
    let eoas
    before(async () => {
      await env.init()
      eoas = env.getAccountsByType(AccountType.EOA)
    })
    it('EOAs', () => {
      for (const [i, eoa] of eoas.entries()) {
        describe(`account ${i}/${eoas.length} (${eoa.address})`, () => {
          it('should not have any code', async () => {
            const code = await env.postL2Provider.getCode(eoa.address)
            expect(code).to.eq('0x')
          })
          it('should have the null code hash and storage root', async () => {
            const proof = await env.postL2Provider.send('eth_getProof', [
              eoa.address,
              [],
              'latest',
            ])
            expect(proof.codeHash).to.equal(add0x(KECCAK256_NULL_S))
            expect(proof.storageHash).to.equal(add0x(KECCAK256_RLP_S))
          })
          it('should have the same balance as it had before', async () => {
            // Balance before needs to come from the specific block at which the dump was taken.
            const preBalance = await env.preL2Provider.getBalance(
              eoa.address,
              env.config.stateDumpHeight
            )
            // Balance after can come from the latest block.
            const postBalance = await env.postL2Provider.getBalance(eoa.address)
            expect(preBalance).to.deep.eq(postBalance)
          })
          it('should have the same nonce as it had before', async () => {
            // Nonce before needs to come from the specific block at which the dump was taken.
            const preNonce = await env.preL2Provider.getTransactionCount(
              eoa.address,
              env.config.stateDumpHeight
            )
            // Nonce after can come from the latest block.
            const postNonce = await env.postL2Provider.getTransactionCount(
              eoa.address
            )
            expect(preNonce).to.deep.eq(postNonce)
          })
        })
      }
    })
  })

  // eslint-disable-next-line
  describe('1inch deployer', function () {
    let eoa: Account
    // eslint-disable-next-line
    before(function () {
      if (env.surgeryDataSources.configs.l2NetworkName === 'kovan') {
        console.log('1inch deployer does not exist on Optimism Kovan')
        this.skip()
      }
      eoa = env.getAccountsByType(AccountType.ONEINCH_DEPLOYER)[0]
      if (!eoa) {
        throw new Error('Cannot find one inch deployer')
      }
    })
    it('should not have any code', async () => {
      const code = await env.postL2Provider.getCode(eoa.address)
      expect(code).to.eq('0x')
    })
    it('should have the null code hash and storage root', async () => {
      const proof = await env.postL2Provider.send('eth_getProof', [
        eoa.address,
        [],
        'latest',
      ])
      expect(proof.codeHash).to.equal(add0x(KECCAK256_NULL_S))
      expect(proof.storageHash).to.equal(add0x(KECCAK256_RLP_S))
    })
    it('should have the same balance as it had before', async () => {
      // Balance before needs to come from the specific block at which the dump was taken.
      const preBalance = await env.preL2Provider.getBalance(
        eoa.address,
        env.config.stateDumpHeight
      )
      // Balance after can come from the latest block.
      const postBalance = await env.postL2Provider.getBalance(eoa.address)
      expect(preBalance).to.deep.eq(postBalance)
    })
    it('should have a nonce equal to zero', async () => {
      // Nonce before needs to come from the specific block at which the dump was taken.
      const preNonce = await env.preL2Provider.getTransactionCount(
        eoa.address,
        env.config.stateDumpHeight
      )
      expect(preNonce).to.not.eq(0)
      // Nonce after can come from the latest block.
      const postNonce = await env.postL2Provider.getTransactionCount(eoa.address)
      expect(postNonce).to.deep.eq(0)
    })
  })
})
packages/regenesis-surgery/test/erc20.spec.ts deleted 100644 → 0
import { expect } from '@eth-optimism/core-utils/test/setup'
import { BigNumber } from 'ethers'
import { env } from './setup'

describe('erc20', () => {
  describe('standard ERC20', () => {
    before(async () => {
      await env.init()
    })
    it('ERC20s', () => {
      for (const [i, erc20] of env.erc20s.entries()) {
        describe(`erc20 ${i}/${env.erc20s.length} (${erc20.address})`, () => {
          it('should have the same storage', async () => {
            const account = env.surgeryDataSources.dump.find(
              (a) => a.address === erc20.address
            )
            if (account.storage) {
              for (const key of Object.keys(account.storage)) {
                const pre = await env.preL2Provider.getStorageAt(
                  account.address,
                  BigNumber.from(key)
                )
                const post = await env.postL2Provider.getStorageAt(
                  account.address,
                  BigNumber.from(key)
                )
                expect(pre).to.deep.eq(post)
              }
            }
          })
        })
      }
    })
  })
})
packages/regenesis-surgery/test/predeploy.spec.ts deleted 100644 → 0
import { ethers, BigNumber, Contract } from 'ethers'
import { expect, env, ERC20_ABI } from './setup'
import { GenesisJsonProvider } from './provider'
import { AccountType } from '../scripts/types'

describe('predeploys', () => {
  const predeploys = {
    eth: [],
    newNotEth: [],
    noWipe: [],
    wipe: [],
    weth: [],
  }
  // Base genesis file only
  let genesisStateProvider: GenesisJsonProvider
  // Old sequencer state
  let oldStateProvider: GenesisJsonProvider

  before(async () => {
    await env.init()
    predeploys.eth = env.getAccountsByType(AccountType.PREDEPLOY_ETH)
    predeploys.newNotEth = env.getAccountsByType(
      AccountType.PREDEPLOY_NEW_NOT_ETH
    )
    predeploys.noWipe = env.getAccountsByType(AccountType.PREDEPLOY_NO_WIPE)
    predeploys.wipe = env.getAccountsByType(AccountType.PREDEPLOY_WIPE)
    predeploys.weth = env.getAccountsByType(AccountType.PREDEPLOY_WETH)
    genesisStateProvider = new GenesisJsonProvider(
      env.surgeryDataSources.genesis
    )
    oldStateProvider = new GenesisJsonProvider(
      env.surgeryDataSources.configs.stateDumpFilePath
    )
  })

  describe('new predeploys that are not ETH', () => {
    for (const [i, account] of predeploys.newNotEth.entries()) {
      describe(`account ${i}/${predeploys.newNotEth.length} (${account.address})`, () => {
        it('should have the exact state specified in the base genesis file', async () => {
          const preBytecode = await genesisStateProvider.getCode(account.address)
          const postBytecode = await env.postL2Provider.getCode(account.address)
          expect(preBytecode).to.eq(postBytecode)
          const dumpAccount = env.surgeryDataSources.dump.find(
            (a) => a.address === account.address
          )
          if (dumpAccount.storage) {
            for (const key of Object.keys(dumpAccount.storage)) {
              const pre = await env.preL2Provider.getStorageAt(
                account.address,
                BigNumber.from(key)
              )
              const post = await env.postL2Provider.getStorageAt(
                account.address,
                BigNumber.from(key)
              )
              expect(pre).to.deep.eq(post)
            }
          }
          const preNonce = await genesisStateProvider.getTransactionCount(
            account.address,
            env.config.stateDumpHeight
          )
          const postNonce = await env.postL2Provider.getTransactionCount(
            account.address
          )
          expect(preNonce).to.deep.eq(postNonce)
          const preBalance = await genesisStateProvider.getBalance(
            account.address,
            env.config.stateDumpHeight
          )
          const postBalance = await env.postL2Provider.getBalance(
            account.address
          )
          expect(preBalance).to.deep.eq(postBalance)
        })
      })
    }
  })

  describe('predeploys where the old state should be wiped', () => {
    for (const [i, account] of predeploys.wipe.entries()) {
      describe(`account ${i}/${predeploys.wipe.length} (${account.address})`, () => {
        it('should have the code and storage of the base genesis file', async () => {
          const preBytecode = await genesisStateProvider.getCode(account.address)
          const postBytecode = await env.postL2Provider.getCode(account.address)
          expect(preBytecode).to.eq(postBytecode)
          const dumpAccount = env.surgeryDataSources.dump.find(
            (a) => a.address === account.address
          )
          if (dumpAccount.storage) {
            for (const key of Object.keys(dumpAccount.storage)) {
              const pre = await env.preL2Provider.getStorageAt(
                account.address,
                BigNumber.from(key)
              )
              const post = await env.postL2Provider.getStorageAt(
                account.address,
                BigNumber.from(key)
              )
              expect(pre).to.deep.eq(post)
            }
          }
        })
        it('should have the same nonce and balance as before', async () => {
          const preNonce = await oldStateProvider.getTransactionCount(
            account.address,
            env.config.stateDumpHeight
          )
          const postNonce = await env.postL2Provider.getTransactionCount(
            account.address
          )
          expect(preNonce).to.deep.eq(postNonce)
          const preBalance = await oldStateProvider.getBalance(
            account.address,
            env.config.stateDumpHeight
          )
          const postBalance = await env.postL2Provider.getBalance(
            account.address
          )
          expect(preBalance).to.deep.eq(postBalance)
        })
      })
    }
  })

  describe('predeploys where the old state should be preserved', () => {
    for (const [i, account] of predeploys.noWipe.entries()) {
      describe(`account ${i}/${predeploys.noWipe.length} (${account.address})`, () => {
        it('should have the code of the base genesis file', async () => {
          const preBytecode = await genesisStateProvider.getCode(account.address)
          const postBytecode = await env.postL2Provider.getCode(account.address)
          expect(preBytecode).to.eq(postBytecode)
        })
        it('should have the combined storage of the old and new state', async () => {
          const dumpAccount = env.surgeryDataSources.dump.find(
            (a) => a.address === account.address
          )
          if (dumpAccount.storage) {
            for (const key of Object.keys(dumpAccount.storage)) {
              const pre = await env.preL2Provider.getStorageAt(
                account.address,
                BigNumber.from(key)
              )
              const post = await env.postL2Provider.getStorageAt(
                account.address,
                BigNumber.from(key)
              )
              expect(pre).to.deep.eq(post)
            }
          }
        })
        it('should have the same nonce and balance as before', async () => {
          const preNonce = await oldStateProvider.getTransactionCount(
            account.address,
            env.config.stateDumpHeight
          )
          const postNonce = await env.postL2Provider.getTransactionCount(
            account.address
          )
          expect(preNonce).to.deep.eq(postNonce)
          const preBalance = await oldStateProvider.getBalance(
            account.address,
            env.config.stateDumpHeight
          )
          const postBalance = await env.postL2Provider.getBalance(
            account.address
          )
          expect(preBalance).to.deep.eq(postBalance)
        })
      })
    }
  })

  describe('OVM_ETH', () => {
    if (!env.hasLiveProviders()) {
      console.log('Cannot run pool contract tests without live provider')
      return
    }
    let OVM_ETH: Contract
    before(async () => {
      OVM_ETH = new ethers.Contract(
        predeploys.eth[0].address,
        ERC20_ABI,
        env.postL2Provider
      )
    })
    for (const [i, account] of predeploys.eth.entries()) {
      describe(`account ${i}/${predeploys.eth.length} (${account.address})`, () => {
        it('should have disabled ERC20 features', async () => {
          await expect(
            OVM_ETH.transfer(account.address, 100)
          ).to.be.revertedWith(
            'OVM_ETH: transfer is disabled pending further community discussion.'
          )
        })
        it('should have a new balance for WETH9 equal to the sum of the moved contract balances', async () => {
          // need live provider for WETH balances
        })
      })
    }
  })

  describe('WETH9', () => {
    for (const [i, account] of predeploys.weth.entries()) {
      describe(`account ${i}/${predeploys.weth.length} (${account.address})`, () => {
        it('should have no recorded ETH balance', async () => {
          const postBalance = await env.postL2Provider.getBalance(
            account.address
          )
          expect(postBalance.toNumber()).to.eq(0)
        })
        it('should have WETH balances for each contract that should move', async () => {
          if (!env.hasLiveProviders()) {
            console.log('Cannot run pool contract tests without live provider')
            return
          }
        })
        it('should have a balance equal to the sum of all moved balances', async () => {
          if (!env.hasLiveProviders()) {
            console.log('Cannot run pool contract tests without live provider')
            return
          }
        })
      })
    }
  })
})
packages/regenesis-surgery/test/provider.spec.ts deleted 100644 → 0
import { expect } from '@eth-optimism/core-utils/test/setup'
import { ethers, BigNumber } from 'ethers'
import { Genesis } from '@eth-optimism/core-utils/src/types'
import {
  remove0x,
  add0x,
} from '@eth-optimism/core-utils/src/common/hex-strings'
import { KECCAK256_RLP_S, KECCAK256_NULL_S } from 'ethereumjs-util'
import { GenesisJsonProvider } from './provider'

const account = '0x66a84544bed4ca45b3c024776812abf87728fbaf'

const genesis: Genesis = {
  config: {
    chainId: 0,
    homesteadBlock: 0,
    eip150Block: 0,
    eip155Block: 0,
    eip158Block: 0,
    byzantiumBlock: 0,
    constantinopleBlock: 0,
    petersburgBlock: 0,
    istanbulBlock: 0,
    muirGlacierBlock: 0,
    clique: {
      period: 0,
      epoch: 0,
    },
  },
  difficulty: '0x',
  gasLimit: '0x',
  extraData: '0x',
  alloc: {
    [remove0x(account)]: {
      nonce: 101,
      balance: '234',
      codeHash: ethers.utils.keccak256('0x6080'),
      root: '0x',
      code: '6080',
      storage: {
        '0000000000000000000000000000000000000000000000000000000000000002':
          '989680',
        '0000000000000000000000000000000000000000000000000000000000000003':
          '536f6d65205265616c6c7920436f6f6c20546f6b656e204e616d650000000036',
        '7d55c28652d09dd36b33c69e81e67cbe8d95f51dc46ab5b17568d616d481854d':
          '989680',
      },
    },
  },
}

describe('GenesisJsonProvider', () => {
  let provider
  before(() => {
    provider = new GenesisJsonProvider(genesis)
  })
  it('should get nonce', async () => {
    const nonce = await provider.getTransactionCount(account)
    expect(nonce).to.deep.eq(101)
  })
  it('should get nonce on missing account', async () => {
    const nonce = await provider.getTransactionCount('0x')
    expect(nonce).to.deep.eq(0)
  })
  it('should get code', async () => {
    const code = await provider.getCode(account)
    expect(code).to.deep.eq('0x6080')
  })
  it('should get code on missing account', async () => {
    const code = await provider.getCode('0x')
    expect(code).to.deep.eq('0x')
  })
  it('should get balance', async () => {
    const balance = await provider.getBalance(account)
    expect(balance.toString()).to.deep.eq(BigNumber.from(234).toString())
  })
  it('should get balance on missing account', async () => {
    const balance = await provider.getBalance('0x')
    expect(balance.toString()).to.deep.eq('0')
  })
  it('should get storage', async () => {
    const storage = await provider.getStorageAt(account, 2)
    expect(storage).to.deep.eq('0x989680')
  })
  it('should get storage of missing account', async () => {
    const storage = await provider.getStorageAt('0x', 0)
    expect(storage).to.deep.eq('0x')
  })
  it('should get storage of missing slot', async () => {
    const storage = await provider.getStorageAt(account, 9999999999999)
    expect(storage).to.deep.eq('0x')
  })
  it('should call eth_getProof', async () => {
    const proof = await provider.send('eth_getProof', [account])
    // There is code at the account, so it shouldn't be the null code hash
    expect(proof.codeHash).to.not.eq(add0x(KECCAK256_NULL_S))
    // There is storage so it should not be the null storage hash
    expect(proof.storageHash).to.not.eq(add0x(KECCAK256_RLP_S))
  })
  it('should call eth_getProof on missing account', async () => {
    const proof = await provider.send('eth_getProof', ['0x'])
    expect(proof.codeHash).to.eq(add0x(KECCAK256_NULL_S))
    expect(proof.storageHash).to.eq(add0x(KECCAK256_RLP_S))
  })
  it('should also initialize correctly with state dump', async () => {
    provider = new GenesisJsonProvider(genesis.alloc)
    expect(provider).to.be.instanceOf(GenesisJsonProvider)
  })
})
packages/regenesis-surgery/test/provider.ts deleted 100644 → 0
import path from 'path'
import { ethers } from 'ethers'
import { BigNumber } from '@ethersproject/bignumber'
import { Deferrable } from '@ethersproject/properties'
import { Provider } from '@ethersproject/providers'
import {
  Provider as AbstractProvider,
  EventType,
  TransactionRequest,
  TransactionResponse,
  TransactionReceipt,
  Filter,
  Log,
  Block,
  BlockWithTransactions,
  BlockTag,
  Listener,
} from '@ethersproject/abstract-provider'
import { KECCAK256_RLP_S, KECCAK256_NULL_S } from 'ethereumjs-util'
import { bytes32ify, remove0x, add0x } from '@eth-optimism/core-utils'

// Represents the ethereum state
export interface State {
  [address: string]: {
    nonce: number
    balance: string
    codeHash: string
    root: string
    code?: string
    storage?: {
      [key: string]: string
    }
  }
}

// Represents a genesis file that geth can consume
export interface Genesis {
  config: {
    chainId: number
    homesteadBlock: number
    eip150Block: number
    eip155Block: number
    eip158Block: number
    byzantiumBlock: number
    constantinopleBlock: number
    petersburgBlock: number
    istanbulBlock: number
    muirGlacierBlock: number
    clique: {
      period: number
      epoch: number
    }
  }
  difficulty: string
  gasLimit: string
  extraData: string
  alloc: State
}

export class GenesisJsonProvider implements AbstractProvider {
  state: State

  constructor(dump: string | Genesis | State) {
    let input
    if (typeof dump === 'string') {
      input = require(path.resolve(dump))
    } else if (typeof dump === 'object') {
      input = dump
    }
    this.state = input.alloc ? input.alloc : input
    if (this.state === null) {
      throw new Error('Must initialize with genesis or state object')
    }
    this._isProvider = false
  }

  async getBalance(
    addressOrName: string,
    // eslint-disable-next-line
    blockTag?: number | string
  ): Promise<BigNumber> {
    addressOrName = addressOrName.toLowerCase()
    const address = remove0x(addressOrName)
    const account = this.state[address] || this.state[addressOrName]
    if (!account || account.balance === '') {
      return BigNumber.from(0)
    }
    return BigNumber.from(account.balance)
  }

  async getTransactionCount(
    addressOrName: string,
    // eslint-disable-next-line
    blockTag?: number | string
  ): Promise<number> {
    addressOrName = addressOrName.toLowerCase()
    const address = remove0x(addressOrName)
    const account = this.state[address] || this.state[addressOrName]
    if (!account) {
      return 0
    }
    if (typeof account.nonce === 'number') {
      return account.nonce
    }
    if (account.nonce === '') {
      return 0
    }
    if (typeof account.nonce === 'string') {
      return BigNumber.from(account.nonce).toNumber()
    }
    return 0
  }

  async getCode(addressOrName: string): Promise<string> {
    addressOrName = addressOrName.toLowerCase()
    const address = remove0x(addressOrName)
    const account = this.state[address] || this.state[addressOrName]
    if (!account) {
      return '0x'
    }
    if (typeof account.code === 'string') {
      return add0x(account.code)
    }
    return '0x'
  }

  async getStorageAt(
    addressOrName: string,
    position: BigNumber | number
  ): Promise<string> {
    addressOrName = addressOrName.toLowerCase()
    const address = remove0x(addressOrName)
    const account = this.state[address] || this.state[addressOrName]
    if (!account) {
      return '0x'
    }
    const bytes32 = bytes32ify(position)
    const storage =
      account.storage[remove0x(bytes32)] || account.storage[bytes32]
    if (!storage) {
      return '0x'
    }
    return add0x(storage)
  }

  async call(
    transaction: Deferrable<TransactionRequest>,
    blockTag?: BlockTag | Promise<BlockTag>
  ): Promise<string> {
    throw new Error(
      `Unsupported Method: call with args: transaction - ${transaction}, blockTag - ${blockTag}`
    )
  }

  async send(method: string, args: Array<any>): Promise<any> {
    switch (method) {
      case 'eth_getProof': {
        const address = args[0]
        if (!address) {
          throw new Error('Must pass address as first arg')
        }
        const account = this.state[remove0x(address)] || this.state[address]
        // The account doesn't exist or is an EOA
        if (!account || !account.code || account.code === '0x') {
          return {
            codeHash: add0x(KECCAK256_NULL_S),
            storageHash: add0x(KECCAK256_RLP_S),
          }
        }
        return {
          codeHash: ethers.utils.keccak256(add0x(account.code)),
          storageHash: add0x(account.root),
        }
      }
      default:
        throw new Error(`Unsupported Method: send ${method}`)
    }
  }

  async getNetwork() {
    return undefined
  }

  async getBlockNumber(): Promise<number> {
    return 0
  }

  async getGasPrice(): Promise<BigNumber> {
    return BigNumber.from(0)
  }

  async getFeeData() {
    return undefined
  }

  async sendTransaction(
    signedTransaction: string | Promise<string>
  ): Promise<TransactionResponse> {
    throw new Error(
      `Unsupported Method: sendTransaction with args: transaction - ${signedTransaction}`
    )
  }

  async estimateGas(): Promise<BigNumber> {
    return BigNumber.from(0)
  }

  async getBlock(
    blockHashOrBlockTag: BlockTag | string | Promise<BlockTag | string>
  ): Promise<Block> {
    throw new Error(
      `Unsupported Method: getBlock with args blockHashOrBlockTag - ${blockHashOrBlockTag}`
    )
  }

  async getBlockWithTransactions(
    blockHashOrBlockTag: BlockTag | string | Promise<BlockTag | string>
  ): Promise<BlockWithTransactions> {
    throw new Error(
      `Unsupported Method: getBlockWithTransactions with args blockHashOrBlockTag - ${blockHashOrBlockTag}`
    )
  }

  async getTransaction(transactionHash: string): Promise<TransactionResponse> {
    throw new Error(
      `Unsupported Method: getTransaction with args transactionHash - ${transactionHash}`
    )
  }

  async getTransactionReceipt(
    transactionHash: string
  ): Promise<TransactionReceipt> {
    throw new Error(
      `Unsupported Method: getTransactionReceipt with args transactionHash - ${transactionHash}`
    )
  }

  async getLogs(filter: Filter): Promise<Array<Log>> {
    throw new Error(`Unsupported Method: getLogs with args filter - ${filter}`)
  }

  async resolveName(name: string | Promise<string>): Promise<null | string> {
    throw new Error(`Unsupported Method: resolveName with args name - ${name}`)
  }

  async lookupAddress(
    address: string | Promise<string>
  ): Promise<null | string> {
    throw new Error(
      `Unsupported Method: lookupAddress with args address - ${address}`
    )
  }

  on(eventName: EventType, listener: Listener): Provider {
    throw new Error(
      `Unsupported Method: on with args eventName - ${eventName}, listener - ${listener}`
    )
  }

  once(eventName: EventType, listener: Listener): Provider {
    throw new Error(
      `Unsupported Method: once with args eventName - ${eventName}, listener - ${listener}`
    )
  }

  emit(eventName: EventType, ...args: Array<any>): boolean {
    throw new Error(
      `Unsupported Method: emit with args eventName - ${eventName}, args - ${args}`
    )
  }

  listenerCount(eventName?: EventType): number {
    throw new Error(
      `Unsupported Method: listenerCount with args eventName - ${eventName}`
    )
  }

  listeners(eventName?: EventType): Array<Listener> {
    throw new Error(
      `Unsupported Method: listeners with args eventName - ${eventName}`
    )
  }

  off(eventName: EventType, listener?: Listener): Provider {
    throw new Error(
      `Unsupported Method: off with args eventName - ${eventName}, listener - ${listener}`
    )
  }

  removeAllListeners(eventName?: EventType): Provider {
    throw new Error(
      `Unsupported Method: removeAllListeners with args eventName - ${eventName}`
    )
  }

  addListener(eventName: EventType, listener: Listener): Provider {
    throw new Error(
      `Unsupported Method: addListener with args eventName - ${eventName}, listener - ${listener}`
    )
  }

  removeListener(eventName: EventType, listener: Listener): Provider {
    throw new Error(
      `Unsupported Method: removeListener with args eventName - ${eventName}, listener - ${listener}`
    )
  }

  async waitForTransaction(
    transactionHash: string,
    confirmations?: number,
    timeout?: number
  ): Promise<TransactionReceipt> {
    throw new Error(
      `Unsupported Method: waitForTransaction with args transactionHash - ${transactionHash}, confirmations - ${confirmations}, timeout - ${timeout}`
    )
  }

  readonly _isProvider: boolean
}
packages/regenesis-surgery/test/setup.ts deleted 100644 → 0
/* External Imports */
import
chai
=
require
(
'
chai
'
)
import
Mocha
from
'
mocha
'
import
chaiAsPromised
from
'
chai-as-promised
'
import
*
as
dotenv
from
'
dotenv
'
import
{
getenv
,
remove0x
}
from
'
@eth-optimism/core-utils
'
import
{
providers
,
BigNumber
}
from
'
ethers
'
import
{
solidity
}
from
'
ethereum-waffle
'
import
{
GenesisJsonProvider
}
from
'
./provider
'
import
{
SurgeryDataSources
,
Account
,
AccountType
}
from
'
../scripts/types
'
import
{
loadSurgeryData
}
from
'
../scripts/data
'
import
{
classify
,
classifiers
}
from
'
../scripts/classifiers
'
// Chai plugins go here.
chai
.
use
(
chaiAsPromised
)
chai
.
use
(
solidity
)
const
should
=
chai
.
should
()
const
expect
=
chai
.
expect
dotenv
.
config
()
export
const
NUM_ACCOUNTS_DIVISOR
=
4096
export
const
ERC20_ABI
=
[
'
function balanceOf(address owner) view returns (uint256)
'
,
]
interface
TestEnvConfig
{
preL2ProviderUrl
:
string
|
null
postL2ProviderUrl
:
string
|
null
postSurgeryGenesisFilePath
:
string
stateDumpHeight
:
string
|
number
}
const
config
=
():
TestEnvConfig
=>
{
const
height
=
getenv
(
'
REGEN__STATE_DUMP_HEIGHT
'
)
return
{
// Optional config params for running against live nodes
preL2ProviderUrl
:
getenv
(
'
REGEN__PRE_L2_PROVIDER_URL
'
),
postL2ProviderUrl
:
getenv
(
'
REGEN__POST_L2_PROVIDER_URL
'
),
// File path to the post regenesis file to read
postSurgeryGenesisFilePath
:
getenv
(
'
REGEN__POST_GENESIS_FILE_PATH
'
),
stateDumpHeight
:
parseInt
(
height
,
10
)
||
'
latest
'
,
}
}
interface
TypedAccount
extends
Account
{
type
:
AccountType
}
// A TestEnv that contains all of the required test data
class
TestEnv
{
// Config
config
:
TestEnvConfig
// An L2 provider configured to be able to query a pre
// regenesis L2 node. This node should be synced to the
// height that the state dump was taken
preL2Provider
:
providers
.
StaticJsonRpcProvider
|
GenesisJsonProvider
// An L2 provider configured to be able to query a post
  // regenesis L2 node. This L2 node was initialized with
  // the results of the state surgery script
  postL2Provider: providers.StaticJsonRpcProvider | GenesisJsonProvider

  // The datasources used for doing state surgery
  surgeryDataSources: SurgeryDataSources

  // List of typed accounts in the input dump
  accounts: TypedAccount[] = []

  // List of erc20 contracts in input dump
  erc20s: Account[] = []

  constructor(opts: TestEnvConfig) {
    this.config = opts
    // If the pre provider url is provided, use a json rpc provider.
    // Otherwise, initialize a preL2Provider in the init function
    // since it depends on surgery data sources
    if (opts.preL2ProviderUrl) {
      this.preL2Provider = new providers.StaticJsonRpcProvider(
        opts.preL2ProviderUrl
      )
    }
    if (opts.postL2ProviderUrl) {
      this.postL2Provider = new providers.StaticJsonRpcProvider(
        opts.postL2ProviderUrl
      )
    } else {
      if (!opts.postSurgeryGenesisFilePath) {
        throw new Error('Must configure REGEN__POST_GENESIS_FILE_PATH')
      }
      console.log('Using GenesisJsonProvider for postL2Provider')
      this.postL2Provider = new GenesisJsonProvider(
        opts.postSurgeryGenesisFilePath
      )
    }
  }

  // Read the big files from disk. Without bumping the size of the nodejs heap,
  // this can OOM the process. Prefix the test command with:
  // $ NODE_OPTIONS=--max_old_space_size=8192
  async init() {
    if (this.surgeryDataSources === undefined) {
      this.surgeryDataSources = await loadSurgeryData()
      if (!this.preL2Provider) {
        console.log('Initializing pre GenesisJsonProvider...')
        // Convert the genesis dump into a genesis file format
        const genesis = { ...this.surgeryDataSources.genesis }
        for (const account of this.surgeryDataSources.dump) {
          let nonce = account.nonce
          if (typeof nonce === 'string') {
            if (nonce === '') {
              nonce = 0
            } else {
              nonce = BigNumber.from(nonce).toNumber()
            }
          }
          genesis.alloc[remove0x(account.address).toLowerCase()] = {
            nonce,
            balance: account.balance,
            codeHash: remove0x(account.codeHash),
            root: remove0x(account.root),
            code: remove0x(account.code),
            storage: {},
          }
          // Fill in the storage if it exists
          if (account.storage) {
            for (const [key, value] of Object.entries(account.storage)) {
              genesis.alloc[remove0x(account.address).toLowerCase()].storage[
                remove0x(key)
              ] = remove0x(value)
            }
          }
        }
        // Create the pre L2 provider using the built genesis object
        this.preL2Provider = new GenesisJsonProvider(genesis)
      }

      // Classify the accounts once, this takes a while so it's better to cache it.
      console.log(`Classifying accounts...`)
      for (const account of this.surgeryDataSources.dump) {
        const accountType = classify(account, this.surgeryDataSources)
        this.accounts.push({
          ...account,
          type: accountType,
        })
        if (classifiers[AccountType.ERC20](account, this.surgeryDataSources)) {
          this.erc20s.push(account)
        }
      }
    }
  }

  // isProvider is false when it is not live
  hasLiveProviders(): boolean {
    return this.postL2Provider._isProvider
  }

  getAccountsByType(type: AccountType) {
    return this.accounts.filter((account) => account.type === type)
  }
}

// Create a singleton test env that can be imported into each
// test file. It is important that the async operations are only
// called once as they take awhile. Each test file should be sure
// to call `env.init()` in a `before` clause to ensure that
// the files are read from disk at least once
let env: TestEnv
try {
  if (env === undefined) {
    const cfg = config()
    env = new TestEnv(cfg)
  }
} catch (e) {
  console.error(`unable to initialize test env: ${e.toString()}`)
}

export { should, expect, Mocha, env }
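As the comment above `init()` notes, parsing the multi-gigabyte state dump can exhaust Node's default old-space heap. A minimal sketch of the flag usage (the actual V8 flag is `--max_old_space_size`, value in MiB; the `v8` heap check is standard Node.js, the test command itself depends on the package's scripts and is omitted):

```shell
# Raise the old-space ceiling to 8192 MiB before running the test suite.
export NODE_OPTIONS=--max_old_space_size=8192

# Confirm the new limit took effect (heap_size_limit is reported in bytes).
node -e 'console.log(require("v8").getHeapStatistics().heap_size_limit)'
```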
packages/regenesis-surgery/test/uniswap.spec.ts deleted 100644 → 0 (View file @ 69779a27)
import { ethers } from 'ethers'
import { abi as UNISWAP_POOL_ABI } from '@uniswap/v3-core/artifacts/contracts/UniswapV3Pool.sol/UniswapV3Pool.json'
import { expect, env, ERC20_ABI } from './setup'
import { UNISWAP_V3_NFPM_ADDRESS } from '../scripts/constants'
import { getUniswapV3Factory, replaceWETH } from '../scripts/utils'
import { AccountType } from '../scripts/types'

describe('uniswap contracts', () => {
  before(async () => {
    await env.init()
  })

  it('V3 factory', () => {
    if (!env.hasLiveProviders()) {
      console.log('Cannot run factory tests without live provider')
      return
    }

    let preUniswapV3Factory: ethers.Contract
    let postUniswapV3Factory: ethers.Contract

    before(async () => {
      preUniswapV3Factory = getUniswapV3Factory(env.preL2Provider)
      postUniswapV3Factory = getUniswapV3Factory(env.postL2Provider)
    })

    it('should have the same owner', async () => {
      if (!env.hasLiveProviders()) {
        console.log('Cannot run factory tests without live provider')
        return
      }
      const preOwner = await preUniswapV3Factory.owner()
      const postOwner = await postUniswapV3Factory.owner()
      expect(preOwner).to.equal(postOwner)
    })

    it('should have the same feeAmountTickSpacing map values', async () => {
      if (!env.hasLiveProviders()) {
        console.log('Cannot run factory tests without live provider')
        return
      }
      for (const fee of [500, 3000, 10000]) {
        const preValue = await preUniswapV3Factory.feeAmountTickSpacing(fee)
        const postValue = await postUniswapV3Factory.feeAmountTickSpacing(fee)
        expect(preValue).to.deep.equal(postValue)
      }
    })

    it('should have the right pool addresses', async () => {
      if (!env.hasLiveProviders()) {
        console.log('Cannot run factory tests without live provider')
        return
      }
      for (const pool of env.surgeryDataSources.pools) {
        const remotePoolAddress1 = await postUniswapV3Factory.getPool(
          pool.token0,
          pool.token1,
          pool.fee
        )
        const remotePoolAddress2 = await postUniswapV3Factory.getPool(
          pool.token1,
          pool.token0,
          pool.fee
        )
        expect(remotePoolAddress1).to.equal(remotePoolAddress2)
        expect(remotePoolAddress1.toLowerCase()).to.equal(
          pool.newAddress.toLowerCase()
        )
      }
    })

    // Debug this one...
    it('should have the same code as on mainnet', async () => {
      let l2Code = await env.postL2Provider.getCode(
        postUniswapV3Factory.address
      )
      l2Code = replaceWETH(l2Code)
      const l1Code = await env.surgeryDataSources.ethProvider.getCode(
        postUniswapV3Factory.address
      )
      expect(l2Code).to.not.equal('0x')
      expect(l2Code).to.equal(l1Code)
    })
  })

  describe('V3 NFPM', () => {
    it('should have the same code as on mainnet', async () => {
      const l2Code = await env.postL2Provider.getCode(UNISWAP_V3_NFPM_ADDRESS)
      let l1Code = await env.surgeryDataSources.ethProvider.getCode(
        UNISWAP_V3_NFPM_ADDRESS
      )
      l1Code = replaceWETH(l1Code)
      expect(l2Code).to.not.equal('0x')
      expect(l2Code).to.equal(l1Code)
    })

    // TODO: what's the best way to test the _poolIds change?
  })

  describe('V3 pools', () => {
    it('Pools code', () => {
      for (const pool of env.surgeryDataSources.pools) {
        describe(`pool at address ${pool.newAddress}`, () => {
          it('should have the same code as on testnet', async () => {
            const l2Code = await env.postL2Provider.getCode(pool.newAddress)
            const l1Code = await env.surgeryDataSources.ropstenProvider.getCode(
              pool.newAddress
            )
            expect(l2Code).to.not.equal('0x')
            expect(l2Code).to.equal(l1Code)
          })
        })
      }
    })

    it('Pools contract', () => {
      if (!env.hasLiveProviders()) {
        console.log('Cannot run pool contract tests without live provider')
        return
      }
      for (const pool of env.surgeryDataSources.pools) {
        describe(`pool at address ${pool.newAddress}`, () => {
          let prePoolContract: ethers.Contract
          let postPoolContract: ethers.Contract

          before(async () => {
            prePoolContract = new ethers.Contract(
              pool.oldAddress,
              UNISWAP_POOL_ABI,
              env.preL2Provider
            )
            postPoolContract = new ethers.Contract(
              pool.newAddress,
              UNISWAP_POOL_ABI,
              env.postL2Provider
            )
          })

          it('should have the same code as on testnet', async () => {
            const l2Code = await env.postL2Provider.getCode(
              postPoolContract.address
            )
            const l1Code = await env.surgeryDataSources.ethProvider.getCode(
              postPoolContract.address
            )
            expect(l2Code).to.not.equal('0x')
            expect(l2Code).to.equal(l1Code)
          })

          it('should have the same storage values', async () => {
            const varsToCheck = [
              'slot0',
              'feeGrowthGlobal0X128',
              'feeGrowthGlobal1X128',
              'protocolFees',
              'liquidity',
              'factory',
              'token0',
              'token1',
              'fee',
              'tickSpacing',
              'maxLiquidityPerTick',
            ]
            for (const varName of varsToCheck) {
              const preValue = await prePoolContract[varName]({
                blockTag: env.config.stateDumpHeight,
              })
              const postValue = await postPoolContract[varName]()
              expect(preValue).to.deep.equal(postValue)
            }
          })

          it('should have the same token balances as before', async () => {
            const baseERC20 = new ethers.Contract(
              ethers.constants.AddressZero,
              ERC20_ABI
            )
            const preToken0 = baseERC20
              .attach(pool.token0)
              .connect(env.preL2Provider)
            const postToken0 = baseERC20
              .attach(pool.token0)
              .connect(env.postL2Provider)
            const preToken1 = baseERC20
              .attach(pool.token1)
              .connect(env.preL2Provider)
            const postToken1 = baseERC20
              .attach(pool.token1)
              .connect(env.postL2Provider)

            // Token0 might not have any code in the new system, we can skip this check if so.
            const newToken0Code = await env.postL2Provider.getCode(pool.token0)
            if (newToken0Code !== '0x') {
              const preBalance0 = await preToken0.balanceOf(pool.oldAddress, {
                blockTag: env.config.stateDumpHeight,
              })
              const postBalance0 = await postToken0.balanceOf(pool.newAddress)
              expect(preBalance0).to.deep.equal(postBalance0)
            }

            // Token1 might not have any code in the new system, we can skip this check if so.
            const newToken1Code = await env.postL2Provider.getCode(pool.token1)
            if (newToken1Code !== '0x') {
              const preBalance1 = await preToken1.balanceOf(pool.oldAddress, {
                blockTag: env.config.stateDumpHeight,
              })
              const postBalance1 = await postToken1.balanceOf(pool.newAddress)
              expect(preBalance1).to.deep.equal(postBalance1)
            }
          })
        })
      }
      // TODO: add a test for minting positions?
    })
  })

  describe('other', () => {
    let accs
    before(async () => {
      accs = env.getAccountsByType(AccountType.UNISWAP_V3_OTHER)
    })

    // TODO: for some reason these tests fail
    it('Other uniswap contracts', () => {
      for (const acc of accs) {
        describe(`uniswap contract at address ${acc.address}`, () => {
          it('should have the same code as on mainnet', async () => {
            const l2Code = await env.postL2Provider.getCode(acc.address)
            let l1Code = await env.surgeryDataSources.ethProvider.getCode(
              acc.address
            )
            l1Code = replaceWETH(l1Code)
            expect(l2Code).to.not.equal('0x')
            expect(l2Code).to.equal(l1Code)
          })
        })
      }
    })
  })
})
packages/regenesis-surgery/test/utils.ts deleted 100644 → 0 (View file @ 69779a27)
import fs from 'fs/promises'
import path from 'path'
import { expect } from '@eth-optimism/core-utils/test/setup'
import { isBytecodeERC20 } from '../scripts/utils'

describe('Utils', () => {
  // Read in the mock data
  const contracts = {}
  before(async () => {
    const files = await fs.readdir(path.join(__dirname, 'data'))
    for (const filename of files) {
      const file = await fs.readFile(path.join(__dirname, 'data', filename))
      const name = path.parse(filename).name
      const json = JSON.parse(file.toString())
      contracts[name] = {
        bytecode: json.bytecode.toString().trim(),
        expected: json.expected,
      }
    }
  })

  it('isBytecodeERC20', () => {
    for (const [name, contract] of Object.entries(contracts)) {
      describe(`contract ${name}`, () => {
        it('should be identified erc20', () => {
          const result = isBytecodeERC20((contract as any).bytecode as string)
          expect(result).to.eq((contract as any).expected)
        })
      })
    }
  })
})
packages/regenesis-surgery/test/verified.spec.ts deleted 100644 → 0 (View file @ 69779a27)
/* eslint-disable @typescript-eslint/no-empty-function */
import { expect, env, NUM_ACCOUNTS_DIVISOR } from './setup'
import { AccountType } from '../scripts/types'

describe('verified', () => {
  let verified
  before(async () => {
    await env.init()
    verified = env.getAccountsByType(AccountType.VERIFIED)
  })

  it('accounts', async () => {
    for (const [i, account] of verified.entries()) {
      if (i % NUM_ACCOUNTS_DIVISOR === 0) {
        const preBytecode = await env.preL2Provider.getCode(account.address)
        const postBytecode = await env.postL2Provider.getCode(account.address)

        describe(`account ${i}/${verified.length} (${account.address})`, () => {
          it('should have new bytecode with equal or smaller size', async () => {
            const preSize = preBytecode.length
            const postSize = postBytecode.length
            expect(preSize >= postSize).to.be.true
          })

          it('should have the same nonce and balance', async () => {
            const preNonce = await env.preL2Provider.getTransactionCount(
              account.address,
              env.config.stateDumpHeight
            )
            const postNonce = await env.postL2Provider.getTransactionCount(
              account.address
            )
            expect(preNonce).to.deep.eq(postNonce)

            const preBalance = await env.preL2Provider.getBalance(
              account.address,
              env.config.stateDumpHeight
            )
            const postBalance = await env.postL2Provider.getBalance(
              account.address
            )
            expect(preBalance).to.deep.eq(postBalance)
          })
        })
      }
    }
  })
})
packages/regenesis-surgery/tsconfig.build.json deleted 100644 → 0 (View file @ 69779a27)
{
  "extends": "../../tsconfig.build.json",
  "compilerOptions": {
    "rootDir": "./src",
    "outDir": "./dist"
  },
  "include": ["src/**/*"]
}
packages/regenesis-surgery/tsconfig.json deleted 100644 → 0 (View file @ 69779a27)
{
  "extends": "../../tsconfig.json"
}
yarn.lock (View file @ ed234a8d)
@@ -457,11 +457,6 @@
     rxjs "^7.2.0"
     semver "^7.3.5"

-"@discoveryjs/json-ext@^0.5.3":
-  version "0.5.3"
-  resolved "https://registry.yarnpkg.com/@discoveryjs/json-ext/-/json-ext-0.5.3.tgz#90420f9f9c6d3987f176a19a7d8e764271a2f55d"
-  integrity sha512-Fxt+AfXgjMoin2maPIYzFZnQjAXjAL0PHscM5pRTtatFqB+vZxAM9tLp2Optnuw3QOQC40jTNeGYFOMvyf7v9g==
-
 "@ensdomains/ens@^0.4.4":
   version "0.4.5"
   resolved "https://registry.yarnpkg.com/@ensdomains/ens/-/ens-0.4.5.tgz#e0aebc005afdc066447c6e22feb4eda89a5edbfc"

@@ -3070,13 +3065,6 @@
     "@types/node" "*"
     form-data "^3.0.0"

-"@types/node-fetch@^3.0.3":
-  version "3.0.3"
-  resolved "https://registry.yarnpkg.com/@types/node-fetch/-/node-fetch-3.0.3.tgz#9d969c9a748e841554a40ee435d26e53fa3ee899"
-  integrity sha512-HhggYPH5N+AQe/OmN6fmhKmRRt2XuNJow+R3pQwJxOOF9GuwM7O2mheyGeIrs5MOIeNjDEdgdoyHBOrFeJBR3g==
-  dependencies:
-    node-fetch "*"
-
 "@types/node@*":
   version "16.7.1"
   resolved "https://registry.yarnpkg.com/@types/node/-/node-16.7.1.tgz#c6b9198178da504dfca1fd0be9b2e1002f1586f0"

@@ -3346,7 +3334,7 @@
   resolved "https://registry.yarnpkg.com/@uniswap/v2-core/-/v2-core-1.0.1.tgz#af8f508bf183204779938969e2e54043e147d425"
   integrity sha512-MtybtkUPSyysqLY2U210NBDeCHX+ltHt3oADGdjqoThZaFRDKwM6k1Nb3F0A3hk5hwuQvytFWhrWHOEq6nVJ8Q==

-"@uniswap/v3-core@1.0.0", "@uniswap/v3-core@^1.0.0":
+"@uniswap/v3-core@1.0.0":
   version "1.0.0"
   resolved "https://registry.yarnpkg.com/@uniswap/v3-core/-/v3-core-1.0.0.tgz#6c24adacc4c25dceee0ba3ca142b35adbd7e359d"
   integrity sha512-kSC4djMGKMHj7sLMYVnn61k9nu+lHjMIxgg9CDQT+s2QYLoA56GbSK9Oxr+qJXzzygbkrmuY6cwgP6cW2JXPFA==

@@ -3363,19 +3351,6 @@
     base64-sol "1.0.1"
     hardhat-watcher "^2.1.1"

-"@uniswap/v3-sdk@^3.5.1":
-  version "3.5.1"
-  resolved "https://registry.yarnpkg.com/@uniswap/v3-sdk/-/v3-sdk-3.5.1.tgz#441e8e44d1ea576964d726903ec10a150cf916ea"
-  integrity sha512-DcLtlnWfkKpc5cqzKMLeFub+E8lvqTemObOYLqOZo6DdMAeLqawj8g2oo9o2JuhSVW/VXQDuSDe2V68BYF+U+w==
-  dependencies:
-    "@ethersproject/abi" "^5.0.12"
-    "@ethersproject/solidity" "^5.0.9"
-    "@uniswap/sdk-core" "^3.0.1"
-    "@uniswap/v3-periphery" "^1.1.1"
-    "@uniswap/v3-staker" "1.0.0"
-    tiny-invariant "^1.1.0"
-    tiny-warning "^1.0.3"
-
 "@uniswap/v3-sdk@^3.6.2":
   version "3.6.2"
   resolved "https://registry.yarnpkg.com/@uniswap/v3-sdk/-/v3-sdk-3.6.2.tgz#45fa659f7642e8807cb36939e4426355c7a5943c"

@@ -5188,7 +5163,7 @@ chokidar@3.5.1:
   optionalDependencies:
     fsevents "~2.3.1"

-chokidar@3.5.2, chokidar@^3.4.0, chokidar@^3.4.3, chokidar@^3.5.2:
+chokidar@^3.4.0, chokidar@^3.4.3, chokidar@^3.5.2:
   version "3.5.2"
   resolved "https://registry.yarnpkg.com/chokidar/-/chokidar-3.5.2.tgz#dba3976fcadb016f66fd365021d91600d01c1e75"
   integrity sha512-ekGhOnNVPgT77r4K/U3GDhu+FQ2S8TnK/s2KbIGXi0SZWuwkZ2QNyfWdZW+TVfn84DpEP7rLeCt2UI6bJ8GwbQ==

@@ -5915,11 +5890,6 @@ dashdash@^1.12.0:
   dependencies:
     assert-plus "^1.0.0"

-data-uri-to-buffer@^3.0.1:
-  version "3.0.1"
-  resolved "https://registry.yarnpkg.com/data-uri-to-buffer/-/data-uri-to-buffer-3.0.1.tgz#594b8973938c5bc2c33046535785341abc4f3636"
-  integrity sha512-WboRycPNsVw3B3TL559F7kuBUM4d8CgMEvk6xEJlOp7OBPjt6G7z8WMWlD2rOFZLk6OYfFIUGsCOWzcQH9K2og==
-
 dateformat@^3.0.0:
   version "3.0.3"
   resolved "https://registry.yarnpkg.com/dateformat/-/dateformat-3.0.3.tgz#a6e37499a4d9a9cf85ef5872044d62901c9889ae"

@@ -5949,7 +5919,7 @@ debug@3.2.6:
   dependencies:
     ms "^2.1.1"

-debug@4, debug@4.3.2, debug@^4.0.1, debug@^4.1.0, debug@^4.1.1, debug@^4.3.1, debug@^4.3.2:
+debug@4, debug@^4.0.1, debug@^4.1.0, debug@^4.1.1, debug@^4.3.1, debug@^4.3.2:
   version "4.3.2"
   resolved "https://registry.yarnpkg.com/debug/-/debug-4.3.2.tgz#f0a49c18ac8779e31d4a0c6029dfb76873c7428b"
   integrity sha512-mOp8wKcvj7XxC78zLgw/ZA+6TSgkoE2C/ienthhRD298T7UNwAg9diBpLRxC0mOezLl4B0xV7M0cCO6P/O0Xhw==

@@ -7228,17 +7198,6 @@ ethereumjs-util@^7.0.10, ethereumjs-util@^7.0.2, ethereumjs-util@^7.0.7, ethereu
     ethjs-util "0.1.6"
     rlp "^2.2.4"

-ethereumjs-util@^7.1.3:
-  version "7.1.3"
-  resolved "https://registry.yarnpkg.com/ethereumjs-util/-/ethereumjs-util-7.1.3.tgz#b55d7b64dde3e3e45749e4c41288238edec32d23"
-  integrity sha512-y+82tEbyASO0K0X1/SRhbJJoAlfcvq8JbrG4a5cjrOks7HS/36efU/0j2flxCPOUM++HFahk33kr/ZxyC4vNuw==
-  dependencies:
-    "@types/bn.js" "^5.1.0"
-    bn.js "^5.1.2"
-    create-hash "^1.1.2"
-    ethereum-cryptography "^0.1.3"
-    rlp "^2.2.4"
-
 ethereumjs-vm@4.2.0:
   version "4.2.0"
   resolved "https://registry.yarnpkg.com/ethereumjs-vm/-/ethereumjs-vm-4.2.0.tgz#e885e861424e373dbc556278f7259ff3fca5edab"

@@ -7691,13 +7650,6 @@ fastq@^1.6.0:
   dependencies:
     reusify "^1.0.4"

-fetch-blob@^3.1.2:
-  version "3.1.2"
-  resolved "https://registry.yarnpkg.com/fetch-blob/-/fetch-blob-3.1.2.tgz#6bc438675f3851ecea51758ac91f6a1cd1bacabd"
-  integrity sha512-hunJbvy/6OLjCD0uuhLdp0mMPzP/yd2ssd1t2FCJsaA7wkWhpbp9xfuNVpv7Ll4jFhzp6T4LAupSiV9uOeg0VQ==
-  dependencies:
-    web-streams-polyfill "^3.0.3"
-
 fetch-ponyfill@^4.0.0:
   version "4.1.0"
   resolved "https://registry.yarnpkg.com/fetch-ponyfill/-/fetch-ponyfill-4.1.0.tgz#ae3ce5f732c645eab87e4ae8793414709b239893"

@@ -8369,26 +8321,26 @@ glob@7.1.6:
     once "^1.3.0"
     path-is-absolute "^1.0.0"

-glob@7.1.7, glob@^7.0.0, glob@^7.0.5, glob@^7.1.1, glob@^7.1.2, glob@^7.1.3, glob@^7.1.4, glob@^7.1.6, glob@~7.1.7:
-  version "7.1.7"
-  resolved "https://registry.yarnpkg.com/glob/-/glob-7.1.7.tgz#3b193e9233f01d42d0b3f78294bbeeb418f94a90"
-  integrity sha512-OvD9ENzPLbegENnYP5UUfJIirTg4+XwMWGaQfQTY0JenxNvvIKP3U3/tAQSPIu/lHxXYSZmpXlUHeqAIdKzBLQ==
+glob@^5.0.15:
+  version "5.0.15"
+  resolved "https://registry.yarnpkg.com/glob/-/glob-5.0.15.tgz#1bc936b9e02f4a603fcc222ecf7633d30b8b93b1"
+  integrity sha1-G8k2ueAvSmA/zCIuz3Yz0wuLk7E=
   dependencies:
-    fs.realpath "^1.0.0"
     inflight "^1.0.4"
     inherits "2"
-    minimatch "^3.0.4"
+    minimatch "2 || 3"
     once "^1.3.0"
     path-is-absolute "^1.0.0"

-glob@^5.0.15:
-  version "5.0.15"
-  resolved "https://registry.yarnpkg.com/glob/-/glob-5.0.15.tgz#1bc936b9e02f4a603fcc222ecf7633d30b8b93b1"
-  integrity sha1-G8k2ueAvSmA/zCIuz3Yz0wuLk7E=
+glob@^7.0.0, glob@^7.0.5, glob@^7.1.1, glob@^7.1.2, glob@^7.1.3, glob@^7.1.4, glob@^7.1.6, glob@~7.1.7:
+  version "7.1.7"
+  resolved "https://registry.yarnpkg.com/glob/-/glob-7.1.7.tgz#3b193e9233f01d42d0b3f78294bbeeb418f94a90"
+  integrity sha512-OvD9ENzPLbegENnYP5UUfJIirTg4+XwMWGaQfQTY0JenxNvvIKP3U3/tAQSPIu/lHxXYSZmpXlUHeqAIdKzBLQ==
   dependencies:
+    fs.realpath "^1.0.0"
     inflight "^1.0.4"
     inherits "2"
-    minimatch "2 || 3"
+    minimatch "^3.0.4"
     once "^1.3.0"
     path-is-absolute "^1.0.0"

@@ -9752,13 +9704,6 @@ js-yaml@4.0.0:
   dependencies:
     argparse "^2.0.1"

-js-yaml@4.1.0:
-  version "4.1.0"
-  resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-4.1.0.tgz#c1fb65f8f5017901cdd2c951864ba18458a10602"
-  integrity sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==
-  dependencies:
-    argparse "^2.0.1"
-
 jsbi@^3.1.4:
   version "3.2.5"
   resolved "https://registry.yarnpkg.com/jsbi/-/jsbi-3.2.5.tgz#b37bb90e0e5c2814c1c2a1bcd8c729888a2e37d6"

@@ -10519,7 +10464,7 @@ log-symbols@4.0.0:
   dependencies:
     chalk "^4.0.0"

-log-symbols@4.1.0, log-symbols@^4.1.0:
+log-symbols@^4.1.0:
   version "4.1.0"
   resolved "https://registry.yarnpkg.com/log-symbols/-/log-symbols-4.1.0.tgz#3fbdbb95b4683ac9fc785111e792e558d4abd503"
   integrity sha512-8XPvpAA8uyhfteu8pIvQxpJZ7SYYdpUivZpGy6sFsBuKRY/7rQGavedeB8aK+Zkyq6upMFVL/9AW6vOYzfRyLg==

@@ -11242,36 +11187,6 @@ mocha@^8.4.0:
     yargs-parser "20.2.4"
     yargs-unparser "2.0.0"

-mocha@^9.1.2:
-  version "9.1.2"
-  resolved "https://registry.yarnpkg.com/mocha/-/mocha-9.1.2.tgz#93f53175b0f0dc4014bd2d612218fccfcf3534d3"
-  integrity sha512-ta3LtJ+63RIBP03VBjMGtSqbe6cWXRejF9SyM9Zyli1CKZJZ+vfCTj3oW24V7wAphMJdpOFLoMI3hjJ1LWbs0w==
-  dependencies:
-    "@ungap/promise-all-settled" "1.1.2"
-    ansi-colors "4.1.1"
-    browser-stdout "1.3.1"
-    chokidar "3.5.2"
-    debug "4.3.2"
-    diff "5.0.0"
-    escape-string-regexp "4.0.0"
-    find-up "5.0.0"
-    glob "7.1.7"
-    growl "1.10.5"
-    he "1.2.0"
-    js-yaml "4.1.0"
-    log-symbols "4.1.0"
-    minimatch "3.0.4"
-    ms "2.1.3"
-    nanoid "3.1.25"
-    serialize-javascript "6.0.0"
-    strip-json-comments "3.1.1"
-    supports-color "8.1.1"
-    which "2.0.2"
-    workerpool "6.1.5"
-    yargs "16.2.0"
-    yargs-parser "20.2.4"
-    yargs-unparser "2.0.0"
-
 mock-fs@^4.1.0:
   version "4.14.0"
   resolved "https://registry.yarnpkg.com/mock-fs/-/mock-fs-4.14.0.tgz#ce5124d2c601421255985e6e94da80a7357b1b18"

@@ -11409,11 +11324,6 @@ nanoid@3.1.20:
   resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.20.tgz#badc263c6b1dcf14b71efaa85f6ab4c1d6cfc788"
   integrity sha512-a1cQNyczgKbLX9jwbS/+d7W8fX/RfgYR7lVWwWOGIPNgK2m0MWvrGF6/m4kk6U3QcFMnZf3RIhL0v2Jgh/0Uxw==

-nanoid@3.1.25:
-  version "3.1.25"
-  resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.25.tgz#09ca32747c0e543f0e1814b7d3793477f9c8e152"
-  integrity sha512-rdwtIXaXCLFAQbnfqDRnI6jaRHp9fTcYBjtFKE8eezcZ7LuLjhUaQGNeMXf1HmRoCH32CLz6XwX0TtxEOS/A3Q==
-
 nanomatch@^1.2.9:
   version "1.2.13"
   resolved "https://registry.yarnpkg.com/nanomatch/-/nanomatch-1.2.13.tgz#b87a8aa4fc0de8fe6be88895b38983ff265bd119"

@@ -11521,20 +11431,12 @@ node-environment-flags@1.0.6:
     object.getownpropertydescriptors "^2.0.3"
     semver "^5.7.0"

-node-fetch@*:
-  version "3.0.0"
-  resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-3.0.0.tgz#79da7146a520036f2c5f644e4a26095f17e411ea"
-  integrity sha512-bKMI+C7/T/SPU1lKnbQbwxptpCrG9ashG+VkytmXCPZyuM9jB6VU+hY0oi4lC8LxTtAeWdckNCTa3nrGsAdA3Q==
-  dependencies:
-    data-uri-to-buffer "^3.0.1"
-    fetch-blob "^3.1.2"
-
 node-fetch@2.6.1:
   version "2.6.1"
   resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.1.tgz#045bd323631f76ed2e2b55573394416b639a0052"
   integrity sha512-V4aYg89jEoVRxRb2fJdAg8FHvI7cEyYdVAh94HH0UIK8oJxUfkjlDQN9RbMx+bEjP7+ggMiFRprSti032Oipxw==

-node-fetch@2.6.7, node-fetch@^2.6.0, node-fetch@^2.6.1:
+node-fetch@^2.6.0, node-fetch@^2.6.1:
   version "2.6.7"
   resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.7.tgz#24de9fba827e3b4ae44dc8b20256a379160052ad"
   integrity sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ==

@@ -13851,13 +13753,6 @@ serialize-javascript@5.0.1:
   dependencies:
     randombytes "^2.1.0"

-serialize-javascript@6.0.0:
-  version "6.0.0"
-  resolved "https://registry.yarnpkg.com/serialize-javascript/-/serialize-javascript-6.0.0.tgz#efae5d88f45d7924141da8b5c3a7a7e663fefeb8"
-  integrity sha512-Qr3TosvguFt8ePWqsvRfrKyQXIiW+nGbYpy8XK24NQHE83caxWt+mIymTT19DGFbNWNLfEwsrkSmN64lVWB9ag==
-  dependencies:
-    randombytes "^2.1.0"
-
 serve-static@1.14.1:
   version "1.14.1"
   resolved "https://registry.yarnpkg.com/serve-static/-/serve-static-1.14.1.tgz#666e636dc4f010f7ef29970a88a674320898b2f9"

@@ -14158,21 +14053,6 @@ solc@0.7.3:
     semver "^5.5.0"
     tmp "0.0.33"

-solc@0.8.7-fixed:
-  version "0.8.7-fixed"
-  resolved "https://registry.yarnpkg.com/solc/-/solc-0.8.7-fixed.tgz#76eb37d33637ad278ad858e2633e9da597d877ac"
-  integrity sha512-nWZRkdPwfBpimAelO30Bz7/hxoj+mylb30gEpBL8hhEWR4xqu2ezQAxWK1Hz5xx1NqesbgGjSgnGul49tRHWgQ==
-  dependencies:
-    command-exists "^1.2.8"
-    commander "3.0.2"
-    follow-redirects "^1.12.1"
-    fs-extra "^0.30.0"
-    js-sha3 "0.8.0"
-    memorystream "^0.3.1"
-    require-from-string "^2.0.0"
-    semver "^5.5.0"
-    tmp "0.0.33"
-
 solc@^0.4.20:
   version "0.4.26"
   resolved "https://registry.yarnpkg.com/solc/-/solc-0.4.26.tgz#5390a62a99f40806b86258c737c1cf653cc35cb5"

@@ -15769,11 +15649,6 @@ wcwidth@^1.0.0, wcwidth@^1.0.1:
   dependencies:
     defaults "^1.0.3"

-web-streams-polyfill@^3.0.3:
-  version "3.1.1"
-  resolved "https://registry.yarnpkg.com/web-streams-polyfill/-/web-streams-polyfill-3.1.1.tgz#1516f2d4ea8f1bdbfed15eb65cb2df87098c8364"
-  integrity sha512-Czi3fG883e96T4DLEPRvufrF2ydhOOW1+1a6c3gNjH2aIh50DNFBdfwh2AKoOf1rXvpvavAoA11Qdq9+BKjE0Q==
-
 web3-bzz@1.2.11:
   version "1.2.11"
   resolved "https://registry.yarnpkg.com/web3-bzz/-/web3-bzz-1.2.11.tgz#41bc19a77444bd5365744596d778b811880f707f"

@@ -16418,11 +16293,6 @@ workerpool@6.1.0:
   resolved "https://registry.yarnpkg.com/workerpool/-/workerpool-6.1.0.tgz#a8e038b4c94569596852de7a8ea4228eefdeb37b"
   integrity sha512-toV7q9rWNYha963Pl/qyeZ6wG+3nnsyvolaNUS8+R5Wtw6qJPTxIlOP1ZSvcGhEJw+l3HMMmtiNo9Gl61G4GVg==

-workerpool@6.1.5:
-  version "6.1.5"
-  resolved "https://registry.yarnpkg.com/workerpool/-/workerpool-6.1.5.tgz#0f7cf076b6215fd7e1da903ff6f22ddd1886b581"
-  integrity sha512-XdKkCK0Zqc6w3iTxLckiuJ81tiD/o5rBE/m+nXpRCB+/Sq4DqkfXZ/x0jW02DG1tGsfUGXbTJyZDP+eu67haSw==
-
 wrap-ansi@^2.0.0:
   version "2.1.0"
   resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-2.1.0.tgz#d8fc3d284dd05794fe84973caecdd1cf824fdd85"