exchain / nebula · Commits

Commit 98ae8c3b (unverified)
Authored Oct 19, 2022 by mergify[bot]; committed by GitHub on Oct 19, 2022

Merge pull request #3731 from ethereum-optimism/jg/decompress_channel_size_limit

specs,op-node: Clarify Max RLP Bytes Per Channel

Parents: 936b8ba5, c8f36940
Showing 3 changed files with 8 additions and 1 deletion:

- op-node/rollup/derive/channel.go (+1, −1)
- op-node/rollup/derive/params.go (+4, −0)
- specs/derivation.md (+3, −0)
op-node/rollup/derive/channel.go

```diff
@@ -138,7 +138,7 @@ func BatchReader(r io.Reader, l1InclusionBlock eth.L1BlockRef) (func() (BatchWit
 	if err != nil {
 		return nil, err
 	}
-	rlpReader := rlp.NewStream(zr, 10_000_000)
+	rlpReader := rlp.NewStream(zr, MaxRLPBytesPerChannel)
 	// Read each batch iteratively
 	return func() (BatchWithL1InclusionBlock, error) {
 		ret := BatchWithL1InclusionBlock{
```
op-node/rollup/derive/params.go

```diff
@@ -15,6 +15,10 @@ const DerivationVersion0 = 0
 // starting with the oldest channel.
 const MaxChannelBankSize = 100_000_000

+// MaxRLPBytesPerChannel is the maximum amount of bytes that will be read from
+// a channel. This limit is set when decoding the RLP.
+const MaxRLPBytesPerChannel = 10_000_000
+
 // DuplicateErr is returned when a newly read frame is already known
 var DuplicateErr = errors.New("duplicate frame")
```
specs/derivation.md

```diff
@@ -367,6 +367,9 @@ When decompressing a channel, we limit the amount of decompressed data to `MAX_R
 humongous amount of data). If the decompressed data exceeds the limit, things proceed as though the channel contained
 only the first `MAX_RLP_BYTES_PER_CHANNEL` decompressed bytes.

+When decoding batches, all batches that can be completely decoded below `MAX_RLP_BYTES_PER_CHANNEL` will be accepted
+even if the size of the channel is greater than `MAX_RLP_BYTES_PER_CHANNEL`.
+
 While the above pseudocode implies that all batches are known in advance, it is possible to perform streaming
 compression and decompression of RLP-encoded batches. This means it is possible to start including channel frames in a
 [batcher transaction][g-batcher-transaction] before we know how many batches (and how many frames) the channel will
```