[BUG]: Allocator can not allocate more than 64 buffers
What version of Dgraph are you using?
Dgraph version   : v22.0.2
Dgraph codename  : dgraph
Dgraph SHA-256   : a11258bf3352eff0521bc68983a5aedb9316414947719920d75f12143dd368bd
Commit SHA-1     : 55697a4
Commit timestamp : 2022-12-16 23:03:35 +0000
Branch           : release/v22.0.2
Go version       : go1.18.5
jemalloc enabled : true

For Dgraph official documentation, visit https://dgraph.io/docs.
For discussions about Dgraph, visit https://discuss.dgraph.io.
For fully-managed Dgraph Cloud, visit https://dgraph.io/cloud.

Licensed variously under the Apache Public License 2.0 and Dgraph Community License.
Copyright 2015-2022 Dgraph Labs, Inc.
Tell us a little more about your go-environment?
Docker image
Have you tried reproducing the issue with the latest release?
No
What is the hardware spec (RAM, CPU, OS)?
RAM: 1.5 TB
CPU: 64 cores
Host OS: Ubuntu 22.04.2 LTS
Dgraph running in Docker.
What steps will reproduce the bug?
I bulk-imported a large dataset and then ran a count query like this:
{
  q(func: type(TypeName)) {
    count(uid)
  }
}
The type in question has ~1.8B entries.
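For reference, a minimal sketch of the reproduction against the Alpha HTTP endpoint. The host, port, and content type below are assumptions based on a default local setup, and TypeName is a placeholder:

package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

func main() {
	// The count query from above; TypeName is a placeholder.
	const q = `{
  q(func: type(TypeName)) {
    count(uid)
  }
}`
	// Assumes a local Alpha on the default HTTP port 8080 accepting
	// DQL via the application/dql content type; adjust for your setup.
	resp, err := http.Post("http://localhost:8080/query", "application/dql", strings.NewReader(q))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	// On the affected version the server-side goroutine panics, so this
	// prints an error/empty response instead of the count.
	fmt.Println(string(body))
}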
Expected behavior and actual result.
Expected behaviour: the count query returns a result.
Actual behaviour: the Alpha goroutine serving the request panics.
Stack trace:
2023/05/26 10:43:41 http: panic serving 172.23.0.1:47114: Allocator can not allocate more than 64 buffers
goroutine 31102 [running]:
net/http.(*conn).serve.func1()
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:1825 +0xbf
panic({0x1c94a80, 0xc0004a6e40})
/opt/hostedtoolcache/go/1.18.5/x64/src/runtime/panic.go:844 +0x258
github.com/dgraph-io/ristretto/z.(*Allocator).addBufferAt(0xc00abf00c0?, 0x0?, 0x1f?)
/home/runner/go/pkg/mod/github.com/dgraph-io/[email protected]/z/allocator.go:251 +0x1e9
github.com/dgraph-io/ristretto/z.(*Allocator).Allocate(0xc00abf00c0, 0x1f)
/home/runner/go/pkg/mod/github.com/dgraph-io/[email protected]/z/allocator.go:302 +0x159
github.com/dgraph-io/ristretto/z.(*Allocator).AllocateAligned(0xc1110e6048?, 0x18)
/home/runner/go/pkg/mod/github.com/dgraph-io/[email protected]/z/allocator.go:226 +0x29
github.com/dgraph-io/dgraph/query.(*encoder).newNode(0xc1110e6048?, 0x3)
/home/runner/work/dgraph/dgraph/query/outputnode.go:316 +0x2c
github.com/dgraph-io/dgraph/query.processNodeUids(0x7e1a1b0a7000, 0xc00788e3c0, 0xc001d1ac00)
/home/runner/work/dgraph/dgraph/query/outputnode.go:1093 +0x3b8
github.com/dgraph-io/dgraph/query.(*SubGraph).toFastJSON(0xc009ab29d8, {0x224a5b8?, 0xc00d4bc960}, 0xc00acd7fc0, {0x0, 0x0})
/home/runner/work/dgraph/dgraph/query/outputnode.go:1162 +0x29c
github.com/dgraph-io/dgraph/query.ToJson({0x224a5b8, 0xc00d4bc960}, 0x1?, {0xc008994228, 0x1, 0x0?}, {0x0, 0x0})
/home/runner/work/dgraph/dgraph/query/outputnode.go:63 +0x205
github.com/dgraph-io/dgraph/edgraph.processQuery({0x224a5b8, 0xc00d4bc960}, 0xc008f04dc0)
/home/runner/work/dgraph/dgraph/edgraph/server.go:1431 +0x76f
github.com/dgraph-io/dgraph/edgraph.(*Server).doQuery(0x224a580?, {0x224a5b8, 0xc00bc559e0}, 0xc009ab36a8)
/home/runner/work/dgraph/dgraph/edgraph/server.go:1306 +0xc36
github.com/dgraph-io/dgraph/edgraph.(*Server).Query(0x224a5b8?, {0x224a580?, 0xc007895080?}, 0xc0078950e0)
/home/runner/work/dgraph/dgraph/edgraph/server.go:1171 +0x2f9
github.com/dgraph-io/dgraph/dgraph/cmd/alpha.queryHandler({0x22493d8, 0xc001cec540}, 0xc001d3ce00)
/home/runner/work/dgraph/dgraph/dgraph/cmd/alpha/http.go:249 +0x6c5
net/http.HandlerFunc.ServeHTTP(0x2ecd8a0?, {0x22493d8?, 0xc001cec540?}, 0x6?)
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:2084 +0x2f
net/http.(*ServeMux).ServeHTTP(0xc0002c09c0?, {0x22493d8, 0xc001cec540}, 0xc001d3ce00)
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:2462 +0x149
github.com/dgraph-io/dgraph/ee/audit.AuditRequestHttp.func1({0x22493d8?, 0xc001cec540?}, 0x71229bcf212?)
/home/runner/work/dgraph/dgraph/ee/audit/interceptor_ee.go:91 +0xc2
net/http.HandlerFunc.ServeHTTP(0x70431759a12?, {0x22493d8?, 0xc001cec540?}, 0xc0079ea6a0?)
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:2084 +0x2f
net/http.(*ServeMux).ServeHTTP(0xc008124bac?, {0x22493d8, 0xc001cec540}, 0xc001d3ce00)
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:2462 +0x149
net/http.serverHandler.ServeHTTP({0x2243ab0?}, {0x22493d8, 0xc001cec540}, 0xc001d3ce00)
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:2916 +0x43b
net/http.(*conn).serve(0xc0133ea960, {0x224a5b8, 0xc000987b30})
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:1966 +0x5d7
created by net/http.(*Server).Serve
/opt/hostedtoolcache/go/1.18.5/x64/src/net/http/server.go:3071 +0x4db
Additional information
This is probably related to the hard limit on the number of buffers a Ristretto Allocator can hold, set here: https://github.com/dgraph-io/ristretto/blob/957744998202b20bf4620acc6171dd610953c22d/z/allocator.go#LL78C14-L78C14
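If I read z/allocator.go correctly, each Allocator carries a fixed-size table of backing buffers and panics once every slot is used, rather than growing further. A simplified model of that growth policy (the starting size and per-buffer cap below are assumptions for illustration, not the real constants):

package main

import "fmt"

// Simplified model of the buffer-growth policy behind the panic. This is a
// sketch, not the actual z.Allocator implementation; the first-buffer size
// and the per-buffer cap are assumed values.
const (
	maxBuffers = 64      // the hard limit named in the panic message
	maxBufSize = 1 << 30 // assumed per-buffer cap: 1 GiB
)

func main() {
	total, size := 0, 1<<20 // assumed first buffer: 1 MiB
	for i := 0; i < maxBuffers; i++ {
		total += size
		if size < maxBufSize {
			size *= 2 // each new buffer roughly doubles in size,
		} // until it hits the per-buffer cap
	}
	// Once all 64 slots are filled, the next Allocate call panics with
	// "Allocator can not allocate more than 64 buffers" instead of growing.
	fmt.Printf("ceiling under this model: ~%d GiB\n", total>>30)
}

Under this model the total an Allocator can ever hand out is finite, which would explain why encoding the result for a count over ~1.8B entries (processNodeUids allocating encoder nodes, per the stack trace) runs into the ceiling.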
Fixed by #8841, merged to main in commit 82c4a71c551bbe8fb07ee0d3dbc3fb468ab2a0cd.