If a server decides to encode a slice in "x" bytes, can anyone describe the criterion or basis on which the value of x is derived?
Random dice roll, squared? Realistically, it's best to ask the server maintainers themselves.
I suspect the choice of "x bytes" per slice is driven by channel bandwidth, but I don't have the details. Can anyone shed more light on this?
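For what it's worth, one common criterion (this is an assumption about the server in question, not something confirmed in this thread) is packetization rather than raw bandwidth: an encoder may cap each slice so that it fits in a single network packet under the path MTU, avoiding IP fragmentation. A minimal sketch of that budget calculation, assuming RTP over UDP/IPv4 with typical header sizes:

```python
# Hypothetical slice-size criterion: fit one slice per packet under the MTU.
# All constants below are typical values, assumed for illustration.

ETHERNET_MTU = 1500   # bytes, common Ethernet path MTU
IP_HEADER = 20        # IPv4 header without options
UDP_HEADER = 8        # fixed UDP header size
RTP_HEADER = 12       # fixed RTP header, no extensions

def max_slice_bytes(mtu: int = ETHERNET_MTU) -> int:
    """Byte budget left for one encoded slice in a single packet."""
    return mtu - IP_HEADER - UDP_HEADER - RTP_HEADER

print(max_slice_bytes())  # 1460
```

Under these assumptions, "x" would be at most 1460 bytes per slice; a bandwidth-driven design, as speculated above, would instead derive x from the target bitrate divided by the slice rate. Which (if either) applies here only the maintainers can say.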