
How to protect a service from a gzip bomb?

I have a test.gzip file containing this JSON:

{"events": [
{"uuid":"56c1718c-8eb3-11e9-8157-e4b97a2c93d3",
"timestamp":"2019-06-14 14:47:31 +0000",
"number":732,
"user": {"full_name":"0"*1024*1024*1024}}]}

The full_name field contains 1 GB of "0" characters; the gzipped file size is only about 1 MB.
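
For reference, a small sketch that could generate such a test file (the writeBomb name is mine, for illustration only; it assumes the bytes, compress/gzip, io, and os packages are imported):

// writeBomb writes test.gzip: the JSON above with a full_name value of
// 1 GB of '0' bytes. Highly repetitive input like this compresses to
// roughly 1 MB.
func writeBomb() error {
    f, err := os.Create("test.gzip")
    if err != nil {
        return err
    }
    defer f.Close()

    zw := gzip.NewWriter(f)
    defer zw.Close()

    if _, err := io.WriteString(zw, `{"events": [{"uuid":"56c1718c-8eb3-11e9-8157-e4b97a2c93d3","timestamp":"2019-06-14 14:47:31 +0000","number":732,"user": {"full_name":"`); err != nil {
        return err
    }
    zeros := bytes.Repeat([]byte{'0'}, 1024*1024) // one 1 MB chunk of zeros
    for i := 0; i < 1024; i++ {                   // 1024 chunks = 1 GB total
        if _, err := zw.Write(zeros); err != nil {
            return err
        }
    }
    _, err = io.WriteString(zw, `"}}]}`)
    return err
}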

How can I protect my service during unpacking so that it doesn't run out of memory?

package main

import (
    "compress/gzip"
    "encoding/json"
    "fmt"
    "io/ioutil"
    "log"
    "os"
)

// ReadGzFile reads the entire decompressed contents of a gzip file into
// memory; this unbounded ReadAll is exactly what makes the service
// vulnerable.
func ReadGzFile(filename string) ([]byte, error) {
    fi, err := os.Open(filename)
    if err != nil {
        return nil, err
    }
    defer fi.Close()

    fz, err := gzip.NewReader(fi)
    if err != nil {
        return nil, err
    }
    defer fz.Close()

    s, err := ioutil.ReadAll(fz)
    if err != nil {
        return nil, err
    }
    return s, nil
}

func main() {
    b, err := ReadGzFile("test.gzip")
    if err != nil {
        log.Fatalln(err) // don't continue with nil data
    }
    var dat map[string]interface{}
    if err := json.Unmarshal(b, &dat); err != nil {
        panic(err)
    }
    fmt.Println(dat)
}

In this case the decompressed output can get my service killed by the OOM killer.

What can be deceiving is that the compressed size may be significantly smaller than the allowed size (the size you can or wish to handle). In your example the compressed input is about 1 MB, while the uncompressed size is about 1 GB.

While reading the uncompressed data, you should stop after reaching a reasonable limit. The easy way to do that is io.LimitReader(), which lets you specify the maximum number of bytes you wish to read. Yes, you have to wrap the unzipped stream, not the original, compressed stream.

Here is how that could look:

limited := io.LimitReader(fz, 2*1024*1024)

s, err := ioutil.ReadAll(limited)

The above example limits the readable data to 2 MB. What happens when the unzipped data is more than that? The io.Reader returned by io.LimitReader() (which is, by the way, an *io.LimitedReader) simply reports io.EOF once the limit is reached, as if the stream had ended there. This protects your server from the attack, but since the truncation is silent, it might not be the best way to handle it: you cannot tell an over-sized input from one that fit exactly.
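
If you need to tell those two cases apart, a common trick is to allow one extra byte and check whether it was consumed. A minimal sketch in the same context as ReadGzFile above (assuming io, ioutil, and errors are imported; the maxSize name and the 2 MB value are illustrative):

const maxSize = 2 * 1024 * 1024 // illustrative limit: 2 MB

// Allow one byte more than the limit so overflow is detectable.
limited := io.LimitReader(fz, maxSize+1)

s, err := ioutil.ReadAll(limited)
if err != nil {
    return nil, err
}
if len(s) > maxSize {
    // The decompressed stream exceeded the limit: reject the input.
    return nil, errors.New("uncompressed data too large")
}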

Since you mentioned this is for a REST API, a better-suited solution is the similar http.MaxBytesReader(). It wraps the passed reader to read up to a given limit; if that limit is reached, it returns an error, sends an error back to the HTTP client, and also closes the underlying read-closer. If the default behavior of http.MaxBytesReader() is not suitable for you, check its sources, copy it and modify it; it's relatively simple. Tune it to your needs.

Also note that you should not read everything (the uncompressed data) into memory. You can pass the "limited reader" to json.NewDecoder(), which will read from the given reader as it decodes the input JSON. Of course, if the wrapped limited reader reports an error, the decoding will fail.
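
Here is a sketch of how those two pieces could fit together in a handler (the eventsHandler name and the 2 MB limit are illustrative, not from the question):

func eventsHandler(w http.ResponseWriter, r *http.Request) {
    fz, err := gzip.NewReader(r.Body)
    if err != nil {
        http.Error(w, "invalid gzip body", http.StatusBadRequest)
        return
    }
    defer fz.Close()

    // Limit the decompressed stream; unlike io.LimitReader, exceeding
    // this limit surfaces as an error instead of a silent io.EOF.
    limited := http.MaxBytesReader(w, fz, 2*1024*1024)

    var dat map[string]interface{}
    if err := json.NewDecoder(limited).Decode(&dat); err != nil {
        http.Error(w, "payload too large or malformed", http.StatusRequestEntityTooLarge)
        return
    }
    // ... process dat ...
}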

Don't read everything into memory. Operate on a stream if possible. This is 100% possible in your example:

package main

import (
    "compress/gzip"
    "encoding/json"
    "fmt"
    "io"
    "log"
    "os"
)

// ReadGzFile returns a streaming reader over the decompressed contents.
// Note that closing the returned gzip reader does not close the
// underlying file; a long-running service should arrange to close both.
func ReadGzFile(filename string) (io.ReadCloser, error) {
    fi, err := os.Open(filename)
    if err != nil {
        return nil, err
    }

    fz, err := gzip.NewReader(fi)
    if err != nil {
        fi.Close() // don't leak the file handle on failure
        return nil, err
    }
    return fz, nil
}

func main() {
    b, err := ReadGzFile("test.gzip")
    if err != nil {
        log.Fatalln(err) // don't call Close on a nil reader
    }
    defer b.Close()
    var dat map[string]interface{}
    if err := json.NewDecoder(b).Decode(&dat); err != nil {
        panic(err)
    }
    fmt.Println(dat)
}

This Decode approach has the side effect (which may or may not be desirable) of ignoring any garbage in the stream after the first valid JSON object. In your case, this seems like a benefit. In some cases, it may not be.
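
Combining both answers, streaming decode plus a cap on the decompressed size could look something like this sketch (the DecodeGzJSON helper is illustrative, not an established API):

// DecodeGzJSON streams gzip-decompressed JSON from r into v, reading at
// most maxSize decompressed bytes. A larger payload is truncated, so
// decoding will typically fail with an unexpected-EOF error instead of
// exhausting memory.
func DecodeGzJSON(r io.Reader, maxSize int64, v interface{}) error {
    fz, err := gzip.NewReader(r)
    if err != nil {
        return err
    }
    defer fz.Close()

    limited := io.LimitReader(fz, maxSize)
    return json.NewDecoder(limited).Decode(v)
}

With such a helper, the most an attacker can make you buffer is on the order of maxSize bytes plus the decoder's internal state.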
