[Beast] Security issue note
Looking into parser/body code I noticed:

parser:

    void on_body(boost::optional<std::uint64_t> const& content_length,
        error_code& ec)
    {
        wr_.emplace(m_);
        wr_->init(content_length, ec);
    }

string_body:

    void init(boost::optional<std::uint64_t> content_length, error_code& ec)
    {
        if(content_length)
        {
            if(*content_length > (std::numeric_limits<std::size_t>::max)())
            {
                ec = make_error_code(errc::not_enough_memory);
                return;
            }
            ec.assign(0, ec.category());
            body_.reserve(static_cast<std::size_t>(*content_length));
        }
    }

Basically I can exhaust the server's memory and kill it by providing a huge content length from several connections, leading to a crash. A reasonable, configurable limit on content length should be provided.

Artyom Beilis
On Tue, Jun 27, 2017 at 1:40 PM, Artyom Beilis via Boost wrote:
> Looking into parser/body code I noticed: ... Basically I can exhaust the server's memory and kill it by providing a huge content length from several connections, leading to a crash.
> A reasonable, configurable limit on content length should be provided.
That's reasonable, although note that you can put a max buffer size on the dynamic buffers that come with Beast, and it will naturally take care of limits. For example:

    beast::http::request<beast::http::dynamic_body> req{1024 * 1024};

will create a request that has a 1 MB limit on the body. The moment the reader goes to resize the dynamic buffer, it will return a beast::http::error::buffer_overflow error. Still, your suggestion to add something like `void basic_parser::max_content_length(std::size_t)` is a good idea. Thanks!
On Wed, Jun 28, 2017 at 12:30 AM, Vinnie Falco via Boost wrote:
> On Tue, Jun 27, 2017 at 1:40 PM, Artyom Beilis via Boost wrote:
>> Looking into parser/body code I noticed: ... Basically I can exhaust the server's memory and kill it by providing a huge content length from several connections, leading to a crash.
>> A reasonable, configurable limit on content length should be provided.
> That's reasonable, although note that you can put a max buffer size on the dynamic buffers that come with Beast, and it will naturally take care of limits. For example:
>
>     beast::http::request<beast::http::dynamic_body> req{1024 * 1024};
>
> will create a request that has a 1 MB limit on the body. The moment the reader goes to resize the dynamic buffer, it will return a beast::http::error::buffer_overflow error.
That does not fix the security flaw of using http::string_body!
> Still, your suggestion to add something like `void basic_parser::max_content_length(std::size_t)` is a good idea. Thanks!
Note: a reasonable default for max_content_length must be provided out of the box. Also, std::size_t isn't a good type for max_content_length; it should be unsigned long long or uint64_t, because if you use it for file uploads on a 32-bit system you want to support files above 4 GB.

Regards,
Artyom Beilis
On Tue, Jun 27, 2017 at 10:25 PM, Artyom Beilis via Boost wrote:
> std::size_t isn't a good type for max_content_length; it should be unsigned long long or uint64_t, because if you use it for file uploads on a 32-bit system you want to support files above 4 GB.
Yes, and that's how I wrote it (the feature is currently in code review).
> Note: a reasonable default for max_content_length must be provided out of the box.
What's typical? How about 8MB for responses and 1MB for requests?
>> Note: a reasonable default for max_content_length must be provided out of the box.
> What's typical? How about 8 MB for responses and 1 MB for requests?
In CppCMS I use 1 MB for generic content types and 64 MB for multipart/form-data (which goes to the filesystem, not memory...). If you look at PHP's defaults (http://php.net/manual/en/ini.core.php), they are 8 MB for POST, and for files, 2 MB per file, up to 20 files.

For the client it is a little bit trickier, since you may not know the response size, and in any case you probably do not download the entire response into memory.

Artyom
On Wed, Jun 28, 2017 at 1:06 AM, Artyom Beilis via Boost wrote:
> In CppCMS I use 1 MB for generic content types and 64 MB for multipart/form-data (which goes to the filesystem, not memory...). ... 8 MB for POST, and for files, 2 MB per file, up to 20 files.
Well, Beast doesn't know anything about content type or multipart encoding, so I can only realistically set a default depending on whether it is a request or a response. I will leave it at 1 MB for requests and 8 MB for responses. Servers will have more connections, so it makes sense for the limit to be lower.

I also added an "on_header" callback feature to beast::http::parser so that users can set the limit after receiving the header, based on its contents. This allows for the type of logic you are describing; the limit may be conditionally set depending on the value of Content-Type. The benefit of the callback is that it does not require that the HTTP message be read in two I/Os (first the header, then the body).

This is in the "v70" branch, which will be merged today (Wednesday).
> I also added an "on_header" callback feature to beast::http::parser so that users can set the limit after receiving the header, based on its contents. This allows for the type of logic you are describing; the limit may be conditionally set depending on the value of Content-Type. The benefit of the callback is that it does not require that the HTTP message be read in two I/Os (first the header, then the body).

This is something I did in CppCMS as well at some point. However, this approach still has a certain design flaw that I couldn't fix in CppCMS without significant API changes: between the header analysis and the content handling there may be a need to do some blocking or asynchronous I/O, for example accessing a session stored in an SQL server, using an id taken from the cookies, to get the user's limits/roles. So I suggest keeping an option for separate header and body processing.

Artyom
participants (2)
-
Artyom Beilis
-
Vinnie Falco