From mboxrd@z Thu Jan 1 00:00:00 1970
Received: from utsuho.znet ([108.68.78.157]) by pp; Mon Feb 9 03:15:34 EST 2015
Date: Mon, 9 Feb 2015 03:15:27 -0500
From: BurnZeZ@feline.systems
To: 9front@9front.org
Subject: rc-httpd bug
Message-ID: <5ac577fe5abc816f182b1d32da8334d2@utsuho.znet>
List-ID: <9front.9front.org>
X-Glyph: ➈
X-Bullshit: wrapper pipelining full-stack-oriented generator
MIME-Version: 1.0
Content-Type: text/plain; charset="US-ASCII"
Content-Transfer-Encoding: 7bit

lachs0r pointed out a bug in the request handling: rc-httpd does not
limit the size of an incoming request.  It loops reading lines until
the request is complete, or until rc runs out of memory.

	fn getline{
		read | sed 's/'^$"cr^'$//g'
	}
	done=false
	while(~ $"done false){
		line=`{getline}
		if(~ $#line 0)
			done=true
		reqlines=$"reqlines$"line' '
	}

The preceding excerpts should make the problem apparent.
See /rc/bin/rc-httpd/rc-httpd:/^done/

Also of note: read(1) as used here reads until a newline with no
regard for how much data that takes, so even a single header line is
unbounded.

RFC 2616 section 10.4.14 describes the 413 Request Entity Too Large
response:

> The server is refusing to process a request because the request
> entity is larger than the server is willing or able to process. The
> server MAY close the connection to prevent the client from continuing
> the request.
>
> If the condition is temporary, the server SHOULD include a Retry-
> After header field to indicate that it is temporary and after what
> time the client MAY try again.

This definition leaves it up to the server to decide how much crap it
tolerates.  From a quick glance at a few http server implementations,
I see limits varying from 1 to 48 KB.

I'm not familiar enough with http to know how to impose such a
limitation without breaking things.
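
For illustration, here is a minimal, untested sketch of one way to
bound the loop: cap the number of header lines with an rc list used
as a counter, and answer 413 when the cap is exceeded.  The cap of 64
lines, the variable n, and the inline response are assumptions made
for the sketch, not rc-httpd's actual code; getline and $cr are the
definitions quoted above.

	maxlines=64		# arbitrary cap; surveyed servers allow 1-48 KB total
	n=()			# rc list used as a counter; $#n is its length
	done=false
	while(~ $"done false){
		line=`{getline}
		if(~ $#line 0)
			done=true
		n=($n x)		# append one word per header line read
		if(test $#n -gt $maxlines){
			# refuse per RFC 2616 10.4.14 and close the connection
			echo 'HTTP/1.1 413 Request Entity Too Large'^$cr
			echo $cr
			exit toolarge
		}
		reqlines=$"reqlines$"line' '
	}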
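
Counting lines rather than bytes keeps the sketch in plain rc, since
$#n gives the list length and no external arithmetic is needed, but
it does not bound a single long line: as noted above, read(1) itself
imposes no byte limit, so a complete fix would also need a
byte-limited line reader.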