Python HTTP server that supports chunked encoding?

7 followers

I'm looking for a multithreaded Python HTTP server that supports chunked-encoded replies (i.e., "Transfer-Encoding: chunked" in the response). What is the best HTTP server base for this purpose?

python
http
chunked-encoding
slacy
Posted on 2009-04-09
6 Answers
Jarret Hardie
Posted on 2022-06-02
Accepted
0 upvotes

Twisted supports chunked transfer encoding (API link). (See also HTTPChannel in the API documentation.) There are many production-grade projects that use Twisted (for example, Apple uses it for the iCalendar server in Mac OS X Server), so it is quite well supported and very robust.

Thanks, I'd heard of Twisted, but my first impression was that it might be a bit heavyweight for my task. I'm going to take another look, since it seems you can download and run just twisted.web without anything else.
I know how you feel... Twisted has a big API, looks a bit cult-like, and has something of a learning curve. It put me off at first too, but sometimes I do find it's the right tool for the job :-)
Dead link. Could you summarize what your links say when posting in the future?
Right now I can only find chunked handling for requests in twisted.web.http.Request, but nothing about chunked responses.
mathieu
Posted on 2022-06-02
0 upvotes

Twisted supports chunked transfer encoding transparently: if your request handler does not specify a response length, Twisted automatically switches to chunked transfer and emits one chunk for each call to Request.write.
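
To illustrate, here is a minimal sketch (not from the answer) of a Twisted resource that streams a chunked response; the class name StreamingResource, the message text, and the port are made up for the example. Because no Content-Length is set, Twisted emits Transfer-Encoding: chunked and each request.write() goes out as one chunk.

# Hedged sketch: a Twisted resource that streams a chunked response.
# No Content-Length is set, so Twisted chunks the body automatically.
from twisted.internet import reactor
from twisted.web.resource import Resource
from twisted.web.server import Site, NOT_DONE_YET

class StreamingResource(Resource):               # name chosen for this example
    isLeaf = True

    def render_GET(self, request):
        request.setHeader(b"Content-Type", b"text/plain")

        def write_chunk(remaining):
            if remaining == 0:
                request.finish()                 # Twisted sends the final zero-length chunk
                return
            request.write(b"another chunk\r\n")  # one chunk per write()
            reactor.callLater(1, write_chunk, remaining - 1)

        write_chunk(5)
        return NOT_DONE_YET                      # response continues asynchronously

reactor.listenTCP(8080, Site(StreamingResource()))
reactor.run()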

Orwellophile
Posted on 2022-06-02
0 upvotes

You can implement a simple chunked server using Python's HTTPServer by adding this to your serve function:

    def _serve(self, res):
        # res is an iterator yielding string chunks
        response = next(res)
        content_type = 'application/json'
        self.send_response(200)
        self.send_header('Content-Type', content_type)
        self.send_header('Transfer-Encoding', 'chunked')
        self.end_headers()
        try:
            while True:
                # This line removed as suggested by @SergeyNudnov
                # r = response.encode('utf-8')
                r = response
                # Each chunk is the length in hex, CRLF, the payload, CRLF
                # (assumes the chunk is ASCII, so len(r) equals its byte length)
                l = len(r)
                self.wfile.write('{:X}\r\n{}\r\n'.format(l, r).encode('utf-8'))
                response = next(res)
        except StopIteration:
            # A zero-length chunk terminates the response
            self.wfile.write('0\r\n\r\n'.encode('utf-8'))

I would not recommend it for production use.
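
For completeness, here is a hedged sketch of how the same idea can be wired into a full server; ChunkedHandler, the payload text, and the port are illustrative, and ThreadingHTTPServer (Python 3.7+) provides the multithreading the question asks for.

# Hedged sketch: a standalone threaded server that hand-writes a chunked body.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import time

class ChunkedHandler(BaseHTTPRequestHandler):   # name chosen for this example
    protocol_version = 'HTTP/1.1'               # chunked encoding requires HTTP/1.1

    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.send_header('Transfer-Encoding', 'chunked')
        self.end_headers()
        for text in ('chunk %d\n' % i for i in range(5)):
            data = text.encode('utf-8')
            # chunk framing: <hex byte length>\r\n<payload>\r\n
            self.wfile.write('{:X}\r\n'.format(len(data)).encode('ascii') + data + b'\r\n')
            self.wfile.flush()
            time.sleep(1)
        self.wfile.write(b'0\r\n\r\n')          # zero-length chunk ends the body

if __name__ == '__main__':
    ThreadingHTTPServer(('', 8000), ChunkedHandler).serve_forever()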

This line should be removed: r = response.encode('utf-8'). You don't need double encoding there. See my answer to this question.
@SergeyNudnov Looking at it now, I suspect it wouldn't actually have run with r pre-encoded, so I've corrected it with a comment.
Shane C. Mason
Posted on 2022-06-02
0 upvotes

I'm fairly sure that any WSGI-compliant server should support this. Essentially, a WSGI application returns an iterable of chunks, and the web server sends them back. I don't have first-hand experience with this, but here is a list of compliant servers.
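
For what it's worth, the application-side shape looks roughly like the sketch below; note that whether the server actually emits Transfer-Encoding: chunked depends on the server, and the stdlib wsgiref demo server (used here only to make the sketch runnable) streams the iterable but does not do the chunk framing itself.

# Hedged sketch: a WSGI app returning an iterable of body chunks.
# A server that supports HTTP/1.1 streaming may send these with chunked
# transfer encoding; wsgiref below is just a convenient way to run it.
from wsgiref.simple_server import make_server
import time

def app(environ, start_response):
    # No Content-Length header: the body length is unknown up front
    start_response('200 OK', [('Content-Type', 'text/plain')])
    def body():
        for i in range(5):
            yield ('chunk %d\n' % i).encode('utf-8')
            time.sleep(1)
    return body()

if __name__ == '__main__':
    make_server('', 8000, app).serve_forever()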

I also think that if a WSGI server doesn't meet your needs, you could roll your own fairly easily on top of Python's built-in CGIHTTPServer. It's already multithreaded, so it would just be up to you to chunk the replies.

Iulian Onofrei
Posted on 2022-06-02
0 upvotes

I managed to do it with Tornado:

#!/usr/bin/env python
import logging
import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web
from tornado.options import define, options
define("port", default=8080, help="run on the given port", type=int)
@tornado.web.stream_request_body
class MainHandler(tornado.web.RequestHandler):
    def post(self):
        print()
    def data_received(self, chunk):
        self.write(chunk)
        logging.info(chunk)
def main():
    tornado.options.parse_command_line()
    application = tornado.web.Application([
        (r"/", MainHandler),
    ])
    http_server = tornado.httpserver.HTTPServer(application)
    http_server.listen(options.port)
    tornado.ioloop.IOLoop.current().start()
if __name__ == "__main__":
    main()
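
Note that the example above streams the request body (@tornado.web.stream_request_body). For a chunked response, a hedged sketch along these lines should work, with StreamHandler, the path, and the port being illustrative: Tornado sends whatever has been written so far as one chunk each time flush() is awaited, as long as no Content-Length has been set.

# Hedged sketch: streaming a chunked *response* with Tornado.
import asyncio
import tornado.ioloop
import tornado.web

class StreamHandler(tornado.web.RequestHandler):   # name chosen for this example
    async def get(self):
        self.set_header("Content-Type", "text/plain")
        for i in range(5):
            self.write("chunk %d\n" % i)
            await self.flush()                     # each flush() goes out as one chunk
            await asyncio.sleep(1)

if __name__ == "__main__":
    tornado.web.Application([(r"/stream", StreamHandler)]).listen(8080)
    tornado.ioloop.IOLoop.current().start()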
    
Sergey Nudnov
Posted on 2022-06-02
0 upvotes

The script below is a complete working example. It can be run as a CGI script to stream data under Apache or IIS.

#!/usr/bin/env pythonw
import sys
import os
import time
# Minimal length of response to avoid its buffering by IIS+FastCGI
# This value was found by testing this script from a browser and
# ensuring that every event received separately and in full
response_padding = 284
def send_chunk(r):
    # Binary write into stdout
    os.write(1, "{:X}\r\n{}\r\n".format(len(r), r).encode('utf-8'))
class Unbuffered(object):
    """
    Stream wrapper to disable output buffering
    To be used in the CGI scripts
    https://stackoverflow.com/a/107717/9921853
    """
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        self.stream.write(data)
        self.stream.flush()
    def writelines(self, lines):
        self.stream.writelines(lines)
        self.stream.flush()
    def __getattr__(self, attr):
        return getattr(self.stream, attr)
# Ensure stdout is unbuffered to avoid problems serving this CGI script on IIS
# Also web.config should have responseBufferLimit="0" applied to the CgiModule handler
sys.stdout = Unbuffered(sys.stdout)
print(
    "Transfer-Encoding: chunked\n"
    "Content-Type: text/event-stream; charset=utf-8\n"
)
# Fix an issue where IIS provides a wrong file descriptor for stdin if no data is passed in the POST request
sys.stdin = sys.stdin or open(os.devnull, 'r')
progress = 0
send_chunk((
        "event: started\n"
        f"data: {progress}"
    ).ljust(response_padding) + "\n\n")
while progress < 5:
    time.sleep(2)
    progress += 1
    send_chunk((
            "event: progress\n"
            f"data: {progress}"
        ).ljust(response_padding) + "\n\n")
time.sleep(2)
send_chunk((
        "event: completed\n"
        f"data: {progress+1}"
    ).ljust(response_padding) + "\n\n")

# To close the stream, send a terminating zero-length chunk
send_chunk('')