
Flask template streaming with Jinja

I have a Flask application. On a particular view, I show a table with about 100k rows in total. Understandably, the page takes a long time to load, and I'm looking for ways to improve it. So far I've determined that the database query itself returns fairly quickly, so I think the problem lies in rendering the actual page. I found the Flask documentation on streaming and am trying to work with that, but keep running into problems. I've tried the stream_template solution given there, with this code:

@app.route('/thing/matches', methods = ['GET', 'POST'])
@roles_accepted('admin', 'team')
def r_matches():
    matches = Match.query.filter(
            Match.name == g.name).order_by(Match.name).all()

    return Response(stream_template('/retailer/matches.html',
        dashboard_title = g.name,
        match_show_option = True,
        match_form = form,
        matches = matches))

def stream_template(template_name, **context):
    app.update_template_context(context)
    t = app.jinja_env.get_template(template_name)
    rv = t.stream(context)
    rv.enable_buffering(5)
    return rv

The Match query is the one that returns 100k+ items. However, whenever I run this, the page just shows up blank with nothing there. I've also tried the solution of streaming the data to JSON and loading it via AJAX, but nothing seems to end up in the JSON file either! Here's what that solution looks like:

@app.route('/large.json')
def generate_large_json():
    def generate():
        app.logger.info("Generating JSON")
        matches = Product.query.join(Category).filter(
            Product.retailer == g.retailer,
            Product.match != None).order_by(Product.name)
        for match in matches:
            yield json.dumps(match)
    app.logger.info("Sending file response")
    return Response(stream_with_context(generate()))

Another solution I looked at was pagination. That works well, except I need to be able to sort the entire dataset by the column headers, and I couldn't find a way to do that without rendering the whole dataset in the table and then using jQuery for sorting/pagination.

The file I get by going to /large.json is always empty. Please help or recommend another way to display such a large data set!

Edit: I got the generate() part to work and updated the code.

The problem in both cases is almost certainly that the request hangs while you build all 100K+ Match objects and store them in memory. You will want to stream the results from the DB as well, using yield_per. However, only Postgres+psycopg2 support the necessary stream_results argument (here's a way to do it with MySQL):

# Fetch and stream ten results at a time instead of loading them all with .all()
matches = Match.query.filter(
        Match.name == g.name).order_by(Match.name).yield_per(10)
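Putting the two together, the view might look something like this. This is only a sketch that reuses the stream_template helper from your question; stream_with_context is there because the lazy query is only executed while the response body streams out, after normal request handling has returned:

@app.route('/thing/matches', methods=['GET', 'POST'])
@roles_accepted('admin', 'team')
def r_matches():
    # No .all() here -- the query stays lazy and fetches rows in
    # batches of ten only as the template stream iterates over it.
    matches = Match.query.filter(
        Match.name == g.name).order_by(Match.name).yield_per(10)

    # (match_form omitted here for brevity)
    stream = stream_template('/retailer/matches.html',
                             dashboard_title=g.name,
                             match_show_option=True,
                             matches=matches)
    # stream_with_context keeps the request (and app) context pushed
    # while the response body -- and therefore the query -- is consumed.
    return Response(stream_with_context(stream))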

An alternative

If you are using Flask-SQLAlchemy, you can make use of its Pagination class to paginate your query server-side and avoid loading all 100K+ entries into the browser at once. This has the added advantage of not requiring the browser to manage all of those DOM entries (which it would have to do with the HTML streaming option).
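A rough sketch of that approach, assuming Flask-SQLAlchemy's paginate() and a few made-up query parameters (page, sort, direction) so the column-header sorting from your question can happen server-side rather than in jQuery:

from flask import g, render_template, request

# Whitelist of sortable columns -- the names here are just examples;
# map them to whatever your table headers actually are.
SORTABLE_COLUMNS = {'name': Match.name, 'id': Match.id}

@app.route('/thing/matches')
@roles_accepted('admin', 'team')
def r_matches_paged():
    page = request.args.get('page', 1, type=int)
    sort_key = request.args.get('sort', 'name')
    direction = request.args.get('direction', 'asc')

    column = SORTABLE_COLUMNS.get(sort_key, Match.name)
    if direction == 'desc':
        column = column.desc()

    # paginate() issues a LIMIT/OFFSET query, so only one page of rows
    # is ever loaded into memory or sent to the browser.
    pagination = Match.query.filter(Match.name == g.name) \
        .order_by(column).paginate(page=page, per_page=100)

    return render_template('/retailer/matches.html',
                           dashboard_title=g.name,
                           match_show_option=True,
                           matches=pagination.items,
                           pagination=pagination)

Sorting by a column header then becomes a link back to the same view with different sort/direction query parameters, and the Pagination object exposes iter_pages(), has_next and has_prev for building the page navigation in the template.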

