Flask & robots.txt

I'm new to Flask and have started messing around with it recently. I've seen about a half-dozen different answers and ways to achieve similar results.

I'm serving robots.txt from the static folder, but I'm wondering whether this is the best solution for this environment. Any advice? Is there something I should do differently?

from flask import Flask, send_from_directory, request
app = Flask(__name__, static_folder='static')

# other code

@app.route('/robots.txt')
def static_from_root():
    # request.path is "/robots.txt"; strip the leading slash and serve
    # that file from the app's static folder
    return send_from_directory(app.static_folder, request.path[1:])
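It works as far as I can tell; here's the quick sanity check I ran with Flask's built-in test client, in case it helps:

with app.test_client() as client:
    resp = client.get("/robots.txt")
    # expects robots.txt to actually exist in the static folder
    print(resp.status_code, resp.content_type)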

You could conceivably use the static files table at the bottom of the Web tab for this, but I'd say that if this send_from_directory solution is working, don't change it. Robots.txt doesn't see a lot of traffic, so there's no point in trying to optimise it...
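If you did want to go the static-files route, it would just be a single mapping entry, roughly like this (the paths are placeholders for your own, and the exact column labels on the Web tab may differ):

URL: /robots.txt
Path: /home/yourusername/yourapp/static/robots.txt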

Thanks!

:)

Robots.txt is usually a very specific use case and a pretty simple file. Instead of managing an extra static file and remembering the exceptions to the rule, a view function like the one below lets you control the response much more granularly, without messing with the app's structure or template directory paths.

from flask import Flask, Response
app = Flask(__name__)

@app.route('/robots.txt')
def noindex():
    # Build robots.txt in code instead of serving a static file.
    # "text/plain" is enough here; Flask adds "charset=utf-8" to
    # text/* responses on its own, so no manual header override is needed.
    return Response("User-Agent: *\nDisallow: /\n", mimetype="text/plain")
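One note on the content itself: "User-Agent: *" plus "Disallow: /" asks every compliant crawler to stay away from the entire site. If you wanted the opposite, to let crawlers fetch everything, the same pattern works with an empty Disallow value (the allow_all name below is just for illustration):

@app.route('/robots.txt')
def allow_all():
    # an empty Disallow value means nothing is off-limits
    return Response("User-Agent: *\nDisallow:\n", mimetype="text/plain")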

Dear pyjobs,

Does the code above prevent your site from being indexed by search engines?