Forums

proper use of robots.txt file

Hello,

I am trying to set up the robots.txt on my Django site.

I added this to my views:

def robots(request):
    data = {}
    return render(request, 'name_app/robots.txt', data)

I added this to my urls.py:

url(r'^robots\.txt$', port_main_app.views.robots),

However, when I navigate to mydomain.com/robots.txt, the file is displayed on a single line, like so:

User-agent: * Disallow: /

I believe this will confuse crawlers and prevent the rules from being followed.

How do I fix this so that the robots.txt file is displayed properly, like so:

User-agent: *
Disallow: /

Thanks in advance!

What operating system are you using? This sounds like a Unix-vs-Windows line-ending issue. What do you see if you retrieve the robots.txt using curl or requests?

My friend, your topic helped me configure my robots.txt.

I solved your problem. Change this line: return render(request, 'name_app/robots.txt', data)

To this: return render(request, 'name_app/robots.txt', data, content_type='text/plain')

The content_type must be set to text/plain. By default, render returns text/html, and browsers collapse whitespace when displaying HTML, which is why the file appeared on a single line even though the template itself has line breaks.
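The effect of the Content-Type header can be demonstrated without Django at all. Here is a minimal sketch using only the Python standard library: a throwaway HTTP server that serves a robots.txt body as text/plain, then fetches it back and inspects the header (the server, port, and file contents here are illustrative, not from the original site):

```python
import http.server
import threading
import urllib.request

ROBOTS = "User-agent: *\nDisallow: /\n"

class RobotsHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            body = ROBOTS.encode("utf-8")
            self.send_response(200)
            # text/plain tells the client to treat the body as raw text,
            # so line breaks are preserved exactly as written.
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence per-request logging

# Bind to port 0 so the OS picks a free port.
server = http.server.HTTPServer(("127.0.0.1", 0), RobotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/robots.txt"
with urllib.request.urlopen(url) as resp:
    content_type = resp.headers["Content-Type"]
    text = resp.read().decode("utf-8")
server.shutdown()

print(content_type)
print(text)
```

With content_type='text/plain' in the Django render call, the crawler receives the same kind of response: two separate lines, not one run-together string.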

Bests, Toguko