EXT: robots.txt¶
Author: | Marcel Alburg |
---|---|
Created: | 2006-10-27T22:34:28 |
Changed: | 2007-02-15T10:52:10 |
Email: | alb@weeaar.com |
EXT: robots.txt¶
Extension Key: weeaar_robotstxt
Copyright 2005 - 2006, Marcel Alburg, <alb@weeaar.com>
This document is published under the Open Content License
available from http://www.opencontent.org/opl.shtml
The content of this document is related to TYPO3
- a GNU/GPL CMS/Framework available from www.typo3.com
Table of contents¶
EXT: robots.txt
Introduction
What does it do?
Basics
Configuration
Known problems
To-Do list
Changelog
Introduction¶
What does it do?¶
Creates a robots.txt with RealURL. You can create a separate robots.txt for every domain.
Basics¶
Configuration¶
pid_list¶
Property: | pid_list |
---|---|
Description: | The ID of the page that contains the robots data. |
robots = PAGE
robots {
  typeNum = 201
  10 >
  10 < plugin.tx_weeaarrobotstxt_pi1
  10.pid_list = 2
  config {
    disableAllHeaderCode = 1
    additionalHeaders = Content-type:text/plain
    no_cache = 1
  }
}
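Since the extension can serve a different robots.txt for every domain, you can switch pid_list per host with a TypoScript condition. A minimal sketch, assuming hypothetical hosts www.example.org / www.example.com and robots pages with the IDs 3 and 4 (adjust these to your own site):

```
# Serve a different robots page per domain (host names and page IDs are examples)
[globalString = ENV:HTTP_HOST = www.example.org]
robots.10.pid_list = 3
[global]

[globalString = ENV:HTTP_HOST = www.example.com]
robots.10.pid_list = 4
[global]
```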
RealURL¶
To make the robots.txt available under the URL http://domain.tld/robots.txt you have to extend your RealURL configuration. (The relevant addition is the 'robots.txt' entry in the 'fileName' array.)
...
'fileName' => array(
  'defaultToHTMLsuffixOnPrev' => 1,
  'index' => array(
    'rss.xml' => array(
      'keyValues' => array(
        'type' => 100,
      ),
    ),
    'rss091.xml' => array(
      'keyValues' => array(
        'type' => 104,
      ),
    ),
    'rdf.xml' => array(
      'keyValues' => array(
        'type' => 101,
      ),
    ),
    'atom.xml' => array(
      'keyValues' => array(
        'type' => 103,
      ),
    ),
    'atom03.xml' => array(
      'keyValues' => array(
        'type' => 102,
      ),
    ),
    'robots.txt' => array(
      'keyValues' => array(
        'type' => 201,
      ),
    ),
  ),
),
...
Set the content for the Robots.txt¶
You can enter your robots data in the page properties of the page that is entered in pid_list (see Configuration).
Write your data into the field "Content for Robots.txt".
Sample data could be:
User-agent: *
Disallow:
User-agent: Googlebot-Image
Disallow: /
Known problems¶
To-Do list¶
- Bugfixing! ;)
- Adding more features...? Tell me what you need!