Prevent Bamboo from being indexed by search engines
Platform Notice: Data Center - This article applies to Atlassian products on the Data Center platform.
Note that this knowledge base article was created for the Data Center version of the product. Data Center knowledge base articles for non-Data Center-specific features may also work for Server versions of the product; however, they have not been tested. Support for Server* products ended on February 15, 2024. If you are running a Server product, you can visit the Atlassian Server end of support announcement to review your migration options.
*Except Fisheye and Crucible
Problem
When Bamboo is exposed to the internet, you may want to prevent search engines like Google from indexing the server so that certain pages are not exposed to the public through search results. This can be done by introducing a robots.txt file in Bamboo.
Solution
1. Stop Bamboo.
2. Create a file named robots.txt in the <bamboo-install>/atlassian-bamboo directory and add the content below. You can customize this entry to suit your needs, as explained in this article.

   User-agent: *
   Disallow: /

3. Start Bamboo.
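The entry above blocks all crawlers from the entire site. Since the rules can be customized, a more selective policy could disallow only specific areas instead. The paths below are purely illustrative assumptions, not Bamboo-specific guidance; adjust them to match the URLs you want to keep out of search results:

   User-agent: *
   Disallow: /browse/
   Disallow: /allPlans.action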
Notes
Bamboo >= 9.2.5, 9.3.4, 9.4.0
- Add the robots.txt file to the location as instructed in the Solution section.
- Add the following property to Bamboo (one way to set it is sketched below):

  -Dbamboo.enable.robots.txt=true

- Restart Bamboo.
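As a minimal sketch, assuming a Linux installation where Bamboo reads JVM arguments from JVM_SUPPORT_RECOMMENDED_ARGS in <bamboo-install>/bin/setenv.sh, the property could be appended like this (if your instance passes JVM arguments another way, for example through a service definition, set the property there instead):

   # Append the robots.txt property to the existing JVM arguments in setenv.sh
   JVM_SUPPORT_RECOMMENDED_ARGS="${JVM_SUPPORT_RECOMMENDED_ARGS} -Dbamboo.enable.robots.txt=true"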
Bamboo >= 8.1.12, 8.2.8, 9.0.3, 9.1.2, 9.2.3, 9.3.0
Bamboo's default secure servlet prevents unknown files from being exposed, including robots.txt. This is documented in BAM-22474 - robots.txt file not working in Bamboo after the upgrade.
To allow external files, disable Bamboo's default secure servlet so that any file inside the context path can be served.
Comment out the following block in <bamboo-install>/atlassian-bamboo/WEB-INF/web.xml:

   <!-- servlet-mapping>
       <servlet-name>bamboo-default-servlet</servlet-name>
       <url-pattern>/</url-pattern>
   </servlet-mapping -->
- Add the robots.txt file to the location as instructed in the Solution section.
- Restart Bamboo.
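After Bamboo restarts, you can verify that the file is now served. A quick check, assuming the default base URL of http://localhost:8085 (substitute your instance's base URL):

   curl -i http://localhost:8085/robots.txt

An HTTP 200 response containing your Disallow rules confirms the file is exposed; a 404 suggests the file is in the wrong location or is still being blocked by the secure servlet.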