Robots.txt is a plain text file that tells search engines, such as Google, which pages on your website should (or should not) be crawled and indexed.
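A minimal robots.txt might look like the sketch below. The paths and sitemap URL are placeholders for illustration only; replace them with your own.

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```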
1. Log in to the Web builder > click the "Settings" drop-down in the upper right corner, then select "Settings".
2. Switch to the "Files" tab and tick "Enable robots.txt".
3. Insert your robots.txt rules and click "Apply".
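Before applying your rules, you can sanity-check them with Python's standard-library `urllib.robotparser`. This is an optional sketch; the rules and URLs below are hypothetical examples, not part of the Web builder itself.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules to validate before pasting into the builder
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Check whether a given crawler may fetch specific pages
print(parser.can_fetch("*", "https://example.com/admin/login"))  # blocked by Disallow: /admin/
print(parser.can_fetch("*", "https://example.com/about"))        # allowed by Allow: /
```

This catches typos (for example, a misspelled `Disallow:` directive) before crawlers see the live file.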

