Robots.txt is a plain-text file that tells search engines such as Google which pages on your website they may crawl and index.
1. Log in to the Web builder > click Settings in the upper right corner.
2. Switch to the "robots.txt" tab.
2.1 Click Enable robots.txt.
2.2 Paste your robots.txt rules and click "Apply".
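As a reference, here is a minimal sketch of what you might paste in step 2.2. The directory name and sitemap URL are placeholders — replace them with your own values:

```
# Allow all crawlers to access the whole site,
# except a hypothetical /admin/ section
User-agent: *
Disallow: /admin/

# Optional: tell crawlers where your sitemap lives
# (replace with your site's actual sitemap URL)
Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line (or omitting the rule entirely) permits crawling of everything; blocking a page in robots.txt prevents crawling but does not guarantee it is removed from search results if other sites link to it.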