How do I write a robots.txt file to block certain pages from being crawled?
https://www.fanyuancloud.com/articles/189
https://www.fanyuancloud.com/articles/190
https://www.fanyuancloud.com/articles/187
https://www.fanyuancloud.com/articles/100
https://www.fanyuancloud.com/articles/101
https://www.fanyuancloud.com/articles/102
https://www.fanyuancloud.com/articles/98
https://www.fanyuancloud.com/articles/67
I don't want the Baidu spider to crawl these pages, but I don't want to delete or hide the URLs either. I just want to stop them from being crawled. How should the robots.txt file be written?
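For the URLs listed above, one possible robots.txt (placed at the site root, i.e. https://www.fanyuancloud.com/robots.txt) would target Baidu's crawler by its user-agent name, Baiduspider, and list each path with a Disallow rule. This is a sketch, not an official answer from the thread; note that Baidu's robots.txt extension supports the `$` end-of-URL anchor, used here so that e.g. `/articles/98` does not also match `/articles/980`:

```
# Rules for Baidu's crawler only; other bots are unaffected
User-agent: Baiduspider
Disallow: /articles/189$
Disallow: /articles/190$
Disallow: /articles/187$
Disallow: /articles/100$
Disallow: /articles/101$
Disallow: /articles/102$
Disallow: /articles/98$
Disallow: /articles/67$
```

Keep in mind that Disallow only stops crawling; if other sites link to these pages, Baidu may still index the bare URLs. To keep pages out of the index entirely while leaving them live, a `<meta name="robots" content="noindex">` tag on each page is the more reliable mechanism (and the page must remain crawlable for that tag to be seen).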