How do I write a robots file to block certain pages from being crawled?
https://www.fanyuancloud.com/articles/189
https://www.fanyuancloud.com/articles/190
https://www.fanyuancloud.com/articles/187
https://www.fanyuancloud.com/articles/100
https://www.fanyuancloud.com/articles/101
https://www.fanyuancloud.com/articles/102
https://www.fanyuancloud.com/articles/98
https://www.fanyuancloud.com/articles/67
I don't want Baidu's spider to crawl these pages, but I don't want to delete the URLs or hide them either; I just don't want them crawled. How should the robots file be written?
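A minimal sketch of the rules, assuming the file is served as `robots.txt` at the site root (https://www.fanyuancloud.com/robots.txt) and that the goal is to block only Baidu's crawler (user agent `Baiduspider`) while leaving every other crawler unaffected:

```
# Block only Baidu's crawler from the listed pages.
User-agent: Baiduspider
Disallow: /articles/189
Disallow: /articles/190
Disallow: /articles/187
Disallow: /articles/100
Disallow: /articles/101
Disallow: /articles/102
Disallow: /articles/98
Disallow: /articles/67

# All other crawlers keep full access.
User-agent: *
Disallow:
```

Two caveats: `Disallow` rules are prefix matches, so `/articles/189` would also block a hypothetical `/articles/1890`; Baiduspider reportedly supports a `$` end-of-URL anchor (e.g. `Disallow: /articles/189$`) if exact matching is needed, but check Baidu's own documentation before relying on it. Also note that robots.txt is a crawl directive, not a removal tool: pages already indexed may still appear in results until they are re-evaluated.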
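The rule behavior can be sanity-checked locally with Python's standard-library robots parser. This is an illustrative sketch, not Baidu's own matcher; note that `urllib.robotparser` follows the original robots spec (plain prefix matching, no `$` wildcard), so the rules below use plain prefixes:

```python
from urllib.robotparser import RobotFileParser

# A copy of the proposed robots.txt rules (prefix form).
rules = """\
User-agent: Baiduspider
Disallow: /articles/189
Disallow: /articles/190
Disallow: /articles/187
Disallow: /articles/100
Disallow: /articles/101
Disallow: /articles/102
Disallow: /articles/98
Disallow: /articles/67
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Baiduspider is blocked from a listed page...
print(rp.can_fetch("Baiduspider", "https://www.fanyuancloud.com/articles/189"))
# ...but other pages, and other crawlers, remain allowed.
print(rp.can_fetch("Baiduspider", "https://www.fanyuancloud.com/articles/50"))
print(rp.can_fetch("Googlebot", "https://www.fanyuancloud.com/articles/189"))
```

Because there is no `User-agent: *` group in this snippet, the parser allows every other crawler by default, which mirrors the intent of blocking only Baidu.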