Robots.txt - would "Allow" support be useful to many?
Posted: Wed Sep 06, 2023 5:45 pm
I'm not sure the best way to handle this - I want to spider parts of my local server, while generally disallowing external robots. Also, since spiders come and go and their names change, supporting only Disallow seems a bit limiting - at least for me.
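Something like the following is what I have in mind (a sketch only - the "Sphider" user-agent string is an assumption on my part; I haven't confirmed what Sphider actually sends):

```
# Hypothetical robots.txt: let the local Sphider crawler in,
# keep everyone else out, without enumerating external bots.
User-agent: Sphider
Allow: /

User-agent: *
Disallow: /
```

With Allow support, a whitelist like this stays valid even as external spiders come and go, which is why Disallow-only handling feels limiting.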
There is a second challenge: I would prefer to index the secure path, like https://local.server.com/path/
My server has a letsencrypt cert, but I don't know if there is a way to grant Sphider the capability to make that https connection.
Thanks - I'm a fairly new user, so still learning. Apologies if the answer is already available.