Are all repo contents prevented from being crawled by search engines? #22746
-
Hello GitHub experts! Recently I noticed that https://github.com/robots.txt prevents all repo contents from being crawled by search engines. Here is the current content of the robots.txt.
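(The pasted file was not preserved in this thread. As a rough illustration only, a robots.txt that disallows all crawling typically looks like the sketch below; this is a generic example, not necessarily GitHub's exact file.)

```
# Illustrative deny-all robots.txt (hypothetical, not GitHub's actual file).
# The wildcard user-agent rule tells every crawler to skip the entire site.
User-agent: *
Disallow: /
```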
I believe that by blocking all crawlers, repository contents will no longer show up in search engine results. Is this change intentional?
Replies: 2 comments
-
Hi @zhiliangxu! 👋 Welcome to the Community! I’ve been checking with our SEO team about this. It seems that our robots.txt was updated in May to block all crawling, but we are in the process of reverting this change. We expect repositories to start getting indexed again in the next few weeks!
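(For anyone who wants to check when the revert actually lands, here is a minimal sketch using Python's standard-library robots.txt parser. The repository URL below is just a placeholder example.)

```python
# Check whether GitHub's live robots.txt currently allows a
# generic crawler ("*") to fetch a given repository page.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://github.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Placeholder repo URL; prints True once the crawling block is reverted.
url = "https://github.com/octocat/Hello-World"
print(rp.can_fetch("*", url))
```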
-
Hi, any update on this?