My GitHub wiki is not being crawled by Google


I created a wiki, and it took a long time, but my articles and account have not been crawled by Google. I have created the file, but it is still not indexed. If anyone knows how, please help me get my wiki articles indexed on search engines. Please!

Thank you!

~ TuanBuffalo ~


Hi @tuanbuffalo01,

Thanks for being part of the GitHub Community Forum!

Unfortunately, GitHub has no control over what Google caches, so you’ll need to contact Google for help with this. You can send them a request to remove outdated content.

Best of luck!

> Unfortunately, GitHub has no control over what Google caches, so you’ll need to contact Google for help with this.

Unfortunately, that is not a true statement. If you check GitHub’s robots.txt, it explicitly forbids Google from crawling wiki pages:

Disallow: /*/*/wiki/*/*

This is a decision taken by GitHub, and there is no way around it from the user’s side, as discussed on Stack Exchange and elsewhere.

Unfortunately, this rule significantly reduces the usefulness of GitHub’s wiki pages.


@martin-pr @tuanbuffalo01 Thank you for following up on this, @martin-pr. Indeed, you are correct. Apologies for any confusion caused by my previous comment.

I’m afraid it isn’t possible to enable web crawling for a wiki at this time, but I can pass the request on to our team to consider. I can definitely understand the value in this. I can’t promise if or when we’d add this, but we’ll make sure the request is in the right hands!


Hi @nadiajoyce,

Do you have any update on this, please?

Thanks in advance.


Hi @mohkharma,

I don’t have any additional information on this. As mentioned before, I can’t share any timeline or guarantee about if or when this might be implemented. 

If you’d like up-to-date info on what changes or features we may be releasing, I’d recommend following the changelog on our blog. That will be the best place to get news about anything we may add or change.


I work hard on my wiki. Why is it not enabled…

`/*/*/wiki/*/*` is disallowed, but `/*/*/wiki` and `/*/*/wiki/*` are allowed. And the wiki pages do appear in Google search results.

So, I believe that is not the case, and the first statement by @nadiajoyce was correct.


I wish that were the case but, unfortunately, you are mistaken. The only reference to wiki in the GitHub robots.txt file reads:

Disallow: */wiki/

I have posted a request to GitHub to allow wikis to be indexed by web crawlers.

This situation is a disappointment to me personally, because I put a lot of effort into providing content related to the algorithms and implementation design used in my open-source project. Some of that information would be useful to others using my software or attempting their own implementations. Of course, since the wikis are not indexed, the chances of them being able to find my content and take advantage of it are greatly reduced.
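For anyone who wants to check the two Disallow rules quoted in this thread themselves, they behave quite differently. Note that Python’s standard `urllib.robotparser` does not implement the `*` wildcard extension that Google honors, so here is a minimal hand-rolled sketch (the `rule_matches` helper and the example paths are illustrative, not part of any official API):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt Disallow pattern matches the path.

    Translates '*' to '.*' and treats the pattern as a prefix match
    anchored at the start of the path, per Google's wildcard handling.
    """
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex, path) is not None

# The older rule only blocked wiki URLs two or more levels deep:
print(rule_matches("/*/*/wiki/*/*", "/user/repo/wiki/Page/Sub"))  # True
print(rule_matches("/*/*/wiki/*/*", "/user/repo/wiki/Page"))      # False

# The rule quoted above blocks everything beneath /wiki/:
print(rule_matches("*/wiki/", "/user/repo/wiki/Page"))            # True
print(rule_matches("*/wiki/", "/user/repo/wiki"))                 # False
```

This would explain both observations in the thread: under `/*/*/wiki/*/*`, top-level wiki pages could still be crawled, while `*/wiki/` shuts out every page below `/wiki/`.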