I respectfully request that the GitHub team consider removing the restriction in their robots.txt file that prevents search engines from indexing GitHub wiki pages.
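For reference, this is the kind of robots.txt rule at issue (an illustrative sketch only; the exact user-agent sections and path patterns in GitHub’s actual file may differ):

    User-agent: *
    Disallow: /*/*/wiki

A crawler that honors a rule like this will never fetch a URL matching the wiki path pattern, so the content of those pages can never enter a search index.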
I have put a great deal of effort into wiki pages that assist users of my open-source software. I also hoped they would help potential users find the project, since they provide meaningful content related to the problems my software addresses. The fact that Bing, Google, and other search engines cannot index my wiki pages limits the effectiveness of these pages for my project and the usefulness of their content for the Internet community as a whole.
For example, I’ve written an article on Natural Neighbor Interpolation, which is a function my software supports. It’s a niche topic, and the information I supply is not well covered elsewhere. Enough people have linked to my article that if you run a Google search on “Natural Neighbor Interpolation”, my wiki page comes up as the fourth item in the results. But, disappointingly, the description line on Google’s search page reads “No information is available for this page”, presumably because inbound links tell Google the page exists while robots.txt forbids crawling its content. The page doesn’t show up in Bing at all.
I have thought about this for a few hours now, and I cannot think of a plausible rationale for deterring search engines from indexing pages on GitHub… but I accept that there must have been a reason for doing so. However, I believe that project wiki pages are a special case: nobody would put content in a wiki page unless it were intended for public consumption. So I ask that the GitHub managers at least give this issue due consideration.
Thank you for your attention to this matter.