Google SEO - new page not showing up in search results - Angular 6

I have to create a new page and make it independently searchable on Google. My application is in Angular 6. I have created a new Angular component and defined the meta tags in app.routing like below:

{ path: 'test',
  data: {
    breadcrumb: 'test page',
    title: 'test page title',
    metaDescription: 'test page description',
    metaKeywords: 'keyword1, keyword2'
  },
  component: TestComponent }

I am updating the header of the page on load of the component according to these metadata values (roughly as in the sketch below). When I load the page I can see the updated meta title, keywords and description in the page source in the browser for the test page. But when I search on Google for keyword1 or keyword2, the page does not show up in the search results. In fact, I don't see the new page listed in the sitemap of my website on Google either. Am I missing anything here? Also, is there any way to test this Google search in a test environment?
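For reference, this is roughly what the component does on load (a simplified sketch using Angular's Meta and Title services from @angular/platform-browser; the actual component in my app has more to it):

import { Component, OnInit } from '@angular/core';
import { ActivatedRoute } from '@angular/router';
import { Meta, Title } from '@angular/platform-browser';

@Component({
  selector: 'app-test',
  templateUrl: './test.component.html'
})
export class TestComponent implements OnInit {

  constructor(
    private route: ActivatedRoute,   // gives access to the route's data block
    private titleService: Title,     // sets the document <title>
    private metaService: Meta        // adds/updates <meta> tags
  ) {}

  ngOnInit(): void {
    // Read the values defined in app.routing and apply them to the document head.
    const data = this.route.snapshot.data;
    this.titleService.setTitle(data['title']);
    this.metaService.updateTag({ name: 'description', content: data['metaDescription'] });
    this.metaService.updateTag({ name: 'keywords', content: data['metaKeywords'] });
  }
}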

Google indexing might take some time, and even if Google has visited your website after you've updated the metadata, it might still not update its search results for a while. How exactly Google works is not publicly available information, but we know that Google takes some precautions before updating its search results, in order to prevent keyword spamming and to avoid promoting sites that switch contents once they show up in Google searches.

So, probably you’ll just have to wait some time to allow the new metadata to affect the searches.

Also, if your keywords belong to highly inflated sets (i.e. keywords for which there is huge competition), the reason could be that your website is not making its way up the ranks enough to show up in the results. This is always a possibility.

Thank you for your reply @tajmone. It's almost 4 months since I deployed my changes and the page still does not show up. By any chance, do I need to change anything there, like the meta keywords or description which I am using for the newly published page? Actually, there is no reference to the new page anywhere on the website, so it's only meant to show up through Google search results. Do you suggest waiting longer, if there is a possibility that it might take more time to show up in Google's index, or should I try to make some changes and redeploy?

Four months should be enough for Google to re-index the website and show it in the search results.

Could you provide a link to the actual website? This would help in various ways, from being able to look at the website HTML sources to submitting its URL to some online SEO analysis tools. If I could see the page sources I could give you some SEO tips (I’ve worked a lot with SEO in the past and I know that often it’s the small things that can lead to big improvements).

I suggest you also test your website using the developer tools built into Google's browser (Chrome), especially the recently added Lighthouse tool (accessible from the browser via F12), which provides in-depth diagnostics and optimization tips (including SEO):

Lighthouse is an open-source, automated tool for improving the quality of web pages. You can run it against any web page, public or requiring authentication. It has audits for performance, accessibility, progressive web apps, SEO and more.
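Since you also asked about testing things in a test environment: Lighthouse can be run programmatically as well, via its Node module. Below is only a rough sketch, assuming the lighthouse and chrome-launcher npm packages are installed (import style and option names may differ slightly between Lighthouse versions):

import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function auditSeo(url: string): Promise<void> {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  // Run only the SEO category to keep the audit fast.
  const result = await lighthouse(url, { port: chrome.port, onlyCategories: ['seo'], output: 'json' });
  if (result) {
    console.log(`SEO score for ${url}:`, (result.lhr.categories.seo.score ?? 0) * 100);
  }
  await chrome.kill();
}

// Hypothetical URL, just for illustration.
auditSeo('https://www.example.com/').catch(console.error);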

Sure, thank you. I tried auditing in Lighthouse and it shows an SEO score of 90 for the home page as well as for the new page. Below are the links to the home page and to the new page for which indexing is not working:

home page - www.bristolwest.com
new page - www.bristolwest.com/reviews

When I try to open those links I get an error page denying me access to the pages (my IP address and some other personal info partly redacted with Xs for privacy reasons):

www.bristolwest.com - Access Denied

Error code 16

This request was blocked by the security rules

2020-08-17 18:XX:XX UTC

  • Your IP: 93.XX.252.XX
  • Proxy IP: 45.XX14.23 (ID 108XX-100)
  • Origin Server IP: N/A

Incident ID: 879000130002213XXX-371XX33436XX60586

So it should not come as a surprise if Google is not indexing the website: probably it's not able to see it at all (among the checks that Google probably performs is verifying that websites show the same content to different IPs, to prevent fake websites from serving fake content to trick Google).

You really need to find out why your server is banning visitors like me from viewing your site.

The tricky part is that apparently you’re seeing the website without problems, so you’ll have to try viewing it using proxies from around the world to get a feel of how wide the server censorship is.
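One quick check you can run yourself is to request the page with a crawler-style User-Agent and compare the HTTP status to that of a normal browser request. This is only a rough sketch (it runs from your own network, so it cannot reproduce an IP- or region-based block, and the URL is just the one you posted), but it can show whether the firewall treats crawler user agents differently:

// Requires Node 18+ (built-in fetch).
const url = 'https://www.bristolwest.com/reviews';

async function checkStatus(label: string, userAgent: string): Promise<void> {
  const res = await fetch(url, { headers: { 'User-Agent': userAgent } });
  console.log(`${label}: HTTP ${res.status} ${res.statusText}`);
}

async function main(): Promise<void> {
  // A Googlebot-style User-Agent vs. a plain desktop-browser one.
  await checkStatus('crawler UA', 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)');
  await checkStatus('browser UA', 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)');
}

main().catch(console.error);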

The only reason I can think of for it blocking me is that my ISP doesn't have full NAT.

But then again, Google is not really interested in promoting links to websites which have geographical restrictions (otherwise end users would get search results that lead to error pages).

Right. I think we have a firewall for this application and it is meant for United States regions only; I can investigate this further. But my doubt is still that Google is indexing all the other pages of the application, just not the new page that I created. If you search site:bristolwest.com on Google, you will see all the other pages in the search results, just not the 'reviews' page, which is newly created and whose meta keywords and description are different from the rest of the pages.

Don't really know what to say about this. But my guess is that Google might have discovered the regional restrictions at a later stage, and this might have affected further indexing. Google's indexing is a complex process resulting from the interoperation of thousands of different servers and tools which are constantly analyzing the WWW from different places and angles, so it seems reasonable to think that indexing is a multi-stage process nowadays. Far gone are the old days of the 90s, when you could calculate to the hour when a website would be indexed by Google (it used to happen at midnight at the beginning of a new month). Since the introduction of the Google Sandbox, SEO has no longer been the "exact science" it used to be.

Today you simply can't rely on rankings and predictable Google behavior. All you can (and should) do is design compliant websites with semantically rich content that is natural, rather than artificial text aimed at keyword spamming (you won't get away with that today).

Restricting contents to specific regions is probably a big blocker for Google’s indexing. I have no direct proof of this, but it would make sense that Google shuns restricted websites in favor of unrestricted ones. Also, bare in mind that your server doesn’t produce a standard error page, so Google might simply think you’re serving different contents according to IP — which is an alarm bell to Google, because in the past some search-engine spammers have used similar techniques to serve Google tailored contents to gain ranking, while serving contents that would have been censored by Google to certain IPs (i.e. marketing certain regions with services that are not eligible to Google indexing).