If you have a website, you probably have pages that you may not want shown to the public, such as login pages, thank-you pages, and download pages.

For years, one popular way to keep a page's URL out of the index was to list the URL under a noindex: rule in a robots.txt file and upload that file to the server that hosts the site.

This was an ‘unofficial’ method, but it became very common because the other options were confusing and time-consuming to implement.
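For reference, the unofficial rule looked roughly like this in a robots.txt file (the paths below are just examples, not anything from your site):

    User-agent: *
    Noindex: /thank-you/
    Noindex: /downloads/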

Yesterday, Google announced that as of September 1 it will no longer recognize this method, and those page URLs may become indexed. If you do not want sensitive pages indexed, you’ll need to take action now.

The official announcement: https://webmasters.googleblog.com/2019/07/a-note-on-unsupported-rules-in-robotstxt.html

Action needed: insert a “noindex” robots meta tag into the head section of each page itself, before the closing “/head” tag.
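The tag itself looks like this (placed anywhere inside the head of the page you want kept out of the index):

    <head>
      <!-- Tells search engine crawlers not to index this page -->
      <meta name="robots" content="noindex">
    </head>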

See more info here: https://developers.google.com/search/reference/robots_txt

If your site is a WordPress site, using a popular SEO plugin to suppress page URLs is an option to explore, BUT I don’t enjoy using that plugin because it fills my dashboard with “you need to buy stuff from us” prompts, to the point that I won’t use it.

Being who I am, I searched for a simple fix and there wasn’t one. So I solved the issue by developing a lightweight little plugin that gives me the option to insert the tag on each page of a site as desired, per the new guidance, without any bloated advertising. It’s quick and easy.
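For anyone curious how a plugin like this can work under the hood, here is a minimal sketch, not my plugin’s actual code: it checks a hypothetical per-page custom field named _noindex_page and, if the field is set, prints the meta tag through WordPress’s wp_head hook. The field name and function name are illustrative only.

    <?php
    /*
    Plugin Name: Noindex Per Page (sketch)
    Description: Minimal illustration - prints a noindex robots meta tag on pages flagged with a custom field.
    */

    // Print the noindex tag inside <head> on flagged pages.
    // '_noindex_page' is a hypothetical custom field; set it to "1" on any page
    // you want kept out of the index.
    function npp_maybe_output_noindex() {
        if ( is_singular() && get_post_meta( get_the_ID(), '_noindex_page', true ) ) {
            echo '<meta name="robots" content="noindex">' . "\n";
        }
    }
    add_action( 'wp_head', 'npp_maybe_output_noindex' );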

If you have questions and would like some assistance, please call and leave me a message at 816-482-3755. I’m happy to help.