The Latest robots.txt Settings and Super SEO Tag Header (2019)


Robots.txt and Super SEO 2019 Tag Header Guide

  1. Log in to your Blogger dashboard.
  2. Open the Settings menu.
  3. Select Search Preferences.
  4. Look under Crawlers and indexing.
  5. Next to Custom robots.txt, click Edit and choose Yes. Then enter the custom robots.txt code below in the field provided and click Save Changes.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://probloglive.com/sitemap.xml
Replace the sitemap URL above with your own blog's address.
Information:
1. User-agent: * covers all of Google's crawlers, including Googlebot, Mediapartners-Google, Googlebot-Image, Googlebot-Video, Googlebot-News, and Googlebot-Mobile.
2. Disallow: /search means that no URL under the /search directory will be crawled by search engines. For Blogger sites, the Disallow: /search line is highly recommended so that label and search pages do not produce duplicate titles and descriptions in the search results.
3. Allow: / means that robots.txt permits search engines to crawl every URL in the site's directory, except where a Disallow rule applies.
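The rules above can be checked without waiting for a crawler, using Python's built-in urllib.robotparser. This is a minimal sketch: the rules are parsed from a string rather than fetched over the network, and the probloglive.com URLs are taken from the example in this article.

```python
# Verify the robots.txt rules from this article with the standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
# parse() accepts the file contents as a list of lines.
parser.parse(rules.splitlines())

# Ordinary crawlers (matched by User-agent: *) may fetch posts
# but not anything under /search.
print(parser.can_fetch("Googlebot", "https://probloglive.com/2019/01/post.html"))
print(parser.can_fetch("Googlebot", "https://probloglive.com/search?q=seo"))

# Mediapartners-Google matches its own record, whose empty
# Disallow line allows everything, including /search.
print(parser.can_fetch("Mediapartners-Google", "https://probloglive.com/search?q=seo"))
```

Running this prints True, False, True: regular search crawlers are blocked from /search while the AdSense crawler can still reach every page.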