How to Create robots.txt file & sitemap.xml file for seo


No replies in this topic

#1 FerMer

    Capitalist

  • Members
  • 55,395 posts
  • 0 thanks
Posted 25 May 2015 - 05:08



How to Create robots.txt file & sitemap.xml file for seo

rankyaseoservices

Published: 19 June 2012

How to create a robots.txt file & sitemap.xml file for SEO purposes. Watch this video to learn how to take advantage of these methods for higher Google rankings.

Brought to you by www.rankya.com.au SEO services. In SEO circles, this method of controlling which pages the Google search engine should index is called PageRank sculpting or link sculpting.

Please NOTE: do NOT include a bare forward slash / by accident, because doing so directs user agents to keep out of your entire website (I've seen it happen). So whatever you do, do NOT use it like this:

User-Agent: Googlebot
Disallow: /
I repeat: do NOT place the bare forward slash / (see above), because it tells Googlebot to stay away from your entire website, which in turn de-indexes the whole site, and no webmaster would want that.
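A safer pattern, shown here as a sketch (the directory names are hypothetical examples, not from the video), is to disallow only the paths you actually want kept out, so the rest of the site stays crawlable:

```text
User-Agent: Googlebot
Disallow: /private/

User-Agent: *
Disallow: /tmp/
Disallow: /cgi-bin/
```

A rule like Disallow: /private/ applies only to URLs under that path; every other URL on the site remains open to the crawler.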

In short, you have the option to control which of your web pages Google indexes, and that is well worth considering for your online business.
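You can also check how a crawler that honors robots.txt would interpret your rules before deploying them. A minimal sketch using Python's standard urllib.robotparser (the example.com URLs and paths are placeholders):

```python
from urllib import robotparser

# A robots.txt with a bare "Disallow: /" blocks every URL for the named agent.
blocked = robotparser.RobotFileParser()
blocked.parse([
    "User-Agent: Googlebot",
    "Disallow: /",
])
print(blocked.can_fetch("Googlebot", "https://example.com/any-page.html"))  # False

# A targeted rule blocks only the listed path and leaves the rest crawlable.
targeted = robotparser.RobotFileParser()
targeted.parse([
    "User-Agent: Googlebot",
    "Disallow: /private/",
])
print(targeted.can_fetch("Googlebot", "https://example.com/private/secret.html"))  # False
print(targeted.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

This mirrors the warning above: the bare slash shuts the crawler out of everything, while the targeted rule only fences off one directory.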


You can read more about blocking certain web pages with the robots.txt file in Google's guidelines for webmasters here:
support.google.com/webmasters...

XML sitemaps are a way to tell Google about pages on your site that it might not otherwise discover; simply head over to:
support.google.com/webmasters...

The key points that I want to draw your attention to: web crawlers (also known as web spiders) usually discover web pages by following hyperlinks from one page to another across the internet. Sitemaps supplement that with your URL data, allowing crawlers that support Sitemaps to become aware of every URL in the Sitemap you create and to learn about those URLs through the associated metadata. Read more about the Sitemap standard here:
www.sitemaps.org/
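A minimal sitemap.xml, sketched here per the sitemaps.org protocol (the URL, date, and values are placeholders, not from the video):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2012-06-19</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only the loc element is required per URL; lastmod, changefreq, and priority are optional hints to the crawler.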


For in-depth reading (most webmasters can skip this) you can visit:
developers.google.com/webmast...

You can even use other methods instead of the robots.txt file for Googlebot; visit this page to learn more about the robots meta tag and the X-Robots-Tag HTTP header specifications:
developers.google.com/webmast...
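For reference, those two alternatives look like this, sketched with the standard noindex directive (where you place them is up to your pages and server configuration):

```text
<!-- In the page's <head>: ask robots not to index this page -->
<meta name="robots" content="noindex">

# As an HTTP response header (useful for PDFs and other non-HTML files):
X-Robots-Tag: noindex
```

Unlike robots.txt, which controls crawling, these directives control indexing on a per-page basis, so the crawler must still be able to fetch the page to see them.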


