Source: https://support.google.com/webmasters/answer/35769?hl=en

General Guidelines

Help Google find your pages
  • Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute that is relevant to the target page. Crawlable links are <a> tags with an href attribute (see the sketch after this list).
  • Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).
  • Limit the number of links on a page to a reasonable number (a few thousand at most).
  • Make sure that your web server correctly supports the If-Modified-Since HTTP header. This feature directs your web server to tell Google if your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
  • Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool.
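
A minimal sketch of a crawlable link from the first point above (the URL and anchor text are hypothetical):

<!-- crawlable: an <a> tag with an href attribute and descriptive link text -->
<a href="https://www.example.com/widgets/blue-widget">Blue widget specifications</a>

<!-- not crawlable: no href attribute; navigation depends entirely on script -->
<span onclick="window.location='/widgets/blue-widget'">Blue widget specifications</span>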

Help Google understand your pages
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Ensure that your <title> elements and alt attributes are descriptive, specific, and accurate.
  • Design your site to have a clear conceptual page hierarchy.
  • Follow our recommended best practices for images, video, and structured data.
  • When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl.
  • To help Google fully understand your site’s contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files. To see which page assets Googlebot cannot crawl, use the URL Inspection tool; to debug directives in your robots.txt file, use the robots.txt Tester tool.
  • Allow search bots to crawl your site without session IDs or URL parameters that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
  • Make your site’s important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections; however, we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view.
  • Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel="nofollow" to prevent advertisement links from being followed by a crawler, as shown below.
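
A minimal sketch of the last point, assuming a hypothetical advertiser URL; the rel="nofollow" attribute tells crawlers not to follow the link:

<!-- sponsored link marked so that crawlers do not follow it or pass PageRank -->
<a href="https://advertiser.example.com/offer" rel="nofollow">Sponsored: spring sale</a>
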
Help visitors use your pages
  • Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text (see the sketch after this list).
  • Ensure that all links go to live web pages. Use valid HTML.
  • Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow Internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page.
  • Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile-friendly testing tool to test how well your pages work on mobile devices, and get feedback on what needs to be fixed.
  • Ensure that your site appears correctly in different browsers.
  • If possible, secure your site’s connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web.
  • Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen-reader.
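
A hedged sketch of the first point above (the file name and wording are hypothetical): if an image must carry text, a short alt description keeps the content usable, though plain text is preferable where the design allows.

<!-- an image that conveys text, described with a brief alt attribute -->
<img src="/img/acme-logo.png" alt="Acme Widgets">

<!-- better, where possible: real text styled with CSS -->
<h1>Acme Widgets</h1>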

Quality guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here. It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google’s search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.

Basic principles

  • Make pages primarily for users, not for search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Specific guidelines

Avoid the following techniques:

Automatically generated content

Automatically generated—or “auto-generated”—content is content that’s been generated programmatically. In cases where it is intended to manipulate search rankings and not help users, Google may take action on such content. Such content includes, but is not limited to:

  • Text that makes no sense to the reader but which may contain search keywords.
  • Text translated by an automated tool without human review or curation before publishing
  • Text generated through automated processes, such as Markov chains
  • Text generated using automated synonymizing or obfuscation techniques
  • Text generated from scraping Atom/RSS feeds or search results
  • Stitching or combining content from different web pages without adding sufficient value

Sneaky redirects

Redirecting is the act of sending a visitor to a different URL than the one they initially requested. There are many good reasons to redirect one URL to another, such as when moving your site to a new address, or consolidating several pages into one.

However, some redirects deceive search engines or display content to human users that is different than that made available to crawlers. It’s a violation of Google Webmaster Guidelines to redirect a user to a different page with the intent to display content other than what was made available to the search engine crawler. When a redirect is implemented in this way, a search engine might index the original page rather than follow the redirect, while users are taken to the redirect target. Like cloaking, this practice is deceptive because it attempts to display different content to users and to Googlebot, and can take a visitor somewhere other than where they expected to go.

Some examples of sneaky redirects include:

  • Search engines shown one type of content while users are redirected to something significantly different.
  • Desktop users receive a normal page, while mobile users are redirected to a completely different spam domain.

Using JavaScript to redirect users can be a legitimate practice. For example, if you redirect users to an internal page once they’re logged in, you can use JavaScript to do so. When examining JavaScript or other redirect methods to ensure your site adheres to our guidelines, consider the intent. Keep in mind that 301 redirects are best when moving your site, but you could use a JavaScript redirect for this purpose if you don’t have access to your website’s server.
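
As a hedged illustration, a minimal sketch of a legitimate JavaScript redirect after login (the cookie name and destination path are hypothetical):

<script>
  // send users who already have a session to their account page
  if (document.cookie.indexOf("session_id=") !== -1) {
    window.location.replace("/account");
  }
</script>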

Link schemes

Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.

The following are examples of link schemes which can negatively impact a site’s ranking in search results:

  • Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link
  • Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages exclusively for the sake of cross-linking
  • Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links
  • Using automated programs or services to create links to your site
  • Requiring a link as part of a Terms of Service, contract, or similar arrangement without allowing a third-party content owner the choice of using nofollow or other method of blocking PageRank, should they wish.

Additionally, creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Here are a few common examples of unnatural links that may violate our guidelines:

  • Text advertisements that pass PageRank
  • Advertorials or native advertising where payment is received for articles that include links that pass PageRank
  • Links with optimized anchor text in articles or press releases distributed on other sites. For example:
    There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
  • Low-quality directory or bookmark site links
  • Keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites, for example:
    Visitors to this page: 1,472
    car insurance
  • Widely distributed links in the footers or templates of various sites
  • Forum comments with optimized links in the post or signature, for example:
    Thanks, that’s great info!
    – Paul
    paul’s pizza san diego pizza best pizza san diego

Note that PPC (pay-per-click) advertising links that don’t pass PageRank to the buyer of the ad do not violate our guidelines. You can prevent PageRank from passing in several ways, such as:

  • Adding a rel="nofollow" attribute to the <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
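
A hedged sketch of the second option, assuming a hypothetical /ad-out/ redirect path: the ad link points at an intermediate handler, and robots.txt keeps crawlers away from it.

<!-- the ad click is routed through an intermediate redirect page -->
<a href="/ad-out/?dest=advertiser.example.com">Sponsored offer</a>

# robots.txt: block crawlers from the redirect handler
User-agent: *
Disallow: /ad-out/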

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

If you see a site that is participating in link schemes intended to manipulate PageRank, let us know. We’ll use your information to improve our algorithmic detection of such links.

Cloaking

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Some examples of cloaking include:

  • Serving a page of HTML text to search engines, while showing a page of images or Flash to users
  • Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor

If your site uses technologies that search engines have difficulty accessing, like JavaScript, images, or Flash, see our recommendations for making that content accessible to search engines and users without cloaking.

If a site gets hacked, it’s not uncommon for the hacker to use cloaking to make the hack harder for the site owner to detect. Read more about hacked sites.

Hidden text and links

Hiding text or links in your content to manipulate Google’s search rankings can be seen as deceptive and is a violation of Google’s Webmaster Guidelines. Text (such as excessive keywords) can be hidden in several ways, including:

  • Using white text on a white background
  • Locating text behind an image
  • Using CSS to position text off-screen
  • Setting the font size to 0
  • Hiding a link by only linking one small character—for example, a hyphen in the middle of a paragraph

When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site. Are any text or links there solely for search engines rather than visitors?

However, not all hidden text is considered deceptive. For example, if your site includes technologies that search engines have difficulty accessing, like JavaScript, images, or Flash files, using descriptive text for these items can improve the accessibility of your site. Remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either and will benefit from the descriptive text as well. You can test your site’s accessibility by turning off JavaScript, Flash, and images in your browser, or by using a text-only browser such as Lynx. Some tips on making your site accessible include:

  • Images: Use the alt attribute to provide descriptive text, as shown in the sketch after this list. In addition, we recommend using a human-readable caption and descriptive text around the image. See this article for more advice on publishing images.
  • JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what’s contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.
  • Videos: Include descriptive text about the video in HTML. You might also consider providing transcripts. See this article for more advice on publishing videos.
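
Putting the first two tips together, a minimal hedged sketch (the image path, caption, and script content are hypothetical):

<!-- descriptive alt text plus a visible, human-readable caption -->
<img src="/photos/alps-trail.jpg" alt="Hikers on a ridge trail in the Alps">
<p>Morning ascent on the Tour du Mont Blanc.</p>

<!-- the same content is available when JavaScript is disabled -->
<script>document.write("<p>Current trail status: open</p>");</script>
<noscript><p>Current trail status: open</p></noscript>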

Doorway pages

Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.

Here are some examples of doorways:

  • Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
  • Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
  • Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy

Scraped content

Some webmasters use content taken (“scraped”) from other, more reputable sites on the assumption that increasing the volume of pages on their site is a good long-term strategy regardless of the relevance or uniqueness of that content. Purely scraped content, even from high-quality sources, may not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases. It’s worthwhile to take the time to create original content that sets your site apart. This will keep your visitors coming back and will provide more useful results for users searching on Google.

Some examples of scraping include:

  • Sites that copy and republish content from other sites without adding any original content or value
  • Sites that copy content from other sites, modify it slightly (for example, by substituting synonyms or using automated techniques), and republish it
  • Sites that reproduce content feeds from other sites without providing some type of unique organization or benefit to the user
  • Sites dedicated to embedding content such as video, images, or other media from other sites without substantial added value to the user

Affiliate programs

Our Webmaster Guidelines advise you to create websites with original content that adds value for users. This is particularly important for sites that participate in affiliate programs. Typically, affiliate websites feature product descriptions that appear on sites across that affiliate network. As a result, sites featuring mostly content from affiliate networks can suffer in Google’s search rankings, because they do not have enough added value content that differentiates them from other sites on the web. Added value means additional meaningful content or features, such as additional information about price, purchasing location, or product category.

Google believes that pure, or “thin,” affiliate websites do not provide additional value for web users, especially (but not only) if they are part of a program that distributes its content across a network of affiliates. These sites often appear to be cookie-cutter sites or templates with the same or similar content replicated within the same site, or across multiple domains or languages. Because a search results page could return several of these sites, all with the same content, thin affiliates create a frustrating user experience.

Examples of thin affiliates:

  • Pages with product affiliate links on which the product descriptions and reviews are copied directly from the original merchant without any original content or added value.
  • Product affiliate pages on which the majority of the site is devoted to affiliate content, with only a limited amount of original content or added value for users.

Not every site that participates in an affiliate program is a thin affiliate. Good affiliates add value, for example by offering original product reviews, ratings, navigation of products or categories, and product comparisons. If you participate in an affiliate program, there are a number of steps you can take to help your site stand out and differentiate it:

  • Affiliate program content should form only a minor part of the content of your site if the content adds no additional features.
  • Ask yourself why a user would want to visit your site first rather than visiting the original merchant directly. Make sure your site adds substantial value beyond simply republishing content available from the original merchant.
  • When selecting an affiliate program, choose a product category appropriate for your intended audience. The more targeted the affiliate program is to your site’s content, the more value it will add and the more likely you will be to rank better in Google search results and make money from the program. For example, a well-maintained site about hiking in the Alps could consider an affiliate partnership with a supplier who sells hiking books rather than office supplies.
  • Use your website to build community among your users. This will help build a loyal readership, and can also create a source of information on the subject you are writing about. For example, discussion forums, user reviews, and blogs all offer unique content and provide value to users.
  • Keep your content updated and relevant. Fresh, on-topic information increases the likelihood that your content will be crawled by Googlebot and clicked on by users.

Pure affiliate sites consisting of content that appears in many other places on the web are highly unlikely to perform well in Google search results and may be negatively perceived by search engines. Unique, relevant content provides value to users and distinguishes your site from other affiliates, making it more likely to rank well in Google search results.

Irrelevant keywords

“Keyword stuffing” refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site’s ranking in Google search results. Often these keywords appear in a list or group, or out of context (not as natural prose). Filling pages with keywords or numbers results in a negative user experience, and can harm your site’s ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.

Examples of keyword stuffing include:

  • Lists of phone numbers without substantial added value
  • Blocks of text listing cities and states a webpage is trying to rank for
  • Repeating the same words or phrases so often that it sounds unnatural, for example:
    We sell custom cigar humidors. Our custom cigar humidors are handmade. If you’re thinking of buying a custom cigar humidor, please contact our custom cigar humidor specialists at custom.cigar.humidors@example.com.

Creating pages with malicious behavior

Distributing content or software on your website that behaves in a way other than what a user expected is a violation of Google Webmaster Guidelines. This includes anything that manipulates content on the page in an unexpected way, downloads or executes files on a user’s computer without their consent, or does not comply with the Google Unwanted Software Policy. Google not only aims to give its users the most relevant search results for their queries, but also to keep them safe on the web.

Some examples of malicious behavior include:

  • Changing or manipulating the location of content on a page, so that when a user thinks they’re clicking on a particular link or button the click is actually registered by a different part of the page
  • Injecting new ads or pop-ups on pages, or swapping out existing ads on a webpage with different ads; or promoting or installing software that does so
  • Including unwanted files in a download that a user requested
  • Installing malware, trojans, spyware, ads or viruses on a user’s computer
  • Changing a user’s browser homepage or search preferences without the user’s informed consent

User-generated spam

Google’s Webmaster Guidelines outline best practices for website owners, and the use of techniques that violate our guidelines may cause us to take action on a site. However, not all violations of our Webmaster Guidelines are related to content created intentionally by a site’s owner. Sometimes, spam can be generated on a good site by malicious visitors or users. This spam is usually generated on sites that allow users to create new pages or otherwise add content to the site.

If you receive a warning from Google about this type of spam, the good news is that we generally believe your site is of sufficient quality that we didn’t see a need to take manual action on the whole site. However, if your site has too much user-generated spam on it, that can affect our assessment of the site, which may eventually result in us taking manual action on the whole site.

Some examples of spammy user-generated content include:

  • Spammy accounts on free hosts
  • Spammy posts on forum threads
  • Comment spam on blogs

Since spammy user-generated content can pollute Google search results, we recommend you actively monitor and remove this type of spam from your site. Here are several tips on how to prevent abuse of your site’s public areas.

Ways to Prevent Comment Spam

Comments are a great way for webmasters to build community and readership. Unfortunately, they’re often abused by spammers and nogoodniks, many of whom use scripts or other software to generate and post spam. If you’ve ever received a comment that looked like an advertisement or a random link to an unrelated site, then you’ve encountered comment spam.

This type of spam can be harmful to your site in several ways, including:

  • Low-quality content on some parts of a website can impact the whole site’s rankings.
  • Spam can distract and annoy your users and lower the reputation of your site.
  • Unintended traffic from unrelated content on your site can slow down your site and raise bandwidth costs.
  • Google might remove or demote pages overrun with user-generated spam to protect the quality of our search results.
  • Content dropped by spammers can lead to malicious sites that can negatively affect your users.

It’s important to find ways to protect your website from this kind of malicious spam. Here are some ideas for reducing or preventing comment spam on your website.

Think twice about enabling a guestbook or comments

Pages full of spam don’t give users a good impression of your site. If this feature isn’t adding much value to your users, or if you won’t have time to regularly monitor your comments, consider turning them off. Most blogging software, such as Blogger, will let you turn comments off for individual posts.

Turn on comment and profile creation moderation

Comment moderation means that no comments will appear on your site until they are reviewed and approved. This means you’ll spend more time monitoring your comments, but it can really help to improve the user experience for your visitors. It’s particularly worthwhile if you regularly post about controversial subjects, where emotions can become heated. It’s generally available as a setting in your blogging software, such as Blogger.

Requiring people to validate a real email address when they sign up for a new account can prevent many spam bots from automatically creating accounts. Additionally, you can set up filters to block email addresses that are suspicious or coming from email services that you don’t trust.

Use anti-spam tools

Many commenting systems require users to prove they’re a real live human, not a nasty spamming script. Generally the user is presented with a distorted image (a CAPTCHA) and asked to type the letters or numbers she sees in the image. Some CAPTCHA systems also support audio CAPTCHAs. This is a pretty effective way of preventing comment spam.

Google’s free reCAPTCHA service is easy to implement on your site. In addition, data collected from the service is used to improve the process of scanning text, such as from books, newspapers, or maps. By using reCAPTCHA, you’re not only protecting your site from spammers; you’re helping to digitize the world’s books. You can sign up here if you’d like to implement reCAPTCHA for free on your own site. reCAPTCHA plugins are available for popular applications and programming environments such as WordPress and PHP.
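
As a hedged illustration, the reCAPTCHA v2 checkbox widget can be added to a comment form roughly like this (the form action and site key are placeholders; use the key issued for your own domain):

<script src="https://www.google.com/recaptcha/api.js" async defer></script>
<form action="/post-comment" method="POST">
  <textarea name="comment"></textarea>
  <!-- the widget renders here; replace YOUR_SITE_KEY with your own key -->
  <div class="g-recaptcha" data-sitekey="YOUR_SITE_KEY"></div>
  <button type="submit">Post comment</button>
</form>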

You can also look into external tools that can help you combat comment spam. For example, your content management system might have useful tools available to install. There are also a number of free tools like Project Honeypot that can help prevent and fight user-generated spam on your site. Visit their websites for instructions on how to implement their tools.

Use “nofollow” tags

Together with Yahoo! and MSN, Google introduced the “nofollow” HTML microformat several years ago, and the attribute has been widely adopted. Any link with the rel="nofollow" attribute will not be used to calculate PageRank or determine the relevancy of your pages for a user query. For example, if a spammer includes a link in your comments like this:

<a href="http://www.example.com/">This is a nice site!</a>

it will get converted to:

<a href="http://www.example.com/" rel="nofollow">This is a nice site!</a>

This new link will not be taken into account when calculating PageRank. This won’t prevent spam, but it will avoid problems with passing PageRank and deter spammers from targeting your site. By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments.

Prevent untrusted content from showing in search

If your site allows users to create pages like profile pages, forum threads, or websites, you can deter spam abuse by preventing new or untrusted content from showing up in search.

For example, you can use the noindex meta standard to block access to pages for new and not-yet-trusted users. Like this:

<html>
  <head>
    <meta name="googlebot" content="noindex">

Or you can use the robots.txt standard to temporarily block the page:

User-agent: *
Disallow: /guestbook/newpost.php

Once you believe the user is legitimate and not a spammer, you can remove the crawling or indexing restrictions. There are a number of ways that you can tell if a new user is a spammer, including using signals from your community.

Get help from your community

Your users care about your website and are annoyed by spam too. Let them help you solve the problem.

  • Allow trusted users to flag spam comments or threads when they see them. This type of system can be abused, so you should be careful how it’s implemented. One option is to temporarily remove a post or thread that has crossed a threshold of spam reports until it has been manually reviewed.
  • Creating a user reputation system can not only help you engage users, but it can also help identify spammers. Since many comment spammers want their content in search engines, consider adding a noindex robots meta tag on posts that come from new users that don’t have any reputation in your community. Then, after some time, when the user gains reputation, you can allow their posts to be indexed. This will greatly demotivate spammers from trying to post in your community.

Use a blacklist to prevent repetitive spamming attempts

Once you find a single spammy profile, make it simple to remove any others. For example, if you see several spammy profiles coming from the same IP address, you can add that IP address to a permanent ban list.

Monitor your site for spammy content

One of the best tools for this is Google Alerts. Set up a site: query using commercial or adult keywords that you wouldn’t expect to see on your site. Google Alerts is also a great tool to help detect hacked pages.
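
For example, a hypothetical alert query for a site that should never mention gambling terms:

site:example.com (casino OR "online poker")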

Report spam, paid links, or malware

If you find information in Google’s search results that you believe results from spam, paid links, or malware, here’s how you can help.

Spam

If the site is spam, tell us about it! Google takes spam extremely seriously, and investigates reported instances. These reports are submitted directly to our webspam team and are used to devise scalable solutions to fight spam.

File a spam report (Google Account required)

Paid links

Buying or selling links that pass PageRank can dilute the quality of search results. Participating in link schemes violates Google’s Webmaster Guidelines and can negatively impact a site’s ranking in search results.

If you believe a site is engaged in buying or selling links that pass PageRank, please tell us about it.

Malware

If you believe the site is infected with malware or malicious software, please report it to us so we can take action as necessary.