
20 January 2016

AMP error report preview in Search Console

More and more sites are implementing Accelerated Mobile Pages (AMP) for news content, so we've decided to provide a preview of error reports in Search Console to help you get ready for the upcoming official AMP launch and to gather early feedback from you. You can find these reports under Search Appearance > Accelerated Mobile Pages. The goal is to make it easier to spot issues in your AMP implementation across the whole website. To get started with AMP on Google Search, you'll need to create matching, valid AMP pages where relevant, ensure that they use the NewsArticle schema.org markup, and link them appropriately.
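To make the linking concrete, here's a minimal sketch, assuming Node.js 18+ with its built-in fetch and a hypothetical article URL, that spot-checks whether a canonical page points to its AMP version, whether the AMP page links back, and whether NewsArticle markup is present. The regexes are rough approximations; use a real AMP validator for actual validation.

```typescript
// Hedged sketch: spot-check AMP linking for one article.
// Assumes Node.js 18+ (built-in fetch); URL and regexes are illustrative.
const canonicalUrl = "https://example.com/news/article.html"; // hypothetical

async function checkAmpLinking(): Promise<void> {
  const canonicalHtml = await (await fetch(canonicalUrl)).text();

  // The canonical page should point at its AMP equivalent.
  const ampLink = canonicalHtml.match(
    /<link[^>]+rel=["']amphtml["'][^>]+href=["']([^"']+)["']/i
  );
  if (!ampLink) {
    console.log('No rel="amphtml" link found on the canonical page.');
    return;
  }

  const ampHtml = await (await fetch(ampLink[1])).text();

  // The AMP page should link back to the canonical version...
  if (!/<link[^>]+rel=["']canonical["']/i.test(ampHtml)) {
    console.log('AMP page is missing its rel="canonical" back-link.');
  }
  // ...and carry NewsArticle schema.org markup (often as JSON-LD).
  if (!/NewsArticle/.test(ampHtml)) {
    console.log("No NewsArticle markup found on the AMP page.");
  }
}

checkAmpLinking().catch(console.error);
```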

The AMP error report gives an overview of the overall situation on your site, and then lets you drill down to specific error types and URLs. This process helps you quickly find the most common issues, so that you can systematically address them in your site's AMP implementation (potentially just requiring tweaks in the templates or plugin used for these pages).

Curious about AMP and how it might fit in with your site? Here's a demo preview of AMP in search, more on how AMP works, and a guide to getting started with AMP. If you think AMP would be a good fit for your website, implementing it might ultimately be as easy as installing a plugin in your CMS, so check with your provider. AMP hasn't officially launched in Google Search, so there's still time to get set up -- feedback & patience will be appreciated by your CMS & plugin providers. Stay tuned for more updates on the AMP Project blog.

We're only getting started -- this is a first step at AMP error reporting. We'll be refining this report in the near future, and we'd love to get your feedback to help us. Let us know in the comments here how things work out for you.

Posted by John Mueller, Webmaster Trends Analyst, Google Zurich

20 January 2016

New year, new look: Introducing our new Webmasters website

It’s a new year and a perfect time to share with you our brand new Webmasters website.

We spent a lot of time making this site right for you. We took our own advice by analyzing visitor behavior and conducting user studies to organize the site into categories you’ll find most useful. Thanks to our awesome community and Top Contributors for the valuable feedback during the process!

Our new Google Webmasters website

The site contains support resources to help you fix issues with your website, SEO learning materials to create a high-quality site and improve search rankings, and connection opportunities to stay up-to-date with our team and webmaster community. It also contains new features such as:

  • Webmaster troubleshooter: Need a step-by-step guide to move your site or understand a message in Search Console? The troubleshooter can help with these and other common issues with your site in Google Search and Google Search Console.
  • Popular resources: Looking for popular Google Webmasters YouTube videos, blog posts and forum threads? Here’s a curated list of our top resources – these may differ across languages.
  • Events calendar: Want to meet someone from our team online for office hours or at a live event near you? We have office hours and events in multiple languages around the world. 

Browse around and let us know in the comments below if you stumble onto something new!

Posted by Mary Chen, Senior Webmaster Relations Specialist

18 December 2015

Indexing HTTPS pages by default

At Google, user security has always been a top priority. Over the years, we’ve worked hard to promote a more secure web and to provide a better browsing experience for users. Gmail, Google search, and YouTube have had secure connections for some time, and we also started giving a slight ranking boost to HTTPS URLs in search results last year. Browsing the web should be a private experience between the user and the website, and must not be subject to eavesdropping, man-in-the-middle attacks, or data modification. This is why we’ve been strongly promoting HTTPS everywhere.

As a natural continuation of this, today we'd like to announce that we're adjusting our indexing system to look for more HTTPS pages. Specifically, we’ll start crawling HTTPS equivalents of HTTP pages, even when the former are not linked to from any page. When two URLs from the same domain appear to have the same content but are served over different protocol schemes, we’ll typically choose to index the HTTPS URL if:

  • It doesn’t contain insecure dependencies.
  • It isn’t blocked from crawling by robots.txt.
  • It doesn’t redirect users to or through an insecure HTTP page.
  • It doesn’t have a rel="canonical" link to the HTTP page.
  • It doesn’t contain a noindex robots meta tag.
  • It doesn’t have on-host outlinks to HTTP URLs.
  • The sitemap lists the HTTPS URL, or doesn’t list the HTTP version of the URL.
  • The server has a valid TLS certificate.
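As a rough illustration of a few of these checks, here's a minimal sketch assuming Node.js 18+ with its built-in fetch and a hypothetical URL pair; it approximates the redirect, insecure-dependency, canonical, and noindex conditions, and is in no way Google's actual indexing logic.

```typescript
// Hedged sketch: approximate a few of the HTTPS-preference checks above.
// Assumes Node.js 18+ (built-in fetch); URLs and regexes are illustrative.
const httpUrl = "http://example.com/page"; // hypothetical
const httpsUrl = httpUrl.replace(/^http:/, "https:");

async function checkHttpsCandidate(): Promise<void> {
  // The HTTPS URL shouldn't redirect users to an insecure HTTP page.
  const head = await fetch(httpsUrl, { redirect: "manual" });
  const location = head.headers.get("location") ?? "";
  if (head.status >= 300 && head.status < 400 && location.startsWith("http://")) {
    console.log("Redirects to HTTP:", location);
    return;
  }

  const html = await (await fetch(httpsUrl)).text();

  // Very rough insecure-dependency check: http:// subresources in the markup.
  if (/(?:src|href)=["']http:\/\//i.test(html)) {
    console.log("Possible insecure dependencies found.");
  }

  // The page shouldn't declare the HTTP version as canonical...
  const canonicalTag = html.match(/<link[^>]+rel=["']canonical["'][^>]*>/i)?.[0] ?? "";
  if (canonicalTag.includes('href="http://')) {
    console.log("Canonical points to the HTTP version.");
  }

  // ...and shouldn't carry a noindex robots meta tag.
  if (/<meta[^>]+name=["']robots["'][^>]+noindex/i.test(html)) {
    console.log("Page has a noindex robots meta tag.");
  }
}

checkHttpsCandidate().catch(console.error);
```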

Although our systems prefer the HTTPS version by default, you can also make this clearer for other search engines by redirecting your HTTP site to your HTTPS version and by implementing the HSTS header on your server.
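For illustration, a minimal Node.js sketch of that setup might look like the following; hostnames are hypothetical, and note that browsers only honor Strict-Transport-Security when it arrives over HTTPS, so the header belongs on the HTTPS responses rather than on the redirect itself.

```typescript
// Hedged sketch: 301-redirect all HTTP traffic to the HTTPS equivalent.
// Assumes Node.js; run alongside whatever serves your actual HTTPS traffic.
import http from "node:http";

http
  .createServer((req, res) => {
    const host = req.headers.host ?? "example.com"; // hypothetical fallback
    res.writeHead(301, { Location: `https://${host}${req.url ?? "/"}` });
    res.end();
  })
  .listen(80);

// On the HTTPS server, send the HSTS header on every response, e.g.:
//   res.setHeader("Strict-Transport-Security",
//                 "max-age=31536000; includeSubDomains");
```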

We’re excited about taking another step forward in making the web more secure. By showing users HTTPS pages in our search results, we hope to decrease the risk of users browsing a website over an insecure connection and becoming vulnerable to content injection attacks. As usual, if you have any questions or comments, please let us know in the comments section below or in our webmaster help forums.

Posted by Zineb Ait Bahajji, WTA, and the Google Security and Indexing teams

20 November 2015

Updating Our Search Quality Rating Guidelines

Developing algorithmic changes to search involves a process of experimentation. Part of that experimentation is having evaluators—people who assess the quality of Google’s search results—give us feedback on our experiments. Ratings from evaluators do not determine individual site rankings, but are used to help us understand our experiments. The evaluators base their ratings on guidelines we give them; the guidelines reflect what Google thinks search users want.

In 2013, we published our human rating guidelines to provide transparency on how Google works and to help webmasters understand what Google looks for in web pages. Since that time, a lot has changed: notably, more people have smartphones than ever before and more searches are done on mobile devices today than on computers.

We often make changes to the guidelines as our understanding of what users want evolves, but we haven’t shared an update publicly since then. However, we recently completed a major revision of our rater guidelines to adapt to this mobile world, recognizing that people use search differently when they carry internet-connected devices with them all the time. You can find that update here (PDF).

This is not the final version of our rater guidelines. The guidelines will continue to evolve as search, and how people use it, changes. We won’t be updating the public document with every change, but we will try to publish big changes to the guidelines periodically.

We expect our phones and other devices to do a lot, and we want Google to continue giving users the answers they're looking for—fast!

Posted by Mimi Underwood, Sr. Program Manager, Search Growth & Analysis

9 November 2015

TC Summit 2015: Celebrating our Webmaster Top Contributors!

Two weeks ago, we were extremely lucky to host the 2015 edition of the Top Contributor Summit (#TCsummit), in San Francisco and on Google’s campus in Mountain View, California.

Google Top Contributors are an exceptional group of passionate Google product enthusiasts who share their expertise across our international help forums to support millions of Google users every year. Google’s Top Contributor Summit is an event organised every two years to celebrate these amazing users. This year we had the pleasure of welcoming 526 Top Contributors from all around the world.

Under the motto “Learn, Connect, Celebrate”, Top Contributors had the chance to learn more about our products, get insights on the future of Google, connect with Googlers and Top Contributors from various products and, finally, to celebrate their positive impact on our products and users.

Footage of the 2015 Top Contributor Summit

We also had the chance to hold Webmaster-specific sessions, which gave Googlers the unique opportunity to meet 56 of our Webmaster Top Contributors, representing 20 countries and speaking 14 different languages.


Group photo of the Webmaster Top Contributor community and the Google Webmaster Relations team

Throughout the day, we had in-depth sessions about Google Webmaster guidelines, Search Console and Google Search. We discussed the most common issues that users are bringing up in our international webmaster forums, and listened to the Top Contributors’ feedback regarding our Search tools. We also talked about the Top Contributor program itself and additional opportunities for our users to benefit from both Google and the TCs’ support. Product managers, engineers and search quality Googlers attended the sessions to listen and bring the feedback given by Top Contributors and users on the forum back to their teams.


Webmaster Top Contributors during the in-depth sessions about Google Webmaster guidelines, Search Console and Google Search

At Google, we are grateful to have the incredible opportunity to meet and connect with some of the most insightful members of the webmaster community and get their feedback on such important topics. It helps us be sure that Google keeps focusing on what really matters to webmasters, content creators, and users.

To learn more about our Top Contributor Program, or to give us your own feedback, visit our Top Contributor homepage or join our Webmaster help forum.

Posted by Diogo Botelho and Roberta Remigi, Webmaster Relations team

30 October 2015

Detect and get rid of unwanted sneaky mobile redirects

In many cases, it is OK to show slightly different content on different devices. For example, optimizing the smaller space of a smartphone screen can mean that some content, like images, will have to be modified. Or you might want to store your website’s menu in a navigation drawer (find documentation here) to make mobile browsing easier and more effective. When implemented properly, these user-centric modifications can be understood very well by Google.

The situation is similar when it comes to mobile-only redirects. Redirecting mobile users to improve their mobile experience (like redirecting mobile users from example.com/url1 to m.example.com/url1) is often beneficial to them. But sneakily redirecting mobile users to different content is bad for user experience and is against Google’s webmaster guidelines.


A frustrating experience: The same URL shows up in search results pages on desktop and on mobile. When a user clicks on this result on their desktop computer, the URL opens normally. However, when clicking on the same result on a smartphone, a redirect happens and an unrelated URL loads.

Who implements these mobile-only sneaky redirects?

There are cases where webmasters knowingly decide to put into place redirection rules for their mobile users. This is typically a webmaster guidelines violation, and we do take manual action against it when it harms Google users’ experience (see last section of this article).   

But we’ve also observed situations where mobile-only sneaky redirects happen without site owners being aware of it:

  • Advertising schemes that redirect mobile users specifically
    A script/element installed to display ads and monetize content might be redirecting mobile users to a completely different site without the webmaster being aware of it.
  • Mobile redirect as a result of the site being a target of hacking
    In other cases, if your website has been hacked, a potential result can be redirects to spammy domains for mobile users only.

How do I detect if my site is doing sneaky mobile redirects?

  1. Check if you are redirected when you navigate to your site on your smartphone
    We recommend checking the mobile user experience of your site by visiting your pages from Google search results with a smartphone. When debugging, mobile emulation in desktop browsers is handy, mostly because you can test for many different devices. You can, for example, do it straight from your browser in Chrome, Firefox or Safari (for the latter, make sure you have enabled the “Show Develop menu in menu bar” feature). A scripted check is also sketched after this list.
  2. Listen to your users
    Your users may see your site differently than you do. It’s always important to pay attention to user complaints, so you can hear of any issue related to mobile UX.
  3. Monitor your users in your site’s analytics data
    Unusual mobile user activity can be detected by looking at the data held in your website's analytics. For example, the average time spent on your site by mobile users is a good signal to watch: if, all of a sudden, your mobile users (and only them) start spending much less time on your site than they used to, there might be an issue related to mobile redirections.

    To be aware of wide changes in mobile user activity as soon as they happen, you can set up Google Analytics alerts. For example, you can set an alert to be warned of a sharp drop in average time spent on your site by mobile users, or a drop in the number of mobile users (keeping in mind that big changes in those metrics are not a clear, direct signal that your site is doing sneaky mobile redirects).
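As a rough companion to the manual check in step 1, here's a minimal sketch, assuming Node.js 18+ with its built-in fetch, that compares where a page lands for a desktop versus a mobile User-Agent. The URL and User-Agent strings are illustrative, and only server-side redirects are caught; JavaScript-based redirects still require a real browser or emulation.

```typescript
// Hedged sketch: compare final landing URLs for desktop vs. mobile UAs.
// Assumes Node.js 18+ (built-in fetch); UA strings and URL are illustrative.
const targetUrl = "https://example.com/url1"; // hypothetical page to test

const userAgents = {
  desktop: "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
  mobile: "Mozilla/5.0 (Linux; Android 10; Pixel 3) AppleWebKit/537.36 Mobile",
};

// Follow redirects and report where the request finally lands.
async function finalUrl(ua: string): Promise<string> {
  const res = await fetch(targetUrl, { headers: { "User-Agent": ua } });
  return res.url;
}

async function compare(): Promise<void> {
  const desktop = await finalUrl(userAgents.desktop);
  const mobile = await finalUrl(userAgents.mobile);
  if (desktop !== mobile) {
    console.log(`Possible mobile-only redirect: ${desktop} vs. ${mobile}`);
  } else {
    console.log("Desktop and mobile land on the same URL.");
  }
}

compare().catch(console.error);
```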

I’ve detected sneaky redirects for my mobile users, and I did not set them up: what do I do?

  1. Make sure that your site is not hacked.
    Check the Security Issues section in Search Console; if we have detected a hack, you should find information there.
    Review our additional resources on typical symptoms of hacked sites, and our case studies on hacked sites.
  2. Audit third-party scripts/elements on your site
    If your site is not hacked, then we recommend you take the time to investigate if third-party scripts/elements are causing the redirects. You can follow these steps:
    A. Remove the third-party scripts/elements you do not control from the redirecting page(s), one at a time.
    B. Check your site on a mobile device or through emulation between each script/element removal, and see when the redirect stops.
    C. If you think a particular script/element is responsible for the sneaky redirect, consider removing it from your site, and debugging the issue with the script/element provider.

Last Thoughts on Sneaky Mobile Redirects

It's a violation of the Google Webmaster Guidelines to redirect a user to a page with the intent of displaying content other than what was made available to the search engine crawler (more information on sneaky redirects). To ensure quality search results for our users, the Google Search Quality team can take action on such sites, including removal of URLs from our index.  When we take manual action, we send a message to the site owner via Search Console. Therefore, make sure you’ve set up a Search Console account.

Be sure to choose advertisers who are transparent about how they handle user traffic, so that you avoid unknowingly redirecting your own users. If you are interested in trust-building in the online advertising space, check out industry-wide best practices for participating in ad networks. For example, the Trustworthy Accountability Group’s (Interactive Advertising Bureau) Inventory Quality Guidelines are a good place to start. There are many ways to monetize your content with mobile solutions that provide a high quality user experience; be sure to use them.

If you have questions or comments about mobile-only redirects, join us in our Google Webmaster Support forum.

Written by Vincent Courson & Badr Salmi El Idrissi, Search Quality team

15 October 2015

Deprecating our AJAX crawling scheme

tl;dr: We are no longer recommending the AJAX crawling proposal we made back in 2009.

In 2009, we made a proposal to make AJAX pages crawlable. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. Because "crawlers … [were] not able to see any content … created dynamically," we proposed a set of practices that webmasters can follow in order to ensure that their AJAX-based applications are indexed by search engines.

Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site's CSS or JS files.

Since the assumptions for our 2009 proposal are no longer valid, we recommend following the principles of progressive enhancement. For example, you can use the History API pushState() to ensure accessibility for a wider range of browsers (and our systems).
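For example, a minimal progressive-enhancement sketch in the browser could look like this; the selectors are illustrative, plain links keep working without JavaScript, and the naive popstate handler simply reloads where a real app would re-render its state.

```typescript
// Hedged sketch: enhance ordinary links with fetch + pushState.
// Without JavaScript, the links still navigate to real, crawlable URLs.
document.querySelectorAll<HTMLAnchorElement>("a.ajax-nav").forEach((link) => {
  link.addEventListener("click", async (event) => {
    event.preventDefault();
    const response = await fetch(link.href); // same real URL a crawler sees
    document.querySelector("#content")!.innerHTML = await response.text();
    history.pushState({}, "", link.href); // update the address bar, no reload
  });
});

// Naive back/forward handling: reload the real URL.
window.addEventListener("popstate", () => location.reload());
```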

Questions and answers

Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you've deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you're making the next update for your site. Instead of the _escaped_fragment_ URLs, we'll generally crawl, render, and index the #! URLs (e.g., example.com/#!page=1 directly, rather than example.com/?_escaped_fragment_=page=1).

Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you're building a new site or restructuring an existing site, simply avoid introducing _escaped_fragment_ URLs.

Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?
A: In general, websites shouldn't pre-render pages only for Google -- we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user's experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.

If you have any questions, feel free to post them here, or in the webmaster help forum.

Posted by Kazushi Nagayama, Search Quality Analyst

6 October 2015

An update on how we tackle hacked spam

Recently we have started rolling out a series of algorithmic changes that aim to tackle hacked spam in our search results. A huge number of legitimate sites are hacked by spammers and used for abusive behavior such as malware downloads, funneling traffic to low-quality sites, porn, and the marketing of counterfeit goods or illegal pharmaceutical drugs.

Website owners that don’t implement standard best practices for security can leave their websites vulnerable to hacking. Affected sites include government sites, universities, small businesses, company websites, restaurants, hobby organizations, conferences, and more. Spammers and cyber-criminals purposely seek out those sites and inject pages with malicious content in an attempt to gain rank and traffic in search engines.

We are aggressively targeting hacked spam in order to protect users and webmasters.

The algorithmic changes will eventually impact roughly 5% of queries, depending on the language. As we roll out the new algorithms, users might notice that for certain queries, only the most relevant results are shown, reducing the number of results returned.

This is due to the large amount of hacked spam being removed, and should improve in the near future. We are continuing to tune our systems to weed out the bad content while retaining the organic, legitimate results. If you have any questions about these changes, or want to give us feedback on these algorithms, feel free to drop by our Webmaster Help Forums.

Posted by Ning Song, Software Engineer

29 September 2015

First Click Free update

Around ten years ago when we introduced a policy called “First Click Free,” it was hard to imagine that the always-on, multi-screen, multiple device world we now live in would change content consumption so much and so fast. The spirit of the First Click Free effort was - and still is - to help users get access to high quality news with a minimum of effort, while also ensuring that publishers with a paid subscription model get discovered in Google Search and via Google News.

In 2009, we updated the FCF policy to allow a limit of five articles per day, in order to protect publishers who felt some users were abusing the spirit of this policy. Recently we have heard from publishers about the need to revisit these policies to reflect the mobile, multiple device world. Today we are announcing a change to First Click Free, lowering the limit to three articles a day. This change applies to both Google Search and Google News.

Google wants to play its part in connecting users to quality news and in connecting publishers to users. We believe the FCF is important in helping achieve that goal, and we will periodically review and update these policies as needed so they continue to benefit users and publishers alike. We are listening and always welcome feedback.

Questions and answers about First Click Free

Q: Do the rest of the old guidelines still apply?
A: Yes, please check the guidelines for Google News as well as the guidelines for Web Search and the associated blog post for more information.

Q: Can I apply First Click Free to only a section of my site / only for Google News (or only for Web Search)?
A: Sure! Just make sure that both Googlebot and users from the appropriate search results can view the content as required. Keep in mind that showing Googlebot the full content of a page while showing users a registration page would be considered cloaking.

Q: Do I have to sign up to use First Click Free?
A: Please let us know about your decision to use First Click Free if you are using it for Google News. There's no need to inform us of the First Click Free status for Google Web Search.

Q: What is the preferred way to count a user's accesses?
A: Since there are many different site architectures, we believe it's best to leave this up to the publisher to decide.

(Please see our related blog post for more information on First Click Free for Google News.)

Posted by John Mueller, Google Switzerland

24 September 2015

Helping hacked sites with reconsideration requests

Thus far in 2015 we have seen a 180% increase in the number of sites getting hacked and a 300% increase in hacked site reconsideration requests. While we are working hard to help webmasters prevent hacks in the first place through efforts such as blog posts and #NoHacked campaigns, we recognize that our reconsideration process is an important part of making recovery from a hack faster and easier. Here's what we've been focusing on:

1) Improved communication
2) Better tools
3) Continuous feedback loop

1. Improving communications with webmasters of hacked sites

Last year we launched the "Note from your reviewer" feature in our reconsideration process. This feature enables us to give specific examples and advice tailored to each case in response to a reconsideration request. Thus far in 2015 we have sent a customized note to over 70% of webmasters whose hacked reconsideration request was rejected, with specific guidance on where and how to find the remaining hacked content. The results have been encouraging, as we've seen a 29% decrease in the average amount of time from when a site receives a hacked manual action to the time when the webmaster cleans up and the manual action is removed.


Example "note from your reviewer" with detailed guidance and a custom example of hacked text and a hacked page

We have also completed our second #NoHacked campaign, with more detailed help on preventing and recovering from hacks. In the campaign, we focused on ways to improve the security on your site as well as ways to fix your site if it was compromised. You can catch up by reading the first post.

2. Better tools including auto-removal of some hacked manual actions

Last year we launched the "Fetch and Render" feature to the Fetch as Google tool, which allows you to see the website exactly as Googlebot sees it. This functionality is useful in recovering from a hack, since many hackers inject cloaked content that's not visible to the normal user but obvious to search engine crawlers like Googlebot.

This year we also launched the Hacked Sites Troubleshooter in 23 languages which guides webmasters through some basic steps to recover from a hack. Let us know if you have found the troubleshooter useful as we're continuing to expand its features and impact.

Finally, we're beta testing the automated removal of some hacked manual actions. In Search Console if Google sees a "Hacked site" manual action under "Partial matches", and our systems detect that the hacked content is no longer present, in some cases we will automatically remove that manual action. We still recommend that you submit a reconsideration request if you see any manual actions, but don't be surprised if a "Hacked site" manual action disappears and saves you the trouble!


Example of a Hacked site manual action on a Partial match: if our systems detect that the hacked content is no longer present, in some cases we will automatically remove the manual action

3. Soliciting your feedback and taking action

Our improved communication and tools have come directly from feedback we've collected from webmasters of sites that have been hacked. For example, earlier this year we hosted webmasters who have been through the hacked reconsideration process in both Mountain View, USA and Dublin, Ireland for brainstorming sessions. We also randomly sampled webmasters that had been through a hacked reconsideration. We found that while only 15% of webmasters were dissatisfied with the process, the main challenges they faced were unclear notification that their site had been hacked and unclear guidance on how to resolve the hack. This feedback contributed directly to our more detailed blog post on hacked recovery, and to much of the content in our latest #NoHacked campaign.

(Hi-res version: https://goo.gl/photos/TkvkwYt23MpVHBwz6)


Googlers in Dublin brainstorming ways to improve the hacked reconsideration process after meeting with local webmasters

We will continue to support webmasters of hacked sites through the methods detailed above, in addition to the Webmasters help for hacked sites portal and the security, malware & hacked sites section of our forum. And we'd love to hear your ideas in the comments below on how Google can better support webmasters recovering from a hacked website!

Posted by Josh Feira and Yuan Niu, Search Quality Team
