EsheleD Marketing & Technology

22 Jun 2016

More security notifications via Google Analytics

Over a year ago, we launched Safe Browsing alerts in Google Analytics to warn users about websites identified as compromised and being used for distributing malware or phishing attacks. Since launch, we’ve alerted more than 24,000 Google Analytics property owners whose websites had been compromised by 3rd parties.

Today, we’re happy to announce that we’ll be expanding our set of alerts in Google Analytics by adding notifications about sites hacked for spam in violation of our Webmaster Guidelines. In the unlikely event of your site being compromised by a 3rd party, the alert will flag the affected domain right within the Google Analytics UI and will point you to resources to help you resolve the issue.

An example of a Google Analytics alert for a compromised site.

Website security is still something to take very seriously. In September of last year, we shared that we’d seen a 180% increase in sites getting hacked for spam compared to the previous year. Our research has shown that direct contact with website owners increases the likelihood of remediation to over 75%. This new alert gives us an additional method for letting website owners know that their site may be compromised.

What can you do to prevent your site from being compromised?

Prevention plays an important role in keeping your site, and your users, safe. We’ve recently published tips and best practices to protect your content on the web; we recommend them to any site, large or small.

Verify your site in Search Console.

Aside from receiving alerts in Google Analytics or via Search results labels when your site is compromised, we recommend taking the extra step to verify your site in Search Console.

The Security Issues feature will alert you when things don’t look good and will pinpoint the issues we’ve uncovered on your properties. We have detailed a recovery journey in our step-by-step hacked site recovery guide to help you resolve the issue and keep your website and users safe.

We’re always looking for ideas and feedback—feel free to use the comments section below. For any support questions, visit google.com/webmasters and our support communities available in 14 languages.

Posted by Giacomo Gnecchi Ruscone, Search Outreach and Anthony Medeiros, Google Analytics

2 Jun 2016

Search at I/O 16 Recap: Eight things you don’t want to miss

Cross-posted from the Google Developers Blog

Two weeks ago, over 7,000 developers descended upon Mountain View for this year’s Google I/O, and one clear takeaway was that it’s truly an exciting time for Search. People come to Google billions of times per day to fulfill their daily information needs. We’re focused on creating features and tools that we believe will help users and publishers make the most of Search in today’s world. As Google continues to evolve and expand to new interfaces, such as the Google Assistant and Google Home, we want to make it easy for publishers to integrate and grow with Google.

In case you didn’t have a chance to attend all our sessions, we put together a recap of all the Search happenings at I/O.

1: Introducing rich cards

We announced rich cards, a new Search result format that builds on rich snippets and uses schema.org markup to display content in an even more engaging and visual format. Rich cards are available in English for recipes and movies, and we’re excited to roll them out for more content categories soon. To learn more, browse the new gallery with screenshots and code samples of each markup type, or watch our rich cards DevByte.

2: New Search Console reports

We want to make it easy for webmasters and developers to track and measure their performance in search results. We launched a new report in Search Console to help developers confirm that their rich card markup is valid. In the report we highlight “enhanceable cards,” which are cards that can benefit from marking up more fields. The new Search Appearance filter also makes it easy for webmasters to filter their traffic by AMP and rich cards.

3: Real-time indexing

Users are searching for more than recipes and movies: they’re often coming to Search to find fresh information about what’s happening right now. This insight kickstarted our efforts to use real-time indexing to connect users searching for real-time events with fresh content. Instead of waiting for content to be crawled and indexed, publishers will be able to use the Google Indexing API to trigger the indexing of their content in real time. It’s still in its early days, but we’re excited to launch a pilot later this summer.
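
The pilot’s exact interface wasn’t public at the time of writing, so treat the following as a hypothetical sketch: it models a publish-style notification on the request shape of the Indexing API that Google later documented. The endpoint, payload fields, and token handling here are assumptions, not the pilot’s actual API.

import requests

# Hypothetical sketch: tell the indexing service that a URL was added or
# updated, so it can be re-crawled without waiting for a scheduled crawl.
# The endpoint and payload are modeled on the later public Indexing API and
# may not match the pilot described above.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def notify_url_updated(url, access_token):
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": "Bearer " + access_token},
        json={"url": url, "type": "URL_UPDATED"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

# Example (the access token would come from an OAuth 2.0 service account flow):
# notify_url_updated("https://example.com/live/game-7-updates", token)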

4: Getting up to speed with Accelerated Mobile Pages

We provided an update on our use of AMP, an open source effort to speed up the mobile web. Google Search uses AMP to enable instant-loading content. Speed matters: over 40% of users abandon a page that takes more than three seconds to load. We announced that we’re bringing AMPed news carousels to the iOS and Android Google apps, as well as experimenting with combining AMP and rich cards. Stay tuned for more via our blog and GitHub page.
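
On the implementation side, Google discovers the AMP version of a page through a link rel="amphtml" reference on the canonical page (with a rel="canonical" pointing back from the AMP page). As a quick sanity check that your pages advertise their AMP counterparts, here is a minimal sketch; the example URL is a placeholder.

from html.parser import HTMLParser
from urllib.request import urlopen

# Minimal sketch: check whether a canonical page declares an AMP version
# via <link rel="amphtml" href="...">.
class AmpLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.amp_href = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "amphtml":
            self.amp_href = attrs.get("href")

def find_amp_version(url):
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    finder = AmpLinkFinder()
    finder.feed(html)
    return finder.amp_href

# print(find_amp_version("https://example.com/article"))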

In addition to the sessions, attendees could talk directly with Googlers at the Search & AMP sandbox.

5: A new and improved Structured Data Testing Tool

We updated the popular Structured Data Testing Tool. The tool is now tightly integrated with the DevSite Search Gallery and the new Search Preview service, which lets you preview how your rich cards will look on the search results page.

6: App Indexing got a new home (and new features)

We announced App Indexing’s migration to Firebase, Google’s unified developer platform. Watch the session to learn how to grow your app with Firebase App Indexing.

7: App streaming

App streaming is a new way for Android users to try out games without having to download and install the app -- and it’s already available in Google Search. Check out the session to learn more.

8: Revamped documentation

We also revamped our developer documentation, organizing our docs around topical guides to make them easier to follow.

Thanks to all who came to I/O -- it’s always great to talk directly with developers and hear about experiences first-hand. And whether you came in person or tuned in from afar, let’s continue the conversation on the webmaster forum or during our office hours, hosted weekly via hangouts-on-air.

Posted by Fabian Schlup, Software Engineer

23 May 2016

Tie your sites together with property sets in Search Console

Mobile app, mobile website, desktop website -- how do you track their combined visibility in search? Until now, you've had to track all of these statistics separately. Search Console is introducing the concept of "property sets," which let you combine multiple properties (both apps and sites) into a single group to monitor the overall clicks and impressions in search within a single report.

It's easy to get started:

  1. Create a property set
  2. Add the properties you're interested in
  3. The data will start being collected within a few days
  4. Profit from the new insights in Search Analytics!

Property Sets will treat all URIs from the properties included as a single presence in the Search Analytics feature. This means that Search Analytics metrics aggregated by host will be aggregated across all properties included in the set. For example, at a glance you'll get the clicks and impressions of any of the sites in the set for all queries.
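
Property sets do this aggregation for you in the Search Console UI. If you would like to approximate the same combined view programmatically, one option is to query each property through the Search Console API (webmasters v3) and sum the results yourself. A rough sketch follows, assuming authorized credentials and that every property is verified for your account; the sketch queries individual properties rather than the set itself.

from googleapiclient.discovery import build

# Rough sketch: approximate a property set's combined clicks and impressions
# by querying the Search Console API for each property and summing the totals.
def combined_totals(creds, properties, start_date, end_date):
    service = build("webmasters", "v3", credentials=creds)
    totals = {"clicks": 0, "impressions": 0}
    for site_url in properties:
        response = service.searchanalytics().query(
            siteUrl=site_url,
            body={"startDate": start_date, "endDate": end_date},
        ).execute()
        for row in response.get("rows", []):
            totals["clicks"] += row.get("clicks", 0)
            totals["impressions"] += row.get("impressions", 0)
    return totals

# Example:
# combined_totals(creds, ["https://example.com/", "https://m.example.com/"],
#                 "2016-05-01", "2016-05-23")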

This feature will work for any kind of property in Search Console. Use it to gain an overview of your international websites, of mixed HTTP / HTTPS sites, of different departments or brands that run separate websites, or monitor the Search Analytics of all your apps together: all of that's possible with this feature.

Don't just listen to us; here's what we heard from one of the beta-testers:

It was one of my most important demands since the beginning of Webmaster Tools / Search Console. And I love the way it is given to us. I see that the remarks of beta-testers have also been understood by Google engineers. So thank you so much! -- Olivier Andrieu (Abondance)

We'll be rolling this out over the next couple of days. If you have multiple properties verified in Search Console, we hope this feature makes it easier for you to keep track. If you have any questions, feedback, or ideas, please come and visit us in the webmaster help forum, or read the help documentation for this new feature!

Posted by Ofir Roval, Search Console Team

P.S. Want to become a beta-tester for future features? Just sign up to become a beta-tester and we'll get in touch.

18 May 2016

Introducing rich cards

Rich cards are a new Search result format building on the success of rich snippets. Just like rich snippets, rich cards use schema.org structured markup to display content in an even more engaging and visual format, with a focus on providing a better mobile user experience.

Evolution of search results for queries like [peanut butter cookies recipe]: with rich cards, results are presented in carousels that are easy to browse by scrolling left and right. Carousels can contain cards all from the same site or from multiple sites.

For site owners, this is a new opportunity to stand out in Search results and attract more targeted users to your page. For example, if you have a recipe site, you can build a richer preview of your content with a prominent image for each dish. This visual format helps users find what they want right away, so you're getting users who specifically want that especially delicious cookie recipe you have.

We’re starting to show rich cards for two content categories: recipes and movies. They will appear initially in mobile search results in English on google.com. We’re actively experimenting with opportunities to provide more publishers with a rich preview of their content.

We’ve built a comprehensive set of tools and completely updated our developer documentation to take site owners and developers from initial exploration through implementation to performance monitoring.

Explore rich card types and identify where your content fits

Browse the new gallery with screenshots and code samples of each markup type.

Test and tweak your markup

We strongly recommend using JSON-LD in your implementation.

  • Find out which fields are essential to mark up in order for a rich card to appear. We’ve also listed additional fields that can enhance your rich cards.
  • See a preview in the revamped Structured Data Testing Tool of how the rich card might appear in Search (currently available for recipes and movies).
  • Use the Structured Data Testing Tool to see errors as you tweak your markup in real time.
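
To make the JSON-LD recommendation concrete, here is a small sketch that assembles a schema.org Recipe object and prints the script tag you would embed in the page’s HTML. The recipe values are placeholders; the gallery and the Structured Data Testing Tool remain the reference for which fields matter for your content type.

import json

# Sketch: build a schema.org Recipe object as JSON-LD and print the script
# tag to embed in the page's HTML. All values below are placeholders.
recipe = {
    "@context": "http://schema.org",
    "@type": "Recipe",
    "name": "Peanut Butter Cookies",
    "image": "https://example.com/photos/peanut-butter-cookies.jpg",
    "author": {"@type": "Person", "name": "Example Baker"},
    "datePublished": "2016-05-18",
    "description": "Soft peanut butter cookies ready in under 30 minutes.",
    "prepTime": "PT10M",
    "cookTime": "PT15M",
    "recipeIngredient": ["1 cup peanut butter", "1 cup sugar", "1 egg"],
    "recipeInstructions": "Mix everything, shape into balls, and bake at 350F.",
}

print('<script type="application/ld+json">')
print(json.dumps(recipe, indent=2))
print("</script>")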

Keep track of coverage and debug errors

Check how many of your rich cards are indexed in the new Search Console Rich Cards report.

  • Keep an eye out for errors (also listed in the Rich Cards report). Each error example links directly to the Structured Data Testing tool so you can test it.
  • Submit a sitemap to help us discover all your marked-up content.
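
If you update your sitemap as you roll out markup, you can also prompt a re-fetch by pinging Google’s sitemap endpoint. A minimal sketch, with a placeholder sitemap URL:

from urllib.parse import quote
from urllib.request import urlopen

# Minimal sketch: ask Google to re-fetch an updated sitemap.
# Replace the placeholder with your own sitemap URL.
def ping_sitemap(sitemap_url):
    ping_url = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
    with urlopen(ping_url, timeout=10) as response:
        return response.status  # 200 means the ping was received

print(ping_sitemap("https://example.com/sitemap.xml"))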

Find opportunities for growth

In the Rich Cards report, you'll see which cards can be enhanced by marking up additional fields.

Monitor performance

A new “Rich results” filter in Search Analytics (currently in a closed beta) will help you track how your rich cards and rich snippets are doing in search: you’ll be able to drill down and see clicks and impressions for both.

Q: Can I keep my existing rich snippets markup?

A: Yes, you can! We’ll keep you posted as the rich result ecosystem evolves.

Q: What about the Structured Data report in Search Console?

A: The Structured Data report will continue to show only top-level entities for the existing rich snippets (Product, Recipe, Review, Event, SoftwareApplication, Video, News article) and for any new categories (e.g., Movies). We plan to migrate all errors from the Structured Data report into the Rich Cards report.

Q: What if I use the wrong markup?

A: Technical and quality guidelines apply for rich cards as they do for rich snippets. We will enforce them as before.

Learn more about rich cards in the Search and the mobile content ecosystem session at Google I/O (which will be live streamed!) or on the Developer site. If you have more questions, find us in the dedicated Structured data section of our forum, on Twitter or on Google+.

Posted by Na'ama Zohary, Search Console Team, and Elliott Ng, Product Management Director, Search Ecosystem

17 May 2016

A new mobile friendly testing tool

Mobile is close to our heart - we love seeing more and more sites make their content available in useful & accessible ways for mobile users. To help keep the ball rolling, we've now launched a new Mobile Friendly Test.

The new tool is linked from Search Console's mobile usability report, or it's available directly at https://search.google.com/search-console/mobile-friendly.

The updated tool provides us with room to continue to improve on its functionality, and over time, we expect it to replace the previous Mobile Friendly Test. Additionally, of course this tool also works well on your smartphone, if you need to double-check something there!

We'd like to invite you to take it for a spin: try your website and other sites that you're curious about! Let us know how you like it - either here in the comments or in our webmaster help forums.

Posted by Yaniv Loewenstein, Search Console Team

3 May 2016

How we fought webspam in 2015

Search is a powerful tool. It helps people to find, share, and access an amazing wealth of content regardless of how they connect or where they are located. As part of Google’s search quality team, we work hard to ensure that searchers see high quality search results—and not webspam. We fight spam through a combination of algorithms and manual reviews to ensure that sites don’t rise in search results through deceptive or manipulative behavior, especially because those sites could harm or mislead users.

Below are some of the webspam insights we gathered in 2015, including trends we’ve seen, what we’re doing to fight spam and protect against those trends, and how we’re working with you to make the web better.

2015 webspam trends

  • We saw a huge number of websites being hacked – a 180% increase compared to the previous year. Stay safe and take preventative measures to protect your content on the web.

  • We saw an increase in the number of sites with thin, low quality content. Such content contains little or no added value and is often scraped from other sites.

2015 spam-fighting efforts

  • As always, our algorithms addressed the vast majority of webspam, improving search quality for users. One of our algorithmic updates helped to reduce the amount of hacked spam in search results.

  • The rest of the spam was tackled manually. We sent more than 4.3 million messages to webmasters to notify them of manual actions we took on their sites and to help them identify the issues.
  • We saw a 33% increase in the number of sites that went through spam clean-up efforts and completed a successful reconsideration process.

Working with users and webmasters for a better web

  • More than 400,000 spam reports were submitted by users around the world. After prioritizing the reports, we acted on 65% of them, and considered 80% of those acted upon to be spam. Thanks to all who submitted reports and contributed towards a cleaner web ecosystem!

  • We conducted more than 200 online office hours and live events around the world in 17 languages. These are great opportunities for us to help webmasters with their sites and for them to share helpful feedback with us as well.
  • The webmaster help forum continued to be an excellent source of webmaster support. Webmasters had tens of thousands of questions answered, including over 35,000 by users designated as Webmaster Top Contributors. Also, 56 Webmaster Top Contributors joined us at our Top Contributor Summit to discuss how to provide users and webmasters with better support and tools. We’re grateful for our awesome Top Contributors and their tremendous contributions!

We’re continuously improving our spam-fighting technology and working closely with webmasters and users to foster and support a high-quality web ecosystem. (In fact, fighting webspam is one of the many ways we maintain search quality at Google.) Thanks for helping to keep spammers away so users can continue accessing great content in Google Search.

Posted by Kiyotaka Tanaka and Mary Chen, User Education and Search Outreach

20 Apr 2016

Helping webmasters re-secure their sites

(Cross-posted from the Google Security Blog.)
Every week, over 10 million users encounter harmful websites that deliver malware and scams. Many of these sites are compromised personal blogs or small business pages that have fallen victim due to a weak password or outdated software. Safe Browsing and Google Search protect visitors from dangerous content by displaying browser warnings and labeling search results with 'this site may harm your computer'. While this helps keep users safe in the moment, the compromised site remains a problem that needs to be fixed.

Unfortunately, many webmasters for compromised sites are unaware anything is amiss. Worse yet, even when they learn of an incident, they may lack the security expertise to take action and address the root cause of compromise. Quoting one webmaster from a survey we conducted, “our daily and weekly backups were both infected” and even after seeking the help of a specialist, after “lots of wasted hours/days” the webmaster abandoned all attempts to restore the site and instead refocused his efforts on “rebuilding the site from scratch”.

In order to find the best way to help webmasters clean-up from compromise, we recently teamed up with the University of California, Berkeley to explore how to quickly contact webmasters and expedite recovery while minimizing the distress involved. We’ve summarized our key lessons below. The full study, which you can read here, was recently presented at the International World Wide Web Conference.

When Google works directly with webmasters during critical moments like security breaches, we can help 75% of webmasters re-secure their content. The whole process takes a median of 3 days. This is a better experience for webmasters and their audience.

How many sites get compromised?

Number of freshly compromised sites Google detects every week.

Over the last year Google detected nearly 800,000 compromised websites—roughly 16,500 new sites every week from around the globe. Visitors to these sites are exposed to low-quality scam content and malware via drive-by downloads. While browser and search warnings help protect visitors from harm, these warnings can at times feel punitive to webmasters who learn only after-the-fact that their site was compromised. To balance the safety of our users with the experience of webmasters, we set out to find the best approach to help webmasters recover from security breaches and ultimately reconnect websites with their audience.

Finding the most effective ways to aid webmasters

  1. Getting in touch with webmasters: One of the hardest steps on the road to recovery is first getting in contact with webmasters. We tried three notification channels: email, browser warnings, and search warnings. For webmasters who proactively registered their site with Search Console, we found that email communication led to 75% of webmasters re-securing their pages. When we didn’t know a webmaster’s email address, browser warnings and search warnings helped 54% and 43% of sites clean up respectively.
  2. Providing tips on cleaning up harmful content: Attackers rely on hidden files, easy-to-miss redirects, and remote inclusions to serve scams and malware. This makes clean-up increasingly tricky. When we emailed webmasters, we included tips and samples of exactly which pages contained harmful content. This, combined with expedited notification, helped webmasters clean up 62% faster compared to no tips—usually within 3 days.
  3. Making sure sites stay clean: Once a site is no longer serving harmful content, it’s important to make sure attackers don’t reassert control. We monitored recently cleaned websites and found 12% were compromised again in 30 days. This illustrates the challenge involved in identifying the root cause of a breach versus dealing with the side-effects.

Making security issues less painful for webmasters—and everyone

We hope that webmasters never have to deal with a security incident. If you are a webmaster, there are some quick steps you can take to reduce your risk. We’ve made it easier to receive security notifications through Google Analytics as well as through Search Console. Make sure to register for both services. Also, we have laid out helpful tips for updating your site’s software and adding additional authentication that will make your site safer.

If you’re a hosting provider or building a service that needs to notify victims of compromise, understand that the entire process is distressing for users. Establish a reliable communication channel before a security incident occurs, make sure to provide victims with clear recovery steps, and promptly reply to inquiries so the process feels helpful, not punitive.

As we work to make the web a safer place, we think it’s critical to empower webmasters and users to make good security decisions. It’s easy for the security community to be pessimistic about incident response being ‘too complex’ for victims, but as our findings demonstrate, even just starting a dialogue can significantly expedite recovery.

Posted by Kurt Thomas and Yuan Niu, Spam & Abuse Research

12 Apr 2016

No More Deceptive Download Buttons

(Cross-posted from the Google Security Blog.)
In November, we announced that Safe Browsing would protect you from social engineering attacks - deceptive tactics that try to trick you into doing something dangerous, like installing unwanted software or revealing your personal information (for example, passwords, phone numbers, or credit cards). You may have encountered social engineering in a deceptive download button, or an image ad that falsely claims your system is out of date. Today, we’re expanding Safe Browsing protection to protect you from such deceptive embedded content, like social engineering ads.

Consistent with the social engineering policy we announced in November, embedded content (like ads) on a web page will be considered social engineering when it either:

  • Pretend to act, or look and feel, like a trusted entity — like your own device or browser, or the website itself. 
  • Try to trick you into doing something you’d only do for a trusted entity — like sharing a password or calling tech support.
Below are some examples of deceptive content, shown via ads:

This image claims that your software is out-of-date to trick you into clicking “update”.

This image mimics a dialogue from the FLV software developer -- but it does not actually originate from this developer.

These buttons seem like they will produce content that relates to the site (like a TV show or sports video stream) by mimicking the site’s look and feel. They are often not distinguishable from the rest of the page.

Our fight against unwanted software and social engineering is still just beginning. We'll continue to improve Google's Safe Browsing protection to help more people stay safe online.

Will my site be affected?

If visitors to your website consistently see social engineering content, Google Safe Browsing may warn users when they visit the site. If your site is flagged for containing social engineering content, you should troubleshoot with Search Console. Check out our social engineering help for webmasters.

Posted by Lucas Ballard, Safe Browsing Team

17 Mar 2016

Continuing to make the web more mobile friendly

Getting good, relevant answers when you search shouldn’t depend on what device you’re using. You should get the best answer possible, whether you’re on a phone, desktop or tablet. Last year, we started using mobile-friendliness as a ranking signal on mobile searches. Today we’re announcing that beginning in May, we’ll start rolling out an update to mobile search results that increases the effect of the ranking signal to help our users find even more pages that are relevant and mobile-friendly.

If you've already made your site mobile-friendly, you will not be impacted by this update. If you need help making your site mobile-friendly, we recommend checking out the Mobile-Friendly Test and the Webmaster Mobile Guide, both of which provide guidance on how to improve your mobile site. And remember, the intent of the search query is still a very strong signal — so even if a page with high quality content is not mobile-friendly, it could still rank well if it has great, relevant content.

If you have any questions, please go to the Webmaster help forum.

Posted by Klemen Kloboves, Software Engineer

16 Mar 2016

Updating the smartphone user-agent of Googlebot

As technology on the web changes, we periodically update the user-agents we use for Googlebot. Next month, we will be updating the smartphone user-agent of Googlebot:


Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
(Googlebot smartphone user-agent starting from April 18, 2016)

Today, we use the following smartphone user-agent for Googlebot:


Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12F70 Safari/600.1.4 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
(Current Googlebot smartphone user-agent)

We’re updating the user-agent string so that our renderer can better understand pages that use newer web technologies. Our renderer evolves over time, and the updated user-agent string reflects that it is becoming more similar to Chrome than to Safari. To make sure your site can be viewed properly by a wide range of users and browsers, we recommend using feature detection and progressive enhancement.

Our evaluation suggests that this user-agent change should have no effect on 99% of sites. The most common reason a site might be affected is if it specifically looks for a particular Googlebot user-agent string. Sniffing for the Googlebot user-agent in order to serve it different content is not recommended and is considered a form of cloaking. Googlebot should be treated like any other browser.
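
If you do need to confirm that a request claiming to be Googlebot is genuine (for logging or abuse analysis, not for serving different content), the documented approach is a reverse DNS lookup followed by a confirming forward lookup, rather than matching user-agent strings. A minimal sketch:

import socket

# Minimal sketch: verify that an IP claiming to be Googlebot really is, by
# doing a reverse DNS lookup and then confirming the hostname resolves back
# to the same IP address.
def is_genuine_googlebot(ip_address):
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the caller's IP.
        return ip_address in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False

# Example: is_genuine_googlebot(request_ip)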

If you believe your site may be affected by this update, we recommend checking your site with the Fetch and Render Tool in Search Console (which has been updated with the new user-agent string) or by changing the user-agent string in Developer Tools in your browser (for example, via Chrome Device Mode). If you have any questions, we’re always happy to answer them in our Webmaster help forums.

Posted by Katsuaki Ikegami, Software Engineer
