
17 Mar 2015

Helping users fill out online forms

A lot of websites rely on forms for the completion of important goals, such as finishing a transaction on a shopping site or registering on a news site. For many users, online forms mean repeatedly typing common information like their names, emails, phone numbers, or addresses on different sites across the web. In addition to being tedious, this task is also error-prone, which can lead many users to abandon the flow entirely. In a world where users browse the internet using their mobile devices more than their laptops or desktops, having forms that are easy and quick to fill out is crucial!

Three years ago, we announced support for a new “autocomplete” attribute in Chrome, to make form-filling faster, easier and smarter. Now, Chrome fully supports the "autocomplete" attribute for form fields according to the current WHATWG HTML Standard. This allows webmasters and web developers to label input element fields with common data types, such as ‘name’ or ‘street-address’, without changing the user interface or the backend. Numerous webmasters have increased the rate of form completions on their sites by marking up their forms for auto-completion.

For example, marking up an email address field on a form to allow auto-completion would look like this (with a full sample form available):

<input type="text" name="customerEmail" autocomplete="email"/>

Making websites friendly and easy to browse for users on mobile devices is very important, and we hope to see many forms marked up with the “autocomplete” attribute in the future. For more information, check out our guidance on labeling and naming inputs in Web Fundamentals. And as usual, if you have any questions, please post in our Webmasters Help Forums.

Posted by Mathieu Perreault, Chrome Software Engineer, and Zineb Ait Bahajji, Webmaster Trends Analyst

17 Mar 2015

An update on doorway pages

Google’s Search Quality team is continually working on ways to minimize the impact of webspam on users. This includes doorway pages.

We have a long-standing view that doorway pages created solely for search engines can harm the quality of the user’s search experience.

For example, searchers might get a list of results that all lead to the same site. If a user clicks on one result, doesn't like it, then tries the next result only to be taken back to the same site they just left, that's a really frustrating experience.

Over time, we've seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.

To help webmasters better understand our guidelines, we've added clarifying examples and refreshed our definition of doorway pages in our Quality Guidelines.

Here are questions to ask of pages that could be seen as doorway pages:
  • Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are the pages an integral part of your site’s user experience?
  • Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
  • Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
  • Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
  • Do these pages exist as an “island”? Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?

If you have questions or feedback about doorway pages, please visit our webmaster help forum.

Posted by Brian White, Google Webspam Team

13 Mar 2015

Deprecation of the old Webmaster Tools API

Last fall we announced the new Webmaster Tools API, which lets you automate a number of important parts of your Webmaster Tools usage through code. With the pending shutdown of ClientLogin, we're going to turn down the old Webmaster Tools API on April 20, 2015.

If you're still using the old API, getting started with the new one is fairly easy. The new API covers everything from the old version except for messages and keywords. We have examples in Python and Java, as well as in OACurl (for command-line fans and quick testing). Additionally, there's the Site Verification API for adding sites to your account programmatically. The Python search query data download will continue to be available for the moment and will be replaced by an API in the coming quarters.

As always, should you have any questions, feel free to comment here, or post in our Webmaster Help Forum.

Posted by John Mueller, fan of command lines & APIs, Google Zuerich

12 Mar 2015

Unblocking resources with Webmaster Tools

Webmasters often use linked images, CSS, and JavaScript files in web pages to make them pretty and functional. If these resources are blocked from crawling, then Googlebot can't use them when it renders those pages for search. Google Webmaster Tools now includes a Blocked Resources Report to help you find and resolve these kinds of issues.

This report starts with the names of the hosts from which your site uses blocked resources such as JavaScript, CSS, and images. Clicking on a row gives you the list of blocked resources and the pages that embed them, guiding you through the steps needed to diagnose and resolve any issues with how we're able to crawl and index the page's content.

An update to Fetch and Render shows how these blocked resources matter. When you request that a URL be fetched and rendered, it now shows screenshots rendered both as Googlebot and as a typical user. This makes it easier to recognize the issues that most significantly cause your pages to be seen differently by Googlebot.

Webmaster Tools attempts to show you only the hosts that you might have influence over, so at the moment, we won't show hosts that are used by many different sites  (such as popular analytics services). Because it can be time-consuming (usually not for technical reasons!) to update all robots.txt files, we recommend starting with the resources that make the most important visual difference when blocked. Our Help Center article has more information on the steps involved.
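
As a sketch of the kind of change involved (the /assets/ and /admin/ paths below are hypothetical), a broad rule like this in robots.txt would keep Googlebot from fetching the stylesheets and scripts a page needs to render:

User-agent: *
Disallow: /assets/

Because Googlebot follows the most specific matching rule, adding explicit Allow lines for the resource directories (or narrowing the Disallow to areas that truly need blocking) keeps private paths blocked while making page resources crawlable:

User-agent: *
Disallow: /admin/
Disallow: /assets/
Allow: /assets/css/
Allow: /assets/js/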

We hope this new feature makes it easier for you to spot and then unblock resources used by your website! Should you have any questions, feel free to drop by our webmaster help forums.

Posted by John Mueller, Webmaster Trends Analyst, Google Switzerland

9 Mar 2015

Easier website development with Web Components and JSON-LD

JSON-LD is a JSON-based data format that can be used to implement structured data describing the content on your site to Google and other search engines. For example, if you have a list of events, cafes, or people, you can include this data in your pages in a structured way using the schema.org vocabulary, embedded in your webpages as a JSON-LD snippet. The structured data helps Google understand your pages better and highlight your content in search features, such as events in the Knowledge Graph and rich snippets.

Web Components are a nascent set of technologies to define custom, reusable user interface widgets and their behavior. Any web developer can build a Web Component. You start by defining a template for a distinct part of the user interface, which you import into the pages on which you want to use the Web Component. A Custom Element is used to define the behavior of the Web Component. Because you’re bundling the display and logic for part of the user interface into the Web Component, you can share and reuse the bundle on other pages and with other developers, thus simplifying web development.

JSON-LD and Web Components work really well together. The Custom Element functions as the presentation layer and the JSON-LD functions as the data layer that the custom element and search engines consume. This means you can build custom elements for any schema.org type, such as schema.org/Event and schema.org/LocalBusiness.

Your architecture would then look like this. Your structured data is stored in your database, for example, the store locations in your chain. This data is embedded into your webpage as a JSON-LD snippet, which means it’s available to be consumed by the Custom Element to display to a human visitor and for Googlebot to retrieve for Google indexing.
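
As a rough sketch of that data layer (the element name and business details below are invented for illustration), the JSON-LD snippet embedded in the page might look like this, with a Custom Element such as a hypothetical <local-business-card> reading the same data to render the visible widget:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Mountain View",
    "addressRegion": "CA"
  },
  "telephone": "+1-650-555-0100",
  "openingHours": "Mo-Fr 07:00-18:00"
}
</script>
<!-- The hypothetical Custom Element consumes the JSON-LD above and renders it for visitors -->
<local-business-card></local-business-card>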

To learn more and get started with your own custom elements, please see:

Posted by Ewa Gasperowicz, Developer Programs Engineer, Mano Marks, Developer Advocate, Pierre Far, Webmaster Trends Analyst

27 Feb 2015

Finding more mobile-friendly search results

Webmaster level: all

When it comes to search on mobile devices, users should get the most relevant and timely results, whether the information lives on mobile-friendly web pages or in apps. As more people use mobile devices to access the internet, our algorithms have to adapt to these usage patterns. In the past, we’ve made updates to ensure sites are configured properly and viewable on modern devices. We’ve made it easier for users to find mobile-friendly web pages and we’ve introduced App Indexing to surface useful content from apps. Today, we’re announcing two important changes to help users discover more mobile-friendly content:

1. More mobile-friendly websites in search results

Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact on our search results. As a result, users will find it easier to get relevant, high-quality search results that are optimized for their devices.

To get help with making a mobile-friendly site, check out our guide to mobile-friendly sites. If you’re a webmaster, you can get ready for this change by using the following tools to see how Googlebot views your pages:

  • If you want to test a few pages, you can use the Mobile-Friendly Test.
  • If you have a site, you can use your Webmaster Tools account to get a full list of mobile usability issues across your site using the Mobile Usability Report.

2. More relevant app content in search results

Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
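
At a high level, App Indexing relies on deep links that associate your web pages with the matching screens in your app; a minimal sketch of the annotation (the package name and URLs are placeholders, and the step-by-step guide remains the authoritative reference) looks like this:

<!-- In the <head> of http://example.com/recipes/apple-pie, pointing at the matching screen in a hypothetical Android app -->
<link rel="alternate" href="android-app://com.example.recipes/http/example.com/recipes/apple-pie" />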

If you have questions about either mobile-friendly websites or app indexing, we’re always happy to chat in our Webmaster Help Forum.

Posted by Takaki Makino, Chaesang Jung, and Doantam Phan

19 Feb 2015

Case Studies: Fixing Hacked Sites

Webmaster Level: All

Every day, thousands of websites get hacked. Hacked sites can harm users by serving malicious software, collecting personal information, or redirecting them to sites they didn't intend to visit. Webmasters want to fix hacked sites quickly, but unfortunately recovering from a hack can be a complicated process.

We're trying to make the process of recovering from a hack easier for webmasters with features like Security Issues, Help for Hacked Sites, and a section of our forum just for hacked sites. Recently we talked to two webmasters with hacked sites to learn more about how they were able to fix their sites. We're sharing their stories with the hope that they might provide ideas to other webmasters who have been victims of hacking. We're also using these stories and other feedback for improving our documentation for hacked sites to make the process easier for everyone going forward.

Case Study #1: Restaurant website with multiple hack-injected scripts

A restaurant website using WordPress received a message from Google in their Webmaster Tools account, alerting them that their site had been altered by hackers. To protect Google users, the website was labelled as hacked in Google's search results. The webmaster of the site, Sam, looked at the source code and noticed many unfamiliar links on the site with pharmaceutical terms such as "viagra" and "cialis." She also noticed many pages where content such as "buy valtrex in florida" had been added to the meta description tags (in the HTML). There were also hidden div tags (also in the HTML) on many pages that linked out to many other sites. None of these links were added by Sam.

Sam removed all of the hacked content she found and filed a reconsideration request. The request was rejected, but in the message she received from Google she was advised to check for any unfamiliar scripts in any PHP files (or any other server files), as well as for changes to the .htaccess file. These files are likely to have scripts added by the hackers that modify the site. Such scripts typically show the hacked content only to search engines, while hiding it from a normal user. Sam checked all of the .php files and compared them to the clean copies she had in her backup. She found new content added to her footer.php, index.php, and functions.php. When she replaced those files with the clean backups, she could no longer find any hacked content on her site. When she filed another reconsideration request, she got a response from Google notifying her that her site was free from hacked content!

Even though Sam had cleaned up the hacked content on her site, she knew that she would need to continue to secure her site against future attacks. She followed the steps below to keep her site safe in the future:

  • Keep the CMS (content management system like WordPress, Joomla, Drupal, etc.) up to date with the most current version. Make sure plugins are up to date as well.
  • Make sure the account used to access the administrative features of the CMS uses a difficult and unique password.
  • If the CMS supports it, enable 2-step verification for login. (This might also be called two-factor authentication or two-step authentication.) This is recommended for the account used for password recovery as well. Most email providers, such as Google, Microsoft, and Yahoo, support this!
  • Make sure the plugins and themes installed are from a reputable source - pirated plugins or themes can often contain code that makes it even easier for hackers to get in!

Case Study #2: Professional website with lots of hard-to-find hacked pages

A small business owner named Maria who also manages her own website received a message in her Webmaster Tools that her site was hacked. The message provided an example of a page added by hackers: http://example.com/where-to-buy-cialis-over-the-counter/. She talked to her hosting provider who looked at the source code on the homepage but could not find any pharmaceutical keywords. When the hosting provider visited http://example.com/where-to-buy-cialis-over-the-counter/, it returned an error page. Maria also bought a malware scanning service but the service was not able to find any malicious content on her site.

Maria then went to Webmaster Tools and used the Fetch as Google tool on the example URL Google had provided (http://example.com/where-to-buy-cialis-over-the-counter/) which returned no content. Confused, she filed a reconsideration request and received a rejection message which advised her to do two things:

  1. Verify the non-www version of her site as hackers often try to hide content in folders that may be overlooked by the webmaster.

    While it may seem like http://example.com and http://www.example.com are the same site, Google actually treats these as different sites. http://example.com is referred to as the "root domain" while http://www.example.com is the subdomain. Maria had verified http://www.example.com but not http://example.com, which is important because the pages added by hackers were non-www pages like http://example.com/where-to-buy-cialis-over-the-counter/. Once she verified http://example.com, she was able to see the hacked content on the provided URL with the Fetch as Google tool in Webmaster Tools.

  2. Check her .htaccess file for new rules.

    Maria talked to her hosting provider who showed her how to access her .htaccess file. She noticed right away that her .htaccess file had some strange content that she had not added:

    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (google|yahoo|msn|aol|bing) [OR]
    RewriteCond %{HTTP_REFERER} (google|yahoo|msn|aol|bing)
    RewriteRule ^([^/]*)/$ /main.php?p=$1 [L]
    </IfModule>

    The mod_rewrite rule you see above was inserted by the hacker and redirects anyone coming from certain search engines, as well as search engine crawlers, to main.php, which generates all of the hacked content. Rules like these can also redirect users accessing the site on a mobile device. On the same day, she saw that a recent malware scan had found suspicious content in the main.php file. On top of that, she noticed an unknown user in the FTP users area of her website development software.

She removed the main.php file and the .htaccess file, deleted the unknown user from her FTP users area, and her site was no longer hacked!

Steps to prevent getting hacked in the future

  • Avoid using FTP when transferring files to your servers. FTP does not encrypt any traffic, including passwords. Instead, use SFTP, which will encrypt everything, including your password, as a protection against eavesdroppers examining network traffic.
  • Check the permissions on sensitive files like .htaccess. Your hosting provider may be able to assist you if you need help. The .htaccess file can be used to improve and protect your site, but it can also be used for malicious hacks if attackers are able to gain access to it.
  • Be vigilant and look for new and unfamiliar users in your administrative panel and any other place where there may be users that can modify your site.

We hope your site never gets hacked, but if it does, we have many resources for hacked webmasters on our Help for Hacked Sites page. If you need more help or would like to share your own tips, you can post in our Webmaster Help Forum. If you do post to the forum or submit a reconsideration request for your site, please include #NoHacked.

Posted by Julian Prentice and Yuan Niu, Search Quality Team

28 Jan 2015

Crawling and indexing of locale-adaptive pages

Webmaster level: advanced

Locale-adaptive pages change their content to reflect the user's language or perceived geographic location. Since, by default, Googlebot requests pages without setting an Accept-Language HTTP request header and uses IP addresses that appear to be located in the USA, not all content variants of locale-adaptive pages may be indexed completely.

Today we’re introducing new locale-aware crawl configurations for Googlebot for pages that we detect may adapt the content they serve based on the request's language and perceived location. These are:

  • Geo-distributed crawling, where Googlebot would start to use IP addresses that appear to be located outside the USA, in addition to the US-based IP addresses Googlebot currently uses.
  • Language-dependent crawling where Googlebot would start to crawl with an Accept-Language HTTP header in the request.

As these new crawling configurations are enabled automatically for pages we detect to be locale-adaptive, you may notice changes in how we crawl and show your site in Google search results without you altering your CMS or server settings.

Note that these new configurations do not alter our recommendation to use separate URLs with rel=alternate hreflang annotations for each locale. We continue to support and recommend using separate URLs, as they are still the best way for users to interact with and share your content, and also to maximize the indexing and improve the ranking of all variants of your content.
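
For reference, the separate-URL approach pairs each locale's URL with rel=alternate hreflang annotations; a minimal sketch (the URLs are placeholders), placed in the <head> of both pages, looks like this:

<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="de" href="http://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />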

As always, if you have any questions or feedback, please tell us in the internationalization Webmaster Help Forum.

Posted by Qin Yin, Software Engineer Search Infrastructure, and Pierre Far, Webmaster Trends Analyst

16 Jan 2015

Upcoming Events In The Knowledge Graph

Webmaster level: all

Last year, we launched a new way for musical artists to list their upcoming events on Google: schema.org markup on their official websites. Now we’re expanding this program in four ways:

1. Official Ticket Links

For artists: if you mark up ticketing links along with the events on your official website, we’ll show an expanded answer card for your events in Google search, including the on-sale date, availability, and a direct link to your preferred ticketing site.

As before, you may write the event markup directly into your site’s HTML, or simply install an event widget that builds in the markup for you automatically—like Bandsintown, BandPage, GigPress, ReverbNation or Songkick.
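
As a hedged sketch of what that markup expresses (the artist, venue, dates, and URLs below are invented for illustration), the ticket link, availability, and on-sale date travel with the event as an Offer:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "MusicEvent",
  "name": "Example Band live in Mountain View",
  "startDate": "2015-06-20T20:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": "123 Main Street, Mountain View, CA"
  },
  "offers": {
    "@type": "Offer",
    "url": "http://www.example-tickets.com/event/12345",
    "availability": "http://schema.org/InStock",
    "validFrom": "2015-04-01T10:00"
  }
}
</script>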

2. Delegated Event Listings

What if you can’t add markup or an event widget to your official website—for example, if your website doesn’t list your events at all? Now you can use delegation markup to tell us to source your events from a page of your choice on another website. Just add the following markup to your home page, making sure to customize the three placeholder values (your name, your official website’s URL, and the URL of the page that lists your events):

<script type="application/ld+json">{"@context" : "http://schema.org",  "@type" : "MusicGroup",  "name" : "Your Band or Performer Name",  "url" : "http://your-official-website.com",  "event" : "http://other-event-site.com/your-event-listing-page/"}</script>

The marked-up events found on the other event site's page will then be eligible for Google events features. Examples of sites you can point to in the “event” field include bandpage.com, bandsintown.com, songkick.com, and ticketmaster.com.

3. Comedian Events

Hey funny people! We want your performances to show up on Google, too. Just add ComedyEvent markup to your official website. Or, if another site like laughstub.com has your complete event listings, use delegation markup on your home page to point us their way.

4. Venue Events

Last but definitely not least: we’re starting to show venue event listings in Google Search. Concert venues, theaters, libraries, fairgrounds, and so on: make your upcoming events eligible for display across Google by adding Event markup to your official website.

As with artist events, you have a choice of writing the event markup directly into your site’s HTML, or using a widget or plugin that builds in the markup for you. Also, if all your events are ticketed by a primary ticketer whose website provides markup, you don’t have to do anything! Google will read the ticketer’s markup and apply it toward your venue’s event listings.

For example, venues ticketed by Ticketmaster, including its international sites and TicketWeb, will automatically be covered. The same goes for venues that list events with Ticketfly, AXS, LaughStub, Wantickets, Holdmyticket, ShowClix, Stranger Tickets, Ticket Alternative, Digitick, See Tickets, Tix, Fnac Spectacles, Ticketland.ru, iTickets, MIDWESTIX, Ticketleap, or Instantseats. All of these have already implemented ticketer events markup.

Please see our Developer Site for full documentation of these features, including a video tutorial on how to write and test event markup. Then add the markup, help new fans discover your events, and play to a packed house!

Posted by Justin Boyan, Product Manager, Google Search

16 Jan 2015

New Structured Data Testing Tool, documentation, and more

Webmaster level: all

Structured data markup helps your content get discovered in search results and across Google properties. We’re excited to share several updates to help you author and publish markup on your website:

Structured Data Testing Tool

The new Structured Data Testing Tool better reflects how Google interprets a web page’s structured data markup.

It provides the following features:

  • Validation for all Google features powered by structured data
  • Support for markup in the JSON-LD syntax, including in dynamic HTML pages
  • Clean display of the structured data items on your page
  • Syntax highlighting of markup problems right in your HTML source code

New documentation and simpler policy

We've clarified our documentation for the vocabulary supported in structured data based on webmasters' feedback. The new documentation explains the markup you need to add to enable different search features for your content, along with code examples in the supported syntaxes. We'll be retiring the old documentation soon.

We've also simplified and clarified our policies on using structured data. If you believe that another site is abusing Google's rich snippets quality guidelines, please let us know using the rich snippets spam report form.

Expanded support for JSON-LD

We've extended our support for schema.org vocabulary in JSON-LD syntax to new use cases: company logos and contacts, social profile links, events in the Knowledge Graph, the sitelinks search box, and event rich snippets. We're working on expanding support to additional markup-powered features in the future.
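
As an illustration of a couple of those use cases (the organization name, phone number, and URLs below are placeholders), a logo, a contact point, and social profile links can all be declared in one JSON-LD block on the home page:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "http://www.example.com/",
  "logo": "http://www.example.com/images/logo.png",
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "+1-800-555-0100",
    "contactType": "customer service"
  }],
  "sameAs": [
    "https://www.facebook.com/examplecompany",
    "https://twitter.com/examplecompany"
  ]
}
</script>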

As always, we welcome your feedback and questions; please post in our Webmaster Help forums.

Posted by Pierre Far, Webmaster Trends Analyst, Tatsiana Sakhar, Search Quality Analyst, Zach Clifford, Software Engineer
