If you joined me last week you’ll have heard that Google is cracking down on websites using AMP pages in a manipulative way to get into the news carousel. This week I’m sad to report that Google is threatening more manual actions because another feature of the Google SERPs is being abused: this time it’s event mark-up being used in a misleading way. Essentially, this schema mark-up is supposed to denote information on your website specific to an event you are promoting. It then pulls through to the Google search results as a nicely formatted list of events that users can click to reach the relevant page on your site. Unfortunately, some unscrupulous website owners, particularly in the voucher code industry, have been using it to mark up time-sensitive information that isn’t actually an event, just to get that information to appear in the SERPs.
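For reference, legitimate event mark-up looks something like the snippet below, a minimal sketch using schema.org’s JSON-LD format for an Event; the event name, dates, venue and URL here are invented purely for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Product Launch Evening",
  "startDate": "2017-12-15T19:00",
  "endDate": "2017-12-15T22:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": "1 Example Street, London"
  },
  "url": "https://www.example.com/events/product-launch"
}
</script>
```

The key point is that every field describes a genuine, scheduled occasion with a real time and place; using this structure to dress up a voucher expiry date as an “event” is exactly the behaviour Google is warning against.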
Google is famously anti-manipulation, wanting to make sure users clicking through to a website from a search result are presented with the most relevant information for their query. Presenting something as an event that isn’t actually an event is therefore against their guidelines. They’ve even stated: “Since this creates a misleading user experience, we may take manual action on such cases.” So don’t use schema in a misleading way, or you might find that none of the schema you’ve marked up on your site shows in the SERPs.
You might soon see an intriguing message appearing in Google’s search results. The mysterious line “No information is available for this page. Learn why.” has begun appearing in place of some meta descriptions. If you click the “learn why” link you’re taken through to a Google support page, which explains that there is no information about the page available because the page can’t be accessed by Google.
Essentially, this just replaces the old message “A description for this result is not available because of this site's robots.txt – learn more.” It means that Googlebot found the page in the past on one of its crawling jaunts, which is why it’s in Google’s index, but on trying to crawl it again the bot has been prevented from accessing the page. This is commonly because the bot has been blocked by a directive in the robots.txt file, which sits at the root of the website. If you see this description appearing for a page on your website, go to your robots.txt file and check whether any exclusions in there might be telling Google not to visit that page. You might be surprised by what you are blocking Google from crawling.
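As an illustration, a robots.txt rule like the following (the /members/ path is made up for this example) would stop Googlebot from re-crawling anything under that path, even though pages it crawled previously can remain in the index:

```
# Applies to all crawlers, including Googlebot
User-agent: *
# Blocks crawling of every URL under /members/
Disallow: /members/
```

If a page matching a Disallow rule is already indexed, Google can keep listing it but can no longer fetch a description for it, which is when the “No information is available for this page” message appears.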
There could be a legitimate reason why the page is excluded through the robots.txt; you may have been trying to remove the page from Google’s index altogether, because you no longer want it to be found through a search, for instance. However, stopping Google from visiting the page is not the right way of doing that. Instead, you need to allow Google to crawl the page again but add a “noindex” meta tag to it, so Google knows to remove the page from the index. Simple… no?
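In practice that means removing the blocking rule from robots.txt and adding a tag like this to the &lt;head&gt; of the page you want dropped from the index:

```html
<!-- Tells search engine crawlers not to include this page in their index -->
<meta name="robots" content="noindex">
```

Once Google has re-crawled the page and seen the tag, the page should drop out of the results; crucially, the bot has to be able to reach the page to see the tag at all, which is why blocking it in robots.txt defeats the purpose.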
Google is feeling altruistic, letting searchers donate to non-profit organisations and charities straight from the search results. Currently only in the US, charities’ knowledge graph panels are displaying a “donate” button that, once clicked, enables you to give money to that organisation. To be eligible for the button, organisations have to sign up to the “Google for Non-Profits” scheme.
According to Google, 30% of giving occurs during the holiday season, and seeing as this Tuesday just gone was “Giving Tuesday” (the antidote to Black Friday and Cyber Monday), it seems like a logical time for Google to be rolling this out.
They say it’s only available in the US at the moment, but they hope to roll it out across more locales soon.