Wednesday, March 11, 2015

While JavaScript, CSS, and linked images make websites look good and function properly, they can cause SEO headaches if those resources are blocked from crawling. Now, Google is aiming to remedy that problem by making sure webmasters know exactly which website features are being blocked.
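The blocking in question usually comes from robots.txt. As a hypothetical illustration (the paths below are examples, not from any specific site), a broad Disallow rule can hide the very CSS and JavaScript files Googlebot needs to render a page, while explicitly allowing those directories avoids the problem:

```
# Blocking whole asset directories hides CSS/JS from Googlebot:
# User-agent: *
# Disallow: /assets/

# Allowing the rendering resources instead lets Googlebot see the page
# the way a visitor does:
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Disallow: /admin/
```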

In a blog post this morning, Google said its new reporting feature will begin with the names of blocked hosts. Clicking on the "rows" column will diagnose the problem in more detail with a list of blocked resources and a step-by-step guide to remedying the issues.

Google is also making it easier to test sites for crawling problems with Fetch and Render, a URL retrieval feature that gives webmasters screenshots of how a page appears to Googlebot and to a typical reader.

Greater transparency into Googlebot crawling problems affects a number of areas, including "Mobile-Friendly" tags.

Tuesday, March 10, 2015

Although many sites already support HTTPS, more than 80% of their eligible HTTPS URLs do not show in Google's search results as HTTPS, because webmasters are telling Google to display the HTTP version instead.

Gary Illyes, a Webmaster Trends Analyst at Google, announced on Google+ this morning that over 80% of the eligible HTTPS URLs are not being displayed in Google’s search results as HTTPS URLs; instead, they are showing up as HTTP URLs, simply because of how the webmasters configured their sites.

Gary said they ran a small analysis at Google and found that, of the HTTPS URLs eligible to be displayed in Google's search results, over 80% are not being displayed. Eligible HTTPS URLs are those that have no crawl issues, don’t carry a noindex directive, and have no other problems. But because of how the webmaster configured the site, Google is being instructed to display the HTTP URL instead of the HTTPS URL.

Gary said webmasters are using the HTTP variant in their sitemap files and in their rel-canonical and rel-alternate-hreflang elements instead of the HTTPS variant.
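As a sketch of the fix (the URLs below are hypothetical), the HTTPS variant should be referenced consistently both in the page's head elements and in the sitemap:

```html
<!-- In the page <head>: point canonical and hreflang at the HTTPS URL -->
<link rel="canonical" href="https://www.example.com/page/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">

<!-- In sitemap.xml: list the HTTPS URL, not the HTTP one -->
<url>
  <loc>https://www.example.com/page/</loc>
</url>
```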

Google wants you to go HTTPS and even began giving a small ranking boost to HTTPS URLs months ago. Still, many webmasters are not making the switch.

Gary from Google said:

If your site supports HTTPS, please do tell us: use HTTPS URLs everywhere so search engines can see them!
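One way to check whether your own sitemap still lists HTTP URLs is a quick script. The sketch below assumes a standard sitemaps.org-format sitemap.xml, and the sample XML in it is hypothetical:

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def http_urls_in_sitemap(sitemap_xml):
    """Return the sitemap <loc> entries that still use the HTTP scheme."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.iter(NS + "loc")]
    return [url for url in locs if url.startswith("http://")]

# Hypothetical sitemap mixing HTTP and HTTPS entries:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/old-page/</loc></url>
  <url><loc>https://www.example.com/new-page/</loc></url>
</urlset>"""

# Prints the entries that should be switched to HTTPS
print(http_urls_in_sitemap(sample))
```

Any URL the script prints is one Google is effectively being told to index under HTTP.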

Source: http://searchengineland.com/google-analysis-shows-80-https-urls-not-displaying-googles-search-results-216441

Thursday, February 26, 2015

Have you ever had a prize page that was winning rankings for a specific keyword? And then, gradually, over time, it dropped from the SERPs?

I’ve had it happen before. Why? There are plenty of reasons. The algorithm changes. Competitors rise. The search landscape fluctuates.

When it does happen, you want to make sure you have an arsenal of techniques that will allow you to win back rank for the page. Here are 12 methods for doing just that.

1. Build Fresh Internal Links to the Page

If you haven’t done it yet, add internal links to the page. Find other high-traffic pages on your site and link them back to the page you want to rank again.

Internal links can pass some PageRank, but the real reason to build them is that they strengthen the internal site structure.

As you create links to and from various pages on the site, you’re more likely to drive referral traffic to the page, too.
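For example, an internal link from a high-traffic post might look like this (the URL and anchor text are hypothetical); descriptive anchor text helps both readers and search engines understand the target page:

```html
<!-- In a popular post, link back to the page you want to rank again -->
<a href="/guides/target-page/">our full guide to the topic</a>
```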

2. Create a Main Navigation Menu to the Page

When you are trying to make an old page gain rank again, your main goal is to make the page a big deal. You want it to have popularity, lots of traffic, and more recognition.

One of the easiest things that you can do to improve ranking is simply to link to your target page from your main navigation menu.


Saturday, January 3, 2015

Shortly after Google released the Google Panda 4.1 update, I asked if this site was hit by the almighty Panda algorithm. It clearly looked like it was, and I believe it was hit by Panda. Nothing has changed since. Not to this site, not to the content.
