Google frequently rolls out updates to its algorithm, the system that determines when and where websites appear after users run a search. Its latest change, rolled out without much fanfare in March, targets what it deems attempts to place “spammy links” in articles described as contributor, guest, partner or syndicated posts.

In a blog post published on 25 May, Google said factors such as stuffing keyword-rich links into articles, having the same articles published across many different sites, using or hiring writers who aren’t knowledgeable about the topics they cover, or reusing the same or similar content across articles are indications that an article violates the new guidelines.

The blog added: “When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking.

“Sites accepting and publishing such articles should carefully vet them, asking questions like: ‘Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=“nofollow” on them?’.”

It was reported that some websites saw up to 90 per cent of their keywords drop a number of places in search engine results pages following the roll-out of the update. It appears, then, that websites lacking topical relevance and expertise will be adversely affected, particularly where driving revenue is prioritised over quality content.

Many sources suggest quality over quantity, alongside avoiding link pyramids and link exchanges. If a business’s website is outdated and contains content unrelated to its primary offering, the new algorithm may deem it irrelevant and rank it lower. Advice also points towards dealing with outdated backlinks.
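As an illustration of the vetting step Google describes, a publisher can mark a link of questionable intent so it passes no ranking credit. A minimal sketch, with a hypothetical destination URL:

```html
<!-- A guest-post link flagged with rel="nofollow", which asks Google
     not to pass ranking credit (PageRank) to the linked page -->
<a href="https://example.com/some-product" rel="nofollow">this product</a>
```

Applied consistently to paid or unvetted links, this is the safeguard the blog post’s checklist refers to.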
In a separate blog post tweeted by Gary Illyes, Google’s webmaster trends analyst, Google nodded towards better meta descriptions. Comparing them to the synopsis or preface people like to read before committing time to a book, the post said good meta descriptions are short blurbs that accurately describe the content of the page. “They are like a pitch that convinces the user that the page is exactly what they’re looking for,” it went on to say. The most common problems with meta descriptions, the post noted, are not including one at all or using the same description across multiple pages.
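The advice above translates into a single tag in a page’s head. A sketch of a page-specific description, with hypothetical wording:

```html
<!-- A meta description: a short, accurate blurb unique to this page,
     rather than one duplicated across the whole site -->
<meta name="description"
      content="How to vet guest posts, spot spammy links and apply rel=nofollow before publishing.">
```

Search engines may show this text as the snippet under the page title in results, which is why a missing or duplicated description wastes the “pitch” the post describes.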