Google frequently rolls out updates to its algorithm, the system that determines when and where websites appear in search results. Its latest change, rolled out without much fanfare in March, targets what Google judges to be attempts to include “spammy links” in articles referred to as contributor, guest, partner or syndicated posts.

In a blog post published on 25 May, Google said factors such as stuffing keyword-rich links into articles, having articles published across many different sites, using or hiring writers who aren’t knowledgeable about the topics they cover, or reusing the same or similar content across articles are indications that an article violates the new guidelines.

The blog added: “When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking.

“Sites accepting and publishing such articles should carefully vet them, asking questions like: ‘Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel="nofollow" on them?’.”

It was reported that some websites saw up to 90 per cent of their keywords drop a number of places in search engine results pages following the roll-out of the update. It appears, then, that websites lacking topical relevance and expertise, where driving revenue is prioritised over quality content, will be adversely affected.

Many sources suggest prioritising quality over quantity and avoiding link pyramids or exchanges. If a business has an outdated website whose content is not relevant to its primary offering, the new algorithm may deem the site irrelevant and rank it lower. Advice also points towards dealing with outdated backlinks.

In a separate blog post tweeted by Gary Illyes, Google’s webmaster trends analyst, a nod was made towards better meta descriptions. Comparing them to the synopsis or preface of a book that people like to read before committing time to it, the post said good meta descriptions are short blurbs that accurately describe the content of the page. “They are like a pitch that convince the user that the page is exactly what they’re looking for,” it went on to say.

The most common problems with meta descriptions, the post noted, are not including one at all or using the same description across multiple pages.
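Those two problems are easy to check for yourself. The sketch below is not from Google’s post; it is a minimal, assumed approach using only the Python standard library, and the example URLs are hypothetical placeholders you would swap for your own pages or sitemap.

```python
# Minimal sketch: flag pages with a missing meta description, or a
# description reused across several pages. Example URLs are hypothetical.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen


class MetaDescriptionParser(HTMLParser):
    """Captures the content of a page's <meta name="description"> tag."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = (attrs.get("content") or "").strip()


def audit_meta_descriptions(urls):
    """Print pages with no description and descriptions shared by multiple pages."""
    seen = defaultdict(list)
    for url in urls:
        parser = MetaDescriptionParser()
        parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
        if not parser.description:
            print(f"MISSING meta description: {url}")
        else:
            seen[parser.description].append(url)
    for description, pages in seen.items():
        if len(pages) > 1:
            print(f"DUPLICATE description on {len(pages)} pages: {description[:60]}…")


if __name__ == "__main__":
    audit_meta_descriptions([
        "https://example.com/",
        "https://example.com/about",
        "https://example.com/contact",
    ])
```

Run against a list of your own URLs, the output simply lists pages that need a description written and descriptions that should be made unique to each page.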