SEO stands for Search Engine Optimisation: the process of improving your website's readability for search engines, with the goal of improving its visibility in search results. SEO has long been an essential part of a website's success, and it only continues to grow in importance.
Websites succeed or fail based on how many people visit them (their traffic), and they rely on search engines to put them where potential users can find them. When a user makes a search query, the engine generates a series of search engine results pages (SERPs).
You want your website to appear as close to the top of these pages as possible. The way that this is done is through a sophisticated algorithm that the engine employs to determine where your site will appear on the page.
To understand how the rankings fluctuate let’s first understand the anatomy of the SERPs.
- 1 Anatomy of a SERP
- 2 SEO Outcomes
- 3 How does SEO work?
- 4 SEO Quality
- 5 Google’s Algorithmic Updates
- 6 Search Engine Rankings
- 7 Good SEO Practices & Ethical SEO
Anatomy of a SERP
When you make a search query in a search engine, a SERP is generated; each time a link is shown on one counts as an ‘impression’. These pages are made up of many links to websites, and SERP links come in a few different categories.
Sponsored links, or ‘ads’, are paid links that appear in reserved spots on the SERP whenever they are at least somewhat relevant to the query. They operate rather like billboards: a business pays a certain amount to ensure its link appears on the results page.
As Google has updated its search options (e.g. all, images, videos, maps, news etc.) the SERP has evolved to include some of these. Often a news link will appear on the results page if it is relevant to the search query.
Organic, or natural links are the usual ones that make up the bulk of a results page.
How and why these links appear on the SERP is determined by the search engine’s algorithm. This is where SEO comes into play. By adhering to the parameters set by the algorithm, your website should appear on the results page in a way that is natural and relevant to the user’s query.
Page breakdown: While some things have changed as the search engines have become more advanced, SERPs follow a basic format. A SERP can have a significant number of links.
The page structure is as follows:
Sponsored links (labelled ‘Ad’): At the top of the page there can be up to four ad links.
News links: Next there can be other result types, such as news links.
Local listings: Accompanied by a Google Maps panel at the top for geographic queries.
Organic links: In the middle of the page there are typically around ten organic links.
Sponsored links: At the bottom of the page there can be up to three ad links.
Web crawlers, or ‘spiders’, are what search engines use to continuously update and index their data. A crawler is an internet-based bot that systematically traverses the web, visiting websites and collecting data that is sent back to the search engine. Due to the sheer size of the web, even the largest crawlers can’t do a flawless job every time, but their tireless work makes the browsing experience increasingly efficient for all users.
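The crawling loop described above can be sketched in a few lines of Python using only the standard library. This is a minimal illustration, not a production crawler: real spiders also respect robots.txt, throttle their requests, and deduplicate URLs far more carefully. All names here (`LinkExtractor`, `crawl`) are our own for illustration.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: visit a page, record it, queue its links."""
    frontier = deque([seed_url])
    visited = set()
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable pages are simply skipped
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            frontier.append(urljoin(url, link))  # resolve relative links
    return visited
```

The breadth-first frontier is the key idea: each visited page feeds new URLs back into the queue, which is how a crawler starting from a handful of seed sites ends up indexing vast swathes of the web.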
SEO Outcomes
SEO is used to attract as many people as possible to your site; this is called traffic. There are some key components of traffic that SEO aims to achieve.
Quantity of Traffic
Just like a store, your website relies on people to visit and generate interest. Whether it is by allowing them to purchase goods, or sharing information, you want people to come in and see what you have. For this reason, it is one goal of SEO to get as much traffic to your site as possible.
While you want a lot of people to visit your site, you also want the right people.
One of the ways that SEO works is through keywords. If you search for a particular word, the algorithm can recognise that word in a site’s content and make the site appear in the SERPs. However, it’s not always that simple, due to the ambiguity of written language.
It is likely that if you make the query ‘buy oranges’ you will get pages about buying oranges. However, if you search for ‘buy apples’ you may be presented with pages for buying the fruit, or with pages full of links about Apple phones. The latter is not relevant to a user who wants fruit, so producing a relevant web page is not just a matter of matching specific words. Any traffic that comes to your site should be people who want to be there.
There are some other key considerations for a search engine, such as:
- Query understanding
- Query intention
- Language identification
- Query filtering
- Breaking down a query or tokenization
- Query expansion
- Query rewriting
- Search improvements for head queries
- Improving results for long-tail queries
Query tokenization example.
Stemming and lemmatization example.
You ensure quality traffic by making sure your site appears on the SERPs that are relevant to the user’s search query; in other words, show up where it makes sense.
One way to get traffic is through paid SERP links (ads). However, this means you need to pay for your place on the SERP. There is also research suggesting that many users tend to ignore the ad portion of the SERP, even when it contains the result they want. This may be due to the promotional presentation of these links, which elicits a sort of defensive response in the user. Note too that higher traffic does not automatically mean more sales or actions; turning a visitor into a customer is called a conversion. Read up on conversion rates here.
Conversion tests report.
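Conversion rate is simply the share of visitors who complete the desired action. A one-line sketch in Python (the figures are hypothetical):

```python
def conversion_rate(conversions, visitors):
    """Share of visitors who completed the desired action, e.g. a purchase."""
    if visitors == 0:
        return 0.0
    return conversions / visitors

# Hypothetical example: 30 purchases from 1,500 visitors
# conversion_rate(30, 1500) -> 0.02, i.e. a 2% conversion rate
```

This is why traffic quality matters: 1,500 well-targeted visitors at 2% convert better than 15,000 visitors who never wanted to be there.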
A better way to get traffic is through the organic portion of the SERPS. With a little bit of extra work, you can optimise your site so that it appears high in the rankings and has a good chance of being seen without needing to pay premium costs for an ad spot.
How does SEO work?
SEO is not the optimisation of a search engine, but rather the optimisation of your website to work with a search engine. But what does that even mean? Well, like most things, it doesn’t mean just one thing, and a number of factors influence the process of this optimisation.
Search engines are maintained by their own teams, whether at Google or Bing. That leaves you to design – or optimise – your website in a manner that works with these sophisticated and refined engines. A number of factors make up this model:
Content is the main part of a website. There are a number of different aspects to content, such as the words used, the quality of the grammar, and the tone. Content also includes other forms of media, like images, videos, and interactive programs.
Not to be confused with the links that are generated on a SERP, there are also links that are embedded into websites themselves. The practice of using links in this way is called ‘Link building’. These links provide a way to navigate to different pages on the internet, whether it be another page on the website, or to another site altogether. These links are often integrated into the content usually somewhere in a piece of text, or as a clickable image or video.
This can also include links to social media feeds such as Twitter, or Facebook. These links often have unique attributes that allow them to do different things, such as allowing a user to click the link and ‘follow’ without leaving the page.
Google webmaster tools can provide you with your external backlinks report.
A sample report from webmaster tools.
These are the words that search engine algorithms will pick up on to find connections in a website’s content to the search query. Keywords can be implemented into written content in a way that ensures it will be included in the links that a SERP presents.
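The connection between keywords and pages is classically stored in an inverted index: a mapping from each word to the pages that contain it. A minimal sketch, with hypothetical page names echoing the oranges/apples example above:

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page ids whose content contains it."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word].add(page_id)
    return index

# Hypothetical page contents
pages = {
    "fruit-shop": "buy fresh oranges and apples",
    "phone-news": "new apple phone released",
}
index = build_index(pages)
# index["oranges"] -> {"fruit-shop"}
# index["apple"]  -> {"phone-news"}
```

Note that without stemming, ‘apple’ and ‘apples’ are different keys; combined with the ambiguity of ‘apple’ itself, this shows why simple word matching alone cannot guarantee relevant results.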
SEO Quality
As we alluded to before, a search engine works with millions of different websites simultaneously. With all of them relying on the same system for their exposure, vague judgements like ‘well-made’ or ‘nice’ aren’t good enough; there has to be a more logical and concrete method of determining effectiveness.
Over the years Google, for instance, has developed its algorithm to recognise certain aspects of websites for this purpose. It works more by exclusion than inclusion, meaning the algorithm looks at what websites shouldn’t do rather than what they should do.
The central front in Google’s quality control is a type of SEO practice called blackhat techniques. In the computer world, ‘blackhat’ is the name given to hackers who break into systems for nothing beyond personal gain, and in SEO they are no different. These manipulative techniques are employed to game the search engine’s system and achieve a high ranking regardless of the damage done. Blackhat techniques are the internet equivalent of spam, which has earned them the term ‘web spam’. SEO is about using the various aspects mentioned above to make a website work efficiently with a search engine; blackhatting uses the same elements, but to an extent that creates an imbalance and sacrifices a good browsing experience for self-serving reasons.
Here are some of the blackhat techniques to watch out for:
While link building is an acceptable and efficient practice in SEO, it can also be overdone. Used as a blackhat technique, link building means using links in a way that promotes a business more than it offers valuable information to the user; essentially, cramming as many links as possible into a page’s content, relevant or not.
Like links, keywords can also be used in a nefarious manner. Blackhat techniques stuff the same keywords into a page’s content wherever they can, rather than letting the words appear naturally. This results in content that reads poorly and loses the flow of a better-written piece, and users will often pick up on the repeated words, potentially feeling manipulated.
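A crude way to see keyword stuffing in numbers is keyword density: the fraction of a page’s words that are one keyword. This is only an illustrative heuristic; real engines use far more sophisticated signals, and no single threshold is official.

```python
import re

def keyword_density(text, keyword):
    """Fraction of word tokens in `text` equal to `keyword`, case-insensitive."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed snippet: "buy" is 3 of 5 words, a density of 0.6
# keyword_density("buy oranges buy oranges buy", "buy") -> 0.6
```

Natural prose keeps any single keyword’s density low; a page where one term dominates like this reads as exactly the kind of stuffed, manipulative content users learn to distrust.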
Some blackhat techniques go beyond a single page. Doorway pages are multiple pages, or whole sites, created specifically to clog up the search results and funnel users to one page. These pages are made to be very similar, and each result takes the user to much the same place; they are tailored to match the search query rather than presenting users with a relevant, browsable hierarchy. Update on doorway pages.
This is another instance of blackhatting that works across multiple pages. Content creators sometimes use exactly the same text-based content on multiple sites. Although this is probably the least egregious form of blackhatting, it can still cause problems and appear lazy. Duplicate content is a particular issue when the sites all rank highly, as users will inevitably hit a level of redundancy in the information they find. Google’s guideline on duplicate content.
This is one of the more malicious forms of blackhatting. Some sites detect Google’s crawlers and serve them different content, so that what the crawler indexes differs from what the link in the SERP actually delivers. A nastier cousin of doorway pages, at best cloaking is used to send users to a site to gain clicks; at worst it can bring people to content that has nothing to do with their search. For example, there have been many instances of users finding pornographic content through non-pornographic searches. Google’s guideline on cloaking.
Google’s Algorithmic Updates
So now Google knew what to look out for. But what did it do about it?
Google’s mission statement is to create an enjoyable browsing experience, and blackhatting is in direct opposition to that goal. While Google has managed to provide the service that earned it its emblematic status, when you create a place as prominent and popular as Google – especially when there is money to be made – there are always going to be some bad seeds sitting in the corner or running amok in their dark headwear. So what do you do when you want better control of your population? You hire security.
Over time Google has improved its engine with a number of updates aimed specifically at blackhat and other manipulative techniques. These are commonly known as Google penalties:
Google Panda was introduced in February 2011. Panda was the first major update to Google’s algorithm that dealt with quality control. Panda focused specifically on a website’s content, aiming to reduce low-quality sites and return better sites to the top of the rankings. Panda recognised the issue as stemming from the site’s overall approach, so penalties affect a whole site, rather than individual pages.
Later, as SEO abusers became more crafty, Google introduced another penalty. Released in April 2012, Google Penguin was implemented as a sort of spam blocker: it looked at the amount of blackhat and web-spam techniques used on a website and flagged the site if there were too many. Penguin is slightly more lenient in that it only penalises the specific pages it has a problem with.
See Google’s latest Fred algorithm update, with possible recovery mechanisms, here.
Search Engine Rankings
Although these systems eliminated a significant amount of low-quality content, there are still plenty of websites remaining. With all of these sites competing for the same users, Google had to find a fair and logical method of ordering websites according to how well they perform with its engine. Using its methods for determining quality, Google created a ranking system: Google rankings. These rankings dictate where your site appears in the SERPs. If it is ranked well, it will be near the top; if it is ranked poorly, it will sit so low that it is unlikely ever to be seen.
Good SEO Practices & Ethical SEO
While it may seem easy enough to get ranked highly by just using as many of the good methods as possible and avoiding the bad ones, it’s not as black and white as that.
SEO plays an important part in the search experience. As such, there is a bit of a grey area in what are considered good SEO practices, and what is more manipulative.
Just like anything where human experience is involved, there are ethics to consider. For instance, there are definitely ways of using SEO that may be lucrative in the short term and don’t set off any of the search engine’s alerts, but still aren’t very pleasant for the user. These sites may even be ranked highly, but are clearly just designed in a way that ticks the boxes, instead of providing an enjoyable and productive user experience. That’s why it’s up to SEO companies to implement their optimisation in a way that is good for the wellbeing of the website, as well as the people visiting them.
We can help with positively improving your SEO or taking care of your entire digital strategy. You can look at our SEO services here.