Search Engine Basics – Climb to the top of Google

Demand for search keeps rising because people can obtain information far more quickly than in the past, when learning something new meant a trip to the nearest library. We live in a digital era in which most shopping, banking, and other transactions are conducted online. For a website to count nowadays, it needs to be visible on the internet.

What is organic SEO and why is it a challenge for marketers?

Organic (natural) SEO refers to appearing at the top of Google in the unpaid, algorithm-driven results. A good SEO plan starts with getting to know the audience behind a particular brand, their interests and needs. What keywords (shorter or longer search phrases) are they likely to type? What is their intent? We differentiate between navigational, informational, and transactional queries.

Navigational queries: Here users have a particular website in mind, and if you are not that website, they will skip you. Branded searches of this kind tend to drive traffic that is highly valuable and likely to convert.

Informational queries: These are the hardest to target because their scope is usually broad. Informational queries can lead to a conversion, but they do not have to. These are the questions users type when they want to learn as much as they can about something that intrigues them. For such queries, search engines usually display blogs, how-to videos, and other guides in the results.

Transactional queries: These queries do not necessarily involve an immediate transaction, but they may lead to one. Examples include signing up for a free trial account, searching for specific products to buy, or looking for restaurants, hotels, flights, and similar.

Summarising, the biggest challenge for marketers is to make sure that the content on a website is optimised to match the exact needs of that website’s specific target audience.

What are SERPs? (Search Engine Results Pages)

Search engine results pages are the pages that search engines return to fulfil a query. Google, Yahoo, and Bing have one specific goal in mind: to deliver the most accurate results to their users. The layout of these search engines differs slightly, but they all include the following sections:

Vertical navigation – a navigation menu of sorts where users can choose how they want to view the search results (images, news, videos, or maps).

Search query box – this is where users type their queries.

Result information – the metadata shown for each result (the short descriptive text underneath the URL).

PPC advertising – Google AdWords or Bing adverts.

Natural/organic/algorithmic results – the search results pulled from the search engine’s index and ranked by its complex algorithms.

Apart from the above ‘standard’ results, search engines also show vertical results or instant answers. I have already written a blog post that talks more about structured data and featured snippets. Getting placed in a vertical result can be a significant win for an SEO practitioner.

Google’s Knowledge Graph & Google Knowledge Graph Card/Panel – what’s the difference?

Google’s Knowledge Graph is a system that Google launched in 2012 to expand the horizons of its search results. The technology discovers the connections between different people, places, and things and displays the results as entities alongside Google’s traditional search results. Google’s Knowledge Graph card or panel is the information box placed on the right-hand side of the results after you google a specific search term. Google can understand, for example, that Marie Sklodowska-Curie (a person) was a physicist born in Warsaw (a place) who was married to Pierre Curie (a relation) and gained her fame by discovering radium and polonium (things). In the ‘People also search for’ section, Google will show people related to Marie Sklodowska-Curie, such as her husband, her children, or other physicists. This shows that Google can understand further connections between people as well as things. It can see those connections thanks to its factual database, which holds information about, and relationships between, millions of entities (entities being people, places, and things). Type ‘Marie Sklodowska-Curie’ into Google and check it yourself.

Crawling, Indexing and Ranking

Search engines have one crucial job: to provide the best and most accurate results to their users. Two factors play a significant role in determining whether a website will show up in SERPs: relevance and importance. Relevance is a measure of how well the content matches the intent of a user, whereas importance refers to the authority the website has earned online. If both are high, the chances of ranking increase significantly.

How do search engines find new websites? – Crawling

Search engines (Google, Yahoo, Bing, etc.) run automated software programs called crawlers or spiders that scan through trillions of documents (pages and files) on the web in search of the content most relevant to specific queries.

Please note that other countries use different search engines, and the way Google or Bing operate does not necessarily apply to them. Some examples of popular search engines used in other countries are Yandex (Russia), Baidu (China), Seznam (Czech Republic), and Naver (South Korea).

Going back to the primary question: the automated program (bot) starts by crawling high-quality seed websites already stored in the search engine’s index (its database). It follows the links on each page of those websites to look for new web pages. What does that mean? It is simple – your website can be found through links from other sites on the internet pointing to it. Do you want to speed up the discovery process? If you want Google to find your website faster, you can add it to Google Search Console, which queues your website for the next crawl.

There are some limitations to what search engines can see on a website. They may have difficulties reading images, videos, some JavaScript code, or iFrames. This does not mean they will continue to have those problems in the future, as updates take place regularly.

A few words on the robots.txt file and sitemaps

What is a robots.txt file and why is it important for SEO? Here’s the answer. Before crawling any website, the automated program looks for instructions. Those instructions live in the robots.txt file. Thanks to it, the bot knows which pages it can crawl and which pages it has no access to. I highly recommend checking Neil Patel’s blog post, which tells you more about how to best leverage the robots.txt file for SEO.
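
For illustration, here is a minimal sketch of what a robots.txt file can look like; the disallowed paths and the sitemap location are hypothetical and would need to match your own site.

# Hypothetical robots.txt – adjust the paths to your own site
User-agent: *
Disallow: /private/
Disallow: /checkout/
Sitemap: https://www.yourdomainname.co.uk/sitemap.xml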

What is a sitemap?

In short, a sitemap is a hierarchical collection of all the pages on a website. It is very useful for an automated bot because it helps it to discover all those pages faster. The sitemap should include only the important pages on the website to avoid wasting crawl budget. Check out Backlinko’s blog post if you want to learn more about crawl budget.
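
As a rough sketch, an XML sitemap is just a list of <url> entries; the URLs and dates below are made-up placeholders.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomainname.co.uk/</loc>
    <lastmod>2019-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.yourdomainname.co.uk/blog/</loc>
    <lastmod>2019-01-10</lastmod>
  </url>
</urlset>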

Does crawling guarantee indexing?

Knowing that a website has been crawled is only one-third of the success. I strongly believe this is where technical SEO plays a crucial role. Technical SEO is the foundation for all further work such as content creation, promotion, and link building. After visiting the website, the automated program will check for all sorts of things: the content of each page, navigation links, the value and length of the content, videos, images, and more. Although limited in its ability, it will try to identify whether the particular website is a good match for a given query.

Below are some questions that you can run through to see if an automated bot is likely to consider your website as a valuable source of information:

Is my website architecture clear?
Are the URLs on my website readable?
Are my images optimised? Do they have an alt tag?
Is my code clean and readable to search engines? 
What is my website about? Do I write blogs that are relevant to my business? Are they helpful to users? 
Is my content high quality? Do I have decent grammar?
Do I have internal links on the website for better navigation and user experience?
Is my content unique?
Do I use headings and subheadings to provide better structure to my blogs?

Ranking – the final part

Many ranking factors dictate whether a website will appear in search results. Mobile-friendliness, loading speed, domain age, technical SEO, links, and optimised content are among the most important indicators of success. There are plenty of online resources that can help you start off with SEO. You do not necessarily need to have all of those things right to appear in SERPs. Sometimes a few changes, such as implementing structured data, writing about a unique but highly demanded subject, or simply improving the overall user experience, can bring unexpectedly positive results.

CONCLUSION:

We live in a digital era where access to knowledge and information is easier and faster than ever before. Entrepreneurs need to exist online if they want to be noticed by their customers, and that means a website that is well built and optimised for SEO. Putting a website live is not enough. SEO is a key component of long-term success, and it is not a one-time project or job. It is a continuous effort that requires knowledge, practice, and monitoring. This blog post was inspired by the ‘Search Engine Basics’ chapter of ‘The Art of SEO: Mastering Search Engine Optimization’ by Eric Enge, Stephan Spencer, and Jessie C. Stricchiola (3rd edition).

Basic Guide to Structured Data for SEO

This article is a simple introductory guide to structured data. It will help you understand what structured data is and when you can implement it on your website.

There is a lot of data on the Internet, presented in various formats. Your website can have pages that display text along with pictures, Excel tables, videos, and more. Your pages are likely to link either internally to another page on your website or externally to another source on the Internet. This is easy for humans to follow but can be a struggle for an automated Google bot, which does not fully understand what a picture displays, what the links are, or where they are pointing. That is why developers came up with the idea of structured data: they package up elements of the page so that the Google bot can understand the meaning of that page the way a human does.

Types of structured data

The above describes what structured data is and why you should implement it on some pages of your website. It does not say what the best way is to express this information to Google bots, though. The most common ways of presenting data to Google bots are three popular formats. These are:

JSON-LD

JSON-LD is the newest format of structured data and the one most recommended by Google. The abbreviation stands for JavaScript Object Notation for Linked Data. It is relatively easy to implement because it does not need to wrap certain elements on the website (like Microdata or RDFa do). JSON-LD is a script that gets placed in the <head> (preferable) or <body> section of your HTML document. It always starts with the following script tag: <script type="application/ld+json">. The script tag tells the browser that the block contains JSON-LD data. Next, a very important element is "@context": "http://schema.org", which tells Google what vocabulary the JSON-LD uses and where the guide for that vocabulary can be found. While writing your JSON-LD script, do not forget about commas and brackets, as their role is very important! Commas indicate that there is more to analyse, and the curly braces enable parsing (parsing is the act of reading/processing the document).
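
Here is a minimal sketch of a JSON-LD block describing an organisation; the name, URL, and logo path are hypothetical placeholders you would replace with your own details.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Your Company Name",
  "url": "https://www.yourdomainname.co.uk/",
  "logo": "https://www.yourdomainname.co.uk/images/logo.png"
}
</script>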

MICRODATA

Microdata is an inline markup syntax that uses HTML attributes to label certain elements of the page. In simple words, Microdata uses property–value pairs to describe content in an HTML document. See the example below:

<!DOCTYPE html>
<html>
<head>
</head>
<body>
<p itemscope itemtype="http://schema.org/Person"> My name is <span itemprop="name">John</span> </p>
</body>
</html>

Thanks to the above markup (the itemscope, itemtype, and itemprop attributes), the Google bot will understand that "John" is in fact the name of a person.

RDFa

RDFa and Microdata use a similar syntax. However, RDFa is more complex and harder to implement. You can read more about the main differences between RDFa and Microdata here.

A full list of data that you can mark up with schema can be found on the schema.org website.

Clear structure vs structured data

There has been a debate about whether structured data increases the chances of appearing in Google’s featured snippets. Featured snippets are Google’s way of providing quick answers to users right at the top of Google’s first page, without the need to click through or search further for specific results.

Types of featured snippets

Text snippet (short text box providing a quick answer to your query)

Video snippet (videos pulled from YouTube)

Numbered/bulleted list and table snippet (step-by-step lists, rankings, or tables presenting information such as data, numbers, or prices)

To avoid any confusion, I have also decided to list some of the search results that are not featured snippets, as some may mistakenly consider them as such.

These are:

Rich answer (short factual answers)

Knowledge graph (info box next to the search results)

Rich snippet (rating stars, pricing information, photos, reviews etc.)

There is no ultimate answer to whether structured data makes it easier to show up in Google’s featured snippets, but, as John Mueller said, a clear structure helps a lot. A clear structure means the main theme of your website is understandable, the navigation is easy and straightforward, there are no orphaned pages, and both internal links and heading tags are implemented correctly.

How to use structured data?

The topic of structured data can feel a bit overwhelming at first, but it is not that hard to understand and implement once you become more familiar with it. I will now show you an example of how you can create structured data for your website quickly and easily. I highly recommend going to the Steal Our JSON-LD website, where you can find some of the most popular examples of structured data. It is a legitimate source that many webmasters and SEOs use. Once you are on the website, pick the type of structured data you are most interested in (on the left-hand side). Next, tweak the information in the script so that it matches the content on your page. You can then test your script using the Structured Data Testing Tool. If there are no errors, you are ready to add the script to your website.

How to add structured data to your website?

Depending on what type of CMS you use, you can do it either manually or with a plugin. For WordPress websites, I would recommend implementing the script manually. Go to your WordPress dashboard and navigate to the menu bar on the left-hand side. Click “Appearance” and then “Theme Editor”.


Now, locate your “header.php” file and place the script just before the closing </head> tag. Alternatively, go to the “footer.php” file and place the script before the closing </body> tag.
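
As a rough sketch of the manual approach (the markup below is a simplified, hypothetical theme header, not your actual file), the script sits just before the closing </head> tag:

<!-- end of a simplified, hypothetical header.php -->
<script type="application/ld+json">
{ "@context": "http://schema.org", "@type": "Organization", "name": "Your Company Name" }
</script>
</head>
<!-- the rest of the theme continues below -->
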
If you do not feel comfortable with the above solution, I recommend installing the “Insert Headers and Footers” plugin. The plugin makes it easy to insert header and footer scripts without modifying your theme files!

If you are still not sure or not fully comfortable with any of the above solutions, you can try some of the plugins available for schema structured data implementation. They will do all the work for you automatically. One very good example of such a plugin is WP SEO Structured Data Schema.

For CMS systems other than WordPress, the implementation can be very different. It can be worth checking out some guides or speaking directly to web developers who may be able to offer more help. A very good reference point for any technical issues or queries is Stack Overflow.

CONCLUSION:

Structured data enables Google bots to understand the content of your pages better. Although there is no clear answer as to whether structured data is a determining ranking factor, it can contribute to increasing your visibility in SERPs. It is simple: Google serves its users by giving the most accurate answer to a query, and the better it understands your website or a page, the higher your chances of showing up for that query. Remember that any schema structured data you implement must accurately reflect the content your website displays. More information on quality guidelines can be found here.

What Is SEO Siloing And Why Is It Important?

SEO siloing is organising your website’s content into categories and subcategories. It is better not only for users but also for search crawlers. They discover connections between related pages and therefore better understand how your content is grouped and what it is all about. A search engine’s job is to determine your website’s main topics and see whether the site can serve as a credible and exhaustive source of information for different users around the world. We differentiate between physical siloing and virtual siloing. Here’s what each is in more detail.

Physical Siloing vs Virtual Siloing

Physical siloing is when you organise the themes on your website into folders and subfolders. Think of it like a folder that holds a bunch of related documents. Let’s say you have a website on beauty products. Your main themes could be hair products, makeup, and skin treatment. Hair products can be a very broad term, so let’s assume your business offers styling and hair care products. If you wanted to create a silo structure for that particular theme, it would look somewhat like this:

Hair Products Page
www.yourdomainname.co.uk/hair-products/
www.yourdomainname.co.uk/hair-products/shampoo/
www.yourdomainname.co.uk/hair-products/conditioner/
www.yourdomainname.co.uk/hair-products/treatments/
www.yourdomainname.co.uk/hair-products/masks/

Hair Styling Page
www.yourdomainname.co.uk/hair-styling/sprays/
www.yourdomainname.co.uk/hair-styling/waxes/
www.yourdomainname.co.uk/hair-styling/oils/
www.yourdomainname.co.uk/hair-styling/gels/
www.yourdomainname.co.uk/hair-styling/powders/

The above is an example of well-organised content that is clear to both visitors and search crawlers. This strategy needs to be applied to every theme on your website. What is virtual siloing, then, and how does it differ from physical siloing? Virtual siloing is based on an internal linking strategy: you include internal links to other pages within the same theme so that Google knows they are related. It is highly advisable to combine physical siloing with virtual siloing for even better SEO results.
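
For illustration only (the URLs and anchor texts are hypothetical), a paragraph on the shampoo page might link to sibling pages within the same hair-products silo like this:

<p>
  Pair your shampoo with a matching
  <a href="https://www.yourdomainname.co.uk/hair-products/conditioner/">conditioner</a>
  or a weekly
  <a href="https://www.yourdomainname.co.uk/hair-products/masks/">hair mask</a>.
</p>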

The importance of keywords

Keyword research is an essential part of your website siloing, and here’s why. When people want to find specific information on any topic of interest, they go to Google. They type in a search query and, when they hit enter, they are presented with a few paid links at the top of the first page of Google and a number of organic results. These URLs are the best-ranking pages for the particular query they typed in. This is why it is so important to look for the search terms that bring people to your site. You may also want to determine what is currently ranking well for your site and whether it is relevant to what your business offers. If a keyword is relevant to your business, strengthen it by building silos – in other words, group or link other pages with related content to the particular service page that already ranks high.

URLs, files, directories and sitemaps

When designing URLs, it is good to remember a few things. URLs should not be too long, because very long URLs can be flagged by robots as spammy; they should have a clear structure and be relatively short. Also, individual pages should be placed in a subfolder rather than in the root directory (website siloing). As stated above, this helps humans and robots to better understand the whole site structure. Finally, a sitemap is an organised list of all, or the most important, pages on your website. There are two types of sitemaps: HTML and XML. An HTML sitemap is coded in HTML and lists all the pages of a smaller website, or the high-level pages and categories of a larger website. The XML sitemap is useful only to the robots that crawl your website. All URLs listed in an XML sitemap should be valid pages that contain high-quality, original content.
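
As a minimal sketch (the page names and paths are hypothetical), an HTML sitemap is essentially a nested list of links that mirrors the silo structure:

<ul>
  <li><a href="/hair-products/">Hair Products</a>
    <ul>
      <li><a href="/hair-products/shampoo/">Shampoo</a></li>
      <li><a href="/hair-products/conditioner/">Conditioner</a></li>
    </ul>
  </li>
  <li><a href="/hair-styling/">Hair Styling</a></li>
</ul>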

CONCLUSION:

SEO siloing is grouping relevant pages together structurally and through internal linking. Its purpose is to establish the website’s main themes and make them clear to both visitors and search engines. Ideally, you should plan your website’s architecture before designing the website, but that is not always possible. You can still improve your rankings by siloing your website’s content. Start with keyword research: study your competitors and find the most common search queries for your type of business. Next, try to group the content of your website into themes, and design the link structure based on the keyword research for your website’s content. Good practice is also to go for keywords you have a realistic chance of ranking for. And, finally, implement the silo. In WordPress, this is usually done by setting up parent and child pages and creating relevant categories and subcategories that group related pages together. All in all, a clean, hierarchical website structure, along with other SEO strategies, can significantly boost your rankings in search engines.

Basic Guide To Search Engine Optimization

Search Engine Optimisation (SEO) covers all the steps you take on and off your website to gain more exposure in search engine results. It is a long-term process that, if done well, brings long-term value. Your chances of a sale increase when more people visit your site for the right reasons. Search engines’ job is to deliver quick and accurate results to their users. Gaining search engines’ credibility is not an easy or quick job. It takes time and involves analysis, keyword research, content creation, link building, and resolving technical issues.

Keyword research

Before you optimise a website, you should specify what your keywords are. Keywords are the terms that people type into search engines. Let’s say you have a business that sells handmade cards. What should your keyword be, then? The keyword “handmade cards” is highly competitive for a company that is just starting to promote itself. What about “buy personalised yellow handmade cards”? It may be less popular, but it is relevant to your business and can be beneficial at the start. If you want to find out which keywords would be best for your business, you need to do the research. Start by listing all the services you offer. You may want to come up with a long-tail keyword such as “buy personalised holiday homemade cards.” The point is that you may find many long-tail keywords that bring far better results than the keyword “homemade cards” would from the start. Some tools can help you with keyword research: Google AdWords Keyword Planner (a free tool) and Google Trends. Before you choose to use any of the keywords you have found, you should consider three things: relevance, competition, and search volume.

Web pages – search engines versus people

How do search engines and people view web pages? Search engines crawl the entire website to find out what is there and how it is organised. Make sure that your website structure is clear and friendly to both people and crawlers. Ideally, you should make a list of keywords for every page on your website. Next, you write content and include those keywords in the text for that particular page. Your keywords should also appear in the URL, meta title, meta description, and subheadings of your content. One recommended tool for thorough SEO analysis is the Moz SEO toolset. It is a paid tool, but there is a thirty-day trial option available for you to get familiar with it and see whether it suits your needs. It is worth mentioning that user-generated content brings many benefits to your website: comments, guest blog posts, votes, infographics, product reviews, and more.
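
As a simple illustration (the keyword and wording are hypothetical), putting the target keyword into the meta title and meta description could look like this:

<head>
  <!-- target keyword: "personalised handmade cards" -->
  <title>Personalised Handmade Cards | Your Company Name</title>
  <meta name="description" content="Buy personalised handmade cards for birthdays, weddings, and holidays, made to order.">
</head>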

Technical SEO

One of the crucial things when it comes to technical SEO is to make sure that your code is clean, meaning it does not contain coding errors. How do search engines discover new content on your website? They follow links. You can create an XML sitemap to make it easier for them. What if there are specific pages on your site that you do not want Google crawlers to see? There is a file called robots.txt that instructs web robots how to crawl and index a website, and you can exclude certain pages from crawling and indexing if you want to. Here are two resources where you can find credible information on sitemaps and the robots.txt file:

www.sitemaps.org
www.robotstxt.org

What is more, if you move your content from one page to another, you should implement a 301 redirect. It is a permanent redirect that tells search engines the content is no longer available under the old URL. Finally, the loading time of your website matters as well; it is one of the more important factors when it comes to SEO. How can you speed up the loading time of your site? You can consider using a CDN (Content Delivery Network) or leveraging browser caching. Google has also expressed a preference for secure sites, so you may want to buy an SSL certificate for your website.
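
As a hedged sketch, assuming an Apache server where .htaccess rules are honoured (other servers and CMSs handle this differently), a 301 redirect can be declared like this; the old and new paths are hypothetical:

# .htaccess – permanent (301) redirect from an old URL to its new home
Redirect 301 /old-page/ https://www.yourdomainname.co.uk/new-page/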

Link building strategies

Backlinks are still one of the most important SEO factors. What matters to Google is the number of links pointing to your website and the quality of those links. The quality of your links is assessed by their relevance to your page content. Additionally, the anchor text should ideally contain a phrase or a keyword; it tells Google crawlers (and users) what to expect behind that link. Anchor texts such as “Click here” or “Link” do not carry much value. Search engines also expect consistency. If you received a considerable number of links in the past few years but are not gaining any links now, it can signal to search engines that you hired someone to get those links. As a consequence, your site will not look very credible or authoritative in the eyes of Google or any other search engine.

CONCLUSION:

Start with keyword research for your website. Write down all the services you offer and create a spreadsheet listing all your website’s pages and the keywords you are trying to rank for on each of those pages. Next, evaluate the relevance of the keywords you want to use and their popularity. Write SEO-friendly content and do your on-site SEO. Once you have all that in place, promote your website through external advertising, social media, and link building. Find link building opportunities by analysing your competitors. Consider guest blogging and adding a review section to your website where customers can leave feedback on the products or services they received from you.

How To Create SEO Friendly Content?

Creating SEO-friendly content can be challenging. Content creation is part of search engine optimisation. Firstly, you need a quality piece of content; only then can you optimise it for SEO. If your content is uninteresting, grammatically incorrect, or duplicated, no SEO strategy will help it rank high in search engines. Good SEO takes time, and many things contribute to whether your website appears on the first pages of Google or not.

Answer questions

Create content that answers people’s questions. Do not write for the sake of writing. You have probably heard about forums such as Quora or Reddit – see what people are struggling with there and write your blog post in a way that brings a solution to an existing problem. Content marketing is varied and does not only apply to writing blog posts. People like to consume content in a variety of forms nowadays: infographics, interactive videos, ebooks, and podcasts. See what works best for you and your audience and deliver it consistently and to the best possible standard.

Keyword research

Let’s assume that you run a blog on your website and your business focuses on healthy hair products. First of all, consistency is key: if you create two blog posts per week, stick to that. Think of keywords before you start writing your next blog post. What is currently popular in the hair care industry? What do people search for the most? It can be two words or a phrase such as “organic products” or “healthy hair products”. Type similar words and phrases into the Google search bar and see how many results and suggestions they bring up. It will give you an idea of which words are most commonly used by people searching online. Once you have your keywords in place, start thinking about your content.

On-site SEO

When you write your blog post, make sure you use h1 and h2 tags and that you put your keywords inside them. The same applies to your main content. However, do not practise keyword stuffing: it is no longer accepted and is seen by Google as spam. Make sure your content is long enough. It should have more than three hundred words, ideally somewhere between seven hundred and one thousand; some claim that even more than one thousand words works best. Why are longer posts considered better? It is not easy to write a blog post of around two thousand words – it requires time and research, and therefore tends to be of better quality. Google wants to create the best possible experience for its users, so it will rank higher the content that provides the best information on a given subject.
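
For illustration (the keyword “healthy hair products” and the headings are hypothetical), a simple on-page structure with keywords in the h1 and h2 tags could look like this:

<h1>Healthy Hair Products: A Complete Guide</h1>
<p>Introduction that mentions healthy hair products naturally, without keyword stuffing.</p>
<h2>How to choose healthy hair products for your hair type</h2>
<p>Supporting content for this subheading...</p>
<h2>Ingredients to avoid in hair products</h2>
<p>More supporting content...</p>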

Off-site SEO

At this stage, your content should be ready to publish. Off-site SEO refers to actions taken outside of your website; it is all about promoting the content on the Internet, and there are a couple of ways you can do that. For off-site SEO, backlinks are the most important. There are three types of backlinks: natural, manually built, and self-created. Natural links occur when a blogger or another website owner links to your blog without any request. Manually built links refer to situations where you directly ask your customers or influencers to share your blog post on their social channels. Self-created links are when you submit your website to online business directories. The last type is very often associated with black-hat SEO techniques: placing a new site in a few directories can be beneficial, but mass submission can hurt your rankings.

CONCLUSION:

Creating SEO-friendly content takes time. You need to come up with the right strategy first. Start by doing keyword research. Next, collect as much information as possible on your chosen topic – remember to check more than a few sources and compare results. Once you have done that, write your content and structure it well. Put your keywords in h1 and h2 tags as well as in the text itself, but do not overdo it: keyword stuffing is a bad practice and is perceived as spam by Google. Proofread your content and, when you are ready, publish it. Finally, use social channels and other advertising tools to promote your content across the web.

What is Screaming Frog SEO Spider?

Screaming Frog SEO Spider is a tool that you can use to improve your on-site SEO. It is software that analyses all the links on your website and shows you where you have errors. This applies to meta descriptions, h1 and h2 tags, images, page titles, broken links, and more. The free version has a 500-URL crawl limit, so if your website is larger than that, you may want to upgrade to the paid version. Let’s say you have a site that contains ten subpages. Screaming Frog SEO Spider crawls your images, external and internal links, canonicals, CSS, JavaScript, and more, which will result in far more than ten crawled URLs.

On-Site SEO

Let’s start with on-site SEO. Screaming Frog SEO Spider lets you check whether your page titles (the title tag that appears at the top of a browser window) are over 65 characters long, whether you have duplicates, and whether any page has multiple title tags. Next, it shows you how many duplicate meta descriptions you have and how long they are, so that you can make changes if required. It checks the same for your h1 and h2 tags. Another great thing about the tool is that it helps you generate an XML sitemap for your website. The XML sitemap is a document that helps Google crawlers understand how many pages there are on your site. For WordPress blogs, you can also create an XML sitemap using the Yoast SEO plugin.

Images

If your images are too big, they may slow down the loading time of your website, and slow sites do not rank high in Google. Screaming Frog shows you which images need to be resized. Resizing images in the CMS dashboard to the desired display dimensions will not help on its own: their actual file size stays the same, so they are downloaded at full size first and then scaled by the browser to the dimensions you set in the dashboard (if you are using WordPress or another CMS). All images should also have an alt attribute (often called an alt tag). The alt tag is important for search engines because it helps them understand what an image displays.
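
As a quick sketch (the file name and alt text are hypothetical), an image with a descriptive alt attribute and explicit dimensions looks like this:

<img src="/images/argan-oil-shampoo.jpg" alt="Bottle of argan oil shampoo for dry hair" width="600" height="400">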

Broken links and Canonicals

A broken link is a link on a web page that does not work, either because the URL is incorrect or because the destination page no longer exists. Screaming Frog SEO Spider lets you check for broken links on your website. You can also generate reports in CSV format so that you can work in Excel if you prefer. A canonical URL tells search engines which URL is the original one. It is very helpful when you have several pages with the same content (useful for e-commerce websites with multiple pages displaying a product category in different colours, for example). You set the canonical URL on the subpage of your choice and point the other subpages (with the same content) to that one, which helps to avoid duplicate content issues. Screaming Frog SEO Spider checks your website for errors in your canonical links.
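
For illustration (the URLs are hypothetical), a colour-variant page can point to the main product page with a canonical tag placed in its <head>:

<!-- on the duplicate page, e.g. a blue-colour variant of the same product -->
<link rel="canonical" href="https://www.yourdomainname.co.uk/hair-styling/waxes/">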

Response Codes

Whenever you type a website address and press enter, your browser makes a request to a server. The server is another machine (just like your computer) that is connected to the Internet. When you make a request, you get a response from the server – it can be anything from an image to a CSS file, JavaScript, and so on. The browser knows what the server is sending by the content type (for example, an HTML document). There are several groups of status codes you can get when you receive a response from a server: 2xx codes mean everything is okay, 3xx codes relate to redirects, 4xx codes indicate client-side errors (such as 404 Not Found), and 5xx codes refer to server errors. Screaming Frog SEO Spider shows you if there are redirects or client-side errors on your website.
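
If you want to check a single URL by hand, one way (assuming the curl command is available on your machine; the URL is hypothetical) is to request just the response headers from the terminal:

curl -I https://www.yourdomainname.co.uk/hair-products/
# the first line of the output shows the status, for example "HTTP/1.1 200 OK";
# a missing page would return "HTTP/1.1 404 Not Found" instead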

CONCLUSION:

Screaming Frog SEO Spider is a great tool for on-site SEO. It will crawl your website and show you where you can introduce changes. You can filter data by categories such as duplicates, character limits, missing descriptions, and more. The tool will check your meta descriptions, h1 and h2 tags, meta keywords, page titles, internal and external links, and more. You have the option to open listed links in a browser as well as export reports on individual data (canonicals, error codes, images missing alt tags, internal and external links, and other). You can use the tool to create an XML sitemap, which helps search engines better understand how your site is organised; you then upload the sitemap to Google Search Console. There is also an option to integrate Screaming Frog SEO Spider with your Google Analytics account. All in all, it is a great tool for competitor analysis, improving your own website, or both.

Duplicate Content Problem Solved – Technical Issue

There are two types of duplicate content: external and internal. External duplicate content appears when you copy someone else’s work and put it on your website as your own. Internal duplicate content is when you have the same piece of information on several pages of your site. In both cases, it can decrease your website’s rankings in Google searches. There is a myth that Google penalises duplicate content. Let’s make it clear – no, it does not. Google realises that there is, and will be, duplicate content around the web. Matt Cutts, an American software engineer who led Google’s webspam team, said that somewhere between 25% and 30% of the content on the Internet is duplicate and that this is okay. Sometimes companies post updated versions of their terms and conditions, or they have the website translated into another language. Is that treated as a duplicate content issue? No, it is not. Matt Cutts explained that Google takes duplicate content, groups it into clusters, and then shows the best result from each cluster. Google does not penalise duplicate content, but it can decrease your rankings if you violate the rules, for example if you continually copy someone else’s work or ideas.

How can you check if your website has duplicate content?

Plenty of tools on the Internet let you check whether your website has duplicate content. Some of the best are Copyscape Plagiarism Checker and Siteliner. The first will tell you if there is content similar to yours on other websites, and the second will identify whether you have internal duplicate content on your site. The nice thing about these tools is that they are free; if you need a more in-depth analysis, you can upgrade to a premium account. Siteliner is simple to use: go to their website, type in your website address, and press enter. The site will show you how much internal duplicate content you have, where on the site you have broken links, what your average page load time is, and more. Suppose you run the SEO audit and it shows that you have a significant amount of duplicate content. Let’s say you are a blogger and every post you create is unique – where does the duplicate content come from, then? The answer is in the next paragraph.

Identifying the issue with duplicate content

Many CMS systems such as WordPress, Magento, or Joomla can create internal duplicate content issues. Here’s the thing. Do you have categories on your blog? Do you use tags? Is there a recent posts section, or maybe you have a blog page with short snippets redirecting people to the actual blog posts? Whenever you create a post, a link is assigned to it. Then you attach the post to a certain category – this is where another link is created. Next, you add tags to your post so your readers can identify exactly what it is about. You may also think tags will help with SEO; well, that depends on how you use them. Let’s say you run a blog about ageing and dementia. The categories for your posts could be, for example, older age, Alzheimer’s and dementia, and disability. Then, if you write a blog post and place it in the disability category, you might use one of the following tags: mental illness, autism, vision impairment, or brain injury. In other words, categories help you sort the content on your website, and tags explain the nature of your blog post even more specifically. Every single one of these elements creates an additional link. See below:

www.yourwebsiteaddress/what-is-disability/
www.yourwebsiteaddress/category/disability/
www.yourwebsiteaddress/tag/autism/

You created one blog post that is accessible via three links. It is unique to a human being, but not to Google crawlers: they treat each link separately and therefore think that two of these three links are duplicates. What can you do?

Duplicate content solutions

If several URLs lead to the same content, it can be solved. The answer is a canonical URL. A canonical URL tells Google crawlers which URL is the original one. Once you have that set up, Google will attribute all other URLs to the original one, and no duplicate content issue will appear on your website. It is also a good idea to noindex your tag pages if you run a blog or website on WordPress. As previously mentioned, tags create additional URLs that are then treated as separate pages with “unique” content – content that obviously already exists under the original URL of your blog post. That is when the duplicate issue arises. There is a great WordPress plugin called Yoast SEO; its taxonomy settings allow you to noindex your tags. Also, remember that title tags and meta descriptions matter to Google and should be written individually for every page. Finally, when you log in to Google Search Console, you will be able to see how much duplicate content you have.
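
As a hedged illustration (the tag archive below is hypothetical), a noindexed tag page carries a robots meta tag like this in its <head>; plugins such as Yoast SEO can add it for you:

<!-- on www.yourwebsiteaddress/tag/autism/ -->
<meta name="robots" content="noindex, follow">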

CONCLUSION:

Ideally, all pages on your website should be unique. You should have individual title tags and meta descriptions for every page. Your posts read better if you divide them into smaller sections and include subheadings. Duplicate content cannot be avoided in many cases; the perfect example is an e-commerce website where an individual product page can be reached under many categories. Google understands that there is duplicate content around the web and does not penalise it. It can, however, impact your rankings if you abuse the general rules and become spammy, for example by copying large pieces of information from other sources. A good solution for on-site SEO work and duplicate content prevention on WordPress websites and blogs is the Yoast SEO plugin. The plugin is free, and the Yoast website offers great articles on how to configure it. Did you come across duplicate content on your site? If so, how did you solve the problem?

How To Get a Top Google Position in 2017? – SEO Guide

The article is based on Backlinko’s ranking factors list.

Google’s ranking algorithm is made up of over 200 components. There is no need to have everything right to reach the top position in Google search. However, there are some things that everyone should get done.

There is no quick and easy way to optimise your site. Google’s trust is earned through consistent and valuable work. There is a division between white hat and black hat SEO. Some examples of bad SEO practices are hidden links, sneaky redirects, automated queries to Google, subdomains or domains with duplicate content, and malicious pages. A long-lasting business is always built with the right amount of effort and consistency.

Google is visited by hundreds of millions of people per day, which makes it the most powerful source of traffic to your website. Being banned can badly affect your business, and there is no guarantee that you will ever be re-listed. Gaining Google’s trust is a tough job, but destroying it can happen very quickly.

White Hat SEO Techniques:

1. Backlinks

Backlinks to your website signal to search engines that others find your content unique, credible, and trustworthy. Earning backlinks is off-site SEO. Followed backlinks from authority websites are the most desirable ones, and they are not easy to earn. One way to get backlinks is to search for outdated links on reputable sites: once you find them, contact the owner, tell them that the link is out of date or broken, and gently suggest that they add your link instead. You may also consider writing articles for other websites so they can link back to your site or social profiles. There are plenty of companies that offer quick and effective backlink building – do not trust them. Their links mostly come from spammy or low-quality sites, and white hat SEO is always a better choice than black hat SEO.

2. Keywords and Updates

Keywords are important, but those in the meta keywords tag no longer are. Meta titles and meta descriptions matter. Google now looks for phrases rather than single words. It is good to include keywords in your meta descriptions and in the articles that you post on your blog. Keywords should also appear in the domain name, in h1, h2, and h3 tags, and in image descriptions. Fresh, new content is an indicator that the site is up to date. The content should be of good quality and should not have too many affiliate links, as they can hurt your rankings.

3. Site Usability and Reputation

A site that is difficult to use or navigate will not rank highly in Google search. Site architecture is important, and content should be structured into smaller sections; bullet points and numbered lists may help. The length of the content also matters: generally, the longer the text, the better for SEO, and articles over 1,500 words are recommended. Reputation is an important factor for Google. Include About Us and Contact pages to make the site look more credible, and think about adding a sitemap for better navigation and visibility for search engines. Bear in mind that the layout of your site should always be user-friendly.

4. Social Media Presence

Social media presence is very important and has a huge impact on SEO, especially YouTube. Again, tweets coming from authority accounts are the most valuable ones, and the more tweets a page has, the better its influence on SEO. Facebook shares are stronger than Facebook likes because they are comparable to backlinks. Pinterest is not only a social platform; it also serves as a search engine, which makes it interesting for SEO, and pins of your own content can drive great referral traffic back to your site. Google Plus is a social media network owned by Google. Some people underestimate its importance, yet it is difficult to believe that Google would not consider its own social network for SEO. Also, Google places Google Plus local results higher than organic SERPs.

5. YouTube

YouTube is a traffic source with enormous potential. Some of the most important signals YouTube uses are title tag information, keywords in the description tag, tags, likes and dislikes, the number of subscribers, comments, and the length of the video. Longer video descriptions help you get your videos ranked higher, especially if you use the right keywords. Utilise online communities such as Quora or LinkedIn: provide an answer to a problem and suggest that people watch your video if its content may be helpful to them. Encourage subscribing and liking, and keep your playlists organised. Put a keyword in your playlist description to give deeper information about your video’s topic.

6. Other

Your domain name’s age matters to Google, and the domain’s history is also an important factor: it is good to check the credibility of a domain that was previously used by someone else. Poor code and duplicate content are a big no for SEO, and broken or affiliate links may hurt your site’s rankings too. It is good to have terms of service and a privacy page to show Google that you are a trustworthy member of the Internet. The website should have a responsive design and needs to provide good quality content; otherwise, you can get hit by a Panda algorithm penalty. Penguin is another Google algorithm, one that decreases the search rankings of pages using black-hat SEO techniques.

CONCLUSION:

SEO is constantly changing. In the mid-1990s, it was important to fill the meta keywords tag with as many words as possible to rank well. In 2004, placing links with anchor text was a key thing to leverage for traffic. In 2011, social media marketing was the main SEO method. In 2017, backlinks, general traffic, and reviews are the key factors for search engine optimisation.

The future is uncertain, but the world of SEO is one of constant change. Bear in mind that tactics that worked a few years back may now cause harm to your website.