Search Engine Basics – Climb to the top of Google

Demand for search keeps rising because people can find information far more quickly than in the past, when learning something new meant a trip to the nearest library. We live in a digital era where most shopping, banking, and other transactions happen online. For a website to matter today, it needs to be visible in search.

What is organic SEO and why is it a challenge for marketers?

Organic (natural) SEO refers to appearing in the top positions of Google’s unpaid, algorithm-driven results. A good SEO plan involves getting to know the audience behind a particular brand, their interests and needs. What keywords (shorter or longer search phrases) are they likely to type? What is their intent? We differentiate between navigational, informational, and transactional queries.

Navigational queries: Here users have a particular website in mind, and if you are not that website, they will skip you. Branded searches of this kind tend to drive the highest-value traffic, which is likely to convert.

Informational queries: These are the hardest to target because their scope is usually broad. Users type these queries intending to learn as much as they can about a topic that interests them; they can lead to a conversion, but often do not. For these searches, engines usually display blog posts, how-to videos, and other guides.

Transactional queries: These queries do not necessarily involve an immediate transaction, but they may lead to one. Examples include signing up for a free trial account, searching for a specific product to buy, or looking for restaurants, hotels, or flights.

In summary, the biggest challenge for marketers is to make sure that the content on a website is optimised to match the exact needs of its specific target audience.

What are SERPs (Search Engine Results Pages)?

Search engine results pages are the pages a search engine returns to answer a query. Google, Yahoo, and Bing share one specific goal: to deliver the most accurate results to their users. Their layouts differ slightly, but they all include the following sections:

Vertical navigation – a menu where users can choose how they want to view the search results (images, news, videos, or maps).

Search query box – this is where users type their queries.

Result information – the short descriptive text (metadata) displayed underneath each URL.

PPC advertising – paid adverts from Google AdWords or Bing Ads.

Natural/organic/algorithmic results – the unpaid results that search engines rank using their complex algorithms.

Apart from the above ‘standard’ results, search engines also show vertical results or instant answers. I have already written a blog post that talks more about structured data and featured snippets. Getting placed in a vertical result can be a significant win for an SEO practitioner.
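
As a small taste of structured data, here is a minimal, hypothetical JSON-LD snippet of the kind that can make a page eligible for rich results (the headline, author name, and date are placeholder values):

```html
<!-- Structured data describing an article; the values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Search Engine Basics",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-01-15"
}
</script>
```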

Google’s Knowledge Graph & Google Knowledge Graph Card/Panel – what’s the difference?

Google’s Knowledge Graph is a system that Google launched in 2012 to expand the horizons of search results. It discovers connections between different people, places, and things and displays those entities alongside Google’s traditional search results. The Knowledge Graph card or panel is the information box shown on the right-hand side after you google a specific search term.

Google can understand, for example, that Marie Sklodowska-Curie (a person) was a physicist born in Warsaw (a place) who was married to Pierre Curie (a relation) and gained her fame by discovering radium and polonium (things). In the ‘People also search for’ section, Google lists people related to Marie Sklodowska-Curie, such as her husband, her children, or other physicists. This shows that Google understands further connections between people as well as things. It can see those connections thanks to its factual database, which includes information about, and relationships between, millions of entities (entities being people, places, and things). Type ‘Marie Sklodowska-Curie’ into Google and check for yourself.

Crawling, Indexing and Ranking

Search engines have one crucial job: to provide the best and most accurate results to their users. Two factors play a significant role in determining whether a website will show up in SERPs: relevance and importance. Relevance measures how well the content matches a user’s intent, whereas importance refers to the authority the website has earned online. If both are high, the chances of ranking increase significantly.

How do search engines find new websites? – Crawling

Search engines (Google, Yahoo, Bing, etc.) run automated software programs, called crawlers or spiders, that scan trillions of documents (pages and files) on the web in search of the content most relevant to specific queries.

Please note that other countries favour different search engines, so the way Google or Bing operate does not necessarily apply there. Popular examples include Yandex (Russia), Baidu (China), Seznam (Czech Republic), and Naver (South Korea).

Going back to the primary question: the bot starts by crawling high-quality seed websites already stored in its index (the search engine’s database) and follows the links on each page of those websites to discover new ones. What does this mean in practice? Simply that your website can be found through links from other sites on the internet pointing to it. Do you want to speed up the discovery process? Adding your website to Google Search Console queues it for the next crawl.
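
To make the idea concrete, here is a minimal sketch of link-following in Python. This is not how Google actually implements crawling; it assumes the third-party requests and beautifulsoup4 packages, uses a hypothetical seed URL, and a real crawler would also honour robots.txt:

```python
# A toy breadth-first crawler: fetch a page, collect its links, repeat.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=50):
    queue = deque([seed_url])   # pages waiting to be fetched
    seen = {seed_url}           # every URL discovered so far
    while queue:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=5)
        except requests.RequestException:
            continue  # skip unreachable pages
        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])  # resolve relative links
            if link not in seen and len(seen) < max_pages:
                seen.add(link)       # a new page has been discovered
                queue.append(link)   # ...and queued for its own crawl
    return seen

print(crawl("https://example.com"))
```

This is exactly why inbound links matter: a page that no other page links to never enters the queue.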

There are some limitations to what search engines can see on a website. They may have difficulties reading images, videos, some JavaScript, or iframes. That does not mean they will have those problems forever, however, as updates take place regularly.

A few words on the robots.txt file and sitemaps

What is a robots.txt file and why is it important for SEO? Here’s an answer. Before crawling any website, the bot looks for instructions, and those instructions live in the robots.txt file. Thanks to it, the bot knows which pages it may crawl and which pages it cannot access. I highly recommend checking Neil Patel’s blog post, which tells you more about how to best leverage the robots.txt file for SEO.
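
For illustration, a minimal robots.txt might look like this (the disallowed path is a hypothetical example):

```
# robots.txt sits at the root of the domain, e.g. https://example.com/robots.txt
User-agent: *        # the rules below apply to all crawlers
Disallow: /admin/    # bots may not crawl this (hypothetical) area
Sitemap: https://example.com/sitemap.xml
```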

What is a sitemap?

In short, a sitemap is a hierarchical collection of all the pages on a website. It is very useful for the bot because it helps it discover those pages faster. The sitemap should include only the important pages on the website to avoid wasting the crawl budget. Check out Backlinko’s blog post if you want to learn more about crawl budget.
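
A sitemap is usually an XML file; a minimal two-page example (with hypothetical URLs) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/seo-basics</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```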

Does crawling guarantee indexing?

Knowing that a website has been crawled is only one-third of the battle. I strongly believe that this is where technical SEO plays a crucial role: it is the foundation for all further work such as content creation, promotion, and link building. After visiting a website, the bot checks for all sorts of things, including the content of each page, navigation links, the value and length of the content, videos, images, and more. Although limited in its ability, it tries to identify whether the website is a good match for a given query.

Below are some questions that you can run through to see if an automated bot is likely to consider your website a valuable source of information:

Is my website architecture clear?
Are the URLs on my website readable?
Are my images optimised? Do they have an alt tag?
Is my code clean and readable to search engines? 
What is my website about? Do I write blogs that are relevant to my business? Are they helpful to users? 
Is my content high quality? Do I have decent grammar?
Do I have internal links on the website for better navigation and user experience?
Is my content unique?
Do I use headings and subheadings to provide better structure to my blogs? (See the markup sketch after this list.)
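
To make a few of these questions concrete, here is a small, hypothetical HTML fragment showing a heading hierarchy, an internal link, and an image with an alt tag:

```html
<article>
  <h1>Search Engine Basics</h1>
  <h2>How crawling works</h2>
  <p>
    Crawlers follow links such as
    <a href="/blog/what-is-a-sitemap">this internal link</a>
    to discover new pages.
  </p>
  <!-- The alt attribute tells search engines what the image displays -->
  <img src="/images/crawler-diagram.png"
       alt="Diagram of a crawler following links between pages">
</article>
```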

Ranking – the final part

Many ranking factors dictate whether a website will appear in search results. Mobile-friendliness, loading speed, domain age, technical SEO, links, and optimised content are the most important indicators of success, and there are plenty of online resources that can help you get started with SEO. You do not necessarily need to get all of these right to appear in SERPs. Sometimes a few changes, such as implementing structured data, writing about a unique but highly demanded subject, or simply improving the overall user experience, can bring unexpectedly positive results.

CONCLUSION:

We live in a digital era where access to knowledge and information is easier and faster than in the past. Entrepreneurs need an online presence if they want to be noticed by their customers, which means they require a website that is well built and optimised for SEO. Putting a website live is not enough. SEO is a key component of long-term success, and it is not a one-time project or job: it is a continuous effort that requires knowledge, practice, and monitoring. This blog post was inspired by the ‘Search Engine Basics’ chapter of ‘The Art of SEO: Mastering Search Engine Optimization’ by Eric Enge, Stephan Spencer, and Jessie C. Stricchiola (3rd edition).

What is Screaming Frog SEO Spider?

Screaming Frog SEO Spider is a tool that you can use to improve your on-site SEO. It is software that analyses the links on your website and shows you where the errors are: meta descriptions, h1 and h2 tags, images, page titles, broken links, and more. The free version has a 500-URL crawl limit, so if your website is larger than that, you may want to upgrade to the paid version. Note that even a site with only ten subpages will exceed ten URLs, because the tool also crawls your images, external and internal links, canonicals, CSS, JavaScript, and other resources.

On-Site SEO

Let’s start with on-site SEO. Screaming Frog SEO Spider lets you check whether your page titles (the title tag shown at the top of a browser window or tab) are over 65 characters long, whether you have duplicates, and whether any page has multiple title tags. Next, it shows you how many duplicate meta descriptions you have and how long they are, so that you can make changes if required. It checks the same things for your h1 and h2 tags. Another great feature is that it helps you generate an XML sitemap for your website, the document that tells Google’s crawlers which pages exist on your site. For WordPress blogs, you can also create an XML sitemap using the Yoast SEO plugin.
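
For reference, the elements the tool checks look like this in a page’s source (hypothetical values, with the title kept under 65 characters):

```html
<head>
  <!-- Page title: under ~65 characters so it is not cut off in SERPs -->
  <title>Search Engine Basics: Crawling, Indexing and Ranking</title>
  <!-- Meta description: the snippet search engines may show under the URL -->
  <meta name="description"
        content="Learn how search engines crawl, index, and rank websites.">
</head>
```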

Images

If your images are too big, they may slow down the loading time of your website, and slow sites do not rank high in Google. Screaming Frog shows you which images need to be resized. Note that resizing images in the CMS dashboard to the desired dimensions will not help at all: the files keep their original size, are downloaded in full, and are only then scaled by the browser to the dimensions you set in the dashboard (if you are using WordPress or another CMS). All images should also have an alt tag, which is important because it helps search engines understand what an image displays.
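
A minimal sketch of resizing the file itself before uploading, using the third-party Pillow library in Python (the file names are hypothetical):

```python
# Shrink the image file itself instead of scaling it in the CMS dashboard.
from PIL import Image  # pip install Pillow

image = Image.open("hero-photo.jpg")
image.thumbnail((1200, 1200))  # keep aspect ratio; no side exceeds 1200 px
image.save("hero-photo-web.jpg", quality=85, optimize=True)  # smaller file to upload
```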

Broken links and Canonicals

A broken link is a link on a web page that does not work, whether because of an incorrect URL or because the destination page no longer exists. Screaming Frog SEO Spider lets you check for broken links on your website, and you can generate reports in CSV format so that you can work in Excel if you prefer.

A canonical URL tells search engines which URL is the original one. It is very helpful when you have several pages with the same content (for example, an e-commerce site with multiple pages displaying a product category in different colours). You set the canonical URL on the subpage of your choice and point the other subpages (with the same content) to it, which avoids duplicate-content issues. Screaming Frog SEO Spider checks your website for any errors in your canonical links.
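
A canonical tag is a single line in a page’s head section. In this hypothetical example, both colour-variant URLs would carry the same tag pointing at the original page:

```html
<!-- In the <head> of /t-shirts?colour=red and /t-shirts?colour=blue -->
<link rel="canonical" href="https://example.com/t-shirts">
```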

Response Codes

Whenever you type your website address and press Enter, your browser makes a request to a server. The server is another machine (just like your computer) connected to the Internet. When you make a request, you get a response from the server, which can be anything from an image to a CSS file or JavaScript. The browser knows what the server has sent by the content type (for example, an HTML document). The response also carries a status code: codes in the 200 range mean everything is okay, the 300 range relates to redirects, codes starting at 400 indicate client-side errors, and the 500 range refers to server errors. Screaming Frog SEO Spider shows you whether there are redirects or client-side errors on your website.
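
You can see these codes yourself with a few lines of Python using the third-party requests library (the URLs are hypothetical):

```python
# Print the HTTP status code returned for a few pages.
import requests  # pip install requests

for url in ["https://example.com/", "https://example.com/missing-page"]:
    response = requests.get(url, allow_redirects=False)  # keep 3xx responses visible
    print(url, response.status_code)  # 200 = OK, 301 = redirect, 404 = not found
```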

CONCLUSION:

Screaming Frog SEO Spider is a great tool for on-site SEO. It will crawl your website and show you where you can introduce changes, and you can filter the data by categories such as duplicates, character limits, missing descriptions, and more. The tool will check your meta descriptions, h1 and h2 tags, meta keywords, page titles, internal and external links, and more. You can open the listed links in a browser as well as export reports on individual data (canonicals, error codes, images missing alt tags, internal and external links, and others). You can also use the tool to create an XML sitemap, which helps search engines better understand how your site is structured; you then upload that sitemap to Google Search Console. There is also an option to integrate Screaming Frog SEO Spider with your Google Analytics account. All in all, it is a perfect tool for competitor analysis, improving your own website, or both.

How To Build a Google-Friendly Website?

One of the most important aspects of a successful SEO strategy is a Google-friendly website. There are two ways to drive visitors from Google to your website: build profitable Google AdWords campaigns that send targeted traffic to your site, or earn high organic rankings. The latter takes a lot of time and effort but is the better option in the long term, and it requires minimal or no financial investment. Search Engine Land published a great article on why organic search is better; you can read more about it here.

What are the characteristics of a good website?

There are some general components that can make or break the effectiveness of your online presence: appearance, functionality, content, usability, and of course SEO. A website’s appearance is how a user perceives it. A site must be visually appealing, clean in design, and professional; when a person visits your site, the first impression usually decides whether they stay or leave. A few rules always work when it comes to good design. Use a maximum of three colours, and avoid bright ones, which can make the text on the page difficult to read. The easiest combination is black text on a white background with a font size between 10 and 12 points. Any graphics and visual elements of the site need to match the content.

Every website should also be functional. What does that mean? Hyperlinks, contact forms, event registration forms, and other website components should work as expected, and spelling mistakes and bad grammar are welcome neither on the website nor in a company’s downloadable files. The remaining elements of a Google-friendly website are content and usability: the site should be easy to read, navigate, and understand, which comes down to valuable content, good overall loading speed, and logical navigation. Every business website should include a contact number at the top right, social links, a quick contact form, eye-catching calls to action, short information about the products or services on offer, and a clickable phone number.

What plugins and tools are worth checking?

If your website is built on WordPress, you can use plugins to unlock new features. A plugin adds a piece of code to your website that changes its layout or improves its functionality. BackWPup Free, for example, lets you back up and download your entire website within a couple of minutes, which is extremely useful when your hosting provider does not offer weekly or monthly backups. Yoast SEO is one of the most popular plugins and enables you to customise the metadata of individual pages on your website. If you need to speed up the website loading time, consider installing W3 Total Cache. Your website security is important too: Akismet is an anti-spam plugin that processes and analyses a great amount of data from millions of sites and communities in real time. Beyond plugins, there are plenty of tools available online; some are good for content creation and social media, while others help track the effectiveness of your online campaigns.

Useful tools for content creation and website analysis:

HubSpot’s Blog Ideas Generator

This tool will help you generate ideas for your next blog post. HubSpot’s Blog Ideas Generator is free and easy to use: just type a few keywords, press Enter, and wait for the results. It may not give you the exact title, but it can boost your creativity or help you identify the next topic for a blog post.

Google Analytics

This is Google’s free web-analytics service. Google Analytics helps you understand how people interact with your website. How does it work? When someone visits your website, the Google Analytics tracking code loads. This code is a small piece of JavaScript that sits in the code of every page of your website. The tracking code looks for information stored in cookies; if it finds any, it updates it, and if not, it creates a new cookie in the web browser. The tracking code then sends the gathered information to the Google Analytics servers, and this is how you learn who visited your website, where they came from, and which pages they browsed.
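
For reference, Google’s current snippet (gtag.js) looks roughly like this, where GA_MEASUREMENT_ID is a placeholder for your own property ID:

```html
<!-- Google Analytics tracking snippet; GA_MEASUREMENT_ID is a placeholder -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID'); // records the page view for your property
</script>
```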

Google Webmaster Tools

GWT (now Google Search Console) is a free Google toolset that helps you understand what happens on your website. You need to be authorised first to gain access to this data; once verified, you can set up your own dashboard, which gives you an idea of which keywords you are targeting and how much traffic your website is receiving. A website that is active in Webmaster Tools has a better chance of being fully indexed by Google, and you get a much better overview of your website’s performance. Sign up for an account and see how it works.

Google Website Optimizer

It is another free tool from Google that lets you see which sections of content on your website convert into the most clicks. It is mainly for experimenting with the effectiveness of your content: you create different versions of one page, and the tool measures the efficacy of each and points out the best-performing version.

Evernote

Evernote is a digital notebook where you can store and organise text as well as images. It also lets you record audio and clip information from other websites. Evernote is accessible on your computer, tablet, and smartphone, and all the data you save is synchronised across them.

CONCLUSION:

In April 2015, Google released an algorithm update that labels websites as either mobile-friendly or not, which affects a site’s ranking in searches made on mobile devices. More and more people use tablets or mobile phones to look for information online, so your site should be either responsive or mobile-friendly. A responsive website is designed to transition smoothly between mobile, tablet, and PC interfaces (through an additional piece of code), whereas a mobile-friendly website is still accessible to mobile viewers but is not designed for an optimal experience on every screen.
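
The ‘additional piece of code’ behind a responsive design boils down to a viewport declaration plus CSS media queries; here is a minimal, hypothetical sketch:

```html
<!-- In the <head>: tell mobile browsers to use the device's real width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .content { width: 960px; margin: 0 auto; }  /* desktop layout */
  /* On screens narrower than 600px, let the content fill the viewport */
  @media (max-width: 600px) {
    .content { width: 100%; }
  }
</style>
```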

There are many features that determine whether a website is Google-friendly or not, and here are 10 tips on how to improve your Google rankings if you already have a well-designed website. More and more websites are built every day, so if you want to stay visible, make sure your website does not violate Google’s guidelines. Having a nicely designed website is just half the battle; the other half is branding and a good SEO strategy.