Sunday, December 22, 2019

Google Search Console

GOOGLE SEARCH CONSOLE

Google Search Console is a free service offered by Google that helps you monitor, maintain and troubleshoot your site's presence in Google search results.  You don't have to sign up for Search Console to be included in Google search results, but Search Console helps you understand and improve how Google sees your site.


Updates to Google Search Console are coming quickly and frequently. Even for the most eagle-eyed SEO professionals, it's hard to stay on top of the latest developments.

So I thought it would be helpful to summarize what we know about Google's plans to migrate features from the old version and the evolution of the new version of Search Console.

What is the difference between Google Analytics and Google Search Console?

Simply put, Google Analytics provides you with data points about your website's performance; Google Search Console, on the other hand, is used to improve and optimize your website.
It alerts you if your site has technical errors, and it also provides information about who links to your site and which keyword queries bring visitors.

Does Google Search Console help SEO?


This SEO tool is useful for tracking metrics and finding new stats to boost your organic footprint.  Google says anyone with a website should use the Search Console.

Thursday, December 19, 2019

On Page SEO Optimization

ON-PAGE SEO

On-page SEO is one of the most important processes you can use to rank high in a search engine's organic results and run successful SEO campaigns.
A website is the focus of all SEO processes, and if it is not optimized properly for search engines and users, you reduce your chances of getting traffic from search engines.

Snippet

Over the years, Google has been adding more and more information to the search results, aside from AdWords and organic search results, to improve the search experience.


Featured snippets are a format that allows users to get brief, direct answers to their questions without clicking on a specific result.


Title Tags

An HTML tag used to define the title of a document, shown in search listings and at the top of a web browser.  The title should be accurate and provide a short summary of the content of the page, and it should not exceed 60 characters.  Title tags are essential to every website, and each webpage must have a different title tag.  Title tags play a very important role in optimizing your page in search engines. Google actually truncates titles by pixel width (around 512 pixels) rather than by a strict character limit.

Code Sample
<head>
<title>Example Title</title>
</head>

Try to avoid all-caps titles. They can be difficult for searchers to read, and they can severely limit the number of characters Google displays.

While Google's algorithm does not penalize longer titles, you will have problems if your title starts to be stuffed with keywords, creating a bad user experience.

Give every page a unique title

Unique titles help search engines understand that your content is unique and valuable, and they also increase click-through rates.
While title tags are important to SEO, keep in mind that your first job is to attract clicks from well-targeted visitors who may find your content valuable. In addition to optimization and keyword usage, it is important to think about the whole user experience when creating your title tags.
Although a title has only a general impact on ranking, you should try to use good grammar and avoid spelling mistakes.  Spelling mistakes and grammatical issues signal low-quality content; you rarely see them in content from top publishers.


Alt Text

Alt text is an attribute added to the HTML of an image.  Adding optimized alt text to your images makes your blog more accessible to the visually impaired and gives your blog an SEO boost.

Always write whole phrases, but don't worry about capitalization or punctuation, because they aren't important here.  And, as tempting as it may be, avoid keyword stuffing.  Google does not reward this tactic, and it can only make your SEO worse.
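A minimal sketch of what this looks like in HTML (the image file name and alt wording below are invented for illustration):

Code Sample
<!-- alt text written as a whole phrase; no keyword stuffing -->
<img src="golden-retriever-puppy.jpg" alt="Golden retriever puppy playing fetch in the park">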

Google Search Console + Webmaster Tools

These two webmaster tools help you find out what the two best search engines think about your site.  It is helpful to see any bugs, alerts, and indexing issues.
Both tools require a bit of setup on your site.
Google Webmaster Tools, also known as Google Search Console, is a collection of web-based utilities that help website owners make sure their site is Google-friendly.  These tools have many useful applications, such as obtaining data about incoming search traffic, requesting that Google crawl and index the website, and viewing crawl error reports.

Why Use Webmaster Tools?

One of the best applications of the tools is that they allow webmasters to submit their websites and pages for Google to crawl and index.  Error reports enable them to detect problems that prevent their site from performing well in Google Search.  Webmaster tools also include a set of search reports that show which keywords a website ranks for on Google and which domains link to it.

Meta Description

The meta description tag in HTML is a short snippet used to summarize the content of a web page.  Search engines sometimes use these snippets in search results, telling visitors about a page before they click on it. Meta descriptions can be any length, but Google typically shortens the snippets to 155-160 characters.  It is best to keep meta descriptions sufficiently descriptive, so we recommend descriptions between 50-160 characters.  Keep in mind that the "optimal" length varies depending on the situation, and your primary goal should be to provide value and drive clicks.
Although not tied to search engine rankings, meta description tags are very important for getting users to click through from SERPs.  These short paragraphs are a webmaster's opportunity to "advertise" content to those who are searching, and the searcher's opportunity to decide whether the content is relevant and contains the information they seek from their search query.
Code Sample
<head>
<meta name="description" content="An example description summarizing the content of the page.">
</head>
The meta description tag serves the function of ad copy.  It draws readers from the SERP to a website, so it is a very visible and important part of search marketing.  Creating a readable and compelling description using important keywords can improve the click-through rate for a given webpage.  To increase click-through rates on search engine results pages, it is important to note that Google and other search engines bold keywords in the description when they match search queries.

Avoid duplicate meta description

As with title tags, it's important to keep the meta descriptions on each page unique.  Otherwise, every result from your site will look the same in the SERPs.

One way to combat duplicate meta descriptions is to implement a dynamic, programmatic way of creating unique meta descriptions for automated pages.  If possible, though, there is no substitute for a description written specifically for each page.


Body Section


h1

Each page or post can have multiple headings.  The <h1> HTML tag is usually used for the title of a page or post, and is the first heading to appear on a page.  The formatting of an h1 usually differs from the rest of the heading tags found on a page (h2, h3, h4).

h2

The H2 tag is a subheading and should contain keywords similar to your H1 tag.  Your H3 is a subheading to your H2, and so on.  Think of them as a hierarchy of importance: the heading above is more important than the one below.
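As a rough sketch of that hierarchy (the heading wording below is invented for illustration):

Code Sample
<body>
<h1>On-Page SEO Basics</h1> <!-- main page title: the first and most important heading -->
<h2>Title Tags</h2> <!-- subheading under the h1, using related keywords -->
<h3>Keep Titles Under 60 Characters</h3> <!-- subheading under the h2 -->
</body>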

Sandbox

The Google sandbox refers to the general belief that Google has a filter that prevents new websites from ranking in search results for a period of time. Few people know whether the sandbox actually exists, but it appears to have been added to Google's algorithms as a filter around March 2004.

Keywords

Primary keyword
secondary keyword

A primary keyword, as the name implies, is the keyword used before any other keyword in a webpage or article.  The primary keyword is therefore the most relevant keyword on a web page.  In turn, the primary keyword should appear in a site's title, domain and content, and it should also be added to the first sentence on the page.  The primary keyword is essential for SEO purposes because it is used to describe the web page and to help people find the site.  A secondary keyword is used after the primary keyword and is not strictly required for SEO.  However, using a secondary keyword increases your chances of attracting additional visitors to your site.
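A minimal sketch of that placement, assuming a page whose primary keyword is "organic coffee beans" (an invented example, not from the original post):

Code Sample
<head>
<title>Organic Coffee Beans - A Beginner's Buying Guide</title> <!-- primary keyword in the title tag -->
</head>
<body>
<p>Organic coffee beans are grown without synthetic pesticides or fertilizers.</p> <!-- primary keyword in the first sentence -->
</body>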

Content optimization

Content optimization is the process of ensuring that content is written in such a way that it reaches the target audience.  Optimizing content should cover meta descriptions, short headings, simple sentences, title tags and relevant links, and should ensure that relevant keywords are included.
Bold differs strongly from ordinary text, and is used to highlight important keywords for the content of the text and to allow such words to be easily scanned visually. 
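For example (an illustrative snippet, not from the original post), important keywords can be bolded in HTML like this:

Code Sample
<p>Our guide to <strong>on-page SEO</strong> covers title tags, meta descriptions and headings.</p> <!-- <strong> renders as bold and marks the keyword as important -->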

Bounce rate

One of the simplest and cheapest measures of any website's SEO campaign is the site's "bounce rate."
Bounce rate is the percentage of visitors who leave a website before spending a significant amount of time browsing it.

For example, if a user searches for a quick-answer question, clicks the first result, finds their answer in a few seconds and returns to the SERP to continue doing something unrelated, this will lead to a higher bounce rate.  Considering this example, it's easy to see why a "good" bounce rate depends entirely on your website's goals and why it is weighted differently as a ranking factor.


Keyword density

Keyword density refers to how many times a keyword appears as a percentage of the total number of words within a particular webpage or piece of content.  It is also sometimes called keyword frequency, i.e. the frequency at which a specific keyword appears on a webpage.
In the context of search engine optimization, keyword density can be used to determine whether a web page is relevant to a particular keyword or keyword phrase.
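As a quick sketch of the calculation (the 1,000-word page and 15 keyword occurrences below are made-up numbers for illustration):

\text{keyword density} = \frac{\text{keyword occurrences}}{\text{total words}} \times 100 = \frac{15}{1000} \times 100 = 1.5\%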

Anchor Text

While on a website or webpage, you may see text in a different color and underlined.  If you hover over this text with your mouse and a link appears, you have found anchor text!  Click on that text and you will be taken to a new page.
Anchor text, also called link label, link text or link title, is the visible, clickable text in a hyperlink. The words contained in the anchor text can influence the ranking that a page receives from search engines. Generally, web search engines analyze the anchor text of hyperlinks in web pages.
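A minimal sketch of anchor text in HTML (the URL and wording are invented for illustration):

Code Sample
<a href="https://www.example.com/on-page-seo">on-page SEO checklist</a> <!-- "on-page SEO checklist" is the anchor text -->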

You can refer to our previous post about the history and evolution of SEO.


Google Algorithm Updates

GOOGLE ALGORITHM UPDATES

Google's algorithms are a sophisticated system used to retrieve data from its search indexes and instantly provide the best possible results for a query.  It combines algorithms and several ranking signals to deliver relevance-ranked webpages on search engine results pages (SERPs).
In the early years, Google made only a few updates to its algorithms.  Now, Google makes thousands of changes every year.
Most of these updates are so minor that they go completely unnoticed.  However, in some cases, the search engine releases major algorithm updates that greatly affect the SERPs. These include:
  • Panda
  • Penguin
  • Hummingbird
  • Pigeon
  • Fred
  • Mobilegeddon
  • Pirate
  • Parked Domain
  • EMD
  • RankBrain
  • Possum

Panda

Launch Date: February 24, 2011

Triggers of Panda:

Thin content, duplicate or copied content, low-quality content, low-quality user-generated content, keyword stuffing.
Thin content - weak pages with very little relevant or substantive text and resources; for example, a bunch of pages describing various health conditions with just a few sentences on each page.
Duplicate content - copied content that appears on the Internet in multiple locations.  Duplicate content issues can also occur on your own website when multiple pages carry the same text with little or no variation.


Low quality content - pages that give human readers little value because they lack in-depth information.
User-generated content - Panda does not target user-generated content in general, but it can affect sites that host low-quality user-generated content, such as spammy guest posts or spam-filled forums.

How it works: Panda aims to prevent sites with low-quality content from appearing in the top search results.  Panda is refreshed from time to time; when this happens, previously affected sites can recover if the right changes have been made.

Panda 4.0 (rolled out May 20, 2014)
Recently, Google updated its algorithm with a new Panda update, known as Panda 4.0.  For some, this algorithm update fulfilled their predictions; for others, it realized their nightmares.  For better or worse, the biggest algorithm update of 2014 is Panda 4.0.

Penguin Update

Launch date: April 24, 2012

Penguin's goal is to down-rank sites that gain an unfair advantage in Google results by creating unnatural backlinks.  While Penguin's primary focus is on unnatural links, other factors can also affect a site in Penguin's eyes.  However, the most important thing it looks at is links.

Triggers of Penguin: Spammy or irrelevant links; links with over-optimized anchor text.

Why are links important?
If a decent site links to your site, this is a recommendation for your site.  If a small, anonymous site links to you, that vote does not equal a vote from an authoritative site.  Still, if you can get a lot of these little votes, they can really make a difference.  This is why, in the past, SEOs tried to get as many links as possible from any possible source.

Penguin 4.0

As of Friday, September 23rd, you may have read that Penguin is now part of Google's main algorithm.  But what exactly does that mean?  In this article, I will talk about the main differences between Penguin 4.0 and previous updates, how those changes affect your website and whether or not Penguin's latest update has affected you (for better or for worse).

Hummingbird

Launch date: August 22, 2013

Triggers of Hummingbird: Keyword stuffing; low-quality content.

Hummingbird helps Google better interpret search queries and deliver results that match the search intent (as opposed to the individual words in the query).

While keywords remain important, Hummingbird makes it possible for a page to rank for a query even if it does not contain the exact words the searcher used.  This is achieved with the help of natural language processing that relies on latent semantic indexing, co-occurring terms and synonyms.

Pigeon

Launch date: July 24, 2014

Triggers of Pigeon: Poor on-page and off-page SEO.

Google implemented one of its biggest-impact algorithm updates for local search results, and local businesses typically saw the effects of the update in their website's analytics data.

Pigeon affects searches in which the user's location plays an important role. The update tied the local algorithm more closely to the core algorithm: traditional SEO factors are now used to rank local results.


Mobilegeddon

Launch date: April 21, 2015

Triggers of Mobilegeddon: Lack of a mobile version of the page; poor mobile usability.
Google rolled out their mobile-friendly update, which quickly gained many monikers - mobilepocalypse, mopocalypse, mobocalypse, etc.

 Ultimately the name that stuck was “Mobilegeddon”
Google posted this message on its Webmaster Central Blog, as it sometimes does, and included a picture to show the difference between being mobile-friendly and not being mobile-friendly.
This update left no gray area.  Your pages were either mobile-friendly, or they weren't.  There was no in-between.
On that day in April the update formally rolled out, but this was not the only notice Google gave webmasters.

Fred

Launch date: March 8, 2017
Triggers of Fred: Thin, affiliate-heavy or ad-centered content
The latest of Google's confirmed updates, Fred targets websites that violate Google's webmaster guidelines.  Most of the affected sites are blogs with low-quality posts that seem to have been created mostly for advertising revenue.


Google Fred is an algorithm designed by Google to target black-hat tactics tied to overly aggressive monetization.  Google Fred specifically looks for excessive advertising, low-value content, and websites that offer little benefit to users.

RankBrain

Launch date: October 26, 2015
Triggers of Rankbrain: Lack of query-specific relevance features; shallow content; poor UX
RankBrain is part of Google's Hummingbird algorithm.  It is a machine-learning system that helps Google understand the meaning behind queries and serve the most relevant search results in response.  Google calls RankBrain the third most important ranking factor.  Although we do not know the ins and outs of RankBrain, the general opinion is that it identifies relevance features for web pages ranking for a specific query, which are basically query-specific ranking factors.

Possum

Launch date: September 1, 2016
The Possum update ensured that local results vary more depending on the searcher's location: the closer you are to a business's address, the more likely you are to see it in local results.  Possum also added more diversity to the results ranking for similar queries like "dentist Denver" and "dentist Denver CO."  Interestingly, Possum gave a boost to businesses outside the physical city area.

EMD (Exact Match Domain)

Launch date: September 28, 2012
The purpose behind this update was not to target exact-match domain names as such, but to target sites using a spammy strategy: exact-match domains that are poor-quality sites with thin content.
There really are no other nicknames for this update.  It has simply gone by the EMD update or the exact-match domain update.

The main weakness of these websites is that SEOs would buy domains that exactly matched a keyword phrase and build a site on them with very thin content at little cost.  It was very easy to do.  So easy, in fact, that it was like taking candy from a baby when it came to quick SEO wins.

Pirate


Google's Pirate update is designed to prevent sites with multiple copyright-infringement reports, as filed through Google's DMCA system, from ranking well in Google's listings.  The filter is updated periodically.
When this happens, sites that were previously affected can recover if they have made the right improvements.  The filter may also catch new offenders that escaped before, and may release "false positives" that were caught by mistake.


Parked Domain


Launch date: April 16, 2012

A Google parked-domain classifier error was blamed for lost search rankings:


"The short explanation is that our classifier for parked domains reads from two empty files.  As a result, we categorized some sites as parked when they were not.  I apologize for this; the problem seems to have been solved now, and we'll look at how to prevent this from happening again."

SMALLER ALGORITHM UPDATES:

  • BERT
  • MEDIC
  • BRACKETS

BERT

The BERT algorithm (Bidirectional Encoder Representations from Transformers) is a deep-learning algorithm related to natural language processing.  It helps a machine understand what the words in a sentence mean, with all the nuances of context.

However, as some have put it, this makes using words in the right way even more important for on-page SEO.  The Google BERT update will not help with sloppy content.


MEDIC

Google has just released one of the biggest search algorithm updates ever made, leaving a trail of turbulent website rankings in its wake.  This update, dubbed the "Medic Update" by SEO guru Barry Schwartz, primarily affects the organic rankings of health, fitness and medical websites.


BRACKETS

Launch date: March 7, 2018
This time, it's Google's turn to turn our worlds upside down.  Dubbed the "Brackets update," it seems to mark a renewed focus on quality.  The update is designed to reward sites that are doing things right, rather than penalize sites that "break the rules."






You can refer to our previous post:
THE HISTORY AND EVOLUTION OF SEO


The History And Evolution Of SEO


The History And Evolution Of SEO


Google was created by two PhD students at Stanford University in California as part of their research project.


When the September 11, 2001 attacks on the World Trade Center in the US took place, Google could not provide accurate information on the matter.  This greatly troubled Google officials, prompting them to hold an emergency meeting on the matter.  According to Google officials, the main reason for the lack of information was that the websites associated with the incident were not "crawlable."

Google works through a process of crawling, caching and indexing.  To help its crawlers understand web pages, Google experts developed an SEO Starter Guide and provided it to webmasters.  The Search Engine Optimization Starter Guide aimed at improving site quality and gaining trust among users.

Google is based on algorithms.  Certain factors determine the ranking of websites.  Google has made changes over time to improve the quality of results for its users, adopting the following methods:

  • Content specification
  • Link specification
  • Quality link specification
  • The juice passes (link juice)

Content specification

This is a method based on keywords, used by Google in the early period.  Keywords are the terms users type to search for something, and ranking was based on how often keywords appeared on the website.  This led to manipulation: instead of improving website quality, web developers stuffed their sites with keywords, a practice called "keyword stuffing."

Link specification

Due to the rise in keyword stuffing, Google developed a new method, called the link-specification method.  This method ranks a website based on how often and where links to that website appear.  Webmasters manipulated this as well: they tried to buy and sell links to each other to stay on top of the search engine.

Quality link specification

The link specification was replaced by the quality-link specification method.  A website is ranked based on links from quality web pages.  Google ranked these websites on a scale out of 10.

The Juice passes

Link juice is a term used in the SEO world to denote the value or equity transferred from one page or site to another. This value passes through hyperlinks. Search engines see links as votes by other websites that vouch for and promote your page.

SEO Techniques

Black Hat Technique
The unethical use of techniques to improve web page ranking.
Black Hat SEO uses techniques and strategies that achieve high search rankings by violating search engine rules.  Black Hat SEO focuses solely on search engines and not on human audiences.
Example: keyword stuffing.

White Hat Technique 
Ethical Ways to Improve the Quality of a Web Page
White Hat SEO refers to the use of techniques and strategies aimed at a human audience as opposed to a search engine.

Gray Hat Technique - uses a mix of ethical and unethical methods to achieve a top-ranked webpage.

Google Adwords

Initial release: October 23, 2000
Google Ads is an online advertising platform developed by Google, where advertisers pay to display short ads, service offerings, product listings, video content and mobile app install promotions across Google's ad network to web users.
Formerly known as Google AdWords, Google Ads - the most popular PPC platform to date - operates on a pay-per-click model, in which advertisers bid on keywords and pay for each click on their ads.

Google AdSense

Initial release date: 18 June 2003
Google AdSense offers website owners a way to make money from their online content.  AdSense works by matching ads to your site based on your content and visitors.  Advertisers who want to promote their products create and pay for the ads.

Pogo Sticking

Pogo sticking is defined as bouncing back and forth between a search engine results page (SERP) and the individual search-result destination sites.  In other words, a searcher clicks on a link in a SERP, sees that it is not what she is looking for, and immediately bounces back by hitting the back button.


Personalized Search result

Personalized search results are the results a user sees in a search engine based not only on traditional ranking factors (such as the relevance of web pages to the search term or their authority), but also on the information the search engine has about the user at a given time.