
Tuesday, May 4, 2021

301 Redirects


This is the cleanest way to redirect a URL: quick, easy, and search-engine friendly. Remember that .htaccess rules apply to Apache servers only.


Redirect a Single Page

Redirect 301 /oldpage.html http://www.yoursite.com/newpage.html
Redirect 301 /oldpage2.html http://www.yoursite.com/folder/

Redirect an Entire Site

This method keeps deep links intact; that is, www.oldsite.com/some/crazy/link.html becomes www.newsite.com/some/crazy/link.html. This is extremely helpful when you are simply moving a site to a new domain. Place this in the .htaccess file on the OLD site:

Redirect 301 / http://newsite.com/
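
If mod_rewrite is available on your server, a rough equivalent looks like this. This is only a sketch: oldsite.com and newsite.com are placeholders for your own domains.

RewriteEngine On
# match the old host, with or without www
RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.com$ [NC]
# send the full path to the new domain with a permanent (301) redirect
RewriteRule ^(.*)$ http://newsite.com/$1 [R=301,L]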

Tuesday, July 28, 2020

What is a Sitemap and How To Create a Sitemap?

If you have a blog, you also need to know how a sitemap helps you. First of all, you need to know how many types of sitemaps there are.

Although there are six types of sitemaps, two of them are used most often:

Sitemap.xml
Sitemap.html

The other four types are:

  • Image Sitemap
  • Video Sitemap
  • News Sitemap
  • Mobile Sitemap
A sitemap is a file that contains all the information about our website.

For example: how many pages and posts the website has, where the images and other media files are located, and so on.

The sitemap passes all this information about the website to search engines.

So when search engine crawlers arrive, they have no problem collecting information from the website.

For Example:

There are many colonies in a city, and you are asked to collect data about all of them.

At the entrance of colony A there is a map that gives you all the information about the colony: which building is where, the location of the park, and so on.

But there is no map in colony B.

When you collect your data, the data for colony A will be gathered quickly and will be more accurate, because the map lets you understand the layout of that colony in a much better way.

In this example:

The city is the Internet
The colony is your website
The colony's map is your website's sitemap
The buildings are the website's posts, pages, etc.
You are the search engine crawler

Note: Google's crawlers collect data easily and quickly from websites that have a sitemap.

I hope this example has made clear what a sitemap is.

Google's documentation defines a sitemap like this: "A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content."

In simple words:

A sitemap is a map of our website that helps Google collect all the data from the website.

On-page optimization requires following a lot of steps, and creating a sitemap is one of them.

This is what a sitemap looks like. Below is a screenshot of the sitemap of my website, Digital Paratha.

[Screenshot: the Digital Paratha sitemap]

Difference Between an HTML Sitemap and an XML Sitemap

HTML Sitemap: It is available to both users and crawlers; any user can view a website's HTML sitemap. E-commerce sites like Amazon or Flipkart normally have an HTML sitemap. It does not tell the last modification time of webpages or how frequently they change.
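
For example, an HTML sitemap is often just an ordinary page of links that visitors can browse. A minimal sketch (the URLs here are placeholders) might look like this:

<!-- a minimal HTML sitemap: a plain list of links for users -->
<ul>
  <li><a href="/about/">About</a></li>
  <li><a href="/contact/">Contact</a></li>
  <li><a href="/blog/my-first-post/">My First Post</a></li>
</ul>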

XML Sitemap: It is an XML file that is available to web crawlers only, not to users. This XML file has to be uploaded to the root directory of the website. It tells the crawler how many pages are present on your website, along with the last modification time of each webpage and how frequently it changes.
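
By contrast, a minimal XML sitemap entry looks like this. This is a sketch with example.com as a placeholder domain:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- the page's full URL -->
    <loc>https://www.example.com/my-first-post/</loc>
    <!-- when the page was last modified -->
    <lastmod>2020-07-28</lastmod>
    <!-- how often the page is likely to change -->
    <changefreq>monthly</changefreq>
  </url>
</urlset>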

Why Is It Important to Create a Sitemap?

So that Google's crawlers can easily collect all the information about our website.
So that all our pages get indexed in Google.
So that our site's organic ranking increases.

Sitemaps are very beneficial for new websites and blogs.

New websites do not have many backlinks, so it is not easy for search engines to find all the pages of a new website.

But with the help of a sitemap, search engines are able to learn all the information about our website much better.

How to Create an XML Sitemap?

How the sitemap is generated depends on your website's platform. On WordPress, for example, it can be generated easily with the Yoast SEO plugin. Install and activate the Yoast SEO plugin on your blog, then go to SEO > XML Sitemaps in the settings and generate the sitemap for the first time.

[Screenshot: Yoast SEO XML Sitemaps settings]


At this link, you can see an example sitemap file from the Digital Paratha blog.

For other platforms, such as static sites, there are also many online tools and offline programs that will help you generate a sitemap. Generating an XML sitemap this way is very simple; to generate the file, follow the steps below:

Step 1: XML Sitemap Generator – https://www.xml-sitemaps.com

Step 2: Go there, enter your website's URL in the text box, and click the Start button.

Your website's sitemap.xml will then be generated. Upload it to the root folder of your website and submit the sitemap's URL to Google.

How To Submit a Sitemap On Google (sitemap.xml)?

To submit an XML sitemap, you have to follow the steps given below:

Step 1: First of all, go to Google Search Console and select your website.

Step 2: In the menu on the left-hand side, click on the Sitemaps option.

Step 3: In the new window, type sitemap.xml after your website's URL (like this: www.abc.com/sitemap.xml). Note: do not type anything else after the URL.

Step 4: Copy the URL from step 3, paste it into the Sitemaps option of Google Search Console, and click the Submit button.

In this way, your sitemap will be submitted to Google, and Google's crawlers will be able to crawl your website.

If you like my article, please share it :-)

Tell me: have you submitted a sitemap for your website? What other SEO-related things do you do to get your blog indexed fast?

Thursday, July 23, 2020

What is Robots.txt File and How to Use It for SEO?

A robots.txt file is a small text file that lives in your site's root folder. It tells search engines which parts of the website to crawl and index and which parts not to.

If you make a mistake while editing or customizing it, search engine bots may stop crawling and indexing your site, and your site will not be visible in the search results.

In this article, I will explain what a robots.txt file is and how to create a perfect robots.txt file for SEO.

Why Does a Website Need a Robots.txt File?

When search engine bots come to sites and blogs, they follow the instructions in the robots.txt file when crawling the content. But if your website has no robots.txt file, the bots will simply crawl and index all the content and pages of your site, including pages you don't want indexed.

Search engine bots look for the robots.txt file before indexing any site or webpage. When they get no instructions from a robots.txt file, they start indexing all of the site's webpages and content.

Note: This is why the robots.txt file is required. If we don't give instructions to search engine bots through this file, they index our entire site, including data you did not want indexed.

Advantages Of Robots.txt File

  • It tells search engine bots which parts of the website to crawl and index and which not to.
  • A particular file, folder, image, PDF, etc. can be prevented from being indexed by search engines (see the sketch after this list).
  • Sometimes search engine spiders crawl your site very aggressively, which affects your site's performance. You can get rid of this issue by adding a Crawl-delay directive to your robots.txt file (also shown in the sketch below). However, Googlebot does not obey this directive; for Google, you can set the crawl rate in Google Search Console instead. This protects your server from being overloaded.
  • You can keep an entire section of a website private.
  • You can prevent internal search results pages from appearing in SERPs.
  • You can improve your website's SEO by blocking low-quality pages.
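
As a quick illustration of two of these points, here is a hedged sketch. The paths are placeholders, and the *.pdf$ wildcard is an extension honored by Google and Bing rather than part of the original robots.txt standard:

# slow down Bingbot; Googlebot ignores Crawl-delay
User-agent: Bingbot
Crawl-delay: 10

# keep a private folder and all PDFs out of the index
User-agent: *
Disallow: /private-section/
Disallow: /*.pdf$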

Where Is the Robots.txt File Located on a Website?

If you are a WordPress user, it resides in your site's root folder. If the file is not found in that location, search engine bots start indexing your entire website, because bots do not search your whole site for the robots.txt file.

If you don't know whether your site has a robots.txt file, simply type this into your browser's address bar: example.com/robots.txt

A text page will open in front of you, as you can see in the screenshot.

[Screenshot of the Digital Paratha robots.txt file]

This is the robots.txt file of Digital Paratha. If you do not see such a text page, you have to create a robots.txt file for your site.

Basic Format of Robots.txt File for SEO

The basic format of the robots.txt file is very simple and looks like this:

User-agent: [user-agent name]
Disallow: [URL or page you don't want to crawl]

These two lines together are considered a complete robots.txt file. However, a robots.txt file can contain multiple user-agent groups and directives (Disallow, Allow, Crawl-delay, etc.); a combined example follows the list below.
  • User-agent: names the search engine crawler/bot the rules apply to. To give the same instruction to all search engine bots, use the * sign, like this: User-agent: *
  • Disallow: prevents the listed files and directories from being crawled and indexed.
  • Allow: explicitly permits search engine bots to crawl and index the listed content.
  • Crawl-delay: how many seconds a bot should wait before loading and crawling page content.
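
Putting these directives together, a slightly fuller robots.txt might look like this (a sketch only; the paths are placeholders):

# rules for all bots
User-agent: *
Disallow: /private/
Allow: /private/open-page.html

# an extra rule for one specific bot
User-agent: Bingbot
Crawl-delay: 5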

Preventing All Web Spiders from Indexing the Website


User-agent: *
Disallow: /

Using this directive in the robots.txt file stops all web crawlers/bots from crawling the website.

Allowing All Web Spiders to Index All Content


User-agent: *
Disallow:

This directive in the robots.txt file allows all search engine bots to crawl every page of your website.

Blocking a Specific Folder for Specific Web Spiders


User-agent: Googlebot
Disallow: /example-subfolder/

This command stops only Google's spider from crawling the example-subfolder directory. But if you want to block all spiders, your robots.txt file would look like this:

User-agent: *
Disallow: /example-subfolder/

Preventing a Specific Page (Thank You Page) from Being Indexed


User-agent: *
Disallow: /thank-you/

This will stop all spiders from crawling that page (here /thank-you/ is just an example path; use your own page's URL). But if you want to block only a specific spider, write it like this:

User-agent: Bingbot
Disallow: /thank-you/

This command will stop only Bingbot from crawling that page.

How To Add a Sitemap To the Robots.txt File and Why It Is Important?

There are thousands of search engines in the world, and it is not possible to submit your site to every one of them. But when you add your sitemap to the robots.txt file, you do not need to submit your site to all the search engines.

However, submitting your site to Google and Bing is important.

What Are a Sitemap and a Robots.txt File?

A sitemap is a list of all the URLs on your website; it tells search engines about all your page and post URLs. A sitemap does not directly improve your search ranking, but it allows search engines to crawl your website better.


The robots.txt file helps search engines understand which parts of your site to index and which not to. When search engine robots visit your site, they follow your robots.txt file and index the parts that you want indexed.

How To Add a Sitemap to a Robots.txt File?

First, go to the root directory of your site, select the robots.txt file, and add your sitemap URL by clicking the Edit button.

Now your robots.txt file will look something like this.

Sitemap: http://www.example.com/sitemap.xml

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

The Sitemap line can be placed anywhere in the robots.txt file; it does not matter where you put it.

How To Add Multiple Sitemaps to a Robots.txt File?

You can add the URLs of your multiple sitemap files like this:

Sitemap: http://www.example.com/sitemap_host1.xml
Sitemap: http://www.example.com/sitemap_host2.xml

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

In this way, you can manage your sitemaps with the help of the robots.txt file.

You can leave any question or suggestion related to this article in the comments. If this article has proved helpful to you, do not forget to share it!
July 23, 2020

What is SEO?

SEO is a process by which we can increase the organic ranking of our website in search engines.

In simple words:

SEO is the set of techniques by which we can bring our website to the top of search engines so that more and more people see it.

The full form of SEO is Search Engine Optimization, which is directly related to search engines. SEO is a set of rules for bringing your website to the top of a search engine so that traffic to our website can increase. If you follow these rules, your website appears on the first page of the search engine.

Bringing the website to the first page is important because most people prefer to visit the websites that appear on the first page, and for this we have to do SEO.

Any company or person builds a website to sell their services and products, but if no traffic visits the website, how will they sell anything? So we have to do search engine optimization to get our website to the first page, which increases traffic to our website.

Note: Search engine optimization is much like traffic rules. To keep traffic moving smoothly, we need a roadmap so that people do not face any problems and can quickly reach their destination by choosing the right path.

Similarly, SEO is the search engine's set of traffic rules, so that anyone searching for anything gets the right information quickly. For this, the search engine has a roadmap called SEO (Search Engine Optimization).

Both SEO and traffic rules exist to serve people, so that the journey is good. For example, if you search for "what is SEO" but the result that comes up is about something else, you will have to search again and again, which means your journey and user experience are poor.

That is why the Google search engine uses SEO factors to improve its user experience, so that it can give information quickly and correctly to its users.

Every search engine has its own SEO factors. Today, Google is the largest search engine and the most used in the whole world. Google works with about 200 SEO factors.

If you want to understand SEO in one line, know this: Google likes the content that users like to read. Such content automatically rises to the first page, and content that users do not like slowly goes down. This is the most important factor in Google SEO.

What is SERP?

SERP means Search Engine Results Page. When we search for something in Google or any other search engine, it shows all the results on a page. The page that opens when you search is called the search engine results page.

The results that appear on the search engine results page are of two types of listings:
  1. Organic listing
  2. Inorganic Listing

What is an Organic Listing?

An organic listing is one in which we appear on the search engine results page without spending any money. For this, we have to do search engine optimization. Organic listings are the best because we get regular traffic from them.

What is an Inorganic Listing?

When we pay money to appear on Google's results page, we call it an inorganic listing. These listings are not stable; that is, we appear on the results page only as long as we continue to pay Google.

How Does a Search Engine Work?

For example, if you search for "what is SEO", the search engine brings you its crawled and indexed ranking list. The search engine's bots and spiders crawl and index continuously, 24 hours a day, and build their ranking list. As soon as you search for something, you see the results on the search engine results page (SERP).

Each search engine has its own techniques, but every search engine works in three steps:
  1. Crawling
  2. Indexing
  3. Ranking

What Are The Types Of SEO?

So far we have learned what SEO is and why it is important. Next, let's talk about how it is done. As soon as a website or blog is created, its optimization begins; that is, work starts on it even before a post is published. Nowadays most blogging is done in WordPress. If you know WordPress well, you will know that we get a lot of free plugins, many of which are also used for optimization. So let's learn about the types of SEO.

SEO is mainly of two types: on-page SEO and off-page SEO.

How To Do On-Page SEO?

Search engine optimization has two important parts. First, let's talk about on-page SEO, because it is the most important factor for increasing organic traffic to a website.

The work done on your website itself to set it up for search engines is called on-page SEO.

By doing this, your organic traffic increases. Traffic that reaches your website through keyword searches on Google is called organic traffic.

On-page SEO has many factors with whose help you can optimize your website. Here are some common ones (a small markup sketch of a few of them follows the list):
  • Website Design
  • Website Speed
  • Website Structure
  • Website Favicon
  • Mobile-friendly Website
  • Title Tag
  • Meta Description
  • Keyword Density
  • Image Alt Tag
  • URL Structure
  • Internal Links
  • Highlight Important Keywords
  • Use Heading Tags
  • Good Post Length
  • Robots.txt
  • Sitemap.xml
  • Check Broken Links
  • SEO Friendly URL
  • Google Analytics
  • Social Media Button
  • HTML Page Size
  • Clear Page Cache
  • Website Security (HTTPS), etc.
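
To make a few of these concrete, here is a minimal HTML sketch showing a title tag, a meta description, and an image alt tag. All the values are placeholders:

<!DOCTYPE html>
<html>
<head>
  <!-- title tag: the headline shown in search results -->
  <title>What is a Sitemap? | Example Blog</title>
  <!-- meta description: the short summary shown under the title -->
  <meta name="description" content="Learn what a sitemap is and how to create one for your blog.">
</head>
<body>
  <!-- alt text describes the image to search engines -->
  <img src="sitemap-diagram.png" alt="Diagram of an XML sitemap">
</body>
</html>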

How To Do Off-Page SEO?

Promoting your website's and posts' links around the internet in order to rank them in search engines is called off-page SEO.

When your post is promoted and shared on the internet, the search engine receives signals about it, and the search engine increases the ranking of that post.

There are many ways to do off-page SEO, with whose help you can increase your website's traffic by raising your posts' rankings. Here are some off-page techniques:
  1. Social Sharing
  2. Social Bookmarking
  3. Guest Posting
  4. Forum Posting
  5. Blog Commenting
  6. Blog Directory Submission
  7. Search Engine Submission
  8. Classifieds Submission Site
  9. Video Sharing Site
  10. Photo Sharing Site
  11. Question and Answer Site
  12. PDF / PPT Submission Site