BLEKKO
Blekko Delivers Spam free Results
Search engines are sites that source information from other websites on the World Wide Web. blekko is a consumer-facing, new-age search engine that delivers quality results from the most trusted sites. This is made possible by a judicious mix of complex algorithms and human expertise.
People instantly turn to internet search engines when they want to find information about anything online. Major search engines deliver hundreds of thousands of results within a fraction of a second.
This has made life immensely easy. In the absence of a search engine, anybody looking for information would have to perform the near-impossible task of remembering a particular website's URL.

That might be feasible if there were only a handful of websites, but the World Wide Web has billions of them, and memorizing their names would be a task befitting a supercomputer, not an ordinary human being.
There are basically three types of search engines. The first uses a robot or spider together with complex algorithms to search through databases of HTML documents. The second is powered by human submissions, i.e. it relies on entries made by human beings to create a catalogue. The third type is the hybrid search engine, which combines algorithm-based programs with human expertise.
Blekko is a hybrid search engine that combines human experience and expertise with complex algorithms to deliver outstanding results.

Involving users' experience and knowledge ensures that the results returned score high on quality and relevance. A user is spared the agony of visiting dozens of irrelevant pages to find the relevant information.
We shall now examine the three major types of search engines in detail and see how they work behind the scenes to make our lives easier and less complicated.
ROBOT OR CRAWLER-BASED INTERNET SEARCH ENGINES
It is important to understand meta tags before we delve deeper into how crawler-based search engines work.
Meta tags are special Hypertext Markup Language (HTML) tags that provide specific information about a web page. Unlike most other HTML tags, they do not affect how the page is displayed.
Their job is to provide relevant information about the web page: its creator and traffic, relevant keywords, the site's content, and the interval at which it is updated. This information is of paramount importance to search engines, which use it to build their indices.
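As a minimal sketch of how a crawler might read those tags, the following uses Python's standard-library HTML parser to collect name/content pairs from a page; the page markup below is an invented example, not taken from any real site.

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collects name/content pairs from <meta> tags."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"]] = d["content"]

page = """
<html><head>
  <meta name="description" content="A page about search engines">
  <meta name="keywords" content="search, crawler, index">
</head><body><p>Hello</p></body></html>
"""

extractor = MetaTagExtractor()
extractor.feed(page)
print(extractor.meta["keywords"])  # search, crawler, index
```

A real crawler would of course fetch the page over HTTP first; only the tag-extraction step is shown here.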
Crawler-based search engines use automated software to visit a website and gather relevant information from it. They read the actual content of the site and note the links pointing to it.

More importantly, they read the site's meta tags to learn more about its relevance and the importance of the information it provides.
Armed with all the necessary information, the crawler or spider returns to the central repository, where the all-important task of indexing is carried out.
The spider returns to the site at frequent intervals to take note of information that may have been added, modified, or deleted. The frequency of these visits is determined by the search engine's administrators.
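The crawl-then-index cycle described above can be sketched as follows. To keep the example self-contained it uses an invented in-memory "web" instead of real HTTP fetches; the site names are hypothetical.

```python
import re
from collections import defaultdict

FAKE_WEB = {
    "example-a.test": "Blekko is a hybrid search engine",
    "example-b.test": "Crawlers build a search index from page content",
}

def crawl(web):
    """Visit every site and build an inverted index: word -> set of sites."""
    index = defaultdict(set)
    for site, content in web.items():
        # Tokenize the page content into lowercase words.
        for word in re.findall(r"[a-z]+", content.lower()):
            index[word].add(site)
    return index

index = crawl(FAKE_WEB)
print(sorted(index["search"]))  # both sites mention "search"
```

Re-crawling at the administrator-chosen interval would simply mean running `crawl` again and replacing (or merging) the old index.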
Human-powered search engines build catalogues and indexes from input provided by humans. In other words, robots (algorithmic programs) play no role in building the index.
One of the important techniques crawlers use to build indexes is looking for relevant keywords on a website. A user who turns to a search engine with a question or query is actually searching through the index the crawler has created.
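That point is worth making concrete: answering a query does not mean scanning the live web, it means intersecting entries in the crawler's index. A sketch, assuming an inverted index that maps each word to the set of pages containing it (the page names here are hypothetical):

```python
index = {
    "hybrid":  {"page1", "page3"},
    "search":  {"page1", "page2", "page3"},
    "engine":  {"page1", "page2"},
}

def search(index, query):
    """Return pages containing every query word (simple AND semantics)."""
    words = query.lower().split()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())  # keep only pages with this word too
    return sorted(results)

print(search(index, "hybrid search engine"))  # ['page1']
```

Real engines add ranking on top of this retrieval step, but the lookup itself happens entirely within the pre-built index.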
Since keywords play an important part in the indexing of websites, unscrupulous webmasters resort to unethical practices such as keyword stuffing and spamdexing to increase their website's visibility.
WHAT IS SPAMDEXING?
Spamdexing, also known as web spam, black-hat SEO, or search engine poisoning, refers to the deliberate manipulation of search engine indexes.
Search engines that use crawlers depend on complex algorithms to determine a website's relevance. Unethical website owners use dubious methods and processes to push their sites higher in the results returned by search engines.
This practice, often referred to as black-hat SEO, wastes the time and other precious resources of search engine users, who have to wade through page after page of low-quality sites to reach the actual information.
Search engines severely punish spammers and webmasters who indulge in search engine spam. Unfortunately, the risk is too low and the rewards too high to permanently deter tricksters from perpetrating such frauds.
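One simple signal a ranking algorithm can use against keyword stuffing is flagging pages where a single word dominates the text. The sketch below is illustrative only; the 25% threshold is an assumption for the example, not any real engine's cutoff.

```python
from collections import Counter

def is_stuffed(text, threshold=0.25):
    """True if any single word makes up more than `threshold` of all words."""
    words = text.lower().split()
    if not words:
        return False
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words) > threshold

normal = "blekko combines algorithms with human curation to rank pages"
spammy = "cheap watches cheap watches cheap watches buy cheap watches"

print(is_stuffed(normal), is_stuffed(spammy))  # False True
```

Production spam detection combines many such signals (link patterns, hidden text, duplicate content); word frequency alone is easy to game, which is part of why human curation of the kind Blekko uses remains valuable.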
The World Wide Web is a treasure trove of information. It is said that information is power, and today's search engines have made people powerful by giving them access to quality information.

A person can find out about any product or service in just a few clicks of the mouse, from the comfort of home.
Spam and other unscrupulous techniques used by charlatan webmasters, however, expose people to a lot of scams and frauds.
Blekko is a new-age search engine that focuses on quality rather than quantity. It returns spam-free results from the most trusted sites.
