Blekko provides a unique search experience: its mission is to return results that are open and transparent, with a high degree of human involvement. It makes search a collaborative process in which users and partners combine their experience and expertise with traditional algorithmic methods to deliver high-quality, relevant content.
2. Search engines are sites used to source information from other
websites on the World Wide Web. blekko is a consumer-facing,
new-age search engine that delivers quality results from the most
trusted sites. This is made possible by a judicious mix of complex
algorithms and human expertise.
People instantly turn to internet search engines when they wish to
find information about anything on the internet. Major search
engines deliver hundreds of thousands of results within a fraction
of a second.
This has made life immensely easier. In the absence of a search
engine, anybody looking for information would have to perform the
impossible task of remembering a particular website’s URL.
That might be feasible if there were only a handful of websites.
The World Wide Web has billions of websites, and memorizing their
names would be a task befitting a supercomputer, not an ordinary
human being.
3. There are basically three types of search engines. The first, often
called a robot or spider, makes use of complex algorithms to search
through databases of HTML documents.
The second is powered by human submissions, i.e. it relies on
submissions made by human beings to create a catalogue. The third
type is called a hybrid search engine, which combines algorithm-based
programs with human expertise.
Blekko is a hybrid search engine that makes excellent use of human
experience and expertise and combines it with complex algorithms to
deliver outstanding results.
The involvement of users’ experience and knowledge ensures that the
results returned score high on quality and relevance. A user is spared
the agony of visiting dozens of irrelevant pages to find the relevant
information.
We shall now deal in detail with all three major types of search
engines and find out how they work behind the scenes to make our
lives easier and less complicated.
4. ROBOT OR CRAWLER-BASED INTERNET SEARCH ENGINES
It is important to know about meta tags before we delve deeper into
the functioning of crawler-based search engines.
Meta tags are special Hypertext Markup Language (HTML) tags that
provide specific information about a web page. Unlike other HTML
tags, they do not affect how the page is displayed.
Their job is to provide relevant information about the web page: its
creator, relevant keywords, the content of the site, and the interval
at which it is updated. This information is of paramount importance
to search engines, which use it to build their indices.
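The idea above can be sketched in code. This is a minimal illustration, not Blekko's actual pipeline: it uses Python's standard-library HTML parser to pull the name/content pairs out of a page's meta tags, and the sample page itself is made up.

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collects name/content pairs from <meta> tags in a page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        # Only <meta> tags with a "name" attribute carry the
        # descriptive information search engines index.
        if tag == "meta":
            attrs = dict(attrs)
            name = attrs.get("name")
            if name:
                self.meta[name] = attrs.get("content", "")

# Hypothetical page, hard-coded for illustration.
page = """
<html><head>
  <meta name="description" content="A site about search engines">
  <meta name="keywords" content="search, crawler, index">
</head><body>Hello</body></html>
"""

parser = MetaTagParser()
parser.feed(page)
print(parser.meta["keywords"])   # → search, crawler, index
```

A real crawler would fetch the page over HTTP first; the parsing step, however, looks much like this.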
5. Crawler-based search engines use automated software to visit a
website and source relevant information from it. They read the
actual content of the site and observe the links pointing to it.
More importantly, they read the meta tags of the website to learn
more about its relevance and the importance of the information it
dispenses.
A crawler or spider armed with all the necessary information returns to
the central repository, where the all-important task of indexing is
carried out.
The spider returns to the site at frequent intervals to take note of
information that may have been appended, modified, or deleted. The
frequency of these visits is determined by the search engine
administrators.
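The crawl itself is, at heart, a graph traversal: visit a page, record it, follow its links, and never visit the same page twice. The sketch below runs that loop over a hypothetical in-memory "web"; a real crawler would fetch each URL over HTTP and parse it, and the URLs here are invented.

```python
from collections import deque

# Hypothetical link graph: each page lists the pages it links to.
links = {
    "example.com": ["example.com/a", "example.com/b"],
    "example.com/a": ["example.com/b"],
    "example.com/b": [],
}

def crawl(seed):
    """Breadth-first crawl: visit each reachable page exactly once."""
    seen = {seed}
    frontier = deque([seed])
    visited = []
    while frontier:
        url = frontier.popleft()
        visited.append(url)              # a real crawler would fetch
        for out in links.get(url, []):   # and parse the page here
            if out not in seen:
                seen.add(out)
                frontier.append(out)
    return visited

print(crawl("example.com"))
# → ['example.com', 'example.com/a', 'example.com/b']
```

The revisit schedule the text mentions would simply re-run this loop on a timer set by the administrators.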
6. Human-powered search engines build catalogues and indices from the
input provided by humans. In other words, robots (algorithmic
programs) have no role in building the index.
One of the important ways crawlers develop indices is by looking for
relevant keywords on the website. A user turning to a search engine
with a question or query is actually searching through the index that
the crawler has created.
Since keywords play an important part in the indexing of websites,
unscrupulous webmasters resort to unethical practices like keyword
stuffing and spamdexing to increase their website’s visibility.
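The index a user actually searches is, in its simplest form, an inverted index: a map from each keyword to the set of pages that contain it. Here is a toy version, with made-up page URLs and text, assuming whitespace-separated lowercase words as "keywords":

```python
from collections import defaultdict

# Hypothetical crawled pages: URL -> page text.
pages = {
    "example.com/a": "blekko combines algorithms with human expertise",
    "example.com/b": "crawlers build an index from keywords",
    "example.com/c": "human editors curate trusted sites",
}

# Inverted index: keyword -> set of URLs containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

def search(query):
    """Return pages containing every word in the query."""
    results = set(pages)
    for word in query.lower().split():
        results &= index.get(word, set())
    return sorted(results)

print(search("human"))   # pages a and c both mention "human"
```

Answering a query is then set intersection over the index rather than a scan of the web, which is why results come back in a fraction of a second.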
7. WHAT IS SPAMDEXING?
Spamdexing, also known as web spam, black hat SEO, or search engine
poisoning, refers to the deliberate manipulation of search engine
indexes.
Search engines that make use of crawlers depend upon complex
algorithms to determine the relevance of a website. Unethical website
owners use dubious methods to make their websites achieve a higher
ranking in the results returned by search engines.
This practice, often referred to as black hat SEO, wastes the time and
other precious resources of search engine users, who have to wade
through pages and pages of low-quality sites to get to the actual
information.
Search engines severely punish spammers and webmasters who indulge in
search engine spam. Unfortunately, the risk is too low and the rewards
too high to permanently deter tricksters from perpetrating such frauds.
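One crude signal an engine might use against keyword stuffing is keyword density: if a single word makes up an outsized share of a page's text, something is probably off. The sketch below illustrates the idea only; the 20% threshold and the sample texts are arbitrary, not values any real engine publishes.

```python
from collections import Counter

def keyword_density(text):
    """Fraction of the text taken up by each distinct word."""
    words = text.lower().split()
    if not words:
        return {}
    counts = Counter(words)
    return {w: c / len(words) for w, c in counts.items()}

def looks_stuffed(text, threshold=0.20):
    """Flag pages where any single word dominates the text."""
    return any(d > threshold for d in keyword_density(text).values())

honest = "we review laptops and compare prices across many stores"
stuffed = "cheap laptops cheap laptops cheap cheap cheap laptops cheap"

print(looks_stuffed(honest))   # False
print(looks_stuffed(stuffed))  # True
```

Real anti-spam systems combine many such signals (link patterns, cloaking detection, and so on); density alone is easy to game, which is part of why the cat-and-mouse game the text describes continues.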
8. The World Wide Web is a treasure trove of information. It is said that
information is power and today’s search engines have made people
powerful by giving them access to quality information.
A person can find anything about any product or service with just a
few clicks of the mouse, from the comfort of their home.
Spam and other unscrupulous techniques used by charlatan
webmasters, however, expose people to a lot of scams and frauds.
Blekko is a new-age search engine that focuses on quality rather than
quantity. It returns spam-free results from the most trusted of sites.