Prakash Sharma SEO - Advanced SEO Interview Questions and Answers 2014


What is SEO?

SEO is a process by which you make your web pages more search engine friendly in order to get the much desired higher page position. SEO is the process of improving the position of your web page or web site in the search engines. SEO = Search Engine Optimization, i.e. getting your site ranked higher so more people show up at your doorstep.

What's the difference between SEO and SEM?

While some people use SEO and SEM interchangeably, SEO (search engine optimization) is actually a part of SEM (search engine marketing).

SEO refers to the process of using on and off page factors (typically free) to get your web pages ranked for your chosen keywords in order to get more search engine traffic to your sites. SEM takes it a step farther to include using paid search engine listings and paid inclusion to get more traffic to your websites.

What's the difference between paid and organic search listings?

Organic search engine listings are the main results users see when they do a Google search. The websites appearing in the organic listings appear because those sites are most relevant to the user’s keywords. Indeed, most of these sites appear in the top of the search engine results because the webmasters of these sites have used SEO tactics to ensure top rankings.

The paid (or “sponsored”) listings usually appear on the top, bottom and to the right of the regular organic listings. Usually these are pay per click (PPC) ads, which means the website owner only pays when someone clicks on his ad (as opposed to paying for impressions).

This isn’t an either/or game. Just because you do SEO doesn’t mean you can’t/shouldn’t use PPC and vice versa.

SEO is not free traffic, it takes time and/or money to get good organic rankings but in the long run it’s usually cheaper than PPC.

What's on-page SEO?

On-page SEO refers to the things you do on your own site to enhance its ranking in the search engines. This includes but is not limited to:

• Creating content around specific keywords.
• Formatting/designing your site so that the most important keywords are emphasized and appear near the top of the page.
• Including the chosen keywords in meta tags.
• Including the keywords in the navigation menu and other links.
• Using your keywords in other parts of your site, such as the title of the page, the file name, etc.
• Using related keywords on the site (see the question on LSI for more information).

On-page SEO helps you to promote your site, and this technique is popular because it helps webmasters get a better SERP (Search Engine Ranking Position). These are the on-page techniques to follow:

• Your keyword density should be moderate across your web pages in general. Do not do keyword stuffing.

• Application of bold/strong, italic/underline and emphasis: search engine spiders look for words in a web page that are bold, italic or underlined. If you put important words in these tags, then search engines take them as important keywords and your page will be optimized for those keywords. The usual syntax looks like this, for example: <b>On page SEO</b>, <strong>On page SEO</strong>, <i>On page SEO</i>, <u>On page SEO</u>.

• Text size: you can set the major items, such as headings, larger than the body text; the heading tags (H tags) are appropriate for this. Text in heading tags is easily picked up by search engines.

• Title: the title tag is an important on-page SEO element, often ignored by webmasters. The content within the title tag should be short, unique and built around your main keyword.

• Meta description: your description is very important for search engines. Avoid descriptions that are too short or too long; the length should not exceed 160 characters. Like the title, the meta description should be unique and should not be stuffed with keywords.

"Description"

• Meta keywords: search engine spiders make heavy use of meta keywords. You can put 10 to 15 keywords separated by commas (,). The keywords you choose must be relevant to the content of your page. When optimizing a website, a page should not try to focus on all keywords at once; only one or two phrases should be duly highlighted throughout the page content.

<meta name="keywords" content="keyword1, keyword2, keyword3">

• Image tags: use them to tag images with keywords. When someone searches with the appropriate keywords, search engines can then return the appropriate images. Giving the image file a descriptive name also helps search engines find the right kind of images for their results.

• Internal link structure: the pages of a web site should be internally linked. If the internal navigation of the site is well structured, then search engines can reach each web page quite easily.

• URL name: the name of the page address or domain name is also important for on-page SEO. URLs that are too long or stuffed with keywords are rejected by search engines, and the URL should not contain a lot of numbers. The web page name should describe the topic briefly. For example, if you have graphic content for direct-response mini-sites on a single page, name the page something like minisite-graphics.html; do not use names like page2.html. Spiders and search engines can identify such pages easily, and visitors can follow the content of your pages more quickly if you use page names that way.

• Site map: include a site map on your website. Without a site map, your web pages are harder to find through search. A site map keeps your web pages indexed and makes crawling easy for search engine bots. It also helps visitors navigate the site easily.

Crawler notifications, for example the website language, the name of the copy writer, and the website owner and publisher declaration, should be placed in the meta tags.

Tip 1- Name your URL or domain name in such a way that it contains the targeted keyword related to your business. Search engines will pick this up faster in the organic (natural) results.


Tip 2- Use the most used keywords in your title tag for optimization of your page in a natural way.

Tip 3- The next thing that you need to do is keep the Meta tag intact with your preferred set of keywords.

Tip 4- You can implement header tags, which communicate to search engines that the information is significant. You can use H1-H6, but what is recommended is that you use H2 as a common tag for both the header and footer.

Tip 5- You can use unique content for the on-page SEO optimization implemented on your website. Ideally the article should be 30% fresh and no other website should contain it.
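Putting the tips above together, here is a minimal, hypothetical page sketch (the keyword "dog training", the file names and the text are made-up placeholders, not part of the original checklist) showing where the title tag, meta tags, headings and image alt text typically sit:

<html>
<head>
  <title>Dog Training Secrets - Example.com</title>
  <meta name="description" content="A short, unique summary of the page, under 160 characters, built around dog training.">
  <meta name="keywords" content="dog training, puppy training, house training">
</head>
<body>
  <h1>Dog Training Secrets</h1>
  <h2>House Training a New Puppy</h2>
  <p>Opening paragraph that uses the target keyword naturally within the first 100 words...</p>
  <img src="dog-training-tips.jpg" alt="dog training tips">
</body>
</html>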

For good web site optimization, make your page coding error free. Erroneous pages are ignored by search engines, so it is necessary to validate the page encoding. Check your pages with the W3C validator. Once your page is validated according to the international standard and all the SEO techniques are implemented on the page, it will rank better in the results of all the major search engines.

Title :- (Use your keywords in your title)

URL :- (Use your keywords in your URL slug - shorter is better, hyphens not underscores)

Headline :- (h1) [Not an option at Hubpages - but use your capsule headers (h2) instead]

Natural repetition in body of content :- repeat your keywords, but don't "stuff" them unnaturally

ALT attribute :- is controlled via the caption segment of the image capsule

File Name :- Hubpages changes the file name upon upload

Keyword Location :- Use your keyword within the first 100 words of your content; also use it in your custom summary

Internal Links :- Links to your content from within the site itself are valuable; using tag pages in order to get internal links will give you a boost on top of the powerful built-in interlinking that HP already employs. This is essentially what an HP tag is: an inbound link from within the site that you can control. Also be sure to interlink your own content with anchored text links.

Italics/bold :- they have a tiny SEO benefit (italics more than bold) and do not need to be used more than once!

Content :- once it is seen, it's the content that will get the desire to share started. So the tired maxim "content is king" has some truth to it.

What's off-page SEO?

Off page SEO refers to those things you do outside of your own web pages to enhance their rankings in the search engines.

This is a glorified way of saying, “get links” and did I mention, “more links”.

Off – Page Factor


• Web Directories Submission
• Article Submission
• Blog Creation and Posting
• Press Release
• Classified Ads
• Forum Posting
• Yellow Page Submission
• E Book Submission

Web Directories Submission

A web directory or link directory is a directory on the World Wide Web. It specializes in linking to other web sites and categorizing those links. One of the best means to assemble links to your sites is through web directory submission. But choosing a directory that is best for you is not as easy as it seems. You will come across free web directories, paid lists and regional ones. It is up to you to choose the most beneficial directories for your submission.

Advantages of submitting to a web directory

* Directory Submission Helps Increase Your Backlinks - Web directory submission helps to increase your backlinks, as the main aim of submitting to a web directory is to build maximum links.

* Assures Search Engine Indexing - Search engines discover your site by following the backlinks from other sites. Once a search engine locates your link on the directory sites, it will follow that link to your site, thereby increasing the chances of your site being indexed in the search engine databases.

* Recurrent Bot Visits - Search engine robots, usually called bots, frequent all the sites on the web to accumulate data for archiving purposes. So the more backlinks a site has, the more often these bots will visit, which will result in your site attaining a better rank.

* Increases Traffic To Your Site - There are numerous people who search for terms that finally lead them to some directory on the web. And there are also many people who just surf web directories to see whether they can find anything that catches their fancy. So the more directories your site is listed in, the greater the traffic to your site.

Article Submission

Article submission is also referred to as article marketing; it helps a website with its branding and popularity and, importantly, it also helps in link building. Article submission is a great resource for increasing traffic to your website and also getting one-way links back to your site.

• A great source of one-way links.
• Generates interest in the content of your website so as to attract new, unique visitors.
• A good alternative to link exchange and link exchange networks.
• Saves time in getting back-links to your website compared with directory submission and link exchanging.

Tips to Promote Your Article and Drive More Traffic to Your Website

1. Write a meaningful & brief description

It's important to write a meaningful and brief description of your article; most readers skip the article if they do not like the description. Don't write stories: you have just 300 characters to describe your article. Do not use the same description on all your articles, as it creates duplication.


2. Keywords are important

You might be surprised to know that keywords play an important role in helping readers and search engines find your article. Always write accurate keywords and avoid keyword stuffing. You can get keywords from your article contents, or you can think about what a user will search for to find your article.

3. Submit to social bookmarking sites

We recommend you submit your articles to social bookmarking sites such as Twitter, Digg, Reddit, Yahoo! Buzz, StumbleUpon, etc. With a few minutes of work you can not only promote your article easily but also drive a good amount of traffic to your website. For your convenience, we have put sharing links at the top and bottom of each article.

Forum Posting

A forum is a place for discussion on a website. In a forum, people can post their queries and share their views with other members. The idea behind forum posting is that of an online community, but you can also publicize your products through questions and answers with some links to your website. This is by far one of the most successful forms of attracting direct clientele and will help you in your website optimization. An Internet forum is an online discussion site with different threads, each dedicated to a particular topic. Posting in forums is a good way to gain quality backlinks with the anchor text of your choice. Forums are also good for building your brand or image.

Directory Submissions: Submitting your site URL to the relevant categories of popular directories like DMOZ, Best of the Web, etc can help you to get valuable back links.

Article Submission: An easy way to get link juice via back links is to submit unique articles to various popular article submission sites like EzineArticles.com, GoArticles.com, ArticleDashboard.com, iSnare.com, and ArticlesBase.com.

Forum: Set up your account on some popular forums, build your credibility there and soon you will be allowed to add your site URL in your signature, which will act as a backlink and help to lure your avid followers to your site.

Blogging: Whether you have a blog of your own or want to write as a contributing blogger for some popular blogging sites, this can prove to be an effective way of making people take notice of what you have to offer. RSS Feed generation and submission also help to keep your avid readers interested in your updates and news even when they don’t have the time to actually visit the site.

Social Bookmarking and Q and A Postings: You can also further your SEO interests by posting question and answers on Yahoo Answers, and via social bookmarking.

Social Networking: Networking on various social platforms like Twitter, Facebook, MySpace, and LinkedIn is the newest buzz in SEO tactics, which webmasters are using with zeal. You too can join the bandwagon after some careful planning.

How quickly will I see results?

If you target long tail keywords you can see results pretty quickly but always remember SEO is a long term strategy not a set and forget thing.

If you’re after more competitive keywords prepare to commit to it for at least three months of consistent effort.

Should I rank my own content or articles on other sites?


Yes – but let’s qualify that.

Because you can’t control what third-party sites do, you should focus the vast majority of your efforts on ranking content on your own sites.

However, you can leverage high-ranking third-party sites by posting SEO'ed content on them and then including a link back to your own site. Not only do you get the SEO benefits of the backlinks, you'll also get indirect search engine traffic from people clicking through to your main site.

Other Factors ("Advanced" SEO)

What other factors affect rankings besides backlinks?

Where you’re getting your links, the quality of these links, the relevancy of these links, how many links you have and what keywords you’re using as the anchor text all affect your rankings. But there are other factors that affect your ranking, including but not limited to:

• On page optimization factors – this is how well you’ve optimized your tags, content, formatting, keyword proximity, site map, and links on your web page. This also includes whether you use your keywords at the top of your page and in your “alt” tags (both good things).

• Having a lot of outgoing or reciprocal links pointing to "bad" sites (like link farms) can negatively impact rankings.

• Whether you have unique content (which the search engines like).

• How frequently you update your site. Faster isn't necessarily better; check what ranks well for your niche and aim to match it.

• Whether your domain includes your primary keywords.

• Your domain's age, reputation, IP address and whether it's a top level domain (e.g., a .com is better than a .info, although probably not by much).

• Shady practices such as keyword stuffing or using text that's the same color as the background can negatively affect your rankings. This is only an issue if your site gets manually inspected and you don't have a legitimate reason for it.

• Showing one page to the search engines and another page to visitors negatively affects your rankings (cloaking and doorway pages).

• Frames negatively affect your rankings.

• Using content that the search engines can't read, like audio, Flash, videos, graphics (without alt tags), etc.

• Whether you have a robots.txt file that tells the search engine bots to stop crawling or indexing your site.

Does domain age help?

Yes – search engines view an older domain as more trustworthy, which means older domains may have a slight advantage. But this is only true if the older domain has a good reputation (e.g., it hasn’t been blacklisted, penalized or banned from the search engines).

Cached Links

Google takes a snapshot of each page it examines as it crawls the web and caches these as a back-up in case the original page is unavailable. If you click on the "Cached" link, you will see the web page as it looked when Google indexed it. The cached content is the content Google uses to judge whether the page is a relevant match for your query.


When the cached page is displayed, it will have a header at the top which serves as a reminder that this is not necessarily the most recent version of the page. Terms that match your query are highlighted on the cached version to make it easier for you to see why your page is relevant.

The "Cached" link will be missing for sites that have not been indexed, as well as for sites whose owners have requested that their content not be cached.

Why would I want to 301 redirect an aged domain?

Google passes link juice/authority/age/ranking strength (call it what you like) from one domain to another if you do a 301 redirect on it.

For the less tech-savvy out there, the 301 code means "permanently moved" and is a way to announce that your site that was once "here" is now "there".

The upshot of this is that you can buy an aged domain and "301" it to the site you're trying to rank, instantly passing on all that lovely ranking power it has acquired just by sitting in some domain squatter's account for 10 years.

Just make sure they do a domain push at the same registrar it was originally registered at or all these effects are lost.

Also, you have to wait up to 2 weeks to see the benefits. They are not instant!
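How you actually set up the 301 depends on your server. As one hedged illustration only, assuming an Apache server with the mod_alias module enabled and olddomain.com / newdomain.com as placeholder names, a site-wide 301 can be declared in the old domain's .htaccess file:

# .htaccess on the old (aged) domain: permanently redirect every request to the new site
Redirect 301 / http://www.newdomain.com/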

What is rel="canonical"?

If you have two or more pages with similar content, you can tell Google which is your preferred page to show in the search engine results. This is referred to as your “canonical” page. If Google agrees this designated page is the best version, it will show this preferred page in its index.

To tell Google which page you want listed as the canonical page, add the following bit of code into the head section of the similar (non-canonical) pages:

<link rel="canonical" href="http://www.example.com/filename.html"/>

Naturally, you should replace the example.com/filename.html with your actual domain name and file name.

For example…

Example.com/file1.html is your preferred canonical page, the one you want displayed in the search engine results. You don't have to add any tags to this page.

Example.com/file2.html and Example.com/file3.html have similar content to example.com/file1.html. As such, you'd place the canonical code within the <head> tag of these two pages to tell Google that example.com/file1.html is the most important page.

The most common reason to do this is to tell Google that these pages are all the same:
• Example.com
• www.example.com
• www.example.com/index.html
• Example.com/index.html
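For instance, to consolidate the homepage variations listed above, each non-preferred version (such as www.example.com/index.html) would carry a canonical tag in its <head> pointing at whichever URL you prefer, e.g.:

<link rel="canonical" href="http://www.example.com/"/>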

Don't go overboard with this, and certainly don't use it on stuff like paginated comment pages just because they are "similar" in that they contain the same post. They contain enough unique content (the comments) to be treated as unique, and Google will start to ignore your legitimate canonicals if it finds too many instances of you misusing it.

Yes, Google thinks it’s smarter than you, deal with it and move on.

What's the truth about duplicate content?

There is no duplicate content penalty when it comes to multiple sites. Otherwise, your shady competitors could just create near-clones of your site to make your site disappear. But that doesn’t happen. Indeed, run a search for a PLR article and you’ll likely see many SE results for that same article.

TIP: Nonetheless, it’s better if you have unique content, rather than competing with others for the same keywords using similar content.

What about duplicate content on your OWN site? In other words, what happens if you have two web pages with the same content but different file names? In that case, refer to the question on rel-canonical for instructions on how to deal with this.

What is a doorway page/cloaking?

Cloaking refers to showing one page to a search engine and a different page to your human visitors. Doorway pages are optimized pages that pull in SE traffic, but this traffic is immediately redirected (either manually or automatically) to a different page.

Google and other search engines do NOT like these practices.

What are Meta tags?

Meta tags are pieces of information that you put within the <head> section of your web page's source code. These meta tags primarily tell search engines and other user agents about your site's content (description), keywords, formatting, title and whether you want the search engines to crawl (and index) the page.

There are also some tags that are shown to the user, such as the title tag (which is the title that appears at the top of your browser).

Note that the big search engines no longer take these tags into consideration when ranking your web pages (with the exception of the title tags). Some smaller and more specialized search engines still utilize the keywords and description tags when ranking and displaying your site.

What is the "freshness" factor?

Search engines such as Google prefer “fresh” (newly updated) web pages and content over stale content. That’s why when you first add content to your site – such as a new blog post – this page may sit high in the rankings for a while. Eventually it may sink to a more realistic ranking.

It’s this “freshness factor” that allows your pages to get those higher rankings, even if the ranking is temporary. Thus updating your pages frequently can help push them to the top of the rankings.

This is one of the primary reasons why you hear people talking about how “Google loves blogs”. Google doesn’t love blogs, Google loves regularly updated sites.

What is a C-class IP and why should I care?

A computer's IP address is its address on the Internet. A C-class block of IPs is a group of IP addresses that sit next to each other. Links from the same IP have very limited value. Links from the same C-class IP block have a little more value, but still not much. Links from different C-class IPs are worth the most.

Not as important as it once was, especially when it comes to sites hosted on huge shared server clusters like those at HostGator/ThePlanet, Blue Host and others. The shortage of available IP addresses is driving this.

Most importantly tons of domains all on the same IP or C-Class that all interlink are the fastest way to announce to Google that you’re trying to cheat the system. This may have worked a couple of years ago, now it’s just a flashing neon sign telling Google to deindex you.

What is LSI?

LSI is short for latent semantic indexing. This refers to different words that have the same or similar meanings (or words that are otherwise related). For example, “housebreaking a dog” and “housetraining a puppy” are two entirely different phrases, but they mean about the same thing.

The reason this is important is because Google analyzes web pages using LSI to help it return the most relevant results to the user.

For example, a page that has the keyword “housebreaking a dog” but NO other similar words (like housetraining, paper training, potty training, puppy, dogs, puppies, etc) probably really isn’t about housebreaking. End result: Google won’t rank it as high as a web page that does include a lot of relevant, related terms.

What does this mean to you? When you create a web page around a keyword, be sure to also include the keyword’s synonyms and other related words.

Pure LSI analysis isn't scalable enough to handle the volumes of data that Google processes. Instead they use more streamlined and scalable content analysis algorithms that have some basis in LSI and other related technologies. It also appears that this analysis is ongoing and not just a one time run through the system.

Cliff Notes: Don’t write content that a drunk 4th grader would be ashamed of. Spend the extra couple of minutes to write decent stuff and you’ll be fine.

RSS feed

Sample feed

Here's a sample RSS file which can be used as a template for your first feed:

<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>John Smith News</title>
    <link>http://JohnSmithHomepage.com/</link>
    <description>Latest stories from John Smith</description>
    <language>en-us</language>
    <lastBuildDate>Tue, 10 Jun 2003 09:41:01 GMT</lastBuildDate>
  </channel>
</rss>
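The sample above defines only the channel itself. In practice, each story or blog post is listed as an <item> element inside the <channel>; a hedged example entry (the title, URL and summary are placeholders) might look like this:

    <item>
      <title>First story</title>
      <link>http://JohnSmithHomepage.com/first-story.html</link>
      <description>A one or two sentence summary of the story.</description>
      <pubDate>Tue, 10 Jun 2003 09:41:01 GMT</pubDate>
    </item>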

Should I build links for human beings or the search engines?


Both but make sure you know which one you’re going for at any point.

If you want human beings to click the link then make sure your content is high quality and worth that click.

If it's never going to be seen by a human then don't spend a week writing a beautifully crafted piece of prose; use automation or anything you can lay your hands on to get links fast.

What is an XML Sitemap?

This is a listing of all the pages on a website, along with important information about those pages (such as when they were last updated, how important they are to the site, etc). The reason to create a sitemap is so that the search engines can easily find and crawl all your web pages.

This is really only important if you have a large and complex site that won't be crawled easily. A 10-20 page HTML mini-niche site doesn't really need one while a 20,000 page product catalog might benefit from one. Also avoid automating this on WordPress autoblogs since sitemap generation is a processor hog and can get you kicked off of shared hosting.

What's the sandbox?

The Google Sandbox Effect is a theory used to explain why newly-registered domains or domains with frequent ownership changes rank poorly in Google Search Engine Results Pages (SERPS). In other words new websites are put into a “sandbox” or a holding area and have their search ratings on hold until they can prove worthy of ranking. The disappointment webmasters feel when Google's stupid algorithms don't appreciate their site. It can't be them so it must be Google's fault.

What is robots.txt for?

This is a file that some webmasters include in some or all of their website directories. Search engine robots (bots) look at this file to see if they should crawl and index pages on your site, certain file types or even the entire site. An absence of this file gives them the green light to crawl and index your site.

If you don’t want search engine bots to crawl your site, then create a robots.txt file in your root directory that includes this bit of code:

User-agent: *
Disallow: /

You can also create a meta tag that keeps the search engines from indexing your site:

<meta name="robots" content="noindex">

Important: Only "well behaved" bots read robots.txt, so don't use it to "protect" content on your site; use it just to keep Google from indexing stuff. Most importantly, be aware that malicious bots will look for the pages you're asking not to be indexed and visit them with priority to see why.
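If you only want to keep bots away from part of your site rather than all of it, you can disallow specific directories instead. A small sketch, assuming a hypothetical /private/ folder and a sitemap at the site root (the Sitemap line is optional, but the major engines support it):

User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml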

What's a spamblog?

A spamblog (or splog) is a blog used primarily to create backlinks to another site. Splogs tend to be populated with fake articles, commercial links and other garbage content.

In other words, they provide little or no value to a human reader. As such, the search engines tend to de-index these sites once they discover them.

What's an autoblog?

An autoblog uses an automation tool to pull in content from other sources and post it on the blog. In other words, it’s an easy way to automatically and frequently update a blog.

They are a great way to build foundation sites to provide link juice to your higher ranking, more competitive sites but a good way to get sites banned if you don’t know what you are doing.

Most importantly there is a lot of discussion about how legal they are due to reproducing content. I’m definitely not going to get involved in that discussion and I ask you not to turn this thread into a flame fest discussing it.

What's an "authority" site?

An authority site is one that is seen as influential and trustworthy by search engines, and thus it tends to rank well. Authority sites tend to be well-established sites that have a lot of high-quality, relevant content as well as links from other authority sites.

Obviously, getting your own website recognized as an “authority site” will boost your rankings. However, it’s also beneficial to get backlinks from these authority sites.

What are "supplemental" results?

These are results that are displayed in Google’s index after the main results – especially if Google’s trusted websites didn’t return many results. These supplemental results are no longer labeled as “supplemental” results. However, this secondary database still exists to index pages that have less importance, such as duplicate content on your site or orphaned pages.

For example, if you have multiple pages on your site with the exact same content, then Google will index your most important page in the main index, and place the duplicate page in the supplemental index.

Google and Page Rank

What is Page Rank?

Page Rank (PR) is a numeric value from 0-10 that Google assigns to your individual web pages, and it’s a measure of how important that page is.

Google determines this importance by looking at how many other high quality, relevant pages link to a particular page. The more links – and the better quality those links are – the more “votes” a page gets in terms of importance. And the more “votes” a site gets, generally the higher the PR.

How often does Google update Page Rank?

It used to be every 3 months but it’s becoming more and more erratic.

Does PR matter?

Yes and no.

Originally PR was all that mattered in the search rankings but today that’s just not true since there are a myriad of other factors that Google considers when weighting who should appear where.


That said, high PR is always worth having just don’t obsess over it.

What is the "Google Dance"?

When "stuff" changes, the SERPs fluctuate, sometimes wildly. One day your site could be number 1 and the next nowhere to be seen. One of the main contributing factors to that is how Google sees your backlinks (which you're consistently building, right?). Don't obsess over it, just keep building and you'll be fine.

How does Google personalize my results?

If you’re signed into Google, then Google keeps track of what search engine results you’ve clicked on. And even if you’re not signed in, Google keeps track of what results people who use your computer click on.

Over time, Google starts to detect a pattern. For example, if you seem to always click on Wikipedia results, then Google will start showing you more Wikipedia results. If you always click on health results from webmd.com, then you’ll get more webmd.com results when you run a health-related search.

Link Building Basics

What is a backlink?

This is when a third-party website links to your website. For example, if you write and submit an article to an article directory, then you'll have an incoming link (a backlink) from the directory.

The search engines prefer one-way incoming backlinks from high-quality, relevant websites.

What is anchor text?

When you create a link, the anchor text is the clickable part of the link. For example, in the phrase, “go to Google,” Google is the anchor text.

The reason this is important is because you want to use your keywords as your anchor text on incoming links. So if you’re trying to rank for “gardening secrets,” then those two words should make up the anchor text for several of your backlinks.
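In HTML terms, the anchor text is simply the text between the opening and closing <a> tags. For the "gardening secrets" example above, a backlink might look like this (the URL is a placeholder):

<a href="http://www.example.com/gardening-secrets.html">gardening secrets</a>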

What is a do-follow/no-follow link?

There are two types of “nofollow” attribute. The robots meta tag version –

<meta name="robots" content="nofollow" />

Which tells (well behaved) bots/crawlers/spiders not to follow links on the page

And the link attribute

<a href="http://www.google.com" rel="nofollow">

Which tells search engines not to count the link in terms of ranking pages.

In theory these links are worthless for boosting your search engine rankings. In practice you’ll often see some benefit, especially when mixed in with a load of dofollow links.


Links are automatically “dofollow” in the absence of the rel=”nofollow” attribute. There is no rel=”dofollow” attribute.

What are static and dynamic websites?

There are many static websites on the Internet. You won't be able to tell immediately if a site is static, but the chances are that if the site looks basic, is for a smaller company, and simply delivers information without any bells and whistles, it could be a static website. Static websites can only really be updated by someone with a knowledge of website development. Static websites are the cheapest to develop and host, and many smaller companies still use these to get a web presence.

Advantages of static websites

* Quick to develop
* Cheap to develop
* Cheap to host

Disadvantages of static websites

* Requires web development expertise to update site
* Site not as useful for the user
* Content can get stagnant

Dynamic sites, on the other hand, can be more expensive to develop initially, but the advantages are numerous. At a basic level, a dynamic website can give the website owner the ability to simply update and add new content to the site. For example, news and events could be posted to the site through a simple browser interface. Dynamic features of a site are only limited by imagination. Some examples of dynamic website features are: content management systems, e-commerce systems, bulletin/discussion boards, intranet or extranet facilities, the ability for clients or users to upload documents, and the ability for administrators or users to create content or add information to a site (dynamic publishing).

Advantages of dynamic websites

* Much more functional website
* Much easier to update
* New content brings people back to the site and helps in the search engines
* Can work as a system to allow staff or users to collaborate

Disadvantages of dynamic websites

* Slower / more expensive to develop
* Hosting costs a little more

Types of backlinks?

TBD

Can paid links harm my ranking?

Google’s official stance is that buying links is an attempt to manipulate rankings – and Google frowns on this practice.

In reality, however, it’s very hard for Google to penalize you for buying links (and they wouldn’t be able to tell for sure anyway). Indeed, if there was a penalty, then you could destroy a competitor simply by purchasing links to their site and then reporting them to Google. Poof, competition gone.

Of course it doesn't work that way. As such, if there's any "penalty," it may just be that Google doesn't "count" links from paid sources.

TIP: Google does penalize the sites that are selling these backlinks – so if you buy backlinks, be sure that the backlinks aren’t coming directly from the penalized sites.

Are reciprocal links bad?

They’re not bad, per se, especially if they’re coming from relevant, high quality websites. However, one-way incoming links tend to be more valuable in terms of SEO.

What is a one-way link?

This is a non-reciprocal link. That means that Site A links to Site B, but Site B does NOT link back to Site A.

The search engines prefer to see one-way links from relevant, quality sites.

What is a Two-way link?

A Two way (reciprocal) link is an agreement or mutual understanding between two webmasters to provide a hyperlink within their own website to each other's web site. Website A has a link to website B. If B returns a link to A, then that link is called a reciprocal link.

What is three-way linking?

Three-way linking is a way for two webmasters to exchange links so that each person’s website gets a one-way link (rather than a reciprocal link).

In order to make this work, at least one of the webmasters has to have a second site in the same niche. Here’s how it works:

Webmaster 1 links his Site A to Webmaster 2’s Site B. Then Webmaster 2 links his Site C to Webmaster 1’s Site A.

Thus Sites A, B and C all have one-way incoming links, like this:

Site A -> Site B -> Site C -> Site A

What is a site wide link?

These are links that are found on every page of a website. For example, many people have a link to their “home” page (the index page) on every other page of their web site. That’s a site wide link.

What is pinging?

Pinging is informing web-crawling bots (such as search engines or directories) that you’ve updated the content on your web page. The goal is to get these bots to crawl and index your new content immediately.

For example, if you post a new article on your blog, you can use pingomatic.com or pingler.com to inform multiple bots about this change.

Advanced Link Building

What is link velocity?


This refers to how quickly you gain backlinks. For best results, maintain a consistent link velocity.

Most importantly don’t build a load of backlinks (especially with fast indexation techniques) and then stop. Google sees this as a news article that was interesting for a short period of time but no longer relevant so stops ranking it. “Too many links” or “links built too fast” are rarely a problem but inconsistency is.

Can I build links too fast?

Yes and no. If you've got a brand new domain name and you fire up some of the more powerful link spamming automation software, you'll get your domain flagged quicker than you can say, "help me, my site is gone".

If you’re building links manually or controlling your usage of serious spam software you’ll be hard pushed to build links too fast on any domain that’s already been aged a bit. Just be consistent.

If you think you can build links too fast on any site here’s an experiment for you next time you’re having a slow weekend. Go out and buy the fastest, spammiest link building software you can lay your hands on and pick a Wikipedia article that currently ranks quite well. Go nuts. All you will do is strengthen its position.

What is page rank sculpting?

There are various techniques available to channel link juice through the links you actually want to receive it, and thus rank them higher. In theory Google has corrected this, but several experiments have shown this isn't the case, although the actual PR passed through the links no longer gets affected.

What is a link wheel?

A link wheel refers to setting up multiple pages on multiple third-party websites (usually at least five) as a means of getting backlinks to your main site.

You link these properties to each other, but not reciprocally. For example, you link your EzineArticles article to your Squidoo page, then link your Squidoo page to Hub Pages… and so on. Finally, you link each of these third-party pages to your main site.

By using sites with a ton of content (and other SEOs backlinking them) you're naturally tapping a bigger seam of link juice. Take advantage of this by writing high quality content for them so human beings follow the links as well, since they will rank alongside your money site.

What is a mininet?

This is like a link wheel, except that you own all the sites that you’re linking together. You may link together a series of smaller niche sites, with each smaller site linking to your main site.

For example, you might link your dog housetraining site to your dog obedience site, and then link your dog obedience site to a site about training dogs to do tricks. All of these smaller niche sites would then link to your main dog training site.

What makes a good site for a link wheel?

Web 2.0 properties and other websites that have a high Page Rank. The best ones are sites where you get a page that will be automatically linked to from all over the site. Article directories like EzineArticles are perfect for this since you get tons of internal links to kick things off with.

What is link bait?


This means “baiting” others into linking to your site. Typically, this means posting unique, controversial, extremely useful or otherwise entertaining content or tools so that others naturally link to your web page.

In other words, you create and post viral content.

What is a link farm?

Link farms consist of large networks of sites whose sole purpose is to generate pages that can be used to link out to other sites that are actually worth something.

They are pretty much essential to rank for more highly competitive keywords but don’t attempt this unless you really know what you are doing. Google is smarter than you!

What is a footprint?

TBD

How do I search for footprints?

TBD

What is a proxy?

A proxy server is one that sits between your computer and the Internet, and using one allows you to go online somewhat anonymously. If you get online using a proxy, no one can trace your IP address back to you and your computer.

For example, you can use a proxy to set up multiple EzineArticles.com accounts.

Indexation

How do I get my site indexed?

Don't bother submitting your site through the traditional methods. The fastest way to get a site to appear in Google's index is to create backlinks to it. Use social bookmarking sites to create lots of easy-win links from sites that are spidered regularly, and submit any RSS feeds you've got to directories.

If you’re really keen to get indexed as fast as humanly possible –

• Stick AdSense on your pages (even if you remove it later) as this forces Google to spider you.
• Set up an AdWords campaign to your domain (Google has to spider you to determine your quality score).
• Search for your domain name.
• Perform site: and link: searches on your domain.
• Visit your site using accounts with some of the most widespread ISPs (e.g. AOL) since their logs are used to find new content.
• Email links to your site to and from a Gmail account.

How do I get my backlinks indexed?

The slow way is to wait for the search engines to naturally find them. The faster way is to ping the page after you leave a backlink. For truly fast backlink indexing, social bookmark them or create RSS feeds with the links in them.

How can I tell if my site has been visited by a spider/bot?

By checking your traffic logs and statistics. Most traffic analyzing software will recognize and label the bots and spiders that crawl your site. You can also recognize these visitors manually, as the "user agent" is usually labeled something obvious, such as "Google Bot."

Statistics and Monitoring

What percentage of people click on the first listing in Google?

Only Google knows for sure, but estimates range from about 40% to 50%. AOL once released their data, which suggested that 42% click on the first listing. “Heat map” studies tend to lean more towards 50% or more.

How do I use Google alerts to monitor the SERPs?

All you have to do is get a Google account and then go to Google Alerts. There you enter the keywords you want the tool to monitor the SERPs for, choose “comprehensive,” choose the frequency you want to receive the alerts and then enter your email address where you want to receive the alerts.

Once you’ve completed those simple steps, you’ll get alerted when new pages that use your keywords appear in the search engines.

You can also use this tool to monitor your backlinks as they appear in Google. Just enter this search term into the alerts field:

link:www.yourdomain.com/filename.html

Replace the above URL with your actual link, of course.

How can I track the number of backlinks I have?

There are a variety of tools available to you, such as using the Yahoo! Site Explorer, Google Webmaster tools (check the links report) and SEO Quake.

Using these tools is preferable to searching directly in Google. That’s because searching manually generally yields only a sample of the sites that are linking to your site.

Ultimately they're all wrong! Don't obsess about tracking these things; just focus on building more.

Keyword Research

What makes a good keyword?

A good keyword is one that your target market is searching for regularly. An even better keyword is one that's not only searched for regularly but also has very little competition in the search engines. That means you have a good chance of ranking well for that keyword.

How many people are searching for my keyword?

You’ll need to use a keyword tool to find out the answer. Example tools include the Google keyword tool, WordTracker.com, MarketSamurai.com and any number of other similar tools.

What is the "true" competition for a keyword?

Forget all that rubbish you see in just about everyone’s WSO “proof” about how they outranked a bajillion other sites for some phrase or other.


The only listings that matter are on page 1 so the only people you are competing with are on page 1. I would much rather compete with a billion PR0 unrelated sites than 10 PR9s that have been around over a decade and you should too!

Find out the page rank for the top ten listed pages and find the number of backlinks they have. That’s your competition.

What are long tail keywords?

Highly niche searches. For example, “dog training” is a short tail keyword, while “how to train a deaf dog” is a long tail keyword.

Long tail keywords tend to have fewer people searching for them than short tail words. On the other hand, they also tend to have less competition in the search engines, so it can be easier for you to get top rankings for these words.

Official Stuff

What is the official Google/Yahoo/Bing policy on SEO?

The search engines encourage you to design your site and organize your content in a search engine friendly way. This includes proper use of meta tags, creating information-rich sites, including words on your site that your users are searching for, using site maps and more.

However, they all strongly discourage any attempts to manipulate your search engine rankings, such as keyword stuffing, link spamming, cloaking and similar practices.

Why doesn't Google tell me how many links I have?Google only shows a sample of backlinks, because generally it’s only webmasters who are seeking this information. As such, webmasters who know ALL of their competitor’s backlinks can just go and get links from the same sources (which may be viewed as manipulating the rankings). By only showing a sample, Google helps reduce this practice somewhat.

They also make some claim about the amount of resources required to list all this information which I guess would be true if they didn’t have to have it stored for a million other reasons. Bottom line, they don’t want you to have it, get over it.

Who is Matt Cutts?

Matt Cutts is a Google employee specializing in SEO issues, and thus he’s seen as the authority on all things Google. He frequently talks about SEO practices, Google’s policies, link strategies and other Google issues. You can find his personal blog here.

He’s an incredibly talented and influential individual but never forget that he has Google’s best interest at heart. Not everything he says can be taken as Gospel.

Google webmaster tools

Google offers webmasters a variety of free tools that allow you to do things like: submit your site map, get more info about how often the Google bot is crawling your site, get a report of any errors the bot found, see the internal and external links pointing to your site, determine how your URL is displayed in the SERPs, etc.

You can access the full set of Webmaster Tools here.


Automation, Outsourcing and 3rd Party Stuff

Can anyone guarantee a 1st place ranking?

No. Because the search engines can and do change their algorithms, and because a third-party site may drop or change your links, no one can guarantee a first place ranking for a keyword.

However, SEO experts can create high rankings – even first place – for certain keywords. They just can’t guarantee those placements, as the algorithms and third-party links are not under their control.

What is a backlink packet?

Instead of searching for high-PR, .edu, .gov and authority sites to place your backlinks, you can save time by purchasing a “packet” that lists these types of sites for you. These packets typically include “do follow”:

• Blogs where you can make comments.
• Forums where you can set up profiles.
• Directories where you can post content and backlinks.

…and similar sites.

The bonus of these packets is that they save you time since you don’t have to seek out these sites yourself. The downside is that sometimes the webmasters change their policies once they get an onslaught of these links. For example, the owner of a high-PR blog may change to “no follow” links or disallow comments altogether.

I bought a packet of "high pr links" but all my links are PR0, what happened?

Usually this is because the main page of the website – such as the main page of the forum – has a high PR. However, the actual place where you put your link – such as your profile page – is PR0 because you basically just created the page when you created your profile.

What automation tools are there?

There are a variety of tools you can use to help automate the SEO process, including:

• Tools to structure your content in a search engine friendly way. (Hint: Content management systems and blogs like WordPress do this naturally, but you can also use SEO plugins to make your blog content even more search-engine friendly.)

• Keyword tools.
• Tools to automatically submit or update content, such as tools that submit to directories or tools that automatically update your social media content (such as ping.fm).
• Tools that automate social bookmarking.
• Tools that help automate tasks like building link wheels.
• Tools to create content, such as article spinners, scrapers and autoblog tools.
• Pinging tools (like pingomatic.com or pingler.com).


• Tools that automate link-building, such as blog and guest book commenting tools.

What SEO service should I use?

This question is far too contentious for a forum FAQ like this so I’m not going to name specific services. Instead here’s some general advice on selecting SEO services.

Don't fall for hype about "ranking for the most competitive terms in the SEO industry". SEO companies that do this are pouring their resources into a highly competitive game because of the PR boost it's worth. Ultimately that cost has to go somewhere. Instead, find SEO firms that focus on customer testimonials showing good results.

Don’t get involved in “my links are better than your links” battles. Nothing annoys me more than seeing arguments about how so-and-so’s link packet is more effective than such-and-such’s. Just focus on building a large variety of links and you’ll be fine.

What does an SEO host give me that a regular one doesn't?

Multiple C-class IP addresses. So even if you host multiple websites with one host, you get different addresses. And that means you can build a mininet more easily without being detected.

Glossary

SEO: Refers to search engine optimization, which is the practice of using on-page and off-page factors to improve your search engine rankings.

SERP: Refers to search engine results page. When a user enters a query into a search engine, the results are called the SERPs. You can also refer to specific pages of the SERPs.

For example: “My website is on Page 3 of the SERPs for my keyword – how do I get it to Page 1 of the SERPs?”

Spider/Crawler: This is a robot (bot) or computer program that search engines and directories send out to find and index pages across the web. It can only find pages that are linked to other pages. You can check your traffic logs to discover when and how often various spiders / bots / crawlers visit your site.

Backlink: This is an incoming link pointing to your web page. Search engines prefer to see one-way backlinks (rather than reciprocal links) coming from high-quality, relevant websites.

Anchor Text: When you create a link, the anchor text is the clickable part of that link. For SEO purposes, you should use your chosen keywords as your anchor text.
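For instance, a keyword-rich anchor text link (the URL and keyword below are placeholders) would look like this:

<a href="http://www.example.com/">blue widgets</a>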

LSI: Refers to latent semantic indexing, which is a way to analyze content to see if all content on a page is related. For example, if there is a page about swimming safety, you’d expect to see words like “water” and “swim.” Both of those words are different than the word “swimming,” but an LSI analysis will realize the content is related.

Google uses LSI to help it return the best results to searchers. As such, be sure to use related words on your own pages. For example, if your page is about cats, then use the words: cat, cats, kitten, kitty and feline.

What is RSS Submission?


RSS (Really Simple Syndication) is an XML-based format for sharing and distributing Web content, such as news headlines or blog articles. Using an RSS reader, you can view data feeds from various news sources, including headlines, summaries and links to full stories.
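As an illustration, a minimal RSS 2.0 feed (the site name, URLs and dates below are placeholders) looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/</link>
    <description>Latest articles from Example Blog</description>
    <item>
      <title>First article</title>
      <link>http://www.example.com/first-article</link>
      <description>A short summary of the article.</description>
      <pubDate>Mon, 06 Jan 2014 09:00:00 +0000</pubDate>
    </item>
  </channel>
</rss>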

What are the benefits to having an RSS feed?

RSS is beneficial to both publishers and website visitors. To keep things simple I have listed just a few of the benefits for both publishers and website visitors.

RSS benefits for publishers:

1. Reaching new audiences through syndication.
2. Improved search engine optimization.
3. Easier and less expensive vehicle for communication than email.
4. Additional way to communicate with customers or potential customers.

RSS benefits for website visitors:

1. Website visitors do not have to release personal information in order to subscribe to an RSS feed.
2. 100% opt-in: users control the content they wish to receive.
3. Faster method for scanning content (saves time).

What is Sitemap.xml?

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.

Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.

A site map (or sitemap) is a list of pages of a web site accessible to crawlers or users. It can be either a document in any form used as a planning tool for web design, or a web page that lists the pages on a web site, typically organized in hierarchical fashion. This helps visitors and search engine bots find pages on the site.

Benefits of XML sitemaps to search-optimize Flash sites

Sitemaps are a useful tool for making sites built in Flash and other non-HTML technologies searchable. Because such a site's navigation is built with Flash (Adobe), the initial homepage would probably still be found by an automated search program (a bot), but the subsequent pages are unlikely to be found without an XML sitemap. A sample sitemap is shown below.

Sample XML Sitemap

The following example shows a Sitemap that contains just one URL and uses all of the optional tags (lastmod, changefreq and priority).
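A minimal sketch of such a Sitemap, using a placeholder URL and date, might look like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>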


Note: A Sitemap index file can only specify Sitemaps that are found on the same site as the Sitemap index file. For example, http://www.yoursite.com/sitemap_index.xml can include Sitemaps on http://www.yoursite.com but not on http://www.example.com or http://yourhost.yoursite.com. As with Sitemaps, your Sitemap index file must be UTF-8 encoded.

Sample XML Sitemap Index

The following example shows a Sitemap index that lists two Sitemaps:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml.gz</loc>
    <lastmod>2004-10-01T18:23:17+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml.gz</loc>
    <lastmod>2005-01-01</lastmod>
  </sitemap>
</sitemapindex>

Note: Sitemap URLs, like all values in your XML files, must be entity escaped.
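For example, a URL containing an ampersand (the URL itself is a placeholder) would be written with the & character escaped as &amp;:

<loc>http://www.example.com/page?item=1&amp;sort=asc</loc>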

The damping factor

The d in the calculation is a damping factor. It is included because there is a chance that a visitor does not leave a page via one of its links. Google's standard assumption is roughly a 15% chance that a visitor will not follow a link, which leaves a damping factor of 0.85. In the example above, page A is therefore given a PageRank of 0.15 + 0.85 × (5/4) = 1.2125.
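For reference, the figures above appear to follow the standard published PageRank formula, where T_1 … T_n are the pages linking to A and C(T) is the number of outbound links on T:

PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)

With d = 0.85 and the incoming links contributing a total of 5/4, this gives PR(A) = 0.15 + 0.85 × 1.25 = 1.2125, the value quoted above.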

Google Penalty Recovery Strategy

Recovering from a Google penalty normally involves fixing the cause of the problem and then waiting for Google to remove any over-optimisation penalties or SERPS filters. Full recovery of Google rankings may take around 2-3 months after all website problems are corrected, although we have seen penalty recovery in a matter of weeks following full and thorough resolution of the Google Webmaster Guidelines infringements.

The Google algorithm can automatically remove penalties if the affected website is still Google indexed. To check whether a particular website is still Google indexed, refer to our Google indexing page. If your website has been Google de-indexed and lost Page Rank, then you will need to make a Google re-inclusion request. Where the reason for the penalty is clear, it helps to provide details of any changes you've made to correct violations of the Google Webmaster Guidelines.

Google Penalty Checklist

If your website has suffered a Google penalty, some free SEO advice to help identify the cause and solve the problem is provided below. Once you have identified the cause of the problem, we suggest watching the Google reconsideration tips video to help prepare a successful reconsideration request to Google.

Guerrilla Marketing (Viral Marketing)

Viral marketing and viral advertising are buzzwords referring to marketing techniques that use pre-existing social networks to produce increases in brand awareness or to achieve other marketing objectives (such as product sales) through self-replicating viral processes, analogous to the spread of viruses or computer viruses. It can be word-of-mouth delivered or enhanced by the network effects of the Internet. Viral promotions may take the form of video clips, interactive Flash games, advergames, ebooks, brandable software, images, or even text messages.

Email Marketing

E-mail marketing is a form of direct marketing which uses electronic mail as a means of communicating commercial or fund-raising messages to an audience. In its broadest sense, every e-mail sent to a potential or current customer could be considered e-mail marketing. However, the term is usually used to refer to:

* sending e-mails with the purpose of enhancing the relationship of a merchant with its current or previous customers, to encourage customer loyalty and repeat business,

* sending e-mails with the purpose of acquiring new customers or convincing current customers to purchase something immediately,

* adding advertisements to e-mails sent by other companies to their customers, and

* sending e-mails over the Internet, as e-mail did and does exist outside the Internet (e.g., network e-mail and FIDO).

How often is PageRank updated?

One of the most frequently asked questions is still: why does my site have no PageRank? The answer is quite simple: it does, but the PageRank shown in, for example, the Google Toolbar is not updated in real time. Calculating the correct PageRank for every page is a continuous process, but the visible PageRank is only refreshed once per period (usually about 3 months). In addition, Google needs a minimum period to calculate the score of a page, since all links have to be analyzed. As a rule, it can take up to 4½ months before you see a PageRank appear on a page.

More links don't guarantee a higher PageRank

This is also a frequently asked question; fortunately the answer is very simple. The easiest way is to think of the web on a small scale. Imagine the Internet consists of only 10 pages that link to each other. In the very first PageRank calculation they all have a PageRank of 0: no page has any PageRank yet, so no page can pass any on. A starting value is needed before the calculation can get anywhere.

Google itself says that the sum of all PageRank will be one. So in the example above, before any links are counted, each of the 10 pages is given a PageRank of 1/10; this is the initial value. From there the calculation can begin to work out the actual PageRank.

Why does my PageRank go down?

As I already said, the PageRank we see (a number between 0 and 10) is not the actual PageRank. Since "the sum of all PageRank will be one", the actual PageRank is a number between 0 and 1. Somewhere in the equation the score is divided by the number of web pages known to Google.

That division by the number of web pages is the reason PageRank can go down while your own number of links stays the same or even increases. Your PageRank can only rise if the number of links to your website grows faster than the number of pages known to Google.

No outgoing links to keep my PageRank?

Unfortunately, it is not that easy. A page with no outgoing links is seen as a dead end on the web. A visitor's only choice is to find a new web page some other way. In this case it is not possible to calculate where a visitor goes next, and Google treats the page as if it linked to every other website. In practice this means that a page without outgoing links will actually lose value (and therefore score less).

(1) What is a meta tag?

A meta tag is a special HTML tag that provides information about a web page. Meta tags provide information such as:

(a) who created the page, (b) what the page is about, and (c) which keywords represent the page's content.
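As an illustrative sketch (all values here are placeholders), such tags sit inside the <head> section of a page:

<head>
  <title>Blue Widgets - Example Store</title>
  <meta name="description" content="Hand-made blue widgets with free shipping.">
  <meta name="keywords" content="blue widgets, buy widgets, widget store">
  <meta name="author" content="John Smith">
</head>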

(2) What is the title tag?

The <title> tag defines the title of the document.

The title element is required in all HTML/XHTML documents.

The title element:

* defines a title in the browser toolbar
* displays a title for the page in search-engine results

*** The <title> tag does not support any event attributes.
*** The title tag is the main text that describes an online document.

**** <title>keywords of document</title>

*** By using the pipe sign you are able to divide a sentence visually.
** It should be 70 to 80 characters.
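For example, a title using the pipe sign to separate phrases (the keywords and brand below are placeholders) might look like this:

<title>Blue Widgets | Buy Hand-Made Widgets Online | Example Store</title>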

(3) What are keywords?

Answer: A keyword is a word used by a search engine in its search for relevant web pages.

(4) Types of meta tags and their descriptions?

a) <meta name="description" content="text">

*** How do you explain to your visitors in only 2 or 3 phrases what they can expect on your website? You use the meta description tag.

**** All the major search engines, like Yahoo, Google, Fast and Altavista, will use your description tag.

** It is inserted in the head section of the document.

*** Use no more than 200 characters.

(b) <meta name="keywords" content="text">

***** If a search engine spider finds the same words on your website and in your meta tags, these words will be ranked higher in the search index. Don't add too many words; most search engines will only index the first 20 words. Make sure that you put the 10 most important keywords first.

** You may use up to 250 characters in the keywords section.

** Try to give each page of your website a relevant title, relevant description and relevant keywords that correspond with the text on your website. A visitor will then find the exact information within your website, on the correct page.

** A well-formed description tag will be shown directly by e.g. Yahoo, Bing and Google. That's why the description tag is also named the Google meta description or snippet.

(5) <meta name="robots" content="index, follow">

* The spider will now index your whole website.
* The spider will not only index the first web page of your website but also all your other web pages.

You are also allowed to type it like this:

<meta name="robots" content="INDEX, FOLLOW">
<META NAME="robots" CONTENT="INDEX, FOLLOW">
<META NAME="robots" CONTENT="index, follow">
<META NAME="robots" CONTENT="all">

(6) If you don't want the search engine spider to crawl through your whole website, you use the following meta tag:

<meta name="robots" content="index, nofollow">

* The spider will now only look at this page and stop there.

(7) <meta name="robots" content="noindex, follow">

* The spider will not index this page but will crawl through the rest of the pages on your website.

(8) <meta name="robots" content="noindex, nofollow">

* The spider will neither index this page nor follow any of the links on it.

(9) What is the meaning of the meta tag "REVISIT-AFTER"?

By using this so-called REVISIT-AFTER meta tag you can tell the spider to come back to your website and index it again. This meta tag is used by several North American search engines.

<meta name="revisit-after" content="period">

Example of how to use the meta tag revisit-after: add one of the following meta tags to the HTML source of your page.

<meta name="revisit-after" content="7 days">
<meta name="revisit-after" content="1 month">

(10) What is the meaning of the meta tag "ABSTRACT"?

The abstract meta tag can be used to indicate in just one very short sentence what the web page is about. So every web page of your website gets its own abstract tag.

This meta tag appears to be equal to the description meta tag, but they are different. The description meta tag is used by many search engines as the small text under the clickable title. The abstract tag is hardly used by anybody.

<meta name="abstract" content="a very short description of your website">

You may add the meta tags to all of your web pages, not only the first index page. Make sure that relevant meta tags are added to every page. Add keywords and phrases that are relevant and correspond to the text on that specific page.

(11) The meaning of the meta tag name="author"

How do you give credit to the person or company that made your website? You use the so-called AUTHOR meta tag. There are also Content Management Systems (CMS) that will put in the name of the actual person editing the page. Used like this, it is easy to find the person responsible for a web page.

Sometimes a CMS uses the author tag for the writer of the page and the web-author tag for the person who maintains the site. Therefore you will also find the meta tag web-author.

This meta tag has no influence on your search engine ranking. Meta tags that do have a lot of influence are the title of your page, the meta keywords tag and the meta description tag.

<meta name="author" content="John Smith">

(12) What is the meaning of the meta tag "CONTACT"?

The contact meta tag can be used to provide contact details (usually an e-mail address) for the owner or webmaster of the website. Like the author tag, it has no influence on your search engine ranking.

Difference Between NoFollow and DoFollow Links

(1) DoFollow links are counted as backlinks by search engines.
(2) DoFollow links help increase both your traffic and backlinks, which is most beneficial to you.
(3) NoFollow links do not give you any backlinks or search engine credit.
(4) NoFollow links are only beneficial for increasing traffic to your website.

NoFollow link:

<a href="http://www.link.com" rel="nofollow">Link</a>


It is also possible to make all links on a page nofollow by using the meta robots tag in the head-section of the HTML page:

<meta name="robots" content="nofollow" />

DoFollow link:

<a href="http://www.link.com">Link</a>

How Search Engines Use NoFollow

Ask:

Never adhered to NoFollow. It’s not a search engine anymore.

Google:

Google sees NoFollow links but simply does not pass PageRank to an outbound link that is tagged with the value.

Bing:

Bing may not follow a NoFollow link, and it does exclude it from its ranking calculations.

Yahoo:

Yahoo follows NoFollow links & excludes the link from all ranking calculations

Robots.txt :

The simplest robots.txt file uses two rules:

* User-agent: the robot the following rule applies to
* Disallow: the URL you want to block

***** A user-agent is a specific search engine robot.

*** The Disallow line lists the pages you want to block. You can list a specific URL or a pattern. The entry should begin with a forward slash (/).

**** To block the entire site, use a forward slash:
Disallow: /

To block a directory and everything in it, follow the directory name with a forward slash:
Disallow: /junk-directory/

To block a page, list the page:
Disallow: /private_file.html

To remove a specific image from Google Images, add the following:
User-agent: Googlebot-Image
Disallow: /images/dogs.jpg

To remove all images on your site from Google Images:
User-agent: Googlebot-Image
Disallow: /

To block files of a specific file type (for example, .gif), use the following:
User-agent: Googlebot
Disallow: /*.gif$
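Putting several of these rules together, a complete robots.txt file (the directory and file names here are placeholders) might look like this:

User-agent: *
Disallow: /junk-directory/
Disallow: /private_file.html

User-agent: Googlebot-Image
Disallow: /images/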

Important server errors:

(1) 509 - Bandwidth Limit Exceeded. This error occurs when the site has used up its allotted bandwidth; it is a temporary error, i.e. the service is temporarily unavailable.

What's the sandbox?

The "sandbox" refers to an alleged Google filter that keeps brand-new websites from ranking well for competitive keywords during their first few months, even when they are otherwise well optimized.

Ques: What's the difference between paid and organic search listings? Or: what is the difference between inorganic and organic SEO?

Ans:

Organic SEO

**** is a natural, free-of-cost way to bring traffic to your site.

**** It has a long-term impact.

**** It is time consuming.

**** It is suited to all types of business.

*** But we do not get immediate results.

Inorganic SEO, on the other hand:

**** provides results in a very short time.

*** Through inorganic SEO we get immediate results because it is a paid service.

*** It is very easy to understand and gets targeted traffic.

*** But it is not applicable to all kinds of business.
