Thursday, May 30, 2013

Necessary things to keep in mind to get your website a good rank

When it comes to link building, Google's preferences have changed from what they used to be. The important things to keep in mind are:

Total Number: Google keeps track of the total number of links a site has, but the raw count alone isn't an effective ranking factor: links from established, trustworthy sites will always be far more valuable than spammy, low-quality ones.

Domains: If a site receives many links from the same domain, Google is going to discount the influence of most of those links, so link diversity is very important.

Google can detect related domains through their IP addresses, so it's advisable that links come from IP addresses spread around the globe, suggesting connections with many different people rather than one network.
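As a rough sketch of what "IP diversity" means (the function and the /24 grouping are my own illustration, not anything Google publishes), you can group the IP addresses of your linking sites by subnet and see how many distinct networks they come from:

```python
from collections import Counter

def ip_diversity(link_ips):
    """Ratio of unique /24 subnets to total linking IPs.

    Links that all share one subnet look like a single network;
    a ratio near 1.0 suggests links from unrelated hosts.
    """
    subnets = Counter(ip.rsplit(".", 1)[0] for ip in link_ips)
    return len(subnets) / len(link_ips)

# Three links from the same /24 network vs. three from different ones.
low = ip_diversity(["93.184.216.1", "93.184.216.2", "93.184.216.3"])
high = ip_diversity(["93.184.216.1", "203.0.113.9", "198.51.100.4"])
print(low, high)
```

A low ratio doesn't prove anything by itself, but it is the kind of pattern the paragraph above is warning about.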

Anchor Text: Anchor text should be varied while creating links, which keeps your link profile diverse and natural looking. Naturally earned links rarely all use the exact keyword the page is targeting, so a profile dominated by one keyword phrase stands out.
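One simple way to audit this (a sketch; the helper name and the example anchors are mine) is to compute each anchor text's share of the total profile and look for a single phrase dominating it:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total link profile."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = len(anchors)
    return {text: n / total for text, n in counts.items()}

# A profile where one exact-match keyword makes up 60% of all anchors.
profile = ["cheap shoes", "cheap shoes", "cheap shoes", "Acme Shoes", "click here"]
dist = anchor_distribution(profile)
print(dist)
```

Branded anchors ("Acme Shoes") and generic ones ("click here") mixed in with keyword anchors are what a natural profile tends to look like.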

Age: It's a popular belief that older links that have been in place for years are more important and powerful than new links. Google is thought to trust aged links more because the pages hosting them have had time to prove their authority.

Variation: Diversity is very important in terms of anchor text and the source of the links. Variation also matters with respect to image versus text links, the placement of links on different sites, and 'dofollow' versus 'nofollow' links.

Some years ago, only 'dofollow' links were thought to pass value in Google's eyes. As a result, many site owners focused entirely on 'dofollow' links and ignored the rest. Many SEOs now take the contrary view: a profile of nothing but dofollow links looks unnatural, and Google wants to curb the success of sites that don't play by its instructions, blocking their way to the top.
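The dofollow/nofollow distinction is just an HTML `rel` attribute on the link tag. As a minimal sketch (the class name is mine), here is how you could count both kinds in a snippet of markup using only the standard library:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Count dofollow vs. nofollow links in an HTML snippet."""

    def __init__(self):
        super().__init__()
        self.dofollow = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.split():
            self.nofollow += 1
        else:
            self.dofollow += 1

html = ('<a href="https://example.com">followed</a> '
        '<a href="https://example.com" rel="nofollow">not followed</a>')
audit = LinkAudit()
audit.feed(html)
print(audit.dofollow, audit.nofollow)
```

A link carries `rel="nofollow"` when the publisher does not want to vouch for it; everything else is treated as a normal, followed link.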

Quality: Quality is the most important factor to keep in mind. Receiving links from valued sites is better than receiving thousands from spammy blogs. The focus should be on getting links from well-known sites, which is difficult because they're very conservative about linking out. Creating and sharing link-worthy content should be a top priority to increase the quality of the site's link profile.

Relevance of the links is another sign of quality. It simply makes sense that Google will discount links from sites whose subject matter is unrelated. A typical webmaster should limit links from irrelevant topics and stay within their niche.

Bad Links: Bad links have a detrimental impact on a page's ranking, so equal importance should be given to limiting the number of bad links.

Velocity: Link velocity refers to the rate and schedule at which links are built. The key is to make link acquisition look natural and consistent: the velocity curve shouldn't show sudden spikes, and the number of new links each month should grow steadily relative to the previous month.
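The "no sudden spikes" idea above can be sketched as a simple check over a month-by-month series of new-link counts (the 3x threshold is an assumed, illustrative parameter, not a known Google value):

```python
def velocity_spikes(monthly_new_links, max_ratio=3.0):
    """Flag month-over-month jumps larger than max_ratio.

    Returns a list of (previous_month, spike_month) link counts.
    """
    spikes = []
    for prev, cur in zip(monthly_new_links, monthly_new_links[1:]):
        if prev and cur / prev > max_ratio:
            spikes.append((prev, cur))
    return spikes

# Steady growth vs. a sudden burst that could look unnatural.
steady = velocity_spikes([10, 12, 15, 18])
bursty = velocity_spikes([10, 12, 90, 95])
print(steady, bursty)
```

A jump from 12 to 90 new links in one month is the kind of curve the paragraph above warns against.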

Wednesday, May 22, 2013

Penguin 2.0 rolled out today | Penguin 2.0 update | Penguin update | Google Update

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we're not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type. “White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site. 

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings. The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. 

And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.” In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can't divulge specific signals because we don't want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Here's an example of a webspam tactic like keyword stuffing, taken from a site that will be affected by this change:

[screenshot of a keyword-stuffed page not preserved]
We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

Tuesday, May 14, 2013

Black Hat & Link Spammers Less Likely To Show Up In Search Results After Summer


Matt Cutts, Google's head of search spam, today answered some questions about what webmasters and SEOs should expect in the near future with regard to SEO.

The primary question Matt asked and answered was, “What should we expect in the next few months in terms of SEO for Google?”

Matt addressed 10 points, all summarized at the end as helping improve the search results by rewarding good sites and hurting spammers and black hats in the search results. Here are the 10 points Matt addressed in his video:

1.  Penguin Updates

The next generation Penguin update, Penguin 4 (AKA Penguin 2.0), which is expected to launch in the next few weeks, will go deeper and have more of an impact than the first version of that Penguin update. So expect that we will hear more of an outcry from the SEO community when this does launch.

2.  Advertorials

Earlier this year, Google went after some websites for using advertorials as a means to artificially inflate their link profile. Matt Cutts said Google will soon take a stronger stance against those using advertorials in a means that violates their webmaster guidelines.

3.  Spammy Queries

While queries that tend to be spammy in nature, such as [pay day loans] or some pornographic-related queries, were somewhat less likely to be a target for Google's search spam team, Matt Cutts said Google is more likely to look at this area in the near future. He made it sound like these requests are coming from outside of Google, and thus Google wants to address those concerns with these types of queries.

4.  Going Upstream

Matt Cutts said they want to go more “upstream” to deter link spammers and the value of the links they are acquiring from the sources. This seems to imply to me that Google will go after more link networks, like they’ve done in the past.

5.  Sophisticated Link Analysis

Matt promises that Google is going to get even better at their link analysis. Google’s head of search spam explained that Google is in the early stages of this much more “sophisticated” link analysis software but when it is released, they will be much better at understanding links.

6.  Improvements On Hacked Sites

Google has done a lot of work with hacked sites and its index, specifically labeling the search results of potentially hacked sites, removing those sites, and warning webmasters about the hack. Matt said Google is working on rolling out a new feature to better detect hacked sites in the upcoming months. Cutts also added they plan on improving webmaster communication in regard to hacked sites.

7.  Authority Boost

Google hopes to give sites that are an authority in a specific industry, community or space a ranking boost. So if you are an authority in the medical or travel spaces, Google hopes that related queries will return your site above less authoritative web sites.

8.  Panda Sympathy

While many sites have been impacted by the Google Panda update, Matt Cutts said that many of those impacted are borderline cases. Google is looking for ways to “soften” that impact by looking at other quality metrics to move those on the line to not be impacted by the Panda algorithm.

9.  Domain Clusters In SERPs

The number of clusters of the same domain name showing on the first page of Google’s search results should lessen this year. Google’s Matt Cutts said they want to make the search results on the first page even more diverse, but when you click to the second results page, you may be more likely to see clustered results from the same domain name. Google is constantly tweaking how many search results from the same domain name show up on a single page of search results.
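The clustering behavior described above amounts to capping how many results a single host can occupy on page one and pushing the rest to later pages. As a rough illustration (the cap of two per domain is an assumed parameter, not Google's actual rule):

```python
from collections import defaultdict
from urllib.parse import urlparse

def cluster_serp(results, per_domain=2):
    """Keep at most `per_domain` results per host, preserving order.

    Returns (page_one, demoted); demoted results would surface
    on later result pages instead of the first one.
    """
    seen = defaultdict(int)
    page_one, demoted = [], []
    for url in results:
        host = urlparse(url).netloc
        if seen[host] < per_domain:
            seen[host] += 1
            page_one.append(url)
        else:
            demoted.append(url)
    return page_one, demoted

results = ["https://a.com/1", "https://a.com/2", "https://a.com/3", "https://b.com/1"]
first, later = cluster_serp(results)
print(first, later)
```

Lowering `per_domain` makes the first page more diverse, exactly the trade-off Cutts describes: fewer repeats up front, more clustered repeats on page two.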

10.  Improved Webmaster Communication

As always, Google says it wants to improve its communication with webmasters. Matt Cutts said to expect even more detailed examples within the webmaster notifications received through Google Webmaster Tools.

Toward the end of the video, Matt Cutts explains that the purpose of all these changes is to reduce the visibility of webmasters using black hat spam tactics, while giving smaller businesses that are more white hat a chance to rank better.


More Info :- http://searchengineland.com/googles-matt-cutts-black-hat-link-spammers-less-likely-to-show-up-in-search-results-after-summer-159185

Wednesday, May 1, 2013

4.9-magnitude earthquake rocks Delhi-NCR

New Delhi: Tremors were felt on Tuesday across the NCR region and adjacent parts of the country.

Tremors were also reported from towns in neighboring Pakistan. Within India, the shaking was felt in Ahmedabad, the Saurashtra region and Bhuj in Gujarat, as well as in large parts of Rajasthan.

In Punjab, tremors were felt in Ludhiana and Chandigarh for 10-12 seconds.

The epicenter is said to be on the Pakistan-Iran border, where the quake reportedly measured 8; the epicenter was 135 km below ground.

The earthquake's shocks were also felt in Iran, and in Pakistan they were felt by people in Peshawar.

 
Designed by jai Rai