The web is growing by leaps and bounds, and businesses are now ready to spend a large part of their marketing budgets on online marketing. Becoming visible to the people who matter is thus on the priority list of all brands. And when it comes to online visibility, Search Engine Optimization is the key. So what does 2012 have in store for Search Engine Optimization?
- Social Media: In 2012, social media will gain more prominence in online search. More and more social media mentions will find their way into search result pages. With the likes of Facebook, Twitter and Google+ gradually capturing our mind space, it is only a matter of time before they establish a full-fledged presence on our search result pages too.
- Personalized search: In tune with Web 3.0, the internet will become more and more personalized. Your search results will increasingly feature pages that your online network has recommended, and logged-in users will enjoy the benefits of a more fine-tuned search. At the same time, privacy will become all the more important: how much to reveal and how much to hide will be a tightrope walk in 2012.
- Mobile search: With more and more iPhones, Androids and the like entering the market, mobile will be the most common platform for communicating with customers on the go. Optimizing sites for mobile viewing will therefore be crucial in 2012. In addition, new mobile interactions like speaking, tapping and shaking will further expand the mobile SEO market.
- Quality content: Content will still be king in 2012. As recent updates to the Google algorithm have shown, quality content will be rewarded and low-quality content will suffer. So in 2012, before you even think of making an impact online, make sure your content is in place.
- Local Search: With new-age technologies like Near Field Communication, push mail, etc., local search will acquire a completely new meaning in 2012. And with more and more directory-based portals like Zomato, Just Dial, Sulekha, Buzzintown, etc. entering the market, local will become the new global.
So what will be your SEO strategy for 2012?
Source: SEOmoz
Content and navigation are two key pillars supporting any Search Engine Optimization campaign. Yet at times, good content and well-thought-out navigation fail to work the SEO wonders they were supposed to.
Theme Bleeding might be the reason!
By definition, Theme Bleeding is a link leading from a page on a specific topic or theme to another theme (or topic) that is not directly related.
For example: you have a webpage about Search Engine Optimization (SEO), and some of its outgoing links point to a page about Social Media Marketing. You may offer both services to your clients, and it might make business sense to cross-sell, but Google and other search engines might consider this a clear case of Theme Bleeding: your 'theme' of Search Engine Optimization (SEO) is bleeding into the unrelated theme of Social Media Marketing.
Google's algorithm ranks pages on the basis of their relevance to the content/keyword. Theme Bleeding leads to Google according your content a low relevance score.
How to avoid Theme Bleeding?
- Plan your navigation properly: Most webmasters suggest using a silo structure for your site, wherein related content is grouped together.
- Using the NOFOLLOW attribute: Add a nofollow attribute to links pointing to pages you do not want the search spiders to follow.
E.g.: If you do not want the spiders to follow the link from your Search Engine Optimization page to your Social Media Marketing page, you can add the nofollow attribute to that link.
Normal link:
<a href="http://www.myseopandit.com/socialmediamarketing.html">socialmediamarketing</a>
Link with nofollow attribute:
<a href="http://www.myseopandit.com/socialmediamarketing.html" rel="nofollow">socialmediamarketing</a>
- Using the Robots.txt file: This is the most effective method to prevent Theme Bleeding. Robots.txt is a simple text file that sits in the root directory of your server, alongside your home page. Listing the links/directories you do not want crawled in the robots.txt file ensures that well-behaved search spiders will not crawl them.
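As a minimal sketch (the directory name here is hypothetical, chosen only to match the SEO/Social Media Marketing example above), a robots.txt that keeps crawlers out of an unrelated section might look like:

```
# robots.txt - placed in the root directory of the site
# Hypothetical example: keep all crawlers out of the social media section
User-agent: *
Disallow: /socialmediamarketing/
```

Note that robots.txt is a voluntary standard: reputable search spiders honor it, but it is not an access control mechanism.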
URLs (Uniform Resource Locators) form an integral part of any Search Engine Optimization (SEO) campaign. A recent study of the top ten indexed pages for a particular keyword found that 'Google has 40-50% of those with the keyword either in the URL or the domain, Yahoo shows 60% and Bing has an astonishing 85%'. For Search Engine Optimization (SEO), this translates into a simple lesson: having keywords in the URL or domain name can make a lot of difference to a page's ranking.
There are primarily two types of URL – Static URL and Dynamic URL
Static URL: As the name suggests, static URLs do not change. They contain no variable strings or URL parameters. The page actually exists on the server and can be downloaded via FTP (File Transfer Protocol).
Eg: A URL of the type http://www.discountpandit.com/web/ is a static URL.
Dynamic URL: A dynamic URL is an address that points to a dynamically generated web page. The content of the page is stored in a database and is displayed only on demand. There is no actual page on the server; the page is only a template on which the results of the query are displayed.
Eg: A URL of the type http://www.discountpandit.com/web/thread.php?threadid=123&sort=date is a dynamic URL.
Static URL or Dynamic URL, which is better?
Though both have their advantages and disadvantages, Static URLs have a definite edge when it comes to SEO.
- Static URLs are indexed faster and ranked higher in SERPs.
- Static URLs are more stable, making them wiser to optimize for long-term SEO benefits.
- It is easier to insert relevant keywords in Static URLs rather than in Dynamic URLs.
- A static URL is easier for the end user to understand, so the chances of the link being clicked increase manifold.
- It is advisable to use static URLs for all branding activities, as they ensure easier retention and recall.
In spite of these disadvantages, dynamically generated URLs are used for the following reasons:
- Web Pages of a dynamic URL can be edited easily.
- It is a cakewalk to update the content of a dynamic page: updating it once in the database ensures the content is updated for all relevant links.
For affiliate sites, blogs, e-commerce sites, etc., which have huge amounts of content that need regular updating, dynamic URLs are generally preferred.
The key is to find the right balance for your website: between your dynamically generated pages, which allow for faster updating, and your static pages, which help in SEO.
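One common way to get the best of both is URL rewriting: the content stays dynamically generated, but visitors and search engines see a static-looking, keyword-friendly address. As a sketch, assuming an Apache server with mod_rewrite enabled (the paths and parameter names are hypothetical, modeled on the thread.php example above):

```
# .htaccess in the site's document root
# Serve the static-looking URL /web/thread/123 from the
# underlying dynamic page /web/thread.php?threadid=123
RewriteEngine On
RewriteRule ^web/thread/([0-9]+)/?$ /web/thread.php?threadid=$1 [L,QSA]
```

The database-driven page still handles every request; only the address the outside world sees changes.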
If you have further queries on the issue, post them as a comment below.
Press Releases are a tried and tested method of reaching out to the masses. However the entire purpose of writing a Press Release gets defeated if your target audience is unable to find it. Here are a few simple tips to keep in mind while writing your Press Release to ensure that it reaches your targeted audience:
- Keywords: Keywords play a pivotal role in the success or failure of your Press Release. Before sitting down to write, research the keywords that people are likely to type into the search box when looking for news like yours. Once this list is made, zero in on the few most important keywords and incorporate them into your body content. Ideally, your keywords should appear in your title as well.
- Highlight: Bold your critical words. These may be your keywords as well as other important terms. Bold text not only helps in SEO but also helps readers glance through the main points of the article.
- Anchor Texts: Link your Press Release to relevant pages on your website; these need not always be your home page. Linking your Press Release to relevant articles helps readers learn more about the topic and helps search spiders crawl more efficiently.
- First Impression: Optimize the first 250 words of your Press Release. This helps you rank for the relevant keywords, and it also decides whether your readers will read further or not.
- Write for Humans: All said and done, write for a human reader, not for a search bot. Keep your article free of technical jargon, do not stuff it with keywords, and write in simple, grammatically correct English.
This week's biggest technology news was ICANN's (Internet Corporation for Assigned Names and Numbers) expansion of generic TLDs (Top Level Domains). Traditionally there were 22 approved domain suffixes like .com, .net, etc. After the change, companies can choose to have any domain suffix, as long as they can justify the use of that suffix and shell out a cool $185,000 as an application fee. This fee is exclusive of the annual fee of $25,000 and any other additional fees that ICANN may introduce over time.
So what happens to the level playing field that the internet claimed to be? Will the internet be reduced to a place where only the big fish survive? The questions are many, and the answers are still vague and somewhere in the air.
But one thing is sure to happen once the proposed ICANN change is implemented: your Search Engine Optimization strategy will need a major revamp. URLs and TLDs (Top Level Domains) play a key role in search engine rankings, and with TLDs up for sale and on the "exclusive" route, the competition will only grow tougher. A hospital with a .hospitals domain may well get preference over hospitals with less suitable TLDs.
Experts believe that the competition will extend to owning branded domains too - e.g. Apple would prefer .apple in its URL over a more generic TLD.
Under such a scenario, local search, personalized search and concepts of geo-targeting will need a major rethink.
It might be slightly early to zero in on the pros and cons of the proposed change, but if the TLD change is actually implemented, expect a major overhaul of search algorithms and a relook at search optimization strategies across the board.