Mar 06, 2014
 

Most software companies regularly upgrade their products to newer versions that presumably provide more features and better functionality in addition to fixing bugs. Any user who buys the latest version of the software seemingly benefits – be it Microsoft Windows, Office, or any EDA or other tool.

On the surface this looks good, but newer versions of software, even when they are supposed to be backward compatible, create a series of issues for users. This can be catastrophic in some cases and an inconvenience in most others. Windows Vista and Windows 8 played havoc with the productivity of companies whose users were forced to buy Windows 8 even when they would have liked to stay with Windows 7. The new interface requires a learning curve, and companies lose several thousand dollars in training and lost time while getting used to the newer operating system.

The writer is a hardware design engineer and has faced a number of issues related to newer software. A lot of hardware simply will not work with Windows 8, even though it is supposed to be backward compatible. The writer has a USB-to-serial-port cable from USBGear that worked seamlessly; when the switch was made to the newer operating system, there was either no driver, or the one that was available just did not work.

In the EDA industry, for example, files from the newer Allegro 16.6 will just not open in an older version, say 16.3. Firm A develops a .brd file in the newer version while firm B has only the older 16.3 release. As a result of a collaboration, firm B receives the .brd files from firm A but cannot use them, because the older software will simply not read them. There is not a big difference between 16.6 and 16.3, and it should have been possible to save the 16.6 file in a 16.3-compatible format, but Cadence will not do it. This forces firm B to upgrade its software, costing it tens of thousands of dollars.

When Microsoft updated Office Excel, it created a new extension, .xlsx. In well over 99% of cases users do not need the extra features of the updated format, yet Excel still uses .xlsx as the default extension. The basic reason: it forces everyone else to upgrade their Office software, bringing Microsoft hundreds of millions in extra revenue.
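If you simply need to share a spreadsheet with someone stuck on an older Office version, you do not have to make them upgrade. Here is a minimal sketch of a converter, assuming the openpyxl and xlwt Python packages are installed and that only cell values (not formatting, formulas or charts) need to survive the trip; the file names are placeholders.

# xlsx_to_xls.py - re-save a modern .xlsx workbook as a legacy .xls file
# so collaborators on older Office versions can open it.
# Only cell values are copied, and each sheet must fit within the old
# format's 65,536-row / 256-column limits.
from openpyxl import load_workbook
import xlwt

def convert(src="report.xlsx", dst="report.xls"):
    wb_in = load_workbook(src, data_only=True)   # read computed values, not formulas
    wb_out = xlwt.Workbook()
    for ws in wb_in.worksheets:
        sheet = wb_out.add_sheet(ws.title[:31])  # .xls sheet names max 31 chars
        for r, row in enumerate(ws.iter_rows(values_only=True)):
            for c, value in enumerate(row):
                if value is not None:
                    sheet.write(r, c, value)
    wb_out.save(dst)

if __name__ == "__main__":
    convert()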

Firms creating software must remember that such practices can bring them extra revenue, but if upgrades regularly play havoc with the smooth functioning of their customers' businesses, those customers will eventually abandon them. The key is to balance the greed of the revenue stream with the needs of the customer.

Dec 23, 2012
 

Ever since Google Panda rolled out, spammers have been looking for new avenues of spam. The spam itself has not changed; only the methodology and modus operandi have. Panda has rooted out the careless spammers, and only the best ones are still very much in the race, changing tactics, surviving and staying profitable.

YouTube is one of the sites that gained in post-Panda SERP results. The only reason we can attribute this to is its lower bounce rate and the relatively longer time viewers spend on the site. A number of SEO experts have argued that YouTube's better-than-expected SERP performance has to do with Google owning YouTube. We differ on this. We do not think Google uses a different yardstick for the websites it owns than for the ones it does not. Had that been the case, Google could have demoted Apple's results, or potentially answers.yahoo.com's results (another potential competitor).

Back to the video SERP misuse. Let us take a look at the search term

2013 Toyota Corolla Vs honda civic

With 2013 kicking in, potential buyers could be searching for this term in relatively large numbers. Here is the SERP as captured from a US location.

[Screenshot: the SERP captured for this query]

The first video result is not from YouTube, and it does not provide any information about a comparison between the two cars; it is just a placeholder advertising a seller's website. Why Google gives it any weight is difficult to know. The next three are exactly the same video. While the videos themselves are good, clubbing them all together on the first page of the SERP makes for a frustrating user experience. The second page is even worse: the videos do not provide any useful information at all. This makes Google's claim of providing diversity in search results ring hollow.

YouTube – watch time as a ranking factor

YouTube recently announced the time spent watching a video as a ranking factor. This was a change to its internal ranking algorithm and has nothing to do with Google's external ranking factors. In theory, youtube.com and google.com are separate products, and we believe google.com acts as if it does not have the data on how long a video was watched. The ranking factor was introduced by YouTube to potentially derank videos that were gamed to rank higher based on other signals, such as the number of likes and views.

Dec 20, 2012
 

Yes, we did it again

We were the first to accurately predict that the 23rd Panda update would arrive on Dec 21st, 2012, and this has now been confirmed by Google. The 23rd Panda update was rolled out just before the Christmas holidays, on Dec 21st, 2012, and it affected 1.3% of search queries.

Here is what we published on Dec 20th, 2012:
——————————————————-

As related-contents reported earlier (and it was the first to report it, a story that websites like searchengineland later picked up), the 22nd Panda update happened on 21st November 2012. Since Panda works by gathering data, potentially using an aggregate of bounce rate, dwell time and click-through rate, a refresh cannot happen too early or too late. If it happens too early, Google does not have enough data; if it happens too late, spam sites benefit for too long in the interim. So far the Panda updates have been rolling out at roughly one-month intervals.

Around December 13, 2012, some webmasters claimed there was SERP volatility, a claim denied by Google. While there may have been an update, it looks like some other experiment or something unrelated to a Panda or Penguin update. The SERP position changes tracked by tools like MozCast, SERPmetrics and SERPS.com do show some kind of adjustment.

Google has started to make it a policy to keep mum about changes. It did not have any official word about the 22nd Panda update. It seems to be showing a middle finger to the so-called SEO journalists at Search Engine Land, who have been vocal in criticizing Google over every issue they can find.

Since we are close to a month from the previous update, we expect some kind of Panda update soon. It could come as early as 21st Dec 2012, just before the holiday season kicks in, or it may happen after the vacation. We will keep you posted as soon as we learn something.

Dec 05, 2012
 

What if your competitor builds a practically unlimited number of links to hurt your ranking? This is the question that came up for discussion in a Google webmaster forum post. The question was asked (potentially) by the owner of the website www.radonseal.com.

Question - Since April, a black-hat SEO competitor has added 30,000 inbound links to our website from several websites. Out of these, Google Webmaster Tool recognizes over 18,000 links including one site with 11,000 identical links. Several sites pump out WordPress pages each with a link to us. Lately, new links are coming from a variety of cheap link farms.

There is no other explanation for such a dramatic drop in our Google traffic. Google algorithm is smart but I figure it must be overwhelmed with all the bad links coming to our site. “Where is smoke …” Our traffic is still steadily dropping.

If radonseal.com's assertion is true, all it takes is for your competitor to use a software robot to churn out 100,000 or so links, and your website's ranking will drop. John Mu did reply to this post, and according to him the reason for the drop was not the links. He added:

Those links are really not negatively affecting your site. You’re welcome to submit them with the disavow links tool – for peace of mind – if you’re certain that they’re unnatural and that they can’t be removed manually, but in your specific case, that wouldn’t even be necessary. Instead of spending too much time on that, I’d really, really recommend working to make your website the absolute best of its kind. I realize that’s not as easy as submitting a bunch of links, but in my opinion, at the moment, it’s really one of the best things that you can do with your time when it comes to your website.

Anyone who thinks that large-scale link building hurts a page should look no further than Adobe's case. If you search for "click here", the Adobe Reader page – get.adobe.com/reader/ – lands in 2nd position. Adobe Reader ranked #1 for years for that term without having the phrase anywhere on its page. The only explanation that can be attributed to this phenomenon is the huge number of links on the Web pointing to the Adobe Reader page using the anchor text "click here". Being at position #2 right now is still pretty impressive by any standard. If sheer link volume demoted anything, Adobe Reader should have been the first to be outranked. That did not happen, even after Panda and Penguin.

That gives us two possibilities. Either links do not affect a page negatively at all, or, more probably, some relationship exists between the links and the quality and relevancy of the site they point to.

The first possibility – that links do not affect a webpage at all – does not explain the Penguin effect. Many websites, especially WordPress-related ones, reported drops in the SERPs mainly because they had identical anchors, likely built by robots. That brings us to the second possibility: there exists a relationship between the quality of a site and the number of links pointing to it, and Google has found a way to correlate the two. If the quality of the page is very good, then irrespective of the number of links pointing to it, its SERP position will not be pushed down; the links will not affect the page in any negative way. Google has ways to check the quality of a page using various signals, including, but not limited to, the time spent on the page and the bounce rate. On the other hand, if the quality of the page is poor and the number of links pointing to it is huge, the two facts do not align in the search algorithm's eyes, and that is exactly what leads to the SERP position being hurt. This is the most likely scenario when you yourself, or your competitor, build links to a page that is not good in quality.
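To make the second possibility concrete, here is a purely hypothetical toy model in Python – our own speculation for illustration, not Google's actual algorithm; the signal names and thresholds are invented – in which links only hurt when their volume is large while the page's engagement signals are weak.

# toy_link_quality.py - hypothetical illustration: link volume only hurts
# when it is out of proportion to the page's quality signals.
# Signal names and thresholds are invented for illustration.

def link_penalty(inbound_links: int, avg_time_on_page_s: float, bounce_rate: float) -> float:
    """Return a penalty in [0, 1]; 0 means the links do no harm."""
    # crude engagement score: long visits and a low bounce rate => high quality
    quality = min(avg_time_on_page_s / 120.0, 1.0) * (1.0 - bounce_rate)
    if inbound_links < 1_000 or quality > 0.5:
        return 0.0                       # few links, or a genuinely good page: no penalty
    # many links pointing at a weak page: penalty grows with the mismatch
    return min(1.0, (inbound_links / 10_000.0) * (0.5 - quality))

# a strong page shrugs off 30,000 spammy links...
print(link_penalty(30_000, avg_time_on_page_s=180, bounce_rate=0.35))  # -> 0.0
# ...while a thin page with the same links is penalized
print(link_penalty(30_000, avg_time_on_page_s=15, bounce_rate=0.9))    # -> 1.0

In this toy model a strong page survives the 30,000 spammy links while a thin page with the same links is penalized – which is exactly the pattern the Adobe example and the Penguin casualties suggest.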

So what to do?

The best thing to do is what John Mu suggests:

Instead of spending too much time on that, I’d really, really recommend working to make your website the absolute best of its kind.

In fact, if you spend time improving the quality of the page, your competitor's links, which could potentially be negative, can turn into a positive. How? Google may actually be led to believe that the number of links and the quality of the page are proportional.

By Vikas Shukla

Disclaimer: Vikas Shukla is not an SEO expert. The views are based upon his observations of websites, Google Analytics data and other material he has read on the Web.

Dec 01, 2012
 

Matt Cutts says that "bounce rate is not a factor in Panda ranking." This was tweeted by Danny Sullivan some time ago.

Let related-contents.com tell you the truth: this is a plain lie, as simple as that. For anyone who has seriously worked on more than five websites and has seen the Panda effect on a few of them, both rising and falling, one thing is common – all Panda-hit sites have a high bounce rate. All sites not hit by Panda either have a low bounce rate or a very high average time spent on the site. We invite readers to comment if their observation is contrary to what we have stated. So the big question is not whether the bounce rate is a signal to Panda or not. The big question is: how to reduce the bounce rate. For even if it is not a direct signal, it does no harm to keep readers engaged for a few extra pages on your website.
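If you want to see where your own site stands, a rough sketch like the following will list the worst offenders from a page-level Google Analytics CSV export. The column names ("Page", "Sessions", "Bounces") and the file name are assumptions – rename them to match your actual export.

# high_bounce_pages.py - list pages with the highest bounce rate from a
# Google Analytics CSV export. Column names are assumptions.
import csv

def worst_pages(path="analytics_pages.csv", min_sessions=100, top=10):
    rows = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            sessions = int(row["Sessions"].replace(",", ""))
            bounces = int(row["Bounces"].replace(",", ""))
            if sessions >= min_sessions:          # ignore low-traffic noise
                rows.append((bounces / sessions, row["Page"], sessions))
    rows.sort(reverse=True)                       # highest bounce rate first
    return rows[:top]

if __name__ == "__main__":
    for rate, page, sessions in worst_pages():
        print(f"{rate:6.1%}  {sessions:>7}  {page}")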

Today we will look at one way to improve (that is, reduce) the bounce rate of your website. We took our inspiration from The Moving Blog. We were reading an article on how to reduce bounce rate when this website popped up. The one thing that was pointed out was that implementing a pop-up at the bottom right-hand side of the page improved the bounce rate by as much as 30%. That is impressive by any standard. So we looked at the source of the website and searched for all occurrences of "plugins". It turned out that the plugin was upPrev:


http://www.themovingblog.com/wp-content/plugins/upprev/styles/upprev.css?ver=3.2.1

Further searches on Google revealed that upPrev is a nice WordPress plugin. It can be downloaded from here.
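If you want to repeat the same exercise on another WordPress site, a quick sketch is to fetch the page and list every plugin directory referenced under /wp-content/plugins/ in its HTML – the URL below is only an example.

# list_wp_plugins.py - list WordPress plugins referenced in a page's HTML.
# Plugin assets are normally loaded from /wp-content/plugins/<slug>/...,
# so the slugs can be read straight out of the page source.
import re
import urllib.request

def plugins_on_page(url):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    return sorted(set(re.findall(r"/wp-content/plugins/([^/'\"]+)/", html)))

if __name__ == "__main__":
    # example URL - substitute the page you are inspecting
    print(plugins_on_page("http://www.themovingblog.com/"))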

We have implemented the plugin on our site and will report on its outcome some other time.

Nov 23, 2012
 

Google updated its Panda algorithm on Nov 22, 2012, which would be the 22nd iteration of the Panda that first rolled out on Feb 24, 2011. This was learned by related-contents.com from one of the websites it is familiar with. A comparison of the traffic graphs between Nov 22, 2012 and September 20, 2012 (a little before the 20th Panda update on September 27, 2012) reveals this fact.

It looks like the site investigated by related-contents.com recovered fully at 4:00 PM Pacific time on 11-22-2012. The date, 22nd November, incidentally coincides with the 22nd iteration of Panda.
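For readers who want to run the same kind of check on their own traffic, here is a rough sketch that compares average daily sessions in a window before and after a suspected refresh date. The CSV layout, the column names and the 20% threshold are our own assumptions, not part of any tool we used.

# detect_refresh.py - compare average daily sessions before and after a
# suspected Panda refresh date. Expects a CSV with "Date" (YYYY-MM-DD) and
# "Sessions" columns - both names are assumptions, adjust to your export.
import csv
from datetime import date
from statistics import mean

def change_around(path, pivot, window_days=14):
    before, after = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            d = date.fromisoformat(row["Date"])
            gap = (d - pivot).days
            if -window_days <= gap < 0:
                before.append(int(row["Sessions"]))
            elif 0 <= gap < window_days:
                after.append(int(row["Sessions"]))
    return (mean(after) - mean(before)) / mean(before)

if __name__ == "__main__":
    change = change_around("daily_sessions.csv", date(2012, 11, 22))
    verdict = "looks like a refresh" if abs(change) > 0.20 else "probably noise"
    print(f"{change:+.1%} change in average daily sessions -> {verdict}")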

Update 11/29/2012

It has been confirmed that the refresh indeed happened on Nov 21, 2012. This is in alignment with what related-contents.com observed from the traffic graphs. The story was later confirmed by searchengineland.

Here is the list of the previous Panda updates.

Panda Update 1, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
Panda Update 2, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
Panda Update 8, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
Panda Update 9, Nov. 18, 2011: (less than 1% of queries; announced)
Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
Panda Update 11, Feb. 27, 2012 (no change given; announced)
Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
Panda Update 14, April 27, 2012: (no change given; confirmed; first update within days of another)
Panda Update 15, June 9, 2012: (1% of queries; belatedly announced)
Panda Update 16, June 25, 2012: (about 1% of queries; announced)
Panda Update 17, July 24, 2012: (about 1% of queries; announced)
Panda Update 18, Aug. 20, 2012: (about 1% of queries; belatedly announced)
Panda Update 19, Sept. 18, 2012: (less than 0.7% of queries; announced)
Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)

Oct 13, 2012
 

We will try to fix one of the two issues with Google Analytics – preventing Google from suppressing the scroll bar in the pop-up window the browser opens for outbound links.

Note that this fix applies only to Mozilla Firefox and not to other browsers, including Internet Explorer and Google's own Chrome. There may exist similar solutions for those browsers, but we do not know of them yet. Mozilla Firefox is one of the browsers most loved by webmasters; if you are using Google Analytics, chances are you are using Firefox as your default browser. So here are the steps for Mozilla Firefox.

1. Open the Firefox browser and type "about:config" into the URL bar. It will ask you to confirm; confirm and proceed to the next window.

2. It will show hundreds of settings under Preference Name. Fortunately, there is a search bar which allows you to locate the setting you want to change. Search for dom.disable_window_open_feature to filter the list.

3. Choose dom.disable_window_open_feature.scrollbars

4. Double click this line and change the value from false to true.

5. Close the window.

Next time you click an outbound link, you should see a scroll bar. Do not depend upon Google to fix the issue. You should now be able to scroll the webpage you open from Google Analytics, and reading pages with longer content becomes a breeze.
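If you prefer not to click through about:config, or want to roll the change out to several machines, the same preference can be set from a user.js file in the Firefox profile, which Firefox reads at startup. A minimal sketch follows; the profile path is only an example (see about:support, "Profile Folder", for yours).

# append_pref.py - add the scrollbar preference to a Firefox profile's user.js.
# Firefox reads user.js at startup and applies each user_pref() line.
import os

# NOTE: example path only - substitute your own profile directory
profile_dir = os.path.expanduser("~/.mozilla/firefox/abcd1234.default")
pref_line = 'user_pref("dom.disable_window_open_feature.scrollbars", true);\n'

user_js = os.path.join(profile_dir, "user.js")
with open(user_js, "a", encoding="utf-8") as f:
    f.write(pref_line)

print("Preference appended; restart Firefox for it to take effect.")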

Oct 13, 2012
 

One of the things you want to do after analyzing posts in Google Analytics is figure out which posts have a higher bounce rate or a low time spent on page. You then click the outbound link in Google Analytics to see the post or the page.

If you are using a WordPress-based blog, this makes it easy to go to the relevant post and edit it. However, Google does not make it easy for you. There are at least two major issues:

1. The outbound link opens as a pop-up, not as another tab in your browser. This is really annoying. You want to see the link in another tab, so you can seamlessly switch to it and edit the post or page right there.

2. In the pop-up window, it is not possible to scroll.

Google is one of the most user-unfriendly companies, giving very little value to communication with its users. Try calling Google or writing it an email. If you ever did, did you receive a reply or an acknowledgement? You need to be a Danny Sullivan, kicking the rear of Google in your posts, to be heard.

We hope that with this post, Google makes these two changes so that webmasters can seamlessly move between Google Analytics and their websites.

Incidentally, the problem was not present in the earlier Flash-based Analytics interface. For some reason Google was determined to retire the older version in favor of a new one which comes with many new features but breaks many things. It looks like a third-rate Google team worked on the new Analytics interface with a poor understanding of user behavior.

So how do you go about fixing the scroll window in Google Analytics? Let us read about it in the next post.

Sep 07, 2012
 


In the search engine field, Kevin O'Connor is known for two important things. The first and more important one is DoubleClick – a controversial technology for tracking users' browsing behavior so they can be served the best possible ad. As most of us know, DoubleClick was eventually acquired by Google, and the rest is history.

His more recent venture is as a co-founder of FindTheBest.com, started in 2009, which he calls an "unbiased, data-driven comparison engine". Between January 2012 and July 2012, FindTheBest saw phenomenal growth, from about 1 million users a month to more than 10 million users a month. There are few success stories that match that of FindTheBest.

Kevin received his bachelor's degree in 1983 from the University of Michigan and initially worked for Intercomputer Communications Corporation, eventually becoming its CTO and a VP of its R&D wing.

O'Connor quit DCA in 1995. Along with Chris Klaus, he started Internet Security Systems (ISS), with O'Connor as the initial investor. Kevin tasted his first success in 1999, when ISS went public. It was eventually sold to IBM Corp. in 2006 for $1.4b.

But his biggest achievement was DoubleClick, which he started with Dwight Merriman in his basement in Alpharetta, Georgia, USA. DoubleClick was sold to Hellman & Friedman for $1.2b in 2005. Google eventually acquired it in 2007 for $3.1b.

Kevin O'Connor co-authored "The Map of Innovation: Creating Something Out of Nothing" (ISBN 1400048311), published in 2003.

Kevin O'Connor runs the venture capital firm O'Connor Ventures. The companies he has invested in include 9Star, Surfline, Travidia, ProCore and CampusExplorer.

Sep 07, 2012
 
[Figure: Alexa traffic rank of findthebest.com]

Google's August update may have affected the data-driven comparison site findthebest.com. Here is the Alexa ranking of findthebest.com:

As you can see from the figure, the decline in findthebest.com's traffic coincides exactly with the July 24, 2012 Panda update.

The July 24th Panda update affected about 1% of queries. Very few sites recovered from that Panda update, and most of the affected websites were demoted. The rise of findthebest.com had been phenomenal, as is evident from its Alexa traffic: from an Alexa rank of only around 40,000 at the beginning of 2012, findthebest rose to an Alexa rank close to 3,000 before being hit by Panda.

FindTheBest says it will "Present the facts in easy-to-use tables with smart filters, so that you can decide what is best."

Unfortunately, the categorization of the facts is often misleading and unintuitive, which sends a lot of visitors straight back. FindTheBest also tries to capture everything in the world, from cars to computers to dentists and plumbers. It is easy to capture data in every field, but you need to put some thought into how to present it so that it is useful to users. FindTheBest has clearly failed in this respect, and hence the Panda hit.

It is a lesson for others to learn.

FindTheBest was co-founded in 2009 by Kevin O'Connor, better known as a founder of DoubleClick.
