New and Sexy LeapFish: But Framesets Can Present Issues.

UPDATE May 13, 2010: Things pertaining to LeapFish.com, their founders and management, along with related startups, are no longer of interest to me; therefore, no further comments are allowed on this post. There are also more reliable sources out there (such as the Better Business Bureau) to help you make an informed decision about LeapFish.

***You may also want to know that the law firm of Daniel Bakondi, in San Francisco, CA, is investigating a possible class action lawsuit against LeapFish.

Update Nov. 9, 2009: It appears that LeapFish is no longer using framesets on their website.


A mention by USA Today, a stream of hundreds of tweets, mentions on hundreds of blogs, a “cutting edge” video on YouTube: the LeapFish.com relaunch can certainly be seen as an enormous success, except…

The leading technology blogs continue to give LeapFish.com nothing but a “cold shoulder”.

All the buzz surrounding the relaunch is focused on how sexy and nice-looking the new website is; no one seems to be trying to “flip beyond the cover page”.

I would like to write about LeapFish’s implementation of framesets on their pages, which may actually spook an unseasoned Internet user into believing that they are visiting an unsecured website.

If you search LeapFish for “Read Write Web”, for example, the website returns nice, clean results with the most relevant link in the number one position:

Considering average user expectations, most of us would expect to land on http://www.readwriteweb.com/ – that is, however, not the case with LeapFish. After clicking the link you end up on a nice-looking LeapFish page which embeds (via frameset) the content from The Read Write Web, with the LeapFish bar at the bottom of the page:

Click on the above image to view it in full size

While visitors to The Read Write Web may not care at all, visitors to websites like Amazon can get confused and may be led to believe that shopping on Amazon and other websites is no longer secure. Let me show you an example of what I mean. The following shot is of my Amazon account page while accessing the website directly (typing http://www.amazon.com into my browser):
Click on the above image to view it in full size

The HTTPS prefix as well as the lock icon are the basic signs by which an average Internet user can identify whether or not a website is secure.

Now here is a screenshot of the same page, but viewed via the frameset on LeapFish.com:
Click on the above image to view it in full size

Both the HTTPS prefix and the lock icon are missing, and the URL clearly says you are still on LeapFish.com. Even though I consider myself an above-average Internet user, before I logged into my Amazon account via the frameset on LeapFish I had to check the code of the page first:
framesetcode
I wanted to know if my Amazon login info is safe with LeapFish. How many users do you think will do that?
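Checking for a frameset by hand is easy enough to automate. Below is a minimal sketch in Python (my own illustration, not anything LeapFish or Amazon provides) that reports whether a page's markup wraps its content in a frameset and where each frame actually points:

```python
# Minimal sketch: inspect a page's HTML and report whether it uses a
# <frameset>, and the src of every <frame>/<iframe>. The sample markup
# below is an illustrative guess at what a framing search site serves.
from html.parser import HTMLParser


class FramesetInspector(HTMLParser):
    """Collects <frameset> occurrences and the src of every frame."""

    def __init__(self):
        super().__init__()
        self.has_frameset = False
        self.frame_sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "frameset":
            self.has_frameset = True
        elif tag in ("frame", "iframe"):
            src = dict(attrs).get("src")
            if src:
                self.frame_sources.append(src)


def inspect(html):
    """Return (uses_frameset, [frame sources]) for a chunk of HTML."""
    parser = FramesetInspector()
    parser.feed(html)
    return parser.has_frameset, parser.frame_sources


sample = """
<frameset rows="90%,10%">
  <frame src="https://www.amazon.com/">
  <frame src="/toolbar.html">
</frameset>
"""
print(inspect(sample))  # (True, ['https://www.amazon.com/', '/toolbar.html'])
```

Run against saved page source, it makes the situation obvious: the address bar belongs to the framing site, while the sensitive content (and any HTTPS it uses) lives inside the frame.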

Potential Issues With Google?

AdSense users can tell you how tough Google is when it comes to enforcing terms and conditions and how easy it is to get banned. The Google Search API TOS do not address the use of framesets, so if you know whether the use of framesets is allowed, please enlighten me:
Click on the above image to view it in full size

If you are interested in finding out more about LeapFish, I highly recommend the LeapFish review by the Better Business Bureau.

Tips On Keeping Your WordPress Less Vulnerable To Hackers

WordPress, being one of the most popular publishing platforms and content management systems, is also one of the most frequent targets for spammers. Google reacts quickly to identify hacked sites, and as a result your traffic and sales can drop drastically. The good news is that Google will make reasonable attempts to notify webmasters about potential problems via Google Webmaster Tools. However, if you do not act quickly, it can be a truly devastating blow to your online presence.

So how do you prepare yourself for an event such as your blog being hacked via code injected into your theme files or even your database? Here are the steps that I would recommend to anyone using WordPress as their publishing platform or CMS:

  • Register with Google Webmaster Tools. Not only can Google Webmaster Tools be useful in identifying an attack; you can also use the tools to resubmit your website for reconsideration once you have dealt with the threat and cleaned up the mess.
  • Create Google Alerts to notify you of a possible threat. While it is impossible to foresee every possible spam keyword, you can create alerts for the most common ones, such as “viagra” or “porn”. How do you create such an alert? Simple. Let’s presume your domain is “yourdomain.com”; your Google Alert will then be for the search query “viagra site:yourdomain.com”. Of course, relying on Google Alerts alone is not a good idea.
  • Check the code yourself. It does not really take that much: right-click your mouse and view the page source. Generally, when an attack is carried out, the code is injected somewhere in common files and will be visible on every page of your website.
  • If you modify your theme yourself, keep a backup of the version that includes your most recent updates. It is always a good idea to keep a backup of your theme files no matter what.
  • Keep the latest backup of your database. I find that the WordPress Database Backup plugin (HT: Andy Beard) is one of the most useful plugins to have. You can tell the plugin to mail the SQL file to the e-mail address of your choice on a regular basis. I do it daily; if you publish many posts per day, you can choose to have the backup made every couple of hours.
  • Do not broadcast to the world the version of WordPress you are using. I have seen WordPress theme developers insert code that displays the current version of WordPress; most of the time that code is found in the header.php file. Look for the code and remove it. There is no reason for anyone to know which version you are using.
  • Keep your WordPress installation and plugins updated. The current version of WordPress allows one-click updates from within your dashboard, so there really is no excuse for us any longer.
  • Keep your files in a directory that no one besides you knows about. You can install or move your WordPress files to a directory that only you know about. I will try to elaborate on this in the future, especially on how to move your WordPress installation to another “secret” directory.
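The “check the code yourself” tip above can also be scripted. Here is a rough sketch (the keyword list and the sample markup are my own illustrative assumptions) that scans a page's HTML both for common spam keywords and for a generator meta tag that broadcasts your WordPress version:

```python
# Rough sketch: audit one page's HTML for injected spam keywords and
# for a "generator" meta tag. Keywords and sample page are assumptions.
import re

SPAM_KEYWORDS = ["viagra", "cialis", "casino"]  # extend to taste
GENERATOR_RE = re.compile(
    r'<meta[^>]+name=["\']generator["\'][^>]*>', re.IGNORECASE
)


def audit_page(html):
    """Return a list of human-readable warnings for one page's HTML."""
    warnings = []
    lowered = html.lower()
    for word in SPAM_KEYWORDS:
        if word in lowered:
            warnings.append(f"possible injected spam keyword: {word!r}")
    if GENERATOR_RE.search(html):
        warnings.append("page broadcasts a generator meta tag "
                        "(often includes the WordPress version)")
    return warnings


# Example run against a deliberately "infected" page:
page = ('<meta name="generator" content="WordPress 2.8.4">'
        '<div style="display:none">cheap viagra</div>')
for warning in audit_page(page):
    print(warning)
```

Pointed at your own saved page source (or a fetched copy of your homepage), a script like this catches the most blatant injections; it is a complement to Google Alerts, not a replacement for a real malware scan.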

I have cleaned up several WordPress installations for my friends over the past year. A hacker attack can be devastating if you are not prepared to deal with it. And yes, it can happen to anyone, even to the best of geeks.

Trolls Are Not Allowed!

Within minutes of publishing my last post I had a visitor, a very pissed-off visitor, who managed to spew his venom on Bill Hartzer‘s blog as well (Bill, my apologies for the “nofollow” attribute; I do trust your website, but I hate to give LeapFish any juice):
LeapFish Troll
While I can speculate who this “Anonymous” may have been, the only certain fact I know is that he came to read the post via Twitter, after an entry announcing that I had written the post appeared in my account.

There is no point in communicating with anonymous trolls like that. Most of the time they have no interest in hearing what you have to say. They can’t contain their anger and spew their venom using bogus names along with bogus e-mails. I find it difficult to describe what I feel about trolls like that; fortunately there are people out there who have a way with words (HT: Lord Matt), and I could not agree more!

As sad as it is, their behavior casts more suspicion on, and does more damage to, themselves, the interests they try to defend, and the companies they work for. Sad, really, but from now on…

NO TROLLS ALLOWED!

LeapFish 2.0 Claims To Solve The Real Time Search. But Will They Survive Beyond 2009?


Update: The new LeapFish has finally launched. You can read my analysis of LeapFish’s implementation of framesets and how confusing it can be for users.

After a rather painful year, LeapFish.com, a meta-search aggregator site just like Dogpile.com or Mamma.com, is ready to launch what they call “LeapFish 2.0”, claiming that it will solve real-time search.

Believe it or not, I, of all people, was one of the “privileged few” to view their “test” product. However, I could not keep quiet about my discovery:
LeapFish 2.0 Is About To Launch?
And of course LeapFish killed that subdomain a few minutes after my “Twitter broadcast”. Did they find my viewing of their new product unwelcome? Or is it also possible I “tuned in” at the end of an exhibit of their product to their supporters? I will never find out.

The new design has many flaws and way too many Ajax widgets; it felt as if I were looking at the dashboard of WordPress. It will cause huge load-time issues if released as I saw it yesterday. Sometimes less means more. The notion that LeapFish will solve real-time search is a myth, to say the least. It is a no-brainer that search engine gurus like Danny Sullivan would never consider LeapFish a contender to solve the real-time search problem.

If you have never heard of LeapFish.com before, here is a short history rundown…

LeapFish launched about a year ago and was given a somewhat cold shoulder by TechCrunch. Instead of trying to appeal to the hearts and minds of the technology and online marketing crowd, they focused more on their business and got caught with their “pants down” by the same TechCrunch. After firing the employee who engaged in the alleged click fraud described by TechCrunch, the LeapFish CEO still tried to blame TechCrunch for disliking LeapFish:

I am disappointed at this post by you and by TechCrunch. You never contacted us to verify the information you posted or even checked to see what our position was as a company around such behavior before you published. You apparently called in but didn’t make the effort of speaking with someone about this. Frankly put, I find your post’s title and content irresponsible and distasteful. – Ben Behrouzi, CEO of LeapFish (you can find the full text of his rebuttal at http://www.benbehrouzi.org/2009/02/04/leapfish-gets-second-lashing-from-techcrunc/ - apologies to my readers for not providing a click-able link).

Dealing with criticism in an adult manner is not a virtue of LeapFish; instead, the CEO of LeapFish decided to go on a domain shopping spree, threatening to expose the “real” Vlad Zablotskyy. I have to say that I am surprised they never threatened to sue TechCrunch, or perhaps Michael Arrington just laughed at the threat if they did in fact receive one. However, if I were Robin Wauters, I would make sure I forever own the .org, .net, and other variations of his name.

But the past is just that: the past. The true test for the company will be the future, in particular the next few weeks and months. The first advertisers and investors are about to reach their first anniversary, and their credit cards will be charged the renewal fee, which will trigger many to reexamine their “investment” in LeapFish. It would be in LeapFish’s best interest to have as few disappointed “investors” and “advertisers” as possible, and the only way to do so is to send advertisers top-quality traffic. While they may have been successful in selling their inventory (keywords), the increase in traffic was and is disproportionately slow.

Even the most favorable statistics from Compete.com (showing LeapFish with almost 500,000 visitors in September) do not justify an investment of $1,000+, especially if LeapFish.com indeed has “hundreds” of advertisers in their system. I will do some math on LeapFish’s statistics in upcoming posts. I doubt, however, that the launch of their new product will stop the coming wave of dissatisfied clients; for LeapFish’s own sake and their future, I hope I am wrong. Nor do I think this upcoming launch will bring the volume of traffic necessary to justify the price of advertising. Again, I hope LeapFish proves me and other critics wrong, for their own good.

If they don’t take drastic measures to purchase or otherwise increase traffic to their site, LeapFish, or at least their business model, will not survive beyond 2009.

If you are interested in finding out more about LeapFish, I highly recommend the LeapFish review by the Better Business Bureau.

Disclosure: Everything written above is my personal opinion and my interpretation of events as I see them. You are free to form your own opinion, which may or may not agree with the above post. You are, however, asked to show some manners (aka “being polite”) if you decide to leave a comment below. Comments from LeapFish management will be moderated and may be removed if I find them inappropriate.

Telling Google How Often To Crawl Your Site: Not A Good Idea After All.

A while back I wrote an article pointing out one of the features available to webmasters at Google Webmaster Central: the ability to set a custom crawl rate. At the time I thought it was an outstanding idea. It still is a good solution should Googlebot visit your site too often and cause bandwidth issues (even though I have never heard of such a scenario).

Webmasters, however, should exercise caution when changing the crawl rate via Google Webmaster Central. A few weeks ago I was involved in an upgrade of a website with 2000+ pages. One of the side effects of the upgrade was that we ended up with 2000+ new URLs. Unfortunately, there was no easy way to implement 2000+ redirects on the website. But since most of the traffic came from PPC campaigns, we decided to let Google re-index the website on its own. The exception, of course, was that we used Google Webmaster Central to request a more frequent crawl by Google. We increased the crawl rate by some 3000%, only to receive the following notice a few days later:

Dear Webmaster,

Google has algorithms that determine how much to crawl each site. Our goal is to crawl as many page from your site, http://www.xxxxx.com/, as we can without overwhelming your server’s bandwidth.

For your site, http://www.xxxxx.com/, you have set a very low crawl rate which is preventing us from accessing your great content. We recommend you set the crawl rate option to “Let Google determine my default”, or, if you prefer to maintain a custom setting, increasing the rate.

Thanks,

The Google Web Crawling Team

In fact, Google had not indexed a single URL from the new sitemap. Not even the home page (which is, or was, the strongest page of the website) was re-indexed. So we followed Google’s instruction, and within 24 hours our new pages began to appear in the index.

So if you are thinking of increasing the crawl rate in Google Webmaster Central, exercise caution, or you may run the risk of not increasing the crawl rate enough for Google to do its job.
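As an aside on the 2000+ redirects mentioned above: they need not be written by hand. A mapping of old to new URLs can be turned into Apache redirect rules mechanically. The sketch below assumes a simple CSV mapping file, which is my own invention for illustration rather than what we actually had:

```python
# Sketch: turn a CSV mapping of "old-path,new-path" into Apache
# "Redirect 301" directives suitable for an .htaccess file.
# The mapping format is an assumption for illustration.
import csv
import io


def redirects_from_mapping(csv_text):
    """Yield one 'Redirect 301 /old /new' line per mapping row."""
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) != 2:  # skip blank or malformed lines
            continue
        old_path, new_path = row
        yield f"Redirect 301 {old_path} {new_path}"


mapping = "/old-page.html,/new-page/\n/about.htm,/about/\n"
print("\n".join(redirects_from_mapping(mapping)))
# Redirect 301 /old-page.html /new-page/
# Redirect 301 /about.htm /about/
```

With the generated lines pasted into .htaccess (on Apache with mod_alias available), the old URLs would 301 to the new ones and Google could follow them at its own pace, with no crawl-rate tinkering required.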
