I am curious how some of you adopt a data-driven approach in your startups.
Recently, I ran an experiment with a simple A/B testing tool, A/Bingo:
http://www.bingocardcreator.com/abingo/ Through that, I found that the majority of our traffic is search-engine spiders.
Example: instead of 100 visitors with a 10% conversion rate (10 users), it becomes 10,000 visitors at a 0.1% conversion rate; 9,900 of those "visitors" are spiders.
How do you handle that? Any tools you would recommend?
It would be great if you could point me to a blog post or article that explains in detail how to adopt a data-driven approach in a startup.
PS. I have been reading heaps about Dave McClure's AARRR, Eric Ries, Andrew Chen, Mixpanel, etc. I am looking for more specific examples with real-world scenarios.
I think robots.txt works well for "genuine" bots, and evil bots can be handled using .htaccess rules.
See robots.txt examples from some top sites: http://www.webmasterworld.com/forum13/687.htm
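To illustrate the two approaches, here is a minimal sketch; the path `/experiments/` and the bot name `EvilBot` are placeholders, not anything from a real site:

```
# File 1: robots.txt -- well-behaved crawlers honor Disallow rules
User-agent: *
Disallow: /experiments/

# File 2: .htaccess (Apache) -- block bots that ignore robots.txt,
# matched by a substring of their User-Agent header
SetEnvIfNoCase User-Agent "EvilBot" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

robots.txt is purely advisory, so the .htaccess rule is the backstop for crawlers that don't respect it.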
Beyond that, you'll have to discard all results from search engines. Maybe Patrick can help make that change to A/Bingo.
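Until the tool does it for you, you can filter server-side before counting a participant. A minimal sketch in Python, assuming you can read the request's User-Agent header; the signature list is illustrative and deliberately non-exhaustive:

```python
# Skip known crawlers before recording an A/B test participant.
# These User-Agent substrings are a hypothetical, minimal list;
# real deployments should maintain a fuller one.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "spider", "crawler", "bot")

def is_probable_bot(user_agent):
    """Return True if the User-Agent header looks like a spider."""
    if not user_agent:
        # Many crawlers (and almost no real browsers) send no User-Agent.
        return True
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def record_participant(user_agent, experiment):
    """Count the visitor toward the experiment only if it isn't a bot."""
    if is_probable_bot(user_agent):
        return False
    experiment["participants"] += 1
    return True
```

With this in front of the test, the 9,900 spider hits never enter the denominator, so the measured conversion rate stays close to the true 10%.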
P.S. +1 on the robots.txt