Product “Testing” Gives Us An Excuse…

Jun 28, 2011

I’ve always believed that “real world” product tests involve using the product in question as much as possible, and then reporting the results.  Product destruction isn’t at the top of my list.

Still, I couldn’t help but laugh at our friends over at Big Squid when they posted this video of their Futaba 4PL “durability test.”  When I get a new radio, the last thing I want to do with it is drop it on the ground…repeatedly.  Then again, it looks like fun to do so.  If you see a police report about a guy arrested for creeping around his condo building’s rooftop, you’ll know it was me, all in the name of product testing.  I wonder if a LiPo battery would survive a 20ft drop without some serious fireworks?

Watch this and try not to cringe…

Featured News, Futaba, Stephen Bess

About the author

Executive Editor I jumped into R/C back in 1987, when mechanical speed controllers and hard rubber tires were all the rage. Since then, I’ve experienced R/C in almost every state in the USA. I've built and raced every type of RC vehicle created, and traveled throughout the country (and world) to attend and cover more R/C events than I can remember. But what a fun ride it's been! I'm fortunate to live in Southern California, and I take advantage of my location by enjoying R/C outdoors year ’round. Club racing is the future for R/C growth, and I’m always looking to bring new people into the hobby, whether it’s through backyard bashing or organized racing.

4 Responses to “Product “Testing” Gives Us An Excuse…”

  1. legal eagle says:

    I’d much rather see this than a “paid review” – yokels will inevitably drop their radios, and to me it would be a big selling point if it’s proven that your big-dollar radio won’t shatter (coughdx3scough) if you drop it in a reasonably possible circumstance (like the examples provided).

    Personally, I’d love to see a “brick wall test” of every vehicle, where you run it full speed into a wall and report what parts broke at the end of testing. If the MRC Thunder King can do it without breaking (a big selling point of the truck in the ’90s), so can others. Just because one or two things may break, that doesn’t make the vehicle “bad” – it just gives people info about which parts they may want to pick up a spare or two of. And for the ones that don’t break… that would be a very promotable thing for the companies to use in their ads.

    • Matthew Higgins says:

      I really don’t agree at all. I can assure you that we don’t do “paid reviews,” and being accused of such unethical actions (as we were in a recent amateur YouTube rant) is, well, kind of insulting. With a 2.4GHz transmitter I’d rather see range and line-of-sight tests, and maybe a failsafe test. Were those elements tested and reported on? Maybe, but probably not. For the most part, I believe products should be tested as they are intended to be used. That’s called a fair-conditions test. Next, I value reporting how a product performs after extended use, AKA long-term testing. After a fair-conditions test and a long-term test have both been performed and reported, then you can do stuff like see how a radio performs under water, or after being smashed by a rock or dropped off a roof. Doing silly stuff such as a drop test on a piece of electronics may be valuable, but only if it’s performed on all the radios tested. As a professional media company, we would only do such a “test” as part of a multi-product guide/shootout. Otherwise it’s just a stunt or a gimmick.

  2. legal eagle says:

    That’s what I was getting at, Matt – after the more conventional testing is complete, then you start to test what would happen if common mistakes or problems occur: plugging batteries in backwards to test reverse-polarity protection, the aforementioned radio drop to test durability, running things out of range to test fail-safes, and common crash-style durability tests. Obviously the real data comes first with the technical testing, but had various (most) publications covered some products I ended up purchasing fairly, and pointed out some true cons of those products, I wouldn’t have been stuck with ones that made me an extremely unhappy customer. Rave reviews of something with true flaws do nothing but hurt the customer, but that’s how our politically correct world works now, since everyone is afraid of losing advertising revenue if they say something bad about a product. Nothing wrong with that – it’s just the way it is now.

    • Stephen Bess says:

      I posted this blog to start a conversation, and it looks like it did that.

      L.Eagle, I’m glad you responded. It seems like your mind is made up, but I’ll discuss anyway :) The tests you’re suggesting are all but guaranteed to break or destroy the product. This isn’t what we’re looking to do. It seems to me that any RC guy knows that you do your best to never drop any piece of electronics.

      The electronics world has some of the most exhaustive media testing standards I’ve seen, including reviews from Consumer Reports, which does not accept advertising of any kind, and yet I’ve never seen any of them perform tests like you suggest. The outcome is fairly predictable: plug it in backwards or drop it just right, and the product will fail or break. I’m sure you’d like to see some of the tests you suggest performed, but when reputable media outlets (us included) aren’t doing them, then, as Matt said, it’s probably a gimmick, and readers would rather have more useful testing performed.

Copyright © Air Age Media. All rights reserved.