Garmin Vivofit 2 — a Micro Review — and a Comment about Techno Crap

© 2016 Peter Free

 

14 March 2016

 

 

Nice idea — fouled by shoddy manufacturing

 

After having neurosurgery on my spine, I uncharacteristically thought that a techno gadget might help me monitor my rehab activity. Mountain and road cycling, weight lifting, and everything else even whisperingly hazardous were prohibited. I was limited to lifting no more than 10 pounds. A sneeze, they told me, could undo the surgeon’s work. Imagine a fall down the stairs or a trip on a protruding German cobblestone. Yikes.

 

Enter Garmin’s Vivofit 2, which I chose over its competitors, including Fitbit’s noticeably more expensive Charge HR.

 

 

What was good about the Vivofit 2?

 

It made a great (but expensive) watch.

 

Not being a gadget freak — and usually being averse to wearing anything on wrists and arms — I thought that an activity tracker should minimally function as a watch that I could see without reading glasses.

 

Out of consideration went the Fitbit, which required a button push even to see its tiny numbers. In came the Vivofit, which continuously displays the time in numerals large enough to see without reading glasses. Hallelujah.

 

The Vivofit also had the advantage of fitting both securely and loosely on my wrist. It was the first watch in decades that I could stand wearing all day. I was pleased.

 

 

The bad — everything else

 

Indicatively, the first time around, the Vivofit 2 would not synch properly with my PC and the Garmin-provided dongle. (The dongle is a small Bluetooth receiver that fits into the computer’s USB port.) You can also get the Vivofit to synch with Bluetooth devices, especially phones. You can see the instruction manual here.

 

I did not bother to try the tracker’s cell phone and tablet pairing ability because I had so much trouble just getting it to hook up with its own Garmin dongle and my computer. I figured that if I were going to have to jump through Apple’s software hoops as well as Garmin’s, I would have even more trouble.

 

It took lots of pairing tries before I could get the computer to recognize the Vivofit and download the gizmo’s operating software.

 

Afterward, as a test, I counted my steps around the house in my head. The Vivofit 2’s count roughly agreed. Good enough, I thought, for comparative purposes from day to day.

 

I had never expected a device this simple to be especially accurate. I just wanted its errors to be roughly consistent, so that I could compare approximated walking totals from one day to the next.

 

However, based on past experience with more expensive Garmin devices — see one such here — I knew that Garmin frequently promises more than it delivers. Rather than trusting the device itself to store my totals, I wrote them down each evening. This would, it turned out, later justify my return of the product.

 

On the morning of the second day, the Vivofit again would not initially pair with the laptop. (Synching is necessary to transfer the device’s totals to Garmin’s online Connect app.) After some futzing, I discovered that Garmin’s initial software installation had not bothered to download its own software updates. How stupid is that?

 

I manually updated the device and was again able to use the tracker, this time without apparent subsequent synching difficulty. To test the updated software, I looked at my online totals a few more times that day.

 

On the morning of the third day, however, the Vivofit again would not synch on the first try. After several attempts, I got it to upload its stored data to Garmin’s Connect website. I saw (sadly) that the gizmo had dropped more than half of the previous day’s total steps. Although I had not stopped walking until 2200 hours, the online graph indicated that I had stopped doing anything at all around 1200 the previous day.

 

Nothing I tried worked to get the lost steps back. They were gone, as if they had never been — which defeated the purpose of having the tracker.

 

 

I returned the Vivofit 2

 

A nice idea, but poorly programmed and unreliably manufactured.

 

 

A related comment — about culture and techno crap

 

Lauren Goode, writing at The Verge, summed up our apparently shared irritation with these poorly implemented devices and their manufacturers:

 

 

Over the past several months I’ve noticed a bizarre trend among the digital health and fitness products I’ve tested: many are faulty, incomplete, or inaccurate products, and the companies making them are hawking them anyway. And when people — myself included — complain about them, the response is always that improvements are coming. But are they?

 

A multi-sport watch that actually only tracks one sport, with the promise of more to come and a vague timeline, was likely rushed out for the holiday shopping season. A health-tracking app that’s only viewable on a tiny watch face, and not in a compatible mobile app, is an odd workaround; maybe it will get there. But a fitness band that doesn’t come close to accurately measuring running distances is frustrating, or even ridiculous.

 

My experiences with consumer health and fitness products that have shipped as "ready" when they’re really not go much farther back than this.

 

If [consumers are] lucky, maybe the product they bought will work as promised right out of the gate. If they’re unlucky . . . they’ll be part of a feedback loop where they’re giving up plenty of personal health data for little in return . . . .

 

© 2016 Lauren Goode, People who buy activity trackers shouldn’t have to be beta testers: Stop telling us we're holding it wrong, The Verge (13 March 2016)

 

 

This is where Amazon’s product reviews are helpful

 

 

In the case of activity trackers, roughly 20 percent of Amazon customers think that the devices do not work properly and accord them only 1 or 2 stars. It doesn’t matter which brand.

 

Amazon’s coy rating scale, however, skews these results into deceptively favorable overall ratings. As of this writing, the Vivofit 2 gets a 4-star — “I like it” — overall rating. This is so despite the fact that:

 

14 percent of its raters thought it was pure crap (1 star — “I hate it”)

 

and

 

another 9 percent considered it to be almost pure crap (2 stars — “I don’t like it”).

 

That’s a 23 percent “it sucks” rate. And this is for a device that only tells time, totals arm jiggles, and costs about $80. And they still can’t get it right.
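
To make the skew concrete, here is a back-of-the-envelope sketch in Python. The 14 and 9 percent shares come from the Amazon figures quoted above; the 3-, 4-, and 5-star shares are my invention, chosen only so that the percentages sum to 100:

    # Hypothetical star distribution for an activity tracker.
    # The 1-star (14%) and 2-star (9%) shares are from the Amazon figures
    # cited above; the 3-, 4-, and 5-star shares are assumed, purely to
    # show how averaging hides a high failure rate.
    distribution = {5: 0.55, 4: 0.15, 3: 0.07, 2: 0.09, 1: 0.14}

    average = sum(stars * share for stars, share in distribution.items())
    failure_rate = distribution[1] + distribution[2]

    print(f"average rating: {average:.2f}")        # 3.88, shown as about 4 stars
    print(f"'it sucks' rate: {failure_rate:.0%}")  # 23%

Nearly a quarter of buyers call the thing junk, yet the headline number still rounds toward “I like it.”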

 

If you saw a rejection rate this high anywhere else, you probably would not buy the product. For example, no professional automobile reviewer would rate a car model as being good — 4 stars (“I like it”) — when 20 percent of the units off its manufacturing line do not even serve their intended purpose.

 

 

If you think about it, Amazon’s rating system is nonsensical

 

If:

 

50 percent of samples of a product get 5 stars (“I love it”)

 

and

 

the other half get 1 star (“I hate it”) —

 

then the averaged (total) rating will be 3 stars — “It’s okay”.
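
The same arithmetic, as a minimal Python sketch of the hypothetical 50/50 split above:

    # A product that completely polarizes its reviewers still averages "okay".
    ratings = [5] * 50 + [1] * 50          # half love it, half hate it
    average = sum(ratings) / len(ratings)  # (0.5 * 5) + (0.5 * 1)
    print(average)                         # 3.0, i.e., "It's okay"

A score that half of the buyers would read as a warning label instead gets averaged into a shrug.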

 

No one in their right mind would implement meant-to-be-useful quality ratings this way. Amazon probably does it to conceal how bad a lot of the products it sells are. The mega-business would lose retail profits if consumers stopped buying obvious garbage.

 

 

The moral? — Many samples of the Vivofit 2 (and other activity trackers) are evidently trash . . .

 

. . . elevated beyond their worth by Amazon’s disingenuous rating system.

 

Both these aspects give us insight into our trash-enduring culture. Crap flows downhill into Consumer Valley, where it piles up as if behind a dam.