July 13, 2010, 8:00 AM — So now we have Consumer Reports adding to the overall data on the iPhone 4 "Antenna Grip Bug" dustup. That's good and bad, and gives me an excuse to riff on some things a bit. First, it's good, because unlike most people, CR did what appears to be a solid test. They tested in an RF-shielded room, with a cell tower simulator. So, all the variables of testing in, say, your office or on the street? Gone. However, there are some issues with their report. Note: I don't think these issues invalidate, or even come close to invalidating, their findings. But there are things I'd like more info on.
First, I'd like them to release all the data on their testing methodology. I don't doubt their findings, but having other labs replicate them using the same methodology would be a good way to further validate the results. It would help to know how many ways they tested the iPhone 4. CR states they tested the iPhone 3GS and the Palm Pre among other AT&T phones, yet the specifics of those tests are not in the video CR has on their site showing the problem. Did they only test the other models the exact way they tested the iPhone 4? We don't know. I know that I can get my 3GS to drop signal with ease if I hold it right, but the specific way to hold it isn't the "grip of death". Consumer Reports should publish all the details of their test: the setup, the test equipment and its settings, the results of each run, how they arrived at the signal loss figure, and information about the room, including background RF levels. "Clean" and "RF-proof" cover a wide range of meanings, and without knowing the normal background RF, no one can judge how clean the test environment actually is. That's important.
None of this is an attempt to deny or invalidate the CR results. But right now, we have a contextless video showing a single signal strength drop after an iPhone 4 in a test jig (what's the jig made of? How is it mounted to the test bench?) is touched on the antenna gap on the left side of the phone. How many types of grips were tested? How did they measure the signal drop? All of this information should be released, because it's critical in allowing those with the background to independently evaluate and/or replicate CR's results. I am always leery of accepting a rather detail-free blog entry and a heavily cut video as 'proof'. It leaves too many openings to attack the findings from too many angles, from the legitimate to the ridiculous.
CR says the signal drop "could" be enough to drop a call. Well, that's kind of meaningless, as any drop in signal strength "could" be enough to drop a call. Did CR test whether the 20 dB drop they measured has noticeable real-world implications, or is it more cosmetic (you have fewer bars)? We don't know; that's not mentioned in the blog entry. Did CR test across a range of signal strengths? Was the signal drop consistent regardless of signal strength? We don't know, and yet we should be able to find out.
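For a sense of scale, it's worth converting that decibel figure to linear terms. This is just the standard decibel-to-power-ratio definition, nothing specific to CR's setup:

```python
# A drop expressed in dB maps to a linear power ratio via the
# standard definition: ratio = 10 ** (dB / 10).
def db_to_power_ratio(db):
    """Linear power ratio corresponding to a change of `db` decibels."""
    return 10 ** (db / 10)

# A 20 dB drop means the received power fell to 1/100th of its
# previous level.
print(db_to_power_ratio(20))  # 100.0
```

Whether a hundredfold power drop actually kills a call still depends on how much margin you had to begin with, which is exactly why testing across a range of starting signal strengths matters.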
I would have really liked to see CR, as much as possible, reproduce their tests in the real world. I've worked on enough RF systems to know that lab results, while not to be ignored, can miss problems that crop up in the real world, or falsely emphasize issues that end up not mattering outside of the lab.
(Some background: I worked on Electronic Countermeasures/Communications/Navigation systems on B-1B bombers from late 1987 to early 1993. We got them fresh from the factory, and my base, the other (at the time) four bases, and other B-1B test groups helped develop B-1B ECM test procedures. So while not an RF engineer or scientist, I have a bit more experience with RF, and with testing RF systems, than most.)
I'd also like to take a minute to talk about bars. First, they're worthless. Literally, they have no practical value. They can basically measure two things: signal strength and/or signal quality, aka signal-to-noise ratio, or SNR. Neither on its own, nor really even together, has much value for the layman.
Signal strength is probably the more useless of the two, because it's a bogus measurement. I could, with some time and a magnetron from a standard microwave oven, hit your cell phone with more signal strength than any five cell towers you know. It would be a completely useless signal, carrying no data at all, but if you want to measure raw power, oh, you'll have it in abundance. So signal strength doesn't tell you much. Next up, SNR. That's somewhat better in theory, because it's telling you the quality of the signal, and strength is a part of that. I mean, you have to have enough signal to measure, right? Well, yes and no. Yes, SNR is probably a better measurement, but only if you know a few things: what each bar signifies, whether the phone and the carrier are measuring the same things, whether all carriers and all handset manufacturers mean the same thing by bars, and what level of SNR you need to get work done. Telling me I have 4 bars instead of 5 is useless unless I know what each bar buys me, for example that every bar gained or lost means some known amount of bandwidth. In other words, what do I gain or lose with every bar?
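The "do all carriers mean the same thing by bars" problem is easy to illustrate. Here's a minimal sketch with two invented threshold tables; these values are placeholders, not any real carrier's mapping:

```python
# Hypothetical illustration: two carriers mapping the same received
# signal strength (in dBm) to "bars" with different thresholds.
# Both threshold lists below are invented for the example.

def bars(rssi_dbm, thresholds):
    """Count how many thresholds the measured signal meets or exceeds."""
    return sum(1 for t in thresholds if rssi_dbm >= t)

CARRIER_A = [-113, -107, -103, -99, -91]  # generous mapping (assumed)
CARRIER_B = [-105, -99, -93, -87, -81]    # strict mapping (assumed)

signal = -95  # same phone, same tower, same dBm reading

print(bars(signal, CARRIER_A))  # 4
print(bars(signal, CARRIER_B))  # 2
```

Same radio conditions, four bars on one display and two on the other. Without knowing the thresholds behind the display, the bar count tells you nothing comparable.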
That's the problem with bars: they have no context. They're marketing puffery used by carriers to trumpet their networks. They also leave out another important thing: cell tower capacity. It is easy, very easy, to be sitting in sight of a cell tower, getting 5 bars, and not be able to make a call or use the network, because that tower is at capacity for voice and/or data. So telling me I have a strong signal, telling me I have a clean signal? That's no good if the tower is bogged down and I can't actually do anything with that clean, strong signal. The same thing applies to a Wi-Fi network. I can have my laptop practically mating with an access point, but if there's no available bandwidth, that signal quality is meaningless.
I also doubt there's any practical way to pack all the data you'd need for a valid measurement into a simple indicator. Well, there is a way to do this, but it means setting aside the stupidity of bars for more of an Ethernet-like system. The phone itself measures strength, measures SNR, and checks whether it can get a connection. If so, you get a link light, and maybe a flashy thing to show data being transferred. If you can't get a good combination of all three, then you don't get a link light, and you know you're in a bad location and need to move. The trick, of course, would be getting all the carriers to agree on a standard for what makes a link, and dropping the idea of bars entirely. If you want more info, there could be a settings page showing some kind of effective data rate for your current connection, but that would create other silly problems. Basically, as long as there's enough signal to make a voice call and/or run at, say, 100K up/download speeds, or whatever bandwidth the carriers want to consider a working minimum, you get a link indicator, and you have some confidence you can make a call and/or move data. You may not be moving it very fast, but you'd know you have a minimum-quality connection to the cell tower.
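The link-light idea above can be sketched in a few lines. The threshold values are invented placeholders (there's no such standard, which is the whole problem); the 100K figure is the working minimum floated in the text:

```python
# A minimal sketch of an Ethernet-style "link light" for a phone:
# a single go/no-go indicator that lights only when raw signal,
# signal quality, and an actually measured data rate all clear
# minimum thresholds. All three thresholds are assumed values.

MIN_RSSI_DBM  = -105  # enough raw signal to hold a connection (assumed)
MIN_SNR_DB    = 5     # enough signal quality to decode (assumed)
MIN_RATE_KBPS = 100   # the "100K" working minimum from the text

def link_light(rssi_dbm, snr_db, measured_rate_kbps):
    """True (light on) only if all three measurements pass."""
    return (rssi_dbm >= MIN_RSSI_DBM
            and snr_db >= MIN_SNR_DB
            and measured_rate_kbps >= MIN_RATE_KBPS)

# Strong, clean signal but a saturated tower: no link light.
print(link_light(-70, 20, 12))    # False
# Weaker signal that can still move data: link light on.
print(link_light(-100, 8, 250))   # True
```

Note that the third input has to be a measured throughput, not a predicted one; that's what folds tower capacity, the thing bars ignore entirely, into the indicator.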
Do I think that would please everyone? Of course not; don't be silly. But it would at least have some reliable qualitative value, which is far more than the current bars idiocy offers.
Also, just to be quite clear: I do think Apple has a real problem with the iPhone 4. I think it's one that can be fixed via a number of field-implementable methods that would change the phone's appearance barely, if at all. But I do think the problem is real. I also would much prefer to wait until Apple comes up with a solution that isn't just doing SOMETHING, but will fix the problem the first time.
In addition, while it doesn't fix the fundamental problem with bars (they suck and are worthless), I think that if Apple and the carrier both mean the same thing by "bar", we at least get some consistency within a useless measurement. No, it's not a material improvement, but that way, we'll have a consistently worthless indicator.
In the meantime, here's my recommendation for telling if your phone, regardless of model, is getting a good signal: can you get work done? If so, then the bars don't really matter much. If you can't get work done, then the bars have even less meaning. It's a simple, easy-to-grasp method that works independently of carrier and handset.
If nothing else, it's tons better than "bars".