
That's true. Most 2.4 GHz networks beacon at 1 Mbps because they are typically configured for compatibility down to 802.11b, and beacons are transmitted at the lowest enabled rate.

Here [1] somebody collected measurements of the airtime eaten by beacons in a few configurations. Not a complete disaster, but still somewhat significant. For example, at the place where I am now, I'm receiving 67 beacons per second (all at 1 Mbps), which, according to those calculations, wastes about 17% of airtime (rough math sketched below).

[1] http://wifinigel.blogspot.com/2013/08/its-well-known-rule-of...
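A quick back-of-the-envelope check of that 17% figure. This is only a sketch under assumed values, not a measurement: it assumes a ~300-byte beacon frame (MAC header plus typical IEs; real beacons vary), the 192 µs 802.11b long preamble + PLCP header, 1 Mbps, and the 67 beacons/s observed above:

    # Rough estimate of airtime consumed by beacons (assumed values, not measurements)
    BEACON_BYTES = 300        # assumed beacon frame size; varies per AP
    PREAMBLE_US = 192         # 802.11b long preamble + PLCP header
    RATE_MBPS = 1             # beacon data rate
    BEACONS_PER_SEC = 67      # observed on this channel

    airtime_per_beacon_us = PREAMBLE_US + BEACON_BYTES * 8 / RATE_MBPS
    fraction = airtime_per_beacon_us * BEACONS_PER_SEC / 1_000_000
    print(f"{airtime_per_beacon_us:.0f} us per beacon, {fraction:.1%} of airtime")
    # -> 2592 us per beacon, 17.4% of airtime

So a single 1 Mbps beacon occupies roughly 2.6 ms, and at 67 beacons per second that lines up with the ~17% figure above.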



That's a great link / experiment, thanks.

By counting only the literal airtime of the beacons, I think it underestimates the effect a bit, because it doesn't account for contention for the remaining airtime, which would show up as increased collisions and small delays that (sorry to be hand-wavy again) can add up. I think if he ran some application-level tests at the same time (perhaps iperf, perhaps something more sophisticated) he would see a bigger impact on goodput.



