But I've seen some products that seem to have a 20 MHz-only mode, a 20/40 MHz mode, and a 40 MHz-only mode. If you have equipment like that and you've put it into 40 MHz-only mode, then it can only transmit when the whole 40 MHz-wide swath of spectrum is free, and it can be interfered with by anything anywhere in that 40 MHz-wide channel. You don't have to use the 20/40 MHz channel width all the time.

There is also what's sometimes called the "good neighbor" Wi-Fi policy: when many nearby networks are interfering with one another, the channel width falls back to 20 MHz to cope better with the interference. In the 2.4 GHz band, devices are required to drop back to 20 MHz if another access point is detected on the extension channel.

I just changed my 802.11ac network from 80 MHz to 40 MHz bandwidth and it works. Unlike 802.11ac, my 802.11n network won't change to 40 MHz; it stays at 20 MHz bandwidth (a slower speed compared to 40 MHz).

To get 802.11n's 300 Mbps signaling rate, you have to use two spatial streams (a.k.a. 2x2, or 2T2R), 40 MHz channels, and a short (400 ns) guard interval (short GI). The fact that you're seeing a 144 Mbps signaling rate indicates that you've got the 2x2 and short GI right, but you're only using 20 MHz channels instead of 40 MHz.

(As an aside on the "DC - 20 MHz" notation: frequency is a parameter of any signal that changes state periodically, whatever that change of state is, not just the polarity of an AC waveform or the repetition rate of DC clock pulses, so it is not an inherently AC or DC parameter.)
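To make the arithmetic behind those 144 Mbps and 300 Mbps figures concrete, here is a minimal Python sketch of the 802.11n peak-rate calculation (data subcarriers × bits per subcarrier × coding rate × spatial streams, divided by symbol time plus guard interval, at MCS 7 per stream). The function and constant names are my own illustration, not part of any Wi-Fi tool.

```python
# Rough 802.11n (HT) peak PHY-rate calculator for the highest per-stream
# modulation (MCS 7: 64-QAM with rate-5/6 coding). The constants are the
# standard HT OFDM parameters; the inputs are channel width, number of
# spatial streams, and guard interval.

DATA_SUBCARRIERS = {20: 52, 40: 108}   # data subcarriers per HT channel width (MHz)
BITS_PER_SUBCARRIER = 6                # 64-QAM carries 6 coded bits per subcarrier
CODING_RATE = 5 / 6                    # MCS 7 convolutional coding rate
SYMBOL_TIME_US = 3.2                   # useful OFDM symbol duration, microseconds


def ht_phy_rate_mbps(width_mhz: int, spatial_streams: int, short_gi: bool) -> float:
    """Return the peak 802.11n signaling rate in Mbps for the given settings."""
    gi_us = 0.4 if short_gi else 0.8   # short GI = 400 ns, long GI = 800 ns
    bits_per_symbol = (
        spatial_streams * DATA_SUBCARRIERS[width_mhz] * BITS_PER_SUBCARRIER * CODING_RATE
    )
    # bits per microsecond == megabits per second
    return bits_per_symbol / (SYMBOL_TIME_US + gi_us)


if __name__ == "__main__":
    print(ht_phy_rate_mbps(20, 2, short_gi=True))   # ~144.4 Mbps: 2x2, short GI, 20 MHz
    print(ht_phy_rate_mbps(40, 2, short_gi=True))   # 300.0 Mbps: same, but 40 MHz channels
```

Doubling the channel width roughly doubles the number of data subcarriers (52 to 108), which is why the jump from 20 MHz to 40 MHz takes the 2x2 short-GI rate from about 144 Mbps to 300 Mbps.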

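If you want to confirm what width your link actually negotiated after a change like that (or after a good-neighbor fallback to 20 MHz), one option on Linux is to parse the output of `iw dev <interface> info`, which includes a `width:` field on most modern drivers. This is only a sketch under assumptions: the interface name wlan0 is hypothetical, `iw` must be installed, and some drivers simply don't report the channel line.

```python
import re
import subprocess
from typing import Optional


def current_channel_width_mhz(interface: str = "wlan0") -> Optional[int]:
    """Run `iw dev <interface> info` and return the reported channel width in MHz, if any."""
    result = subprocess.run(
        ["iw", "dev", interface, "info"],
        capture_output=True,
        text=True,
        check=True,
    )
    match = re.search(r"width:\s*(\d+)\s*MHz", result.stdout)
    return int(match.group(1)) if match else None


if __name__ == "__main__":
    width = current_channel_width_mhz()
    if width is None:
        print("No channel width reported (interface may be down or not associated).")
    else:
        print(f"Current channel width: {width} MHz")
```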