Actually, the 10mW setting wouldn't do anything good for you. It's only there for legal reasons: in some countries Wi-Fi at 100mW isn't allowed. If you select 10mW, there's a good chance that, due to the weaker (thus worse) signal, the Wi-Fi link will negotiate a lower transmission rate (you only get the maximum 802.11g rate if the signal quality is good). That means slower transmission, and much more battery use. With a good signal the radio transmits its 100mW in short bursts and uses little battery overall.

The only reason I'm aware of for changing power saving from max to medium is to handle certain access points which don't cope well with max power saving, i.e. it could be difficult to connect or some services don't work as they should. Going from max to intermediate should normally increase battery consumption; however, if your access point misbehaves with max power saving (but still works to some extent), then staying at max could itself cost battery due to lots of extra retransmissions and the like. But on this I'm mostly guessing.
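The "short bursts" point can be put in rough numbers. This is only a toy calculation, and every figure in it (payload size, the 54 vs 6 Mbit/s rate fallback, the radio-on overhead) is a made-up assumption, just to show the shape of the trade-off:

```python
# Back-of-the-envelope sketch of why a lower TX power can cost MORE battery
# overall. All numbers here are illustrative assumptions, not measurements.

def transfer_energy_joules(payload_mbit, rate_mbit_s, tx_power_w, overhead_w):
    """Energy to send a payload: (radio-on overhead + TX power) * airtime."""
    airtime_s = payload_mbit / rate_mbit_s
    return (overhead_w + tx_power_w) * airtime_s

PAYLOAD_MBIT = 8.0   # a 1 MB transfer
OVERHEAD_W = 0.3     # assumed power draw of the radio while it is awake

# 100 mW with a good signal: link negotiates the full 54 Mbit/s 802.11g rate.
fast = transfer_energy_joules(PAYLOAD_MBIT, 54.0, 0.100, OVERHEAD_W)

# 10 mW with a weak signal: link falls back to 6 Mbit/s.
slow = transfer_energy_joules(PAYLOAD_MBIT, 6.0, 0.010, OVERHEAD_W)

print(f"100 mW burst at 54 Mbit/s: {fast:.3f} J")
print(f" 10 mW crawl at  6 Mbit/s: {slow:.3f} J")
```

With these numbers the 10mW transfer keeps the radio awake about nine times longer, so it ends up costing several times more energy despite the lower transmit power; the radio staying awake dominates, not the transmit power itself.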