I would speculate "no". Received power may well fall as the inverse square of distance, but once you read into signal modulation and throughput calculations, there is no linear relationship between throughput and received power, so the observed throughput should not fall as the inverse square of the distance.
As there are a few threads addressing cellular and WiFi throughput versus user expectation, I thought it would be nice to do a little book work. I would guess a few of us might have expertise in mathematics, physics, engineering, computer science, and networking without necessarily being...
The data throughput that can be transmitted is ultimately determined by the channel width available in the frequency domain and by the attenuation of the modulated signal, which dictates how much of the signalling must be dedicated to error correction.
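To put numbers on that, the Shannon–Hartley theorem gives the ceiling: capacity scales linearly with channel width but only logarithmically with signal-to-noise ratio. A quick sketch (the 20 MHz width and 30 dB SNR are illustrative values, not measurements):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: maximum error-free bit rate for a channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 20 MHz channel at 30 dB SNR
snr = 10 ** (30 / 10)                         # convert dB to linear ratio
print(shannon_capacity(20e6, snr) / 1e6)      # ~199 Mbit/s ceiling
```

Note that halving the SNR (a 3 dB drop) only shaves about one bit per symbol off the capacity; it does not halve the throughput.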
Once maximal throughput with minimal error correction is obtained, getting closer gains you no more throughput. As you move further away, more bits of each transmitted symbol are consumed by error correction, but not with a linear dependence on received power: the overhead depends on the error rate within the channel, which in turn depends on the signal-to-noise ratio, and neither of those is linearly dependent on the received power of the carrier signal.
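The non-linearity is stark even for the simplest modulation. For BPSK over an idealised additive white Gaussian noise channel, the bit error rate is Q(sqrt(2·Eb/N0)), so a few dB of SNR moves the error rate by orders of magnitude (a sketch under that AWGN assumption):

```python
import math

def bpsk_ber(ebn0_linear):
    """BPSK bit error rate over AWGN: Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    return 0.5 * math.erfc(math.sqrt(ebn0_linear))

for db in (0, 5, 10):
    print(db, bpsk_ber(10 ** (db / 10)))
# 0 dB  -> ~7.9e-2
# 5 dB  -> ~6.0e-3
# 10 dB -> ~3.9e-6
```

A 10 dB change in SNR (a factor of ten in power) shifts the error rate by more than four orders of magnitude, which is why error-correction overhead tracks distance so unevenly.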
Depending on the frequencies of the channels, entire channels will fall away (highest frequencies first) once reliable transmission in them is no longer possible, and you would need very many contiguous channels for the received power of the still-active channels to fall as anything like the inverse square of distance. In reality, a high-frequency channel may still deliver measurable received power long after signalling in it has ceased, and the gaps between channels should produce sharp fall-offs in throughput.
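A toy model shows the stepwise behaviour. Here every number is hypothetical (0 dBm transmit power, −90 dBm noise floor, a 5 dB decode threshold, three 20 MHz channels at WiFi-like carriers); free-space path loss grows with frequency, so the high bands drop out first and the aggregate falls in steps rather than as 1/d²:

```python
import math

def channel_snr_db(d_m, f_ghz, tx_power_db=0.0, noise_db=-90.0):
    """SNR in dB at distance d using the free-space path loss formula:
    FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    fspl = 20 * math.log10(d_m / 1000) + 20 * math.log10(f_ghz * 1000) + 32.44
    return tx_power_db - fspl - noise_db

def aggregate_mbps(d_m, min_snr_db=5.0, width_mhz=20.0):
    """Sum Shannon capacity over channels; a channel contributes nothing
    once its SNR drops below the modem's decode threshold."""
    total = 0.0
    for f in (2.4, 5.2, 5.8):                 # assumed carrier bands, GHz
        snr_db = channel_snr_db(d_m, f)
        if snr_db >= min_snr_db:              # below threshold: channel falls away
            total += width_mhz * math.log2(1 + 10 ** (snr_db / 10))
    return total

for d in (10, 50, 100, 200):
    print(d, round(aggregate_mbps(d)))
```

With these made-up parameters both 5 GHz channels die somewhere between 50 m and 100 m, and the last channel dies before 200 m: throughput collapses in steps long before received power has done anything dramatic.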
A simple test would be with WiFi: doubling your distance from the router does not usually produce a four-fold decrease in throughput. There is normally a plateau within some radius, then a rapid fall-off and disconnection.
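That plateau-then-cliff shape also follows from how real links pick rates: the radio steps through a discrete modulation-and-coding ladder (MCS), so throughput sits flat at the top rate and then drops in steps. A toy sketch with made-up thresholds and rates (not any real MCS table):

```python
# Hypothetical rate ladder: (minimum SNR in dB, link rate in Mbit/s)
MCS = [(25, 600), (15, 300), (10, 150), (5, 50)]

def link_rate(snr_db):
    """Pick the fastest rate whose SNR threshold is met."""
    for min_snr, rate in MCS:
        if snr_db >= min_snr:
            return rate
    return 0  # below the lowest rung: disconnection

for snr in (40, 30, 20, 12, 7, 2):
    print(snr, link_rate(snr))
# 40 -> 600, 30 -> 600 (plateau), 20 -> 300, 12 -> 150, 7 -> 50, 2 -> 0
```

Anywhere the SNR clears the top threshold you see the same maximum rate, and past the bottom rung you see nothing at all, which is exactly the plateau and cliff a walk-around test reveals.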
The cliché with digital signalling is that you get all or nothing, although we are all familiar with the band of data-stream corruption that sits between the all and the nothing.