The Shannon limit is constant if you assume reasonable but idealized SNR values for the medium, and gives you a real "law of physics" upper limit ... typically still many orders of magnitude beyond what we can transmit today.
But it's amazing how effective DSP can be; trellis coding managed to get modems to squeeze right up to the Shannon limit of POTS telephone connections.
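For a concrete back-of-the-envelope check (my own rough figures, not from the thread): assuming roughly 3.1 kHz of usable POTS bandwidth and an SNR around 35 dB, C = B * log2(1 + SNR) lands near 36 kbit/s, which is about where V.34 modems topped out (33.6 kbit/s).

    import math

    def shannon_capacity(bandwidth_hz, snr_db):
        """AWGN channel capacity in bit/s: C = B * log2(1 + SNR)."""
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Rough POTS figures (assumed, not measured): ~3.1 kHz usable band, ~35 dB SNR.
    c = shannon_capacity(3100, 35)
    print(f"~{c / 1000:.1f} kbit/s")  # about 36 kbit/s; V.34 modems reached 33.6 kbit/s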
> The Shannon limit is constant if you assume reasonable but idealized SNR values for the medium
The Shannon limit changes as the technology used over the medium changes, not the other way around.
If you signal with light on/off pulses, you get one limit. If you add polarization tricks (using different physical properties and tech), you get another limit. As you add QAM and a zillion other tricks, you get another channel limit. If you add quantum superdense coding, you get another channel limit. Each of those, until we learned there is yet another layer of physics and tech, would be "the Shannon Limit." All of these can be done on the same medium.
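To make the "each scheme has its own ceiling" point concrete, here's a toy comparison (the 20 dB SNR and the per-symbol framing are my assumptions, not from the thread): on/off keying carries at most 1 bit per symbol and 64-QAM at most 6, while the unconstrained AWGN figure at the same SNR sits somewhere else entirely.

    import math

    def awgn_bits_per_symbol(snr_db):
        """Unconstrained AWGN capacity per complex symbol: log2(1 + SNR)."""
        return math.log2(1 + 10 ** (snr_db / 10))

    snr_db = 20  # assumed link SNR
    schemes = {"OOK": 1, "QPSK": 2, "16-QAM": 4, "64-QAM": 6}
    print(f"AWGN limit at {snr_db} dB: {awgn_bits_per_symbol(snr_db):.2f} bits/symbol")
    for name, bits in schemes.items():
        # Each modulation format caps out at its own bits-per-symbol figure,
        # regardless of how close the channel itself could get.
        print(f"{name:>7}: at most {bits} bits/symbol")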
The Shannon limit is a mathematical *model* of a channel. It's not a physical/technological limit.
Here [1], for example, is a paper pointing this out for transoceanic undersea optical cables. "As pointed out in Section 9.3, the Shannon limit is only limiting if we assume there is no technical way to further improve the QoT..."
Technology changes routinely change the "Shannon Limit," since that limit has almost nothing to do with physics. Physics and the signaling technology define a Shannon Limit for that particular channel combination, nothing more.
There is also a technology-independent Shannon limit for a cable. You can calculate Shannon limits based on the bandwidth and noise of your specific fiber-optic transceivers, which can improve, but you can also calculate one based on the cable itself.
The Shannon limit already accounts for any number of channels and any level of QAM.
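One way to see this (a minimal sketch, all figures made up): split a fixed bandwidth and power budget into N equal sub-channels and the summed capacity does not move, so adding carriers or raising the QAM order cannot push past the aggregate figure.

    import math

    B = 4e9     # total bandwidth in Hz (assumed)
    P = 1e-3    # total signal power in W (assumed)
    N0 = 1e-14  # noise power spectral density in W/Hz (assumed)

    def capacity(bandwidth, power):
        """AWGN capacity in bit/s for a slice of the band with its share of power."""
        return bandwidth * math.log2(1 + power / (N0 * bandwidth))

    for n in (1, 4, 64, 1024):
        # Split band and power evenly across n sub-carriers; the total is unchanged.
        total = n * capacity(B / n, P / n)
        print(f"{n:>5} sub-channels: {total / 1e9:.3f} Gbit/s")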
Ah, here are two interesting and well-cited papers regarding an optical "Shannon limit", both published in the IEEE Journal of Lightwave Technology.
The old idea of a "Shannon Limit" has been split into many types, including linear (old tech) and non-linear (using more modern tech). Both of these papers address the non-linear limit (which is not a limit, as both papers show, but more of an artifact of modern tech):
[1] "Approaching the non-linear Shannon Limit" : From the abstract "We also discuss the techniques which are promising to *increase* and/or approach the information capacity limit."
[2] "Scope and Limitations of the Nonlinear Shannon Limit" : "It is shown that this is a limit (if at all) holding only for conventional detection strategies. Indeed, it should only be considered as a limit to the information rate that can be achieved with a given modulation/detection scheme"
These are extensively cited, so use Google Scholar and read up to see that there is no "Shannon Limit" except for a particular channel technology. The concept has been split into a million directions depending on the underlying technology, each with a different value for "Shannon Limit".
I just gave you a paper showing that is not the case. I even provided the quote.
There is no inherent limit from physics, only from engineering. History provides ample evidence where a given medium had its "Shannon Limit" broken via new signaling methods over the same medium. The terms you used, "bandwidth" and "noise," are technological, not physical. "Bandwidth" increases each time we invent better methods to pack and decode it. Most of the old concepts are based on old Fourier analysis, whereas a better modern framework is wavelets or even their successors. Fourier is a very simple way to think of signals, missing plenty of useful things, which is exactly what led to the development of wavelets, to enable more powerful signal analysis.
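To illustrate the Fourier-vs-wavelet point (a toy sketch, not taken from any cited paper): the FFT magnitude of a short burst tells you a tone is present but not when it happens, while even a one-level Haar wavelet split keeps the timing in its detail coefficients.

    import numpy as np

    # A signal that is silent except for a short burst near sample 600 (toy data).
    n = 1024
    x = np.zeros(n)
    x[600:616] = np.sin(2 * np.pi * 0.25 * np.arange(16))

    # Fourier view: the magnitude spectrum says "a quarter-rate tone exists",
    # but the timing lives only in the phases -- nothing points at sample 600.
    spectrum = np.abs(np.fft.rfft(x))
    print("FFT peak bin:", spectrum.argmax())

    # One-level Haar split: detail coefficients stay time-localized.
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    print("largest detail coefficient near sample:", 2 * np.abs(detail).argmax())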
Note in particular there is no single "bandwidth" when dealing with photonics, since photons can be physically stacked arbitrarily without interference. Simply google for breakthroughs in photonics bandwidth to see decades of advances beyond what was previously thought to be "the bandwidth".
The Shannon limit is a *model* of the system. That model is necessarily incomplete, since technology keeps advancing. Those models almost always use simplifications to make the math tractable, such as Gaussian noise, independent errors, and a host of other things that are not physically true. Even the concept of noise requires either a non-physical mathematical abstraction or some details of the engineering used to measure it. Simply saying "there is Gaussian noise" is a gross model used to simplify the math. It is non-physical.
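As a small illustration that "the noise is Gaussian" is a testable modeling assumption rather than a given (synthetic data, numpy only): excess kurtosis is roughly zero for a Gaussian capture and jumps for impulsive line noise.

    import numpy as np

    rng = np.random.default_rng(0)

    def excess_kurtosis(samples):
        """~0 for Gaussian data; large positive values mean heavy-tailed, impulsive noise."""
        z = (samples - samples.mean()) / samples.std()
        return (z ** 4).mean() - 3.0

    gaussian_noise = rng.normal(0, 1, 100_000)
    # Crude stand-in for impulsive line noise: Gaussian background plus rare large spikes.
    spikes = (rng.random(100_000) < 0.01) * rng.normal(0, 10, 100_000)
    impulsive_noise = gaussian_noise + spikes

    print("Gaussian capture :", round(excess_kurtosis(gaussian_noise), 2))
    print("Impulsive capture:", round(excess_kurtosis(impulsive_noise), 2))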
A simple example: leveraging quantum effects has created a new field, quantum Shannon theory, which has better bounds. Under Shannon's theory and his understanding of information, quantum key distribution would not be possible, yet it is, and companies have offered commercial versions for 20+ years. I would not be surprised if more and more pieces of physics not currently used for communication get used in the future, changing the "limits" again and again.
Before MIMO transmission, a single channel over the same medium had a Shannon limit. MIMO surpassed it. Many channels, especially where light is used, allow stacking more and more information into the same medium, subject to the technology's ability to decode it, which is an engineering feat, not a physical one.
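For reference, the usual open-loop MIMO figure (a sketch assuming an i.i.d. Rayleigh channel, parameters mine): C = log2 det(I + (SNR/Nt) * H * H^H) grows roughly with min(Nt, Nr), versus the single-antenna log2(1 + SNR) over the same band.

    import numpy as np

    rng = np.random.default_rng(1)

    def mimo_capacity(nt, nr, snr_linear):
        """Open-loop MIMO capacity (bit/s/Hz) for one random Rayleigh channel draw."""
        h = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
        m = np.eye(nr) + (snr_linear / nt) * (h @ h.conj().T)
        return np.log2(np.linalg.det(m).real)

    snr = 10 ** (20 / 10)  # 20 dB, assumed
    print("1x1:", round(mimo_capacity(1, 1, snr), 1), "bit/s/Hz")
    print("4x4:", round(mimo_capacity(4, 4, snr), 1), "bit/s/Hz")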
The world of information theory has moved past Shannon, just like orbital mechanics moved past Newton. Both Newton and Shannon have uses, but they're not valid for modern techniques.
Two of many such changes are compressed sensing, a major discovery that routinely beats Shannon-Nyquist sampling theory, and the discovery of what is called generalized sampling, developed around 2000-2010 (and still making its way through image processing). Both are fundamental theory changes that allowed breaking previous "limits".
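A minimal compressed-sensing sketch (all parameters invented): recover a k-sparse length-n signal from m << n random projections using orthogonal matching pursuit, i.e. from far fewer measurements than a classical Nyquist-rate argument over the full band would call for.

    import numpy as np

    rng = np.random.default_rng(2)

    n, m, k = 256, 64, 5  # signal length, measurements, sparsity (assumed)
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

    A = rng.normal(0, 1, (m, n)) / np.sqrt(m)  # random sensing matrix
    y = A @ x                                  # only m = 64 measurements of a 256-sample signal

    # Orthogonal matching pursuit: greedily pick the column most correlated with
    # the residual, then re-fit the chosen support by least squares.
    support, residual = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef

    x_hat = np.zeros(n)
    x_hat[support] = coef
    print("max reconstruction error:", np.abs(x_hat - x).max())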
So please read the paper, and if you claim there is still some inherent limit, publish the paper countering the one above.
Bandwidth does not increase based on packing methods. Bandwidth refers to analog signal bandwidth, something we can measure - not data rate. MIMO gets around the Shannon limit by creating more channels, like putting more fibers in a bundle.
MIMO added no "new fibers" - it's more throughput with no change in medium, a consequence of photons not cross-interacting. Electrons do cross-interact, and end up with different physical properties. You're proving my point: the physics has a major impact in practice. Shannon says zero about this, since his is a simple, outdated model.
Bandwidth is defined by the signal type. A signal is defined via the technology and model of communication. Change the tech or model, and you get different Shannon limits.
Take some time and read and understand the papers I gave you. Your claims are contradicted by each of those papers (and hundreds more).
How can you honestly ignore all those well cited papers that state exactly what I said? Post some evidence countering those papers or there's no more point replying to your opinions.