Channel Capacity


Wireless Communication, Elec 534, Set I, September 9, 2007 : 

Behnaam Aazhang

The Course : 

- Light homework
- Team project
- Individual paper presentations (mid October)
- Team project presentations (early December)

Multiuser Network : 

A network of multiple nodes, each with information to transmit.

Outline : 

- Transmission over simple channels
  - Information theoretic approach
  - Fundamental limits
  - Approaching capacity
- Fading channel models
  - Multipath
  - Rayleigh
  - Rician

Outline : 

- Transmission over fading channels
  - Information theoretic approach
  - Fundamental limits
  - Approaching achievable rates
- Communication with “additional” dimensions
  - Multiple-input multiple-output (MIMO): achievable rates, transmission techniques
  - User cooperation: achievable rates, transmission techniques

Outline : 

- Wireless networks
- Cellular radios
- Multiple access
- Achievable rate region
- Multiuser detection
- Random access

Why Information Theory? : 

- Information is modeled as random
- Information is quantified
- Transmission of information is model driven
- Reliability is measured
- Rate is established

Information : 

- Entropy quantifies information: higher entropy (more randomness) means higher information content
- Random variables may be discrete or continuous (standard definitions below)
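For reference, the standard definitions (not spelled out on the slide): entropy of a discrete random variable and differential entropy of a continuous one,

H(X) = -\sum_x p(x) \log_2 p(x), \qquad h(X) = -\int f_X(x) \log_2 f_X(x) \, dx.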

Communication : 

- Information transmission through a channel
- Mutual information measures the useful information delivered
- Noise contributes useless information
- The maximum useful information is the capacity

Wireless : 

- Information transmission through a channel
- Useful information versus noise (useless information)
- Maximum useful information
- Additional impairments: interference and randomness due to the channel

Multiuser Network : 

A network of multiple nodes, each with information to transmit.

References : 

- C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, 1949.
- T. M. Cover and J. A. Thomas, Elements of Information Theory, 1991.
- R. Gallager, Information Theory and Reliable Communication, 1968.
- J. Proakis, Digital Communications, 4th edition.
- D. Tse and P. Viswanath, Fundamentals of Wireless Communication, 2005.
- A. Goldsmith, Wireless Communications, Cambridge University Press, 2005.

References : 

- E. Biglieri, J. Proakis, and S. Shamai, “Fading channels: Information-theoretic and communications aspects,” IEEE Trans. Inform. Theory, Oct. 1998.
- A. Goldsmith and P. Varaiya, “Capacity of fading channels with channel side information,” IEEE Trans. Inform. Theory, 1997.
- I. E. Telatar, “Capacity of multi-antenna Gaussian channels,” European Trans. Telecommun., 1999.
- A. Sendonaris, E. Erkip, and B. Aazhang, “User cooperation diversity, Part I: System description,” IEEE Trans. Commun., Nov. 2003.
- A. Sendonaris, E. Erkip, and B. Aazhang, “User cooperation diversity, Part II: Implementation aspects and performance analysis,” IEEE Trans. Commun., Nov. 2003.
- J. N. Laneman, D. N. C. Tse, and G. W. Wornell, “Cooperative diversity in wireless networks: Efficient protocols and outage behavior,” IEEE Trans. Inform. Theory, Dec. 2004.
- M. A. Khojastepour, A. Sabharwal, and B. Aazhang, “On capacity of Gaussian ‘cheap’ relay channel,” in Proc. IEEE GLOBECOM, Dec. 2003.

Reading for Set 1 : 

- Tse and Viswanath: Sections 5.1-5.3 and 3.1; Appendices A, B.1-B.5
- Goldsmith: Chapters 1, 4.1, 5; Appendices A, B, C

Single Link AWGN Channel : 

Model:

r(t) = b(t) + n(t)

where r(t) is the baseband received signal, b(t) is the information-bearing signal, and n(t) is additive noise. The signal b(t) is assumed to be band-limited to W, the time period is assumed to be T, and the dimension of the signal space is N = 2WT.

Signal Dimensions : 

- A signal with bandwidth W is sampled at the Nyquist rate: W complex (independent) samples per second.
- Each complex sample is one dimension or degree of freedom.
- A signal of duration T and bandwidth W therefore has 2WT real degrees of freedom and can be represented in 2WT real dimensions (a worked example follows).
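As a quick sanity check with assumed example values: a signal with W = 1 MHz observed over T = 1 ms has N = 2WT = 2 × 10^6 × 10^-3 = 2000 real dimensions, i.e., 1000 complex samples.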

Signals in Time Domain : 

- Sampled at the Nyquist rate, with samples spaced 1/W apart.
- Example: three independent samples per second means three degrees of freedom.

[Figure: voltage versus time over 1 second, with samples spaced 1/W apart]

Signal in Frequency Domain : 

Bandwidth W centered at the carrier frequency fc.

[Figure: power spectrum occupying bandwidth W around the carrier frequency fc]

Baseband Signal in Frequency Domain : 

The passband signal is down-converted to baseband; the bandwidth remains W.

[Figure: power spectrum of the down-converted signal, occupying bandwidth W around zero]

Sampling : 

The baseband signal is sampled at rate W; the sinc function is an example of an expansion basis (the reconstruction formula is below).
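The reconstruction formula the slide alludes to, in its standard form under the rate-W sampling convention used here:

b(t) = \sum_{n} b\!\left(\frac{n}{W}\right) \mathrm{sinc}(Wt - n), \qquad \mathrm{sinc}(x) = \frac{\sin(\pi x)}{\pi x}.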

Model : 

There are N orthonormal basis functions f_i(t) representing the information signal space; the sinc pulses above are one example. The discrete-time version represents the signal by its N coefficients, b_i = \int b(t) f_i(t) \, dt.

Noise : 

- Assumed to be a Gaussian process: zero mean and wide-sense stationary
- Flat power spectral density with height N0/2
- Passed through a filter with bandwidth W
- Samples taken at rate W are Gaussian and independent

Noise : 

The projections n_i of the noise onto the orthonormal bases f_i(t) are zero-mean Gaussian with variance N0/2 (a short derivation follows).
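The derivation behind that variance, reconstructed under the standard white-noise assumption E[n(t)n(s)] = (N_0/2) \delta(t - s):

n_i = \int n(t) f_i(t) \, dt, \qquad E[n_i n_j] = \int\!\!\int E[n(t)n(s)] f_i(t) f_j(s) \, dt \, ds = \frac{N_0}{2} \int f_i(t) f_j(t) \, dt = \frac{N_0}{2} \delta_{ij}.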

Noise : 

- The samples of noise are Gaussian and independent.
- Given the information samples, the received samples are also Gaussian.

Model : 

The discrete-time formulation can come from sampling the received signal at the Nyquist rate W. The final model is

r_i = b_i + n_i, i = 1, ..., N.

The discrete-time model could have come either from projection onto the basis or from simple sampling (a small simulation sketch follows).
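A minimal simulation sketch of this discrete-time model; this is my own illustration, not from the slides, and the noise variance N0/2 per real dimension is the convention assumed above:

import numpy as np

rng = np.random.default_rng(0)

N = 2000    # real dimensions, N = 2WT
P = 1.0     # average signal power per dimension (assumed)
N0 = 0.5    # noise spectral density (assumed); variance N0/2 per dimension

b = rng.normal(0.0, np.sqrt(P), size=N)       # Gaussian information samples
n = rng.normal(0.0, np.sqrt(N0 / 2), size=N)  # independent AWGN samples
r = b + n                                     # received samples: r_i = b_i + n_i

snr = P / (N0 / 2)
print(f"empirical SNR: {np.var(b) / np.var(n):.2f} (nominal {snr:.2f})")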

Statistical Model : 

The key part of the model is the conditional density of the received samples given the information samples. The discrete-time received signals are conditionally independent since the noise is assumed white, so the density factors across dimensions (written out below).
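The factored density, written out under the N0/2 variance convention assumed earlier:

f(\mathbf{r} \mid \mathbf{b}) = \prod_{i=1}^{N} \frac{1}{\sqrt{\pi N_0}} \exp\!\left(-\frac{(r_i - b_i)^2}{N_0}\right).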

Entropy : 

Differential entropy:

h(X) = -\int f_X(x) \log f_X(x) \, dx

Differential conditional entropy:

h(X \mid Y) = -\int\!\!\int f_{X,Y}(x, y) \log f_{X \mid Y}(x \mid y) \, dx \, dy

Example : 

A Gaussian random variable with mean \mu and variance \sigma^2 has differential entropy

h(X) = \frac{1}{2} \log_2 (2 \pi e \sigma^2) bits.

If X is complex Gaussian, it is

h(X) = \log_2 (\pi e \sigma^2) bits.

Among all random variables with fixed variance, the Gaussian has the largest differential entropy (a numerical check follows; the proof is on the next slides).
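A quick numerical check of the real-valued formula; this is my own sketch, and the histogram estimator is only a rough approximation of differential entropy:

import numpy as np

rng = np.random.default_rng(1)
sigma2 = 2.0
x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)

# Histogram-based estimate of h(X) = -E[log2 f_X(X)]
density, edges = np.histogram(x, bins=200, density=True)
widths = np.diff(edges)
mask = density > 0
h_est = -np.sum(density[mask] * np.log2(density[mask]) * widths[mask])

h_theory = 0.5 * np.log2(2 * np.pi * np.e * sigma2)
print(f"estimated h(X): {h_est:.3f} bits, theory: {h_theory:.3f} bits")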

Proof : 

Consider two zero-mean random variables X and Y with the same variance, and assume X is Gaussian:

f_X(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-x^2 / (2 \sigma^2)}, \qquad E[X^2] = E[Y^2] = \sigma^2.

Proof : 

Apply the Kullback-Leibler distance between f_Y and f_X, which is nonnegative due to Gibbs’ inequality (the steps are spelled out below).
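The steps, reconstructed in the standard way; the key observation is that \log f_X is quadratic in x, so its expectation depends only on the second moment, which X and Y share:

0 \le D(f_Y \| f_X) = \int f_Y \log \frac{f_Y}{f_X} = -h(Y) - \int f_Y \log f_X = -h(Y) - \int f_X \log f_X = -h(Y) + h(X),

hence h(Y) \le h(X): the Gaussian maximizes differential entropy for a fixed variance.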

Gibbs’ Inequality : 

The KL distance is nonnegative:

D(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx \ge 0,

with equality if and only if p = q (a one-line proof follows).
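A one-line proof, via Jensen’s inequality applied to the concave logarithm:

-D(p \| q) = \int p \log \frac{q}{p} \le \log \int p \cdot \frac{q}{p} = \log \int q = \log 1 = 0.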

Capacity : 

Formally defined by Shannon as

C = \max_{f_b} I(b; r),

where the mutual information is

I(b; r) = h(r) - h(r \mid b).

Capacity : 

The capacity is the maximum reliable rate of information through the channel with this model. In our model the maximization is over input distributions, subject to the average power constraint E[b_i^2] \le P.

Mutual Information : 

- Mutual information measures the information flow through the channel
- Useful information gets through; noise contributes useless information
- The maximum useful information flow is the capacity

Capacity : 

In this model, the maximum is achieved when the information vector has mutually independent, Gaussian distributed elements.

AWGN Channel Capacity : 

The average power of the information signal is P and the noise variance per dimension is \sigma^2 = N_0/2, giving a capacity of

C = \frac{1}{2} \log_2 \left( 1 + \frac{P}{\sigma^2} \right) bits per real dimension.

AWGN Capacity : 

The original Shannon formula per unit time is

C = W \log_2 \left( 1 + \frac{P}{N_0 W} \right) bits per second.

An alternate form uses the energy per bit E_b = P / C, giving the spectral efficiency relation C / W = \log_2 (1 + (E_b / N_0)(C / W)) (a numerical illustration follows).
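A small numerical illustration, my own sketch with assumed example values for W and N0: it computes the Shannon formula and the implied Eb/N0, which approaches the famous low-SNR limit of ln 2, about -1.59 dB, as the power shrinks:

import numpy as np

def awgn_capacity(P, N0, W):
    """Shannon capacity C = W log2(1 + P / (N0 W)) in bits/s."""
    return W * np.log2(1.0 + P / (N0 * W))

W = 1e6    # bandwidth in Hz (assumed example value)
N0 = 1e-9  # noise power spectral density in W/Hz (assumed example value)

for P in (1e-6, 1e-4, 1e-2):
    C = awgn_capacity(P, N0, W)
    eb_n0_db = 10 * np.log10((P / C) / N0)  # Eb/N0 when operating at capacity
    print(f"P = {P:g} W -> C = {C:,.0f} bit/s, Eb/N0 = {eb_n0_db:+.2f} dB")

# As P -> 0, the printed Eb/N0 approaches 10*log10(ln 2), about -1.59 dB:
# the minimum energy per bit for reliable communication.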

Achievable Rate and Converse : 

- Construct a codebook in the N-dimensional space
- The law of large numbers concentrates the received codewords on spheres
- Sphere packing yields the converse

Sphere Packing : 

- The number of non-overlapping spheres is the ratio of volumes (computed below)
- As N grows, the probability of codeword error vanishes
- Higher rates are not possible without overlap
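The volume-ratio computation, reconstructed in its standard form: received vectors concentrate on a sphere of radius \sqrt{N(P + \sigma^2)} while noise spheres have radius \sqrt{N \sigma^2}, so

M = \frac{\left( \sqrt{N(P + \sigma^2)} \right)^N}{\left( \sqrt{N \sigma^2} \right)^N} = \left( 1 + \frac{P}{\sigma^2} \right)^{N/2}, \qquad R = \frac{1}{N} \log_2 M = \frac{1}{2} \log_2 \left( 1 + \frac{P}{\sigma^2} \right).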

Achievable Rate and Converse : 

Construct a codebook carrying NR bits over N channel uses, i.e., 2^{NR} codewords.

Achieving Capacity : 

- The information vector should have mutually independent elements with Gaussian distribution
- The dimension N should be large, at a cost in complexity
- The source always has information to transmit (full buffer)
- The channel is always available: no contention for access, point-to-point link

Achieving Capacity : 

- An accurate model is required:
  - Statistical: the noise
  - Deterministic: the linear channel
- An accurate signal model at the receiver: timing and synchronization

Approaching Capacity : 

- High SNR: coded modulation with large constellation size, or large constellations with binary codes
- Low SNR: binary modulation with turbo coding or LDPC coding

Constellations and Coding : 

[Figure-only slide: example constellations and coding schemes]