The information dimension of random variables was introduced by Alfréd Rényi in 1959, but only recently has it been shown to be relevant in various areas of information theory. For example, in 2010, Wu and Verdú showed that information dimension is a fundamental limit for lossless analog compression. More recently, Geiger and Koch generalized information dimension from random variables to stochastic processes and established connections to the rate-distortion dimension and to the bandwidth of the process. Specifically, if the process is scalar and Gaussian, then its information dimension equals the Lebesgue measure of the support of the process' power spectral density. This suggests that information dimension plays a fundamental role in sampling theory. The first part of the talk reviews the definition and basic properties of entropy and information dimension for random variables. The second part treats the information dimension of stochastic processes and sketches the proof that information dimension is linked to the process' bandwidth.
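
As a point of reference, here is a minimal LaTeX sketch of the two quantities mentioned above: Rényi's definition of information dimension via quantized entropies, and the Gaussian-process statement. It assumes the usual discrete-time normalization of the spectrum to [-1/2, 1/2); the symbols \langle X \rangle_m, S_X, and \lambda are notation introduced here for illustration, not taken from the talk itself.

% Rényi information dimension of a real-valued random variable X:
% quantize X to resolution 1/m and track the growth of the entropy.
\[
  \langle X \rangle_m = \frac{\lfloor m X \rfloor}{m},
  \qquad
  d(X) = \lim_{m \to \infty} \frac{H\!\left(\langle X \rangle_m\right)}{\log m}
  \quad \text{(when the limit exists).}
\]
% For a scalar stationary Gaussian process with power spectral
% density S_X, the information dimension equals the Lebesgue
% measure \lambda of the support of S_X:
\[
  d\bigl(\{X_t\}\bigr)
  = \lambda\Bigl(\bigl\{ f \in [-\tfrac{1}{2}, \tfrac{1}{2}) : S_X(f) > 0 \bigr\}\Bigr).
\]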