Several factors limit the resolution of an optical imaging system. The first is the diffraction limit, set by the F-number of the optics, which expresses the wave nature of the propagating radiation, i.e. the diffraction of the propagated wave: the smaller the F-number of a lens, the larger the angular range of optical rays it can collect, and thus the smaller the resolvable separation between spatial features. The second is the geometric resolution limit, determined both by the pixel pitch of the detector (which sets the Nyquist sampling limit and therefore the spatial frequency above which aliasing is generated) and by the non-ideal spatial sampling caused by the spatial averaging performed by each pixel, in contrast to ideal sampling with Dirac delta functions. The third limitation is related to the dynamic range of the detection array and the number of quantization bits allocated per sample. In this presentation I will show a set of new approaches that overcome the above-mentioned diffraction and geometric limitations by converting spatial degrees of freedom to the time, polarization, space, and wavelength domains. I intend to present techniques suitable for near-field imaging as well as for remote sensing applications. I will conclude my talk by showing a new approach capable of monitoring, from a distance, the heart beats, blood pulse pressure, and blood glucose level of a patient.
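The three limits described above can be illustrated with standard textbook relations: the Rayleigh criterion for the diffraction limit, the Nyquist frequency set by the pixel pitch, the sinc roll-off of a finite pixel aperture, and the ~6.02 dB-per-bit rule for quantization SNR. The sketch below is illustrative only; the function names and the example numbers (550 nm light, an F/2 lens, 5 µm pixels, a 12-bit converter) are my own assumptions, not the author's systems.

```python
import math

def diffraction_limit_um(wavelength_um, f_number):
    """Rayleigh resolution limit of diffraction-limited optics:
    a smaller F-number resolves finer spatial detail."""
    return 1.22 * wavelength_um * f_number

def nyquist_cutoff_cyc_per_mm(pixel_pitch_um):
    """Nyquist spatial frequency set by the detector pixel pitch;
    spatial frequencies above this value alias."""
    return 1000.0 / (2.0 * pixel_pitch_um)

def pixel_aperture_mtf(freq_cyc_per_mm, pixel_pitch_um):
    """MTF of the finite pixel aperture (area averaging rather than
    ideal Dirac-delta sampling): a |sinc| roll-off."""
    x = freq_cyc_per_mm * pixel_pitch_um / 1000.0
    if x == 0.0:
        return 1.0
    return abs(math.sin(math.pi * x) / (math.pi * x))

def quantization_snr_db(bits):
    """Ideal SNR of a full-scale sinusoid through a b-bit quantizer."""
    return 6.02 * bits + 1.76

# Example: 0.55 um light, F/2 lens, 5 um pixels, 12-bit quantization
print(diffraction_limit_um(0.55, 2.0))   # ~1.34 um resolvable separation
print(nyquist_cutoff_cyc_per_mm(5.0))    # 100 cycles/mm
print(pixel_aperture_mtf(100.0, 5.0))    # ~0.64 MTF at Nyquist
print(quantization_snr_db(12))           # ~74 dB
```

Note how the pixel-aperture MTF is already down to about 64% at the Nyquist frequency itself, which is the "non-ideal spatial sampling" penalty the abstract refers to.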