The notion of directed information is introduced for stochastic processes in continuous time. Properties and operational interpretations are presented for this notion, which generalizes mutual information between stochastic processes in the same way that Massey's original notion of directed information generalizes Shannon's mutual information in the discrete-time setting. As a key application, Duncan's theorem is generalized to estimation problems in which the evolution of the target signal is affected by the past channel noise, and the causal minimum mean squared error in estimating the target signal is related to the directed information from the target signal to its observation corrupted by additive white Gaussian noise. An analogous relationship holds for the Poisson channel. The role of directed information in characterizing the fundamental limit on reliable communication for a wide class of continuous-time channels with feedback is also discussed.
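To make the generalization of Duncan's theorem concrete, the following sketch states the classical result and its directed-information counterpart; the notation (signal-to-noise level $\mathrm{snr}$, signal $X$, observation $Y$, standard Brownian motion $W$) is illustrative and not taken from the abstract itself.

```latex
% Continuous-time AWGN observation model on [0, T]:
%   dY_t = \sqrt{\mathrm{snr}}\, X_t\, dt + dW_t .
%
% Duncan's theorem (X independent of the noise W):
%   I(X^T; Y^T)
%     = \frac{\mathrm{snr}}{2}
%       \int_0^T \mathbb{E}\!\left[\big(X_t - \mathbb{E}[X_t \mid Y^t]\big)^2\right] dt ,
% i.e., mutual information equals half the signal-to-noise level times the
% time-integrated causal minimum mean squared error (MMSE).
%
% Generalization discussed in the text: when the evolution of X may depend
% on the past channel noise (e.g., through feedback), mutual information is
% replaced by directed information from X to Y:
%   I(X^T \to Y^T)
%     = \frac{\mathrm{snr}}{2}
%       \int_0^T \mathbb{E}\!\left[\big(X_t - \mathbb{E}[X_t \mid Y^t]\big)^2\right] dt .
```

When there is no feedback, the directed information $I(X^T \to Y^T)$ reduces to the mutual information $I(X^T; Y^T)$, recovering Duncan's original theorem as a special case.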