With recent advances in communication and computation technologies, software is now routinely integrated into sensing, actuation, and control. This has led to a new branch of study called Cyber-Physical Systems (CPS). Avionics, automotive systems, the power grid, medical devices, and robotics are a few examples of such systems. As these systems are part of critical infrastructure, it is very important to ensure that they function reliably, without failures. While testing improves confidence in these systems, it does not establish the absence of scenarios in which the system fails. The focus of this thesis is on formal verification techniques for cyber-physical systems that prove the absence of errors in a given system. In particular, this thesis focuses on {\em dynamic analysis} techniques that bridge the gap between testing and verification.

This thesis uses the framework of hybrid input/output automata for modeling CPS. Formal verification of hybrid automata is undecidable in general; because of this undecidability result, no algorithm is guaranteed to terminate on all models. This thesis therefore focuses on developing verification heuristics that exploit sample executions of the system. The dynamic analysis techniques proposed in this thesis are designed to be sound, i.e., they always return the right answer, and relatively complete, i.e., they terminate when the system satisfies certain special conditions. For undecidable problems, such theoretical guarantees are the strongest that can be expected of any automatic procedure. This thesis focuses on safety properties, which require that nothing bad ever happens. In particular, we consider invariant and temporal precedence properties; temporal precedence properties require that the temporal ordering of certain events in every execution satisfies a given specification.

This thesis introduces the notion of a discrepancy function, which aids the dynamic analysis of CPS. Informally, a discrepancy function captures the convergence or divergence of the continuous behaviors of a CPS. In control theory, several proof certificates, such as contraction metrics and incremental stability, have been proposed to capture the convergence and divergence of solutions of ordinary differential equations (ODEs). This thesis establishes that discrepancy functions generalize such proof certificates. Further, this thesis proposes a new technique to compute discrepancy functions for continuous systems with linear ODEs from sample executions.

One of the main contributions of this thesis is a technique for computing an over-approximation of the set of reachable states using sample executions and discrepancy functions. Building on this reachability computation, the thesis proposes a safety verification algorithm that is proved to be sound and relatively complete. The technique is implemented in a tool called Compare-Execute-Check-Engine (C2E2), and experimental results show that it is scalable.
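To make the role of a discrepancy function concrete, the convergence/divergence condition it captures can be stated roughly as follows (the notation here is illustrative; the precise definition appears in the body of the thesis): writing $\xi(x,t)$ for the state reached at time $t$ from initial state $x$, a discrepancy function is a function $\beta$ satisfying
\[
\|\xi(x_1,t) - \xi(x_2,t)\| \;\le\; \beta(\|x_1 - x_2\|,\, t),
\qquad \mbox{with } \beta(\delta, t) \to 0 \mbox{ as } \delta \to 0,
\]
so that bloating a single simulated execution by $\beta$ yields a tube containing every execution that starts nearby.

The sketch below illustrates, under simplifying assumptions, how such a bound can drive simulation-based safety checking: simulate from the center of each ball covering the initial set, bloat the simulation by the discrepancy bound, check the resulting tube against the unsafe set, and refine the cover when the answer is inconclusive. The dynamics, the discrepancy bound, the set representations, and all names are placeholders chosen for this sketch; this is not the C2E2 implementation.
\begin{verbatim}
import math

def simulate(x0, horizon, dt):
    """Forward-Euler simulation of x' = -x (a stand-in for the real dynamics)."""
    trace, x, t = [], list(x0), 0.0
    while t <= horizon:
        trace.append((t, tuple(x)))
        x = [xi + dt * (-xi) for xi in x]
        t += dt
    return trace

def discrepancy(delta, t):
    """For x' = -x, |xi(x1,t) - xi(x2,t)| = |x1 - x2| * exp(-t) exactly."""
    return delta * math.exp(-t)

def dist_to_box(x, box):
    """Euclidean distance from point x to an axis-aligned box [(lo, hi), ...]."""
    return math.sqrt(sum(max(lo - xi, 0.0, xi - hi) ** 2
                         for xi, (lo, hi) in zip(x, box)))

def verify(cover, unsafe_box, horizon=5.0, dt=0.01, max_refine=8):
    """cover: list of (center, radius) balls covering the initial set."""
    for _ in range(max_refine):
        undecided = []
        for center, radius in cover:
            # Bloat the single simulation by the discrepancy bound: the tube
            # over-approximates every trajectory starting within `radius`.
            tube = [(x, discrepancy(radius, t))
                    for t, x in simulate(center, horizon, dt)]
            if all(dist_to_box(x, unsafe_box) > r for x, r in tube):
                continue                       # this part of the initial set is safe
            if any(dist_to_box(x, unsafe_box) == 0.0 for x, _ in tube):
                return "UNSAFE"                # the simulated execution itself is unsafe
            undecided.append((center, radius)) # inconclusive: approximation too coarse
        if not undecided:
            return "SAFE"
        # Refine inconclusive covers; a real tool would also split the centers
        # so the smaller balls still cover the undecided region.
        cover = [(c, r / 2.0) for c, r in undecided]
    return "UNKNOWN"

print(verify([((1.0, 1.0), 0.2)], unsafe_box=[(2.0, 3.0), (2.0, 3.0)]))
\end{verbatim}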
To demonstrate the applicability of the presented algorithms, two challenging case studies are analyzed as part of this thesis. The first case study concerns an alerting mechanism for parallel aircraft landing; to carry it out, the dynamic analysis technique for invariant verification is extended to handle temporal properties. The second case study concerns verifying key specifications of a powertrain control system; new algorithms for computing discrepancy functions were implemented in C2E2 to perform it. Both case studies demonstrate that dynamic analysis gives promising results and can be applied to realistic CPS.

For distributed CPS implementations, where message passing and clock skews between agents make formal verification difficult to scale, this thesis presents a dynamic analysis algorithm for inferring global predicates. Such global predicates include assertions about the physical state and the software state of all the agents involved in a distributed CPS. The algorithm is applied to coordinated robotic maneuvers for inferring safety and detecting deadlock.
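As an illustration only (this specific predicate is hypothetical and not taken from the thesis's case studies), a global safety predicate for $N$ coordinating robots might require pairwise separation at all times,
\[
\forall t,\ \forall i \neq j:\quad \|x_i(t) - x_j(t)\| \;\ge\; d_{\mathrm{safe}},
\]
where $x_i(t)$ is the physical state of agent $i$ and $d_{\mathrm{safe}}$ is a required separation distance. Checking such a predicate is difficult in the distributed setting precisely because each agent logs its own state against its own, possibly skewed, clock.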