There has been considerable debate about the extent to which future costs should be included in cost-effectiveness analyses of health technologies. In this article, we summarize the theoretical debates and empirical research in this area and highlight the conclusions that can be drawn for current practice. For future related and future unrelated medical costs, the literature suggests that inclusion is required to obtain optimal outcomes from available resources. This conclusion does not depend on the perspective adopted by the decision maker. Future non-medical costs are relevant only when a societal perspective is adopted; they should be included if the benefits of non-medical consumption and production are also included in the evaluation. Whether this is currently the case remains unclear, given that benefits are typically quantified in quality-adjusted life-years and only limited research has examined the extent to which these (implicitly) capture benefits beyond health. Empirical research has shown that the impact of including future costs can be large and that estimating such costs is feasible. In practice, however, future unrelated medical costs and future unrelated non-medical consumption costs are typically excluded from economic evaluations, and this exclusion is explicitly prescribed in some pharmacoeconomic guidelines. Further research is warranted on the development and improvement of methods for estimating future costs. Standardization of methods is needed to enhance the practical applicability of including such costs for analysts and the comparability of outcomes across studies. For future non-medical costs, further research is also needed on the extent to which the benefits related to this spending are captured in the measurement and valuation of health benefits, and on how to broaden the scope of the evaluation if they are not sufficiently captured.
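The claim that including future costs can have a large impact can be illustrated with a minimal sketch of an incremental cost-effectiveness ratio (ICER) computed with and without future unrelated medical costs. All figures and the `icer` helper below are hypothetical, chosen only to show the mechanics, not drawn from the article:

```python
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

# Hypothetical life-extending technology: 0.5 QALYs gained over 2 added
# life-years, at an incremental intervention cost of 10,000.
delta_qaly = 0.5
delta_intervention_cost = 10_000.0

# Future unrelated medical costs incurred during the added life-years
# (hypothetical): 2 life-years at 4,000 per year of unrelated care.
future_unrelated_cost = 2 * 4_000.0

icer_excluding = icer(delta_intervention_cost, delta_qaly)
icer_including = icer(delta_intervention_cost + future_unrelated_cost, delta_qaly)

print(icer_excluding)  # 20000.0 per QALY gained
print(icer_including)  # 36000.0 per QALY gained
```

Under these invented numbers, including future unrelated medical costs raises the ICER from 20,000 to 36,000 per QALY, which could move a technology across a reimbursement threshold; this is the kind of shift the empirical literature describes.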