The authors present a novel approach to the modeling of multivariable time series. The model class consists of linear systems, i.e., the solution sets of linear difference equations. Given a bound on the model order, the aim is to determine a model with minimal l2 distance from the observed time series. Necessary conditions for optimality are described in terms of state-space representations. These conditions motivate a relatively simple iterative algorithm for the nonlinear problem of identifying optimal models. Attractive features of the proposed method are that the model error is measured globally, that it applies to multi-input, multi-output systems, and that no prior distinction between inputs and outputs is required. The authors illustrate the approach by means of numerical simulations.
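
For intuition only, the following sketch (Python with NumPy) illustrates the kind of nonlinear fitting problem described above: a single difference-equation model of bounded lag is fitted to a two-variable series by iteratively reducing the global l2 distance between the data and the nearest trajectory of the model. The relaxation-type iteration and all names below (difference_operator, data_matrix, fit) are illustrative assumptions, not the authors' state-space-based algorithm.

# A minimal, hypothetical sketch of l2-optimal approximate modeling by a
# linear difference equation of bounded lag L; not the authors' algorithm.
import numpy as np

def difference_operator(r, N, L, nvar):
    """Banded matrix T(r) with (T(r) vec(w))_t = R_0 w_t + ... + R_L w_{t+L}."""
    R = r.reshape(L + 1, nvar)
    T = np.zeros((N - L, N * nvar))
    for t in range(N - L):
        for k in range(L + 1):
            T[t, (t + k) * nvar:(t + k + 1) * nvar] = R[k]
    return T

def data_matrix(w, L):
    """Matrix H(w) whose rows are sliding windows of w, so H(w) r = T(r) vec(w)."""
    N, nvar = w.shape
    return np.array([w[t:t + L + 1].ravel() for t in range(N - L)])

def fit(w, L=2, iters=20):
    """Relaxation-type iteration for minimizing, over unit-norm coefficients r,
    the squared l2 distance from w to the solution set of the difference equation."""
    N, nvar = w.shape
    H = data_matrix(w, L)
    # initial model: smallest right singular vector of the data matrix
    r = np.linalg.svd(H)[2][-1]
    wv = w.ravel()
    for _ in range(iters):
        T = difference_operator(r, N, L, nvar)
        G = T @ T.T                      # Gram matrix of the model operator
        # squared misfit equals r' H' G(r)^{-1} H r; freeze G at the current
        # model and re-minimize the resulting quadratic form over unit-norm r
        M = H.T @ np.linalg.solve(G, H)
        r = np.linalg.eigh(M)[1][:, 0]   # eigenvector of the smallest eigenvalue
    # closest trajectory of the final model and its global l2 misfit
    T = difference_operator(r, N, L, nvar)
    w_hat = wv - T.T @ np.linalg.solve(T @ T.T, T @ wv)
    return r, w_hat.reshape(N, nvar), np.linalg.norm(wv - w_hat)

# example: a noisy oscillatory 2-variable series, with no input/output distinction
rng = np.random.default_rng(0)
t = np.arange(200)
w = np.column_stack([np.sin(0.2 * t), np.cos(0.2 * t)])
w += 0.05 * rng.standard_normal(w.shape)
r, w_hat, misfit = fit(w)
print("model coefficients:", np.round(r, 3), " global l2 misfit:", round(misfit, 3))

Note that the sketch treats both components of w symmetrically, i.e., the model error is measured globally over the whole observed trajectory and no variable is designated as input or output, which mirrors the attractive features claimed for the proposed method.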