A.H.G. Rinnooy Kan (Alexander)
http://repub.eur.nl/ppl/1791/
List of Publications
http://repub.eur.nl/
RePub, Erasmus University Repository

Sensitivity Analysis of List Scheduling Heuristics
http://repub.eur.nl/pub/2306/
Tue, 15 Nov 1994
A.W.J. Kolen, A.H.G. Rinnooy Kan, C.P.M. van Hoesel, A.P.M. Wagelmans
When jobs have to be processed on a set of identical parallel machines so as to minimize the makespan of the schedule, list scheduling rules form a popular class of heuristics. The order in which jobs appear on the list is assumed here to be determined by the relative size of their processing times; well-known special cases are the LPT rule and the SPT rule, in which the jobs are ordered according to non-increasing and non-decreasing processing time respectively.
When all processing times are exactly known, a given list scheduling rule will generate a unique assignment of jobs to machines. However, when there exists a priori uncertainty with respect to one of the processing times, then there will be, in general, several possibilities for the assignment that will be generated once the processing time is known. This number of possible assignments may be viewed as a measure of the sensitivity of the list scheduling rule that is applied.
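As an illustration (not taken from the paper), a list scheduling rule and a brute-force version of the sensitivity measure described above might be sketched as follows. The tie-breaking rule (lowest machine index first) and the finite grid of candidate values for the uncertain processing time are assumptions of this sketch; the paper derives analytical bounds rather than enumerating empirically:

```python
import heapq

def list_schedule(times, machines, order='LPT'):
    """Greedy list scheduling on identical parallel machines: take the
    jobs in list order and assign each to the currently least loaded
    machine (ties broken by lowest machine index)."""
    key = (lambda jp: -jp[1]) if order == 'LPT' else (lambda jp: jp[1])
    jobs = sorted(enumerate(times), key=key)
    loads = [(0, m) for m in range(machines)]   # (load, machine) pairs
    heapq.heapify(loads)
    assignment = [None] * len(times)
    for job, p in jobs:
        load, m = heapq.heappop(loads)          # least loaded machine
        assignment[job] = m
        heapq.heappush(loads, (load + p, m))
    return tuple(assignment), max(load for load, _ in loads)

def count_assignments(times, machines, job, candidates, order='LPT'):
    """Sensitivity probe: the number of distinct assignments the rule
    can still generate while one job's processing time is unknown,
    found by rerunning the rule for each candidate value of that time."""
    distinct = set()
    for p in candidates:
        trial = list(times)
        trial[job] = p
        assignment, _ = list_schedule(trial, machines, order)
        distinct.add(assignment)
    return len(distinct)
```

For example, on two machines LPT applied to processing times (3, 3, 2, 2, 2) yields makespan 7, whereas the optimum is 6 (3+3 against 2+2+2) — the classic example showing that LPT is a heuristic, not an exact method.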
We derive bounds on the maximum number of possible assignments for several list scheduling heuristics, and we also study the makespan associated with these assignments. In this way we obtain analytical support for the intuitively plausible notion that the sensitivity of a list scheduling rule increases with the quality of the schedule produced.

Decomposition in general mathematical programming
http://repub.eur.nl/pub/71619/
Tue, 01 Jun 1993
O.E. Flippo, A.H.G. Rinnooy Kan
In this paper a unifying framework is presented for the generalization of the decomposition methods originally developed by Benders (1962) and Dantzig and Wolfe (1960). These generalizations, called Variable Decomposition and Constraint Decomposition respectively, are based on the general duality theory developed by Tind and Wolsey. The framework presented is of a general nature since there are no restrictive conditions imposed on problem structure; moreover, inaccuracies and duality gaps that are encountered during computations are accounted for. The two decomposition methods are proven not to cycle if certain (fairly general) conditions are met. Furthermore, finite convergence can be ensured under the traditional finiteness conditions and asymptotic convergence can be guaranteed once certain continuity conditions are met. The obvious symmetry between both types of decomposition methods is explained by establishing a duality relation between the two, which extends a similar result in Linear Programming. A remaining asymmetry in the asymptotic convergence results is argued to be a direct consequence of a fundamental asymmetry that resides in the Tind-Wolsey duality theory. It can be shown that in case the latter asymmetry disappears, the former does too. Other decomposition techniques, such as Lagrangean Decomposition and Cross Decomposition, turn out to be captured by the general framework presented here as well.

Probabilistic analysis of algorithms for dual bin packing problems
http://repub.eur.nl/pub/11733/
Sat, 01 Jun 1991
J. Csirik, J.B.G. Frenk, G. Galambos, A.H.G. Rinnooy Kan
In the dual bin packing problem, the objective is to assign items of given size to the largest possible number of bins, subject to the constraint that the total size of the items assigned to any bin is at least equal to 1. We carry out a probabilistic analysis of this problem under the assumption that the items are drawn independently from the uniform distribution on [0, 1], and reveal the connection of this problem both to the classical bin packing problem and to renewal theory.

Concurrent stochastic methods for global optimization
http://repub.eur.nl/pub/71742/
Mon, 01 Jan 1990
D.R. Byrd, C.L. Dert, A.H.G. Rinnooy Kan, R.B. Schnabel
The global optimization problem, finding the lowest minimizer of a nonlinear function of several variables that has multiple local minimizers, appears well suited to concurrent computation. This paper presents a new parallel algorithm for the global optimization problem. The algorithm is a stochastic method related to the multi-level single-linkage methods of Rinnooy Kan and Timmer for sequential computers. Concurrency is achieved by partitioning the work of each of the three main parts of the algorithm — sampling, local minimization start point selection, and multiple local minimizations — among the processors. This parallelism is of a coarse grain type and is especially well suited to a local memory multiprocessing environment. The paper presents test results of a distributed implementation of this algorithm on a local area network of computer workstations. It also summarizes the theoretical properties of the algorithm.

A probabilistic analysis of the multiknapsack value function
http://repub.eur.nl/pub/72072/
Mon, 01 Jan 1990
M. Meanti, A.H.G. Rinnooy Kan, L. Stougie, C. Vercellis
The optimal solution value of the multiknapsack problem as a function of the knapsack capacities is studied under the assumption that the profit and weight coefficients are generated by an appropriate random mechanism. A strong asymptotic characterization is obtained, that yields a closed form expression for certain special cases.

Chapter IX Global optimization
http://repub.eur.nl/pub/21361/
Fri, 01 Dec 1989
A.H.G. Rinnooy Kan, G. Timmer
Order statistics and the linear assignment problem
http://repub.eur.nl/pub/11690/
Mon, 01 Jun 1987
J.B.G. Frenk, M. van Houweninge, A.H.G. Rinnooy Kan
Under mild conditions on the distribution function F, we analyze the asymptotic behavior in expectation of the smallest order statistic, both for the case that F is defined on (−∞, +∞) and for the case that F is defined on (0, ∞). These results yield asymptotic estimates of the expected optimal value of the linear assignment problem under the assumption that the cost coefficients are independent random variables with distribution function F.

A simulation tool for the performance evaluation of parallel branch and bound algorithms
http://repub.eur.nl/pub/1510/
Thu, 01 Jan 1987
A. de Bruin, A.H.G. Rinnooy Kan, H.W.J.M. Trienekens
Parallel computation offers a challenging opportunity to speed up the time-consuming enumerative procedures that are necessary to solve hard combinatorial problems. Theoretical analysis of such a parallel branch and bound algorithm is very hard, and empirical analysis is not straightforward, because the performance of a parallel algorithm cannot be evaluated simply by executing the algorithm on a few parallel systems. Among the difficulties encountered are the noise produced by other users on the system, the limited variation in parallelism (the number of processors in the system is strictly bounded), and the waste of resources involved: most of the time, the outcomes of all computations are already known and the only issue of interest is when these outcomes are produced.
We will describe a way to simulate the execution of parallel branch and bound algorithms on arbitrary parallel systems in such a way that the memory and CPU requirements are very reasonable. The use of simulation has only minor consequences for the formulation of the algorithm.

A probabilistic analysis of the next fit decreasing bin packing heuristic
http://repub.eur.nl/pub/11645/
Sat, 01 Nov 1986
J. Csirik, G. Galambos, J.B.G. Frenk, A.M. Frieze, A.H.G. Rinnooy Kan
A probabilistic analysis is presented of the Next Fit Decreasing bin packing heuristic, in which bins are opened to accommodate the items in order of decreasing size.

The rate of convergence to optimality of the LPT rule
http://repub.eur.nl/pub/11698/
Sun, 01 Jun 1986
J.B.G. Frenk, A.H.G. Rinnooy Kan
The LPT rule is a heuristic method to distribute jobs among identical machines so as to minimize the makespan of the resulting schedule. If the processing times of the jobs are assumed to be independent identically distributed random variables, then (under a mild condition on the distribution) the absolute error of this heuristic is known to converge to 0 almost surely. In this note we analyse the asymptotic behaviour of the absolute error and its first and higher moments to show that under quite general assumptions the speed of convergence is proportional to appropriate powers of (log log n)/n and 1/n. Thus, we simplify, strengthen and extend earlier results obtained for the uniform and exponential distribution.

A hierarchical scheduling problem with a well-solvable second stage
http://repub.eur.nl/pub/11692/
Wed, 01 Feb 1984
J.B.G. Frenk, A.H.G. Rinnooy Kan, L. Stougie
In the hierarchical scheduling model to be considered, the decision at the aggregate level to acquire a number of identical machines has to be based on probabilistic information about the jobs that have to be scheduled on these machines at the detailed level. The objective is to minimize the sum of the acquisition costs and the expected average completion time of the jobs. In contrast to previous models of this type, the second part of this objective function corresponds to a well-solvable scheduling problem that can be solved to optimality by a simple priority rule. A heuristic method to solve the entire problem is described, for which strong asymptotic optimality results can be established.

The asymptotic behaviour of a distributive sorting method
http://repub.eur.nl/pub/11689/
Thu, 01 Dec 1983
W.B. van Dam, J.B.G. Frenk, A.H.G. Rinnooy Kan
In the distributive sorting method of Dobosiewicz, both the interval between the minimum and the median of the numbers to be sorted and the interval between the median and the maximum are partitioned into n/2 subintervals of equal length; the procedure is then applied recursively on each subinterval containing more than three numbers. We refine and extend previous analyses of this method, e.g., by establishing its asymptotic linear behaviour under various probabilistic assumptions.
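As a rough illustration (not the paper's analysis, and using Python's `statistics.median` where a careful implementation would use a linear-time selection routine), the distributive sorting method just described might be sketched as:

```python
import statistics

def distributive_sort(xs):
    """Dobosiewicz-style distributive sort (sketch): split [min, median]
    and [median, max] into n/2 equal-length subintervals each, distribute
    the numbers over them, and recurse on any subinterval holding more
    than three numbers; small groups are sorted directly."""
    n = len(xs)
    if n <= 3:
        return sorted(xs)
    lo, hi = min(xs), max(xs)
    if lo == hi:                        # all numbers equal: already sorted
        return list(xs)
    mid = statistics.median(xs)         # linear-time selection in practice
    k = n // 2                          # subintervals per half
    out = []
    for a, b, vals in ((lo, mid, [x for x in xs if x <= mid]),
                       (mid, hi, [x for x in xs if x > mid])):
        if a == b or not vals:          # degenerate half
            out.extend(sorted(vals))
            continue
        buckets = [[] for _ in range(k)]
        width = (b - a) / k
        for x in vals:
            buckets[min(int((x - a) / width), k - 1)].append(x)
        for bucket in buckets:          # buckets cover increasing ranges
            out.extend(distributive_sort(bucket))
    return out
```

Concatenating the recursively sorted buckets in order yields the sorted sequence; under the probabilistic assumptions studied in the paper, the expected total work is linear in n.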