Journal of Global Optimization
Notable scientific publications
* Data are provided for reference only
Preface: Special issue of Journal of Global Optimization for the 8th international conference on optimization: techniques and applications
Journal of Global Optimization - Volume 56 - Pages 1295-1296 - 2012
Outer space branch and bound algorithm for solving linear multiplicative programming problems
Journal of Global Optimization - Volume 78 - Pages 453-482 - 2020
In this paper, we consider a linear multiplicative programming problem (LMP), which is known to be NP-hard even with a single product term. We first introduce auxiliary variables to obtain a problem equivalent to LMP. An outer space branch and bound algorithm is then designed, which integrates basic operations such as a linear relaxation technique, a branching rule and a region reduction technique. The global convergence of the proposed algorithm is established by means of the solutions of a sequence of linear programming problems, and its computational complexity is estimated on the basis of the branching rule. We also discuss the relationship between the proposed linear relaxation and existing relaxations for LMP. Finally, preliminary numerical results demonstrate that the proposed algorithm can efficiently find globally optimal solutions for the test instances.
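As a rough illustration of the bounding step used in such branch and bound methods (not the paper's own relaxation or branching scheme), the sketch below under-estimates the product objective of a made-up two-variable LMP instance with a standard McCormick-style linear relaxation in auxiliary variables t1, t2, z; all data, bounds and the grid reference value are assumptions for illustration.

# Minimal sketch, assuming a toy LMP instance: min (c1.x + d1)(c2.x + d2)
# s.t. A x <= b, 0 <= x <= 4. A McCormick-style LP relaxation gives a lower
# bound on the global optimum; a dense grid gives a reference value.
import numpy as np
from scipy.optimize import linprog

c1, d1 = np.array([1.0, 1.0]), 1.0
c2, d2 = np.array([1.0, -1.0]), 2.0
A = np.array([[1.0, 1.0], [-1.0, 2.0]])
b = np.array([4.0, 3.0])
lo, hi = 0.0, 4.0

def term_bounds(c, d):
    # Interval bounds of c.x + d over the box [lo, hi]^n
    l = d + sum(min(ci * lo, ci * hi) for ci in c)
    u = d + sum(max(ci * lo, ci * hi) for ci in c)
    return l, u

l1, u1 = term_bounds(c1, d1)
l2, u2 = term_bounds(c2, d2)

# Variables [x1, x2, t1, t2, z]; minimize z subject to
#   t1 = c1.x + d1,  t2 = c2.x + d2,  A x <= b,
#   z >= l2*t1 + l1*t2 - l1*l2,  z >= u2*t1 + u1*t2 - u1*u2  (McCormick under-estimators)
cost = [0.0, 0.0, 0.0, 0.0, 1.0]
A_eq = [[-c1[0], -c1[1], 1.0, 0.0, 0.0],
        [-c2[0], -c2[1], 0.0, 1.0, 0.0]]
b_eq = [d1, d2]
A_ub = [[A[0, 0], A[0, 1], 0.0, 0.0, 0.0],
        [A[1, 0], A[1, 1], 0.0, 0.0, 0.0],
        [0.0, 0.0, l2, l1, -1.0],
        [0.0, 0.0, u2, u1, -1.0]]
b_ub = [b[0], b[1], l1 * l2, u1 * u2]
bounds = [(lo, hi), (lo, hi), (l1, u1), (l2, u2), (None, None)]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("LP lower bound on the product objective:", res.fun)

# Reference value from a dense feasibility grid (only viable in tiny dimensions)
g = np.linspace(lo, hi, 201)
X1, X2 = np.meshgrid(g, g)
feas = (A[0, 0] * X1 + A[0, 1] * X2 <= b[0]) & (A[1, 0] * X1 + A[1, 1] * X2 <= b[1])
obj = (c1[0] * X1 + c1[1] * X2 + d1) * (c2[0] * X1 + c2[1] * X2 + d2)
print("best objective on the grid:", obj[feas].min())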
Primal-dual splittings as fixed point iterations in the range of linear operators
Journal of Global Optimization - Volume 85 - Pages 847-866 - 2022
In this paper we study the convergence of the relaxed primal-dual algorithm with critical preconditioners for solving composite monotone inclusions in real Hilbert spaces. We prove that this algorithm defines Krasnosel’skiĭ–Mann (KM) iterations in the range of a particular monotone self-adjoint linear operator with non-trivial kernel. Our convergence result generalizes (Condat in J Optim Theory Appl 158:460–479, 2013, Theorem 3.3) and follows from the convergence of KM iterations defined in the range of linear operators, which is a real Hilbert subspace under suitable conditions. The Douglas–Rachford splitting (DRS) with a non-standard metric is written as a particular instance of the primal-dual algorithm with critical preconditioners, and we recover classical results from this new perspective. We implement the algorithm for total variation reconstruction, verifying the advantages of using critical preconditioners and relaxation steps.
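A minimal sketch of the KM iteration itself, x_{k+1} = x_k + lam (T(x_k) - x_k), applied to the nonexpansive composition of two projection operators rather than to the paper's preconditioned primal-dual operator; the sets, starting point and relaxation parameter below are made up for illustration.

# Minimal KM iteration sketch for T = P_C o P_D (composition of projections
# onto a disk and a half-plane); fixed points of T lie in the intersection.
import numpy as np

def proj_disk(x, center=np.zeros(2), radius=1.0):
    # Projection onto the Euclidean disk of given center and radius
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def proj_halfplane(x, a=np.array([1.0, 1.0]), b=0.5):
    # Projection onto the half-plane {y : a.y <= b}
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def T(x):
    return proj_disk(proj_halfplane(x))

x = np.array([3.0, -2.0])
lam = 0.5  # relaxation parameter in (0, 1)
for _ in range(200):
    x = x + lam * (T(x) - x)   # Krasnosel'skii-Mann step

print("approximate fixed point:", x)
print("residual ||T(x) - x||:", np.linalg.norm(T(x) - x))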
One-dimensional identification problem and ranking parameters
Journal of Global Optimization - - 2010
Performance of global random search algorithms for large dimensions
Journal of Global Optimization - Volume 71 - Pages 57-71 - 2017
We investigate the rate of convergence of general global random search (GRS) algorithms. We show that if the dimension of the feasible domain is large, then it is impossible to guarantee that the global minimizer is found by a general GRS algorithm with reasonable accuracy. We then study the precision of statistical estimates of the global minimum in the case of large dimensions. We show that these estimates also suffer from the curse of dimensionality. Finally, we demonstrate that the use of quasi-random points in place of random ones does not give any visible advantage in large dimensions.
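A small, self-contained numerical illustration of the effect described (not taken from the paper): with a fixed budget of uniform samples, the best value of the sphere function found by pure random search drifts away from the global minimum f* = 0 as the dimension grows; the budget, test function and dimensions are assumptions.

# Pure random search with a fixed budget N on f(x) = ||x||^2 over [-1, 1]^d
import numpy as np

rng = np.random.default_rng(0)
N = 10_000                      # fixed evaluation budget

for d in (2, 5, 10, 20, 50):
    X = rng.uniform(-1.0, 1.0, size=(N, d))
    best = np.min(np.sum(X**2, axis=1))       # best value found so far
    print(f"d = {d:3d}   best f found with {N} samples: {best:.4f}")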
On feedback strengthening of the maximum principle for measure differential equations
Journal of Global Optimization - Volume 76 - Pages 587-612 - 2019
For a class of nonlinear nonconvex impulsive control problems with states of bounded variation driven by Borel measures, we derive a new type of non-local necessary optimality condition, named the impulsive feedback maximum principle. This optimality condition is expressed entirely in terms of the objects of the impulsive maximum principle (IMP), while employing certain “feedback variations” of the impulsive control. The obtained optimality condition is shown to be capable of discarding non-optimal IMP extrema, and it can be viewed as a deterministic non-local iterative algorithm for optimal impulsive control.
Toulouse Global optimization Workshop 2010 (TOGO10)
Journal of Global Optimization - Volume 56 - Pages 757-759 - 2012
Partly convex programming and Zermelo's navigation problems
Journal of Global Optimization - Volume 7 - Pages 229-259 - 1995
Mathematical programs that become convex programs after “freezing” some variables are termed partly convex. For such programs we give saddle-point conditions that are both necessary and sufficient for a feasible point to be globally optimal. The conditions require “cooperation” of the feasible point tested for optimality, an assumption implied by lower semicontinuity of the feasible set mapping. The characterizations are simplified if certain point-to-set mappings satisfy a “sandwich condition”. The tools of parametric optimization and basic point-to-set topology are used in formulating both the optimality conditions and numerical methods. In particular, we solve a large class of Zermelo's navigation problems and establish global optimality of the numerical solutions.
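A toy sketch of the partly convex structure itself, not of the paper's conditions or its Zermelo examples: the function f(x, y) below (an assumption for illustration) is convex in x for every frozen y, so the inner problem is solved in closed form and a global search is only needed over the frozen variable.

# f(x, y) = (x + sin(y))^2 + cos(y): convex (quadratic) in x for each frozen y,
# nonconvex in y. Freeze y, minimize exactly in x, then search globally over y.
import numpy as np

def f(x, y):
    return (x + np.sin(y))**2 + np.cos(y)

ys = np.linspace(0.0, 2 * np.pi, 2001)   # coarse global search over the nonconvex variable
xs = -np.sin(ys)                         # exact minimizer of the convex inner problem
vals = f(xs, ys)

k = np.argmin(vals)
print(f"approximate global minimum f = {vals[k]:.4f} at x = {xs[k]:.4f}, y = {ys[k]:.4f}")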
Multistart with early termination of descents
Journal of Global Optimization - Volume 79 - Pages 447-462 - 2019
Multistart is a celebrated global optimization technique frequently applied in practice. In its pure form, multistart has low efficiency. However, the simplicity of multistart and the multitude of possibilities for its generalization make it very attractive, especially in high-dimensional problems where, for example, Lipschitzian and Bayesian algorithms are not applicable. We propose a version of multistart in which most of the local descents are terminated very early; we call it METOD, an abbreviation of multistart with early termination of descents. The performance of the proposed algorithm is demonstrated on randomly generated test functions with 100 variables and a modest number of local minimizers.
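A simplified sketch of the multistart-with-early-termination idea; the distance-based stopping test, test function, step size and thresholds below are stand-in assumptions, not the METOD condition from the paper.

# Multistart with gradient descents that are stopped early once an iterate
# comes close to an already-explored basin (simplified stand-in test).
import numpy as np

def f(x):
    return (x[0]**2 - 1.0)**2 + x[1]**2              # two minimizers: (+1, 0) and (-1, 0)

def grad(x):
    return np.array([4.0 * x[0] * (x[0]**2 - 1.0), 2.0 * x[1]])

def descend(x, step=0.05, tol=1e-6, max_iter=5000, known=()):
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x, False                          # converged to a (possibly new) minimizer
        if any(np.linalg.norm(x - m) < 0.25 for m in known):
            return x, True                           # early termination: near a known basin
        x = x - step * g
    return x, False

rng = np.random.default_rng(1)
minimizers, skipped = [], 0
for _ in range(50):
    x0 = rng.uniform(-2.0, 2.0, size=2)
    x, terminated_early = descend(x0, known=minimizers)
    if terminated_early:
        skipped += 1
    elif all(np.linalg.norm(x - m) > 1e-3 for m in minimizers):
        minimizers.append(x)

print("distinct minimizers found:", [np.round(m, 3) for m in minimizers])
print("descents terminated early:", skipped)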
Total: 2,250