
Felipe Lara

Assistant Professor, Department of Mathematics, University of Tarapacá, Arica, Chile.

Education

Ph.D. in Applied Sciences with mention in Mathematical Engineering, University of Concepción, October 14, 2015, Concepción, Chile.
Ph.D. Thesis: “Second Order Asymptotic Analysis in Optimization”.
Advisors: Fabián Flores-Bazán (University of Concepción, Concepción, Chile) and Nicolas Hadjisavvas (University of the Aegean, Syros, Greece).

Mathematical Engineer, University of Concepción, October 27, 2010. Concepción, Chile.
Thesis: “Efficiency in Nonconvex Multiobjective Optimization”.
Advisor: Fabián Flores-Bazán, University of Concepción, Concepción, Chile.

September 3, 2024

Felipe Lara gives the plenary talk «A review on strongly quasiconvex functions: existence, characterizations and algorithms» at the XIV International Symposium on Generalized Convexity and Monotonicity (14GCM), University of Pisa, Pisa, Italy.

April 19, 2024

Felipe Lara and his research group have been awarded the research project «New methods for solving nonconvex optimization problems» by the Vietnam Institute for Advanced Study in Mathematics (VIASM), to be carried out during the first semester of 2025 in Hanoi, Vietnam.

PUBLICATIONS

It is a pleasure to share my publications with you; I hope they help you in your future research. Please feel free to share them.
  • A subgradient projection method for quasiconvex minimization, Positivity, DOI: 10.1007/s11117-024-01082-z, (2024), (with J. Choque and R.T. Marcavillaca).

    DOWNLOAD

In this paper, a subgradient projection method for quasiconvex minimization problems is provided. By employing strong subdifferentials, it is proved that the generated sequence of the proposed algorithm converges to the solution of the minimization problem of a proper, lower semicontinuous, and strongly quasiconvex function (in the sense of Polyak [18]), under the same assumptions as those required for convex functions with the convex subdifferentials. Furthermore, a quasi-linear convergence rate of the iterates, extending similar results for the general quasiconvex case, is also provided.
Keywords: Subgradient methods; First-order methods; Nonconvex optimization; Generalized convexity; Quasiconvexity
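
For readers who want a concrete picture, the sketch below shows a generic subgradient projection iteration in Python. It is only an illustration of the overall scheme under simplifying assumptions (normalized diminishing steps, user-supplied subgradient oracle and projector); the paper's actual method relies on strong subdifferentials and its own step-size rules, and the names `f_subgrad` and `project_C` are placeholders.

```python
import numpy as np

def subgradient_projection(x0, subgrad, project, steps, tol=1e-8):
    """Generic subgradient projection scheme (illustrative only).

    Iterates x_{k+1} = P_C(x_k - t_k * g_k / ||g_k||), with g_k a
    subgradient-type direction at x_k.  The paper's method instead uses
    strong subdifferentials and its own step-size rule.
    """
    x = np.asarray(x0, dtype=float)
    for t in steps:                      # prescribed step sizes t_k
        g = subgrad(x)                   # subgradient oracle (placeholder)
        norm_g = np.linalg.norm(g)
        if norm_g <= tol:                # (near-)stationary point reached
            break
        x = project(x - t * g / norm_g)  # normalized step, then projection onto C
    return x

# Toy run: f(x) = ||x|| (convex, hence quasiconvex) minimized over the box
# C = [-1, 1]^2; f_subgrad and project_C are placeholders for this example.
if __name__ == "__main__":
    f_subgrad = lambda x: x / max(np.linalg.norm(x), 1e-12)
    project_C = lambda x: np.clip(x, -1.0, 1.0)
    steps = [1.0 / (k + 1) for k in range(200)]          # diminishing steps
    print(subgradient_projection([0.9, -0.7], f_subgrad, project_C, steps))
```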

  • A two-step proximal point algorithm for nonconvex equilibrium problems with applications to quadratic programming, J. Global Optim., DOI: 10.1007/s10898-024-01419-8, (2024), (with A. Iusem, R.T. Marcavillaca, L.H. Yen).

    DOWNLOAD

We present a proximal point type algorithm tailored for tackling pseudomonotone equilibrium problems in a Hilbert space which are not necessarily convex in the second argument of the involved bifunction. Motivated by the extragradient algorithm, we propose a two-step method and we prove that the generated sequence converges strongly to a solution of the nonconvex equilibrium problem under mild assumptions and, also, we establish a linear convergence rate for the iterates. Furthermore, we identify a new class of functions that meet our assumptions, and we provide sufficient conditions for quadratic fractional functions to exhibit strong quasiconvexity. Finally, we perform numerical experiments comparing our algorithm against two alternative methods for classes of nonconvex mixed variational inequalities.
Keywords: Proximal point methods; Nonconvex optimization; Equilibrium problems; Generalized convexity; Fractional programming.
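
For orientation, the classical two-step (extragradient-type) proximal scheme for the equilibrium problem EP(f, C) — find x* ∈ C such that f(x*, y) ≥ 0 for all y ∈ C — can be written as below. The notation (step sizes λ_k, quadratic regularization) is assumed here for illustration; the paper's actual iteration and parameter conditions may differ.

```latex
% Classical two-step (extragradient-type) proximal scheme for EP(f, C);
% notation assumed, the paper's actual iteration and parameters may differ.
\begin{aligned}
y_k     &\in \operatorname*{arg\,min}_{y \in C} \Big\{ \lambda_k\, f(x_k, y) + \tfrac{1}{2}\,\|y - x_k\|^{2} \Big\},\\
x_{k+1} &\in \operatorname*{arg\,min}_{y \in C} \Big\{ \lambda_k\, f(y_k, y) + \tfrac{1}{2}\,\|y - x_k\|^{2} \Big\}.
\end{aligned}
```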

  • An Extragradient Projection Method for Strongly Quasiconvex Equilibrium Problems with Applications, Comp. Appl. Math., DOI: 10.1007/s40314-024-02626-5, (2024) (with R.T. Marcavillaca and L.H. Yen).

    DOWNLOAD

We discuss an extragradient projection method for dealing with equilibrium problems which are strongly quasiconvex in their second argument. The algorithm combines a proximal step with a subgradient projection step using a generalized subdifferential, which is especially useful for dealing with this class of generalized convex functions, and also with a line search. As a consequence, the usual assumption regarding the relationship between the Lipschitz-type parameter and the modulus of strong quasiconvexity is no longer needed for ensuring the convergence of the generated sequence to the solution of the problem. Furthermore, numerical experiments for classes of nonconvex mixed variational inequalities based on fractional programming problems are given in order to show the performance of our proposed method.

Keywords: Nonconvex optimization; Equilibrium problems; Extragradient methods; Subgradient methods; Variational inequalities

  • Proximal Point Type Algorithms with Relaxed and Inertial Effects Beyond Convexity, Optimization, DOI: 10.1080/02331934.2024.2329779, (2024), (with S.-M. Grad and R.T. Marcavillaca)

    DOWNLOAD

We show that the recent relaxed-inertial proximal point algorithm due to Attouch and Cabot remains convergent when the function to be minimized is not convex, being only endowed with certain generalized convexity properties. Numerical experiments showcase the improvements brought by the relaxation and inertia features to the standard proximal point method in this setting, too.
Keywords: Proximal point algorithms, Relaxed iterative methods, Inertial iterative methods, Generalized convexity, Prox-convexity.
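
In the convex setting, the relaxed-inertial proximal point iteration of Attouch and Cabot can be written as follows; the parameter names (inertia α_k, relaxation ρ_k, step size λ_k) are assumed here for illustration, and the paper studies when such a scheme still converges once convexity of f is weakened to generalized convexity.

```latex
% Relaxed-inertial proximal point scheme (illustrative form; parameter names assumed)
\begin{aligned}
y_k     &= x_k + \alpha_k\,(x_k - x_{k-1}) &&\text{(inertial extrapolation)},\\
x_{k+1} &= (1-\rho_k)\,y_k + \rho_k\,\operatorname{prox}_{\lambda_k f}(y_k) &&\text{(relaxed proximal step)},
\end{aligned}
\qquad\text{where }\ \operatorname{prox}_{\lambda f}(x) := \operatorname*{arg\,min}_{u}\big\{ f(u) + \tfrac{1}{2\lambda}\,\|u - x\|^{2}\big\}.
```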

  • Relaxed-Inertial Proximal Point Algorithms for Nonconvex Equilibrium Problems with Applications. J. Optim. Theory Appl., DOI: 10.1007/s10957-023-02375-1, (2024), (with S.-M. Grad and R.T. Marcavillaca).

    DOWNLOAD

We propose a relaxed-inertial proximal point algorithm for solving equilibrium problems involving bifunctions which satisfy in the second variable a generalized convexity notion called strong quasiconvexity, introduced by Polyak in 1966. The method is suitable for solving mixed variational inequalities and inverse mixed variational inequalities involving strongly quasiconvex functions, as these can be written as special cases of equilibrium problems. Numerical experiments in which the proposed algorithm outperforms standard proximal point methods are also provided.
Keywords: Proximal point algorithms; Inertial algorithms; Equilibrium problems; Nonconvex optimization; Quasiconvexity

  • Semistrictly and Neatly Quasiconvex Programming using Lower Global Subdifferentials. J. Global Optim., DOI: 10.1007/s10898-023-01278-9, (2023), (with A. Kabgani).

    DOWNLOAD

The main goal of this paper is to investigate the properties and connections of neatly and semistrictly quasiconvex functions, especially when they appear in constrained and unconstrained optimization problems. The lower global subdifferential, recently introduced in the literature, plays an essential role in this study. We present several optimality conditions for constrained and unconstrained nonsmooth neatly/semistrictly quasiconvex optimization problems in terms of lower global subdifferentials.

To this end, for a constrained optimization problem, we present some characterizations for the normal and tangent cones and the cone of feasible directions of the feasible set. Some relationships between the Greenberg-Pierskalla, tangentially and lower global subdifferentials of neatly and semistrictly quasiconvex functions are also given.

The mentioned relationships show that the outcomes of this paper generalize some results existing in the literature.
Keywords: Nonconvex optimization; Quasiconvex programming; KKT conditions; Global subdifferentials; Greenberg-Pierskalla’s subdifferential

  • Relaxed-inertial proximal point type algorithms for quasiconvex minimization. J. Global Optim. DOI: 10.1007/s10898-022-01226-z, (2022), (with S.-M. Grad and R.T. Marcavillaca).

    DOWNLOAD

We propose a relaxed-inertial proximal point type algorithm for solving optimization problems consisting in minimizing strongly quasiconvex functions whose variables lie in finite-dimensional linear subspaces. A relaxed version of the method where the constraint set is only closed and convex is also discussed, and so is the case of a quasiconvex objective function. Numerical experiments illustrate the theoretical results.
Keywords: Proximal point algorithms, Relaxed methods, Inertial methods, Generalized convexity, Strong quasiconvexity.

  • Bregman type proximal point algorithms for quasiconvex minimization. Optimization, DOI: 10.1080/02331934.2022.2112580, (2022), (with R. T. Marcavillaca).

    DOWNLOAD

We discuss a Bregman proximal point type algorithm for dealing with quasiconvex minimization. In particular, we prove that the Bregman proximal point type algorithm converges to a minimal point of the minimization problem for a certain class of quasiconvex functions, without differentiability or Lipschitz continuity assumptions; this class of nonconvex functions is known as the strongly quasiconvex functions. As a consequence, we also revisit the general case of quasiconvex functions.
Keywords: Proximal point algorithms, Bregman distances, Nonconvex optimization, Generalized convexity, Quasiconvexity.
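
As a reminder of the underlying construction (with notation assumed here, not taken from the paper), the Bregman distance generated by a differentiable convex kernel h and the corresponding Bregman proximal point step read:

```latex
% Bregman distance generated by a differentiable convex kernel h, and the
% Bregman proximal point step (illustrative notation).
D_h(x, y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle,
\qquad
x_{k+1} \in \operatorname*{arg\,min}_{x}\Big\{ f(x) + \tfrac{1}{\lambda_k}\, D_h(x, x_k) \Big\}.
```

With the kernel h = ½‖·‖² one recovers D_h(x, y) = ½‖x − y‖² and hence the classical proximal point step.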

  • Strong subdifferentials: theory and applications in nonconvex optimization. J. Global Optim., DOI: 10.1007/s10898-022-01149-9, (2022), (with A. Kabgani).

    DOWNLOAD

A new subdifferential for dealing with nonconvex functions is provided in this paper, and its usual properties are presented as well. Furthermore, characterizations and optimality conditions for a point to be a solution of the nonconvex minimization problem are given. In particular, new KKT-type optimality conditions for nonconvex nonsmooth constrained optimization problems are developed. Moreover, a relationship with the proximity operator for lower semicontinuous quasiconvex functions is given and, as a consequence, the nonemptiness of this subdifferential for large classes of quasiconvex functions is ensured.
Keywords: Nonconvex optimization; Nonsmooth optimization; Generalized convexity; KKT conditions; Proximal operators.

  • On Strongly Quasiconvex Functions: Existence Results and Proximal Point Algorithms. J. Optim. Theory Appl., DOI: 10.1007/s10957-021-01996-8, (2022).

    DOWNLOAD

We prove that every strongly quasiconvex function is 2-supercoercive (in
particular, coercive). Furthermore, we investigate the usual properties
of proximal operators for strongly quasiconvex functions. In particular,
we prove that the set of fixed points of the proximal operator coincides
with the unique minimizer of a lower semicontinuous strongly quasiconvex
function. As a consequence, we implement the proximal point algorithm
for finding the unique solution of the minimization problem of a strongly
quasiconvex function by using a positive sequence of parameters bounded
away from 0 and, in particular, we revisit the general quasiconvex case.
Moreover, a new characterization for convex functions is derived from
this analysis. Finally, an application for a strongly quasiconvex function
which is neither convex nor differentiable nor locally Lipschitz continuous
is provided.
Keywords: Nonconvex optimization, Nonsmooth optimization, Strongly
quasiconvex functions, Existence of Solutions, Proximal point algorithms
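
For reference, a function f is strongly quasiconvex with modulus γ > 0 in the sense of Polyak if the first inequality below holds for all x, y and λ ∈ [0,1]; the proximal point iteration discussed in the paper then takes the usual form, written here with an illustrative parameter sequence bounded away from zero (the notation is assumed, not quoted from the paper).

```latex
% Strong quasiconvexity in the sense of Polyak (modulus gamma > 0), and the
% proximal point iteration with parameters bounded away from zero.
% Notation is illustrative, not necessarily the paper's.
\begin{gathered}
f(\lambda x + (1-\lambda)y) \;\le\; \max\{f(x),\, f(y)\} \;-\; \lambda(1-\lambda)\,\tfrac{\gamma}{2}\,\|x-y\|^{2},
\qquad \forall\, x, y,\ \lambda \in [0,1],\\[4pt]
x_{k+1} \in \operatorname{prox}_{\lambda_k f}(x_k)
:= \operatorname*{arg\,min}_{u}\Big\{ f(u) + \tfrac{1}{2\lambda_k}\,\|u - x_k\|^{2} \Big\},
\qquad \lambda_k \ge \bar{\lambda} > 0.
\end{gathered}
```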

  • Proximal point algorithms for quasiconvex pseudomonotone equilibrium problems. J. Optim. Theory Appl., DOI: 10.1007/s10957-021-01951-7, (2021), (with A. Iusem).

    DOWNLOAD

We propose a proximal point method for quasiconvex pseudomonotone equilibrium problems. The subproblems of the method are
optimization problems whose objective is the sum of a strongly quasiconvex function plus the standard quadratic regularization term for optimization problems. We prove, under suitable additional assumptions, convergence of the generated sequence to a solution of the equilibrium problem,
whenever the bifunction is strongly quasiconvex in its second argument,
thus extending the validity of the convergence analysis of proximal point
methods for equilibrium problems beyond the standard assumption of
convexity of the bifunction in the second argument.
Keywords: Proximal point algorithms, Equilibrium problems, Pseudomonotonicity, Quasiconvexity, Strong quasiconvexity
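
Concretely, with EP(f, C) denoting the problem of finding x* ∈ C such that f(x*, y) ≥ 0 for all y ∈ C, the subproblems described above take the form below; the regularization parameter λ_k is illustrative notation.

```latex
% Regularized subproblem of the proximal point method for EP(f, C); notation assumed.
x_{k+1} \in \operatorname*{arg\,min}_{y \in C}\Big\{ f(x_k, y) + \tfrac{1}{2\lambda_k}\,\|y - x_k\|^{2} \Big\}.
```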

  • An extension of the proximal point algorithm beyond convexity. J. of Global Optim. DOI: 10.1007/s10898-021-01081-4, (2021). (With S.-M. Grad).

    DOWNLOAD

We introduce and investigate a new generalized convexity
notion for functions called prox-convexity. The proximity operator of
such a function is single-valued and firmly nonexpansive. We provide
examples of (strongly) quasiconvex, weakly convex, and DC (difference of
convex) functions that are prox-convex; however, none of these classes fully
contains the class of prox-convex functions, nor is fully included in it. We show
that the classical proximal point algorithm remains convergent when the
convexity of the proper lower semicontinuous function to be minimized is
relaxed to prox-convexity.
Keywords: Nonsmooth optimization, Nonconvex optimization, Proximity
operator, Proximal point algorithm, Generalized convex function

  • On nonconvex pseudomonotone equilibrium problems with applications. Set-Valued and Var. Anal. DOI: 10.1007/s11228-021-00586-0, (2021).

    DOWNLOAD

In this paper, we provide a further study for nonconvex pseudomonotone equilibrium problems and nonconvex mixed variational inequalities by using global directional derivatives. We provide finer necessary
and sufficient optimality conditions for both problems in the pseudomonotone case and, as a consequence, a characterization for a point to be a solution for nonconvex equilibrium problems is given. Finally, we apply the
golden ratio algorithm for a class of nonconvex functions in equilibrium
problems and mixed variational inequalities.
Keywords: Nonconvex optimization, Nonsmooth analysis, Equilibrium problems, Variational inequalities, Golden ratio algorithms

  • Solving mixed variational inequalities beyond convexity. J. Optim. Theory Appl. DOI: 10.1007/s10957-021-01860-9, (2021). (With S.-M. Grad).

    DOWNLOAD

We show that Malitsky’s recent Golden Ratio Algorithm for solving convex mixed variational inequalities can be employed in a certain nonconvex framework as well, making it probably the first iterative method in the literature for solving generalized convex mixed variational inequalities, and illustrate this result by numerical experiments.
Keywords: Variational inequalities, Quasiconvex functions, Proximal Point Algorithms, Golden Ratio Algorithms
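
For context, the mixed variational inequality considered here asks for a point x* with ⟨F(x*), y − x*⟩ + g(y) − g(x*) ≥ 0 for all y, and Malitsky's golden ratio update in the convex setting can be sketched as below. The notation and the fixed step size λ are assumptions for illustration; the article above shows that such a scheme can still be employed when g only satisfies generalized convexity properties.

```latex
% Golden Ratio Algorithm for the mixed variational inequality
%   find x* such that <F(x*), y - x*> + g(y) - g(x*) >= 0 for all y;
% sketch with assumed notation and a fixed step size lambda.
\varphi = \frac{1+\sqrt{5}}{2}, \qquad
\bar{x}_k = \frac{(\varphi - 1)\, x_k + \bar{x}_{k-1}}{\varphi}, \qquad
x_{k+1} = \operatorname{prox}_{\lambda g}\!\big(\bar{x}_k - \lambda\, F(x_k)\big).
```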

  • On global subdifferentials with applications in nonsmooth optimization. J. of Global Optimization. DOI: 10.1007/s10898-020-00981-1, (2020). (with A. Kabgani).

    DOWNLOAD

The notions of global subdifferentials associated with the global directional derivatives are introduced in this paper. The most commonly used properties and a set of calculus rules, along with a mean value theorem, are presented as well. In addition, a variety of comparisons with well-known subdifferentials, such as the Fréchet, Dini, Clarke, Michel-Penot, and Mordukhovich subdifferentials, and with the convexificator notion, are provided. Furthermore, the lower global subdifferential is in fact proved to be an abstract subdifferential; therefore, it satisfies standard properties of subdifferential operators. Finally, two applications in nonconvex nonsmooth optimization are given: necessary and sufficient optimality conditions for a point to be a local minimum, with and without constraints, and a revisited characterization of nonsmooth quasiconvex functions.

Keywords: Nonsmooth analysis; Global derivatives; Global subdifferentials; Nonconvex optimization; Local minima.

  • Characterizations of Nonconvex Optimization Problems via Variational Inequalities. Optimization, DOI: 10.1080/02331934.2020.1857758, (2020).

    DOWNLOAD

In this paper, we deal with two problems from the theory of nonconvex nonsmooth analysis: the characterization of nonsmooth quasiconvex functions, and the connections between nonsmooth constrained optimization problems and variational inequalities. For the first problem, we provide different characterizations for nonsmooth quasiconvex functions, while for the second problem, a full connection between constrained optimization problems and Stampacchia and Minty variational inequalities is provided. In both cases, neither differentiability nor convexity nor continuity assumptions are required. As a corollary, we recover well-known results from convex analysis.

Keywords: Nonsmooth analysis; Nonconvex optimization; Quasiconvexity; Minty Variational Inequalities; Stampacchia Variational Inequalities.

  • A Note on “Existence Results for Noncoercive Mixed Variational Inequalities in Finite Dimensional Spaces” J. Optim. Theory Appl. DOI: 10.1007/s10957-020-01722-w, (2020). (With A. Iusem).

    DOWNLOAD

We correct the proofs of a previous publication.

Keywords: Asymptotic analysis; Asymptotic functions; Noncoercive Optimization; Variational Inequalities; Equilibrium Problems.

  • Optimality Conditions for Nonconvex Nonsmooth Optimization via Global Derivatives J. Optim. Theory Appl. Vol 185, Issue 1, 134–150, (2020).

    DOWNLOAD

The notions of upper and lower global directional derivatives are introduced for dealing with nonconvex and nonsmooth optimization problems. We provide calculus rules and monotonicity properties for these notions. As a consequence, new formulas for the Dini directional derivatives, radial epiderivatives and generalized asymptotic functions are given in terms of the upper and lower global directional derivatives. Furthermore, a mean value theorem, which extends the well-known Diewert mean value theorem for radially upper and lower semicontinuous functions, is established. We also provide necessary and sufficient optimality conditions for a point to be a local and/or global solution of the nonconvex minimization problem. Finally, applications to nonconvex and nonsmooth mathematical programming problems are also presented.

Keywords: Global derivatives, Asymptotic functions, Quasiconvexity, Nonconvex optimization, Nonsmooth optimization.

  • First and second order asymptotic analysis with applications in quasiconvex optimization
    J. Optim. Theory Appl., Vol. 170, Issue 2, 372–393, (2016). (With F. Flores-Bazán, N. Hadjisavvas and I. Montenegro).

    DOWNLOAD

We use asymptotic analysis to describe in a more systematic way the behavior at infinity of functions in the convex and quasiconvex cases. Starting from the formulae for the first and second order asymptotic functions in the convex case, we introduce similar notions suitable for dealing with quasiconvex functions. Afterwards, by using such notions, a class of quasiconvex vector mappings, under which the image of a closed convex set is closed, is introduced; we characterize the nonemptiness and boundedness of the set of minimizers of any lsc quasiconvex function; finally, we also characterize boundedness from below, along lines, of any proper and lsc function.

Keywords: Quasiconvexity, asymptotic functions, second-order asymptotic functions and cones, optimality conditions, nonconvex optimization.
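
For readers unfamiliar with the terminology, the first order asymptotic function used as a starting point here is the standard object from convex analysis. With the usual notation (not necessarily the paper's), for a proper function f it is given by the liminf formula below, which for a proper, lsc, convex f reduces to the supremum formula on the right, for any x₀ ∈ dom f.

```latex
% First order asymptotic (recession) function; standard convex-analysis notation.
f^{\infty}(d) = \liminf_{\substack{t \to +\infty \\ d' \to d}} \frac{f(t\, d')}{t},
\qquad\text{and, for proper lsc convex } f:\qquad
f^{\infty}(d) = \sup_{t > 0} \frac{f(x_0 + t\, d) - f(x_0)}{t}, \quad x_0 \in \operatorname{dom} f.
```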

  • Inner and outer estimates for the solution sets and their asymptotic cones in vector optimization Optim. Letters, Vol. 6, Issue 7, 1233–1249, (2012). (With F. Flores-Bazán).

    DOWNLOAD

We use asymptotic analysis to develop finer estimates for the efficient, weak efficient and proper efficient solution sets (and for their asymptotic cones) of convex/quasiconvex vector optimization problems. We also provide a new representation for the efficient solution set without any convexity assumption, and the estimates involve the minima of the linear scalarization of the original vector problem. Some new necessary conditions for a point to be an efficient or weak efficient solution are established for general convex vector optimization problems, for the nonconvex quadratic multiobjective optimization problem, and for the general case of quasiconvex multiobjective optimization problems.

Keywords: Nonconvex vector optimization; Efficiency; Weak efficiency; Proper efficiency; Linear scalarization; Asymptotic functions and cones; Necessary conditions.

  • An Augmented Lagrangian Method for Quasi-Equilibrium Problems. Comp. Optim. Appl., Vol. 76, Issue 3, 737–766, (2020), (with L.F. Bueno, G. Haeser and F.N. Rojas).
    DOWNLOAD

In this paper, we propose an Augmented Lagrangian algorithm for solving a general class of possibly nonconvex problems called quasi-equilibrium problems (QEPs). We define an Augmented Lagrangian bifunction associated with QEPs, introduce a secondary QEP as a measure of infeasibility, and discuss several special classes of QEPs within our theoretical framework. For obtaining global convergence under a new weak constraint qualification, we extend the notion of an Approximate Karush-Kuhn-Tucker (AKKT) point to QEPs (AKKT-QEP), showing that in general it is not necessarily satisfied at a solution, differently from its counterpart in optimization. We study some particular cases where AKKT-QEP does hold at a solution, while discussing the solvability of the subproblems of the algorithm. We also present illustrative numerical experiments.

Keywords: Augmented Lagrangian methods; Quasi-equilibrium problems; Equilibrium problems; Constraint qualifications; Approximate-KKT conditions

  • On the Existence of a Saddle Value for Nonconvex and Noncoercive Bifunctions Minimax Theory Appl. Vol. 5, Issue 1, 65–76, (2020).

    DOWNLOAD

We provide necessary and sufficient conditions for ensuring the existence of a saddle value for classes of nonconvex and noncoercive bifunctions. To that end, we use special classes of asymptotic (recession) directions and generalized asymptotic functions introduced and studied previously in the literature. We apply our theoretical results to provide sufficient conditions for zero duality gap for classes of quasiconvex cone-constrained mathematical programming problems.

Keywords: Saddle Value; Asymptotic Directions; Asymptotic Functions; Duality; Quasiconvexity; Noncoercive Optimization; Nonconvex Programming.

  • A Further Study on Asymptotic Functions via Variational Analysis. J. Optim. Theory Appl., Vol. 182, Issue 1, 366–382, (2019), (with R. López and B. F. Svaiter).

    DOWNLOAD

We use variational analysis for studying asymptotic (recession or horizon) functions. We introduce the upper and lower asymptotic operators and study their domains and images. Moreover, we characterize their fixed points and zeros. Finally, we establish continuity properties of these operators, i.e., the convergence of asymptotic functions of convergent sequences of functions.

Keywords: Asymptotic functions; Variational analysis; Operator theory; Set convergence; Epi-convergence.

  • Optimality Conditions for Vector Equilibrium Problems with Applications J. Optim. Theory Appl., Vol. 180, Issue 1, 187–206, (2019).(With A. Iusem)

    DOWNLOAD

We use asymptotic analysis for studying noncoercive pseudomonotone equilibrium problems and vector equilibrium problems. We introduce suitable notions of asymptotic functions, which provide sufficient conditions for the set of solutions of these problems to be nonempty and compact under quasiconvexity of the objective function. We characterize the efficient and weakly efficient solution set for the nonconvex vector equilibrium problem via scalarization. A sufficient condition for the closedness of the image of a nonempty, closed and convex set via a quasiconvex vector-valued function is given. Finally, applications to the quadratic fractional programming problem are also presented.

Keywords: Asymptotic analysis; Generalized convexity; Pseudomonotone operators; Equilibrium problems; Vector optimization

  • A General Asymptotic Function with Applications in Nonconvex Optimization J. of Global Optim. Vol. 78, Issue 1, 49–68,(2020). (With N. Hadjisavvas and D. T. Luc)

    DOWNLOAD

We introduce a new concept of asymptotic functions which allows us to simultaneously study convex and concave functions as well as quasiconvex and quasiconcave functions. We provide some calculus rules and most relevant properties of the new asymptotic functions for application purpose. We also compare them with the classical asymptotic functions of convex analysis. By using the new concept of asymptotic functions we establish sufficient conditions for the nonemptiness and for the boundedness of the solution set of quasiconvex minimization problems and quasiconcave maximization problems. Applications are given for quadratic and fractional quadratic problems.

Keywords: Asymptotic functions; Quasiconvex functions; Nonconvex optimization

  • Second Order Asymptotic Functions and Applications to Quadratic Programming J. of Convex Anal., Vol. 25, Issue 1, 271–291, (2018).(With A. Iusem)

    DOWNLOAD

We introduce a new second order asymptotic function which gives information on the convexity (concavity) of the original function from its behavior at infinity. We establish several properties and calculus rules for this concept, which differs from previous notions of second order asymptotic function. Finally, we apply our new definition in order to obtain necessary and sufficient optimality conditions for quadratic programming and quadratic fractional programming.

Keywords: Asymptotic cone; Asymptotic function; Second order asymptotic functions; Generalized convexity; Quadratic programming; Quadratic fractional programming.

  • Second-Order Asymptotic Analysis for Noncoercive Convex Optimization Math. Meth. Oper. Res., Vol. 86, Issue 3, 469–483, (2017).

    DOWNLOAD

We use second-order asymptotic analysis to deal with the minimization problem of a noncoercive convex function in a reflexive Banach space. To that end, we first introduce the definition of a second-order asymptotic cone, and its respective function, based on previous results for the finite dimensional case. We provide necessary and sufficient conditions for the existence of solutions for noncoercive convex minimization problems. Examples for which our assumptions are easier to verify than other well-known results are also provided.

Keywords: Asymptotic cones and functions; Second-order asymptotic functions; Convex optimization; Optimality conditions

  • Generalized asymptotic functions in nonconvex multiobjective optimization problems Optimization, Vol. 66, Issue 8, 1259–1272, (2017).

    DOWNLOAD

In this paper, we use generalized asymptotic functions and second order asymptotic cones to develop a general existence result for the nonemptiness of the proper efficient solution set and a sufficient condition for the domination property in nonconvex multiobjective optimization problems. A new necessary condition for a point to be efficient or weakly efficient solution is given without any convexity assumption. We also provide a finer outer estimate for the asymptotic cone of the weakly efficient solution set in the quasiconvex case. Finally, we apply our results to the linear fractional multiobjective optimization problem.

Keywords: Quasiconvexity, asymptotic functions; second order asymptotic cones; nonconvex vector optimization; linear fractional programming.

  • Second order asymptotic analysis: basic theory J. of Convex Anal., Vol. 22, Issue 4, 1173–1196, (2015). (With F. Flores-Bazán and N. Hadjisavvas)

    DOWNLOAD

Recently, the concepts of second order asymptotic directions and functions have been introduced and applied to global and vector optimization problems. In this work, we establish some new properties of these two concepts. In particular, in the case of a convex set, a complete characterization of the second order asymptotic cone is given. Also, formulas that permit the easy computation of the second order asymptotic function of a convex function are established. It is shown that the second order asymptotic function provides a finer description of the behavior of functions at infinity than the first order asymptotic function. Finally, we show that the second order asymptotic function of a given convex function can be seen as the first order asymptotic function of another convex function.

Keywords: Asymptotic cone; recession cone; asymptotic function; second order asymptotic cone; second order asymptotic function.

  • Formulas for Asymptotic Functions via Conjugates, Directional Derivatives and Subdifferentials
    J. Optim. Theory Appl., Vol. 173, Issue 3, 793–811,(2017).(With R. López)

    DOWNLOAD

The q-asymptotic function is a new tool that permits the study of nonconvex optimization problems with unbounded data. This tool is particularly useful when dealing with quasiconvex functions. In this paper, we obtain formulas for the q-asymptotic function via c-conjugates, directional derivatives and subdifferentials. We obtain such formulas under lower semicontinuity or local Lipschitz continuity assumptions. The well-known formulas for the asymptotic function in the convex case are consequences of these formulas. We also obtain a new formula for the convex case.

Keywords: Asymptotic cones and functions; q-Asymptotic functions; c-Conjugates; Directional derivatives; Subdifferentials

  • A Quasiconvex Asymptotic Function with Applications in Optimization J. Optim. Theory Appl., Vol. 180, Issue 1,170–186, (2019).(With N. Hadjisavvas and J. E. Martínez-Legaz)

    DOWNLOAD

We introduce a new asymptotic function, which is mainly adapted to quasiconvex functions. We establish several properties and calculus rules for this concept and compare it to previous notions of generalized asymptotic functions. Finally, we apply our new definition to quasiconvex optimization problems: we characterize the boundedness of the function, and the nonemptiness and compactness of the set of minimizers. We also provide a sufficient condition for the closedness of the image of a nonempty closed and convex set via a vector-valued function.

Keywords: Asymptotic cones; Asymptotic functions; Quasiconvexity; Nonconvex optimization; Closedness criteria

  • The q-Asymptotic Function in c-Convex Analysis. Optimization, Vol. 68, Issue 7, 1429–1445, (2019), (with A. Iusem).

    DOWNLOAD

We establish several connections between generalized asymptotic functions and different areas of convexity theory, without coercivity assumptions. Properties and characterizations of abstract subdifferentials, normal cones, conjugates, support functions and optimality conditions for the minimization problem are given. We provide a new result on existence of minimizers for a class of nonconvex functions which is strictly larger than the class of quasiconvex ones.

Keywords: Asymptotic functions, generalized convexity, conjugacies, subdifferentials, support functions, optimality conditions.

  • Quadratic Fractional Programming under Asymptotic Analysis J. of Convex Anal., Vol. 26, Issue 1, 15–32, (2019).

    DOWNLOAD

This paper considers the quadratic fractional programming problem, which minimizes a ratio of two functions: a quadratic (not necessarily convex) function over an affine function, on an unbounded set. As is well known, if the quadratic function is convex or quasiconvex, then the quadratic fractional function is pseudoconvex, a particular case of the quasiconvex minimization problem. Thus, we develop optimality conditions for the general case by introducing a generalized asymptotic function to deal with quasiconvexity. We establish two characterization results for the nonemptiness and compactness of the set of minimizers of any quasiconvex function. In addition, an extension of the Frank-Wolfe theorem from the quadratic to the quadratic fractional problem is given. Finally, applications to pseudoconvex quadratic fractional programming are also provided.

Keywords: Asymptotic functions, Second order asymptotic functions, Nonconvex optimization, Optimality conditions, Quasiconvexity, Frank-Wolfe theorem, Quadratic fractional programming.
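
In coordinates, the problem described above can be written as follows, where Q is a symmetric (possibly indefinite) matrix, C is a closed and possibly unbounded set, and the affine denominator is assumed positive on C; the symbols are illustrative, not the paper's exact notation.

```latex
% Quadratic fractional programming (illustrative formulation):
% Q symmetric, possibly indefinite; C closed and possibly unbounded;
% the affine denominator is assumed positive on C.
\min_{x \in C}\ \frac{\tfrac{1}{2}\, x^{\top} Q x + c^{\top} x + d}{a^{\top} x + b},
\qquad a^{\top} x + b > 0 \ \text{on } C.
```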

  • Quasiconvex Optimization Problems and Asymptotic Analysis in Banach Spaces Optimization, DOI: 10.1080/02331934.2019.1612893, (2020). (With A. Iusem).

    DOWNLOAD

We use asymptotic analysis for dealing with quasiconvex optimization problems in reflexive Banach spaces. We study generalized asymptotic (recession) cones for nonconvex and nonclosed sets and their respective generalized asymptotic functions. We prove that the generalized asymptotic functions defined in previous works directly through closed formulae can also be generated from the generalized asymptotic cones. We establish three characterization results for the nonemptiness and compactness of the solution set for noncoercive quasiconvex minimization problems using different asymptotic functions. Finally, we present a sufficient condition for the nonemptiness and boundedness of the solution set for quasiconvex pseudomonotone equilibrium problems.

Keywords: Asymptotic cones; Asymptotic functions; Quasiconvexity; Nonconvex optimization; Equilibrium problems.

  • A note on “Regularizers for structured sparsity”. Adv. Comp. Math., Vol. 44, Issue 4, 1321–1323, (2018).

    DOWNLOAD

In this note, the notion of admissible sets contained in the strictly positive orthant, introduced in [5], is analyzed. This notion was used to generalize theoretical results and optimization methods for structured sparsity. Unfortunately, we prove that there is no such generalization using admissible sets.

Keywords: Convex optimization; Feature selection; LASSO; Regularization; Sparse estimation

Research Projects

6-. ANID–Chile through Fondecyt Regular 1241040: “A further study on quasiconvex functions with applications in continuous optimization and variational inequalities”. From April 2024 until March 2027.

5-. ANID–Chile through Fondecyt Regular 1220379: «Further developments on quasiconvex functions with applications in continuous optimization and equilibrium problems». From April 2022 until March 2024.

4-. INCTMat through CAPES (Brazilian research project in collaboration with Alfredo Iusem): “Proximal Point Algorithms for Classes of Nonconvex Functions”. From August 2020 to July 2022.

3-. UTA Mayor 4749-20 of 2020 (Universidad de Tarapacá internal research project): “Global directional derivatives in nonsmooth continuous optimization and variational inequalities”. From August 2020 to July 2022.

2-. CONICYT–CHILE through Fondecyt Iniciación 11180320: “Further developments in asymptotic analysis under generalized convexity assumptions with applications in continuous optimization and variational inequalities”. From November 2018 until October 2021.

1-. CONICYT–CHILE through Fondecyt Postdoctorado 3160205: “Asymptotic analysis under generalized convexity assumptions with applications in optimization and nonlinear analysis”. From November 2015 to October 2018.

Conferences and Talks

32-. Seminar of the School of Applied Mathematics, Fundação Getúlio Vargas (EMAp/FGV). «An overview on strongly quasiconvex functions with applications». August 18 of 2022, Rio de Janeiro, Brazil.

31-. Capricorn Mathematics Congress (COMCA). “Proximal point algorithms for quasiconvex pseudomonotone equilibrium problems”, August 03-05 of 2022, Iquique, Chile.

30-. Webinar of the Graduate Program in Mathematics, Universidade Federal do Amazonas (PPGM-UFAM): “On strongly quasiconvex functions: existence results and proximal point algorithms”. March 30 of 2022, Manaus, Brazil.

29-. Optimization Seminars: «On strongly quasiconvex functions: theory and applications». October 06 of 2021, Santiago, Chile (online Seminar).

28-. Jornadas Matemáticas de la Zona Sur: «On strongly quasiconvex functions: existence results and proximal point algorithms». April 24 of 2021, Temuco, Chile. (Online Conference).

27-. I Congreso Internacional de Matemáticas: «A study of nonconvex optimization problems via global derivatives». December 11 of 2020, Arequipa, Perú. (Online Conference).

26-. Nonsmooth Optimization and Its Applications: “Global Directional Derivatives with Applications to Generalized Convexity and Generalized Monotonicity”. September 17 of 2020, Urmia, Iran. (Online Conference).

25-. Colloquium of the Department of Mathematics: “Characterizations of nonconvex optimization problems and variational inequalities via global derivatives”. August 27 of 2020, Arica, Chile.

24-. ISORA: 14th International Seminar on Optimization and Related Areas. “Optimality conditions for nonconvex optimization problems via global derivatives”. October 07-11 of 2019, Lima, Perú.

23-. XIII Brazilian Workshop on Continuous Optimization. “Global directional derivatives for nonconvex optimization problems”. September 23-27 of 2019, Rio de Janeiro, Brazil.

22-. 6th International Conference on Continuous Optimization (ICCOPT). “Existence results for equilibrium problems and mixed variational inequalities”. August 05-08 of 2019, Berlin, Germany.

21-. 71st Workshop: Advances in Nonsmooth Analysis and Optimization. “Optimality conditions for nonconvex optimization via global derivatives”. June 24 to July 01 of 2019, Erice, Italy.

20-. Optimization Colloquium: “Asymptotic analysis for quasiconvex optimization problems”. August 23 of 2018, University of São Paulo, São Paulo, Brazil.

19-. XII Brazilian Workshop on Continuous Optimization. “A new quasiconvex asymptotic function with applications in Optimization”, July 23-27 of 2018, Foz do Iguaçu, Brazil.

18-. Colloquium & International Conference on Variational Analysis and Nonsmooth Optimization. “Optimality conditions for vector equilibrium problems with applications”, From June 28 to July 01 of 2018, Halle, Germany.

17-. Workshop on Nonlinear Analysis and its Applications; “An overview on quasiconvex asymptotic analysis: theory and applications”. June 23 of 2018, Institute of Physics and Mathematics, University of Isfahan, Isfahan, Iran.

16-. NAOP 2018: Fourth Conference in Nonlinear Analysis and Optimization. “A quasiconvex asymptotic function with applications in optimization”, June 19-22 of 2018, Zanjan, Iran.

15-. Seminar of the Graduate Program in Mathematics (PRPPG): “Asymptotic analysis with applications in quasiconvex optimization”, March 23 of 2018, Universidade Federal do Paraná, Curitiba, Brazil.

14-. ISORA: 13th International Seminar on Optimization and Related Areas. “Optimality conditions for pseudomonotone equilibrium problems”. October 09-13 of 2017, Lima, Perú.

13-. XI International Conference on Parametric Optimization and Related Topics. “Quadratic fractional programming under asymptotic analysis”, September 19-22 of 2017, Prague, Czech Republic.

12-. XII International Symposium on Generalized Convexity and Monotonicity. “Asymptotic analysis for quasiconvex functions with applications”. August 27 to September 02 of 2017, Hajdúszoboszló, Hungary.

11-. “A new second order asymptotic function and applications to quadratic programming”, December 14 of 2016, King Fahd University of Petroleum and Minerals, Dhahran, Kingdom of Saudi Arabia.

10-. NewTOVAA: New Trends in Optimization and Variational Analysis for Applications. “Asymptotic function for generalized monotone operators”. December 07-10 of 2016, Quy Nhon, Vietnam.

9-. 28th EURO 2016: European Conference on Operational Research. “Generalized asymptotic functions and applications in multiobjective optimization”, July 03-06 of 2016, Poznan, Poland.

8-. EUROPT 2016: Workshop on Advances in Continuous Optimization. “The q-asymptotic function in generalized convexity theory”. July 01-02 of 2016, Warsaw, Poland.

7-. XI Brazilian Workshop on Continuous Optimization. “Generalized asymptotic functions in generalized convexity theory”. May 22-27 of 2016, Manaus, Brazil.

6-. IMPA Optimization Seminar: “Generalized asymptotic functions and their connections with generalized convexity theory”, February 18 of 2016, Instituto Nacional de Matemática Pura e Aplicada (IMPA), Rio de Janeiro, Brazil.

5-. ISORA: 12th International Seminar on Optimization and Related Areas. “Asymptotic analysis for quasiconvex functions”. October 05-09 of 2015, Lima, Perú.

4-. Optimization and Equilibrium Seminar: “Second order asymptotic analysis: theory and applications”, September 10 of 2014, Center for Mathematical Modeling (CMM), University of Chile, Santiago, Chile.

3-. Workshop in Optimization; “Second order asymptotic analysis in the convex case”, November 29 of 2013, Departamento de Ingeniería Matemática, University of Concepción, Concepción, Chile.

2-. Capricorn Mathematics Congress (COMCA).“Asymptotical approach for solution sets in non-convex vector optimization”, August 04-06 of 2010, Arica, Chile.

1-. ICCOPT: International Conference on Continuous Optimization. “Efficiency, proper efficiency and weak efficiency in multi-objective optimization”, July 26-29 of 2010, Santiago, Chile.

Teaching

5-. Nonsmooth Analysis, II-2021 (Master's degree course).

4-. Convex Analysis, I-2021 (Master's degree course).

3-. Topics in Optimization and Calculus of Variations, II-2020.

2-. Algebraic Structures, I-2020.

1-. Linear Algebra, II-2019.
