Algorithms for Convex Optimization

A special issue of Algorithms (ISSN 1999-4893).

Deadline for manuscript submissions: closed (31 March 2022)

Special Issue Editor

Guest Editor
Faculty of Mathematics, University of Vienna, 1090 Vienna, Austria
Interests: convex analysis; convex optimization; monotone operators; vector optimization

Special Issue Information

Dear colleagues,

In this Special Issue of Algorithms, we welcome contributions that present and/or analyze iterative methods for solving convex optimization problems, as well as real-life problems that can be modeled as such. Of special interest are novel approaches to splitting and proximal point type methods, significant improvements of existing ones, iterative procedures for solving constrained convex optimization problems, and extensions of known algorithms to multiobjective and nonconvex optimization. Iterative methods for solving monotone inclusions, or interpretations of such methods by means of dynamical systems, may also be considered.
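As a minimal illustration of the splitting proximal methods in scope, the sketch below implements forward-backward splitting (proximal gradient) for the lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁; all names, the test problem, and the step-size choice are illustrative assumptions, not taken from any submission.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1:
    a gradient step on the smooth term followed by the prox of the nonsmooth one."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # forward (gradient) step direction
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x

# small noiseless sanity run on synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2        # step = 1/L, L the Lipschitz constant
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

With a noiseless right-hand side and a small regularization weight, the iterates recover the sparse generator up to the usual ℓ₁ shrinkage bias.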

Dr. Sorin-Mihai Grad
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Proximal point method
  • Convex optimization problem
  • Constrained optimization problem
  • Splitting technique
  • Image processing
  • Machine learning
  • Stochastic proximal algorithm
  • Location theory
  • Game theory

Published Papers (1 paper)


Research

14 pages, 674 KiB  
Article
A Linearly Involved Generalized Moreau Enhancement of ℓ2,1-Norm with Application to Weighted Group Sparse Classification
by Yang Chen, Masao Yamagishi and Isao Yamada
Algorithms 2021, 14(11), 312; https://doi.org/10.3390/a14110312 - 27 Oct 2021
Cited by 2 | Viewed by 1810
Abstract
This paper proposes a new group-sparsity-inducing regularizer to approximate the ℓ2,0 pseudo-norm. The regularizer is nonconvex and can be seen as a linearly involved generalized Moreau enhancement of the ℓ2,1-norm. Moreover, the overall convexity of the corresponding group-sparsity-regularized least squares problem can be achieved. The model can handle general group configurations such as weighted group sparse problems, and can be solved through a proximal splitting algorithm. Among the applications, considering that the bias of a convex regularizer may lead to incorrect classification results, especially for unbalanced training sets, we apply the proposed model to the (weighted) group sparse classification problem. The proposed classifier can use the label, similarity and locality information of samples. It also suppresses the bias of convex regularizer-based classifiers. Experimental results demonstrate that the proposed classifier improves the performance of convex ℓ2,1 regularizer-based methods, especially when the training data set is unbalanced. This paper enhances the potential applicability and effectiveness of using nonconvex regularizers in the frame of convex optimization.
(This article belongs to the Special Issue Algorithms for Convex Optimization)
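The ℓ2,1-norm at the heart of this abstract has a closed-form proximal operator, block soft-thresholding, which is what makes proximal splitting algorithms applicable to such models. The sketch below is a generic illustration of that operator under an assumed group layout; it is not the paper's enhanced (nonconvex) regularizer, whose prox differs.

```python
import numpy as np

def prox_group_l21(x, groups, t):
    """Proximal operator of t * sum_g ||x_g||_2 (the l2,1 norm):
    each group is shrunk toward zero by its Euclidean norm, and
    groups with norm at most t are set exactly to zero."""
    out = np.zeros_like(x, dtype=float)
    for g in groups:
        ng = np.linalg.norm(x[g])
        if ng > t:
            out[g] = (1.0 - t / ng) * x[g]  # block soft-thresholding
    return out

# two groups: the first survives shrinkage, the second is zeroed out
x = np.array([3.0, 4.0, 0.1, -0.1])
groups = [[0, 1], [2, 3]]
y = prox_group_l21(x, groups, t=1.0)
# first group has norm 5 -> scaled by 1 - 1/5 = 0.8; second has norm ~0.14 <= 1 -> zero
# y == [2.4, 3.2, 0.0, 0.0]
```

Zeroing whole groups at once is exactly the group-sparsity effect the abstract refers to.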
