Yugoslav Journal of Operations Research 23 (2013), Number 1, 59-71
DOI:

ON AN ALGORITHM IN NONDIFFERENTIAL CONVEX OPTIMIZATION

Nada I. ĐURANOVIĆ-MILIČIĆ
Department of Mathematics, Faculty of Technology and Metallurgy, University of Belgrade, Belgrade, Serbia
nmilicic@

Milanka GARDAŠEVIĆ-FILIPOVIĆ
Vocational College of Technology, Arandjelovac, Serbia
milankafilipovic@

Received: May 2011 / Accepted: October 2012

Abstract: In this paper, an algorithm for the minimization of a nondifferentiable function is presented. The algorithm uses the Moreau-Yosida regularization of the objective function and its second-order Dini upper directional derivative. The purpose of the paper is to establish general hypotheses under which this algorithm converges to optimal points. A convergence proof is given, as well as an estimate of the rate of convergence.

Keywords: Moreau-Yosida regularization, non-smooth convex optimization, directional derivative, second-order Dini upper directional derivative, uniformly convex functions.

MSC: 90C30; 90C25; 65K05.

1. INTRODUCTION

The following minimization problem is considered:

    min_{x∈R^n} f(x),    (1)

where f: R^n → R ∪ {+∞} is a convex, not necessarily differentiable function with a nonempty set X* of minima. Many approaches have been presented for non-smooth programs, but they are often restricted to the convex unconstrained case.
The reason for this restriction is that a constrained problem can easily be transformed into an unconstrained one using a distance function. In general, the various approaches are based on combinations of the following methods: subgradient methods, bundle techniques, and the Moreau-Yosida regularization. For a convex function f, it is very important that its Moreau-Yosida regularization is a new function that has the same set of minima as f and is, moreover, differentiable even when f itself is not.
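As a small numerical illustration of this property (not part of the paper's algorithm), the sketch below evaluates the Moreau-Yosida regularization F_λ(x) = min_y { f(y) + (1/(2λ))‖y − x‖² } of the nondifferentiable function f(x) = |x| by brute-force minimization over a fine grid. The function name `moreau_envelope` and all numerical parameters are illustrative choices, not from the source; for f = |·| the regularization is the smooth Huber function, and its minimizer coincides with that of f.

```python
import numpy as np

def moreau_envelope(f, x, lam=1.0, lo=-10.0, hi=10.0, n=200001):
    """Numerically evaluate the Moreau-Yosida regularization
    F_lam(x) = min_y { f(y) + (1/(2*lam)) * (y - x)**2 }
    by minimizing over a fine grid (for illustration only)."""
    y = np.linspace(lo, hi, n)
    vals = f(y) + (y - x) ** 2 / (2.0 * lam)
    i = int(np.argmin(vals))
    # Return the envelope value and the (approximate) proximal point.
    return vals[i], y[i]

# f(x) = |x| is convex but nondifferentiable at its minimizer x* = 0.
f = np.abs

# The envelope of |.| is the smooth Huber function:
#   F_lam(x) = x**2 / (2*lam) if |x| <= lam, else |x| - lam/2,
# which attains its minimum at the same point x* = 0 as f itself.
val_at_min, prox_at_min = moreau_envelope(f, 0.0)   # both approx. 0.0
val, prox = moreau_envelope(f, 2.0, lam=1.0)        # approx. (1.5, 1.0)
```

Note that the envelope value at x = 2 matches the Huber formula |2| − 1/2 = 1.5, and the proximal point 1.0 is the soft-thresholding of 2 by λ = 1, consistent with the smoothing preserving the set of minima.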