Kazuhiro HISHINUMA




Kazuhiro HISHINUMA, MSc.

Research Fellow of the Japan Society for the Promotion of Science;
doctoral student at Meiji University, Japan.

mailto: kaz suffix | ORCID | GNU Social

Current Specialized Field

Continuous optimization theory (convex and quasi-convex optimization; distributed optimization), fixed point theory.
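As a toy illustration of this research area, a projected subgradient step for a constrained nonsmooth convex problem can be sketched as follows. This is a minimal one-dimensional example: the objective |x - 3|, the constraint interval [0, 2], and the diminishing stepsize 1/(k+1) are illustrative choices, not taken from any of the papers listed below.

```python
def proj_interval(x, lo, hi):
    # Euclidean projection onto the interval [lo, hi].
    return max(lo, min(hi, x))

def subgrad_abs(x, target=3.0):
    # A subgradient of the nonsmooth objective f(x) = |x - target|.
    if x > target:
        return 1.0
    if x < target:
        return -1.0
    return 0.0  # at the kink, any value in [-1, 1] is a valid subgradient

def projected_subgradient(x0, steps=100):
    # x_{k+1} = P_C(x_k - t_k * g_k), with diminishing stepsize t_k = 1/(k+1).
    x = x0
    for k in range(steps):
        g = subgrad_abs(x)
        t = 1.0 / (k + 1)
        x = proj_interval(x - t * g, 0.0, 2.0)
    return x
```

Here the constrained minimizer is the boundary point x = 2, and the iterates approach it monotonically from x0 = 0.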

Academic & Professional Experience


Preprints

  1. K. Hishinuma, H. Iiduka: Incremental and Parallel Line Search Subgradient Methods for Constrained Nonsmooth Convex Optimization – Numerically accelerated results by multi-core computing, arXiv:1605.03738, 2016.

Publications in Refereed Journals

  1. K. Hishinuma, H. Iiduka: On Acceleration of the Krasnosel’skii-Mann Fixed Point Algorithm Based on Conjugate Gradient Method for Smooth Optimization, Journal of Nonlinear and Convex Analysis, Vol. 16, No. 11, pp. 2243-2254, 2015. [BiBTeX, Open Access]
  2. K. Hishinuma, H. Iiduka: Parallel Subgradient Method for Nonsmooth Convex Optimization with a Simple Constraint, Linear and Nonlinear Analysis, Vol. 1, No. 1, pp. 67-77, 2015. [BiBTeX, Open Access]
  3. H. Iiduka, K. Hishinuma: Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms, SIAM Journal on Optimization, Vol. 24, Issue 4, pp. 1840-1863, 2014. [BiBTeX, DOI, PDF]
  4. S. Iemoto, K. Hishinuma, H. Iiduka: Approximate Solutions to Variational Inequality over the Fixed Point Set of a Strongly Nonexpansive Mapping, Fixed Point Theory and Applications, 2014:51, 2014. [BiBTeX, Open Access]
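The Krasnosel'skii–Mann fixed point iteration studied in the first publication above can be sketched as follows. This is an illustrative one-dimensional example with a fixed averaging coefficient and the nonexpansive mapping cos, not the conjugate-gradient-based acceleration proposed in that paper.

```python
import math

def krasnoselskii_mann(T, x0, steps=100, alpha=0.5):
    # Krasnosel'skii–Mann iteration for a nonexpansive mapping T:
    #   x_{k+1} = (1 - alpha) * x_k + alpha * T(x_k)
    x = x0
    for _ in range(steps):
        x = (1.0 - alpha) * x + alpha * T(x)
    return x

# cos is nonexpansive on the reals (its derivative is bounded by 1),
# so the iterates converge to its unique fixed point x = cos(x).
x_star = krasnoselskii_mann(math.cos, 0.0)
```

The returned point satisfies x_star ≈ cos(x_star), i.e. it approximates the fixed point of cos near 0.739.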

Other Publications

  1. K. Hishinuma, H. Iiduka: Parallel Computing Method for Nonsmooth Convex Optimization, RIMS Kôkyûroku, No. 1963-11, pp. 71-77, 2015. [BiBTeX, CiNii, Open Access]

Conference Activities & Talks

  1. K. Hishinuma, H. Iiduka: Flexible stepsize selection of subgradient methods for constrained convex optimization, The 10th Anniversary Conference on Nonlinear Analysis and Convex Analysis, 2017.
  2. K. Hishinuma, H. Iiduka: Acceleration approach for parallel subgradient method based on line search, The 2016 Fall National Conference of Operations Research Society of Japan, 2016.
  3. K. Hishinuma, H. Iiduka: On Parallel Computing Method for Nonsmooth Convex Optimization, The 2014 Fall National Conference of Operations Research Society of Japan, 2014.
  4. K. Hishinuma, H. Iiduka: Parallel Algorithm for Nonsmooth Convex Optimization, The International Workshop on Nonlinear Analysis and Convex Analysis, 2014.

Honors and Awards


Association Memberships

The Operations Research Society of Japan; the Information Processing Society of Japan.


Mathematical Optimization (Hideaki Iiduka) Lab.,
Department of Computer Science, Meiji University
1-1-1 Higashimita, Tama-ku, Kawasaki-shi, Kanagawa 214-8571 Japan.
(Tel: +81-44-934-7484, Web: )
