Stochastic Item Descent Method for Large Scale Equal Circle Packing Problem

01/22/2020
by Kun He, et al.

Stochastic gradient descent (SGD) is a powerful method for large-scale optimization problems in machine learning, especially for finite-sum formulations with numerous variables. In recent years, mini-batch SGD has achieved great success and has become the standard technique for training deep neural networks on large amounts of data. Inspired by its success in deep learning, we apply the idea of SGD with batch selection of samples to a classic optimization problem in its decision version. Given n unit circles, the equal circle packing problem (ECPP) asks whether there exists a feasible packing that places all the circles inside a circular container without overlap. Specifically, we propose a stochastic item descent method (SIDM) for large-scale ECPP, which randomly divides the unit circles into batches and iteratively runs the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm on the corresponding batch function to speed up the computation. We also increase the batch size during the batch iterations to obtain higher-quality solutions. Compared with the current best packing algorithms, SIDM greatly speeds up the optimization process while maintaining solution quality on large-scale instances with up to 1500 circle items, whereas the baseline algorithms usually handle about 300 circle items. The results indicate the high efficiency of SIDM for this classic optimization problem at large scale, and show its potential for other large-scale classic optimization problems in which gradient descent is used for optimization.
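As a rough illustration of the batch idea described above, the sketch below alternates BFGS optimization over randomly chosen batches of circle centers while the remaining centers stay fixed, and enlarges the batches in later rounds. This is a minimal sketch, not the authors' implementation: the quadratic overlap/containment penalty, the batch schedule, and the helper names (batch_energy, stochastic_item_descent) are assumptions made for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def batch_energy(flat_xy, batch_idx, centers, R, r=1.0):
    """Penalty over one batch of circles; circles outside the batch stay fixed.
    Assumed penalty form: squared pairwise overlap + squared container violation."""
    pts = centers.copy()
    pts[batch_idx] = flat_xy.reshape(-1, 2)
    bset = set(int(i) for i in batch_idx)
    e = 0.0
    n = len(pts)
    for i in range(n):
        for j in range(i + 1, n):
            if i in bset or j in bset:                 # only terms touching the batch
                d = np.linalg.norm(pts[i] - pts[j])
                e += max(0.0, 2 * r - d) ** 2          # overlap between circles i and j
    for i in bset:
        e += max(0.0, np.linalg.norm(pts[i]) + r - R) ** 2   # circle i sticking out of the container
    return e

def stochastic_item_descent(centers, R, rounds=5, init_batches=8, seed=0):
    """Alternate BFGS over random batches; halve the batch count (i.e. double the
    batch size) each round, mimicking the growing-batch idea in the abstract."""
    rng = np.random.default_rng(seed)
    n_batches = init_batches
    for _ in range(rounds):
        for batch_idx in np.array_split(rng.permutation(len(centers)), n_batches):
            res = minimize(batch_energy, centers[batch_idx].ravel(),
                           args=(batch_idx, centers, R), method="BFGS")
            centers[batch_idx] = res.x.reshape(-1, 2)
        n_batches = max(1, n_batches // 2)             # larger batches in later rounds
    return centers

# Illustrative usage: 50 unit circles in a container of assumed radius 8 (feasibility not guaranteed).
centers = np.random.default_rng(1).uniform(-6.0, 6.0, size=(50, 2))
packed = stochastic_item_descent(centers, R=8.0)
```

The batch energy keeps only penalty terms that involve at least one batch circle; terms between two fixed circles are constant with respect to the batch variables, so dropping them leaves the minimizer unchanged while shrinking the cost of each BFGS call, which is the source of the speedup the abstract describes.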

Related research:

Geometric Batch Optimization for the Packing Equal Circles in a Circle Problem on Large Scale (03/05/2023)
The problem of packing equal circles in a circle is a classic and famous...

Stochastic Gradient Made Stable: A Manifold Propagation Approach for Large-Scale Optimization (06/28/2015)
Stochastic gradient descent (SGD) holds as a classical method to build l...

Submodular Batch Selection for Training Deep Neural Networks (06/20/2019)
Mini-batch gradient descent based methods are the de facto algorithms fo...

Parallel Dither and Dropout for Regularising Deep Neural Networks (08/28/2015)
Effective regularisation during training can mean the difference between...

Stochastic Gradient Descent-like relaxation is equivalent to Glauber dynamics in discrete optimization and inference problems (09/11/2023)
Is Stochastic Gradient Descent (SGD) substantially different from Glaube...

An Efficient Solution Space Exploring and Descent Method for Packing Equal Spheres in a Sphere (05/17/2023)
The problem of packing equal spheres in a spherical container is a class...

Stochastic Ratios Tracking Algorithm for Large Scale Machine Learning Problems (05/17/2023)
Many machine learning applications and tasks rely on the stochastic grad...
