AsynML

  – an asynchronous parallel algorithm package for machine learning

AsynML (download AsynML_code.zip) is an asynchronous parallel algorithm package for solving several popular machine learning problems on multi-core / multi-socket machines. The core algorithms are implemented in C (‘AsynML_code/src’), and interface functions written in Matlab are provided for each problem (‘AsynML/matlab’). Test benchmarks/examples showing how to use these functions are included in ‘AsynML’. All the code has been tested under Linux and macOS.

We apply the ‘Asynchronous Stochastic Coordinate Descent’ algorithm to solve the following problems: Asyn-LeastSquare/Asyn-SparseLeastSquare, Asyn-RidgeRegression/Asyn-SparseRidgeRegression, Asyn-Lasso/Asyn-SparseLasso, and Asyn-ElasticNet/Asyn-SparseElasticNet; the ‘Asynchronous Parallel Randomized Kaczmarz’ algorithm is used to solve Asyn-LinearSystem.

The current release is 1.1. Small changes may be made from time to time. The latest updates were made on 11/20/2016. Here is the Copyright Statement.

1. Asyn-LeastSquare / Asyn-SparseLeastSquare

\(
\begin{array}{ll}
\text{min}_w & {1 \over 2} \|Xw-y\|^2\\
\text{s.t.} & lb \leq w \leq ub \\
& X \text{ is dense / sparse.}\\
\end{array}
\)

2. Asyn-RidgeRegression / Asyn-SparseRidgeRegression

\(
\begin{array}{ll}
\text{min}_w & {1 \over {2 N_{sample}}} \|Xw-y\|^2 + {l_2 \over 2} \|w\|^2\\
\text{s.t.} & lb \leq w \leq ub \\
& X \text{ is dense / sparse.}\\
\end{array}
\)

3. Asyn-Lasso / Asyn-SparseLasso

\(
\begin{array}{ll}
\text{min}_w & {1 \over {2 N_{sample}}} \|Xw-y\|^2 + l_1 \|w\|_1\\
\text{s.t.} & lb \leq w \leq ub \\
& X \text{ is dense / sparse.}\\
\end{array}
\)
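For intuition, the per-coordinate update that stochastic coordinate descent applies to this composite objective combines a gradient step on the smooth part with soft-thresholding and projection onto the box. This is the standard derivation for an \(\ell_1\)-regularized, box-constrained problem, not a formula taken from AsynML's source:

\(
w_j \leftarrow \Pi_{[lb,\,ub]}\!\left( \mathcal{S}_{l_1 / L_j}\!\left( w_j - \frac{1}{L_j} \nabla_j f(w) \right) \right),
\qquad
\mathcal{S}_{\kappa}(z) = \operatorname{sign}(z)\,\max(|z| - \kappa,\, 0),
\)

where \(f(w) = \frac{1}{2 N_{sample}} \|Xw - y\|^2\) is the smooth part, \(L_j\) is its coordinate-wise Lipschitz constant, and in the asynchronous setting \(\nabla_j f(w)\) is evaluated at a possibly stale copy of \(w\). (In one dimension, minimizing a convex function over an interval amounts to clipping its unconstrained minimizer, which is why the projection can be applied after the soft-threshold.)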

4. Asyn-ElasticNet / Asyn-SparseElasticNet

\(
\begin{array}{ll}
\text{min}_w & {1 \over {2 N_{sample}}} \|Xw-y\|^2 + {l_2 \over 2} \|w\|^2 + l_1 \|w\|_1\\
\text{s.t.} & lb \leq w \leq ub \\
& X \text{ is dense / sparse.}\\
\end{array}
\)

5. Asyn-LinearSystem

\(
A x = b \quad (A \text{ is sparse.})
\)

References

Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties. Ji Liu and Stephen J. Wright. SIAM Journal on Optimization. 2015.
An Asynchronous Parallel Randomized Kaczmarz Algorithm. Ji Liu and Stephen J. Wright. Mathematics of Computation. 2016.
Asynchronous Parallel Stochastic Gradient for Nonconvex Optimization. Xiangru Lian, Yijun Huang, Yuncheng Li, and Ji Liu. NIPS. 2015.
An Asynchronous Parallel Stochastic Coordinate Descent Algorithm. Ji Liu, Stephen J. Wright, Christopher Ré, Victor Bittorf, and Srikrishna Sridhar. Journal of Machine Learning Research. 2015.
Exclusive Sparsity Norm Minimization with Random Groups via Cone Projection. Yijun Huang and Ji Liu. 2015.