LHAC
by Xiaocheng Tang [http://goo.gl/6QuMl]
LHAC stands for Low-rank Hessian Approximation in Active-set Coordinate descent (paper), an algorithm for minimizing composite functions, i.e.,

min_x f(x) + g(x)

where f(x) can be any smooth function, e.g., logistic loss, square loss, etc., and g(x) is assumed to be simple, e.g., the l1 norm, l1/l2 norm, etc. There are currently two varieties of LHAC:
Lcc and Lss both implement LHAC, but are targeted at different sets of problems. In particular, Lss is coded specifically for the sparse inverse covariance selection problem and mainly handles variables as a matrix constrained to be positive definite, whereas Lcc is implemented for general composite minimization that treats variables as a vector with no extra constraints.
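To illustrate the composite-minimization setting that LHAC targets, here is a minimal sketch of a simpler first-order method for the same problem class: proximal gradient descent (ISTA) on an l1-regularized least-squares problem. This is not the LHAC algorithm itself (LHAC additionally uses a low-rank Hessian approximation and active-set coordinate descent); all names and parameters below are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding), the "simple" g(x) step
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(A, b, lam, step, iters=500):
    # Minimize f(x) + g(x) with f(x) = 0.5 * ||Ax - b||^2 (smooth)
    # and g(x) = lam * ||x||_1 (simple, handled via its prox)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy problem: recover a sparse vector from noisy linear measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)

step = 1.0 / np.linalg.norm(A, 2) ** 2                # 1/L, L = Lipschitz constant of grad f
x_hat = prox_gradient(A, b, lam=0.5, step=step)
```

Quasi-Newton methods like LHAC replace the scaled-identity step above with a low-rank (L-BFGS-style) Hessian approximation, which typically converges in far fewer iterations on ill-conditioned problems.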
For more details on the two packages, please visit their repositories on Github.
Citation
If you use LHAC in your research, please cite the following paper:
 Katya Scheinberg and Xiaocheng Tang, Practical Inexact Proximal Quasi-Newton Method with Global Complexity Analysis, submitted, 2014 (BibTeX)