Book recommendation: Scalable machine learning for bigger data

I am aware of the theory of stochastic gradient descent (SGD), which fits models such as linear regression much faster on large datasets than batch methods, giving an 'optimized implementation' of linear regression. Similar techniques exist for non-parametric methods as well, letting them converge faster with respect to the cost function.
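To make concrete what I mean by an "optimized implementation", here is a rough sketch of SGD for linear regression in plain NumPy (purely illustrative; the function and parameter names are my own, not from any particular book):

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=20, seed=0):
    """Fit y ~ X @ w + b by stochastic gradient descent on squared error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):      # visit one example at a time, in random order
            err = X[i] @ w + b - y[i]     # prediction error for this example
            w -= lr * err * X[i]          # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err                 # gradient of 0.5 * err**2 w.r.t. b
    return w, b

# toy usage: recover a known linear relationship
X = np.random.randn(1000, 3)
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 3.0 + 0.01 * np.random.randn(1000)
w, b = sgd_linear_regression(X, y)
print(w, b)   # should be close to [2, -1, 0.5] and 3.0
```

Since each update touches only one example, the data never has to fit in memory all at once, which is why this kind of method appeals to me for bigger datasets.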

I am looking for a book with worked-out implementations or examples of these kinds of optimized models, with R/Python code or pseudocode, so that I can run sophisticated machine learning algorithms faster without upgrading my hardware (though I am open to upgrading it). What interests me most are faster, scalable implementations of machine learning algorithms for bigger data.

Thanks!!

@azimulh, you need to adjust your expectations. Machine learning is not really at a point where there is a lot of well-structured material available on this. If you want to optimize a particular learning algorithm, you will need to look for papers that do so. That said, a couple of books that discuss the inner workings of learning algorithms are Elements of Statistical Learning and the Deep Learning Book.
