Research Article | Open Access
Volume 6 | Issue 10 | Year 2019 | Article Id. IJCSE-V6I10P105 | DOI: https://doi.org/10.14445/23488387/IJCSE-V6I10P105

A Study of Logistic Regression And Its Optimization Techniques Using Octave


Annapoorani Anantharaman

Citation:

Annapoorani Anantharaman, "A Study of Logistic Regression And Its Optimization Techniques Using Octave," International Journal of Computer Science and Engineering , vol. 6, no. 10, pp. 23-28, 2019. Crossref, https://doi.org/10.14445/23488387/IJCSE-V6I10P105

Abstract

A classification problem requires a discrete, often binary, output even when the input values are real numbers. Linear regression is not suited to such problems; logistic regression is used instead. Logistic regression is a supervised learning algorithm that estimates the probability of an outcome for a given input. It is applied to real-world problems in many fields, and its performance can be improved through optimization techniques. This paper describes logistic regression and several of its optimization techniques, together with the relevant performance metrics, using a case study of graduate school admissions.
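To make the setting concrete, the sketch below shows how logistic regression is typically fitted in Octave: a sigmoid hypothesis, a cross-entropy cost with its gradient, and the built-in fminunc optimizer, which applies a quasi-Newton (BFGS-family) method when the gradient is supplied. The variable names, the tiny admit/reject data set, and the fminunc settings are illustrative assumptions for this note, not code or data taken from the paper.

1;  % marks this file as an Octave script so it may contain local function definitions

function g = sigmoid(z)
  % Logistic (sigmoid) function, applied element-wise.
  g = 1 ./ (1 + exp(-z));
endfunction

function [J, grad] = costFunction(theta, X, y)
  % Cross-entropy cost and its gradient for logistic regression.
  m = length(y);
  h = sigmoid(X * theta);
  J = -(1 / m) * (y' * log(h) + (1 - y)' * log(1 - h));
  grad = (1 / m) * X' * (h - y);
endfunction

% Illustrative admissions data (not from the paper): each row of X is
% [1, score1, score2], the leading 1 being the intercept term; y marks
% admitted (1) versus rejected (0).
X = [1 34 78; 1 60 86; 1 79 75; 1 45 56; 1 61 96; 1 75 46; 1 45 85; 1 82 80];
y = [0; 1; 1; 0; 1; 0; 1; 0];

% fminunc uses a quasi-Newton (BFGS-family) update when the gradient is
% supplied, so no learning rate has to be tuned by hand.
initial_theta = zeros(size(X, 2), 1);
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t) costFunction(t, X, y), initial_theta, options);

% Estimated probability of admission for a new applicant with scores 55 and 70.
p = sigmoid([1 55 70] * theta);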

Keywords

Logistic regression, optimization techniques, Newton-Raphson method, BFGS, L-BFGS, gradient descent, conjugate gradient descent
