A Restricted Boltzmann Machine (RBM) is a probabilistic, energy-based model with a two-layer architecture in which the visible units are connected to the hidden units. It is an unsupervised learning algorithm that infers structure from input data without labeled responses. RBMs can cope with raw-data issues such as missing values, noisy labels, and unstructured data. An RBM is a special class of Boltzmann Machine (BM) with a single hidden layer and a bipartite connection pattern: every neuron in the visible layer is connected to every neuron in the hidden layer, but neurons within the same layer are not connected to each other. Advantages of RBM networks include computational efficiency, compact encoding of a wide range of distributions, faster training than traditional BM models, and high performance when the activations of the hidden units are fed as input to other models. An RBM operates in two phases, a feed-forward pass and a feed-backward pass, in which the hidden activations and a reconstructed input are produced to generate the pattern for activating the hidden neurons. RBMs are commonly used for classification and regression problems. The most popular applications of the Restricted Boltzmann Machine are dimensionality reduction, collaborative filtering, feature learning, topic modeling, recommender systems, and handwritten digit recognition. Future research areas for RBMs include class RBMs with imbalanced data, RBMs with sparse learning, topological features in locally connected RBMs, multi-modal analysis of group data in neuroimaging, and more.
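The two phases described above can be sketched in code. The following is a minimal, illustrative NumPy implementation of a Bernoulli-Bernoulli RBM trained with one-step contrastive divergence (CD-1); the class name, toy data, and hyperparameters are assumptions for illustration, not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal Bernoulli-Bernoulli RBM with CD-1 training (illustrative sketch)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))  # visible-hidden weights
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        # Feed-forward pass: P(h = 1 | v)
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        # Feed-backward pass: P(v = 1 | h), i.e. the reconstruction
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        # Positive phase: infer hidden activations from the data
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden states
        # Negative phase: reconstruct the visibles, then re-infer the hiddens
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # CD-1 gradient approximation and parameter update
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - pv1) ** 2)  # reconstruction error

# Toy binary data: two repeated patterns (hypothetical example input)
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)
rbm = RBM(n_visible=4, n_hidden=2)
errors = [rbm.cd1_step(data) for _ in range(200)]
print(f"first error {errors[0]:.3f}, last error {errors[-1]:.3f}")
```

After training, `rbm.hidden_probs(data)` yields the hidden activations that, as noted above, can be fed as features into a downstream classifier or regressor.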