How to Build and Evaluate a Multi-Layer Perceptron (MLP) Classifier for Predicting Student Depression

MLP Classifier for Student Depression

Condition for Building and Evaluating a Multi-Layer Perceptron (MLP) Classifier for Student Depression Prediction

  • Description:
    This code preprocesses a student depression dataset by handling missing values, encoding categorical features, and scaling the inputs. It then builds and trains a Multi-Layer Perceptron (MLP) classifier to predict student depression from the remaining features. Model performance is evaluated with classification metrics such as accuracy, precision, recall, and F1-score, along with a confusion matrix.
Step-by-Step Process
  • Import Libraries:
    Import necessary libraries like pandas, sklearn, and matplotlib for data processing and model evaluation.
  • Load and Inspect Dataset:
    Load the student depression dataset and check for missing or null values.
  • Preprocess Data:
    Handle missing values, encode categorical features, and scale the input features.
  • Build and Train Model:
    Build an MLP classifier with specified hidden layers and train it on the preprocessed data.
  • Evaluate Model:
    Evaluate the model using classification metrics and visualize the results with a confusion matrix (see the optional visualization sketch after the sample code).
Sample Source Code
  • # Import Necessary Libraries
    import pandas as pd
    from sklearn.preprocessing import LabelEncoder, StandardScaler
    import matplotlib.pyplot as plt
    import seaborn as sns
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    import warnings
    warnings.filterwarnings("ignore")

    from sklearn.metrics import (classification_report, confusion_matrix, accuracy_score,
                                 f1_score, recall_score, precision_score)

    df = pd.read_csv("/home/soft12/Downloads/sample_dataset/Website/Dataset/Student Depression Dataset.csv")

    # Check for NaN values
    print("Check NaN values\n")
    print(df.isna().sum())

    # Drop rows containing NaN values, if any are present
    df = df.dropna()

    # Confirm no null values remain after dropping
    print("Check Null Values\n")
    print(df.isnull().sum())

    # Check dtypes of features
    print(df.dtypes)

    # Encode categorical (object) columns as integers
    label = LabelEncoder()
    for i in df.columns:
        if df[i].dtypes == 'object':
            df[i] = label.fit_transform(df[i])

    # Compute the correlation matrix between features
    correlation_matrix = df.corr()

    # Display the correlation matrix
    print(correlation_matrix)

    # Plot the heatmap
    plt.figure(figsize=(10, 8))
    sns.heatmap(correlation_matrix, annot=True, cmap='coolwarm', fmt='.2f', linewidths=0.5)
    plt.title('Correlation Heatmap')
    plt.show()

    x = df.drop('Depression', axis=1)
    y = df['Depression']

    # Count the number of samples per class
    class_counts = y.value_counts()

    # Plot the class distribution
    plt.figure(figsize=(8, 6))
    sns.barplot(x=class_counts.index, y=class_counts.values, palette="viridis")
    plt.title('Class Balance Check', fontsize=16)
    plt.xlabel('Class', fontsize=14)
    plt.ylabel('Count', fontsize=14)
    plt.xticks(fontsize=12)
    plt.yticks(fontsize=12)
    plt.grid(axis='y', linestyle='--', alpha=0.7)
    plt.show()

    # Scaling the input data
    scaler = StandardScaler()
    x = scaler.fit_transform(x)

    # Split the train-test data
    X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=.2, random_state=42)

    # Define the MLP Classifier
    mlp = MLPClassifier(hidden_layer_sizes=(128, 64, 32),
                        activation='relu',
                        solver='adam',
                        max_iter=10,
                        batch_size=2,
                        random_state=42,
                        verbose=True)

    # Train the model
    mlp.fit(X_train, y_train)

    # Make predictions
    y_pred = mlp.predict(X_test)

    print("___Performance_Metrics___\n")
    print('Classification_Report:\n', classification_report(y_test, y_pred))
    print('Confusion_Matrix:\n', confusion_matrix(y_test, y_pred))
    print('Accuracy_Score: ', accuracy_score(y_test, y_pred))
    print('F1_Score: ', f1_score(y_test, y_pred))
    print('Recall_Score: ', recall_score(y_test, y_pred))
    print('Precision_Score: ', precision_score(y_test, y_pred))
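  • Optional Visualization (Sketch):
    The sample code above prints the confusion matrix as raw counts. As a minimal sketch of the visualization mentioned in the Evaluate Model step, the snippet below renders the confusion matrix as a heatmap and plots the training loss recorded by the MLP. It reuses the y_test, y_pred, mlp, and plotting imports already defined above; the figure sizes, colormap, and loss-curve plot are illustrative choices, not part of the original sample.

    # Visualize the confusion matrix as a heatmap
    cm = confusion_matrix(y_test, y_pred)
    plt.figure(figsize=(6, 5))
    sns.heatmap(cm, annot=True, fmt='d', cmap='Blues',
                xticklabels=mlp.classes_, yticklabels=mlp.classes_)
    plt.xlabel('Predicted Label')
    plt.ylabel('True Label')
    plt.title('Confusion Matrix - MLP Classifier')
    plt.show()

    # Plot the training loss curve stored by the MLP during fitting
    plt.figure(figsize=(6, 4))
    plt.plot(mlp.loss_curve_)
    plt.xlabel('Iteration')
    plt.ylabel('Training Loss')
    plt.title('MLP Training Loss Curve')
    plt.show()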
Screenshots
  • MLP Classifier Output Screenshot