

Projects in Hyperbolic Deep Neural Networks


Python Projects in Hyperbolic Deep Neural Networks for Masters and PhD

    Project Background:
    Hyperbolic Deep Neural Networks (HDNNs) represent an innovative approach to deep learning that operates in hyperbolic space, a non-Euclidean geometry. Traditional deep learning models are designed for Euclidean spaces, which may not capture the hierarchical, tree-like structures often observed in complex datasets. The core idea of HDNNs is to harness the unique properties of hyperbolic geometry to enhance the learning capabilities of neural networks. By modeling data in hyperbolic space, HDNNs can effectively capture hierarchical relationships, handle high-dimensional data with reduced computational complexity, and exhibit improved generalization performance. This project aims to push the boundaries of deep learning by exploring novel architectures and optimization techniques tailored to hyperbolic spaces. Additionally, the project seeks to apply HDNNs to domains such as natural language processing, graph data analysis, and recommendation systems, where hierarchical structures are prevalent.
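
    As a concrete illustration of why hyperbolic space suits hierarchies, the sketch below computes the geodesic distance in the Poincaré ball model, one common realization of hyperbolic space. Distance grows rapidly toward the ball's boundary, leaving exponentially more room for tree leaves than Euclidean space does. A minimal NumPy sketch (function and variable names are illustrative):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit (Poincare) ball."""
    sq_diff = np.sum((u - v) ** 2)
    arg = 1.0 + 2.0 * sq_diff / ((1.0 - np.sum(u**2)) * (1.0 - np.sum(v**2)) + eps)
    return np.arccosh(arg)

origin = np.zeros(2)
mid = np.array([0.5, 0.0])          # halfway to the boundary
near_edge = np.array([0.95, 0.0])   # close to the boundary

# The Euclidean distance from the origin does not even double (0.5 -> 0.95),
# but the hyperbolic distance more than triples near the boundary.
print(poincare_distance(origin, mid))        # ln(3)  ~ 1.0986
print(poincare_distance(origin, near_edge))  # ln(39) ~ 3.6636
```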

    Problem Statement

  • Addressing limitations of Euclidean geometry in capturing hierarchical relationships present in many real-world datasets.
  • Developing efficient algorithms for training and inference in hyperbolic spaces to overcome computational challenges.
  • Improving the generalization ability of hyperbolic deep neural networks to handle diverse and complex data structures.
  • Enhancing interpretability of hyperbolic deep learning models to facilitate understanding and trust in their predictions.

    Aim and Objectives

  • Develop Hyperbolic Deep Neural Networks (HDNNs) for improved representation learning.
  • Design novel architectures that leverage hyperbolic geometry for capturing hierarchical structures.
  • Address computational challenges by developing efficient training and inference algorithms in hyperbolic spaces.
  • Enhance generalization capabilities of HDNNs to handle diverse and complex datasets.
  • Explore applications of HDNNs in domains like natural language processing and graph analysis.
  • Evaluate the performance of HDNNs against traditional Euclidean models to showcase their advantages.
  • Promote research and adoption of hyperbolic deep learning techniques in the machine learning community.

    Contributions to Hyperbolic Deep Neural Networks

  • Developed novel architectures for Hyperbolic Deep Neural Networks (HDNNs) to capture hierarchical structures efficiently.
  • Introduced efficient training and inference algorithms in hyperbolic spaces, addressing computational challenges.
  • Improved generalization capabilities of HDNNs, enabling them to handle diverse and complex datasets effectively.
  • Demonstrated the advantages of HDNNs over traditional Euclidean models in various applications, promoting their adoption in the machine learning community.

    Deep Learning Algorithms for Hyperbolic Deep Neural Networks

  • Hyperbolic Autoencoders
  • Hyperbolic Convolutional Neural Networks (HCNNs)
  • Hyperbolic Recurrent Neural Networks (HRNNs)
  • Hyperbolic Generative Adversarial Networks (HGANs)
  • Hyperbolic Graph Neural Networks (HGNNs)
  • Hyperbolic Attention Mechanisms
  • Hyperbolic Capsule Networks
  • Hyperbolic Variational Autoencoders (HVAEs)
  • Hyperbolic Transformer Networks
  • Hyperbolic Siamese Networks
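
    Most of the architectures above replace Euclidean vector operations with hyperbolic counterparts. A minimal sketch of two such primitives on the Poincaré ball, Möbius addition and the exponential map at the origin, in the formulation popularized by the hyperbolic neural network literature (names and the curvature parameter c are illustrative):

```python
import numpy as np

def mobius_add(x, y, c=1.0, eps=1e-9):
    """Mobius addition: the hyperbolic analogue of vector addition on the ball."""
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / (den + eps)

def expmap0(v, c=1.0, eps=1e-9):
    """Exponential map at the origin: lifts a tangent (Euclidean) vector onto the ball."""
    sqrt_c = np.sqrt(c)
    n = np.linalg.norm(v) + eps
    return np.tanh(sqrt_c * n) * v / (sqrt_c * n)

# A "hyperbolic linear layer" can then be sketched as expmap0(W @ x_tangent)
# followed by a Mobius bias addition.
x = expmap0(np.array([0.3, -0.2]))
b = expmap0(np.array([0.1, 0.1]))
h = mobius_add(x, b)
print(np.linalg.norm(h) < 1.0)  # True: the result stays inside the unit ball
```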

    Datasets for Hyperbolic Deep Neural Networks

  • WordNet
  • Freebase
  • OpenFlights
  • Kinship
  • Large Movie Review Dataset (IMDb)
  • Wikipedia Knowledge Graph
  • Amazon Product Co-Purchasing Network
  • Enron Email Dataset
  • Reddit Hyperlink Network
  • CiteSeer Academic Citation Network
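
    Hierarchical datasets such as WordNet are typically consumed as (ancestor, descendant) edge lists, often with the transitive closure included so that training also sees long-range is-a relations. A toy sketch of that preprocessing step (the miniature taxonomy is invented for illustration):

```python
# Miniature is-a taxonomy, standing in for something like WordNet's hypernym graph.
taxonomy = {
    "animal": ["mammal", "bird"],
    "mammal": ["dog", "cat"],
}

def closure_edges(tax):
    """All (ancestor, descendant) pairs, including transitive ones."""
    def descendants(node):
        out = set()
        for child in tax.get(node, []):
            out.add(child)
            out |= descendants(child)
        return out
    return {(a, d) for a in tax for d in descendants(a)}

edges = closure_edges(taxonomy)
print(("animal", "dog") in edges)  # True: the transitive edge is included
```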

    Performance Metrics:

  • Mean Average Precision (MAP)
  • Normalized Discounted Cumulative Gain (NDCG)
  • Precision at k (P@k)
  • Recall at k (R@k)
  • Area Under the Precision-Recall Curve (AUC-PR)
  • Area Under the Receiver Operating Characteristic Curve (AUC-ROC)
  • F1 Score
  • Mean Squared Error (MSE)
  • Mean Absolute Error (MAE)
  • Kendall's Tau-a
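
    For the ranking-style metrics above, the core computations are straightforward. A minimal sketch of Precision at k and Recall at k on a toy ranked list (the data is invented for illustration):

```python
def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k ranked items that are relevant."""
    return sum(1 for item in ranked[:k] if item in relevant) / k

def recall_at_k(ranked, relevant, k):
    """Fraction of all relevant items recovered within the top k."""
    return sum(1 for item in ranked[:k] if item in relevant) / len(relevant)

ranked = ["a", "b", "c", "d"]   # model's ranking, best first
relevant = {"a", "c"}           # ground-truth positives

print(precision_at_k(ranked, relevant, 2))  # 0.5: one of the top-2 is relevant
print(recall_at_k(ranked, relevant, 4))     # 1.0: both positives appear in the top-4
```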

    Software Tools and Technologies

    Operating System: Ubuntu 18.04 LTS 64bit / Windows 10
    Development Tools: Anaconda3, Spyder 5.0, Jupyter Notebook
    Language Version: Python 3.9
    Python Libraries:
    1. Python ML Libraries and Tools:

  • Scikit-Learn
  • NumPy
  • Pandas
  • Matplotlib
  • Seaborn
  • Docker
  • MLflow

    2. Deep Learning Frameworks:

  • Keras
  • TensorFlow
  • PyTorch
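
    None of the frameworks above ships hyperbolic optimization in its core API, so projects typically implement a Riemannian SGD step themselves or use add-on libraries such as geoopt. A framework-agnostic NumPy sketch of one update on the Poincaré ball, using the standard conformal rescaling of the Euclidean gradient (the learning rate and projection margin are illustrative):

```python
import numpy as np

def rsgd_step(x, euclid_grad, lr=0.1, eps=1e-5):
    """One Riemannian SGD step on the Poincare ball.

    The Euclidean gradient is rescaled by the inverse metric factor
    ((1 - ||x||^2)^2) / 4, then the point is projected back into the ball.
    """
    scale = ((1.0 - np.dot(x, x)) ** 2) / 4.0
    x_new = x - lr * scale * euclid_grad
    norm = np.linalg.norm(x_new)
    if norm >= 1.0:                      # retract points that escaped the ball
        x_new = x_new / norm * (1.0 - eps)
    return x_new

x = np.array([0.9, 0.0])
g = np.array([-50.0, 0.0])               # large gradient pushing outward
x = rsgd_step(x, g)
print(np.linalg.norm(x) < 1.0)  # True: the update never leaves the ball
```

The rescaling shrinks steps near the boundary, where the hyperbolic metric blows up, which is what keeps embeddings from collapsing onto the edge of the ball.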