Information Theory and Distance Quantification with R
Updated Dec 5, 2024 - R
[CVPR'22] Official Implementation of the CVPR 2022 paper "UNICON: Combating Label Noise Through Uniform Selection and Contrastive Learning"
Bayesian entropy estimation in Python, via the Nemenman-Shafee-Bialek (NSB) algorithm.
Basic GANs with a variety of loss functions, as an exercise for my thesis with Prof. Randy Paffenroth: KL, reverse-KL, JS, and Wasserstein GAN.
Parameter Learning of a Bayesian Network
Python implementation of the Jensen-Shannon divergence
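A minimal sketch of what such an implementation typically looks like: the Jensen-Shannon divergence is the symmetrized KL divergence of each distribution against their mixture. The function names below are illustrative, not the repository's API.

```python
import numpy as np

def kl_divergence(p, q):
    # KL(p || q) over discrete distributions; terms with p = 0 contribute 0
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    # JSD(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m) with m the mixture;
    # with log base 2 the result is bounded in [0, 1]
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

With base-2 logarithms, disjoint distributions give a divergence of exactly 1, and identical distributions give 0.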
NLP implementations including information-theoretic measures of distributional similarity, text preprocessing with shell commands, a Naive Bayes text-categorization model, and Cocke-Younger-Kasami parsing.
textRec utilizes Latent Dirichlet Allocation and the Jensen-Shannon divergence between the discrete probability distributions over LDA topics per document, in order to recommend unique and novel documents to specific users.
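A minimal sketch of that idea, assuming per-document topic distributions are already available: rank candidate documents by their Jensen-Shannon distance from a user's topic profile and surface the most dissimilar ones. The `recommend_novel` helper and its signature are illustrative, not textRec's actual API; SciPy's `jensenshannon` returns the distance, i.e. the square root of the divergence.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def recommend_novel(user_topics, doc_topics, k=2):
    # JS distance between the user's topic profile and each document's
    # topic distribution; larger distance = more novel to this user
    dists = [jensenshannon(user_topics, d, base=2) for d in doc_topics]
    # indices of the k most distant (most novel) documents, novel-first
    return np.argsort(dists)[::-1][:k].tolist()
```

Ranking by distance rather than divergence changes nothing here, since the square root is monotone.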