Concatenating Reparameterized and UnorderedSetEstimator samples into one tensor #95

Open
DavidKLim opened this issue Apr 19, 2021 · 1 comment
Labels: enhancement (New feature or request)


DavidKLim commented Apr 19, 2021

Hi! I need to concatenate samples from these two methods into one tensor, so that I can sample from a continuous distribution using reparameterization and from a discrete distribution using the UnorderedSetEstimator. Is this functionality something that could be built in? It does not seem to be possible in the current version (see the example error below). Thank you!

import storch
import torch
import torch.distributions as td

# Both methods use the same plate name "1".
method1 = storch.method.Reparameterization(plate_name="1", n_samples=25)
method2 = storch.method.UnorderedSetEstimator(plate_name="1", k=25)

p1 = td.Independent(td.Normal(loc=torch.zeros([1000, 2]), scale=torch.ones([1000, 2])), 0)
p2 = td.Independent(td.OneHotCategorical(probs=torch.zeros([1000, 3]).uniform_()), 0)

samp1 = method1(p1)
samp2 = method2(p2)
samp1.shape  # torch.Size([25, 1000, 2])
samp2.shape  # torch.Size([25, 1000, 3])

storch.cat([samp1, samp2], 2)
# ValueError: Received a plate with name 1 that is not also an AncestralPlate.
HEmile (Owner) commented May 27, 2021

Hi David,
I have been thinking a bit about this problem, but I don't think there is a 'simple' way to implement this in general while remaining unbiased. The problem is that the UnorderedSetEstimator is designed to be sampled from repeatedly, so that you can sample sequences without replacement. To stay within the budget of 25 samples, it prunes some samples away.
The reparameterization method, however, draws 25 samples with replacement. If we were to join these samples into a single 'Plate', the UnorderedSetEstimator could prune away some of the reparameterized samples and repeat others, so they would no longer form an unbiased sample with replacement.
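To make the with/without-replacement distinction concrete, here is a minimal sketch in plain PyTorch. This is not the actual storch machinery; the Gumbel-top-k construction is just one standard way to draw k distinct categories without replacement, shown next to plain i.i.d. reparameterized draws.

import torch
import torch.distributions as td

# 25 i.i.d. draws *with* replacement, as reparameterization produces:
normal = td.Normal(torch.zeros(2), torch.ones(2))
iid_samples = normal.rsample((25,))  # shape [25, 2]

# 25 *distinct* categories *without* replacement, via perturb-and-top-k:
logits = torch.zeros(100).uniform_()
gumbel = -torch.log(-torch.log(torch.rand(100)))  # Gumbel(0, 1) noise
distinct = torch.topk(logits + gumbel, k=25).indices  # 25 unique category indices

# Putting both kinds of samples on one plate would mean pruning or repeating the
# i.i.d. draws along with the distinct set, which biases the resulting estimate.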

As a 'hack' for your specific example, you can fake a dependence of p1 on samp2 so that its parameters carry the AncestralPlate "1". The following code works for me with the latest version on pip (this needed a fix to be compatible with td.Independent):

import storch
import torch
import torch.distributions as td

method2 = storch.method.UnorderedSetEstimator(plate_name="1", k=25)
p2 = td.Independent(td.OneHotCategorical(probs=torch.zeros([1000, 3]).uniform_()), 0)
samp2 = method2(p2)

# This hacky version works: samp2 is one-hot, so adding 1 makes every entry of
# `hack` nonzero, and hack / hack == 1 everywhere. Multiplying the loc of p1 by
# it leaves the parameters numerically unchanged, but makes them depend on samp2,
# so they carry the AncestralPlate "1".
method1 = storch.method.Reparameterization(plate_name="2", n_samples=1)
hack = samp2[..., [0, 1]] + 1
p1 = td.Independent(td.Normal(loc=torch.zeros([1000, 2]) * hack / hack, scale=torch.ones([1000, 2])), 0)
samp1 = method1(p1)

storch.cat([samp1, samp2], -1).shape  # torch.Size([25, 1000, 5])
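As a quick sanity check of the hack, you can inspect which plates are attached to samp1. This assumes storch tensors expose a plates attribute containing named Plate objects, which may differ between versions:

print([plate.name for plate in samp1.plates])  # should now include the AncestralPlate "1"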

HEmile added the enhancement (New feature or request) label on May 27, 2021