Hi! I need to concatenate samples from these two methods into one tensor, so that I can sample from a continuous distribution using reparameterization and sample from a discrete distribution using UnorderedSetEstimator. Is this functionality something that could be built in? It seems this is not possible in the current version (see below for an example error). Thank you!
import storch
import torch
import torch.distributions as td
method1 = storch.method.Reparameterization
method2 = storch.method.UnorderedSetEstimator
method1 = method1(plate_name="1", n_samples=25)
method2 = method2(plate_name="1", k=25)
p1 = td.Independent(td.Normal(loc=torch.zeros([1000, 2]), scale=torch.ones([1000, 2])), 0)
p2 = td.Independent(td.OneHotCategorical(probs=torch.zeros([1000, 3]).uniform_()), 0)
samp1 = method1(p1)
samp2 = method2(p2)
samp1.shape  # torch.Size([25, 1000, 2])
samp2.shape  # torch.Size([25, 1000, 3])
storch.cat([samp1, samp2], 2)
ValueError: Received a plate with name 1 that is not also an AncestralPlate.
Hi David,
I have been thinking a bit about this problem, but I think there's no 'simple' way to implement this generally while remaining unbiased. The problem is that the UnorderedSetEstimator is designed so that you can sample from it repeatedly and still sample without replacement over sequences. To do this, it prunes away some samples to stay within the budget of 25 samples.
The reparameterization method, however, draws its 25 samples with replacement. If we were to join these samples into a single 'Plate', the UnorderedSetEstimator could prune away some of the with-replacement samples from the reparameterization method and repeat others, so the joined result would no longer be an unbiased sample with replacement.
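For intuition, here is a minimal plain-PyTorch sketch (not using storch) of the two sampling regimes involved; the probabilities and sample counts are made up purely for illustration:

import torch

# Toy categorical over 5 outcomes, purely for illustration.
probs = torch.tensor([0.4, 0.3, 0.15, 0.1, 0.05])

# With replacement: independent draws, duplicates allowed.
# Conceptually, this is the regime the 25 reparameterized samples are in.
with_repl = torch.multinomial(probs, num_samples=4, replacement=True)

# Without replacement: every drawn index is distinct, so an estimator has to
# reweight the samples to stay unbiased. This is the regime the
# UnorderedSetEstimator works in, and why it prunes samples to stay within its budget.
without_repl = torch.multinomial(probs, num_samples=4, replacement=False)

print(with_repl)     # e.g. tensor([0, 1, 0, 2]) -- repeats possible
print(without_repl)  # e.g. tensor([1, 0, 3, 2]) -- all distinct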
As a 'hack' to implement your specific example, you can fake p1 to be dependent on samp2 so that it has the AncestralPlate "1" in its parameters. The following code works for me on the latest version on pip (this needed a fix to be compatible with td.Independent):
import storch
import torch
import torch.distributions as td

method1 = storch.method.Reparameterization
method2 = storch.method.UnorderedSetEstimator
method2 = method2(plate_name="1", k=25)
p2 = td.Independent(td.OneHotCategorical(probs=torch.zeros([1000, 3]).uniform_()), 0)
samp2 = method2(p2)

# This hacky version works
method1 = method1(plate_name="2", n_samples=1)
hack = samp2[..., [0, 1]] + 1  # as hack/hack = 1
p1 = td.Independent(td.Normal(loc=torch.zeros([1000, 2]) * hack / hack, scale=torch.ones([1000, 2])), 0)
samp1 = method1(p1)
storch.cat([samp1, samp2], -1).shape  # torch.Size([25, 1000, 5])
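To spell out why this works: hack / hack is identically 1, so multiplying the loc by it leaves the Normal distribution numerically unchanged, but it makes p1's parameters depend on samp2. samp1 therefore carries the AncestralPlate "1", and storch.cat can align both samples on that shared plate, giving the [25, 1000, 5] result above.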