
Conversation

@Dekermanjian
Contributor

updated dim_shape assignment logic in fit_laplace to handle absent dims on data containers and deterministics. I also added a test for that specific scenario.

closes #581

@ricardoV94 (Member) left a comment


why are we forcing dims in variables that didn't have them originally?

dim_shapes = initval.shape if initval is not None else batched_rv.shape.eval()[2:]
laplace_model.add_coords(
-    {name: np.arange(shape) for name, shape in zip(dims[2:], dim_shapes)}
+    {name: pt.arange(shape) for name, shape in zip(dims[2:], dim_shapes)}
@ricardoV94 (Member)

pt.arange doesn't sound correct for dims

@Dekermanjian (Contributor, Author)

I switched back to np.arange and noticed a speed-up during tests. Can you help me understand why pt.arange should not be used for dims? Is it because that type of execution happens outside of the model graph?

@ricardoV94 (Member)

Coordinates should be concrete, valid values for ArviZ. pt.arange is not a valid coord type; it's probably undefined behavior.
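The distinction can be sketched in a small hypothetical snippet (the dim names and shapes here are made up, mimicking the code under review): coordinate values handed to ArviZ must be concrete array-likes, not symbolic tensors.

```python
import numpy as np

# Hypothetical dims and shapes, echoing the snippet under review:
# the first two dims are chain/draw, the rest belong to the variable.
dims = ("chain", "draw", "city")
dim_shapes = (3,)

# Coordinate values must be concrete (NumPy arrays, lists, ranges) so
# ArviZ can attach them to the InferenceData object. A symbolic
# pt.arange(shape) would hand ArviZ an unevaluated graph node instead.
coords = {name: np.arange(shape) for name, shape in zip(dims[2:], dim_shapes)}

print(coords)  # {'city': array([0, 1, 2])}
```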

@Dekermanjian
Contributor Author

> why are we forcing dims in variables that didn't have them originally?

Hey @ricardoV94, I am not sure I understand what you mean here. I compared the dimensions of the InferenceData object returned by pmx.fit_laplace() to the one returned by pm.sample(), and the dimensions in the outputs seemed the same. Are you saying that, in general, neither method should be adding a mu_dim_0 dimension?

@ricardoV94 (Member)

> > why are we forcing dims in variables that didn't have them originally?
>
> Hey @ricardoV94, I am not sure I understand what you mean here. I compared the dimensions of the InferenceData object returned by pmx.fit_laplace() to the one returned by pm.sample(), and the dimensions in the outputs seemed the same. Are you saying that, in general, neither method should be adding a mu_dim_0 dimension?

pm.sample doesn't add dims to model variables. Those show up in the conversion to InferenceData by ArviZ.

@Dekermanjian (Contributor, Author)

> pm.sample doesn't add dims to model variables. Those show up in the conversion to InferenceData by ArviZ.

Thank you, @ricardoV94. I was not aware of that. I made a change so that we don't force any dims on variables that were not assigned any in the original model context.
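The resulting behavior can be sketched roughly as follows. This is a hypothetical helper, not the actual PR code: coords are generated only for variables that already carry named dims, and variables without dims are left alone.

```python
import numpy as np

def coords_for(dims, dim_shapes):
    """Build default integer coords, but only for variables that already
    have named dims. Hypothetical sketch of the fix, not the PR's code."""
    if dims is None:
        # Don't force a synthetic dim (e.g. mu_dim_0) on variables that
        # had none in the original model; ArviZ adds those labels itself
        # during conversion to InferenceData.
        return {}
    return {name: np.arange(shape) for name, shape in zip(dims, dim_shapes)}

print(coords_for(None, (3,)))        # {}
print(coords_for(("city",), (3,)))   # {'city': array([0, 1, 2])}
```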



Development

Successfully merging this pull request may close these issues.

fit_laplace errors when deterministics or data do not have dims
