Write a guide for API-based inference #69

Open
talmo opened this issue Jul 23, 2024 · 0 comments
talmo commented Jul 23, 2024

The run script/CLI is helpful when using DREEM on its own, but we'll want to be able to integrate it into other packages.

For example, in SLEAP/sleap-nn, we'll already have loaded the images and maybe will be generating poses batch-by-batch, so using DREEM's inference data loader doesn't make as much sense.

The goal is to show a recipe where we can use DREEM models for inference from other packages (e.g., SLEAP, sleap-nn) in their own workflows and execution context.

We want to write something like this: https://sleap.ai/notebooks/Interactive_and_realtime_inference.html

Note the progressive exposure of complexity, going from high level to low level: at the highest level we get back numpy arrays, while at the lowest level we get tensors still on the GPU, before any further postprocessing.
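To make the layering concrete, here's a minimal sketch of what a progressively lower-level API could look like. All names here (`DreemTracker`, `track`, `forward_batch`) are hypothetical, not DREEM's actual API, and a plain callable stands in for a real model so the sketch stays self-contained:

```python
import numpy as np

# Hypothetical sketch of a layered inference API. Class and method names
# are illustrative only, not DREEM's real interface.
class DreemTracker:
    def __init__(self, model=None):
        # `model` would be a loaded DREEM model; here a stand-in callable
        # so the sketch runs without any model weights.
        self.model = model or (lambda x: x * 2.0)

    def forward_batch(self, batch):
        # Low level: the caller supplies a pre-assembled batch (in the real
        # setting, a tensor already on the GPU) and gets raw model outputs
        # back -- no postprocessing, no device transfer.
        return self.model(batch)

    def track(self, frames):
        # High level: accepts and returns numpy arrays, handling batching
        # (and, in the real setting, device transfer and postprocessing)
        # internally.
        batch = np.asarray(frames, dtype=np.float32)
        out = self.forward_batch(batch)
        return np.asarray(out)

tracker = DreemTracker()
# High-level call: numpy in, numpy out.
poses = tracker.track([[1.0, 2.0], [3.0, 4.0]])
```

A caller like SLEAP/sleap-nn that already has images loaded and batched would drop down to `forward_batch` directly and keep everything on the GPU, while a quick-start user would only ever touch `track`.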
