Batch Processing Support for Model Inference Using Tabular Data #15
thuongle2210 started this conversation in General
Hi @habedi,
I see Infera's roadmap includes:

- Batch Processing
  - Inference on batches for models with dynamic dimensions. => Already supported.
  - Automatic batch splitting for models with fixed batch size. => Not yet supported.
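For context on the "automatic batch splitting" item above, here is a minimal sketch (in Python, with a hypothetical helper name) of what I mean: a model exported with a fixed batch dimension can only accept exactly `batch_size` rows per call, so the engine would chunk the input table and pad the final partial chunk, discarding the padded rows' scores afterward.

```python
def split_into_batches(rows, batch_size, pad_row=None):
    """Split rows into fixed-size batches, padding the last partial batch.

    Hypothetical sketch of automatic batch splitting for a model with a
    fixed batch dimension: each returned batch has exactly `batch_size`
    rows; padded filler rows' outputs would be dropped by the caller.
    """
    batches = []
    for start in range(0, len(rows), batch_size):
        chunk = list(rows[start:start + batch_size])
        if pad_row is not None:
            while len(chunk) < batch_size:
                chunk.append(pad_row)  # filler row; its score is discarded
        batches.append(chunk)
    return batches
```

For example, three rows split with `batch_size=2` yield two batches, the second one padded: `split_into_batches([[1.0], [2.0], [3.0]], 2, pad_row=[0.0])` returns `[[[1.0], [2.0]], [[3.0], [0.0]]]`.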
I see that the extension currently only supports input as a blob. I propose a feature that accepts tabular input, like this:
Data preparation:

```sql
create or replace table batch_raw_data as
select 1 as id, 1.0::float as f1, 2.0::float as f2, 3.0::float as f3
union all
select 1 as id, 1.0::float as f1, 2.0::float as f2, 4.0::float as f3;
```

Then:

```sql
select infera_predict('linear_model', f1, f2, f3) from batch_raw_data;
```

The expected output is two rows of scores.
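To make the proposal concrete, here is a small sketch (hypothetical helper, not part of Infera) of how the per-row columns from the table above could be gathered into a single row-major batch tensor before the model is invoked:

```python
def rows_to_batch(rows):
    """Flatten [(f1, f2, f3), ...] into a row-major float buffer plus shape.

    Hypothetical sketch: tabular input rows are packed into the flat
    buffer an inference runtime would consume, with shape (n_rows, n_cols).
    """
    n_rows = len(rows)
    n_cols = len(rows[0]) if rows else 0
    flat = [float(v) for row in rows for v in row]
    return flat, (n_rows, n_cols)
```

With the two rows from the SQL example, `rows_to_batch([(1.0, 2.0, 3.0), (1.0, 2.0, 4.0)])` gives shape `(2, 3)` and the buffer `[1.0, 2.0, 3.0, 1.0, 2.0, 4.0]`, i.e. one batched call producing two scores.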