Issues: triton-inference-server/fil_backend
- #412: Update compatibility matrix for Treelite/XGBoost version support [documentation] (opened Nov 5, 2024 by wphicks)
- #410: Does Nvidia Triton Inference Server support the AutoML (AutoGluon) framework? (opened Oct 21, 2024 by IamExperimenting)
- #408: Does the FIL backend support the XGBoost base_margin feature? [enhancement] (opened Oct 13, 2024 by leslizhang)
- #400: Error while running triton-server Docker with FIL backend on macOS 14.3 (opened Aug 21, 2024 by suvratjain1995)
- #389: [FEA] Support categorical features when serving XGBoost models [enhancement] (opened May 15, 2024 by gfalcone)
- #362: Provide ARM release to allow support for FIL backend on Jetson and other ARM platforms (opened Jul 12, 2023 by blthayer)
- #351: [BUG] cuML binary classification models do not observe threshold (opened Mar 28, 2023 by RAMitchell)
- #350: [BUG] Multiclass models must have output_class=true to predict probabilities (opened Mar 28, 2023 by RAMitchell)
- #347: [FEA] Provide CatBoost support [enhancement] (opened Mar 23, 2023 by riaris)
- #338: [DOC] More clearly document that max_batch_size 0 is not supported (opened Feb 10, 2023 by wphicks)
- #295: Internal docs feedback: Provide a notebook demonstrating deployment of a pre-trained model (opened Oct 3, 2022 by wphicks)
- #270: Add example of submitting a Python request with shared memory to FAQ notebook (opened Jun 30, 2022 by wphicks)