## Bring your own endpoint (a.k.a. support for external endpoints)
If you have an endpoint deployed on, say, Amazon EKS or Amazon EC2, or have your models hosted on a fully-managed service such as Amazon Bedrock, you can still bring your endpoint to `FMBench` and run tests against it. To do this you need to do the following:
- Create a derived class from the `FMBenchPredictor` abstract class and provide an implementation for the constructor, the `get_predictions` method, and the `endpoint_name` property. See `SageMakerPredictor` for an example. Save this file locally as, say, `my_custom_predictor.py`.
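
    The snippet below is a minimal, illustrative sketch of such a class for an external HTTP endpoint. The import path of the abstract class, the constructor arguments, and the payload/response shapes are assumptions made here for illustration; mirror `SageMakerPredictor` from your `FMBench` installation for the real signatures.

    ```python
    # my_custom_predictor.py -- illustrative sketch, not a drop-in implementation.
    from typing import Optional

    import requests  # assumes the external endpoint speaks plain HTTP/JSON

    # Assumption: the module path of the abstract base class; confirm it
    # against the FMBench sources you are running.
    from fmbench.scripts.fmbench_predictor import FMBenchPredictor


    class MyCustomPredictor(FMBenchPredictor):
        """Runs inference against an external (non-SageMaker) endpoint."""

        def __init__(self, endpoint_name: str, inference_spec: Optional[dict] = None):
            # For an external service the endpoint name can simply be its URL.
            self._endpoint_name = endpoint_name
            self._inference_spec = inference_spec or {}

        def get_predictions(self, payload: dict) -> dict:
            # Hypothetical request/response contract; adapt to your endpoint's API.
            response = requests.post(
                self._endpoint_name,
                json={"inputs": payload.get("inputs"), **self._inference_spec},
                timeout=60,
            )
            response.raise_for_status()
            return response.json()

        @property
        def endpoint_name(self) -> str:
            return self._endpoint_name
    ```
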
- Upload your new Python file (`my_custom_predictor.py`) for your custom `FMBench` predictor to your `FMBench` read bucket, under the scripts prefix specified in the `s3_read_data` section (`read_bucket` and `scripts_prefix`).
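
    A `boto3` sketch of the upload is shown below; the bucket and prefix values are placeholders, so substitute the `read_bucket` and `scripts_prefix` values from your own configuration file.

    ```python
    import boto3

    s3 = boto3.client("s3")

    # Placeholders: use the values from the s3_read_data section of your config.
    read_bucket = "my-fmbench-read-bucket"
    scripts_prefix = "scripts"

    s3.upload_file(
        "my_custom_predictor.py",  # the local file created in the previous step
        read_bucket,
        f"{scripts_prefix}/my_custom_predictor.py",
    )
    ```
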
- Edit the configuration file you are using for your `FMBench` run as follows:

    - Skip the deployment step by setting the `2_deploy_model.ipynb` step under `run_steps` to `no`.
    - Set the `inference_script` under any experiment in the `experiments` section for which you want to use your new custom inference script to point to your new Python file (`my_custom_predictor.py`) that contains your custom predictor.
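
    The relevant excerpts of the configuration file would then look roughly like this; the experiment name below is a placeholder, and every other step and field in your file stays as it is.

    ```yaml
    run_steps:
      # ...other steps unchanged...
      2_deploy_model.ipynb: no  # skip deployment, the endpoint already exists

    experiments:
      - name: my-external-endpoint  # placeholder
        # ...other experiment fields unchanged...
        inference_script: my_custom_predictor.py
    ```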