
Connect to Hosted Service


Connecting to this service eliminates the need for you to deploy and stand up a private Kubernetes cluster. Each Chassis build job run on our hosted service has enough resources to containerize even the most memory-intensive ML models (up to 8GB RAM and 2 CPUs). Follow the instructions on this page to connect and get started using Chassis right away.

Download Chassis SDK

To get started, make sure you set up a Python virtual environment and install the chassisml SDK.

pip install chassisml

Get Chassis Connection URL

Sign up for the publicly-hosted service.

Next, when you receive your connection link, pass the URL to a ChassisClient object to establish a connection to the running service. The connection snippet will look something like this:

chassis_client = chassisml.ChassisClient("")

Begin Using Chassis

With your environment set up and connection URL in hand, you can now start to integrate the service into your MLOps pipelines.

Check out this example to follow along and see Chassis in action. Just insert your URL into the ChassisClient connection shown above and you're well on your way.
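As a concrete starting point, a minimal connect-and-test script might look like the sketch below. The `create_model(process_fn=...)` and `test(...)` calls are assumptions based on the chassisml SDK's documented workflow, and `<your-connection-URL>` stands in for the link you receive after signing up; verify the exact API against the SDK documentation for your version.

```python
def process(input_bytes):
    # Chassis wraps a single inference function that takes raw bytes in
    # and returns a JSON-serializable result. Upper-casing the decoded
    # text stands in for real model inference here.
    text = input_bytes.decode("utf-8")
    return {"result": text.upper()}

if __name__ == "__main__":
    import chassisml

    # Insert the connection URL you received after signing up.
    chassis_client = chassisml.ChassisClient("<your-connection-URL>")

    # Wrap the inference function into a Chassis model and exercise it
    # locally before kicking off a remote container build.
    chassis_model = chassis_client.create_model(process_fn=process)
    print(chassis_model.test(b"hello chassis"))
```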

Docker Hub Account Required

The publicly-hosted Chassis service will only push container images to Docker Hub. If you prefer to configure Chassis to build and push model containers to a private registry, follow the Private Docker Registry Support guide for setup instructions.
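Publishing to Docker Hub from a pipeline might look like the sketch below. The `publish()` call and its keyword arguments (`model_name`, `model_version`, `registry_user`, `registry_pass`) are assumptions based on the chassisml SDK's documented workflow, and `DOCKER_USER`/`DOCKER_PASS` are illustrative environment variable names, not ones the service requires; check the SDK docs for the exact signature in your version.

```python
import os

def dockerhub_credentials():
    # Read Docker Hub credentials from the environment instead of
    # hard-coding them in the pipeline script; fail early with a clear
    # message if they are missing.
    user = os.environ.get("DOCKER_USER")
    password = os.environ.get("DOCKER_PASS")
    if not user or not password:
        raise RuntimeError("Set DOCKER_USER and DOCKER_PASS before publishing")
    return user, password

if __name__ == "__main__":
    import chassisml

    def process(input_bytes):
        # Placeholder inference function; see the connection example above.
        return {"result": input_bytes.decode("utf-8")}

    chassis_client = chassisml.ChassisClient("<your-connection-URL>")
    chassis_model = chassis_client.create_model(process_fn=process)

    user, password = dockerhub_credentials()
    # Kicks off a remote build and pushes the resulting image to
    # Docker Hub under the registry user's namespace.
    response = chassis_model.publish(
        model_name="my-model",
        model_version="0.0.1",
        registry_user=user,
        registry_pass=password,
    )
    print(response)
```

Keeping credentials in environment variables (or a secrets manager) rather than in source keeps the build step portable across CI systems.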
