Register a Model as an API Service

This section explains the steps involved in registering a Data Science model as an API service.

To publish a Model as an API Service, the user needs to follow the three steps given below:

Step-1 Publish a Model as an API

Step-2 Register an API Client

Step-3 Pass the Model values in Postman

Using the Models tab, the user can publish a Data Science Lab (DSL) model as an API. Only registered models get this option.

  • Navigate to the Models tab.

  • Filter the model list by using the Registered or All options.

  • Select a registered model from the list.

  • Click the Register as API option.

  • The Update Model page opens.

  • Provide the Max instance limit.

  • Click the Save and Register option.

Updating a model

Please Note: Use the Save option to save the model details, which can be published later.

  • The model gets saved and registered as an API service. A notification message appears to confirm the same.

Please Note: A model registered as an API can be accessed under the Registered Models & API option in the left menu panel on the Data Science Lab homepage.

An API client must be registered from the Admin module to obtain the client credentials required for the API requests.

  • Navigate to the Admin module.

  • Click the API Client Registration option.

  • The API Client Registration page opens.

  • Click the New option.

  • Select the Client type as internal.

  • Provide the following client-specific information:

    • Client Name

    • Client Email

    • App Name

    • Request Per Hour

    • Request Per Day

    • Select API Type: select the Model as API option.

    • Services Entitled: select the published DSL model from the drop-down menu.

  • Click the Save option.

  • The client details get registered, and a notification message appears to confirm the same.

  • Once the client is registered, open the client details using the Edit option.

  • The API Client Registration page opens with the Client ID and Client Secret Key.

The user can pass the model values in Postman in the following sequence to get the results.

Sample URLs for Passing a Registered Model's Values as an API in Postman

  • To check whether the service has started or not, pass:

https://app.bdb.ai/services/modelasapi/<model_name>

  • To check whether the job is running or not, pass:

https://app.bdb.ai/services/modelasapi/<model_name>/getStatus

  • To get the results from the API service, pass:

https://app.bdb.ai/services/modelasapi/<model_name>/getResults
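For quick reference, the sketch below shows that the three endpoints differ only in their suffix. The base URL and model name are placeholders; replace them with your environment's URL and your registered model's name.

```python
# Quick reference: the three Model-as-API endpoints differ only in their suffix.
BASE_URL = "https://app.bdb.ai/services/modelasapi"  # replace with your environment's URL
MODEL_NAME = "<model_name>"                          # your registered model's name

start_url = f"{BASE_URL}/{MODEL_NAME}"               # start the service / check it has started
status_url = f"{BASE_URL}/{MODEL_NAME}/getStatus"    # check whether the job is running
results_url = f"{BASE_URL}/{MODEL_NAME}/getResults"  # fetch the prediction results
```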

Registering a Model as an API Service

  • Navigate to Postman.

  • Create a new collection.

  • Add a new POST request.

  • Pass the first sample URL (with the model name) for the POST request to start the service (see the sketch after the notes below).

  • Provide required headers under the Headers tab:

    • Client Id

    • Client Secret Key

    • App Name

  • Put the test data in the JSON list using the Body tab.

  • Click the Send option to send the request.

Please Note:

  • A job will be spun up at the tenant level to process the requests.

  • The input data (JSON body) will be saved in a Kafka topic as a message, which will be cleared after 4 hours.

  • The tenant will get a response as below:

    • Success: a value of 'true' indicates that the request was successful.

    • Request ID: a Request ID is generated for the request.

    • Message: confirms that the service has started running.

Please Note: The Request ID is required for the status request in the next step.
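As an illustration, a minimal Python (requests) sketch of this first call is given below. The header key names (clientid, clientsecret, appname), the sample test data, and the response key requestId are assumptions made for the example; use the exact names expected by your deployment and the keys you actually see in the response.

```python
import requests

# Placeholders: replace with your environment URL, model name, and the credentials
# obtained while registering the API client.
START_URL = "https://app.bdb.ai/services/modelasapi/<model_name>"

headers = {
    "clientid": "<client_id>",          # Client Id (assumed header key name)
    "clientsecret": "<client_secret>",  # Client Secret Key (assumed header key name)
    "appname": "<app_name>",            # App Name (assumed header key name)
}

# Test data is passed in the body as a JSON list; the fields below are illustrative only.
test_data = [
    {"feature_1": 1.5, "feature_2": "A"},
    {"feature_1": 2.0, "feature_2": "B"},
]

response = requests.post(START_URL, headers=headers, json=test_data)
payload = response.json()
print(payload)  # expected to contain a success flag, a Request ID, and a message

# Keep the Request ID for the getStatus and getResults calls;
# "requestId" is an assumed key name here.
request_id = payload.get("requestId")
```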

  • Pass the getStatus URL (with the model name) for the POST request (see the sketch below).

  • Provide required headers under the Headers tab:

    • Client Id

    • Client Secret Key

    • App Name

  • Open the Body tab and provide the Request ID.

  • Click the Send option to send the request.

  • The response will be received as below:

    • Success: a value of 'true' indicates that the request was successful.

    • Request ID: the Request ID used in the request is returned.

    • Status Message: confirms that the processing has been completed.
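A minimal sketch of the status call follows, under the same assumptions about header key names; the body key requestId is likewise an assumption, and in practice this endpoint can be polled until the status message reports completion.

```python
import requests

STATUS_URL = "https://app.bdb.ai/services/modelasapi/<model_name>/getStatus"

headers = {
    "clientid": "<client_id>",          # assumed header key names, as in the first call
    "clientsecret": "<client_secret>",
    "appname": "<app_name>",
}

# The Request ID returned by the first call goes in the body ("requestId" is an assumed key).
body = {"requestId": "<request_id>"}

response = requests.post(STATUS_URL, headers=headers, json=body)
print(response.json())  # expected: success flag, the same Request ID, and a status message
```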

  • Pass the getResults URL (with the model name) for the POST request (see the sketch below).

  • Provide required headers under the Headers tab:

    • Client Id

    • Client Secret Key

    • App Name

  • Open the Body tab and provide the Request ID.

  • Click the Send option to send the request.

  • The model prediction results will be displayed in the response.

Please Note: The output data will be stored inside the Sandbox repository, in a request-specific sub-folder under the Model as API folder of the respective DSL Project.
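Finally, a minimal sketch of the results call, under the same assumptions as the previous sketches:

```python
import requests

RESULTS_URL = "https://app.bdb.ai/services/modelasapi/<model_name>/getResults"

headers = {
    "clientid": "<client_id>",          # assumed header key names, as in the previous calls
    "clientsecret": "<client_secret>",
    "appname": "<app_name>",
}

body = {"requestId": "<request_id>"}    # Request ID from the first call (assumed key name)

response = requests.post(RESULTS_URL, headers=headers, json=body)
print(response.json())  # the model's prediction results for the submitted test data
```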
