CLI: Deploying a Project

Develop and deploy your project entirely on Tensor One’s infrastructure. With Tensor One projects, you can create and deploy Serverless Endpoints without manual container management or Docker expertise. Code changes are reflected in a live environment automatically, with no need to rebuild or redeploy Docker images, which speeds up development.

How to Set Up a Basic Endpoint

This tutorial walks you through building a simple Serverless Endpoint that returns the IP address of the machine executing your code. By the end, you will know how to:
  • Set up a project environment using tensoronecli
  • Deploy your code as a Serverless Endpoint
  • Interact with it locally and in the cloud

Prerequisites

  • tensoronecli installed
  • Python 3.8+

Step 1: Set Up Project Environment

Configure the CLI with your API key:
tensoronecli config --apiKey <API_KEY>
Create a new project directory:
tensoronecli project create
Select Hello World from the prompt and follow the instructions.

Step 2: Writing and Testing Code Locally

Navigate into your project folder:
cd my_ip
Replace the default code in src/handler.py with the following:
from tensoroneGPU import tensoroneClient
import requests

def get_my_ip(job):
    # The handler receives the job payload as `job`; this example ignores it
    # and returns the public IP of the machine running the code.
    response = requests.get('https://httpbin.org/ip')
    return response.json()['origin']

# Register the handler with the serverless runtime so it can receive jobs.
tensoroneClient.serverless.start({"handler": get_my_ip})
Test your code locally:
python3 src/handler.py --test_input '{"input": {}}'
You should see your local machine’s IP address in the output.
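If you want to sanity-check the handler logic without the serverless runtime, a small standalone script works too. The sketch below simply duplicates the handler’s logic; the file name quick_check.py is only an example, not part of the project template:
import requests

def get_my_ip(job):
    # Same logic as src/handler.py: fetch and return this machine's public IP.
    response = requests.get('https://httpbin.org/ip')
    return response.json()['origin']

if __name__ == '__main__':
    # Pass an empty job payload, mirroring the shape used by --test_input.
    print(get_my_ip({'input': {}}))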

Step 3: Running a Development Server

Launch a live development server on a Tensor One Cluster:
tensoronecli project dev
Watch the logs for a generated URL to confirm successful deployment.

Step 4: Interacting with Your Code

Your project relies on the external library requests. Add it to your requirements.txt:
tensoroneGPU
requests
The dev server will automatically sync changes. Then interact with your endpoint using curl:
curl -X POST \
  'https://${YOUR_ENDPOINT}-8080.proxy.tensorone.ai/runsync' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{"input": {}}'
You should receive the Cluster’s IP address in the response.
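If you prefer Python over curl, the same request can be sent with the requests library. This is only a sketch: the URL below is a placeholder, so substitute the proxy URL printed in your dev server logs:
import requests

# Placeholder: replace with the URL from your dev server logs.
dev_url = 'https://YOUR_ENDPOINT-8080.proxy.tensorone.ai/runsync'

response = requests.post(
    dev_url,
    headers={'accept': 'application/json', 'Content-Type': 'application/json'},
    json={'input': {}},  # same body as the curl example above
    timeout=30,
)
response.raise_for_status()
print(response.json())  # should include the Cluster's IP address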

Step 5: Deploying Your Endpoint

Stop the development server with Ctrl + C, then deploy your project:
tensoronecli project deploy
You’ll see live Endpoint URLs in the output, such as:
https://api.tensorone.ai/v2/${YOUR_ENDPOINT}/runsync

Step 6: Interacting with Your Serverless Endpoint

Use curl to send a request directly to the deployed endpoint:
curl -X POST \
  'https://api.tensorone.ai/v2/${YOUR_ENDPOINT}/runsync' \
  -H 'accept: application/json' \
  -H 'authorization: ${YOUR_API_KEY}' \
  -H 'Content-Type: application/json' \
  -d '{"input": {}}'
You should receive the IP address of the Cluster hosting your endpoint.
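The equivalent request from Python is sketched below. The endpoint ID is a placeholder, and the example assumes your API key is stored in an environment variable named TENSORONE_API_KEY (any variable name works); the authorization header is passed exactly as in the curl example:
import os
import requests

endpoint_id = 'YOUR_ENDPOINT'  # placeholder: your deployed endpoint ID
api_key = os.environ['TENSORONE_API_KEY']  # assumed environment variable

response = requests.post(
    f'https://api.tensorone.ai/v2/{endpoint_id}/runsync',
    headers={
        'accept': 'application/json',
        'authorization': api_key,  # same header format as the curl example
        'Content-Type': 'application/json',
    },
    json={'input': {}},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # should contain the IP address of the hosting Cluster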

Conclusion

You’ve successfully:
  • Created a Tensor One project
  • Written and tested code locally
  • Run a live development server
  • Deployed your Serverless Endpoint
You’re now ready to explore more complex projects and unlock the full potential of Tensor One’s serverless infrastructure.