API reference
Introduction
You can interact with the API through HTTP requests from any programming language, for example Python, JavaScript, or PHP.
Authentication
The BOMML API uses Bearer API tokens for authentication. Visit your API Tokens page to create and retrieve the API token you will use to authenticate your requests.
Please remember to always keep your API tokens secure and secret! Do not share them with unauthorized parties or expose them in client-side code (browsers, apps, websites). Instead, make your API requests from your own backend and load the token from a secure location such as an environment variable or secret storage.
All API requests must include your API token in an `Authorization` HTTP header as follows:
Authorization: Bearer BOMML_API_TOKEN
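For example, in a Python backend you might load the token from an environment variable and attach the header like this (a minimal sketch; the `requests` library and the `BOMML_API_TOKEN` variable name are illustrative choices, not requirements):

import os
import requests

# Read the token from the environment rather than hard-coding it.
api_token = os.environ["BOMML_API_TOKEN"]

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_token}",
}

# Every request to the API then includes these headers, e.g.:
response = requests.get("https://api.bomml.ai/api/v1/models", headers=headers)
print(response.json())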
Making requests
You can paste the command below into your terminal to run your first API request with curl. Make sure to replace `$BOMML_API_TOKEN` with your own secret API token.
curl https://api.bomml.ai/api/v1/completions/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $BOMML_API_TOKEN" \
  -d '{
    "model": "meta-llama/Llama-2-70b-chat-hf",
    "messages": [{"role": "user", "content": "Write me a Haiku"}],
    "max_tokens": 256,
    "temperature": 0.6
  }'
This API request queries the `meta-llama/Llama-2-70b-chat-hf` model (for more models, please see our models endpoint) to complete or follow the given prompt or instructions. You should get the following response back:
{
  "id": "cmpl-187c6927307b44f9a4fca738b6071af0",
  "object": "text_completion",
  "created": 1694297703,
  "model": "meta-llama\/Llama-2-70b-chat-hf",
  "choices": [{
    "index": 0,
    "text": "Sure! Here is a haiku:\n\nSnowflakes gently fall\nBlanketing the landscape white\nWinter's peaceful hush",
    "logprobs": null,
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 26,
    "total_tokens": 61,
    "completion_tokens": 35
  }
}
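If you prefer a programming language over the command line, the same request can be sketched in Python using the third-party `requests` library (illustrative only; any HTTP client will do):

import os
import requests

# Same chat completion request as the curl example above.
response = requests.post(
    "https://api.bomml.ai/api/v1/completions/chat",
    headers={"Authorization": f"Bearer {os.environ['BOMML_API_TOKEN']}"},
    json={
        "model": "meta-llama/Llama-2-70b-chat-hf",
        "messages": [{"role": "user", "content": "Write me a Haiku"}],
        "max_tokens": 256,
        "temperature": 0.6,
    },
)

data = response.json()
# The generated text is in choices[0].text; token counts are under usage.
print(data["choices"][0]["text"])
print(data["usage"])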
Chat
Given a list of messages comprising a conversation between the user and the AI, the model will return a response based on the chain of messages.
The chat completion object
{
  "id": "cmpl-a8624144abe14943a4146b9dfecf2f76",
  "object": "text_completion",
  "created": 1694305075,
  "model": "meta-llama\/Llama-2-70b-chat-hf",
  "choices": [{
    "index": 0,
    "text": "Sun sets slowly down\nGolden hues upon the sea\nPeaceful evening sky",
    "logprobs": null,
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 27,
    "total_tokens": 47,
    "completion_tokens": 20
  }
}
Create chat completion
POST https://api.bomml.ai/api/v1/completions/chat
curl https://api.bomml.ai/api/v1/completions/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $BOMML_API_TOKEN" \
  -d '{
    "model": "meta-llama\/Llama-2-70b-chat-hf",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Write me a Haiku"
      }
    ]
  }'
{
  "id": "cmpl-a8624144abe14943a4146b9dfecf2f76",
  "object": "text_completion",
  "created": 1694305075,
  "model": "meta-llama\/Llama-2-70b-chat-hf",
  "choices": [{
    "index": 0,
    "text": "Sun sets slowly down\nGolden hues upon the sea\nPeaceful evening sky",
    "logprobs": null,
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 27,
    "total_tokens": 47,
    "completion_tokens": 20
  }
}
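A Python sketch of the same chat completion call, again assuming the third-party `requests` library:

import os
import requests

# Conversation history: a system message followed by the user's message.
payload = {
    "model": "meta-llama/Llama-2-70b-chat-hf",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write me a Haiku"},
    ],
}

response = requests.post(
    "https://api.bomml.ai/api/v1/completions/chat",
    headers={"Authorization": f"Bearer {os.environ['BOMML_API_TOKEN']}"},
    json=payload,
)
print(response.json()["choices"][0]["text"])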
Completions
Given a prompt, the model will return one or more predicted completions. This endpoint can also be given instructions to perform tasks.
The completion object
{
  "id": "cmpl-a8624144abe14943a4146b9dfecf2f76",
  "object": "text_completion",
  "created": 1694305075,
  "model": "meta-llama\/Llama-2-70b-chat-hf",
  "choices": [{
    "index": 0,
    "text": "Sun sets slowly down\nGolden hues upon the sea\nPeaceful evening sky",
    "logprobs": null,
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 27,
    "total_tokens": 47,
    "completion_tokens": 20
  }
}
Create completion
POST https://api.bomml.ai/api/v1/completions
curl https://api.bomml.ai/api/v1/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $BOMML_API_TOKEN" \
  -d '{
    "model": "meta-llama\/Llama-2-70b-chat-hf",
    "prompt": "Write me a Haiku"
  }'
{
  "id": "cmpl-a8624144abe14943a4146b9dfecf2f76",
  "object": "text_completion",
  "created": 1694305075,
  "model": "meta-llama\/Llama-2-70b-chat-hf",
  "choices": [{
    "index": 0,
    "text": "Sun sets slowly down\nGolden hues upon the sea\nPeaceful evening sky",
    "logprobs": null,
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 27,
    "total_tokens": 47,
    "completion_tokens": 20
  }
}
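A Python sketch of the same completion request (assuming the third-party `requests` library):

import os
import requests

# Plain completion: this endpoint takes a "prompt" instead of "messages".
response = requests.post(
    "https://api.bomml.ai/api/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['BOMML_API_TOKEN']}"},
    json={
        "model": "meta-llama/Llama-2-70b-chat-hf",
        "prompt": "Write me a Haiku",
    },
)
print(response.json()["choices"][0]["text"])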
Models
List and describe the various AI models available in the BOMML API. For additional information, please check the Models documentation to understand what each model is capable of and the differences between them.
The model object
{
  "id": 1,
  "name": "Llama 2 70B",
  "key": "meta-llama\/Llama-2-70b-chat-hf",
  "description": null,
  "created_at": null,
  "updated_at": null
}
List models
GET https://api.bomml.ai/api/v1/models
Returns
[{
  "id": 1,
  "name": "Llama 2 70B",
  "key": "meta-llama\/Llama-2-70b-chat-hf",
  "description": null,
  "created_at": null,
  "updated_at": null
}, {
  "id": 2,
  "name": "Llama-2-13b-chat-hf",
  "key": "meta-llama\/Llama-2-13b-chat-hf",
  "description": null,
  "created_at": null,
  "updated_at": null
}]
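To pick a model programmatically, you could fetch the list and use each entry's `key` as the `model` parameter in your completion requests. A Python sketch (assuming the third-party `requests` library):

import os
import requests

response = requests.get(
    "https://api.bomml.ai/api/v1/models",
    headers={"Authorization": f"Bearer {os.environ['BOMML_API_TOKEN']}"},
)

# Each item's "key" is the value to pass as "model" in completion requests.
for model in response.json():
    print(model["key"])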