Quickstart
Get started with the Morpheus Inference API in just a few minutes. The API is fully OpenAI-compatible, meaning you can use existing OpenAI SDKs and tools with just a base URL change.
Base URL: https://api.mor.org/api/v1
Authentication: Bearer token (your API key)
Step 1: Get Your API Key
Go to app.mor.org
Create an account or sign in
Click Create API Key and copy it immediately
Your API key won’t be shown again after creation. Store it securely.
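A common pattern is to keep the key in an environment variable rather than in source code; the TypeScript example below reads it from MORPHEUS_API_KEY. A minimal Python sketch of the same idea (the shell export shown in the comment is an illustrative assumption):

import os

# Assumes the key was exported in your shell first, e.g.
#   export MORPHEUS_API_KEY="YOUR_API_KEY"
api_key = os.environ["MORPHEUS_API_KEY"]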
Step 2: Make Your First Request
The Morpheus Inference API is fully OpenAI-compatible. Simply change the base URL and use your Morpheus API key.
curl:

curl https://api.mor.org/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.3-70b",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
Python:

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.mor.org/api/v1"
)

response = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)
Install the OpenAI SDK: pip install openai

JavaScript:

import OpenAI from "openai";
const client = new OpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://api.mor.org/api/v1",
});

const response = await client.chat.completions.create({
  model: "llama-3.3-70b",
  messages: [
    { role: "user", content: "Hello!" }
  ],
});

console.log(response.choices[0].message.content);
Install the OpenAI SDK: npm install openai

TypeScript:

import OpenAI from "openai";
const client = new OpenAI({
  apiKey: process.env.MORPHEUS_API_KEY!,
  baseURL: "https://api.mor.org/api/v1",
});

const response = await client.chat.completions.create({
  model: "llama-3.3-70b",
  messages: [
    { role: "user", content: "Hello!" }
  ],
});

console.log(response.choices[0].message.content);
Install the OpenAI SDK: npm install openai
OpenAI Compatibility
The Morpheus Inference API implements the OpenAI API specification, ensuring compatibility with existing OpenAI clients and tools. You can switch from OpenAI to Morpheus with just two changes:
Setting     OpenAI                       Morpheus
Base URL    https://api.openai.com/v1    https://api.mor.org/api/v1
API Key     sk-... (OpenAI key)          sk-... (Morpheus key)
Any application that supports “OpenAI-compatible” APIs can work with Morpheus. This includes Cursor, Continue, LangChain, and hundreds of other tools.
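For example, a minimal LangChain sketch, assuming the langchain-openai package is installed (package and parameter names here are assumptions about that library, not part of the Morpheus API itself):

from langchain_openai import ChatOpenAI

# Point LangChain's OpenAI-compatible chat model at the Morpheus endpoint.
llm = ChatOpenAI(
    model="llama-3.3-70b",
    api_key="YOUR_API_KEY",
    base_url="https://api.mor.org/api/v1",
)

# invoke() returns a message object; .content holds the model's reply.
print(llm.invoke("Hello!").content)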
Streaming Responses
For real-time output, enable streaming:
curl:

curl https://api.mor.org/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.3-70b",
    "messages": [
      {"role": "user", "content": "Write a haiku about AI"}
    ],
    "stream": true
  }'
Python:

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.mor.org/api/v1"
)

stream = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[
        {"role": "user", "content": "Write a haiku about AI"}
    ],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
JavaScript:

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://api.mor.org/api/v1",
});

const stream = await client.chat.completions.create({
  model: "llama-3.3-70b",
  messages: [
    { role: "user", content: "Write a haiku about AI" }
  ],
  stream: true,
});

for await (const chunk of stream) {
  if (chunk.choices[0]?.delta?.content) {
    process.stdout.write(chunk.choices[0].delta.content);
  }
}
Available Models
List available models to see what’s currently active in the marketplace:
curl https://api.mor.org/api/v1/models \
-H "Authorization: Bearer YOUR_API_KEY"
View All Models: See the complete list of available models with capabilities and context windows.
Next Steps