Model ID: meta-llama/llama-4-scout
Input: $0.08 per 1M tokens
Output: $0.30 per 1M tokens
import requests

# Fetch pricing data from llmprices.ai
response = requests.get(
    "https://llmprices.ai/api/pricing?model=meta-llama/llama-4-scout"
)
data = response.json()
print(f"Model: {data['name']}")
print(f"Input: ${float(data['pricing']['prompt']) * 1_000_000:.2f}/1M tokens")
print(f"Output: ${float(data['pricing']['completion']) * 1_000_000:.2f}/1M tokens")

Endpoint:
GET https://llmprices.ai/api/pricing?model=meta-llama/llama-4-scout

Example Response:
{
"id": "meta-llama/llama-4-scout",
"name": "Meta: Llama 4 Scout",
"pricing": {
"prompt": "0.00000008",
"completion": "0.0000003"
}
}
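Since the API returns per-token prices as decimal strings, a small helper can convert them to the familiar per-1M-token figure without float rounding. This is a minimal sketch using the example payload above; the `per_million` helper is an illustrative name, not part of the API.

```python
from decimal import Decimal

# Example response payload from above (prices are per-token strings)
data = {
    "id": "meta-llama/llama-4-scout",
    "name": "Meta: Llama 4 Scout",
    "pricing": {"prompt": "0.00000008", "completion": "0.0000003"},
}

def per_million(price_per_token: str) -> Decimal:
    """Convert a per-token price string to USD per 1M tokens."""
    return Decimal(price_per_token) * 1_000_000

print(f"Input: ${per_million(data['pricing']['prompt']):.2f}/1M tokens")    # $0.08
print(f"Output: ${per_million(data['pricing']['completion']):.2f}/1M tokens")  # $0.30
```

Using `Decimal` instead of `float` keeps the arithmetic exact, which matters when multiplying very small per-token prices by large token counts.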