Large Language Models

Groq

Pricing tier: economy

Groq is a cost-effective large language model provider offering Chat Completion, Streaming Chat, Speech Synthesis, Speech Transcription, and Fast LLM Inference through Alfred's unified API. It features full streaming support with premium-tier latency.

Try Groq · Official docs

Specifications

Category: Large Language Models
Auth Type: api_key
Latency Tier: premium
Streaming: full
Pricing Tier: economy
Reliability: premium
Quality: premium
Base URL: https://api.groq.com/openai/v1
SDK Package:
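The Base URL above is Groq's OpenAI-compatible endpoint, so it can also be called directly without a unified-API layer. A minimal sketch, assuming an OpenAI-style /chat/completions route; the model name in the usage comment is illustrative, so check Groq's current model list before relying on it:

```typescript
// Base URL taken from the specifications above.
const GROQ_BASE_URL = 'https://api.groq.com/openai/v1';

// Build the request for a direct chat completion call.
// The payload follows the OpenAI chat completions shape.
function buildChatCompletionRequest(prompt: string, model: string, apiKey: string) {
  return {
    url: `${GROQ_BASE_URL}/chat/completions`,
    init: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: prompt }],
      }),
    },
  };
}

// Usage (model name is illustrative):
// const { url, init } = buildChatCompletionRequest('Hello!', 'llama-3.1-8b-instant', GROQ_API_KEY);
// const res = await fetch(url, init);
```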

Capabilities (5)

Chat Completion (llm.chat)
Score: 92
Streaming Chat (llm.stream)
Score: 95
Speech Synthesis (tts.synthesize)
Score: 85
Speech Transcription (stt.transcribe)
Score: 88
Fast LLM Inference (llm.fast-inference)
Score: 99 (default)
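The capability IDs listed above plug into the same execute payload shown in the quick start below; only the capability string and input change. A sketch of building a payload for a non-chat capability; the transcription input field name is an assumption, not a documented schema, so consult the API reference for the exact shape:

```typescript
// Shape of the execute payload used throughout this page.
type ExecutePayload = {
  provider: string;
  capability: string;
  input: Record<string, unknown>;
};

// Build a Groq-targeted payload for any capability ID from the list above.
function groqPayload(capability: string, input: Record<string, unknown>): ExecutePayload {
  return { provider: 'groq', capability, input };
}

// Example: speech transcription (field name `audioUrl` is assumed).
const transcribe = groqPayload('stt.transcribe', {
  audioUrl: 'https://example.com/clip.wav',
});
```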

Quick start

Use Groq through Alfred's unified API — no provider-specific SDK needed.

cURL

curl -X POST https://api.alfred-ai.app/v1/execute \
  -H "Authorization: Bearer $ALFRED_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "groq",
    "capability": "llm.chat",
    "input": { "prompt": "Hello from Groq!" }
  }'

TypeScript

import { Alfred } from '@alfred/core';

const alfred = new Alfred({ apiKey: process.env.ALFRED_API_KEY });

const result = await alfred.execute({
  provider: 'groq',
  capability: 'llm.chat',
  input: { prompt: 'Hello from Groq!' },
});

console.log(result.output);
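Network calls like the one above can fail transiently, so it is worth wrapping them. A minimal retry sketch, not a built-in Alfred feature; the attempt count and backoff values are illustrative defaults:

```typescript
// Retry an async call up to `attempts` times with linear backoff.
// Rethrows the last error if every attempt fails.
async function withRetries<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off between attempts (skip the wait after the final failure).
      if (i < attempts - 1) await new Promise((r) => setTimeout(r, 100 * (i + 1)));
    }
  }
  throw lastError;
}

// Usage: const result = await withRetries(() => alfred.execute({ /* ... */ }));
```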

Related Large Language Models providers

Anthropic

premium tier

OpenAI

premium tier

Mistral AI

standard tier

Google Gemini

economy tier

Getting started →
API reference →
All capabilities →
All providers →