Add groq to docs (#557)
Added code snippets to documentation for using Groq with E2B.
tizkovatereza authored Jan 27, 2025
2 parents 90788ee + 0524739 commit aea3d3e
Showing 1 changed file with 41 additions and 0 deletions.
41 changes: 41 additions & 0 deletions apps/web/src/app/(docs)/docs/quickstart/connect-llms/page.mdx
@@ -7,6 +7,7 @@ If the LLM doesn't support tool use, you can, for example, prompt the LLM to out
- [OpenAI](#openai)
- [Anthropic](#anthropic)
- [Mistral](#mistral)
- [Groq](#groq)
- [Vercel AI SDK](#vercel-ai-sdk)
- [CrewAI](#crewai)
- [LangChain](#langchain)
@@ -375,6 +376,46 @@ print(final_response.choices[0].message.content)

---

## Groq
<CodeGroup>
```python
# pip install groq e2b-code-interpreter
import os
from groq import Groq
from e2b_code_interpreter import Sandbox

api_key = os.environ["GROQ_API_KEY"]

# Create Groq client
client = Groq(api_key=api_key)

system_prompt = "You are a helpful assistant that can execute python code in a Jupyter notebook. Only respond with the code to be executed and nothing else. Strip backticks in code blocks."
prompt = "Calculate how many r's are in the word 'strawberry.'"

# Send the prompt to the model
response = client.chat.completions.create(
    model="llama3-70b-8192",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": prompt},
    ],
)

# Extract the code from the response
code = response.choices[0].message.content

# Execute code in E2B Sandbox
with Sandbox() as sandbox:
    execution = sandbox.run_code(code)
    result = execution.text

print(result)
```
</CodeGroup>
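
The system prompt asks the model to return bare code, but models sometimes wrap their output in Markdown fences anyway. A minimal, hypothetical helper (the `strip_code_fences` name and logic are illustrative and not part of the Groq or E2B APIs) could defensively remove such fences before the code reaches the sandbox:

<CodeGroup>
```python
FENCE = "`" * 3  # the literal Markdown code-fence marker

# Hypothetical helper for illustration only: strip a leading/trailing
# Markdown fence the model may emit despite the system prompt.
def strip_code_fences(text: str) -> str:
    lines = text.strip().splitlines()
    if lines and lines[0].startswith(FENCE):
        lines = lines[1:]  # drop the opening fence (may carry a language tag)
    if lines and lines[-1].startswith(FENCE):
        lines = lines[:-1]  # drop the closing fence
    return "\n".join(lines)

# Possible usage before execution:
# code = strip_code_fences(response.choices[0].message.content)
```
</CodeGroup>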

---

## Vercel AI SDK
Vercel's [AI SDK](https://sdk.vercel.ai) supports multiple LLM providers through a unified JavaScript interface that's easy to use.

