Commit 7ad330d

Authored by lpozobzaczynski, co-authored by Bartosz Zaczyński <bartosz.zaczynski@gmail.com>

Sample code for the article on Ollama and Python (#732)

* Sample code for the article on Ollama and Python
* Fix linter issues

1 parent e660117 commit 7ad330d

File tree

7 files changed: +112 −0 lines changed

ollama-python-sdk/README.md

Lines changed: 3 additions & 0 deletions

# How to Integrate Local LLMs With Ollama and Python

This folder provides the code examples for the Real Python tutorial [How to Integrate Local LLMs With Ollama and Python](https://realpython.com/ollama-python/).

ollama-python-sdk/chat.py

Lines changed: 11 additions & 0 deletions

from ollama import chat

messages = [
    {
        "role": "user",
        "content": "Explain what Python is in one sentence.",
    },
]

response = chat(model="llama3.2:latest", messages=messages)
print(response.message.content)

ollama-python-sdk/chat_context.py

Lines changed: 24 additions & 0 deletions

from ollama import chat

messages = [
    {
        "role": "system",
        "content": "You are an expert Python tutor.",
    },
    {
        "role": "user",
        "content": "Define list comprehensions in a sentence.",
    },
]

response = chat(model="llama3.2:latest", messages=messages)
print(response.message.content)

messages.append(response.message)  # Keep context
messages.append(
    {
        "role": "user",
        "content": "Provide a short, practical example.",
    }
)

response = chat(model="llama3.2:latest", messages=messages)
print(response.message.content)
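The key move in chat_context.py is appending the assistant's reply back onto the messages list before the next call, so the model sees the whole conversation. The pattern can be sketched without a running Ollama server by swapping `chat()` for a stub (`fake_chat` below is a hypothetical stand-in, not part of the ollama package):

```python
# Sketch of the context-keeping pattern from chat_context.py, with a
# stub model instead of a live Ollama server. `fake_chat` is a
# hypothetical stand-in for `ollama.chat`.


def fake_chat(messages):
    """Pretend model: replies by counting the user turns seen so far."""
    user_turns = sum(1 for m in messages if m["role"] == "user")
    return {"role": "assistant", "content": f"Reply to turn {user_turns}"}


messages = [{"role": "user", "content": "Define list comprehensions."}]
reply = fake_chat(messages)

messages.append(reply)  # Keep context, exactly as in chat_context.py
messages.append({"role": "user", "content": "Provide a short example."})

reply = fake_chat(messages)
print(reply["content"])  # Reply to turn 2
print(len(messages))  # 3: user, assistant, user
```

Drop the `messages.append(reply)` line and the second call only ever sees a one-turn conversation, which is why each follow-up question must carry the full history.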

ollama-python-sdk/generate_code.py

Whitespace-only changes.

ollama-python-sdk/generate_text.py

Lines changed: 8 additions & 0 deletions

from ollama import generate

response = generate(
    model="llama3.2:latest",
    prompt="Explain what Python is in one sentence.",
)

print(response.response)

ollama-python-sdk/streams.py

Lines changed: 15 additions & 0 deletions

from ollama import chat

stream = chat(
    model="llama3.2:latest",
    messages=[
        {
            "role": "user",
            "content": "Explain Python dataclasses with a quick example.",
        }
    ],
    stream=True,
)

for chunk in stream:
    print(chunk.message.content, end="", flush=True)
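With `stream=True`, the loop consumes partial chunks as they arrive instead of waiting for the full reply. The consumption pattern can be illustrated without a server by using a plain generator as a stand-in for the response stream (`fake_stream` below is invented for the sketch):

```python
# Sketch of the streaming loop in streams.py, with a plain generator
# standing in for the chunked response (no Ollama server involved).


def fake_stream():
    """Yield a reply in small pieces, as a streaming chat would."""
    for piece in ["Data", "classes ", "reduce ", "boilerplate."]:
        yield piece


collected = []
for chunk in fake_stream():
    # streams.py prints each chunk immediately; here we also collect them
    print(chunk, end="", flush=True)
    collected.append(chunk)
print()

full_text = "".join(collected)  # "Dataclasses reduce boilerplate."
```

The `end=""` and `flush=True` arguments matter for the same reason in both versions: they keep the partial pieces on one line and force each one onto the terminal as soon as it arrives.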

ollama-python-sdk/tool_calling.py

Lines changed: 51 additions & 0 deletions

import math

from ollama import chat


# Define a tool as a Python function
def square_root(number: float) -> float:
    """Calculate the square root of a number.

    Args:
        number: The number to calculate the square root for.

    Returns:
        The square root of the number.
    """
    return math.sqrt(number)


messages = [
    {
        "role": "user",
        "content": "What is the square root of 36?",
    }
]

response = chat(
    model="llama3.2:latest",
    messages=messages,
    tools=[square_root],  # Pass the tools along with the prompt
)

# Append the response for context
messages.append(response.message)

if response.message.tool_calls:
    tool = response.message.tool_calls[0]
    # Call the tool
    result = square_root(float(tool.function.arguments["number"]))

    # Append the tool result
    messages.append(
        {
            "role": "tool",
            "tool_name": tool.function.name,
            "content": str(result),
        }
    )

# Obtain the final answer
final_response = chat(model="llama3.2:latest", messages=messages)
print(final_response.message.content)
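tool_calling.py hard-codes the call to `square_root`, which works for a single tool but not for several. A common generalization is a name-to-function registry that dispatches whatever tool the model requested. The sketch below uses a hypothetical `FunctionCall` dataclass as a stand-in for the tool-call object a chat response carries (the real object exposes `tool.function.name` and `tool.function.arguments` in a similar shape):

```python
import math
from dataclasses import dataclass


# Hypothetical stand-in for the tool-call object found on
# response.message.tool_calls; invented here for a server-free sketch.
@dataclass
class FunctionCall:
    name: str
    arguments: dict


def square_root(number: float) -> float:
    """Calculate the square root of a number."""
    return math.sqrt(number)


# Registry: look up the requested tool by name instead of hard-coding it
TOOLS = {"square_root": square_root}

call = FunctionCall(name="square_root", arguments={"number": 36})
result = TOOLS[call.name](float(call.arguments["number"]))
print(result)  # 6.0
```

Adding a second tool then only means adding one entry to `TOOLS`; the dispatch line and the message-appending logic stay unchanged.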
