
Commit 33b3c70

website
1 parent 434941e commit 33b3c70

File tree

1 file changed: +255 −0 lines changed


Diff for: llm/website_ollama.ipynb

@@ -0,0 +1,255 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "d465baef-080f-4cdb-b1c3-465e4e1ce305",
   "metadata": {},
   "source": [
    "# Running a local LLM with ollama\n",
    "\n",
    "[Ollama](https://ollama.com/) is probably the easiest way to run an LLM on your local machine.\n",
    "\n",
    "To run the code of this notebook on your machine, you will need [ollama-python](https://github.com/ollama/ollama-python).\n",
    "\n",
    "If you want to run this notebook on Google Colab, follow the instructions below."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "720847c4-1c9a-4ea9-8de5-2a0d198a4af8",
   "metadata": {},
   "source": [
    "## Instructions to run Ollama on Colab:\n",
    "- First, install colab-xterm and load the extension:\n",
    "```\n",
    "!pip install colab-xterm #https://pypi.org/project/colab-xterm/\n",
    "%load_ext colabxterm\n",
    "```\n",
    "- Start an xterm and, inside the terminal, install Ollama and start a server:\n",
    "```\n",
    "%xterm\n",
    "# curl https://ollama.ai/install.sh | sh\n",
    "# ollama serve & ollama pull mistral\n",
    "```\n",
    "- In a cell, check that your model is available:\n",
    "```\n",
    "!ollama list\n",
    "```\n",
    "- In a cell, install ollama-python:\n",
    "```\n",
    "!pip install ollama\n",
    "```"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "13e9b490-6c0a-4563-bbf7-03c2bb268449",
   "metadata": {},
   "source": [
    "## Check that everything is running fine"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "dd16d74c-ed5e-4826-952e-f1809e48fae6",
   "metadata": {},
   "outputs": [],
   "source": [
    "from ollama import chat\n",
    "from ollama import ChatResponse\n",
    "from IPython import display"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1f804638-4101-46d4-86eb-c58e4c005394",
   "metadata": {},
   "outputs": [],
   "source": [
    "response: ChatResponse = chat(model='mistral', messages=[\n",
    "  {\n",
    "    'role': 'user',\n",
    "    'content': 'Why is the sky blue?',\n",
    "  },\n",
    "])\n",
    "\n",
    "print(response.message.content)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0c81abfc-3a45-4d7a-ae0c-ad2f73f63783",
   "metadata": {},
   "source": [
    "## Choose your model (you will need to pull it first) and make a nice website!"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "fd80bb2c-d17f-4957-93ea-4abf89a1579f",
   "metadata": {},
   "outputs": [],
   "source": [
    "model = 'mistral'"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "99eb2596-ad7a-4e12-9dab-957c4d2e3819",
   "metadata": {},
   "outputs": [],
   "source": [
    "prompt_0 = \"Write a webpage for the course: Large Language Models : Introduction and Applications for Code given by Nathanaël FIJALKOW and Marc LELARGE in the Master MVA. I will pipe your output directly to a file, so just give me ```html\\n[page here]\\n```.\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1e4c077a-fd5c-4f20-b1df-c33d506ec61c",
   "metadata": {},
   "outputs": [],
   "source": [
    "response_0: ChatResponse = chat(model=model, messages=[\n",
    "  {\n",
    "    'role': 'user',\n",
    "    'content': prompt_0,\n",
    "  },\n",
    "])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "ccef4c65-7df9-45dc-95eb-4fe001f46177",
   "metadata": {},
   "outputs": [],
   "source": [
    "print(response_0.message.content)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "43f79f75-f0c9-40c6-8686-7aedde090f12",
   "metadata": {},
   "outputs": [],
   "source": [
    "page = response_0.message.content.split(\"```\")[1].partition(\"\\n\")[2]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a729c73f-1b21-475f-b381-bd25fc2158c6",
   "metadata": {},
   "outputs": [],
   "source": [
    "display.HTML(page)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6ad3fa84-d37b-45fc-95e2-afaf8618fc5f",
   "metadata": {},
   "outputs": [],
   "source": [
    "def parse_html(response):\n",
    "    return response.message.content.split(\"```\")[1].partition(\"\\n\")[2]\n",
    "\n",
    "def write_html_file(file_name, content):\n",
    "    try:\n",
    "        with open(file_name, 'w') as file:\n",
    "            file.write(content)\n",
    "        print(f\"HTML file '{file_name}' created successfully.\")\n",
    "    except Exception as e:\n",
    "        print(f\"An error occurred while writing the HTML file: {e}\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5446b7bf-67c7-4de2-a06e-543bc93af478",
   "metadata": {},
   "outputs": [],
   "source": [
    "write_html_file('webpage_mistral_v0.html', page)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "60eddf60-b0ba-4f3c-a977-c916a9706e91",
   "metadata": {},
   "outputs": [],
   "source": [
    "prompt_1 = \"Add more detail and better html and css. I will pipe your output directly to a file, so just give me ```html\\n[page here]\\n``` so that it fits in one file.\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b3e07184-84ae-48c9-ac04-4c912bbeb484",
   "metadata": {},
   "outputs": [],
   "source": [
    "response_1: ChatResponse = chat(model=model, messages=[\n",
    "  {\n",
    "    'role': 'user',\n",
    "    'content': prompt_0,\n",
    "  },\n",
    "  response_0.message,\n",
    "  {\n",
    "    'role': 'user',\n",
    "    'content': prompt_1\n",
    "  },\n",
    "])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "fe439346-e76e-4381-a7d5-df88472203c8",
   "metadata": {},
   "outputs": [],
   "source": [
    "new_page = parse_html(response_1)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5360dbf9-7782-4b90-845e-ec759ca8a9ea",
   "metadata": {},
   "outputs": [],
   "source": [
    "write_html_file('webpage_mistral_v1.html', new_page)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "chess",
   "language": "python",
   "name": "chess"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
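One caveat on the notebook's `parse_html`: it assumes the reply contains a complete ```-fenced block, so `split("```")[1]` raises `IndexError` whenever the model skips the fence. A more defensive variant is sketched below (the `extract_fenced` helper is hypothetical, not part of this commit):

```python
def extract_fenced(text: str, lang: str = "html") -> str:
    """Return the body of the first ```-fenced block in text.
    Falls back to the whole text when no complete fence is found,
    so the result is still usable as page content."""
    parts = text.split("```")
    if len(parts) < 3:
        # No complete opening/closing fence pair: return text as-is.
        return text.strip()
    body = parts[1]
    # Drop a leading language tag such as "html" if present.
    first_line, _, rest = body.partition("\n")
    if first_line.strip() in (lang, ""):
        return rest
    return body

reply = "Here you go:\n```html\n<h1>Hi</h1>\n```\nEnjoy!"
print(extract_fenced(reply))  # → <h1>Hi</h1>
```

This keeps the happy path identical to the notebook's one-liner while surviving fence-less replies.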

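The `response_1` cell works because `ollama.chat` is stateless: each turn must resend the full history (`prompt_0`, the assistant's first message, then `prompt_1`). That replay pattern can be factored into a tiny helper, sketched here with a stubbed assistant message so no server is needed (`make_history` and the stub are hypothetical, not in the commit):

```python
def make_history(prompt_0, assistant_message, prompt_1):
    """Rebuild the full conversation for a stateless chat API:
    every follow-up call resends all previous messages in order."""
    return [
        {'role': 'user', 'content': prompt_0},
        assistant_message,  # e.g. response_0.message from a real run
        {'role': 'user', 'content': prompt_1},
    ]

# Stub standing in for response_0.message (a real run returns an ollama Message).
stub = {'role': 'assistant', 'content': '```html\n<h1>Draft</h1>\n```'}
history = make_history("Write a webpage.", stub, "Add more detail.")
print(len(history))  # → 3
```

With a live server, `chat(model=model, messages=history)` would then produce the revised page, exactly as the `response_1` cell does.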