Browse source

update download links to the releases page until we have a better download url

Jeffrey Morgan 1 year ago
parent
commit
12199bcfa8
3 changed files with 8 additions and 11 deletions
  1. README.md (+1, −1)
  2. web/app/page.tsx (+5, −8)
  3. web/package-lock.json (+2, −2)

+ 1 - 1
README.md

@@ -16,7 +16,7 @@ Run large language models with `llama.cpp`.
 
 ## Install
 
-- Download for macOS
+- [Download](https://github.com/jmorganca/ollama/releases/latest) for macOS
 - Download for Windows (coming soon)
 - Docker: `docker run -p 11434:11434 ollama/ollama`
 

+ 5 - 8
web/app/page.tsx

@@ -5,18 +5,15 @@ export default async function Home() {
     <main className='flex min-h-screen max-w-2xl flex-col p-4 lg:p-24'>
       <h1 className='font-serif text-3xl'>ollama</h1>
       <section className='my-8'>
-        <p className='my-3 mb-8 max-w-md'>
+        <p className='my-3 max-w-md'>
           <a className='underline' href='https://github.com/jmorganca/ollama'>
             Ollama
           </a>{' '}
-          is a tool for running large language models.
-          <br />
-          <br />
-          Get started with Ollama using pip:
+          is a tool for running large language models. The latest version is available for download{' '}
+          <a className='underline' href='https://github.com/jmorganca/ollama/releases/latest'>
+            here.
+          </a>
         </p>
-        <pre className='my-4'>
-          <code>pip install ollama</code>
-        </pre>
       </section>
       <section className='my-4'>
         <h2 className='mb-4 text-lg'>Example models you can try running:</h2>

+ 2 - 2
web/package-lock.json

@@ -1,12 +1,12 @@
 {
   "name": "web",
-  "version": "0.1.0",
+  "version": "0.0.0",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "web",
-      "version": "0.1.0",
+      "version": "0.0.0",
       "dependencies": {
         "@octokit/rest": "^19.0.13",
         "@types/node": "20.4.0",