
Merge pull request #735 from Arthur2500/main

Replaced Ollama's old .ai TLD with new .com TLD
Timothy Jaeryang Baek, 1 year ago
Parent commit: a2a433b08c
5 changed files with 6 additions and 6 deletions
  1. README.md (+1 −1)
  2. TROUBLESHOOTING.md (+1 −1)
  3. docs/apache.md (+1 −1)
  4. src/lib/components/chat/Settings/Models.svelte (+2 −2)
  5. src/routes/(app)/modelfiles/create/+page.svelte (+1 −1)

+ 1 - 1
README.md

@@ -121,7 +121,7 @@ Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/)
 
 2. **Ensure You Have the Latest Version of Ollama:**
 
-   - Download the latest version from [https://ollama.ai/](https://ollama.ai/).
+   - Download the latest version from [https://ollama.com/](https://ollama.com/).
 
 3. **Verify Ollama Installation:**
    - After installing Ollama, check if it's working by visiting [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Remember, the port number might be different for you.

+ 1 - 1
TROUBLESHOOTING.md

@@ -20,7 +20,7 @@ docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BAS
 
 ### General Connection Errors
 
-**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.ai/) for the latest updates.
+**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.com/) for the latest updates.
 
 **Troubleshooting Steps**:
 

+ 1 - 1
docs/apache.md

@@ -74,7 +74,7 @@ On your latest installation of Ollama, make sure that you have setup your api se
 
 The guide doesn't seem to match the current updated service file on linux. So, we will address it here:
 
-Unless when you're compiling Ollama from source, installing with the standard install `curl https://ollama.ai/install.sh | sh` creates a file called `ollama.service` in /etc/systemd/system. You can use nano to edit the file:
+Unless when you're compiling Ollama from source, installing with the standard install `curl https://ollama.com/install.sh | sh` creates a file called `ollama.service` in /etc/systemd/system. You can use nano to edit the file:
 
 ```
 sudo nano /etc/systemd/system/ollama.service

+ 2 - 2
src/lib/components/chat/Settings/Models.svelte

@@ -291,7 +291,7 @@
 <div class="flex flex-col h-full justify-between text-sm">
 	<div class=" space-y-3 pr-1.5 overflow-y-scroll h-80">
 		<div>
-			<div class=" mb-2.5 text-sm font-medium">Pull a model from Ollama.ai</div>
+			<div class=" mb-2.5 text-sm font-medium">Pull a model from Ollama.com</div>
 			<div class="flex w-full">
 				<div class="flex-1 mr-2">
 					<input
@@ -354,7 +354,7 @@
 			<div class="mt-2 mb-1 text-xs text-gray-400 dark:text-gray-500">
 				To access the available model names for downloading, <a
 					class=" text-gray-500 dark:text-gray-300 font-medium"
-					href="https://ollama.ai/library"
+					href="https://ollama.com/library"
 					target="_blank">click here.</a
 				>
 			</div>

+ 1 - 1
src/routes/(app)/modelfiles/create/+page.svelte

@@ -497,7 +497,7 @@ SYSTEM """${system}"""`.replace(/^\s*\n/gm, '');
 							<div class="mt-1 text-xs text-gray-400 dark:text-gray-500">
 								To access the available model names for downloading, <a
 									class=" text-gray-500 dark:text-gray-300 font-medium"
-									href="https://ollama.ai/library"
+									href="https://ollama.com/library"
 									target="_blank">click here.</a
 								>
 							</div>