Not convinced yet?
Give my little POC a try - built in a single day, using the best Low-code tool - Oracle APEX (#orclAPEX) with some REST API calls to my local machine.
Now, explaining memes to you, using the latest version of LLaVA multi-modal LLM, run locally using Ollama ✨… pic.twitter.com/atvTzVpMCg
— Plamen Mushkov (@plamen_9) February 12, 2024
ubuntu@mywhisper2:~$ curl -fsSL https://ollama.com/install.sh | sh
>>> Downloading ollama...
######################################################################## 100.0%
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> The Ollama API is now available at 0.0.0.0:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA GPU detected. Ollama will run in CPU-only mode.
ubuntu@mywhisper2:~$
ubuntu@mywhisper2:~$ ollama run codellama
pulling manifest
pulling 3a43f93b78ec... 100% ▕████████████████▏ 3.8 GB
pulling 8c17c2ebb0ea... 100% ▕████████████████▏ 7.0 KB
pulling 590d74a5569b... 100% ▕████████████████▏ 4.8 KB
pulling 2e0493f67d0c... 100% ▕████████████████▏ 59 B
pulling 7f6a57943a88... 100% ▕████████████████▏ 120 B
pulling 316526ac7323... 100% ▕████████████████▏ 529 B
verifying sha256 digest
writing manifest
removing any unused layers
success
>>> /?
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load <model>   Load a session or model
  /save <model>   Save your current session
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts
Use """ to begin a multi-line message.
>>> /bye
ubuntu@mywhisper2:~$
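Outside the interactive session, the same model can also be queried over the REST API that the installer exposed on port 11434. A minimal check (the prompt text is just an example):

```
curl http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write a one-line Bash command that counts the files in a directory.",
  "stream": false
}'
```

With `"stream": false` the server returns a single JSON object containing the full response instead of a stream of partial chunks.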
ubuntu@mywhisper2:~$ sudo firewall-cmd --add-port=11434/tcp
You're performing an operation over default zone ('public'),
but your connections/interfaces are in zone 'docker' (see --get-active-zones)
You most likely need to use --zone=docker option.
success
ubuntu@mywhisper2:~$
ubuntu@mywhisper2:~$ sudo firewall-cmd --runtime-to-permanent
success
ubuntu@mywhisper2:~$
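As the warning above points out, the active interfaces on this machine are in the docker zone, not the default public zone. On such a system the port may need to be opened in that zone explicitly — a sketch:

```
# Open the Ollama port in the zone that actually holds the active interfaces
sudo firewall-cmd --zone=docker --add-port=11434/tcp
sudo firewall-cmd --runtime-to-permanent
```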
proxy_pass http://localhost:11434/;
After making the above changes, restart Nginx and you will be able to call Ollama through it.
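For context, the `proxy_pass` directive above lives inside a `location` block. A minimal server block might look like the following sketch (the `listen` port and `server_name` are placeholders for your environment):

```nginx
server {
    listen 80;
    server_name example.com;  # placeholder: replace with your host name

    location / {
        # Forward all requests to the local Ollama API
        proxy_pass http://localhost:11434/;
    }
}
```

After editing the config, `sudo nginx -t` validates the syntax and `sudo systemctl restart nginx` applies the change.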