
DeepSeek R1 with Ollama in Docker Compose

Reading time: < 1 minute

Today I'm going to show you how to deploy DeepSeek R1, the new Chinese model, with Docker Compose using Ollama.

Deepseek logo

It's a very simple process.

First, let's create the docker compose file with the Ollama image:

services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    restart: unless-stopped
    ports:
      - 11434:11434
    volumes:
      - ./models:/root/.ollama  # Mount a local folder into the container

Bring up the container:

docker compose up -d

Now let's download and run the DeepSeek R1 7B model:

docker exec -it ollama ollama run deepseek-r1:7b

A chat prompt will appear so you can start using it:

DeepSeek chat conversation

API documentation: https://github.com/ollama/ollama/blob/main/docs/api.md
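Besides the interactive chat, the container exposes Ollama's HTTP API on port 11434 (the one mapped in the compose file). Here's a minimal sketch in Python of a non-streaming call to the `/api/generate` endpoint, assuming the service is running locally and the `deepseek-r1:7b` model has already been pulled:

```python
# Minimal sketch of querying the Ollama API exposed by the container above.
# Assumes the compose service is running on localhost:11434 and that
# the deepseek-r1:7b model has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt to Ollama and return the full response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example usage (with the container running):
# print(ask("deepseek-r1:7b", "What is Docker Compose?"))
```

With `"stream": False` the API returns a single JSON object whose `response` field holds the complete answer; without it, Ollama streams one JSON object per token.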

If you want to add authentication, here I explain how: https://devcodelight.com/anadir-autenticacion-en-nginx-proxy-manager-para-tus-dominios

If you need GPU access through Docker: https://devcodelight.com/anadir-gpu-en-docker-para-ollama-u-otros-servicios/
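As a quick preview of that article: with the NVIDIA Container Toolkit installed on the host, the compose service can reserve the GPU with a `deploy` block roughly like this (a sketch; see the linked post for the full setup):

```yaml
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    restart: unless-stopped
    ports:
      - 11434:11434
    volumes:
      - ./models:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # reserve every available GPU
              capabilities: [gpu]
```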

And if you want to use the chat through a web interface: https://devcodelight.com/mostrar-una-interfaz-web-de-chat-usando-open-webui-para-llama-3-2-o-cualquier-modelo-compatible-con-ollama-usando-docker-compose/
