How to install and use DeepSeek-R1 locally on your computer, whether you use Ubuntu or any other system

  • DeepSeek-R1 is an open-source model with advanced reasoning capabilities.
  • Ollama makes it easy to install and manage AI models locally.
  • ChatBoxAI provides a graphical interface to interact with models like DeepSeek.
  • The model can be easily integrated into Python development projects.

DeepSeek-R1 on Ubuntu

Artificial intelligence continues to transform our world, and the options for working with advanced language models are growing by leaps and bounds. However, not everyone needs to connect to cloud services or rely on third parties to explore these technologies. An interesting and accessible alternative is DeepSeek-R1, an AI model that users can run locally even on modest computers. In this article, I will explain how to install DeepSeek and get the most out of its capabilities.

DeepSeek-R1 is an open-source AI model that stands out for its efficiency and advanced reasoning capabilities. By running it locally, you not only avoid recurring costs, but also protect your privacy and gain the flexibility to integrate it into custom projects. While some models require powerful hardware, DeepSeek-R1 offers versions tuned for different resources, from entry-level computers to advanced workstations.

What is DeepSeek and why use it locally?

DeepSeek-R1 is an advanced language model designed for complex tasks such as logical reasoning, mathematical problem solving, and code generation. Its main advantage is that it is open source, which means you can install and run it on your own computer without relying on external servers.

Some of its most notable features include:

  • Flexibility: You can customize the model to suit your needs, from lightweight versions to advanced configurations.
  • Privacy: All processing is done locally, so you avoid exposing sensitive data. This is perhaps the most important point, as many people worry about what companies might do with their data.
  • Savings: You won't have to pay for subscriptions or cloud services, making it an economical option for developers and businesses.

Requirements for installation

Before you begin the installation, make sure you meet the following requirements:

  • A computer running Linux, macOS, or Windows (with WSL2 support in the latter case).
  • A minimum of 8 GB of RAM, although at least 16 GB is recommended for optimal performance (a quick way to check how much memory you have is shown just after this list).
  • Internet access to download the models initially.
  • Basic knowledge of terminal or command line.
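
If you are not sure how much RAM your machine has, the short Python snippet below prints the total and compares it against the figures above. It is a minimal sketch that relies on POSIX sysconf, so it works on Linux and macOS but not on native Windows (there, check the Task Manager instead).

import os

# Total physical memory via POSIX sysconf (Linux and macOS).
page_size = os.sysconf("SC_PAGE_SIZE")    # size of one memory page, in bytes
page_count = os.sysconf("SC_PHYS_PAGES")  # number of physical pages installed

total_gib = page_size * page_count / (1024 ** 3)
print(f"Installed RAM: {total_gib:.1f} GiB")

if total_gib < 8:
    print("Below the 8 GB minimum suggested for DeepSeek-R1.")
elif total_gib < 16:
    print("Meets the minimum, but 16 GB or more is recommended.")
else:
    print("At or above the recommended 16 GB.")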

Furthermore, you will need to install a tool called Ollama, which manages and runs DeepSeek models locally.

Installation of Ollama

Ollama is a simple solution that allows you to download and run language models such as DeepSeek-R1. To install it, follow these steps:

  1. On Linux or macOS, open the terminal and run the following command to install Ollama (the curl package needs to be installed for it to work):
curl -fsSL https://ollama.com/install.sh | sh
  2. On Windows systems, make sure you have WSL2 enabled first, and then follow the same steps in the Ubuntu terminal that you configured within WSL.
  3. Verify that Ollama has been installed correctly by running ollama --version. If the command returns a version number, you are ready to move forward (a small programmatic check is also sketched right after this list).
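
Once installed, Ollama runs a local HTTP server that, by default, listens on port 11434. If you prefer a programmatic check, this minimal Python sketch (it only assumes that default port) tries to reach the local endpoint and tells you whether the server is answering; if it is not, start it manually with ollama serve.

import urllib.request
import urllib.error

# Ollama's local API listens on this address by default.
OLLAMA_URL = "http://localhost:11434"

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        # Any successful response means the server is up and reachable.
        print(f"Ollama is reachable at {OLLAMA_URL} (HTTP {resp.status})")
except (urllib.error.URLError, OSError) as exc:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")
    print("Try starting the server manually with: ollama serve")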

DeepSeek-R1 Download

With Ollama installed and running (if the download described below gives an error, start the server manually with ollama serve in the terminal), you can now download the DeepSeek model that best suits your needs and hardware:

  • 1.5B parameters: Ideal for basic computers. This model takes up approximately 1.1 GB.
  • 7B parameters: Recommended for computers with a mid-to-high-range GPU. It takes up about 4.7 GB.
  • 70B parameters: For complex tasks on machines with plenty of memory and a powerful GPU.

To download the standard 7B model, run this command in the terminal:

ollama run deepseek-r1

The download time will depend on your internet speed and is only needed the first time you run the chatbot. Once complete, the model will be ready to use from the command line or through a graphical interface.
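
Note that the command above pulls the default version of the model. For the other sizes listed earlier, the Ollama library usually publishes size-specific tags, so a command along the lines of ollama run deepseek-r1:1.5b or ollama run deepseek-r1:70b should fetch the lighter or heavier variant; check the model's page on ollama.com to confirm exactly which tags are available.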

Using DeepSeek with a graphical interface

Although you can interact with DeepSeek directly from the terminal, many users prefer a graphical interface for convenience. In this case, you can install ChatBoxAI, a free application that lets you work with DeepSeek visually.

  • Download and install ChatBoxAI from its official page.
  • Configure the application to use Ollama as the model provider:

In ChatBoxAI settings, select “Use my own API” and choose the DeepSeek model you downloaded earlier. If everything is set up correctly, you will be able to perform queries and tasks directly from the graphical interface.
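
Whichever front end you choose, it can help to confirm which model names your local Ollama server actually exposes before selecting one. The following Python sketch assumes the default Ollama endpoint (http://localhost:11434/v1) and the openai package; with a local server the API key can be any placeholder string, since Ollama does not validate it.

import openai

# Point the OpenAI-compatible client at the local Ollama server.
client = openai.Client(base_url="http://localhost:11434/v1", api_key="ollama")

# List the models available locally, e.g. "deepseek-r1", so you know the
# exact name to pick in ChatBoxAI or in your own code.
for model in client.models.list():
    print(model.id)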

Integrating DeepSeek into projects

If you are a developer, you can integrate DeepSeek into your projects using its API compatible with OpenAI. Here is a simple example using Python:

import openai

# Connect to the local Ollama server through its OpenAI-compatible API.
client = openai.Client(base_url="http://localhost:11434/v1", api_key="ollama")

# Send a prompt to the locally downloaded DeepSeek-R1 model.
response = client.chat.completions.create(
    model="deepseek-r1",
    messages=[{"role": "user", "content": "Generate code in Python to calculate Fibonacci"}],
)

# Print the model's answer.
print(response.choices[0].message.content)

This script will send a query to the local DeepSeek model and print the result in your terminal or application.
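
One practical detail: DeepSeek-R1 variants served through Ollama usually wrap their chain of thought in <think>…</think> tags before giving the final answer. If you only want the answer itself (for example, just the generated code), a small helper like the one below, which reuses the response object from the previous example, can strip that block; this is a sketch based on that typical output format, not something guaranteed by the API.

import re

def strip_reasoning(text: str) -> str:
    """Remove the <think>...</think> block that DeepSeek-R1 typically
    prepends to its replies, keeping only the final answer."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

answer = strip_reasoning(response.choices[0].message.content)
print(answer)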

The DeepSeek-R1 AI model is an excellent option for those looking for an advanced and economical solution. With the ease of access that Ollama provides, the flexibility of its models, and the ability to integrate it into custom projects, DeepSeek opens up new possibilities for developers, students, and AI experts. With its focus on privacy and performance, it is a tool worth exploring further.

