How to install DeepSeek on your phone — and why you’d want to

The latest smartphones can now run sophisticated AI language models like DeepSeek directly on device, offering enhanced privacy through offline processing.

Key capabilities: Modern flagship smartphones have demonstrated the ability to run distilled, quantized versions of large language models (LLMs) locally, achieving usable performance for basic tasks.

  • High-end phones with Snapdragon 8 Elite chips can process 7-8 billion parameter models at 11 tokens per second
  • Older devices like the Pixel 7 Pro can handle smaller 3 billion parameter models at about 5 tokens per second (a quick sense of what these rates mean in practice follows this list)
  • Current implementations rely solely on CPU processing, with no GPU or NPU acceleration yet available
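
To put those throughput figures in perspective, the short Python sketch below converts them into approximate wait times for a reply. The 100- and 250-token answer lengths are illustrative assumptions, not figures from the article.

```python
# Convert the reported generation speeds into rough wait times for a reply.
# The 100- and 250-token answer lengths are illustrative assumptions.
rates = {
    "Snapdragon 8 Elite, 7-8B model": 11,  # tokens per second (reported)
    "Pixel 7 Pro, 3B model": 5,
}

for label, tokens_per_sec in rates.items():
    for answer_tokens in (100, 250):
        seconds = answer_tokens / tokens_per_sec
        print(f"{label}: ~{seconds:.0f} s for a {answer_tokens}-token reply")
```

At 11 tokens per second a paragraph-length answer arrives in well under half a minute, while the same reply takes roughly twice as long on the older hardware.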

Technical requirements: Running local AI models demands substantial hardware resources and careful consideration of device specifications.

  • Phones need at least 12GB of RAM to run 7-8 billion parameter models effectively (a rough memory estimate follows this list)
  • 16GB or more RAM is required for larger 14 billion parameter models
  • Processing power strongly affects generation speed, so newer chips deliver noticeably faster responses
  • Device temperature can increase substantially during model operation
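
As a rough sanity check on those RAM figures, the sketch below estimates the memory footprint of a quantized model. The 4-bit weight size and 20% runtime overhead are assumptions on my part; the phone's OS and other apps also need to stay resident, which is why 12GB and 16GB devices are recommended rather than the bare model size.

```python
# Back-of-the-envelope RAM estimate for running a quantized LLM on a phone.
# Assumptions (not from the article): 4-bit weights plus ~20% runtime overhead
# for the KV cache, tokenizer, and inference-engine buffers.

def estimate_ram_gb(params_billions: float, bits_per_weight: int = 4,
                    overhead_fraction: float = 0.2) -> float:
    """Rough memory footprint in GB for a quantized model."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    total_bytes = weight_bytes * (1 + overhead_fraction)
    return total_bytes / 1e9

for size in (3, 7, 8, 14):
    print(f"{size}B model @ 4-bit: ~{estimate_ram_gb(size):.1f} GB")

# Typical output:
#  3B model @ 4-bit: ~1.8 GB
#  7B model @ 4-bit: ~4.2 GB
#  8B model @ 4-bit: ~4.8 GB
# 14B model @ 4-bit: ~8.4 GB
```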

Implementation options: Users have two main approaches to installing local AI models on their phones.

  • PocketPal AI offers a user-friendly app-based solution for both Android and iOS
  • Advanced users can use Termux and Ollama for a more technical command-line setup (a minimal sketch of querying a local Ollama server follows this list)
  • Both methods allow access to a range of models through the Hugging Face hub
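
For the Termux-and-Ollama route, once the Ollama server is running on the phone it exposes its standard local HTTP API, so any small script can talk to the model without data leaving the device. The sketch below assumes Ollama is already installed and serving on its default port, and that a DeepSeek model has been pulled; the `deepseek-r1:7b` tag is only an example and may not match what your device can hold.

```python
# Minimal sketch: query an Ollama server running locally (e.g., inside Termux)
# and print the reply plus an approximate tokens-per-second figure.
# Assumes Ollama is already serving on its default port and that a DeepSeek
# model has been pulled; the tag below is an example, not a requirement.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "deepseek-r1:7b"  # example tag; substitute whatever model you pulled

payload = {
    "model": MODEL,
    "prompt": "Explain in two sentences why on-device inference is private.",
    "stream": False,  # return one JSON object instead of a token stream
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=300)
resp.raise_for_status()
data = resp.json()

print(data["response"])

# eval_count / eval_duration (nanoseconds) gives the generation speed
if data.get("eval_duration"):
    tps = data["eval_count"] / (data["eval_duration"] / 1e9)
    print(f"~{tps:.1f} tokens/second")
```

Because the request only ever goes to localhost, the prompt and the reply never leave the phone, which is the privacy argument for running the model locally in the first place.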

Current limitations: Local AI implementation faces several practical constraints.

  • Models cannot access the internet or call external tools the way cloud-based assistants can
  • User interface limitations make document processing and complex interactions challenging
  • App stability issues and memory management remain ongoing concerns
  • Lack of hardware acceleration support restricts performance on older devices

Looking ahead: While current smartphone AI capabilities show promise, significant development is still needed for widespread adoption.

The successful implementation of local AI models on smartphones demonstrates technical feasibility, but practical limitations and setup complexity currently restrict their appeal to enthusiasts and developers. Future advances in hardware acceleration and improved user interfaces could make local AI processing more accessible to mainstream users.

How I installed DeepSeek on my phone with surprisingly good results
