Founded in 2023, DeepSeek focuses on research into foundation models and technologies for artificial general intelligence. It has released several multi-billion-parameter models, including DeepSeek-LLM and DeepSeek-Coder, and was the first in China to open-source a MoE model (DeepSeek-MoE). Its flagship models perform strongly on benchmarks and generalize well compared with models of similar scale. Users can access them through an API for dialog and code-assistant services.
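
As a rough sketch of what API access might look like (not taken from official documentation), the snippet below calls a chat-completion endpoint through the OpenAI-compatible Python client; the base URL `https://api.deepseek.com`, the model name `deepseek-chat`, and the `DEEPSEEK_API_KEY` environment variable are assumptions made for illustration.

```python
# Minimal sketch of calling a chat-style API with the OpenAI-compatible client.
# Base URL, model name, and environment variable are assumptions for illustration.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # assumed env var holding your API key
    base_url="https://api.deepseek.com",     # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",  # assumed chat model identifier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
)

print(response.choices[0].message.content)
```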