What is llm.c? A Framework for Training Large Language Models in C
2024-07-08
Person who needs help
I would like to know about llm.c!
We can help you with your concerns.
Reliability of This Article: By Our Founder, CEO & CTO Hiroyuki Chishiro
He has been involved in research on real-time systems for 12 years.
He teaches OS (Linux kernel) in English at the University of Tokyo.
From September 2012 to August 2013, he was a visiting researcher at the Department of Computer Science, the University of North Carolina at Chapel Hill (UNC), Chapel Hill, North Carolina, United States. He has been involved in research and development of real-time Linux in the C language.
He has more than 15 years of experience with programming languages, including C/C++, Python, Solidity/Vyper, Java, Ruby, Go, Rust, D, HTML/CSS/JS/PHP, MATLAB, Verse (UEFN), and Assembly (x64, ARM).
While a faculty member at the University of Tokyo, he developed an "Extension of the LLVM Compiler" in C++ and his own real-time OS, "Mcube Kernel," in C, both of which he published as open source on GitHub.
Since January 2020, he has been CTO of Guarantee Happiness LLC, Chapel Hill, North Carolina, United States, in charge of e-commerce site development and web/social network marketing. Since June 2022, he has also been CEO & CTO of Japanese Tar Heel, Inc. in Chapel Hill, North Carolina, United States.
We have been engaged in disseminating useful information on AI and Crypto (Web3).
We have written more than 20 articles on AI, including AI chatbots such as ChatGPT, Auto-GPT, and Gemini (formerly Bard). He has experience in contract work as a prompt engineer, manager, and quality assurance (QA) specialist for several companies in San Francisco, United States (Silicon Valley in the broadest sense).
We have written more than 40 articles on cryptocurrency (including smart contract programming). He has experience as an outsourced translator of English articles on cryptocurrency into Japanese for a company in London, England.
llm.c is a framework for training Large Language Models (LLMs) in pure C and CUDA, without PyTorch or CPython.
By using llm.c, you can train LLMs about 7% faster than with PyTorch.
Since the "llm" in "llm.c" stands for Large Language Model and ".c" is the extension for C source files, the framework's name, styled like a C source filename, is a good fit!
As of July 2024, llm.c focuses on pretraining (in particular, reproducing GPT-2 and GPT-3), along with a parallel PyTorch reference implementation in train_gpt2.py.
In the future, llm.c may be used to train state-of-the-art models like those behind ChatGPT.
If you want to know how to get started with ChatGPT and how to use it, please click the following.