- Awesome GPT - GitHub
A curated list of awesome projects and resources related to GPT, ChatGPT, OpenAI, LLM, and more.
- GitHub - openai/gpt-oss: gpt-oss-120b and gpt-oss-20b are two open …
Download gpt-oss-120b and gpt-oss-20b on Hugging Face. Welcome to the gpt-oss series, OpenAI's open-weight models designed for powerful reasoning, agentic tasks, and versatile developer use cases. We're releasing two flavors of these open models: gpt-oss-120b, for production, general-purpose, high-reasoning use cases that fit into a single H100 GPU.
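  As a minimal sketch of trying the smaller model locally, assuming the Hugging Face `transformers` text-generation pipeline and the `openai/gpt-oss-20b` model ID from the release page (verify the ID and hardware requirements before running):

  ```python
  # Sketch: run gpt-oss-20b through the transformers text-generation pipeline.
  # The model ID "openai/gpt-oss-20b" is assumed from the release announcement.
  from transformers import pipeline

  generator = pipeline(
      "text-generation",
      model="openai/gpt-oss-20b",  # assumed Hugging Face model ID
      torch_dtype="auto",          # let transformers pick the weight dtype
      device_map="auto",           # spread weights across available devices
  )

  prompt = "Explain in one sentence what an open-weight model is."
  print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
  ```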
- GPT-3: Language Models are Few-Shot Learners - GitHub
Specifically, we train GPT-3, an autoregressive language model with 175 billion parameters, 10x more than any previous non-sparse language model, and test its performance in the few-shot setting.
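  "Few-shot" here means the model is conditioned on a handful of in-context demonstrations rather than being fine-tuned. A plain-Python sketch of how such a prompt is assembled (the task and demonstrations are illustrative, not taken from the paper's evaluation sets):

  ```python
  # Sketch of few-shot prompting: the "learning" happens entirely in the prompt.
  demonstrations = [
      ("cheese", "fromage"),
      ("house", "maison"),
      ("dog", "chien"),
  ]
  query = "book"

  prompt = "Translate English to French.\n\n"
  for english, french in demonstrations:
      prompt += f"{english} => {french}\n"
  prompt += f"{query} =>"  # the model is expected to continue with "livre"

  print(prompt)
  ```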
- GitHub - openai/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners"
gpt-2: code and models from the paper "Language Models are Unsupervised Multitask Learners". You can read about GPT-2 and its staged release in our original blog post, 6-month follow-up post, and final post. We have also released a dataset for researchers to study the models' behaviors.
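  The released checkpoints are also mirrored on the Hugging Face hub, so a quick way to sample from GPT-2 without cloning the repo is via `transformers` (a sketch; the openai/gpt-2 repo itself ships its own TensorFlow sampling scripts):

  ```python
  # Sketch: sample from the smallest released GPT-2 checkpoint via transformers.
  from transformers import GPT2LMHeadModel, GPT2TokenizerFast

  tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
  model = GPT2LMHeadModel.from_pretrained("gpt2")

  inputs = tokenizer(
      "Language models are unsupervised multitask learners because",
      return_tensors="pt",
  )
  outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_k=40)
  print(tokenizer.decode(outputs[0], skip_special_tokens=True))
  ```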
- gpt-engineer - GitHub
gpt-engineer installs the binary 'bench', which gives you a simple interface for benchmarking your own agent implementations against popular public datasets. The easiest way to get started with benchmarking is by checking out the template repo, which contains detailed instructions and an agent template.
- GPT-API-free / DeepSeek-API-free - GitHub
Free API key: the gpt-5 series models offered here have relatively weak reasoning ability; if you need stronger reasoning, you can purchase the paid API. The free API key may only be used for personal non-commercial purposes, education, and non-profit scientific research.
- graphcore/gpt-j: Notebook for running GPT-J / GPT-J-6B - GitHub
GPT-J is an open-source alternative from EleutherAI to OpenAI's GPT-3. Available for anyone to download, GPT-J can be successfully fine-tuned to perform just as well as large models on a range of NLP tasks, including question answering, sentiment analysis, and named entity recognition. Try running …
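  A minimal sketch of loading GPT-J-6B for inference with `transformers` (the `EleutherAI/gpt-j-6B` model ID is the one EleutherAI published; note the graphcore notebook targets IPUs, whereas this runs on plain PyTorch and the fp16 weights alone need on the order of 12 GB of memory):

  ```python
  # Sketch: load GPT-J-6B in half precision and generate a short answer.
  import torch
  from transformers import AutoModelForCausalLM, AutoTokenizer

  tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
  model = AutoModelForCausalLM.from_pretrained(
      "EleutherAI/gpt-j-6B",
      torch_dtype=torch.float16,  # halve memory use versus fp32
      device_map="auto",
  )

  inputs = tokenizer("Question: Who wrote Hamlet?\nAnswer:", return_tensors="pt").to(model.device)
  outputs = model.generate(**inputs, max_new_tokens=20)
  print(tokenizer.decode(outputs[0], skip_special_tokens=True))
  ```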
- GitHub - karpathy/minGPT: A minimal PyTorch re-implementation of the …
A PyTorch re-implementation of GPT, both training and inference. minGPT tries to be small, clean, interpretable, and educational, as most of the currently available GPT model implementations can be a bit sprawling.
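  To illustrate what "minimal" means here, below is a compact causal self-attention block in the same spirit (a from-scratch sketch; minGPT's actual classes and naming differ):

  ```python
  # From-scratch sketch of the causal self-attention block at the heart of a
  # minimal GPT; minGPT's own implementation differs in details and naming.
  import math
  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class CausalSelfAttention(nn.Module):
      def __init__(self, n_embd: int, n_head: int, block_size: int):
          super().__init__()
          assert n_embd % n_head == 0
          self.n_head = n_head
          self.qkv = nn.Linear(n_embd, 3 * n_embd)  # joint query/key/value projection
          self.proj = nn.Linear(n_embd, n_embd)     # output projection
          # causal mask: a position may only attend to itself and earlier positions
          mask = torch.tril(torch.ones(block_size, block_size))
          self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

      def forward(self, x):
          B, T, C = x.shape
          q, k, v = self.qkv(x).split(C, dim=2)
          # reshape to (batch, heads, time, head_dim)
          q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
          k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
          v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
          att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
          att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
          att = F.softmax(att, dim=-1)
          y = att @ v                               # (B, heads, T, head_dim)
          y = y.transpose(1, 2).contiguous().view(B, T, C)
          return self.proj(y)

  # e.g. CausalSelfAttention(n_embd=128, n_head=4, block_size=64)(torch.randn(2, 64, 128))
  ```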