- GitHub - openai/gpt-oss: gpt-oss-120b and gpt-oss-20b are two open …
Download gpt-oss-120b and gpt-oss-20b on Hugging Face. Welcome to the gpt-oss series, OpenAI's open-weight models designed for powerful reasoning, agentic tasks, and versatile developer use cases. Two flavors of these open models are being released: gpt-oss-120b, for production, general-purpose, high-reasoning use cases that fit into a single … (a minimal loading sketch for the smaller model follows below).
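Because the weights are published on Hugging Face, a quick way to try the smaller model locally is the `transformers` text-generation pipeline. The sketch below is an assumption-laden starting point: it presumes a recent `transformers` release with gpt-oss support and that the checkpoint is hosted under the `openai/gpt-oss-20b` model id; adjust precision and device settings to your hardware.

```python
# Hedged sketch: trying gpt-oss-20b through the transformers text-generation pipeline.
# Assumes a transformers release with gpt-oss support and that the checkpoint is
# published under the "openai/gpt-oss-20b" model id (an assumption, not confirmed here).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed Hugging Face model id
    torch_dtype="auto",          # let transformers pick a suitable precision
    device_map="auto",           # requires `accelerate`; spreads weights across devices
)

messages = [
    {"role": "user", "content": "Explain open-weight models in one sentence."},
]
output = generator(messages, max_new_tokens=128)
print(output[0]["generated_text"][-1])  # last message holds the model's reply
```

The 120b variant would follow the same pattern but needs correspondingly more accelerator memory.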
- Awesome GPT - GitHub
Awesome GPT: a curated list of awesome projects and resources related to GPT, ChatGPT, OpenAI, LLMs, and more.
- GPT-3: Language Models are Few-Shot Learners - GitHub
GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic (a few-shot prompt sketch follows below).
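"Few-shot" here means the model is conditioned on a handful of in-context demonstrations rather than fine-tuned. As a purely illustrative sketch (no API call; the demonstrations and formatting are made up for this example), a 3-digit arithmetic prompt could be assembled like this:

```python
# Illustrative sketch of a few-shot prompt for 3-digit arithmetic.
# The demonstrations and Q/A formatting are hypothetical; the model is simply
# conditioned on the worked examples and asked to complete the final line.
demonstrations = [
    ("487 + 215", "702"),
    ("913 - 378", "535"),
    ("604 + 129", "733"),
]

def build_few_shot_prompt(query: str) -> str:
    """Concatenate worked examples followed by the unanswered query."""
    lines = [f"Q: {q}\nA: {a}" for q, a in demonstrations]
    lines.append(f"Q: {query}\nA:")
    return "\n\n".join(lines)

print(build_few_shot_prompt("356 + 247"))
# The model's continuation of the trailing "A:" is taken as its answer (here: 603).
```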
- GitHub - openai gpt-2: Code for the paper Language Models are . . .
The dataset our GPT-2 models were trained on contains many texts with biases and factual inaccuracies, and thus GPT-2 models are likely to be biased and inaccurate as well. To avoid having samples mistaken as human-written, we recommend clearly labeling samples as synthetic before wide dissemination.
- GPT-API-free / DeepSeek-API-free - GitHub
Free API key: the reasoning ability of the gpt-5 series models is relatively weak; if you need stronger reasoning, you can purchase the paid API. The free API key may only be used for personal non-commercial purposes, education, and non-profit scientific research (a hedged client sketch follows below).
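Projects of this kind typically expose an OpenAI-compatible endpoint, so the free key is used with the standard `openai` Python client pointed at the project's own base URL. That pattern is an assumption here; the base URL and model name below are placeholders, not values taken from this listing.

```python
# Hedged sketch: using a free API key against an OpenAI-compatible endpoint.
# Both the base URL and the model name are hypothetical placeholders;
# substitute the values documented by the project you actually use.
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-free-key",                           # placeholder key
    base_url="https://example-free-endpoint.invalid/v1",  # hypothetical endpoint
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; check the project's supported list
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)
```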
- gpt-engineer - GitHub
gpt-engineer installs the binary 'bench', which gives you a simple interface for benchmarking your own agent implementations against popular public datasets. The easiest way to get started with benchmarking is by checking out the template repo, which contains detailed instructions and an agent template.
- GitHub - gpt4o-image/GPT-4o: the new GPT-4o image generation is remarkably powerful (with a usage guide for mainland China) [latest update: September]
The new GPT-4o image generation feature is remarkably powerful, with a usage guide for users in mainland China (latest update: September).
- GitHub - karpathy minGPT: A minimal PyTorch re-implementation of the . . .
A PyTorch re-implementation of GPT, covering both training and inference. minGPT tries to be small, clean, interpretable, and educational, as most of the currently available GPT model implementations can be a bit sprawling (a minimal usage sketch follows below).
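A minimal sketch of instantiating a GPT-2-sized model with minGPT is shown below. It assumes the repository has been cloned so its `mingpt` package is importable and that the config interface follows the project's documented `GPT.get_default_config()` pattern; treat the exact field names as assumptions rather than guarantees.

```python
# Hedged sketch: instantiating a GPT-2-sized model with minGPT.
# Assumes karpathy/minGPT is cloned and its `mingpt` package is on PYTHONPATH;
# config field names follow the project's documented pattern (an assumption here).
import torch
from mingpt.model import GPT

config = GPT.get_default_config()
config.model_type = "gpt2"   # select the GPT-2 (124M) size preset
config.vocab_size = 50257    # GPT-2 BPE vocabulary size
config.block_size = 1024     # maximum context length

model = GPT(config)

# Dummy forward pass: logits over the vocabulary for each input position.
tokens = torch.randint(0, config.vocab_size, (1, 16))  # batch of 1, 16 random token ids
logits, loss = model(tokens)  # loss is None when no targets are given
print(logits.shape)  # expected: torch.Size([1, 16, 50257])
```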