  1. gpt-j-6b/LICENSE at main · mallorbc/gpt-j-6b · GitHub

    If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, …

  2. EleutherAI/gpt-j-6b · Hugging Face

    GPT-J-6B was trained on an English-language only dataset, and is thus not suitable for translation or generating text in other languages. GPT-J-6B has not been fine-tuned for downstream contexts in …

  3. GPT-J - Wikipedia

    GPT-J or GPT-J-6B is an open-source large language model (LLM) developed by EleutherAI in 2021. As the name suggests, it is a generative pre-trained transformer model designed to produce human-like …

  4. GPT J 6B By EleutherAI: Benchmarks, Features and Detailed Analysis ...

    GPT-J 6B is an open-source language model by EleutherAI. Features: 6B LLM, VRAM: 24.2 GB, License: apache-2.0, HF Score: 40.1, LLM Explorer Score: 0.15, Arc: 41.4, HellaSwag: 67.5, MMLU: …

  5. GPTJ-6B Slim Autoregressive Language Model - Kaggle

    The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) were applied to 64 dimensions of each head. The model is trained with a tokenization …
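
    The snippet above describes GPT-J's partial rotary embedding: of each 256-dim head, only the first 64 dimensions are rotated and the rest pass through unchanged. A minimal NumPy sketch of that idea follows; it uses the simplified "rotate-half" pairing rather than GPT-J's exact interleaved layout, and all function and variable names are illustrative assumptions, not the reference implementation.

    ```python
    import numpy as np

    def partial_rope(x, rotary_dims=64, base=10000.0):
        """Apply RoPE to the first `rotary_dims` dims of each head.

        x: array of shape (seq_len, n_heads, head_dim).
        The remaining head_dim - rotary_dims dims pass through untouched.
        """
        seq_len, n_heads, head_dim = x.shape
        rot, rest = x[..., :rotary_dims], x[..., rotary_dims:]
        half = rotary_dims // 2

        # One frequency per rotated pair of dimensions.
        inv_freq = 1.0 / (base ** (np.arange(0, rotary_dims, 2) / rotary_dims))
        angles = np.outer(np.arange(seq_len), inv_freq)        # (seq_len, half)
        cos = np.cos(angles)[:, None, :]                       # broadcast over heads
        sin = np.sin(angles)[:, None, :]

        # Rotate-half variant: pair dim i with dim i + half.
        x1, x2 = rot[..., :half], rot[..., half:]
        rotated = np.concatenate([x1 * cos - x2 * sin,
                                  x1 * sin + x2 * cos], axis=-1)
        return np.concatenate([rotated, rest], axis=-1)

    # GPT-J-style shapes: 16 heads of dim 256, RoPE on 64 dims of each head.
    q = np.random.randn(8, 16, 256)
    q_rot = partial_rope(q)
    assert q_rot.shape == q.shape
    # Position 0 has angle 0, so it is left unchanged by the rotation.
    assert np.allclose(q_rot[0], q[0])
    # The non-rotary tail of every head passes through unchanged.
    assert np.allclose(q_rot[..., 64:], q[..., 64:])
    ```

    The rotation preserves vector norms per pair, so attention scores depend only on relative position within the rotated dimensions, which is the point of RoPE.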

  6. GPT-J 6B Benchmark Results, Specs & Pricing | DataLearnerAI

    Jun 4, 2021 · Explore GPT-J 6B, including model size, context length, benchmark scores, API pricing, and licensing details. Published by EleutherAI.

  7. Software:GPT-J - HandWiki

    GPT-J or GPT-J-6B is an open-source large language model (LLM) developed by EleutherAI in 2021. As the name suggests, it is a generative pre-trained transformer model designed to produce human-like …

  8. gpt-j/LICENSE at main · graphcore/gpt-j · GitHub

    A short and simple permissive license with conditions only requiring preservation of copyright and license notices. Licensed works, modifications, and larger works may be distributed under different …

  9. togethercomputer/GPT-JT-6B-v1 · Hugging Face

    With a new decentralized training algorithm, we fine-tuned GPT-J (6B) on 3.53 billion tokens, resulting in GPT-JT (6B), a model that outperforms many 100B+ parameter models on classification …

  10. GitHub - janghoseong/gpt-j: gpt-j

    License: The weights of GPT-J-6B are licensed under version 2.0 of the Apache License.