
Model Card

This repository contains checkpoints (split across 512 GPUs) in DeepSpeed format for the Lucie-7B model, which was trained using this code repository, based on a fork of Megatron-DeepSpeed.

Each checkpoint is in a sub-branch (revision) whose name specifies the number of training steps. For instance, step0400000 corresponds to the checkpoint after 400,000 training steps.

These checkpoints are provided so that training of the model can be resumed from a given point.

Contact

[email protected]

