Latest commit: Update README.md (36cef0e)

| File | Size | Last commit message |
| --- | --- | --- |
| - | 1.52 kB | initial commit |
| - | 4.39 kB | Update README.md |
| - | 435 Bytes | MentaLLaMA-13B trained with LoRA |
| - | 102 MB | MentaLLaMA-13B trained with LoRA |
| rng_state_0.pth | 17.7 kB | MentaLLaMA-13B trained with LoRA |
| rng_state_1.pth | 17.7 kB | MentaLLaMA-13B trained with LoRA |
| rng_state_2.pth | 17.7 kB | MentaLLaMA-13B trained with LoRA |
| rng_state_3.pth | 17.7 kB | MentaLLaMA-13B trained with LoRA |

The pickle scanner flags each of rng_state_0.pth through rng_state_3.pth with the same seven imports:

- "numpy.core.multiarray._reconstruct"
- "numpy.ndarray"
- "numpy.dtype"
- "collections.OrderedDict"
- "torch.ByteStorage"
- "_codecs.encode"
- "torch._utils._rebuild_tensor_v2"
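
The rng_state_*.pth files are the per-rank random-number-generator checkpoints the Hugging Face Trainer saves next to a checkpoint (one per process, so four files suggests a 4-GPU run). The flagged imports are just what pickle needs to rebuild numpy arrays and torch tensors, so the files can be inspected without enabling full, arbitrary-code pickle. A minimal sketch, assuming PyTorch >= 2.4 (for torch.serialization.add_safe_globals) and the Trainer's usual dict layout; the key names are an assumption, not verified against this checkpoint:

```python
import numpy as np
import torch
from torch.serialization import add_safe_globals

# Recent PyTorch releases removed the numpy helpers from the default
# weights_only allowlist, so the three flagged numpy globals must be
# allowed explicitly; the remaining four imports in the scan are part
# of the restricted unpickler's default allowlist.
add_safe_globals([np.core.multiarray._reconstruct, np.ndarray, np.dtype])

# weights_only=True swaps full pickle for a restricted unpickler, so
# nothing in the file can execute arbitrary code while loading.
state = torch.load("rng_state_0.pth", map_location="cpu", weights_only=True)

# Typically a dict with one entry per generator the Trainer seeds,
# e.g. dict_keys(['python', 'numpy', 'cpu', 'cuda']) -- an assumption.
print(state.keys())

# Restoring the CPU generator reproduces the data-loading order.
torch.random.set_rng_state(state["cpu"])
```

Converting tensors to safetensors would silence the scanner, but RNG state includes plain Python objects (e.g. the `random` module's state tuple) that safetensors cannot represent, so these small files are typically left as pickles.
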
| File | Size | Last commit message |
| --- | --- | --- |
| - | 435 Bytes | MentaLLaMA-13B trained with LoRA |
| - | 500 kB | MentaLLaMA-13B trained with LoRA |
| - | 824 Bytes | MentaLLaMA-13B trained with LoRA |
| - | 479 kB | MentaLLaMA-13B trained with LoRA |
| training_args.bin | 5.12 kB | MentaLLaMA-13B trained with LoRA |
| - | 24.2 kB | MentaLLaMA-13B trained with LoRA |

The pickle scanner flags training_args.bin with eleven imports:

- "__main__.TrainingArguments"
- "accelerate.utils.dataclasses.DistributedType"
- "transformers.training_args.OptimizerNames"
- "transformers.trainer_utils.IntervalStrategy"
- "accelerate.utils.dataclasses.DeepSpeedPlugin"
- "torch.float32"
- "transformers.deepspeed.HfTrainerDeepSpeedConfig"
- "transformers.trainer_utils.SchedulerType"
- "accelerate.state.PartialState"
- "torch.device"
- "transformers.trainer_utils.HubStrategy"
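
training_args.bin is the pickled TrainingArguments object the Trainer stores for reproducibility. Its import list shows why it cannot pass a restricted load: it references a TrainingArguments class defined in the training script's __main__, plus transformers and accelerate internals (a DeepSpeed plugin, scheduler/optimizer enums, a torch.device). Unpickling it therefore requires full pickle, compatible transformers/accelerate versions, and trust in the file's origin. A hedged sketch, not a guaranteed recipe:

```python
import torch
from transformers import TrainingArguments

# The scan shows the pickled class lives in the training script's
# __main__ namespace. Re-pointing __main__.TrainingArguments at the
# library class lets pickle resolve it -- an assumption that the
# script's subclass added no extra required fields.
import __main__
__main__.TrainingArguments = TrainingArguments

# weights_only=False runs the full pickle machinery, which can execute
# arbitrary code from the file: only load files whose source you trust.
args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.num_train_epochs, args.output_dir)
```

Note that the scan references transformers.deepspeed, which later transformers releases moved to transformers.integrations.deepspeed, so a transformers version contemporary with the checkpoint may be needed for the import to resolve.
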