Latest commit: update with +2 epochs ft (7c2dda0)

File                 Size       Last commit message
-                    1.34 kB    update with +2 epochs ft
-                    13 Bytes   update ckpt with 6ish epochs of training with 1024 TOKENS as max output
-                    2.75 kB    remove inference
-                    1.14 kB    update with +2 epochs ft
-                    14 Bytes   update with +2 epochs ft
-                    3.17 kB    update ckpt with 6ish epochs of training with 1024 TOKENS as max output
-                    3.16 kB    Upload longt5-tglobal-large-pubmed-3k-booksum-8192-V4-ft2-booksum_training_metadata.json
-                    3.13 GB    update with +2 epochs ft
rng_state_0.pth      14.5 kB    update with +2 epochs ft
-                    2.2 kB     load model from drive and convert
-                    2.42 MB    load model from drive and convert
-                    2.37 kB    update with +2 epochs ft
-                    7.71 kB    update with +2 epochs ft
training_args.bin    4.53 kB    update with +2 epochs ft
-                    18.9 kB    load model from drive and convert

rng_state_0.pth — Detected Pickle imports (7):
- "numpy.ndarray"
- "numpy.core.multiarray._reconstruct"
- "torch.ByteStorage"
- "_codecs.encode"
- "numpy.dtype"
- "collections.OrderedDict"
- "torch._utils._rebuild_tensor_v2"

training_args.bin — Detected Pickle imports (8):
- "transformers.trainer_utils.IntervalStrategy"
- "transformers.trainer_utils.HubStrategy"
- "torch.float16"
- "transformers.trainer_utils.SchedulerType"
- "transformers.deepspeed.HfTrainerDeepSpeedConfig"
- "transformers.training_args.OptimizerNames"
- "transformers.training_args_seq2seq.Seq2SeqTrainingArguments"
- "torch.device"
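The "Detected Pickle imports" lists above come from scanning the pickle opcode stream without executing it, so untrusted checkpoints can be inspected before loading. Below is a minimal sketch of that idea using the standard library's `pickletools`. It is simplified (it tracks recently pushed string constants to resolve `STACK_GLOBAL` and would miss names fetched from the pickle memo), so treat it as illustrative rather than a security tool:

```python
import collections
import pickle
import pickletools

def scan_pickle_imports(data: bytes) -> set:
    """Collect the module.name globals a pickle stream would import,
    without actually unpickling (and thus executing) anything."""
    imports = set()
    strings = []  # string constants seen so far; STACK_GLOBAL reads the last two
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # protocols <= 3: module and name arrive as one space-separated arg
            module, name = arg.split(" ", 1)
            imports.add("%s.%s" % (module, name))
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # protocols >= 4: module and name were pushed as string constants
            imports.add("%s.%s" % (strings[-2], strings[-1]))
        elif opcode.name in ("UNICODE", "BINUNICODE", "SHORT_BINUNICODE"):
            strings.append(arg)
    return imports

data = pickle.dumps(collections.OrderedDict(a=1))
print(sorted(scan_pickle_imports(data)))
```

For `.pth`/`.bin` files like those listed, the same scan can be run on the raw bytes before deciding whether to load them.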