
I want to convert my final checkpoint into a .pt file

#8
by edgarkim - opened

I am performing post-training on Cosmos-1.0-Diffusion-7B-Text2World following the guide below:

Post-Training Guide

After completing the post-training, I see that the checkpoint output contains `context` and `weights` folders.

The `weights` folder holds many per-parameter state and weight entries.
Now I want to consolidate this final checkpoint into a single .pt file.

Can you guide me on how to do this?

-files- (contents of the weights folder)

common.pt
metadata.json
module.affline_norm.weight
module.decoder.layers.adaLN.adaLN_modulation.1.weight
module.decoder.layers.adaLN.adaLN_modulation.2.weight
module.decoder.layers.cross_attention.core_attention._extra_state
module.decoder.layers.cross_attention.k_layernorm.weight
module.decoder.layers.cross_attention.linear_kv._extra_state
module.decoder.layers.cross_attention.linear_kv.weight
module.decoder.layers.cross_attention.linear_proj._extra_state
module.decoder.layers.cross_attention.linear_proj.weight
module.decoder.layers.cross_attention.linear_q._extra_state
module.decoder.layers.cross_attention.linear_q.weight
module.decoder.layers.cross_attention.q_layernorm.weight
module.decoder.layers.full_self_attention.core_attention._extra_state
module.decoder.layers.full_self_attention.k_layernorm.weight
module.decoder.layers.full_self_attention.linear_proj._extra_state
module.decoder.layers.full_self_attention.linear_proj.weight
module.decoder.layers.full_self_attention.linear_qkv._extra_state
module.decoder.layers.full_self_attention.linear_qkv.weight
module.decoder.layers.full_self_attention.q_layernorm.weight
module.decoder.layers.mlp.linear_fc1._extra_state
module.decoder.layers.mlp.linear_fc1.weight
module.decoder.layers.mlp.linear_fc2._extra_state
module.decoder.layers.mlp.linear_fc2.weight
module.extra_pos_embedder.pos_emb_h
module.extra_pos_embedder.pos_emb_t
module.extra_pos_embedder.pos_emb_w
module.final_layer.adaLN_modulation.1.weight
module.final_layer.adaLN_modulation.2.weight
module.final_layer.linear.weight
module.logvar.0.freqs
module.logvar.0.phases
module.logvar.1.weight
module.pos_embedder.seq
module.t_embedder.1.linear_1.weight
module.t_embedder.1.linear_2.weight
module.x_embedder.proj.1.weight
