End of training
- README.md +56 -0
- wandb/debug-internal.log +112 -0
- wandb/debug.log +6 -0
- wandb/run-20231221_074619-o0md2mn0/logs/debug-internal.log +112 -0
- wandb/run-20231221_074619-o0md2mn0/logs/debug.log +6 -0
README.md
ADDED
@@ -0,0 +1,56 @@
+---
+library_name: peft
+tags:
+- generated_from_trainer
+base_model: ybelkada/falcon-7b-sharded-bf16
+model-index:
+- name: falcon-7b-sharded-bf16-finetuned-mental-health-conversational
+  results: []
+---
+
+<!-- This model card has been generated automatically according to the information the Trainer had access to. You
+should probably proofread and complete it, then remove this comment. -->
+
+# falcon-7b-sharded-bf16-finetuned-mental-health-conversational
+
+This model is a fine-tuned version of [ybelkada/falcon-7b-sharded-bf16](https://huggingface.co/ybelkada/falcon-7b-sharded-bf16) on an unknown dataset.
+
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 0.0002
+- train_batch_size: 2
+- eval_batch_size: 8
+- seed: 42
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 4
+- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_ratio: 0.03
+- training_steps: 320
+
+### Training results
+
+
+
+### Framework versions
+
+- PEFT 0.7.2.dev0
+- Transformers 4.36.2
+- Pytorch 2.1.0+cu121
+- Datasets 2.15.0
+- Tokenizers 0.15.0
wandb/debug-internal.log
CHANGED
@@ -2468,3 +2468,115 @@
2023-12-21 09:14:51,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
2023-12-21 09:14:51,396 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
2023-12-21 09:14:52,994 DEBUG SenderThread:20887 [sender.py:send():382] send: stats
+2023-12-21 09:14:53,995 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:14:55,172 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: pause
+2023-12-21 09:14:55,174 INFO HandlerThread:20887 [handler.py:handle_request_pause():705] stopping system metrics thread
+2023-12-21 09:14:55,174 INFO HandlerThread:20887 [system_monitor.py:finish():203] Stopping system monitor
+2023-12-21 09:14:55,174 DEBUG SystemMonitor:20887 [system_monitor.py:_start():179] Finished system metrics aggregation loop
+2023-12-21 09:14:55,175 DEBUG SystemMonitor:20887 [system_monitor.py:_start():183] Publishing last batch of metrics
+2023-12-21 09:14:55,174 INFO HandlerThread:20887 [interfaces.py:finish():202] Joined cpu monitor
+2023-12-21 09:14:55,176 INFO HandlerThread:20887 [interfaces.py:finish():202] Joined disk monitor
+2023-12-21 09:14:55,195 INFO HandlerThread:20887 [interfaces.py:finish():202] Joined gpu monitor
+2023-12-21 09:14:55,196 INFO HandlerThread:20887 [interfaces.py:finish():202] Joined memory monitor
+2023-12-21 09:14:55,197 INFO HandlerThread:20887 [interfaces.py:finish():202] Joined network monitor
+2023-12-21 09:14:55,197 DEBUG SenderThread:20887 [sender.py:send():382] send: stats
+2023-12-21 09:14:55,648 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: resume
+2023-12-21 09:14:55,649 INFO HandlerThread:20887 [handler.py:handle_request_resume():696] starting system metrics thread
+2023-12-21 09:14:55,649 INFO HandlerThread:20887 [system_monitor.py:start():194] Starting system monitor
+2023-12-21 09:14:55,649 INFO SystemMonitor:20887 [system_monitor.py:_start():158] Starting system asset monitoring threads
+2023-12-21 09:14:55,652 INFO SystemMonitor:20887 [interfaces.py:start():190] Started cpu monitoring
+2023-12-21 09:14:55,655 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: pause
+2023-12-21 09:14:55,656 INFO HandlerThread:20887 [handler.py:handle_request_pause():705] stopping system metrics thread
+2023-12-21 09:14:55,656 INFO HandlerThread:20887 [system_monitor.py:finish():203] Stopping system monitor
+2023-12-21 09:14:55,656 INFO HandlerThread:20887 [interfaces.py:finish():202] Joined cpu monitor
+2023-12-21 09:14:55,656 WARNING HandlerThread:20887 [interfaces.py:finish():207] Failed to finish disk monitoring: cannot join thread before it is started
+2023-12-21 09:14:55,663 INFO SystemMonitor:20887 [interfaces.py:start():190] Started disk monitoring
+2023-12-21 09:14:55,663 DEBUG SystemMonitor:20887 [system_monitor.py:_start():172] Starting system metrics aggregation loop
+2023-12-21 09:14:55,663 DEBUG SystemMonitor:20887 [system_monitor.py:_start():179] Finished system metrics aggregation loop
+2023-12-21 09:14:55,663 DEBUG SystemMonitor:20887 [system_monitor.py:_start():183] Publishing last batch of metrics
+2023-12-21 09:14:55,665 DEBUG SenderThread:20887 [sender.py:send():382] send: stats
+2023-12-21 09:14:59,666 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:04,667 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:06,394 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:15:06,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:15:06,395 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:15:10,593 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:15,594 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:20,595 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:21,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:15:21,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:15:21,395 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:15:26,588 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:31,589 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:36,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:15:36,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:15:36,396 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:15:36,686 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:41,687 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:46,688 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:51,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:15:51,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:15:51,396 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:15:52,682 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:15:57,683 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:02,684 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:06,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:16:06,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:16:06,396 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:16:08,637 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:13,637 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:18,638 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:21,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:16:21,397 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:16:21,397 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:16:23,728 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:28,729 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:33,730 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:36,395 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:16:36,396 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:16:36,437 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:16:39,642 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:44,643 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:49,644 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:16:51,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:16:51,396 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:16:51,436 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:16:55,621 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:00,622 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:05,623 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:06,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:17:06,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:17:06,397 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:17:10,640 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:15,641 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:20,642 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:21,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:17:21,397 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:17:21,437 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:17:26,633 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:31,633 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:36,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:17:36,397 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:17:36,398 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:17:36,663 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:41,664 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:46,667 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:51,396 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:17:51,397 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:17:51,398 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:17:52,621 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:17:57,622 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:18:02,624 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
+2023-12-21 09:18:04,209 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: resume
+2023-12-21 09:18:04,212 INFO HandlerThread:20887 [handler.py:handle_request_resume():696] starting system metrics thread
+2023-12-21 09:18:04,212 INFO HandlerThread:20887 [system_monitor.py:start():194] Starting system monitor
+2023-12-21 09:18:04,213 INFO SystemMonitor:20887 [system_monitor.py:_start():158] Starting system asset monitoring threads
+2023-12-21 09:18:04,226 INFO SystemMonitor:20887 [interfaces.py:start():190] Started cpu monitoring
+2023-12-21 09:18:04,245 INFO SystemMonitor:20887 [interfaces.py:start():190] Started disk monitoring
+2023-12-21 09:18:04,265 INFO SystemMonitor:20887 [interfaces.py:start():190] Started gpu monitoring
+2023-12-21 09:18:04,271 INFO SystemMonitor:20887 [interfaces.py:start():190] Started memory monitoring
+2023-12-21 09:18:04,298 INFO SystemMonitor:20887 [interfaces.py:start():190] Started network monitoring
+2023-12-21 09:18:06,432 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: internal_messages
+2023-12-21 09:18:06,435 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: stop_status
+2023-12-21 09:18:06,436 DEBUG SenderThread:20887 [sender.py:send_request():409] send_request: stop_status
+2023-12-21 09:18:07,709 DEBUG HandlerThread:20887 [handler.py:handle_request():146] handle_request: status_report
wandb/debug.log
CHANGED
@@ -28,3 +28,9 @@ config: {}
2023-12-21 07:46:22,886 INFO MainThread:562 [wandb_run.py:_redirect():2178] Redirects installed.
2023-12-21 07:46:22,888 INFO MainThread:562 [wandb_init.py:init():841] run started, returning control to user process
2023-12-21 07:46:22,895 INFO MainThread:562 [wandb_run.py:_config_callback():1342] config_cb None None {'vocab_size': 65024, 'hidden_size': 4544, 'num_hidden_layers': 32, 'num_attention_heads': 71, 'layer_norm_epsilon': 1e-05, 'initializer_range': 0.02, 'use_cache': False, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'bos_token_id': 11, 'eos_token_id': 11, 'num_kv_heads': 71, 'alibi': False, 'new_decoder_architecture': False, 'multi_query': True, 'parallel_attn': True, 'bias': False, 'max_position_embeddings': 2048, 'rope_theta': 10000.0, 'rope_scaling': None, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'bfloat16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['FalconForCausalLM'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'pad_token_id': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'ybelkada/falcon-7b-sharded-bf16', 'transformers_version': '4.36.2', 'apply_residual_connection_post_layernorm': False, 'model_type': 'falcon', 'n_head': 71, 'n_layer': 32, 'quantization_config': {'quant_method': 'QuantizationMethod.BITS_AND_BYTES', 'load_in_8bit': False, 'load_in_4bit': True, 'llm_int8_threshold': 6.0, 'llm_int8_skip_modules': None, 'llm_int8_enable_fp32_cpu_offload': False, 'llm_int8_has_fp16_weight': False, 'bnb_4bit_quant_type': 'nf4', 'bnb_4bit_use_double_quant': True, 'bnb_4bit_compute_dtype': 'bfloat16'}, 'output_dir': '/content/gdrive/MyDrive/LLM/falcon-7b-sharded-bf16-finetuned-mental-health-conversational', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 2, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 2, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0002, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 0.3, 'num_train_epochs': 3.0, 'max_steps': 320, 'lr_scheduler_type': 'cosine', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.03, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/content/gdrive/MyDrive/LLM/falcon-7b-sharded-bf16-finetuned-mental-health-conversational/runs/Dec21_07-31-46_0dba6b8fc499', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 
'save_strategy': 'steps', 'save_steps': 10, 'save_total_limit': None, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': False, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': None, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': '/content/gdrive/MyDrive/LLM/falcon-7b-sharded-bf16-finetuned-mental-health-conversational', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'paged_adamw_32bit', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': False, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': False, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None}
+2023-12-21 09:14:55,171 INFO MainThread:562 [jupyter.py:save_ipynb():373] not saving jupyter notebook
+2023-12-21 09:14:55,172 INFO MainThread:562 [wandb_init.py:_pause_backend():435] pausing backend
+2023-12-21 09:14:55,641 INFO MainThread:562 [wandb_init.py:_resume_backend():440] resuming backend
+2023-12-21 09:14:55,653 INFO MainThread:562 [jupyter.py:save_ipynb():373] not saving jupyter notebook
+2023-12-21 09:14:55,654 INFO MainThread:562 [wandb_init.py:_pause_backend():435] pausing backend
+2023-12-21 09:18:04,205 INFO MainThread:562 [wandb_init.py:_resume_backend():440] resuming backend
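
The config dump in wandb/debug.log above records the 4-bit NF4 quantization of the base model and the Trainer settings behind the hyperparameters listed in the README. A rough reconstruction of that setup as code follows, assuming the standard transformers/bitsandbytes APIs; the original training notebook and the PEFT/LoRA adapter configuration are not part of this commit.

```python
# Sketch reconstructed from the values logged in wandb/debug.log above;
# the actual training notebook is not included in this commit.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig, TrainingArguments

# quantization_config from the logged model config: 4-bit NF4 with double
# quantization and bfloat16 compute
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "ybelkada/falcon-7b-sharded-bf16",
    quantization_config=bnb_config,
    device_map="auto",
)

# TrainingArguments matching the hyperparameters recorded in the log
training_args = TrainingArguments(
    output_dir="/content/gdrive/MyDrive/LLM/falcon-7b-sharded-bf16-finetuned-mental-health-conversational",
    per_device_train_batch_size=2,
    gradient_accumulation_steps=2,
    learning_rate=2e-4,
    max_grad_norm=0.3,
    max_steps=320,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    optim="paged_adamw_32bit",
    group_by_length=True,
    logging_steps=10,
    save_strategy="steps",
    save_steps=10,
    push_to_hub=True,
    report_to=["tensorboard", "wandb"],
)
```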
wandb/run-20231221_074619-o0md2mn0/logs/debug-internal.log
CHANGED
@@ -2468,3 +2468,115 @@
wandb/run-20231221_074619-o0md2mn0/logs/debug.log
CHANGED
@@ -28,3 +28,9 @@ config: {}