---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram
  results: []
---

# ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram

This model is a fine-tuned version of [gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram](https://huggingface.co/gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4624
- Wer: 0.2031

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 100.0
- mixed_precision_training: Native AMP
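As a rough illustration, the settings above map onto the 🤗 Transformers `TrainingArguments` below. This is a minimal sketch based only on the values listed here; the `output_dir` name and anything not in the list are illustrative assumptions, not taken from the original training script.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# Only the listed values are copied; everything else is an illustrative assumption.
training_args = TrainingArguments(
    output_dir="ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram",  # illustrative
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,   # effective train batch size: 8 * 8 = 64
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100.0,
    fp16=True,  # corresponds to "Native AMP" mixed-precision training
)
```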
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.3313        | 1.0   | 72   | 0.4484          | 0.2159 |
| 0.318         | 2.0   | 144  | 0.4495          | 0.2155 |
| 0.3046        | 3.0   | 216  | 0.4518          | 0.2149 |
| 0.2925        | 4.0   | 288  | 0.4516          | 0.2144 |
| 0.2995        | 5.0   | 360  | 0.4518          | 0.2126 |
| 0.3005        | 6.0   | 432  | 0.4536          | 0.2114 |
| 0.2867        | 7.0   | 504  | 0.4555          | 0.2123 |
| 0.2992        | 8.0   | 576  | 0.4537          | 0.2113 |
| 0.2964        | 9.0   | 648  | 0.4567          | 0.2115 |
| 0.3104        | 10.0  | 720  | 0.4545          | 0.2093 |
| 0.2961        | 11.0  | 792  | 0.4572          | 0.2101 |
| 0.3039        | 12.0  | 864  | 0.4562          | 0.2121 |
| 0.2903        | 13.0  | 936  | 0.4601          | 0.2099 |
| 0.2962        | 14.0  | 1008 | 0.4571          | 0.2110 |
| 0.3033        | 15.0  | 1080 | 0.4578          | 0.2104 |
| 0.3055        | 16.0  | 1152 | 0.4549          | 0.2105 |
| 0.2917        | 17.0  | 1224 | 0.4605          | 0.2101 |
| 0.2894        | 18.0  | 1296 | 0.4655          | 0.2086 |
| 0.2902        | 19.0  | 1368 | 0.4671          | 0.2087 |
| 0.2893        | 20.0  | 1440 | 0.4665          | 0.2097 |
| 0.2974        | 21.0  | 1512 | 0.4655          | 0.2080 |
| 0.3053        | 22.0  | 1584 | 0.4592          | 0.2090 |
| 0.2873        | 23.0  | 1656 | 0.4647          | 0.2083 |
| 0.2824        | 24.0  | 1728 | 0.4611          | 0.2063 |
| 0.2931        | 25.0  | 1800 | 0.4598          | 0.2066 |
| 0.2974        | 26.0  | 1872 | 0.4611          | 0.2054 |
| 0.2892        | 27.0  | 1944 | 0.4604          | 0.2058 |
| 0.2937        | 28.0  | 2016 | 0.4607          | 0.2046 |
| 0.2864        | 29.0  | 2088 | 0.4608          | 0.2063 |
| 0.3046        | 30.0  | 2160 | 0.4607          | 0.2055 |
| 0.2973        | 31.0  | 2232 | 0.4553          | 0.2051 |
| 0.2946        | 32.0  | 2304 | 0.4632          | 0.2057 |
| 0.2784        | 33.0  | 2376 | 0.4624          | 0.2055 |
| 0.2807        | 34.0  | 2448 | 0.4605          | 0.2055 |
| 0.29          | 35.0  | 2520 | 0.4627          | 0.2061 |
| 0.2927        | 36.0  | 2592 | 0.4586          | 0.2034 |
| 0.2978        | 37.0  | 2664 | 0.4567          | 0.2039 |
| 0.2852        | 38.0  | 2736 | 0.4605          | 0.2033 |
| 0.2765        | 39.0  | 2808 | 0.4608          | 0.2039 |
| 0.3007        | 40.0  | 2880 | 0.4585          | 0.2052 |
| 0.2959        | 41.0  | 2952 | 0.4597          | 0.2043 |
| 0.279         | 42.0  | 3024 | 0.4584          | 0.2037 |
| 0.2865        | 43.0  | 3096 | 0.4608          | 0.2047 |
| 0.2857        | 44.0  | 3168 | 0.4601          | 0.2033 |
| 0.2927        | 45.0  | 3240 | 0.4614          | 0.2033 |
| 0.2965        | 46.0  | 3312 | 0.4619          | 0.2032 |
| 0.2865        | 47.0  | 3384 | 0.4600          | 0.2030 |
| 0.2825        | 48.0  | 3456 | 0.4617          | 0.2040 |
| 0.2773        | 49.0  | 3528 | 0.4623          | 0.2040 |
| 0.2903        | 50.0  | 3600 | 0.4646          | 0.2034 |
| 0.2895        | 51.0  | 3672 | 0.4635          | 0.2051 |
| 0.282         | 52.0  | 3744 | 0.4653          | 0.2050 |
| 0.2879        | 53.0  | 3816 | 0.4637          | 0.2043 |
| 0.2753        | 54.0  | 3888 | 0.4633          | 0.2042 |
| 0.2897        | 55.0  | 3960 | 0.4638          | 0.2051 |
| 0.2946        | 56.0  | 4032 | 0.4616          | 0.2056 |
| 0.29          | 57.0  | 4104 | 0.4635          | 0.2051 |
| 0.2787        | 58.0  | 4176 | 0.4647          | 0.2051 |
| 0.2824        | 59.0  | 4248 | 0.4670          | 0.2048 |
| 0.2819        | 60.0  | 4320 | 0.4660          | 0.2042 |
| 0.2903        | 61.0  | 4392 | 0.4652          | 0.2056 |
| 0.2947        | 62.0  | 4464 | 0.4667          | 0.2045 |
| 0.2885        | 63.0  | 4536 | 0.4660          | 0.2040 |
| 0.2892        | 64.0  | 4608 | 0.4639          | 0.2034 |
| 0.2932        | 65.0  | 4680 | 0.4638          | 0.2050 |
| 0.2764        | 66.0  | 4752 | 0.4655          | 0.2030 |
| 0.2936        | 67.0  | 4824 | 0.4654          | 0.2036 |
| 0.2911        | 68.0  | 4896 | 0.4659          | 0.2031 |
| 0.2871        | 69.0  | 4968 | 0.4654          | 0.2028 |
| 0.294         | 70.0  | 5040 | 0.4656          | 0.2042 |
| 0.2946        | 71.0  | 5112 | 0.4624          | 0.2038 |
| 0.2915        | 72.0  | 5184 | 0.4620          | 0.2034 |
| 0.2844        | 73.0  | 5256 | 0.4610          | 0.2034 |
| 0.2752        | 74.0  | 5328 | 0.4607          | 0.2036 |
| 0.2941        | 75.0  | 5400 | 0.4605          | 0.2040 |
| 0.2964        | 76.0  | 5472 | 0.4612          | 0.2027 |
| 0.2957        | 77.0  | 5544 | 0.4599          | 0.2030 |
| 0.2875        | 78.0  | 5616 | 0.4609          | 0.2030 |
| 0.2939        | 79.0  | 5688 | 0.4616          | 0.2031 |
| 0.3005        | 80.0  | 5760 | 0.4601          | 0.2039 |
| 0.2947        | 81.0  | 5832 | 0.4610          | 0.2035 |
| 0.2876        | 82.0  | 5904 | 0.4615          | 0.2036 |
| 0.2886        | 83.0  | 5976 | 0.4627          | 0.2034 |
| 0.2908        | 84.0  | 6048 | 0.4639          | 0.2033 |
| 0.296         | 85.0  | 6120 | 0.4628          | 0.2031 |
| 0.2842        | 86.0  | 6192 | 0.4627          | 0.2034 |
| 0.2938        | 87.0  | 6264 | 0.4629          | 0.2035 |
| 0.2967        | 88.0  | 6336 | 0.4632          | 0.2033 |
| 0.2854        | 89.0  | 6408 | 0.4634          | 0.2035 |
| 0.2975        | 90.0  | 6480 | 0.4642          | 0.2034 |
| 0.2917        | 91.0  | 6552 | 0.4632          | 0.2034 |
| 0.2854        | 92.0  | 6624 | 0.4637          | 0.2030 |
| 0.2877        | 93.0  | 6696 | 0.4640          | 0.2034 |
| 0.2818        | 94.0  | 6768 | 0.4625          | 0.2027 |
| 0.2946        | 95.0  | 6840 | 0.4626          | 0.2027 |
| 0.2992        | 96.0  | 6912 | 0.4626          | 0.2025 |
| 0.2884        | 97.0  | 6984 | 0.4626          | 0.2029 |
| 0.299         | 98.0  | 7056 | 0.4624          | 0.2027 |
| 0.2788        | 99.0  | 7128 | 0.4628          | 0.2032 |
| 0.2966        | 100.0 | 7200 | 0.4624          | 0.2031 |

### Framework versions

- Transformers 4.21.0.dev0
- Pytorch 1.9.1+cu102
- Datasets 2.3.3.dev0
- Tokenizers 0.12.1
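## How to use

The card does not ship a usage example, so the following is a minimal inference sketch, assuming the checkpoint exposes a standard `Wav2Vec2Processor` and a `Wav2Vec2ForCTC` head. The audio file name and the 16 kHz mono assumption are illustrative; greedy CTC decoding is used here, so the 5-gram language model implied by the model name is not applied.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# "singing.wav" is a placeholder; XLSR-53-based models expect 16 kHz mono audio.
speech, _ = librosa.load("singing.wav", sr=16000, mono=True)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding; a decoder with the 5-gram LM would replace this step.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```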