Update README.md
README.md
@@ -72,7 +72,7 @@ This research is for academic research use only, commercial use is not allowed w
 Data processing and annotation is one of the most important steps in training the model. We sincerely welcome Traditional Chinese Medicine practitioners with strong TCM thinking and an innovative spirit to join us; corresponding data contributions will be acknowledged. We look forward to the day when we can achieve a reliable General Artificial Intelligence for Traditional Chinese Medicine, allowing ancient Chinese medicine to blend with modern technology and shine anew. This is also the ultimate mission of this project. If you are interested, please send an email to [email protected].
 
 ## Team Introduction
-
+Led by the non-profit organization FulPhil-医哲未来 (Future Medicine Philosophy), the CMLM (Chinese Medicine Language Models) initiative on HuggingFace is dedicated to advancing healthcare AI by integrating traditional Chinese medicine with state-of-the-art machine learning. Our mission includes curating valuable medical datasets, developing AI models for medical assistance, and ensuring the ethical use of AI in healthcare, while fostering collaboration among global experts in Chinese medicine, Western medicine, and AI.
 
 ## Citation
 If you find this work useful in your research, please cite our repository: