Update README.md
README.md
@@ -1,10 +1,24 @@
 ---
 title: README
-emoji:
+emoji: 👀
 colorFrom: indigo
-colorTo:
+colorTo: blue
 sdk: static
 pinned: false
+license: apache-2.0
 ---
 
-
+<img src="https://cdn-uploads.huggingface.co/production/uploads/66f68e0efbc158f28460a696/dLd19hGCL9UmSj2sKSHXs.png" width="600" alt="X logo">
+
+# Welcome to X!
+
+[Website](#) | [GitHub](#) | [Discord](#)
+
+## About Us
+X AI is at the forefront of multimodal AI technology. We are committed to making AI faster, smarter, and more accessible to everyone. Our Mixture of Experts (MoE) architecture is central to that effort, delivering superior performance while reducing costs, which makes advanced AI solutions practical at scale.
+
+## Models
+**A**: A powerful open-source MoE model with native multimodal support, enabling seamless integration across diverse applications.
+
+
+Edit this `README.md` markdown file to author your organization card.