Fix links in documentation
hexviz/pages/2_📄Documentation.py
--- a/hexviz/pages/2_📄Documentation.py
+++ b/hexviz/pages/2_📄Documentation.py
@@ -1,7 +1,9 @@
 import streamlit as st
 
+from hexviz.config import URL
+
 st.markdown(
-    """
+    f"""
 ## Protein language models
 There has been an explosion of capabilities in natural language processing models in the last few years.
 These architectural advances from NLP have proven to work very well for protein sequences, and we now have protein language models (pLMs) that can generate novel functional protein sequences [ProtGPT2](https://www.nature.com/articles/s42256-022-00499-z)
@@ -23,8 +25,8 @@ domain experts to explore and interpret the knowledge contained in pLMs.
 
 ## How to use Hexviz
 There are two views:
-1.
-2.
+1. <a href="{URL}Attention_Visualization" target="_self">🧬Attention Visualization</a> Shows attention weights from a single head as red bars between residues on a protein structure.
+2. <a href="{URL}Identify_Interesting_Heads" target="_self">🗺️Identify Interesting Heads</a> Plots attention weights between residues as a heatmap for each head in the model.
 
 The first view is the meat of the application and is where you can investigate how attention patterns map onto the structure of a protein you're interested in.
 Use the second view to narrow down to a few heads that you want to investigate attention patterns from in detail.
@@ -50,5 +52,6 @@ Hexviz currently supports the following models:
 ## FAQ
 1. I can't see any attention "bars" in the visualization, what is wrong? -> Lower the `minimum attention`.
 2. How are sequences I input folded? -> Using https://esmatlas.com/resources?action=fold
-    """
+    """,
+    unsafe_allow_html=True,
 )
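For context, this is a common pattern for making Streamlit multipage links work when the app is embedded in an iframe, as on a Hugging Face Space: build absolute URLs from a configurable base and emit HTML anchors with `target="_self"`, presumably so navigation stays in the current frame instead of opening a new tab. The `"""` to `f"""` change in the first hunk is what lets `{URL}` interpolate into the markdown body. Below is a minimal, self-contained sketch of the idea; `hexviz.config` is not shown in this diff, so reading the base URL from a `HEXVIZ_URL` environment variable and the localhost default are illustrative assumptions, not the project's actual config.

```python
# Minimal sketch of the pattern in this commit. The real hexviz.config module
# is not part of the diff; the HEXVIZ_URL environment variable and the
# localhost default below are assumptions for illustration only.
import os

import streamlit as st

# Base URL that page links are prefixed with, e.g. the public URL of the Space.
URL = os.environ.get("HEXVIZ_URL", "http://localhost:8501/")

st.markdown(
    f"""
## How to use Hexviz
There are two views:
1. <a href="{URL}Attention_Visualization" target="_self">🧬Attention Visualization</a>
2. <a href="{URL}Identify_Interesting_Heads" target="_self">🗺️Identify Interesting Heads</a>
""",
    # st.markdown escapes raw HTML by default, so the anchors need this flag;
    # target="_self" keeps navigation inside the embedding frame.
    unsafe_allow_html=True,
)
```

Note that `unsafe_allow_html=True` is required because `st.markdown` does not render raw HTML by default, and plain markdown-style links could not carry the `target` attribute.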