Dataset columns: index (int64, range 0–0) · repo_id (string, 179 distinct values) · file_path (string, length 26–186) · content (string, length 1–2.1M) · __index_level_0__ (int64, range 0–9)
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter1/quiz.mdx
<!-- DISABLE-FRONTMATTER-SECTIONS --> # Check your understanding of the course material ### 1. 采样率使用的单位是? <Question choices={[ { text: "dB", explain: "错误,使用dB标度的是幅值。" }, { text: "Hz", explain: "采样率是指每秒内采样点的个数,使用赫兹(Hz)为单位。", correct: true }, { text: "bit", explain: "比特(bit)是描述位深度的单位,位深度指的是单个样本点需要用多少位的二进制数(即比特)来表示。", } ]} /> ### 2. 使用流式加载模式加载大规模数据集时,何时才能开始使用该数据集? <Question choices={[ { text: "在完整的数据集下载完毕时", explain: "使用流式加载的目的就是让我们不需要下载完整的数据集也可以开始使用数据集。" }, { text: "在前16个样本下载完成时", explain: "再试一次!" }, { text: "在第一个样本被下载完成是", explain: "", correct: true } ]} /> ### 3. 什么是时频谱? <Question choices={[ { text: "将麦克风的电信号转化为数字信号的设备", explain: "将麦克风的电信号转化为数字信号的设备称为模拟-数字转换器(模数转换器,Analog-to-Digital Converter,ADC)。再试一次!" }, { text: "表示音频信号的幅值随时间变化的图像,也被称为声音信号的*时域*表示", explain: "这种图像称为波形图,而非时频谱图。" }, { text: "表示音频信号的各个频率成分随时间变化的可视化表示", explain: "", correct: true } ]} /> ### 4. 下列方法中,最简单的可以将原始音频转化为Whisper模型接受的对数梅尔谱的方法是? A. ```python librosa.feature.melspectrogram(audio["array"]) ``` B. ```python feature_extractor = WhisperFeatureExtractor.from_pretrained("openai/whisper-small") feature_extractor(audio["array"]) ``` C. ```python dataset.feature(audio["array"], model="whisper") ``` <Question choices={[ { text: "A", explain: "`librosa.feature.melspectrogram()`生成的是能量频谱。" }, { text: "B", explain: "", correct: true }, { text: "C", explain: "Dataset并不会对数据进行预处理或转换。模型的特征提取器才会进行转换。" } ]} /> ### 5. 如何从🤗 Hub加载数据集? A. ```python from datasets import load_dataset dataset = load_dataset(DATASET_NAME_ON_HUB) ``` B. ```python import librosa dataset = librosa.load(PATH_TO_DATASET) ``` C. ```python from transformers import load_dataset dataset = load_dataset(DATASET_NAME_ON_HUB) ``` <Question choices={[ { text: "A", explain: "最佳方案是使用🤗 Datasets库。", correct: true }, { text: "B", explain: "Librosa.load 在加载单个的音频文件时十分好用,但在加载多样本和多特征的数据库时并非最佳方案。" }, { text: "C", explain: "load_dataset 方法属于🤗 Datasets库,而非🤗 Transformers。" } ]} /> ### 6. 你的自定义数据集包含了32千赫兹采样率的高清音频。你现在想要训练一个使用16千赫兹采样率的语音识别模型。你应该怎么做? <Question choices={[ { text: "直接使用这些数据,模型可以对不同采样率的数据有泛化能力", explain: "由于依靠注意力机制,模型很难对不同采样率的数据进行泛化。" }, { text: "使用🤗 Datasets库的音频模组对自定义数据集进行降采样", explain: "", correct: true }, { text: "使用每隔一个样本点丢弃一个样本点的方法进行降采样", explain: "这种方法会产生混叠失真。重采样操作往往十分棘手,因此我们推荐经过测试的工具库,例如librosa和🤗 Datasets。" } ]} /> ### 7. 如何将机器学习模型生成的时频谱转化为波形? <Question choices={[ { text: "我们可以使用一种叫做声码器(vocoder)的神经网络从时频谱重构波形", explain: "由于生成的时频谱缺乏相位信息,我们需要使用声码器或者经典的Griffin-Lim算法来重构波形。", correct: true }, { text: "我们可以用逆短时傅里叶变换(inverse STFT)将生成的时频谱转化为波形", explain: "使用逆短时傅里叶变化需要时频谱的相位信息,但生成的时频谱一般仅具有幅值信息。" }, { text: "无法将机器学习模型生成的时频谱转化为波形。", explain: "再试一次!" } ]} />
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter1/introduction.mdx
# 第1单元:音频数据处理 ## 单元简介 所有音频或语音相关的任务都需要使用音频文件。在我们深入了解这些任务之前,我们需要了解音频文件的实际内容以及如何利用音频文件。 本单元将为你介绍与音频数据相关的基本概念,包括波形、采样率和频谱图。你会学习到如何使用音频数据集,包括音频数据加载、音频数据预处理,以及高效加载大规模音频数据集的流式加载方法。 完成本单元的学习后,你会掌握基础的音频相关术语,并且掌握针对不同应用的音频数据处理工具。本单元的知识会成为后面章节的基础。
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter1/audio_data.mdx
# 音频数据处理入门 声波在本质上是一种连续信号,这意味着在一段给定时间内的声音信号有无数个取值。对于只能读取有限长数组的数字计算机来说,这是一个重要的问题。为了使得数字设备能够处理、储存和传送声波,我们需要将连续的声音信号转换为一个离散的序列。我们称之为数字化表示。 音频数据集里包含了许多音频段落的数字化文件,例如一段旁白或者一段音乐。你可能见过不同的文件格式,例如`.wav` (Waveform Audio File,音频波形文件)、 `.flac` (Free Lossless Audio Codec,免费无损音频编解码) 和 `.mp3` (MPEG-1 音频格式 3)。这些格式的主要区别在于它们的压缩方法不同。 下面我们来了解一下如何将连续的声音信号转换为这些数字化表示。原始的模拟信号首先被麦克风捕捉,并由声音信号转化为电信号。接下来,电信号会由模拟-数字转换器(模数转换器,Analog-to-Digital Converter, ADC)经由采样过程转换为数字化表示。 ## 采样过程和采样率 采样是一种在固定的时间间隔上测量连续信号的数值的过程。采样过后的信号被称为_离散信号_,因为这些信号是在固定间隔上记录的有限长度信号。 <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface-course/audio-course-images/resolve/main/Signal_Sampling.png" alt="Signal sampling illustration"> </div> *示意图取自维基百科词条: [Sampling (signal processing)](https://en.wikipedia.org/wiki/Sampling_(signal_processing)) [采样](https://zh.wikipedia.org/zh-cn/%E5%8F%96%E6%A8%A3)* **采样率**(sampling rate,也叫采样频率,sampling frequency)指的是每一秒钟内测量信号数值的次数,单位为赫兹(Hz)。作为参照,CD音质的音频一般采用44100赫兹的采样率,意味着每秒钟测量了44100次信号的数值。作为对比,高清(High-resolution)音频的采样率一般为192000赫兹,即192千赫兹。语音模型常用的采样率为16,000赫兹,即16千赫兹。 采样率决定了能够被捕捉的最高频率。采样率的二分之一被称为奈奎斯特极限,这是该采样率能够捕捉的最高频率。人耳可辨认的语音信号往往在8千赫兹以下,因此16千赫兹的采样率足够捕捉所有可听到的语音内容。使用更高的采样率并不能采集到更多的信息,并且往往会导致计算成本的增加。另一方面,过低的采样率会导致信息丢失。使用8千赫兹采样的音频会听起来很闷,因为该采样率无法捕捉更高频率的声音。 在处理音频任务时,切记要保证数据集中的所有数据都使用了相同的采样率。如果你计划使用自己的数据来对预训练模型进行微调,你自己的音频数据和预训练模型所使用的音频数据需要保持相同的采样率。采样率决定了相邻的音频采样点的间隔时间,同时也影响着音频数据的时间分辨率。设想这样一个例子:一段5秒长度,16000赫兹采样率的音频等效于一个80000个数据点的序列(5 × 16000 = 80000);而同样5秒长度、8000赫兹采样率的音频则只对应40000个数据点。Transformer模型会使用注意力机制来学习音频或多模态表征。由于序列的长度会根据音频采样率而变化,我们的模型很难对不同的采样率进行泛化学习。**重采样**过程可以匹配不同音频文件的采样率,是音频数据[预处理](preprocessing#resampling-the-audio-data)过程的一部分。 ## 幅值和位深度 采样率告诉了我们每个采样点之间的时间间隔,那么采样点的数值具体又是如何确定的呢? 声音本质上是人类可察觉范围内的气压的周期性波动。声音的**幅值**描述的是任意瞬间的气压大小,使用分贝(dB)作为单位。人类感知到的幅值强度称为响度。举个例子,正常的说话声音响度在60分贝以下;一场摇滚演出的响度大概在125分贝,几乎是人耳的极限。 在数字音频中,每个采样点都记录了某个时间点上的声波的幅值。采样点的**位深度**决定了采样点的数值可以有多少种变化,即采样的精度。位深度越大,数字化表示就可以越准确地记录下原始的连续声波。 最常见的音频位深度为16比特或24比特。比特是一个二进制单位,表示了声波的连续幅值被数字化后可以取值的范围:16比特有65,536种可能的取值,而24比特有16,777,216种可能的取值。在这一量化过程中,原始的连续幅值被约减到最近的离散值上,因此量化过程会引入噪声。位深度越大,量化噪声则越小。在实际应用中,16比特音频的量化噪声已达到了几乎不可辨别的程度,因此我们通常不会使用更大的位深度。 你也许听说过32比特音频。在这种设置下,采样点会被当作浮点数储存,而16比特或24比特则是将采样点作为整数储存。32比特的浮点数所拥有的精度实际上也是24比特,与24比特音频相同。浮点数采样点的数值变化范围是[-1.0, 1.0]。由于机器学习模型在设计上也采用浮点数据,因此事实上任何音频文件在被输入进模型进行训练之前都需要转换为浮点数。在下一个章节[音频数据预处理](preprocessing)中我们会详细介绍这一过程。 与连续声音信号相同,数字音频信号的响度也通常使用分贝(dB)表示。这是由于人耳对于声音响度的感知是遵循对数关系的:我们的耳朵对于细微声音的微小扰动的敏感度大于对吵闹声音的微小扰动的敏感度。分贝也遵循这样的对数关系。现实世界中声音的分贝值是从0分贝开始计算的,0分贝代表着人耳所能感知到的最小的声音,更大的声音则拥有更高的分贝值。然而在数字音频中,0分贝代表着最大的幅值,并且任何更小的声音都有着负数的分贝值。一个简单的规则是,每-6分贝会让幅值减半,而-60分贝以下的声音基本是不可感知的,除非音量被调到很大。 ## 音频的波形表示 你可能见过被可视化为**波形**的声音信号。在这种图表中,采样点随着时间变化的数值被标记在直角坐标系中。这也被称为声音的*时域*表示。 这种可视化表示方法可以很好地帮助我们辨别声音信号中的某些特征,例如某个声音事件发生的时间、音频的整体响度、以及音频中的非正常部分或者噪声部分。 我们可以使用`librosa`这一Python库来绘制音频信号的波形图: ```bash pip install librosa ``` 我们可以使用库中自带的音频文件"trumpet"绘制示例图: ```py import librosa array, sampling_rate = librosa.load(librosa.ex("trumpet")) ``` 这一示例音频文件以元组的形式被加载,第一个元素为音频的时间序列(我们命名为`array`),第二个元素为采样率(`sampling_rate`)。我们使用librosa的`waveshow()`函数来绘制该音频的波形图: ```py import matplotlib.pyplot as plt import librosa.display plt.figure().set_figwidth(12) librosa.display.waveshow(array, sr=sampling_rate) ``` <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface-course/audio-course-images/resolve/main/waveform_plot.png" alt="Waveform plot"> </div> 该图中的y轴表示的是信号的幅值,x轴则表示时间。换句话说,图中的每个点都代表着该音频对应的原始信号被采样的某一瞬间的取值。同时我们也注意到librosa返回的音频序列已经是浮点数的,并且幅值的范围在[-1.0, 1.0]之间。
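在继续之前,我们可以用几行代码快速检查一下刚刚加载的音频的基本属性(这只是一个简单的示意,假设你已按上文加载了 `array` 和 `sampling_rate`):

```py
# 示意:检查加载后音频的基本属性
print(f"采样点数:{len(array)}")
print(f"时长(秒):{len(array) / sampling_rate:.2f}")
print(f"数据类型:{array.dtype}")
print(f"幅值范围:[{array.min():.3f}, {array.max():.3f}]")
```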
除了直接聆听音频外,音频的可视化也可以帮助我们更好地理解我们的数据。你可以观察信号的形状、总结信号的规律、学习如何找出信号中的噪音和失真。如果你使用了归一化、重采样或者滤波等的信号预处理方法,你可以用可视化的方法来确认预处理后的信号是否符合你的预期。在完成模型训练之后,你也可以可视化模型出错的数据(例如在音频分类任务中被分到错误类别的样本)来找到模型中的错误。 ## 频谱图 另一种音频可视化的方法则是绘制出音频信号的**频谱**(spectrum),也称为信号的**频域**(frequency domain)表示。频谱可以通过离散傅里叶变换(Discrete Fourier Transform, DFT)求得,它描述了音频信号中每个频率成分的强度。 我们可以使用numpy的`rfft()`函数来绘制前文提到的小号声音的频谱图。虽然我们也可以绘制整个音频文件的频谱,但绘制一小段音频片段的频谱会更加有用。这里我们使用整段音频的前4096个采样点计算DFT,这差不多是第一个音符的长度: ```py import numpy as np dft_input = array[:4096] # 计算 DFT window = np.hanning(len(dft_input)) windowed_input = dft_input * window dft = np.fft.rfft(windowed_input) # 计算频谱的幅值,转换为分贝标度 amplitude = np.abs(dft) amplitude_db = librosa.amplitude_to_db(amplitude, ref=np.max) # 计算每个DFT分量对应的频率值 frequency = librosa.fft_frequencies(sr=sampling_rate, n_fft=len(dft_input)) plt.figure().set_figwidth(12) plt.plot(frequency, amplitude_db) plt.xlabel("Frequency (Hz)") plt.ylabel("Amplitude (dB)") plt.xscale("log") ``` <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface-course/audio-course-images/resolve/main/spectrum_plot.png" alt="Spectrum plot"> </div> 这张图向我们展示了截取的音频片段中各个频率成分的强度。图中的x轴是频率的值,一般采用对数表示;y轴则对于频率的幅值。 可以看到这张频谱图中有几个峰值。这些峰值对应着当前音符的泛音频率,且更高的泛音声音更小。可以看到首个峰对应的频率在620赫兹左右,这说明当前演奏的音符的音高是E♭。 计算DFT所得到的频谱是由复数组成的序列,每个复数都包含了实部和虚部。我们可以使用`np.abs(dft)`来计算频谱的绝对值(又称模、幅值)。实部和虚部的夹角组成的序列也成为相位谱,但在机器学习应用中我们通常不关注这一部分。 我们使用了`librosa.amplitude_to_db()`函数将幅值转换为了分贝标度,方便我们观察频谱的细节。有时人们也使用测量能量而非幅值的**能量谱**(power spectrogram),其值为幅值的平方。 <Tip> 💡 在实践中,人们往往将快速傅里叶变换(Fast Fourier Transform, FFT)和离散傅里叶变换(Discrete Fourier Transform, DFT)这两个名词等价使用,这是因为FFT是在计算机中可以高效计算DFT的唯一方法。 </Tip> 音频信号的频谱和其波形所包含的信息其实完全相同,他们只是相同数据的不同表示方法(这里均表示该小号音频的前4096个样本)。两者的区别在于波形表示的是幅值随着时间的变化,而频谱表示的是各个频率成分在该时间段内的强度。 ## 时频谱 我们能否用某种方法表示出频率成分随着时间的变化呢?在这段小号音频中,演奏者实际上吹奏了几个不同频率的音符。频谱的问题在于其只能表示一个短暂时间段内各个频率成分的总体幅值。这里的解决方法是我们可以进行多次的DFT,每次DFT都覆盖一小段不同的时间段,然后再把所有的频谱堆叠起来,这样就构成了**时频谱**(spectrogram)。 时频谱表示了音频信号中各个频率成分随时间变化的过程。它可以让你在一张图中看到时间、频率和幅值的所有信息。计算时频谱的算法被成为短时傅里叶变换(Short Time Fourier Transform, STFT)。 时频谱是信息量最大的音频工具之一。举个例子,在分析音乐文件时,时频谱可以清晰地展示出各个乐器和人声在音乐整体中所占的部分。在语音文件中,你可以在时频谱里看到每个元音音节以及它们频率成分的差异。 我们使用librosa的`stft()`函数和`specshow()`函数来绘制同一段小号音频的时频谱图: ```py import numpy as np D = librosa.stft(array) S_db = librosa.amplitude_to_db(np.abs(D), ref=np.max) plt.figure().set_figwidth(12) librosa.display.specshow(S_db, x_axis="time", y_axis="hz") plt.colorbar() ``` <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface-course/audio-course-images/resolve/main/spectrogram_plot.png" alt="Spectrogram plot"> </div> 该图中,x轴表示的是和波形图中相同的时间,但y轴现在表示着不同的频率,以赫兹为单位。颜色的强度表示着当前时间点和频率的幅值强度,使用分贝(dB)标度。 时频谱的计算大概经过以下几个步骤:首先截取很短的音频片段(通常只有几毫秒),然后对每个片段计算其离散傅里叶变换(DFT);获得所有片段的频谱之后,我们再将频谱延时间轴堆叠起来,这样就得到了我们的时频谱。时频谱图像的每个垂直切片都是一个单独的频谱图。`librosa.stft()`函数在默认条件下会把音频信号分割为2048个样本的许多切片,这一数字是在权衡了时频谱的频域分辨率和时域分辨率之后设置的。 由于时频谱和波形是同一信号的不同表示方法,我们也可以利用反向短时傅里叶变换(inverse STFT)将时频谱转换回原始的波形。然而,这一操作除了需要时频谱的强度谱之外,也需要时频谱的相位谱。目前的机器学习模型大多只能生成强度谱。这时我们可以使用一些相位重建(phase reconstruction)方法,包括传统的Griffin-Lim算法,或者使用一种被称为声码器(vocoder)的神经网络来从时频谱还原其波形。 时频谱的作用不仅在于音频的可视化。许多机器学习模型也会使用时频谱作为模型的输入和输出而不直接使用音频的波形。 现在我们了解了时频谱的原理和计算方法,我们来进一步学习一下在语音处理中常见的一种时频谱变体:梅尔时频谱。 ## 梅尔时频谱 梅尔时频谱(简称梅尔谱)是一种在语音处理和机器学习中常用的时频谱变体。梅尔谱也和时频谱一样表示了频率成分随时间的变化,只是频率所在的轴不同。 在标准的时频谱中,频率所在的轴是赫兹的线性变化轴。然而,人类的听觉系统对于低频率声音的变化更敏感,对于高频率声音的变化则较不敏感。这一敏感度的变化是随频率的上升呈对数关系下降的。梅尔标度作为一种感知标度模拟了人耳对于频率的非线性感知。 为了生成信号的梅尔谱,我们首先使用和标准时频谱相同的短时傅里叶变换(STFT)将音频分割为许多短时片段,并计算每个片段的频谱。然后,我们将每个片段的频谱输入进梅尔滤波器组(mel filterbank),来将频率成分转换到梅尔标度。 下面我们使用librosa的`melspectrogram()`函数绘制梅尔谱图,该函数帮我们执行了上述的所有步骤: ```py S = 
librosa.feature.melspectrogram(y=array, sr=sampling_rate, n_mels=128, fmax=8000) S_dB = librosa.power_to_db(S, ref=np.max) plt.figure().set_figwidth(12) librosa.display.specshow(S_dB, x_axis="time", y_axis="mel", sr=sampling_rate, fmax=8000) plt.colorbar() ``` <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface-course/audio-course-images/resolve/main/mel-spectrogram.png" alt="Mel spectrogram plot"> </div> 在这段例子中,`n_mels`代表梅尔滤波器组中的滤波器个数。梅尔滤波器组会计算一组频率范围,这些频率范围会将整个频谱分割成许多部分。每个频率范围都对应滤波器组中的一个滤波器,滤波器的形状和间隔是模拟人耳对不同频率的感知差异而计算得出。常用的`n_mels`取值为40或80。`fmax`则代表我们想选取的最大频率(以赫兹为单位)。 和标准频谱一样,我们也会将梅尔频率成分的强度转化为分贝标度。由于分贝的转化过程涉及到对数运算,转化后的梅尔谱通常被称为**对数梅尔时频谱**(log-mel spectrum)。在上面示例中,我们使用`librosa.power_to_db()`函数和`librosa.feature.melspectrogram()`来生成能量对数梅尔时频谱。 <Tip> 💡 梅尔视频谱间也有各种区别!有两种常用的mel计算标度("htk" 和 "slaney"),此外还有能量谱和幅度谱的区别。对数梅尔谱的转换有时仅仅是简单计算`对数`而不会完整转化为分贝标度。因此,在使用以梅尔谱作为输入的机器学习模型时,我们建议你检查梅尔谱的计算过程是否完全一致。 </Tip> 由于梅尔谱的计算过程中需要对信号进行滤波,梅尔谱的计算是一个有损过程。将梅尔谱转化回波形比将标准时频谱转化回波形更加困难,因为我们需要估计在滤波过程中丢失的频率成分。这就是为何我们需要HiFiGAN声码器等机器学习模型来将梅尔谱转化回波形。 与标准时频谱相比,梅尔谱可以捕捉更多人类可感知的音频特征,因此梅尔谱也成为了在语音识别、说话人识别、音乐风格分类等任务中更常用的选择。 现在你已经学会如何可视化音频数据了,试着可视化看看你最喜欢的声音吧:)
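作为上面提示的一个简单补充示意(假设 `S` 是前文用 `librosa.feature.melspectrogram()` 计算得到的梅尔能量谱),下面两行代码展示了"分贝标度"与"仅取对数"这两种常见的对数梅尔谱之间的差别:

```py
# 示意:两种常见的"对数梅尔谱"计算方式
log_mel_db = librosa.power_to_db(S, ref=np.max)  # 转换为分贝标度(本节使用的方式)
log_mel_ln = np.log(S + 1e-6)  # 仅取自然对数,加一个小常数以避免 log(0)
```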
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter1/load_and_explore.mdx
# 加载音频数据集 本节中我们将会使用🤗 Datasets来获取音频数据集。🤗 Datasets是一个下载和准备数据集的开源工具,包含了音频在内的各种模态数据。该工具集为Hugging Face Hub上公开的机器学习数据集提供了易用的接口。此外,🤗 Datasets还提供了专门为音频数据集而设的多种特性,帮助研究者和机器学习实践者更轻松地使用这些数据集。 首先,我们要确认已经安装了🤗 Datasets库: ```bash pip install datasets[audio] ``` 🤗 Datasets的其中一个重磅功能是可以使用`load_dataset()`函数达到仅用一行代码下载和准备数据集。 这里我们来加载和探索[MINDS-14](https://huggingface.co/datasets/PolyAI/minds14)这一音频数据集。该数据集的内容是人们向某个网银系统提问的录音,包含了多种语言和方言。 为了加载MINDS-14数据集,我们需要复制该数据集在Hugging Face Hub上的identifier(`PolyAI/minds14`),并向`load_dataset()`函数传入该参数。这里我们只选取该数据集的澳大利亚子集(`en-AU`)的训练分集: ```py from datasets import load_dataset minds = load_dataset("PolyAI/minds14", name="en-AU", split="train") minds ``` **输出:** ```out Dataset( { features: [ "path", "audio", "transcription", "english_transcription", "intent_class", "lang_id", ], num_rows: 654, } ) ``` 该数据集包含了654个音频文件,每个都有对应的转录文字和其英语翻译,以及一个代表询问人目的的标签。"audio"列则包含了原始的音频数据。我们来仔细看看其中的一个样本: ```py example = minds[0] example ``` **输出** ```out { "path": "/root/.cache/huggingface/datasets/downloads/extracted/f14948e0e84be638dd7943ac36518a4cf3324e8b7aa331c5ab11541518e9368c/en-AU~PAY_BILL/response_4.wav", "audio": { "path": "/root/.cache/huggingface/datasets/downloads/extracted/f14948e0e84be638dd7943ac36518a4cf3324e8b7aa331c5ab11541518e9368c/en-AU~PAY_BILL/response_4.wav", "array": array( [0.0, 0.00024414, -0.00024414, ..., -0.00024414, 0.00024414, 0.0012207], dtype=float32, ), "sampling_rate": 8000, }, "transcription": "I would like to pay my electricity bill using my card can you please assist", "english_transcription": "I would like to pay my electricity bill using my card can you please assist", "intent_class": 13, "lang_id": 2, } ``` 你可能注意到了"audio"列包含了好几个特征,它们分别是: * `path`:音频文件的路径(这里为`*.wav`)。 * `array`:解码后的音频文件,以1维NumPy数组表示。 * `sampling_rate`:音频文件的采样率(该样本为8000赫兹)。 `intent_class`则是分类的具体类别。我们可以使用`int2str()`方法将该数字转换为有意义的字符串: ```py id2label = minds.features["intent_class"].int2str id2label(example["intent_class"]) ``` **输出:** ```out "pay_bill" ``` 在该样本的转录文字中,我们可以看到该音频的内容确实是某人在提一个关于账单的问题。 如果你只是想用该子集训练一个音频分类器,你可能不需要使用所有的特征。举个例子,`lang_id`标签在该子集中全部为同样的值;`english_transcription`标签和`transcription`几乎完全含有相同的内容,因此我们也可以舍弃该标签。 你可以使用🤗 Datasets的`remove_columns()`方法轻松地移除所有不相关的标签: ```py columns_to_remove = ["lang_id", "english_transcription"] minds = minds.remove_columns(columns_to_remove) minds ``` **输出:** ```out Dataset({features: ["path", "audio", "transcription", "intent_class"], num_rows: 654}) ``` 现在我们已经加载并检验了数据集的原始内容,让我们来听几个例子吧!我们可以使用`Gradio`中的`Blocks`功能和`Audio`功能从数据集中解码几个样本: ```py import gradio as gr def generate_audio(): example = minds.shuffle()[0] audio = example["audio"] return ( audio["sampling_rate"], audio["array"], ), id2label(example["intent_class"]) with gr.Blocks() as demo: with gr.Column(): for _ in range(4): audio, label = generate_audio() output = gr.Audio(audio, label=label) demo.launch(debug=True) ``` 你也可以可视化你想要的样本。这里我们试着绘制第一个样本的波形图: ```py import librosa import matplotlib.pyplot as plt import librosa.display array = example["audio"]["array"] sampling_rate = example["audio"]["sampling_rate"] plt.figure().set_figwidth(12) librosa.display.waveshow(array, sr=sampling_rate) ``` <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface-course/audio-course-images/resolve/main/waveform_unit1.png" alt="Waveform plot"> </div> 动手试试吧!试着下载MINDS-14数据集中其他语言或方言的子集,聆听并可视化其中的一些样本,感受整个数据集的多样性。你可以在[这里](https://huggingface.co/datasets/PolyAI/minds14)找到语言和方言的全部列表(仅英文)。
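如果你不确定从哪里开始,下面是一个最简单的参考示例(这里假设选用法语子集 `fr-FR`,其他子集同理):

```py
from datasets import load_dataset

# 示意:加载 MINDS-14 的法语子集并查看第一个样本的转录文字
minds_fr = load_dataset("PolyAI/minds14", name="fr-FR", split="train")
minds_fr[0]["transcription"]
```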
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter6/evaluation.mdx
# 评估语音合成模型 在训练期间,语音合成模型旨在优化平均平方误差损失(mean-square error loss,简称 MSE)或平均绝对误差(mean absolute error,简称 MAE), 这两者衡量的是预测的频谱图和真正的频谱图之间的差异。MSE 和 MAE 都鼓励模型最小化预测和目标频谱图之间的差异。然而,由于 TTS 是一种一对多映射问题, 即给定文本的输出频谱图有多种可能性,所以评估语音合成模型要比预想的困难得多。 与许多其他可以使用定量指标(如准确率、精确度)客观衡量的计算任务不同,评估 TTS 主要依赖于主观的人类分析。 评估 TTS 系统最常用方法之一是通过平均意见得分(Mean Opinion Scores,简称 MOS)进行定性评估。MOS 是一种主观评分系统, 允许人类评估者对合成语音的感知质量进行评分,评分范围从 1 到 5。这些得分通常通过听力测试收集,参与者听取并评价合成语音样本。 对于 TTS 评估来说,设置客观指标困难的主要原因之一是语音感知的主观性。不同的人类听众对语音的不同方面,包括发音、语调、自然度和清晰度, 有着各自的偏好和敏感度。用单一的数值捕捉这些感知细微差别是一项艰巨的任务。同时,人类评估的主观性使得比较和基准测试不同的 TTS 系统变得困难。 此外,这种评估可能会忽视语音合成的某些重要方面,如自然度、表达力和情感影响。这些质量难以客观量化,但在需要合成语音达到类人的品质和引发适当情感反应的应用中至关重要。 总之,由于缺乏一个真正客观的指标,评估语音合成模型是一项复杂的任务。最常用的评估方法,即平均意见得分(MOS),依赖于主观的人类分析。 尽管 MOS 提供了对合成语音质量的宝贵意见,但它也引入了不确定性和主观性。
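回到本节开头提到的训练目标,下面用一个最小的示意代码展示 MSE 和 MAE 的计算形式(这里用随机张量代替真实的预测频谱图和目标频谱图,仅作说明):

```python
import torch
import torch.nn.functional as F

# 示意:预测频谱图与目标频谱图之间的 MSE / MAE,假设两者形状相同,例如 (帧数, 80)
predicted = torch.rand(200, 80)
target = torch.rand(200, 80)

mse = F.mse_loss(predicted, target)
mae = F.l1_loss(predicted, target)
print(mse.item(), mae.item())
```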
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter6/supplemental_reading.mdx
# 补充阅读 本单元介绍了语音合成任务,包含了很多内容。想要了解更多吗?在这里,您将找到额外的资源,帮助您深入理解这些主题并提升您的学习体验。 * [HiFi-GAN: 用于高效和高保真语音合成的生成对抗网络](https://arxiv.org/pdf/2010.05646.pdf):介绍语音合成中的声码器 HiFi-GAN 的论文。 * [X-Vectors: 用于说话人识别的鲁棒 DNN 嵌入](https://www.danielpovey.com/files/2018_icassp_xvectors.pdf):介绍说话人嵌入的 X-Vector 方法的论文。 * [FastSpeech 2: 快速且高质量的端到端语音合成](https://arxiv.org/pdf/2006.04558.pdf):介绍 FastSpeech 2 的论文,这是另一个流行的语音合成模型,它使用了一种非自回归的 TTS 方法。 * [文本到语音合成的一种基于真实自发语音的向量量化方法](https://arxiv.org/pdf/2302.04215v1.pdf):介绍 MQTTS 的论文,这是一个自回归的 TTS 系统,它用量化的离散表示替换了梅尔谱。
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter6/pre-trained_models.mdx
# 语音合成的预训练模型 与 ASR(语音识别)和音频分类任务相比,语音合成的预训练模型检查点明显较少。在 🤗 Hub 上,您可以找到近 300 个适合的检查点。 在这些预训练模型中,我们将重点关注两种在 🤗 Transformers 库中开箱即用的架构——SpeechT5 和 Massive Multilingual Speech(MMS)。 在本节中,我们将探索如何在 Transformers 库中使用这些预训练模型进行 TTS(语音合成)。 ## SpeechT5 [SpeechT5](https://arxiv.org/abs/2110.07205) 是由 Microsoft 的 Junyi Ao 等人发布的模型,它能够处理一系列语音任务。虽然本单元中我们关注的是文本转语音, 但这个模型还可以用于语音转文本的任务(语音识别或说话人识别),以及语音转语音的任务(例如语音增强或变声器)。这是模型设计和预训练的方式所决定的。 SpeechT5 的核心是一个常规的 Transformer 编码器-解码器模型。就像任何其他 Transformer 一样,编码器-解码器网络使用隐藏表示来模拟序列到序列的转换。这个 Transformer 骨架对 SpeechT5 支持的所有任务都是相同的。 除此之外,SpeechT5 还有六个模态特定(语音/文本)的预处理网络(_pre-nets_)和后处理网络(_post-nets_)。输入的语音或文本(取决于任务)通过相应的预处理网络被预处理, 以获得 Transformer 可以使用的隐藏表示。然后,Transformer 的输出被送进后处理网络,转化为目标模态的输出。 以下是该模型的架构(图片来源于原始论文): <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/speecht5/architecture.jpg" alt="SpeechT5 architecture from the original paper"> </div> SpeechT5 首先使用大规模的未标注语音和文本数据进行预训练,以获得不同模态的统一表示。在预训练阶段,所有的预处理网络和后处理网络同时使用。 预训练之后,将整个编解码器骨架针对每个单独任务进行微调。在这个步骤中,只有与特定任务相关的预处理和后处理网络被采用。 例如,要使用 SpeechT5 进行语音合成,您需要选择文本编码器预处理网络来处理文本输入,以及语音解码器的预处理和后处理网络来处理语音输出。 这样就可以获得多个针对不同语音任务微调的模型,它们都受益于最开始在未标注数据上预训练时学到的知识。 <Tip> 尽管模型开始时使用的是同一个预训练模型的相同权重集,但微调后的最终版本会大不相同。所以,您不能仅仅通过更换预处理和后处理网络, 就将一个微调后的 ASR 模型转换为一个可用的 TTS 模型。SpeechT5 很灵活,但还没有灵活到这个程度 QwQ </Tip> 让我们具体看看 SpeechT5 在 TTS(文本到语音)任务中使用的预/后处理网络是什么: * 文本编码器预处理网络(Text encoder pre-net):一个文本嵌入层,将文本词元映射到编码器可以读入的隐藏表示。这与 NLP 模型(如 BERT)中的做法类似。 * 语音解码器预处理网络(Speech decoder pre-net):这个网络以对数梅尔谱(log mel spectrogram)为输入,并使用一连串的线性层来将频谱图压缩成隐藏表示。 * 语音解码器后处理网络(Speech decoder post-net):这个网络预测一个残差来添加到输出的频谱图中,用于优化模型输出的结果。 结合起来,这就是 SpeechT5 在语音合成中使用的架构: <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/speecht5/tts.jpg" alt="SpeechT5 architecture for TTS"> </div> 正如您所看到的,输出是一个对数梅尔谱,而不是最终的波形。如果您记得,我们在 [第 3 单元](../chapter3/introduction#spectrogram-output) 简要讨论过这个话题。 生成语音的模型的输出常常是对数梅尔谱,还需要通过一个额外的神经网络,即声码器(vocoder),才能转换成波形。 让我们看看具体应该怎么做。 首先,让我们从 🤗 Hub 加载微调过的 TTS SpeechT5 模型,以及用于分词和特征提取的处理器对象: ```python from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts") model = SpeechT5ForTextToSpeech.from_pretrained("microsoft/speecht5_tts") ``` 接下来,给输入的文本分词。 ```python inputs = processor(text="Don't count the days, make the days count.", return_tensors="pt") ``` SpeechT5 TTS 模型不仅限于生成一种音色的语音。相反,它能够使用所谓的“说话人嵌入”(speaker embeddings)来模仿各种特定说话人的声音特征。 <Tip> 说话人嵌入(Speaker embeddings)是一个固定大小的向量,不论音频的长度如何都能用一种紧凑的方式表示说话人的身份。这些向量捕获了关于说话人的音色、口音、 语调以及其他独特特征的关键信息,这些特征区分了不同的说话人。这样的嵌入有说话人验证(speaker verification)、说话人分离(speaker diarization)、 说话人识别(speaker identification)等多种应用。生成说话人嵌入时最常用的技术包括: * I-Vectors(身份向量):I-Vectors 基于高斯混合模型(Gaussian mixture model,GMM)。它属于无监督学习,用一个特定说话人的 GMM 的统计数据得出一个低维度、固定大小的向量来代表说话人。 * X-Vectors:X-Vectors 利用了深度神经网络(DNN),通过整合时间维度的上下文来捕获帧级的说话人信息。 [X-Vectors](https://www.danielpovey.com/files/2018_icassp_xvectors.pdf) 是 SOTA(state-of-the-art)的方法,在测试集上的表现优于 I-Vectors。 X-Vectors 是深度神经网络产生的:它被训练以区分不同说话人,并将长度不一的语音音频映射到固定维度的嵌入。您还可以加载提前计算好的 X-Vector 说话人嵌入,它封装好了某个特定说话人的说话特征。 </Tip> 让我们从 Hub 上的数据集中加载一个说话人嵌入。这些嵌入是使用 [CMU ARCTIC 数据集](http://www.festvox.org/cmu_arctic/) 和 [这个脚本](https://huggingface.co/mechanicalsea/speecht5-vc/blob/main/manifest/utils/prep_cmu_arctic_spkemb.py) 获得的,这里使用任何 X-Vector 嵌入都可以。 ```python from datasets import load_dataset embeddings_dataset = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation") import torch 
speaker_embeddings = torch.tensor(embeddings_dataset[7306]["xvector"]).unsqueeze(0) ``` 说话人嵌入是形状为 (1, 512) 的张量,我们选的这个说话人嵌入描述的是一个女声。 此时,我们已经有足够的输入来生成对数梅尔谱作为输出,您可以这样做: ```python spectrogram = model.generate_speech(inputs["input_ids"], speaker_embeddings) ``` 这会输出一个形状为 (140, 80) 的张量,包含对数梅尔谱。第一维是序列长度,可能每次运行代码它的取值都会不一样, 因为语音解码器预处理网络(pre-net)总是对输入序列用 dropout。这给生成的语音添加了一些随机变化。 如果我们想生成语音波形,我们还需要指定一个用于把频谱图转换为波形的声码器(vocoder)。理论上,您可以使用任何适用于 80-bin 梅尔谱的声码器。 🤗 Transformers 提供了一个基于 HiFi-GAN 的声码器,可以很方便地使用。其权重由 SpeechT5 的原作者友情提供。 <Tip> [HiFi-GAN](https://arxiv.org/pdf/2010.05646v2.pdf) 是一种用于高保真语音合成的 SOTA 生成对抗网络(GAN)。它能够根据输入的频谱图生成高品质且逼真的音频波形。 总的来说,HiFi-GAN 由一个生成器和两个鉴别器组成。生成器是一个全卷积神经网络,它以梅尔谱作为输入并学习产生原始音频波形。 鉴别器负责区分真实音频和生成音频。这两个鉴别器关注音频的不同方面。 HiFi-GAN 在大量高品质音频数据上进行训练。它采用所谓的<em>对抗训练</em>,其中生成器和鉴别器网络相互竞争。最初,生成器产生的音频质量较低, 鉴别器可以轻松地将其与真实音频区分开。随着训练的进行,生成器改进其输出,目标是欺骗鉴别器。反过来,鉴别器也学会更准确地区分真实和生成的音频。 这种对抗性反馈循环帮助两个网络随时间推移提升性能。最终,HiFi-GAN 学会生成高保真音频,精密地模仿训练数据的特征。 </Tip> 加载声码器就像加载任何其他 🤗 Transformers 模型一样简单。 ```python from transformers import SpeechT5HifiGan vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan") ``` 现在您需要做的就是在生成语音时将其作为参数传递,输出将自动转换为语音波形。 ```python speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder) ``` 让我们来听听输出的结果。SpeechT5 使用的采样率始终是 16 kHz。 ```python from IPython.display import Audio Audio(speech, rate=16000) ``` 真不错! 请随意使用 SpeechT5 语音合成 demo 探索其他声音,尝试各种各样的输入。请注意,该预训练检查点仅支持英语: <iframe src="https://matthijs-speecht5-tts-demo.hf.space" frameborder="0" width="850" height="450"> </iframe> ## Bark Bark 是 Suno AI 提出的基于 transformer 的语音合成模型,见 [suno-ai/bark](https://github.com/suno-ai/bark)。 与 SpeechT5 不同,Bark 直接生成语音波形,无需在推理过程中使用单独的声码器——它已经集成了这一功能。这种功能是通过使用 [`Encodec`](https://huggingface.co/docs/transformers/main/en/model_doc/encodec) 实现的,它既是编解码器也是压缩工具。 借助 `Encodec`,您可以将音频压缩成轻量级格式以减少内存使用,然后再解压缩以恢复原始音频。这个压缩过程由 8 个 codebook 帮助完成, 每个 codebook 都由整数向量组成。可以将这些 codebook 视为音频的整数形式表示或嵌入,并且每个后续的 codebook 都能在前一个的基础上提高音频重建的质量。 由于 codebook 是整数向量,它们可以被 transformer 模型学习,而 transformer 在这项任务上非常高效。这正是 Bark 被训练去做的任务。 更具体地说,Bark 由 4 个主要的模型组成: - `BarkSemanticModel`(也称为“文本”模型):一个因果自回归的 transformer 模型,它接收分词后的文本作为输入,并预测表示文本含义的语义文本词元。 - `BarkCoarseModel`(也称为“粗粒度声学”模型):一个因果自回归的 transformer,它接收 `BarkSemanticModel` 模型的结果作为输入,预测 EnCodec 所需的前两个音频 codebook 。 - `BarkFineModel`(“细粒度声学”模型),这次是一个非因果自编码 transformer,它基于之前生成的所有 codebook 嵌入,迭代地预测下一个 codebook 。 - 在 `EncodecModel` 预测了所有 codebook 通道后,Bark 使用它来解码输出音频数组。 需要注意的是,前三个模块中的每一个都支持说话人嵌入作为条件参数,以根据预特定的预设调整输出的声音。 Bark 是一个高度可控的语音合成模型,意味着您可以调整各种各样的设置,下面我们将见证这一功能。 在一切开始之前,我们需要加载模型及其处理器。 处理器在这里有两方面的作用: 1. 它用于对输入文本进行分词,即将其切割成模型可以理解的小片段。 2. 它存储说话人嵌入,即可以调节生成结果的声音预设。 ```python from transformers import BarkModel, BarkProcessor model = BarkModel.from_pretrained("suno/bark-small") processor = BarkProcessor.from_pretrained("suno/bark-small") ``` Bark 非常多功能,可以通过处理器加载一个 [说话人嵌入库](https://suno-ai.notion.site/8b8e8749ed514b0cbf3f699013548683?v=bc67cff786b04b50b3ceb756fd05f68c) 中的说话人嵌入来调节它产生的声音。 ```python # 设置说话人嵌入 inputs = processor("This is a test!", voice_preset="v2/en_speaker_3") speech_output = model.generate(**inputs).cpu().numpy() ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/first_sample.wav" type="audio/wav"> Your browser does not support the audio element. 
</audio> 它还可以生成随时可用的多语言语音,例如法语和中文。您可以在 [这里](https://huggingface.co/suno/bark) 找到支持的语言列表。与下面讨论的 MMS 不同,它不需要指定所使用的语言,只需将输入文本调整为相应的语言即可。 ```python # 试试法语,同时我们也来用一个法语说话人嵌入 inputs = processor("C'est un test!", voice_preset="v2/fr_speaker_1") speech_output = model.generate(**inputs).cpu().numpy() ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/second_sample.wav" type="audio/wav"> Your browser does not support the audio element. </audio> 该模型还可以生成**非语言交流**,例如笑、叹息和哭泣。 您只需使用相应的提示修改输入文本,例如 `[clears throat]`、`[laughter]` 或 `...`。 ```python inputs = processor( "[clears throat] This is a test ... and I just took a long pause.", voice_preset="v2/fr_speaker_1", ) speech_output = model.generate(**inputs).cpu().numpy() ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/third_sample.wav" type="audio/wav"> Your browser does not support the audio element. </audio> Bark 甚至可以产生音乐。您可以通过在单词周围添加 ♪ 音符 ♪ 来做到这一点。 ```python inputs = processor( "♪ In the mighty jungle, I'm trying to generate barks.", ) speech_output = model.generate(**inputs).cpu().numpy() ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/fourth_sample.wav" type="audio/wav"> Your browser does not support the audio element. </audio> 除了所有这些功能之外,Bark 还支持批处理,这意味着您可以同时处理多个文本条目,但代价是更密集的计算。在某些硬件(例如 GPU)上,批处理可以加快整体生成速度,这意味着一次性生成所有样本比逐个生成样本更快。 让我们尝试生成一些示例: ```python input_list = [ "[clears throat] Hello uh ..., my dog is cute [laughter]", "Let's try generating speech, with Bark, a text-to-speech model", "♪ In the jungle, the mighty jungle, the lion barks tonight ♪", ] # 设置一个说话人嵌入 inputs = processor(input_list, voice_preset="v2/en_speaker_3") speech_output = model.generate(**inputs).cpu().numpy() ``` 让我们来一个一个听这些输出。 第一个: ```python from IPython.display import Audio sampling_rate = model.generation_config.sample_rate Audio(speech_output[0], rate=sampling_rate) ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/batch_1.wav" type="audio/wav"> Your browser does not support the audio element. </audio> 第二个: ```python Audio(speech_output[1], rate=sampling_rate) ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/batch_2.wav" type="audio/wav"> Your browser does not support the audio element. </audio> 第三个: ```python Audio(speech_output[2], rate=sampling_rate) ``` <audio controls> <source src="https://huggingface.co/datasets/ylacombe/hf-course-audio-files/resolve/main/batch_3.wav" type="audio/wav"> Your browser does not support the audio element. 
</audio> <Tip> Bark 与其他 🤗 Transformers 模型一样,只需几行代码即可针对速度和内存影响进行优化。要了解具体操作方法,请单击 [此 colab 演示笔记本](https://colab.research.google.com/github/ylacombe/notebooks/blob/main/Benchmark_Bark_HuggingFace.ipynb)。 </Tip> ## Massive Multilingual Speech (MMS) 如果您需要一个非英语的预训练模型怎么办?Massive Multilingual Speech(MMS,大规模多语种语音)是另一个涵盖多种语音任务的模型,并且,它支持大量的语言。比方说,它可以合成超过 1,100 种语言的语音。 MMS 的语音合成是基于 [VITS Kim et al., 2021](https://arxiv.org/pdf/2106.06103.pdf) 的,它是最先进的 TTS 方法之一。 VITS 是一个语音生成网络,可以将文本转换成语音波形。它的工作方式类似于条件变分自编码器(conditional variational auto-encoder),从输入文本估计音频特征。首先,生成用频谱图表示的声学特征。 然后,使用改编自 HiFi-GAN 的转置卷积层解码出波形。在推理过程中,文本编码被上采样并使用流模块(flow module)和 HiFi-GAN 解码器转换成波形。像 Bark 一样,这里不需要声码器(vocoder),因为已经直接生成波形了。 <Tip warning={true}> MMS 模型最近才被添加到 🤗 Transformers 中,所以您需要从源代码安装该库: ```bash pip install git+https://github.com/huggingface/transformers.git ``` </Tip> 让我们试用一下 MMS,看看如何合成非英语的语音,例如德语。首先,我们将加载特定语言的检查点和分词器: ```python from transformers import VitsModel, VitsTokenizer model = VitsModel.from_pretrained("facebook/mms-tts-deu") tokenizer = VitsTokenizer.from_pretrained("facebook/mms-tts-deu") ``` 您可能会注意到,加载 MMS 模型需要使用 `VitsModel` 和 `VitsTokenizer`。这是因为如前所述,MMS 的 TTS 是基于 VITS 模型的。 让我们选择一首德语儿歌的前两句作为示例文本: ```python text_example = ( "Ich bin Schnappi das kleine Krokodil, komm aus Ägypten das liegt direkt am Nil." ) ``` 要生成波形输出,使用分词器预处理文本,并将其传递给模型: ```python import torch inputs = tokenizer(text_example, return_tensors="pt") input_ids = inputs["input_ids"] with torch.no_grad(): outputs = model(input_ids) speech = outputs["waveform"] ``` 让我们来听听: ```python from IPython.display import Audio Audio(speech, rate=16000) ``` Wunderbar!如果您想尝试其他语言的 MMS,可以 [在 🤗 Hub 上](https://huggingface.co/models?filter=vits) 寻找其他合适的 `vits` 检查点。 现在,让我们看看如何自己微调一个 TTS 模型!
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter6/tts_datasets.mdx
# 语音合成数据集 语音合成任务(也称为 _文本转语音_,Text-to-Speech)面临许多挑战。 首先,就像之前讨论的语音识别(ASR)一样,文本和语音之间的对齐可能很棘手。然而,与 ASR 不同的是,TTS 是一个**一对多**映射问题, 即同一文本可以以多种不同方式合成。想想您每天听到的语音中声音和说话风格的多样性——每个人说同一句话的方式都不同,但它们都是有效且正确的! TTS 模型的不同输出,频谱图或音频波形,可能对应同一个真实结果(ground truth)。模型必须学会为每个音素、单词或句子生成正确的持续时间和时序, 这可能会很有挑战性,特别是对于长且复杂的句子。 其次,TTS 中存在长距离依赖问题:语言具有时间维度,理解句子的意义通常需要考虑包含了周围词汇的上下文。确保 TTS 模型能捕获并保留长序列中的上下文信息对于生成连贯自然的语音至关重要。 最后,训练 TTS 模型通常需要文本和相应的语音录音配对,且为保证模型能够为不同说话人和说话风格生成自然的语音,数据应包含来自多个说话人的多样、有代表性的语音样本。 收集此类数据既昂贵又耗时,并且对于某些语言来说并不可行。您可能会想,为什么不直接采用为 ASR 设计的数据集来训练 TTS 模型呢?不幸的是, ASR 数据集并不是最佳选择。它们对于 ASR 有益的特性,如过多的背景噪音,在 TTS 中通常是不好的。能够在嘈杂街道的录音中辨别出语音是很棒的, 但如果您的语音助手回答您时背景有汽车喇叭声和建筑施工声就不那么理想了。尽管如此,有时可以用一些 ASR 数据集来微调,因为寻找高质量、多语言、多说话人的 TTS 数据集可能相当困难。 让我们来探索一些 🤗 Hub 上适用于 TTS 的数据集吧。 ## LJSpeech [LJSpeech](https://huggingface.co/datasets/lj_speech) 是一个包含 13,100 个英语音频片段及其对应转写的大型数据集, 内容为一位说话人朗读 7 本纪实类英语书籍的录音。由于其音质高,语言内容多样,LJSpeech 经常被用作评估 TTS 模型的基准。 ## Multilingual LibriSpeech [Multilingual LibriSpeech](https://huggingface.co/datasets/facebook/multilingual_librispeech) 是 LibriSpeech 数据集的多语言扩展版。 后者只包含英语有声读物,而 Multilingual LibriSpeech 还包括了额外的语言,如德语、荷兰语、西班牙语、法语、意大利语、葡萄牙语和波兰语。 它提供了每种语言的音频录音以及与之对齐的转写。这个数据集为开发多语言 TTS 系统和探索跨语言语音合成技术提供了宝贵的资源。 ## VCTK (Voice Cloning Toolkit) [VCTK](https://huggingface.co/datasets/vctk) 是专为语音合成研究和开发设计的数据集。它包含了 110 位不同口音的人说英语的录音,每位说话人读约 400 个句子, 这些句子选自报纸、the rainbow passage 和旨在识别说话者口音的引出段落(elicitation paragraph)。VCTK 为训练具有多样化声音和口音的 TTS 模型提供了宝贵资源,使语音合成更加自然和多样化。 ## Libri-TTS / LibriTTS-R [Libri-TTS / LibriTTS-R](https://huggingface.co/datasets/cdminix/libritts-r-aligned) 是一个由约 585 小时的英语朗读语音组成的多说话人英语语料库, 采样率 24kHz,由 Heiga Zen 在 Google Speech 和 Google Brain 团队成员的协助下搭建。LibriTTS 语料库专为 TTS 研究而设计,它源自 LibriSpeech 语料库的原始材料 (LibriVox 的 mp3 音频文件和 Project Gutenberg 的文本文件)。它与 LibriSpeech 语料库的主要区别如下: * 音频文件是 24kHz 采样率。 * 每条语音数据在句子与句子间断开。 * 包括原始和规范化的文本。 * 可以提取上下文信息(例如相邻句子)。 * 排除了具有显著背景噪音的语音数据。 组建一个适用于 TTS 的优秀数据集并非易事,因为这样的数据集需要具备几个关键特性: * 高质量和多样化的录音,涵盖广泛的语音模式、口音、语言和情感。录音应清晰、无背景噪音,并展现自然的语音特征。 * 转写:每个音频录音应有其相应的文本转写。 * 语言内容的多样性:数据集应包含语言多样化的内容,包括不同类型的句子、短语和单词。它应涵盖各种主题、体裁和领域,以确保模型能够处理不同的语言环境。 好消息是,您很可能不需要从头开始训练一个 TTS 模型。在下一节中,我们将探讨在 🤗 Hub 上可用的预训练模型。
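当然,如果你想先直观感受一下这些数据集的内容,也可以用 🤗 Datasets 加载其中之一看看,例如 LJSpeech(以下只是一个示意,完整下载该数据集需要一定的时间和磁盘空间):

```python
from datasets import load_dataset

# 示意:加载 LJSpeech 并查看第一条样本的采样率与文本
ljspeech = load_dataset("lj_speech", split="train")
ljspeech[0]["audio"]["sampling_rate"], ljspeech[0]["text"]
```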
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter6/introduction.mdx
# 第六单元:从文本到语音 在上一个单元中,您学习了如何使用 Transformers 将语音转换成文本。现在,让我们换一个方向,看看该如何将输入的文本转换成听起来像人类语音的音频输出。 我们在这个单元将学习的任务称为"语音合成"(Text-to-Speech,简称 TTS),即将文本转换为可听的人类语音。这样的模型具有广泛的潜在应用: * 辅助 app:可以利用这些模型帮助视觉障碍人士通过声音媒介访问数字内容。 * 有声读物朗读:将文字版的书籍转换成音频形式,使更喜欢听书而非阅读的人们或有阅读困难的人们能更容易地欣赏文学作品。 * 虚拟助手:TTS 模型是 Siri、Google Assistant 或 Amazon Alexa 等虚拟助手的基本组成部分。它们在使用分类模型捕捉到唤醒词、并使用 ASR(语音识别)模型处理完您的请求之后,就可以使用 TTS 模型来回应您的问题。 * 娱乐、游戏和语言学习:为您的 NPC(非玩家角色)赋予声音,叙述游戏事件,或帮助语言学习者了解单词和短语的正确发音和语调。 这些只是一些例子,我相信您还可以想象出更多!然而,能力越大责任越大,需要强调 TTS 模型有可能被用于恶意目的。例如,有了足够的声音样本,不法分子可能会合成出足以以假乱真的假语音,未经授权使用他人的声音,甚至用于诈骗。如果想要收集数据以微调自己的系统,请仔细考虑隐私和知情同意。获取声音数据应获得个人的明确同意,确保他们理解声音在 TTS 系统中使用的目的、范围和潜在风险。请负责任地使用语音合成技术。 在这一章中,我们将介绍: * [适合训练语音合成模型的数据集](tts_datasets) * [语音合成的预训练模型](pre-trained_models) * [在一门新语言上微调 SpeechT5](fine-tuning) * [评估语音合成模型的性能](evaluation)
hf_public_repos/audio-transformers-course/chapters/zh-CN/chapter6/fine-tuning.mdx
# 微调 SpeechT5 现在您已经熟悉了语音合成任务和 SpeechT5 模型的内部工作原理,该模型是在英语数据上预训练的,让我们看看如何将其微调到另一种语言。 ## 基础准备 如果您想复现这个示例,请确保您有一个 GPU。在笔记本中,您可以使用以下命令检查: ```bash nvidia-smi ``` <Tip warning={true}> 在我们的示例中,我们将使用大约 40 小时的训练数据。如果您想使用 Google Colab 免费版的 GPU 复现,需要将训练数据量减少到大约 10-15 小时,并减少训练步骤的数量。 </Tip> 您还需要一些额外的依赖: ```bash pip install transformers datasets soundfile speechbrain accelerate ``` 最后,不要忘记登录您的 Hugging Face 账户,以便您能够上传并与社区共享您的模型: ```py from huggingface_hub import notebook_login notebook_login() ``` ## 数据集 在这个示例中,我们将使用 [VoxPopuli](https://huggingface.co/datasets/facebook/voxpopuli) 数据集的荷兰语(`nl`)子集。 [VoxPopuli](https://huggingface.co/datasets/facebook/voxpopuli) 是一个大规模的多语言语音语料库,包含了 2009-2020 年欧洲议会事件的录音数据。 它包含 15 种欧洲语言的带标签的音频-转写数据。虽然我们将使用荷兰语子集,但您可以自由选择其他子集。 这是一个语音识别(ASR)数据集,所以,如前所述,它不是训练 TTS 模型的最佳选择。然而,对于这个练习来说,它已经足够好了。 让我们加载数据: ```python from datasets import load_dataset, Audio dataset = load_dataset("facebook/voxpopuli", "nl", split="train") len(dataset) ``` **输出:** ```out 20968 ``` 20968 条数据应该足以进行微调。输入 SpeechT5 的音频数据应具有 16 kHz 的采样率,所以要确保我们的数据集满足这一要求: ```python dataset = dataset.cast_column("audio", Audio(sampling_rate=16000)) ``` ## 数据预处理 处理器包含了分词器和特征提取器,我们需要用它们来预处理训练数据。所以我们先定义要使用的模型检查点,并加载对应的处理器: ```py from transformers import SpeechT5Processor checkpoint = "microsoft/speecht5_tts" processor = SpeechT5Processor.from_pretrained(checkpoint) ``` ### 为 SpeechT5 分词进行文本清理 首先,为了处理文本,我们需要处理器的分词器部分,所以让我们来获取它: ```py tokenizer = processor.tokenizer ``` 让我们看一个示例: ```python dataset[0] ``` **输出:** ```out {'audio_id': '20100210-0900-PLENARY-3-nl_20100210-09:06:43_4', 'language': 9, 'audio': {'path': '/root/.cache/huggingface/datasets/downloads/extracted/02ec6a19d5b97c03e1379250378454dbf3fa2972943504a91c7da5045aa26a89/train_part_0/20100210-0900-PLENARY-3-nl_20100210-09:06:43_4.wav', 'array': array([ 4.27246094e-04, 1.31225586e-03, 1.03759766e-03, ..., -9.15527344e-05, 7.62939453e-04, -2.44140625e-04]), 'sampling_rate': 16000}, 'raw_text': 'Dat kan naar mijn gevoel alleen met een brede meerderheid die wij samen zoeken.', 'normalized_text': 'dat kan naar mijn gevoel alleen met een brede meerderheid die wij samen zoeken.', 'gender': 'female', 'speaker_id': '1122', 'is_gold_transcript': True, 'accent': 'None'} ``` 您可能会注意到数据包含 `raw_text` 和 `normalized_text` 特征。在决定使用哪个特征作为文本输入时,需要注意的是 SpeechT5 分词器没有任何数字的词元。 在 `normalized_text` 中,数字被写成文本。因此,它更合适,我们应该使用 `normalized_text` 作为输入文本。 因为 SpeechT5 是在英语上训练的,它可能无法识别荷兰语数据集中的某些字符。如果保持原样,这些字符将被转换为 `<unk>` 词元。 然而,在荷兰语中,某些字符如 `à` 用于强调音节。为了保留文本的含义,我们可以将此字符替换为普通的 `a`。 要识别不支持的词元,使用 `SpeechT5Tokenizer` 提取数据集中所有独特字符,该分词器将字符视为词元。为此,我们将编写 `extract_all_chars` 映射函数, 该函数将所有数据样例的转写连接成一个字符串,然后转换为字符集。确保在 `dataset.map()` 中设置 `batched=True` 和 `batch_size=-1`,以便一次性获取所有转写并输入映射函数。 ```py def extract_all_chars(batch): all_text = " ".join(batch["normalized_text"]) vocab = list(set(all_text)) return {"vocab": [vocab], "all_text": [all_text]} vocabs = dataset.map( extract_all_chars, batched=True, batch_size=-1, keep_in_memory=True, remove_columns=dataset.column_names, ) dataset_vocab = set(vocabs["vocab"][0]) tokenizer_vocab = {k for k, _ in tokenizer.get_vocab().items()} ``` 现在您有两组字符:一个来自数据集,另一个来自分词器。要识别数据集中任何不支持的字符,您可以取这两组的差集,结果将包含在数据集中而不在分词器中的字符。 ```py dataset_vocab - tokenizer_vocab ``` **输出:** ```out {' ', 'à', 'ç', 'è', 'ë', 'í', 'ï', 'ö', 'ü'} ``` 为了处理上一步骤中识别的不支持字符,我们可以定义一个将这些字符映射到有效词元的函数。注意,分词器中的空格已经被替换为 `▁`,因此不需要单独处理。 ```py replacements = [ ("à", "a"), ("ç", "c"), ("è", "e"), ("ë", "e"), ("í", "i"), ("ï", "i"), ("ö", "o"), ("ü", "u"), ] def cleanup_text(inputs): for src, dst in replacements: 
inputs["normalized_text"] = inputs["normalized_text"].replace(src, dst) return inputs dataset = dataset.map(cleanup_text) ``` 现在我们处理好了文本中的特殊字符,是时候将注意力转移到音频数据上了。 ### 说话人 VoxPopuli 数据集包含多个说话人的语音,但到底有多少呢?我们可以计算一下数据集中说话人的数量以及每个说话人贡献的数据量。 数据集总共有 20,968 条数据,这些信息将帮助我们更好地了解数据中的说话人和数据样例的分布。 ```py from collections import defaultdict speaker_counts = defaultdict(int) for speaker_id in dataset["speaker_id"]: speaker_counts[speaker_id] += 1 ``` 通过绘制直方图,您可以了解每个说话人的数据量。 ```py import matplotlib.pyplot as plt plt.figure() plt.hist(speaker_counts.values(), bins=20) plt.ylabel("Speakers") plt.xlabel("Examples") plt.show() ``` <div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/tts_speakers_histogram.png" alt="Speakers histogram"/> </div> 直方图显示,数据集中大约三分之一的说话人的数据少于 100 条,而大约十个说话人的数据超过 500 条。为了提高训练效率并平衡数据集,我们可以将数据限制在有 100 到 400 条数据的说话人之间。 ```py def select_speaker(speaker_id): return 100 <= speaker_counts[speaker_id] <= 400 dataset = dataset.filter(select_speaker, input_columns=["speaker_id"]) ``` 让我们检查还剩多少个说话人: ```py len(set(dataset["speaker_id"])) ``` **输出:** ```out 42 ``` 让我们看看还剩多少条数据: ```py len(dataset) ``` **输出:** ```out 9973 ``` 您留下了不到 10,000 条数据,来自大约 40 个独特的说话人,这应该足够用了。 请注意,如果某些数据很长,一些看似数据样例量较少的说话人可能有比预想的更多的音频数据。然而,确定每个说话人的总音频量需要扫描整个数据集, 这是一个耗时的过程,涉及加载和解码每个音频文件。因此,我们在这里选择跳过这一步。 ### 说话人嵌入 为了使 TTS 模型能够区分多个说话人,您需要为每条数据创建一个说话人嵌入。说话人嵌入是模型的一个额外输入,用于描述特定说话人的声音特征。 要生成这些说话人嵌入,可以使用来自 SpeechBrain 的预训练模型 [spkrec-xvect-voxceleb](https://huggingface.co/speechbrain/spkrec-xvect-voxceleb)。 创建一个 `create_speaker_embedding()` 函数,该函数接受音频波形作为输入,并输出包含相应说话人嵌入的 512 维向量。 ```py import os import torch from speechbrain.pretrained import EncoderClassifier spk_model_name = "speechbrain/spkrec-xvect-voxceleb" device = "cuda" if torch.cuda.is_available() else "cpu" speaker_model = EncoderClassifier.from_hparams( source=spk_model_name, run_opts={"device": device}, savedir=os.path.join("/tmp", spk_model_name), ) def create_speaker_embedding(waveform): with torch.no_grad(): speaker_embeddings = speaker_model.encode_batch(torch.tensor(waveform)) speaker_embeddings = torch.nn.functional.normalize(speaker_embeddings, dim=2) speaker_embeddings = speaker_embeddings.squeeze().cpu().numpy() return speaker_embeddings ``` 注意,`speechbrain/spkrec-xvect-voxceleb` 模型是在 VoxCeleb 数据集的英语语音上训练的,而这个示例训练的是荷兰语。 虽然我们相信这个模型仍然可以为我们的荷兰语数据集生成合理的说话人嵌入,但这个假设可能不总是成立。 为了获得最佳结果,我们需要首先在目标语音上训练 X-Vector 模型。这将确保模型能够更好地捕捉荷兰语中存在的独特声音特征。如果您想训练自己的 X-向量模型, 可以参考 [此脚本](https://huggingface.co/mechanicalsea/speecht5-vc/blob/main/manifest/utils/prep_cmu_arctic_spkemb.py)。 ### 处理数据集 最后,让我们将数据处理成模型能够读入的格式。创建一个 `prepare_dataset` 函数,输入单个示例并使用 `SpeechT5Processor` 对象来对输入文本进行分词,并将目标音频加载成对数梅尔谱。它还应该额外输入说话人嵌入。 ```py def prepare_dataset(example): audio = example["audio"] example = processor( text=example["normalized_text"], audio_target=audio["array"], sampling_rate=audio["sampling_rate"], return_attention_mask=False, ) # 去掉批量处理的维度 example["labels"] = example["labels"][0] # 用 SpeechBrain 获取 X-Vector example["speaker_embeddings"] = create_speaker_embedding(audio["array"]) return example ``` 查看单个示例来验证处理是否正确: ```py processed_example = prepare_dataset(dataset[0]) list(processed_example.keys()) ``` **输出:** ```out ['input_ids', 'labels', 'stop_labels', 'speaker_embeddings'] ``` 说话人嵌入应该是一个 512 维向量: ```py processed_example["speaker_embeddings"].shape ``` **输出:** ```out (512,) ``` 标签应该是一个有 80 个 mel 频段的对数梅尔谱。 ```py import matplotlib.pyplot as plt plt.figure() plt.imshow(processed_example["labels"].T) plt.show() ``` 
<div class="flex justify-center"> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/tts_logmelspectrogram_1.png" alt="Log-mel spectrogram with 80 mel bins"/> </div> 注:如果您看不明白这个频谱图,可能是因为您习惯将低频放在底部,高频放在顶部。然而,在使用 matplotlib 库将频谱图作为图像绘制时,y 轴是反过来的,频谱图看起来是倒置的。 现在我们需要将处理函数应用于整个数据集。这将花费 5 到 10 分钟的时间。 ```py dataset = dataset.map(prepare_dataset, remove_columns=dataset.column_names) ``` 您会看到一个警告说数据集中的某些数据长于模型能够处理的最大输入长度(600 词元),得从数据集中删除这些数据。在这里,我们更进一步,为了允许更大的批量大小,删除任何超过 200 词元的内容。 ```py def is_not_too_long(input_ids): input_length = len(input_ids) return input_length < 200 dataset = dataset.filter(is_not_too_long, input_columns=["input_ids"]) len(dataset) ``` **输出:** ```out 8259 ``` 接下来,把数据集分成基本的训练/测试子集: ```py dataset = dataset.train_test_split(test_size=0.1) ``` ### 数据整理器 为了将多条数据组合成一个批次,您需要定义一个自定义数据整理器。这个整理器将使用填充词元填充较短的序列,确保所有示例都具有相同的长度。 对于频谱图标签,填充部分将被特殊值 `-100` 替换。这个特殊值指示模型在计算频谱图的损失函数时忽略那部分频谱图。 ```py from dataclasses import dataclass from typing import Any, Dict, List, Union @dataclass class TTSDataCollatorWithPadding: processor: Any def __call__( self, features: List[Dict[str, Union[List[int], torch.Tensor]]] ) -> Dict[str, torch.Tensor]: input_ids = [{"input_ids": feature["input_ids"]} for feature in features] label_features = [{"input_values": feature["labels"]} for feature in features] speaker_features = [feature["speaker_embeddings"] for feature in features] # 把输入数据和生成目标整合进一个批次 batch = processor.pad( input_ids=input_ids, labels=label_features, return_tensors="pt" ) # 把填充词元换成 -100 来正确地忽略这一部分的损失函数 batch["labels"] = batch["labels"].masked_fill( batch.decoder_attention_mask.unsqueeze(-1).ne(1), -100 ) # 在微调时用不上,删了 del batch["decoder_attention_mask"] # 把目标长度下调到 reduction factor 的整数倍 if model.config.reduction_factor > 1: target_lengths = torch.tensor( [len(feature["input_values"]) for feature in label_features] ) target_lengths = target_lengths.new( [ length - length % model.config.reduction_factor for length in target_lengths ] ) max_length = max(target_lengths) batch["labels"] = batch["labels"][:, :max_length] # 加上说话人嵌入 batch["speaker_embeddings"] = torch.tensor(speaker_features) return batch ``` 在 SpeechT5 中,模型的解码器部分的输入减少了 2 倍(reduction factor)。换句话说,它抛弃了目标序列中每两步中的一步。然后,解码器预测一个两倍长度的序列。 由于原来的目标序列长度可能是奇数,数据整理器会确保将批次的最大长度调整为 2 的倍数。 ```py data_collator = TTSDataCollatorWithPadding(processor=processor) ``` ## 训练模型 从与处理器相同的检查点加载预训练模型: ```py from transformers import SpeechT5ForTextToSpeech model = SpeechT5ForTextToSpeech.from_pretrained(checkpoint) ``` `use_cache=True` 选项与梯度检查点不兼容。我们在训练时禁用这个选项,并在生成时重新启用缓存以加快推理: ```py from functools import partial # 在训练时禁用缓存 model.config.use_cache = False # 设置语言和任务准备推理,并重新启用缓存 model.generate = partial(model.generate, use_cache=True) ``` 定义训练参数。这里我们在训练过程中不计算任何评估指标,我们将在本章稍后讨论评估。这里,我们先只关注损失函数: ```python from transformers import Seq2SeqTrainingArguments training_args = Seq2SeqTrainingArguments( output_dir="speecht5_finetuned_voxpopuli_nl", # 改成您选择的仓库名 per_device_train_batch_size=4, gradient_accumulation_steps=8, learning_rate=1e-5, warmup_steps=500, max_steps=4000, gradient_checkpointing=True, fp16=True, evaluation_strategy="steps", per_device_eval_batch_size=2, save_steps=1000, eval_steps=1000, logging_steps=25, report_to=["tensorboard"], load_best_model_at_end=True, greater_is_better=False, label_names=["labels"], push_to_hub=True, ) ``` 实例化 `Trainer` 对象并将模型、数据集和数据整理器传递给它。 ```py from transformers import Seq2SeqTrainer trainer = Seq2SeqTrainer( args=training_args, model=model, train_dataset=dataset["train"], 
eval_dataset=dataset["test"], data_collator=data_collator, tokenizer=processor, ) ``` 有了这个,我们就准备开始训练了!训练将花费几个小时。由于 GPU 不同,当您开始训练时,可能会遇到 CUDA 报“out-of-memory”(显存不足)的错误。这时,您可以尝试将 `per_device_train_batch_size` 两倍两倍地减少,并将 `gradient_accumulation_steps` 增加到两倍以补偿。 ```py trainer.train() ``` 将最终的模型上传到 🤗 Hub: ```py trainer.push_to_hub() ``` ## 推理 一旦您微调了一个模型,您就可以使用它进行推理!从 🤗 Hub 加载模型(记得在以下代码片段中使用您的账号名): ```py model = SpeechT5ForTextToSpeech.from_pretrained( "您的账号/speecht5_finetuned_voxpopuli_nl" ) ``` 选择一个示例,这里我们将从测试数据集中取一个。获取说话人嵌入。 ```py example = dataset["test"][304] speaker_embeddings = torch.tensor(example["speaker_embeddings"]).unsqueeze(0) ``` 定义一些输入文本并对它进行分词。 ```py text = "hallo allemaal, ik praat nederlands. groetjes aan iedereen!" ``` 预处理输入文本: ```py inputs = processor(text=text, return_tensors="pt") ``` 实例化一个声码器并生成语音: ```py from transformers import SpeechT5HifiGan vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan") speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder) ``` 准备好听结果了吗? ```py from IPython.display import Audio Audio(speech.numpy(), rate=16000) ``` 用这个模型在新语言上获得的满意结果可能很有挑战性。说话人嵌入的质量可能是一个重要因素。由于 SpeechT5 是使用英语 X-Vector 预训练的,它在使用英语说话人嵌入时表现最佳。如果合成的语音听起来效果不好,尝试使用不同的说话人嵌入。 增加训练时长也可能提高结果的质量。但即便不继续训练,语音也显然是荷兰语而不是英语,并且它确实学到了说话人的声音特征(与示例中的原始音频相比较)。另一个可以试验的是模型的配置。例如,尝试使用 `config.reduction_factor = 1` 来看是否能改善结果。 在下一节中,我们将讨论如何评估语音合成模型。
hf_public_repos/audio-transformers-course/chapters/ko/_toctree.yml
- title: 0단원. 코스에 오신 것을 환영합니다!
  sections:
  - local: chapter0/introduction
    title: 이 코스에서 기대할 수 있는 것들
  - local: chapter0/get_ready
    title: 준비하기
  - local: chapter0/community
    title: 커뮤니티에 참여하기

- title: 1단원. 오디오 데이터 다루기
  sections:
  - local: chapter1/introduction
    title: 학습할 내용들
  - local: chapter1/audio_data
    title: 오디오 데이터에 대하여
  - local: chapter1/load_and_explore
    title: 오디오 데이터셋 불러오기 및 탐색하기
  - local: chapter1/preprocessing
    title: 오디오 데이터 전처리하기
  - local: chapter1/streaming
    title: 오디오 데이터 스트리밍하기
  - local: chapter1/quiz
    title: 퀴즈
    quiz: 1
  - local: chapter1/supplemental_reading
    title: 참고자료들

- title: 2단원. 오디오의 응용에 대한 소개
  sections:
  - local: chapter2/introduction
    title: 오디오의 응용 개요
  - local: chapter2/audio_classification_pipeline
    title: 파이프라인을 이용한 오디오 분류
  - local: chapter2/asr_pipeline
    title: 파이프라인을 이용한 자동 음성 인식
  - local: chapter2/hands_on
    title: 실습 과제

- title: 3단원. 오디오용 트랜스포머 아키텍처
  sections:
  - local: chapter3/introduction
    title: 트랜스포머 모델 돌아보기
  - local: chapter3/ctc
    title: CTC 아키텍처
  - local: chapter3/seq2seq
    title: Seq2Seq 아키텍처
  - local: chapter3/classification
    title: 오디오 분류 아키텍처
  - local: chapter3/quiz
    title: 퀴즈
    quiz: 3
  - local: chapter3/supplemental_reading
    title: 보충자료 및 리소스

- title: 코스 이벤트
  sections:
  - local: events/introduction
    title: 라이브 세션과 워크샵
hf_public_repos/audio-transformers-course/chapters/ko/chapter0/get_ready.mdx
# 코스 준비하기[[get-ready-to-take-the-course]] 코스에 대한 기대가 크신가요? 이 페이지는 여러분이 바로 시작하실 수 있도록 준비를 도와드립니다. ## 1단계. 등록하기[[step-1-sign-up]] 모든 업데이트와 소셜 이벤트에 대한 최신 소식을 받아보려면 코스에 등록하세요. [👉 등록하기](http://eepurl.com/insvcI) ## 2단계. Hugging Face 계정 만들기[[step-2-get-a-hugging-face-account]] 아직 허깅페이스 계정이 없다면, 계정을 만드세요(무료입니다). 실습과제 완료, 수료 인증, 사전학습 모델 탐색, 데이터셋에 접근 등을 위해 필요합니다. [👉 HUGGING FACE 계정 생성](https://huggingface.co/join) ## 3단계. 기초지식 점검하기(필요한 경우)[[step-3-brush-up-on-fundamentals-if-you-need-to]] 우린 여러분이 딥러닝과 트랜스포머에 대해 대략적으로 이해를 하고 있다고 가정합니다. 트랜스포머에 대한 이해가 필요하다면 우리의 [NLP 코스](https://huggingface.co/course/chapter1/1)를 참고하세요. ## 4단계. 설정 확인하기[[step-4-check-your-setup]] 코스 자료를 보기 위해서는 다음이 필요합니다: - 인터넷 연결이 가능한 컴퓨터 - 실습과제를 위한 [Google Colab](https://colab.research.google.com). 무료버전이면 충분합니다. Google Colab을 사용해본적이 없으시다면, 이 [공식 소개 노트북](https://colab.research.google.com/notebooks/intro.ipynb)을 참고하세요. ## 5단계. 커뮤니티 참여하기[[step-5-join-the-community]] 동료 수강생들과 아이디어를 공유하고 허깅페이스팀과 연락할 수 있는 디스코드 서버에 가입하세요. [👉 디스코드 참여](http://hf.co/join/discord) 이 디스코드 커뮤니티에 대해 더 알아보고 싶으시다면 [다음 페이지](community)를 참고하세요.
hf_public_repos/audio-transformers-course/chapters/ko/chapter0/introduction.mdx
# 허깅페이스 오디오 코스에 오신것을 환영합니다![[welcome-to-the-hugging-face-audio-course]] 학습자 여러분, 트랜스포머 모델의 오디오 분야 적용에 대한 코스에 오신것을 환영합니다. 트랜스포머는 자연어 처리, 컴퓨터 비전, 최근에는 오디오 처리에 이르기까지 다양한 작업에서 최고의 성능을 달성하는 가장 강력하고 다재다능한 딥러닝 아키텍처 중 하나입니다. 이 코스에서는 트랜스포머를 오디오 데이터에 적용하는 방법을 살펴볼 것입니다. 여러분은 이를 사용하여 다양한 오디오 작업을 처리하는 방법을 배우게 됩니다. 음성 인식, 오디오 분류, 텍스트에서 음성 생성 같은 문제에 관심이 있다면 트랜스포머와 이 코스를 통해 해결할 수 있을것입니다. 이 모델로 어떤 작업이 가능한지 보여주기 위해 아래 데모를 준비했습니다. 데모에서 짧게 말한 후 실시간으로 받아쓰는 것을 확인해보세요! <iframe src="https://openai-whisper.hf.space" frameborder="0" width="850" height="450"> </iframe> 코스를 진행하면서 여러분은 오디오 데이터작업의 세부사항들과 다양한 트랜스포머 아키텍처에 대해 배우고, 사전학습된 모델을 활용하여 여러분만의 오디오 트랜스포머를 훈련시킬 것입니다. 이 코스는 딥러닝에 대한 배경지식이 있고 트랜스포머에 대해 어느 정도 친숙한 학습자를 대상으로 설계되었습니다. 오디오 데이터 처리에 대한 전문지식은 필요하지 않습니다. 트랜스포머에 대한 이해가 필요하다면, 트랜스포머의 기초에 대한 저희의 [NLP 코스](https://huggingface.co/course/chapter1/1)를 참고하세요. ## 코스 팀 소개[[meet-the-course-team]] **Sanchit Gandhi, Machine Learning Research Engineer at Hugging Face** 안녕하세요! 저는 Sanchit이고, 허깅페이스🤗의 오픈 소스 팀에서 오디오 분야의 기계 학습 리서치 엔지니어로 일하고 있습니다. 저의 주요 연구 분야는 자동 음성 인식과 번역으로, 음성 모델을 더 빠르고, 가볍고, 사용하기 쉽게 만드는 것을 목표로 하고 있습니다. **Matthijs Hollemans, Machine Learning Engineer at Hugging Face** 안녕하세요, 저는 Matthijs입니다. 저는 허깅페이스의 오픈 소스 팀에서 오디오 분야의 기계 학습 엔지니어로 일하고 있습니다. 또한 사운드 신디사이저를 작성하는 방법에 대한 책의 저자이며, 여가 시간에 오디오 플러그인을 만듭니다. **Maria Khalusova, Documentation & Courses at Hugging Face** 저는 Maria입니다. 트랜스포머와 기타 오픈 소스 도구를 더욱 접근하기 쉽게 만들기 위해 교육 콘텐츠와 문서를 만듭니다. 복잡한 기술 개념을 세분화하여 사람들이 최첨단 기술을 시작하는데 도움을 줍니다. **Vaibhav Srivastav, ML Developer Advocate Engineer at Hugging Face** 저는 Vaibhav(VB)이고, 허깅페이스의 오픈 소스 팀에서 오디오 분야의 Developer Advocate 엔지니어로 일하고 있습니다. 저자원으로 텍스트를 음성으로 변환하는 연구를 하고 있으며, 최첨단 음성 연구를 대중에게 전달하는데 도움을 주고 있습니다. ## 코스 구성[[course-structure]] 이 코스는 다양한 주제를 심도 있게 다루는 여러 단원으로 구성되어 있습니다: * 1단원: 오디오 처리 및 데이터 준비 등 오디오 데이터를 다루는 방법을 배웁니다. * 2단원: 오디오의 응용방법을 알아보고, 오디오 분류 및 음성 인식과 같은 다양한 작업을 위해 🤗 트랜스포머 파이프라인을 사용하는 방법을 배웁니다. * 3단원: 오디오 트랜스포머 아키텍처를 탐구하고, 그 차이를 배우며, 어떤 작업에 가장 적합한지 알아봅니다. * 4단원: 여러분만의 음악 장르 분류기를 만듭니다. * 5단원: 음성 인식에 대해 더 자세히 알아보고, 회의 녹음을 위한 모델을 만듭니다. * 6단원: 텍스트에서 음성을 생성하는 방법을 배웁니다. * 7단원: 트랜스포머를 이용하여 오디오에서 다른 오디오로 바꾸는 법을 배웁니다. 각 단원에는 기본 개념과 기술에 대해 깊이 있는 이해를 얻을 수 있는 이론적인 구성 요소가 포함되어 있습니다. 코스 전반에 걸쳐 여러분의 지식을 테스트하고 학습을 도와줄 퀴즈를 제공하며, 일부 장에는 배운 내용을 적용해 볼 수 있는 실습과제들(hands-on exercises)도 포함되어 있습니다. 이 코스를 마치면 여러분은 트랜스포머를 활용한 오디오 데이터 처리에 대한 탄탄한 기초를 갖추게 되며, 다양한 오디오 관련 작업에 이 기술을 적용할 수 있게될 것입니다. 코스의 단원들은 다음과 같은 게시일정에 따라 순차적으로 공개될 예정입니다: | 단원 | 출시일 | |---|-----------------| | 0단원, 1단원, 2단원 | 2023년 6월 14일 | | 3단원, 4단원 | 2023년 6월 21일 | | 5단원 | 2023년 6월 28일 | | 6단원 | 2023년 7월 5일 | | 7단원, 8단원 | 2023년 7월 12일 | [//]: # (| Bonus Unit | TBD |) ## 학습 경로 및 인증[[learning-paths-and-certification]] 이 코스를 수강하는 데 옳거나 그른 방법은 없습니다. 이 코스의 모든 자료는 100% 무료로 공개되며 오픈 소스입니다. 여러분은 자유롭게 진도를 나갈 수 있지만, 단원 순서대로 진행하는 것을 권장합니다. 코스 완료 시 인증을 받고 싶다면, 두 가지 옵션이 있습니다: | 인증 유형 | 요구 사항 | |---|-------------------------------------------------------------------------------------| | Certificate of completion | 2023년 7월 말까지 지침에 따라 실습과제의 80%를 완료하세요. | | Certificate of honors | 2023년 7월 말까지 지침에 따라 실습과제의 100%를 완료하세요. | 각각의 실습과제들에 완료 기준이 써있습니다. 인증을 받을 수 있을정도로 실습과제들을 충분히 풀었다면, 코스의 마지막 단원을 참조하여 인증서를 취득하는 방법을 알아보세요. 행운을 빕니다! ## 코스 등록하기[[sign-up-to-the-course]] 이 코스의 단원들은 몇 주에 걸쳐 점진적으로 공개될 예정입니다. 새로운 단원이 출시될때 놓치지 않도록 코스 업데이트에 등록하시는 것을 권유드립니다. 코스 업데이트에 등록한 사용자는 저희가 주최예정인 특별한 소셜 이벤트에 대해서도 가장 먼저 알게 됩니다. [등록하기](http://eepurl.com/insvcI) 즐거운 학습 되세요!
hf_public_repos/audio-transformers-course/chapters/ko/chapter0/community.mdx
# 커뮤니티에 참여해보세요![[join-the-community]] [활발하고 지원이 풍부한 우리의 디스코드 커뮤니티](http://hf.co/join/discord)에 여러분을 초대합니다. 여러분은 이곳에서 같은 생각을 가진 학습자들을 만나고, 아이디어를 교환하며, 실습 과제에 대한 소중한 피드백을 받으실 수 있습니다. 질문을 하고, 자료를 공유하며 다른 사람들과 협력을 해보세요. 우리 팀도 디스코드에서 활동하고 있으며 여러분께 지원과 안내를 해드립니다. 커뮤니티에 가입하는 것은 참여적이고 동기를 부여받을 수 있게 해주며 소통을 유지할 수 있는 훌륭한 방법입니다. 여러분을 커뮤니티에서 만나 뵙기를 기대합니다! ## 디스코드가 뭔가요?[[what-is-discord]] 디스코드는 무료 채팅 플랫폼입니다. 슬랙을 써보신적 있으시다면 그것과 비슷하다고 생각하시면 됩니다. 허깅페이스 디스코드 서버는 18,000명의 AI 전문가, 학습자 및 애호가로 구성된 활발한 커뮤니티로, 여러분도 참여하실 수 있습니다. ## 디스코드 탐색하기[[navigating-discord]] 디스코드 서버에 가입하시면 왼쪽의 `#role-assignment`를 클릭하여 관심있는 주제를 선택하셔야 합니다. 주제는 원하시는 만큼 선택하실 수 있으며 다른 학습자들과 같이하기 위해선 반드시 "ML for Audio and Speech"를 클릭하셔야 합니다. 채널을 살펴보고 `#introduce-yourself`에서 여러분을 소개해보세요. ## 오디오 코스 채널[[audio-course-channels]] 우리의 디스코드 서버에는 다양한 주제의 채널들이 있습니다. 논문에 대한 토론, 이벤트 꾸리기, 프로젝트와 아이디어 공유, 브레인스토밍 등 다양한 활동을 찾아보실 수 있습니다. 다음의 채널들은 오디오 코스 학습을 위해 관련이 있는 채널들입니다: * `#audio-announcements`: 코스 업데이트, 허깅페이스의 오디오와 관련된 모든 뉴스들, 이벤트 공지 등을 전합니다. * `#audio-study-group`: 아이디어를 교환하고 코스에 대한 질문과 토론을 합니다. * `#audio-discuss`: 오디오와 관련된 일반적인 토론을 합니다. `#audio-study-group` 외에도 자유롭게 자신의 학습 그룹을 만들어보세요. 함께 배우면 더 쉽습니다!
hf_public_repos/audio-transformers-course/chapters/ko/events/introduction.mdx
# 라이브 세션과 워크샵[[live-sessions-and-workshops]] 새 오디오 트랜스포머 코스: Paige Bailey(DeepMind), 김석환(Amazon Alexa AI), Brian McFee(Librosa)가 함께하는 라이브 런칭 이벤트 <Youtube id="wqkKResXWB8"/>
hf_public_repos/audio-transformers-course/chapters/ko/chapter2/asr_pipeline.mdx
# 파이프라인을 이용한 자동 음성 인식[[automatic-speech-recognition-with-a-pipeline]] 자동 음성 인식(ASR)은 음성 오디오 녹음을 텍스트로 변환하는 작업입니다. 이 작업은 매우 다양하게 실용적으로 쓰일 수가 있습니다. 비디오 자막 생성부터 Siri나 Alexa같은 가상 비서의 음성명령에 이르기까지요. 이번 섹션에선 이전의 MINDS-14 데이터셋의 청구서 지불 방법에 대해 묻는 사람의 음성 녹음을 `automatic-speech-recognition` 파이프라인을 이용해 텍스트로 변환하는 방법을 알아보겠습니다. 시작을 위해 데이터를 준비해야 합니다. 아직 준비하지 않았다면, [Audio classification with a pipeline](introduction.mdx)에서 했던것처럼 데이터셋을 불러오고 16 kHz로 업샘플링을 해주세요. 오디오 녹음을 텍스트로 바꾸기 위해 🤗 Transformers의 `automatic-speech-recognition` 파이프라인을 이용합니다. 파이프라인을 인스턴스화(instantiate) 해보겠습니다: ```py from transformers import pipeline asr = pipeline("automatic-speech-recognition") ``` 다음으로, 데이터셋에서 원시 데이터를 불러와 파이프라인에 넘겨봅시다: ```py example = minds[0] asr(example["audio"]["array"]) ``` **Output:** ```out {"text": "I WOULD LIKE TO PAY MY ELECTRICITY BILL USING MY COD CAN YOU PLEASE ASSIST"} ``` 이 출력과 실제값을 비교해보겠습니다: ```py example["english_transcription"] ``` **Output:** ```out "I would like to pay my electricity bill using my card can you please assist" ``` 모델이 오디오를 텍스트로 바꾸는 일을 꽤 잘 해낸것 같습니다! 실제 텍스트와 비교했을때 한 단어("card")만을 틀렸을 뿐입니다. 화자가 호주식 억양인 것을 고려할 때 꽤 괜찮은 결과로 볼 수 있습니다(호주식 억양에선 "r"이 종종 묵음입니다). 그렇긴 하지만, 전기요금을 물고기("cod"는 영어로 대구를 뜻합니다)로 낼 것을 권장하지는 않습니다! 기본적으로 이 파이프라인은 영어의 자동 음성 인식을 위해 학습된 모델을 씁니다. 이 예제에서는 괜찮지만 여러분이 만약 MINDS-14의 다른 언어에 대해 텍스트 변환을 시도해보고 싶으시다면 [🤗 Hub](https://huggingface.co/models?pipeline_tag=automatic-speech-recognition&language=fr&sort=downloads)에서 사전학습된 ASR 모델을 찾아보실 수 있습니다. 모델 리스트에서 작업순으로 필터링을 먼저하고 언어에 대해 필터링을 할 수 있습니다. 마음에 드는 모델을 찾으셨다면, 파이프라인의 `model` 인수(argument)로 넘겨 쓰면 됩니다. 이를 이용해 MINDS-14의 독일어 부분을 다뤄 보겠습니다. "de-DE" 부분을 불러봅시다: ```py from datasets import load_dataset from datasets import Audio minds = load_dataset("PolyAI/minds14", name="de-DE", split="train") minds = minds.cast_column("audio", Audio(sampling_rate=16_000)) ``` 예제를 하나 선택해 텍스트가 어떻게 나와야하는지 확인해봅시다: ```py example = minds[0] example["transcription"] ``` **Output:** ```out "ich möchte gerne Geld auf mein Konto einzahlen" ``` 🤗 Hub에서 독일어를 위해 사전학습된 ASR 모델을 찾아 파이프라인을 인스턴스화한 후 이 예제에 적용시켜봅시다: ```py from transformers import pipeline asr = pipeline("automatic-speech-recognition", model="maxidl/wav2vec2-large-xlsr-german") asr(example["audio"]["array"]) ``` **Output:** ```out {"text": "ich möchte gerne geld auf mein konto einzallen"} ``` 역시나, stimmt's! 여러분이 작업을 시작할 때, 이번 단원에서 보신것처럼 간단한 파이프라인으로 시작해보는것은 여러 장점이 있습니다: - 여러분의 문제를 해결할 사전학습된 모델이 이미 있을 수 있습니다. 많은 시간을 아끼실 수 있을겁니다. - `pipeline()`은 여러분을 위해 전처리 및 후처리를 대신 해줍니다. 따라서 여러분은 데이터 형식을 모델에 맞추는것에 대해 걱정하지 않으셔도 됩니다. - 결과가 이상적이지 않더라도 하나의 기준점을 빠르게 제시해줍니다. - 여러분이 커스텀 데이터에 맞춰 모델을 파인튜닝하고 허브에 올린다면 `pipeline()` 메소드를 이용해 모든 커뮤니티가 이를 쉽게 쓸 수 있어 AI를 더욱 사용하기 쉽게 만듭니다.
hf_public_repos/audio-transformers-course/chapters/ko/chapter2/audio_classification_pipeline.mdx
# 파이프라인을 이용한 오디오 분류[[audio-classification-with-a-pipeline]] 오디오 분류는 녹음된 오디오 내용에 기반하여 하나 혹은 여러개의 레이블을 할당하는 작업입니다. 이 레이블은 음악, 음성, 노이즈 같은 카테고리에 해당할 수도 있고, 새소리나 차 엔진 소리처럼 더 구체적인 카테고리일 수도 있습니다. 인기 있는 오디오 트랜스포머 모델들이 세부적으로 어떻게 작동하는지, 커스텀 모델을 어떻게 파인튜닝하는지 등을 알아보기 전에 🤗 Transformers를 이용하여 단 몇줄의 코드만으로 사전학습된 모델을 오디오 분류에 쓰는 법을 알아봅시다. 이전 단원에서 사용했던 [MINDS-14](https://huggingface.co/datasets/PolyAI/minds14) 데이터셋을 쓰겠습니다. 기억하시다시피, MINDS-14는 사람들이 인터넷뱅킹 시스템에 대해 전화로 여러 언어와 방언으로 묻는 것이 녹음돼있습니다. 각 녹음에는 `intent_class`가 있으며 이를 이용해 녹음들을 전화의 의도에 따라 분류할 수 있습니다. 파이프라인을 써보기 위해 이전과 마찬가지로 데이터의 `en-AU` 부분을 가져와 모델이 요구하는 16 kHz 샘플링 속도를 가지도록 업샘플링 해봅시다. ```py from datasets import load_dataset from datasets import Audio minds = load_dataset("PolyAI/minds14", name="en-AU", split="train") minds = minds.cast_column("audio", Audio(sampling_rate=16_000)) ``` 🤗 Transformers의 `audio-classification` 파이프라인을 사용하면 녹음된 오디오를 클래스 집합으로 분류할 수 있습니다. 우리의 경우, MINDS-14 데이터셋의 의도 분류를 위해 파인튜닝된 모델이 필요합니다. 운좋게도, 바로 그럴때 쓰이는 모델이 허브에 있습니다! `pipeline()` 함수를 써서 이를 불러보겠습니다: ```py from transformers import pipeline classifier = pipeline( "audio-classification", model="anton-l/xtreme_s_xlsr_300m_minds14", ) ``` 이 파이프라인은 오디오 데이터로 넘파이 배열을 요구합니다. 원시 오디오 데이터의 모든 전처리는 편리하게도 파이프라인이 해결해줍니다. 한 예를 봅시다: ```py example = minds[0] ``` 데이터셋의 구조를 기억하신다면, 원시 오디오 데이터가 `["audio"]["array"]` 아래에 넘파이 배열로 저장돼있는걸 기억하실겁니다. 그대로 `classifier`에 넘겨봅시다: ```py classifier(example["audio"]["array"]) ``` **Output:** ```out [ {"score": 0.9631525278091431, "label": "pay_bill"}, {"score": 0.02819698303937912, "label": "freeze"}, {"score": 0.0032787492964416742, "label": "card_issues"}, {"score": 0.0019414445850998163, "label": "abroad"}, {"score": 0.0008378693601116538, "label": "high_value_payment"}, ] ``` 모델은 전화하는 사람이 청구서의 지불 방법에 대해 묻고 있다고 매우 확신하고 있습니다. 실제 레이블은 어떤지 확인해봅시다: ```py id2label = minds.features["intent_class"].int2str id2label(example["intent_class"]) ``` **Output:** ```out "pay_bill" ``` 만세! 예측값이 맞았습니다! 운 좋게도 우리는 필요한 레이블을 정확하게 분류할 수 있는 모델을 찾을 수 있었습니다. 그러나 분류 작업을 다루는 많은 경우에는 사전학습된 모델의 클래스가 우리가 바라는 분류 클래스와 일치하지 않습니다. 이와 같은 경우, 여러분은 사전학습된 모델을 "보정(calibrate)"하여 여러분의 클래스 레이블에 맞출 수 있습니다. 이후 단원에서 이를 어떻게 하는지 배우게 됩니다. 이제 음성 처리에서 매우 일반적인 작업인 _자동 음성 인식_에 대해 살펴봅시다.
hf_public_repos/audio-transformers-course/chapters/ko/chapter2/introduction.mdx
# 2단원. 오디오의 응용에 대한 소개[[unit-2-a-gentle-introduction-to-audio-applications]] 허깅페이스 오디오 코스의 두번째 단원에 오신것을 환영합니다! 지금까지는 오디오 데이터의 기본 개념을 살펴보고 🤗 Datasets과 🤗 Transformers 라이브러리를 활용해 오디오 데이터셋을 처리하는 방법을 배웠습니다. 또한 샘플링 속도, 진폭, 비트뎁스, 파형, 스펙트로그램, 사전학습된 모델을 위해 데이터를 전처리하는 방법에 관하여도 살펴봤습니다. 이 시점에서 여러분은 🤗 Transformers로 처리할 수 있는 오디오 작업들에 관해 배우고 싶으실 것이며 이에 필요한 기초 지식은 모두 갖추셨을 것입니다. 몇 가지 놀라운 오디오 작업 예제들을 살펴봅시다: * **오디오 분류(Audio classification)**: 오디오 클립을 쉽게 다른 카테고리들로 분류합니다. 녹음된 소리가 개가 짖는 소리인지 고양이가 우는 소리인지를 구분한다거나, 노래가 어떤 음악 장르에 속하는지 등을 판별합니다. * **자동 음성 인식(Automatic speech recognition)**: 오디오 클립에서 자동으로 자막을 만듭니다. "오늘 하루 어때요?"와 같이 누군가가 말하는 녹음 내용을 텍스트로 변환할 수 있습니다. 메모를 할 때 상당히 유용합니다! * **화자 구분(Speaker diarization)**: 녹음에서 누가 말하고 있는지 궁금했던 적이 있나요? 🤗 Transformers를 사용하면 오디오 클립의 어느 시점에 누가 말하는지를 구분할 수 있습니다. "Alice"와 "Bob" 두 사람의 대화 녹음에서 그들을 구분할 수 있다고 상상해 보세요. * **텍스트 음성 변환(Text to speech)**: 텍스트의 나레이션을 만들어 오디오북을 만들거나 접근성(accessibility)을 향상시킬 수도 있고 게임의 NPC에게 목소리를 부여할 수도 있습니다. 🤗 Transformers를 사용하면 쉬운 일입니다! 이번 단원에서는 🤗 Transformers의 `pipeline()` 함수를 사용하여 이런 작업들에 사전학습된 모델을 쓰는 법을 알아보겠습니다. 특히, 사전학습된 모델이 오디오 분류와 자동 음성 인식에 어떻게 쓰이는지를 살펴보겠습니다. 시작해봅시다!
7
0
hf_public_repos/audio-transformers-course/chapters/ko
hf_public_repos/audio-transformers-course/chapters/ko/chapter2/hands_on.mdx
# Hands-on exercise[[hands-on-exercise]]

This exercise is not graded; its purpose is simply to help you become familiar with the tools and libraries you'll be using throughout the rest of the course. If you are already experienced with Google Colab, 🤗 Datasets, librosa, and 🤗 Transformers, feel free to skip it.

1. Create a [Google Colab](https://colab.research.google.com) notebook.
2. Use 🤗 Datasets to load the train split of the [`facebook/voxpopuli` dataset](https://huggingface.co/datasets/facebook/voxpopuli) in a language of your choice, in streaming mode.
3. Get the third example from the `train` part of the dataset and explore it. Given the features this example has, what kinds of audio tasks do you think you could use this dataset for?
4. Plot this example's waveform and spectrogram.
5. Go to the [🤗 Hub](https://huggingface.co/models), explore the pretrained models, and pick an automatic speech recognition model for the language you chose. Instantiate the corresponding pipeline and transcribe the audio example.
6. Compare the transcription you get from the pipeline with the transcription provided with the example.

If you get stuck on this exercise, feel free to take a peek at an [example solution](https://colab.research.google.com/drive/1NGyo5wFpRj8TMfZOIuPaJHqyyXCITftc?usp=sharing). Discovered something interesting? Found a cool model? Got a beautiful spectrogram? Feel free to share your work and discoveries on Twitter!

In the next chapters you'll learn more about various audio transformer architectures and train your own model!
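Before you head there: if you are unsure where to start with steps 2–4, the sketch below shows one possible approach. It assumes the English (`en`) configuration of VoxPopuli and uses librosa for the plots; adapt the language, the example index, and the plotting choices to your own preferences.

```py
from itertools import islice

import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np
from datasets import load_dataset

# Step 2: stream the train split so the full dataset doesn't need to be downloaded
voxpopuli = load_dataset("facebook/voxpopuli", "en", split="train", streaming=True)

# Step 3: take the third example and inspect its features
example = next(islice(iter(voxpopuli), 2, 3))
print(example.keys())

array = example["audio"]["array"]
sampling_rate = example["audio"]["sampling_rate"]

# Step 4: plot the waveform...
plt.figure().set_figwidth(12)
librosa.display.waveshow(array, sr=sampling_rate)

# ...and a log-magnitude spectrogram
stft = librosa.stft(array)
spectrogram_db = librosa.amplitude_to_db(np.abs(stft), ref=np.max)
plt.figure().set_figwidth(12)
librosa.display.specshow(spectrogram_db, sr=sampling_rate, x_axis="time", y_axis="hz")
plt.colorbar()
```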
8
0
hf_public_repos/audio-transformers-course/chapters/ko
hf_public_repos/audio-transformers-course/chapters/ko/chapter3/supplemental_reading.mdx
# Supplemental reading and resources[[supplemental-reading-and-resources]]

If you'd like to learn more about the various transformer architectures and their many applications in speech processing, check out this paper:

### Transformers in Speech Processing: A Survey[[transformers-in-speech-processing-a-survey]]

by Siddique Latif, Aun Zaidi, Heriberto Cuayahuitl, Fahad Shamshad, Moazzam Shoukat, Junaid Qadir

"The remarkable success of transformers in the field of natural language processing has sparked the interest of the speech-processing community, leading to an exploration of their potential for modeling long-range dependencies within speech sequences. Recently, transformers have gained prominence across various speech-related domains, including automatic speech recognition, speech synthesis, speech translation, speech paralinguistics, speech enhancement, spoken dialogue systems, and numerous multimodal applications. In this paper, we present a comprehensive survey that aims to bridge research studies from diverse subfields within speech technology. By consolidating findings from across the speech technology landscape, we provide a valuable resource for researchers interested in harnessing the power of transformers to advance the field. We identify the challenges transformers face in speech processing while also offering insights into potential solutions to address these issues."

[arxiv.org/abs/2303.11607](https://arxiv.org/abs/2303.11607)
9
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/utils/memory.py
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """ A collection of utilities for ensuring that training can always occur. Heavily influenced by the [toma](https://github.com/BlackHC/toma) library. """ import functools import gc import importlib import inspect import warnings import torch from packaging import version from .imports import ( is_cuda_available, is_ipex_available, is_mlu_available, is_mps_available, is_musa_available, is_npu_available, is_xpu_available, ) from .versions import compare_versions def clear_device_cache(garbage_collection=False): """ Clears the device cache by calling `torch.{backend}.empty_cache`. Can also run `gc.collect()`, but do note that this is a *considerable* slowdown and should be used sparingly. """ if garbage_collection: gc.collect() if is_xpu_available(): torch.xpu.empty_cache() elif is_mlu_available(): torch.mlu.empty_cache() elif is_musa_available(): torch.musa.empty_cache() elif is_npu_available(): torch.npu.empty_cache() elif is_mps_available(min_version="2.0"): torch.mps.empty_cache() elif is_cuda_available(): torch.cuda.empty_cache() def release_memory(*objects): """ Releases memory from `objects` by setting them to `None` and calls `gc.collect()` and `torch.cuda.empty_cache()`. Returned objects should be reassigned to the same variables. Args: objects (`Iterable`): An iterable of objects Returns: A list of `None` objects to replace `objects` Example: ```python >>> import torch >>> from accelerate.utils import release_memory >>> a = torch.ones(1000, 1000).cuda() >>> b = torch.ones(1000, 1000).cuda() >>> a, b = release_memory(a, b) ``` """ if not isinstance(objects, list): objects = list(objects) for i in range(len(objects)): objects[i] = None clear_device_cache(garbage_collection=True) return objects def should_reduce_batch_size(exception: Exception) -> bool: """ Checks if `exception` relates to CUDA out-of-memory, XPU out-of-memory, CUDNN not supported, or CPU out-of-memory Args: exception (`Exception`): An exception """ _statements = [ "CUDA out of memory.", # CUDA OOM "XPU out of memory.", # XPU OOM "cuDNN error: CUDNN_STATUS_NOT_SUPPORTED.", # CUDNN SNAFU "DefaultCPUAllocator: can't allocate memory", # CPU OOM ] if isinstance(exception, RuntimeError) and len(exception.args) == 1: return any(err in exception.args[0] for err in _statements) return False def find_executable_batch_size(function: callable = None, starting_batch_size: int = 128): """ A basic decorator that will try to execute `function`. If it fails from exceptions related to out-of-memory or CUDNN, the batch size is cut in half and passed to `function` `function` must take in a `batch_size` parameter as its first argument. Args: function (`callable`, *optional*): A function to wrap starting_batch_size (`int`, *optional*): The batch size to try and fit into memory Example: ```python >>> from accelerate.utils import find_executable_batch_size >>> @find_executable_batch_size(starting_batch_size=128) ... 
def train(batch_size, model, optimizer): ... ... >>> train(model, optimizer) ``` """ if function is None: return functools.partial(find_executable_batch_size, starting_batch_size=starting_batch_size) batch_size = starting_batch_size def decorator(*args, **kwargs): nonlocal batch_size clear_device_cache(garbage_collection=True) params = list(inspect.signature(function).parameters.keys()) # Guard against user error if len(params) < (len(args) + 1): arg_str = ", ".join([f"{arg}={value}" for arg, value in zip(params[1:], args[1:])]) raise TypeError( f"Batch size was passed into `{function.__name__}` as the first argument when called." f"Remove this as the decorator already does so: `{function.__name__}({arg_str})`" ) while True: if batch_size == 0: raise RuntimeError("No executable batch size found, reached zero.") try: return function(batch_size, *args, **kwargs) except Exception as e: if should_reduce_batch_size(e): clear_device_cache(garbage_collection=True) batch_size //= 2 else: raise return decorator def get_xpu_available_memory(device_index: int): if is_ipex_available(): ipex_version = version.parse(importlib.metadata.version("intel_extension_for_pytorch")) if compare_versions(ipex_version, ">=", "2.5"): from intel_extension_for_pytorch.xpu import mem_get_info return mem_get_info(device_index)[0] warnings.warn( "The XPU `mem_get_info` API is available in IPEX version >=2.5. The current returned available memory is incorrect. Please consider upgrading your IPEX version." ) return torch.xpu.max_memory_allocated(device_index)
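# ---------------------------------------------------------------------------
# Illustrative usage sketch (an editor's addition, not part of the original
# accelerate module): `find_executable_batch_size` decorates a function whose
# first argument is `batch_size`; if the call raises an out-of-memory error,
# the batch size is halved and the function is retried. The tiny model below
# is a placeholder purely for demonstration.
if __name__ == "__main__":
    import torch.nn as nn

    @find_executable_batch_size(starting_batch_size=64)
    def toy_training_step(batch_size):
        # Rebuild everything that depends on the batch size inside the wrapped
        # function, so every retry starts from a clean state.
        model = nn.Linear(16, 2)
        optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
        inputs = torch.randn(batch_size, 16)
        targets = torch.randint(0, 2, (batch_size,))
        loss = nn.functional.cross_entropy(model(inputs), targets)
        loss.backward()
        optimizer.step()
        print(f"ran one step with batch_size={batch_size}")

    # Called without the batch size argument: the decorator supplies it.
    toy_training_step()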
0
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/utils/megatron_lm.py
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import argparse import math import os from abc import ABC from functools import partial import torch import torch.nn.functional as F from torch.nn import BCEWithLogitsLoss, CrossEntropyLoss, MSELoss from torch.nn.parallel.distributed import DistributedDataParallel as torchDDP from ..optimizer import AcceleratedOptimizer from ..scheduler import AcceleratedScheduler from .imports import is_megatron_lm_available from .operations import recursively_apply, send_to_device if is_megatron_lm_available(): from megatron.core import mpu, tensor_parallel from megatron.core.distributed import DistributedDataParallel as LocalDDP from megatron.core.distributed import finalize_model_grads from megatron.core.enums import ModelType from megatron.core.num_microbatches_calculator import get_num_microbatches from megatron.core.optimizer import get_megatron_optimizer from megatron.core.parallel_state import get_tensor_model_parallel_group, get_tensor_model_parallel_src_rank from megatron.core.pipeline_parallel import get_forward_backward_func from megatron.core.utils import get_model_config from megatron.inference.text_generation.communication import broadcast_int_list, broadcast_tensor from megatron.inference.text_generation.generation import ( beam_search_and_return_on_first_stage, generate_tokens_probs_and_return_on_first_stage, ) from megatron.legacy.data.dataset_utils import build_train_valid_test_datasets from megatron.legacy.model import BertModel, Float16Module, GPTModel, T5Model from megatron.legacy.model.classification import Classification from megatron.training import ( get_args, get_tensorboard_writer, get_tokenizer, print_rank_last, ) from megatron.training.arguments import ( _add_data_args, _add_validation_args, core_transformer_config_from_args, parse_args, validate_args, ) from megatron.training.checkpointing import load_args_from_checkpoint, load_checkpoint, save_checkpoint from megatron.training.global_vars import set_global_variables from megatron.training.initialize import ( _compile_dependencies, _init_autoresume, _initialize_distributed, _set_random_seed, set_jit_fusion_options, write_args_to_tensorboard, ) from megatron.training.tokenizer.tokenizer import _vocab_size_with_padding from megatron.training.training import ( build_train_valid_test_data_iterators, get_optimizer_param_scheduler, num_floating_point_operations, setup_model_and_optimizer, train_step, training_log, ) from megatron.training.utils import ( average_losses_across_data_parallel_group, calc_params_l2_norm, get_ltor_masks_and_position_ids, unwrap_model, ) # model utilities def model_provider_func(pre_process=True, post_process=True, add_encoder=True, add_decoder=True): """Build the model.""" args = get_args() mode = "pre-training" if args.pretraining_flag else "fine-tuning" if args.rank == 0: print(f"Building {args.model_type_name} model in the {mode} mode.") print( "The Megatron LM model weights are initialized at random in 
`accelerator.prepare`. " "Please use `accelerator.load_checkpoint` to load a pre-trained checkpoint matching the distributed setup." ) config = core_transformer_config_from_args(args) if args.model_type_name == "bert": if args.pretraining_flag: num_tokentypes = 2 if args.bert_binary_head else 0 model = BertModel( config=config, num_tokentypes=num_tokentypes, add_binary_head=args.bert_binary_head, parallel_output=True, pre_process=pre_process, post_process=post_process, ) else: model = Classification( config=config, num_classes=args.num_labels, num_tokentypes=2, pre_process=pre_process, post_process=post_process, ) elif args.model_type_name == "gpt": model = GPTModel( config=config, num_tokentypes=0, parallel_output=True, pre_process=pre_process, post_process=post_process, ) elif args.model_type_name == "t5": model = T5Model( config=config, num_tokentypes=0, parallel_output=True, pre_process=pre_process, post_process=post_process, add_encoder=add_encoder, add_decoder=add_decoder, ) else: raise ValueError(f"Unsupported model type: {args.model_type_name}") return model def prepare_model_optimizer_scheduler(accelerator): accelerator.print("Preparing model optimizer scheduler") args = get_args() if accelerator.state.megatron_lm_plugin.custom_prepare_model_function is not None: if accelerator.state.megatron_lm_plugin.custom_model_provider_function is None: raise ValueError( "You must provide a `custom_model_provider_function` when using a `custom_prepare_model_function`." ) custom_model_provider_func = accelerator.state.megatron_lm_plugin.custom_model_provider_function model = accelerator.state.megatron_lm_plugin.custom_prepare_model_function(custom_model_provider_func) optimizer = prepare_optimizer(accelerator, model) scheduler = prepare_scheduler(accelerator, optimizer, scheduler=None) else: model_type = ModelType.encoder_or_decoder if args.model_type_name == "t5": model_type = ModelType.encoder_and_decoder model_provider_func_ = model_provider_func if accelerator.state.megatron_lm_plugin.custom_model_provider_function is not None: model_provider_func_ = accelerator.state.megatron_lm_plugin.custom_model_provider_function (model, optimizer, scheduler) = setup_model_and_optimizer( model_provider_func_, model_type, no_wd_decay_cond=args.no_wd_decay_cond, scale_lr_cond=args.scale_lr_cond, lr_mult=args.lr_mult, ) args.model_len = len(model) return model, optimizer, scheduler # dataloader utilities class MegatronLMDummyDataLoader: """ Dummy dataloader presents model parameters or param groups, this is primarily used to follow conventional training Args: **dataset_kwargs: Megatron data arguments. 
""" def __init__(self, **dataset_kwargs): parser = argparse.ArgumentParser() parser = _add_data_args(parser) parser = _add_validation_args(parser) data_args = parser.parse_known_args() self.dataset_args = vars(data_args[0]) self.dataset_args.update(dataset_kwargs) self.dataset_args["megatron_dataset_flag"] = True def set_megatron_data_args(self): args = get_args() for key, value in self.dataset_args.items(): old_value = getattr(args, key, "") if old_value != value: print( f"WARNING: MegatronLMDummyDataLoader overriding arguments for " f"{key}:{old_value} with {key}:{value}" ) setattr(args, key, value) def get_train_valid_test_datasets_provider(self, accelerator): def train_valid_test_datasets_provider(train_val_test_num_samples): """Build train, valid, and test datasets.""" args = get_args() dataset_args = { "data_prefix": args.data_path if isinstance(args.data_path, (list, tuple)) else [args.data_path], "splits_string": args.split, "train_valid_test_num_samples": train_val_test_num_samples, "seed": args.seed, } if args.model_type_name == "bert": dataset_args.update( { "max_seq_length": args.seq_length, "binary_head": args.bert_binary_head, } ) elif args.model_type_name == "gpt": dataset_args.update( { "max_seq_length": args.seq_length, } ) elif args.model_type_name == "t5": dataset_args.update( { "max_seq_length": args.encoder_seq_length, "max_seq_length_dec": args.decoder_seq_length, "dataset_type": "t5", } ) else: raise ValueError(f"Unsupported model type: {args.model_type_name}") train_ds, valid_ds, test_ds = build_train_valid_test_datasets(**dataset_args) return train_ds, valid_ds, test_ds if accelerator.state.megatron_lm_plugin.custom_megatron_datasets_provider_function is not None: return accelerator.state.megatron_lm_plugin.custom_megatron_datasets_provider_function try: args = get_args() # Use '--no-use-pep517 -e' to pip install nvidia's megatron from source if args.model_type_name == "bert": from pretrain_bert import train_valid_test_datasets_provider train_valid_test_datasets_provider.is_distributed = True return train_valid_test_datasets_provider elif args.model_type_name == "gpt": from pretrain_gpt import train_valid_test_datasets_provider train_valid_test_datasets_provider.is_distributed = True return train_valid_test_datasets_provider elif args.model_type_name == "t5": from pretrain_t5 import train_valid_test_datasets_provider train_valid_test_datasets_provider.is_distributed = True return train_valid_test_datasets_provider except ImportError: pass return train_valid_test_datasets_provider def build_train_valid_test_data_iterators(self, accelerator): args = get_args() train_valid_test_dataset_provider = self.get_train_valid_test_datasets_provider(accelerator) if args.virtual_pipeline_model_parallel_size is not None: train_data_iterator = [] valid_data_iterator = [] test_data_iterator = [] for i in range(getattr(args, "model_len", 0)): mpu.set_virtual_pipeline_model_parallel_rank(i) iterators = build_train_valid_test_data_iterators(train_valid_test_dataset_provider) train_data_iterator.append(iterators[0]) valid_data_iterator.append(iterators[1]) test_data_iterator.append(iterators[2]) else: train_data_iterator, valid_data_iterator, test_data_iterator = build_train_valid_test_data_iterators( train_valid_test_dataset_provider ) return train_data_iterator, valid_data_iterator, test_data_iterator def _handle_megatron_data_iterator(accelerator, data_iterator): class DummyMegatronDataloader: def __iter__(self): return self def __next__(self): return {} is_data_iterator_empty = 
data_iterator is None is_src_data_iterator_empty = torch.tensor(is_data_iterator_empty, dtype=torch.bool, device=accelerator.device) torch.distributed.broadcast( is_src_data_iterator_empty, get_tensor_model_parallel_src_rank(), group=get_tensor_model_parallel_group() ) if not is_src_data_iterator_empty and is_data_iterator_empty: return DummyMegatronDataloader() return data_iterator def prepare_data_loader(accelerator, dataloader): accelerator.print("Preparing dataloader") args = get_args() if not args.megatron_dataset_flag: from ..data_loader import _PYTORCH_DATALOADER_KWARGS, prepare_data_loader micro_batch_size = args.micro_batch_size * args.num_micro_batches kwargs = {k: getattr(dataloader, k, _PYTORCH_DATALOADER_KWARGS[k]) for k in _PYTORCH_DATALOADER_KWARGS} if kwargs["batch_size"] is None: if isinstance(kwargs["sampler"], torch.utils.data.BatchSampler): kwargs["sampler"].batch_size = micro_batch_size else: del kwargs["sampler"] del kwargs["shuffle"] del kwargs["batch_size"] kwargs["batch_sampler"].batch_size = micro_batch_size else: del kwargs["batch_sampler"] kwargs["batch_size"] = micro_batch_size dataloader = torch.utils.data.DataLoader(dataloader.dataset, **kwargs) # split_batches: # Megatron only needs to fetch different data between different dp groups, # and does not need to split the data within the dp group. return prepare_data_loader( dataloader, accelerator.device, num_processes=mpu.get_data_parallel_world_size(), process_index=mpu.get_data_parallel_rank(), split_batches=False, put_on_device=True, rng_types=accelerator.rng_types.copy(), dispatch_batches=accelerator.dispatch_batches, ) else: if args.consumed_samples is not None: ( args.consumed_train_samples, args.consumed_valid_samples, args.consumed_test_samples, ) = args.consumed_samples else: args.consumed_train_samples, args.consumed_valid_samples, args.consumed_test_samples = 0, 0, 0 args.micro_batch_size = args.micro_batch_size * args.num_micro_batches # In order to be compatible with data in transform format, # it needs to increase the size of mbs first, # and then split the large batch data into some mbs. ( train_data_iterator, valid_data_iterator, test_data_iterator, ) = dataloader.build_train_valid_test_data_iterators(accelerator) args.micro_batch_size = args.micro_batch_size // args.num_micro_batches train_data_iterator = _handle_megatron_data_iterator( accelerator=accelerator, data_iterator=train_data_iterator ) valid_data_iterator = _handle_megatron_data_iterator( accelerator=accelerator, data_iterator=valid_data_iterator ) test_data_iterator = _handle_megatron_data_iterator(accelerator=accelerator, data_iterator=test_data_iterator) return train_data_iterator, valid_data_iterator, test_data_iterator # optimizer utilities class MegatronLMOptimizerWrapper(AcceleratedOptimizer): def __init__(self, optimizer): super().__init__(optimizer, device_placement=False, scaler=None) def zero_grad(self, set_to_none=None): pass # `model(**batch)` is doing that automatically. Therefore, it's implementation is not needed def step(self): pass # `model(**batch)` is doing that automatically. 
Therefore, it's implementation is not needed @property def step_was_skipped(self): """Whether or not the optimizer step was done, or skipped because of gradient overflow.""" return self.optimizer.skipped_iter def prepare_optimizer(accelerator, model): accelerator.print("Preparing optimizer") args = get_args() return get_megatron_optimizer(model, args.no_wd_decay_cond, args.scale_lr_cond, args.lr_mult) # scheduler utilities class MegatronLMDummyScheduler: """ Dummy scheduler presents model parameters or param groups, this is primarily used to follow conventional training loop when scheduler config is specified in the deepspeed config file. Args: optimizer (`torch.optim.optimizer.Optimizer`): The optimizer to wrap. total_num_steps (int): Total number of steps. warmup_num_steps (int): Number of steps for warmup. **kwargs (additional keyword arguments, *optional*): Other arguments. """ def __init__(self, optimizer, total_num_steps=None, warmup_num_steps=0, **kwargs): self.optimizer = optimizer self.total_num_steps = total_num_steps self.warmup_num_steps = warmup_num_steps self.kwargs = kwargs class MegatronLMSchedulerWrapper(AcceleratedScheduler): def __init__(self, scheduler, optimizers): super().__init__(scheduler, optimizers) def step(self, *args, **kwargs): return # `model(**batch)` is doing that automatically. Therefore, it's implementation is not needed def prepare_scheduler(accelerator, optimizer, scheduler): accelerator.print("Preparing scheduler") scheduler = get_optimizer_param_scheduler(optimizer) return scheduler class AbstractTrainStep(ABC): """Abstract class for batching, forward pass and loss handler.""" def __init__(self, name): super().__init__() self.name = name def get_batch_func(self, accelerator, megatron_dataset_flag): pass def get_forward_step_func(self): pass def get_loss_func(self, accelerator): pass class BertTrainStep(AbstractTrainStep): """ Bert train step class. Args: args (`argparse.Namespace`): Megatron-LM arguments. """ def __init__(self, accelerator, args): super().__init__("BertTrainStep") self.get_batch = self.get_batch_func(accelerator, args.megatron_dataset_flag) self.loss_func = self.get_loss_func(accelerator, args.pretraining_flag, args.num_labels) self.forward_step = self.get_forward_step_func(args.pretraining_flag, args.bert_binary_head) if not args.model_return_dict: self.model_output_class = None else: from transformers.modeling_outputs import SequenceClassifierOutput self.model_output_class = SequenceClassifierOutput def get_batch_func(self, accelerator, megatron_dataset_flag): def get_batch_megatron(data_iterator): """Build the batch.""" # Items and their type. keys = ["text", "types", "labels", "is_random", "loss_mask", "padding_mask"] datatype = torch.int64 # Broadcast data. if data_iterator is not None: data = next(data_iterator) else: data = None data_b = tensor_parallel.broadcast_data(keys, data, datatype) # Unpack. tokens = data_b["text"].long() types = data_b["types"].long() sentence_order = data_b["is_random"].long() loss_mask = data_b["loss_mask"].float() lm_labels = data_b["labels"].long() padding_mask = data_b["padding_mask"].long() return tokens, types, sentence_order, loss_mask, lm_labels, padding_mask def get_batch_transformer(data_iterator): """Build the batch.""" data = next(data_iterator) data = send_to_device(data, torch.cuda.current_device()) # Unpack. 
tokens = data["input_ids"].long() padding_mask = data["attention_mask"].long() if "token_type_ids" in data: types = data["token_type_ids"].long() else: types = None if "labels" in data: lm_labels = data["labels"].long() loss_mask = (data["labels"] != -100).to(torch.float) else: lm_labels = None loss_mask = None if "next_sentence_label" in data: sentence_order = data["next_sentence_label"].long() else: sentence_order = None return tokens, types, sentence_order, loss_mask, lm_labels, padding_mask if accelerator.state.megatron_lm_plugin.custom_get_batch_function is not None: return accelerator.state.megatron_lm_plugin.custom_get_batch_function if megatron_dataset_flag: try: # Use '--no-use-pep517 -e' to pip install nvidia's megatron from source from pretrain_bert import get_batch return get_batch except ImportError: pass return get_batch_megatron else: return get_batch_transformer def get_loss_func(self, accelerator, pretraining_flag, num_labels): def loss_func_pretrain(loss_mask, sentence_order, output_tensor): lm_loss_, sop_logits = output_tensor lm_loss_ = lm_loss_.float() loss_mask = loss_mask.float() lm_loss = torch.sum(lm_loss_.view(-1) * loss_mask.reshape(-1)) / loss_mask.sum() if sop_logits is not None: sop_loss = F.cross_entropy(sop_logits.view(-1, 2).float(), sentence_order.view(-1), ignore_index=-1) sop_loss = sop_loss.float() loss = lm_loss + sop_loss averaged_losses = average_losses_across_data_parallel_group([lm_loss, sop_loss]) return loss, {"lm loss": averaged_losses[0], "sop loss": averaged_losses[1]} else: loss = lm_loss averaged_losses = average_losses_across_data_parallel_group([lm_loss]) return loss, {"lm loss": averaged_losses[0]} def loss_func_finetune(labels, logits): if num_labels == 1: # We are doing regression loss_fct = MSELoss() loss = loss_fct(logits.view(-1), labels.view(-1)) elif self.num_labels > 1 and (labels.dtype in (torch.long, torch.int)): loss_fct = CrossEntropyLoss() loss = loss_fct(logits.view(-1, num_labels), labels.view(-1)) else: loss_fct = BCEWithLogitsLoss() loss = loss_fct(logits, labels) averaged_losses = average_losses_across_data_parallel_group([loss]) return loss, {"loss": averaged_losses[0]} if accelerator.state.megatron_lm_plugin.custom_loss_function is not None: return accelerator.state.megatron_lm_plugin.custom_loss_function if pretraining_flag: return loss_func_pretrain else: return loss_func_finetune def get_forward_step_func(self, pretraining_flag, bert_binary_head): def forward_step(data_iterator, model): """Forward step.""" tokens, types, sentence_order, loss_mask, labels, padding_mask = self.get_batch(data_iterator) if not bert_binary_head: types = None # Forward pass through the model. if pretraining_flag: output_tensor = model(tokens, padding_mask, tokentype_ids=types, lm_labels=labels) return output_tensor, partial(self.loss_func, loss_mask, sentence_order) else: logits = model(tokens, padding_mask, tokentype_ids=types) return logits, partial(self.loss_func, labels) return forward_step class GPTTrainStep(AbstractTrainStep): """ GPT train step class. Args: args (`argparse.Namespace`): Megatron-LM arguments. 
""" def __init__(self, accelerator, args): super().__init__("GPTTrainStep") self.get_batch = self.get_batch_func(accelerator, args.megatron_dataset_flag) self.loss_func = self.get_loss_func(accelerator) self.forward_step = self.get_forward_step_func() self.eod_token = args.padded_vocab_size - 1 if args.vocab_file is not None: tokenizer = get_tokenizer() self.eod_token = tokenizer.eod self.reset_position_ids = args.reset_position_ids self.reset_attention_mask = args.reset_attention_mask self.eod_mask_loss = args.eod_mask_loss if not args.model_return_dict: self.model_output_class = None else: from transformers.modeling_outputs import CausalLMOutputWithCrossAttentions self.model_output_class = CausalLMOutputWithCrossAttentions def get_batch_func(self, accelerator, megatron_dataset_flag): def get_batch_megatron(data_iterator): """Generate a batch""" # Items and their type. keys = ["text"] datatype = torch.int64 # Broadcast data. if data_iterator is not None: data = next(data_iterator) else: data = None data_b = tensor_parallel.broadcast_data(keys, data, datatype) # Unpack. tokens_ = data_b["text"].long() labels = tokens_[:, 1:].contiguous() tokens = tokens_[:, :-1].contiguous() # Get the masks and postition ids. attention_mask, loss_mask, position_ids = get_ltor_masks_and_position_ids( tokens, self.eod_token, self.reset_position_ids, self.reset_attention_mask, self.eod_mask_loss ) return tokens, labels, loss_mask, attention_mask, position_ids def get_batch_transformer(data_iterator): data = next(data_iterator) data = {"input_ids": data["input_ids"]} data = send_to_device(data, torch.cuda.current_device()) tokens_ = data["input_ids"].long() padding = torch.zeros((tokens_.shape[0], 1), dtype=tokens_.dtype, device=tokens_.device) + self.eod_token tokens_ = torch.concat([tokens_, padding], dim=1) labels = tokens_[:, 1:].contiguous() tokens = tokens_[:, :-1].contiguous() # Get the masks and postition ids. attention_mask, loss_mask, position_ids = get_ltor_masks_and_position_ids( tokens, self.eod_token, self.reset_position_ids, self.reset_attention_mask, True ) return tokens, labels, loss_mask, attention_mask, position_ids if accelerator.state.megatron_lm_plugin.custom_get_batch_function is not None: return accelerator.state.megatron_lm_plugin.custom_get_batch_function if megatron_dataset_flag: try: # Use '--no-use-pep517 -e' to pip install nvidia's megatron from source from pretrain_gpt import get_batch return get_batch except ImportError: pass return get_batch_megatron else: return get_batch_transformer def get_loss_func(self, accelerator): args = get_args() def loss_func(loss_mask, output_tensor): if args.return_logits: losses, logits = output_tensor else: losses = output_tensor losses = losses.float() loss_mask = loss_mask.view(-1).float() if args.context_parallel_size > 1: loss = torch.cat([torch.sum(losses.view(-1) * loss_mask).view(1), loss_mask.sum().view(1)]) torch.distributed.all_reduce(loss, group=mpu.get_context_parallel_group()) loss = loss[0] / loss[1] else: loss = torch.sum(losses.view(-1) * loss_mask) / loss_mask.sum() # Check individual rank losses are not NaN prior to DP all-reduce. if args.check_for_nan_in_loss_and_grad: global_rank = torch.distributed.get_rank() assert not loss.isnan(), ( f"Rank {global_rank}: found NaN in local forward loss calculation. " f"Device: {torch.cuda.current_device()}, node: {os.uname()[1]}" ) # Reduce loss for logging. 
averaged_loss = average_losses_across_data_parallel_group([loss]) output_dict = {"lm loss": averaged_loss[0]} if args.return_logits: output_dict.update({"logits": logits}) return loss, output_dict if accelerator.state.megatron_lm_plugin.custom_loss_function is not None: return accelerator.state.megatron_lm_plugin.custom_loss_function return loss_func def get_forward_step_func(self): def forward_step(data_iterator, model): """Forward step.""" # Get the batch. tokens, labels, loss_mask, attention_mask, position_ids = self.get_batch(data_iterator) output_tensor = model(tokens, position_ids, attention_mask, labels=labels) return output_tensor, partial(self.loss_func, loss_mask) return forward_step class T5TrainStep(AbstractTrainStep): """ T5 train step class. Args: args (`argparse.Namespace`): Megatron-LM arguments. """ def __init__(self, accelerator, args): super().__init__("T5TrainStep") self.get_batch = self.get_batch_func(accelerator, args.megatron_dataset_flag) self.loss_func = self.get_loss_func(accelerator) self.forward_step = self.get_forward_step_func() if not args.model_return_dict: self.model_output_class = None else: from transformers.modeling_outputs import Seq2SeqLMOutput self.model_output_class = Seq2SeqLMOutput @staticmethod def attn_mask_postprocess(attention_mask): # We create a 3D attention mask from a 2D tensor mask. # [b, 1, s] attention_mask_b1s = attention_mask.unsqueeze(1) # [b, s, 1] attention_mask_bs1 = attention_mask.unsqueeze(2) # [b, s, s] attention_mask_bss = attention_mask_b1s * attention_mask_bs1 # Convert attention mask to binary: extended_attention_mask = attention_mask_bss < 0.5 return extended_attention_mask @staticmethod def get_decoder_mask(seq_length, device): attention_mask = torch.tril(torch.ones((1, seq_length, seq_length), device=device)) attention_mask = attention_mask < 0.5 return attention_mask @staticmethod def get_enc_dec_mask(attention_mask, dec_seq_length, device): batch_size, _ = attention_mask.shape # We create a 3D attention mask from a 2D tensor mask. # [b, 1, s] attention_mask_b1s = attention_mask.unsqueeze(1) # [b, s, 1] attention_mask_bs1 = torch.ones((batch_size, dec_seq_length, 1), device=device) attention_mask_bss = attention_mask_bs1 * attention_mask_b1s extended_attention_mask = attention_mask_bss < 0.5 return extended_attention_mask def get_batch_func(self, accelerator, megatron_dataset_flag): def get_batch_megatron(data_iterator): """Build the batch.""" keys = ["text_enc", "text_dec", "labels", "loss_mask", "enc_mask", "dec_mask", "enc_dec_mask"] datatype = torch.int64 # Broadcast data. if data_iterator is not None: data = next(data_iterator) else: data = None data_b = tensor_parallel.broadcast_data(keys, data, datatype) # Unpack. 
tokens_enc = data_b["text_enc"].long() tokens_dec = data_b["text_dec"].long() labels = data_b["labels"].long() loss_mask = data_b["loss_mask"].float() enc_mask = data_b["enc_mask"] < 0.5 dec_mask = data_b["dec_mask"] < 0.5 enc_dec_mask = data_b["enc_dec_mask"] < 0.5 return tokens_enc, tokens_dec, loss_mask, labels, enc_mask, dec_mask, enc_dec_mask def get_batch_transformer(data_iterator): """Build the batch.""" data = next(data_iterator) data = send_to_device(data, torch.cuda.current_device()) tokens_enc = data["input_ids"].long() labels = data["labels"].long() loss_mask = (labels != -100).to(torch.float) if "decoder_input_ids" in data: tokens_dec = data["decoder_input_ids"].long() else: tokens_dec = labels.new_zeros(labels.shape, device=labels.device, dtype=torch.long) tokens_dec[..., 1:] = labels[..., :-1].clone() tokens_dec[..., 0] = 0 tokens_dec.masked_fill_(tokens_dec == -100, 0) enc_mask = T5TrainStep.attn_mask_postprocess(data["attention_mask"].long()) dec_mask = T5TrainStep.get_decoder_mask(tokens_dec.shape[1], tokens_dec.device) enc_dec_mask = T5TrainStep.get_enc_dec_mask( data["attention_mask"].long(), tokens_dec.shape[1], tokens_dec.device ) return tokens_enc, tokens_dec, loss_mask, labels, enc_mask, dec_mask, enc_dec_mask if accelerator.state.megatron_lm_plugin.custom_get_batch_function is not None: return accelerator.state.megatron_lm_plugin.custom_get_batch_function if megatron_dataset_flag: try: # Use '--no-use-pep517 -e' to pip install nvidia's megatron from source from pretrain_t5 import get_batch return get_batch except ImportError: pass return get_batch_megatron else: return get_batch_transformer def get_loss_func(self, accelerator): def loss_func(loss_mask, output_tensor): lm_loss_ = output_tensor.float() lm_loss = torch.sum(lm_loss_.view(-1) * loss_mask.reshape(-1)) / loss_mask.sum() loss = lm_loss averaged_losses = average_losses_across_data_parallel_group([lm_loss]) return loss, {"lm loss": averaged_losses[0]} if accelerator.state.megatron_lm_plugin.custom_loss_function is not None: return accelerator.state.megatron_lm_plugin.custom_loss_function return loss_func def get_forward_step_func(self): def forward_step(data_iterator, model): """Forward step.""" # Get the batch. tokens_enc, tokens_dec, loss_mask, lm_labels, enc_mask, dec_mask, enc_dec_mask = self.get_batch( data_iterator ) # Forward model lm_labels output_tensor = model( tokens_enc, tokens_dec, enc_mask, dec_mask, enc_dec_mask, tokentype_ids=None, lm_labels=lm_labels ) return output_tensor, partial(self.loss_func, loss_mask) return forward_step def finish_mpu_init(): # torch.distributed initialization args = get_args() # Pytorch distributed. _initialize_distributed() # Random seeds for reproducibility. if args.rank == 0: print(f"> setting random seeds to {args.seed} ...") _set_random_seed(args.seed, args.data_parallel_random_init) # intialize megatron setup def initialize(accelerator, extra_args_provider=None, args_defaults={}): accelerator.print("Initializing Megatron-LM") assert torch.cuda.is_available(), "Megatron requires CUDA." 
# Parse arguments args = parse_args(extra_args_provider, ignore_unknown_args=True) # Set defaults for key, value in args_defaults.items(): if getattr(args, key, None) is not None: if args.rank == 0: print( f"WARNING: overriding default arguments for " f"{key}:{getattr(args, key)} with {key}:{value}", flush=True, ) setattr(args, key, value) if args.use_checkpoint_args or args_defaults.get("use_checkpoint_args", False): assert args.load is not None, "--use-checkpoints-args requires --load argument" load_args_from_checkpoint(args) validate_args(args) # set global args, build tokenizer, and set adlr-autoresume, # tensorboard-writer, and timers. set_global_variables(args) # Megatron's MPU is the master. Complete initialization right away. finish_mpu_init() # Autoresume. _init_autoresume() # Compile dependencies. _compile_dependencies() # Set pytorch JIT layer fusion options and warmup JIT functions. set_jit_fusion_options() args = get_args() if getattr(args, "padded_vocab_size", None) is None: args.padded_vocab_size = _vocab_size_with_padding(args.orig_vocab_size, args) if args.model_type_name == "bert" and args.pretraining_flag and args.num_labels == 2: args.bert_binary_head = True else: args.bert_binary_head = False args.iteration = 0 class MegatronEngine(torch.nn.Module): """ Megatron-LM model wrapper Args: accelerator (:class:`~accelerate.Accelerator`): The accelerator object to use. model: Megatron-LM model optimizer: Megatron-LM optimizer lr_scheduler: Megatron-LM lr scheduler """ def __init__(self, accelerator, model, optimizer, scheduler): super().__init__() self.module = model self.base_model = model[0] self.optimizer = optimizer self.scheduler = scheduler args = get_args() if accelerator.state.megatron_lm_plugin.custom_train_step_class is not None: self.train_step_handler = accelerator.state.megatron_lm_plugin.custom_train_step_class( args, **accelerator.state.megatron_lm_plugin.custom_train_step_kwargs ) elif args.model_type_name == "bert": self.train_step_handler = BertTrainStep(accelerator, args) elif args.model_type_name == "gpt": self.train_step_handler = GPTTrainStep(accelerator, args) elif args.model_type_name == "t5": self.train_step_handler = T5TrainStep(accelerator, args) else: raise ValueError(f"Unsupported model type: {args.model_type_name}") self.optimizer.skipped_iter = False # Tracking loss. 
self.total_loss_dict = {} self.eval_total_loss_dict = {} self.iteration = 0 self.report_memory_flag = True self.num_floating_point_operations_so_far = 0 self.module_config = None if args.tensorboard_dir is not None: write_args_to_tensorboard() def get_module_config(self): args = get_args() config = get_model_config(self.module[0]) # Setup some training config params config.grad_scale_func = self.optimizer.scale_loss if isinstance(self.module[0], LocalDDP) and args.overlap_grad_reduce: assert config.no_sync_func is None, ( "When overlap_grad_reduce is True, config.no_sync_func must be None; " "a custom no_sync_func is not supported when overlapping grad-reduce" ) config.no_sync_func = [model_chunk.no_sync for model_chunk in self.module] if len(self.module) == 1: config.no_sync_func = config.no_sync_func[0] if args.delay_grad_reduce: config.grad_sync_func = [model_chunk.start_grad_sync for model_chunk in self.module] if len(self.module) == 1: config.grad_sync_func = config.grad_sync_func[0] if args.overlap_param_gather and args.delay_param_gather: config.param_sync_func = [ lambda x: self.optimizer.finish_param_sync(model_index, x) for model_index in range(len(self.module)) ] if len(self.module) == 1: config.param_sync_func = config.param_sync_func[0] config.finalize_model_grads_func = finalize_model_grads return config def train(self): for model_module in self.module: model_module.train() if self.module_config is None: self.module_config = self.get_module_config() self.log_eval_results() def eval(self): for model_module in self.module: model_module.eval() if self.module_config is None: self.module_config = self.get_module_config() def get_batch_data_iterator(self, batch_data): args = get_args() data_chunks = [] if len(batch_data) > 0: if args.num_micro_batches > 1: for i in range(0, args.num_micro_batches): data_chunks.append( { k: v[i * args.micro_batch_size : (i + 1) * args.micro_batch_size] for k, v in batch_data.items() } ) else: data_chunks = [batch_data] if len(self.module) > 1: batch_data_iterator = ( [iter(data_chunks) for _ in range(len(self.module))] if len(batch_data) > 0 else [None] * len(self.module) ) else: batch_data_iterator = iter(data_chunks) if len(batch_data) > 0 else None return batch_data_iterator def train_step(self, **batch_data): """ Training step for Megatron-LM Args: batch_data (:obj:`dict`): The batch data to train on. """ batch_data_iterator = self.get_batch_data_iterator(batch_data) loss_reduced, skipped_iter, grad_norm, num_zeros_in_grad = train_step( forward_step_func=self.train_step_handler.forward_step, data_iterator=batch_data_iterator, model=self.module, optimizer=self.optimizer, opt_param_scheduler=self.scheduler, config=self.module_config, ) self.optimizer.skipped_iter = skipped_iter == 1 return loss_reduced, skipped_iter, grad_norm, num_zeros_in_grad def eval_step(self, **batch_data): """ Evaluation step for Megatron-LM Args: batch_data (:obj:`dict`): The batch data to evaluate on. 
""" args = get_args() batch_data_iterator = self.get_batch_data_iterator(batch_data) forward_backward_func = get_forward_backward_func() loss_dicts = forward_backward_func( forward_step_func=self.train_step_handler.forward_step, data_iterator=batch_data_iterator, model=self.module, num_microbatches=get_num_microbatches(), seq_length=args.seq_length, micro_batch_size=args.micro_batch_size, forward_only=True, ) # Empty unused memory if args.empty_unused_memory_level >= 1: torch.cuda.empty_cache() args.consumed_valid_samples += ( mpu.get_data_parallel_world_size() * args.micro_batch_size * get_num_microbatches() ) if mpu.is_pipeline_last_stage(ignore_virtual=True): # Average loss across microbatches. loss_reduced = {} for key in loss_dicts[0]: losses_reduced_for_key = [x[key] for x in loss_dicts] if len(losses_reduced_for_key[0].shape) == 0: loss_reduced[key] = sum(losses_reduced_for_key) / len(losses_reduced_for_key) else: loss_reduced[key] = torch.concat(losses_reduced_for_key) return loss_reduced return {} def forward(self, **batch_data): # During training, we use train_step() # model(**batch_data) performs following operations by delegating it to `self.train_step`: # 1. Prepare **batch_data for Tendor, Pipeline and Model Parallelism # 2. Set grad to zero. # 3. forward pass and backward pass using Pipeline Parallelism # 4. Empty unused memory. # 5. Reduce gradients. # 6. Update parameters. # 7. Gather params when using Distributed Optimizer (Data Parallelism). # 8. Update learning rate if scheduler is specified. # 9. Empty unused memory. # 10. Average loss across microbatches and across DP ranks. # # During evaluation, we use eval_step() args = get_args() if self.module[0].training: loss_dict, skipped_iter, grad_norm, num_zeros_in_grad = self.train_step(**batch_data) self.iteration += 1 batch_size = mpu.get_data_parallel_world_size() * args.micro_batch_size * get_num_microbatches() args.consumed_train_samples += batch_size self.num_floating_point_operations_so_far += num_floating_point_operations(args, batch_size) if args.tensorboard_dir is not None: # Logging. 
loss_scale = self.optimizer.get_loss_scale().item() params_norm = None if args.log_params_norm: params_norm = calc_params_l2_norm(self.model) self.report_memory_flag = training_log( loss_dict, self.total_loss_dict, self.optimizer.param_groups[0]["lr"], self.iteration, loss_scale, self.report_memory_flag, skipped_iter, grad_norm, params_norm, num_zeros_in_grad, ) else: loss_dict = self.eval_step(**batch_data) if args.tensorboard_dir is not None: for key in loss_dict: self.eval_total_loss_dict[key] = ( self.eval_total_loss_dict.get(key, torch.cuda.FloatTensor([0.0])) + loss_dict[key] ) self.eval_total_loss_dict[key + "_num_iters"] = self.eval_total_loss_dict.get( key + "_num_iters", torch.cuda.FloatTensor([0.0]) ) + torch.cuda.FloatTensor([1.0]) loss = torch.tensor(0.0, device=torch.cuda.current_device()) for key in loss_dict: if len(loss_dict[key].shape) == 0: loss += loss_dict[key] logits = None if "logits" in loss_dict: logits = loss_dict["logits"] if self.train_step_handler.model_output_class is not None: return self.train_step_handler.model_output_class(loss=loss, logits=logits) return loss def log_eval_results(self): args = get_args() if args.tensorboard_dir is None or self.iteration == 0: return args = get_args() writer = get_tensorboard_writer() string = f"validation loss at iteration {self.iteration} | " for key in self.eval_total_loss_dict: if key.endswith("_num_iters"): continue value = self.eval_total_loss_dict[key] / self.eval_total_loss_dict[key + "_num_iters"] string += f"{key} value: {value} | " ppl = math.exp(min(20, value.item())) if args.pretraining_flag: string += f"{key} PPL: {ppl} | " if writer: writer.add_scalar(f"{key} validation", value.item(), self.iteration) if args.pretraining_flag: writer.add_scalar(f"{key} validation ppl", ppl, self.iteration) length = len(string) + 1 print_rank_last("-" * length) print_rank_last(string) print_rank_last("-" * length) self.eval_total_loss_dict = {} def save_checkpoint(self, output_dir): self.log_eval_results() args = get_args() args.save = output_dir torch.distributed.barrier() save_checkpoint( self.iteration, self.module, self.optimizer, self.scheduler, num_floating_point_operations_so_far=self.num_floating_point_operations_so_far, ) torch.distributed.barrier() def load_checkpoint(self, input_dir): args = get_args() args.load = input_dir args.consumed_train_samples = 0 args.consumed_valid_samples = 0 torch.distributed.barrier() iteration, num_floating_point_operations_so_far = load_checkpoint(self.module, self.optimizer, self.scheduler) torch.distributed.barrier() self.iteration = iteration self.num_floating_point_operations_so_far = num_floating_point_operations_so_far if args.fp16 and self.iteration == 0: self.optimizer.reload_model_params() def megatron_generate( self, inputs, attention_mask=None, max_length=None, max_new_tokens=None, num_beams=None, temperature=None, top_k=None, top_p=None, length_penalty=None, **kwargs, ): """ Generate method for GPT2 model. This method is used for inference. Supports both greedy and beam search along with sampling. Refer the Megatron-LM repo for more details Args: inputs (torch.Tensor): input ids attention_mask (torch.Tensor, optional): attention mask. Defaults to None. max_length (int, optional): max length of the generated sequence. Defaults to None. Either this or max_new_tokens should be provided. max_new_tokens (int, optional): max number of tokens to be generated. Defaults to None. Either this or max_length should be provided. 
num_beams (int, optional): number of beams to use for beam search. Defaults to None. temperature (float, optional): temperature for sampling. Defaults to 1.0. top_k (int, optional): top k tokens to consider for sampling. Defaults to 0.0. top_p (float, optional): tokens in top p probability are considered for sampling. Defaults to 0.0. length_penalty (float, optional): length penalty for beam search. Defaults to None. kwargs: additional key-value arguments """ # checking if required arguments are passed args = get_args() if args.model_type_name != "gpt": raise NotImplementedError("Generate method is not implemented for this model") if args.data_parallel_size > 1: raise ValueError("Generate method requires data parallelism to be 1") if args.sequence_parallel: raise ValueError("Generate method requires sequence parallelism to be False") if args.recompute_granularity is not None: raise ValueError("Checkpoint activations cannot be set for inference") if args.vocab_file is None: raise ValueError("Vocab file is required for inference") # Prepare inputs if max_length is None and max_new_tokens is None: raise ValueError("`max_length` or `max_new_tokens` are required for inference") if temperature is None: temperature = 1.0 elif not (0.0 < temperature <= 100.0): raise ValueError("temperature must be a positive number less than or equal to 100.0") if top_k is None: top_k = 0 elif not (0 <= top_k <= 1000): raise ValueError("top_k must be a positive number less than or equal to 1000") if top_p is None: top_p = 0.0 elif top_p > 0.0 and top_k > 0.0: raise ValueError("top_p and top_k sampling cannot be set together") else: if not (0.0 <= top_p <= 1.0): raise ValueError("top_p must be less than or equal to 1.0") top_p_decay = kwargs.get("top_p_decay", 0.0) if not (0.0 <= top_p_decay <= 1.0): raise ValueError("top_p_decay must be less than or equal to 1.0") top_p_bound = kwargs.get("top_p_bound", 0.0) if not (0.0 <= top_p_bound <= 1.0): raise ValueError("top_p_bound must be less than or equal to 1.0") add_BOS = kwargs.get("add_BOS", False) if not (isinstance(add_BOS, bool)): raise ValueError("add_BOS must be a boolean") beam_width = num_beams if beam_width is not None: if not isinstance(beam_width, int): raise ValueError("beam_width must be an integer") if beam_width < 1: raise ValueError("beam_width must be greater than 0") if inputs.shape[0] > 1: return "When doing beam_search, batch size must be 1" tokenizer = get_tokenizer() stop_token = kwargs.get("stop_token", tokenizer.eod) if stop_token is not None: if not isinstance(stop_token, int): raise ValueError("stop_token must be an integer") if length_penalty is None: length_penalty = 1.0 sizes_list = None prompts_tokens_tensor = None prompts_length_tensor = None if torch.distributed.get_rank() == 0: # Get the prompts length. 
if attention_mask is None: prompts_length_tensor = torch.cuda.LongTensor([inputs.shape[1]] * inputs.shape[0]) else: prompts_length_tensor = attention_mask.sum(axis=-1).cuda() if max_new_tokens is None: max_new_tokens = max_length - inputs.shape[1] if max_new_tokens <= 0: raise ValueError("max_new_tokens must be greater than 0") if add_BOS: max_length = max_new_tokens + inputs.shape[1] + 1 # making sure that `max_length` is a multiple of 4 to leverage fused kernels max_length = 4 * math.ceil(max_length / 4) max_new_tokens = max_length - (inputs.shape[1] + 1) padding = torch.cuda.LongTensor([[tokenizer.eod] * max_new_tokens] * inputs.shape[0]) prompts_tokens_tensor = torch.concat( [torch.unsqueeze(padding[:, 0], axis=-1), inputs.cuda(), padding], axis=-1 ) else: # making sure that `max_length` is a multiple of 4 to leverage fused kernels max_length = max_new_tokens + inputs.shape[1] max_length = 4 * math.ceil(max_length / 4) max_new_tokens = max_length - inputs.shape[1] padding = torch.cuda.LongTensor([[tokenizer.eod] * max_new_tokens] * inputs.shape[0]) prompts_tokens_tensor = torch.concat([inputs.cuda(), padding], axis=-1) # We need the sizes of these tensors for the boradcast sizes_list = [ prompts_tokens_tensor.size(0), # Batch size prompts_tokens_tensor.size(1), ] # Sequence lenght # First, broadcast the sizes. sizes_tensor = broadcast_int_list(2, int_list=sizes_list, rank=0) # Now that we have the sizes, we can boradcast the tokens # and length tensors. sizes = sizes_tensor.tolist() context_tokens_tensor = broadcast_tensor(sizes, torch.int64, tensor=prompts_tokens_tensor, rank=0) context_length_tensor = broadcast_tensor(sizes[0], torch.int64, tensor=prompts_length_tensor, rank=0) # Run the inference random_seed = kwargs.get("random_seed", 0) torch.random.manual_seed(random_seed) unwrapped_model = unwrap_model(self.base_model, (torchDDP, LocalDDP, Float16Module)) if beam_width is not None: tokens, _ = beam_search_and_return_on_first_stage( unwrapped_model, context_tokens_tensor, context_length_tensor, beam_width, stop_token=stop_token, num_return_gen=1, length_penalty=length_penalty, ) else: tokens, _, _ = generate_tokens_probs_and_return_on_first_stage( unwrapped_model, context_tokens_tensor, context_length_tensor, return_output_log_probs=False, top_k=top_k, top_p=top_p, top_p_decay=top_p_decay, top_p_bound=top_p_bound, temperature=temperature, use_eod_token_for_early_termination=True, ) return tokens # other utilities def avg_losses_across_data_parallel_group(losses): """ Average losses across data parallel group. Args: losses (List[Tensor]): List of losses to average across data parallel group. """ return average_losses_across_data_parallel_group(losses) def gather_across_data_parallel_groups(tensor): """ Recursively gather tensor in a nested list/tuple/dictionary of tensors from data parallel ranks. Args: tensor (nested list/tuple/dictionary of `torch.Tensor`): The data to gather across data parallel ranks. """ def _gpu_gather_one(tensor): if tensor.ndim == 0: tensor = tensor.clone()[None] output_tensors = [ torch.empty_like(tensor) for _ in range(torch.distributed.get_world_size(group=mpu.get_data_parallel_group())) ] torch.distributed.all_gather(output_tensors, tensor, group=mpu.get_data_parallel_group()) return torch.cat(output_tensors, dim=0) return recursively_apply(_gpu_gather_one, tensor, error_on_other_type=True)
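# ---------------------------------------------------------------------------
# Illustrative usage sketch (an editor's addition, not part of the original
# module). In a training script launched with the Megatron-LM plugin enabled,
# the dummy objects defined above stand in for a real scheduler/dataloader and
# are swapped for their Megatron-LM equivalents inside `accelerator.prepare(...)`.
# A rough, non-authoritative outline (all argument values are placeholders):
#
#     from accelerate.utils import MegatronLMDummyDataLoader, MegatronLMDummyScheduler
#
#     lr_scheduler = MegatronLMDummyScheduler(
#         optimizer=optimizer,
#         total_num_steps=max_train_steps,
#         warmup_num_steps=num_warmup_steps,
#     )
#     megatron_dataloader = MegatronLMDummyDataLoader(
#         data_path=["my-gpt2_text_document"],  # placeholder Megatron data prefix
#         seq_length=1024,
#         micro_batch_size=2,
#     )
#     model, optimizer, lr_scheduler, train_dl = accelerator.prepare(
#         model, optimizer, lr_scheduler, megatron_dataloader
#     )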
1
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/utils/deepspeed.py
# Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import base64 import json import os from copy import deepcopy from ..optimizer import AcceleratedOptimizer from ..scheduler import AcceleratedScheduler from .dataclasses import DistributedType def get_active_deepspeed_plugin(state): """ Returns the currently active DeepSpeedPlugin. Raises: ValueError: If DeepSpeed was not enabled and this function is called. """ if state.distributed_type != DistributedType.DEEPSPEED: raise ValueError( "Couldn't retrieve the active `DeepSpeedPlugin` as none were enabled. " "Please make sure that either `Accelerator` is configured for `deepspeed` " "or make sure that the desired `DeepSpeedPlugin` has been enabled (`AcceleratorState().select_deepspeed_plugin(name)`) " "before calling this function." ) if not isinstance(state.deepspeed_plugins, dict): return state.deepspeed_plugins return next(plugin for plugin in state.deepspeed_plugins.values() if plugin.selected) class HfDeepSpeedConfig: """ This object contains a DeepSpeed configuration dictionary and can be quickly queried for things like zero stage. A `weakref` of this object is stored in the module's globals to be able to access the config from areas where things like the Trainer object is not available (e.g. `from_pretrained` and `_get_resized_embeddings`). Therefore it's important that this object remains alive while the program is still running. [`Trainer`] uses the `HfTrainerDeepSpeedConfig` subclass instead. That subclass has logic to sync the configuration with values of [`TrainingArguments`] by replacing special placeholder values: `"auto"`. Without this special logic the DeepSpeed configuration is not modified in any way. Args: config_file_or_dict (`Union[str, Dict]`): path to DeepSpeed config file or dict. """ def __init__(self, config_file_or_dict): if isinstance(config_file_or_dict, dict): # Don't modify user's data should they want to reuse it (e.g. in tests), because once we # modified it, it will not be accepted here again, since `auto` values would have been overridden config = deepcopy(config_file_or_dict) elif os.path.exists(config_file_or_dict): with open(config_file_or_dict, encoding="utf-8") as f: config = json.load(f) else: try: config_decoded = base64.urlsafe_b64decode(config_file_or_dict).decode("utf-8") config = json.loads(config_decoded) except (UnicodeDecodeError, AttributeError, ValueError): raise ValueError( f"Expected a string path to an existing deepspeed config, or a dictionary, or a base64 encoded string. Received: {config_file_or_dict}" ) self.config = config self.set_stage_and_offload() def set_stage_and_offload(self): # zero stage - this is done as early as possible, before model is created, to allow # ``is_deepspeed_zero3_enabled`` query and getting to the early deepspeed config object # during ``zero.Init()`` which needs to know the dtype, and some other hparams. 
self._stage = self.get_value("zero_optimization.stage", -1) # offload self._offload = False if self.is_zero2() or self.is_zero3(): offload_devices_valid = set(["cpu", "nvme"]) offload_devices = set( [ self.get_value("zero_optimization.offload_optimizer.device"), self.get_value("zero_optimization.offload_param.device"), ] ) if len(offload_devices & offload_devices_valid) > 0: self._offload = True def find_config_node(self, ds_key_long): config = self.config # find the config node of interest if it exists nodes = ds_key_long.split(".") ds_key = nodes.pop() for node in nodes: config = config.get(node) if config is None: return None, ds_key return config, ds_key def get_value(self, ds_key_long, default=None): """ Returns the set value or `default` if no value is set """ config, ds_key = self.find_config_node(ds_key_long) if config is None: return default return config.get(ds_key, default) def del_config_sub_tree(self, ds_key_long, must_exist=False): """ Deletes a sub-section of the config file if it's found. Unless `must_exist` is `True` the section doesn't have to exist. """ config = self.config # find the config node of interest if it exists nodes = ds_key_long.split(".") for node in nodes: parent_config = config config = config.get(node) if config is None: if must_exist: raise ValueError(f"Can't find {ds_key_long} entry in the config: {self.config}") else: return # if found remove it if parent_config is not None: parent_config.pop(node) def is_true(self, ds_key_long): """ Returns `True`/``False` only if the value is set, always `False` otherwise. So use this method to ask the very specific question of whether the value is set to `True` (and it's not set to `False`` or isn't set). """ value = self.get_value(ds_key_long) return False if value is None else bool(value) def is_false(self, ds_key_long): """ Returns `True`/``False` only if the value is set, always `False` otherwise. So use this method to ask the very specific question of whether the value is set to `False` (and it's not set to `True`` or isn't set). """ value = self.get_value(ds_key_long) return False if value is None else not bool(value) def is_zero2(self): return self._stage == 2 def is_zero3(self): return self._stage == 3 def is_offload(self): return self._offload class DeepSpeedEngineWrapper: """ Internal wrapper for deepspeed.runtime.engine.DeepSpeedEngine. This is used to follow conventional training loop. Args: engine (deepspeed.runtime.engine.DeepSpeedEngine): deepspeed engine to wrap """ def __init__(self, engine): self.engine = engine def backward(self, loss, **kwargs): # runs backpropagation and handles mixed precision self.engine.backward(loss, **kwargs) # Deepspeed's `engine.step` performs the following operations: # - gradient accumulation check # - gradient clipping # - optimizer step # - zero grad # - checking overflow # - lr_scheduler step (only if engine.lr_scheduler is not None) self.engine.step() # and this plugin overrides the above calls with no-ops when Accelerate runs under # Deepspeed, but allows normal functionality for non-Deepspeed cases thus enabling a simple # training loop that works transparently under many training regimes. class DeepSpeedOptimizerWrapper(AcceleratedOptimizer): """ Internal wrapper around a deepspeed optimizer. Args: optimizer (`torch.optim.optimizer.Optimizer`): The optimizer to wrap. 
""" def __init__(self, optimizer): super().__init__(optimizer, device_placement=False, scaler=None) self.__has_overflow__ = hasattr(self.optimizer, "overflow") def zero_grad(self, set_to_none=None): pass # `accelerator.backward(loss)` is doing that automatically. Therefore, its implementation is not needed def step(self): pass # `accelerator.backward(loss)` is doing that automatically. Therefore, its implementation is not needed @property def step_was_skipped(self): """Whether or not the optimizer step was done, or skipped because of gradient overflow.""" if self.__has_overflow__: return self.optimizer.overflow return False class DeepSpeedSchedulerWrapper(AcceleratedScheduler): """ Internal wrapper around a deepspeed scheduler. Args: scheduler (`torch.optim.lr_scheduler.LambdaLR`): The scheduler to wrap. optimizers (one or a list of `torch.optim.Optimizer`): """ def __init__(self, scheduler, optimizers): super().__init__(scheduler, optimizers) def step(self): pass # `accelerator.backward(loss)` is doing that automatically. Therefore, its implementation is not needed class DummyOptim: """ Dummy optimizer presents model parameters or param groups, this is primarily used to follow conventional training loop when optimizer config is specified in the deepspeed config file. Args: lr (float): Learning rate. params (iterable): iterable of parameters to optimize or dicts defining parameter groups weight_decay (float): Weight decay. **kwargs (additional keyword arguments, *optional*): Other arguments. """ def __init__(self, params, lr=0.001, weight_decay=0, **kwargs): self.params = params self.lr = lr self.weight_decay = weight_decay self.kwargs = kwargs class DummyScheduler: """ Dummy scheduler presents model parameters or param groups, this is primarily used to follow conventional training loop when scheduler config is specified in the deepspeed config file. Args: optimizer (`torch.optim.optimizer.Optimizer`): The optimizer to wrap. total_num_steps (int, *optional*): Total number of steps. warmup_num_steps (int, *optional*): Number of steps for warmup. lr_scheduler_callable (callable, *optional*): A callable function that creates an LR Scheduler. It accepts only one argument `optimizer`. **kwargs (additional keyword arguments, *optional*): Other arguments. """ def __init__(self, optimizer, total_num_steps=None, warmup_num_steps=0, lr_scheduler_callable=None, **kwargs): self.optimizer = optimizer self.total_num_steps = total_num_steps self.warmup_num_steps = warmup_num_steps self.lr_scheduler_callable = lr_scheduler_callable self.kwargs = kwargs
2
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/test_utils/examples.py
#!/usr/bin/env python # Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """ A collection of utilities for comparing `examples/complete_*_example.py` scripts with the capabilities inside of each `examples/by_feature` example. `compare_against_test` is the main function that should be used when testing, while the others are used to either get the code that matters, or to preprocess them (such as stripping comments) """ import os from typing import List def get_function_contents_by_name(lines: List[str], name: str): """ Extracts a function from `lines` of segmented source code with the name `name`. Args: lines (`List[str]`): Source code of a script seperated by line. name (`str`): The name of the function to extract. Should be either `training_function` or `main` """ if name != "training_function" and name != "main": raise ValueError(f"Incorrect function name passed: {name}, choose either 'main' or 'training_function'") good_lines, found_start = [], False for line in lines: if not found_start and f"def {name}" in line: found_start = True good_lines.append(line) continue if found_start: if name == "training_function" and "def main" in line: return good_lines if name == "main" and "if __name__" in line: return good_lines good_lines.append(line) def clean_lines(lines: List[str]): """ Filters `lines` and removes any entries that start with a comment ('#') or is just a newline ('\n') Args: lines (`List[str]`): Source code of a script seperated by line. """ return [line for line in lines if not line.lstrip().startswith("#") and line != "\n"] def compare_against_test(base_filename: str, feature_filename: str, parser_only: bool, secondary_filename: str = None): """ Tests whether the additional code inside of `feature_filename` was implemented in `base_filename`. This should be used when testing to see if `complete_*_.py` examples have all of the implementations from each of the `examples/by_feature/*` scripts. It utilizes `nlp_example.py` to extract out all of the repeated training code, so that only the new additional code is examined and checked. If something *other* than `nlp_example.py` should be used, such as `cv_example.py` for the `complete_cv_example.py` script, it should be passed in for the `secondary_filename` parameter. Args: base_filename (`str` or `os.PathLike`): The filepath of a single "complete" example script to test, such as `examples/complete_cv_example.py` feature_filename (`str` or `os.PathLike`): The filepath of a single feature example script. The contents of this script are checked to see if they exist in `base_filename` parser_only (`bool`): Whether to compare only the `main()` sections in both files, or to compare the contents of `training_loop()` secondary_filename (`str`, *optional*): A potential secondary filepath that should be included in the check. 
This function extracts the base functionalities off of "examples/nlp_example.py", so if `base_filename` is a script other than `complete_nlp_example.py`, the template script should be included here. Such as `examples/cv_example.py` """ with open(base_filename) as f: base_file_contents = f.readlines() with open(os.path.abspath(os.path.join("examples", "nlp_example.py"))) as f: full_file_contents = f.readlines() with open(feature_filename) as f: feature_file_contents = f.readlines() if secondary_filename is not None: with open(secondary_filename) as f: secondary_file_contents = f.readlines() # This is our base, we remove all the code from here in our `full_filename` and `feature_filename` to find the new content if parser_only: base_file_func = clean_lines(get_function_contents_by_name(base_file_contents, "main")) full_file_func = clean_lines(get_function_contents_by_name(full_file_contents, "main")) feature_file_func = clean_lines(get_function_contents_by_name(feature_file_contents, "main")) if secondary_filename is not None: secondary_file_func = clean_lines(get_function_contents_by_name(secondary_file_contents, "main")) else: base_file_func = clean_lines(get_function_contents_by_name(base_file_contents, "training_function")) full_file_func = clean_lines(get_function_contents_by_name(full_file_contents, "training_function")) feature_file_func = clean_lines(get_function_contents_by_name(feature_file_contents, "training_function")) if secondary_filename is not None: secondary_file_func = clean_lines( get_function_contents_by_name(secondary_file_contents, "training_function") ) _dl_line = "train_dataloader, eval_dataloader = get_dataloaders(accelerator, batch_size)\n" # Specific code in our script that differs from the full version, aka what is new new_feature_code = [] passed_idxs = [] # We keep track of the idxs just in case it's a repeated statement it = iter(feature_file_func) for i in range(len(feature_file_func) - 1): if i not in passed_idxs: line = next(it) if (line not in full_file_func) and (line.lstrip() != _dl_line): if "TESTING_MOCKED_DATALOADERS" not in line: new_feature_code.append(line) passed_idxs.append(i) else: # Skip over the `config['num_epochs'] = 2` statement _ = next(it) # Extract out just the new parts from the full_file_training_func new_full_example_parts = [] passed_idxs = [] # We keep track of the idxs just in case it's a repeated statement for i, line in enumerate(base_file_func): if i not in passed_idxs: if (line not in full_file_func) and (line.lstrip() != _dl_line): if "TESTING_MOCKED_DATALOADERS" not in line: new_full_example_parts.append(line) passed_idxs.append(i) # Finally, get the overall diff diff_from_example = [line for line in new_feature_code if line not in new_full_example_parts] if secondary_filename is not None: diff_from_two = [line for line in full_file_contents if line not in secondary_file_func] diff_from_example = [line for line in diff_from_example if line not in diff_from_two] return diff_from_example
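# --- Editor's addition: a hedged usage sketch, not part of the upstream module ---
# It shows how `compare_against_test` above is typically driven; the two example paths are
# assumptions borrowed from the `examples/` layout this module describes, not paths this
# sketch verifies.
def _example_compare_against_test():
    complete_script = os.path.join("examples", "complete_nlp_example.py")
    feature_script = os.path.join("examples", "by_feature", "checkpointing.py")
    # An empty list means every line the feature script adds on top of `nlp_example.py`
    # also appears in the "complete" script.
    return compare_against_test(complete_script, feature_script, parser_only=False)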
3
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/test_utils/testing.py
# Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import asyncio import inspect import io import os import shutil import subprocess import sys import tempfile import unittest from contextlib import contextmanager from functools import partial from pathlib import Path from typing import List, Union from unittest import mock import torch import accelerate from ..state import AcceleratorState, PartialState from ..utils import ( gather, is_bnb_available, is_clearml_available, is_comet_ml_available, is_cuda_available, is_datasets_available, is_deepspeed_available, is_dvclive_available, is_import_timer_available, is_mlu_available, is_mps_available, is_musa_available, is_npu_available, is_pandas_available, is_pippy_available, is_schedulefree_available, is_tensorboard_available, is_timm_available, is_torch_version, is_torch_xla_available, is_torchdata_stateful_dataloader_available, is_torchvision_available, is_transformer_engine_available, is_transformers_available, is_triton_available, is_wandb_available, is_xpu_available, str_to_bool, ) def get_backend(): if is_torch_xla_available(): return "xla", torch.cuda.device_count(), torch.cuda.memory_allocated elif is_cuda_available(): return "cuda", torch.cuda.device_count(), torch.cuda.memory_allocated elif is_mps_available(min_version="2.0"): return "mps", 1, torch.mps.current_allocated_memory elif is_mps_available(): return "mps", 1, lambda: 0 elif is_mlu_available(): return "mlu", torch.mlu.device_count(), torch.mlu.memory_allocated elif is_musa_available(): return "musa", torch.musa.device_count(), torch.musa.memory_allocated elif is_npu_available(): return "npu", torch.npu.device_count(), torch.npu.memory_allocated elif is_xpu_available(): return "xpu", torch.xpu.device_count(), torch.xpu.memory_allocated else: return "cpu", 1, lambda: 0 torch_device, device_count, memory_allocated_func = get_backend() def get_launch_command(**kwargs) -> list: """ Wraps around `kwargs` to help simplify launching from `subprocess`. Example: ```python # returns ['accelerate', 'launch', '--num_processes=2', '--device_count=2'] get_launch_command(num_processes=2, device_count=2) ``` """ command = ["accelerate", "launch"] for k, v in kwargs.items(): if isinstance(v, bool) and v: command.append(f"--{k}") elif v is not None: command.append(f"--{k}={v}") return command DEFAULT_LAUNCH_COMMAND = get_launch_command(num_processes=device_count, monitor_interval=0.1) def parse_flag_from_env(key, default=False): try: value = os.environ[key] except KeyError: # KEY isn't set, default to `default`. _value = default else: # KEY is set, convert it to True or False. try: _value = str_to_bool(value) except ValueError: # More values are supported, but let's keep the message simple. 
            raise ValueError(f"If set, {key} must be yes or no.")
    return _value


_run_slow_tests = parse_flag_from_env("RUN_SLOW", default=False)


def skip(test_case):
    "Decorator that skips a test unconditionally"
    return unittest.skip("Test was skipped")(test_case)


def slow(test_case):
    """
    Decorator marking a test as slow. Slow tests are skipped by default. Set the RUN_SLOW environment variable to a
    truthy value to run them.
    """
    return unittest.skipUnless(_run_slow_tests, "test is slow")(test_case)


def require_cpu(test_case):
    """
    Decorator marking a test that must only be run on the CPU. These tests are skipped when a GPU is available.
    """
    return unittest.skipUnless(torch_device == "cpu", "test requires only a CPU")(test_case)


def require_non_cpu(test_case):
    """
    Decorator marking a test that requires a hardware accelerator backend. These tests are skipped when there is no
    hardware accelerator available.
    """
    return unittest.skipUnless(torch_device != "cpu", "test requires a GPU")(test_case)


def require_cuda(test_case):
    """
    Decorator marking a test that requires CUDA. These tests are skipped when there is no GPU available or when
    TorchXLA is available.
    """
    return unittest.skipUnless(is_cuda_available() and not is_torch_xla_available(), "test requires a GPU")(test_case)


def require_xpu(test_case):
    """
    Decorator marking a test that requires XPU. These tests are skipped when there is no XPU available.
    """
    return unittest.skipUnless(is_xpu_available(), "test requires a XPU")(test_case)


def require_non_xpu(test_case):
    """
    Decorator marking a test that should be skipped for XPU.
    """
    return unittest.skipUnless(torch_device != "xpu", "test requires a non-XPU")(test_case)


def require_mlu(test_case):
    """
    Decorator marking a test that requires MLU. These tests are skipped when there is no MLU available.
    """
    return unittest.skipUnless(is_mlu_available(), "test requires a MLU")(test_case)


def require_musa(test_case):
    """
    Decorator marking a test that requires MUSA. These tests are skipped when there is no MUSA available.
    """
    return unittest.skipUnless(is_musa_available(), "test requires a MUSA")(test_case)


def require_npu(test_case):
    """
    Decorator marking a test that requires NPU. These tests are skipped when there is no NPU available.
    """
    return unittest.skipUnless(is_npu_available(), "test requires a NPU")(test_case)


def require_mps(test_case):
    """
    Decorator marking a test that requires MPS backend. These tests are skipped when torch doesn't support `mps`
    backend.
    """
    return unittest.skipUnless(is_mps_available(), "test requires a `mps` backend support in `torch`")(test_case)


def require_huggingface_suite(test_case):
    """
    Decorator marking a test that requires transformers and datasets. These tests are skipped when they are not.
    """
    return unittest.skipUnless(
        is_transformers_available() and is_datasets_available(),
        "test requires the Hugging Face suite",
    )(test_case)


def require_transformers(test_case):
    """
    Decorator marking a test that requires transformers. These tests are skipped when they are not.
    """
    return unittest.skipUnless(is_transformers_available(), "test requires the transformers library")(test_case)


def require_timm(test_case):
    """
    Decorator marking a test that requires timm. These tests are skipped when they are not.
    """
    return unittest.skipUnless(is_timm_available(), "test requires the timm library")(test_case)


def require_torchvision(test_case):
    """
    Decorator marking a test that requires torchvision. These tests are skipped when they are not.
""" return unittest.skipUnless(is_torchvision_available(), "test requires the torchvision library")(test_case) def require_triton(test_case): """ Decorator marking a test that requires triton. These tests are skipped when they are not. """ return unittest.skipUnless(is_triton_available(), "test requires the triton library")(test_case) def require_schedulefree(test_case): """ Decorator marking a test that requires schedulefree. These tests are skipped when they are not. """ return unittest.skipUnless(is_schedulefree_available(), "test requires the schedulefree library")(test_case) def require_bnb(test_case): """ Decorator marking a test that requires bitsandbytes. These tests are skipped when they are not. """ return unittest.skipUnless(is_bnb_available(), "test requires the bitsandbytes library")(test_case) def require_tpu(test_case): """ Decorator marking a test that requires TPUs. These tests are skipped when there are no TPUs available. """ return unittest.skipUnless(is_torch_xla_available(check_is_tpu=True), "test requires TPU")(test_case) def require_non_torch_xla(test_case): """ Decorator marking a test as requiring an environment without TorchXLA. These tests are skipped when TorchXLA is available. """ return unittest.skipUnless(not is_torch_xla_available(), "test requires an env without TorchXLA")(test_case) def require_single_device(test_case): """ Decorator marking a test that requires a single device. These tests are skipped when there is no hardware accelerator available or number of devices is more than one. """ return unittest.skipUnless(torch_device != "cpu" and device_count == 1, "test requires a hardware accelerator")( test_case ) def require_single_gpu(test_case): """ Decorator marking a test that requires CUDA on a single GPU. These tests are skipped when there are no GPU available or number of GPUs is more than one. """ return unittest.skipUnless(torch.cuda.device_count() == 1, "test requires a GPU")(test_case) def require_single_xpu(test_case): """ Decorator marking a test that requires CUDA on a single XPU. These tests are skipped when there are no XPU available or number of xPUs is more than one. """ return unittest.skipUnless(torch.xpu.device_count() == 1, "test requires a XPU")(test_case) def require_multi_device(test_case): """ Decorator marking a test that requires a multi-device setup. These tests are skipped on a machine without multiple devices. """ return unittest.skipUnless(device_count > 1, "test requires multiple hardware accelerators")(test_case) def require_multi_gpu(test_case): """ Decorator marking a test that requires a multi-GPU setup. These tests are skipped on a machine without multiple GPUs. """ return unittest.skipUnless(torch.cuda.device_count() > 1, "test requires multiple GPUs")(test_case) def require_multi_xpu(test_case): """ Decorator marking a test that requires a multi-XPU setup. These tests are skipped on a machine without multiple XPUs. """ return unittest.skipUnless(torch.xpu.device_count() > 1, "test requires multiple XPUs")(test_case) def require_deepspeed(test_case): """ Decorator marking a test that requires DeepSpeed installed. These tests are skipped when DeepSpeed isn't installed """ return unittest.skipUnless(is_deepspeed_available(), "test requires DeepSpeed")(test_case) def require_fsdp(test_case): """ Decorator marking a test that requires FSDP installed. 
These tests are skipped when FSDP isn't installed """ return unittest.skipUnless(is_torch_version(">=", "1.12.0"), "test requires torch version >= 1.12.0")(test_case) def require_torch_min_version(test_case=None, version=None): """ Decorator marking that a test requires a particular torch version to be tested. These tests are skipped when an installed torch version is less than the required one. """ if test_case is None: return partial(require_torch_min_version, version=version) return unittest.skipUnless(is_torch_version(">=", version), f"test requires torch version >= {version}")(test_case) def require_tensorboard(test_case): """ Decorator marking a test that requires tensorboard installed. These tests are skipped when tensorboard isn't installed """ return unittest.skipUnless(is_tensorboard_available(), "test requires Tensorboard")(test_case) def require_wandb(test_case): """ Decorator marking a test that requires wandb installed. These tests are skipped when wandb isn't installed """ return unittest.skipUnless(is_wandb_available(), "test requires wandb")(test_case) def require_comet_ml(test_case): """ Decorator marking a test that requires comet_ml installed. These tests are skipped when comet_ml isn't installed """ return unittest.skipUnless(is_comet_ml_available(), "test requires comet_ml")(test_case) def require_clearml(test_case): """ Decorator marking a test that requires clearml installed. These tests are skipped when clearml isn't installed """ return unittest.skipUnless(is_clearml_available(), "test requires clearml")(test_case) def require_dvclive(test_case): """ Decorator marking a test that requires dvclive installed. These tests are skipped when dvclive isn't installed """ return unittest.skipUnless(is_dvclive_available(), "test requires dvclive")(test_case) def require_pandas(test_case): """ Decorator marking a test that requires pandas installed. These tests are skipped when pandas isn't installed """ return unittest.skipUnless(is_pandas_available(), "test requires pandas")(test_case) def require_pippy(test_case): """ Decorator marking a test that requires pippy installed. These tests are skipped when pippy isn't installed """ return unittest.skipUnless(is_pippy_available(), "test requires pippy")(test_case) def require_import_timer(test_case): """ Decorator marking a test that requires tuna interpreter installed. These tests are skipped when tuna isn't installed """ return unittest.skipUnless(is_import_timer_available(), "test requires tuna interpreter")(test_case) def require_transformer_engine(test_case): """ Decorator marking a test that requires transformers engine installed. These tests are skipped when transformers engine isn't installed """ return unittest.skipUnless(is_transformer_engine_available(), "test requires transformers engine")(test_case) _atleast_one_tracker_available = ( any([is_wandb_available(), is_tensorboard_available()]) and not is_comet_ml_available() ) def require_trackers(test_case): """ Decorator marking that a test requires at least one tracking library installed. These tests are skipped when none are installed """ return unittest.skipUnless( _atleast_one_tracker_available, "test requires at least one tracker to be available and for `comet_ml` to not be installed", )(test_case) def require_torchdata_stateful_dataloader(test_case): """ Decorator marking a test that requires torchdata.stateful_dataloader. These tests are skipped when torchdata with stateful_dataloader module isn't installed. 
""" return unittest.skipUnless( is_torchdata_stateful_dataloader_available(), "test requires torchdata.stateful_dataloader" )(test_case) class TempDirTestCase(unittest.TestCase): """ A TestCase class that keeps a single `tempfile.TemporaryDirectory` open for the duration of the class, wipes its data at the start of a test, and then destroyes it at the end of the TestCase. Useful for when a class or API requires a single constant folder throughout it's use, such as Weights and Biases The temporary directory location will be stored in `self.tmpdir` """ clear_on_setup = True @classmethod def setUpClass(cls): "Creates a `tempfile.TemporaryDirectory` and stores it in `cls.tmpdir`" cls.tmpdir = Path(tempfile.mkdtemp()) @classmethod def tearDownClass(cls): "Remove `cls.tmpdir` after test suite has finished" if os.path.exists(cls.tmpdir): shutil.rmtree(cls.tmpdir) def setUp(self): "Destroy all contents in `self.tmpdir`, but not `self.tmpdir`" if self.clear_on_setup: for path in self.tmpdir.glob("**/*"): if path.is_file(): path.unlink() elif path.is_dir(): shutil.rmtree(path) class AccelerateTestCase(unittest.TestCase): """ A TestCase class that will reset the accelerator state at the end of every test. Every test that checks or utilizes the `AcceleratorState` class should inherit from this to avoid silent failures due to state being shared between tests. """ def tearDown(self): super().tearDown() # Reset the state of the AcceleratorState singleton. AcceleratorState._reset_state() PartialState._reset_state() class MockingTestCase(unittest.TestCase): """ A TestCase class designed to dynamically add various mockers that should be used in every test, mimicking the behavior of a class-wide mock when defining one normally will not do. Useful when a mock requires specific information available only initialized after `TestCase.setUpClass`, such as setting an environment variable with that information. The `add_mocks` function should be ran at the end of a `TestCase`'s `setUp` function, after a call to `super().setUp()` such as: ```python def setUp(self): super().setUp() mocks = mock.patch.dict(os.environ, {"SOME_ENV_VAR", "SOME_VALUE"}) self.add_mocks(mocks) ``` """ def add_mocks(self, mocks: Union[mock.Mock, List[mock.Mock]]): """ Add custom mocks for tests that should be repeated on each test. Should be called during `MockingTestCase.setUp`, after `super().setUp()`. 
Args: mocks (`mock.Mock` or list of `mock.Mock`): Mocks that should be added to the `TestCase` after `TestCase.setUpClass` has been run """ self.mocks = mocks if isinstance(mocks, (tuple, list)) else [mocks] for m in self.mocks: m.start() self.addCleanup(m.stop) def are_the_same_tensors(tensor): state = AcceleratorState() tensor = tensor[None].clone().to(state.device) tensors = gather(tensor).cpu() tensor = tensor[0].cpu() for i in range(tensors.shape[0]): if not torch.equal(tensors[i], tensor): return False return True class _RunOutput: def __init__(self, returncode, stdout, stderr): self.returncode = returncode self.stdout = stdout self.stderr = stderr async def _read_stream(stream, callback): while True: line = await stream.readline() if line: callback(line) else: break async def _stream_subprocess(cmd, env=None, stdin=None, timeout=None, quiet=False, echo=False) -> _RunOutput: if echo: print("\nRunning: ", " ".join(cmd)) p = await asyncio.create_subprocess_exec( cmd[0], *cmd[1:], stdin=stdin, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE, env=env, ) # note: there is a warning for a possible deadlock when using `wait` with huge amounts of data in the pipe # https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.wait # # If it starts hanging, will need to switch to the following code. The problem is that no data # will be seen until it's done and if it hangs for example there will be no debug info. # out, err = await p.communicate() # return _RunOutput(p.returncode, out, err) out = [] err = [] def tee(line, sink, pipe, label=""): line = line.decode("utf-8").rstrip() sink.append(line) if not quiet: print(label, line, file=pipe) # XXX: the timeout doesn't seem to make any difference here await asyncio.wait( [ asyncio.create_task(_read_stream(p.stdout, lambda l: tee(l, out, sys.stdout, label="stdout:"))), asyncio.create_task(_read_stream(p.stderr, lambda l: tee(l, err, sys.stderr, label="stderr:"))), ], timeout=timeout, ) return _RunOutput(await p.wait(), out, err) def execute_subprocess_async(cmd: list, env=None, stdin=None, timeout=180, quiet=False, echo=True) -> _RunOutput: # Cast every path in `cmd` to a string for i, c in enumerate(cmd): if isinstance(c, Path): cmd[i] = str(c) loop = asyncio.get_event_loop() result = loop.run_until_complete( _stream_subprocess(cmd, env=env, stdin=stdin, timeout=timeout, quiet=quiet, echo=echo) ) cmd_str = " ".join(cmd) if result.returncode > 0: stderr = "\n".join(result.stderr) raise RuntimeError( f"'{cmd_str}' failed with returncode {result.returncode}\n\n" f"The combined stderr from workers follows:\n{stderr}" ) return result class SubprocessCallException(Exception): pass def run_command(command: List[str], return_stdout=False, env=None): """ Runs `command` with `subprocess.check_output` and will potentially return the `stdout`. 
Will also properly capture if an error occured while running `command` """ # Cast every path in `command` to a string for i, c in enumerate(command): if isinstance(c, Path): command[i] = str(c) if env is None: env = os.environ.copy() try: output = subprocess.check_output(command, stderr=subprocess.STDOUT, env=env) if return_stdout: if hasattr(output, "decode"): output = output.decode("utf-8") return output except subprocess.CalledProcessError as e: raise SubprocessCallException( f"Command `{' '.join(command)}` failed with the following error:\n\n{e.output.decode()}" ) from e def path_in_accelerate_package(*components: str) -> Path: """ Get a path within the `accelerate` package's directory. Args: *components: Components of the path to join after the package directory. Returns: `Path`: The path to the requested file or directory. """ accelerate_package_dir = Path(inspect.getfile(accelerate)).parent return accelerate_package_dir.joinpath(*components) @contextmanager def assert_exception(exception_class: Exception, msg: str = None) -> bool: """ Context manager to assert that the right `Exception` class was raised. If `msg` is provided, will check that the message is contained in the raised exception. """ was_ran = False try: yield was_ran = True except Exception as e: assert isinstance(e, exception_class), f"Expected exception of type {exception_class} but got {type(e)}" if msg is not None: assert msg in str(e), f"Expected message '{msg}' to be in exception but got '{str(e)}'" if was_ran: raise AssertionError(f"Expected exception of type {exception_class} but ran without issue.") def capture_call_output(func, *args, **kwargs): """ Takes in a `func` with `args` and `kwargs` and returns the captured stdout as a string """ captured_output = io.StringIO() original_stdout = sys.stdout try: sys.stdout = captured_output func(*args, **kwargs) except Exception as e: raise e finally: sys.stdout = original_stdout return captured_output.getvalue()
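# --- Editor's addition: a hedged usage sketch, not part of the upstream module ---
# It shows how a few of the helpers above combine in a test case; the test bodies are
# placeholders, not real Accelerate tests.
class _ExampleUsageTests(unittest.TestCase):
    @slow
    def test_runs_only_when_run_slow_is_set(self):
        # Executed only when the RUN_SLOW environment variable is truthy.
        self.assertTrue(True)

    def test_assert_exception_helper(self):
        # Passes only because the expected exception type and message are raised inside.
        with assert_exception(ValueError, msg="bad value"):
            raise ValueError("a bad value was provided")

    def test_capture_call_output_helper(self):
        captured = capture_call_output(print, "hello from the sketch")
        self.assertIn("hello from the sketch", captured)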
4
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/test_utils/__init__.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from .testing import ( DEFAULT_LAUNCH_COMMAND, are_the_same_tensors, assert_exception, capture_call_output, device_count, execute_subprocess_async, get_launch_command, memory_allocated_func, path_in_accelerate_package, require_bnb, require_cpu, require_cuda, require_huggingface_suite, require_mlu, require_mps, require_multi_device, require_multi_gpu, require_multi_xpu, require_musa, require_non_cpu, require_non_torch_xla, require_non_xpu, require_npu, require_pippy, require_single_device, require_single_gpu, require_single_xpu, require_torch_min_version, require_torchvision, require_tpu, require_transformer_engine, require_xpu, skip, slow, torch_device, ) from .training import RegressionDataset, RegressionModel, RegressionModel4XPU from .scripts import test_script, test_sync, test_ops # isort: skip
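# --- Editor's addition: an illustrative note, not part of the upstream module ---
# The re-exports above are typically consumed directly from the package, e.g.:
#
#     from accelerate.test_utils import RegressionDataset, RegressionModel, slow, torch_device
#
# so test files do not need to import from the individual submodules.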
5
0
hf_public_repos/accelerate/src/accelerate
hf_public_repos/accelerate/src/accelerate/test_utils/training.py
# Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import numpy as np import torch from torch.utils.data import DataLoader from accelerate.utils.dataclasses import DistributedType class RegressionDataset: def __init__(self, a=2, b=3, length=64, seed=None): rng = np.random.default_rng(seed) self.length = length self.x = rng.normal(size=(length,)).astype(np.float32) self.y = a * self.x + b + rng.normal(scale=0.1, size=(length,)).astype(np.float32) def __len__(self): return self.length def __getitem__(self, i): return {"x": self.x[i], "y": self.y[i]} class RegressionModel4XPU(torch.nn.Module): def __init__(self, a=0, b=0, double_output=False): super().__init__() self.a = torch.nn.Parameter(torch.tensor([2, 3]).float()) self.b = torch.nn.Parameter(torch.tensor([2, 3]).float()) self.first_batch = True def forward(self, x=None): if self.first_batch: print(f"Model dtype: {self.a.dtype}, {self.b.dtype}. Input dtype: {x.dtype}") self.first_batch = False return x * self.a[0] + self.b[0] class RegressionModel(torch.nn.Module): def __init__(self, a=0, b=0, double_output=False): super().__init__() self.a = torch.nn.Parameter(torch.tensor(a).float()) self.b = torch.nn.Parameter(torch.tensor(b).float()) self.first_batch = True def forward(self, x=None): if self.first_batch: print(f"Model dtype: {self.a.dtype}, {self.b.dtype}. Input dtype: {x.dtype}") self.first_batch = False return x * self.a + self.b def mocked_dataloaders(accelerator, batch_size: int = 16): from datasets import load_dataset from transformers import AutoTokenizer tokenizer = AutoTokenizer.from_pretrained("bert-base-cased") data_files = {"train": "tests/test_samples/MRPC/train.csv", "validation": "tests/test_samples/MRPC/dev.csv"} datasets = load_dataset("csv", data_files=data_files) label_list = datasets["train"].unique("label") label_to_id = {v: i for i, v in enumerate(label_list)} def tokenize_function(examples): # max_length=None => use the model max length (it's actually the default) outputs = tokenizer( examples["sentence1"], examples["sentence2"], truncation=True, max_length=None, padding="max_length" ) if "label" in examples: outputs["labels"] = [label_to_id[l] for l in examples["label"]] return outputs # Apply the method we just defined to all the examples in all the splits of the dataset tokenized_datasets = datasets.map( tokenize_function, batched=True, remove_columns=["sentence1", "sentence2", "label"], ) def collate_fn(examples): # On TPU it's best to pad everything to the same length or training will be very slow. if accelerator.distributed_type == DistributedType.XLA: return tokenizer.pad(examples, padding="max_length", max_length=128, return_tensors="pt") return tokenizer.pad(examples, padding="longest", return_tensors="pt") # Instantiate dataloaders. 
train_dataloader = DataLoader(tokenized_datasets["train"], shuffle=True, collate_fn=collate_fn, batch_size=2) eval_dataloader = DataLoader(tokenized_datasets["validation"], shuffle=False, collate_fn=collate_fn, batch_size=1) return train_dataloader, eval_dataloader
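# --- Editor's addition: a hedged usage sketch, not part of the upstream module ---
# It fits `RegressionModel` on `RegressionDataset` for a single pass on CPU; the optimizer
# and hyperparameters are illustrative assumptions.
def _example_regression_fit():
    dataset = RegressionDataset(a=2, b=3, length=64, seed=0)
    dataloader = DataLoader(dataset, batch_size=16)
    model = RegressionModel(a=0, b=0)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    for batch in dataloader:
        optimizer.zero_grad()
        prediction = model(batch["x"])
        loss = torch.nn.functional.mse_loss(prediction, batch["y"])
        loss.backward()
        optimizer.step()
    # After a pass over the data the parameters should move toward a=2, b=3.
    return model.a.item(), model.b.item()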
6
0
hf_public_repos/accelerate/src/accelerate/test_utils
hf_public_repos/accelerate/src/accelerate/test_utils/scripts/test_ddp_comm_hook.py
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import torch from accelerate import Accelerator, DDPCommunicationHookType, DistributedDataParallelKwargs, PartialState class MockModel(torch.nn.Module): def __init__(self): super().__init__() torch.manual_seed(0) self.p = torch.nn.Parameter(torch.randn(40, 20)) def forward(self, x, rank): return self.p * (x ** (1 + rank)) def _run_and_get_grads(model, rank): torch.manual_seed(2024) input = torch.randn(40, 20) output = model(input, rank) output.mean().backward() param = next(model.parameters()) return param.grad def test_ddp_comm_hook(comm_hook, comm_wrapper, comm_state_option): ddp_kwargs = DistributedDataParallelKwargs( comm_hook=comm_hook, comm_wrapper=comm_wrapper, comm_state_option=comm_state_option, ) accelerator = Accelerator(kwargs_handlers=[ddp_kwargs]) model = accelerator.prepare(MockModel()) hook_grads = _run_and_get_grads(model, accelerator.local_process_index) reference_model = torch.nn.parallel.DistributedDataParallel( MockModel().to(accelerator.device), device_ids=[accelerator.local_process_index], output_device=accelerator.local_process_index, ) reference_grads = _run_and_get_grads(reference_model, accelerator.local_process_index) torch.testing.assert_close(hook_grads, reference_grads, rtol=1e-2, atol=1e-2) def main(): for comm_hook, comm_wrapper, comm_state_option in [ (DDPCommunicationHookType.NO, DDPCommunicationHookType.NO, {}), (DDPCommunicationHookType.FP16, DDPCommunicationHookType.NO, {}), (DDPCommunicationHookType.BF16, DDPCommunicationHookType.NO, {}), (DDPCommunicationHookType.POWER_SGD, DDPCommunicationHookType.NO, {}), (DDPCommunicationHookType.POWER_SGD, DDPCommunicationHookType.FP16, {}), (DDPCommunicationHookType.POWER_SGD, DDPCommunicationHookType.BF16, {}), (DDPCommunicationHookType.POWER_SGD, DDPCommunicationHookType.NO, {"matrix_approximation_rank": 2}), (DDPCommunicationHookType.BATCHED_POWER_SGD, DDPCommunicationHookType.NO, {}), (DDPCommunicationHookType.BATCHED_POWER_SGD, DDPCommunicationHookType.FP16, {}), (DDPCommunicationHookType.BATCHED_POWER_SGD, DDPCommunicationHookType.BF16, {}), ]: print(f"Test DDP comm hook: {comm_hook}, comm wrapper: {comm_wrapper}") test_ddp_comm_hook(comm_hook, comm_wrapper, comm_state_option) PartialState().destroy_process_group() if __name__ == "__main__": main()
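# --- Editor's addition: a hedged usage sketch, not part of the upstream test ---
# In user code the kwargs-handler pattern exercised above looks roughly like this; the
# choice of FP16 gradient compression is an illustrative assumption.
def _example_user_ddp_comm_hook_setup():
    ddp_kwargs = DistributedDataParallelKwargs(comm_hook=DDPCommunicationHookType.FP16)
    accelerator = Accelerator(kwargs_handlers=[ddp_kwargs])
    model = accelerator.prepare(MockModel())
    return accelerator, model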
7
0
hf_public_repos/accelerate/src/accelerate/test_utils
hf_public_repos/accelerate/src/accelerate/test_utils/scripts/test_sync.py
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from copy import deepcopy import torch import torch.nn.functional as F from torch.optim import AdamW from torch.optim.lr_scheduler import LambdaLR from torch.utils.data import DataLoader from accelerate.accelerator import Accelerator, DataLoaderConfiguration, GradientAccumulationPlugin from accelerate.state import GradientState from accelerate.test_utils import RegressionDataset, RegressionModel from accelerate.utils import DistributedType, set_seed def check_model_parameters(model_a, model_b, did_step, iteration, **kwargs): for param, grad_param in zip(model_a.parameters(), model_b.parameters()): if not param.requires_grad: continue if not did_step: # Grads should not be in sync assert ( torch.allclose(param.grad, grad_param.grad, **kwargs) is False ), f"Gradients in sync when they should not be at iteration {iteration}:\nmodel_a grad ({param.grad}) == model_b grad ({grad_param.grad})" else: # Grads should be in sync assert ( torch.allclose(param.grad, grad_param.grad, **kwargs) is True ), f"Gradients not in sync when they should be at iteration {iteration}:\nmodel_a grad ({param.grad}) != model_b grad ({grad_param.grad})" def step_model(model, input, target, accelerator, do_backward=True): model.train() output = model(input) loss = F.mse_loss(output, target.to(output.device)) if not do_backward: loss /= accelerator.gradient_accumulation_steps loss.backward() else: accelerator.backward(loss) def get_training_setup(accelerator, sched=False): "Returns everything needed to perform basic training" set_seed(42) model = RegressionModel() ddp_model = deepcopy(model) dset = RegressionDataset(length=80) dataloader = DataLoader(dset, batch_size=16) model.to(accelerator.device) if sched: opt = AdamW(params=model.parameters(), lr=1e-3) ddp_opt = AdamW(params=ddp_model.parameters(), lr=1e-3) sched = LambdaLR(opt, lr_lambda=lambda epoch: epoch**0.65) ddp_sched = LambdaLR(ddp_opt, lr_lambda=lambda epoch: epoch**0.65) # Make a copy of `model` if sched: ddp_model, ddp_opt, ddp_sched, dataloader = accelerator.prepare(ddp_model, ddp_opt, ddp_sched, dataloader) else: ddp_model, dataloader = accelerator.prepare(ddp_model, dataloader) if sched: return (model, opt, sched, dataloader, ddp_model, ddp_opt, ddp_sched) return model, ddp_model, dataloader def test_noop_sync(accelerator): # Test when on a single CPU or GPU that the context manager does nothing model, ddp_model, dataloader = get_training_setup(accelerator) # Use a single batch ddp_input, ddp_target = next(iter(dataloader)).values() for iteration in range(3): # Gather the distributed inputs and targs for the base model input, target = accelerator.gather((ddp_input, ddp_target)) input, target = input.to(accelerator.device), target.to(accelerator.device) # Perform our initial ground truth step in non "DDP" step_model(model, input, target, accelerator) # Do "gradient accumulation" (noop) if iteration % 2 == 0: # Accumulate grads locally with 
accelerator.no_sync(ddp_model): step_model(ddp_model, ddp_input, ddp_target, accelerator) else: # Sync grads step_model(ddp_model, ddp_input, ddp_target, accelerator) # Since `no_sync` is a noop, `ddp_model` and `model` grads should always be in sync check_model_parameters(model, ddp_model, True, iteration) for param, ddp_param in zip(model.parameters(), ddp_model.parameters()): if not param.requires_grad: continue assert torch.allclose( param.grad, ddp_param.grad ), f"Gradients not in sync when they should be:\nModel grad ({param.grad}) != DDP grad ({ddp_param.grad})" # Shuffle ddp_input on each iteration torch.manual_seed(1337 + iteration) ddp_input = ddp_input[torch.randperm(len(ddp_input))] def test_distributed_sync(accelerator): # Test on distributed setup that context manager behaves properly model, ddp_model, dataloader = get_training_setup(accelerator) # Use a single batch ddp_input, ddp_target = next(iter(dataloader)).values() for iteration in range(3): # Gather the distributed inputs and targs for the base model input, target = accelerator.gather((ddp_input, ddp_target)) input, target = input.to(accelerator.device), target.to(accelerator.device) # Perform our initial ground truth step in non "DDP" step_model(model, input, target, accelerator) # Do "gradient accumulation" (noop) if iteration % 2 == 0: # Accumulate grads locally with accelerator.no_sync(ddp_model): step_model(ddp_model, ddp_input, ddp_target, accelerator) else: # Sync grads step_model(ddp_model, ddp_input, ddp_target, accelerator) # DDP model and model should only be in sync when not (iteration % 2 == 0) for param, ddp_param in zip(model.parameters(), ddp_model.parameters()): if not param.requires_grad: continue if iteration % 2 == 0: # Grads should not be in sync assert ( torch.allclose(param.grad, ddp_param.grad) is False ), f"Gradients in sync when they should not be:\nModel grad ({param.grad}) == DDP grad ({ddp_param.grad})" else: # Grads should be in sync assert ( torch.allclose(param.grad, ddp_param.grad) is True ), f"Gradients not in sync when they should be:\nModel grad ({param.grad}) != DDP grad ({ddp_param.grad})" # Shuffle ddp_input on each iteration torch.manual_seed(1337 + iteration) ddp_input = ddp_input[torch.randperm(len(ddp_input))] def test_distributed_sync_multiple_fwd(accelerator): # Test on distributed setup that context manager behaves properly when used with multiple forwards followed by multiple backwards model, ddp_model, dataloader = get_training_setup(accelerator) # Do multiple forwards losses = [] num_iterations = 3 for iteration in range(num_iterations): ddp_input, ddp_target = next(iter(dataloader)).values() # Gather the distributed inputs and targs for the base model input, target = accelerator.gather((ddp_input, ddp_target)) input, target = input.to(accelerator.device), target.to(accelerator.device) # Perform our initial ground truth step in non "DDP" step_model(model, input, target, accelerator) # Accumulate grads locally with accelerator.no_sync(ddp_model): ddp_output = ddp_model(ddp_input) loss = F.mse_loss(ddp_output, ddp_target.to(ddp_output.device)) losses.append(loss) # Do multiple backwards and sync only at the last backward for iteration in range(num_iterations): loss = losses[iteration] if iteration < num_iterations - 1: # Accumulate grads locally accelerator.backward(loss) # DDP model and model should only be in sync after last backward for param, ddp_param in zip(model.parameters(), ddp_model.parameters()): if not param.requires_grad: continue # Grads should not be in 
sync assert ( torch.allclose(param.grad, ddp_param.grad) is False ), f"Gradients in sync when they should not be:\nModel grad ({param.grad}) == DDP grad ({ddp_param.grad})" else: # Sync grads if last backward with accelerator.trigger_sync_in_backward(ddp_model): accelerator.backward(loss) # DDP model and model should only be in sync after last backward for param, ddp_param in zip(model.parameters(), ddp_model.parameters()): if not param.requires_grad: continue # Grads should be in sync assert ( torch.allclose(param.grad, ddp_param.grad) is True ), f"Gradients not in sync when they should be:\nModel grad ({param.grad}) != DDP grad ({ddp_param.grad})" def test_gradient_accumulation(split_batches=False, dispatch_batches=False, sync_each_batch=False): gradient_accumulation_plugin = GradientAccumulationPlugin(num_steps=2, sync_each_batch=sync_each_batch) dataloader_config = DataLoaderConfiguration(split_batches=split_batches, dispatch_batches=dispatch_batches) accelerator = Accelerator( dataloader_config=dataloader_config, gradient_accumulation_plugin=gradient_accumulation_plugin, ) # Test that context manager behaves properly model, ddp_model, dataloader = get_training_setup(accelerator) for iteration, batch in enumerate(dataloader): ddp_input, ddp_target = batch.values() # Gather the distributed inputs and targs for the base model input, target = accelerator.gather((ddp_input, ddp_target)) input, target = input.to(accelerator.device), target.to(accelerator.device) # Perform our initial ground truth step in non "DDP" step_model(model, input, target, accelerator, False) # Do "gradient accumulation" (noop) with accelerator.accumulate(ddp_model): step_model(ddp_model, ddp_input, ddp_target, accelerator) # DDP model and model should only be in sync when not (iteration % 2 == 0) for param, ddp_param in zip(model.parameters(), ddp_model.parameters()): if not param.requires_grad: continue if ((iteration + 1) % 2 == 0) or (iteration == len(dataloader) - 1) or sync_each_batch: # Grads should be in sync assert ( torch.allclose(param.grad, ddp_param.grad) is True ), f"Gradients not in sync when they should be at iteration {iteration}:\nModel grad ({param.grad}) != DDP grad ({ddp_param.grad})" else: # Grads should not be in sync assert ( torch.allclose(param.grad, ddp_param.grad) is False ), f"Gradients in sync when they should not be at iteration {iteration}:\nModel grad ({param.grad}) == DDP grad ({ddp_param.grad})" # Shuffle ddp_input on each iteration torch.manual_seed(1337 + iteration) ddp_input = ddp_input[torch.randperm(len(ddp_input))] GradientState._reset_state() def test_gradient_accumulation_with_opt_and_scheduler( split_batches=False, dispatch_batches=False, sync_each_batch=False ): gradient_accumulation_plugin = GradientAccumulationPlugin(num_steps=2, sync_each_batch=sync_each_batch) dataloader_config = DataLoaderConfiguration(split_batches=split_batches, dispatch_batches=dispatch_batches) accelerator = Accelerator( dataloader_config=dataloader_config, gradient_accumulation_plugin=gradient_accumulation_plugin, ) # Test that context manager behaves properly model, opt, sched, dataloader, ddp_model, ddp_opt, ddp_sched = get_training_setup(accelerator, True) for iteration, batch in enumerate(dataloader): ddp_input, ddp_target = batch.values() # Gather the distributed inputs and targs for the base model input, target = accelerator.gather((ddp_input, ddp_target)) input, target = input.to(accelerator.device), target.to(accelerator.device) # Perform our initial ground truth step in non "DDP" 
model.train() ddp_model.train() step_model(model, input, target, accelerator, False) opt.step() if ((iteration + 1) % 2 == 0) or ((iteration + 1) == len(dataloader)): if split_batches: sched.step() else: for _ in range(accelerator.num_processes): sched.step() # Perform gradient accumulation under wrapper with accelerator.accumulate(ddp_model): step_model(ddp_model, ddp_input, ddp_target, accelerator) ddp_opt.step() ddp_sched.step() # Learning rates should be the same assert ( opt.param_groups[0]["lr"] == ddp_opt.param_groups[0]["lr"] ), f'Learning rates found in each optimizer did not align\nopt: {opt.param_groups[0]["lr"]}\nDDP opt: {ddp_opt.param_groups[0]["lr"]}\n' did_step = (((iteration + 1) % 2) == 0) or ((iteration + 1) == len(dataloader)) if accelerator.num_processes > 1: check_model_parameters( model, ddp_model, did_step or sync_each_batch, # syncs at each grad_accum interval of if sync_each_batch==True iteration, rtol=1e-3, # needs a relative tolerance due to roundoff errors ) if did_step: opt.zero_grad() # flush gradients every accum step ddp_opt.zero_grad() # Shuffle ddp_input on each iteration torch.manual_seed(1337 + iteration) GradientState._reset_state() def test_dataloader_break(): accelerator = Accelerator() first_dset = RegressionDataset(length=80) first_dataloader = DataLoader(first_dset, batch_size=16) second_dset = RegressionDataset(length=96) second_dataloader = DataLoader(second_dset, batch_size=16) first_dataloader, second_dataloader = accelerator.prepare(first_dataloader, second_dataloader) assert accelerator.gradient_state.active_dataloader is None for iteration, _ in enumerate(first_dataloader): assert id(accelerator.gradient_state.active_dataloader) == id(first_dataloader) if iteration < len(first_dataloader) - 1: assert not accelerator.gradient_state.end_of_dataloader if iteration == 1: for batch_num, _ in enumerate(second_dataloader): assert id(accelerator.gradient_state.active_dataloader) == id(second_dataloader) if batch_num < len(second_dataloader) - 1: assert not accelerator.gradient_state.end_of_dataloader else: assert accelerator.gradient_state.end_of_dataloader else: assert accelerator.gradient_state.end_of_dataloader assert accelerator.gradient_state.active_dataloader is None def main(): accelerator = Accelerator() state = accelerator.state if state.local_process_index == 0: print("**Test `accumulate` gradient accumulation with dataloader break**") if state.distributed_type != DistributedType.XLA: test_dataloader_break() if state.distributed_type == DistributedType.NO: if state.local_process_index == 0: print("**Test NOOP `no_sync` context manager**") test_noop_sync(accelerator) if state.distributed_type in ( DistributedType.MULTI_GPU, DistributedType.MULTI_NPU, DistributedType.MULTI_MLU, DistributedType.MULTI_MUSA, DistributedType.MULTI_CPU, ): if state.local_process_index == 0: print("**Test Distributed `no_sync` context manager**") test_distributed_sync(accelerator) if state.local_process_index == 0: print("**Test Distributed `no_sync` context manager with multiple forwards**") test_distributed_sync_multiple_fwd(accelerator) if state.distributed_type in ( DistributedType.MULTI_GPU, DistributedType.MULTI_NPU, DistributedType.MULTI_MLU, DistributedType.MULTI_MUSA, ): for split_batch in [True, False]: for dispatch_batches in [True, False]: for sync_each_batch in [True, False]: if state.local_process_index == 0: print( "**Test `accumulate` gradient accumulation, ", f"`split_batches={split_batch}` and `dispatch_batches={dispatch_batches}` and 
`sync_each_batch={sync_each_batch}`**", ) test_gradient_accumulation(split_batch, dispatch_batches, sync_each_batch) # Currently will break on torch 2.0 +, need to investigate why if state.local_process_index == 0: print( "**Test `accumulate` gradient accumulation with optimizer and scheduler, ", "`split_batches=False`, `dispatch_batches=False`, `sync_each_batch=False`**", ) test_gradient_accumulation_with_opt_and_scheduler() if state.distributed_type in ( DistributedType.MULTI_GPU, DistributedType.MULTI_NPU, DistributedType.MULTI_MLU, DistributedType.MULTI_MUSA, ): for split_batch in [True, False]: for dispatch_batches in [True, False]: for sync_each_batch in [True, False]: if not split_batch and not dispatch_batches and not sync_each_batch: continue if state.local_process_index == 0: print( "**Test `accumulate` gradient accumulation with optimizer and scheduler, ", f"`split_batches={split_batch}` and `dispatch_batches={dispatch_batches}` and `sync_each_batch={sync_each_batch}`**", ) test_gradient_accumulation_with_opt_and_scheduler(split_batch, dispatch_batches, sync_each_batch) state.destroy_process_group() def _mp_fn(index): # For xla_spawn (TPUs) main() if __name__ == "__main__": main()
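# --- Editor's addition: a hedged illustration, not part of the upstream test ---
# The user-facing pattern that the gradient-accumulation checks above validate looks roughly
# like this; the model, data, and hyperparameters are illustrative assumptions.
def _example_accumulate_loop():
    accelerator = Accelerator(gradient_accumulation_steps=2)
    model = RegressionModel()
    optimizer = AdamW(model.parameters(), lr=1e-3)
    dataloader = DataLoader(RegressionDataset(length=32), batch_size=8)
    model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)
    for batch in dataloader:
        # Gradients are synchronized only on the step that completes an accumulation window.
        with accelerator.accumulate(model):
            loss = F.mse_loss(model(batch["x"]), batch["y"])
            accelerator.backward(loss)
            optimizer.step()
            optimizer.zero_grad()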
8
0
hf_public_repos/accelerate/src/accelerate/test_utils
hf_public_repos/accelerate/src/accelerate/test_utils/scripts/test_merge_weights.py
# Copyright 2024 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import gc
import logging
import shutil
from pathlib import Path

import torch
from safetensors.torch import load_file
from torch.distributed.fsdp.fully_sharded_data_parallel import ShardingStrategy, StateDictType
from torch.utils.data import DataLoader

from accelerate import Accelerator, FullyShardedDataParallelPlugin
from accelerate.commands.merge import merge_command, merge_command_parser
from accelerate.state import AcceleratorState
from accelerate.test_utils.training import RegressionDataset
from accelerate.utils import merge_fsdp_weights, patch_environment, save_fsdp_model


logging.basicConfig(level=logging.INFO)

parser = merge_command_parser()


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = torch.nn.Linear(16, 16)
        self.activation = torch.nn.ReLU()
        self.linear2 = torch.nn.Linear(16, 16)
        self.softmax = torch.nn.Softmax()

    def forward(self, x):
        return self.linear2(self.activation(self.linear1(x)))


def setup():
    if AcceleratorState._shared_state != {}:
        AcceleratorState()._reset_state()
    plugin = FullyShardedDataParallelPlugin(
        sharding_strategy=ShardingStrategy.FULL_SHARD, state_dict_type=StateDictType.SHARDED_STATE_DICT
    )
    model = TinyModel()
    with patch_environment(fsdp_auto_wrap_policy="SIZE_BASED_WRAP"):
        plugin.set_auto_wrap_policy(model)
    accelerator = Accelerator(fsdp_plugin=plugin)
    model = accelerator.prepare(model)
    return model, plugin, accelerator


def mock_training(accelerator, model):
    train_set = RegressionDataset(length=128, seed=42)
    train_dl = DataLoader(train_set, batch_size=16, shuffle=False)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    train_dl, model, optimizer = accelerator.prepare(train_dl, model, optimizer)
    for _ in range(3):
        for batch in train_dl:
            model.zero_grad()
            output = model(batch["x"])
            loss = torch.nn.functional.mse_loss(output, batch["y"])
            accelerator.backward(loss)
            optimizer.step()
    return model


def check_weights(operation, state_1, state_2):
    for weight_1, weight_2 in zip(state_1.values(), state_2.values()):
        if str(weight_1.device) != "cuda":
            weight_1 = weight_1.to("cuda")
        if str(weight_2.device) != "cuda":
            weight_2 = weight_2.to("cuda")
        if operation == "same":
            assert torch.allclose(weight_1, weight_2)
        else:
            assert not torch.allclose(weight_1, weight_2)


def check_safetensors_weights(path, model):
    safe_state_dict = load_file(path / "model.safetensors")
    safe_loaded_model = TinyModel()
    check_weights("diff", model.state_dict(), safe_loaded_model.state_dict())
    safe_loaded_model.load_state_dict(safe_state_dict)
    check_weights("same", model.state_dict(), safe_loaded_model.state_dict())


def check_pytorch_weights(path, model):
    nonsafe_state_dict = torch.load(path / "pytorch_model.bin")
    nonsafe_loaded_model = TinyModel()
    check_weights("diff", model.state_dict(), nonsafe_loaded_model.state_dict())
    nonsafe_loaded_model.load_state_dict(nonsafe_state_dict)
    check_weights("same", model.state_dict(), nonsafe_loaded_model.state_dict())


def test_merge_weights_safetensors(model, path):
    # Should now be saved at `path/model.safetensors` (loaded by `check_safetensors_weights`)
    merge_fsdp_weights(path / "pytorch_model_fsdp_0", path, safe_serialization=True)
    check_safetensors_weights(path, model)


def test_merge_weights_command_safetensors(model, path):
    args = parser.parse_args([str(path / "pytorch_model_fsdp_0"), str(path)])
    merge_command(args)
    check_safetensors_weights(path, model)


def test_merge_weights_pytorch(model, path):
    # Should now be saved at `path/pytorch_model.bin` (loaded by `check_pytorch_weights`)
    merge_fsdp_weights(path / "pytorch_model_fsdp_0", path, safe_serialization=False)
    check_pytorch_weights(path, model)


def test_merge_weights_command_pytorch(model, path):
    args = parser.parse_args([str(path / "pytorch_model_fsdp_0"), str(path), "--unsafe_serialization"])
    merge_command(args)
    check_pytorch_weights(path, model)


if __name__ == "__main__":
    # Note this test requires at least two accelerators!
    model, plugin, accelerator = setup()
    if accelerator.num_processes > 1:
        try:
            # Initial setup: create the output directory for the sharded checkpoint
            out_path = Path("test_merge_weights_fsdp_weights")
            if not out_path.exists():
                out_path.mkdir(parents=True, exist_ok=True)

            # Train briefly so the weights aren't the baseline
            model = mock_training(accelerator, model)
            accelerator.wait_for_everyone()
            gc.collect()  # Needed for some lingering refs after training
            save_fsdp_model(plugin, accelerator, model, out_path)
            accelerator.wait_for_everyone()

            # Finally we can test
            test_merge_weights_safetensors(model, out_path)
            test_merge_weights_command_safetensors(model, out_path)
            test_merge_weights_pytorch(model, out_path)
            test_merge_weights_command_pytorch(model, out_path)
        except Exception:
            raise
        finally:
            # Cleanup in case of any failures
            if accelerator.is_main_process:
                shutil.rmtree(out_path)
            accelerator.wait_for_everyone()

    accelerator.end_training()
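For orientation, here is a minimal sketch of using the same merge utility outside the test harness. The directory names (`outputs/pytorch_model_fsdp_0`, `outputs/merged`) are hypothetical placeholders; the `merge_fsdp_weights` signature and output filenames follow the calls and checks in the script above, and the sketch assumes a sharded FSDP checkpoint (`StateDictType.SHARDED_STATE_DICT`) was already written, for example with `save_fsdp_model` as in the test.

```py
# Hypothetical standalone usage (paths are placeholders, not part of the test)
from accelerate.utils import merge_fsdp_weights

# Assumes a sharded FSDP checkpoint was saved here, e.g. via save_fsdp_model(...)
sharded_dir = "outputs/pytorch_model_fsdp_0"
merged_dir = "outputs/merged"

# safe_serialization=True writes `model.safetensors`; False writes `pytorch_model.bin`
# (matching what check_safetensors_weights / check_pytorch_weights load above).
merge_fsdp_weights(sharded_dir, merged_dir, safe_serialization=True)
```

The test itself only exercises the merge path when launched with more than one process (for example `accelerate launch --num_processes 2 test_merge_weights.py`), since the sharded checkpoint it merges is only produced by a multi-process FSDP run.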
9
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/111_fine_tune_whisper/whisper_architecture.svg
<svg width="648" height="522" viewBox="0 0 648 522" fill="none" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"> <g clip-path="url(#clip0_3571_158427)"> <rect width="647" height="521" transform="translate(0.00195312 0.650391)" fill="white"/> <path d="M381.007 140.844L375.007 137.38V144.308L381.007 140.844ZM322.814 141.444H375.607V140.244H322.814V141.444Z" fill="black"/> <path d="M381.007 285.766L375.007 282.302V289.23L381.007 285.766ZM322.814 286.366H375.607V285.166H322.814V286.366Z" fill="black"/> <path d="M381.007 189.09L375.007 185.626V192.554L381.007 189.09ZM322.814 189.69H375.607V188.49H322.814V189.69Z" fill="black"/> <path d="M380.957 334.656L374.957 331.192V338.12L380.957 334.656ZM330.011 335.256H375.557V334.056H330.011V335.256ZM314.011 41.1422L140.004 41.1422V42.3422L314.011 42.3422V41.1422ZM322.611 326.656V311.978H321.411V326.656H322.611ZM322.611 311.978V49.7422H321.411V311.978H322.611ZM131.404 49.7422V71.6744H132.604V49.7422H131.404ZM140.004 41.1422C135.254 41.1422 131.404 44.9925 131.404 49.7422H132.604C132.604 45.6553 135.917 42.3422 140.004 42.3422V41.1422ZM314.011 42.3422C318.097 42.3422 321.411 45.6553 321.411 49.7422H322.611C322.611 44.9926 318.76 41.1422 314.011 41.1422V42.3422ZM330.011 334.056C325.924 334.056 322.611 330.743 322.611 326.656H321.411C321.411 331.405 325.261 335.256 330.011 335.256V334.056Z" fill="black"/> <rect x="0.00390625" y="247.229" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M94.2967 262.76V271.376H99.9247V270.368H95.4367V267.428H99.2407V266.42H95.4367V263.768H99.9247V262.76H94.2967ZM101.245 271.376H102.325V267.632C102.325 266.588 103.117 265.988 103.897 265.988C104.833 265.988 105.205 266.648 105.205 267.524V271.376H106.285V267.236C106.285 265.904 105.529 264.98 104.197 264.98C103.309 264.98 102.685 265.436 102.325 265.952V265.1H101.245V271.376ZM108.666 268.244C108.666 266.804 109.434 265.916 110.55 265.916C111.414 265.916 111.93 266.48 112.11 267.248L113.046 266.78C112.746 265.736 111.858 264.98 110.55 264.98C108.798 264.98 107.586 266.3 107.586 268.244C107.586 270.176 108.798 271.508 110.55 271.508C111.858 271.508 112.77 270.716 113.07 269.672L112.11 269.228C111.93 270.008 111.414 270.56 110.55 270.56C109.434 270.56 108.666 269.672 108.666 268.244ZM119.653 268.244C119.653 266.3 118.453 264.98 116.737 264.98C115.021 264.98 113.821 266.3 113.821 268.244C113.821 270.188 115.021 271.508 116.737 271.508C118.453 271.508 119.653 270.188 119.653 268.244ZM114.913 268.244C114.913 266.804 115.633 265.892 116.737 265.892C117.841 265.892 118.561 266.804 118.561 268.244C118.561 269.672 117.841 270.584 116.737 270.584C115.633 270.584 114.913 269.672 114.913 268.244ZM123.33 271.508C124.206 271.508 124.902 271.088 125.274 270.464V271.376H126.354V262.76H125.274V266.012C124.902 265.4 124.206 264.98 123.33 264.98C121.59 264.98 120.594 266.456 120.594 268.244C120.594 270.02 121.59 271.508 123.33 271.508ZM125.298 268.028V268.472C125.298 269.912 124.41 270.56 123.522 270.56C122.346 270.56 121.686 269.588 121.686 268.244C121.686 266.888 122.346 265.928 123.522 265.928C124.41 265.928 125.298 266.564 125.298 268.028ZM130.647 271.508C131.895 271.508 132.843 270.824 133.227 269.708L132.279 269.372C132.087 270.116 131.487 270.572 130.647 270.572C129.555 270.572 128.799 269.732 128.739 268.412H133.275V268.052C133.275 266.312 132.351 264.98 130.599 264.98C128.895 264.98 127.707 266.372 127.707 268.244C127.707 270.176 128.907 271.508 130.647 271.508ZM130.587 265.904C131.679 265.904 132.159 266.696 132.183 267.584H128.799C128.991 266.516 
129.675 265.904 130.587 265.904ZM137.872 265.076C137.764 265.064 137.608 265.052 137.44 265.052C136.588 265.052 135.976 265.532 135.7 266.168V265.1H134.62V271.376H135.7V267.944C135.7 266.84 136.432 266.12 137.344 266.12C137.548 266.12 137.692 266.132 137.872 266.168V265.076ZM141.64 271.376H145.072C146.812 271.376 147.904 270.5 147.904 269C147.904 267.908 147.256 267.152 146.284 266.888C146.944 266.66 147.616 266.108 147.616 265.004C147.616 263.564 146.644 262.76 144.82 262.76H141.64V271.376ZM142.756 266.456V263.732H144.724C145.876 263.732 146.512 264.176 146.512 265.1C146.512 266.012 145.876 266.456 144.724 266.456H142.756ZM142.756 267.44H145.024C146.176 267.44 146.8 268.04 146.8 268.916C146.8 269.804 146.176 270.404 145.024 270.404H142.756V267.44ZM150.337 262.76H149.257V271.376H150.337V262.76ZM157.539 268.244C157.539 266.3 156.339 264.98 154.623 264.98C152.907 264.98 151.707 266.3 151.707 268.244C151.707 270.188 152.907 271.508 154.623 271.508C156.339 271.508 157.539 270.188 157.539 268.244ZM152.799 268.244C152.799 266.804 153.519 265.892 154.623 265.892C155.727 265.892 156.447 266.804 156.447 268.244C156.447 269.672 155.727 270.584 154.623 270.584C153.519 270.584 152.799 269.672 152.799 268.244ZM159.561 268.244C159.561 266.804 160.329 265.916 161.445 265.916C162.309 265.916 162.825 266.48 163.005 267.248L163.941 266.78C163.641 265.736 162.753 264.98 161.445 264.98C159.693 264.98 158.481 266.3 158.481 268.244C158.481 270.176 159.693 271.508 161.445 271.508C162.753 271.508 163.665 270.716 163.965 269.672L163.005 269.228C162.825 270.008 162.309 270.56 161.445 270.56C160.329 270.56 159.561 269.672 159.561 268.244ZM166.227 262.76H165.147V271.376H166.227V269.276L167.139 268.352L169.299 271.376H170.559L167.907 267.572L170.355 265.1H169.011L166.227 268.028V262.76Z" fill="black"/> <rect x="0.00390625" y="199.229" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M94.2987 214.76V223.376H99.9267V222.368H95.4387V219.428H99.2427V218.42H95.4387V215.768H99.9267V214.76H94.2987ZM101.247 223.376H102.327V219.632C102.327 218.588 103.119 217.988 103.899 217.988C104.835 217.988 105.207 218.648 105.207 219.524V223.376H106.287V219.236C106.287 217.904 105.531 216.98 104.199 216.98C103.311 216.98 102.687 217.436 102.327 217.952V217.1H101.247V223.376ZM108.668 220.244C108.668 218.804 109.436 217.916 110.552 217.916C111.416 217.916 111.932 218.48 112.112 219.248L113.048 218.78C112.748 217.736 111.86 216.98 110.552 216.98C108.8 216.98 107.588 218.3 107.588 220.244C107.588 222.176 108.8 223.508 110.552 223.508C111.86 223.508 112.772 222.716 113.072 221.672L112.112 221.228C111.932 222.008 111.416 222.56 110.552 222.56C109.436 222.56 108.668 221.672 108.668 220.244ZM119.655 220.244C119.655 218.3 118.455 216.98 116.739 216.98C115.023 216.98 113.823 218.3 113.823 220.244C113.823 222.188 115.023 223.508 116.739 223.508C118.455 223.508 119.655 222.188 119.655 220.244ZM114.915 220.244C114.915 218.804 115.635 217.892 116.739 217.892C117.843 217.892 118.563 218.804 118.563 220.244C118.563 221.672 117.843 222.584 116.739 222.584C115.635 222.584 114.915 221.672 114.915 220.244ZM123.332 223.508C124.208 223.508 124.904 223.088 125.276 222.464V223.376H126.356V214.76H125.276V218.012C124.904 217.4 124.208 216.98 123.332 216.98C121.592 216.98 120.596 218.456 120.596 220.244C120.596 222.02 121.592 223.508 123.332 223.508ZM125.3 220.028V220.472C125.3 221.912 124.412 222.56 123.524 222.56C122.348 222.56 121.688 221.588 121.688 220.244C121.688 218.888 122.348 217.928 123.524 217.928C124.412 217.928 125.3 218.564 125.3 
220.028ZM130.649 223.508C131.897 223.508 132.845 222.824 133.229 221.708L132.281 221.372C132.089 222.116 131.489 222.572 130.649 222.572C129.557 222.572 128.801 221.732 128.741 220.412H133.277V220.052C133.277 218.312 132.353 216.98 130.601 216.98C128.897 216.98 127.709 218.372 127.709 220.244C127.709 222.176 128.909 223.508 130.649 223.508ZM130.589 217.904C131.681 217.904 132.161 218.696 132.185 219.584H128.801C128.993 218.516 129.677 217.904 130.589 217.904ZM137.874 217.076C137.766 217.064 137.61 217.052 137.442 217.052C136.59 217.052 135.978 217.532 135.702 218.168V217.1H134.622V223.376H135.702V219.944C135.702 218.84 136.434 218.12 137.346 218.12C137.55 218.12 137.694 218.132 137.874 218.168V217.076ZM141.642 223.376H145.074C146.814 223.376 147.906 222.5 147.906 221C147.906 219.908 147.258 219.152 146.286 218.888C146.946 218.66 147.618 218.108 147.618 217.004C147.618 215.564 146.646 214.76 144.822 214.76H141.642V223.376ZM142.758 218.456V215.732H144.726C145.878 215.732 146.514 216.176 146.514 217.1C146.514 218.012 145.878 218.456 144.726 218.456H142.758ZM142.758 219.44H145.026C146.178 219.44 146.802 220.04 146.802 220.916C146.802 221.804 146.178 222.404 145.026 222.404H142.758V219.44ZM150.338 214.76H149.258V223.376H150.338V214.76ZM157.541 220.244C157.541 218.3 156.341 216.98 154.625 216.98C152.909 216.98 151.709 218.3 151.709 220.244C151.709 222.188 152.909 223.508 154.625 223.508C156.341 223.508 157.541 222.188 157.541 220.244ZM152.801 220.244C152.801 218.804 153.521 217.892 154.625 217.892C155.729 217.892 156.449 218.804 156.449 220.244C156.449 221.672 155.729 222.584 154.625 222.584C153.521 222.584 152.801 221.672 152.801 220.244ZM159.563 220.244C159.563 218.804 160.331 217.916 161.447 217.916C162.311 217.916 162.827 218.48 163.007 219.248L163.943 218.78C163.643 217.736 162.755 216.98 161.447 216.98C159.695 216.98 158.483 218.3 158.483 220.244C158.483 222.176 159.695 223.508 161.447 223.508C162.755 223.508 163.667 222.716 163.967 221.672L163.007 221.228C162.827 222.008 162.311 222.56 161.447 222.56C160.331 222.56 159.563 221.672 159.563 220.244ZM166.229 214.76H165.149V223.376H166.229V221.276L167.141 220.352L169.301 223.376H170.561L167.909 219.572L170.357 217.1H169.013L166.229 220.028V214.76Z" fill="black"/> <rect x="0.00390625" y="118.844" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M94.2987 134.375V142.991H99.9267V141.983H95.4387V139.043H99.2427V138.035H95.4387V135.383H99.9267V134.375H94.2987ZM101.247 142.991H102.327V139.247C102.327 138.203 103.119 137.603 103.899 137.603C104.835 137.603 105.207 138.263 105.207 139.139V142.991H106.287V138.851C106.287 137.519 105.531 136.595 104.199 136.595C103.311 136.595 102.687 137.051 102.327 137.567V136.715H101.247V142.991ZM108.668 139.859C108.668 138.419 109.436 137.531 110.552 137.531C111.416 137.531 111.932 138.095 112.112 138.863L113.048 138.395C112.748 137.351 111.86 136.595 110.552 136.595C108.8 136.595 107.588 137.915 107.588 139.859C107.588 141.791 108.8 143.123 110.552 143.123C111.86 143.123 112.772 142.331 113.072 141.287L112.112 140.843C111.932 141.623 111.416 142.175 110.552 142.175C109.436 142.175 108.668 141.287 108.668 139.859ZM119.655 139.859C119.655 137.915 118.455 136.595 116.739 136.595C115.023 136.595 113.823 137.915 113.823 139.859C113.823 141.803 115.023 143.123 116.739 143.123C118.455 143.123 119.655 141.803 119.655 139.859ZM114.915 139.859C114.915 138.419 115.635 137.507 116.739 137.507C117.843 137.507 118.563 138.419 118.563 139.859C118.563 141.287 117.843 142.199 116.739 142.199C115.635 142.199 114.915 141.287 
114.915 139.859ZM123.332 143.123C124.208 143.123 124.904 142.703 125.276 142.079V142.991H126.356V134.375H125.276V137.627C124.904 137.015 124.208 136.595 123.332 136.595C121.592 136.595 120.596 138.071 120.596 139.859C120.596 141.635 121.592 143.123 123.332 143.123ZM125.3 139.643V140.087C125.3 141.527 124.412 142.175 123.524 142.175C122.348 142.175 121.688 141.203 121.688 139.859C121.688 138.503 122.348 137.543 123.524 137.543C124.412 137.543 125.3 138.179 125.3 139.643ZM130.649 143.123C131.897 143.123 132.845 142.439 133.229 141.323L132.281 140.987C132.089 141.731 131.489 142.187 130.649 142.187C129.557 142.187 128.801 141.347 128.741 140.027H133.277V139.667C133.277 137.927 132.353 136.595 130.601 136.595C128.897 136.595 127.709 137.987 127.709 139.859C127.709 141.791 128.909 143.123 130.649 143.123ZM130.589 137.519C131.681 137.519 132.161 138.311 132.185 139.199H128.801C128.993 138.131 129.677 137.519 130.589 137.519ZM137.874 136.691C137.766 136.679 137.61 136.667 137.442 136.667C136.59 136.667 135.978 137.147 135.702 137.783V136.715H134.622V142.991H135.702V139.559C135.702 138.455 136.434 137.735 137.346 137.735C137.55 137.735 137.694 137.747 137.874 137.783V136.691ZM141.642 142.991H145.074C146.814 142.991 147.906 142.115 147.906 140.615C147.906 139.523 147.258 138.767 146.286 138.503C146.946 138.275 147.618 137.723 147.618 136.619C147.618 135.179 146.646 134.375 144.822 134.375H141.642V142.991ZM142.758 138.071V135.347H144.726C145.878 135.347 146.514 135.791 146.514 136.715C146.514 137.627 145.878 138.071 144.726 138.071H142.758ZM142.758 139.055H145.026C146.178 139.055 146.802 139.655 146.802 140.531C146.802 141.419 146.178 142.019 145.026 142.019H142.758V139.055ZM150.338 134.375H149.258V142.991H150.338V134.375ZM157.541 139.859C157.541 137.915 156.341 136.595 154.625 136.595C152.909 136.595 151.709 137.915 151.709 139.859C151.709 141.803 152.909 143.123 154.625 143.123C156.341 143.123 157.541 141.803 157.541 139.859ZM152.801 139.859C152.801 138.419 153.521 137.507 154.625 137.507C155.729 137.507 156.449 138.419 156.449 139.859C156.449 141.287 155.729 142.199 154.625 142.199C153.521 142.199 152.801 141.287 152.801 139.859ZM159.563 139.859C159.563 138.419 160.331 137.531 161.447 137.531C162.311 137.531 162.827 138.095 163.007 138.863L163.943 138.395C163.643 137.351 162.755 136.595 161.447 136.595C159.695 136.595 158.483 137.915 158.483 139.859C158.483 141.791 159.695 143.123 161.447 143.123C162.755 143.123 163.667 142.331 163.967 141.287L163.007 140.843C162.827 141.623 162.311 142.175 161.447 142.175C160.331 142.175 159.563 141.287 159.563 139.859ZM166.229 134.375H165.149V142.991H166.229V140.891L167.141 139.967L169.301 142.991H170.561L167.909 139.187L170.357 136.715H169.013L166.229 139.643V134.375Z" fill="black"/> <rect x="0.00390625" y="70.8438" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M94.2987 86.3752V94.9912H99.9267V93.9832H95.4387V91.0432H99.2427V90.0352H95.4387V87.3832H99.9267V86.3752H94.2987ZM101.247 94.9912H102.327V91.2472C102.327 90.2032 103.119 89.6032 103.899 89.6032C104.835 89.6032 105.207 90.2632 105.207 91.1392V94.9912H106.287V90.8512C106.287 89.5192 105.531 88.5952 104.199 88.5952C103.311 88.5952 102.687 89.0512 102.327 89.5672V88.7152H101.247V94.9912ZM108.668 91.8592C108.668 90.4192 109.436 89.5312 110.552 89.5312C111.416 89.5312 111.932 90.0952 112.112 90.8632L113.048 90.3952C112.748 89.3512 111.86 88.5952 110.552 88.5952C108.8 88.5952 107.588 89.9152 107.588 91.8592C107.588 93.7912 108.8 95.1232 110.552 95.1232C111.86 95.1232 112.772 94.3312 113.072 
93.2872L112.112 92.8432C111.932 93.6232 111.416 94.1752 110.552 94.1752C109.436 94.1752 108.668 93.2872 108.668 91.8592ZM119.655 91.8592C119.655 89.9152 118.455 88.5952 116.739 88.5952C115.023 88.5952 113.823 89.9152 113.823 91.8592C113.823 93.8032 115.023 95.1232 116.739 95.1232C118.455 95.1232 119.655 93.8032 119.655 91.8592ZM114.915 91.8592C114.915 90.4192 115.635 89.5072 116.739 89.5072C117.843 89.5072 118.563 90.4192 118.563 91.8592C118.563 93.2872 117.843 94.1992 116.739 94.1992C115.635 94.1992 114.915 93.2872 114.915 91.8592ZM123.332 95.1232C124.208 95.1232 124.904 94.7032 125.276 94.0792V94.9912H126.356V86.3752H125.276V89.6272C124.904 89.0152 124.208 88.5952 123.332 88.5952C121.592 88.5952 120.596 90.0712 120.596 91.8592C120.596 93.6352 121.592 95.1232 123.332 95.1232ZM125.3 91.6432V92.0872C125.3 93.5272 124.412 94.1752 123.524 94.1752C122.348 94.1752 121.688 93.2032 121.688 91.8592C121.688 90.5032 122.348 89.5432 123.524 89.5432C124.412 89.5432 125.3 90.1792 125.3 91.6432ZM130.649 95.1232C131.897 95.1232 132.845 94.4392 133.229 93.3232L132.281 92.9872C132.089 93.7312 131.489 94.1872 130.649 94.1872C129.557 94.1872 128.801 93.3472 128.741 92.0272H133.277V91.6672C133.277 89.9272 132.353 88.5952 130.601 88.5952C128.897 88.5952 127.709 89.9872 127.709 91.8592C127.709 93.7912 128.909 95.1232 130.649 95.1232ZM130.589 89.5192C131.681 89.5192 132.161 90.3112 132.185 91.1992H128.801C128.993 90.1312 129.677 89.5192 130.589 89.5192ZM137.874 88.6912C137.766 88.6792 137.61 88.6672 137.442 88.6672C136.59 88.6672 135.978 89.1472 135.702 89.7832V88.7152H134.622V94.9912H135.702V91.5592C135.702 90.4552 136.434 89.7352 137.346 89.7352C137.55 89.7352 137.694 89.7472 137.874 89.7832V88.6912ZM141.642 94.9912H145.074C146.814 94.9912 147.906 94.1152 147.906 92.6152C147.906 91.5232 147.258 90.7672 146.286 90.5032C146.946 90.2752 147.618 89.7232 147.618 88.6192C147.618 87.1792 146.646 86.3752 144.822 86.3752H141.642V94.9912ZM142.758 90.0712V87.3472H144.726C145.878 87.3472 146.514 87.7912 146.514 88.7152C146.514 89.6272 145.878 90.0712 144.726 90.0712H142.758ZM142.758 91.0552H145.026C146.178 91.0552 146.802 91.6552 146.802 92.5312C146.802 93.4192 146.178 94.0192 145.026 94.0192H142.758V91.0552ZM150.338 86.3752H149.258V94.9912H150.338V86.3752ZM157.541 91.8592C157.541 89.9152 156.341 88.5952 154.625 88.5952C152.909 88.5952 151.709 89.9152 151.709 91.8592C151.709 93.8032 152.909 95.1232 154.625 95.1232C156.341 95.1232 157.541 93.8032 157.541 91.8592ZM152.801 91.8592C152.801 90.4192 153.521 89.5072 154.625 89.5072C155.729 89.5072 156.449 90.4192 156.449 91.8592C156.449 93.2872 155.729 94.1992 154.625 94.1992C153.521 94.1992 152.801 93.2872 152.801 91.8592ZM159.563 91.8592C159.563 90.4192 160.331 89.5312 161.447 89.5312C162.311 89.5312 162.827 90.0952 163.007 90.8632L163.943 90.3952C163.643 89.3512 162.755 88.5952 161.447 88.5952C159.695 88.5952 158.483 89.9152 158.483 91.8592C158.483 93.7912 159.695 95.1232 161.447 95.1232C162.755 95.1232 163.667 94.3312 163.967 93.2872L163.007 92.8432C162.827 93.6232 162.311 94.1752 161.447 94.1752C160.331 94.1752 159.563 93.2872 159.563 91.8592ZM166.229 86.3752H165.149V94.9912H166.229V92.8912L167.141 91.9672L169.301 94.9912H170.561L167.909 91.1872L170.357 88.7152H169.013L166.229 91.6432V86.3752Z" fill="black"/> <circle cx="132.004" cy="170.844" r="2" fill="black"/> <circle cx="132.004" cy="178.844" r="2" fill="black"/> <circle cx="132.004" cy="186.844" r="2" fill="black"/> <rect x="0.00195312" y="426.831" width="264" height="44" rx="2" fill="url(#pattern0)"/> <rect 
x="0.751953" y="427.581" width="262.5" height="42.5" rx="1.25" stroke="black" stroke-width="1.5"/> <rect x="383.008" y="312.719" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M476.74 338.164H479.476C482.092 338.164 483.748 336.484 483.748 333.856C483.748 331.228 482.092 329.548 479.476 329.548H476.74V338.164ZM477.868 337.168V330.544H479.464C481.432 330.544 482.62 331.84 482.62 333.856C482.62 335.86 481.432 337.168 479.476 337.168H477.868ZM487.614 338.296C488.862 338.296 489.81 337.612 490.194 336.496L489.246 336.16C489.054 336.904 488.454 337.36 487.614 337.36C486.522 337.36 485.766 336.52 485.706 335.2H490.242V334.84C490.242 333.1 489.318 331.768 487.566 331.768C485.862 331.768 484.674 333.16 484.674 335.032C484.674 336.964 485.874 338.296 487.614 338.296ZM487.554 332.692C488.646 332.692 489.126 333.484 489.15 334.372H485.766C485.958 333.304 486.642 332.692 487.554 332.692ZM492.235 335.032C492.235 333.592 493.003 332.704 494.119 332.704C494.983 332.704 495.499 333.268 495.679 334.036L496.615 333.568C496.315 332.524 495.427 331.768 494.119 331.768C492.367 331.768 491.155 333.088 491.155 335.032C491.155 336.964 492.367 338.296 494.119 338.296C495.427 338.296 496.339 337.504 496.639 336.46L495.679 336.016C495.499 336.796 494.983 337.348 494.119 337.348C493.003 337.348 492.235 336.46 492.235 335.032ZM503.221 335.032C503.221 333.088 502.021 331.768 500.305 331.768C498.589 331.768 497.389 333.088 497.389 335.032C497.389 336.976 498.589 338.296 500.305 338.296C502.021 338.296 503.221 336.976 503.221 335.032ZM498.481 335.032C498.481 333.592 499.201 332.68 500.305 332.68C501.409 332.68 502.129 333.592 502.129 335.032C502.129 336.46 501.409 337.372 500.305 337.372C499.201 337.372 498.481 336.46 498.481 335.032ZM506.898 338.296C507.774 338.296 508.47 337.876 508.842 337.252V338.164H509.922V329.548H508.842V332.8C508.47 332.188 507.774 331.768 506.898 331.768C505.158 331.768 504.162 333.244 504.162 335.032C504.162 336.808 505.158 338.296 506.898 338.296ZM508.866 334.816V335.26C508.866 336.7 507.978 337.348 507.09 337.348C505.914 337.348 505.254 336.376 505.254 335.032C505.254 333.676 505.914 332.716 507.09 332.716C507.978 332.716 508.866 333.352 508.866 334.816ZM514.216 338.296C515.464 338.296 516.412 337.612 516.796 336.496L515.848 336.16C515.656 336.904 515.056 337.36 514.216 337.36C513.124 337.36 512.368 336.52 512.308 335.2H516.844V334.84C516.844 333.1 515.92 331.768 514.168 331.768C512.464 331.768 511.276 333.16 511.276 335.032C511.276 336.964 512.476 338.296 514.216 338.296ZM514.156 332.692C515.248 332.692 515.728 333.484 515.752 334.372H512.368C512.56 333.304 513.244 332.692 514.156 332.692ZM521.44 331.864C521.332 331.852 521.176 331.84 521.008 331.84C520.156 331.84 519.544 332.32 519.268 332.956V331.888H518.188V338.164H519.268V334.732C519.268 333.628 520 332.908 520.912 332.908C521.116 332.908 521.26 332.92 521.44 332.956V331.864ZM525.209 338.164H528.641C530.381 338.164 531.473 337.288 531.473 335.788C531.473 334.696 530.825 333.94 529.853 333.676C530.513 333.448 531.185 332.896 531.185 331.792C531.185 330.352 530.213 329.548 528.389 329.548H525.209V338.164ZM526.325 333.244V330.52H528.293C529.445 330.52 530.081 330.964 530.081 331.888C530.081 332.8 529.445 333.244 528.293 333.244H526.325ZM526.325 334.228H528.593C529.745 334.228 530.369 334.828 530.369 335.704C530.369 336.592 529.745 337.192 528.593 337.192H526.325V334.228ZM533.905 329.548H532.825V338.164H533.905V329.548ZM541.108 335.032C541.108 333.088 539.908 331.768 538.192 331.768C536.476 331.768 535.276 333.088 535.276 
335.032C535.276 336.976 536.476 338.296 538.192 338.296C539.908 338.296 541.108 336.976 541.108 335.032ZM536.368 335.032C536.368 333.592 537.088 332.68 538.192 332.68C539.296 332.68 540.016 333.592 540.016 335.032C540.016 336.46 539.296 337.372 538.192 337.372C537.088 337.372 536.368 336.46 536.368 335.032ZM543.129 335.032C543.129 333.592 543.897 332.704 545.013 332.704C545.877 332.704 546.393 333.268 546.573 334.036L547.509 333.568C547.209 332.524 546.321 331.768 545.013 331.768C543.261 331.768 542.049 333.088 542.049 335.032C542.049 336.964 543.261 338.296 545.013 338.296C546.321 338.296 547.233 337.504 547.533 336.46L546.573 336.016C546.393 336.796 545.877 337.348 545.013 337.348C543.897 337.348 543.129 336.46 543.129 335.032ZM549.796 329.548H548.716V338.164H549.796V336.064L550.708 335.14L552.868 338.164H554.128L551.476 334.36L553.924 331.888H552.58L549.796 334.816V329.548Z" fill="black"/> <rect x="383.008" y="264.719" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M476.74 290.164H479.476C482.092 290.164 483.748 288.484 483.748 285.856C483.748 283.228 482.092 281.548 479.476 281.548H476.74V290.164ZM477.868 289.168V282.544H479.464C481.432 282.544 482.62 283.84 482.62 285.856C482.62 287.86 481.432 289.168 479.476 289.168H477.868ZM487.614 290.296C488.862 290.296 489.81 289.612 490.194 288.496L489.246 288.16C489.054 288.904 488.454 289.36 487.614 289.36C486.522 289.36 485.766 288.52 485.706 287.2H490.242V286.84C490.242 285.1 489.318 283.768 487.566 283.768C485.862 283.768 484.674 285.16 484.674 287.032C484.674 288.964 485.874 290.296 487.614 290.296ZM487.554 284.692C488.646 284.692 489.126 285.484 489.15 286.372H485.766C485.958 285.304 486.642 284.692 487.554 284.692ZM492.235 287.032C492.235 285.592 493.003 284.704 494.119 284.704C494.983 284.704 495.499 285.268 495.679 286.036L496.615 285.568C496.315 284.524 495.427 283.768 494.119 283.768C492.367 283.768 491.155 285.088 491.155 287.032C491.155 288.964 492.367 290.296 494.119 290.296C495.427 290.296 496.339 289.504 496.639 288.46L495.679 288.016C495.499 288.796 494.983 289.348 494.119 289.348C493.003 289.348 492.235 288.46 492.235 287.032ZM503.221 287.032C503.221 285.088 502.021 283.768 500.305 283.768C498.589 283.768 497.389 285.088 497.389 287.032C497.389 288.976 498.589 290.296 500.305 290.296C502.021 290.296 503.221 288.976 503.221 287.032ZM498.481 287.032C498.481 285.592 499.201 284.68 500.305 284.68C501.409 284.68 502.129 285.592 502.129 287.032C502.129 288.46 501.409 289.372 500.305 289.372C499.201 289.372 498.481 288.46 498.481 287.032ZM506.898 290.296C507.774 290.296 508.47 289.876 508.842 289.252V290.164H509.922V281.548H508.842V284.8C508.47 284.188 507.774 283.768 506.898 283.768C505.158 283.768 504.162 285.244 504.162 287.032C504.162 288.808 505.158 290.296 506.898 290.296ZM508.866 286.816V287.26C508.866 288.7 507.978 289.348 507.09 289.348C505.914 289.348 505.254 288.376 505.254 287.032C505.254 285.676 505.914 284.716 507.09 284.716C507.978 284.716 508.866 285.352 508.866 286.816ZM514.216 290.296C515.464 290.296 516.412 289.612 516.796 288.496L515.848 288.16C515.656 288.904 515.056 289.36 514.216 289.36C513.124 289.36 512.368 288.52 512.308 287.2H516.844V286.84C516.844 285.1 515.92 283.768 514.168 283.768C512.464 283.768 511.276 285.16 511.276 287.032C511.276 288.964 512.476 290.296 514.216 290.296ZM514.156 284.692C515.248 284.692 515.728 285.484 515.752 286.372H512.368C512.56 285.304 513.244 284.692 514.156 284.692ZM521.44 283.864C521.332 283.852 521.176 283.84 521.008 283.84C520.156 283.84 519.544 284.32 519.268 
284.956V283.888H518.188V290.164H519.268V286.732C519.268 285.628 520 284.908 520.912 284.908C521.116 284.908 521.26 284.92 521.44 284.956V283.864ZM525.209 290.164H528.641C530.381 290.164 531.473 289.288 531.473 287.788C531.473 286.696 530.825 285.94 529.853 285.676C530.513 285.448 531.185 284.896 531.185 283.792C531.185 282.352 530.213 281.548 528.389 281.548H525.209V290.164ZM526.325 285.244V282.52H528.293C529.445 282.52 530.081 282.964 530.081 283.888C530.081 284.8 529.445 285.244 528.293 285.244H526.325ZM526.325 286.228H528.593C529.745 286.228 530.369 286.828 530.369 287.704C530.369 288.592 529.745 289.192 528.593 289.192H526.325V286.228ZM533.905 281.548H532.825V290.164H533.905V281.548ZM541.108 287.032C541.108 285.088 539.908 283.768 538.192 283.768C536.476 283.768 535.276 285.088 535.276 287.032C535.276 288.976 536.476 290.296 538.192 290.296C539.908 290.296 541.108 288.976 541.108 287.032ZM536.368 287.032C536.368 285.592 537.088 284.68 538.192 284.68C539.296 284.68 540.016 285.592 540.016 287.032C540.016 288.46 539.296 289.372 538.192 289.372C537.088 289.372 536.368 288.46 536.368 287.032ZM543.129 287.032C543.129 285.592 543.897 284.704 545.013 284.704C545.877 284.704 546.393 285.268 546.573 286.036L547.509 285.568C547.209 284.524 546.321 283.768 545.013 283.768C543.261 283.768 542.049 285.088 542.049 287.032C542.049 288.964 543.261 290.296 545.013 290.296C546.321 290.296 547.233 289.504 547.533 288.46L546.573 288.016C546.393 288.796 545.877 289.348 545.013 289.348C543.897 289.348 543.129 288.46 543.129 287.032ZM549.796 281.548H548.716V290.164H549.796V288.064L550.708 287.14L552.868 290.164H554.128L551.476 286.36L553.924 283.888H552.58L549.796 286.816V281.548Z" fill="black"/> <rect x="383.008" y="168.83" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M476.74 194.275H479.476C482.092 194.275 483.748 192.595 483.748 189.967C483.748 187.339 482.092 185.659 479.476 185.659H476.74V194.275ZM477.868 193.279V186.655H479.464C481.432 186.655 482.62 187.951 482.62 189.967C482.62 191.971 481.432 193.279 479.476 193.279H477.868ZM487.614 194.407C488.862 194.407 489.81 193.723 490.194 192.607L489.246 192.271C489.054 193.015 488.454 193.471 487.614 193.471C486.522 193.471 485.766 192.631 485.706 191.311H490.242V190.951C490.242 189.211 489.318 187.879 487.566 187.879C485.862 187.879 484.674 189.271 484.674 191.143C484.674 193.075 485.874 194.407 487.614 194.407ZM487.554 188.803C488.646 188.803 489.126 189.595 489.15 190.483H485.766C485.958 189.415 486.642 188.803 487.554 188.803ZM492.235 191.143C492.235 189.703 493.003 188.815 494.119 188.815C494.983 188.815 495.499 189.379 495.679 190.147L496.615 189.679C496.315 188.635 495.427 187.879 494.119 187.879C492.367 187.879 491.155 189.199 491.155 191.143C491.155 193.075 492.367 194.407 494.119 194.407C495.427 194.407 496.339 193.615 496.639 192.571L495.679 192.127C495.499 192.907 494.983 193.459 494.119 193.459C493.003 193.459 492.235 192.571 492.235 191.143ZM503.221 191.143C503.221 189.199 502.021 187.879 500.305 187.879C498.589 187.879 497.389 189.199 497.389 191.143C497.389 193.087 498.589 194.407 500.305 194.407C502.021 194.407 503.221 193.087 503.221 191.143ZM498.481 191.143C498.481 189.703 499.201 188.791 500.305 188.791C501.409 188.791 502.129 189.703 502.129 191.143C502.129 192.571 501.409 193.483 500.305 193.483C499.201 193.483 498.481 192.571 498.481 191.143ZM506.898 194.407C507.774 194.407 508.47 193.987 508.842 193.363V194.275H509.922V185.659H508.842V188.911C508.47 188.299 507.774 187.879 506.898 187.879C505.158 187.879 504.162 189.355 
504.162 191.143C504.162 192.919 505.158 194.407 506.898 194.407ZM508.866 190.927V191.371C508.866 192.811 507.978 193.459 507.09 193.459C505.914 193.459 505.254 192.487 505.254 191.143C505.254 189.787 505.914 188.827 507.09 188.827C507.978 188.827 508.866 189.463 508.866 190.927ZM514.216 194.407C515.464 194.407 516.412 193.723 516.796 192.607L515.848 192.271C515.656 193.015 515.056 193.471 514.216 193.471C513.124 193.471 512.368 192.631 512.308 191.311H516.844V190.951C516.844 189.211 515.92 187.879 514.168 187.879C512.464 187.879 511.276 189.271 511.276 191.143C511.276 193.075 512.476 194.407 514.216 194.407ZM514.156 188.803C515.248 188.803 515.728 189.595 515.752 190.483H512.368C512.56 189.415 513.244 188.803 514.156 188.803ZM521.44 187.975C521.332 187.963 521.176 187.951 521.008 187.951C520.156 187.951 519.544 188.431 519.268 189.067V187.999H518.188V194.275H519.268V190.843C519.268 189.739 520 189.019 520.912 189.019C521.116 189.019 521.26 189.031 521.44 189.067V187.975ZM525.209 194.275H528.641C530.381 194.275 531.473 193.399 531.473 191.899C531.473 190.807 530.825 190.051 529.853 189.787C530.513 189.559 531.185 189.007 531.185 187.903C531.185 186.463 530.213 185.659 528.389 185.659H525.209V194.275ZM526.325 189.355V186.631H528.293C529.445 186.631 530.081 187.075 530.081 187.999C530.081 188.911 529.445 189.355 528.293 189.355H526.325ZM526.325 190.339H528.593C529.745 190.339 530.369 190.939 530.369 191.815C530.369 192.703 529.745 193.303 528.593 193.303H526.325V190.339ZM533.905 185.659H532.825V194.275H533.905V185.659ZM541.108 191.143C541.108 189.199 539.908 187.879 538.192 187.879C536.476 187.879 535.276 189.199 535.276 191.143C535.276 193.087 536.476 194.407 538.192 194.407C539.908 194.407 541.108 193.087 541.108 191.143ZM536.368 191.143C536.368 189.703 537.088 188.791 538.192 188.791C539.296 188.791 540.016 189.703 540.016 191.143C540.016 192.571 539.296 193.483 538.192 193.483C537.088 193.483 536.368 192.571 536.368 191.143ZM543.129 191.143C543.129 189.703 543.897 188.815 545.013 188.815C545.877 188.815 546.393 189.379 546.573 190.147L547.509 189.679C547.209 188.635 546.321 187.879 545.013 187.879C543.261 187.879 542.049 189.199 542.049 191.143C542.049 193.075 543.261 194.407 545.013 194.407C546.321 194.407 547.233 193.615 547.533 192.571L546.573 192.127C546.393 192.907 545.877 193.459 545.013 193.459C543.897 193.459 543.129 192.571 543.129 191.143ZM549.796 185.659H548.716V194.275H549.796V192.175L550.708 191.251L552.868 194.275H554.128L551.476 190.471L553.924 187.999H552.58L549.796 190.927V185.659Z" fill="black"/> <rect x="383.008" y="120.83" width="264" height="40" rx="2" fill="#EDEDED"/> <path d="M476.74 146.275H479.476C482.092 146.275 483.748 144.595 483.748 141.967C483.748 139.339 482.092 137.659 479.476 137.659H476.74V146.275ZM477.868 145.279V138.655H479.464C481.432 138.655 482.62 139.951 482.62 141.967C482.62 143.971 481.432 145.279 479.476 145.279H477.868ZM487.614 146.407C488.862 146.407 489.81 145.723 490.194 144.607L489.246 144.271C489.054 145.015 488.454 145.471 487.614 145.471C486.522 145.471 485.766 144.631 485.706 143.311H490.242V142.951C490.242 141.211 489.318 139.879 487.566 139.879C485.862 139.879 484.674 141.271 484.674 143.143C484.674 145.075 485.874 146.407 487.614 146.407ZM487.554 140.803C488.646 140.803 489.126 141.595 489.15 142.483H485.766C485.958 141.415 486.642 140.803 487.554 140.803ZM492.235 143.143C492.235 141.703 493.003 140.815 494.119 140.815C494.983 140.815 495.499 141.379 495.679 142.147L496.615 141.679C496.315 140.635 495.427 139.879 494.119 
139.879C492.367 139.879 491.155 141.199 491.155 143.143C491.155 145.075 492.367 146.407 494.119 146.407C495.427 146.407 496.339 145.615 496.639 144.571L495.679 144.127C495.499 144.907 494.983 145.459 494.119 145.459C493.003 145.459 492.235 144.571 492.235 143.143ZM503.221 143.143C503.221 141.199 502.021 139.879 500.305 139.879C498.589 139.879 497.389 141.199 497.389 143.143C497.389 145.087 498.589 146.407 500.305 146.407C502.021 146.407 503.221 145.087 503.221 143.143ZM498.481 143.143C498.481 141.703 499.201 140.791 500.305 140.791C501.409 140.791 502.129 141.703 502.129 143.143C502.129 144.571 501.409 145.483 500.305 145.483C499.201 145.483 498.481 144.571 498.481 143.143ZM506.898 146.407C507.774 146.407 508.47 145.987 508.842 145.363V146.275H509.922V137.659H508.842V140.911C508.47 140.299 507.774 139.879 506.898 139.879C505.158 139.879 504.162 141.355 504.162 143.143C504.162 144.919 505.158 146.407 506.898 146.407ZM508.866 142.927V143.371C508.866 144.811 507.978 145.459 507.09 145.459C505.914 145.459 505.254 144.487 505.254 143.143C505.254 141.787 505.914 140.827 507.09 140.827C507.978 140.827 508.866 141.463 508.866 142.927ZM514.216 146.407C515.464 146.407 516.412 145.723 516.796 144.607L515.848 144.271C515.656 145.015 515.056 145.471 514.216 145.471C513.124 145.471 512.368 144.631 512.308 143.311H516.844V142.951C516.844 141.211 515.92 139.879 514.168 139.879C512.464 139.879 511.276 141.271 511.276 143.143C511.276 145.075 512.476 146.407 514.216 146.407ZM514.156 140.803C515.248 140.803 515.728 141.595 515.752 142.483H512.368C512.56 141.415 513.244 140.803 514.156 140.803ZM521.44 139.975C521.332 139.963 521.176 139.951 521.008 139.951C520.156 139.951 519.544 140.431 519.268 141.067V139.999H518.188V146.275H519.268V142.843C519.268 141.739 520 141.019 520.912 141.019C521.116 141.019 521.26 141.031 521.44 141.067V139.975ZM525.209 146.275H528.641C530.381 146.275 531.473 145.399 531.473 143.899C531.473 142.807 530.825 142.051 529.853 141.787C530.513 141.559 531.185 141.007 531.185 139.903C531.185 138.463 530.213 137.659 528.389 137.659H525.209V146.275ZM526.325 141.355V138.631H528.293C529.445 138.631 530.081 139.075 530.081 139.999C530.081 140.911 529.445 141.355 528.293 141.355H526.325ZM526.325 142.339H528.593C529.745 142.339 530.369 142.939 530.369 143.815C530.369 144.703 529.745 145.303 528.593 145.303H526.325V142.339ZM533.905 137.659H532.825V146.275H533.905V137.659ZM541.108 143.143C541.108 141.199 539.908 139.879 538.192 139.879C536.476 139.879 535.276 141.199 535.276 143.143C535.276 145.087 536.476 146.407 538.192 146.407C539.908 146.407 541.108 145.087 541.108 143.143ZM536.368 143.143C536.368 141.703 537.088 140.791 538.192 140.791C539.296 140.791 540.016 141.703 540.016 143.143C540.016 144.571 539.296 145.483 538.192 145.483C537.088 145.483 536.368 144.571 536.368 143.143ZM543.129 143.143C543.129 141.703 543.897 140.815 545.013 140.815C545.877 140.815 546.393 141.379 546.573 142.147L547.509 141.679C547.209 140.635 546.321 139.879 545.013 139.879C543.261 139.879 542.049 141.199 542.049 143.143C542.049 145.075 543.261 146.407 545.013 146.407C546.321 146.407 547.233 145.615 547.533 144.571L546.573 144.127C546.393 144.907 545.877 145.459 545.013 145.459C543.897 145.459 543.129 144.571 543.129 143.143ZM549.796 137.659H548.716V146.275H549.796V144.175L550.708 143.251L552.868 146.275H554.128L551.476 142.471L553.924 139.999H552.58L549.796 142.927V137.659Z" fill="black"/> <circle cx="515.691" cy="228.773" r="2" fill="black"/> <circle cx="515.691" cy="236.773" r="2" fill="black"/> <circle 
cx="515.691" cy="244.773" r="2" fill="black"/> <circle cx="335.814" cy="228.773" r="2" fill="black"/> <circle cx="335.814" cy="236.773" r="2" fill="black"/> <circle cx="335.814" cy="244.773" r="2" fill="black"/> <rect x="381.608" y="41.891" width="264.9" height="42.8" rx="1.4" stroke="#CDCDCD" stroke-width="1.2"/> <path d="M548.177 60.787V59.947H542.357V60.787H544.787V67.127H545.747V60.787H548.177ZM549.002 67.127H549.902V64.007C549.902 63.137 550.562 62.637 551.212 62.637C551.992 62.637 552.302 63.187 552.302 63.917V67.127H553.202V63.677C553.202 62.567 552.572 61.797 551.462 61.797C550.722 61.797 550.202 62.177 549.902 62.607V59.947H549.002V67.127ZM556.736 67.237C557.776 67.237 558.566 66.667 558.886 65.737L558.096 65.457C557.936 66.077 557.436 66.457 556.736 66.457C555.826 66.457 555.196 65.757 555.146 64.657H558.926V64.357C558.926 62.907 558.156 61.797 556.696 61.797C555.276 61.797 554.286 62.957 554.286 64.517C554.286 66.127 555.286 67.237 556.736 67.237ZM556.686 62.567C557.596 62.567 557.996 63.227 558.016 63.967H555.196C555.356 63.077 555.926 62.567 556.686 62.567Z" fill="black"/> <rect x="534.344" y="46.4941" width="33.8536" height="34" rx="2" fill="#7CF178"/> <path d="M541.682 67.1041C542.412 67.1041 542.992 66.7541 543.302 66.2341V68.7941H544.202V61.7641H543.302V62.5241C542.992 62.0141 542.412 61.6641 541.682 61.6641C540.232 61.6641 539.402 62.8941 539.402 64.3841C539.402 65.8641 540.232 67.1041 541.682 67.1041ZM543.322 64.2041V64.5741C543.322 65.7741 542.582 66.3141 541.842 66.3141C540.862 66.3141 540.312 65.5041 540.312 64.3841C540.312 63.2541 540.862 62.4541 541.842 62.4541C542.582 62.4541 543.322 62.9841 543.322 64.2041ZM549.849 61.7641H548.949V64.9141C548.949 65.7841 548.289 66.2741 547.639 66.2741C546.869 66.2741 546.549 65.7741 546.549 65.0341V61.7641H545.649V65.2241C545.649 66.3341 546.279 67.1041 547.399 67.1041C548.139 67.1041 548.649 66.7141 548.949 66.2741V66.9941H549.849V61.7641ZM551.304 60.8541H552.284V59.8141H551.304V60.8541ZM552.244 61.7641H551.344V66.9941H552.244V61.7641ZM554.286 64.3841C554.286 63.1841 554.926 62.4441 555.856 62.4441C556.576 62.4441 557.006 62.9141 557.156 63.5541L557.936 63.1641C557.686 62.2941 556.946 61.6641 555.856 61.6641C554.396 61.6641 553.386 62.7641 553.386 64.3841C553.386 65.9941 554.396 67.1041 555.856 67.1041C556.946 67.1041 557.706 66.4441 557.956 65.5741L557.156 65.2041C557.006 65.8541 556.576 66.3141 555.856 66.3141C554.926 66.3141 554.286 65.5741 554.286 64.3841ZM559.841 59.8141H558.941V66.9941H559.841V65.2441L560.601 64.4741L562.401 66.9941H563.451L561.241 63.8241L563.281 61.7641H562.161L559.841 64.2041V59.8141Z" fill="black"/> <rect x="497.504" y="46.4941" width="33.8536" height="34" rx="2" fill="#7CF178"/> <path d="M511.866 60.6541V59.8141H506.046V60.6541H508.476V66.9941H509.436V60.6541H511.866ZM512.691 66.9941H513.591V63.8741C513.591 63.0041 514.251 62.5041 514.901 62.5041C515.681 62.5041 515.991 63.0541 515.991 63.7841V66.9941H516.891V63.5441C516.891 62.4341 516.261 61.6641 515.151 61.6641C514.411 61.6641 513.891 62.0441 513.591 62.4741V59.8141H512.691V66.9941ZM520.425 67.1041C521.465 67.1041 522.255 66.5341 522.575 65.6041L521.785 65.3241C521.625 65.9441 521.125 66.3241 520.425 66.3241C519.515 66.3241 518.885 65.6241 518.835 64.5241H522.615V64.2241C522.615 62.7741 521.845 61.6641 520.385 61.6641C518.965 61.6641 517.975 62.8241 517.975 64.3841C517.975 65.9941 518.975 67.1041 520.425 67.1041ZM520.375 62.4341C521.285 62.4341 521.685 63.0941 521.705 63.8341H518.885C519.045 62.9441 519.615 62.4341 520.375 62.4341Z" fill="black"/> 
<rect x="424.422" y="47.0941" width="32.6536" height="32.8" rx="1.4" fill="white" stroke="black" stroke-width="1.2"/> <path d="M433.036 57.7521V56.9681H428.906V57.7521H430.523V61.9941H431.419V57.7521H433.036ZM434.514 60.0481H435.326L436.551 61.9941H437.573L436.278 59.9781C437.013 59.8031 437.447 59.2991 437.447 58.5081C437.447 57.4931 436.747 56.9681 435.627 56.9681H433.618V61.9941H434.514V60.0481ZM434.514 59.2781V57.7381H435.578C436.222 57.7381 436.565 58.0111 436.565 58.5081C436.565 58.9981 436.222 59.2781 435.578 59.2781H434.514ZM439.655 56.9681L437.751 61.9941H438.64L439.067 60.8461H441.237L441.671 61.9941H442.574L440.67 56.9681H439.655ZM440.145 57.9621L440.943 60.0621H439.361L440.145 57.9621ZM447.257 56.9681H446.396V60.4891L444.044 56.9681H443.12V61.9941H443.981V58.3051L446.473 61.9941H447.257V56.9681ZM448.009 60.3281H449.969V59.5791H448.009V60.3281ZM428.766 68.0141C429.158 68.6581 430.012 69.0711 430.873 69.0711C431.909 69.0711 432.777 68.4621 432.777 67.4751C432.777 66.3831 431.804 66.1941 430.992 66.0121C430.313 65.8581 429.928 65.7671 429.928 65.3261C429.928 64.9271 430.299 64.6541 430.838 64.6541C431.426 64.6541 431.797 64.9551 432.077 65.3891L432.735 64.8221C432.399 64.2971 431.727 63.8911 430.866 63.8911C429.851 63.8911 429.053 64.4861 429.053 65.4171C429.053 66.3971 429.865 66.6211 430.621 66.8031C431.37 66.9851 431.895 67.0551 431.895 67.5661C431.895 68.0491 431.468 68.3081 430.901 68.3081C430.327 68.3081 429.802 67.9931 429.452 67.4471L428.766 68.0141ZM434.043 66.4811C434.043 65.3961 434.631 64.6821 435.457 64.6821C436.045 64.6821 436.472 65.0391 436.626 65.6271L437.466 65.3401C437.172 64.4511 436.472 63.8911 435.457 63.8911C434.12 63.8911 433.147 64.9551 433.147 66.4811C433.147 68.0071 434.12 69.0711 435.457 69.0711C436.472 69.0711 437.172 68.5111 437.466 67.6221L436.626 67.3351C436.472 67.9231 436.045 68.2801 435.457 68.2801C434.631 68.2801 434.043 67.5661 434.043 66.4811ZM438.971 67.0481H439.783L441.008 68.9941H442.03L440.735 66.9781C441.47 66.8031 441.904 66.2991 441.904 65.5081C441.904 64.4931 441.204 63.9681 440.084 63.9681H438.075V68.9941H438.971V67.0481ZM438.971 66.2781V64.7381H440.035C440.679 64.7381 441.022 65.0111 441.022 65.5081C441.022 65.9981 440.679 66.2781 440.035 66.2781H438.971ZM443.544 63.9681H442.648V68.9941H443.544V63.9681ZM444.494 68.9941H446.615C447.651 68.9941 448.302 68.4761 448.302 67.5941C448.302 66.9781 447.952 66.5441 447.427 66.3761C447.777 66.2361 448.141 65.9141 448.141 65.2911C448.141 64.4371 447.567 63.9681 446.489 63.9681H444.494V68.9941ZM445.355 66.0471V64.7241H446.405C446.979 64.7241 447.294 64.9411 447.294 65.3821C447.294 65.8231 446.979 66.0471 446.405 66.0471H445.355ZM445.355 66.8031H446.566C447.133 66.8031 447.441 67.0971 447.441 67.5171C447.441 67.9441 447.133 68.2381 446.566 68.2381H445.355V66.8031ZM448.964 63.9681V68.9941H452.366V68.2101H449.846V66.7961H451.967V66.0121H449.846V64.7521H452.366V63.9681H448.964Z" fill="black"/> <path d="M470.498 63.2108C470.498 65.4008 471.558 66.9108 473.438 66.9108C475.318 66.9108 476.388 65.4008 476.388 63.2108C476.388 61.0208 475.318 59.5108 473.438 59.5108C471.558 59.5108 470.498 61.0208 470.498 63.2108ZM472.168 63.2108C472.168 61.8608 472.538 60.9308 473.438 60.9308C474.338 60.9308 474.718 61.8608 474.718 63.2108C474.718 64.5608 474.338 65.4908 473.438 65.4908C472.538 65.4908 472.168 64.5608 472.168 63.2108ZM476.671 66.8008H478.511V65.0108H476.671V66.8008ZM478.799 63.2108C478.799 65.4008 479.859 66.9108 481.739 66.9108C483.619 66.9108 484.689 65.4008 484.689 63.2108C484.689 61.0208 483.619 
59.5108 481.739 59.5108C479.859 59.5108 478.799 61.0208 478.799 63.2108ZM480.469 63.2108C480.469 61.8608 480.839 60.9308 481.739 60.9308C482.639 60.9308 483.019 61.8608 483.019 63.2108C483.019 64.5608 482.639 65.4908 481.739 65.4908C480.839 65.4908 480.469 64.5608 480.469 63.2108Z" fill="black"/> <rect x="461.264" y="47.0941" width="31.6579" height="31.8" rx="1.4" fill="white" stroke="black" stroke-width="1.2" stroke-dasharray="2 2"/> <path d="M470.618 63.5643C470.618 65.7543 471.648 67.2643 473.358 67.2643C475.058 67.2643 476.088 65.7543 476.088 63.5643C476.088 61.3743 475.058 59.8643 473.358 59.8643C471.648 59.8643 470.618 61.3743 470.618 63.5643ZM471.578 63.5643C471.578 61.8843 472.188 60.7043 473.358 60.7043C474.518 60.7043 475.138 61.8843 475.138 63.5643C475.138 65.2443 474.518 66.4243 473.358 66.4243C472.188 66.4243 471.578 65.2443 471.578 63.5643ZM476.52 67.1543H477.66V65.9743H476.52V67.1543ZM478.098 63.5643C478.098 65.7543 479.128 67.2643 480.838 67.2643C482.538 67.2643 483.568 65.7543 483.568 63.5643C483.568 61.3743 482.538 59.8643 480.838 59.8643C479.128 59.8643 478.098 61.3743 478.098 63.5643ZM479.058 63.5643C479.058 61.8843 479.668 60.7043 480.838 60.7043C481.998 60.7043 482.618 61.8843 482.618 63.5643C482.618 65.2443 481.998 66.4243 480.838 66.4243C479.668 66.4243 479.058 65.2443 479.058 63.5643Z" fill="black"/> <rect x="387.582" y="47.0941" width="32.6536" height="32.8" rx="1.4" fill="white" stroke="black" stroke-width="1.2"/> <path d="M398.102 60.4743V67.6543H402.792V66.8143H399.052V64.3643H402.222V63.5243H399.052V61.3143H402.792V60.4743H398.102ZM409.719 60.4743H408.789V66.0243L405.059 60.4743H404.049V67.6543H404.969V61.8743L408.889 67.6543H409.719V60.4743Z" fill="black"/> <path d="M575.033 63.537C575.033 65.727 576.413 67.237 578.293 67.237C578.653 67.237 579.003 67.177 579.323 67.077C579.613 67.597 580.283 68.087 580.923 68.267L581.483 67.577C580.923 67.437 580.373 67.107 580.083 66.687C580.983 66.067 581.543 64.937 581.543 63.537C581.543 61.347 580.173 59.837 578.293 59.837C576.413 59.837 575.033 61.347 575.033 63.537ZM580.593 63.537C580.593 65.237 579.643 66.387 578.293 66.387C576.943 66.387 575.993 65.237 575.993 63.537C575.993 61.837 576.943 60.687 578.293 60.687C579.643 60.687 580.593 61.837 580.593 63.537ZM586.819 61.897H585.919V65.047C585.919 65.917 585.259 66.407 584.609 66.407C583.839 66.407 583.519 65.907 583.519 65.167V61.897H582.619V65.357C582.619 66.467 583.249 67.237 584.369 67.237C585.109 67.237 585.619 66.847 585.919 66.407V67.127H586.819V61.897ZM588.274 60.987H589.254V59.947H588.274V60.987ZM589.214 61.897H588.314V67.127H589.214V61.897ZM591.256 64.517C591.256 63.317 591.896 62.577 592.826 62.577C593.546 62.577 593.976 63.047 594.126 63.687L594.906 63.297C594.656 62.427 593.916 61.797 592.826 61.797C591.366 61.797 590.356 62.897 590.356 64.517C590.356 66.127 591.366 67.237 592.826 67.237C593.916 67.237 594.676 66.577 594.926 65.707L594.126 65.337C593.976 65.987 593.546 66.447 592.826 66.447C591.896 66.447 591.256 65.707 591.256 64.517ZM596.811 59.947H595.911V67.127H596.811V65.377L597.571 64.607L599.371 67.127H600.421L598.211 63.957L600.251 61.897H599.131L596.811 64.337V59.947Z" fill="black"/> <rect x="571.186" y="46.4941" width="33.8536" height="34" rx="2" fill="#7CF178"/> <path d="M577.475 67.1041C578.925 67.1041 579.755 65.8641 579.755 64.3841C579.755 62.8941 578.925 61.6641 577.475 61.6641C576.745 61.6641 576.165 62.0141 575.855 62.5241V59.8141H574.955V66.9941H575.855V66.2341C576.165 66.7541 576.745 67.1041 577.475 67.1041ZM575.835 64.2041C575.835 62.9841 
576.575 62.4541 577.315 62.4541C578.295 62.4541 578.845 63.2541 578.845 64.3841C578.845 65.5041 578.295 66.3141 577.315 66.3141C576.575 66.3141 575.835 65.7741 575.835 64.5741V64.2041ZM583.603 61.7441C583.513 61.7341 583.383 61.7241 583.243 61.7241C582.533 61.7241 582.023 62.1241 581.793 62.6541V61.7641H580.893V66.9941H581.793V64.1341C581.793 63.2141 582.403 62.6141 583.163 62.6141C583.333 62.6141 583.453 62.6241 583.603 62.6541V61.7441ZM588.879 64.3841C588.879 62.7641 587.879 61.6641 586.449 61.6641C585.019 61.6641 584.019 62.7641 584.019 64.3841C584.019 66.0041 585.019 67.1041 586.449 67.1041C587.879 67.1041 588.879 66.0041 588.879 64.3841ZM584.929 64.3841C584.929 63.1841 585.529 62.4241 586.449 62.4241C587.369 62.4241 587.969 63.1841 587.969 64.3841C587.969 65.5741 587.369 66.3341 586.449 66.3341C585.529 66.3341 584.929 65.5741 584.929 64.3841ZM592.416 61.7641L591.336 65.6541L590.236 61.7641H589.306L590.896 66.9941H591.686L592.776 63.1141L593.866 66.9941H594.656L596.246 61.7641H595.346L594.256 65.6641L593.176 61.7641H592.416ZM597.123 66.9941H598.023V63.8741C598.023 63.0041 598.683 62.5041 599.333 62.5041C600.113 62.5041 600.423 63.0541 600.423 63.7841V66.9941H601.323V63.5441C601.323 62.4341 600.693 61.6641 599.583 61.6641C598.843 61.6641 598.323 62.0441 598.023 62.4741V61.7641H597.123V66.9941Z" fill="black"/> <path d="M612.14 67.127H615C616.45 67.127 617.36 66.397 617.36 65.147C617.36 64.237 616.82 63.607 616.01 63.387C616.56 63.197 617.12 62.737 617.12 61.817C617.12 60.617 616.31 59.947 614.79 59.947H612.14V67.127ZM613.07 63.027V60.757H614.71C615.67 60.757 616.2 61.127 616.2 61.897C616.2 62.657 615.67 63.027 614.71 63.027H613.07ZM613.07 63.847H614.96C615.92 63.847 616.44 64.347 616.44 65.077C616.44 65.817 615.92 66.317 614.96 66.317H613.07V63.847ZM621.196 61.877C621.106 61.867 620.976 61.857 620.836 61.857C620.126 61.857 619.616 62.257 619.386 62.787V61.897H618.486V67.127H619.386V64.267C619.386 63.347 619.996 62.747 620.756 62.747C620.926 62.747 621.046 62.757 621.196 62.787V61.877ZM626.473 64.517C626.473 62.897 625.473 61.797 624.043 61.797C622.613 61.797 621.613 62.897 621.613 64.517C621.613 66.137 622.613 67.237 624.043 67.237C625.473 67.237 626.473 66.137 626.473 64.517ZM622.523 64.517C622.523 63.317 623.123 62.557 624.043 62.557C624.963 62.557 625.563 63.317 625.563 64.517C625.563 65.707 624.963 66.467 624.043 66.467C623.123 66.467 622.523 65.707 622.523 64.517ZM630.01 61.897L628.93 65.787L627.83 61.897H626.9L628.49 67.127H629.28L630.37 63.247L631.46 67.127H632.25L633.84 61.897H632.94L631.85 65.797L630.77 61.897H630.01ZM634.717 67.127H635.617V64.007C635.617 63.137 636.277 62.637 636.927 62.637C637.707 62.637 638.017 63.187 638.017 63.917V67.127H638.917V63.677C638.917 62.567 638.287 61.797 637.177 61.797C636.437 61.797 635.917 62.177 635.617 62.607V61.897H634.717V67.127Z" fill="black"/> <rect x="608.025" y="46.4941" width="33.8536" height="34" rx="2" fill="#7CF178"/> <ellipse cx="628.437" cy="63.4941" rx="0.75" ry="0.74677" transform="rotate(90 628.437 63.4941)" fill="black"/> <ellipse cx="624.952" cy="63.4941" rx="0.75" ry="0.74677" transform="rotate(90 624.952 63.4941)" fill="black"/> <ellipse cx="621.468" cy="63.4941" rx="0.75" ry="0.74677" transform="rotate(90 621.468 63.4941)" fill="black"/> <rect x="381.508" y="427.318" width="265.1" height="43" rx="1.5" stroke="#CDCDCD"/> <path d="M548.912 446.314V445.474H543.092V446.314H545.522V452.654H546.482V446.314H548.912ZM549.736 452.654H550.636V449.534C550.636 448.664 551.296 448.164 551.946 448.164C552.726 448.164 553.036 448.714 
d4+vet6n4YYD2EPEQ5ZNKRuyOUh7/WH6vvrC/VL9HfrX9P/ZMA0CDLIMVhu0GBwzxA3tDeMMZxsuN7wlGHPUJ2h3kP5Q0uG7ht62wg1sjeKNZpmtNmozajP2MQ4xFhivMb4pHGPiZ6Jv0m2yUqToybdpgxTX1OR6UrTY6bPmbpMNjOXWcFsZfaaGZmFmsnMNpm1m302tzFPMJ9rvtv8ngXVgmWRYbHSosWi19LUcrTldMtay9tWFCuWVZbVaqszVu+tbayTrBdYN1g/s9G34doU2dTa3LWl2frZTrKttr1qR7Rj2eXYrbO7ZI/au9ln2VfaX3RAHdwdRA7rHDqGEYZ5DhMPqx52w1Hdke1Y6Fjr+MBJzynCaa5Tg9PL4ZbDU4YvH35m+DdnN+dc5y3Od0ZojwgbMXdE84jXLvYufJdKl6sjaSODR84a2TjylauDq9B1vetNN4bbaLcFbi1uX9093KXude7dHpYeaR5VHjdYOqxo1mLWWU+CZ4DnLM/Dnh+93L0KvPZ5/eXt6J3jvcP72SibUcJRW0Y98jH34fls8un0Zfqm+W707fQz8+P5Vfs99LfwF/hv9X/KtmNns3eyXwY4B0gDDga853hxZnCOB2KBIYElge1B2kEJQWuD7gebB2cG1wb3hriFTAs5HkoIDQ9dHnqDa8zlc2u4vWEeYTPCWsPVw+PC14Y/jLCPkEY0j0ZHh41eMfpupFWkOLIhCkRxo1ZE3Yu2iZ4UfSiGGBMdUxnzJHZE7PTYM3GMuAlxO+LexQfEL42/k2CbIEtoSdRITE2sSXyfFJhUltQ5ZviYGWMuJBsmi5IbU0gpiSlbU/rGBo1dNbYr1S21OPX6OJtxU8adG284Pnf8kQkaE3gT9qcR0pLSdqR94UXxqnl96dz0qvRePoe/mv9C4C9YKegW+gjLhE8zfDLKMp5l+mSuyOzO8ssqz+oRcURrRa+yQ7M3ZL/PicrZltOfm5S7O4+cl5bXJNYW54hbJ5pMnDKxQ+IgKZZ0TvKatGpSrzRcujUfyR+X31igA3/k22S2sl9kDwp9CysLP0xOnLx/itYU8ZS2qfZTF019WhRc9Ns0fBp/Wst0s+lzpj+YwZ6xaSYyM31myyyLWfNndc0Omb19DnVOzpzf5zrPLZv7dl7SvOb5xvNnz3/0S8gvtcX0YmnxjQXeCzYsxBeKFrYvGrlozaJvJYKS86XOpeWlXxbzF5//dcSvFb/2L8lY0r7Ufen6ZcRl4mXXl/st316mVVZU9mjF6BX1K5krS1a+XTVh1bly1/INq6mrZas7KyIqGtdYrlm25svarLXXKgMqd1cZVS2qer9OsO7yev/1dRuMN5Ru+LRRtPHmppBN9dXW1eWbiZsLNz/ZkrjlzG+s32q2Gm4t3fp1m3hb5/bY7a01HjU1O4x2LK1Fa2W13TtTd17aFbirsc6xbtNuvd2le8Ae2Z7ne9P2Xt8Xvq9lP2t/3QGrA1UHGQdL6pH6qfW9DVkNnY3JjR1NYU0tzd7NBw85Hdp22Oxw5RHdI0uPUo/OP9p/rOhY33HJ8Z4TmScetUxouXNyzMmrrTGt7afCT509HXz65Bn2mWNnfc4ePud1ruk863zDBfcL9W1ubQd/d/v9YLt7e/1Fj4uNlzwvNXeM6jh62e/yiSuBV05f5V69cC3yWsf1hOs3b6Te6LwpuPnsVu6tV7cLb3++M/su4W7JPc175feN7lf/YffH7k73ziMPAh+0PYx7eOcR/9GLx/mPv3TNf0J7Uv7U9GnNM5dnh7uDuy89H/u864Xkxeee4j+1/qx6afvywF/+f7X1junteiV91f968RuDN9veur5t6Yvuu/8u793n9yUfDD5s/8j6eOZT0qennyd/IX2p+Gr3tflb+Le7/Xn9/RKelKf4FcBgRTMyAHi9DQBaMgAMeD6jjlWe/xQFUZ5ZFQj8J6w8IyqKOwB18P89pgf+3dwAYM8WePyC+hqpAETTAIj3BOjIkYN14KymOFfKCxGeAzZyv6bnpYN/U5Rnzh/i/rkFclVX8HP7L/ZYfF1RZJOVAAAAOGVYSWZNTQAqAAAACAABh2kABAAAAAEAAAAaAAAAAAACoAIABAAAAAEAAAEsoAMABAAAAAEAAABQAAAAAAzWfrYAAEAASURBVHgBjL1nry3ZeedXOe58crj33Nj3drPZZItNUsEaURoqYDjCGLDgmQFkwIANvzP8zoC/w9j+DDZg2C/GAWOPBAEzliGJpkSKbLLZ8eZ07sk77125yr9/HdovDBjwUfNqn32qVq31rCf8n7TK/L75J8b/jx/zw681H3/2/77QNO3RsMkLK46qq0lT5NcX2JsbfDC7nWaxbJLUuHvDXGdmklXbQ+vtRbMzMqsm34qtvHKuVo3rWMt1fX5Zr9eGZRt1ZUWRGQam6za9jsH3ScqATScyi9JgtKZpytLY3jAXa6Oum0G37Af2Ktf3nmOPl9nNkZ2W2ch3V6WZ141jFV0nG9h23jSWWQamt6itsqldM7jKi8hxkqoK7LxnO0mtaddNGVj81V3W2cDKu2blm2ZtrPcbd27u/zCxssos62Q3mt90vGXDaHZa21ntXa3NoqofPbOGw/rWbtlxy9CufMsqGndZNraZbLl21vCIyTt2502d9a3GNvrPS29WMKAzS+rQrULXnSSshc9cac2TuhfO3unylPktx5s34UW52nXcpPEnZRVYTNUumqxnBdPKv8wb1ypiB/LWvumsa9bLI3RN3qRDUcBojMEvx8nNvn+ZZJuhlde1Z9lZxfdsx+RBZDLBxgivqvWWbZUGt0Cx4VerouNCqLznMrI/KZxpmu51GtOwCo1wTT13WWUDJ+9ajBCdl3Zee5frbCuC/t68tNMq3fRYlD8u+ADRKp4fa/5MmAdBZDupuMyAr5a5fTlr0szYHJjrtFmnpmMXt3e4nb0200LbHbhmVhrjmWGZZhQaecFfDd9rAs+crwzbqged5vOn12zpHN1oVusmy839Hd376rhO02t+s4dDk7u6Mc8VjxVl+ebYuXWzOjlrskxjtj/O7k6zTqr5/P/+wrC6XX0uCmvQ13NdxyyruheZSV53QsM2NWBecgmM0bw8ZjR4o7q4sLe2jKauZwsT5u91zd93/5np+6ZtM7pzsF9fjZkcF5mB3ywW1XSmIVyv+dZDe5Y0tm2eXRn9jnE5bo72rbOx4ThGVRlhwL+N516vodobWavMXCaMZjpO9fAIwiF1sJd5fF7dOygjd73jQvToLGcDeIS9zu3TiSRtnRY3N2EIzduxrNnarMRMxW6fK620LEaBsyrsybrxnSr2a99uby+SvRBGD0/TZCdoLDjPzIaWnTTICYJkmIadiaV4aN4x41OxMpfxvZM0yRbcbLgrw58gqEbl63uzNODIq2/WzspyVohiY6cmg2z+Mof7EePlngO3lZHJsO6KB9XrDTu6qkrfYpDaMd1WqvngzaoyttK+hfBwy/LA4npmW3QMd8HEmjI0O28r5KpxTDiblTprbR50KwfB6XfC+KSubbN2tYoyMioPGWvQDlzDAosIYUB9
GE6q+TNyGRrBuObRVsUtdRFZWd/snMDfBsRc7treoukc5+kmu9CUPlOVJKMv+GuyZbJwngUFeG42QB+JIOFVmYwcq2r8acUMp3c95Ce6qBYHTvdNWYbWct+CdE5ieLPGl25iFdXywPPnVXS8Lno+j8t6Zu1otGDc6CmO5mwnNXpqvSsR9eZVGaEv6mTDid/myCQzbGwLxZoP/aJjR28T+Md/NU5vjeyiNmqtl78aXOZYEtF1Zjh2HXjroxjFhIqxrxaob/S1OZlXe5uGSGRZWYnu4xZEBW3OispBaKWoTBs2mz3odV+unRdnTKm6vGpq5lrx+frH3hilH972z1fImzGeGsN+NepIf5UoEtNMy6ofoJvMorYTqQaY2bqYNnluRlE9niC0NZJVVWYnNntdxz7YMyyr7oaWeZBuRHa6xeSSrcidY9Y2WH8VIN81et0O4FozWKzqODA6B+js+tYO02UZuqDvY3bshTQH30hcm0bLk2TW6PUythsrsgY37XnOLgx/Ms6ORpof93bd1X432AjtNaP1UJwYIj3ahR877Io7yyfvhvCfOCY0g4lTH0XXIhSMK5QxEji77cCXRRTO7sjCxMfIntEMzWIhO5aNDGeNohXfwyVWySOMlPW5hjc1YWIksPKM1b4YTk+JTNgadrdT6fV0pwrf2ogoE54fYW4Mb9XAQEUHYTAQDyaz2rb5VabHMPKexKNILXgdFpHK71hFzyxj08K2RJDILKNWqnONY1bGeouLmY8JR85veeFY40RvKmy1u2hQHBonMBBRF6nrtCOEEhWJXMzSDBex9Ey4n4tZ8nrbxnI2dhOdFEXsw/dZ30bXsPwqNFJMfeJkXa7XCFCAkZGH9Tbqxsg26vgYFSXBzqBhjFFmmU4+YF1IkYkOksDbWjXkWtxwGJlrrm9B8llvmFaTdzx4BkRg5SF/Rch5DGuH41F8kAL6IPxh3iCBMDGqc3HTtXKeAcVMc9etba/2DC6zM4koj57ej7uvs+TuBlxeu1bNkqcFQmVYzeow7D0q0NrGYlV87ZCNK/qudVXLOtlW1Q2cNK8DB35jVkVsRacZth2BZ4bx8xmj5YMIxWEt82BSppuB5+/D1dbeZt3x3PEaQtWffIldaW7sOqsyudFls8BJxmxhIxfjKVbX2ds1bNvCervoaLT4BENlerK0dW8TS2Me7jWO3WwP073Im+bONLHvbv021rz+4klzdumuK3u6soBGL87MEJEozC+eOYsMRcNAziJvfNuZrrGwDfOQXDmoGWuRGk9fGU/f2Fit+dKcLRB6M0mz9w4d2zOyvBl0uAZzgd5CwiENTyw3YzS0FBjYwBVSCl9MjI+/sI+vgAfAKtQwexm/AaNKMcNPSCMbz35AJuyFPyu9RRWcLOvIzXuATtm09a6VD9kwc/gY9mWTZDq0eZ7+Cy+lFyRCC5kR/uRP2qeca1g7b7X4snGQ9kaCxAXhmVFFTAAQa3otEmHXMT5M2MnhS8OfN1wA9zOg1WplPiAPTJvbMbk8cb0DIETOZabSLbPo19leaWWWuzTLDlock2c6KQ81UfPZpldq5oLE2VCf+c9dG+mmzFrRQ4O0ki6bhigyPcSSLzUT+BuLl42wq3XeR8ezanjEQp0VHc2BEbDPsCAfuIW/MlukjmclEj+ApZaZDxgT3aHve6+qyrMk/0xyDaEQZgk5FJBxBt5jJ6FrY0L/7pvW6o5aWklgQSCy0umGVpRsWlWoe/mXmaNiWcJ6R3NjGkjg/K6Z98UeTJXLGtDcUGRkwlJPAO+uZhKdFYubPpNnTEYQlIiAEIbwed9LDjvldh+lz3KWB3giLibUBQ7vBuUoBkaBVOUaFEzD9q/yKnK4LB8GRa/1SlJjdTNCX4Cxs74DkztZk214ZVeAsbm119zcSbfCfOi6C1lpK6uTd7ebYdcaDqztzeJgUPWjxnfr2C+HoeQFxyrwqo0O2qHa7NaRl+5Eb74fpSOUt5Nth/Y95xvGHHXUWrDVql6uMJQMjRDixQGam+XKGA2SQ3jBQA7Nq5kJ/hxPcVesIIQ6+lmsMbWCvL5v7G2bWD/PM23HfHVSXY1lQMvKeHvePH9tnI3tbqfqQa3aXmb2+RRL6Cxz92pVPX6GtNgP7xbDEEXrTbNrEKJ/k4IL3LMZw0JBb4qaAW9Y/tlaO2nCwXUrQvXkPavcLGEJRio7KEIZDRdjCM+BxSrZwxxgdlolmzafYRFUGaoX6WXL4WN2ms1b78JSRu91heGFs2FTxhFOBAHamEdhWpgSU+AvG2QM2eAaLBIigeRLZrawAGJQTBx2NWzRV96zmBXs5c7gTSO4lERJNoQ9ZYiKHtafqZqMH07q6CIHtzMB9IKEEEWTSpi5C3sIG6WbcFJjFZgmjcDSYE0+5wMLLudi2J3x4WC+BHOyxiIy022WbGLWmB66CUWAuQNvl13mJtuebzQgcBl1iZCZ7GCEr/WX1A1Uis/K8LLIBw6TYVF6StagevA/IVMVaKqiG+axRezMCiJck0iKYIVfgDQ2DMXEVodgjQbLJnXgQ3+0ISoA+S9tDBvy2a6iRUaiHj4I37iJsdqz2CY4IO/a2dBhFaBiHsTW4JqyankirahjfPKOzV7nfY+Jae8czRnmIV7AlUWsB4FKbCIPG3JU0qFV+xCNhaAuKzaC2AFQOe87LJDFwqWIYjH0GZm9QA3VoeOeLkwA/51O49kogvTmoOnFdcfHJgMqmWrZcWCwyUcAeJQ4Ym/ad977x7h8dr+P0yluPti3NkfN3qaVZA2+5rADisW8ZtuBu6otkHEUoGbqrQGCYc1WTTdCwMysMO/erO4fGsNett81EEXfNaqmGfacKMZzLQ83690RSNr2/fTeJoRxJqm1Sonc4Acb3Rgn29rcsDCSaW4/fmOfTq0QAIMgectDj3/LflhudrKRJ8oOAVcISY1/IqwbOIh0Y6EjbQjqje3+Y+1997j2x/gn4j/Y8ZqD1/vSxHzmAj5Dd5hVMJInbzfsPd8nO2AbdsEAMml7etheBBBrjLQTGoHtzGRXuhkbiBVi168FiZ1IDqqyg4lgMvhOpmSbLQeYWOA6QkRwfFNsF2BRzIiTSFRQ+DAHphgG4okVaA0HBFnC8wpt2ER83DH9cQNDrA/EoMibjAmQEvvMAgth0fBKVghrzFDCyRhY1gGgEH/IdVT8Zl7DQAgSfwJrRJc1n1EZ4ZXWjmSyTKaEZCIknbeSEK7PBwIXYkfbCCY1bIrNgemhEbSSScSihgLbsLJkwJEeAe0zICNAKFQPsu3jFrX4gg/jD5p0FxQEpxLpgYwmsAKZxwwGVzK53Di/bTPs7GGdHlYJo2UWi1rvITa4r9Kq0A0DPr/FPkLw9tGQC5lEGSm0pn1nRSyfMBgAAZq7KdgehwXIYEGlZMPGs2BW/kwIiDVyuwSskscBJdMRVthc7xEXYusLbG8wRn4kz1hIhFMBggpIlWMt0fIgrWwr5EvIIqkGNy2qyrdBv0gpjgm3rPbcpnT6j8zu25rn2ju/+QMLkwVo2hrWt/fAu8VWt4q9uhNgfO3JEgmUzbU
cHpxu+96c1UviFf8J/XwUVB0vvT2sQy8busgDIHO966N4yp67Ogy8xGy6IT5hPvSKvm8EQTZyUSceIj1brb5zq97uF4Og6Ht14NZb/WqrV9/YaraGBCHtJCcME17kjgIqMm+rHQeyogURJDAMT1nejua3/bzvTh+gYISCsLvsQXTRLA/RMzCrwT7BGWw23+cbNS4Z/AFF4C30PfyBC4SQ8HOtgJMdSR035j0pbC6AmYouJDSDsUQ03RY7Yo4Yin2FdXZ+muTEYHEyPZkObTyiXkk8kBhxDIyLKQiMYlS5vcy89Pwriz1D16abGhkeZUx4Du8Uz0feI8aEp9RcwBMbf9ZeEJjd15oto/FXlsC2C7CxS/zn6NH8QAdYqujqG9GqSyQJOwnatKKLOtmW2yxwLquLDddlyLmYCTkhBIbizRhIphLqSQfZrUewEPH1FEBBYApao3InmjbzZ6oQRPqrjeswFCzINPiGXeO5GrlVN4j98lZjryx/LOSM6F776kAJBAYCYqURDMkkfsEOUW6b8BhKkGmzIxDZW4kBcERZnWwaShZ101p7FiVDBzFRNwGIV+tihoxMsIo4EySKLiuklA8t1jX8BT6qjYa1S3CNLf8Ylz7+FZ6XtYeSgvRAXHSfFV4VfMaJALViDISHuw6RpNU+RtKFXVOsK44AGxvwDF0WnaWIIrMlUhWOq8qVQiH2Q6jZfjf+LYugMJHGHs6B3DOrALrB3yibmqiOAk1xK1SRE1zxTAuDTnwZSCBfTgFleLQJXk48ItSbPmuGdjwsuEgJrDvncwHxRWbVpn+xInjlX2WwDYtEyN1Z5l6umhD7ZoVv5niMzmSNPDMTe74ut7remPimA025EtWAgoH/+A/6gRaiyxLrLSTATlhmdIk1qLKRqAlbo/l4CNvJngFy4KrWsRGvKCi3EnXE+orUiV6dN9o2uErWCTsREXuQxYAnGjRXjNqTW4V4wAfXNpMtbMGeQfiEjYTpe89kMNkqHpfsSlrYjM6xNIg/l/OW7hr1yq27GHGGbclQ/MqDis+IFCuywn5jowg2yrCE7RxKsT7PwvqBS68lkF1kbqyOH5YGC/IsdAd3wfryh3N8SOEuWJwVQRb+n1CWK+8X3pI6a9rLOla6JdUTTBvoRhwYm8BdWEj0FyPjk3MxUiTGwvITpEVLk/WBw9DgBBd96Qj2Atm7lgqUNd+wL0hy95hwNOpKz2VAUChmufNKE2YvuJL5K7yUogsUrwJjVxFPwsKbQJvOS4aSbWeDsE7xuRInogzeRBvWEg9o8Otgnp7L2iEFs+J6/sbFWFcZyYG2j5GXN7Bgmgm2DtMKGdk1oMT1lmEkWSb/ogrZVjRjOhIluYb5MwLr4kabKH7HLrryP5kSz2IoEA1kZ7+YBvKPCWXt6cgD6LGz6cDylzXLQeCFFzZ9+87JpnF26TjsnulMlpAY62e9OrXfkl1wTTJ4Gz37fIaRTXYDhN6dI05GFeOwCvI6We1OUiSn2O4qIFs03lVKuJmcEmElzBd5Czuvqg5b1DgXczPNQaomabMXp2/+5PDiu8HFd6O8CzNayX40uxcVG1G65WIt5w866chFPhF7vIXljQDvGUJDbhYGTaEai4/OSjAbqQ4UR3yaEbNa7+BbiAUJISCcLFs7xOoSA9MfXhKlMXsvpD65BjHgezhs4/McKCuVKVK2tyjSgPVowHvpHnYNBcvmCZ2yBxCBrYXK2Kj4rcSgVefXwFIcL608xyAb0amwjUuCsxVIzKy/u67mfvepPXjaROeKP6DIO6dlEdudt/l6243PBHuYORNAU/AgwCTiwTegR9gClsXWoV9hCGwpInG9XlgKwwX7okryoRQNosVkuIb/GA2mB0QxjtxXAJUipQJmfInlgWURMxEh1IpQQxjPbNgk+xXfMyZE4F48bZQXgBwswMXciNqSCBUGZg2aI97IAzKPdQVkic7C2OJguBmth2VG0rrkXSwzvqgkUbE0UTBhZIu8CDR3VxZgm2/Ya4k9gSssbQwpDPy91Q0BDQEBi28QSAOHlkFwzkEu+QjvmvUKBfhTbROCRIyHabCudKMB0TA37oJiGFtWoV0gdpgBLjTINUxACJkwWgmaI8Bt1kc3mqWCbaAtNJpMNwnVolkcWehHfuVK9Je0sydlypWMwLAsEOpBJS5AB7EuxFVK8Oj3/pQsYz3s1p980dw9xLAQycEiVXf3qxE60CGYg+uFNQvOyRNaRFNIv2aj1jotawwU+TqiQPJcCDh1XflpdQMurWJXD5iuTLL5JMcxtsM4vdE3PNi/qbd60wce5sLO8QEadoIdYjHsHBMlgocwQBrowr2oGTgRFuFLNpi8kxQwGWbCaAqKYjpa14vkTw3HSkVBO0F2XIIaACnrRCQDovCIzmnNBfAx94oibeQAc0P6Dl3LVgUX2gxMN0FC9pgrq4iQOX/VtvGNu5IAw15sNjo7OpVNA5tJZgjW6V9tQ9ayArFNthApguJww/pWWa48b2IXfQEeTFnnpBx8NnV/+aI62FBi7QZDawlyMFqGZv+WN9Hxck7Q/UA+FAGqBPnkM4OvD3EfebqZwQSKP0lVAymhj6bKSsk3tfESFoiGRsYQGz6jVSAj4FBMxtNwBYkxkiaVF4rW1F14a1gQqCHzuJRZZtcibFGsvWCzGEfTKNvUH+AZ2vo4wxI5yML34rkN3Cp9YAtYDo9jcIwGGHW1p8g2ooKkCd/2hFnKLjIrIybt0Jfk6H/MDWFrLbNyTkkrRYgr7nSbUmLtBdPmoXDFSrzBU2QSFUflfgkb84EsrJxtZeZyiVe40JYsahu2hQjy82MmIFOGHAIH8ADRQdyL0JJWZUrtvfoGevJ0AmkoBb4U/SXR0kSMgBaD31ArLEoTKDV/HtTyueYjGbl78Lv2IlFmoyiJwRBiUSGC61VdEqMOkVb3Cq+uyXYiZ4Fn6jjHV6TUGSI8Tdxxas9W1iJRTnmENwCcAL5azuWyGJHWq9yzucI2rK5HSNcHE2MhhXLrZvogJroNTvNmRu9Vvt5V7j6Y1mjQ+CQFVdsEABN8aBuBJNONGWELWR8s29pGxEm+e0D1xkppAJa4PHTQDrA7XIitR0NDEdYMpOEDW8LeBLM6Osnmd7DPold4Ca6WI84I2q1WEUCa+JzyDpjeSPaoj0FUrKqvD9yCktb+tfKGKXCW4kiCLnAMmXc2Dz5j7xkE/XK9MXwDszIf9DF6s/aaKm6lSLZUauj8u/HyG9s4Kqt9T9OWNtWGaY+xPF0zPFM0XzFYKjGEgTVbNAIsi91r2jiKnoJoJQZ5AlQvPAoWgKVQwFwm20J9z1JEXu0rqwYoQPwIP0QXKr5hsdgxVAYf+E8zEHq9fpYZXGpWEJC4FMFkqb/W2WNufInuY8mQmjgqRpJHoARlu0QEWRLZ2FuZPXEke1ugdK4UVkdUmDCDcO/yqC4GIA1L6sMxeZBiAe1DWQgmEWqgXBgQqx6ea6XwD5PpnFU8Dp3ITLCHAJDRl+yI9EV00ixviiwilA899ZnAEo9mE/kSXXm9dpYrBdHXhKWG8BvPr7
ehhV3Kx+r2CujE8rGFMV6uYjnwDAtkH2VXZ3out0uSWxPKB8ZC/+obovFM6azEq8ScMBO4EQPjYLtIjONu2XvbZS9gLFjcLWvrrz8O49g82GWMcqNz+YHr3O3jgFp3b7MAGMUqffEiOYKyLX2Qb0CsXNvW3McjFgM1d0PEFcgK0QtABWmL3EZziLJVQwoOhYGqa+s2tPGErVh8EQf81cqaZNMhfkOKmTH9GUlCWXDVTJykZRRiJIkSsYZ0w4VT+bX7SnUwYBXgO09hKOEHPOCV/BzmDDKc3Lcn9yP2j9Q880fFA+oWN1tmnYqB+DLdQv/JLyRkgskizom9MnCB52xSU6OyMJh9cVhN1rdNJ3Dx9KG4pPZqew0b1biRTi+vCssce73HeB2gEbLJTTWALqZ36uSbVY4oGk5xZUZnvyIdGyPzQuZj1gQTEiQWNW6Y1uUR+EJPbLGcCgxQ7fmw1sRYw0xSmuzK+YFNITV7hDqXQcN087XspLKUJOXsDE9e1Quwi3/VtFUHiGvtZCY1PVyj4AqDgtA8IzqWsRWrEQJpdRbYFX4iVBvOFIxltqQNi1Cc0DKfFsJ286D1DXaRahgBgcprjBkUwXlugguLGLI7ttqYM7a0RfWb+r765qpIY2TMm8gqgvoQCTkvRJ6uWQvthmQSCiBRyaxaE7fetJEQloNA8ixEi7QQPKZlDiCF7BjIlmsA2MGFgACCROnF6hB2ohxFgseNy0Opquv5OAK9wqUQv25pyOogTjht5kdCNMQI8CrhKPA/z2VzkSv+6z8T/rzeKWgixbeiBMLCVZ7dJn1idF67aFt8K9hy/C5mibSc923DdVZ3epS2LI7IVzp538aFCyn1ONxJjvr5VoQUkbVkovoPkVPonL1Udhu2w/K0+l5pK3bLW4td2iulIBEeojhiINinFTl3jX8lLcuVmjd4ICH4XiOuiDHFX2gLPhCbwUci2EPtAoIXTEtiu9cJooDiw5Ebn2TOLF3eJH4C8GhVJglCRkZHyPlubRGIC9+drWqT9VzJhHGuuAxaZxtai/jWExNf20M0nJhs1vLxCoCnOIH7ztwNyqzwGo8dZcOwDNpsMAb/glpxGpuwdjayxqulLGcoRtNYOt6p6xNwpyiMaAdwEQ+txYoSHkLhUwCIdo7PWE4l3DDFBZoe3SnpguNl5Mma3CqbfmnPbcWQAKKtIoeeZUxVHbegmDUZDQsAb7OCeU8OLctkd4issmVYG2K8IhE/duuyXileD+XBJqs9VfwwjdXDvKKGZliBwMHMFD9gmtg+STWqvFa0hrQ7CoJsAaoNPpMEhqQf6+YgMx6sV3dLVIyBhlo5nZfyFIgJ7965uv3g5OL1qPfRJTLpvPQhHXHmbIhtlCvIIEXmVbE0Czan7Iry8EkjCy+yoFCwhzwIy8nGsRx8MAQehcUOcgEmjiuRNyjTOamTTe2ysiMrMCcmiOIkMYBESDpFm8tduPTIMwgT3C4Ik+ti5IqNZpxrswbWkMxTabTBc9H+2E3ty7XTCOczJYgDj4PDGYT/NDeITBQ30v4ClPQIgAlRbmwSmIhsutKkpn3jGz/It0K2jUSHtwaN2IgN2Yxip5tvBFwXXGQkGfme2Q+eEFEtCZyAjClExAr13oBFZFhhGqxtfJYFb5fzuyHTIpmLt+ZfJCTxWrhiggPzgRxFdkVqeF7D95K6Vu8u97H3+L4llQqAImA6Uoo0jt8lsslWOekG0SARcXbPB/ZQCUERA0zQf4KJpOLMwRuZ37aSbVk5PG8WhZZCIAkkIADXrgufYXTJJJZkqwpP2Hthd6iMTmIC2o8NBTaYFHygPdiiQpHcTV3OfALlbBL7itTBK9ozHImbaYNopVZVWXePzhela05hHDkV6AKBK1vIcPY+QBrdc63FCBnJwYM74S2khSwQ8IaVKg8mV1MTxs4gb4sjkvsgD8NKVRUAa2abdXmAt2Q3Ae63+Almzbfq6I0in2h3aXRH8oPWiN7KdmUDyRJOHbwOQzB5qjeBDySm8VsgOHzMmpkMj6yHpd0p6k5Vl5Q0mEAACvdgOKJK2Sa81vqrXAc4XzSzB4390dw5WjvDzHGrZOUbp2H3cy96YVNqQ1Cn6jTvPHj77x784vFq5zSN//l7f//J5UHiAXDMa3peg1JwB2xDwRoLJMWKFsNuwPcQGcM7vye4SxoGv1eB67UEDPMC1ylptNViRby1nmKzkA75gewsHG8WSeDX1YFUiXYWwheUxdRQBl9UjBGa60NKbQ2mWqJV+xqNXeApEA0oRJFTMNaq2Q5oC4jlA5vFvaLAO1xA4l4ViBK/fTCSeAOPYL2PslYei3slZWPtLBNAXimrVCUj2aDh9/8RlexQk2pP/lb7ACdop8o6So0IuxMqALJeO0gEZP2z1fxeyLZJXbEH85ICAqKusooVTjnVIlxMitmwFWm0qUMhbkkeDwMLOYjVsv5rGA35cM3bygZZAOnvNj3N4NAJil8XWynipPypDBSX8SBQATRCbFBjoFkeMbvj8hy4BFpAGi7Qyin72qzKPmX+uhGpIFGB9i02SswdAgYdMRcYIjQxwkYgEYoke5QatvkDNEsXtV8ZmZazNVokuGeUxW7loFAyV1CAkREz8g0MtXk4+6MHn/398c18KpjL/yGuWNT4WEoHa5PsNd6lPXpnnDPDpWsMqIxicwz/Qmhq9IU0DnOOT7FLRHpBBw1bgLJjR8qHiXWm6kqu7D1nzpYzVd1XHQKS7fAUsuC5gwjENDwdVcLFhA2VXls0JG8YE8sfXsjgo8u4l0Vhe/HfuJhfET5IIcrcSr2wKOaUrljOwj786K17c70C1i+c+A34U1ugWB3er6ugF94yyiU57bifRtV5gLFtuFX9HBIhfppONRysPpkcPn2yd+/u6biITz/dDW8uS5jlHEvNbgo6wfFEmIIzm5gn+8v8s62K+QOvFvdwT8TWcD8PZR/hb9lGYp7U2VH+eu2jtjmAbIduDzgT0tGoIVKsDmsGZxcAqDAA4zA+HwiDo7+wKNgo6MZuSgMBKCi0PgNPSSlbRJ6JzIVKKkDP9X59jWxR2ewFgVZoR9BBu2yya3yDlrSiM7mIsBxMQjxPTNLSluATFgKiwZ/ZfgliEoTZ/s0/YnLd13my7bI98AHQjpQAsJPZR+c55LbLenkoC4Z4BHI9XaQUtoZ2uG3QqP88RbT4lYeBajCy4bhtF/IAVKQvqXtQFmW9o5gqu8JquRispV8t7GeFACNO3RfJ+bd81NL/g9CABKAI9rv3sryO2vNoApgKUrUFZVBNNaJoDXBRQj7AWN4r8cpAU9AaiOjKMbObbtV5QQW50lCGT1FJA1jyzsmOVmipJgTcyHTUQWMOc9NBkhsrQdGRObEHR9ONwWoYJNOEVGlZrLGwNiJUd+rOwWJ0Yza/itmB9SQ0ekSgsqtxl9oOqF/sFMVOmRBuvVJ1FSVd5J13N2e4Sl+7ffzezqk/ygof0IoGBlahWhVUZJJoYlYH/Vl4NlLMNng4X9cujGs9D/gSXcPC2V1/rDg+NORi+InAIDwKq4HQCMpRf
8OOgHz4lZExg1g/NgJrQERRt8C1tknnB6F2ADPiCltvv3OFF5lNqN9BPoxJESzPu/4bz16L7JLAUqgMQ4EoI5OEUohqG70iuLc4fHje7dEjZBsXXvTWHH1VI/Z4aOuvBlPH/e67zy7WnWefHEb3Zqu3Xe/YS/fL5iAtNqocJLWUy5Rvgxdwei24U4oevCCoJPsPM1zbIpgbIZSTRswTOhDKxi2aiU+gM4LKSlv8rOAcsgokwQMkKIW4ci9/hQOZOVLEKuB2oA1lA4DM8MyifAfhFCZK5ZKAdIi9MQJPlABjEGQq6JuRwmJKDMJoWFSJVleqBNXD3KTuBxVbz3ag45BJTAJ4WAsRSK47z9THw9D27kd/sPFF5r+aeBnLduJHV94Xx2HmOoUZvyX4mRE1JeFeDTpEciiM8i7WdmWPfnrlFW73i3F0XoRfnTefPfNOZlavTw+bT2QsbaJXc+944vzoc3e4FZwlFFubls2fCDYgb+h4PMnrHY1PFdP1xzlCyGSxoHAP+CE+KXtPV/gTnVe0P7jhOVDWoVATEEvAlxAT8Qjmg4XhG/7rHCsWzrD20sZhC88pMbWGn5v9n9ruxPbPbAF9HMIrmMaJX1Lnbh/8dR6cOeiw8EQFGRgTeNoekx11ghMHBMsIXB8/mM+T4Pf3v0osb7yKvLB0ukUcZeksqN5Gs3XonztYs3KvuJh2NwcUwpruKLtz7/TeNljcsTol1gg1iYKP9leDIP3To78Dxfzw7e2Ty8HOcPHu/slgZ/mm7uA+yc7bqokBBcBhWDCUHRGXdUJAsAqjvBhTXST3D/EgoiDDRYx0IBPKN503aHJkTO0UYtNAIR88TJiY5ZMKImYA0yCZ8lKWzfDzFS0F0J9vcBD4cn2z3NhcUDRhfk71ilFul3ZQkc5T9CK1uq9o0ZBPiQBT20CtDHdVqKBh4XhVMg/Gi072s2HwhU/0Utu6UE9m7Vjeh5P/5P2/wWv75Ef3q42Si4vUMfeysJ+GfpEXTmdrtf3u1Tgh+Gs4cVEWlIlRvi81RFErbhuqhBWxR9pBgRrZQ2K2XIYMIJxAJ00Y8wuk3ASgNZhxoYAr+qREHGQViwcxEXU5GoFESFE3CnL6JGyFLbmsUFmPpAjwBa+2ZrOtRtwoYS1kLJhQPMiYVPDAt9ov2ca2IoIZMhOEGUhC4opwOnMDZuPJMw6mgiul+DwNy8blB0XVqe2jd/5AeuyzF7bluJ88N6nk3t+25itZtqt5E3l0TFHtqZqYf/339K2m7+3R6saa3MslPU2U11Qb3eK9Q2NzCGqlK8SlNYNWUXpSqXebMWvfWiWk8sOPX9Lmi9bBtBJi4V9mg0/Cv/wU8hXxs13CdBS2hyd4ksAGwozUgnndF+oiUfAioJ9ATbRVZFM6RPMEqgXS81eiNUR3CHWCftl/Koz5t/d87X/2xg466jpNG8z45l+/Hfz1cfezcZSB6yEo67QIhfdeZRv/9pXjdl1wTkbuTrlEMTqx9c0qTT10wjL3fa9MMi9b+tXLOL41b7qlBY8u6XUwov3lt49ePRlvlqW9uoovLvrHr7eML+N87leDqtxlZmZ3sA7ckjLMP//4g/KCgjpn+ar3cj48v+g7MypgtYXxGREFq/+yZF+QCkqcKDHFMHKB8Spkgwnxs9mC0xTKdBXo7z0VyIf/WlDHrhu9l2hiuXBsvBgUVlY7BbEose+1HG58mjjPTkOKV7ZCSij5nidiE5aY7dRzz9z0oOpsrrb7y0XqD37mouzZLAZRIKcyiSDwi2KY+MNAg6nrzGmyAVrLzuQjcgxiYim+f3z+L977l5dV97/9+W8AN6LRen1JYtEwL3zv88B8EjpXbnURBjdWf3z/l59e7H/39ovXeTd65qI4+i8q2B1HA48L+jCy0zp48YnsJEvGieAD4Ataodb1dEyTbfqXpKNxsPls0e6IdwpeR7Xh8qAgtj6mQsAGRkpsaGI+F7LjXv4FIyCNcCPBAvLyzEH2LbGi18B+VSzJ2WkdS54oCNZt4zQwHeQgPdjCVAGfNncKqYtR7V+qbFD+LSXVLURH3bMcGzUxsamCr6k7Wf37HxA2DYed2d2YYA7xAISksYZEVvAJWQAPs+/dLvb79F9Uu0r3ZgMKYFULwlxRw05KR5mZvE+8GUZRQX184vpnvexwQLkqFxTv3yeMBl14IjXvcBVAgklD35Bqxk2sNvXNKkdOh2E4don6MCArQ3mvN9UxybBUzROUaz8TB3O48TqpDX5LBzT1tnFRMqrTZnFImBcA3BmNjs6+zayEQ9CpefdA1YOgOahG/QcVyVfN7D7hE365TRPd6gD03/hXRIm4Bvhh4BoBaAhOTdfhfBHa1NxHef999TWtV4HrlaBdDG9i9H/emPc2Lt8ue65TLV6ScmqG/+D0Vm/8k9c3y2NAszGb05lnPD1+FwgX35qtlkGxcnpfuvjoqFWAEwSRZ+jJy0XxUW+BzzwjJtGhoZl1oaeILSgW6lCsD7yg5Lr1bIWlvWZ5ZMSvYCN5Jvwonaiogwwd8AEmIzSnwCB1+Gfq2Ch+7Sj+6oLsM5uIcUO6IG8Se87EgYbJ2lpexum4b3Tq5U0DeAnpuJFgr1XV8xs4F0Z6K9vfm3xj4+1Pzm5maJ9lYL0NMBcmXWBXZdYjnkdTnfG/TH/tX/3tt4xO8ccfffyXr+93nrp3f/A0cvIffXrv1v9sXH3NZZ7Hn+9cDE9/951HZ2nX9YnPCWDTowTHkyeIX+P+KcdwDdSZCUQTuVhjpszk4sC++ibFwEa+W2BgkzCwn1JfKTiK09SGf9lHY3mn9IAtkcuSMePyYMnc7MEfEmasqzuxOq81MvyjLCKmbB+7Uac3CBDX7ivavRTopgiZmiQQEHdxO5YNuE7QC/IyH7ENPkLaRvvozT7kAkUrgjMJK8nnjB6xJ+b0a3QVN/bNh79PHBLuJ+WgbgkVnspuCp/0TZrEMT7chv2xLHtxFLIBbCGoEmnEXpNLIXiDruVLdQwQ8VKlL2sw+z85rk5Ond5gdQMu0JLQ08RCgZ2yZiUVyS2Y1L/oVMRYihNdQLQATCUYo3gdDAdaaHHIVCIHKzMaj8PvZ5Fcc+1SX18Dj6JpyN7M7xAXReZZAr0nisHId28TA1raCGUG9cXBSiEeFET2cNXY4Dqq7SFtWQ7ySegS9B/0s3zt9bvJdoecQz2bxcPBejKP0mc9742b0kdPvIdwyHY26q22Q5qCjVkS0uHM4N1+chhPbw6mr55SMwo+bNLcHQxWtw4vaOykHmk4Wl3iaHapByR8JZMOzTNEiDXWqjMkDowE1oMSf1VRO8XDWLLMggAVmV18DCIWA4IuHFShuAvrgnHVgmxSDgGU1FAoO0V9Ttg4TpQQ3cQ9FCuHEc45uSwgPbVUdGkowHOwTtUF12zdmFjDfGN7Pl9F3Rc8WuwlS0gFWWDO3quDYTpbRJbfnE566WnHmrjEjXgoBkRnF2CUCBVn8ZNHB9Ww/M++828ju/jx374XfXT1J4cfj8vOcdadf6Pa/vrl6MH4ctV5erX5Ozcfv00Hpyej4NS+
TkigFpFGAMLitlH2q/BUHU/4cjkhYjYRA28RMoBtbCJPoBhCu15QllekwyUq2Z0iiylsFLxESFyMD77xhBZBE3RNoEGdojNyTiT9cP8soC88xgIZNqWugIQxpQhE6GdOE1f0uEDtqldbXfQop1lgPUGVeR2SdsdBVeSc2WL3MHr6r92acpsjMIoacNPVxpVbhRHBheosJ49qfvgf/5f9F7n7V79UU/Dto2JvQDqfhdHEQD8ivRQQnb71zmcXTSfMN0K8R2O+bHY3OGIEKEgFDD341nQ1/dY2u7L5N6c0NdI7r1Ddi2NOx9DxMOQb9+msUguvezzm4JDi4QFN9KT4wvOc2m4yDcm2p5yYgfqs0P3qpiMdfJHPbvvdNwWmGMmMT3L0GanbtmWp7b7nSA28DuKuVD4jaJy5cEf5TLQRAoAUoM61lusqgrYCkzTJes+n+x4R7byUI0GlBdysWBkt9l0qgVo6lioNufqgdTYGOtfADCovyqvSLlPHuXRJxwe7ACMjpQbtpU+T7tb+9AeHn71JhwN3/a/+9a/juJNrEu4f1vG9WfoZ7UC0CJuLW/Xwzjgr3OxRj5wYe1B26Q01vSuLPDIRFPQUnSgcYCNGB4UqRmquj4DNhEHqJrei5yrBUmwwVjqLhcTHoCdhe1BQeEnXD/XE0sT4hERi0OWofGllCrgpK9uSqQSVSGPS8PVaJwNRdbS4HQKC5ncN6+4yW3nOOUWFRnEnJZJ09uk2rAN9YH0lVClTlgMpK0HsngfFr+U00oxCFJHRcF9NIsqXXnRqUbMyec8gIv0ffveHR/7lf/3lP1y86f17v/njk7T/2cVuUTEJ49Zo/M3Bm0138T+8/OiwO+262Q//8n3cJ9bIQpBnbGD3JclJfcMyUbhbH8MbtJVjherOKyVm2HQ4PtuurLWi/SoPQHxQqd2iOffltmEJ9orgjVcGTf+JoDv1j6oPUUmDuTrQfnEjBQzTh8hPizUKWQ7ML3JFqDZ+IcXK8mEq0CYbxNoprEEsiQBBAYRcjhv1rgHYmBM6NDgjUBI8fuhz3gehfsLvUmSJEj/sHQEh84P/9L8irMSquFlZKXZXipPtAbmhmxXJQdZRqxguEgz9FwWtxBRYT+4rx0J3Wf/Rkh6l+ZEHgbBOLbCWRd794XT2sJdTgyc6K6dPBhJ8goElcEzCBJNIszywM7jMJw+pDldVHlIBKYGISI4/qahEY4WodmIwLJhaEEEm5Ys0JsYTbuB7VgVRYAueC8kUHGMVZGaw6lLGOu2C8giN0KPJEB+D8eWUk2wgBVcMsJhmPShsv6rmXvjakanJoBHxaCvbpha7djt4uhaG2CMCsfbqAhtK66jsCZzH7SDYm//g1T8/+PF/d/xdZrQRrLaDRcfO/u7q1pPHe/Fz4LGcB5TrdXyMmBszZBqKCpwqGIDKACMJmYNKTKr5dHDL4gbwRFVapMJk+dnvQU3EiNQ5TEPgARbvPxI3kBGF1BwhBZvi/MAHLB+GWO0oPNB/zuE3vyqNQMdBRpS41BMVDjCr4grq9wVZOQjh29jg7IQ3dvIgG4yWG/H6Qe/8zz573z5FruStMDgqALLDLbd/8IwRxknEQESwcjrPlsi91XtMc5k4Z+ufvjrqEO+q//yr95qZ98H7LzqI2ef3rKUTHSv5BufM3iv/yXd+hgvyP/6b37AO1+GPY+7lh3+DK+3g9RazanwQqc42EnONSIGgLBwQRKNmZ2O9PEOFl85bH92Ex0SumO1GGPjBkMYvacozBo+FqoKrmnpDhAr5gYBAXMbnsB+qlAD8bMG1xkHU5/fqZpQHXwXkS2EJ5Gd1h0Zz55rTuFIcpboLPYa/Avv5IQ/ByNs/BbQ0x7/j4C/AJyxWHCvAKAMgpt3/xh+weXjwLEyIBXRH3qyH72Gtd+APJXxx2eXZU0ymnC9pHJuGNBX4qn+ZLkmbwn9OB+J2HFmoBjRi7HCi80IYnHuJHeHyIoEoDKJPUAEHl2Ap6RABYDrZd5QqBBlizSE3IzOat5ZDCPORXYFL8AkhHP3mQqq0hEDcLpXBbZ50VJdDfFftGRLFEAKlJ5JDajKQKMJTiiCPOKUIZEbMlRgaZC0IXTIZ/jU7FShxq7daVm6TqsSx1T4CfmhBs1904yzwC04aILKC/rbeyAVVUZvfdG7OCVpSvHI16drD+uv9tx/2X32n+3zkrkBc8xKHJaCQtiJkQ0yvV5H2ID2d3U3RCAWdmCc2DISpUbEiJ8GoKVH+HuAZUSGLC/bmAnYO6wqXYDZRroOnJVqM6HTlO4jiet/SCW46A0Yl1MyN/xBaSCQYlgjAcz1/grDsQv95ER8n/l/8jPC1QlltGQ3P1Z6Sv3Okm7B7R3fPO36+Ey7GeXT5xRYSOPqyII5F8uNaBaCmH7z7euSvn003Lk4G7idx+JVLuBgnOTqrrzkKY/XZi8PT//0GEXpYed24Lz6+4Z+CWajfwFIADlF29ifrXU6vWNAGMCWLwjEldfcFvRqKghCJlceUG6NHxeyuajZgAKldwwBSsmbNnFqZqZ2t/ODcxqfFbSM9qFgo6ObM6j8zDv9ibBjxddAFiqlej1qUsZS7RAGFgt1Tt4DyImyQoi/gbjpXCNkmZuexDfEB21apIHb8ku5eAW+sX3xaU3aLccMRA9Oi+JA0sglknjBUmBmIj+ZFehE8xo9BW7EOfAD4YDbMP3zvv+AIqvRrh/7JIt9GnOncS+l4oGmQO9V/xHFXBRNllghHM70fwqCrPQk9BhMGZQXoKmQaRwsFMHhSYL7Q6FhOwqSUnmHrO6+z1QFnEEJrEuUqH+08mjDg8j7dfkb0fL540Ee59p4n+cCjTiC4KjlwzT2d1YMYM8syeAr/gmmDc9IgDmclLt7bUBiW2p8pJ2RAMvq3VHEKKKV2fLnvT98Ro2BU0S/QKG0NBXuJ5ed6kC00hfTIPD4q42AfIBA4DSDBBfpVbd3KRHd2lHhCx0ODLHGJBLLTWDDnaJnNfffCpa7taDSZZcHlrOO61WocWn7VjH0cGIQ5Jz52hadX1/3Ce+sVN7K4SzjcII56f/vi8+Nd83VIyxycgRGjmUM6oq1vIlM/v0kekx5/Od6ksOA8GJcNpggWqMlmk8thyZw1qPIoJQ9lS1kdzEHlrc4UxDCqalfmi6HgOXaEZ+kaEjxjMACpC49rhl8sjn+vp9AUB7hc2FQUAe0Ae6BBRBpYSH8J1fZQbH7bUyyHA5q28MRI6xk7P1maWUUT9uyW13bTyQbS3oVhYaNb9U3Qqyn3M+vSG/3SnLxrlL2KbDVpBiKcv6qGBZLsZP6nCiBivVnCdeAE8cg2BNohAvJAaIT58BkwzJ+YD3kLMML6VvHu/WNUxiSLvjjdYSNYsOeUeemsFkGD4d1ZUVRAaaEdFwTY8gU9ciTrEBEWqCZj72iZZy7BtnThd4br5YXyNKRMEAXPo5C/SV51KUsafVmOHzokKnAFMW7u1Bp8JRIRGWoIKFSW/9rT8UUUkwBtVN2lqAS/Mmdmy5ikQwmNCn9ReIAWrA42vYuVuUo8zlnc6+noweO
3/ne+Tvs8R1R4Ly+b9drs99JbuL1C5JRokYi7rkWQz+pQtq+WMIYCKlBU7RCwxi84XlnrzB1GnF3FsTReD+VE82FSezEgoRxw2BipfFXJmWmGfAKEkTr3pEnvbKp6jsMwfaqeCi8tratpU1bJrx3hLFURsTuqMPPwLKV7gxlqecs0zIvs/g6F19Mjp/daYAyOlH8MjCcO0XblIGNCpyR2aHiD/9qqVAkD4KqNxMKawDPMBVeKD1gH4dx+1g/Tjpc9v9ioSivqZNmpj55mZHwnA7sKdCRMc7Ex7K7Z2tVl5HKwjsEpo2J6rFD8S/hMeorYTL5Z9nrJfBJFj/xst/68sKslPbx0navtGvGQSmZZagISPtcSwEXnAgJIoKq0iR+sdWgnUBxUr3qJtKZEmDWCrPgX2y6+5xDHRIV7bbGiACcxG+azOKT2jXAODoL8DmfNUTE+iKZN2HImhUr8rCUZNjOnnf/Kmn+QDzcXH24f/+3xkfm/9tKRDnroPc9xSSjMwAhMv0EQ0z3+XofGCJZMGpoAI8RhEmhA6TtyAJcVCyEYaC5cIChemeyP1RCiWG4ZbpwnserlKJep1gRXBUMwWQAxlsOYEAQqQEAAApqIoeV3qMVMgUo+txcYkB1UzF/RhvnaLR0At1Hi265BN7a9simoICFMEgYMXjkNmRWcDoT4WjCYQEK1E74ujsaaywNrpWWUSKmad/m74Y0tAAuxTBYF9fgK0UUp4NcgGsxQfgpfs7QIcyrp0ODQgqKZtsgRNMHqapcWeWoDaCeiw3OnQ91ZMMbf9+zxyj9dcByizWGmywwh1EmHO4Oit4NUEA5FeEZ/d7Z8f6v/JMNScSgoSQsrKdeHEbaL1hgOk7V/+iUYlxLQ1a1O1uvx4PgsoJteEoiNDVVfpgVUdbYRcpQtCZIm8GX0KbfshvlmBO7FHcU/Tg67q13qeOr+LwjkpeGbBWdA6SC2qkrvbHE2VueLVfXVE+ub7zWhX4860zuqMmVLMJjsMdgA5wq2BpqrNlJ9EdfATIX/mLvwxdTjdppF9neW721O7znBK9UScBcTkLcJXk+sYub7W5PHb7fv7V08erUz6K1TW1EN8cSE2LqU2fkXWxufmKe/433v3Uc/LG/7WyvqSDlylRxGPQbiE3cWY1GlZd9bzk+61A/ID3FrbKCLJKx+1UiBMUdowY39n4nOFAMhe8sDDJf2EvorPc0ZvobBOTrz2wq9QE3ipdIX+lo/g2foWyOhYZy2t5UxfJQR/eIMETwT9lq8Lz6BPlX0dAy8Tu5FaCVWhFqUj7pFEF8eLJgwuZNH/QSfkJRtsvRdVU4TNjOaTWKMgkJ46Sip5EENlzOms7bDY0KRhC7kpSP2kJ2KGfxSfg3foC9lB2jJrQOhNVaFsS0TG7hB6iU8b7qvZZmBSxp5IWxCPIPqDgrBmDbInuKedvKKsdfPTEy9Egm+Wp8YuvyfttbHZbnpcMioXOKximZiWnlokTvNKWPGYYFL4efgJFnc63gLlBylfwUZ6cWhF12aHCdL4Yf03YaPk8UIQFkOv6Akk7WAxtdbilOw9q2fwio6LRJEBveix/f/mjo7ttBakp90tR/EMpALSk3SLQ/IRscPKl5TeqMwBxYoJDDz3X/2L8BdnD7AHoRvlsm+MnLMkuOVONLTW3BkpUMTQ6rDJwlwq+SFWaIL47eFd7k6+60h+83mMSKbiobhpA2CTpz3jDsHmzJXED8xTEAmJqj/dL3eJ2LQ6nIdtqcTY9GmsNf4IdpXurwt3aK2vVHAilQvVSM0+OY6PILowuLQVniKQniq2widLWmHUwZsuafaSHgUXmd8+sGxHvjTdQ9fmCgTsUJVWhDgoh7l4kOBfgADi8UrACSQgQQsgfURKqwEw7IBsEt6WHzr4fPPTvey84iIwut5f3zS984k0Lhz1ihD0ppHHfQxTDD+ZrV7+2qZ+mlCM4VZJ44zJhNLHIzyxabezuwTv9rPzCuPcCKqIT1UaxoTA5WNPiPkUM7uuv5EO4ocwnN8gAJER9l1flV6cGXu/l169bWAaA2CzQ/SgsskjaCQoL5kF3AfFkdgV12/+WnJ2bvLo3h+y4bLIRrmkQApa+czlPTo2W/PYoAO3Dt5H3VJwl7kBaTlG5XVoZTRbDJVFwMNFCgOC+Hwc2ro6yYiSGLac0ddWhCZ1CWWak28zaQ/nUhjMZQ1D9462GQmeY3wr1WGyb6AB3W+FnFgLQfBw36yEBYuY5KosQsGEGNwCGh76Bb7CwCmAhuysF9QhotJY0JkRtj4ZTN5yCbiBCniCGXwjMZfZzmC9FBem7sj60q8o/u8PUQnIVRTclIrTD5/WGHJMW6Kx+5Iu5HxIqqp2ME1P19JiW98VnCCxPi9gMQJ10BM2r6YWJecwB1b/Rk6TkV6ijgN/U2oTjwLoJb0I0/npDxqleOa82xVvdZ7TdIdRgEtAcFbGFFbyKgOP9Yh52pkZJPYAKZoVXTZkgrHK9VpqvArP5zB7q4QTpc/oQ/6X669MSc+BVRLklcgFTN5x+UcBzTW9J2Iozhbo6rMJEkLa8tTnpCOwalifYgrigR0xKrw2ZgrjEgJCMzEZ0ptei/py1Kang1gbtir3suaHJGup5pkJHYUMxVwIR2fZrOyYQgwwHWAGPzGmVkoSSQQz7vEw6VOuG7cAABAAElEQVS0V4tW2IMqJKA14QEEXo8G0XVyThYgH2/28uNFf7HknDqT2A8pI9J7flDc2bj64kWnCurkdhm+8BYvd+jogYlB7KOvsqJToX04sjrbcJqgzHfol6bd2M++u8Rj+d7+q799fctzS/OzYTpEazJ5+dIAh/OPlNZnUaMvEDXv4luap2LlVNXoaDmVQdIEBP35fvZOPfhCHAABcSnRNbgi4mkK31u/t+x6AthSl8b4G62MkU/nKFTUGfEeeprp2x6op6mVSSXivFOZLPy66TtOucYP0a/QhB8QV750idASPMwyPHyZJqIsHEAE8blMnLdQihKVTycXBhIlxf4SzMQ6geiQMUwiEs42YQD5F/6WWLbxbaaBNrzOLSntyTSmxfihh1WHD1mXFA0pG6S9BRHkchE5Nis6UdktT2FA1M3yhoblyprzhCgZHvNI7uV08LaVgc9Ehi5Kq6D8AqnOMePUbBF3hKo8QtSgT2qiamp0FhvKD6EmjMrgcWH/Hz/jNMBo8yMqT67dV+IxYA3MPvdSMsFaFM5s2/yhKp6URKZNHfE4VABusLOi0NqgMKIKP3l9/SYJc2urv86oQTY+faITROmr2Bh0ziENI1KOzOE1jTNenf2DTaI6EJp83XWWyZvltORbOakTBdN5i0B9uEFGjjcW4LzpnBibo9fKIvK7uPXkuyrI30IpihB0lDUZamKDFCAjCTKnhLxYsLummmRajKL8rcOhVO75kqP1ixsbyBhKCORWhkAIJ7hIop+dd+7t84YD+qGgIPtBnJBIFJibgiOEnJHZALQdeIxN5Yih3quCI7F0xKAvW0qZ39bPM//lFa80OPtHR2tO47u7xgMMPCpq8yjKcAK+t//4X/7dt/
0N2lSbZBz6n/CWheazmx4BT4yDfYG6Uhiz+9waPM552QNOsvs3j4z37gF7xGdulRe+Q17+wfRP7/34v3/2EV4V4k1s5tnvWnj8nRMqOSgYsue3Yjw0dZykzfmHPiqTSImxsptegYs/fRMSzYbtWB06CGmEo9GDpCXYTZxDuAdVCPP1nyj+nPbtZIQOBqo0Gz+bjN8fILe8AwMYEr9Jsg3yQg11M6SI2VCdxgfk3MnrE58IFnAGTbD5sXHxh4RWneGPVBcFkueRqOn4R085Ip7P9jt31/c5F6h0fvqV3inS/nB8ZrVPyybtzh1/ToraAPDnbfHS9ak84BcEBg8Q948XFoCA2CbkE2gGhPZfXE6+uw9mgancGWd8SeP0nwjfhhMlyXd+tDz7zT6aXeFiTnM7U1fq5VCu9eYn1ew2p/1W6F/2HRHqPZHM01cgcFE0Oz+cWe2huMy0LSMz7Af30qMB78m48Re58eNfOocHxY3N5SE9DubyUAEhMT+94NTr6g0LbvDvfHN1EPS/nIVnnv2zr8y7R/k2aRXqB3E9OD5CYX8DqIWaSImgKrSJMaSaEg+riCkVaPpfSUUiliYvhCGfzusorgnXRCh/nSvD8f3NdG7s0W5p5Zsx7hxSziGFyf0tdNL8HrYGtS1QR3YbQVcMnb6vS0VdsWOKuasEXPAANAUhWiAkmI74QTKsBMub3PeAVayNiG30VsVugEzCfepqJSNCz3/TjD6eIIfeszNeGLD+4JB0CORQ5c2KosqKlwe0L6iIOO6JCCEmmmyb1NUfUfmfTalcKW3PoUs5I6+wyrzAK/iGWO8y8ZExTFmSeDjiYTcrKWigTiQskDGCY9Sm8SGbBgTcDqLZyFuN8/iHr2+nSx//h7UTZMMqklRUMyveXQfQZfQH6+lJT029xO42KA224APCgCA6ggSkGfc3ODfO/nDzeJqT2DYvEkpYDRLzfGa2tHtV9DF5+SrXuQj8GrpFWvKmgYa6udeXg43+6uyMem1zd4/3ZpjjWcwq6rUDGrRvyhH1qYfOXMpgssyltACrBRQnwKAJ431fsG3aeKyHSgU4TGkm29J9TQpX1TkKSHaoVy57XzoAZtJfKlI5JrcGdlJr4vARkWeb1+OAwZQz+LzpvE6TbX95QASfuDTHFMjnVECCKq0LRYmkr0n3jZWZxFIxDmwgtgYcHgvvgL0BePA63jU2k+/5GX0pNTR9H+nlwDV1pSDGaGe4GQYjX41bxNEVIDo5Zj1lengQ92KI+Je20nxY9b9SllWFNaooUOEbkWomRoRTfNI1d/52xQlm03uqtikoZMnAGDXFGJyax5HEZNJxDlG1gCyiWThy+CwYeZokgULWjDcUCEiTHYDhr2st0BFylAJK4ZQQZm74U4tbggbkCXm6s7Dqo+Qf3vtqVoSTNNqJ5g5VMhyIVmzERCZ1ZP3rt5y9bW0M9HaXfpez8Y3DHf8Xz9ffuYu5Y3hOmvEnbvc56FlLJWw9fEwhNaAYl0+HlHRfpvPbHLCNU4FHKywK2gcVUPjinyeL+xzfKYSJIPEnaM2m8le10gx5+0+yvBkQLTj73jah7abLG1RUC+7SKHv8los5TiUc0inYLN8d8ToOqnaiV5cchNM4MXltCI0qhcSY+yR3eQkHhSlFYZdEW/D7OKupcBBFhA3p4nsYFK4l2Gskdmozpwbpqr2qAvKtbW9QrWZBd5vj6UyqGVelN85onXARAGYCiGpIP3bTtOC9HeArmjAJblpLl2od8QFgCcMCx7M62Ag5MTOLLBmHRzPgn//0A9oskLR17qapSwFAnlFcKIPNpqYc4pOgsSkOaPJASuT7+1/OynBduGgQhUAa43IiyE79QL0gZcZJsKzQqlaurD7x5tQtyJgT0w/4xaIwAL4XHzccr1CP35dA4omBAmJK/hW+UpUJFuNaSJA6wrBwHv+V27zNxobD1OZOtYNOYWSWiFkjxXEfX4NAt7H9MxL/lLxKEqRqga9ajH64CzlX0RmHD9B695quPEkFJOIAKx4ht2JLkgeMZ2IwPXQ7/4gNJT4BArcGj5qr94mF1uujhrKH1RFH7LmygbjBnCSikn1jfTfvfOWlH649n64ComI1L8XKD+wic3tdYjjW4pCDdAgClpGfLw4CKEeK6M1v8eaYaV1QXWYPo4zzPeGWZRzMb7aOg4Nnb3GgfpK5PgmPr+F2qJ6Ytnj0Y0nSlQZTq85w36jLVxO7CbQhQs41fE8RGsmN5I4YptwyKQWR17TRHG1NaHu/oprOqnNw4B++858DF3Vo/hWHTNh00wcvxmhsnASO97XPJ9if8uzCfnAn2yfUideUlbHKpvgMuYNnF+WLV/z6//WDjeWdM7yzhpfyKDrKUWs/+kX2g2+D5lU6/OwF75fizO/Ve9sM2P3FKaM5RzeufvsA3kVW+0/WnGGBzeRFHFSQp3c2gjdzDLX56ROeCAbAFlS8IqrIzW9/fb1PALqiUUPV27QXrhTRIVrFHrNhmGIgGXktEBrAjMH15WVNoiJ4Na37FCtUx783gAtB1KjP9H5298Y5B/BzmhyWaj+e/dXje3XqWCEyahxuTV693aCQjWnYH/PqMvrflexCK8F/6GwUAVHK+Pky3Y8m9138T3irHJbcjkC6l07n/fHktGfHpfclBzY3yAmoDJc14GQDzrvrE8SiDg3lbVf/0eVePH8xHU3Ou1SExH4++elW95WqhcgfUvfIRqCbEcXgXA4JT6fwhcxV/0VGtRT2DV8XDQ0DsC75NuhB6k6JmFO90QZmSOjzMjNePbC8Ec7u8v4W2UPltdGPYJz2LAxWiqCyrs1f8nomgKtyA8TqqNzAwhAFQT4lfnLhOKCxBhDK81FCUm/aoWqZWjB3bDPh1d0C1UDXJcah+0KBa85oRSbpwMTTDkJRlfweLeGRW5xOu+Xr2NhL64nfPZgvplF/uJq96pOqBddhz72JjqvhSBszM7/x0VPOql+X3oa/oqRy4CWfXO0/GJwT3b1IO6vCp5IJUSSKj5eBYmVnR976POtwC9/TWohU0MkVOkXgFIgK3zCZVelzy9Bb8/YGfmVklClCMiukc4kaMBqvVuHeec5phbySRoLHMSFUagJkMAnDKOGdKevC45yJ3ZhDdT0p9wWnMmsfiPg37omORSu3OUHS4P1+/sfP8m/e9h+dVmfnxtcfGCenSCBxGhfst0jLrVH0gkNLaqLbOsQeNHULheZkN4f+z56BYOvHL5CK+nc+BDp3Hk2bqp690+m8Sov2fVHhO3dxQjCD9EDV+99c7frRcUoCoxw5xd7Q2BuOb1NGje6ksJsCAN+eZ6BJ986tYqe/2nG9K7f27NUff6P3xaz69Eu71zMf3AHpmZNlcyOi55DQPC0tupey7zbSQBSHthq4B8HDPsNSZLct3lJGZnKp82+SOyPCJ+QDRFDO8G6BVvDYvxpF8P1vvPuUfeJP37v/GJOYlCRJmstlzMlF9lc+J000wleKp2cPk/zMJ/4GVgfXAac5pY4qP5wrTAEFKJR9coqP0WbJekE2Q57fBMROKdtnvZyXjkUqQ
uyqNAgHH+ejUC2aLtR0lmu89IaarPjmOP7GuLzYwCJhN4Qt24ANOT3iissb8vhh9OkDiOAR1sv7JCEpibSSuxmlAridwEgaxrlRx5q3+S5exsQ2orNI9GMPEVGWQNwVFEr5pdoO2s6p4WeIt048ABCibghaoDr7zzDEYBmZfQKtEIofJBBLSACTUsjuM4BuC8NwCBXqpPACbGlWHZwa3FeqIwV0+RO2kLNUaNnDzJJFly0lEnvshaDBZcTEVpxFVZvzKX1VVv9pTQ2wCrLnenukUvAr52TVO7voewERRWwRSMG0v+j85V2a5A1GJPf3qKdQrVBJFwUD85u0cTdTgX8BikjHvVlt2knuBl/6FccLASGwbxjYbOEHvSwFZeBodOBjg7JsENC1akaicHBcDpXACwOL5U7FMQoLznSyFjHVycrKcEbJcbhp9/M6t+1zsC8ht7tDsr2DRyve+AfHkwNhb6LBO3gIXXPP2+gBG3hN0mpIEYyV3wgGv8jJuV18V60yihqtOMT0kCpYEn0wvfXOQ5Rr+HBIHIUQyOyWU3kj9ANbW4QhwSJMkDvrtM1NAJuY75EKvEq9cCI0xg/wg6We2dTBE1V+oFYpRcFerb4bd19VJD8WdzrkJKn2nn594N/8NldyNj4xd7QvLQIEbDKOEtfpl6rFI3zHPFe7OBuSCpwWfqUrZ3EkH0PaynTA1UgIP+QkVjerJqjJKwy+NK4+qvNlaE8dYNaWt/h4ekP35j6KbbKIBv9b3NkGTemcEtIh2ASC5rxfhBbzxd0qfm2jMZe8su9QR9myKGZY98jDeNVh6n8ZvvP7T/+DvR9d3ej8ZH77F5f7V+OOeebv/Z8WbRMkZqiHWtGeMomXHJyzafpJEHaLP3nw859PD5+eb56f9x8enXz5UUwpI6LFYrE/VbdMQo6BcXCr+AYhxOyAFQFsdbckK40EkqrByYGM46+RpEbGanWULlTAhJ9GHpz4MNdDELZA4ufzX/2Db/0CvT4vAgo7k992r9IYaODlPmcyYzF40RZNB5fLSO6oV05XvE5MAgDS63c5ds7GBeBViGDOfL+IBskgTjpuPklDPPaLZbxe++U3syp1FugmQvVRab/mzX7gAhX04goWPct9b55lTrHAj3S6T/WSFmqD0r3qjMOvcA3op6EdEf+aLlfD6Hg5p4XwoTrhbQeKTmGNR3/lUzkMBJi9W9GQXV2qnrvK8WIk585zyT00TPcLgEm3o3aZi1W8Tj3SMPGnAW5hulO6Iyr36/CZl9zh5a1O74k1/aZBv+VyGnIAV1WY7jDrxOludwFlsHLzLFhm3nIVFKSVERZCA4oa4PNIDl2aVKdetZ1bXsUBgfb8NvLTCa9UakBkUltCBdnLbH7ke0Py+LyRc4JuXm3ryDrz6xschl/6tMBJoVJerGMF21JPxEAJqGUzP7LLBzrNGrla0AGFvgEg7erEQUR6/H5E6BIlSrK11IElzeUH1IbS3ob/JKVek9iZmFfvc7ShUkNsCW8mARS1ze8IjBIb3tLlALnr4gkdBU29v2pfrdNf5wyvOn6jakxcfKAEDZ2n3yFmoPYF2kMBfrg3sBoXECMNL5FnNyOuLRTaIDzMU0//sNq7ewFBLszms7Pd2/HVzXj8Z4++Vl8E/cfWQA2TOvqQuBTxBuq2aHgjxFp9yps26FRQdJ7EI9zM5DmzbOvjNYdf+d0suWXc2hkPb653w8WfjT/g7B4wzO/uP852nUeH29O/vwl0JJlG5gBiolA8ijx2zcm4M5l0sgMlJ3txernynpxuffPW6583NygBYw6oAGPC3mDt5WJBZMSewhRohWaBY+o+J5HY/lZSdILJAxKtjGRQUscpkvgFKFNGoL6C/WX32RrEGIBK0oKmWBTQiPeJNah3k/PWF1EAh2W1A1RblAHAjA/zbrCm4hTHrOdJOCuHa2Ic2No6nvXh9rxX7e5POCLkXveC6wlx9ZzkIu6crOmdaVaFN0sC8OcgTI89SodwhgyHVBAFVWFBiybo4yKOJ27HWeugzWxYO5vJ9mBJZ/g64x0ndp3SRmHx5XgdogUwQTjJsLNK8/VGEB0bSUzFGuSkiHgvUimjV5vIvFPniZPnMolYttFgSVQMVw0kTA6p6eYca4ZrSpFwbgUFIUtO8UPKOPDGMgYfu7xLlL4nZYnPoJE18f2J3bOoR5MeUIEUvMT7ASEy6sBOffI0ZI84GKp63OldkJNs/G7qdN6k0/vRNXDnMHkEj5JrtpMcPbrcn5Z6LSF7/2wd8E4pFXwJf8anRQQrU1cVeYOvVkAIipjw7ykmHjzJZndUAcPJjYyAXRX4YQw64wGZNBPqKFFiZxgoHXih4xJIUhFqp8CFt5E9U1ltsqnWFeqSkFsZrhtqau68LeXdcfjKPBv8dFH6O9ceCA+Cb7AYyDMFLmAqBudZbe0sa2YbVOfJmUhl2/Gknpf2LJaNzxJiSIRzZwBkXl/1xkxlMAlAN2CVrXD1dLyBf0K9CPU8/+aEUtQ6PMaE8k5Ms/haRulM+aNNXMHgXNLOxICdCeXQ8OKWRYE1L+7lFxAv0SZs8hzHe2ZPN8O+l/7N8e3FJUdT8GZZK7oLruW480TpogvCHtKA/uNTvPHaOyjw2LuKXV/96HDyayV5y3eOTue5/+nx/v3D89WOdz7tUNPDBQRIEbbcsTEj0I1KDuLS6S4qrY6eeBiQ1Ap7HATM0eC8CYd3PMyN7Z8slrdikLBKKaiJH1qDZ8W1R01otOUb87/5xa+TplcXFXyA5YFfW/tDjt4k6ErkApRFfg9Xqw1ZwbKoS5SdYj+5gHF/Rk2PczXZHpfGFwe8zawmmMQZPwodL6xyQEU7b4bRK1FejtQgIiXiq9YCeWus8NPNDuOgXLq8NYl2c6DWmVk/6ZzdQMlBXpCi0buUxk92vDVeFheQ5VujwQkcUd0i3yQ60zk65TlvNQpcHy2p9p2sF1CcLQiNLlI5RLjqhRwYIFzDaTr0mnTEgSgmRqiJQN9R5RORVSzN4FmeDp34rzFsaLEaJ4IJo/xw6d2Fmu8o/FKmnT7bLQddDx6kBoYKZ4Bk+Yaiolrs/cK1HvfN79v/tPj+h8HLKYfV13HIa37JdVAKQ5DGWmY6jfv5ccMLCXm1752b1I6x2ePfBFiI7f4vtt48RpI0Pe+LzIjIiIzI+6ys++j7nnt6jp3ZmdnZg6REUqRIiZRoypBlyTRgGSAMmP7DhmVItiRClChYgm2AEmyapLjLJcW9scfs7uxcPTM903d33VVZWZX3nRkZGRH+vdmk4T9cGAyqq7IyI774vvd43ud9XrJtOVczuQEWji1IjYF0C7fJDXMbnC5WhF9RuRI7DS1DsnaB0YiBKTNQhIUPAI++t0iwIybZrIINcKaD2FYXChu3wTg0VpmEjfeREa35SGx3gMY+HYmPR7rDrYsctoens2Zl2D0blxaQbCh4CkLHlLicrUN05HtQNCXgkW3EVUAVI0kg+gfZFwlDakDSa0dWQNggnbLFUSY5oCpAuIWKDG/y7MI+wDDB2P6d0qvP38ZykNW+tb/B
p8wQVxV4E8SF9yffgG6S+5hKkQg3Ie+DCFIsCc4bHhzbiumrTenmLJ6pFa0+6f5xT3DOUry3305JrBIKYjRdyBw6po+rGHva1VGX4S4QcaFwArb2U3O3ftg4DSL7Wub+7jj3YXN56zhPoUVurQeLRNG6EqYqp6Ro4Q01a0sOYaSl9J+hg8ObNqKBQS8qmUaAagZxCmeS6EBUsFgGkWAAXhe0g7apUGGcfCvaOS3xKspFIPicPdIbRHoI6XECUAge86elaD4CTxKNL6aaEQfKsFFUwHHSPqwD9GOFhsFH0C5DyEAHFrE3cS+KAWwe9L5q1yiISRsAESA/wZ7Sz0XzDZ2EvAMXxh4ghCG+4Ozxtul7SvushBszgSYRcSboJYPwswA8rnoXGqsgznJVO358e3D4RhzODXYBkIxnxBNnawGqRatC2ceIM35U701A+EjOSdHrl3Wa4FoXFZKIRHyEYAJPhJ3DQ8FuDrHZBBTGhMDVB4+lDkwSiLZygwsVvBp7JF4hivAn+basLReT+ySoPhVKn2/k7T7ZDcifpqaT2t1K0O25F9cmGaj0bCDGsTl+Dt6BKHNrhSwiMX4HNSVd7UASk8Y/Yi0WBY/BkxPPIzuaBZJvcFbcj4CBjNSbkWBk3Wft2ES5LKIX0aV67ovipVnjgTF+yIJqg9uEqkIRmBNLShkauzQQTxYRRaQD0KlfjfGGRKGQ42i2cYsyn5QqBJQ/SH1aO2p9uEdXwuRJRBGk9DSfbYKMscWBNLgqsheovZrqTaFxktpzIUBkuPmEQfaC1hCAJ46O7e6RhETdi6UKYVVSH//kcJXYppCWAO6Z1B63/aOIczF2xBkAfzubh8oAvME1quQ5QSbUHQpcNmpE2yCiCUQsp1Z+YJsgLMMeRBmkdEIz+q8DghYmxKXh8L5a5DiB/SR0pjnLM+adeU9sFcEeCJuSUeaj3c8k7499/c5o8UG3uEnXg+oBxN0eLFywjh5GCvO5tuQhY2MYMYmp6ttIz4fOFuq1QazRTI/OjY0ds3vRzaUGYPtt07UiLpC61E51YU4zmgDMgBCQ1QsRCKrBdEL7cpj+cfIuYlpBgGYFg3hm0GfBoq4Mu3TDfhvmMqZDUBnq7Gx6DDSmgPuYit6mhAbsPFwZuHTrovAKU5te55TgoqjxSxhcF6/Fw+UEYrCoUnB02VfSmjMIegsa6NqoIAkqZ4lzjrcX2gPjOJsST/G5fn7iTJjagLAF5SDSY89OjHnKY5SgqqHhAjUPeJGc/xj2grInTBc+nf8wImxICcKFkIwDBIuO6AMdQIRrkHZbLqNFawgFCZBHNk+YGgZhM4tPsD0wHTYJuOjI1nlYPFBMdq0Vnxr0NNCHJhWmEGaI+FyHKTOLI0RmXtzpldxR0ejieYjzQ19Y/q9o0fEX81gFccrMIaPy8aOP2Qphy2ICMAU6NZ9XAj9wJn6vBw2i9fIyG4S29/aGifHgTLLKQKyEo1rfhWAi9BpGbSVFthSiGaGOdYy7wR1jS73xfJw+7hmB24jdqULUCA9dN20CBtIEzPkk3pNR4CcIkUNRNYdMP4bH3BHJQ5aYvic8J7Mu+vOCaOF7OWUoQXVOiToOTWIYCMoS7ZfHPlyzx6xREmP2AXAb7d4iLiirLzaSNGCWuwNL4ANxhjxmeQFdLed6Z/K1thM9aiQpuL148dFul4DbW4q19nsZbuXoOM2nS42R7cveRe5+Au7h6zVtmvARBQLsIRTH0NKXgMECSae0KA6H0jNo3hzhMfCyFyezP0zwjYrAKSYWX8rnR1H4mfHxZxTnly89uBY/XIw05rQOnXbf6F79D1tPDNrRhfkmfejsIvYB7fzf2L3AD/PFDqeL7nUuRrWmHqEu7QAb3fFWwi/CeqFeHib849PBALkYkha5fewSeQaR1axvWJ7+rJhO7YFYBtCV8kN/YxriIgfs/pke4SzUxI2w2twpGQ2BNC/OvVN1llLMtQWoE94SnpADKf0fcCx8lFdhuvJKOHfk/LT545rovn38Qfw5ARFOkuATnjCsADA5IiYIw/wJH8FZJYqTU02KhYxNXQBSFpmL53xyO8SNknoIpTEAleWVhKmcT6JKXk/sxs95ytwgonVEqo+PIg+dkqmC2iUbphNJPCB/oVkeNTrBcqlwJrYQyedNpHbP+3NuceAcWnkHcktAPvG3Ylmol5BJIVDAxbCefPH0ybP4dDnwRM573skz4cTFBpFRKdGlaKF51Rri0k5umUAW6kDsYcfNCULDl/CPzqwqVMlxcfGEv7kj9cslFJhht4zhYcPBkdNy2B6tpaP7UrQYnIVrTAwqzCA+m+mqU3OmJZXQIq0JJQrjzkE4bzPlly7BSNsdnMvjRMxqd3wqZt5thDfieFFhqJXb7lzSM4VNQkwLFSu+NeqeihPKjhditDiK+gY7itm3dij3KQ2YjL8mCpfLztyDdB62viJ6GRgLLobHJn39La+7CgAnUQfp5TCnkXoB4dpb3da1FNuFKFqeFhqBaqidtSZZtTGwqL2CPt+qllivQrT38fGicz9JuZlJQXwcuajEBfNSBuTLrNPuHFiHKsQOCpVQXqkWlH7iQ8iIbMGPI7qT58GT4FQMahYpfuh2eq6CZAHxgwRviZ1ZySSLPZYFnEUTPoTbLx9eo2mdMPU31n/wxcQn/VXjjz98ulxJnzQTlxfLe730uVQVvQykFWtBUotiIz3BYyDbxabRUp8w6TF9XBwa6totCGs0kQUIxkIGPHojD7rGLhfZSAS/F4RRzIph4EBrJat/PHeJd2NuGZ0Q2BREOzGIJ9J72S+JGwSpQsSBSGp4KsNoShWRsHwYWdFIqbuWaiUiI5bOfx8fNzsVMwCWGAoRCmwffpLDA7rGMaP8yJGA+U17OwtFC0J3BdIigeOMHELxaSYVg4pU87xOOYrYVVA9+g4mKPAGzatS7SDpBf/sXOJMU3nxjX0DtJzjzX+IzRC6oOeNaeCDMKScWPTzOZlToEGMkStIO7dM6XVUkDDKQidfJjRLbwBPHBvKw80eYnSk2MGkWl7M7asuzZ/yei6bPxEHqyrJPfwN5jhEHa6/gp2QlmsIlc3DVGaxTcsVT1Cd/+Lf0jWKU5C+PZVghJiGMx216fFTF6mwOqHFkmKhEj4O21YwGCj7FbVYjBy1weDNu2WtjrKXg3KeX64oVLICI/yTW9rDsv6gbD9qxauBdetI+/CR9uhI642VVCLc6GmHNQ5eePPQL2UjzbG+VfH2DycXF2WoaNfTf/CJd2pB646dBQgRIetuhU9H5VbmYWR0FjF60At9fF/Zq1gtz2qHEEFV3/oYtw/xDMuEThTuHSuONPiwQD+HaGrQuc+54vGTdfCYAZnoK+Gf1vE48t5DJZ1kGpzZ8DK3evpYo8pMLKB4ajenvL78EAWPU/kaV3LURu4z1L5R2Pi9Y/h5nHCQQxwIvhdLufiDCfP0CKvaFwAQpVeFjn5+K/4wIrJCdBshr9R9cqKuA8urusUMcfWXn3+vsxw6xrQyI4Fm7WNY/zTIuwjtiDIfcwQcsko1cqZ/LYtgU3g
j0fhG5eIDd/6MVXUs7aSeNGPOUSvV61hDCmZepLjeGIwMYefANeUc9rTAVSO0GnVMtQ6EJgK1gENgVMwtwn7Zh2N4f4phAvGxn6w641eQVHKpxPJ/HhOEfoAxkUEgZyuHiu8jDwvODOdLyd2eMHqgsy6tzxJ48x4aUPl0lCMO11vnVO/CYH259kJx52zspDaJ791YQpKLfZ/Ypb4vQuNow7GzQRoB2GRxztET50c2+iNN5RtvYaIUJzBXQBYhYIie/DP9oR6BygPdjB6AwRq+O4x/5p+YBNK8EdVC2j5yFFUlaEqfaS0WWmHDn5RtYFKOGYkxdTIRyZ7A8MZVhGZEPA4Y94iuk2pvafE90s7ZG84mNHMUAdV7G7D2RH10fMqFw4R2BLjDYFV0ovCZwzlcvdSZocLSP8E9cj2AkSwasBzGgsCwD8eYhs+R0tkQNvw0O4Uv6d5OBbTa1K9o6tk0N8qpxRoBEHGUNVTvegVqstKyjYaxRCYZBDBj70gFtFPSp9E8+LWytsav+Bj7QV3pof7jePce8eePv3xO7ObOX/5LQfRJa/SDIlLs1HqZaxEHntIrQzSjwiScdGlM3BBsVSLM9hiUhZZ8yNzyz2pP60UISjF+4FHhVjc4u+HdeQDxTBp8h0gIGNONknk01PuRzqrJ3oode/AEgPtpG4PADZbFJmOXzMZdKCb0nZlEUm/N1ooXaR0kjsWiqxkhCeAhMatk/3NRMjQxbMQ00GVgOdypzjn5aefJIp0KvGC66PjnnPrDBBEdU+KQJjh6PcdwC6wdw5wxfuMlHD6hcwhSCFRG9ge6o4V4f5rs/Xfrf37gZt+wtt+yyr93/QWA9eZBSrupipZJWLwiBRVfp60JWolS7cWKUdgekV9cfF/N+X/UePbb1fOfyW0+TOVHPbNYbDduFtxsODfXzUaHtRqCRL6GEGOPVrFQeG6MG0TA12hGsBoBkTMP0/LbGzJ4HAmvyRkbwjTBEpFF/XKUOJklx+Sbdf5csjWcOT6fb7qr2B2hFvBz4kAeGU6SIBBNAAgAs6WCoi0tnXRRuRmKInq5JRUIEt29o2zhdtC4Kh/EvBCiONbcmxu7G0LQhUhI5FSMDymBLqY6D1tRKz0adqJq1CcFZTLvdMUrZLvP5Pe3C6gGBKT3JPlL1mAnk81Y43bbdksBkbxuu8BpVCCrc4zRCHd6UeiKUFWGVzrjdtRdoDoeTFzVnJXyux1qHVKZ6EPy5imjPLA2niyHZXoUqBXNg9GpUwgBiSn7yaDguKSi5pT371jRlWwTTIU+7+1qFmjayg1ZZDToIAYnjPFJPz4YR2DwkgvQToQYPHtAqrUgzLT8ogOwT/asqC2R0iK+VZfOv8lTwds+Pruz7E6iAgJxIBbWBuRacOEZdKnq0grY26A2L0GdhHYEPidOcOuR/+JVLRoLGsh4yJeQ0RJxhKP/4p+mqebpL7DDPdTD/dDhsdJo6WEj1BsgG4D4WndVjw4hesWhqrLrjdrQtyPWTgdWTfAphfOOlkppfW5UYUopnTHK2oJyeDzdmOOHoUxKqyH7TgvABNodYRXBVWed5BZfJOxW5mCJIDwgPmoX0HaRgZj17wAUCfgEtY2BLuhNMQwcwE8GR9P/7i2XGrdqpf7Q7LvG9nH+Pz/9w4Q9cbiDC9PS5apWAAOVatJoZCy8NR0XTeNkSHMzQ12EQYI1ZKwknV+iSyvJzAyDVVDjq+1lXjn94FeSm2f19j03/r3uhe1Orj+iN8EPehHwScqYqS23fQ5uqk/yxi4ZJZXDVpo2RSPhXzTLL9jbN8fL+Uh/a5AfndjJ7GAcU0qJXn8SoZ4W3IuR40lJeqCTpWj5sVvmUBJ1wQ7lDMwyKB6fQ5+hS7CACxJ1Flqf0ALLSuutFO6JD6VHXtARXoB7FEq9lD2lupO7g1zE48YUToTIKdA1gi612DiBA2S8ETH58tdEGSnyR8pgP2dvM0BWVHfZSEK1aSudi7A56ckIXEeb9iPhiD8YmbB5G50YCbbM3mlroY4OeAhiRI0E/UUvHCZL74yjoF+cse7YhLo56pvBUBNtWGgrVaMf0pkiSZTBGYZjSATaFaY+bBdOmsR5eHxBjKHg8npKLAQAgtPxS84FYk+y4PxHQC5BIeR4bnGk0bUBtM54HAmesRpBmJm3gm06OsA774H4EKV86UrDmyBBRA+AA0gFbz78GGjwIZoDLAtIgV0OJx5q6UdMd5fiUOjzl37LmaNpV6a+iMwebVdtGaPJ5vYsXT9o+Jl4+KTpnGOkjQIRMf1gwpgkoZ4wI+W7tMfW/fEY2lr1tQWYmURQuKCh6IVJ3Z+TTDtpcOP2+KefxeFwnu0jGW5hbFbdRaYBTUclG1fWW4rUnvcEiENIgDBdQA4hfEgagLQSQ0qpQff4RsBuAjyE4tjcfMQkjo2YsUPC8lzZQLyAjULUzqJJgrGgwP2F189bYaGJ3cko0DgiykJdv3HZ5k1wsABWpJQz6y4Vat4EXdf5lw73qhkqtqCj1H9/6/I3UuqgjVqXkzueJD6oLR/vCG0I9oOgO1TjaJyDCs40bL7hkCNVvCSoIBk/55D35LfjnARFz33m3puZ2/AA/+fNz59Uk3T6gtEPzjrp96ENUSSkOdOnZ5JdC5zARzSuwN+Hl0MHgHflyu7/tPLVR27+D6rPwnsEXrtZm6cSc75w8ml5/i/2VkMo3dMcjXTCRLeWeoNWlCFtXAz3i1sDlGIAoAAMlOtm0mAsHW2ZI2SzG35vWYXlwx3RyQ1/lekgRJtQIOTdRBZengUNrNJjARvWBoEU3plwVojBZskVkR5mnWCMkIHEidew+BQG+HROAQgNAd70zDAoR8nceHCPSQU8NUEHZ0ws/s+fc2gGRRltAqDC0ZUcW+aKCQzLHuDKMbLkpbM/l6fGp/Cgic6AbahwUP8AsGFLy4kCSrHlLihBCV1mpqYHis7+wTQLsNSWDuBZP7FIzmHBeRNOI8xyWmQo6PH4uDsqAvK5Insr3WRYKFmT2UwLihDIBaBBzstm2aN8+uPCJoaeDckPeROunC1BtJK6BaEVgVwl9NLP/C8MRatds6TLIynpbOEjac7k/onfovWJfnvPazQHv/BcZ1VK3qgM8F4gfjzRpe8OEbPgEKrFwvDpFehj1NzJiUV+x5HskzwHCbDko0HnjE2fvjSw7fb7GzHs6MwGw60hVpRm5Oav9F9a2oYdO/CE4co35KfYdYFUZ9KUrb41YuIKRgdTHqHozu0osISkrkAWRANXF+4WgaA8xtCACRPhPjkDPQRdBt/jTVSwL5p66FIpfARhwOsDNZ5hR4KAiRALfwVIIJUuwUsF7Fp85YBK4P1bSziTM6UqBQ9c4rzV2ermtk5y/MF0xIwFD8pFNDohz/NaFJg91UQMB0q+HHvMCrUjjjG9FzQcBSOVETQ0qnPpwIykc6O6ZVRVSs9sghnAM0sjEWgilyCgRb8mJm3KzjJRC8VWHzUNrhOO8i/Pvf9wXMJxk2tRs7pxsgSlBsKk04xSEYHlyOXFTad8iKyft5BvH+zlWLrIkR6cIq
uajofcahj2OddGu8Z0QILAIHRWVqTEwR7IaTkwIN6087B9ExcaDKGj/sGnQ8sCTmi9PTc5N0onB1DYjpsJngIOwW+geC39mexCtq8AmAwYvwCcSo+Irm2Z7B8xqVKFeWxMIf2puVsuPd/dCy7kr+J7Su1JglsQcgFIOB4cWiBlyN9wVtgqWIfGpRAJVdic4uhkP/Skn4ipFRCnYNLwobRroXJAmNPbwAeI0NbglAu6Zsdg3E1g2PCwOEbUYKQfRerGmFLR9gUxRnGY109WHILSuZ8olVf8GMFQ3/AHOrk0zxTzSgs/t/D4YNObT2xPwOIlpqk5+t+mtUacC4OezvXLIszoX7lPXUj8hPQwYXoXJ9JOoXnazRjvM73UJ5Xk1SJJwqVgQvAzsKWsvS4pGcedeMtfLrFkABv8AdxCXgkTf7ZTsQqRcDbDb+F52x/uU6JgyXDuvBW+S66S6AWvljEptlibTXu/z2dFjx3piuJ2hHsjdXwhzaA7MjU4gcQbEMzJIkiB6K+j2YdqnvQloXRGr9eYUCA0HcI1IsBgk5BdapxAtjiryZAw6hDg8qDzGB42OqcRYyll5ZnmAh/HcyW3gW08WBIbgZEDF+XJ0WdAMRoHggmU+rIecP7JwBX00YwpJ3Cnm9nvpL757tXdekZCGoIZ4MeRNiWa8rj+APIHlfQZnTcEfYTP8vsU8lV6puQEIjTU1aiSESLSWwipl+I7B4YGM04g/An8MOEiZpULJurmhyLsx7bnnQ0vnu9rBv2TMqPm47tr//Th57eG+YLefT6+SVz6V5ZvX1iucKJ46sMGOZCcFgZI8H/Ldo7qgkmyz7yNkSQtljAugqHKBcwCM7lULlgCLagTNBBSzmUzzJRsOIHqWp8TCGzX7UcpfrAm8Ygzmuf+peuHViwKicKF6OhGEzxTgHtqfbwnnkc8G7Al2REUGQrcCP7FfOrUonF0LMYOX1q9pqPvBgmTJ4ginlwb8jNEu7yxBDXS4ghvGctOiwmJKO+vkGKNtBBDozC7SZdjQ9TAzqSCT1sZj5iAor8aqIvD0dPD1GePeXM2GF89Il7ujWujcUxkY6VaI2HoEAAixPczt4x8qgYpl3vnyBGNopqHEA65sYzBashBePyFnWKfyD2yeC2NLjM2qiw1YT/qTxxCQjniAz6B0ASNw9n8XDiuycQgZo1F3htKFRgCnpAJSkSkuCl+ZJVHoVuPlHPrUKVQBwrubQVXz9Bo7L36pHQTMzupPd7/ErM5RHiYfQNIiJAWvhSaCz+MdGXoOZUJ2dYdkX6J7Y30g7rfbFHwCF642l+W0jAo6DgrlR9ehr4AHe71a7P4U/g3gpXxVgL7wooQ0y8GgoOBhZYwYDZngwWS440sPLHHbPr8jKCjoIm08ubuva15cnRMPptsUrE5gbwMqEBPOMAV8uRIu+nGgCVTcEIV2ocFoSbw4DFQ6mGviDD+TL1iyOvBUWPOi4vb33n3CisLn0ax0dIR1QyJ+uhO6qtK3gkdm15WSrTa4YwfTHe0tOfJEDxuWe70ZBZsKUr9qsSr/CEPb32lun9jIXNXwAzums+iERaaiJB1S1CrJb4aPjfMpXvHuxL9cszkhJPAzI8vLlVey91/2XrYoDodHt9xFv7F/dfcT2kkkTIAX2xc7osXowQTf6R2z0xDtijuKHWpHeNheOKEjnwKoDxeV0pw1EUg7bNirCpPqhO0nqCySRLpoy0/ZogNJwS3iTpgSy6Dd4C5DkzFxYuY/+z4cU4ei8FwfniaKInwPqQn7bPhF77wKU1AP/jTJwk+GVLP7RCYEUFgv6LvxUinscj0RlF7ZMXAiigssVW4F+qBqU0UImS6GLE6tpsdgqmiIkcRz6p5Ry9JCZ77hQf3uO0D2tOVxTKn6INbG/FH8MVkMVl5/pD7nW1XIkORY0SxApfA97BkHustkOjy6eXPzVBjMmQI5aUJwAxFYzR/CeaLN5zKCwa+F+PFe3IU5RvM2VCoXawAFzOz6TOaAZy+mWIl/GRkEdW5kTuiSxaAU6IGLXazPLgyDwZDchm/2+DgK9HoYAn8EHavbsul69pTF+ly4Oa1kabfqTL7T1z8vMDEJBhMHeTnVNi4QyYz8vhZGui/SI9gkxBXT98fPpY8oFetdVZiHnVEx8osu0AVJimFh7n3uPlw9vagcdEmlH+8Ulga4goWCEY4t8eiYMUwFvwttGma0efe7ijbh82fu8SHyu6xQe2oL/hLS42C1XtQLxQS3arqD9vRsOUmopMXF3b2B+kdOpQMt1vJsmmuLh3dPVw3ztKTF1hQxIJQu4XSaWiu2D7ZzA0VI7feBOlKWbDlKJxBpPIZ4gVnAuYNYwkZPAJ1GJfCsnkZN2JP7KhDA4PLDFDTQ/EDi+7rEAUJjbxhSZSbY2VhRWIXWSD9VI8+XU7jcE6MDXwOiW1QbaEk0Jpy18jXEnS8uvGIWvwodwiX+tGg8PaDDSsxXkoLReZ3b362c8m6au2f01qKUgZ63Z633Z5KNC58AHptIAhlHTs68fZTYjV6BjJthN8c1O45l1lfwLacJcwZlLTZfmIYnevSjy9IhuKWdVSt4IIxnGOKtYoERlmAA74edxLRgx/mV9Du07MUtCX6MYWPRC6Vjc73EOiJO1DfOHpRW3r28Ix98nBQxEkC5dM3JG8ksbPiH1pkUOw6u8J0LU0kj4ESLbYHdebZ1upRMBQNVX5FXAcPBLeZfuS4tgn0ACZElOE/g6xtOBcbt4AAyyYBiOTJrpq6oyV2p0wHyNzpoe+I5Bm3DNc3eb/XPRMnDcb5o4gbPZ5SkrXqEqHxE2xBCNTK4knrnMCFUitljuiNqKftQSPaGBmjdcdOjuPRmYQsscDsRqCwCXJDZQj+N5Uz8ioWGGgJmKVpotHMIstYuKGqrA5sQwoVWpCwKdARVXhxw0tLY6uyOk9hHSVJ/lg5tUz9kPdgiwuIP/aChQLmDaOCCeSMcb7FDPC/xxnq4++lY0NOBccSi45QDXoSJJakofyEeJW/5bcwNrAZYm6pwI5oXiK/F3RefiU6k7I54AMIRmpk4L4RfYqaLVJ2MnlcCDqjxZg1ohtYPp335OTzw51mBplQ8jcILkSz5D+0L6RiI+haJGoCedG6jor0LDeDIKZu9EkViLubPRvomQZCPheqEYgL1oEXZ2PDjWSdPzRzI2QvCC95Z87zs+e2ufF//MkXQlUySxw3Lfm0zYdNazLs6WCnEvj0kNbwXciWhj9F3SZQ60/J7BdZMZqM/NBxJU3/zmAJ2w+JOTRkNwykGQ9zhrQMM4CNpn4ldnjaOKYxkaz355Luv9Y++/bhuiw6b6J6f7p3+b346hv5+/zzpBtHG47Gz52VTLAH8RA2jQfkGLLY0LJEpD30++OfOXUAdxJQ0Mna+Qu+tXTZRcjuZsJqM7vOO2A4XJ10FLYLiyGeTfTXnFCb8G19QvEDxVSgGgcFgJYkVDgBBiQNlkR1AowA9wVZr3UhtPRs+bns7tvNjb3WrCRGOx98nfQ4mxCMsmIm3TV2rtseG7GoTPHjGbGCstqIKffMe
GK0kW45b2i6r9LM2aCBKOL3X/foWzjuotURoiLPGGU8zNhA2BAeHS23EeuhSrEnue1S8UdJebBkI8NB4wHhK60qXiIyI9yEwUEGJXov4RVSiNdzt8fdZanQFDfqxweScJHXNPuMIpGTJlEmwKeYUfaSxKuPMybCLkJ0efHMSlG0IFCXaDPsxyxULjiZTLsH8vGvrh9+enONyhqUSe5O8+4+pC9WiejqkR8sFOFt42GolVNtV3xPAf3fOwrFYgncWnMwzcfZlbgEzgntRUCdqAajDgqFRQJrVFpiQiXjaHFpxlEfUjij0YJKNTizqtAzt4X0FwC2vIBgEvoLBAuzOtKqIFkj/dwipc/0AzjgVv6D1nCZTsXAfPu+1+vF7yrhK+cGawnzZBhpa+O8Eb9ZCWIWr4FOgHyb9aiBDgBXPliLu7eS5UKMdAtakNihhhFZHTe7QgOCAdM7iZlHen9eZMMI9Ld7yGKHTo5TnHm00trXmTLOCiD3QmKE1w0aW5nUWosWbMuajLvG/EKz1o69vvjw72TeXtLCSPeu5poPm5hN2kZ8ItsRvnIR9odC0mJUGGYnRBA2bvSAKeK+ea5Nt8TB7Tm6BIyCNApLYplgVaa0guodjWYCuiLbp6WDiXeiAR8OVH0aW4rogPn33cIr0e3/fu47v629/OXbTzA/TPac5t97uLD1kxVvbYRKNPPGXspsKnnlX/ZfiyKq2zWNbbM3r4aynlkhnxM8GRtHLQRyFo35eFoyX9TdByVKOhwhlHlhqAQ07xK2CR+FHpcjwW8JcCBh8h/Ro9hWxMtoCGpErJaMN+q41A3lTiGjUYVH/nCwoLvL49E2ikT+woWTU4kaMNLmd9a9y33CMOipgRN2GlGmV1BvAOV6aX2LLifosuwnWqUA5/gegCBCp3w6UkRb0EIUQBl6kcNYqpqME/JkjAHWtmVbh7W0d3rodMzC9yLNi+i1EHCSFkqzL3/CnDmcJ3EmFXYucgKsJrY+1LggUr+Y2sYlSrLit2GoYjg6q0bsyG1e1p7JHPPn1WgiGOiTvVjbEbsG3Z9yBUV8SixD10aOVcQgsVk4GOwG9hzxBCAfbB/BNo8440hDtphkBeAncb1xNXX4aXKxX4mZVU2aGM+sfh7mGgW9YAzoEUItIiifhFodvz9AeDto0iPk+P2hslAIPrkfKuXVKrMqjeSdVrSMQA9gw1B9cMBwguDTB0q5Gt49Du8cRcy4emsbDk3QaIV3KlTVNcv2W23luBa5WzYOu/rN7ahCMUuCzMhJz9vaRYE3SFPg0lAfRuHX2mmF378LSuO327yGL9Ruxk8sIQ1m7cKCicAjDz86gBiCznWk0pHK4f5RcHQyOTvvwgKfgk6pXg9QR+AvwEm1YmplY1o3zRp0LWyYSC+DBNZ8Wy0baN0iEwQmQbCBCxe7TrUDuCU7ie5E+rZaG9mvlh4mM+NziZMvLd6+bm/RD9DwQ98dXGCm2PFdRnnIiF8i7fmf+P0MFRXG02t2RcI8Mn4qDcTJfOZvvPZtGure299YOXv8X57+3plsrRfXQQeEm8+gwl0ZrMmX1Gax01AI2tLZdFQ0/2Hp7bTq/EHzKTpzLxvtl+2D/7t52QEUcZA9hU1Gy3VIO46MqOMYqhHxL0bLiZTTmNrDvTjFYj01UZqo00MNERoaNUCKE8zq4IyltkSgrfh+LzIIk18Q45Tec/SxsKuyd6V3rHUJYR9GFCHiGAyXfBZQHCIOlCoxcMXMi0J/oXOH481GpCIK/y48Vk89v8esuONBGl2jUqp7MkrcrpS4GDPqMjFGWjFsaWNnIKkz0dK5PtUNtjEKFCggVcdxJFjQQYKADmLXc0xAOyQaj51EzYmV+ymeVM+hpq4edNCfDA8qMVyU9YhQU/Ae0ASooWxnYj8icypqTI/G1mBWCAdQQ2c2CfaCmJy7IBAjG2QECDAHdSCgICoK/SV1uj5OxYe7VSawMRKXe+OhhCC+KjDZw0IDFngSvL1BQw/4BEAnCjxhZjwjyyETXWUutwQ4TEQP1aVPAOiY9NW80qPzmAbh/nEcKTCzjIE4V4zsHciT59l3mdoKthUJ2ZaWi04Py/yQGiC6L2pnpGQzgUOPSk/7qB0q5v2EFb63GyzOhahQ/qXMDCGwvI8FQ54+OXR5xNjzBTHt8Tfyf87kXNFNmGA8o7PJSCKqra8OzucR20Q9bTIXT2z23Pl0aEcm+2oaEDhsP1GkgzkAHTzYOdDNDd8kBg0F9zfDc8Wg31dKBcjl8uYUKUTuTQGtwgupJyrgBMuE7Qdy5PHIlBJGUwykm6bw4dg6icAq5KhIgM1bYohm9Ula4I3SkLETw5VpdMsYnw72nSwpGTt+Tmvfd0pfGxe2+rmdVjZjD6WGxhV2RU9pYtMxxKD3cPYeAiOUYnXsK1clCUagXDP3n4e28OYfZ9X+c0bre6O5Xzv16bGn/v37f7MeiTWTWvxjk/IAl5rY9zrnsIqgGqFcdICCYVq1/nHp+/+mdeU/9E/9euLgn17649+8/Quj3ai3PDWPIBkog1UveU9r9XPf6aDza16gY9aJqEvDM4X6355/p3st+mFv5dsfXWZUGKLgmGSyLwRsgOXaG5lxPkqyPROnkLDf/PMPwueuY8ahwlAYmBTRv5OP4Iwt/HC09QuR5D0VBpxgg0BliOpPhIHJK2cAj2JfaJ3O1lBk2e1lgZ3HQ+1Bueh3Zc4Mp64/0GM0T5d8dCKCcsxdn/ArTszmg5KRGzFK4Licnm16smZpTRzQVeuGWlawF51jJRWB4gQjBfZopaXeMyB+Jr0kfcDLUbG0fZfvlwZ+VYZvgyx0Xh+hOpHjqStKSp0CVzH2nHCNG0JkBF9JDRsBGA52LuKgCkMYTChEvPnh5grN0MheiQPEeweCAwFEkVej0EnbhybD2xjoK3ZJTj69mgWPuAYF/rDNGHA5DhAAgBuRGOVb4oth3/p+9zTxFxhS6lSnV8+q80983owwc9eD4KLQ9OqizBdCeckrJGG1KpNJKJtWDajHUwZiB/RoVJvB+bVQve3P3Jd3wPgpkS4Lm8S803AEVbeJsjwHRZvcUlmagwGjptL+SY2Dh3fVSnOKDuciNZ63ja0TJRHzYIYuCoXNeOvO5Ewp+qjmpW0iW00z3IXMZIkyiCmzAOazjIIB0OU1EYThmMXdG8CMExF7G+aP+pis07u+jMSbkFJY6gAAQABJREFUkxYoDM4724inBRZiVxnMJOOckjsTbDYEUTa6ddA3W1NmJsNuS+wLkCgicVgPBuvUQ/r5PjU0bBs+017u367O9xn8nPsUou3vHz/39luXB+9k27bWqsNyQARAalO8CYwFYuzkniSE8FfhrxGTpLbH+EMM8wsv3T6le0ta4yvdS2eNw989fvWX0rsLmvFC+sNv985TC+m5JhGR6O3NC+iFrcVxpVY6NFdsRI7ssH7FOOoEyu5U/7w1SSfK96yccK4Jg4u+MT8cQUqthKfz7mE7VZkmc/ZgPtFZjzc+G7t/KVKdNxunFmrXigfz860tM97KSM/sOG8y36r43eNphm5T
ITPEbx5hi0eX5zEHmQeoL6nDpYBR2DIpOhTqnCbslN2WuSc0I2kOyrLniHGV/tVx8nojfqlzOXfUnlh3aqWjg2z0WEVCSYcqsDIMWkZsF5oEqXEIbvd0yJhHCn16fEvtpsKRqgQFSKl0mzGrrMJuhZyCs4ofoOhHPqWu/tlgWDDNhrRrLv+TDzOHULNB9UPxndC4GFCRYiY5Z4YFDKJMU576bUQZAFeDUysndGYv2J28OZiLdqltsslduvPtYcHqg9SkjBHKTpxD+lHADohVddWvkqRQxgiYvB3GFuPWxkvTIO26ESlUIGfOBFL2ySQjI1wFb+/K+UbkQiJt3HzcoTIZ0gSkYQ15lBBl4feztVHWpNkF203hxK+Y6pnCq5gFjrhzfoFz6FxdcS4tqO/e1SKmMhrTRREeT8aniyrrno5O6bY66Spxu/N0yXINv5ShDTecQs/dofah5rPBfN69uEIzvmJbgw04IYCDQejhrvfcRfwYwW1oYQ6vGC7k9Lbj5RNUQYZLIpgnOaTBPCCDxtzhaoLcMvzBPSL78Vp6moxSFA/d3Zqul/TmECVZ8wd3h9dX9Vv7hIwhTYU4SiMXZ1617OZTcXQIUltMAqXzEGQV6zMrhDAMvAYQRK4llcP47gC+c/dMjChlnBXBfyJAYG4GIZCSpT+sklt2S5AUCd1g5auDLMQAB+EDRl+RZ/b96MHDIu8MyFl8R1Q28BKkyjLsbsC4sin7FUiQBq7odjNkRKGVOgVomeHwWQc1DIChf/7wcyFLIUdC4uq03mY28Sde7s4h85BCsX01e3dMYhY9kfDJTVJQ0D9srBzqsS3PdJXJxUjzy+0nVo2tzyBzGG3+sHLaSQZUuuip0SsRcpWgD/vRn9xK1Y0IGdTxKBGz3Kpvl90s9SqkgEGhUrbzsFFA4YJqpBjrQjz91q6mGvadk+bLSxE7jVwN+7u3xOwhhUlJ0U0dBtKs9w95ea6K2EzUylvX/PX16sqFyrNPPPzp5U+fSB0SSu31M/cP56Z7MXqLCWhB070S3Xzj0ANrtEANMIyGPAAJKQARmrTJM3kIoeQ+6LHaCJmiNENsP+urgvsOtZX+jPo12jKM5C5VCjpUwjEvwQgGAy2VDS4/LLNvJ+F0vjccGfFNbVLwhTzEGLwBOgNqODc5rKfrjn3QSx2PkrvNbLWeHB3b7Sm6mbFKJ3kyjB91k+VW6qCRqfbj1Vqy02DARSh6xJWF3PMjuIEiWkOahzZmC81rodewb9MPJcsQ/AVm0olEpzjpGSdbd8YRkbRtmSR+1BhxyNCwMKxAboJgFV3nYRLZKDFs9BMOLpeIA5mmAhggolr0Vd3D4Q59yJhzKL6h5iBWmb3LrziuxFf4X3R+H6uMTqOqedRzSnFayCCIwZgBZydUYG3wQuxOPAyBCjqiIDFQ7Ll03AUkAYApaqPgURhg6rCAV7CKyYXQ5JE5mB0eiQCwvACHhkXki2CSgWdoCtOrhtuZjS4SXijz2ACggaopFs29C3FUyA0wyPh0uvsEeUdqoSfxAIiIaGlRf+N7C4lJiCT0nAmgJW5TRswR9/tCVl5Gujek5pzQHtUcxTwjZQxgukKueylz/N0bl6gWosjm5NlVkggxcQFCDErs9E+wgQjVVr82Cr19033jqf5CJFZxewv66GeE8vTK4tY3H523LYeHxP9/ef3DS9GDf7n3xsPtEu2IJB4ApHwiEgwUKo6fM0bLVLXFIYN2XD51+A8Wvs9Zuu/M/3piSw+pD93J/1p7lQmeckeg5G4otqn3z06MI907NVL2o3iqlZUa1XasOpUb8AwUh/PR/iffPwM4gT2Wd7Z9dHIxW4DPnQsQC71Unvkv0+ZN+nmVF16//cPNU+S92lFk/mmEq0JsvLQxBKJej9VXzTrcnSVdaMObTvHPDy+d7GesPQ3vwU+0xES/a9nP15ubGcaGKV3xe6w/CSTd7pXXIYaRYgGLeeaOMTk9guFw5swR+h0QRKd3hRwPiUxqbhRaYmPWnzZR2AsaNTZD0DXSATBhUEq8Cq1etfs5bWG4nGvRr4wkV/1eDsdIiy38UgJXKdwRuPLiIY5dMBtGGkpPII58SpxJWitVYhITAhAGSM7/qU5Hlfpci+z9o5sbGiI9bU6FPAt2y2OwioooH+GjBRyd0uuMjSbxGAyN8H4Uh8nBIV5lXyEHzp+j3Xy8n8HKhI9MdtRgP6GkJupqd7H5uVW4lJM0Wx/NGJBPiVbpRqFEQwGXJBU8l1/ByuWYy8XNSfszzybaCMgoRAKDs5akXwNhZhnEqVj0E7EjZhUIoT6QDQtNluM6i9rRzGMApYiUyYBv9g21yToIlSiyQBQWiRdWgUHHRPZIJCRlgcVYMnpapvzi62dUEmJ6dAptmZ1Cv0/60bj8qsxUZKFJV8jLvSQapMzp8Kg10zrkAcfjOhgbaEsBB80fTqY8Cf5DdxfsmE+HV8mbw/5DxRDCSUJks31HTd0PdRGAMqfOiTUux2BUw+LgESpQe1MeD/LNz310dfHAyQd1xg8NGVosFHCzrrkXkAzX7Oq0vRGhHtVYUsEkHp0UoVP7O3aopSORcHtcbKiJm9tLvB+HLUCldzZ/j3I2498ICiZnJyxS9kPAzFB1bPcp5CrmcqQBL31Z5tE4l6z9SUKji6DdJZEJxbYZnCQADMkJHOggOe0MoiPH6O0nq1P7cD/XqcYbiqlUDRJmbpy6hTwdS7AW2fEp10xMKMzwjM+dOXzuwqPPJB/s+Pmf3rjVTEZ+duHm+cTxstVesRqpCCW3Qc1NfNJZ9MLqp/2lD+orJ3cLABXOIoOpdWOjx1FxuqYoTVVMqjVmRUpf7HiWuv0sKG3IzI8QFM0udfzCBNQ+lR1ko/w3BI/pc3eKSqd8KO5eWD4un6QZnmKgjK4GFDbg5dEwQbmIbQNrb0JXBP5/pg24EO8QW4Lr9PEYREtNnWIP1AhegM8Vi2xP7fSIRnhm0MvJTLkaA20isCyl0MW2hxKE32NETKysqOcGdoReGEhCCiUcJr2G+hrGmgQYOqu3PNaqehhZVC+8lG9xCKMIAXsaHHS2Ac5jtOThUeilSmSGkoIawagTZaRhQBhPJQARgPg/+E/ZuFYdN8dhICnCazGkScgu9C5BpEJ4AodJuMjOgCzL+FgQJGhHBCfpe4PwYdXPJlTIS/2pb+oE7tK55wJRqOjVkQhhLfCHkHF5PauPZDoQKO/MRpePc0WtjBfg/dgQZHEAWQQwQE9cD+iijfiHLvxS/pz7kfNKCMYQLFKAhlRKxItCzdEY/6KPiz6anyAuXOTi82Vm7lIneHxfM5aT0JQw58IjpVAJGAC1b6DCkqH3fMYzklk8+F4mJeDZELQWvXwYTAjvMdmDBeCBwR5rhpuqwcnE3TF72Vue/PyLH1y1D9pTG6VaMLDuYYIbARzqrSNmJdOXwP0B38gKhkVBFPPvq4gvZ+9AUhEpOh7Y4TiR+BDxGUTphQ7+OJCWMTsdmtL0zqlAbRFdC4AEAr6/X9gOpSzLAy3Ma5W7k/Q5fXzN2qNx7mZvAeiIGpOwtxC
H15gwh9ikAjGSn9t7EOg0FOVggQkVC1ADE4QeHp9Ig7406QqcyzaFJEhbBU/1xdJ2MdKd0zsyCIB0mcHjqtOaxgike7iuQbY+id9qzj/aK50EsfuVkgwX4F0zGC/hdrPdYTVAIRozZhRls460tmBV0bNsPwEnybNSY/wtU/NQTmSX9ZsWw1VbI0Kp8HGN+Xw64jEYCP72wloZid5JHScSIqeC5EBbCclQY2APxoa6adMZ2LdUgtKRJ0S3kRc5qaQJRyNtle1KXx57Etun9eA2kl0JtZuuCw4wWSspLtPOMbuQe4SUAskzGSJoBMtVh6pT9Oq9GNBHpxWD78ZHSN/2oQ82A5vHzXp6k2xSCbd1/GGjHWPdIPHytpxn3l9GWSqI6KhDWwHRBWjQKqKRx8Q3NoOVG6npN7/A06KLtLeq2kc+O5vjRw8hozMYJc+QmvQHJyokSqQQCFMfOfSwMvyMdyeWTdxpN19ZEkbo0Km9kCMXQiSK5hd2QPqRTA9G6g+zI6KAKWB40WibxnX7iHAQ4N43uuRRMlWYijydjjpd22WfrlzESOj75AyQDwArxw/obRWnRyGUc8gC8yd0JBHTCpQPXQFsuiqTngQtuC/ySnjL06fx99Ncrkeiv5Jv6PY0lhnGs0MrOU4TaMUnlImhhljZ0SikKSnksRTMIWEJbo2nDn6NyALhO/kPRgePzcHGl2o5R9T1gOOQQmrp2rXOtYXyF9O39ia5D1qrrYnFmNjDkE1TEnZkSqBiEvAxOJoDKWpIuGiy/O45P31PDBMmjwbT2CECKIIkzb0r2tuYIQpc2LvBHPmY0V8OT5mzCyFYIG/EM2fX1o7e6syF4gGxwv3x/FKkgo3+ZLz42bmHaizwCtN2I04FghRl1poghET0qnH12BceHDuDx8F6Sm8RdrClpDeZaK+i2kj7LPshc4tJRoZXcheS3fKYfjgVCeITN/moV6g4ybvt0v1GYbeb3dstsE04vbRxMHEAYEnIIhg+2qawcVxTH+62xuSTcUx02Xju0mqQcgcrQSInxG7CD+oTkFqmvYg71LPvkx/rbj3ads3ILtlUuHBzwuwFKgr5y7Uz6ToPMREbW/YE9gX0oJl3EzaFVhpFT/VgDsCqI2Dh5LvICzk6aE244IwQtY768A0x4lwVK8A/JXQXm86mEqIZTR5CY+JwioBFyIv76TuiTwNbaNqJRO+aCIfRB0iTBCa7eGMCGoJcqMSJMGRgqkw47WHqKPE7ERcSXpldJIss/MGQOBViOiCocDUibDUIunFmeIjRt9aQV2Be3EWvv4wyypRJiPxzCtSjkpgx2wwyGFuwRCiIARPzAuKiRGL7IhaGw6l8rkjCNiihKUQZl8AygqXH3ZHSwC0idrJOdFpRRNcwTj6KSkdEpA1zKv9nl8tlSYYpOR7vQDjODCAiAZwbolfQvYQlg6mmx7cY5+d8RP6TKW8IJQpniO4InR980X6SuN+3E0lMwEjaOxQvrTBuhXYvJFiO+slFe9wci+4/6RCk8IQ1aI9Som7WTOfTndZJAmWnCeIrlKV5HEzJTMppZ4oL8SR2kZQSsEtEbkGri5OBagW5yeWVI3SZrqe2NiLV6jT+7ZPzezfnCbcun99HDe0wHGOh0WJBeohtRxurl3K7p0iD3BHDahJO87KFpWTmSYuxHVHR8IMfw9Ai1IQJeNSEe1JUqfxK9oJyzVYU5UIQGh4H7SmSWfE10t4/WPnxw1Mvndn8nnYOPs3msPBTmU9+Lf82pvN/N1/+qLRi3TcQ2yRNpaWdzzp+Fg6QZOl8liqxhqQMWC6ON+JoQGjIkLIdKehLPxeCl4Z7o7ZEWvjplXkQxfaQjRyiSV9CCcKapMt4UAiA4FUt0mpCAx6jo0KtZA0lawqkow2MgNYBEFECXVG1sz0yOkgF0YjL0/nkYHHa18mROB/4B+ReORX0v9IvglPCHgG5sXnIhlAo/ZWF905nj/lsYk65eFSucaEg5iixBAZ6l1Iw9A08DN+QNvdoioDcTD+gB1BtdCdR+nLYBuS0UBokoQL+x1lMEQyTBDKNTMOM8sLPkSysLdsrprORqMPoYA5BqGtOTjG9MWLU1cbFCDYawAl6KGdysjzRDPTzw4hBdW2ZKTRcUOmT4H2kucQlsghzjyge4IRxTiy4l6aNmVUISH3V2C9+DhiXFcc6kuYQu6tMhGRJeDRktFKVlYSQPyNOi9Y8hCG4+P4SUZPk1mw1CSMIYeA9MhqMwVEx4kbJLiD1upYGKAqlm58zCZRDRVwn7aGzHjMMkiTo+ByEaQwl+0R1WAwIoJXlcfsqREGaU0RdpnnZgBEPl8/NT8cJjUGQ7BIRz8rwtyIbAb43XBC5Dy5JFhZjZiv1sNktJ9TEdH83byQmu9XscNbz2jlMAhW0NrNqatItU5hzRw1Rj6bdTrgOY1p7hTVOYXq07KfuSpCGGyQdZZZLbqlNJwFV9Scu7SxZLdik56OVZb3176svfHp3BYOK5znxrFxiAObGegL3iYNlGXGteAByYy3AlrO8xFR0fEqfbZQDL5Iq2FSSYZ+asz3VIh6CYhS1+QZ3kV5v28lRf0xrmYwmhJsOlxqjQBICk6MVMspu+p36OpuRbOS8UclTUyBhy4yPckZP1SB/jkt+74obxGE2S+aDOgO8WR4KFpoqH8C6eGyw+JY8HcSt4fmItbaC8YMkP/QWnTGMzoERjbpw2ZHlFqQE9BLbBMYxibjNKO8sqREaH1LCFmeLK+DNZcMR9IHRQCABCE8g9aU9sXLwtxffRX/12Et2ILykpKmc8MGPzYbMMaZhoia20NHgeaoTjHw0aAXGX136ZENvxSmLAEGEZUQv3HyLwF916KDGn9CdJptLJgnApRvxWyeI8E8oODgWvDRXSzWCVmx09CwZNcFIAwAKH+m0JDI0JqO6pSu3EKVuITxHVvts4oTbhD9AZw+uHgiKwgIwB/uWcgW7LZzBCaLbPYb3k7Ad7E/choWkxKIiGyWT7qHRQAmATEdoMFGx1NInYHM8RXNV3JtPvw+AmdRhccEe08O5CL+PT5F+AuwZ+2ZanCiGN54wZcpi7ofYj7Sf2kSTF0ItT46/VcYFEZYj2eCvZjIK8B6npH/rf+IcP88wHkali5sVzF1ML09lhgOlpwKOGT4WAvGCsyvHlAjZXeMgMmTOx9RuTHjmGuUolo/GJeI9dP7a42hz3kKkgN0s2g3UDEa6lBN4kGwmyUqUcIVhXMH+/SKXtFkp0GhDM6FjcwUBvDM828luJpyc1OtxEBqfrgI2CblrUwQwuUgCEsQIIKmnHnndUxLBcB7OpGtNa9DPGitWszaJISOd1KC3uTcOl2iUxnDJtq6Y5VhS7H1Vh+Crn0RcDTMbCqPCxkwp7C+GuW6JQJv0+Mpi0k0LBD1tkM2iAjqrzgLlYc39KZ6HB3lmYR8+1ze3oL1JzxGULJ+CO1GfZDahXsviv0yudzyI/9tbryVfG142Dxa09jV776mN3c3FYvliaskk2Zq811n70ejc0lqN3QYHqN6JuYjzM2vYdAH0GSzVelIaPs
w0M51k7oVCw14G7EBLad7pdK2RsJPGiEoa+qs4EJ4CwAbT5jBhAYJYMcyDkBYwHxSM+SG6yTBwITyAiDpFohVtsjFZKTQvpiq/lHnPDrnNaexKprx3INJhzKWarjL5VoBoT7BHOYE0MU9rqpPxiO68drT6VOI+I1NEqyTMsuP6+EM6yWFF4f1q0/hp4+TYTeIAO1MrOZsJeJ/AiWcX8stDYLow4vbwimmMROobM4gjwwfi5LkpWMS0dxBAcVY4mWw23nyzkl+NN7hfRtDRsOoiNjeD/XFaAIeC9DAMLzEAXSMaZw1JiWlT9M2wVJiZv9CO0h0iJ2uoeY+JbDxQls0mTg3bp9tE1HyoRvAw/zbYnd65MJU+N2BD+JbxqX4i2A4tT1iyVGbApfccZJnU+N6MpFsannxeoyUc+BvVbTCVUGpCeYTBtFwWra70+yHR4Rvezl81UAd2liYTiJENcSx84cpo3OR4R9Mjnhb70vGiC0b7mrmXkkNIKY3ObK3h2zWKhwT5UIiZORVoFTdVncQHllE2k8SWj9cRYKNpMDcyzOcyzRM2Nmy15iRhb6JMAzAVGsXpS/P13AAlBShIRJYce9nl+CMia3TR88EEiuIgPFykpi+0YzkwyGSBID4j4q3YdWkaVpSLyQofyvH70c1z5rH2fvT8r3zxrUnNSp9IL0z1SRGc7lbJLUD/AoCv0UCLNClPS2scMZ7ydK/XsEk79a4MhYcJzSlqbzCoSOJ5YmAMAeRYsmi8aPOibtBglfGTZ0cbZu1bk2v0K9BTAyMHKYXEsQxzZm3RhuKah1jyIrX24A8On6kUU+fNI/Kly8YB1JyuaeIiUDbDwPUvGE+n9mHov9Pd+DSYN1QPeRtWEgqywoh5gBsI6LqUFvgCm5QhEHAsG7Hrczsoo8IZ4ufd2fi3kRVh5SGXEc4dUN6MQs+RvQs6xeYmugPhRIu1pxswhlnk7hlzda55LX34G7kfZlT1hhP7Vu3CG/l7Z9cq5G/4Ip77faVA6xnQF3Gak5UpTsKTpmOwCpik3Owv//PtNya9CDqRXKToptLCV7fI0+hEiWZHF4rHjxp5DhtngH0fSTiP416YQ0S/XHmna43fy4Lz1dmHgdI7hTbW8FRBKKl0uYvOAm0fTJcD73Hi6MRSon03tjoY0BFDf0AQruH+5Mn6ORQKYBL4alKk++mhqZ8k8nMdzio7nPkT3kDD52Elp3XMilAd5NDiG/JjuSRj0una6PeditfxMYK00vMLq4BjzaZkVsaUlAPyVg9W4XT/5321rDNYk2dDsstWjlX8nZ8Lv7n+CFt4shzPmf3KOtV0CQaoePIARBQEG+Ag8Yuc93iU16EInIr15HnM2kz73Sh9Bklz0ps1ODNcAWZ6OBcQ39c8krwumvMYM96EQWqcwLobt1Qnz2BEup+nUQ4AqqktlMVHMZR26DGtD22UYBQEtjMelLiw6YycCIWg0aVRIdMbLeor9pDIhzTgZBQHniLkKCz1ehMTu8iz4Z+cqwZdlhM9ExseFxL8E/8TnqjjpQlAtj+KGOZkWLXv1ObCBR8J0JvNRcYYUHFhWR/0oW8phM2comnWHY/Q+RS6GUAZ1F5pD6fh4ByHShTp8yZzgGX0FdYNfCKeHjLK082HqN+LdeAWRCKVEJHWOGot0hdLgvAzqY/n1P6jzxSqTnyvm4bVFbsfQeKNyAjclZBvUnKt7Qg2NFwxGdjWSFe+2n+iYPQ4gSwjJmyWwynzWutC4njFqC9orXv6PL8CY4TmTjIzQjOYgIs4eap2HFBgBZ4HxTdxUESID83jjQRpEoaZp0CKJT4EgoAMUwPi4K/YHuQzCgcDz8Nh5puwFVSbs6yde/LDly/u/2rpnWWtuabH7kxGX29fvbs7/xuL3+vnTEzGu/0N3pmB5KOjOPoXLCzmibAWU8W2JPICJnm7vIa9BtHhg2qHcO5DdAmGLjueHbL2dHO+s99N4y1oOGI3bjZzi8lOTHfimsMsrUovnrOG4qlk+KEwDfgI3AAxKluoBVdYSciQGxwAM0KpA3ECKXTPDbstC97Myno1Yw6qw/hxI4kzZztttzJstnyiDxSH53Sy6nKiNRfrsdv7bsQqusxgEwbcXwpE4PHw4UlkgHz1qA9kp3DI+S3Xpg3WXBR1yA2o6pAGEChOu0hEqXBw6dFCyAmuTQNf3I9E4qgoq2TJxZXGm6k7DY+RgyqP+SCe4djwmDupKPwMwOsopBWhExAouH3QEsVH05dveHjc4UkahnVAqRf7VxnKjucphu3h1w4v/tjc4FppdRnuJRbOnwgYMIrSfWHHx0+XDmhluLG9EkuMnprz6Nap92x8Bla8dZiMb2nTBd/rRCKH2tGaIaVY07+ydMSIOez3WrT2eBf2YoSwKlcbV8ec+UbK5p9UnHlmPCtWByt4aKfYW5xYngHPJq477DO+PxiKWfnRrbOfFBaIIuiUx/XhMOm4B2gGyxHSoEqbnJe8L4KZiCZZFRk21lv3qPgjxEOXMA+pf4Y2dR3Mgw73ObtHtMcKcAF8FmESbCkwbpTROfzwzolWV3Lt10UXOPpr2beh9Bxnk38eu/KRvkyDKVcVOo5FF3u8Va2T55+AWzjwH3z5qcHpSSrX/4/vP4E6w1yiR7jFb0lv3qmu7cazp+1qhaIB4M4P8kyoFXhcBkFzCmXMIA3gjSuUXwg8ogwShGgGs5k9jWmLvB3vnodzzRx1B8WH6CFTiumbJtnz+7tMD0XRR2mshsYLUDRIGkFmGK3gvDS/TTnxpdiD2Y3QtemXp4nL9qF7SYVAS/LwRrTHdsJRf296WnKt7IQgC7fEl4QGDyNMBe2cZmeATSncSeEDhUERFDapJYTautajqCMvRub8yfSBxSSoQOWcMI2crKE1id4/KkY/th5ecMw9YjdkVtBw8apP6aW3aBmJf/Kmlv26WVlBdVsQOGj9IAKN56nxBUiQMGkjV+ycTZ3MG3gvc8fKYsj4LDYwAw/zZh9jVzI7+LRspM+WliRTCiLyHy/j/1wM98gOZMuZYZfSzr0fvZJ9oo5W93Yky1FUkz/9JqkhAcD8O2PoLFT2kG0EQiAVBitzEgrxT9Bj9JSSyAzGYHmOeuqZg+fjW99qXWZEfCzsHLlpYm6m69XwBYrC05W2LpwsaGTIPxilKUhhA8C4cZ4MIWK3/QVIJUYi0iIBVn1+0vqUSmdk4On0hnslp3uUoLUPrKSUow4Y4gA/rBTXS3XUuHjfkRuRVHBgjp2IViN5lSpcFAoCiwBTFsS5qa6dOn5vZ3Ul0yLQrboJkky2OwuBJcTTcvaw9wOmrUmkH+5PZ7ojuBbMxIDRjQE8w512BgktMQSdGDDJYBTxoTseIc8ZLfxIR23Sy0wH1ZifQGNYICx7W5uuOj7SLSjEUPcXuTqilxDlciIZuzj41ZX3LWv6YK9kyp0FaC9zSfWRTeQmIU0/JkMLt1OhpMvelZjKVRPx8a8XpVcQ0eqkOrwS6STNlp7wD4eppD3uDO219RNupDUARVGR9J3OTwCZmMXplaMk9uYnhv+T+FEj2/84g
wMfb8eP7+c3f7Ja3i70U6HMuzAmw/H9oHPWh1qUuQ2cJlwQ1gYiCPVlQm6p7KVCDIlDRIdtKu8J6rWnU3rB7lCOgq1KJQ1y3wyNEy5HNDdaLLZentt+obj9fG7n72dufNaqnY2Iq+TrHSdIh8dXjcYZs7ymR8/ozGBVz0ca81rzRItBaoUhOAb+o5bN+uU8tjOZERgGplZKeRKshoaLzCr14QlyqUvfd6gV95+aUqPHELIa3SlFjuidVunewfzxQW7+a1rrXOj0f/F+qpOgs5y9jRPEx1KnRTRVLwtFDtwof9OjFLrw1YP69RSpTuK2joQOaXku2xf/QVDpRtk/xH2VMSwHCnfTpmOz23mI+BjcGgeMIwfgzb7CDmEkCeVANLjzDiTpkM8/qfHsqbFuLWbGBVuWKHLlz4LD12S4JENxlr7Z6ZwHfNJoAweHZb2EthtWSm+L9NPhZ9PRJtPzApzs/1m7/t7+6gfGkijnfKvYe3KMME7+g3D1FZcBQNAX+XNwGmV+vPE3b37w288LVNiRHllm4iEchgErr8MOEeJP8cZo82+ghM8xUsxPo/1z4cJm4B2KIUS3N3rib/88dLhQqzQtvKNuPTePwvx+iYm8ocXvTUZPGuyVwscikw6g4jypGJkRTWWxR3R+Kg+aBUrq33338o3VpemPM9D2wQBlF6DNsafh9tlkojApjLDw2X/0aOsfXsSz8bwhvu1fULcv9XNftXavZbK3ArNImU4JSkHuAZy+4cGbduaPb2ZuLh98KcN1Ft+lQWPauBQdoz7jhTd+r7zzqwuL3x9VXoxKpZ7hMH9mEq+ORqmfefLhgt7cPZt5IbP9nL3JtQBs0vG+6cz9uLkxqNqT+wk/O138SqT8eqTwDsX9cHW/VL8wyKn2OPD/h6Mv/Lelbz5tHC9lm6ej1X/95S9F+6HG0G7spQlx0Ywya8HoxADvWf7WhCGQDAMa5TUIA2JVRVAMqEP4DHw/WJ0Wvm61zoSydzxYfuaJOi5QWQsyd6dM0aD3KvUoIBQaLAmOCgWU0JSyPlVKisnNiwHDsXf/erD4H9Xj5yA5sZ466nLOE0OaFV5b3H41ee8po7ysoTAru4gpXrP/K1968KU/Ov2VcWDjH7anFt5vi7wuUA/cDJgK0fLnk7fmznYOxpmvP3oKJl3+Rhjpt9QjmTPXuzSVXQqeJ+WuAB4muTRQ1srXR8cvWPM/7A8H5o2906tfm+78kqLX9GhFdByjeRFQGxSEkc81HL6ZXPr2gGJ17SqaVNDTOXXirxhPlP837zT/znVII92nFzilGBdqNvRtFT7yj14uoYhzgF6bpSy+NfrmF9eWvj1B24Kjsf7vDr77X1/mEZfe9SrXaW4WiB7RAL0jE28822MUjDOPprdnbJnjVQcJZrBlKopYpOAgub1kIW8VevZX/hnvRbG7/LLOpoQKQHyy/pUBogxMpKj/vetsMtFyTfgsCpgeBNbUchsNCELqyK5hl5Xcv32n8XevU8dDIKR+mSSTEuJszUlhZVyOrBosQXremYUGg9T6k/da/8l1KoqcSaREwCEJ3kjEyW20wkgUeMCR+FlPiOds/fXfeXDw6+fYNxAAGO+CLiPJenLbh3pKysRplLAwquQ/nZZfDRs1OS3DNTdR6Pc6TA8Bsgyh/iTUPqgSyKihXEAhlJ8bfuwedctAxlYfJQ1EHFYc2kxlShGugDWS8qD0JfHP0Srzo4Vsgcm0j0KIo/BbuoGE3GRIS87/K75S/FH45A2XsAcxtAv/49Hd31rQOyqy+RRUs3e9b/yr36Hc+I4TXdc6pEZu4FU80EPlO8PV/6v8fO2Pl6DvwneLbUm2ye5hGtEQxnPOeeuVfxUPy4aue94kCN+dzF0zjr7ev0i/73uN1e33lhk0Xfi+Xn8ySN8OofFe+BCuBeEIZAAh9MLIxcVxAdZX3mv92vX2Odw2/YQi5TZr25UF5BGQuALVts6F1373QflvneXBXfzcw4dfPfPcX//k+z+8cuqp/Qdb89cvbDKa6uX85t4oe5E5jEH4NfueHvKTYe/xFaI65ASCglCJ35l6eYjlIZXu56+2n/rZ1IfvDU/9/v7TCC6BwIJvu3cSnGGiWVD7jScPYLT+jcK7Nwbr/+7j6zwLnC2WiC4wQvrHLWbJu9ropX74Xmy84iQ+MbD+1h2ZCcnQiPqzs7IqSfhYGv8AtNa/Mqo9ycwFqbT1VoTJPFqc0u/HP+FzxndRzp4NEhcQA65MAB5LWY5iJmQvFgS7Bq0U7bbVPygf/PzC0p8cPfrPStRXiW4QWaOMnHjAMCJ5nygq/bSMX5QHRHxOn1dvnfUPk9xyPvk4thCyNJRnWWFgZKA+wR8BlWdT8UKrv/PP8A9CY10ezWW6JCGoPIh42Y618Yft2jOp1oVAnR9S+qAlfNnm00J4VeLD3X4GkKO+k0Eqj7wR49T5rHRtYc711BhRE8uYQCxiUCtBLLQpou2ZKRUz1ngCZqaQvIVNSyNW1GXW33ysi6CgAGWaQ0s1sBvh62Yt5xzGKE/DMMrdDE5eo1vKA8MwYxPLdBA7hB8Ezvk4CRRWGjzgvQiW7NpLD6kFZUB2GC3mGUfDJEAz35NzEjwwhQNFbZqyAZzQXDgcpRl3SIJeHcUZI07uB8IE41kYjBAiQgGbg9IIsBu33xmb/PYxkMivdo+z//6F/+NFxKIC/9bE/ZPuk98/PrOaaNw8WVhPN++fFCbDGU0JYObM+Pdf/t9Qdb07XiCqedraBoj6QfvcM4mdf/LBF3LZ3uDH+eEq7Rh+8buyL05enRoJZzLWf/HyR8BRa9E6EdHF6CFhyZIuQsV885LZ+fNB6bc++NlTpdpyrPWjb10hRCx9P4yWT/EDl4ZxNhOAEAVlLyl6TSL62BTFDZp07UPyVR9SOIaVh8gnsn3X/vFH27/1xGNeTvL14390+qtPGAMG7fx281xO6/1MbKug/oVb4/X/v1/70/43B2eej6JQvPhW+9wLyU2ifVZ+NVL/w/qzEMHJIx7/4bWlw5sHi+vF+pLdftAuAKjcLc+9sLbNc3nwjdP+kz3m8an3Y5iz+e+q1MPW/8rWXjv9uC2Q1OvRByvaWj8TH5zcLpDIAS6imzy4MoarTSnYaxgw0bh3OUwAm22NAxC+0Bv1DOZvc0oHZydoAkHlpWMDBh+GkhI04CeNtoW3terLU3wUUVtyZ4quFBXs5e9M9r6oz71DmuYPClr9unSO4zBxM2xp1rn13ISdrN2KjdYmZ9aOybCObhe1pcHTiwdUPu7ul1KpAUTw40Gi0kqwCMzqGryXS794rOZe/nxQmMTu6bC8YTLl7UHrZj6GXCwAdd9i6oAYFcjQxFs0LZlj0JH6OMZu2BO5pAlEWMJ3f8kBI1yYa8EDJOtQUlNUbjhddPVPUCsD3e2o1tnOEGF3kdYLYUgwbBiIdKnr7cQk52SclTkBqDwcpEipSZA4hKxNx4k6fTjjKlgifRLALcAb+k4ULhKv50iEtmxY8/bHprMwTc91U8nhoGa7eTeeGIPKFCO9g3G6EOn9pLxm6F6lmaCE
SoFLTr7qA7rsd1PL8TbpOzfFiSp3k90uhltAs0Y7vphq403o9WRgKDADaBBwYhPILeJVO3HSxaNq6qfO3/576Qq7ikrLnKZ91q5dTN36LNpmlvLXcjcyyXEp090Nx3GxemJiJbx3uqe+vn9he5Ar+5lvHF0iub/Xmy+lO82hBY9LTzlR220vhha+PnK/4IyOY3ZueC518s2tC/d7xfI49Ulv+ePuMuVhBmbispc0VOnHH09X/pvVb/zd9KO/dvVHTyxvBc+Mq+lI7Xx4itYgfHfsLpA6ypkNLVTgGzBB6RhALBzFJ2t+oN+NIgk3XpG663SpNJ6bSd8+1f3ti3/0ArOiwxGme2xE9i8Yrfz/5wTWPRpjpWxzbwLRUH977O8i5hYa/GHv0svWI3CIW+OlvmeCBdwblBB3wpF/++Dc5IgSaujiSqVSybyy+vBoknyxsJ3Rh1eTZSSV68PY9t7ccT0VOd2Hv4o0y7hjanOjToFNE754ce/u3ZWWY14pHfFcKp0ERfO5ZK9BYa6vEaxlbytDRKXmpRbFBuODrLkBqH4iPXQbCPiH/blJIjEaTgzzJDxBtREE7dCUNr+lKbQ+at3MHYA21b8sgQ9d3XDc6y8ghu9BomDCcWyly2zAxlPoDKjuyoTUQ+8RDfmjOentSi532F2DpGIlHDoqoQccj+10Sh4fAMRhM016z/fY35N7BT3jUGWBfwI1XC08/wWuhRCUJzTtGiMk4itmn+c7RpQ2PCop6QcBs+D8TgRiRG1qHTVS9UaiBp91x+r4EfgT4AfSajAWwTkIJeYmBQ02KzNTTUQliIl5tOA6U1AvZIGqEi7S4gA7Fj6XKPAdCvnQ6xg1hv+69vFR5sSz0S84rGaaY2t0FIsezQjHbBzKLKis0kiFYMeJSUgzQYTuRB1Rh06BQwXzuQ4IUL8ew4jwDQ2yoyDyyfGCSbmyXOi27VCN8T7GoGXhorkFJqG2dtLdKG0N+UY1QYPPqBKj/ZRp3IO2hUFl6Q8qmeFxzIsFtaH9meLWaqxZTPbuHs0/vXRA9fBsofqbc9+hiYq92PFHdY/JdhEm6h1MA0jPOJdDNwMTCgQFnSWsxp3W3MOTov8w3g3rjOVtVpjoMgEly8YH5XKWnjQzI9Jd3MjW6WQhS54vVDsqUbTAoRaDAeqNTdSrdvrZvhB+4wfUfRlIp0RXjMYlg5gwcjbivm4f/XTuxoule+ri9MlTO/lcj536/PrWrX7xlTOblbtzuSu1NpW7S6OpEaSTw84k6p0eYcVDtjcA6kDwM+9uzNV/s3j/y/3cJUMu6cv9xbTaqXvuxw7RS++Rq/5gtAYSwMG7MVr1Q80fD8+W3czONEGhvONHmYJ1f1QiBqk5ccKQrhOFIXDz7jo7G6+YiQ//n57u7Lfu9KwDuH32fffu2EmcTDJJJrNkminTgVIKtIBAAgnuuEXcIK75CxASCO5ZxE3hEgRI0LIUtXQbukxnMpNkmsX7bh+ffbGPzef1QY0s55yff8v7e5fnfZbv833uL2x92pomdrn4yRd1/x7XZ0WGypVOJj+4Xj6qk7/DeH6+RZlavnJ4MEqvzO4b66lSGxTR+c0X5dxCa/e4iNCJoXGO8e2z7fP9VHmpcfpJ8c6b63vNguhuttprH2YzC51ublJJciGEYq1dvXPsxV+d2t1NpD/3uU+4uGce7BemOr2p87fvvljfmLrxyk45093rFO7fWYulR7fndjOlAcfPQSZxdfFwv1NYurHfPE3mrjZztS46P8mcfIRwRcjauzv5TKX/dHd6ptY003KpoShAo5WhW3Fbqs9hBXLJtLr2NOHLSOCAKDyLtK+E6o3ojUneXHeCc0W/8yNybyCDKD3BfDzcfzvZ72ThmItr7CWa8cWwbinlgEKcmWoCT+S5UhmmshsRcCD2A45hsIk74wyXXjUqBCMY4JPOLS2y3Y92U5nywQWCCRtn6Ql/Y6KK72YGjmYiUlK7PlF5zOQLtAKDhro8wQLsJ1SNxIlyvv+ACg4+eoENiZo+6kQOK3D7vGehtG2rH6zW0QKCmcTjwxmeqP4xrxVNejIgFemkOxbkZKQ63NwtI+pJlvo029xiqx2HqEePrXjI+cFxIUVjiZ+/M7XKx/XFwsd34o21szTWhtcLm3+7/dkHN9YlANkTvt2fetS7VYu3finzKZqaf2k+lHNgDX23fo2WK8ImvgRMiFoLc3F8pUOsNXeh+CaVa1W3BJMKgA4wQ+XV7sY6h2BEPNrOTOllr77IVoGe7i1s/+gny+xPUgBjFaPgcWP2n3/wxi+8/hi4cT65NB39eDFGIga7dy6WqymLVONSu6gp8sfynJz8vV/4nyentdYXk785/aN/Lb7GcXXyMrd4/aS5mCrnuntHRTnSAmIWeb2Rfa20XT/vg6TWR90XZ7HtYfmbF5zb8a+sfeb3r3/jgw6Q66SN7vA096Jb2xxWQnhtmAMQ0+ztQeledtOR5fRROdbZyZYWE8fCQs3PpBRFDd5CrzY5YtFsHJTzCcHxSdDwm6WDb728/ury2vNGFX4FpyPT4P7Uzjef30AOcPetHfuJqvcSqVRi/Pzcs9YbyXenXyIZiSwF/6ROdp+DYpOg/Nad9NXsUeI2PGfsQWk9egPUPwQ5gRYC9kMNexjzyXNkrV+qfDwVbd57c9tf9RtXoMTI2+/t0r3d84M85UEAPDedbD85nC7Xumyr68uHa6eLD2tr7UHSshSl+Prj6ltvPfvfD1fS5VDwZ3Ujb9BrRaDY87fmNmjXr5W3MSYDGwtmPG7OzqWbjdPU6qQiC4EaFzSbAyAwMrAd/UseRLk9Zr8bCruB0XCXgYzmtkI6hb/SYbhPJDckWmcqcQ/hS0xOVgya5I5tHN0ltG4AXoKJciW5Z1geIUNksvbR2eYXghtWf8iWsDgrTzAFx+RtAENwz0hsn/rWUeONKbE1MDeEBTQEJJBIskKBQUWf8B/XrX9wk4mj18KOyrXQvDWyVY4TK5v7OcDXNCtgF4ImAgOp0jXJRG2ezrV7hfhoOeA5OIW5Xc6vTY6XB33dhFZArzmTNvad6eC4GgfWjgYZZjAD9Z38cwP2fDhjEQIuvlra5cH70vXH5t9aMfPxYPFrh3fMAGCUvzx+dyl5JByy2S4X8LFOjoyBTi/kNeX0WuF4NVMRziolus9zNQNTSXaLeUWwhtm5441oWRYcsm0U2rMl7rkQw3j11qagZbHWE55K3B7Nppq1eBsu+Xr6oJFN3y9tUbYNLajkh4Mrq6c9Njo081vJ3RTE28TEiUqUl4Ra7LCVeE6lCabySnz/N2o/+vfova2H7dv5PZutiLYKhzu9ghnsfet5BWKH/9VdNGU3Rsf/0HhAvqCoMQfAMFqj9De2VpaKJ2a2a6WrX88cWkUm63szL5pnKamARn61U1WXBFRgvVd+mH2+NpyyAgUPcKjqYWtyLtM8SGefHMzcn4FybXtfXglWj0jp9cXDH+5fuVXb15hrs4dsex8gQrcaxQfl9U4jpT2wH+UYoz15s3BwLH8l2RQ5YPO7P9CFpoJzSGJms8womjr
E3jE0nUGvrEB/5VIaf2WZF6OdndPyXLy+NZJcoxgthrv8K6kdGBqbOV8AgdJ5WWwX8arAGTXit5rXkge55DKTh9E0c2//Z8vPZh82nfasOUW24te5MXPwaXv6bm7nK8/fnk23ljJ18IbXMhv/vX0TSdwQ2KieW5o5Xt+thAyaS1qUkLkrvpzdRPcQ2O+KL6Rbx3lmhaGHnVhQ6THki3rFJ7Fc8+3KMwzo1SA+GPSXxTfkGjcDkbOFXVi9qL8akgOsOg4YhRZkzXOcBobjcggoidtUHwVQZa8C5guigjw7Gmr3XEZgj+9fwJ1YwwevB46d3FrwE9pjQyQ3AtAUqC/5pkL1Egyp6o0EnTeQmZ/3AglAcExNA+yfdZsp1NH9pH4NziRN5YmhkIABWIFG3ZA7Qrfh3wNcMtJOc9wSd4Qc9dVoMWz8/vvNzyxcOz46y22jSGhU3p7ekDb69fYdEAWXCPGzalpnqZOzjOliAYeuufwnXux/RJoeZ9b67M5Qi3BeRMAlag+hJaNygr2qlZgqBe4DREmUSEklyR8Dga8M8IN576DmhbDnxEVOlpf+nwSeDDe0NTXP06v9GgjlfLwOUb86rJUiT/93MO20t5L7u2fTJsrWWdlCgqkIBTbADoV5bdKRELT0ARhNV2AZftyezZLBExMs2HYqibRCdErof7Vfre8WAojqIrKQOcnFh3uDAhcU6/1lrhp69TzytDv7ZGdaC9u55E9Opp7lZz9qLwgRKyWgVhlr/E5hpxTvScwlKD3OU7wafteXzepnr760wUKrXs0cfX335q8tfPykPbuSDYyJ3tFa0lH25PvTO6ChfGaWmYfqnO1Oke5gQ0YQejjIQeTRAO2QjWT68cms+nZ4Isb7sGF1vt8eutkvzSRbNiiYRL+PMlDBmeZpKl4dSZgks0BhjCCNl5EFAqHHPr/0XKjZuJgAfMV3ynv8ZBAp3uI7sg3m+qjlZpPNldyhF8EG8El99l55BxrEUr8i/t3LQ+TcnN8ngk+lLq388Z+JtLSWU63lS26vhtRyaKmJ2oeBOBmBHL8qp5M4PMSdcBMvENuP41U8mn+WZeyvWCeU6bAgxfrDUkGr1cW9TXrwHYVNDBLa+hn/yQfq67A8Ek8XA/BPmDiMwUlk+vuqEYE9uXZkGwwkmRLe0QceW5CTCC1DnZ2QNoW29az6fszXQRVKX0D8XElaNwlFncAXi70AGYMSVQIhwE4mokwnYQmY/ICcvpDBbV+CtAprchCnJU7l2wB3NHgn+1MuyS80hHUK1vMo+ubUJhz999eWpE5TBdsbheyilMFgVQMxy31XQpDpAj/p8gL3O+xSP2lmwFJ6HxguFp0MoOJiqGXNfeW4sZmdahw1kXJR7i9GSLtFU4QtQp5biDcC1uM+YCDh/1bPVfUbUVnl74gPs9ncysYDXNOE4zEKu3pUmrgU6BE/sPGeSTe3uiXeXSW+jbrGXMnWSZZH9Tn7gxcnCEwF05QAg7b1W9sYroMzmOYRvOjy/BFZcK107EwwJiEfyvlCuYF7t9lMT3HnDhJgYhrgWp3ZWCtOloeO29g3j0ocj0JBpemW9usQ/eBu3hq2LrLceXf5pWXwvd0lhS6kC1HPXsntW2ZgXA/Lq7wXthr98OPOFQnTH3UX383/ZHNYJXpuJPce9+cRCwBt7wxLM/FmHoAgYLgTf/Xic+NcpL290sqVfX7sg93izNzJ/mEhuEwFvRf6bGFKIHQerX51fYp6P4TIjY/AJ0MlCQIuct5upNO5gTTFejf97sLLjU4Z6PLg0XTm5omxgfm8XdyzUD/an7eMm3u5X37zkS7S+Vb4vz69Uyx0zZmfXeRw6nxwsujz5knxwfyGQTkeZFz7ol2bS0u8P1vvVKhg0drPfymzH8KareWAwwKPCIsEheu+3BZbYshCEmCxrvyEiFzgv7gsVwDEQNaH9FCXi5yEEwLoEghczk7A3BDpnNMh1kfbJOxIVesnIDAqo2jxNDbTj1bRmNNjJ/DwIflUfnQwdyY8xZsnqVkGLc+NVIHJ6eGoeiYhjSN0QFkonaaLg8HKaWy+X55pxYrDylQL30GJXpcdFPK9mXxbavdMuZUqDpZnjvLFXinbCzC/UkB42eZm8i3WBeeYJtbynWqma6uUqyIVSNUhm57a1Lr1nZn1rY585As+TBN9MfgwsxZbdaa5UGiKpiiRZ5XqXPsY5ke3c6suvxTEcNJKHvqxwAzt0lRdY8rpHj0NLYoPieSIQrVSO8JFYAYg4MCpfmduN5Y6F7ApFnqVmeb1qUPT0VY5U2jZyM1ji4p+W011d9v5g04+kzjlQLbAdtoFpTN5gNdbZRqabRm8gzxyfn+UsNiKicFGu/xob26+2AyeGBGss8RBK0urBDCzt9DYncYjYp8VfHq4vAZPvP54brtZOPm42ktPEgfUYx4vJBqVEvMjCAvusdZasTzdcuFmu5hInzoBZnh/s5yu9Mz1gNhClTuM3ywfpmDl5bglI+8sr623KnOZhg3ZVqbzGag3Cof108yD4jo5Qlk1p93KVgx6wsu1LGg1EWPOsQyNmiPCfxYk9xgNXM+wi14Mp7GYAKObBgQEN8n8VIPyT4QJysZrUt2kX53LVL5aPoZ3Qe1OIAapfDHJd0X46s+QmXWYTpX61EVrzM4PMipuJ4JNlFOatnYEQ7x6KGGE9N5siabOX5ygToDUSWUzw0RU/kuEN7iJ2A9cKtHPp4LSwZVgqrBB6Bq3c3tICd5fWyYLYgEmO5K2h18o1J2iDQbWFkRDXisabDDODBm0NkAEeFApkgbViNx+LzCoyxwDF9QnTEfYJXosm9BCpZHKOzIDJGUyCO2KFNEA87PhBS6Ticy9JnWfruL9TSkRFa/EVOtwQETPTAVKhX7x2cIgPrkHQ82GUbSU7h0mszqCPjlONrEL0Qogub1t6JgAhw+a57Xy0bXsEXuJhU1bM5yMAX/lS2zItcDaHjk1v3291DyDTU+yOoF+QtzS6zTv1fTWsqo3UDQqI12WQ66Xs9wtDDAFNLgB3dlprqLjEWw+mD1uK4Sj5iM3vbsFRfE85rjbOujOFCpnug/F5mrqsFGFgQ5hOi2ZiTUash1CJDYcYascqysMCaUU1XlCG6hh3qWirkE5BIoUZnLEcbd1mpelJnmoz+On++BlJQ25hFLdnUvwkZj640eI9dtwNNi75yP9vUpxMcGgTTofzlsM8xFX/2EAFQtj8mRqkvgqyCv5bUlbNu6DI49FJ6srdit4OwivhdTJ8EbUX+nk9md9a0V9ufqRObVWqWn2/fTGd1Mrt9M7jEadqYV3ctsKYC8k63eTW/ujvCbpB1ucZh+cFajWYcgkHnF6n/PuSMEmnEZLsbpUppMR10FBy2/l9yKFC907lzjZGpblwd3K7Bp3x51sgHQ4ABqQZw3K+iJq7wU0vZwGQct1jhfkmzmcyy2m66yMrV6pEBt8ee6T7UrpcD7LchbEXi1V6c9vFDeZ01RZ0pmhPp9N2dhNBhapp/DMB3V3YpKnigmayYWbMxdNYxFR6owzv7ezpEgWKQOMxz
kZUyKzsDqylkwqW1n5aefoXhbDxezf/PDkt94AdgnMc62A2bOQJGpltiF3L4qfBuCFWBNkhoIb8XZ06t9etN9eBud2K7x9DODGSgpfXflR/7SSas8nSs/69VeSy+WjG9mDYMmcx4kTdoglxEQxlYmKXjJgz8ev5E/5RJIqX03xyJ4TJ9Fq6woAAAkTSURBVDzaesq4MkjGDjGURS5nu5ORViaNywkWT3eUvJXaISZx/Zhk40VoI/JQ4nP81ZnG1Vf6TLitdP/LILgR0mvg6Sm1Jy7kSAepz8RyxK3MBl/N2vHMHt/EqvZhbPT74OtPV77bOuK5GkOuOx6+Xq55gD2eAI+4nFXyk3Ezt37aGA+ajrXGjZdTkoqAJqM7tyZPpZVo1bglGuag02Cg3dytfvpSPjvnhJqp0OPlc3mY5FdZoi43uVU+dVyrnFaNtbXHrRz31f6v3tuZqknxC9BzbcZFb5j0cynWZdHNJmCuzi/FTcjoey0fHmHsTGj1JwzBIB3kmn/ExCuJvXxkqAM9dyHaiCoUFenpJQ2oygUOZm1c71UifR1Lcmmh8xFnu8QRrmStMi66iEgSXioJvQpRx4+s0irCv1F6KckH2wMDnI02luJHJ6mMDvd0w0SoWcYe4YORHY/+YuL/e0yvOk3DPNQjmqm0h/owl4B8Pl2I12UntlO2NW3uEzfusJQ4dFU53iEEiUInk4OO6w2TUxUMQF9vN1R6QhWTS4mvEIAwDPvFdNVvb8xs3c7uSSuLASv1KoHXROq6dnSnMLvIuQ6ckza9i7srfl8KiODnNN3F2XHMsO4cxLFJX7UTIo/pzGrAxNnSNNQymIp7lj7qtG6RdBepT/dGs0zAAFkUhLBj0QToytxcRmtjWH7RqL5e2wK0r59lvn88BzmBK57jK7QHhAiHzsTEK1kibeR8b2uowmugQYj2xjuGE3T3eAl5PV3pyN/svudkgl9n8SVY1frCTUwUHhQXkojjrYnIJNum48GTxpJm/oVlH+/wQ64PKrT/HE5YPE8cJKcUyjjpaDw+Ppl7q7LBRc5JbTbwfREoVCmWgOZ5tAtl37vKQ49Ps3gviV7NsxrdhAvHjqGdBoljQwKHx/kqXYswGrvyfSVWx9lbbugrZ53f5KurCHUNth4cIVzckHwdY/ndweTwUnqMsBczcI5X4MhxzpGU9MsO9JW8dytfNdJvF44/+/CyXjkripacAhKyOceNh5Kjh5tPtrVHrfn+KCbObp+HffWmvIL6zePgy3WRzcdmwtVMRK6e1ozXo94VQQsFj+XgeqLxtY+dRLPWvC3O741LZcRq8XksZZxg4LRfv3koOUvohK8XcR6mcbegWtwalCg+vpL5R/GcZWaTlF3giPXA22mspYnPoUu5iOscT/fjET+dQpf3H8sv0yBvpPjA5UNvTZS/Wb8pk5uzx0Y39pBRZMxATjgOcFq09shhMG0c9Nn9G6CzkZF76l798KwzxZmHAwpG/9O9qfvFrc+XPiUvNG/y6l/8afkTDKJn6a3OwYMiUPnsN45HWZ4dFoXyV92Tm9n8pjLHSqAl+Da3fw7dGDU1+FoonyYGNVU9g/ZslK9S2fc4qEMRcMxYhwz61F5v8kdPB1983Z55eD9umlFHF35lDVJpMVWXKDjuAgOm6T5bITQ98mY8S/S+tzLDDIluHW9ZesfJ43/2h7EA89tiG+uohmo7pP8W/uknr4VqjGiIYEe6sXSpzwsiv2ZcVxRLpJuwy6UgQ8NJ5QkOhrPIaSeESfO497JdgSx0DOr7nslXhtjZibOKc6vBaTy4h6svEaDPK4OAdw37q4BmWCrhA0AQGB2z02c+r340M9+Wao2rK/RMqEZ6ycAnJOoH0xpPOP4oVU0vK7wGNtTLtxQ06l8dcupwF4cLA7JqwtqQbG3LHHNmmpM2K0HaQBeCpoCV3sTWo/JEKL4X6CoQ4e6FmDBCV3xW3FeFDxNg+p4C9q0xgBM2A/Dg9Fb06lc2nvzhQigItR0fTJ/NLB9z2zBysPJY8xQqWYLTlSaZzc/M5WukBMdCsZ12Gm6RxXvUzuhwapL4UL+dTD1L8p9lrzVU40nuMvtPAehqX08cvBsQc2EwucygGhuR4fTZ3H9G935dnlSAiygjMXzQHsrdEdSQAzAfMgkT3823bgdIdGw7QKhdKP6syI+NoTs/Ssx2B/VUejMOO4Z/XhYIwuKTtweJzQSP+sy3I/u/PJw8ShQ/BQA2FQMXBFRnyDLPotINGgoFRdVBrRK2lccYme+pZIbYG0i48Cghyj3/zcH2e0nnA8RD3oL+BRq6TqiEU78D5h54fi0QHn55WNyW3k7yQHDg42RWCuUo8EeV7x7+7tX3H3Xmv/bR3cmVP/mzG3/NgxnHbz+4v0SxzDzaPtvaji3M9+7Ox7/2/YmHr8U2DkbH9Yn7r0RebO3/5i36J99p6l/e3/uDdxWRip2gUoz2ZlPpf3y//dvvCHgg6pv8zo+jd28xuIK21+n1r9cInIM3UrN//u3NP3oXmypHn5AdD2SYsejwLjUln+mcoaMNeogNBAlNvjrCOS4mA8Dpq+PsRu4K5qIkt8ABwWeN4SIQEDG9qaTnqqOcX+mP2mEsidHgZQ2VySbjdaHOS/XQ0qCZWx6sYnYsTnLwbk2xcPAdYU9xCYCrgtvqUhyqfBKIOZDBiJbp/cCYijFpbnTlqxcbX5osPYoYVJcYOSsB1QCkL+JDCGnz22e5pGdzA0VpDIlh6CzDkkSNlpM9lLsY+UCkEc+tReRABAgyX7S3gdznjlb1vh7yUN1NP6nTRAJydKX3Lho3AoEabaV99Ty3HvCTwkKXQRoVOQOJM5eYvdMRwdWwgJHVL5+Ds6fX4xakx7HVtRwhAI9A41pglyyunR3fislaAmXmd6zeOzg4zhe/lQIBlwQzafNDjgpmlZJbfh6KZnIMNxNca/y6YfGgPGvFoYIBevV24F/dUebp0rF3OqGd1Q8YNboxZNXwMtDCVPzsVSOz/7Gz/avzuO3KT8/T+8ONX0xCe2KFDhWLoxO9uXMQLnGy9pIaL5Mz37NnZOe+unv8cFodaL3KdUPtUgLeVUE5OEePPUh9sLrzO7dqH/VkIxz8/s9IQQwex0uyP9Ssskz6VTwGo8bV2Ow/PB9dm+3Op5P10/hRF723aeP86R90D97MtJZDYgrXs+odbLe5r+1O7B22fvFVNeQrT0+zP97e+/JS9cN280bWu+Sfnhy+XSmsD6DnnVB+dpY8HG7/XMYqPfxMaGr4wZNf7Zb+Lrf9hYn/A7WYJ5VbfHCnAAAAAElFTkSuQmCC"/> </defs> </svg>
0
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/01_how-to-train/eo.svg
<?xml version="1.0" encoding="UTF-8"?> <svg width="78px" height="62px" viewBox="0 0 78 62" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"> <!-- Generator: Sketch 61.2 (89653) - https://sketch.com --> <title>eo</title> <desc>Created with Sketch.</desc> <g id="Page-1" stroke="none" stroke-width="1" fill="none" fill-rule="evenodd"> <g id="juicy-flag-sprite-7" transform="translate(2.000000, -1516.000000)" fill-rule="nonzero"> <g id="eo" transform="translate(0.000000, 1518.000000)"> <g id="Group"> <g id="Path"> <path d="M14.563,0 L59.437,0 C63.722,0 65.785,0.399 67.93,1.545 C69.8569512,2.56682087 71.4331791,4.14304876 72.455,6.07 C73.6,8.215 74,10.278 74,14.563 L74,43.437 C74,47.722 73.601,49.785 72.455,51.93 C71.4331791,53.8569512 69.8569512,55.4331791 67.93,56.455 C65.785,57.6 63.722,58 59.437,58 L14.563,58 C10.278,58 8.215,57.601 6.07,56.455 C4.14304876,55.4331791 2.56682087,53.8569512 1.545,51.93 C0.4,49.785 0,47.722 0,43.437 L0,14.563 C0,10.278 0.399,8.215 1.545,6.07 C2.56682087,4.14304876 4.14304876,2.56682087 6.07,1.545 C8.215,0.4 10.278,0 14.563,0 Z" stroke="#FFFFFF" stroke-width="4" fill="#7AC70C"></path> <path d="M14.563,2 L37,2 L37,29 C37,31.209139 35.209139,33 33,33 L2,33 L2,14.563 C2,10.195 2.455,8.61 3.309,7.013 C4.14439321,5.43490545 5.43490545,4.14439321 7.013,3.309 C8.61,2.455 10.195,2 14.563,2 Z" fill="#EEEEEE"></path> <path d="M18.597,24.712 L15.179,26.509 C14.461296,26.8860958 13.5917457,26.8230516 12.9359421,26.3463736 C12.2801384,25.8696956 11.9518104,25.062049 12.089,24.263 L12.741,20.458 C12.7889305,20.1767734 12.6954468,19.8899654 12.491,19.691 L9.727,16.996 C9.14590837,16.4302004 8.93656721,15.5834909 9.18704884,14.8120924 C9.43753047,14.0406938 10.1043622,13.4784874 10.907,13.362 L14.728,12.807 C15.0104071,12.7660882 15.2546002,12.5888333 15.381,12.333 L17.09,8.871 C17.4488378,8.14400818 18.1892709,7.6837619 19,7.6837619 C19.8107291,7.6837619 20.5511622,8.14400818 20.91,8.871 L22.619,12.333 C22.745,12.589 22.989,12.766 23.272,12.807 L27.092,13.362 C27.8946378,13.4784874 28.5614695,14.0406938 28.8119512,14.8120924 C29.0624328,15.5834909 28.8530916,16.4302004 28.272,16.996 L25.508,19.691 C25.3035532,19.8899654 25.2100695,20.1767734 25.258,20.458 L25.912,24.263 C26.0494654,25.0623361 25.7210358,25.8703735 25.0648921,26.3471443 C24.4087485,26.8239151 23.5387757,26.8866677 22.821,26.509 L19.403,24.712 C19.150663,24.5795277 18.849337,24.5795277 18.597,24.712 Z" fill="#7AC70C"></path> </g> </g> </g> </g> </g> </svg>
1
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/84_first_ml_project/sentence-transformers-explained.svg
<svg width="1345" height="552" viewBox="0 0 1345 552" fill="none" xmlns="http://www.w3.org/2000/svg"> <rect x="1" y="1" width="267.26" height="550" rx="29" fill="#F5F1FF"/> <path d="M23.7274 151V135.256H25.7194V151H23.7274ZM29.9361 140.656L29.3841 139.744C29.9921 139.344 30.4401 138.92 30.7281 138.472C31.0321 138.008 31.1841 137.432 31.1841 136.744C31.1361 136.76 31.0721 136.768 30.9921 136.768C30.6561 136.768 30.3601 136.664 30.1041 136.456C29.8481 136.248 29.7201 135.952 29.7201 135.568C29.7201 135.152 29.8401 134.824 30.0801 134.584C30.3201 134.344 30.6241 134.224 30.9921 134.224C31.4561 134.224 31.8241 134.416 32.0961 134.8C32.3681 135.168 32.5041 135.704 32.5041 136.408C32.5041 137.368 32.2801 138.2 31.8321 138.904C31.4001 139.592 30.7681 140.176 29.9361 140.656ZM35.5119 151V139.336H37.1439L37.3119 141.016H37.3839C37.8959 140.456 38.4559 139.992 39.0639 139.624C39.6719 139.24 40.3199 139.048 41.0079 139.048C41.9039 139.048 42.5999 139.248 43.0959 139.648C43.6079 140.032 43.9839 140.576 44.2239 141.28C44.8319 140.624 45.4479 140.088 46.0719 139.672C46.6959 139.256 47.3599 139.048 48.0639 139.048C49.2639 139.048 50.1519 139.44 50.7279 140.224C51.3199 140.992 51.6159 142.12 51.6159 143.608V151H49.6479V143.872C49.6479 142.784 49.4719 141.992 49.1199 141.496C48.7679 141 48.2239 140.752 47.4879 140.752C46.6239 140.752 45.6479 141.352 44.5599 142.552V151H42.5919V143.872C42.5919 142.784 42.4159 141.992 42.0639 141.496C41.7119 141 41.1599 140.752 40.4079 140.752C39.5439 140.752 38.5679 141.352 37.4799 142.552V151H35.5119ZM63.2631 151.288C62.4311 151.288 61.6391 151.136 60.8871 150.832C60.1351 150.512 59.4791 150.128 58.9191 149.68L59.9031 148.36C60.4151 148.76 60.9431 149.096 61.4871 149.368C62.0311 149.624 62.6471 149.752 63.3351 149.752C64.1031 149.752 64.6791 149.576 65.0631 149.224C65.4471 148.856 65.6391 148.424 65.6391 147.928C65.6391 147.528 65.5031 147.192 65.2311 146.92C64.9751 146.648 64.6391 146.424 64.2231 146.248C63.8231 146.056 63.4071 145.88 62.9751 145.72C62.4311 145.512 61.8951 145.28 61.3671 145.024C60.8391 144.752 60.4071 144.408 60.0711 143.992C59.7351 143.56 59.5671 143.016 59.5671 142.36C59.5671 141.416 59.9191 140.632 60.6231 140.008C61.3431 139.368 62.3351 139.048 63.5991 139.048C64.3191 139.048 64.9911 139.176 65.6151 139.432C66.2391 139.688 66.7751 140 67.2231 140.368L66.2631 141.616C65.8631 141.312 65.4471 141.064 65.0151 140.872C64.5831 140.68 64.1111 140.584 63.5991 140.584C62.8631 140.584 62.3191 140.752 61.9671 141.088C61.6311 141.424 61.4631 141.816 61.4631 142.264C61.4631 142.632 61.5831 142.936 61.8231 143.176C62.0631 143.4 62.3751 143.6 62.7591 143.776C63.1431 143.936 63.5511 144.104 63.9831 144.28C64.5431 144.488 65.0951 144.728 65.6391 145C66.1831 145.256 66.6311 145.608 66.9831 146.056C67.3511 146.488 67.5351 147.072 67.5351 147.808C67.5351 148.432 67.3671 149.008 67.0311 149.536C66.7111 150.064 66.2311 150.488 65.5911 150.808C64.9671 151.128 64.1911 151.288 63.2631 151.288ZM74.8057 151.288C73.8617 151.288 72.9737 151.048 72.1417 150.568C71.3257 150.088 70.6617 149.392 70.1497 148.48C69.6537 147.568 69.4057 146.472 69.4057 145.192C69.4057 143.88 69.6537 142.768 70.1497 141.856C70.6617 140.944 71.3257 140.248 72.1417 139.768C72.9737 139.288 73.8617 139.048 74.8057 139.048C75.7657 139.048 76.6537 139.288 77.4697 139.768C78.2857 140.248 78.9417 140.944 79.4377 141.856C79.9497 142.768 80.2057 143.88 80.2057 145.192C80.2057 146.472 79.9497 147.568 79.4377 148.48C78.9417 149.392 78.2857 150.088 77.4697 150.568C76.6537 151.048 75.7657 151.288 74.8057 
151.288ZM74.8057 149.656C75.8137 149.656 76.6217 149.248 77.2297 148.432C77.8537 147.6 78.1657 146.52 78.1657 145.192C78.1657 143.848 77.8537 142.76 77.2297 141.928C76.6217 141.096 75.8137 140.68 74.8057 140.68C73.8137 140.68 73.0057 141.096 72.3817 141.928C71.7577 142.76 71.4457 143.848 71.4457 145.192C71.4457 146.52 71.7577 147.6 72.3817 148.432C73.0057 149.248 73.8137 149.656 74.8057 149.656ZM92.0182 156.376C90.5942 156.376 89.4342 156.104 88.5382 155.56C87.6422 155.016 87.1942 154.24 87.1942 153.232C87.1942 152.736 87.3462 152.256 87.6502 151.792C87.9542 151.344 88.3702 150.944 88.8982 150.592V150.496C88.6102 150.32 88.3622 150.072 88.1542 149.752C87.9622 149.432 87.8662 149.048 87.8662 148.6C87.8662 148.104 88.0022 147.672 88.2742 147.304C88.5462 146.936 88.8342 146.648 89.1382 146.44V146.344C88.7542 146.024 88.4022 145.592 88.0822 145.048C87.7782 144.504 87.6262 143.888 87.6262 143.2C87.6262 142.352 87.8262 141.616 88.2262 140.992C88.6262 140.368 89.1622 139.888 89.8342 139.552C90.5062 139.216 91.2342 139.048 92.0182 139.048C92.3382 139.048 92.6422 139.08 92.9302 139.144C93.2182 139.192 93.4662 139.256 93.6742 139.336H97.7302V140.848H95.3302C95.6022 141.104 95.8262 141.448 96.0022 141.88C96.1942 142.296 96.2902 142.752 96.2902 143.248C96.2902 144.08 96.0982 144.8 95.7142 145.408C95.3302 146.016 94.8182 146.488 94.1782 146.824C93.5382 147.144 92.8182 147.304 92.0182 147.304C91.3942 147.304 90.8102 147.168 90.2662 146.896C90.0582 147.072 89.8822 147.272 89.7382 147.496C89.5942 147.704 89.5222 147.968 89.5222 148.288C89.5222 148.656 89.6662 148.96 89.9542 149.2C90.2582 149.44 90.8022 149.56 91.5862 149.56H93.8422C95.2022 149.56 96.2182 149.784 96.8902 150.232C97.5782 150.664 97.9222 151.368 97.9222 152.344C97.9222 153.064 97.6822 153.728 97.2022 154.336C96.7222 154.944 96.0422 155.432 95.1622 155.8C94.2822 156.184 93.2342 156.376 92.0182 156.376ZM92.0182 145.984C92.6902 145.984 93.2662 145.736 93.7462 145.24C94.2422 144.728 94.4902 144.048 94.4902 143.2C94.4902 142.352 94.2502 141.688 93.7702 141.208C93.2902 140.728 92.7062 140.488 92.0182 140.488C91.3302 140.488 90.7462 140.728 90.2662 141.208C89.7862 141.688 89.5462 142.352 89.5462 143.2C89.5462 144.048 89.7862 144.728 90.2662 145.24C90.7622 145.736 91.3462 145.984 92.0182 145.984ZM92.3062 155.008C93.4262 155.008 94.3222 154.76 94.9942 154.264C95.6662 153.784 96.0022 153.24 96.0022 152.632C96.0022 152.088 95.7942 151.712 95.3782 151.504C94.9782 151.296 94.4022 151.192 93.6502 151.192H91.6342C91.4102 151.192 91.1622 151.176 90.8902 151.144C90.6342 151.112 90.3782 151.064 90.1222 151C89.7062 151.304 89.4022 151.624 89.2102 151.96C89.0182 152.296 88.9222 152.632 88.9222 152.968C88.9222 153.592 89.2182 154.088 89.8102 154.456C90.4182 154.824 91.2502 155.008 92.3062 155.008ZM102.264 151.288C101.528 151.288 100.992 151.064 100.656 150.616C100.336 150.152 100.176 149.496 100.176 148.648V133.912H102.144V148.792C102.144 149.096 102.2 149.32 102.312 149.464C102.424 149.592 102.552 149.656 102.696 149.656C102.76 149.656 102.816 149.656 102.864 149.656C102.928 149.64 103.016 149.624 103.128 149.608L103.392 151.096C103.264 151.16 103.112 151.208 102.936 151.24C102.76 151.272 102.536 151.288 102.264 151.288ZM108.981 151.288C108.005 151.288 107.189 151 106.533 150.424C105.893 149.832 105.573 149.016 105.573 147.976C105.573 146.696 106.141 145.72 107.277 145.048C108.429 144.36 110.245 143.88 112.725 143.608C112.725 143.112 112.653 142.64 112.509 142.192C112.381 141.744 112.141 141.384 111.789 141.112C111.453 140.824 110.965 140.68 110.325 
140.68C109.653 140.68 109.021 140.808 108.429 141.064C107.837 141.32 107.309 141.608 106.845 141.928L106.077 140.56C106.621 140.208 107.285 139.872 108.069 139.552C108.869 139.216 109.733 139.048 110.661 139.048C112.085 139.048 113.117 139.488 113.757 140.368C114.397 141.232 114.717 142.392 114.717 143.848V151H113.085L112.917 149.608H112.845C112.301 150.056 111.701 150.448 111.045 150.784C110.405 151.12 109.717 151.288 108.981 151.288ZM109.557 149.704C110.117 149.704 110.645 149.568 111.141 149.296C111.637 149.024 112.165 148.64 112.725 148.144V144.904C110.789 145.144 109.429 145.504 108.645 145.984C107.877 146.464 107.493 147.08 107.493 147.832C107.493 148.488 107.693 148.968 108.093 149.272C108.493 149.56 108.981 149.704 109.557 149.704ZM122.371 151.288C120.915 151.288 119.747 150.76 118.867 149.704C117.987 148.632 117.547 147.128 117.547 145.192C117.547 143.928 117.779 142.84 118.243 141.928C118.723 141 119.347 140.288 120.115 139.792C120.899 139.296 121.731 139.048 122.611 139.048C123.283 139.048 123.867 139.168 124.363 139.408C124.859 139.648 125.363 139.976 125.875 140.392L125.779 138.4V133.912H127.771V151H126.139L125.971 149.632H125.899C125.451 150.08 124.923 150.472 124.315 150.808C123.707 151.128 123.059 151.288 122.371 151.288ZM122.803 149.632C123.827 149.632 124.819 149.096 125.779 148.024V141.928C125.283 141.48 124.803 141.168 124.339 140.992C123.891 140.8 123.427 140.704 122.947 140.704C122.323 140.704 121.755 140.896 121.243 141.28C120.747 141.648 120.347 142.168 120.043 142.84C119.739 143.496 119.587 144.272 119.587 145.168C119.587 146.56 119.867 147.656 120.427 148.456C120.987 149.24 121.779 149.632 122.803 149.632ZM136.696 151V135.256H138.688V151H136.696ZM149.701 151.288C148.965 151.288 148.429 151.064 148.093 150.616C147.773 150.152 147.613 149.496 147.613 148.648V133.912H149.581V148.792C149.581 149.096 149.637 149.32 149.749 149.464C149.861 149.592 149.989 149.656 150.133 149.656C150.197 149.656 150.253 149.656 150.301 149.656C150.365 149.64 150.453 149.624 150.565 149.608L150.829 151.096C150.701 151.16 150.549 151.208 150.373 151.24C150.197 151.272 149.973 151.288 149.701 151.288ZM158.459 151.288C157.419 151.288 156.475 151.048 155.627 150.568C154.779 150.072 154.107 149.368 153.611 148.456C153.115 147.544 152.867 146.456 152.867 145.192C152.867 143.912 153.115 142.816 153.611 141.904C154.123 140.992 154.779 140.288 155.579 139.792C156.379 139.296 157.219 139.048 158.099 139.048C159.587 139.048 160.731 139.544 161.531 140.536C162.347 141.528 162.755 142.856 162.755 144.52C162.755 144.728 162.747 144.936 162.731 145.144C162.731 145.336 162.715 145.504 162.683 145.648H154.811C154.891 146.88 155.275 147.864 155.963 148.6C156.667 149.336 157.579 149.704 158.699 149.704C159.259 149.704 159.771 149.624 160.235 149.464C160.715 149.288 161.171 149.064 161.603 148.792L162.299 150.088C161.803 150.408 161.235 150.688 160.595 150.928C159.971 151.168 159.259 151.288 158.459 151.288ZM154.787 144.232H161.027C161.027 143.048 160.771 142.152 160.259 141.544C159.763 140.92 159.059 140.608 158.147 140.608C157.331 140.608 156.595 140.928 155.939 141.568C155.299 142.192 154.915 143.08 154.787 144.232ZM167.997 151.288C167.021 151.288 166.205 151 165.549 150.424C164.909 149.832 164.589 149.016 164.589 147.976C164.589 146.696 165.157 145.72 166.293 145.048C167.445 144.36 169.261 143.88 171.741 143.608C171.741 143.112 171.669 142.64 171.525 142.192C171.397 141.744 171.157 141.384 170.805 141.112C170.469 140.824 169.981 140.68 169.341 140.68C168.669 140.68 168.037 140.808 167.445 141.064C166.853 
141.32 166.325 141.608 165.861 141.928L165.093 140.56C165.637 140.208 166.301 139.872 167.085 139.552C167.885 139.216 168.749 139.048 169.677 139.048C171.101 139.048 172.133 139.488 172.773 140.368C173.413 141.232 173.733 142.392 173.733 143.848V151H172.101L171.933 149.608H171.861C171.317 150.056 170.717 150.448 170.061 150.784C169.421 151.12 168.733 151.288 167.997 151.288ZM168.573 149.704C169.133 149.704 169.661 149.568 170.157 149.296C170.653 149.024 171.181 148.64 171.741 148.144V144.904C169.805 145.144 168.445 145.504 167.661 145.984C166.893 146.464 166.509 147.08 166.509 147.832C166.509 148.488 166.709 148.968 167.109 149.272C167.509 149.56 167.997 149.704 168.573 149.704ZM177.403 151V139.336H179.035L179.203 141.448H179.275C179.675 140.712 180.163 140.128 180.739 139.696C181.315 139.264 181.931 139.048 182.587 139.048C183.051 139.048 183.467 139.128 183.835 139.288L183.451 141.016C183.259 140.952 183.083 140.904 182.923 140.872C182.763 140.84 182.563 140.824 182.323 140.824C181.827 140.824 181.307 141.024 180.763 141.424C180.235 141.824 179.771 142.52 179.371 143.512V151H177.403ZM185.723 151V139.336H187.355L187.523 141.016H187.595C188.155 140.456 188.747 139.992 189.371 139.624C189.995 139.24 190.707 139.048 191.507 139.048C192.739 139.048 193.635 139.44 194.195 140.224C194.771 140.992 195.059 142.12 195.059 143.608V151H193.091V143.872C193.091 142.784 192.915 141.992 192.563 141.496C192.211 141 191.651 140.752 190.883 140.752C190.291 140.752 189.755 140.904 189.275 141.208C188.811 141.512 188.283 141.96 187.691 142.552V151H185.723ZM203.576 151.288C202.536 151.288 201.592 151.048 200.744 150.568C199.896 150.072 199.224 149.368 198.728 148.456C198.232 147.544 197.984 146.456 197.984 145.192C197.984 143.912 198.232 142.816 198.728 141.904C199.24 140.992 199.896 140.288 200.696 139.792C201.496 139.296 202.336 139.048 203.216 139.048C204.704 139.048 205.848 139.544 206.648 140.536C207.464 141.528 207.872 142.856 207.872 144.52C207.872 144.728 207.864 144.936 207.848 145.144C207.848 145.336 207.832 145.504 207.8 145.648H199.928C200.008 146.88 200.392 147.864 201.08 148.6C201.784 149.336 202.696 149.704 203.816 149.704C204.376 149.704 204.888 149.624 205.352 149.464C205.832 149.288 206.288 149.064 206.72 148.792L207.416 150.088C206.92 150.408 206.352 150.688 205.712 150.928C205.088 151.168 204.376 151.288 203.576 151.288ZM199.904 144.232H206.144C206.144 143.048 205.888 142.152 205.376 141.544C204.88 140.92 204.176 140.608 203.264 140.608C202.448 140.608 201.712 140.928 201.056 141.568C200.416 142.192 200.032 143.08 199.904 144.232ZM214.738 151.288C213.282 151.288 212.114 150.76 211.234 149.704C210.354 148.632 209.914 147.128 209.914 145.192C209.914 143.928 210.146 142.84 210.61 141.928C211.09 141 211.714 140.288 212.482 139.792C213.266 139.296 214.098 139.048 214.978 139.048C215.65 139.048 216.234 139.168 216.73 139.408C217.226 139.648 217.73 139.976 218.242 140.392L218.146 138.4V133.912H220.138V151H218.506L218.338 149.632H218.266C217.818 150.08 217.29 150.472 216.682 150.808C216.074 151.128 215.426 151.288 214.738 151.288ZM215.17 149.632C216.194 149.632 217.186 149.096 218.146 148.024V141.928C217.65 141.48 217.17 141.168 216.706 140.992C216.258 140.8 215.794 140.704 215.314 140.704C214.69 140.704 214.122 140.896 213.61 141.28C213.114 141.648 212.714 142.168 212.41 142.84C212.106 143.496 211.954 144.272 211.954 145.168C211.954 146.56 212.234 147.656 212.794 148.456C213.354 149.24 214.146 149.632 215.17 149.632ZM232.543 151.288C231.295 151.288 230.423 150.928 229.927 150.208C229.447 149.488 
229.207 148.552 229.207 147.4V140.944H227.479V139.456L229.303 139.336L229.543 136.072H231.199V139.336H234.343V140.944H231.199V147.424C231.199 148.144 231.327 148.704 231.583 149.104C231.855 149.488 232.327 149.68 232.999 149.68C233.207 149.68 233.431 149.648 233.671 149.584C233.911 149.504 234.127 149.432 234.319 149.368L234.703 150.856C234.383 150.968 234.031 151.064 233.647 151.144C233.279 151.24 232.911 151.288 232.543 151.288ZM241.189 151.288C240.245 151.288 239.357 151.048 238.525 150.568C237.709 150.088 237.045 149.392 236.533 148.48C236.037 147.568 235.789 146.472 235.789 145.192C235.789 143.88 236.037 142.768 236.533 141.856C237.045 140.944 237.709 140.248 238.525 139.768C239.357 139.288 240.245 139.048 241.189 139.048C242.149 139.048 243.037 139.288 243.853 139.768C244.669 140.248 245.325 140.944 245.821 141.856C246.333 142.768 246.589 143.88 246.589 145.192C246.589 146.472 246.333 147.568 245.821 148.48C245.325 149.392 244.669 150.088 243.853 150.568C243.037 151.048 242.149 151.288 241.189 151.288ZM241.189 149.656C242.197 149.656 243.005 149.248 243.613 148.432C244.237 147.6 244.549 146.52 244.549 145.192C244.549 143.848 244.237 142.76 243.613 141.928C243.005 141.096 242.197 140.68 241.189 140.68C240.197 140.68 239.389 141.096 238.765 141.928C238.141 142.76 237.829 143.848 237.829 145.192C237.829 146.52 238.141 147.6 238.765 148.432C239.389 149.248 240.197 149.656 241.189 149.656ZM50.409 181.288C49.385 181.288 48.457 181.048 47.625 180.568C46.793 180.088 46.137 179.392 45.657 178.48C45.177 177.568 44.937 176.472 44.937 175.192C44.937 173.88 45.193 172.768 45.705 171.856C46.233 170.944 46.921 170.248 47.769 169.768C48.633 169.288 49.561 169.048 50.553 169.048C51.321 169.048 51.977 169.184 52.521 169.456C53.081 169.728 53.561 170.048 53.961 170.416L52.953 171.712C52.617 171.408 52.257 171.16 51.873 170.968C51.505 170.776 51.089 170.68 50.625 170.68C49.921 170.68 49.289 170.872 48.729 171.256C48.185 171.624 47.753 172.152 47.433 172.84C47.129 173.512 46.977 174.296 46.977 175.192C46.977 176.52 47.305 177.6 47.961 178.432C48.633 179.248 49.505 179.656 50.577 179.656C51.121 179.656 51.625 179.544 52.089 179.32C52.553 179.08 52.961 178.8 53.313 178.48L54.177 179.8C53.649 180.264 53.065 180.632 52.425 180.904C51.785 181.16 51.113 181.288 50.409 181.288ZM60.7667 181.288C59.8227 181.288 58.9347 181.048 58.1027 180.568C57.2867 180.088 56.6227 179.392 56.1107 178.48C55.6147 177.568 55.3667 176.472 55.3667 175.192C55.3667 173.88 55.6147 172.768 56.1107 171.856C56.6227 170.944 57.2867 170.248 58.1027 169.768C58.9347 169.288 59.8227 169.048 60.7667 169.048C61.7267 169.048 62.6147 169.288 63.4307 169.768C64.2467 170.248 64.9027 170.944 65.3987 171.856C65.9107 172.768 66.1667 173.88 66.1667 175.192C66.1667 176.472 65.9107 177.568 65.3987 178.48C64.9027 179.392 64.2467 180.088 63.4307 180.568C62.6147 181.048 61.7267 181.288 60.7667 181.288ZM60.7667 179.656C61.7747 179.656 62.5827 179.248 63.1907 178.432C63.8147 177.6 64.1267 176.52 64.1267 175.192C64.1267 173.848 63.8147 172.76 63.1907 171.928C62.5827 171.096 61.7747 170.68 60.7667 170.68C59.7747 170.68 58.9667 171.096 58.3427 171.928C57.7187 172.76 57.4067 173.848 57.4067 175.192C57.4067 176.52 57.7187 177.6 58.3427 178.432C58.9667 179.248 59.7747 179.656 60.7667 179.656ZM73.2225 181.288C71.7665 181.288 70.5985 180.76 69.7185 179.704C68.8385 178.632 68.3985 177.128 68.3985 175.192C68.3985 173.928 68.6305 172.84 69.0945 171.928C69.5745 171 70.1985 170.288 70.9665 169.792C71.7505 169.296 72.5825 169.048 73.4625 169.048C74.1345 169.048 74.7185 
169.168 75.2145 169.408C75.7105 169.648 76.2145 169.976 76.7265 170.392L76.6305 168.4V163.912H78.6225V181H76.9905L76.8225 179.632H76.7505C76.3025 180.08 75.7745 180.472 75.1665 180.808C74.5585 181.128 73.9105 181.288 73.2225 181.288ZM73.6545 179.632C74.6785 179.632 75.6705 179.096 76.6305 178.024V171.928C76.1345 171.48 75.6545 171.168 75.1905 170.992C74.7425 170.8 74.2785 170.704 73.7985 170.704C73.1745 170.704 72.6065 170.896 72.0945 171.28C71.5985 171.648 71.1985 172.168 70.8945 172.84C70.5905 173.496 70.4385 174.272 70.4385 175.168C70.4385 176.56 70.7185 177.656 71.2785 178.456C71.8385 179.24 72.6305 179.632 73.6545 179.632ZM87.279 181.288C86.239 181.288 85.295 181.048 84.447 180.568C83.599 180.072 82.927 179.368 82.431 178.456C81.935 177.544 81.687 176.456 81.687 175.192C81.687 173.912 81.935 172.816 82.431 171.904C82.943 170.992 83.599 170.288 84.399 169.792C85.199 169.296 86.039 169.048 86.919 169.048C88.407 169.048 89.551 169.544 90.351 170.536C91.167 171.528 91.575 172.856 91.575 174.52C91.575 174.728 91.567 174.936 91.551 175.144C91.551 175.336 91.535 175.504 91.503 175.648H83.631C83.711 176.88 84.095 177.864 84.783 178.6C85.487 179.336 86.399 179.704 87.519 179.704C88.079 179.704 88.591 179.624 89.055 179.464C89.535 179.288 89.991 179.064 90.423 178.792L91.119 180.088C90.623 180.408 90.055 180.688 89.415 180.928C88.791 181.168 88.079 181.288 87.279 181.288ZM83.607 174.232H89.847C89.847 173.048 89.591 172.152 89.079 171.544C88.583 170.92 87.879 170.608 86.967 170.608C86.151 170.608 85.415 170.928 84.759 171.568C84.119 172.192 83.735 173.08 83.607 174.232ZM101.11 181L97.8699 169.336H99.8859L101.614 176.08C101.742 176.624 101.862 177.168 101.974 177.712C102.086 178.24 102.198 178.776 102.31 179.32H102.406C102.534 178.776 102.662 178.24 102.79 177.712C102.918 177.168 103.054 176.624 103.198 176.08L104.998 169.336H106.918L108.742 176.08C108.886 176.624 109.022 177.168 109.15 177.712C109.294 178.24 109.43 178.776 109.558 179.32H109.654C109.782 178.776 109.902 178.24 110.014 177.712C110.142 177.168 110.262 176.624 110.374 176.08L112.078 169.336H113.95L110.83 181H108.43L106.75 174.736C106.606 174.176 106.47 173.624 106.342 173.08C106.23 172.536 106.102 171.968 105.958 171.376H105.862C105.734 171.968 105.606 172.544 105.478 173.104C105.35 173.648 105.206 174.2 105.046 174.76L103.414 181H101.11ZM116.488 181V169.336H118.456V181H116.488ZM117.496 166.936C117.112 166.936 116.792 166.824 116.536 166.6C116.296 166.36 116.176 166.04 116.176 165.64C116.176 165.256 116.296 164.944 116.536 164.704C116.792 164.464 117.112 164.344 117.496 164.344C117.88 164.344 118.192 164.464 118.432 164.704C118.688 164.944 118.816 165.256 118.816 165.64C118.816 166.04 118.688 166.36 118.432 166.6C118.192 166.824 117.88 166.936 117.496 166.936ZM126.067 181.288C124.819 181.288 123.947 180.928 123.451 180.208C122.971 179.488 122.731 178.552 122.731 177.4V170.944H121.003V169.456L122.827 169.336L123.067 166.072H124.723V169.336H127.867V170.944H124.723V177.424C124.723 178.144 124.851 178.704 125.107 179.104C125.379 179.488 125.851 179.68 126.523 179.68C126.731 179.68 126.955 179.648 127.195 179.584C127.435 179.504 127.651 179.432 127.843 179.368L128.227 180.856C127.907 180.968 127.555 181.064 127.171 181.144C126.803 181.24 126.435 181.288 126.067 181.288ZM130.504 181V163.912H132.472V168.568L132.4 170.968C132.96 170.44 133.544 169.992 134.152 169.624C134.776 169.24 135.488 169.048 136.288 169.048C137.52 169.048 138.416 169.44 138.976 170.224C139.552 170.992 139.84 172.12 139.84 173.608V181H137.872V173.872C137.872 172.784 
137.696 171.992 137.344 171.496C136.992 171 136.432 170.752 135.664 170.752C135.072 170.752 134.536 170.904 134.056 171.208C133.592 171.512 133.064 171.96 132.472 172.552V181H130.504ZM148.555 181V165.256H153.043C154.227 165.256 155.259 165.4 156.139 165.688C157.019 165.976 157.707 166.464 158.203 167.152C158.699 167.824 158.947 168.736 158.947 169.888C158.947 171.536 158.411 172.76 157.339 173.56C156.267 174.36 154.867 174.76 153.139 174.76H150.547V181H148.555ZM150.547 173.128H152.899C154.275 173.128 155.291 172.872 155.947 172.36C156.619 171.832 156.955 171.008 156.955 169.888C156.955 168.752 156.603 167.968 155.899 167.536C155.211 167.088 154.179 166.864 152.803 166.864H150.547V173.128ZM162.149 186.016C161.893 186.016 161.645 185.992 161.405 185.944C161.181 185.896 160.973 185.84 160.781 185.776L161.165 184.216C161.293 184.248 161.437 184.28 161.597 184.312C161.757 184.36 161.909 184.384 162.053 184.384C162.709 184.384 163.253 184.144 163.685 183.664C164.117 183.2 164.453 182.608 164.693 181.888L164.957 181.024L160.277 169.336H162.317L164.693 175.792C164.869 176.288 165.053 176.824 165.245 177.4C165.453 177.976 165.645 178.536 165.821 179.08H165.917C166.093 178.552 166.261 178 166.421 177.424C166.581 176.848 166.741 176.304 166.901 175.792L168.989 169.336H170.909L166.517 181.96C166.245 182.728 165.917 183.416 165.533 184.024C165.165 184.632 164.701 185.112 164.141 185.464C163.597 185.832 162.933 186.016 162.149 186.016ZM176.832 181.288C175.584 181.288 174.712 180.928 174.216 180.208C173.736 179.488 173.496 178.552 173.496 177.4V170.944H171.768V169.456L173.592 169.336L173.832 166.072H175.488V169.336H178.632V170.944H175.488V177.424C175.488 178.144 175.616 178.704 175.872 179.104C176.144 179.488 176.616 179.68 177.288 179.68C177.496 179.68 177.72 179.648 177.96 179.584C178.2 179.504 178.416 179.432 178.608 179.368L178.992 180.856C178.672 180.968 178.32 181.064 177.936 181.144C177.568 181.24 177.2 181.288 176.832 181.288ZM181.27 181V163.912H183.238V168.568L183.166 170.968C183.726 170.44 184.31 169.992 184.918 169.624C185.542 169.24 186.254 169.048 187.054 169.048C188.286 169.048 189.182 169.44 189.742 170.224C190.318 170.992 190.606 172.12 190.606 173.608V181H188.638V173.872C188.638 172.784 188.462 171.992 188.11 171.496C187.758 171 187.198 170.752 186.43 170.752C185.838 170.752 185.302 170.904 184.822 171.208C184.358 171.512 183.83 171.96 183.238 172.552V181H181.27ZM198.86 181.288C197.916 181.288 197.028 181.048 196.196 180.568C195.38 180.088 194.716 179.392 194.204 178.48C193.708 177.568 193.46 176.472 193.46 175.192C193.46 173.88 193.708 172.768 194.204 171.856C194.716 170.944 195.38 170.248 196.196 169.768C197.028 169.288 197.916 169.048 198.86 169.048C199.82 169.048 200.708 169.288 201.524 169.768C202.34 170.248 202.996 170.944 203.492 171.856C204.004 172.768 204.26 173.88 204.26 175.192C204.26 176.472 204.004 177.568 203.492 178.48C202.996 179.392 202.34 180.088 201.524 180.568C200.708 181.048 199.82 181.288 198.86 181.288ZM198.86 179.656C199.868 179.656 200.676 179.248 201.284 178.432C201.908 177.6 202.22 176.52 202.22 175.192C202.22 173.848 201.908 172.76 201.284 171.928C200.676 171.096 199.868 170.68 198.86 170.68C197.868 170.68 197.06 171.096 196.436 171.928C195.812 172.76 195.5 173.848 195.5 175.192C195.5 176.52 195.812 177.6 196.436 178.432C197.06 179.248 197.868 179.656 198.86 179.656ZM207.332 181V169.336H208.964L209.132 171.016H209.204C209.764 170.456 210.356 169.992 210.98 169.624C211.604 169.24 212.316 169.048 213.116 169.048C214.348 169.048 215.244 169.44 215.804 
170.224C216.38 170.992 216.668 172.12 216.668 173.608V181H214.7V173.872C214.7 172.784 214.524 171.992 214.172 171.496C213.82 171 213.26 170.752 212.492 170.752C211.9 170.752 211.364 170.904 210.884 171.208C210.42 171.512 209.892 171.96 209.3 172.552V181H207.332ZM221.273 176.248L221.009 167.176L220.961 164.92H222.953L222.905 167.176L222.641 176.248H221.273ZM221.969 181.288C221.569 181.288 221.225 181.152 220.937 180.88C220.665 180.592 220.529 180.232 220.529 179.8C220.529 179.336 220.665 178.968 220.937 178.696C221.225 178.408 221.569 178.264 221.969 178.264C222.353 178.264 222.681 178.408 222.953 178.696C223.241 178.968 223.385 179.336 223.385 179.8C223.385 180.232 223.241 180.592 222.953 180.88C222.681 181.152 222.353 181.288 221.969 181.288Z" fill="black"/> <path d="M53.6922 270V254.256H58.1802C59.3642 254.256 60.3962 254.4 61.2762 254.688C62.1562 254.976 62.8442 255.464 63.3402 256.152C63.8362 256.824 64.0842 257.736 64.0842 258.888C64.0842 260.536 63.5482 261.76 62.4762 262.56C61.4042 263.36 60.0042 263.76 58.2762 263.76H55.6842V270H53.6922ZM55.6842 262.128H58.0362C59.4122 262.128 60.4282 261.872 61.0842 261.36C61.7562 260.832 62.0922 260.008 62.0922 258.888C62.0922 257.752 61.7402 256.968 61.0362 256.536C60.3482 256.088 59.3162 255.864 57.9402 255.864H55.6842V262.128ZM67.286 275.016C67.03 275.016 66.782 274.992 66.542 274.944C66.318 274.896 66.11 274.84 65.918 274.776L66.302 273.216C66.43 273.248 66.574 273.28 66.734 273.312C66.894 273.36 67.046 273.384 67.19 273.384C67.846 273.384 68.39 273.144 68.822 272.664C69.254 272.2 69.59 271.608 69.83 270.888L70.094 270.024L65.414 258.336H67.454L69.83 264.792C70.006 265.288 70.19 265.824 70.382 266.4C70.59 266.976 70.782 267.536 70.958 268.08H71.054C71.23 267.552 71.398 267 71.558 266.424C71.718 265.848 71.878 265.304 72.038 264.792L74.126 258.336H76.046L71.654 270.96C71.382 271.728 71.054 272.416 70.67 273.024C70.302 273.632 69.838 274.112 69.278 274.464C68.734 274.832 68.07 275.016 67.286 275.016ZM81.9691 270.288C80.7211 270.288 79.8491 269.928 79.3531 269.208C78.8731 268.488 78.6331 267.552 78.6331 266.4V259.944H76.9051V258.456L78.7291 258.336L78.9691 255.072H80.6251V258.336H83.7691V259.944H80.6251V266.424C80.6251 267.144 80.7531 267.704 81.0091 268.104C81.2811 268.488 81.7531 268.68 82.4251 268.68C82.6331 268.68 82.8571 268.648 83.0971 268.584C83.3371 268.504 83.5531 268.432 83.7451 268.368L84.1291 269.856C83.8091 269.968 83.4571 270.064 83.0731 270.144C82.7051 270.24 82.3371 270.288 81.9691 270.288ZM86.4065 270V252.912H88.3745V257.568L88.3025 259.968C88.8625 259.44 89.4465 258.992 90.0545 258.624C90.6785 258.24 91.3905 258.048 92.1905 258.048C93.4225 258.048 94.3185 258.44 94.8785 259.224C95.4545 259.992 95.7425 261.12 95.7425 262.608V270H93.7745V262.872C93.7745 261.784 93.5985 260.992 93.2465 260.496C92.8945 260 92.3345 259.752 91.5665 259.752C90.9745 259.752 90.4385 259.904 89.9585 260.208C89.4945 260.512 88.9665 260.96 88.3745 261.552V270H86.4065ZM103.997 270.288C103.053 270.288 102.165 270.048 101.333 269.568C100.517 269.088 99.8531 268.392 99.3411 267.48C98.8451 266.568 98.5971 265.472 98.5971 264.192C98.5971 262.88 98.8451 261.768 99.3411 260.856C99.8531 259.944 100.517 259.248 101.333 258.768C102.165 258.288 103.053 258.048 103.997 258.048C104.957 258.048 105.845 258.288 106.661 258.768C107.477 259.248 108.133 259.944 108.629 260.856C109.141 261.768 109.397 262.88 109.397 264.192C109.397 265.472 109.141 266.568 108.629 267.48C108.133 268.392 107.477 269.088 106.661 269.568C105.845 270.048 104.957 270.288 103.997 270.288ZM103.997 
268.656C105.005 268.656 105.813 268.248 106.421 267.432C107.045 266.6 107.357 265.52 107.357 264.192C107.357 262.848 107.045 261.76 106.421 260.928C105.813 260.096 105.005 259.68 103.997 259.68C103.005 259.68 102.197 260.096 101.573 260.928C100.949 261.76 100.637 262.848 100.637 264.192C100.637 265.52 100.949 266.6 101.573 267.432C102.197 268.248 103.005 268.656 103.997 268.656ZM112.469 270V258.336H114.101L114.269 260.016H114.341C114.901 259.456 115.493 258.992 116.117 258.624C116.741 258.24 117.453 258.048 118.253 258.048C119.485 258.048 120.381 258.44 120.941 259.224C121.517 259.992 121.805 261.12 121.805 262.608V270H119.837V262.872C119.837 261.784 119.661 260.992 119.309 260.496C118.957 260 118.397 259.752 117.629 259.752C117.037 259.752 116.501 259.904 116.021 260.208C115.557 260.512 115.029 260.96 114.437 261.552V270H112.469ZM130.399 270V258.336H132.367V270H130.399ZM131.407 255.936C131.023 255.936 130.703 255.824 130.447 255.6C130.207 255.36 130.087 255.04 130.087 254.64C130.087 254.256 130.207 253.944 130.447 253.704C130.703 253.464 131.023 253.344 131.407 253.344C131.791 253.344 132.103 253.464 132.343 253.704C132.599 253.944 132.727 254.256 132.727 254.64C132.727 255.04 132.599 255.36 132.343 255.6C132.103 255.824 131.791 255.936 131.407 255.936ZM139.353 270.288C138.521 270.288 137.729 270.136 136.977 269.832C136.225 269.512 135.569 269.128 135.009 268.68L135.993 267.36C136.505 267.76 137.033 268.096 137.577 268.368C138.121 268.624 138.737 268.752 139.425 268.752C140.193 268.752 140.769 268.576 141.153 268.224C141.537 267.856 141.729 267.424 141.729 266.928C141.729 266.528 141.593 266.192 141.321 265.92C141.065 265.648 140.729 265.424 140.313 265.248C139.913 265.056 139.497 264.88 139.065 264.72C138.521 264.512 137.985 264.28 137.457 264.024C136.929 263.752 136.497 263.408 136.161 262.992C135.825 262.56 135.657 262.016 135.657 261.36C135.657 260.416 136.009 259.632 136.713 259.008C137.433 258.368 138.425 258.048 139.689 258.048C140.409 258.048 141.081 258.176 141.705 258.432C142.329 258.688 142.865 259 143.313 259.368L142.353 260.616C141.953 260.312 141.537 260.064 141.105 259.872C140.673 259.68 140.201 259.584 139.689 259.584C138.953 259.584 138.409 259.752 138.057 260.088C137.721 260.424 137.553 260.816 137.553 261.264C137.553 261.632 137.673 261.936 137.913 262.176C138.153 262.4 138.465 262.6 138.849 262.776C139.233 262.936 139.641 263.104 140.073 263.28C140.633 263.488 141.185 263.728 141.729 264C142.273 264.256 142.721 264.608 143.073 265.056C143.441 265.488 143.625 266.072 143.625 266.808C143.625 267.432 143.457 268.008 143.121 268.536C142.801 269.064 142.321 269.488 141.681 269.808C141.057 270.128 140.281 270.288 139.353 270.288ZM153.852 270.288C152.876 270.288 152.06 270 151.404 269.424C150.764 268.832 150.444 268.016 150.444 266.976C150.444 265.696 151.012 264.72 152.148 264.048C153.3 263.36 155.116 262.88 157.596 262.608C157.596 262.112 157.524 261.64 157.38 261.192C157.252 260.744 157.012 260.384 156.66 260.112C156.324 259.824 155.836 259.68 155.196 259.68C154.524 259.68 153.892 259.808 153.3 260.064C152.708 260.32 152.18 260.608 151.716 260.928L150.948 259.56C151.492 259.208 152.156 258.872 152.94 258.552C153.74 258.216 154.604 258.048 155.532 258.048C156.956 258.048 157.988 258.488 158.628 259.368C159.268 260.232 159.588 261.392 159.588 262.848V270H157.956L157.788 268.608H157.716C157.172 269.056 156.572 269.448 155.916 269.784C155.276 270.12 154.588 270.288 153.852 270.288ZM154.428 268.704C154.988 268.704 155.516 268.568 156.012 268.296C156.508 268.024 157.036 267.64 
116.536 164.704C116.792 164.464 117.112 164.344 117.496 164.344C117.88 164.344 118.192 164.464 118.432 164.704C118.688 164.944 118.816 165.256 118.816 165.64C118.816 166.04 118.688 166.36 118.432 166.6C118.192 166.824 117.88 166.936 117.496 166.936ZM126.067 181.288C124.819 181.288 123.947 180.928 123.451 180.208C122.971 179.488 122.731 178.552 122.731 177.4V170.944H121.003V169.456L122.827 169.336L123.067 166.072H124.723V169.336H127.867V170.944H124.723V177.424C124.723 178.144 124.851 178.704 125.107 179.104C125.379 179.488 125.851 179.68 126.523 179.68C126.731 179.68 126.955 179.648 127.195 179.584C127.435 179.504 127.651 179.432 127.843 179.368L128.227 180.856C127.907 180.968 127.555 181.064 127.171 181.144C126.803 181.24 126.435 181.288 126.067 181.288ZM130.504 181V163.912H132.472V168.568L132.4 170.968C132.96 170.44 133.544 169.992 134.152 169.624C134.776 169.24 135.488 169.048 136.288 169.048C137.52 169.048 138.416 169.44 138.976 170.224C139.552 170.992 139.84 172.12 139.84 173.608V181H137.872V173.872C137.872 172.784 137.696 171.992 137.344 171.496C136.992 171 136.432 170.752 135.664 170.752C135.072 170.752 134.536 170.904 134.056 171.208C133.592 171.512 133.064 171.96 132.472 172.552V181H130.504ZM148.555 181V165.256H153.043C154.227 165.256 155.259 165.4 156.139 165.688C157.019 165.976 157.707 166.464 158.203 167.152C158.699 167.824 158.947 168.736 158.947 169.888C158.947 171.536 158.411 172.76 157.339 173.56C156.267 174.36 154.867 174.76 153.139 174.76H150.547V181H148.555ZM150.547 173.128H152.899C154.275 173.128 155.291 172.872 155.947 172.36C156.619 171.832 156.955 171.008 156.955 169.888C156.955 168.752 156.603 167.968 155.899 167.536C155.211 167.088 154.179 166.864 152.803 166.864H150.547V173.128ZM162.149 186.016C161.893 186.016 161.645 185.992 161.405 185.944C161.181 185.896 160.973 185.84 160.781 185.776L161.165 184.216C161.293 184.248 161.437 184.28 161.597 184.312C161.757 184.36 161.909 184.384 162.053 184.384C162.709 184.384 163.253 184.144 163.685 183.664C164.117 183.2 164.453 182.608 164.693 181.888L164.957 181.024L160.277 169.336H162.317L164.693 175.792C164.869 176.288 165.053 176.824 165.245 177.4C165.453 177.976 165.645 178.536 165.821 179.08H165.917C166.093 178.552 166.261 178 166.421 177.424C166.581 176.848 166.741 176.304 166.901 175.792L168.989 169.336H170.909L166.517 181.96C166.245 182.728 165.917 183.416 165.533 184.024C165.165 184.632 164.701 185.112 164.141 185.464C163.597 185.832 162.933 186.016 162.149 186.016ZM176.832 181.288C175.584 181.288 174.712 180.928 174.216 180.208C173.736 179.488 173.496 178.552 173.496 177.4V170.944H171.768V169.456L173.592 169.336L173.832 166.072H175.488V169.336H178.632V170.944H175.488V177.424C175.488 178.144 175.616 178.704 175.872 179.104C176.144 179.488 176.616 179.68 177.288 179.68C177.496 179.68 177.72 179.648 177.96 179.584C178.2 179.504 178.416 179.432 178.608 179.368L178.992 180.856C178.672 180.968 178.32 181.064 177.936 181.144C177.568 181.24 177.2 181.288 176.832 181.288ZM181.27 181V163.912H183.238V168.568L183.166 170.968C183.726 170.44 184.31 169.992 184.918 169.624C185.542 169.24 186.254 169.048 187.054 169.048C188.286 169.048 189.182 169.44 189.742 170.224C190.318 170.992 190.606 172.12 190.606 173.608V181H188.638V173.872C188.638 172.784 188.462 171.992 188.11 171.496C187.758 171 187.198 170.752 186.43 170.752C185.838 170.752 185.302 170.904 184.822 171.208C184.358 171.512 183.83 171.96 183.238 172.552V181H181.27ZM198.86 181.288C197.916 181.288 197.028 181.048 196.196 180.568C195.38 180.088 194.716 179.392 194.204 
178.48C193.708 177.568 193.46 176.472 193.46 175.192C193.46 173.88 193.708 172.768 194.204 171.856C194.716 170.944 195.38 170.248 196.196 169.768C197.028 169.288 197.916 169.048 198.86 169.048C199.82 169.048 200.708 169.288 201.524 169.768C202.34 170.248 202.996 170.944 203.492 171.856C204.004 172.768 204.26 173.88 204.26 175.192C204.26 176.472 204.004 177.568 203.492 178.48C202.996 179.392 202.34 180.088 201.524 180.568C200.708 181.048 199.82 181.288 198.86 181.288ZM198.86 179.656C199.868 179.656 200.676 179.248 201.284 178.432C201.908 177.6 202.22 176.52 202.22 175.192C202.22 173.848 201.908 172.76 201.284 171.928C200.676 171.096 199.868 170.68 198.86 170.68C197.868 170.68 197.06 171.096 196.436 171.928C195.812 172.76 195.5 173.848 195.5 175.192C195.5 176.52 195.812 177.6 196.436 178.432C197.06 179.248 197.868 179.656 198.86 179.656ZM207.332 181V169.336H208.964L209.132 171.016H209.204C209.764 170.456 210.356 169.992 210.98 169.624C211.604 169.24 212.316 169.048 213.116 169.048C214.348 169.048 215.244 169.44 215.804 170.224C216.38 170.992 216.668 172.12 216.668 173.608V181H214.7V173.872C214.7 172.784 214.524 171.992 214.172 171.496C213.82 171 213.26 170.752 212.492 170.752C211.9 170.752 211.364 170.904 210.884 171.208C210.42 171.512 209.892 171.96 209.3 172.552V181H207.332ZM221.273 176.248L221.009 167.176L220.961 164.92H222.953L222.905 167.176L222.641 176.248H221.273ZM221.969 181.288C221.569 181.288 221.225 181.152 220.937 180.88C220.665 180.592 220.529 180.232 220.529 179.8C220.529 179.336 220.665 178.968 220.937 178.696C221.225 178.408 221.569 178.264 221.969 178.264C222.353 178.264 222.681 178.408 222.953 178.696C223.241 178.968 223.385 179.336 223.385 179.8C223.385 180.232 223.241 180.592 222.953 180.88C222.681 181.152 222.353 181.288 221.969 181.288Z" fill="black"/> <path d="M53.6922 270V254.256H58.1802C59.3642 254.256 60.3962 254.4 61.2762 254.688C62.1562 254.976 62.8442 255.464 63.3402 256.152C63.8362 256.824 64.0842 257.736 64.0842 258.888C64.0842 260.536 63.5482 261.76 62.4762 262.56C61.4042 263.36 60.0042 263.76 58.2762 263.76H55.6842V270H53.6922ZM55.6842 262.128H58.0362C59.4122 262.128 60.4282 261.872 61.0842 261.36C61.7562 260.832 62.0922 260.008 62.0922 258.888C62.0922 257.752 61.7402 256.968 61.0362 256.536C60.3482 256.088 59.3162 255.864 57.9402 255.864H55.6842V262.128ZM67.286 275.016C67.03 275.016 66.782 274.992 66.542 274.944C66.318 274.896 66.11 274.84 65.918 274.776L66.302 273.216C66.43 273.248 66.574 273.28 66.734 273.312C66.894 273.36 67.046 273.384 67.19 273.384C67.846 273.384 68.39 273.144 68.822 272.664C69.254 272.2 69.59 271.608 69.83 270.888L70.094 270.024L65.414 258.336H67.454L69.83 264.792C70.006 265.288 70.19 265.824 70.382 266.4C70.59 266.976 70.782 267.536 70.958 268.08H71.054C71.23 267.552 71.398 267 71.558 266.424C71.718 265.848 71.878 265.304 72.038 264.792L74.126 258.336H76.046L71.654 270.96C71.382 271.728 71.054 272.416 70.67 273.024C70.302 273.632 69.838 274.112 69.278 274.464C68.734 274.832 68.07 275.016 67.286 275.016ZM81.9691 270.288C80.7211 270.288 79.8491 269.928 79.3531 269.208C78.8731 268.488 78.6331 267.552 78.6331 266.4V259.944H76.9051V258.456L78.7291 258.336L78.9691 255.072H80.6251V258.336H83.7691V259.944H80.6251V266.424C80.6251 267.144 80.7531 267.704 81.0091 268.104C81.2811 268.488 81.7531 268.68 82.4251 268.68C82.6331 268.68 82.8571 268.648 83.0971 268.584C83.3371 268.504 83.5531 268.432 83.7451 268.368L84.1291 269.856C83.8091 269.968 83.4571 270.064 83.0731 270.144C82.7051 270.24 82.3371 270.288 81.9691 270.288ZM86.4065 
270V252.912H88.3745V257.568L88.3025 259.968C88.8625 259.44 89.4465 258.992 90.0545 258.624C90.6785 258.24 91.3905 258.048 92.1905 258.048C93.4225 258.048 94.3185 258.44 94.8785 259.224C95.4545 259.992 95.7425 261.12 95.7425 262.608V270H93.7745V262.872C93.7745 261.784 93.5985 260.992 93.2465 260.496C92.8945 260 92.3345 259.752 91.5665 259.752C90.9745 259.752 90.4385 259.904 89.9585 260.208C89.4945 260.512 88.9665 260.96 88.3745 261.552V270H86.4065ZM103.997 270.288C103.053 270.288 102.165 270.048 101.333 269.568C100.517 269.088 99.8531 268.392 99.3411 267.48C98.8451 266.568 98.5971 265.472 98.5971 264.192C98.5971 262.88 98.8451 261.768 99.3411 260.856C99.8531 259.944 100.517 259.248 101.333 258.768C102.165 258.288 103.053 258.048 103.997 258.048C104.957 258.048 105.845 258.288 106.661 258.768C107.477 259.248 108.133 259.944 108.629 260.856C109.141 261.768 109.397 262.88 109.397 264.192C109.397 265.472 109.141 266.568 108.629 267.48C108.133 268.392 107.477 269.088 106.661 269.568C105.845 270.048 104.957 270.288 103.997 270.288ZM103.997 268.656C105.005 268.656 105.813 268.248 106.421 267.432C107.045 266.6 107.357 265.52 107.357 264.192C107.357 262.848 107.045 261.76 106.421 260.928C105.813 260.096 105.005 259.68 103.997 259.68C103.005 259.68 102.197 260.096 101.573 260.928C100.949 261.76 100.637 262.848 100.637 264.192C100.637 265.52 100.949 266.6 101.573 267.432C102.197 268.248 103.005 268.656 103.997 268.656ZM112.469 270V258.336H114.101L114.269 260.016H114.341C114.901 259.456 115.493 258.992 116.117 258.624C116.741 258.24 117.453 258.048 118.253 258.048C119.485 258.048 120.381 258.44 120.941 259.224C121.517 259.992 121.805 261.12 121.805 262.608V270H119.837V262.872C119.837 261.784 119.661 260.992 119.309 260.496C118.957 260 118.397 259.752 117.629 259.752C117.037 259.752 116.501 259.904 116.021 260.208C115.557 260.512 115.029 260.96 114.437 261.552V270H112.469ZM130.399 270V258.336H132.367V270H130.399ZM131.407 255.936C131.023 255.936 130.703 255.824 130.447 255.6C130.207 255.36 130.087 255.04 130.087 254.64C130.087 254.256 130.207 253.944 130.447 253.704C130.703 253.464 131.023 253.344 131.407 253.344C131.791 253.344 132.103 253.464 132.343 253.704C132.599 253.944 132.727 254.256 132.727 254.64C132.727 255.04 132.599 255.36 132.343 255.6C132.103 255.824 131.791 255.936 131.407 255.936ZM139.353 270.288C138.521 270.288 137.729 270.136 136.977 269.832C136.225 269.512 135.569 269.128 135.009 268.68L135.993 267.36C136.505 267.76 137.033 268.096 137.577 268.368C138.121 268.624 138.737 268.752 139.425 268.752C140.193 268.752 140.769 268.576 141.153 268.224C141.537 267.856 141.729 267.424 141.729 266.928C141.729 266.528 141.593 266.192 141.321 265.92C141.065 265.648 140.729 265.424 140.313 265.248C139.913 265.056 139.497 264.88 139.065 264.72C138.521 264.512 137.985 264.28 137.457 264.024C136.929 263.752 136.497 263.408 136.161 262.992C135.825 262.56 135.657 262.016 135.657 261.36C135.657 260.416 136.009 259.632 136.713 259.008C137.433 258.368 138.425 258.048 139.689 258.048C140.409 258.048 141.081 258.176 141.705 258.432C142.329 258.688 142.865 259 143.313 259.368L142.353 260.616C141.953 260.312 141.537 260.064 141.105 259.872C140.673 259.68 140.201 259.584 139.689 259.584C138.953 259.584 138.409 259.752 138.057 260.088C137.721 260.424 137.553 260.816 137.553 261.264C137.553 261.632 137.673 261.936 137.913 262.176C138.153 262.4 138.465 262.6 138.849 262.776C139.233 262.936 139.641 263.104 140.073 263.28C140.633 263.488 141.185 263.728 141.729 264C142.273 264.256 142.721 264.608 143.073 
265.056C143.441 265.488 143.625 266.072 143.625 266.808C143.625 267.432 143.457 268.008 143.121 268.536C142.801 269.064 142.321 269.488 141.681 269.808C141.057 270.128 140.281 270.288 139.353 270.288ZM153.852 270.288C152.876 270.288 152.06 270 151.404 269.424C150.764 268.832 150.444 268.016 150.444 266.976C150.444 265.696 151.012 264.72 152.148 264.048C153.3 263.36 155.116 262.88 157.596 262.608C157.596 262.112 157.524 261.64 157.38 261.192C157.252 260.744 157.012 260.384 156.66 260.112C156.324 259.824 155.836 259.68 155.196 259.68C154.524 259.68 153.892 259.808 153.3 260.064C152.708 260.32 152.18 260.608 151.716 260.928L150.948 259.56C151.492 259.208 152.156 258.872 152.94 258.552C153.74 258.216 154.604 258.048 155.532 258.048C156.956 258.048 157.988 258.488 158.628 259.368C159.268 260.232 159.588 261.392 159.588 262.848V270H157.956L157.788 268.608H157.716C157.172 269.056 156.572 269.448 155.916 269.784C155.276 270.12 154.588 270.288 153.852 270.288ZM154.428 268.704C154.988 268.704 155.516 268.568 156.012 268.296C156.508 268.024 157.036 267.64 157.596 267.144V263.904C155.66 264.144 154.3 264.504 153.516 264.984C152.748 265.464 152.364 266.08 152.364 266.832C152.364 267.488 152.564 267.968 152.964 268.272C153.364 268.56 153.852 268.704 154.428 268.704ZM171.999 275.376C170.575 275.376 169.415 275.104 168.519 274.56C167.623 274.016 167.175 273.24 167.175 272.232C167.175 271.736 167.327 271.256 167.631 270.792C167.935 270.344 168.351 269.944 168.879 269.592V269.496C168.591 269.32 168.343 269.072 168.135 268.752C167.943 268.432 167.847 268.048 167.847 267.6C167.847 267.104 167.983 266.672 168.255 266.304C168.527 265.936 168.815 265.648 169.119 265.44V265.344C168.735 265.024 168.383 264.592 168.063 264.048C167.759 263.504 167.607 262.888 167.607 262.2C167.607 261.352 167.807 260.616 168.207 259.992C168.607 259.368 169.143 258.888 169.815 258.552C170.487 258.216 171.215 258.048 171.999 258.048C172.319 258.048 172.623 258.08 172.911 258.144C173.199 258.192 173.447 258.256 173.655 258.336H177.711V259.848H175.311C175.583 260.104 175.807 260.448 175.983 260.88C176.175 261.296 176.271 261.752 176.271 262.248C176.271 263.08 176.079 263.8 175.695 264.408C175.311 265.016 174.799 265.488 174.159 265.824C173.519 266.144 172.799 266.304 171.999 266.304C171.375 266.304 170.791 266.168 170.247 265.896C170.039 266.072 169.863 266.272 169.719 266.496C169.575 266.704 169.503 266.968 169.503 267.288C169.503 267.656 169.647 267.96 169.935 268.2C170.239 268.44 170.783 268.56 171.567 268.56H173.823C175.183 268.56 176.199 268.784 176.871 269.232C177.559 269.664 177.903 270.368 177.903 271.344C177.903 272.064 177.663 272.728 177.183 273.336C176.703 273.944 176.023 274.432 175.143 274.8C174.263 275.184 173.215 275.376 171.999 275.376ZM171.999 264.984C172.671 264.984 173.247 264.736 173.727 264.24C174.223 263.728 174.471 263.048 174.471 262.2C174.471 261.352 174.231 260.688 173.751 260.208C173.271 259.728 172.687 259.488 171.999 259.488C171.311 259.488 170.727 259.728 170.247 260.208C169.767 260.688 169.527 261.352 169.527 262.2C169.527 263.048 169.767 263.728 170.247 264.24C170.743 264.736 171.327 264.984 171.999 264.984ZM172.287 274.008C173.407 274.008 174.303 273.76 174.975 273.264C175.647 272.784 175.983 272.24 175.983 271.632C175.983 271.088 175.775 270.712 175.359 270.504C174.959 270.296 174.383 270.192 173.631 270.192H171.615C171.391 270.192 171.143 270.176 170.871 270.144C170.615 270.112 170.359 270.064 170.103 270C169.687 270.304 169.383 270.624 169.191 270.96C168.999 271.296 168.903 271.632 168.903 
271.968C168.903 272.592 169.199 273.088 169.791 273.456C170.399 273.824 171.231 274.008 172.287 274.008ZM180.156 270V258.336H181.788L181.956 260.448H182.028C182.428 259.712 182.916 259.128 183.492 258.696C184.068 258.264 184.684 258.048 185.34 258.048C185.804 258.048 186.22 258.128 186.588 258.288L186.204 260.016C186.012 259.952 185.836 259.904 185.676 259.872C185.516 259.84 185.316 259.824 185.076 259.824C184.58 259.824 184.06 260.024 183.516 260.424C182.988 260.824 182.524 261.52 182.124 262.512V270H180.156ZM192.97 270.288C191.93 270.288 190.986 270.048 190.138 269.568C189.29 269.072 188.618 268.368 188.122 267.456C187.626 266.544 187.378 265.456 187.378 264.192C187.378 262.912 187.626 261.816 188.122 260.904C188.634 259.992 189.29 259.288 190.09 258.792C190.89 258.296 191.73 258.048 192.61 258.048C194.098 258.048 195.242 258.544 196.042 259.536C196.858 260.528 197.266 261.856 197.266 263.52C197.266 263.728 197.258 263.936 197.242 264.144C197.242 264.336 197.226 264.504 197.194 264.648H189.322C189.402 265.88 189.786 266.864 190.474 267.6C191.178 268.336 192.09 268.704 193.21 268.704C193.77 268.704 194.282 268.624 194.746 268.464C195.226 268.288 195.682 268.064 196.114 267.792L196.81 269.088C196.314 269.408 195.746 269.688 195.106 269.928C194.482 270.168 193.77 270.288 192.97 270.288ZM189.298 263.232H195.538C195.538 262.048 195.282 261.152 194.77 260.544C194.274 259.92 193.57 259.608 192.658 259.608C191.842 259.608 191.106 259.928 190.45 260.568C189.81 261.192 189.426 262.08 189.298 263.232ZM202.509 270.288C201.533 270.288 200.717 270 200.061 269.424C199.421 268.832 199.101 268.016 199.101 266.976C199.101 265.696 199.669 264.72 200.805 264.048C201.957 263.36 203.773 262.88 206.253 262.608C206.253 262.112 206.181 261.64 206.037 261.192C205.909 260.744 205.669 260.384 205.317 260.112C204.981 259.824 204.493 259.68 203.853 259.68C203.181 259.68 202.549 259.808 201.957 260.064C201.365 260.32 200.837 260.608 200.373 260.928L199.605 259.56C200.149 259.208 200.813 258.872 201.597 258.552C202.397 258.216 203.261 258.048 204.189 258.048C205.613 258.048 206.645 258.488 207.285 259.368C207.925 260.232 208.245 261.392 208.245 262.848V270H206.613L206.445 268.608H206.373C205.829 269.056 205.229 269.448 204.573 269.784C203.933 270.12 203.245 270.288 202.509 270.288ZM203.085 268.704C203.645 268.704 204.173 268.568 204.669 268.296C205.165 268.024 205.693 267.64 206.253 267.144V263.904C204.317 264.144 202.957 264.504 202.173 264.984C201.405 265.464 201.021 266.08 201.021 266.832C201.021 267.488 201.221 267.968 201.621 268.272C202.021 268.56 202.509 268.704 203.085 268.704ZM215.258 270.288C214.01 270.288 213.138 269.928 212.642 269.208C212.162 268.488 211.922 267.552 211.922 266.4V259.944H210.194V258.456L212.018 258.336L212.258 255.072H213.914V258.336H217.058V259.944H213.914V266.424C213.914 267.144 214.042 267.704 214.298 268.104C214.57 268.488 215.042 268.68 215.714 268.68C215.922 268.68 216.146 268.648 216.386 268.584C216.626 268.504 216.842 268.432 217.034 268.368L217.418 269.856C217.098 269.968 216.746 270.064 216.362 270.144C215.994 270.24 215.626 270.288 215.258 270.288ZM16.4572 304.92V288.336H18.0892L18.2572 289.68H18.3292C18.8572 289.232 19.4332 288.848 20.0572 288.528C20.6972 288.208 21.3612 288.048 22.0492 288.048C23.5532 288.048 24.6972 288.592 25.4812 289.68C26.2652 290.752 26.6572 292.192 26.6572 294C26.6572 295.312 26.4172 296.44 25.9372 297.384C25.4732 298.328 24.8572 299.048 24.0892 299.544C23.3372 300.04 22.5132 300.288 21.6172 300.288C21.0732 300.288 20.5292 300.168 19.9852 299.928C19.4572 
299.688 18.9212 299.36 18.3772 298.944L18.4252 300.984V304.92H16.4572ZM21.2812 298.632C22.2412 298.632 23.0332 298.224 23.6572 297.408C24.2972 296.576 24.6172 295.44 24.6172 294C24.6172 292.72 24.3772 291.688 23.8972 290.904C23.4332 290.104 22.6492 289.704 21.5452 289.704C21.0492 289.704 20.5452 289.84 20.0332 290.112C19.5372 290.384 19.0012 290.776 18.4252 291.288V297.408C18.9532 297.856 19.4652 298.176 19.9612 298.368C20.4572 298.544 20.8972 298.632 21.2812 298.632ZM29.7697 300V288.336H31.4017L31.5697 290.448H31.6417C32.0417 289.712 32.5297 289.128 33.1057 288.696C33.6817 288.264 34.2977 288.048 34.9537 288.048C35.4177 288.048 35.8337 288.128 36.2017 288.288L35.8177 290.016C35.6257 289.952 35.4497 289.904 35.2897 289.872C35.1297 289.84 34.9297 289.824 34.6897 289.824C34.1937 289.824 33.6737 290.024 33.1297 290.424C32.6017 290.824 32.1377 291.52 31.7377 292.512V300H29.7697ZM42.3917 300.288C41.4477 300.288 40.5597 300.048 39.7277 299.568C38.9117 299.088 38.2477 298.392 37.7357 297.48C37.2397 296.568 36.9917 295.472 36.9917 294.192C36.9917 292.88 37.2397 291.768 37.7357 290.856C38.2477 289.944 38.9117 289.248 39.7277 288.768C40.5597 288.288 41.4477 288.048 42.3917 288.048C43.3517 288.048 44.2397 288.288 45.0557 288.768C45.8717 289.248 46.5277 289.944 47.0237 290.856C47.5357 291.768 47.7917 292.88 47.7917 294.192C47.7917 295.472 47.5357 296.568 47.0237 297.48C46.5277 298.392 45.8717 299.088 45.0557 299.568C44.2397 300.048 43.3517 300.288 42.3917 300.288ZM42.3917 298.656C43.3997 298.656 44.2077 298.248 44.8157 297.432C45.4397 296.6 45.7517 295.52 45.7517 294.192C45.7517 292.848 45.4397 291.76 44.8157 290.928C44.2077 290.096 43.3997 289.68 42.3917 289.68C41.3997 289.68 40.5917 290.096 39.9677 290.928C39.3437 291.76 39.0317 292.848 39.0317 294.192C39.0317 295.52 39.3437 296.6 39.9677 297.432C40.5917 298.248 41.3997 298.656 42.3917 298.656ZM54.7995 305.376C53.3755 305.376 52.2155 305.104 51.3195 304.56C50.4235 304.016 49.9755 303.24 49.9755 302.232C49.9755 301.736 50.1275 301.256 50.4315 300.792C50.7355 300.344 51.1515 299.944 51.6795 299.592V299.496C51.3915 299.32 51.1435 299.072 50.9355 298.752C50.7435 298.432 50.6475 298.048 50.6475 297.6C50.6475 297.104 50.7835 296.672 51.0555 296.304C51.3275 295.936 51.6155 295.648 51.9195 295.44V295.344C51.5355 295.024 51.1835 294.592 50.8635 294.048C50.5595 293.504 50.4075 292.888 50.4075 292.2C50.4075 291.352 50.6075 290.616 51.0075 289.992C51.4075 289.368 51.9435 288.888 52.6155 288.552C53.2875 288.216 54.0155 288.048 54.7995 288.048C55.1195 288.048 55.4235 288.08 55.7115 288.144C55.9995 288.192 56.2475 288.256 56.4555 288.336H60.5115V289.848H58.1115C58.3835 290.104 58.6075 290.448 58.7835 290.88C58.9755 291.296 59.0715 291.752 59.0715 292.248C59.0715 293.08 58.8795 293.8 58.4955 294.408C58.1115 295.016 57.5995 295.488 56.9595 295.824C56.3195 296.144 55.5995 296.304 54.7995 296.304C54.1755 296.304 53.5915 296.168 53.0475 295.896C52.8395 296.072 52.6635 296.272 52.5195 296.496C52.3755 296.704 52.3035 296.968 52.3035 297.288C52.3035 297.656 52.4475 297.96 52.7355 298.2C53.0395 298.44 53.5835 298.56 54.3675 298.56H56.6235C57.9835 298.56 58.9995 298.784 59.6715 299.232C60.3595 299.664 60.7035 300.368 60.7035 301.344C60.7035 302.064 60.4635 302.728 59.9835 303.336C59.5035 303.944 58.8235 304.432 57.9435 304.8C57.0635 305.184 56.0155 305.376 54.7995 305.376ZM54.7995 294.984C55.4715 294.984 56.0475 294.736 56.5275 294.24C57.0235 293.728 57.2715 293.048 57.2715 292.2C57.2715 291.352 57.0315 290.688 56.5515 290.208C56.0715 289.728 55.4875 289.488 54.7995 
289.488C54.1115 289.488 53.5275 289.728 53.0475 290.208C52.5675 290.688 52.3275 291.352 52.3275 292.2C52.3275 293.048 52.5675 293.728 53.0475 294.24C53.5435 294.736 54.1275 294.984 54.7995 294.984ZM55.0875 304.008C56.2075 304.008 57.1035 303.76 57.7755 303.264C58.4475 302.784 58.7835 302.24 58.7835 301.632C58.7835 301.088 58.5755 300.712 58.1595 300.504C57.7595 300.296 57.1835 300.192 56.4315 300.192H54.4155C54.1915 300.192 53.9435 300.176 53.6715 300.144C53.4155 300.112 53.1595 300.064 52.9035 300C52.4875 300.304 52.1835 300.624 51.9915 300.96C51.7995 301.296 51.7035 301.632 51.7035 301.968C51.7035 302.592 51.9995 303.088 52.5915 303.456C53.1995 303.824 54.0315 304.008 55.0875 304.008ZM62.9572 300V288.336H64.5892L64.7572 290.448H64.8292C65.2292 289.712 65.7172 289.128 66.2932 288.696C66.8692 288.264 67.4852 288.048 68.1412 288.048C68.6052 288.048 69.0212 288.128 69.3892 288.288L69.0052 290.016C68.8132 289.952 68.6372 289.904 68.4772 289.872C68.3172 289.84 68.1172 289.824 67.8772 289.824C67.3812 289.824 66.8612 290.024 66.3172 290.424C65.7892 290.824 65.3252 291.52 64.9252 292.512V300H62.9572ZM73.3796 300.288C72.4036 300.288 71.5876 300 70.9316 299.424C70.2916 298.832 69.9716 298.016 69.9716 296.976C69.9716 295.696 70.5396 294.72 71.6756 294.048C72.8276 293.36 74.6436 292.88 77.1236 292.608C77.1236 292.112 77.0516 291.64 76.9076 291.192C76.7796 290.744 76.5396 290.384 76.1876 290.112C75.8516 289.824 75.3636 289.68 74.7236 289.68C74.0516 289.68 73.4196 289.808 72.8276 290.064C72.2356 290.32 71.7076 290.608 71.2436 290.928L70.4756 289.56C71.0196 289.208 71.6836 288.872 72.4676 288.552C73.2676 288.216 74.1316 288.048 75.0596 288.048C76.4836 288.048 77.5156 288.488 78.1556 289.368C78.7956 290.232 79.1156 291.392 79.1156 292.848V300H77.4836L77.3156 298.608H77.2436C76.6996 299.056 76.0996 299.448 75.4436 299.784C74.8036 300.12 74.1156 300.288 73.3796 300.288ZM73.9556 298.704C74.5156 298.704 75.0436 298.568 75.5396 298.296C76.0356 298.024 76.5636 297.64 77.1236 297.144V293.904C75.1876 294.144 73.8276 294.504 73.0436 294.984C72.2756 295.464 71.8916 296.08 71.8916 296.832C71.8916 297.488 72.0916 297.968 72.4916 298.272C72.8916 298.56 73.3796 298.704 73.9556 298.704ZM82.7854 300V288.336H84.4174L84.5854 290.016H84.6574C85.1694 289.456 85.7294 288.992 86.3374 288.624C86.9454 288.24 87.5934 288.048 88.2814 288.048C89.1774 288.048 89.8734 288.248 90.3694 288.648C90.8814 289.032 91.2574 289.576 91.4974 290.28C92.1054 289.624 92.7214 289.088 93.3454 288.672C93.9694 288.256 94.6334 288.048 95.3374 288.048C96.5374 288.048 97.4254 288.44 98.0014 289.224C98.5934 289.992 98.8894 291.12 98.8894 292.608V300H96.9214V292.872C96.9214 291.784 96.7454 290.992 96.3934 290.496C96.0414 290 95.4974 289.752 94.7614 289.752C93.8974 289.752 92.9214 290.352 91.8334 291.552V300H89.8654V292.872C89.8654 291.784 89.6894 290.992 89.3374 290.496C88.9854 290 88.4334 289.752 87.6814 289.752C86.8174 289.752 85.8414 290.352 84.7534 291.552V300H82.7854ZM102.684 300V288.336H104.316L104.484 290.016H104.556C105.068 289.456 105.628 288.992 106.236 288.624C106.844 288.24 107.492 288.048 108.18 288.048C109.076 288.048 109.772 288.248 110.268 288.648C110.78 289.032 111.156 289.576 111.396 290.28C112.004 289.624 112.62 289.088 113.244 288.672C113.868 288.256 114.532 288.048 115.236 288.048C116.436 288.048 117.324 288.44 117.9 289.224C118.492 289.992 118.788 291.12 118.788 292.608V300H116.82V292.872C116.82 291.784 116.644 290.992 116.292 290.496C115.94 290 115.396 289.752 114.66 289.752C113.796 289.752 112.82 290.352 111.732 
291.552V300H109.764V292.872C109.764 291.784 109.588 290.992 109.236 290.496C108.884 290 108.332 289.752 107.58 289.752C106.716 289.752 105.74 290.352 104.652 291.552V300H102.684ZM122.582 300V288.336H124.55V300H122.582ZM123.59 285.936C123.206 285.936 122.886 285.824 122.63 285.6C122.39 285.36 122.27 285.04 122.27 284.64C122.27 284.256 122.39 283.944 122.63 283.704C122.886 283.464 123.206 283.344 123.59 283.344C123.974 283.344 124.286 283.464 124.526 283.704C124.782 283.944 124.91 284.256 124.91 284.64C124.91 285.04 124.782 285.36 124.526 285.6C124.286 285.824 123.974 285.936 123.59 285.936ZM128.488 300V288.336H130.12L130.288 290.016H130.36C130.92 289.456 131.512 288.992 132.136 288.624C132.76 288.24 133.472 288.048 134.272 288.048C135.504 288.048 136.4 288.44 136.96 289.224C137.536 289.992 137.824 291.12 137.824 292.608V300H135.856V292.872C135.856 291.784 135.68 290.992 135.328 290.496C134.976 290 134.416 289.752 133.648 289.752C133.056 289.752 132.52 289.904 132.04 290.208C131.576 290.512 131.048 290.96 130.456 291.552V300H128.488ZM145.549 305.376C144.125 305.376 142.965 305.104 142.069 304.56C141.173 304.016 140.725 303.24 140.725 302.232C140.725 301.736 140.877 301.256 141.181 300.792C141.485 300.344 141.901 299.944 142.429 299.592V299.496C142.141 299.32 141.893 299.072 141.685 298.752C141.493 298.432 141.397 298.048 141.397 297.6C141.397 297.104 141.533 296.672 141.805 296.304C142.077 295.936 142.365 295.648 142.669 295.44V295.344C142.285 295.024 141.933 294.592 141.613 294.048C141.309 293.504 141.157 292.888 141.157 292.2C141.157 291.352 141.357 290.616 141.757 289.992C142.157 289.368 142.693 288.888 143.365 288.552C144.037 288.216 144.765 288.048 145.549 288.048C145.869 288.048 146.173 288.08 146.461 288.144C146.749 288.192 146.997 288.256 147.205 288.336H151.261V289.848H148.861C149.133 290.104 149.357 290.448 149.533 290.88C149.725 291.296 149.821 291.752 149.821 292.248C149.821 293.08 149.629 293.8 149.245 294.408C148.861 295.016 148.349 295.488 147.709 295.824C147.069 296.144 146.349 296.304 145.549 296.304C144.925 296.304 144.341 296.168 143.797 295.896C143.589 296.072 143.413 296.272 143.269 296.496C143.125 296.704 143.053 296.968 143.053 297.288C143.053 297.656 143.197 297.96 143.485 298.2C143.789 298.44 144.333 298.56 145.117 298.56H147.373C148.733 298.56 149.749 298.784 150.421 299.232C151.109 299.664 151.453 300.368 151.453 301.344C151.453 302.064 151.213 302.728 150.733 303.336C150.253 303.944 149.573 304.432 148.693 304.8C147.813 305.184 146.765 305.376 145.549 305.376ZM145.549 294.984C146.221 294.984 146.797 294.736 147.277 294.24C147.773 293.728 148.021 293.048 148.021 292.2C148.021 291.352 147.781 290.688 147.301 290.208C146.821 289.728 146.237 289.488 145.549 289.488C144.861 289.488 144.277 289.728 143.797 290.208C143.317 290.688 143.077 291.352 143.077 292.2C143.077 293.048 143.317 293.728 143.797 294.24C144.293 294.736 144.877 294.984 145.549 294.984ZM145.837 304.008C146.957 304.008 147.853 303.76 148.525 303.264C149.197 302.784 149.533 302.24 149.533 301.632C149.533 301.088 149.325 300.712 148.909 300.504C148.509 300.296 147.933 300.192 147.181 300.192H145.165C144.941 300.192 144.693 300.176 144.421 300.144C144.165 300.112 143.909 300.064 143.653 300C143.237 300.304 142.933 300.624 142.741 300.96C142.549 301.296 142.453 301.632 142.453 301.968C142.453 302.592 142.749 303.088 143.341 303.456C143.949 303.824 144.781 304.008 145.837 304.008ZM160.6 300.288C159.864 300.288 159.328 300.064 158.992 299.616C158.672 299.152 158.512 298.496 158.512 
297.648V282.912H160.48V297.792C160.48 298.096 160.536 298.32 160.648 298.464C160.76 298.592 160.888 298.656 161.032 298.656C161.096 298.656 161.152 298.656 161.2 298.656C161.264 298.64 161.352 298.624 161.464 298.608L161.728 300.096C161.6 300.16 161.448 300.208 161.272 300.24C161.096 300.272 160.872 300.288 160.6 300.288ZM167.317 300.288C166.341 300.288 165.525 300 164.869 299.424C164.229 298.832 163.909 298.016 163.909 296.976C163.909 295.696 164.477 294.72 165.613 294.048C166.765 293.36 168.581 292.88 171.061 292.608C171.061 292.112 170.989 291.64 170.845 291.192C170.717 290.744 170.477 290.384 170.125 290.112C169.789 289.824 169.301 289.68 168.661 289.68C167.989 289.68 167.357 289.808 166.765 290.064C166.173 290.32 165.645 290.608 165.181 290.928L164.413 289.56C164.957 289.208 165.621 288.872 166.405 288.552C167.205 288.216 168.069 288.048 168.997 288.048C170.421 288.048 171.453 288.488 172.093 289.368C172.733 290.232 173.053 291.392 173.053 292.848V300H171.421L171.253 298.608H171.181C170.637 299.056 170.037 299.448 169.381 299.784C168.741 300.12 168.053 300.288 167.317 300.288ZM167.893 298.704C168.453 298.704 168.981 298.568 169.477 298.296C169.973 298.024 170.501 297.64 171.061 297.144V293.904C169.125 294.144 167.765 294.504 166.981 294.984C166.213 295.464 165.829 296.08 165.829 296.832C165.829 297.488 166.029 297.968 166.429 298.272C166.829 298.56 167.317 298.704 167.893 298.704ZM176.723 300V288.336H178.355L178.523 290.016H178.595C179.155 289.456 179.747 288.992 180.371 288.624C180.995 288.24 181.707 288.048 182.507 288.048C183.739 288.048 184.635 288.44 185.195 289.224C185.771 289.992 186.059 291.12 186.059 292.608V300H184.091V292.872C184.091 291.784 183.915 290.992 183.563 290.496C183.211 290 182.651 289.752 181.883 289.752C181.291 289.752 180.755 289.904 180.275 290.208C179.811 290.512 179.283 290.96 178.691 291.552V300H176.723ZM193.784 305.376C192.36 305.376 191.2 305.104 190.304 304.56C189.408 304.016 188.96 303.24 188.96 302.232C188.96 301.736 189.112 301.256 189.416 300.792C189.72 300.344 190.136 299.944 190.664 299.592V299.496C190.376 299.32 190.128 299.072 189.92 298.752C189.728 298.432 189.632 298.048 189.632 297.6C189.632 297.104 189.768 296.672 190.04 296.304C190.312 295.936 190.6 295.648 190.904 295.44V295.344C190.52 295.024 190.168 294.592 189.848 294.048C189.544 293.504 189.392 292.888 189.392 292.2C189.392 291.352 189.592 290.616 189.992 289.992C190.392 289.368 190.928 288.888 191.6 288.552C192.272 288.216 193 288.048 193.784 288.048C194.104 288.048 194.408 288.08 194.696 288.144C194.984 288.192 195.232 288.256 195.44 288.336H199.496V289.848H197.096C197.368 290.104 197.592 290.448 197.768 290.88C197.96 291.296 198.056 291.752 198.056 292.248C198.056 293.08 197.864 293.8 197.48 294.408C197.096 295.016 196.584 295.488 195.944 295.824C195.304 296.144 194.584 296.304 193.784 296.304C193.16 296.304 192.576 296.168 192.032 295.896C191.824 296.072 191.648 296.272 191.504 296.496C191.36 296.704 191.288 296.968 191.288 297.288C191.288 297.656 191.432 297.96 191.72 298.2C192.024 298.44 192.568 298.56 193.352 298.56H195.608C196.968 298.56 197.984 298.784 198.656 299.232C199.344 299.664 199.688 300.368 199.688 301.344C199.688 302.064 199.448 302.728 198.968 303.336C198.488 303.944 197.808 304.432 196.928 304.8C196.048 305.184 195 305.376 193.784 305.376ZM193.784 294.984C194.456 294.984 195.032 294.736 195.512 294.24C196.008 293.728 196.256 293.048 196.256 292.2C196.256 291.352 196.016 290.688 195.536 290.208C195.056 289.728 194.472 289.488 193.784 289.488C193.096 289.488 192.512 
289.728 192.032 290.208C191.552 290.688 191.312 291.352 191.312 292.2C191.312 293.048 191.552 293.728 192.032 294.24C192.528 294.736 193.112 294.984 193.784 294.984ZM194.072 304.008C195.192 304.008 196.088 303.76 196.76 303.264C197.432 302.784 197.768 302.24 197.768 301.632C197.768 301.088 197.56 300.712 197.144 300.504C196.744 300.296 196.168 300.192 195.416 300.192H193.4C193.176 300.192 192.928 300.176 192.656 300.144C192.4 300.112 192.144 300.064 191.888 300C191.472 300.304 191.168 300.624 190.976 300.96C190.784 301.296 190.688 301.632 190.688 301.968C190.688 302.592 190.984 303.088 191.576 303.456C192.184 303.824 193.016 304.008 194.072 304.008ZM205.35 300.288C204.118 300.288 203.214 299.904 202.638 299.136C202.062 298.352 201.774 297.216 201.774 295.728V288.336H203.766V295.464C203.766 296.552 203.934 297.344 204.27 297.84C204.622 298.336 205.182 298.584 205.95 298.584C206.558 298.584 207.094 298.432 207.558 298.128C208.038 297.808 208.55 297.304 209.094 296.616V288.336H211.062V300H209.43L209.262 298.176H209.19C208.646 298.816 208.07 299.328 207.462 299.712C206.854 300.096 206.15 300.288 205.35 300.288ZM217.684 300.288C216.708 300.288 215.892 300 215.236 299.424C214.596 298.832 214.276 298.016 214.276 296.976C214.276 295.696 214.844 294.72 215.98 294.048C217.132 293.36 218.948 292.88 221.428 292.608C221.428 292.112 221.356 291.64 221.212 291.192C221.084 290.744 220.844 290.384 220.492 290.112C220.156 289.824 219.668 289.68 219.028 289.68C218.356 289.68 217.724 289.808 217.132 290.064C216.54 290.32 216.012 290.608 215.548 290.928L214.78 289.56C215.324 289.208 215.988 288.872 216.772 288.552C217.572 288.216 218.436 288.048 219.364 288.048C220.788 288.048 221.82 288.488 222.46 289.368C223.1 290.232 223.42 291.392 223.42 292.848V300H221.788L221.62 298.608H221.548C221.004 299.056 220.404 299.448 219.748 299.784C219.108 300.12 218.42 300.288 217.684 300.288ZM218.26 298.704C218.82 298.704 219.348 298.568 219.844 298.296C220.34 298.024 220.868 297.64 221.428 297.144V293.904C219.492 294.144 218.132 294.504 217.348 294.984C216.58 295.464 216.196 296.08 216.196 296.832C216.196 297.488 216.396 297.968 216.796 298.272C217.196 298.56 217.684 298.704 218.26 298.704ZM231.026 305.376C229.602 305.376 228.442 305.104 227.546 304.56C226.65 304.016 226.202 303.24 226.202 302.232C226.202 301.736 226.354 301.256 226.658 300.792C226.962 300.344 227.378 299.944 227.906 299.592V299.496C227.618 299.32 227.37 299.072 227.162 298.752C226.97 298.432 226.874 298.048 226.874 297.6C226.874 297.104 227.01 296.672 227.282 296.304C227.554 295.936 227.842 295.648 228.146 295.44V295.344C227.762 295.024 227.41 294.592 227.09 294.048C226.786 293.504 226.634 292.888 226.634 292.2C226.634 291.352 226.834 290.616 227.234 289.992C227.634 289.368 228.17 288.888 228.842 288.552C229.514 288.216 230.242 288.048 231.026 288.048C231.346 288.048 231.65 288.08 231.938 288.144C232.226 288.192 232.474 288.256 232.682 288.336H236.738V289.848H234.338C234.61 290.104 234.834 290.448 235.01 290.88C235.202 291.296 235.298 291.752 235.298 292.248C235.298 293.08 235.106 293.8 234.722 294.408C234.338 295.016 233.826 295.488 233.186 295.824C232.546 296.144 231.826 296.304 231.026 296.304C230.402 296.304 229.818 296.168 229.274 295.896C229.066 296.072 228.89 296.272 228.746 296.496C228.602 296.704 228.53 296.968 228.53 297.288C228.53 297.656 228.674 297.96 228.962 298.2C229.266 298.44 229.81 298.56 230.594 298.56H232.85C234.21 298.56 235.226 298.784 235.898 299.232C236.586 299.664 236.93 300.368 236.93 301.344C236.93 302.064 236.69 302.728 236.21 
303.336C235.73 303.944 235.05 304.432 234.17 304.8C233.29 305.184 232.242 305.376 231.026 305.376ZM231.026 294.984C231.698 294.984 232.274 294.736 232.754 294.24C233.25 293.728 233.498 293.048 233.498 292.2C233.498 291.352 233.258 290.688 232.778 290.208C232.298 289.728 231.714 289.488 231.026 289.488C230.338 289.488 229.754 289.728 229.274 290.208C228.794 290.688 228.554 291.352 228.554 292.2C228.554 293.048 228.794 293.728 229.274 294.24C229.77 294.736 230.354 294.984 231.026 294.984ZM231.314 304.008C232.434 304.008 233.33 303.76 234.002 303.264C234.674 302.784 235.01 302.24 235.01 301.632C235.01 301.088 234.802 300.712 234.386 300.504C233.986 300.296 233.41 300.192 232.658 300.192H230.642C230.418 300.192 230.17 300.176 229.898 300.144C229.642 300.112 229.386 300.064 229.13 300C228.714 300.304 228.41 300.624 228.218 300.96C228.026 301.296 227.93 301.632 227.93 301.968C227.93 302.592 228.226 303.088 228.818 303.456C229.426 303.824 230.258 304.008 231.314 304.008ZM243.584 300.288C242.544 300.288 241.6 300.048 240.752 299.568C239.904 299.072 239.232 298.368 238.736 297.456C238.24 296.544 237.992 295.456 237.992 294.192C237.992 292.912 238.24 291.816 238.736 290.904C239.248 289.992 239.904 289.288 240.704 288.792C241.504 288.296 242.344 288.048 243.224 288.048C244.712 288.048 245.856 288.544 246.656 289.536C247.472 290.528 247.88 291.856 247.88 293.52C247.88 293.728 247.872 293.936 247.856 294.144C247.856 294.336 247.84 294.504 247.808 294.648H239.936C240.016 295.88 240.4 296.864 241.088 297.6C241.792 298.336 242.704 298.704 243.824 298.704C244.384 298.704 244.896 298.624 245.36 298.464C245.84 298.288 246.296 298.064 246.728 297.792L247.424 299.088C246.928 299.408 246.36 299.688 245.72 299.928C245.096 300.168 244.384 300.288 243.584 300.288ZM239.912 293.232H246.152C246.152 292.048 245.896 291.152 245.384 290.544C244.888 289.92 244.184 289.608 243.272 289.608C242.456 289.608 241.72 289.928 241.064 290.568C240.424 291.192 240.04 292.08 239.912 293.232ZM251.794 300.288C251.394 300.288 251.05 300.152 250.762 299.88C250.49 299.592 250.354 299.232 250.354 298.8C250.354 298.336 250.49 297.968 250.762 297.696C251.05 297.408 251.394 297.264 251.794 297.264C252.178 297.264 252.506 297.408 252.778 297.696C253.066 297.968 253.21 298.336 253.21 298.8C253.21 299.232 253.066 299.592 252.778 299.88C252.506 300.152 252.178 300.288 251.794 300.288Z" fill="black"/> <path d="M39.2257 389V374.936H34.4737V373.256H45.9937V374.936H41.2417V389H39.2257ZM48.6369 389V371.912H50.6049V376.568L50.5329 378.968C51.0929 378.44 51.6769 377.992 52.2849 377.624C52.9089 377.24 53.6209 377.048 54.4209 377.048C55.6529 377.048 56.5489 377.44 57.1089 378.224C57.6849 378.992 57.9729 380.12 57.9729 381.608V389H56.0049V381.872C56.0049 380.784 55.8289 379.992 55.4769 379.496C55.1249 379 54.5649 378.752 53.7969 378.752C53.2049 378.752 52.6689 378.904 52.1889 379.208C51.7249 379.512 51.1969 379.96 50.6049 380.552V389H48.6369ZM64.3796 389.288C63.4036 389.288 62.5876 389 61.9316 388.424C61.2916 387.832 60.9716 387.016 60.9716 385.976C60.9716 384.696 61.5396 383.72 62.6756 383.048C63.8276 382.36 65.6436 381.88 68.1236 381.608C68.1236 381.112 68.0516 380.64 67.9076 380.192C67.7796 379.744 67.5396 379.384 67.1876 379.112C66.8516 378.824 66.3636 378.68 65.7236 378.68C65.0516 378.68 64.4196 378.808 63.8276 379.064C63.2356 379.32 62.7076 379.608 62.2436 379.928L61.4756 378.56C62.0196 378.208 62.6836 377.872 63.4676 377.552C64.2676 377.216 65.1316 377.048 66.0596 377.048C67.4836 377.048 68.5156 377.488 69.1556 378.368C69.7956 379.232 70.1156 
380.392 70.1156 381.848V389H68.4836L68.3156 387.608H68.2436C67.6996 388.056 67.0996 388.448 66.4436 388.784C65.8036 389.12 65.1156 389.288 64.3796 389.288ZM64.9556 387.704C65.5156 387.704 66.0436 387.568 66.5396 387.296C67.0356 387.024 67.5636 386.64 68.1236 386.144V382.904C66.1876 383.144 64.8276 383.504 64.0436 383.984C63.2756 384.464 62.8916 385.08 62.8916 385.832C62.8916 386.488 63.0916 386.968 63.4916 387.272C63.8916 387.56 64.3796 387.704 64.9556 387.704ZM77.1292 389.288C75.8812 389.288 75.0092 388.928 74.5132 388.208C74.0332 387.488 73.7932 386.552 73.7932 385.4V378.944H72.0652V377.456L73.8892 377.336L74.1292 374.072H75.7852V377.336H78.9292V378.944H75.7852V385.424C75.7852 386.144 75.9132 386.704 76.1692 387.104C76.4412 387.488 76.9132 387.68 77.5852 387.68C77.7932 387.68 78.0172 387.648 78.2572 387.584C78.4972 387.504 78.7132 387.432 78.9052 387.368L79.2892 388.856C78.9692 388.968 78.6172 389.064 78.2332 389.144C77.8652 389.24 77.4972 389.288 77.1292 389.288ZM88.2193 389L84.9793 377.336H86.9953L88.7233 384.08C88.8513 384.624 88.9713 385.168 89.0833 385.712C89.1953 386.24 89.3073 386.776 89.4193 387.32H89.5153C89.6433 386.776 89.7713 386.24 89.8993 385.712C90.0273 385.168 90.1633 384.624 90.3073 384.08L92.1073 377.336H94.0273L95.8513 384.08C95.9953 384.624 96.1313 385.168 96.2593 385.712C96.4033 386.24 96.5393 386.776 96.6673 387.32H96.7633C96.8913 386.776 97.0113 386.24 97.1233 385.712C97.2513 385.168 97.3713 384.624 97.4833 384.08L99.1873 377.336H101.059L97.9393 389H95.5393L93.8593 382.736C93.7153 382.176 93.5793 381.624 93.4513 381.08C93.3393 380.536 93.2113 379.968 93.0673 379.376H92.9713C92.8433 379.968 92.7153 380.544 92.5873 381.104C92.4593 381.648 92.3153 382.2 92.1553 382.76L90.5233 389H88.2193ZM106.051 389.288C105.075 389.288 104.259 389 103.603 388.424C102.963 387.832 102.643 387.016 102.643 385.976C102.643 384.696 103.211 383.72 104.347 383.048C105.499 382.36 107.315 381.88 109.795 381.608C109.795 381.112 109.723 380.64 109.579 380.192C109.451 379.744 109.211 379.384 108.859 379.112C108.523 378.824 108.035 378.68 107.395 378.68C106.723 378.68 106.091 378.808 105.499 379.064C104.907 379.32 104.379 379.608 103.915 379.928L103.147 378.56C103.691 378.208 104.355 377.872 105.139 377.552C105.939 377.216 106.803 377.048 107.731 377.048C109.155 377.048 110.187 377.488 110.827 378.368C111.467 379.232 111.787 380.392 111.787 381.848V389H110.155L109.987 387.608H109.915C109.371 388.056 108.771 388.448 108.115 388.784C107.475 389.12 106.787 389.288 106.051 389.288ZM106.627 387.704C107.187 387.704 107.715 387.568 108.211 387.296C108.707 387.024 109.235 386.64 109.795 386.144V382.904C107.859 383.144 106.499 383.504 105.715 383.984C104.947 384.464 104.563 385.08 104.563 385.832C104.563 386.488 104.763 386.968 105.163 387.272C105.563 387.56 106.051 387.704 106.627 387.704ZM118.505 389.288C117.673 389.288 116.881 389.136 116.129 388.832C115.377 388.512 114.721 388.128 114.161 387.68L115.145 386.36C115.657 386.76 116.185 387.096 116.729 387.368C117.273 387.624 117.889 387.752 118.577 387.752C119.345 387.752 119.921 387.576 120.305 387.224C120.689 386.856 120.881 386.424 120.881 385.928C120.881 385.528 120.745 385.192 120.473 384.92C120.217 384.648 119.881 384.424 119.465 384.248C119.065 384.056 118.649 383.88 118.217 383.72C117.673 383.512 117.137 383.28 116.609 383.024C116.081 382.752 115.649 382.408 115.313 381.992C114.977 381.56 114.809 381.016 114.809 380.36C114.809 379.416 115.161 378.632 115.865 378.008C116.585 377.368 117.577 377.048 118.841 377.048C119.561 377.048 120.233 377.176 
120.857 377.432C121.481 377.688 122.017 378 122.465 378.368L121.505 379.616C121.105 379.312 120.689 379.064 120.257 378.872C119.825 378.68 119.353 378.584 118.841 378.584C118.105 378.584 117.561 378.752 117.209 379.088C116.873 379.424 116.705 379.816 116.705 380.264C116.705 380.632 116.825 380.936 117.065 381.176C117.305 381.4 117.617 381.6 118.001 381.776C118.385 381.936 118.793 382.104 119.225 382.28C119.785 382.488 120.337 382.728 120.881 383C121.425 383.256 121.873 383.608 122.225 384.056C122.593 384.488 122.777 385.072 122.777 385.808C122.777 386.432 122.609 387.008 122.273 387.536C121.953 388.064 121.473 388.488 120.833 388.808C120.209 389.128 119.433 389.288 118.505 389.288ZM133.005 389.288C132.029 389.288 131.213 389 130.557 388.424C129.917 387.832 129.597 387.016 129.597 385.976C129.597 384.696 130.165 383.72 131.301 383.048C132.453 382.36 134.269 381.88 136.749 381.608C136.749 381.112 136.677 380.64 136.533 380.192C136.405 379.744 136.165 379.384 135.813 379.112C135.477 378.824 134.989 378.68 134.349 378.68C133.677 378.68 133.045 378.808 132.453 379.064C131.861 379.32 131.333 379.608 130.869 379.928L130.101 378.56C130.645 378.208 131.309 377.872 132.093 377.552C132.893 377.216 133.757 377.048 134.685 377.048C136.109 377.048 137.141 377.488 137.781 378.368C138.421 379.232 138.741 380.392 138.741 381.848V389H137.109L136.941 387.608H136.869C136.325 388.056 135.725 388.448 135.069 388.784C134.429 389.12 133.741 389.288 133.005 389.288ZM133.581 387.704C134.141 387.704 134.669 387.568 135.165 387.296C135.661 387.024 136.189 386.64 136.749 386.144V382.904C134.813 383.144 133.453 383.504 132.669 383.984C131.901 384.464 131.517 385.08 131.517 385.832C131.517 386.488 131.717 386.968 132.117 387.272C132.517 387.56 133.005 387.704 133.581 387.704ZM151.199 389.288C149.743 389.288 148.575 388.76 147.695 387.704C146.815 386.632 146.375 385.128 146.375 383.192C146.375 381.928 146.607 380.84 147.071 379.928C147.551 379 148.175 378.288 148.943 377.792C149.727 377.296 150.559 377.048 151.439 377.048C152.111 377.048 152.695 377.168 153.191 377.408C153.687 377.648 154.191 377.976 154.703 378.392L154.607 376.4V371.912H156.599V389H154.967L154.799 387.632H154.727C154.279 388.08 153.751 388.472 153.143 388.808C152.535 389.128 151.887 389.288 151.199 389.288ZM151.631 387.632C152.655 387.632 153.647 387.096 154.607 386.024V379.928C154.111 379.48 153.631 379.168 153.167 378.992C152.719 378.8 152.255 378.704 151.775 378.704C151.151 378.704 150.583 378.896 150.071 379.28C149.575 379.648 149.175 380.168 148.871 380.84C148.567 381.496 148.415 382.272 148.415 383.168C148.415 384.56 148.695 385.656 149.255 386.456C149.815 387.24 150.607 387.632 151.631 387.632ZM165.256 389.288C164.216 389.288 163.272 389.048 162.424 388.568C161.576 388.072 160.904 387.368 160.408 386.456C159.912 385.544 159.664 384.456 159.664 383.192C159.664 381.912 159.912 380.816 160.408 379.904C160.92 378.992 161.576 378.288 162.376 377.792C163.176 377.296 164.016 377.048 164.896 377.048C166.384 377.048 167.528 377.544 168.328 378.536C169.144 379.528 169.552 380.856 169.552 382.52C169.552 382.728 169.544 382.936 169.528 383.144C169.528 383.336 169.512 383.504 169.48 383.648H161.608C161.688 384.88 162.072 385.864 162.76 386.6C163.464 387.336 164.376 387.704 165.496 387.704C166.056 387.704 166.568 387.624 167.032 387.464C167.512 387.288 167.968 387.064 168.4 386.792L169.096 388.088C168.6 388.408 168.032 388.688 167.392 388.928C166.768 389.168 166.056 389.288 165.256 389.288ZM161.584 382.232H167.824C167.824 381.048 167.568 380.152 167.056 
153.56 1216.39 152.912 1216.39 152.144C1216.39 151.344 1216.6 150.632 1217.02 150.008C1217.45 149.368 1218.04 148.872 1218.79 148.52C1219.55 148.152 1220.39 147.968 1221.34 147.968C1222.28 147.968 1223.15 148.152 1223.95 148.52C1224.75 148.872 1225.43 149.336 1225.99 149.912L1224.91 151.208C1224.43 150.744 1223.9 150.384 1223.31 150.128C1222.73 149.856 1222.07 149.72 1221.34 149.72C1220.46 149.72 1219.75 149.928 1219.2 150.344C1218.67 150.76 1218.41 151.32 1218.41 152.024C1218.41 152.52 1218.54 152.936 1218.79 153.272C1219.07 153.592 1219.4 153.856 1219.8 154.064C1220.2 154.272 1220.61 154.464 1221.03 154.64L1223.26 155.6C1223.87 155.856 1224.41 156.168 1224.89 156.536C1225.39 156.888 1225.78 157.328 1226.07 157.856C1226.35 158.368 1226.5 159.016 1226.5 159.8C1226.5 160.632 1226.28 161.392 1225.85 162.08C1225.42 162.752 1224.8 163.288 1224 163.688C1223.2 164.088 1222.25 164.288 1221.15 164.288ZM1229.41 164V152.336H1231.37V164H1229.41ZM1230.41 149.936C1230.03 149.936 1229.71 149.824 1229.45 149.6C1229.21 149.36 1229.09 149.04 1229.09 148.64C1229.09 148.256 1229.21 147.944 1229.45 147.704C1229.71 147.464 1230.03 147.344 1230.41 147.344C1230.8 147.344 1231.11 147.464 1231.35 147.704C1231.61 147.944 1231.73 148.256 1231.73 148.64C1231.73 149.04 1231.61 149.36 1231.35 149.6C1231.11 149.824 1230.8 149.936 1230.41 149.936ZM1235.31 164V152.336H1236.94L1237.11 154.016H1237.18C1237.7 153.456 1238.26 152.992 1238.86 152.624C1239.47 152.24 1240.12 152.048 1240.81 152.048C1241.7 152.048 1242.4 152.248 1242.9 152.648C1243.41 153.032 1243.78 153.576 1244.02 154.28C1244.63 153.624 1245.25 153.088 1245.87 152.672C1246.5 152.256 1247.16 152.048 1247.86 152.048C1249.06 152.048 1249.95 152.44 1250.53 153.224C1251.12 153.992 1251.42 155.12 1251.42 156.608V164H1249.45V156.872C1249.45 155.784 1249.27 154.992 1248.92 154.496C1248.57 154 1248.02 153.752 1247.29 153.752C1246.42 153.752 1245.45 154.352 1244.36 155.552V164H1242.39V156.872C1242.39 155.784 1242.22 154.992 1241.86 154.496C1241.51 154 1240.96 153.752 1240.21 153.752C1239.34 153.752 1238.37 154.352 1237.28 155.552V164H1235.31ZM1255.21 164V152.336H1257.18V164H1255.21ZM1256.22 149.936C1255.83 149.936 1255.51 149.824 1255.26 149.6C1255.02 149.36 1254.9 149.04 1254.9 148.64C1254.9 148.256 1255.02 147.944 1255.26 147.704C1255.51 147.464 1255.83 147.344 1256.22 147.344C1256.6 147.344 1256.91 147.464 1257.15 147.704C1257.41 147.944 1257.54 148.256 1257.54 148.64C1257.54 149.04 1257.41 149.36 1257.15 149.6C1256.91 149.824 1256.6 149.936 1256.22 149.936ZM1263.2 164.288C1262.47 164.288 1261.93 164.064 1261.6 163.616C1261.28 163.152 1261.12 162.496 1261.12 161.648V146.912H1263.08V161.792C1263.08 162.096 1263.14 162.32 1263.25 162.464C1263.36 162.592 1263.49 162.656 1263.64 162.656C1263.7 162.656 1263.76 162.656 1263.8 162.656C1263.87 162.64 1263.96 162.624 1264.07 162.608L1264.33 164.096C1264.2 164.16 1264.05 164.208 1263.88 164.24C1263.7 164.272 1263.48 164.288 1263.2 164.288ZM1269.92 164.288C1268.95 164.288 1268.13 164 1267.47 163.424C1266.83 162.832 1266.51 162.016 1266.51 160.976C1266.51 159.696 1267.08 158.72 1268.22 158.048C1269.37 157.36 1271.19 156.88 1273.67 156.608C1273.67 156.112 1273.59 155.64 1273.45 155.192C1273.32 154.744 1273.08 154.384 1272.73 154.112C1272.39 153.824 1271.91 153.68 1271.27 153.68C1270.59 153.68 1269.96 153.808 1269.37 154.064C1268.78 154.32 1268.25 154.608 1267.79 154.928L1267.02 153.56C1267.56 153.208 1268.23 152.872 1269.01 152.552C1269.81 152.216 1270.67 152.048 1271.6 152.048C1273.03 152.048 1274.06 152.488 1274.7 
153.368C1275.34 154.232 1275.66 155.392 1275.66 156.848V164H1274.03L1273.86 162.608H1273.79C1273.24 163.056 1272.64 163.448 1271.99 163.784C1271.35 164.12 1270.66 164.288 1269.92 164.288ZM1270.5 162.704C1271.06 162.704 1271.59 162.568 1272.08 162.296C1272.58 162.024 1273.11 161.64 1273.67 161.144V157.904C1271.73 158.144 1270.37 158.504 1269.59 158.984C1268.82 159.464 1268.43 160.08 1268.43 160.832C1268.43 161.488 1268.63 161.968 1269.03 162.272C1269.43 162.56 1269.92 162.704 1270.5 162.704ZM1279.33 164V152.336H1280.96L1281.13 154.448H1281.2C1281.6 153.712 1282.09 153.128 1282.66 152.696C1283.24 152.264 1283.86 152.048 1284.51 152.048C1284.98 152.048 1285.39 152.128 1285.76 152.288L1285.38 154.016C1285.18 153.952 1285.01 153.904 1284.85 153.872C1284.69 153.84 1284.49 153.824 1284.25 153.824C1283.75 153.824 1283.23 154.024 1282.69 154.424C1282.16 154.824 1281.7 155.52 1281.3 156.512V164H1279.33ZM1287.65 164V152.336H1289.62V164H1287.65ZM1288.66 149.936C1288.27 149.936 1287.95 149.824 1287.7 149.6C1287.46 149.36 1287.34 149.04 1287.34 148.64C1287.34 148.256 1287.46 147.944 1287.7 147.704C1287.95 147.464 1288.27 147.344 1288.66 147.344C1289.04 147.344 1289.35 147.464 1289.59 147.704C1289.85 147.944 1289.98 148.256 1289.98 148.64C1289.98 149.04 1289.85 149.36 1289.59 149.6C1289.35 149.824 1289.04 149.936 1288.66 149.936ZM1297.23 164.288C1295.98 164.288 1295.11 163.928 1294.61 163.208C1294.13 162.488 1293.89 161.552 1293.89 160.4V153.944H1292.16V152.456L1293.99 152.336L1294.23 149.072H1295.88V152.336H1299.03V153.944H1295.88V160.424C1295.88 161.144 1296.01 161.704 1296.27 162.104C1296.54 162.488 1297.01 162.68 1297.68 162.68C1297.89 162.68 1298.11 162.648 1298.35 162.584C1298.59 162.504 1298.81 162.432 1299 162.368L1299.39 163.856C1299.07 163.968 1298.71 164.064 1298.33 164.144C1297.96 164.24 1297.59 164.288 1297.23 164.288ZM1301.86 169.016C1301.6 169.016 1301.35 168.992 1301.11 168.944C1300.89 168.896 1300.68 168.84 1300.49 168.776L1300.87 167.216C1301 167.248 1301.14 167.28 1301.3 167.312C1301.46 167.36 1301.62 167.384 1301.76 167.384C1302.42 167.384 1302.96 167.144 1303.39 166.664C1303.82 166.2 1304.16 165.608 1304.4 164.888L1304.66 164.024L1299.98 152.336H1302.02L1304.4 158.792C1304.58 159.288 1304.76 159.824 1304.95 160.4C1305.16 160.976 1305.35 161.536 1305.53 162.08H1305.62C1305.8 161.552 1305.97 161 1306.13 160.424C1306.29 159.848 1306.45 159.304 1306.61 158.792L1308.7 152.336H1310.62L1306.22 164.96C1305.95 165.728 1305.62 166.416 1305.24 167.024C1304.87 167.632 1304.41 168.112 1303.85 168.464C1303.3 168.832 1302.64 169.016 1301.86 169.016Z" fill="#A2A2A2"/> <path d="M1199.08 197.288C1197.49 197.288 1196.23 196.6 1195.29 195.224C1194.36 193.848 1193.89 191.872 1193.89 189.296C1193.89 186.72 1194.36 184.768 1195.29 183.44C1196.23 182.112 1197.49 181.448 1199.08 181.448C1200.66 181.448 1201.92 182.112 1202.85 183.44C1203.79 184.768 1204.26 186.72 1204.26 189.296C1204.26 191.872 1203.79 193.848 1202.85 195.224C1201.92 196.6 1200.66 197.288 1199.08 197.288ZM1199.08 195.128C1199.57 195.128 1200.01 194.944 1200.4 194.576C1200.78 194.192 1201.08 193.576 1201.29 192.728C1201.51 191.88 1201.62 190.736 1201.62 189.296C1201.62 187.856 1201.51 186.72 1201.29 185.888C1201.08 185.056 1200.78 184.472 1200.4 184.136C1200.01 183.784 1199.57 183.608 1199.08 183.608C1198.6 183.608 1198.17 183.784 1197.78 184.136C1197.4 184.472 1197.09 185.056 1196.87 185.888C1196.65 186.72 1196.53 187.856 1196.53 189.296C1196.53 190.736 1196.65 191.88 1196.87 192.728C1197.09 193.576 1197.4 194.192 1197.78 194.576C1198.17 
194.944 1198.6 195.128 1199.08 195.128ZM1208.55 197.288C1208.05 197.288 1207.63 197.112 1207.28 196.76C1206.93 196.408 1206.75 195.96 1206.75 195.416C1206.75 194.872 1206.93 194.424 1207.28 194.072C1207.63 193.72 1208.05 193.544 1208.55 193.544C1209.06 193.544 1209.49 193.72 1209.82 194.072C1210.16 194.424 1210.33 194.872 1210.33 195.416C1210.33 195.96 1210.16 196.408 1209.82 196.76C1209.49 197.112 1209.06 197.288 1208.55 197.288ZM1215.28 186.44C1215.28 187.336 1215.49 188.016 1215.9 188.48C1216.34 188.944 1216.94 189.176 1217.7 189.176C1218.15 189.176 1218.62 189.04 1219.1 188.768C1219.59 188.48 1220.04 188.024 1220.44 187.4C1220.3 186.056 1219.97 185.08 1219.46 184.472C1218.94 183.864 1218.31 183.56 1217.56 183.56C1216.94 183.56 1216.4 183.808 1215.95 184.304C1215.5 184.8 1215.28 185.512 1215.28 186.44ZM1217.06 197.288C1216.14 197.288 1215.35 197.12 1214.68 196.784C1214.01 196.448 1213.44 196.04 1212.98 195.56L1214.46 193.88C1214.75 194.216 1215.11 194.496 1215.54 194.72C1215.99 194.928 1216.45 195.032 1216.91 195.032C1217.55 195.032 1218.14 194.856 1218.66 194.504C1219.19 194.152 1219.62 193.568 1219.96 192.752C1220.3 191.92 1220.48 190.8 1220.51 189.392C1220.06 189.936 1219.54 190.368 1218.93 190.688C1218.32 191.008 1217.74 191.168 1217.18 191.168C1215.86 191.168 1214.79 190.784 1213.96 190.016C1213.14 189.232 1212.74 188.04 1212.74 186.44C1212.74 185.416 1212.95 184.536 1213.38 183.8C1213.83 183.048 1214.42 182.472 1215.14 182.072C1215.87 181.656 1216.67 181.448 1217.54 181.448C1218.53 181.448 1219.44 181.712 1220.27 182.24C1221.12 182.768 1221.79 183.584 1222.29 184.688C1222.8 185.776 1223.06 187.176 1223.06 188.888C1223.06 190.856 1222.78 192.464 1222.22 193.712C1221.66 194.944 1220.92 195.848 1220.01 196.424C1219.1 197 1218.11 197.288 1217.06 197.288ZM1230.32 197.288C1229.33 197.288 1228.44 197.112 1227.66 196.76C1226.87 196.392 1226.26 195.896 1225.81 195.272C1225.36 194.648 1225.14 193.92 1225.14 193.088C1225.14 192.144 1225.42 191.352 1225.98 190.712C1226.54 190.072 1227.18 189.56 1227.9 189.176V189.08C1227.3 188.664 1226.8 188.168 1226.38 187.592C1225.98 187 1225.78 186.304 1225.78 185.504C1225.78 184.672 1225.98 183.96 1226.38 183.368C1226.78 182.76 1227.33 182.288 1228.02 181.952C1228.72 181.616 1229.51 181.448 1230.39 181.448C1231.77 181.448 1232.86 181.832 1233.66 182.6C1234.47 183.352 1234.88 184.344 1234.88 185.576C1234.88 186.328 1234.66 187.008 1234.23 187.616C1233.8 188.224 1233.34 188.712 1232.84 189.08V189.176C1233.54 189.576 1234.16 190.08 1234.69 190.688C1235.22 191.296 1235.48 192.112 1235.48 193.136C1235.48 193.92 1235.26 194.624 1234.83 195.248C1234.4 195.872 1233.8 196.368 1233.03 196.736C1232.26 197.104 1231.36 197.288 1230.32 197.288ZM1231.33 188.384C1231.74 187.968 1232.06 187.544 1232.26 187.112C1232.49 186.68 1232.6 186.224 1232.6 185.744C1232.6 185.072 1232.4 184.52 1232 184.088C1231.6 183.64 1231.05 183.416 1230.34 183.416C1229.75 183.416 1229.26 183.6 1228.86 183.968C1228.46 184.336 1228.26 184.848 1228.26 185.504C1228.26 186.016 1228.39 186.448 1228.66 186.8C1228.94 187.136 1229.3 187.432 1229.77 187.688C1230.25 187.928 1230.77 188.16 1231.33 188.384ZM1230.37 195.32C1231.1 195.32 1231.7 195.12 1232.17 194.72C1232.65 194.304 1232.89 193.744 1232.89 193.04C1232.89 192.48 1232.72 192.024 1232.38 191.672C1232.06 191.32 1231.62 191.016 1231.06 190.76C1230.52 190.488 1229.9 190.216 1229.22 189.944C1228.74 190.296 1228.34 190.712 1228.02 191.192C1227.71 191.672 1227.56 192.208 1227.56 192.8C1227.56 193.552 1227.83 194.16 1228.38 194.624C1228.92 195.088 
1229.58 195.32 1230.37 195.32ZM1242.22 197.288C1240.99 197.288 1239.95 197.08 1239.1 196.664C1238.27 196.232 1237.58 195.72 1237.03 195.128L1238.35 193.352C1238.82 193.816 1239.35 194.216 1239.94 194.552C1240.53 194.872 1241.21 195.032 1241.98 195.032C1242.81 195.032 1243.48 194.832 1243.99 194.432C1244.52 194.016 1244.79 193.448 1244.79 192.728C1244.79 192.2 1244.65 191.744 1244.38 191.36C1244.12 190.96 1243.67 190.656 1243.01 190.448C1242.37 190.24 1241.47 190.136 1240.3 190.136V188.12C1241.8 188.12 1242.84 187.888 1243.42 187.424C1243.99 186.96 1244.28 186.376 1244.28 185.672C1244.28 185.032 1244.08 184.536 1243.68 184.184C1243.28 183.816 1242.73 183.632 1242.03 183.632C1241.43 183.632 1240.89 183.768 1240.39 184.04C1239.9 184.296 1239.42 184.64 1238.95 185.072L1237.54 183.368C1238.19 182.792 1238.89 182.328 1239.63 181.976C1240.38 181.624 1241.21 181.448 1242.12 181.448C1243.59 181.448 1244.79 181.8 1245.7 182.504C1246.61 183.192 1247.07 184.184 1247.07 185.48C1247.07 186.328 1246.83 187.048 1246.35 187.64C1245.88 188.216 1245.23 188.664 1244.4 188.984V189.08C1245.3 189.32 1246.05 189.76 1246.66 190.4C1247.27 191.04 1247.57 191.864 1247.57 192.872C1247.57 193.8 1247.32 194.592 1246.83 195.248C1246.35 195.904 1245.7 196.408 1244.88 196.76C1244.08 197.112 1243.19 197.288 1242.22 197.288ZM1254.52 197.288C1253.29 197.288 1252.25 197.08 1251.4 196.664C1250.57 196.232 1249.88 195.72 1249.34 195.128L1250.66 193.352C1251.12 193.816 1251.65 194.216 1252.24 194.552C1252.83 194.872 1253.51 195.032 1254.28 195.032C1255.11 195.032 1255.79 194.832 1256.3 194.432C1256.83 194.016 1257.09 193.448 1257.09 192.728C1257.09 192.2 1256.95 191.744 1256.68 191.36C1256.43 190.96 1255.97 190.656 1255.31 190.448C1254.67 190.24 1253.77 190.136 1252.6 190.136V188.12C1254.11 188.12 1255.15 187.888 1255.72 187.424C1256.3 186.96 1256.59 186.376 1256.59 185.672C1256.59 185.032 1256.39 184.536 1255.99 184.184C1255.59 183.816 1255.03 183.632 1254.33 183.632C1253.74 183.632 1253.19 183.768 1252.7 184.04C1252.2 184.296 1251.72 184.64 1251.26 185.072L1249.84 183.368C1250.5 182.792 1251.19 182.328 1251.93 181.976C1252.68 181.624 1253.51 181.448 1254.43 181.448C1255.9 181.448 1257.09 181.8 1258 182.504C1258.91 183.192 1259.37 184.184 1259.37 185.48C1259.37 186.328 1259.13 187.048 1258.65 187.64C1258.19 188.216 1257.54 188.664 1256.71 188.984V189.08C1257.6 189.32 1258.35 189.76 1258.96 190.4C1259.57 191.04 1259.87 191.864 1259.87 192.872C1259.87 193.8 1259.63 194.592 1259.13 195.248C1258.65 195.904 1258 196.408 1257.19 196.76C1256.39 197.112 1255.5 197.288 1254.52 197.288Z" fill="#44B631"/> <rect x="1110" y="307" width="233.752" height="144" rx="29" fill="#FFF5DD"/> <rect x="1110" y="307" width="233.752" height="144" rx="29" stroke="#D7DDFF" stroke-width="2"/> <path d="M1150.21 364.288C1148.9 364.288 1147.73 363.968 1146.69 363.328C1145.66 362.672 1144.85 361.736 1144.24 360.52C1143.65 359.304 1143.35 357.84 1143.35 356.128C1143.35 354.432 1143.65 352.976 1144.26 351.76C1144.87 350.544 1145.7 349.608 1146.76 348.952C1147.81 348.296 1149.01 347.968 1150.33 347.968C1151.29 347.968 1152.14 348.168 1152.88 348.568C1153.61 348.952 1154.21 349.4 1154.65 349.912L1153.57 351.208C1153.16 350.76 1152.69 350.4 1152.16 350.128C1151.63 349.856 1151.03 349.72 1150.36 349.72C1149.37 349.72 1148.49 349.984 1147.74 350.512C1147.01 351.024 1146.43 351.752 1146.01 352.696C1145.61 353.64 1145.41 354.768 1145.41 356.08C1145.41 357.392 1145.61 358.536 1146.01 359.512C1146.41 360.472 1146.97 361.216 1147.69 361.744C1148.43 362.272 1149.29 362.536 
1150.29 362.536C1151.04 362.536 1151.71 362.376 1152.3 362.056C1152.89 361.736 1153.44 361.304 1153.93 360.76L1155.04 362.008C1154.41 362.728 1153.71 363.288 1152.93 363.688C1152.14 364.088 1151.24 364.288 1150.21 364.288ZM1162.32 364.288C1161.37 364.288 1160.48 364.048 1159.65 363.568C1158.84 363.088 1158.17 362.392 1157.66 361.48C1157.16 360.568 1156.92 359.472 1156.92 358.192C1156.92 356.88 1157.16 355.768 1157.66 354.856C1158.17 353.944 1158.84 353.248 1159.65 352.768C1160.48 352.288 1161.37 352.048 1162.32 352.048C1163.28 352.048 1164.16 352.288 1164.98 352.768C1165.8 353.248 1166.45 353.944 1166.95 354.856C1167.46 355.768 1167.72 356.88 1167.72 358.192C1167.72 359.472 1167.46 360.568 1166.95 361.48C1166.45 362.392 1165.8 363.088 1164.98 363.568C1164.16 364.048 1163.28 364.288 1162.32 364.288ZM1162.32 362.656C1163.32 362.656 1164.13 362.248 1164.74 361.432C1165.36 360.6 1165.68 359.52 1165.68 358.192C1165.68 356.848 1165.36 355.76 1164.74 354.928C1164.13 354.096 1163.32 353.68 1162.32 353.68C1161.32 353.68 1160.52 354.096 1159.89 354.928C1159.27 355.76 1158.96 356.848 1158.96 358.192C1158.96 359.52 1159.27 360.6 1159.89 361.432C1160.52 362.248 1161.32 362.656 1162.32 362.656ZM1173.84 364.288C1173 364.288 1172.21 364.136 1171.46 363.832C1170.71 363.512 1170.05 363.128 1169.49 362.68L1170.48 361.36C1170.99 361.76 1171.52 362.096 1172.06 362.368C1172.6 362.624 1173.22 362.752 1173.91 362.752C1174.68 362.752 1175.25 362.576 1175.64 362.224C1176.02 361.856 1176.21 361.424 1176.21 360.928C1176.21 360.528 1176.08 360.192 1175.8 359.92C1175.55 359.648 1175.21 359.424 1174.8 359.248C1174.4 359.056 1173.98 358.88 1173.55 358.72C1173 358.512 1172.47 358.28 1171.94 358.024C1171.41 357.752 1170.98 357.408 1170.64 356.992C1170.31 356.56 1170.14 356.016 1170.14 355.36C1170.14 354.416 1170.49 353.632 1171.2 353.008C1171.92 352.368 1172.91 352.048 1174.17 352.048C1174.89 352.048 1175.56 352.176 1176.19 352.432C1176.81 352.688 1177.35 353 1177.8 353.368L1176.84 354.616C1176.44 354.312 1176.02 354.064 1175.59 353.872C1175.16 353.68 1174.68 353.584 1174.17 353.584C1173.44 353.584 1172.89 353.752 1172.54 354.088C1172.2 354.424 1172.04 354.816 1172.04 355.264C1172.04 355.632 1172.16 355.936 1172.4 356.176C1172.64 356.4 1172.95 356.6 1173.33 356.776C1173.72 356.936 1174.12 357.104 1174.56 357.28C1175.12 357.488 1175.67 357.728 1176.21 358C1176.76 358.256 1177.2 358.608 1177.56 359.056C1177.92 359.488 1178.11 360.072 1178.11 360.808C1178.11 361.432 1177.94 362.008 1177.6 362.536C1177.28 363.064 1176.8 363.488 1176.16 363.808C1175.54 364.128 1174.76 364.288 1173.84 364.288ZM1180.84 364V352.336H1182.81V364H1180.84ZM1181.85 349.936C1181.47 349.936 1181.15 349.824 1180.89 349.6C1180.65 349.36 1180.53 349.04 1180.53 348.64C1180.53 348.256 1180.65 347.944 1180.89 347.704C1181.15 347.464 1181.47 347.344 1181.85 347.344C1182.24 347.344 1182.55 347.464 1182.79 347.704C1183.04 347.944 1183.17 348.256 1183.17 348.64C1183.17 349.04 1183.04 349.36 1182.79 349.6C1182.55 349.824 1182.24 349.936 1181.85 349.936ZM1186.75 364V352.336H1188.38L1188.55 354.016H1188.62C1189.18 353.456 1189.77 352.992 1190.4 352.624C1191.02 352.24 1191.73 352.048 1192.53 352.048C1193.77 352.048 1194.66 352.44 1195.22 353.224C1195.8 353.992 1196.09 355.12 1196.09 356.608V364H1194.12V356.872C1194.12 355.784 1193.94 354.992 1193.59 354.496C1193.24 354 1192.68 353.752 1191.91 353.752C1191.32 353.752 1190.78 353.904 1190.3 354.208C1189.84 354.512 1189.31 354.96 1188.72 355.552V364H1186.75ZM1204.6 364.288C1203.56 364.288 1202.62 364.048 1201.77 
363.568C1200.92 363.072 1200.25 362.368 1199.75 361.456C1199.26 360.544 1199.01 359.456 1199.01 358.192C1199.01 356.912 1199.26 355.816 1199.75 354.904C1200.27 353.992 1200.92 353.288 1201.72 352.792C1202.52 352.296 1203.36 352.048 1204.24 352.048C1205.73 352.048 1206.87 352.544 1207.67 353.536C1208.49 354.528 1208.9 355.856 1208.9 357.52C1208.9 357.728 1208.89 357.936 1208.87 358.144C1208.87 358.336 1208.86 358.504 1208.83 358.648H1200.95C1201.03 359.88 1201.42 360.864 1202.11 361.6C1202.81 362.336 1203.72 362.704 1204.84 362.704C1205.4 362.704 1205.91 362.624 1206.38 362.464C1206.86 362.288 1207.31 362.064 1207.75 361.792L1208.44 363.088C1207.95 363.408 1207.38 363.688 1206.74 363.928C1206.11 364.168 1205.4 364.288 1204.6 364.288ZM1200.93 357.232H1207.17C1207.17 356.048 1206.91 355.152 1206.4 354.544C1205.91 353.92 1205.2 353.608 1204.29 353.608C1203.47 353.608 1202.74 353.928 1202.08 354.568C1201.44 355.192 1201.06 356.08 1200.93 357.232ZM1221.15 364.288C1220.04 364.288 1219.02 364.08 1218.07 363.664C1217.13 363.232 1216.31 362.664 1215.63 361.96L1216.83 360.568C1217.39 361.16 1218.04 361.64 1218.79 362.008C1219.56 362.36 1220.35 362.536 1221.17 362.536C1222.21 362.536 1223.02 362.304 1223.59 361.84C1224.17 361.36 1224.46 360.736 1224.46 359.968C1224.46 359.424 1224.34 358.992 1224.1 358.672C1223.87 358.352 1223.56 358.08 1223.16 357.856C1222.78 357.632 1222.34 357.408 1221.84 357.184L1219.59 356.2C1219.09 355.992 1218.59 355.72 1218.1 355.384C1217.62 355.048 1217.21 354.616 1216.87 354.088C1216.55 353.56 1216.39 352.912 1216.39 352.144C1216.39 351.344 1216.6 350.632 1217.02 350.008C1217.45 349.368 1218.04 348.872 1218.79 348.52C1219.55 348.152 1220.39 347.968 1221.34 347.968C1222.28 347.968 1223.15 348.152 1223.95 348.52C1224.75 348.872 1225.43 349.336 1225.99 349.912L1224.91 351.208C1224.43 350.744 1223.9 350.384 1223.31 350.128C1222.73 349.856 1222.07 349.72 1221.34 349.72C1220.46 349.72 1219.75 349.928 1219.2 350.344C1218.67 350.76 1218.41 351.32 1218.41 352.024C1218.41 352.52 1218.54 352.936 1218.79 353.272C1219.07 353.592 1219.4 353.856 1219.8 354.064C1220.2 354.272 1220.61 354.464 1221.03 354.64L1223.26 355.6C1223.87 355.856 1224.41 356.168 1224.89 356.536C1225.39 356.888 1225.78 357.328 1226.07 357.856C1226.35 358.368 1226.5 359.016 1226.5 359.8C1226.5 360.632 1226.28 361.392 1225.85 362.08C1225.42 362.752 1224.8 363.288 1224 363.688C1223.2 364.088 1222.25 364.288 1221.15 364.288ZM1229.41 364V352.336H1231.37V364H1229.41ZM1230.41 349.936C1230.03 349.936 1229.71 349.824 1229.45 349.6C1229.21 349.36 1229.09 349.04 1229.09 348.64C1229.09 348.256 1229.21 347.944 1229.45 347.704C1229.71 347.464 1230.03 347.344 1230.41 347.344C1230.8 347.344 1231.11 347.464 1231.35 347.704C1231.61 347.944 1231.73 348.256 1231.73 348.64C1231.73 349.04 1231.61 349.36 1231.35 349.6C1231.11 349.824 1230.8 349.936 1230.41 349.936ZM1235.31 364V352.336H1236.94L1237.11 354.016H1237.18C1237.7 353.456 1238.26 352.992 1238.86 352.624C1239.47 352.24 1240.12 352.048 1240.81 352.048C1241.7 352.048 1242.4 352.248 1242.9 352.648C1243.41 353.032 1243.78 353.576 1244.02 354.28C1244.63 353.624 1245.25 353.088 1245.87 352.672C1246.5 352.256 1247.16 352.048 1247.86 352.048C1249.06 352.048 1249.95 352.44 1250.53 353.224C1251.12 353.992 1251.42 355.12 1251.42 356.608V364H1249.45V356.872C1249.45 355.784 1249.27 354.992 1248.92 354.496C1248.57 354 1248.02 353.752 1247.29 353.752C1246.42 353.752 1245.45 354.352 1244.36 355.552V364H1242.39V356.872C1242.39 355.784 1242.22 354.992 1241.86 354.496C1241.51 354 1240.96 353.752 
1240.21 353.752C1239.34 353.752 1238.37 354.352 1237.28 355.552V364H1235.31ZM1255.21 364V352.336H1257.18V364H1255.21ZM1256.22 349.936C1255.83 349.936 1255.51 349.824 1255.26 349.6C1255.02 349.36 1254.9 349.04 1254.9 348.64C1254.9 348.256 1255.02 347.944 1255.26 347.704C1255.51 347.464 1255.83 347.344 1256.22 347.344C1256.6 347.344 1256.91 347.464 1257.15 347.704C1257.41 347.944 1257.54 348.256 1257.54 348.64C1257.54 349.04 1257.41 349.36 1257.15 349.6C1256.91 349.824 1256.6 349.936 1256.22 349.936ZM1263.2 364.288C1262.47 364.288 1261.93 364.064 1261.6 363.616C1261.28 363.152 1261.12 362.496 1261.12 361.648V346.912H1263.08V361.792C1263.08 362.096 1263.14 362.32 1263.25 362.464C1263.36 362.592 1263.49 362.656 1263.64 362.656C1263.7 362.656 1263.76 362.656 1263.8 362.656C1263.87 362.64 1263.96 362.624 1264.07 362.608L1264.33 364.096C1264.2 364.16 1264.05 364.208 1263.88 364.24C1263.7 364.272 1263.48 364.288 1263.2 364.288ZM1269.92 364.288C1268.95 364.288 1268.13 364 1267.47 363.424C1266.83 362.832 1266.51 362.016 1266.51 360.976C1266.51 359.696 1267.08 358.72 1268.22 358.048C1269.37 357.36 1271.19 356.88 1273.67 356.608C1273.67 356.112 1273.59 355.64 1273.45 355.192C1273.32 354.744 1273.08 354.384 1272.73 354.112C1272.39 353.824 1271.91 353.68 1271.27 353.68C1270.59 353.68 1269.96 353.808 1269.37 354.064C1268.78 354.32 1268.25 354.608 1267.79 354.928L1267.02 353.56C1267.56 353.208 1268.23 352.872 1269.01 352.552C1269.81 352.216 1270.67 352.048 1271.6 352.048C1273.03 352.048 1274.06 352.488 1274.7 353.368C1275.34 354.232 1275.66 355.392 1275.66 356.848V364H1274.03L1273.86 362.608H1273.79C1273.24 363.056 1272.64 363.448 1271.99 363.784C1271.35 364.12 1270.66 364.288 1269.92 364.288ZM1270.5 362.704C1271.06 362.704 1271.59 362.568 1272.08 362.296C1272.58 362.024 1273.11 361.64 1273.67 361.144V357.904C1271.73 358.144 1270.37 358.504 1269.59 358.984C1268.82 359.464 1268.43 360.08 1268.43 360.832C1268.43 361.488 1268.63 361.968 1269.03 362.272C1269.43 362.56 1269.92 362.704 1270.5 362.704ZM1279.33 364V352.336H1280.96L1281.13 354.448H1281.2C1281.6 353.712 1282.09 353.128 1282.66 352.696C1283.24 352.264 1283.86 352.048 1284.51 352.048C1284.98 352.048 1285.39 352.128 1285.76 352.288L1285.38 354.016C1285.18 353.952 1285.01 353.904 1284.85 353.872C1284.69 353.84 1284.49 353.824 1284.25 353.824C1283.75 353.824 1283.23 354.024 1282.69 354.424C1282.16 354.824 1281.7 355.52 1281.3 356.512V364H1279.33ZM1287.65 364V352.336H1289.62V364H1287.65ZM1288.66 349.936C1288.27 349.936 1287.95 349.824 1287.7 349.6C1287.46 349.36 1287.34 349.04 1287.34 348.64C1287.34 348.256 1287.46 347.944 1287.7 347.704C1287.95 347.464 1288.27 347.344 1288.66 347.344C1289.04 347.344 1289.35 347.464 1289.59 347.704C1289.85 347.944 1289.98 348.256 1289.98 348.64C1289.98 349.04 1289.85 349.36 1289.59 349.6C1289.35 349.824 1289.04 349.936 1288.66 349.936ZM1297.23 364.288C1295.98 364.288 1295.11 363.928 1294.61 363.208C1294.13 362.488 1293.89 361.552 1293.89 360.4V353.944H1292.16V352.456L1293.99 352.336L1294.23 349.072H1295.88V352.336H1299.03V353.944H1295.88V360.424C1295.88 361.144 1296.01 361.704 1296.27 362.104C1296.54 362.488 1297.01 362.68 1297.68 362.68C1297.89 362.68 1298.11 362.648 1298.35 362.584C1298.59 362.504 1298.81 362.432 1299 362.368L1299.39 363.856C1299.07 363.968 1298.71 364.064 1298.33 364.144C1297.96 364.24 1297.59 364.288 1297.23 364.288ZM1301.86 369.016C1301.6 369.016 1301.35 368.992 1301.11 368.944C1300.89 368.896 1300.68 368.84 1300.49 368.776L1300.87 367.216C1301 367.248 1301.14 367.28 1301.3 367.312C1301.46 367.36 
1301.62 367.384 1301.76 367.384C1302.42 367.384 1302.96 367.144 1303.39 366.664C1303.82 366.2 1304.16 365.608 1304.4 364.888L1304.66 364.024L1299.98 352.336H1302.02L1304.4 358.792C1304.58 359.288 1304.76 359.824 1304.95 360.4C1305.16 360.976 1305.35 361.536 1305.53 362.08H1305.62C1305.8 361.552 1305.97 361 1306.13 360.424C1306.29 359.848 1306.45 359.304 1306.61 358.792L1308.7 352.336H1310.62L1306.22 364.96C1305.95 365.728 1305.62 366.416 1305.24 367.024C1304.87 367.632 1304.41 368.112 1303.85 368.464C1303.3 368.832 1302.64 369.016 1301.86 369.016Z" fill="#A2A2A2"/> <path d="M1198.95 403.288C1197.37 403.288 1196.11 402.6 1195.16 401.224C1194.23 399.848 1193.77 397.872 1193.77 395.296C1193.77 392.72 1194.23 390.768 1195.16 389.44C1196.11 388.112 1197.37 387.448 1198.95 387.448C1200.54 387.448 1201.79 388.112 1202.72 389.44C1203.67 390.768 1204.14 392.72 1204.14 395.296C1204.14 397.872 1203.67 399.848 1202.72 401.224C1201.79 402.6 1200.54 403.288 1198.95 403.288ZM1198.95 401.128C1199.45 401.128 1199.89 400.944 1200.27 400.576C1200.66 400.192 1200.95 399.576 1201.16 398.728C1201.39 397.88 1201.5 396.736 1201.5 395.296C1201.5 393.856 1201.39 392.72 1201.16 391.888C1200.95 391.056 1200.66 390.472 1200.27 390.136C1199.89 389.784 1199.45 389.608 1198.95 389.608C1198.47 389.608 1198.04 389.784 1197.66 390.136C1197.27 390.472 1196.97 391.056 1196.75 391.888C1196.52 392.72 1196.41 393.856 1196.41 395.296C1196.41 396.736 1196.52 397.88 1196.75 398.728C1196.97 399.576 1197.27 400.192 1197.66 400.576C1198.04 400.944 1198.47 401.128 1198.95 401.128ZM1208.43 403.288C1207.93 403.288 1207.51 403.112 1207.15 402.76C1206.8 402.408 1206.63 401.96 1206.63 401.416C1206.63 400.872 1206.8 400.424 1207.15 400.072C1207.51 399.72 1207.93 399.544 1208.43 399.544C1208.94 399.544 1209.36 399.72 1209.7 400.072C1210.03 400.424 1210.2 400.872 1210.2 401.416C1210.2 401.96 1210.03 402.408 1209.7 402.76C1209.36 403.112 1208.94 403.288 1208.43 403.288ZM1217.48 403.288C1216.25 403.288 1215.21 403.08 1214.36 402.664C1213.53 402.232 1212.84 401.72 1212.3 401.128L1213.62 399.352C1214.08 399.816 1214.61 400.216 1215.2 400.552C1215.8 400.872 1216.48 401.032 1217.24 401.032C1218.08 401.032 1218.75 400.832 1219.26 400.432C1219.79 400.016 1220.05 399.448 1220.05 398.728C1220.05 398.2 1219.92 397.744 1219.64 397.36C1219.39 396.96 1218.93 396.656 1218.28 396.448C1217.64 396.24 1216.73 396.136 1215.56 396.136V394.12C1217.07 394.12 1218.11 393.888 1218.68 393.424C1219.26 392.96 1219.55 392.376 1219.55 391.672C1219.55 391.032 1219.35 390.536 1218.95 390.184C1218.55 389.816 1218 389.632 1217.29 389.632C1216.7 389.632 1216.16 389.768 1215.66 390.04C1215.16 390.296 1214.68 390.64 1214.22 391.072L1212.8 389.368C1213.46 388.792 1214.16 388.328 1214.89 387.976C1215.64 387.624 1216.48 387.448 1217.39 387.448C1218.86 387.448 1220.05 387.8 1220.96 388.504C1221.88 389.192 1222.33 390.184 1222.33 391.48C1222.33 392.328 1222.09 393.048 1221.61 393.64C1221.15 394.216 1220.5 394.664 1219.67 394.984V395.08C1220.56 395.32 1221.32 395.76 1221.92 396.4C1222.53 397.04 1222.84 397.864 1222.84 398.872C1222.84 399.8 1222.59 400.592 1222.09 401.248C1221.61 401.904 1220.96 402.408 1220.15 402.76C1219.35 403.112 1218.46 403.288 1217.48 403.288ZM1230.46 395.56C1230.03 395.56 1229.57 395.696 1229.09 395.968C1228.61 396.224 1228.16 396.672 1227.72 397.312C1227.87 398.672 1228.2 399.656 1228.71 400.264C1229.22 400.872 1229.84 401.176 1230.58 401.176C1231.22 401.176 1231.76 400.928 1232.19 400.432C1232.64 399.92 1232.86 399.2 1232.86 398.272C1232.86 397.376 1232.64 
396.704 1232.21 396.256C1231.8 395.792 1231.21 395.56 1230.46 395.56ZM1230.63 403.288C1229.64 403.288 1228.72 403.024 1227.87 402.496C1227.02 401.968 1226.34 401.152 1225.83 400.048C1225.33 398.944 1225.08 397.536 1225.08 395.824C1225.08 393.84 1225.36 392.24 1225.92 391.024C1226.5 389.792 1227.24 388.888 1228.16 388.312C1229.07 387.736 1230.04 387.448 1231.08 387.448C1232.01 387.448 1232.81 387.616 1233.48 387.952C1234.16 388.288 1234.72 388.696 1235.16 389.176L1233.68 390.856C1233.4 390.52 1233.04 390.248 1232.6 390.04C1232.16 389.816 1231.72 389.704 1231.25 389.704C1230.61 389.704 1230.03 389.88 1229.5 390.232C1228.97 390.584 1228.54 391.176 1228.2 392.008C1227.88 392.824 1227.7 393.936 1227.65 395.344C1228.1 394.784 1228.63 394.344 1229.24 394.024C1229.84 393.704 1230.42 393.544 1230.96 393.544C1232.29 393.544 1233.36 393.936 1234.18 394.72C1235.01 395.488 1235.43 396.672 1235.43 398.272C1235.43 399.28 1235.2 400.16 1234.76 400.912C1234.32 401.664 1233.75 402.248 1233.03 402.664C1232.31 403.08 1231.51 403.288 1230.63 403.288ZM1242.09 403.288C1240.86 403.288 1239.82 403.08 1238.97 402.664C1238.14 402.232 1237.45 401.72 1236.91 401.128L1238.23 399.352C1238.69 399.816 1239.22 400.216 1239.81 400.552C1240.41 400.872 1241.09 401.032 1241.85 401.032C1242.69 401.032 1243.36 400.832 1243.87 400.432C1244.4 400.016 1244.66 399.448 1244.66 398.728C1244.66 398.2 1244.53 397.744 1244.25 397.36C1244 396.96 1243.54 396.656 1242.89 396.448C1242.25 396.24 1241.34 396.136 1240.17 396.136V394.12C1241.68 394.12 1242.72 393.888 1243.29 393.424C1243.87 392.96 1244.16 392.376 1244.16 391.672C1244.16 391.032 1243.96 390.536 1243.56 390.184C1243.16 389.816 1242.61 389.632 1241.9 389.632C1241.31 389.632 1240.77 389.768 1240.27 390.04C1239.77 390.296 1239.29 390.64 1238.83 391.072L1237.41 389.368C1238.07 388.792 1238.77 388.328 1239.5 387.976C1240.25 387.624 1241.09 387.448 1242 387.448C1243.47 387.448 1244.66 387.8 1245.57 388.504C1246.49 389.192 1246.94 390.184 1246.94 391.48C1246.94 392.328 1246.7 393.048 1246.22 393.64C1245.76 394.216 1245.11 394.664 1244.28 394.984V395.08C1245.17 395.32 1245.93 395.76 1246.53 396.4C1247.14 397.04 1247.45 397.864 1247.45 398.872C1247.45 399.8 1247.2 400.592 1246.7 401.248C1246.22 401.904 1245.57 402.408 1244.76 402.76C1243.96 403.112 1243.07 403.288 1242.09 403.288ZM1251.81 396.928H1255.81V393.472C1255.81 393.024 1255.83 392.496 1255.86 391.888C1255.89 391.28 1255.92 390.752 1255.93 390.304H1255.84C1255.65 390.704 1255.45 391.096 1255.24 391.48C1255.03 391.864 1254.81 392.264 1254.59 392.68L1251.81 396.928ZM1255.81 403V399.064H1249.07V397.192L1255.12 387.736H1258.41V396.928H1260.33V399.064H1258.41V403H1255.81Z" fill="#F35F5F"/> <rect x="341" y="1" width="313" height="550" rx="29" fill="#FFF5DD"/> <rect x="341" y="1" width="313" height="550" rx="29" stroke="#FFE1B4" stroke-width="2"/> <path d="M725.97 156.707C726.36 156.317 726.36 155.683 725.97 155.293L719.606 148.929C719.215 148.538 718.582 148.538 718.192 148.929C717.801 149.319 717.801 149.953 718.192 150.343L723.848 156L718.192 161.657C717.801 162.047 717.801 162.681 718.192 163.071C718.582 163.462 719.215 163.462 719.606 163.071L725.97 156.707ZM655.263 157H725.263V155H655.263V157Z" fill="#B4B4B4"/> <path d="M725.97 276.707C726.36 276.317 726.36 275.683 725.97 275.293L719.606 268.929C719.215 268.538 718.582 268.538 718.192 268.929C717.801 269.319 717.801 269.953 718.192 270.343L723.848 276L718.192 281.657C717.801 282.047 717.801 282.681 718.192 283.071C718.582 283.462 719.215 283.462 719.606 283.071L725.97 
276.707ZM655.263 277H725.263V275H655.263V277Z" fill="#B4B4B4"/> <path d="M725.97 396.707C726.36 396.317 726.36 395.683 725.97 395.293L719.606 388.929C719.215 388.538 718.582 388.538 718.192 388.929C717.801 389.319 717.801 389.953 718.192 390.343L723.848 396L718.192 401.657C717.801 402.047 717.801 402.681 718.192 403.071C718.582 403.462 719.215 403.462 719.606 403.071L725.97 396.707ZM655.263 397H725.263V395H655.263V397Z" fill="#B4B4B4"/> <path d="M388.145 186.288C387.041 186.288 386.017 186.08 385.073 185.664C384.129 185.232 383.313 184.664 382.625 183.96L383.825 182.568C384.385 183.16 385.041 183.64 385.793 184.008C386.561 184.36 387.353 184.536 388.169 184.536C389.209 184.536 390.017 184.304 390.593 183.84C391.169 183.36 391.457 182.736 391.457 181.968C391.457 181.424 391.337 180.992 391.097 180.672C390.873 180.352 390.561 180.08 390.161 179.856C389.777 179.632 389.337 179.408 388.841 179.184L386.585 178.2C386.089 177.992 385.593 177.72 385.097 177.384C384.617 177.048 384.209 176.616 383.873 176.088C383.553 175.56 383.393 174.912 383.393 174.144C383.393 173.344 383.601 172.632 384.017 172.008C384.449 171.368 385.041 170.872 385.793 170.52C386.545 170.152 387.393 169.968 388.337 169.968C389.281 169.968 390.153 170.152 390.953 170.52C391.753 170.872 392.433 171.336 392.993 171.912L391.913 173.208C391.433 172.744 390.897 172.384 390.305 172.128C389.729 171.856 389.073 171.72 388.337 171.72C387.457 171.72 386.745 171.928 386.201 172.344C385.673 172.76 385.409 173.32 385.409 174.024C385.409 174.52 385.537 174.936 385.793 175.272C386.065 175.592 386.401 175.856 386.801 176.064C387.201 176.272 387.609 176.464 388.025 176.64L390.257 177.6C390.865 177.856 391.409 178.168 391.889 178.536C392.385 178.888 392.777 179.328 393.065 179.856C393.353 180.368 393.497 181.016 393.497 181.8C393.497 182.632 393.281 183.392 392.849 184.08C392.417 184.752 391.801 185.288 391.001 185.688C390.201 186.088 389.249 186.288 388.145 186.288ZM401.134 186.288C400.094 186.288 399.15 186.048 398.302 185.568C397.454 185.072 396.782 184.368 396.286 183.456C395.79 182.544 395.542 181.456 395.542 180.192C395.542 178.912 395.79 177.816 396.286 176.904C396.798 175.992 397.454 175.288 398.254 174.792C399.054 174.296 399.894 174.048 400.773 174.048C402.262 174.048 403.406 174.544 404.206 175.536C405.022 176.528 405.43 177.856 405.43 179.52C405.43 179.728 405.422 179.936 405.406 180.144C405.406 180.336 405.39 180.504 405.358 180.648H397.486C397.566 181.88 397.95 182.864 398.638 183.6C399.342 184.336 400.254 184.704 401.374 184.704C401.934 184.704 402.446 184.624 402.91 184.464C403.39 184.288 403.846 184.064 404.278 183.792L404.974 185.088C404.478 185.408 403.91 185.688 403.27 185.928C402.646 186.168 401.934 186.288 401.134 186.288ZM397.462 179.232H403.702C403.702 178.048 403.446 177.152 402.934 176.544C402.438 175.92 401.734 175.608 400.822 175.608C400.006 175.608 399.27 175.928 398.614 176.568C397.974 177.192 397.59 178.08 397.462 179.232ZM408.312 186V174.336H409.944L410.112 176.016H410.184C410.744 175.456 411.336 174.992 411.96 174.624C412.584 174.24 413.296 174.048 414.096 174.048C415.328 174.048 416.224 174.44 416.784 175.224C417.36 175.992 417.648 177.12 417.648 178.608V186H415.68V178.872C415.68 177.784 415.504 176.992 415.152 176.496C414.8 176 414.24 175.752 413.472 175.752C412.88 175.752 412.344 175.904 411.864 176.208C411.4 176.512 410.872 176.96 410.28 177.552V186H408.312ZM425.109 186.288C423.861 186.288 422.989 185.928 422.493 185.208C422.013 184.488 421.773 183.552 421.773 182.4V175.944H420.045V174.456L421.869 
174.336L422.109 171.072H423.765V174.336H426.909V175.944H423.765V182.424C423.765 183.144 423.893 183.704 424.149 184.104C424.421 184.488 424.893 184.68 425.565 184.68C425.773 184.68 425.997 184.648 426.237 184.584C426.477 184.504 426.693 184.432 426.885 184.368L427.269 185.856C426.949 185.968 426.597 186.064 426.213 186.144C425.845 186.24 425.477 186.288 425.109 186.288ZM433.946 186.288C432.906 186.288 431.962 186.048 431.114 185.568C430.266 185.072 429.594 184.368 429.098 183.456C428.602 182.544 428.354 181.456 428.354 180.192C428.354 178.912 428.602 177.816 429.098 176.904C429.61 175.992 430.266 175.288 431.066 174.792C431.866 174.296 432.706 174.048 433.586 174.048C435.074 174.048 436.218 174.544 437.018 175.536C437.834 176.528 438.242 177.856 438.242 179.52C438.242 179.728 438.234 179.936 438.218 180.144C438.218 180.336 438.202 180.504 438.17 180.648H430.298C430.378 181.88 430.762 182.864 431.45 183.6C432.154 184.336 433.066 184.704 434.186 184.704C434.746 184.704 435.258 184.624 435.722 184.464C436.202 184.288 436.658 184.064 437.09 183.792L437.786 185.088C437.29 185.408 436.722 185.688 436.082 185.928C435.458 186.168 434.746 186.288 433.946 186.288ZM430.274 179.232H436.514C436.514 178.048 436.258 177.152 435.746 176.544C435.25 175.92 434.546 175.608 433.634 175.608C432.818 175.608 432.082 175.928 431.426 176.568C430.786 177.192 430.402 178.08 430.274 179.232ZM441.124 186V174.336H442.756L442.924 176.016H442.996C443.556 175.456 444.148 174.992 444.772 174.624C445.396 174.24 446.108 174.048 446.908 174.048C448.14 174.048 449.036 174.44 449.596 175.224C450.172 175.992 450.46 177.12 450.46 178.608V186H448.492V178.872C448.492 177.784 448.316 176.992 447.964 176.496C447.612 176 447.052 175.752 446.284 175.752C445.692 175.752 445.156 175.904 444.676 176.208C444.212 176.512 443.684 176.96 443.092 177.552V186H441.124ZM458.857 186.288C457.833 186.288 456.905 186.048 456.073 185.568C455.241 185.088 454.585 184.392 454.105 183.48C453.625 182.568 453.385 181.472 453.385 180.192C453.385 178.88 453.641 177.768 454.153 176.856C454.681 175.944 455.369 175.248 456.217 174.768C457.081 174.288 458.009 174.048 459.001 174.048C459.769 174.048 460.425 174.184 460.969 174.456C461.529 174.728 462.009 175.048 462.409 175.416L461.401 176.712C461.065 176.408 460.705 176.16 460.321 175.968C459.953 175.776 459.537 175.68 459.073 175.68C458.369 175.68 457.737 175.872 457.177 176.256C456.633 176.624 456.201 177.152 455.881 177.84C455.577 178.512 455.425 179.296 455.425 180.192C455.425 181.52 455.753 182.6 456.409 183.432C457.081 184.248 457.953 184.656 459.025 184.656C459.569 184.656 460.073 184.544 460.537 184.32C461.001 184.08 461.409 183.8 461.761 183.48L462.625 184.8C462.097 185.264 461.513 185.632 460.873 185.904C460.233 186.16 459.561 186.288 458.857 186.288ZM469.407 186.288C468.367 186.288 467.423 186.048 466.575 185.568C465.727 185.072 465.055 184.368 464.559 183.456C464.063 182.544 463.815 181.456 463.815 180.192C463.815 178.912 464.063 177.816 464.559 176.904C465.071 175.992 465.727 175.288 466.527 174.792C467.327 174.296 468.167 174.048 469.047 174.048C470.535 174.048 471.679 174.544 472.479 175.536C473.295 176.528 473.703 177.856 473.703 179.52C473.703 179.728 473.695 179.936 473.679 180.144C473.679 180.336 473.663 180.504 473.631 180.648H465.759C465.839 181.88 466.223 182.864 466.911 183.6C467.615 184.336 468.527 184.704 469.647 184.704C470.207 184.704 470.719 184.624 471.183 184.464C471.663 184.288 472.119 184.064 472.551 183.792L473.247 185.088C472.751 185.408 472.183 185.688 471.543 185.928C470.919 
186.168 470.207 186.288 469.407 186.288ZM465.735 179.232H471.975C471.975 178.048 471.719 177.152 471.207 176.544C470.711 175.92 470.007 175.608 469.095 175.608C468.279 175.608 467.543 175.928 466.887 176.568C466.247 177.192 465.863 178.08 465.735 179.232ZM484.846 186V171.936H480.094V170.256H491.614V171.936H486.862V186H484.846ZM493.156 186V174.336H494.788L494.956 176.448H495.028C495.428 175.712 495.916 175.128 496.492 174.696C497.068 174.264 497.684 174.048 498.34 174.048C498.804 174.048 499.22 174.128 499.588 174.288L499.204 176.016C499.012 175.952 498.836 175.904 498.676 175.872C498.516 175.84 498.316 175.824 498.076 175.824C497.58 175.824 497.06 176.024 496.516 176.424C495.988 176.824 495.524 177.52 495.124 178.512V186H493.156ZM503.578 186.288C502.602 186.288 501.786 186 501.13 185.424C500.49 184.832 500.17 184.016 500.17 182.976C500.17 181.696 500.738 180.72 501.874 180.048C503.026 179.36 504.842 178.88 507.322 178.608C507.322 178.112 507.25 177.64 507.106 177.192C506.978 176.744 506.738 176.384 506.386 176.112C506.05 175.824 505.562 175.68 504.922 175.68C504.25 175.68 503.618 175.808 503.026 176.064C502.434 176.32 501.906 176.608 501.442 176.928L500.674 175.56C501.218 175.208 501.882 174.872 502.666 174.552C503.466 174.216 504.33 174.048 505.258 174.048C506.682 174.048 507.714 174.488 508.354 175.368C508.994 176.232 509.314 177.392 509.314 178.848V186H507.682L507.514 184.608H507.442C506.898 185.056 506.298 185.448 505.642 185.784C505.002 186.12 504.314 186.288 503.578 186.288ZM504.154 184.704C504.714 184.704 505.242 184.568 505.738 184.296C506.234 184.024 506.762 183.64 507.322 183.144V179.904C505.386 180.144 504.026 180.504 503.242 180.984C502.474 181.464 502.09 182.08 502.09 182.832C502.09 183.488 502.29 183.968 502.69 184.272C503.09 184.56 503.578 184.704 504.154 184.704ZM512.984 186V174.336H514.616L514.784 176.016H514.856C515.416 175.456 516.008 174.992 516.632 174.624C517.256 174.24 517.968 174.048 518.768 174.048C520 174.048 520.896 174.44 521.456 175.224C522.032 175.992 522.32 177.12 522.32 178.608V186H520.352V178.872C520.352 177.784 520.176 176.992 519.824 176.496C519.472 176 518.912 175.752 518.144 175.752C517.552 175.752 517.016 175.904 516.536 176.208C516.072 176.512 515.544 176.96 514.952 177.552V186H512.984ZM529.157 186.288C528.325 186.288 527.533 186.136 526.781 185.832C526.029 185.512 525.373 185.128 524.813 184.68L525.797 183.36C526.309 183.76 526.837 184.096 527.381 184.368C527.925 184.624 528.541 184.752 529.229 184.752C529.997 184.752 530.573 184.576 530.957 184.224C531.341 183.856 531.533 183.424 531.533 182.928C531.533 182.528 531.397 182.192 531.125 181.92C530.869 181.648 530.533 181.424 530.117 181.248C529.717 181.056 529.301 180.88 528.869 180.72C528.325 180.512 527.789 180.28 527.261 180.024C526.733 179.752 526.301 179.408 525.965 178.992C525.629 178.56 525.461 178.016 525.461 177.36C525.461 176.416 525.813 175.632 526.517 175.008C527.237 174.368 528.229 174.048 529.493 174.048C530.213 174.048 530.885 174.176 531.509 174.432C532.133 174.688 532.669 175 533.117 175.368L532.157 176.616C531.757 176.312 531.341 176.064 530.909 175.872C530.477 175.68 530.005 175.584 529.493 175.584C528.757 175.584 528.213 175.752 527.861 176.088C527.525 176.424 527.357 176.816 527.357 177.264C527.357 177.632 527.477 177.936 527.717 178.176C527.957 178.4 528.269 178.6 528.653 178.776C529.037 178.936 529.445 179.104 529.877 179.28C530.437 179.488 530.989 179.728 531.533 180C532.077 180.256 532.525 180.608 532.877 181.056C533.245 181.488 533.429 182.072 533.429 182.808C533.429 183.432 
533.261 184.008 532.925 184.536C532.605 185.064 532.125 185.488 531.485 185.808C530.861 186.128 530.085 186.288 529.157 186.288ZM536.499 186V175.944H534.915V174.456L536.499 174.336V172.488C536.499 171.304 536.771 170.368 537.315 169.68C537.875 168.976 538.739 168.624 539.907 168.624C540.275 168.624 540.627 168.664 540.963 168.744C541.299 168.808 541.595 168.896 541.851 169.008L541.419 170.52C540.987 170.328 540.547 170.232 540.099 170.232C539.011 170.232 538.467 170.984 538.467 172.488V174.336H540.939V175.944H538.467V186H536.499ZM547.473 186.288C546.529 186.288 545.641 186.048 544.809 185.568C543.993 185.088 543.329 184.392 542.817 183.48C542.321 182.568 542.073 181.472 542.073 180.192C542.073 178.88 542.321 177.768 542.817 176.856C543.329 175.944 543.993 175.248 544.809 174.768C545.641 174.288 546.529 174.048 547.473 174.048C548.433 174.048 549.321 174.288 550.137 174.768C550.953 175.248 551.609 175.944 552.105 176.856C552.617 177.768 552.873 178.88 552.873 180.192C552.873 181.472 552.617 182.568 552.105 183.48C551.609 184.392 550.953 185.088 550.137 185.568C549.321 186.048 548.433 186.288 547.473 186.288ZM547.473 184.656C548.481 184.656 549.289 184.248 549.897 183.432C550.521 182.6 550.833 181.52 550.833 180.192C550.833 178.848 550.521 177.76 549.897 176.928C549.289 176.096 548.481 175.68 547.473 175.68C546.481 175.68 545.673 176.096 545.049 176.928C544.425 177.76 544.113 178.848 544.113 180.192C544.113 181.52 544.425 182.6 545.049 183.432C545.673 184.248 546.481 184.656 547.473 184.656ZM555.945 186V174.336H557.577L557.745 176.448H557.817C558.217 175.712 558.705 175.128 559.281 174.696C559.857 174.264 560.473 174.048 561.129 174.048C561.593 174.048 562.009 174.128 562.377 174.288L561.993 176.016C561.801 175.952 561.625 175.904 561.465 175.872C561.305 175.84 561.105 175.824 560.865 175.824C560.369 175.824 559.849 176.024 559.305 176.424C558.777 176.824 558.313 177.52 557.913 178.512V186H555.945ZM564.265 186V174.336H565.897L566.065 176.016H566.137C566.649 175.456 567.209 174.992 567.817 174.624C568.425 174.24 569.073 174.048 569.761 174.048C570.657 174.048 571.353 174.248 571.849 174.648C572.361 175.032 572.737 175.576 572.977 176.28C573.585 175.624 574.201 175.088 574.825 174.672C575.449 174.256 576.113 174.048 576.817 174.048C578.017 174.048 578.905 174.44 579.481 175.224C580.073 175.992 580.369 177.12 580.369 178.608V186H578.401V178.872C578.401 177.784 578.225 176.992 577.873 176.496C577.521 176 576.977 175.752 576.241 175.752C575.377 175.752 574.401 176.352 573.313 177.552V186H571.345V178.872C571.345 177.784 571.169 176.992 570.817 176.496C570.465 176 569.913 175.752 569.161 175.752C568.297 175.752 567.321 176.352 566.233 177.552V186H564.265ZM588.891 186.288C587.851 186.288 586.907 186.048 586.059 185.568C585.211 185.072 584.539 184.368 584.043 183.456C583.547 182.544 583.299 181.456 583.299 180.192C583.299 178.912 583.547 177.816 584.043 176.904C584.555 175.992 585.211 175.288 586.011 174.792C586.811 174.296 587.651 174.048 588.531 174.048C590.019 174.048 591.163 174.544 591.963 175.536C592.779 176.528 593.187 177.856 593.187 179.52C593.187 179.728 593.179 179.936 593.163 180.144C593.163 180.336 593.147 180.504 593.115 180.648H585.243C585.323 181.88 585.707 182.864 586.395 183.6C587.099 184.336 588.011 184.704 589.131 184.704C589.691 184.704 590.203 184.624 590.667 184.464C591.147 184.288 591.603 184.064 592.035 183.792L592.731 185.088C592.235 185.408 591.667 185.688 591.027 185.928C590.403 186.168 589.691 186.288 588.891 186.288ZM585.219 179.232H591.459C591.459 178.048 591.203 
177.152 590.691 176.544C590.195 175.92 589.491 175.608 588.579 175.608C587.763 175.608 587.027 175.928 586.371 176.568C585.731 177.192 585.347 178.08 585.219 179.232ZM596.07 186V174.336H597.702L597.87 176.448H597.942C598.342 175.712 598.83 175.128 599.406 174.696C599.982 174.264 600.598 174.048 601.254 174.048C601.718 174.048 602.134 174.128 602.502 174.288L602.118 176.016C601.926 175.952 601.75 175.904 601.59 175.872C601.43 175.84 601.23 175.824 600.99 175.824C600.494 175.824 599.974 176.024 599.43 176.424C598.902 176.824 598.438 177.52 598.038 178.512V186H596.07ZM607.344 186.288C606.512 186.288 605.72 186.136 604.968 185.832C604.216 185.512 603.56 185.128 603 184.68L603.984 183.36C604.496 183.76 605.024 184.096 605.568 184.368C606.112 184.624 606.728 184.752 607.416 184.752C608.184 184.752 608.76 184.576 609.144 184.224C609.528 183.856 609.72 183.424 609.72 182.928C609.72 182.528 609.584 182.192 609.312 181.92C609.056 181.648 608.72 181.424 608.304 181.248C607.904 181.056 607.488 180.88 607.056 180.72C606.512 180.512 605.976 180.28 605.448 180.024C604.92 179.752 604.488 179.408 604.152 178.992C603.816 178.56 603.648 178.016 603.648 177.36C603.648 176.416 604 175.632 604.704 175.008C605.424 174.368 606.416 174.048 607.68 174.048C608.4 174.048 609.072 174.176 609.696 174.432C610.32 174.688 610.856 175 611.304 175.368L610.344 176.616C609.944 176.312 609.528 176.064 609.096 175.872C608.664 175.68 608.192 175.584 607.68 175.584C606.944 175.584 606.4 175.752 606.048 176.088C605.712 176.424 605.544 176.816 605.544 177.264C605.544 177.632 605.664 177.936 605.904 178.176C606.144 178.4 606.456 178.6 606.84 178.776C607.224 178.936 607.632 179.104 608.064 179.28C608.624 179.488 609.176 179.728 609.72 180C610.264 180.256 610.712 180.608 611.064 181.056C611.432 181.488 611.616 182.072 611.616 182.808C611.616 183.432 611.448 184.008 611.112 184.536C610.792 185.064 610.312 185.488 609.672 185.808C609.048 186.128 608.272 186.288 607.344 186.288ZM408.129 216V200.256H417.201V201.936H410.121V206.88H416.097V208.584H410.121V214.296H417.441V216H408.129ZM420.593 216V204.336H422.225L422.393 206.016H422.465C422.977 205.456 423.537 204.992 424.145 204.624C424.753 204.24 425.401 204.048 426.089 204.048C426.985 204.048 427.681 204.248 428.177 204.648C428.689 205.032 429.065 205.576 429.305 206.28C429.913 205.624 430.529 205.088 431.153 204.672C431.777 204.256 432.441 204.048 433.145 204.048C434.345 204.048 435.233 204.44 435.809 205.224C436.401 205.992 436.697 207.12 436.697 208.608V216H434.729V208.872C434.729 207.784 434.553 206.992 434.201 206.496C433.849 206 433.305 205.752 432.569 205.752C431.705 205.752 430.729 206.352 429.641 207.552V216H427.673V208.872C427.673 207.784 427.497 206.992 427.145 206.496C426.793 206 426.241 205.752 425.489 205.752C424.625 205.752 423.649 206.352 422.561 207.552V216H420.593ZM445.651 216.288C445.107 216.288 444.539 216.16 443.947 215.904C443.371 215.632 442.827 215.264 442.315 214.8H442.243L442.075 216H440.491V198.912H442.459V203.568L442.411 205.68C442.939 205.216 443.515 204.832 444.139 204.528C444.779 204.208 445.419 204.048 446.059 204.048C447.579 204.048 448.731 204.584 449.515 205.656C450.299 206.728 450.691 208.168 450.691 209.976C450.691 211.304 450.451 212.44 449.971 213.384C449.507 214.328 448.891 215.048 448.123 215.544C447.371 216.04 446.547 216.288 445.651 216.288ZM445.315 214.632C446.275 214.632 447.067 214.224 447.691 213.408C448.331 212.576 448.651 211.44 448.651 210C448.651 208.72 448.411 207.688 447.931 206.904C447.467 206.104 446.683 205.704 445.579 205.704C445.083 
205.704 444.579 205.84 444.067 206.112C443.555 206.384 443.019 206.776 442.459 207.288V213.408C442.971 213.856 443.475 214.176 443.971 214.368C444.483 214.544 444.931 214.632 445.315 214.632ZM458.485 216.288C457.445 216.288 456.501 216.048 455.653 215.568C454.805 215.072 454.133 214.368 453.637 213.456C453.141 212.544 452.893 211.456 452.893 210.192C452.893 208.912 453.141 207.816 453.637 206.904C454.149 205.992 454.805 205.288 455.605 204.792C456.405 204.296 457.245 204.048 458.125 204.048C459.613 204.048 460.757 204.544 461.557 205.536C462.373 206.528 462.781 207.856 462.781 209.52C462.781 209.728 462.773 209.936 462.757 210.144C462.757 210.336 462.741 210.504 462.709 210.648H454.837C454.917 211.88 455.301 212.864 455.989 213.6C456.693 214.336 457.605 214.704 458.725 214.704C459.285 214.704 459.797 214.624 460.261 214.464C460.741 214.288 461.197 214.064 461.629 213.792L462.325 215.088C461.829 215.408 461.261 215.688 460.621 215.928C459.997 216.168 459.285 216.288 458.485 216.288ZM454.813 209.232H461.053C461.053 208.048 460.797 207.152 460.285 206.544C459.789 205.92 459.085 205.608 458.173 205.608C457.357 205.608 456.621 205.928 455.965 206.568C455.325 207.192 454.941 208.08 454.813 209.232ZM469.647 216.288C468.191 216.288 467.023 215.76 466.143 214.704C465.263 213.632 464.823 212.128 464.823 210.192C464.823 208.928 465.055 207.84 465.519 206.928C465.999 206 466.623 205.288 467.391 204.792C468.175 204.296 469.007 204.048 469.887 204.048C470.559 204.048 471.143 204.168 471.639 204.408C472.135 204.648 472.639 204.976 473.151 205.392L473.055 203.4V198.912H475.047V216H473.415L473.247 214.632H473.175C472.727 215.08 472.199 215.472 471.591 215.808C470.983 216.128 470.335 216.288 469.647 216.288ZM470.079 214.632C471.103 214.632 472.095 214.096 473.055 213.024V206.928C472.559 206.48 472.079 206.168 471.615 205.992C471.167 205.8 470.703 205.704 470.223 205.704C469.599 205.704 469.031 205.896 468.519 206.28C468.023 206.648 467.623 207.168 467.319 207.84C467.015 208.496 466.863 209.272 466.863 210.168C466.863 211.56 467.143 212.656 467.703 213.456C468.263 214.24 469.055 214.632 470.079 214.632ZM482.96 216.288C481.504 216.288 480.336 215.76 479.456 214.704C478.576 213.632 478.136 212.128 478.136 210.192C478.136 208.928 478.368 207.84 478.832 206.928C479.312 206 479.936 205.288 480.704 204.792C481.488 204.296 482.32 204.048 483.2 204.048C483.872 204.048 484.456 204.168 484.952 204.408C485.448 204.648 485.952 204.976 486.464 205.392L486.368 203.4V198.912H488.36V216H486.728L486.56 214.632H486.488C486.04 215.08 485.512 215.472 484.904 215.808C484.296 216.128 483.648 216.288 482.96 216.288ZM483.392 214.632C484.416 214.632 485.408 214.096 486.368 213.024V206.928C485.872 206.48 485.392 206.168 484.928 205.992C484.48 205.8 484.016 205.704 483.536 205.704C482.912 205.704 482.344 205.896 481.832 206.28C481.336 206.648 480.936 207.168 480.632 207.84C480.328 208.496 480.176 209.272 480.176 210.168C480.176 211.56 480.456 212.656 481.016 213.456C481.576 214.24 482.368 214.632 483.392 214.632ZM492.288 216V204.336H494.256V216H492.288ZM493.296 201.936C492.912 201.936 492.592 201.824 492.336 201.6C492.096 201.36 491.976 201.04 491.976 200.64C491.976 200.256 492.096 199.944 492.336 199.704C492.592 199.464 492.912 199.344 493.296 199.344C493.68 199.344 493.992 199.464 494.232 199.704C494.488 199.944 494.616 200.256 494.616 200.64C494.616 201.04 494.488 201.36 494.232 201.6C493.992 201.824 493.68 201.936 493.296 201.936ZM498.195 216V204.336H499.827L499.995 206.016H500.067C500.627 205.456 501.219 204.992 501.843 
204.624C502.467 204.24 503.179 204.048 503.979 204.048C505.211 204.048 506.107 204.44 506.667 205.224C507.243 205.992 507.531 207.12 507.531 208.608V216H505.563V208.872C505.563 207.784 505.387 206.992 505.035 206.496C504.683 206 504.123 205.752 503.355 205.752C502.763 205.752 502.227 205.904 501.747 206.208C501.283 206.512 500.755 206.96 500.163 207.552V216H498.195ZM515.256 221.376C513.832 221.376 512.672 221.104 511.776 220.56C510.88 220.016 510.432 219.24 510.432 218.232C510.432 217.736 510.584 217.256 510.888 216.792C511.192 216.344 511.608 215.944 512.136 215.592V215.496C511.848 215.32 511.6 215.072 511.392 214.752C511.2 214.432 511.104 214.048 511.104 213.6C511.104 213.104 511.24 212.672 511.512 212.304C511.784 211.936 512.072 211.648 512.376 211.44V211.344C511.992 211.024 511.64 210.592 511.32 210.048C511.016 209.504 510.864 208.888 510.864 208.2C510.864 207.352 511.064 206.616 511.464 205.992C511.864 205.368 512.4 204.888 513.072 204.552C513.744 204.216 514.472 204.048 515.256 204.048C515.576 204.048 515.88 204.08 516.168 204.144C516.456 204.192 516.704 204.256 516.912 204.336H520.968V205.848H518.568C518.84 206.104 519.064 206.448 519.24 206.88C519.432 207.296 519.528 207.752 519.528 208.248C519.528 209.08 519.336 209.8 518.952 210.408C518.568 211.016 518.056 211.488 517.416 211.824C516.776 212.144 516.056 212.304 515.256 212.304C514.632 212.304 514.048 212.168 513.504 211.896C513.296 212.072 513.12 212.272 512.976 212.496C512.832 212.704 512.76 212.968 512.76 213.288C512.76 213.656 512.904 213.96 513.192 214.2C513.496 214.44 514.04 214.56 514.824 214.56H517.08C518.44 214.56 519.456 214.784 520.128 215.232C520.816 215.664 521.16 216.368 521.16 217.344C521.16 218.064 520.92 218.728 520.44 219.336C519.96 219.944 519.28 220.432 518.4 220.8C517.52 221.184 516.472 221.376 515.256 221.376ZM515.256 210.984C515.928 210.984 516.504 210.736 516.984 210.24C517.48 209.728 517.728 209.048 517.728 208.2C517.728 207.352 517.488 206.688 517.008 206.208C516.528 205.728 515.944 205.488 515.256 205.488C514.568 205.488 513.984 205.728 513.504 206.208C513.024 206.688 512.784 207.352 512.784 208.2C512.784 209.048 513.024 209.728 513.504 210.24C514 210.736 514.584 210.984 515.256 210.984ZM515.544 220.008C516.664 220.008 517.56 219.76 518.232 219.264C518.904 218.784 519.24 218.24 519.24 217.632C519.24 217.088 519.032 216.712 518.616 216.504C518.216 216.296 517.64 216.192 516.888 216.192H514.872C514.648 216.192 514.4 216.176 514.128 216.144C513.872 216.112 513.616 216.064 513.36 216C512.944 216.304 512.64 216.624 512.448 216.96C512.256 217.296 512.16 217.632 512.16 217.968C512.16 218.592 512.456 219.088 513.048 219.456C513.656 219.824 514.488 220.008 515.544 220.008ZM528.41 216V200.256H530.81L533.834 208.656C534.026 209.2 534.21 209.752 534.386 210.312C534.578 210.856 534.77 211.4 534.962 211.944H535.058C535.25 211.4 535.426 210.856 535.586 210.312C535.762 209.752 535.946 209.2 536.138 208.656L539.114 200.256H541.538V216H539.666V207.336C539.666 206.632 539.698 205.856 539.762 205.008C539.826 204.144 539.882 203.368 539.93 202.68H539.834L538.586 206.256L535.61 214.416H534.29L531.314 206.256L530.066 202.68H529.97C530.018 203.368 530.066 204.144 530.114 205.008C530.178 205.856 530.21 206.632 530.21 207.336V216H528.41ZM550.192 216.288C549.248 216.288 548.36 216.048 547.528 215.568C546.712 215.088 546.048 214.392 545.536 213.48C545.04 212.568 544.792 211.472 544.792 210.192C544.792 208.88 545.04 207.768 545.536 206.856C546.048 205.944 546.712 205.248 547.528 204.768C548.36 204.288 549.248 204.048 550.192 
204.048C551.152 204.048 552.04 204.288 552.856 204.768C553.672 205.248 554.328 205.944 554.824 206.856C555.336 207.768 555.592 208.88 555.592 210.192C555.592 211.472 555.336 212.568 554.824 213.48C554.328 214.392 553.672 215.088 552.856 215.568C552.04 216.048 551.152 216.288 550.192 216.288ZM550.192 214.656C551.2 214.656 552.008 214.248 552.616 213.432C553.24 212.6 553.552 211.52 553.552 210.192C553.552 208.848 553.24 207.76 552.616 206.928C552.008 206.096 551.2 205.68 550.192 205.68C549.2 205.68 548.392 206.096 547.768 206.928C547.144 207.76 546.832 208.848 546.832 210.192C546.832 211.52 547.144 212.6 547.768 213.432C548.392 214.248 549.2 214.656 550.192 214.656ZM562.647 216.288C561.191 216.288 560.023 215.76 559.143 214.704C558.263 213.632 557.823 212.128 557.823 210.192C557.823 208.928 558.055 207.84 558.519 206.928C558.999 206 559.623 205.288 560.391 204.792C561.175 204.296 562.007 204.048 562.887 204.048C563.559 204.048 564.143 204.168 564.639 204.408C565.135 204.648 565.639 204.976 566.151 205.392L566.055 203.4V198.912H568.047V216H566.415L566.247 214.632H566.175C565.727 215.08 565.199 215.472 564.591 215.808C563.983 216.128 563.335 216.288 562.647 216.288ZM563.079 214.632C564.103 214.632 565.095 214.096 566.055 213.024V206.928C565.559 206.48 565.079 206.168 564.615 205.992C564.167 205.8 563.703 205.704 563.223 205.704C562.599 205.704 562.031 205.896 561.519 206.28C561.023 206.648 560.623 207.168 560.319 207.84C560.015 208.496 559.863 209.272 559.863 210.168C559.863 211.56 560.143 212.656 560.703 213.456C561.263 214.24 562.055 214.632 563.079 214.632ZM576.704 216.288C575.664 216.288 574.72 216.048 573.872 215.568C573.024 215.072 572.352 214.368 571.856 213.456C571.36 212.544 571.112 211.456 571.112 210.192C571.112 208.912 571.36 207.816 571.856 206.904C572.368 205.992 573.024 205.288 573.824 204.792C574.624 204.296 575.464 204.048 576.344 204.048C577.832 204.048 578.976 204.544 579.776 205.536C580.592 206.528 581 207.856 581 209.52C581 209.728 580.992 209.936 580.976 210.144C580.976 210.336 580.96 210.504 580.928 210.648H573.056C573.136 211.88 573.52 212.864 574.208 213.6C574.912 214.336 575.824 214.704 576.944 214.704C577.504 214.704 578.016 214.624 578.48 214.464C578.96 214.288 579.416 214.064 579.848 213.792L580.544 215.088C580.048 215.408 579.48 215.688 578.84 215.928C578.216 216.168 577.504 216.288 576.704 216.288ZM573.032 209.232H579.272C579.272 208.048 579.016 207.152 578.504 206.544C578.008 205.92 577.304 205.608 576.392 205.608C575.576 205.608 574.84 205.928 574.184 206.568C573.544 207.192 573.16 208.08 573.032 209.232ZM585.97 216.288C585.234 216.288 584.698 216.064 584.362 215.616C584.042 215.152 583.882 214.496 583.882 213.648V198.912H585.85V213.792C585.85 214.096 585.906 214.32 586.018 214.464C586.13 214.592 586.258 214.656 586.402 214.656C586.466 214.656 586.522 214.656 586.57 214.656C586.634 214.64 586.722 214.624 586.834 214.608L587.098 216.096C586.97 216.16 586.818 216.208 586.642 216.24C586.466 216.272 586.242 216.288 585.97 216.288Z" fill="#A2A2A2"/> </svg>
2
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/101_train-decision-transformers/file
3
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/87_playlist_generator/embedding-diagram.svg
<svg width="3398" height="1374" viewBox="0 0 3398 1374" fill="none" xmlns="http://www.w3.org/2000/svg"> <rect width="3398" height="1374" rx="20.8913" fill="#FAFAFA"/> <path d="M728 758H967" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <path d="M1391 742H1927" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <path d="M870 758H876.202C879.516 758 882.202 760.686 882.202 764V1157C882.202 1160.31 884.889 1163 888.202 1163H967" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <g filter="url(#filter0_d_103_235)"> <rect x="127" y="297" width="471" height="833" rx="10" fill="white"/> </g> <path d="M168 353C168 347.477 172.477 343 178 343H460.826C466.349 343 470.826 347.477 470.826 353V370.081C470.826 375.604 466.349 380.081 460.826 380.081H178C172.477 380.081 168 375.604 168 370.081V353Z" fill="url(#paint0_linear_103_235)"/> <path d="M168 400C168 394.477 172.477 390 178 390H408C413.523 390 418 394.477 418 400V417C418 422.523 413.523 427 408 427H178C172.477 427 168 422.523 168 417V400Z" fill="url(#paint1_linear_103_235)"/> <path d="M168 447C168 441.477 172.477 437 178 437H428C433.523 437 438 441.477 438 447V464C438 469.523 433.523 474 428 474H178C172.477 474 168 469.523 168 464V447Z" fill="url(#paint2_linear_103_235)"/> <path d="M168 494C168 488.477 172.477 484 178 484H489C494.523 484 499 488.477 499 494V511C499 516.523 494.523 521 489 521H178C172.477 521 168 516.523 168 511V494Z" fill="url(#paint3_linear_103_235)"/> <path d="M168 541C168 535.477 172.477 531 178 531H408C413.523 531 418 535.477 418 541V558C418 563.523 413.523 568 408 568H178C172.477 568 168 563.523 168 558V541Z" fill="url(#paint4_linear_103_235)"/> <path d="M168 614C168 608.477 172.477 604 178 604H289C294.523 604 299 608.477 299 614V631C299 636.523 294.523 641 289 641H178C172.477 641 168 636.523 168 631V614Z" fill="url(#paint5_linear_103_235)"/> <path d="M168 661C168 655.477 172.477 651 178 651H428C433.523 651 438 655.477 438 661V678C438 683.523 433.523 688 428 688H178C172.477 688 168 683.523 168 678V661Z" fill="url(#paint6_linear_103_235)"/> <path d="M168 708C168 702.477 172.477 698 178 698H361C366.523 698 371 702.477 371 708V725C371 730.523 366.523 735 361 735H178C172.477 735 168 730.523 168 725V708Z" fill="url(#paint7_linear_103_235)"/> <path d="M168 755C168 749.477 172.477 745 178 745H361C366.523 745 371 749.477 371 755V772C371 777.523 366.523 782 361 782H178C172.477 782 168 777.523 168 772V755Z" fill="url(#paint8_linear_103_235)"/> <path d="M168 802C168 796.477 172.477 792 178 792H304C309.523 792 314 796.477 314 802V819C314 824.523 309.523 829 304 829H178C172.477 829 168 824.523 168 819V802Z" fill="url(#paint9_linear_103_235)"/> <path d="M168 875C168 869.477 172.477 865 178 865H408C413.523 865 418 869.477 418 875V892C418 897.523 413.523 902 408 902H178C172.477 902 168 897.523 168 892V875Z" fill="url(#paint10_linear_103_235)"/> <path d="M168 923C168 917.477 172.477 913 178 913H447C452.523 913 457 917.477 457 923V940C457 945.523 452.523 950 447 950H178C172.477 950 168 945.523 168 940V923Z" fill="url(#paint11_linear_103_235)"/> <path d="M168 971C168 965.477 172.477 961 178 961H391C396.523 961 401 965.477 401 971V988C401 993.523 396.523 998 391 998H178C172.477 998 168 993.523 168 988V971Z" fill="url(#paint12_linear_103_235)"/> <path d="M168 1019C168 1013.48 172.477 1009 178 1009H428C433.523 1009 438 1013.48 438 1019V1036C438 1041.52 433.523 1046 428 1046H178C172.477 1046 168 1041.52 168 1036V1019Z" fill="url(#paint13_linear_103_235)"/> <path d="M168 1067C168 1061.48 172.477 1057 178 
1057H470C475.523 1057 480 1061.48 480 1067V1084C480 1089.52 475.523 1094 470 1094H178C172.477 1094 168 1089.52 168 1084V1067Z" fill="url(#paint14_linear_103_235)"/> <g filter="url(#filter1_d_103_235)"> <rect x="189" y="360" width="471" height="833" rx="10" fill="white"/> </g> <path d="M230 416C230 410.477 234.477 406 240 406H522.826C528.349 406 532.826 410.477 532.826 416V433.081C532.826 438.604 528.349 443.081 522.826 443.081H240C234.477 443.081 230 438.604 230 433.081V416Z" fill="url(#paint15_linear_103_235)"/> <path d="M230 463C230 457.477 234.477 453 240 453H470C475.523 453 480 457.477 480 463V480C480 485.523 475.523 490 470 490H240C234.477 490 230 485.523 230 480V463Z" fill="url(#paint16_linear_103_235)"/> <path d="M230 510C230 504.477 234.477 500 240 500H490C495.523 500 500 504.477 500 510V527C500 532.523 495.523 537 490 537H240C234.477 537 230 532.523 230 527V510Z" fill="url(#paint17_linear_103_235)"/> <path d="M230 557C230 551.477 234.477 547 240 547H551C556.523 547 561 551.477 561 557V574C561 579.523 556.523 584 551 584H240C234.477 584 230 579.523 230 574V557Z" fill="url(#paint18_linear_103_235)"/> <path d="M230 604C230 598.477 234.477 594 240 594H470C475.523 594 480 598.477 480 604V621C480 626.523 475.523 631 470 631H240C234.477 631 230 626.523 230 621V604Z" fill="url(#paint19_linear_103_235)"/> <path d="M230 677C230 671.477 234.477 667 240 667H351C356.523 667 361 671.477 361 677V694C361 699.523 356.523 704 351 704H240C234.477 704 230 699.523 230 694V677Z" fill="url(#paint20_linear_103_235)"/> <path d="M230 724C230 718.477 234.477 714 240 714H490C495.523 714 500 718.477 500 724V741C500 746.523 495.523 751 490 751H240C234.477 751 230 746.523 230 741V724Z" fill="url(#paint21_linear_103_235)"/> <path d="M230 771C230 765.477 234.477 761 240 761H423C428.523 761 433 765.477 433 771V788C433 793.523 428.523 798 423 798H240C234.477 798 230 793.523 230 788V771Z" fill="url(#paint22_linear_103_235)"/> <path d="M230 818C230 812.477 234.477 808 240 808H423C428.523 808 433 812.477 433 818V835C433 840.523 428.523 845 423 845H240C234.477 845 230 840.523 230 835V818Z" fill="url(#paint23_linear_103_235)"/> <path d="M230 865C230 859.477 234.477 855 240 855H366C371.523 855 376 859.477 376 865V882C376 887.523 371.523 892 366 892H240C234.477 892 230 887.523 230 882V865Z" fill="url(#paint24_linear_103_235)"/> <path d="M230 938C230 932.477 234.477 928 240 928H470C475.523 928 480 932.477 480 938V955C480 960.523 475.523 965 470 965H240C234.477 965 230 960.523 230 955V938Z" fill="url(#paint25_linear_103_235)"/> <path d="M230 986C230 980.477 234.477 976 240 976H509C514.523 976 519 980.477 519 986V1003C519 1008.52 514.523 1013 509 1013H240C234.477 1013 230 1008.52 230 1003V986Z" fill="url(#paint26_linear_103_235)"/> <path d="M230 1034C230 1028.48 234.477 1024 240 1024H453C458.523 1024 463 1028.48 463 1034V1051C463 1056.52 458.523 1061 453 1061H240C234.477 1061 230 1056.52 230 1051V1034Z" fill="url(#paint27_linear_103_235)"/> <path d="M230 1082C230 1076.48 234.477 1072 240 1072H490C495.523 1072 500 1076.48 500 1082V1099C500 1104.52 495.523 1109 490 1109H240C234.477 1109 230 1104.52 230 1099V1082Z" fill="url(#paint28_linear_103_235)"/> <path d="M230 1130C230 1124.48 234.477 1120 240 1120H532C537.523 1120 542 1124.48 542 1130V1147C542 1152.52 537.523 1157 532 1157H240C234.477 1157 230 1152.52 230 1147V1130Z" fill="url(#paint29_linear_103_235)"/> <g filter="url(#filter2_d_103_235)"> <rect x="258" y="423" width="471" height="833" rx="10" fill="white"/> </g> <path d="M299 479C299 473.477 303.477 469 309 
469H591.826C597.349 469 601.826 473.477 601.826 479V496.081C601.826 501.604 597.349 506.081 591.826 506.081H309C303.477 506.081 299 501.604 299 496.081V479Z" fill="url(#paint30_linear_103_235)"/> <path d="M299 526C299 520.477 303.477 516 309 516H539C544.523 516 549 520.477 549 526V543C549 548.523 544.523 553 539 553H309C303.477 553 299 548.523 299 543V526Z" fill="url(#paint31_linear_103_235)"/> <path d="M299 573C299 567.477 303.477 563 309 563H559C564.523 563 569 567.477 569 573V590C569 595.523 564.523 600 559 600H309C303.477 600 299 595.523 299 590V573Z" fill="url(#paint32_linear_103_235)"/> <path d="M299 620C299 614.477 303.477 610 309 610H620C625.523 610 630 614.477 630 620V637C630 642.523 625.523 647 620 647H309C303.477 647 299 642.523 299 637V620Z" fill="url(#paint33_linear_103_235)"/> <path d="M299 667C299 661.477 303.477 657 309 657H539C544.523 657 549 661.477 549 667V684C549 689.523 544.523 694 539 694H309C303.477 694 299 689.523 299 684V667Z" fill="url(#paint34_linear_103_235)"/> <path d="M299 740C299 734.477 303.477 730 309 730H420C425.523 730 430 734.477 430 740V757C430 762.523 425.523 767 420 767H309C303.477 767 299 762.523 299 757V740Z" fill="url(#paint35_linear_103_235)"/> <path d="M299 787C299 781.477 303.477 777 309 777H559C564.523 777 569 781.477 569 787V804C569 809.523 564.523 814 559 814H309C303.477 814 299 809.523 299 804V787Z" fill="url(#paint36_linear_103_235)"/> <path d="M299 834C299 828.477 303.477 824 309 824H492C497.523 824 502 828.477 502 834V851C502 856.523 497.523 861 492 861H309C303.477 861 299 856.523 299 851V834Z" fill="url(#paint37_linear_103_235)"/> <path d="M299 881C299 875.477 303.477 871 309 871H492C497.523 871 502 875.477 502 881V898C502 903.523 497.523 908 492 908H309C303.477 908 299 903.523 299 898V881Z" fill="url(#paint38_linear_103_235)"/> <path d="M299 928C299 922.477 303.477 918 309 918H435C440.523 918 445 922.477 445 928V945C445 950.523 440.523 955 435 955H309C303.477 955 299 950.523 299 945V928Z" fill="url(#paint39_linear_103_235)"/> <path d="M299 1001C299 995.477 303.477 991 309 991H539C544.523 991 549 995.477 549 1001V1018C549 1023.52 544.523 1028 539 1028H309C303.477 1028 299 1023.52 299 1018V1001Z" fill="url(#paint40_linear_103_235)"/> <path d="M299 1049C299 1043.48 303.477 1039 309 1039H578C583.523 1039 588 1043.48 588 1049V1066C588 1071.52 583.523 1076 578 1076H309C303.477 1076 299 1071.52 299 1066V1049Z" fill="url(#paint41_linear_103_235)"/> <path d="M299 1097C299 1091.48 303.477 1087 309 1087H522C527.523 1087 532 1091.48 532 1097V1114C532 1119.52 527.523 1124 522 1124H309C303.477 1124 299 1119.52 299 1114V1097Z" fill="url(#paint42_linear_103_235)"/> <path d="M299 1145C299 1139.48 303.477 1135 309 1135H559C564.523 1135 569 1139.48 569 1145V1162C569 1167.52 564.523 1172 559 1172H309C303.477 1172 299 1167.52 299 1162V1145Z" fill="url(#paint43_linear_103_235)"/> <path d="M299 1193C299 1187.48 303.477 1183 309 1183H601C606.523 1183 611 1187.48 611 1193V1210C611 1215.52 606.523 1220 601 1220H309C303.477 1220 299 1215.52 299 1210V1193Z" fill="url(#paint44_linear_103_235)"/> <g filter="url(#filter3_d_103_235)"> <rect x="995" y="999" width="396" height="317" rx="10" fill="white"/> </g> <path d="M1028 1057C1028 1051.48 1032.48 1047 1038 1047H1268C1273.52 1047 1278 1051.48 1278 1057V1074C1278 1079.52 1273.52 1084 1268 1084H1038C1032.48 1084 1028 1079.52 1028 1074V1057Z" fill="url(#paint45_linear_103_235)"/> <path d="M1028 1105C1028 1099.48 1032.48 1095 1038 1095H1307C1312.52 1095 1317 1099.48 1317 1105V1122C1317 1127.52 1312.52 1132 1307 
1132H1038C1032.48 1132 1028 1127.52 1028 1122V1105Z" fill="url(#paint46_linear_103_235)"/> <path d="M1028 1153C1028 1147.48 1032.48 1143 1038 1143H1251C1256.52 1143 1261 1147.48 1261 1153V1170C1261 1175.52 1256.52 1180 1251 1180H1038C1032.48 1180 1028 1175.52 1028 1170V1153Z" fill="url(#paint47_linear_103_235)"/> <path d="M1028 1201C1028 1195.48 1032.48 1191 1038 1191H1288C1293.52 1191 1298 1195.48 1298 1201V1218C1298 1223.52 1293.52 1228 1288 1228H1038C1032.48 1228 1028 1223.52 1028 1218V1201Z" fill="url(#paint48_linear_103_235)"/> <path d="M1028 1249C1028 1243.48 1032.48 1239 1038 1239H1330C1335.52 1239 1340 1243.48 1340 1249V1266C1340 1271.52 1335.52 1276 1330 1276H1038C1032.48 1276 1028 1271.52 1028 1266V1249Z" fill="url(#paint49_linear_103_235)"/> <g filter="url(#filter4_d_103_235)"> <rect x="995" y="199" width="396" height="317" rx="10" fill="white"/> </g> <path d="M1028 256C1028 250.477 1032.48 246 1038 246H1320.83C1326.35 246 1330.83 250.477 1330.83 256V273.081C1330.83 278.604 1326.35 283.081 1320.83 283.081H1038C1032.48 283.081 1028 278.604 1028 273.081V256Z" fill="url(#paint50_linear_103_235)"/> <path d="M1028 303.081C1028 297.558 1032.48 293.081 1038 293.081H1268C1273.52 293.081 1278 297.558 1278 303.081V320.081C1278 325.604 1273.52 330.081 1268 330.081H1038C1032.48 330.081 1028 325.604 1028 320.081V303.081Z" fill="url(#paint51_linear_103_235)"/> <path d="M1028 350.081C1028 344.558 1032.48 340.081 1038 340.081H1288C1293.52 340.081 1298 344.558 1298 350.081V367.081C1298 372.604 1293.52 377.081 1288 377.081H1038C1032.48 377.081 1028 372.604 1028 367.081V350.081Z" fill="url(#paint52_linear_103_235)"/> <path d="M1028 397.081C1028 391.558 1032.48 387.081 1038 387.081H1349C1354.52 387.081 1359 391.558 1359 397.081V414.081C1359 419.604 1354.52 424.081 1349 424.081H1038C1032.48 424.081 1028 419.604 1028 414.081V397.081Z" fill="url(#paint53_linear_103_235)"/> <path d="M1028 444.081C1028 438.558 1032.48 434.081 1038 434.081H1268C1273.52 434.081 1278 438.558 1278 444.081V461.081C1278 466.604 1273.52 471.081 1268 471.081H1038C1032.48 471.081 1028 466.604 1028 461.081V444.081Z" fill="url(#paint54_linear_103_235)"/> <g filter="url(#filter5_d_103_235)"> <rect x="995" y="599" width="396" height="317" rx="10" fill="white"/> </g> <path d="M1028 655C1028 649.477 1032.48 645 1038 645H1149C1154.52 645 1159 649.477 1159 655V672C1159 677.523 1154.52 682 1149 682H1038C1032.48 682 1028 677.523 1028 672V655Z" fill="url(#paint55_linear_103_235)"/> <path d="M1028 702C1028 696.477 1032.48 692 1038 692H1288C1293.52 692 1298 696.477 1298 702V719C1298 724.523 1293.52 729 1288 729H1038C1032.48 729 1028 724.523 1028 719V702Z" fill="url(#paint56_linear_103_235)"/> <path d="M1028 749C1028 743.477 1032.48 739 1038 739H1221C1226.52 739 1231 743.477 1231 749V766C1231 771.523 1226.52 776 1221 776H1038C1032.48 776 1028 771.523 1028 766V749Z" fill="url(#paint57_linear_103_235)"/> <path d="M1028 796C1028 790.477 1032.48 786 1038 786H1221C1226.52 786 1231 790.477 1231 796V813C1231 818.523 1226.52 823 1221 823H1038C1032.48 823 1028 818.523 1028 813V796Z" fill="url(#paint58_linear_103_235)"/> <path d="M1028 843C1028 837.477 1032.48 833 1038 833H1164C1169.52 833 1174 837.477 1174 843V860C1174 865.523 1169.52 870 1164 870H1038C1032.48 870 1028 865.523 1028 860V843Z" fill="url(#paint59_linear_103_235)"/> <path d="M250.004 109V75.096H271.376V81.544H257.648V89.396H269.4V95.844H257.648V109H250.004ZM283.433 109.624C280.625 109.624 278.579 108.705 277.297 106.868C276.049 104.996 275.425 102.448 275.425 
99.224V83.208H283.069V98.236C283.069 100.073 283.329 101.356 283.849 102.084C284.369 102.777 285.183 103.124 286.293 103.124C287.263 103.124 288.078 102.899 288.737 102.448C289.395 101.997 290.106 101.269 290.869 100.264V83.208H298.513V109H292.273L291.701 105.412H291.545C290.47 106.695 289.291 107.717 288.009 108.48C286.726 109.243 285.201 109.624 283.433 109.624ZM312.259 109.624C309.625 109.624 307.787 108.844 306.747 107.284C305.742 105.689 305.239 103.592 305.239 100.992V72.548H312.883V101.304C312.883 102.101 313.022 102.656 313.299 102.968C313.611 103.28 313.923 103.436 314.235 103.436C314.409 103.436 314.547 103.436 314.651 103.436C314.79 103.401 314.963 103.367 315.171 103.332L316.107 109C315.691 109.173 315.154 109.312 314.495 109.416C313.871 109.555 313.126 109.624 312.259 109.624ZM327.138 109.624C324.504 109.624 322.666 108.844 321.626 107.284C320.621 105.689 320.118 103.592 320.118 100.992V72.548H327.762V101.304C327.762 102.101 327.901 102.656 328.178 102.968C328.49 103.28 328.802 103.436 329.114 103.436C329.288 103.436 329.426 103.436 329.53 103.436C329.669 103.401 329.842 103.367 330.05 103.332L330.986 109C330.57 109.173 330.033 109.312 329.374 109.416C328.75 109.555 328.005 109.624 327.138 109.624ZM356.379 109.624C354.161 109.624 351.942 109.208 349.723 108.376C347.539 107.544 345.581 106.331 343.847 104.736L348.215 99.484C349.429 100.524 350.781 101.373 352.271 102.032C353.762 102.691 355.201 103.02 356.587 103.02C358.182 103.02 359.361 102.725 360.123 102.136C360.921 101.547 361.319 100.749 361.319 99.744C361.319 98.6693 360.869 97.8893 359.967 97.404C359.101 96.884 357.922 96.312 356.431 95.688L352.011 93.816C350.867 93.3307 349.775 92.6893 348.735 91.892C347.695 91.06 346.846 90.0373 346.187 88.824C345.529 87.6107 345.199 86.1893 345.199 84.56C345.199 82.688 345.702 80.9893 346.707 79.464C347.747 77.9387 349.169 76.7253 350.971 75.824C352.809 74.9227 354.906 74.472 357.263 74.472C359.205 74.472 361.146 74.8533 363.087 75.616C365.029 76.3787 366.727 77.488 368.183 78.944L364.283 83.78C363.174 82.9133 362.065 82.2547 360.955 81.804C359.846 81.3187 358.615 81.076 357.263 81.076C355.946 81.076 354.889 81.3533 354.091 81.908C353.329 82.428 352.947 83.1733 352.947 84.144C352.947 85.184 353.433 85.964 354.403 86.484C355.409 87.004 356.639 87.5587 358.095 88.148L362.463 89.916C364.509 90.748 366.138 91.892 367.351 93.348C368.565 94.804 369.171 96.728 369.171 99.12C369.171 100.992 368.669 102.725 367.663 104.32C366.658 105.915 365.202 107.197 363.295 108.168C361.389 109.139 359.083 109.624 356.379 109.624ZM385.378 109.624C383.194 109.624 381.131 109.104 379.19 108.064C377.283 106.989 375.741 105.447 374.562 103.436C373.383 101.391 372.794 98.9467 372.794 96.104C372.794 93.2267 373.383 90.7827 374.562 88.772C375.741 86.7613 377.283 85.236 379.19 84.196C381.131 83.1213 383.194 82.584 385.378 82.584C387.562 82.584 389.607 83.1213 391.514 84.196C393.421 85.236 394.963 86.7613 396.142 88.772C397.321 90.7827 397.91 93.2267 397.91 96.104C397.91 98.9467 397.321 101.391 396.142 103.436C394.963 105.447 393.421 106.989 391.514 108.064C389.607 109.104 387.562 109.624 385.378 109.624ZM385.378 103.436C386.938 103.436 388.117 102.777 388.914 101.46C389.711 100.108 390.11 98.3227 390.11 96.104C390.11 93.8507 389.711 92.0653 388.914 90.748C388.117 89.4307 386.938 88.772 385.378 88.772C383.783 88.772 382.587 89.4307 381.79 90.748C381.027 92.0653 380.646 93.8507 380.646 96.104C380.646 98.3227 381.027 100.108 381.79 101.46C382.587 102.777 383.783 103.436 385.378 103.436ZM403.146 
109V83.208H409.386L409.906 86.484H410.114C411.223 85.444 412.436 84.5427 413.754 83.78C415.106 82.9827 416.648 82.584 418.382 82.584C421.19 82.584 423.218 83.52 424.466 85.392C425.748 87.2293 426.39 89.76 426.39 92.984V109H418.746V93.972C418.746 92.1 418.486 90.8173 417.966 90.124C417.48 89.4307 416.683 89.084 415.574 89.084C414.603 89.084 413.771 89.3093 413.078 89.76C412.384 90.176 411.622 90.7827 410.79 91.58V109H403.146ZM442.107 119.972C440.097 119.972 438.277 119.747 436.647 119.296C435.018 118.845 433.718 118.135 432.747 117.164C431.777 116.193 431.291 114.945 431.291 113.42C431.291 111.34 432.522 109.607 434.983 108.22V108.012C434.325 107.561 433.753 106.989 433.267 106.296C432.817 105.603 432.591 104.719 432.591 103.644C432.591 102.708 432.869 101.807 433.423 100.94C433.978 100.073 434.671 99.3627 435.503 98.808V98.6C434.602 97.976 433.787 97.0747 433.059 95.896C432.366 94.7173 432.019 93.3827 432.019 91.892C432.019 89.812 432.522 88.096 433.527 86.744C434.533 85.3573 435.85 84.3173 437.479 83.624C439.109 82.9307 440.842 82.584 442.679 82.584C444.205 82.584 445.539 82.792 446.683 83.208H456.095V88.772H451.987C452.23 89.1533 452.421 89.6387 452.559 90.228C452.733 90.8173 452.819 91.4587 452.819 92.152C452.819 94.128 452.369 95.7573 451.467 97.04C450.566 98.3227 449.353 99.276 447.827 99.9C446.302 100.524 444.586 100.836 442.679 100.836C441.674 100.836 440.634 100.663 439.559 100.316C438.935 100.836 438.623 101.477 438.623 102.24C438.623 102.899 438.918 103.384 439.507 103.696C440.097 104.008 441.102 104.164 442.523 104.164H446.683C449.873 104.164 452.299 104.684 453.963 105.724C455.662 106.729 456.511 108.393 456.511 110.716C456.511 112.484 455.922 114.061 454.743 115.448C453.565 116.869 451.901 117.979 449.751 118.776C447.602 119.573 445.054 119.972 442.107 119.972ZM442.679 96.208C443.685 96.208 444.517 95.844 445.175 95.116C445.869 94.388 446.215 93.3133 446.215 91.892C446.215 90.54 445.869 89.5173 445.175 88.824C444.517 88.096 443.685 87.732 442.679 87.732C441.674 87.732 440.825 88.0787 440.131 88.772C439.473 89.4653 439.143 90.5053 439.143 91.892C439.143 93.3133 439.473 94.388 440.131 95.116C440.825 95.844 441.674 96.208 442.679 96.208ZM443.303 115.188C445.037 115.188 446.458 114.876 447.567 114.252C448.677 113.628 449.231 112.883 449.231 112.016C449.231 111.219 448.885 110.681 448.191 110.404C447.533 110.127 446.562 109.988 445.279 109.988H442.627C441.761 109.988 441.033 109.953 440.443 109.884C439.889 109.849 439.403 109.78 438.987 109.676C438.051 110.508 437.583 111.357 437.583 112.224C437.583 113.195 438.103 113.923 439.143 114.408C440.218 114.928 441.605 115.188 443.303 115.188ZM471.715 109V75.096H479.359V102.552H492.775V109H471.715ZM498.123 119.088C497.36 119.088 496.684 119.036 496.095 118.932C495.54 118.828 495.003 118.707 494.483 118.568L495.835 112.744C496.077 112.779 496.355 112.831 496.667 112.9C496.979 113.004 497.273 113.056 497.551 113.056C498.833 113.056 499.821 112.744 500.515 112.12C501.208 111.496 501.728 110.681 502.075 109.676L502.439 108.324L492.507 83.208H500.203L503.895 94.284C504.276 95.4627 504.623 96.6587 504.935 97.872C505.247 99.0853 505.576 100.333 505.923 101.616H506.131C506.408 100.403 506.685 99.1893 506.963 97.976C507.275 96.728 507.587 95.4973 507.899 94.284L511.019 83.208H518.351L509.407 109.26C508.609 111.409 507.725 113.212 506.755 114.668C505.784 116.159 504.605 117.268 503.219 117.996C501.867 118.724 500.168 119.088 498.123 119.088ZM522.38 109V83.208H528.62L529.14 87.732H529.348C530.284 85.9987 531.411 84.716 532.728 83.884C534.045 83.0173 
535.363 82.584 536.68 82.584C537.408 82.584 538.015 82.636 538.5 82.74C538.985 82.8093 539.401 82.9307 539.748 83.104L538.5 89.708C538.049 89.5693 537.599 89.4653 537.148 89.396C536.732 89.3267 536.247 89.292 535.692 89.292C534.721 89.292 533.699 89.656 532.624 90.384C531.584 91.0773 530.717 92.2907 530.024 94.024V109H522.38ZM543.099 109V83.208H550.743V109H543.099ZM546.895 79.412C545.612 79.412 544.572 79.048 543.775 78.32C542.977 77.592 542.579 76.6213 542.579 75.408C542.579 74.1947 542.977 73.224 543.775 72.496C544.572 71.768 545.612 71.404 546.895 71.404C548.177 71.404 549.217 71.768 550.015 72.496C550.812 73.224 551.211 74.1947 551.211 75.408C551.211 76.6213 550.812 77.592 550.015 78.32C549.217 79.048 548.177 79.412 546.895 79.412ZM568.702 109.624C566.31 109.624 564.143 109.104 562.202 108.064C560.295 106.989 558.77 105.447 557.626 103.436C556.517 101.391 555.962 98.9467 555.962 96.104C555.962 93.2267 556.586 90.7827 557.834 88.772C559.082 86.7613 560.729 85.236 562.774 84.196C564.819 83.1213 567.021 82.584 569.378 82.584C570.973 82.584 572.377 82.844 573.59 83.364C574.838 83.884 575.947 84.5427 576.918 85.34L573.33 90.28C572.117 89.2747 570.955 88.772 569.846 88.772C568.009 88.772 566.535 89.4307 565.426 90.748C564.351 92.0653 563.814 93.8507 563.814 96.104C563.814 98.3227 564.351 100.108 565.426 101.46C566.535 102.777 567.922 103.436 569.586 103.436C570.418 103.436 571.233 103.263 572.03 102.916C572.827 102.535 573.555 102.084 574.214 101.564L577.23 106.556C575.947 107.665 574.561 108.463 573.07 108.948C571.579 109.399 570.123 109.624 568.702 109.624ZM589.335 109.624C587.637 109.624 585.886 109.295 584.083 108.636C582.315 107.977 580.773 107.111 579.455 106.036L582.887 101.252C584.066 102.119 585.193 102.795 586.267 103.28C587.377 103.731 588.469 103.956 589.543 103.956C590.687 103.956 591.519 103.765 592.039 103.384C592.559 102.968 592.819 102.431 592.819 101.772C592.819 101.183 592.559 100.697 592.039 100.316C591.554 99.9347 590.913 99.588 590.115 99.276C589.318 98.9293 588.469 98.5827 587.567 98.236C586.527 97.82 585.487 97.3 584.447 96.676C583.442 96.052 582.593 95.2547 581.899 94.284C581.206 93.2787 580.859 92.0653 580.859 90.644C580.859 88.252 581.761 86.3107 583.563 84.82C585.401 83.3293 587.793 82.584 590.739 82.584C592.715 82.584 594.449 82.9307 595.939 83.624C597.465 84.2827 598.765 85.028 599.839 85.86L596.407 90.436C595.506 89.7427 594.605 89.2053 593.703 88.824C592.802 88.4427 591.901 88.252 590.999 88.252C589.023 88.252 588.035 88.928 588.035 90.28C588.035 91.112 588.521 91.7533 589.491 92.204C590.497 92.62 591.658 93.0707 592.975 93.556C594.085 93.9373 595.159 94.44 596.199 95.064C597.274 95.6533 598.158 96.4507 598.851 97.456C599.579 98.4267 599.943 99.7093 599.943 101.304C599.943 103.627 599.042 105.603 597.239 107.232C595.437 108.827 592.802 109.624 589.335 109.624Z" fill="#5B5B5B"/> <path d="M409 133H424.5M439.5 133H424.5M424.5 133V178.5" stroke="#7B8B8F" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/> <path d="M1022.35 109.624C1020.13 109.624 1017.91 109.208 1015.7 108.376C1013.51 107.544 1011.55 106.331 1009.82 104.736L1014.19 99.484C1015.4 100.524 1016.75 101.373 1018.24 102.032C1019.73 102.691 1021.17 103.02 1022.56 103.02C1024.15 103.02 1025.33 102.725 1026.1 102.136C1026.89 101.547 1027.29 100.749 1027.29 99.744C1027.29 98.6693 1026.84 97.8893 1025.94 97.404C1025.07 96.884 1023.89 96.312 1022.4 95.688L1017.98 93.816C1016.84 93.3307 1015.75 92.6893 1014.71 91.892C1013.67 91.06 1012.82 90.0373 1012.16 88.824C1011.5 87.6107 1011.17 86.1893 
1011.17 84.56C1011.17 82.688 1011.67 80.9893 1012.68 79.464C1013.72 77.9387 1015.14 76.7253 1016.94 75.824C1018.78 74.9227 1020.88 74.472 1023.24 74.472C1025.18 74.472 1027.12 74.8533 1029.06 75.616C1031 76.3787 1032.7 77.488 1034.16 78.944L1030.26 83.78C1029.15 82.9133 1028.04 82.2547 1026.93 81.804C1025.82 81.3187 1024.59 81.076 1023.24 81.076C1021.92 81.076 1020.86 81.3533 1020.06 81.908C1019.3 82.428 1018.92 83.1733 1018.92 84.144C1018.92 85.184 1019.41 85.964 1020.38 86.484C1021.38 87.004 1022.61 87.5587 1024.07 88.148L1028.44 89.916C1030.48 90.748 1032.11 91.892 1033.32 93.348C1034.54 94.804 1035.14 96.728 1035.14 99.12C1035.14 100.992 1034.64 102.725 1033.64 104.32C1032.63 105.915 1031.17 107.197 1029.27 108.168C1027.36 109.139 1025.06 109.624 1022.35 109.624ZM1040.27 118.568V83.208H1046.51L1047.03 85.756H1047.24C1048.25 84.8547 1049.37 84.1093 1050.62 83.52C1051.91 82.896 1053.22 82.584 1054.57 82.584C1057.69 82.584 1060.16 83.78 1061.96 86.172C1063.76 88.5293 1064.66 91.7013 1064.66 95.688C1064.66 98.6347 1064.14 101.148 1063.1 103.228C1062.06 105.308 1060.71 106.903 1059.05 108.012C1057.42 109.087 1055.67 109.624 1053.79 109.624C1052.69 109.624 1051.61 109.399 1050.57 108.948C1049.53 108.463 1048.56 107.787 1047.66 106.92L1047.92 111.028V118.568H1040.27ZM1052.03 103.384C1053.34 103.384 1054.47 102.777 1055.41 101.564C1056.34 100.351 1056.81 98.4267 1056.81 95.792C1056.81 91.1467 1055.32 88.824 1052.34 88.824C1050.85 88.824 1049.37 89.604 1047.92 91.164V101.72C1048.61 102.344 1049.31 102.777 1050 103.02C1050.69 103.263 1051.37 103.384 1052.03 103.384ZM1077.1 109.624C1074.47 109.624 1072.63 108.844 1071.59 107.284C1070.59 105.689 1070.08 103.592 1070.08 100.992V72.548H1077.73V101.304C1077.73 102.101 1077.87 102.656 1078.14 102.968C1078.46 103.28 1078.77 103.436 1079.08 103.436C1079.25 103.436 1079.39 103.436 1079.5 103.436C1079.63 103.401 1079.81 103.367 1080.02 103.332L1080.95 109C1080.54 109.173 1080 109.312 1079.34 109.416C1078.72 109.555 1077.97 109.624 1077.1 109.624ZM1084.96 109V83.208H1092.61V109H1084.96ZM1088.76 79.412C1087.48 79.412 1086.44 79.048 1085.64 78.32C1084.84 77.592 1084.44 76.6213 1084.44 75.408C1084.44 74.1947 1084.84 73.224 1085.64 72.496C1086.44 71.768 1087.48 71.404 1088.76 71.404C1090.04 71.404 1091.08 71.768 1091.88 72.496C1092.68 73.224 1093.07 74.1947 1093.07 75.408C1093.07 76.6213 1092.68 77.592 1091.88 78.32C1091.08 79.048 1090.04 79.412 1088.76 79.412ZM1109.27 109.624C1106.08 109.624 1103.79 108.705 1102.4 106.868C1101.05 105.031 1100.37 102.621 1100.37 99.64V89.188H1096.84V83.52L1100.79 83.208L1101.67 76.344H1108.02V83.208H1114.21V89.188H1108.02V99.536C1108.02 100.992 1108.31 102.049 1108.9 102.708C1109.53 103.332 1110.34 103.644 1111.35 103.644C1111.76 103.644 1112.18 103.592 1112.59 103.488C1113.04 103.384 1113.44 103.263 1113.79 103.124L1114.99 108.688C1114.33 108.896 1113.51 109.104 1112.54 109.312C1111.61 109.52 1110.51 109.624 1109.27 109.624ZM1130.27 109V75.096H1137.92V109H1130.27ZM1145.29 109V83.208H1151.53L1152.05 86.484H1152.26C1153.37 85.444 1154.58 84.5427 1155.9 83.78C1157.25 82.9827 1158.79 82.584 1160.53 82.584C1163.33 82.584 1165.36 83.52 1166.61 85.392C1167.89 87.2293 1168.53 89.76 1168.53 92.984V109H1160.89V93.972C1160.89 92.1 1160.63 90.8173 1160.11 90.124C1159.62 89.4307 1158.83 89.084 1157.72 89.084C1156.75 89.084 1155.92 89.3093 1155.22 89.76C1154.53 90.176 1153.77 90.7827 1152.93 91.58V109H1145.29ZM1184.98 109.624C1181.79 109.624 1179.5 108.705 1178.12 106.868C1176.76 105.031 1176.09 102.621 1176.09 
99.64V89.188H1172.55V83.52L1176.5 83.208L1177.39 76.344H1183.73V83.208H1189.92V89.188H1183.73V99.536C1183.73 100.992 1184.03 102.049 1184.62 102.708C1185.24 103.332 1186.05 103.644 1187.06 103.644C1187.48 103.644 1187.89 103.592 1188.31 103.488C1188.76 103.384 1189.16 103.263 1189.5 103.124L1190.7 108.688C1190.04 108.896 1189.23 109.104 1188.26 109.312C1187.32 109.52 1186.23 109.624 1184.98 109.624ZM1205.12 109.624C1202.93 109.624 1200.87 109.104 1198.93 108.064C1197.02 106.989 1195.48 105.447 1194.3 103.436C1193.12 101.391 1192.53 98.9467 1192.53 96.104C1192.53 93.2267 1193.12 90.7827 1194.3 88.772C1195.48 86.7613 1197.02 85.236 1198.93 84.196C1200.87 83.1213 1202.93 82.584 1205.12 82.584C1207.3 82.584 1209.35 83.1213 1211.25 84.196C1213.16 85.236 1214.7 86.7613 1215.88 88.772C1217.06 90.7827 1217.65 93.2267 1217.65 96.104C1217.65 98.9467 1217.06 101.391 1215.88 103.436C1214.7 105.447 1213.16 106.989 1211.25 108.064C1209.35 109.104 1207.3 109.624 1205.12 109.624ZM1205.12 103.436C1206.68 103.436 1207.85 102.777 1208.65 101.46C1209.45 100.108 1209.85 98.3227 1209.85 96.104C1209.85 93.8507 1209.45 92.0653 1208.65 90.748C1207.85 89.4307 1206.68 88.772 1205.12 88.772C1203.52 88.772 1202.33 89.4307 1201.53 90.748C1200.77 92.0653 1200.38 93.8507 1200.38 96.104C1200.38 98.3227 1200.77 100.108 1201.53 101.46C1202.33 102.777 1203.52 103.436 1205.12 103.436ZM1239.79 109L1229.55 75.096H1237.66L1241.72 90.904C1242.24 92.7413 1242.69 94.544 1243.07 96.312C1243.49 98.08 1243.95 99.9 1244.47 101.772H1244.68C1245.17 99.9 1245.62 98.08 1246.03 96.312C1246.45 94.544 1246.9 92.7413 1247.39 90.904L1251.39 75.096H1259.19L1249 109H1239.79ZM1272.77 109.624C1270.31 109.624 1268.09 109.087 1266.11 108.012C1264.14 106.937 1262.58 105.395 1261.43 103.384C1260.29 101.373 1259.72 98.9467 1259.72 96.104C1259.72 93.296 1260.29 90.8867 1261.43 88.876C1262.61 86.8653 1264.14 85.3227 1266.01 84.248C1267.88 83.1387 1269.84 82.584 1271.88 82.584C1274.35 82.584 1276.37 83.1387 1277.97 84.248C1279.6 85.3227 1280.81 86.796 1281.61 88.668C1282.44 90.5053 1282.86 92.6027 1282.86 94.96C1282.86 95.6187 1282.82 96.2773 1282.75 96.936C1282.68 97.56 1282.61 98.028 1282.54 98.34H1267.1C1267.45 100.212 1268.23 101.599 1269.44 102.5C1270.65 103.367 1272.11 103.8 1273.81 103.8C1275.65 103.8 1277.5 103.228 1279.37 102.084L1281.92 106.712C1280.6 107.613 1279.13 108.324 1277.5 108.844C1275.87 109.364 1274.29 109.624 1272.77 109.624ZM1267.05 93.296H1276.36C1276.36 91.8747 1276.01 90.7133 1275.32 89.812C1274.66 88.876 1273.57 88.408 1272.04 88.408C1270.86 88.408 1269.8 88.824 1268.87 89.656C1267.93 90.4533 1267.33 91.6667 1267.05 93.296ZM1288.14 109V83.208H1294.38L1294.9 87.732H1295.11C1296.04 85.9987 1297.17 84.716 1298.49 83.884C1299.8 83.0173 1301.12 82.584 1302.44 82.584C1303.17 82.584 1303.77 82.636 1304.26 82.74C1304.74 82.8093 1305.16 82.9307 1305.51 83.104L1304.26 89.708C1303.81 89.5693 1303.36 89.4653 1302.91 89.396C1302.49 89.3267 1302 89.292 1301.45 89.292C1300.48 89.292 1299.46 89.656 1298.38 90.384C1297.34 91.0773 1296.48 92.2907 1295.78 94.024V109H1288.14ZM1316.04 109.624C1314.34 109.624 1312.59 109.295 1310.79 108.636C1309.02 107.977 1307.48 107.111 1306.16 106.036L1309.59 101.252C1310.77 102.119 1311.9 102.795 1312.97 103.28C1314.08 103.731 1315.18 103.956 1316.25 103.956C1317.39 103.956 1318.23 103.765 1318.75 103.384C1319.27 102.968 1319.53 102.431 1319.53 101.772C1319.53 101.183 1319.27 100.697 1318.75 100.316C1318.26 99.9347 1317.62 99.588 1316.82 99.276C1316.02 98.9293 1315.18 98.5827 1314.27 98.236C1313.23 97.82 1312.19 
97.3 1311.15 96.676C1310.15 96.052 1309.3 95.2547 1308.61 94.284C1307.91 93.2787 1307.57 92.0653 1307.57 90.644C1307.57 88.252 1308.47 86.3107 1310.27 84.82C1312.11 83.3293 1314.5 82.584 1317.45 82.584C1319.42 82.584 1321.16 82.9307 1322.65 83.624C1324.17 84.2827 1325.47 85.028 1326.55 85.86L1323.11 90.436C1322.21 89.7427 1321.31 89.2053 1320.41 88.824C1319.51 88.4427 1318.61 88.252 1317.71 88.252C1315.73 88.252 1314.74 88.928 1314.74 90.28C1314.74 91.112 1315.23 91.7533 1316.2 92.204C1317.2 92.62 1318.36 93.0707 1319.68 93.556C1320.79 93.9373 1321.87 94.44 1322.91 95.064C1323.98 95.6533 1324.86 96.4507 1325.56 97.456C1326.29 98.4267 1326.65 99.7093 1326.65 101.304C1326.65 103.627 1325.75 105.603 1323.95 107.232C1322.14 108.827 1319.51 109.624 1316.04 109.624ZM1343.05 109.624C1340.59 109.624 1338.37 109.087 1336.39 108.012C1334.42 106.937 1332.86 105.395 1331.71 103.384C1330.57 101.373 1330 98.9467 1330 96.104C1330 93.296 1330.57 90.8867 1331.71 88.876C1332.89 86.8653 1334.42 85.3227 1336.29 84.248C1338.16 83.1387 1340.12 82.584 1342.17 82.584C1344.63 82.584 1346.65 83.1387 1348.25 84.248C1349.88 85.3227 1351.09 86.796 1351.89 88.668C1352.72 90.5053 1353.14 92.6027 1353.14 94.96C1353.14 95.6187 1353.1 96.2773 1353.03 96.936C1352.96 97.56 1352.89 98.028 1352.83 98.34H1337.38C1337.73 100.212 1338.51 101.599 1339.72 102.5C1340.93 103.367 1342.39 103.8 1344.09 103.8C1345.93 103.8 1347.78 103.228 1349.65 102.084L1352.2 106.712C1350.88 107.613 1349.41 108.324 1347.78 108.844C1346.15 109.364 1344.57 109.624 1343.05 109.624ZM1337.33 93.296H1346.64C1346.64 91.8747 1346.29 90.7133 1345.6 89.812C1344.94 88.876 1343.85 88.408 1342.32 88.408C1341.14 88.408 1340.09 88.824 1339.15 89.656C1338.21 90.4533 1337.61 91.6667 1337.33 93.296ZM1366.01 109.624C1364.31 109.624 1362.56 109.295 1360.76 108.636C1358.99 107.977 1357.45 107.111 1356.13 106.036L1359.56 101.252C1360.74 102.119 1361.87 102.795 1362.94 103.28C1364.05 103.731 1365.14 103.956 1366.22 103.956C1367.36 103.956 1368.2 103.765 1368.72 103.384C1369.24 102.968 1369.5 102.431 1369.5 101.772C1369.5 101.183 1369.24 100.697 1368.72 100.316C1368.23 99.9347 1367.59 99.588 1366.79 99.276C1365.99 98.9293 1365.14 98.5827 1364.24 98.236C1363.2 97.82 1362.16 97.3 1361.12 96.676C1360.12 96.052 1359.27 95.2547 1358.58 94.284C1357.88 93.2787 1357.54 92.0653 1357.54 90.644C1357.54 88.252 1358.44 86.3107 1360.24 84.82C1362.08 83.3293 1364.47 82.584 1367.42 82.584C1369.39 82.584 1371.12 82.9307 1372.62 83.624C1374.14 84.2827 1375.44 85.028 1376.52 85.86L1373.08 90.436C1372.18 89.7427 1371.28 89.2053 1370.38 88.824C1369.48 88.4427 1368.58 88.252 1367.68 88.252C1365.7 88.252 1364.71 88.928 1364.71 90.28C1364.71 91.112 1365.2 91.7533 1366.17 92.204C1367.17 92.62 1368.33 93.0707 1369.65 93.556C1370.76 93.9373 1371.84 94.44 1372.88 95.064C1373.95 95.6533 1374.83 96.4507 1375.53 97.456C1376.26 98.4267 1376.62 99.7093 1376.62 101.304C1376.62 103.627 1375.72 105.603 1373.92 107.232C1372.11 108.827 1369.48 109.624 1366.01 109.624Z" fill="#5B5B5B"/> <path d="M1178 133H1193.5M1208.5 133H1193.5M1193.5 133V178.5" stroke="#7B8B8F" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/> <path d="M2681 409V375.096H2702.22V381.544H2688.65V388.356H2700.19V394.752H2688.65V402.552H2702.74V409H2681ZM2708.87 409V383.208H2715.11L2715.63 386.536H2715.84C2716.91 385.461 2718.05 384.543 2719.27 383.78C2720.48 382.983 2721.94 382.584 2723.64 382.584C2725.47 382.584 2726.95 382.965 2728.06 383.728C2729.2 384.456 2730.1 385.513 2730.76 386.9C2731.9 385.721 2733.12 384.716 2734.4 
383.884C2735.68 383.017 2737.17 382.584 2738.87 382.584C2741.65 382.584 2743.67 383.52 2744.96 385.392C2746.27 387.229 2746.93 389.76 2746.93 392.984V409H2739.29V393.972C2739.29 392.1 2739.03 390.817 2738.51 390.124C2738.02 389.431 2737.23 389.084 2736.12 389.084C2734.83 389.084 2733.36 389.916 2731.7 391.58V409H2724.05V393.972C2724.05 392.1 2723.79 390.817 2723.27 390.124C2722.79 389.431 2721.99 389.084 2720.88 389.084C2719.6 389.084 2718.14 389.916 2716.51 391.58V409H2708.87ZM2766.97 409.624C2765.83 409.624 2764.67 409.347 2763.49 408.792C2762.35 408.203 2761.27 407.353 2760.27 406.244H2760.06L2759.43 409H2753.45V372.548H2761.1V381.492L2760.89 385.444C2761.9 384.543 2762.99 383.849 2764.17 383.364C2765.34 382.844 2766.52 382.584 2767.7 382.584C2769.78 382.584 2771.58 383.121 2773.11 384.196C2774.64 385.271 2775.8 386.796 2776.59 388.772C2777.43 390.713 2777.84 393.001 2777.84 395.636C2777.84 398.583 2777.32 401.113 2776.28 403.228C2775.24 405.308 2773.89 406.903 2772.23 408.012C2770.6 409.087 2768.85 409.624 2766.97 409.624ZM2765.21 403.384C2766.52 403.384 2767.65 402.777 2768.59 401.564C2769.52 400.351 2769.99 398.427 2769.99 395.792C2769.99 391.147 2768.5 388.824 2765.52 388.824C2763.99 388.824 2762.52 389.604 2761.1 391.164V401.72C2761.79 402.344 2762.48 402.777 2763.18 403.02C2763.87 403.263 2764.55 403.384 2765.21 403.384ZM2794.81 409.624C2792.35 409.624 2790.13 409.087 2788.15 408.012C2786.17 406.937 2784.61 405.395 2783.47 403.384C2782.33 401.373 2781.75 398.947 2781.75 396.104C2781.75 393.296 2782.33 390.887 2783.47 388.876C2784.65 386.865 2786.17 385.323 2788.05 384.248C2789.92 383.139 2791.88 382.584 2793.92 382.584C2796.38 382.584 2798.41 383.139 2800.01 384.248C2801.64 385.323 2802.85 386.796 2803.65 388.668C2804.48 390.505 2804.89 392.603 2804.89 394.96C2804.89 395.619 2804.86 396.277 2804.79 396.936C2804.72 397.56 2804.65 398.028 2804.58 398.34H2789.14C2789.49 400.212 2790.27 401.599 2791.48 402.5C2792.69 403.367 2794.15 403.8 2795.85 403.8C2797.68 403.8 2799.54 403.228 2801.41 402.084L2803.96 406.712C2802.64 407.613 2801.17 408.324 2799.54 408.844C2797.91 409.364 2796.33 409.624 2794.81 409.624ZM2789.09 393.296H2798.39C2798.39 391.875 2798.05 390.713 2797.35 389.812C2796.7 388.876 2795.6 388.408 2794.08 388.408C2792.9 388.408 2791.84 388.824 2790.91 389.656C2789.97 390.453 2789.36 391.667 2789.09 393.296ZM2819.38 409.624C2816.19 409.624 2813.63 408.428 2811.68 406.036C2809.78 403.609 2808.82 400.299 2808.82 396.104C2808.82 393.296 2809.33 390.887 2810.33 388.876C2811.37 386.831 2812.71 385.271 2814.34 384.196C2816 383.121 2817.73 382.584 2819.54 382.584C2820.96 382.584 2822.15 382.827 2823.12 383.312C2824.1 383.797 2825.01 384.456 2825.88 385.288L2825.57 381.336V372.548H2833.21V409H2826.97L2826.45 406.452H2826.24C2825.34 407.353 2824.29 408.116 2823.07 408.74C2821.86 409.329 2820.63 409.624 2819.38 409.624ZM2821.36 403.384C2822.19 403.384 2822.93 403.211 2823.59 402.864C2824.29 402.517 2824.94 401.911 2825.57 401.044V390.488C2824.91 389.864 2824.2 389.431 2823.44 389.188C2822.71 388.945 2822 388.824 2821.3 388.824C2820.09 388.824 2819.02 389.413 2818.08 390.592C2817.14 391.736 2816.68 393.539 2816.68 396C2816.68 398.531 2817.08 400.403 2817.87 401.616C2818.7 402.795 2819.87 403.384 2821.36 403.384ZM2849.19 409.624C2846 409.624 2843.43 408.428 2841.49 406.036C2839.59 403.609 2838.63 400.299 2838.63 396.104C2838.63 393.296 2839.14 390.887 2840.14 388.876C2841.18 386.831 2842.52 385.271 2844.15 384.196C2845.81 383.121 2847.54 382.584 2849.35 382.584C2850.77 382.584 2851.96 
382.827 2852.93 383.312C2853.9 383.797 2854.82 384.456 2855.69 385.288L2855.38 381.336V372.548H2863.02V409H2856.78L2856.26 406.452H2856.05C2855.15 407.353 2854.09 408.116 2852.88 408.74C2851.67 409.329 2850.44 409.624 2849.19 409.624ZM2851.17 403.384C2852 403.384 2852.74 403.211 2853.4 402.864C2854.09 402.517 2854.75 401.911 2855.38 401.044V390.488C2854.72 389.864 2854.01 389.431 2853.25 389.188C2852.52 388.945 2851.81 388.824 2851.11 388.824C2849.9 388.824 2848.83 389.413 2847.89 390.592C2846.95 391.736 2846.49 393.539 2846.49 396C2846.49 398.531 2846.88 400.403 2847.68 401.616C2848.51 402.795 2849.67 403.384 2851.17 403.384ZM2869.79 409V383.208H2877.44V409H2869.79ZM2873.59 379.412C2872.31 379.412 2871.27 379.048 2870.47 378.32C2869.67 377.592 2869.27 376.621 2869.27 375.408C2869.27 374.195 2869.67 373.224 2870.47 372.496C2871.27 371.768 2872.31 371.404 2873.59 371.404C2874.87 371.404 2875.91 371.768 2876.71 372.496C2877.51 373.224 2877.91 374.195 2877.91 375.408C2877.91 376.621 2877.51 377.592 2876.71 378.32C2875.91 379.048 2874.87 379.412 2873.59 379.412ZM2884.17 409V383.208H2890.41L2890.93 386.484H2891.13C2892.24 385.444 2893.46 384.543 2894.77 383.78C2896.13 382.983 2897.67 382.584 2899.4 382.584C2902.21 382.584 2904.24 383.52 2905.49 385.392C2906.77 387.229 2907.41 389.76 2907.41 392.984V409H2899.77V393.972C2899.77 392.1 2899.51 390.817 2898.99 390.124C2898.5 389.431 2897.7 389.084 2896.59 389.084C2895.62 389.084 2894.79 389.309 2894.1 389.76C2893.4 390.176 2892.64 390.783 2891.81 391.58V409H2884.17ZM2923.13 419.972C2921.12 419.972 2919.3 419.747 2917.67 419.296C2916.04 418.845 2914.74 418.135 2913.77 417.164C2912.8 416.193 2912.31 414.945 2912.31 413.42C2912.31 411.34 2913.54 409.607 2916 408.22V408.012C2915.34 407.561 2914.77 406.989 2914.29 406.296C2913.84 405.603 2913.61 404.719 2913.61 403.644C2913.61 402.708 2913.89 401.807 2914.44 400.94C2915 400.073 2915.69 399.363 2916.52 398.808V398.6C2915.62 397.976 2914.81 397.075 2914.08 395.896C2913.39 394.717 2913.04 393.383 2913.04 391.892C2913.04 389.812 2913.54 388.096 2914.55 386.744C2915.55 385.357 2916.87 384.317 2918.5 383.624C2920.13 382.931 2921.86 382.584 2923.7 382.584C2925.22 382.584 2926.56 382.792 2927.7 383.208H2937.11V388.772H2933.01C2933.25 389.153 2933.44 389.639 2933.58 390.228C2933.75 390.817 2933.84 391.459 2933.84 392.152C2933.84 394.128 2933.39 395.757 2932.49 397.04C2931.59 398.323 2930.37 399.276 2928.85 399.9C2927.32 400.524 2925.61 400.836 2923.7 400.836C2922.69 400.836 2921.65 400.663 2920.58 400.316C2919.95 400.836 2919.64 401.477 2919.64 402.24C2919.64 402.899 2919.94 403.384 2920.53 403.696C2921.12 404.008 2922.12 404.164 2923.54 404.164H2927.7C2930.89 404.164 2933.32 404.684 2934.98 405.724C2936.68 406.729 2937.53 408.393 2937.53 410.716C2937.53 412.484 2936.94 414.061 2935.76 415.448C2934.58 416.869 2932.92 417.979 2930.77 418.776C2928.62 419.573 2926.07 419.972 2923.13 419.972ZM2923.7 396.208C2924.7 396.208 2925.54 395.844 2926.19 395.116C2926.89 394.388 2927.23 393.313 2927.23 391.892C2927.23 390.54 2926.89 389.517 2926.19 388.824C2925.54 388.096 2924.7 387.732 2923.7 387.732C2922.69 387.732 2921.84 388.079 2921.15 388.772C2920.49 389.465 2920.16 390.505 2920.16 391.892C2920.16 393.313 2920.49 394.388 2921.15 395.116C2921.84 395.844 2922.69 396.208 2923.7 396.208ZM2924.32 415.188C2926.06 415.188 2927.48 414.876 2928.59 414.252C2929.7 413.628 2930.25 412.883 2930.25 412.016C2930.25 411.219 2929.9 410.681 2929.21 410.404C2928.55 410.127 2927.58 409.988 2926.3 409.988H2923.65C2922.78 409.988 2922.05 
409.953 2921.46 409.884C2920.91 409.849 2920.42 409.78 2920.01 409.676C2919.07 410.508 2918.6 411.357 2918.6 412.224C2918.6 413.195 2919.12 413.923 2920.16 414.408C2921.24 414.928 2922.62 415.188 2924.32 415.188ZM2949.29 409.624C2947.59 409.624 2945.84 409.295 2944.04 408.636C2942.27 407.977 2940.73 407.111 2939.41 406.036L2942.84 401.252C2944.02 402.119 2945.15 402.795 2946.22 403.28C2947.33 403.731 2948.43 403.956 2949.5 403.956C2950.64 403.956 2951.48 403.765 2952 403.384C2952.52 402.968 2952.78 402.431 2952.78 401.772C2952.78 401.183 2952.52 400.697 2952 400.316C2951.51 399.935 2950.87 399.588 2950.07 399.276C2949.27 398.929 2948.43 398.583 2947.52 398.236C2946.48 397.82 2945.44 397.3 2944.4 396.676C2943.4 396.052 2942.55 395.255 2941.86 394.284C2941.16 393.279 2940.82 392.065 2940.82 390.644C2940.82 388.252 2941.72 386.311 2943.52 384.82C2945.36 383.329 2947.75 382.584 2950.7 382.584C2952.67 382.584 2954.41 382.931 2955.9 383.624C2957.42 384.283 2958.72 385.028 2959.8 385.86L2956.36 390.436C2955.46 389.743 2954.56 389.205 2953.66 388.824C2952.76 388.443 2951.86 388.252 2950.96 388.252C2948.98 388.252 2947.99 388.928 2947.99 390.28C2947.99 391.112 2948.48 391.753 2949.45 392.204C2950.45 392.62 2951.61 393.071 2952.93 393.556C2954.04 393.937 2955.12 394.44 2956.16 395.064C2957.23 395.653 2958.11 396.451 2958.81 397.456C2959.54 398.427 2959.9 399.709 2959.9 401.304C2959.9 403.627 2959 405.603 2957.2 407.232C2955.39 408.827 2952.76 409.624 2949.29 409.624Z" fill="#5B5B5B"/> <path d="M2805 433H2820.5M2835.5 433H2820.5M2820.5 433V478.5" stroke="#7B8B8F" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/> <g opacity="0.4" filter="url(#filter6_f_103_235)"> <circle cx="2070.16" cy="758.155" r="122.839" transform="rotate(29.1969 2070.16 758.155)" fill="url(#paint60_linear_103_235)"/> </g> <circle cx="2070.31" cy="738.311" r="124" fill="white"/> <path d="M2006.07 725.79C2003.73 725.79 2001.62 725.27 1999.75 724.23C1997.89 723.19 1996.42 721.67 1995.35 719.67C1994.29 717.644 1993.75 715.19 1993.75 712.31C1993.75 709.457 1994.3 707.017 1995.39 704.99C1996.49 702.937 1997.97 701.377 1999.83 700.31C2001.7 699.217 2003.79 698.67 2006.11 698.67C2007.98 698.67 2009.55 699.017 2010.83 699.71C2012.14 700.377 2013.21 701.137 2014.03 701.99L2011.55 704.99C2010.89 704.35 2010.14 703.817 2009.31 703.39C2008.51 702.937 2007.5 702.71 2006.27 702.71C2003.95 702.71 2002.07 703.564 2000.63 705.27C1999.22 706.95 1998.51 709.257 1998.51 712.19C1998.51 715.177 1999.19 717.524 2000.55 719.23C2001.91 720.91 2003.9 721.75 2006.51 721.75C2007.26 721.75 2007.97 721.657 2008.63 721.47C2009.3 721.257 2009.85 720.964 2010.27 720.59V714.95H2005.31V711.15H2014.43V722.67C2013.55 723.55 2012.38 724.297 2010.91 724.91C2009.47 725.497 2007.86 725.79 2006.07 725.79ZM2028.23 725.79C2026.42 725.79 2024.78 725.39 2023.31 724.59C2021.84 723.764 2020.68 722.59 2019.83 721.07C2018.98 719.524 2018.55 717.67 2018.55 715.51C2018.55 713.377 2018.98 711.537 2019.83 709.99C2020.71 708.444 2021.84 707.257 2023.23 706.43C2024.62 705.604 2026.07 705.19 2027.59 705.19C2029.38 705.19 2030.87 705.59 2032.07 706.39C2033.27 707.164 2034.18 708.257 2034.79 709.67C2035.4 711.057 2035.71 712.67 2035.71 714.51C2035.71 715.47 2035.64 716.217 2035.51 716.75H2023.03C2023.24 718.484 2023.87 719.83 2024.91 720.79C2025.95 721.75 2027.26 722.23 2028.83 722.23C2029.68 722.23 2030.47 722.11 2031.19 721.87C2031.94 721.604 2032.67 721.244 2033.39 720.79L2034.95 723.67C2034.02 724.284 2032.98 724.79 2031.83 725.19C2030.68 725.59 2029.48 725.79 
2028.23 725.79ZM2022.99 713.63H2031.71C2031.71 712.11 2031.38 710.924 2030.71 710.07C2030.04 709.19 2029.04 708.75 2027.71 708.75C2026.56 708.75 2025.54 709.177 2024.63 710.03C2023.75 710.857 2023.2 712.057 2022.99 713.63ZM2040.1 725.31V705.67H2043.9L2044.22 708.31H2044.38C2045.26 707.457 2046.22 706.724 2047.26 706.11C2048.3 705.497 2049.49 705.19 2050.82 705.19C2052.93 705.19 2054.46 705.87 2055.42 707.23C2056.38 708.59 2056.86 710.51 2056.86 712.99V725.31H2052.26V713.59C2052.26 711.964 2052.02 710.817 2051.54 710.15C2051.06 709.484 2050.28 709.15 2049.18 709.15C2048.33 709.15 2047.57 709.364 2046.9 709.79C2046.26 710.19 2045.53 710.79 2044.7 711.59V725.31H2040.1ZM2070.89 725.79C2069.07 725.79 2067.43 725.39 2065.97 724.59C2064.5 723.764 2063.34 722.59 2062.49 721.07C2061.63 719.524 2061.21 717.67 2061.21 715.51C2061.21 713.377 2061.63 711.537 2062.49 709.99C2063.37 708.444 2064.5 707.257 2065.89 706.43C2067.27 705.604 2068.73 705.19 2070.25 705.19C2072.03 705.19 2073.53 705.59 2074.73 706.39C2075.93 707.164 2076.83 708.257 2077.45 709.67C2078.06 711.057 2078.37 712.67 2078.37 714.51C2078.37 715.47 2078.3 716.217 2078.17 716.75H2065.69C2065.9 718.484 2066.53 719.83 2067.57 720.79C2068.61 721.75 2069.91 722.23 2071.49 722.23C2072.34 722.23 2073.13 722.11 2073.85 721.87C2074.59 721.604 2075.33 721.244 2076.05 720.79L2077.61 723.67C2076.67 724.284 2075.63 724.79 2074.49 725.19C2073.34 725.59 2072.14 725.79 2070.89 725.79ZM2065.65 713.63H2074.37C2074.37 712.11 2074.03 710.924 2073.37 710.07C2072.7 709.19 2071.7 708.75 2070.37 708.75C2069.22 708.75 2068.19 709.177 2067.29 710.03C2066.41 710.857 2065.86 712.057 2065.65 713.63ZM2082.76 725.31V705.67H2086.56L2086.88 709.15H2087.04C2087.73 707.87 2088.57 706.897 2089.56 706.23C2090.55 705.537 2091.56 705.19 2092.6 705.19C2093.53 705.19 2094.28 705.324 2094.84 705.59L2094.04 709.59C2093.69 709.484 2093.37 709.404 2093.08 709.35C2092.79 709.297 2092.43 709.27 2092 709.27C2091.23 709.27 2090.41 709.577 2089.56 710.19C2088.71 710.777 2087.97 711.817 2087.36 713.31V725.31H2082.76ZM2101.39 725.79C2099.68 725.79 2098.28 725.257 2097.19 724.19C2096.12 723.124 2095.59 721.737 2095.59 720.03C2095.59 717.924 2096.51 716.297 2098.35 715.15C2100.19 713.977 2103.12 713.177 2107.15 712.75C2107.12 711.71 2106.84 710.817 2106.31 710.07C2105.8 709.297 2104.88 708.91 2103.55 708.91C2102.59 708.91 2101.64 709.097 2100.71 709.47C2099.8 709.844 2098.91 710.297 2098.03 710.83L2096.35 707.75C2097.44 707.057 2098.67 706.457 2100.03 705.95C2101.41 705.444 2102.88 705.19 2104.43 705.19C2106.88 705.19 2108.71 705.924 2109.91 707.39C2111.13 708.83 2111.75 710.924 2111.75 713.67V725.31H2107.95L2107.63 723.15H2107.47C2106.59 723.897 2105.64 724.524 2104.63 725.03C2103.64 725.537 2102.56 725.79 2101.39 725.79ZM2102.87 722.19C2103.67 722.19 2104.39 722.004 2105.03 721.63C2105.69 721.23 2106.4 720.697 2107.15 720.03V715.63C2104.48 715.977 2102.63 716.497 2101.59 717.19C2100.55 717.857 2100.03 718.684 2100.03 719.67C2100.03 720.55 2100.29 721.19 2100.83 721.59C2101.36 721.99 2102.04 722.19 2102.87 722.19ZM2123.55 725.79C2121.28 725.79 2119.67 725.137 2118.71 723.83C2117.77 722.497 2117.31 720.777 2117.31 718.67V709.31H2114.51V705.87L2117.55 705.67L2118.11 700.31H2121.95V705.67H2126.95V709.31H2121.95V718.67C2121.95 720.964 2122.87 722.11 2124.71 722.11C2125.05 722.11 2125.41 722.07 2125.79 721.99C2126.16 721.884 2126.48 721.777 2126.75 721.67L2127.55 725.07C2127.01 725.257 2126.4 725.417 2125.71 725.55C2125.04 725.71 2124.32 725.79 2123.55 725.79ZM2138.86 725.79C2137.04 725.79 
2135.4 725.39 2133.94 724.59C2132.47 723.764 2131.31 722.59 2130.46 721.07C2129.6 719.524 2129.18 717.67 2129.18 715.51C2129.18 713.377 2129.6 711.537 2130.46 709.99C2131.34 708.444 2132.47 707.257 2133.86 706.43C2135.24 705.604 2136.7 705.19 2138.22 705.19C2140 705.19 2141.5 705.59 2142.7 706.39C2143.9 707.164 2144.8 708.257 2145.42 709.67C2146.03 711.057 2146.34 712.67 2146.34 714.51C2146.34 715.47 2146.27 716.217 2146.14 716.75H2133.66C2133.87 718.484 2134.5 719.83 2135.54 720.79C2136.58 721.75 2137.88 722.23 2139.46 722.23C2140.31 722.23 2141.1 722.11 2141.82 721.87C2142.56 721.604 2143.3 721.244 2144.02 720.79L2145.58 723.67C2144.64 724.284 2143.6 724.79 2142.46 725.19C2141.31 725.59 2140.11 725.79 2138.86 725.79ZM2133.62 713.63H2142.34C2142.34 712.11 2142 710.924 2141.34 710.07C2140.67 709.19 2139.67 708.75 2138.34 708.75C2137.19 708.75 2136.16 709.177 2135.26 710.03C2134.38 710.857 2133.83 712.057 2133.62 713.63ZM1966.09 775.31V749.15H1981.85V753.07H1970.73V759.75H1980.13V763.67H1970.73V771.39H1982.25V775.31H1966.09ZM1987.21 775.31V755.67H1991.01L1991.33 758.35H1991.49C1992.32 757.47 1993.21 756.724 1994.17 756.11C1995.16 755.497 1996.27 755.19 1997.49 755.19C1998.93 755.19 2000.08 755.51 2000.93 756.15C2001.81 756.764 2002.48 757.617 2002.93 758.71C2003.87 757.697 2004.84 756.857 2005.85 756.19C2006.89 755.524 2008.03 755.19 2009.25 755.19C2011.33 755.19 2012.87 755.87 2013.85 757.23C2014.84 758.59 2015.33 760.51 2015.33 762.99V775.31H2010.69V763.59C2010.69 761.964 2010.44 760.817 2009.93 760.15C2009.45 759.484 2008.71 759.15 2007.69 759.15C2006.47 759.15 2005.09 759.964 2003.57 761.59V775.31H1998.97V763.59C1998.97 761.964 1998.72 760.817 1998.21 760.15C1997.73 759.484 1996.97 759.15 1995.93 759.15C1994.71 759.15 1993.33 759.964 1991.81 761.59V775.31H1987.21ZM2030.45 775.79C2028.58 775.79 2026.79 774.937 2025.09 773.23H2024.97L2024.57 775.31H2020.93V747.07H2025.53V754.35L2025.41 757.63C2026.23 756.91 2027.14 756.324 2028.13 755.87C2029.11 755.417 2030.1 755.19 2031.09 755.19C2033.54 755.19 2035.45 756.097 2036.81 757.91C2038.17 759.724 2038.85 762.137 2038.85 765.15C2038.85 767.39 2038.45 769.31 2037.65 770.91C2036.85 772.484 2035.81 773.697 2034.53 774.55C2033.27 775.377 2031.91 775.79 2030.45 775.79ZM2029.49 771.99C2030.79 771.99 2031.89 771.417 2032.77 770.27C2033.65 769.124 2034.09 767.444 2034.09 765.23C2034.09 763.257 2033.75 761.724 2033.09 760.63C2032.42 759.537 2031.34 758.99 2029.85 758.99C2028.46 758.99 2027.02 759.724 2025.53 761.19V770.35C2026.22 770.937 2026.9 771.364 2027.57 771.63C2028.26 771.87 2028.9 771.99 2029.49 771.99ZM2051.86 775.79C2050.05 775.79 2048.41 775.39 2046.94 774.59C2045.48 773.764 2044.32 772.59 2043.46 771.07C2042.61 769.524 2042.18 767.67 2042.18 765.51C2042.18 763.377 2042.61 761.537 2043.46 759.99C2044.34 758.444 2045.48 757.257 2046.86 756.43C2048.25 755.604 2049.7 755.19 2051.22 755.19C2053.01 755.19 2054.5 755.59 2055.7 756.39C2056.9 757.164 2057.81 758.257 2058.42 759.67C2059.04 761.057 2059.34 762.67 2059.34 764.51C2059.34 765.47 2059.28 766.217 2059.14 766.75H2046.66C2046.88 768.484 2047.5 769.83 2048.54 770.79C2049.58 771.75 2050.89 772.23 2052.46 772.23C2053.32 772.23 2054.1 772.11 2054.82 771.87C2055.57 771.604 2056.3 771.244 2057.02 770.79L2058.58 773.67C2057.65 774.284 2056.61 774.79 2055.46 775.19C2054.32 775.59 2053.12 775.79 2051.86 775.79ZM2046.62 763.63H2055.34C2055.34 762.11 2055.01 760.924 2054.34 760.07C2053.68 759.19 2052.68 758.75 2051.34 758.75C2050.2 758.75 2049.17 759.177 2048.26 760.03C2047.38 760.857 2046.84 762.057 
2046.62 763.63ZM2070.62 775.79C2068.16 775.79 2066.2 774.884 2064.74 773.07C2063.27 771.257 2062.54 768.737 2062.54 765.51C2062.54 763.377 2062.92 761.537 2063.7 759.99C2064.5 758.444 2065.54 757.257 2066.82 756.43C2068.1 755.604 2069.44 755.19 2070.86 755.19C2071.98 755.19 2072.92 755.39 2073.7 755.79C2074.5 756.164 2075.27 756.684 2076.02 757.35L2075.86 754.19V747.07H2080.46V775.31H2076.66L2076.34 773.19H2076.18C2075.46 773.91 2074.62 774.524 2073.66 775.03C2072.7 775.537 2071.68 775.79 2070.62 775.79ZM2071.74 771.99C2073.2 771.99 2074.58 771.257 2075.86 769.79V760.63C2075.19 760.017 2074.52 759.59 2073.86 759.35C2073.19 759.11 2072.52 758.99 2071.86 758.99C2070.6 758.99 2069.52 759.564 2068.62 760.71C2067.74 761.83 2067.3 763.417 2067.3 765.47C2067.3 767.577 2067.68 769.19 2068.46 770.31C2069.23 771.43 2070.32 771.99 2071.74 771.99ZM2093.2 775.79C2090.74 775.79 2088.78 774.884 2087.32 773.07C2085.85 771.257 2085.12 768.737 2085.12 765.51C2085.12 763.377 2085.5 761.537 2086.28 759.99C2087.08 758.444 2088.12 757.257 2089.4 756.43C2090.68 755.604 2092.02 755.19 2093.44 755.19C2094.56 755.19 2095.5 755.39 2096.28 755.79C2097.08 756.164 2097.85 756.684 2098.6 757.35L2098.44 754.19V747.07H2103.04V775.31H2099.24L2098.92 773.19H2098.76C2098.04 773.91 2097.2 774.524 2096.24 775.03C2095.28 775.537 2094.26 775.79 2093.2 775.79ZM2094.32 771.99C2095.78 771.99 2097.16 771.257 2098.44 769.79V760.63C2097.77 760.017 2097.1 759.59 2096.44 759.35C2095.77 759.11 2095.1 758.99 2094.44 758.99C2093.18 758.99 2092.1 759.564 2091.2 760.71C2090.32 761.83 2089.88 763.417 2089.88 765.47C2089.88 767.577 2090.26 769.19 2091.04 770.31C2091.81 771.43 2092.9 771.99 2094.32 771.99ZM2108.89 775.31V755.67H2113.49V775.31H2108.89ZM2111.21 752.23C2110.39 752.23 2109.72 751.99 2109.21 751.51C2108.71 751.03 2108.45 750.39 2108.45 749.59C2108.45 748.817 2108.71 748.19 2109.21 747.71C2109.72 747.23 2110.39 746.99 2111.21 746.99C2112.04 746.99 2112.71 747.23 2113.21 747.71C2113.72 748.19 2113.97 748.817 2113.97 749.59C2113.97 750.39 2113.72 751.03 2113.21 751.51C2112.71 751.99 2112.04 752.23 2111.21 752.23ZM2119.36 775.31V755.67H2123.16L2123.48 758.31H2123.64C2124.52 757.457 2125.48 756.724 2126.52 756.11C2127.56 755.497 2128.75 755.19 2130.08 755.19C2132.19 755.19 2133.72 755.87 2134.68 757.23C2135.64 758.59 2136.12 760.51 2136.12 762.99V775.31H2131.52V763.59C2131.52 761.964 2131.28 760.817 2130.8 760.15C2130.32 759.484 2129.54 759.15 2128.44 759.15C2127.59 759.15 2126.83 759.364 2126.16 759.79C2125.52 760.19 2124.79 760.79 2123.96 761.59V775.31H2119.36ZM2148.59 783.99C2147.04 783.99 2145.64 783.804 2144.39 783.43C2143.16 783.057 2142.19 782.484 2141.47 781.71C2140.77 780.964 2140.43 780.017 2140.43 778.87C2140.43 777.244 2141.36 775.844 2143.23 774.67V774.51C2142.72 774.19 2142.29 773.764 2141.95 773.23C2141.63 772.697 2141.47 772.03 2141.47 771.23C2141.47 770.457 2141.68 769.764 2142.11 769.15C2142.56 768.51 2143.07 767.99 2143.63 767.59V767.43C2142.96 766.924 2142.36 766.217 2141.83 765.31C2141.32 764.404 2141.07 763.377 2141.07 762.23C2141.07 760.71 2141.43 759.43 2142.15 758.39C2142.87 757.35 2143.81 756.564 2144.99 756.03C2146.19 755.47 2147.47 755.19 2148.83 755.19C2149.36 755.19 2149.87 755.244 2150.35 755.35C2150.85 755.43 2151.31 755.537 2151.71 755.67H2158.75V759.07H2155.15C2155.47 759.444 2155.73 759.924 2155.95 760.51C2156.16 761.07 2156.27 761.697 2156.27 762.39C2156.27 763.83 2155.93 765.057 2155.27 766.07C2154.6 767.057 2153.71 767.804 2152.59 768.31C2151.47 768.817 2150.21 769.07 2148.83 769.07C2148.4 769.07 
2147.96 769.03 2147.51 768.95C2147.05 768.87 2146.6 768.737 2146.15 768.55C2145.85 768.817 2145.61 769.084 2145.43 769.35C2145.27 769.617 2145.19 769.977 2145.19 770.43C2145.19 770.99 2145.41 771.43 2145.87 771.75C2146.35 772.07 2147.19 772.23 2148.39 772.23H2151.87C2154.24 772.23 2156.03 772.617 2157.23 773.39C2158.45 774.137 2159.07 775.364 2159.07 777.07C2159.07 778.35 2158.64 779.51 2157.79 780.55C2156.93 781.617 2155.72 782.457 2154.15 783.07C2152.57 783.684 2150.72 783.99 2148.59 783.99ZM2148.83 766.19C2149.76 766.19 2150.56 765.844 2151.23 765.15C2151.89 764.457 2152.23 763.484 2152.23 762.23C2152.23 761.004 2151.89 760.057 2151.23 759.39C2150.59 758.697 2149.79 758.35 2148.83 758.35C2147.87 758.35 2147.05 758.684 2146.39 759.35C2145.72 760.017 2145.39 760.977 2145.39 762.23C2145.39 763.484 2145.72 764.457 2146.39 765.15C2147.05 765.844 2147.87 766.19 2148.83 766.19ZM2149.31 780.99C2150.88 780.99 2152.16 780.67 2153.15 780.03C2154.13 779.39 2154.63 778.657 2154.63 777.83C2154.63 777.057 2154.32 776.537 2153.71 776.27C2153.12 776.004 2152.27 775.87 2151.15 775.87H2148.47C2147.4 775.87 2146.51 775.777 2145.79 775.59C2144.77 776.364 2144.27 777.23 2144.27 778.19C2144.27 779.07 2144.72 779.75 2145.63 780.23C2146.53 780.737 2147.76 780.99 2149.31 780.99ZM2168.01 775.79C2166.67 775.79 2165.34 775.537 2164.01 775.03C2162.67 774.497 2161.53 773.844 2160.57 773.07L2162.73 770.11C2163.61 770.777 2164.49 771.31 2165.37 771.71C2166.25 772.11 2167.18 772.31 2168.17 772.31C2169.23 772.31 2170.02 772.084 2170.53 771.63C2171.03 771.177 2171.29 770.617 2171.29 769.95C2171.29 769.39 2171.07 768.937 2170.65 768.59C2170.25 768.217 2169.73 767.897 2169.09 767.63C2168.45 767.337 2167.78 767.057 2167.09 766.79C2166.23 766.47 2165.38 766.084 2164.53 765.63C2163.7 765.15 2163.02 764.55 2162.49 763.83C2161.95 763.084 2161.69 762.164 2161.69 761.07C2161.69 759.337 2162.33 757.924 2163.61 756.83C2164.89 755.737 2166.62 755.19 2168.81 755.19C2170.19 755.19 2171.43 755.43 2172.53 755.91C2173.62 756.39 2174.57 756.937 2175.37 757.55L2173.25 760.35C2172.55 759.844 2171.85 759.444 2171.13 759.15C2170.43 758.83 2169.7 758.67 2168.93 758.67C2167.94 758.67 2167.21 758.884 2166.73 759.31C2166.25 759.71 2166.01 760.217 2166.01 760.83C2166.01 761.604 2166.41 762.19 2167.21 762.59C2168.01 762.99 2168.94 763.377 2170.01 763.75C2170.91 764.07 2171.79 764.47 2172.65 764.95C2173.5 765.404 2174.21 766.004 2174.77 766.75C2175.35 767.497 2175.65 768.484 2175.65 769.71C2175.65 771.39 2174.99 772.83 2173.69 774.03C2172.38 775.204 2170.49 775.79 2168.01 775.79Z" fill="#FF9042"/> <path d="M1940.62 774.743C1941.21 774.907 1941.21 775.753 1940.62 775.917L1927.77 779.448C1927.38 779.555 1927 779.263 1927 778.862L1927 771.799C1927 771.397 1927.38 771.105 1927.77 771.212L1940.62 774.743Z" fill="#7B8B8F"/> <path d="M1938.87 707.413C1939.46 707.577 1939.46 708.423 1938.87 708.587L1926.02 712.118C1925.63 712.225 1925.25 711.933 1925.25 711.531L1925.25 704.469C1925.25 704.067 1925.63 703.775 1926.02 703.882L1938.87 707.413Z" fill="#7B8B8F"/> <path d="M1938.87 741.413C1939.46 741.577 1939.46 742.423 1938.87 742.587L1926.02 746.118C1925.63 746.225 1925.25 745.933 1925.25 745.531L1925.25 738.469C1925.25 738.067 1925.63 737.775 1926.02 737.882L1938.87 741.413Z" fill="#7B8B8F"/> <path d="M870 758H876.202C879.516 758 882.202 755.314 882.202 752V365C882.202 361.686 884.889 359 888.202 359H967" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <path d="M1391 362H1613.4C1616.72 362 1619.4 364.686 1619.4 368V702C1619.4 705.314 1622.09 708 
1625.4 708H1927" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <path d="M1391 1163H1612.48C1615.79 1163 1618.48 1160.31 1618.48 1157V781C1618.48 777.686 1621.17 775 1624.48 775H1772.74H1927" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <g filter="url(#filter7_d_103_235)"> <path d="M2491 683C2491 677.477 2495.48 673 2501 673H3139C3144.52 673 3149 677.477 3149 683V771C3149 776.523 3144.52 781 3139 781H2501C2495.48 781 2491 776.523 2491 771V683Z" fill="white"/> </g> <path d="M2519.9 754.176V707.48H2534.46V710.704H2523.49V750.952H2534.46V754.176H2519.9ZM2543.8 747V743.204H2554.2V713.928H2553.84L2544.69 722.456L2542.14 719.7L2551.76 710.704H2558.57V743.204H2568.14V747H2543.8ZM2586.16 747.468C2584.74 747.468 2583.74 747.173 2583.15 746.584C2582.59 745.995 2582.31 745.249 2582.31 744.348V743.412C2582.31 742.511 2582.59 741.765 2583.15 741.176C2583.74 740.587 2584.74 740.292 2586.16 740.292C2587.58 740.292 2588.57 740.587 2589.13 741.176C2589.72 741.765 2590.01 742.511 2590.01 743.412V744.348C2590.01 745.249 2589.72 745.995 2589.13 746.584C2588.57 747.173 2587.58 747.468 2586.16 747.468ZM2627.79 714.5H2611.57L2610.53 728.384H2610.89C2611.8 727.171 2612.8 726.217 2613.91 725.524C2615.02 724.796 2616.54 724.432 2618.49 724.432C2620.08 724.432 2621.54 724.692 2622.85 725.212C2624.21 725.697 2625.37 726.425 2626.34 727.396C2627.31 728.367 2628.07 729.545 2628.63 730.932C2629.18 732.284 2629.46 733.844 2629.46 735.612C2629.46 737.38 2629.18 739.009 2628.63 740.5C2628.07 741.956 2627.26 743.221 2626.18 744.296C2625.11 745.336 2623.81 746.151 2622.28 746.74C2620.76 747.329 2619.02 747.624 2617.08 747.624C2615.56 747.624 2614.19 747.451 2612.97 747.104C2611.8 746.757 2610.74 746.289 2609.8 745.7C2608.9 745.111 2608.1 744.452 2607.41 743.724C2606.72 742.996 2606.09 742.233 2605.54 741.436L2608.81 738.94C2609.3 739.668 2609.78 740.327 2610.27 740.916C2610.79 741.505 2611.38 742.025 2612.04 742.476C2612.7 742.892 2613.44 743.221 2614.27 743.464C2615.11 743.707 2616.08 743.828 2617.19 743.828C2619.68 743.828 2621.59 743.152 2622.91 741.8C2624.26 740.448 2624.93 738.576 2624.93 736.184V735.768C2624.93 733.376 2624.26 731.504 2622.91 730.152C2621.55 728.8 2619.61 728.124 2617.08 728.124C2615.35 728.124 2614 728.419 2613.03 729.008C2612.09 729.597 2611.22 730.325 2610.43 731.192L2606.73 730.672L2608.03 710.704H2627.79V714.5ZM2646.29 739.928H2652.99L2647.01 754.332H2643.53L2646.29 739.928ZM2710.93 747.624C2709.03 747.624 2707.31 747.312 2705.79 746.688C2704.26 746.029 2702.96 745.111 2701.89 743.932C2700.85 742.719 2700.03 741.245 2699.44 739.512C2698.89 737.779 2698.61 735.837 2698.61 733.688C2698.61 730.984 2699.01 728.453 2699.81 726.096C2700.64 723.704 2701.68 721.537 2702.93 719.596C2704.21 717.62 2705.63 715.887 2707.19 714.396C2708.75 712.871 2710.27 711.64 2711.77 710.704H2717.38C2715.41 712.125 2713.64 713.512 2712.08 714.864C2710.55 716.216 2709.2 717.637 2708.02 719.128C2706.84 720.584 2705.85 722.161 2705.06 723.86C2704.26 725.524 2703.6 727.413 2703.08 729.528L2703.34 729.632C2704.21 728.141 2705.35 726.911 2706.77 725.94C2708.19 724.935 2710.03 724.432 2712.29 724.432C2713.88 724.432 2715.34 724.692 2716.65 725.212C2717.97 725.732 2719.11 726.477 2720.09 727.448C2721.06 728.419 2721.8 729.597 2722.32 730.984C2722.88 732.371 2723.15 733.913 2723.15 735.612C2723.15 737.38 2722.86 739.009 2722.27 740.5C2721.68 741.956 2720.85 743.221 2719.77 744.296C2718.73 745.336 2717.45 746.151 2715.93 746.74C2714.43 747.329 2712.77 747.624 2710.93 747.624ZM2710.88 743.932C2713.38 
743.932 2715.3 743.256 2716.65 741.904C2718.04 740.552 2718.73 738.645 2718.73 736.184V735.768C2718.73 733.307 2718.04 731.4 2716.65 730.048C2715.3 728.696 2713.38 728.02 2710.88 728.02C2708.39 728.02 2706.44 728.696 2705.06 730.048C2703.71 731.4 2703.03 733.307 2703.03 735.768V736.184C2703.03 738.645 2703.71 740.552 2705.06 741.904C2706.44 743.256 2708.39 743.932 2710.88 743.932ZM2742.06 747.468C2740.64 747.468 2739.63 747.173 2739.04 746.584C2738.49 745.995 2738.21 745.249 2738.21 744.348V743.412C2738.21 742.511 2738.49 741.765 2739.04 741.176C2739.63 740.587 2740.64 740.292 2742.06 740.292C2743.48 740.292 2744.47 740.587 2745.02 741.176C2745.61 741.765 2745.91 742.511 2745.91 743.412V744.348C2745.91 745.249 2745.61 745.995 2745.02 746.584C2744.47 747.173 2743.48 747.468 2742.06 747.468ZM2762.06 747V743.204H2772.46V713.928H2772.1L2762.94 722.456L2760.4 719.7L2770.02 710.704H2776.83V743.204H2786.4V747H2762.06ZM2802.18 739.928H2808.89L2802.91 754.332H2799.43L2802.18 739.928ZM2865.22 726.096C2867.75 726.096 2869.64 725.541 2870.89 724.432C2872.17 723.288 2872.81 721.832 2872.81 720.064V719.7C2872.81 717.759 2872.19 716.303 2870.94 715.332C2869.73 714.361 2868.13 713.876 2866.16 713.876C2864.21 713.876 2862.64 714.309 2861.42 715.176C2860.25 716.008 2859.27 717.117 2858.51 718.504L2855.24 716.008C2855.69 715.28 2856.24 714.569 2856.9 713.876C2857.56 713.148 2858.32 712.507 2859.19 711.952C2860.09 711.397 2861.11 710.947 2862.26 710.6C2863.43 710.253 2864.77 710.08 2866.26 710.08C2867.82 710.08 2869.28 710.288 2870.63 710.704C2871.98 711.085 2873.16 711.675 2874.16 712.472C2875.17 713.235 2875.95 714.188 2876.5 715.332C2877.09 716.476 2877.39 717.776 2877.39 719.232C2877.39 720.411 2877.2 721.468 2876.82 722.404C2876.47 723.34 2875.97 724.155 2875.31 724.848C2874.68 725.541 2873.94 726.131 2873.07 726.616C2872.21 727.101 2871.29 727.465 2870.32 727.708V727.916C2871.32 728.124 2872.29 728.471 2873.23 728.956C2874.16 729.407 2875 730.013 2875.72 730.776C2876.45 731.504 2877.02 732.405 2877.44 733.48C2877.89 734.52 2878.12 735.716 2878.12 737.068C2878.12 738.628 2877.8 740.067 2877.18 741.384C2876.59 742.667 2875.74 743.776 2874.63 744.712C2873.56 745.613 2872.24 746.324 2870.68 746.844C2869.15 747.364 2867.46 747.624 2865.58 747.624C2863.99 747.624 2862.57 747.451 2861.32 747.104C2860.11 746.757 2859.01 746.289 2858.04 745.7C2857.11 745.111 2856.28 744.452 2855.55 743.724C2854.85 742.996 2854.23 742.233 2853.68 741.436L2856.95 738.94C2857.44 739.668 2857.94 740.327 2858.46 740.916C2858.98 741.505 2859.57 742.025 2860.23 742.476C2860.92 742.892 2861.7 743.221 2862.57 743.464C2863.43 743.707 2864.44 743.828 2865.58 743.828C2868.15 743.828 2870.11 743.239 2871.46 742.06C2872.85 740.881 2873.54 739.2 2873.54 737.016V736.6C2873.54 734.451 2872.86 732.787 2871.51 731.608C2870.19 730.429 2868.2 729.84 2865.53 729.84H2861.22V726.096H2865.22ZM2897.96 747.468C2896.54 747.468 2895.53 747.173 2894.94 746.584C2894.39 745.995 2894.11 745.249 2894.11 744.348V743.412C2894.11 742.511 2894.39 741.765 2894.94 741.176C2895.53 740.587 2896.54 740.292 2897.96 740.292C2899.38 740.292 2900.37 740.587 2900.92 741.176C2901.51 741.765 2901.81 742.511 2901.81 743.412V744.348C2901.81 745.249 2901.51 745.995 2900.92 746.584C2900.37 747.173 2899.38 747.468 2897.96 747.468ZM2941.46 747H2917.23V742.528L2929.35 731.608C2931.12 730.013 2932.57 728.384 2933.72 726.72C2934.86 725.021 2935.43 723.236 2935.43 721.364V720.74C2935.43 718.556 2934.86 716.875 2933.72 715.696C2932.57 714.483 2930.87 713.876 2928.62 713.876C2926.4 713.876 
2924.68 714.448 2923.47 715.592C2922.29 716.701 2921.41 718.192 2920.82 720.064L2916.92 718.608C2917.27 717.533 2917.73 716.493 2918.32 715.488C2918.95 714.448 2919.73 713.529 2920.66 712.732C2921.6 711.935 2922.73 711.293 2924.04 710.808C2925.4 710.323 2926.96 710.08 2928.72 710.08C2930.53 710.08 2932.12 710.34 2933.51 710.86C2934.93 711.38 2936.11 712.108 2937.04 713.044C2938.01 713.98 2938.74 715.089 2939.23 716.372C2939.75 717.655 2940.01 719.059 2940.01 720.584C2940.01 721.971 2939.8 723.271 2939.38 724.484C2939 725.697 2938.45 726.859 2937.72 727.968C2937.03 729.077 2936.18 730.169 2935.17 731.244C2934.2 732.284 2933.11 733.324 2931.9 734.364L2921.76 743.204H2941.46V747ZM2958.08 739.928H2964.79L2958.81 754.332H2955.33L2958.08 739.928ZM3026.73 747V739.876H3008.85V736.028L3024.55 710.704H3030.89V736.288H3036.3V739.876H3030.89V747H3026.73ZM3012.75 736.288H3026.73V714.084H3026.53L3012.75 736.288ZM3053.86 747.468C3052.44 747.468 3051.43 747.173 3050.84 746.584C3050.29 745.995 3050.01 745.249 3050.01 744.348V743.412C3050.01 742.511 3050.29 741.765 3050.84 741.176C3051.43 740.587 3052.44 740.292 3053.86 740.292C3055.28 740.292 3056.27 740.587 3056.82 741.176C3057.41 741.765 3057.71 742.511 3057.71 743.412V744.348C3057.71 745.249 3057.41 745.995 3056.82 746.584C3056.27 747.173 3055.28 747.468 3053.86 747.468ZM3089.09 747V739.876H3071.21V736.028L3086.91 710.704H3093.25V736.288H3098.66V739.876H3093.25V747H3089.09ZM3075.11 736.288H3089.09V714.084H3088.89L3075.11 736.288ZM3120.12 707.48V754.176H3105.56V750.952H3116.53V710.704H3105.56V707.48H3120.12Z" fill="#5B5B5B"/> <g filter="url(#filter8_d_103_235)"> <path d="M2376 559C2376 553.477 2380.48 549 2386 549H3024C3029.52 549 3034 553.477 3034 559V647C3034 652.523 3029.52 657 3024 657H2386C2380.48 657 2376 652.523 2376 647V559Z" fill="white"/> </g> <path d="M2404.9 630.176V583.48H2419.46V586.704H2408.49V626.952H2419.46V630.176H2404.9ZM2439.98 623.624C2437.76 623.624 2435.86 623.208 2434.26 622.376C2432.67 621.509 2431.35 620.279 2430.31 618.684C2429.27 617.089 2428.51 615.131 2428.02 612.808C2427.54 610.485 2427.29 607.833 2427.29 604.852C2427.29 601.905 2427.54 599.271 2428.02 596.948C2428.51 594.591 2429.27 592.615 2430.31 591.02C2431.35 589.425 2432.67 588.212 2434.26 587.38C2435.86 586.513 2437.76 586.08 2439.98 586.08C2442.2 586.08 2444.11 586.513 2445.7 587.38C2447.3 588.212 2448.61 589.425 2449.65 591.02C2450.69 592.615 2451.46 594.591 2451.94 596.948C2452.43 599.271 2452.67 601.905 2452.67 604.852C2452.67 607.833 2452.43 610.485 2451.94 612.808C2451.46 615.131 2450.69 617.089 2449.65 618.684C2448.61 620.279 2447.3 621.509 2445.7 622.376C2444.11 623.208 2442.2 623.624 2439.98 623.624ZM2439.98 619.828C2441.44 619.828 2442.69 619.551 2443.73 618.996C2444.77 618.407 2445.6 617.592 2446.22 616.552C2446.88 615.512 2447.37 614.264 2447.68 612.808C2447.99 611.317 2448.15 609.653 2448.15 607.816V601.888C2448.15 600.085 2447.99 598.439 2447.68 596.948C2447.37 595.457 2446.88 594.192 2446.22 593.152C2445.6 592.112 2444.77 591.315 2443.73 590.76C2442.69 590.171 2441.44 589.876 2439.98 589.876C2438.53 589.876 2437.28 590.171 2436.24 590.76C2435.2 591.315 2434.35 592.112 2433.69 593.152C2433.07 594.192 2432.6 595.457 2432.29 596.948C2431.97 598.439 2431.82 600.085 2431.82 601.888V607.816C2431.82 609.653 2431.97 611.317 2432.29 612.808C2432.6 614.264 2433.07 615.512 2433.69 616.552C2434.35 617.592 2435.2 618.407 2436.24 618.996C2437.28 619.551 2438.53 619.828 2439.98 619.828ZM2439.98 607.92C2438.77 607.92 2437.92 607.677 2437.43 607.192C2436.98 606.707 
2436.76 606.117 2436.76 605.424V604.28C2436.76 603.587 2436.98 602.997 2437.43 602.512C2437.92 602.027 2438.77 601.784 2439.98 601.784C2441.2 601.784 2442.03 602.027 2442.48 602.512C2442.96 602.997 2443.21 603.587 2443.21 604.28V605.424C2443.21 606.117 2442.96 606.707 2442.48 607.192C2442.03 607.677 2441.2 607.92 2439.98 607.92ZM2471.16 623.468C2469.74 623.468 2468.74 623.173 2468.15 622.584C2467.59 621.995 2467.31 621.249 2467.31 620.348V619.412C2467.31 618.511 2467.59 617.765 2468.15 617.176C2468.74 616.587 2469.74 616.292 2471.16 616.292C2472.58 616.292 2473.57 616.587 2474.13 617.176C2474.72 617.765 2475.01 618.511 2475.01 619.412V620.348C2475.01 621.249 2474.72 621.995 2474.13 622.584C2473.57 623.173 2472.58 623.468 2471.16 623.468ZM2514.67 623H2490.43V618.528L2502.55 607.608C2504.32 606.013 2505.77 604.384 2506.92 602.72C2508.06 601.021 2508.63 599.236 2508.63 597.364V596.74C2508.63 594.556 2508.06 592.875 2506.92 591.696C2505.77 590.483 2504.08 589.876 2501.82 589.876C2499.6 589.876 2497.89 590.448 2496.67 591.592C2495.5 592.701 2494.61 594.192 2494.02 596.064L2490.12 594.608C2490.47 593.533 2490.94 592.493 2491.53 591.488C2492.15 590.448 2492.93 589.529 2493.87 588.732C2494.8 587.935 2495.93 587.293 2497.25 586.808C2498.6 586.323 2500.16 586.08 2501.93 586.08C2503.73 586.08 2505.32 586.34 2506.71 586.86C2508.13 587.38 2509.31 588.108 2510.25 589.044C2511.22 589.98 2511.94 591.089 2512.43 592.372C2512.95 593.655 2513.21 595.059 2513.21 596.584C2513.21 597.971 2513 599.271 2512.59 600.484C2512.2 601.697 2511.65 602.859 2510.92 603.968C2510.23 605.077 2509.38 606.169 2508.37 607.244C2507.4 608.284 2506.31 609.324 2505.1 610.364L2494.96 619.204H2514.67V623ZM2531.29 615.928H2537.99L2532.01 630.332H2528.53L2531.29 615.928ZM2595.88 623.624C2593.66 623.624 2591.76 623.208 2590.16 622.376C2588.57 621.509 2587.25 620.279 2586.21 618.684C2585.17 617.089 2584.41 615.131 2583.92 612.808C2583.44 610.485 2583.19 607.833 2583.19 604.852C2583.19 601.905 2583.44 599.271 2583.92 596.948C2584.41 594.591 2585.17 592.615 2586.21 591.02C2587.25 589.425 2588.57 588.212 2590.16 587.38C2591.76 586.513 2593.66 586.08 2595.88 586.08C2598.1 586.08 2600.01 586.513 2601.6 587.38C2603.2 588.212 2604.51 589.425 2605.55 591.02C2606.59 592.615 2607.36 594.591 2607.84 596.948C2608.33 599.271 2608.57 601.905 2608.57 604.852C2608.57 607.833 2608.33 610.485 2607.84 612.808C2607.36 615.131 2606.59 617.089 2605.55 618.684C2604.51 620.279 2603.2 621.509 2601.6 622.376C2600.01 623.208 2598.1 623.624 2595.88 623.624ZM2595.88 619.828C2597.34 619.828 2598.59 619.551 2599.63 618.996C2600.67 618.407 2601.5 617.592 2602.12 616.552C2602.78 615.512 2603.27 614.264 2603.58 612.808C2603.89 611.317 2604.05 609.653 2604.05 607.816V601.888C2604.05 600.085 2603.89 598.439 2603.58 596.948C2603.27 595.457 2602.78 594.192 2602.12 593.152C2601.5 592.112 2600.67 591.315 2599.63 590.76C2598.59 590.171 2597.34 589.876 2595.88 589.876C2594.43 589.876 2593.18 590.171 2592.14 590.76C2591.1 591.315 2590.25 592.112 2589.59 593.152C2588.97 594.192 2588.5 595.457 2588.19 596.948C2587.87 598.439 2587.72 600.085 2587.72 601.888V607.816C2587.72 609.653 2587.87 611.317 2588.19 612.808C2588.5 614.264 2588.97 615.512 2589.59 616.552C2590.25 617.592 2591.1 618.407 2592.14 618.996C2593.18 619.551 2594.43 619.828 2595.88 619.828ZM2595.88 607.92C2594.67 607.92 2593.82 607.677 2593.33 607.192C2592.88 606.707 2592.66 606.117 2592.66 605.424V604.28C2592.66 603.587 2592.88 602.997 2593.33 602.512C2593.82 602.027 2594.67 601.784 2595.88 601.784C2597.09 601.784 
2597.93 602.027 2598.38 602.512C2598.86 602.997 2599.11 603.587 2599.11 604.28V605.424C2599.11 606.117 2598.86 606.707 2598.38 607.192C2597.93 607.677 2597.09 607.92 2595.88 607.92ZM2627.06 623.468C2625.64 623.468 2624.63 623.173 2624.04 622.584C2623.49 621.995 2623.21 621.249 2623.21 620.348V619.412C2623.21 618.511 2623.49 617.765 2624.04 617.176C2624.63 616.587 2625.64 616.292 2627.06 616.292C2628.48 616.292 2629.47 616.587 2630.02 617.176C2630.61 617.765 2630.91 618.511 2630.91 619.412V620.348C2630.91 621.249 2630.61 621.995 2630.02 622.584C2629.47 623.173 2628.48 623.468 2627.06 623.468ZM2668.69 590.5H2652.47L2651.43 604.384H2651.79C2652.69 603.171 2653.7 602.217 2654.81 601.524C2655.92 600.796 2657.44 600.432 2659.38 600.432C2660.98 600.432 2662.44 600.692 2663.75 601.212C2665.1 601.697 2666.27 602.425 2667.24 603.396C2668.21 604.367 2668.97 605.545 2669.52 606.932C2670.08 608.284 2670.36 609.844 2670.36 611.612C2670.36 613.38 2670.08 615.009 2669.52 616.5C2668.97 617.956 2668.16 619.221 2667.08 620.296C2666.01 621.336 2664.71 622.151 2663.18 622.74C2661.66 623.329 2659.92 623.624 2657.98 623.624C2656.46 623.624 2655.09 623.451 2653.87 623.104C2652.69 622.757 2651.64 622.289 2650.7 621.7C2649.8 621.111 2649 620.452 2648.31 619.724C2647.62 618.996 2646.99 618.233 2646.44 617.436L2649.71 614.94C2650.2 615.668 2650.68 616.327 2651.17 616.916C2651.69 617.505 2652.28 618.025 2652.94 618.476C2653.6 618.892 2654.34 619.221 2655.17 619.464C2656 619.707 2656.98 619.828 2658.08 619.828C2660.58 619.828 2662.49 619.152 2663.8 617.8C2665.16 616.448 2665.83 614.576 2665.83 612.184V611.768C2665.83 609.376 2665.16 607.504 2663.8 606.152C2662.45 604.8 2660.51 604.124 2657.98 604.124C2656.25 604.124 2654.9 604.419 2653.92 605.008C2652.99 605.597 2652.12 606.325 2651.32 607.192L2647.63 606.672L2648.93 586.704H2668.69V590.5ZM2687.18 615.928H2693.89L2687.91 630.332H2684.43L2687.18 615.928ZM2740.6 623V619.204H2751V589.928H2750.64L2741.48 598.456L2738.94 595.7L2748.56 586.704H2755.37V619.204H2764.94V623H2740.6ZM2782.96 623.468C2781.54 623.468 2780.53 623.173 2779.94 622.584C2779.39 621.995 2779.11 621.249 2779.11 620.348V619.412C2779.11 618.511 2779.39 617.765 2779.94 617.176C2780.53 616.587 2781.54 616.292 2782.96 616.292C2784.38 616.292 2785.37 616.587 2785.92 617.176C2786.51 617.765 2786.81 618.511 2786.81 619.412V620.348C2786.81 621.249 2786.51 621.995 2785.92 622.584C2785.37 623.173 2784.38 623.468 2782.96 623.468ZM2812.58 602.096C2815.11 602.096 2817 601.541 2818.25 600.432C2819.53 599.288 2820.17 597.832 2820.17 596.064V595.7C2820.17 593.759 2819.55 592.303 2818.3 591.332C2817.09 590.361 2815.49 589.876 2813.52 589.876C2811.57 589.876 2810 590.309 2808.78 591.176C2807.6 592.008 2806.63 593.117 2805.87 594.504L2802.6 592.008C2803.05 591.28 2803.6 590.569 2804.26 589.876C2804.92 589.148 2805.68 588.507 2806.55 587.952C2807.45 587.397 2808.47 586.947 2809.62 586.6C2810.79 586.253 2812.13 586.08 2813.62 586.08C2815.18 586.08 2816.64 586.288 2817.99 586.704C2819.34 587.085 2820.52 587.675 2821.52 588.472C2822.53 589.235 2823.31 590.188 2823.86 591.332C2824.45 592.476 2824.75 593.776 2824.75 595.232C2824.75 596.411 2824.56 597.468 2824.18 598.404C2823.83 599.34 2823.33 600.155 2822.67 600.848C2822.04 601.541 2821.3 602.131 2820.43 602.616C2819.56 603.101 2818.65 603.465 2817.68 603.708V603.916C2818.68 604.124 2819.65 604.471 2820.59 604.956C2821.52 605.407 2822.36 606.013 2823.08 606.776C2823.81 607.504 2824.38 608.405 2824.8 609.48C2825.25 610.52 2825.48 611.716 2825.48 613.068C2825.48 614.628 2825.16 
616.067 2824.54 617.384C2823.95 618.667 2823.1 619.776 2821.99 620.712C2820.92 621.613 2819.6 622.324 2818.04 622.844C2816.51 623.364 2814.82 623.624 2812.94 623.624C2811.35 623.624 2809.93 623.451 2808.68 623.104C2807.47 622.757 2806.37 622.289 2805.4 621.7C2804.47 621.111 2803.64 620.452 2802.91 619.724C2802.21 618.996 2801.59 618.233 2801.04 617.436L2804.31 614.94C2804.8 615.668 2805.3 616.327 2805.82 616.916C2806.34 617.505 2806.93 618.025 2807.59 618.476C2808.28 618.892 2809.06 619.221 2809.93 619.464C2810.79 619.707 2811.8 619.828 2812.94 619.828C2815.51 619.828 2817.47 619.239 2818.82 618.06C2820.21 616.881 2820.9 615.2 2820.9 613.016V612.6C2820.9 610.451 2820.22 608.787 2818.87 607.608C2817.55 606.429 2815.56 605.84 2812.89 605.84H2808.58V602.096H2812.58ZM2843.08 615.928H2849.79L2843.81 630.332H2840.33L2843.08 615.928ZM2896.5 623V619.204H2906.9V589.928H2906.53L2897.38 598.456L2894.83 595.7L2904.45 586.704H2911.27V619.204H2920.83V623H2896.5ZM2938.86 623.468C2937.44 623.468 2936.43 623.173 2935.84 622.584C2935.29 621.995 2935.01 621.249 2935.01 620.348V619.412C2935.01 618.511 2935.29 617.765 2935.84 617.176C2936.43 616.587 2937.44 616.292 2938.86 616.292C2940.28 616.292 2941.27 616.587 2941.82 617.176C2942.41 617.765 2942.71 618.511 2942.71 619.412V620.348C2942.71 621.249 2942.41 621.995 2941.82 622.584C2941.27 623.173 2940.28 623.468 2938.86 623.468ZM2982.31 600.016C2982.31 602.72 2981.89 605.268 2981.06 607.66C2980.26 610.017 2979.22 612.184 2977.94 614.16C2976.66 616.101 2975.24 617.835 2973.68 619.36C2972.15 620.851 2970.64 622.064 2969.15 623H2963.54C2965.51 621.579 2967.26 620.192 2968.79 618.84C2970.35 617.488 2971.72 616.084 2972.9 614.628C2974.08 613.137 2975.06 611.56 2975.86 609.896C2976.66 608.197 2977.32 606.291 2977.84 604.176L2977.58 604.072C2976.68 605.563 2975.51 606.811 2974.09 607.816C2972.71 608.787 2970.89 609.272 2968.63 609.272C2967.04 609.272 2965.57 609.012 2964.21 608.492C2962.9 607.972 2961.75 607.227 2960.78 606.256C2959.85 605.285 2959.1 604.124 2958.55 602.772C2958.03 601.385 2957.77 599.825 2957.77 598.092C2957.77 596.324 2958.04 594.712 2958.6 593.256C2959.19 591.765 2960.02 590.5 2961.09 589.46C2962.17 588.385 2963.45 587.553 2964.94 586.964C2966.47 586.375 2968.15 586.08 2969.99 586.08C2971.89 586.08 2973.61 586.409 2975.13 587.068C2976.66 587.692 2977.94 588.611 2978.98 589.824C2980.06 591.003 2980.87 592.459 2981.43 594.192C2982.01 595.925 2982.31 597.867 2982.31 600.016ZM2970.04 605.684C2972.53 605.684 2974.46 605.008 2975.81 603.656C2977.2 602.304 2977.89 600.397 2977.89 597.936V597.52C2977.89 595.059 2977.2 593.152 2975.81 591.8C2974.46 590.448 2972.53 589.772 2970.04 589.772C2967.54 589.772 2965.6 590.448 2964.21 591.8C2962.86 593.152 2962.19 595.059 2962.19 597.52V597.936C2962.19 600.397 2962.86 602.304 2964.21 603.656C2965.6 605.008 2967.54 605.684 2970.04 605.684ZM3005.12 583.48V630.176H2990.56V626.952H3001.53V586.704H2990.56V583.48H3005.12Z" fill="#5B5B5B"/> <g filter="url(#filter9_d_103_235)"> <path d="M2606 803C2606 797.477 2610.48 793 2616 793H3254C3259.52 793 3264 797.477 3264 803V891C3264 896.523 3259.52 901 3254 901H2616C2610.48 901 2606 896.523 2606 891V803Z" fill="white"/> </g> <path d="M2634.9 874.176V827.48H2649.46V830.704H2638.49V870.952H2649.46V874.176H2634.9ZM2669.98 867.624C2667.76 867.624 2665.86 867.208 2664.26 866.376C2662.67 865.509 2661.35 864.279 2660.31 862.684C2659.27 861.089 2658.51 859.131 2658.02 856.808C2657.54 854.485 2657.29 851.833 2657.29 848.852C2657.29 845.905 2657.54 843.271 2658.02 840.948C2658.51 838.591 
2659.27 836.615 2660.31 835.02C2661.35 833.425 2662.67 832.212 2664.26 831.38C2665.86 830.513 2667.76 830.08 2669.98 830.08C2672.2 830.08 2674.11 830.513 2675.7 831.38C2677.3 832.212 2678.61 833.425 2679.65 835.02C2680.69 836.615 2681.46 838.591 2681.94 840.948C2682.43 843.271 2682.67 845.905 2682.67 848.852C2682.67 851.833 2682.43 854.485 2681.94 856.808C2681.46 859.131 2680.69 861.089 2679.65 862.684C2678.61 864.279 2677.3 865.509 2675.7 866.376C2674.11 867.208 2672.2 867.624 2669.98 867.624ZM2669.98 863.828C2671.44 863.828 2672.69 863.551 2673.73 862.996C2674.77 862.407 2675.6 861.592 2676.22 860.552C2676.88 859.512 2677.37 858.264 2677.68 856.808C2677.99 855.317 2678.15 853.653 2678.15 851.816V845.888C2678.15 844.085 2677.99 842.439 2677.68 840.948C2677.37 839.457 2676.88 838.192 2676.22 837.152C2675.6 836.112 2674.77 835.315 2673.73 834.76C2672.69 834.171 2671.44 833.876 2669.98 833.876C2668.53 833.876 2667.28 834.171 2666.24 834.76C2665.2 835.315 2664.35 836.112 2663.69 837.152C2663.07 838.192 2662.6 839.457 2662.29 840.948C2661.97 842.439 2661.82 844.085 2661.82 845.888V851.816C2661.82 853.653 2661.97 855.317 2662.29 856.808C2662.6 858.264 2663.07 859.512 2663.69 860.552C2664.35 861.592 2665.2 862.407 2666.24 862.996C2667.28 863.551 2668.53 863.828 2669.98 863.828ZM2669.98 851.92C2668.77 851.92 2667.92 851.677 2667.43 851.192C2666.98 850.707 2666.76 850.117 2666.76 849.424V848.28C2666.76 847.587 2666.98 846.997 2667.43 846.512C2667.92 846.027 2668.77 845.784 2669.98 845.784C2671.2 845.784 2672.03 846.027 2672.48 846.512C2672.96 846.997 2673.21 847.587 2673.21 848.28V849.424C2673.21 850.117 2672.96 850.707 2672.48 851.192C2672.03 851.677 2671.2 851.92 2669.98 851.92ZM2701.16 867.468C2699.74 867.468 2698.74 867.173 2698.15 866.584C2697.59 865.995 2697.31 865.249 2697.31 864.348V863.412C2697.31 862.511 2697.59 861.765 2698.15 861.176C2698.74 860.587 2699.74 860.292 2701.16 860.292C2702.58 860.292 2703.57 860.587 2704.13 861.176C2704.72 861.765 2705.01 862.511 2705.01 863.412V864.348C2705.01 865.249 2704.72 865.995 2704.13 866.584C2703.57 867.173 2702.58 867.468 2701.16 867.468ZM2721.16 867V863.204H2731.56V833.928H2731.2L2722.05 842.456L2719.5 839.7L2729.12 830.704H2735.93V863.204H2745.5V867H2721.16ZM2761.29 859.928H2767.99L2762.01 874.332H2758.53L2761.29 859.928ZM2825.88 867.624C2823.66 867.624 2821.76 867.208 2820.16 866.376C2818.57 865.509 2817.25 864.279 2816.21 862.684C2815.17 861.089 2814.41 859.131 2813.92 856.808C2813.44 854.485 2813.19 851.833 2813.19 848.852C2813.19 845.905 2813.44 843.271 2813.92 840.948C2814.41 838.591 2815.17 836.615 2816.21 835.02C2817.25 833.425 2818.57 832.212 2820.16 831.38C2821.76 830.513 2823.66 830.08 2825.88 830.08C2828.1 830.08 2830.01 830.513 2831.6 831.38C2833.2 832.212 2834.51 833.425 2835.55 835.02C2836.59 836.615 2837.36 838.591 2837.84 840.948C2838.33 843.271 2838.57 845.905 2838.57 848.852C2838.57 851.833 2838.33 854.485 2837.84 856.808C2837.36 859.131 2836.59 861.089 2835.55 862.684C2834.51 864.279 2833.2 865.509 2831.6 866.376C2830.01 867.208 2828.1 867.624 2825.88 867.624ZM2825.88 863.828C2827.34 863.828 2828.59 863.551 2829.63 862.996C2830.67 862.407 2831.5 861.592 2832.12 860.552C2832.78 859.512 2833.27 858.264 2833.58 856.808C2833.89 855.317 2834.05 853.653 2834.05 851.816V845.888C2834.05 844.085 2833.89 842.439 2833.58 840.948C2833.27 839.457 2832.78 838.192 2832.12 837.152C2831.5 836.112 2830.67 835.315 2829.63 834.76C2828.59 834.171 2827.34 833.876 2825.88 833.876C2824.43 833.876 2823.18 834.171 2822.14 834.76C2821.1 835.315 2820.25 
836.112 2819.59 837.152C2818.97 838.192 2818.5 839.457 2818.19 840.948C2817.87 842.439 2817.72 844.085 2817.72 845.888V851.816C2817.72 853.653 2817.87 855.317 2818.19 856.808C2818.5 858.264 2818.97 859.512 2819.59 860.552C2820.25 861.592 2821.1 862.407 2822.14 862.996C2823.18 863.551 2824.43 863.828 2825.88 863.828ZM2825.88 851.92C2824.67 851.92 2823.82 851.677 2823.33 851.192C2822.88 850.707 2822.66 850.117 2822.66 849.424V848.28C2822.66 847.587 2822.88 846.997 2823.33 846.512C2823.82 846.027 2824.67 845.784 2825.88 845.784C2827.09 845.784 2827.93 846.027 2828.38 846.512C2828.86 846.997 2829.11 847.587 2829.11 848.28V849.424C2829.11 850.117 2828.86 850.707 2828.38 851.192C2827.93 851.677 2827.09 851.92 2825.88 851.92ZM2857.06 867.468C2855.64 867.468 2854.63 867.173 2854.04 866.584C2853.49 865.995 2853.21 865.249 2853.21 864.348V863.412C2853.21 862.511 2853.49 861.765 2854.04 861.176C2854.63 860.587 2855.64 860.292 2857.06 860.292C2858.48 860.292 2859.47 860.587 2860.02 861.176C2860.61 861.765 2860.91 862.511 2860.91 863.412V864.348C2860.91 865.249 2860.61 865.995 2860.02 866.584C2859.47 867.173 2858.48 867.468 2857.06 867.468ZM2881.64 867L2895.83 834.448H2879.92V841.208H2876.02V830.704H2900.25V834.552L2886.32 867H2881.64ZM2917.18 859.928H2923.89L2917.91 874.332H2914.43L2917.18 859.928ZM2994.1 867H2969.87V862.528L2981.99 851.608C2983.76 850.013 2985.21 848.384 2986.36 846.72C2987.5 845.021 2988.07 843.236 2988.07 841.364V840.74C2988.07 838.556 2987.5 836.875 2986.36 835.696C2985.21 834.483 2983.51 833.876 2981.26 833.876C2979.04 833.876 2977.33 834.448 2976.11 835.592C2974.93 836.701 2974.05 838.192 2973.46 840.064L2969.56 838.608C2969.91 837.533 2970.37 836.493 2970.96 835.488C2971.59 834.448 2972.37 833.529 2973.3 832.732C2974.24 831.935 2975.37 831.293 2976.68 830.808C2978.04 830.323 2979.6 830.08 2981.36 830.08C2983.17 830.08 2984.76 830.34 2986.15 830.86C2987.57 831.38 2988.75 832.108 2989.68 833.044C2990.65 833.98 2991.38 835.089 2991.87 836.372C2992.39 837.655 2992.65 839.059 2992.65 840.584C2992.65 841.971 2992.44 843.271 2992.02 844.484C2991.64 845.697 2991.09 846.859 2990.36 847.968C2989.67 849.077 2988.82 850.169 2987.81 851.244C2986.84 852.284 2985.75 853.324 2984.54 854.364L2974.4 863.204H2994.1V867ZM3012.96 867.468C3011.54 867.468 3010.53 867.173 3009.94 866.584C3009.39 865.995 3009.11 865.249 3009.11 864.348V863.412C3009.11 862.511 3009.39 861.765 3009.94 861.176C3010.53 860.587 3011.54 860.292 3012.96 860.292C3014.38 860.292 3015.37 860.587 3015.92 861.176C3016.51 861.765 3016.81 862.511 3016.81 863.412V864.348C3016.81 865.249 3016.51 865.995 3015.92 866.584C3015.37 867.173 3014.38 867.468 3012.96 867.468ZM3056.41 844.016C3056.41 846.72 3056 849.268 3055.16 851.66C3054.37 854.017 3053.33 856.184 3052.04 858.16C3050.76 860.101 3049.34 861.835 3047.78 863.36C3046.25 864.851 3044.75 866.064 3043.26 867H3037.64C3039.62 865.579 3041.37 864.192 3042.89 862.84C3044.45 861.488 3045.82 860.084 3047 858.628C3048.18 857.137 3049.17 855.56 3049.96 853.896C3050.76 852.197 3051.42 850.291 3051.94 848.176L3051.68 848.072C3050.78 849.563 3049.62 850.811 3048.2 851.816C3046.81 852.787 3044.99 853.272 3042.74 853.272C3041.14 853.272 3039.67 853.012 3038.32 852.492C3037 851.972 3035.85 851.227 3034.88 850.256C3033.95 849.285 3033.2 848.124 3032.65 846.772C3032.13 845.385 3031.87 843.825 3031.87 842.092C3031.87 840.324 3032.14 838.712 3032.7 837.256C3033.29 835.765 3034.12 834.5 3035.2 833.46C3036.27 832.385 3037.55 831.553 3039.04 830.964C3040.57 830.375 3042.25 830.08 3044.09 830.08C3045.99 
830.08 3047.71 830.409 3049.24 831.068C3050.76 831.692 3052.04 832.611 3053.08 833.824C3054.16 835.003 3054.97 836.459 3055.53 838.192C3056.12 839.925 3056.41 841.867 3056.41 844.016ZM3044.14 849.684C3046.64 849.684 3048.56 849.008 3049.91 847.656C3051.3 846.304 3051.99 844.397 3051.99 841.936V841.52C3051.99 839.059 3051.3 837.152 3049.91 835.8C3048.56 834.448 3046.64 833.772 3044.14 833.772C3041.64 833.772 3039.7 834.448 3038.32 835.8C3036.96 837.152 3036.29 839.059 3036.29 841.52V841.936C3036.29 844.397 3036.96 846.304 3038.32 847.656C3039.7 849.008 3041.64 849.684 3044.14 849.684ZM3073.08 859.928H3079.79L3073.81 874.332H3070.33L3073.08 859.928ZM3137.68 867.624C3135.46 867.624 3133.55 867.208 3131.96 866.376C3130.36 865.509 3129.05 864.279 3128.01 862.684C3126.97 861.089 3126.2 859.131 3125.72 856.808C3125.23 854.485 3124.99 851.833 3124.99 848.852C3124.99 845.905 3125.23 843.271 3125.72 840.948C3126.2 838.591 3126.97 836.615 3128.01 835.02C3129.05 833.425 3130.36 832.212 3131.96 831.38C3133.55 830.513 3135.46 830.08 3137.68 830.08C3139.9 830.08 3141.8 830.513 3143.4 831.38C3144.99 832.212 3146.31 833.425 3147.35 835.02C3148.39 836.615 3149.15 838.591 3149.64 840.948C3150.12 843.271 3150.37 845.905 3150.37 848.852C3150.37 851.833 3150.12 854.485 3149.64 856.808C3149.15 859.131 3148.39 861.089 3147.35 862.684C3146.31 864.279 3144.99 865.509 3143.4 866.376C3141.8 867.208 3139.9 867.624 3137.68 867.624ZM3137.68 863.828C3139.13 863.828 3140.38 863.551 3141.42 862.996C3142.46 862.407 3143.29 861.592 3143.92 860.552C3144.58 859.512 3145.06 858.264 3145.37 856.808C3145.69 855.317 3145.84 853.653 3145.84 851.816V845.888C3145.84 844.085 3145.69 842.439 3145.37 840.948C3145.06 839.457 3144.58 838.192 3143.92 837.152C3143.29 836.112 3142.46 835.315 3141.42 834.76C3140.38 834.171 3139.13 833.876 3137.68 833.876C3136.22 833.876 3134.97 834.171 3133.93 834.76C3132.89 835.315 3132.04 836.112 3131.39 837.152C3130.76 838.192 3130.29 839.457 3129.98 840.948C3129.67 842.439 3129.51 844.085 3129.51 845.888V851.816C3129.51 853.653 3129.67 855.317 3129.98 856.808C3130.29 858.264 3130.76 859.512 3131.39 860.552C3132.04 861.592 3132.89 862.407 3133.93 862.996C3134.97 863.551 3136.22 863.828 3137.68 863.828ZM3137.68 851.92C3136.46 851.92 3135.62 851.677 3135.13 851.192C3134.68 850.707 3134.45 850.117 3134.45 849.424V848.28C3134.45 847.587 3134.68 846.997 3135.13 846.512C3135.62 846.027 3136.46 845.784 3137.68 845.784C3138.89 845.784 3139.72 846.027 3140.17 846.512C3140.66 846.997 3140.9 847.587 3140.9 848.28V849.424C3140.9 850.117 3140.66 850.707 3140.17 851.192C3139.72 851.677 3138.89 851.92 3137.68 851.92ZM3168.86 867.468C3167.44 867.468 3166.43 867.173 3165.84 866.584C3165.29 865.995 3165.01 865.249 3165.01 864.348V863.412C3165.01 862.511 3165.29 861.765 3165.84 861.176C3166.43 860.587 3167.44 860.292 3168.86 860.292C3170.28 860.292 3171.27 860.587 3171.82 861.176C3172.41 861.765 3172.71 862.511 3172.71 863.412V864.348C3172.71 865.249 3172.41 865.995 3171.82 866.584C3171.27 867.173 3170.28 867.468 3168.86 867.468ZM3200.04 867.624C3197.92 867.624 3196.07 867.364 3194.47 866.844C3192.88 866.289 3191.54 865.544 3190.47 864.608C3189.43 863.672 3188.63 862.563 3188.08 861.28C3187.56 859.997 3187.3 858.611 3187.3 857.12C3187.3 854.624 3187.97 852.648 3189.33 851.192C3190.68 849.736 3192.46 848.713 3194.68 848.124V847.708C3192.74 847.049 3191.2 845.992 3190.05 844.536C3188.94 843.045 3188.39 841.26 3188.39 839.18C3188.39 836.407 3189.41 834.205 3191.46 832.576C3193.5 830.912 3196.36 830.08 3200.04 830.08C3203.71 
830.08 3206.57 830.912 3208.62 832.576C3210.66 834.205 3211.69 836.407 3211.69 839.18C3211.69 841.26 3211.11 843.045 3209.97 844.536C3208.86 845.992 3207.33 847.049 3205.39 847.708V848.124C3207.61 848.713 3209.4 849.736 3210.75 851.192C3212.1 852.648 3212.78 854.624 3212.78 857.12C3212.78 858.611 3212.5 859.997 3211.95 861.28C3211.43 862.563 3210.63 863.672 3209.55 864.608C3208.48 865.544 3207.14 866.289 3205.55 866.844C3203.99 867.364 3202.15 867.624 3200.04 867.624ZM3200.04 863.932C3202.6 863.932 3204.61 863.36 3206.07 862.216C3207.53 861.072 3208.25 859.46 3208.25 857.38V856.236C3208.25 854.156 3207.53 852.544 3206.07 851.4C3204.65 850.256 3202.64 849.684 3200.04 849.684C3197.44 849.684 3195.41 850.256 3193.95 851.4C3192.53 852.544 3191.82 854.156 3191.82 856.236V857.38C3191.82 859.46 3192.55 861.072 3194.01 862.216C3195.46 863.36 3197.47 863.932 3200.04 863.932ZM3200.04 846.148C3202.39 846.148 3204.18 845.645 3205.39 844.64C3206.64 843.635 3207.27 842.196 3207.27 840.324V839.596C3207.27 837.724 3206.62 836.285 3205.34 835.28C3204.09 834.275 3202.33 833.772 3200.04 833.772C3197.75 833.772 3195.96 834.275 3194.68 835.28C3193.43 836.285 3192.81 837.724 3192.81 839.596V840.324C3192.81 842.196 3193.42 843.635 3194.63 844.64C3195.88 845.645 3197.68 846.148 3200.04 846.148ZM3235.12 827.48V874.176H3220.56V870.952H3231.53V830.704H3220.56V827.48H3235.12Z" fill="#5B5B5B"/> <path d="M979.865 757.413C980.462 757.577 980.462 758.423 979.865 758.587L967.02 762.118C966.633 762.225 966.25 761.933 966.25 761.531L966.25 754.469C966.25 754.067 966.633 753.775 967.02 753.882L979.865 757.413Z" fill="#7B8B8F"/> <path d="M2194 741H2433" stroke="#7B8B8F" stroke-width="1.5" stroke-linejoin="round"/> <path d="M2445.87 740.413C2446.46 740.577 2446.46 741.423 2445.87 741.587L2433.02 745.118C2432.63 745.225 2432.25 744.933 2432.25 744.531L2432.25 737.469C2432.25 737.067 2432.63 736.775 2433.02 736.882L2445.87 740.413Z" fill="#7B8B8F"/> <path d="M979.865 1162.41C980.462 1162.58 980.462 1163.42 979.865 1163.59L967.02 1167.12C966.633 1167.22 966.25 1166.93 966.25 1166.53L966.25 1159.47C966.25 1159.07 966.633 1158.78 967.02 1158.88L979.865 1162.41Z" fill="#7B8B8F"/> <path d="M979.865 358.413C980.462 358.577 980.462 359.423 979.865 359.587L967.02 363.118C966.633 363.225 966.25 362.933 966.25 362.531L966.25 355.469C966.25 355.067 966.633 354.775 967.02 354.882L979.865 358.413Z" fill="#7B8B8F"/> <defs> <filter id="filter0_d_103_235" x="92" y="278" width="541" height="903" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter1_d_103_235" x="154" y="341" width="541" height="903" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix 
type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter2_d_103_235" x="223" y="404" width="541" height="903" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter3_d_103_235" x="960" y="980" width="466" height="387" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter4_d_103_235" x="960" y="180" width="466" height="387" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter5_d_103_235" x="960" y="580" width="466" height="387" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter6_f_103_235" x="1908.29" y="596.293" width="323.723" height="323.723" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feBlend mode="normal" in="SourceGraphic" in2="BackgroundImageFix" result="shape"/> <feGaussianBlur stdDeviation="19.5" result="effect1_foregroundBlur_103_235"/> </filter> <filter id="filter7_d_103_235" x="2456" y="654" width="728" 
height="178" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter8_d_103_235" x="2341" y="530" width="728" height="178" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <filter id="filter9_d_103_235" x="2571" y="774" width="728" height="178" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB"> <feFlood flood-opacity="0" result="BackgroundImageFix"/> <feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/> <feOffset dy="16"/> <feGaussianBlur stdDeviation="17.5"/> <feComposite in2="hardAlpha" operator="out"/> <feColorMatrix type="matrix" values="0 0 0 0 0.439216 0 0 0 0 0.564706 0 0 0 0 0.690196 0 0 0 0.2 0"/> <feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_103_235"/> <feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_103_235" result="shape"/> </filter> <linearGradient id="paint0_linear_103_235" x1="126.373" y1="366.299" x2="186.536" y2="166.862" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint1_linear_103_235" x1="133.634" y1="413.248" x2="203.507" y2="221.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint2_linear_103_235" x1="130.885" y1="460.248" x2="196.684" y2="265.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint3_linear_103_235" x1="122.5" y1="507.248" x2="178.075" y2="305.44" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint4_linear_103_235" x1="133.634" y1="554.248" x2="203.507" y2="362.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint5_linear_103_235" x1="149.992" y1="627.248" x2="251.781" y2="480.963" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient 
id="paint6_linear_103_235" x1="130.885" y1="674.248" x2="196.684" y2="479.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint7_linear_103_235" x1="140.095" y1="721.248" x2="221.227" y2="540.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint8_linear_103_235" x1="140.095" y1="768.248" x2="221.227" y2="587.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint9_linear_103_235" x1="147.931" y1="815.248" x2="245.464" y2="659.029" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint10_linear_103_235" x1="133.634" y1="888.248" x2="203.507" y2="696.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint11_linear_103_235" x1="128.273" y1="936.248" x2="190.557" y2="738.78" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint12_linear_103_235" x1="135.971" y1="984.248" x2="209.635" y2="795.954" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint13_linear_103_235" x1="130.885" y1="1032.25" x2="196.684" y2="837.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint14_linear_103_235" x1="125.112" y1="1080.25" x2="183.554" y2="880.21" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint15_linear_103_235" x1="188.373" y1="429.299" x2="248.536" y2="229.862" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint16_linear_103_235" x1="195.634" y1="476.248" x2="265.507" y2="284.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint17_linear_103_235" x1="192.885" y1="523.248" x2="258.684" y2="328.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint18_linear_103_235" x1="184.5" y1="570.248" x2="240.075" y2="368.44" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint19_linear_103_235" x1="195.634" y1="617.248" x2="265.507" y2="425.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint20_linear_103_235" x1="211.992" y1="690.248" x2="313.781" y2="543.963" 
gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint21_linear_103_235" x1="192.885" y1="737.248" x2="258.684" y2="542.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint22_linear_103_235" x1="202.095" y1="784.248" x2="283.227" y2="603.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint23_linear_103_235" x1="202.095" y1="831.248" x2="283.227" y2="650.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint24_linear_103_235" x1="209.931" y1="878.248" x2="307.464" y2="722.029" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint25_linear_103_235" x1="195.634" y1="951.248" x2="265.507" y2="759.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint26_linear_103_235" x1="190.273" y1="999.248" x2="252.557" y2="801.78" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint27_linear_103_235" x1="197.971" y1="1047.25" x2="271.635" y2="858.954" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint28_linear_103_235" x1="192.885" y1="1095.25" x2="258.684" y2="900.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint29_linear_103_235" x1="187.112" y1="1143.25" x2="245.554" y2="943.21" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint30_linear_103_235" x1="257.373" y1="492.299" x2="317.536" y2="292.862" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint31_linear_103_235" x1="264.634" y1="539.248" x2="334.507" y2="347.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint32_linear_103_235" x1="261.885" y1="586.248" x2="327.684" y2="391.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint33_linear_103_235" x1="253.5" y1="633.248" x2="309.075" y2="431.44" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint34_linear_103_235" x1="264.634" y1="680.248" x2="334.507" y2="488.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" 
stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint35_linear_103_235" x1="280.992" y1="753.248" x2="382.781" y2="606.963" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint36_linear_103_235" x1="261.885" y1="800.248" x2="327.684" y2="605.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint37_linear_103_235" x1="271.095" y1="847.248" x2="352.227" y2="666.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint38_linear_103_235" x1="271.095" y1="894.248" x2="352.227" y2="713.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint39_linear_103_235" x1="278.931" y1="941.248" x2="376.464" y2="785.029" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint40_linear_103_235" x1="264.634" y1="1014.25" x2="334.507" y2="822.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint41_linear_103_235" x1="259.273" y1="1062.25" x2="321.557" y2="864.78" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint42_linear_103_235" x1="266.971" y1="1110.25" x2="340.635" y2="921.954" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint43_linear_103_235" x1="261.885" y1="1158.25" x2="327.684" y2="963.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint44_linear_103_235" x1="256.112" y1="1206.25" x2="314.554" y2="1006.21" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint45_linear_103_235" x1="993.634" y1="1070.25" x2="1063.51" y2="878.612" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint46_linear_103_235" x1="988.273" y1="1118.25" x2="1050.56" y2="920.78" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint47_linear_103_235" x1="995.971" y1="1166.25" x2="1069.63" y2="977.954" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint48_linear_103_235" x1="990.885" y1="1214.25" x2="1056.68" y2="1019.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient 
id="paint49_linear_103_235" x1="985.112" y1="1262.25" x2="1043.55" y2="1062.21" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint50_linear_103_235" x1="986.373" y1="269.299" x2="1046.54" y2="69.8618" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint51_linear_103_235" x1="993.634" y1="316.329" x2="1063.51" y2="124.693" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint52_linear_103_235" x1="990.885" y1="363.329" x2="1056.68" y2="168.43" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint53_linear_103_235" x1="982.5" y1="410.329" x2="1038.08" y2="208.521" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint54_linear_103_235" x1="993.634" y1="457.329" x2="1063.51" y2="265.693" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint55_linear_103_235" x1="1009.99" y1="668.248" x2="1111.78" y2="521.963" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint56_linear_103_235" x1="990.885" y1="715.248" x2="1056.68" y2="520.35" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint57_linear_103_235" x1="1000.1" y1="762.248" x2="1081.23" y2="581.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint58_linear_103_235" x1="1000.1" y1="809.248" x2="1081.23" y2="628.566" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint59_linear_103_235" x1="1007.93" y1="856.248" x2="1105.46" y2="700.029" gradientUnits="userSpaceOnUse"> <stop stop-color="#A8B8C1"/> <stop offset="0.818708" stop-color="#A8B8C1" stop-opacity="0.0208334"/> </linearGradient> <linearGradient id="paint60_linear_103_235" x1="1971.04" y1="833.546" x2="2192.99" y2="706.489" gradientUnits="userSpaceOnUse"> <stop stop-color="#EC8F5A"/> <stop offset="1" stop-color="#FBFF44"/> </linearGradient> </defs> </svg>
4
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/62_pytorch_fsdp/run_clm_no_trainer.py
#!/usr/bin/env python # coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """ Fine-tuning the library models for causal language modeling (GPT, GPT-2, CTRL, ...) on a text file or a dataset without using HuggingFace Trainer. Here is the full list of checkpoints on the hub that can be fine-tuned by this script: https://huggingface.co/models?filter=text-generation """ # You can also adapt this script on your own causal language modeling task. Pointers for this are left as comments. import argparse import json import logging import math import os import random from itertools import chain from pathlib import Path import datasets import torch from datasets import load_dataset from torch.utils.data import DataLoader from tqdm.auto import tqdm import transformers from accelerate import Accelerator, DistributedType from accelerate.utils import set_seed from huggingface_hub import Repository from transformers import ( CONFIG_MAPPING, MODEL_MAPPING, AdamW, AutoConfig, AutoModelForCausalLM, AutoTokenizer, SchedulerType, default_data_collator, get_scheduler, ) from transformers.utils import get_full_repo_name from transformers.utils.versions import require_version logger = logging.getLogger(__name__) require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/language-modeling/requirements.txt") MODEL_CONFIG_CLASSES = list(MODEL_MAPPING.keys()) MODEL_TYPES = tuple(conf.model_type for conf in MODEL_CONFIG_CLASSES) def parse_args(): parser = argparse.ArgumentParser(description="Finetune a transformers model on a causal language modeling task") parser.add_argument( "--dataset_name", type=str, default=None, help="The name of the dataset to use (via the datasets library).", ) parser.add_argument( "--dataset_config_name", type=str, default=None, help="The configuration name of the dataset to use (via the datasets library).", ) parser.add_argument( "--train_file", type=str, default=None, help="A csv or a json file containing the training data." ) parser.add_argument( "--validation_file", type=str, default=None, help="A csv or a json file containing the validation data." 
) parser.add_argument( "--validation_split_percentage", default=5, help="The percentage of the train set used as validation set in case there's no validation split", ) parser.add_argument( "--model_name_or_path", type=str, help="Path to pretrained model or model identifier from huggingface.co/models.", required=True, ) parser.add_argument( "--config_name", type=str, default=None, help="Pretrained config name or path if not the same as model_name", ) parser.add_argument( "--tokenizer_name", type=str, default=None, help="Pretrained tokenizer name or path if not the same as model_name", ) parser.add_argument( "--use_slow_tokenizer", action="store_true", help="If passed, will use a slow tokenizer (not backed by the 🤗 Tokenizers library).", ) parser.add_argument( "--per_device_train_batch_size", type=int, default=8, help="Batch size (per device) for the training dataloader.", ) parser.add_argument( "--per_device_eval_batch_size", type=int, default=8, help="Batch size (per device) for the evaluation dataloader.", ) parser.add_argument( "--learning_rate", type=float, default=5e-5, help="Initial learning rate (after the potential warmup period) to use.", ) parser.add_argument("--weight_decay", type=float, default=0.0, help="Weight decay to use.") parser.add_argument("--num_train_epochs", type=int, default=3, help="Total number of training epochs to perform.") parser.add_argument( "--max_train_steps", type=int, default=None, help="Total number of training steps to perform. If provided, overrides num_train_epochs.", ) parser.add_argument( "--gradient_accumulation_steps", type=int, default=1, help="Number of updates steps to accumulate before performing a backward/update pass.", ) parser.add_argument( "--lr_scheduler_type", type=SchedulerType, default="linear", help="The scheduler type to use.", choices=["linear", "cosine", "cosine_with_restarts", "polynomial", "constant", "constant_with_warmup"], ) parser.add_argument( "--num_warmup_steps", type=int, default=0, help="Number of steps for the warmup in the lr scheduler." ) parser.add_argument("--output_dir", type=str, default=None, help="Where to store the final model.") parser.add_argument("--seed", type=int, default=None, help="A seed for reproducible training.") parser.add_argument( "--model_type", type=str, default=None, help="Model type to use if training from scratch.", choices=MODEL_TYPES, ) parser.add_argument( "--block_size", type=int, default=None, help="Optional input sequence length after tokenization. The training dataset will be truncated in block of this size for training. Default to the model max input length for single sentence inputs (take into account special tokens).", ) parser.add_argument( "--preprocessing_num_workers", type=int, default=None, help="The number of processes to use for the preprocessing.", ) parser.add_argument( "--overwrite_cache", type=bool, default=False, help="Overwrite the cached training and evaluation sets" ) parser.add_argument( "--no_keep_linebreaks", action="store_true", help="Do not keep line breaks when using TXT files." ) parser.add_argument("--push_to_hub", action="store_true", help="Whether or not to push the model to the Hub.") parser.add_argument( "--hub_model_id", type=str, help="The name of the repository to keep in sync with the local `output_dir`." 
) parser.add_argument("--hub_token", type=str, help="The token to use to push to the Model Hub.") parser.add_argument( "--checkpointing_steps", type=str, default=None, help="Whether the various states should be saved at the end of every n steps, or 'epoch' for each epoch.", ) parser.add_argument( "--resume_from_checkpoint", type=str, default=None, help="If the training should continue from a checkpoint folder.", ) parser.add_argument( "--with_tracking", action="store_true", help="Whether to load in all available experiment trackers from the environment and use them for logging.", ) parser.add_argument( "--n_train", type=int, default=2000, help="Number of train samples.", ) parser.add_argument( "--n_val", type=int, default=500, help="Number of validation samples.", ) args = parser.parse_args() # Sanity checks if args.dataset_name is None and args.train_file is None and args.validation_file is None: raise ValueError("Need either a dataset name or a training/validation file.") else: if args.train_file is not None: extension = args.train_file.split(".")[-1] assert extension in ["csv", "json", "txt"], "`train_file` should be a csv, json or txt file." if args.validation_file is not None: extension = args.validation_file.split(".")[-1] assert extension in ["csv", "json", "txt"], "`validation_file` should be a csv, json or txt file." if args.push_to_hub: assert args.output_dir is not None, "Need an `output_dir` to create a repo when `--push_to_hub` is passed." return args def main(): args = parse_args() # Initialize the accelerator. We will let the accelerator handle device placement for us in this example. # If we're using tracking, we also need to initialize it here and it will pick up all supported trackers in the environment accelerator = Accelerator(log_with="all", logging_dir=args.output_dir) if args.with_tracking else Accelerator() # Make one log on every process with the configuration for debugging. logging.basicConfig( format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", datefmt="%m/%d/%Y %H:%M:%S", level=logging.INFO, ) logger.info(accelerator.state) # Setup logging, we only want one process per machine to log things on the screen. # accelerator.is_local_main_process is only True for one process per machine. logger.setLevel(logging.INFO if accelerator.is_local_main_process else logging.ERROR) if accelerator.is_local_main_process: datasets.utils.logging.set_verbosity_warning() transformers.utils.logging.set_verbosity_info() else: datasets.utils.logging.set_verbosity_error() transformers.utils.logging.set_verbosity_error() # If passed along, set the training seed now. 
if args.seed is not None: set_seed(args.seed) # Handle the repository creation if accelerator.is_main_process: if args.push_to_hub: if args.hub_model_id is None: repo_name = get_full_repo_name(Path(args.output_dir).name, token=args.hub_token) else: repo_name = args.hub_model_id repo = Repository(args.output_dir, clone_from=repo_name) with open(os.path.join(args.output_dir, ".gitignore"), "w+") as gitignore: if "step_*" not in gitignore: gitignore.write("step_*\n") if "epoch_*" not in gitignore: gitignore.write("epoch_*\n") elif args.output_dir is not None: os.makedirs(args.output_dir, exist_ok=True) accelerator.wait_for_everyone() # Get the datasets: you can either provide your own CSV/JSON/TXT training and evaluation files (see below) # or just provide the name of one of the public datasets available on the hub at https://huggingface.co/datasets/ # (the dataset will be downloaded automatically from the datasets Hub). # # For CSV/JSON files, this script will use the column called 'text' or the first column if no column called # 'text' is found. You can easily tweak this behavior (see below). # # In distributed training, the load_dataset function guarantee that only one local process can concurrently # download the dataset. if args.dataset_name is not None: # Downloading and loading a dataset from the hub. raw_datasets = datasets.DatasetDict( { "train": datasets.Dataset.from_dict( load_dataset(args.dataset_name, args.dataset_config_name)["train"][: args.n_train + args.n_val] ) } ) if "validation" not in raw_datasets.keys(): raw_datasets["validation"] = load_dataset( args.dataset_name, args.dataset_config_name, split=f"train[:{args.validation_split_percentage}%]", ) raw_datasets["train"] = load_dataset( args.dataset_name, args.dataset_config_name, split=f"train[{args.validation_split_percentage}%:]", ) else: data_files = {} dataset_args = {} if args.train_file is not None: data_files["train"] = args.train_file if args.validation_file is not None: data_files["validation"] = args.validation_file extension = args.train_file.split(".")[-1] if extension == "txt": extension = "text" dataset_args["keep_linebreaks"] = not args.no_keep_linebreaks raw_datasets = load_dataset(extension, data_files=data_files, **dataset_args) # If no validation data is there, validation_split_percentage will be used to divide the dataset. if "validation" not in raw_datasets.keys(): raw_datasets["validation"] = load_dataset( extension, data_files=data_files, split=f"train[:{args.validation_split_percentage}%]", **dataset_args, ) raw_datasets["train"] = load_dataset( extension, data_files=data_files, split=f"train[{args.validation_split_percentage}%:]", **dataset_args, ) # See more about loading any type of standard or custom dataset (from files, python dict, pandas DataFrame, etc) at # https://huggingface.co/docs/datasets/loading_datasets.html. # Load pretrained model and tokenizer # # In distributed training, the .from_pretrained methods guarantee that only one local process can concurrently # download model & vocab. 
if args.config_name: config = AutoConfig.from_pretrained(args.config_name) elif args.model_name_or_path: config = AutoConfig.from_pretrained(args.model_name_or_path) else: config = CONFIG_MAPPING[args.model_type]() logger.warning("You are instantiating a new config instance from scratch.") if args.tokenizer_name: tokenizer = AutoTokenizer.from_pretrained(args.tokenizer_name, use_fast=not args.use_slow_tokenizer) elif args.model_name_or_path: tokenizer = AutoTokenizer.from_pretrained(args.model_name_or_path, use_fast=not args.use_slow_tokenizer) else: raise ValueError( "You are instantiating a new tokenizer from scratch. This is not supported by this script." "You can do it from another script, save it, and load it from here, using --tokenizer_name." ) if args.model_name_or_path: model = AutoModelForCausalLM.from_pretrained( args.model_name_or_path, from_tf=bool(".ckpt" in args.model_name_or_path), config=config, ) else: logger.info("Training new model from scratch") model = AutoModelForCausalLM.from_config(config) model.resize_token_embeddings(len(tokenizer)) # Preprocessing the datasets. # First we tokenize all the texts. column_names = raw_datasets["train"].column_names text_column_name = "text" if "text" in column_names else column_names[0] def tokenize_function(examples): return tokenizer(examples[text_column_name]) with accelerator.main_process_first(): tokenized_datasets = raw_datasets.map( tokenize_function, batched=True, num_proc=args.preprocessing_num_workers, remove_columns=column_names, load_from_cache_file=not args.overwrite_cache, desc="Running tokenizer on dataset", ) if args.block_size is None: block_size = tokenizer.model_max_length if block_size > 1024: logger.warning( f"The tokenizer picked seems to have a very large `model_max_length` ({tokenizer.model_max_length}). " "Picking 1024 instead. You can change that default value by passing --block_size xxx." ) block_size = 1024 else: if args.block_size > tokenizer.model_max_length: logger.warning( f"The block_size passed ({args.block_size}) is larger than the maximum length for the model" f"({tokenizer.model_max_length}). Using block_size={tokenizer.model_max_length}." ) block_size = min(args.block_size, tokenizer.model_max_length) # Main data processing function that will concatenate all texts from our dataset and generate chunks of block_size. def group_texts(examples): # Concatenate all texts. concatenated_examples = {k: list(chain(*examples[k])) for k in examples.keys()} total_length = len(concatenated_examples[list(examples.keys())[0]]) # We drop the small remainder, we could add padding if the model supported it instead of this drop, you can # customize this part to your needs. if total_length >= block_size: total_length = (total_length // block_size) * block_size # Split by chunks of max_len. result = { k: [t[i : i + block_size] for i in range(0, total_length, block_size)] for k, t in concatenated_examples.items() } result["labels"] = result["input_ids"].copy() return result # Note that with `batched=True`, this map processes 1,000 texts together, so group_texts throws away a remainder # for each of those groups of 1,000 texts. You can adjust that batch_size here but a higher value might be slower # to preprocess. # # To speed up this part, we use multiprocessing. 
See the documentation of the map method for more information: # https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map with accelerator.main_process_first(): lm_datasets = tokenized_datasets.map( group_texts, batched=True, num_proc=args.preprocessing_num_workers, load_from_cache_file=not args.overwrite_cache, desc=f"Grouping texts in chunks of {block_size}", ) train_dataset = lm_datasets["train"] eval_dataset = lm_datasets["validation"] # Log a few random samples from the training set: for index in random.sample(range(len(train_dataset)), 3): logger.info(f"Sample {index} of the training set: {train_dataset[index]}.") # DataLoaders creation: train_dataloader = DataLoader( train_dataset, shuffle=True, collate_fn=default_data_collator, batch_size=args.per_device_train_batch_size ) eval_dataloader = DataLoader( eval_dataset, collate_fn=default_data_collator, batch_size=args.per_device_eval_batch_size ) # Optimizer # Split weights in two groups, one with weight decay and the other not. no_decay = ["bias", "LayerNorm.weight"] optimizer_grouped_parameters = [ { "params": [p for n, p in model.named_parameters() if not any(nd in n for nd in no_decay)], "weight_decay": args.weight_decay, }, { "params": [p for n, p in model.named_parameters() if any(nd in n for nd in no_decay)], "weight_decay": 0.0, }, ] optimizer = AdamW(optimizer_grouped_parameters, lr=args.learning_rate) # On TPU, the tie weights in our model have been disconnected, so we need to restore the ties. if accelerator.distributed_type == DistributedType.TPU: model.tie_weights() # Scheduler and math around the number of training steps. num_update_steps_per_epoch = math.ceil(len(train_dataloader) / args.gradient_accumulation_steps) if args.max_train_steps is None: args.max_train_steps = args.num_train_epochs * num_update_steps_per_epoch else: args.num_train_epochs = math.ceil(args.max_train_steps / num_update_steps_per_epoch) lr_scheduler = get_scheduler( name=args.lr_scheduler_type, optimizer=optimizer, num_warmup_steps=args.num_warmup_steps, num_training_steps=args.max_train_steps, ) # Prepare everything with our `accelerator`. model, optimizer, train_dataloader, eval_dataloader, lr_scheduler = accelerator.prepare( model, optimizer, train_dataloader, eval_dataloader, lr_scheduler ) # Figure out how many steps we should save the Accelerator states if hasattr(args.checkpointing_steps, "isdigit"): checkpointing_steps = args.checkpointing_steps if args.checkpointing_steps.isdigit(): checkpointing_steps = int(args.checkpointing_steps) else: checkpointing_steps = None # We need to initialize the trackers we use, and also store our configuration if args.with_tracking: experiment_config = vars(args) # TensorBoard cannot log Enums, need the raw value experiment_config["lr_scheduler_type"] = experiment_config["lr_scheduler_type"].value accelerator.init_trackers("clm_no_trainer", experiment_config) # Train! total_batch_size = args.per_device_train_batch_size * accelerator.num_processes * args.gradient_accumulation_steps logger.info("***** Running training *****") logger.info(f" Num examples = {len(train_dataset)}") logger.info(f" Num Epochs = {args.num_train_epochs}") logger.info(f" Instantaneous batch size per device = {args.per_device_train_batch_size}") logger.info(f" Total train batch size (w. 
parallel, distributed & accumulation) = {total_batch_size}") logger.info(f" Gradient Accumulation steps = {args.gradient_accumulation_steps}") logger.info(f" Total optimization steps = {int(args.max_train_steps/accelerator.num_processes)}") # Only show the progress bar once on each machine. progress_bar = tqdm( range(int(args.max_train_steps / accelerator.num_processes)), disable=not accelerator.is_local_main_process ) completed_steps = 0 # Potentially load in the weights and states from a previous save if args.resume_from_checkpoint: if args.resume_from_checkpoint is not None or args.resume_from_checkpoint != "": accelerator.print(f"Resumed from checkpoint: {args.resume_from_checkpoint}") accelerator.load_state(args.resume_from_checkpoint) resume_step = None path = args.resume_from_checkpoint else: # Get the most recent checkpoint dirs = [f.name for f in os.scandir(os.getcwd()) if f.is_dir()] dirs.sort(key=os.path.getctime) path = dirs[-1] # Sorts folders by date modified, most recent checkpoint is the last if "epoch" in path: args.num_train_epochs -= int(path.replace("epoch_", "")) else: resume_step = int(path.replace("step_", "")) args.num_train_epochs -= resume_step // len(train_dataloader) resume_step = (args.num_train_epochs * len(train_dataloader)) - resume_step for epoch in range(args.num_train_epochs): model.train() if args.with_tracking: total_loss = 0 for step, batch in enumerate(train_dataloader): # We need to skip steps until we reach the resumed step if args.resume_from_checkpoint and epoch == 0 and step < resume_step: continue outputs = model(**batch) loss = outputs.loss # We keep track of the loss at each epoch if args.with_tracking: total_loss += loss.detach().float() loss = loss / args.gradient_accumulation_steps accelerator.backward(loss) if step % args.gradient_accumulation_steps == 0 or step == len(train_dataloader) - 1: optimizer.step() lr_scheduler.step() optimizer.zero_grad() progress_bar.update(1) completed_steps += 1 if isinstance(checkpointing_steps, int): if completed_steps % checkpointing_steps == 0: output_dir = f"step_{completed_steps}" if args.output_dir is not None: output_dir = os.path.join(args.output_dir, output_dir) accelerator.save_state(output_dir) if completed_steps >= args.max_train_steps: break model.eval() losses = [] for step, batch in enumerate(eval_dataloader): with torch.no_grad(): outputs = model(**batch) loss = outputs.loss losses.append(accelerator.gather(loss.repeat(args.per_device_eval_batch_size))) losses = torch.cat(losses) losses = losses[: len(eval_dataset)] try: perplexity = math.exp(torch.mean(losses)) except OverflowError: perplexity = float("inf") logger.info(f"epoch {epoch}: perplexity: {perplexity}") if args.with_tracking: accelerator.log( {"perplexity": perplexity, "train_loss": total_loss, "epoch": epoch, "step": completed_steps}, ) if args.push_to_hub and epoch < args.num_train_epochs - 1: accelerator.wait_for_everyone() unwrapped_model = accelerator.unwrap_model(model) unwrapped_model.save_pretrained(args.output_dir, save_function=accelerator.save) if accelerator.is_main_process: tokenizer.save_pretrained(args.output_dir) repo.push_to_hub( commit_message=f"Training in progress epoch {epoch}", blocking=False, auto_lfs_prune=True ) if args.checkpointing_steps == "epoch": output_dir = f"epoch_{epoch}" if args.output_dir is not None: output_dir = os.path.join(args.output_dir, output_dir) accelerator.save_state(output_dir) if args.output_dir is not None: accelerator.wait_for_everyone() unwrapped_model = 
accelerator.unwrap_model(model) unwrapped_model.save_pretrained(args.output_dir, save_function=accelerator.save) if accelerator.is_main_process: tokenizer.save_pretrained(args.output_dir) if args.push_to_hub: repo.push_to_hub(commit_message="End of training", auto_lfs_prune=True) with open(os.path.join(args.output_dir, "all_results.json"), "w") as f: json.dump({"perplexity": perplexity}, f) if __name__ == "__main__": main()
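Two details of the script above are easy to miss when skimming: `group_texts` concatenates the tokenized examples and slices them into `block_size` chunks whose labels are a copy of the inputs (the small remainder is dropped rather than padded), and the evaluation loop reports perplexity as the exponential of the mean eval loss. The minimal sketch below replays both ideas on toy Python values; the example numbers and the block size are illustrative and not taken from the script.

```python
import math
from itertools import chain

# --- Chunking: mirrors the idea of group_texts in the script above ---
# Toy "tokenized batch": values and block size are illustrative only.
examples = {"input_ids": [[1, 2, 3, 4, 5], [6, 7, 8, 9]]}
block_size = 4

# Concatenate all examples, drop the remainder, split into fixed-size blocks,
# and use a copy of the inputs as labels (causal LM objective).
concatenated = {k: list(chain(*v)) for k, v in examples.items()}
total_length = (len(concatenated["input_ids"]) // block_size) * block_size
blocks = {
    k: [t[i : i + block_size] for i in range(0, total_length, block_size)]
    for k, t in concatenated.items()
}
blocks["labels"] = blocks["input_ids"].copy()
print(blocks)
# {'input_ids': [[1, 2, 3, 4], [5, 6, 7, 8]], 'labels': [[1, 2, 3, 4], [5, 6, 7, 8]]}

# --- Evaluation metric: the script reports perplexity = exp(mean eval loss) ---
mean_eval_loss = 3.2  # illustrative value
print(math.exp(mean_eval_loss))  # ~24.5
```

Dropping the remainder keeps every block exactly `block_size` tokens long, which is what allows the default data collator to batch them without any padding.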
5
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/164_ethics-soc-5/why_open.md
# Some Notes on Pros of Open Science and Open Source

- **Pooling Resources**: Building off of one another’s strengths; learning from one another’s failures.
- **Accessibility**: Anyone can use the models, regardless of budget or affiliation.
  - This also helps to ensure diversity of contributors.
- **Lowering Barriers**: You don’t need to have a tech job to explore how AI works.
- **Innovation**: High-value applications are possible for more people to discover and create.
  - Relatedly, advancements in **addressing bias/harms** become more possible.
- **Economic Opportunity**: More access leads to more businesses and jobs.
- **Transparency**: Users and those affected have full visibility on the model and the training data. They can better identify potential biases or errors.
- **Accountability**: Provenance to trace who-did-what; independent auditing possible.
- **Privacy**: Users don't have to send their data to black box APIs.
- **IP protection**: Users train their models on their data, and own them.
- **Freedom of choice**: Users are not locked in. They can switch models anytime.
- **IT flexibility**: Users can train and deploy models anywhere they like.
- **Tailored use**: Users can train/fine-tune for their specific needs.
- **Safety**: More mechanisms available.
- **Speed**: Good ideas can quickly flourish and be built on. Security issues can be quickly addressed.
- **Diversity** of options.

# Cons of Closed Source

- **Centralization** of power.
- **Opacity** of subtle bias/harm issues.
- Hiding **illegal** or problematic data.
- **Bare minimum of legal compliance** as opposed to good practices.
- Fostering **misunderstanding for hype and profit**.
- **Insularity of thinking** creates "groupthink" technology issues (such as harming people with marginalized characteristics).
- **Security issues** not addressed quickly.
- Consumer apps **can’t be flexible** and become dependent on a single model: Consumer apps built on top of closed source must “lock-in” their code based on what an API outputs; as closed source internal models are updated or changed, this can completely break the consumer’s system, or the consumer’s expectations of behavior.

# Common Misunderstandings

## There’s an idea that open source is “less secure”.

- Misses that closed software has security concerns just as dire as (or more dire than) open source.
- Misses the fact that the diversity of options available with open source limits how many people will be affected by a malicious actor.

## There’s an idea that open source will help China to “beat us”.

- Misses that part of why U.S. technology has flourished is due to open science/open source.
- Misses that U.S. dominance is a function of how friendly the U.S. is to companies: there is more to success than the code itself, and given the socioeconomic variables that the U.S. provides, it is particularly well-placed to help open companies flourish.
6
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/introduction-to-ggml/ggml-debug.svg
<svg class="mx-auto" width="266pt" height="142pt" viewBox="0.00 0.00 265.72 142.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"> <g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 138)"> <title>G</title> <polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-138 261.7224,-138 261.7224,4 -4,4"/> <!-- 0x7ff906fb60b0 --> <g id="node1" class="node"> <title>0x7ff906fb60b0</title> <polygon fill="#ffffff" stroke="#000000" points="169.026,-59.1 169.026,-133.5 257.7224,-133.5 257.7224,-59.1 169.026,-59.1"/> <text text-anchor="middle" x="213.3742" y="-116.9" font-family="Times,serif" font-size="14.00" fill="#000000">node_0 (f32)</text> <polyline fill="none" stroke="#000000" points="169.026,-108.7 257.7224,-108.7 "/> <text text-anchor="middle" x="213.3742" y="-92.1" font-family="Times,serif" font-size="14.00" fill="#000000">0 [4, 3]</text> <polyline fill="none" stroke="#000000" points="169.026,-83.9 257.7224,-83.9 "/> <text text-anchor="middle" x="213.3742" y="-67.3" font-family="Times,serif" font-size="14.00" fill="#000000">X*Y</text> </g> <!-- 0x563f17827fe0 --> <g id="node2" class="node"> <title>0x563f17827fe0</title> <polygon fill="#ffc0cb" stroke="#000000" points="0,-69.5 0,-119.1 106.2062,-119.1 106.2062,-69.5 0,-69.5"/> <text text-anchor="middle" x="53.1031" y="-102.5" font-family="Times,serif" font-size="14.00" fill="#000000">leaf_0 (f32)</text> <polyline fill="none" stroke="#000000" points="0,-94.3 106.2062,-94.3 "/> <text text-anchor="middle" x="53.1031" y="-77.7" font-family="Times,serif" font-size="14.00" fill="#000000">CONST 0 [2, 4]</text> </g> <!-- 0x563f17827fe0&#45;&gt;0x7ff906fb60b0 --> <g id="edge1" class="edge"> <title>0x563f17827fe0:x-&gt;0x7ff906fb60b0:x</title> <path fill="none" stroke="#000000" d="M106.1031,-106.3C134.4734,-106.3 137.0176,-79.0225 159.3349,-72.6293"/> <polygon fill="#000000" stroke="#000000" points="169.3742,-71.3 160.0515,-77.0738 164.4175,-71.9564 159.4607,-72.6127 159.4607,-72.6127 159.4607,-72.6127 164.4175,-71.9564 158.87,-68.1517 169.3742,-71.3 169.3742,-71.3"/> <text text-anchor="middle" x="137.6161" y="-104.5" font-family="Times,serif" font-size="14.00" fill="#000000">src 0</text> </g> <!-- 0x563f17828150 --> <g id="node3" class="node"> <title>0x563f17828150</title> <polygon fill="#ffc0cb" stroke="#000000" points="0,-.5 0,-50.1 106.2062,-50.1 106.2062,-.5 0,-.5"/> <text text-anchor="middle" x="53.1031" y="-33.5" font-family="Times,serif" font-size="14.00" fill="#000000">leaf_1 (f32)</text> <polyline fill="none" stroke="#000000" points="0,-25.3 106.2062,-25.3 "/> <text text-anchor="middle" x="53.1031" y="-8.7" font-family="Times,serif" font-size="14.00" fill="#000000">CONST 1 [2, 3]</text> </g> <!-- 0x563f17828150&#45;&gt;0x7ff906fb60b0 --> <g id="edge2" class="edge"> <title>0x563f17828150:x-&gt;0x7ff906fb60b0:x</title> <path fill="none" stroke="#000000" d="M106.1031,-37.3C127.6452,-37.3 132.6375,-44.2784 151.026,-55.5 156.1932,-58.6533 157.9478,-63.2584 160.4093,-66.6842"/> <polygon fill="#000000" stroke="#000000" points="169.3742,-71.3 158.4235,-70.7232 164.9288,-69.0112 160.4834,-66.7223 160.4834,-66.7223 160.4834,-66.7223 164.9288,-69.0112 162.5434,-62.7215 169.3742,-71.3 169.3742,-71.3"/> <text text-anchor="middle" x="137.6161" y="-59.5" font-family="Times,serif" font-size="14.00" fill="#000000">src 1</text> </g> </g> </svg>
7
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/optimum_nvidia/throughput.svg
<svg version="1.1" viewBox="0.0 0.0 600.0 371.0" fill="none" stroke="none" stroke-linecap="square" stroke-miterlimit="10" width="600" height="371" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns="http://www.w3.org/2000/svg"><path fill="#ffffff" d="M0 0L600.0 0L600.0 371.0L0 371.0L0 0Z" fill-rule="nonzero"/><path stroke="#333333" stroke-width="1.0" stroke-linecap="butt" d="M77.5 332.5L581.5 332.5" fill-rule="nonzero"/><path stroke="#cccccc" stroke-width="1.0" stroke-linecap="butt" d="M77.5 258.5L581.5 258.5" fill-rule="nonzero"/><path stroke="#cccccc" stroke-width="1.0" stroke-linecap="butt" d="M77.5 185.5L581.5 185.5" fill-rule="nonzero"/><path stroke="#cccccc" stroke-width="1.0" stroke-linecap="butt" d="M77.5 111.5L581.5 111.5" fill-rule="nonzero"/><clipPath id="id_0"><path d="M77.55 111.78333L581.45 111.78333L581.45 332.45L77.55 332.45L77.55 111.78333Z" clip-rule="nonzero"/></clipPath><path stroke="#000000" stroke-width="2.0" stroke-linecap="butt" stroke-opacity="0.0" clip-path="url(#id_0)" d="M178.0 332.0L131.0 332.0L131.0 315.0C131.0 313.89542 131.89543 313.0 133.0 313.0L176.0 313.0C177.10457 313.0 178.0 313.89542 178.0 315.0Z" fill-rule="nonzero"/><path fill="#85b737" clip-path="url(#id_0)" d="M178.0 332.0L131.0 332.0L131.0 315.0C131.0 313.89542 131.89543 313.0 133.0 313.0L176.0 313.0C177.10457 313.0 178.0 313.89542 178.0 315.0Z" fill-rule="nonzero"/><path stroke="#000000" stroke-width="2.0" stroke-linecap="butt" stroke-opacity="0.0" clip-path="url(#id_0)" d="M329.0 332.0L282.0 332.0L282.0 278.0C282.0 276.89542 282.89542 276.0 284.0 276.0L327.0 276.0C328.10458 276.0 329.0 276.89542 329.0 278.0Z" fill-rule="nonzero"/><path fill="#85b737" clip-path="url(#id_0)" d="M329.0 332.0L282.0 332.0L282.0 278.0C282.0 276.89542 282.89542 276.0 284.0 276.0L327.0 276.0C328.10458 276.0 329.0 276.89542 329.0 278.0Z" fill-rule="nonzero"/><path stroke="#000000" stroke-width="2.0" stroke-linecap="butt" stroke-opacity="0.0" clip-path="url(#id_0)" d="M479.0 332.0L432.0 332.0L432.0 130.0C432.0 128.89543 432.89542 128.0 434.0 128.0L477.0 128.0C478.10458 128.0 479.0 128.89543 479.0 130.0Z" fill-rule="nonzero"/><path fill="#85b737" clip-path="url(#id_0)" d="M479.0 332.0L432.0 332.0L432.0 130.0C432.0 128.89543 432.89542 128.0 434.0 128.0L477.0 128.0C478.10458 128.0 479.0 128.89543 479.0 130.0Z" fill-rule="nonzero"/><path stroke="#000000" stroke-width="2.0" stroke-linecap="butt" stroke-opacity="0.0" clip-path="url(#id_0)" d="M378.0 332.0L331.0 332.0L331.0 280.0C331.0 278.89542 331.89542 278.0 333.0 278.0L376.0 278.0C377.10458 278.0 378.0 278.89542 378.0 280.0Z" fill-rule="nonzero"/><path fill="#3d5314" clip-path="url(#id_0)" d="M378.0 332.0L331.0 332.0L331.0 280.0C331.0 278.89542 331.89542 278.0 333.0 278.0L376.0 278.0C377.10458 278.0 378.0 278.89542 378.0 280.0Z" fill-rule="nonzero"/><path stroke="#000000" stroke-width="2.0" stroke-linecap="butt" stroke-opacity="0.0" clip-path="url(#id_0)" d="M528.0 332.0L481.0 332.0L481.0 179.0C481.0 177.89543 481.89542 177.0 483.0 177.0L526.0 177.0C527.10455 177.0 528.0 177.89543 528.0 179.0Z" fill-rule="nonzero"/><path fill="#3d5314" clip-path="url(#id_0)" d="M528.0 332.0L481.0 332.0L481.0 179.0C481.0 177.89543 481.89542 177.0 483.0 177.0L526.0 177.0C527.10455 177.0 528.0 177.89543 528.0 179.0Z" fill-rule="nonzero"/><path stroke="#85b737" stroke-width="3.0" stroke-linejoin="round" stroke-linecap="round" d="M149.29688 327.281L143.70312 327.281L143.70312 326.49976L146.65625 323.2185Q147.3125 322.4685 147.5625 322.01538Q147.8125 321.54663 147.8125 321.04663Q147.8125 320.37476 
147.40625 319.95288Q147.01562 319.51538 146.32812 319.51538Q145.53125 319.51538 145.07812 319.98413Q144.625 320.43726 144.625 321.26538L143.54688 321.26538Q143.54688 320.07788 144.29688 319.35913Q145.0625 318.62476 146.32812 318.62476Q147.53125 318.62476 148.21875 319.24976Q148.90625 319.87476 148.90625 320.92163Q148.90625 322.17163 147.29688 323.92163L145.01562 326.39038L149.29688 326.39038L149.29688 327.281ZM150.84375 326.7185Q150.84375 326.43726 151.0 326.24976Q151.17188 326.06226 151.5 326.06226Q151.84375 326.06226 152.01562 326.24976Q152.1875 326.43726 152.1875 326.7185Q152.1875 326.98413 152.01562 327.17163Q151.84375 327.3435 151.5 327.3435Q151.17188 327.3435 151.0 327.17163Q150.84375 326.98413 150.84375 326.7185ZM157.96875 318.74976L157.96875 319.656L157.76562 319.656Q156.5 319.68726 155.75 320.42163Q155.0 321.14038 154.875 322.45288Q155.5625 321.68726 156.71875 321.68726Q157.82812 321.68726 158.5 322.4685Q159.17188 323.24976 159.17188 324.49976Q159.17188 325.81226 158.45312 326.60913Q157.73438 327.406 156.51562 327.406Q155.29688 327.406 154.53125 326.4685Q153.78125 325.51538 153.78125 324.031L153.78125 323.62476Q153.78125 321.26538 154.78125 320.01538Q155.78125 318.76538 157.76562 318.74976L157.96875 318.74976ZM156.54688 322.5935Q155.98438 322.5935 155.51562 322.92163Q155.04688 323.24976 154.85938 323.76538L154.85938 324.156Q154.85938 325.2185 155.32812 325.85913Q155.8125 326.49976 156.51562 326.49976Q157.25 326.49976 157.67188 325.9685Q158.09375 325.42163 158.09375 324.54663Q158.09375 323.67163 157.67188 323.14038Q157.25 322.5935 156.54688 322.5935ZM162.95312 323.24976L164.35938 320.93726L165.625 320.93726L163.54688 324.07788L165.6875 327.281L164.4375 327.281L162.96875 324.906L161.5 327.281L160.23438 327.281L162.375 324.07788L160.3125 320.93726L161.5625 320.93726L162.95312 323.24976Z" fill-rule="nonzero"/><path fill="#000000" d="M149.29688 327.281L143.70312 327.281L143.70312 326.49976L146.65625 323.2185Q147.3125 322.4685 147.5625 322.01538Q147.8125 321.54663 147.8125 321.04663Q147.8125 320.37476 147.40625 319.95288Q147.01562 319.51538 146.32812 319.51538Q145.53125 319.51538 145.07812 319.98413Q144.625 320.43726 144.625 321.26538L143.54688 321.26538Q143.54688 320.07788 144.29688 319.35913Q145.0625 318.62476 146.32812 318.62476Q147.53125 318.62476 148.21875 319.24976Q148.90625 319.87476 148.90625 320.92163Q148.90625 322.17163 147.29688 323.92163L145.01562 326.39038L149.29688 326.39038L149.29688 327.281ZM150.84375 326.7185Q150.84375 326.43726 151.0 326.24976Q151.17188 326.06226 151.5 326.06226Q151.84375 326.06226 152.01562 326.24976Q152.1875 326.43726 152.1875 326.7185Q152.1875 326.98413 152.01562 327.17163Q151.84375 327.3435 151.5 327.3435Q151.17188 327.3435 151.0 327.17163Q150.84375 326.98413 150.84375 326.7185ZM157.96875 318.74976L157.96875 319.656L157.76562 319.656Q156.5 319.68726 155.75 320.42163Q155.0 321.14038 154.875 322.45288Q155.5625 321.68726 156.71875 321.68726Q157.82812 321.68726 158.5 322.4685Q159.17188 323.24976 159.17188 324.49976Q159.17188 325.81226 158.45312 326.60913Q157.73438 327.406 156.51562 327.406Q155.29688 327.406 154.53125 326.4685Q153.78125 325.51538 153.78125 324.031L153.78125 323.62476Q153.78125 321.26538 154.78125 320.01538Q155.78125 318.76538 157.76562 318.74976L157.96875 318.74976ZM156.54688 322.5935Q155.98438 322.5935 155.51562 322.92163Q155.04688 323.24976 154.85938 323.76538L154.85938 324.156Q154.85938 325.2185 155.32812 325.85913Q155.8125 326.49976 156.51562 326.49976Q157.25 326.49976 157.67188 325.9685Q158.09375 325.42163 158.09375 
324.54663Q158.09375 323.67163 157.67188 323.14038Q157.25 322.5935 156.54688 322.5935ZM162.95312 323.24976L164.35938 320.93726L165.625 320.93726L163.54688 324.07788L165.6875 327.281L164.4375 327.281L162.96875 324.906L161.5 327.281L160.23438 327.281L162.375 324.07788L160.3125 320.93726L161.5625 320.93726L162.95312 323.24976Z" fill-rule="nonzero"/><path stroke="#85b737" stroke-width="3.0" stroke-linejoin="round" stroke-linecap="round" d="M300.21875 285.0247L296.6875 292.94656L295.54688 292.94656L299.0625 285.30594L294.45312 285.30594L294.45312 284.4153L300.21875 284.4153L300.21875 285.0247ZM301.84375 292.38406Q301.84375 292.1028 302.0 291.9153Q302.17188 291.7278 302.5 291.7278Q302.84375 291.7278 303.01562 291.9153Q303.1875 292.1028 303.1875 292.38406Q303.1875 292.6497 303.01562 292.8372Q302.84375 293.00906 302.5 293.00906Q302.17188 293.00906 302.0 292.8372Q301.84375 292.6497 301.84375 292.38406ZM310.21875 285.0247L306.6875 292.94656L305.54688 292.94656L309.0625 285.30594L304.45312 285.30594L304.45312 284.4153L310.21875 284.4153L310.21875 285.0247ZM313.95312 288.9153L315.35938 286.6028L316.625 286.6028L314.54688 289.74344L316.6875 292.94656L315.4375 292.94656L313.96875 290.57156L312.5 292.94656L311.23438 292.94656L313.375 289.74344L311.3125 286.6028L312.5625 286.6028L313.95312 288.9153Z" fill-rule="nonzero"/><path fill="#000000" d="M300.21875 285.0247L296.6875 292.94656L295.54688 292.94656L299.0625 285.30594L294.45312 285.30594L294.45312 284.4153L300.21875 284.4153L300.21875 285.0247ZM301.84375 292.38406Q301.84375 292.1028 302.0 291.9153Q302.17188 291.7278 302.5 291.7278Q302.84375 291.7278 303.01562 291.9153Q303.1875 292.1028 303.1875 292.38406Q303.1875 292.6497 303.01562 292.8372Q302.84375 293.00906 302.5 293.00906Q302.17188 293.00906 302.0 292.8372Q301.84375 292.6497 301.84375 292.38406ZM310.21875 285.0247L306.6875 292.94656L305.54688 292.94656L309.0625 285.30594L304.45312 285.30594L304.45312 284.4153L310.21875 284.4153L310.21875 285.0247ZM313.95312 288.9153L315.35938 286.6028L316.625 286.6028L314.54688 289.74344L316.6875 292.94656L315.4375 292.94656L313.96875 290.57156L312.5 292.94656L311.23438 292.94656L313.375 289.74344L311.3125 286.6028L312.5625 286.6028L313.95312 288.9153Z" fill-rule="nonzero"/><path stroke="#85b737" stroke-width="3.0" stroke-linejoin="round" stroke-linecap="round" d="M446.79688 144.49187L441.20312 144.49187L441.20312 143.71062L444.15625 140.42937Q444.8125 139.67937 445.0625 139.22624Q445.3125 138.75749 445.3125 138.25749Q445.3125 137.58562 444.90625 137.16374Q444.51562 136.72624 443.82812 136.72624Q443.03125 136.72624 442.57812 137.19499Q442.125 137.64812 442.125 138.47624L441.04688 138.47624Q441.04688 137.28874 441.79688 136.56999Q442.5625 135.83562 443.82812 135.83562Q445.03125 135.83562 445.71875 136.46062Q446.40625 137.08562 446.40625 138.13249Q446.40625 139.38249 444.79688 141.13249L442.51562 143.60124L446.79688 143.60124L446.79688 144.49187ZM453.71875 136.56999L450.1875 144.49187L449.04688 144.49187L452.5625 136.85124L447.95312 136.85124L447.95312 135.96062L453.71875 135.96062L453.71875 136.56999ZM455.34375 143.92937Q455.34375 143.64812 455.5 143.46062Q455.67188 143.27312 456.0 143.27312Q456.34375 143.27312 456.51562 143.46062Q456.6875 143.64812 456.6875 143.92937Q456.6875 144.19499 456.51562 144.38249Q456.34375 144.55437 456.0 144.55437Q455.67188 144.55437 455.5 144.38249Q455.34375 144.19499 455.34375 143.92937ZM463.39062 138.17937Q463.39062 138.81999 463.04688 139.31999Q462.70312 139.81999 462.14062 140.10124Q462.79688 140.38249 463.1875 140.94499Q463.57812 
141.49187 463.57812 142.19499Q463.57812 143.28874 462.82812 143.96062Q462.09375 144.61687 460.875 144.61687Q459.64062 144.61687 458.89062 143.96062Q458.15625 143.28874 458.15625 142.19499Q458.15625 141.49187 458.53125 140.94499Q458.90625 140.38249 459.57812 140.10124Q459.01562 139.81999 458.6875 139.31999Q458.35938 138.81999 458.35938 138.17937Q458.35938 137.10124 459.04688 136.47624Q459.73438 135.83562 460.875 135.83562Q462.0 135.83562 462.6875 136.47624Q463.39062 137.10124 463.39062 138.17937ZM462.5 142.16374Q462.5 141.44499 462.04688 141.00749Q461.59375 140.55437 460.85938 140.55437Q460.125 140.55437 459.67188 141.00749Q459.23438 141.44499 459.23438 142.17937Q459.23438 142.89812 459.67188 143.31999Q460.10938 143.72624 460.875 143.72624Q461.625 143.72624 462.0625 143.30437Q462.5 142.88249 462.5 142.16374ZM460.875 136.72624Q460.23438 136.72624 459.82812 137.13249Q459.4375 137.52312 459.4375 138.21062Q459.4375 138.85124 459.82812 139.25749Q460.21875 139.66374 460.85938 139.66374Q461.51562 139.66374 461.90625 139.25749Q462.29688 138.85124 462.29688 138.21062Q462.29688 137.55437 461.89062 137.14812Q461.48438 136.72624 460.875 136.72624ZM467.45312 140.46062L468.85938 138.14812L470.125 138.14812L468.04688 141.28874L470.1875 144.49187L468.9375 144.49187L467.46875 142.11687L466.0 144.49187L464.73438 144.49187L466.875 141.28874L464.8125 138.14812L466.0625 138.14812L467.45312 140.46062Z" fill-rule="nonzero"/><path fill="#000000" d="M446.79688 144.49187L441.20312 144.49187L441.20312 143.71062L444.15625 140.42937Q444.8125 139.67937 445.0625 139.22624Q445.3125 138.75749 445.3125 138.25749Q445.3125 137.58562 444.90625 137.16374Q444.51562 136.72624 443.82812 136.72624Q443.03125 136.72624 442.57812 137.19499Q442.125 137.64812 442.125 138.47624L441.04688 138.47624Q441.04688 137.28874 441.79688 136.56999Q442.5625 135.83562 443.82812 135.83562Q445.03125 135.83562 445.71875 136.46062Q446.40625 137.08562 446.40625 138.13249Q446.40625 139.38249 444.79688 141.13249L442.51562 143.60124L446.79688 143.60124L446.79688 144.49187ZM453.71875 136.56999L450.1875 144.49187L449.04688 144.49187L452.5625 136.85124L447.95312 136.85124L447.95312 135.96062L453.71875 135.96062L453.71875 136.56999ZM455.34375 143.92937Q455.34375 143.64812 455.5 143.46062Q455.67188 143.27312 456.0 143.27312Q456.34375 143.27312 456.51562 143.46062Q456.6875 143.64812 456.6875 143.92937Q456.6875 144.19499 456.51562 144.38249Q456.34375 144.55437 456.0 144.55437Q455.67188 144.55437 455.5 144.38249Q455.34375 144.19499 455.34375 143.92937ZM463.39062 138.17937Q463.39062 138.81999 463.04688 139.31999Q462.70312 139.81999 462.14062 140.10124Q462.79688 140.38249 463.1875 140.94499Q463.57812 141.49187 463.57812 142.19499Q463.57812 143.28874 462.82812 143.96062Q462.09375 144.61687 460.875 144.61687Q459.64062 144.61687 458.89062 143.96062Q458.15625 143.28874 458.15625 142.19499Q458.15625 141.49187 458.53125 140.94499Q458.90625 140.38249 459.57812 140.10124Q459.01562 139.81999 458.6875 139.31999Q458.35938 138.81999 458.35938 138.17937Q458.35938 137.10124 459.04688 136.47624Q459.73438 135.83562 460.875 135.83562Q462.0 135.83562 462.6875 136.47624Q463.39062 137.10124 463.39062 138.17937ZM462.5 142.16374Q462.5 141.44499 462.04688 141.00749Q461.59375 140.55437 460.85938 140.55437Q460.125 140.55437 459.67188 141.00749Q459.23438 141.44499 459.23438 142.17937Q459.23438 142.89812 459.67188 143.31999Q460.10938 143.72624 460.875 143.72624Q461.625 143.72624 462.0625 143.30437Q462.5 142.88249 462.5 142.16374ZM460.875 136.72624Q460.23438 136.72624 459.82812 
137.13249Q459.4375 137.52312 459.4375 138.21062Q459.4375 138.85124 459.82812 139.25749Q460.21875 139.66374 460.85938 139.66374Q461.51562 139.66374 461.90625 139.25749Q462.29688 138.85124 462.29688 138.21062Q462.29688 137.55437 461.89062 137.14812Q461.48438 136.72624 460.875 136.72624ZM467.45312 140.46062L468.85938 138.14812L470.125 138.14812L468.04688 141.28874L470.1875 144.49187L468.9375 144.49187L467.46875 142.11687L466.0 144.49187L464.73438 144.49187L466.875 141.28874L464.8125 138.14812L466.0625 138.14812L467.45312 140.46062Z" fill-rule="nonzero"/><path stroke="#3d5314" stroke-width="3.0" stroke-linejoin="round" stroke-linecap="round" d="M349.21875 286.51736L345.6875 294.43924L344.54688 294.43924L348.0625 286.7986L343.45312 286.7986L343.45312 285.908L349.21875 285.908L349.21875 286.51736ZM350.84375 293.87674Q350.84375 293.5955 351.0 293.408Q351.17188 293.2205 351.5 293.2205Q351.84375 293.2205 352.01562 293.408Q352.1875 293.5955 352.1875 293.87674Q352.1875 294.14236 352.01562 294.32986Q351.84375 294.50174 351.5 294.50174Q351.17188 294.50174 351.0 294.32986Q350.84375 294.14236 350.84375 293.87674ZM354.20312 290.158L354.64062 285.908L359.01562 285.908L359.01562 286.908L355.5625 286.908L355.29688 289.2361Q355.9375 288.8611 356.73438 288.8611Q357.89062 288.8611 358.57812 289.64236Q359.26562 290.408 359.26562 291.7205Q359.26562 293.033 358.54688 293.7986Q357.84375 294.56424 356.5625 294.56424Q355.4375 294.56424 354.71875 293.93924Q354.0 293.2986 353.90625 292.18924L354.92188 292.18924Q355.03125 292.9236 355.45312 293.2986Q355.875 293.6736 356.5625 293.6736Q357.3125 293.6736 357.75 293.158Q358.1875 292.64236 358.1875 291.7361Q358.1875 290.87674 357.71875 290.3611Q357.25 289.8455 356.46875 289.8455Q355.76562 289.8455 355.35938 290.14236L355.07812 290.37674L354.20312 290.158ZM362.95312 290.408L364.35938 288.0955L365.625 288.0955L363.54688 291.2361L365.6875 294.43924L364.4375 294.43924L362.96875 292.06424L361.5 294.43924L360.23438 294.43924L362.375 291.2361L360.3125 288.0955L361.5625 288.0955L362.95312 290.408Z" fill-rule="nonzero"/><path fill="#ffffff" d="M349.21875 286.51736L345.6875 294.43924L344.54688 294.43924L348.0625 286.7986L343.45312 286.7986L343.45312 285.908L349.21875 285.908L349.21875 286.51736ZM350.84375 293.87674Q350.84375 293.5955 351.0 293.408Q351.17188 293.2205 351.5 293.2205Q351.84375 293.2205 352.01562 293.408Q352.1875 293.5955 352.1875 293.87674Q352.1875 294.14236 352.01562 294.32986Q351.84375 294.50174 351.5 294.50174Q351.17188 294.50174 351.0 294.32986Q350.84375 294.14236 350.84375 293.87674ZM354.20312 290.158L354.64062 285.908L359.01562 285.908L359.01562 286.908L355.5625 286.908L355.29688 289.2361Q355.9375 288.8611 356.73438 288.8611Q357.89062 288.8611 358.57812 289.64236Q359.26562 290.408 359.26562 291.7205Q359.26562 293.033 358.54688 293.7986Q357.84375 294.56424 356.5625 294.56424Q355.4375 294.56424 354.71875 293.93924Q354.0 293.2986 353.90625 292.18924L354.92188 292.18924Q355.03125 292.9236 355.45312 293.2986Q355.875 293.6736 356.5625 293.6736Q357.3125 293.6736 357.75 293.158Q358.1875 292.64236 358.1875 291.7361Q358.1875 290.87674 357.71875 290.3611Q357.25 289.8455 356.46875 289.8455Q355.76562 289.8455 355.35938 290.14236L355.07812 290.37674L354.20312 290.158ZM362.95312 290.408L364.35938 288.0955L365.625 288.0955L363.54688 291.2361L365.6875 294.43924L364.4375 294.43924L362.96875 292.06424L361.5 294.43924L360.23438 294.43924L362.375 291.2361L360.3125 288.0955L361.5625 288.0955L362.95312 290.408Z" fill-rule="nonzero"/><path stroke="#3d5314" stroke-width="3.0" 
stroke-linejoin="round" stroke-linecap="round" d="M495.79688 193.9427L490.20312 193.9427L490.20312 193.16145L493.15625 189.8802Q493.8125 189.1302 494.0625 188.67708Q494.3125 188.20833 494.3125 187.70833Q494.3125 187.03645 493.90625 186.61458Q493.51562 186.17708 492.82812 186.17708Q492.03125 186.17708 491.57812 186.64583Q491.125 187.09895 491.125 187.92708L490.04688 187.92708Q490.04688 186.73958 490.79688 186.02083Q491.5625 185.28645 492.82812 185.28645Q494.03125 185.28645 494.71875 185.91145Q495.40625 186.53645 495.40625 187.58333Q495.40625 188.83333 493.79688 190.58333L491.51562 193.05208L495.79688 193.05208L495.79688 193.9427ZM500.76562 193.9427L499.6875 193.9427L499.6875 186.72395L497.5 187.52083L497.5 186.53645L500.60938 185.36458L500.76562 185.36458L500.76562 193.9427ZM504.34375 193.3802Q504.34375 193.09895 504.5 192.91145Q504.67188 192.72395 505.0 192.72395Q505.34375 192.72395 505.51562 192.91145Q505.6875 193.09895 505.6875 193.3802Q505.6875 193.64583 505.51562 193.83333Q505.34375 194.0052 505.0 194.0052Q504.67188 194.0052 504.5 193.83333Q504.34375 193.64583 504.34375 193.3802ZM510.76562 193.9427L509.6875 193.9427L509.6875 186.72395L507.5 187.52083L507.5 186.53645L510.60938 185.36458L510.76562 185.36458L510.76562 193.9427ZM516.4531 189.91145L517.8594 187.59895L519.125 187.59895L517.0469 190.73958L519.1875 193.9427L517.9375 193.9427L516.46875 191.5677L515.0 193.9427L513.7344 193.9427L515.875 190.73958L513.8125 187.59895L515.0625 187.59895L516.4531 189.91145Z" fill-rule="nonzero"/><path fill="#ffffff" d="M495.79688 193.9427L490.20312 193.9427L490.20312 193.16145L493.15625 189.8802Q493.8125 189.1302 494.0625 188.67708Q494.3125 188.20833 494.3125 187.70833Q494.3125 187.03645 493.90625 186.61458Q493.51562 186.17708 492.82812 186.17708Q492.03125 186.17708 491.57812 186.64583Q491.125 187.09895 491.125 187.92708L490.04688 187.92708Q490.04688 186.73958 490.79688 186.02083Q491.5625 185.28645 492.82812 185.28645Q494.03125 185.28645 494.71875 185.91145Q495.40625 186.53645 495.40625 187.58333Q495.40625 188.83333 493.79688 190.58333L491.51562 193.05208L495.79688 193.05208L495.79688 193.9427ZM500.76562 193.9427L499.6875 193.9427L499.6875 186.72395L497.5 187.52083L497.5 186.53645L500.60938 185.36458L500.76562 185.36458L500.76562 193.9427ZM504.34375 193.3802Q504.34375 193.09895 504.5 192.91145Q504.67188 192.72395 505.0 192.72395Q505.34375 192.72395 505.51562 192.91145Q505.6875 193.09895 505.6875 193.3802Q505.6875 193.64583 505.51562 193.83333Q505.34375 194.0052 505.0 194.0052Q504.67188 194.0052 504.5 193.83333Q504.34375 193.64583 504.34375 193.3802ZM510.76562 193.9427L509.6875 193.9427L509.6875 186.72395L507.5 187.52083L507.5 186.53645L510.60938 185.36458L510.76562 185.36458L510.76562 193.9427ZM516.4531 189.91145L517.8594 187.59895L519.125 187.59895L517.0469 190.73958L519.1875 193.9427L517.9375 193.9427L516.46875 191.5677L515.0 193.9427L513.7344 193.9427L515.875 190.73958L513.8125 187.59895L515.0625 187.59895L516.4531 189.91145Z" fill-rule="nonzero"/><path fill="#000000" d="M26.784388 319.5698L26.690638 318.5073Q27.346888 318.42917 27.753138 318.14792Q28.159388 317.86667 28.425013 317.28854Q28.675013 316.6948 28.675013 315.96042Q28.675013 315.3198 28.487513 314.8198Q28.284388 314.3198 27.956263 314.08542Q27.612513 313.83542 27.221888 313.83542Q26.815638 313.83542 26.518763 314.0698Q26.206263 314.30417 26.003138 314.83542Q25.862513 315.1948 25.581263 316.3823Q25.300013 317.55417 25.050013 318.02292Q24.721888 318.64792 24.253138 318.9448Q23.768763 319.24167 23.175013 319.24167Q22.534388 319.24167 
..." fill-rule="nonzero"/><!-- remaining chart text (axis labels, tick values, and the legend entries for the #85b737 and #3d5314 bar series) is rendered as vectorized glyph path data with no selectable text --></svg>
8
0
hf_public_repos/blog/assets
hf_public_repos/blog/assets/optimum_nvidia/first_token_latency.svg
[SVG figure: "first token latency" bar chart (600 x 371) with five bars in two series (light green #85b737 and dark green #3d5314), horizontal gridlines, a two-entry legend, and axis and value labels rendered as vector path outlines rather than selectable text; the original markup is truncated mid-path and its label text is not recoverable.]
84.86979Q403.34375 85.27604 403.34375 86.010414L403.34375 89.416664L402.29688 89.416664ZM409.54688 89.416664L409.54688 84.010414L408.60938 84.010414L408.60938 83.197914L409.54688 83.197914L409.54688 82.52604Q409.54688 81.90104 409.65625 81.604164Q409.8125 81.18229 410.1875 80.93229Q410.57812 80.68229 411.26562 80.68229Q411.71875 80.68229 412.25 80.791664L412.09375 81.697914Q411.76562 81.65104 411.46875 81.65104Q410.98438 81.65104 410.78125 81.854164Q410.59375 82.05729 410.59375 82.61979L410.59375 83.197914L411.8125 83.197914L411.8125 84.010414L410.59375 84.010414L410.59375 89.416664L409.54688 89.416664Z" fill-rule="nonzero"/><path fill="#999999" d="M19.58125 63.05L19.58125 53.034374L23.33125 53.034374Q24.4875 53.034374 25.175 53.346874Q25.878124 53.64375 26.26875 54.26875Q26.659374 54.89375 26.659374 55.596874Q26.659374 56.2375 26.3 56.8Q25.95625 57.3625 25.253124 57.721874Q26.159374 57.9875 26.64375 58.628124Q27.14375 59.26875 27.14375 60.14375Q27.14375 60.846874 26.846874 61.45625Q26.55 62.065624 26.1125 62.39375Q25.675 62.721874 25.003124 62.89375Q24.346874 63.05 23.39375 63.05L19.58125 63.05ZM20.909374 57.2375L23.065624 57.2375Q23.95625 57.2375 24.33125 57.128124Q24.846874 56.971874 25.096874 56.628124Q25.346874 56.26875 25.346874 55.753124Q25.346874 55.253124 25.1125 54.878124Q24.878124 54.4875 24.425 54.346874Q23.9875 54.20625 22.909374 54.20625L20.909374 54.20625L20.909374 57.2375ZM20.909374 61.8625L23.39375 61.8625Q24.034374 61.8625 24.3 61.815624Q24.753124 61.7375 25.065624 61.55Q25.378124 61.346874 25.565624 60.9875Q25.76875 60.628124 25.76875 60.14375Q25.76875 59.58125 25.471874 59.175Q25.190624 58.753124 24.675 58.596874Q24.175 58.425 23.221874 58.425L20.909374 58.425L20.909374 61.8625ZM33.20625 62.159374Q32.534374 62.7375 31.89375 62.9875Q31.26875 63.221874 30.534374 63.221874Q29.346874 63.221874 28.690624 62.64375Q28.05 62.05 28.05 61.128124Q28.05 60.596874 28.284374 60.159374Q28.534374 59.721874 28.925 59.45625Q29.33125 59.190624 29.815624 59.05Q30.175 58.95625 30.909374 58.8625Q32.409374 58.690624 33.1125 58.440624Q33.1125 58.190624 33.1125 58.128124Q33.1125 57.3625 32.76875 57.065624Q32.3 56.64375 31.3625 56.64375Q30.503124 56.64375 30.08125 56.95625Q29.675 57.253124 29.471874 58.01875L28.26875 57.8625Q28.425 57.08125 28.8 56.6125Q29.190624 56.14375 29.89375 55.89375Q30.6125 55.628124 31.55 55.628124Q32.471874 55.628124 33.05 55.846874Q33.628124 56.065624 33.89375 56.39375Q34.175 56.721874 34.284374 57.2375Q34.346874 57.55 34.346874 58.3625L34.346874 60.003124Q34.346874 61.721874 34.425 62.175Q34.503124 62.628124 34.7375 63.05L33.45625 63.05Q33.26875 62.659374 33.20625 62.159374ZM33.1125 59.409374Q32.440624 59.675 31.096874 59.878124Q30.346874 59.9875 30.01875 60.128124Q29.70625 60.253124 29.534374 60.51875Q29.3625 60.784374 29.3625 61.096874Q29.3625 61.596874 29.7375 61.925Q30.1125 62.253124 30.83125 62.253124Q31.55 62.253124 32.096874 61.940624Q32.64375 61.628124 32.909374 61.08125Q33.1125 60.675 33.1125 59.8625L33.1125 59.409374ZM39.159374 61.95625L39.33125 63.034374Q38.815624 63.14375 38.409374 63.14375Q37.7375 63.14375 37.3625 62.940624Q37.003124 62.721874 36.846874 62.378124Q36.70625 62.034374 36.70625 60.925L36.70625 56.753124L35.8 56.753124L35.8 55.784374L36.70625 55.784374L36.70625 53.9875L37.925 53.253124L37.925 55.784374L39.159374 55.784374L39.159374 56.753124L37.925 56.753124L37.925 60.9875Q37.925 61.51875 37.9875 61.675Q38.05 61.815624 38.190624 61.909374Q38.346874 62.003124 38.6125 62.003124Q38.83125 62.003124 39.159374 61.95625ZM45.20625 60.39375L46.425 
60.55Q46.221874 61.8 45.39375 62.51875Q44.58125 63.221874 43.39375 63.221874Q41.909374 63.221874 41.003124 62.253124Q40.096874 61.26875 40.096874 59.440624Q40.096874 58.26875 40.4875 57.39375Q40.878124 56.503124 41.675 56.065624Q42.471874 55.628124 43.409374 55.628124Q44.58125 55.628124 45.33125 56.221874Q46.096874 56.815624 46.3 57.925L45.1125 58.1125Q44.940624 57.378124 44.503124 57.01875Q44.065624 56.64375 43.45625 56.64375Q42.51875 56.64375 41.940624 57.315624Q41.3625 57.971874 41.3625 59.409374Q41.3625 60.878124 41.925 61.55Q42.4875 62.20625 43.378124 62.20625Q44.1125 62.20625 44.596874 61.76875Q45.08125 61.315624 45.20625 60.39375ZM47.471874 63.05L47.471874 53.034374L48.70625 53.034374L48.70625 56.628124Q49.565624 55.628124 50.878124 55.628124Q51.690624 55.628124 52.284374 55.95625Q52.878124 56.26875 53.128124 56.83125Q53.39375 57.378124 53.39375 58.45625L53.39375 63.05L52.159374 63.05L52.159374 58.45625Q52.159374 57.51875 51.753124 57.1125Q51.3625 56.690624 50.628124 56.690624Q50.08125 56.690624 49.596874 56.971874Q49.1125 57.253124 48.909374 57.7375Q48.70625 58.221874 48.70625 59.08125L48.70625 63.05L47.471874 63.05ZM59.175 59.83125L60.425 59.721874Q60.51875 60.471874 60.846874 60.95625Q61.175 61.440624 61.846874 61.7375Q62.534374 62.034374 63.39375 62.034374Q64.14375 62.034374 64.72188 61.815624Q65.31563 61.58125 65.59688 61.190624Q65.87813 60.8 65.87813 60.33125Q65.87813 59.8625 65.59688 59.51875Q65.33125 59.159374 64.70625 58.909374Q64.3 58.753124 62.909374 58.425Q61.534374 58.096874 60.9875 57.8Q60.26875 57.425 59.909374 56.878124Q59.565624 56.315624 59.565624 55.6125Q59.565624 54.8625 59.9875 54.20625Q60.425 53.534374 61.253124 53.20625Q62.08125 52.8625 63.096874 52.8625Q64.20625 52.8625 65.05 53.221874Q65.90938 53.58125 66.3625 54.284374Q66.81563 54.971874 66.8625 55.846874L65.58125 55.940624Q65.4875 55.003124 64.89375 54.51875Q64.3 54.01875 63.14375 54.01875Q61.940624 54.01875 61.39375 54.471874Q60.846874 54.909374 60.846874 55.534374Q60.846874 56.065624 61.2375 56.425Q61.6125 56.76875 63.221874 57.14375Q64.84688 57.503124 65.44063 57.76875Q66.31563 58.175 66.7375 58.8Q67.15938 59.409374 67.15938 60.221874Q67.15938 61.01875 66.69063 61.7375Q66.2375 62.440624 65.37813 62.83125Q64.51875 63.221874 63.45625 63.221874Q62.096874 63.221874 61.175 62.83125Q60.253124 62.425 59.721874 61.628124Q59.20625 60.83125 59.175 59.83125ZM68.4875 54.440624L68.4875 53.034374L69.70625 53.034374L69.70625 54.440624L68.4875 54.440624ZM68.4875 63.05L68.4875 55.784374L69.70625 55.784374L69.70625 63.05L68.4875 63.05ZM70.83125 63.05L70.83125 62.05L75.44063 56.753124Q74.65938 56.784374 74.05 56.784374L71.09688 56.784374L71.09688 55.784374L77.03438 55.784374L77.03438 56.596874L73.09688 61.20625L72.34688 62.05Q73.175 61.9875 73.89375 61.9875L77.25313 61.9875L77.25313 63.05L70.83125 63.05ZM82.44063 60.70625L83.72188 60.8625Q83.40938 61.9875 82.59688 62.6125Q81.78438 63.221874 80.51875 63.221874Q78.925 63.221874 77.9875 62.2375Q77.06563 61.253124 77.06563 59.4875Q77.06563 57.64375 78.00313 56.64375Q78.95625 55.628124 80.45625 55.628124Q81.90938 55.628124 82.83125 56.628124Q83.75313 57.6125 83.75313 59.409374Q83.75313 59.51875 83.75313 59.7375L78.33125 59.7375Q78.40938 60.925 79.00313 61.565624Q79.6125 62.20625 80.53438 62.20625Q81.20625 62.20625 81.675 61.846874Q82.15938 61.4875 82.44063 60.70625ZM78.40938 58.721874L82.45625 58.721874Q82.37813 57.8 81.9875 57.346874Q81.40938 56.64375 80.47188 56.64375Q79.6125 56.64375 79.03438 57.20625Q78.47188 57.76875 78.40938 58.721874ZM95.94063 57.159374L89.33125 
57.159374L89.33125 56.003124L95.94063 56.003124L95.94063 57.159374ZM95.94063 60.20625L89.33125 60.20625L89.33125 59.05L95.94063 59.05L95.94063 60.20625ZM105.76875 63.05L104.53438 63.05L104.53438 55.20625Q104.09688 55.628124 103.3625 56.065624Q102.64375 56.4875 102.08125 56.690624L102.08125 55.503124Q103.1125 55.01875 103.87813 54.33125Q104.65938 53.64375 104.97188 52.9875L105.76875 52.9875L105.76875 63.05Z" fill-rule="nonzero"/><path fill="#757575" d="M20.01875 37.55L20.01875 23.2375L21.909374 23.2375L21.909374 35.8625L28.95625 35.8625L28.95625 37.55L20.01875 37.55ZM31.01875 37.55L31.01875 23.2375L32.909374 23.2375L32.909374 35.8625L39.95625 35.8625L39.95625 37.55L31.01875 37.55ZM48.64375 36.26875Q47.659374 37.096874 46.753124 37.440624Q45.846874 37.784374 44.815624 37.784374Q43.1125 37.784374 42.190624 36.95625Q41.26875 36.1125 41.26875 34.815624Q41.26875 34.05 41.6125 33.425Q41.971874 32.8 42.534374 32.425Q43.096874 32.034374 43.8 31.83125Q44.315624 31.70625 45.3625 31.565624Q47.4875 31.315624 48.4875 30.971874Q48.503124 30.6125 48.503124 30.503124Q48.503124 29.440624 48.003124 28.9875Q47.33125 28.39375 46.003124 28.39375Q44.753124 28.39375 44.159374 28.83125Q43.58125 29.26875 43.3 30.378124L41.58125 30.14375Q41.815624 29.034374 42.346874 28.3625Q42.878124 27.675 43.89375 27.315624Q44.909374 26.940624 46.253124 26.940624Q47.58125 26.940624 48.409374 27.253124Q49.2375 27.565624 49.628124 28.05Q50.01875 28.51875 50.175 29.2375Q50.26875 29.690624 50.26875 30.8625L50.26875 33.20625Q50.26875 35.659374 50.378124 36.315624Q50.4875 36.95625 50.83125 37.55L48.9875 37.55Q48.721874 37.003124 48.64375 36.26875ZM48.4875 32.346874Q47.534374 32.7375 45.6125 33.003124Q44.534374 33.159374 44.08125 33.3625Q43.64375 33.55 43.39375 33.925Q43.14375 34.3 43.14375 34.76875Q43.14375 35.471874 43.675 35.940624Q44.20625 36.409374 45.2375 36.409374Q46.253124 36.409374 47.034374 35.971874Q47.83125 35.51875 48.20625 34.753124Q48.4875 34.14375 48.4875 32.9875L48.4875 32.346874ZM53.034374 37.55L53.034374 23.2375L55.89375 23.2375L59.26875 33.3625Q59.7375 34.784374 59.95625 35.4875Q60.20625 34.70625 60.721874 33.190624L64.14375 23.2375L66.69063 23.2375L66.69063 37.55L64.87813 37.55L64.87813 25.565624L60.70625 37.55L59.003124 37.55L54.8625 25.3625L54.8625 37.55L53.034374 37.55ZM68.51875 37.55L74.01875 23.2375L76.06563 23.2375L81.925 37.55L79.76875 37.55L78.09688 33.20625L72.1125 33.20625L70.53438 37.55L68.51875 37.55ZM72.65938 31.675L77.50313 31.675L76.01875 27.70625Q75.33125 25.89375 75.00313 24.7375Q74.72188 26.1125 74.22188 27.471874L72.65938 31.675ZM97.6125 35.8625L97.6125 37.55L88.15938 37.55Q88.14375 36.909374 88.3625 36.33125Q88.72188 35.3625 89.51875 34.425Q90.31563 33.4875 91.81563 32.253124Q94.15938 30.346874 94.97188 29.2375Q95.8 28.1125 95.8 27.1125Q95.8 26.065624 95.05 25.346874Q94.3 24.628124 93.09688 24.628124Q91.83125 24.628124 91.06563 25.39375Q90.3 26.159374 90.3 27.503124L88.4875 27.315624Q88.675 25.3 89.87813 24.2375Q91.09688 23.175 93.14375 23.175Q95.19063 23.175 96.39375 24.315624Q97.59688 25.45625 97.59688 27.14375Q97.59688 28.003124 97.2375 28.846874Q96.89375 29.675 96.08125 30.596874Q95.26875 31.503124 93.37813 33.1125Q91.78438 34.440624 91.33125 34.909374Q90.89375 35.378124 90.59688 35.8625L97.6125 35.8625ZM106.19063 37.55L106.19063 23.2375L115.84688 23.2375L115.84688 24.925L108.08125 24.925L108.08125 29.3625L114.8 29.3625L114.8 31.05L108.08125 31.05L108.08125 37.55L106.19063 37.55ZM117.87813 25.253124L117.87813 23.2375L119.64375 23.2375L119.64375 25.253124L117.87813 25.253124ZM117.87813 
37.55L117.87813 27.175L119.64375 27.175L119.64375 37.55L117.87813 37.55ZM121.84688 37.55L121.84688 27.175L123.425 27.175L123.425 28.753124Q124.03438 27.64375 124.55 27.3Q125.06563 26.940624 125.675 26.940624Q126.56563 26.940624 127.4875 27.503124L126.87813 29.14375Q126.2375 28.753124 125.59688 28.753124Q125.01875 28.753124 124.55 29.1125Q124.09688 29.45625 123.89375 30.065624Q123.6125 31.003124 123.6125 32.1125L123.6125 37.55L121.84688 37.55ZM128.15938 34.45625L129.90938 34.175Q130.05 35.221874 130.72188 35.784374Q131.39375 36.33125 132.59688 36.33125Q133.8 36.33125 134.37813 35.846874Q134.97188 35.346874 134.97188 34.690624Q134.97188 34.096874 134.45625 33.753124Q134.09688 33.51875 132.65938 33.159374Q130.72188 32.659374 129.97188 32.315624Q129.2375 31.95625 128.84688 31.33125Q128.45625 30.690624 128.45625 29.925Q128.45625 29.2375 128.76875 28.64375Q129.09688 28.05 129.64375 27.659374Q130.05 27.3625 130.75313 27.159374Q131.47188 26.940624 132.28438 26.940624Q133.50313 26.940624 134.425 27.3Q135.34688 27.64375 135.78438 28.253124Q136.22188 28.846874 136.39375 29.846874L134.675 30.08125Q134.55 29.284374 133.9875 28.846874Q133.425 28.39375 132.40938 28.39375Q131.19063 28.39375 130.675 28.8Q130.15938 29.190624 130.15938 29.721874Q130.15938 30.065624 130.37813 30.346874Q130.58125 30.628124 131.05 30.815624Q131.31563 30.909374 132.59688 31.253124Q134.47188 31.753124 135.20625 32.08125Q135.94063 32.39375 136.3625 33.003124Q136.78438 33.596874 136.78438 34.503124Q136.78438 35.378124 136.26875 36.159374Q135.75313 36.940624 134.78438 37.3625Q133.81563 37.784374 132.59688 37.784374Q130.58125 37.784374 129.51875 36.940624Q128.45625 36.096874 128.15938 34.45625ZM142.70625 35.971874L142.95625 37.534374Q142.22188 37.690624 141.62813 37.690624Q140.675 37.690624 140.14375 37.39375Q139.62813 37.08125 139.40938 36.596874Q139.19063 36.096874 139.19063 34.51875L139.19063 28.55L137.90938 28.55L137.90938 27.175L139.19063 27.175L139.19063 24.6125L140.94063 23.55L140.94063 27.175L142.70625 27.175L142.70625 28.55L140.94063 28.55L140.94063 34.6125Q140.94063 35.3625 141.03438 35.58125Q141.12813 35.784374 141.33125 35.925Q141.55 36.05 141.94063 36.05Q142.22188 36.05 142.70625 35.971874ZM154.7375 37.55L154.7375 24.925L150.01875 24.925L150.01875 23.2375L161.3625 23.2375L161.3625 24.925L156.62813 24.925L156.62813 37.55L154.7375 37.55ZM162.22188 32.3625Q162.22188 29.4875 163.81563 28.096874Q165.15938 26.940624 167.08125 26.940624Q169.22188 26.940624 170.56563 28.346874Q171.925 29.753124 171.925 32.221874Q171.925 34.221874 171.33125 35.378124Q170.7375 36.51875 169.58125 37.159374Q168.44063 37.784374 167.08125 37.784374Q164.89375 37.784374 163.55 36.39375Q162.22188 34.9875 162.22188 32.3625ZM164.01875 32.3625Q164.01875 34.3625 164.87813 35.346874Q165.75313 36.33125 167.08125 36.33125Q168.39375 36.33125 169.25313 35.346874Q170.12813 34.346874 170.12813 32.3Q170.12813 30.378124 169.25313 29.39375Q168.37813 28.39375 167.08125 28.39375Q165.75313 28.39375 164.87813 29.39375Q164.01875 30.378124 164.01875 32.3625ZM173.87813 37.55L173.87813 23.2375L175.64375 23.2375L175.64375 31.39375L179.8 27.175L182.06563 27.175L178.1125 31.01875L182.47188 37.55L180.3 37.55L176.87813 32.253124L175.64375 33.440624L175.64375 37.55L173.87813 37.55ZM190.97188 34.20625L192.78438 34.440624Q192.3625 36.01875 191.19063 36.909374Q190.03438 37.784374 188.22188 37.784374Q185.95625 37.784374 184.6125 36.39375Q183.28438 34.9875 183.28438 32.45625Q183.28438 29.83125 184.62813 28.39375Q185.97188 26.940624 188.12813 26.940624Q190.20625 26.940624 191.51875 
28.3625Q192.84688 29.76875 192.84688 32.346874Q192.84688 32.503124 192.83125 32.815624L185.09688 32.815624Q185.19063 34.51875 186.06563 35.425Q186.94063 36.33125 188.2375 36.33125Q189.20625 36.33125 189.87813 35.83125Q190.56563 35.315624 190.97188 34.20625ZM185.19063 31.3625L190.9875 31.3625Q190.87813 30.065624 190.33125 29.409374Q189.4875 28.39375 188.14375 28.39375Q186.94063 28.39375 186.1125 29.20625Q185.28438 30.003124 185.19063 31.3625ZM194.8625 37.55L194.8625 27.175L196.45625 27.175L196.45625 28.659374Q197.59688 26.940624 199.75313 26.940624Q200.69063 26.940624 201.47188 27.284374Q202.26875 27.6125 202.65938 28.159374Q203.05 28.70625 203.20625 29.45625Q203.3 29.95625 203.3 31.175L203.3 37.55L201.53438 37.55L201.53438 31.2375Q201.53438 30.159374 201.33125 29.628124Q201.12813 29.096874 200.59688 28.784374Q200.08125 28.471874 199.37813 28.471874Q198.25313 28.471874 197.44063 29.190624Q196.62813 29.89375 196.62813 31.878124L196.62813 37.55L194.8625 37.55ZM211.01875 37.55L211.01875 23.2375L212.90938 23.2375L212.90938 35.8625L219.95625 35.8625L219.95625 37.55L211.01875 37.55ZM228.64375 36.26875Q227.65938 37.096874 226.75313 37.440624Q225.84688 37.784374 224.81563 37.784374Q223.1125 37.784374 222.19063 36.95625Q221.26875 36.1125 221.26875 34.815624Q221.26875 34.05 221.6125 33.425Q221.97188 32.8 222.53438 32.425Q223.09688 32.034374 223.8 31.83125Q224.31563 31.70625 225.3625 31.565624Q227.4875 31.315624 228.4875 30.971874Q228.50313 30.6125 228.50313 30.503124Q228.50313 29.440624 228.00313 28.9875Q227.33125 28.39375 226.00313 28.39375Q224.75313 28.39375 224.15938 28.83125Q223.58125 29.26875 223.3 30.378124L221.58125 30.14375Q221.81563 29.034374 222.34688 28.3625Q222.87813 27.675 223.89375 27.315624Q224.90938 26.940624 226.25313 26.940624Q227.58125 26.940624 228.40938 27.253124Q229.2375 27.565624 229.62813 28.05Q230.01875 28.51875 230.175 29.2375Q230.26875 29.690624 230.26875 30.8625L230.26875 33.20625Q230.26875 35.659374 230.37813 36.315624Q230.4875 36.95625 230.83125 37.55L228.9875 37.55Q228.72188 37.003124 228.64375 36.26875ZM228.4875 32.346874Q227.53438 32.7375 225.6125 33.003124Q224.53438 33.159374 224.08125 33.3625Q223.64375 33.55 223.39375 33.925Q223.14375 34.3 223.14375 34.76875Q223.14375 35.471874 223.675 35.940624Q224.20625 36.409374 225.2375 36.409374Q226.25313 36.409374 227.03438 35.971874Q227.83125 35.51875 228.20625 34.753124Q228.4875 34.14375 228.4875 32.9875L228.4875 32.346874ZM236.70625 35.971874L236.95625 37.534374Q236.22188 37.690624 235.62813 37.690624Q234.675 37.690624 234.14375 37.39375Q233.62813 37.08125 233.40938 36.596874Q233.19063 36.096874 233.19063 34.51875L233.19063 28.55L231.90938 28.55L231.90938 27.175L233.19063 27.175L233.19063 24.6125L234.94063 23.55L234.94063 27.175L236.70625 27.175L236.70625 28.55L234.94063 28.55L234.94063 34.6125Q234.94063 35.3625 235.03438 35.58125Q235.12813 35.784374 235.33125 35.925Q235.55 36.05 235.94063 36.05Q236.22188 36.05 236.70625 35.971874ZM245.97188 34.20625L247.78438 34.440624Q247.3625 36.01875 246.19063 36.909374Q245.03438 37.784374 243.22188 37.784374Q240.95625 37.784374 239.6125 36.39375Q238.28438 34.9875 238.28438 32.45625Q238.28438 29.83125 239.62813 28.39375Q240.97188 26.940624 243.12813 26.940624Q245.20625 26.940624 246.51875 28.3625Q247.84688 29.76875 247.84688 32.346874Q247.84688 32.503124 247.83125 32.815624L240.09688 32.815624Q240.19063 34.51875 241.06563 35.425Q241.94063 36.33125 243.2375 36.33125Q244.20625 36.33125 244.87813 35.83125Q245.56563 35.315624 245.97188 34.20625ZM240.19063 31.3625L245.9875 31.3625Q245.87813 
30.065624 245.33125 29.409374Q244.4875 28.39375 243.14375 28.39375Q241.94063 28.39375 241.1125 29.20625Q240.28438 30.003124 240.19063 31.3625ZM249.8625 37.55L249.8625 27.175L251.45625 27.175L251.45625 28.659374Q252.59688 26.940624 254.75313 26.940624Q255.69063 26.940624 256.47186 27.284374Q257.26874 27.6125 257.65936 28.159374Q258.05 28.70625 258.20624 29.45625Q258.3 29.95625 258.3 31.175L258.3 37.55L256.53436 37.55L256.53436 31.2375Q256.53436 30.159374 256.33124 29.628124Q256.1281 29.096874 255.59688 28.784374Q255.08125 28.471874 254.37813 28.471874Q253.25313 28.471874 252.44063 29.190624Q251.62813 29.89375 251.62813 31.878124L251.62813 37.55L249.8625 37.55ZM266.64374 33.753124L268.3625 33.971874Q268.08124 35.76875 266.90936 36.784374Q265.7531 37.784374 264.05 37.784374Q261.925 37.784374 260.6281 36.39375Q259.33124 35.003124 259.33124 32.409374Q259.33124 30.721874 259.8781 29.471874Q260.4406 28.20625 261.58124 27.58125Q262.72186 26.940624 264.0656 26.940624Q265.7531 26.940624 266.8156 27.8Q267.89374 28.659374 268.20624 30.221874L266.4875 30.4875Q266.2531 29.440624 265.6281 28.925Q265.0031 28.39375 264.1281 28.39375Q262.8 28.39375 261.97186 29.346874Q261.14374 30.3 261.14374 32.346874Q261.14374 34.440624 261.9406 35.39375Q262.7375 36.33125 264.03436 36.33125Q265.0656 36.33125 265.7531 35.70625Q266.45624 35.065624 266.64374 33.753124ZM269.78436 41.55L269.59686 39.89375Q270.175 40.05 270.59686 40.05Q271.1906 40.05 271.53436 39.846874Q271.89374 39.659374 272.1125 39.315624Q272.28436 39.05 272.65936 38.003124Q272.70624 37.846874 272.8156 37.565624L268.8781 27.175L270.76874 27.175L272.925 33.190624Q273.34686 34.33125 273.675 35.58125Q273.9875 34.378124 274.39374 33.221874L276.6125 27.175L278.3781 27.175L274.425 37.721874Q273.8 39.440624 273.4406 40.08125Q272.97186 40.95625 272.3625 41.346874Q271.76874 41.753124 270.925 41.753124Q270.40936 41.753124 269.78436 41.55Z" fill-rule="nonzero"/></svg>
9
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/convnext.rs
//! ConvNeXt implementation. //! //! This candle implementation uses a pre-trained ConvNeXt network for inference. The //! classification head has been trained on the ImageNet dataset and returns the //! probabilities for the top-5 classes. //! //! Original code: //! - 💻 [ConvNeXt](https://github.com/facebookresearch/ConvNeXt/) //! - 💻 [ConvNeXt-V2](https://github.com/facebookresearch/ConvNeXt-V2/) //! - 💻 [timm](https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/convnext.py) //! - 📝 [Paper](https://arxiv.org/abs/2201.03545) A ConvNet for the 2020s //! - 📝 [Paper](https://arxiv.org/abs/2301.00808) ConvNeXt V2: Co-designing and Scaling ConvNets with Masked Autoencoders //! use candle::shape::ShapeWithOneHole; use candle::{Result, D}; use candle_nn::{conv2d, layer_norm, linear, Conv2dConfig, Func, VarBuilder}; #[derive(Clone)] pub struct Config { blocks: [usize; 4], channels: [usize; 4], use_conv_mlp: bool, } impl Config { pub fn atto() -> Self { Self { blocks: [2, 2, 6, 2], channels: [40, 80, 160, 320], use_conv_mlp: true, } } pub fn femto() -> Self { Self { blocks: [2, 2, 6, 2], channels: [48, 96, 192, 384], use_conv_mlp: true, } } pub fn pico() -> Self { Self { blocks: [2, 2, 6, 2], channels: [64, 128, 256, 512], use_conv_mlp: true, } } pub fn nano() -> Self { Self { blocks: [2, 2, 8, 2], channels: [80, 160, 320, 640], use_conv_mlp: true, } } pub fn tiny() -> Self { Self { blocks: [3, 3, 9, 3], channels: [96, 192, 384, 768], use_conv_mlp: false, } } pub fn small() -> Self { Self { blocks: [3, 3, 27, 3], channels: [96, 192, 384, 768], use_conv_mlp: false, } } pub fn base() -> Self { Self { blocks: [3, 3, 27, 3], channels: [128, 256, 512, 1024], use_conv_mlp: false, } } pub fn large() -> Self { Self { blocks: [3, 3, 27, 3], channels: [192, 384, 768, 1536], use_conv_mlp: false, } } pub fn xlarge() -> Self { Self { blocks: [3, 3, 27, 3], channels: [256, 512, 1024, 2048], use_conv_mlp: false, } } pub fn huge() -> Self { Self { blocks: [3, 3, 27, 3], channels: [352, 704, 1408, 2816], use_conv_mlp: false, } } } // Layer norm for data in channels-last format. fn layer_norm_cl(dim: usize, vb: VarBuilder) -> Result<Func<'static>> { let norm = layer_norm(dim, 1e-6, vb)?; Ok(Func::new(move |xs| xs.apply(&norm))) } // Layer norm for data in channels-first format. fn layer_norm_cf(dim: usize, vb: VarBuilder) -> Result<Func<'static>> { let norm = layer_norm(dim, 1e-6, vb)?; Ok(Func::new(move |xs| { let xs = xs .permute((0, 2, 3, 1))? .apply(&norm)? .permute((0, 3, 1, 2))?; Ok(xs) })) } // Global response normalization layer // Based on https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/grn.py fn convnext2_grn(dim: usize, channels_last: bool, vb: VarBuilder) -> Result<Func<'static>> { let (shape, spatial_dim, channel_dim) = if channels_last { ((1, 1, 1, ()).into_shape(dim)?, [1, 2], 3) } else { ((1, (), 1, 1).into_shape(dim)?, [2, 3], 1) }; let gamma = vb.get(dim, "weight")?.reshape(&shape)?; let beta = vb.get(dim, "bias")?.reshape(&shape)?; Ok(Func::new(move |xs| { let residual = xs; let gx = xs .sqr()? .sum_keepdim(spatial_dim)? .mean_keepdim(spatial_dim)? .sqrt()?; let gxmean = gx.mean_keepdim(channel_dim)?; let nx = gx.broadcast_div(&(gxmean + 1e-6)?)?; let xs = xs .broadcast_mul(&nx)? .broadcast_mul(&gamma)? .broadcast_add(&beta)?; xs + residual })) } // Initial downsampling via a patchify layer. 
fn convnext_stem(out_channels: usize, vb: VarBuilder) -> Result<Func<'static>> { let conv2d_cfg = Conv2dConfig { stride: 4, ..Default::default() }; let patchify = conv2d(3, out_channels, 4, conv2d_cfg, vb.pp(0))?; let norm = layer_norm_cf(out_channels, vb.pp(1))?; Ok(Func::new(move |xs| xs.apply(&patchify)?.apply(&norm))) } // Downsampling applied after the stages. fn convnext_downsample(dim: usize, vb: VarBuilder) -> Result<Func<'static>> { let conv2d_cfg = Conv2dConfig { stride: 2, ..Default::default() }; let norm = layer_norm_cf(dim / 2, vb.pp(0))?; let conv = conv2d(dim / 2, dim, 2, conv2d_cfg, vb.pp(1))?; Ok(Func::new(move |xs| xs.apply(&norm)?.apply(&conv))) } // MLP block from the original paper with optional GRN layer (v2 models). fn convnext_mlp(dim: usize, vb: VarBuilder) -> Result<Func<'static>> { let fc1 = linear(dim, 4 * dim, vb.pp("fc1"))?; let fc2 = linear(4 * dim, dim, vb.pp("fc2"))?; let grn = convnext2_grn(4 * dim, true, vb.pp("grn")); Ok(Func::new(move |xs| { let mut xs = xs.apply(&fc1)?.gelu_erf()?; if let Ok(g) = &grn { xs = xs.apply(g)?; } xs = xs.apply(&fc2)?; Ok(xs) })) } // MLP block using pointwise convolutions, with optional GRN layer (v2 models). fn convnext_conv_mlp(dim: usize, vb: VarBuilder) -> Result<Func<'static>> { let conv2d_cfg = Conv2dConfig { ..Default::default() }; let fc1 = conv2d(dim, 4 * dim, 1, conv2d_cfg, vb.pp("fc1"))?; let fc2 = conv2d(4 * dim, dim, 1, conv2d_cfg, vb.pp("fc2"))?; let grn = convnext2_grn(4 * dim, false, vb.pp("grn")); Ok(Func::new(move |xs| { let mut xs = xs.apply(&fc1)?.gelu_erf()?; if let Ok(g) = &grn { xs = xs.apply(g)?; } xs = xs.apply(&fc2)?; Ok(xs) })) } // A block consisting of a depthwise convolution, a MLP and layer scaling (v1 models only). fn convnext_block(dim: usize, use_conv_mlp: bool, vb: VarBuilder) -> Result<Func<'static>> { let conv2d_cfg = Conv2dConfig { groups: dim, padding: 3, ..Default::default() }; let conv_dw = conv2d(dim, dim, 7, conv2d_cfg, vb.pp("conv_dw"))?; let gamma = vb.get(dim, "gamma"); let (mlp, norm) = if use_conv_mlp { ( convnext_conv_mlp(dim, vb.pp("mlp"))?, layer_norm_cf(dim, vb.pp("norm"))?, ) } else { ( convnext_mlp(dim, vb.pp("mlp"))?, layer_norm_cl(dim, vb.pp("norm"))?, ) }; Ok(Func::new(move |xs| { let residual = xs; let mut xs = xs.apply(&conv_dw)?; xs = if use_conv_mlp { xs.apply(&norm)?.apply(&mlp)? } else { xs.permute((0, 2, 3, 1))? .apply(&norm)? .apply(&mlp)? .permute((0, 3, 1, 2))? }; if let Ok(g) = &gamma { xs = xs.broadcast_mul(&g.reshape((1, (), 1, 1))?)?; }; xs + residual })) } // Each stage contains blocks and a downsampling layer for the previous stage. fn convnext_stage(cfg: &Config, stage_idx: usize, vb: VarBuilder) -> Result<Func<'static>> { let nblocks = cfg.blocks[stage_idx]; let mut blocks = Vec::with_capacity(nblocks); let dim = cfg.channels[stage_idx]; if stage_idx > 0 { blocks.push(convnext_downsample(dim, vb.pp("downsample"))?); } for block_idx in 0..nblocks { blocks.push(convnext_block( dim, cfg.use_conv_mlp, vb.pp(format!("blocks.{block_idx}")), )?); } Ok(Func::new(move |xs| { let mut xs = xs.clone(); for block in blocks.iter() { xs = xs.apply(block)? } Ok(xs) })) } // Classification head. fn convnext_head(outputs: usize, nclasses: usize, vb: VarBuilder) -> Result<Func<'static>> { let norm = layer_norm_cl(outputs, vb.pp("norm"))?; let linear = linear(outputs, nclasses, vb.pp("fc"))?; Ok(Func::new(move |xs| xs.apply(&norm)?.apply(&linear))) } // Build a convnext model for a given configuration. 
fn convnext_model( config: &Config, nclasses: Option<usize>, vb: VarBuilder, ) -> Result<Func<'static>> { let head = match nclasses { None => None, Some(nclasses) => { let head = convnext_head(config.channels[3], nclasses, vb.pp("head"))?; Some(head) } }; let stem = convnext_stem(config.channels[0], vb.pp("stem"))?; let vb = vb.pp("stages"); let stage1 = convnext_stage(config, 0, vb.pp(0))?; let stage2 = convnext_stage(config, 1, vb.pp(1))?; let stage3 = convnext_stage(config, 2, vb.pp(2))?; let stage4 = convnext_stage(config, 3, vb.pp(3))?; Ok(Func::new(move |xs| { let xs = xs .apply(&stem)? .apply(&stage1)? .apply(&stage2)? .apply(&stage3)? .apply(&stage4)? .mean(D::Minus2)? .mean(D::Minus1)?; match &head { None => Ok(xs), Some(head) => xs.apply(head), } })) } pub fn convnext(cfg: &Config, nclasses: usize, vb: VarBuilder) -> Result<Func<'static>> { convnext_model(cfg, Some(nclasses), vb) } pub fn convnext_no_final_layer(cfg: &Config, vb: VarBuilder) -> Result<Func<'static>> { convnext_model(cfg, None, vb) }
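The file above only defines the network builders, so here is a minimal usage sketch. It is not part of the original source: the `convnext_tiny.safetensors` filename is hypothetical, and the checkpoint is assumed to follow the timm tensor naming that the `VarBuilder` paths (`stem`, `stages`, `head`) expect.

```rust
use candle::{DType, Device, Tensor, D};
use candle_nn::VarBuilder;
use candle_transformers::models::convnext;

fn main() -> candle::Result<()> {
    let device = Device::Cpu;
    // Hypothetical checkpoint path; the weights must use the timm naming
    // scheme matched by the VarBuilder paths in the model code above.
    let vb = unsafe {
        VarBuilder::from_mmaped_safetensors(&["convnext_tiny.safetensors"], DType::F32, &device)?
    };
    // Tiny variant with a 1000-class ImageNet head.
    let model = convnext::convnext(&convnext::Config::tiny(), 1000, vb)?;
    // Dummy 224x224 RGB batch; a real pipeline would load and normalize an image.
    let image = Tensor::zeros((1, 3, 224, 224), DType::F32, &device)?;
    let logits = image.apply(&model)?; // shape (1, 1000)
    let probs = candle_nn::ops::softmax(&logits, D::Minus1)?;
    println!("{:?}", probs.dims());
    Ok(())
}
```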
0
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/quantized_rwkv_v6.rs
//! RWKV v6 model implementation with quantization support. //! //! RWKV is a linear attention model that combines the efficiency of RNNs //! with the parallelizable training of Transformers. Version 6 builds on previous //! versions with further optimizations. //! //! Key characteristics: //! - Linear attention mechanism //! - Time mixing layers //! - Channel mixing layers //! - RMSNorm for normalization //! - Support for 8-bit quantization //! //! References: //! - [RWKV Architecture](https://github.com/BlinkDL/RWKV-LM) //! - [RWKV v6 Release](https://huggingface.co/BlinkDL/rwkv-6) //! use crate::{ quantized_nn::{layer_norm, linear_no_bias as linear, Embedding, Linear}, quantized_var_builder::VarBuilder, }; use candle::{IndexOp, Result, Tensor}; use candle_nn::{GroupNorm, LayerNorm, Module}; pub use crate::models::rwkv_v5::{Config, State, Tokenizer}; #[derive(Debug, Clone)] struct SelfAttention { key: Linear, receptance: Linear, value: Linear, gate: Linear, output: Linear, ln_x: candle_nn::GroupNorm, time_mix_x: Tensor, time_mix_w: Tensor, time_mix_key: Tensor, time_mix_value: Tensor, time_mix_receptance: Tensor, time_decay: Tensor, time_faaaa: Tensor, time_mix_gate: Tensor, time_decay_w1: Tensor, time_decay_w2: Tensor, time_mix_w1: Tensor, time_mix_w2: Tensor, layer_id: usize, n_attn_heads: usize, } impl SelfAttention { fn new(layer_id: usize, cfg: &Config, vb: VarBuilder) -> Result<Self> { let hidden_size = cfg.hidden_size; let attn_hidden_size = cfg.attention_hidden_size; let key = linear(hidden_size, attn_hidden_size, vb.pp("key"))?; let receptance = linear(hidden_size, attn_hidden_size, vb.pp("receptance"))?; let value = linear(hidden_size, attn_hidden_size, vb.pp("value"))?; let gate = linear(hidden_size, attn_hidden_size, vb.pp("gate"))?; let output = linear(attn_hidden_size, hidden_size, vb.pp("output"))?; let vb_x = vb.pp("ln_x"); let ln_x_weight = vb_x.get(hidden_size, "weight")?.dequantize(vb.device())?; let ln_x_bias = vb_x.get(hidden_size, "bias")?.dequantize(vb.device())?; let ln_x = GroupNorm::new( ln_x_weight, ln_x_bias, hidden_size, hidden_size / cfg.head_size, 1e-5, )?; let time_mix_x = vb .get((1, 1, cfg.hidden_size), "time_mix_x")? .dequantize(vb.device())?; let time_mix_w = vb .get((1, 1, cfg.hidden_size), "time_mix_w")? .dequantize(vb.device())?; let time_mix_key = vb .get((1, 1, cfg.hidden_size), "time_mix_key")? .dequantize(vb.device())?; let time_mix_value = vb .get((1, 1, cfg.hidden_size), "time_mix_value")? .dequantize(vb.device())?; let time_mix_receptance = vb .get((1, 1, cfg.hidden_size), "time_mix_receptance")? .dequantize(vb.device())?; let n_attn_heads = cfg.hidden_size / cfg.head_size; let time_decay = vb .get((1, 1, cfg.hidden_size), "time_decay")? .dequantize(vb.device())?; let time_faaaa = vb .get((n_attn_heads, cfg.head_size), "time_faaaa")? .dequantize(vb.device())?; let time_mix_gate = vb .get((1, 1, cfg.hidden_size), "time_mix_gate")? .dequantize(vb.device())?; let time_decay_w1 = vb .get((cfg.hidden_size, n_attn_heads * 2), "time_decay_w1")? .dequantize(vb.device())?; let time_decay_w2 = vb .get((n_attn_heads * 2, cfg.hidden_size), "time_decay_w2")? .dequantize(vb.device())?; let time_mix_w1 = vb .get((cfg.hidden_size, n_attn_heads * 5), "time_mix_w1")? .dequantize(vb.device())?; let time_mix_w2 = vb .get((5, n_attn_heads, cfg.hidden_size), "time_mix_w2")? 
.dequantize(vb.device())?; Ok(Self { key, value, receptance, gate, output, ln_x, time_mix_x, time_mix_w, time_mix_key, time_mix_value, time_mix_receptance, time_decay, time_faaaa, time_mix_gate, time_decay_w1, time_decay_w2, time_mix_w1, time_mix_w2, layer_id, n_attn_heads, }) } pub fn forward(&self, xs: &Tensor, state: &mut State) -> Result<Tensor> { let h = self.n_attn_heads; let (b, t, s) = xs.dims3()?; let s = s / h; let (receptance, key, value, gate, w) = { // extract key-value let shifted = state.per_layer[self.layer_id].extract_key_value.clone(); let shifted = if shifted.rank() == 2 { shifted.unsqueeze(1)? } else { shifted }; let sx = (&shifted - xs)?; let xxx = (xs + &sx * &self.time_mix_x)?; let xxx = xxx .broadcast_matmul(&self.time_mix_w1)? .tanh()? .reshape((b * t, 5, ()))? .transpose(0, 1)?; let xxx = xxx.matmul(&self.time_mix_w2)?.reshape((5, b, t, ()))?; let (mw, mk, mv, mr, mg) = (xxx.i(0)?, xxx.i(1)?, xxx.i(2)?, xxx.i(3)?, xxx.i(4)?); let xw = (xs + &sx * (&self.time_mix_w + &mw)?)?; let xk = (xs + &sx * (&self.time_mix_key + &mk)?)?; let xv = (xs + &sx * (&self.time_mix_value + &mv)?)?; let xr = (xs + &sx * (&self.time_mix_receptance + &mr)?)?; let xg = (xs + &sx * (&self.time_mix_gate + &mg)?)?; let w = (&self.time_decay + xw.broadcast_matmul(&self.time_decay_w1)? .tanh()? .broadcast_matmul(&self.time_decay_w2)?)? .reshape(((), 1, 1))? .reshape((self.n_attn_heads, (), 1))?; let key = self.key.forward(&xk)?; let value = self.value.forward(&xv)?; let receptance = self.receptance.forward(&xr)?; let gate = candle_nn::ops::silu(&self.gate.forward(&xg)?)?; state.per_layer[self.layer_id].extract_key_value = xs.i((.., t - 1))?; (receptance, key, value, gate, w) }; // linear attention let mut state_ = state.per_layer[self.layer_id].linear_attention.clone(); let key = key.reshape((b, t, h, s))?.permute((0, 2, 3, 1))?; let value = value.reshape((b, t, h, s))?.transpose(1, 2)?; let receptance = receptance.reshape((b, t, h, s))?.transpose(1, 2)?; let w = w.exp()?.neg()?.exp()?; let time_faaaa = self.time_faaaa .reshape(((), 1, 1))? .reshape((self.n_attn_heads, (), 1))?; let mut out: Vec<Tensor> = Vec::with_capacity(t); for t_ in 0..t { let rt = receptance.i((.., .., t_..t_ + 1))?.contiguous()?; let kt = key.i((.., .., .., t_..t_ + 1))?.contiguous()?; let vt = value.i((.., .., t_..t_ + 1))?.contiguous()?; let at = kt.matmul(&vt)?; let rhs = (time_faaaa.broadcast_mul(&at)? + &state_)?; let out_ = rt.matmul(&rhs)?.squeeze(2)?; state_ = (&at + w.broadcast_mul(&state_))?; out.push(out_) } let out = Tensor::cat(&out, 1)?.reshape((b * t, h * s, 1))?; let out = out.apply(&self.ln_x)?.reshape((b, t, h * s))?; let out = (out * gate)?.apply(&self.output)?; state.per_layer[self.layer_id].linear_attention = state_; Ok(out) } } #[derive(Debug, Clone)] struct FeedForward { time_mix_key: Tensor, time_mix_receptance: Tensor, key: Linear, receptance: Linear, value: Linear, layer_id: usize, } impl FeedForward { fn new(layer_id: usize, cfg: &Config, vb: VarBuilder) -> Result<Self> { let int_size = cfg .intermediate_size .unwrap_or(((cfg.hidden_size as f64 * 3.5) as usize) / 32 * 32); let key = linear(cfg.hidden_size, int_size, vb.pp("key"))?; let receptance = linear(cfg.hidden_size, cfg.hidden_size, vb.pp("receptance"))?; let value = linear(int_size, cfg.hidden_size, vb.pp("value"))?; let time_mix_key = vb .get((1, 1, cfg.hidden_size), "time_mix_key")? .dequantize(vb.device())?; let time_mix_receptance = vb .get((1, 1, cfg.hidden_size), "time_mix_receptance")? 
.dequantize(vb.device())?; Ok(Self { key, receptance, value, time_mix_key, time_mix_receptance, layer_id, }) } fn forward(&self, xs: &Tensor, state: &mut State) -> Result<Tensor> { let shifted = state.per_layer[self.layer_id] .feed_forward .broadcast_sub(xs)?; let key = (xs + shifted.broadcast_mul(&self.time_mix_key)?)?; let receptance = (xs + shifted.broadcast_mul(&self.time_mix_receptance)?)?; let key = key.apply(&self.key)?.relu()?.sqr()?; let value = key.apply(&self.value)?; let receptance = candle_nn::ops::sigmoid(&receptance.apply(&self.receptance)?)?; state.per_layer[self.layer_id].feed_forward = xs.i((.., xs.dim(1)? - 1))?; let xs = (receptance * value)?; Ok(xs) } } #[derive(Debug, Clone)] struct Block { pre_ln: Option<LayerNorm>, ln1: LayerNorm, ln2: LayerNorm, attention: SelfAttention, feed_forward: FeedForward, } impl Block { fn new(layer_id: usize, cfg: &Config, vb: VarBuilder) -> Result<Self> { let ln1 = layer_norm(cfg.hidden_size, cfg.layer_norm_epsilon, vb.pp("ln1"))?; let ln2 = layer_norm(cfg.hidden_size, cfg.layer_norm_epsilon, vb.pp("ln2"))?; let pre_ln = if layer_id == 0 { let ln = layer_norm(cfg.hidden_size, cfg.layer_norm_epsilon, vb.pp("pre_ln"))?; Some(ln) } else { None }; let attention = SelfAttention::new(layer_id, cfg, vb.pp("attention"))?; let feed_forward = FeedForward::new(layer_id, cfg, vb.pp("feed_forward"))?; Ok(Self { pre_ln, ln1, ln2, attention, feed_forward, }) } fn forward(&self, xs: &Tensor, state: &mut State) -> Result<Tensor> { let xs = match self.pre_ln.as_ref() { None => xs.clone(), Some(pre_ln) => xs.apply(pre_ln)?, }; let attention = self.attention.forward(&xs.apply(&self.ln1)?, state)?; let xs = (xs + attention)?; let feed_forward = self.feed_forward.forward(&xs.apply(&self.ln2)?, state)?; let xs = (xs + feed_forward)?; Ok(xs) } } #[derive(Debug, Clone)] pub struct Model { embeddings: Embedding, blocks: Vec<Block>, ln_out: LayerNorm, head: Linear, rescale_every: usize, layers_are_rescaled: bool, } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let vb_m = vb.pp("rwkv"); let embeddings = Embedding::new(cfg.vocab_size, cfg.hidden_size, vb_m.pp("embeddings"))?; let mut blocks = Vec::with_capacity(cfg.num_hidden_layers); let vb_b = vb_m.pp("blocks"); for block_index in 0..cfg.num_hidden_layers { let block = Block::new(block_index, cfg, vb_b.pp(block_index))?; blocks.push(block) } let ln_out = layer_norm(cfg.hidden_size, 1e-5, vb_m.pp("ln_out"))?; let head = linear(cfg.hidden_size, cfg.vocab_size, vb.pp("head"))?; Ok(Self { embeddings, blocks, ln_out, head, rescale_every: cfg.rescale_every, layers_are_rescaled: false, // This seem to only happen for the f16/bf16 dtypes. }) } pub fn forward(&self, xs: &Tensor, state: &mut State) -> Result<Tensor> { let (_b_size, _seq_len) = xs.dims2()?; let mut xs = xs.apply(&self.embeddings)?; for (block_idx, block) in self.blocks.iter().enumerate() { xs = block.forward(&xs, state)?; if self.layers_are_rescaled && (block_idx + 1) % self.rescale_every == 0 { xs = (xs / 2.)? } } let xs = xs.apply(&self.ln_out)?.apply(&self.head)?; state.pos += 1; Ok(xs) } }
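A rough inference sketch for this quantized model follows. It is an assumption-laden example rather than the repository's own code: the GGUF path, `config.json`, and token ids are placeholders, `anyhow` and `serde_json` are assumed as dependencies, and the `State::new(batch, &config, &device)` call mirrors the non-quantized RWKV example, so its exact signature should be checked against the `rwkv_v5` module.

```rust
use anyhow::Result;
use candle::{Device, Tensor};
use candle_transformers::models::quantized_rwkv_v6::{Config, Model, State};
use candle_transformers::quantized_var_builder::VarBuilder;

fn main() -> Result<()> {
    let device = Device::Cpu;
    // Hypothetical local files: a GGUF export of the quantized weights and
    // the config.json distributed with the original checkpoint.
    let vb = VarBuilder::from_gguf("rwkv-v6-q4k.gguf", &device)?;
    let config: Config = serde_json::from_slice(&std::fs::read("config.json")?)?;
    let model = Model::new(&config, vb)?;
    // Recurrent state (time-mix / channel-mix history) for batch size 1;
    // constructor signature assumed from the rwkv_v5 State.
    let mut state = State::new(1, &config, &device)?;
    // Placeholder token ids, fed one step at a time because the state is
    // updated after every forward pass.
    for &token in [510u32, 4342, 59].iter() {
        let input = Tensor::new(&[[token]], &device)?; // shape (batch=1, seq=1)
        let logits = model.forward(&input, &mut state)?;
        println!("step logits: {:?}", logits.dims());
    }
    Ok(())
}
```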
1
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/stable_lm.rs
//! StableLM model implementation. //! //! StableLM is a family of language models trained by Stability AI. //! This implementation supports the StableLM architecture. //! //! Key characteristics: //! - Grouped query attention (GQA) //! - Layer normalization //! - Rotary positional embeddings (RoPE) //! - Support for different model sizes (3B, 7B) //! //! References: //! - 🤗 [Model Card](https://huggingface.co/stabilityai/stablelm-3b-4e1t) //! use crate::models::with_tracing::{linear, linear_no_bias, Linear}; use candle::{DType, Device, Module, Result, Tensor, D}; use candle_nn::{Activation, LayerNorm, VarBuilder}; use serde::Deserialize; use std::sync::Arc; // https://huggingface.co/stabilityai/stablelm-3b-4e1t/blob/main/configuration_stablelm.py #[derive(Debug, Clone, PartialEq, Deserialize)] pub struct Config { pub(crate) vocab_size: usize, pub(crate) intermediate_size: usize, pub(crate) hidden_size: usize, pub(crate) num_hidden_layers: usize, pub(crate) num_attention_heads: usize, pub(crate) num_key_value_heads: usize, pub(crate) hidden_act: Activation, pub(crate) partial_rotary_factor: f64, pub(crate) rope_theta: f64, pub(crate) max_position_embeddings: usize, pub(crate) layer_norm_eps: f64, pub(crate) use_cache: bool, #[serde(default)] pub(crate) use_qkv_bias: bool, // Used in StableLM-2 #[serde(default)] pub(crate) use_flash_attn: bool, // Not in config.json } impl Config { pub fn stablelm_3b_4e1t(use_flash_attn: bool) -> Self { Self { vocab_size: 50304, intermediate_size: 6912, hidden_size: 2560, num_hidden_layers: 32, num_attention_heads: 32, num_key_value_heads: 32, hidden_act: Activation::Silu, partial_rotary_factor: 0.25, rope_theta: 10_000., max_position_embeddings: 4096, layer_norm_eps: 1e-5, use_qkv_bias: false, use_cache: true, use_flash_attn, } } pub fn head_dim(&self) -> usize { self.hidden_size / self.num_attention_heads } pub fn rotary_ndims(&self) -> usize { (self.head_dim() as f64 * self.partial_rotary_factor) as usize } pub fn num_kv_groups(&self) -> usize { self.num_attention_heads / self.num_key_value_heads } pub fn set_use_flash_attn(&mut self, use_flash_attn: bool) { self.use_flash_attn = use_flash_attn } } #[derive(Debug)] pub(crate) struct RotaryEmbedding { sin: Tensor, cos: Tensor, } fn rotate_half(xs: &Tensor) -> Result<Tensor> { let xs = xs.chunk(2, D::Minus1)?; Tensor::cat(&[&xs[1].neg()?, &xs[0]], D::Minus1) } impl RotaryEmbedding { pub(crate) fn new(dtype: DType, cfg: &Config, dev: &Device) -> Result<Self> { let dim = cfg.rotary_ndims(); let max_seq_len = cfg.max_position_embeddings; let inv_freq: Vec<_> = (0..dim) .step_by(2) .map(|i| 1f32 / cfg.rope_theta.powf(i as f64 / dim as f64) as f32) .collect(); let inv_freq_len = inv_freq.len(); let inv_freq = Tensor::from_vec(inv_freq, (1, inv_freq_len), dev)?.to_dtype(dtype)?; let t = Tensor::arange(0u32, max_seq_len as u32, dev)? .to_dtype(dtype)? .reshape((max_seq_len, 1))?; let freqs = t.matmul(&inv_freq)?; let freqs = Tensor::cat(&[&freqs, &freqs], D::Minus1)?; Ok(Self { sin: freqs.sin()?, cos: freqs.cos()?, }) } pub(crate) fn apply_rotary_emb_qkv( &self, q: &Tensor, k: &Tensor, seqlen_offset: usize, ) -> Result<(Tensor, Tensor)> { let (_b_sz, _h, seq_len, _n_embd) = q.dims4()?; let cos = self.cos.narrow(0, seqlen_offset, seq_len)?; let sin = self.sin.narrow(0, seqlen_offset, seq_len)?; let cos = cos.unsqueeze(0)?.unsqueeze(0)?; // (1, 1, seq_len, dim) let sin = sin.unsqueeze(0)?.unsqueeze(0)?; // (1, 1, seq_len, dim) let q_embed = (q.broadcast_mul(&cos)? 
+ rotate_half(q)?.broadcast_mul(&sin))?; let k_embed = (k.broadcast_mul(&cos)? + rotate_half(k)?.broadcast_mul(&sin))?; Ok((q_embed, k_embed)) } } #[derive(Debug)] #[allow(clippy::upper_case_acronyms)] struct MLP { gate_proj: Linear, up_proj: Linear, down_proj: Linear, act_fn: Activation, span: tracing::Span, } impl MLP { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let hidden_sz = cfg.hidden_size; let intermediate_sz = cfg.intermediate_size; let gate_proj = linear_no_bias(hidden_sz, intermediate_sz, vb.pp("gate_proj"))?; let up_proj = linear_no_bias(hidden_sz, intermediate_sz, vb.pp("up_proj"))?; let down_proj = linear_no_bias(intermediate_sz, hidden_sz, vb.pp("down_proj"))?; Ok(Self { gate_proj, up_proj, down_proj, act_fn: cfg.hidden_act, span: tracing::span!(tracing::Level::TRACE, "mlp"), }) } } impl Module for MLP { fn forward(&self, xs: &Tensor) -> Result<Tensor> { let _enter = self.span.enter(); let lhs = xs.apply(&self.gate_proj)?.apply(&self.act_fn)?; let rhs = xs.apply(&self.up_proj)?; (lhs * rhs)?.apply(&self.down_proj) } } #[cfg(feature = "flash-attn")] fn flash_attn( q: &Tensor, k: &Tensor, v: &Tensor, softmax_scale: f32, causal: bool, ) -> Result<Tensor> { candle_flash_attn::flash_attn(q, k, v, softmax_scale, causal) } #[cfg(not(feature = "flash-attn"))] fn flash_attn(_: &Tensor, _: &Tensor, _: &Tensor, _: f32, _: bool) -> Result<Tensor> { unimplemented!("compile with '--features flash-attn'") } #[derive(Debug)] struct Attention { q_proj: Linear, k_proj: Linear, v_proj: Linear, o_proj: Linear, num_heads: usize, num_kv_heads: usize, num_kv_groups: usize, head_dim: usize, hidden_size: usize, rotary_emb: Arc<RotaryEmbedding>, kv_cache: Option<(Tensor, Tensor)>, use_cache: bool, rotary_ndims: usize, use_flash_attn: bool, span: tracing::Span, } impl Attention { fn new(rotary_emb: Arc<RotaryEmbedding>, cfg: &Config, vb: VarBuilder) -> Result<Self> { let hidden_sz = cfg.hidden_size; let head_dim = cfg.head_dim(); let num_heads = cfg.num_attention_heads; let num_kv_heads = cfg.num_key_value_heads; let linear_layer = if cfg.use_qkv_bias { linear } else { linear_no_bias }; let q_proj = linear_layer(hidden_sz, num_heads * head_dim, vb.pp("q_proj"))?; let k_proj = linear_layer(hidden_sz, num_kv_heads * head_dim, vb.pp("k_proj"))?; let v_proj = linear_layer(hidden_sz, num_kv_heads * head_dim, vb.pp("v_proj"))?; let o_proj = linear_no_bias(num_heads * head_dim, hidden_sz, vb.pp("o_proj"))?; Ok(Self { q_proj, k_proj, v_proj, o_proj, num_heads, num_kv_heads, num_kv_groups: cfg.num_kv_groups(), head_dim, hidden_size: hidden_sz, rotary_emb, kv_cache: None, use_cache: cfg.use_cache, rotary_ndims: cfg.rotary_ndims(), use_flash_attn: cfg.use_flash_attn, span: tracing::span!(tracing::Level::TRACE, "attn"), }) } fn forward( &mut self, xs: &Tensor, attention_mask: Option<&Tensor>, seqlen_offset: usize, ) -> Result<Tensor> { let _enter = self.span.enter(); let (b_sz, q_len, _) = xs.dims3()?; let query_states = self.q_proj.forward(xs)?; let key_states = self.k_proj.forward(xs)?; let value_states = self.v_proj.forward(xs)?; let query_states = query_states .reshape((b_sz, q_len, self.num_heads, self.head_dim))? .transpose(1, 2)?; let key_states = key_states .reshape((b_sz, q_len, self.num_kv_heads, self.head_dim))? .transpose(1, 2)?; let value_states = value_states .reshape((b_sz, q_len, self.num_kv_heads, self.head_dim))? 
.transpose(1, 2)?; let (rot_ndims, pass_ndims) = (self.rotary_ndims, self.head_dim - self.rotary_ndims); let query_rot = query_states.narrow(D::Minus1, 0, rot_ndims)?; let query_pass = query_states.narrow(D::Minus1, rot_ndims, pass_ndims)?; let key_rot = key_states.narrow(D::Minus1, 0, rot_ndims)?; let key_pass = key_states.narrow(D::Minus1, rot_ndims, pass_ndims)?; let (query_rot, key_rot) = self.rotary_emb .apply_rotary_emb_qkv(&query_rot, &key_rot, seqlen_offset)?; let query_states = Tensor::cat(&[query_rot, query_pass], D::Minus1)?.contiguous()?; let key_states = Tensor::cat(&[key_rot, key_pass], D::Minus1)?.contiguous()?; let (key_states, value_states) = match &self.kv_cache { None => (key_states, value_states), Some((prev_k, prev_v)) => { let key_states = Tensor::cat(&[prev_k, &key_states], 2)?; let value_states = Tensor::cat(&[prev_v, &value_states], 2)?; (key_states, value_states) } }; if self.use_cache { self.kv_cache = Some((key_states.clone(), value_states.clone())); } let key_states = crate::utils::repeat_kv(key_states, self.num_kv_groups)?.contiguous()?; let value_states = crate::utils::repeat_kv(value_states, self.num_kv_groups)?.contiguous()?; let attn_output = if self.use_flash_attn { // flash-attn expects (b_sz, seq_len, nheads, head_dim) let q = query_states.transpose(1, 2)?; let k = key_states.transpose(1, 2)?; let v = value_states.transpose(1, 2)?; let softmax_scale = 1f32 / (self.head_dim as f32).sqrt(); flash_attn(&q, &k, &v, softmax_scale, q_len > 1)?.transpose(1, 2)? } else { let scale = 1f64 / f64::sqrt(self.head_dim as f64); let attn_weights = (query_states.matmul(&key_states.transpose(2, 3)?)? * scale)?; let attn_weights = match attention_mask { None => attn_weights, Some(mask) => attn_weights.broadcast_add(mask)?, }; let attn_weights = candle_nn::ops::softmax_last_dim(&attn_weights)?; attn_weights.matmul(&value_states)? }; attn_output .transpose(1, 2)? .reshape((b_sz, q_len, self.hidden_size))? 
.apply(&self.o_proj) } } #[derive(Debug)] struct DecoderLayer { self_attn: Attention, mlp: MLP, input_layernorm: LayerNorm, post_attention_layernorm: LayerNorm, span: tracing::Span, } impl DecoderLayer { fn new(rotary_emb: Arc<RotaryEmbedding>, cfg: &Config, vb: VarBuilder) -> Result<Self> { let self_attn = Attention::new(rotary_emb, cfg, vb.pp("self_attn"))?; let mlp = MLP::new(cfg, vb.pp("mlp"))?; let input_layernorm = candle_nn::layer_norm( cfg.hidden_size, cfg.layer_norm_eps, vb.pp("input_layernorm"), )?; let post_attention_layernorm = candle_nn::layer_norm( cfg.hidden_size, cfg.layer_norm_eps, vb.pp("post_attention_layernorm"), )?; Ok(Self { self_attn, mlp, input_layernorm, post_attention_layernorm, span: tracing::span!(tracing::Level::TRACE, "layer"), }) } fn forward( &mut self, xs: &Tensor, attention_mask: Option<&Tensor>, seqlen_offset: usize, ) -> Result<Tensor> { let _enter = self.span.enter(); let residual = xs; let xs = self.input_layernorm.forward(xs)?; let xs = self.self_attn.forward(&xs, attention_mask, seqlen_offset)?; let xs = (xs + residual)?; let residual = &xs; let xs = xs.apply(&self.post_attention_layernorm)?.apply(&self.mlp)?; residual + xs } } #[derive(Debug)] pub struct Model { embed_tokens: candle_nn::Embedding, layers: Vec<DecoderLayer>, norm: LayerNorm, lm_head: Linear, device: Device, dtype: DType, span: tracing::Span, } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let vb_m = vb.pp("model"); let embed_tokens = candle_nn::embedding(cfg.vocab_size, cfg.hidden_size, vb_m.pp("embed_tokens"))?; let rotary_emb = Arc::new(RotaryEmbedding::new(vb.dtype(), cfg, vb_m.device())?); let mut layers = Vec::with_capacity(cfg.num_hidden_layers); let vb_l = vb_m.pp("layers"); for layer_idx in 0..cfg.num_hidden_layers { let layer = DecoderLayer::new(rotary_emb.clone(), cfg, vb_l.pp(layer_idx))?; layers.push(layer) } let norm = candle_nn::layer_norm(cfg.hidden_size, cfg.layer_norm_eps, vb_m.pp("norm"))?; let lm_head = linear_no_bias(cfg.hidden_size, cfg.vocab_size, vb.pp("lm_head"))?; Ok(Self { embed_tokens, layers, norm, lm_head, device: vb.device().clone(), dtype: vb.dtype(), span: tracing::span!(tracing::Level::TRACE, "model"), }) } fn prepare_decoder_attention_mask( &self, b_size: usize, tgt_len: usize, seqlen_offset: usize, ) -> Result<Tensor> { // Sliding window mask? let mask: Vec<_> = (0..tgt_len) .flat_map(|i| (0..tgt_len).map(move |j| if i < j { f32::NEG_INFINITY } else { 0. })) .collect(); let mask = Tensor::from_slice(&mask, (tgt_len, tgt_len), &self.device)?; let mask = if seqlen_offset > 0 { let mask0 = Tensor::zeros((tgt_len, seqlen_offset), DType::F32, &self.device)?; Tensor::cat(&[&mask0, &mask], D::Minus1)? } else { mask }; mask.expand((b_size, 1, tgt_len, tgt_len + seqlen_offset))? .to_dtype(self.dtype) } pub fn forward(&mut self, input_ids: &Tensor, seqlen_offset: usize) -> Result<Tensor> { let _enter = self.span.enter(); let (b_size, seq_len) = input_ids.dims2()?; let attention_mask = if seq_len <= 1 { None } else { let mask = self.prepare_decoder_attention_mask(b_size, seq_len, seqlen_offset)?; Some(mask) }; let mut xs = self.embed_tokens.forward(input_ids)?; for layer in self.layers.iter_mut() { xs = layer.forward(&xs, attention_mask.as_ref(), seqlen_offset)? } xs.narrow(1, seq_len - 1, 1)? .apply(&self.norm)? .apply(&self.lm_head) } }
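To exercise the StableLM model end to end, a minimal greedy-decoding sketch is shown below. The safetensors shard names and token ids are placeholders (a real run would download the Hub shards and use the model's tokenizer), and F32 on CPU is used only to keep the example simple.

```rust
use candle::{DType, Device, Tensor, D};
use candle_nn::VarBuilder;
use candle_transformers::models::stable_lm::{Config, Model};

fn main() -> candle::Result<()> {
    let device = Device::Cpu;
    // Hypothetical shard names for the stablelm-3b-4e1t weights.
    let vb = unsafe {
        VarBuilder::from_mmaped_safetensors(
            &["model-00001-of-00002.safetensors", "model-00002-of-00002.safetensors"],
            DType::F32,
            &device,
        )?
    };
    let config = Config::stablelm_3b_4e1t(false); // flash-attn disabled
    let mut model = Model::new(&config, vb)?;
    // Placeholder prompt token ids, shape (batch=1, seq_len=3).
    let prompt = Tensor::new(&[[100u32, 200, 300]], &device)?;
    // First call processes the whole prompt (offset 0) and fills the KV cache;
    // only the last position's logits are returned, shape (1, 1, vocab_size).
    let logits = model.forward(&prompt, 0)?;
    let next_token = logits.squeeze(0)?.squeeze(0)?.argmax(D::Minus1)?.to_scalar::<u32>()?;
    // Subsequent steps feed one token at a time with the running offset.
    let step = Tensor::new(&[[next_token]], &device)?;
    let _logits = model.forward(&step, 3)?;
    Ok(())
}
```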
2
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/quantized_metavoice.rs
//! Quantized MetaVoice model implementation. //! //! MetaVoice is a conditional text-to-speech model based on a transformer architecture. //! This implementation provides quantization for reduced memory and compute. //! //! Key characteristics: //! - Transformer-based autoregressive decoder //! - Speaker conditioning //! - Support for 8-bit quantization //! - Key-value caching for efficient inference //! - RMS normalization layers //! //! References: //! - [MetaVoice Code](https://github.com/metavoiceio/metavoice) //! use crate::quantized_nn::{linear_b, Embedding, Linear, RmsNorm}; pub use crate::quantized_var_builder::VarBuilder; use crate::models::metavoice::repeat_interleave; use candle::{Module, Result, Tensor, D}; pub mod transformer { use super::*; type Config = crate::models::metavoice::transformer::Config; #[derive(Debug, Clone)] struct FeedForward { w1: Linear, w2: Linear, w3: Linear, span: tracing::Span, } impl FeedForward { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let i_size = cfg.intermediate_size(); let w1 = linear_b(cfg.dim, i_size, false, vb.pp("swiglu.w1"))?; let w2 = linear_b(i_size, cfg.dim, false, vb.pp("w2"))?; let w3 = linear_b(cfg.dim, i_size, false, vb.pp("swiglu.w3"))?; Ok(Self { w1, w2, w3, span: tracing::span!(tracing::Level::TRACE, "feed-forward"), }) } } impl Module for FeedForward { fn forward(&self, xs: &Tensor) -> Result<Tensor> { let _enter = self.span.enter(); let swiglu = (candle_nn::ops::silu(&xs.apply(&self.w1)?)? * xs.apply(&self.w3))?; swiglu.apply(&self.w2) } } #[derive(Debug, Clone)] struct Attention { wqkv: Linear, wo: Linear, dim: usize, kv_size: usize, n_local_heads: usize, head_dim: usize, n_head: usize, kv_cache: Option<(Tensor, Tensor)>, span: tracing::Span, } impl Attention { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let n_local_heads = cfg.n_local_heads(); let head_dim = cfg.head_dim(); let total_head_dim = (cfg.n_head + 2 * n_local_heads) * head_dim; let wqkv = linear_b(cfg.dim, total_head_dim, false, vb.pp("wqkv"))?; let wo = linear_b(cfg.dim, cfg.dim, false, vb.pp("wo"))?; Ok(Self { wqkv, wo, dim: cfg.dim, kv_size: n_local_heads * head_dim, n_local_heads, head_dim, n_head: cfg.n_head, kv_cache: None, span: tracing::span!(tracing::Level::TRACE, "attention"), }) } fn forward(&mut self, xs: &Tensor, _pos: usize, mask: &Tensor) -> Result<Tensor> { let _enter = self.span.enter(); let (b_sz, seqlen, _) = xs.dims3()?; let qkv = xs.apply(&self.wqkv)?; let q = qkv.narrow(D::Minus1, 0, self.dim)?; let k = qkv.narrow(D::Minus1, self.dim, self.kv_size)?; let v = qkv.narrow(D::Minus1, self.dim + self.kv_size, self.kv_size)?; let q = q .reshape((b_sz, seqlen, self.n_head, self.head_dim))? .transpose(1, 2)? .contiguous()?; let k = k .reshape((b_sz, seqlen, self.n_local_heads, self.head_dim))? .transpose(1, 2)?; let v = v .reshape((b_sz, seqlen, self.n_local_heads, self.head_dim))? .transpose(1, 2)?; let (k, v) = match &self.kv_cache { None => (k, v), Some((prev_k, prev_v)) => { let k = Tensor::cat(&[prev_k, &k], 2)?; let v = Tensor::cat(&[prev_v, &v], 2)?; (k, v) } }; self.kv_cache = Some((k.clone(), v.clone())); let k = repeat_interleave(&k, self.n_head / self.n_local_heads, 1)?; let v = repeat_interleave(&v, self.n_head / self.n_local_heads, 1)?; let scale = 1f64 / f64::sqrt(self.head_dim as f64); let attn_weights = (q.matmul(&k.transpose(2, 3)?)? 
* scale)?; let attn_weights = attn_weights.broadcast_add(mask)?; let attn_weights = candle_nn::ops::softmax_last_dim(&attn_weights)?; let attn_output = attn_weights.matmul(&v)?; attn_output .transpose(1, 2)? .reshape((b_sz, seqlen, self.dim))? .apply(&self.wo) } fn clear_kv_cache(&mut self) { self.kv_cache = None } } #[derive(Debug, Clone)] struct Block { attention: Attention, feed_forward: FeedForward, ffn_norm: RmsNorm, attention_norm: RmsNorm, span: tracing::Span, } impl Block { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let attention = Attention::new(cfg, vb.pp("attention"))?; let feed_forward = FeedForward::new(cfg, vb.pp("feed_forward"))?; let ffn_norm = RmsNorm::new(cfg.dim, cfg.norm_eps, vb.pp("ffn_norm"))?; let attention_norm = RmsNorm::new(cfg.dim, cfg.norm_eps, vb.pp("attention_norm"))?; Ok(Self { attention, feed_forward, ffn_norm, attention_norm, span: tracing::span!(tracing::Level::TRACE, "block"), }) } fn forward(&mut self, xs: &Tensor, pos: usize, mask: &Tensor) -> Result<Tensor> { let _enter = self.span.enter(); let hs = xs.apply(&self.attention_norm)?; let hs = (xs + self.attention.forward(&hs, pos, mask))?; &hs + hs.apply(&self.ffn_norm)?.apply(&self.feed_forward) } fn clear_kv_cache(&mut self) { self.attention.clear_kv_cache() } } #[derive(Debug, Clone)] pub struct Model { tok_embeddings: Embedding, pos_embeddings: Embedding, speaker_cond_pos: Linear, layers: Vec<Block>, norm: RmsNorm, output: Linear, spk_cond_mask: Tensor, span: tracing::Span, } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let tok_embeddings = Embedding::new(cfg.vocab_size, cfg.dim, vb.pp("tok_embeddings"))?; let pos_embeddings = Embedding::new(cfg.block_size, cfg.dim, vb.pp("pos_embeddings"))?; let speaker_cond_pos = linear_b( cfg.speaker_emb_dim, cfg.dim, false, vb.pp("speaker_cond_pos"), )?; let mut layers = Vec::with_capacity(cfg.n_layer); let vb_l = vb.pp("layers"); for layer_idx in 0..cfg.n_layer { let layer = Block::new(cfg, vb_l.pp(layer_idx))?; layers.push(layer) } let norm = RmsNorm::new(cfg.dim, cfg.norm_eps, vb.pp("norm"))?; let output = linear_b(cfg.dim, cfg.vocab_size, false, vb.pp("output"))?; let spk_cond_mask = Tensor::cat( &[ Tensor::ones((1, 1, cfg.dim), candle::DType::F32, vb.device())?, Tensor::zeros((1, 1, cfg.dim), candle::DType::F32, vb.device())?, ], 0, )?; Ok(Self { tok_embeddings, pos_embeddings, speaker_cond_pos, layers, norm, output, spk_cond_mask, span: tracing::span!(tracing::Level::TRACE, "qtransformer"), }) } pub fn clear_kv_cache(&mut self) { for layer in self.layers.iter_mut() { layer.clear_kv_cache() } } pub fn forward(&mut self, xs: &Tensor, spk_emb: &Tensor, pos: usize) -> Result<Tensor> { let _enter = self.span.enter(); let (_b_sz, seqlen) = xs.dims2()?; let mask: Vec<_> = (0..seqlen) .flat_map(|i| (0..seqlen).map(move |j| if i < j { f32::NEG_INFINITY } else { 0. })) .collect(); let mask = Tensor::from_slice(&mask, (1, 1, seqlen, seqlen), xs.device())?; let input_pos = Tensor::arange(pos as u32, (pos + seqlen) as u32, xs.device())?; let tok_embeddings = xs.apply(&self.tok_embeddings)?; let pos_embeddings = input_pos.apply(&self.pos_embeddings)?; let mut xs = tok_embeddings .broadcast_add(&pos_embeddings)? .broadcast_add( &spk_emb .apply(&self.speaker_cond_pos)? .broadcast_mul(&self.spk_cond_mask)?, )?; let mask = mask.to_dtype(xs.dtype())?; for layer in self.layers.iter_mut() { xs = layer.forward(&xs, pos, &mask)? } xs.narrow(1, seqlen - 1, 1)? .contiguous()? .apply(&self.norm)? .apply(&self.output) } } }
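A minimal sketch of driving the quantized MetaVoice transformer defined above. It only exercises the public `Model` interface shown in this file; the GGUF-backed `VarBuilder`, the zeroed tokens, and the zeroed speaker embedding are placeholders used purely to illustrate the expected shapes.

```rust
use candle::{DType, Device, Result, Tensor};
use candle_transformers::models::metavoice::transformer::Config;
use candle_transformers::models::quantized_metavoice::transformer;
use candle_transformers::quantized_var_builder::VarBuilder;

// Runs a single forward step of the quantized MetaVoice transformer.
fn run_step(cfg: &Config, vb: VarBuilder, device: &Device) -> Result<Tensor> {
    let mut model = transformer::Model::new(cfg, vb)?;
    // The speaker-conditioning mask inside the model holds two rows
    // (conditioned / unconditioned), so a batch size of 2 is used here.
    let tokens = Tensor::zeros((2, 1), DType::U32, device)?;
    // Placeholder speaker embedding of width `speaker_emb_dim`.
    let spk_emb = Tensor::zeros((1, 1, cfg.speaker_emb_dim), DType::F32, device)?;
    model.clear_kv_cache();
    // `pos = 0` for the first step; later steps pass the running offset.
    model.forward(&tokens, &spk_emb, 0)
}
```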
3
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/paligemma.rs
//! Multimodal multi-purpose model combining Gemma-based language model with SigLIP image understanding //! //! See PaLiGemma details at: //! - [Paper](https://arxiv.org/abs/2402.05257) //! - [Google Blog Post](https://blog.research.google/2024/02/paligemma-scaling-language-image.html) //! //! The model is a multimodal combination of: //! - SigLIP vision encoder //! - Gemma language model //! - Cross-projection layers //! //! References: //! - [HuggingFace Implementation](https://huggingface.co/google/paligemma-3b) //! - [Paper: PaLI-3 and Beyond: Scaling Language-Image Learning](https://arxiv.org/abs/2402.05257) //! use crate::models::{gemma, siglip}; use candle::{Module, Result, Tensor}; use candle_nn::{linear, Linear, VarBuilder}; #[derive(serde::Deserialize, Clone, Debug)] pub struct Config { pub vision_config: siglip::VisionConfig, pub text_config: gemma::Config, pub projection_dim: usize, } impl Config { pub fn paligemma_3b_224() -> Self { // https://huggingface.co/google/paligemma-3b-pt-224/blob/main/config.json Self { vision_config: siglip::VisionConfig::paligemma_3b_224(), text_config: gemma::Config { hidden_size: 2048, intermediate_size: 16384, num_attention_heads: 8, num_hidden_layers: 18, num_key_value_heads: 1, vocab_size: 257216, // Default values. rope_theta: 10000., head_dim: 256, hidden_act: Some(candle_nn::Activation::GeluPytorchTanh), hidden_activation: None, attention_bias: false, max_position_embeddings: 8192, rms_norm_eps: 1e-6, }, projection_dim: 2048, } } pub fn paligemma_3b_448() -> Self { Self { vision_config: siglip::VisionConfig::paligemma_3b_448(), text_config: gemma::Config { hidden_size: 2048, intermediate_size: 16384, num_attention_heads: 8, num_hidden_layers: 18, num_key_value_heads: 1, // Default values. rope_theta: 10000., head_dim: 256, hidden_act: Some(candle_nn::Activation::GeluPytorchTanh), hidden_activation: None, attention_bias: false, max_position_embeddings: 8192, rms_norm_eps: 1e-6, vocab_size: 257216, }, projection_dim: 2048, } } } #[derive(Clone, Debug)] pub struct MultiModalProjector { linear: Linear, } impl MultiModalProjector { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let linear = linear( cfg.vision_config.hidden_size, cfg.projection_dim, vb.pp("linear"), )?; Ok(Self { linear }) } } impl Module for MultiModalProjector { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.apply(&self.linear) } } #[derive(Clone, Debug)] pub struct Model { pos: usize, vision_tower: siglip::VisionModel, multi_modal_projector: MultiModalProjector, language_model: gemma::Model, } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let vision_tower = siglip::VisionModel::new( &cfg.vision_config, false, vb.pp("vision_tower.vision_model"), )?; let multi_modal_projector = MultiModalProjector::new(cfg, vb.pp("multi_modal_projector"))?; let language_model = gemma::Model::new(false, &cfg.text_config, vb.pp("language_model"))?; Ok(Self { pos: 0, language_model, vision_tower, multi_modal_projector, }) } pub fn setup(&mut self, pixel_values: &Tensor, input_ids: &Tensor) -> Result<Tensor> { self.clear_kv_cache(); let image_features = self .vision_tower .forward(pixel_values)? 
.apply(&self.multi_modal_projector)?; let image_features = crate::models::clip::div_l2_norm(&image_features)?; let text_features = self.language_model.embed_tokens().forward(input_ids)?; let input_embeds = Tensor::cat(&[image_features, text_features], 1)?; self.pos = input_embeds.dim(1)?; self.language_model.forward_embeds(&input_embeds, None, 0) } pub fn forward(&mut self, input_ids: &Tensor) -> Result<Tensor> { let pos = self.pos; let seq_len = input_ids.dim(1)?; self.pos = pos + seq_len; self.language_model.forward(input_ids, pos) } pub fn forward_without_projection(&mut self, input_ids: &Tensor) -> Result<Tensor> { self.clear_kv_cache(); let input_embeds = self.language_model.embed_tokens().forward(input_ids)?; self.language_model .forward_embeds_without_projection(&input_embeds, None, 0) } pub fn setup_without_projection( &mut self, pixel_values: &Tensor, input_ids: &Tensor, ) -> Result<Tensor> { self.clear_kv_cache(); let image_features = self .vision_tower .forward(pixel_values)? .apply(&self.multi_modal_projector)?; let image_features = crate::models::clip::div_l2_norm(&image_features)?; let text_features = self.language_model.embed_tokens().forward(input_ids)?; let input_embeds = Tensor::cat(&[image_features, text_features], 1)?; self.language_model .forward_embeds_without_projection(&input_embeds, None, 0) } pub fn clear_kv_cache(&mut self) { self.pos = 0; self.language_model.clear_kv_cache() } }
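A hypothetical inference sketch for the PaLiGemma wrapper above. The safetensors file name is a placeholder (real checkpoints are typically sharded), and the pixel and token tensors are zero-filled stand-ins that only demonstrate the expected shapes and call order.

```rust
use candle::{DType, Device, Result, Tensor};
use candle_nn::VarBuilder;
use candle_transformers::models::paligemma::{Config, Model};

// Prefill step: embed the image, concatenate with the text embeddings and
// run the first pass; later tokens would go through `Model::forward`.
fn caption_prefill(device: &Device) -> Result<Tensor> {
    let cfg = Config::paligemma_3b_224();
    // Placeholder path; adjust to the actual (possibly sharded) checkpoint.
    let vb = unsafe {
        VarBuilder::from_mmaped_safetensors(&["paligemma-3b.safetensors"], DType::F32, device)?
    };
    let mut model = Model::new(&cfg, vb)?;
    // (batch, channels, height, width) matching the 224px config.
    let pixel_values = Tensor::zeros((1, 3, 224, 224), DType::F32, device)?;
    // Placeholder prompt token ids.
    let prompt_ids = Tensor::zeros((1, 8), DType::U32, device)?;
    model.setup(&pixel_values, &prompt_ids)
}
```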
4
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/quantized_blip_text.rs
//! Quantized BLIP text module implementation. //! //! Provides the text decoder portion of the BLIP model with 8-bit quantization. //! Uses a BERT-style transformer architecture for text processing. //! //! Key components: //! - Text embeddings layer with position embeddings //! - Multi-head self attention layers //! - Cross-attention for vision-text fusion //! - Layer normalization and feed-forward layers //! - Quantized linear transformations //! //! References: //! - [BLIP Paper](https://arxiv.org/abs/2201.12086) //! - [Hugging Face Implementation](https://huggingface.co/docs/transformers/model_doc/blip) //! use crate::models::with_tracing::QMatMul; use crate::quantized_nn::{layer_norm, linear, Embedding, Linear}; pub use crate::quantized_var_builder::VarBuilder; use candle::{Module, Result, Tensor, D}; use candle_nn::LayerNorm; pub type Config = super::blip_text::Config; #[derive(Debug, Clone)] struct TextEmbeddings { word_embedddings: Embedding, position_embeddings: Embedding, layer_norm: LayerNorm, position_ids: Tensor, } impl TextEmbeddings { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let word_embedddings = Embedding::new(cfg.vocab_size, cfg.hidden_size, vb.pp("word_embeddings"))?; let position_embeddings = Embedding::new( cfg.max_position_embeddings, cfg.hidden_size, vb.pp("position_embeddings"), )?; let layer_norm = layer_norm(cfg.hidden_size, cfg.layer_norm_eps, vb.pp("LayerNorm"))?; let position_ids = Tensor::arange(0, cfg.max_position_embeddings as u32, vb.device())?.unsqueeze(0)?; Ok(Self { word_embedddings, position_embeddings, layer_norm, position_ids, }) } fn forward(&self, xs: &Tensor, past_kv_len: usize) -> Result<Tensor> { let seq_len = xs.dim(1)?; let position_ids = self.position_ids.narrow(1, past_kv_len, seq_len)?; let embeddings = self.word_embedddings.forward(xs)?; let position_embeddings = self.position_embeddings.forward(&position_ids)?; (embeddings + position_embeddings)?.apply(&self.layer_norm) } } #[derive(Debug, Clone)] struct TextSelfAttention { query: Linear, key: Linear, value: Linear, attention_head_size: usize, num_attention_heads: usize, attention_scale: f64, kv_cache: Option<(Tensor, Tensor)>, } impl TextSelfAttention { fn new(cfg: &Config, is_cross_attention: bool, vb: VarBuilder) -> Result<Self> { let num_attention_heads = cfg.num_attention_heads; let attention_head_size = cfg.hidden_size / num_attention_heads; let all_head_size = cfg.num_attention_heads * attention_head_size; let query = linear(cfg.hidden_size, all_head_size, vb.pp("query"))?; let in_size = if is_cross_attention { cfg.encoder_hidden_size } else { cfg.hidden_size }; let key = linear(in_size, all_head_size, vb.pp("key"))?; let value = linear(in_size, all_head_size, vb.pp("value"))?; let attention_scale = 1f64 / (attention_head_size as f64).sqrt(); Ok(Self { query, key, value, attention_head_size, num_attention_heads, attention_scale, kv_cache: None, }) } fn transpose_for_scores(&self, xs: &Tensor) -> Result<Tensor> { let (b_size, seq_len, _) = xs.dims3()?; xs.reshape(( b_size, seq_len, self.num_attention_heads, self.attention_head_size, ))? .permute((0, 2, 1, 3)) } fn reset_kv_cache(&mut self) { self.kv_cache = None } fn forward( &mut self, xs: &Tensor, encoder_hidden_states: Option<&Tensor>, attention_mask: Option<&Tensor>, ) -> Result<Tensor> { let query = self .transpose_for_scores(&self.query.forward(xs)?)? 
.contiguous()?; let (key, value) = match encoder_hidden_states { None => { let key = self.transpose_for_scores(&self.key.forward(xs)?)?; let value = self.transpose_for_scores(&self.value.forward(xs)?)?; let (key, value) = match &self.kv_cache { None => (key, value), Some((prev_key, prev_value)) => { let key = Tensor::cat(&[prev_key, &key], 2)?; let value = Tensor::cat(&[prev_value, &value], 2)?; (key, value) } }; self.kv_cache = Some((key.clone(), value.clone())); (key, value) } Some(xs) => { let key = self.transpose_for_scores(&self.key.forward(xs)?)?; let value = self.transpose_for_scores(&self.value.forward(xs)?)?; // no kv-cache in this case, but the results could probably be memoized. (key, value) } }; let key = key.contiguous()?; let value = value.contiguous()?; let attention_scores = query.matmul(&key.t()?)?; let attention_scores = (attention_scores * self.attention_scale)?; let attention_scores = match attention_mask { Some(mask) => attention_scores.broadcast_add(mask)?, None => attention_scores, }; let attention_probs = candle_nn::ops::softmax_last_dim(&attention_scores)?; attention_probs .matmul(&value)? .permute((0, 2, 1, 3))? .flatten_from(D::Minus2) } } #[derive(Debug, Clone)] struct TextSelfOutput { dense: Linear, layer_norm: LayerNorm, } impl TextSelfOutput { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let dense = linear(cfg.hidden_size, cfg.hidden_size, vb.pp("dense"))?; let layer_norm = layer_norm(cfg.hidden_size, cfg.layer_norm_eps, vb.pp("LayerNorm"))?; Ok(Self { dense, layer_norm }) } fn forward(&self, xs: &Tensor, input_tensor: &Tensor) -> Result<Tensor> { (xs.apply(&self.dense) + input_tensor)?.apply(&self.layer_norm) } } #[derive(Debug, Clone)] struct TextAttention { self_: TextSelfAttention, output: TextSelfOutput, } impl TextAttention { fn new(cfg: &Config, is_cross_attention: bool, vb: VarBuilder) -> Result<Self> { let self_ = TextSelfAttention::new(cfg, is_cross_attention, vb.pp("self"))?; let output = TextSelfOutput::new(cfg, vb.pp("output"))?; Ok(Self { self_, output }) } fn reset_kv_cache(&mut self) { self.self_.reset_kv_cache() } fn forward( &mut self, xs: &Tensor, encoder_hidden_states: Option<&Tensor>, attention_mask: Option<&Tensor>, ) -> Result<Tensor> { let self_outputs = self .self_ .forward(xs, encoder_hidden_states, attention_mask)?; self.output.forward(&self_outputs, xs) } } #[derive(Debug, Clone)] struct TextIntermediate { dense: Linear, intermediate_act_fn: candle_nn::Activation, } impl TextIntermediate { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let dense = linear(cfg.hidden_size, cfg.intermediate_size, vb.pp("dense"))?; Ok(Self { dense, intermediate_act_fn: cfg.hidden_act, }) } } impl Module for TextIntermediate { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.apply(&self.dense)?.apply(&self.intermediate_act_fn) } } #[derive(Debug, Clone)] struct TextOutput { dense: Linear, layer_norm: LayerNorm, } impl TextOutput { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let dense = linear(cfg.intermediate_size, cfg.hidden_size, vb.pp("dense"))?; let layer_norm = layer_norm(cfg.hidden_size, cfg.layer_norm_eps, vb.pp("LayerNorm"))?; Ok(Self { dense, layer_norm }) } fn forward(&self, xs: &Tensor, input_tensor: &Tensor) -> Result<Tensor> { (xs.apply(&self.dense)? 
+ input_tensor)?.apply(&self.layer_norm) } } #[derive(Debug, Clone)] struct TextLayer { attention: TextAttention, cross_attention: Option<TextAttention>, intermediate: TextIntermediate, output: TextOutput, } impl TextLayer { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let attention = TextAttention::new(cfg, false, vb.pp("attention"))?; let cross_attention = if cfg.is_decoder { Some(TextAttention::new(cfg, true, vb.pp("crossattention"))?) } else { None }; let intermediate = TextIntermediate::new(cfg, vb.pp("intermediate"))?; let output = TextOutput::new(cfg, vb.pp("output"))?; Ok(Self { attention, cross_attention, intermediate, output, }) } fn reset_kv_cache(&mut self) { self.attention.reset_kv_cache(); if let Some(ca) = &mut self.cross_attention { ca.reset_kv_cache() } } fn forward( &mut self, xs: &Tensor, encoder_hidden_states: &Tensor, attention_mask: &Tensor, ) -> Result<Tensor> { let attention_output = self.attention.forward(xs, None, Some(attention_mask))?; let attention_output = match &mut self.cross_attention { Some(ca) => ca.forward(&attention_output, Some(encoder_hidden_states), None)?, None => candle::bail!("expected some cross-attn"), }; let intermediate_output = self.intermediate.forward(&attention_output)?; self.output.forward(&intermediate_output, &attention_output) } } #[derive(Debug, Clone)] struct TextEncoder { layers: Vec<TextLayer>, } impl TextEncoder { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let vb = vb.pp("layer"); let mut layers = Vec::with_capacity(cfg.num_hidden_layers); for i in 0..cfg.num_hidden_layers { let layer = TextLayer::new(cfg, vb.pp(i))?; layers.push(layer) } Ok(Self { layers }) } fn reset_kv_cache(&mut self) { self.layers.iter_mut().for_each(|l| l.reset_kv_cache()) } fn forward( &mut self, xs: &Tensor, encoder_hidden_states: &Tensor, attention_mask: &Tensor, ) -> Result<Tensor> { let mut xs = xs.clone(); for layer in self.layers.iter_mut() { xs = layer.forward(&xs, encoder_hidden_states, attention_mask)? } Ok(xs) } } #[derive(Debug, Clone)] pub struct TextPooler { dense: Linear, } impl TextPooler { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let dense = linear(cfg.hidden_size, cfg.hidden_size, vb.pp("dense"))?; Ok(Self { dense }) } } impl Module for TextPooler { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.narrow(D::Minus1, 0, 1)? .squeeze(D::Minus1)? .apply(&self.dense)? .tanh() } } #[derive(Debug, Clone)] struct TextPredictionHeadTransform { dense: Linear, transform_act_fn: candle_nn::Activation, layer_norm: LayerNorm, } impl TextPredictionHeadTransform { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let dense = linear(cfg.hidden_size, cfg.hidden_size, vb.pp("dense"))?; let layer_norm = layer_norm(cfg.hidden_size, cfg.layer_norm_eps, vb.pp("LayerNorm"))?; Ok(Self { dense, transform_act_fn: cfg.hidden_act, layer_norm, }) } } impl Module for TextPredictionHeadTransform { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.apply(&self.dense)? .apply(&self.transform_act_fn)? 
.apply(&self.layer_norm) } } #[derive(Debug, Clone)] struct TextLMPredictionHead { transform: TextPredictionHeadTransform, decoder: Linear, } impl TextLMPredictionHead { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let transform = TextPredictionHeadTransform::new(cfg, vb.pp("transform"))?; let weight = QMatMul::new(cfg.hidden_size, cfg.vocab_size, vb.pp("decoder"))?; let bias = vb.get(cfg.vocab_size, "bias")?.dequantize(vb.device())?; let decoder = Linear::from_weights(weight, Some(bias)); Ok(Self { transform, decoder }) } } impl Module for TextLMPredictionHead { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.apply(&self.transform)?.apply(&self.decoder) } } #[derive(Debug, Clone)] struct TextOnlyMLMHead { predictions: TextLMPredictionHead, } impl TextOnlyMLMHead { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let predictions = TextLMPredictionHead::new(cfg, vb.pp("predictions"))?; Ok(Self { predictions }) } } impl Module for TextOnlyMLMHead { fn forward(&self, xs: &Tensor) -> Result<Tensor> { self.predictions.forward(xs) } } #[derive(Debug, Clone)] struct TextModel { embeddings: TextEmbeddings, encoder: TextEncoder, past_kv_len: usize, // We do not need the pooler for caption generation } impl TextModel { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let embeddings = TextEmbeddings::new(cfg, vb.pp("embeddings"))?; let encoder = TextEncoder::new(cfg, vb.pp("encoder"))?; Ok(Self { embeddings, encoder, past_kv_len: 0, }) } fn forward( &mut self, input_ids: &Tensor, encoder_hidden_states: &Tensor, attention_mask: &Tensor, ) -> Result<Tensor> { let (_b_sz, seq_len) = input_ids.dims2()?; let embedding_output = self.embeddings.forward(input_ids, self.past_kv_len)?; let sequence_output = self.encoder .forward(&embedding_output, encoder_hidden_states, attention_mask)?; self.past_kv_len += seq_len; // We're interested in the sequence-output rather than the pooled-output. Ok(sequence_output) } fn reset_kv_cache(&mut self) { self.past_kv_len = 0; self.encoder.reset_kv_cache(); } } #[derive(Debug, Clone)] pub struct TextLMHeadModel { bert: TextModel, cls: TextOnlyMLMHead, } impl TextLMHeadModel { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let bert = TextModel::new(cfg, vb.pp("bert"))?; let cls = TextOnlyMLMHead::new(cfg, vb.pp("cls"))?; Ok(Self { bert, cls }) } pub fn forward( &mut self, input_ids: &Tensor, encoder_hidden_states: &Tensor, ) -> Result<Tensor> { let seq_len = input_ids.dim(1)?; let mask: Vec<_> = (0..seq_len) .flat_map(|i| (0..seq_len).map(move |j| if j > i { f32::NEG_INFINITY } else { 0f32 })) .collect(); let mask = Tensor::from_vec(mask, (seq_len, seq_len), input_ids.device())?; let sequence_output = self.bert.forward(input_ids, encoder_hidden_states, &mask)?; let prediction_scores = self.cls.forward(&sequence_output)?; // return_logits is false so we don't discard the last sequence element. Ok(prediction_scores) } pub fn reset_kv_cache(&mut self) { self.bert.reset_kv_cache() } }
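The quantized BLIP text decoder above can be exercised on its own once image features are available; the sketch below uses a zero tensor of the right width in place of the vision encoder output and calls the public `TextLMHeadModel` interface. The shapes and the GGUF-backed `VarBuilder` are assumptions of the example.

```rust
use candle::{DType, Device, Result, Tensor};
use candle_transformers::models::blip_text::Config;
use candle_transformers::models::quantized_blip_text::TextLMHeadModel;
use candle_transformers::quantized_var_builder::VarBuilder;

// One decoding step of the quantized BLIP text decoder with placeholder
// cross-attention features standing in for the vision encoder output.
fn decode_step(cfg: &Config, vb: VarBuilder, device: &Device) -> Result<Tensor> {
    let mut model = TextLMHeadModel::new(cfg, vb)?;
    model.reset_kv_cache();
    // (batch, num_image_tokens, encoder_hidden_size) placeholder features.
    let encoder_states = Tensor::zeros((1, 32, cfg.encoder_hidden_size), DType::F32, device)?;
    // Placeholder token id for the current position.
    let input_ids = Tensor::zeros((1, 1), DType::U32, device)?;
    // Returns prediction scores over the vocabulary for every input position.
    model.forward(&input_ids, &encoder_states)
}
```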
5
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/mobileclip.rs
//! Mobile CLIP model, combining a lightweight vision encoder with a text encoder //! //! A mobile-optimized CLIP implementation that uses: //! - FastViT as the vision encoder //! - OpenCLIP text encoder //! - Projection layers to align the feature spaces //! //! See model details at: //! - [FastViT](https://arxiv.org/abs/2303.14189) //! - [OpenCLIP](https://github.com/mlfoundations/open_clip) //! //! References: //! - [MobileVLM](https://huggingface.co/mobileVLM) //! - [MetaCLIP](https://arxiv.org/abs/2309.16671) //! use super::fastvit; use super::openclip::text_model; use candle::{Result, Tensor, D}; use candle_nn::{Func, VarBuilder}; #[derive(Clone, Debug)] pub struct MobileClipModel { text_model: text_model::OpenClipTextTransformer, vision_model: Func<'static>, text_projection: Tensor, logit_scale: Tensor, } #[derive(Clone, Debug)] pub struct MobileClipConfig { pub text_config: text_model::Config, pub vision_config: fastvit::Config, pub image_size: usize, } impl MobileClipConfig { pub fn s1() -> Self { let text_config = text_model::Config::vit_base_patch32(); let vision_config = fastvit::Config::mci1(); Self { text_config, vision_config, image_size: 256, } } pub fn s2() -> Self { let text_config = text_model::Config::vit_base_patch32(); let vision_config = fastvit::Config::mci2(); Self { text_config, vision_config, image_size: 256, } } } impl MobileClipModel { pub fn new(vs: VarBuilder, c: &MobileClipConfig) -> Result<Self> { let vision_model = fastvit::fastvit(&c.vision_config, 512, vs.pp("visual.trunk"))?; let text_model = text_model::OpenClipTextTransformer::new(vs.pp("text"), &c.text_config)?; let text_projection = vs.get( (c.text_config.embed_dim, c.text_config.projection_dim), "text.text_projection", )?; let logit_scale = vs.get(&[], "logit_scale")?; Ok(Self { text_model, vision_model, text_projection, logit_scale, }) } pub fn get_text_features(&self, input_ids: &Tensor) -> Result<Tensor> { input_ids .apply(&self.text_model)? .matmul(&self.text_projection) } pub fn get_image_features(&self, pixel_values: &Tensor) -> Result<Tensor> { pixel_values.apply(&self.vision_model) } pub fn forward(&self, pixel_values: &Tensor, input_ids: &Tensor) -> Result<(Tensor, Tensor)> { let image_features = self.get_image_features(pixel_values)?; let text_features = self.get_text_features(input_ids)?; let image_features_normalized = div_l2_norm(&image_features)?; let text_features_normalized = div_l2_norm(&text_features)?; let logits_per_text = text_features_normalized.matmul(&image_features_normalized.t()?)?; let logit_scale = self.logit_scale.exp()?; let logits_per_text = logits_per_text.broadcast_mul(&logit_scale)?; let logits_per_image = logits_per_text.t()?; Ok((logits_per_text, logits_per_image)) } } pub fn div_l2_norm(v: &Tensor) -> Result<Tensor> { let l2_norm = v.sqr()?.sum_keepdim(D::Minus1)?.sqrt()?; v.broadcast_div(&l2_norm) }
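A zero-shot scoring sketch for the MobileCLIP wrapper above, written against its public `forward` method. The weights path, the zero-filled image batch, and the token ids (padded to an assumed CLIP-style context length of 77) are all placeholders.

```rust
use candle::{DType, Device, Result, Tensor};
use candle_nn::VarBuilder;
use candle_transformers::models::mobileclip::{MobileClipConfig, MobileClipModel};

// Scores a batch of captions against a batch of images, returning
// (logits_per_text, logits_per_image) after L2 normalisation and scaling.
fn score(device: &Device) -> Result<(Tensor, Tensor)> {
    let cfg = MobileClipConfig::s1();
    // Placeholder checkpoint path.
    let vb = unsafe {
        VarBuilder::from_mmaped_safetensors(&["mobileclip_s1.safetensors"], DType::F32, device)?
    };
    let model = MobileClipModel::new(vb, &cfg)?;
    // (batch, 3, image_size, image_size) zero-filled stand-in images.
    let images = Tensor::zeros((1, 3, cfg.image_size, cfg.image_size), DType::F32, device)?;
    // Tokenized captions; 77 is the usual CLIP context length, adjust to the text config.
    let input_ids = Tensor::zeros((2, 77), DType::U32, device)?;
    model.forward(&images, &input_ids)
}
```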
6
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/quantized_recurrent_gemma.rs
//! Recurrent Gemma model implementation with quantization support. //! //! Gemma is a large language model optimized for efficiency. //! This implementation provides quantization for reduced memory and compute. //! //! Key characteristics: //! - Recurrent blocks with gated recurrent units //! - Convolution and attention blocks //! - RMSNorm for layer normalization //! - Rotary positional embeddings (RoPE) //! - Support for 8-bit quantization //! //! References: //! - [Gemma Paper](https://arxiv.org/abs/2401.06751) //! - [Model Card](https://ai.google.dev/gemma) //! use crate::quantized_nn::{linear_b as linear, Embedding, Linear}; pub use crate::quantized_var_builder::VarBuilder; use candle::{DType, Device, IndexOp, Module, Result, Tensor, D}; use std::sync::Arc; use crate::models::recurrent_gemma::{Config, Rglru, RmsNorm, RotaryEmbedding, TemporalBlockType}; fn rms_norm(size: usize, eps: f64, vb: VarBuilder) -> Result<RmsNorm> { let weight = vb.get(size, "weight")?.dequantize(vb.device())?; Ok(RmsNorm::from_weight(weight, eps)) } #[derive(Debug, Clone)] struct Mlp { gate_proj: Linear, up_proj: Linear, down_proj: Linear, act_fn: candle_nn::Activation, } impl Mlp { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let h = cfg.hidden_size; let intermediate_size = cfg.intermediate_size / 2; let gate_proj = linear(h, intermediate_size, true, vb.pp("gate_proj"))?; let up_proj = linear(h, intermediate_size, true, vb.pp("up_proj"))?; let down_proj = linear(intermediate_size, h, true, vb.pp("down_proj"))?; Ok(Self { gate_proj, up_proj, down_proj, act_fn: cfg.hidden_activation, }) } } impl Module for Mlp { fn forward(&self, xs: &Tensor) -> Result<Tensor> { let gate = xs.apply(&self.gate_proj)?.apply(&self.act_fn)?; (gate * xs.apply(&self.up_proj))?.apply(&self.down_proj) } } fn rglru(cfg: &Config, vb: VarBuilder) -> Result<Rglru> { let h = cfg.hidden_size; let lru_width = cfg.lru_width.unwrap_or(h); let n_heads = cfg.num_attention_heads; let block_width = lru_width / n_heads; let recurrent_param = vb.get((lru_width,), "recurrent_param")?; let input_gate_weight = vb.get((n_heads, block_width, block_width), "input_gate_weight")?; let input_gate_bias = vb.get((n_heads, block_width), "input_gate_bias")?; let recurrent_gate_weight = vb.get((n_heads, block_width, block_width), "recurrent_gate_weight")?; let recurrent_gate_bias = vb.get((n_heads, block_width), "recurrent_gate_bias")?; Ok(Rglru { recurrent_param: recurrent_param.dequantize(vb.device())?, input_gate_bias: input_gate_bias.dequantize(vb.device())?, input_gate_weight: input_gate_weight.dequantize(vb.device())?, recurrent_gate_bias: recurrent_gate_bias.dequantize(vb.device())?, recurrent_gate_weight: recurrent_gate_weight.dequantize(vb.device())?, block_width, n_heads, recurrent_states: None, }) } #[derive(Debug, Clone)] struct RecurrentBlock { linear_y: Linear, linear_x: Linear, linear_out: Linear, conv_1d: candle_nn::Conv1d, conv1d_state: Option<Tensor>, conv1d_width: usize, rg_lru: Rglru, act_fn: candle_nn::Activation, } impl RecurrentBlock { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let h = cfg.hidden_size; let lru_width = cfg.lru_width.unwrap_or(h); let linear_y = linear(h, lru_width, true, vb.pp("linear_y"))?; let linear_x = linear(h, lru_width, true, vb.pp("linear_x"))?; let linear_out = linear(lru_width, h, true, vb.pp("linear_out"))?; let conv_1d = { let ws = vb .get((lru_width, 1, cfg.conv1d_width), "conv_1d.weight")? 
.dequantize(vb.device())?; let bs = vb.get(lru_width, "conv_1d.bias")?.dequantize(vb.device())?; let config = candle_nn::Conv1dConfig { groups: lru_width, padding: cfg.conv1d_width - 1, ..Default::default() }; candle_nn::Conv1d::new(ws, Some(bs), config) }; let rg_lru = rglru(cfg, vb.pp("rg_lru"))?; Ok(Self { linear_y, linear_x, linear_out, conv_1d, conv1d_state: None, conv1d_width: cfg.conv1d_width, rg_lru, act_fn: cfg.hidden_activation, }) } pub fn forward(&mut self, xs: &Tensor, pos: usize) -> Result<Tensor> { let (_b_sz, seq_len, _) = xs.dims3()?; let y_branch = xs.apply(&self.linear_y)?.apply(&self.act_fn)?; let x_branch = xs.apply(&self.linear_x)?.transpose(1, 2)?; let x_branch = if pos == 0 { let x_len = x_branch.dim(D::Minus1)?; let pad = self.conv1d_width as i64 - x_len as i64 - 1; let padded = match pad.cmp(&0) { std::cmp::Ordering::Equal => x_branch.clone(), std::cmp::Ordering::Less => { let rev_pad = (-pad) as usize; x_branch.narrow(D::Minus1, rev_pad, x_len - rev_pad)? } std::cmp::Ordering::Greater => { x_branch.pad_with_zeros(D::Minus1, pad as usize, 0)? } }; self.conv1d_state = Some(padded); x_branch .apply(&self.conv_1d)? .narrow(D::Minus1, 0, seq_len)? } else { let conv_state = match self.conv1d_state.as_ref() { None => candle::bail!("empty cache despite pos > 0"), Some(s) => Tensor::cat(&[s, &x_branch], D::Minus1)?, }; let w = self.conv_1d.weight().i((.., 0, ..))?; let x_branch = conv_state.broadcast_mul(&w)?.sum(D::Minus1)?; let x_branch = match self.conv_1d.bias() { None => x_branch, Some(b) => x_branch.broadcast_add(b)?, }; let x_branch = x_branch.unsqueeze(D::Minus1)?; self.conv1d_state = Some(conv_state.i((.., .., 1..))?); x_branch }; let x_branch = x_branch.transpose(1, 2)?; let x_branch = self.rg_lru.forward(&x_branch, pos)?; (x_branch * y_branch)?.apply(&self.linear_out) } } #[derive(Debug, Clone)] struct SdpaAttention { q_proj: Linear, k_proj: Linear, v_proj: Linear, o_proj: Linear, n_heads: usize, n_kv_heads: usize, head_dim: usize, hidden_size: usize, kv_cache: Option<(Tensor, Tensor)>, rotary_emb: Arc<RotaryEmbedding>, } impl SdpaAttention { fn new(rotary_emb: Arc<RotaryEmbedding>, cfg: &Config, vb: VarBuilder) -> Result<Self> { let h = cfg.hidden_size; let n_heads = cfg.num_attention_heads; let n_kv_heads = cfg.num_key_value_heads; let hd = cfg.head_dim; let q_proj = linear(h, n_heads * hd, cfg.attention_bias, vb.pp("q_proj"))?; let k_proj = linear(h, n_kv_heads * hd, cfg.attention_bias, vb.pp("k_proj"))?; let v_proj = linear(h, n_kv_heads * hd, cfg.attention_bias, vb.pp("v_proj"))?; let o_proj = linear(n_heads * hd, h, true, vb.pp("o_proj"))?; Ok(Self { q_proj, k_proj, v_proj, o_proj, n_heads, n_kv_heads, head_dim: hd, hidden_size: h, kv_cache: None, rotary_emb, }) } fn repeat_kv(&self, x: Tensor) -> Result<Tensor> { let n_rep = self.n_heads / self.n_kv_heads; crate::utils::repeat_kv(x, n_rep) } fn forward( &mut self, xs: &Tensor, attention_mask: Option<&Tensor>, pos: usize, ) -> Result<Tensor> { let (bsz, q_len, _) = xs.dims3()?; let query_states = xs.apply(&self.q_proj)?; let key_states = xs.apply(&self.k_proj)?; let value_states = xs.apply(&self.v_proj)?; let query_states = query_states .reshape((bsz, q_len, self.n_heads, self.head_dim))? .transpose(1, 2)?; let key_states = key_states .reshape((bsz, q_len, self.n_kv_heads, self.head_dim))? .transpose(1, 2)?; let value_states = value_states .reshape((bsz, q_len, self.n_kv_heads, self.head_dim))? 
.transpose(1, 2)?; let query_states = query_states.chunk(2, D::Minus1)?; let key_states = key_states.chunk(2, D::Minus1)?; let (query_rot, key_rot) = self.rotary_emb .apply_rotary_emb_qkv(&query_states[0], &key_states[0], pos)?; let query_states = Tensor::cat(&[&query_rot, &query_states[1]], D::Minus1)?.contiguous()?; let key_states = Tensor::cat(&[&key_rot, &key_states[1]], D::Minus1)?.contiguous()?; let (key_states, value_states) = match &self.kv_cache { None => (key_states, value_states), Some((prev_k, prev_v)) => { let key_states = Tensor::cat(&[prev_k, &key_states], 2)?; let value_states = Tensor::cat(&[prev_v, &value_states], 2)?; (key_states, value_states) } }; self.kv_cache = Some((key_states.clone(), value_states.clone())); let key_states = self.repeat_kv(key_states)?; let value_states = self.repeat_kv(value_states)?; let xs = { let att = (query_states.matmul(&key_states.t()?)? / (self.head_dim as f64).sqrt())?; let att = if q_len == 1 { att } else { match attention_mask { None => att, Some(mask) => att.broadcast_add(mask)?, } }; let att = candle_nn::ops::softmax_last_dim(&att)?; att.matmul(&value_states.contiguous()?)? }; let xs = xs .transpose(1, 2)? .reshape((bsz, q_len, self.hidden_size))?; self.o_proj.forward(&xs) } } #[derive(Debug, Clone)] enum TemporalBlock { Recurrent(RecurrentBlock), Attention(SdpaAttention), } impl TemporalBlock { fn forward( &mut self, xs: &Tensor, attention_mask: Option<&Tensor>, pos: usize, ) -> Result<Tensor> { match self { Self::Recurrent(b) => b.forward(xs, pos), Self::Attention(b) => b.forward(xs, attention_mask, pos), } } } #[derive(Debug, Clone)] struct DecoderLayer { temporal_pre_norm: RmsNorm, channel_pre_norm: RmsNorm, temporal_block: TemporalBlock, mlp_block: Mlp, } impl DecoderLayer { fn new( block_idx: usize, rotary_emb: Arc<RotaryEmbedding>, cfg: &Config, vb: VarBuilder, ) -> Result<Self> { let h = cfg.hidden_size; let temporal_pre_norm = rms_norm(h, cfg.rms_norm_eps, vb.pp("temporal_pre_norm"))?; let channel_pre_norm = rms_norm(h, cfg.rms_norm_eps, vb.pp("channel_pre_norm"))?; let temporal_block = match cfg.block_types[block_idx % cfg.block_types.len()] { TemporalBlockType::Recurrent => { let block = RecurrentBlock::new(cfg, vb.pp("temporal_block"))?; TemporalBlock::Recurrent(block) } TemporalBlockType::Attention => { let block = SdpaAttention::new(rotary_emb, cfg, vb.pp("temporal_block"))?; TemporalBlock::Attention(block) } }; let mlp_block = Mlp::new(cfg, vb.pp("mlp_block"))?; Ok(Self { temporal_pre_norm, channel_pre_norm, temporal_block, mlp_block, }) } fn forward( &mut self, xs: &Tensor, attention_mask: Option<&Tensor>, pos: usize, ) -> Result<Tensor> { let residual = xs; let xs = xs.apply(&self.temporal_pre_norm)?; let xs = self.temporal_block.forward(&xs, attention_mask, pos)?; let xs = (xs + residual)?; let residual = &xs; let xs = xs.apply(&self.channel_pre_norm)?.apply(&self.mlp_block)?; xs + residual } } #[derive(Debug, Clone)] pub struct Model { embed_tokens: Embedding, layers: Vec<DecoderLayer>, final_norm: RmsNorm, lm_head: Linear, hidden_size: usize, logits_soft_cap: f64, device: Device, } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let embed_tokens = Embedding::new(cfg.vocab_size, cfg.hidden_size, vb.pp("embed_tokens"))?; let rotary_emb = Arc::new(RotaryEmbedding::new(DType::F32, cfg, vb.device())?); let vb_b = vb.pp("layers"); let mut layers = Vec::with_capacity(cfg.num_hidden_layers); for idx in 0..cfg.num_hidden_layers { let layer = DecoderLayer::new(idx, rotary_emb.clone(), cfg, 
vb_b.pp(idx))?; layers.push(layer) } let final_norm = rms_norm(cfg.hidden_size, cfg.rms_norm_eps, vb.pp("final_norm"))?; let lm_head = linear( cfg.hidden_size, cfg.vocab_size, false, vb.pp("embed_tokens"), )?; Ok(Self { embed_tokens, layers, final_norm, lm_head, hidden_size: cfg.hidden_size, logits_soft_cap: cfg.logits_soft_cap, device: vb.device().clone(), }) } fn prepare_decoder_attention_mask( &self, b_size: usize, tgt_len: usize, seqlen_offset: usize, ) -> Result<Tensor> { let mask: Vec<_> = (0..tgt_len) .flat_map(|i| (0..tgt_len).map(move |j| if i < j { f32::NEG_INFINITY } else { 0. })) .collect(); let mask = Tensor::from_slice(&mask, (tgt_len, tgt_len), &self.device)?; let mask = if seqlen_offset > 0 { let mask0 = Tensor::zeros((tgt_len, seqlen_offset), DType::F32, &self.device)?; Tensor::cat(&[&mask0, &mask], D::Minus1)? } else { mask }; mask.expand((b_size, 1, tgt_len, tgt_len + seqlen_offset))? .to_dtype(DType::F32) } pub fn forward(&mut self, xs: &Tensor, pos: usize) -> Result<Tensor> { let (b_size, seq_len) = xs.dims2()?; let attention_mask = if seq_len <= 1 { None } else { let mask = self.prepare_decoder_attention_mask(b_size, seq_len, pos)?; Some(mask) }; let xs = xs.apply(&self.embed_tokens)?; let mut xs = (xs * (self.hidden_size as f64).sqrt())?; for layer in self.layers.iter_mut() { xs = layer.forward(&xs, attention_mask.as_ref(), pos)?; } let logits = xs .narrow(1, seq_len - 1, 1)? .apply(&self.final_norm)? .apply(&self.lm_head)?; let logits = ((logits / self.logits_soft_cap)?.tanh()? * self.logits_soft_cap)?; Ok(logits) } }
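A short prefill sketch for the quantized RecurrentGemma model above. It takes the config, the GGUF-backed `VarBuilder`, and the prompt ids as inputs, and relies only on the `forward(&tokens, pos)` method defined in this file.

```rust
use candle::{Device, Result, Tensor};
use candle_transformers::models::quantized_recurrent_gemma::Model;
use candle_transformers::models::recurrent_gemma::Config;
use candle_transformers::quantized_var_builder::VarBuilder;

// Prefill pass: the whole prompt goes through at position 0, which fills the
// attention kv-caches as well as the conv/recurrent state of the RG-LRU blocks.
fn prefill(cfg: &Config, vb: VarBuilder, prompt: &[u32], device: &Device) -> Result<Tensor> {
    let mut model = Model::new(cfg, vb)?;
    let input = Tensor::new(prompt, device)?.unsqueeze(0)?; // (1, seq_len)
    // Returns soft-capped logits for the last position: (batch, 1, vocab).
    model.forward(&input, 0)
}
```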
7
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/codegeex4_9b.rs
//! CodeGeeX4 - A multi-language code generation model //! //! A Pre-Trained Model For Code Generation with Multilingual Evaluations on HumanEval-X" //! //! - 📝 [Arxiv](https://arxiv.org/abs/2303.17568) //! - 💻 [Github](https://github.com/THUDM/CodeGeeX) //! use crate::models::with_tracing::{linear_b as linear, Linear}; use candle::{DType, Device, IndexOp, Module, Result, Tensor, D}; use candle_nn::VarBuilder; #[derive(Debug, Clone)] pub struct Config { pub num_layers: usize, pub padded_vocab_size: usize, pub hidden_size: usize, pub ffn_hidden_size: usize, pub kv_channels: usize, pub num_attention_heads: usize, pub seq_length: usize, pub layernorm_epsilon: f64, pub rmsnorm: bool, pub apply_residual_connection_post_layernorm: bool, pub post_layer_norm: bool, pub add_bias_linear: bool, pub add_qkv_bias: bool, pub bias_dropout_fusion: bool, pub multi_query_attention: bool, pub multi_query_group_num: usize, pub apply_query_key_layer_scaling: bool, pub attention_softmax_in_fp32: bool, pub fp32_residual_connection: bool, } impl Config { pub fn codegeex4() -> Self { Self { num_layers: 40, padded_vocab_size: 151552, hidden_size: 4096, ffn_hidden_size: 13696, kv_channels: 128, num_attention_heads: 32, seq_length: 131072, layernorm_epsilon: 1e-5, rmsnorm: true, apply_residual_connection_post_layernorm: false, post_layer_norm: true, add_bias_linear: false, add_qkv_bias: true, bias_dropout_fusion: true, multi_query_attention: true, multi_query_group_num: 2, apply_query_key_layer_scaling: true, attention_softmax_in_fp32: true, fp32_residual_connection: false, } } } #[derive(Debug, Clone)] struct RotaryEmbedding { cache: Tensor, } impl RotaryEmbedding { fn new(cfg: &Config, dtype: DType, dev: &Device) -> Result<Self> { let rotary_dim = cfg.kv_channels; let n_elem = rotary_dim / 2; let inv_freq: Vec<_> = (0..n_elem) .step_by(2) .map(|i| 1f32 / 10_000f64.powf(i as f64 / n_elem as f64) as f32) .collect(); let inv_freq_len = inv_freq.len(); let inv_freq = Tensor::from_vec(inv_freq, (1, inv_freq_len), dev)?.to_dtype(dtype)?; let t = Tensor::arange(0u32, cfg.seq_length as u32, dev)? .to_dtype(dtype) .expect("unalbe to dytpe in Rotray Embedding new") .reshape((cfg.seq_length, 1))?; let freqs = t.matmul(&inv_freq)?; let cache = Tensor::stack(&[&freqs.cos()?, &freqs.sin()?], D::Minus1)?; Ok(Self { cache }) } fn apply(&self, xs: &Tensor, seqlen_offset: usize) -> Result<Tensor> { let (seqlen, _b, np, _hn) = xs.dims4()?; let cache = self.cache.narrow(0, seqlen_offset, seqlen)?; let rot_dim = cache.dim(D::Minus2)? * 2; let (xs, xs_pass) = ( xs.narrow(D::Minus1, 0, rot_dim)?, xs.narrow(D::Minus1, rot_dim, rot_dim)?, ); let xshaped = xs.reshape((seqlen, (), np, rot_dim / 2, 2))?; let cache = cache.reshape((seqlen, (), 1, rot_dim / 2, 2))?; let (xshaped0, xshaped1) = ( xshaped.i((.., .., .., .., 0))?, xshaped.i((.., .., .., .., 1))?, ); let (cache0, cache1) = (cache.i((.., .., .., .., 0))?, cache.i((.., .., .., .., 1))?); let xs_out = Tensor::stack( &[ (xshaped0.broadcast_mul(&cache0)? - xshaped1.broadcast_mul(&cache1)?)?, (xshaped1.broadcast_mul(&cache0)? 
+ xshaped0.broadcast_mul(&cache1)?)?, ], D::Minus1, )?; let xs_out = xs_out.flatten_from(3)?; Tensor::cat(&[xs_out, xs_pass], D::Minus1) } } #[derive(Debug, Clone)] struct CoreAttention { coeff: Option<f64>, norm_factor: f64, dtype: DType, } fn masked_fill(on_false: &Tensor, mask: &Tensor, on_true: f32, dtype: DType) -> Result<Tensor> { let shape = mask.shape(); let on_true = Tensor::new(on_true, on_false.device())?.broadcast_as(shape.dims())?; let m = mask.where_cond(&on_true.to_dtype(dtype)?, on_false)?; Ok(m) } impl CoreAttention { fn new(layer_number: usize, cfg: &Config, dtype: DType) -> Result<Self> { let norm_factor = (cfg.kv_channels as f64).sqrt(); let (norm_factor, coeff) = if cfg.apply_query_key_layer_scaling { let coeff = f64::max(1.0, layer_number as f64); (norm_factor * coeff, Some(coeff)) } else { (norm_factor, None) }; Ok(Self { coeff, norm_factor, dtype, }) } fn forward( &self, query_layer: &Tensor, key_layer: &Tensor, value_layer: &Tensor, attention_mask: &Option<Tensor>, ) -> Result<Tensor> { let output_size = ( query_layer.dim(1)?, // b query_layer.dim(2)?, // np query_layer.dim(0)?, // sq key_layer.dim(0)?, // sk ); let query_layer = query_layer.reshape((output_size.2, output_size.0 * output_size.1, ()))?; let key_layer = key_layer.reshape((output_size.3, output_size.0 * output_size.1, ()))?; let matmul_result = Tensor::matmul( &query_layer.transpose(0, 1)?.contiguous()?, &key_layer.transpose(0, 1)?.transpose(1, 2)?.contiguous()?, )?; let matmul_result = (matmul_result / self.norm_factor)?.reshape(output_size)?; let matmul_result = match self.coeff { None => matmul_result, Some(coeff) => (matmul_result * coeff)?, }; let attention_scores = match attention_mask { Some(mask) => masked_fill( &matmul_result, &mask.broadcast_left((matmul_result.dim(0)?, matmul_result.dim(1)?))?, f32::NEG_INFINITY, self.dtype, )?, None => matmul_result, }; let attention_probs = candle_nn::ops::softmax_last_dim(&attention_scores)?; let output_size = ( value_layer.dim(1)?, value_layer.dim(2)?, query_layer.dim(0)?, value_layer.dim(3)?, ); let value_layer = value_layer.reshape((value_layer.dim(0)?, output_size.0 * output_size.1, ()))?; let attention_probs = attention_probs.reshape((output_size.0 * output_size.1, output_size.2, ()))?; let context_layer = Tensor::matmul( &attention_probs.contiguous()?, &value_layer.transpose(0, 1)?.contiguous()?, )?; let context_layer = context_layer.reshape(output_size)?; let context_layer = context_layer.permute((2, 0, 1, 3))?.contiguous()?; context_layer.flatten_from(D::Minus2) } } #[derive(Debug, Clone)] struct SelfAttention { query_key_value: Linear, core_attention: CoreAttention, dense: Linear, multi_query_attention: bool, num_attention_heads_per_partition: usize, num_multi_query_groups_per_partition: usize, hidden_size_per_attention_head: usize, kv_cache: Option<(Tensor, Tensor)>, } impl SelfAttention { fn new(layer_number: usize, cfg: &Config, vb: VarBuilder) -> Result<Self> { let projection_size = cfg.kv_channels * cfg.num_attention_heads; let hidden_size_per_attention_head = projection_size / cfg.num_attention_heads; let qkv_hidden_size = if cfg.multi_query_attention { projection_size + 2 * hidden_size_per_attention_head * cfg.multi_query_group_num } else { 3 * projection_size }; let query_key_value = linear( cfg.hidden_size, qkv_hidden_size, cfg.add_bias_linear || cfg.add_qkv_bias, vb.pp("query_key_value"), )?; let core_attention = CoreAttention::new(layer_number, cfg, vb.dtype())?; let dense = linear( cfg.hidden_size, cfg.hidden_size, 
cfg.add_bias_linear, vb.pp("dense"), )?; Ok(Self { query_key_value, core_attention, dense, multi_query_attention: cfg.multi_query_attention, num_attention_heads_per_partition: cfg.num_attention_heads, num_multi_query_groups_per_partition: cfg.multi_query_group_num, hidden_size_per_attention_head: cfg.kv_channels, kv_cache: None, }) } fn reset_kv_cache(&mut self) { self.kv_cache = None } fn forward( &mut self, xs: &Tensor, attention_mask: &Option<Tensor>, rotary_emb: &RotaryEmbedding, ) -> Result<Tensor> { let mixed_x_layer = xs.apply(&self.query_key_value)?; if !self.multi_query_attention { candle::bail!("only multi_query_attention=true is supported") } let hpa = self.hidden_size_per_attention_head; let query_layer = mixed_x_layer.narrow(D::Minus1, 0, self.num_attention_heads_per_partition * hpa)?; let key_layer = mixed_x_layer.narrow( D::Minus1, self.num_attention_heads_per_partition * hpa, self.num_multi_query_groups_per_partition * hpa, )?; let value_layer = mixed_x_layer.narrow( D::Minus1, self.num_attention_heads_per_partition * hpa + self.num_multi_query_groups_per_partition * hpa, self.num_multi_query_groups_per_partition * hpa, )?; let query_layer = query_layer.reshape(( query_layer.dim(0)?, query_layer.dim(1)?, self.num_attention_heads_per_partition, hpa, ))?; let key_layer = key_layer.reshape(( key_layer.dim(0)?, key_layer.dim(1)?, self.num_multi_query_groups_per_partition, hpa, ))?; let value_layer = value_layer.reshape(( value_layer.dim(0)?, value_layer.dim(1)?, self.num_multi_query_groups_per_partition, hpa, ))?; // Rotary embeddings. let seqlen_offset = match &self.kv_cache { None => 0, Some((prev_k, _)) => prev_k.dim(0)?, }; let query_layer = rotary_emb.apply(&query_layer, seqlen_offset)?; let key_layer = rotary_emb.apply(&key_layer, seqlen_offset)?; // KV cache. let (key_layer, value_layer) = match &self.kv_cache { None => (key_layer, value_layer), Some((prev_k, prev_v)) => { let k = Tensor::cat(&[prev_k, &key_layer], 0)?; let v = Tensor::cat(&[prev_v, &value_layer], 0)?; (k, v) } }; self.kv_cache = Some((key_layer.clone(), value_layer.clone())); // Repeat KV. let ratio = self.num_attention_heads_per_partition / self.num_multi_query_groups_per_partition; let key_layer = { let (d0, d1, d2, d3) = key_layer.dims4()?; key_layer .unsqueeze(D::Minus2)? .expand((d0, d1, d2, ratio, d3))? .reshape(( d0, d1, self.num_attention_heads_per_partition, self.hidden_size_per_attention_head, ))? }; let value_layer = { let (d0, d1, d2, d3) = value_layer.dims4()?; value_layer .unsqueeze(D::Minus2)? .expand((d0, d1, d2, ratio, d3))? .reshape(( d0, d1, self.num_attention_heads_per_partition, self.hidden_size_per_attention_head, ))? }; let context_layer = self.core_attention .forward(&query_layer, &key_layer, &value_layer, attention_mask)?; let output = context_layer.apply(&self.dense)?; Ok(output) } } #[allow(clippy::upper_case_acronyms)] #[derive(Debug, Clone)] struct MLP { dense_h_to_4h: Linear, dense_4h_to_h: Linear, } impl MLP { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let dense_h_to_4h = linear( cfg.hidden_size, cfg.ffn_hidden_size * 2, cfg.add_bias_linear, vb.pp("dense_h_to_4h"), )?; let dense_4h_to_h = linear( cfg.ffn_hidden_size, cfg.hidden_size, cfg.add_bias_linear, vb.pp("dense_4h_to_h"), )?; Ok(Self { dense_4h_to_h, dense_h_to_4h, }) } } impl Module for MLP { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.apply(&self.dense_h_to_4h)? .apply(&candle_nn::Activation::Swiglu)? 
.apply(&self.dense_4h_to_h) } } #[derive(Debug, Clone)] struct Block { input_layernorm: candle_nn::LayerNorm, self_attention: SelfAttention, post_attention_layernorm: candle_nn::LayerNorm, mlp: MLP, apply_residual_connection_post_layernorm: bool, } impl Block { fn new(layer_number: usize, cfg: &Config, vb: VarBuilder) -> Result<Self> { let input_layernorm = if cfg.rmsnorm { candle_nn::rms_norm( cfg.hidden_size, cfg.layernorm_epsilon, vb.pp("input_layernorm"), )? .into_inner() } else { candle_nn::layer_norm( cfg.hidden_size, cfg.layernorm_epsilon, vb.pp("input_layernorm"), )? }; let post_attention_layernorm = if cfg.rmsnorm { candle_nn::rms_norm( cfg.hidden_size, cfg.layernorm_epsilon, vb.pp("post_attention_layernorm"), )? .into_inner() } else { candle_nn::layer_norm( cfg.hidden_size, cfg.layernorm_epsilon, vb.pp("post_attention_layernorm"), )? }; let self_attention = SelfAttention::new(layer_number, cfg, vb.pp("self_attention"))?; let mlp = MLP::new(cfg, vb.pp("mlp"))?; Ok(Self { input_layernorm, self_attention, post_attention_layernorm, mlp, apply_residual_connection_post_layernorm: cfg.apply_residual_connection_post_layernorm, }) } fn reset_kv_cache(&mut self) { self.self_attention.reset_kv_cache() } fn forward( &mut self, xs: &Tensor, attention_mask: &Option<Tensor>, rotary_emb: &RotaryEmbedding, ) -> Result<Tensor> { let layernorm_output = xs.apply(&self.input_layernorm)?; let attention_output = self.self_attention .forward(&layernorm_output, attention_mask, rotary_emb)?; let residual = if self.apply_residual_connection_post_layernorm { &layernorm_output } else { xs }; let layernorm_input = (residual + attention_output)?; let layernorm_output = layernorm_input.apply(&self.post_attention_layernorm)?; let mlp_output = layernorm_output.apply(&self.mlp)?; let residual = if self.apply_residual_connection_post_layernorm { &layernorm_output } else { &layernorm_input }; mlp_output + residual } } #[derive(Debug, Clone)] struct Transformer { layers: Vec<Block>, final_layernorm: Option<candle_nn::LayerNorm>, rotary_emb: RotaryEmbedding, } impl Transformer { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let vb_l = vb.pp("layers"); let mut layers = Vec::with_capacity(cfg.num_layers); for layer_index in 0..cfg.num_layers { let block = Block::new(layer_index + 1, cfg, vb_l.pp(layer_index))?; layers.push(block) } let final_layernorm = if cfg.post_layer_norm { let ln = if cfg.rmsnorm { candle_nn::rms_norm( cfg.hidden_size, cfg.layernorm_epsilon, vb.pp("final_layernorm"), )? .into_inner() } else { candle_nn::layer_norm( cfg.hidden_size, cfg.layernorm_epsilon, vb.pp("final_layernorm"), )? }; Some(ln) } else { None }; let rotary_emb = RotaryEmbedding::new(cfg, vb.dtype(), vb.device())?; Ok(Self { layers, final_layernorm, rotary_emb, }) } fn reset_kv_cache(&mut self) { for block in self.layers.iter_mut() { block.reset_kv_cache() } } fn forward(&mut self, xs: &Tensor, attention_mask: &Option<Tensor>) -> Result<Tensor> { let mut xs = xs.clone(); for block in self.layers.iter_mut() { xs = block.forward(&xs, attention_mask, &self.rotary_emb)? 
} match self.final_layernorm.as_ref() { None => Ok(xs), Some(ln) => xs.apply(ln), } } } #[derive(Debug, Clone)] struct Embedding { word_embeddings: candle_nn::Embedding, fp32_residual_connection: bool, } impl Embedding { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let word_embeddings = candle_nn::embedding( cfg.padded_vocab_size, cfg.hidden_size, vb.pp("word_embeddings"), )?; Ok(Self { word_embeddings, fp32_residual_connection: cfg.fp32_residual_connection, }) } } impl Module for Embedding { fn forward(&self, xs: &Tensor) -> Result<Tensor> { let xs = self.word_embeddings.forward(xs)?.transpose(0, 1)?; // b,s,h -> s,b,h if self.fp32_residual_connection { xs.to_dtype(candle::DType::F32) } else { xs.contiguous() } } } #[derive(Debug, Clone)] pub struct Model { embedding: Embedding, encoder: Transformer, output_layer: Linear, } fn get_mask(size: usize, device: &Device) -> Result<Tensor> { let mask: Vec<_> = (0..size) .flat_map(|i| (0..size).map(move |j| u8::from(j > i))) .collect(); Tensor::from_slice(&mask, (size, size), device) } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let vb = vb.pp("transformer"); let embedding = Embedding::new(cfg, vb.pp("embedding"))?; let encoder = Transformer::new(cfg, vb.pp("encoder"))?; let output_layer = linear( cfg.hidden_size, cfg.padded_vocab_size, false, vb.pp("output_layer"), )?; Ok(Self { embedding, encoder, output_layer, }) } pub fn reset_kv_cache(&mut self) { self.encoder.reset_kv_cache() } pub fn forward(&mut self, xs: &Tensor) -> Result<Tensor> { let (_b_size, seq_len) = xs.dims2()?; let input_embeds = xs.apply(&self.embedding)?; let attention_mask = if seq_len <= 1 { None } else { Some(get_mask(seq_len, xs.device())?) }; let xs = self.encoder.forward(&input_embeds, &attention_mask)?; let lm_logits = xs.i(seq_len - 1)?.apply(&self.output_layer)?; Ok(lm_logits) } }
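A hypothetical single-step sketch for the CodeGeeX4 model above. The single safetensors path and the bf16 dtype are assumptions for illustration; a real 9B checkpoint is normally split across several shards.

```rust
use candle::{DType, Device, Result, Tensor};
use candle_nn::VarBuilder;
use candle_transformers::models::codegeex4_9b::{Config, Model};

// Runs the prompt through the model once and returns the logits for the
// last position; the kv-cache is kept inside the attention layers, so later
// calls can pass just the newly generated token.
fn first_logits(prompt: &[u32], device: &Device) -> Result<Tensor> {
    let cfg = Config::codegeex4();
    // Placeholder path and dtype; adjust to the actual checkpoint layout.
    let vb = unsafe {
        VarBuilder::from_mmaped_safetensors(&["codegeex4.safetensors"], DType::BF16, device)?
    };
    let mut model = Model::new(&cfg, vb)?;
    model.reset_kv_cache();
    let input = Tensor::new(prompt, device)?.unsqueeze(0)?; // (1, seq_len)
    model.forward(&input)
}
```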
8
0
hf_public_repos/candle/candle-transformers/src
hf_public_repos/candle/candle-transformers/src/models/quantized_mpt.rs
//! Quantized MPT model implementation. //! //! MPT (MPT-7B) is a causal transformer model series optimized for code generation. //! This implementation provides quantization for reduced memory and compute. //! //! Key characteristics: //! - Multi-Query Grouped Attention (MQA) //! - Support for KV-caching //! - Pre-computed ALiBi attention biases //! - Support for 8-bit quantization //! //! References: //! - [Replit Code Models](https://huggingface.co/replit/replit-code-v1_5-3b) //! - [MPT-7B Implementation](https://github.com/mosaicml/llm-foundry) //! /// MPT model used by replit-code-v1_5-3b /// https://huggingface.co/replit/replit-code-v1_5-3b/blob/main/modeling_mpt.py /// use crate::quantized_nn::{layer_norm_no_bias, linear_no_bias, Embedding, Linear}; pub use crate::quantized_var_builder::VarBuilder; /// MPT model used by replit-code-v1_5-3b /// https://huggingface.co/replit/replit-code-v1_5-3b/blob/main/modeling_mpt.py use candle::{IndexOp, Module, Result, Tensor, D}; use candle_nn::LayerNorm; pub use super::mpt::Config; #[derive(Debug, Clone)] struct GroupedQueryAttention { wqkv: Linear, out_proj: Linear, kv_cache: Option<(Tensor, Tensor)>, softmax_scale: f64, head_dim: usize, d_model: usize, n_heads: usize, kv_n_heads: usize, attn_bias: Tensor, span: tracing::Span, } impl GroupedQueryAttention { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let head_dim = cfg.d_model / cfg.n_heads; let wqkv_size = cfg.d_model + 2 * cfg.kv_n_heads * head_dim; let wqkv = linear_no_bias(cfg.d_model, wqkv_size, vb.pp("Wqkv"))?; let softmax_scale = 1f64 / (head_dim as f64).sqrt(); let out_proj = linear_no_bias(cfg.d_model, cfg.d_model, vb.pp("out_proj"))?; let attn_bias = super::mpt::build_alibi_bias(cfg)?.to_device(vb.device())?; Ok(Self { wqkv, out_proj, kv_cache: None, softmax_scale, head_dim, d_model: cfg.d_model, n_heads: cfg.n_heads, kv_n_heads: cfg.kv_n_heads, attn_bias, span: tracing::span!(tracing::Level::TRACE, "gqa"), }) } fn forward(&mut self, xs: &Tensor, mask: Option<&Tensor>) -> Result<Tensor> { let _enter = self.span.enter(); let (b_size, seq_len, _n_embd) = xs.dims3()?; let qkv = self.wqkv.forward(xs)?; let query = qkv.narrow(2, 0, self.d_model)?; let kv_size = self.kv_n_heads * self.head_dim; let key = qkv.narrow(2, self.d_model, kv_size)?; let value = qkv.narrow(2, self.d_model + kv_size, kv_size)?; // scaled_multihead_dot_product_attention let query = query .reshape((b_size, seq_len, self.n_heads, ()))? .transpose(1, 2)?; // b,h,s,d let key = key .reshape((b_size, seq_len, self.kv_n_heads, ()))? .permute((0, 2, 3, 1))?; // b,h,d,s let value = value .reshape((b_size, seq_len, self.kv_n_heads, ()))? .transpose(1, 2)?; // b,h,s,d let (key, value) = match &self.kv_cache { None => (key, value), Some((prev_k, prev_v)) => { let k = Tensor::cat(&[prev_k, &key], 3)?; let v = Tensor::cat(&[prev_v, &value], 2)?; (k, v) } }; self.kv_cache = Some((key.clone(), value.clone())); let query = query.contiguous()?; let key = crate::utils::repeat_kv(key, self.n_heads / self.kv_n_heads)?.contiguous()?; let value = crate::utils::repeat_kv(value, self.n_heads / self.kv_n_heads)?.contiguous()?; let attn_weights = (query.matmul(&key)? * self.softmax_scale)?; let attn_bias = { let s_q = query.dim(D::Minus2)?; let s_k = key.dim(D::Minus1)?; let (_, _, a_q, a_k) = self.attn_bias.dims4()?; let start_q = a_q.saturating_sub(s_q); let start_k = a_k.saturating_sub(s_k); self.attn_bias.i((.., .., start_q.., start_k..))? 
}; let attn_weights = attn_weights.broadcast_add(&attn_bias)?; let attn_weights = match mask { None => attn_weights, Some(mask) => super::mpt::masked_fill( &attn_weights, &mask.broadcast_as(attn_weights.shape())?, f32::NEG_INFINITY, )?, }; let attn_weights = candle_nn::ops::softmax_last_dim(&attn_weights)?; let attn_output = attn_weights .matmul(&value)? .transpose(1, 2)? .flatten_from(D::Minus2)?; let out = attn_output.apply(&self.out_proj)?; Ok(out) } } #[derive(Debug, Clone)] struct Ffn { up_proj: Linear, down_proj: Linear, } impl Ffn { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let hidden = cfg.d_model * cfg.expansion_ratio; let up_proj = linear_no_bias(cfg.d_model, hidden, vb.pp("up_proj"))?; let down_proj = linear_no_bias(hidden, cfg.d_model, vb.pp("down_proj"))?; Ok(Self { up_proj, down_proj }) } } impl Module for Ffn { fn forward(&self, xs: &Tensor) -> Result<Tensor> { xs.apply(&self.up_proj)?.gelu_erf()?.apply(&self.down_proj) } } #[derive(Debug, Clone)] struct MPTBlock { norm1: LayerNorm, // Do we need the low-precision variant? attn: GroupedQueryAttention, norm2: LayerNorm, ffn: Ffn, } impl MPTBlock { fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let norm1 = layer_norm_no_bias(cfg.d_model, 1e-5, vb.pp("norm_1"))?; let norm2 = layer_norm_no_bias(cfg.d_model, 1e-5, vb.pp("norm_2"))?; let attn = GroupedQueryAttention::new(cfg, vb.pp("attn"))?; let ffn = Ffn::new(cfg, vb.pp("ffn"))?; Ok(Self { norm1, attn, norm2, ffn, }) } fn forward(&mut self, xs: &Tensor, mask: Option<&Tensor>) -> Result<Tensor> { let residual = xs; let xs = xs.apply(&self.norm1)?; let xs = self.attn.forward(&xs, mask)?; let xs = (xs + residual)?; let residual = &xs; let xs = xs.apply(&self.norm2)?.apply(&self.ffn)?; xs + residual } } #[derive(Debug, Clone)] pub struct Model { wte: Embedding, blocks: Vec<MPTBlock>, norm_f: LayerNorm, } impl Model { pub fn new(cfg: &Config, vb: VarBuilder) -> Result<Self> { let wte = Embedding::new(cfg.vocab_size, cfg.d_model, vb.pp("wte"))?; let vb_b = vb.pp("blocks"); let mut blocks = Vec::with_capacity(cfg.n_layers); for i in 0..cfg.n_layers { let block = MPTBlock::new(cfg, vb_b.pp(i))?; blocks.push(block) } let norm_f = layer_norm_no_bias(cfg.d_model, 1e-5, vb.pp("norm_f"))?; Ok(Self { wte, blocks, norm_f, }) } pub fn forward(&mut self, xs: &Tensor) -> Result<Tensor> { let (_b_size, seq_len) = xs.dims2()?; let mut xs = xs.apply(&self.wte)?; let mask = if seq_len <= 1 { None } else { Some(super::mpt::get_mask(seq_len, xs.device())?) }; for block in self.blocks.iter_mut() { xs = block.forward(&xs, mask.as_ref())?; } let xs = xs.apply(&self.norm_f)?; let logits = xs .narrow(1, seq_len - 1, 1)? .squeeze(1)? .matmul(&self.wte.embeddings().t()?)? .squeeze(1)?; Ok(logits) } }
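// Usage sketch (not part of the file above): load the quantized replit/MPT weights from a
// gguf file and run a single forward pass. Here `candle` is the candle-core crate, as in the
// file above. The exact `VarBuilder::from_gguf` signature and the `Config::replit_code_v1_5_3b()`
// constructor name are assumptions that may differ across candle versions; treat this as an
// illustration rather than the canonical example.
use candle::{Device, Result, Tensor};
use candle_transformers::models::mpt::Config;
use candle_transformers::models::quantized_mpt::{Model, VarBuilder};

fn last_token_logits(gguf_path: &str, token_ids: &[u32]) -> Result<Tensor> {
    let device = Device::Cpu;
    let vb = VarBuilder::from_gguf(gguf_path, &device)?; // assumed signature
    let cfg = Config::replit_code_v1_5_3b(); // assumed constructor name
    let mut model = Model::new(&cfg, vb)?;
    let input = Tensor::new(token_ids, &device)?.unsqueeze(0)?; // (1, seq_len)
    // `forward` narrows to the last position, so this returns the logits for the next token.
    model.forward(&input)
}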
9
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/dtype.rs
//! Types for elements that can be stored and manipulated using tensors. #![allow(clippy::redundant_closure_call)] use crate::backend::BackendStorage; use crate::{CpuStorage, CpuStorageRef, Error, Result}; /// The different types of elements allowed in tensors. #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)] pub enum DType { // Unsigned 8 bits integer. U8, // Unsigned 32 bits integer. U32, // Signed 64 bits integer. I64, // Brain floating-point using half precision (16 bits). BF16, // Floating-point using half precision (16 bits). F16, // Floating-point using single precision (32 bits). F32, // Floating-point using double precision (64 bits). F64, } #[derive(Debug, PartialEq, Eq)] pub struct DTypeParseError(String); impl std::fmt::Display for DTypeParseError { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { write!(f, "cannot parse '{}' as a dtype", self.0) } } impl std::error::Error for DTypeParseError {} impl std::str::FromStr for DType { type Err = DTypeParseError; fn from_str(s: &str) -> std::result::Result<Self, Self::Err> { match s { "u8" => Ok(Self::U8), "u32" => Ok(Self::U32), "i64" => Ok(Self::I64), "bf16" => Ok(Self::BF16), "f16" => Ok(Self::F16), "f32" => Ok(Self::F32), "f64" => Ok(Self::F64), _ => Err(DTypeParseError(s.to_string())), } } } impl DType { /// String representation for dtypes. pub fn as_str(&self) -> &'static str { match self { Self::U8 => "u8", Self::U32 => "u32", Self::I64 => "i64", Self::BF16 => "bf16", Self::F16 => "f16", Self::F32 => "f32", Self::F64 => "f64", } } /// The size used by each element in bytes, i.e. 1 for `U8`, 4 for `F32`. pub fn size_in_bytes(&self) -> usize { match self { Self::U8 => 1, Self::U32 => 4, Self::I64 => 8, Self::BF16 => 2, Self::F16 => 2, Self::F32 => 4, Self::F64 => 8, } } pub fn is_int(&self) -> bool { match self { Self::U8 | Self::U32 | Self::I64 => true, Self::BF16 | Self::F16 | Self::F32 | Self::F64 => false, } } pub fn is_float(&self) -> bool { match self { Self::U8 | Self::U32 | Self::I64 => false, Self::BF16 | Self::F16 | Self::F32 | Self::F64 => true, } } } pub trait WithDType: Sized + Copy + num_traits::NumAssign + std::cmp::PartialOrd + std::fmt::Display + 'static + Send + Sync + std::any::Any + crate::cpu::kernels::VecOps { const DTYPE: DType; fn from_f64(v: f64) -> Self; fn to_f64(self) -> f64; fn cpu_storage_ref(data: &[Self]) -> CpuStorageRef<'_>; fn to_cpu_storage_owned(data: Vec<Self>) -> CpuStorage; fn to_cpu_storage(data: &[Self]) -> CpuStorage { Self::to_cpu_storage_owned(data.to_vec()) } fn cpu_storage_as_slice(s: &CpuStorage) -> Result<&[Self]>; fn cpu_storage_data(s: CpuStorage) -> Result<Vec<Self>>; } macro_rules! 
with_dtype { ($ty:ty, $dtype:ident, $from_f64:expr, $to_f64:expr) => { impl WithDType for $ty { const DTYPE: DType = DType::$dtype; fn from_f64(v: f64) -> Self { $from_f64(v) } fn to_f64(self) -> f64 { $to_f64(self) } fn cpu_storage_ref(data: &[Self]) -> CpuStorageRef<'_> { CpuStorageRef::$dtype(data) } fn to_cpu_storage_owned(data: Vec<Self>) -> CpuStorage { CpuStorage::$dtype(data) } fn cpu_storage_data(s: CpuStorage) -> Result<Vec<Self>> { match s { CpuStorage::$dtype(data) => Ok(data), _ => Err(Error::UnexpectedDType { expected: DType::$dtype, got: s.dtype(), msg: "unexpected dtype", } .bt()), } } fn cpu_storage_as_slice(s: &CpuStorage) -> Result<&[Self]> { match s { CpuStorage::$dtype(data) => Ok(data), _ => Err(Error::UnexpectedDType { expected: DType::$dtype, got: s.dtype(), msg: "unexpected dtype", } .bt()), } } } }; } use half::{bf16, f16}; with_dtype!(u8, U8, |v: f64| v as u8, |v: u8| v as f64); with_dtype!(u32, U32, |v: f64| v as u32, |v: u32| v as f64); with_dtype!(i64, I64, |v: f64| v as i64, |v: i64| v as f64); with_dtype!(f16, F16, f16::from_f64, f16::to_f64); with_dtype!(bf16, BF16, bf16::from_f64, bf16::to_f64); with_dtype!(f32, F32, |v: f64| v as f32, |v: f32| v as f64); with_dtype!(f64, F64, |v: f64| v, |v: f64| v); pub trait IntDType: WithDType { fn is_true(&self) -> bool; fn as_usize(&self) -> usize; } impl IntDType for i64 { fn is_true(&self) -> bool { *self != 0 } fn as_usize(&self) -> usize { *self as usize } } impl IntDType for u32 { fn is_true(&self) -> bool { *self != 0 } fn as_usize(&self) -> usize { *self as usize } } impl IntDType for u8 { fn is_true(&self) -> bool { *self != 0 } fn as_usize(&self) -> usize { *self as usize } } pub trait FloatDType: WithDType {} impl FloatDType for f16 {} impl FloatDType for bf16 {} impl FloatDType for f32 {} impl FloatDType for f64 {}
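// Small usage sketch for the helpers above. Outside of candle-core the crate is typically
// imported as `candle_core`; the methods themselves are exactly the ones defined in this file.
use std::str::FromStr;

fn dtype_info() {
    use candle_core::DType;
    let dt = DType::from_str("bf16").unwrap();
    assert_eq!(dt, DType::BF16);
    assert_eq!(dt.size_in_bytes(), 2); // 16-bit brain float
    assert!(dt.is_float() && !dt.is_int());
    assert_eq!(dt.as_str(), "bf16");
    // Parsing an unknown name returns a DTypeParseError rather than panicking.
    assert!(DType::from_str("f8").is_err());
}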
0
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/backend.rs
//! Traits to Define Backend Behavior //! use crate::op::{BinaryOpT, CmpOp, ReduceOp, UnaryOpT}; use crate::{CpuStorage, DType, Layout, Result, Shape}; pub trait BackendStorage: Sized { type Device: BackendDevice; fn try_clone(&self, _: &Layout) -> Result<Self>; fn dtype(&self) -> DType; fn device(&self) -> &Self::Device; // Maybe this should return a Cow instead so that no copy is done on the cpu case. fn to_cpu_storage(&self) -> Result<CpuStorage>; fn affine(&self, _: &Layout, _: f64, _: f64) -> Result<Self>; fn powf(&self, _: &Layout, _: f64) -> Result<Self>; fn elu(&self, _: &Layout, _: f64) -> Result<Self>; fn reduce_op(&self, _: ReduceOp, _: &Layout, _: &[usize]) -> Result<Self>; fn cmp(&self, _: CmpOp, _: &Self, _: &Layout, _: &Layout) -> Result<Self>; fn to_dtype(&self, _: &Layout, _: DType) -> Result<Self>; fn unary_impl<B: UnaryOpT>(&self, _: &Layout) -> Result<Self>; fn binary_impl<B: BinaryOpT>(&self, _: &Self, _: &Layout, _: &Layout) -> Result<Self>; fn where_cond(&self, _: &Layout, _: &Self, _: &Layout, _: &Self, _: &Layout) -> Result<Self>; fn conv1d( &self, _l: &Layout, _kernel: &Self, _kernel_l: &Layout, _params: &crate::conv::ParamsConv1D, ) -> Result<Self>; fn conv_transpose1d( &self, _l: &Layout, _kernel: &Self, _kernel_l: &Layout, _params: &crate::conv::ParamsConvTranspose1D, ) -> Result<Self>; fn conv2d( &self, _l: &Layout, _kernel: &Self, _kernel_l: &Layout, _params: &crate::conv::ParamsConv2D, ) -> Result<Self>; fn conv_transpose2d( &self, _l: &Layout, _kernel: &Self, _kernel_l: &Layout, _params: &crate::conv::ParamsConvTranspose2D, ) -> Result<Self>; fn avg_pool2d(&self, _: &Layout, _: (usize, usize), _: (usize, usize)) -> Result<Self>; fn max_pool2d(&self, _: &Layout, _: (usize, usize), _: (usize, usize)) -> Result<Self>; fn upsample_nearest1d(&self, _: &Layout, _: usize) -> Result<Self>; fn upsample_nearest2d(&self, _: &Layout, _: usize, _: usize) -> Result<Self>; fn gather(&self, _: &Layout, _: &Self, _: &Layout, _: usize) -> Result<Self>; fn scatter_add( &self, _: &Layout, _: &Self, _: &Layout, _: &Self, _: &Layout, _: usize, ) -> Result<Self>; fn index_select(&self, _: &Self, _: &Layout, _: &Layout, _: usize) -> Result<Self>; fn index_add( &self, _: &Layout, _: &Self, _: &Layout, _: &Self, _: &Layout, _: usize, ) -> Result<Self>; fn matmul( &self, _: &Self, _: (usize, usize, usize, usize), _: &Layout, _: &Layout, ) -> Result<Self>; fn copy_strided_src(&self, _: &mut Self, _: usize, _: &Layout) -> Result<()>; #[allow(clippy::too_many_arguments)] // Similar to cudaMemcpy2D, though values are in elements and not in bytes. fn copy2d( &self, _: &mut Self, _d1: usize, _d2: usize, _src_stride1: usize, _dst_stride1: usize, _src_offset: usize, _dst_offset: usize, ) -> Result<()>; } pub trait BackendDevice: Sized + std::fmt::Debug + Clone { type Storage: BackendStorage; // TODO: Make the usize generic and part of a generic DeviceLocation. fn new(_: usize) -> Result<Self>; fn location(&self) -> crate::DeviceLocation; fn same_device(&self, _: &Self) -> bool; fn zeros_impl(&self, _shape: &Shape, _dtype: DType) -> Result<Self::Storage>; fn ones_impl(&self, _shape: &Shape, _dtype: DType) -> Result<Self::Storage>; /// # Safety /// This function is unsafe as it doesn't initialize the underlying data store. /// The caller should ensure that the data is properly initialized as early as possible /// after this call. 
unsafe fn alloc_uninit(&self, _shape: &Shape, _dtype: DType) -> Result<Self::Storage>; fn storage_from_slice<T: crate::WithDType>(&self, _: &[T]) -> Result<Self::Storage>; fn storage_from_cpu_storage(&self, _: &CpuStorage) -> Result<Self::Storage>; fn storage_from_cpu_storage_owned(&self, _: CpuStorage) -> Result<Self::Storage>; fn rand_uniform(&self, _: &Shape, _: DType, _: f64, _: f64) -> Result<Self::Storage>; fn rand_normal(&self, _: &Shape, _: DType, _: f64, _: f64) -> Result<Self::Storage>; fn set_seed(&self, _: u64) -> Result<()>; /// Synchronize should block until all the operations on the device are completed. fn synchronize(&self) -> Result<()>; }
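// Sketch of how these traits are consumed: code can be written generically over any
// `BackendDevice` (CPU, CUDA, Metal) and its associated `BackendStorage`. The function below
// is illustrative and not part of candle; it assumes `candle_core::backend` is the public
// path for this module.
fn alloc_and_describe<D: candle_core::backend::BackendDevice>(
    dev: &D,
    dims: &[usize],
    dtype: candle_core::DType,
) -> candle_core::Result<String> {
    use candle_core::backend::BackendStorage;
    let shape = candle_core::Shape::from(dims);
    let storage = dev.zeros_impl(&shape, dtype)?;
    Ok(format!(
        "{:?} zeros on {:?} ({} elements)",
        storage.dtype(),
        dev.location(),
        shape.elem_count()
    ))
}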
1
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/test_utils.rs
use crate::{Result, Tensor};

#[macro_export]
macro_rules! test_device {
    // TODO: Switch to generating the two last arguments automatically once concat_idents is
    // stable. https://github.com/rust-lang/rust/issues/29599
    ($fn_name: ident, $test_cpu: ident, $test_cuda: ident, $test_metal: ident) => {
        #[test]
        fn $test_cpu() -> Result<()> {
            $fn_name(&Device::Cpu)
        }

        #[cfg(feature = "cuda")]
        #[test]
        fn $test_cuda() -> Result<()> {
            $fn_name(&Device::new_cuda(0)?)
        }

        #[cfg(feature = "metal")]
        #[test]
        fn $test_metal() -> Result<()> {
            $fn_name(&Device::new_metal(0)?)
        }
    };
}

pub fn to_vec0_round(t: &Tensor, digits: i32) -> Result<f32> {
    let b = 10f32.powi(digits);
    let t = t.to_vec0::<f32>()?;
    Ok(f32::round(t * b) / b)
}

pub fn to_vec1_round(t: &Tensor, digits: i32) -> Result<Vec<f32>> {
    let b = 10f32.powi(digits);
    let t = t.to_vec1::<f32>()?;
    let t = t.iter().map(|t| f32::round(t * b) / b).collect();
    Ok(t)
}

pub fn to_vec2_round(t: &Tensor, digits: i32) -> Result<Vec<Vec<f32>>> {
    let b = 10f32.powi(digits);
    let t = t.to_vec2::<f32>()?;
    let t = t
        .iter()
        .map(|t| t.iter().map(|t| f32::round(t * b) / b).collect())
        .collect();
    Ok(t)
}

pub fn to_vec3_round(t: &Tensor, digits: i32) -> Result<Vec<Vec<Vec<f32>>>> {
    let b = 10f32.powi(digits);
    let t = t.to_vec3::<f32>()?;
    let t = t
        .iter()
        .map(|t| {
            t.iter()
                .map(|t| t.iter().map(|t| f32::round(t * b) / b).collect())
                .collect()
        })
        .collect();
    Ok(t)
}
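// Example of how these helpers are used in candle's own test suites (sketch): the
// `test_device!` macro expands into one #[test] per backend, and the rounding helpers keep
// floating-point comparisons stable across platforms.
#[cfg(test)]
mod example_usage {
    use candle_core::{test_device, test_utils::to_vec2_round, Device, Result, Tensor};

    fn doubling(device: &Device) -> Result<()> {
        let a = Tensor::new(&[[1.0f32, 2.0], [3.0, 4.0]], device)?;
        let b = (&a + &a)?;
        assert_eq!(to_vec2_round(&b, 4)?, [[2.0f32, 4.0], [6.0, 8.0]]);
        Ok(())
    }

    test_device!(doubling, doubling_cpu, doubling_cuda, doubling_metal);
}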
2
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/shape.rs
//! The shape of a tensor is a tuple with the size of each of its dimensions. #![allow(clippy::redundant_closure_call)] use crate::{Error, Result}; #[derive(Clone, PartialEq, Eq)] pub struct Shape(Vec<usize>); pub const SCALAR: Shape = Shape(vec![]); impl std::fmt::Debug for Shape { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { write!(f, "{:?}", &self.dims()) } } impl<const C: usize> From<&[usize; C]> for Shape { fn from(dims: &[usize; C]) -> Self { Self(dims.to_vec()) } } impl From<&[usize]> for Shape { fn from(dims: &[usize]) -> Self { Self(dims.to_vec()) } } impl From<&Shape> for Shape { fn from(shape: &Shape) -> Self { Self(shape.0.to_vec()) } } impl From<()> for Shape { fn from(_: ()) -> Self { Self(vec![]) } } impl From<usize> for Shape { fn from(d1: usize) -> Self { Self(vec![d1]) } } impl From<(usize,)> for Shape { fn from(d1: (usize,)) -> Self { Self(vec![d1.0]) } } impl From<(usize, usize)> for Shape { fn from(d12: (usize, usize)) -> Self { Self(vec![d12.0, d12.1]) } } impl From<(usize, usize, usize)> for Shape { fn from(d123: (usize, usize, usize)) -> Self { Self(vec![d123.0, d123.1, d123.2]) } } impl From<(usize, usize, usize, usize)> for Shape { fn from(d1234: (usize, usize, usize, usize)) -> Self { Self(vec![d1234.0, d1234.1, d1234.2, d1234.3]) } } impl From<(usize, usize, usize, usize, usize)> for Shape { fn from(d12345: (usize, usize, usize, usize, usize)) -> Self { Self(vec![d12345.0, d12345.1, d12345.2, d12345.3, d12345.4]) } } impl From<(usize, usize, usize, usize, usize, usize)> for Shape { fn from(d123456: (usize, usize, usize, usize, usize, usize)) -> Self { Self(vec![ d123456.0, d123456.1, d123456.2, d123456.3, d123456.4, d123456.5, ]) } } impl From<Vec<usize>> for Shape { fn from(dims: Vec<usize>) -> Self { Self(dims) } } macro_rules! extract_dims { ($fn_name:ident, $cnt:tt, $dims:expr, $out_type:ty) => { pub fn $fn_name(dims: &[usize]) -> Result<$out_type> { if dims.len() != $cnt { Err(Error::UnexpectedNumberOfDims { expected: $cnt, got: dims.len(), shape: Shape::from(dims), } .bt()) } else { Ok($dims(dims)) } } impl Shape { pub fn $fn_name(&self) -> Result<$out_type> { $fn_name(self.0.as_slice()) } } impl crate::Tensor { pub fn $fn_name(&self) -> Result<$out_type> { self.shape().$fn_name() } } impl std::convert::TryInto<$out_type> for Shape { type Error = crate::Error; fn try_into(self) -> std::result::Result<$out_type, Self::Error> { self.$fn_name() } } }; } impl Shape { pub fn from_dims(dims: &[usize]) -> Self { Self(dims.to_vec()) } /// The rank is the number of dimensions, 0 for a scalar value, 1 for a vector, etc. pub fn rank(&self) -> usize { self.0.len() } pub fn into_dims(self) -> Vec<usize> { self.0 } /// The dimensions as a slice of `usize`. pub fn dims(&self) -> &[usize] { &self.0 } /// The dimension size for a specified dimension index. pub fn dim<D: Dim>(&self, dim: D) -> Result<usize> { let dim = dim.to_index(self, "dim")?; Ok(self.dims()[dim]) } /// The total number of elements, this is the product of all dimension sizes. pub fn elem_count(&self) -> usize { self.0.iter().product() } /// The strides given in number of elements for a contiguous n-dimensional /// arrays using this shape. pub(crate) fn stride_contiguous(&self) -> Vec<usize> { let mut stride: Vec<_> = self .0 .iter() .rev() .scan(1, |prod, u| { let prod_pre_mult = *prod; *prod *= u; Some(prod_pre_mult) }) .collect(); stride.reverse(); stride } /// Returns true if the strides are C contiguous (aka row major). 
pub fn is_contiguous(&self, stride: &[usize]) -> bool { if self.0.len() != stride.len() { return false; } let mut acc = 1; for (&stride, &dim) in stride.iter().zip(self.0.iter()).rev() { if dim > 1 && stride != acc { return false; } acc *= dim; } true } /// Returns true if the strides are Fortran contiguous (aka column major). pub fn is_fortran_contiguous(&self, stride: &[usize]) -> bool { if self.0.len() != stride.len() { return false; } let mut acc = 1; for (&stride, &dim) in stride.iter().zip(self.0.iter()) { if dim > 1 && stride != acc { return false; } acc *= dim; } true } /// Modifies the shape by adding a list of additional dimensions at the end of the existing /// dimensions. pub fn extend(mut self, additional_dims: &[usize]) -> Self { self.0.extend(additional_dims); self } /// Check whether the two shapes are compatible for broadcast, and if it is the case return the /// broadcasted shape. This is to be used for binary pointwise ops. pub fn broadcast_shape_binary_op(&self, rhs: &Self, op: &'static str) -> Result<Shape> { let lhs = self; let lhs_dims = lhs.dims(); let rhs_dims = rhs.dims(); let lhs_ndims = lhs_dims.len(); let rhs_ndims = rhs_dims.len(); let bcast_ndims = usize::max(lhs_ndims, rhs_ndims); let mut bcast_dims = vec![0; bcast_ndims]; for (idx, bcast_value) in bcast_dims.iter_mut().enumerate() { let rev_idx = bcast_ndims - idx; let l_value = if lhs_ndims < rev_idx { 1 } else { lhs_dims[lhs_ndims - rev_idx] }; let r_value = if rhs_ndims < rev_idx { 1 } else { rhs_dims[rhs_ndims - rev_idx] }; *bcast_value = if l_value == r_value { l_value } else if l_value == 1 { r_value } else if r_value == 1 { l_value } else { Err(Error::ShapeMismatchBinaryOp { lhs: lhs.clone(), rhs: rhs.clone(), op, } .bt())? } } Ok(Shape::from(bcast_dims)) } pub(crate) fn broadcast_shape_matmul(&self, rhs: &Self) -> Result<(Shape, Shape)> { let lhs = self; let lhs_dims = lhs.dims(); let rhs_dims = rhs.dims(); if lhs_dims.len() < 2 || rhs_dims.len() < 2 { crate::bail!("only 2d matrixes are supported {lhs:?} {rhs:?}") } let (m, lhs_k) = (lhs_dims[lhs_dims.len() - 2], lhs_dims[lhs_dims.len() - 1]); let (rhs_k, n) = (rhs_dims[rhs_dims.len() - 2], rhs_dims[rhs_dims.len() - 1]); if lhs_k != rhs_k { crate::bail!("different inner dimensions in broadcast matmul {lhs:?} {rhs:?}") } let lhs_b = Self::from(&lhs_dims[..lhs_dims.len() - 2]); let rhs_b = Self::from(&rhs_dims[..rhs_dims.len() - 2]); let bcast = lhs_b.broadcast_shape_binary_op(&rhs_b, "broadcast_matmul")?; let bcast_dims = bcast.dims(); let bcast_lhs = [bcast_dims, &[m, lhs_k]].concat(); let bcast_rhs = [bcast_dims, &[rhs_k, n]].concat(); Ok((Shape::from(bcast_lhs), Shape::from(bcast_rhs))) } } pub trait Dim { fn to_index(&self, shape: &Shape, op: &'static str) -> Result<usize>; fn to_index_plus_one(&self, shape: &Shape, op: &'static str) -> Result<usize>; } impl Dim for usize { fn to_index(&self, shape: &Shape, op: &'static str) -> Result<usize> { let dim = *self; if dim >= shape.dims().len() { Err(Error::DimOutOfRange { shape: shape.clone(), dim: dim as i32, op, } .bt())? } else { Ok(dim) } } fn to_index_plus_one(&self, shape: &Shape, op: &'static str) -> Result<usize> { let dim = *self; if dim > shape.dims().len() { Err(Error::DimOutOfRange { shape: shape.clone(), dim: dim as i32, op, } .bt())? 
} else { Ok(dim) } } } #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)] pub enum D { Minus1, Minus2, Minus(usize), } impl D { fn out_of_range(&self, shape: &Shape, op: &'static str) -> Error { let dim = match self { Self::Minus1 => -1, Self::Minus2 => -2, Self::Minus(u) => -(*u as i32), }; Error::DimOutOfRange { shape: shape.clone(), dim, op, } .bt() } } impl Dim for D { fn to_index(&self, shape: &Shape, op: &'static str) -> Result<usize> { let rank = shape.rank(); match self { Self::Minus1 if rank >= 1 => Ok(rank - 1), Self::Minus2 if rank >= 2 => Ok(rank - 2), Self::Minus(u) if *u > 0 && rank >= *u => Ok(rank - *u), _ => Err(self.out_of_range(shape, op)), } } fn to_index_plus_one(&self, shape: &Shape, op: &'static str) -> Result<usize> { let rank = shape.rank(); match self { Self::Minus1 => Ok(rank), Self::Minus2 if rank >= 1 => Ok(rank - 1), Self::Minus(u) if *u > 0 && rank + 1 >= *u => Ok(rank + 1 - *u), _ => Err(self.out_of_range(shape, op)), } } } pub trait Dims: Sized { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>>; fn to_indexes(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let dims = self.to_indexes_internal(shape, op)?; for (i, &dim) in dims.iter().enumerate() { if dims[..i].contains(&dim) { Err(Error::DuplicateDimIndex { shape: shape.clone(), dims: dims.clone(), op, } .bt())? } if dim >= shape.rank() { Err(Error::DimOutOfRange { shape: shape.clone(), dim: dim as i32, op, } .bt())? } } Ok(dims) } } impl Dims for Vec<usize> { fn to_indexes_internal(self, _: &Shape, _: &'static str) -> Result<Vec<usize>> { Ok(self) } } impl<const N: usize> Dims for [usize; N] { fn to_indexes_internal(self, _: &Shape, _: &'static str) -> Result<Vec<usize>> { Ok(self.to_vec()) } } impl Dims for &[usize] { fn to_indexes_internal(self, _: &Shape, _: &'static str) -> Result<Vec<usize>> { Ok(self.to_vec()) } } impl Dims for () { fn to_indexes_internal(self, _: &Shape, _: &'static str) -> Result<Vec<usize>> { Ok(vec![]) } } impl<D: Dim + Sized> Dims for D { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let dim = self.to_index(shape, op)?; Ok(vec![dim]) } } impl<D: Dim> Dims for (D,) { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let dim = self.0.to_index(shape, op)?; Ok(vec![dim]) } } impl<D1: Dim, D2: Dim> Dims for (D1, D2) { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let d0 = self.0.to_index(shape, op)?; let d1 = self.1.to_index(shape, op)?; Ok(vec![d0, d1]) } } impl<D1: Dim, D2: Dim, D3: Dim> Dims for (D1, D2, D3) { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let d0 = self.0.to_index(shape, op)?; let d1 = self.1.to_index(shape, op)?; let d2 = self.2.to_index(shape, op)?; Ok(vec![d0, d1, d2]) } } impl<D1: Dim, D2: Dim, D3: Dim, D4: Dim> Dims for (D1, D2, D3, D4) { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let d0 = self.0.to_index(shape, op)?; let d1 = self.1.to_index(shape, op)?; let d2 = self.2.to_index(shape, op)?; let d3 = self.3.to_index(shape, op)?; Ok(vec![d0, d1, d2, d3]) } } impl<D1: Dim, D2: Dim, D3: Dim, D4: Dim, D5: Dim> Dims for (D1, D2, D3, D4, D5) { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let d0 = self.0.to_index(shape, op)?; let d1 = self.1.to_index(shape, op)?; let d2 = self.2.to_index(shape, op)?; let d3 = self.3.to_index(shape, op)?; let d4 = self.4.to_index(shape, op)?; 
Ok(vec![d0, d1, d2, d3, d4]) } } impl<D1: Dim, D2: Dim, D3: Dim, D4: Dim, D5: Dim, D6: Dim> Dims for (D1, D2, D3, D4, D5, D6) { fn to_indexes_internal(self, shape: &Shape, op: &'static str) -> Result<Vec<usize>> { let d0 = self.0.to_index(shape, op)?; let d1 = self.1.to_index(shape, op)?; let d2 = self.2.to_index(shape, op)?; let d3 = self.3.to_index(shape, op)?; let d4 = self.4.to_index(shape, op)?; let d5 = self.5.to_index(shape, op)?; Ok(vec![d0, d1, d2, d3, d4, d5]) } } extract_dims!(dims0, 0, |_: &[usize]| (), ()); extract_dims!(dims1, 1, |d: &[usize]| d[0], usize); extract_dims!(dims2, 2, |d: &[usize]| (d[0], d[1]), (usize, usize)); extract_dims!( dims3, 3, |d: &[usize]| (d[0], d[1], d[2]), (usize, usize, usize) ); extract_dims!( dims4, 4, |d: &[usize]| (d[0], d[1], d[2], d[3]), (usize, usize, usize, usize) ); extract_dims!( dims5, 5, |d: &[usize]| (d[0], d[1], d[2], d[3], d[4]), (usize, usize, usize, usize, usize) ); pub trait ShapeWithOneHole { fn into_shape(self, el_count: usize) -> Result<Shape>; } impl<S: Into<Shape>> ShapeWithOneHole for S { fn into_shape(self, _el_count: usize) -> Result<Shape> { Ok(self.into()) } } impl ShapeWithOneHole for ((),) { fn into_shape(self, el_count: usize) -> Result<Shape> { Ok(el_count.into()) } } fn hole_size(el_count: usize, prod_d: usize, s: &dyn std::fmt::Debug) -> Result<usize> { if prod_d == 0 { crate::bail!("cannot reshape tensor of {el_count} elements to {s:?}") } if el_count % prod_d != 0 { crate::bail!("cannot reshape tensor with {el_count} elements to {s:?}") } Ok(el_count / prod_d) } impl ShapeWithOneHole for ((), usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let ((), d1) = self; Ok((hole_size(el_count, d1, &self)?, d1).into()) } } impl ShapeWithOneHole for (usize, ()) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, ()) = self; Ok((d1, hole_size(el_count, d1, &self)?).into()) } } impl ShapeWithOneHole for ((), usize, usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let ((), d1, d2) = self; Ok((hole_size(el_count, d1 * d2, &self)?, d1, d2).into()) } } impl ShapeWithOneHole for (usize, (), usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, (), d2) = self; Ok((d1, hole_size(el_count, d1 * d2, &self)?, d2).into()) } } impl ShapeWithOneHole for (usize, usize, ()) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, d2, ()) = self; Ok((d1, d2, hole_size(el_count, d1 * d2, &self)?).into()) } } impl ShapeWithOneHole for ((), usize, usize, usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let ((), d1, d2, d3) = self; let d = hole_size(el_count, d1 * d2 * d3, &self)?; Ok((d, d1, d2, d3).into()) } } impl ShapeWithOneHole for (usize, (), usize, usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, (), d2, d3) = self; let d = hole_size(el_count, d1 * d2 * d3, &self)?; Ok((d1, d, d2, d3).into()) } } impl ShapeWithOneHole for (usize, usize, (), usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, d2, (), d3) = self; let d = hole_size(el_count, d1 * d2 * d3, &self)?; Ok((d1, d2, d, d3).into()) } } impl ShapeWithOneHole for (usize, usize, usize, ()) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, d2, d3, ()) = self; let d = hole_size(el_count, d1 * d2 * d3, &self)?; Ok((d1, d2, d3, d).into()) } } impl ShapeWithOneHole for ((), usize, usize, usize, usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let ((), d1, d2, d3, d4) = self; let d = hole_size(el_count, d1 * d2 * d3 * 
d4, &self)?; Ok((d, d1, d2, d3, d4).into()) } } impl ShapeWithOneHole for (usize, (), usize, usize, usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, (), d2, d3, d4) = self; let d = hole_size(el_count, d1 * d2 * d3 * d4, &self)?; Ok((d1, d, d2, d3, d4).into()) } } impl ShapeWithOneHole for (usize, usize, (), usize, usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, d2, (), d3, d4) = self; let d = hole_size(el_count, d1 * d2 * d3 * d4, &self)?; Ok((d1, d2, d, d3, d4).into()) } } impl ShapeWithOneHole for (usize, usize, usize, (), usize) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, d2, d3, (), d4) = self; let d = hole_size(el_count, d1 * d2 * d3 * d4, &self)?; Ok((d1, d2, d3, d, d4).into()) } } impl ShapeWithOneHole for (usize, usize, usize, usize, ()) { fn into_shape(self, el_count: usize) -> Result<Shape> { let (d1, d2, d3, d4, ()) = self; let d = hole_size(el_count, d1 * d2 * d3 * d4, &self)?; Ok((d1, d2, d3, d4, d).into()) } } #[cfg(test)] mod tests { use super::*; #[test] fn stride() { let shape = Shape::from(()); assert_eq!(shape.stride_contiguous(), Vec::<usize>::new()); let shape = Shape::from(42); assert_eq!(shape.stride_contiguous(), [1]); let shape = Shape::from((42, 1337)); assert_eq!(shape.stride_contiguous(), [1337, 1]); let shape = Shape::from((299, 792, 458)); assert_eq!(shape.stride_contiguous(), [458 * 792, 458, 1]); } }
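// Quick usage sketch for the Shape API defined above (re-exported as `candle_core::Shape`):
fn shape_demo() -> candle_core::Result<()> {
    use candle_core::Shape;
    let s = Shape::from((2, 3, 4));
    assert_eq!(s.rank(), 3);
    assert_eq!(s.dims(), &[2, 3, 4]);
    assert_eq!(s.elem_count(), 24);
    assert_eq!(s.dims3()?, (2, 3, 4));

    // Broadcasting follows the usual trailing-dimension rules; mismatching non-1 dims fail.
    let lhs = Shape::from((2, 1, 4));
    let rhs = Shape::from((3, 4));
    let out = lhs.broadcast_shape_binary_op(&rhs, "add")?;
    assert_eq!(out.dims(), &[2, 3, 4]);
    Ok(())
}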
3
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/pickle.rs
//! Just enough pickle support to be able to read PyTorch checkpoints. // This hardcodes objects that are required for tensor reading, we may want to make this a bit more // composable/tensor agnostic at some point. use crate::{DType, Error as E, Layout, Result, Tensor}; use byteorder::{LittleEndian, ReadBytesExt}; use std::collections::HashMap; use std::io::BufRead; const VERBOSE: bool = false; // https://docs.juliahub.com/Pickle/LAUNc/0.1.0/opcode/ #[repr(u8)] #[derive(Debug, Eq, PartialEq, Clone)] pub enum OpCode { // https://github.com/python/cpython/blob/ed25f097160b5cbb0c9a1f9a746d2f1bbc96515a/Lib/pickletools.py#L2123 Proto = 0x80, Global = b'c', BinPut = b'q', LongBinPut = b'r', EmptyTuple = b')', Reduce = b'R', Mark = b'(', BinUnicode = b'X', BinInt = b'J', Tuple = b't', BinPersId = b'Q', BinInt1 = b'K', BinInt2 = b'M', Tuple1 = 0x85, Tuple2 = 0x86, Tuple3 = 0x87, NewTrue = 0x88, NewFalse = 0x89, None = b'N', BinGet = b'h', LongBinGet = b'j', SetItem = b's', SetItems = b'u', EmptyDict = b'}', Dict = b'd', Build = b'b', Stop = b'.', NewObj = 0x81, EmptyList = b']', BinFloat = b'G', Append = b'a', Appends = b'e', } // Avoid using FromPrimitive so as not to drag another dependency. impl TryFrom<u8> for OpCode { type Error = u8; fn try_from(value: u8) -> std::result::Result<Self, Self::Error> { match value { 0x80 => Ok(Self::Proto), b'c' => Ok(Self::Global), b'q' => Ok(Self::BinPut), b'r' => Ok(Self::LongBinPut), b')' => Ok(Self::EmptyTuple), b'R' => Ok(Self::Reduce), b'(' => Ok(Self::Mark), b'X' => Ok(Self::BinUnicode), b'J' => Ok(Self::BinInt), b't' => Ok(Self::Tuple), b'Q' => Ok(Self::BinPersId), b'K' => Ok(Self::BinInt1), b'M' => Ok(Self::BinInt2), b'N' => Ok(Self::None), 0x85 => Ok(Self::Tuple1), 0x86 => Ok(Self::Tuple2), 0x87 => Ok(Self::Tuple3), 0x88 => Ok(Self::NewTrue), 0x89 => Ok(Self::NewFalse), b'h' => Ok(Self::BinGet), b'j' => Ok(Self::LongBinGet), b's' => Ok(Self::SetItem), b'u' => Ok(Self::SetItems), b'}' => Ok(Self::EmptyDict), b'd' => Ok(Self::EmptyDict), b'b' => Ok(Self::Build), b'.' 
=> Ok(Self::Stop), 0x81 => Ok(Self::NewObj), b']' => Ok(Self::EmptyList), b'G' => Ok(Self::BinFloat), b'a' => Ok(Self::Append), b'e' => Ok(Self::Appends), value => Err(value), } } } fn read_to_newline<R: BufRead>(r: &mut R) -> Result<Vec<u8>> { let mut data: Vec<u8> = Vec::with_capacity(32); r.read_until(b'\n', &mut data)?; data.pop(); if data.last() == Some(&b'\r') { data.pop(); } Ok(data) } #[derive(Debug, Clone, PartialEq)] pub enum Object { Class { module_name: String, class_name: String, }, Int(i32), Float(f64), Unicode(String), Bool(bool), None, Tuple(Vec<Object>), List(Vec<Object>), Mark, Dict(Vec<(Object, Object)>), Reduce { callable: Box<Object>, args: Box<Object>, }, Build { callable: Box<Object>, args: Box<Object>, }, PersistentLoad(Box<Object>), } type OResult<T> = std::result::Result<T, Object>; impl Object { pub fn unicode(self) -> OResult<String> { match self { Self::Unicode(t) => Ok(t), _ => Err(self), } } pub fn reduce(self) -> OResult<(Self, Self)> { match self { Self::Reduce { callable, args } => Ok((*callable, *args)), _ => Err(self), } } pub fn none(self) -> OResult<()> { match self { Self::None => Ok(()), _ => Err(self), } } pub fn persistent_load(self) -> OResult<Self> { match self { Self::PersistentLoad(t) => Ok(*t), _ => Err(self), } } pub fn bool(self) -> OResult<bool> { match self { Self::Bool(t) => Ok(t), _ => Err(self), } } pub fn int(self) -> OResult<i32> { match self { Self::Int(t) => Ok(t), _ => Err(self), } } pub fn tuple(self) -> OResult<Vec<Self>> { match self { Self::Tuple(t) => Ok(t), _ => Err(self), } } pub fn dict(self) -> OResult<Vec<(Self, Self)>> { match self { Self::Dict(t) => Ok(t), _ => Err(self), } } pub fn class(self) -> OResult<(String, String)> { match self { Self::Class { module_name, class_name, } => Ok((module_name, class_name)), _ => Err(self), } } pub fn into_tensor_info( self, name: Self, dir_name: &std::path::Path, ) -> Result<Option<TensorInfo>> { let name = match name.unicode() { Ok(name) => name, Err(_) => return Ok(None), }; let (callable, args) = match self.reduce() { Ok(callable_args) => callable_args, _ => return Ok(None), }; let (callable, args) = match callable { Object::Class { module_name, class_name, } if module_name == "torch._tensor" && class_name == "_rebuild_from_type_v2" => { let mut args = args.tuple()?; let callable = args.remove(0); let args = args.remove(1); (callable, args) } Object::Class { module_name, class_name, } if module_name == "torch._utils" && class_name == "_rebuild_parameter" => { let mut args = args.tuple()?; args.remove(0).reduce()? 
} _ => (callable, args), }; match callable { Object::Class { module_name, class_name, } if module_name == "torch._utils" && class_name == "_rebuild_tensor_v2" => {} _ => return Ok(None), }; let (layout, dtype, file_path, storage_size) = rebuild_args(args)?; Ok(Some(TensorInfo { name, dtype, layout, path: format!("{}/{}", dir_name.to_string_lossy(), file_path), storage_size, })) } } impl TryFrom<Object> for String { type Error = Object; fn try_from(value: Object) -> std::result::Result<Self, Self::Error> { match value { Object::Unicode(s) => Ok(s), other => Err(other), } } } impl TryFrom<Object> for usize { type Error = Object; fn try_from(value: Object) -> std::result::Result<Self, Self::Error> { match value { Object::Int(s) if s >= 0 => Ok(s as usize), other => Err(other), } } } impl<T: TryFrom<Object, Error = Object>> TryFrom<Object> for Vec<T> { type Error = Object; fn try_from(value: Object) -> std::result::Result<Self, Self::Error> { match value { Object::Tuple(values) => { // This does not return the appropriate value in the error case but instead return // the object related to the first error. values .into_iter() .map(|v| T::try_from(v)) .collect::<std::result::Result<Vec<T>, Self::Error>>() } other => Err(other), } } } #[derive(Debug)] pub struct Stack { stack: Vec<Object>, memo: HashMap<u32, Object>, } impl Stack { pub fn empty() -> Self { Self { stack: Vec::with_capacity(512), memo: HashMap::new(), } } pub fn stack(&self) -> &[Object] { self.stack.as_slice() } pub fn read_loop<R: BufRead>(&mut self, r: &mut R) -> Result<()> { loop { if self.read(r)? { break; } } Ok(()) } pub fn finalize(mut self) -> Result<Object> { self.pop() } fn push(&mut self, obj: Object) { self.stack.push(obj) } fn pop(&mut self) -> Result<Object> { match self.stack.pop() { None => crate::bail!("unexpected empty stack"), Some(obj) => Ok(obj), } } // https://docs.juliahub.com/Pickle/LAUNc/0.1.0/opcode/#Pickle.OpCodes.BUILD fn build(&mut self) -> Result<()> { let args = self.pop()?; let obj = self.pop()?; let obj = match (obj, args) { (Object::Dict(mut obj), Object::Dict(mut args)) => { obj.append(&mut args); Object::Dict(obj) } (obj, args) => Object::Build { callable: Box::new(obj), args: Box::new(args), }, }; self.push(obj); Ok(()) } fn reduce(&mut self) -> Result<()> { let args = self.pop()?; let callable = self.pop()?; #[allow(clippy::single_match)] let reduced = match &callable { Object::Class { module_name, class_name, } => { if module_name == "collections" && (class_name == "OrderedDict" || class_name == "defaultdict") { // TODO: have a separate ordered dict and a separate default dict. Some(Object::Dict(vec![])) } else { None } } _ => None, }; let reduced = reduced.unwrap_or_else(|| Object::Reduce { callable: Box::new(callable), args: Box::new(args), }); self.push(reduced); Ok(()) } fn last(&mut self) -> Result<&mut Object> { match self.stack.last_mut() { None => crate::bail!("unexpected empty stack"), Some(obj) => Ok(obj), } } fn memo_get(&self, id: u32) -> Result<Object> { match self.memo.get(&id) { None => crate::bail!("missing object in memo {id}"), Some(obj) => { // Maybe we should use refcounting rather than doing potential large clones here. 
Ok(obj.clone()) } } } fn memo_put(&mut self, id: u32) -> Result<()> { let obj = self.last()?.clone(); self.memo.insert(id, obj); Ok(()) } fn persistent_load(&self, id: Object) -> Result<Object> { Ok(Object::PersistentLoad(Box::new(id))) } fn new_obj(&self, class: Object, args: Object) -> Result<Object> { Ok(Object::Reduce { callable: Box::new(class), args: Box::new(args), }) } fn pop_to_marker(&mut self) -> Result<Vec<Object>> { let mut mark_idx = None; for (idx, obj) in self.stack.iter().enumerate().rev() { if obj == &Object::Mark { mark_idx = Some(idx); break; } } match mark_idx { Some(mark_idx) => { let objs = self.stack.split_off(mark_idx + 1); self.stack.pop(); Ok(objs) } None => { crate::bail!("marker object not found") } } } pub fn read<R: BufRead>(&mut self, r: &mut R) -> Result<bool> { let op_code = match OpCode::try_from(r.read_u8()?) { Ok(op_code) => op_code, Err(op_code) => { crate::bail!("unknown op-code {op_code}") } }; // println!("op: {op_code:?}"); // println!("{:?}", self.stack); match op_code { OpCode::Proto => { let version = r.read_u8()?; if VERBOSE { println!("proto {version}"); } } OpCode::Global => { let module_name = read_to_newline(r)?; let class_name = read_to_newline(r)?; let module_name = String::from_utf8_lossy(&module_name).to_string(); let class_name = String::from_utf8_lossy(&class_name).to_string(); self.push(Object::Class { module_name, class_name, }) } OpCode::BinInt1 => { let arg = r.read_u8()?; self.push(Object::Int(arg as i32)) } OpCode::BinInt2 => { let arg = r.read_u16::<LittleEndian>()?; self.push(Object::Int(arg as i32)) } OpCode::BinInt => { let arg = r.read_i32::<LittleEndian>()?; self.push(Object::Int(arg)) } OpCode::BinFloat => { // Somehow floats are encoded using BigEndian whereas int types use LittleEndian. 
// https://github.com/python/cpython/blob/0c80da4c14d904a367968955544dd6ae58c8101c/Lib/pickletools.py#L855 // https://github.com/pytorch/pytorch/blob/372d078f361e726bb4ac0884ac334b04c58179ef/torch/_weights_only_unpickler.py#L243 let arg = r.read_f64::<byteorder::BigEndian>()?; self.push(Object::Float(arg)) } OpCode::BinUnicode => { let len = r.read_u32::<LittleEndian>()?; let mut data = vec![0u8; len as usize]; r.read_exact(&mut data)?; let data = String::from_utf8(data).map_err(E::wrap)?; self.push(Object::Unicode(data)) } OpCode::BinPersId => { let id = self.pop()?; let obj = self.persistent_load(id)?; self.push(obj) } OpCode::Tuple => { let objs = self.pop_to_marker()?; self.push(Object::Tuple(objs)) } OpCode::Tuple1 => { let obj = self.pop()?; self.push(Object::Tuple(vec![obj])) } OpCode::Tuple2 => { let obj2 = self.pop()?; let obj1 = self.pop()?; self.push(Object::Tuple(vec![obj1, obj2])) } OpCode::Tuple3 => { let obj3 = self.pop()?; let obj2 = self.pop()?; let obj1 = self.pop()?; self.push(Object::Tuple(vec![obj1, obj2, obj3])) } OpCode::NewTrue => self.push(Object::Bool(true)), OpCode::NewFalse => self.push(Object::Bool(false)), OpCode::Append => { let value = self.pop()?; let pylist = self.last()?; if let Object::List(d) = pylist { d.push(value) } else { crate::bail!("expected a list, got {pylist:?}") } } OpCode::Appends => { let objs = self.pop_to_marker()?; let pylist = self.last()?; if let Object::List(d) = pylist { d.extend(objs) } else { crate::bail!("expected a list, got {pylist:?}") } } OpCode::SetItem => { let value = self.pop()?; let key = self.pop()?; let pydict = self.last()?; if let Object::Dict(d) = pydict { d.push((key, value)) } else { crate::bail!("expected a dict, got {pydict:?}") } } OpCode::SetItems => { let mut objs = self.pop_to_marker()?; let pydict = self.last()?; if let Object::Dict(d) = pydict { if objs.len() % 2 != 0 { crate::bail!("setitems: not an even number of objects") } while let Some(value) = objs.pop() { let key = objs.pop().unwrap(); d.push((key, value)) } } else { crate::bail!("expected a dict, got {pydict:?}") } } OpCode::None => self.push(Object::None), OpCode::Stop => { return Ok(true); } OpCode::Build => self.build()?, OpCode::EmptyDict => self.push(Object::Dict(vec![])), OpCode::Dict => { let mut objs = self.pop_to_marker()?; let mut pydict = vec![]; if objs.len() % 2 != 0 { crate::bail!("setitems: not an even number of objects") } while let Some(value) = objs.pop() { let key = objs.pop().unwrap(); pydict.push((key, value)) } self.push(Object::Dict(pydict)) } OpCode::Mark => self.push(Object::Mark), OpCode::Reduce => self.reduce()?, OpCode::EmptyTuple => self.push(Object::Tuple(vec![])), OpCode::EmptyList => self.push(Object::List(vec![])), OpCode::BinGet => { let arg = r.read_u8()?; let obj = self.memo_get(arg as u32)?; self.push(obj) } OpCode::LongBinGet => { let arg = r.read_u32::<LittleEndian>()?; let obj = self.memo_get(arg)?; self.push(obj) } OpCode::BinPut => { let arg = r.read_u8()?; self.memo_put(arg as u32)? } OpCode::LongBinPut => { let arg = r.read_u32::<LittleEndian>()?; self.memo_put(arg)? 
} OpCode::NewObj => { let args = self.pop()?; let class = self.pop()?; let obj = self.new_obj(class, args)?; self.push(obj) } } Ok(false) } } impl From<Object> for E { fn from(value: Object) -> Self { E::Msg(format!("conversion error on {value:?}")) } } // https://github.com/pytorch/pytorch/blob/4eac43d046ded0f0a5a5fa8db03eb40f45bf656e/torch/_utils.py#L198 // Arguments: storage, storage_offset, size, stride, requires_grad, backward_hooks fn rebuild_args(args: Object) -> Result<(Layout, DType, String, usize)> { let mut args = args.tuple()?; let stride = Vec::<usize>::try_from(args.remove(3))?; let size = Vec::<usize>::try_from(args.remove(2))?; let offset = args.remove(1).int()? as usize; let storage = args.remove(0).persistent_load()?; let mut storage = storage.tuple()?; let storage_size = storage.remove(4).int()? as usize; let path = storage.remove(2).unicode()?; let (_module_name, class_name) = storage.remove(1).class()?; let dtype = match class_name.as_str() { "FloatStorage" => DType::F32, "DoubleStorage" => DType::F64, "HalfStorage" => DType::F16, "BFloat16Storage" => DType::BF16, "ByteStorage" => DType::U8, "LongStorage" => DType::I64, other => { crate::bail!("unsupported storage type {other}") } }; let layout = Layout::new(crate::Shape::from(size), stride, offset); Ok((layout, dtype, path, storage_size)) } #[derive(Debug, Clone)] pub struct TensorInfo { pub name: String, pub dtype: DType, pub layout: Layout, pub path: String, pub storage_size: usize, } /// Read the tensor info from a .pth file. /// /// # Arguments /// * `file` - The path to the .pth file. /// * `verbose` - Whether to print debug information. /// * `key` - Optional key to retrieve `state_dict` from the pth file. pub fn read_pth_tensor_info<P: AsRef<std::path::Path>>( file: P, verbose: bool, key: Option<&str>, ) -> Result<Vec<TensorInfo>> { let file = std::fs::File::open(file)?; let zip_reader = std::io::BufReader::new(file); let mut zip = zip::ZipArchive::new(zip_reader)?; let zip_file_names = zip .file_names() .map(|f| f.to_string()) .collect::<Vec<String>>(); let mut tensor_infos = vec![]; for file_name in zip_file_names.iter() { if !file_name.ends_with("data.pkl") { continue; } let dir_name = std::path::PathBuf::from(file_name.strip_suffix(".pkl").unwrap()); let reader = zip.by_name(file_name)?; let mut reader = std::io::BufReader::new(reader); let mut stack = Stack::empty(); stack.read_loop(&mut reader)?; let obj = stack.finalize()?; if VERBOSE || verbose { println!("{obj:#?}"); } let obj = match obj { Object::Build { callable, args } => match *callable { Object::Reduce { callable, args: _ } => match *callable { Object::Class { module_name, class_name, } if module_name == "__torch__" && class_name == "Module" => *args, _ => continue, }, _ => continue, }, obj => obj, }; // If key is provided, then we need to extract the state_dict from the object. let obj = if let Some(key) = key { if let Object::Dict(key_values) = obj { key_values .into_iter() .find(|(k, _)| *k == Object::Unicode(key.to_owned())) .map(|(_, v)| v) .ok_or_else(|| E::Msg(format!("key {key} not found")))? } else { obj } } else { obj }; // If the object is a dict, then we can extract the tensor info from it. // NOTE: We are assuming that the `obj` is state_dict by this stage. 
if let Object::Dict(key_values) = obj { for (name, value) in key_values.into_iter() { match value.into_tensor_info(name, &dir_name) { Ok(Some(tensor_info)) => tensor_infos.push(tensor_info), Ok(None) => {} Err(err) => eprintln!("skipping: {err:?}"), } } } } Ok(tensor_infos) } /// Lazy tensor loader. pub struct PthTensors { tensor_infos: HashMap<String, TensorInfo>, path: std::path::PathBuf, // We do not store a zip reader as it needs mutable access to extract data. Instead we // re-create a zip reader for each tensor. } impl PthTensors { pub fn new<P: AsRef<std::path::Path>>(path: P, key: Option<&str>) -> Result<Self> { let tensor_infos = read_pth_tensor_info(path.as_ref(), false, key)?; let tensor_infos = tensor_infos .into_iter() .map(|ti| (ti.name.to_string(), ti)) .collect(); let path = path.as_ref().to_owned(); Ok(Self { tensor_infos, path }) } pub fn tensor_infos(&self) -> &HashMap<String, TensorInfo> { &self.tensor_infos } pub fn get(&self, name: &str) -> Result<Option<Tensor>> { use std::io::Read; let tensor_info = match self.tensor_infos.get(name) { None => return Ok(None), Some(tensor_info) => tensor_info, }; // We hope that the file has not changed since first reading it. let zip_reader = std::io::BufReader::new(std::fs::File::open(&self.path)?); let mut zip = zip::ZipArchive::new(zip_reader)?; let mut reader = zip.by_name(&tensor_info.path)?; let is_fortran_contiguous = tensor_info.layout.is_fortran_contiguous(); let rank = tensor_info.layout.shape().rank(); // Reading the data is a bit tricky as it can be strided, for now only support the basic // case and when the tensor is fortran contiguous. if !tensor_info.layout.is_contiguous() && !is_fortran_contiguous { crate::bail!( "cannot retrieve non-contiguous tensors {:?}", tensor_info.layout ) } let start_offset = tensor_info.layout.start_offset(); if start_offset > 0 { std::io::copy( &mut reader.by_ref().take(start_offset as u64), &mut std::io::sink(), )?; } let tensor = Tensor::from_reader( tensor_info.layout.shape().clone(), tensor_info.dtype, &mut reader, )?; if rank > 1 && is_fortran_contiguous { // Reverse the shape, e.g. Shape(2, 3, 4) -> Shape(4, 3, 2) let shape_reversed: Vec<_> = tensor_info.layout.dims().iter().rev().cloned().collect(); let tensor = tensor.reshape(shape_reversed)?; // Permute (transpose) the dimensions, e.g. Shape(4, 3, 2) -> Shape(2, 3, 4) let dim_indeces_reversed: Vec<_> = (0..rank).rev().collect(); let tensor = tensor.permute(dim_indeces_reversed)?; Ok(Some(tensor)) } else { Ok(Some(tensor)) } } } /// Read all the tensors from a PyTorch pth file with a given key. /// /// # Arguments /// * `path` - Path to the pth file. /// * `key` - Optional key to retrieve `state_dict` from the pth file. Sometimes the pth file /// contains multiple objects and the state_dict is the one we are interested in. pub fn read_all_with_key<P: AsRef<std::path::Path>>( path: P, key: Option<&str>, ) -> Result<Vec<(String, Tensor)>> { let pth = PthTensors::new(path, key)?; let tensor_names = pth.tensor_infos.keys(); let mut tensors = Vec::with_capacity(tensor_names.len()); for name in tensor_names { if let Some(tensor) = pth.get(name)? { tensors.push((name.to_string(), tensor)) } } Ok(tensors) } /// Read all the tensors from a PyTorch pth file. /// /// # Arguments /// * `path` - Path to the pth file. pub fn read_all<P: AsRef<std::path::Path>>(path: P) -> Result<Vec<(String, Tensor)>> { read_all_with_key(path, None) }
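// Usage sketch for the checkpoint reader above: list and load the tensors stored in a
// PyTorch `.pth` file. "model.pth" is a placeholder path.
fn inspect_pth() -> candle_core::Result<()> {
    use candle_core::pickle::{read_all, PthTensors};

    // Eager: parse the pickle metadata and load every tensor as a (name, Tensor) pair.
    for (name, tensor) in read_all("model.pth")? {
        println!("{name}: {:?} {:?}", tensor.dtype(), tensor.shape());
    }

    // Lazy: only the metadata is read up front, tensors are extracted on demand.
    let pth = PthTensors::new("model.pth", None)?;
    if let Some(info) = pth.tensor_infos().values().next() {
        let _maybe_tensor = pth.get(&info.name)?; // Option<Tensor>
    }
    Ok(())
}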
4
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/utils.rs
//! Useful functions for checking features.
use std::str::FromStr;

pub fn get_num_threads() -> usize {
    // Respond to the same environment variable as rayon.
    match std::env::var("RAYON_NUM_THREADS")
        .ok()
        .and_then(|s| usize::from_str(&s).ok())
    {
        Some(x) if x > 0 => x,
        Some(_) | None => num_cpus::get(),
    }
}

pub fn has_accelerate() -> bool {
    cfg!(feature = "accelerate")
}

pub fn has_mkl() -> bool {
    cfg!(feature = "mkl")
}

pub fn cuda_is_available() -> bool {
    cfg!(feature = "cuda")
}

pub fn metal_is_available() -> bool {
    cfg!(feature = "metal")
}

pub fn with_avx() -> bool {
    cfg!(target_feature = "avx")
}

pub fn with_neon() -> bool {
    cfg!(target_feature = "neon")
}

pub fn with_simd128() -> bool {
    cfg!(target_feature = "simd128")
}

pub fn with_f16c() -> bool {
    cfg!(target_feature = "f16c")
}
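// These helpers are mainly useful for diagnostics, e.g. printing the build configuration
// at startup (sketch):
fn print_build_info() {
    use candle_core::utils;
    println!("threads         : {}", utils::get_num_threads());
    println!("cuda / metal    : {} / {}", utils::cuda_is_available(), utils::metal_is_available());
    println!("mkl / accelerate: {} / {}", utils::has_mkl(), utils::has_accelerate());
    println!("avx / neon      : {} / {}", utils::with_avx(), utils::with_neon());
}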
5
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/mkl.rs
#![allow(dead_code)] use libc::{c_char, c_double, c_float, c_int}; mod ffi { use super::*; extern "C" { pub fn vsTanh(n: c_int, a: *const c_float, y: *mut c_float); pub fn vdTanh(n: c_int, a: *const c_double, y: *mut c_double); pub fn vsExp(n: c_int, a: *const c_float, y: *mut c_float); pub fn vdExp(n: c_int, a: *const c_double, y: *mut c_double); pub fn vsLn(n: c_int, a: *const c_float, y: *mut c_float); pub fn vdLn(n: c_int, a: *const c_double, y: *mut c_double); pub fn vsSin(n: c_int, a: *const c_float, y: *mut c_float); pub fn vdSin(n: c_int, a: *const c_double, y: *mut c_double); pub fn vsCos(n: c_int, a: *const c_float, y: *mut c_float); pub fn vdCos(n: c_int, a: *const c_double, y: *mut c_double); pub fn vsSqrt(n: c_int, a: *const c_float, y: *mut c_float); pub fn vdSqrt(n: c_int, a: *const c_double, y: *mut c_double); pub fn vsAdd(n: c_int, a: *const c_float, b: *const c_float, y: *mut c_float); pub fn vdAdd(n: c_int, a: *const c_double, b: *const c_double, y: *mut c_double); pub fn vsSub(n: c_int, a: *const c_float, b: *const c_float, y: *mut c_float); pub fn vdSub(n: c_int, a: *const c_double, b: *const c_double, y: *mut c_double); pub fn vsMul(n: c_int, a: *const c_float, b: *const c_float, y: *mut c_float); pub fn vdMul(n: c_int, a: *const c_double, b: *const c_double, y: *mut c_double); pub fn vsDiv(n: c_int, a: *const c_float, b: *const c_float, y: *mut c_float); pub fn vdDiv(n: c_int, a: *const c_double, b: *const c_double, y: *mut c_double); pub fn vsFmax(n: c_int, a: *const c_float, b: *const c_float, y: *mut c_float); pub fn vdFmax(n: c_int, a: *const c_double, b: *const c_double, y: *mut c_double); pub fn vsFmin(n: c_int, a: *const c_float, b: *const c_float, y: *mut c_float); pub fn vdFmin(n: c_int, a: *const c_double, b: *const c_double, y: *mut c_double); pub fn sgemm_( transa: *const c_char, transb: *const c_char, m: *const c_int, n: *const c_int, k: *const c_int, alpha: *const c_float, a: *const c_float, lda: *const c_int, b: *const c_float, ldb: *const c_int, beta: *const c_float, c: *mut c_float, ldc: *const c_int, ); pub fn dgemm_( transa: *const c_char, transb: *const c_char, m: *const c_int, n: *const c_int, k: *const c_int, alpha: *const c_double, a: *const c_double, lda: *const c_int, b: *const c_double, ldb: *const c_int, beta: *const c_double, c: *mut c_double, ldc: *const c_int, ); pub fn hgemm_( transa: *const c_char, transb: *const c_char, m: *const c_int, n: *const c_int, k: *const c_int, alpha: *const half::f16, a: *const half::f16, lda: *const c_int, b: *const half::f16, ldb: *const c_int, beta: *const half::f16, c: *mut half::f16, ldc: *const c_int, ); } } #[allow(clippy::too_many_arguments)] #[inline] pub unsafe fn sgemm( transa: u8, transb: u8, m: i32, n: i32, k: i32, alpha: f32, a: &[f32], lda: i32, b: &[f32], ldb: i32, beta: f32, c: &mut [f32], ldc: i32, ) { ffi::sgemm_( &(transa as c_char), &(transb as c_char), &m, &n, &k, &alpha, a.as_ptr(), &lda, b.as_ptr(), &ldb, &beta, c.as_mut_ptr(), &ldc, ) } #[allow(clippy::too_many_arguments)] #[inline] pub unsafe fn dgemm( transa: u8, transb: u8, m: i32, n: i32, k: i32, alpha: f64, a: &[f64], lda: i32, b: &[f64], ldb: i32, beta: f64, c: &mut [f64], ldc: i32, ) { ffi::dgemm_( &(transa as c_char), &(transb as c_char), &m, &n, &k, &alpha, a.as_ptr(), &lda, b.as_ptr(), &ldb, &beta, c.as_mut_ptr(), &ldc, ) } #[allow(clippy::too_many_arguments)] #[inline] pub unsafe fn hgemm( transa: u8, transb: u8, m: i32, n: i32, k: i32, alpha: half::f16, a: &[half::f16], lda: i32, b: &[half::f16], ldb: i32, beta: 
half::f16, c: &mut [half::f16], ldc: i32, ) { ffi::hgemm_( &(transa as c_char), &(transb as c_char), &m, &n, &k, &alpha, a.as_ptr(), &lda, b.as_ptr(), &ldb, &beta, c.as_mut_ptr(), &ldc, ) } #[inline] pub fn vs_exp(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsExp(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_exp(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vdExp(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_ln(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsLn(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_ln(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vdLn(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_sin(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsSin(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_sin(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vdSin(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_cos(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsCos(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_cos(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vdCos(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_sqrt(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsSqrt(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_sqrt(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vdSqrt(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_sqr(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsMul(a_len as i32, a.as_ptr(), a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_sqr(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vdMul(a_len as i32, a.as_ptr(), a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_tanh(a: &[f32], y: &mut [f32]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { ffi::vsTanh(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_tanh(a: &[f64], y: &mut [f64]) { let a_len = a.len(); let y_len = y.len(); if a_len != y_len { panic!("a and y have different lengths {a_len} <> {y_len}") } unsafe { 
ffi::vdTanh(a_len as i32, a.as_ptr(), y.as_mut_ptr()) } } // The vector functions from mkl can be performed in place by using the same array for input and // output. // https://www.intel.com/content/www/us/en/docs/onemkl/developer-reference-c/2023-2/vector-mathematical-functions.html #[inline] pub fn vs_tanh_inplace(y: &mut [f32]) { unsafe { ffi::vsTanh(y.len() as i32, y.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_tanh_inplace(y: &mut [f64]) { unsafe { ffi::vdTanh(y.len() as i32, y.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_exp_inplace(y: &mut [f32]) { unsafe { ffi::vsExp(y.len() as i32, y.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vd_exp_inplace(y: &mut [f64]) { unsafe { ffi::vdExp(y.len() as i32, y.as_ptr(), y.as_mut_ptr()) } } #[inline] pub fn vs_gelu(vs: &[f32], ys: &mut [f32]) { for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = (2.0f32 / std::f32::consts::PI).sqrt() * v * (1.0 + 0.044715 * v * v) } vs_tanh_inplace(ys); for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = 0.5 * v * (1.0 + *y) } } #[inline] pub fn vd_gelu(vs: &[f64], ys: &mut [f64]) { for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = (2.0f64 / std::f64::consts::PI).sqrt() * v * (1.0 + 0.044715 * v * v) } vd_tanh_inplace(ys); for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = 0.5 * v * (1.0 + *y) } } #[inline] pub fn vs_silu(vs: &[f32], ys: &mut [f32]) { for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = -v } vs_exp_inplace(ys); for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = v / (1.0 + *y) } } #[inline] pub fn vd_silu(vs: &[f64], ys: &mut [f64]) { for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = -v } vd_exp_inplace(ys); for (&v, y) in vs.iter().zip(ys.iter_mut()) { *y = v / (1.0 + *y) } } macro_rules! binary_op { ($fn_name:ident, $ty:ty, $mkl_name:ident) => { #[inline] pub fn $fn_name(a: &[$ty], b: &[$ty], y: &mut [$ty]) { let a_len = a.len(); let b_len = b.len(); let y_len = y.len(); if a_len != y_len || b_len != y_len { panic!( "{} a,b,y len mismatch {a_len} {b_len} {y_len}", stringify!($fn_name) ); } unsafe { ffi::$mkl_name(a_len as i32, a.as_ptr(), b.as_ptr(), y.as_mut_ptr()) } } }; } binary_op!(vs_add, f32, vsAdd); binary_op!(vd_add, f64, vdAdd); binary_op!(vs_sub, f32, vsSub); binary_op!(vd_sub, f64, vdSub); binary_op!(vs_mul, f32, vsMul); binary_op!(vd_mul, f64, vdMul); binary_op!(vs_div, f32, vsDiv); binary_op!(vd_div, f64, vdDiv); binary_op!(vs_max, f32, vsFmax); binary_op!(vd_max, f64, vdFmax); binary_op!(vs_min, f32, vsFmin); binary_op!(vd_min, f64, vdFmin);
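// Note: this module is compiled only with the `mkl` feature and is private to candle-core,
// so it is called from the crate's CPU kernels rather than by end users. A crate-internal
// call looks like this (sketch; the slices must have matching lengths or the wrappers panic):
#[cfg(feature = "mkl")]
fn mkl_example() {
    let a = vec![0.0f32, 0.5, 1.0, 2.0];
    let b = vec![1.0f32; 4];
    let mut y = vec![0.0f32; 4];
    crate::mkl::vs_add(&a, &b, &mut y); // element-wise add via MKL's vsAdd
    crate::mkl::vs_tanh_inplace(&mut y); // in-place tanh, input and output may alias
}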
6
0
hf_public_repos/candle/candle-core
hf_public_repos/candle/candle-core/src/safetensors.rs
//! Module to load `safetensor` files into CPU/GPU memory. //! //! There are multiple ways to load tensors from safetensor files: //! - `load` function for loading directly into memory and returning a HashMap of tensors //! - `MmapedSafetensors` for memory mapping files and avoiding full allocation //! - `SliceSafetensors` for working with in-memory buffers //! - `BufferedSafetensors` for owning a buffer of data //! //! Tensors can also be serialized to safetensor format using the `save` function or //! `Tensor::save_safetensors` method. //! use crate::{DType, Device, Error, Result, Tensor, WithDType}; use safetensors::tensor as st; use safetensors::tensor::SafeTensors; use std::borrow::Cow; use std::collections::HashMap; use std::path::Path; impl From<DType> for st::Dtype { fn from(value: DType) -> Self { match value { DType::U8 => st::Dtype::U8, DType::U32 => st::Dtype::U32, DType::I64 => st::Dtype::I64, DType::BF16 => st::Dtype::BF16, DType::F16 => st::Dtype::F16, DType::F32 => st::Dtype::F32, DType::F64 => st::Dtype::F64, } } } impl TryFrom<st::Dtype> for DType { type Error = Error; fn try_from(value: st::Dtype) -> Result<Self> { match value { st::Dtype::U8 => Ok(DType::U8), st::Dtype::U32 => Ok(DType::U32), st::Dtype::I64 => Ok(DType::I64), st::Dtype::BF16 => Ok(DType::BF16), st::Dtype::F16 => Ok(DType::F16), st::Dtype::F32 => Ok(DType::F32), st::Dtype::F64 => Ok(DType::F64), dtype => Err(Error::UnsupportedSafeTensorDtype(dtype)), } } } impl st::View for Tensor { fn dtype(&self) -> st::Dtype { self.dtype().into() } fn shape(&self) -> &[usize] { self.shape().dims() } fn data(&self) -> Cow<[u8]> { // This copies data from GPU to CPU. // TODO: Avoid the unwrap here. Cow::Owned(convert_back(self).unwrap()) } fn data_len(&self) -> usize { let n: usize = self.shape().elem_count(); let bytes_per_element = self.dtype().size_in_bytes(); n * bytes_per_element } } impl st::View for &Tensor { fn dtype(&self) -> st::Dtype { (*self).dtype().into() } fn shape(&self) -> &[usize] { self.dims() } fn data(&self) -> Cow<[u8]> { // This copies data from GPU to CPU. // TODO: Avoid the unwrap here. Cow::Owned(convert_back(self).unwrap()) } fn data_len(&self) -> usize { let n: usize = self.dims().iter().product(); let bytes_per_element = (*self).dtype().size_in_bytes(); n * bytes_per_element } } impl Tensor { pub fn save_safetensors<P: AsRef<Path>>(&self, name: &str, filename: P) -> Result<()> { let data = [(name, self.clone())]; Ok(st::serialize_to_file(data, &None, filename.as_ref())?) } } fn convert_slice<T: WithDType>(data: &[u8], shape: &[usize], device: &Device) -> Result<Tensor> { let size_in_bytes = T::DTYPE.size_in_bytes(); let elem_count = data.len() / size_in_bytes; if (data.as_ptr() as usize) % size_in_bytes == 0 { // SAFETY This is safe because we just checked that this // was correctly aligned. let data: &[T] = unsafe { std::slice::from_raw_parts(data.as_ptr() as *const T, elem_count) }; Tensor::from_slice(data, shape, device) } else { // XXX: We need to specify `T` here, otherwise the compiler will infer u8 because of the following cast // Making this vector too small to fit a full f16/f32/f64 weights, resulting in out-of-bounds access let mut c: Vec<T> = Vec::with_capacity(elem_count); // SAFETY: We just created c, so the allocated memory is necessarily // contiguous and non overlapping with the view's data. // We're downgrading the `c` pointer from T to u8, which removes alignment // constraints. 
unsafe { std::ptr::copy_nonoverlapping(data.as_ptr(), c.as_mut_ptr() as *mut u8, data.len()); c.set_len(elem_count) } Tensor::from_slice(&c, shape, device) } } fn convert_slice_with_cast<T: Sized + Copy, U: WithDType, F: Fn(T) -> Result<U>>( data: &[u8], shape: &[usize], device: &Device, conv: F, ) -> Result<Tensor> { let size_in_bytes = std::mem::size_of::<T>(); let elem_count = data.len() / size_in_bytes; if (data.as_ptr() as usize) % size_in_bytes == 0 { // SAFETY This is safe because we just checked that this // was correctly aligned. let data: &[T] = unsafe { std::slice::from_raw_parts(data.as_ptr() as *const T, elem_count) }; let data = data.iter().map(|t| conv(*t)).collect::<Result<Vec<_>>>()?; Tensor::from_vec(data, shape, device) } else { // XXX: We need to specify `T` here, otherwise the compiler will infer u8 because of the following cast // Making this vector too small to fit a full f16/f32/f64 weights, resulting in out-of-bounds access let mut c: Vec<T> = Vec::with_capacity(elem_count); // SAFETY: We just created c, so the allocated memory is necessarily // contiguous and non overlapping with the view's data. // We're downgrading the `c` pointer from T to u8, which removes alignment // constraints. unsafe { std::ptr::copy_nonoverlapping(data.as_ptr(), c.as_mut_ptr() as *mut u8, data.len()); c.set_len(elem_count) } let c = c.into_iter().map(conv).collect::<Result<Vec<_>>>()?; Tensor::from_vec(c, shape, device) } } fn convert_with_cast_<T: Sized + Copy, U: WithDType, F: Fn(T) -> Result<U>>( view: &st::TensorView<'_>, device: &Device, conv: F, ) -> Result<Tensor> { convert_slice_with_cast::<T, U, F>(view.data(), view.shape(), device, conv) } fn convert_<T: WithDType>(view: &st::TensorView<'_>, device: &Device) -> Result<Tensor> { convert_slice::<T>(view.data(), view.shape(), device) } fn convert_back_<T: WithDType>(mut vs: Vec<T>) -> Vec<u8> { let size_in_bytes = T::DTYPE.size_in_bytes(); let length = vs.len() * size_in_bytes; let capacity = vs.capacity() * size_in_bytes; let ptr = vs.as_mut_ptr() as *mut u8; // Don't run the destructor for Vec<T> std::mem::forget(vs); // SAFETY: // // Every T is larger than u8, so there is no issue regarding alignment. // This re-interpret the Vec<T> as a Vec<u8>. 
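// `length` and `capacity` are both scaled by `size_in_bytes`, so the rebuilt
// Vec<u8> covers exactly the allocation that the forgotten Vec<T> owned.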
unsafe { Vec::from_raw_parts(ptr, length, capacity) } } pub trait Load { fn load(&self, device: &Device) -> Result<Tensor>; } impl Load for st::TensorView<'_> { fn load(&self, device: &Device) -> Result<Tensor> { convert(self, device) } } impl Tensor { pub fn from_raw_buffer( data: &[u8], dtype: DType, shape: &[usize], device: &Device, ) -> Result<Self> { match dtype { DType::U8 => convert_slice::<u8>(data, shape, device), DType::U32 => convert_slice::<u32>(data, shape, device), DType::I64 => convert_slice::<i64>(data, shape, device), DType::BF16 => convert_slice::<half::bf16>(data, shape, device), DType::F16 => convert_slice::<half::f16>(data, shape, device), DType::F32 => convert_slice::<f32>(data, shape, device), DType::F64 => convert_slice::<f64>(data, shape, device), } } } fn convert(view: &st::TensorView<'_>, device: &Device) -> Result<Tensor> { match view.dtype() { st::Dtype::U8 => convert_::<u8>(view, device), st::Dtype::U16 => { let conv = |x| Ok(u32::from(x)); convert_with_cast_::<u16, u32, _>(view, device, conv) } st::Dtype::U32 => convert_::<u32>(view, device), st::Dtype::I32 => { let conv = |x| Ok(i64::from(x)); convert_with_cast_::<i32, i64, _>(view, device, conv) } st::Dtype::I64 => convert_::<i64>(view, device), st::Dtype::BF16 => convert_::<half::bf16>(view, device), st::Dtype::F16 => convert_::<half::f16>(view, device), st::Dtype::F32 => convert_::<f32>(view, device), st::Dtype::F64 => convert_::<f64>(view, device), dtype => Err(Error::UnsupportedSafeTensorDtype(dtype)), } } fn convert_back(tensor: &Tensor) -> Result<Vec<u8>> { // TODO: This makes an unnecessary copy when the tensor is on the cpu. let tensor = tensor.flatten_all()?; match tensor.dtype() { DType::U8 => Ok(convert_back_::<u8>(tensor.to_vec1()?)), DType::U32 => Ok(convert_back_::<u32>(tensor.to_vec1()?)), DType::I64 => Ok(convert_back_::<i64>(tensor.to_vec1()?)), DType::F16 => Ok(convert_back_::<half::f16>(tensor.to_vec1()?)), DType::BF16 => Ok(convert_back_::<half::bf16>(tensor.to_vec1()?)), DType::F32 => Ok(convert_back_::<f32>(tensor.to_vec1()?)), DType::F64 => Ok(convert_back_::<f64>(tensor.to_vec1()?)), } } pub fn load<P: AsRef<Path>>(filename: P, device: &Device) -> Result<HashMap<String, Tensor>> { let data = std::fs::read(filename.as_ref())?; load_buffer(&data[..], device) } pub fn load_buffer(data: &[u8], device: &Device) -> Result<HashMap<String, Tensor>> { let st = safetensors::SafeTensors::deserialize(data)?; st.tensors() .into_iter() .map(|(name, view)| Ok((name, view.load(device)?))) .collect() } pub fn save<K: AsRef<str> + Ord + std::fmt::Display, P: AsRef<Path>>( tensors: &HashMap<K, Tensor>, filename: P, ) -> Result<()> { Ok(st::serialize_to_file(tensors, &None, filename.as_ref())?) } #[derive(yoke::Yokeable)] struct SafeTensors_<'a>(SafeTensors<'a>); pub struct MmapedSafetensors { safetensors: Vec<yoke::Yoke<SafeTensors_<'static>, memmap2::Mmap>>, routing: Option<HashMap<String, usize>>, } impl MmapedSafetensors { /// Creates a wrapper around a memory mapped file and deserialize the safetensors header. /// /// # Safety /// /// The unsafe is inherited from [`memmap2::MmapOptions`]. 
pub unsafe fn new<P: AsRef<Path>>(p: P) -> Result<Self> { let p = p.as_ref(); let file = std::fs::File::open(p).map_err(|e| Error::from(e).with_path(p))?; let file = memmap2::MmapOptions::new() .map(&file) .map_err(|e| Error::from(e).with_path(p))?; let safetensors = yoke::Yoke::<SafeTensors_<'static>, memmap2::Mmap>::try_attach_to_cart( file, |data: &[u8]| { let st = safetensors::SafeTensors::deserialize(data) .map_err(|e| Error::from(e).with_path(p))?; Ok::<_, Error>(SafeTensors_(st)) }, )?; Ok(Self { safetensors: vec![safetensors], routing: None, }) } /// Creates a wrapper around multiple memory mapped file and deserialize the safetensors headers. /// /// If a tensor name appears in multiple files, the last entry is returned. /// /// # Safety /// /// The unsafe is inherited from [`memmap2::MmapOptions`]. pub unsafe fn multi<P: AsRef<Path>>(paths: &[P]) -> Result<Self> { let mut routing = HashMap::new(); let mut safetensors = vec![]; for (index, p) in paths.iter().enumerate() { let p = p.as_ref(); let file = std::fs::File::open(p).map_err(|e| Error::from(e).with_path(p))?; let file = memmap2::MmapOptions::new() .map(&file) .map_err(|e| Error::from(e).with_path(p))?; let data = yoke::Yoke::<SafeTensors_<'static>, memmap2::Mmap>::try_attach_to_cart( file, |data: &[u8]| { let st = safetensors::SafeTensors::deserialize(data) .map_err(|e| Error::from(e).with_path(p))?; Ok::<_, Error>(SafeTensors_(st)) }, )?; for k in data.get().0.names() { routing.insert(k.to_string(), index); } safetensors.push(data) } Ok(Self { safetensors, routing: Some(routing), }) } pub fn load(&self, name: &str, dev: &Device) -> Result<Tensor> { self.get(name)?.load(dev) } pub fn tensors(&self) -> Vec<(String, st::TensorView<'_>)> { let mut tensors = vec![]; for safetensors in self.safetensors.iter() { tensors.push(safetensors.get().0.tensors()) } tensors.into_iter().flatten().collect() } pub fn get(&self, name: &str) -> Result<st::TensorView<'_>> { let index = match &self.routing { None => 0, Some(routing) => { let index = routing.get(name).ok_or_else(|| { Error::CannotFindTensor { path: name.to_string(), } .bt() })?; *index } }; Ok(self.safetensors[index].get().0.tensor(name)?) } } pub struct SliceSafetensors<'a> { safetensors: SafeTensors<'a>, } impl<'a> SliceSafetensors<'a> { /// Creates a wrapper around a binary buffer and deserialize the safetensors header. pub fn new(buffer: &'a [u8]) -> Result<Self> { let safetensors = safetensors::SafeTensors::deserialize(buffer)?; Ok(Self { safetensors }) } pub fn load(&self, name: &str, dev: &Device) -> Result<Tensor> { self.safetensors.tensor(name)?.load(dev) } pub fn tensors(&self) -> Vec<(String, st::TensorView<'_>)> { self.safetensors.tensors() } pub fn get(&self, name: &str) -> Result<st::TensorView<'_>> { Ok(self.safetensors.tensor(name)?) } } pub struct BufferedSafetensors { safetensors: yoke::Yoke<SafeTensors_<'static>, Vec<u8>>, } impl BufferedSafetensors { /// Creates a wrapper around a binary buffer and deserialize the safetensors header. 
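/// Unlike [`SliceSafetensors`], this type takes ownership of the buffer, so the
/// deserialized header does not borrow from caller-owned data.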
pub fn new(buffer: Vec<u8>) -> Result<Self> { let safetensors = yoke::Yoke::<SafeTensors_<'static>, Vec<u8>>::try_attach_to_cart( buffer, |data: &[u8]| { let st = safetensors::SafeTensors::deserialize(data)?; Ok::<_, Error>(SafeTensors_(st)) }, )?; Ok(Self { safetensors }) } pub fn load(&self, name: &str, dev: &Device) -> Result<Tensor> { self.get(name)?.load(dev) } pub fn tensors(&self) -> Vec<(String, st::TensorView<'_>)> { self.safetensors.get().0.tensors() } pub fn get(&self, name: &str) -> Result<st::TensorView<'_>> { Ok(self.safetensors.get().0.tensor(name)?) } } pub struct MmapedFile { path: std::path::PathBuf, inner: memmap2::Mmap, } impl MmapedFile { /// Creates a wrapper around a memory mapped file from which you can retrieve /// tensors using [`MmapedFile::deserialize`] /// /// # Safety /// /// The unsafe is inherited from [`memmap2::MmapOptions`]. pub unsafe fn new<P: AsRef<Path>>(p: P) -> Result<Self> { let p = p.as_ref(); let file = std::fs::File::open(p).map_err(|e| Error::from(e).with_path(p))?; let inner = memmap2::MmapOptions::new() .map(&file) .map_err(|e| Error::from(e).with_path(p))?; Ok(Self { inner, path: p.to_path_buf(), }) } pub fn deserialize(&self) -> Result<SafeTensors<'_>> { let st = safetensors::SafeTensors::deserialize(&self.inner) .map_err(|e| Error::from(e).with_path(&self.path))?; Ok(st) } } #[cfg(test)] mod tests { use super::*; use std::collections::HashMap; #[test] fn save_single_tensor() { let t = Tensor::zeros((2, 2), DType::F32, &Device::Cpu).unwrap(); t.save_safetensors("t", "t.safetensors").unwrap(); let bytes = std::fs::read("t.safetensors").unwrap(); assert_eq!(bytes, b"@\0\0\0\0\0\0\0{\"t\":{\"dtype\":\"F32\",\"shape\":[2,2],\"data_offsets\":[0,16]}} \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"); std::fs::remove_file("t.safetensors").unwrap(); } #[test] fn save_load_multiple_tensors() { let t = Tensor::zeros((2, 2), DType::F32, &Device::Cpu).unwrap(); let u = Tensor::zeros((1, 2), DType::F32, &Device::Cpu).unwrap(); let map: HashMap<_, _> = [("t", t), ("u", u)].into_iter().collect(); save(&map, "multi.safetensors").unwrap(); let weights = load("multi.safetensors", &Device::Cpu).unwrap(); assert_eq!(weights.get("t").unwrap().dims(), &[2, 2]); assert_eq!(weights.get("u").unwrap().dims(), &[1, 2]); let bytes = std::fs::read("multi.safetensors").unwrap(); assert_eq!(bytes, b"x\0\0\0\0\0\0\0{\"t\":{\"dtype\":\"F32\",\"shape\":[2,2],\"data_offsets\":[0,16]},\"u\":{\"dtype\":\"F32\",\"shape\":[1,2],\"data_offsets\":[16,24]}} \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"); std::fs::remove_file("multi.safetensors").unwrap(); } }
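// --- Editor's sketch (not part of the original file) ---
// Shows how the public entry points above fit together: serialize a couple of
// tensors with `save`, read them back eagerly with `load`, or lazily through the
// memory-mapped `MmapedSafetensors` wrapper. The function name and the file name
// "example.safetensors" are placeholders introduced for illustration only.
#[allow(dead_code)]
fn safetensors_roundtrip_example() -> Result<()> {
    let t = Tensor::zeros((2, 2), DType::F32, &Device::Cpu)?;
    let u = Tensor::zeros((1, 2), DType::F32, &Device::Cpu)?;
    let map: HashMap<_, _> = [("t", t), ("u", u)].into_iter().collect();
    save(&map, "example.safetensors")?;

    // Eager load: every tensor is copied into host memory right away.
    let eager = load("example.safetensors", &Device::Cpu)?;
    assert_eq!(eager["t"].dims(), &[2, 2]);

    // Lazy load: the file is memory mapped; `new` is unsafe because the mapping
    // is only valid as long as the file is not modified underneath us.
    let mmaped = unsafe { MmapedSafetensors::new("example.safetensors")? };
    assert_eq!(mmaped.load("u", &Device::Cpu)?.dims(), &[1, 2]);
    std::fs::remove_file("example.safetensors")?;
    Ok(())
}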
7
0
hf_public_repos/candle/candle-core/src
hf_public_repos/candle/candle-core/src/metal_backend/device.rs
use crate::{DType, Result}; use candle_metal_kernels::Kernels; use metal::{Buffer, CommandBuffer, CommandQueue, MTLResourceOptions, NSUInteger}; use std::collections::HashMap; use std::ffi::c_void; use std::path::Path; use std::sync::{Arc, Mutex, RwLock}; use super::MetalError; /// Unique identifier for cuda devices. #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] pub struct DeviceId(usize); impl DeviceId { pub(crate) fn new() -> Self { // https://users.rust-lang.org/t/idiomatic-rust-way-to-generate-unique-id/33805 use std::sync::atomic; static COUNTER: atomic::AtomicUsize = atomic::AtomicUsize::new(1); Self(COUNTER.fetch_add(1, atomic::Ordering::Relaxed)) } } type BufferMap = HashMap<(NSUInteger, MTLResourceOptions), Vec<Arc<Buffer>>>; pub(crate) struct Commands { /// Single command queue for the entire device. command_queue: CommandQueue, /// One command buffer at a time. /// The scheduler works by allowing multiple /// [ComputeCommandEncoder](https://developer.apple.com/documentation/metal/mtlcomputecommandencoder?language=objc) /// on a single command buffer. Using a single command buffer would be fastest on the GPU but /// prevents overlapping of CPU and GPU commands (because command buffer needs to be committed /// to start to work). /// Despite what the documentation says, command buffers are NOT ordered. They are ordered /// for their START time, but there's no guarantee that command buffer1 will finish before /// command buffer2 starts (or there are metal bugs there) command_buffer: CommandBuffer, /// Keeps track of the current amount of compute command encoders on the current /// command buffer /// Arc, RwLock because of the interior mutability. command_buffer_index: usize, /// The maximum amount of [compute command encoder](https://developer.apple.com/documentation/metal/mtlcomputecommandencoder?language=objc) per [command buffer](https://developer.apple.com/documentation/metal/mtlcommandbuffer?language=objc) compute_per_buffer: usize, } impl Commands { pub(crate) fn new(command_queue: CommandQueue) -> Result<Self> { let command_buffer = command_queue.new_command_buffer().to_owned(); command_buffer.enqueue(); let compute_per_buffer = match std::env::var("CANDLE_METAL_COMPUTE_PER_BUFFER") { Ok(val) => val.parse()?, _ => 50, }; Ok(Self { command_queue, command_buffer, command_buffer_index: 0, compute_per_buffer, }) } pub fn command_buffer(&mut self) -> Result<(bool, CommandBuffer)> { let mut command_buffer = self.command_buffer.to_owned(); let mut flushed = false; if self.command_buffer_index > self.compute_per_buffer { self.command_buffer.commit(); command_buffer = self.command_queue.new_command_buffer().to_owned(); self.command_buffer = command_buffer.clone(); self.command_buffer_index = 0; flushed = true; } self.command_buffer_index += 1; Ok((flushed, command_buffer)) } pub fn wait_until_completed(&mut self) -> Result<()> { match self.command_buffer.status() { metal::MTLCommandBufferStatus::Committed | metal::MTLCommandBufferStatus::Scheduled | metal::MTLCommandBufferStatus::Completed => { panic!("Already committed"); } _ => {} } self.command_buffer.commit(); self.command_buffer.wait_until_completed(); self.command_buffer = self.command_queue.new_command_buffer().to_owned(); Ok(()) } } #[derive(Clone)] pub struct MetalDevice { /// Unique identifier, the registryID is not sufficient as it identifies the GPU rather than /// the device itself. 
pub(crate) id: DeviceId, /// Raw metal device: <https://developer.apple.com/documentation/metal/mtldevice?language=objc> pub(crate) device: metal::Device, pub(crate) commands: Arc<RwLock<Commands>>, /// Simple allocator struct. /// The buffers are stored in size buckets since ML tends to use similar shapes over and over. /// We store the buffers in [`Arc`] because it's much faster than Obj-c internal ref counting /// (could be linked to FFI communication overhead). /// /// Whenever a buffer has a strong_count==1, we can reuse it, it means it was dropped in the /// graph calculation, and only we the allocator kept a reference to it, therefore it's free /// to be reused. However, in order for this to work, we need to guarantee the order of /// operation, so that this buffer is not being used by another kernel at the same time. /// Arc is the CPU reference count, it doesn't mean anything on the GPU side of things. /// /// Whenever we actually allocate a new buffer, we make a full sweep to clean up unused buffers /// (strong_count = 1). pub(crate) buffers: Arc<RwLock<BufferMap>>, /// Simple keeper struct to keep track of the already compiled kernels so we can reuse them. /// Heavily used by [`candle_metal_kernels`] pub(crate) kernels: Arc<Kernels>, /// Seed for random number generation. pub(crate) seed: Arc<Mutex<Buffer>>, /// Whether to use the MLX matmul kernels instead of the MFA ones. pub(crate) use_mlx_mm: bool, } impl std::fmt::Debug for MetalDevice { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { write!(f, "MetalDevice({:?})", self.id) } } impl std::ops::Deref for MetalDevice { type Target = metal::DeviceRef; fn deref(&self) -> &Self::Target { &self.device } } impl MetalDevice { pub fn set_use_mlx_mm(&mut self, use_mlx_mm: bool) { self.use_mlx_mm = use_mlx_mm } pub fn compile( &self, func_name: &'static str, kernel: ug::lang::ssa::Kernel, ) -> Result<metal::ComputePipelineState> { let mut buf = vec![]; ug_metal::code_gen::gen(&mut buf, func_name, &kernel)?; let metal_code = String::from_utf8(buf)?; let lib = self .device .new_library_with_source(&metal_code, &metal::CompileOptions::new()) .map_err(MetalError::from)?; let func = lib .get_function(func_name, None) .map_err(MetalError::from)?; let pl = self .device .new_compute_pipeline_state_with_function(&func) .map_err(MetalError::from)?; Ok(pl) } pub fn id(&self) -> DeviceId { self.id } pub fn metal_device(&self) -> &metal::Device { &self.device } fn drop_unused_buffers(&self) -> Result<()> { let mut buffers = self.buffers.write().map_err(MetalError::from)?; for subbuffers in buffers.values_mut() { let newbuffers = subbuffers .iter() .filter(|s| Arc::strong_count(*s) > 1) .map(Arc::clone) .collect(); *subbuffers = newbuffers; } Ok(()) } pub fn command_buffer(&self) -> Result<CommandBuffer> { let mut commands = self.commands.write().map_err(MetalError::from)?; let (flushed, command_buffer) = commands.command_buffer()?; if flushed { self.drop_unused_buffers()? } Ok(command_buffer) } pub fn wait_until_completed(&self) -> Result<()> { let mut commands = self.commands.write().map_err(MetalError::from)?; commands.wait_until_completed() } pub fn kernels(&self) -> &Kernels { &self.kernels } pub fn device(&self) -> &metal::Device { &self.device } /// Creates a new buffer (not necessarily zeroed). /// The buffer is [MTLPrivate](https://developer.apple.com/documentation/metal/mtlstoragemode) /// This means the buffer data cannot be read on the CPU directly. 
/// /// [`name`] is only used to keep track of the resource origin in case of bugs pub fn new_buffer( &self, element_count: usize, dtype: DType, name: &str, ) -> Result<Arc<Buffer>> { let size = (element_count * dtype.size_in_bytes()) as NSUInteger; self.allocate_buffer(size, MTLResourceOptions::StorageModePrivate, name) } /// Creates a new buffer (not necessarily zeroed). /// The buffer is [MTLManaged](https://developer.apple.com/documentation/metal/mtlstoragemode) /// This means the buffer can be read on the CPU but will require manual /// synchronization when the CPU memory is modified /// Used as a bridge to gather data back from the GPU pub fn new_buffer_managed(&self, size: NSUInteger) -> Result<Arc<Buffer>> { self.allocate_buffer(size, MTLResourceOptions::StorageModeManaged, "managed") } /// Creates a new buffer from data. /// The buffer is [MTLManaged](https://developer.apple.com/documentation/metal/mtlstoragemode) /// /// Does not require synchronization, as [newBufferWithBytes](https://developer.apple.com/documentation/metal/mtldevice/1433429-newbufferwithbytes) /// allocates the buffer and copies over the existing data before returning the MTLBuffer. pub fn new_buffer_with_data<T>(&self, data: &[T]) -> Result<Arc<Buffer>> { let size = core::mem::size_of_val(data) as NSUInteger; let new_buffer = self.device.new_buffer_with_data( data.as_ptr() as *const c_void, size, MTLResourceOptions::StorageModeManaged, ); let mut buffers = self.buffers.write().map_err(MetalError::from)?; let subbuffers = buffers .entry((size, MTLResourceOptions::StorageModeManaged)) .or_insert(vec![]); let new_buffer = Arc::new(new_buffer); subbuffers.push(new_buffer.clone()); Ok(new_buffer) } pub fn allocate_zeros(&self, size_in_bytes: usize) -> Result<Arc<Buffer>> { let buffer = self.allocate_buffer( size_in_bytes as NSUInteger, MTLResourceOptions::StorageModePrivate, "allocate_zeros", )?; let command_buffer = self.command_buffer()?; command_buffer.set_label("zeros"); let blit = command_buffer.new_blit_command_encoder(); blit.fill_buffer( &buffer, metal::NSRange { location: 0, length: buffer.length(), }, 0, ); blit.end_encoding(); Ok(buffer) } /// The critical allocator algorithm fn allocate_buffer( &self, size: NSUInteger, option: MTLResourceOptions, _name: &str, ) -> Result<Arc<Buffer>> { let mut buffers = self.buffers.write().map_err(MetalError::from)?; if let Some(b) = find_available_buffer(size, option, &buffers) { // Cloning also ensures we increment the strong count return Ok(b.clone()); } let size = buf_size(size); let subbuffers = buffers.entry((size, option)).or_insert(vec![]); let new_buffer = self.device.new_buffer(size as NSUInteger, option); let new_buffer = Arc::new(new_buffer); subbuffers.push(new_buffer.clone()); Ok(new_buffer) } /// Create a metal GPU capture trace on [`path`]. pub fn capture<P: AsRef<Path>>(&self, path: P) -> Result<()> { let capture = metal::CaptureManager::shared(); let descriptor = metal::CaptureDescriptor::new(); descriptor.set_destination(metal::MTLCaptureDestination::GpuTraceDocument); descriptor.set_capture_device(self); // The [set_output_url] call requires an absolute path so we convert it if needed. 
if path.as_ref().is_absolute() { descriptor.set_output_url(path); } else { let path = std::env::current_dir()?.join(path); descriptor.set_output_url(path); } capture .start_capture(&descriptor) .map_err(MetalError::from)?; Ok(()) } } fn buf_size(size: NSUInteger) -> NSUInteger { size.saturating_sub(1).next_power_of_two() as NSUInteger } fn find_available_buffer( size: NSUInteger, option: MTLResourceOptions, buffers: &BufferMap, ) -> Option<Arc<Buffer>> { let mut best_buffer: Option<&Arc<Buffer>> = None; let mut best_buffer_size: NSUInteger = NSUInteger::MAX; for ((buffer_size, buffer_option), subbuffers) in buffers.iter() { if buffer_size >= &size && buffer_size < &best_buffer_size && buffer_option == &option { for sub in subbuffers { if Arc::strong_count(sub) == 1 { best_buffer = Some(sub); best_buffer_size = *buffer_size; } } } } best_buffer.cloned() }
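// --- Editor's sketch (not part of the original file) ---
// The allocator above buckets requests: `buf_size` maps the requested byte count
// to the smallest power of two that is at least `size - 1`, and
// `find_available_buffer` reuses the smallest free buffer (Arc strong_count == 1)
// in a large-enough bucket. The function below only exercises that bucket
// arithmetic to make the boundaries concrete; `bucket_size_demo` is a
// hypothetical name, not an existing candle API.
#[allow(dead_code)]
fn bucket_size_demo() {
    assert_eq!(buf_size(1), 1);
    assert_eq!(buf_size(1000), 1024); // a 1000-byte request lands in the 1 KiB bucket
    assert_eq!(buf_size(1024), 1024); // exact powers of two keep their own bucket
}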
8
0
hf_public_repos/candle/candle-core/src
hf_public_repos/candle/candle-core/src/metal_backend/mod.rs
//! Implementation of Backend traits for Metal //! use crate::backend::{BackendDevice, BackendStorage}; use crate::conv::{ParamsConv1D, ParamsConv2D, ParamsConvTranspose1D, ParamsConvTranspose2D}; use crate::op::{BinaryOpT, CmpOp, ReduceOp, UnaryOpT}; use crate::{CpuStorage, CpuStorageRef, DType, Layout, Result, Shape}; use candle_metal_kernels::{BufferOffset, CallConvTranspose2dCfg, Kernels}; use metal::{Buffer, MTLResourceOptions, NSUInteger}; use std::collections::HashMap; use std::ffi::c_void; use std::sync::{Arc, Mutex, PoisonError, RwLock, TryLockError}; mod device; pub use device::{DeviceId, MetalDevice}; pub fn buffer_o<'a>(buffer: &'a Buffer, l: &Layout, dtype: DType) -> BufferOffset<'a> { BufferOffset { buffer, offset_in_bytes: l.start_offset() * dtype.size_in_bytes(), } } /// Simple way to catch lock error without /// depending on T #[derive(thiserror::Error, Debug)] pub enum LockError { #[error("{0}")] Poisoned(String), #[error("Would block")] WouldBlock, } impl<T> From<TryLockError<T>> for MetalError { fn from(value: TryLockError<T>) -> Self { match value { TryLockError::Poisoned(p) => MetalError::LockError(LockError::Poisoned(p.to_string())), TryLockError::WouldBlock => MetalError::LockError(LockError::WouldBlock), } } } impl<T> From<PoisonError<T>> for MetalError { fn from(p: PoisonError<T>) -> Self { MetalError::LockError(LockError::Poisoned(p.to_string())) } } /// Metal related errors #[derive(thiserror::Error, Debug)] pub enum MetalError { #[error("{0}")] Message(String), #[error(transparent)] KernelError(#[from] candle_metal_kernels::MetalKernelError), #[error("{0:?}")] LockError(LockError), #[error("{msg}, expected: {expected:?}, got: {got:?}")] UnexpectedDType { msg: &'static str, expected: DType, got: DType, }, } impl From<String> for MetalError { fn from(e: String) -> Self { MetalError::Message(e) } } #[derive(Debug, Clone)] pub struct MetalStorage { /// The actual buffer containing the data. buffer: Arc<metal::Buffer>, /// a reference to the device owning this buffer device: MetalDevice, /// The count of allocated elements in the buffer count: usize, /// The dtype is kept since buffers are untyped. 
dtype: DType, } impl BackendStorage for MetalStorage { type Device = MetalDevice; fn try_clone(&self, _: &Layout) -> Result<Self> { Ok(self.clone()) } fn dtype(&self) -> DType { self.dtype } fn device(&self) -> &Self::Device { &self.device } fn to_cpu_storage(&self) -> Result<CpuStorage> { match self.dtype { DType::U8 => Ok(CpuStorage::U8(self.to_cpu()?)), DType::U32 => Ok(CpuStorage::U32(self.to_cpu()?)), DType::I64 => Ok(CpuStorage::I64(self.to_cpu()?)), DType::F16 => Ok(CpuStorage::F16(self.to_cpu()?)), DType::BF16 => Ok(CpuStorage::BF16(self.to_cpu()?)), DType::F32 => Ok(CpuStorage::F32(self.to_cpu()?)), DType::F64 => Ok(CpuStorage::F64(self.to_cpu()?)), } } fn affine(&self, layout: &Layout, mul: f64, add: f64) -> Result<Self> { let device = self.device().clone(); let shape = layout.shape(); let el = shape.elem_count(); let dtype = self.dtype; let buffer = device.new_buffer(el, self.dtype, "affine")?; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, layout, dtype); if layout.is_contiguous() { let name = match self.dtype { DType::F32 => "affine_f32", DType::F16 => "affine_f16", DType::BF16 => "affine_bf16", DType::U8 => "affine_u8", DType::U32 => "affine_u32", dtype => crate::bail!("Metal contiguous affine {dtype:?} not implemented"), }; candle_metal_kernels::call_affine( &device.device, &command_buffer, &device.kernels, name, el, src, &buffer, mul as f32, add as f32, ) .map_err(MetalError::from)?; } else { let name = match self.dtype { DType::F32 => "affine_f32_strided", DType::F16 => "affine_f16_strided", DType::BF16 => "affine_bf16_strided", dtype => crate::bail!("Metal strided affine {dtype:?} not implemented"), }; candle_metal_kernels::call_affine_strided( &device.device, &command_buffer, &device.kernels, name, layout.dims(), src, layout.stride(), &buffer, mul as f32, add as f32, ) .map_err(MetalError::from)?; } Ok(Self::new(buffer, device.clone(), el, dtype)) } fn powf(&self, layout: &Layout, pow: f64) -> Result<Self> { let device = self.device().clone(); let shape = layout.shape(); let el = shape.elem_count(); let dtype = self.dtype; let buffer = device.new_buffer(el, self.dtype, "powf")?; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, layout, dtype); if layout.is_contiguous() { let name = match self.dtype { DType::F32 => "powf_f32", DType::F16 => "powf_f16", DType::BF16 => "powf_bf16", dtype => crate::bail!("Metal contiguous powf {dtype:?} not implemented"), }; candle_metal_kernels::call_powf( &device.device, &command_buffer, &device.kernels, name, el, src, &buffer, pow as f32, ) .map_err(MetalError::from)?; } else { let name = match self.dtype { DType::F32 => "powf_f32_strided", DType::F16 => "powf_f16_strided", DType::BF16 => "powf_bf16_strided", dtype => crate::bail!("Metal strided powf {dtype:?} not implemented"), }; candle_metal_kernels::call_powf_strided( &device.device, &command_buffer, &device.kernels, name, layout.dims(), src, layout.stride(), &buffer, pow as f32, ) .map_err(MetalError::from)?; } Ok(Self::new(buffer, device.clone(), el, dtype)) } fn elu(&self, layout: &Layout, alpha: f64) -> Result<Self> { let device = self.device().clone(); let shape = layout.shape(); let el = shape.elem_count(); let dtype = self.dtype; let buffer = device.new_buffer(el, self.dtype, "elu")?; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, layout, self.dtype); if layout.is_contiguous() { let name = match self.dtype { DType::F32 => "elu_f32", DType::F16 => "elu_f16", DType::BF16 
=> "elu_bf16", dtype => crate::bail!("Metal contiguous elu {dtype:?} not implemented"), }; candle_metal_kernels::call_elu( &device.device, &command_buffer, &device.kernels, name, el, src, &buffer, alpha as f32, ) .map_err(MetalError::from)?; } else { let name = match self.dtype { DType::F32 => "elu_f32_strided", DType::F16 => "elu_f16_strided", DType::BF16 => "elu_bf16_strided", dtype => crate::bail!("Metal strided elu {dtype:?} not implemented"), }; candle_metal_kernels::call_elu_strided( &device.device, &command_buffer, &device.kernels, name, layout.dims(), src, layout.stride(), &buffer, alpha as f32, ) .map_err(MetalError::from)?; } Ok(Self::new(buffer, device.clone(), el, dtype)) } fn reduce_op(&self, op: ReduceOp, layout: &Layout, sum_dims: &[usize]) -> Result<Self> { let device = self.device.clone(); let src_stride = layout.stride(); let src_dims = layout.shape().dims(); // Source dims and strides with the sum dims at the end. let mut dims = vec![]; let mut stride = vec![]; let mut dst_el: usize = 1; for (dim_idx, &d) in src_dims.iter().enumerate() { if !sum_dims.contains(&dim_idx) { dst_el *= d; dims.push(d); stride.push(src_stride[dim_idx]); } } for &dim_idx in sum_dims.iter() { dims.push(src_dims[dim_idx]); stride.push(src_stride[dim_idx]); } // The reduction loop requires the shared array to be properly initialized and for // this we want the number of threads to be a power of two. let (name, check_empty, return_index) = match (op, self.dtype) { (ReduceOp::Sum, DType::F32) => ("fast_sum_f32_strided", false, false), (ReduceOp::Min, DType::F32) => ("fast_min_f32_strided", true, false), (ReduceOp::Max, DType::F32) => ("fast_max_f32_strided", true, false), (ReduceOp::ArgMin, DType::F32) => ("fast_argmin_f32_strided", true, true), (ReduceOp::ArgMax, DType::F32) => ("fast_argmax_f32_strided", true, true), (ReduceOp::Sum, DType::U32) => ("fast_sum_u32_strided", false, false), (ReduceOp::Min, DType::U32) => ("fast_min_u32_strided", true, false), (ReduceOp::Max, DType::U32) => ("fast_max_u32_strided", true, false), (ReduceOp::ArgMin, DType::U32) => ("fast_argmin_u32_strided", true, true), (ReduceOp::ArgMax, DType::U32) => ("fast_argmax_u32_strided", true, true), (ReduceOp::Sum, DType::F16) => ("fast_sum_f16_strided", false, false), (ReduceOp::Min, DType::F16) => ("fast_min_f16_strided", true, false), (ReduceOp::Max, DType::F16) => ("fast_max_f16_strided", true, false), (ReduceOp::ArgMin, DType::F16) => ("fast_argmin_f16_strided", true, true), (ReduceOp::ArgMax, DType::F16) => ("fast_argmax_f16_strided", true, true), (ReduceOp::Sum, DType::BF16) => ("fast_sum_bf16_strided", false, false), (ReduceOp::Min, DType::BF16) => ("fast_min_bf16_strided", true, false), (ReduceOp::Max, DType::BF16) => ("fast_max_bf16_strided", true, false), (ReduceOp::ArgMin, DType::BF16) => ("fast_argmin_bf16_strided", true, true), (ReduceOp::ArgMax, DType::BF16) => ("fast_argmax_bf16_strided", true, true), (ReduceOp::Sum, DType::I64) => ("fast_sum_i64_strided", false, false), (ReduceOp::Min, DType::I64) => ("fast_min_i64_strided", true, false), (ReduceOp::Max, DType::I64) => ("fast_max_i64_strided", true, false), (ReduceOp::ArgMin, DType::I64) => ("fast_argmin_i64_strided", true, true), (ReduceOp::ArgMax, DType::I64) => ("fast_argmax_i64_strided", true, true), (ReduceOp::Sum, DType::U8) => ("fast_sum_u8_strided", false, false), (ReduceOp::Min, DType::U8) => ("fast_min_u8_strided", true, false), (ReduceOp::Max, DType::U8) => ("fast_max_u8_strided", true, false), (ReduceOp::ArgMin, DType::U8) => 
("fast_argmin_u8_strided", true, true), (ReduceOp::ArgMax, DType::U8) => ("fast_argmax_u8_strided", true, true), (k, dtype) => crate::bail!("Metal reduce op {k:?} {dtype:?} not implemented"), }; if check_empty && layout.shape().elem_count() == 0 { Err(crate::Error::EmptyTensor { op: "reduce" }.bt())? } let dtype = if return_index { DType::U32 } else { self.dtype }; let buffer = device.new_buffer(dst_el, dtype, "reduce")?; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, layout, self.dtype); candle_metal_kernels::call_reduce_strided( &device.device, &command_buffer, &device.kernels, name, &dims, &stride, dst_el, src, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, device, dst_el, dtype)) } fn cmp(&self, op: CmpOp, rhs: &Self, lhs_l: &Layout, rhs_l: &Layout) -> Result<Self> { let name = match op { CmpOp::Eq => "eq", CmpOp::Ne => "ne", CmpOp::Le => "le", CmpOp::Ge => "ge", CmpOp::Lt => "lt", CmpOp::Gt => "gt", }; self.binary(name, rhs, lhs_l, rhs_l) } fn to_dtype(&self, layout: &Layout, dtype: DType) -> Result<Self> { let device = self.device(); let shape = layout.shape(); let el_count = shape.elem_count(); let buffer = device.new_buffer(el_count, dtype, "todtype")?; let command_buffer = device.command_buffer()?; let src = buffer_o(&self.buffer, layout, self.dtype); if layout.is_contiguous() { let kernel_name = match (self.dtype, dtype) { (DType::U32, DType::BF16) => "cast_u32_bf16", (DType::U32, DType::F16) => "cast_u32_f16", (DType::U32, DType::F32) => "cast_u32_f32", (DType::U32, DType::I64) => "cast_u32_i64", (DType::U32, DType::U8) => "cast_u32_u8", (DType::U8, DType::BF16) => "cast_u8_bf16", (DType::U8, DType::F16) => "cast_u8_f16", (DType::U8, DType::F32) => "cast_u8_f32", (DType::U8, DType::I64) => "cast_u8_i64", (DType::U8, DType::U32) => "cast_u8_u32", (DType::F32, DType::BF16) => "cast_f32_bf16", (DType::F32, DType::F16) => "cast_f32_f16", (DType::F32, DType::I64) => "cast_f32_i64", (DType::F32, DType::U32) => "cast_f32_u32", (DType::F32, DType::U8) => "cast_f32_u8", (DType::I64, DType::BF16) => "cast_i64_bf16", (DType::I64, DType::F16) => "cast_i64_f16", (DType::I64, DType::F32) => "cast_i64_f32", (DType::I64, DType::U32) => "cast_i64_u32", (DType::I64, DType::U8) => "cast_i64_u8", (DType::F16, DType::BF16) => "cast_f16_bf16", (DType::F16, DType::F32) => "cast_f16_f32", (DType::F16, DType::I64) => "cast_f16_i64", (DType::F16, DType::U32) => "cast_f16_u32", (DType::F16, DType::U8) => "cast_f16_u8", (DType::BF16, DType::F16) => "cast_bf16_f16", (DType::BF16, DType::F32) => "cast_bf16_f32", (DType::BF16, DType::I64) => "cast_bf16_i64", (DType::BF16, DType::U32) => "cast_bf16_u32", (DType::BF16, DType::U8) => "cast_bf16_u8", (left, right) => { crate::bail!("Metal contiguous to_dtype {left:?} {right:?} not implemented") } }; candle_metal_kernels::call_cast_contiguous( &device.device, &command_buffer, &device.kernels, kernel_name, el_count, src, &buffer, ) .map_err(MetalError::from)?; } else { let kernel_name = match (self.dtype, dtype) { (DType::BF16, DType::F16) => "cast_bf16_f16_strided", (DType::BF16, DType::F32) => "cast_bf16_f32_strided", (DType::BF16, DType::I64) => "cast_bf16_i64_strided", (DType::BF16, DType::U32) => "cast_bf16_u32_strided", (DType::BF16, DType::U8) => "cast_bf16_u8_strided", (DType::F16, DType::BF16) => "cast_f16_bf16_strided", (DType::F16, DType::F32) => "cast_f16_f32_strided", (DType::F16, DType::I64) => "cast_f16_i64_strided", (DType::F16, DType::U32) => "cast_f16_u32_strided", (DType::F16, DType::U8) => 
"cast_f16_u8_strided", (DType::F32, DType::BF16) => "cast_f32_bf16_strided", (DType::F32, DType::F16) => "cast_f32_f16_strided", (DType::F32, DType::I64) => "cast_f32_i64_strided", (DType::F32, DType::U32) => "cast_f32_u32_strided", (DType::F32, DType::U8) => "cast_f32_u8_strided", (DType::I64, DType::F32) => "cast_i64_f32_strided", (DType::I64, DType::BF16) => "cast_i64_bf16_strided", (DType::I64, DType::F16) => "cast_i64_f16_strided", (DType::I64, DType::U32) => "cast_i64_u32_strided", (DType::I64, DType::U8) => "cast_i64_u8_strided", (DType::U32, DType::BF16) => "cast_u32_bf16_strided", (DType::U32, DType::F16) => "cast_u32_f16_strided", (DType::U32, DType::F32) => "cast_u32_f32_strided", (DType::U32, DType::I64) => "cast_u32_i64_strided", (DType::U32, DType::U8) => "cast_u32_u8_strided", (DType::U8, DType::BF16) => "cast_u8_bf16_strided", (DType::U8, DType::F16) => "cast_u8_f16_strided", (DType::U8, DType::F32) => "cast_u8_f32_strided", (DType::U8, DType::I64) => "cast_u8_i64_strided", (DType::U8, DType::U32) => "cast_u8_u32_strided", (left, right) => { crate::bail!("Metal strided to_dtype {left:?} {right:?} not implemented") } }; candle_metal_kernels::call_cast_strided( &device.device, &command_buffer, &device.kernels, kernel_name, layout.dims(), src, layout.stride(), &buffer, ) .map_err(MetalError::from)?; } command_buffer.set_label("to_dtype"); Ok(Self::new(buffer, device.clone(), el_count, dtype)) } fn unary_impl<B: UnaryOpT>(&self, layout: &Layout) -> Result<Self> { let device = self.device(); let dtype = self.dtype; let shape = layout.shape(); let el_count = shape.elem_count(); let buffer = device.new_buffer(el_count, dtype, B::KERNEL)?; let command_buffer = device.command_buffer()?; command_buffer.set_label(B::KERNEL); let src = buffer_o(&self.buffer, layout, self.dtype); match (el_count % 2, dtype, layout.is_contiguous()) { (0, DType::BF16 | DType::F16, true) => { use candle_metal_kernels::unary::contiguous_tiled; let kernel_name = match (B::KERNEL, dtype) { ("uabs", DType::F16) => contiguous_tiled::abs::HALF, ("uabs", DType::F32) => contiguous_tiled::abs::FLOAT, ("uabs", DType::BF16) => contiguous_tiled::abs::BFLOAT, ("uceil", DType::F16) => contiguous_tiled::ceil::HALF, ("uceil", DType::F32) => contiguous_tiled::ceil::FLOAT, ("uceil", DType::BF16) => contiguous_tiled::ceil::BFLOAT, ("ucos", DType::F16) => contiguous_tiled::cos::HALF, ("ucos", DType::F32) => contiguous_tiled::cos::FLOAT, ("ucos", DType::BF16) => contiguous_tiled::cos::BFLOAT, ("uerf", DType::F16) => contiguous_tiled::erf::HALF, ("uerf", DType::F32) => contiguous_tiled::erf::FLOAT, ("uerf", DType::BF16) => contiguous_tiled::erf::BFLOAT, ("uexp", DType::F16) => contiguous_tiled::exp::HALF, ("uexp", DType::F32) => contiguous_tiled::exp::FLOAT, ("uexp", DType::BF16) => contiguous_tiled::exp::BFLOAT, ("ufloor", DType::F16) => contiguous_tiled::floor::HALF, ("ufloor", DType::F32) => contiguous_tiled::floor::FLOAT, ("ufloor", DType::BF16) => contiguous_tiled::floor::BFLOAT, ("ugelu_erf", DType::F16) => contiguous_tiled::gelu_erf::HALF, ("ugelu_erf", DType::F32) => contiguous_tiled::gelu_erf::FLOAT, ("ugelu_erf", DType::BF16) => contiguous_tiled::gelu_erf::BFLOAT, ("ugelu", DType::F16) => contiguous_tiled::gelu::HALF, ("ugelu", DType::F32) => contiguous_tiled::gelu::FLOAT, ("ugelu", DType::BF16) => contiguous_tiled::gelu::BFLOAT, ("ulog", DType::F16) => contiguous_tiled::log::HALF, ("ulog", DType::F32) => contiguous_tiled::log::FLOAT, ("ulog", DType::BF16) => contiguous_tiled::log::BFLOAT, ("uneg", DType::F16) => 
contiguous_tiled::neg::HALF, ("uneg", DType::F32) => contiguous_tiled::neg::FLOAT, ("uneg", DType::BF16) => contiguous_tiled::neg::BFLOAT, ("urecip", DType::F16) => contiguous_tiled::recip::HALF, ("urecip", DType::F32) => contiguous_tiled::recip::FLOAT, ("urecip", DType::BF16) => contiguous_tiled::recip::BFLOAT, ("urelu", DType::F16) => contiguous_tiled::relu::HALF, ("urelu", DType::F32) => contiguous_tiled::relu::FLOAT, ("urelu", DType::BF16) => contiguous_tiled::relu::BFLOAT, ("uround", DType::F16) => contiguous_tiled::round::HALF, ("uround", DType::F32) => contiguous_tiled::round::FLOAT, ("uround", DType::BF16) => contiguous_tiled::round::BFLOAT, ("usilu", DType::F16) => contiguous_tiled::silu::HALF, ("usilu", DType::F32) => contiguous_tiled::silu::FLOAT, ("usilu", DType::BF16) => contiguous_tiled::silu::BFLOAT, ("usin", DType::F16) => contiguous_tiled::sin::HALF, ("usin", DType::F32) => contiguous_tiled::sin::FLOAT, ("usin", DType::BF16) => contiguous_tiled::sin::BFLOAT, ("usqr", DType::F16) => contiguous_tiled::sqr::HALF, ("usqr", DType::F32) => contiguous_tiled::sqr::FLOAT, ("usqr", DType::BF16) => contiguous_tiled::sqr::BFLOAT, ("usqrt", DType::F16) => contiguous_tiled::sqrt::HALF, ("usqrt", DType::F32) => contiguous_tiled::sqrt::FLOAT, ("usqrt", DType::BF16) => contiguous_tiled::sqrt::BFLOAT, ("utanh", DType::F16) => contiguous_tiled::tanh::HALF, ("utanh", DType::F32) => contiguous_tiled::tanh::FLOAT, ("utanh", DType::BF16) => contiguous_tiled::tanh::BFLOAT, ("usign", DType::F16) => contiguous_tiled::sign::HALF, ("usign", DType::F32) => contiguous_tiled::sign::FLOAT, ("usign", DType::BF16) => contiguous_tiled::sign::BFLOAT, ("usign", DType::I64) => contiguous_tiled::sign::I64, (name, dtype) => { crate::bail!( "Metal contiguous_tiled unary {name} {dtype:?} not implemented" ) } }; candle_metal_kernels::call_unary_contiguous_tiled( &device.device, &command_buffer, &device.kernels, kernel_name, el_count, src, &buffer, ) .map_err(MetalError::from)?; } (_, _, true) => { use candle_metal_kernels::unary::contiguous; let kernel_name = match (B::KERNEL, dtype) { ("uabs", DType::F16) => contiguous::abs::HALF, ("uabs", DType::F32) => contiguous::abs::FLOAT, ("uabs", DType::BF16) => contiguous::abs::BFLOAT, ("uceil", DType::F16) => contiguous::ceil::HALF, ("uceil", DType::F32) => contiguous::ceil::FLOAT, ("uceil", DType::BF16) => contiguous::ceil::BFLOAT, ("ucos", DType::F16) => contiguous::cos::HALF, ("ucos", DType::F32) => contiguous::cos::FLOAT, ("ucos", DType::BF16) => contiguous::cos::BFLOAT, ("uerf", DType::F16) => contiguous::erf::HALF, ("uerf", DType::F32) => contiguous::erf::FLOAT, ("uerf", DType::BF16) => contiguous::erf::BFLOAT, ("uexp", DType::F16) => contiguous::exp::HALF, ("uexp", DType::F32) => contiguous::exp::FLOAT, ("uexp", DType::BF16) => contiguous::exp::BFLOAT, ("ufloor", DType::F16) => contiguous::floor::HALF, ("ufloor", DType::F32) => contiguous::floor::FLOAT, ("ufloor", DType::BF16) => contiguous::floor::BFLOAT, ("ugelu_erf", DType::F16) => contiguous::gelu_erf::HALF, ("ugelu_erf", DType::F32) => contiguous::gelu_erf::FLOAT, ("ugelu_erf", DType::BF16) => contiguous::gelu_erf::BFLOAT, ("ugelu", DType::F16) => contiguous::gelu::HALF, ("ugelu", DType::F32) => contiguous::gelu::FLOAT, ("ugelu", DType::BF16) => contiguous::gelu::BFLOAT, ("ulog", DType::F16) => contiguous::log::HALF, ("ulog", DType::F32) => contiguous::log::FLOAT, ("ulog", DType::BF16) => contiguous::log::BFLOAT, ("uneg", DType::F16) => contiguous::neg::HALF, ("uneg", DType::F32) => contiguous::neg::FLOAT, 
("uneg", DType::BF16) => contiguous::neg::BFLOAT, ("urecip", DType::F16) => contiguous::recip::HALF, ("urecip", DType::F32) => contiguous::recip::FLOAT, ("urecip", DType::BF16) => contiguous::recip::BFLOAT, ("urelu", DType::F16) => contiguous::relu::HALF, ("urelu", DType::F32) => contiguous::relu::FLOAT, ("urelu", DType::BF16) => contiguous::relu::BFLOAT, ("uround", DType::F16) => contiguous::round::HALF, ("uround", DType::F32) => contiguous::round::FLOAT, ("uround", DType::BF16) => contiguous::round::BFLOAT, ("usilu", DType::F16) => contiguous::silu::HALF, ("usilu", DType::F32) => contiguous::silu::FLOAT, ("usilu", DType::BF16) => contiguous::silu::BFLOAT, ("usin", DType::F16) => contiguous::sin::HALF, ("usin", DType::F32) => contiguous::sin::FLOAT, ("usin", DType::BF16) => contiguous::sin::BFLOAT, ("usqr", DType::F16) => contiguous::sqr::HALF, ("usqr", DType::F32) => contiguous::sqr::FLOAT, ("usqr", DType::BF16) => contiguous::sqr::BFLOAT, ("usqrt", DType::F16) => contiguous::sqrt::HALF, ("usqrt", DType::F32) => contiguous::sqrt::FLOAT, ("usqrt", DType::BF16) => contiguous::sqrt::BFLOAT, ("utanh", DType::F16) => contiguous::tanh::HALF, ("utanh", DType::F32) => contiguous::tanh::FLOAT, ("utanh", DType::BF16) => contiguous::tanh::BFLOAT, ("usign", DType::F16) => contiguous::sign::HALF, ("usign", DType::F32) => contiguous::sign::FLOAT, ("usign", DType::BF16) => contiguous::sign::BFLOAT, ("usign", DType::I64) => contiguous::sign::I64, (name, dtype) => { crate::bail!("Metal contiguous unary {name} {dtype:?} not implemented") } }; candle_metal_kernels::call_unary_contiguous( &device.device, &command_buffer, &device.kernels, kernel_name, el_count, src, &buffer, ) .map_err(MetalError::from)?; } (_, _, false) => { use candle_metal_kernels::unary::strided; let kernel_name = match (B::KERNEL, dtype) { ("ucos", DType::F32) => strided::cos::FLOAT, ("usin", DType::F32) => strided::sin::FLOAT, ("usqr", DType::F32) => strided::sqr::FLOAT, ("usqrt", DType::F32) => strided::sqrt::FLOAT, ("uneg", DType::F32) => strided::neg::FLOAT, ("uexp", DType::F32) => strided::exp::FLOAT, ("ulog", DType::F32) => strided::log::FLOAT, ("ugelu", DType::F32) => strided::gelu::FLOAT, ("ugelu_erf", DType::F32) => strided::gelu_erf::FLOAT, ("uerf", DType::F32) => strided::erf::FLOAT, ("usilu", DType::F32) => strided::silu::FLOAT, ("uabs", DType::F32) => strided::abs::FLOAT, ("uceil", DType::F32) => strided::ceil::FLOAT, ("ufloor", DType::F32) => strided::floor::FLOAT, ("urelu", DType::F32) => strided::relu::FLOAT, ("uround", DType::F32) => strided::round::FLOAT, ("utanh", DType::F32) => strided::tanh::FLOAT, ("ucos", DType::F16) => strided::cos::HALF, ("usin", DType::F16) => strided::sin::HALF, ("usqr", DType::F16) => strided::sqr::HALF, ("usqrt", DType::F16) => strided::sqrt::HALF, ("uneg", DType::F16) => strided::neg::HALF, ("uexp", DType::F16) => strided::exp::HALF, ("ulog", DType::F16) => strided::log::HALF, ("ugelu", DType::F16) => strided::gelu::HALF, ("ugelu_erf", DType::F16) => strided::gelu_erf::HALF, ("uerf", DType::F16) => strided::erf::HALF, ("usilu", DType::F16) => strided::silu::HALF, ("uabs", DType::F16) => strided::abs::HALF, ("uceil", DType::F16) => strided::ceil::HALF, ("ufloor", DType::F16) => strided::floor::HALF, ("urelu", DType::F16) => strided::relu::HALF, ("uround", DType::F16) => strided::round::HALF, ("utanh", DType::F16) => strided::tanh::HALF, ("ucos", DType::BF16) => strided::cos::BFLOAT, ("usin", DType::BF16) => strided::sin::BFLOAT, ("usqr", DType::BF16) => strided::sqr::BFLOAT, ("usqrt", 
DType::BF16) => strided::sqrt::BFLOAT, ("uneg", DType::BF16) => strided::neg::BFLOAT, ("uexp", DType::BF16) => strided::exp::BFLOAT, ("ulog", DType::BF16) => strided::log::BFLOAT, ("ugelu", DType::BF16) => strided::gelu::BFLOAT, ("ugelu_erf", DType::BF16) => strided::gelu_erf::BFLOAT, ("uerf", DType::BF16) => strided::erf::BFLOAT, ("usilu", DType::BF16) => strided::silu::BFLOAT, ("uabs", DType::BF16) => strided::abs::BFLOAT, ("uceil", DType::BF16) => strided::ceil::BFLOAT, ("ufloor", DType::BF16) => strided::floor::BFLOAT, ("urelu", DType::BF16) => strided::relu::BFLOAT, ("uround", DType::BF16) => strided::round::BFLOAT, ("utanh", DType::BF16) => strided::tanh::BFLOAT, (name, dtype) => { crate::bail!("Metal strided unary {name} {dtype:?} not implemented") } }; let dst = BufferOffset::zero_offset(&buffer); candle_metal_kernels::call_unary_strided( &device.device, &command_buffer, &device.kernels, kernel_name, layout.dims(), src, layout.stride(), dst, ) .map_err(MetalError::from)?; } } Ok(Self::new(buffer, device.clone(), el_count, dtype)) } fn binary_impl<B: BinaryOpT>( &self, rhs: &Self, lhs_l: &Layout, rhs_l: &Layout, ) -> Result<Self> { self.binary(B::KERNEL, rhs, lhs_l, rhs_l) } fn where_cond( &self, layout: &Layout, t: &Self, t_l: &Layout, f: &Self, f_l: &Layout, ) -> Result<Self> { let device = self.device.clone(); let shape = t_l.shape(); let dims = shape.dims(); let el = shape.elem_count(); let dtype = t.dtype; let buffer = self.device.new_buffer(el, dtype, "where")?; let command_buffer = self.device.command_buffer()?; if t.dtype() != f.dtype() { crate::bail!( "Invalid where: different dtypes for values {:?} != {:?}", t.dtype(), f.dtype() ); } let name = match (self.dtype, t.dtype()) { (DType::U8, DType::F32) => "where_u8_f32", (DType::U32, DType::F32) => "where_u32_f32", (DType::U8, DType::BF16) => "where_u8_bf16", (DType::U8, DType::F16) => "where_u8_f16", (DType::U8, DType::I64) => "where_u8_i64", (DType::U8, DType::U32) => "where_u8_u32", (DType::U8, DType::U8) => "where_u8_u8", (left, right) => crate::bail!("Metal where_cond {left:?} {right:?} not implemented"), }; let src = buffer_o(&self.buffer, layout, self.dtype); let t = buffer_o(&t.buffer, t_l, t.dtype); let f = buffer_o(&f.buffer, f_l, f.dtype); candle_metal_kernels::call_where_cond_strided( &device.device, &command_buffer, &device.kernels, name, dims, src, layout.stride(), t, t_l.stride(), f, f_l.stride(), &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, device, el, dtype)) } fn conv1d( &self, layout: &Layout, kernel: &Self, kernel_l: &Layout, params: &ParamsConv1D, ) -> Result<Self> { let device = self.device().clone(); let shape = layout.shape(); let dims = shape.dims(); let strides = layout.stride(); let stride = params.stride; let dilation = params.dilation; let padding = params.padding; let k_size = params.k_size; let l_out = (dims[2] + 2 * padding - dilation * (k_size - 1) - 1) / stride + 1; let dst_el = dims[0] * l_out * dims[1] * k_size; let dst = self .device .new_buffer(dst_el, self.dtype, "conv1d_im2col")?; let command_buffer = self.device.command_buffer()?; let name = match self.dtype { DType::F32 => "im2col1d_f32", dtype => crate::bail!("Metal conv1d {dtype:?} not implemented"), }; let src = buffer_o(&self.buffer, layout, self.dtype); candle_metal_kernels::call_im2col1d_strided( &self.device.device, &command_buffer, &self.device.kernels, name, layout.shape().dims(), strides, (k_size, stride, padding, dilation), src, &dst, ) .map_err(MetalError::from)?; let col = Self { buffer: dst, device, 
count: dst_el, dtype: self.dtype, }; let l_out = params.l_out(); let b = params.b_size; let n = params.c_out; let k = params.k_size * params.c_in; let m = l_out; let col_l = Layout::contiguous((b, m, k)); let res = if kernel_l.is_contiguous() { let kernel_l = Layout::contiguous_with_offset((1, n, k), kernel_l.start_offset()) .transpose(1, 2)? .broadcast_as((b, k, n))?; col.matmul(kernel, (b, m, n, k), &col_l, &kernel_l)? } else { // Make the kernel contiguous if not already the case. let mut kernel_c = self.device().zeros_impl(kernel_l.shape(), kernel.dtype())?; kernel.copy_strided_src(&mut kernel_c, 0, kernel_l)?; let kernel_l = Layout::contiguous_with_offset((1, n, k), kernel_l.start_offset()) .transpose(1, 2)? .broadcast_as((b, k, n))?; col.matmul(kernel, (b, m, n, k), &col_l, &kernel_l)? }; let res_l = Layout::contiguous((b, l_out, n)).transpose(1, 2)?; let mut res_t = self.device().zeros_impl(res_l.shape(), res.dtype())?; res.copy_strided_src(&mut res_t, 0, &res_l)?; Ok(res_t) } fn conv_transpose1d( &self, layout: &Layout, k: &Self, k_layout: &Layout, params: &ParamsConvTranspose1D, ) -> Result<Self> { const USE_COL2IM_CONV1D_TR: bool = true; let can_use_col2im = k_layout.is_contiguous() && params.dilation == 1 && params.padding == 0 && params.output_padding == 0; let l_out = params.l_out(); let dst_el = params.c_out * l_out * params.b_size; let buffer = if USE_COL2IM_CONV1D_TR && can_use_col2im { let (b_size, c_in, l_in) = layout.shape().dims3()?; let (c_in2, c_out, k_size) = k_layout.shape().dims3()?; if c_in != c_in2 { crate::bail!( "convtr1d: shape mismatch on c_in {:?} {:?}", layout.shape(), k_layout.shape() ) } let buffer = self .device .new_buffer(dst_el, self.dtype, "conv_transpose1d")?; let name = match self.dtype { DType::F32 => "col2im1d_f32", DType::U32 => "col2im1d_u32", DType::U8 => "col2im1d_u8", dtype => crate::bail!("metal col2im1d {dtype:?} not implemented"), }; let col = { // This merges the last two dimensions of the kernel together. let kernel_l_mm = Layout::new( (b_size, c_in, k_size * c_out).into(), vec![0, k_size * c_out, 1], k_layout.start_offset(), ); self.matmul( k, (b_size, l_in, c_out * k_size, c_in), &layout.transpose(1, 2)?, &kernel_l_mm, )? }; // It is important for the command buffer to be obtained *after* the matmul // kernel has run, otherwise we might use a command-buffer that has been commited // already resulting in the following error. 
// _status < MTLCommandBufferStatusCommitted > // -[IOGPUMetalCommandBuffer setCurrentCommandEncoder:] let command_buffer = self.device.command_buffer()?; candle_metal_kernels::call_col2im1d( &self.device.device, &command_buffer, &self.device.kernels, name, &[b_size, l_in, c_out, k_size], params.k_size, params.stride, BufferOffset::zero_offset(&col.buffer), &buffer, ) .map_err(MetalError::from)?; buffer } else { let buffer = self .device .new_buffer(dst_el, self.dtype, "conv_transpose1d")?; let command_buffer = self.device.command_buffer()?; let name = match self.dtype { DType::F32 => "conv_transpose1d_f32", DType::F16 => "conv_transpose1d_f16", DType::BF16 => "conv_transpose1d_bf16", DType::U32 => "conv_transpose1d_u32", DType::U8 => "conv_transpose1d_u8", dtype => crate::bail!("Metal conv_transpose1d {dtype:?} not implemented"), }; candle_metal_kernels::call_conv_transpose1d( &self.device.device, &command_buffer, &self.device.kernels, name, params.dilation, params.stride, params.padding, params.output_padding, params.c_out, l_out, params.b_size, layout.dims(), layout.stride(), k_layout.dims(), k_layout.stride(), &self.buffer, layout.start_offset() * self.dtype.size_in_bytes(), &k.buffer, k_layout.start_offset() * k.dtype.size_in_bytes(), &buffer, ) .map_err(MetalError::from)?; buffer }; Ok(Self::new(buffer, self.device.clone(), dst_el, self.dtype)) } fn conv2d( &self, layout: &Layout, kernel: &Self, kernel_l: &Layout, params: &ParamsConv2D, ) -> Result<Self> { let device = self.device().clone(); let shape = layout.shape(); let dims = shape.dims(); let stride = params.stride; let dilation = params.dilation; let padding = params.padding; let h_k = params.k_h; let w_k = params.k_w; let h = dims[2]; let w = dims[3]; let h_out = (h + 2 * padding - dilation * (h_k - 1) - 1) / stride + 1; let w_out = (w + 2 * padding - dilation * (w_k - 1) - 1) / stride + 1; let dst_el = dims[0] * h_out * w_out * dims[1] * h_k * w_k; let dst = self .device .new_buffer(dst_el, self.dtype, "conv2d_im2col")?; let command_buffer = self.device.command_buffer()?; let name = match self.dtype { DType::F32 => "im2col_f32", DType::F16 => "im2col_f16", DType::BF16 => "im2col_bf16", DType::U8 => "im2col_u8", DType::U32 => "im2col_u32", dtype => crate::bail!("Metal conv2d {dtype:?} not implemented"), }; let src = buffer_o(&self.buffer, layout, self.dtype); candle_metal_kernels::call_im2col_strided( &self.device.device, &command_buffer, &self.device.kernels, name, layout.shape().dims(), layout.stride(), (h_k, w_k, stride, padding, dilation), src, &dst, ) .map_err(MetalError::from)?; let col = Self { buffer: dst, device, count: dst_el, dtype: self.dtype, }; let h_out = params.out_h(); let w_out = params.out_w(); let b = params.b_size; let n = params.c_out; let k = params.k_h * params.k_w * params.c_in; let m = h_out * w_out; let col_l = Layout::contiguous((b, m, k)); let res = if kernel_l.is_contiguous() { let kernel_l = Layout::contiguous_with_offset((1, n, k), kernel_l.start_offset()) .transpose(1, 2)? .broadcast_as((b, k, n))?; col.matmul(kernel, (b, m, n, k), &col_l, &kernel_l)? } else { // Make the kernel contiguous if not already the case. let mut kernel_c = self.device().zeros_impl(kernel_l.shape(), kernel.dtype())?; kernel.copy_strided_src(&mut kernel_c, 0, kernel_l)?; let kernel_l = Layout::contiguous_with_offset((1, n, k), kernel_l.start_offset()) .transpose(1, 2)? .broadcast_as((b, k, n))?; col.matmul(kernel, (b, m, n, k), &col_l, &kernel_l)? }; let res_l = Layout::contiguous((b, h_out, w_out, n)) .transpose(1, 2)? 
.transpose(1, 3)?; let mut res_t = self.device().zeros_impl(res_l.shape(), res.dtype())?; res.copy_strided_src(&mut res_t, 0, &res_l)?; Ok(res_t) } fn conv_transpose2d( &self, l: &Layout, kernel: &Self, kernel_l: &Layout, params: &ParamsConvTranspose2D, ) -> Result<Self> { // Kernel shape: (c_in_k, c_out, h_k, w_k) // Input shape: (b_size, c_in, h_in, w_in) let (out_w, out_h) = (params.out_w(), params.out_h()); let dst_el = params.c_out * out_w * out_h * params.b_size; let dims = l.dims(); if dims.len() != 4 { crate::bail!("unexpected input shape for conv_transpose2d {dims:?}, expected 4") } let k_dims = kernel_l.dims(); if k_dims.len() != 4 { crate::bail!("unexpected kernel shape for conv_transpose2d {k_dims:?}, expected 4") } let buffer = self .device .new_buffer(dst_el, self.dtype, "conv_transpose2d")?; let command_buffer = self.device.command_buffer()?; let name = match self.dtype { DType::F32 => "conv_transpose2d_f32", DType::F16 => "conv_transpose2d_f16", DType::BF16 => "conv_transpose2d_bf16", dtype => crate::bail!("Metal conv_transpose2d {dtype:?} not implemented"), }; candle_metal_kernels::call_conv_transpose2d( &self.device.device, &command_buffer, &self.device.kernels, name, CallConvTranspose2dCfg { dilation: params.dilation, stride: params.stride, padding: params.padding, output_padding: params.output_padding, c_out: params.c_out, out_h, out_w, b_size: params.b_size, input_dims: l.dims(), input_stride: l.stride(), kernel_dims: kernel_l.dims(), kernel_stride: kernel_l.stride(), input_offset: l.start_offset() * self.dtype.size_in_bytes(), kernel_offset: kernel_l.start_offset() * kernel.dtype.size_in_bytes(), }, &self.buffer, &kernel.buffer, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, self.device.clone(), dst_el, self.dtype)) } fn avg_pool2d( &self, inp_l: &Layout, (w_k, h_k): (usize, usize), (w_stride, h_stride): (usize, usize), ) -> Result<Self> { let shape = inp_l.shape(); let (b_size, channels, width, height) = shape.dims4()?; let strides = inp_l.stride(); let name = match self.dtype { DType::F32 => "avg_pool2d_f32", DType::F16 => "avg_pool2d_f16", DType::BF16 => "avg_pool2d_bf16", DType::U8 => "avg_pool2d_u8", DType::U32 => "avg_pool2d_u32", dtype => crate::bail!("Metal avg_pool2d {dtype:?} not implemented"), }; let out_w = (width - w_k) / w_stride + 1; let out_h = (height - h_k) / h_stride + 1; let dst_el = out_w * out_h * b_size * channels; let buffer = self.device.new_buffer(dst_el, self.dtype, "avg_pool2d")?; let command_buffers = self.device.command_buffer()?; candle_metal_kernels::call_pool2d( &self.device.device, &command_buffers, &self.device.kernels, name, inp_l.dims(), strides, out_w, out_h, w_k, h_k, w_stride, h_stride, &self.buffer, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, self.device.clone(), dst_el, self.dtype)) } fn max_pool2d( &self, inp_l: &Layout, (w_k, h_k): (usize, usize), (w_stride, h_stride): (usize, usize), ) -> Result<Self> { let shape = inp_l.shape(); let (b_size, channels, width, height) = shape.dims4()?; let strides = inp_l.stride(); let name = match self.dtype { DType::F32 => "max_pool2d_f32", DType::F16 => "max_pool2d_f16", DType::BF16 => "max_pool2d_bf16", DType::U8 => "max_pool2d_u8", DType::U32 => "max_pool2d_u32", dtype => crate::bail!("Metal max_pool2d {dtype:?} not implemented"), }; let out_w = (width - w_k) / w_stride + 1; let out_h = (height - h_k) / h_stride + 1; let dst_el = out_w * out_h * b_size * channels; let buffer = self.device.new_buffer(dst_el, self.dtype, "max_pool2d")?; let command_buffers = 
self.device.command_buffer()?; candle_metal_kernels::call_pool2d( &self.device.device, &command_buffers, &self.device.kernels, name, inp_l.dims(), strides, out_w, out_h, w_k, h_k, w_stride, h_stride, &self.buffer, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, self.device.clone(), dst_el, self.dtype)) } fn upsample_nearest1d(&self, _: &Layout, _: usize) -> Result<Self> { crate::bail!("Metal upsample_nearest1d not implemented") } fn upsample_nearest2d(&self, inp_l: &Layout, out_w: usize, out_h: usize) -> Result<Self> { // let inp = &inp.slice(inp_l.start_offset()..); let shape = inp_l.shape(); let dims = shape.dims(); let strides = inp_l.stride(); if dims.len() != 4 { crate::bail!("unexpected input shape for upsample {dims:?}") } let name = match self.dtype { DType::F32 => "upsample_nearest2d_f32", DType::F16 => "upsample_nearest2d_f16", DType::BF16 => "upsample_nearest2d_bf16", DType::U8 => "upsample_nearest2d_u8", DType::U32 => "upsample_nearest2d_u32", dtype => crate::bail!("Metal upsample_nearest2d {dtype:?} not implemented"), }; let dst_el = out_w * out_h * dims[0] * dims[1]; let buffer = self .device .new_buffer(dst_el, self.dtype, "upsample_nearest2d")?; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, inp_l, self.dtype); candle_metal_kernels::call_upsample_nearest_2d( &self.device.device, &command_buffer, &self.device.kernels, name, dims, strides, out_w, out_h, src, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, self.device.clone(), dst_el, self.dtype)) } fn gather(&self, src_l: &Layout, ids: &Self, ids_l: &Layout, dim: usize) -> Result<Self> { if !ids_l.is_contiguous() { return Err(crate::Error::RequiresContiguous { op: "gather" }.bt()); }; let ids_el = ids_l.dims()[dim]; let dst_el = ids_l.shape().elem_count(); let dtype = self.dtype; let device = self.device(); let buffer = device.new_buffer(dst_el, dtype, "gather")?; let name = match (ids.dtype, self.dtype) { (DType::U32, DType::F32) => "gather_u32_f32", (DType::U32, DType::F16) => "gather_u32_f16", (DType::U32, DType::BF16) => "gather_u32_bf16", (DType::U32, DType::U32) => "gather_u32_u32", (left, right) => crate::bail!("Metal gather {left:?} {right:?} not implemented"), }; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, src_l, dtype); let ids = buffer_o(&ids.buffer, ids_l, ids.dtype); candle_metal_kernels::call_gather( &device.device, &command_buffer, &self.device.kernels, name, src_l.dims(), ids_el, dim, src, ids, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, device.clone(), dst_el, dtype)) } fn scatter_add( &self, l: &Layout, ids: &Self, ids_l: &Layout, src: &Self, src_l: &Layout, dim: usize, ) -> Result<Self> { let mut acc = self.device.zeros_impl(l.shape(), self.dtype())?; self.copy_strided_src(&mut acc, 0, l)?; if !ids_l.is_contiguous() || !src_l.is_contiguous() { return Err(crate::Error::RequiresContiguous { op: "scatter-add" }.bt()); }; let name = match (ids.dtype, self.dtype) { (DType::U8, DType::F32) => "sa_u8_f32", (DType::U8, DType::F16) => "sa_u8_f16", (DType::U8, DType::BF16) => "sa_u8_bf16", (DType::U32, DType::F32) => "sa_u32_f32", (DType::U32, DType::F16) => "sa_u32_f16", (DType::U32, DType::BF16) => "sa_u32_bf16", (DType::I64, DType::F32) => "sa_i64_f32", (DType::I64, DType::F16) => "sa_i64_f16", (DType::I64, DType::BF16) => "sa_i64_bf16", _ => Err(MetalError::UnexpectedDType { msg: "scatter-add ids should be u8/u32/i64", expected: DType::U32, got: ids.dtype(), })?, }; let command_buffer = 
self.device.command_buffer()?; let src = buffer_o(&src.buffer, src_l, src.dtype); let ids = buffer_o(&ids.buffer, ids_l, ids.dtype); candle_metal_kernels::call_scatter_add( &self.device.device, &command_buffer, &self.device.kernels, name, src_l.dims(), l.dims(), dim, src, ids, &acc.buffer, ) .map_err(MetalError::from)?; Ok(acc) } fn index_select(&self, ids: &Self, src_l: &Layout, ids_l: &Layout, dim: usize) -> Result<Self> { if !ids_l.is_contiguous() { crate::bail!("Metal index_select requires contiguous ids") } let left_size: usize = src_l.dims()[..dim].iter().product(); let right_size: usize = src_l.dims()[dim + 1..].iter().product(); let ids_el = ids_l.shape().elem_count(); let dst_el = ids_el * left_size * right_size; let dtype = self.dtype; let device = self.device(); let buffer = device.new_buffer(dst_el, dtype, "index_select")?; let name = match (ids.dtype, self.dtype) { (DType::U8, DType::U8) => "is_u8_u8", (DType::U8, DType::U32) => "is_u8_u32", (DType::U8, DType::I64) => "is_u8_i64", (DType::U8, DType::BF16) => "is_u8_bf16", (DType::U8, DType::F32) => "is_u8_f32", (DType::U8, DType::F16) => "is_u8_f16", (DType::U32, DType::U8) => "is_u32_u8", (DType::U32, DType::U32) => "is_u32_u32", (DType::U32, DType::I64) => "is_u32_i64", (DType::U32, DType::F32) => "is_u32_f32", (DType::U32, DType::F16) => "is_u32_f16", (DType::U32, DType::BF16) => "is_u32_bf16", (DType::I64, DType::U8) => "is_i64_u8", (DType::I64, DType::U32) => "is_i64_u32", (DType::I64, DType::I64) => "is_i64_i64", (DType::I64, DType::F32) => "is_i64_f32", (DType::I64, DType::F16) => "is_i64_f16", (DType::I64, DType::BF16) => "is_i64_bf16", (left, right) => { crate::bail!("Metal contiguous index_select {left:?} {right:?} not implemented") } }; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&self.buffer, src_l, dtype); let ids = buffer_o(&ids.buffer, ids_l, ids.dtype); candle_metal_kernels::call_index_select( &device.device, &command_buffer, &self.device.kernels, name, src_l.dims(), ids_el, dim, src_l.is_contiguous(), src_l.dims(), src_l.stride(), src, ids, &buffer, ) .map_err(MetalError::from)?; Ok(Self::new(buffer, device.clone(), dst_el, dtype)) } fn index_add( &self, l: &Layout, ids: &Self, ids_l: &Layout, src: &Self, src_l: &Layout, dim: usize, ) -> Result<Self> { let mut acc = self.device.zeros_impl(l.shape(), self.dtype())?; self.copy_strided_src(&mut acc, 0, l)?; if !ids_l.is_contiguous() || !src_l.is_contiguous() { return Err(crate::Error::RequiresContiguous { op: "index-add" }.bt()); }; let name = match (ids.dtype, self.dtype) { (DType::I64, DType::BF16) => "ia_i64_bf16", (DType::I64, DType::F16) => "ia_i64_f16", (DType::I64, DType::F32) => "ia_i64_f32", (DType::I64, DType::I64) => "ia_i64_i64", (DType::I64, DType::U32) => "ia_i64_u32", (DType::I64, DType::U8) => "ia_i64_u8", (DType::U32, DType::BF16) => "ia_u32_bf16", (DType::U32, DType::F16) => "ia_u32_f16", (DType::U32, DType::F32) => "ia_u32_f32", (DType::U32, DType::I64) => "ia_u32_i64", (DType::U32, DType::U32) => "ia_u32_u32", (DType::U32, DType::U8) => "ia_u32_u8", (DType::U8, DType::BF16) => "ia_u8_bf16", (DType::U8, DType::F16) => "ia_u8_f16", (DType::U8, DType::F32) => "ia_u8_f32", (DType::U8, DType::I64) => "ia_u8_i64", (DType::U8, DType::U32) => "ia_u8_u32", (DType::U8, DType::U8) => "ia_u8_u8", _ => Err(MetalError::UnexpectedDType { msg: "index-add ids should be u8/u32/i64", expected: DType::U32, got: ids.dtype(), })?, }; let command_buffer = self.device.command_buffer()?; let src = buffer_o(&src.buffer, src_l, src.dtype); let 
ids = buffer_o(&ids.buffer, ids_l, ids.dtype); candle_metal_kernels::call_index_add( &self.device.device, &command_buffer, &self.device.kernels, name, src_l.dims(), l.dims(), ids_l.dims(), dim, src, ids, &acc.buffer, ) .map_err(MetalError::from)?; Ok(acc) } fn matmul( &self, rhs: &Self, (b, m, n, k): (usize, usize, usize, usize), lhs_l: &Layout, rhs_l: &Layout, ) -> Result<Self> { let buffer = self.device.new_buffer(b * m * n, self.dtype, "matmul")?; let command_buffer = self.device.command_buffer()?; command_buffer.set_label("matmul"); if self.dtype == DType::BF16 { candle_metal_kernels::call_mlx_gemm( &self.device.device, &command_buffer, &self.device.kernels, candle_metal_kernels::GemmDType::BF16, (b, m, n, k), lhs_l.stride(), lhs_l.start_offset() * self.dtype.size_in_bytes(), &self.buffer, rhs_l.stride(), rhs_l.start_offset() * rhs.dtype.size_in_bytes(), &rhs.buffer, &buffer, ) .map_err(MetalError::from)?; } else if self.device.use_mlx_mm { let dtype = match self.dtype { DType::F32 => candle_metal_kernels::GemmDType::F32, DType::F16 => candle_metal_kernels::GemmDType::F16, DType::BF16 => candle_metal_kernels::GemmDType::BF16, dtype => { return Err(MetalError::Message(format!( "mlx matmul doesn't support {dtype:?}" )) .into()) } }; candle_metal_kernels::call_mlx_gemm( &self.device.device, &command_buffer, &self.device.kernels, dtype, (b, m, n, k), lhs_l.stride(), lhs_l.start_offset() * self.dtype.size_in_bytes(), &self.buffer, rhs_l.stride(), rhs_l.start_offset() * rhs.dtype.size_in_bytes(), &rhs.buffer, &buffer, ) .map_err(MetalError::from)?; } else { let name = match self.dtype { DType::F32 => "sgemm", DType::F16 => "hgemm", dtype => { return Err( MetalError::Message(format!("matmul doesn't support {dtype:?}")).into(), ) } }; candle_metal_kernels::call_gemm( &self.device.device, &command_buffer, &self.device.kernels, name, (b, m, n, k), lhs_l.stride(), lhs_l.start_offset() * self.dtype.size_in_bytes(), &self.buffer, rhs_l.stride(), rhs_l.start_offset() * rhs.dtype.size_in_bytes(), &rhs.buffer, &buffer, ) .map_err(MetalError::from)?; } Ok(Self::new( buffer, self.device.clone(), b * m * n, self.dtype(), )) } fn copy2d( &self, dst: &mut Self, d1: usize, d2: usize, src_s: usize, dst_s: usize, src_o: usize, dst_o: usize, ) -> Result<()> { if self.dtype() != dst.dtype() { crate::bail!( "copy2d with inconsistent dtypes {:?} {:?}", self.dtype(), dst.dtype() ) } let command_buffer = self.device.command_buffer()?; if src_s == d2 && dst_s == d2 { command_buffer.set_label("copy2d_contiguous"); let blit = command_buffer.new_blit_command_encoder(); blit.set_label("copy2d_contiguous"); let src_offset = (src_o * self.dtype.size_in_bytes()) as NSUInteger; let length = (d1 * d2 * self.dtype.size_in_bytes()) as NSUInteger; let dst_offset = (dst_o * dst.dtype().size_in_bytes()) as NSUInteger; blit.copy_from_buffer(&self.buffer, src_offset, dst.buffer(), dst_offset, length); blit.end_encoding(); } else { let el_count = d1 * d2; if el_count == 0 { return Ok(()); } let kernel_name = match self.dtype { DType::F32 => candle_metal_kernels::copy2d::FLOAT, DType::F16 => candle_metal_kernels::copy2d::HALF, DType::BF16 => candle_metal_kernels::copy2d::BFLOAT, DType::I64 => candle_metal_kernels::copy2d::I64, DType::U32 => candle_metal_kernels::copy2d::U32, DType::U8 => candle_metal_kernels::copy2d::U8, dtype => crate::bail!("Metal copy2d {dtype:?} not implemented"), }; candle_metal_kernels::call_copy2d( &self.device.device, &command_buffer, &self.device.kernels, kernel_name, &self.buffer, &dst.buffer, d1, d2, 
src_s, dst_s, src_o * self.dtype.size_in_bytes(), dst_o * self.dtype.size_in_bytes(), ) .map_err(MetalError::from)?; command_buffer.set_label("copy2d"); } Ok(()) } fn copy_strided_src(&self, dst: &mut Self, dst_offset: usize, src_l: &Layout) -> Result<()> { let command_buffer = self.device.command_buffer()?; if src_l.is_contiguous() && self.dtype == dst.dtype() { command_buffer.set_label("copy_contiguous"); let blit = command_buffer.new_blit_command_encoder(); blit.set_label("copy_contiguous"); let src_offset = (src_l.start_offset() * self.dtype.size_in_bytes()) as NSUInteger; let length = (src_l.shape().elem_count() * self.dtype.size_in_bytes()) as NSUInteger; let dst_offset = (dst_offset * dst.dtype().size_in_bytes()) as NSUInteger; blit.copy_from_buffer(&self.buffer, src_offset, dst.buffer(), dst_offset, length); blit.end_encoding(); } else { let src_shape = src_l.shape(); let el_count = src_shape.elem_count(); if el_count == 0 { return Ok(()); } let kernel_name = match self.dtype { DType::F32 => candle_metal_kernels::unary::strided::copy::FLOAT, DType::F16 => candle_metal_kernels::unary::strided::copy::HALF, DType::BF16 => candle_metal_kernels::unary::strided::copy::BFLOAT, DType::I64 => candle_metal_kernels::unary::strided::copy::I64, DType::U32 => candle_metal_kernels::unary::strided::copy::U32, DType::U8 => candle_metal_kernels::unary::strided::copy::U8, dtype => crate::bail!("Metal copy_strided {dtype:?} not implemented"), }; let src = buffer_o(&self.buffer, src_l, self.dtype); let dst = BufferOffset { buffer: &dst.buffer, offset_in_bytes: dst_offset * dst.dtype.size_in_bytes(), }; candle_metal_kernels::call_unary_strided( &self.device.device, &command_buffer, &self.device.kernels, kernel_name, src_l.dims(), src, src_l.stride(), dst, ) .map_err(MetalError::from)?; command_buffer.set_label("copy_strided"); } Ok(()) } } impl MetalStorage { pub fn new(buffer: Arc<Buffer>, device: MetalDevice, count: usize, dtype: DType) -> Self { Self { buffer, device, count, dtype, } } pub fn buffer(&self) -> &Buffer { &self.buffer } pub fn binary( &self, op: &'static str, rhs: &Self, lhs_l: &Layout, rhs_l: &Layout, ) -> Result<Self> { let device = self.device(); let shape = lhs_l.shape(); let el_count = shape.elem_count(); let command_buffer = device.command_buffer()?; let lhs = buffer_o(&self.buffer, lhs_l, self.dtype); let rhs = buffer_o(&rhs.buffer, rhs_l, rhs.dtype); let (buffer, dtype) = if lhs_l.is_contiguous() && rhs_l.is_contiguous() && &op[..1] != "b" { use candle_metal_kernels::binary::contiguous; let (kernel_name, dtype) = match (op, self.dtype) { ("add", DType::F32) => (contiguous::add::FLOAT, self.dtype), ("sub", DType::F32) => (contiguous::sub::FLOAT, self.dtype), ("mul", DType::F32) => (contiguous::mul::FLOAT, self.dtype), ("div", DType::F32) => (contiguous::div::FLOAT, self.dtype), ("eq", DType::F32) => (contiguous::eq::FLOAT, DType::U8), ("ne", DType::F32) => (contiguous::ne::FLOAT, DType::U8), ("le", DType::F32) => (contiguous::le::FLOAT, DType::U8), ("lt", DType::F32) => (contiguous::lt::FLOAT, DType::U8), ("ge", DType::F32) => (contiguous::ge::FLOAT, DType::U8), ("gt", DType::F32) => (contiguous::gt::FLOAT, DType::U8), ("add", DType::F16) => (contiguous::add::HALF, self.dtype), ("sub", DType::F16) => (contiguous::sub::HALF, self.dtype), ("mul", DType::F16) => (contiguous::mul::HALF, self.dtype), ("div", DType::F16) => (contiguous::div::HALF, self.dtype), ("eq", DType::F16) => (contiguous::eq::HALF, DType::U8), ("ne", DType::F16) => (contiguous::ne::HALF, DType::U8), ("le", 
DType::F16) => (contiguous::le::HALF, DType::U8), ("lt", DType::F16) => (contiguous::lt::HALF, DType::U8), ("ge", DType::F16) => (contiguous::ge::HALF, DType::U8), ("gt", DType::F16) => (contiguous::gt::HALF, DType::U8), ("add", DType::BF16) => (contiguous::add::BFLOAT, self.dtype), ("sub", DType::BF16) => (contiguous::sub::BFLOAT, self.dtype), ("mul", DType::BF16) => (contiguous::mul::BFLOAT, self.dtype), ("div", DType::BF16) => (contiguous::div::BFLOAT, self.dtype), ("eq", DType::BF16) => (contiguous::eq::BFLOAT, DType::U8), ("ne", DType::BF16) => (contiguous::ne::BFLOAT, DType::U8), ("le", DType::BF16) => (contiguous::le::BFLOAT, DType::U8), ("lt", DType::BF16) => (contiguous::lt::BFLOAT, DType::U8), ("ge", DType::BF16) => (contiguous::ge::BFLOAT, DType::U8), ("gt", DType::BF16) => (contiguous::gt::BFLOAT, DType::U8), ("add", DType::I64) => (contiguous::add::I64, self.dtype), ("sub", DType::I64) => (contiguous::sub::I64, self.dtype), ("mul", DType::I64) => (contiguous::mul::I64, self.dtype), ("div", DType::I64) => (contiguous::div::I64, self.dtype), ("eq", DType::I64) => (contiguous::eq::I64, DType::U8), ("ne", DType::I64) => (contiguous::ne::I64, DType::U8), ("le", DType::I64) => (contiguous::le::I64, DType::U8), ("lt", DType::I64) => (contiguous::lt::I64, DType::U8), ("ge", DType::I64) => (contiguous::ge::I64, DType::U8), ("gt", DType::I64) => (contiguous::gt::I64, DType::U8), ("add", DType::U32) => (contiguous::add::U32, self.dtype), ("sub", DType::U32) => (contiguous::sub::U32, self.dtype), ("mul", DType::U32) => (contiguous::mul::U32, self.dtype), ("div", DType::U32) => (contiguous::div::U32, self.dtype), ("eq", DType::U32) => (contiguous::eq::U32, DType::U8), ("ne", DType::U32) => (contiguous::ne::U32, DType::U8), ("le", DType::U32) => (contiguous::le::U32, DType::U8), ("lt", DType::U32) => (contiguous::lt::U32, DType::U8), ("ge", DType::U32) => (contiguous::ge::U32, DType::U8), ("gt", DType::U32) => (contiguous::gt::U32, DType::U8), ("add", DType::U8) => (contiguous::add::U8, self.dtype), ("sub", DType::U8) => (contiguous::sub::U8, self.dtype), ("mul", DType::U8) => (contiguous::mul::U8, self.dtype), ("div", DType::U8) => (contiguous::div::U8, self.dtype), ("eq", DType::U8) => (contiguous::eq::U8, DType::U8), ("ne", DType::U8) => (contiguous::ne::U8, DType::U8), ("le", DType::U8) => (contiguous::le::U8, DType::U8), ("lt", DType::U8) => (contiguous::lt::U8, DType::U8), ("ge", DType::U8) => (contiguous::ge::U8, DType::U8), ("gt", DType::U8) => (contiguous::gt::U8, DType::U8), (name, dtype) => { crate::bail!("Metal contiguous binary {name} {dtype:?} not implemented") } }; let buffer = device.new_buffer(el_count, dtype, op)?; candle_metal_kernels::call_binary_contiguous( &device.device, &command_buffer, &device.kernels, kernel_name, el_count, lhs, rhs, &buffer, ) .map_err(MetalError::from)?; (buffer, dtype) } else { use candle_metal_kernels::binary::strided; let (kernel_name, dtype) = match (op, self.dtype) { ("badd", DType::F32) => (strided::add::FLOAT, self.dtype), ("bsub", DType::F32) => (strided::sub::FLOAT, self.dtype), ("bmul", DType::F32) => (strided::mul::FLOAT, self.dtype), ("bdiv", DType::F32) => (strided::div::FLOAT, self.dtype), ("bminimum", DType::F32) => (strided::min::FLOAT, self.dtype), ("bmaximum", DType::F32) => (strided::max::FLOAT, self.dtype), ("eq", DType::F32) => (strided::eq::FLOAT, DType::U8), ("ne", DType::F32) => (strided::ne::FLOAT, DType::U8), ("le", DType::F32) => (strided::le::FLOAT, DType::U8), ("lt", DType::F32) => (strided::lt::FLOAT, DType::U8), 
("ge", DType::F32) => (strided::ge::FLOAT, DType::U8), ("gt", DType::F32) => (strided::gt::FLOAT, DType::U8), ("badd", DType::F16) => (strided::add::HALF, self.dtype), ("bsub", DType::F16) => (strided::sub::HALF, self.dtype), ("bmul", DType::F16) => (strided::mul::HALF, self.dtype), ("bdiv", DType::F16) => (strided::div::HALF, self.dtype), ("bminimum", DType::F16) => (strided::min::HALF, self.dtype), ("bmaximum", DType::F16) => (strided::max::HALF, self.dtype), ("eq", DType::F16) => (strided::eq::HALF, DType::U8), ("ne", DType::F16) => (strided::ne::HALF, DType::U8), ("le", DType::F16) => (strided::le::HALF, DType::U8), ("lt", DType::F16) => (strided::lt::HALF, DType::U8), ("ge", DType::F16) => (strided::ge::HALF, DType::U8), ("gt", DType::F16) => (strided::gt::HALF, DType::U8), ("badd", DType::BF16) => (strided::add::BFLOAT, self.dtype), ("bsub", DType::BF16) => (strided::sub::BFLOAT, self.dtype), ("bmul", DType::BF16) => (strided::mul::BFLOAT, self.dtype), ("bdiv", DType::BF16) => (strided::div::BFLOAT, self.dtype), ("bminimum", DType::BF16) => (strided::min::BFLOAT, self.dtype), ("bmaximum", DType::BF16) => (strided::max::BFLOAT, self.dtype), ("eq", DType::BF16) => (strided::eq::BFLOAT, DType::U8), ("ne", DType::BF16) => (strided::ne::BFLOAT, DType::U8), ("le", DType::BF16) => (strided::le::BFLOAT, DType::U8), ("lt", DType::BF16) => (strided::lt::BFLOAT, DType::U8), ("ge", DType::BF16) => (strided::ge::BFLOAT, DType::U8), ("gt", DType::BF16) => (strided::gt::BFLOAT, DType::U8), ("badd", DType::I64) => (strided::add::I64, self.dtype), ("bsub", DType::I64) => (strided::sub::I64, self.dtype), ("bmul", DType::I64) => (strided::mul::I64, self.dtype), ("bdiv", DType::I64) => (strided::div::I64, self.dtype), ("bminimum", DType::I64) => (strided::min::I64, self.dtype), ("bmaximum", DType::I64) => (strided::max::I64, self.dtype), ("eq", DType::I64) => (strided::eq::I64, DType::U8), ("ne", DType::I64) => (strided::ne::I64, DType::U8), ("le", DType::I64) => (strided::le::I64, DType::U8), ("lt", DType::I64) => (strided::lt::I64, DType::U8), ("ge", DType::I64) => (strided::ge::I64, DType::U8), ("gt", DType::I64) => (strided::gt::I64, DType::U8), ("badd", DType::U32) => (strided::add::U32, self.dtype), ("bsub", DType::U32) => (strided::sub::U32, self.dtype), ("bmul", DType::U32) => (strided::mul::U32, self.dtype), ("bdiv", DType::U32) => (strided::div::U32, self.dtype), ("bminimum", DType::U32) => (strided::min::U32, self.dtype), ("bmaximum", DType::U32) => (strided::max::U32, self.dtype), ("eq", DType::U32) => (strided::eq::U32, DType::U8), ("ne", DType::U32) => (strided::ne::U32, DType::U8), ("le", DType::U32) => (strided::le::U32, DType::U8), ("lt", DType::U32) => (strided::lt::U32, DType::U8), ("ge", DType::U32) => (strided::ge::U32, DType::U8), ("gt", DType::U32) => (strided::gt::U32, DType::U8), ("badd", DType::U8) => (strided::add::U8, self.dtype), ("bsub", DType::U8) => (strided::sub::U8, self.dtype), ("bmul", DType::U8) => (strided::mul::U8, self.dtype), ("bdiv", DType::U8) => (strided::div::U8, self.dtype), ("bminimum", DType::U8) => (strided::min::U8, self.dtype), ("bmaximum", DType::U8) => (strided::max::U8, self.dtype), ("eq", DType::U8) => (strided::eq::U8, DType::U8), ("ne", DType::U8) => (strided::ne::U8, DType::U8), ("le", DType::U8) => (strided::le::U8, DType::U8), ("lt", DType::U8) => (strided::lt::U8, DType::U8), ("ge", DType::U8) => (strided::ge::U8, DType::U8), ("gt", DType::U8) => (strided::gt::U8, DType::U8), (name, dtype) => { crate::bail!("Metal strided binary {name} 
{dtype:?} not implemented") } }; let buffer = device.new_buffer(el_count, dtype, op)?; candle_metal_kernels::call_binary_strided( &device.device, &command_buffer, &device.kernels, kernel_name, lhs_l.dims(), lhs, lhs_l.stride(), rhs, rhs_l.stride(), &buffer, ) .map_err(MetalError::from)?; (buffer, dtype) }; command_buffer.set_label("binary"); Ok(Self::new(buffer, device.clone(), el_count, dtype)) } pub(crate) fn to_cpu<T: Clone>(&self) -> Result<Vec<T>> { let size = (self.count * self.dtype.size_in_bytes()) as NSUInteger; let buffer = self.device.new_buffer_managed(size)?; { let command_buffer = self.device.command_buffer()?; command_buffer.set_label("to_cpu"); let blit = command_buffer.new_blit_command_encoder(); blit.set_label("blit_to_cpu"); blit.copy_from_buffer(&self.buffer, 0, &buffer, 0, size); blit.end_encoding(); } self.device.wait_until_completed()?; Ok(read_to_vec(&buffer, self.count)) } } impl BackendDevice for MetalDevice { type Storage = MetalStorage; fn new(ordinal: usize) -> Result<Self> { let device = metal::Device::all().swap_remove(ordinal); let command_queue = device.new_command_queue(); let kernels = Arc::new(Kernels::new()); let use_mlx_mm = match std::env::var("CANDLE_USE_MFA_MM").as_deref() { Ok("false") | Ok("False") | Ok("FALSE") | Ok("0") | Err(_) => true, Ok(_) => false, }; let seed = Arc::new(Mutex::new(device.new_buffer_with_data( [299792458].as_ptr() as *const c_void, 4, MTLResourceOptions::StorageModeManaged, ))); let commands = device::Commands::new(command_queue)?; Ok(Self { id: DeviceId::new(), device, commands: Arc::new(RwLock::new(commands)), buffers: Arc::new(RwLock::new(HashMap::new())), kernels, seed, use_mlx_mm, }) } fn location(&self) -> crate::DeviceLocation { crate::DeviceLocation::Metal { gpu_id: self.registry_id() as usize, } } fn same_device(&self, rhs: &Self) -> bool { self.id == rhs.id } unsafe fn alloc_uninit(&self, shape: &Shape, dtype: DType) -> Result<MetalStorage> { let buffer = self.new_buffer(shape.elem_count(), dtype, "alloc-uninit")?; Ok(MetalStorage::new( buffer, self.clone(), shape.elem_count(), dtype, )) } fn zeros_impl(&self, shape: &Shape, dtype: DType) -> Result<MetalStorage> { let size = shape.elem_count() * dtype.size_in_bytes(); let buffer = self.allocate_zeros(size)?; Ok(MetalStorage::new( buffer, self.clone(), shape.elem_count(), dtype, )) } fn ones_impl(&self, shape: &Shape, dtype: DType) -> Result<MetalStorage> { let name = match dtype { DType::U8 => "fill_u8", DType::U32 => "fill_u32", DType::I64 => "fill_i64", DType::F16 => "fill_f16", DType::BF16 => "fill_bf16", DType::F32 => "fill_f32", DType::F64 => { let cpu_storage = crate::cpu_backend::CpuDevice.ones_impl(shape, dtype)?; return self.storage_from_cpu_storage(&cpu_storage); } }; let buffer = self.new_buffer(shape.elem_count(), dtype, "alloc-ones")?; let command_buffer = self.command_buffer()?; candle_metal_kernels::call_const_fill( &self.device, &command_buffer, &self.kernels, name, shape.elem_count(), &buffer, 1., ) .map_err(MetalError::from)?; Ok(MetalStorage::new( buffer, self.clone(), shape.elem_count(), dtype, )) } fn storage_from_slice<T: crate::WithDType>(&self, s: &[T]) -> Result<Self::Storage> { let (count, buffer) = match T::cpu_storage_ref(s) { CpuStorageRef::U8(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorageRef::U32(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorageRef::I64(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorageRef::BF16(storage) => (storage.len(), 
self.new_buffer_with_data(storage)), CpuStorageRef::F16(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorageRef::F32(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorageRef::F64(storage) => (storage.len(), self.new_buffer_with_data(storage)), }; Ok(Self::Storage::new(buffer?, self.clone(), count, T::DTYPE)) } fn storage_from_cpu_storage(&self, storage: &CpuStorage) -> Result<Self::Storage> { let (count, buffer) = match storage { CpuStorage::U8(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorage::U32(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorage::I64(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorage::BF16(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorage::F16(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorage::F32(storage) => (storage.len(), self.new_buffer_with_data(storage)), CpuStorage::F64(storage) => (storage.len(), self.new_buffer_with_data(storage)), }; Ok(Self::Storage::new( buffer?, self.clone(), count, storage.dtype(), )) } fn storage_from_cpu_storage_owned(&self, storage: CpuStorage) -> Result<Self::Storage> { self.storage_from_cpu_storage(&storage) } fn rand_uniform( &self, shape: &Shape, dtype: DType, min: f64, max: f64, ) -> Result<Self::Storage> { let name = match dtype { DType::F32 => "rand_uniform_f32", DType::F16 => "rand_uniform_f16", DType::BF16 => "rand_uniform_bf16", dtype => crate::bail!("rand_uniform not implemented for {dtype:?}"), }; let buffer = self.new_buffer(shape.elem_count(), dtype, "rand_uniform")?; let command_buffer = self.command_buffer()?; candle_metal_kernels::call_random_uniform( &self.device, &command_buffer, &self.kernels, name, min as f32, max as f32, shape.elem_count(), &self.seed.lock().unwrap(), &buffer, ) .map_err(MetalError::from)?; Ok(Self::Storage::new( buffer, self.clone(), shape.elem_count(), dtype, )) } fn rand_normal( &self, shape: &Shape, dtype: DType, mean: f64, stddev: f64, ) -> Result<Self::Storage> { let name = match dtype { DType::F32 => "rand_normal_f32", DType::F16 => "rand_normal_f16", DType::BF16 => "rand_normal_bf16", dtype => crate::bail!("rand_uniform not implemented for {dtype:?}"), }; let buffer = self.new_buffer(shape.elem_count(), dtype, "rand_normal")?; let command_buffer = self.command_buffer()?; candle_metal_kernels::call_random_normal( &self.device, &command_buffer, &self.kernels, name, mean as f32, stddev as f32, shape.elem_count(), &self.seed.lock().unwrap(), &buffer, ) .map_err(MetalError::from)?; Ok(Self::Storage::new( buffer, self.clone(), shape.elem_count(), dtype, )) } fn set_seed(&self, seed: u64) -> Result<()> { let seed: u32 = seed.try_into().map_err(|_| { MetalError::Message("Metal seed must be less than or equal to u32::MAX".to_string()) })?; let seed_buffer = self.seed.try_lock().map_err(MetalError::from)?; let contents = seed_buffer.contents(); unsafe { std::ptr::copy([seed].as_ptr(), contents as *mut u32, 1); } seed_buffer.did_modify_range(metal::NSRange::new(0, 4)); Ok(()) } fn synchronize(&self) -> Result<()> { self.wait_until_completed() } } fn read_to_vec<T: Clone>(buffer: &Buffer, n: usize) -> Vec<T> { let ptr = buffer.contents() as *const T; assert!(!ptr.is_null()); let slice = unsafe { std::slice::from_raw_parts(ptr, n) }; slice.to_vec() }
9
0
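The Metal `conv2d` above lowers the convolution to an im2col pass followed by a batched matmul and a pair of transposes back to NCHW. For readers less familiar with that trick, here is a minimal NumPy sketch of the same strategy; the helper name, the stride-1/no-padding simplification, and the shapes are illustrative assumptions, not candle API.

```python
# Illustrative im2col + matmul convolution, mirroring the strategy of the Metal conv2d above.
# Assumes stride=1, no padding, no dilation for brevity.
import numpy as np


def conv2d_im2col(x, w):
    # x: (b, c_in, h, w_in), w: (c_out, c_in, h_k, w_k)
    b, c_in, h, w_in = x.shape
    c_out, _, h_k, w_k = w.shape
    h_out, w_out = h - h_k + 1, w_in - w_k + 1
    # im2col: each output position becomes one row of unfolded input patches.
    cols = np.empty((b, h_out * w_out, c_in * h_k * w_k), dtype=x.dtype)
    for i in range(h_out):
        for j in range(w_out):
            cols[:, i * w_out + j, :] = x[:, :, i : i + h_k, j : j + w_k].reshape(b, -1)
    # One batched matmul: (b, m, k) @ (k, n) with k = c_in * h_k * w_k and n = c_out.
    out = cols @ w.reshape(c_out, -1).T  # (b, h_out * w_out, c_out)
    # Transpose back to NCHW, much like the transposes applied to `res_l` above.
    return out.transpose(0, 2, 1).reshape(b, c_out, h_out, w_out)


y = conv2d_im2col(np.random.randn(2, 3, 8, 8), np.random.randn(4, 3, 3, 3))
print(y.shape)  # (2, 4, 6, 6)
```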
hf_public_repos/api-inference-community/docker_images
hf_public_repos/api-inference-community/docker_images/bertopic/requirements.txt
starlette==0.27.0
api-inference-community==0.0.25
huggingface_hub==0.14.0
bertopic==0.15.0
safetensors==0.3.1
0
0
hf_public_repos/api-inference-community/docker_images
hf_public_repos/api-inference-community/docker_images/bertopic/Dockerfile
FROM tiangolo/uvicorn-gunicorn:python3.8
LABEL maintainer="Daniel van Strien <[email protected]> "
# Add any system dependency here
# RUN apt-get update -y && apt-get install libXXX -y
COPY ./requirements.txt /app
RUN pip install --no-cache-dir -r requirements.txt
COPY ./prestart.sh /app/

# Most DL models are quite large in terms of memory, using workers is a HUGE
# slowdown because of the fork and GIL with python.
# Using multiple pods seems like a better default strategy.
# Feel free to override if it does not make sense for your library.
ARG max_workers=1
ENV MAX_WORKERS=$max_workers
ENV HUGGINGFACE_HUB_CACHE=/data

# Necessary on GPU environment docker.
# TIMEOUT env variable is used by nvcr.io/nvidia/pytorch:xx for another purpose,
# rendering TIMEOUT defined by uvicorn impossible to use correctly.
# We're overriding it to be renamed UVICORN_TIMEOUT.
# UVICORN_TIMEOUT is a useful variable for very large models that take more
# than 30s (the default) to load in memory.
# If UVICORN_TIMEOUT is too low, uvicorn will simply never load as it will
# kill workers all the time before they finish.
RUN sed -i 's/TIMEOUT/UVICORN_TIMEOUT/g' /gunicorn_conf.py
COPY ./app /app/app
1
0
hf_public_repos/api-inference-community/docker_images
hf_public_repos/api-inference-community/docker_images/bertopic/prestart.sh
python app/main.py
2
0
hf_public_repos/api-inference-community/docker_images/bertopic
hf_public_repos/api-inference-community/docker_images/bertopic/app/main.py
import functools import logging import os from typing import Dict, Type from api_inference_community.routes import pipeline_route, status_ok from app.pipelines import Pipeline, TextClassificationPipeline from starlette.applications import Starlette from starlette.middleware import Middleware from starlette.middleware.gzip import GZipMiddleware from starlette.routing import Route TASK = os.getenv("TASK") MODEL_ID = os.getenv("MODEL_ID") logger = logging.getLogger(__name__) ALLOWED_TASKS: Dict[str, Type[Pipeline]] = { "text-classification": TextClassificationPipeline } @functools.lru_cache() def get_pipeline() -> Pipeline: task = os.environ["TASK"] model_id = os.environ["MODEL_ID"] if task not in ALLOWED_TASKS: raise EnvironmentError(f"{task} is not a valid pipeline for model : {model_id}") return ALLOWED_TASKS[task](model_id) routes = [ Route("/{whatever:path}", status_ok), Route("/{whatever:path}", pipeline_route, methods=["POST"]), ] middleware = [Middleware(GZipMiddleware, minimum_size=1000)] if os.environ.get("DEBUG", "") == "1": from starlette.middleware.cors import CORSMiddleware middleware.append( Middleware( CORSMiddleware, allow_origins=["*"], allow_headers=["*"], allow_methods=["*"], ) ) app = Starlette(routes=routes, middleware=middleware) @app.on_event("startup") async def startup_event(): logger = logging.getLogger("uvicorn.access") handler = logging.StreamHandler() handler.setFormatter(logging.Formatter("%(asctime)s - %(levelname)s - %(message)s")) logger.handlers = [handler] # Link between `api-inference-community` and framework code. app.get_pipeline = get_pipeline try: get_pipeline() except Exception: # We can fail so we can show exception later. pass if __name__ == "__main__": try: get_pipeline() except Exception: # We can fail so we can show exception later. pass
3
0
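Once the image above is built and started with `TASK` and `MODEL_ID` set, the Starlette app in `app/main.py` answers plain POST requests. The snippet below is a hypothetical smoke test against such a container; the host, the 8000 port mapping, and the `requests` dependency are assumptions for illustration and not part of the image itself.

```python
# Hypothetical smoke test against a locally running bertopic container
# (assumes it was started with TASK=text-classification, a valid MODEL_ID,
# and a port mapping that exposes it on localhost:8000).
import requests

resp = requests.post(
    "http://localhost:8000/",
    json={"inputs": "It is a beautiful day outside"},
)
resp.raise_for_status()
print(resp.json())  # e.g. [[{"label": "...", "score": 0.87}]]
```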
hf_public_repos/api-inference-community/docker_images/bertopic/app
hf_public_repos/api-inference-community/docker_images/bertopic/app/pipelines/base.py
from abc import ABC, abstractmethod
from typing import Any


class Pipeline(ABC):
    @abstractmethod
    def __init__(self, model_id: str):
        raise NotImplementedError("Pipelines should implement an __init__ method")

    @abstractmethod
    def __call__(self, inputs: Any) -> Any:
        raise NotImplementedError("Pipelines should implement a __call__ method")


class PipelineException(Exception):
    pass
4
0
hf_public_repos/api-inference-community/docker_images/bertopic/app
hf_public_repos/api-inference-community/docker_images/bertopic/app/pipelines/__init__.py
from app.pipelines.base import Pipeline, PipelineException  # isort:skip
from app.pipelines.text_classification import TextClassificationPipeline
5
0
hf_public_repos/api-inference-community/docker_images/bertopic/app
hf_public_repos/api-inference-community/docker_images/bertopic/app/pipelines/text_classification.py
from typing import Dict, List

from app.pipelines import Pipeline
from bertopic import BERTopic


class TextClassificationPipeline(Pipeline):
    def __init__(
        self,
        model_id: str,
    ):
        self.model = BERTopic.load(model_id)

    def __call__(self, inputs: str) -> List[List[Dict[str, float]]]:
        """
        Args:
            inputs (:obj:`str`):
                a string containing some text
        Return:
            A :obj:`list`:. The object returned should be a list of one list like
            [[{"label": "positive", "score": 0.5}]] containing:
                - "label": A string representing what the label/class is. There can be multiple labels.
                - "score": A score between 0 and 1 describing how confident the model is for this label/class.
        """
        topics, probabilities = self.model.transform(inputs)
        results = []
        for topic, prob in zip(topics, probabilities):
            if self.model.custom_labels_ is not None:
                topic_label = self.model.custom_labels_[topic + self.model._outliers]
            else:
                topic_label = self.model.topic_labels_[topic]
            results.append({"label": topic_label, "score": float(prob)})
        return [results]
6
0
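To see the return shape described in the docstring without going through the HTTP layer, the pipeline can also be instantiated directly. The model id below is borrowed from `tests/test_api.py` further down; loading it downloads the BERTopic model, so treat this as an illustrative sketch rather than part of the app.

```python
# Sketch: call the BERTopic pipeline directly (model id borrowed from tests/test_api.py).
from app.pipelines import TextClassificationPipeline

pipe = TextClassificationPipeline("MaartenGr/BERTopic_ArXiv")
result = pipe("Transformers have changed natural language processing.")
# `result` is a list containing one inner list of {"label", "score"} dicts.
print(result[0][0]["label"], result[0][0]["score"])
```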
hf_public_repos/api-inference-community/docker_images/bertopic
hf_public_repos/api-inference-community/docker_images/bertopic/tests/test_docker_build.py
import os
import subprocess
from unittest import TestCase


class cd:
    """Context manager for changing the current working directory"""

    def __init__(self, newPath):
        self.newPath = os.path.expanduser(newPath)

    def __enter__(self):
        self.savedPath = os.getcwd()
        os.chdir(self.newPath)

    def __exit__(self, etype, value, traceback):
        os.chdir(self.savedPath)


class DockerBuildTestCase(TestCase):
    def test_can_build_docker_image(self):
        with cd(os.path.dirname(os.path.dirname(__file__))):
            subprocess.check_output(["docker", "build", "."])
7
0
hf_public_repos/api-inference-community/docker_images/bertopic
hf_public_repos/api-inference-community/docker_images/bertopic/tests/test_api.py
import os from typing import Dict, List from unittest import TestCase, skipIf from app.main import ALLOWED_TASKS, get_pipeline # Must contain at least one example of each implemented pipeline # Tests do not check the actual values of the model output, so small dummy # models are recommended for faster tests. TESTABLE_MODELS: Dict[str, List[str]] = { "text-classification": ["MaartenGr/BERTopic_ArXiv", "MaartenGr/BERTopic_Wikipedia"], } ALL_TASKS = { "audio-classification", "audio-to-audio", "automatic-speech-recognition", "feature-extraction", "image-classification", "question-answering", "sentence-similarity", "speech-segmentation", "tabular-classification", "tabular-regression", "text-to-image", "text-to-speech", "token-classification", "conversational", "feature-extraction", "sentence-similarity", "fill-mask", "table-question-answering", "summarization", "text2text-generation", "text-classification", "zero-shot-classification", } class PipelineTestCase(TestCase): @skipIf( os.path.dirname(os.path.dirname(__file__)).endswith("common"), "common is a special case", ) def test_has_at_least_one_task_enabled(self): self.assertGreater( len(ALLOWED_TASKS.keys()), 0, "You need to implement at least one task" ) def test_unsupported_tasks(self): unsupported_tasks = ALL_TASKS - ALLOWED_TASKS.keys() for unsupported_task in unsupported_tasks: with self.subTest(msg=unsupported_task, task=unsupported_task): os.environ["TASK"] = unsupported_task os.environ["MODEL_ID"] = "XX" with self.assertRaises(EnvironmentError): get_pipeline()
8
0
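As the test above shows, pipeline selection is driven entirely by the `TASK` and `MODEL_ID` environment variables. The same mechanism can be exercised by hand; since `get_pipeline` is wrapped in `functools.lru_cache`, the cache has to be cleared after changing the variables, exactly as the next test file does in `setUpClass`.

```python
# Sketch of manual pipeline selection via environment variables, mirroring the tests.
import os

from app.main import ALLOWED_TASKS, get_pipeline

os.environ["TASK"] = "text-classification"
os.environ["MODEL_ID"] = "MaartenGr/BERTopic_ArXiv"

assert "text-classification" in ALLOWED_TASKS
get_pipeline.cache_clear()  # drop any pipeline cached with the previous env values
pipe = get_pipeline()
```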
hf_public_repos/api-inference-community/docker_images/bertopic
hf_public_repos/api-inference-community/docker_images/bertopic/tests/test_api_text_classification.py
import json import os from unittest import TestCase, skipIf from app.main import ALLOWED_TASKS from parameterized import parameterized_class from starlette.testclient import TestClient from tests.test_api import TESTABLE_MODELS @skipIf( "text-classification" not in ALLOWED_TASKS, "text-classification not implemented", ) @parameterized_class( [{"model_id": model_id} for model_id in TESTABLE_MODELS["text-classification"]] ) class TextClassificationTestCase(TestCase): def setUp(self): self.old_model_id = os.getenv("MODEL_ID") self.old_task = os.getenv("TASK") os.environ["MODEL_ID"] = self.model_id os.environ["TASK"] = "text-classification" from app.main import app self.app = app @classmethod def setUpClass(cls): from app.main import get_pipeline get_pipeline.cache_clear() def tearDown(self): if self.old_model_id is not None: os.environ["MODEL_ID"] = self.old_model_id else: del os.environ["MODEL_ID"] if self.old_task is not None: os.environ["TASK"] = self.old_task else: del os.environ["TASK"] def test_simple(self): inputs = "It is a beautiful day outside" with TestClient(self.app) as client: response = client.post("/", json={"inputs": inputs}) self.assertEqual( response.status_code, 200, ) content = json.loads(response.content) self.assertEqual(type(content), list) self.assertEqual(len(content), 1) self.assertEqual(type(content[0]), list) self.assertEqual( set(k for el in content[0] for k in el.keys()), {"label", "score"}, ) with TestClient(self.app) as client: response = client.post("/", json=inputs) self.assertEqual( response.status_code, 200, ) content = json.loads(response.content) self.assertEqual(type(content), list) self.assertEqual(len(content), 1) self.assertEqual(type(content[0]), list) self.assertEqual( set(k for el in content[0] for k in el.keys()), {"label", "score"}, ) def test_malformed_question(self): with TestClient(self.app) as client: response = client.post("/", data=b"\xc3\x28") self.assertEqual( response.status_code, 400, ) self.assertEqual( response.content, b'{"error":"\'utf-8\' codec can\'t decode byte 0xc3 in position 0: invalid continuation byte"}', )
9
0
hf_public_repos/candle/candle-pyo3/tests
hf_public_repos/candle/candle-pyo3/tests/native/test_utils.py
import candle
from candle import Tensor, QTensor
from candle.utils import load_safetensors, save_gguf, load_gguf, save_safetensors
from pathlib import Path

TEST_DIR = Path(__file__).parent.parent / "_workdir"
TEST_DIR.mkdir(exist_ok=True)


def test_can_roundtrip_safetensors():
    tensors = {
        "a": candle.randn((16, 256)),
        "b": candle.randn((16, 16)),
    }

    file = str(TEST_DIR / "test.safetensors")
    save_safetensors(file, tensors)
    loaded_tensors = load_safetensors(file)
    assert set(tensors.keys()) == set(loaded_tensors.keys())
    for key in tensors.keys():
        assert tensors[key].values() == loaded_tensors[key].values(), "Values are not equal"
        assert tensors[key].shape == loaded_tensors[key].shape, "Shapes are not equal"
        assert str(tensors[key].dtype) == str(loaded_tensors[key].dtype), "Dtypes are not equal"


def test_can_roundtrip_gguf():
    metadata = {
        "a": 1,
        "b": "foo",
        "c": [1, 2, 3],
        "d": [[1, 2], [3, 4]],
    }

    tensors = {
        "a": candle.randn((16, 256)).quantize("q4_0"),
        "b": candle.randn((16, 16)).quantize("f32"),
    }

    file = str(TEST_DIR / "test.gguf")
    save_gguf(file, tensors, metadata)
    loaded_tensors, loaded_metadata = load_gguf(file)

    assert set(metadata.keys()) == set(loaded_metadata.keys())
    for key in metadata.keys():
        assert metadata[key] == loaded_metadata[key]

    assert set(tensors.keys()) == set(loaded_tensors.keys())
    for key in tensors.keys():
        assert tensors[key].dequantize().values() == loaded_tensors[key].dequantize().values(), "Values are not equal"
        assert tensors[key].shape == loaded_tensors[key].shape, "Shapes are not equal"
        assert str(tensors[key].ggml_dtype) == str(loaded_tensors[key].ggml_dtype), "Dtypes are not equal"
0
0
hf_public_repos/candle/candle-pyo3/tests
hf_public_repos/candle/candle-pyo3/tests/native/test_tensor.py
import candle from candle import Tensor from candle.utils import cuda_is_available from candle.testing import assert_equal import pytest def test_tensor_can_be_constructed(): t = Tensor(42.0) assert t.values() == 42.0 def test_tensor_can_be_constructed_from_list(): t = Tensor([3.0, 1, 4, 1, 5, 9, 2, 6]) assert t.values() == [3.0, 1, 4, 1, 5, 9, 2, 6] def test_tensor_can_be_constructed_from_list_of_lists(): t = Tensor([[3.0, 1, 4, 1], [5, 9, 2, 6]]) assert t.values() == [[3.0, 1, 4, 1], [5, 9, 2, 6]] def test_tensor_can_be_quantized(): t = candle.randn((16, 256)) for format in [ "q4_0", "q4_1", "q5_0", "q5_1", "q8_0", "q2k", "q3k", "q4k", "q5k", "q8k", ]: for formatted_format in [format.upper(), format.lower()]: quant_t = t.quantize(formatted_format) assert quant_t.ggml_dtype.lower() == format.lower() assert quant_t.shape == t.shape def test_tensor_can_be_indexed(): t = Tensor([[3.0, 1, 4, 1], [5, 9, 2, 6]]) assert t[0].values() == [3.0, 1.0, 4.0, 1.0] assert t[1].values() == [5.0, 9.0, 2.0, 6.0] assert t[-1].values() == [5.0, 9.0, 2.0, 6.0] assert t[-2].values() == [3.0, 1.0, 4.0, 1.0] def test_tensor_can_be_sliced(): t = Tensor([3.0, 1, 4, 10, 5, 9, 2, 6]) assert t[0:4].values() == [3.0, 1.0, 4.0, 10.0] assert t[4:8].values() == [5.0, 9.0, 2.0, 6.0] assert t[-4:].values() == [5.0, 9.0, 2.0, 6.0] assert t[:-4].values() == [3.0, 1.0, 4.0, 10.0] assert t[-4:-2].values() == [5.0, 9.0] assert t[...].values() == t.values() def test_tensor_can_be_sliced_2d(): t = Tensor([[3.0, 1, 4, 1], [5, 9, 2, 6]]) assert t[:, 0].values() == [3.0, 5] assert t[:, 1].values() == [1.0, 9.0] assert t[0, 0].values() == 3.0 assert t[:, -1].values() == [1.0, 6.0] assert t[:, -4].values() == [3.0, 5] assert t[..., 0].values() == [3.0, 5] def test_tensor_can_be_scliced_3d(): t = Tensor([[[1, 2, 3, 4], [5, 6, 7, 8]], [[9, 10, 11, 12], [13, 14, 15, 16]]]) assert t[:, :, 0].values() == [[1, 5], [9, 13]] assert t[:, :, 0:2].values() == [[[1, 2], [5, 6]], [[9, 10], [13, 14]]] assert t[:, 0, 0].values() == [1, 9] assert t[..., 0].values() == [[1, 5], [9, 13]] assert t[..., 0:2].values() == [[[1, 2], [5, 6]], [[9, 10], [13, 14]]] def assert_bool(t: Tensor, expected: bool): assert t.shape == () assert str(t.dtype) == str(candle.u8) assert bool(t.values()) == expected def test_tensor_supports_equality_operations_with_scalars(): t = Tensor(42.0) assert_bool(t == 42.0, True) assert_bool(t == 43.0, False) assert_bool(t != 42.0, False) assert_bool(t != 43.0, True) assert_bool(t > 41.0, True) assert_bool(t > 42.0, False) assert_bool(t >= 41.0, True) assert_bool(t >= 42.0, True) assert_bool(t < 43.0, True) assert_bool(t < 42.0, False) assert_bool(t <= 43.0, True) assert_bool(t <= 42.0, True) def test_tensor_supports_equality_operations_with_tensors(): t = Tensor(42.0) same = Tensor(42.0) other = Tensor(43.0) assert_bool(t == same, True) assert_bool(t == other, False) assert_bool(t != same, False) assert_bool(t != other, True) assert_bool(t > same, False) assert_bool(t > other, False) assert_bool(t >= same, True) assert_bool(t >= other, False) assert_bool(t < same, False) assert_bool(t < other, True) assert_bool(t <= same, True) assert_bool(t <= other, True) def test_tensor_equality_operations_can_broadcast(): # Create a decoder attention mask as a test case # e.g. 
# [[1,0,0] # [1,1,0] # [1,1,1]] mask_cond = candle.Tensor([0, 1, 2]) mask = mask_cond < (mask_cond + 1).reshape((3, 1)) assert mask.shape == (3, 3) assert_equal(mask, Tensor([[1, 0, 0], [1, 1, 0], [1, 1, 1]]).to_dtype(candle.u8)) def test_tensor_can_be_hashed(): t = Tensor(42.0) other = Tensor(42.0) # Hash should represent a unique tensor assert hash(t) != hash(other) assert hash(t) == hash(t) def test_tensor_can_be_expanded_with_none(): t = candle.rand((12, 12)) b = t[None] assert b.shape == (1, 12, 12) c = t[:, None, None, :] assert c.shape == (12, 1, 1, 12) d = t[None, :, None, :] assert d.shape == (1, 12, 1, 12) e = t[None, None, :, :] assert e.shape == (1, 1, 12, 12) f = t[:, :, None] assert f.shape == (12, 12, 1) def test_tensor_can_be_index_via_tensor(): t = candle.Tensor([[1, 2, 1, 2], [3, 4, 3, 4], [5, 6, 5, 6]]) indexed = t[candle.Tensor([0, 2])] assert indexed.shape == (2, 4) assert indexed.values() == [[1, 2, 1, 2], [5, 6, 5, 6]] indexed = t[:, candle.Tensor([0, 2])] assert indexed.shape == (3, 2) assert indexed.values() == [[1, 1], [3, 3], [5, 5]] def test_tensor_can_be_index_via_list(): t = candle.Tensor([[1, 2, 1, 2], [3, 4, 3, 4], [5, 6, 5, 6]]) indexed = t[[0, 2]] assert indexed.shape == (2, 4) assert indexed.values() == [[1, 2, 1, 2], [5, 6, 5, 6]] indexed = t[:, [0, 2]] assert indexed.shape == (3, 2) assert indexed.values() == [[1, 1], [3, 3], [5, 5]] def test_tensor_can_be_cast_via_to(): t = Tensor(42.0) assert str(t.dtype) == str(candle.f32) t_new_args = t.to(candle.f64) assert str(t_new_args.dtype) == str(candle.f64) t_new_kwargs = t.to(dtype=candle.f64) assert str(t_new_kwargs.dtype) == str(candle.f64) pytest.raises(TypeError, lambda: t.to("not a dtype")) pytest.raises(TypeError, lambda: t.to(dtype="not a dtype")) pytest.raises(TypeError, lambda: t.to(candle.f64, "not a dtype")) pytest.raises(TypeError, lambda: t.to()) pytest.raises(ValueError, lambda: t.to(candle.f16, dtype=candle.f64)) pytest.raises(ValueError, lambda: t.to(candle.f16, candle.f16)) other = Tensor(42.0).to(candle.f64) t_new_other_args = t.to(other) assert str(t_new_other_args.dtype) == str(candle.f64) t_new_other_kwargs = t.to(other=other) assert str(t_new_other_kwargs.dtype) == str(candle.f64) @pytest.mark.skipif(not cuda_is_available(), reason="CUDA is not available") def test_tensor_can_be_moved_via_to(): t = Tensor(42.0) assert t.device == "cpu" t_new_args = t.to("cuda") assert t_new_args.device == "cuda" t_new_kwargs = t.to(device="cuda") assert t_new_kwargs.device == "cuda" pytest.raises(TypeError, lambda: t.to("not a device")) pytest.raises(TypeError, lambda: t.to(device="not a device")) pytest.raises(TypeError, lambda: t.to("cuda", "not a device")) pytest.raises(TypeError, lambda: t.to()) pytest.raises(ValueError, lambda: t.to("cuda", device="cpu")) pytest.raises(ValueError, lambda: t.to("cuda", "cuda")) other = Tensor(42.0).to("cuda") t_new_other_args = t.to(other) assert t_new_other_args.device == "cuda" t_new_other_kwargs = t.to(other=other) assert t_new_other_kwargs.device == "cuda" @pytest.mark.skipif(not cuda_is_available(), reason="CUDA is not available") def test_tensor_can_be_moved_and_cast_via_to(): t = Tensor(42.0) assert t.device == "cpu" assert str(t.dtype) == str(candle.f32) t_new_args = t.to("cuda", candle.f64) assert t_new_args.device == "cuda" assert str(t_new_args.dtype) == str(candle.f64) t_new_kwargs = t.to(device="cuda", dtype=candle.f64) assert t_new_kwargs.device == "cuda" assert str(t_new_kwargs.dtype) == str(candle.f64) other = Tensor(42.0).to("cuda").to(candle.f64) 
t_new_other_args = t.to(other) assert t_new_other_args.device == "cuda" assert str(t_new_other_args.dtype) == str(candle.f64) t_new_other_kwargs = t.to(other=other) assert t_new_other_kwargs.device == "cuda" assert str(t_new_other_kwargs.dtype) == str(candle.f64) def test_tensor_can_be_added(): t = Tensor(42.0) result = t + t assert result.values() == 84.0 result = t + 2.0 assert result.values() == 44.0 a = candle.rand((3, 1, 4)) b = candle.rand((2, 1)) c_native = a.broadcast_add(b) c = a + b assert c.shape == (3, 2, 4) assert c.values() == c_native.values() with pytest.raises(ValueError): d = candle.rand((3, 4, 5)) e = candle.rand((4, 6)) f = d + e def test_tensor_can_be_subtracted(): t = Tensor(42.0) result = t - t assert result.values() == 0 result = t - 2.0 assert result.values() == 40.0 a = candle.rand((3, 1, 4)) b = candle.rand((2, 1)) c_native = a.broadcast_sub(b) c = a - b assert c.shape == (3, 2, 4) assert c.values() == c_native.values() with pytest.raises(ValueError): d = candle.rand((3, 4, 5)) e = candle.rand((4, 6)) f = d - e def test_tensor_can_be_multiplied(): t = Tensor(42.0) result = t * t assert result.values() == 1764.0 result = t * 2.0 assert result.values() == 84.0 a = candle.rand((3, 1, 4)) b = candle.rand((2, 1)) c_native = a.broadcast_mul(b) c = a * b assert c.shape == (3, 2, 4) assert c.values() == c_native.values() with pytest.raises(ValueError): d = candle.rand((3, 4, 5)) e = candle.rand((4, 6)) f = d * e def test_tensor_can_be_divided(): t = Tensor(42.0) result = t / t assert result.values() == 1.0 result = t / 2.0 assert result.values() == 21.0 a = candle.rand((3, 1, 4)) b = candle.rand((2, 1)) c_native = a.broadcast_div(b) c = a / b assert c.shape == (3, 2, 4) assert c.values() == c_native.values() with pytest.raises(ValueError): d = candle.rand((3, 4, 5)) e = candle.rand((4, 6)) f = d / e
1
0
hf_public_repos/candle/candle-pyo3/tests
hf_public_repos/candle/candle-pyo3/tests/native/test_shape.py
from candle import Tensor
from candle import rand
import pytest


def test_absolute_shapes_are_valid():
    a = rand((10, 20))
    assert a.shape == (10, 20)

    b = rand(10, 20)
    assert b.shape == (10, 20)

    pytest.raises(OverflowError, lambda: rand((10, 20, -1)))
    pytest.raises(OverflowError, lambda: rand(-1, 20))
    pytest.raises(TypeError, lambda: rand("foo", True))


def test_relative_shapes_are_valid():
    a = rand(10, 20)
    a = a.reshape((1, -1))
    assert a.shape == (1, 200)

    b = rand(10, 20)
    b = b.reshape(-1, 1)
    assert b.shape == (200, 1)

    c = rand(10, 20)
    pytest.raises(TypeError, lambda: c.reshape(1, "foo"))
    pytest.raises(ValueError, lambda: c.reshape(1, -2))
    pytest.raises(ValueError, lambda: c.reshape((-2, 1)))
    pytest.raises(ValueError, lambda: c.reshape((0, 1)))
    pytest.raises(ValueError, lambda: c.reshape((1, -1, -1)))
2
0
hf_public_repos/candle/candle-pyo3/tests
hf_public_repos/candle/candle-pyo3/tests/bindings/test_testing.py
import candle
from candle import Tensor
from candle.testing import assert_equal, assert_almost_equal
import pytest


@pytest.mark.parametrize("dtype", [candle.f32, candle.f64, candle.f16, candle.u32, candle.u8, candle.i64])
def test_assert_equal_asserts_correctly(dtype: candle.DType):
    a = Tensor([1, 2, 3]).to(dtype)
    b = Tensor([1, 2, 3]).to(dtype)
    assert_equal(a, b)

    with pytest.raises(AssertionError):
        assert_equal(a, b + 1)


@pytest.mark.parametrize("dtype", [candle.f32, candle.f64, candle.f16, candle.u32, candle.u8, candle.i64])
def test_assert_almost_equal_asserts_correctly(dtype: candle.DType):
    a = Tensor([1, 2, 3]).to(dtype)
    b = Tensor([1, 2, 3]).to(dtype)
    assert_almost_equal(a, b)

    with pytest.raises(AssertionError):
        assert_almost_equal(a, b + 1)

    assert_almost_equal(a, b + 1, atol=20)
    assert_almost_equal(a, b + 1, rtol=20)

    with pytest.raises(AssertionError):
        assert_almost_equal(a, b + 1, atol=0.9)

    with pytest.raises(AssertionError):
        assert_almost_equal(a, b + 1, rtol=0.1)
3
0
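The tolerance helpers above combine naturally with the quantization roundtrips exercised in the earlier test files. A small sketch that bounds the reconstruction error of an 8-bit quantize/dequantize roundtrip follows; the 0.1 absolute tolerance is an illustrative choice, not a documented guarantee of the format.

```python
# Sketch: bound the reconstruction error of a quantize/dequantize roundtrip.
import candle
from candle.testing import assert_almost_equal

t = candle.randn((16, 256))
q = t.quantize("q8_0")  # 8-bit block quantization, as exercised in test_tensor.py
assert_almost_equal(q.dequantize(), t, atol=0.1)  # loose, illustrative tolerance
```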
hf_public_repos/candle/candle-pyo3/tests
hf_public_repos/candle/candle-pyo3/tests/bindings/test_module.py
import candle from candle import Tensor, QTensor from candle.nn import Module, Linear from candle.utils import cuda_is_available import pytest def test_module_can_be_constructed(): class A(Module): pass a = A() assert a is not None assert len(list(a.buffers())) == 0 def test_module_registers_tensors(): class A(Module): def __init__(self): super().__init__() self.t = Tensor(42.0) a = A() named_buffers = dict(a.named_buffers()) assert len(named_buffers) == 1 assert "t" in named_buffers def test_module_registers_submodules(): class A(Module): def __init__(self): super().__init__() self.linear = Linear(10, 20) a = A() named_modules = dict(a.named_modules()) named_buffers = dict(a.named_buffers()) assert len(named_buffers) == 2 assert "linear" in named_modules assert "linear.weight" in named_buffers assert "linear.bias" in named_buffers def test_module_can_dump_statedict(): class A(Module): def __init__(self): super().__init__() self.linear = Linear(10, 20) self.t = Tensor(42.0) a = A() state_dict = a.state_dict() assert hasattr(state_dict, "_metadata") assert "t" in state_dict assert "linear.weight" in state_dict assert "linear.bias" in state_dict assert len(state_dict) == 3 def test_module_can_load_statedict(): class A(Module): def __init__(self): super().__init__() self.linear = Linear(10, 20) self.t = Tensor(42.0) statedict = { "linear.weight": candle.ones((20, 10)), "linear.bias": candle.zeros((20,)), "t": Tensor(42.0), } a = A() a.load_state_dict(statedict) def test_module_throws_on_shape_mismatch(): class A(Module): def __init__(self): super().__init__() self.t = Tensor(42.0) statedict = { "t": candle.ones((20,)), } a = A() with pytest.raises(RuntimeError) as excinfo: a.load_state_dict(statedict) assert "size mismatch" in str(excinfo.value) def test_module_throws_on_missing_key(): class A(Module): def __init__(self): super().__init__() self.t = Tensor(42.0) statedict = { "not_t": Tensor(42.0), } a = A() with pytest.raises(RuntimeError) as excinfo: a.load_state_dict(statedict) assert 'Missing key(s) in state_dict: "t".' in str(excinfo.value) def test_module_can_load_quantized_tensors(): class A(Module): def __init__(self): super().__init__() self.t = candle.randn((16, 256)) self._quantizable_buffers.add("t") statedict = { "t": candle.ones((16, 256)).quantize("q4_0"), } a = A() a.load_state_dict(statedict) assert isinstance(a.t, QTensor) assert a.t.ggml_dtype == "Q4_0" def test_module_dequantizes_tensors_automatically(): class A(Module): def __init__(self): super().__init__() self.t = candle.randn((16, 256)) statedict = { "t": candle.ones((16, 256)).quantize("q4_0"), } a = A() a.load_state_dict(statedict) assert isinstance(a.t, Tensor) @pytest.mark.skipif(not cuda_is_available(), reason="CUDA is not available") def test_module_can_be_moved_to_cuda(): class A(Module): def __init__(self): super().__init__() self.t = candle.randn((16, 256)) a = A() a.cuda() assert a.t.device == "cuda" @pytest.mark.skipif(not cuda_is_available(), reason="CUDA is not available") def test_module_can_be_moved_from_cuda_to_cpu(): class A(Module): def __init__(self): super().__init__() self.t = candle.randn((16, 256)) a = A() a.cuda() assert a.t.device == "cuda" a.cpu() assert a.t.device == "cpu"
4
0
hf_public_repos/candle/candle-pyo3/tests
hf_public_repos/candle/candle-pyo3/tests/bindings/test_linear.py
import candle
from candle import Tensor
from candle.nn import Linear


def test_linear_layer_can_be_constructed():
    linear = Linear(10, 10)
    assert linear is not None


def test_linear_layer_can_forward_a_singular_input():
    linear = Linear(384, 1536)
    input_tensor = candle.randn((8, 384))
    output = linear.forward(input_tensor)
    assert output.shape == (8, 1536)


def test_linear_layer_can_forward_a_batched_input():
    linear = Linear(384, 1536)
    input_tensor = candle.randn((16, 8, 384))
    output = linear.forward(input_tensor)
    assert output.shape == (16, 8, 1536)


def test_quantized_linear_layer_can_forward_a_singular_input():
    linear = Linear(384, 1536)
    linear.weight = linear.weight.quantize("q4_0")
    input_tensor = candle.randn((8, 384))
    output = linear.forward(input_tensor)
    assert output.shape == (8, 1536)


def test_quantized_linear_layer_can_forward_a_batched_input():
    linear = Linear(384, 1536)
    linear.weight = linear.weight.quantize("q4_0")
    input_tensor = candle.randn((16, 8, 384))
    output = linear.forward(input_tensor)
    assert output.shape == (16, 8, 1536)
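Assuming the `candle` Python wheel is already built and importable, these binding tests can presumably be driven programmatically as well; the directory path below is taken from the file layout shown in this dump:

```python
# Hypothetical programmatic test run; equivalent to invoking pytest on the bindings directory.
import pytest

exit_code = pytest.main(["-v", "candle-pyo3/tests/bindings"])
raise SystemExit(exit_code)
```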
5
0
hf_public_repos/candle
hf_public_repos/candle/tensor-tools/Cargo.toml
[package]
name = "tensor-tools"
version.workspace = true
edition.workspace = true
description.workspace = true
repository.workspace = true
keywords.workspace = true
categories.workspace = true
license.workspace = true

[dependencies]
anyhow = { workspace = true }
candle = { workspace = true }
clap = { workspace = true }
rayon = { workspace = true }
safetensors = { workspace = true }
6
0
hf_public_repos/candle/tensor-tools
hf_public_repos/candle/tensor-tools/src/main.rs
use candle::quantized::{gguf_file, GgmlDType, QTensor}; use candle::{Device, Result}; use clap::{Parser, Subcommand, ValueEnum}; use rayon::prelude::*; #[derive(ValueEnum, Debug, Clone)] enum QuantizationMode { /// The default quantization includes all 2d tensors, except the output tensor which always /// uses Q6_K. Llama, } impl QuantizationMode { fn quantize(&self, name: &str, tensor: QTensor, dtype: GgmlDType) -> Result<QTensor> { match self { Self::Llama => { // Same behavior as the llama.cpp quantization. let should_quantize = name.ends_with(".weight") && tensor.rank() == 2; if should_quantize { let tensor = tensor.dequantize(&Device::Cpu)?; if name == "output.weight" { QTensor::quantize(&tensor, GgmlDType::Q6K) } else { QTensor::quantize(&tensor, dtype) } } else { Ok(tensor) } } } } } #[derive(ValueEnum, Debug, Clone)] enum Quantization { #[value(name = "q4_0")] Q4_0, #[value(name = "q4_1")] Q4_1, #[value(name = "q5_0")] Q5_0, #[value(name = "q5_1")] Q5_1, #[value(name = "q8_0")] Q8_0, #[value(name = "q8_1")] Q8_1, Q2k, Q3k, Q4k, Q5k, Q6k, Q8k, F16, F32, } impl Quantization { fn dtype(&self) -> GgmlDType { match self { Quantization::Q4_0 => GgmlDType::Q4_0, Quantization::Q4_1 => GgmlDType::Q4_1, Quantization::Q5_0 => GgmlDType::Q5_0, Quantization::Q5_1 => GgmlDType::Q5_1, Quantization::Q8_0 => GgmlDType::Q8_0, Quantization::Q8_1 => GgmlDType::Q8_1, Quantization::Q2k => GgmlDType::Q2K, Quantization::Q3k => GgmlDType::Q3K, Quantization::Q4k => GgmlDType::Q4K, Quantization::Q5k => GgmlDType::Q5K, Quantization::Q6k => GgmlDType::Q6K, Quantization::Q8k => GgmlDType::Q8K, Quantization::F16 => GgmlDType::F16, Quantization::F32 => GgmlDType::F32, } } } #[derive(ValueEnum, Debug, Clone)] enum Format { Safetensors, Npz, Ggml, Gguf, Pth, Pickle, } impl Format { fn infer<P: AsRef<std::path::Path>>(p: P) -> Option<Self> { p.as_ref() .extension() .and_then(|e| e.to_str()) .and_then(|e| match e { // We don't infer any format for .bin as it can be used for ggml/gguf or pytorch. "safetensors" | "safetensor" => Some(Self::Safetensors), "npz" => Some(Self::Npz), "pth" | "pt" => Some(Self::Pth), "ggml" => Some(Self::Ggml), "gguf" => Some(Self::Gguf), _ => None, }) } } #[derive(Subcommand, Debug, Clone)] enum Command { Ls { files: Vec<std::path::PathBuf>, /// The file format to use, if unspecified infer from the file extension. #[arg(long, value_enum)] format: Option<Format>, /// Enable verbose mode. #[arg(short, long)] verbose: bool, }, Print { file: std::path::PathBuf, names: Vec<String>, /// The file format to use, if unspecified infer from the file extension. #[arg(long, value_enum)] format: Option<Format>, /// Print the whole content of each tensor. #[arg(long)] full: bool, /// Line width for printing the tensors. #[arg(long)] line_width: Option<usize>, }, Quantize { /// The input file(s), in safetensors format. in_file: Vec<std::path::PathBuf>, /// The output file, in gguf format. #[arg(long)] out_file: std::path::PathBuf, /// The quantization schema to apply. #[arg(long, value_enum)] quantization: Quantization, /// Which tensor to quantize. #[arg(long, value_enum, default_value_t = QuantizationMode::Llama)] mode: QuantizationMode, }, Dequantize { /// The input file, in gguf format. in_file: std::path::PathBuf, /// The output file, in safetensors format. 
#[arg(long)] out_file: std::path::PathBuf, }, } #[derive(Parser, Debug, Clone)] struct Args { #[command(subcommand)] command: Command, } fn run_print( file: &std::path::PathBuf, names: Vec<String>, format: Option<Format>, full: bool, line_width: Option<usize>, device: &Device, ) -> Result<()> { if full { candle::display::set_print_options_full(); } if let Some(line_width) = line_width { candle::display::set_line_width(line_width) } let format = match format { Some(format) => format, None => match Format::infer(file) { Some(format) => format, None => { println!( "{file:?}: cannot infer format from file extension, use the --format flag" ); return Ok(()); } }, }; match format { Format::Npz => { let tensors = candle::npy::NpzTensors::new(file)?; let names = if names.is_empty() { tensors.names().into_iter().map(|v| v.to_string()).collect() } else { names }; for name in names.iter() { println!("==== {name} ===="); match tensors.get(name)? { Some(tensor) => println!("{tensor}"), None => println!("not found"), } } } Format::Safetensors => { use candle::safetensors::Load; let tensors = unsafe { candle::safetensors::MmapedSafetensors::new(file)? }; let tensors: std::collections::HashMap<_, _> = tensors.tensors().into_iter().collect(); let names = if names.is_empty() { tensors.keys().map(|v| v.to_string()).collect() } else { names }; for name in names.iter() { println!("==== {name} ===="); match tensors.get(name) { Some(tensor_view) => { let tensor = tensor_view.load(device)?; println!("{tensor}") } None => println!("not found"), } } } Format::Pth => { let pth_file = candle::pickle::PthTensors::new(file, None)?; let names = if names.is_empty() { pth_file .tensor_infos() .keys() .map(|v| v.to_string()) .collect() } else { names }; for name in names.iter() { println!("==== {name} ===="); match pth_file.get(name)? 
{ Some(tensor) => { println!("{tensor}") } None => println!("not found"), } } } Format::Pickle => { candle::bail!("pickle format is not supported for print") } Format::Ggml => { let mut file = std::fs::File::open(file)?; let content = candle::quantized::ggml_file::Content::read(&mut file, device)?; let names = if names.is_empty() { content.tensors.keys().map(|v| v.to_string()).collect() } else { names }; for name in names.iter() { println!("==== {name} ===="); match content.tensors.get(name) { Some(tensor) => { let tensor = tensor.dequantize(device)?; println!("{tensor}") } None => println!("not found"), } } } Format::Gguf => { let mut file = std::fs::File::open(file)?; let content = gguf_file::Content::read(&mut file)?; let names = if names.is_empty() { content.tensor_infos.keys().map(|v| v.to_string()).collect() } else { names }; for name in names.iter() { println!("==== {name} ===="); match content.tensor(&mut file, name, device) { Ok(tensor) => { let tensor = tensor.dequantize(device)?; println!("{tensor}") } Err(_) => println!("not found"), } } } } Ok(()) } fn run_ls( file: &std::path::PathBuf, format: Option<Format>, verbose: bool, device: &Device, ) -> Result<()> { let format = match format { Some(format) => format, None => match Format::infer(file) { Some(format) => format, None => { println!( "{file:?}: cannot infer format from file extension, use the --format flag" ); return Ok(()); } }, }; match format { Format::Npz => { let tensors = candle::npy::NpzTensors::new(file)?; let mut names = tensors.names(); names.sort(); for name in names { let shape_dtype = match tensors.get_shape_and_dtype(name) { Ok((shape, dtype)) => format!("[{shape:?}; {dtype:?}]"), Err(err) => err.to_string(), }; println!("{name}: {shape_dtype}") } } Format::Safetensors => { let tensors = unsafe { candle::safetensors::MmapedSafetensors::new(file)? 
}; let mut tensors = tensors.tensors(); tensors.sort_by(|a, b| a.0.cmp(&b.0)); for (name, view) in tensors.iter() { let dtype = view.dtype(); let dtype = match candle::DType::try_from(dtype) { Ok(dtype) => format!("{dtype:?}"), Err(_) => format!("{dtype:?}"), }; let shape = view.shape(); println!("{name}: [{shape:?}; {dtype}]") } } Format::Pth => { let mut tensors = candle::pickle::read_pth_tensor_info(file, verbose, None)?; tensors.sort_by(|a, b| a.name.cmp(&b.name)); for tensor_info in tensors.iter() { println!( "{}: [{:?}; {:?}]", tensor_info.name, tensor_info.layout.shape(), tensor_info.dtype, ); if verbose { println!(" {:?}", tensor_info); } } } Format::Pickle => { let file = std::fs::File::open(file)?; let mut reader = std::io::BufReader::new(file); let mut stack = candle::pickle::Stack::empty(); stack.read_loop(&mut reader)?; for (i, obj) in stack.stack().iter().enumerate() { println!("{i} {obj:?}"); } } Format::Ggml => { let mut file = std::fs::File::open(file)?; let content = candle::quantized::ggml_file::Content::read(&mut file, device)?; let mut tensors = content.tensors.into_iter().collect::<Vec<_>>(); tensors.sort_by(|a, b| a.0.cmp(&b.0)); for (name, qtensor) in tensors.iter() { println!("{name}: [{:?}; {:?}]", qtensor.shape(), qtensor.dtype()); } } Format::Gguf => { let mut file = std::fs::File::open(file)?; let content = gguf_file::Content::read(&mut file)?; if verbose { let mut metadata = content.metadata.into_iter().collect::<Vec<_>>(); metadata.sort_by(|a, b| a.0.cmp(&b.0)); println!("metadata entries ({})", metadata.len()); for (key, value) in metadata.iter() { println!(" {key}: {value:?}"); } } let mut tensors = content.tensor_infos.into_iter().collect::<Vec<_>>(); tensors.sort_by(|a, b| a.0.cmp(&b.0)); for (name, info) in tensors.iter() { println!("{name}: [{:?}; {:?}]", info.shape, info.ggml_dtype); } } } Ok(()) } fn run_quantize_safetensors( in_files: &[std::path::PathBuf], out_file: std::path::PathBuf, q: Quantization, ) -> Result<()> { let mut out_file = std::fs::File::create(out_file)?; let mut tensors = std::collections::HashMap::new(); for in_file in in_files.iter() { let in_tensors = candle::safetensors::load(in_file, &Device::Cpu)?; tensors.extend(in_tensors) } println!("tensors: {}", tensors.len()); let dtype = q.dtype(); let block_size = dtype.block_size(); let qtensors = tensors .into_par_iter() .map(|(name, tensor)| { let should_quantize = tensor.rank() == 2 && tensor.dim(1)? % block_size == 0; println!(" quantizing {name} {tensor:?} {should_quantize}"); let tensor = if should_quantize { QTensor::quantize(&tensor, dtype)? } else { QTensor::quantize(&tensor, GgmlDType::F32)? 
}; Ok((name, tensor)) }) .collect::<Result<Vec<_>>>()?; let qtensors = qtensors .iter() .map(|(k, v)| (k.as_str(), v)) .collect::<Vec<_>>(); gguf_file::write(&mut out_file, &[], &qtensors)?; Ok(()) } fn run_dequantize( in_file: std::path::PathBuf, out_file: std::path::PathBuf, device: &Device, ) -> Result<()> { let mut in_file = std::fs::File::open(in_file)?; let content = gguf_file::Content::read(&mut in_file)?; let mut tensors = std::collections::HashMap::new(); for (tensor_name, _) in content.tensor_infos.iter() { let tensor = content.tensor(&mut in_file, tensor_name, device)?; let tensor = tensor.dequantize(device)?; tensors.insert(tensor_name.to_string(), tensor); } candle::safetensors::save(&tensors, out_file)?; Ok(()) } fn run_quantize( in_files: &[std::path::PathBuf], out_file: std::path::PathBuf, q: Quantization, qmode: QuantizationMode, device: &Device, ) -> Result<()> { if in_files.is_empty() { candle::bail!("no specified input files") } if let Some(extension) = out_file.extension() { if extension == "safetensors" { candle::bail!("the generated file cannot use the safetensors extension") } } if let Some(extension) = in_files[0].extension() { if extension == "safetensors" { return run_quantize_safetensors(in_files, out_file, q); } } if in_files.len() != 1 { candle::bail!("only a single in-file can be used when quantizing gguf files") } // Open the out file early so as to fail directly on missing directories etc. let mut out_file = std::fs::File::create(out_file)?; let mut in_ = std::fs::File::open(&in_files[0])?; let content = gguf_file::Content::read(&mut in_)?; println!("tensors: {}", content.tensor_infos.len()); let dtype = q.dtype(); let qtensors = content .tensor_infos .par_iter() .map(|(name, _)| { println!(" quantizing {name}"); let mut in_file = std::fs::File::open(&in_files[0])?; let tensor = content.tensor(&mut in_file, name, device)?; let tensor = qmode.quantize(name, tensor, dtype)?; Ok((name, tensor)) }) .collect::<Result<Vec<_>>>()?; let qtensors = qtensors .iter() .map(|(k, v)| (k.as_str(), v)) .collect::<Vec<_>>(); let metadata = content .metadata .iter() .map(|(k, v)| (k.as_str(), v)) .collect::<Vec<_>>(); gguf_file::write(&mut out_file, metadata.as_slice(), &qtensors)?; Ok(()) } fn main() -> anyhow::Result<()> { let args = Args::parse(); let device = Device::Cpu; match args.command { Command::Ls { files, format, verbose, } => { let multiple_files = files.len() > 1; for file in files.iter() { if multiple_files { println!("--- {file:?} ---"); } run_ls(file, format.clone(), verbose, &device)? } } Command::Print { file, names, format, full, line_width, } => run_print(&file, names, format, full, line_width, &device)?, Command::Quantize { in_file, out_file, quantization, mode, } => run_quantize(&in_file, out_file, quantization, mode, &device)?, Command::Dequantize { in_file, out_file } => run_dequantize(in_file, out_file, &device)?, } Ok(()) }
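Putting the subcommands defined above (`ls`, `print`, `quantize`, `dequantize`) together, one plausible way to drive the `tensor-tools` binary from a script is sketched below. The flag spellings (`--quantization`, `--out-file`) are assumed from clap's default kebab-case derivation of the field names, and the model file names are placeholders:

```python
# Sketch: invoking the tensor-tools CLI defined above. File names are placeholders.
import subprocess


def run(*args: str) -> None:
    # Build and run the tensor-tools package, forwarding args to the CLI.
    subprocess.run(["cargo", "run", "--release", "-p", "tensor-tools", "--", *args], check=True)


run("ls", "model.safetensors")  # list tensors; format inferred from the extension
run("quantize", "model.safetensors",  # quantize a safetensors checkpoint into gguf
    "--quantization", "q4_0", "--out-file", "model-q4_0.gguf")
run("dequantize", "model-q4_0.gguf",  # and back to float safetensors
    "--out-file", "model-f32.safetensors")
```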
7
0
hf_public_repos/candle
hf_public_repos/candle/candle-transformers/Cargo.toml
[package]
name = "candle-transformers"
version.workspace = true
edition.workspace = true
description.workspace = true
repository.workspace = true
keywords.workspace = true
categories.workspace = true
license.workspace = true
readme = "README.md"

[dependencies]
accelerate-src = { workspace = true, optional = true }
byteorder = { workspace = true }
candle = { workspace = true }
candle-flash-attn = { workspace = true, optional = true }
candle-nn = { workspace = true }
fancy-regex = { workspace = true }
intel-mkl-src = { workspace = true, optional = true }
num-traits = { workspace = true }
rand = { workspace = true }
rayon = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
serde_plain = { workspace = true }
tracing = { workspace = true }

[features]
default = []
accelerate = ["dep:accelerate-src", "candle/accelerate", "candle-nn/accelerate"]
cuda = ["candle/cuda", "candle-nn/cuda"]
flash-attn = ["cuda", "dep:candle-flash-attn"]
mkl = ["dep:intel-mkl-src", "candle/mkl", "candle-nn/mkl"]
metal = ["candle/metal", "candle-nn/metal"]
8
0
hf_public_repos/candle
hf_public_repos/candle/candle-transformers/README.md
# candle-transformers
9
0
hf_public_repos/candle
hf_public_repos/candle/candle-onnx/Cargo.toml
[package]
name = "candle-onnx"
version = "0.8.0"
edition = "2021"
description = "ONNX support for Candle"
repository = "https://github.com/huggingface/candle"
keywords = ["blas", "tensor", "machine-learning"]
categories = ["science"]
license = "MIT OR Apache-2.0"

[dependencies]
candle = { path = "../candle-core", package = "candle-core", version = "0.8.0" }
candle-nn = { path = "../candle-nn", version = "0.8.0" }
prost = "0.12.1"

[build-dependencies]
prost-build = "0.12.1"

[dev-dependencies]
anyhow = { version = "1", features = ["backtrace"] }
clap = { version = "4.2.4", features = ["derive"] }
0
0
hf_public_repos/candle
hf_public_repos/candle/candle-onnx/README.md
# candle-onnx

This crate adds ONNX support to candle.

## FAQ

#### Missing protoc installation when compiling candle-onnx

The candle-onnx dependency prost-build no longer comes bundled with prost binaries. This could cause the following error when attempting to compile candle-onnx:

```
error: failed to run custom build command for `candle-onnx`

Caused by:
  // (...)
  Could not find `protoc` installation and this build crate cannot proceed without this knowledge.
```

To fix this issue install protoc on your system and make it available in your system `PATH`. See the [protoc documentation](https://grpc.io/docs/protoc-installation/) for more information.
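For example, on Debian/Ubuntu the compiler is typically available as the `protobuf-compiler` package; other platforms are covered in the linked protoc documentation:

```
sudo apt-get install -y protobuf-compiler
protoc --version   # should succeed once protoc is on PATH
```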
1
0
hf_public_repos/candle/candle-onnx
hf_public_repos/candle/candle-onnx/src/eval.rs
use crate::onnx::attribute_proto::AttributeType; use crate::onnx::tensor_proto::DataType; use crate::onnx::{self, GraphProto}; use candle::{bail, DType, Device, Result, Tensor}; use std::collections::{HashMap, HashSet}; pub type Value = Tensor; pub fn dtype(dt: DataType) -> Option<DType> { match dt { DataType::Uint8 => Some(DType::U8), DataType::Uint32 => Some(DType::U32), DataType::Int64 => Some(DType::I64), DataType::Float16 => Some(DType::F16), DataType::Float => Some(DType::F32), DataType::Double => Some(DType::F64), DataType::Bool => Some(DType::U8), _ => None, } } trait Attr { const TYPE: AttributeType; fn get(attr: &onnx::AttributeProto) -> Result<&Self>; } trait AttrOwned: Sized { const TYPE: AttributeType; fn get(attr: &onnx::AttributeProto) -> Result<Self>; } impl Attr for i64 { const TYPE: AttributeType = AttributeType::Int; fn get(attr: &onnx::AttributeProto) -> Result<&Self> { Ok(&attr.i) } } impl Attr for f32 { const TYPE: AttributeType = AttributeType::Float; fn get(attr: &onnx::AttributeProto) -> Result<&Self> { Ok(&attr.f) } } impl Attr for [i64] { const TYPE: AttributeType = AttributeType::Ints; fn get(attr: &onnx::AttributeProto) -> Result<&Self> { Ok(attr.ints.as_slice()) } } impl Attr for str { const TYPE: AttributeType = AttributeType::String; fn get(attr: &onnx::AttributeProto) -> Result<&Self> { std::str::from_utf8(&attr.s).map_err(candle::Error::wrap) } } impl Attr for GraphProto { const TYPE: AttributeType = AttributeType::Graph; fn get(attr: &onnx::AttributeProto) -> Result<&Self> { attr.g .as_ref() .ok_or_else(|| candle::Error::Msg("attribute does not contain graph".to_string())) } } impl AttrOwned for Vec<String> { const TYPE: AttributeType = AttributeType::Strings; fn get(attr: &onnx::AttributeProto) -> Result<Self> { let mut ret = vec![]; for bytes in attr.strings.iter() { let s = String::from_utf8(bytes.clone()).map_err(candle::Error::wrap)?; ret.push(s); } Ok(ret) } } impl AttrOwned for Tensor { const TYPE: AttributeType = AttributeType::Tensor; fn get(attr: &onnx::AttributeProto) -> Result<Self> { let tensor_proto = match &attr.t { Some(value) => value, None => bail!( "attribute {} was of type TENSOR, but no tensor was found", attr.name ), }; let data_type = match DataType::try_from(tensor_proto.data_type) { Ok(value) => value, Err(_) => bail!( "attribute {} of type TENSOR was an invalid data_type number {}", attr.name, tensor_proto.data_type ), }; let dtype = match dtype(data_type) { Some(value) => value, None => bail!( "attribute {} of type TENSOR has an unsupported data_type {}", attr.name, data_type.as_str_name() ), }; let mut dims = Vec::with_capacity(tensor_proto.dims.len()); for dim in &tensor_proto.dims { if dim < &0 { bail!( "attribute {} of type TENSOR has a negative dimension, which is unsupported", attr.name ) } dims.push(*dim as usize) } Tensor::from_raw_buffer(&tensor_proto.raw_data, dtype, &dims, &Device::Cpu) } } fn get_attr_<'a>(node: &'a onnx::NodeProto, name: &str) -> Result<&'a onnx::AttributeProto> { match node.attribute.iter().find(|attr| attr.name == name) { None => { bail!( "cannot find the '{name}' attribute in '{}' for {}", node.op_type, node.name ) } Some(dt) => Ok(dt), } } fn get_attr<'a, T: Attr + ?Sized>(node: &'a onnx::NodeProto, name: &str) -> Result<&'a T> { let attr = get_attr_(node, name)?; if attr.r#type() != T::TYPE { bail!( "unsupported type {:?} for '{name}' attribute in '{}' for {}", attr.r#type, node.op_type, node.name ) } T::get(attr) } fn get_attr_opt<'a, T: Attr + ?Sized>( node: &'a onnx::NodeProto, name: &str, ) 
-> Result<Option<&'a T>> { match node.attribute.iter().find(|attr| attr.name == name) { None => Ok(None), Some(attr) => { if attr.r#type() != T::TYPE { bail!( "unsupported type {:?} for '{name}' attribute in '{}' for {}", attr.r#type, node.op_type, node.name ) } let val = T::get(attr)?; Ok(Some(val)) } } } fn get_attr_opt_owned<T: AttrOwned>(node: &onnx::NodeProto, name: &str) -> Result<Option<T>> { match node.attribute.iter().find(|attr| attr.name == name) { None => Ok(None), Some(attr) => { if attr.r#type() != T::TYPE { bail!( "unsupported type {:?} for '{name}' attribute in '{}' for {}", attr.r#type, node.op_type, node.name ) } let val = T::get(attr)?; Ok(Some(val)) } } } pub fn get_tensor(t: &onnx::TensorProto, name: &str) -> Result<Tensor> { let dims: Vec<usize> = t.dims.iter().map(|&x| x as usize).collect(); match DataType::try_from(t.data_type) { Ok(DataType::Int32) => { if t.int32_data.is_empty() { let len = t.raw_data.len() / 4; let data: &[i32] = unsafe { std::slice::from_raw_parts(t.raw_data.as_ptr() as *const i32, len) }; let data = data.iter().map(|v| *v as i64).collect::<Vec<_>>(); Tensor::from_vec(data, len, &Device::Cpu) } else { let data = t.int32_data.iter().map(|v| *v as i64).collect::<Vec<_>>(); Tensor::from_vec(data, t.int32_data.len(), &Device::Cpu) } } Ok(dt) => match dtype(dt) { Some(dt) => { if dt == DType::F32 && !t.float_data.is_empty() { Tensor::from_slice(&t.float_data, dims.as_slice(), &Device::Cpu) } else if dt == DType::F64 && !t.double_data.is_empty() { Tensor::from_slice(&t.double_data, dims.as_slice(), &Device::Cpu) } else if dt == DType::I64 && !t.int64_data.is_empty() { Tensor::from_slice(&t.int64_data, dims.as_slice(), &Device::Cpu) } else { Tensor::from_raw_buffer( t.raw_data.as_slice(), dt, dims.as_slice(), &Device::Cpu, ) } } None => { bail!("unsupported 'value' data-type {dt:?} for {name}") } }, Err(_) => { bail!("unsupported 'value' data-type {} for {name}", t.data_type,) } } } // This function provides a direct evaluation of the proto. // Longer-term, we should first convert the proto to an intermediate representation of the compute // graph so as to make multiple evaluations more efficient. // An example upside of this would be to remove intermediary values when they are not needed // anymore. 
pub fn simple_eval( model: &onnx::ModelProto, mut inputs: HashMap<String, Value>, ) -> Result<HashMap<String, Value>> { let graph = match &model.graph { None => bail!("no graph defined in proto"), Some(graph) => graph, }; simple_eval_(graph, &mut inputs) } fn simple_eval_( graph: &onnx::GraphProto, values: &mut HashMap<String, Value>, ) -> Result<HashMap<String, Value>> { for t in graph.initializer.iter() { let tensor = get_tensor(t, t.name.as_str())?; values.insert(t.name.to_string(), tensor); } for input in graph.input.iter() { let input_type = match &input.r#type { Some(input_type) => input_type, None => continue, }; let input_type = match &input_type.value { Some(input_type) => input_type, None => continue, }; let tensor_type = match input_type { onnx::type_proto::Value::TensorType(tt) => tt, _ => continue, }; let tensor = match values.get(&input.name) { None => bail!("missing input {}", input.name), Some(tensor) => tensor, }; let dt = match DataType::try_from(tensor_type.elem_type) { Ok(dt) => match dtype(dt) { Some(dt) => dt, None => { bail!("unsupported 'value' data-type {dt:?} for {}", input.name) } }, type_ => bail!("unsupported input type {type_:?}"), }; match &tensor_type.shape { None => continue, Some(shape) => { if shape.dim.len() != tensor.rank() { bail!( "unexpected rank for {}, got {:?}, expected {:?}", input.name, shape.dim, tensor.shape() ) } for (idx, (d, &dim)) in shape.dim.iter().zip(tensor.dims().iter()).enumerate() { match &d.value { Some(onnx::tensor_shape_proto::dimension::Value::DimValue(v)) => { if *v as usize != dim { bail!( "unexpected dim {idx} for {}, got {:?}, expected {:?}", input.name, shape.dim, tensor.shape() ) } } // We do not check equality constraints for the DimParam dimensions for now. Some(onnx::tensor_shape_proto::dimension::Value::DimParam(_)) | None => (), } } } }; if dt != tensor.dtype() { bail!( "unexpected dtype for {}, got {:?}, expected {dt:?}", input.name, tensor.dtype() ) } } // The nodes are topologically sorted so we can just process them in order. for node in graph.node.iter() { let get = |input_name: &str| match values.get(input_name) { Some(value) => Ok(value), None => bail!("cannot find {input_name} for op '{}'", node.name), }; let get_opt = |i: usize| { node.input .get(i) .filter(|s: &&String| !s.is_empty()) .map(|s| get(s)) }; // TODO: Validate node.input for each operator. match node.op_type.as_str() { "Add" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; let output = input0.broadcast_add(input1)?; values.insert(node.output[0].clone(), output); } "Sub" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; let output = input0.broadcast_sub(input1)?; values.insert(node.output[0].clone(), output); } "Mul" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; let output = input0.broadcast_mul(input1)?; values.insert(node.output[0].clone(), output); } "Div" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; let output = input0.broadcast_div(input1)?; values.insert(node.output[0].clone(), output); } "Pow" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; // HACK: current implementation of broadcast_pow cannot handle negative base, // so we use powf where we can, which *does* correctly handle negative base. 
if let Ok(exp) = (|| input1.to_dtype(DType::F64)?.to_scalar::<f64>())() { let output = input0.powf(exp)?; values.insert(node.output[0].clone(), output); } else { let output = input0.broadcast_pow(input1)?; values.insert(node.output[0].clone(), output); } } "Exp" => { let xs = get(&node.input[0])?; let output = xs.exp()?; values.insert(node.output[0].clone(), output); } "Equal" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; let output = input0.broadcast_eq(input1)?; values.insert(node.output[0].clone(), output); } "Not" => { let xs = get(&node.input[0])?; let xs = xs.eq(&xs.zeros_like()?)?; values.insert(node.output[0].clone(), xs); } "MatMul" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?; let output = input0.broadcast_matmul(input1)?; values.insert(node.output[0].clone(), output); } "Reshape" => { let input0 = get(&node.input[0])?; let input1 = get(&node.input[1])?.to_vec1::<i64>()?; // TODO: Check that there is at most a single -1 or 0, handle other neg values. let mut other_than_minus1 = 1usize; for &v in input1.iter() { if v != -1 && v != 0 { other_than_minus1 *= v as usize } } let input1 = input1 .iter() .enumerate() .map(|(idx, &v)| match v { -1 => Ok(input0.elem_count() / other_than_minus1), 0 => input0.dim(idx), _ => Ok(v as usize), }) .collect::<Result<Vec<usize>>>()?; let output = input0.reshape(input1)?; values.insert(node.output[0].clone(), output); } "LogSoftmax" => { let input = get(&node.input[0])?; let output = match get_attr_opt::<i64>(node, "axis")? { None => candle_nn::ops::softmax_last_dim(input)?, Some(&axis) => { let axis = input.normalize_axis(axis)?; candle_nn::ops::log_softmax(input, axis)? } }; values.insert(node.output[0].clone(), output); } "Softmax" => { let input = get(&node.input[0])?; let output = match get_attr_opt::<i64>(node, "axis")? { None => candle_nn::ops::softmax_last_dim(input)?, Some(&axis) => { let axis = input.normalize_axis(axis)?; candle_nn::ops::softmax(input, axis)? } }; values.insert(node.output[0].clone(), output); } "Transpose" => { let input = get(&node.input[0])?; let output = match get_attr_opt::<[i64]>(node, "perm")? { None => input.t()?, Some(perm) => { let perm = perm.iter().map(|&v| v as usize).collect::<Vec<_>>(); input.permute(perm)? } }; values.insert(node.output[0].clone(), output); } "Dropout" => { let input = get(&node.input[0])?; // Do not apply dropout at the moment, consider that we're only doing inference. values.insert(node.output[0].clone(), input.clone()); } "MaxPool" => { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#MaxPool let dilations = get_attr_opt::<[i64]>(node, "dilations")?; let kernel_shape = get_attr::<[i64]>(node, "kernel_shape")?; let pads = get_attr_opt::<[i64]>(node, "pads")?; let strides = get_attr_opt::<[i64]>(node, "strides")?; let auto_pad = get_attr_opt::<str>(node, "auto_pad")?; match auto_pad { None | Some("NOTSET") => (), Some(s) => bail!("unsupported auto_pad {s}"), }; if let Some(d) = dilations { if d.iter().any(|&v| v != 1) { bail!("MaxPool with dilation != 1, {dilations:?}") } } if let Some(d) = pads { if d.iter().any(|&v| v != 0) { bail!("MaxPool with pads != 0, {pads:?}") } } let xs = get(&node.input[0])?; let (k1, k2) = match kernel_shape { [k1, k2] => (*k1 as usize, *k2 as usize), _ => bail!("only 2d MaxPool is supported, kernel shape {kernel_shape:?}"), }; let ys = match strides { None => xs.max_pool2d((k1, k2))?, Some([s1, s2]) => { xs.max_pool2d_with_stride((k1, k2), (*s1 as usize, *s2 as usize))? 
} Some(strides) => bail!("only 2d MaxPool is supported, strides {strides:?}"), }; values.insert(node.output[0].clone(), ys); } "AveragePool" => { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#AveragePool let dilations = get_attr_opt::<[i64]>(node, "dilations")?; let kernel_shape = get_attr::<[i64]>(node, "kernel_shape")?; let pads = get_attr_opt::<[i64]>(node, "pads")?; let strides = get_attr_opt::<[i64]>(node, "strides")?; let auto_pad = get_attr_opt::<str>(node, "auto_pad")?; match auto_pad { None | Some("NOTSET") => (), Some(s) => bail!("unsupported auto_pad {s}"), }; if let Some(d) = dilations { if d.iter().any(|&v| v != 1) { bail!("AvgPool with dilation != 1, {dilations:?}") } } if let Some(d) = pads { if d.iter().any(|&v| v != 0) { bail!("AvgPool with pads != 0, {pads:?}") } } let xs = get(&node.input[0])?; let (k1, k2) = match kernel_shape { [k1, k2] => (*k1 as usize, *k2 as usize), _ => bail!("only 2d AvgPool is supported, kernel shape {kernel_shape:?}"), }; let ys = match strides { None => xs.avg_pool2d((k1, k2))?, Some([s1, s2]) => { xs.avg_pool2d_with_stride((k1, k2), (*s1 as usize, *s2 as usize))? } Some(strides) => bail!("only 2d AvgPool is supported, strides {strides:?}"), }; values.insert(node.output[0].clone(), ys); } "BatchNormalization" => { let training_mode = get_attr_opt::<i64>(node, "training_mode")?; if training_mode.copied().unwrap_or(0) != 0 { bail!("training mode is not supported for BatchNorm") } let eps = get_attr_opt::<f32>(node, "epsilon")? .copied() .unwrap_or(1e-5); let xs = get(&node.input[0])?; let weight = get(&node.input[1])?; let bias = get(&node.input[2])?; let running_mean = get(&node.input[3])?; let running_var = get(&node.input[4])?; let target_shape: Vec<usize> = xs .dims() .iter() .enumerate() .map(|(idx, v)| if idx == 1 { *v } else { 1 }) .collect(); let target_shape = target_shape.as_slice(); let xs = xs .broadcast_sub(&running_mean.reshape(target_shape)?)? .broadcast_div(&(running_var.reshape(target_shape)? + eps as f64)?.sqrt()?)?; let weight = weight.reshape(target_shape)?; let bias = bias.reshape(target_shape)?; let xs = xs.broadcast_mul(&weight)?.broadcast_add(&bias)?; values.insert(node.output[0].clone(), xs); } "Squeeze" => { let xs = get(&node.input[0])?; let mut axes = if node.input.len() <= 1 { // contract all the dimensions with size 1 except the batch dim. xs.dims() .iter() .enumerate() .flat_map(|(idx, &s)| if s == 1 && idx > 0 { Some(idx) } else { None }) .collect() } else { get(&node.input[1])? .to_vec1::<i64>()? .iter() .map(|&i| xs.normalize_axis(i)) .collect::<Result<Vec<_>>>()? }; axes.sort(); let mut xs = xs.clone(); for &axis in axes.iter().rev() { xs = xs.squeeze(axis)? } values.insert(node.output[0].clone(), xs); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#ConstantOfShape "ConstantOfShape" => { let input = get(&node.input[0])?; let value = get_attr_opt_owned::<Tensor>(node, "value")?.unwrap_or(Tensor::zeros( (), DType::F32, &Device::Cpu, )?); let xs = Tensor::ones(input.shape(), value.dtype(), input.device())? .broadcast_mul(&value)?; values.insert(node.output[0].clone(), xs); } "Unsqueeze" => { let xs = get(&node.input[0])?; let axes = match get_attr_opt::<[i64]>(node, "axes")? 
{ Some(axis) => axis.to_vec(), None => get(&node.input[1])?.to_vec1::<i64>()?, }; let mut axes = axes .iter() .map(|&i| { if i == xs.rank() as i64 { Ok(xs.rank()) } else if i < 0 { // normalize_axis doesn't work correctly here // because we actually want normalized with respect // to the final size, not the current (off by one) Ok(xs.rank() - (-i as usize) + 1) } else { xs.normalize_axis(i) } }) .collect::<Result<Vec<_>>>()?; axes.sort(); let mut xs = xs.clone(); for &axis in axes.iter().rev() { xs = xs.unsqueeze(axis)? } values.insert(node.output[0].clone(), xs); } "Clip" => { let xs = get(&node.input[0])?; let xs = if let Some(mins) = get_opt(1) { xs.broadcast_maximum(mins?)? } else { xs.clone() }; let xs = if let Some(maxs) = get_opt(2) { xs.broadcast_minimum(maxs?)? } else { xs.clone() }; values.insert(node.output[0].clone(), xs); } "Gather" => { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Gather let xs = get(&node.input[0])?; let indices = get(&node.input[1])?; let axis = get_attr_opt::<i64>(node, "axis")?.copied().unwrap_or(0); let axis = xs.normalize_axis(axis)?; // index_select does not support negative indices, so normalize them // to positive indices. let indices = &{ let zeros = Tensor::zeros(indices.shape(), indices.dtype(), indices.device())?; let max = Tensor::new(xs.dims()[axis] as i64, indices.device())? .to_dtype(indices.dtype())?; let mask = indices.lt(&zeros)?; mask.to_dtype(indices.dtype())? .broadcast_mul(&max)? .add(indices)? }; // In Pytorch or Numpy this can be done by indexing the xs tensor using the indices // tensor directly, but candle does not support tensor indexing at the moment, so // some workarounds must be done. let xs = match indices.dims() { [] => { let index = indices.to_vec0::<i64>()? as usize; xs.narrow(axis, index, 1)?.squeeze(axis)? } [_] => xs.index_select(indices, axis)?, [first, _] => { let mut v = Vec::with_capacity(*first); for i in 0..*first { v.push(xs.index_select(&indices.get(i)?, axis)?) } Tensor::stack(&v, axis)? } _ => { // TODO: Provide an op to handle the ONNX generalized gather op ideally in a // differentiable way. todo!("implement gather for {xs:?} {indices:?} axis {axis}") } }; values.insert(node.output[0].clone(), xs); } // https://onnx.ai/onnx/operators/onnx__GatherElements.html#gatherelements // A Note to fellow lurkers: // The numpy based `gather_elements` implementation in `onnx` tests [here](https://github.com/onnx/onnx/blob/main/onnx/backend/test/case/node/gatherelements.py) // and examples is incorrect. // Use `torch.gather` for the validating/ verifying against the proper behaviour "GatherElements" => { let data = get(&node.input[0])?; let indices = get(&node.input[1])?; let rank = data.rank(); if rank != indices.rank() { bail!("indices must have same rank as input data. Data rank [{}] != indices rank [{}]", data.rank(), indices.rank()); } let axis = { let axis_i64 = get_attr_opt::<i64>(node, "axis")?.copied().unwrap_or(0); let axis = data.normalize_axis(axis_i64)?; if axis >= rank { bail!( "axis ({}) out of accepted range [-rank, rank-1] which was [-{rank}, {}]", axis_i64, rank - 1 ) } axis }; // index_select does not support negative indices, so normalize them // to positive indices. let indices = &{ let zeros = Tensor::zeros(indices.shape(), indices.dtype(), indices.device())?; let max = Tensor::new(data.dims()[axis] as i64, indices.device())? .to_dtype(indices.dtype())?; let mask = indices.lt(&zeros)?; mask.to_dtype(indices.dtype())? .broadcast_mul(&max)? .add(indices)? 
}; values.insert(node.output[0].clone(), data.gather(indices, axis)?); } "Shape" => { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Shape let xs = get(&node.input[0])?; let start = get_attr_opt::<i64>(node, "start")?.copied().unwrap_or(0); let end = get_attr_opt::<i64>(node, "end")?.copied().unwrap_or(-1); let start = xs.normalize_axis(start)?; let end = xs.normalize_axis(end)?; let mut dims = vec![]; for idx in start..=end { dims.push(xs.dim(idx)? as i64) } let dims = Tensor::from_vec(dims, xs.rank(), xs.device())?; values.insert(node.output[0].clone(), dims); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Size "Size" => { let data = get(&node.input[0])?; let size: usize = data.dims().iter().product(); let output = Tensor::from_slice(&[size as i64], (), data.device())?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Sqrt "Sqrt" => { let xs = get(&node.input[0])?; let output = xs.sqrt()?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Range "Range" => { let start = get(&node.input[0])?; let limit = get(&node.input[1])?; let delta = get(&node.input[2])?; macro_rules! arange_step { ($t: ty) => { Tensor::arange_step( start.to_vec0::<$t>()?, limit.to_vec0::<$t>()?, delta.to_vec0::<$t>()?, &Device::Cpu, )? }; } let output = match start.dtype() { DType::U8 => arange_step!(u8), DType::U32 => arange_step!(u32), DType::I64 => arange_step!(i64), DType::BF16 => arange_step!(f32), DType::F16 => arange_step!(f32), DType::F32 => arange_step!(f32), DType::F64 => arange_step!(f64), }; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Greater "Greater" => { let a = get(&node.input[0])?; let b = get(&node.input[1])?; let output = a.broadcast_gt(b)?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Less "Less" => { let a = get(&node.input[0])?; let b = get(&node.input[1])?; let output = a.broadcast_lt(b)?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Log "Log" => { let a = get(&node.input[0])?; let output = a.log()?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Min "Min" => { let mut output = get(&node.input[0])?.clone(); for input in node.input.iter() { let input = get(input)?; output = output.broadcast_minimum(input)? } values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Where "Where" => { let cond = get(&node.input[0])?; let a = get(&node.input[1])?; let b = get(&node.input[2])?; // where_cond requires that all inputs are the same shape. // In contrast, the Where op in ONNX only requires that they are broadcastable. 
let shape = broadcast_shape_from_many(&[cond.dims(), a.dims(), b.dims()])?; let cond = cond.broadcast_as(shape.clone())?; let a = a.broadcast_as(shape.clone())?; let b = b.broadcast_as(shape)?; let output = cond.where_cond(&a, &b)?; values.insert(node.output[0].clone(), output); } "Conv" => { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Conv let dilations = get_attr_opt::<[i64]>(node, "dilations")?; let groups = get_attr_opt::<i64>(node, "group")?.copied().unwrap_or(1); let _kernel_shape = get_attr_opt::<[i64]>(node, "kernel_shape")?; let pads = get_attr_opt::<[i64]>(node, "pads")?; let strides = get_attr_opt::<[i64]>(node, "strides")?; let auto_pad = get_attr_opt::<str>(node, "auto_pad")?; match auto_pad { None | Some("NOTSET") => (), Some(s) => bail!("unsupported auto_pad {s}"), }; let xs = get(&node.input[0])?; let ws = get(&node.input[1])?; let ys = match ws.rank() { 3 => { let (pads, xs) = match pads { None => (0, xs.clone()), Some([p]) => (*p as usize, xs.clone()), Some([p1, p2]) => { if p1 != p2 { (0usize, xs.pad_with_zeros(2, *p1 as usize, *p2 as usize)?) } else { (*p1 as usize, xs.clone()) } } Some(pads) => { bail!("more pads than expected in conv1d {pads:?} {}", node.name) } }; let strides = match strides { None => 1, Some([p]) => *p as usize, Some(s) => { bail!("more strides than expected in conv1d {s:?} {}", node.name) } }; let dilations = match dilations { None => 1, Some([p]) => *p as usize, Some(s) => { bail!("more dilations than expected in conv1d {s:?} {}", node.name) } }; xs.conv1d(ws, pads, strides, dilations, groups as usize)? } 4 => { let (pads, xs) = match pads { None => (0, xs.clone()), Some([p]) => (*p as usize, xs.clone()), Some(&[p1, p2, p3, p4]) => { let p1 = p1 as usize; let p2 = p2 as usize; let p3 = p3 as usize; let p4 = p4 as usize; if p1 != p2 || p1 != p3 || p1 != p4 { (0, xs.pad_with_zeros(2, p1, p3)?.pad_with_zeros(3, p2, p4)?) } else { (p1, xs.clone()) } } Some(pads) => { bail!("more pads than expected in conv2d {pads:?} {}", node.name) } }; let strides = match strides { None => 1, Some([p]) => *p as usize, Some([p1, p2]) => { if p1 != p2 { bail!( "strides have to be the same on both axis {pads:?} {}", node.name ) } *p1 as usize } Some(s) => { bail!("more strides than expected in conv2d {s:?} {}", node.name) } }; let dilations = match dilations { None => 1, Some([p]) => *p as usize, Some([p1, p2]) => { if p1 != p2 { bail!( "dilations have to be the same on both axis {pads:?} {}", node.name ) } *p1 as usize } Some(s) => { bail!("more dilations than expected in conv2d {s:?} {}", node.name) } }; xs.conv2d(ws, pads, strides, dilations, groups as usize)? } rank => bail!( "unsupported rank for weight matrix {rank} in conv {}", node.name ), }; let ys = if node.input.len() > 2 { let bs = get(&node.input[2])?; let mut bs_shape = vec![1; ys.rank()]; bs_shape[1] = bs.elem_count(); ys.broadcast_add(&bs.reshape(bs_shape)?)? 
} else { ys }; values.insert(node.output[0].clone(), ys); } "Concat" => { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Concat let inputs = node .input .iter() .map(|n| Ok(get(n.as_str())?.clone())) .collect::<Result<Vec<Value>>>()?; let axis: i64 = *get_attr(node, "axis")?; if inputs.is_empty() { bail!("empty concat") }; let axis = inputs[0].normalize_axis(axis)?; let output = Tensor::cat(&inputs, axis)?; values.insert(node.output[0].clone(), output); } "Abs" => { let input = get(&node.input[0])?; let output = input.abs()?; values.insert(node.output[0].clone(), output); } "Cos" => { let input = get(&node.input[0])?; let output = input.cos()?; values.insert(node.output[0].clone(), output); } "Sin" => { let input = get(&node.input[0])?; let output = input.sin()?; values.insert(node.output[0].clone(), output); } "Neg" => { let input = get(&node.input[0])?; let output = input.neg()?; values.insert(node.output[0].clone(), output); } "Erf" => { let input = get(&node.input[0])?; let output = input.erf()?; values.insert(node.output[0].clone(), output); } "Tanh" => { let input = get(&node.input[0])?; let output = input.tanh()?; values.insert(node.output[0].clone(), output); } "Sigmoid" => { let input = get(&node.input[0])?; let output = candle_nn::ops::sigmoid(input)?; values.insert(node.output[0].clone(), output); } "Gelu" => { let input = get(&node.input[0])?; let output = input.gelu_erf()?; values.insert(node.output[0].clone(), output); } "Relu" => { let input = get(&node.input[0])?; let output = input.relu()?; values.insert(node.output[0].clone(), output); } "Ceil" => { let input = get(&node.input[0])?; let output = input.ceil()?; values.insert(node.output[0].clone(), output); } "Floor" => { let input = get(&node.input[0])?; let output = input.floor()?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Constant "Constant" => { let value = match node.attribute.iter().find(|attr| attr.name == "value") { None => { // TODO: support sparse_value etc. bail!("cannot find 'value' attr in 'Constant' for {}", node.name) } Some(value) => value, }; let output = match value.r#type() { AttributeType::Tensor => { let t = value.t.as_ref().unwrap(); get_tensor(t, &node.name)? } rtype => bail!("unsupported 'value' type {rtype:?} for {}", node.name), }; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Cast "Cast" => { let input = get(&node.input[0])?; let dt: i64 = *get_attr(node, "to")?; let dtype = match DataType::try_from(dt as i32) { Ok(DataType::Int32) => DType::I64, Ok(dt) => match dtype(dt) { Some(dt) => dt, None => { bail!("unsupported 'to' value {dt:?} for cast {}", node.name) } }, Err(_) => { bail!("unsupported 'to' value {dt:?} for cast {}", node.name) } }; let output = input.to_dtype(dtype)?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#CumSum "CumSum" => { let exclusive = get_attr_opt::<i64>(node, "exclusive")? .copied() .unwrap_or(0); let reverse = get_attr_opt::<i64>(node, "reverse")?.copied().unwrap_or(0); if exclusive != 0 { bail!("only exclusive == 0 is supported in CumSum") } if reverse != 0 { bail!("only reverse == 0 is supported in CumSum") } let input = get(&node.input[0])?; let axis = get(&node.input[1])? .to_dtype(DType::U32)? 
.to_vec0::<u32>()?; let output = input.cumsum(axis as usize)?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#flatten "Flatten" => { let axis = get_attr_opt::<i64>(node, "axis")?.copied().unwrap_or(1) as usize; let input = get(&node.input[0])?; let first_part: usize = input.shape().dims().iter().take(axis).product(); let end_index = input.shape().dims().iter().product::<usize>(); let new_shape = (first_part, end_index / first_part); let output = input.reshape(new_shape)?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#identity "Identity" => { let input = get(&node.input[0])?; values.insert(node.output[0].clone(), input.clone()); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#if "If" => { // protobuf encodes boolean false as 0 and true as 1 let cond = get(&node.input[0])?.get(0)?.to_scalar::<u8>()?; let attr_name = if cond != 0 { "then_branch" } else { "else_branch" }; let sub_graph = get_attr::<GraphProto>(node, attr_name)?; if sub_graph.output.len() != node.output.len() { bail!( "If node {:?} is malformed: branch outputs ({}) don't match node outputs ({})", node.name, sub_graph.output.len(), node.output.len() ); } let branch_out = simple_eval_(sub_graph, values)?; for (i, out) in node.output.iter().enumerate() { values.insert( out.clone(), branch_out.get(&sub_graph.output[i].name).unwrap().clone(), ); } } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#pad "Pad" => { let mode = get_attr_opt(node, "mode")?.unwrap_or("constant"); let data = get(&node.input[0])?; let pads = get(&node.input[1])?; if node.input.len() > 2 { bail!( "unsupported number of inputs {} for Pad node {:?}, expected 2", node.input.len(), node.name ); } if pads.rank() != 1 { bail!("Pad expects 'pads' input to be 1D vector: {pads:?}"); } if pads.dim(0).unwrap() != 2 * data.rank() { bail!("Pad expects 'pads' input len to be 2 * rank of 'data' input: pads: {}, data rank: {}", pads, data.rank()); } let pads = pads.to_vec1::<i64>()?; let (pads_pre, pads_post) = pads.split_at(pads.len() / 2); match mode { "reflect" => { let mut out = data.clone(); for (i, &dim) in data.dims().iter().enumerate().rev() { if pads_pre[i] == 0 && pads_post[i] == 0 { continue; } fn zigzag(min: i64, max: i64) -> impl Iterator<Item = i64> { std::iter::repeat((min..max).chain((min + 1..=max).rev())).flatten() } let idx = if dim > 1 { let cycle_len = dim * 2 - 2; let skip = cycle_len - ((pads_pre[i] as usize) % cycle_len); let idx = zigzag(0, (dim - 1) as i64) .skip(skip) .take((pads_pre[i] as usize) + dim + (pads_post[i] as usize)); Tensor::from_iter(idx, out.device())? } else { Tensor::full(0i64, (dim,), out.device())? }; out = out.index_select(&idx, i)?; } values.insert(node.output[0].clone(), out); } _ => bail!( "unsupported 'mode' value {mode:?} for Pad node {:?}", node.name ), } } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#slice "Slice" => { let data = get(&node.input[0])?; let starts = get(&node.input[1])?; let ends = get(&node.input[2])?; let default_axes; let default_steps; let axes: &Tensor; let steps: &Tensor; // If axes are omitted, they are set to [0, ..., r-1]. 
If steps are omitted, // they are set to [1, ..., 1] of length len(starts) match node.input.len() { 3 => { let len = starts.dims()[0]; default_axes = Some(Tensor::arange(0, len as i64, starts.device())?); axes = default_axes.as_ref().unwrap(); default_steps = Some(Tensor::ones((len,), DType::I64, starts.device())?); steps = default_steps.as_ref().unwrap(); } 4 => { let len = starts.dims()[0]; axes = get(&node.input[3])?; default_steps = Some(Tensor::ones((len,), DType::I64, starts.device())?); steps = default_steps.as_ref().unwrap(); } 5 => { steps = get(&node.input[4])?; axes = get(&node.input[3])?; } _ => bail!( "Slice node is invalid, expected 3-5 inputs, got {}: {:?}", node.input.len(), node ), } let mut out = data.clone(); for (i, axis) in axes.to_vec1::<i64>()?.into_iter().enumerate() { // All negative elements of axes are made non-negative by // adding r to them, where r = rank(input). let axis = if axis < 0 { axis + data.rank() as i64 } else { axis } as usize; let data_dim = data.dims()[axis] as i64; let mut s = starts.get(i)?.to_scalar::<i64>()?; let mut e = ends.get(i)?.to_scalar::<i64>()?; // All negative values in starts[i] and ends[i] have // dims[axes[i]] added to them, where dims are the // dimensions of input. if s < 0 { s += data_dim; } if e < 0 { e += data_dim; } let p = steps.get(i)?.to_scalar::<i64>()?; // starts[i] is clamped into the range [0, dims[axes[i]]] // for positive stepping and [0, dims[axes[i]]-1] for // negative stepping. // for positive stepping ends[axes[i]] is clamped to // [0, dims[axes[i]]], while for negative stepping it is // clamped to [-1, dims[axes[i]]-1]. if p >= 0 { s = s.clamp(0, data_dim); e = e.clamp(0, data_dim); } else { s = s.clamp(0, data_dim - 1); e = e.clamp(-1, data_dim - 1); } let indexes = Tensor::arange_step(s, e, p, data.device())?; out = out.index_select(&indexes, axis)? } values.insert(node.output[0].clone(), out); } // https://onnx.ai/onnx/operators/onnx__ReduceMax.html#reducemax "ReduceMax" => { let input = get(&node.input[0])?; let axes = get_opt(1); let keepdims = get_attr_opt::<i64>(node, "keepdims")?.copied().unwrap_or(1) == 1; let axes = if let Some(Ok(axes)) = axes { // Satisfies version 18+ axes.to_vec1::<i64>().ok() } else if let Ok(Some(axes)) = get_attr_opt::<[i64]>(node, "axes") { // Backward compatiblity with version 13 and below Some(axes.to_vec()) } else { None }; let axes = if let Some(axes) = axes { let rank = input.rank(); let mut axes_set = HashSet::new(); let mut axes = axes .iter() .map(|a| { let axis = if *a < 0 { (rank as i64 + *a) as usize } else { *a as usize }; axes_set.insert(axis); axis }) .collect::<Vec<_>>(); if axes_set.len() < axes.len() { bail!("Duplicate value in 'axes'"); } if axes.len() > 1 { axes.sort(); } Some(axes) } else { None }; // TODO: Handle empty set // Definition: // "Reduction over an empty set of values yields minus infinity (if supported by the datatype) or the minimum value of the data type otherwise" // For now, this will throw an error if input.elem_count() == 0 { bail!("reduction over zero-size tensor not supported"); } let output = if let Some(axes) = axes { let mut result = input.clone(); for &axis in axes.iter().rev() { result = if keepdims { result.max_keepdim(axis)? } else { result.max(axis)? 
} } result } else { // If `axes` is empty and `noop_with_empty_axes` is set to `true (1)` // ""input tensor will not be reduced,and the output tensor would be equivalent to input tensor."" if get_attr_opt::<i64>(node, "noop_with_empty_axes")?.copied() == Some(1) { input.clone() } else { let mut result = input.flatten_all()?; if keepdims { result = result.max_keepdim(0)?; // If keepdims is true, reshape to match input dimensions let shape = vec![1; input.rank()]; result.reshape(shape)? } else { result.max(0)? } } }; values.insert(node.output[0].clone(), output); } // https://onnx.ai/onnx/operators/onnx__ReduceMean.html#reducemean-13 // TODO: This version is only compatible with ReduceMean V13 and below. "ReduceMean" => { let input = get(&node.input[0])?; let axes = get_attr_opt::<[i64]>(node, "axes")?; let keepdims = get_attr_opt::<i64>(node, "keepdims")?.copied().unwrap_or(1); let n_dims = input.dims().len(); let axes: Vec<usize> = if let Some(axes) = axes { axes.iter() .map(|e| (if e < &0 { (n_dims as i64) + *e } else { *e }) as usize) .collect() } else { (0..n_dims).collect() }; let output = if keepdims == 1 { input.mean_keepdim(axes)? } else { input.mean(axes)? }; values.insert(node.output[0].clone(), output); } // https://onnx.ai/onnx/operators/onnx__ReduceMin.html#reducemin "ReduceMin" => { let input = get(&node.input[0])?; let axes = get_opt(1); let keepdims = get_attr_opt::<i64>(node, "keepdims")?.copied().unwrap_or(1) == 1; let axes = if let Some(Ok(axes)) = axes { // Satisfies version 18+ axes.to_vec1::<i64>().ok() } else if let Ok(Some(axes)) = get_attr_opt::<[i64]>(node, "axes") { // Backward compatiblity with version 13 and below Some(axes.to_vec()) } else { None }; let axes = if let Some(axes) = axes { let rank = input.rank(); let mut axes_set = HashSet::new(); let mut axes = axes .iter() .map(|a| { let axis = if *a < 0 { (rank as i64 + *a) as usize } else { *a as usize }; axes_set.insert(axis); axis }) .collect::<Vec<_>>(); if axes_set.len() < axes.len() { bail!("Duplicate value in 'axes'"); } if axes.len() > 1 { axes.sort(); } Some(axes) } else { None }; // TODO: Handle empty set // Definition: // "Reduction over an empty set of values yields positive infinity (if supported by the datatype) or the max value of the data type otherwise" // For now, this will throw an error if input.elem_count() == 0 { bail!("reduction over zero-size tensor not supported"); } let output = if let Some(axes) = axes { let mut result = input.clone(); for &axis in axes.iter().rev() { result = if keepdims { result.min_keepdim(axis)? } else { result.min(axis)? } } result } else { // If `axes` is empty and `noop_with_empty_axes` is set to `true (1)` // ""input tensor will not be reduced,and the output tensor would be equivalent to input tensor."" if get_attr_opt::<i64>(node, "noop_with_empty_axes")?.copied() == Some(1) { input.clone() } else { let mut result = input.flatten_all()?; if keepdims { result = result.min_keepdim(0)?; // If keepdims is true, reshape to match input dimensions let shape = vec![1; input.rank()]; result.reshape(shape)? } else { result.min(0)? 
} } }; values.insert(node.output[0].clone(), output); } //https://github.com/onnx/onnx/blob/main/docs/Operators.md#Split // Version 18 impl "Split" => { let input_tensor = get(&node.input[0])?; let axis = get_attr_opt::<i64>(node, "axis")?.copied().unwrap_or(0); let axis = input_tensor.normalize_axis(axis)?; // Determine split sizes let splits = if node.input.len() > 1 { // If the split tensor is provided, use it to determine sizes let split_tensor = get(&node.input[1])?.to_vec1::<i64>()?; split_tensor.iter().map(|&x| x as usize).collect::<Vec<_>>() } else { let num_outputs = if let Some(&num_outputs_attrib) = get_attr_opt::<i64>(node, "num_outputs")? { num_outputs_attrib as usize } else { node.output.len() }; let input_dim = input_tensor.dim(axis)?; let mut split_sizes = vec![input_dim / num_outputs as usize; num_outputs as usize]; let remainder = input_dim % num_outputs as usize; if remainder > 0 { // If there's a remainder, add it to the last split size split_sizes[num_outputs as usize - 1] += remainder; } split_sizes }; // Perform the split operation let mut outputs = vec![]; let mut start = 0; for &size in &splits { let end = start + size; let slice = input_tensor.narrow(axis, start, size)?; outputs.push(slice); start = end; } // Insert the split outputs into the values map for (output, slice) in node.output.iter().zip(outputs.into_iter()) { values.insert(output.clone(), slice); } } //https://github.com/onnx/onnx/blob/main/docs/Operators.md#Expand // Version 13 impl "Expand" => { // unlike broadcast_to, expand allows for the output shape to // be different from the specified shape. let input_tensor = get(&node.input[0])?; let input_shape = get(&node.input[1])?; // Check that the shape tensor is 1D if input_shape.rank() != 1 { bail!( "Expand expects 'shape' input to be 1D tensor: {:?}", input_shape ); } let input_tensor_dims = input_tensor.dims(); let input_shape_dims = input_shape .to_vec1::<i64>()? .into_iter() .map(|x| x as usize) .collect::<Vec<_>>(); let target_shape = broadcast_shape(input_tensor_dims, input_shape_dims.as_slice())?; let expanded_tensor = input_tensor.broadcast_as(target_shape)?; values.insert(node.output[0].clone(), expanded_tensor); } //https://github.com/onnx/onnx/blob/main/docs/Operators.md#ReduceSum // Version 13 impl "ReduceSum" => { let input = get(&node.input[0])?; let axes = get_opt(1); let keepdims = get_attr_opt::<i64>(node, "keepdims")?.copied().unwrap_or(1); let noop_with_empty_axes = get_attr_opt::<i64>(node, "noop_with_empty_axes")? .copied() .unwrap_or(0); let axes = match axes { Some(Ok(axes)) => axes .to_vec1::<i64>()? .into_iter() .map(|x| x as usize) .collect::<Vec<_>>(), Some(Err(_)) | None => { if noop_with_empty_axes == 1 { vec![] } else { (0..input.rank()).collect() } } }; let output = if keepdims == 1 { input.sum_keepdim(axes)? } else { input.sum(axes)? }; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#ReduceL2 // Version 18 impl "ReduceL2" => { let input = get(&node.input[0])?; let axes = get_opt(1); let keepdims = get_attr_opt::<i64>(node, "keepdims")?.copied().unwrap_or(1); let noop_with_empty_axes = get_attr_opt::<i64>(node, "noop_with_empty_axes")? .copied() .unwrap_or(0); let input_sq = input.sqr()?; let axes = match axes { Some(axes) => axes? .to_vec1::<i64>()? 
                    .into_iter()
                    .map(|x| x as usize)
                    .collect::<Vec<_>>(),
                None => {
                    if noop_with_empty_axes == 1 {
                        vec![]
                    } else {
                        (0..input_sq.rank()).collect()
                    }
                }
            };
            let output = if keepdims == 1 {
                input_sq.sum_keepdim(axes)?.sqrt()?
            } else {
                input_sq.sum(axes)?.sqrt()?
            };
            values.insert(node.output[0].clone(), output);
        }
        random_type @ ("RandomUniform" | "RandomNormal") => {
            let dt: i64 = get_attr_opt(node, "dtype")?.copied().unwrap_or(1); // 1 is float type by default
            let dtype = match DataType::try_from(dt as i32) {
                Ok(dt) => match dtype(dt) {
                    Some(DType::U8 | DType::U32 | DType::I64) => {
                        bail!(
                            "unsupported 'dtype' value {dt:?}, only floats are allowed, for {random_type} {}",
                            node.name
                        )
                    }
                    Some(dt) => dt,
                    None => {
                        bail!(
                            "unsupported 'dtype' value {dt:?} for {random_type} {}",
                            node.name
                        )
                    }
                },
                Err(_) => {
                    bail!(
                        "unsupported 'dtype' value {dt:?} for {random_type} {}",
                        node.name
                    )
                }
            };
            let seed: Option<f32> = get_attr_opt(node, "seed")?.copied();
            if seed.is_some() {
                bail!("seed for {random_type} is currently not supported")
            };
            let shape: Vec<usize> = get_attr::<[i64]>(node, "shape")?
                .iter()
                .map(|x| *x as usize)
                .collect();
            let output = if random_type == "RandomUniform" {
                let low: f32 = get_attr_opt(node, "low")?.copied().unwrap_or(0.0);
                let high: f32 = get_attr_opt(node, "high")?.copied().unwrap_or(1.0);
                Tensor::rand(low, high, shape, &Device::Cpu)?.to_dtype(dtype)?
            } else {
                let mean: f32 = get_attr_opt(node, "mean")?.copied().unwrap_or(0.0);
                let scale: f32 = get_attr_opt(node, "scale")?.copied().unwrap_or(1.0);
                Tensor::randn(mean, scale, shape, &Device::Cpu)?.to_dtype(dtype)?
            };
            values.insert(node.output[0].clone(), output);
        }
        "ArgMin" => {
            let input = get(&node.input[0])?;
            let axis_i64: i64 = get_attr_opt(node, "axis")?.copied().unwrap_or(0);
            let rank_i64: i64 = input.rank().try_into().unwrap();
            if axis_i64 < -rank_i64 || axis_i64 >= rank_i64 {
                bail!(
                    "axis ({}) out of accepted range [-rank, rank-1] which was [{}, {}]",
                    axis_i64,
                    -rank_i64,
                    rank_i64 - 1
                )
            }
            let axis = input.normalize_axis(axis_i64)?;
            let keepdims: i64 = get_attr_opt(node, "keepdims")?.copied().unwrap_or(1);
            let select_last_index: i64 = get_attr_opt(node, "select_last_index")?
                .copied()
                .unwrap_or(0);
            if select_last_index == 1 {
                bail!("select_last_index for ArgMin is currently not supported")
            }
            let output = if keepdims == 1 {
                input.argmin_keepdim(axis)?
            } else {
                input.argmin(axis)?
            }
            .to_dtype(DType::I64)?;
            values.insert(node.output[0].clone(), output);
        }
        "ArgMax" => {
            let input = get(&node.input[0])?;
            let axis_i64: i64 = get_attr_opt(node, "axis")?.copied().unwrap_or(0);
            let rank_i64: i64 = input.rank().try_into().unwrap();
            if axis_i64 < -rank_i64 || axis_i64 >= rank_i64 {
                bail!(
                    "axis ({}) out of accepted range [-rank, rank-1] which was [{}, {}]",
                    axis_i64,
                    -rank_i64,
                    rank_i64 - 1
                )
            }
            let axis = input.normalize_axis(axis_i64)?;
            let keepdims: i64 = get_attr_opt(node, "keepdims")?.copied().unwrap_or(1);
            let select_last_index: i64 = get_attr_opt(node, "select_last_index")?
                .copied()
                .unwrap_or(0);
            if select_last_index == 1 {
                bail!("select_last_index for ArgMax is currently not supported")
            }
            let output = if keepdims == 1 {
                input.argmax_keepdim(axis)?
            } else {
                input.argmax(axis)?
} .to_dtype(DType::I64)?; values.insert(node.output[0].clone(), output); } "LeakyRelu" => { let input = get(&node.input[0])?; let dt = input.dtype(); match dt { DType::U8 | DType::U32 | DType::I64 => { bail!( "unsupported dtype {}, only float types are allowed for LeakyRelu", dt.as_str() ) } DType::BF16 | DType::F16 | DType::F32 | DType::F64 => {} } let alpha = get_attr_opt::<f32>(node, "alpha")?.copied().unwrap_or(0.01); let output = candle_nn::ops::leaky_relu(input, alpha.into())?; values.insert(node.output[0].clone(), output); } // https://github.com/onnx/onnx/blob/main/docs/Operators.md#Gemm "Gemm" => { let a = get(&node.input[0])?; let b = get(&node.input[1])?; let c = get(&node.input[2])?; let alpha = get_attr_opt::<f32>(node, "alpha")?.copied().unwrap_or(1.0); let beta = get_attr_opt::<f32>(node, "beta")?.copied().unwrap_or(1.0); let alpha = Tensor::full(alpha, a.shape(), &Device::Cpu)?; let beta = Tensor::full(beta, c.shape(), &Device::Cpu)?; let trans_a = get_attr_opt::<i64>(node, "transA")?.copied().unwrap_or(0); let trans_b = get_attr_opt::<i64>(node, "transB")?.copied().unwrap_or(0); let a = if trans_a == 0 { a.clone() } else { a.t()? }; let b = if trans_b == 0 { b.clone() } else { b.t()? }; let output = a .broadcast_mul(&alpha)? .broadcast_matmul(&b)? .broadcast_add(&c.broadcast_mul(&beta)?)?; values.insert(node.output[0].clone(), output); } "LSTM" => { let direction = get_attr_opt(node, "direction")?.unwrap_or("forward"); if direction != "forward" { bail!("LSTM currently only supports direction == \"forward\""); } let num_directions = if direction == "bidirectional" { 2 } else { 1 }; let hidden_size: i64 = get_attr(node, "hidden_size").copied()?; let input_forget = get_attr_opt(node, "input_forget")?.copied().unwrap_or(0); if input_forget != 0 { bail!("LSTM currently only supports input_forget == 0"); } let activations_default = vec![ "Sigmoid".to_string(), "Tanh".to_string(), "Tanh".to_string(), ]; let activations = get_attr_opt_owned::<Vec<String>>(node, "activations")? .unwrap_or(activations_default.clone()); if activations != activations_default { bail!("LSTM currently only supports default activations ({activations_default:?})"); } // activation_alpha and activation_beta don't apply to (Sigmoid, Tanh, Tanh) so ignoring them is okay if get_attr_opt::<f32>(node, "clip")?.is_some() { bail!("LSTM does not currently support clip attribute"); } // The shape format of inputs X, initial_h and outputs Y, Y_h. // If 0, the following shapes are expected: // X.shape = [seq_length, batch_size, input_size], // Y.shape = [seq_length, num_directions, batch_size, hidden_size], // initial_h.shape = Y_h.shape = [num_directions, batch_size, hidden_size]. // If 1, the following shapes are expected: // X.shape = [batch_size, seq_length, input_size], // Y.shape = [batch_size, seq_length, num_directions, hidden_size], // initial_h.shape = Y_h.shape = [batch_size, num_directions, hidden_size]. let layout = get_attr_opt(node, "layout")?.copied().unwrap_or(0); if layout != 0 { bail!("LSTM currently only supports layout == 0"); } // The input sequences packed (and potentially padded) into one 3-D tensor // with the shape of `[seq_length, batch_size, input_size]`. let x = get(&node.input[0])?; // XXX: depends on layout let (seq_length, batch_size, input_size) = x.dims3()?; // The weight tensor for the gates. // Concatenation of `W[iofc]` and `WB[iofc]` (if bidirectional) along dimension 0. // The tensor has shape `[num_directions, 4*hidden_size, input_size]`. 
let w = get(&node.input[1])?; // The recurrence weight tensor. // Concatenation of `R[iofc]` and `RB[iofc]` (if bidirectional) along dimension 0. // This tensor has shape `[num_directions, 4*hidden_size, hidden_size]`. let r = get(&node.input[2])?; // The bias tensor for input gate. // Concatenation of `[Wb[iofc], Rb[iofc]]`, and `[WBb[iofc], RBb[iofc]]` (if bidirectional) along dimension 0. // This tensor has shape `[num_directions, 8*hidden_size]`. // Optional: If not specified - assumed to be 0. let b_default: Tensor; let b = match get_opt(3) { Some(n) => n?, None => { b_default = Tensor::zeros( (num_directions, 8 * hidden_size as usize), DType::F32, x.device(), )?; &b_default } }; // Optional tensor specifying lengths of the sequences in a batch. // If not specified - assumed all sequences in the batch to have length `seq_length`. // It has shape `[batch_size]`. let seq_lens_default: Tensor; let seq_lens = match get_opt(4) { Some(n) => n?, None => { seq_lens_default = Tensor::full(seq_length as i64, (batch_size,), x.device())?; &seq_lens_default } }; let seq_lens_is_default = (seq_lens.to_vec1::<i64>()?.iter()).all(|e| *e as usize == seq_length); if !seq_lens_is_default { bail!("LSTM currently only supports default value of seq_lens"); } // Optional initial value of the hidden. If not specified - assumed to be 0. // It has shape `[num_directions, batch_size, hidden_size]`. let initial_h_default: Tensor; let initial_h = match get_opt(5) { Some(n) => n?, _ => { initial_h_default = Tensor::zeros( (num_directions, batch_size, hidden_size as usize), DType::F32, x.device(), )?; &initial_h_default } }; // Optional initial value of the cell. // If not specified - assumed to be 0. // It has shape `[num_directions, batch_size, hidden_size]`. let initial_c_default: Tensor; let initial_c = match node.input.get(6) { Some(n) if !n.is_empty() => get(n)?, _ => { initial_c_default = Tensor::zeros( (num_directions, batch_size, hidden_size as usize), DType::F32, x.device(), )?; &initial_c_default } }; // The weight tensor for peepholes. // Concatenation of `P[iof]` and `PB[iof]` (if bidirectional) along dimension 0. // It has shape `[num_directions, 3*hidde_size]`. Optional: If not specified - assumed to be 0. let p_default = Tensor::zeros( (num_directions, 3 * hidden_size as usize), DType::F32, x.device(), )?; let p = get_opt(7).unwrap_or(Ok(&p_default))?; let p_is_zeros = (p.to_vec2::<f32>()?.iter()).all(|v| v.iter().all(|e| *e == 0.0)); if !p_is_zeros { bail!( "LSTM currently only supports default value of p (a Tensor of all zeroes)" ); } // these all have [num_directions, ...] 
shapes let w = w.get(0)?; // w[iofc] has shape [4*hidden_size, input_size] let r = r.get(0)?; // r[iofc] has shape [4*hidden_size, hidden_size] let b = b.get(0)?; // concat of [wb[iofc],rb[iofc]] has shape [8*hidden_size] let idx_wb = Tensor::arange(0, 4 * hidden_size, x.device())?; let idx_rb = Tensor::arange(4 * hidden_size, 8 * hidden_size, x.device())?; let wb = b.index_select(&idx_wb, 0)?; let rb = b.index_select(&idx_rb, 0)?; let c = initial_c.get(0)?; let h = initial_h.get(0)?; // w, r, wb, rb are all iofc but lstm expects ifco // so we need to move some stuff around let idx_i = Tensor::arange(0, hidden_size, x.device())?; let idx_o = Tensor::arange(hidden_size, 2 * hidden_size, x.device())?; let idx_f = Tensor::arange(2 * hidden_size, 3 * hidden_size, x.device())?; let idx_c = Tensor::arange(3 * hidden_size, 4 * hidden_size, x.device())?; let idx_ifco = Tensor::cat(&[&idx_i, &idx_f, &idx_c, &idx_o], 0)?; let w = w.index_select(&idx_ifco, 0)?; let r = r.index_select(&idx_ifco, 0)?; let wb = wb.index_select(&idx_ifco, 0)?; let rb = rb.index_select(&idx_ifco, 0)?; let vmap = candle_nn::VarMap::new(); vmap.data().lock().unwrap().extend([ ("weight_ih_l0".to_string(), candle::Var::from_tensor(&w)?), ("weight_hh_l0".to_string(), candle::Var::from_tensor(&r)?), ("bias_ih_l0".to_string(), candle::Var::from_tensor(&wb)?), ("bias_hh_l0".to_string(), candle::Var::from_tensor(&rb)?), ]); use candle_nn::rnn::RNN as _; let lstm = candle_nn::rnn::lstm( input_size, hidden_size as usize, candle_nn::rnn::LSTMConfig::default(), candle_nn::VarBuilder::from_varmap(&vmap, w.dtype(), w.device()), )?; let mut lstm_state = candle_nn::rnn::LSTMState::new(h, c); let mut h_acc = if node.output.first().map(String::as_str).unwrap_or("") != "" { Some(vec![]) } else { None }; for t in 0..seq_length { let x = x.get(t)?; lstm_state = lstm.step(&x, &lstm_state)?; if let Some(h_acc) = &mut h_acc { h_acc.push(lstm_state.clone()); } } assert_eq!(num_directions, 1, "if support for bidirectional is ever added, outputs will have to be concatenated, not simply reshaped"); if let Some(name) = node.output.first() { let h_acc = h_acc.as_ref().unwrap(); let h_acc = lstm.states_to_tensor(h_acc)?; let h_acc = h_acc.reshape(( seq_length, num_directions, batch_size, hidden_size as usize, ))?; values.insert(name.clone(), h_acc); } if let Some(name) = node.output.get(1) { values.insert( name.clone(), lstm_state.h().reshape(( num_directions, batch_size, hidden_size as usize, ))?, ); } if let Some(name) = node.output.get(2) { values.insert( name.clone(), lstm_state.c().reshape(( num_directions, batch_size, hidden_size as usize, ))?, ); } } // https://onnx.ai/onnx/operators/onnx__Xor.html "Xor" => { // Since we don't have a `DType::Bool` yet, this ensures that we are working with `0`(False) & `1`(True) let a = get(&node.input[0])?.gt(0_u8)?; let b = get(&node.input[1])?.gt(0_u8)?; let out = a.broadcast_add(&b)?.eq(1_u8)?; values.insert(node.output[0].clone(), out); } // https://onnx.ai/onnx/operators/onnx__Sign.html "Sign" => { let input = get(&node.input[0])?; let output = input.sign()?; values.insert(node.output[0].clone(), output); } op_type => bail!("unsupported op_type {op_type} for op {node:?}"), } } graph .output .iter() .map(|output| match values.remove(&output.name) { None => bail!("cannot find output {}", output.name), Some(value) => Ok((output.name.clone(), value)), }) .collect() } fn broadcast_shape(shape_a: &[usize], shape_b: &[usize]) -> Result<Vec<usize>> { let (longest, shortest) = if shape_a.len() > shape_b.len() { 
(shape_a, shape_b) } else { (shape_b, shape_a) }; let diff = longest.len() - shortest.len(); let mut target_shape = longest[0..diff].to_vec(); for (dim1, dim2) in longest[diff..].iter().zip(shortest.iter()) { if *dim1 == *dim2 || *dim2 == 1 || *dim1 == 1 { target_shape.push(usize::max(*dim1, *dim2)); } else { bail!( "Expand: incompatible shapes for broadcast, {:?} and {:?}", shape_a, shape_b ); } } Ok(target_shape) } fn broadcast_shape_from_many(shapes: &[&[usize]]) -> Result<Vec<usize>> { if shapes.is_empty() { return Ok(Vec::new()); } let mut shape_out = shapes[0].to_vec(); for shape in shapes[1..].iter() { shape_out = broadcast_shape(&shape_out, shape)?; } Ok(shape_out) }
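The `broadcast_shape` and `broadcast_shape_from_many` helpers above implement ONNX's NumPy-style multidirectional broadcasting: shapes are right-aligned, each dimension pair must either match or contain a 1, and the larger of the two is kept. The sketch below is a standalone illustration of that rule, not crate code; the function name `broadcast_dims` and the `main` driver are made up for the example.

```rust
// Standalone sketch of the broadcasting rule used by `broadcast_shape` above.
// `broadcast_dims` is an illustrative name, not part of candle-onnx.
fn broadcast_dims(a: &[usize], b: &[usize]) -> Option<Vec<usize>> {
    let (longest, shortest) = if a.len() >= b.len() { (a, b) } else { (b, a) };
    let diff = longest.len() - shortest.len();
    // Leading dimensions of the longer shape are carried over unchanged.
    let mut out = longest[..diff].to_vec();
    for (&d1, &d2) in longest[diff..].iter().zip(shortest.iter()) {
        if d1 == d2 || d1 == 1 || d2 == 1 {
            out.push(d1.max(d2));
        } else {
            return None; // incompatible shapes; `broadcast_shape` bails here
        }
    }
    Some(out)
}

fn main() {
    // [3, 1, 5] broadcast against [4, 1] right-aligns to give [3, 4, 5].
    assert_eq!(broadcast_dims(&[3, 1, 5], &[4, 1]), Some(vec![3, 4, 5]));
    // [2, 3] and [3, 2] share no compatible trailing dimensions.
    assert_eq!(broadcast_dims(&[2, 3], &[3, 2]), None);
}
```

`broadcast_shape_from_many` simply folds this pairwise rule over a list of shapes.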
2
0
hf_public_repos/candle/candle-onnx
hf_public_repos/candle/candle-onnx/src/lib.rs
use candle::Result; use prost::Message; pub mod onnx { include!(concat!(env!("OUT_DIR"), "/onnx.rs")); } pub mod eval; pub use eval::{dtype, simple_eval}; pub fn read_file<P: AsRef<std::path::Path>>(p: P) -> Result<onnx::ModelProto> { let buf = std::fs::read(p)?; onnx::ModelProto::decode(buf.as_slice()).map_err(candle::Error::wrap) }
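This file is the crate's whole public surface: `read_file` decodes an ONNX protobuf into a `ModelProto`, and the re-exported `simple_eval` runs its graph against named inputs, returning a map from graph output names to tensors. A minimal usage sketch follows; the file name `model.onnx`, the input name `input`, and the input shape are placeholders that depend on the model actually being loaded.

```rust
// Hypothetical driver for the API above; adjust the path, input name and
// input shape to whatever the loaded ONNX graph expects.
use std::collections::HashMap;

use candle::{Device, Tensor};

fn main() -> candle::Result<()> {
    // Decode the protobuf into an onnx::ModelProto.
    let model = candle_onnx::read_file("model.onnx")?;

    // Bind graph inputs by name, then evaluate on CPU.
    let mut inputs = HashMap::new();
    inputs.insert(
        "input".to_string(),
        Tensor::from_vec(vec![1.0f32, 2.0, 3.0, 4.0], &[1, 4], &Device::Cpu)?,
    );
    let outputs = candle_onnx::simple_eval(&model, inputs)?;

    // `simple_eval` returns a map from graph output names to tensors.
    for (name, tensor) in outputs.iter() {
        println!("{name}: {:?}", tensor.shape());
    }
    Ok(())
}
```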
3
0
hf_public_repos/candle/candle-onnx
hf_public_repos/candle/candle-onnx/src/onnx.proto3
// // WARNING: This file is automatically generated! Please edit onnx.in.proto. // // SPDX-License-Identifier: Apache-2.0 syntax = "proto3"; package onnx; // Overview // // ONNX is an open specification that is comprised of the following components: // // 1) A definition of an extensible computation graph model. // 2) Definitions of standard data types. // 3) Definitions of built-in operators. // // This document describes the syntax of models and their computation graphs, // as well as the standard data types. Together, they are referred to as the ONNX // Intermediate Representation, or 'IR' for short. // // The normative semantic specification of the ONNX IR is found in docs/IR.md. // Definitions of the built-in neural network operators may be found in docs/Operators.md. // Notes // // Protobuf compatibility // // To simplify framework compatibility, ONNX is defined using the subset of protobuf // that is compatible with both protobuf v2 and v3. This means that we do not use any // protobuf features that are only available in one of the two versions. // // Here are the most notable contortions we have to carry out to work around // these limitations: // // - No 'map' (added protobuf 3.0). We instead represent mappings as lists // of key-value pairs, where order does not matter and duplicates // are not allowed. // Versioning // // ONNX versioning is specified in docs/IR.md and elaborated on in docs/Versioning.md // // To be compatible with both proto2 and proto3, we will use a version number // that is not defined by the default value but an explicit enum number. enum Version { // proto3 requires the first enum value to be zero. // We add this just to appease the compiler. _START_VERSION = 0; // The version field is always serialized and we will use it to store the // version that the graph is generated from. This helps us set up version // control. // For the IR, we are using simple numbers starting with 0x00000001, // which was the version we published on Oct 10, 2017. IR_VERSION_2017_10_10 = 0x0000000000000001; // IR_VERSION 2 published on Oct 30, 2017 // - Added type discriminator to AttributeProto to support proto3 users IR_VERSION_2017_10_30 = 0x0000000000000002; // IR VERSION 3 published on Nov 3, 2017 // - For operator versioning: // - Added new message OperatorSetIdProto // - Added opset_import in ModelProto // - For vendor extensions, added domain in NodeProto IR_VERSION_2017_11_3 = 0x0000000000000003; // IR VERSION 4 published on Jan 22, 2019 // - Relax constraint that initializers should be a subset of graph inputs // - Add type BFLOAT16 IR_VERSION_2019_1_22 = 0x0000000000000004; // IR VERSION 5 published on March 18, 2019 // - Add message TensorAnnotation. // - Add quantization annotation in GraphProto to map tensor with its scale and zero point quantization parameters. IR_VERSION_2019_3_18 = 0x0000000000000005; // IR VERSION 6 published on Sep 19, 2019 // - Add support for sparse tensor constants stored in model. // - Add message SparseTensorProto // - Add sparse initializers IR_VERSION_2019_9_19 = 0x0000000000000006; // IR VERSION 7 published on May 8, 2020 // - Add support to allow function body graph to rely on multiple external opreator sets. // - Add a list to promote inference graph's initializers to global and // mutable variables. Global variables are visible in all graphs of the // stored models. // - Add message TrainingInfoProto to store initialization // method and training algorithm. 
The execution of TrainingInfoProto // can modify the values of mutable variables. // - Implicitly add inference graph into each TrainingInfoProto's algorithm. IR_VERSION_2020_5_8 = 0x0000000000000007; // IR VERSION 8 published on July 30, 2021 // Introduce TypeProto.SparseTensor // Introduce TypeProto.Optional // Added a list of FunctionProtos local to the model // Deprecated since_version and operator status from FunctionProto IR_VERSION_2021_7_30 = 0x0000000000000008; // IR VERSION 9 published on May 5, 2023 // Added AttributeProto to FunctionProto so that default attribute values can be set. // Added FLOAT8E4M3FN, FLOAT8E4M3FNUZ, FLOAT8E5M2, FLOAT8E5M2FNUZ. IR_VERSION = 0x0000000000000009; } // Attributes // // A named attribute containing either singular float, integer, string, graph, // and tensor values, or repeated float, integer, string, graph, and tensor values. // An AttributeProto MUST contain the name field, and *only one* of the // following content fields, effectively enforcing a C/C++ union equivalent. message AttributeProto { reserved 12, 16 to 19; reserved "v"; // Note: this enum is structurally identical to the OpSchema::AttrType // enum defined in schema.h. If you rev one, you likely need to rev the other. enum AttributeType { UNDEFINED = 0; FLOAT = 1; INT = 2; STRING = 3; TENSOR = 4; GRAPH = 5; SPARSE_TENSOR = 11; TYPE_PROTO = 13; FLOATS = 6; INTS = 7; STRINGS = 8; TENSORS = 9; GRAPHS = 10; SPARSE_TENSORS = 12; TYPE_PROTOS = 14; } // The name field MUST be present for this version of the IR. string name = 1; // namespace Attribute // if ref_attr_name is not empty, ref_attr_name is the attribute name in parent function. // In this case, this AttributeProto does not contain data, and it's a reference of attribute // in parent scope. // NOTE: This should ONLY be used in function (sub-graph). It's invalid to be used in main graph. string ref_attr_name = 21; // A human-readable documentation for this attribute. Markdown is allowed. string doc_string = 13; // The type field MUST be present for this version of the IR. // For 0.0.1 versions of the IR, this field was not defined, and // implementations needed to use has_field heuristics to determine // which value field was in use. For IR_VERSION 0.0.2 or later, this // field MUST be set and match the f|i|s|t|... field in use. This // change was made to accommodate proto3 implementations. AttributeType type = 20; // discriminator that indicates which field below is in use // Exactly ONE of the following fields must be present for this version of the IR float f = 2; // float int64 i = 3; // int bytes s = 4; // UTF-8 string TensorProto t = 5; // tensor value GraphProto g = 6; // graph SparseTensorProto sparse_tensor = 22; // sparse tensor value // Do not use field below, it's deprecated. // optional ValueProto v = 12; // value - subsumes everything but graph TypeProto tp = 14; // type proto repeated float floats = 7; // list of floats repeated int64 ints = 8; // list of ints repeated bytes strings = 9; // list of UTF-8 strings repeated TensorProto tensors = 10; // list of tensors repeated GraphProto graphs = 11; // list of graph repeated SparseTensorProto sparse_tensors = 23; // list of sparse tensors repeated TypeProto type_protos = 15;// list of type protos } // Defines information on value, including the name, the type, and // the shape of the value. message ValueInfoProto { // This field MUST be present in this version of the IR. 
string name = 1; // namespace Value // This field MUST be present in this version of the IR for // inputs and outputs of the top-level graph. TypeProto type = 2; // A human-readable documentation for this value. Markdown is allowed. string doc_string = 3; } // Nodes // // Computation graphs are made up of a DAG of nodes, which represent what is // commonly called a "layer" or "pipeline stage" in machine learning frameworks. // // For example, it can be a node of type "Conv" that takes in an image, a filter // tensor and a bias tensor, and produces the convolved output. message NodeProto { repeated string input = 1; // namespace Value repeated string output = 2; // namespace Value // An optional identifier for this node in a graph. // This field MAY be absent in ths version of the IR. string name = 3; // namespace Node // The symbolic identifier of the Operator to execute. string op_type = 4; // namespace Operator // The domain of the OperatorSet that specifies the operator named by op_type. string domain = 7; // namespace Domain // Additional named attributes. repeated AttributeProto attribute = 5; // A human-readable documentation for this node. Markdown is allowed. string doc_string = 6; } // Training information // TrainingInfoProto stores information for training a model. // In particular, this defines two functionalities: an initialization-step // and a training-algorithm-step. Initialization resets the model // back to its original state as if no training has been performed. // Training algorithm improves the model based on input data. // // The semantics of the initialization-step is that the initializers // in ModelProto.graph and in TrainingInfoProto.algorithm are first // initialized as specified by the initializers in the graph, and then // updated by the "initialization_binding" in every instance in // ModelProto.training_info. // // The field "algorithm" defines a computation graph which represents a // training algorithm's step. After the execution of a // TrainingInfoProto.algorithm, the initializers specified by "update_binding" // may be immediately updated. If the targeted training algorithm contains // consecutive update steps (such as block coordinate descent methods), // the user needs to create a TrainingInfoProto for each step. message TrainingInfoProto { // This field describes a graph to compute the initial tensors // upon starting the training process. Initialization graph has no input // and can have multiple outputs. Usually, trainable tensors in neural // networks are randomly initialized. To achieve that, for each tensor, // the user can put a random number operator such as RandomNormal or // RandomUniform in TrainingInfoProto.initialization.node and assign its // random output to the specific tensor using "initialization_binding". // This graph can also set the initializers in "algorithm" in the same // TrainingInfoProto; a use case is resetting the number of training // iteration to zero. // // By default, this field is an empty graph and its evaluation does not // produce any output. Thus, no initializer would be changed by default. GraphProto initialization = 1; // This field represents a training algorithm step. Given required inputs, // it computes outputs to update initializers in its own or inference graph's // initializer lists. In general, this field contains loss node, gradient node, // optimizer node, increment of iteration count. 
// // An execution of the training algorithm step is performed by executing the // graph obtained by combining the inference graph (namely "ModelProto.graph") // and the "algorithm" graph. That is, the actual // input/initializer/output/node/value_info/sparse_initializer list of // the training graph is the concatenation of // "ModelProto.graph.input/initializer/output/node/value_info/sparse_initializer" // and "algorithm.input/initializer/output/node/value_info/sparse_initializer" // in that order. This combined graph must satisfy the normal ONNX conditions. // Now, let's provide a visualization of graph combination for clarity. // Let the inference graph (i.e., "ModelProto.graph") be // tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d // and the "algorithm" graph be // tensor_d -> Add -> tensor_e // The combination process results // tensor_a, tensor_b -> MatMul -> tensor_c -> Sigmoid -> tensor_d -> Add -> tensor_e // // Notice that an input of a node in the "algorithm" graph may reference the // output of a node in the inference graph (but not the other way round). Also, inference // node cannot reference inputs of "algorithm". With these restrictions, inference graph // can always be run independently without training information. // // By default, this field is an empty graph and its evaluation does not // produce any output. Evaluating the default training step never // update any initializers. GraphProto algorithm = 2; // This field specifies the bindings from the outputs of "initialization" to // some initializers in "ModelProto.graph.initializer" and // the "algorithm.initializer" in the same TrainingInfoProto. // See "update_binding" below for details. // // By default, this field is empty and no initializer would be changed // by the execution of "initialization". repeated StringStringEntryProto initialization_binding = 3; // Gradient-based training is usually an iterative procedure. In one gradient // descent iteration, we apply // // x = x - r * g // // where "x" is the optimized tensor, "r" stands for learning rate, and "g" is // gradient of "x" with respect to a chosen loss. To avoid adding assignments // into the training graph, we split the update equation into // // y = x - r * g // x = y // // The user needs to save "y = x - r * g" into TrainingInfoProto.algorithm. To // tell that "y" should be assigned to "x", the field "update_binding" may // contain a key-value pair of strings, "x" (key of StringStringEntryProto) // and "y" (value of StringStringEntryProto). // For a neural network with multiple trainable (mutable) tensors, there can // be multiple key-value pairs in "update_binding". // // The initializers appears as keys in "update_binding" are considered // mutable variables. This implies some behaviors // as described below. // // 1. We have only unique keys in all "update_binding"s so that two // variables may not have the same name. This ensures that one // variable is assigned up to once. // 2. The keys must appear in names of "ModelProto.graph.initializer" or // "TrainingInfoProto.algorithm.initializer". // 3. The values must be output names of "algorithm" or "ModelProto.graph.output". // 4. Mutable variables are initialized to the value specified by the // corresponding initializer, and then potentially updated by // "initializer_binding"s and "update_binding"s in "TrainingInfoProto"s. 
// // This field usually contains names of trainable tensors // (in ModelProto.graph), optimizer states such as momentums in advanced // stochastic gradient methods (in TrainingInfoProto.graph), // and number of training iterations (in TrainingInfoProto.graph). // // By default, this field is empty and no initializer would be changed // by the execution of "algorithm". repeated StringStringEntryProto update_binding = 4; } // Models // // ModelProto is a top-level file/container format for bundling a ML model and // associating its computation graph with metadata. // // The semantics of the model are described by the associated GraphProto's. message ModelProto { // The version of the IR this model targets. See Version enum above. // This field MUST be present. int64 ir_version = 1; // The OperatorSets this model relies on. // All ModelProtos MUST have at least one entry that // specifies which version of the ONNX OperatorSet is // being imported. // // All nodes in the ModelProto's graph will bind against the operator // with the same-domain/same-op_type operator with the HIGHEST version // in the referenced operator sets. repeated OperatorSetIdProto opset_import = 8; // The name of the framework or tool used to generate this model. // This field SHOULD be present to indicate which implementation/tool/framework // emitted the model. string producer_name = 2; // The version of the framework or tool used to generate this model. // This field SHOULD be present to indicate which implementation/tool/framework // emitted the model. string producer_version = 3; // Domain name of the model. // We use reverse domain names as name space indicators. For example: // `com.facebook.fair` or `com.microsoft.cognitiveservices` // // Together with `model_version` and GraphProto.name, this forms the unique identity of // the graph. string domain = 4; // The version of the graph encoded. See Version enum below. int64 model_version = 5; // A human-readable documentation for this model. Markdown is allowed. string doc_string = 6; // The parameterized graph that is evaluated to execute the model. GraphProto graph = 7; // Named metadata values; keys should be distinct. repeated StringStringEntryProto metadata_props = 14; // Training-specific information. Sequentially executing all stored // `TrainingInfoProto.algorithm`s and assigning their outputs following // the corresponding `TrainingInfoProto.update_binding`s is one training // iteration. Similarly, to initialize the model // (as if training hasn't happened), the user should sequentially execute // all stored `TrainingInfoProto.initialization`s and assigns their outputs // using `TrainingInfoProto.initialization_binding`s. // // If this field is empty, the training behavior of the model is undefined. repeated TrainingInfoProto training_info = 20; // A list of function protos local to the model. // // Name of the function "FunctionProto.name" should be unique within the domain "FunctionProto.domain". // In case of any conflicts the behavior (whether the model local functions are given higher priority, // or standard operator sets are given higher priotity or this is treated as error) is defined by // the runtimes. // // The operator sets imported by FunctionProto should be compatible with the ones // imported by ModelProto and other model local FunctionProtos. 
// Example, if same operator set say 'A' is imported by a FunctionProto and ModelProto // or by 2 FunctionProtos then versions for the operator set may be different but, // the operator schema returned for op_type, domain, version combination // for both the versions should be same for every node in the function body. // // One FunctionProto can reference other FunctionProto in the model, however, recursive reference // is not allowed. repeated FunctionProto functions = 25; }; // StringStringEntryProto follows the pattern for cross-proto-version maps. // See https://developers.google.com/protocol-buffers/docs/proto3#maps message StringStringEntryProto { string key = 1; string value = 2; }; message TensorAnnotation { string tensor_name = 1; // <key, value> pairs to annotate tensor specified by <tensor_name> above. // The keys used in the mapping below must be pre-defined in ONNX spec. // For example, for 8-bit linear quantization case, 'SCALE_TENSOR', 'ZERO_POINT_TENSOR' will be pre-defined as // quantization parameter keys. repeated StringStringEntryProto quant_parameter_tensor_names = 2; } // Graphs // // A graph defines the computational logic of a model and is comprised of a parameterized // list of nodes that form a directed acyclic graph based on their inputs and outputs. // This is the equivalent of the "network" or "graph" in many deep learning // frameworks. message GraphProto { // The nodes in the graph, sorted topologically. repeated NodeProto node = 1; // The name of the graph. string name = 2; // namespace Graph // A list of named tensor values, used to specify constant inputs of the graph. // Each initializer (both TensorProto as well SparseTensorProto) MUST have a name. // The name MUST be unique across both initializer and sparse_initializer, // but the name MAY also appear in the input list. repeated TensorProto initializer = 5; // Initializers (see above) stored in sparse format. repeated SparseTensorProto sparse_initializer = 15; // A human-readable documentation for this graph. Markdown is allowed. string doc_string = 10; // The inputs and outputs of the graph. repeated ValueInfoProto input = 11; repeated ValueInfoProto output = 12; // Information for the values in the graph. The ValueInfoProto.name's // must be distinct. It is optional for a value to appear in value_info list. repeated ValueInfoProto value_info = 13; // This field carries information to indicate the mapping among a tensor and its // quantization parameter tensors. For example: // For tensor 'a', it may have {'SCALE_TENSOR', 'a_scale'} and {'ZERO_POINT_TENSOR', 'a_zero_point'} annotated, // which means, tensor 'a_scale' and tensor 'a_zero_point' are scale and zero point of tensor 'a' in the model. repeated TensorAnnotation quantization_annotation = 14; reserved 3, 4, 6 to 9; reserved "ir_version", "producer_version", "producer_tag", "domain"; } // Tensors // // A serialized tensor value. message TensorProto { enum DataType { UNDEFINED = 0; // Basic types. FLOAT = 1; // float UINT8 = 2; // uint8_t INT8 = 3; // int8_t UINT16 = 4; // uint16_t INT16 = 5; // int16_t INT32 = 6; // int32_t INT64 = 7; // int64_t STRING = 8; // string BOOL = 9; // bool // IEEE754 half-precision floating-point format (16 bits wide). // This format has 1 sign bit, 5 exponent bits, and 10 mantissa bits. 
FLOAT16 = 10; DOUBLE = 11; UINT32 = 12; UINT64 = 13; COMPLEX64 = 14; // complex with float32 real and imaginary components COMPLEX128 = 15; // complex with float64 real and imaginary components // Non-IEEE floating-point format based on IEEE754 single-precision // floating-point number truncated to 16 bits. // This format has 1 sign bit, 8 exponent bits, and 7 mantissa bits. BFLOAT16 = 16; // Non-IEEE floating-point format based on papers // FP8 Formats for Deep Learning, https://arxiv.org/abs/2209.05433, // 8-bit Numerical Formats For Deep Neural Networks, https://arxiv.org/pdf/2206.02915.pdf. // Operators supported FP8 are Cast, CastLike, QuantizeLinear, DequantizeLinear. // The computation usually happens inside a block quantize / dequantize // fused by the runtime. FLOAT8E4M3FN = 17; // float 8, mostly used for coefficients, supports nan, not inf FLOAT8E4M3FNUZ = 18; // float 8, mostly used for coefficients, supports nan, not inf, no negative zero FLOAT8E5M2 = 19; // follows IEEE 754, supports nan, inf, mostly used for gradients FLOAT8E5M2FNUZ = 20; // follows IEEE 754, supports nan, inf, mostly used for gradients, no negative zero // Future extensions go here. } // The shape of the tensor. repeated int64 dims = 1; // The data type of the tensor. // This field MUST have a valid TensorProto.DataType value int32 data_type = 2; // For very large tensors, we may want to store them in chunks, in which // case the following fields will specify the segment that is stored in // the current TensorProto. message Segment { int64 begin = 1; int64 end = 2; } Segment segment = 3; // Tensor content must be organized in row-major order. // // Depending on the data_type field, exactly one of the fields below with // name ending in _data is used to store the elements of the tensor. // For float and complex64 values // Complex64 tensors are encoded as a single array of floats, // with the real components appearing in odd numbered positions, // and the corresponding imaginary component appearing in the // subsequent even numbered position. (e.g., [1.0 + 2.0i, 3.0 + 4.0i] // is encoded as [1.0, 2.0 ,3.0 ,4.0] // When this field is present, the data_type field MUST be FLOAT or COMPLEX64. repeated float float_data = 4 [packed = true]; // For int32, uint8, int8, uint16, int16, bool, float8, and float16 values // float16 and float8 values must be bit-wise converted to an uint16_t prior // to writing to the buffer. // When this field is present, the data_type field MUST be // INT32, INT16, INT8, UINT16, UINT8, BOOL, FLOAT16, BFLOAT16, FLOAT8E4M3FN, FLOAT8E4M3FNUZ, FLOAT8E5M2, FLOAT8E5M2FNUZ repeated int32 int32_data = 5 [packed = true]; // For strings. // Each element of string_data is a UTF-8 encoded Unicode // string. No trailing null, no leading BOM. The protobuf "string" // scalar type is not used to match ML community conventions. // When this field is present, the data_type field MUST be STRING repeated bytes string_data = 6; // For int64. // When this field is present, the data_type field MUST be INT64 repeated int64 int64_data = 7 [packed = true]; // Optionally, a name for the tensor. string name = 8; // namespace Value // A human-readable documentation for this tensor. Markdown is allowed. string doc_string = 12; // Serializations can either use one of the fields above, or use this // raw bytes field. The only exception is the string case, where one is // required to store the content in the repeated bytes string_data field. 
// // When this raw_data field is used to store tensor value, elements MUST // be stored in as fixed-width, little-endian order. // Floating-point data types MUST be stored in IEEE 754 format. // Complex64 elements must be written as two consecutive FLOAT values, real component first. // Complex128 elements must be written as two consecutive DOUBLE values, real component first. // Boolean type MUST be written one byte per tensor element (00000001 for true, 00000000 for false). // // Note: the advantage of specific field rather than the raw_data field is // that in some cases (e.g. int data), protobuf does a better packing via // variable length storage, and may lead to smaller binary footprint. // When this field is present, the data_type field MUST NOT be STRING or UNDEFINED bytes raw_data = 9; // Data can be stored inside the protobuf file using type-specific fields or raw_data. // Alternatively, raw bytes data can be stored in an external file, using the external_data field. // external_data stores key-value pairs describing data location. Recognized keys are: // - "location" (required) - POSIX filesystem path relative to the directory where the ONNX // protobuf model was stored // - "offset" (optional) - position of byte at which stored data begins. Integer stored as string. // Offset values SHOULD be multiples 4096 (page size) to enable mmap support. // - "length" (optional) - number of bytes containing data. Integer stored as string. // - "checksum" (optional) - SHA1 digest of file specified in under 'location' key. repeated StringStringEntryProto external_data = 13; // Location of the data for this tensor. MUST be one of: // - DEFAULT - data stored inside the protobuf message. Data is stored in raw_data (if set) otherwise in type-specified field. // - EXTERNAL - data stored in an external location as described by external_data field. enum DataLocation { DEFAULT = 0; EXTERNAL = 1; } // If value not set, data is stored in raw_data (if set) otherwise in type-specified field. DataLocation data_location = 14; // For double // Complex128 tensors are encoded as a single array of doubles, // with the real components appearing in odd numbered positions, // and the corresponding imaginary component appearing in the // subsequent even numbered position. (e.g., [1.0 + 2.0i, 3.0 + 4.0i] // is encoded as [1.0, 2.0 ,3.0 ,4.0] // When this field is present, the data_type field MUST be DOUBLE or COMPLEX128 repeated double double_data = 10 [packed = true]; // For uint64 and uint32 values // When this field is present, the data_type field MUST be // UINT32 or UINT64 repeated uint64 uint64_data = 11 [packed = true]; } // A serialized sparse-tensor value message SparseTensorProto { // The sequence of non-default values are encoded as a tensor of shape [NNZ]. // The default-value is zero for numeric tensors, and empty-string for string tensors. // values must have a non-empty name present which serves as a name for SparseTensorProto // when used in sparse_initializer list. TensorProto values = 1; // The indices of the non-default values, which may be stored in one of two formats. // (a) Indices can be a tensor of shape [NNZ, rank] with the [i,j]-th value // corresponding to the j-th index of the i-th value (in the values tensor). // (b) Indices can be a tensor of shape [NNZ], in which case the i-th value // must be the linearized-index of the i-th value (in the values tensor). // The linearized-index can be converted into an index tuple (k_1,...,k_rank) // using the shape provided below. 
// The indices must appear in ascending order without duplication. // In the first format, the ordering is lexicographic-ordering: // e.g., index-value [1,4] must appear before [2,1] TensorProto indices = 2; // The shape of the underlying dense-tensor: [dim_1, dim_2, ... dim_rank] repeated int64 dims = 3; } // Defines a tensor shape. A dimension can be either an integer value // or a symbolic variable. A symbolic variable represents an unknown // dimension. message TensorShapeProto { message Dimension { oneof value { int64 dim_value = 1; string dim_param = 2; // namespace Shape }; // Standard denotation can optionally be used to denote tensor // dimensions with standard semantic descriptions to ensure // that operations are applied to the correct axis of a tensor. // Refer to https://github.com/onnx/onnx/blob/main/docs/DimensionDenotation.md#denotation-definition // for pre-defined dimension denotations. string denotation = 3; }; repeated Dimension dim = 1; } // Types // // The standard ONNX data types. message TypeProto { message Tensor { // This field MUST NOT have the value of UNDEFINED // This field MUST have a valid TensorProto.DataType value // This field MUST be present for this version of the IR. int32 elem_type = 1; TensorShapeProto shape = 2; } // repeated T message Sequence { // The type and optional shape of each element of the sequence. // This field MUST be present for this version of the IR. TypeProto elem_type = 1; }; // map<K,V> message Map { // This field MUST have a valid TensorProto.DataType value // This field MUST be present for this version of the IR. // This field MUST refer to an integral type ([U]INT{8|16|32|64}) or STRING int32 key_type = 1; // This field MUST be present for this version of the IR. TypeProto value_type = 2; }; // wrapper for Tensor, Sequence, or Map message Optional { // The type and optional shape of the element wrapped. // This field MUST be present for this version of the IR. // Possible values correspond to OptionalProto.DataType enum TypeProto elem_type = 1; }; message SparseTensor { // This field MUST NOT have the value of UNDEFINED // This field MUST have a valid TensorProto.DataType value // This field MUST be present for this version of the IR. int32 elem_type = 1; TensorShapeProto shape = 2; } oneof value { // The type of a tensor. Tensor tensor_type = 1; // NOTE: DNN-only implementations of ONNX MAY elect to not support non-tensor values // as input and output to graphs and nodes. These types are needed to naturally // support classical ML operators. DNN operators SHOULD restrict their input // and output types to tensors. // The type of a sequence. Sequence sequence_type = 4; // The type of a map. Map map_type = 5; // The type of an optional. Optional optional_type = 9; // Type of the sparse tensor SparseTensor sparse_tensor_type = 8; } // An optional denotation can be used to denote the whole // type with a standard semantic description as to what is // stored inside. Refer to https://github.com/onnx/onnx/blob/main/docs/TypeDenotation.md#type-denotation-definition // for pre-defined type denotations. string denotation = 6; } // Operator Sets // // OperatorSets are uniquely identified by a (domain, opset_version) pair. message OperatorSetIdProto { // The domain of the operator set being identified. // The empty string ("") or absence of this field implies the operator // set that is defined as part of the ONNX specification. // This field MUST be present in this version of the IR when referring to any other operator set. 
string domain = 1; // The version of the operator set being identified. // This field MUST be present in this version of the IR. int64 version = 2; } // Operator/function status. enum OperatorStatus { EXPERIMENTAL = 0; STABLE = 1; } message FunctionProto { // The name of the function, similar usage of op_type in OperatorProto. // Combined with FunctionProto.domain, this forms the unique identity of // the FunctionProto. string name = 1; // Deprecated since IR Version 8 // optional int64 since_version = 2; reserved 2; reserved "since_version"; // Deprecated since IR Version 8 // optional OperatorStatus status = 3; reserved 3; reserved "status"; // The inputs and outputs of the function. repeated string input = 4; repeated string output = 5; // The attribute parameters of the function. // It is for function parameters without default values. repeated string attribute = 6; // The attribute protos of the function. // It is for function attributes with default values. // A function attribute shall be represented either as // a string attribute or an AttributeProto, not both. repeated AttributeProto attribute_proto = 11; // The nodes in the function. repeated NodeProto node = 7; // A human-readable documentation for this function. Markdown is allowed. string doc_string = 8; // The OperatorSets this function body (graph) relies on. // // All nodes in the function body (graph) will bind against the operator // with the same-domain/same-op_type operator with the HIGHEST version // in the referenced operator sets. This means at most one version can be relied // for one domain. // // The operator sets imported by FunctionProto should be compatible with the ones // imported by ModelProto. Example, if same operator set say 'A' is imported by FunctionProto // and ModelProto then versions for the operator set may be different but, // the operator schema returned for op_type, domain, version combination // for both the versions should be same. repeated OperatorSetIdProto opset_import = 9; // The domain which this function belongs to. Combined with FunctionProto.name, this forms the unique identity of // the FunctionProto. string domain = 10; } // For using protobuf-lite option optimize_for = LITE_RUNTIME;
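The `TensorProto` rules above (row-major element order, fixed-width little-endian `raw_data`) map directly onto the prost-generated Rust types that candle-onnx builds from this file. Below is a small sketch of filling a 2x2 float initializer through `raw_data`; it assumes the prost-derived `Default` implementation and a `Vec<u8>`-backed `raw_data` field, and the helper name `float_initializer` is made up for the example.

```rust
// Sketch only: build a 2x2 float32 TensorProto via the raw_data field.
// Assumes the prost-generated types re-exported as candle_onnx::onnx.
use candle_onnx::onnx::{tensor_proto::DataType, TensorProto};

fn float_initializer(name: &str, dims: &[i64], data: &[f32]) -> TensorProto {
    // Row-major order, each element stored as 4 little-endian bytes.
    let raw_data: Vec<u8> = data.iter().flat_map(|v| v.to_le_bytes()).collect();
    TensorProto {
        name: name.to_string(),
        dims: dims.to_vec(),
        data_type: DataType::Float as i32,
        raw_data,
        ..Default::default()
    }
}

fn main() {
    let t = float_initializer("weight", &[2, 2], &[1.0, 2.0, 3.0, 4.0]);
    assert_eq!(t.raw_data.len(), 4 * 4); // four f32 elements, four bytes each
    println!("initializer '{}' with dims {:?}", t.name, t.dims);
}
```

The same tensor could instead be carried in the typed `float_data` field; as the comment on `raw_data` notes, the typed fields let protobuf pack some dtypes (notably integers) more compactly.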
4
0
hf_public_repos/candle/candle-onnx
hf_public_repos/candle/candle-onnx/tests/ops.rs
use candle::test_utils::to_vec2_round; use candle::{DType, Device, NdArray, Result, Tensor}; use candle_onnx::onnx::attribute_proto::AttributeType; use candle_onnx::onnx::tensor_proto::DataType; use candle_onnx::onnx::tensor_shape_proto::{dimension, Dimension}; use candle_onnx::onnx::{type_proto, TensorProto, TensorShapeProto, TypeProto}; use candle_onnx::onnx::{AttributeProto, GraphProto, ModelProto, NodeProto, ValueInfoProto}; use candle_onnx::simple_eval; use std::collections::HashMap; const INPUT_X: &str = "x"; const INPUT_Y: &str = "y"; const INPUT_A: &str = "a"; const OUTPUT_Z: &str = "z"; fn create_model_proto_with_graph(graph: Option<GraphProto>) -> ModelProto { ModelProto { metadata_props: vec![], training_info: vec![], functions: vec![], ir_version: 0, opset_import: vec![], producer_name: "".to_string(), producer_version: "".to_string(), domain: "".to_string(), model_version: 0, doc_string: "".to_string(), graph, } } #[test] fn test_evaluation_fails_without_defined_graph() -> Result<()> { let manual_graph = create_model_proto_with_graph(None); let inputs: HashMap<String, Tensor> = HashMap::new(); match candle_onnx::simple_eval(&manual_graph, inputs) { Err(err) => assert_eq!(err.to_string(), "no graph defined in proto"), Ok(_) => panic!("Expected an error due to undefined graph"), } Ok(()) } // "Add" #[test] fn test_add_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Add".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(&[2.], &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(&[2.], &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let first = z.to_vec1::<f64>()?[0]; assert_eq!(first, 4.0f64); Ok(()) } // "Sub" #[test] fn test_sub_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Sub".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(&[2.], &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(&[2.], &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let first = z.to_vec1::<f64>()?[0]; assert_eq!(first, 0.0f64); Ok(()) } // "Mul" #[test] fn test_mul_operation() -> Result<()> { let manual_graph = 
create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Mul".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(&[2.], &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(&[2.], &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let first = z.to_vec1::<f64>()?[0]; assert_eq!(first, 4.0f64); Ok(()) } // "Div" #[test] fn test_div_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Div".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(&[2.], &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(&[2.], &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let first = z.to_vec1::<f64>()?[0]; assert_eq!(first, 1.0f64); Ok(()) } // "Exp" #[test] fn test_exp_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Exp".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![-1.0f32, 0.0f32, 1.0f32, 2.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results[0][0], 0.36787944f32); assert_eq!(results[0][1], 1.0f32); assert_eq!(results[1], vec![std::f32::consts::E, 7.389056f32]); Ok(()) } // "Equal" #[test] fn test_equal_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Equal".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: 
"".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(&[2.], &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(&[2.], &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let first = z.to_dtype(candle::DType::U8)?.to_vec1::<u8>()?.to_vec()[0]; assert_eq!(first, 1); Ok(()) } // "Not" #[test] fn test_not_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Not".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(&[0.], &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let first = z.to_dtype(candle::DType::U8)?.to_vec1::<u8>()?.to_vec()[0]; assert_eq!(first, 1); Ok(()) } // "MatMul" #[test] fn test_matmul_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "MatMul".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert( INPUT_X.to_string(), Tensor::from_vec( // vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?, ); inputs.insert( INPUT_Y.to_string(), Tensor::from_vec( // vec![5.0f32, 6.0f32, 7.0f32, 8.0f32], &[2, 2], &Device::Cpu, )?, ); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![19.0, 22.0], vec![43.0, 50.0]]); Ok(()) } // "Reshape" #[test] fn test_reshape_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Reshape".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: 
vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?; let y = Tensor::from_vec( // vec![4i64], &[1], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); inputs.insert(INPUT_Y.to_string(), y); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec1::<f32>()?; assert_eq!(results, vec![1.0, 2.0, 3.0, 4.0]); Ok(()) } // "LogSoftmax" #[test] fn test_logsoftmax_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "LogSoftmax".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!( results, vec![vec![0.26894143, 0.7310586], vec![0.26894143, 0.7310586]] ); Ok(()) } // "Softmax" #[test] fn test_softmax_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Softmax".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!( results, vec![vec![0.26894143, 0.7310586], vec![0.26894143, 0.7310586]] ); Ok(()) } // "Transpose" #[test] fn test_transpose_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Transpose".to_string(), domain: "".to_string(), attribute: vec![], input: 
vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![1.0, 3.0], vec![2.0, 4.0]]); Ok(()) } // "Dropout" #[test] fn test_dropout_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Dropout".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![1.0, 2.0], vec![3.0, 4.0]]); Ok(()) } // "Flatten" #[test] fn test_flatten_operation() -> Result<()> { let mut att_axis = AttributeProto { name: "axis".to_string(), ref_attr_name: "axis".to_string(), i: 0, doc_string: "axis".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Flatten".to_string(), domain: "".to_string(), attribute: vec![att_axis.clone()], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( vec![ 1.0f32, 2.0f32, 3.0f32, 4.0f32, 5.0f32, 6.0f32, 7.0f32, 8.0f32, ], &[2, 2, 2], &Device::Cpu, 
)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs.clone())?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]]); att_axis.i = 1; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Flatten".to_string(), domain: "".to_string(), attribute: vec![att_axis.clone()], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!( results, vec![vec![1.0, 2.0, 3.0, 4.0], vec![5.0, 6.0, 7.0, 8.0]] ); Ok(()) } // Below are ops that are implemented but not tested yet // "MaxPool" // #[test] // "AveragePool" // #[test] // "BatchNormalization" // #[test] // "Squeeze" // #[test] // "ConstantOfShape" #[test] fn test_constant_of_shape() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-31 test(&[4i64, 3, 2], Some(1.), &[1., 1., 1.])?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-31 test(&[0.], Some(0i64), &[0i64])?; // "value" defaults to 0 f32 test(&[1i64, 2, 3, 4], None as Option<i64>, &[0., 0., 0., 0.])?; fn test( input: impl NdArray, value: Option<impl NdArray>, expected: impl NdArray, ) -> Result<()> { let mut attribute = vec![]; if let Some(value) = value { let tensor = Tensor::new(value, &Device::Cpu)?; let (value, data_type) = match tensor.dtype() { DType::U8 => ( tensor.to_vec0::<u8>()?.to_le_bytes().to_vec(), DataType::Uint8, ), DType::U32 => ( tensor.to_vec0::<u32>()?.to_le_bytes().to_vec(), DataType::Uint32, ), DType::I64 => ( tensor.to_vec0::<i64>()?.to_le_bytes().to_vec(), DataType::Int64, ), DType::F32 => ( tensor.to_vec0::<f32>()?.to_le_bytes().to_vec(), DataType::Float, ), DType::F64 => ( tensor.to_vec0::<f64>()?.to_le_bytes().to_vec(), DataType::Double, ), _ => panic!("unsupported DType in test"), }; let tensor = TensorProto { data_type: data_type.into(), dims: tensor.dims().iter().map(|v| *v as i64).collect(), raw_data: value, segment: None, float_data: vec![], int32_data: vec![], string_data: vec![], int64_data: vec![], name: "".to_string(), doc_string: "".to_string(), external_data: vec![], data_location: 0, double_data: vec![], uint64_data: vec![], }; attribute.push(AttributeProto { name: "value".to_string(), ref_attr_name: "value".to_string(), i: 0, doc_string: "value".to_string(), r#type: AttributeType::Tensor.into(), f: 0.0, s: vec![], t: Some(tensor), g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }) } let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: 
"ConstantOfShape".to_string(), domain: "".to_string(), attribute, input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(input, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval .get(OUTPUT_Z) .expect("Output 'z' not found") .to_dtype(DType::F64)?; let expected = Tensor::new(expected, &Device::Cpu)?.to_dtype(DType::F64)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Unsqueeze" #[test] fn test_unsqueeze() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Unsqueeze".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( vec![ 1.0f32, 2.0f32, // 3.0f32, 4.0f32, // ], &[2, 2], &Device::Cpu, )?; let y = Tensor::from_vec(vec![-1i64], &[1], &Device::Cpu)?; let inputs = HashMap::from_iter([(INPUT_X.to_string(), x.clone()), (INPUT_Y.to_string(), y)]); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); assert_eq!(z.dims(), &[2, 2, 1]); assert_eq!( z.flatten_all()?.to_vec1::<f32>()?, x.flatten_all()?.to_vec1::<f32>()? ); Ok(()) } // "Clip" // #[test] // "Gather" #[test] fn test_gather_operation() -> Result<()> { // test taken from https://onnx.ai/onnx/operators/onnx__Gather.html#summary. test( &[[1.0, 1.2], [2.3, 3.4], [4.5, 5.7]], &[[0i64, 1], [1, 2]], 0, &[[[1.0, 1.2], [2.3, 3.4]], [[2.3, 3.4], [4.5, 5.7]]], )?; // test taken from https://onnx.ai/onnx/operators/onnx__Gather.html#summary. test( &[[1.0, 1.2, 1.9], [2.3, 3.4, 3.9], [4.5, 5.7, 5.9]], &[[0i64, 2]], 1, &[[[1.0, 1.9]], [[2.3, 3.9]], [[4.5, 5.9]]], )?; // all the tests below are generated from numpy.take, which works like // onnx's Gather operation. 
test(&[1.0, 2.0, 3.0, 4.0], 3i64, 0, 4.0)?; test(&[[1.0, 2.0, 3.0, 4.0]], 3i64, 1, &[4.0])?; test( &[[1.0], [2.0], [3.0], [4.0]], &[3i64, 2], 0, &[[4.0], [3.0]], )?; test( &[ [[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]], [[9.0, 10.0], [11.0, 12.0]], [[13.0, 14.0], [15.0, 16.0]], ], 1i64, 0, &[[5.0, 6.0], [7.0, 8.0]], )?; test( &[ [[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]], [[9.0, 10.0], [11.0, 12.0]], [[13.0, 14.0], [15.0, 16.0]], ], &[1i64, 0], 0, &[[[5.0, 6.0], [7.0, 8.0]], [[1.0, 2.0], [3.0, 4.0]]], )?; fn test( data: impl NdArray, indices: impl NdArray, axis: i64, expected: impl NdArray, ) -> Result<()> { let att_axis = AttributeProto { name: "axis".to_string(), ref_attr_name: "axis".to_string(), i: axis, doc_string: "axis".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Gather".to_string(), domain: "".to_string(), attribute: vec![att_axis], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(indices, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // GatherElements #[test] fn test_gather_elements() -> Result<()> { // all the tests below are verified against `torch.gather()` // Rank 1 index test(&[1.0, 2.0, 3.0, 4.0], &[3i64], 0, &[4.0])?; // Rank 2 index test(&[[1.0, 2.0, 3.0, 4.0]], &[[3i64]], 1, &[[4.0]])?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-57 gather_elements_0 test( &[[1., 2.], [3., 4.]], &[[0i64, 0], [1, 0]], 1, &[[1., 1.], [4., 3.]], )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-57 gather_elements_1 test( &[[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]], &[[1i64, 2, 0], [2, 0, 0]], 0, &[[4., 8., 3.], [7., 2., 3.]], )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-57 gather_elements_negative_indices test( &[[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]], &[[-1_i64, -2, 0], [-2, 0, 0]], 0, &[[7., 5., 3.], [4., 2., 3.]], )?; test( &[[1.0], [2.0], [3.0], [4.0]], &[[3i64], [2]], 0, &[[4.], [3.]], )?; // Rank 3 test( &[ [[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]], [[9.0, 10.0], [11.0, 12.0]], [[13.0, 14.0], [15.0, 16.0]], ], &[[[1i64]]], 0, &[[[5.]]], )?; test( &[ [[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]], [[9.0, 10.0], [11.0, 12.0]], [[13.0, 14.0], [15.0, 16.0]], ], &[[[1i64]]], 1, 
&[[[3.]]], )?; test( &[ [[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]], [[9.0, 10.0], [11.0, 12.0]], [[13.0, 14.0], [15.0, 16.0]], ], &[[[1i64], [0]]], 2, &[[[2.], [3.]]], )?; // Error cases // Invalid index assert!(test(&[[1.0, 2.0, 3.0, 4.0]], &[[3i64]], 0, &[[1., 2., 3., 4.]]).is_err()); // Invalid axis/ dim assert!(test(&[[1.0, 2.0, 3.0, 4.0]], &[[3i64]], 2, &[[1., 2., 3., 4.]]).is_err()); // Invalid rank assert!(test(&[[1.0, 2.0, 3.0, 4.0]], &[3i64], 0, &[[1.]]).is_err()); fn test( data: impl NdArray, indices: impl NdArray, axis: i64, expected: impl NdArray, ) -> Result<()> { let att_axis = AttributeProto { name: "axis".to_string(), ref_attr_name: "axis".to_string(), i: axis, doc_string: "axis".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "GatherElements".to_string(), domain: "".to_string(), attribute: vec![att_axis], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(indices, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Size" #[test] fn test_size_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Size".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_scalar::<i64>()?; assert_eq!(results, 4); Ok(()) } // "Shape" #[test] fn test_shape_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: 
"Shape".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec1::<i64>()?; assert_eq!(results, vec![2, 2]); Ok(()) } // "Conv" // #[test] // "Concat" // #[test] // "Abs" #[test] fn test_abs_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Abs".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( vec![-1.0f32, 2.0f32, -3.0f32, 4.0f32], &[2, 2], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![1.0, 2.0], vec![3.0, 4.0]]); Ok(()) } // "Cos" #[test] fn test_cos_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Cos".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![0.0f32, 1.0f32, 2.0f32, 3.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); assert_eq!(to_vec2_round(z, 4)?, [[1.0, 0.5403], [-0.4161, -0.99]]); Ok(()) } // "Sin" #[test] fn test_sin_operation() -> Result<()> { let manual_graph = 
create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Sin".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![0.0f32, 1.0f32, 2.0f32, 3.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); assert_eq!(to_vec2_round(z, 4)?, [[0.0, 0.8415], [0.9093, 0.1411]]); Ok(()) } // "Neg" #[test] fn test_neg_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Neg".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![1.0f32, 2.0f32, 3.0f32, 4.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![-1.0, -2.0], vec![-3.0, -4.0]]); Ok(()) } // "Erf" // #[test] // "Tanh" #[test] fn test_tanh_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Tanh".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![0.0f32, 1.0f32, 2.0f32, 3.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; 
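    // Added sanity check (not in the original test): Tanh is element-wise, so the
    // output shape must still match the [2, 2] input shape before comparing values.
    assert_eq!(z.dims(), &[2, 2]);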
assert_eq!( results, vec![vec![0.0, 0.7615942], vec![0.9640276, 0.9950548]] ); Ok(()) } // "Sigmoid" #[test] fn test_sigmoid_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Sigmoid".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![0.0f32, 1.0f32, 2.0f32, 3.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!( results, vec![vec![0.5, 0.7310586], vec![0.880797, 0.95257413]] ); Ok(()) } // "Gelu" #[test] fn test_gelu_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Gelu".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: INPUT_Y.to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec(vec![0.0f32, 1.0f32, 2.0f32, 3.0f32], &[2, 2], &Device::Cpu)?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!( results, vec![vec![0.0, 0.8413448], vec![1.9544997, 2.9959502]] ); Ok(()) } // "Relu" #[test] fn test_relu_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Relu".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( vec![-1.0f32, 1.0f32, -2.0f32, 3.0f32], &[2, 2], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = 
candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec2::<f32>()?; assert_eq!(results, vec![vec![0.0, 1.0], vec![0.0, 3.0]]); Ok(()) } // "Constant" // #[test] // "Cast" // #[test] // "ReduceMax" #[test] fn test_reduce_max() -> Result<()> { // Tests with random data generated with `np.random.uniform` // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-119 bool_inputs // No special treatment reqired for bool // `np.maximum.reduce(data, axis=axes, keepdims=True)` test( &[[1_u8, 1], [1, 0], [0, 1], [0, 0]], Some(vec![1]), 1, None, &[[1_u8], [1], [1], [0]], false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-119 default_axes_keepdims // `np.maximum.reduce(data, axis=None, keepdims=True)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], None, 1, None, &[[[60.]]], false, )?; // same as above but with random test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], None, 1, None, &[[[9.587318]]], false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-119 default_axes_donot_keep_dims // `np.maximum.reduce(data, axis=None, keepdims=False)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], None, 0, None, 60., false, )?; // same as above but with random // `np.maximum.reduce(data, axis=None, keepdims=False)` test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], None, 0, None, 9.587318, false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-119 keepdims // `np.maximum.reduce(data, axis=tuple(axes), keepdims=True)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![1]), 1, None, &[[[20., 2.]], [[40., 2.]], [[60., 2.]]], false, )?; // keepdims with random data // `np.maximum.reduce(data, axis=tuple(axes), keepdims=True)` test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], Some(vec![1]), 1, None, &[ [[-7.318765, 7.2374434]], [[6.304022, 4.939862]], [[9.587318, 8.008944]], ], false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-119 negative_axes_keepdims // axes = np.array([-1], dtype=np.int64) // `np.maximum.reduce(data, axis=tuple(axes), keepdims=True)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1]), 1, None, &[[[5.], [20.]], [[30.], [40.]], [[55.], [60.]]], false, )?; // axes = np.array([-2], dtype=np.int64) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-2]), 1, None, &[[[20., 2.]], [[40., 2.]], [[60., 2.]]], false, )?; // with random test( &[ [[-4.1676497, -2.7603748], [-4.5138783, -0.762791]], [[-6.3792877, 7.1619177], [-9.958144, 6.3753467]], [[9.046973, 3.4554052], [-5.4674335, 5.4642754]], ], Some(vec![-2]), 1, None, &[ [[-4.1676497, -0.762791]], [[-6.3792877, 7.1619177]], [[9.046973, 5.4642754]], ], false, )?; // Multiple axes - keepdims=1 (true) // axes = np.array([0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 1]), 1, None, &[[[60., 
2.]]], false, )?; // axes = np.array([0, 2], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 2]), 1, None, &[[[55.], [60.]]], false, )?; // axes = np.array([2, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 1]), 1, None, &[[[20.]], [[40.]], [[60.]]], false, )?; // axes = np.array([2, 0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 0, 1]), 1, None, &[[[60.]]], false, )?; // Multiple axes - keepdims=0 (false) // axes = np.array([0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 1]), 0, None, &[60., 2.], false, )?; // axes = np.array([0, 2], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 2]), 0, None, &[55., 60.], false, )?; // axes = np.array([2, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 1]), 0, None, &[20., 40., 60.], false, )?; // axes = np.array([2, 0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 0, 1]), 0, None, 60., false, )?; // Multiple axes - negative `axes` - keepdims=1 (true) // axes = np.array([-1, 0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1, 0, 1]), 1, None, &[[[60.]]], false, )?; // Multiple axes - negative `axes` - keepdims=0 (false) // axes = np.array([-1, 0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1, 0, 1]), 0, None, 60., false, )?; // `noop_with_empty_axes = true (1)` should yield tensor equivallent to the input tensor test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], None, 0, Some(1), &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], false, )?; // Rank-0 arrays are also valid test(42., None, 0, None, 42., false)?; test(42., None, 1, None, 42., false)?; // Negative test - expect error // axes = np.array([-2, 0, 1], dtype=np.int64) // np.maximum.reduce(data, axis=tuple(axes), keepdims=True) // Should error out with `duplicate value in "axes"` assert!(test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-2, 0, 1]), 1, None, &[[[60.]]], false ) .is_err()); // Negative test - expect error // Should error out on empty set assert!(test(&[[1_u8; 0]], Some(vec![-2, 0, 1]), 1, None, &[0.], false).is_err()); // Backward compatibility test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1, 0, 1]), 0, None, 60., true, )?; fn test( data: impl 
NdArray, axes: Option<Vec<i64>>, keepdims: i64, noop_with_empty_axes: Option<i64>, expected: impl NdArray, backward_comp: bool, ) -> Result<()> { let has_axes = axes.is_some(); let att_keepdims = AttributeProto { name: "keepdims".to_string(), ref_attr_name: "keepdims".to_string(), i: keepdims, doc_string: "keepdims".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let mut attribute = vec![att_keepdims]; if let Some(noop) = noop_with_empty_axes { if !has_axes { let att_no_op_empty_axes = AttributeProto { name: "noop_with_empty_axes".to_string(), ref_attr_name: "noop_with_empty_axes".to_string(), i: noop, doc_string: "noop_with_empty_axes".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; attribute.push(att_no_op_empty_axes); } } if has_axes && backward_comp { attribute.push(AttributeProto { name: "axes".to_string(), ref_attr_name: "axes".to_string(), i: 0, doc_string: "axes".to_string(), r#type: 7, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: axes.clone().unwrap_or_default(), strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }); } let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "ReduceMax".to_string(), domain: "".to_string(), attribute, input: if has_axes && !backward_comp { vec![INPUT_X.to_string(), INPUT_Y.to_string()] } else { vec![INPUT_X.to_string()] }, output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); let input_tensor = Tensor::new(data, &Device::Cpu)?; let input_dtype = input_tensor.dtype(); inputs.insert(INPUT_X.to_string(), input_tensor); if !backward_comp { if let Some(a) = axes { inputs.insert(INPUT_Y.to_string(), Tensor::new(a, &Device::Cpu)?); } } let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec0::<u8>()?, expected.to_vec0::<u8>()?) } else { assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?) } } 1 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec1::<u8>()?, expected.to_vec1::<u8>()?) } else { assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?) } } 2 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec2::<u8>()?, expected.to_vec2::<u8>()?) } else { assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?) } } 3 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec3::<u8>()?, expected.to_vec3::<u8>()?) } else { assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?) 
} } _ => unreachable!(), }; Ok(()) } Ok(()) } // "ReduceMin" #[test] fn test_reduce_min() -> Result<()> { // Tests with random data generated with `np.random.uniform` // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-121 bool_inputs // No special treatment reqired for bool // `np.minimum.reduce(data, axis=axes, keepdims=True)` test( &[[1_u8, 1], [1, 0], [0, 1], [0, 0]], Some(vec![1]), 1, None, &[[1_u8], [0], [0], [0]], false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-121 default_axes_keepdims // `np.minimum.reduce(data, axis=None, keepdims=True)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], None, 1, None, &[[[1.]]], false, )?; // same as above but with random test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], None, 1, None, &[[[-8.794852]]], false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-121 default_axes_donot_keep_dims // `np.minimum.reduce(data, axis=None, keepdims=False)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], None, 0, None, 1., false, )?; // same as above but with random // `np.minimum.reduce(data, axis=None, keepdims=False)` test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], None, 0, None, -8.794852, false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-121 keepdims // `np.minimum.reduce(data, axis=tuple(axes), keepdims=True)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![1]), 1, None, &[[[5., 1.]], [[30., 1.]], [[55., 1.]]], false, )?; // keepdims with random data // `np.minimum.reduce(data, axis=tuple(axes), keepdims=True)` test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], Some(vec![1]), 1, None, &[ [[-7.648377, -5.4018507]], [[4.5435624, 3.072864]], [[-2.5058026, -8.794852]], ], false, )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-121 negative_axes_keepdims // axes = np.array([-1], dtype=np.int64) // `np.minimum.reduce(data, axis=tuple(axes), keepdims=True)` test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1]), 1, None, &[[[1.], [2.]], [[1.], [2.]], [[1.], [2.]]], false, )?; // axes = np.array([-2], dtype=np.int64) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-2]), 1, None, &[[[5., 1.]], [[30., 1.]], [[55., 1.]]], false, )?; // with random test( &[ [[-4.1676497, -2.7603748], [-4.5138783, -0.762791]], [[-6.3792877, 7.1619177], [-9.958144, 6.3753467]], [[9.046973, 3.4554052], [-5.4674335, 5.4642754]], ], Some(vec![-2]), 1, None, &[ [[-4.5138783, -2.7603748]], [[-9.958144, 6.3753467]], [[-5.4674335, 3.4554052]], ], false, )?; // Multiple axes - keepdims=1 (true) // axes = np.array([0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 1]), 1, None, &[[[5., 1.]]], false, )?; // axes = np.array([0, 2], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 2]), 1, None, &[[[1.], 
[2.]]], false, )?; // axes = np.array([2, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 1]), 1, None, &[[[1.]], [[1.]], [[1.]]], false, )?; // axes = np.array([2, 0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 0, 1]), 1, None, &[[[1.]]], false, )?; // Multiple axes - keepdims=0 (false) // axes = np.array([0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 1]), 0, None, &[5., 1.], false, )?; // axes = np.array([0, 2], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![0, 2]), 0, None, &[1., 2.], false, )?; // axes = np.array([2, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 1]), 0, None, &[1., 1., 1.], false, )?; // axes = np.array([2, 0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=False) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![2, 0, 1]), 0, None, 1., false, )?; // Multiple axes - negative `axes` - keepdims=1 (true) // axes = np.array([-1, 0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1, 0, 1]), 1, None, &[[[1.]]], false, )?; // Multiple axes - negative `axes` - keepdims=0 (false) // axes = np.array([-1, 0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1, 0, 1]), 0, None, 1., false, )?; // `noop_with_empty_axes = true (1)` should yield tensor equivallent to the input tensor test( &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], None, 0, Some(1), &[ [[-7.648377, -5.4018507], [-7.318765, 7.2374434]], [[6.304022, 4.939862], [4.5435624, 3.072864]], [[-2.5058026, 8.008944], [9.587318, -8.794852]], ], false, )?; // Rank-0 tensors are also valid test(42., None, 0, None, 42., false)?; test(42., None, 1, None, 42., false)?; // Negative test - expect error // axes = np.array([-2, 0, 1], dtype=np.int64) // np.minimum.reduce(data, axis=tuple(axes), keepdims=True) // Should error out with `duplicate value in "axes"` assert!(test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-2, 0, 1]), 1, None, &[0.], false ) .is_err()); // Negative test - expect error // Should error out on empty set assert!(test(&[[1_u8; 0]], Some(vec![-2, 0, 1]), 1, None, &[0.], false).is_err()); // Backward compatibility test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-1, 0, 1]), 0, None, 1., true, )?; fn test( data: impl NdArray, axes: Option<Vec<i64>>, keepdims: i64, noop_with_empty_axes: Option<i64>, expected: impl NdArray, backward_comp: bool, ) -> Result<()> { let has_axes = axes.is_some(); let att_keepdims = AttributeProto { name: "keepdims".to_string(), ref_attr_name: 
"keepdims".to_string(), i: keepdims, doc_string: "keepdims".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let mut attribute = vec![att_keepdims]; if let Some(noop) = noop_with_empty_axes { if !has_axes { let att_no_op_empty_axes = AttributeProto { name: "noop_with_empty_axes".to_string(), ref_attr_name: "noop_with_empty_axes".to_string(), i: noop, doc_string: "noop_with_empty_axes".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; attribute.push(att_no_op_empty_axes); } } if has_axes && backward_comp { attribute.push(AttributeProto { name: "axes".to_string(), ref_attr_name: "axes".to_string(), i: 0, doc_string: "axes".to_string(), r#type: 7, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: axes.clone().unwrap_or_default(), strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }); } let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "ReduceMin".to_string(), domain: "".to_string(), attribute, input: if has_axes && !backward_comp { vec![INPUT_X.to_string(), INPUT_Y.to_string()] } else { vec![INPUT_X.to_string()] }, output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); let input_tensor = Tensor::new(data, &Device::Cpu)?; let input_dtype = input_tensor.dtype(); inputs.insert(INPUT_X.to_string(), input_tensor); if !backward_comp { if let Some(a) = axes { inputs.insert(INPUT_Y.to_string(), Tensor::new(a, &Device::Cpu)?); } } let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec0::<u8>()?, expected.to_vec0::<u8>()?) } else { assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?) } } 1 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec1::<u8>()?, expected.to_vec1::<u8>()?) } else { assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?) } } 2 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec2::<u8>()?, expected.to_vec2::<u8>()?) } else { assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?) } } 3 => { if input_dtype == DType::U8 { assert_eq!(z.to_vec3::<u8>()?, expected.to_vec3::<u8>()?) } else { assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?) 
} } _ => unreachable!(), }; Ok(()) } Ok(()) } // "ReduceMean" #[test] fn test_reduce_mean() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-120 default_axes_keepdims test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], None, 1, &[[[18.25]]], )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-120 do_no_keepdims test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![1]), 0, &[[12.5, 1.5], [35.0, 1.5], [57.5, 1.5]], )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-120 keepdims test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![1]), 1, &[[[12.5, 1.5]], [[35.0, 1.5]], [[57.5, 1.5]]], )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-120 negative_axes_keepdims test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![-2]), 1, &[[[12.5, 1.5]], [[35.0, 1.5]], [[57.5, 1.5]]], )?; // All the test data below was generated based on numpy's np.mean test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![1, 2]), 0, &[7.0, 18.25, 29.5], )?; test( &[ [[5., 1.], [20., 2.]], [[30., 1.], [40., 2.]], [[55., 1.], [60., 2.]], ], Some(vec![1, 2]), 1, &[[[7.0]], [[18.25]], [[29.5]]], )?; test(&[1., 2., 3.], None, 1, &[2.0])?; fn test( data: impl NdArray, axes: Option<Vec<i64>>, keepdims: i64, expected: impl NdArray, ) -> Result<()> { let has_axes = axes.is_some(); let att_axes = AttributeProto { name: "axes".to_string(), ref_attr_name: "axes".to_string(), i: 0, doc_string: "axes".to_string(), r#type: 7, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: axes.unwrap_or_default(), strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_keepdims = AttributeProto { name: "keepdims".to_string(), ref_attr_name: "keepdims".to_string(), i: keepdims, doc_string: "keepdims".to_string(), r#type: 2, f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "ReduceMean".to_string(), domain: "".to_string(), attribute: if has_axes { vec![att_axes, att_keepdims] } else { vec![att_keepdims] }, input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } 
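    // Note: unlike the ReduceMax/ReduceMin tests above, which exercise both the
    // attribute form and the newer input-based `axes` form via `backward_comp`,
    // ReduceMean is only covered here with `axes` supplied as a node attribute.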
Ok(()) } // "Sqrt" #[test] fn test_sqrt() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-155 test(&[1., 4., 9.], &[1., 2., 3.])?; fn test(data: impl NdArray, expected: impl NdArray) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Sqrt".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "RandomUniform" #[test] fn test_random_uniform() -> Result<()> { test(vec![3, 2, 1, 4], None, None)?; test(vec![2, 2, 2, 2], Some(-10.0), None)?; test(vec![2, 2, 2, 2], None, Some(10.0))?; test(vec![1, 2, 3, 4], Some(-10.0), Some(10.0))?; fn test(shape: Vec<i64>, low: Option<f32>, high: Option<f32>) -> Result<()> { let att_low = AttributeProto { name: "low".to_string(), ref_attr_name: "low".to_string(), i: 0, doc_string: "low".to_string(), r#type: 1, // FLOAT f: low.unwrap_or(0.0), s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_high = AttributeProto { name: "high".to_string(), ref_attr_name: "high".to_string(), i: 0, doc_string: "high".to_string(), r#type: 1, // FLOAT f: high.unwrap_or(1.0), s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_shape = AttributeProto { name: "shape".to_string(), ref_attr_name: "shape".to_string(), i: 0, doc_string: "shape".to_string(), r#type: 7, // INTS f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: shape, strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_dtype = AttributeProto { name: "dtype".to_string(), ref_attr_name: "dtype".to_string(), i: 11, // DOUBLE doc_string: "dtype".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let attrs = { let mut mut_attrs = vec![att_shape, att_dtype]; if low.is_some() { mut_attrs.push(att_low); } if high.is_some() { mut_attrs.push(att_high); } mut_attrs }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "RandomUniform".to_string(), domain: "".to_string(), attribute: attrs, 
input: vec![], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let eval = candle_onnx::simple_eval(&manual_graph, HashMap::new())?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let min = z .flatten_all()? .to_vec1()? .into_iter() .reduce(f64::min) .unwrap(); let max = z .flatten_all()? .to_vec1()? .into_iter() .reduce(f64::max) .unwrap(); assert!(min >= low.unwrap_or(0.0).into()); assert!(max <= high.unwrap_or(1.0).into()); assert_ne!(min, max); Ok(()) } Ok(()) } // "RandomNormal" #[test] fn test_random_normal() -> Result<()> { test(vec![3, 2, 1, 4], None, None)?; test(vec![2, 2, 2, 2], Some(-10.0), None)?; test(vec![2, 2, 2, 2], None, Some(10.0))?; test(vec![1, 2, 3, 4], Some(-10.0), Some(10.0))?; fn test(shape: Vec<i64>, mean: Option<f32>, scale: Option<f32>) -> Result<()> { let att_mean = AttributeProto { name: "mean".to_string(), ref_attr_name: "mean".to_string(), i: 0, doc_string: "mean".to_string(), r#type: 1, // FLOAT f: mean.unwrap_or(0.0), s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_scale = AttributeProto { name: "scale".to_string(), ref_attr_name: "scale".to_string(), i: 0, doc_string: "scale".to_string(), r#type: 1, // FLOAT f: scale.unwrap_or(1.0), s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_shape = AttributeProto { name: "shape".to_string(), ref_attr_name: "shape".to_string(), i: 0, doc_string: "shape".to_string(), r#type: 7, // INTS f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: shape, strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_dtype = AttributeProto { name: "dtype".to_string(), ref_attr_name: "dtype".to_string(), i: 11, // DOUBLE doc_string: "dtype".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let attrs = { let mut mut_attrs = vec![att_shape, att_dtype]; if mean.is_some() { mut_attrs.push(att_mean); } if scale.is_some() { mut_attrs.push(att_scale); } mut_attrs }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "RandomNormal".to_string(), domain: "".to_string(), attribute: attrs, input: vec![], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let eval = candle_onnx::simple_eval(&manual_graph, HashMap::new())?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let data = z.flatten_all()?.to_vec1::<f64>()?; // test if values are unique for (i, a) in 
data.iter().enumerate() { for (j, b) in data.iter().enumerate() { if i == j { continue; }; assert_ne!(a, b); } } Ok(()) } Ok(()) } // "Range" #[test] fn test_range() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-113 test(1., 5., 2., &[1., 3.])?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-113 test(10i64, 6i64, -3i64, &[10i64, 7i64])?; fn test( start: impl NdArray, limit: impl NdArray, delta: impl NdArray, expected: impl NdArray, ) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Range".to_string(), domain: "".to_string(), attribute: vec![], input: vec![ INPUT_X.to_string(), INPUT_Y.to_string(), INPUT_A.to_string(), ], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(start, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(limit, &Device::Cpu)?); inputs.insert(INPUT_A.to_string(), Tensor::new(delta, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval .get(OUTPUT_Z) .expect("Output 'z' not found") .to_dtype(DType::F64)?; let expected = Tensor::new(expected, &Device::Cpu)?.to_dtype(DType::F64)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Greater" #[test] fn test_greater() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-63 test(&[1., 2., 3.], &[3., 2., 1.], &[0u8, 0, 1])?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-63 test(&[1., 2., 3.], 2., &[0u8, 0, 1])?; fn test(a: impl NdArray, b: impl NdArray, expected: impl NdArray) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Greater".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(a, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(b, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval .get(OUTPUT_Z) .expect("Output 'z' not found") .to_dtype(DType::F64)?; let expected = Tensor::new(expected, &Device::Cpu)?.to_dtype(DType::F64)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => 
assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Less" #[test] fn test_less() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-81 test(&[1., 2., 3.], &[3., 2., 1.], &[1u8, 0, 0])?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-81 test(&[1., 2., 3.], 2., &[1u8, 0, 0])?; fn test(a: impl NdArray, b: impl NdArray, expected: impl NdArray) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Less".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(a, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(b, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval .get(OUTPUT_Z) .expect("Output 'z' not found") .to_dtype(DType::F64)?; let expected = Tensor::new(expected, &Device::Cpu)?.to_dtype(DType::F64)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Log" #[test] fn test_log() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-82 test(&[1., 10.], &[0., std::f64::consts::LN_10])?; fn test(data: impl NdArray, expected: impl NdArray) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Log".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Min" #[test] fn test_min() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-94 test(&[3., 2., 1.], &[1., 4., 4.], &[2., 5., 0.], &[1., 2., 0.])?; fn test( a: 
impl NdArray, b: impl NdArray, c: impl NdArray, expected: impl NdArray, ) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Min".to_string(), domain: "".to_string(), attribute: vec![], input: vec![ INPUT_X.to_string(), INPUT_Y.to_string(), INPUT_A.to_string(), ], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(a, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(b, &Device::Cpu)?); inputs.insert(INPUT_A.to_string(), Tensor::new(c, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "Where" #[test] fn test_where() -> Result<()> { // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-173 test( &[[1u8, 0], [1, 1]], &[[1i64, 2], [3, 4]], &[[9i64, 8], [7, 6]], &[[1i64, 8], [3, 4]], )?; // https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-173 test( &[[1u8, 0], [1, 1]], &[[1., 2.], [3., 4.]], &[[9., 8.], [7., 6.]], &[[1., 8.], [3., 4.]], )?; fn test( condition: impl NdArray, x: impl NdArray, y: impl NdArray, expected: impl NdArray, ) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Where".to_string(), domain: "".to_string(), attribute: vec![], input: vec![ INPUT_X.to_string(), INPUT_Y.to_string(), INPUT_A.to_string(), ], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(condition, &Device::Cpu)?); inputs.insert(INPUT_Y.to_string(), Tensor::new(x, &Device::Cpu)?); inputs.insert(INPUT_A.to_string(), Tensor::new(y, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval .get(OUTPUT_Z) .expect("Output 'z' not found") .to_dtype(DType::F64)?; let expected = Tensor::new(expected, &Device::Cpu)?.to_dtype(DType::F64)?; match expected.dims().len() { 0 => assert_eq!(z.to_vec0::<f64>()?, expected.to_vec0::<f64>()?), 1 => assert_eq!(z.to_vec1::<f64>()?, expected.to_vec1::<f64>()?), 2 => assert_eq!(z.to_vec2::<f64>()?, expected.to_vec2::<f64>()?), 3 => assert_eq!(z.to_vec3::<f64>()?, expected.to_vec3::<f64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } #[test] fn test_floor() -> Result<()> { let manual_graph = 
create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Floor".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // some values taken from https://numpy.org/doc/stable/reference/generated/numpy.floor.html vec![ f64::NAN, f64::INFINITY, f64::NEG_INFINITY, -1.7, -1.5, -0.2, 0.2, 1.5, 1.7, 2.0, ], &[10], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec1::<f64>()?; assert!(results[0].is_nan()); assert_eq!( results[1..], vec![ f64::INFINITY, f64::NEG_INFINITY, -2., -2., -1., 0., 1., 1., 2. ] ); Ok(()) } #[test] fn test_ceil() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Ceil".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![ValueInfoProto { name: INPUT_X.to_string(), doc_string: "".to_string(), r#type: None, }], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let x = Tensor::from_vec( // some values taken from https://numpy.org/doc/stable/reference/generated/numpy.ceil.html vec![ f64::NAN, f64::INFINITY, f64::NEG_INFINITY, -1.7, -1.5, -0.2, 0.2, 1.5, 1.7, 2.0, ], &[10], &Device::Cpu, )?; let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), x); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let results = z.to_vec1::<f64>()?; assert!(results[0].is_nan()); assert_eq!( results[1..], vec![ f64::INFINITY, f64::NEG_INFINITY, -1., -1., -0., 1., 2., 2., 2. 
] ); Ok(()) } // "ArgMin" #[test] fn test_argmin() -> Result<()> { // tests from https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-7 // default_axes_keepdims test( &[[2u32, 1u32], [3u32, 10u32]], None, Some(1), None, &[[0i64, 0i64]], )?; // keepdims test( &[[2u32, 1u32], [3u32, 10u32]], Some(1), Some(1), None, &[[1i64], [0i64]], )?; // // negative_axis_keepdims test( &[[2u32, 1u32], [3u32, 10u32]], Some(-1), Some(1), None, &[[1i64], [0i64]], )?; // no_keepdims test( &[[2u32, 1u32], [3u32, 10u32]], None, Some(0), None, &[0i64, 0i64], )?; // tests from https://pytorch.org/docs/stable/generated/torch.argmin.html#torch.argmin test( &[ [0.1139, 0.2254, -0.1381, 0.3687], [1.0100, -1.1975, -0.0102, -0.4732], [-0.9240, 0.1207, -0.7506, -1.0213], [1.7809, -1.2960, 0.9384, 0.1438], ], Some(1), Some(0), None, &[2i64, 1i64, 3i64, 1i64], )?; test( &[ [0.1139, 0.2254, -0.1381, 0.3687], [1.0100, -1.1975, -0.0102, -0.4732], [-0.9240, 0.1207, -0.7506, -1.0213], [1.7809, -1.2960, 0.9384, 0.1438], ], Some(1), None, None, &[[2i64], [1i64], [3i64], [1i64]], )?; fn test( data: impl NdArray, axis: Option<i64>, keepdims: Option<i64>, select_last_index: Option<i64>, expected: impl NdArray, ) -> Result<()> { let att_axis = AttributeProto { name: "axis".to_string(), ref_attr_name: "axis".to_string(), i: axis.unwrap_or(0), doc_string: "axis".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_keepdims = AttributeProto { name: "keepdims".to_string(), ref_attr_name: "keepdims".to_string(), i: keepdims.unwrap_or(1), doc_string: "keepdims".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_select_last_index = AttributeProto { name: "select_last_index".to_string(), ref_attr_name: "select_last_index".to_string(), i: select_last_index.unwrap_or(0), doc_string: "select_last_index".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let attrs = { let mut mut_attrs = vec![]; if axis.is_some() { mut_attrs.push(att_axis); } if keepdims.is_some() { mut_attrs.push(att_keepdims); } if select_last_index.is_some() { mut_attrs.push(att_select_last_index); } mut_attrs }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "ArgMin".to_string(), domain: "".to_string(), attribute: attrs, input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 1 => assert_eq!(z.to_vec1::<i64>()?, 
expected.to_vec1::<i64>()?), 2 => assert_eq!(z.to_vec2::<i64>()?, expected.to_vec2::<i64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "ArgMax" #[test] fn test_argmax() -> Result<()> { // tests from https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-6 // default_axes_keepdims test( &[[2u32, 1u32], [3u32, 10u32]], None, Some(1), None, &[[1i64, 1i64]], )?; // keepdims test( &[[2u32, 1u32], [3u32, 10u32]], Some(1), Some(1), None, &[[0i64], [1i64]], )?; // // negative_axis_keepdims test( &[[2u32, 1u32], [3u32, 10u32]], Some(-1), Some(1), None, &[[0i64], [1i64]], )?; // no_keepdims test( &[[2u32, 1u32], [3u32, 10u32]], None, Some(0), None, &[1i64, 1i64], )?; // tests from https://pytorch.org/docs/stable/generated/torch.argmax.html test( &[ [1.3398, 0.2663, -0.2686, 0.2450], [-0.7401, -0.8805, -0.3402, -1.1936], [0.4907, -1.3948, -1.0691, -0.3132], [-1.6092, 0.5419, -0.2993, 0.3195], ], Some(1), Some(0), None, &[0i64, 2i64, 0i64, 1i64], )?; test( &[ [1.3398, 0.2663, -0.2686, 0.2450], [-0.7401, -0.8805, -0.3402, -1.1936], [0.4907, -1.3948, -1.0691, -0.3132], [-1.6092, 0.5419, -0.2993, 0.3195], ], Some(1), None, None, &[[0i64], [2i64], [0i64], [1i64]], )?; fn test( data: impl NdArray, axis: Option<i64>, keepdims: Option<i64>, select_last_index: Option<i64>, expected: impl NdArray, ) -> Result<()> { let att_axis = AttributeProto { name: "axis".to_string(), ref_attr_name: "axis".to_string(), i: axis.unwrap_or(0), doc_string: "axis".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_keepdims = AttributeProto { name: "keepdims".to_string(), ref_attr_name: "keepdims".to_string(), i: keepdims.unwrap_or(1), doc_string: "keepdims".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let att_select_last_index = AttributeProto { name: "select_last_index".to_string(), ref_attr_name: "select_last_index".to_string(), i: select_last_index.unwrap_or(0), doc_string: "select_last_index".to_string(), r#type: 2, // INT f: 0.0, s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let attrs = { let mut mut_attrs = vec![]; if axis.is_some() { mut_attrs.push(att_axis); } if keepdims.is_some() { mut_attrs.push(att_keepdims); } if select_last_index.is_some() { mut_attrs.push(att_select_last_index); } mut_attrs }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "ArgMax".to_string(), domain: "".to_string(), attribute: attrs, input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let 
expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 1 => assert_eq!(z.to_vec1::<i64>()?, expected.to_vec1::<i64>()?), 2 => assert_eq!(z.to_vec2::<i64>()?, expected.to_vec2::<i64>()?), _ => unreachable!(), }; Ok(()) } Ok(()) } // "LeakyRelu" #[test] fn test_leakyrelu() -> Result<()> { // tests from https://github.com/onnx/onnx/blob/main/docs/Operators.md#examples-80 // leakyrelu test(&[-1.0, 0.0, 1.0], Some(0.1), &[-0.1, 0.0, 1.0])?; fn test(data: impl NdArray, alpha: Option<f32>, expected: impl NdArray) -> Result<()> { let att_alpha = AttributeProto { name: "alpha".to_string(), ref_attr_name: "alpha".to_string(), i: 0, doc_string: "alpha".to_string(), r#type: 1, // FLOAT f: alpha.unwrap_or(0.01), s: vec![], t: None, g: None, sparse_tensor: None, tp: None, floats: vec![], ints: vec![], strings: vec![], tensors: vec![], graphs: vec![], sparse_tensors: vec![], type_protos: vec![], }; let attrs = { let mut mut_attrs = vec![]; if alpha.is_some() { mut_attrs.push(att_alpha); } mut_attrs }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "LeakyRelu".to_string(), domain: "".to_string(), attribute: attrs, input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert(INPUT_X.to_string(), Tensor::new(data, &Device::Cpu)?); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; for both in z .to_vec1::<f64>()? 
.iter() .zip(expected.to_vec1::<f64>()?.iter()) { let (act, exp) = both; assert!(f64::abs(act - exp) < f32::EPSILON.into()); } Ok(()) } Ok(()) } // "If" #[test] fn test_if() -> Result<()> { let x = vec![1.0, 2.0, 3.0, 4.0, 5.0]; let y = vec![5.0, 4.0, 3.0, 2.0, 1.0]; let output_type_proto = Some(TypeProto { value: Some(type_proto::Value::TensorType(type_proto::Tensor { elem_type: DataType::Float.into(), shape: Some(TensorShapeProto { dim: vec![Dimension { denotation: "".to_string(), value: Some(dimension::Value::DimValue(5)), }], }), })), denotation: "".to_string(), }); let then_branch = GraphProto { output: vec![ValueInfoProto { name: "then_out".to_string(), r#type: output_type_proto.clone(), doc_string: "".to_string(), }], node: vec![NodeProto { op_type: "Constant".to_string(), input: vec![], output: vec!["then_out".to_string()], attribute: vec![AttributeProto { name: "value".to_string(), r#type: AttributeType::Tensor.into(), t: Some(TensorProto { dims: vec![x.len() as i64], float_data: x.clone(), data_type: DataType::Float.into(), ..TensorProto::default() }), ..AttributeProto::default() }], ..NodeProto::default() }], ..GraphProto::default() }; let else_branch = GraphProto { output: vec![ValueInfoProto { name: "else_out".to_string(), r#type: output_type_proto.clone(), doc_string: "".to_string(), }], node: vec![NodeProto { op_type: "Constant".to_string(), input: vec![], output: vec!["else_out".to_string()], attribute: vec![AttributeProto { name: "value".to_string(), r#type: AttributeType::Tensor.into(), t: Some(TensorProto { dims: vec![y.len() as i64], float_data: y.clone(), data_type: DataType::Float.into(), ..TensorProto::default() }), ..AttributeProto::default() }], ..NodeProto::default() }], ..GraphProto::default() }; let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "If".to_string(), attribute: vec![ AttributeProto { name: "then_branch".to_string(), r#type: AttributeType::Graph.into(), g: Some(then_branch), ..AttributeProto::default() }, AttributeProto { name: "else_branch".to_string(), r#type: AttributeType::Graph.into(), g: Some(else_branch), ..AttributeProto::default() }, ], input: vec!["cond".to_string()], output: vec!["res".to_string()], ..NodeProto::default() }], input: vec![], output: vec![ValueInfoProto { name: "res".to_string(), doc_string: "".to_string(), r#type: output_type_proto.clone(), }], ..GraphProto::default() })); for cond in [1u8, 0] { let inputs = HashMap::from_iter([("cond".to_string(), Tensor::full(cond, (1,), &Device::Cpu)?)]); let outputs = candle_onnx::simple_eval(&manual_graph, inputs)?; let expected = if cond != 0 { &x } else { &y }; let Some(res) = outputs.get("res") else { candle::bail!("outputs didn't contain expected key `res`: {outputs:?}"); }; assert_eq!(&res.to_vec1::<f32>()?, expected); } Ok(()) } #[test] fn test_pad() -> Result<()> { let data = Tensor::from_vec( vec![ 1.0, 2.0, 3.0, // 4.0, 5.0, 6.0, // ], (2, 3), &Device::Cpu, )?; let pads = Tensor::from_vec(vec![0i64, 1, 0, 0], (4,), &Device::Cpu)?; let mode = "reflect"; let expected = Tensor::from_vec( vec![ 2.0, 1.0, 2.0, 3.0, // 5.0, 4.0, 5.0, 6.0, // ], (2, 4), &Device::Cpu, )?; let model = create_model_proto_with_graph(Some(GraphProto { input: vec![ ValueInfoProto { name: "data".to_string(), ..ValueInfoProto::default() }, ValueInfoProto { name: "pads".to_string(), ..ValueInfoProto::default() }, ], output: vec![ValueInfoProto { name: "output".to_string(), ..ValueInfoProto::default() }], node: vec![NodeProto { op_type: "Pad".to_string(), input: 
vec!["data".to_string(), "pads".to_string()], output: vec!["output".to_string()], attribute: vec![AttributeProto { name: "mode".to_string(), r#type: AttributeType::String.into(), s: mode.as_bytes().to_vec(), ..AttributeProto::default() }], ..NodeProto::default() }], ..GraphProto::default() })); let inputs = HashMap::from_iter([("data".to_string(), data), ("pads".to_string(), pads)]); let res = candle_onnx::simple_eval(&model, inputs)?; let Some(actual) = res.get("output") else { candle::bail!("outputs didn't contain expected key `output`: {res:?}"); }; assert_eq!(actual.to_vec2::<f64>()?, expected.to_vec2::<f64>()?); Ok(()) } #[test] fn test_slice() -> Result<()> { let model = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Slice".to_string(), input: vec![ "data".to_string(), "starts".to_string(), "ends".to_string(), "axes".to_string(), "steps".to_string(), ], output: vec!["result".to_string()], ..NodeProto::default() }], input: ["data", "starts", "ends", "axes", "steps"] .into_iter() .map(|name| ValueInfoProto { name: name.to_string(), r#type: None, doc_string: "".to_string(), }) .collect(), output: ["result"] .into_iter() .map(|name| ValueInfoProto { name: name.to_string(), r#type: None, doc_string: "".to_string(), }) .collect(), ..GraphProto::default() })); /* data = [ [1, 2, 3, 4], [5, 6, 7, 8], ] axes = [0, 1] starts = [1, 0] ends = [2, 3] steps = [1, 2] result = [ [5, 7], ] */ let outputs = candle_onnx::simple_eval( &model, HashMap::from_iter([ ( "data".to_string(), Tensor::from_vec(vec![1i64, 2, 3, 4, 5, 6, 7, 8], (2, 4), &Device::Cpu)?, ), ( "starts".to_string(), Tensor::from_vec(vec![1i64, 0], (2,), &Device::Cpu)?, ), ( "ends".to_string(), Tensor::from_vec(vec![2i64, 3], (2,), &Device::Cpu)?, ), ( "axes".to_string(), Tensor::from_vec(vec![0i64, 1], (2,), &Device::Cpu)?, ), ( "steps".to_string(), Tensor::from_vec(vec![1i64, 2], (2,), &Device::Cpu)?, ), ]), )?; let actual = outputs.get("result").unwrap().to_vec2::<i64>()?; assert_eq!(actual, vec![vec![5i64, 7]]); /* data = [ [1, 2, 3, 4], [5, 6, 7, 8], ] starts = [0, 1] ends = [-1, 1000] result = [ [2, 3, 4], ] */ let model = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Slice".to_string(), input: vec!["data".to_string(), "starts".to_string(), "ends".to_string()], output: vec!["result".to_string()], ..NodeProto::default() }], input: ["data", "starts", "ends"] .into_iter() .map(|name| ValueInfoProto { name: name.to_string(), r#type: None, doc_string: "".to_string(), }) .collect(), output: ["result"] .into_iter() .map(|name| ValueInfoProto { name: name.to_string(), r#type: None, doc_string: "".to_string(), }) .collect(), ..GraphProto::default() })); let outputs = candle_onnx::simple_eval( &model, HashMap::from_iter([ ( "data".to_string(), Tensor::from_vec(vec![1i64, 2, 3, 4, 5, 6, 7, 8], (2, 4), &Device::Cpu)?, ), ( "starts".to_string(), Tensor::from_vec(vec![0i64, 1], (2,), &Device::Cpu)?, ), ( "ends".to_string(), Tensor::from_vec(vec![-1i64, 1000], (2,), &Device::Cpu)?, ), ]), )?; let actual = outputs.get("result").unwrap().to_vec2::<i64>()?; assert_eq!(actual, vec![vec![2i64, 3, 4]]); Ok(()) } #[test] fn test_lstm() -> Result<()> { // values generated from pytorch, so at least it's close enough to what pytorch does /* #!/usr/bin/env python3 # torch.nn.LSTM(input_size, hidden_size, num_layers=1, bias=True, batch_first=False, dropout=0.0, bidirectional=False, proj_size=0, device=None, dtype=None) import torch rand_gen = torch.Generator() rand_gen.manual_seed(1) 
input_size = 3 hidden_size = 5 batch_size = 1 sequence_length = 4 number_directions = 1 rnn = torch.nn.LSTM(input_size,hidden_size) weight_ih_l0 = torch.randn(rnn.weight_ih_l0.shape, generator=rand_gen) weight_hh_l0 = torch.randn(rnn.weight_hh_l0.shape, generator=rand_gen) bias_ih_l0 = torch.randn(rnn.bias_ih_l0.shape, generator=rand_gen) bias_hh_l0 = torch.randn(rnn.bias_hh_l0.shape, generator=rand_gen) rnn.weight_ih_l0 = torch.nn.Parameter(weight_ih_l0) rnn.weight_hh_l0 = torch.nn.Parameter(weight_hh_l0) rnn.bias_ih_l0 = torch.nn.Parameter(bias_ih_l0) rnn.bias_hh_l0 = torch.nn.Parameter(bias_hh_l0) input = torch.randn(sequence_length, batch_size, input_size, generator=rand_gen) h0 = torch.randn(number_directions, batch_size, hidden_size, generator=rand_gen) c0 = torch.randn(number_directions, batch_size, hidden_size, generator=rand_gen) output, (hn, cn) = rnn(input, (h0, c0)) def fmt_tensor(t): return "Tensor::from_vec::<_, f32>(vec!"+ str(t.flatten().tolist()) + ", (" + "".join([str(n)+"," for n in t.shape])+"), &Device::Cpu)?" print("let input_size = ", input_size, ";") print("let hidden_size = ", hidden_size, ";") print("let batch_size = ", batch_size, ";") print("let sequence_length = ", sequence_length, ";") print("let number_directions = ", number_directions, ";") print("let weight_ih_l0 = ", fmt_tensor(rnn.weight_ih_l0), ";") print("let weight_hh_l0 = ", fmt_tensor(rnn.weight_hh_l0), ";") print("let bias_ih_l0 = ", fmt_tensor(rnn.bias_ih_l0), ";") print("let bias_hh_l0 = ", fmt_tensor(rnn.bias_hh_l0), ";") print("let input = ", fmt_tensor(input), ";") print("let h0 = ", fmt_tensor(h0), ";") print("let c0 = ", fmt_tensor(c0), ";") print("let output = ", fmt_tensor(output), ";") print("let hn = ", fmt_tensor(hn), ";") print("let cn = ", fmt_tensor(cn), ";") */ let input_size = 3; let hidden_size = 5; let batch_size = 1; let sequence_length = 4; let number_directions = 1; let weight_ih_l0 = Tensor::from_vec::<_, f32>( vec![ -1.525_595_9, -0.750_231_8, -0.653_980_9, -1.609_484_8, -0.100_167_18, -0.609_188_9, -0.979_772_27, -1.609_096_3, -0.712_144_6, 0.303_722, -0.777_314_3, -0.251_455_25, -0.222_270_49, 1.687_113_4, 0.228_425_17, 0.467_635_5, -0.696_972_4, -1.160_761_5, 0.699_542_4, 0.199_081_63, 0.865_692_4, 0.244_403_9, -0.662_911_36, 0.807_308_26, 1.101_680_6, -0.175_936_04, -2.245_557_8, -1.446_458, 0.061_155_282, -0.617_744_45, -0.798_069_83, -0.131_623_21, 1.879_345_8, -0.072_131_78, 0.157_770_6, -0.773_454_9, 0.199_056_5, 0.045_702_778, 0.152_956_92, -0.475_678_8, -0.111_019_83, 0.292_735_25, -0.157_845_15, -0.028_787_14, 0.453_254_58, 1.142_161_1, 0.248_610_7, -1.775_400_8, -0.025_502_462, -1.023_330_6, -0.596_185_15, -1.005_530_7, 0.428_542_3, 1.476_077_8, -1.786_867_9, 1.610_317_6, -0.703_956_66, -0.185_265_8, -0.996_235_1, -0.831_255_26, ], (20, 3), &Device::Cpu, )?; let weight_hh_l0 = Tensor::from_vec::<_, f32>( vec![ 0.409_972_43, 0.408_450_66, 0.257_865_4, 1.095_021_4, -0.506_486_6, 0.099_775_404, -0.653_973_4, 0.731_693_7, -1.456_733, 1.608_935_4, 0.093_769_975, -1.259_749, 0.254_633_5, -0.501_957_3, -1.041_2, 0.732_267_2, 1.307_535_5, -1.162_798_8, 0.119_636_11, -0.163_135_33, 0.661_445_3, 1.189_920_5, 0.816_533_9, -0.913_523_6, -0.353_806_53, 0.763_927_04, -0.588_950_7, -0.763_597_37, 1.335_205_7, 0.604_273_6, -0.103_442_08, -0.151_216_92, 1.246_568_3, 0.505_721_4, 0.950_511_2, 1.296_648_3, 0.873_796_3, -0.560_259_4, 1.285_784_5, 0.816_823_84, -1.464_799_4, -1.262_928_4, 1.122_018_8, 1.566_334_1, 2.558_138_4, -0.233_363_88, -0.013_472_13, 1.860_634_8, 1.549_620_5, 
0.347_629_25, 0.093_008_03, 0.614_740_3, 0.712_364_55, -1.776_507_3, 0.353_864_58, 1.199_613_2, -0.712_258_93, -0.620_034_4, -0.228_134_95, -0.789_274_63, -1.611_111_8, -1.871_612_9, 0.543_083_6, 0.660_678_6, 0.270_527_72, 0.559_691_97, -0.318_396_3, 1.511_720_7, -1.363_267_2, -0.983_219_6, 1.511_266_7, 0.641_870_74, -0.747_445_9, -0.923_438_55, 0.573_398_4, -0.109_299_51, 0.518_112_1, 0.106_535_35, 0.269_240_77, 1.324_768, 0.037_456_9, -0.637_839_3, -0.814_755_44, -0.689_506_53, 0.843_654_3, 1.165_701_3, 0.526_932_2, 1.619_253_3, -0.963_976_26, 0.141_520_38, -0.163_660_96, -0.358_222_57, 1.722_279_3, -0.303_575_6, 0.238_874_2, 1.344_001_2, 0.103_225_69, 1.100_354_2, -0.341_680_2, 0.947_338_9, ], (20, 5), &Device::Cpu, )?; let bias_ih_l0 = Tensor::from_vec::<_, f32>( vec![ -0.568_515_96, 0.837_596_2, 1.783_660_7, -0.195_424_66, 0.235_193_13, 1.914_243_3, 1.836_411_1, 1.324_532_4, -0.070_514_58, 0.346_979_4, -0.653_679_6, 1.558_620_2, 0.218_566_15, -0.574_307_26, 1.457_125_1, 1.770_955_7, -2.017_3, 0.423_503_2, 0.573_022, -1.796_243, ], (20,), &Device::Cpu, )?; let bias_hh_l0 = Tensor::from_vec::<_, f32>( vec![ 1.247_040_4, 1.273_851_2, 0.390_949_25, 0.387_210_5, 0.144_403_95, 0.777_168_45, -2.338_112_6, -0.829_120_4, 1.166_139_1, 1.478_657_5, 0.267_608_73, 0.756_119_85, -0.587_336_1, -2.061_920_6, 0.430_473_48, 0.337_656_62, -0.343_785_35, -0.617_226_06, 1.252_969_3, -0.051_417_42, ], (20,), &Device::Cpu, )?; let input = Tensor::from_vec::<_, f32>( vec![ 0.647_212_8, -0.041_167_17, -0.177_493_08, -0.500_039_3, 0.867_274_94, -0.273_192_23, -0.460_768_13, -0.099_093_71, 0.472_844_8, 1.004_948_5, -0.287_142_04, -1.161_862_1, ], (4, 1, 3), &Device::Cpu, )?; let h0 = Tensor::from_vec::<_, f32>( vec![ 0.027_581_785, 0.565_238_24, -0.011_487_379, 0.670_640_05, -0.492_925_05, ], (1, 1, 5), &Device::Cpu, )?; let c0 = Tensor::from_vec::<_, f32>( vec![ 1.505_028_5, -2.326_355, 1.616_89, -0.902_623_8, 0.173_668_24, ], (1, 1, 5), &Device::Cpu, )?; let output = Tensor::from_vec::<_, f32>( vec![ 0.595_601_7, -0.017_232_792, 0.110_355_72, -0.493_231_74, 0.047_632_16, 0.635_845_2, 0.040_328_12, -0.378_861_16, -0.746_434, 0.200_809_09, 0.584_026_5, 0.145_328_82, -0.734_529_85, -0.521_430_43, 0.219_038_17, 0.742_045_16, 0.319_438_8, -0.047_266_465, -0.282_384_96, 0.271_313_4, ], (4, 1, 5), &Device::Cpu, )?; let hn = Tensor::from_vec::<_, f32>( vec![ 0.742_045_16, 0.319_438_8, -0.047_266_465, -0.282_384_96, 0.271_313_4, ], (1, 1, 5), &Device::Cpu, )?; let cn = Tensor::from_vec::<_, f32>( vec![ 0.963_055_85, 1.003_307, -1.754_899, -1.596_712_2, 0.825_292_47, ], (1, 1, 5), &Device::Cpu, )?; // end of generated values let model = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "LSTM".to_string(), name: "LSTM_test".to_string(), attribute: vec![AttributeProto { name: "hidden_size".to_string(), r#type: AttributeType::Int.into(), i: hidden_size as i64, ..AttributeProto::default() }], input: vec![ "input".to_string(), "w".to_string(), "r".to_string(), "b".to_string(), // b "".to_string(), // seq_lens "h".to_string(), "c".to_string(), ], output: vec!["output".to_string(), "hn".to_string(), "cn".to_string()], ..NodeProto::default() }], input: ["input", "w", "r", "b", "h", "c"] .into_iter() .map(|name| ValueInfoProto { name: name.to_string(), ..ValueInfoProto::default() }) .collect(), output: ["output", "hn", "cn"] .into_iter() .map(|name| ValueInfoProto { name: name.to_string(), ..ValueInfoProto::default() }) .collect(), ..GraphProto::default() })); // pytorch stores weight and bias as 
[ifco] but we want it as [iofc] // so we need to re-arrange the tensors a bit let idx_iofc = { let stride = hidden_size as i64; let dev = weight_ih_l0.device(); let idx_i = Tensor::arange(0, stride, dev)?; let idx_f = Tensor::arange(stride, 2 * stride, dev)?; let idx_g = Tensor::arange(2 * stride, 3 * stride, dev)?; let idx_o = Tensor::arange(3 * stride, 4 * stride, dev)?; Tensor::cat(&[&idx_i, &idx_o, &idx_f, &idx_g], 0)? }; let w = weight_ih_l0.index_select(&idx_iofc, 0)?; let w = w.reshape((number_directions, 4 * hidden_size, input_size))?; let r = weight_hh_l0.index_select(&idx_iofc, 0)?; let r = r.reshape((number_directions, 4 * hidden_size, hidden_size))?; let wb = bias_ih_l0.index_select(&idx_iofc, 0)?; let rb = bias_hh_l0.index_select(&idx_iofc, 0)?; let b = Tensor::cat(&[wb, rb], 0)?.reshape((number_directions, 8 * hidden_size))?; let output = output.reshape((sequence_length, number_directions, batch_size, hidden_size))?; let result = simple_eval( &model, HashMap::from_iter([ ("input".to_string(), input), ("w".to_string(), w), ("r".to_string(), r), ("b".to_string(), b), ("h".to_string(), h0), ("c".to_string(), c0), ]), )?; let actual_output = result.get("output").unwrap(); assert_eq!(output.dims(), actual_output.dims()); let actual_hn = result.get("hn").unwrap(); assert_eq!(hn.dims(), actual_hn.dims()); let actual_cn = result.get("cn").unwrap(); assert_eq!(cn.dims(), actual_cn.dims()); let diff_close_enough = |a: &Tensor, b| -> Result<_> { let diffs = a.sub(b)?.flatten_all()?.to_vec1::<f32>()?; Ok(diffs.iter().all(|f| f.abs() < 0.0001)) }; assert!( diff_close_enough(&output, actual_output)?, "output did not match expected\n{actual_output}\n{output}", ); assert!( diff_close_enough(&hn, actual_hn)?, "hn did not match expected\n{actual_hn}\n{hn}", ); assert!( diff_close_enough(&cn, actual_cn)?, "cn did not match expected\n{actual_cn}\n{cn}", ); Ok(()) } #[test] fn test_expand_dim_changed() -> Result<()> { // Create a manual graph for the Expand operation let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Expand".to_string(), domain: "".to_string(), attribute: vec![], input: vec!["data".to_string(), "new_shape".to_string()], output: vec!["expanded".to_string()], name: "".to_string(), doc_string: "".to_string(), }], input: vec![ ValueInfoProto { name: "data".to_string(), doc_string: "".to_string(), r#type: None, }, ValueInfoProto { name: "new_shape".to_string(), doc_string: "".to_string(), r#type: None, }, ], output: vec![ValueInfoProto { name: "expanded".to_string(), doc_string: "".to_string(), r#type: None, }], ..GraphProto::default() })); // Input tensor with shape [3, 1] let data = Tensor::from_vec(vec![1.0f32, 2.0f32, 3.0f32], (3, 1), &Device::Cpu)?; // New shape tensor: [2, 1, 6] let new_shape = Tensor::from_vec(vec![2i64, 1, 6], (3,), &Device::Cpu)?; // Expected output after expansion let expected = Tensor::from_vec( vec![ 1.0f32, 1.0f32, 1.0f32, 1.0f32, 1.0f32, 1.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, 1.0f32, 1.0f32, 1.0f32, 1.0f32, 1.0f32, 1.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, ], (2, 3, 6), &Device::Cpu, )?; // Execute the model evaluation let inputs = HashMap::from_iter([ ("data".to_string(), data), ("new_shape".to_string(), new_shape), ]); let result = candle_onnx::simple_eval(&manual_graph, inputs)?; // Retrieve and compare the result let expanded = result.get("expanded").expect("Output 'expanded' 
not found"); assert_eq!(expanded.to_vec3::<f32>()?, expected.to_vec3::<f32>()?); Ok(()) } fn make_graph_helper( op_name: &str, inputs: &[&str], outputs: &[&str], attribs: Vec<AttributeProto>, ) -> ModelProto { create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: op_name.to_string(), domain: "".to_string(), attribute: attribs, input: inputs.iter().map(|s| s.to_string()).collect(), output: outputs.iter().map(|s| s.to_string()).collect(), name: "".to_string(), doc_string: "".to_string(), }], input: inputs .iter() .map(|name| ValueInfoProto { name: name.to_string(), ..ValueInfoProto::default() }) .collect(), output: outputs .iter() .map(|name| ValueInfoProto { name: name.to_string(), ..ValueInfoProto::default() }) .collect(), ..GraphProto::default() })) } #[test] fn test_expand_dim_unchanged() -> Result<()> { // Create a manual graph for the Expand operation let manual_graph = make_graph_helper("Expand", &["data", "new_shape"], &["expanded"], vec![]); // Input tensor with shape [3, 1] and dtype f32 let data = Tensor::from_vec(vec![1.0f32, 2.0f32, 3.0f32], (3, 1), &Device::Cpu)?; // New shape tensor: [3, 4] let new_shape = Tensor::from_vec(vec![3i64, 4], (2,), &Device::Cpu)?; // Expected output after expansion, dtype f32 let expected = Tensor::from_vec( vec![ 1.0f32, 1.0f32, 1.0f32, 1.0f32, 2.0f32, 2.0f32, 2.0f32, 2.0f32, 3.0f32, 3.0f32, 3.0f32, 3.0f32, ], (3, 4), &Device::Cpu, )?; // Execute the model evaluation let inputs = HashMap::from_iter([ ("data".to_string(), data), ("new_shape".to_string(), new_shape), ]); let result = candle_onnx::simple_eval(&manual_graph, inputs)?; // Retrieve and compare the result let expanded = result.get("expanded").expect("Output 'expanded' not found"); assert_eq!(expanded.to_vec2::<f32>()?, expected.to_vec2::<f32>()?); Ok(()) } fn make_split_graph_helper(inputs: &[&str], outputs: &[&str], axis: i64) -> ModelProto { let attribs = vec![AttributeProto { name: "axis".to_string(), r#type: AttributeType::Int.into(), i: axis, ..AttributeProto::default() }]; make_graph_helper("Split", inputs, outputs, attribs) } #[test] fn test_split_equal_parts_1d_opset13() -> Result<()> { let input = Tensor::from_vec( vec![1.0f32, 2.0f32, 3.0f32, 4.0f32, 5.0f32, 6.0f32], (6,), &Device::Cpu, )?; let mut inputs = HashMap::new(); inputs.insert("input".to_string(), input); { let manual_graph = make_split_graph_helper(&["input"], &["output_1", "output_2", "output_3"], 0); let eval = candle_onnx::simple_eval(&manual_graph, inputs.clone())?; assert_eq!(eval.len(), 3); let out1 = eval.get("output_1").expect("Output 'output_1' not found"); let out2 = eval.get("output_2").expect("Output 'output_2' not found"); let out3 = eval.get("output_3").expect("Output 'output_3' not found"); assert_eq!(out1.to_vec1::<f32>()?, vec![1.0f32, 2.0f32]); assert_eq!(out2.to_vec1::<f32>()?, vec![3.0f32, 4.0f32]); assert_eq!(out3.to_vec1::<f32>()?, vec![5.0f32, 6.0f32]); } { let splits = Tensor::from_vec(vec![2i64, 4], (2,), &Device::Cpu)?; inputs.insert("split".to_string(), splits); let manual_graph = make_split_graph_helper(&["input", "split"], &["output_1", "output_2"], 0); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 2); let out1 = eval.get("output_1").expect("Output 'output_1' not found"); let out2 = eval.get("output_2").expect("Output 'output_2' not found"); assert_eq!(out1.to_vec1::<f32>()?, vec![1.0f32, 2.0f32]); assert_eq!(out2.to_vec1::<f32>()?, vec![3.0f32, 4.0f32, 5.0f32, 6.0f32]); } Ok(()) } fn make_reduce_sum_graph_helper( inputs: 
&[&str], outputs: &[&str], keepdims: Option<i64>, noop_with_empty_axes: Option<i64>, ) -> ModelProto { let mut attribs = vec![]; if let Some(keepdims) = keepdims { attribs.push(AttributeProto { name: "keepdims".to_string(), r#type: AttributeType::Int.into(), i: keepdims, ..AttributeProto::default() }); } if let Some(noop_with_empty_axes) = noop_with_empty_axes { attribs.push(AttributeProto { name: "noop_with_empty_axes".to_string(), r#type: AttributeType::Ints.into(), i: noop_with_empty_axes, ..AttributeProto::default() }); } make_graph_helper("ReduceSum", inputs, outputs, attribs) } #[test] fn test_reduce_sum_default_axes_keepdims() -> Result<()> { let manual_graph = make_reduce_sum_graph_helper(&["data", "axes"], &["reduced"], Some(1), None); // Test with example data { let data = Tensor::from_vec( vec![ 1.0f32, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, ], (3, 2, 2), &Device::Cpu, )?; // let axes = Tensor::from_vec(Vec::<i64>::new(), (0,), &Device::Cpu)?; let mut inputs = HashMap::new(); inputs.insert("data".to_string(), data); // inputs.insert("axes".to_string(), axes); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let reduced = eval.get("reduced").expect("Output 'reduced' not found"); let expected = Tensor::from_vec(vec![78.0f32], (1, 1, 1), &Device::Cpu)?; assert_eq!(reduced.to_vec3::<f32>()?, expected.to_vec3::<f32>()?); } { let data = Tensor::from_vec( vec![ -5.2f32, 7.8, -3.1, 9.4, 2.6, -8.7, 4.3, -1.9, 6.5, -0.8, -7.2, 3.6, ], (3, 2, 2), &Device::Cpu, )?; let mut inputs = HashMap::new(); inputs.insert("data".to_string(), data.clone()); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let reduced = eval.get("reduced").expect("Output 'reduced' not found"); let expected = data.sum_all()?.reshape((1, 1, 1))?; assert_eq!(reduced.to_vec3::<f32>()?, expected.to_vec3::<f32>()?); } Ok(()) } #[test] fn test_reduce_sum_do_not_keep_dims() -> Result<()> { let manual_graph = make_reduce_sum_graph_helper(&["data", "axes"], &["reduced"], Some(0), None); // Test with example data { let data = Tensor::from_vec( vec![ 1.0f32, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, ], (3, 2, 2), &Device::Cpu, )?; let axes = Tensor::from_vec(vec![1i64], (1,), &Device::Cpu)?; let mut inputs = HashMap::new(); inputs.insert("data".to_string(), data); inputs.insert("axes".to_string(), axes); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let reduced = eval.get("reduced").expect("Output 'reduced' not found"); let expected = Tensor::from_vec( vec![4.0f32, 6.0, 12.0, 14.0, 20.0, 22.0], (3, 2), &Device::Cpu, )?; assert_eq!(reduced.to_vec2::<f32>()?, expected.to_vec2::<f32>()?); } // Test with random data { let _shape = (3, 2, 2); let data = Tensor::from_vec( vec![ -5.2f32, 7.8, -3.1, 9.4, 2.6, -8.7, 4.3, -1.9, 6.5, -0.8, -7.2, 3.6, ], (3, 2, 2), &Device::Cpu, )?; let axes = Tensor::from_vec(vec![1i64], (1,), &Device::Cpu)?; let mut inputs = HashMap::new(); inputs.insert("data".to_string(), data.clone()); inputs.insert("axes".to_string(), axes); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let reduced = eval.get("reduced").expect("Output 'reduced' not found"); // Calculate expected result let expected = data.sum(1)?; assert_eq!(reduced.to_vec2::<f32>()?, expected.to_vec2::<f32>()?); } Ok(()) } // Xor #[test] fn test_xor() -> Result<()> { // tests based on: https://github.com/onnx/onnx/blob/main/docs/Operators.md#Xor xor // 2d 
test( &[[0_u8, 1, 0, 0], [0, 0, 1, 1], [0, 1, 1, 1]], &[[1_u8, 1, 0, 0], [1, 0, 0, 1], [1, 1, 1, 0]], &[[1_u8, 0, 0, 0], [1, 0, 1, 0], [1, 0, 0, 1]], )?; // 3d test( &[ [ [0_u8, 1, 1, 1, 1], [0, 1, 1, 0, 0], [1, 1, 1, 1, 1], [0, 0, 0, 0, 1], ], [ [0, 0, 1, 1, 1], [1, 0, 1, 1, 1], [1, 1, 0, 0, 1], [1, 0, 0, 1, 0], ], [ [1, 0, 0, 1, 1], [1, 1, 1, 0, 0], [1, 1, 0, 0, 1], [1, 0, 0, 0, 1], ], ], &[ [ [1_u8, 0, 0, 1, 1], [0, 0, 1, 0, 1], [1, 0, 0, 1, 0], [0, 0, 0, 0, 0], ], [ [1, 0, 0, 1, 1], [1, 0, 1, 1, 1], [0, 1, 0, 1, 1], [1, 1, 1, 0, 0], ], [ [0, 1, 1, 1, 0], [1, 1, 0, 1, 0], [0, 1, 1, 1, 0], [1, 1, 0, 1, 0], ], ], &[ [ [1_u8, 1, 1, 0, 0], [0, 1, 0, 0, 1], [0, 1, 1, 0, 1], [0, 0, 0, 0, 1], ], [ [1, 0, 1, 0, 0], [0, 0, 0, 0, 0], [1, 0, 0, 1, 0], [0, 1, 1, 1, 0], ], [ [1, 1, 1, 0, 1], [0, 0, 1, 1, 0], [1, 0, 1, 1, 1], [0, 1, 0, 1, 1], ], ], )?; // 4d test( &[ [ [[0_u8, 1, 1, 0], [1, 0, 0, 0], [1, 1, 0, 1]], [[1, 1, 0, 1], [0, 0, 0, 1], [0, 0, 0, 1]], ], [ [[1, 1, 0, 0], [1, 0, 1, 0], [1, 0, 0, 0]], [[1, 0, 0, 1], [1, 0, 1, 1], [1, 1, 0, 1]], ], ], &[ [ [[1_u8, 0, 1, 0], [0, 0, 1, 1], [1, 0, 1, 0]], [[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1]], ], [ [[1, 1, 1, 0], [0, 0, 0, 1], [0, 0, 1, 0]], [[0, 0, 0, 0], [1, 0, 0, 0], [1, 1, 1, 1]], ], ], &[ [ [[1_u8, 1, 0, 0], [1, 0, 1, 1], [0, 1, 1, 1]], [[1, 0, 0, 1], [1, 0, 0, 1], [0, 0, 0, 0]], ], [ [[0, 0, 1, 0], [1, 0, 1, 1], [1, 0, 1, 0]], [[1, 0, 0, 1], [0, 0, 1, 1], [0, 0, 1, 0]], ], ], )?; // tests based on: https://github.com/onnx/onnx/blob/main/docs/Operators.md#Xor xor_broadcast // 3d vs 1d test( // Shape (3, 4, 5) &[ [ [0_u8, 0, 0, 0, 1], [0, 1, 0, 1, 1], [1, 0, 0, 1, 1], [0, 0, 1, 0, 1], ], [ [0, 1, 0, 1, 1], [1, 1, 0, 0, 1], [0, 1, 1, 1, 0], [0, 0, 0, 0, 1], ], [ [1, 1, 0, 1, 1], [0, 0, 0, 1, 1], [0, 1, 1, 0, 1], [1, 1, 0, 1, 1], ], ], // shape (5) &[1_u8, 0, 0, 1, 1], // shape (3, 4, 5) &[ [ [1_u8, 0, 0, 1, 0], [1, 1, 0, 0, 0], [0, 0, 0, 0, 0], [1, 0, 1, 1, 0], ], [ [1, 1, 0, 0, 0], [0, 1, 0, 1, 0], [1, 1, 1, 0, 1], [1, 0, 0, 1, 0], ], [ [0, 1, 0, 0, 0], [1, 0, 0, 0, 0], [1, 1, 1, 1, 0], [0, 1, 0, 0, 0], ], ], )?; // 3d vs 2d test( // Shape (3, 4, 5) &[ [ [0_u8, 0, 0, 0, 1], [0, 1, 0, 1, 1], [1, 0, 0, 1, 1], [0, 0, 1, 0, 1], ], [ [0, 1, 0, 1, 1], [1, 1, 0, 0, 1], [0, 1, 1, 1, 0], [0, 0, 0, 0, 1], ], [ [1, 1, 0, 1, 1], [0, 0, 0, 1, 1], [0, 1, 1, 0, 1], [1, 1, 0, 1, 1], ], ], // shape (4, 5) &[ [0_u8, 1, 0, 1, 0], [0, 0, 1, 0, 0], [1, 1, 0, 1, 1], [1, 1, 0, 1, 0], ], // shape (3, 4, 5) &[ [ [0_u8, 1, 0, 1, 1], [0, 1, 1, 1, 1], [0, 1, 0, 0, 0], [1, 1, 1, 1, 1], ], [ [0, 0, 0, 0, 1], [1, 1, 1, 0, 1], [1, 0, 1, 0, 1], [1, 1, 0, 1, 1], ], [ [1, 0, 0, 0, 1], [0, 0, 1, 1, 1], [1, 0, 1, 1, 0], [0, 0, 0, 0, 1], ], ], )?; // 4d vs 2d test( // Shape (2, 3, 3, 4) &[ [ [[1_u8, 0, 0, 1], [1, 1, 0, 0], [0, 1, 0, 0]], [[1, 1, 0, 0], [0, 1, 0, 0], [1, 0, 0, 1]], [[1, 0, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1]], ], [ [[0, 1, 0, 1], [1, 1, 0, 1], [1, 0, 1, 1]], [[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 1]], [[1, 0, 0, 0], [1, 1, 0, 0], [0, 1, 0, 1]], ], ], // shape (3, 4) &[[0_u8, 0, 1, 1], [1, 1, 1, 1], [0, 1, 0, 1]], // shape (2, 3, 3, 4) &[ [ [[1_u8, 0, 1, 0], [0, 0, 1, 1], [0, 0, 0, 1]], [[1, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 0]], [[1, 0, 1, 1], [0, 0, 0, 1], [0, 1, 1, 0]], ], [ [[0, 1, 1, 0], [0, 0, 1, 0], [1, 1, 1, 0]], [[1, 1, 1, 1], [0, 1, 1, 1], [0, 1, 1, 0]], [[1, 0, 1, 1], [0, 0, 1, 1], [0, 0, 0, 0]], ], ], )?; // 4d vs 3d test( // Shape (2, 3, 3, 4) &[ [ [[1_u8, 0, 0, 1], [1, 1, 0, 0], [0, 1, 0, 0]], [[1, 1, 0, 0], [0, 1, 0, 0], [1, 0, 0, 1]], [[1, 0, 0, 0], 
[1, 1, 1, 0], [0, 0, 1, 1]], ], [ [[0, 1, 0, 1], [1, 1, 0, 1], [1, 0, 1, 1]], [[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 1]], [[1, 0, 0, 0], [1, 1, 0, 0], [0, 1, 0, 1]], ], ], // shape (3, 3, 4) &[ [[1_u8, 1, 0, 0], [0, 0, 1, 1], [0, 1, 0, 0]], [[0, 1, 0, 1], [0, 0, 0, 0], [0, 1, 0, 1]], [[0, 1, 1, 0], [1, 0, 1, 1], [1, 1, 0, 1]], ], // shape (2, 3, 3, 4) &[ [ [[0_u8, 1, 0, 1], [1, 1, 1, 1], [0, 0, 0, 0]], [[1, 0, 0, 1], [0, 1, 0, 0], [1, 1, 0, 0]], [[1, 1, 1, 0], [0, 1, 0, 1], [1, 1, 1, 0]], ], [ [[1, 0, 0, 1], [1, 1, 1, 0], [1, 1, 1, 1]], [[1, 0, 0, 1], [1, 0, 0, 0], [0, 1, 1, 0]], [[1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 0]], ], ], )?; // 4d vs 4d test( // Shape (1, 4, 1, 2) &[[[[1_u8, 0]], [[1, 0]], [[1, 0]], [[1, 1]]]], // shape (2, 1, 4, 2) &[ [[[0_u8, 0], [1, 1], [1, 1], [1, 1]]], [[[0, 1], [1, 0], [0, 1], [0, 0]]], ], // shape (2, 4, 4, 2) &[ [ [[1_u8, 0], [0, 1], [0, 1], [0, 1]], [[1, 0], [0, 1], [0, 1], [0, 1]], [[1, 0], [0, 1], [0, 1], [0, 1]], [[1, 1], [0, 0], [0, 0], [0, 0]], ], [ [[1, 1], [0, 0], [1, 1], [1, 0]], [[1, 1], [0, 0], [1, 1], [1, 0]], [[1, 1], [0, 0], [1, 1], [1, 0]], [[1, 0], [0, 1], [1, 0], [1, 1]], ], ], )?; fn test(input: impl NdArray, other: impl NdArray, expected: impl NdArray) -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Xor".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string(), INPUT_Y.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let inputs: HashMap<String, Tensor> = HashMap::from([ (INPUT_X.to_string(), Tensor::new(input, &Device::Cpu)?), (INPUT_Y.to_string(), Tensor::new(other, &Device::Cpu)?), ]); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; assert_eq!(eval.len(), 1); let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); let expected = Tensor::new(expected, &Device::Cpu)?; match expected.dims().len() { 0 => { assert_eq!(z.to_vec0::<u8>()?, expected.to_vec0::<u8>()?) } 1 => { assert_eq!(z.to_vec1::<u8>()?, expected.to_vec1::<u8>()?) } 2 => { assert_eq!(z.to_vec2::<u8>()?, expected.to_vec2::<u8>()?) } 3 => { assert_eq!(z.to_vec3::<u8>()?, expected.to_vec3::<u8>()?) } 4 => { // Candle has no method equivallent to `to_vec4()` // So, as a hack, we flatten it to a single dim vec to test the results assert_eq!( z.flatten_all()?.to_vec1::<u8>()?, expected.flatten_all()?.to_vec1::<u8>()? 
) } _ => unreachable!(), }; Ok(()) } Ok(()) } #[test] fn test_sign_operation() -> Result<()> { let manual_graph = create_model_proto_with_graph(Some(GraphProto { node: vec![NodeProto { op_type: "Sign".to_string(), domain: "".to_string(), attribute: vec![], input: vec![INPUT_X.to_string()], output: vec![OUTPUT_Z.to_string()], name: "".to_string(), doc_string: "".to_string(), }], name: "".to_string(), initializer: vec![], input: vec![], output: vec![ValueInfoProto { name: OUTPUT_Z.to_string(), doc_string: "".to_string(), r#type: None, }], value_info: vec![], doc_string: "".to_string(), sparse_initializer: vec![], quantization_annotation: vec![], })); let mut inputs: HashMap<String, Tensor> = HashMap::new(); inputs.insert( INPUT_X.to_string(), Tensor::new(vec![-2f32, -1., 0., 1., 2.], &Device::Cpu)?, ); let eval = candle_onnx::simple_eval(&manual_graph, inputs)?; let z = eval.get(OUTPUT_Z).expect("Output 'z' not found"); assert_eq!( z.to_dtype(candle::DType::I64)?.to_vec1::<i64>()?.to_vec(), vec![-1, -1, 0, 1, 1] ); Ok(()) }
5
0
hf_public_repos/candle
hf_public_repos/candle/candle-kernels/build.rs
fn main() {
    println!("cargo:rerun-if-changed=build.rs");
    println!("cargo:rerun-if-changed=src/compatibility.cuh");
    println!("cargo:rerun-if-changed=src/cuda_utils.cuh");
    println!("cargo:rerun-if-changed=src/binary_op_macros.cuh");
    let builder = bindgen_cuda::Builder::default();
    println!("cargo:info={builder:?}");
    let bindings = builder.build_ptx().unwrap();
    bindings.write("src/lib.rs").unwrap();
}
6
0
hf_public_repos/candle
hf_public_repos/candle/candle-kernels/Cargo.toml
[package]
name = "candle-kernels"
version = "0.8.0"
edition = "2021"

description = "CUDA kernels for Candle"
repository = "https://github.com/huggingface/candle"
keywords = ["blas", "tensor", "machine-learning"]
categories = ["science"]
license = "MIT OR Apache-2.0"

[dependencies]

[build-dependencies]
bindgen_cuda = "0.1.1"
7
0
hf_public_repos/candle
hf_public_repos/candle/candle-kernels/README.md
# candle-kernels

This crate contains CUDA kernels used from candle. Some of these implementations come from the [dfdx crate](https://github.com/coreylowman/dfdx).
8
0
hf_public_repos/candle/candle-kernels
hf_public_repos/candle/candle-kernels/src/lib.rs
pub const AFFINE: &str = include_str!(concat!(env!("OUT_DIR"), "/affine.ptx"));
pub const BINARY: &str = include_str!(concat!(env!("OUT_DIR"), "/binary.ptx"));
pub const CAST: &str = include_str!(concat!(env!("OUT_DIR"), "/cast.ptx"));
pub const CONV: &str = include_str!(concat!(env!("OUT_DIR"), "/conv.ptx"));
pub const FILL: &str = include_str!(concat!(env!("OUT_DIR"), "/fill.ptx"));
pub const INDEXING: &str = include_str!(concat!(env!("OUT_DIR"), "/indexing.ptx"));
pub const QUANTIZED: &str = include_str!(concat!(env!("OUT_DIR"), "/quantized.ptx"));
pub const REDUCE: &str = include_str!(concat!(env!("OUT_DIR"), "/reduce.ptx"));
pub const SORT: &str = include_str!(concat!(env!("OUT_DIR"), "/sort.ptx"));
pub const TERNARY: &str = include_str!(concat!(env!("OUT_DIR"), "/ternary.ptx"));
pub const UNARY: &str = include_str!(concat!(env!("OUT_DIR"), "/unary.ptx"));
9
0
hf_public_repos/api-inference-community/docker_images/spacy
hf_public_repos/api-inference-community/docker_images/spacy/app/main.py
import functools import logging import os from typing import Dict, Type from api_inference_community.routes import pipeline_route, status_ok from app.pipelines import ( Pipeline, SentenceSimilarityPipeline, TextClassificationPipeline, TokenClassificationPipeline, ) from starlette.applications import Starlette from starlette.middleware import Middleware from starlette.middleware.gzip import GZipMiddleware from starlette.routing import Route TASK = os.getenv("TASK") MODEL_ID = os.getenv("MODEL_ID") logger = logging.getLogger(__name__) # Add the allowed tasks # Supported tasks are: # - text-generation # - text-classification # - token-classification # - translation # - summarization # - automatic-speech-recognition # - sentence-similarity # - ... # For instance # from app.pipelines import AutomaticSpeechRecognitionPipeline # ALLOWED_TASKS = {"automatic-speech-recognition": AutomaticSpeechRecognitionPipeline} # You can check the requirements and expectations of each pipelines in their respective # directories. Implement directly within the directories. ALLOWED_TASKS: Dict[str, Type[Pipeline]] = { "token-classification": TokenClassificationPipeline, "text-classification": TextClassificationPipeline, "sentence-similarity": SentenceSimilarityPipeline, } @functools.lru_cache() def get_pipeline() -> Pipeline: task = os.environ["TASK"] model_id = os.environ["MODEL_ID"] if task not in ALLOWED_TASKS: raise EnvironmentError(f"{task} is not a valid pipeline for model : {model_id}") return ALLOWED_TASKS[task](model_id) routes = [ Route("/{whatever:path}", status_ok), Route("/{whatever:path}", pipeline_route, methods=["POST"]), ] middleware = [Middleware(GZipMiddleware, minimum_size=1000)] if os.environ.get("DEBUG", "") == "1": from starlette.middleware.cors import CORSMiddleware middleware.append( Middleware( CORSMiddleware, allow_origins=["*"], allow_headers=["*"], allow_methods=["*"], ) ) app = Starlette(routes=routes, middleware=middleware) @app.on_event("startup") async def startup_event(): logger = logging.getLogger("uvicorn.access") handler = logging.StreamHandler() handler.setFormatter(logging.Formatter("%(asctime)s - %(levelname)s - %(message)s")) logger.handlers = [handler] # Link between `api-inference-community` and framework code. app.get_pipeline = get_pipeline try: get_pipeline() except Exception: # We can fail so we can show exception later. pass if __name__ == "__main__": try: get_pipeline() except Exception: # We can fail so we can show exception later. pass
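# --- Illustrative notes, not part of the original file ---
# The app above is configured entirely through environment variables and is
# typically served with uvicorn (the "uvicorn.access" logger configured in
# startup_event() assumes it). The exact entrypoint baked into the Docker image
# may differ; the values below are assumptions borrowed from the test suite.
#
#   TASK=token-classification \
#   MODEL_ID=spacy/en_core_web_sm \
#   PIP_CACHE=/tmp/pip-cache \
#   uvicorn app.main:app --host 0.0.0.0 --port 80
#
# Requests are then POSTed to the pipeline route, e.g.:
#
#   curl -X POST localhost:80/ \
#        -H "Content-Type: application/json" \
#        -d '{"inputs": "Hello, my name is John and I live in New York"}'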
0
0
hf_public_repos/api-inference-community/docker_images/spacy/app
hf_public_repos/api-inference-community/docker_images/spacy/app/pipelines/sentence_similarity.py
import os
import subprocess
import sys
from typing import Dict, List, Union

from app.pipelines import Pipeline


class SentenceSimilarityPipeline(Pipeline):
    def __init__(
        self,
        model_id: str,
    ):
        # At the time, only public models from spaCy are allowed in the inference API.
        full_model_path = model_id.split("/")
        if len(full_model_path) != 2:
            raise ValueError(
                f"Invalid model_id: {model_id}. It should have a namespace (:namespace:/:model_name:)"
            )
        namespace, model_name = full_model_path
        hf_endpoint = os.getenv("HF_ENDPOINT", "https://huggingface.co")
        package = f"{hf_endpoint}/{namespace}/{model_name}/resolve/main/{model_name}-any-py3-none-any.whl"
        cache_dir = os.environ["PIP_CACHE"]
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--cache-dir", cache_dir, package]
        )
        import spacy

        self.model = spacy.load(model_name)

    def __call__(self, inputs: Dict[str, Union[str, List[str]]]) -> List[float]:
        """
        Args:
            inputs (:obj:`dict`):
                a dictionary containing two keys, 'source_sentence' mapping to
                the sentence that will be compared against all the others, and
                'sentences', mapping to a list of strings to which the source
                will be compared.
        Return:
            A :obj:`list` of floats: Some similarity measure between
            `source_sentence` and each sentence from `sentences`.
        """
        source_sentence = inputs["source_sentence"]
        source_doc = self.model(source_sentence)

        similarities = []
        for sentence in inputs["sentences"]:
            search_doc = self.model(sentence)
            similarities.append(source_doc.similarity(search_doc))

        return similarities
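# Illustrative usage sketch, not part of the original file. The model id and
# sentences are borrowed from the test suite; the PIP_CACHE value is an
# assumption and must point at a writable directory because __init__
# pip-installs the model wheel from the Hub.
if __name__ == "__main__":
    os.environ.setdefault("PIP_CACHE", "/tmp/pip-cache")
    pipeline = SentenceSimilarityPipeline("spacy/en_core_web_sm")
    scores = pipeline(
        {
            "source_sentence": "I am a very happy man",
            "sentences": ["I am a super happy man", "I am a sad man"],
        }
    )
    # One float per candidate sentence, e.g. roughly [0.9, 0.6].
    print(scores)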
1
0
hf_public_repos/api-inference-community/docker_images/spacy/app
hf_public_repos/api-inference-community/docker_images/spacy/app/pipelines/base.py
from abc import ABC, abstractmethod
from typing import Any


class Pipeline(ABC):
    @abstractmethod
    def __init__(self, model_id: str):
        raise NotImplementedError("Pipelines should implement an __init__ method")

    @abstractmethod
    def __call__(self, inputs: Any) -> Any:
        raise NotImplementedError("Pipelines should implement a __call__ method")


class PipelineException(Exception):
    pass
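# Illustrative only, not part of the original file: a minimal concrete subclass
# sketching the contract the ABC enforces. "EchoPipeline" is a made-up name and
# it does not load or run any real model.
class EchoPipeline(Pipeline):
    def __init__(self, model_id: str):
        # A real pipeline would download/load a model for `model_id` here.
        self.model_id = model_id

    def __call__(self, inputs: Any) -> Any:
        # A real pipeline would run inference here; this one echoes its input.
        return inputs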
2
0
hf_public_repos/api-inference-community/docker_images/spacy/app
hf_public_repos/api-inference-community/docker_images/spacy/app/pipelines/__init__.py
from app.pipelines.base import Pipeline, PipelineException  # isort:skip

from app.pipelines.sentence_similarity import SentenceSimilarityPipeline
from app.pipelines.text_classification import TextClassificationPipeline
from app.pipelines.token_classification import TokenClassificationPipeline
3
0
hf_public_repos/api-inference-community/docker_images/spacy/app
hf_public_repos/api-inference-community/docker_images/spacy/app/pipelines/text_classification.py
import os
import subprocess
import sys
from typing import Dict, List

from app.pipelines import Pipeline


class TextClassificationPipeline(Pipeline):
    def __init__(
        self,
        model_id: str,
    ):
        # At the time, only public models from spaCy are allowed in the inference API.
        full_model_path = model_id.split("/")
        if len(full_model_path) != 2:
            raise ValueError(
                f"Invalid model_id: {model_id}. It should have a namespace (:namespace:/:model_name:)"
            )
        namespace, model_name = full_model_path
        hf_endpoint = os.getenv("HF_ENDPOINT", "https://huggingface.co")
        package = f"{hf_endpoint}/{namespace}/{model_name}/resolve/main/{model_name}-any-py3-none-any.whl"
        cache_dir = os.environ["PIP_CACHE"]
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--cache-dir", cache_dir, package]
        )
        import spacy

        self.model = spacy.load(model_name)

    def __call__(self, inputs: str) -> List[List[Dict[str, float]]]:
        """
        Args:
            inputs (:obj:`str`):
                a string containing some text
        Return:
            A :obj:`list`: The object returned should be a list of one list like
            [[{"label": "XXX", "score": 0.9939950108528137}]] containing:
                - "label": A string representing what the label/class is. There can be multiple labels.
                - "score": A score between 0 and 1 describing how confident the model is for this label/class.
        """
        doc = self.model(inputs)
        categories = []
        for cat, score in doc.cats.items():
            categories.append({"label": cat, "score": score})
        return [categories]
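# Illustrative usage sketch, not part of the original file. The textcat model id
# is the one used by the test suite; the PIP_CACHE value and the input sentence
# are assumptions, and PIP_CACHE must be writable because __init__ pip-installs
# the model wheel from the Hub.
if __name__ == "__main__":
    os.environ.setdefault("PIP_CACHE", "/tmp/pip-cache")
    pipeline = TextClassificationPipeline("explosion/en_textcat_goemotions")
    result = pipeline("I am so happy today!")
    # A list holding one list of {"label": ..., "score": ...} dicts,
    # one entry per category known to the textcat component.
    print(result)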
4
0
hf_public_repos/api-inference-community/docker_images/spacy/app
hf_public_repos/api-inference-community/docker_images/spacy/app/pipelines/token_classification.py
import os
import subprocess
import sys
from typing import Any, Dict, List

from app.pipelines import Pipeline


class TokenClassificationPipeline(Pipeline):
    def __init__(
        self,
        model_id: str,
    ):
        # At the time, only public models from spaCy are allowed in the inference API.
        full_model_path = model_id.split("/")
        if len(full_model_path) != 2:
            raise ValueError(
                f"Invalid model_id: {model_id}. It should have a namespace (:namespace:/:model_name:)"
            )
        namespace, model_name = full_model_path
        hf_endpoint = os.getenv("HF_ENDPOINT", "https://huggingface.co")
        package = f"{hf_endpoint}/{namespace}/{model_name}/resolve/main/{model_name}-any-py3-none-any.whl"
        cache_dir = os.environ["PIP_CACHE"]
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--cache-dir", cache_dir, package]
        )
        import spacy

        self.model = spacy.load(model_name)

    def __call__(self, inputs: str) -> List[Dict[str, Any]]:
        """
        Args:
            inputs (:obj:`str`):
                a string containing some text
        Return:
            A :obj:`list`: The object returned should be like
            [{"entity_group": "XXX", "word": "some word", "start": 3, "end": 6, "score": 0.82}] containing:
                - "entity_group": A string representing what the entity is.
                - "word": A substring of the original string that was detected as an entity.
                - "start": the starting offset within `inputs` of the detected `word`. inputs[start:end] == word
                - "end": the ending offset within `inputs` of the detected `word`. inputs[start:end] == word
                - "score": A score between 0 and 1 describing how confident the model is for this entity.
        """
        doc = self.model(inputs)
        entities = []
        for ent in doc.ents:
            # Score is currently not well supported, see
            # https://github.com/explosion/spaCy/issues/5917.
            current_entity = {
                "entity_group": ent.label_,
                "word": ent.text,
                "start": ent.start_char,
                "end": ent.end_char,
                "score": 1.0,
            }
            entities.append(current_entity)
        return entities
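# Illustrative usage sketch, not part of the original file. The model id and the
# input sentence are borrowed from the test suite; the PIP_CACHE value is an
# assumption and must be writable because __init__ pip-installs the model wheel
# from the Hub.
if __name__ == "__main__":
    os.environ.setdefault("PIP_CACHE", "/tmp/pip-cache")
    pipeline = TokenClassificationPipeline("spacy/en_core_web_sm")
    entities = pipeline("Hello, my name is John and I live in New York")
    # Each entry looks roughly like:
    # {"entity_group": "PERSON", "word": "John", "start": 18, "end": 22, "score": 1.0}
    print(entities)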
5
0
hf_public_repos/api-inference-community/docker_images/spacy
hf_public_repos/api-inference-community/docker_images/spacy/tests/test_docker_build.py
import os import subprocess from unittest import TestCase class cd: """Context manager for changing the current working directory""" def __init__(self, newPath): self.newPath = os.path.expanduser(newPath) def __enter__(self): self.savedPath = os.getcwd() os.chdir(self.newPath) def __exit__(self, etype, value, traceback): os.chdir(self.savedPath) class DockerBuildTestCase(TestCase): def test_can_build_docker_image(self): with cd(os.path.dirname(os.path.dirname(__file__))): subprocess.check_output(["docker", "build", "."])
6
0
hf_public_repos/api-inference-community/docker_images/spacy
hf_public_repos/api-inference-community/docker_images/spacy/tests/test_api_sentence_similarity.py
import json import os from unittest import TestCase, skipIf from app.main import ALLOWED_TASKS from starlette.testclient import TestClient from tests.test_api import TESTABLE_MODELS @skipIf( "sentence-similarity" not in ALLOWED_TASKS, "sentence-similarity not implemented", ) class SentenceSimilarityTestCase(TestCase): def setUp(self): model_id = TESTABLE_MODELS["sentence-similarity"] self.old_model_id = os.getenv("MODEL_ID") self.old_task = os.getenv("TASK") os.environ["MODEL_ID"] = model_id os.environ["TASK"] = "sentence-similarity" from app.main import app self.app = app @classmethod def setUpClass(cls): from app.main import get_pipeline get_pipeline.cache_clear() def tearDown(self): if self.old_model_id is not None: os.environ["MODEL_ID"] = self.old_model_id else: del os.environ["MODEL_ID"] if self.old_task is not None: os.environ["TASK"] = self.old_task else: del os.environ["TASK"] def test_simple(self): source_sentence = "I am a very happy man" sentences = [ "What is this?", "I am a super happy man", "I am a sad man", "I am a happy dog", ] inputs = {"source_sentence": source_sentence, "sentences": sentences} with TestClient(self.app) as client: response = client.post("/", json={"inputs": inputs}) self.assertEqual( response.status_code, 200, ) content = json.loads(response.content) self.assertEqual(type(content), list) self.assertEqual({type(item) for item in content}, {float}) with TestClient(self.app) as client: response = client.post("/", json=inputs) self.assertEqual( response.status_code, 200, ) content = json.loads(response.content) self.assertEqual(type(content), list) self.assertEqual({type(item) for item in content}, {float}) def test_missing_input_sentences(self): source_sentence = "I am a very happy man" inputs = {"source_sentence": source_sentence} with TestClient(self.app) as client: response = client.post("/", json={"inputs": inputs}) self.assertEqual( response.status_code, 400, ) def test_malformed_input(self): with TestClient(self.app) as client: response = client.post("/", data=b"\xc3\x28") self.assertEqual( response.status_code, 400, ) self.assertEqual( response.content, b'{"error":"\'utf-8\' codec can\'t decode byte 0xc3 in position 0: invalid continuation byte"}', )
7
0
hf_public_repos/api-inference-community/docker_images/spacy
hf_public_repos/api-inference-community/docker_images/spacy/tests/test_api.py
import os from typing import Dict from unittest import TestCase, skipIf from app.main import ALLOWED_TASKS, get_pipeline # Must contain at least one example of each implemented pipeline # Tests do not check the actual values of the model output, so small dummy # models are recommended for faster tests. TESTABLE_MODELS: Dict[str, str] = { # IMPLEMENT_THIS # "automatic-speech-recognition": "mysample-ASR", # "text-generation": "mysample-gpt2", "token-classification": "spacy/en_core_web_sm", "text-classification": "explosion/en_textcat_goemotions", "sentence-similarity": "spacy/en_core_web_sm", } ALL_TASKS = { "automatic-speech-recognition", "audio-source-separation", "feature-extraction", "image-classification", "question-answering", "sentence-similarity", "text-generation", "text-to-speech", } class PipelineTestCase(TestCase): @skipIf( os.path.dirname(os.path.dirname(__file__)).endswith("common"), "common is a special case", ) def test_has_at_least_one_task_enabled(self): self.assertGreater( len(ALLOWED_TASKS.keys()), 0, "You need to implement at least one task" ) def test_unsupported_tasks(self): unsupported_tasks = ALL_TASKS - ALLOWED_TASKS.keys() for unsupported_task in unsupported_tasks: with self.subTest(msg=unsupported_task, task=unsupported_task): with self.assertRaises(EnvironmentError): get_pipeline(unsupported_task, model_id="XX")
8
0
hf_public_repos/api-inference-community/docker_images/spacy
hf_public_repos/api-inference-community/docker_images/spacy/tests/test_api_token_classification.py
import json import os from unittest import TestCase, skipIf from app.main import ALLOWED_TASKS from starlette.testclient import TestClient from tests.test_api import TESTABLE_MODELS @skipIf( "token-classification" not in ALLOWED_TASKS, "token-classification not implemented", ) class TokenClassificationTestCase(TestCase): def setUp(self): model_id = TESTABLE_MODELS["token-classification"] self.old_model_id = os.getenv("MODEL_ID") self.old_task = os.getenv("TASK") os.environ["MODEL_ID"] = model_id os.environ["TASK"] = "token-classification" from app.main import app self.app = app def tearDown(self): if self.old_model_id is not None: os.environ["MODEL_ID"] = self.old_model_id else: del os.environ["MODEL_ID"] if self.old_task is not None: os.environ["TASK"] = self.old_task else: del os.environ["TASK"] def test_simple(self): inputs = "Hello, my name is John and I live in New York" with TestClient(self.app) as client: response = client.post("/", json={"inputs": inputs}) self.assertEqual( response.status_code, 200, ) content = json.loads(response.content) self.assertEqual(type(content), list) self.assertEqual( set(k for el in content for k in el.keys()), {"entity_group", "word", "start", "end", "score"}, ) with TestClient(self.app) as client: response = client.post("/", json=inputs) self.assertEqual( response.status_code, 200, ) content = json.loads(response.content) self.assertEqual(type(content), list) self.assertEqual( set(k for el in content for k in el.keys()), {"entity_group", "word", "start", "end", "score"}, ) def test_malformed_question(self): with TestClient(self.app) as client: response = client.post("/", data=b"\xc3\x28") self.assertEqual( response.status_code, 400, ) self.assertEqual( response.content, b'{"error":"\'utf-8\' codec can\'t decode byte 0xc3 in position 0: invalid continuation byte"}', )
9