repo: string (26 to 115 chars)
file: string (54 to 212 chars)
language: string (2 classes)
license: string (16 classes)
content: string (19 chars to 1.07M)
https://github.com/justmejulian/typst-documentation-template
https://raw.githubusercontent.com/justmejulian/typst-documentation-template/main/sections/background.typ
typst
= Background #rect( width: 100%, radius: 10%, stroke: 0.5pt, fill: yellow, )[ Note: Briefly describe each proven technology or concept that is important for understanding your thesis, and point out why it is interesting for your thesis. Make sure to incorporate references to important literature here. ] == e.g. User Feedback #rect( width: 100%, radius: 10%, stroke: 0.5pt, fill: yellow, )[ Note: This section would summarize the concept of User Feedback using definitions and historical overviews, pointing out the most important aspects of User Feedback. ] == e.g. Representational State Transfer #rect( width: 100%, radius: 10%, stroke: 0.5pt, fill: yellow, )[ Note: This section would summarize the architectural style Representational State Transfer (REST) using definitions and historical overviews, pointing out the most important aspects of the architecture. ] == e.g. Scrum #rect( width: 100%, radius: 10%, stroke: 0.5pt, fill: yellow, )[ Note: This section would summarize the agile method Scrum using definitions and historical overviews, pointing out the most important aspects of Scrum. ]
https://github.com/alisa101rs/resume
https://raw.githubusercontent.com/alisa101rs/resume/master/resume.typ
typst
#import "./template.typ": * #show: resume.with( author: ( firstname: "Alisa", lastname: "Gorelova", email: "<EMAIL>", phone: "(+81) 7083148567", github: "alisa101rs", linkedin: "alisa-gorelova-88940a230", positions: ( "Software Engineer", "Software Architect", ) ), date: "May, 2024" ) Enthusiastic software engineer with strong competencies in Rust and Kotlin. #resume_section("Experience") #work_experience_item_header( "Rakuten Group Inc., Membership Section", "Tokyo, Japan", "Software Engineer", "Jan. 2022 - Current", ) #resume_item[ - Migrated a legacy Spring-based global authentication and authorization service to a new Kotlin + Vert.x stack, achieving a 100x throughput increase (architect + tech lead) - Migrated a legacy Spring REST service to a new Kotlin + Vert.x stack (architect + tech lead) - Designed a zero-downtime migration strategy, with POC services (architect) - Conducted multiple training sessions aimed at developing a deep understanding of Kotlin coroutines - Regularly reviewed external and internal PRs, providing comments aimed at increasing product quality and raising engineering culture (architect) - Developed an internal authorization framework in Rust - Designed and reviewed several company-wide features aimed at enhancing user account security (architect + pdm) ] #work_experience_item_header( "Baranov Central Institute of Aviation Motor Development", "Moscow, Russia", "Software Engineer", "Aug. 2018 - Dec. 2021", ) #resume_item[ - Developed a 3D engine for visualization of CFD results in Virtual Reality (Rust + Vulkan). - Integrated a local CFD solver into multiphysics engineering simulation software (Fortran + C). Updated old code bases of CFD solver utilities to modern development standards. 
] #resume_section("Education") #education_item[ BAUMAN MOSCOW STATE TECHNICAL UNIVERSITY (BMSTU) ][ Moscow, Russia ][ Specialist Degree in "Design and Construction of Engines and Power Plants of Aircraft" ][ Sept.2014 - July.2020 ] #resume_item[ General: - Physics, Mathematics, Computer Science Occupational: - Fluid Dynamics, Aerodynamics, Automation Systems, Turbopump Units ] #education_item[ SCIENTIFIC RESEARCH INSTITUTE FOR SYSTEM ANALYSIS ][ Moscow, Russia ][ Ph.D. in Computer Science (not finished) ][ Aug.2020 - ? ] #resume_item[ Dissertation: Visualization of CFD results in Virtual Reality with volumetric rendering using ray tracing. ] #resume_section("Personal Projects") #personal_project_item_header( "Kassandra", "", "single-node Cassandra implementation", "", ) #resume_item[ Toy in-memory DB that supports the CQL query language and is fully compatible with the Cassandra protocol ] #personal_project_item_header( "Kass Proxy", "", "legacy-service external-interaction inspector", "", ) #resume_item[ Tool that inspects all external interactions that happen while a targeted legacy service processes an individual HTTP request. Its most prominent feature is recording all queries the service makes to a Cassandra DB, translating them, and replaying them against a separate Kassandra instance to record the diffs caused by that HTTP request. ] #personal_project_item_header( "dothttp", "", "text-based HTTP client", "", ) #resume_item[ Convenient CLI tool to execute HTTP requests, fully compatible with the IntelliJ HTTP client. ] #resume_section("Skills") #skill_item( "General", ( strong[Rust], strong[Kotlin], strong[Python], "Java", "JavaScript", "Nix", "git", "Linux", "Docker", ) ) #skill_item( "Special", ( strong[tokio], strong[axum], strong[hyper], strong[vert.x], strong[kotlin coroutines], "opentelemetry", "cassandra", "vector", "oauth 2.0", ) ) #skill_item( "Languages", ( strong[Russian (Native)], strong[English (C2)], ) )
https://github.com/lphoogenboom/typstThesisDCSC
https://raw.githubusercontent.com/lphoogenboom/typstThesisDCSC/master/chapters/acknowledgements.typ
typst
#import "../typFiles/specialChapter.typ": * #show: specialChapter.with(chapterTitle: "Acknowledgements",content: [ // Write your Acknowledgements Here #lorem(200) // lorem() function adds filler text ])
https://github.com/katamyra/Notes
https://raw.githubusercontent.com/katamyra/Notes/main/Compiled%20School%20Notes/CS2340/LectureNotes.typ
typst
#import "../../template.typ": * #show: template.with( title: [ Objects and Design Lecture Notes ], description: [ Notes based on in-person CS2340 Lectures in Spring 2024. ], authors: ( ( name: "<NAME>", link: "https://github.com/katamyra" ), ) ) #set text( fill: rgb("#04055c") ) #include "RequirementsEngineering.typ" #include "SoftwareArchitecture.typ"
https://github.com/GYPpro/DS-Course-Report
https://raw.githubusercontent.com/GYPpro/DS-Course-Report/main/Rep/14.typ
typst
#import "@preview/tablex:0.0.6": tablex, hlinex, vlinex, colspanx, rowspanx
#import "@preview/codelst:2.0.1": sourcecode
#set text(font: ("Times New Roman", "Source Han Serif SC"))
// Display inline code in a small box that retains the correct baseline.
#show raw: set text(font: ("consolas", "Source Han Serif SC"))
#set page(paper: "a4")
#set text(
  font: ("Times New Roman", "Source Han Serif SC"),
  style: "normal",
  weight: "regular",
  size: 13pt,
)
#let nxtIdx(name) = box[ #counter(name).step()#counter(name).display()]
#set math.equation(numbering: "(1)")
#show raw.where(block: true): block.with(
  fill: luma(240),
  inset: 10pt,
  radius: 4pt,
)
#show raw.where(block: false): box.with(
  fill: luma(230),
  inset: (x: 3pt, y: 0pt),
  outset: (y: 3pt),
  radius: 2pt,
)
#set page(
  paper: "a4",
  number-align: right,
  margin: (x: 2.54cm, y: 4cm),
  header: [
    #set text(size: 25pt, font: "KaiTi")
    #align(bottom + center, [#strong[Jinan University Undergraduate Laboratory Report (Attached Sheet)]])
    #line(start: (0pt, -5pt), end: (453pt, -5pt))
  ],
)
/*----*/
= Heap \
#text(font: "KaiTi", size: 15pt)[
  Course name#underline[#text(" Data Structures ")]Grade#underline[#text("        ")]\
  Experiment title#underline[#text(" Heap ")]Supervisor#underline[#text(" 干晓聪 ")]\
  Experiment no.#underline[#text(" 14 ")]Experiment type#underline[#text(" Design ")]Location#underline[#text(" Mathematics Department computer lab ")]\
  Student name#underline[#text(" 郭彦培 ")]Student ID#underline[#text(" 2022101149 ")]\
  School#underline[#text(" School of Information Science and Technology ")]Department#underline[#text(" Mathematics ")]Major#underline[#text(" Information Management and Information Systems ")]\
  Experiment time#underline[#text(" morning of June 13, 2024 ")]#text("~")#underline[#text(" noon of July 13, 2024 ")]\
]
#set heading( numbering: "1.1." 
)

= Purpose

Implement a basic heap data structure.

= Environment

Computer: PC x64 \
Operating system: Windows + Ubuntu 20.04 LTS \
Programming language: C++ (GCC, C++20) \
IDE: Visual Studio Code

= Principle

A heap is a complete binary tree. In a max-heap, every node $a_i$ is at least as large as each of its children; a min-heap reverses the order. To insert, place the new element at the bottom-right of the tree and float it up by repeated comparisons with its parent; one can show that the heap property still holds afterwards. To delete the root, swap it with the bottom-right element, sink the new root down toward its larger child, and finally remove the bottom-right node. Both operations can be shown to run in $OO(log_2 n)$.

#pagebreak()

= Program code

== `big_root_heap.h`

#sourcecode[```cpp
#ifndef PRIORITY_QUEUE_HPP
#define PRIORITY_QUEUE_HPP

#include <vector>

#ifdef __PRIVATE_DEBUGE
#include <iostream>
#endif

namespace myDS {

template<typename VALUE_TYPE>
class big_root_heap{
private:
    std::vector<VALUE_TYPE> h; // 1-based storage; index 0 is a sentinel
    // Float node x up toward the root until the max-heap property holds.
    void float_up(std::size_t x) {
        while(x > 1 && h[x] > h[x/2]) {
            std::swap(h[x], h[x/2]);
            x >>= 1;
        }
    }
    // Sink node x down toward its larger child until the property holds.
    void sink_down(std::size_t x) {
        while(x * 2 <= h.size() - 1) {
            std::size_t t = x * 2;
            if(t + 1 <= h.size() - 1 && h[t + 1] > h[t]) t++;
            if(h[t] <= h[x]) break;
            std::swap(h[x], h[t]);
            x = t;
        }
    }
public:
    big_root_heap() { h.push_back(VALUE_TYPE{}); } // sentinel element
    ~big_root_heap() { }
    void push(VALUE_TYPE t) {
        h.push_back(t);
        float_up(h.size() - 1);
    }
    VALUE_TYPE top() { return h[1]; }
    VALUE_TYPE pop() {
        auto t = this->top();
        std::swap(h[1], h[h.size() - 1]);
        h.pop_back();
        sink_down(1);
        return t;
    }
#ifdef __PRIVATE_DEBUGE
    void innerPrint() {
        for(auto x : h) std::cout << x << " ";
        std::cout << "\n";
    }
#endif
};

} // namespace myDS
#endif
```]

== `_PRIV_TEST.cpp`

#sourcecode[```cpp
#include <iostream>

#define __PRIVATE_DEBUGE
#include <Dev\14\big_root_heap.h>

using namespace std;

int main() {
    myDS::big_root_heap<int> piq;
    while(1) {
        string s;
        cin >> s;
        if(s == "push") {
            int t;
            cin >> t;
            piq.push(t);
        } else if(s == "pop") {
            cout << piq.pop() << "\n";
        } else if(s == "top") {
            cout << piq.top() << "\n";
        } else if(s == "p") {
            piq.innerPrint();
        }
    }
}
```]

= Test data and results

Running the correctness test against `_PRIV_TEST.cpp` above produces the following session (commands are input, the other lines are output):

```
push 5
push 4
push 1
push 3
push 2
p
0 5 4 1 3 2
pop
5
p
0 4 3 1 2
top
4
pop
4
pop
3
pop
2
pop
1
p
0
```

The output matches the expected behaviour, so the implementation can be considered correct.
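As an aside (my addition, not part of the original report): the array layout that the heap code relies on can be written out in the report's own Typst math notation. Both restoring passes follow these index relations, visiting at most one node per tree level:

```typst
// Added sketch: the implicit array layout of the heap
// (1-based; index 0 holds a sentinel).
// The node at index i has its parent at floor(i/2) and its
// children at 2i and 2i + 1, which is why the float-up and
// sink-down passes each touch at most log2(n) nodes.
$ "parent"(i) = floor(i / 2), quad "children"(i) = 2i "and" 2i + 1 $
```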
https://github.com/hchap1/Typst-Renderer-Discord-Bot
https://raw.githubusercontent.com/hchap1/Typst-Renderer-Discord-Bot/main/README.md
markdown
# Typst-Renderer-Discord-Bot

Discord bot that renders Typst, an open-source typesetting project written in Rust. Useful for sharing maths in Discord servers. The bot itself is written in Rust and thus (hopefully) fast.

## Requirements

- Windows 11
- Typst: https://github.com/typst/typst/releases
- A Discord bot with the appropriate settings:
  - Read and Create Messages
  - All three intents
  - The token stored in the `DISCORD_BOT_TOKEN` environment variable

## Commands

`!render` -> Renders a Typst code block (do not specify a language). Usage:

    !render
    ```
    Typst Code Here
    ```

`!rm` / `!rendermath` -> Renders a Typst equation from a code block. Usage:

    !rm
    ```
    Typst Equation Here
    ```

`!qm` / `!quickmath` -> Renders a Typst equation from the text that follows. Usage:

    !qm Typst Equation Here

In-text equation parsing:

    Some sentence $Typst Equation here$

`!affirmation` -> Reads a motivational quote from `affirmations.txt`

`!fact` -> Reads a fun fact from `facts.txt`

`!anatomy` -> Reads a random muscle / organ / bone from `anatomy.txt`
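An illustrative invocation (my own example, not from the README, assuming the bot accepts standard Typst math syntax after the command):

```
!qm sum_(k=1)^n k = (n(n+1))/2
```

The same expression could also be embedded inline, e.g. `Some sentence $sum_(k=1)^n k = (n(n+1))/2$`, relying on the in-text equation parsing described above.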
https://github.com/pqcfox/math-notes
https://raw.githubusercontent.com/pqcfox/math-notes/main/atiyah-macdonald/main.typ
typst
MIT License
#import "@preview/fractusist:0.1.0": hilbert-curve #let done = [#h(1fr) $square.stroked$] #set par(justify: true) #set page(margin: 1.25in) #set text(font: "<NAME>", size: 10pt) #show heading: set block(above: 2em, below: 1em) #set heading(numbering: "1.") #set quote(block: true) #show quote: set pad(x: 2em, top: -0.5em) #set page( header: context { let selector = selector(heading.where(level: 1)).before(here()) let level = counter(selector) let headings = query(selector) if headings.len() != 0 { let body = headings.last().body let num = level.display(heading.numbering) align(center, emph(num + " " + body)) } }, numbering: "1", ) #align( center, stack( dir: ttb, spacing: 1em, text(size: 17pt)[*Notes on Atiyah-MacDonald*], text(size: 14pt)[(Intro to Commutative Algebra)], ) ) #v(10pt) #align( center, stack( dir: ttb, spacing: 0.75em, text[Kat Watson], link("https://pqcfox.dev")[`https://pqcfox.dev`], ) ) #v(10pt) #align(center, stack( dir: ttb, spacing: 0.75em, text[*Abstract*], align(left, pad(x: 5em, top: 0.25em)[ #set par(justify: true) #set text(size: 9pt) These notes are from late 2024, and are intended as a review of commutative algebra in preparation to cover more of the in-depth sections of Hartshorne. In these notes, I make arguments from the text as explicit as possible, even when they may be "obvious" in order to get additional practice--it's been a while since my undergrad. Exercise solutions are included, and are selected to cover the material as well as possible. ] ), ) ) #v(10pt) #align(center, text[#sym.smash #h(0.25em) #sym.smash #h(0.25em) #sym.smash]) #v(-5pt) = Rings and ideals == Prime and maximal ideals (page 3) The claim is made that for a ring $A$ with $frak(p)$ and $frak(m)$ ideals of $A$, #quote[ $frak(p)$ is prime $<=>$ $A\/frak(p)$ is an integral domain;\ $frak(m)$ is maximal $<=>$ $A\/frak(m)$ is a field (by (1.1) and (1.2)). 
] Indeed, $A\/frak(p)$ is a domain iff for all $x, y in A$, $overline(x) overline(y) = 0$ implies either $overline(x) = 0$ or $overline(y) = 0$. Since #align(center)[ $overline(x) overline(y) = 0 <=> x y in frak(p), quad overline(x) = 0 <=> x in frak(p), quad overline(y) = 0 <=> y in frak(p)$ ] we have that this is equivalent to $frak(p)$ being prime. For the other claim, (1.2) states that $A\/frak(m)$ is a field iff the only ideals in $A\/frak(m)$ are $0$ and $(1)$. By (1.1), there is a one-to-one order-preserving correspondence between the ideals of $A\/frak(m)$ and the ideals of $A$ containing $frak(m)$ sending $0, (1) subset.eq A\/frak(m)$ to $frak(m), (1) subset.eq A$ respectively. Thus, we have #align(center)[ $ A\/frak(m) "is a field" &<=> "the only ideals of" A\/frak(m) "are" 0 "and" (1) \ &<=> "the only ideals of" A "containing" frak(m) "are" frak(m) "and" (1) \ &<=> frak(m) "is maximal". $ ] == Pullbacks of prime ideals (page 3) The claim is also made that #quote[ If $f: A -> B$ is a ring homomorphism and $frak(q)$ is a prime ideal in $B$, then $f^(-1)(frak(q))$ is a prime ideal in $A$, for $A\/f^(-1)(frak(q))$ is isomorphic to a subring of $B\/frak(q)$ and hence has no zero-divisor $eq.not 0$. ] First, to construct this isomorphism, compose $f$ with the quotient map $pi: B -> B\/frak(q)$ to get $pi compose f: A -> B\/frak(q)$ and note that $ker (pi compose f)$ is precisely $f^(-1)(frak(q))$, so that by the first isomorphism theorem, we have an injective map $overline(pi compose f): A\/f^(-1)(frak(q)) -> B\/frak(q)$. Since the image of any ring homomorphism is a subring of the codomain, this map makes $A\/f^(-1)(frak(q))$ isomorphic to a subring of $B\/frak(q)$ as desired. == Every non-zero ring has a maximal ideal (Proposition 1.3) As part of the proof, it is claimed that for a chain $(frak(a)_alpha)$ in $Sigma$, $frak(a) := union_alpha frak(a)_alpha$ is an ideal. Indeed, take $x, y in frak(a)$. 
We have that for some $alpha$ and $beta$, $x in frak(a)_alpha$ and $y in frak(a)_beta$. WLOG, assume $frak(a)_alpha subset.eq frak(a)_beta$. Then $x, y in frak(a)_beta$, meaning $x + y in frak(a)_beta subset.eq frak(a)$. Similarly, if $a in A$ is a general element of $A$ and $x in frak(a)_alpha$, then we have $a x in frak(a)_alpha subset.eq frak(a)$, so that $frak(a)$ is an ideal of $A$. == Every ideal is contained in a maximal ideal (Corollary 1.4) In the proof of Corollary 1.4, the full argument is #quote[ Apply (1.3) to $A\/frak(a)$, bearing in mind (1.1). Alternatively, modify the proof of (1.3). ] We expand on this: applying (1.3) to $A\/frak(a)$, we get a maximal ideal of $A\/frak(a)$, which by the correspondence in (1.1) corresponds to a maximal ideal of $A$ containing $frak(a)$. == Criterion for a ring to be local (Proposition 1.6(ii)) The proof elides the detail that for any $x, y in A$, if $x y$ is a unit then $x$ is a unit. This follows from definition: if $x y$ is a unit, then there exists $z in A$ such that $(x y) z = 1$, but then $x (y z) = 1$, meaning $x$ is a unit. == Irreducible polynomials generate prime ideals (Example 1, page 4) As an example of a prime ideal, page 4 gives #quote[ $A = k[x_1, ..., x_n]$, $k$ a field. Let $f in A$ be an irreducible polynomial. By unique factorization, the ideal $(f)$ is prime. ] Making this explicit, assume that we have $g, h in A$ such that $g h in (f)$, i.e. $f bar g h$. Then since $f$ is irreducible, we have by unique factorization that either $f bar g$ or $f bar h$, i.e. $g in (f)$ or $h in (f)$, meaning $(f)$ is prime. == Ideals of $ZZ$ are all principal (Example 2, page 4) As another example on page 4, there is a claim that #quote[ Every ideal in $ZZ$ is of the form $(m)$ for some $m gt.eq 0$. ] Indeed, take an arbitrary non-zero ideal $frak(a) subset.eq ZZ$, and take the smallest positive element of $frak(a)$: denote this as $m$. Clearly $(m) subset.eq frak(a)$. Now, take any $n in frak(a)$. 
By Bezout's identity, we have $gcd(m, n) in frak(a)$. Since $gcd(m, n) lt.eq m$, but $m$ is the smallest positive element of $frak(a)$, we have $gcd(m, n) = m$, i.e. $m bar n$. Thus $frak(a) subset.eq (m)$, and the two ideals are equal. As implicitly noted in the text, a similar argument can be applied to show that $k[x]$ is a principal ideal domain, based on the Euclidean algorithm for univariate polynomials. == The modular law for ideals (page 6) The following is stated as the closest approximation we have to $sect$ and $+$ distributing over each other in a general ring: #quote[ $frak(a) sect (frak(b) + frak(c)) = frak(a) sect frak(b) + frak(a) sect frak(c) "if" frak(a) supset.eq frak(b) "or" frak(a) supset.eq frak(c)$ ] To see this, WLOG let $frak(a) supset.eq frak(b)$. We show each side of the desired equality contains the other. $subset.eq$: By definition, we have $frak(a) sect (frak(b) + frak(c)) subset.eq frak(a)$ and $frak(a) sect (frak(b) + frak(c)) subset.eq frak(b) + frak(c)$. This means that any $x in frak(a) sect (frak(b) + frak(c))$ both lies in $frak(a)$ and can be written as $x = y + z$ for some $y in frak(b)$ and some $z in frak(c)$. Since $frak(b) subset.eq frak(a)$, we have $y in frak(a) sect frak(b)$. But then, $z = x - y in frak(a)$, meaning $z in frak(a) sect frak(c)$ and therefore #align(center)[ $x = y + z in frak(a) sect frak(b) + frak(a) sect frak(c)$. ] $supset.eq$: Any element $x in frak(a) sect frak(b) + frak(a) sect frak(c)$ can be written as $x = y + z$ for $y in frak(a) sect frak(b)$ and $z$ in $frak(a) sect frak(c)$. Since $y, z in frak(a)$, $x = y + z in frak(a)$, and since $y in frak(b)$ and $z in frak(c)$, we have $x = y + z in frak(b) + frak(c)$ as well, meaning $x in frak(a) sect (frak(b) + frak(c))$ as desired. 
#done == Ideals lying in the union of prime ideals (Proposition 1.11(i)) The part of the proof beginning with "...then for each $i$ there exists $x_i in frak(a)$ such that $x_i in.not frak(p)_j$ whenever $j eq.not i$..." could use clarification. We paraphrase the argument below. If $n > 1$ and the result is true for $n - 1$, assume the premise $frak(a) subset.eq.not frak(p)_i space.punct (1 lt.eq i lt.eq n)$ and note that for each $1 lt.eq j lt.eq n$ we then have $frak(a) subset.eq.not frak(p)_i$ for $1 lt.eq i lt.eq n$, $i eq.not j$. Since our desired result holds for $n - 1$, this in turn implies $frak(a) subset.eq.not union.big_(1 lt.eq i lt.eq n; space.punct i eq.not j) frak(p)_i$ for each such $j$, meaning that for each $j$ we have some $x_j in frak(a)$ such that $x_j in.not frak(p)_k$ when $j eq.not k$. Now, if by chance we also have $x_j in.not frak(p)_j$ for some $j$, then $x_j in.not union.big_(i = 1)^n frak(p)_i$ meaning $frak(a) subset.eq.not union.big_(i = 1)^n frak(p)_i$, our desired result. Otherwise, i.e. if $x_j in frak(p)_j$ for every $j$, then we consider #align(center)[ $y = sum_(i=1)^n x_1 x_2 dots.h.c hat(space.thin x_i space.thin) dots.h.c x_n$ ] where the "hat" indicates the omission of $x_i$ from the product, i.e. each summand is missing one of ${x_1, x_2, dots.h.c x_n}$. For each $j$, we note that all summands but one will contain $x_j$ and thus lie in $frak(p)_j$. The remaining summand is a product of elements not in $frak(p)_j$ and thus by primality doesn't lie in $frak(p)_j$, meaning the whole sum doesn't lie in $frak(p)_j$. Since this argument applies for all $j$, we have $y in.not union.big_(i = 1)^n frak(p)_i$, meaning that $frak(a) subset.eq.not union.big_(i=1)^n frak(p)_i.$ #done == Ideals whose intersection lies in a prime ideal (Proposition 1.11(ii)) This part of the proof contains a typo: the part which says #quote[ Suppose $frak(p) subset.eq.not frak(a)_i$ for all $i$... 
] should actually begin "Suppose $frak(p) supset.eq.not frak(a)_i$ for all $i$," assuming the desired conclusion is false to show a contradiction. The proof can be summarized as #quote[ If each ideal $frak(a)_i$ has an element not in $frak(p)$, then the product of those elements is in all $frak(a)_i$ but not in $frak(p)$ (by primality). ] The equality case boils down to $frak(p) = sect frak(a)_i => frak(p) subset.eq frak(a)_i "for every" i$, and since $frak(p) supset.eq frak(a)_i "for some" i$, we have that $frak(p)$ must equal that particular $frak(a)_i$. == Ideal quotients in $ZZ$ (Example, page 8) TODO: write out why this works. == Various statements about ideal quotients (Exercise 1.12) #set enum(numbering: "(i)") The proofs for the provided statements are as follows: + For each $x in frak(a)$, we have by the definition of an ideal that $x frak(b) subset.eq frak(a)$, i.e. $x in (frak(a) : frak(b))$. + If $x in (frak(a) : frak(b))$, then $x frak(b) subset.eq frak(a)$, meaning $(frak(a) : frak(b)) frak(b) subset.eq frak(a)$. + For $x in A$ to be in $((frak(a) : frak(b)) : frak(c))$, that means that $x frak(c) subset.eq (frak(a) : frak(b))$, i.e. for any $y in x frak(c)$, $y frak(b) subset.eq frak(a)$. This in turn is the same as saying $(x frak(c)) frak(b) subset.eq frak(a)$. Since $(x frak(c)) frak(b) = x (frak(b) frak(c))$, that in turn is equivalent to the statement $x in (frak(a) : frak(b) frak(c))$. Thus, $((frak(a) : frak(b)) : frak(c)) = (frak(a) : frak(b) frak(c))$. Reversing the roles of $frak(b)$ and $frak(c)$, the remaining equality follows. + For any $x in A$, $x in (sect_i frak(a)_i : frak(b))$ is the same as saying $x frak(b) subset.eq frak(a)_i "for all" i$. This in turn is equivalent to $x in (frak(a)_i : frak(b)) "for all" i$, i.e. $x in sect_i (frak(a)_i : frak(b))$. + If $x in (frak(a) : sum_i frak(b)_i)$, then $x sum_i frak(b)_i subset.eq frak(a)$, meaning $x frak(b)_i subset.eq frak(a) "for each" i$, i.e. $x in sect_i (frak(a) : frak(b)_i)$. 
In the reverse direction, if $x frak(b)_i subset.eq frak(a) "for each" i$, then $x sum_i frak(b)_i subset.eq frak(a)$, i.e. $x in (frak(a) : sum_i frak(b)_i)$. #done == $r(frak(a))$ is an ideal (page 8) On page 8 just after the definition of the radical of an ideal, the following claim is made: #quote[ If $phi.alt: A -> A\/frak(a)$ is the standard homomorphism, then $r(frak(a)) = phi.alt^(-1)(frak(N)_(A\/frak(a)))$ and hence $r(frak(a))$ is an ideal by (1.7). ] TODO: elaborate on this. == Various statements about radicals of ideals (Exercise 1.13) #set enum(numbering: "(i)") The proofs for the provided statements are as follows: + If $x in frak(a)$, then in particular $x^1 in frak(a)$, so $x in r(frak(a))$. + From the previous statement, $r(r(frak(a))) supset.eq r(frak(a))$. We show the reverse inclusion: if $x in r(r(frak(a)))$, then there exist $m, n > 0$ such that $(x^m)^n in frak(a)$. But then $x^(m n) in frak(a)$ with $m n > 0$, so $x in r(frak(a))$. + We show a cycle of inclusions: + $r(frak(a) frak(b)) subset.eq r(frak(a) sect frak(b))$: If $x$ in $r(frak(a) frak(b))$, then $x^n in frak(a) frak(b)$ for some $n > 0$. Since $frak(a) frak(b) subset.eq frak(a)$ and $frak(a) frak(b) subset.eq frak(b)$, this means $x^n in frak(a) sect frak(b)$, so $x in r(frak(a) sect frak(b))$. + $r(frak(a) sect frak(b)) subset.eq r(frak(a)) sect r(frak(b))$: If $x$ in $r(frak(a) sect frak(b))$, then $x^n in frak(a) sect frak(b)$ for some $n > 0$. Since $frak(a) sect frak(b) subset.eq frak(a)$ and $frak(a) sect frak(b) subset.eq frak(b)$, we have $x in r(frak(a)) sect r(frak(b)).$ + $r(frak(a)) sect r(frak(b)) subset.eq r(frak(a) frak(b))$: If $x in r(frak(a)) sect r(frak(b))$, then we have that for some $m, n > 0$ that $x^m in frak(a)$ and $x^n in frak(b)$. This means that $x^(m + n) in frak(a) frak(b)$, and since $m + n > 0$, we have $x in r(frak(a) frak(b)).$ + We handle this in two directions: + $=>$: If $r(frak(a)) = (1)$, then in particular $1 in r(frak(a))$, i.e. $1^n in frak(a)$ for some $n > 0$. 
Since $1^n = 1$ for any such $n$, we have $1 in frak(a)$, i.e. $frak(a) = (1)$. + $arrow.l.double$: If $frak(a) = (1)$, then since $r(frak(a)) supset.eq frak(a)$, we must have $r(frak(a)) = (1)$. + TODO + TODO == End of chapter exercises === Exercise 1 Take $n > 0$ such that $x^n = 0$. Then $(1 + x)(1 - x + x^2 - x^3 + dots.h.c plus.minus x^(n - 1)) = 1 plus.minus x^n = 1$. #done === Exercise 6 By Proposition 1.9, it suffices to show that for any $x in A$, if $1 - x y$ is a unit for every $y in A$, then $x$ lies in the nilradical. Assume, toward a contradiction, that some such $x$ is not in the nilradical. By the premise of the problem, $(x)$ would then contain some non-zero idempotent $e$. Take $y$ such that $e = x y$. Then $1 - x y = 1 - e$, which is idempotent: $(1 - e)^2 = 1 - 2e + e^2 = 1 - e$. If $1 - e$ were a unit, then multiplying both sides of the previous equality by its inverse would give $1 - e = 1$, i.e. $e = 0$, contradicting our choice of $e$. $arrow.r.double arrow.l.double$ #done === Exercise 7 Fix an arbitrary prime ideal $frak(p) subset.eq A$, and take $x in A - frak(p)$ with image $overline(x) in A\/frak(p)$. Since $x^n = x$ for some $n > 1$, we have $(x^(n - 1) - 1)x = 0$. Passing to the quotient, this gives $(overline(x)^(n - 1) - 1) overline(x) = 0$. Since $x in.not frak(p)$, we have $overline(x) eq.not 0$. Because $A\/frak(p)$ is an integral domain, this means $overline(x)^(n - 1) - 1 = 0$, implying that $overline(x)$ is a unit. As $x in A - frak(p)$ was arbitrary, this means that any nonzero element of $A\/frak(p)$ is a unit, i.e. $A\/frak(p)$ is a field and thus $frak(p)$ is maximal. #done === Exercise 8 Let $Sigma$ be the set of all prime ideals in $A$, ordered by reverse inclusion. $Sigma$ is non-empty, since by (1.3) $A$ has a maximal (and hence prime) ideal. Moreover, if $(frak(a)_alpha)$ is a chain of prime ideals in $Sigma$, we claim that $sect_alpha frak(a)_alpha$ is an upper (or rather, lower) bound to $(frak(a)_alpha)$. 
It suffices to show that $frak(a) := sect_alpha frak(a)_alpha$ is prime, and indeed if $x, y in.not frak(a)$, then we have that there are some $alpha, beta$ such that $x in.not frak(a)_alpha$ and $y in.not frak(a)_beta$. Since $(frak(a)_alpha)$ is a chain, we can assume WLOG that $frak(a)_beta subset.eq frak(a)_alpha$. Thus, $x, y in.not frak(a)_beta$, meaning $x y in.not frak(a)_beta$ and thus $x y in.not frak(a)$, showing that $frak(a)$ is prime. === Exercise 9 We handle both directions separately: $=>$: If $frak(a) = r(frak(a))$, then any $x in A$ such that $x^n in frak(a)$ for some $n > 0$ must lie in $frak(a)$. Thus, for any $overline(x) in A\/frak(a)$ such that $overline(x)^n = 0$, we have $overline(x) = 0$, meaning in $A\/frak(a)$ that the nilradical is the zero ideal, i.e. $sect.big_(frak(p) in cal(F)) frak(p) = 0$ for a family of prime ideals $cal(F)$ in $A\/frak(a)$. Now, take $pi: A -> A\/frak(a)$ to be the natural projection of $A$ onto $A\/frak(a)$. Because taking preimages preserves intersections and sends prime ideals to prime ideals, we have that $pi^(-1) cal(F)$ is a family of prime ideals whose intersection is $pi^(-1)(0) = frak(a)$, i.e. $frak(a)$ is an intersection of prime ideals. $arrow.l.double$: Let $frak(a)$ be the intersection of some family $cal(F)$ of prime ideals in $A$. If $x^n in frak(a)$ for some $n > 0$, then $x^n in frak(p)$ for each $frak(p) in cal(F)$. Since prime ideals are radical, this means that $x in frak(p)$ for each $frak(p) in cal(F)$, and thus $x in frak(a)$ as desired. #done === Exercise 10 We prove a cycle of implications. (i) $=>$ (ii): If $A$ has exactly one prime ideal, then it must equal $frak(N)$, being the intersection of every prime ideal. Since maximal ideals are prime, this unique prime ideal is also the unique maximal ideal, so $A$ is local with maximal ideal $frak(N)$. Every element not in $frak(N)$ must then be a unit, and every element of $frak(N)$ is nilpotent, i.e. every element is nilpotent or a unit. 
(ii) $=>$ (iii): Since every non-nilpotent element of $A$ is a unit, we have that $frak(N)$ is maximal by Proposition 1.6(i) and thus $A\/frak(N)$ is a field. (iii) $=>$ (i): If $A\/frak(N)$ is a field, then $frak(N)$ is maximal. For any prime ideal $frak(p) subset.eq A$, we have $frak(N) subset.eq frak(p)$ as the nilradical is the intersection of all prime ideals, but then by maximality of $frak(N)$ we must have $frak(N) = frak(p)$. Thus, there can only be one prime ideal in $A$. #done === Exercise 11 #set enum(numbering: "(i)") + For any $x in A$, we have $x + 1 = (x + 1)^2 = x^2 + 2x + 1 = 3x + 1$, meaning $2x = 0$. + For a given prime ideal $frak(p) subset.eq A$, take any $overline(x) in A\/frak(p)$. Since $x^2 = x$ in $A$, i.e. $x(x - 1) = 0$, we have $overline(x)(overline(x) - 1) = 0$ in $A\/frak(p)$, meaning either $overline(x) = 0$ or $overline(x) = 1$ as $A\/frak(p)$ is a domain. Thus, $A\/frak(p)$ is a field with two elements, implying $frak(p)$ is maximal. + Take an arbitrary finitely generated ideal $frak(a) = (x_1, x_2, ..., x_n) subset.eq A$. Let $y = ...$. TODO: finish this! === Exercise 12 Let $frak(m)$ be the maximal ideal of a local ring $A$. Note that any $x in.not frak(m)$ must be a unit, otherwise by Corollary 1.5 it would lie in a maximal ideal distinct from $frak(m)$. We show that the only idempotent element of $frak(m)$ is $0$. Assume some $e in frak(m)$ satisfies $e^2 = e$. Then $1 - e in.not frak(m)$, otherwise we would have $1 - e + e = 1 in frak(m)$. Thus $1 - e$ is a unit. But $(1 - e)^2 = 1 - 2e + e^2 = 1 - e$, so then multiplying both sides by $(1 - e)^(-1)$, we get $1 - e = 1$, i.e. $e = 0.$ The only idempotent $e in.not frak(m)$ is $1$, since $e in.not frak(m)$ implies $e$ is a unit, meaning $e^2 = e$ implies $e = 1$. Since either $e in frak(m)$ or $e in.not frak(m)$, our only options for an idempotent $e in A$ are $0$ and $1$. #done = Modules
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/002%20-%20Return%20to%20Ravnica/009_The%20Azorius%20Ten%20Most%20Wanted.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "The Azorius Ten Most Wanted", set_name: "Return to Ravnica", story_date: datetime(day: 31, month: 10, year: 2012), author: "<NAME>", doc ) #figure(image("009_The Azorius Ten Most Wanted/01.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak() None #strong[Most Recent Offense:] #linebreak() Destroying Azorius property with Azorius property; consumption of sentient plant matter. #figure(image("009_The Azorius Ten Most Wanted/02.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak() Formerly of Selesnya #strong[Most Recent Offense:] #linebreak() Arson; using a trained wolf as a deadly weapon; transporting contraband across rooftops. #figure(image("009_The Azorius Ten Most Wanted/03.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak() Golgari #strong[Most Recent Offense:] #linebreak() Non-ritualistic cannibalism. #figure(image("009_The Azorius Ten Most Wanted/04.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak() Unknown #strong[Most Recent Offense:] #linebreak() Theft of Orzhov coin; agitating the wildlife. #figure(image("009_The Azorius Ten Most Wanted/05.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak() Izzet #strong[Most Recent Offense:] #linebreak() Public nuisance; destruction of property; illegal launch of humanoid. #figure(image("009_The Azorius Ten Most Wanted/06.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak() None #strong[Most Recent Offense:] #linebreak() Unlawful use of a portal; using thrull as an incendiary device. 
#figure(image("009_The Azorius Ten Most Wanted/07.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak Orzhov #strong[Most Recent Offense:] #linebreak Bribery; use of detention sphere for purposes other than detention. #figure(image("009_The Azorius Ten Most Wanted/08.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak None #strong[Most Recent Offense:] #linebreak Opening Spatial Rift without a permit. #figure(image("009_The Azorius Ten Most Wanted/09.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak Golgari #strong[Most Recent Offense:] #linebreak Plagiarism of Azorius statutes for the purpose of phlebotomy and murder. #figure(image("009_The Azorius Ten Most Wanted/10.jpg", width: 100%), caption: [], supplement: none, numbering: none) #strong[Guild Alignment:] #linebreak Rakdos #strong[Most Recent Offense:] All of them.
https://github.com/undefik/jconv
https://raw.githubusercontent.com/undefik/jconv/master/jconv.typ
typst
The Unlicense
#import "@preview/sourcerer:0.2.1": code
#import "@preview/ansi-render:0.6.1": *
#import "@preview/based:0.1.0": base64
#import "@preview/cmarker:0.1.0"
#import "@preview/showybox:2.0.1": showybox

// Light gray box used for cell outputs and rendered markdown.
#let sb = it => showybox(
  frame: (
    border-color: gray.lighten(70%),
    body-color: gray.lighten(90%),
    thickness: 1pt
  ),
  it
)

// Render a parsed Jupyter notebook (e.g. the result of `json(...)`) as content.
#let jconv = (ipynb) => {
  for cell in ipynb.cells {
    if cell.cell_type == "code" {
      let src = ""
      for line in cell.source { src = src + line }
      code(
        raw(src, lang: ipynb.metadata.kernelspec.language),
        lang: ipynb.metadata.kernelspec.display_name,
      )
      for out in cell.outputs {
        // Only execute_result/display_data outputs carry a `data` field;
        // stream and error outputs would crash the lookups below.
        if "data" in out {
          if out.data.keys().contains("text/html") {
            let text = ""
            for line in out.data.at("text/plain") { text = text + line }
            [
              #sb(
                ansi-render(text, font: "FiraCode Nerd Font Mono", width: 100%, inset: 0.5em)
              )
            ]
          }
          if out.data.keys().contains("image/png") {
            [#figure(image.decode(base64.decode(out.data.at("image/png")), format: "png"))]
          }
        }
      }
    } else if cell.cell_type == "markdown" {
      let src = ""
      for line in cell.source { src = src + line }
      [
        #sb(
          cmarker.render(src)
        )
      ]
    }
  }
}
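A minimal, hypothetical entry point for this converter (the file and notebook names are assumptions, not part of the repository). Since `.ipynb` files are plain JSON, Typst's built-in `json()` loader yields exactly the dictionary shape `jconv` walks:

```typst
// Hypothetical usage: parse an exported notebook and render it.
#import "jconv.typ": jconv

#jconv(json("notebook.ipynb"))
```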
https://github.com/lucafluri/typst-templates
https://raw.githubusercontent.com/lucafluri/typst-templates/master/Project_Report_FHNW_MSE/soa.typ
typst
#set heading(outlined: false, numbering: none)

= Statement of Authenticity

I confirm that this P8 project was written autonomously by me using only the sources, aids, and assistance stated in the report, and that any material adopted from other sources and incorporated into this thesis is duly cited and referenced as such.

Brugg-Windisch,

#v(2em)
//#image("figures/signature_cropped.jpeg", width: 30%)
<NAME>
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/013%20-%20Magic%202015/010_Monster.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Monster", set_name: "Magic 2015", story_date: datetime(day: 20, month: 08, year: 2014), author: "<NAME>", doc ) #emph[<NAME> is not the man he once was. Cursed by <NAME> and the sinister power of the Chain Veil, he turned his savage instincts toward hunting the most dangerous prey of all: other Planeswalkers. Left unchecked, he would have slaughtered his way through the multitude of planes.] #emph[And he is not the only one who wants to know…] #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) He preferred hunting his prey. Moving, chasing, anticipating, watching the quarry’s fear reduce it to the instinctive rituals life follows as death chases. So many different forms of life, all with different weapons and defenses and styles and knowledge; but faced with the hunt, they all act the same—the breathlessness and instinctive turns, the widening eyes and the last surge of speed, the final stages of being prey. To be hunted was to die. To hunt was to live. He hated waiting. He had been still in the underbrush for several hours. His right leg had cramped hard off and on for twenty minutes. He had not screamed. The pain was intense, but manageable in comparison to the other recent pain his body had been through. But then, being stabbed with daggers through the throat seemed manageable compared to the pain he had been through. Although he couldn’t see it, he could feel the hedron lodged in his flesh, softly pulsing like a second heart. His chance at a second life, of a sort. The hedron was cold, and alien. Many years ago he would have been driven mad by this intrusion of magic and artifice directly into his body. There was no escaping its presence. Its pulsing. The hedron #emph[thrummed] , although to a tune and tempo that only he could hear. He would live with it, though. His body and his mind were his own again. That freedom was worth any price. 
#figure(image("010_Monster/02.jpg", width: 100%), caption: [In Garruk’s Wake | Art by Chase Stone], supplement: none, numbering: none)

He had waited for his prey for a long time. It was his third day back on Shandalar, hoping to get a glimpse. When he last left Shandalar, he thought he would never return. But there he was, just weeks later. A new hunt, a new quarry.

"Garruk."

It was a whisper of a breeze. A soft, low, female voice. One he had once searched for.

"Garruk."

The whisper came from behind him. Garruk rose slowly. There was no need for haste anymore. He had been found.

He turned, and a small pinpoint of white light dancing in the forest clearing ahead slowly weaved in figure eights. As Garruk approached, the light flitted away, deeper into the forest. Ahead of an open clearing, mists coalesced into murky pools among a thin copse of trees. The dot of light vanished into the mist. Garruk could not discern any details behind the shimmering opaque gauze.

Most of Garruk’s senses had been altered during the curse, mostly for the worse. With the hedron holding the curse in check, there was no further damage done, but neither had there been healing. Yet more insults and injuries.

A figure strode out of the mists into the clearing. Long, raven-black hair framed a beautiful face. There were light etchings on her face and arms, thin runic lines, but in the dim forest light they were invisible, although Garruk knew they were there. Her normally light violet eyes were obscured by a soft purplish glow. Most would describe the smile on her face as alluring, but Garruk knew the cold cruelty that drove it. Her dress and leggings were exactly the same as the last time they met, when Garruk tried his best to kill her.

He had failed.

He loosened his axe from the sheath on his back and drew it. The haft rested gently in his hands. The axe had been a dear friend in ending the lives of many Planeswalkers when Garruk was in the full throes of the curse.
It just hadn’t ended the life of the Planeswalker who had cursed him. Not yet, anyway.

"<NAME>." His voice was a full throaty rumble that carried across the clearing.

Her smile became an open smirk.

#figure(image("010_Monster/04.jpg", width: 100%), caption: [Liliana of the Veil | Art by <NAME>], supplement: none, numbering: none)

"Garruk. You’re looking remarkably better than the last time we met. I’ve always found killing to be an effective way of staying healthy. You must be #emph[very] healthy." Her voice was dulcet lashes of satin. Her bared shoulders slightly tipped and swayed as she spoke.

Garruk wondered who between the two of them had killed more. He grudgingly acknowledged he might not win that contest. He continued to stand at the edge of the clearing, holding his axe steady.

"No primal roar? No swinging your axe trying to split my head open? Why Garruk, I almost wouldn’t be embarrassed to have you over to a dinner party. Find a way to remove your smell and that hedron sticking out of you and you might even be presentable as my arm candy."

Garruk said nothing. He re-slung his axe in its sheath and walked slowly over to Liliana. She raised her arms, and the same purple glow suffusing her eyes illuminated her hands. The last time Garruk had seen that glow from Liliana’s hands, it had meant a great deal of pain for him. He continued his slow stride toward her.

"How close to killing me were you last time? You had your hands on my throat, choking the life from me, your stinking breath threatening to be my last sensation. Underneath your rage, I could feel your excitement, your desire to kill. It’s glorious, isn’t it, to see the life flee from those who have hurt you, those who have wronged you? How would people ever know there were consequences, unless people like you and me existed?"

Liliana continued, with the purple glow of her raised hands intensifying. But no magic erupted.
No dark tendrils to grip Garruk nor ghouls rising from the ground to slow his path. Garruk contemplated letting this farce drag out further, but he had to get back to hunting his prey. "The looks are perfect, and you have her voice right. But the smells are all wrong, Beleren." Garruk stopped a few feet from where Liliana stood. Her form shimmered and dissolved, and a much different and uglier form replaced her. The figure barely came up to the middle of Garruk’s chest, a scrawny, slight man, dressed in a blue cloak and clothing. His hood was up, obscuring his face, but Garruk knew what he would see if the hood was down. When last Garruk saw <NAME>, Garruk had his hands around the smaller man’s throat, trying to kill him. Garruk recognized a pattern, and he smiled. Liliana may have been an illusion, but she was right. He did enjoy killing. Jace looked up from under his hood. "You’ve killed a lot of people, Garruk. I have to make sure that’s not going to happen again." #figure(image("010_Monster/06.jpg", width: 100%), caption: [Jace, the Living Guildpact | Art by Chase Stone], supplement: none, numbering: none) "I couldn’t have killed them if you hadn’t sent them to find me in the first place. Stop trying to find me, and people won’t die." Garruk felt the weight of the axe on his back, and he knew how much time he needed to draw it and swing. But at that distance, he didn’t need the axe. "We can help you. The hedron has bought us time. Come back with me to Ravnica. I’ve already convened some of our best healers…" "Who is this #emph[we] ? Where were you when my body was wracked with pain, when my summons decayed in front of me, when voices sought to claim my body and mind?" His voice ended in a shout. His hands clenched into fists before relaxing and then clenching again. "Garruk, you need to come with me. We need to make sure that you’re healed, that you won’t kill again." 
Jace’s voice was calm, even, confident, as though perfectly designed to send Garruk into a rage. "And if I do want to kill again? Maybe right now?" "Then I’ll stop you. Garruk, this is not you. The hedron only keeps the curse in check, it hasn’t healed you. Come with me." Jace held out a hand. Garruk took it. "You’re not taking me anywhere." Garruk pulled Jace toward him and head-butted him. Jace exploded into glass, shards flying everywhere, and Garruk could feel blood dripping from his face where the glass had cut him. Illusions could kill. But so could he. He roared in the clearing as he unslung his axe. Figures of Jace sprung up all around him, each form a perfect copy, hands up in a defensive posture. "I don’t want to hurt you, Garruk." "Lucky for me, I don’t feel the same way." "Garruk, this is not a fair fight. You’ve suffered enough. Please. Come with me." Garruk swung his axe through the illusions. Each one shattered like glass. The air around Garruk solidified into an icy goo, his movements slowed, his breathing labored. #figure(image("010_Monster/08.jpg", width: 100%), caption: [Gossamer Phantasm | Art by Jon Foster], supplement: none, numbering: none) "These illusions are good, Beleren. But to be this good…" Garruk’s arm lashed out to the side and found what should have been an invisible form. "…you have to be really close." Garruk’s hand closed around Jace’s neck. For the first time ever, Garruk saw a look of surprise on Jace Beleren’s face. "How? Garruk…" He didn’t beg. Garruk respected that. "First, you spend too much time inside minds. Pay closer attention to the real world, Jace. Second…" Garruk saw a shimmer in front of him, and a ghost image of Jace was superimposed over the real one. The image of Jace widened ever so slightly, and Garruk’s hands automatically loosened to fit. This repeated until the real Jace had enough room to slowly ease his neck out of Garruk’s grip. Garruk tightened the grip hard. "How are you…? 
You shouldn’t be able to…" Jace’s words ended as he struggled to breathe. "Second, you rely too much on illusions. Learn to fight, little man!" Jace’s face turned purple. Garruk loosened his grip slightly. Jace took in a big gulp of air, and let out a single word: "Monster." The first time Garruk had heard that word from Jace during their first encounter, it had been like a blow. Garruk laughed. "You’re right, I am a monster. Third, and this one is important: if you come after me again, or send someone, you will die. Do you doubt me?" Jace shook his head. He still didn’t look afraid. At least the mind mage wasn’t all jelly. "I can’t…" Jace’s voice was raspy, and he struggled to regain his breath. "I can’t leave you as a homicidal maniac. I have to…" Garruk sighed. "Go ahead and read me, Beleren. I’m not that complicated." Garruk could feel the foreign touch in his mind, and despite all he had done to set up the encounter, he almost ended Jace’s life there. He would kill every single person he could if it would keep him free. #figure(image("010_Monster/10.jpg", width: 100%), caption: [Jace’s Ingenuity | Art by <NAME>], supplement: none, numbering: none) Jace’s touch receded. There was a look of disgust on his face, but also a look of surprise, and Garruk thought he saw some acceptance as well. "You…you’re clean. How is that possible?" "Because I am what I am. I’ll kill if I have to. I may even," and there Garruk opened his mouth in a wide smile, "enjoy it from time to time. But if you and the others leave me alone, then you have nothing to worry about. That’s the best deal I’m going to offer." Jace stood in thought. Garruk held him by the throat, could end his life in an eye blink, had already proven he was immune to Jace’s illusions. Garruk laughed again. If Garruk was open to having friends, then Jace might have been a good one. "You win," said Jace. "We will leave you alone. I will not seek you out. But please, if you change your mind, come find us on Ravnica. 
Something is still not right here. We can help you." Garruk let go. Jace rubbed his neck, and Garruk could see the deep purple bruises he had left. He continued to smile. "One final piece of advice, Beleren. Only the very best hunters can hunt alone. You? You need friends." Jace looked at him, and a picture of a library in Ravnica shimmered behind him. Images of Jace began stepping backwards into the library, image after image after image, each superimposing on top of one another in the library, and only the barest whisper of an image remained on Shandalar. And then Jace, and the vista into Ravnica, was gone. #figure(image("010_Monster/12.jpg", width: 100%), caption: [Island | Art by Yeong-Hao Han], supplement: none, numbering: none) Garruk breathed deeply, and momentarily slumped on his axe. He #emph[hurt] . He had to project strength to Jace, but he was still weak. The hedron continued to beat, to thrum. He hadn’t been sure his plan was going to work. It was strange for a hunt to end successfully without a kill or a trophy, but that had been life for Garruk recently. Strange. He decided to rest a little before leaving Shandalar for his next destination. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) A short time later, a man approached him from the other side of the clearing, striding confidently toward Garruk. The air in the clearing became cold, and the man’s footsteps crunched loudly on frozen forest floor. Even with diminished senses, Garruk realized he should have been able to smell the man coming, but the man had no smell. He was tall and thin, dressed in blue robes tinged with silver and black. His face was long and pale, and his white hair was rimmed with frost and icicles, making it stand out like long, white spikes. His eyes were a deep blue, with no visible iris. Garruk took up his axe with both hands. "Odd." 
He was mostly sure he wasn’t hallucinating again, he didn’t feel the active blight in his veins the way he had back on #emph[Innistrad] , but he couldn’t be too sure. "Who are you?" "I am here to take you back to #emph[Innistrad] . Come with me now, Garruk." His voice was clipped and harsh. A grating tone. "Didn’t I kill all of you already?" "Vronos paid me a great deal of money. You will come back with me now. You can come on your own, or you can come in an ice block." Many things about the situation annoyed Garruk. He wanted to rest. He was tired of people searching him out. Garruk didn’t like the cold. And the man ignored his question. Garruk needed to call the man something. He decided on Icy. "You see what happened to Vronos." Garruk pointed to the mask still on his belt. #figure(image("010_Monster/14.jpg", width: 100%), caption: [], supplement: none, numbering: none) "I saw. I saw what happened to all of them. I needed more time to prepare. When I was ready, I came to get you. Then I saw you and <NAME>." There was a note of uncertainty in the harsh voice. "Right, <NAME>. Who organized this whole hunt in the first place. A lot of people are dead because of <NAME>. Are you volunteering to be among them?" "I have been paid…" "Yes, a lot of money. So you have your money, right? And you saw that I let Jace live. Because he left me alone. Jace is not going to come get the money back. And Vronos #emph[certainly] is not going to come and get the money back. Leave me alone, and you get to enjoy it." Garruk could feel the hesitation, the calculation in Icy’s head. "Very well. But…a question. <NAME> has a great reputation as a master mind mage. Yet you seemed to know where he was hiding. How?" "It’s the food I eat. Very natural. Healthy. Makes me good at resisting mental alteration." "You are lying to me. You should not lie to me. It could become unpleasant for you." The cold intensified. Ice crackled in the air. "You think a little cold would stop me?" Garruk smiled. 
The blue orbs blinked for a moment. The air between the two of them thickened and developed an icy sheen. "Take a step toward me, beast, and I will freeze the air in your brain and shatter it. We will see how good you are at resisting alteration then." Garruk grunted in acknowledgment. "That does sound unpleasant. It was a joke. I don’t know why Jace’s illusions didn’t work. Maybe he’s just not that good at them." Garruk shrugged. Icy took a step backward, and the air froze heavily around him as globules of ice formed suspended in the space behind him. The globules reflected a different place than the one they were in—a white, icy landscape covered in drifting snow. The globules of ice grew and circled around Icy. Garruk held up a hand. "A question. How long does it take you to planeswalk?" Icy’s eyes widened, and his mouth opened as his hands raised up. Garruk gripped his axe, pivoted, and took Icy’s head off at the neck, just as the globules had begun forming into a single icy portal. Instead, the globules shattered, and immediately melted into water at Garruk’s feet. Icy’s body slumped lifeless to the ground, his head rolling to the side, its blue orbs gray and still. "Too long, it looks like." Garruk laughed again. He no longer #emph[needed] to kill. And he certainly didn’t need to kill Icy. But Icy had asked to die when he threatened Garruk. Had Icy lived, he would have realized a valuable lesson: Don’t threaten monsters. Garruk planeswalked to his next destination.
https://github.com/0x546974616e/typst-resume
https://raw.githubusercontent.com/0x546974616e/typst-resume/main/template/project.typ
typst
#import "./globals.typ": colors, spacing, column-size
#import "./heading.typ": h2
#import "./layout.typ": right-panel, left-panel
#import "./section.typ": *

#let fullname(config) = {
  if "profil" in config {
    let items = get-fullname(config.profil)
    if items.len() >= 1 {
      return items.join(" ")
    }
  }
  ""
}

#let resume(config, lang) = {
  set page(
    margin: 0pt,
  )

  set text(
    size: 11.5pt,
    fill: colors.fg3,
    font: "Quicksand",
    // font: "Lato",
    weight: 400,
    lang: lang,
  )

  let author = fullname(config)
  set document(
    ..if author.len() > 0 {(
      author: author,
      title: author,
    )}
  )

  table(
    columns: (column-size, 1fr),
    stroke: none,
    inset: 0pt,

    left-panel({
      if "picture" in config {
        make-picture(config.picture)
      } else {
        // `size` was undefined here; reserve the picture's square instead.
        v(column-size)
      }

      if "skills" in config and config.skills.len() >= 1 {
        h2(color: colors.fg4)[Skills]
        make-skills(config.skills)
      }

      if "interests" in config and config.interests.len() >= 1 {
        h2(color: colors.fg4)[Interests]
        make-interests(config.interests)
      }
    }), // left-panel

    right-panel({
      v(1.5 * spacing.large)

      if "profil" in config {
        make-header(config.profil)
        h2[Profil]
        make-profil(config.profil)
      }

      if "experiences" in config and config.experiences.len() >= 1 {
        h2[Experiences]
        make-table(
          config.experiences,
          (
            date: ( "start", "stop", "&months" ),
            title: ( "company", "position" ),
            body: ( "description", ),
            transform: (
              // months: (month) => [(#month months)],
              months: (month) => {
                "(" + str(month) + get-lang((
                  en: " months)",
                  fr: " mois)",
                ))
              }
            ),
          ),
        ) // make-table
      } // endif

      if "educations" in config and config.educations.len() >= 1 {
        h2[Education]
        make-table(
          config.educations,
          (
            date: ( "start", "stop" ),
            title: ( "schooling", "school" ),
            body: ( "description", ),
            transform: (:),
          ),
        ) // make-table
      } // endif

      if "projects" in config and config.projects.len() >= 1 {
        h2[Projects]
        make-table(
          config.projects,
          (
            date: ( "start", "stop" ),
            title: ( "title", "&repository" ),
            body: ( "description", ),
            transform: (
              repository: repository => {
                align(
                  horizon,
                  text(
                    size: 0.80em,
                    fill: colors.fg2,
                    "(" + link(repository) + ")"
                  ) // text
                ) // align
              } // function
            ),
          ),
        ) // make-table
      } // endif
    }) // right-panel
  ) // table
}
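A minimal, hypothetical invocation of this template. The top-level field names follow the `in config` checks inside `resume()`, but the exact shapes expected by `make-profil`, `make-skills`, and friends live in `section.typ`, so the values below are assumptions for illustration only:

```typst
// Hypothetical usage; field shapes are assumptions based on the checks in resume().
#import "project.typ": resume

#resume(
  (
    profil: (firstname: "Ada", lastname: "Lovelace"),
    skills: ("Typst", "Rust"),
    interests: ("Typography",),
  ),
  "en",
)
```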
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-18800.typ
typst
Apache License 2.0
#let data = ( ("TANGUT COMPONENT-001", "Lo", 0), ("TANGUT COMPONENT-002", "Lo", 0), ("TANGUT COMPONENT-003", "Lo", 0), ("TANGUT COMPONENT-004", "Lo", 0), ("TANGUT COMPONENT-005", "Lo", 0), ("TANGUT COMPONENT-006", "Lo", 0), ("TANGUT COMPONENT-007", "Lo", 0), ("TANGUT COMPONENT-008", "Lo", 0), ("TANGUT COMPONENT-009", "Lo", 0), ("TANGUT COMPONENT-010", "Lo", 0), ("TANGUT COMPONENT-011", "Lo", 0), ("TANGUT COMPONENT-012", "Lo", 0), ("TANGUT COMPONENT-013", "Lo", 0), ("TANGUT COMPONENT-014", "Lo", 0), ("TANGUT COMPONENT-015", "Lo", 0), ("TANGUT COMPONENT-016", "Lo", 0), ("TANGUT COMPONENT-017", "Lo", 0), ("TANGUT COMPONENT-018", "Lo", 0), ("TANGUT COMPONENT-019", "Lo", 0), ("TANGUT COMPONENT-020", "Lo", 0), ("TANGUT COMPONENT-021", "Lo", 0), ("TANGUT COMPONENT-022", "Lo", 0), ("TANGUT COMPONENT-023", "Lo", 0), ("TANGUT COMPONENT-024", "Lo", 0), ("TANGUT COMPONENT-025", "Lo", 0), ("TANGUT COMPONENT-026", "Lo", 0), ("TANGUT COMPONENT-027", "Lo", 0), ("TANGUT COMPONENT-028", "Lo", 0), ("TANGUT COMPONENT-029", "Lo", 0), ("TANGUT COMPONENT-030", "Lo", 0), ("TANGUT COMPONENT-031", "Lo", 0), ("TANGUT COMPONENT-032", "Lo", 0), ("TANGUT COMPONENT-033", "Lo", 0), ("TANGUT COMPONENT-034", "Lo", 0), ("TANGUT COMPONENT-035", "Lo", 0), ("TANGUT COMPONENT-036", "Lo", 0), ("TANGUT COMPONENT-037", "Lo", 0), ("TANGUT COMPONENT-038", "Lo", 0), ("TANGUT COMPONENT-039", "Lo", 0), ("TANGUT COMPONENT-040", "Lo", 0), ("TANGUT COMPONENT-041", "Lo", 0), ("TANGUT COMPONENT-042", "Lo", 0), ("TANGUT COMPONENT-043", "Lo", 0), ("TANGUT COMPONENT-044", "Lo", 0), ("TANGUT COMPONENT-045", "Lo", 0), ("TANGUT COMPONENT-046", "Lo", 0), ("TANGUT COMPONENT-047", "Lo", 0), ("TANGUT COMPONENT-048", "Lo", 0), ("TANGUT COMPONENT-049", "Lo", 0), ("TANGUT COMPONENT-050", "Lo", 0), ("TANGUT COMPONENT-051", "Lo", 0), ("TANGUT COMPONENT-052", "Lo", 0), ("TANGUT COMPONENT-053", "Lo", 0), ("TANGUT COMPONENT-054", "Lo", 0), ("TANGUT COMPONENT-055", "Lo", 0), ("TANGUT COMPONENT-056", "Lo", 0), ("TANGUT COMPONENT-057", 
"Lo", 0), ("TANGUT COMPONENT-058", "Lo", 0), ("TANGUT COMPONENT-059", "Lo", 0), ("TANGUT COMPONENT-060", "Lo", 0), ("TANGUT COMPONENT-061", "Lo", 0), ("TANGUT COMPONENT-062", "Lo", 0), ("TANGUT COMPONENT-063", "Lo", 0), ("TANGUT COMPONENT-064", "Lo", 0), ("TANGUT COMPONENT-065", "Lo", 0), ("TANGUT COMPONENT-066", "Lo", 0), ("TANGUT COMPONENT-067", "Lo", 0), ("TANGUT COMPONENT-068", "Lo", 0), ("TANGUT COMPONENT-069", "Lo", 0), ("TANGUT COMPONENT-070", "Lo", 0), ("TANGUT COMPONENT-071", "Lo", 0), ("TANGUT COMPONENT-072", "Lo", 0), ("TANGUT COMPONENT-073", "Lo", 0), ("TANGUT COMPONENT-074", "Lo", 0), ("TANGUT COMPONENT-075", "Lo", 0), ("TANGUT COMPONENT-076", "Lo", 0), ("TANGUT COMPONENT-077", "Lo", 0), ("TANGUT COMPONENT-078", "Lo", 0), ("TANGUT COMPONENT-079", "Lo", 0), ("TANGUT COMPONENT-080", "Lo", 0), ("TANGUT COMPONENT-081", "Lo", 0), ("TANGUT COMPONENT-082", "Lo", 0), ("TANGUT COMPONENT-083", "Lo", 0), ("TANGUT COMPONENT-084", "Lo", 0), ("TANGUT COMPONENT-085", "Lo", 0), ("TANGUT COMPONENT-086", "Lo", 0), ("TANGUT COMPONENT-087", "Lo", 0), ("TANGUT COMPONENT-088", "Lo", 0), ("TANGUT COMPONENT-089", "Lo", 0), ("TANGUT COMPONENT-090", "Lo", 0), ("TANGUT COMPONENT-091", "Lo", 0), ("TANGUT COMPONENT-092", "Lo", 0), ("TANGUT COMPONENT-093", "Lo", 0), ("TANGUT COMPONENT-094", "Lo", 0), ("TANGUT COMPONENT-095", "Lo", 0), ("TANGUT COMPONENT-096", "Lo", 0), ("TANGUT COMPONENT-097", "Lo", 0), ("TANGUT COMPONENT-098", "Lo", 0), ("TANGUT COMPONENT-099", "Lo", 0), ("TANGUT COMPONENT-100", "Lo", 0), ("TANGUT COMPONENT-101", "Lo", 0), ("TANGUT COMPONENT-102", "Lo", 0), ("TANGUT COMPONENT-103", "Lo", 0), ("TANGUT COMPONENT-104", "Lo", 0), ("TANGUT COMPONENT-105", "Lo", 0), ("TANGUT COMPONENT-106", "Lo", 0), ("TANGUT COMPONENT-107", "Lo", 0), ("TANGUT COMPONENT-108", "Lo", 0), ("TANGUT COMPONENT-109", "Lo", 0), ("TANGUT COMPONENT-110", "Lo", 0), ("TANGUT COMPONENT-111", "Lo", 0), ("TANGUT COMPONENT-112", "Lo", 0), ("TANGUT COMPONENT-113", "Lo", 0), ("TANGUT COMPONENT-114", 
"Lo", 0), ("TANGUT COMPONENT-115", "Lo", 0), ("TANGUT COMPONENT-116", "Lo", 0), ("TANGUT COMPONENT-117", "Lo", 0), ("TANGUT COMPONENT-118", "Lo", 0), ("TANGUT COMPONENT-119", "Lo", 0), ("TANGUT COMPONENT-120", "Lo", 0), ("TANGUT COMPONENT-121", "Lo", 0), ("TANGUT COMPONENT-122", "Lo", 0), ("TANGUT COMPONENT-123", "Lo", 0), ("TANGUT COMPONENT-124", "Lo", 0), ("TANGUT COMPONENT-125", "Lo", 0), ("TANGUT COMPONENT-126", "Lo", 0), ("TANGUT COMPONENT-127", "Lo", 0), ("TANGUT COMPONENT-128", "Lo", 0), ("TANGUT COMPONENT-129", "Lo", 0), ("TANGUT COMPONENT-130", "Lo", 0), ("TANGUT COMPONENT-131", "Lo", 0), ("TANGUT COMPONENT-132", "Lo", 0), ("TANGUT COMPONENT-133", "Lo", 0), ("TANGUT COMPONENT-134", "Lo", 0), ("TANGUT COMPONENT-135", "Lo", 0), ("TANGUT COMPONENT-136", "Lo", 0), ("TANGUT COMPONENT-137", "Lo", 0), ("TANGUT COMPONENT-138", "Lo", 0), ("TANGUT COMPONENT-139", "Lo", 0), ("TANGUT COMPONENT-140", "Lo", 0), ("TANGUT COMPONENT-141", "Lo", 0), ("TANGUT COMPONENT-142", "Lo", 0), ("TANGUT COMPONENT-143", "Lo", 0), ("TANGUT COMPONENT-144", "Lo", 0), ("TANGUT COMPONENT-145", "Lo", 0), ("TANGUT COMPONENT-146", "Lo", 0), ("TANGUT COMPONENT-147", "Lo", 0), ("TANGUT COMPONENT-148", "Lo", 0), ("TANGUT COMPONENT-149", "Lo", 0), ("TANGUT COMPONENT-150", "Lo", 0), ("TANGUT COMPONENT-151", "Lo", 0), ("TANGUT COMPONENT-152", "Lo", 0), ("TANGUT COMPONENT-153", "Lo", 0), ("TANGUT COMPONENT-154", "Lo", 0), ("TANGUT COMPONENT-155", "Lo", 0), ("TANGUT COMPONENT-156", "Lo", 0), ("TANGUT COMPONENT-157", "Lo", 0), ("TANGUT COMPONENT-158", "Lo", 0), ("TANGUT COMPONENT-159", "Lo", 0), ("TANGUT COMPONENT-160", "Lo", 0), ("TANGUT COMPONENT-161", "Lo", 0), ("TANGUT COMPONENT-162", "Lo", 0), ("TANGUT COMPONENT-163", "Lo", 0), ("TANGUT COMPONENT-164", "Lo", 0), ("TANGUT COMPONENT-165", "Lo", 0), ("TANGUT COMPONENT-166", "Lo", 0), ("TANGUT COMPONENT-167", "Lo", 0), ("TANGUT COMPONENT-168", "Lo", 0), ("TANGUT COMPONENT-169", "Lo", 0), ("TANGUT COMPONENT-170", "Lo", 0), ("TANGUT COMPONENT-171", 
"Lo", 0), ("TANGUT COMPONENT-172", "Lo", 0), ("TANGUT COMPONENT-173", "Lo", 0), ("TANGUT COMPONENT-174", "Lo", 0), ("TANGUT COMPONENT-175", "Lo", 0), ("TANGUT COMPONENT-176", "Lo", 0), ("TANGUT COMPONENT-177", "Lo", 0), ("TANGUT COMPONENT-178", "Lo", 0), ("TANGUT COMPONENT-179", "Lo", 0), ("TANGUT COMPONENT-180", "Lo", 0), ("TANGUT COMPONENT-181", "Lo", 0), ("TANGUT COMPONENT-182", "Lo", 0), ("TANGUT COMPONENT-183", "Lo", 0), ("TANGUT COMPONENT-184", "Lo", 0), ("TANGUT COMPONENT-185", "Lo", 0), ("TANGUT COMPONENT-186", "Lo", 0), ("TANGUT COMPONENT-187", "Lo", 0), ("TANGUT COMPONENT-188", "Lo", 0), ("TANGUT COMPONENT-189", "Lo", 0), ("TANGUT COMPONENT-190", "Lo", 0), ("TANGUT COMPONENT-191", "Lo", 0), ("TANGUT COMPONENT-192", "Lo", 0), ("TANGUT COMPONENT-193", "Lo", 0), ("TANGUT COMPONENT-194", "Lo", 0), ("TANGUT COMPONENT-195", "Lo", 0), ("TANGUT COMPONENT-196", "Lo", 0), ("TANGUT COMPONENT-197", "Lo", 0), ("TANGUT COMPONENT-198", "Lo", 0), ("TANGUT COMPONENT-199", "Lo", 0), ("TANGUT COMPONENT-200", "Lo", 0), ("TANGUT COMPONENT-201", "Lo", 0), ("TANGUT COMPONENT-202", "Lo", 0), ("TANGUT COMPONENT-203", "Lo", 0), ("TANGUT COMPONENT-204", "Lo", 0), ("TANGUT COMPONENT-205", "Lo", 0), ("TANGUT COMPONENT-206", "Lo", 0), ("TANGUT COMPONENT-207", "Lo", 0), ("TANGUT COMPONENT-208", "Lo", 0), ("TANGUT COMPONENT-209", "Lo", 0), ("TANGUT COMPONENT-210", "Lo", 0), ("TANGUT COMPONENT-211", "Lo", 0), ("TANGUT COMPONENT-212", "Lo", 0), ("TANGUT COMPONENT-213", "Lo", 0), ("TANGUT COMPONENT-214", "Lo", 0), ("TANGUT COMPONENT-215", "Lo", 0), ("TANGUT COMPONENT-216", "Lo", 0), ("TANGUT COMPONENT-217", "Lo", 0), ("TANGUT COMPONENT-218", "Lo", 0), ("TANGUT COMPONENT-219", "Lo", 0), ("TANGUT COMPONENT-220", "Lo", 0), ("TANGUT COMPONENT-221", "Lo", 0), ("TANGUT COMPONENT-222", "Lo", 0), ("TANGUT COMPONENT-223", "Lo", 0), ("TANGUT COMPONENT-224", "Lo", 0), ("TANGUT COMPONENT-225", "Lo", 0), ("TANGUT COMPONENT-226", "Lo", 0), ("TANGUT COMPONENT-227", "Lo", 0), ("TANGUT COMPONENT-228", 
"Lo", 0), ("TANGUT COMPONENT-229", "Lo", 0), ("TANGUT COMPONENT-230", "Lo", 0), ("TANGUT COMPONENT-231", "Lo", 0), ("TANGUT COMPONENT-232", "Lo", 0), ("TANGUT COMPONENT-233", "Lo", 0), ("TANGUT COMPONENT-234", "Lo", 0), ("TANGUT COMPONENT-235", "Lo", 0), ("TANGUT COMPONENT-236", "Lo", 0), ("TANGUT COMPONENT-237", "Lo", 0), ("TANGUT COMPONENT-238", "Lo", 0), ("TANGUT COMPONENT-239", "Lo", 0), ("TANGUT COMPONENT-240", "Lo", 0), ("TANGUT COMPONENT-241", "Lo", 0), ("TANGUT COMPONENT-242", "Lo", 0), ("TANGUT COMPONENT-243", "Lo", 0), ("TANGUT COMPONENT-244", "Lo", 0), ("TANGUT COMPONENT-245", "Lo", 0), ("TANGUT COMPONENT-246", "Lo", 0), ("TANGUT COMPONENT-247", "Lo", 0), ("TANGUT COMPONENT-248", "Lo", 0), ("TANGUT COMPONENT-249", "Lo", 0), ("TANGUT COMPONENT-250", "Lo", 0), ("TANGUT COMPONENT-251", "Lo", 0), ("TANGUT COMPONENT-252", "Lo", 0), ("TANGUT COMPONENT-253", "Lo", 0), ("TANGUT COMPONENT-254", "Lo", 0), ("TANGUT COMPONENT-255", "Lo", 0), ("TANGUT COMPONENT-256", "Lo", 0), ("TANGUT COMPONENT-257", "Lo", 0), ("TANGUT COMPONENT-258", "Lo", 0), ("TANGUT COMPONENT-259", "Lo", 0), ("TANGUT COMPONENT-260", "Lo", 0), ("TANGUT COMPONENT-261", "Lo", 0), ("TANGUT COMPONENT-262", "Lo", 0), ("TANGUT COMPONENT-263", "Lo", 0), ("TANGUT COMPONENT-264", "Lo", 0), ("TANGUT COMPONENT-265", "Lo", 0), ("TANGUT COMPONENT-266", "Lo", 0), ("TANGUT COMPONENT-267", "Lo", 0), ("TANGUT COMPONENT-268", "Lo", 0), ("TANGUT COMPONENT-269", "Lo", 0), ("TANGUT COMPONENT-270", "Lo", 0), ("TANGUT COMPONENT-271", "Lo", 0), ("TANGUT COMPONENT-272", "Lo", 0), ("TANGUT COMPONENT-273", "Lo", 0), ("TANGUT COMPONENT-274", "Lo", 0), ("TANGUT COMPONENT-275", "Lo", 0), ("TANGUT COMPONENT-276", "Lo", 0), ("TANGUT COMPONENT-277", "Lo", 0), ("TANGUT COMPONENT-278", "Lo", 0), ("TANGUT COMPONENT-279", "Lo", 0), ("TANGUT COMPONENT-280", "Lo", 0), ("TANGUT COMPONENT-281", "Lo", 0), ("TANGUT COMPONENT-282", "Lo", 0), ("TANGUT COMPONENT-283", "Lo", 0), ("TANGUT COMPONENT-284", "Lo", 0), ("TANGUT COMPONENT-285", 
"Lo", 0), ("TANGUT COMPONENT-286", "Lo", 0), ("TANGUT COMPONENT-287", "Lo", 0), ("TANGUT COMPONENT-288", "Lo", 0), ("TANGUT COMPONENT-289", "Lo", 0), ("TANGUT COMPONENT-290", "Lo", 0), ("TANGUT COMPONENT-291", "Lo", 0), ("TANGUT COMPONENT-292", "Lo", 0), ("TANGUT COMPONENT-293", "Lo", 0), ("TANGUT COMPONENT-294", "Lo", 0), ("TANGUT COMPONENT-295", "Lo", 0), ("TANGUT COMPONENT-296", "Lo", 0), ("TANGUT COMPONENT-297", "Lo", 0), ("TANGUT COMPONENT-298", "Lo", 0), ("TANGUT COMPONENT-299", "Lo", 0), ("TANGUT COMPONENT-300", "Lo", 0), ("TANGUT COMPONENT-301", "Lo", 0), ("TANGUT COMPONENT-302", "Lo", 0), ("TANGUT COMPONENT-303", "Lo", 0), ("TANGUT COMPONENT-304", "Lo", 0), ("TANGUT COMPONENT-305", "Lo", 0), ("TANGUT COMPONENT-306", "Lo", 0), ("TANGUT COMPONENT-307", "Lo", 0), ("TANGUT COMPONENT-308", "Lo", 0), ("TANGUT COMPONENT-309", "Lo", 0), ("TANGUT COMPONENT-310", "Lo", 0), ("TANGUT COMPONENT-311", "Lo", 0), ("TANGUT COMPONENT-312", "Lo", 0), ("TANGUT COMPONENT-313", "Lo", 0), ("TANGUT COMPONENT-314", "Lo", 0), ("TANGUT COMPONENT-315", "Lo", 0), ("TANGUT COMPONENT-316", "Lo", 0), ("TANGUT COMPONENT-317", "Lo", 0), ("TANGUT COMPONENT-318", "Lo", 0), ("TANGUT COMPONENT-319", "Lo", 0), ("TANGUT COMPONENT-320", "Lo", 0), ("TANGUT COMPONENT-321", "Lo", 0), ("TANGUT COMPONENT-322", "Lo", 0), ("TANGUT COMPONENT-323", "Lo", 0), ("TANGUT COMPONENT-324", "Lo", 0), ("TANGUT COMPONENT-325", "Lo", 0), ("TANGUT COMPONENT-326", "Lo", 0), ("TANGUT COMPONENT-327", "Lo", 0), ("TANGUT COMPONENT-328", "Lo", 0), ("TANGUT COMPONENT-329", "Lo", 0), ("TANGUT COMPONENT-330", "Lo", 0), ("TANGUT COMPONENT-331", "Lo", 0), ("TANGUT COMPONENT-332", "Lo", 0), ("TANGUT COMPONENT-333", "Lo", 0), ("TANGUT COMPONENT-334", "Lo", 0), ("TANGUT COMPONENT-335", "Lo", 0), ("TANGUT COMPONENT-336", "Lo", 0), ("TANGUT COMPONENT-337", "Lo", 0), ("TANGUT COMPONENT-338", "Lo", 0), ("TANGUT COMPONENT-339", "Lo", 0), ("TANGUT COMPONENT-340", "Lo", 0), ("TANGUT COMPONENT-341", "Lo", 0), ("TANGUT COMPONENT-342", 
"Lo", 0), ("TANGUT COMPONENT-343", "Lo", 0), ("TANGUT COMPONENT-344", "Lo", 0), ("TANGUT COMPONENT-345", "Lo", 0), ("TANGUT COMPONENT-346", "Lo", 0), ("TANGUT COMPONENT-347", "Lo", 0), ("TANGUT COMPONENT-348", "Lo", 0), ("TANGUT COMPONENT-349", "Lo", 0), ("TANGUT COMPONENT-350", "Lo", 0), ("TANGUT COMPONENT-351", "Lo", 0), ("TANGUT COMPONENT-352", "Lo", 0), ("TANGUT COMPONENT-353", "Lo", 0), ("TANGUT COMPONENT-354", "Lo", 0), ("TANGUT COMPONENT-355", "Lo", 0), ("TANGUT COMPONENT-356", "Lo", 0), ("TANGUT COMPONENT-357", "Lo", 0), ("TANGUT COMPONENT-358", "Lo", 0), ("TANGUT COMPONENT-359", "Lo", 0), ("TANGUT COMPONENT-360", "Lo", 0), ("TANGUT COMPONENT-361", "Lo", 0), ("TANGUT COMPONENT-362", "Lo", 0), ("TANGUT COMPONENT-363", "Lo", 0), ("TANGUT COMPONENT-364", "Lo", 0), ("TANGUT COMPONENT-365", "Lo", 0), ("TANGUT COMPONENT-366", "Lo", 0), ("TANGUT COMPONENT-367", "Lo", 0), ("TANGUT COMPONENT-368", "Lo", 0), ("TANGUT COMPONENT-369", "Lo", 0), ("TANGUT COMPONENT-370", "Lo", 0), ("TANGUT COMPONENT-371", "Lo", 0), ("TANGUT COMPONENT-372", "Lo", 0), ("TANGUT COMPONENT-373", "Lo", 0), ("TANGUT COMPONENT-374", "Lo", 0), ("TANGUT COMPONENT-375", "Lo", 0), ("TANGUT COMPONENT-376", "Lo", 0), ("TANGUT COMPONENT-377", "Lo", 0), ("TANGUT COMPONENT-378", "Lo", 0), ("TANGUT COMPONENT-379", "Lo", 0), ("TANGUT COMPONENT-380", "Lo", 0), ("TANGUT COMPONENT-381", "Lo", 0), ("TANGUT COMPONENT-382", "Lo", 0), ("TANGUT COMPONENT-383", "Lo", 0), ("TANGUT COMPONENT-384", "Lo", 0), ("TANGUT COMPONENT-385", "Lo", 0), ("TANGUT COMPONENT-386", "Lo", 0), ("TANGUT COMPONENT-387", "Lo", 0), ("TANGUT COMPONENT-388", "Lo", 0), ("TANGUT COMPONENT-389", "Lo", 0), ("TANGUT COMPONENT-390", "Lo", 0), ("TANGUT COMPONENT-391", "Lo", 0), ("TANGUT COMPONENT-392", "Lo", 0), ("TANGUT COMPONENT-393", "Lo", 0), ("TANGUT COMPONENT-394", "Lo", 0), ("TANGUT COMPONENT-395", "Lo", 0), ("TANGUT COMPONENT-396", "Lo", 0), ("TANGUT COMPONENT-397", "Lo", 0), ("TANGUT COMPONENT-398", "Lo", 0), ("TANGUT COMPONENT-399", 
"Lo", 0), ("TANGUT COMPONENT-400", "Lo", 0), ("TANGUT COMPONENT-401", "Lo", 0), ("TANGUT COMPONENT-402", "Lo", 0), ("TANGUT COMPONENT-403", "Lo", 0), ("TANGUT COMPONENT-404", "Lo", 0), ("TANGUT COMPONENT-405", "Lo", 0), ("TANGUT COMPONENT-406", "Lo", 0), ("TANGUT COMPONENT-407", "Lo", 0), ("TANGUT COMPONENT-408", "Lo", 0), ("TANGUT COMPONENT-409", "Lo", 0), ("TANGUT COMPONENT-410", "Lo", 0), ("TANGUT COMPONENT-411", "Lo", 0), ("TANGUT COMPONENT-412", "Lo", 0), ("TANGUT COMPONENT-413", "Lo", 0), ("TANGUT COMPONENT-414", "Lo", 0), ("TANGUT COMPONENT-415", "Lo", 0), ("TANGUT COMPONENT-416", "Lo", 0), ("TANGUT COMPONENT-417", "Lo", 0), ("TANGUT COMPONENT-418", "Lo", 0), ("TANGUT COMPONENT-419", "Lo", 0), ("TANGUT COMPONENT-420", "Lo", 0), ("TANGUT COMPONENT-421", "Lo", 0), ("TANGUT COMPONENT-422", "Lo", 0), ("TANGUT COMPONENT-423", "Lo", 0), ("TANGUT COMPONENT-424", "Lo", 0), ("TANGUT COMPONENT-425", "Lo", 0), ("TANGUT COMPONENT-426", "Lo", 0), ("TANGUT COMPONENT-427", "Lo", 0), ("TANGUT COMPONENT-428", "Lo", 0), ("TANGUT COMPONENT-429", "Lo", 0), ("TANGUT COMPONENT-430", "Lo", 0), ("TANGUT COMPONENT-431", "Lo", 0), ("TANGUT COMPONENT-432", "Lo", 0), ("TANGUT COMPONENT-433", "Lo", 0), ("TANGUT COMPONENT-434", "Lo", 0), ("TANGUT COMPONENT-435", "Lo", 0), ("TANGUT COMPONENT-436", "Lo", 0), ("TANGUT COMPONENT-437", "Lo", 0), ("TANGUT COMPONENT-438", "Lo", 0), ("TANGUT COMPONENT-439", "Lo", 0), ("TANGUT COMPONENT-440", "Lo", 0), ("TANGUT COMPONENT-441", "Lo", 0), ("TANGUT COMPONENT-442", "Lo", 0), ("TANGUT COMPONENT-443", "Lo", 0), ("TANGUT COMPONENT-444", "Lo", 0), ("TANGUT COMPONENT-445", "Lo", 0), ("TANGUT COMPONENT-446", "Lo", 0), ("TANGUT COMPONENT-447", "Lo", 0), ("TANGUT COMPONENT-448", "Lo", 0), ("TANGUT COMPONENT-449", "Lo", 0), ("TANGUT COMPONENT-450", "Lo", 0), ("TANGUT COMPONENT-451", "Lo", 0), ("TANGUT COMPONENT-452", "Lo", 0), ("TANGUT COMPONENT-453", "Lo", 0), ("TANGUT COMPONENT-454", "Lo", 0), ("TANGUT COMPONENT-455", "Lo", 0), ("TANGUT COMPONENT-456", 
"Lo", 0), ("TANGUT COMPONENT-457", "Lo", 0), ("TANGUT COMPONENT-458", "Lo", 0), ("TANGUT COMPONENT-459", "Lo", 0), ("TANGUT COMPONENT-460", "Lo", 0), ("TANGUT COMPONENT-461", "Lo", 0), ("TANGUT COMPONENT-462", "Lo", 0), ("TANGUT COMPONENT-463", "Lo", 0), ("TANGUT COMPONENT-464", "Lo", 0), ("TANGUT COMPONENT-465", "Lo", 0), ("TANGUT COMPONENT-466", "Lo", 0), ("TANGUT COMPONENT-467", "Lo", 0), ("TANGUT COMPONENT-468", "Lo", 0), ("TANGUT COMPONENT-469", "Lo", 0), ("TANGUT COMPONENT-470", "Lo", 0), ("TANGUT COMPONENT-471", "Lo", 0), ("TANGUT COMPONENT-472", "Lo", 0), ("TANGUT COMPONENT-473", "Lo", 0), ("TANGUT COMPONENT-474", "Lo", 0), ("TANGUT COMPONENT-475", "Lo", 0), ("TANGUT COMPONENT-476", "Lo", 0), ("TANGUT COMPONENT-477", "Lo", 0), ("TANGUT COMPONENT-478", "Lo", 0), ("TANGUT COMPONENT-479", "Lo", 0), ("TANGUT COMPONENT-480", "Lo", 0), ("TANGUT COMPONENT-481", "Lo", 0), ("TANGUT COMPONENT-482", "Lo", 0), ("TANGUT COMPONENT-483", "Lo", 0), ("TANGUT COMPONENT-484", "Lo", 0), ("TANGUT COMPONENT-485", "Lo", 0), ("TANGUT COMPONENT-486", "Lo", 0), ("TANGUT COMPONENT-487", "Lo", 0), ("TANGUT COMPONENT-488", "Lo", 0), ("TANGUT COMPONENT-489", "Lo", 0), ("TANGUT COMPONENT-490", "Lo", 0), ("TANGUT COMPONENT-491", "Lo", 0), ("TANGUT COMPONENT-492", "Lo", 0), ("TANGUT COMPONENT-493", "Lo", 0), ("TANGUT COMPONENT-494", "Lo", 0), ("TANGUT COMPONENT-495", "Lo", 0), ("TANGUT COMPONENT-496", "Lo", 0), ("TANGUT COMPONENT-497", "Lo", 0), ("TANGUT COMPONENT-498", "Lo", 0), ("TANGUT COMPONENT-499", "Lo", 0), ("TANGUT COMPONENT-500", "Lo", 0), ("TANGUT COMPONENT-501", "Lo", 0), ("TANGUT COMPONENT-502", "Lo", 0), ("TANGUT COMPONENT-503", "Lo", 0), ("TANGUT COMPONENT-504", "Lo", 0), ("TANGUT COMPONENT-505", "Lo", 0), ("TANGUT COMPONENT-506", "Lo", 0), ("TANGUT COMPONENT-507", "Lo", 0), ("TANGUT COMPONENT-508", "Lo", 0), ("TANGUT COMPONENT-509", "Lo", 0), ("TANGUT COMPONENT-510", "Lo", 0), ("TANGUT COMPONENT-511", "Lo", 0), ("TANGUT COMPONENT-512", "Lo", 0), ("TANGUT COMPONENT-513", 
"Lo", 0), ("TANGUT COMPONENT-514", "Lo", 0), ("TANGUT COMPONENT-515", "Lo", 0), ("TANGUT COMPONENT-516", "Lo", 0), ("TANGUT COMPONENT-517", "Lo", 0), ("TANGUT COMPONENT-518", "Lo", 0), ("TANGUT COMPONENT-519", "Lo", 0), ("TANGUT COMPONENT-520", "Lo", 0), ("TANGUT COMPONENT-521", "Lo", 0), ("TANGUT COMPONENT-522", "Lo", 0), ("TANGUT COMPONENT-523", "Lo", 0), ("TANGUT COMPONENT-524", "Lo", 0), ("TANGUT COMPONENT-525", "Lo", 0), ("TANGUT COMPONENT-526", "Lo", 0), ("TANGUT COMPONENT-527", "Lo", 0), ("TANGUT COMPONENT-528", "Lo", 0), ("TANGUT COMPONENT-529", "Lo", 0), ("TANGUT COMPONENT-530", "Lo", 0), ("TANGUT COMPONENT-531", "Lo", 0), ("TANGUT COMPONENT-532", "Lo", 0), ("TANGUT COMPONENT-533", "Lo", 0), ("TANGUT COMPONENT-534", "Lo", 0), ("TANGUT COMPONENT-535", "Lo", 0), ("TANGUT COMPONENT-536", "Lo", 0), ("TANGUT COMPONENT-537", "Lo", 0), ("TANGUT COMPONENT-538", "Lo", 0), ("TANGUT COMPONENT-539", "Lo", 0), ("TANGUT COMPONENT-540", "Lo", 0), ("TANGUT COMPONENT-541", "Lo", 0), ("TANGUT COMPONENT-542", "Lo", 0), ("TANGUT COMPONENT-543", "Lo", 0), ("TANGUT COMPONENT-544", "Lo", 0), ("TANGUT COMPONENT-545", "Lo", 0), ("TANGUT COMPONENT-546", "Lo", 0), ("TANGUT COMPONENT-547", "Lo", 0), ("TANGUT COMPONENT-548", "Lo", 0), ("TANGUT COMPONENT-549", "Lo", 0), ("TANGUT COMPONENT-550", "Lo", 0), ("TANGUT COMPONENT-551", "Lo", 0), ("TANGUT COMPONENT-552", "Lo", 0), ("TANGUT COMPONENT-553", "Lo", 0), ("TANGUT COMPONENT-554", "Lo", 0), ("TANGUT COMPONENT-555", "Lo", 0), ("TANGUT COMPONENT-556", "Lo", 0), ("TANGUT COMPONENT-557", "Lo", 0), ("TANGUT COMPONENT-558", "Lo", 0), ("TANGUT COMPONENT-559", "Lo", 0), ("TANGUT COMPONENT-560", "Lo", 0), ("TANGUT COMPONENT-561", "Lo", 0), ("TANGUT COMPONENT-562", "Lo", 0), ("TANGUT COMPONENT-563", "Lo", 0), ("TANGUT COMPONENT-564", "Lo", 0), ("TANGUT COMPONENT-565", "Lo", 0), ("TANGUT COMPONENT-566", "Lo", 0), ("TANGUT COMPONENT-567", "Lo", 0), ("TANGUT COMPONENT-568", "Lo", 0), ("TANGUT COMPONENT-569", "Lo", 0), ("TANGUT COMPONENT-570", 
"Lo", 0), ("TANGUT COMPONENT-571", "Lo", 0), ("TANGUT COMPONENT-572", "Lo", 0), ("TANGUT COMPONENT-573", "Lo", 0), ("TANGUT COMPONENT-574", "Lo", 0), ("TANGUT COMPONENT-575", "Lo", 0), ("TANGUT COMPONENT-576", "Lo", 0), ("TANGUT COMPONENT-577", "Lo", 0), ("TANGUT COMPONENT-578", "Lo", 0), ("TANGUT COMPONENT-579", "Lo", 0), ("TANGUT COMPONENT-580", "Lo", 0), ("TANGUT COMPONENT-581", "Lo", 0), ("TANGUT COMPONENT-582", "Lo", 0), ("TANGUT COMPONENT-583", "Lo", 0), ("TANGUT COMPONENT-584", "Lo", 0), ("TANGUT COMPONENT-585", "Lo", 0), ("TANGUT COMPONENT-586", "Lo", 0), ("TANGUT COMPONENT-587", "Lo", 0), ("TANGUT COMPONENT-588", "Lo", 0), ("TANGUT COMPONENT-589", "Lo", 0), ("TANGUT COMPONENT-590", "Lo", 0), ("TANGUT COMPONENT-591", "Lo", 0), ("TANGUT COMPONENT-592", "Lo", 0), ("TANGUT COMPONENT-593", "Lo", 0), ("TANGUT COMPONENT-594", "Lo", 0), ("TANGUT COMPONENT-595", "Lo", 0), ("TANGUT COMPONENT-596", "Lo", 0), ("TANGUT COMPONENT-597", "Lo", 0), ("TANGUT COMPONENT-598", "Lo", 0), ("TANGUT COMPONENT-599", "Lo", 0), ("TANGUT COMPONENT-600", "Lo", 0), ("TANGUT COMPONENT-601", "Lo", 0), ("TANGUT COMPONENT-602", "Lo", 0), ("TANGUT COMPONENT-603", "Lo", 0), ("TANGUT COMPONENT-604", "Lo", 0), ("TANGUT COMPONENT-605", "Lo", 0), ("TANGUT COMPONENT-606", "Lo", 0), ("TANGUT COMPONENT-607", "Lo", 0), ("TANGUT COMPONENT-608", "Lo", 0), ("TANGUT COMPONENT-609", "Lo", 0), ("TANGUT COMPONENT-610", "Lo", 0), ("TANGUT COMPONENT-611", "Lo", 0), ("TANGUT COMPONENT-612", "Lo", 0), ("TANGUT COMPONENT-613", "Lo", 0), ("TANGUT COMPONENT-614", "Lo", 0), ("TANGUT COMPONENT-615", "Lo", 0), ("TANGUT COMPONENT-616", "Lo", 0), ("TANGUT COMPONENT-617", "Lo", 0), ("TANGUT COMPONENT-618", "Lo", 0), ("TANGUT COMPONENT-619", "Lo", 0), ("TANGUT COMPONENT-620", "Lo", 0), ("TANGUT COMPONENT-621", "Lo", 0), ("TANGUT COMPONENT-622", "Lo", 0), ("TANGUT COMPONENT-623", "Lo", 0), ("TANGUT COMPONENT-624", "Lo", 0), ("TANGUT COMPONENT-625", "Lo", 0), ("TANGUT COMPONENT-626", "Lo", 0), ("TANGUT COMPONENT-627", 
"Lo", 0), ("TANGUT COMPONENT-628", "Lo", 0), ("TANGUT COMPONENT-629", "Lo", 0), ("TANGUT COMPONENT-630", "Lo", 0), ("TANGUT COMPONENT-631", "Lo", 0), ("TANGUT COMPONENT-632", "Lo", 0), ("TANGUT COMPONENT-633", "Lo", 0), ("TANGUT COMPONENT-634", "Lo", 0), ("TANGUT COMPONENT-635", "Lo", 0), ("TANGUT COMPONENT-636", "Lo", 0), ("TANGUT COMPONENT-637", "Lo", 0), ("TANGUT COMPONENT-638", "Lo", 0), ("TANGUT COMPONENT-639", "Lo", 0), ("TANGUT COMPONENT-640", "Lo", 0), ("TANGUT COMPONENT-641", "Lo", 0), ("TANGUT COMPONENT-642", "Lo", 0), ("TANGUT COMPONENT-643", "Lo", 0), ("TANGUT COMPONENT-644", "Lo", 0), ("TANGUT COMPONENT-645", "Lo", 0), ("TANGUT COMPONENT-646", "Lo", 0), ("TANGUT COMPONENT-647", "Lo", 0), ("TANGUT COMPONENT-648", "Lo", 0), ("TANGUT COMPONENT-649", "Lo", 0), ("TANGUT COMPONENT-650", "Lo", 0), ("TANGUT COMPONENT-651", "Lo", 0), ("TANGUT COMPONENT-652", "Lo", 0), ("TANGUT COMPONENT-653", "Lo", 0), ("TANGUT COMPONENT-654", "Lo", 0), ("TANGUT COMPONENT-655", "Lo", 0), ("TANGUT COMPONENT-656", "Lo", 0), ("TANGUT COMPONENT-657", "Lo", 0), ("TANGUT COMPONENT-658", "Lo", 0), ("TANGUT COMPONENT-659", "Lo", 0), ("TANGUT COMPONENT-660", "Lo", 0), ("TANGUT COMPONENT-661", "Lo", 0), ("TANGUT COMPONENT-662", "Lo", 0), ("TANGUT COMPONENT-663", "Lo", 0), ("TANGUT COMPONENT-664", "Lo", 0), ("TANGUT COMPONENT-665", "Lo", 0), ("TANGUT COMPONENT-666", "Lo", 0), ("TANGUT COMPONENT-667", "Lo", 0), ("TANGUT COMPONENT-668", "Lo", 0), ("TANGUT COMPONENT-669", "Lo", 0), ("TANGUT COMPONENT-670", "Lo", 0), ("TANGUT COMPONENT-671", "Lo", 0), ("TANGUT COMPONENT-672", "Lo", 0), ("TANGUT COMPONENT-673", "Lo", 0), ("TANGUT COMPONENT-674", "Lo", 0), ("TANGUT COMPONENT-675", "Lo", 0), ("TANGUT COMPONENT-676", "Lo", 0), ("TANGUT COMPONENT-677", "Lo", 0), ("TANGUT COMPONENT-678", "Lo", 0), ("TANGUT COMPONENT-679", "Lo", 0), ("TANGUT COMPONENT-680", "Lo", 0), ("TANGUT COMPONENT-681", "Lo", 0), ("TANGUT COMPONENT-682", "Lo", 0), ("TANGUT COMPONENT-683", "Lo", 0), ("TANGUT COMPONENT-684", 
"Lo", 0), ("TANGUT COMPONENT-685", "Lo", 0), ("TANGUT COMPONENT-686", "Lo", 0), ("TANGUT COMPONENT-687", "Lo", 0), ("TANGUT COMPONENT-688", "Lo", 0), ("TANGUT COMPONENT-689", "Lo", 0), ("TANGUT COMPONENT-690", "Lo", 0), ("TANGUT COMPONENT-691", "Lo", 0), ("TANGUT COMPONENT-692", "Lo", 0), ("TANGUT COMPONENT-693", "Lo", 0), ("TANGUT COMPONENT-694", "Lo", 0), ("TANGUT COMPONENT-695", "Lo", 0), ("TANGUT COMPONENT-696", "Lo", 0), ("TANGUT COMPONENT-697", "Lo", 0), ("TANGUT COMPONENT-698", "Lo", 0), ("TANGUT COMPONENT-699", "Lo", 0), ("TANGUT COMPONENT-700", "Lo", 0), ("TANGUT COMPONENT-701", "Lo", 0), ("TANGUT COMPONENT-702", "Lo", 0), ("TANGUT COMPONENT-703", "Lo", 0), ("TANGUT COMPONENT-704", "Lo", 0), ("TANGUT COMPONENT-705", "Lo", 0), ("TANGUT COMPONENT-706", "Lo", 0), ("TANGUT COMPONENT-707", "Lo", 0), ("TANGUT COMPONENT-708", "Lo", 0), ("TANGUT COMPONENT-709", "Lo", 0), ("TANGUT COMPONENT-710", "Lo", 0), ("TANGUT COMPONENT-711", "Lo", 0), ("TANGUT COMPONENT-712", "Lo", 0), ("TANGUT COMPONENT-713", "Lo", 0), ("TANGUT COMPONENT-714", "Lo", 0), ("TANGUT COMPONENT-715", "Lo", 0), ("TANGUT COMPONENT-716", "Lo", 0), ("TANGUT COMPONENT-717", "Lo", 0), ("TANGUT COMPONENT-718", "Lo", 0), ("TANGUT COMPONENT-719", "Lo", 0), ("TANGUT COMPONENT-720", "Lo", 0), ("TANGUT COMPONENT-721", "Lo", 0), ("TANGUT COMPONENT-722", "Lo", 0), ("TANGUT COMPONENT-723", "Lo", 0), ("TANGUT COMPONENT-724", "Lo", 0), ("TANGUT COMPONENT-725", "Lo", 0), ("TANGUT COMPONENT-726", "Lo", 0), ("TANGUT COMPONENT-727", "Lo", 0), ("TANGUT COMPONENT-728", "Lo", 0), ("TANGUT COMPONENT-729", "Lo", 0), ("TANGUT COMPONENT-730", "Lo", 0), ("TANGUT COMPONENT-731", "Lo", 0), ("TANGUT COMPONENT-732", "Lo", 0), ("TANGUT COMPONENT-733", "Lo", 0), ("TANGUT COMPONENT-734", "Lo", 0), ("TANGUT COMPONENT-735", "Lo", 0), ("TANGUT COMPONENT-736", "Lo", 0), ("TANGUT COMPONENT-737", "Lo", 0), ("TANGUT COMPONENT-738", "Lo", 0), ("TANGUT COMPONENT-739", "Lo", 0), ("TANGUT COMPONENT-740", "Lo", 0), ("TANGUT COMPONENT-741", 
"Lo", 0), ("TANGUT COMPONENT-742", "Lo", 0), ("TANGUT COMPONENT-743", "Lo", 0), ("TANGUT COMPONENT-744", "Lo", 0), ("TANGUT COMPONENT-745", "Lo", 0), ("TANGUT COMPONENT-746", "Lo", 0), ("TANGUT COMPONENT-747", "Lo", 0), ("TANGUT COMPONENT-748", "Lo", 0), ("TANGUT COMPONENT-749", "Lo", 0), ("TANGUT COMPONENT-750", "Lo", 0), ("TANGUT COMPONENT-751", "Lo", 0), ("TANGUT COMPONENT-752", "Lo", 0), ("TANGUT COMPONENT-753", "Lo", 0), ("TANGUT COMPONENT-754", "Lo", 0), ("TANGUT COMPONENT-755", "Lo", 0), ("TANGUT COMPONENT-756", "Lo", 0), ("TANGUT COMPONENT-757", "Lo", 0), ("TANGUT COMPONENT-758", "Lo", 0), ("TANGUT COMPONENT-759", "Lo", 0), ("TANGUT COMPONENT-760", "Lo", 0), ("TANGUT COMPONENT-761", "Lo", 0), ("TANGUT COMPONENT-762", "Lo", 0), ("TANGUT COMPONENT-763", "Lo", 0), ("TANGUT COMPONENT-764", "Lo", 0), ("TANGUT COMPONENT-765", "Lo", 0), ("TANGUT COMPONENT-766", "Lo", 0), ("TANGUT COMPONENT-767", "Lo", 0), ("TANGUT COMPONENT-768", "Lo", 0), )
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/001%20-%20Magic%202013/001_Odric%2C%20Master%20Tactician.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Odric, Master Tactician", set_name: "Magic 2013", story_date: datetime(day: 27, month: 06, year: 2012), author: "<NAME>", doc ) #figure(image("001_Odric, Master Tactician/01.jpg", height: 40%), caption: [], supplement: none, numbering: none) A town crier hollered the evening news on the cobblestone street below the open window. "Execution at Bloodless Wall! Tomorrow at sunrise! The Healers of Heron are at Child's Wall tomorrow..." When Odric last looked out the window, it had been early afternoon. Now a cold mist had settled on Thraben, and the city was cloaked in evening shadows. #emph[Where is the moon?] Odric's arm jerked involuntarily, nearly upending his ink jar. #emph[No] , he reminded himself. #emph[It no longer matters] . The phases of the moon were no longer predictors of life and death now that Avacyn had returned and cleansed the world. #emph[Or at least begun to...] He glanced across the oak table at Grete, his lieutenant, who looked surprised by his sudden movement. Sir Odric, Master Tactician, Commander of the Gavony Riders and Recipient of the Moonsilver Accommodation didn't startle easily. "Sir?" Grete asked. "Darkness fell," he told her. She glanced out the window, and he saw similar emotions play across her features. #emph[We've spent too many years as prey.] Too many years spent cowering in the shadows. #figure(image("001_Odric, Master Tactician/02.jpg", width: 100%), caption: [Art by Karl Kopinski], supplement: none, numbering: none) "There's still no sign of Ludevic," Grete continued, scanning the parchment. "A miller spotted him near Estwald, but he'd moved on before the cathars could detain him. So the hunt continues." Just the thought of the mad alchemist made his head ache. Odric leaned back and pressed his palms against his temples. This was a temporary assignment—one he had requested in hopes of getting an angel's-eye view of Innistrad. 
Every regiment sent in daily dispatches of what they encountered in the field. From these reports, Odric was piecing together where the Church's power was still being threatened. But he didn't enjoy sitting in a leather chair in a cathedral chamber. He was a battlefield man, much better suited to combat maneuvers than negotiating the politics of the Avacynian Church. "What of your friends around Hanweir?" Odric asked and was rewarded with a slight smile from the deadly serious Grete. She'd led an assault against a legion of ghouls ravaging the moorlands, a success that had earned her the promotion to second in command. #figure(image("001_Odric, Master Tactician/03.jpg", width: 100%), caption: [Blazing Torch | Art by <NAME>], supplement: none, numbering: none) "We're hunting down the last stragglers. Gisa is being transported from the Rider's Lock up to Thraben next week." "Triple the escort," Odric said. "She's caused enough trouble in my lifetime." Grete nodded and scanned the last dispatch. Only a few more days, and Odric's administrative duties in Thraben would be done. His time here had been valuable. He knew that demons were still on the loose, but Avacyn herself was focused on the Helvault escapees. Necromantic activity still plagued the moorlands, but nothing like the heyday of Gisa and Geralf's tyranny. Sigarda's forces were hunting the perpetrators of the Nephalia Massacre. The vampires had all but slunk back to Stensia. #emph[Someday soon, I will purge that province myself, but first I must be sure that Avacyn's blessings hold.] "The mayor's son in Torbach tumbled down a river bank and broke his leg." Odric sighed. "Does the mayor of Torbach truly request the Church's assistance to fix a boy's leg?" "It says he fell fleeing a... werewolf. He later died of a fever and gangrene." As Odric pushed himself to his feet, it felt as if a steel trap had closed around his stomach. 
Every morning since the Cursemute had rid the land of the lycanthrope curse, he'd fallen to his knees praising Avacyn's blessing. But in his heart, he doubted. What if the wolfir reverted to a murderous state? What if the abominations that had slaughtered so many of his kin returned? "Rouse our regiment," he told Grete. "It seems our days in Thraben have come to an abrupt end." #figure(image("001_Odric, Master Tactician/04.jpg", width: 100%), caption: [Odric, Master Tactician | Art by Michael Komarck], supplement: none, numbering: none) The Mayor of Torbach was a pompous, red-faced administrator who took power after Avacyn returned. #emph[A sheep in fancy clothes, ] Odric thought.#emph[ Not a leader during the darkest hours.] The mayor had been ranting at them since they arrived in his chamber. Grete shifted uncomfortably beside Odric, no doubt wondering why he was letting this tirade go on so long. "I demand to know! What is this new devilry? Werewolves walk even during the half moon? Did you not promise that this curse was lifted from us? These wolfir may slaughter us even during the daylight?" "Sir, there is no reason to believe that a wolfir—" Odric said. "It slaughtered the widow of <NAME>!" the mayor interrupted. "Destroyed her cottage just last night! And took its time under her roof. Maybe it slumbered a bit? Going to cook a hock of meat in her fireplace?" "The creature was in her cottage?" Odric asked. "This vile creature is terrorizing my village. Where are the angels? The cathars waste time building bridges and trimming apple trees and..." "One more question," Odric interrupted. "Has it attacked other cottages? Or just the widow's?" "Cottages, no. But my son! He was just a boy...." Odric laid his hand on the mayor's shoulder. At his touch, the man abruptly stopped speaking, and tears brimmed in his brown eyes. 
"We will find the monster that killed your son and put its head on a pike," Odric assured the mayor, who had lost his bluster and seemed to have no more words for them. Odric and Grete found their own way back to the street where their horses waited. "He acted as if it was all our fault," Grete said angrily. "He's a grieving man," Odric replied. #emph[A man who has lost a son to a werewolf, ] he thought.#emph[ Just as I have.] #figure(image("001_Odric, Master Tactician/05.jpg", width: 100%), caption: [Bane of Hanweir | Art by <NAME>s], supplement: none, numbering: none) As they rode toward the edge of the village, the red sun dropped low to the horizon. Above, a sliver of pale moon appeared in the indigo heavens. The phases of the moon once had been Odric's guiding hand. The shape of the moon would figure into his battle tactics as much as supply lines and the morale of his cathars. Odric spent years watching the night sky, noting how the moon touched the world in unexpected ways. Some seemed trivial. The leaves of the maple curled downward during a full moon. Others were crucial to survival. Ghouls moved quicker during a new moon. A waxing moon brought unnatural fighting among the rank and file. With Avacyn's Cursemute, Odric secretly felt he had lost one of his tactical advantages. The moon was playing new games, and Odric had yet to learn the rules. "What are your thoughts?" Grete asked over the thud of the horses' hooves. "I knew of the widow who was killed. They called her the Bitterheart Witch. Remember how he said it stayed in her cottage? Something about her attracted the monster. We'll set a trap near there." As they quickened their horses and made toward camp, Odric's eyes fixed on the pattern of mist around the meek sliver in the sky. Whatever evil was now manifesting itself, it would stop in the shadow of the widow's cottage. He would mount its head on the gates of Thraben. 
#figure(image("001_Odric, Master Tactician/06.jpg", width: 100%), caption: [Riders of Gavony | Art by Volkan Baga], supplement: none, numbering: none) By midnight, there was no moon at all. He and Grete lay in the undergrowth at the edge of a clearing. The only light came from a witchbane's orb, a magical ward against curses that hovered at the edge of the trees. The spell was the widow's handiwork, from a time before she'd been outcast by the villagers as a witch. Odric had poured the blood of the widow on the ground below the orb. Blood he'd taken from her lifeless corpse in the catacombs of the local church. The strange thing was that when he'd viewed her corpse, he'd seen no signs of violence on her. There was no evidence she'd been killed by a werewolf attack, which Odric had assumed after talking with the mayor. She looked peaceful enough to have died of old age. #figure(image("001_Odric, Master Tactician/07.jpg", width: 100%), caption: [Witchbane Orb | Art by <NAME>], supplement: none, numbering: none) A hooting call broke the night's silence. He recognized it as a cathar's signal that something had passed through the perimeter of scouts he'd placed around the grove. He glanced at Grete, and she silently stood up and disappeared into the shadows. Odric moved into a crouching position, waiting for the second signal, which would confirm if it was natural or unnatural... The signal came again, urgently. #emph[Unnatural, then] . Odric saw it before he heard it. A shadow—much taller than an average man—stretched across the clearing. He'd fought countless werewolves and none had moved with such quiet deliberation. Odric glanced up at the inky sky, suddenly doubting his strategy. But the monstrosity had entered the clearing and was loping toward the scent of the widow's blood. Whatever was approaching, there was no time to question the plan. #emph[Fear holds no place in faith's battle plan.] 
Odric shouted to the cathars in the treetops, who cast down the heavy net, sending the creature's massive bulk onto the forest floor. Odric sprinted toward it as it struggled under the ropes. He unsheathed his sword as he ran, ready to slice through ropes and neck in one blow. "Wait!" screamed Grete, attempting to intercept her commander. "Wait! It has an axe!" #figure(image("001_Odric, Master Tactician/08.jpg", width: 100%), caption: [Crusader of Odric | Art by <NAME>], supplement: none, numbering: none) Odric froze, seeing the massive weapon on the ground behind the monster. Then he saw the arm—a human arm—a hand, and human eyes, peering out of a haggard face crisscrossed with sickly black veins. "In Avacyn's name," Odric thundered. "What are you?" "I'm weakened, cursed, and no threat to you," it said. "I'm <NAME>..." #figure(image("001_Odric, Master Tactician/09.jpg", width: 100%), caption: [Garruk, the Veil-Cursed | Art by <NAME>], supplement: none, numbering: none) The guttural voice infuriated Odric. Every corpse that had been mutilated by a werewolf flashed through his memory. He would never forget the raw carnage of the attacks and the senseless rage that left human bodies in bloody tatters. The only way Odric could comprehend those murders was if they were done by mindless beasts. Mindless beasts don't have language or a voice to speak it. And never a name. #emph[Garruk Wildspeaker] . Even when Odric killed werewolves in human form, he never spoke their names. In his mind, the curse stripped them of any human identity they once possessed. Odric bashed the blunt end of his sword against the monster's temple, hearing the crack of skull as it shattered under the weight of his blow. It slumped to the ground. He yanked the net off the monster and grabbed a handful of its long matted hair. He yanked it back to expose the bare throat where lifeblood still pulsed through its unnatural veins. "Wait!" Grete was at his shoulder. Odric lifted his sword. 
#emph[One blow to sunder the head from the body] . "It's not a werewolf! Sir, Avacyn's blessings didn't fail us." He wanted its head. #emph[I'll throw it at the feet of Avacyn and scream the name of every person murdered in her absence.] "Let us bring it to Thraben—alive. Leave the days of slaughter behind us. It's a new day in Innistrad." He wanted to scream at her, too. She'd fought the same wars and lived in the same grisly world as he. But unlike him, her conscience was untarnished. #emph[She's still hopeful.] Grete's compassion would kill her someday. #emph[Someday soon. ] Odric let go of the monster and sheathed his sword. "Drug him and tie him down. It's a long road back to Thraben. Let Avacyn measure the worth of his life."
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-10300.typ
typst
Apache License 2.0
#let data = ( ("OLD ITALIC LETTER A", "Lo", 0), ("OLD ITALIC LETTER BE", "Lo", 0), ("OLD ITALIC LETTER KE", "Lo", 0), ("OLD ITALIC LETTER DE", "Lo", 0), ("OLD ITALIC LETTER E", "Lo", 0), ("OLD ITALIC LETTER VE", "Lo", 0), ("OLD ITALIC LETTER ZE", "Lo", 0), ("OLD ITALIC LETTER HE", "Lo", 0), ("OLD ITALIC LETTER THE", "Lo", 0), ("OLD ITALIC LETTER I", "Lo", 0), ("OLD ITALIC LETTER KA", "Lo", 0), ("OLD ITALIC LETTER EL", "Lo", 0), ("OLD ITALIC LETTER EM", "Lo", 0), ("OLD ITALIC LETTER EN", "Lo", 0), ("OLD ITALIC LETTER ESH", "Lo", 0), ("OLD ITALIC LETTER O", "Lo", 0), ("OLD ITALIC LETTER PE", "Lo", 0), ("OLD ITALIC LETTER SHE", "Lo", 0), ("OLD ITALIC LETTER KU", "Lo", 0), ("OLD ITALIC LETTER ER", "Lo", 0), ("OLD ITALIC LETTER ES", "Lo", 0), ("OLD ITALIC LETTER TE", "Lo", 0), ("OLD ITALIC LETTER U", "Lo", 0), ("OLD ITALIC LETTER EKS", "Lo", 0), ("OLD ITALIC LETTER PHE", "Lo", 0), ("OLD ITALIC LETTER KHE", "Lo", 0), ("OLD ITALIC LETTER EF", "Lo", 0), ("OLD ITALIC LETTER ERS", "Lo", 0), ("OLD ITALIC LETTER CHE", "Lo", 0), ("OLD ITALIC LETTER II", "Lo", 0), ("OLD ITALIC LETTER UU", "Lo", 0), ("OLD ITALIC LETTER ESS", "Lo", 0), ("OLD ITALIC NUMERAL ONE", "No", 0), ("OLD ITALIC NUMERAL FIVE", "No", 0), ("OLD ITALIC NUMERAL TEN", "No", 0), ("OLD ITALIC NUMERAL FIFTY", "No", 0), (), (), (), (), (), (), (), (), (), ("OLD ITALIC LETTER YE", "Lo", 0), ("OLD ITALIC LETTER NORTHERN TSE", "Lo", 0), ("OLD ITALIC LETTER SOUTHERN TSE", "Lo", 0), )
https://github.com/Myriad-Dreamin/tinymist
https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/goto_definition/base.typ
typst
Apache License 2.0
#let f() = 1; #(/* position after */ f());
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/048%20-%20Dominaria%20United/001_Episode%201%3A%20Echoes%20In%20The%20Dark.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Episode 1: Echoes In The Dark", set_name: "Dominaria United", story_date: datetime(day: 10, month: 08, year: 2022), author: "<NAME>", doc ) #figure(image("001_Episode 1: Echoes In The Dark/01.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none) Even from three caverns away, the shriek of breaking metal echoed against the stone. Yet another broken excavator. If Karn had been an organic being, he would have sighed. Instead, he only paused and listened to the excavator's lingering rattles. He pitied his machines: no single setting could accommodate the eccentric geology of the Caves of Koilos, where olivine rock was as likely to back into sandstone as cinnabar, but he had no alternative. Here, he'd find the secret to operating the sylex. And he'd find it before any Phyrexian agent could. Condensation beaded on his body, the droplets joining to slide down his metal plating. No one seemed to believe him, but he knew the truth. The Phyrexians were here, on Dominaria. He could feel it, like he could feel the stone and the interplanar technology riddling its layers. He turned sideways to squeeze through a narrow passage. The basalt grated against his chest but gave way without scratching him. He ducked under translucent stalagmites into a low cave. Clear selenite coated the bones of Thran prisoners and fragmented Thran technologies, their gold traceries distorted. Karn located the failed excavator to the rear of the cave. The poor excavator steamed, as if vexed with its unmanageable job, and its overheated metal housing ticked with a soft #emph[tink-tink-tink ] as it cooled. Karn wove between the stalactites and water pools, careful not to break any of the delicate violet mineral deposits or disturb the freshwater anemones and tiny blind fish, bleached by a life that had, until now, passed in darkness. Karn placed his hand on the excavator. "Shall I fix you, then? Yes?" 

Steam sighed from the overheated machine.

At his gesture, screws unwound themselves along their threading. He set them aside and removed the housing. A stripped-out gear greeted him. He removed it and set to generating a replacement. His fingers tingled with magic, its charge pulling together to generate something from nothing. Metal materialized, layer upon layer, to create a duplicate part.

He liked working in the caves' silence. In the sun's absence, only the water's metronome drip measured out his days. He was alone here; other Planeswalkers didn't like the interplanar distortion that rubbed at their senses in the Caves of Koilos. Karn didn't either, but he appreciated the isolation it gave him. He didn't have to answer questions. Or worry about whether the Phyrexians had gotten to someone. Compleated them. He could seek the key to operating the sylex in solitude. He would win the fight alone.

"What fight?" Jhoira had placed her hands on her hips in exasperation. "Karn, the Phyrexians were defeated centuries ago, and these new ones you told me about are trapped on their plane."

"They're here," Karn had told her. "Defeating the Phyrexians in combat means nothing. They aren't an army. They're hate embodied. They promise Dominaria's destruction."

Her voice had softened. "Just because Venser~"

Karn didn't want to think about Venser. He slipped the gear onto its shaft and tightened it down. He replaced the housing, sliding it into place, and then threaded in each screw. Small pleasures. He patted the excavator and smiled. "That's better, isn't it?" He knew it wasn't alive, wasn't responding to him, but it almost felt like it was as he flipped a lever and watched the excavator move forward and begin digging into the cave wall.

The stone shuddered. Fine white dust plumed away from the excavator's planed shovel-limbs. If organic beings had been present, Karn would have had to worry about using water to damp down the dust. Their lungs were so fragile.

Better that he was alone, wasn't it? No one held him back, eating and sleeping hours away. No one delayed his progress with chitchat.

The pulverized rock turned violet, then the excavator's rumble changed to a whine as it hit open air. The excavator stepped back, and Karn peered into the cavern it had opened. The rock had been thin as an eggshell but extremely hard. On the other side, the cavern's interior was coated in opal. The glow from his eyes caught on the iridescent flecks, suffusing the cavern with an amber glimmer.

The dust-coated workshop looked like it came from Urza's mortal lifetime, or even prior, when magic's theories and practices were less well known, and technology propelled Dominaria's progress forward. Intricate glass tubing, beakers of varying sizes, defunct burners, powdery remnants of ancient chemicals, wire cutters and rollers for clay, buckets coated with desiccated glazes, gears, and cogs—even a small, vented forge, tongs placed casually to one side as if its smith, interrupted, had walked away from a task undone. In one corner, shackles: a reminder that the Caves of Koilos once housed the ugliness of the Thran before it transformed into Phyrexia.

This workshop had been some artificer's sanctuary—and some prisoner's nightmare. Karn recognized a setup meant to exploit sentient beings for experimentation when he saw one. He'd seen too many such scenes when he'd been newly formed.

"How did all this survive so intact?" If only he could share this sight with someone—

He really had to stop talking to himself.

Karn stepped into the cavern as lightly as his heavy body permitted. What if a stray vibration caused these delicate objects to shatter, destroying data? The books, arranged on a single long shelf with jewel-covered spines, tempted him with their knowledge, but he dared not take one down. The paper would probably crumble into dust should he touch it.

He peeked into the beakers, tinted with dried-out fluids, and then examined the forge's ashes. Nothing. He examined the pottery worktop and saw it: a diagram of the sylex painted on parchment, a coppery bowl with handles and small black figures marching around its base. A gray slab of clay sat next to the diagram, etched with symbols duplicating those depicted in the diagram's faded paint. Some were in Thran; some in the arcing curves of an unrecognizable script that resembled some symbols on the sylex. The clay was damaged, partially illegible, and severed wires lay beside it. What had happened here?

"I must compare this to the sylex."

At the faint vibration of his words, the books collapsed into dust. Karn winced. He gathered the unfired clay tablet into his hands—careful, careful—and eased out from the ancient workshop.

Karn had sited his basecamp some distance from the excavators, where the caves had greater stability. Each softly lit tent sheltered his equipment from the steady drip of water. Karn let their brightness guide his steps, the hollow cavern booming with his footfalls. With the tents lit from within, coming back to camp almost felt like coming home.

Karn ducked into the largest tent, stepping around the large, golden Thran artifact he'd left in front of the entrance. Inside, he shimmied past a broken piece of metal he had collected days ago, intending to reshape it back to usefulness. He stepped over a pile of powerstone shards and sat at his work desk; it, like the rest of his tent, was too cluttered—he had no space for his newest find. Atop the papers and small artifacts, he saw Jhoira's letters, scattered, opened but unanswered.

#emph[Karn, it's been months], one letter began. #emph[Don't you think you should examine why you're doing this?] Another letter ended. #emph[Mirrodin wasn't your fault], she wrote in another. #emph[Please come back. Venser would have . . .]

#figure(image("001_Episode 1: Echoes In The Dark/02.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)

Karn shifted the artifact onto one palm and used the other hand to shove Jhoira's letters to one side. He slid the artifact onto the worktop then ducked under the table. He'd concealed the sylex in a small titanium chest, its lock only accessible to someone like him, someone who both knew the order in which the tumblers and pins needed to be lifted and could manipulate inorganic materials. His lock had no key. He placed his hand on the chest, focused, and felt the tumblers move. The lid popped open.

He removed the sylex. Even his specialized senses could not identify its copper-like material. Normally he could reveal any inorganic object's mystery with a touch; not so the sylex. It made his palms itch with its strangeness. A Thran artifact, most said—but he had his doubts. It was Karn's belief that this device came from farther fields than simply the past.

He lifted its wide goblet body onto his desk. Its inky characters seemed to move under his worktop's light, transforming from Thran to Fallaji to Sumifan. The vessel's wide, bowl-like mouth seemed to call to be filled—with, according to the Sumifan, the memories of the land. He'd been reluctant to test it without confirmation on how to use it.

The sylex sent a jolt through him. Karn flinched and withdrew his hand, cradling it against his body.

Once, when he'd been new, he'd reached out and touched the fire blazing in Urza's hearth. He'd dropped the cherry-red coal, shocked by the sensation, then examined his hand for damage. He'd found none. He'd looked up to see Urza watching him with glittering eyes. Urza hadn't tried to stop him, yet he'd known this would hurt Karn.

#emph[Why did you give me intelligence if you do not value my personhood?] Karn had felt ashamed the moment he'd asked the question, and yes, Urza had chuckled.

#emph[You're more valuable to me if you can react intelligently.]

Karn had stared at his aching, undamaged hand. #emph[Then why give me pain?]

Urza had smiled and stroked his white beard. Karn had later learned how to recognize that expression as one Urza made when he thought he was being particularly clever. #emph[People are more reluctant to damage something if it causes that thing pain.]

But that was only true of some people, wasn't it?

Karn glanced at Jhoira's unanswered letters. He did not dare involve Jhoira or the other Planeswalkers, lest he lose them to the Phyrexians like he'd lost Venser.

Even after Memnarch renamed it, Karn still thought of that plane by its first name: Argentum. It had been Argentum to him when he'd created it and its smallest wonders. How beautiful it had been, a plane glimmering with mathematical precision. The Phyrexians had taken it from him. His plane, his children. Memnarch, his creation. And it was all his fault.

He grabbed a rag from a nearby pile to wipe the condensation from his body—he didn't want to drip on his new, unfired find—and dropped the rag back into the heap. He bent to study the sylex, comparing its symbols to those on the clay tablet. The pattern changed right where the clay tablet's edge seemed rougher. Broken. Had he forgotten a piece?

He needed to go back for it. Now. Since he'd opened the cavern to the caves' moisture, the artifacts in it would degrade.

Just then, the death rattle of another excavator echoed through the caverns. Karn wished he could sigh. But, as it was, he locked away the sylex and his most recent find. He'd repair the excavator—it was located near the ancient workshop anyway—and then he'd look for the missing piece.

#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)

Oily smoke oozed up from the excavator's housing. It seemed to have hit a hard mineral deposit, stressing the mechanisms behind the cutting tools. Karn patted it. "More than you could cope with?"

The machine released a gout of smoke.

"I know the feeling," Karn replied.

Before getting started, he peered around the tunnel's edge. The nearby workshop, despite the excavator's rumblings, seemed intact. Good. The excavator could continue its work without risking damage to those artifacts, then. After he repaired it, he would search the workshop for the missing piece to the clay tablet.

He pulled the machine away and reached into the wall where it was digging. He scooped out something~ liquid. Oily black slop dripped down his fingers, spattering the ground.

Could it be~ ?

Karn reached with his special senses into the material. For him—he had tried to explain it to Jhoira once—this ability was akin to tasting, if tasting were to provide information beyond flavor. He felt nothing. As if this substance was organic. How had the cables been embedded into stone? It looked almost like they'd woven themselves into it, like worms through an apple that was otherwise undisturbed. He had been right: the oil was Phyrexian.

He checked again—could these fibers be ancient remnants?

"No, no," he murmured. "They look recent. Fresh."

Karn reached into the borehole and grabbed one of the fibers. It writhed under his fingers, resistant, and released small spider-like clamps from its body to grip the stone. The cable was alive. He frowned. It lashed his fingertips as if trying to wriggle free from his grip. He tugged hard and tore it free from its tunneled cavity. Black oil spattered across his torso from its root.

The other cables contracted within the wall—and the ceiling to the ancient workshop thundered to the ground. The tunnel behind Karn crumpled, the passage to his basecamp gone.

He had lost his findings. He would never locate the shard of the unfired tablet. He would never slot it in and see what it revealed. He would never fully investigate the chamber and determine if it housed other secrets about the sylex's creation. This recent development had seen to that.

Now he had a more urgent problem, one he needed to prioritize over archeological catastrophe: the Phyrexians were on Dominaria. Here, now.

He could try to excavate the workshop. He could dig out the passage and return to his basecamp. He could reach out to the others, but seeking help took time and, Karn knew, put others at risk. If he had learned anything during his long life, it was this: a single moment of inattention, of neglect, could leave an entire plane vulnerable to the Phyrexians.

The Phyrexians were contained within the caves for now, and he with them. Good. He would not let Dominaria fall like Mirrodin once fell. He'd stop the Phyrexians. If he could not do that, he would obtain proof enough that he could recruit reinforcements. Proof enough that Jhoira, and his fellow Planeswalkers, would believe him.

#emph[Karn], Jhoira would say, #emph[you were right all along.]

#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)

Karn had only one direction he could go: forward.

He stepped into the open Phyrexian tunnel. The walls looked organic, winding through the stone like veins through a body. He followed the tunnel until it opened into a junction. Here, the walls had been carved into a frieze. Unlike the materials he had seen in the workshop and embedded behind translucent stone, these cuts seemed crisp and new. It had the vaulted quality Karn associated with religious practices, such as the stained-glass murals in Serra's temples.

In the frieze, a Phyrexian demon grasped a young human woman. The demon's elongated skull, bared teeth, and small eyes were depicted with loving detail. Each node of machinery and every exposed muscle fiber was polished until it gleamed. Small diamonds had been inset as highlights so that the demon seemed to move and glitter under Karn's gaze. In contrast, the human's profile, cut into the stone, was rough, her features drawn down into torment, revulsion, and fear.
She held hands with another figure whose visage had been carved then intentionally defaced.

#figure(image("001_Episode 1: Echoes In The Dark/03.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)

A whisper of cloth brushing stone drew Karn's attention. He turned, hand still pressed to the mural.

Humans always seemed so little to Karn. Only the tallest neared his height; all others were petite compared to him. These two—a man and a woman—were both small. The woman, her pale skin starved for sunlight and her brown hair ragged, had replaced her jaw with a hinged mechanism, small blades installed alongside her natural teeth. Where flesh joined metal, her scabs wept a sick, yellowish fluid. Her older companion, a white man with graying brittle blond hair, must have incorporated his technology more artfully: his white shirt lay open to expose the artificial heart beating between his ribs, the portal into his body protected with a glassy material. He had also added extra digits to his hands.

Both held chisels and large mallets. The sculptors, then. Acolytes from the Society of Mishra if their robes told him the truth.

The female looked at Karn, then to his hand resting on the frieze, and shrieked in outrage. She launched herself forward. Her male companion followed her a second later.

She took a swing at his torso with the hammer. Karn seized her arm with one hand, and she drove her chisel toward the intricate, mobile plates along his abdomen. He grabbed her other arm. She grunted, straining against him.

Her companion ran at Karn, raising his hammer over his shoulder. Karn swung the woman into her companion, slamming both into the wall. They fell in a tangle of limbs—nothing broken, just stunned—to the ground.

Karn bent over them and arranged their limbs. He held out his hands and generated restraints so that they could not attack him again. Iron particles buzzed at his fingertips, drawn from the aether.
He called forth the metal in layers, building the restraints into bands on their arms and legs. He did not generate keyholes or a key, for he had no need to. The metal bands were solid.

The man moaned. The woman had enough fury in her to spit at Karn. The gob landed near his foot.

They were so tiny. His strength, his reflexes, the facts of his body seemed an unfair advantage. Karn had torn through so many such creatures at Urza's request, walking through rank upon rank of soldiers like a lead weight through wet paper. He could almost feel it now: the resistance, then give, of those bodies; the heat of their blood trickling into his joints. The long hours he had spent while Urza slept cleaning his body with small wire brushes, scraping off the dried gore, digging out the clots from behind his knees. He had never felt clean enough.

"You are not Phyrexian," Karn said, "yet you are here, and if I am not mistaken, in service to them. What do you hope to accomplish?"

"You—you empty, fleshless husk. You desecrate our holy work with your touch." The female acolyte's rage dulled to a glittering self-satisfaction. "Others will respond to the barrier being broken. Gix's blessings upon them—they will come. They will come."

Ah, yes, the network of wires in the walls. When he had broken through them, he had likely triggered an alarm. Perhaps these first acolytes had responded as if some animal or natural event had severed the cables, but when these two did not report back, the others would not make such a mistake.

Karn reached toward the female's face and, with a twist of his fingers, generated a metal gag. The only reason she hadn't shouted for help—sounds would carry in these caverns—was that it hadn't yet occurred to her. She glared at him, making muffled noises that sounded like curses.

He leaned over the male acolyte. "What are you doing here?"

The male blinked at Karn. His pupils had dilated to different sizes. He was concussed. His speech, as a result, slurred. "Karn.
I know you. It's good that you came."

Karn frowned.

"The Whispering One has a plan for you." The acolyte beamed. "She grows stronger daily, and you will serve her. Sheoldred welcomes you! It's your destiny, Karn, to create for us. To help us. To become one of us."

Karn generated another gag so that this one, when he regained his senses, could not call for help. The acolyte accepted the gag—almost as if he appreciated it—with a beatific smile.

Karn stepped away. How had Sheoldred survived crossing between planes? A question he would ponder later: for now, he had to find her, to end the Phyrexian invasion before it started. And he could do that alone. Better that he did, for he could not be subverted. Venser's spark saved him from that.

Karn left the acolytes bound and gagged and plunged deeper into the cave network. The damp in these passages did not feel like the air around his camp but rather warm as breath. Hot moist air condensed on his cold body, trickling down in rivulets like sweat.

Faint screams reverberated through the air. The tunnel opened into a vast cavern, which echoed with the cacophony of human misery. On the other side of the crevasse was the Phyrexian staging ground, located on a broad flat area of the cavern floor. Antlike workers scrambled across the rope bridges strung over the crevasse, ferrying meaty gobs, bloody cables, and chunks of flesh over to humans being compleated on surgical tables.

On the opposite wall of the cavern, a Phyrexian portal ship cut through the darkness like an immense scythe. Coils hung from this structure. The twitching loops' membranous purple gleam reminded Karn of intestines.

Sheoldred hung suspended in this morass. She was still. Tubes fed reddish and milky substances into her black segmented body. The mandibles that extended down from her thorax lay open, relaxed. Her humanoid torso, welded to the thorax's top, lay nested in a thick network of writhing inky lines. A horned mask obscured her face.
Beneath her, worshippers clung together and raised their voices in an ecstatic paean.

#figure(image("001_Episode 1: Echoes In The Dark/04.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)

The defunct Phyrexian portal ship and Sheoldred's sleeping form dominated the cavern. Acolytes in the gray robes of the Society of Mishra attended surgical machines that converted struggling people into Phyrexian abominations. Compleated monstrosities dotted the cavern's floor like grotesque artworks, skittering on too many limbs. More acolytes stacked weapons beside a Phyrexian skyship. Teams of splicers scaled a dragon engine to repair it, so small that their welding torches seemed like white stars against the engine's metal skeleton.

He'd found the staging ground of the Phyrexian invasion.

A single figure attended Sheoldred: a young woman with platinum-brown skin and dark umber curls who wore the cloak of the Tolarian academy. When she turned, Karn saw the red dot from a mechanical eye. Below, an acolyte hurried forward and offered up gobbets of flesh. The young woman sorted through them, twisting some into the morass that supported Sheoldred. Karn traced the line of acolytes carrying materials from the immense monstrosity to Sheoldred and her helper. She was #emph[mining] the monstrosity to repair Sheoldred's damaged biological components.

If the other Planeswalkers could see this now, they'd know that Karn's fears were true. Jhoira would say—

No. It didn't matter what Jhoira would say. Karn faced this threat alone. He needed to alert the others, yes, but neither could he leave this staging ground intact. He had to destroy the Phyrexians before they could defend themselves.

His course of action decided, Karn held out his hand, palm up. He raised his other hand above it. He visualized the incendiary device he planned to generate from the inside out. He could see its every component, its chemicals, laid out like a dimensional blueprint.
His fingertips buzzed with the magic of his creation. The layers of material accumulated in midair. It was no sylex, but it would end Sheoldred.

A klaxon filled the cavern with its high-pitched holler. Karn located its source as acolytes, worshippers, and Phyrexian agents paused in their work: the female acolyte who'd attacked him was blowing a horn. Either she'd been discovered and freed, or she'd freed herself: the drawback to leaving his attackers alive.

The shrill sound prompted action. Acolytes loaded weapons into the skyships. Phyrexian surgeons loaded their bloodied operating tables into skyships. Others boarded the skyships, evacuating. Compleated Phyrexian monstrosities shuddered to life, metal fibers snaking out from their bodies. Others slumped to the floor. Claw-like limbs exploded from their abdomens and their gaping mouths opened blindly, like reptiles scenting for prey.

A red beam dotted Karn's chest. Karn dropped flat to the rock just as a bolt of electricity flew overhead. He pressed his palms to the ground, lifting himself high enough to crawl forward. At the cliff's edge, he peered to the cavern's floor, trying to locate the blast's origin.

The Tolarian who aided Sheoldred leveled a glaive at him. She had replaced her eye with a miniaturized ray cannon, and its red beam hit Karn. Karn rolled to the side. A crackle exploded the rock beside him. Smoke wisped up from where he'd once lain.

#figure(image("001_Episode 1: Echoes In The Dark/05.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)

Compleated Phyrexians swarmed toward him, and the Tolarian smiled. She placed a hand on Sheoldred's limp claw. Sheoldred remained limp, inert—as if she were under sedation while the young woman worked to restore her—and vulnerable. And Karn still held his incendiary device.

The nearest bridge over to Sheoldred was close but narrow. Twelve ecstatic worshippers and the young woman with the glaive blocked his access to the Praetor.
But Sheoldred seemed some distance from the Phyrexians and Tolarian acolytes located on the cavern floor. If Karn was quick, he would not have to battle through all Sheoldred's followers to attack her: just the twelve worshippers, the Tolarian making it thirteen.

Karn pushed himself to his feet and charged down the narrow stone bridge. Sheoldred's worshippers ceased their paean and launched themselves toward him. Two reached the bridge. Karn shouldered them aside into the crevasse's abyss.

The other worshippers crowded themselves into a blockade. Two had leveled spears at him, which would have kept him at bay had he been a creature with flesh. Puncturing weapons annoyed him only if the shafts or blades became trapped in his joints and inhibited his range of motion. Similarly, the two young men with rotating saws did not give him pause: those blades would glance from his body. No, Karn focused on the worshippers who wielded the piston chisels and welding torches.

It all came back to him so easily. He felt numb, efficient. As Urza made him.

Karn stopped an inch away from the spearpoints. The worshippers shifted, uneasy. Karn took one step forward, seized a spear, and lifted. A worshipper, still clinging to his weapon, gaped and dangled. Karn swung him into his foes, sweeping several off the bridge and breaking up their blockade. Then he tossed the spear-wielder into the crevasse's depths, the man's screams fading as he fell.

The other spear-wielder, an older woman, jammed her spearpoint into a gap in his torso. Although he held an incendiary device, he cracked the spear shaft by hammering his fist down, breaking it off inside himself. He'd deal with that later. He grabbed the broken shaft end, which she still held, and used it to swing her aside. She fell and crumpled.

Only six combatants remained.

The wielder of a rotating saw swung their buzzing instrument at Karn's head. Karn stepped back to evade.
Before the saw could come around again, he stepped within the wielder's range and removed the tool from the man's fingers. The man tried to resist, but Karn had overwhelming strength on his side. Prying away the man's grip was disturbingly easy. Karn lifted him and tossed him into two more worshippers. The force crunched all three to the floor in a sickening confusion of broken limbs.

A piston chisel wielder rushed him from the side. The chisel thunked into Karn, then skidded down his arm, throwing its user off balance. Karn punched him. The man went flying. The remaining two worshippers fled; their faith was not so great in the face of such bodily harm.

All these humans, even with the Phyrexian alterations, were no sturdier to him than butterflies. Karn wished it were not so easy.

He strode to Sheoldred. She hung, limp within her cradle but no longer quiescent. Her segmented limbs twitched like an arachnid's as she emerged into consciousness. Her human torso atop her thorax shivered. Her long fingers reached downward to the young woman in the Tolarian robes. But she did not seem aware—not yet.

"Karn." The Tolarian spoke with contempt. "I've heard a lot about you."

"How so?"

Her gaze flicked to Sheoldred's inert form, then returned to him. "You aren't as impressive as I've been led to believe."

Karn walked toward Sheoldred, the incendiary device in his hand. "Who are you?" Karn asked the Tolarian. "Why would you bring this here?"

"Rona. And this," she gestured to Sheoldred, "is Dominaria's salvation."

Rona positioned herself between Karn and Sheoldred, glaive held easy and at an angle in her palms. Rona's flesh eye narrowed while her mechanical socket focused its laser onto Karn's torso. She flexed her hands around her glaive. Its blade brightened, crackling with blue electricity. She smiled.

"I do not wish to fight you," Karn told her.

"Too bad." Rona leveled her glaive at him, and electricity burst from her blade. The electricity danced across his body, sparking.
Karn grimaced at the pain but pushed through it, walking toward her as more waves rippled from its blade, pouring over him. Karn paused, dazed, and tried to shake off the agony as Rona continued to attack. She swung the glaive down, lodging it into his shoulder. Karn twisted, pulling it from her grip, and removed it from his body. He threw it aside.

While he was occupied, Rona unsheathed a dagger and jammed it into one of his abdominal seams. She dug it between the plates that allowed him to flex, as if searching for organs. Karn winced.

Karn gripped her head in one of his hands. He pressed his thumb into the mechanical eye and shattered the ray's lens. Rona shrieked and kicked. Karn tossed her into the wall. Bones crunched. She slammed into it, then fell to the ground. She curled, her hands around her head, her leg at an angle unnatural to human beings. Oil and blood oozed from the broken mechanical parts in her eye socket. She glared up at him from between her fingers, her lips drawn into a rictus.

"Why don't you kill me?" Rona taunted him. "Finish me off."

"I am not a weapon."

Karn neared Sheoldred, holding his incendiary device. Although her humanoid part was the size of an ordinary woman, she attached to a scorpion-like body easily three times his size. In contrast to that well-crafted beauty, the organic materials grafted into her human torso seemed crude, bloody. Rona had tried her best to replace the organic parts that had burned away in the Blind Eternities during Sheoldred's transit between planes, but its patchwork nature showed.

He would tear her to pieces. He would crush her while she was still weak. He would do anything—#emph[anything]—to stop Sheoldred from Phyrexianizing this plane.

Karn reached up and seized Sheoldred's torso, determined to finish this. He would tuck this device between the vulnerable plates of her body and destroy her.

At his touch, Sheoldred stirred. Her helmeted head craned down toward him.
He could #emph[feel] her with the same senses he used to determine the elemental composition of a compound. Her inorganic components stretched out before him like the pages in a book. Her biological parts lay like dark tumors nested within the metal's luminescent glory. He could read her thoughts—some of them.

#emph[Welcome, Father], Sheoldred whispered into his mind, one mechanical being to another. #emph[What plans I have for you.]

Karn recoiled from her slimy whisper, stepping back. And he knew what she had done. Phyrexian sleeper agents lurked in every land in Dominaria, these unknowing spies peppered throughout every government, throughout the military, throughout common people. He saw a brewer dumping hops into a vat. A spy. He saw a scribe sitting at a desk, her hand poised over a letter. He saw an adolescent playing chase with his cousins, pretending to be a monster when he was one, Phyrexian armature ready to explode from his back. Phyrexian agents were people's lovers, comrades, colleagues at work. They were everywhere. They could be anyone.

#emph[Welcome], her whisper echoed within him. #emph[Welcome.]

Karn reached between the plates on her thorax and deposited the incendiary device inside her body. He lifted his thumb to flip the switch that would allow the two chemicals within it to flow into each other and combust.

But his hand did not move. His joints had locked. He attempted to look down to examine himself, but even his neck remained rigid. He tried to turn and couldn't move his arms, legs, or torso. He couldn't tell whether he'd been paralyzed or locked into place.

In his peripheral vision he could see Rona dragging herself—shattered lens, broken leg and all—toward unfamiliar magical devices, ones that she must have created herself. She left a trail of oil, blood, and fluorescent blue fluid behind her. Karn strained against the strange magic that gripped him.

Rona levered herself up into a seated position. From her grunts, it sounded agonizing.

"Your mistake," she said, "was not killing me when you had the chance. We have expected your coming, Karn. We have prepared."

He tried again to move, his internal mechanisms groaning with the effort, and felt his metal torque. He would bend—break—before he freed himself from Rona's magic through force.

Rona sorted through the heaped parts she'd been using to repair Sheoldred. She lifted up a node, smiled, and set it aside. With a grimace, she dug her fingers into her damaged eye socket and yanked out the ruined node, exposing raw tissue and a piece of gleaming skull near her eyebrow. A gout of clear liquid spurted out. She clicked the new node into place.

Roars boomed through the cavern. Rock sifted down, pinging against Karn's body.

"That," Rona said, "was the sound of our ships evacuating our forces from this staging area—which has been compromised—and retreating to a secondary staging area. We have many bases across Dominaria. You will not find them all."

Rona drove her glaive into her leg. She grunted, slicing through her clothing and her flesh. Her eyes teared—even the eye she'd replaced dripped. Panting, she bared her muscle and her broken bone to the cavern's air.

Karn had failed. Held with Rona's magic, he would be unable to warn his friends, unable to fight at their sides, unable to save them when the compleated Phyrexian agents exploded from their dearest companions to kill them.

The cavern had emptied and quieted enough that Karn could hear the click as Rona slid a device into her leg. She sighed and folded her flesh over the metal. She fixed another panel over her thigh, sealing her wound shut, and then stood. She rolled her shoulders and smiled.

"Sheoldred, in her beauty, my Whispering One," Rona said, "grows stronger by the day, and she will lead us to victory."

Karn, still arm-deep in Sheoldred's torso, could feel clicking vibrate along his body. Sheoldred split apart, dividing herself up in pieces.
Her segments broke away, each piece sprouting a dozen viridian segmented legs. The swarm poured over Karn, using him as a bridge to the floor. The spider-like creatures ran along Karn's arms, down his back and torso, the backs of his knees, his calves. The #emph[tink-tink-tink] of their metallic claws reverberated through him. A tarantula-sized piece sprang free from the cables onto Karn's face. It clung to his head, twitching, a heart-like nugget of flesh grafted into the center of its modified thorax. It crawled over his head. He could feel its wet body slither down his back. It dropped to the floor and scampered away. "I may not be able to stop you, Urza's creation," Rona said, "but I can prevent you from stopping us." From the edge of his vision, Karn could see Rona limp down a tunnel. Even with her makeshift repairs, Rona remained heavily damaged, and she leaned on her glaive, using it as a cane. Her leg spurted yellow fluid, and she staggered. She paused to catch her breath. Oil dribbled from her new inserts, mingled with blood. He turned his head to watch her. The field paralyzing Karn had weakened. Perhaps it was due to Rona's retreat. Did she carry the device with her that held him in place? Karn attempted to raise his arm. The effort shuddered through him. He lifted one finger. Rona let her shoulder rest against the tunnel wall. She used her glaive to cut free a strip of cloth from her mantle. "I hope that as we take this plane, as we make it more perfect, you feel the keenness of failure once again." Karn strained against the force gripping him. His jaw ached. "What~" Rona bound the cloth strip around her leaking leg in a tourniquet. "As you watch the people you've known for eons transform and turn against you, I hope it hurts." "Why would you say that?" Karn managed. He had to keep her talking. If he could break free~ "What have I done to you that you would wish such horror upon me?" 
"When the Mirrans became Phyrexian," Rona said, "it was the best thing that ever happened to them. They were independent from their creator. Unified. Beautiful." The force holding Karn seemed to loosen. He needed to push free. Even with the Phyrexian staging ground in the Caves of Koilos empty, if Karn could capture Rona, as Sheoldred's right hand she would be able to provide valuable information. All was not yet lost. "You'd kill them, wouldn't you," Rona said, "for reaching for perfection." He only needed one more moment— "You gave Memnarch your intelligence. Your capabilities. But he didn't have the experience to deal with it. The guidance. He was so lost." Rona's smile twisted. She enjoyed his struggle. "I can't stand bad parents." Karn stopped. His body couldn't have reverberated more if she had struck him. Rona flipped a switch on the wall. There was a small grating noise. Then, a series of booms overhead. The roar, as the cavern fell, engulfed him. Tons of rock poured onto him. A boulder rolled off the cavern's wall, then bounced into his chest. It tossed him onto his back. He stared up at the collapsing cavern, still paralyzed by Rona's device. Rocks sheeted down. Fist-sized pieces hammered into his body. Smaller pebbles thunked and plinked against him, rolling and filling in the gaps. His vision turned gray from dust then blackened out as the stone obscured all light. The rock weighed upon him. He could feel Rona's spell ease. He could move—or at least beneath all this stone, he could attempt to move, to twitch a finger. For whatever good it did him. Not even he could lift this stone. Not even he could dig his way out from this cave-in. The crushing layer of rock was too heavy even for him to shift. Karn reached for the spark that allowed him to planeswalk. It burned within him, hot and bright, such a perpetual companion that he had ceased to notice it. If he could just focus and— It didn't work. Nothing happened. 
Karn reached out with his special senses through his fingertips and analyzed the surrounding inorganic materials: olivine, granite, quartz, mica. Ordinary stone, but with all the ancient interplanar and Phyrexian technology providing a low-grade interference, he could not planeswalk away. He was trapped. Only he knew that Sheoldred had come to Dominaria, and he could warn no one.
https://github.com/ParaN3xus/numblex
https://raw.githubusercontent.com/ParaN3xus/numblex/main/lib/escape.typ
typst
MIT License
#let escaped_numbering(format, ..nums) = { numbering(format, ..nums) }
https://github.com/tiankaima/typst-notes
https://raw.githubusercontent.com/tiankaima/typst-notes/master/7e1810-algo_hw/hw1.typ
typst
#import "utils.typ": * == HW 1 (Week 2) Due: 2024.03.17 #rev1_note[ + Review: the asymptotic notations $o, O, Theta, omega, Omega$ are defined as follows: $ O(g(n)) &= {f(n) mid(|) exists c > 0, n_0 > 0, forall n >= n_0 quad& 0 <= f(n) <= c dot g(n)}\ o(g(n)) &= {f(n) mid(|) forall c > 0, exists n_0 > 0, forall n >= n_0 quad& 0 <= f(n) < c dot g(n)}\ Theta(g(n)) &= { f(n) mid(|) exists c_1, c_2 > 0, n_0 > 0, forall n >= n_0 quad& 0 <= c_1 dot g(n) <= f(n) <= c_2 dot g(n) }\ Omega(g(n)) &= {f(n) mid(|) exists c > 0, n_0 > 0, forall n >= n_0 quad& 0 <= c dot g(n) <= f(n)}\ omega(g(n)) &= {f(n) mid(|) forall c > 0, exists n_0 > 0, forall n >= n_0 quad& 0 <= c dot g(n) < f(n)} $ + Review: the master theorem. For the recurrence of a divide-and-conquer algorithm, $T(n) = a T(n / b) + f(n)$, the master theorem gives a quick way to determine the asymptotic complexity. Write $c = log_b a, f(n) = O(n^d)$: + If $c > d$, then $T(n) = Theta(n^c)$. Note that the comparison here must hold in the polynomial sense, i.e.: $exists epsilon > 0, f(n) = O(n^(c - epsilon))$ + If $c = d$, then $T(n) = Theta(n^c log n)$ + If $c < d$, then $T(n) = Theta(f(n))$, provided the regularity condition holds: $exists c' < 1, quad a f(n\/b) <= c' f(n) < f(n)$ ] === Question 2.3-5 You can also think of insertion sort as a recursive algorithm. In order to sort $A[1 : n]$, recursively sort the subarray $A[1 : n - 1]$ and then insert $A[n]$ into the sorted subarray $A[1 : n - 1]$. Write pseudocode for this recursive version of insertion sort. Give a recurrence for its worst-case running time. #ans[ The pseudocode for this recursive version of insertion sort is as follows: ```txt
INSERTION-SORT(A, n)
  if n > 1
    INSERTION-SORT(A, n - 1)
    key = A[n]
    i = n - 1
    while i > 0 and A[i] > key
      A[i + 1] = A[i]
      i = i - 1
    A[i + 1] = key
``` #rev1_note[ Worst case: when sorting $[1,k]$, $A[k]$ must be compared once against every element of $A[1:k-1]$ to determine its insertion position. Sorting $[1,n]$ therefore takes $1 + 2 + dots.c + (n-1) = Theta(n^2)$ comparisons in total. ] The recurrence for its worst-case running time is $ T(n) = cases(T(n - 1) + Theta(n) space.quad & n>1, Theta(1) & n=1) $ The solution to the recurrence is $Theta(n^2)$ worst-case time. 
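For reference, the recursive pseudocode above can be turned into a runnable sketch in Python (the function name and zero-based indexing are my own):

```python
def insertion_sort(a, n=None):
    """Recursively sort a[0:n]: sort a[0:n-1], then insert a[n-1]."""
    if n is None:
        n = len(a)
    if n > 1:
        insertion_sort(a, n - 1)  # sort the prefix a[0:n-1]
        key = a[n - 1]
        i = n - 2
        while i >= 0 and a[i] > key:  # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```

Each call does $Theta(n)$ work inserting one element into an already sorted prefix, matching the recurrence $T(n) = T(n-1) + Theta(n)$ above.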
] === Question 2-1 *Insertion sort on small arrays in merge sort* Although merge sort runs in $Theta(n lg n)$ worst-case time and insertion sort runs in $Theta(n^2)$ worst-case time, the constant factors in insertion sort can make it faster in practice for small problem sizes on many machines. Thus it makes sense to coarsen the leaves of the recursion by using insertion sort within merge sort when subproblems become sufficiently small. Consider a modification to merge sort in which $n\/k$ sublists of length $k$ are sorted using insertion sort and then merged using the standard merging mechanism, where $k$ is a value to be determined. + Show that insertion sort can sort the $n\/k$ sublists, each of length $k$, in $Theta(n k)$ worst-case time. + Show how to merge the sublists in $Theta(n lg(n\/k))$ worst-case time. + Given that the modified algorithm runs in $Theta(n k + n lg(n\/k))$ worst-case time, what is the largest value of $k$ as a function of $n$ for which the modified algorithm has the same running time as standard merge sort, in terms of $Theta$-notation? + How should you choose $k$ in practice? #ans[ + For each sublist, the insertion sort can sort the $k$ elements in $Theta(k^2)$ worst-case time. Thus, the insertion sort can sort the $n\/k$ sublists, each of length $k$, in $Theta(n k)$ worst-case time. + Given $n\/k$ sorted sublists, each of length $k$, the recurrence for merging the sublists is $ T(n) = cases(2 dot.c T(n\/2) + Theta(n) space.quad & n>k, 0 & n=k) $ The solution to the recurrence is $Theta(n lg(n\/k))$ worst-case time. *This could also be viewed as a tree with $lg(n\/k)$ levels with $n$ elements in each level. The worst case would be $Theta(n lg (n\/k))$.* #rev1_note[ View the $n\/k$ sorted sublists as $n\/k$ single items forming the leaves of the merge-sort recursion tree. The tree then has $n\/k$ leaves and therefore $log(n\/k)$ levels. Each level merges $n$ elements in total, so the overall time is $Theta(n lg(n\/k))$. Simply performing $n\/k - 1$ successive pairwise merges does not work: that runs in $Theta(n^2\/k)$, which is too slow. Another workable approach: merge the $n\/k$ sorted arrays directly, by repeatedly comparing the smallest not-yet-extracted element of each of the $n\/k$ arrays and taking the minimum among them. 
Concretely, maintain a min-heap of size $n\/k$, together with an array of size $n\/k$ that stores the current element of each sublist. Each step pops the heap minimum and pushes the next element of the corresponding sublist. Extracting each minimum (re-establishing the min-heap) then costs $O(lg(n\/k))$, for a total of $O(n lg(n\/k))$. ] + Take $Theta(n k + n lg(n \/ k)) = Theta(n lg n)$, consider $k = Theta(lg n)$: $ Theta(n k + n lg(n \/ k)) &= Theta (n k + n lg n - n lg k) \ &= Theta (n lg n + n lg n - n lg (lg n)) \ &= Theta (n lg n) $ #rev1_note[ Idea: to obtain $ Theta(n k +n log(n\/k))=O(n log n) $ it suffices that $k = O(log n)$ and $log(n\/k) = O(log n)$. This gives $k=omega(1)=O(log n)$; taking the largest boundary $k=Theta(log n)$, the verification above shows that the bound holds strictly, so the largest value of $k$ is $Theta(log n)$ (in the asymptotic sense). Observe that when $k=omega(log n)$, we get $Theta(n k + n log(n\/k))=omega(n log n)$, so such a $k$ does not satisfy the requirement. ] + Choose $k$ to be the largest length of sublist for which insertion sort is faster than merge sort. Use a small constant such as $5$ or $10$. #rev1_note[ The main point here is that comparing the running times of two algorithms that are equal in the $Theta$ sense necessarily involves their constant factors; in practice these can be determined by counting the actual number of operations each algorithm performs. ] ] === Question 4.2-3 What is the largest $k$ such that if you can multiply $3 times 3$ matrices using $k$ multiplications (not assuming commutativity of multiplication), then you can multiply $n times n$ matrices in $o(n^(log 7))$ time? What is the running time of this algorithm? #ans[ #rev1_note[ Paraphrasing the question: given an algorithm that multiplies $3 times 3$ matrices using $k$ multiplications, can it be used to construct an algorithm that multiplies $n times n$ matrices in $o(n^(log_2 7))$ time? Find the largest $k$ for which this is possible. The recurrence is $T(n) = k T(n\/3) + O(n^2)$; we apply the master theorem case by case ($k=27$ would just be plain block matrix multiplication, so we may assume $k<27$): - $log_3 k < 2$: the regularity condition $k dot (n\/3)^2 < n^2$, i.e. $k < 9$, is satisfied, and the algorithm runs in $T(n)=O(n^2) subset o(n^(log_2 7))$. - $log_3 k = 2$: in this case $T(n)=O(n^2 lg n) subset o(n^(log_2 7))$. - $log_3 k > 2$: to have $T(n)=O(n^(log_3 k)) subset o(n^(log_2 7))$, we need $log_3 k < log_2 7$, and the largest such $k$ is $21$. ] Assuming $n = 3^m$ and using block matrix multiplication, the recursive running time is $T(n) = k T(n\/3) + O(n^2)$. When $log_3 k > 2$, the master theorem gives $T(n) = Theta(n^(log_3 k))$, and the largest $k$ satisfying $log_3 k < lg 7$ is $k=21$, for a running time of $Theta(n^(log_3 21)) = o(n^(lg 7))$. ]
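The $n\/k$-way heap merge sketched in the note to Question 2-1 can be illustrated in Python (names are my own; `heapq` provides the min-heap):

```python
import heapq

def merge_sorted_lists(lists):
    """Merge sorted sublists by keeping the current head of each
    sublist in a min-heap; each pop/push costs O(log(number of lists))."""
    # Seed the heap with the first element of every non-empty sublist.
    heap = [(lst[0], i, 0) for i, lst in enumerate(lists) if lst]
    heapq.heapify(heap)
    out = []
    while heap:
        val, i, j = heapq.heappop(heap)  # smallest not-yet-taken element
        out.append(val)
        if j + 1 < len(lists[i]):        # advance within sublist i
            heapq.heappush(heap, (lists[i][j + 1], i, j + 1))
    return out
```

Each of the $n$ elements passes through a heap of size at most $n\/k$, which gives the $O(n lg(n\/k))$ bound from the note.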
https://github.com/DashieTM/ost-5semester
https://raw.githubusercontent.com/DashieTM/ost-5semester/main/compiler/weeks/week8.typ
typst
#import "../../utils.typ": * #section("Virtual Machine") #align( center, [#image("../../Screenshots/2023_11_06_08_06_48.png", width: 80%)], ) #subsection("Loader") - *loads temporary code into memory* - *creates metadata for classes, methods, variables and code* - defines layouts for fields, variables, parameters - resolves references to methods, types and other assemblies (address relocation) - *initiates execution* - interpreter or JIT compiler - *optional: code verification* #subsubsection("Metadata") #align( center, [#image("../../Screenshots/2023_11_06_08_13_26.png", width: 80%)], ) #subsubsubsection("Descriptors") - classes - field types - methods - types of parameters, local variables and return type - bytecode - arrays - element type Type descriptor: #align( center, [#image("../../Screenshots/2023_11_06_08_15_13.png", width: 60%)], ) Class & method descriptor: #align( center, [#image("../../Screenshots/2023_11_06_08_15_49.png", width: 70%)], ) #subsubsection("ByteCode Loading") The descriptors explained above will replace the classes, methods, etc.\ In other words, we no longer want the plain types; we now need the full description of each one -> which methods the class has, etc. 
#align( center, [#image("../../Screenshots/2023_11_06_08_17_18.png", width: 80%)], ) This is done via patching, i.e. we go through the code and replace each usage of a class with its descriptor: #align( center, [#image("../../Screenshots/2023_11_06_08_18_53.png", width: 80%)], ) #subsection("Interpreter") The interpreter goes through the bytecode instructions one by one and executes them dynamically: #align( center, [#image("../../Screenshots/2023_11_06_08_20_47.png", width: 100%)], ) - interpreter loop - emulates each instruction one after the other - instruction pointer (IP) - address of next instruction - evaluation stack - for virtual stack processor - locals & parameters - for active method - method descriptor - for active method #align( center, [#image("../../Screenshots/2023_11_06_08_36_59.png", width: 60%)], ) Loop: ```cs
while (true) {
    var instruction = code[instructionPointer];
    instructionPointer++;
    Execute(instruction);
}
``` execute: ```cs
switch (instruction.OpCode) {
    case LDC:
        Push(instruction.Operand);
        break;
    case IADD: {
        var right = Pop();
        var left = Pop();
        var result = left + right;
        Push(result);
        break;
    }
    // other cases...
}
``` #align( center, [#image("../../Screenshots/2023_11_06_08_24_15.png", width: 80%)], ) #align( center, [#image("../../Screenshots/2023_11_06_08_24_51.png", width: 80%)], ) #subsubsection("Activation Frame and Call Stack") Each method creates its own "activation frame", which holds the information for this method (parameters etc.): #align( center, [#image("../../Screenshots/2023_11_06_08_32_19.png", width: 70%)], ) #text( teal, )[For the interpreter, this call stack is "managed", i.e. with GC and everything, and represented in an OO way (at least for .NET).\ With JIT and natively compiled languages, the call stack is *not* managed.] 
#subsubsubsection("Managed") managed call stack: ```cs
class ActivationFrame {
    private MethodDescriptor method;
    private Pointer thisReference;
    private object[] arguments;
    private object[] locals;
    private EvaluationStack evaluationStack;
    private int instructionPointer;
    // ...
}

class CallStack {
    private Deque<ActivationFrame> stack;
}
``` method call: ```cs
var method = (MethodDescriptor)instruction.Operand;
var nofParams = method.ParameterTypes.Length;
var arguments = new object[nofParams];
for (int i = arguments.Length - 1; i >= 0; i--) {
    arguments[i] = Pop();
}
var target = Pop();
var frame = new ActivationFrame(method, target, arguments);
callStack.Push(frame);
``` method return: ```cs
var method = activeFrame.Method;
var hasReturn = method.ReturnType != null;
object result = null;
if (hasReturn) {
    result = Pop();
}
callStack.Pop();
if (hasReturn) {
    Push(result);
}
``` #subsubsubsection("Unmanaged") #align( center, [#image("../../Screenshots/2023_11_06_08_36_10.png", width: 60%)], ) #subsection("Verification") - detect wrong IL-code -> manipulation or error - static analysis at compile time - runtime analysis What will be checked? - correct usage of instructions - types are correct - method calls are correct with parameters and return types - jumps are valid -> methods, if, loops etc. - op-codes are correct - stack overflow or underflow is prevented - types are known - metadata is correct -> no unknown types - values on the evaluation stack have a type - values are initialized - null-dereferences, out of bounds - garbage collection -> if needed... - compatibility of external references -> includes *Example:* #align( center, [#image("../../Screenshots/2023_11_06_08_43_07.png", width: 60%)], )
https://github.com/morrisLuke/typst_quarto_barebones_report_template
https://raw.githubusercontent.com/morrisLuke/typst_quarto_barebones_report_template/main/README.md
markdown
## Barebones Report Template for Quarto with Typst This [Barebones template](https://morrisLuke.github.io/typst_quarto_barebones_report_template/report.pdf) features slots for a header image, footer image, footer page numbering, and footer motto. Template inspiration borrowed heavily from [<NAME>](https://github.com/dgkeyes)' far more [feature-rich report template](https://github.com/rfortherestofus/report-design) presented at [Posit::conf 2024](https://positconf2024.rfortherestofus.com/slides.html). ### More resources [Quarto documentation for Typst](https://quarto.org/docs/output-formats/typst.html) [Typst documentation](https://typst.app/docs) [Meghan Hall's blog post](https://meghan.rbind.io/blog/2024-08-14-quarto-templates/) recapping her Posit::conf 2024 talk <NAME>: [Making Pretty PDFs with Typst](https://nrennie.rbind.io/blog/pdf-quarto/making-pdf-with-quarto-typst-latex/) ![](https://morrisLuke.github.io/typst_quarto_barebones_report_template/report-preview.png) Out of the loop on the [South Harmon Institute of Technology](https://www.southharmoninstituteoftechnology.org/) reference? It's from the 2006 movie *[Accepted](https://www.imdb.com/title/tt0384793/)*
https://github.com/DieracDelta/presentations
https://raw.githubusercontent.com/DieracDelta/presentations/master/polylux/book/src/external/pdfpc.md
markdown
# pdfpc [pdfpc](https://pdfpc.github.io/) is a "presenter console with multi-monitor support for PDF-files". That means, you can use it to display slides in the form of PDF-pages and also have some of the nice features known from, for example, PowerPoint. Check out their website to learn more. When pdfpc is provided a special `.pdfpc` file containing some JSON data, it can use that to enhance the user experience by correctly handling overlay slides, displaying speaker notes, setting up a specific timer, and more. While you can write this file by hand or use the pdfpc-internal features to edit it, some might find it more convenient to have all data about their presentation in one place, i.e. the Typst source file. Polylux allows you to do that. ## Adding metadata to the Typst source Polylux exports the `pdfpc` module that comes with a bunch of useful functions that do not actually add any content to the produced PDF but instead insert metadata that can later be extracted from the document. ### Speaker notes This is possibly the most useful feature of pdfpc. Using the function `#pdfpc.speaker-note` inside a slide, you can add a note to that slide that will only be visible to the speaker in pdfpc. It accepts either a string: ```typ #pdfpc.speaker-note("This is a note that only the speaker will see.") ``` or a `raw` block: ````typ #pdfpc.speaker-note( ```md # My notes Did you know that pdfpc supports Markdown notes? _So cool!_ ``` ) ```` Note that you can only specify one note per slide (only the first one will survive if you use more than one.) ### End slide Sometimes the last slide in your presentation is not really the one you want to end with. Say, you have some bibliography or appendix for the sake of completeness after your "I thank my mom and everyone who believed in me"-slide. With a simple `pdfpc.end-slide` inside any slide you can tell pdfpc that this is the last slide you usually want to show and hitting the `End` key will jump there. 
### Save a slide Similarly, there is a feature in pdfpc to bookmark a specific slide (and you can jump to it using `Shift + M`). In your Typst source, you can choose that slide by putting `pdfpc.save-slide` inside it. ### Hide slides If you want to keep a certain slide in your presentation (just in case) but don't normally intend to show it, you can hide it inside pdfpc. It will be skipped during the presentation but it is still available in the overview. You can use `pdfpc.hidden-slide` in your Typst source to mark a slide as hidden. ### Configure pdfpc The previous commands are all supposed to be used _inside_ a slide. To perform some additional global configuration, you can use `pdfpc.config()` _before_ any of the slides (it will not be recognised otherwise). It accepts the following optional keyword arguments: - `duration-minutes`: how many minutes (a number) the presentation is supposed to take, affects the timer in pdfpc - `start-time`: wall-clock time when the presentation is supposed to start, either as a `datetime(hour: ..., minute: ..., second: ...)` or as a string in the `HH:MM` format - `end-time`: same as `start-time` but when the presentation is supposed to end - `last-minutes`: how many minutes (a number) before the time runs out the timer is supposed to change its colour as a warning - `note-font-size`: the font size (a number) the speaker notes are displayed in - `disable-markdown`: whether or not to disable rendering the notes as markdown (a bool), default `false` - `default-transition`: the transition to use between subsequent slides, must be given as a dictionary with (potentially) the following keys: - `type`: one of `"replace"` (default), `"push"`, `"blinds"`, `"box"`, `"cover"`, `"dissolve"`, `"fade"`, `"glitter"`, `"split"`, `"uncover"`, `"wipe"` - `duration-seconds`: the duration of the transition in seconds (a number) - `angle`: in which angle the transition moves, one of `ltr`, `rtl`, `ttb`, and `btt` (see [the `#stack` 
function](https://typst.app/docs/reference/layout/stack/#parameters-dir)) - `alignment`: whether the transition is performed horizontally or vertically, one of `"horizontal"` and `"vertical"` - `direction`: whether the transition is performed inward or outward, one of `"inward"` and `"outward"` Not all combinations of values are necessary or make sense for all transitions, of course. ## Extracting the data: `polylux2pdfpc` As mentioned above, the functions from the `pdfpc` module don't alter the produced PDF itself. Instead, we need some other way to extract their data. You could, in principle, do that by hand using the `typst query` CLI and then assemble the correct `.pdfpc` file yourself. However, this tedious task is better solved by the `polylux2pdfpc` tool. ### Installation If you have [Rust](https://www.rust-lang.org/tools/install) installed, you can simply run ```sh cargo install --git https://github.com/andreasKroepelin/polylux/ --branch release ``` If you use Arch Linux btw, you can also install `polylux2pdfpc` from the AUR package [polylux2pdfpc-git](https://aur.archlinux.org/packages/polylux2pdfpc-git) (thank you to <NAME>!) ### Usage You invoke `polylux2pdfpc` with the same arguments you would also give to `typst compile` when you wanted to build your slides. For example, say you have a file called `talk.typ` in the folder `thesis` that has some global utility files or so, you would compile it using ```sh typst compile --root .. thesis/talk.typ ``` and extract the pdfpc data using ```sh polylux2pdfpc --root .. thesis/talk.typ ``` Internally, `polylux2pdfpc` runs `typst query`, collects all the pdfpc-related metadata and then writes a `.pdfpc` file that equals the input file up to the suffix. In our example with `thesis/talk.typ`, we obtain `thesis/talk.pdfpc`. Since `typst compile` produced `thesis/talk.pdf`, you can now simply open the PDF in pdfpc: ```sh pdfpc thesis/talk.pdf ``` and it will automatically recognise the `.pdfpc` file.
https://github.com/TGM-HIT/typst-diploma-thesis
https://raw.githubusercontent.com/TGM-HIT/typst-diploma-thesis/main/template/glossaries.typ
typst
MIT License
#import "lib.typ": * #register-glossary( glossary-entry( "ac:tgm", short: "TGM", long: "Technologisches Gewerbemuseum", // group: "Acronyms", ), glossary-entry( "syt", short: "SYT", long: "Systemtechnik", desc: ["Systems engineering (Systemtechnik) refers to various assembly and interconnection technologies, but also to a branch of the engineering sciences. In contrast to the microtechnologies, it denotes the interconnection of the various individual modules of a system and their design." @wiki:syt] ), )
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/009_Born%20of%20the%20Gods.typ
typst
#import "@local/mtgset:0.1.0": conf #show: doc => conf("Born of the Gods", doc) #include "./009 - Born of the Gods/001_The Nature of Identity.typ" #include "./009 - Born of the Gods/002_Cowardice of the Hero.typ" #include "./009 - Born of the Gods/003_Emonberry Red.typ" #include "./009 - Born of the Gods/004_Kiora's Followers.typ" #include "./009 - Born of the Gods/005_Dance of the Flitterstep.typ" #include "./009 - Born of the Gods/006_The Walls of Akros.typ" #include "./009 - Born of the Gods/007_The Hero of Iroas.typ" #include "./009 - Born of the Gods/008_The Oracle of Ephara.typ" #include "./009 - Born of the Gods/009_Seasons in Setessa.typ"
https://github.com/Shyrogan/meformer
https://raw.githubusercontent.com/Shyrogan/meformer/main/rendu.typ
typst
#let prompt(prompt, answer) = { rect(fill: silver, inset: 12pt, radius: 8pt, stroke: 2pt + gray, [ *User:* #prompt ]) text()[ *ChatGPT:* #answer ] } = Discussion with ChatGPT #rect(fill: rgb("#FF000030"))[ *The answers do not match exactly what was obtained; I omitted the parts that were of no interest to me and the errors that I fixed by hand in a few seconds.* ] #prompt([ Here are the specifications for my project; as a first step, list the technologies you recommend: Presentation of the application. Once the application is installed, and without any registration, it lets its user benefit from: - A clear and ergonomic presentation of the application in different formats: video, text, etc. - A real but temporary demonstration (use) of the application's services. - A login interface - Etc. Registration. The application allows registering under three profiles: - Parent: In this case, the application offers: - To enter the parent's information: last name, first name, choice of login, choice of password, email address, Twitter account, Facebook, etc. Allow verification of the email address (for example, activating the account via a link sent to the email address provided). - To choose the number of students covered by this registration and to enter their profiles (last name, first name, family relationship, school level, etc.) - To choose the school year or years concerned. - To choose a plan from a set of offered plans. For example: ... ], [ For this mobile application project, with features specific to several profiles (parent and student) and a robust server side, here are the technologies and tools I recommend for each part: == Mobile Frontend (Android): For the user interface, accessibility and user experience, it is important to use tools and frameworks suited to Android development. 
Language: Kotlin Kotlin is now the recommended language for Android development, as it offers a simpler and more concise syntax than Java. UI design: Material Design Use Material Design principles to provide an ergonomic and fluid interface that follows modern standards. Android Jetpack Components: Fragments: To manage navigation between the different views (parent, student, etc.) in the application. ViewModel & LiveData: To manage real-time data and persistence across activity lifecycles. Room: To manage the local database, enabling offline mode. DataBinding: To bind UI components to data with ease. Multimedia libraries: ExoPlayer: To manage and display videos (used for video-format lessons). Glide: For optimized loading and display of images (lesson sheets, avatars, etc.). == Backend (Server): The server side must be able to handle registrations, store data, run algorithms, and support offline/online synchronization. - Backend language and framework: - Node.js with Express.js: Ideal for a lightweight and fast backend, with rich libraries for building RESTful APIs. - Python with Django or Flask: If you are planning more complex algorithms (for recommendations or managing progress curves), Python can be a good choice thanks to its artificial intelligence and machine learning libraries. - Database: - PostgreSQL: For a robust relational database with good handling of the relationships between the various entities (users, lessons, exercises, etc.). - MongoDB: If you prefer a NoSQL database, which is more flexible for storing documents (lessons, student activities, etc.). 
- Authentication & Security: - OAuth2 for authentication via external services (Google, Facebook, Twitter). - JWT (JSON Web Token): To manage user sessions securely. == Online/Offline Synchronization: To allow an offline mode with online synchronization when the user goes back online: Realm: A mobile database that allows easy synchronization between offline and online modes. WorkManager: To manage background tasks and data synchronization in offline/online mode. == Online Payments: To handle parents' payments for the chosen plans: - Stripe: For secure credit card payments. - PayPal SDK: As an alternative online payment option. == Notifications and Reminders: To manage notifications about lessons or activity reminders: - Firebase Cloud Messaging (FCM): For sending push notifications. - AlarmManager: To schedule local notifications or reminders even in offline mode. == Artificial Intelligence for Recommendations: For progress recommendations to students: - TensorFlow Lite: If you plan to run on-device AI to personalize recommendations. - Python (Scikit-Learn or TensorFlow): If the artificial intelligence is handled server-side. == Testing and Code Quality: - JUnit & Espresso: To test features and the user interface. - SonarQube: For code quality analysis. == Other tools to consider: - Retrofit: For network calls, in particular communication between the application and the server. - Coroutines (Kotlin): For asynchronous handling, especially network calls and background tasks. SpeechRecognizer API: To enable voice-activated features. == Real-Time Event Handling: WebSockets: For real-time communication for live lessons or chats. 
== Backend Hosting: - AWS (Amazon Web Services) or Google Cloud: For hosting the backend services and managing the database. - Heroku: A simpler option for quick deployments with easy management of databases and environments. Summary: - Frontend: Kotlin, Android Jetpack, Material Design, ExoPlayer, Glide. - Backend: Node.js/Express or Python/Django, PostgreSQL/MongoDB, OAuth2, JWT. - Sync: Realm, WorkManager. - Payments: Stripe, PayPal SDK. - Notifications: Firebase Cloud Messaging. - AI: TensorFlow Lite (mobile) or Python (backend). - Tests: JUnit, Espresso. This will give you a high-performance, modern architecture for your application. ]) #prompt([ Generate a piece of code that displays a selection screen between the different user types with Kotlin and Jetpack Compose. No fluff, just the code ], [ ```kotlin class MainActivity : ComponentActivity() { override fun onCreate(savedInstanceState: Bundle?) { super.onCreate(savedInstanceState) setContent { UserSelectionScreen(onUserSelected = { userType -> // Handle user selection (Parent, Student, etc.) 
}) } } } @Composable fun UserSelectionScreen(onUserSelected: (String) -> Unit) { Column( modifier = Modifier .fillMaxSize() .padding(16.dp), verticalArrangement = Arrangement.Center, horizontalAlignment = Alignment.CenterHorizontally ) { Text(text = "Choisissez votre profil", fontSize = 24.sp, color = Color.Black) Button( modifier = Modifier.padding(top = 16.dp), onClick = { onUserSelected("Parent") } ) { Text(text = "Parent") } Button( modifier = Modifier.padding(top = 16.dp), onClick = { onUserSelected("Élève") } ) { Text(text = "Élève") } } } @Preview(showBackground = true) @Composable fun PreviewUserSelectionScreen() { UserSelectionScreen(onUserSelected = {}) } ``` ]) #prompt([ Perfect. What would the next step be, in your opinion? List 2-3 fairly short options so I can choose ], [ Here are 3 possible next steps after the user-type selection screen: 1. Build the registration screens: Develop the registration screens for each user type (Parent, Student) with the required fields (name, email, password, etc.) and form validation. 2. Implement the navigation logic: Use Jetpack Navigation to handle navigation between the different screens (selection, registration, login, etc.). 3. Connect to the backend: Set up the network calls (via Retrofit, for example) to send the registration/login data to the server. Choose according to your project's priorities! ]) #prompt([ Let's do 1: after registration, we will display the user-type selection. No fluff ], [ ```kotlin class MainActivity : ComponentActivity() { override fun onCreate(savedInstanceState: Bundle?) 
{ super.onCreate(savedInstanceState) setContent { RegistrationScreen() } } } @Composable fun RegistrationScreen() { var name by remember { mutableStateOf("") } var email by remember { mutableStateOf("") } var password by remember { mutableStateOf("") } var isRegistered by remember { mutableStateOf(false) } if (isRegistered) { UserSelectionScreen(onUserSelected = { userType -> // Handle user type after registration }) } else { Column( modifier = Modifier .fillMaxSize() .padding(16.dp), verticalArrangement = Arrangement.Center, horizontalAlignment = Alignment.CenterHorizontally ) { Text(text = "Inscription", fontSize = 24.sp, color = Color.Black) BasicTextField( value = name, onValueChange = { name = it }, modifier = Modifier.padding(top = 16.dp), decorationBox = { innerTextField -> if (name.isEmpty()) Text("Nom") else innerTextField() } ) BasicTextField( value = email, onValueChange = { email = it }, modifier = Modifier.padding(top = 16.dp), decorationBox = { innerTextField -> if (email.isEmpty()) Text("Email") else innerTextField() } ) BasicTextField( value = password, onValueChange = { password = it }, modifier = Modifier.padding(top = 16.dp), decorationBox = { innerTextField -> if (password.isEmpty()) Text("Mot de passe") else innerTextField() } ) Button( modifier = Modifier.padding(top = 16.dp), onClick = { isRegistered = true } ) { Text(text = "S'inscrire") } } } } @Composable fun UserSelectionScreen(onUserSelected: (String) -> Unit) { Column( modifier = Modifier .fillMaxSize() .padding(16.dp), verticalArrangement = Arrangement.Center, horizontalAlignment = Alignment.CenterHorizontally ) { Text(text = "Choisissez votre profil", fontSize = 24.sp, color = Color.Black) Button( modifier = Modifier.padding(top = 16.dp), onClick = { onUserSelected("Parent") } ) { Text(text = "Parent") } Button( modifier = Modifier.padding(top = 16.dp), onClick = { onUserSelected("Élève") } ) { Text(text = "Élève") } } } @Preview(showBackground = true) @Composable fun 
PreviewRegistrationScreen() { RegistrationScreen() } ``` ]) #prompt([ ```toml [versions] agp = "8.6.0" kotlin = "1.9.0" coreKtx = "1.10.1" junit = "4.13.2" junitVersion = "1.1.5" espressoCore = "3.5.1" appcompat = "1.6.1" material = "1.10.0" [libraries] androidx-core-ktx = { group = "androidx.core", name = "core-ktx", version.ref = "coreKtx" } junit = { group = "junit", name = "junit", version.ref = "junit" } androidx-junit = { group = "androidx.test.ext", name = "junit", version.ref = "junitVersion" } androidx-espresso-core = { group = "androidx.test.espresso", name = "espresso-core", version.ref = "espressoCore" } androidx-appcompat = { group = "androidx.appcompat", name = "appcompat", version.ref = "appcompat" } material = { group = "com.google.android.material", name = "material", version.ref = "material" } [plugins] android-application = { id = "com.android.application", version.ref = "agp" } kotlin-android = { id = "org.jetbrains.kotlin.android", version.ref = "kotlin" } ``` Add the dependencies for navigation ], [ ```toml [libraries] androidx-navigation-fragment-ktx = { group = "androidx.navigation", name = "navigation-fragment-ktx", version = "2.7.3" } androidx-navigation-ui-ktx = { group = "androidx.navigation", name = "navigation-ui-ktx", version = "2.7.3" } ``` ]) #prompt([ Good, now I'm missing Compose; add it. No chit-chat.
], [ ```toml [versions] compose = "1.5.0" [libraries] androidx-compose-ui = { group = "androidx.compose.ui", name = "ui", version.ref = "compose" } androidx-compose-material = { group = "androidx.compose.material", name = "material", version.ref = "compose" } androidx-compose-ui-tooling = { group = "androidx.compose.ui", name = "ui-tooling", version.ref = "compose" } androidx-compose-runtime = { group = "androidx.compose.runtime", name = "runtime", version.ref = "compose" } androidx-compose-foundation = { group = "androidx.compose.foundation", name = "foundation", version.ref = "compose" } androidx-compose-navigation = { group = "androidx.navigation", name = "navigation-compose", version = "2.7.3" } ``` ]) #prompt([ Perfect, now create data structures in Kotlin to represent the data from the requirements. No chit-chat ], [ ```kotlin data class Parent( val id: String, val name: String, val email: String, val password: String, val children: List<Student> ) data class Student( val id: String, val name: String, val relationshipToParent: String, val gradeLevel: String, val schoolYear: List<String> ) data class Course( val id: String, val title: String, val description: String, val format: CourseFormat, val year: String ) enum class CourseFormat { DOCUMENTS_ONLY, TEACHER_SUPPORT } data class PaymentOption( val id: String, val type: PaymentType, val details: String ) enum class PaymentType { CREDIT_CARD, DIRECT_DEBIT } data class User( val id: String, val userType: UserType, val login: String, val password: String ) enum class UserType { PARENT, STUDENT } data class ActivityLog( val studentId: String, val activityType: ActivityType, val timestamp: Long ) enum class ActivityType { COURSE_VIEW, EXERCISE_COMPLETION, LOGIN } data class Recommendation( val studentId: String, val advice: String, val timestamp: Long ) ``` ]) #prompt([ Perfect, for now use default values instead of working with a database; you'll also
add buttons to the selection page while I implement authentication. Make a page that lists a student's courses. No chit-chat, only the component code. Don't forget the sample data set passed as a parameter to the component in the preview ], [ ```kotlin @Composable fun StudentCoursesScreen(courses: List<Course>) { Column( modifier = Modifier .fillMaxSize() .padding(16.dp), verticalArrangement = Arrangement.Top, horizontalAlignment = Alignment.Start ) { Text(text = "Cours de l'Étudiant", fontSize = 24.sp) Spacer(modifier = Modifier.height(16.dp)) for (course in courses) { CourseItem(course = course) Spacer(modifier = Modifier.height(8.dp)) } } } @Composable fun CourseItem(course: Course) { Column(modifier = Modifier.padding(8.dp)) { Text(text = course.title, fontSize = 20.sp) Text(text = course.description) Text(text = "Format: ${course.format}", fontSize = 12.sp) Text(text = "Année: ${course.year}", fontSize = 12.sp) } } @Preview(showBackground = true) @Composable fun PreviewStudentCoursesScreen() { val sampleCourses = listOf( Course("1", "Mathématiques", "Cours sur les nombres et les opérations.", CourseFormat.DOCUMENTS_ONLY, "2024"), Course("2", "Physique", "Introduction à la physique moderne.", CourseFormat.TEACHER_SUPPORT, "2024"), Course("3", "Chimie", "Principes de base de la chimie.", CourseFormat.DOCUMENTS_ONLY, "2024") ) StudentCoursesScreen(courses = sampleCourses) } ``` ]) #prompt([ Any ideas for other pages we can already create? List 2-3 ], [ Here are a few page ideas you could create: 1. Course Detail Page: shows the details of a specific course, including the content, associated exercises, and additional resources. 2. Progress Tracking Page: shows an overview of the student's progress, with charts and statistics on the courses taken and their performance. 3.
Student Profile Page: lets the student view and edit their personal information, such as name, grade level, and communication preferences. These pages can help enrich the user experience before the full implementation of authentication and real-time data. ]) #prompt([ Create the course detail page with a sample data set in the preview. No chit-chat, code only ], [ ```kotlin // CourseDetail was not defined with the earlier data classes; added here so the component is self-contained. data class CourseDetail( val id: String, val title: String, val description: String, val content: List<String>, val exercises: List<String>, val format: CourseFormat, val year: String ) @Composable fun CourseDetailScreen(courseDetail: CourseDetail) { Column( modifier = Modifier .fillMaxSize() .padding(16.dp), verticalArrangement = Arrangement.Top, horizontalAlignment = Alignment.Start ) { Text(text = courseDetail.title, fontSize = 24.sp) Spacer(modifier = Modifier.height(8.dp)) Text(text = "Description:", fontSize = 18.sp) Text(text = courseDetail.description) Spacer(modifier = Modifier.height(16.dp)) Text(text = "Contenu:", fontSize = 18.sp) courseDetail.content.forEach { item -> Text(text = "- $item") } Spacer(modifier = Modifier.height(16.dp)) Text(text = "Exercices:", fontSize = 18.sp) courseDetail.exercises.forEach { exercise -> Text(text = "- $exercise") } Spacer(modifier = Modifier.height(16.dp)) Text(text = "Format: ${courseDetail.format}", fontSize = 12.sp) Text(text = "Année: ${courseDetail.year}", fontSize = 12.sp) } } @Preview(showBackground = true) @Composable fun PreviewCourseDetailScreen() { val sampleCourseDetail = CourseDetail( id = "1", title = "Mathématiques", description = "Cours sur les nombres et les opérations.", content = listOf("Nombres entiers", "Nombres rationnels", "Opérations de base"), exercises = listOf("Exercice 1", "Exercice 2", "Exercice 3"), format = CourseFormat.DOCUMENTS_ONLY, year = "2024" ) CourseDetailScreen(courseDetail = sampleCourseDetail) } ``` ])
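The conversation above stubs out the screens with hardcoded sample data while authentication and the backend are still pending. As a minimal, hedged sketch of the "default values instead of a database" step (the name `InMemoryCourseRepository` is my own, not from the chat), the data layer behind `StudentCoursesScreen` could look like this in plain Kotlin:

```kotlin
// Hypothetical in-memory data source standing in for the real backend
// until the Retrofit layer exists. Mirrors the Course/CourseFormat
// shapes defined earlier in the conversation.
enum class CourseFormat { DOCUMENTS_ONLY, TEACHER_SUPPORT }

data class Course(
    val id: String,
    val title: String,
    val description: String,
    val format: CourseFormat,
    val year: String
)

class InMemoryCourseRepository {
    private val courses = listOf(
        Course("1", "Mathématiques", "Cours sur les nombres et les opérations.", CourseFormat.DOCUMENTS_ONLY, "2024"),
        Course("2", "Physique", "Introduction à la physique moderne.", CourseFormat.TEACHER_SUPPORT, "2024"),
        Course("3", "Chimie", "Principes de base de la chimie.", CourseFormat.DOCUMENTS_ONLY, "2023")
    )

    // Replace these with network calls once the backend exists.
    fun coursesForYear(year: String): List<Course> = courses.filter { it.year == year }

    fun findCourse(id: String): Course? = courses.firstOrNull { it.id == id }
}
```

A composable would then consume these values, e.g. passing `repo.coursesForYear("2024")` into `StudentCoursesScreen`, so swapping in the real backend later only touches the repository.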
https://github.com/Wallbreaker5th/fuzzy-cnoi-statement
https://raw.githubusercontent.com/Wallbreaker5th/fuzzy-cnoi-statement/main/template/main.typ
typst
MIT No Attribution
#import "@preview/fuzzy-cnoi-statement:0.1.1": *; // 以下有大量 Fancy 的设置项。你可以根据需求,注释掉你不需要的内容,或者将你需要的内容取消注释。 #let prob-list = ( ( name: "圆格染色", // 题目名 name-en: "color", // 英文题目名,同时也是目录 time-limit: "1.0 秒", // 每个测试点时限(会显示为 $1.0$ 秒) memory-limit: "512 MiB", // 内存限制(显示同上) test-case-count: "10", // 测试点数目(显示同上) test-case-equal: "是", // 测试点是否等分 year: "2023", // 你可以添加自定义的属性 ), // 实际上必需的项只有 name-en // 下面是一些花活 ( name: "桂花树", name-en: "tree", type: "交互型", // 题目类型 time-limit: "0.5 秒", memory-limit: "512 MiB", test-case-count: "10", test-case-equal: "是", year: "2023", // submit-file-name 为提交文件名,可以为 str、content 或一个 extension:str=>content 的函数 // submit-file-name: e => { // show raw: set text(size: 0.8em) // raw("tree." + e) // } // 将其字体变小 ), ( name: "深搜", name-en: "dfs", type: "提交答案型", executable: [无], input: [`dfs`$1~10$`.in`], // 输入文件名 output: [`dfs`$1~10$`.out`], // 输出文件名 test-case-count: "10", test-case-equal: "是", submit-file-name: [`dfs`$1~10$`.out`], ), ) #let contest-info = ( name: "全国中老年信息学奥林匹克竞赛", name-en: "FCC ION 3202", // 似乎也可以当成副标题用 round: "第一试", time: "时间:2023 年 7 月 24 日 08:00 ~ 13:00", // author: "<PASSWORD>" ) #let (init, title, problem-table, next-problem, filename, current-filename, current-sample-filename, data-constraints-table-args) = document-class( contest-info, prob-list, // custom-fonts 参数可以更改字体,字典各项的键值为:mono、serif、cjk-serif、cjk-sans、cjk-mono、cjk-italic,值为字体名。你可以只传入部分项。默认分别为: // - Consolas // - New Computer Modern // - 方正书宋(FZShuSong-Z01S) // - 方正黑体(FZHei-B01S) // - 方正仿宋(FZFangSong-Z02S) // - 方正楷体(FZKai-Z03S) // header 参数可以自定义页眉,你需要传入一个 (contest-info, current-problem) => content 的函数。 // footer 参数可以自定义页脚,你需要传入一个 (contest-info, current-problem) => content 的函数。 ) #show: init #title() #problem-table( // 默认会显示以下行(括号内为默认值,打星号的在没有题目有这一项时不会显示): // - name 题目名称(无) // - type 题目类型(传统型) // - name-en 目录 // - executable 可执行文件名(默认为目录名) // - input 输入文件名(默认为目录名 + ".in") // - output 输出文件名(默认为目录名 + ".out") // - * time-limit 每个测试点时限(无) // - * memory-limit 内存限制(无) // - * 
test-case-count 测试点数目(10) // - * subtask-count 子任务数目(1) // - * test-case-equal 测试点是否等分(是) // // 一般来说,默认显示的行的设置是够用的。但如果你希望增加更多行,或者当你有太多题目需要展示,你可能需要 extra-rows 参数来添加新的行。 // extra-rows 也可以覆盖掉默认的行。其在题目过多、需要将特定行的字体变小时尤其有用。 extra-rows:( year: ( // 对应的 field 名 name: "年份", // 显示的名字;可以用 content(调整字号等) wrap: text, // 显示的样式:若题目的这一项是 str,则显示为 wrap(str),否则会直接显示这一项。默认为 text。 always-display: false, // 是否总是显示:若为 false,则至少要有一个题目有这一项才会显示。默认为 false。 default: "2023" // 默认值,默认为“无”。你也可以传入一个函数,其接受一个参数,为当前题目的信息,返回一个 str 或 content。 ), contest: ( name: "赛事", wrap: text.with(fill: blue), // 你也可以在这里设置更小的字体 always-display: true, default: "NOI" ), setter: ( // name: text(size: 0.8em)[出题人], // 也许你需要更小的字号 name: "出题人", always-display: true, default: p => { p.name-en + "的出题人" } ), foo: ( name: "bar", wrap: text, default: "这一行不会显示" ) ), // 提交源程序文件名的列表,每一种语言为 (语言名, 文件后缀名) 的二元组,或 (语言名, 文件后缀名, 首列字体大小) 的三元组。若 problem 没有指明 submit-file-name,则用题目英文名与后缀名拼接。 languages:( ("C++", "cpp"), ("D++", "dpp"), // ("D++", "dpp", 0.8em), // 更小的字号 ), // 各个语言的编译选项,每一种语言为 (语言名, 编译选项) 的二元组,或 (语言名, 编译选项, 首列字体大小) 的三元组。 compile-options: ( ("C++", "-O2 -std=c++20 -DOFFLINE_JUDGE"), ) ) *注意事项(请仔细阅读)* + 文件名(程序名和输入输出文件名)必须使用英文小写。 + C++ 中函数 main() 的返回值类型必须是 int,程序正常结束时的返回值必须是 0。 + 因违反以上两点而出现的错误或问题,申诉时一律不予受理。 + 若无特殊说明,结果的比较方式为全文比较(过滤行末空格及文末回车)。 + 选手提交的程序源文件必须不大于 100KB。 + 程序可使用的栈空间内存限制与题目的内存限制一致。 + 只提供 Linux 格式附加样例文件。 + 禁止在源代码中改变编译器参数(如使用 \#pragma 命令),禁止使用系统结构相关指令(如内联汇编)和其他可能造成不公平的方法。 #next-problem() == 题目描述 输入两个正整数 $a, b$,输出它们的和。 你可以*强调一段带 $f+or+mu+l+a$ 的文本*。用 `#underline` 加 ``` `` ``` 来实现 #underline[`underlined raw text`]。 + 第一点 + 第二点 - 第一点 - 列表可以嵌套 - 但目前,有序列表和无序列表的互相嵌套会有缩进上的问题。 - 第二点 - 第二点的第一点 - 第二点的第二点 == 输入格式 从文件 #current-filename("in") 中读入数据。// 自动获取当前题目的输入文件名 输入的第一行包含两个正整数 $a, b$,表示需要求和的两个数。 == 输出格式 输出到文件 #filename[color.out] 中。 输出一行一个整数,表示 $a+b$。 == 样例1输入 // 从文件中读取样例 #raw(read("color1.in"), block: true) == 样例1输出 // 或者直接写在文档中 ```text 13 ``` == 样例1解释 #figure(caption: "凹包")[#image("fig.png", width: 40%)]<aobao> 如@aobao,这是一个凹包。 
#for (i,case) in range(2, 8).zip(( $1 tilde 5$, $6 tilde 9$, $10 tilde 13$, $14 tilde 17$, $18 tilde 19$, $20$)){[ == 样例#{i+2} 见选手目录下的 #current-sample-filename(i, "in") 与 #current-sample-filename(i, "ans")。 这个样例满足测试点 #case 的条件限制。 ]} == 数据范围 对于所有测试数据保证:$1 <= a,b <= 10^9$。 #figure( table( columns: 4, ..data-constraints-table-args, // 默认的针对数据范围的三线表样式 table.header( [测试点编号], $n,m <=$, $q<=$, [特殊性质], ), $1 tilde 5$, $300$, $300$, table.cell(rowspan:2)[无], $6 tilde 9$, table.cell(rowspan:4)[$10^5$], $2000$, $10 tilde 13$, table.cell(rowspan:4)[$10^5$], [A], $14 tilde 17$, [B], $18 tilde 19$, table.cell(rowspan:2)[无], $20$, $10^9$, ) ) 特殊性质 A: 你可以像上面这样创建复杂的表格。 #next-problem() *这是一道交互题。* == 题目描述 #lorem(50) == 实现细节 请确保你的程序开头有 `#include "tree.h"`。 ```cpp int query(int x, int y); void answer(std::vector<int> ans); ``` ```bash g++ count.cpp -c -O2 -std=c++14 -lm && g++ count.o grader.o -o count ``` 我能吞下玻璃而不伤身体。 - 赵钱孙李周吴郑王,冯陈楮卫蒋沈韩: - 杨朱秦尤许何吕施张孔。 - 曹严华金魏陶姜戚谢。 - 邹喻柏水窦章云苏潘葛奚。 - 范彭郎鲁韦昌马苗凤花,方俞任袁柳酆鲍史唐费廉岑薛雷。 - 贺倪汤滕殷罗毕郝邬安常乐于时傅皮卞齐康伍余,元卜顾孟平黄和穆萧尹姚邵湛汪,祁毛禹狄米贝明臧计伏成戴。 == 评分标准 #lorem(50) #lorem(100) == 数据范围 #lorem(50) #next-problem() == 题目描述 #lorem(50) == 输入格式 从文件 #filename[dfs$1~10$.in] 中读入数据。 #lorem(50) == 输出格式 输出到文件 #filename[dfs$1~10$.out] 中。 #lorem(50) == 数据范围 对于所有测试数据保证:$1 <= n <= 10^5$。
https://github.com/protohaven/printed_materials
https://raw.githubusercontent.com/protohaven/printed_materials/main/common-tools/woodshop_dust_collection.typ
typst
#import "/meta-environments/env-features.typ": * = Woodshop Dust Collection Dust collection is an important part of any shop. Dust is a health hazard: dust is an irritant, and specific sizes of dust particles (PM2.5) can lodge in the lungs and accumulate there, leading to long-term and possibly severe health issues. Dust is also hard on metal tools, and can encourage surface rust and pitting. == Notes Run the dust collection system when using any connected power tool. Open the appropriate blast gates to ensure airflow. Keep unused blast gates closed to maximize airflow where it is needed. The dust collection system also has three floor inlets: one next to the miter saw, one behind the resaw bandsaw, and one next to the drill press. These are useful for collecting floor sweepings. Empty the dust collection bin when it becomes two-thirds full (at the seam). Running the dust collection system with an over-capacity bin will reduce system performance. == Parts of the Dust Collection System === Control panel The control panel holds the power switches for the dust collection system. #figure( image("images/dust_collection-controls-annotated.png", width: 100%), caption: [ A close-up of the dust collection control panel. ], ) === System Power Switch The system power switch turns power on and off to the entire dust collection system, including the control panel. #warning("DO NOT USE THIS SWITCH.") This switch should remain on at all times during open shop hours. === Blower Switch The blower switch turns on the blowers that pull air through the system, collecting dust. === Dust Collection Bin The dust collection bin holds all of the sawdust and light chips pulled in by the dust collection system. The lid is held in place by clamps. == Basic Operation === Turning On The Dust Collection Press the blower switch while the dust collection system is off. === Turning Off The Dust Collection Press the blower switch while the dust collection system is on.
== Maintenance If the dust collection bin becomes full: + Stop the dust collection blowers with the blower switch. + Unclamp the lid of the dust collection bin. + Remove waste from the bin.\ _Scoop out waste into a wood shop trash can, and then roll the trash can out to the dumpster and tip it in. If you need help lifting the bin up to the mouth of the dumpster, please ask!_ + Replace the dust collection bin lid, and secure the clamps. \ _Ensure the gasket is attached._ + If the wood shop is in use and dust collection is still needed, turn the blowers back on.
https://github.com/lyzynec/hys-or-kys
https://raw.githubusercontent.com/lyzynec/hys-or-kys/main/04/main.typ
typst
#import "../lib.typ": * #knowledge[ #question(name: [Give a definition of a hybrid automaton. In particular, describe all the key components.])[] #question(name: [Explain the differences in roles played in a hybrid automaton by the location (or mode) invariants and guard sets.])[] ] #skills[ #question(name: [Find a model of a simple dynamical system in the form of a hybrid automaton.])[] #question(name: [Implement a simple hybrid automaton in Stateflow.])[] ]
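The questions above ask for the definition of a hybrid automaton and the difference between location invariants and guard sets. As an illustrative sketch only (plain Kotlin rather than Stateflow, with made-up thresholds), a two-mode thermostat shows the roles: each location has its own continuous dynamics and an invariant on the temperature, while the guard sets decide when a discrete switch fires:

```kotlin
// Illustrative two-mode thermostat as a hybrid automaton (not Stateflow).
// HEATING and COOLING are the locations; the temperature band each mode may
// occupy is its invariant, and the guards (temp >= 22, temp <= 18) trigger
// the discrete transitions between the modes.
enum class Mode { HEATING, COOLING }

data class State(val mode: Mode, val temp: Double)

// One step of the hybrid dynamics: continuous flow, then a guard check.
fun step(s: State, dt: Double): State {
    val dTemp = when (s.mode) {
        Mode.HEATING -> 2.0   // heater raises the temperature
        Mode.COOLING -> -1.5  // cooling lowers the temperature
    }
    val t = s.temp + dTemp * dt
    return when {
        s.mode == Mode.HEATING && t >= 22.0 -> State(Mode.COOLING, t) // guard: too hot
        s.mode == Mode.COOLING && t <= 18.0 -> State(Mode.HEATING, t) // guard: too cold
        else -> State(s.mode, t) // invariant still holds, stay in the mode
    }
}

fun simulate(start: State, steps: Int, dt: Double = 0.1): State {
    var s = start
    repeat(steps) { s = step(s, dt) }
    return s
}
```

The same structure (modes, flows, invariants, guards, resets) is what a Stateflow chart would encode graphically.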
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/math/op_03.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test custom operator. $ op("myop", limits: #false)_(x:=1) x \ op("myop", limits: #true)_(x:=1) x $
https://github.com/Han-duoduo/mathPater-typest-template
https://raw.githubusercontent.com/Han-duoduo/mathPater-typest-template/main/main_text.typ
typst
Apache License 2.0
#import "template/template.typ":* #import "template/template.typ":template as T-MathorCup #show: T-MathorCup.with( abstract: [ 物理小区标识符@wx1(Physical Cell Identifier,PCI) 作为一种独特的身份标签,对各覆盖小区编号进行合理配置。尽管整个系统仅配备有限的 $1008$ 个PCI资源,其分配任务却涵盖广泛且数量众多的小区部署,这一供需矛盾催生了不可避免的PCI复用现象。不恰当的PCI分配策略不仅可能导致通信服务质量显著下降,表现为诸如信号劣化等现象,还可能引发资源利用率的低下,形成潜在的网络效能浪费@wx2。鉴于此,对PCI进行科学严谨的规划至关重要,旨在确保无线网络性能的最优化。本文在PCI规划时将从PCI冲突、PCI混淆以及PCI模3干扰这三种场景考虑。每一问的PCI分配结果见问题结果表。 针对问题一,给 $2067$ 个优化小区分配PCI,需要考虑这2067个小区之间的冲突、混淆、干扰情况,则需要先进行数据处理。得出冲突矩阵A、混淆矩阵B、干扰矩阵C。重新分配PCI时,本文通过建立0-1整数规划模型,利用蒙特卡洛模拟求解最优值,得到的最小MR总数为 $50899984$ 。 针对问题二,在依次考虑冲突、混淆以及干扰的优先级的情况下,重新给2067个优化小区分配PCI,仍然使用第一问中的冲突矩阵A、混淆矩阵B、干扰矩阵C。重新分配PCI时,本文通过建立分层序列的多目标规划模型,首先在冲突MR数中找到分水岭,缩小考虑范围,再通过混淆MR数进一步缩小范围,再筛选干扰MR数最小的情况所对应的PCI分配情况,此时的干扰MR数最小为 $54186656$ 。 针对问题三,给 $2857$ 个优化小区及其相关联小区分配PCI,需要考虑这 $2857$ 个小区之间的冲突、混淆、干扰情况。得出冲突矩阵D、混淆矩阵E、干扰矩阵F。重新分配PCI时,本文通过建立0-1整数规划模型,利用蒙特卡洛模拟求解最优值,得到的最小MR总数为 $57079782$ 。 针对问题四,在依次考虑冲突、混淆以及干扰的优先级的情况下,重新给 $2857$ 个优化小区和相关联小区分配PCI,仍然使用第三问中的冲突矩阵D、混淆矩阵E、干扰矩阵F。重新分配PCI时,本文通过建立分层序列的多目标规划模型,首先在冲突MR数中找到分水岭,缩小考虑范围,再通过混淆MR数进一步缩小范围,再筛选干扰MR数最小的情况所对应的PCI分配情况,此时的冲突MR数为 $41415$ ,混淆MR数为 $1024166$ ,干扰MR数为 $58765303$ ,其MR数总和为 $59830884$ 。 ], title: "基于MR数据的PCI规划问题", keywords: ("PCI规划", "整数规划", "蒙特卡洛模拟", "分层序列法"), ) /*正文区*/ //导入各章节 #include "chapter/chap1.typ" #include "chapter/chap2.typ" #include "chapter/chap3.typ" #include "chapter/chap4.typ" #include "chapter/chap5.typ" #include "chapter/chap6.typ" //参考文献 = 参考文献 #bibliography("reference/lib.bib",title:none,) //附录 #include "chapter/appendix.typ"
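The abstract above repeatedly refers to a 0-1 integer programming model over the conflict, confusion, and interference matrices $A$, $B$, $C$. As a hedged sketch only (the exact formulation lives in the chapters, which are not shown here), a generic form of such a PCI-assignment model, with $x_{ip}=1$ iff cell $i$ receives PCI $p$, is:

```latex
% Illustrative sketch, not the paper's exact formulation.
% A_{ij}, B_{ij}, C_{ij}: pairwise conflict / confusion / interference MR counts.
\min_{x}\ \sum_{i<j}\ \sum_{p=0}^{1007}\sum_{q=0}^{1007}
  \Bigl( (A_{ij}+B_{ij})\,\mathbf{1}[p=q]
       + C_{ij}\,\mathbf{1}[p\equiv q \ (\mathrm{mod}\ 3)] \Bigr)\, x_{ip}\, x_{jq}
\quad \text{s.t.}\quad \sum_{p=0}^{1007} x_{ip}=1 \ \ \forall i,
\qquad x_{ip}\in\{0,1\}.
```

The lexicographic (分层序列) variant described for problems two and four would instead minimize the conflict term first, then the confusion term, then the interference term, each subject to the optima of the previous levels.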
https://github.com/npikall/typst-templates
https://raw.githubusercontent.com/npikall/typst-templates/main/README.md
markdown
# Typst Templates This repository contains templates for my personal typst projects. The templates are inspired by LaTeX Templates. As I am fairly new to ``typst``, my templates will be very basic, but they fit my needs. I will try to keep them as simple as possible, but also as flexible as possible. As of now it is not my intention to have them available in the `typst universe`, because I want the templates to be easily customizable. ## Compilation To compile the templates, use the following command if it is not done automatically by your editor: ```bash typst compile dev.typ <output_name> ``` ## Usage of the Templates Have the template (e.g. `config.typ`) in the same directory as your project and include it in your main file with the following command: ```typst #import "config.typ": conf, maketitle ``` This will load the `conf` and the `maketitle` function from the template. You can now use them in your project. Use the `conf` function to set the general style of your document. This function might have some additional parameters, which you can find in the template file. ```typst #show: doc => conf(doc) ``` After that, use the `maketitle` function to create the title section or page of your document. This function might have some additional parameters, which you can find in the template file as well. ```typst #maketitle( title: [The title of your document], author: [The author's name], date: [The date], ) ```
https://github.com/sthenic/technogram
https://raw.githubusercontent.com/sthenic/technogram/main/README.md
markdown
MIT License
[![TYPST](https://img.shields.io/badge/Typst-0.12.0-orange.svg?style=flat-square)](https://typst.app) [![LICENSE](https://img.shields.io/badge/license-MIT-blue.svg?style=flat-square)](https://opensource.org/licenses/MIT) # Technogram Technogram is a collection of [Typst](https://typst.app) packages and document classes that together form a typesetting framework for technical documentation. This project was originally written in LuaLaTeX and is a work in progress. ## Installation Until the project is stable, manual installation is required (see below). Importing is done with ```typst #import "@local/technogram:0.1.0" as tg ``` to import all symbols into the `tg` namespace. Refer to the [Typst package repository](https://github.com/typst/packages/?tab=readme-ov-file#local-packages) for more information. ### Linux Make a symbolic link to the repository in the local package directory ln -s /path/to/repository $HOME/.local/share/typst/packages/local/technogram/0.1.0 ### Windows Copy the repository to %APPDATA%/typst/packages/local/technogram/0.1.0 ## License This application is free software released under the [MIT license](https://opensource.org/licenses/MIT).
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/accelerated-jacow/0.1.0/template/paper.typ
typst
Apache License 2.0
/* * Example of typst paper to be published with JACoW * * Based on the JACoW guide for preparation of papers. * See https://jacow.org/ for more information. * */ #import "@preview/accelerated-jacow:0.1.0": jacow, eqnum #show: paper => jacow( // Paper title title: [ Accelerated JAC#lower[o]W paper template\ for conference proceedings ], // Author list authors: ( (name: "<NAME>", affiliation: "uni-a", email: "<EMAIL>"), (name: "<NAME>", affiliation: ("company-b", "third")), (name: "<NAME>", affiliation: "company-b"), (name: "<NAME>", affiliation: ("uni-a", "company-b", "third")), ), affiliations: ( uni-a: "Town University, City, Country", company-b: "Paper Company, City, Country", third: "The Third Institute, City, Country", ), // Funding note (optional, comment out if not applicable) funding: "Work supported by ...", // Paper abstract abstract: [ This document shows how to use the accelerated-jacow paper template to compose papers for JACoW conference proceedings. #lorem(20) ], paper, ) // Other useful packages //#import "@preview/quick-maths:0.1.0" //#import "@preview/physica:0.9.3" #import "@preview/unify:0.6.0": unit,num,qty,numrange,qtyrange = Introduction Typst @typst is a great, modern and blazingly fast typesetting system focussed on scientific papers. This paper template is based on the JACoW paper preparation guide @jacowguide. It showcases common elements, like an equation: $ e^("i" pi) + 1 = 0 $ This equation has a number for it to be referenced #eqnum($ sum_(k=0)^n k &= 1 + 2 + ... + n \ &= (n(n+1)) / 2 $) <eq:thesum> To reference elements, use @eq:thesum, @fig:writer or @table:xydata. Make sure to use the long form at the #underline[beginning] of a sentence. @fig:writer[Figure] for example. @eq:thesum[Equation] as well. @table:xydata is a table and therefore anyways always written out. Scientific quantities can be typeset correctly with the unify package. 
Examples are quantities like #qty(1.2, "um") with a reduced spacing, $q=#num(0.12345678)$ with digit grouping, $f_"rev"=qty("325.2+-0.1", "kHz")$ with an uncertainty or $h=qty("8.3+0.1-0.2 e-2", "mm")$ with an asymmetric tolerance. Plain units can be written as #unit("tesla meter") or #unit("T m") (but #unit("Tm") is something different). More examples: #qty(3, "keV"), #qty(4, "GeV"), $qty("100", "kW")$, #qty(7, "um") Here is a list with some markup examples for common document elements: - This is a "list" - With _emphasize_ and *strong _emphasize_* - And usage of a `monospace font` - Last #highlight[but not least] something #highlight(fill: white, stroke: orange, extent: 0.1em, [fancy]) 💥 = Headings Headings, subsection headings and even third-level headings follow the JACoW style guide and are automatically transformed to all-caps case and word-caps case respectively. Custom upper/lower case can be forced if required. #let nacl = [#upper[N]#lower[a]#upper[C]#lower[l]] == Subsection heading: #nacl, #upper[N]#lower[a]Cl, $"NaCl"$ #lorem(30) === Third-level headings #lorem(30) = Floating figures #figure( image("writer.svg"), placement: bottom, // top, bottom or auto caption: [Scientific writing (AI artwork).], ) <fig:writer> #lorem(50) #figure( box(fill: silver, width: 100%, height: 1cm), placement: bottom, // top, bottom or auto caption: [ A gray rectangle with a relative width of 100%, an absolute height of 1cm and a two line caption. 
], ) <fig:rect> #figure( table( columns: 4, [x], [1], [2], [3], [y], [0.3], [0.4], [0.8], ), placement: auto, // top, bottom or auto caption: [Values for x and y], ) <table:xydata> #figure( table( stroke: none, columns: 4, align: (x, y) => {if x==0 {left} else {center}}, table.hline(), table.header[][*x*][*y*][*z*], table.hline(), [First], [1], [0.3], [14], [Second], [2], [0.4], table.cell(rowspan: 2, align: horizon, [9¾]), [Third], [3], [0.8], table.hline(), ), placement: auto, // top, bottom or auto caption: [ Values for x and y as in @table:xydata with additional values for z and a different table style ] ) #lorem(50) #figure( box(fill: silver, width: 100%, height: 3cm), scope: "parent", // two column-figure placement: top, // top, bottom or auto caption: [A column spanning figure.], ) <fig:rect> #lorem(300) = Conclusions #lorem(50) = Acknowledgements #lorem(50) #bibliography("references.bib")
https://github.com/AxiomOfChoices/Typst
https://raw.githubusercontent.com/AxiomOfChoices/Typst/master/Research/Winter%202024/Masters%20Thesis/Masters%20Thesis.typ
typst
#import "/Templates/monograph.typ": style, frontpage, outline_style, chapter_headings, start_outline #import "@preview/ctheorems:1.1.0": * #import "/Templates/math.typ": * #import "@preview/cetz:0.2.1" #import "/Templates/i-figure.typ" #show: style #show math.equation: i-figure.show-equation.with( level: 1, only-labeled: true, ) #show ref: i-figure.show-equation-refrences.with( level: 1, ) // #show: equation_numbering // #show: equation_references #show: doc => frontpage( toptitle: [Conformally Induced\ Mean Curvature Flow], name: [<NAME> #linebreak() Master of Science ], middletitle: [ #image("./thumbnail_HIGH RES RED.png", width: 50%) Mathematics and Statistics \ McGill University \ Montreal, Quebec, Canada], bottomtitle: [ A thesis submitted to McGill University in partial\ fulfillment of the requirements of the degree of a\ Master of Science ], info: [#sym.copyright <NAME>, #datetime.today().display("[year]")], doc) #show: thmrules #show: outline_style #set page(margin: (x: 1in, top: 1in, bottom: 1in)) #pagebreak(weak:true) #show: start_outline #pagebreak(weak:true) = Abstract This thesis aims to loosely cover the subject of geometric flows, and more specifically a variant of the mean curvature flow. The subject of geometric flows uses results from PDE theory, more specifically parabolic PDEs, to gain new insight about Riemannian geometry. One of the biggest problems often tackled by mean curvature flows is the Isoperimetric problem. The Isoperimetric problem asks us to classify the spaces that minimize perimeter for a given volume (hence the name); the first use of mean curvature flows to attack this problem was due to Huisken in 1984 @huisken_flow_1984, who proved it in the case of boundaries of convex domains in Euclidean space, and since then there have been attempts to push his methods further.
More recently, in 2013, Guan and Li @guanMeanCurvatureType2013 constructed a new normalized flow which allows them to prove the inequality in the case of boundaries of star-shaped domains in Space forms. Shortly after, in 2018, Guan, Li and Wang pushed this flow even further which allowed them to prove the result in a certain class of warped product spaces @guanVolumePreservingFlow2018. Then, concurrently with the writing of this thesis, Li and Pan reframed the technique in terms of conformal vector fields, allowing them to weaken the assumptions on the ambient space @jiayuIsoperimetricProblemRiemannian2023. This thesis continues the effort to use such flows, together with my collaborator <NAME> we pushed the flow even further, allowing us to weaken the assumptions on the ambient space even further and even weaken the star-shapedness assumption, which was key to all previous results. #pagebreak(weak:true) = Abrégé Cette thèse vise à couvrir de manière générale le sujet des flots géométriques, et plus spécifiquement une variante du flot de courbure moyenne. Le domaine des flots géométriques utilise des résultats de la théorie des équations aux dérivées partielles (EDP), plus précisément des EDP paraboliques, pour obtenir de nouvelles perspectives sur la géométrie riemannienne. L'un des plus grands défis souvent abordés par les flots de courbures moyennes est le problème isopérimétrique. Ce problème demande de classer les espaces qui minimisent le périmètre pour un volume donné (d'où le nom). La première utilisation des flots de courbures moyennes pour attaquer ce problème remonte à Huisken en 1984 @huisken_flow_1984, qui l'a prouvé dans le cas des frontières de domaines convexes dans l'espace euclidien. Depuis lors, des tentatives ont été faites pour pousser ses méthodes plus loin. 
Plus récemment, en 2013, Guan et Li @guanMeanCurvatureType2013 ont construit un nouveau flot normalisé qui leur a permis de prouver l'inégalité dans le cas de frontières de domaines en forme d'étoile dans les formes spatiales. Peu de temps après, en 2018, Guan, Li et Wang ont poussé ce flot encore plus loin, ce qui leur a permis de prouver le résultat dans une certaine classe d'espaces de produits déformés @guanVolumePreservingFlow2018. Puis, simultanément à la rédaction de cette thèse, Li et Pan ont reformulé la technique en termes de champs vectoriels conformes, ce qui leur a permis de relâcher les hypothèses sur l'espace ambiant @jiayuIsoperimetricProblemRiemannian2023. Cette thèse poursuit l'effort d'utiliser de tels écoulements, avec mon collaborateur <NAME>, nous avons poussé le flot encore plus loin, nous permettant de relâcher encore davantage les hypothèses sur l'espace ambiant et même affaiblir l'hypothèse de domaines en forme d'étoile, qui était cruciale pour tous les résultats précédents. #pagebreak(weak:true) = Acknowledgements First I would like to thank my supervisor, <NAME>, for his regular assistance both before and after the start of my Masters program. He was always there to help me understand concepts I would otherwise spend weeks grappling with. Secondly, I would like to thank my collaborator, Doctor <NAME>, without whom this project would have never happened. Bouncing ideas off one another was one of the best parts of my Masters. I would also like to thank my good friends <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME> as well as the whole of the graduate community at McGill, for making my last year at McGill the best year of my life. Finally I would like to thank my family, and especially my mom, for always being there to help and support me whenever I would feel down and lost. 
#pagebreak(weak:true)

= Contribution

This thesis, and each chapter within it, was written solely by myself, with occasional assistance from colleagues regarding phrasing. The body of the thesis is largely taken from the paper co-authored by myself and <NAME>.

#show: chapter_headings
#pagebreak(weak:true)

= Introduction

== Background on the Isoperimetric Problem

The Isoperimetric Problem asks a seemingly simple question:\ Among all regions occupying a given volume, which has the least perimeter? In the simple case of 2 and 3 dimensions we all know that the answer is a circle and a sphere respectively, and when we extend it to $n$ dimensions in Euclidean space the answer remains the $n-1$ dimensional hypersphere; for a variety of proofs in this simple case see @burago_geometric_1988. However, if we try to generalize this question to a more general ambient domain, the problem very quickly becomes intractable. One of the ways to answer such a question is with the method of geometric flows: we start with a shape which is likely not optimal, and then morph it over time to make it optimal.

The thesis is split into three chapters. \ The first chapter establishes all the standard theory of Geometric Analysis used in the results of this thesis. \ The second chapter discusses the method of geometric flows to solve the Isoperimetric problem, as well as the geometric setting introduced by Li and Pan in @jiayuIsoperimetricProblemRiemannian2023. \ The third chapter introduces a more general setting and extends the results to that setting too, first by computing the evolution equations for various geometric quantities and then by proving existence of the flow. This will all lead to the proof of the following theorem.

#theorem[ Let $N$ be an ambient manifold admitting a conformal vector field $X$ and a foliation $cal(F)$, such that both are compatible (see @head-setting) and satisfy @assum-required.
Then the leaves $S_alpha$ of the foliation $cal(F)$ are the Isoperimetric profile of the class of all hypersurfaces satisfying $ip(X,nu) > 0$. ]<thrm-conclusion>

== Concepts and Notation

This document assumes general knowledge of differential and Riemannian geometry; see @lee_introduction_2012 and @leeIntroductionRiemannianManifolds2018, respectively, for great introductions. For the rest of this document we will use the following notation: $N$ is an $n+1$ dimensional Riemannian manifold with metric $ov(g)$ within which we have a compact domain $Omega$ with boundary $diff Omega = M$ such that $F : M -> N$ is an embedding making $M$ a Riemannian hypersurface. We then set $g := F^* ov(g)$ to be the induced metric on $M$. We will in general identify $M$ with its image $F(M)$ and use the two interchangeably. We will write $frak(X)(N)$ for the set of surfaces that can be defined as above and call any such surface an _admissible_ hypersurface.

In general, tensorial constructions defined on $N$ will be written with an overline and their versions on $M$ will be written normally. We will write the covariant derivatives on $M$ and $N$ as $nabla$ and $ov(nabla)$ respectively. We will write the Laplacians on $M$ and $N$ as $Delta$ and $ov(Delta)$ respectively. We will use Einstein summation notation for all tensor equations. Often for a matrix $M_(i j)$ we will use the notation $M_(i j) >= #h(0em) (>) med med 0$ to denote the fact that $M_(i j)$ is positive semi-definite (definite), and similarly for $M_(i j) <= #h(0em) (<) med med 0$.

We can use the Riemannian metric $ov(g)$ to take inner products of tangent vectors in the same tangent space $T_p N$; for tangent vectors $X,Y in T_p N$ we will write this as $ip(X,Y)$. Since the metric $g$ is just the restriction of $ov(g)$ onto $T_p M$ when we think of it as a subspace of $T_p N$, we will use the same notation $ip(X,Y)$ for $X,Y in T_p M$.
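The induced metric $g = F^* ov(g)$ can be computed very concretely in examples. As a quick symbolic sketch (Python with SymPy; the sphere parametrization is chosen purely for illustration and is not part of the thesis argument), pulling the Euclidean metric of $RR^3$ back through an embedding of the unit sphere recovers the round metric:

```python
# Sketch: compute the induced metric g = F* gbar for the unit sphere
# S^2 in R^3, parametrised by F(theta, phi).  The components are
# g_ij = <d_i F, d_j F>, which come out to diag(1, sin(theta)^2).
import sympy as sp

theta, phi = sp.symbols('theta phi', real=True)
F = sp.Matrix([sp.sin(theta) * sp.cos(phi),
               sp.sin(theta) * sp.sin(phi),
               sp.cos(theta)])

coords = [theta, phi]
g = sp.Matrix(2, 2, lambda i, j:
              sp.simplify(F.diff(coords[i]).dot(F.diff(coords[j]))))

assert g == sp.diag(1, sp.sin(theta)**2)
```

The same recipe, $g_(i j) = ip(diff_i F, diff_j F)$, computes the induced metric of any admissible hypersurface from a parametrization.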
The Riemannian metric $ov(g)$ defines with it a Riemannian volume form which we will call $dif V$; this form can be restricted to $Omega$ to allow us to define $ Volume(Omega) = integral_Omega dif V, $ and we will often write $V(M)$, as $Omega$ is determined uniquely by an orientation on $M$. Similarly, the metric $g$ defines a volume form on $M$ which we will call $dif S$, using which we define $ Area(M) = integral_M dif S. $ We will often write $A(M)$ for brevity.

We can now define the Isoperimetric profile of $N$ to be the function $ I(v) := inf { A(M) : M in frak(X)(N) "and" V(M) = v}. $ If for some family of surfaces $S_alpha$ we have $I(V(S_alpha)) = A(S_alpha)$ then we will also refer to ${ S_alpha }$ as the Isoperimetric profile; the meaning will be clear from context. The Isoperimetric Problem now asks us to

+ Show there exists a family of hypersurfaces $S_alpha in frak(X)(N)$ which is the Isoperimetric profile.
+ Characterize this family.

In practice the above problem is extremely difficult, so we often restrict ourselves to a subclass of surfaces. For a subclass $cal(Z) seq frak(X)(N)$ the Isoperimetric profile of $cal(Z)$ is the function $ I(v) := inf { A(M) : M in cal(Z) "and" V(M) = v } $ and the same questions can be asked in this case. We will now start to build up the concepts that allow us to solve this problem.

== Riemannian Geometry

Recall that since $M$ is the boundary of a manifold it must be orientable; it thus has a canonical outward-pointing unit normal vector field, which we will call $nu$. Working with Riemannian geometry is almost always easier when done with coordinates. In extrinsic geometry, there are two coordinate systems that we will be using repeatedly, so we will list some of their properties.

#proposition[ Let $M$ be a Riemannian manifold, at any point $p$ there exists a chart $(U,phi)$ with the property that the frame $ e_i = diff/(diff x^i) $ forms an orthonormal basis _at the point $p$_.
These are called _orthonormal coordinates_. ]

// #proposition[
// Let $M$ and admissible submanifold of $N$, at any point $p in M$ there exists a chart $(U,phi)$ of $N$ such that
// $
// e_i = diff/(diff x^i)
// $
// forms an orthonormal basis for $T_p M$ for $i <= n$ _at $p$_ and $e_(n+1) = nu$ _on $U$_. These are called _Fermi coordinates_. In general, when we are dealing with Fermi coordinates and write $e_i$ we are excluding $e_(n+1)$ and refer to it as $nu$ instead.
// ]<prop-fermi_coords>
// #proof[
// The proof for the first proposition is standard and can be found in any differential geometry textbook, for the second proposition see @leeIntroductionRiemannianManifolds2018(p.~183).
// ]

Then we define the second fundamental form $h$ to be the bilinear form given by $ h(X,Y) = ip(X, ov(nabla)_Y nu). $ This second fundamental form encodes how the manifold $M$ lies inside $N$; it also carries with it a number of useful properties, the most important of which is that it is symmetric, see @leeIntroductionRiemannianManifolds2018[p.~227] for details. Also of much importance is the trace of this form, taken with respect to the metric, which we write as $H = h_(i i)$ and call the _mean curvature_. Its eigenvalues $kappa_i$, taken with respect to the metric, are called the _principal curvatures_. Let us record some properties of the second fundamental form.

#proposition[ Let $e_i$ be an orthonormal frame at $p$, the following are true: #block(width: 100%, inset: (x: 2em))[
+ $h$ can be written in coordinates as $h_(i j) = ip(e_i, ov(nabla)_(e_j) nu)$. <prop-h_coords>
+ $ov(nabla)_i nu = h_(i j) e_j$. <prop-h_applied>
+ $ov(nabla)_i e_j = - h_(i j) nu$. <prop-h_neg>
+ If $f$ is a function $N -> RR$, then $ov(Delta) f = Delta f + ov(Hess)_f (nu,nu) + H nu(f)$.
<prop-h_laplace>
] ]<prop-h_props>

#proof[ #link(<prop-h_coords>)[(a)] follows directly from the definition. To see #link(<prop-h_applied>)[(b)], note that ${ e_1, ..., e_n } union { nu }$ forms a basis for the tangent space $T_p N$ and thus we have $ ov(nabla)_i nu = a^j e_j + b nu $ for some coefficients $a^j,b in RR$. But now consider $ 0 = ov(nabla)_i ip(nu,nu) = 2 ip(ov(nabla)_i nu, nu) = 2 b $ and so we have $b = 0$. We then get $ a^j = ip(ov(nabla)_i nu, e_j) = h_(i j) $ proving the claim.

Now for #link(<prop-h_neg>)[(c)] we note first that $nabla_X Y = pi( ov(nabla)_X Y)$ for $X,Y in T_p M$, where $pi$ is the orthogonal projection to $T_p M$, see, for instance, @jost_riemannian_2011[p.~223]. This means that, taking $e_i$ to come from normal coordinates at $p$, $ pi(ov(nabla)_i e_j) = nabla_i e_j = 0 $ for all $i,j$, and so $ov(nabla)_i e_j = b_(i j) nu$ for some matrix $b$ of coefficients. Now we have $ 0 = ov(nabla)_i ip(e_j, nu) = ip(ov(nabla)_i e_j, nu) + ip(e_j, ov(nabla)_i nu) = b_(i j) + h_(i j) $ which proves the claim.

Finally for #link(<prop-h_laplace>)[(d)], we have $ ov(Delta) f & = ip(ov(nabla)_i ov(nabla) f, e_i) + ip(ov(nabla)_nu ov(nabla) f, nu) = ov(nabla)_i ip(ov(nabla) f, e_i) - ip(ov(nabla) f, ov(nabla)_i e_i) + ip(ov(nabla)_nu ov(nabla) f, nu) \ & = nabla_i ip(nabla f, e_i) - ip(ov(nabla) f, -H nu) + ip(ov(nabla)_nu ov(nabla) f, nu) \ & = Delta f + H nu(f) + ov(Hess)_f (nu, nu) $ ]

#remark[We will also use the notation $A(X)$ to mean the endomorphism satisfying $ h(X,Y) = ip(A(X), Y) $ which is given in coordinates by $A^i_j = g^(i k) h_(k j)$ and we will also use the notation $|A|^2$ to denote the squared norm of $h$ or $A$.
]

We will also need another well-known geometric identity.

#lemma("Codazzi Equation")[ We have for any $X,Y,Z in T_p M$ $ ov(Rm)(X,Y,Z,nu) = - (nabla_X h)(Y,Z) + (nabla_Y h)(X,Z) $ in particular in coordinates we have $ ov(Rm)_(i j k nu) = - nabla_i h_(j k) + nabla_j h_(i k) $ ]<lemma-codazzi>

#proof[ See @petersen_riemannian_2016[p.~93], note that some references have a similar equation of opposite sign due to a difference in defining the second fundamental form. ]

== Conformal Vector Fields

A conformal vector field is a vector field $X$ with the property that $lie_X ov(g) = 2 phi ov(g)$, where $phi$ is a smooth function called the _conformal factor_ of $X$. The Lie derivative is a little hard to work with for our purposes, so we will follow the calculations of @deshmukh_geometry_2017 to obtain better formulations of the properties of conformal vector fields.

#proposition[ Let $X$ be a vector field on $N$, then for any $Y,Z in T_p N$ we have $ 2 ip(ov(nabla)_Y X, Z) = (lie_X ov(g))(Y,Z) + dif eta (Y,Z) $ where $eta$ is the one-form dual to $X$, defined by $eta(Z) = ip(X,Z)$. ]

#proof[ We have by Koszul's formula (@leeIntroductionRiemannianManifolds2018[p.~123]) $ 2 ip(ov(nabla)_Y X, Z) =& Y(ip(X,Z)) + X(ip(Y,Z)) - Z(ip(X,Y)) \ &- ip([X,Y],Z) - ip([Y,Z],X) - ip([X,Z],Y) \ =& lie_X (ip(Y,Z)) + lie_Y (eta(Z)) - lie_Z (eta(Y)) \ &- ip(Z,lie_X Y) - eta(lie_Y Z) - ip(Y, lie_X Z) \ =& (lie_X ov(g)) (Y,Z) + lie_Y (eta(Z)) - lie_Z (eta(Y)) - eta(lie_Y Z) \ =& (lie_X ov(g)) (Y,Z) + (lie_Y eta)(Z) - lie_Z (eta(Y)) $ Now we can use Cartan's magic formula (@lee_introduction_2012[p.~372]) applied to the second term to get $ 2 ip(ov(nabla)_Y X, Z) &= (lie_X ov(g)) (Y,Z) + dif eta(Y,Z) + dif (eta(Y))(Z) - lie_Z (eta(Y)) \ & = (lie_X ov(g)) (Y,Z) + dif eta(Y,Z) $ ]

From this we see that an equivalent characterization of a conformal vector field is $2 ip(ov(nabla)_Y X, Z) = 2 phi ip(Y,Z) + dif eta(Y,Z)$. We now define the skew-symmetric endomorphism $psi$ by $ dif eta(Y,Z) = 2 ip(psi Y,Z) $
This endomorphism is then called the _associated tensor field_ of $X$, and with it we can rewrite the above equation as $ ip(ov(nabla)_Y X, Z) = phi ip(Y,Z) + ip(psi Y, Z). $ Note that this is also the decomposition of $ov(nabla) X$ into its symmetric and anti-symmetric parts, that is $ Sym(ov(nabla) X) = phi ov(g) quad "and" quad AntiSym(ov(nabla) X) = ip(psi(dot), dot) $ In the special case that $phi = 0$ we call $X$ a Killing vector field.

#definition[ Given a fixed vector field $X$ on $N$ we define the support function $u_X$ on $M$ by $ u_X := ip(X,nu) $ where $nu$ is the normal vector to $M$. ]<def-support>

To see why conformal vector fields are so useful in the study of the Isoperimetric inequality, we will now derive a key result that was the basis of the results of <NAME> Wang and will also be the basis of the results in this thesis.

#lemma("Conformal Integral Identities")[ Let $M seq N$ be an admissible hypersurface as defined above, and let $X$ be a conformal vector field on $N$ with conformal factor $phi$, the following identities hold $ integral_M (n phi - H u) dif S = 0 $ and $ integral_M H (n phi - H u) dif S = n/(n-1) integral_M ov(Ric)(nu, X - u nu) dif S - 1/(n-1) integral_M u sum_(i < j) (kappa_i - kappa_j)^2 dif S $ where $u = ip(X, nu)$ is the support function. ]<lemma-integral_identities>

#proof[ First we will define the vector field $Y = X - u nu$, which is the projection of $X$ onto the tangent space of $M$. Now consider the divergence of $Y$ on $M$; for an orthonormal frame $e_i$ of $M$ we have $ div_M (Y) = tr(nabla_j ip(Y, e_i)) = tr(ov(nabla)_j ip(X, e_i)) = tr(ip(ov(nabla)_j X, e_i) + ip(X, ov(nabla)_j e_i)).
$ Now we combine the fact that $X$ is a conformal vector field and the fact that the trace of an endomorphism is the same as the trace of its symmetrization, giving us that $ tr(ip(ov(nabla)_j X, e_i)) = tr((ov(nabla) X)_(j i)) = tr(Sym(ov(nabla) X)_(i j)) = tr(phi ov(g)_(i j)) $ Now knowing that in our coordinates $ov(g)_(i j) = delta_(i j)$ and that the trace of $delta_(i j)$ is $dim M$ we get $ div_M (Y) = tr(phi ov(g) + ip(X, -h_(i j) nu)) = n phi - H u $ then since $M$ is closed we have by the divergence theorem $ integral_M div_M (Y) dif S = integral_M (n phi - H u) dif S = 0 $

Secondly, we will consider the vector field $Y' = (H I - A) Y$, whose divergence gives us $ div_M Y' & = tr(nabla_j (H I - A)^i_k Y^k) = tr(nabla_j ((H I - A)^i_k ip(X, e_k))) \ &= tr(ip(X, e_k) nabla_j (H I - A)^i_k + (H I - A)^i_k ov(nabla)_j ip(X, e_k)) \ &= tr(ip(X, e_k) (nabla_j h_(ell ell) delta^i_k - nabla_j h_(i k)) + (H I - A)^i_k (phi delta_(j k) + ip(psi e_j, e_k) - u h_(j k))) $ where the $psi$-term drops out of the trace, since $(H I - A)$ is symmetric while $psi$ is skew-symmetric. Taking the trace then gives $ div_M Y' = ip(X, e_k) (nabla_k h_(ell ell) - nabla_ell h_(ell k)) + H((n-1)phi - H u) + u|A|^2 $ We now use the #link(<lemma-codazzi>)[Codazzi equation] to get $ div_M Y' & = ip(X, e_k) ov(Rm)_(ell k ell nu) + H((n-1)phi - H u) + u|A|^2 \ &= ip(ov(Rm)(e_ell, Y) e_ell, nu) + H((n-1)phi - H u) + u|A|^2 $ now swapping the last two indices of the Riemann tensor flips its sign and yields the Ricci tensor; applying the divergence theorem once more gives us $ & integral_M H ((n - 1)phi - H u) dif S = integral_M ov(Ric)(nu, Y) dif S - integral_M u|A|^2 dif S \ &(n-1)/n integral_M H (n phi - n/(n-1) H u) dif S = integral_M ov(Ric)(nu, Y) dif S - integral_M u|A|^2 dif S \ &integral_M H (n phi - n/(n-1) H u) dif S = n/(n-1) integral_M ov(Ric)(nu, Y) dif S - integral_M n/(n-1) u|A|^2 dif S \ &integral_M H (n phi - H u) dif S = n/(n-1) integral_M ov(Ric)(nu, Y) dif S - integral_M u(n|A|^2 - H^2)/(n - 1) dif S \ &integral_M H (n phi - H u) dif S = 1/(n-1)
(integral_M n ov(Ric)(nu, Y) dif S - integral_M u sum_(i < j) (kappa_i - kappa_j)^2 dif S) $ ]

== Partial Differential Equations <sect-PDE>

The Partial Differential Equations (PDEs) we will be dealing with most in this thesis are parabolic PDEs, so we will dedicate this section to going over their properties. Let $T in (0,infinity]$ and let $U seq M$ be a smooth open domain. A function $u : [0,T] times U -> RR$ is said to solve a *quasi-linear parabolic PDE* if it satisfies a differential equation of the form $ diff_t u (x,t) = a^(i j)(x,t,u,nabla u) nabla_i nabla_j u + G(x,t,u,nabla u) $<parabolic_pde> where $a$ is a symmetric positive-definite matrix depending smoothly on its inputs and $G$ is a function depending smoothly on its inputs. The equation is said to be in _divergence form_ if it can instead be written as $ diff_t u (x,t) = nabla_i (b^(i j)(x,t,u,nabla u) nabla_j u) + G(x,t,u,nabla u) $<div_parabolic_pde> We say that the PDE in @eqn-parabolic_pde is _uniformly parabolic_ if there exist constants $A,B$ such that $ A|v|^2 <= a^(i j) v_i v_j <= B|v|^2 $ for all $v in T_p M$, at every point and time. The most important tool in the analysis of parabolic PDEs is the maximum principle, a form of which we will now prove.

#proposition[ Assume $u$ solves @eqn-parabolic_pde and that at a spatial maximum of $u$ the inequality $G(x,t,u,nabla u) < f(t)$ holds, then we have for all $t in [0,T]$ $ sup_(x in U) u(x,t) <= sup_(x in U) u(x,0) + integral_0^t f(s) dif s $<max_priniciple_linear> if instead we have $G(x,t,u,nabla u) < B u(x,t)$ for some constant $B in RR$ then we have $ sup_(x in U) u(x,t) <= (sup_(x in U) u(x,0)) e^(B t) $<max_priniciple_exp> ]<prop-max_principle>

#proof[ First for @eqn-max_priniciple_linear consider the auxiliary function $ v(x,t) = u(x,t) - integral_0^t f(s) dif s - sup_(x in U) u(x,0) $ which then solves $ diff_t v(x,t) = a^(i j)(x,t,v,nabla v) nabla_i nabla_j v + G(x,t, v, nabla v) - f(t) $ and also $v(x,0) <= 0$ on $U$.
Now assume that @eqn-max_priniciple_linear fails to hold, that is, at some point $(y,t)$, we have $ u(y,t) > sup_(x in U) u(x,0) + integral_0^t f(s) dif s $ then we also have $ v(y, t) > 0 $ and so the maximum of $v$ is positive. But now let $(z,t')$ be said maximum; the maximum is either in the interior of $[0,T] times U$ or on the boundary ${T} times U$, since it cannot be on ${0} times U$, where $v(x, 0) <= 0$. Thus we have that $ nabla v(z,t') = 0, nabla_i nabla_j v(z,t') <= 0 "as well as" diff_t v(z,t') >= 0 $ and so $ 0 <= diff_t v(z,t') = a^(i j)(z,t',v,nabla v) nabla_i nabla_j v + G(z,t', v, nabla v) - f(t') < 0 $ which is a contradiction.

For @eqn-max_priniciple_exp we use an identical argument, except that we instead use $ v(x,t) = e^(-B t) u(x,t) - sup_(x in U) u(x,0) $ ]

The second most important tool is short-time existence. It will be extremely important, as we want to use the derivatives of geometric quantities to characterize them, so we need the flow to exist for some non-zero amount of time.

#theorem[ If $u(0,dot)$ is a smooth initial condition and @eqn-parabolic_pde is uniformly parabolic then @eqn-parabolic_pde has a solution $u$ for some time $T > 0$ which is smooth on $[0,T)$. Furthermore, if there is an a priori uniform bound $ ||u(t,dot)||_(C^(1+r)) <= K "for all" t in [0,s) $ for some constants $r > 0$, $K > 0$, then the solution exists on $[0,s)$ and satisfies a bound $ ||u(t, dot)||_(C^(2+r)) <= B(K) $ where $B$ is some constant depending on $K$. ]<thrm-pde_short_time>

#proof[ See Proposition 8.2 in @taylor_partial_2023[p.~411] for the first statement, and Theorem 4.28 in @lieberman_second_1996[p.~77] for the second statement.
]

The last PDE results which we will need are the famous Nash-Moser estimates; for full details see @taylor_partial_2023 and @ladyzenskajaLinearQuasilinearEquations1968.

#theorem("Nash-Moser estimates")[ Let $u$ be a solution to the uniformly parabolic @eqn-parabolic_pde on $[0,T)$ with smooth initial condition, if we know that $ ||u(t,dot)||_(C^0(U)) < c_1 "and" ||nabla u(t, dot)||_(C^0(U)) <= c_2 "on" [0,T) $ then on any subdomain $U'$ with $ov(U') seq U$ we have for some $r > 0$ depending only on $c_1,c_2,A,B$ that $ ||u(t,dot)||_(C^(1+r)(U')) <= C(c_1,c_2,A,B,d) $ where $d$ is the distance between $diff U'$ and $diff U$. ]<thrm-nash_moser>

#proof[ We will use Theorem 1.1 in @ladyzenskajaLinearQuasilinearEquations1968[p.~517], it is enough to show that the functions $ diff_(x^k) a^(i j)(x,t,v,p), diff_v a^(i j)(x,t,v,p), diff_(p^k) a^(i j)(x,t,v,p), G(x,t,v,p) $ are uniformly bounded on the set $ { (x,t,v,p) : x in ov(U), |v| <= c_1 "and" ||p|| <= c_2}. $ But this is immediate: these functions depend smoothly on their inputs and are thus continuous, and since the set above is compact they attain their maximum on it and are therefore bounded by that maximum. ]

== Evolving Hypersurfaces

Now that we are familiar with geometry and PDEs we can start to use them together. This is done by means of *geometric flows*.

#definition[ Let $F : M -> N$ be an admissible hypersurface. Let $F_t$ be a map $I times M -> N$, where $I = [0,T)$ for some fixed $T$ and $F_0 = F$ on $M$. $F_t$ is called a _normal flow_ with _normal velocity_ $f$ if $ diff_t F_t (x) = f(t,x) nu(x) $ where $nu(x)$ is the normal vector to $F_t (M)$ at $F_t (x)$. ]

#remark[ We will often refer to $F_t (M)$ as $M_t$ for brevity. Additionally, many constructions on $M_t$ will be denoted without explicit reference to $t$, e.g. $g$ instead of $g(t)$, even though the metric of $M_t$ depends on $t$. Keep in mind that any construction built from the metric will also depend on $t$.
]

As a manifold flows, its various properties, both local and global, will change; the equations governing these changes are called _evolution equations_. For ambient objects, i.e. those objects that are simply restricted to the hypersurface, this evolution is simple.

#proposition[ Let $T$ be any tensor on $N$, then we write $T|_(M_t)$ to denote the orthogonal projection of $T$ onto $T_p M_t$. We then have along the flow $M_t$ $ diff_t (T|_(M_t)) = (f ov(nabla)_nu T)|_(M_t) $ ]<prop-ambient_evolution>

Objects that depend on the induced metric on $M_t$ depend on the embedding of a whole neighborhood of a point, so their evolution equations are more complicated, but we can still compute them. We will first start with the most important evolving tensor, the metric.

#remark[ We will also adopt two important coordinate systems. We will be working in normal coordinates around a point $p in M$, which we will call $x^i$; we will denote their partial derivatives $diff_i$ or $e_i$ and the covariant derivatives with respect to the induced metric $nabla_i$. Secondly, we will also have normal coordinates at $F(p) in N$, which we will call $y^i$, with partials $diff_(y_i)$ or $ov(e)_i$ and covariant derivatives $ov(nabla)_i$.

Note that we can rotate the normal coordinates $y^i$ so that they align with $x^i$, in the sense that _at the point $p$_ $ diff_i F = ov(e)_i, forall i <= n quad "and" quad nu = ov(e)_(n+1) $ Since we are working in normal coordinates, note that the Christoffel symbols $Gamma$ and $ov(Gamma)$ both vanish at $p$, but their derivatives might not, so we have to be very careful when working with these expressions.
]

#proposition[ The evolution equation for the metric is $ diff_t g = 2 f h $ ]<prop-metric_evoluion>

#proof[ We work in the coordinates of the preceding remark. Recall that we define the metric as the restriction of the ambient metric like so $ g_(i j) = ip(diff_i F, diff_j F), $ and thus we can differentiate in the ambient space to get an expression for the time derivative of the restriction $ diff_t g_(i j) &= diff_t ip(diff_i F, diff_j F) = ip(diff_t diff_i F, diff_j F) + ip(diff_i F, diff_t diff_j F) \ &= ip(diff_i (f nu), e_j) + ip(e_i, diff_j (f nu)) \ &= ip(ov(nabla)_i (f nu), e_j) + ip(e_i, ov(nabla)_j (f nu)) quad "because Christoffel symbols vanish" \ &= ip(f ov(nabla)_i nu + nu ov(nabla)_i f, e_j) + ip(e_i, f ov(nabla)_j nu + nu ov(nabla)_j f) \ &= f ip(ov(nabla)_i nu, e_j) + f ip(e_i, ov(nabla)_j nu) wide "by orthogonality" \ &= f ip(h_(k i) e_k, e_j) + f ip(e_i, h_(k j) e_k) = f h_(j i) + f h_(i j) = 2 f h_(i j) $ where we used @prop-h_props in the final step. ]

Now that we know how the metric evolves, there are some immediate consequences that we can show.

#proposition[ The evolution equations for $nu$ and $dif S$ are $ diff_t nu = - nabla f quad "and" quad diff_t dif S = f H dif S $ respectively. ]

#proof[ First note that $diff_t ip(nu,nu) = 0$, so $diff_t nu$ is tangential and we have in our coordinates $ diff_t nu = ip(diff_t nu, e_j) e_j $ then we also have that for any $j$ $ 0 = diff_t ip(nu, e_j) = ip(diff_t nu, e_j) + ip(nu, diff_t e_j) $ and so $ diff_t nu &= - ip(nu, diff_t e_j) e_j = - ip(nu, diff_t diff_j F) e_j = - ip(nu, diff_j (f nu)) e_j = - ip(nu, f diff_j nu + nu diff_j f) e_j \ &= - ip(nu, f h_(i j) e_i + nu nabla_j f) e_j wide "apply orthogonality of" nu "and" e_i \ &= - ip(nu, nu nabla_j f) e_j = - nabla_j f e_j = - nabla f. $ Note that on line 2 we also implicitly used the fact that the Christoffel symbols vanish in normal coordinates around the point $p$.

For the volume form, we know that $dif S = sqrt(det(g)) dif x_1 ...
dif x_n$ and so we can compute $ diff_t (dif S) = diff_t (sqrt(det(g))) dif x_1 ... dif x_n. $ Now recall Jacobi's formula: for a parametrized invertible matrix $A(t)$ we have $ diff_t det(A(t)) = det(A(t)) tr(A(t)^(-1) diff_t A(t)) $ and so $ diff_t (sqrt(det(g))) dif x_1 ... dif x_n &= 1/(2sqrt(det(g))) diff_t (det(g)) dif x_1 ... dif x_n \ &= sqrt(det(g))/2 tr(g^(i k) diff_t g_(k j)) dif x_1 ... dif x_n \ &= sqrt(det(g)) tr(g^(i k) f h_(k j)) dif x_1 ... dif x_n \ &= f H dif S $ ]

Now that we have evolution equations for some local properties, we can extend those to evolution equations for global quantities.

#proposition[ We have the following evolution equations for $V(M_t)$ and $A(M_t)$, $ diff_t V(M_t) = integral_(M_t) f dif S, quad diff_t A(M_t) = integral_(M_t) H f dif S $ ]<prop-vol_area_variation>

#proof[ First for the volume, extend the vector field $f nu$ to a global vector field $Y$ on $N$. Now by classic geometry theorems @lee_introduction_2012[p.~425] we get that the change in volume for a domain evolving under a global vector field is $ diff_t V(M_t) = integral_Omega div Y dif V. $ Now by the divergence theorem we get that $ integral_Omega div Y dif V = integral_(M_t) ip(Y, nu) dif S, $ but we know that $Y = f nu$ along $M_t$ so $ diff_t V(M_t) = integral_(M_t) f dif S. $ For the area, we get $ diff_t A(M_t) = diff_t integral_(M_t) dif S = integral_(M_t) diff_t dif S = integral_(M_t) f H dif S. $ ]

We have one final evolution equation to find, and that is the one for the second fundamental form $h_(i j)$.

#proposition[ We have the following evolution equation for $h_(i j)$ $ diff_t h_(i j) = - nabla_i nabla_j f + f (h_(i ell) g^(ell k) h_(k j) - ov(R)^(nu)_(nu i j)) $ ]<prop-h_evolution>

#proof[ Recall that $h_(i j) = ip(e_i, ov(nabla)_j nu) = ip(diff_i F, ov(nabla)_j nu)$. Then since $nu$ is orthogonal to all $diff_i F$'s we get $ 0 = ov(nabla)_j ip(diff_i F, nu) = ip(diff_i F, ov(nabla)_j nu) + ip(ov(nabla)_j diff_i F, nu) $ and thus $ ip(ov(nabla)_j diff_i F, nu) = - h_(i j).
$ With this in mind we can compute $ - diff_t h_(i j) &= diff_t ip(ov(nabla)_j diff_i F, nu) = ip(diff_t ov(nabla)_j diff_i F, nu) + ip(diff_j diff_i F, diff_t nu) \ &= ip(diff_t (diff_j diff_i F + ov(Gamma)_(rho sigma)^k diff_i F^rho diff_j F^sigma ov(e)_k), nu) + ip(diff_j diff_i F, diff_t nu) \ &= ip(diff_j diff_i (f nu) + (diff_t ov(Gamma)_(rho sigma)^k) (diff_i F^rho diff_j F^sigma) ov(e)_k + ov(Gamma)_(rho sigma)^k diff_t (diff_i F^rho diff_j F^sigma ov(e)_k), nu) \ &- med ip(diff_j diff_i F, nabla f) $ Now the Christoffel symbols vanish at $p$, so any term in which they appear without a derivative falling on them vanishes there; we hence get $ -diff_t h_(i j) &= ip(diff_j (nu diff_i f + f diff_i nu) + (f diff_nu ov(Gamma)_(rho sigma)^k) (diff_i F^rho diff_j F^sigma) ov(e)_k, nu) - ip(diff_j diff_i F, nabla f) \ &= ip((diff_j nu) diff_i f + (diff_j f) diff_i nu + nu (diff_j diff_i f) + f diff_j (diff_i nu) + (f diff_nu ov(Gamma)_(rho sigma)^nu) (diff_i F^rho diff_j F^sigma) nu, nu) \ &- ip(- h_(i j) nu, nabla f) $ Now using the fact that $nu$ is orthogonal to any derivation of $nu$ (since it is a unit vector), the expression above simplifies to $ - diff_t h_(i j) &= diff_i diff_j f + f ip(diff_j diff_i nu, nu) + f (diff_nu ov(Gamma)^nu_(rho sigma)) diff_i F^rho diff_j F^sigma \ &= diff_i diff_j f + f ip(diff_j (ov(nabla)_i nu - ov(Gamma)_(rho sigma)^k diff_i F^rho nu^sigma ov(e)_k), nu) + f (diff_nu ov(Gamma)^nu_(rho sigma)) diff_i F^rho diff_j F^sigma \ &= diff_i diff_j f + f ip(diff_j (h_(i k) diff_k F - ov(Gamma)_(rho sigma)^k diff_i F^rho nu^sigma ov(e)_k), nu) + f (diff_nu ov(Gamma)^nu_(rho sigma)) diff_i F^rho diff_j F^sigma \ &= diff_i diff_j f + f (diff_nu ov(Gamma)^nu_(rho sigma)) diff_i F^rho diff_j F^sigma \ &+ med f ip(h_(i k) diff_j diff_k F + (diff_j h_(i k)) (diff_k F) - (diff_j ov(Gamma)_(rho sigma)^k) diff_i F^rho nu^sigma ov(e)_k - ov(Gamma)_(rho sigma)^k diff_j (diff_i F^rho nu^sigma ov(e)_k), nu) $ but now again the Christoffel symbols vanish and since $nu$ is
orthogonal to all tangent vectors, the second and fourth terms in the inner product vanish, and so we are left with $ f ip(h_(i k) diff_j diff_k F - (diff_j ov(Gamma)_(rho sigma)^k) diff_i F^rho nu^sigma ov(e)_k, nu) = - f h_(i k) h_(j k) - f (diff_j ov(Gamma)_(rho sigma)^nu) diff_i F^rho nu^sigma. $ Now recall that in normal coordinates the Riemann tensor is given at $p$ by $ ov(R)_(i j k)^ell = diff_i ov(Gamma)_(j k)^ell - diff_j ov(Gamma)_(i k)^ell $ and so we get $ - diff_t h_(i j) &= diff_i diff_j f + f (diff_nu ov(Gamma)^nu_(rho sigma)) diff_i F^rho diff_j F^sigma - f (diff_j ov(Gamma)_(rho sigma)^nu) diff_i F^rho nu^sigma - f h_(i k) h_(j k) \ &= diff_i diff_j f - f h_(i k) h_(j k) + f ov(R)^nu_(nu i j) $ Then, since we are in normal coordinates, we know that $diff_i diff_j f = nabla_i nabla_j f$ at $p$, and since the middle term is not tensorial we make it tensorial by contracting with the metric, which gives the desired result. ]

#corollary[ Immediately from @prop-h_evolution we get the following evolution equation for $H$ $ diff_t H = - Delta f - f(|A|^2 + ov(Ric)(nu,nu)) $ ]<cor-H_evolution>

#proof[ We have $H = g^(i j) h_(i j)$ in coordinates so $ diff_t H &= diff_t (g^(i j) h_(i j)) = h_(i j) diff_t (g^(i j)) + g^(i j) diff_t (h_(i j)) \ &= h_(i j) (- 2 f h^(i j)) + g^(i j) (- nabla_i nabla_j f + f (h_(i k) h_(k j) - ov(R)^(nu)_(nu i j))) \ &= - 2 f|A|^2 - Delta f + f (h_(i k) h_(i k) - ov(R)^(nu)_(nu i i)) \ &= - Delta f - f|A|^2 - f ov(Ric)(nu,nu) $ ]

#pagebreak(weak: true)

= Constrained Geometric Flow Method

With the preliminaries out of the way we can begin to discuss how we can attempt to attack the Isoperimetric problem. This is formalized in the *flow method*. This method was first used by Gage and Hamilton in their curve shortening flow in $RR^2$, and in 1984 Huisken extended these methods to the case of convex hypersurfaces in $RR^n$ @huiskenFlowMeanCurvature1984.
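As a numerical illustration of the mechanism behind the flow method (a sketch only; none of the constructions below are implemented here), one can discretize a closed plane curve and move each vertex by a centred second difference of position, which approximates the curvature vector $kappa nu = diff_s^2 F$ of the curve shortening flow; the length of the evolving polygon is then non-increasing:

```python
# Sketch: discrete curve shortening flow for a closed plane curve.
# Each vertex moves by (x_{i+1} - 2 x_i + x_{i-1}) / h^2, a centred
# second difference approximating kappa * nu in arc length.
import numpy as np

def length(pts):
    """Total length of the closed polygon through pts."""
    return float(np.linalg.norm(np.roll(pts, -1, axis=0) - pts, axis=1).sum())

n, dt = 200, 1e-4
s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
pts = np.stack([2.0 * np.cos(s), np.sin(s)], axis=1)  # start from an ellipse

lengths = [length(pts)]
for _ in range(500):
    fwd = np.roll(pts, -1, axis=0) - pts
    bwd = pts - np.roll(pts, 1, axis=0)
    h = 0.5 * (np.linalg.norm(fwd, axis=1) + np.linalg.norm(bwd, axis=1))
    pts = pts + dt * (fwd - bwd) / h[:, None] ** 2  # move by ~ kappa * nu
    lengths.append(length(pts))

# the flow is length-decreasing, as in the continuous setting
assert all(b <= a + 1e-12 for a, b in zip(lengths, lengths[1:]))
```

The constrained flows discussed next modify this basic mechanism so that the enclosed volume is preserved while the area decreases.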
Most recently, Guan and Li instead used the *constrained flow method* to relax the convexity requirement to star-shapedness @guanMeanCurvatureType2013. This constrained method depends heavily on the #link(<lemma-integral_identities>)[Minkowski identities], which allow us to get a handle on the change in area and in volume along a special class of flows.

#theorem("Constrained Flow Method")[ Consider two classes of admissible hypersurfaces $cal(Z),cal(P)$ such that the following conditions hold.
+ For each hypersurface $M in cal(Z)$ we can define a flow $M_t$ which exists for all time.
+ $V(M_t)$ is constant and $A(M_t)$ is non-increasing.
+ The flow converges to a hypersurface $M_t -> M_infinity$ with $M_infinity in cal(P)$.

Then $cal(P)$ is the Isoperimetric profile of $cal(Z)$, in the sense that for each $M in cal(Z)$ there is a hypersurface $S in cal(P)$ with $ V(M) = V(S) "and" A(M) >= A(S) $ ]<thrm-flow_method>

#proof[ Let $M_t$ be the flow of $M$. Then $M_t -> S$ for some hypersurface $S in cal(P)$ and so $ V(M) = V(M_t) = lim_(t -> infinity) V(M_t) = V(S) $ and $ A(M) >= lim_(t -> infinity) A(M_t) = A(S) $ which proves the theorem. ]

We will now consider two previous uses of this method that will motivate our use of it in the third chapter.

== Warped Product Spaces

Warped products are in essence a generalization of polar coordinates in $RR^2$, so let us first look at those. Polar coordinates $(r,theta)$ in $RR^2$ are defined implicitly in terms of the standard Euclidean coordinates through $(x,y) = (r cos(theta), r sin(theta))$, where we have $r > 0$ and $0 <= theta < 2 pi$.
Now the Euclidean metric is given by $g = dif x^2 + dif y^2$ and so we can compute its form in polar coordinates as $ g &= dif x^2 + dif y^2 = (dif (r cos theta))^2 + (dif (r sin theta))^2 \ &= (cos theta dif r - r sin theta dif theta)^2 + (sin theta dif r + r cos theta dif theta)^2 \ &= cos^2 theta dif r^2 - 2 r cos theta sin theta dif r dif theta + r^2 sin^2 theta dif theta^2 \ &+ sin^2 theta dif r^2 + 2 r sin theta cos theta dif r dif theta + r^2 cos^2 theta dif theta^2 \ &= dif r^2 + r^2 dif theta^2 $ Note that if we instead consider the function $(r, theta) -> (r cos(theta), r sin(theta))$ as a function from $RR_+ times S^1 -> RR^2$, where $S^1$ is the unit circle, then this almost gives us a decomposition $ g_(RR^2) = g_RR + r^2 g_(S^1) $ Note that this is not exactly the case because we first need to project a given vector down to its components in $RR$ and $S^1$ respectively and then apply the appropriate metrics. That is, we actually have $ g_(RR^2) = g_RR compose pi_1 + r^2 g_(S^1) compose pi_2 $ where $pi_1,pi_2$ are projections onto the tangent spaces of $RR$ and $S^1$ respectively. A similar construction works in higher dimensions, where we have $RR^n = RR_+ times S^(n - 1)$. It is this decomposition that we aim to generalize with the warped product space.

#definition[ Let $(M,g_M)$ and $(N,g_N)$ be Riemannian manifolds, we can define a metric on $M times N$ by $ g(x,y) = g_M (x) compose pi_1 + f^2(x) g_N (y) compose pi_2, $ where $x,y$ are points of $M$ and $N$ respectively. This is called the _warped product space_, with the _warping factor_ $f$ being a function $f : M -> RR_(> 0)$; it is often denoted $M times_f N$ ]

Note that in practice we will always suppress the projections $pi_1$ and $pi_2$ for clarity.
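The polar-coordinate computation above is easy to verify symbolically; a small SymPy sketch (not part of the thesis argument):

```python
# Sketch: pull the Euclidean metric dx^2 + dy^2 back through
# (x, y) = (r cos theta, r sin theta).  The pullback is J^T J where J is
# the Jacobian, and it comes out to dr^2 + r^2 dtheta^2, exhibiting the
# punctured plane as the warped product R_+ x_r S^1.
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)
X = sp.Matrix([r * sp.cos(theta), r * sp.sin(theta)])

J = X.jacobian([r, theta])
g = (J.T * J).applyfunc(sp.simplify)  # pullback metric in (r, theta)

assert g == sp.diag(1, r**2)
```

Replacing the warping factor $r$ by other functions of $r$ in this computation is exactly what the warped product definition above formalizes.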
The most common warped product spaces we see in practice are those of the form $RR_+ times_f N$ for some $N$, for example the three space forms $S^(n+1), RR^(n+1), HH^(n+1)$ are (up to removing a point or two) of the form
$ (0,pi) times_(sin r) S^n, quad RR_+ times_(r) S^n, quad RR_+ times_(sinh r) S^n $
respectively. These spaces carry a lot of nice properties, but the most important one for us is that they carry a natural conformal vector field.
#proposition[
Let $RR_+ times_f N$ be a warped space and let $r$ be a coordinate on $RR_+$, then the vector field $X = f(r) diff_r$ is a conformal vector field with conformal factor $f'(r)$. Furthermore its associated tensor $psi$ vanishes.
]
#proof[
Let us compute the Lie derivative $lie_X g$, using $lie_X g_N = 0$,
$ lie_X g &= lie_X (d r^2 + f^2(r) g_N) = 2 d r lie_X (d r) + g_N lie_X (f^2(r))
\ &= 2 d r (d (lie_X r)) + 2 g_N f^2(r) f'(r) = 2 d r (d f(r)) + 2 f'(r) f^2(r) g_N
\ &= 2 f'(r) d r^2 + 2 f'(r) f^2(r) g_N = 2 f'(r) (d r^2 + f^2(r) g_N). $
Note that if we set $Phi(r) = integral_0^r f(s) dif s$ then $nabla Phi(r) = f(r) diff_r$ and so $ip(nabla_Y X, Z)$ is just $Hess_Phi (Y,Z)$, which is symmetric, and so its anti-symmetric component vanishes.
]
Note that since it is a gradient, $X$ in the above proposition is closed.
Recall that in Euclidean space spheres are the optimal shapes for the Isoperimetric inequality, in polar coordinates spheres take the simple form of sets where $r = r_0$ for some fixed $r_0$. Using the geometric properties of warped product spaces and the vector field $X$, Guan, Li and Wang were able to use the flow method to prove that this in fact generalizes: the Isoperimetric profile for a large class of warped product spaces consists exactly of the level sets of the projection onto $RR$.
#theorem("Guan, Li, Wang")[
Let $N = RR_+ times_f B$ with $B$ closed and $f$ satisfying some technical conditions. Then among the admissible hypersurfaces such that $ip(f(r) diff_r, nu)$ is everywhere positive, the Isoperimetric profile consists of level sets of $r$.
]
The proof is detailed in @guanVolumePreservingFlow2018, we will not consider these details too much as they will quickly get generalized by the next work.
== Manifolds Admitting Compatible Conformal Vector Fields
Now it turns out that these closed conformal vector fields characterize warped products of the form we saw in the previous section, namely, if a manifold admits a closed conformal vector field then it can locally be written in the form $RR_+ times_f N$ for some manifold $N$ and function $f$, see @tashiro_complete_1965[Lemma 1.2]. This suggests that if we want to weaken the conditions of Guan, Li and Wang we should perhaps consider the case of non-closed conformal vector fields.
This idea was first explored by Li and Pan @jiayuIsoperimetricProblemRiemannian2023, where they formalized the necessary conditions on the ambient manifold in terms of a conformal vector field, and used this to weaken the assumptions on the vector field. They also derive a number of key properties for a conformal vector field satisfying their conditions.
They start with a conformal vector field $X$ on $N$ with conformal factor $phi$ which is non-zero on an open dense set $U$. They then assume $X$ satisfies the following conditions.
#conditions[
+ The distribution $cal(D)(X) seq T U$ defined by $cal(D)(X)|_p = { v in T_p N | ip(v,X) = 0 }$ is integrable on the set $U$.
+ $phi > 0$ everywhere on $U$.
+ The integral surfaces of $cal(D)(X)$ are compact level sets of $(||X||)/phi$ on $U$.
+ $phi^2 - X(phi) > 0$ everywhere on $U$.
+ The normal vector $cal(N)$ to the integral surfaces of $cal(D)(X)$ is the only direction of minimal Ricci curvature, that is for every unit vector $v$ which is not collinear with $cal(N)$ we have
$ ov(Ric)(cal(N),cal(N)) < ov(Ric)(v,v). $
]<cond-pan_li>
They consider the class $frak(Y)$ of admissible hypersurfaces which are star-shaped with respect to $X$, that is, admissible hypersurfaces $M$ for which $u = ip(X, nu)$ is positive along $M$.
They define the normal flow with velocity $f = n phi - H u$ and prove the following results about this flow.
#lemma[
+ For a hypersurface $M in frak(Y)$ the normal flow with velocity $f = n phi - H u$ exists for all time $t in [0,infinity)$.
+ Along this flow volume is fixed and area is non-increasing.
+ The flow converges in the limit to an integral hypersurface of $cal(D)(X)$.
]
Applying this to @thrm-flow_method, they then prove the following.
#theorem("Li and Pan")[
Let $N$ be an ambient manifold admitting a conformal vector field $X$ satisfying @cond-pan_li, then for any star-shaped hypersurface $M$ there exists an integral surface $S$ of $cal(D)(X)$ with
$ V(S) = V(M) "and" A(M) >= A(S) $
]<thrm-li_pan>
== Quasi-Closed Conformal Vector Fields
In this section we will rewrite some of the conditions of Li and Pan, and the results that follow from those conditions, in a form that is easier to use. Let us recall the setting: we let $N$ be the complete ambient manifold which admits a conformal vector field $X$ on some open subset $U$ which does not vanish on that subset, with conformal factor $phi$.
We will start with the #link(<cond-pan_li>)[first condition].
#proposition[
Let $X$ be a conformal vector field such that condition 1 holds, then the associated tensor field $psi$ satisfies
$ ip(psi (v), w) = 0 "for" v,w in cal(D)(X) "everywhere on" U $
]<prop-psi_vanish>
#proof[
By definition the one form defined by $omega(v) = ip(X,v)$ annihilates $cal(D)(X)$, then by the one form criterion for integrability @lee_introduction_2012[p.~495] we get that $dif omega$ also annihilates $cal(D)(X)$, that is $dif omega$ restricts to zero on $cal(D)(X)$, hence by definition of the associated tensor field, we get the result above.
]
We see then that such a conformal vector field is 'almost' closed since its associated tensor field 'almost' vanishes.
#definition[
We will call a conformal vector field $X$ satisfying condition $1$, a _quasi-closed_ conformal vector field.
]
The rest of this section will be devoted to properties of quasi-closed conformal vector fields. We will fix a point $p in U$ and an arbitrary vector $Y in T_p U$. We will denote by $cal(N) = X/(||X||)$ the unit normal vector to the integral surfaces of $cal(D)(X)$. We will also refer to $cal(D)(X)$ as $cal(D)$ for brevity.
We will start with a key property regarding the integral surfaces of $cal(D)$.
#proposition[
Let $S$ be an integral surface of $cal(D)$, then $S$ is totally umbilical, that is at every point $p in S$ we have
$ h_(i j) = f(p) g_(i j), $
for some function $f$ on $S$. Furthermore we have $f = phi/(||X||)$.
]
#proof[
We have in coordinates on $S$,
$ h_(i j) = ip(ov(nabla)_i cal(N), e_j) = ip(ov(nabla)_i X/(||X||), e_j) = ip((ov(nabla)_i X)/(||X||) - X/(||X||^2) (ov(nabla)_i||X||), e_j) $
then since $X$ is orthogonal to the tangent vector $e_j$ we get
$ h_(i j) = ip((ov(nabla)_i X)/(||X||), e_j) = 1/(||X||) ip(ov(nabla)_i X, e_j) = 1/(||X||) (phi g_(i j) + ip(psi (e_i), e_j)) $
then by @prop-psi_vanish we get
$ h_(i j) = phi/(||X||) g_(i j) $
]
Now we want the integral surfaces $S$ of $cal(D)$ to be our Isoperimetric profile, hence they should be critical points of the surface area functional with respect to fixed volume. Hence by @prop-vol_area_variation we need to have $H = n phi/ (||X||)$ be constant, which motivates condition 3.
#corollary[
If $X$ satisfies condition $3$, then the integral surfaces of $cal(D)$ are totally umbilical with constant mean curvature $H = n phi/(||X||)$.
]
#definition[
We will call a quasi-closed conformal vector field $X$ _symmetric_ if it satisfies condition 3.
]
Now consider, for a moment, the spheres in $RR^(n+1)$ of radius $r$. They are the integral surfaces for the orthogonal distribution to $X = x^i diff_i$ which is a closed conformal vector field with factor $phi = 1$.
We see that their mean curvature is $H = n / r$; the mean curvature is thus inversely proportional to a certain 'scale' function, in this case $r$. This scale function will turn out to be extremely useful in many of our future calculations, so we give it a name here.
#definition[
Wherever $phi != 0$ on $U$, we will call the following function the _scale_ function for $X$
$ lambda = (||X||^2)/phi^2 $
]<def-scale>
Since we know $lambda$ is constant on integral surfaces, its gradient must be collinear with $X$, that is $ov(nabla) lambda = 2 Lambda X$ for some function $Lambda$. We note a convenient expression for $Lambda$.
#proposition[
Where $lambda$ is defined, we have
$ ov(nabla) lambda = 2 Lambda X = 2 (phi^2 - X(phi))/phi^3 X $
]<prop-lambda_def>
#proof[
We see that
$ ov(nabla) lambda &= ip(ov(nabla) lambda, cal(N)) cal(N) = X(lambda) X/(||X||^2) = X((||X||^2)/phi^2) X/(||X||^2)
\ &= (2 ip(ov(nabla)_X X, X))/phi^2 X/(||X||^2) + (-2/phi^3) X(phi) X = 2 (phi ip(X,X))/phi^2 X/(||X||^2) - 2(X(phi)/phi^3) X
\ &= 2 (phi^2 - X(phi))/phi^3 X $
]
Now as we saw, for a quasi-closed conformal vector field we have that $ip(psi(v), w)$ vanishes on $cal(D)$ and this turns out to be enough to get a precise equation for $psi$ even when $v,w$ are not in $cal(D)$.
#proposition[
We have
$ psi(Y) = (ip(Y, ov(nabla)||X||) X - ip(Y, X) ov(nabla)||X||)/(||X||). $
Furthermore if $X$ is symmetric, then wherever $phi != 0$ we have
$ psi(Y) = (ip(Y, ov(nabla)phi) X - ip(Y, X) ov(nabla)phi)/(phi). $
]<prop-psi_exact>
#proof[
Recall that $ip(psi(Y),Z)$ is bilinear and anti-symmetric in $Y,Z$, so we may assume WLOG that one of the two is in $cal(D)$, and by anti-symmetry that it is $Y$. Then by @prop-psi_vanish we may assume that $Z$ is collinear with $X$.
We thus have $ip(Y,Z) = 0$ so
$ ip(ov(nabla)_Y X, Z) &= ip(Z,X)/(||X||^2) ip(ov(nabla)_Y X, X) = 1/2 ip(Z,X)/(||X||^2) ov(nabla)_Y ip(X,X)\
&= 1/2 ip(Z,X)/(||X||^2) ip(Y, ov(nabla)||X||^2) = ip(Z,X)/(||X||) ip(Y, ov(nabla)||X||) $
We can then anti-symmetrize this to get that for arbitrary $Y,Z$
$ ip(psi(Y),Z) = (ip(Z,X)ip(Y, ov(nabla)||X||) - ip(Y,X) ip(Z, ov(nabla)||X||))/(||X||) $
which gives us the first result. For the second result, we compute
$ psi(Y) &= (ip(Y, ov(nabla)(phi dot (||X||)/phi)) X - ip(Y, X) ov(nabla)(phi dot (||X||)/phi))/(||X||)
\ &= (ip(Y, (||X||)/phi ov(nabla)phi + phi ov(nabla)((||X||)/phi)) X - ip(Y, X) ((||X||)/phi ov(nabla)phi + phi ov(nabla)((||X||)/phi)))/(||X||)
\ &= (ip(Y, ov(nabla)phi) X - ip(Y, X) ov(nabla)phi)/(phi) + phi (ip(Y,ov(nabla)((||X||)/phi)) X - ip(Y, X) ov(nabla)((||X||)/phi))/(||X||). $
Now we recall that $(||X||)/phi$ is constant along integral surfaces and thus its gradient is collinear with $X$. At the point $p$ we then write $ov(nabla) ((||X||)/phi) = a X$ and get
$ psi(Y) &= (ip(Y, ov(nabla)phi) X - ip(Y, X) ov(nabla)phi)/(phi) + phi (ip(Y,a X) X - ip(Y, X) a X)/(||X||)
\ &= (ip(Y, ov(nabla)phi) X - ip(Y, X) ov(nabla)phi)/(phi) + 0 $
which gives us the second result.
]
We note here two other useful ways to write the covariant derivative of $X$
#corollary[
Where $phi = 0$ we have
$ ov(nabla)_Y X/(||X||) = - ip(X,Y)/(||X||^2) ov(nabla)||X|| $
if $X$ is symmetric then where $phi != 0$ we have
$ ov(nabla)_Y X/(phi) = Y - ip(X,Y)/phi^2 ov(nabla) phi $
]<cor-simple_grad>
#proof[
We prove directly, for the first case
$ ov(nabla)_Y X/(||X||) &= 1/(||X||) ov(nabla)_Y X - X/(||X||^2) ov(nabla)_Y||X||
\ &= 1/(||X||) ((ip(Y, ov(nabla)||X||) X - ip(Y, X) ov(nabla)||X||)/(||X||)) - X/(||X||^2) ov(nabla)_Y||X||
\ &= -(ip(Y, X) ov(nabla)||X||)/(||X||^2).
$ And in the second case
$ ov(nabla)_Y X/(phi) &= 1/(phi) ov(nabla)_Y X - X/(phi^2) ov(nabla)_Y phi
\ &= 1/(phi) (phi Y + (ip(Y, ov(nabla)phi) X - ip(Y, X) ov(nabla)phi)/(phi)) - X/(phi^2) ov(nabla)_Y phi
\ &= Y - (ip(Y, X) ov(nabla)phi)/(phi^2). $
]
#corollary[
$ ip(psi(Y),Z) = 2(AntiSym(ov(nabla)||X||^flat times.circle X^flat)(Y,Z))/(||X||). $
Furthermore if $X$ is symmetric, wherever $phi != 0$ we have
$ ip(psi(Y),Z) = 2(AntiSym(ov(nabla)phi^flat times.circle X^flat)(Y,Z))/(phi). $
Here $♭$ denotes lowering an index and $AntiSym$ denotes the anti-symmetrization.
]<cor-psi_antisym>
#proof[
Immediate from @prop-psi_exact.
]
#pagebreak(weak: true)
We can also rewrite some of the Riemann and Ricci curvatures of the ambient manifold in terms of $X$.
#proposition[
Wherever $phi = 0$ we have
$ ov(R)(Y,X,X,Y) &= -||X||ip(ov(nabla)_Y ov(nabla)||X||, Y) + ip(X,Y)^2/(||X||) ip(ov(nabla)_cal(N) ov(nabla)||X||, cal(N))
\ ov(Ric)(X,Y) &= -(ip(X,Y))/(||X||) (ov(Delta)||X|| - ip(ov(nabla)_cal(N) ov(nabla)||X||, cal(N))) $
and in addition, if $Y in cal(D)$
$ ||X||ov(Ric)(X,Y) = ip(ov(nabla)_Y ov(nabla)||X||, X) = ip(ov(nabla)_X ov(nabla)||X||, Y) = 0 $<ric_orthogonal>
If $X$ is symmetric, then wherever $phi != 0$ we have
$ ov(R)(Y,X,X,Y) &= -phi ip(ov(nabla)_Y ov(nabla)phi, Y) + ip(X,Y)^2/(phi) ip(ov(nabla)_cal(N) ov(nabla)phi, cal(N))
\ ov(Ric)(X,Y) &= -(ip(X,Y))/(phi) (ov(Delta)phi - ip(ov(nabla)_cal(N) ov(nabla) phi, cal(N))) $<ric_x>
and in addition, if $Y in cal(D)$
$ phi ov(Ric)(X,Y) = ip(ov(nabla)_Y ov(nabla)phi, X) = ip(ov(nabla)_X ov(nabla)phi, Y) = 0 $<ric_orthogonal-phi>
]<prop-riemann_ricci_x_identity>
#proof[
For the first case, we will consider $ov(R)(ov(e)_i, ov(e)_j, X/(||X||), ov(e)_k)$, then we will use linearity of the curvature tensor to remove the denominator, we start with a use of @cor-simple_grad
$ ov(R)(ov(e)_i, ov(e)_j, X/(||X||), ov(e)_k) &= ip(ov(nabla)_i ov(nabla)_j X/(||X||) - ov(nabla)_j ov(nabla)_i X/(||X||), ov(e)_k)
\ &= ip(ov(nabla)_i (-(ip(ov(e)_j, X) ov(nabla)||X||)/(||X||^2)) - ov(nabla)_j (-(ip(ov(e)_i, X) ov(nabla)||X||)/(||X||^2)), ov(e)_k)
\ &= - ov(nabla)_i ((ip(ov(e)_j, X))/(||X||^2)) ov(e)_k (||X||) + ov(nabla)_j ((ip(ov(e)_i, X))/(||X||^2)) ov(e)_k (||X||)
\ &-(ip(ov(e)_j, X))/(||X||^2) ip(ov(nabla)_i ov(nabla)||X||, ov(e)_k) + (ip(ov(e)_i, X))/(||X||^2) ip(ov(nabla)_j ov(nabla)||X||, ov(e)_k). $
Now let us deal with the first two terms, expanding gives us
$ ov(e)_k (||X||) ((ip(ov(nabla)_j X, ov(e)_i) - ip(ov(nabla)_i X, ov(e)_j))/(||X||^2) + 2 (ip(X, ov(e)_j) ov(e)_i (||X||) - ip(X, ov(e)_i) ov(e)_j (||X||))/(||X||^3)), $
then by definition the numerator of the first fraction here is $2ip(psi(ov(e)_j), ov(e)_i)$ which we can expand by @prop-psi_exact, this quickly shows that these terms exactly cancel the other terms in this above expression. Now we are left with
$ ov(R)(ov(e)_i, ov(e)_j, X/(||X||), ov(e)_k) = (ip(ov(e)_i, X))/(||X||^2) ip(ov(nabla)_j ov(nabla)||X||, ov(e)_k) - (ip(ov(e)_j, X))/(||X||^2) ip(ov(nabla)_i ov(nabla)||X||, ov(e)_k), $<riemann_exact>
by linearity we can substitute $ov(e)_i = ov(e)_k = Y$ and $ov(e)_j = X$, this gives us
$ ov(R)(Y, X, X/(||X||), Y) = (ip(Y, X))/(||X||^2) ip(ov(nabla)_X ov(nabla)||X||, Y) - (ip(X, X))/(||X||^2) ip(ov(nabla)_Y ov(nabla)||X||, Y) $
we now multiply by $||X||$ to get
$ ov(R)(Y, X, X, Y) = -||X||ip(ov(nabla)_Y ov(nabla)||X||, Y) + (ip(Y, X))/(||X||) ip(ov(nabla)_X ov(nabla)||X||, Y) $<riemann_simplified>
Next consider the integral hypersurface $S$ of $cal(D)$ that passes through $p$, we know that $h_(i j)$ is identically zero everywhere on this hypersurface. Hence, by @lemma-codazzi, we have for any $e_i,e_j,e_k in T_p S$
$ ov(R)(e_i, e_j, e_k, cal(N)) = - nabla_i h_(j k) + nabla_j h_(i k) = 0 $
and so in particular, by taking trace over $j,k$ and using linearity to substitute $e_i = Y$ we get
$ ov(Ric)(Y, X) = 0 $
for any $Y in cal(D)$.
Now by using @eqn-riemann_exact but tracing over $e_i$ and $e_k$ and plugging in $ov(e)_j = Y$ gives us
$ ov(Ric)(Y, X) = - ip(Y, X)/(||X||) ov(Delta)||X|| + 1/(||X||) ip(ov(nabla)_Y ov(nabla)||X||, X) $<ricci_simplified>
then for any $Y in cal(D)$ we get
$ 0 = - ip(Y, X)/(||X||) ov(Delta)||X|| + 1/(||X||) ip(ov(nabla)_Y ov(nabla)||X||, X) = 1/(||X||) ip(ov(nabla)_Y ov(nabla)||X||, X) $
which gives us @eqn-ric_orthogonal.
Now plugging @eqn-ric_orthogonal into @eqn-riemann_simplified gives us
$ ov(R)(Y, X, X, Y) &= -||X||ip(ov(nabla)_Y ov(nabla)||X||, Y) + (ip(Y, X))/(||X||) ip(cal(N), Y) ip(ov(nabla)_X ov(nabla)||X||, cal(N))
\ &= -||X||ip(ov(nabla)_Y ov(nabla)||X||, Y) + (ip(Y, X)^2)/(||X||)ip(ov(nabla)_(cal(N)) ov(nabla)||X||, cal(N)) $
which is the first result. For the second result we do the same thing with @eqn-ricci_simplified, we get
$ ov(Ric)(Y, X) &= - ip(Y, X)/(||X||) ov(Delta)||X|| + 1/(||X||) ip(cal(N), Y) ip(ov(nabla)_X ov(nabla)||X||, cal(N))
\ &= - ip(Y, X)/(||X||) (ov(Delta)||X|| - ip(ov(nabla)_(cal(N)) ov(nabla)||X||, cal(N))) $
which is our second result.
Finally for the case where $phi != 0$, we start off similarly
$ ov(R)(ov(e)_i, ov(e)_j, X/phi, ov(e)_k) &= ip(ov(nabla)_i ov(nabla)_j X/phi - ov(nabla)_j ov(nabla)_i X/phi, ov(e)_k)
\ &= ip(ov(nabla)_i (ov(e)_j -(ip(ov(e)_j, X) ov(nabla) phi)/(phi^2)) - ov(nabla)_j (ov(e)_i - (ip(ov(e)_i, X) ov(nabla) phi)/(phi^2)), ov(e)_k) $
We now note that $ov(nabla)_i ov(e)_j$ and $ov(nabla)_j ov(e)_i$ are both zero because we are working in normal coordinates. After that the calculation is identical to the first case.
]
#pagebreak(weak: true)
= Main Results
== Motivation
As we saw in the previous section, already with quasi-closed conformal vector fields we can prove a strong result regarding Isoperimetric inequalities. However, there are still cases where we would expect these techniques to apply, but which cannot be reached with their approach.
#example[
Consider $N = RR^(n+1)$ with the following conformal vector field
$ X(x^1,...,x^(n+1)) = x^i diff_i - x^2 diff_1 + x^1 diff_2 $
this is indeed a quasi-closed conformal vector field with conformal factor $phi = 1$, but its integral surfaces are not compact and they do not have constant mean curvature. Thus we have no hope of attaining a useful Isoperimetric inequality for star-shaped surfaces with respect to this vector field.
We could instead work with $Y = x^i diff_i$, but then there are hypersurfaces that will never be star-shaped with respect to $Y$ but are star-shaped with respect to $X$. We really do need star-shapedness, since it is what guarantees convergence and that area decreases along the flow. An example of such a surface can be seen in @fig-star_shaped.
#figure(
 [
 #set align(center);
 #cetz.canvas({
 import cetz.plot: *
 plot(size: (8,8), axis-style: "school-book", x-tick-step: none, y-tick-step: none,
 {
 add(domain: (0, 1), samples: 200, t => {
 let r = 1 + calc.pow(calc.sin(calc.pi*(2*t+1/4)),2);
 let theta = t + calc.sin(4*calc.pi*t)/8;
 return (r*calc.cos(2*calc.pi*theta), r*calc.sin(2*calc.pi*theta))
 })
 for s in (0,0.02452,0.04848,0.07392,0.10471,0.16832,0.26394,0.3846,0.4391,0.4728,
 0.5,0.52452,0.54848,0.57392,0.60481,0.66832,0.76394,0.8846,0.9391,0.9728) {
 let t = s;
 let r = 1 + calc.pow(calc.sin(calc.pi*(2*t+1/4)),2);
 let theta = t + calc.sin(4*calc.pi*t)/8;
 let p = (r*calc.cos(2*calc.pi*theta), r*calc.sin(2*calc.pi*theta))
 let v = (p.at(0) - p.at(1), p.at(1) + p.at(0))
 let factor = 0.3;
 let p_2 = (p.at(0) + factor*v.at(0), p.at(1) + factor*v.at(1))
 add((p,p_2), style: (stroke: (paint: red), mark: (end: ">", scale: 0.5)))
 }
 })
 })
 ],
 caption: [Hypersurface in $RR^2$ which is star-shaped with respect to $X$ but not $Y$]
)<fig-star_shaped>
]
We will shortly introduce the tools we will need to deal with this issue, but before we do that we want to motivate these tools a little.
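For $n = 1$, so $N = RR^2$, the claims in this example are easy to check directly; the sketch below (assuming Python with `sympy`, and using the convention $lie_X g = 2 phi g$ from the warped product section) verifies that $X$ is conformal and that its rotational part is Killing:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
coords = (x, y)

def lie_flat(V):
    # For the flat metric, (L_V g)_ij = d_i V_j + d_j V_i
    return sp.Matrix(2, 2, lambda i, j: sp.diff(V[j], coords[i]) + sp.diff(V[i], coords[j]))

X = sp.Matrix([x - y, y + x])   # the field of the example: dilation plus rotation
Xrot = sp.Matrix([-y, x])       # its rotational part

print(lie_flat(X))     # 2 times the identity: conformal, with factor phi = 1
print(lie_flat(Xrot))  # zero matrix: the rotational part is Killing
```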
We can think of the quasi-closedness condition on $X$ as a compatibility condition between $X$ and a foliation $S_alpha$, namely that $X$ is everywhere orthogonal to $S_alpha$. We can then try to consider foliations which are in some sense 'compatible' with the $X$ given in the example above.
#example[
Consider the foliation $cal(F)$ of $RR^(n+1) backslash 0$ given by spheres $S_alpha$ centered at the origin, this foliation is not everywhere orthogonal to $X$, however, the foliation is preserved under the normal flow $n phi - H u$ and the foliation induces a decomposition $X = X^perp + X^top$ where $X^perp$ is orthogonal to $S_r$ and $X^top$ is tangent to $S_r$.
Now importantly $X^perp$ is just $Y$ and thus is also a conformal vector field, and since $X^top = X - X^perp$ is a difference of conformal vector fields with the same conformal factor, $X^top$ is also conformal (in fact Killing).
Now $X^perp$ is a quasi-closed conformal vector field which we can manage with the preexisting techniques, so our goal is to find a way to use this decomposition to reduce to the case of just $X^perp$.
]
== Setting <head-setting>
We will consider a complete $(n+1)$-dimensional Riemannian manifold $N$, with $n >= 2$. On this manifold we consider a complete conformal vector field $X$ which is non-zero on an open set $U$, along with a foliation $cal(F)$.
We assume the two are compatible, in the sense that the foliation $cal(F)$ induces a decomposition $X = X^perp + X^top$ where $X^perp$ is a quasi-closed symmetric conformal vector field with integral surfaces $S_alpha in cal(F)$ and $X^top$ is a quasi-closed Killing vector field, that is its conformal factor is zero.
We will associate with $X^perp$ its conformal factor $phi$, which is the same as that of $X$, we will also associate the scale function (@def-scale) $lambda$ and its derivative $Lambda$ (@prop-lambda_def). We will denote by $psi^perp$ and $psi^top$ the associated tensor fields (@prop-psi_exact) of $X^perp$ and $X^top$ respectively.
We will also define
$ cal(N)^perp = X^perp/(||X^perp||) "and" cal(N)^top = X^top/(||X^top||) $
We will also make the following assumptions
#assumptions[
+ The conformal factor $phi$ of $X^perp$ is positive on $U$.
+ The function $Lambda$ (@prop-lambda_def) is positive on $U$.
+ The function $Lambda phi^3 + X^top (phi)$ is positive on $U$.
+ The integral hypersurfaces $S_alpha$ are compact.
+ The directions $X^perp$ and $X^top$ are both of least Ricci curvature, that is for any unit vector $Y in T_p U$ we have
$ ov(Ric)(Y,Y) >= ov(Ric)(cal(N)^perp,cal(N)^perp) = ov(Ric)(cal(N)^top,cal(N)^top) $
]<assum-required>
The first condition informally means that $X^perp$ is a dilation-like vector field: to first order, volumes increase along its flow. \
The second condition informally means that our scale function $lambda$ is increasing in the direction of $X^perp$, so just like in Euclidean space, as balls increase in radius their mean curvature decreases. \
The third and fourth conditions are technical conditions needed for convergence. The last condition is necessary for area to decrease along the normal flow we will construct.
Our flow will consist of two stages:
+ First we will use a time-dependent conformal vector field $X(t) = X^perp + X^top (1-t/T_0)$ for some constant $T_0$, we will consider the flow with velocity
$ f = n phi - H ip(X(t), nu), $<flow_def>
we will run this flow until $t = T_0$.
+ If the flow survives up to $t = T_0$ we will stop it there, set $X(t) = X^perp$, and then continue with the flow
$ f = n phi - H ip(X^perp, nu) $
for however long the flow lasts.
Note that in this setting the definition of $u$ depends on time, but we will drop this dependence in our notation and only explicitly mention it when it comes up.
Note that, if $u > 0$ when $t = T_0$, then at that point in time the surface is star-shaped with respect to just $X^perp$ so we can apply the methods of Li and Pan.
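As a quick sanity check on the second stage, consider the model case $N = RR^2$ with $X^perp = x^1 diff_1 + x^2 diff_2$, so that $phi = 1$ and $n = 1$ (ignoring the standing assumption $n >= 2$, which plays no role in this particular check). The numeric sketch below (assuming Python with `numpy`; the function name is ours) confirms that origin-centered circles are stationary for $f = n phi - H u$, while an off-center circle has a nonzero velocity of zero mean, consistent with the volume preservation proven in @prop-vol_area_evolution below:

```python
import numpy as np

def velocity_on_circle(center, radius, samples=2000):
    """Velocity f = n*phi - H*u on a circle, for X_perp = (x, y), phi = 1, n = 1."""
    t = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    nu = np.stack([np.cos(t), np.sin(t)], axis=1)   # outward unit normal
    pts = np.asarray(center) + radius * nu          # points of the circle
    H = 1.0 / radius                                # curvature of the circle
    u = np.sum(pts * nu, axis=1)                    # u = <X_perp, nu>
    return 1.0 - H * u

f_centered = velocity_on_circle((0.0, 0.0), 2.0)
f_offset = velocity_on_circle((1.0, 0.0), 2.0)
print(np.abs(f_centered).max())  # ~0: centered circles are stationary
print(np.abs(f_offset).max())    # 0.5: an off-center circle moves...
print(f_offset.mean())           # ~0: ...but with zero mean velocity
```

Since the line element of a circle is constant, the zero mean of $f$ is exactly the vanishing of $integral f dif S$.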
For now we will assume that the flow exists on some interval $[0,T)$, we will show this must be the case later, in @prop-flow_short_time. We will also assume that $u$ remains positive on $[0,T)$; this will be proven in @head-evolution_equation_u.
We will now start computing the evolution of various geometric quantities along our flow. For convenience we will define the factor
$ Xi(t) =(1 - t/T_0). $
We will also define the parabolic operator
$ L = diff_t - u Delta $
as well as the functions
$ u^perp = ip(X^perp, nu), quad u^top = Xi(t) ip(X^top, nu). $
Notice $u = u^perp + u^top$. Finally we will also use the notation
$ pi(X^perp), pi(X^top) $
to denote the orthogonal projection of these vector fields onto $T_p M_t$, notice that
$ pi(X^perp) = X^perp - u^perp nu = ip(X^perp, e_i) e_i quad "and" \
pi(X^top) = X^top - ip(X^top, nu) nu = ip(X^top, e_i) e_i. $
For some calculations we will assume $t <= T_0$, for $t > T_0$ we can simply set $Xi(t) = Xi'(t) = 0$ and the calculations still follow.
== Variation of Area and Volume
Information about the variation of area and volume along our flow is crucial to the success of the flow method, so we will start with that.
#proposition[
Let $M_t$ be a solution to the flow for $t in [0,T)$, we have
$ diff_t V(M_t) = 0 quad "and" quad diff_t A(M_t) <= 0. $
It thus follows that volume is fixed and area is non-increasing.
]<prop-vol_area_evolution>
#proof[
We use @prop-vol_area_variation to calculate the variation of volume and area. For volume we have due to @lemma-integral_identities
$ diff_t V(M_t) = integral_(M_t) f dif S = integral_(M_t) n phi - H u dif S = 0.
$
For area we get
$ diff_t A(M_t) &= integral_(M_t) H f dif S = integral_(M_t) H (n phi - H u) dif S
\ &= n/(n - 1) integral_(M_t) ov(Ric)(nu, X(t) - u nu) dif S - 1/n integral_(M_t) u sum_(i < j) (kappa_i - kappa_j)^2 dif S $
now we write $X(t) = X^perp + Xi(t) X^top$ and $u = u^perp + u^top$ to get
$ ov(Ric)(nu, X(t) - u nu) &= ov(Ric)(nu, X^perp - u^perp nu) + ov(Ric)(nu, Xi(t) X^top - u^top nu)
\ &= ov(Ric)(nu, X^perp) + Xi(t) ov(Ric)(nu, X^top) - (u^perp + u^top)ov(Ric)(nu,nu) $
then by @eqn-ric_orthogonal we get
$ ov(Ric)(nu, X(t) - u nu) = & ip(nu, X^perp) ov(Ric)(cal(N)^perp, cal(N)^perp) + ip(nu, Xi(t) X^top) ov(Ric)(cal(N)^top, cal(N)^top)
\ &- med (u^perp + u^top)ov(Ric)(nu,nu)
\ = & u^perp ov(Ric)(cal(N)^perp, cal(N)^perp) + u^top ov(Ric)(cal(N)^top, cal(N)^top) - med (u^perp + u^top)ov(Ric)(nu,nu). $
Now due to assumption 5 we get that $ov(Ric)(cal(N)^perp, cal(N)^perp) = ov(Ric)(cal(N)^top, cal(N)^top)$ and so
$ ov(Ric)(nu, X(t) - u nu) =& (u^perp + u^top) ov(Ric)(cal(N)^perp, cal(N)^perp) - med (u^perp + u^top)ov(Ric)(nu,nu).
\ =& u (ov(Ric)(cal(N)^perp, cal(N)^perp) - ov(Ric)(nu,nu)). $
Plugging this into the variation of area we get
$ diff_t A(M_t) &= n/(n - 1) integral_(M_t) ov(Ric)(nu, X(t) - u nu) dif S - 1/n integral_(M_t) u sum_(i < j) (kappa_i - kappa_j)^2 dif S
\ &= n/(n - 1) integral_(M_t) u (ov(Ric)(cal(N)^perp, cal(N)^perp) - ov(Ric)(nu,nu)) dif S - 1/n integral_(M_t) u sum_(i < j) (kappa_i - kappa_j)^2 dif S. $
But now again by assumption 5 we get that the term $ov(Ric)(cal(N)^perp, cal(N)^perp) - ov(Ric)(nu,nu)$ is always non-positive and the term $(kappa_i - kappa_j)^2$ is clearly non-negative, so both of these integrals are non-positive and thus $diff_t A(M_t) <= 0$.
]
By this proposition we get the second condition of @thrm-flow_method.
== Evolution Equation for $lambda$
The first result we will prove is arguably the most important result, as it will guarantee our hypersurface remains within a compact subset.
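Before deriving the evolution equation, here is a quick symbolic check of the ingredients $lambda$ and $Lambda$ (@prop-lambda_def) in the model Euclidean case $X^perp = x^1 diff_1 + x^2 diff_2$, $phi = 1$ (assuming Python with `sympy`; not part of the argument):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
X = sp.Matrix([x, y])        # model X_perp on R^2, conformal factor phi = 1
phi = sp.Integer(1)

lam = X.dot(X) / phi**2      # scale function lambda = |X|^2 / phi^2
grad_lam = sp.Matrix([sp.diff(lam, x), sp.diff(lam, y)])
X_phi = X[0] * sp.diff(phi, x) + X[1] * sp.diff(phi, y)   # X(phi)
Lam = (phi**2 - X_phi) / phi**3                           # Lambda as in the proposition

print(Lam)                                  # 1
print(sp.simplify(grad_lam - 2 * Lam * X))  # zero vector: grad(lambda) = 2*Lambda*X
```

Here $lambda = r^2$ and $Lambda = 1 > 0$, matching assumptions 1 and 2.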
#proposition[
The evolution equation for $lambda$ under the flow is
$ L lambda = - 2 Lambda n phi u^top - 2 u ip(nabla Lambda, X^perp). $
]<prop-lambda_evolution>
#proof[
First we compute the time derivative of $lambda$, since it is an ambient quantity this is easy by @prop-ambient_evolution. We get
$ diff_t lambda &= (n phi - H u) ov(nabla)_nu lambda = (n phi - H u) ip(nu, ov(nabla) lambda) = (n phi - H u) 2 Lambda ip(nu, X^perp)
\ &= 2 (n phi - H u) Lambda u^perp. $
For the induced Laplacian we get
$ Delta lambda &= nabla_i nabla_i lambda = nabla_i (2 Lambda ip(X^perp, e_i)) = 2 Lambda (nabla_i ip(X^perp, e_i)) + 2 ip(X^perp, e_i) nabla_i Lambda
\ &= 2 Lambda (ip(ov(nabla)_i X^perp, e_i) + ip(X^perp, ov(nabla)_i e_i)) + 2 ip(nabla Lambda, pi(X^perp)). $
Now, since the trace of a tensor is the same as the trace of its symmetrization,
$ ip(ov(nabla)_i X^perp, e_i) = tr(ov(nabla) X^perp) = tr(Sym(ov(nabla) X^perp)) = tr(phi g_(i j)) = n phi. $
Next by @prop-h_props we know that $ov(nabla)_i e_i = - h_(i i) nu = - H nu$. Combining these we get that
$ Delta lambda &= 2 Lambda (n phi - H ip(X^perp, nu)) + 2 ip(nabla Lambda, X^perp) = 2 Lambda (n phi - H u^perp) + 2 ip(nabla Lambda, X^perp). $
Finally we compute
$ (diff_t - u Delta) lambda &= 2 Lambda ( (n phi - H u) u^perp - (n phi - H u^perp) u) - 2 u ip(nabla Lambda, X^perp)
\ &= 2 Lambda (- n phi u^top) - 2 u ip(nabla Lambda, X^perp) = - 2 Lambda n phi u^top - 2 u ip(nabla Lambda, X^perp) $
]
#corollary[
For all $t in [0,T)$ and all $p in M_t$ we have
$ min_(p in M_0) lambda(p,0) <= lambda(p,t) <= max_(p in M_0) lambda(p,0) $
]<cor-lambda_estimate>
#proof[
At a maximal or minimal point of $lambda$ we have by Lagrange multipliers that $ov(nabla) lambda$ is collinear with $nu$, so we must have that $X^perp$ is collinear with $nu$ and thus $pi(X^perp) = 0$. Also since $X^top$ is orthogonal to $X^perp$ we have that at a maximal or minimal point $X^top$ is orthogonal to $nu$ and thus $u^top = 0$.
Thus we get that at a maximal or minimal point $L lambda = 0$ and so by @prop-max_principle applied to $lambda$ and $-lambda$ we get
$ min_(x in M_0) lambda(x,0) <= lambda(x,t) <= max_(x in M_0) lambda(x,0) $
]
#corollary[
Given a hypersurface $M$, there is a compact region $Gamma$ such that $M_t$ is contained in $Gamma$ for as long as it exists.
]
#proof[
We use @cor-lambda_estimate along with @prop-compact_region.
]
== Evolution Equation for $u$ <head-evolution_equation_u>
This next evolution equation is nearly as important: our parabolic operator has a factor of $u$, and so it ceases to be uniformly parabolic if we do not have a uniform lower bound on $u$.
#proposition[
The evolution equation for $u$ under the flow is
$ L u &= n (Lambda phi^3 - Xi(t) X^top (phi)) + Xi'(t) ip(X^top, nu) - 2 phi H u + |A|^2 u^2 + 2 n u nu(phi)
\ &+ med u^2 ov(Ric)(nu,nu) + H ip(X,nabla u) $
]<prop-u_evolution>
#proof[
This is quite a long calculation so we will split it into multiple steps, first for the time derivative
$ diff_t u &= diff_t ip(X^perp + Xi(t) X^top, nu) = ip(diff_t (X^perp + Xi(t) X^top), nu) + ip(X^perp + Xi(t) X^top,diff_t nu)
\ &= (n phi - H u)ip(ov(nabla)_nu (X^perp + Xi(t) X^top), nu) + Xi'(t) ip(X^top, nu)
\ &+ med ip(X^perp + Xi(t) X^top, - nabla (n phi - H u)). $
Using the fact that $X^perp + Xi(t) X^top$ is conformal with factor $phi$ we can simplify the first term and continue calculating
$ diff_t u &= phi(n phi - H u) + Xi'(t) ip(X^top, nu) + ip(X^perp + Xi(t) X^top, - nabla (n phi - H u))
\ &= phi(n phi - H u) + Xi'(t) ip(X^top, nu) - n ip(X,nabla phi) + H ip(X,nabla u) + u ip(X, nabla H)
\ &= phi(n phi - H u) + Xi'(t) ip(X^top, nu) - n ip(X,ov(nabla)phi) + n u nu (phi) + H ip(X,nabla u) + u ip(X, nabla H). $<u_diff_t>
Now we switch to the Laplacian, it will be helpful to decompose $u = u^perp + u^top$. First we deal with $u^perp$.
#claim[ We have $ Delta u^perp = - n nu(phi) - u^perp ov(Ric)(nu,nu) + ip(nabla H, X^perp) + phi H - |A|^2 u^perp $ ] We start with computing from definitions $ Delta u^perp &= nabla_i nabla_i ip(X^perp, nu) = nabla_i (ip(ov(nabla)_i X^perp, nu) + ip(X^perp, ov(nabla)_i nu)). $ Now for the first term $e_i$ is orthogonal to $nu$ and thus it simplifies to $ip(psi^perp (e_i), nu)$, we continue computing, $ Delta u^perp = nabla_i (ip(psi^perp (e_i), nu) + ip(X^perp, h_(i j) e_j)). $<u_perp_step_1> Let us now deal with the first term, we use @cor-psi_antisym $ nabla_i (ip(psi^perp (e_i), nu)) &= nabla_i ((2 AntiSym(ov(nabla)phi^flat times.circle X^(perp flat))(e_i,nu))/(phi)) \ &= ((ov(nabla)_i 2 AntiSym(ov(nabla)phi^flat times.circle X^(perp flat))(e_i,nu))/(phi) \ &+ (2 AntiSym(ov(nabla)phi^flat times.circle X^(perp flat))(ov(nabla)_i e_i,nu))/(phi) \ &+ med (2 AntiSym(ov(nabla)phi^flat times.circle X^(perp flat))(e_i,ov(nabla)_i nu))/(phi) \ &- (2 AntiSym(ov(nabla)phi^flat times.circle X^(perp flat))(e_i,nu))/(phi^2)ov(nabla)_i phi. $ We notice that since $ov(nabla)_i e_i = - H nu$ the second term will have two $nu$ inputs into an anti-symmetrization, making it vanish. Similarly, since $ov(nabla)_i nu = h_(i j) e_j$ the third term will have the inputs $(e_i,e_j)$ symmetrized by $h_(i j)$ and thus will also vanish. We are thus left with $ nabla_i (ip(psi^perp (e_i), nu)) &= ((ov(nabla)_i 2 AntiSym(ov(nabla)phi^flat times.circle X^(perp flat)))(e_i,nu))/(phi) - ip(psi^perp (e_i), nu) (ov(nabla)_i phi)/phi. 
wide $ <u_perp_step_2> Now we can compute the covariant derivative of the anti-symmetrization $ 2 AntiSym( ov(nabla) phi^flat times.circle (phi e_i + psi^perp (e_i))^flat ) + 2 AntiSym( ov(nabla)_i ov(nabla) phi^flat times.circle X^(perp flat) ), $ now when we plug this back into @eqn-u_perp_step_2 we get $ ((ov(nabla)_i phi) ip(psi^perp (e_i), nu))/phi - ip(e_i, e_i) nu(phi) + ov(Hess)_phi (e_i, e_i) u^perp/phi - ov(Hess)_phi (e_i, nu) ip(X^perp,e_i)/phi \ - ip(psi^perp (e_i), nu) (ov(nabla)_i phi)/phi, $ which simplifies into $ - n nu(phi) + ov(Hess)_phi (e_i, e_i) u^perp/phi - (ov(Hess)_phi (pi(X^perp), nu))/phi. $ Now this Hessian term is almost the ambient Laplacian of $phi$, so we can rewrite this as $ - n nu(phi) + ov(Delta) phi u^perp/phi - u^perp (ov(Hess)_phi (nu,nu))/phi - (ov(Hess)_phi (pi(X^perp), nu))/phi, $ but now since $X^perp = pi(X^perp) + u^perp nu$ we further simplify this into $ - n nu(phi) + ov(Delta) phi u^perp/phi - (ov(Hess)_phi (X^perp, nu))/phi. $ and then we use @eqn-ric_orthogonal-phi to get $ - n nu(phi) + ov(Delta) phi u^perp/phi - ip(X^perp, nu)(ov(Hess)_phi (cal(N)^perp, cal(N)^perp))/phi \ = - n nu(phi) + u^perp/phi (ov(Delta) phi - (ov(Hess)_phi (cal(N)^perp, cal(N)^perp))). $ This form allows us to use @eqn-ric_x to get $ - n nu(phi) - ov(Ric)(X^perp, nu). $<u_perp_step_3> Next for the second term of @eqn-u_perp_step_1 we get $ ov(nabla)_i (h_(i j) ip(X^perp, e_j)) &= (ov(nabla)_i h_(i j)) ip(X^perp, e_j) + h_(i j) ip(ov(nabla)_i X^perp, e_j) + h_(i j) ip(X^perp, ov(nabla)_i e_j) \ & = (ov(nabla)_i h_(i j)) ip(X^perp, e_j) + h_(i j) (phi ip(e_i, e_j) + ip(psi^perp (e_i), e_j)) \ &- h_(i j) h_(i j) ip(X^perp, nu). $ Since $h_(i j)$ is symmetric the third term here vanishes and so we are left with $ ov(nabla)_i (h_(i j) ip(X^perp, e_j)) &= (ov(nabla)_i h_(i j)) ip(X^perp, e_j) + phi H - |A|^2 u^perp. 
$<u_perp_step_4> Now plugging @eqn-u_perp_step_3 and @eqn-u_perp_step_4 into @eqn-u_perp_step_1 gives us $ Delta u^perp = - n nu(phi) - ov(Ric) (X^perp, nu) + (ov(nabla)_i h_(i j)) ip(X^perp, e_j) + phi H - |A|^2 u^perp $<u_perp_step_5> Now we can use @lemma-codazzi to get $ (ov(nabla)_i h_(i j)) ip(X^perp, e_j) &= (ov(Rm)_(j i i nu) + nabla_j h_(i i))ip(X^perp, e_j) = (ov(Ric)(e_j, nu) + nabla_j H)ip(X^perp, e_j) \ &= ov(Ric)(pi(X^perp), nu) + ip(nabla H, X^perp) \ &= ov(Ric)(X^perp, nu) - u^perp ov(Ric)(nu,nu) + ip(nabla H, X^perp) $ which we can plug back into @eqn-u_perp_step_5 to get $ Delta u^perp = - n nu(phi) - u^perp ov(Ric)(nu,nu) + ip(nabla H, X^perp) + phi H - |A|^2 u^perp $ Now we deal with $u^top$. #claim[ We have $ Delta u^top = - u^top ov(Ric)(nu,nu) + ip(nabla H, X^top) - |A|^2 u^top $ ] Again we compute from the definitions. We will use $X^top$ instead of $Xi(t) X^top$, since that does not change any of the calculations and both are Killing vector fields: $ Delta u^top &= nabla_i nabla_i ip(X^top, nu) = nabla_i (ip(ov(nabla)_i X^top, nu) + ip(X^top, ov(nabla)_i nu)). $ Now for the first term $e_i$ is orthogonal to $nu$ and thus it simplifies to $ip(psi^top (e_i), nu)$, so we continue computing, $ Delta u^top = nabla_i (ip(psi^top (e_i), nu) + ip(X^top, h_(i j) e_j)). $<u_top_step_1> Let us now deal with the first term, using @cor-psi_antisym: $ nabla_i (ip(psi^top (e_i), nu)) &= nabla_i ((2 AntiSym(ov(nabla)||X^top||^flat times.circle X^(top flat))(e_i,nu))/(||X^top||)) \ &= (ov(nabla)_i 2 AntiSym(ov(nabla)||X^top||^flat times.circle X^(top flat))(e_i,nu))/(||X^top||) \ &+ med (2 AntiSym(ov(nabla)||X^top||^flat times.circle X^(top flat))(ov(nabla)_i e_i,nu))/(||X^top||) \ &+ med (2 AntiSym(ov(nabla)||X^top||^flat times.circle X^(top flat))(e_i,ov(nabla)_i nu))/(||X^top||) \ &- med (2 AntiSym(ov(nabla)||X^top||^flat times.circle X^(top flat))(e_i,nu))/(||X^top||^2)ov(nabla)_i||X^top||.
$ We notice that since $ov(nabla)_i e_i = - H nu$ the second term will have two $nu$ inputs into an anti-symmetrization, making it vanish. Similarly, since $ov(nabla)_i nu = h_(i j) e_j$ the third term will have the inputs $(e_i,e_j)$ symmetrized by $h_(i j)$ and thus will also vanish. We are thus left with $ nabla_i (ip(psi^top (e_i), nu)) \ = 2 ((ov(nabla)_i AntiSym(ov(nabla)||X^top||^flat times.circle X^(top flat)))(e_i,nu))/(||X^top||) - ip(psi^top (e_i), nu) (ov(nabla)_i||X^top||)/(||X^top||). #h(2em) $ <u_top_step_2> Now we compute the covariant derivative of the anti-symmetrization to be $ 2 AntiSym( ov(nabla)||X^top||^flat times.circle (psi^top (e_i))^flat ) + 2 AntiSym( ov(nabla)_i ov(nabla)||X^top||^flat times.circle X^(top flat) ), $ and when we plug this back into @eqn-u_top_step_2 we get $ ((ov(nabla)_i||X^top||) ip(psi^top (e_i), nu))/(||X^top||) + ov(Hess)_(||X^top||) (e_i, e_i) u^top/(||X^top||) - ov(Hess)_(||X^top||) (e_i, nu) ip(X^top,e_i)/(||X^top||) \ - ip(psi^top (e_i), nu) (ov(nabla)_i (||X^top||))/(||X^top||), $ which simplifies into $ ov(Hess)_(||X^top||) (e_i, e_i) u^top/(||X^top||) - (ov(Hess)_(||X^top||) (pi(X^top), nu))/(||X^top||). $ Now this Hessian term is almost the ambient Laplacian of $||X^top||$, so we can rewrite this as $ ov(Delta)||X^top||u^top/(||X^top||) - u^top (ov(Hess)_(||X^top||) (nu,nu))/(||X^top||) - (ov(Hess)_(||X^top||) (pi(X^top), nu))/(||X^top||), $ but now since $X^top = pi(X^top) + u^top nu$ we further simplify this into $ ov(Delta)||X^top||u^top/(||X^top||) - (ov(Hess)_(||X^top||) (X^top, nu))/(||X^top||). $ Then we use @eqn-ric_orthogonal-phi to get $ ov(Delta)||X^top||u^top/(||X^top||) - ip(X^top, nu)(ov(Hess)_(||X^top||) (cal(N)^top, cal(N)^top))/(||X^top||) \ = u^top/(||X^top||) (ov(Delta)||X^top|| - ov(Hess)_(||X^top||) (cal(N)^top, cal(N)^top)). $ Now we use @eqn-ric_x to get $ - ov(Ric)(X^top, nu).
$<u_top_step_3> Next for the second term of @eqn-u_top_step_1 we get $ ov(nabla)_i (h_(i j) ip(X^top, e_j)) & = (ov(nabla)_i h_(i j)) ip(X^top, e_j) + h_(i j) ip(ov(nabla)_i X^top, e_j) + h_(i j) ip(X^top, ov(nabla)_i e_j) \ & = (ov(nabla)_i h_(i j)) ip(X^top, e_j) + h_(i j) (ip(psi^top (e_i), e_j)) - h_(i j) h_(i j) ip(X^top, nu). $ Since $h_(i j)$ is symmetric the middle term here vanishes and so we are left with $ ov(nabla)_i (h_(i j) ip(X^top, e_j)) &= (ov(nabla)_i h_(i j)) ip(X^top, e_j) - |A|^2 u^top. $<u_top_step_4> Now plugging @eqn-u_top_step_3 and @eqn-u_top_step_4 into @eqn-u_top_step_1 gives us $ Delta u^top = - ov(Ric) (X^top, nu) + (ov(nabla)_i h_(i j)) ip(X^top, e_j) - |A|^2 u^top $<u_top_step_5> Now we can use @lemma-codazzi to get $ (ov(nabla)_i h_(i j)) ip(X^top, e_j) &= (ov(Rm)_(j i i nu) + nabla_j h_(i i))ip(X^top, e_j) = (ov(Ric)(e_j, nu) + nabla_j H)ip(X^top, e_j) \ &= ov(Ric)(pi(X^top), nu) + ip(nabla H, X^top) \ &= ov(Ric)(X^top, nu) - u^top ov(Ric)(nu,nu) + ip(nabla H, X^top) $ which we can plug back into @eqn-u_top_step_5 to get $ Delta u^top = - u^top ov(Ric)(nu,nu) + ip(nabla H, X^top) - |A|^2 u^top $ which proves the claim. Combined with the previous claim we get that $ Delta u = - n nu(phi) - u ov(Ric)(nu,nu) + ip(nabla H, X) + phi H - |A|^2 u $ and then combining with @eqn-u_diff_t we get $ L u &= n phi^2 + Xi'(t) u^top - n X(phi) + 2 n u nu (phi) + H ip(X,nabla u) + u^2 ov(Ric) (nu,nu) \ &- 2 phi u H + |A|^2 u^2 \ &= n (phi^2 - X^perp (phi) - Xi(t) X^top (phi)) + Xi'(t) u^top + 2 n u nu (phi) + H ip(X,nabla u) + u^2 ov(Ric) (nu,nu) \ &- 2 phi u H + |A|^2 u^2 \ &= n Lambda phi^3 - n Xi(t) X^top (phi) + Xi'(t) u^top + 2 n u nu (phi) + H ip(X,nabla u) + u^2 ov(Ric) (nu,nu) \ &- 2 phi u H + |A|^2 u^2. $ This finishes the proof. ] Now we can start to analyse this evolution equation to get results about $u$.
#corollary[ There is a constant $epsilon > 0$ such that for any $t in [0,T)$ $ min_(p in M_t) u(p,t) >= epsilon/(1 + max_(p in M_t) |H(p,t)|) $ ]<cor-u_inverse_bound> #proof[ At a minimum point of $u$ we have that $nabla u$ vanishes and so we get $ L u = n (Lambda phi^3 - Xi(t) X^top (phi)) + Xi'(t) u^top + 2 n u nu(phi) + u^2(ov(Ric)(nu,nu)) - 2 phi u H + |A|^2 u^2, $ now recall that all ambient objects are uniformly bounded for all time, so there exists a constant $M$ such that $ L u >= n (Lambda phi^3 - Xi(t) X^top (phi)) + Xi'(t) u^top - u M - 2 phi u H + |A|^2 u^2. $ By our assumptions both $Lambda phi^3$ and $Lambda phi^3 - Xi(t) X^top (phi)$ are positive, thus any convex combination of them is positive; since these are ambient quantities they are uniformly bounded away from zero, and so $ n (Lambda phi^3 - Xi(t) X^top (phi)) >= epsilon_1 > 0 $ for some $epsilon_1$. We thus have $ L u >= epsilon_1 + Xi'(t) u^top - u M - 2 phi u H + |A|^2 u^2. $ Now we are free to pick $T_0$ such that $(||X^top||)/ T_0 <= epsilon_1/2$. Then we can use the Newton-Maclaurin inequality to get $ L u >= epsilon_2 + Xi'(t) u^top - u M - 2 phi u H + H^2 u^2 / n. $ Now assume that $u < epsilon/(1 + max |H(p, t)|)$, then $ L u >= epsilon_2 + Xi'(t) u^top - epsilon M - 2 phi epsilon $ so by setting $epsilon < epsilon_2/(2(M + 2 phi))$ we get $ L u >= epsilon_3 + Xi'(t) u^top $ and by picking $T_0$ so that $|Xi'(t)| < 1/(||X^top||)$ we get $ L u >= epsilon_4. $ Thus by using @prop-max_principle we get that $ u <= epsilon/(1 + max_(p in M_t) |H(p,t)|) quad "implies" quad min_(p in M_t) u(p,t) >= min_(p in M_0) u(p,0) $ and so by choosing $epsilon$ appropriately we get the desired result. ] Now that we can bound $u$ using $H$, we just need to show that $H$ grows sufficiently slowly, to guarantee the flow exists until $t = T_0$.
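The Newton-Maclaurin inequality invoked above reduces here to $|A|^2 >= H^2 / n$, which for the principal curvatures $kappa_1,...,kappa_n$ (with $H = kappa_1 + ... + kappa_n$ and $|A|^2 = kappa_1^2 + ... + kappa_n^2$) is just the Cauchy-Schwarz inequality. The following sketch is a purely illustrative numerical check of this fact, not part of the proof:

```python
import random

def newton_maclaurin_gap(kappas):
    # |A|^2 - H^2/n for principal curvatures kappas,
    # where H = sum(kappas) and |A|^2 = sum of squares
    n = len(kappas)
    H = sum(kappas)
    A2 = sum(k * k for k in kappas)
    return A2 - H * H / n

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 8)
    kappas = [random.uniform(-5.0, 5.0) for _ in range(n)]
    assert newton_maclaurin_gap(kappas) >= -1e-12

# equality holds exactly at umbilical points, where all curvatures agree
assert abs(newton_maclaurin_gap([2.0] * 5)) < 1e-12
```

The equality case, all $kappa_i$ equal, is exactly the umbilical situation that the later convergence argument isolates.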
#pagebreak(weak: true) == Evolution Equation for $H$ #proposition[ The evolution equation for $H$ is $ L H &= 2 ip(nabla H, nabla u) + H ip(X, nabla H) - phi (H^2 - n |A|^2) \ &+ n(ov(Hess)_phi (nu,nu) - ov(Hess)_phi (cal(N)^perp,cal(N)^perp)) + med n phi (ov(Ric)(cal(N)^perp,cal(N)^perp) - ov(Ric)(nu,nu)) $ ] #proof[ We use @cor-H_evolution to get $ diff_t H = - Delta (n phi - H u) - (n phi - H u)(|A|^2 + ov(Ric)(nu,nu)) $ then simplifying this we get $ L H = - n Delta phi + 2 ip(nabla H, nabla u) + H Delta u - (n phi - H u)(|A|^2 + ov(Ric)(nu,nu)) $ then using the results of @prop-u_evolution we get $ L H &= - n Delta phi + 2 ip(nabla H, nabla u) - H n nu(phi) - H u ov(Ric)(nu,nu) + H ip(nabla H, X) + phi H^2 \ & - med H|A|^2 u - (n phi - H u)(|A|^2 + ov(Ric)(nu,nu)) \ &= - n Delta phi + 2 ip(nabla H, nabla u) - H n nu(phi) + H ip(nabla H, X) + phi H^2 - n phi(|A|^2 + ov(Ric)(nu,nu)) \ &= - n Delta phi + 2 ip(nabla H, nabla u) - H n nu(phi) + H ip(nabla H, X) + phi (H^2 - n|A|^2) - n phi ov(Ric)(nu,nu). 
$ Now we use @prop-h_props to get $ L H &= - n ov(Delta) phi + n ov(Hess)_phi (nu,nu) + 2 ip(nabla H, nabla u) + H ip(nabla H, X) + phi (H^2 - n|A|^2) \ &- med n phi ov(Ric)(nu,nu) $ but now we use @eqn-ric_x to get $ L H &= n (phi ov(Ric)(cal(N)^perp,cal(N)^perp) - ov(Hess)_phi (cal(N)^perp, cal(N)^perp)) + n ov(Hess)_phi (nu,nu) + 2 ip(nabla H, nabla u)\ &+ H ip(nabla H, X) + phi (H^2 - n|A|^2) - med n phi ov(Ric)(nu,nu) \ &= 2 ip(nabla H, nabla u) + H ip(nabla H, X) + phi (H^2 - n|A|^2) + n phi (ov(Ric)(cal(N)^perp,cal(N)^perp) - ov(Ric)(nu,nu)) \ &+med n (ov(Hess)_phi (nu,nu) - ov(Hess)_phi (cal(N)^perp, cal(N)^perp)) $ ] #corollary[ There are constants $a,b > 0$ such that for any $t in [0,T)$ $ max_(p in M_t) H(p,t) <= a + b t $ ]<cor-H_linear_bound> #proof[ At a maximum point of $H$ we have $nabla H = 0$, hence the evolution equation simplifies to $ L H &= phi (H^2 - n|A|^2) + n phi (ov(Ric)(cal(N)^perp,cal(N)^perp) - ov(Ric)(nu,nu)) \ &+med n (ov(Hess)_phi (nu,nu) - ov(Hess)_phi (cal(N)^perp, cal(N)^perp)), $ but then again using the Newton-Maclaurin inequality we get $ L H <= n phi (ov(Ric)(cal(N)^perp,cal(N)^perp) - ov(Ric)(nu,nu)) + n (ov(Hess)_phi (nu,nu) - ov(Hess)_phi (cal(N)^perp, cal(N)^perp)). $ Now on the right hand side these are all ambient objects and thus are uniformly bounded by some constant $M$, hence we have $ L H <= M. $ Hence by @prop-max_principle we get that $H <= max_(p in M_0) H(p,0) + M t$, proving the desired result. ] With this linear bound we get an inverse linear lower bound on $u$. #corollary[ There exists $epsilon > 0$ such that $ u(p,t) >= epsilon/(1 + T_0) $ for all $t in [0,T)$ and $p in M_t$. ]<cor-u_const_bound> #proof[ Combining @cor-H_linear_bound with @cor-u_inverse_bound we immediately get the desired result. ] == Existence and Convergence We now have everything we need to prove the flow exists until $t = T_0$.
#proposition[ If a surface $M$ which is star-shaped with respect to $X^perp + X^top$ admits a flow $M_t$, then $M_t$ remains star-shaped with respect to $X(t)$ for all $t in [0,T_0)$; furthermore, if the flow exists at $t = T_0$ then at that time the surface is star-shaped with respect to $X^perp$. ] #proof[ We have shown that $u$ is uniformly bounded from below by a positive constant for $t in [0,T_0)$, hence this bound also holds in the limit $t = T_0$. ] We now shift our focus to rewriting this flow as a flow of functions instead of hypersurfaces, which will allow us to apply the results of @sect-PDE to it. We want to write our hypersurface as a graph over an integral hypersurface $S_lambda$ where $ S_lambda = { p in N : lambda(p) = lambda}, $ and since this is not a warped product space we need to be careful with this construction. We will fix a starting hypersurface $M$, and set $ lambda_0 = min_(p in M) lambda(p) quad "and" quad lambda_1 = max_(p in M) lambda(p), $ and we want to construct nice coordinates on $ D := { p in N : lambda_0 <= lambda(p) <= lambda_1 } $<domain_def> We will start by proving that $D$ is compact, allowing us to lower bound important quantities uniformly. #proposition[ For any $lambda_1 > lambda_0 > 0$ in the image of $lambda$, the subset $D$ defined by @eqn-domain_def is compact. ]<prop-compact_region> #proof[ First we will show that $D$ is a fiber bundle over $[lambda_0, lambda_1]$. To see this fix $lambda in [lambda_0, lambda_1]$ and set $S_lambda = { p in N : lambda(p) = lambda }$; it is an integral hypersurface of $X^perp$ and is compact. Then consider the flow of $X^perp$; since $S_lambda$ is compact we can pick $epsilon > 0$ such that the flow of $X^perp$ exists for $t in [-epsilon, epsilon]$ for all points $p in S_lambda$. Now the image of $S_lambda$ under this flow is another integral hypersurface; this is because $X^perp$ is a conformal vector field, so its flow preserves both $X^perp$ itself and orthogonality.
Hence the flow of $X^perp$ fixes its orthogonal distribution $cal(D)(X^perp)$, and thus also its foliation. Hence for some $lambda' < lambda < lambda''$ we can reparametrize the flow of $X^perp$ to get the homeomorphism $ cal(F) : S_lambda times [lambda', lambda ''] -> lambda^(-1)([lambda', lambda'']), $ and thus $D$ is a fiber bundle. We now prove a lemma regarding compactness of fiber bundles. #lemma[ If $(E,B,F)$ is a fiber bundle with $E,B,F$ metric spaces and $B,F$ compact, then $E$ is also compact. ] #proof[ Let $x_n$ be a sequence of points in $E$; since $pi(x_n)$ is a sequence of points in $B$ it has a convergent subsequence $pi(x_(n_k))$. Assume that it converges to $x$, then let $U$ be a neighborhood of $x$ which trivializes the bundle $E$, in the sense that $pi^(-1)(U) = U times F$. Consider a precompact neighborhood $V seq U$ of $x$, then since $pi(x_(n_k)) -> x$ we know that $pi(x_(n_k))$ is eventually always contained in $V$, hence $x_(n_k)$ is eventually always contained in $pi^(-1)(ov(V)) = ov(V) times F$. Now there is a subsequence of $x_(n_k)$ which is entirely contained in $ov(V) times F$, which is a product of compact sets hence compact, so $x_(n_k)$ has a convergent subsequence in $E$, which is also a convergent subsequence of $x_n$. Hence $E$ is compact. ] Now in our situation we have $D$ as a fiber bundle whose base $[lambda_0, lambda_1]$ and fiber $S_lambda$ are both compact, hence $D$ is compact, which proves the proposition.
] We can now use these lower bounds to construct a nice coordinate system for $D$; we will do this by flowing the surface $S_(lambda_0)$ to cover the entirety of $D$. For brevity we will shorten $S_(lambda_0)$ to $S$. #proposition[ For any $lambda in [lambda_0, lambda_1]$ and any point $p in S_lambda$ there exists an integral curve of $X(t)$ going through $p$ which intersects $S$ at exactly one point. ] #proof[ First we prove existence. Consider the flow of $-X(t)$ acting on $p$; let us call this flow $cal(F)$. Notice that $ diff_t lambda(cal(F)(p,t)) = - 2 Lambda < 0 $ and so this function is decreasing. Moreover it must reach $lambda_0$ at some point, since otherwise $cal(F)(p,t)$ would remain forever in $D$, where $- 2 Lambda < -epsilon < 0$ for some positive $epsilon$, which is a contradiction. To show uniqueness assume that the flow $cal(F)(p,t)$ intersects $S$ at more than one point. Then we have $lambda(cal(F)(p,t_1)) = lambda(cal(F)(p,t_2))$ and so by Rolle's theorem we have $diff_t lambda(cal(F)(p,t_3)) = 0$ for some $t_3$ between $t_1$ and $t_2$, which contradicts the fact that $- 2 Lambda < 0$. ] Using the unique intersection point we found above as a 'projection map' onto $S$ we get a diffeomorphism $F_t : D -> S times [lambda_0, lambda_1]$. Note that this diffeomorphism depends on $t$ because $X(t)$ depends on $t$. We can now convert our hypersurface flow into a flow of functions. #proposition[ A hypersurface $M$ contained in $D$ is star-shaped with respect to $X(t)$ if and only if it can be identified using $F_t$ with the graph of a smooth function $f: S -> [lambda_0, lambda_1]$.
]<prop-graph> #proof[ First assume that $M$ can be identified with the graph of $f$, and let $F : S -> S times [lambda_0, lambda_1]$ be the embedding of the graph $ F : y |-> (y, f(y)). $ One can easily compute that for $v in T_p S$ $ F_* v = hat(v) + v(f) diff_lambda $ where $hat(v)$ is the extension of $v$ over the integral curve containing $p$ through the flow of $X(t)$. Then let $v$ be the unit vector which maximizes the length of the orthogonal projection of $X(t)$ onto $F_* v$; this length is $ ip(v + v(f) diff_lambda, X(t))/(||v + v(f) diff_lambda||). $ Now orthogonally decompose $v$ as $v = a diff_lambda + b z$ where $z$ is a unit vector orthogonal to $diff_lambda$ and by extension also $X(t)$. We then have $ ip(a diff_lambda + b z + v(f) diff_lambda, X(t))/(||a diff_lambda + b z + v(f) diff_lambda||) &= ip(b z + (a + v(f)) diff_lambda, X(t))/(||b z + (a + v(f)) diff_lambda||) \ &= ((a+v(f))ip(diff_lambda, X(t)))/(sqrt(b^2 + (a+v(f))^2||diff_lambda||^2)) \ &= ((a+v(f)))/(sqrt(b^2/(||diff_lambda||^2) + (a+v(f))^2))||X(t)||. $ Now since we are on a compact surface we have uniform bounds on $v(f)$, and $b$ is non-zero (since $v$ is not collinear with $X(t)$), so we have $ ((a+v(f)))/(sqrt(b^2/(||diff_lambda||^2) + (a+v(f))^2)) < 1 - epsilon $ for some $epsilon > 0$. Now from this we get that the projection $pi$ of $X(t)$ onto $M$ satisfies $ ||pi(X(t))||^2 < ||X(t)||^2 (1 - epsilon') $ and so we have $ u^2 = ||X(t)||^2 - ||pi(X(t))||^2 > ||X(t)||^2epsilon' $ and so up to a change of orientation our surface is star-shaped. On the other hand assume that the surface is star-shaped. First we prove that it intersects every integral curve of $X(t)$ at most once: if it were to intersect some integral curve twice, then $ip(nu,X(t))$ would be positive at one of the two intersection points and negative at the other.
But this directly contradicts the fact that it is star-shaped. We thus have an injective map $pi : M -> S$, since all the integral curves intersect $S$ at exactly one point. It will be enough to show that $pi$ is also a diffeomorphism, as then its inverse will exactly be the embedding of the graph of a function. In fact it suffices to show it is a local diffeomorphism, since an injective local diffeomorphism is a diffeomorphism onto its image. Now we check that this is indeed the case: fix a point $p in M$, take an orthonormal frame $e_1,...,e_n$ of $S$ centered at $pi(p)$, and extend it to vector fields $hat(e)_1,...,hat(e)_n$ of $D$ through the flow of $X(t)$. Now at the point $p$ consider the projection $P : T_p N -> span(hat(e)_1,...,hat(e)_n)$ induced by the frame $hat(e)_1,...,hat(e)_n,X(t)$ (i.e. the projection with kernel $span(X(t))$). If we restrict this map to $T_p M$ the kernel of this restriction is zero, since anything in the kernel must be collinear with $X(t)$ and thus cannot be in $T_p M$, as that would mean $ip(X(t), nu) = 0$ which contradicts star-shapedness. Now we see that this restriction is precisely the differential of $pi$ and it is a linear isomorphism, thus $pi$ is a local diffeomorphism. ] In fact, by the first argument in the proof above we get that lower bounds on $u$ are equivalent to upper bounds on $||tilde(nabla) f||$ where $tilde(nabla)$ is the connection on $S$. #corollary[ There are functions $M(epsilon)$ and $epsilon(M)$ such that $ sup_(S) thin ||tilde(nabla) f|| < M => inf_S u > epsilon(M) quad "and" \ inf_(S) u > epsilon => sup_(S) thin ||tilde(nabla) f|| < M(epsilon) $ ]<cor-derivative_bound> We will now construct the flow in the following way: first we use @prop-graph to identify our initial surface $M_0$ with the graph of the function $lambda$ over $S$.
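As an aside, the correspondence in @cor-derivative_bound can be seen concretely in a toy Euclidean analogue: for a radial graph $r(theta)$ over the unit circle in $RR^2$, with $X$ the position vector field, a direct computation gives $u = ip(X, nu) = r^2/sqrt(r^2 + r'^2)$, so $u/||X|| = 1/sqrt(1 + (r'/r)^2)$ and a gradient bound $|r'/r| <= M$ is exactly a lower bound $u/||X|| >= 1/sqrt(1 + M^2)$. The sketch below checks this numerically; the curve and all names are illustrative stand-ins, not the actual ambient data of this chapter:

```python
import math

def support_ratio(r, dr, theta):
    # point X on the curve and the tangent vector dX/dtheta
    x, y = r(theta) * math.cos(theta), r(theta) * math.sin(theta)
    tx = dr(theta) * math.cos(theta) - r(theta) * math.sin(theta)
    ty = dr(theta) * math.sin(theta) + r(theta) * math.cos(theta)
    norm_t = math.hypot(tx, ty)
    # the unit normal is the unit tangent rotated by 90 degrees
    nx, ny = ty / norm_t, -tx / norm_t
    u = x * nx + y * ny              # the support function u = <X, nu>
    return u / math.hypot(x, y)      # u / ||X||

r = lambda t: 2.0 + 0.5 * math.sin(3 * t)    # a star-shaped curve
dr = lambda t: 1.5 * math.cos(3 * t)

thetas = [k * 0.001 for k in range(6284)]
M = max(abs(dr(t) / r(t)) for t in thetas)
lower = 1.0 / math.sqrt(1.0 + M * M)
# the gradient bound |r'/r| <= M forces u/||X|| >= 1/sqrt(1 + M^2)
assert all(support_ratio(r, dr, t) >= lower - 1e-12 for t in thetas)
```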
Then $M_t$ solving @eqn-flow_def is equivalent to it being the graph of a function $f$ solving $ cases(L f = - 2 Lambda n phi u^top - 2 u ip(nabla Lambda, X^perp)\,, f(x,0) = lambda(x)\,, ) $<pde_formulation> up to an appropriate diffeomorphism that handles points changing which integral curve they are on (this is alright as all normal flows are diffeomorphism invariant). This identification is through $G : (x,t,lambda) |-> F_t (x, lambda)$. Next we solve @eqn-pde_formulation purely in function space. We then apply appropriate diffeomorphisms to convert this solution to a solution of our normal flow. #proposition("Short Time Existence")[ For any star-shaped hypersurface $M_0$, the normal flow with velocity $n phi - H u$ with initial condition $M_0$ exists for some time interval $[0,T)$. ]<prop-flow_short_time> #proof[ By the process described above it is enough to show that @eqn-pde_formulation has a solution for some time interval $[0,T)$. To see this we need to rewrite all geometric objects of the PDE in terms of $f$ and its derivatives. We will work in normal coordinates $x^1,...,x^n$ on $S$, in which the induced metric is given by $ g_(i j) = ip(F_* e_i, F_* e_j) = ip(hat(e)_i, hat(e)_j) + diff_j f ip(hat(e)_i, diff_lambda) + diff_i f ip(hat(e)_j, diff_lambda) + (diff_i f) (diff_j f) ip(diff_lambda, diff_lambda). $ Notice that all 4 inner products in the expression are smooth functions of $(x,lambda)$ and so the entries are smooth functions of $x,f, D f$. We then immediately get that $ g^(i j)(x, f, D f), quad det(g)(x, f, D f) $ are both also smooth functions of their inputs.
Now by the Gram-Schmidt method we get that the normal vector to the graph can be given by $ nu = X(t) - sum_i ip(hat(e)_i + diff_i f diff_lambda, X(t))/ip(hat(e)_i + diff_i f diff_lambda, hat(e)_i + diff_i f diff_lambda) (hat(e)_i + diff_i f diff_lambda) $ appropriately normalized, which is once again a smooth function of $x,f, D f$, hence $u^top$ and $u$ are also smooth functions of $x, f, D f$. Finally all ambient objects like $X^perp, Lambda, phi$ are all smooth functions of $x,lambda$ and hence $x,f$. We can thus rewrite @eqn-pde_formulation as $ diff_t f = - u(x,f,D f) Delta f + B(x,f,D f). $<simple_pde_formulation> At first glance this seems like a standard parabolic PDE, but in fact our Laplacian is with respect to the induced metric, which depends on the gradient of $f$ in a complicated manner. To counteract this we will use a technique called DeTurck's trick, which will allow us to exploit the diffeomorphism invariance of our geometric flow to sidestep this complexity. We recall that in coordinates the Laplacian takes the form $ Delta f = g^(i j) (diff_i diff_j f - Gamma^k_(i j) diff_k f) $ where both the inverse metric and the Christoffel symbols depend on the induced metric. Now let us fix some other metric, for example the metric on $M_0$; we will call this metric $tilde(g)$ and its Christoffel symbols $tilde(Gamma)$. We recall that the object $ Gamma^k_(i j) - tilde(Gamma)^k_(i j) $ is actually coordinate independent and is a tensor. Hence we can define the time dependent vector field $ V^k (t) = g^(i j)(t) (Gamma^k_(i j) (t) - tilde(Gamma)^k_(i j)). $ We will now apply a time-dependent diffeomorphism $Phi_t$ to our flow to see how it changes.
Assume that $f$ solves @eqn-simple_pde_formulation, then we get that $h := f compose Phi_t^(-1)$ solves $ diff_t (h compose Phi_t) = - u(Phi_t (x),h compose Phi_(t), D (h compose Phi_t)) Delta (h compose Phi_t) + B(Phi_t (x), h compose Phi_t, D (h compose Phi_t)) $ Now recall that the right hand side here is the geometric quantity $(n phi - H u) ip(ov(nabla) lambda, nu)$, hence it does not depend on the parametrization, and so we can rewrite it as $ (- u(x,h, D h) Delta h + B(x, h, D h)) compose Phi_t $ and so $h$ solves $ diff_t (h compose Phi_t) = (- u(x,h, D h) Delta h + B(x, h, D h)) compose Phi_t. $ But now we can simplify the left hand side by the chain rule to get $ (diff_t h) compose Phi_t + (d h compose Phi_t)(diff_t Phi_t) = (- u(x,h, D h) Delta h + B(x, h, D h)) compose Phi_t. $ Next we assume that $diff_t Phi_t = (u (x, h, D h) V (t)) compose Phi_t$, which gives us $ (diff_t h) compose Phi_t + (u (x, h, D h) d h(V (t))) compose Phi_t = (- u(x,h, D h) Delta h + B(x, h, D h)) compose Phi_t \ diff_t h + u (x, h, D h) d h(V (t)) = - u(x,h, D h) Delta h + B(x, h, D h), $ and from the definition of $V^k (t)$ we get $ diff_t h = - (u g^(i j))(x, h, D h)(diff_i diff_j h - tilde(Gamma)^k_(i j) diff_k h) + B(x, h, D h). $ Now importantly in this PDE $tilde(Gamma)$ is fixed, and the coefficients $(u g^(i j))(x, h, D h)$ are uniformly positive definite, since they are uniformly positive definite at $t = 0$ and remain so for some non-zero time interval. Hence by @thrm-pde_short_time we get that this PDE does have a solution for some time interval $[0,T)$. Once we have this solution $h$, we can use it to construct the diffeomorphism $Phi_t$ by simply considering the flow of the time dependent vector field $u(x, h, D h) V(t)$, which exists for all time by standard ODE theory. Then $h compose Phi_t$ is a solution to @eqn-simple_pde_formulation, which proves short time existence.
] Now that we have shown short-time existence we can use @cor-u_const_bound along with @cor-derivative_bound to get that as long as $T <= T_0$ $ ||tilde(nabla) f|| < M $ for some constant $M$ and all $t in [0,T)$. This, along with @thrm-nash_moser, gives us estimates on $||f||_(C^(1+r))$, which then together with @thrm-pde_short_time gives us a stronger existence statement. #corollary[ For any star-shaped hypersurface $M_0$, the normal flow with velocity $n phi - H u$ with initial condition $M_0$ exists on $[0,T_0]$. ] #proof[ We will again first pass to the function space and consider the evolution of the graph of the function $f$. Existence on $[0,T_0)$ is immediate by @thrm-pde_short_time along with @thrm-nash_moser. Then to get existence at $t = T_0$ we will take a sequence $t_n -> T_0$ with $f(t_n, dot) -> g$ and use the Arzelà–Ascoli theorem to prove $g$ is smooth. Bounds on $diff_t f$ then imply that $f$ cannot oscillate indefinitely in $t$, and so $g$ is independent of the choice of sequence $t_n$. To use Arzelà–Ascoli like this, we will need uniform bounds on all derivatives of $f$; here we will use @thrm-pde_short_time once again. Note that by the second part of @thrm-pde_short_time we get that $ ||f(t, dot)||_(C^(2+r)) <= B(||f(t,dot)||_(C^(1+r))) $ and so by setting $r' = 1+r$ we can repeat this process to get bounds on higher and higher derivatives of $f$, which completes the proof. ] We now know that the flow survives until $t = T_0$. When the flow reaches this point we stop it and change the flow by removing the tangential component $X^top$ entirely, only leaving $X^perp$. With this simpler flow Li and Pan @jiayuIsoperimetricProblemRiemannian2023 showed that $u$ is uniformly bounded from below for _all_ time and thus our results prove that the flow exists for all $t in [0,infinity)$. #proposition[ The flow described in @head-setting exists for all time $t in [0,infinity)$. ] This proves the first condition of @thrm-flow_method.
We now want to show that the limit of this flow is precisely an integral hypersurface $S_lambda$. To do this we will use a trick where we take a limit of an entire interval of our flow. To be more precise, assume that $F: M times [0, infinity) -> N$ solves the flow, and consider the functions $F_n : M times [0,1] -> N$ defined by $ F_n (t, p) = F(n+t, p). $ These are all solutions to the flow and by Arzelà–Ascoli we can, after passing to a subsequence, get that $ F_n -> F_(infinity) $ for some function $F_(infinity) : M times [0,1] -> N$ which is also a solution to the flow. #proposition[ If $Q$ is a positive continuous geometric property of an embedding $F : M -> N$ such that $Q(F(t, dot))$ is non-increasing, then $Q(F_infinity (t, dot))$ is constant on $[0,1]$. ]<prop-const_geometry> #proof[ For any $t in [0,1]$ the sequence $Q(F_n (t, dot))$ is positive and non-increasing, thus its limit exists. Hence we have $ Q(F_infinity (t, dot)) = lim_(n -> infinity) Q(F_n (t,dot)) $ which after plugging in $t = 0,1$ gives us $ Q(F_infinity (0, dot)) = lim_(n -> infinity) Q(F_n (0,dot)) = lim_(n -> infinity) Q(F(n,dot)), \ Q(F_infinity (1, dot)) = lim_(n -> infinity) Q(F_n (1,dot)) = lim_(n -> infinity) Q(F(n+1,dot)), $ which immediately gives us that $Q(F_infinity (0, dot)) = Q(F_infinity (1, dot))$. But now $Q$ is non-increasing along the flow and thus $Q(F_infinity (t, dot))$ is constant. ] We now first apply this to the surface area $A(M_t)$. Since it is clearly continuous and non-increasing by @prop-vol_area_evolution, the above proposition implies that $F_infinity (t, dot)$ has constant surface area. Due to the variation formula for surface area we get that $ 0 = n/(n-1) integral_M u(ov(Ric)(cal(N)^perp,cal(N)^perp) - ov(Ric)(nu,nu)) dif S - 1/n integral_M u sum_(i < j) (kappa_i - kappa_j)^2 dif S $ and so we get that $kappa_i = kappa_j$ for all $i,j$, and so $F_infinity (t, dot)$ is totally umbilical.
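The pointwise conclusion uses the algebraic identity $sum_(i<j) (kappa_i - kappa_j)^2 = n|A|^2 - H^2$, so the second integrand vanishes precisely at umbilical points. A short numerical check of the identity (illustrative only):

```python
import random

def pair_spread(kappas):
    # sum over i < j of (kappa_i - kappa_j)^2
    return sum((a - b) ** 2
               for i, a in enumerate(kappas)
               for b in kappas[i + 1:])

random.seed(2)
for _ in range(500):
    n = random.randint(2, 7)
    kappas = [random.uniform(-3.0, 3.0) for _ in range(n)]
    H = sum(kappas)
    A2 = sum(k * k for k in kappas)
    # the identity: sum_{i<j} (kappa_i - kappa_j)^2 = n |A|^2 - H^2
    assert abs(pair_spread(kappas) - (n * A2 - H * H)) < 1e-9

# the spread vanishes exactly when all principal curvatures coincide
assert pair_spread([1.7] * 6) == 0.0
```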
Secondly we get that $ov(Ric)(cal(N)^perp, cal(N)^perp) = ov(Ric)(nu,nu)$; this is important due to the following lemma. #lemma[ Let $S$ be a symmetric bilinear form and $ip(dot,dot)$ an inner product on a vector space $V$. If $X$ is a unit vector with respect to $ip(dot,dot)$ such that $S(X,X)$ is minimal/maximal among all such unit vectors, then $X$ is an eigenvector of the map $ v |-> T(S(v, dot)) $ where $T$ is the isomorphism $V^* -> V$ induced by $ip(dot,dot)$. ] #proof[ Since $S$ is symmetric bilinear it has a basis of eigenvectors $e_i$ that is orthonormal with respect to $ip(dot,dot)$. Let $lambda_i$ be their eigenvalues, which we can assume are ordered in increasing order. For all unit vectors $v = a_1 e_1 + ... + a_n e_n$ we have $ ip(v,v) = a_1^2 + a_2^2 + ... + a_n^2 = 1 $ and we also have $ S(v,v) = lambda_1 a_1^2 + lambda_2 a_2^2 + ... + lambda_n a_n^2 <= lambda_n a_1^2 + lambda_n a_2^2 + ... + lambda_n a_n^2 = lambda_n $ and so since $S(e_n, e_n) = lambda_n$ we have that $e_n$ is a unit vector with $S(e_n,e_n)$ maximal. Now assume that $v$ is another unit vector with $S(v,v)$ also maximal, then we must have $S(v,v) = lambda_n$, so $ lambda_n = S(v,v) = lambda_1 a_1^2 + lambda_2 a_2^2 + ... + lambda_n a_n^2 = lambda_n + (lambda_1 - lambda_n) a_1 ^2 + ... + (lambda_(n - 1) - lambda_n) a_(n-1)^2 $ and since we know that $lambda_(i) - lambda_n$ is never positive and $a_(i)^2$ never negative, we must have that for each $i < n$, either $lambda_i = lambda_n$ or $a_i = 0$. We thus have that $v$ can be written as a sum of eigenvectors all with eigenvalue $lambda_n$ and thus $v$ is itself also an eigenvector. ] Applying this lemma to our situation we get that $nu$ is an eigenvector of the Ricci tensor since it is symmetric bilinear.
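The lemma can also be verified numerically: maximizing $S(v,v)$ over the unit sphere by projected gradient ascent, a procedure that never references eigen-decompositions, lands on an eigenvector. The $3 times 3$ matrix below is an arbitrary illustrative example; its eigenvalues are $3$ and $3 plus.minus sqrt(3)$:

```python
import math

# a fixed symmetric bilinear form S on R^3, with the standard inner product
# (so the isomorphism T of the lemma is the identity)
S = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 4.0]]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# maximize S(v, v) over unit vectors by gradient ascent projected to the sphere
v = normalize([1.0, 0.2, -0.3])
for _ in range(2000):
    grad = matvec(S, v)              # the gradient of S(v, v) is 2 S v
    v = normalize([a + 0.05 * g for a, g in zip(v, grad)])

lam = sum(a * b for a, b in zip(v, matvec(S, v)))   # the attained maximum S(v, v)
# the lemma: the maximizer is an eigenvector of v |-> T(S(v, .)) = S v
assert all(abs(sv - lam * a) < 1e-9 for sv, a in zip(matvec(S, v), v))
assert abs(lam - (3.0 + math.sqrt(3.0))) < 1e-9
```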
We can use this along with @lemma-codazzi to get that $ 0 = ov(Ric)(nu, e_i) = ov(Rm)(e_i,e_j,e_j,nu) = - (nabla_i h)(e_j,e_j) + (nabla_j h)(e_i,e_j) $ Now since $F_infinity (t, dot)$ is totally umbilical we know that $h_(i j) = H/n g_(i j)$ and so we have $ (nabla_k h)_(i j) = (nabla_k H)/n g_(i j) $ which when plugged into the equation above gives us $ 0 &= - (nabla_i h)(e_j,e_j) + (nabla_j h)(e_i,e_j) = - (nabla_i H)/n ip(e_j, e_j) + (nabla_j H)/n ip(e_i, e_j) = - nabla_i H + (nabla_i H)/n \ &= - nabla_i H (n-1)/n $ and so $nabla H = 0$, so we get that $H$ is constant along $F_infinity (t, dot)$ (though not necessarily constant in time). Next we apply @prop-const_geometry to $max_M lambda$; this is valid since it is also continuous and non-increasing due to @prop-lambda_evolution, and so we get that $max_M lambda(F_infinity (t, dot))$ is constant on $[0,1]$. #proposition[ At $t = 1/2$ there is at least one maximal point which is stationary, that is we have a point $p$ with $diff_t lambda(1/2,p) = 0$ and $lambda(1/2,p) = max_M lambda(F_infinity (1/2, dot))$. ] #proof[ For brevity we will write $lambda_max(t) := max_M lambda(F_infinity (t, dot))$. We prove the contrapositive: assume that there are no stationary maximal points. If some maximal point $p$ satisfies $ diff_t lambda(1/2,p) > 0 $ then we have $ lambda_max (1/2+h) >= lambda(1/2+h, p) = lambda(1/2,p) + h (diff_t lambda(1/2,p)) + O(h^2) $ and so for small enough $h > 0$ we get that $lambda_max (t)$ is not constant. Otherwise we have that $diff_t lambda(1/2, p) < 0$ for all maximal points $p$. Let $S$ denote the set of all maximal points; since $S$ is the preimage of $lambda_max (1/2)$ under a continuous function it is closed and thus compact in $M$. Hence there is a positive $epsilon$ such that $diff_t lambda(1/2, dot) < -epsilon$ on $S$. We can now define the open set $ U := (lambda(1/2, dot))^(-1) ((lambda_max (1/2) - epsilon, lambda_max (1/2) + epsilon)) $ which is clearly a neighborhood of $S$.
Now the set $M backslash U$ is closed, hence compact, hence on it $lambda(1/2, dot)$ achieves a maximum. But this maximum cannot be $lambda_max (1/2)$ since this set does not intersect $S$, and so on $M backslash U$ we have that $lambda(1/2, dot) < lambda_max (1/2) - epsilon'$ for some $epsilon' > 0$, and also on $M backslash U$ we have $diff_t lambda(1/2, dot) < B$ for some large positive $B$. But then we get that for $t = 1/2 + h$ we have $ lambda(1 / 2 + h, p) <= cases( lambda_max (1/2) - h epsilon + O(h^2) : p in U, lambda_max (1/2) - epsilon' + B h : p in M backslash U ) $ and so by picking small enough positive $h$ we get that $ lambda(1/2 + h, p) < lambda_max (1/2) $ everywhere and so we again get that $lambda_max (t)$ is not constant. By the contrapositive, since $lambda_max (t)$ is constant, at least one stationary maximal point exists. ] Now at a stationary maximal point we know that $ 0 = diff_t lambda(t, p) = 2 (n phi - H u) Lambda u $ and so since $Lambda$ and $u$ are positive we must have that $n phi - H u = 0$. Now again, at a maximum point, we know that $nu$ is collinear with the gradient of $lambda$, hence $nu = cal(N)^perp$ and $u = ||X^perp||$. This then gives us that $ 0 = n phi - H u = n phi - H||X^perp||= phi (n - H(||X^perp||)/phi) = phi (n - H lambda^(1/2)) $ and so since $phi$ is positive we get that $n - H lambda^(1/2) = 0$ and thus $H = n lambda^(-1/2)$ at a stationary maximal point. But now we recall that $H$ is constant along $M$ for any fixed time and so at $t = 1/2$ we get that $ H = n (lambda_max)^(-1/2). $ We can now calculate that at any point $p$ of $M$ at $t = 1/2$ we have $ n phi - H u >= n phi - n (lambda_max)^(-1/2)||X^perp||||nu|| = n phi (1 - (lambda/lambda_max)^(1/2)). $ and so the speed function is nowhere negative.
But now by @lemma-integral_identities we get that $ integral_M (n phi - H u) dif S = 0 $ and so, since it is nowhere negative, the speed function must be everywhere zero, and so we must have $lambda = lambda_max$ on all of $M$. Since $lambda_min$ is also constant on $[0,1]$ we get that $lambda$ is constant on all of $M$ for all time $[0,1]$ and thus the limit of the flow is an integral hypersurface $S_lambda$. #proposition[ The limit as $t -> infinity$ of $M_t$ is an integral hypersurface $S_lambda$. ]<prop-limit_surface> This proves the last condition of @thrm-flow_method, and thus proves @thrm-conclusion. == Conclusion The result of @thrm-conclusion provides the best known result for the isoperimetric inequality using the flow method, and we suspect that it is unlikely to be improved without a major change of approach. The reason is that the evolution equations are already quite difficult to handle and require a two-step flow, which is extremely rare in the literature. Because of this the author does not believe that there are many fruitful research directions stemming from this specific result. However, many of the tricks and methodologies used to prove this result are novel and not specific to this setup. For example, the proof of @prop-limit_surface used a novel lemma from linear algebra applied along with the Codazzi equation to greatly reduce the necessary conditions for convergence; this trick is likely to be useful in other extrinsic flows like Inverse Mean Curvature Flow. The aforementioned two-step flow could also likely be used in other geometric flows to 'smooth out' the target manifold before applying a known canonical flow. An interesting case could be the normalized Ricci flow, which shares many of the properties of Mean Curvature Flow. #pagebreak(weak: true) #bibliography("Thesis.bib", style: "bib-style.csl")
https://github.com/orkhasnat/resume
https://raw.githubusercontent.com/orkhasnat/resume/master/coverletter/coverletter.typ
typst
// ------- Rule Set -------------------- #set page( paper: "a4", // margin: ( // left: 1.2cm, // right: 1.2cm, // top: 1.2cm, // bottom: 1.2cm, // ), ) #set text(font:"Atkinson Hyperlegible",size:12pt,hyphenate: false,lang: "en") #show par: set block(spacing: 2em) #set par(justify: true,linebreaks: "optimized") // variables #let position = "Lecturer of CSE" // #let employer = "United International University" #let employer = "Central Women's University" // #let location = "United City, Madani Avenue, Dhaka-1212" #let location = "6 Hatkhola Road, Dhaka-1203" #let date= datetime.today() // #let date= datetime( year: 2024, month: 5, day: 10) // ---- Begin Coverletter --------- #align(center)[ #text(weight: "black",size:32pt,fill:black.lighten(20%))[ <NAME> ]\ #v(1pt) #text(style: "italic",fill:eastern.darken(50%),weight:"light", link("tel:+8801731969827") + " | " + link("mailto:<EMAIL>") ) #v(1cm) ] #date.display( "[month repr:short] [day], [year]" )\ To\ The Registrar\ #employer\ #location Sir, // Dear HR, I, <NAME>, am a fresh graduate from the Department of Computer Science and Engineering (CSE) at Islamic University of Technology (IUT). I am writing this letter to express my interest in the position of #position at #employer. I came to know about the vacancy from your website and I consider myself a suitable candidate for the role. Throughout my studentship, I immersed myself in academics, which not only helped me build a solid foundation in computer science but also allowed me to graduate with an outstanding result. As a recent graduate, I have a unique advantage in understanding the mindset of the students, allowing me to connect with them on a deeper level. From the minor experience I have in tutoring high school students, I learned that no two students are the same, and that the key to delivering knowledge effectively lies in adapting my teaching style to accommodate the differences.
Beyond my qualifications, I possess a genuine dedication to the success of my students with the goal of inspiring them to strive for excellence in their chosen fields. I firmly believe that these qualities will make me a good teacher fit for your esteemed institution. With this document, I have attached my curriculum vitae which provides a comprehensive overview of my academic background. I hope you will review my application favorably and consider me for further evaluation at your convenience. Thank you for your time and consideration. Yours sincerely,\ <NAME> // #text(style: "italic","<NAME>")
https://github.com/eliapasquali/typst-thesis-template
https://raw.githubusercontent.com/eliapasquali/typst-thesis-template/main/config/constants.typ
typst
Other
#import "variables.typ": myLang #let supervisor = ( "it" : "Relatore", "en" : "Supervisor", ).at(myLang) #let undergraduate = ( "it" : "Laureando", "en" : "Undergraduate", ).at(myLang) #let academicYear = ( "it" : "ANNO ACCADEMICO", "en" : "ACADEMIC YEAR", ).at(myLang) #let abstract = ( "it" : "Sommario", "en" : "Abstract", ).at(myLang) #let acknlowledgements = ( "it" : "Ringraziamenti", "en" : "Acknowledgements", ).at(myLang) #let figuresList = ( "it" : "Elenco delle Figure", "en" : "List of Figures", ).at(myLang) #let tablesList = ( "it" : "Elenco delle Tabelle", "en" : "List of Tables", ).at(myLang) #let chapter = ( "it" : "Capitolo", "en" : "Chapter", ).at(myLang) #let ID = ( "it" : "Matricola", "en" : "ID number", ).at(myLang)
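The constants above all follow one pattern: a per-language dictionary resolved once against `myLang`. The same lookup idea in Python, as a sketch (the names `STRINGS` and `tr` are illustrative, not part of the template):

```python
# Per-key translation tables, resolved by language code, mirroring the
# Typst `(...).at(myLang)` lookups in the constants file above.
STRINGS = {
    "supervisor": {"it": "Relatore", "en": "Supervisor"},
    "chapter": {"it": "Capitolo", "en": "Chapter"},
}

def tr(key, lang="en"):
    """Look up the translation of `key` for language `lang`."""
    return STRINGS[key][lang]
```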
https://github.com/astro-group-bristol/py-package
https://raw.githubusercontent.com/astro-group-bristol/py-package/main/presentation/main.typ
typst
MIT License
#import "@preview/polylux:0.3.1": * #import "tamburlaine.typ": * #show: tamburlaine-theme.with(aspect-ratio: "4-3") #show link: item => underline(text(blue)[#item]) #enable-handout-mode(true) #title-slide( title: "Developing a Python package", authors: ("<NAME>",), where: "Astro Dev Group", date: datetime(year: 2024, month: 4, day: 12) ) #slide(title:"What is a package?")[ #v(1em) #one-by-one(start: 2)[ - Something that is `pip` installable #v(-0.7em) ```bash $ pip install numpy ``` ][ - A directory on the filesystem ][ - A collection of reusable *modules* - An *abstraction* of some behaviour or methods ][ #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[A package is a way to share code]) ][ Use a package: ```python import numpy as np # importing a module in the package values = np.linspace( # a function in the module 0, 1, 10 ) ``` ] ] #slide(title:"Why would you want to make a package?")[ #v(1em) #one-by-one(start: 2)[ - Let other people easily use your code ][ - Separate out functions and routines you share between projects - Reduce the complexity of a codebase ][ #v(-0.3em) #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[Guide your thinking]) #v(0.3em) ][ - A package encourages you to think in *modules* - Starts to *break down the problem* for you - Makes your codebase navigable - Separation of concerns ] ] #slide(title:"The basics")[ #v(1em) #one-by-one(start:2)[ - Create an `__init__.py` to turn a *directory* into a *module* - `__init__.py` is executed on `import` - Used to define your public *API* ][ #align(center, text(fill: PRIMARY_COLOR, size: 25pt, weight: "black")[ An example package for statistical distributions: ]) #align(center)[ ``` distributions/ ├── continuous.py └── __init__.py ``` ] ][ Here we have - a `distributions` *module* (from the `__init__.py`) - a `distributions.continuous` *module* (from `continuous.py`) ] ] #slide[ #v(-0em) #rect(fill: PRIMARY_COLOR, width: 100%, height:44%, inset: (top: 
1em))[ #text(fill:SECONDARY_COLOR, size: 80pt, weight: "black")[Modules are namespaces] ] #v(-1em) #one-by-one(start:2)[ #align(right, text(fill:PRIMARY_COLOR, size: 50pt, weight: "black")[Use modules to group logical parts of your code]) ][ #align(left, text(fill:TEXT_COLOR, size: 40pt, weight: "regular")[Combine *modules* into *packages*]) ] ] #slide(title: "Distinguish with modules")[ #v(0.5em) In the example: - *Continuous* and *discrete* distributions are *logically* separate. - *Plotting* is separate from the *definitions* of the distributions. #align(center)[ #v(-0.5em) ``` distributions/ ├── continuous.py ├── discrete.py ├── plots.py └── __init__.py ``` ] #v(-0.5em) Then use as ```py import distributions.continuous as continuous import distributions.discrete as discrete ``` ] #slide(title: "Thinking in modules")[ #v(0.5em) For example, use a *module* for: - functions specific to filesystem IO - an implementation of a numerical method - your plotting routines - a wrapper for another library that you depend on - defining constants - utility functions that don't fit into other places #one-by-one(start: 2)[ #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[Modules contain the building blocks out of which you build your program.]) ][ The "do one thing" idiom, and compose to do many things ] ] #slide(title: "Import machinery")[ #quote[The import statement combines two operations; it searches for the *named module*, then it *binds the results* of that search to a name in the local scope.] 
#v(-0.3em) #one-by-one(start: 2)[ #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[Where are modules imported from?]) #v(-0.3em) ][ ```python >>> import sys >>> sys.path ['', '/usr/lib/python311.zip', '/usr/lib/python3.11', '/usr/lib/python3.11/lib-dynload', '/home/lilith/.local/lib/python3.11/site-packages', '/usr/lib/python3.11/site-packages'] ``` ][ Loading (#link("https://docs.python.org/3/reference/import.html#loading")[docs]) binds the package into #v(-0.3em) ```py >>> sys.modules ``` ] ] #slide()[ #v(-0em) #rect(inset:(top: 1.5em, right: 0.6em), fill: PRIMARY_COLOR, width: 100%, height: 66%, par(leading:15pt, align(right, text(fill:SECONDARY_COLOR, size:190pt, weight:"black")[Try it #linebreak() out]))) Make a directory with an `__init__.py`, define a small function, try *importing* it and using the function in a script. ] #slide(title:"Using our package")[ #v(1em) Using the `distributions` example: ```py import numpy as np import distributions x = np.linspace(-1, 1, 10) y = distributions.gauss(x) ``` #one-by-one(start:2)[ The *intent* behind this code is clear - The details are *abstracted* and we instead see the *concepts* - Easy and simple to use and understand ][ #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[Scripts build behaviours from packages]) ] ] #slide(title:"Good habits")[ #v(1em) #one-by-one[ - Break problems into small chunks, categorize the work - These become your *modules* ][ - Implement your solution with simple building blocks - These become your *functions* ][ - Separate what is *internal* to the package and what is *public* to the user - *Public*: would require updating code that depends on this package - *Private*: things you can change that do not affect the user (e.g.
specific sorting algorithm to sort a vector) ][ #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[Interface versus implementation]) ] ] #slide(title:"Sharing your package")[ See #link("https://packaging.python.org/en/latest/tutorials/packaging-projects/")[Python documentation] for detailed step by step. #one-by-one(start:2)[ *Do not* use `distutils`: - Python 3.10 deprecated `distutils` - Python 3.12 has *removed it* ][ #align(center, text(fill: PRIMARY_COLOR, size: 30pt, weight: "black")[Always use `setuptools` with a `pyproject.toml`]) ][ ```toml [build-system] requires = ["setuptools"] build-backend = "setuptools.build_meta" [project] name = "distributions" version = "0.0.1" dependencies = ["numpy"] ``` ] ] #slide(title:"Installing your package")[ #v(1em) Locally - `pip install .` From a repository - ``` pip install \ git+https://github.com/astro-group-bristol/py-package ``` Or register your package in #link("https://setuptools.pypa.io/en/latest/userguide/quickstart.html#uploading-your-package-to-pypi")[PyPi] and install like any other. ] #slide(title:"Summary & questions")[ #v(1em) - *Packages* are for sharing / reusing code - Packages are composed of *modules* - *Modules* are namespaces / logical scopes - Let *modules* help guide your thinking - Turn a directory into a *module* with `__init__.py` - Use `setuptools` to make your package `pip` friendly (*do not* use `distutils`) - Implement your solution as many small *functions* in a *package* - Use a script (or Jupyter notebook) to compose those functions into an executable ]
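The slides call `distributions.gauss(x)` without showing its definition. A minimal sketch of what the package's `continuous.py` module might contain (the function body is an assumption, not from the slides):

```python
import math

# continuous.py — one module of the hypothetical `distributions` package.
def gauss(x, mu=0.0, sigma=1.0):
    """Density of a normal distribution with mean mu and std dev sigma."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))
```

With a `distributions/__init__.py` containing `from .continuous import gauss`, the slide's `distributions.gauss(x)` call would resolve to this function — exactly the "define your public API in `__init__.py`" point from earlier slides.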
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/bugs/2902-gradient-oklch-panic_00.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page #set page(width: 15cm, height: auto, margin: 1em) #set block(width: 100%, height: 1cm, above: 2pt) // Oklch #block(fill: gradient.linear(red, purple, space: oklch)) #block(fill: gradient.linear(..color.map.rainbow, space: oklch)) #block(fill: gradient.linear(..color.map.plasma, space: oklch))
https://github.com/jens-hj/ds-exam-notes
https://raw.githubusercontent.com/jens-hj/ds-exam-notes/main/lectures/3.typ
typst
#import "../lib.typ": * #show link: it => underline(emph(it)) #set math.equation(numbering: "(1)") #set enum(full: true) #set math.mat(delim: "[") #set math.vec(delim: "[") #set list(marker: text(catppuccin.latte.lavender, sym.diamond.filled)) #show heading.where(level: 1): it => text(size: 22pt, it) #show heading.where(level: 2): it => text(size: 18pt, it) #show heading.where(level: 3): it => { text(size: 14pt, mainh, pad( left: -0.4em, gridx( columns: (auto, 1fr), align: center + horizon, gap: 0em, it, rule(stroke: 1pt + mainh) ) )) } #show heading.where(level: 4): it => text(size: 12pt, secondh, it) #show heading.where(level: 5): it => text(size: 12pt, thirdh, it) #show heading.where(level: 6): it => text(thirdh, it) #show emph: it => text(accent, it) #show ref: it => { //let sup = it.supplement let el = it.element if el == none { it.citation } else { let eq = math.equation // let sup = el.supplement if el != none and el.func() == eq { // The reference is an equation let sup = if it.fields().at("supplement", default: "none") == "none" { [Equation] } else { [] } // [#it.has("supplement")] show regex("\d+"): set text(accent) let n = numbering(el.numbering, ..counter(eq).at(el.location())) [#sup #n] } else if it.citation.has("supplement") { if el != none and el.func() == eq { show regex("\d+"): set text(accent) let n = numbering(el.numbering, ..counter(eq).at(el.location())) [#el.supplement #n] } else { text(accent)[#it] } } } } === Andrew File System - Does have a way to create multiple replicas, and can decide where to place them \ #ra Mimics more something like _NFSv4.1_ *Server:* Vice \ *Client:* Venus ==== Main Goal *Scalability* ==== Properties - Scalable - _Whole-file_ caching on local disk of client - Allow for many modifications at once, only sending large protocol messages - Engages the local file system - Less messages #ra less protocol overhead #image("../img/afs-flow.png", width: 80%) _Note that the local file system has to exist and be engaged._ ==== 
Venus (Client) #image("../img/afs-venus.png", width: 70%) - Decisions on caching - what to do when cache space has run out. - Multiple threads ==== Vice (Server) #image("../img/afs-vice.png", width: 70%) - Single thread - Serving data - Storing data ==== Organising Data *Local volumes:* - Logical unit corresponding to a directory - _Difference to NFS:_ Allows operators to deal with logical volumes; replicating, moving, otherwise modifying - User doesn't need to be aware where the logical volumes point *Benefits:* - Load-balancing #ra increased availability & performance - Can be moved easily ==== Callback *Goal:* - Ensure cached copies of files are up-to-date when another client closes the same file after updating it. *Callback promise:* - Token issued by Vice - Token stored with the cached files on local client disk - Token is valid or cancelled \ #ra *Valid:* Free to modify \ #ra *Cancelled:* Do not touch #box(emoji.warning, outset: -1pt) *Implementation:* - RPC from Vice to Venus - Set token to cancelled ==== Consistency Guarantees Two clients open one file and make changes. Server will only be updated on close. What happens here? \ #ra Clients need to know when the Vice is updated to know that their copy is _old_ \ #ra Upon close, _promises_ to show updated data to all clients - Weak - Practical - Though with some pitfalls *Methods to provide this guarantee:* - *Write-through Cache:* Once modified on client, it will be modified on server (on close) \ - *Callbacks:* Setting other tokens to cancelled, making other clients aware that they have to re-fetch. #figure( image("../img/afs-process.png", width: 80%) ) ==== AFSv2 - Able to support 50 clients per server - Client-side performance close to local performance - since everything basically happens in local cache ==== AFS vs NFS #image("../img/afs-nfs.png") === Reliable Storage: Single Server ==== Basic Problem - No corruption - Minimise loss Individual servers are unreliable, some redundancy will be needed.
\ Start by thinking of _one machine_ with multiple disks ==== State of The Art: Replication #image("../img/replication.png", width: 80%) *Overhead:* 200% - Make decisions about where to place each chunk/fragment/stripe/slice; impacts reliability - How many slices to make: \ #ra defines how much is lost if one fragment is impacted \ #ra defines how big of an action is needed to recover, like moving 10/20/50 percent of the total file. *Traffic to repair:* 1 unit, 50% of file size ===== Different Options: Vanilla RAID5 #image("../img/vanilla-raid5.png", width: 50%) - Can lose _any *one* of the shown fragments_, and still recover it. - Need to move _the equivalent_ of the entire file, no matter how many slices. #image("../img/vanilla-raid5-2.png", width: 50%) ===== Different Options: Vanilla RAID6 - Sustain _any *two* losses_, while only storing twice the data - Slicing thinner gives better performance, with further linear combinations of the slices. #image("../img/vanilla-raid6.png", width: 50%)
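The RAID5 property that any single lost fragment can be rebuilt comes from one XOR parity slice. A minimal sketch (not from the notes; the data is illustrative) of how recovery works:

```python
def parity(slices):
    """XOR equal-length slices byte-wise into one parity slice."""
    out = bytearray(len(slices[0]))
    for s in slices:
        for i, b in enumerate(s):
            out[i] ^= b
    return bytes(out)

data = [b"AAAA", b"BBBB", b"CCCC"]
p = parity(data)  # parity slice stored alongside the data slices

# lose slice 1, then rebuild it from the survivors plus the parity:
# A ^ C ^ (A ^ B ^ C) = B
rebuilt = parity([data[0], data[2], p])
assert rebuilt == data[1]
```

This also shows why repair traffic equals the whole file: rebuilding one slice requires reading every surviving slice.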
https://github.com/ofurtumi/formleg
https://raw.githubusercontent.com/ofurtumi/formleg/main/h03/H3.typ
typst
#import "@templates/ass:0.1.1": * #import "@preview/finite:0.1.0" #import "@preview/cetz:0.1.1" #show: doc => template( project: "Verkefni 1", class: "TÖL105M - Tölvugrafík", doc ) #set heading(numbering: "1.a.1") = Language #align(center, [Consider the following language $L(10^∗(0 union 1))$]) #grid(columns: (2fr, 1fr), gutter: 16pt, [ == NFA #cetz.canvas({ import cetz.draw: set-style import finite.draw: state, transition state((0,0), "q0", initial: true) state((3,0), "q1") state((6,0), "q2", final: true) transition("q0", "q1", label: 1, curve: 0) transition("q1", "q1", label: 0, curve: 0) transition("q1", "q2", label: 0, curve: 0) transition("q1", "q2", label: 1, curve: -1) }) ],[ == Simulation ``` Init: q0 Accept: q2 q0,1,q1 q1,0,q1,q2 q1,1,q2 ``` ]) = Mayan numbers - We can lead with no or infinite zeros so we know that our regex will start with $a^*$ - The largest number that we can represent is $41 = "cb" = 2*20^1 + 1*20^0$ - The largest number possible in second place is $c$ - The only numbers we can add to $c = 40$ are $a = 0$ and $b = 1$ - The second largest number in the second place is $b = 1$ - We can add all other numbers of the alphabet to $b$ Using these rules we can construct the following regex: $ a^*((c[a b])|(b?[a-t])) $ Breaking it down we have zero or more a's followed by either $c a$, $c b$, $b$ followed by any letter from $a$ to $t$ or just any letter from $a$ to $t$ = Mayan numbers - continued == #grid(columns: (1fr, 1fr), gutter: 16pt, [ #cetz.canvas({ import cetz.draw: set-style, bezier, content import finite.draw: state, transition state((0,0), "q0", initial: true) state((5,0), "q1", final: true) state((0,5), "q2") transition("q1", "q0", label: 1, curve: 2) transition("q0", "q1", label: 1, curve: -0.5) transition("q0", "q2", label: 2, curve: 2) transition("q2", "q0", label: 2, curve: -0.5) transition("q2", "q1", label: 0, curve: 2) transition("q1", "q2", label: 1, curve: -1) }) ],[ == Simulation ``` Init: q0 Accept: q2 q0,1,q1 q1,0,q1,q2 q1,1,q2 ``` ]) = Mayan numbers - We can lead with no or infinite zeros so we know that our regex will start with $a^*$
"q0.right", (1, 1), mark: (end: ">")) content((0.8, 0.8), [0]) bezier("q1.top-right", "q1.bottom-right", (6.5, 0), mark: (end: ">")) content((6.3, 0), [2]) }) ],[ #align(horizon, [ Let's start by dividing $A$ into three sublanguages, $Sigma_( 0-2 )$, where sublanguage $Sigma_n$ contains all numbers of $A$ that when divided by three give a remainder $n$. In our DFA we represent transitions by these remainders, so for example the numbers $0,3,6,9,12,15,18$ are all represented by the same kind of transition, $0$ $ Sigma_0 &= {0,3,6,9,12,15,18}\ Sigma_1 &= {1,4,7,10,13,16,19}\ Sigma_2 &= {2,5,8,11,14,17} $ ]) ]) = Mayan regex Let's start by breaking our regex up into three parts $X$, $Y$, $Z$, where each part represents a different goal. == $X$, let's stay at home $X$ makes sure that we can stay in the starting position, home, for as long as we want; since we can add infinitely many zeros to the start of our number without changing it, it's obvious that $0^*$ must be one of our options. Now we can also leave for a short while and then come back home; our options to do that would be $21^*2$ and $12^*1$. In short $X$ is the regex $(21^*2|12^*1|0)$ and we can repeat it as many times as we want, so $X^*$ == $Y$, one foot in the grave $Y$ lets us go to our final destination, *the grave*, so that we can stay there and possibly make some stops on the way. There are only a couple of ways to get to the end, that is without taking a path we have already declared at home or repeating ourselves. Those paths would be $21^*0$ and simply $1$. Since we can't go to the end multiple times we only need to declare it once, so $Y$ is the regex $(21^*0|1)$ == $Z$, the silver age Now, having arrived at the grave, we could stop, but we may want to keep going a little longer, as long as we come back in the end. There are two ways we can take to get out of the grave.
$0$ and $1$. If we take the $0$ path we can then go to multiple places; more precisely, from the grave we can go $(0(1^*(20^*2)^*1^*)^*0)$ and this path we can take as many times as we want. Similarly if we choose the other path we could go $(1(0^*(21^*2)^*0^*)^*1)$. Lastly we can go for a very short walk through $2$. So all in all $Z$ is the regex $(0(1^*(20^*2)^*1^*)^*0|1(0^*(21^*2)^*0^*)^*1|2)$ and same as $X$ we can repeat it as many times as we want, so $Z^*$ == $R(A)$ We have broken our regex up into small pieces; now we have to put it back together. $X^*Y Z^*$ would then be: $ (21^*2|12^*1|0)*(21^*0|1)(0(1^*(20^*2)^*1^*)^*0|1(0^*(21^*2)^*0^*)^*1|2) $ #pagebreak() #set heading(numbering: none) = Mayan numbers - failed try == DFA for the Mayans #grid(columns: 2, gutter: 16pt, [ #cetz.canvas({ import cetz.draw: set-style, bezier, content import finite.draw: state, transition state((0,0), "q0", initial: true) state((5,0), "q1", final: true) state((0,5), "q2") transition("q1", "q0", label: 2, curve: 2) transition("q0", "q1", label: 1, curve: -0.5) transition("q0", "q2", label: 2, curve: 2) transition("q2", "q0", label: 1, curve: -0.5) transition("q2", "q1", label: 2, curve: 2) transition("q1", "q2", label: 1, curve: -0.5) transition("q2", "q2", label: 0, curve: 0) bezier("q0.top", "q0.right", (1, 1), mark: (end: ">"), label: 1) content((0.8, 0.8), [1]) bezier("q1.top-right", "q1.bottom-right", (6.5, 0), mark: (end: ">"), label: 1) content((6.3, 0), [0]) })],[ This was the first attempt, it took so long that I don't want to erase it. Here we thought that the string $0112$ would be the number $4$ but actually it would be $422$. However this state machine was similar enough that it helped a lot in understanding the actual problem.
]) == Regex for the Mayans $ (0^*20^*10^*|0^*10^*20^*)^*(0^*10^*|0^*20^*20^*|0^*20^*10^*10^*|0^*10^*10^*20^*|0^*10^*20^*10^*)\ (0^*10^*10^*10^*|0^*10^*20^*|0^*20^*10^*|0^*20^*20^*20^*|(0^*10^*10^*|0^*20^*){3})^* $ Since every state can accept numbers from $Sigma_0$ without changing state, the zeros become pretty overwhelming, so we can simplify the regex by removing them; they are still there, I just don't write them out. === $R(A)$ The regex $A$ represents zero or more transitions from $q_0$ back into itself $ (21|12) $ === $R(B)$ The regex $B$ is the only necessary part of the big regex; it represents the transitions from $q_0$ to $q_1$, which is the end state $ (1|22|211|112|121) $ === $R(C)$ $C$ represents the transitions from $q_1$ back into itself; since we are already in the end state at this point, the only goal here is to stay $ (111|12|21|222|(11|2){3}) $ === In conclusion, if we disregard the fact that the zeros are not written out, we can simplify the whole Mayan regex into $ A^*B C^* $
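As a sanity check (not part of the homework), the Mayan-number regex from the second exercise can be tested in Python against every representable value; it is parenthesised here so that the leading `a*` applies to both alternatives:

```python
import re

# Checker for the Mayan regex: optional leading a's (zeros), then either
# c followed by a/b (values 40-41) or an optional b followed by any digit.
MAYAN = re.compile(r"^a*((c[ab])|(b?[a-t]))$")

def to_mayan(v):
    """Encode 0 <= v <= 41 with base-20 digits a..t (a = 0, ..., t = 19)."""
    if v < 20:
        return chr(ord("a") + v)
    hi, lo = divmod(v, 20)
    return chr(ord("a") + hi) + chr(ord("a") + lo)

# every value 0..41 matches, with or without leading zeros
assert all(MAYAN.match(to_mayan(v)) for v in range(42))
assert MAYAN.match("aa" + to_mayan(41))
# 42 = "cc" is out of range and must not match
assert not MAYAN.match("cc")
```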
https://github.com/DrGo/typst-tips
https://raw.githubusercontent.com/DrGo/typst-tips/main/refs/samples/typst-uwthesis-master/format.typ
typst
// https://github.com/zagoli/simple-typst-thesis/blob/main/template.typ #let buildMainHeader(mainHeadingContent) = { [ #align(center, smallcaps(mainHeadingContent)) #line(length: 100%) ] } #let buildSecondaryHeader(mainHeadingContent, secondaryHeadingContent) = { [ #smallcaps(mainHeadingContent) #h(1fr) #emph(secondaryHeadingContent) #line(length: 100%) ] } // To know if the secondary heading appears after the main heading #let isAfter(secondaryHeading, mainHeading) = { let secHeadPos = secondaryHeading.location().position() let mainHeadPos = mainHeading.location().position() if (secHeadPos.at("page") > mainHeadPos.at("page")) { return true } if (secHeadPos.at("page") == mainHeadPos.at("page")) { return secHeadPos.at("y") > mainHeadPos.at("y") } return false } #let getHeader() = { locate(loc => { // Find if there is a level 1 heading on the current page let nextMainHeading = query(selector(heading).after(loc), loc).find(headIt => { headIt.location().page() == loc.page() and headIt.level == 1 }) if (nextMainHeading != none) { return buildMainHeader(nextMainHeading.body) } // Find the last previous level 1 heading -- at this point surely there's one :-) let lastMainHeading = query(selector(heading).before(loc), loc).filter(headIt => { headIt.level == 1 }).last() // Find if the last level > 1 heading in previous pages let previousSecondaryHeadingArray = query(selector(heading).before(loc), loc).filter(headIt => { headIt.level > 1 }) let lastSecondaryHeading = if (previousSecondaryHeadingArray.len() != 0) {previousSecondaryHeadingArray.last()} else {none} // Find if the last secondary heading exists and if it's after the last main heading if (lastSecondaryHeading != none and isAfter(lastSecondaryHeading, lastMainHeading)) { return buildSecondaryHeader(lastMainHeading.body, lastSecondaryHeading.body) } return buildMainHeader(lastMainHeading.body) }) } #let invisible_heading(level: 1, numbering: none, supplement: auto, outlined: true, content) = { // show 
heading.where(level: level): set text(size: 0em, color: red) show heading.where(level: level): it => block[] text(size: 0pt)[ #heading(level: level, numbering: numbering, supplement: supplement, outlined: outlined)[#content] ] } #let small_title(content, outlined: true) = { align(center)[ // #show heading.where(level: 1): set text(size: 0.85em) #show heading.where(level: 1): it => block[ #set text(size: 0.85em) #it.body ] #heading( outlined: outlined, numbering: none, content // text(0.85em,content), ) #v(5mm) ] } #let GLS_PREFIX = "gls-auto-" #let print_glossary(glossaries, name, bold: true) = { let to_print = () for (key, value) in glossaries.at(name).pairs() { // let (abbr, full) = value let abbr = value.at(0) let full = value.at(1) to_print.push([#if bold [*#abbr*] else [#abbr] #label(GLS_PREFIX + key)]) to_print.push(full) } grid( columns: 2, gutter: 3mm, ..to_print ) } #let GLOSSARIES = state("glossaries", (:)) #let PRINTED_GLOSSARIES = state("printed_glossaries", ()) #let gls(name) = { let contents = locate(loc => { let glossaries = GLOSSARIES.at(loc) for table in glossaries.values() { if name in table.keys() { if table.at(name).len() > 2 { link(label(GLS_PREFIX + name))[#table.at(name).at(2)] } else if name not in PRINTED_GLOSSARIES.at(loc) { link(label(GLS_PREFIX + name))[#table.at(name).at(1) (#table.at(name).at(0))] } else { link(label(GLS_PREFIX + name))[#table.at(name).at(0)] } break } } } ) contents PRINTED_GLOSSARIES.update(curr => { if name not in curr { curr.push(name) } curr }) // [#glossaries] } #let project( title: "", doc_type: "thesis", abstract: [], author: "", email: "", affiliation: "", address: "", city: "Waterloo, Ontario, Canada", degree: "Doctor of Philosophy", major: "", year: "2023", logo: none, committee_members: ( ( position: "Examiner", persons: ( ( name: "<NAME>", title: "Professor", department: "Dept. 
of Zoology", affiliation: "University of Waterloo" ), ) ), ), declaration: "none", statement: "", acknowledgements: "", dedication: "", glossaries: (abbreviation: (:),), body ) = { set page( paper: "us-letter", margin: (outside: 1.0in, inside: 1.125in, bottom:1.125in+0.4in, top: 1.125in + 0.4in), header-ascent: 0.4in, footer-descent: 0.3in ) // Set the document's basic properties. set document(author: author, title: title) // set text(font: "New Computer Modern", lang: "en") set text( size: 12pt, // font: "Times_New_Roman", font: "New Computer Modern", stretch: 120%, lang: "en" ) show math.equation: set text(weight: 400) // set math.equation(numbering: "(1.1)") // Currently not directly supported by typst set math.equation(numbering: "(1)") set heading(numbering: "1.1") set par(justify: true) // show heading.where(level: 1): set text(size: 24pt) show heading.where(level: 2): set text(size: 18pt) show heading.where(level: 3): set text(size: 14pt) show outline.entry.where(level: 1): it => { v(16pt, weak: true) strong(it) } show outline.entry.where(level: 2): it => { it } show link: set text(fill: blue) show ref: it => { let eq = math.equation let hd = heading let el = it.element if el != none and el.func() == eq { // Override equation references. link(el.label)[#numbering( el.numbering, ..counter(eq).at(el.location()) )] } else if el != none and el.func() == hd { // Override equation references. text(fill: blue.darken(60%))[#it] } else { // Other references as usual. it } } show cite: set text(fill: green) // Title page. v(0.25fr) align(center)[ #text(2em, weight: 700, title) ] v(1cm) align(center)[by] v(1cm) // Author information. 
align(center)[ #text(size: 16pt)[#author] \ #if email != "" [#email \ ] #v(2cm) A #doc_type \ presented to #affiliation \ in fulfillment of the \ thesis requirement for the degree of \ #degree \ in \ #major \ #v(2cm) #city, #year #v(1cm) © #author #year \ ] // Logo if logo != none { v(0.25fr) align(center, image(logo, width: 26%)) v(0.50fr) } else { v(0.75fr) } pagebreak() set page(numbering: "i", number-align: center) // committee let commitee_body = () for mitem in committee_members { let pos_printed = false for person in mitem.persons { if pos_printed { commitee_body.push("") } else { commitee_body.push(mitem.position + ": ") } pos_printed = true commitee_body.push([ #person.name \ #person.title, #person.department, #person.affiliation ]) } commitee_body.push(v(1.5cm)) commitee_body.push(v(1.5cm)) } small_title(outlined: false)[Examining Committee Membership] invisible_heading([Examining Committee]) [ The following served on the Examining Committee for this thesis. The decision of the Examining Committee is by majority vote. #v(1cm) ] grid( columns: 2, gutter: 3mm, ..commitee_body ) pagebreak() if declaration != "none" { small_title([Author’s Declaration]) if declaration == "sole" [ I hereby declare that I am the sole author of this thesis. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners. I understand that my thesis may be made electronically available to the public. ] else if declaration == "compiled" [ This thesis consists of material all of which I authored or co-authored: see Statement of Contributions included in the thesis. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners. I understand that my thesis may be made electronically available to the public. ] else [ The author makes no declaration yet. ] pagebreak() } if statement != "" { small_title([Statement of Contributions]) statement pagebreak() } // Abstract page. 
// v(1fr) small_title([Abstract]) abstract // v(1.618fr) pagebreak() small_title([Acknowledgements]) acknowledgements pagebreak() if dedication != "" { small_title([Dedication]) dedication } pagebreak() show heading.where(level: 1): it => [ #set text(size: 24pt) #v(1.5in) #par(first-line-indent: 0pt)[#it.body] #v(1.5cm) ] // Table of contents. heading("Table of Contents", numbering: none, outlined: false) outline( title: none, depth: 3, indent: true ) pagebreak() heading("List of Figures", numbering: none) outline( title: none, depth: 3, indent: true, target: figure.where(kind: image), ) pagebreak() heading("List of Tables", numbering: none) outline( title: none, depth: 3, indent: true, target: figure.where(kind: table) ) pagebreak() GLOSSARIES.update(glossaries) heading( outlined: true, numbering: none, text("List of Abbreviations"), ) print_glossary(glossaries, "abbreviation", bold: true) pagebreak() heading( outlined: true, numbering: none, text("List of Symbols"), ) print_glossary(glossaries, "symbol", bold: false) // Main body. set page(numbering: "1", number-align: center) set par(first-line-indent: 20pt) set page(header: getHeader()) counter(page).update(1) // set gls(glossaries: glossaries) show heading.where(level: 1): it => [ // #pagebreak(weak: true) #set text(size: 24pt) #v(1.5in) #block[ #if it.numbering != none [ Chapter #counter(heading).display() #v(0.5cm) ] #par(first-line-indent: 0pt)[#it.body] ] #v(1.5cm, weak: true) ] show heading.where(level: 2): it => [ #set text(size: 18pt) #v(1cm, weak: true) #block[ #if it.numbering != none [ #counter(heading).display() ] #it.body ] #v(1cm, weak: true) ] show heading.where(level: 2): set text(size: 18pt) show heading.where(level: 3): set text(size: 14pt) body pagebreak(weak: true) heading( outlined: true, numbering: none, text("Glossary"), ) print_glossary(glossaries, "glossary") } #let appendix(body) = { set heading(numbering: "A.1.1.1") counter(heading).update(0) body }
https://github.com/0x1B05/nju_os
https://raw.githubusercontent.com/0x1B05/nju_os/main/book_notes/content/02_virtualization_02_memory_03_swapping.typ
typst
#import "../template.typ": *

#pagebreak()

= Swap

== Swapping: Mechanism

We’ve been assuming that every address space of every running process fits into memory. We will now relax these big assumptions, and assume that we wish to support many concurrently-running large address spaces. To do so, we require an additional level in the *memory hierarchy*.

To support large address spaces, the OS will need a place to stash away portions of address spaces that currently aren’t in great demand. For now, just assume we have *a big and relatively-slow device* which we can use to help us build the illusion of a very large virtual memory, even bigger than physical memory itself. In modern systems, this role is usually served by *a hard disk drive*. A contrast is found in older systems that used *memory overlays*, which required programmers to manually move pieces of code or data in and out of memory as they were needed.

The invention of multiprogramming almost demanded the ability to swap out some pages, as early machines clearly could not hold all the pages needed by all processes at once. Thus, the combination of multiprogramming and ease-of-use leads us to want to support using more memory than is physically available.

=== Swap Space

The first thing we will need to do is to reserve some space on the disk for moving pages back and forth. We generally refer to such space as *swap space*. We _swap_ pages out of memory to it and _swap_ pages into memory from it. Thus, we will simply assume that the OS can read from and write to the swap space, in page-sized units. To do so, the OS will need to remember the *disk address* of a given page.

The size of the swap space is important, but let us assume for simplicity that it is very large for now.

#image("images/2023-12-20-18-43-59.png", width: 80%)

Swap space is not the only on-disk location for swapping traffic. Assume you are running a program binary.
The code pages from this binary are initially found on disk, and when the program runs, they are loaded into memory. However, if the system needs to make room in physical memory for other needs, it can safely re-use the memory space for these code pages, knowing that it can later swap them in again from the on-disk binary in the file system.

=== The Present Bit

Now that we have some space on the disk, we need to add some machinery higher up in the system in order to support swapping pages to and from the disk. Let us assume, for simplicity, that we have a system with a hardware-managed TLB.

Recall first what happens on a memory reference. The running process generates virtual memory references, and the hardware translates them into physical addresses before fetching the desired data from memory.

If we wish to allow pages to be swapped to disk, however, we must add even more machinery. Specifically, when the hardware looks in the PTE, it may find that the page is not present in physical memory. The way the hardware/OS determines this is through a new piece of information in each page-table entry, known as the *present bit*.

- If the present bit is set to one, it means the page is present in physical memory and everything proceeds as above;
- If it is set to zero, the page is not in memory but rather on disk somewhere.

The act of accessing a page that is not in physical memory is commonly referred to as a *page fault*. Upon a page fault, the OS is invoked to service it by running a particular piece of code, known as the *page-fault handler*.

==== SWAPPING TERMINOLOGY AND OTHER THINGS

Terminology in virtual memory systems can be a little confusing and variable across machines and operating systems.
For example, a *page fault* more generally could refer to any reference to a page table that generates a fault of some kind:

- a page-not-present fault
- illegal memory accesses

Indeed, it is odd that we call what is definitely a legal access (to a page mapped into the virtual address space of a process, but simply not in physical memory at the time) a “fault” at all; really, it should be called a *page miss*. But often, when people say a program is “page faulting”, they mean that it is accessing parts of its virtual address space that the OS has swapped out to disk.

=== The Page Fault

If a page is not present, the OS is put in charge of handling the page fault.

#tip("Tip")[
Virtually all systems handle page faults in software; even with a hardware-managed TLB.
]

If a page is not present and has been swapped to disk, the OS will need to swap the page into memory in order to service the page fault. How will the OS know where to find the desired page? In many systems, *the page table* is a natural place to store such information. Thus, the OS could use the bits in the PTE normally used for data such as the PFN of the page for a disk address.

- When the OS receives a page fault for a page, it looks in the PTE to find the address, and issues the request to disk to fetch the page into memory.
- When the disk I/O completes, the OS then updates the page table to mark the page as present, updates the PFN field of the page-table entry (PTE) to record the in-memory location of the newly-fetched page, and retries the instruction.
- This next attempt may generate a TLB miss, which would then be serviced and update the TLB with the translation.
- Finally, a last restart would find the translation in the TLB and thus proceed to fetch the desired data or instruction from memory at the translated physical address.

#tip("Tip")[
Note that while the I/O is in flight, the process will be in the blocked state.
Thus, the OS will be free to run other ready processes while the page fault is being serviced.
]

==== WHY HARDWARE DOESN’T HANDLE PAGE FAULTS

- First, page faults to disk are slow; even if the OS takes a long time to handle a fault, executing tons of instructions, the disk operation itself is traditionally so slow that the extra overheads of running software are minimal. (*performance*)
- Second, to be able to handle a page fault, the hardware would have to understand swap space, how to issue I/Os to the disk, and a lot of other details which it currently doesn’t know much about. (*simplicity*)

=== If Memory Is Full?

Earlier, we assumed there is plenty of free memory in which to *page in* a page from swap space. Memory may be full (or close to it). Thus, the OS might like to first *page out* one or more pages to make room for the new page(s) the OS is about to bring in. The process of picking a page to kick out, or *replace*, is known as the *page-replacement policy*.

=== Page Fault Control Flow

==== Hardware

#code(caption: [Page Fault Control Flow - Hardware])[
```c
VPN = (VirtualAddress & VPN_MASK) >> SHIFT
(Success, TlbEntry) = TLB_Lookup(VPN)
if (Success == True) // TLB Hit
    if (CanAccess(TlbEntry.ProtectBits) == True)
        Offset = VirtualAddress & OFFSET_MASK
        PhysAddr = (TlbEntry.PFN << SHIFT) | Offset
        Register = AccessMemory(PhysAddr)
    else
        RaiseException(PROTECTION_FAULT)
else // TLB Miss
    PTEAddr = PTBR + (VPN * sizeof(PTE))
    PTE = AccessMemory(PTEAddr)
    if (PTE.Valid == False)
        RaiseException(SEGMENTATION_FAULT)
    else if (CanAccess(PTE.ProtectBits) == False)
        RaiseException(PROTECTION_FAULT)
    else if (PTE.Present == True)
        // assuming hardware-managed TLB
        TLB_Insert(VPN, PTE.PFN, PTE.ProtectBits)
        RetryInstruction()
    else if (PTE.Present == False)
        RaiseException(PAGE_FAULT)
```
]

==== Software

#code(caption: [Page Fault Control Flow - Software])[
```c
PFN = FindFreePhysicalPage()
if (PFN == -1)        // no free page found
    PFN = EvictPage() // run replacement algorithm
DiskRead(PTE.DiskAddr, PFN) // sleep (waiting for I/O)
PTE.present = True          // update page table with present
PTE.PFN = PFN               // bit and translation (PFN)
RetryInstruction()          // retry instruction
```
]

The retry will result in a TLB miss, and then, upon another retry, a TLB hit, at which point the hardware will be able to access the desired item.

=== When Replacements Really Occur

The way we’ve described how replacements occur assumes that the OS waits until memory is entirely full, and only then replaces (evicts) a page to make room for some other page. As you can imagine, this is a little bit unrealistic, and there are many reasons for the OS to *keep a small portion of memory free more proactively*.

To keep a small amount of memory free, most operating systems thus have some kind of *high watermark (HW)* and *low watermark (LW)* to help decide when to start evicting pages from memory. How this works is as follows: When the OS notices that there are fewer than LW pages available, a background thread that is responsible for freeing memory runs. The thread evicts pages until there are HW pages available. The background thread, sometimes called the *swap daemon* or *page daemon*, then goes to sleep, happy that it has freed some memory for running processes and the OS to use.

By performing a number of replacements at once, new performance optimizations become possible. For example, many systems will *cluster* or *group* a number of pages and write them out at once to the swap partition, thus increasing the efficiency of the disk.

To work with the background paging thread, the control flow in software above should be modified slightly: instead of performing a replacement directly, the algorithm would simply check if there are any free pages available.
- If not, it would inform the background paging thread that free pages are needed;
- When the thread frees up some pages, it would re-awaken the original thread, which could then page in the desired page and go about its work.

==== DO WORK IN THE BACKGROUND

When you have some work to do, it is often a good idea to do it in the background to increase efficiency and to allow for grouping of operations. Operating systems often do work in the background; for example, many systems buffer file writes in memory before actually writing the data to disk. Doing so has many possible benefits:

- increased disk efficiency;
- improved latency of writes, as the application thinks the writes completed quite quickly;
- the possibility of work reduction, as the writes may never need to go to disk (i.e., if the file is deleted);
- and better use of idle time, as the background work may possibly be done when the system is otherwise idle, thus better utilizing the hardware.

== Swapping: Policies

Unfortunately, things get a little more interesting when little memory is free. In such a case, this *memory pressure* forces the OS to start paging out pages to make room for actively-used pages. Deciding which page (or pages) to *evict* is encapsulated within the *replacement policy* of the OS.

=== Cache Management

Given that main memory holds some subset of all the pages in the system, it can rightly be viewed as a *cache* for virtual memory pages in the system. Thus, our goal in picking a replacement policy for this cache is to minimize the number of *cache misses* (or, equivalently, to maximize the number of *cache hits*).

Knowing the number of cache hits and misses lets us calculate the *average memory access time (AMAT)* for a program.
$ "AMAT" = T_(M) + (P_("Miss") dot T_(D)) $

- $T_(M)$ -> the cost of accessing memory
- $T_(D)$ -> the cost of accessing disk
- $P_("Miss")$ -> the probability of not finding the data in the cache (a miss)

=== The Optimal Replacement Policy

The optimal replacement policy leads to the fewest number of misses overall. Belady showed that a simple (but, unfortunately, difficult to implement!) approach that replaces the page that will be accessed furthest in the future is the optimal policy, resulting in the fewest-possible cache misses.

Hopefully, the intuition behind the optimal policy makes sense. Think about it like this: If you have to throw out some page, why not throw out the one that is needed the furthest from now? By doing so, you are essentially saying that all the other pages in the cache are more important than the one furthest out. The reason this is true is simple: you will refer to the other pages before you refer to the one furthest out.

Assume a program accesses the following stream of virtual pages: 0, 1, 2, 0, 1, 3, 0, 3, 1, 2, 1.

#image("images/2023-12-21-09-58-59.png", width: 60%)

The first three accesses are misses, as the cache begins in an empty state; such a miss is sometimes referred to as a *cold-start miss (or compulsory miss)*. Then we refer again to pages 0 and 1, which both hit in the cache. Finally, we reach another miss (to page 3), but this time the cache is full. Which page should we replace? With the optimal policy, we examine the future for each page currently in the cache (0, 1, and 2), and see that 0 is accessed almost immediately, 1 is accessed a little later, and 2 is accessed furthest in the future. Thus the optimal policy has an easy choice: evict page 2, resulting in pages 0, 1, and 3 in the cache.

...

Hit rate = Hits/(Hits+Misses), which is 6/(6+5) or 54.5%.

#tip("Tip")[
We can also compute the hit rate modulo compulsory misses (i.e., ignore the first miss to a given page), resulting in an 85.7% hit rate.
]

Sadly, the future is not generally known!

==== COMPARING AGAINST OPTIMAL IS USEFUL

Although optimal is not very practical as a real policy, it is incredibly useful as a comparison point in simulation or other studies. Thus, in any study you perform, knowing what the optimal is lets you perform a better comparison, showing *how much improvement is still possible*, and also *when you can stop making your policy better*.

=== A Simple Policy: FIFO

Pages are simply placed in a queue when they enter the system; when a replacement occurs, the page on the tail of the queue (the “first-in” page) is evicted.

#tip("Tip")[
FIFO has one great strength: it is quite simple to implement.
]

#image("images/2023-12-21-10-09-34.png", width: 60%)

Comparing FIFO to optimal, FIFO does notably worse: a 36.4% hit rate (or 57.1% excluding compulsory misses). FIFO simply can’t determine the importance of blocks.

==== BELADY’S ANOMALY

The memory reference stream: 1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5. The replacement policy they were studying was FIFO. The interesting part: how the cache hit rate changed when moving from a cache size of 3 to 4 pages.

In general, you would expect the cache hit rate to increase (get better) when the cache gets larger. But in this case, with FIFO, it gets worse! This odd behavior is generally referred to as *Belady’s Anomaly* (to the chagrin of his co-authors).

Some other policies, such as LRU, don’t suffer from this problem. As it turns out, LRU has what is known as a *stack property*. For algorithms with this property, a cache of size N + 1 naturally includes the contents of a cache of size N. Thus, when increasing the cache size, hit rate will either stay the same or improve. FIFO and Random (among others) clearly do not obey the stack property, and thus are susceptible to anomalous behavior.

=== Another Simple Policy: Random

This policy simply picks a random page to replace under memory pressure.
#image("images/2023-12-21-10-14-26.png", width: 60%)

How Random does depends entirely upon how lucky (or unlucky) Random gets in its choices. We can run the Random experiment thousands of times and determine how it does in general. The figure below shows how many hits Random achieves over 10,000 trials, each with a different random seed.

#image("images/2023-12-21-10-15-38.png", width: 60%)

=== Using History: LRU

If a program has accessed a page in the near past, it is likely to access it again in the near future.

- One type of historical information a page-replacement policy could use is *frequency*; if a page has been accessed many times, perhaps it should not be replaced as it clearly has some value.
- A more commonly used property of a page is its *recency* of access; the more recently a page has been accessed, perhaps the more likely it will be accessed again.

This family of policies is based on what people refer to as the *principle of locality*.

- The *Least-Frequently-Used (LFU)* policy replaces the least-frequently-used page when an eviction must take place.
- The *Least-Recently-Used (LRU)* policy replaces the least-recently-used page.

#tip("Tip")[
- Note that the opposites of these algorithms exist: *Most-Frequently-Used (MFU)* and *Most-Recently-Used (MRU)*.
- In most cases, these policies do not work well, as they ignore the locality most programs exhibit instead of embracing it.
]

#image("images/2023-12-21-20-07-19.png", width: 60%)

In our example, LRU does as well as possible, matching optimal in its performance.

==== TYPES OF LOCALITY

There are two types of locality that programs tend to exhibit. The first is known as *spatial locality*, which states that if a page P is accessed, it is likely that the pages around it (say P − 1 or P + 1) will also be accessed. The second is *temporal locality*, which states that pages that have been accessed in the near past are likely to be accessed again in the near future.
The assumption of the presence of these types of locality plays a large role in the caching hierarchies of hardware systems, which deploy many levels of instruction, data, and address-translation caching to help programs run fast when such locality exists.

=== Workload Examples

We’ll examine more complex *workloads* instead of small traces.

==== no locality workload

Our first workload *has no locality*, which means that each reference is to a random page within the set of accessed pages. In this simple example, the workload accesses 100 unique pages over time, choosing the next page to refer to at random; overall, 10,000 pages are accessed. In the experiment, we vary the cache size from very small (1 page) to enough to hold all the unique pages (100 pages), in order to see how each policy behaves over the range of cache sizes.

#image("images/2023-12-21-20-11-24.png", width: 60%)

- First, when there is no locality in the workload, it doesn’t matter much which realistic policy you are using; the hit rate is exactly *determined by the size of the cache*.
- Second, when the cache is large enough to fit the entire workload, it also doesn’t matter which policy you use.
- Finally, you can see that optimal performs noticeably better than the realistic policies.

==== 80-20 workload

The next workload we examine is called the “80-20” workload, which exhibits locality: 80% of the references are made to 20% of the pages (the “hot” pages); the remaining 20% of the references are made to the remaining 80% of the pages (the “cold” pages). In our workload, there are a total of 100 unique pages again; thus, “hot” pages are referred to most of the time, and “cold” pages the remainder.

#image("images/2023-12-21-20-26-45.png", width: 60%)

As you can see from the figure, while both random and FIFO do reasonably well, LRU does better, as it is more likely to hold onto the hot pages; as those pages have been referred to frequently in the past, they are likely to be referred to again in the near future.
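The exact hit counts quoted earlier for the small reference trace (FIFO at 36.4%, LRU matching optimal at 54.5%) are easy to reproduce. Below is a minimal simulation sketch in C — my own illustration, not from the text; `fifo_hits` and `lru_hits` are made-up helper names — that replays the trace 0, 1, 2, 0, 1, 3, 0, 3, 1, 2, 1 against a 3-page cache:

#code(caption: [Replacement Policy Simulation (sketch)])[
```c
#include <assert.h>
#include <string.h>

#define CACHE_SIZE 3

/* Count hits for FIFO: evict the page that entered the cache first. */
int fifo_hits(const int *trace, int n) {
    int cache[CACHE_SIZE];
    int used = 0, next = 0, hits = 0;
    for (int i = 0; i < n; i++) {
        int hit = 0;
        for (int j = 0; j < used; j++)
            if (cache[j] == trace[i]) { hit = 1; break; }
        if (hit) { hits++; continue; }
        if (used < CACHE_SIZE) {
            cache[used++] = trace[i];   /* cold-start miss: just fill */
        } else {
            cache[next] = trace[i];     /* evict the first-in page */
            next = (next + 1) % CACHE_SIZE;
        }
    }
    return hits;
}

/* Count hits for LRU: keep the cache ordered from LRU (index 0) to MRU. */
int lru_hits(const int *trace, int n) {
    int cache[CACHE_SIZE];
    int used = 0, hits = 0;
    for (int i = 0; i < n; i++) {
        int pos = -1;
        for (int j = 0; j < used; j++)
            if (cache[j] == trace[i]) { pos = j; break; }
        if (pos >= 0) {
            hits++;  /* hit: pull the entry out, re-append it as MRU below */
            memmove(cache + pos, cache + pos + 1, (used - pos - 1) * sizeof(int));
            used--;
        } else if (used == CACHE_SIZE) {
            /* miss with a full cache: evict the LRU page at index 0 */
            memmove(cache, cache + 1, (CACHE_SIZE - 1) * sizeof(int));
            used--;
        }
        cache[used++] = trace[i];  /* most-recently-used slot */
    }
    return hits;
}
```
]

On the 11-reference trace this yields 4 hits for FIFO (4/11 ≈ 36.4%) and 6 hits for LRU (6/11 ≈ 54.5%), matching the figures above.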
You might now be wondering: is LRU’s improvement over Random and FIFO really that big of a deal? The answer, as usual, is “it depends.” If each miss is very costly (not uncommon), then even a small increase in hit rate (reduction in miss rate) can make a huge difference on performance. If misses are not so costly, then of course the benefits possible with LRU are not nearly as important.

==== looping sequential workload

We refer to 50 pages in sequence, starting at 0, then 1, ..., up to page 49, and then we loop, repeating those accesses, for a total of 10,000 accesses to 50 unique pages.

#image("images/2023-12-21-20-30-23.png", width: 60%)

This workload, common in many applications (including important commercial applications such as databases [CD85]), represents a worst case for both LRU and FIFO. Random has some nice properties; one such property is not having weird corner-case behaviors.

=== Implementing Historical Algorithms

Let’s take, for example, LRU. Specifically, upon each _page access_, we must update some data structure to move this page to the front of the list (i.e., the MRU side). Contrast this to FIFO, where the FIFO list of pages is only accessed when a page is evicted (by removing the first-in page) or when a new page is added to the list (to the last-in side). To keep track of which pages have been least- and most-recently used, the system has to do some accounting work on every memory reference. Clearly, without great care, such accounting could greatly reduce performance.

One method that could help speed this up is to add a little bit of hardware support. For example, a machine could update, on each page access, a time field in memory (for example, this could be in the per-process page table, or just in some separate array in memory, with one entry per physical page of the system). Thus, when a page is accessed, the time field would be set, by hardware, to the current time.
Then, when replacing a page, the OS could simply scan all the time fields in the system to find the least-recently-used page.

Unfortunately, as the number of pages in a system grows, scanning a huge array of times just to find the absolute least-recently-used page is prohibitively *expensive*.

=== Approximating LRU

Do we really need to find the absolute oldest page to replace? Can we instead survive with an approximation?

The idea requires some hardware support, in the form of a *use bit* (sometimes called the *reference bit*). There is one use bit per page of the system, and the use bits live in memory somewhere (they could be in the per-process page tables, for example, or just in an array somewhere). Whenever a page is referenced (i.e., read or written), the use bit is set by hardware to 1. The hardware never clears the bit, though (i.e., sets it to 0); that is the responsibility of the OS.

How does the OS employ the use bit to approximate LRU? There could be many ways, but one simple approach is the *clock algorithm*. Imagine all the pages of the system arranged in a circular list. *A clock hand* points to some particular page to begin with. When a replacement must occur, the OS checks if the currently-pointed-to page P has a use bit of 1 or 0.

- If 1, this implies that page P was recently used and thus is not a good candidate for replacement. Thus, the use bit for P is set to 0 (cleared), and the clock hand is incremented to the next page (P + 1).
- The algorithm continues until it finds a use bit that is set to 0, implying this page has not been recently used (or, in the worst case, that all pages have been and that we have now searched through the entire set of pages, clearing all the bits).

#tip("Tip")[
- Note that this approach is not the only way to employ a use bit to approximate LRU.
- Indeed, any approach which periodically clears the use bits and then differentiates between which pages have use bits of 1 versus 0 to decide which to replace would be fine.
- The clock algorithm of Corbato’s was just one early approach which met with some success, and had the nice property of not repeatedly scanning through all of memory looking for an unused page.
]

#image("images/2023-12-21-20-40-09.png", width: 60%)

As you can see, although it doesn’t do quite as well as perfect LRU, it does better than approaches that don’t consider history at all.

=== Considering Dirty Pages

One small modification to the clock algorithm that is commonly made is the additional consideration of whether a page has been modified or not while in memory. The reason for this:

- If a page has been *modified* and is thus *dirty*, it must be written back to disk to evict it, which is expensive.
- If it has not been modified (and is thus clean), the eviction is free; the physical frame can simply be reused for other purposes without additional I/O.

Thus, some VM systems prefer to evict clean pages over dirty pages. To support this behavior, the hardware should include a *modified bit* (a.k.a. *dirty bit*). This bit is set any time a page is written, and thus can be incorporated into the page-replacement algorithm. The clock algorithm, for example, could be changed to scan for pages that are *both unused and clean* to evict first; failing to find those, then for unused pages that are dirty, and so forth.

=== Other VM Policies

Page replacement is not the only policy the VM subsystem employs (though it may be the most important). For example, the OS also has to decide when to bring a page into memory. This policy, sometimes called the *page selection* policy (as Denning called it [D70]), presents the OS with some different options.

For most pages, the OS simply uses *demand paging*, which means the OS brings the page into memory when it is accessed, “on demand” as it were.
Of course, the OS could guess that a page is about to be used, and thus bring it in ahead of time; this behavior is known as *prefetching* and should only be done when there is reasonable chance of success. For example, some systems will assume that if a code page P is brought into memory, code page P+1 will likely soon be accessed and thus should be brought into memory too.

Another policy determines *how the OS writes pages out to disk*. Of course, they could simply be written out one at a time; however, many systems instead collect a number of pending writes together in memory and write them to disk in one write. This behavior is usually called *clustering* or simply *grouping* of writes, and is effective because of the nature of disk drives, which perform a single large write more efficiently than many small ones.

=== Thrashing

What should the OS do when memory is simply oversubscribed, and the memory demands of the set of running processes simply exceed the available physical memory? In this case, the system will constantly be paging, a condition sometimes referred to as *thrashing*.

Some earlier operating systems had a fairly sophisticated set of mechanisms to both detect and cope with thrashing when it took place. For example, given a set of processes, a system could decide not to run a subset of processes, with the hope that the reduced set of processes’ *working sets* (the pages that they are using actively) fit in memory and thus can make progress. This approach, generally known as *admission control*, states that it is sometimes better to do less work well than to try to do everything at once poorly, a situation we often encounter in real life as well as in modern computer systems.

Some current systems take a more draconian approach to memory overload.
For example, some versions of Linux run an *out-of-memory killer* when memory is oversubscribed; this daemon chooses a memory-intensive process and kills it, thus reducing memory in a none-too-subtle manner. While successful at reducing memory pressure, this approach can have problems: if, for example, it kills the X server, any applications requiring the display become unusable.

=== Summary

Modern systems add some tweaks to straightforward LRU approximations like clock; for example, *scan resistance* is an important part of many modern algorithms, such as *ARC*. Scan-resistant algorithms are usually LRU-like but also try to avoid the worst-case behavior of LRU, which we saw with the looping-sequential workload. Thus, the evolution of page-replacement algorithms continues.

However, in many cases the importance of said *algorithms has decreased*, as the discrepancy between memory-access and disk-access times has increased. Because paging to disk is so expensive, the cost of frequent paging is prohibitive. Thus, the best solution to excessive paging is often a simple (if intellectually unsatisfying) one: *buy more memory*.
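As a coda, the clock algorithm described earlier is small enough to sketch directly. The following is a toy model in C — my own illustration, not from the text, with made-up names: `touch` stands in for the hardware setting a page's use bit on each reference, and `clock_evict` sweeps the circular list of pages, giving recently-used pages a second chance:

#code(caption: [Clock Algorithm (toy sketch)])[
```c
#include <assert.h>

#define NPAGES 8

int use_bit[NPAGES]; /* set to 1 by "hardware" on every reference */
int clock_hand = 0;  /* points at the next page to examine */

/* Model a memory reference: hardware sets the page's use bit. */
void touch(int page) { use_bit[page] = 1; }

/* Sweep until a page with use bit 0 is found; clear the use bits of
 * recently-used pages along the way (their "second chance") and
 * return the victim page number. */
int clock_evict(void) {
    for (;;) {
        int p = clock_hand;
        clock_hand = (clock_hand + 1) % NPAGES;
        if (use_bit[p] == 0)
            return p;    /* not recently used: this is the victim */
        use_bit[p] = 0;  /* recently used: clear the bit and move on */
    }
}
```
]

With the dirty-bit refinement from the text, the loop would additionally prefer pages whose modified bit is clear, since evicting a clean page needs no write-back.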
https://github.com/typst/templates
https://raw.githubusercontent.com/typst/templates/main/dashing-dept-news/README.md
markdown
MIT No Attribution
# dashing-dept-news
A fun newsletter layout for departmental news. The template contains a hero image, a main column, and a margin with secondary articles. Place content in the sidebar with the `article` function, and use the cool customized `blockquote`s and `figure`s!

## Usage
You can use this template in the Typst web app by clicking "Start from template" on the dashboard and searching for `dashing-dept-news`.

Alternatively, you can use the CLI to kick this project off using the command

```
typst init @preview/dashing-dept-news
```

Typst will create a new directory with all the files needed to get you started.

## Configuration
This template exports the `newsletter` function with the following named arguments:

- `title`: The newsletter's title as content.
- `edition`: The edition of the newsletter as content or `none`. This is displayed at the top of the sidebar.
- `hero-image`: A dictionary with the keys `image` and `caption` or `none`. Image is content with the hero image while `caption` is content that is displayed to the right of the image.
- `publication-info`: More information about the publication as content or `none`. It is displayed at the end of the document.

The function also accepts a single, positional argument for the body of the newsletter's main column and exports the `article` function accepting a single content argument to populate the sidebar.

The template will initialize your package with a sample call to the `newsletter` function in a show rule. If you, however, want to change an existing project to use this template, you can add a show rule like this at the top of your file:

```typ
#import "@preview/dashing-dept-news:0.1.0": newsletter, article

#show: newsletter.with(
  title: [Chemistry Department],
  edition: [
    March 18th, 2023 \
    Purview College
  ],
  hero-image: (
    image: image("newsletter-cover.jpg"),
    caption: [Award-winning science],
  ),
  publication-info: [
    The Dean of the Department of Chemistry.
\ Purview College, 17 <NAME>, Exampleville, TN 59341. \ #link("mailto:<EMAIL>") ], ) // Your content goes here. Use `article` to populate the sidebar and `blockquote` for cool pull quotes. ```
https://github.com/darioglasl/Arbeiten-Vorlage-Typst
https://raw.githubusercontent.com/darioglasl/Arbeiten-Vorlage-Typst/main/05_Qualitätssicherung/01_test_concept.typ
typst
== Test Concept <headingTestingConcept> TODO: text
https://github.com/MasterTemple/typst-bible-plugin
https://raw.githubusercontent.com/MasterTemple/typst-bible-plugin/main/main.typ
typst
#import "bible.typ": bible_footnote, bible_quote #for _ in range(26) { [\ ] } == `bible_quote(<verse>, *)` === Usage: ```typ #bible_quote("John 3:16") ``` === Displays: #bible_quote("John 3:16") == `bible_footnote(<verse>, *)` === Usage: ```typ We should not love the world.#bible_footnote("1 John 2:15") ``` === Displays: We should not love the world.#bible_footnote("1 John 2:15")
https://github.com/kotfind/hse-se-2-notes
https://raw.githubusercontent.com/kotfind/hse-se-2-notes/master/algo/lectures/2024-09-24.typ
typst
#import "/utils/math.typ": *

= Deterministic and randomized algorithms

== Deterministic algorithms

#def[
  #defitem[Complexity] --- the maximum running time on inputs of size $n$.
]

#def[
  #defitem[Average-case complexity] --- the expected number of operations.

  For the finite-dimensional case:
  $ E = sum_(x in chi) P(x) dot "cut"(x) $

  For the infinite-dimensional case: \*some integral\*
]

The infinite-dimensional case can often be reduced to the finite-dimensional one. For example, in the case of sorting, one can apply coordinate compression (converting )

== Randomized algorithms

#def[
  #defitem[Randomized algorithms] --- algorithms that may take different running times or produce different outputs on the same input. They use a random number generator.
]

Kinds of randomized algorithms:
- Without error: always returns the correct answer
- With one-sided error: errs only in one direction

  Example: randomized primality tests
- With two-sided error

#def[
  #defitem[Expected running time] --- the expectation of the running time (for a fixed input)
]

#def[
  #defitem[Expected complexity] --- the maximum expected running time on inputs of size $n$.
]

== $k$-th order statistic (randomized)

Pick a random pivot element and split the array into two parts around it:
$ underbrace(..., m - 1 "elements") <= x_i <= underbrace(..., n - m "elements") $

The median is either the pivot element itself or an element of the larger part.

Let us estimate the expected running time. If the array splits into parts of sizes $m - 1$ and $n - m$:
$ T(n) = underbrace(T(max(m - 1, n - m)), "worst case: search the larger part") + underbrace(O(n), "partitioning around the pivot") $

In total:
$ E(T(n)) &= sum^n_(m=1) P(m) E(T(max(m - 1, n - m))) + O(n) = sum^n_(m = n/2) 1/n dot 2 E(T(m)) + O(n) = \
  &= 2/n sum_(m = n/2)^(n - 1) E(T(m)) + O(n) = 2/n (O(n/2) + ... + O(n - 1)) + O(n) = \
  &= 2/n O((3n^2) / 8) + O(n) = O(3/4 n) + O(n) = O(n) $

== $k$-th order statistic (deterministic)

+ Split the array into chunks of size 5
+ Sort each chunk: $7/5 n$ operations
+ Take the median of each chunk: $m_1 , ... , m_(n/5)$
+ Find the median of the medians recursively
+ Use the element found as the pivot in the previous algorithm

$ T(n) = underbrace(T((7n)/10), "previous algorithm") + underbrace(T(n/5), "recursion") + underbrace(O(n), "partitioning") -> T(n) = O(n) $

== <NAME>'s algorithm

Is it true that $A dot B = C$ (with $A$, $B$ and $C$ given)?

Take a random vector of $0$s and $1$s: $v = (""_0^1, ""_0^1, ..., ""_0^1)$

If $A B = C$, then $A dot (B dot v) = C dot v$

This is an algorithm with one-sided error. If equality is obtained, the probability of failure is at most one half.

The procedure can be repeated to improve the probability: after $k$ trials the failure probability is $P_"fail" <= 1/(2^k)$, and the complexity is $O(k n^2)$.

== Schwartz-Zippel lemma

$f(x_1 , ... , x_k)$ --- a polynomial of degree $n$

We assume that we can evaluate $f$ at a point.

We want to check whether it is identically zero.

+ Take a random tuple $(y_1 , ... , y_k) in S^k$
+ For a nonzero $f: P(f(y_1, ..., y_k) = 0) <= n/abs(S)$

== Derandomization

Turning a randomized algorithm into a deterministic one.

For the Schwartz-Zippel lemma with $k = 1$ it suffices to check $n + 1$ distinct points.
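As a runnable illustration of the randomized $k$-th order statistic algorithm sketched above (not part of the original lecture notes; the function name and the list-based three-way partition are my own choices):

```python
import random

def kth_statistic(xs, k):
    """Return the k-th smallest element of xs (k is 1-indexed).

    Randomized selection: partition around a uniformly random pivot
    and recurse only into the part that contains the answer, which
    gives O(n) expected time, matching the derivation above.
    """
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    if k <= len(less):
        return kth_statistic(less, k)
    if k <= len(less) + len(equal):
        return pivot  # the pivot itself is the k-th smallest
    return kth_statistic(greater, k - len(less) - len(equal))

# The median of an odd-length array of size n is the (n + 1)/2-th statistic.
```

Recursing into only one of the three parts is what distinguishes selection from quicksort and brings the expected cost down from $O(n log n)$ to $O(n)$.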
https://github.com/RaphGL/ElectronicsFromBasics
https://raw.githubusercontent.com/RaphGL/ElectronicsFromBasics/main/DC/chap5/6_correct_ohms_law_use.typ
typst
Other
#import "../../core/core.typ" === Correct use of Ohm\'s Law One of the most common mistakes made by beginning electronics students in their application of Ohm\'s Laws is mixing the contexts of voltage, current, and resistance. In other words, a student might mistakenly use a value for I through one resistor and the value for E across a set of interconnected resistors, thinking that they\'ll arrive at the resistance of that one resistor. Not so! Remember this important rule: The variables used in Ohm\'s Law equations must be #emph[common] to the same two points in the circuit under consideration. I cannot overemphasize this rule. This is especially important in series-parallel combination circuits where nearby components may have different values for both voltage drop #emph[and] current. When using Ohm\'s Law to calculate a variable pertaining to a single component, be sure the voltage you\'re referencing is solely across that single component and the current you\'re referencing is solely through that single component and the resistance you\'re referencing is solely for that single component. Likewise, when calculating a variable pertaining to a set of components in a circuit, be sure that the voltage, current, and resistance values are specific to that complete set of components only! A good way to remember this is to pay close attention to the #emph[two points] terminating the component or set of components being analyzed, making sure that the voltage in question is across those two points, that the current in question is the electron flow from one of those points all the way to the other point, that the resistance in question is the equivalent of a single resistor between those two points, and that the power in question is the total power dissipated by all components between those two points. 
The \"table\" method presented for both series and parallel circuits in this chapter is a good way to keep the context of Ohm\'s Law correct for any kind of circuit configuration. In a table like the one shown below, you are only allowed to apply an Ohm\'s Law equation for the values of a single #emph[vertical] column at a time: #image("static/10086.png") Deriving values #emph[horizontally] across columns is allowable as per the principles of series and parallel circuits: #image("static/10087.png") #image("static/10088.png") Not only does the \"table\" method simplify the management of all relevant quantities, it also facilitates cross-checking of answers by making it easy to solve for the original unknown variables through other methods, or by working backwards to solve for the initially given values from your solutions. For example, if you have just solved for all unknown voltages, currents, and resistances in a circuit, you can check your work by adding a row at the bottom for power calculations on each resistor, seeing whether or not all the individual power values add up to the total power. If not, then you must have made a mistake somewhere! While this technique of \"cross-checking\" your work is nothing new, using the table to arrange all the data for the cross-check(s) results in a minimum of confusion. #core.review[ - Apply Ohm\'s Law to vertical columns in the table. - Apply rules of series/parallel to horizontal rows in the table. - Check your calculations by working \"backwards\" to try to arrive at originally given values (from your first calculated answers), or by solving for a quantity using more than one method (from different given values). ]
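The vertical-column discipline described above can also be sketched in code. The following Python snippet (illustrative only; the resistor values are hypothetical and not from the text) fills in a three-resistor series circuit the same way the table is filled in: Ohm's Law only within a single column, series rules across a row, and a power cross-check at the end.

```python
# Hypothetical three-resistor series circuit (values are made up).
E_total = 9.0                         # battery voltage, volts
resistances = [1.0e3, 10.0e3, 4.5e3]  # R1, R2, R3, ohms

# Horizontal (series) rule: total resistance is the sum across the row.
R_total = sum(resistances)

# Ohm's Law applied only within the "Total" column: I = E / R.
I_total = E_total / R_total

# Series rule: the same current flows through every component.
currents = [I_total] * len(resistances)

# Ohm's Law applied within each individual column: E = I * R.
drops = [i * r for i, r in zip(currents, resistances)]

# Cross-check: individual voltage drops must add back up to the total.
assert abs(sum(drops) - E_total) < 1e-9

# Cross-check with power: per-column P = I * E must sum to the total power.
powers = [i * e for i, e in zip(currents, drops)]
assert abs(sum(powers) - E_total * I_total) < 1e-9
```

Each assignment mirrors one cell of the table; mixing a current from one column with a voltage from another would make the assertions fail, which is exactly the mistake the rule guards against.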
https://github.com/megakite/typst-qrcode-wasm
https://raw.githubusercontent.com/megakite/typst-qrcode-wasm/main/README.md
markdown
The Unlicense
# typst-qrcode-wasm QR Code plugin for Typst. ## Usage Download WASM plugin from [Releases](https://github.com/megakite/typst-qrcode-wasm/releases) and place it wherever you want. Then from Typst: ``` #let typst-qrcode-wasm = plugin("path/to/typst_qrcode_wasm.wasm") #let qrcode(arg) = image.decode(str( typst-qrcode-wasm.generate(bytes(arg)) )) #qrcode("Lorem ipsum") ```
https://github.com/npujol/npujol.github.io
https://raw.githubusercontent.com/npujol/npujol.github.io/main/Me/chuli-cv/cv-en.typ
typst
#import "@preview/chuli-cv:0.1.0": * #import "@preview/fontawesome:0.1.0": * #show: cv #let icons = ( phone: fa-phone(), homepage: fa-home(fill: colors.accent), linkedin: fa-linkedin(fill: colors.accent), github: fa-github(fill: colors.accent), xing: fa-xing(), mail: fa-envelope(fill: colors.accent), book: fa-book(fill: colors.accent), cook: fa-utensils(fill: colors.accent), bike: fa-biking(fill: colors.accent), game: fa-gamepad(fill: colors.accent), robot: fa-robot(fill: colors.accent), bed: fa-bed(fill: colors.accent), write: fa-pen-to-square(fill: colors.accent), talk: fa-comments(fill: colors.accent), code: fa-code(fill: colors.accent), paint: fa-paintbrush(fill: colors.accent), music: fa-music(fill: colors.accent), friends: fa-users(fill: colors.accent), beer: fa-beer(fill: colors.accent), ) #header( full-name: [<NAME>], job-title: [Software Developer], socials: ( ( icon: icons.github, text: [npujol], link: "https://github.com/npujol" ), ( icon: icons.mail, text: [<EMAIL>], link: "mailto://<EMAIL>" ), ( icon: icons.linkedin, text: [<NAME>], link: "https://linkedin.com/in/npujolm/" ), ( icon: icons.homepage, text: [Berlin, Germany], link: "#" ), ), profile-picture: image("media/avatar.jpeg") ) #show: body => columns(2, body) #set par(justify: true) #section("Experience") #entry( title: "Software Developer", company-or-university: "InterContent Group", date: "2021 - Today", location: "Berlin, Germany", logo: image("media/intercontent.jpeg"), description: list( [Development and deployment of microservices on Amazon Web Services (AWS) infrastructure, including Amazon EC2, Amazon ECS, AWS Lambda, and Microsoft Azure Containers Apps, customizing each service to meet specific requirements. 
Employed Docker containers for efficient packaging and deployment.], [Management and automation of the infrastructure provisioning and deployment process through Continuous Integration/Continuous Deployment (CI/CD) pipelines using GitHub Actions, Infrastructure as Code (IaC) with Terraform, and configuration management with YAML.], [Implementation of monitoring and logging solutions using Amazon CloudWatch and Sentry, enabling efficient troubleshooting and performance analysis. Utilized the Postman API for testing and validating API endpoints.], ), ) #entry( title: "Software Developer", company-or-university: "Universidad de las Ciencias Informáticas", date: "2014 - 2019", location: "Havanna, Cuba", logo: image( "media/universidad_de_las_ciencias_informaticas_logo.jpeg", width: 10pt, height: 10pt ), description: list( [Development of the university's postgraduate activities' management website using Python, Django, Django REST Framework, and PostgreSQL as the primary databases. Implemented backend functionality, adhering to best practices in back-end web development.], [Automatization of data cleaning tasks and error identification processes using Celery and Redis, streamlining data preprocessing and ensuring data integrity.], [Data analysis, pattern finding, and predictive modeling using Python libraries such as Jupyter, scikit-learn, NumPy, pandas, and Cython; enabling data-driven decision-making and insights.], ) ) #section("Education") #education-entry( title: "Master's degree in advanced informatic", company-or-university: "Universidad de las Ciencias Informáticas", date: "05/2016 - 12/2018", location: "Havanna, Cuba", logo: image( "media/universidad_de_las_ciencias_informaticas_logo.jpeg", ), gpa: "4.82", gpa-total: "5.0" ) #education-entry( title: "Bachelor of Software Engineering", company-or-university: "Universidad de las Ciencias Informáticas", date: "09/2009 - 07/2014", location: "Havanna, Cuba", logo: image( 
"media/universidad_de_las_ciencias_informaticas_logo.jpeg", ), gpa: "4.71", gpa-total: "5.0" ) #section("Personal Projects") #entry( title: "anki_wiktionary", company-or-university: "Personal Project", date: "2020 - 2024", location: "", logo: image("media/github_avatar.jpeg"), description: list( [Development of a project that generates Anki notes by using the linguistic data from Wiktionary. Implemented a Telegram bot as the primary interface, allowing users to interact and request the creation of Anki notes.], [Utilization of the python-telegram-bot library to build and manage the Telegram bot, facilitating real-time communication and note generation requests. Employed deep-translator and pydantic libraries for automatic language translation and data validation, respectively.], [Integration with Selenium WebDriver and Selenium for web automation, enabling the programmatic creation and management of Anki note decks directly within the Anki application.], ) ) #entry( title: "tts_stories", company-or-university: "Personal Project", date: "2021 - 2024", location: "", logo: image("media/github_avatar.jpeg"), description: list( [Development of a Text-to-Speech (TTS) command-line interface (CLI) tool using Python, enabling users to convert text into spoken audio format leveraging multiple TTS APIs and services such as Coqui and Google TTS.], [Utilization of ffmpeg-python for audio processing and manipulation, providing the option of sending the generated audio file directly to Telegram.], ) ) #section("Skills") #skill( skills: ("python", "django", "terraform", "golang", "aws", "azure", "devops", "docker", "javascript", "postgresql", "celery", "redis", "github actions", "yaml", "latex", "pandas", "pydantic", "git", "linux", "postman", "sentry", "bash", "typst"), ) #section("Languages") #language( name:"Spanish", label:"Native", nivel:5, ) #language( name:"German", label:"B1", nivel:3, ) #language( name:"English", label:"B1", nivel:3, ) #section("My Time") #piechart( 
activities: ( ( name: icons.friends, val: 0.01 ), ( name: icons.book, val: 0.01 ), ( name: icons.talk, val: 0.01 ), ( name: icons.code, val: 0.01 ), ( name: icons.robot, val: 0.009 ), ( name: icons.music, val: 0.009 ), ( name: icons.game, val: 0.008 ), ( name: icons.write, val: 0.008 ), ( name: icons.bike, val: 0.007 ), ( name: icons.cook, val: 0.007 ), ( name: icons.paint, val: 0.003 ), ( name: icons.beer, val: 0.002 ) ) )
https://github.com/rice8y/cetzuron
https://raw.githubusercontent.com/rice8y/cetzuron/main/README.md
markdown
# Neural network drawing package (CeTZuron)

## Installation

### 1. `git clone` the repository

```bash
$ git clone https://github.com/rice8y/cetzuron.git
$ cd cetzuron
```

### 2. Install the package locally with the `justfile`, `.sh`, or `.bat` script

#### 2-1. Using the `justfile`

```bash
$ just install
```

<details>
<summary>Example run on WSL2 (Ubuntu)</summary>

```bash
$ just install
Package cetzuron version 0.1.0 has been installed to /home/rice8/.local/share/typst/packages/local/cetzuron/0.1.0
```

</details>

#### 2-2. Using `.sh`

```bash
$ chmod +x install.sh
$ ./install.sh
```

<details>
<summary>Example run on WSL2 (Ubuntu)</summary>

```bash
$ ./install.sh
Package cetzuron version 0.1.0 has been installed to /home/rice8/.local/share/typst/packages/local/cetzuron/0.1.0
```

</details>

#### 2-3. Using `.bat`

```cmd
> install.bat
```

<details>
<summary>Example run on Windows (cmd)</summary>

```bash
> install.bat
C:install.sh
C:justfile
C:README.md
C:typst.toml
C:docs\ae\sample_ae.pdf
C:docs\ae\sample_ae.png
C:docs\ae\sample_ae.typ
C:docs\fcnn\sample_fcnn.pdf
C:docs\fcnn\sample_fcnn.png
C:docs\fcnn\sample_fcnn.typ
C:docs\lstm\sample_lstm.pdf
C:docs\lstm\sample_lstm.png
C:docs\lstm\sample_lstm.typ
C:docs\rnn\sample_rnn.pdf
C:docs\rnn\sample_rnn.png
C:docs\rnn\sample_rnn.typ
C:src\ae.typ
C:src\fcnn.typ
C:src\lib.typ
C:src\lstm.typ
C:src\requirements.typ
C:src\rnn.typ
23 File(s) copied
Package cetzuron version 0.1.0 has been installed to C:\Users\yoneyama\AppData\Roaming\typst\packages\local\cetzuron\0.1.0
```

</details>

## Usage

Load the package with `#import`:

```typ
#import "@local/cetzuron:0.1.0"
```

### 1. Fully connected neural network `#fcnn`

#### Parameters

```typ
fcnn(
  inputNodes: int,
  middleNodes: int,
  outputNodes: int,
  middleLayers: int,
  label: bool,
) -> content
```

**inputNodes:** number of nodes in the input layer

**middleNodes:** number of nodes in each hidden layer

**outputNodes:** number of nodes in the output layer

**middleLayers:** number of hidden layers (default: 3)

**label:** whether to draw labels (default: true)

#### Example using `#fcnn`

```typ
#import "@local/cetzuron:0.1.0": *
#set page(width: auto, height: auto)
#set text(lang: "ja", font: "TeX Gyre Termes", size: 10pt)
#show regex("[\p{scx:Han}\p{scx:Hira}\p{scx:Kana}]"): set text(lang: "ja", font: "Har<NAME>", size: 10pt)

#figure(
  fcnn(3, 4, 3),
  caption: [ラベル表示 Ver.]
)

#figure(
  fcnn(5, 4, 3, middleLayers: 1, label: false),
  caption: [ラベル非表示 Ver.]
)
```

![sample](./docs/fcnn/sample_fcnn.png)

### 2. Recurrent neural network `#rnn`

#### Parameters

```typ
rnn(
  inputNodes: int,
  middleNodes: int,
  outputNodes: int,
  middleLayers: int,
  label: bool,
) -> content
```

**inputNodes:** number of nodes in the input layer

**middleNodes:** number of nodes in each hidden layer

**outputNodes:** number of nodes in the output layer

**middleLayers:** number of hidden layers (default: 3)

**label:** whether to draw labels (default: true)

#### Example using `#rnn`

```typ
#import "@local/cetzuron:0.1.0": *
#set page(width: auto, height: auto)
#set text(lang: "ja", font: "TeX Gyre Termes", size: 10pt)
#show regex("[\p{scx:Han}\p{scx:Hira}\p{scx:Kana}]"): set text(lang: "ja", font: "<NAME>", size: 10pt)

#figure(
  rnn(3, 4, 3),
  caption: [ラベル表示 Ver.]
)

#figure(
  rnn(5, 4, 3, middleLayers: 1, label: false),
  caption: [ラベル非表示 Ver.]
)
```

![sample](./docs/rnn/sample_rnn.png)

### Long short-term memory `#lstm`

#### Parameters

```typ
lstm(
  inputNodes: int,
  middleNodes: int,
  outputNodes: int,
  middleLayers: int,
  label: bool,
) -> content
```

**inputNodes:** number of nodes in the input layer

**middleNodes:** number of nodes in each hidden layer

**outputNodes:** number of nodes in the output layer

**middleLayers:** number of hidden layers (default: 3)

**label:** whether to draw labels (default: true)

#### Example using `#lstm`

```typ
#import "@local/cetzuron:0.1.0": *
#set page(width: auto, height: auto)
#set text(lang: "ja", font: "TeX Gyre Termes", size: 10pt)
#show regex("[\p{scx:Han}\p{scx:Hira}\p{scx:Kana}]"): set text(lang: "ja", font: "<NAME>", size: 10pt)

#figure(
  lstm(3, 4, 3),
  caption: [ラベル表示 Ver.]
)

#figure(
  lstm(5, 4, 3, middleLayers: 1, label: false),
  caption: [ラベル非表示 Ver.]
)
```

![sample](./docs/lstm/sample_lstm.png)

### Autoencoder `#ae`

#### Parameters

```typ
ae(
  inputNodes: int,
  middleNodes: int,
  style: string,
  label: bool,
) -> content
```

**inputNodes:** number of nodes in the input/output layers

**middleNodes:** number of nodes in the hidden layer

**style:** shape of the hidden layers \["short", "full"\] (default: "short")

**label:** whether to draw labels (default: true)

#### Example using `#ae`

```typ
#import "@local/cetzuron:0.1.0": *
#set page(width: auto, height: auto)
#set text(lang: "ja", font: "TeX Gyre Termes", size: 10pt)
#show regex("[\p{scx:Han}\p{scx:Hira}\p{scx:Kana}]"): set text(lang: "ja", font: "<NAME>", size: 10pt)

#figure(
  ae(5, 3),
  caption: [ラベル表示 Ver. (short)]
)

#figure(
  ae(5, 3, style: "full"),
  caption: [ラベル表示 Ver. (full)]
)

#figure(
  ae(4, 2, style: "full", label: false),
  caption: [ラベル非表示 Ver. (full)]
)
```

![sample](./docs/ae/sample_ae.png)
https://github.com/jeffa5/typst-cambridge
https://raw.githubusercontent.com/jeffa5/typst-cambridge/main/thesis/manual-1.typ
typst
MIT License
#import "cambridge.typ": * #show: chapter = Introduction #lorem(100) == A subsection === A subsection ==== A subsection ===== A subsection ====== A subsection #figure( caption: [#lorem(10)], ```python print("Hello, world!") ```, ) #figure(caption: [Test figure image], image("./CollegeShields/CUni.svg")) #figure(caption: [Test figure image 2], image("./CollegeShields/Trinity.svg")) #figure(caption: [Test Table], table(columns: 2, [*Key*], [*Value*]))
https://github.com/tsar-boomba/resume
https://raw.githubusercontent.com/tsar-boomba/resume/main/README.md
markdown
# My Resume ✨😁✨ My resume! 😁 Whatever is in this repository is what [my website](https://igamble.dev) links to. Built with [✨Typst✨](https://typst.app/)
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/list_02.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page - Level 1 - Level #[ 2 through content block ]
https://github.com/DieracDelta/presentations
https://raw.githubusercontent.com/DieracDelta/presentations/master/08_16_24/main.typ
typst
#import "polylux/polylux.typ": * #import themes.metropolis: * #import "common.typ": * #show: metropolis-theme.with( footer: [#logic.logical-slide.display() / #utils.last-slide-number] ) #set text(font: font, weight: wt, size: 25pt) // #show math.equation: set text(font: "Fira Math") #set strong(delta: 100) #set par(justify: true) #title-slide( author: [<NAME>], title: "Slides - 8/16", ) #slide(title: "Table of contents")[ #metropolis-outline ] #new-section-slide([Pitch]) #slide(title: "Abbreviated Pitch" )[ - Compile C to Rust - Improve on C2Rust flaws - Guarantee that Rust code matches or improves on behavior of C code - Formalize - Semantics, borrow checking, memory model of small subset of Rust - Lifting from unsafe to safe Rust with already-defined semantics ] #new-section-slide([Plan]) #slide(title: "This week")[ - review concrat - multi-file translation ] #slide(title: "Next week")[ - slides on aeneas - complete multi-file translation ] #slide(title: "C2Rust")[ - Fed `compile_commands.json` - Copy-paste of header types into each corresponding C file - Externally link against function declarations and globals - Broken with recent rust toolchains - Difficult to understand/unnecessary ] #slide(title: "Underlying difference")[ - C2Rust: compiled together, linked at crate level - GCC: compiled separately into object files, then linked together ] #slide(title: "Laertes")[ - Deduplication pass (ResolveImports) after C2Rust transpilation - Data structure location arbitrary - Does not detect executables ] #slide(title: "Conceptual design")[ #set text(font: font, weight: wt, size: 22pt) - Direction 1: - Exactly match C - Workspace with crate (separate compilation unit) per object file - Exactly follow `compile_commands.json` - Shim outside compcert - Direction 2: - Follow prior work, (not faithful to C) - Compile directly within crate as Laertes does - Maintain distinction between header and c file - Shim within compcert - In both cases: - support executables - process 
`compile-commands.json` ] // #new-section-slide([Borrow checking via symbolic semantics]) // // #slide(title: "Review: Aeneas")[ // - Low Level Borrow Calculus - modeled after MIR // - Value based. No memory, addresses, pointer arithmetic // - Ownership is Oxide style: modeled via regions, loans instead of semantic lifetimes // - Aeneas workflow: LLBC -> lambda calculus -> itp (F\* or coq) // ] // // #slide(title: "Review: Aeneas")[ // - Limitations // - No unsafe // - No interior mutability // - Pros // - LLBC semantics are intuitive // ] // // #slide(title: "Example of LLBC semantics")[ // ```rust // let mut x = 0; // // x -> 0 // let mut px = &mut x; // // x -> loan_mut l // // px -> borrow_mut l 0 // let ppx = &mut px; // // x -> loan_mut l // // px -> borrow_mut l' // // ppx -> borrow_mut l' (borrow_mut l 0) // ``` // // ] // // #slide(title: "Symbolic Semantics for LLB")[ // // ] // // #new-section-slide([Switch statement]) // // #slide(title: "Existing literature")[ // - Relooper // - Pros // - Cons // - // ]
https://github.com/Akelio-zhang/cv-typst
https://raw.githubusercontent.com/Akelio-zhang/cv-typst/main/meta.typ
typst
#let today() = { let month = datetime.today().month(); let day = datetime.today().day(); let year = datetime.today().year(); [#year 年 #month 月#day 日] } #let today_en() = { let month = ( "January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December", ).at(datetime.today().month() - 1); let day = datetime.today().day(); let year = datetime.today().year(); [#month #day, #year] }
https://github.com/Myriad-Dreamin/tinymist
https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/type_check/recursive.typ
typst
Apache License 2.0
// path: base.typ #let a(x) = a; ----- #import "base.typ": * #let f() = a()
https://github.com/Coekjan/touying-buaa
https://raw.githubusercontent.com/Coekjan/touying-buaa/master/examples/main.typ
typst
MIT License
#import "@preview/cetz:0.2.2" #import "@preview/fletcher:0.5.1" as fletcher: node, edge #import "@preview/touying:0.5.2": * #import "../lib.typ": * // i.e. "@preview/touying-buaa:<latest>" // cetz and fletcher bindings for touying #let cetz-canvas = touying-reducer.with(reduce: cetz.canvas, cover: cetz.draw.hide.with(bounds: true)) #let fletcher-diagram = touying-reducer.with(reduce: fletcher.diagram, cover: fletcher.hide) #show: buaa-theme.with( // Lang and font configuration lang: "zh", font: ("Linux Libertine", "Source Han Sans SC", "Source Han Sans"), // Basic information config-info( title: [Typst Slide Theme for Beihang University Based on Touying], subtitle: [基于 Touying 的北京航空航天大学 Typst 幻灯片模板], author: [<NAME>], date: datetime.today(), institution: [北京航空航天大学], ), // Pdfpc configuration // typst query --root . ./examples/main.typ --field value --one "<pdfpc-file>" > ./examples/main.pdfpc config-common(preamble: pdfpc.config( duration-minutes: 30, start-time: datetime(hour: 14, minute: 10, second: 0), end-time: datetime(hour: 14, minute: 40, second: 0), last-minutes: 5, note-font-size: 12, disable-markdown: false, default-transition: ( type: "push", duration-seconds: 2, angle: ltr, alignment: "vertical", direction: "inward", ), )), ) #title-slide() #outline-slide() = Typst 与 Touying #tblock(title: [Typst])[ Typst 是一门新的基于标记的排版系统,它强大且易于学习。本演示文稿不详细介绍 Typst 的使用,你可以在 Typst 的#link("https://typst.app/docs")[文档]中找到更多信息。 ] #tblock(title: [Touying])[ Touying 是为 Typst 开发的幻灯片/演示文稿包。Touying 也类似于 LaTeX 的 Beamer,但是得益于 Typst,你可以拥有更快的渲染速度与更简洁的语法。你可以在 Touying 的#link("https://touying-typ.github.io/touying/zh/docs/intro")[文档]中详细了解 Touying。 Touying 取自中文里的「投影」,在英文中意为 project。相较而言,LaTeX 中的 beamer 就是德文的投影仪的意思。 ] = Touying 幻灯片动画 == 简单动画 使用 ```typ #pause``` #pause 暂缓显示内容。 #pause 就像这样。 #meanwhile 同时,#pause 我们可以使用 ```typ #meanwhile``` 来 #pause 显示同时其他内容。 #speaker-note[ 使用 ```typ config-common(show-notes-on-second-screen: right)``` 来启用演讲提示,否则将不会显示。 ] == 复杂动画 - Mark-Style 在子幻灯片 
#touying-fn-wrapper((self: none) => str(self.subslide)) 中,我们可以: 使用 #uncover("2-")[```typ #uncover``` 函数](预留空间) 使用 #only("2-")[```typ #only``` 函数](不预留空间) #alternatives[多次调用 ```typ #only``` 函数 \u{2717}][使用 ```typ #alternatives``` 函数 #sym.checkmark] 从多个备选项中选择一个。 == 复杂动画 - Callback-Style #slide(repeat: 3, self => [ #let (uncover, only, alternatives) = utils.methods(self) 在子幻灯片 #self.subslide 中,我们可以: 使用 #uncover("2-")[```typ #uncover``` 函数](预留空间) 使用 #only("2-")[```typ #only``` 函数](不预留空间) #alternatives[多次调用 ```typ #only``` 函数 \u{2717}][使用 ```typ #alternatives``` 函数 #sym.checkmark] 从多个备选项中选择一个。 ]) == 数学公式动画 在 Touying 数学公式中使用 `pause`: #touying-equation(` f(x) &= pause x^2 + 2x + 1 \ &= pause (x + 1)^2 \ `) #meanwhile 如您所见,#pause 这是 $f(x)$ 的表达式。 #pause 通过因式分解,我们得到了结果。 = 与其他 Typst 包集成 == CeTZ 动画 在 Touying 中集成 CeTZ 动画: #cetz-canvas({ import cetz.draw: * rect((0,0), (5,5)) (pause,) rect((0,0), (1,1)) rect((1,1), (2,2)) rect((2,2), (3,3)) (pause,) line((0,0), (2.5, 2.5), name: "line") }) == Fletcher 动画 在 Touying 中集成 Fletcher 动画: #fletcher-diagram( node-stroke: .1em, node-fill: gradient.radial(blue.lighten(80%), blue, center: (30%, 20%), radius: 80%), spacing: 4em, edge((-1,0), "r", "-|>", `open(path)`, label-pos: 0, label-side: center), node((0,0), `reading`, radius: 2em), edge((0,0), (0,0), `read()`, "--|>", bend: 130deg), pause, edge(`read()`, "-|>"), node((1,0), `eof`, radius: 2em), pause, edge(`close()`, "-|>"), node((2,0), `closed`, radius: 2em, extrude: (-2.5, 0)), edge((0,0), (2,0), `close()`, "-|>", bend: -40deg), ) == 其他例子 #tblock(title: [Pinit, MiTeX, Codly, Ctheorems...])[ Touying 社区正在探索与更多 Typst 包的集成,详细情况可查阅#link("https://touying-typ.github.io/zh/docs/category/package-integration/")[文档]。 ] = 其他功能 == 双栏布局 #slide(composer: (1fr, 1fr))[ 我仰望星空, 它是那样辽阔而深邃; 那无穷的真理, 让我苦苦地求索、追随。 我仰望星空, 它是那样庄严而圣洁; 那凛然的正义, 让我充满热爱、感到敬畏。 ][ 我仰望星空, 它是那样自由而宁静; 那博大的胸怀, 让我的心灵栖息、依偎。 我仰望星空, 它是那样壮丽而光辉; 那永恒的炽热, 让我心中燃起希望的烈焰、响起春雷。 ] == 内容跨页 
豫章故郡,洪都新府。星分翼轸,地接衡庐。襟三江而带五湖,控蛮荆而引瓯越。物华天宝,龙光射牛斗之墟;人杰地灵,徐孺下陈蕃之榻。雄州雾列,俊采星驰。台隍枕夷夏之交,宾主尽东南之美。都督阎公之雅望,棨戟遥临;宇文新州之懿范,襜帷暂驻。十旬休假,胜友如云;千里逢迎,高朋满座。腾蛟起凤,孟学士之词宗;紫电青霜,王将军之武库。家君作宰,路出名区;童子何知,躬逢胜饯。 时维九月,序属三秋。潦水尽而寒潭清,烟光凝而暮山紫。俨骖騑于上路,访风景于崇阿。临帝子之长洲,得天人之旧馆。层峦耸翠,上出重霄;飞阁流丹,下临无地。鹤汀凫渚,穷岛屿之萦回;桂殿兰宫,即冈峦之体势。 披绣闼,俯雕甍,山原旷其盈视,川泽纡其骇瞩。闾阎扑地,钟鸣鼎食之家;舸舰弥津,青雀黄龙之舳。云销雨霁,彩彻区明。落霞与孤鹜齐飞,秋水共长天一色。渔舟唱晚,响穷彭蠡之滨,雁阵惊寒,声断衡阳之浦。 遥襟甫畅,逸兴遄飞。爽籁发而清风生,纤歌凝而白云遏。睢园绿竹,气凌彭泽之樽;邺水朱华,光照临川之笔。四美具,二难并。穷睇眄于中天,极娱游于暇日。天高地迥,觉宇宙之无穷;兴尽悲来,识盈虚之有数。望长安于日下,目吴会于云间。地势极而南溟深,天柱高而北辰远。关山难越,谁悲失路之人;萍水相逢,尽是他乡之客。怀帝阍而不见,奉宣室以何年? 嗟乎!时运不齐,命途多舛。冯唐易老,李广难封。屈贾谊于长沙,非无圣主;窜梁鸿于海曲,岂乏明时?所赖君子见机,达人知命。老当益壮,宁移白首之心?穷且益坚,不坠青云之志。酌贪泉而觉爽,处涸辙以犹欢。北海虽赊,扶摇可接;东隅已逝,桑榆非晚。孟尝高洁,空余报国之情;阮籍猖狂,岂效穷途之哭! 勃,三尺微命,一介书生。无路请缨,等终军之弱冠;有怀投笔,慕宗悫之长风。舍簪笏于百龄,奉晨昏于万里。非谢家之宝树,接孟氏之芳邻。他日趋庭,叨陪鲤对;今兹捧袂,喜托龙门。杨意不逢,抚凌云而自惜;钟期既遇,奏流水以何惭? 呜呼!胜地不常,盛筵难再;兰亭已矣,梓泽丘墟。临别赠言,幸承恩于伟饯;登高作赋,是所望于群公。敢竭鄙怀,恭疏短引;一言均赋,四韵俱成。请洒潘江,各倾陆海云尔。 #align(center)[ 滕王高阁临江渚,佩玉鸣鸾罢歌舞。\ 画栋朝飞南浦云,珠帘暮卷西山雨。\ 闲云潭影日悠悠,物换星移几度秋。\ 阁中帝子今何在?槛外长江空自流。 ] // appendix by freezing last-slide-number #show: appendix == 附注 #slide[ - 您可以使用: ```sh typst init @preview/touying-buaa ``` 来创建基于本模板的演示文稿项目。 - 本模板仓库位于 #link("https://github.com/Coekjan/touying-buaa"),欢迎关注与贡献。 ]
https://github.com/QuarticCat/detypify
https://raw.githubusercontent.com/QuarticCat/detypify/main/README.md
markdown
MIT License
<div align="center"> <img src="./assets/logo.svg" alt="logo" width="150"/> <h1>Detypify</h1> <p> Can't remember some Typst symbol? <a href="https://detypify.quarticcat.com/">Draw it!</a> </p> </div> ## Features - **PWA**: installable and works offline - **Tiny model**: 1.3 MiB (ONNX), fast to load and run - **Decent symbol set**: support 400+ symbols ## News - 2024-04-06: This project has been integrated into [Tinymist](https://github.com/Myriad-Dreamin/tinymist). ## Associated Repos - [detypify-data](https://github.com/QuarticCat/detypify-data): Detypify's own dataset (your contributions on the website finally go here) - [detypify-external](https://github.com/QuarticCat/detypify-external): Necessary external data to bootstrap Detypify ## Use As A Library Use the [detypify-service](https://www.npmjs.com/package/detypify-service) NPM package. ## Self Deployment Download files from [gh-pages](https://github.com/QuarticCat/detypify/tree/gh-pages) branch and host them using any HTTP server. ## Development If you want to build `migrate` or `train`, you need to pull submodules. ([Git LFS](https://git-lfs.com/) is required) ```console $ git submodule update --init --recursive ``` If you just want to build `web`, you can download `train-out` from [NPM](https://www.npmjs.com/package/detypify-service?activeTab=code) to project root. ### Migrating ```console $ rye sync # install venv and denpendencies $ rye run migrate # migrate $ rye run migrate-font # strip font (optional) ``` ### Training ```console $ rye sync # install venv and denpendencies $ rye run train # train ``` ### Web Page ```console $ bun run --cwd=service copy # copy train-out folder $ bun install # install dependencies $ bun run dev # start dev server $ bun run build # build for production ``` ### Logo & Favicons (Optional) 1. Install *NewComputerModernMath* font ([guide](https://wiki.archlinux.org/title/TeX_Live#Making_fonts_available_to_Fontconfig)). 1. Convert `manuscript.svg`. 
```console $ cd assets $ inkscape manuscript.svg --export-text-to-path --export-filename=logo.svg $ bunx svgo --multipass logo.svg ``` 1. Generate favicons by [Favicon InBrowser.App](https://favicon.inbrowser.app/tools/favicon-generator) using `logo.svg`. 1. Move them to [web/public/icons](web/public/icons). ## License MIT
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-11000.typ
typst
Apache License 2.0
#let data = ( ("BRAHMI SIGN CANDRABINDU", "Mc", 0), ("BRAHMI SIGN ANUSVARA", "Mn", 0), ("BRAHMI SIGN VISARGA", "Mc", 0), ("BRAHMI SIGN JIHVAMULIYA", "Lo", 0), ("BRAHMI SIGN UPADHMANIYA", "Lo", 0), ("BRAHMI LETTER A", "Lo", 0), ("BRAHMI LETTER AA", "Lo", 0), ("BRAHMI LETTER I", "Lo", 0), ("BRAHMI LETTER II", "Lo", 0), ("BRAHMI LETTER U", "Lo", 0), ("BRAHMI LETTER UU", "Lo", 0), ("BRAHMI LETTER VOCALIC R", "Lo", 0), ("BRAHMI LETTER VOCALIC RR", "Lo", 0), ("BRAHMI LETTER VOCALIC L", "Lo", 0), ("BRAHMI LETTER VOCALIC LL", "Lo", 0), ("BRAHMI LETTER E", "Lo", 0), ("BRAHMI LETTER AI", "Lo", 0), ("BRAHMI LETTER O", "Lo", 0), ("BRAHMI LETTER AU", "Lo", 0), ("BRAHMI LETTER KA", "Lo", 0), ("BRAHMI LETTER KHA", "Lo", 0), ("BRAHMI LETTER GA", "Lo", 0), ("BRAHMI LETTER GHA", "Lo", 0), ("BRAHMI LETTER NGA", "Lo", 0), ("BRAHMI LETTER CA", "Lo", 0), ("BRAHMI LETTER CHA", "Lo", 0), ("BRAHMI LETTER JA", "Lo", 0), ("BRAHMI LETTER JHA", "Lo", 0), ("BRAHMI LETTER NYA", "Lo", 0), ("BRAHMI LETTER TTA", "Lo", 0), ("BRAHMI LETTER TTHA", "Lo", 0), ("BRAHMI LETTER DDA", "Lo", 0), ("BRAHMI LETTER DDHA", "Lo", 0), ("BRAHMI LETTER NNA", "Lo", 0), ("BRAHMI LETTER TA", "Lo", 0), ("BRAHMI LETTER THA", "Lo", 0), ("BRAHMI LETTER DA", "Lo", 0), ("BRAHMI LETTER DHA", "Lo", 0), ("BRAHMI LETTER NA", "Lo", 0), ("BRAHMI LETTER PA", "Lo", 0), ("BRAHMI LETTER PHA", "Lo", 0), ("BRAHMI LETTER BA", "Lo", 0), ("BRAHMI LETTER BHA", "Lo", 0), ("BRAHMI LETTER MA", "Lo", 0), ("BRAHMI LETTER YA", "Lo", 0), ("BRAHMI LETTER RA", "Lo", 0), ("BRAHMI LETTER LA", "Lo", 0), ("BRAHMI LETTER VA", "Lo", 0), ("BRAHMI LETTER SHA", "Lo", 0), ("BRAHMI LETTER SSA", "Lo", 0), ("BRAHMI LETTER SA", "Lo", 0), ("BRAHMI LETTER HA", "Lo", 0), ("BRAHMI LETTER LLA", "Lo", 0), ("BRAHMI LETTER OLD TAMIL LLLA", "Lo", 0), ("BRAHMI LETTER OLD TAMIL RRA", "Lo", 0), ("BRAHMI LETTER OLD TAMIL NNNA", "Lo", 0), ("BRAHMI VOWEL SIGN AA", "Mn", 0), ("BRAHMI VOWEL SIGN BHATTIPROLU AA", "Mn", 0), ("BRAHMI VOWEL SIGN I", "Mn", 0), ("BRAHMI VOWEL SIGN II", 
"Mn", 0), ("BRAHMI VOWEL SIGN U", "Mn", 0), ("BRAHMI VOWEL SIGN UU", "Mn", 0), ("BRAHMI VOWEL SIGN VOCALIC R", "Mn", 0), ("BRAHMI VOWEL SIGN VOCALIC RR", "Mn", 0), ("BRAHMI VOWEL SIGN VOCALIC L", "Mn", 0), ("BRAHMI VOWEL SIGN VOCALIC LL", "Mn", 0), ("BRAHMI VOWEL SIGN E", "Mn", 0), ("BRAHMI VOWEL SIGN AI", "Mn", 0), ("BRAHMI VOWEL SIGN O", "Mn", 0), ("BRAHMI VOWEL SIGN AU", "Mn", 0), ("BRAHMI VIRAMA", "Mn", 9), ("BRAHMI DANDA", "Po", 0), ("BRAHMI DOUBLE DANDA", "Po", 0), ("BRAHMI PUNCTUATION DOT", "Po", 0), ("BRAHMI PUNCTUATION DOUBLE DOT", "Po", 0), ("BRAHMI PUNCTUATION LINE", "Po", 0), ("BRAHMI PUNCTUATION CRESCENT BAR", "Po", 0), ("BRAHMI PUNCTUATION LOTUS", "Po", 0), (), (), (), (), ("BRAHMI NUMBER ONE", "No", 0), ("BRAHMI NUMBER TWO", "No", 0), ("BRAHMI NUMBER THREE", "No", 0), ("BRAHMI NUMBER FOUR", "No", 0), ("BRAHMI NUMBER FIVE", "No", 0), ("BRAHMI NUMBER SIX", "No", 0), ("BRAHMI NUMBER SEVEN", "No", 0), ("BRAHMI NUMBER EIGHT", "No", 0), ("BRAHMI NUMBER NINE", "No", 0), ("BRAHMI NUMBER TEN", "No", 0), ("BRAHMI NUMBER TWENTY", "No", 0), ("BRAHMI NUMBER THIRTY", "No", 0), ("BRAHMI NUMBER FORTY", "No", 0), ("BRAHMI NUMBER FIFTY", "No", 0), ("BRAHMI NUMBER SIXTY", "No", 0), ("BRAHMI NUMBER SEVENTY", "No", 0), ("BRAHMI NUMBER EIGHTY", "No", 0), ("BRAHMI NUMBER NINETY", "No", 0), ("BRAHMI NUMBER ONE HUNDRED", "No", 0), ("BRAHMI NUMBER ONE THOUSAND", "No", 0), ("BRAHMI DIGIT ZERO", "Nd", 0), ("BRAHMI DIGIT ONE", "Nd", 0), ("BRAHMI DIGIT TWO", "Nd", 0), ("BRAHMI DIGIT THREE", "Nd", 0), ("BRAHMI DIGIT FOUR", "Nd", 0), ("BRAHMI DIGIT FIVE", "Nd", 0), ("BRAHMI DIGIT SIX", "Nd", 0), ("BRAHMI DIGIT SEVEN", "Nd", 0), ("BRAHMI DIGIT EIGHT", "Nd", 0), ("BRAHMI DIGIT NINE", "Nd", 0), ("BRAHMI SIGN OLD TAMIL VIRAMA", "Mn", 9), ("BRAHMI LETTER OLD TAMIL SHORT E", "Lo", 0), ("BRAHMI LETTER OLD TAMIL SHORT O", "Lo", 0), ("BRAHMI VOWEL SIGN OLD TAMIL SHORT E", "Mn", 0), ("BRAHMI VOWEL SIGN OLD TAMIL SHORT O", "Mn", 0), ("BRAHMI LETTER OLD TAMIL LLA", "Lo", 0), (), (), (), (), 
(), (), (), (), (), ("BRAHMI NUMBER JOINER", "Mn", 9), )
https://github.com/antonWetzel/Masterarbeit
https://raw.githubusercontent.com/antonWetzel/Masterarbeit/main/arbeit/abstrakt.typ
typst
#import "setup.typ": * #set heading(numbering: none, outlined: false) #v(1fr) #align(center, block(width: 80%, [ == Abstrakt #align(left)[ Diese Arbeit beschäftigt sich mit der Verarbeitung, Analyse und Visualisierung von Waldgebieten mithilfe von Punktwolken. Dabei wird der komplette Ablauf vom Datensatz bis zur Visualisierung der einzelnen Bäume mit relevanten Informationen durchgeführt. Ein Datensatz ist dabei ein Lidar-Scan von einem Waldgebiet, welcher als Liste von Punkten gegeben ist. Die Punkte können in beliebiger Reihenfolge abgespeichert sein und für jeden Punkt wird nur die dreidimensionale Position benötigt. Zuerst werden die Punkte automatisch in einzelne Bäume unterteilt. Die Analyse berechnet für die Bäume und einzelne Punkte danach relevante Informationen und die Daten werden für die Visualisierung in Echtzeit vorbereitet. In dem zugehörigen Softwareprojekt sind die vorgestellten Methoden, Datenstrukturen und Algorithmen umgesetzt. Die Analyse und Visualisierung der Waldgebiete werden mit der Umsetzung getestet und ausgewertet. ] ])) #v(2fr) #image("../images/auto-crop/br06-uls.png") #v(1fr)
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/stack-1_00.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test stacks with different directions. #let widths = ( 30pt, 20pt, 40pt, 15pt, 30pt, 50%, 20pt, 100%, ) #let shaded(i, w) = { let v = (i + 1) * 10% rect(width: w, height: 10pt, fill: rgb(v, v, v)) } #let items = for (i, w) in widths.enumerate() { (align(right, shaded(i, w)),) } #set page(width: 50pt, margin: 0pt) #stack(dir: btt, ..items)
https://github.com/benjft/CV
https://raw.githubusercontent.com/benjft/CV/master/education.typ
typst
== Education *Computer Science*, BSc (Hons), Coventry University, 1st *Physics with Theoretical Physics*, Undergraduate Diploma, University of Nottingham
https://github.com/antonWetzel/Masterarbeit
https://raw.githubusercontent.com/antonWetzel/Masterarbeit/main/verteidigung/main.typ
typst
#import "@preview/polylux:0.3.1": * #import "theme.typ": * #import "@preview/cetz:0.2.0" as cetz #show: doc => setup(footer: [<NAME> | #datetime.today().display("[day].[month].[year]")], doc) #title-slide( title: [Analyse und Visualisierung von Bäumen mit 3D-Punktwolken], ) #set figure(numbering: none) #new-section[Überblick] #let image2(path) = rect( image(path, width: 5cm, height: 3cm), inset: 0.5pt, ) #let rect2 = rect #let grid2 = grid #normal-slide( title: [Ablauf], )[ #set align(center) #cetz.canvas(length: 1.0cm, { import cetz.draw: * let box-width = 5.0 let box(x, y, name, stroke: 1pt) = { rect((x, y), (x + box-width, y + 1), name: name, stroke: stroke) content(name, raw(name)) } set-style(mark: (end: ">", fill: black, scale: 1.4, width: 3.5pt), stroke: black) box(0, 0, "Wald") box(6, 0, "Punktwolke") box(12, 0, "Segmente") box(18, 0, "Eigenschaften") let arrow(x, text) = { bezier((x + 3.0, 1), (x + 8.0, 1), (x + 5.5, 2)) content((x + 5.5, 1.6), anchor: "south", text) } arrow(0.0, [Lidar-Scan]) arrow(6.0, [ #only(1)[Segmentierung] #only(2)[*Segmentierung*] ]) arrow(12.0, [ #only(1)[Analyse] #only(2)[*Analyse*] ]) content((box-width / 2, -1.8), image2("../images/forest-example.jpg")) content((6 + box-width / 2, -1.8), image2("../images/eye_dome_white_with-edited.png")) content((12 + box-width / 2, -1.8), image2("../images/segment-step.png")) content((18 + box-width / 2, -1.8), rect2(width: 5cm, height: 3cm, inset: 0pt, grid2( columns: 1 * 4, gutter: 1fr, image("../images/crop/prop_classification.png"), image("../images/crop/prop_height.png"), image("../images/crop/prop_curve_all.png"), image("../images/crop/prop_var_all.png"), ))) }) ] #normal-slide( title: [Eingabedaten], )[ - Lidar-Scan von einem Wald - Sensor - (`TLS`) Terrestrial - (`ULS`) Drohne - (`ALS`) Flugzeug - Kombination von mehreren Messungen ][ #import cetz.draw: * #let l = ( 3.46, 3.25, 3.1, 3.05, 3.0, 3.05, 1.97, 1.77, 1.75, 1.94, 2.7, 2.65, 6, ) #let outer() = { rect((-2, 1), (6, -3.5)) } 
#let base() = { rect((-2, -2.99), (6, -3.5), fill: gray, stroke: none) line((-2, -3), (0.9, -3), (0.9, -2), (1.1, -2), (1.1, -3), (2.3, -3), (2.3, -2), (2.5, -2), (2.5, -3), (6, -3), fill: gray) circle((2.4, -2), fill: gray, radius: 0.5) circle((1, -2), fill: gray, radius: 0.5) } #let lines() = { for i in range(0, l.len()) { line((0, 0), (240deg + i * 7.5deg, l.at(i))) } } #let points() = { for i in range(0, l.len()) { circle((240deg + i * 7.5deg, l.at(i)), radius: 0.1, fill: black, stroke: none) } } #let scanner() = { circle((0, 0), stroke: black, fill: red, radius: 0.1) } #let length = 1.4cm #only(1, cetz.canvas(length: length, { outer() base() scanner() })) #only(2, cetz.canvas(length: length, { outer() base() lines() scanner() })) #only(3, cetz.canvas(length: length, { outer() points() })) #only(4, cetz.canvas(length: length, { outer() let stroke = (dash: "dotted", thickness: 2pt) line((-2, -3), (0.9, -3), (0.9, -2), (1.1, -2), (1.1, -3), (2.3, -3), (2.3, -2), (2.5, -2), (2.5, -3), (6, -3), stroke: stroke) circle((2.4, -2), radius: 0.5, stroke: stroke, fill: white) circle((1.0, -2), radius: 0.5, stroke: stroke, fill: white) rect((2.325, -3), (2.4675, -2), fill: white, stroke: none) rect((2.325 - 1.4, -3), (2.4675 - 1.4, -2), fill: white, stroke: none) })) ] #new-section[Segmentierung] #normal-slide( title: [Ziel], columns: (1fr, 1.5fr), expand-content: true, )[ - Ein Segment für jeden Baum - Punkte im gleichen Segment zusammenfassen ][ #set align(horizon) #image("../images/auto-crop/segments-br05-als.png") ] #normal-slide( title: [Ablauf], columns: (1.1fr, 2fr), )[ + *Horizontale Scheiben* + Bereiche + Koordinaten + Punkte zuordnen ][#{ let images = range(1, 9).map(i => image("../images/crop/layers_" + str(i) + ".png")); only(1, { image("../images/crop/layers_ori.png") }) only(2, { image("../images/crop/layers.png") }) for (i, img) in images.enumerate() { only(i + 3, img) } }] #let double-image(a, b, text) = { grid( columns: 1 * (1fr, 1fr), gutter: 1em, 
rect(image(a), inset: 0.5pt), rect(image(b), inset: 0.5pt), grid.cell(colspan: 2, align(center, text)) ) } #normal-slide( title: [Ablauf], columns: (1.1fr, 2fr), )[ + Horizontale Scheiben + *Bereiche* + Koordinaten + Punkte zuordnen ][ #double-image("../images/test_5-areas.svg", "../images/test_6-areas.svg", [Zusammenhängende Bereiche]) ] #normal-slide( title: [Ablauf], columns: (1.1fr, 2fr), )[ + Horizontale Scheiben + Bereiche + *Koordinaten* + Punkte zuordnen ][ #double-image("../images/test_5-coords.svg", "../images/test_6-coords.svg", [Koordinaten der Bäume]) ] #normal-slide( title: [Ablauf], columns: (1.1fr, 2fr), )[ + Horizontale Scheiben + Bereiche + Koordinaten + *Punkte zuordnen* ][ #double-image("../images/test_5-moved.svg", "../images/test_6-moved.svg", [Bereiche für die Punkte]) ] #normal-slide( title: [Ergebnisse], expand-content: true, )[ #image("../images/auto-crop/segments-ka11-als.png") ][ #set align(bottom) #image("../images/auto-crop/segmentation_uls.png") ] #new-section[Analyse] #normal-slide( title: [Ziel], expand-content: true, )[ + Gesamthöhe + Kronenhöhe + Stammhöhe + Kronendurchmesser + Stammdurchmesser bei #number(1.3, unit: [m]) ][ #cetz.canvas(length: 2.6cm, { import cetz.draw: * line((-0.2, 0.0), (-0.2, -3.0), (0.2, -3.0), (0.2, 0.0), fill: gray) line((-0.2, -3.0), (0.2, -3.0), (1, -3.2), (-1, -3.2), close: true, fill: gray) circle((0.0, 0.0), radius: 1, fill: gray) line((-2.5, -3.0), (-2.5, 1.0), mark: (start: "|", end: "|")) content((-2.5, -1.0), $1$, anchor: "east", padding: 0.1) line((-1.8, -0.95), (-1.8, 1.0), mark: (start: "|", end: "|")) content((-1.8, 0.0), $2$, anchor: "east", padding: 0.1) line((-1.8, -1.05), (-1.8, -3.0), mark: (start: "|", end: "|")) content((-1.8, -2.0), $3$, anchor: "east", padding: 0.1) line((-1.1, 0.0), (1.1, 0.0), mark: (start: "|", end: "|")) content((-1.1, 0.0), $4$, anchor: "east", padding: 0.1) line((-0.25, -2.5), (0.25, -2.5), mark: (start: "|", end: "|")) content((-0.2, -2.5), $5$, anchor: 
"east", padding: 0.1) }) ] #normal-slide( columns: (2fr, 4fr), title: [Klassifizierung], expand-content: true, )[ + Horizontale Scheiben + Zugehörige Flächen + Klassifizierung - Boden - Stamm - Krone ][ #set align(center + bottom) #grid( columns: 1 * 4, column-gutter: 2em, row-gutter: 0.2em, image("../images/crop/prop_height.png"), image("../images/klassifkation_slices copy.svg"), image("../images/klassifkation_slices copy 2.svg", height: 82%), image("../images/crop/prop_classification-recolor.png"), [Punkte], [Scheiben], [Flächen], [Klassifizierung], ) ] #normal-slide( columns: (1fr, 1fr), title: [Eigenschaften], expand-content: true, )[ + Gesamthöhe + Kronenhöhe + Stammhöhe + Kronendurchmesser + Stammdurchmesser bei #number(1.3, unit: [m]) ][ #place(pad(left: 3cm, image("../images/crop/prop_classification-recolor.png"))) #cetz.canvas(length: 1cm, { import cetz.draw: * rect((0, 0.0), (10, 11), stroke: none, fill: rgb(0, 0, 0, 0)) line((1.0, 0), (1.0, 11), mark: (start: "|", end: "|")) content((1.0, 5.5), $1$, anchor: "east", padding: 0.1) line((2.0, 3.8), (2.0, 11), mark: (start: "|", end: "|")) content((2.0, 7.5), $2$, anchor: "east", padding: 0.1) line((2.0, 3.6), (2.0, 0), mark: (start: "|", end: "|")) content((2.0, 1.8), $3$, anchor: "east", padding: 0.1) line((3.1, 8.7), (6.8, 8.7), mark: (start: "|", end: "|")) content((3.1, 8.7), $4$, anchor: "east", padding: 0.1) line((5, 0.5), (5.3, 0.5), mark: (start: "|", end: "|")) content((5, 0.5), $5$, anchor: "east", padding: 0.1) }) ] #normal-slide( columns: (2fr, 3fr), title: [Einzelne Punkte], expand-content: true, )[ - Daten - Höhe - Krümmung - Ausdehnung - Visualisierung - Größe - Orientierung ][ #set align(bottom) #grid( columns: 1 * 4, column-gutter: 2em, image("../images/crop/prop_height.png"), image("../images/crop/prop_curve_all.png"), image("../images/crop/prop_var_all.png"), ) ] #new-section[Umsetzung] #normal-slide( columns: (3fr, 1fr), title: [Überblick], expand-content: true, )[ - Repository - 
`https://github.com/antonWetzel/treee` - Technologie - Rust - WebGPU - Systemanforderungen - Keine ][ #align(center, grid( rows: (1fr, 1fr), image("./assets/rust.png"), image("./assets/webgpu.svg"), )) ] #let box-offset = 7; #let box-width = 5; #normal-slide(title: [Quelltext])[ #set align(center + horizon) #cetz.canvas(length: 1.0cm, { import cetz.draw: * let box(x, y, name, stroke: 1pt) = { rect((x, y), (x + box-width, y + 1), name: name, stroke: stroke) content(name, raw(name)) } set-style(mark: (end: ">", fill: black, scale: 1.4, width: 3.5pt), stroke: black) box(0, -1, "project") box(0, 1, "data-file") box(0, -3, "k-nearest") box(0, 3, "input") box(box-offset, -1, "triangulation") box(box-offset, 3, "render") box(box-offset, -3, "importer", stroke: 3pt) box(box-offset, 1, "viewer", stroke: 3pt) box(box-offset * 2, -1, "treee", stroke: 3pt) line("k-nearest.east", "triangulation.west") line("project.east", "render.west") line("input.east", "render.west") line("data-file.east", "importer.west") line("data-file.east", "viewer.west") line("render.south", "viewer.north") line("triangulation.north", "viewer.south") line("k-nearest.east", "importer.west") line("project.east", "importer.west") line("importer.east", "treee.west") line("viewer.east", "treee.west") }) ] #normal-slide(title: [Import])[ #set align(center + horizon) #cetz.canvas(length: 1cm, { import cetz.draw: * let box(x, y, name, node, stroke: 1pt) = { rect((x, y), (x + box-width, y + 1), name: node, stroke: stroke) content(node, raw(name)) } set-style(mark: (end: ">", fill: black, scale: 1.4, width: 3.5pt), stroke: black) box(0, 2, "Punktdaten", "daten") box(box-offset, 4, "Segment 1", "seg0") box(box-offset, 2, "Segment 2", "seg1") box(box-offset, 0, "...", "seg2") line("daten.east", "seg0.west") line("daten.east", "seg1.west") line("daten.east", "seg2.west") box(box-offset * 2, 3, "Octree", "octree") box(box-offset * 2, 1, "Punktwolke", "punktwolke") line("seg0.east", "octree.west") line("seg1.east", 
"octree.west") line("seg2.east", "octree.west") line("seg0.east", "punktwolke.west") line("seg1.east", "punktwolke.west") line("seg2.east", "punktwolke.west") line("octree.south", "punktwolke.north") }) ] #normal-slide( title: [Detailstufen], expand-content: true, )[ - Originalen Punkte für Leaf-Knoten - Maximale Punktanzahl - Detailstufen für Branch-Knoten ][ #{ set align(center) only(1, image("../images/crop/lod_0.png")) only(2, image("../images/crop/lod_1.png")) only(3, image("../images/crop/lod_2.png")) only(4, image("../images/crop/lod_3.png")) only(5, image("../images/crop/lod_4.png")) only(6, image("../images/crop/lod_5.png")) } ] #normal-slide( title: [Culling], expand-content: true, )[ - Nur sichtbare Knoten anzeigen - Genauer für kleinere Knoten ][ #{ set align(center + horizon) only(1, rect(image("../images/culling_0.png"), inset: 0.5pt)) only(2, rect(image("../images/culling_1.png"), inset: 0.5pt)) only(3, rect(image("../images/culling_2.png"), inset: 0.5pt)) only(4, rect(image("../images/culling_3.png"), inset: 0.5pt)) only(5, rect(image("../images/culling_4.png"), inset: 0.5pt)) only(6, rect(image("../images/culling_5.png"), inset: 0.5pt)) only(7, rect(image("../images/culling_6.png"), inset: 0.5pt)) } ] #focus-slide(size: 90pt, [Demonstration]) #new-section[Ergebnisse] #normal-slide( title: [Testdaten], expand-content: true, )[ - ALS-, ULS- und TLS-Daten für mehrere Gebiete - Liste von Punkten - Daten in einzelne Bäume unterteilt - Teilweise Eigenschaften für einzelne Bäume gemessen ][ #set align(bottom) #figure(caption: [Punktanzahl], image("../data/total_points.svg")) ] #normal-slide( title: [Beispieldaten], columns: (2fr, 1fr, 1fr), expand-content: true, )[ #let test(idx, content) = { alternatives-match(( "1": content, "2": content, "3": content, "4": content, str(idx): table.cell(fill: silver, content), )) } #for i in range(1, 5) { set align(center) only(i, { let args = ( [ALS], number(1503), number(6446), [ULS-off], number(7156), number(58201), 
[ULS-on], number(6273), number(74262), [TLS], [-], number(1687505), ) args = args.enumerate().map(((idx, val)) => if (int(idx / 3) + 1 == i) { table.cell(fill: silver, val) } else { val }) table( columns: (1fr, 1fr, 1fr), inset: 10pt, [*Quelle*], table.cell(colspan: 2)[*Punktanzahl*], ..args ) }) } ][ #{ set align(center) only(1, image("../images/crop/compare2_als.png")) only(2, image("../images/crop/compare2_uls_off.png")) only(3, image("../images/crop/compare2_uls_on.png")) only(4, rect(width: 100%, height: 100%, radius: 5pt, align(center + horizon)[Keine Daten])) } ][ #{ set align(center) only(1, image("../images/crop/compare_als.png")) only(2, image("../images/crop/compare_uls_off.png")) only(3, image("../images/crop/compare_uls_on.png")) only(4, image("../images/crop/compare_tls.png")) } ] #normal-slide( title: [Vergleich], expand-content: true, )[ - *Gesamthöhe vom Baum* - Stammdurchmesser bei #number(130, unit: [cm]) - Anfangshöhe der Baumkrone - Durchmesser der Baumkrone ][ #image("../data/data_tree_height.svg") ] #normal-slide( title: [Vergleich], expand-content: true, )[ - Gesamthöhe vom Baum - *Stammdurchmesser bei #number(130, unit: [cm])* - Anfangshöhe der Baumkrone - Durchmesser der Baumkrone ][ #image("../data/data_trunk_diameter.svg") ] #normal-slide( title: [Vergleich], expand-content: true, )[ - Gesamthöhe vom Baum - Stammdurchmesser bei #number(130, unit: [cm]) - *Anfangshöhe der Baumkrone* - Durchmesser der Baumkrone ][ #image("../data/data_crown_start.svg") ] #normal-slide( title: [Vergleich], expand-content: true, )[ - Gesamthöhe vom Baum - Stammdurchmesser bei #number(130, unit: [cm]) - Anfangshöhe der Baumkrone - *Durchmesser der Baumkrone* ][ #image("../data/data_crown_diameter.svg") ] #new-section[Referenzen] #let link-ref(content) = { link(content, raw(content)) } #normal-slide( title: [Quellen], )[ #set text(size: 0.7em) - Arbeit und Vortrag - #link-ref("https://github.com/antonWetzel/masterarbeit") - Programm - 
#link-ref("https://github.com/antonWetzel/treee") - Präsentationsvorlage - #link-ref("https://intranet.tu-ilmenau.de/site/moef/SitePages/Dokumente.aspx") - Testdaten - #link-ref("https://doi.pangaea.de/10.1594/PANGAEA.942856?format=html#download") - Waldfoto (Folie 2) - #link-ref("https://commons.wikimedia.org/wiki/File:L%27automne_au_Qu%C3%A9bec_(8072544123).jpg") ] #final-slide(title: [Danke für ihre Aufmerksamkeit], e-mail: [<EMAIL>]) #focus-slide(size: 90pt, [Nachfragen]) #new-section[Appendix] #normal-slide( title: [Import (Punkte pro Sekunde)], expand-content: true, )[ #image("../data/punkte_pro_sekunde.svg") ] #normal-slide( title: [Anzeigen von Punkten], )[ - Benötigt - Position - Größe - Orientierung - Dreieck + Discard - Instancing ][ #cetz.canvas(length: 1.8cm, { import cetz.draw: * circle((0, 0), radius: 1, fill: silver) line((-1.73, -1), (1.73, -1), (0, 2), close: true) }) ][ #rect(image("../images/auto-crop/properties.png"), inset: 0.5pt) ] #let lines_and_mesh(prec) = { grid( rows: (1fr, 1fr, auto), image("../images/crop/triangulation_mesh_" + prec + ".png"), image("../images/crop/triangulation_lines_" + prec + ".png"), align(center, number(prec, unit: [m])), ) } #normal-slide( title: [Triangulierung], columns: (1fr, 2fr), expand-content: true, )[ - Ball-Pivoting Algorithmus - <NAME> - $alpha$ für Genauigkeit ][ #grid( columns: 1 * 5, stroke: silver, inset: 2pt, lines_and_mesh("0.2"), lines_and_mesh("0.5"), lines_and_mesh("1.0"), lines_and_mesh("2.0"), lines_and_mesh("5.0"), ) ] /* - General - Laptop am Strom anschließen - Webex - Gesamtdauer: 25 bis 30 Minuten - Alle Folien selbsterklärend - Analyse Punkte nur Anschneiden - Programm statt Softwareprojekt sagen - SOFTWARE - Punkte entfernen - Werte neu Berechnen - Werte händisch anpassen */
https://github.com/tony-rsa/thonifho.muhali.cv
https://raw.githubusercontent.com/tony-rsa/thonifho.muhali.cv/main/src/sections/ru/experience.typ
typst
MIT License
#import "../../template.typ": * #cvSection("Опыт") #cvEntry( title: [Cloud Software Engineer (Python/Rust/SQL)], host: [#link("https://github.com/Stabl-Energy")[#gh #hSpc() STABL Energy] #hDot() Частичная занятость], date: [Январь — Март 2023 (3 месяца)], mode: [Мюнхен, Германия #hDot() Комбинированный режим], logo: "assets/logos/stb.png", desc: list( marker: [➤], [Написал Rust-клиент для отправки логов в Grafana Loki.], [Мигрировал +20 GiB данных IoT-датчиков из AWS Timestream в InfluxDB.], [Настраивал автоматическое создание конфигурируемых панелей в Grafana.], [Настраивал CI пайплайны для тестирования и линтинга проектов на Python и Rust.], [Прошил +10 ESP32-S2 устройств посредством специальной бутстрэп программы и драйвера CP210x.], [Писал эффективные FluxQL запросы для очистки, резервного копирования и восстановления данных InfluxDB.], ), ) #cvEntry( title: [Deep Learning Engineer (Python/C++/CUDA)], host: [#link("https://github.com/phygitalism")[#gh #hSpc() Phygitalism] #hDot() Полная занятость], date: [Октябрь 2021 — Апрель 2022 (7 месяцев)], mode: [Москва, Россия #hDot() Комбинированный режим], logo: "assets/logos/phy.png", desc: list( marker: [➤], [Реализовал алгоритм обнаружения границ облака точек из научной публикации на Python.], [Адаптировал модели RIFE и AdaAttN в качестве веб-сервисов при помощи Docker и RabbitMQ.], [Сравнивал нейронные и дифференцируемые рендереры и составлял обзор последних статей.], [Помогал разработать пайплайна данных для обучения модели Point Transformer на дорожных знаках.], [Разработал пайплайн реконструкции меша по облакам точек помещений при помощи C++ библиотеки CGAL.], [Выполнял предварительную обработка, анализ и визуализацию данных облаков точек дорожных LIDAR-сканов.], [Протестировал модель NeRF из проекта InstantNGP и создал PR для настройки проекта при помощи DevContainer.], ), )
https://github.com/matteobongio/Advanced-Algorithms
https://raw.githubusercontent.com/matteobongio/Advanced-Algorithms/master/main.typ
typst
#import "@preview/showybox:2.0.1": showybox #let bluebox(title, text, ..opts) = { showybox( title-style: ( weight: 900, color: blue.darken(40%), sep-thickness: 0pt, align: center ), frame: ( title-color: blue.lighten(80%), border-color: blue.darken(40%), body-color: blue.lighten(90%), thickness: (left: 1pt), radius: (top-right: 5pt, bottom-right:5pt, rest: 0pt) ), title: title, text, ..opts ) } #let DefCounters = state("defs", 0) #let DefBox(text, ..opts) = { DefCounters.update(x => x + 1) let c = context DefCounters.get() bluebox([Definition ] + c, text, ..opts) } #let ExCnts = state("exmples", 0) #let Examplebox(text, ..opts) = { ExCnts.update(x => x + 1) let c = context ExCnts.get() showybox( title-style: ( weight: 900, color: orange.darken(40%), sep-thickness: 0pt, align: center ), frame: ( title-color: orange.lighten(80%), border-color: orange.lighten(40%), body-color: orange.lighten(90%), thickness: (left: 1pt), radius: (top-right: 5pt, bottom-right:5pt, rest: 0pt) ), title: [Example ] + c, text, ..opts ) } = AA TODO - Theorem boxes - O vs $Theta$ vs $Omega$ vs etc == Master Theorem Let $a >= 1 and b >= 1$ be constants, let $f(n)$ be a cfunction, and let $T(n)$ be defined on the nonnegative integers by the recurrence: $ T(n) = a T( frac(n, b) + f(n)) "if" n/b equiv floor(n/b) or ceil(n/b) $ $ => \ &"if" f(n) = O(n^(log_b a - epsilon)) "for" epsilon > 0 => T(n) = Theta(n^(log_b a)) \ &"if" f(n) = Theta(n^(log_b a) ) "for" epsilon > 0 => T(n) = Theta(n^(log_b a)lg(n)) \ &"if" f(n) = Omega(n^(log_b a + epsilon)) "for" epsilon > 0 and "if" a f(n/b) <= c f(n) "for" c < 1 forall "sufficiently large" n \ & quad => T(n) = Theta(f(n)) \ $
https://github.com/yasemitee/Teoria-Informazione-Trasmissione
https://raw.githubusercontent.com/yasemitee/Teoria-Informazione-Trasmissione/main/2023-10-31.typ
typst
#import "@preview/lemmify:0.1.4": * #let ( theorem, lemma, corollary, remark, proposition, example, proof, rules: thm-rules ) = default-theorems("thm-group", lang: "it") #show: thm-rules #show thm-selector("thm-group", subgroup: "proof"): it => block( it, stroke: green + 1pt, inset: 1em, breakable: true ) = Ottimalità del codice di Huffman == Introduzione Ricordiamo che un codice di Huffman è costruito con il seguente sistema (Algoritmo di Huffman): + i simboli sorgente vengono ordinati in base alle probabilità; + si crea un nuovo modello di sorgente in cui i $D$ simboli meno frequenti sono rimpiazzati da un nuovo simbolo con probabilità pari alla somma delle loro probabilità; + se la nuova sorgente contiene più di $D$ simboli si ricomincia dal passo 1. #v(12pt) #figure( image("assets/2023-10-13_huffman-compressione.svg", width: 65%) ) $|Chi| = m$ $m = (D-1)K+1$, quindi è divisibile per $D-1$ con resto 1. #v(12pt) Procediamo ora a dimostrare l’ottimalità del codice di Huffman nell’ambito dei codici sorgente istantanei. Prima di dimostrare il teorema, dobbiamo però fare una semplice osservazione preliminare. Ovvero: da un codice di Huffman _D-ario_ per una sorgente di $m − D + 1$ simboli possiamo ricavare un codice di Huffman D-ario per una sorgente di $m$ simboli semplicemente sostituendo un simbolo sorgente con D nuovi simboli cosicché le probabilità assegnate ad essi siano tutte più piccole di quelle dei rimanenti m − D vecchi simboli. == Lemma sulla generazione con giustapposizione #lemma(numbering:none)[ Sia $c'$ un codice _D-ario_ di Huffman per la sorgente $Chi' = {x_1, dots, x_(m-D+1)}$ con probabilità $p_1 >= dots >= p_(m-D+1)$. 
Sia $Chi$ la sorgente di $m$ simboli ${x_1, dots, x_(k-1), x_(k+1), dots, x_(m+1)}$ ottenuta da $Chi'$ togliendo $x_k$ e aggiungendo $D$ nuovi simboli $x_(m-D+2), dots, x_(m+1)$ con probabilità $p_(m-D+2), dots, p_(m+1)$ tali che $0 < underbrace(p_(m-D+2)"," dots"," p_(m+1), "nuove prob") < p_(m-D+1)$ e $p_(m-D+2) + dots + p_(m+1) = p_k$. Allora il codice $ c(x) = cases( c'(x) " se" x in {x_1, dots, x_(k-1), x_(k+1), dots, x_(m-D+1)}, c'(x_k)i " se" x = x_(m-D+i+2) "per" i = 0"," dots "," D-1. ) $ è un codice di Huffman per la sorgente $Chi$. ] <thm> #v(12pt) #figure( image("assets/2023-10-31 esempio-lemma-huffman.svg", width: 100%) ) #v(12pt) In sostanza sto dicendo che se $c(x)$ è uguale a $c'(x)$ fino a $x_k$ escluso e dopo aggiunge i nuovi simboli giustapponendoli a $x_k$ (nota che $p_k = p_"simbolo1" + p_"simbolo2" + dots "in base a" D$). In pratica è come fare Huffman al contrario, quindi srotolando i simboli "fantoccio" che costituivano la somma delle D probabilità più basse. #proof()[ La dimostrazione è ovvia considerando che dopo il primo passo nella costruzione del codice di Huffman per $Chi$ otteniamo $Chi'$ come nuova sorgente. Quindi i due codici differiscono solo per le codifiche ai $D$ simboli $x_(m-D+2), dots, x_(m+1)$ che sono quelli meno probabili in $Chi$. Per definizione dell'algoritmo di Huffman, le codifiche dei simboli meno probabili di $Chi$ sono definite in termini del codice di Huffman per $Chi'$ esattamente come descritto nel sistema precedente. ]<proof> == Teorema sull'ottimalità del codice di Huffman #theorem(name:"Ottimalità del codice di Huffman", numbering:none)[ Data una sorgente $angle.l Chi, p angle.r$ con $D > 1$, il codice _D-ario_ $c$ di Huffman minimizza $EE[l_c]$ fra tutti i codici _D-ari_ istantanei per la medesima sorgente. Quindi: $EE[l_c] <= EE[l_(c', c'', dots)]$ ]<thm> #proof()[ La dimostrazione procede per induzione. Passo base: (per facilità nei conti consideriamo D = 2 e si ricorda che $|Chi| = m$). 
Nel caso base $m = 2$ Huffman è ottimo. Infatti, intuitivamente mi basta osservare che l'algoritmo di Huffman produce il codice $c(x_1) = 0$ e $c(x_2) = 1$ che è ottimale per ogni distribuzione di probabilità su $x_1, x_2$. Passo induttivo: Assumendo quindi $m > 2$, grazie all'ipotesi induttiva abbiamo che Huffman è ottimo per $k <= m - 1$. Fissiamo una sorgente $angle.l Chi, p angle.r$ e siano $u, v in Chi$ tale che $p(u)$ e $p(v)$ sono minime (quindi le ultime che ordineremmo nell'algoritmo di Huffman). Definiamo la sorgente $angle.l Chi', p' angle.r$ dove $u,v in Chi$ sono rimpiazzati da $z in Chi'$ e dove $ p'(x) = cases( p(x) " se" x != z, p(u) + p(v) " se" x =z ) $ Sia $c'$ il codice di Huffman per la sorgente $angle.l Chi', p' angle.r$. Dato che $|Chi'| = m - 1$, $c'$ è ottimo per ipotesi induttiva (Huffman è ottimo per $k <= m - 1$). Definiamo ora il codice $c(x)$ per $Chi$. $ c(x) = cases( c'(x) " se" x in.not {u,v}, c'(x)0 " se" x = u, c'(x)1 " se" x = v, ) $ _nota che c'(x)0 e c'(x)1 sono costruiti con una giustapposizione_. Per il lemma precedente sappiamo che $c$ è un codice istantaneo di Huffman. Ora vogliamo dimostrare che $c$ è ottimale (quindi che $EE$ <= di qualsiasi codice istantaneo). Per fare ciò abbiamo bisogno di dimostrare tre relazioni, che in seguito ci permetteranno di trarre le dovute conclusioni. $ EE[l_c] = sum_(x in Chi) l_c (x) p(x) = sum_(x in Chi) underbracket(l_c' (x) p'(x), "identico a c per ora") - underbracket(l_c' (z) p'(z), "tolgo z") + underbracket(l_c (u) p(u) + l_c(v) p(v),"aggiungo u e v") $ $ = EE[l_c'] - l_c' (z) p'(z) + (l_c'(z) + 1)p(u) + (l_c' (z)+1)p(v) $ $ = EE[l_c'] - l_c' (z) p'(z) + l_c'(z)p'(z) + p'(z) $ $ = EE[l_c'] + p'(z) $ $ underline("Quindi" EE[l_c] = EE[l_c'] + p'(z)) $ Ora consideriamo un altro codice istantaneo $c_2$ per la medesima sorgente $angle.l Chi, p angle.r$ e verifichiamo sempre che $EE[l_c] <= EE[l_c_2]$. Fissato $c_2$, siano $r, s in Chi$ tali che $l_c_2(r)$ e $l_c_2(s)$ sono massime. 
#v(12pt) #figure( image("assets/2023-10-31 nodi-lmax.svg", width: 75%) ) #v(12pt) Esaminando le foglie $r$ e $s$ nell'albero di codifica di $c_2$, osserviamo che che senza perdita di generalità, possiamo assumere che $c_2$ sia tale che $r$ e $s$ sono fratelli. Infatti se $r$ e $s$ sono fratelli, non facciamo nulla. Se $r$ o $s$ hanno un fatello (ad esempio il fratello di $r$ è $f$), allora possiamo scegliere $r$ e $f$ tali che $l_c_2(r)$ e $l_c_2(f)$ sono massime invece di $r$ e $s$ (tanto se $f$ è fratello di $s$ allora hanno la stessa $l_c_2$). Se invece né $r$ ne $s$ hanno un fratello nell'albero, allora possiamo sostituire alla codifica di ciascun la codifica del padre finché ci riportiamo nella situazione in cui $r$ e $s$ hanno entrambi un fratello. Ora trasformiamo $c_2$ in un codice $ tilde(c)_2(x) = cases( c_2(x) " se" x in.not {u,v, r, s}, c_2(u) " se" x = r, c_2(r) " se" x = u, c_2(v) " se" x = s, c_2(s) " se" x = v, ) $ In pratica quello che fa $tilde(c)_2$ è sostituire la codifica dei simboli di lunghezza massima ($r$ e $s$) con quella dei simboli di lunghezza minima ($u$ e $v$). Ora dobbiamo capire quele codice tra $c_2$ e $tilde(c)_2$ "sfida" meglio $c$ (sull'ottimalità) esaminando la differenza fra la lunghezza media di questi ultimi. $ EE[l_tilde(c)_2] - EE[l_c_2] = sum_(x in Chi) p(x)(l_tilde(c)_2 (x) - l_c_2 (x)) $ $ = p(r) l_c_2 (u) + p(u) l_c_2(r) + p(s)l_c_2(v) + p(v) l_c_2 (s) - $ $ - p(u) l_c_2 (u) - p(r) l_c_2(r) - p(v) l_c_2 (v) - p(s) l_c_2 (s) $ $ = underbrace(underbrace((p(r) - p(u)), >=0) underbrace((l_c_2(u) - l_c_2(r)),<=0) + underbrace((p(s) - p(v)), >= 0)underbrace((l_c_2(v)-l_c_2(s)), <=0), <= 0) $ I segni delle differenze sono determinati dalla scelta di $u,v,r,s$ in quanto $ max{p(u),p(v)} <= min{p(r), p(s)}," "min{l_c_2(r),l_c_2(s)} >= max{L_c_2(u), l_c_2(v)} $ Quindi abbiamo dimostrato che $underline(EE[l_tilde(c)_2] <= EE[l_c_2])$ (lo "sfidante" migliore è $tilde(c)_2$). 
Notice now that, after the swap with $r$ and $s$, $u$ and $v$ have become siblings in $tilde(c)_2$. Hence there exists $omega in {0,1}^*$ such that $tilde(c)_2(u) = omega 0$ and $tilde(c)_2(v) = omega 1$. Now, in order to apply the inductive hypothesis, we build another code $c'_2$ for $angle.l Chi', p' angle.r$, defined as $ c'_2 = cases( tilde(c)_2 (x) "if" x != z, omega " if" x = z ) $ We can then write, recalling that $p'(z) = p(u) + p(v)$, $ EE[l_tilde(c)_2] = sum_(x in Chi':x!=z) p'(x) l_tilde(c)_2 (x) + p(u)(l_c'_2(z)+1) + p(v)(l_c'_2(z)+1) $ $ = sum_(x in Chi':x!=z) p'(x) l_tilde(c)_2 (x) + p'(z)l_c'_2(z)+p'(z) $ $ underline(= EE[l_c'_2] + p'(z)) $ It is now time to draw the required conclusions: recalling the inequalities obtained above, and using the inductive hypothesis to establish that $EE[l_c'] <= EE[l_c'_2]$, we can write $ EE[l_c] = EE[l_c'] + p'(z) <= EE[l_c'_2] + p'(z) = EE[l_tilde(c)_2] <= EE[l_c_2] $ ]<proof>
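The construction used throughout the proof — repeatedly merge the two least probable symbols u and v into a single symbol z with p(z) = p(u) + p(v), then extend the codewords of the merged subtree by one bit — can be sketched in a few lines. This is an illustrative addition, not part of the original notes; the three-symbol distribution in the note below is made up.

```python
import heapq

def huffman_code(probs):
    """Binary Huffman code for a distribution {symbol: probability}.

    Mirrors the construction in the proof: repeatedly pop the two
    least probable subtrees (u and v), merge them into one node z
    with p(z) = p(u) + p(v), and prepend a distinguishing bit to
    every codeword in each subtree.
    """
    if len(probs) == 1:
        return {next(iter(probs)): "0"}
    # Heap entries: (probability, tiebreak id, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)   # u in the proof
        p2, _, right = heapq.heappop(heap)  # v in the proof
        merged = {s: "0" + w for s, w in left.items()}
        merged.update({s: "1" + w for s, w in right.items()})
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))  # z
        tiebreak += 1
    return heap[0][2]

def expected_length(probs, code):
    """E[l_c] = sum over symbols of p(x) * |c(x)|."""
    return sum(p * len(code[s]) for s, p in probs.items())
```

For example, with p = {a: 0.5, b: 0.25, c: 0.25} the codeword lengths come out as 1, 2, 2 and the expected length is 1.5, matching the hand construction (merge b and c first, then merge with a).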
https://github.com/ad-si/invoice-maker
https://raw.githubusercontent.com/ad-si/invoice-maker/master/fixtures/example-data.typ
typst
ISC License
#let biller = ( name: "<NAME>", title: "Inventor", company: "Crazy Inventions Ltd.", vat-id: "DL1234567", iban: "DE89370400440532013000", address: ( country: "Disneyland", city: "Duckburg", postal-code: "123456", street: "Inventor Drive 23", ), ) #let recipient = ( name: "<NAME>", title: "Treasure Hunter", vat-id: "DL7654321", address: ( country: "Disneyland", city: "Duckburg", postal-code: "123456", street: "Killmotor Hill 1", ) ) #let table-data = ( ( number: 1, date: "2016-04-03", description: "Arc reactor", dur-min: 0, quantity: 1, price: 13000, ), ( number: 2, date: "2016-04-05", description: "Flux capacitor", dur-min: 0, quantity: 1, price: 27000, ), ( number: 3, date: "2016-04-07", description: "Lightsaber", dur-min: 0, quantity: 2, price: 3600, ), ( number: 4, date: "2016-04-08", description: "Sonic screwdriver", dur-min: 0, quantity: 10, price: 800, ), ( number: 5, date: "2016-04-12", description: "Assembly", dur-min: 160, quantity: 1, price: 53.33, ) )
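For illustration only (not part of the fixture), each row of `table-data` contributes quantity × unit price to the invoice; a quick Python transcription of the items above sums to a grand total of 55253.33. The list below mirrors the Typst data by hand (the `dur-min` column is ignored here):

```python
# Hypothetical transcription of the Typst `table-data` rows above.
items = [
    {"description": "Arc reactor", "quantity": 1, "price": 13000},
    {"description": "Flux capacitor", "quantity": 1, "price": 27000},
    {"description": "Lightsaber", "quantity": 2, "price": 3600},
    {"description": "Sonic screwdriver", "quantity": 10, "price": 800},
    {"description": "Assembly", "quantity": 1, "price": 53.33},
]

def invoice_total(items):
    # Sum quantity x unit price across all rows.
    return sum(row["quantity"] * row["price"] for row in items)
```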
https://github.com/jxpeng98/Typst-CV-Resume
https://raw.githubusercontent.com/jxpeng98/Typst-CV-Resume/main/legacy/example-double-legacy.typ
typst
MIT License
#import "modernpro-cv-legacy.typ": * // Remember to set the fonttype in `typstcv.typ` #cv-double-legacy( name: [#lorem(2)], //name:"" or name:[] address: [#lorem(4)], lastupdated: "true", date: "2023.4.7", contacts: ( (text: "08856", link: ""), (text: "example.com", link: "https://www.example.com"), (text: "github.com", link: "https://www.github.com"), (text: "<EMAIL>", link: "mailto:<EMAIL>"), ), [ // Left //About #section("About") #descript[#lorem(50)] #sectionsep #section("Education") #subsection[#lorem(4)\ ] #term[xxxx-xxxx][UK] #subsectionsep #subsection[#lorem(4)\ ] #term[xxxx-xxxx][UK] #sectionsep #section("Skills") #descript("Programming Languages") #info[Python, C++, Java, JavaScript, HTML, CSS, SQL, LaTeX] #subsectionsep #descript("Frameworks") #info[React, Node.js, Express, Flask, Django, Bootstrap, jQuery] #subsectionsep #descript("Tools") #info[Git, GitHub, Docker, AWS, Heroku, MongoDB, MySQL, PostgreSQL, Redis, Linux] #sectionsep // Award #section("Awards") #awarddetail[2018][Scholarship][University] #awarddetail[2017][Grant][Organisation] #awarddetail[2016][Scholarship][University] #sectionsep ], [ // Right //Experience #section("Experience") #jobtitle[#lorem(4)][#lorem(2)][xxxx-xxxx][UK] #jobdetail[ - #lorem(10) - #lorem(10) - #lorem(10) - #lorem(10) ] #subsectionsep #jobtitle[#lorem(4)][#lorem(2)][xxxx-xxxx][\ ] #jobdetail[#lorem(30)] #subsectionsep // Projects #section("Projects") #descript[#lorem(2)] #info[#lorem(40)] #subsectionsep #descript[#lorem(2)] #info[#lorem(40)] #subsectionsep #descript[#lorem(2)] #info[#lorem(40)] #sectionsep // Publication #section("Publications") #publication("bib.bib", "chicago-author-date") ], )
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/valkyrie/0.1.1/src/lib.typ
typst
Apache License 2.0
#import "types.typ": *
#import "ctx.typ": z-ctx
#import "base-type.typ" as advanced

/// This is the main function for validating an object against a schema. *WILL* return the given
/// object after validation if successful, or none and *MAY* throw a failed assertion error.
///
/// - object (any): Object to validate against provided schema. Object *SHOULD* satisfy the schema
///   requirements. An error *MAY* be produced if not.
/// - schema (schema): Schema against which `object` is validated. *MUST* be a valid valkyrie schema
///   type.
/// - ctx (ctx): ctx passed to schema validator function, containing flags that *MAY* alter
///   behaviour.
/// - scope (scope): An array of strings used to generate the string representing the location of a
///   failed requirement within `object`. *MUST* be an array of strings of length greater than or
///   equal to `1`.
/// -> any, none
#let parse(
  object,
  schema,
  ctx: z-ctx(),
  scope: ("argument",),
) = {
  // don't expose to external
  import "base-type.typ": assert-base-type
  // Validate named arguments
  assert-base-type(schema, scope: scope)
  // Validate arguments per schema
  object = (schema.validate)(
    schema,
    ctx: ctx,
    scope: scope,
    object,
  )
  // Require arguments match schema exactly in strict mode
  if ctx.strict {
    for (argument-name, argument-value) in object {
      assert(argument-name in schema, message: "Unexpected argument " + argument-name)
    }
  }
  object
}
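The validation flow above — run the schema's validator over the object, then reject undeclared arguments when `ctx.strict` is set — can be mirrored in a few lines of Python. This is only an illustrative analogue (valkyrie schemas carry their own `validate` closures; here a schema is reduced to a plain `{key: type}` dict):

```python
def parse(obj, schema, strict=False):
    """Validate a dict against a simple {key: type} schema.

    Mirrors the Typst `parse` above: check each declared field,
    then, in strict mode, fail on any argument the schema does
    not declare. Returns the object on success.
    """
    for key, expected in schema.items():
        if key in obj and not isinstance(obj[key], expected):
            raise TypeError(f"{key}: expected {expected.__name__}")
    if strict:
        for key in obj:
            if key not in schema:
                raise ValueError(f"Unexpected argument {key}")
    return obj
```

In non-strict mode, extra keys pass through untouched, matching the `ctx.strict` gate in the Typst code.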
https://github.com/francescoo22/kt-uniqueness-system
https://raw.githubusercontent.com/francescoo22/kt-uniqueness-system/main/src/annotation-system/formalization.typ
typst
#import "rules/base.typ": * #import "rules/relations.typ": * #import "rules/unification.typ": * #import "rules/statements.typ": * #import "../vars.typ": * #show raw.where(block: true): frame-box #show link: set text(fill: rgb(0, 0, 255)) #show link: underline #set heading(numbering: "1.") = Formalization // TODO: be more precise when an annotation cannot be borrowed (e.g. method type can be ... -> alpha) // TODO: since there isn't a borrowed-Top, maybe the context can be re defined in a way s.t. borrowed top doesn't even exist #v(1em) #grid( columns: (auto, auto), column-gutter: 2em, row-gutter: 1em, [*Grammar*],[*Context*], frame-box( $ CL &::= class C(overline(f\: alpha_f)) \ af &::= unique | shared \ beta &::= dot | borrowed \ M &::= m(overline(af beta space x)): af {begin_m; overline(s); ret_m e} \ p &::= x | p.f \ e &::= null | p | m(overline(p)) \ s &::= var x | p = e | fi p_1 == p_2 then overline(s_1) els overline(s_2) | m(overline(p)) \ &| loop p_1 == p_2 do overline(s) $ ), frame-box( $ alpha &::= unique | shared | top \ beta &::= dot | borrowed \ Delta &::= dot | p : alpha beta, Delta $ ) ) - Only *fields*, *method parameters*, and *return values* have to be annotated. - A reference annotated as `unique` may either be `null` or point to an object, and it is the sole *accessible* reference pointing to that object. - A reference marked as `shared` can point to an object without being the exclusive reference to that object. - `T` is an annotation that can only be inferred and means that the reference is *not accessible*. - $borrowed$ (borrowed) indicates that the function receiving the reference won't create extra aliases to it, and on return, its fields will maintain at least the permissions stated in the class declaration. - Annotations on fields indicate only the default permissions, in order to understand the real permissions of a fields it is necessary to look at the context. 
This concept is formalized by rules in @cap:paths and shown in @field-annotations. - Primitive fields are not considered - `this` can be seen as a parameter - constructors can be seen as functions returning a `unique` value == General #display-rules( M-Type, M-Args ) == Context - The same variable/field cannot appear more than once in a context. - Contexts are always *finite* - If not present in the context, fields have a default annotation that is the one written in the class declaration #display-rules( Not-In-Base, Not-In-Rec, Ctx-Base, Ctx-Rec, Root-Base, Root-Rec, Lookup-Base, Lookup-Rec, Lookup-Default, "", Remove-Empty, Remove-Base, Remove-Rec, "", ) == SubPaths If $p_1 subset.sq p_2$ holds, we say that - $p_1$ is a *sub*-path of $p_2$ - $p_2$ is a *sup*-path of $p_1$ #display-rules( SubPath-Base, SubPath-Rec, SubPath-Eq-1, SubPath-Eq-2, Remove-SupPathsEq-Empty, Remove-SupPathsEq-Discard, Remove-SupPathsEq-Keep, Replace, Get-SupPaths-Empty, "", Get-SupPaths-Discard, "", Get-SupPaths-Keep, "", ) == Annotations relations - $alpha beta rel alpha' beta'$ means that $alpha beta$ can be passed where $alpha' beta'$ is expected. - $alpha beta ~> alpha' beta' ~> alpha'' beta''$ means that after passing a reference annotated with $alpha beta$ as argument where $alpha' beta'$ is expected, the reference will be annotated with $alpha'' beta''$ right after the method call. #display-rules( row-size: 3, A-id, A-trans, A-bor-sh, A-sh, A-bor-un, A-un-1, A-un-2, Pass-Bor, Pass-Un, Pass-Sh ) #figure(image(width: 25%, "../img/lattice.svg"), caption: [Lattice obtained by annotations relations rules])<annotation-lattice> == Paths <cap:paths> - $Lub{alpha_0 beta_0, ..., alpha_n beta_n}$ identifies the least upper bound of the annotations based on the lattice in @annotation-lattice. 
- Note that even if $p.f$ is annotated as unique in the class declaration, $Delta(p.f)$ can be shared (or $top$) if $Delta(p) = shared$ (or $top$)
- Note that fields of a borrowed parameter are borrowed too and they need to be treated carefully in order to avoid unsoundness. Specifically, borrowed fields:
  - Can be passed as arguments to other functions (if relation rules are respected).
  - Have to become `T` after being read (even if shared).
  - Can only be reassigned with a `unique`.
- Note that $(Delta(p) = alpha beta) => (Delta inangle(root(p)) = alpha' beta')$ i.e. the root is present in the context.
- $Delta tr std(p, alpha beta)$ means that paths rooted in $p$ have the right permissions when passing $p$ where $alpha beta$ is expected. To understand better why these rules are necessary, look at the example in @path-permissions.
- Note that in the rule "Std-Rec-2" the premise $(x : alpha beta) (p') = alpha'' beta''$ means that the evaluation of $p'$ in a context in which there is only $x : alpha beta$ is $alpha'' beta''$

#display-rules(
  Get-Var, Get-Path,
  Std-Empty, Std-Rec-1,
  Std-Rec-2, "",
)

== Unification

- $Delta_1 lub Delta_2$ is the pointwise lub of $Delta_1$ and $Delta_2$.
  - If a variable $x$ is present in only one context, it will be annotated with $top$ in $Delta_1 lub Delta_2$.
  - If a path $p.f$ is missing in one of the two contexts, we can just consider the annotation in the class declaration.
- $Delta triangle.filled.small.l Delta_1$ is used to maintain the correct context when exiting a scope.
  - $Delta$ represents the resulting context of the inner scope.
  - $Delta_1$ represents the context at the beginning of the scope.
  - The result of the operation is a context where paths rooted in variables locally declared inside the scope are removed.
- $unify(Delta, Delta_1, Delta_2)$ means that we want to unify $Delta_1$ and $Delta_2$ starting from a parent environment $Delta$.
- A path $p$ contained in $Delta_1$ or $Delta_2$ whose root $x = root(p)$ is not contained in $Delta$ will not be included in the unification.
- The annotation of variables contained in the unification is the least upper bound of the annotations in $Delta_1$ and $Delta_2$.

#display-rules(
  // U-Empty, U-Sym,
  // U-Local, U-Rec,
  Ctx-Lub-Empty, Ctx-Lub-Sym,
  Ctx-Lub-1, "",
  Ctx-Lub-2, "",
  Remove-Locals-Base, Remove-Locals-Keep,
  Remove-Locals-Discard, "",
  Unify, ""
)

== Normalization

- Normalization takes a list of annotated $p$ and returns a list in which duplicates are substituted with their least upper bound.
- Normalization is required for method calls in which the same variable is passed more than once.

```kt
fun f(x: ♭ shared, y: shared)

fun use_f(x: unique) {
    // Δ = x: unique
    f(x, x)
    // Δ = normalize(x: unique, x: shared) = x: shared
}
```

#display-rules(
  N-Empty, "",
  N-Rec, ""
)

== Statements

#display-rules(
  Begin, "",
  Decl, Assign-Null,
  Seq-Base, Seq-Rec,
  If, "",
  Assign-Unique, "",
  Assign-Shared, "",
  Assign-Borrowed-Field, "",
  Assign-Call, "",
  Call, "",
  // Return-Null, "",
  Return-p, "",
  // Return-m, "",
)

*Note:* Since they can be easily desugared, there are no rules for returning `null` or a method call.
- `return null` $equiv$ `var fresh ; fresh = null ; return fresh`
- `return m(...)` $equiv$ `var fresh ; fresh = m(...) ; return fresh`

The same can be done for the guard of if statements:
- `if (p1 == null) ...` $equiv$ `var p2 ; p2 = null ; if(p1 == p2) ...`
- `if (p1 == m(...)) ...` $equiv$ `var p2 ; p2 = m(...)
; if(p1 == p2) ...`

// #include "while-loops.typ"

#pagebreak()

= Examples

== Paths-permissions:

#figure(
```kt
class C()
class A(var f: @Unique C)

fun use_a(a: @Unique A)

fun f1(a: @Shared A){
    // Δ = a: Shared
    // ==> Δ(a.f) = shared even if `f` is annotated as unique
}

fun f2(a: @Unique A){
    // Δ = a: Unique
    // ==> Δ(a.f) = Unique
    use_a(a)
    // after passing `a` to `use_a` it can no longer be accessed
    // Δ = a: T
    // Δ(a.f) = T even if `f` is annotated as unique
}
```
)<field-annotations>

#figure(
```kt
class C()
class A(var f: @Unique C)

fun f(a: @Unique A)

fun use_f(x: @Unique A, y: @Unique A){
    // Δ = x: Unique, y: Unique
    y.f = x.f
    // Δ = x: Unique, y: Unique, x.f: T
    f(x) // error: 'x.f' does not have standard permissions when 'x' is passed
}
```
)<path-permissions>

== Call premises explanation:

- $forall 0 <= i <= n : Delta tr std(p_i, alpha^m_i beta^m_i)$ : \
  We need all the arguments to have at least standard permissions for their fields.
- $forall 0 <= i, j <= n : (i != j and p_i = p_j) => alpha_i^m = shared$ : \
  If the same variable/field is passed more than once, it can only be passed where shared is expected.

```kt
class C()
class A(var f: @Unique C)

fun f1(x: @Unique A, y: @Borrowed @Shared A)
fun f2(x: @Borrowed @Shared A, y: @Borrowed @Shared A)
fun f3(x: @Shared A, y: @Borrowed @Shared A)

fun use_f1(x: @Unique A){
    // Δ = x: Unique
    f1(x, x) // error: 'x' is passed more than once but is also expected to be unique
}

fun use_f2_f3(x: @Unique A){
    // Δ = x: Unique
    f2(x, x) // ok, uniqueness is also preserved since both the args are borrowed
    // Δ = x: Unique
    f3(x, x) // ok, but uniqueness is lost since one of the args is not borrowed
    // Δ = x: Shared
}
```

- $forall 0 <= i, j <= n : p_i subset.sq p_j => (Delta(p_j) = shared or a_i^m = a_j^m = shared)$ : \
  Fields of an object that has been passed to a method can be passed too, but only if the nested one is shared or they are both expected to be shared.
```kt class C() class A(var f: @Shared C) class B(var f: @Unique C) fun f1(x: @Unique A, y: @Shared C) {} fun use_f1(x: @Unique A) { // Δ = x: Unique f1(x, x.f) // ok // Δ = x: T, x.f: Shared // Note that even if x.f is marked shared in the context, // it is not accessible since Δ(x.f) = T } fun f2(x: @Unique B, y: @Shared C) {} fun use_f2(b: @Unique B) { // Δ = b: Unique f2(b, b.f) // error: 'b.f' cannot be passed since 'b' is passed as Unique and Δ(b.f) = Unique // Δ(b.f) = Unique // It is correct to raise an error since f2 expects x.f to be unique } fun f3(x: @Shared B, y: @Shared C) {} fun use_f3(x: @Unique B) { // Δ = x: Unique f3(x, x.f) // ok // Δ = x: Shared, x.f: Shared } ``` == Borrowed fields Fields of a borrowed variable are borrowed too. But differently from variables, they can be read/written and so these operation require special rules. - `Assign-Borrowed-Field` tells us what happens when reading a borrowed field. The most important thing to notice is that after being read, the field will be annotated with $top$, even if it is shared. Doing this is necessary to guarantee soundness while passing unique variables to functions expecting a borrowed shared argument. - For the same reason, borrowed fields can only be re-assigned to something that is unique. Otherwise passing a unique to a function expecting a borrowed shared argument cannot guarantee that uniqueness is preserved. 
```kt class A(var n: Int) class B(var a: @Unique A) fun f(b: @Shared @Borrowed B) { // Δ = b : Shared Borrowed var temp = b.a // Δ = b : Shared Borrowed, b.a : T, temp: Shared // before returning, b needs to be in std form // but at this point it can only be re-assigned with a unique // re-assigning b.a with a shared would cause unsoundness to a caller passing a unique b.a = A(0) // Δ = b : Shared Borrowed, b.a : Unique, temp: Shared // now the function can return with no problems } fun use_f(b: @Unique B) { // Δ = b : Unique // also Δ(b.f) = Unique f(b) // Δ = b : Unique // moreover it is guaranteed that also Δ(b.a) = Unique } ``` == Smart casts Since if-statements guards cannot create new aliases, having a variable or a field access in the guard will not modify its uniqueness. ```kt class T class A(var t: @Unique T?) fun use_t(t : @Shared @Borrowed T){ } fun f(a1: @Unique @Borrowed A, a2 : @Shared @Borrowed A) { // Δ = a1 : Unique Borrowed, a2 : Shared Borrowed if (a1.t != null) { // Δ = a1 : Unique Borrowed, a2 : Shared Borrowed use_t(a1.t) // here it is safe to smart cast because Δ(a1.t) = Unique (Borrowed) // Δ = a1 : Unique Borrowed, a2 : Shared Borrowed } // Δ = a1 : Unique Borrowed, a2 : Shared Borrowed if(a2.t != null){ // Δ = a1 : Unique Borrowed, a2 : Shared Borrowed use_t(a2.t!!) // here it is NOT safe to smart cast because Δ(a2.t) = Shared (Borrowed) // Δ = a1 : Unique Borrowed, a2 : Shared Borrowed } } ``` == Stack example This shows how the example presented in #link("https://arxiv.org/pdf/2309.05637.pdf")[LATTE] paper works with this system. ```kt class Node(var value: @Unique Any?, var next: @Unique Node?) class Stack(var root: @Unique Node?) { @Borrowed @Unique fun push(value: @Unique Any?) 
{ // Δ = this: borrowed unique, value: unique val r = this.root // Δ = this: borrowed unique, this.root: T, value: unique, r: unique val n = Node(value, r) // Δ = this: borrowed unique, this.root: T, value: T, r: T, n: unique this.root = n // Δ = this: borrowed unique, this.root: unique, value: T, r: T, n: T } @Borrowed @Unique fun pop(): @Unique Any? { // Δ = this: borrowed unique val value: Any? // Δ = this: borrowed unique, value: T if (this.root == null) { value = null // Δ = this: borrowed unique, value: unique } else { value = this.root.value // Note: smart cast 'this.root' to Node // Δ = this: borrowed unique, this.root.value: T, value: unique val next = this.root.next // Note: smart cast 'this.root' to Node // Δ = this: borrowed unique, this.root.value: T, this.root.next: T, value: unique, next: unique this.root = next // Δ = this: borrowed unique, this.root: unique, value: unique, next: T // Note: doing this.root = this.root.next works too } // Unification... // Δ = this: borrowed unique, this.root: unique, value: unique return value } } ```
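The normalization example earlier (`normalize(x: unique, x: shared) = x: shared`), together with the unification rules, suggests the least-upper-bound chain unique ⊑ shared ⊑ ⊤ on annotations. As a purely illustrative cross-check — with the lattice hard-coded to that three-point chain and borrowed flags ignored — the lub and the normalization of an argument list can be sketched in Python:

```python
# Rank encodes the assumed chain: unique ⊑ shared ⊑ top.
# (Borrowed flags and the full lattice from the figure are omitted.)
RANK = {"unique": 0, "shared": 1, "top": 2}

def lub(a, b):
    """Least upper bound of two annotations on the chain above."""
    return a if RANK[a] >= RANK[b] else b

def normalize(annotated_paths):
    """Collapse duplicate paths to the lub of their annotations,
    in the spirit of the N-Rec rule of the Normalization section."""
    out = {}
    for path, ann in annotated_paths:
        out[path] = lub(out[path], ann) if path in out else ann
    return out
```

With this sketch, passing the same unique variable where one parameter is borrowed shared reproduces the example's result: the variable degrades to shared.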
https://github.com/dyc3/senior-design
https://raw.githubusercontent.com/dyc3/senior-design/main/user-stories.typ
typst
#import("lib/use-cases.typ"): userstory = User Stories #userstory( [End User], [to be able to watch videos with my friends], [I can share my experience with others], ) <UserStory::WatchTogether> #userstory( [End User], [to have responsive video controls], [the app feels better to use], ) <UserStory::Responsive> #userstory( [End User], [OTT to be available when I want to use it], [I can start watch parties spontaneously], ) <UserStory::Reliable> #userstory( [End User], [to not be interrupted by ads], [I can focus on the content I want to watch], ) <UserStory::NoAds> #userstory( [End User], [to not be interrupted by app updates], [I can focus on the content I want to watch], ) #userstory( [Maintainer], [to have redundancy], [the app is more robust against crashes], ) <UserStory::Redundant> #userstory( [Self-hoster], [to have a simple installation process], [the app is easier to install], ) <UserStory::SimpleInstall> #userstory( [Self-hoster], [to have a simple upgrade process], [the app is easier to upgrade], ) #userstory( [Maintainer or Self-hoster], [OTT to have a consistent and relatively stable HTTP API], [I can automate administration tasks], ) <UserStory::HttpApi>
https://github.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024
https://raw.githubusercontent.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024/giga-notebook/entries/brainstorm-wedges/entry.typ
typst
Creative Commons Attribution Share Alike 4.0 International
#import "/packages.typ": notebookinator #import notebookinator: * #import themes.radial.components: * #show: create-body-entry.with( title: "Brainstorm: Pushing Robots", type: "brainstorm", date: datetime(year: 2023, month: 9, day: 3), author: "<NAME>", witness: "<NAME>", ) Most robot designs that we've seen use a mechanism that lifts the other robot so that they are at an angle. This gives us mechanical advantage against them, as shown below: #image("./free-body-diagram.svg", width: 50%) As you can see from the diagram above, the force normal force on the robot pulls it backwards, using their own weight to our advantage. This also gives them less friction against the field tiles, reducing the amount of force they can exert against us. #grid( columns: (1fr, 1fr), gutter: 20pt, [ = Wedges This design consists of either a plastic plate or two high strength axles in order to lift the opponent's robot so that they're at an angle. This design is used by many robots, including 515R's Spin Up robot #footnote(link("https://www.youtube.com/watch?v=UZGtNiJYn38")), and 606X's Spin Up robot #footnote(link("https://www.youtube.com/watch?v=BpEzAX6E2_g&t=20s")). #pro-con( pros: [ - It can double as a skirt. - It is used by many teams. - The slant gives our robot mechanical advantage while pushing other robots. - The shape is simple to design. ], cons: [ - It is hard to mount because there are no flat surfaces. - The design will possibly require reinforcement of the robot. - It is unoriginal. ], ) ], image("./wedges.svg"), [ = Snow Plow This design consists of a curved plate at the front of the robot. It works similarly to the wedges except its concave instead of flat. #pro-con(pros: [ - It is slanted at the bottom to give mechanical advantage. - The back is flat and will be easier to mount. - It could possibly double as a skirt. ], cons: [ - The shape is odd and would be hard to create. - The protrusion at the top may limit the effectiveness. - Not as slanted as the wedges. 
- It would not allow the robot to go over the barrier. ]) ], image("./plow.svg"), [ = Arms This design involves a piece of metal or plastic at 90#sym.degree mounted to the front of the robot. #pro-con( pros: [ - The back is flat and would be easy to mount. - If mounted really close to the floor it would be able to slide easily under a skirt. - It can help to push other robots. ], cons: [ - Not slanted, forfeiting mechanical advantage - Would not allow the robot to go over the barrier. ], ) ], image("./forklift.svg"), )
https://github.com/ymgyt/blog
https://raw.githubusercontent.com/ymgyt/blog/main/content/entry/riakuto-3ed/index.md
markdown
+++
title = "⚛️ Riakuto! 3rd edition was outstanding"
slug = "riakuto-3ed"
date = "2020-10-10"
draft = false
aliases = ["/entry/books/riakuto"]

[taxonomies]
tags = ["book"]
+++

When I asked a frontend engineer at my company for recommended React/TypeScript material, they pointed me to Riakuto!.
Reading it turned out to be so interesting — and, at the risk of exaggeration, genuinely moving — that I'm writing up my impressions.

## The book I read

Riakuto! Painless React Development with TypeScript, 3rd edition (200912)

{{figure(caption="Riakuto! 3rd edition", images=[
"images/riakuto_1.jpeg",
"images/riakuto_2.jpeg",
"images/riakuto_3.jpeg",
], width="32%") }}

* [Ⅰ. Language & Environment](https://oukayuka.booth.pm/items/2368045)
* [Ⅱ. React Basics](https://oukayuka.booth.pm/items/2368019)
* [Ⅲ. Advanced React](https://oukayuka.booth.pm/items/2367992)

## Frontend and me

At my day job I write Go on the backend of [movo](https://movo.co.jp/), a SaaS for the logistics industry. Our product frontends are written in React (16.9)/TypeScript (3.7), but since I only touch the swagger/openapi generation, I wanted to become able to read the frontend code. As for React, I had bought a Udemy course and stalled halfway through building a TODO app. (https://todo.ymgyt.io/)

My takeaway from trying React was that learning React + TypeScript + Redux + other libraries (router/form/etc...) all at once is hard. The table of contents of Riakuto! seemed to cover, in a balanced way, exactly the areas I was fuzzy about — environment setup (including linters), language features (modern JavaScript, TypeScript), React (Hooks), and routing/Redux — so I decided to read it.

## What I liked, in short

Above all, the book properly explains the background of each React API and library as it appears, and the problems it was introduced to solve. For someone like me who hadn't kept up with frontend development at work, this chronological context was hard to get from documentation alone, so I was very grateful for it.

It is written with real work in mind: there is plenty of information that helps when reading actual code, such as linter configuration and coverage of both v5 and v6 of React Router.

It makes you like React. There were parts my ability didn't let me fully absorb (chapter 13 in particular), but I came away liking React.

## Part Ⅰ (Language / Environment)

From here on, my impressions chapter by chapter.

### Chapter 1: Hello, React

I had never really understood why frontend development needs Node, so I warmed to this book when it explained exactly that from the start. It also kindly shows how to pin versions with nodeenv/nvm and friends. As for editors, VS Code is emphatically recommended. (Personally I'd rather language and editor/IDE not be tightly coupled, with every feature also provided as a CLI.)

I also didn't know that a production build emits the compiled output to `/static/js/main.chunk.js`. Values passed through environment variables like `REACT_APP_API_TOKEN` seemed to appear there too; the CRA (create-react-app) docs likewise warn against putting secrets in them (https://create-react-app.dev/docs/adding-custom-environment-variables/). But then, does that mean no secret can be passed to the frontend at all?

yarn
upgrade-interactive is handy; Node CLIs are colorful, emoji-laden, and friendly.

### Chapter 2: The edgy and deep world of JavaScript

I didn't know about the spec whereby accessing a property on a JS primitive automatically converts it to a wrapper object.

```sh
> 'Hello _'.replace('_', 'ymgyt')
'Hello ymgyt'
> (new String('Hello _')).replace('_', 'ymgyt')
'Hello ymgyt'
>
```

That outside of methods and `new`, `this` refers to the global object (window, global) is a remarkable piece of language design.

```javascript
class Person {
  constructor(name) {
    this.name = name;
  }

  greet() {
    const doIt = function() {
      console.log(`Hi, ${this.name}`);
    };
    doIt();
  }
}

const ymgyt = new Person('ymgyt');
ymgyt.greet(); // => Uncaught TypeError: Cannot read property 'name' of undefined
```

`doIt` is not a method, so its `this` refers to the global object → but class bodies run in strict mode, where such a `this` access yields undefined — hence the error. To keep `this` from being undefined you can:

- pin `this` with `bind()`
- specify `this` with `call()` / `apply()`
- stash `this` in a temporary variable
- use an arrow function

I was happy to finally understand why callbacks passed from class-based React components had to be bound.

CommonJS, Browserify, AMD, ES Modules — so modules came in many flavors. Seeing what webpack actually does, I had to revise my mental model that frontend code written in a scripting language simply runs as-is in user space. At this point, treating the browser as just one more compile target — the way WASM compiles to a binary — seems like the natural direction. (I want to write frontends in Rust.)

### Chapter 3: Let's go functional

The functional programming explanations involve little that is js/ts specific, so they read smoothly. I was moved when async/await landed in Rust; I wonder how it felt for longtime JS developers.

### Chapter 4: Keep your types safe with TypeScript

ts-node is very convenient when you just want to try something quickly. Providing both `any` and `unknown` feels very JS-like, given the diversity of inputs and outputs the language handles; in Rust or Go you would probably reach for byte slices. Using `never` to catch missing switch cases over an enum/string-literal union struck me as excellent.

The book explains why composition beats inheritance, and I felt that I, like the author (Yukina), should be able to state the reasons crisply. When someone dissed Go for lacking inheritance, I couldn't mount a forceful defense and it stung.

I didn't know that defining a class in TS declares both an interface and a constructor function. I didn't even have the concept of a "type context", so that notion was fresh.

```typescript
const permissions = {
  r: 0b100,
  w: 0b010,
  x: 0b001,
};

type PermsChar = keyof typeof permissions; // 'r' | 'w' | 'x'

const readable: PermsChar = 'r';
const writable: PermsChar = 'z'; // not assignable !
```

It's impressive that you can write expressions that manipulate types like this.

Built-in utility types such as `Pick` and `Omit` are something I'd love to have in Rust too. You define an entity, but the properties required at creation time and the properties users may update differ subtly, so rather than defining three types you can write

```typescript
type Blog = {
  id: Id,
  content: string,
  tags: Tag[],
};

type CreateBlogParam = Pick<Blog, "content">;
type UpdateBlogParam = Omit<Blog, "id">;
```

so that the intent — the values required at creation, the values allowed to be updated — comes through. What do you think?

## Part Ⅱ (React Basics)

Part Ⅰ was largely js/ts preparation for React; from Part Ⅱ the React story begins in earnest, and the book's appeal is on full display.

### Chapter 5: Expressing UI with JSX

I had thought of JSX as roughly a DSL for React, but the book stresses that it is syntactic sugar for `React.createElement`. (I also learned for the first time that "syntax sugar", as commonly used, is wasei-eigo — a Japanese-English coinage.)

The explanation that MVC separates concerns by technical role, whereas React separates concerns by unit of application functionality through components, was a revelation for me.

> What we are building is not a simple app that just returns one static HTML page per URL request. It is an app with parallel asynchronous communication with multiple external APIs, caching of fetched data and persistence to local storage, UI state that changes instantly with user interaction, and sometimes even access to devices such as the camera and GPS — interactive and highly responsive. The right comparison is mobile or desktop apps; it should not be thought of as an extension of server-side web applications. (p18)

Starting from this view, the book classifies frontend technology camps by their stance on HTML templates and explains the thinking React is based on — very instructive. Indeed, with a template format you can't do things like early returns.

My vague sense had been that the frontend framework scene was a React/Vue duopoly with Angular in third, but npm download counts show React overwhelmingly dominant.

React samples always import both `react` and `react-dom`, and I had wondered why the DOM part is split off. As "HTML is just the beginning." suggests, from React's perspective the DOM is merely one render (output) target — a design that reflects a remarkably high vantage point.

Instead of jumping straight into JSX syntax, the book lays out this much background first, which I deeply appreciate. Background and philosophy like this usually spread by word of mouth or get absorbed through hands-on experience, so a book that states them this plainly is truly precious.

### Chapter 6: Becoming code-beautiful with linters and formatters

I never really had a handle on the profusion of frontend linters. Once you want to experiment with React you inevitably end up building your own environment — CRA only takes you so far — so I'm really thankful the book includes the full setup process from scratch. (Although, according to the author's Twitter, a new recommended configuration has since been published.)

Reconciling ESLint's official recommended config with TypeScript by carefully prioritizing rules looks like hard work. In practice you would tune this to fit the team as you go, but since the language's safety story relies on IDE + linter configuration, you need a fair amount of understanding before TypeScript's strengths pay off — you can feel the pain here. On top of that a code formatter joins in, and formatting can then trip lint violations. (An official formatter really is a blessing.) At the project top
level you end up with `.eslintignore`, `.eslintrc.js`, `.prettierrc`, `.stylelintrc.js`, and `tsconfig.eslint.json`, so no wonder there is a job title called Frontend DevOps. (There is even a dedicated CLI, `eslint-config-prettier-check`, just for detecting config conflicts — impressive.) I suspect this comes from the difficulty of handling js/ts/css — technology stacks with different paradigms — in a unified way.

### Chapter 7: The frontend history surrounding React

> 7-1 The eve of React: it all started with the Google Maps shock

Fascinating from the very first line. Starting with Google Maps and Ajax, the chapter narrates the evolution of frontend technology: prototype.js, jQuery, Backbone.js, AngularJS, Knockout.js, Vue. I had only ever heard of Web Components, so that part was educational too.

The React site (https://reactjs.org/) says Declarative, Component-Based, Learn Once, Write Anywhere; the story of how those slogans evolved was great fun.

I had understood nothing at all about the virtual DOM, but I can now at least picture it at the mental-model level. If the virtual DOM were a concept of `react` rather than `react-dom`, wouldn't that contradict the design policy of not caring about the render environment? Or is the DOM itself an abstract notion not limited to browsers? (Just as I was wondering this, 秋谷さん voiced exactly that concern for me.)

Some people say Vue is simpler and easier to use than React, but apparently that too gets painful as things grow large.

### Chapter 8: Components above all

My understanding of class components had been limited to "pass methods to event handlers as arrow functions or things break", but thanks to chapter 2's careful treatment of `this`, the reason finally made sense.

```
// NG
<Button onClick={this.handleClick}

// OK
<Button onClick={() => this.handleClick()}
```

On Presentational and Container Components: the blog post the book references (https://medium.com/@dan_abramov/smart-and-dumb-components-7ca2f9a7c7d0) now carries a note saying that since hooks can do the same thing, the author no longer recommends splitting components this way.

> Update from 2019: I wrote this article a long time ago and my views have since evolved. In particular, I don’t suggest splitting your components like this anymore. If you find it natural in your codebase, this pattern can be handy. But I’ve seen it enforced without any necessity and with almost dogmatic fervor far too many times. The main reason I found it useful was because it let me separate complex stateful logic from other aspects of the component. Hooks let me do the same thing without an arbitrary division. This text is left intact for historical reasons but don’t take it too seriously.
When an influential person proposes something and it then gets enforced "without any necessity" and with "dogmatic fervor", that hits close to home. The key point seems to be separating stateful logic.

Riakuto, of course, addresses the retraction, and then argues that the idea is still valid today, for two reasons:

- it coexists well with a design guide
- thinking about the two separately encourages maintainable design

On coexistence with a design guide: is the idea that hooks are essentially side effects, so keeping them separate makes mocking easier? Storybook is used in our product, but I haven't used it myself, so this hasn't clicked for me yet.

The reason it leads to maintainable design is that it maps onto the steps of the official React document Thinking in React (https://ja.reactjs.org/docs/thinking-in-react.html).

#### Thinking in React

Having the official docs lay out even the development flow and mindset is fantastic. Clearly a must-read, so let's go through the steps.

- English (https://reactjs.org/docs/thinking-in-react.html)
- Japanese (https://ja.reactjs.org/docs/thinking-in-react.html)

##### Step 1: Break the UI into a component hierarchy

A screen mock is assumed as the starting point, and from there you decompose it into components. The single responsibility principle serves as one criterion. Whether to split a table component from a table-header component, though, seems to be a matter of taste (split once sorting and such make it complex) — this part apparently comes down to experience. In the end you obtain a component hierarchy like:

```
- FilterableProductTable
  - SearchBar
  - ProductTable
    - ProductCategoryRow
    - ProductRow
```

##### Step 2: Build a static version in React

> It is important to keep the implementation of display separate from the implementation of user interaction. When you build the static (non-interactive) version, there is a lot of typing but little thinking; when you implement user interaction, there is a lot of thinking but little typing. We will see why later.

In short: keep the display concern separated, no matter what.

##### Step 3: Identify the minimal (but complete) representation of UI state

> Go through each piece of data and consider which of them could be state. For each, ask three questions:
> Is it passed in from a parent via props? If so, it is not state.
> Does it remain unchanged over time? If so, it is not state.
> Can it be computed from other props or state in the component?
> If so, it isn't state.

If asked "what is the difference between state and props?", I couldn't answer on the spot, but now I have this checklist to think with. When reading existing code, I will also pay attention to how state and props are divided. My guess is that, fundamentally, user input and selections become state.

##### Step 4: Identify where the state should live

For each piece of state:

* Identify every component that renders something based on that state.
* Find their common parent component (a single component above all the components that need the state in the hierarchy).
* Either that common parent, or another component even higher in the hierarchy, should own the state.
* If no component makes sense as the owner, create a new component solely to hold the state, and place it above the common parent component you found.

So when state is shared, you keep climbing up the hierarchy.

##### Step 5: Add inverse data flow

So that lower components can update state owned by upper components, pass update callbacks down to them as props.

> Hopefully, this gives you an idea of how to think about building components and applications with React. While it may be a little more typing than you’re used to, remember that code is read far more than it’s written, and it’s less difficult to read this modular, explicit code. As you start to build large libraries of components, you’ll appreciate this explicitness and modularity, and with code reuse, your lines of code will start to shrink. :)

A teaching that sinks deep into the heart.

Indeed, writing Presentational and Container Components separately forces the Thinking-in-React flow of building the display first and then separating out state and logic. For someone like me writing React for the first time, having a rule that says "this kind of code goes here" is a real blessing.

### Chapter 9: Hooks, the combining power-up part for function components

The premise: what we want is to share stateful logic between components. Initially a technique called mixins was used, but it ended up being declared harmful by the React team itself. So let's look at the referenced post, Mixins Considered Harmful (https://reactjs.org/blog/2016/07/13/mixins-considered-harmful.html).

#### Mixins Considered Harmful

> “How do I share the code between several components?” is one of the first questions that people ask when they learn React. Our answer has always been to use component composition for code reuse. You can define a component and use it in several other components.

> It is not always obvious how a certain pattern can be solved with composition.
> React is influenced by functional programming but it came into the field that was dominated by object-oriented libraries. It was hard for engineers both inside and outside of Facebook to give up on the patterns they were used to. To ease the initial adoption and learning, we included certain escape hatches into React. The mixin system was one of those escape hatches, and its goal was to give you a way to reuse code between components when you aren’t sure how to solve the same problem with composition.

Apparently mixins were largely an escape hatch that let developers used to object-oriented patterns reuse code in a familiar way. Also,

> This doesn’t mean that mixins themselves are bad. People successfully employ them in different languages and paradigms, including some functional languages.

so the post is not saying mixins are bad in themselves. The reasons given for mixins being fragile are:

* Mixins introduce implicit dependencies. Renaming a piece of state means hunting through the list of mixins, and dependencies also arise between mixins.
* Mixins share a namespace. If `FluxListenerMixin` and `WindowSizeMixin` each define a `handleChange()`, things break; and when a mixin author adds a new method, it can potentially break every codebase currently using that mixin.
* Mixin complexity snowballs. A `HoverMixin` that tracks mouse hover gets used by a `TooltipMixin` that controls tooltip display; `getTooltipOptions()` is added to `TooltipMixin` to control tooltip direction; meanwhile, a component showing tooltips wants a delay on the hover check, so `getHoverOptions()` also lands in `TooltipMixin`. At that point `HoverMixin` and `TooltipMixin` are tightly coupled. That does sound painful.

The official post includes concrete code samples, so the mixin pain that Riakuto mentions became somewhat tangible.

#### HOC

The alternative presented in Mixins Considered Harmful (https://reactjs.org/blog/2016/07/13/mixins-considered-harmful.html) is the HOC. From the referenced gist (https://gist.github.com/sebmarkbage/ef0bf1f338a7182b6775):

```javascript
import { Component } from "React";

export var Enhance = ComposedComponent => class extends Component {
  constructor() {
    this.state = { data: null };
  }
  componentDidMount() {
    this.setState({ data: 'Hello' });
  }
  render() {
    return <ComposedComponent {...this.props} data={this.state.data} />;
  }
};
```

```javascript
import { Enhance } from
"./Enhance";

class MyComponent {
  render() {
    if (!this.data) return <div>Waiting...</div>;
    return <div>{this.data}</div>;
  }
}

export default Enhance(MyComponent); // Enhanced component
```

So the consumer wraps itself in the provided HOC, and can then write code on the assumption that the props provided as the HOC's API will arrive — is that the right understanding?

#### Render Props

With HOCs barely digested, yet another technique appears. But some libraries are built on render props, so I need at least enough understanding to read their APIs.

```tsx
const MyComponent: FC<{ providerArg?: number }> = ({ providerArg }) => (
  <XXXProvider providerArg={providerArg}>
    {({ logic }) => (
      <div>{logic()}</div>
    )}
  </XXXProvider>
);
```

Still quite shaky, but the point seems to be that, as you write it, it is clear which values control the Provider and which values the Provider gives you.

Let me also read "Use a Render Prop!" (https://cdb.reacttraining.com/use-a-render-prop-50de598f11ce), which Riakuto cites.

> A render prop is a function prop that a component uses to know what to render.

Rather than wrapping as with a HOC, the consuming component writes, as children, the code that renders what it wants and hands it to the Provider — something like that?

Whether HOC or render props, hooks are what you use now, so loosely grasping what each was trying to achieve is probably enough.

#### Hooks

What is great about the Riakuto books is that individual developers' names appear constantly, along with the community's reactions at announcement time and quotes from READMEs, so you can relive the dynamism of the development. So let's watch the conference talk where Hooks were first announced (https://www.youtube.com/watch?v=dpw9EHDh2bM).

##### React Today and Tomorrow and 90% Cleaner React With Hooks

The opening keynote lists three "sucks" (problems) of React:

* Reusing logic
* Giant components
* Confusing classes

These are presented not as independent problems but as symptoms of one: React provides no simpler, smaller, lightweight primitive to add state or lifecycle. Sample code written as a class component is then rewritten using hooks. "Hook" is first mentioned at 23:51 — the nuance is that you hook React features into a component.

```javascript
import React, { useState, useEffect, useContext } from 'react';
import Row from './Row';
import { ThemeContext } from './context';

export default function Greeting(props) {
  const name = useFormInput('Mary');
  const theme = useContext(ThemeContext);
  useDocumenTitle(name.value);

  const [width, setWidth] = useState(window.innerWidth);
  useEffect(() => {
    const handleResize = () => setWidth(window.innerWidth);
    window.addEventListener('resize', handleResize);
    return () => {
      window.removeEventListener('resize', handleResize);
    };
  });

  return (
    <section className={theme}>
      <Row label="Name">
        <input {...name} />
      </Row>
    </section>
  );
}

function useFormInput(initialValue) {
  const [value, setValue] = useState(initialValue);
  function handleChange(e) {
    setValue(e.target.value);
  }
  return {
    value,
    onChange: handleChange
  };
}

function useDocumenTitle(title) {
  useEffect(() => {
    document.title = title;
  });
}
```

Wow — compared with class components, hooks really are flat. `useXXX` seems to encapsulate the state and logic of a single concern nicely. Facebook apparently tried this in production before announcing it, which makes adoption feel much safer.

And at the end, in R<NAME>rence's `useReducer()` demo: once `useState()` calls multiply and state updates get scattered around, that seems to be the cue to reach for `useReducer()`. Tied to the rule that hooks must not be called inside an `if`, "unconditionally" is apparently the fun keyword here — I want to use it when talking with the frontend folks.

##### How the Effect Hook differs from lifecycle methods

To give function components the capabilities of lifecycle methods, `useEffect` is provided. Riakuto explains the differences between `useEffect` and `componentDidMount`/`componentDidUpdate` (abbreviated `cDM` and `cDU` in the demo's comments) along three axes:

* when they run
* the immediacy of props and state values
* the unit of cohesion

##### When they run

The timing of `useEffect` seems essential to internalize. It runs after the component has rendered with its initial values, so slow work such as API calls is fine there. There is also a `useLayoutEffect` hook, apparently rarely needed; when I asked a frontend colleague about the difference, they said it is for when you want to use DOM measurements (like element heights) to do something.

##### Immediacy of props and state values

Class components and function components apparently carry different mental models for props and state variables. It didn't quite click, so let me read the referenced article "How Are Function Components Different from Classes?" (https://overreacted.io/ja/how-are-function-components-different-from-classes/).

###### How are function components different from classes?
```javascript
class ProfilePage extends React.Component {
  showMessage = () => {
    alert('Followed ' + this.props.user);
  };

  handleClick = () => {
    setTimeout(this.showMessage, 3000);
  };

  render() {
    return <button onClick={this.handleClick}>Follow</button>;
  }
}
```

```javascript
function ProfilePage(props) {
  const showMessage = () => {
    alert('Followed ' + props.user);
  };

  const handleClick = () => {
    setTimeout(showMessage, 3000);
  };

  return (
    <button onClick={handleClick}>Follow</button>
  );
}
```

A component that shows "Followed …" when you press the button. The two look behaviorally identical, but the class version has a bug: if `props.user` changes after the click and the component re-renders, the post-change user gets shown. Is the reason that the object `this` refers to changes across the props update? To fix the bug:

```javascript
class ProfilePage extends React.Component {
  render() {
    // Capture the props!
    const props = this.props;

    // Note: we are *inside render*,
    // so these are not class methods.
    const showMessage = () => {
      alert('Followed ' + props.user);
    };

    const handleClick = () => {
      setTimeout(showMessage, 3000);
    };

    return <button onClick={handleClick}>Follow</button>;
  }
}
```

you capture the value you want to keep in a closure. A function component, then, is effectively the same as writing everything inside this `render()` — is that the right way to see it?

##### The unit of cohesion

This was emphasized in the hooks demo video (https://www.youtube.com/watch?v=dpw9EHDh2bM) and clicked fairly well. The idea of returning a cleanup routine as the return value shows up in many APIs, and it is one I want to borrow.

##### Memoization

Reading our product code, `useMemo` and `useCallback` appear all over, and I never understood them, so covering them here truly helps. According to the official docs (https://reactjs.org/docs/hooks-reference.html#usecallback):

`useCallback(fn, deps) is equivalent to useMemo(() => fn, deps)`

I can't read React's source at all, but searching around, is this the relevant bit?

```typescript
export function useCallback<T>(
  callback: T,
  deps: Array<mixed> | void | null,
): T {
  return useMemo(() => callback, deps);
}
```

https://github.com/facebook/react/blob/ddd1faa1972b614dfbfae205f2aa4a6c0b39a759/packages/react-dom/src/server/ReactPartialRendererHooks.js#L447

##### useRef
I naturally had no idea about `useRef` either; our product code passes `useRef` return values into form components, and I kept wondering why. From "How Are Function Components Different from Classes?" (https://overreacted.io/ja/how-are-function-components-different-from-classes/):

> In a class, you would read `this.props` or `this.state`, because `this` itself is mutable — React mutates it. In function components, you can also hold a mutable value that is shared between every render of the component. It is called a "ref".

> A ref plays the same role as an instance field. It is an escape hatch into the mutable, imperative world. You may be familiar with "DOM refs", but the concept is far more general: it is simply a box you can put something into.

So is it right to think of this as the hook that provides function components with the capability of instance fields?

### Chapter 10: Routing in React

The chapter properly explains routing starting from the SPA way of thinking, which helped a lot. In a SPA, the URL shown in the browser amounts to an identifier of UI state. It was a lesson for me that it is not enough for DOM rewrites to merely *look* like page navigation — the browser's session history has to be kept in sync as well.

With a SPA, the relationship between API requests and page transitions becomes complicated, so to measure something like pageviews you cannot just analyze access logs; you need a dedicated mechanism. How does everyone handle this? Is Google Analytics via the likes of react-ga, as introduced, the only real option? Personally I would like to stand up something like a metrics API or carve out an endpoint, but from the "focus on the core of the business" viewpoint, maybe that is better avoided — a dilemma.

#### React Router

The relationship between React Router and Reach Router is explained very carefully; the walkthrough of React Router v3, v4, v5, and v6 was especially valuable. Our product uses both v5 and v6, and understanding the background of the API changes made things fall into place. Maybe it only shows how little I knew, but Riakuto is worth buying for this chapter alone. The gentle primer it gives before sending you to the official docs (https://reactrouter.com/web/guides/quick-start) is also appreciated.

In a SPA you apparently even have to account for scroll position on page transitions (History API updates).

My naive thought is that any React web application will need routing, so the React team could ship it officially — but perhaps leaving it to third parties is exactly what "just the UI" means.

### Chapter 11: Handling global state with Redux

At last, Redux. By now, Redux/Flux ideas and APIs come up even in non-frontend blog posts, so there is no avoiding it. The part I personally found most confusing: values that used to be handed to components through props suddenly vanish from the code, replaced by a call to the HOC's `mapStateToProps`.

The article in which Facebook announced Flux (https://www.infoq.com/jp/news/2014/05/facebook-mvc-flux/), with its MVC diagram (https://res.infoq.com/news/2014/05/facebook-mvc-flux/ja/resources/flux-react-mvc.png), did look to me like the controller was drawn as overly bloated, though.

Handled naively, updating state is a side effect; expressing the update as an Action and writing it in the form reduce(Action, State) =>
State is a really good idea. I would love to write backend update logic in the same framework someday.

#### Redux Style Guide

When I first tried to use React/Redux, my hands froze at directory layout. Tutorials differ wildly, and I assumed it was something each team decides based on product scale. Against that, Riakuto introduces design patterns such as Flux Standard Action and Ducks, and among them the Redux Style Guide (https://redux.js.org/style-guide/style-guide/) is the official guideline — impossible not to read. (Incidentally, our product apparently adopts not ducks but [re-ducks](https://github.com/alexnm/re-ducks).)

> You are encouraged to follow these recommendations, but take the time to evaluate your own situation and decide if they fit your needs.

So even the official guide warns against taking it dogmatically.

The Priority A (Essential) rules are:

* Do Not Mutate State
* Reducers Must Not Have Side Effects
* Do Not Put Non-Serializable Values in State or Actions
* Only One Redux Store Per App

Having had Riakuto explain Redux from its background, these feel fairly natural. Is `Set` given as a non-serializable value because the element order can change every time you serialize it?

When developing with a team of mixed backgrounds, an official list of "don'ts" is already a big deal: you can just say "read this first" instead of writing the documentation yourself.

Riakuto also distills the Priority B/C rules down to the ones that seem important. Reading the items about Actions, I finally understood the background of why the Action-related parts of our product code are written the way they are.

##### Evaluate Where Each Piece Of State Should Live

Redux's Three Principles — "Single source of truth" — states that

> The global state of your application is stored in an object tree within a single store.
but, the guide cautions, this does not mean every value should go into the Redux store: values that can be considered "local" should be held by components. So you do end up judging "this value lives in the store, that one in component state". When I first learned about Redux, I assumed everything was held in the store, and even asked why form values were managed with a form library rather than Redux. (I was a little delighted to learn a library called Redux Form exists.)

#### Redux Toolkit

redux-toolkit (https://github.com/reduxjs/redux-toolkit) is explained with concrete, very clear examples. Having followed the Redux story this carefully, seeing how short the code becomes is moving. That said, if I had known nothing about Redux and started straight from the toolkit, I might not have appreciated Redux's virtue that an action is a plain JS object and a reducer is just a function.

#### Redux DevTools

So this is what people mean by a godlike tool — development without it is hard to imagine. I want to develop backends this way too.

#### useReducer

For `useReducer`, the React Conf 2018 demo (https://www.youtube.com/watch?v=dpw9EHDh2bM) by <NAME>lorence — rewriting a component whose `useState` update logic had grown complicated into a simple `useReducer` version — was extremely easy to follow.

The essence of `useState` is a wrapper over `useReducer`; the explanation then continues into Fiber and its relationship with hooks, which I could not follow at all.

### Chapter 12: How React has fought asynchronous processing

Redux middleware is explained in the context of handling asynchronous API calls, and I feel I can finally place redux-thunk: easy to start with, but prone to chaos if you are not careful.

This area also affects the backend — which API endpoints get hit at what timing — so I had wanted to understand it.

redux-saga looks like it has a steep learning curve with its DSL and APIs, but isolating side effects into a separate mechanism is attractive, and keeping action creators simple feels more Redux-way. I am not yet familiar with the Saga pattern, but once my understanding of microservices improves, I might become a saga advocate. It is exciting when the microservices story and the frontend story connect.

The final concept, micro frontends, was too far in the future for me to follow. Framed within the larger flow from centralization toward distribution, though, the claim that distribution eventually reaches the frontend did sound persuasive.

(Maybe everyone just wants Redux DevTools, not Redux..?)
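The redux-thunk middleware discussed in this chapter is famously tiny. As a reference point, here is a hedged, self-contained sketch of the idea — the `createStore` below is a toy stand-in written for this example, not Redux's real implementation:

```javascript
// Thunk-style middleware: if the dispatched action is a function,
// call it with (dispatch, getState) instead of reducing it.
const thunk = ({ dispatch, getState }) => next => action =>
  typeof action === 'function' ? action(dispatch, getState) : next(action);

// Toy store, just enough to exercise the middleware (not Redux's createStore).
function createStore(reducer, middleware) {
  let state = reducer(undefined, { type: '@@INIT' });
  const store = {
    getState: () => state,
    dispatch: (action) => { state = reducer(state, action); return action; },
  };
  // Late-bound dispatch so thunks re-enter the full middleware chain.
  const api = { getState: store.getState, dispatch: (a) => store.dispatch(a) };
  store.dispatch = middleware(api)(store.dispatch);
  return store;
}

const counter = (state = 0, action) => (action.type === 'INC' ? state + 1 : state);
const store = createStore(counter, thunk);

store.dispatch({ type: 'INC' });                          // plain action
store.dispatch((dispatch) => dispatch({ type: 'INC' })); // thunk-style action
console.log(store.getState()); // → 2
```

The real redux-thunk adds an extra-argument feature on top, but its core is essentially this one middleware function, which is exactly why it is easy to start with — and why every async concern ends up inside ad-hoc thunks unless you impose discipline.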
#### The Effect Hook: the path shown by the React team

The sample code combining `useEffect` with Redux's `useSelector`/`useDispatch` for data fetching is exactly the approach in our recent product code, and just as I thought "so this is today's best practice", the book hints that the next contender has already arrived — I felt like Achilles chasing the tortoise.

#### The arrival of the New Context API

I had overheard the frontend folks debating whether to go with Context or with Redux, and now I roughly understand the background. Official support for sharing data between components looks quite attractive.

### Chapter 13: Declarative UI for data fetching with Suspense

"Suspense...?" was my state of mind. That `useEffect` is not declarative at all is exactly right; even with custom hooks, I would be too scared to use some `useXXX` without reading its implementation.

That you can `throw` not just an `Error` but any object — how very JavaScript. (If you can throw Redux actions, could you build your own homegrown Redux...?)

We have not adopted GraphQL in the product yet, but we expect to eventually, so the notes on the library landscape are useful.

#### The advantages of Suspense, and Concurrent Mode

Embarrassingly, I did not know web performance had indicators like these (https://web.dev/lighthouse-performance/). I had only ever looked at request timings in DevTools or server-side elapsed time.

#### Concurrent Mode

I could barely keep up. Is this, again, a story independent of the rendering environment such as the DOM? It is impressive that the book explains, with this many concrete examples, a feature that has not yet had a stable official release. Give it a bit more time and I suspect we will look back and say, "Riakuto was amazing for explaining this much at that point."

## Closing

Practical usefulness and the joy of reading coexist in this book, and I wholeheartedly recommend it. Personally, I was especially happy to learn that Rust is used in deno (a Node-like runtime) and swc (a TypeScript compiler).
https://github.com/LDemetrios/Typst4k
https://raw.githubusercontent.com/LDemetrios/Typst4k/master/src/test/resources/suite/layout/inline/justify.typ
typst
--- justify --- #set page(width: 180pt) #set par(justify: true, first-line-indent: 14pt, spacing: 5pt, leading: 5pt) This text is justified, meaning that spaces are stretched so that the text forms a "block" with flush edges at both sides. First line indents and hyphenation play nicely with justified text. --- justify-knuth-story --- // LARGE #set page(width: auto, height: auto) #set par(leading: 4pt, justify: true) #set text(font: "New Computer Modern") #let story = [ In olden times when wishing still helped one, there lived a king whose daughters were all beautiful; and the youngest was so beautiful that the sun itself, which has seen so much, was astonished whenever it shone in her face. Close by the king’s castle lay a great dark forest, and under an old lime-tree in the forest was a well, and when the day was very warm, the king’s child went out into the forest and sat down by the side of the cool fountain; and when she was bored she took a golden ball, and threw it up on high and caught it; and this ball was her favorite plaything. ] #let column(title, linebreaks, hyphenate) = { rect(inset: 0pt, width: 132pt, fill: rgb("eee"))[ #set par(linebreaks: linebreaks) #set text(hyphenate: hyphenate) #strong(title) \ #story ] } #grid( columns: 3, gutter: 10pt, column([Simple without hyphens], "simple", false), column([Simple with hyphens], "simple", true), column([Optimized with hyphens], "optimized", true), ) --- justify-manual-linebreak --- // Test that lines with hard breaks aren't justified. #set par(justify: true) A B C \ D --- justify-justified-linebreak --- // Test forced justification with justified break. A B C #linebreak(justify: true) D E F #linebreak(justify: true) --- justify-basically-empty --- // Test that there are no hick-ups with justification enabled and // basically empty paragraph. 
#set par(justify: true) #"" --- justify-shrink-last-line --- // Test that the last line can be shrunk #set page(width: 155pt) #set par(justify: true) This text can be fitted in one line. --- justify-avoid-runts --- // Test that runts are avoided when it's not too costly to do so. #set page(width: 124pt) #set par(justify: true) #for i in range(0, 20) { "a b c " } #"d" --- justify-no-leading-spaces --- // Test that justification cannot lead to a leading space #set par(justify: true) #set text(size: 12pt) #set page(width: 45mm, height: auto) lorem ipsum 1234, lorem ipsum dolor sit amet #" leading whitespace should still be displayed" --- justify-code-blocks --- // Test that justification doesn't break code blocks #set par(justify: true) ```cpp int main() { printf("Hello world\n"); return 0; } ``` --- justify-chinese --- // In Chinese typography, line length should be multiples of the character size // and the line ends should be aligned with each other. Most Chinese // publications do not use hanging punctuation at line end. #set page(width: auto) #set par(justify: true) #set text(lang: "zh", font: "Noto Serif CJK SC") #rect(inset: 0pt, width: 80pt, fill: rgb("eee"))[ 中文维基百科使用汉字书写,汉字是汉族或华人的共同文字,是中国大陆、新加坡、马来西亚、台湾、香港、澳门的唯一官方文字或官方文字之一。25.9%,而美国和荷兰则分別占13.7%及8.2%。近年來,中国大陆地区的维基百科编辑者正在迅速增加; ] --- justify-japanese --- // Japanese typography is more complex, make sure it is at least a bit sensible. 
#set page(width: auto) #set par(justify: true) #set text(lang: "ja", font: ("Libertinus Serif", "Noto Serif CJK JP")) #rect(inset: 0pt, width: 80pt, fill: rgb("eee"))[ ウィキペディア(英: Wikipedia)は、世界中のボランティアの共同作業によって執筆及び作成されるフリーの多言語インターネット百科事典である。主に寄付に依って活動している非営利団体「ウィキメディア財団」が所有・運営している。 専門家によるオンライン百科事典プロジェクトNupedia(ヌーペディア)を前身として、2001年1月、ラリー・サンガーとジミー・ウェールズ(英: <NAME> "Jimbo" Wales)により英語でプロジェクトが開始された。 ] --- justify-whitespace-adjustment --- // Test punctuation whitespace adjustment #set page(width: auto) #set text(lang: "zh", font: "Noto Serif CJK SC") #set par(justify: true) #rect(inset: 0pt, width: 80pt, fill: rgb("eee"))[ “引号测试”,还, 《书名》《测试》下一行 《书名》《测试》。 ] 「『引号』」。“‘引号’”。 --- justify-variants --- // Test Variants of Mainland China, Hong Kong, and Japan. // 17 characters a line. #set page(width: 170pt + 10pt, margin: (x: 5pt)) #set text(lang: "zh", font: "Noto Serif CJK SC") #set par(justify: true) 孔雀最早见于《山海经》中的《海内经》:“有孔雀。”东汉杨孚著《异物志》记载,岭南:“孔雀,其大如大雁而足高,毛皆有斑纹彩,捕而蓄之,拍手即舞。” #set text(lang: "zh", region: "hk", font: "Noto Serif CJK TC") 孔雀最早见于《山海经》中的《海内经》:「有孔雀。」东汉杨孚著《异物志》记载,岭南:「孔雀,其大如大雁而足高,毛皆有斑纹彩,捕而蓄之,拍手即舞。」 --- justify-punctuation-adjustment --- // Test punctuation marks adjustment in justified paragraph. // The test case includes the following scenarios: // - Compression of punctuation marks at line start or line end // - Adjustment of adjacent punctuation marks #set page(width: 110pt + 10pt, margin: (x: 5pt)) #set text(lang: "zh", font: "Noto Serif CJK SC") #set par(justify: true) 标注在字间的标点符号(乙式括号省略号以外)通常占一个汉字宽度,使其易于识别、适合配置及排版,有些排版风格完全不对标点宽度进行任何调整。但是为了让文字体裁更加紧凑易读,,,以及执行3.1.4 行首行尾禁则时,就需要对标点符号的宽度进行调整。是否调整取决于…… --- justify-without-justifiables --- // Test breaking a line without justifiables. #set par(justify: true) #block(width: 1cm, fill: aqua, lorem(2)) --- issue-2419-justify-hanging-indent --- // Test that combination of justification and hanging indent doesn't result in // an underfull first line. 
#set par(hanging-indent: 2.5cm, justify: true) #lorem(5) --- issue-4651-justify-bad-bound --- // Test that overflow does not lead to bad bounds in paragraph optimization. #set par(justify: true) #block(width: 0pt)[A B]
https://github.com/protohaven/printed_materials
https://raw.githubusercontent.com/protohaven/printed_materials/main/common-policy/core_values.typ
typst
= Core Values

*Inclusion*: We strive to create an environment where all feel welcome and are treated with dignity and respect.

*Empowerment*: We share our knowledge, resources, tools, and space to build the capacity of others.

*Collaboration*: We are a shared learning community that offers various levels of engagement to honor the creative aspirations of everyone.

*Agility*: We shift our focus and resources based on the needs of our members and the community.

*Sustainability*: We intentionally plan for the financial future of our organization and the environmental future of our planet.

= Community Guidelines

== Be Respectful

Our students, members, and partners bring with them a rich variety of experiences, perspectives, and aspirations. What brings us together is a shared interest in the process of making things. To create a welcoming, supportive environment, we ask each individual to treat one another with care and openness to diversity of race, gender, orientation, age, identity, and ability.

Actions that bully, degrade, or threaten any individual or group are not welcome in our community.

== Take Responsibility

Protohaven is built around shared resources - space, tools, knowledge, and the community itself. To preserve these communal assets, each individual is asked to take responsibility for their own actions and to be mindful of how they impact those around them. We are each accountable for ourselves, and don't expect others to bear the costs of our actions.

== Help Out

Protohaven is a public charity and is made possible only by people giving more than they take. In this spirit, we ask each individual to find opportunities to help each other, the organization, and our broader community. We also ask each individual to recognize that assistance is not always sought, to seek consent before providing aid, and to provide opportunity for others' work and discovery.
https://github.com/ufodauge/typst-algomod
https://raw.githubusercontent.com/ufodauge/typst-algomod/main/algomod_statements.typ
typst
MIT License
#import "algomod.typ": updateIndent, createStatement #let Function(funcName, args) = { let argBody = [$(#args)$] return createStatement( postIndent: 1, [*Function* #funcName#argBody *Do*], ) } #let Return(statement) = { return createStatement( [*Return* #statement], ) } #let End() = { return createStatement( preIndent: -1, [*End*], ) } #let Let(statement) = { return createStatement( [*Let* #statement], ) } #let If(condition) = { return createStatement( postIndent: 1, [*If* #condition *Then*], ) } #let ElseIf(condition) = { return createStatement( preIndent: -1, postIndent: 1, [*Else If* #condition *Then*], ) } #let Else() = { return createStatement( preIndent: -1, postIndent: 1, [*Else*], ) } #let ForAll(var, iterable) = { return createStatement( postIndent: 1, [*For All* #var $<-$ #iterable *Do*], ) } #let While(condition) = { return createStatement( postIndent: 1, [*While* #condition *Do*], ) } #let Repeat() = { return createStatement( postIndent: 1, [*Repeat*], ) } #let Until(condition) = { return createStatement( preIndent: -1, [*Until* #condition], ) }
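// Usage sketch (hedged): the page-level wrapper that lays out these
// statements lives in algomod.typ, which is not shown here, so the
// `algorithm` name below is an assumption. The statement constructors
// above would compose roughly like this:
//
// #algorithm(
//   Function("Abs", $x$),
//     If($x < 0$),
//       Return($-x$),
//     Else(),
//       Return($x$),
//     End(),
//   End(),
// )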
https://github.com/EricWay1024/Homological-Algebra-Notes
https://raw.githubusercontent.com/EricWay1024/Homological-Algebra-Notes/master/ha/9-tor.typ
typst
#import "../libs/template.typ": * = $Tor$ and Flatness <tor-flat> == Flat Modules #definition[ A #lrm $B$ is *flat* if $ (- tpr B)$ is exact. A #rrm $A$ is *flat* if $ (A tpr -)$ is exact. ] // #remark[ // #TODO link to algebraic geometry, algebraic continunity // // In some sense flatness gives "continuity". It appears in algebraic geometry. // // Assume $R$ is commutative, and let $M$ be an $R$-module. Then $tilde(M)$ quasi- ?? sheaf on $"Spec"(R)$. // ] // #proposition[ // Free modules are flat. // ] // #proof[ // Take free module $plus.circle.big _I R$ and any module $M$, then // $ // M tpr plus.circle.big _I R iso plus.circle.big_I M tpr R iso plus.circle.big_I M // $ // The functor $plus.circle.big_I$ is exact. // ] // #proposition[ // Projective modules are flat. // ] // #proof[ // Let $P$ be a projective module, then there exists module $Q$ such that $P xor Q$ is free. #TODO @rotman[Propositin 3.46] // ] // #remark[ // We already used that projective modules are flat. First we have free modules are flat. Second, if $P xor Q$ is free (i.e. $P$ is projective) then $- tpr P$ is exact. // ] #proposition[ Let $B$ a #lrm. The followings are equivalent: + $B$ is flat; + $Tor_n^R (A, B) = 0$ for all $n > 0$ and all $A$; + $Tor_1^R (A, B) = 0$ for all $A$. Similarly, let $A$ be a #rrm. The followings are equivalent: + $A$ is flat; + $Tor_n^R (A, B) = 0$ for all $n > 0$ and all $B$; + $Tor_1^R (A, B) = 0$ for all $B$. ] <flat-tor> #proof[@notes[Lemma 6.26]. We prove the #lrm case. (1) $=>$ (2). Suppose that $B$ is flat. Let $F_cx arrow.r A$ be a free resolution of $A$. Since $(minus times.circle_R B)$ is exact, the sequence $ dots.h arrow.r F_2 times.circle_R B arrow.r F_1 times.circle_R B arrow.r F_0 times.circle_R B arrow.r A times.circle_R B arrow.r 0 $ is exact, so the homology of $ dots.h arrow.r F_2 times.circle_R B arrow.r F_1 times.circle_R B arrow.r F_0 times.circle_R B arrow.r 0 $ vanishes in positive degree. (2) $arrow.r.double$ (3). Trivial. 
(3) $arrow.r.double$ (1). For any short exact sequence $0 arrow.r X arrow.r Y arrow.r A arrow.r 0$ in $ModR$, we have the long exact sequence of $Tor$ by @tor-les,
$ 0 eq "Tor"_1^R lr((A comma B)) arrow.r X times.circle_R B arrow.r Y times.circle_R B arrow.r A times.circle_R B arrow.r 0, $
which shows that $(- tpr B)$ is exact.

Note that the #rrm case relies on the balancing of $Tor$, but the proof is very similar.
]

#corollary[Every projective module is flat. In particular, every free module is flat. #footnote[We have already proven this claim in @projective-flat-1, because we needed it for the balancing of $Tor$ (@balance-tor). This second proof actually relies on the balancing of $Tor$ so we could not use it previously, but it is presented here regardless.]] <projective-flat-2>

#proof[
  If a left $R$-module $P$ is projective, by @balance-tor and @projective-left-zero, $"Tor"_n^R lr((A comma P)) iso L_n (A tpr -) (P) eq 0$ for all $A$ and all $n gt.eq 1$. Then applying @flat-tor gives the result. The case where $P$ is a projective #rrm is similar.
]

// Another way to prove the corollary is found @rotman[Proposition 3.46].

// Recall that since the tensor product is left adjoint, it commutes with all colimits. Thus $ (colim_I A_i) tpr B iso colim_I (A_i tpr B) $

#definition[
  A category $I$ is called *filtered* if
  + $I$ is non-empty;
  + For any $i, j in I$, there exists $k in I$ with morphisms $f : i -> k$ and $g : j -> k$;
  + For any $i, j in I$ with a pair of morphisms $u, v : i -> j$, there exists $k in I$ with a morphism $w : j -> k$ such that $w oo u = w oo v$.
]

#example[
  A non-empty partially ordered set (poset) $I$, viewed as a small category, is *filtered* if for any $i, j in I$, there exists $k$ such that $i <= k$ and $j <= k$. This is because condition (3) above is automatically satisfied, as there is at most one morphism $i -> j$ for any $i, j in I$.
]

#proposition[
  // For any small category $I$ and any functor $I -> ModR$, we have
  Let $I$ be a filtered category.
Then the functor
  $ colim_I: Fun(I, RMod) -> RMod $
  is exact.
] <colim-exact>

#proof[@weibel[Theorem 2.6.15, p.57].]

#remark[
  $colim_I$ is not an exact functor in general if $I$ is not filtered.
]

#notation[
  Let $I$ be a small category and $A : I -> RMod$ be a diagram. We denote $A_i = A(i)$ for each $i in I$ and we would write $colim_I A_i$ to mean $colim_I A$.
]

#corollary[
  Let $I$ be a filtered category and $A : I -> ModR$ be a diagram. Let $B in RMod$. Then $Tor_n^R (colim_I A_i, B) iso colim_I Tor_n^R (A_i, B)$. In other words, filtered colimits commute with $Tor$.
] <filter-tor>

#proof[
  Let $P_cx -> B$ be a projective resolution. Then
  $
    Tor_n^R (colim_I A_i , B ) &= H_n ((colim_I A_i) tpr P) \
    &iso H_n (colim_I (A_i tpr P)) \
    &iso colim_I H_n (A_i tpr P) \
    &= colim_I Tor_n^R (A_i, B),
  $
  where at each step we respectively use the definition of $Tor$, that colimits commute with tensor products, that $colim_I$ is exact and thus commutes with homology, and the definition of $Tor$ again.
]

#corollary[
  Let $I$ be a filtered category and $A : I -> ModR$ be a diagram. Suppose $A_i$ is flat for all $i in I$. Then $colim_I A_i$ is also flat. In other words, a filtered colimit of flat $R$-modules is flat.
] <filtered-flat-flat>

#proof[
  Take any $B in RMod$. Since each $A_i$ is flat, we know that $Tor_1 (A_i , B) = 0$ by @flat-tor. Then
  $ Tor_1 (colim_I A_i, B) = colim_I Tor_1 (A_i, B) = 0 $
  by @filter-tor, so $colim_I A_i$ is also flat by @flat-tor again.
]

// #proof[
//   In $RMod$, let $A = colim_I A_i$ where $I$ is a filtered poset and $A_i$ is flat for all $i in I$. We want to show $A$ is flat. If we have #sest
//   $ ses(B_1, B_2, B_3) $
//   then for all $i in I$,
//   $ ses(A_i tpr B_1, A_i tpr B_2, A_i tpr B_3) $
//   Taking $colim_I$, which is exact as $I$ is a filtered poset by @colim-exact, yields
//   $ ses(colim_I (A_i tpr B_1), colim_I (A_i tpr B_2), colim_I (A_i tpr B_3)) $
//   By @tensor-right-exact, $colim_I$ commutes with tensor products, hence
//   $ ses((colim_I A_i) tpr B_1, (colim_I A_i) tpr B_2, (colim_I A_i) tpr B_3) $
//   which is just
//   $ ses( A tpr B_1, A tpr B_2, A tpr B_3) $
//   showing that $A$ is flat.
// ]

#example[
  Let $s in R$ be a central element of ring $R$, then the localisation $R[s^(-1)]$ is a flat $R$-module. To generalise, for a central multiplicatively closed set $S subset Z(R)$, we can form $R[S^(-1)]$, which is a flat $R$-module as well.
]

#proof[
  @weibel[Theorem 3.2.2, p.69].
]

We now take a look at the case in $Ab$ and we shall show that a module in $Ab$ is flat if and only if it is torsion-free.

#lemma[
  Let $B in Ab$ and $p in ZZ$. Then $Tor_0^ZZ (ZZ over p ZZ, B) = B over p B$ and $Tor_1 ^ZZ (ZZ over p ZZ, B) = {b in B : p b = 0}.$
]

#proof[
  Use the definition of $Tor$, the projective resolution $0 -> ZZ ->^p ZZ -> ZZ over p ZZ -> 0$, and $B tpz ZZ iso B$.
]

#lemma[
  An abelian group is a filtered colimit of its finitely generated subgroups.
]

#proof[
  $A = union A_i = colim_I A_i $ where $I$ is a filtered poset representing the inclusion relations of the finitely generated subgroups of $A$.
  // Hence torsion-free abelian groups are flat.
]

#lemma[
  Let $A, B in Ab$. Then $Tor_1^ZZ (A, B)$ is a torsion abelian group.
]

#remark[
  This is likely why $Tor$ is called $Tor$.
]

#proof[
  By writing $A = colim_I A_i$ for finitely generated subgroups $A_i$ of $A$, we see that it suffices to show that each $Tor^ZZ_1 (A_i, B)$ is torsion. We can write $A_i$ as a direct sum of its torsion part and free part using the classification theorem for finitely generated abelian groups, i.e.
$A_i = ZZ over p_1 ZZ ds ZZ over p_2 ZZ ds ... ds ZZ over p_m ZZ ds ZZ^r$. Notice that $Tor$ commutes with direct sums and the free part $ZZ^r$ vanishes with $Tor$, so $ Tor_1 ^ZZ (A_i, B) iso plus.circle.big_(k=1)^m Tor_1^ZZ (ZZ over p_k ZZ, B) iso plus.circle.big_(k=1)^m B over p_k B, $ which is clearly a torsion abelian group. ] #lemma[ Let $B in Ab$. Then $Tor_1 ^ZZ (QQ over ZZ, B)$ is the torsion subgroup of $B$, i.e. ${b in B : "there exists" n in ZZ "such that" n b = 0 }.$ ] #proof[ $QQ over ZZ$ can be written as the filtered colimit $QQ over ZZ iso colim_I ZZ over p ZZ$, where $I$ is the poset representing the divisibility of natural numbers. Then $ Tor_1 ^ZZ (QQ over ZZ, B) iso Tor_1^ZZ ( colim_I ZZ over p ZZ, B) iso colim_I Tor_1^ZZ (ZZ over p ZZ, B) iso colim_I {b in B : p b = 0}, $ which is the torsion subgroup of $B$. ] // #example[ // A torsion-free abelian group is a flat $ZZ$-module. // ] // #proof[ // // @weibel[Poposition 3.1.4, p.67]. // Let $A$ be a torsion-free abelian group, then $A = union A_i$ where $A_i$ are finitely generated subgroups of $A$, so by // ] #proposition[ A $ZZ$-module is flat if and only if it is torsion-free. ] #proof[ Let $A$ be a torsion-free abelian group, then $A = union A_i$ where $A_i$ are finitely generated subgroups of $A$. Then each $A_i$ is free and hence flat. By @filtered-flat-flat, $A$ is also flat. On the other hand, if $A$ is flat, then $Tor_1^ZZ (-, A) = 0$; in particular, $Tor_1 ^ZZ (QQ over ZZ, A) = 0$, so the torsion subgroup of $A$ is trivial. ] // #proof[ // $R[s^(-1)]$ is the universal ring where $s$ is invertible, which is $colim(R->^s R ->^s R-> ...)$. // ] #endlec(12) == Flat Resolutions #definition[ A resolution $F_cx -> A$ is a *flat resolution* of $A$ if $F_n$ is a flat module for each $n$. ] #lemma("Flat Resolution Lemma")[ Let $F_cx -> A$ be a flat resolution of $A$. 
Then $ Tor_ast^R (A, B) iso H_ast (F_cx tpr B ) $ Similarly, if $F'_cx -> B$ is a flat resolution, then $ Tor_ast^R (A, B) iso H_ast (A tpr F'_cx) $ ] <flat-resolution> #proof[ By induction. When $n = 0$, we need to show that $A tpr B iso H_0 (F_cx tpr B)$. We see that $ H_0 (F_cx tpr B) = (F_0 tpr B) / IM(F_1 tpr B) = Coker(F_1 tpr B -> F_0 tpr B) $ On the other hand, $A = Coker(F_1 -> F_0)$, and since $(- tpr B)$ is right exact and preserves cokernels, we get the result. When $n >= 1$, we have the short exact sequence $ ses(K, F_0, A, f:ker(epsilon), g:epsilon) $ If we write $E_cx = (... -> F_2 -> F_1 -> 0)$, then $E_cx ->^(d_1) K$ is a flat resolution of $K$, where $d_1 : F_1 -> F_0$ has $IM(d_1) = Ker(epsilon) = K$ by the exactness at $F_0$ of the resolution $F_cx -> A$. Now we can write the #lest induced by $Tor$: $ ... -> Tor_n (F_0 , B) -> Tor_n (A , B) -> Tor_(n-1) (K , B) -> Tor_(n-1) (F_0 , B) -> ... $ But $Tor_n (F_0, B) = 0$ for $n >= 1$ by @flat-tor. Thus, $ Tor_n (A, B) iso cases(Ker (K tp B -> F_0 tp B) quad &n = 1, Tor_(n-1) (K, B) quad &n >= 2) $ For $n=1$, notice that $K tpr B iso H_0 (E_cx tpr B) = (F_1 tpr B) over IM(F_2 tpr B)$ by applying the case $n = 0$ to $E_cx -> K$, and hence we have $ Tor_1 (A, B) iso Ker (K tp B -> F_0 tp B) = Ker ((F_1 tp B) / IM(F_2 tp B) -> F_0 tp B) = H_1 (F_cx tp B) $ For $n >= 2$, $ Tor_n (A, B) iso Tor_(n-1) (K, B) iso H_(n-1) (E_cx tp B) = H_n (F_cx tp B) $ by applying the induction hypothesis to $E_cx -> K$. ] #remark[ Why have we not defined $Tor$ with flat resolutions in the first place? The problem is that we have to show it is well defined regardless of the choice of flat resolutions. This may not be as convenient as using projective resolutions. Nevertheless, now we are free to use flat resolutions, a larger class than projective resolutions, for calculations. ] A generalisation to flat modules is the following. 
#definition[ If $F$ is a right exact functor, an object $Q$ is *$F$-acyclic* if $L_i F(Q) = 0$ for all $i != 0$. ] #proposition[ If $Q_cx -> A$ is a resolution where $Q_n$ is $F$-acyclic for all $n$, then $L_i F(A) = H_i (F (Q_cx))$. ] #proof[ The proof is exactly the same as above. ] // #remark[#TODO Quillen model structures, fibrations and cofibrations.] == Universal Coefficient Theorem Let $P_cx$ be a chain complex of right $R$-modules and let $M$ be a left $R$-module. In this section, we investigate how the homology of $P_cx$ is related to the homology of $P_cx tp M$, under certain flatness assumptions about $P_cx$. We first need an auxiliary result. #lemma[ Let $ses(X, Y, Z)$ be a #sest in $ModR$ such that $Y$ and $Z$ are flat, then $X$ is also flat. ] <two-flats> #proof[ Let $B in RMod$. Write the #lest induced by $Tor$: $ ... -> Tor_(n+1)^R (Z, B) -> Tor_n^R (X, B) -> Tor_n^R (Y, B) -> ... $ Since $Y$ and $Z$ are flat, $Tor_n^R (Y, B) = Tor_n^R (Z, B) = 0$ when $n >= 1$ by @flat-tor. So $Tor_n^R (X, B) = 0$ for any $n >= 1$ and $X$ is also flat. ] The main result is the following. #theorem("Künneth Formula")[ Let $P_cx$ be a chain complex of flat right $R$-modules such that each submodule $d(P_n)$ of $P_(n-1)$ is also flat. Then for every $n$ and every left $R$-module $M$, there is a #sest $ ses(H_n (P) tpr M, H_n (P_cx tpr M), Tor_1^R (H_(n-1)(P), M)). $ ] <kunneth> #proof[@rotman[Theorem 7.55], @weibel[Theorem 3.6.1]. Let $Z_n = Ker (P_n ->^d P_(n-1))$, then we have a #sest #math.equation(block: true, numbering: "(1)", supplement: "Short Exact Sequence", $ ses(Z_n, P_n, d(P_n)). $) <zpdp> // The associated #lest in $Tor$ shows that $Z_n$ is also flat, as $ Tor_k (P_n, M) = Tor_k (d(P_n), M) = 0 $ for all $k >= 1$. (#TODO In general, if two terms in a #sest are flat then the third one is also flat. ) Since $P_n$ and $d(P_n)$ are both flat, @two-flats shows that $Z_n$ is also flat. 
Also, since $d(P_n)$ is flat, $Tor_1^R (d(P_n), M) = 0 $ by @flat-tor, so a long exact sequence induced by @zpdp gives $ ses(Z_n tp M, P_n tp M, d(P_n) tp M), $ from which we yield a #sest of chain complexes $ ses(Z_cx tp M , P_cx tp M, d(P)_cx tp M). $ We now look at the #lest induced by homology: #math.equation(block: true, numbering: "(1)", supplement: "Long Exact Sequence", $ ... -> H_(n+1) (d(P)_cx tp M ) ->^diff H_n (Z_cx tp M) \ -> H_n (P_cx tp M) -> H_n (d(P)_cx tp M) ->^diff H_(n-1) (Z_cx tp M) -> ... $) <eq1> Note that the differentials on the chain complexes $Z_cx$ and $d(P)_cx$ are all zero, and hence the differentials on $Z_cx tp M$ and $d(P)_cx tp M$ are also all zero, which gives $ H_n (d(P)_cx tp M) = d(P_n) tp M quad "and" quad H_n (Z_cx tp M ) = Z_n tp M. $ // #TODO A theorem saying differential zero gives homology equals chain complex. Hence @eq1 now becomes $ ... -> d(P_(n+1)) tp M ->^(diff_(n+1)) Z_n tp M -> H_n (P_cx tp M) -> d(P_(n)) tp M ->^(diff_n) Z_(n-1) tp M -> ... $ Apply @connecting and we can find the connecting homomorphism $diff_(n+1) = i_n tp id_M$, where $i_n : d(P_(n+1)) -> Z_n$ is the inclusion map. By @five-to-ses, we have a #sest #math.equation(block: true, numbering: "(1)", supplement: "Short Exact Sequence", $ ses(Coker(i_n tp id_M), H_n (P_cx tp M), Ker(i_(n-1) tp id_M)). $) <ses-1> Now it remains to calculate the two flanking terms of @ses-1. Note that we have a flat resolution for $H_n (P_cx)$: $ ses(d(P_(n+1)), Z_n, H_n (P_cx), f: i_n). $ By @flat-resolution, $Tor_ast^R (H_n (P_cx), M)$ is the homology of the following chain complex: $ D_cx = (... -> 0 -> d(P_(n+1)) tp M ->^(i_n tp id_M) Z_n tp M -> 0) $ Hence $ H_n (P_cx) tp M = Tor_0 (H_n (P_cx), M) = H_0 (D) = Coker (i_n tp id_M) $ and $ Tor_1^R (H_n (P_cx), M) = H_1 (D) = Ker (i_n tp id_M). $ Combining the last two equations with @ses-1 gives the result. ] #remark[ These are the games you play with the machine. 
] In particular, we can consider the case of $Ab$, where we have the following: #theorem("Universal Coefficient Theorem")[ Let $P_cx$ be a chain complex of free abelian groups, then for every $n$ and every $M$, the Künneth #sest splits, so $ H_n (P_cx tp M) = (H_n (P) tp M) ds Tor_1^ZZ (H_(n-1) (P), M) $ The split is not canonical. ] #proof[ @rotman[Corollary 7.56, p. 450], @weibel[Theorem 3.6.2, p. 87]. Since each $P_n$ is a free abelian group, its subgroup $d (P_n)$ is also free abelian by @sub-pid. Therefore $d (P_n)$ is projective, so the short exact sequence $ 0 arrow.r Z_n arrow.r^(i_n) P_n arrow.r d( P_n) arrow.r 0 $ splits by @projective-split. Applying $(- tp M)$ (which commutes with direct sums), we see that the #sest $ 0 arrow.r Z_n tp M arrow.r^(i_n tp id_M) P_n tp M arrow.r d( P_n) tp M arrow.r 0 $ also splits, so $Z_n tp M$ is a direct summand of $P_n tp M$. Now notice we have the inclusions $ IM (d_(n+1) tp id_M) subset.eq IM (i_n tp id_M) subset.eq Ker (d_n tp id_M) subset.eq P_n tp M $ By @split-sub, $Z_n tp M$ is a direct summand of $Ker (d_n tp id_M)$. Modding out by $IM (d_(n+1) tp id_M)$ on both of them, again by @split-sub, we see that #math.equation(block: true, numbering: "(1)", supplement: "Equation", $ H_n (P) tp M= (Z_n tp M) / IM(d_(n+1) tp id_M) $) <hnptpm> is a direct summand of $ H_n (P_cx tp M) = Ker(d_n tp id_M) / IM(d_(n+1) tp id_M). $ To see why @hnptpm holds, let $j_n : d(P_(n+1)) -> Z_n$ be the inclusion map, and by the proof of @kunneth, $ H_n (P) tp M &= Coker (j_n tp id_M : d(P_(n+1)) tp M -> Z_n tp M) \ &= (Z_n tp M) / (IM (j_n tp id_M)) = (Z_n tp M) / IM(d_(n+1) tp id_M). $ Since each $P_n$ and $d(P_n)$ are projective and thus flat, by @kunneth we have a #sest $ 0 arrow.r H_n lr((P)) times.circle M arrow.r H_n lr((P_cx times.circle M)) arrow.r "Tor"_1^ZZ lr((H_(n minus 1) lr((P)) comma M)) arrow.r 0, $ which is therefore split. // #TODO // Note: $P_n -> d(P_n)$ splits. 
] We also demonstrate a more general result, known as the Full Künneth Formula. Now it can be helpful to recall @tp-dc, the tensor product double complex, as well as @homology-double. #theorem("Full Künneth Formula")[ Let $P_cx$ and $Q_cx$ be right and left $R$-modules, respectively. If $P_n$ and $d(P_n)$ are flat for each $n$, then there is a #sest $ ses(plus.circle.big _(i+j=n) H_i (P) tp H_j (Q), H_n (P tpr Q), plus.circle.big_(i+j = n-1) Tor_1^R (H_i (P), H_j (Q))) $ ] #proof[ @weibel[Theorem 3.6.3]. Modify the proof of @kunneth. ] It is worth mentioning that there is also a version of the Universal Coefficient Theorem for cohomology that involves $Ext$ and $hom$: #theorem("Universal Coefficient Theorem for Cohomology")[ Let $P_cx$ be a chain complex of projective $R$-modules such that each $d(P_n)$ is also projective. Then for any $n$ and every $R$-module $M$, there is a non-canonically split exact sequence $ 0 -> Ext_R^1 (H_(n-1) (P), M) -> H^n (homr (P, M)) -> homr (H_n (P), M) -> 0. $ ] #proof[ @weibel[Theorem 3.6.5]. ] These results yield important consequences in algebraic topology, as briefly discussed below. More can be seen in the Algebraic Topology course. #example[ @weibel[Application 3.6.4]. Let $X$ be a topological space. Let $C_cx (X)$ be the singular chain complex of $X$, then each $C_n (X)$ is a free abelian group. Let $M$ be an abelian group, then we define the $n$-th (singular) homology of $X$ with _coefficients_ in $M$ as $ H_n (X ; M) := H_n (C_cx (X) tp M). $ In particular, $H_n (X) = H_n (X ; ZZ)$. Then the Universal Coefficient Theorem gives $ H_n (X; M) iso (H_n (X) tp M )ds Tor_1^ZZ (H_(n-1) (X), M). $ For cohomology, we could make a similar definition, i.e. $ H^n (X; M) := H^n (hom_ZZ (C_cx (X), M)) $ with $H^n (X) = H^n (X; ZZ)$, and the Universal Coefficient Theorem would indicate that $ H^n (X; M) iso hom_ZZ (H_n (X), M) ds Ext_ZZ^1 (H_(n-1) (X), M). 
$
(If we further assume that $M = ZZ$ and that each $H_n (X)$ is finitely generated such that $H_n (X) iso F_n ds T_n$ with free part $F_n$ and torsion part $T_n$, then we can show that $H^n (X) iso F_n ds T_(n-1).$)

Let $Y$ be another topological space with singular chain complex $C_cx (Y)$. By the Eilenberg–Zilber theorem, $H_n (X times Y) iso H_n (C_cx (X) tp C_cx (Y))$. Then the Full Künneth Formula indicates that
$ H_n (X times Y) iso (plus.circle.big_(p=0)^n H_p (X) tp H_(n-p) (Y)) ds (plus.circle.big_(p=1)^n Tor_1^ZZ (H_(p-1) (X), H_(n-p) (Y))). $
]

#endlec(12)
https://github.com/Enter-tainer/typstyle
https://raw.githubusercontent.com/Enter-tainer/typstyle/master/tests/assets/unit/off/code-block.typ
typst
Apache License 2.0
#let alert(body, fill: red) = { set text(white) set align(center) // @typstyle off rect( fill: fill, inset: 8pt, radius: 4pt, [*Warning:\ #body*], ) } // @typstyle off #alert[ Danger is imminent!]
https://github.com/N3M0-dev/Notes
https://raw.githubusercontent.com/N3M0-dev/Notes/main/CS/Algorithm/Intro_to_Algor/Ch_1/ch1.typ
typst
= Ch1

The Art of Computer Programming

- Assignments differ every semester; the tests are difficult, with an estimated average score of about 75; coding assignments are preferred.
- Industry examples: Tencent (Internet), Huawei (EE / communications), BYD (automobile).
- Evaluating algorithms: the standard in this course is time complexity; in real life it depends on the actual setting (specialised subfields; learned in practice).
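A rough illustrative sketch of the course-standard view (the functions and numbers below are mine, not from the lecture): counting steps makes the complexity difference concrete.

```python
# Illustrative only: compare step counts of O(n) linear search
# and O(log n) binary search on the same sorted input.

def linear_search(xs, target):
    """Return the number of comparisons until `target` is found."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            return steps
    return steps

def binary_search(xs, target):
    """Return the number of probes until `target` is found (xs must be sorted)."""
    steps, lo, hi = 0, 0, len(xs) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return steps
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1024))
print(linear_search(xs, 1000))  # 1001 steps: grows linearly with n
print(binary_search(xs, 1000))  # 10 steps: grows logarithmically with n
```

In practice (the "depends on reality" point above), constant factors, caches, and input distribution also matter, which is why real-world evaluation depends on the setting.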
https://github.com/typst-jp/typst-jp.github.io
https://raw.githubusercontent.com/typst-jp/typst-jp.github.io/main/docs/changelog/0.4.0.md
markdown
Apache License 2.0
--- title: 0.4.0 description: Changes in Typst 0.4.0 --- # Version 0.4.0 (May 20, 2023) ## Footnotes - Implemented support for footnotes - The [`footnote`] function inserts a footnote - The [`footnote.entry`]($footnote.entry) function can be used to customize the footnote listing - The `{"chicago-notes"}` [citation style]($cite.style) is now available ## Documentation - Added a [Guide for LaTeX users]($guides/guide-for-latex-users) - Now shows default values for optional arguments - Added richer outlines in "On this Page" - Added initial support for search keywords: "Table of Contents" will now find the [outline] function. Suggestions for more keywords are welcome! - Fixed issue with search result ranking - Fixed many more small issues ## Math - **Breaking change**: Alignment points (`&`) in equations now alternate between left and right alignment - Added support for writing roots with Unicode: For example, `[$root(x+y)$]` can now also be written as `[$√(x+y)$]` - Fixed uneven vertical [`attachment`]($math.attach) alignment - Fixed spacing on decorated elements (e.g., spacing around a [canceled]($math.cancel) operator) - Fixed styling for stretchable symbols - Added `tack.r.double`, `tack.l.double`, `dotless.i` and `dotless.j` [symbols]($category/symbols/sym) - Fixed show rules on symbols (e.g. `{show sym.tack: set text(blue)}`) - Fixed missing rename from `ast.op` to `ast` that should have been in the previous release ## Scripting - Added function scopes: A function can now hold related definitions in its own scope, similar to a module. The new [`assert.eq`]($assert.eq) function, for instance, is part of the [`assert`] function's scope. Note that function scopes are currently only available for built-in functions. 
- Added [`assert.eq`]($assert.eq) and [`assert.ne`]($assert.ne) functions for simpler equality and inequality assertions with more helpful error messages - Exposed [list]($list.item), [enum]($enum.item), and [term list]($terms.item) items in their respective functions' scope - The `at` methods on [strings]($str.at), [arrays]($array.at), [dictionaries]($dictionary.at), and [content]($content.at) now support specifying a default value - Added support for passing a function to [`replace`]($str.replace) that is called with each match. - Fixed [replacement]($str.replace) strings: They are now inserted completely verbatim instead of supporting the previous (unintended) magic dollar syntax for capture groups - Fixed bug with trailing placeholders in destructuring patterns - Fixed bug with underscore in parameter destructuring - Fixed crash with nested patterns and when hovering over an invalid pattern - Better error messages when casting to an [integer]($int) or [float]($float) fails ## Text and Layout - Implemented sophisticated CJK punctuation adjustment - Disabled [overhang]($text.overhang) for CJK punctuation - Added basic translations for Traditional Chinese - Fixed [alignment]($raw.align) of text inside raw blocks (centering a raw block, e.g. 
through a figure, will now keep the text itself left-aligned)
- Added support for passing an array instead of a function to configure table cell [alignment]($table.align) and [fill]($table.fill) per column
- Fixed automatic figure [`kind`]($figure.kind) detection
- Made alignment of [enum numbers]($enum.number-align) configurable, defaulting to `end`
- Figures can now be made breakable with a show-set rule for blocks in figure
- Initial fix for smart quotes in RTL languages

## Export
- Fixed ligatures in PDF export: They are now copyable and searchable
- Exported PDFs now embed ICC profiles for images that have them
- Fixed export of strokes with zero thickness

## Web app
- Projects can now contain folders
- Added upload by drag-and-drop into the file panel
- Files from the file panel can now be dragged into the editor to insert them into a Typst file
- You can now copy-paste images and other files from your computer directly into the editor
- Added a button to resend confirmation email
- Added an option to invert preview colors in dark mode
- Added tips to the loading screen and the Help menu. Feel free to propose more!
- Added syntax highlighting for YAML files
- Allowed middle mouse button click on many buttons to navigate into a new tab
- Allowed more project names
- Fixed overridden Vim mode keybindings
- Fixed many bugs regarding file upload and more

## Miscellaneous Improvements
- Improved performance of counters, state, and queries
- Improved incremental parsing for more efficient recompilations
- Added support for `.yaml` extension in addition to `.yml` for bibliographies
- The CLI now emits escape codes only if the output is a TTY
- For users of the `typst` crate: The `Document` is now `Sync` again and the `World` doesn't have to be `'static` anymore

## Contributors
<contributors from="v0.3.0" to="v0.4.0" />
https://github.com/imlasky/TOMLresume
https://raw.githubusercontent.com/imlasky/TOMLresume/main/backend/template.typ
typst
// #let resume(body) = { // set list(indent: 1em) // show list: set text(size: 0.92em) // show link: underline // show link: set underline(offset: 3pt) // set page( // paper: "us-letter", // margin: (x: 0.5in, y: 0.5in) // ) // set text( // size: 11pt, // font: "New Computer Modern", // ) // body // } // #let name_header(name) = { // set text(size: 2.25em) // [*#name*] // } // #let header( // name: "<NAME>", // phone: "123-456-7890", // email: "<EMAIL>", // linkedin: "linkedin.com/in/jdoe", // github: "github.com/jdoe", // website: "jdoe.com", // ) = { // align(center, // block[ // #name_header(name) \ // #box[#if phone != "" [#phone |]] // #box[#if email != "" [#link("mailto:" + email)[#email] |]] // #box[#if linkedin != "" [#link("https://" + linkedin)[#linkedin] |]] // #box[#if website != "" [#link("https://" + website)[#website] |]] // #box[#if github != "" [#link("https://" + github)[#github]]] // ] // ) // v(5pt) // } // #let resume_heading(txt) = { // show heading: set text(size: 0.92em, weight: "regular") // block[ // = #smallcaps(txt) // #v(-4pt) // #line(length: 100%, stroke: 1pt + black) // ] // } // #let edu_item( // name: "Default", // degree: "Default", // date: "Default" // ) = { // set block(above: 0.7em, below: 1em) // pad(left: 1em, right: 0.5em, grid( // columns: (3fr, 1fr), // align(left)[ // *#name* \ // _#degree _ // ], // align(right)[ // #date // ] // )) // } // #let exp_item( // name: "Default", // role: "Default", // date: "Default", // location: "Default", // highlights: [] // ) = { // set block(above: 0.7em, below: 1em) // pad(left: 1em, right: 0.5em, box[ // #grid( // columns: (3fr, 1fr), // align(left)[ // *#role* \ // _#name _ // ], // align(right)[ // #date \ // _#location _ // ] // ) // #for hl in highlights [ // - #hl.highlight // ] // ]) // } // #let project_item( // name: "", // skills: "", // date: "", // highlights: [] // ) = { // set block(above: 0.7em, below: 1em) // pad(left: 1em, right: 0.5em, box[ // *#name* #h(1fr) #date 
// #for hl in highlights [ // - #hl.highlight // ] // ]) // } // #let skill_item( // category: "Default", // skills: "Default", // ) = { // set block(above: 0.7em) // set text(size: 0.91em) // pad(left: 1em, right: 0.5em, block[*#category*: #skills]) // } #let resume(body) = { set list(indent: 1em) show list: set text(size: 0.92em) show link: underline show link: set underline(offset: 3pt) set page( paper: "us-letter", margin: (x: 0.5in, y: 0.5in) ) set text( size: 11pt, font: "New Computer Modern", ) body } #let name_header(name) = { set text(size: 2.25em) [*#name*] } // #let header( // name: "<NAME>", // label: "", // image: "", // phone: "", // email: "", // url: "", // summary: "", // profiles: [], // location: (), // ) = { // align(left + horizon)[ // #name_header(name) // #if label != "" [#label \] // #box[#if phone != "" [#phone |]] // #box[#if email != "" [#link("mailto:" + email)[#email] |]] // #box[#if url != "" [#link(url)[#url] |]] // #for profile in profiles [ // #box[#link(profile.url)[#profile.network] |] // ] // #if location != () [ // #location.address, #location.city, #location.region #location.postalCode, #location.countryCode // ] // ] // if summary != "" [ // #v(5pt) // #summary // ] // v(5pt) // } #let resume_heading(txt) = { show heading: set text(size: 0.92em, weight: "regular") block[ = #smallcaps(txt) #v(-4pt) #line(length: 100%, stroke: 1pt + black) ] } #let edu_item( institution: "", url: "", area: "", studyType: "", date: "", score: "", courses: [] ) = { set block(above: 0.7em, below: 1em) pad(left: 1em, right: 0.5em, grid( columns: (3fr, 1fr), align(left)[ *#institution* \ _#studyType in #area _ _#if url != "" [#link(url)[#url]] _ \ #if courses.len() > 0 [ *Courses:* #courses.join(", ") ] ], align(right)[ #date \ #if score != "" [Score: #score] ] )) } #let exp_item( name: "", position: "", date: "", url: "", summary: "", highlights: [] ) = { set block(above: 0.7em, below: 1em) pad(left: 1em, right: 0.5em, box[ #grid( columns: (3fr, 
1fr),
      align(left)[
        *#position* \
        _#name _ _#if url != "" [#link(url)[#url]] _
      ],
      align(right)[
        #date
      ]
    )
    #if summary != "" [#summary]
    #for hl in highlights [
      - #hl
    ]
  ])
}

#let project_item(
  name: "",
  description: "",
  highlights: (),
  startDate: "",
  endDate: "",
  url: ""
) = {
  set block(above: 0.7em, below: 1em)
  pad(left: 1em, right: 0.5em, box[
    #grid(
      columns: (3fr, 1fr),
      align(left)[
        *#name* \
        _#if url != "" [#link(url)[#url]]_
      ],
      align(right)[
        // Treat an empty or missing end date as an ongoing project.
        #startDate --- #if endDate != none and endDate != "" [#endDate] else [Present]
      ]
    )
    #if description != "" [#description]
    #for hl in highlights [
      - #hl
    ]
  ])
}

#let skill_item(
  name: "",
  level: "",
  keywords: ()
) = {
  set block(above: 0.7em)
  set text(size: 0.91em)
  pad(left: 1em, right: 0.5em, block[
    *#name* (#level): #keywords.join(", ")
  ])
}

#let volunteer_item(
  organization: "",
  position: "",
  url: "",
  date: "",
  summary: "",
  highlights: ()
) = {
  set block(above: 0.7em, below: 1em)
  pad(left: 1em, right: 0.5em, box[
    #grid(
      columns: (3fr, 1fr),
      align(left)[
        *#position* \
        _#organization _ _#if url != "" [#link(url)[#url]] _
      ],
      align(right)[
        #date
      ]
    )
    #if summary != "" [#summary]
    #for hl in highlights [
      - #hl
    ]
  ])
}

#let award_item(
  title: "",
  date: "",
  awarder: "",
  summary: ""
) = {
  set block(above: 0.7em, below: 1em)
  pad(left: 1em, right: 0.5em, box[
    #grid(
      columns: (3fr, 0.5fr),
      align(left)[
        *#title* by #awarder \
        _#if summary != "" [#summary] _
      ],
      align(right)[
        #date
      ]
    )
  ])
}

#let certificate_item(
  name: "",
  date: "",
  issuer: "",
  url: ""
) = {
  set block(above: 0.7em, below: 1em)
  pad(left: 1em, right: 0.5em, box[
    *#name* - #issuer #h(1fr) #date
    #if url != "" [#link(url)[#url]]
  ])
}

#let publication_item(
  name: "",
  publisher: "",
  releaseDate: "",
  url: "",
  summary: ""
) = {
  set block(above: 0.7em, below: 1em)
  pad(left: 1em, right: 0.5em, box[
    *#name* - #publisher #h(1fr) #releaseDate
    #if url != "" [#link(url)[#url]]
    #if summary != "" [#summary]
  ])
}

#let language_item(
  language: "",
  fluency: ""
) = {
  set block(above: 0.7em)
  set text(size: 0.91em)
pad(left: 1em, right: 0.5em, block[
    *#language*: #fluency
  ])
}

#let interest_item(
  name: "",
  keywords: ()
) = {
  set block(above: 0.7em)
  set text(size: 0.91em)
  pad(left: 1em, right: 0.5em, block[
    *#name*: #keywords.join(", ")
  ])
}

#let reference_item(
  name: "",
  reference: ""
) = {
  set block(above: 0.7em, below: 1em)
  pad(left: 1em, right: 0.5em, box[
    *#name*
    "#reference"
  ])
}
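// A minimal usage sketch (illustrative; it assumes this file is saved as
// "template.typ" next to the document -- adjust the import path as needed):
//
// #import "template.typ": resume, resume_heading, exp_item
// #show: resume
// #resume_heading("Experience")
// #exp_item(
//   name: "Example Corp",
//   position: "Software Engineer",
//   date: "2023 -- Present",
//   highlights: ("Shipped feature X", "Maintained service Y"),
// )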
https://github.com/alberto-lazari/computer-science
https://raw.githubusercontent.com/alberto-lazari/computer-science/main/lcd/project-presentation/common.typ
typst
#import "@preview/polylux:0.3.1": * #import "theme.typ": * #let focus-slide = content => focus-slide(background: unipd-red)[ #set align(center) #content ] #let inline-rule(name, ..cases) = { let space = .5em $name ::= #h(space) #cases.pos().join[#h(space) | #h(space)]$ } #let rule(name, first, ..cases, inline: false) = { if(inline) { inline-rule(name, first, ..cases) } else { set par(leading: .4em) let cases = cases.pos().map(case => $& #h(.45em) | & case \ $ ) $name & ::= & first \ cases.join()$ } } #let case(rule, first, ..cases) = { set par(leading: .4em) let cases = cases.pos().map(case => $& #h(.25em) | & case \ $ ) $"case" rule & = & first \ cases.join()$ } #let to = { h(.5em) sym.arrow h(.5em) } #let encode(..elements) = { let exp = $space$ if elements.pos().len() > 0 { exp = elements.pos().join[,] } $[| space.punct exp space.punct |]$ } #let sub(val, var) = { let slash = rotate(30deg, "|") $attach(slash, tl: val, br: var)$ } #let expand(s, domain, ..elements) = { let exp = $space$ if elements.pos().len() > 0 { exp = elements.pos().join[,] } $attach(angle.l space.punct exp space.punct angle.r_#s, tl: domain)$ } #let expandk(domain, ..elements) = expand($k$, domain, ..elements) #let expanda(domain, ..elements) = expand($a$, domain, ..elements) #let expandf(domain, ..elements) = expand($f$, domain, ..elements) #let expandL(domain, ..elements) = expand($L$, domain, ..elements) #let nil = $Ø$ #let tick = "'" #let evale = $"eval"_e$ #let evalb = $"eval"_b$ #let Devale = $attach("eval"_e, tl: D)$ #let encodepi(..elements) = $#encode(..elements)_pi$
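// Illustrative usage of the grammar helpers above (the nonterminals are
// hypothetical): #rule($E$, $E + T$, $T$) typesets the production
// E ::= E + T | T across multiple lines, while
// #inline-rule($E$, $E + T$, $T$) keeps it on a single line.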
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/cetz-plot/0.1.0/src/lib.typ
typst
Apache License 2.0
#let version = version(0,1,0) #import "/src/axes.typ" #import "/src/plot.typ" #import "/src/chart.typ"
https://github.com/tingerrr/typst-test
https://raw.githubusercontent.com/tingerrr/typst-test/main/docs/book/src/reference/tests/lib.md
markdown
MIT License
# Test Library
The test library is an augmented standard library, it contains all definitions
in the standard library plus some additional modules and functions which help
test packages and debug regressions.

It defines the following modules:
- `test`: a module with various testing helpers such as `catch` and additional
  asserts.

The following items are re-exported in the global scope as well:
- `assert-panic`: originally `test.assert-panic`
- `catch`: originally `test.catch`

## `test`
Contains the main testing utilities.

### `assert-panic`
Ensures that a function panics. Fails with an error if the function does not
panic. Does not produce any output in the document.

#### Example
```typst
#assert-panic(() => {}, message: "I panic!")
#assert-panic(() => panic(), message: "I don't!")
```

#### Parameters
```txt
assert-panic(
  function,
  message: str | auto,
)
```

> ##### `function: function`
> - `required`
> - `positional`
>
> The function to test.

> ##### `message: str | auto`
>
> The error message when the assertion fails.

### `catch`
Unwraps and returns the panics generated by a function, if there were any. Does
not produce any output in the document.

#### Example
```typst
#assert.eq(catch(() => {}), none)
#assert.eq(
  catch(panics).first(),
  "panicked with: Invalid arg, expected `int`, got `str`",
)
```

#### Parameters
```txt
catch(
  function,
)
```

> ##### `function: function`
> - `required`
> - `positional`
>
> The function to test.
https://github.com/SabrinaJewson/cmarker.typ
https://raw.githubusercontent.com/SabrinaJewson/cmarker.typ/main/lib.typ
typst
MIT License
#let _p = plugin("./plugin.wasm")

#let render(
  markdown,
  smart-punctuation: true,
  blockquote: none,
  math: none,
  h1-level: 1,
  raw-typst: true,
  scope: (:),
  show-source: false,
) = {
  // Accept both raw strings and content with a `text` field (e.g. raw blocks).
  if type(markdown) == content and markdown.has("text") {
    markdown = markdown.text
  }
  // Build the option bitflags passed to the WASM plugin.
  if smart-punctuation { options += 0b00000001 } // smart punctuation
  if blockquote != none {
    options += 0b00000010 // blockquote support
    scope += (blockquote: blockquote)
  }
  if raw-typst { options += 0b00000100 } // allow embedded raw Typst
  if math != none {
    options += 0b00001000 // math support
    scope += (inlinemath: math.with(block: false), displaymath: math.with(block: true))
  }
  let rendered = str(_p.render(bytes(markdown), bytes((options, h1-level))))
  if show-source {
    // Show the generated Typst source instead of evaluating it.
    raw(rendered, block: true, lang: "typ")
  } else {
    eval(
      rendered,
      mode: "markup",
      scope: (
        image: (..args) => image(..args),
        ..scope,
      ),
    )
  }
}
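// Illustrative usage (assuming this file is imported by path; the names below
// match the function defined above):
//
// #import "lib.typ": render
// #render("**Hello** from _Markdown_", h1-level: 2)
//
// Pass `show-source: true` to inspect the generated Typst markup instead of
// rendering it.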
https://github.com/Skimmeroni/Appunti
https://raw.githubusercontent.com/Skimmeroni/Appunti/main/Metodi%20Algebrici/Strutture/Strutture.typ
typst
Creative Commons Zero v1.0 Universal
#import "../Metodi_defs.typ": *

Let $A$ be a non-empty set. The function $*$ is called a *binary operation* on $A$ if its domain is the Cartesian product of $A$ with itself and its codomain coincides with $A$:

$ * : A times A |-> A $

#example[
- The function $f: ZZ times NN |-> ZZ, f(a, b) = a^(b)$ is not a binary operation;
- The function $f: ZZ times ZZ |-> ZZ, f(a, b) = root(b, a)$ is a binary operation.
]

Binary operations can be generalised to $n$-dimensional Cartesian products. The function $*$ is called an *$n$-ary operation* on $A$ if its domain is $A^(n)$ and its codomain is $A$ itself:

$ * : A^(n) |-> A $

Let $*$ be an operation on a set $A$, and let $a, b, c in A$ be three of its elements. We say that $*$ satisfies the *associative property* if applying $*$ to $a * b$ and $c$ is the same as applying it to $a$ and $b * c$. In other terms:

$ (a * b) * c = a * (b * c) $

where the parentheses determine the _order of precedence_ in which $*$ is applied.

Let $*$ be an operation on a set $A$, and let $a, b in A$ be two of its elements. We say that $*$ satisfies the *commutative property* if applying $*$ to $a$ and $b$ is the same as applying it to $b$ and $a$. In other terms:

$ a * b = b * a $

#example[
- The operation $f: ZZ times ZZ |-> ZZ, f(a, b) = a + b$ satisfies both the associative and the commutative property;
- The operation $f: ZZ times ZZ |-> ZZ, f(a, b) = a - b$ satisfies neither the associative nor the commutative property;
- The operation $f: ZZ times ZZ |-> ZZ, f(a, b) = 2^(a + b)$ satisfies the commutative property but not the associative one. Indeed, although $f(a, b) = f(b, a)$ because $2^(a + b) = 2^(b + a)$, it is not true that $f(a, f(b, c)) = f(f(a, b), c)$, because $2^(a + 2^(b + c)) != 2^(2^(a + b) + c)$ in general.
- The operation $f: ZZ times ZZ |-> ZZ, f(a, b) = b$ satisfies the associative property but not the commutative one. Indeed, although $f(a, f(b, c)) = f(f(a, b), c) = c$, we have $f(a, b) = b$ and $f(b, a) = a$, hence $f(a, b) != f(b, a)$.
]

If $*$ is an operation on the set $A$, an element $1_(*) in A$ is called a *neutral element* for $*$ if, for any $a in A$, we have $1_(*) * a = a * 1_(*) = a$.

If $*$ is an operation on the set $A$ admitting a neutral element $1_(*)$, an element $a^(-1)_(*) in A$ such that $a * a^(-1)_(*) = a^(-1)_(*) * a = 1_(*)$, when it exists, is called an *inverse* of $a$. When the operation is clear from context, the subscript $*$ is omitted.

#example[
The operation $f: ZZ times ZZ |-> ZZ, f(a, b) = a + b$ has $0$ as its neutral element. Indeed, for any $a in ZZ$, we have $a + 0 = 0 + a = a$. The inverse of $a$ with respect to this operation is $-a$, since $a + (-a) = (-a) + a = 0$.
]

If $*$ is an operation on the set $A$, an element $0_(*) in A$ is called an *absorbing element* for $*$ if, for any $a in A$, we have $0_(*) * a = a * 0_(*) = 0_(*)$.

#example[
The operation $f: QQ times QQ |-> QQ, f(a, b) = a dot b$ has $0$ as its absorbing element. Indeed, for any $a in QQ$, we have $a dot 0 = 0 dot a = 0$.
]

Let $A$ be a set and let $f_(1), ..., f_(n)$ be a collection of $n$ operations defined on $A$. The $(n+1)$-tuple $(A, f_(1), ..., f_(n))$, formed by the set $A$ together with the $n$ operations defined on it, is called an *algebraic structure*. The set $A$ is called the *support set*, or simply the *support*, of the algebraic structure. Since in most cases the operations $f_(1), ..., f_(n)$ of an algebraic structure are binary operations, unless otherwise specified "operation" implicitly means "binary operation".

An algebraic structure $(S, *)$, formed by a set $S$ and an operation $*$ defined on it, is called a *semigroup* if $*$ satisfies the associative property.

A semigroup $(M, *)$ is called a *monoid* if the operation $*$ defined on the set $M$ admits a neutral element.

#example[
- The pair $(NN, +)$, where $+$ denotes the usual addition, is a semigroup, because $+$ satisfies the associative property. It is also a monoid, because $+$ admits a neutral element (the number $0$);
- The pair $(QQ, dot)$, where $dot$ denotes the usual product of rational numbers, is a semigroup, because $dot$ satisfies the associative property. It is also a monoid, because $dot$ admits a neutral element (the number $1$).
] <Some-groups>

A semigroup $(S, *)$ where $*$ satisfies the commutative property is called an *abelian semigroup*. Likewise, a monoid $(M, *)$ where $*$ satisfies the commutative property is called an *abelian monoid*.
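As a further standard example (added here for illustration):

#example[
The set of finite strings over an alphabet $A$, with concatenation as the operation and the empty string as neutral element, is a monoid. It is not abelian whenever $A$ contains at least two distinct symbols $a, b$, since the strings $a b$ and $b a$ differ.
]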
https://github.com/teshu0/uec24-exp-a-typst
https://raw.githubusercontent.com/teshu0/uec24-exp-a-typst/main/README.md
markdown
The Unlicense
# UEC 2024 Typst Template for "Basic Science Experiments A" Reports

`uec_exp_a.typ` is the template file, so write your report using `main.typ` as a reference. It is packed with explanatory comments.

```typ
#import "./uec_exp_a.typ": uec_exp_a, title, description
#import "./logos.typ": Typst, LaTeX

// Replace occurrences of "LaTeX" with the logo
#show "LaTeX": name => LaTeX

// Apply the template
#show: uec_exp_a

// Title
#title("A Template for Reports - Typst Edition")

// Name and other details
#description(
  "
The University of Electro-Communications, Cluster IV
Let's-Use-Typst Program
2400123 Taro Dentsu

Created: day 0, month 0, 2024
Updated: day 32, month 1, 2024
"
)

// The body starts here

= Purpose // this is a heading

For many years, writing reports with LaTeX @latex has been the norm. However, with the arrival of the new typesetting system #Typst, there is now an alternative to LaTeX. Here we show an example of a report written with #Typst.
```
https://github.com/TypstApp-team/typst
https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/text/lang-with-region.typ
typst
Apache License 2.0
// Test if text with region works --- // without any region #set text(font: "Noto Serif CJK TC", lang: "zh") #outline() --- // with unknown region configured #set text(font: "Noto Serif CJK TC", lang: "zh", region: "XX") #outline() --- // with region configured #set text(font: "Noto Serif CJK TC", lang: "zh", region: "TW") #outline()
https://github.com/WinstonMDP/math
https://raw.githubusercontent.com/WinstonMDP/math/main/knowledge/algebras.typ
typst
#import "../cfg.typ": cfg #show: cfg = Algebras $(V, +, *, dot)$ is an algebra over a field $F :=$ + $(V, +, dot)$ is a vector space over $F$. + $(V, +, *)$ is a ring. + $(lambda a) b = a(lambda b) = lambda(a b)$. $a b = sum_(i, j = 1)^n a_i b_j e_i e_j$.
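As a quick illustration (my own example, not from the note): $CC$ is a 2-dimensional algebra over $RR$ with basis $e_1 = 1$, $e_2 = i$, and the expansion $a b = sum_(i, j) a_i b_j e_i e_j$ can be checked numerically:

```python
# C as a 2-dimensional algebra over R with basis e1 = 1, e2 = i.
e = [complex(1, 0), complex(0, 1)]
a = [2.0, 3.0]    # coordinates of 2 + 3i in the basis
b = [-1.0, 4.0]   # coordinates of -1 + 4i

# Direct product versus the bilinear expansion sum a_i b_j e_i e_j.
lhs = (a[0] * e[0] + a[1] * e[1]) * (b[0] * e[0] + b[1] * e[1])
rhs = sum(a[i] * b[j] * e[i] * e[j] for i in range(2) for j in range(2))
print(lhs == rhs, lhs)  # True (-14+5j)
```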
https://github.com/Student-Smart-Printing-Service-HCMUT/ssps-docs
https://raw.githubusercontent.com/Student-Smart-Printing-Service-HCMUT/ssps-docs/main/contents/categories/task2/2.3.typ
typst
Apache License 2.0
#pagebreak()
== Class diagram
_Draw a class diagram of the printer order system._ \
*Frontend class diagram*: Every class in the UML class diagram uses a common class, DataService. The DataService class is responsible for calling the APIs provided by the server.
#figure(caption: "Frontend Class diagram",
image("../../images/ClassDia_FE.png")
)
*Description*
#block(inset:(left:1cm))[
- The *Student* object has the attributes id, name, email, hash_password, default_coin_per_sem, remain_coins, and an InteractingView with the Main Page View; *Student* has the method changPassword().
- The *Order* object has the attributes id, filename, location, num_pages, money, status.
- The *File* object has the attributes file and config: Config (filetype, pagetype), and the method changeConfig().
- The *Location* object has the attributes id, campus, building, room.
- The *Printer* object has the attributes id, name, model, location(Location).
- The *PrintingRequest* object has the attributes id, List<File>, filename, num_of_copies, page_size, page_print(List<int>) and the methods AddOrder(file, num_of_copies), DeleteOrder(file, num_of_copies), CalculateMoney().
- *Student* can interact with *MainPageView*. *MainPageView* has the attributes student: Student, totalMoney, orderProcess: List<Order>. Its methods are ShowOrderListProcess(order:List<Order>), showButtonOrderPrint(), Uploadfile(), and Refresh().
- After *Student* uploads a file, *MainPageView* redirects to *OrderPrintingView*, which shows information about the uploaded file and options for configuring the print job.
*OrderPrintingView* has the attributes selectedFile: List<File>, selectPrinter: List<Printer>, selectLocation: List<Location>, paymentController: PaymentController, waitingPrintList<1..unordered> and the methods ShowFileUpload(selectedFile: List<File>), ShowChooseNumOfCopies(selectedFile: List<File>), ShowPreviewDocument(selectFile: File), ShowChooseOptinal(selectedFile: List<File>), ShowTotalMoney(), ChoosePrinterandLocation(), ShowConfirmOrder(), ShowPatmentMethod().
]

*Class diagram for the business logic layer*

Mirroring the role of the DataService class on the presentation side, the Router class on the business-logic side receives API calls from the presentation layer and routes them to the corresponding processing logic in the system. Besides Router, all classes use the Database class to operate directly on data in the cloud database.

#figure(caption: "Backend Class diagram",
image("../../images/ClassDia_BE.png")
)

*Connection between frontend and backend*

The DataService class is the frontend's representative: it sends requests to and receives returned data from the backend. Likewise, the Router class is the backend's representative: it receives requests from the frontend, routes them to the business logic, and returns the data corresponding to each request.

#figure(caption: "FE_BE Connect diagram",
image("../../images/ClassDia_Connect.png", width: 40%, height: 30%)
)
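The DataService/Router pairing described above is essentially a thin client-side API gateway talking to a server-side dispatcher. A hypothetical sketch of that flow (class and route names are invented for illustration and are not taken from the actual SSPS codebase):

```python
# Hypothetical sketch of the DataService -> Router flow described above.
class Router:
    """Backend entry point: receives requests and dispatches them to handlers."""
    def __init__(self):
        self.handlers = {}

    def register(self, route, handler):
        self.handlers[route] = handler

    def handle(self, route, payload):
        if route not in self.handlers:
            return {"status": 404, "body": None}
        return {"status": 200, "body": self.handlers[route](payload)}


class DataService:
    """Frontend entry point: forwards calls to the backend Router."""
    def __init__(self, router):
        self.router = router  # stands in for an HTTP client

    def call(self, route, payload):
        return self.router.handle(route, payload)


router = Router()
router.register("/orders", lambda p: {"order_id": 1, "file": p["file"]})
svc = DataService(router)
print(svc.call("/orders", {"file": "report.pdf"}))
# {'status': 200, 'body': {'order_id': 1, 'file': 'report.pdf'}}
```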
https://github.com/Julian-Wassmann/chapter-utils
https://raw.githubusercontent.com/Julian-Wassmann/chapter-utils/main/0.1.0/test-cases.typ
typst
// TODO: Page number in header lacks default page number behind, after page counter gets updated: Only for "empty" pages. Possible workaround: hide() #import "./lib.typ": chapter-header, chapter-numbering, page-number, page-chapter #set page( paper: "a6" ) #align(center)[#text(size: 20pt)[Title Page]] #set heading(numbering: "1.") #counter(page).update(0) #set page( numbering: "I", header: chapter-header(), //footer: [], ) #outline() #counter(page).update(0) #set page(numbering: "1") = Chapter A #lorem(200) #pagebreak() = Chapter B #lorem(30) = Chapter C #lorem(30) #set page( header: [#page-chapter()#h(1fr)#page-number()], ) = Chapter D #lorem(30) #set page( header: [ #page-chapter(format-chapter: (number, body) => [#number - #body]) #h(1fr) #page-number(format-page: page => [p. #page]) ], ) = Chapter E #lorem(30) #pagebreak() #counter(page).update(0) #set page( header: chapter-header(format-page: page => [A #page]) ) #set heading(numbering: none) = Appendix #lorem(30)
https://github.com/leo1oel/CSAPP
https://raw.githubusercontent.com/leo1oel/CSAPP/main/Homework/quick-maths.typ
typst
// Element function for sequences #let sequence = $a b$.body.func() // Convert content to an array of its children #let to-children(content) = { if type(content) == str { content.clusters().map(char => [#char]) } else if content.has("children") { content.children } else if content.has("text") { to-children(content.text) } else if content.func() == math.equation { to-children(content.body) } } // Convert shorthands in the given sequence to their respective replacements. #let convert-sequence(seq, shorthands) = { let is-sequence = type(seq) == content and seq.func() == sequence if not is-sequence { return seq } let children = seq.children.map(c => convert-sequence(c, shorthands)) if children.len() == 0 { return seq } for shorthand in shorthands { let components = to-children(shorthand.first()) let start = 0 while start < children.len() { let pos = children.slice(start).position(c => c == components.first()) if pos == none { break } else { pos = start + pos // Position of first matching character start = pos + 1 // Start index for finding next match // Check whether all components of the shorthand match let matches = range(components.len()).all(i => { let child = children.at(pos+i, default: none) child == components.at(i) }) if matches { // Remove shorthand and insert replacement for i in range(components.len()) { children.remove(pos) } children.insert(pos, shorthand.last()) } } } } return children.join() } // A template that converts the given shorthands to their respective replacement. // // Parameters: // - shorthands: One or more tuples of the form `(shorthand, replacement)`. // - body: The body to apply the template on. // // Returns: The body with evaluated shorthands. #let shorthands(..shorthands, body) = { let shorthands = shorthands.pos() show math.equation: eq => { show sequence: seq => { let new = convert-sequence(seq, shorthands) if new != seq { new } else { seq } } eq } body }
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/transform-layout_02.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page

// Test that scaling impacts layout.
#set page(width: 200pt)
#set text(size: 32pt)

#let scaled(body) = box(scale(
  x: 20%,
  y: 40%,
  body
))

#set scale(reflow: false)
Hello #scaled[World]!

#set scale(reflow: true)
Hello #scaled[World]!
https://github.com/RandomcodeDev/FalseKing-Design
https://raw.githubusercontent.com/RandomcodeDev/FalseKing-Design/main/game/ui.typ
typst
= UI == HUD There will be a HUD visible at all times (except in screenshot mode). It will have indicators for health, what element the player is using, how charged the special attack is, what equipment the player is using, and the current objective. == Inventory There will be an inventory screen which shows the player's items, what they have equipped, and the player themself, similar to Minecraft and Breath of the Wild's inventory screens. == Objectives There will be a screen that shows the player their objectives. == Lore There will be a lore screen that has information about the world that the player has encountered.
https://github.com/Mc-Zen/quill
https://raw.githubusercontent.com/Mc-Zen/quill/main/tests/gates/meter/test.typ
typst
MIT License
#set page(width: auto, height: auto, margin: 0pt) #import "/src/quill.typ": * #quantum-circuit( 1, meter(n: 3), [\ ], 2, [\ ], 2 ) #pagebreak() #quantum-circuit( scale: 120%, 1, meter(label: $y$), 1, meter(n: 1, label: $lr(|plus.minus〉)$), meter(label: $phi/2$, n: 1, wire-count: 1), meter(target: 1, label: $X$), meter(n: 2, label: $X$), [\ ], 1, meter(radius: 3pt, fill: gray), 3,ctrl(0), 1 ) #pagebreak() #quantum-circuit( gate-padding: 2pt, wire: .2pt + red, color: red, 1, meter(label: $y$), 1, meter(n: 1, label: $lr(|plus.minus〉)$), meter(label: $phi/2$, n: 1, wire-count: 1), meter(target: 1, label: "a"), meter(n: 2, label: "a"), [\ ], 1, gate($H$), 3, ctrl(0), 2 )
https://github.com/marcantoinem/CV
https://raw.githubusercontent.com/marcantoinem/CV/main/en/awards.typ
typst
#import "../src/style.typ": experience #let csgame2024 = { experience( "CS Games 2024", "Montréal, Canada", "Mar 2024", "1st place overall.", [ - Won the 1st place overall at the CS Games 2024, a competition that gathers more than 300 students from 20 universities across Canada to compete in various computer science challenges. ], ) }
https://github.com/ThatOneCalculator/riesketcher
https://raw.githubusercontent.com/ThatOneCalculator/riesketcher/main/README.md
markdown
MIT License
# riesketcher A package to draw Riemann sums (and their plots) of a function with CeTZ. Usage example and docs: [manual.pdf](https://github.com/ThatOneCalculator/riesketcher/blob/main/manual.pdf) ```typst #import "@preview/riesketcher:0.2.1": riesketcher ``` ![Demo](https://github.com/ThatOneCalculator/riesketcher/assets/44733677/4f87b750-e4be-4698-b650-74f4fe56789d)
https://github.com/ClassicConor/UoKCSYear1ExamNotes2024
https://raw.githubusercontent.com/ClassicConor/UoKCSYear1ExamNotes2024/master/HCI%20(With%20Exam%20Answers)/HCI%202022%20Paper/HCI%202022%20Answers.typ
typst
= HCI Exam 2022 - Answers to the questions
<hci-exam-2022---answers-to-the-questions>
== Question 1
<question-1>
Human Centred Design (HCD) Process consists of five stages: Empathise, Define, Ideate, Prototype, and Test.

=== a
<a>
For each of the following activities, match the activity to the most appropriate HCD stage as above.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 1)
+ Activity 1: <NAME> noticed his three-year old son Thor – too big for a high chair but too small for an adult chair – struggled for a place at the family dining table.
]
]
Empathise stage - We already know who we are designing for (a child), and we have #strong[passed the empathise stage];. We are now attempting to understand their problem, defining their needs based on current observations.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 2)
+ Activity 2: A user is shown an interface prototype by a facilitator and asked to do Heuristic Evaluation for the interface design
]
]
Test stage - The prototype has been created, and now we are handing over our prototype to a variety of users in order to review it. We are #strong[in the final] stage before the release of the product.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 3)
+ Activity 3: Team brainstorms 5 different approaches to develop an interface for an online printing service
]
]
Ideate stage - We have defined the problem, and are now attempting to brainstorm a variety of different ideas #strong[before] we begin to produce prototypes.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 4)
+ Activity 4: Use Matrix Analysis method to help you to manage and synthesise data into information gathered from notes, video, surveys.
]
]
Define stage - Information is collected from a variety of sources of all different demographics, perhaps getting their opinion on a variety of topics.
Once the data has been thoroughly analysed, then we can look into the data that’s been collected, and attempt to #strong[define the issues] that people may be having through a quantitative analysis.

#quote(block: true)[
#block[
#set enum(numbering: "(a)", start: 22)
+ Activity 5: Wireframing tool is used to produce version 2 of an interface for an online food delivery mobile application.
]
]
Prototype stage - We have brainstormed our ideas, and have now created a prototype of the interface. The fact that it’s version 2 suggests that a version 1 interface was previously created, and we’ve #strong[returned to the prototyping stage] in order to create the next version.

=== b
<b>
In HCD, there are many design concepts. For each of the following scenarios, give the appropriate name of the design concept. Briefly (in no more than one sentence each), explain how the design concept can be used in HCI.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 1)
+ Scenario 1: Although not required, mouse has a clicking sound when pressed.
]
]
Affordance - the clicking sound when pressing the mouse provides auditory feedback which suggests the action of clicking, helping to reinforce the user’s actions.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 2)
+ Scenario 2: Facebook shows random friends’ updates.
]
]
Visibility - The user is easily able to see the updates on their feed, ensuring that this information is accessible and easily interacted with by the user.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 3)
+ Scenario 3: When designing a mobile application, wireframing tool is used to limit the size of the screen to simulate different sizes of mobile phones available on the market.
]
]
Constraints - By limiting the number of interactions on the screen, this helps to guide the user towards desired outcomes and behaviours.
A smaller screen may present the user with fewer options, which may be required to allow the application’s layout and features to remain consistent across different devices.

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 4)
+ Scenario 4: Unhealthy food and drinks are placed to the bottom of a shelf in a supermarket to encourage people to consume more healthy food.
]
]
Mapping - Mapping refers to the relationship between control and effect. In this example, by placing the unhealthy food and drinks towards the bottom of the shelf, this may map to the desired behaviour of encouraging people to consume healthier food, as the healthier food will be more easily accessible and visible at eye level.

#quote(block: true)[
#block[
#set enum(numbering: "(a)", start: 22)
+ Scenario 5: Speak to a group of elderly people to decide the font size for a mobile application designed for these people.
]
]
Feedback - By speaking to your potential user base, you can gather feedback and information that will be useful when creating prototypes, as well as the final product.

#pagebreak()
== Question 2
<question-2>
Below shows a form provided on a website for the user to choose the delivery country, the currency the prices on the website are displayed in, and the language the website content is displayed in. This question focuses on the language selection functionality. To select their preferred language, the user first selects a country and then the language options are displayed, see figure 1 below. The language choices available to the user when they change their country to United Kingdom, United States, or Canada are provided below.

- If the user selects "United Kingdom", they have only one language option – English.
- If the user selects "United States", they have two language options – English and Español.
- If the user selects "Canada", they have two language options – English and Français.
#figure(image("HCI2022Question2.png"), caption: [ HCI2022 Question 2 Answer ] ) With reference to the scenario above, answer the following questions: #quote(block: true)[ #block[ #set enum(numbering: "(a)", start: 1) + Identify two potential usability problems with the language selection functionality provided by the interface. ] ] - The first usability problem may be the limited number of languages available. If the website is only available in both English and Spanish, then it may limit people who do not speak these languages. - The second usability problem that may arise out of the design is that the text above the language selection input box, which reads "Select Language", is written in English. Somebody who is not a native English speaker may not understand what this text means, and may therefore struggle to use the application. This could be solved by placing a picture of a globe next to the text; the globe being a universally known symbol to indicate different countries and languages. - A third usability issue is the fact that the text box for language selection and the background of the website are the same colour, which may result in some users with restricted vision struggling to see that the text box is in fact able to be selected. #quote(block: true)[ #block[ #set enum(numbering: "(a)", start: 2) + KLM-GOMS stands for Keystroke-Level Model for Goals, Operators, Methods, and Selection rules. It allows designers to predict how long it will take a skilled user to execute a specific set of actions using a particular interface design. 
The standard set of operators is:
]
]

- Keying (K): for typing a key or clicking a control – 0.2s
- Pointing (P): pointing with a mouse to a target on a display – 1.1s
- Homing (H): home hands on keyboard or other input device – 0.4s
- Mentally Preparing (M): think before doing an action – 1.35s
- Responding (R): time a user must wait for a response

Once you determine the operators, you need to apply the following rules for mental operations before calculating the time.

- Rule 0: Add Ms in front of all Ks and all Ps that point to commands.
- Rule 1: Change PMK to PK to remove fully anticipated Ms.
- Rule 2: Eliminate any Ms in the middle of strings that are part of a cognitive unit (a contiguous sequence of typed characters).
- Rule 3: Delete Ms before consecutive terminators.
- Rule 4: Delete Ms that are terminators of commands.
- Rule 5: Delete overlapped Ms, i.e., overlaps an R (a delay).

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 1)
+ Using KLM-GOMS, list out the necessary actions and identify the operators that will be used to calculate the time required to take a skilled user to change their preferred country using the interface shown in figure 1

Assume that the currently displayed country is the "United Kingdom", and the user needs to select "United States" from the country/region dropdown and choose "Español" from the language dropdown. Assume no scrolling is involved in either dropdown control. Clicking the "Go to site" button at the end will apply the change.
] ] My steps: Original steps: + R - Wait for website to load + M - Think about changing country and find it on the screen + H - Place hands on the mouse and keyboard + P - Move the cursor to the selection + K - Click on the selection + R - Wait for the selection to change + M - Think about the country, and find it within the selection + P - Move the cursor to the specific country (United States) + K - Click on the country + R - Wait for the drop down menu to disappear + M - Find the language selection on the screen + P - Move the mouse cursor to the language selection + K - Click on the box + M - Find the specific language (Spanish) + P - Move the mouse cursor to the specific language + K - Click on the language + R - Wait for the menu to disappear. + M - Find the "Go to site" button + P - Move the mouse to the button + K - Click on the button Steps before rules: RMHPKRMPKRMPKMPKRMPK Steps with rules: + R - Wait for website to load + M - Think about changing country and find it on the screen + H - Place hands on the mouse and keyboard + P - Move the cursor to the selection + K - Click on the selection + #strong[M - Added M] + R - Wait for the selection to change + M - Think about the country, and find it within the selection + P - Move the cursor to the specific country + K - Click on the country + #strong[M - Added M] + R - Wait for the drop down menu to disappear + M - Find the language selection on the screen + P - Move the mouse cursor to the language selection + K - Click on the box + #strong[M - Added M] + M - Find the specific language (Spanish) + P - Move the mouse cursor to the specific language + K - Click on the language + #strong[M - Added M] + R - Wait for the menu to disappear. 
+ M - Find the "Go to site" button
+ P - Move the mouse to the button
+ K - Click on the button

Final result: RMHPKMRMPKMRMPKMMPKMRMPK

#quote(block: true)[
#block[
#set enum(numbering: "(i)", start: 2)
+ Using your solution from (i), apply the rules for mental operations where applicable and use your final formula to make a prediction of the time it will take a skilled user to complete the task.
]
]
Time of the action:

- M (9 \* 1.35) = 12.15
- K (5 \* 0.2) = 1
- P (5 \* 1.1) = 5.5
- H (1 \* 0.4) = 0.4

Total time: 19.05 seconds

#pagebreak()
== Question 3
<question-3>
For this question, you will need to design a simple interface for a microwave without any digital display. The microwave has only FIVE functions:

- START/STOP
- Microwave power mode (LOW/MED/HIGH)
- Microwave heating time for up to 30 minutes
- Open door
- Overheat warning

#quote(block: true)[
#block[
#set enum(numbering: "(a)", start: 1)
+ Sketch an interface design (with appropriate annotations on its usage) that includes an appropriately designed menu interface for the five functions above.
]
]
\[insert image here\]

#quote(block: true)[
#block[
#set enum(numbering: "(a)", start: 2)
+ For each feature that you have designed, list a design principle, and explain (in no more than one sentence each) how the feature implements the principle.
]
]
\[insert features here\]
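The operator tally in question 2(b)(ii) can be cross-checked with a short script (my own sketch; R is treated as zero-cost here, since response times are system-dependent):

```python
# KLM-GOMS operator times from the question, in seconds.
TIMES = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35, "R": 0.0}

def klm_time(ops):
    """Total predicted time for a KLM operator string."""
    return sum(TIMES[op] for op in ops)

print(round(klm_time("RMHPKMRMPKMRMPKMMPKMRMPK"), 2))  # 19.05
```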
https://github.com/rabotaem-incorporated/probability-theory-notes
https://raw.githubusercontent.com/rabotaem-incorporated/probability-theory-notes/master/sections/03-characteristic-functions/04-big-deviations.typ
typst
#import "../../utils/core.typ": *

== Probabilities of large deviations

#th(name: "A fact from the law of large numbers in Chebyshev form")[
	Let $xi_1$, $xi_2$, ... be independent, identically distributed random variables with $E xi_1 = a$ and finite $D xi_1 = sigma^2$. Then
	$ P(abs(S_n/n - a) >= eps) <= sigma^2/(n eps^2) $
	In particular, for $r > a$,
	$ P(S_n / n >= r) <= sigma^2/(n (r - a)^2). $
]

#proof[
	We obtained this while proving the law of large numbers in Chebyshev form (and in fact it follows immediately from Chebyshev's inequality).
]

#def[
	A random variable satisfies the _Cramér condition_ if $E e^(lambda xi) < +oo$ for some $lambda > 0$.
]

#notice(name: "Chernoff bound")[
	Let $xi_1$, $xi_2$, ... be independent and identically distributed, $a = E xi$, $r > a$. We are interested in
	$ P(S_n/n >= r) = P(lambda S_n >= n lambda r) = P(e^(lambda S_n) >= e^(lambda r n)) <=^"Markov" (E e^(lambda S_n))/(e^(lambda r n)). $
	We know that
	$ E e^(lambda S_n) = E e^(lambda sum xi_k) = E product e^(lambda xi_k) = product E e^(lambda xi_k) = (E e^(lambda xi_1))^n. $
	Therefore
	$ P(S_n/n >= r) <= (E e^(lambda S_n))/(e^(lambda r n)) = ((E e^(lambda xi_1))/(e^(lambda r)))^n. $
	Define $psi(lambda) := ln E e^(lambda xi_1)$. Rewriting the above,
	$ P(S_n/n >= r) <= e^(n (psi(lambda) - lambda r)). $
	Define $I(r) := sup_lambda {lambda r - psi(lambda)}$, the _rate function_. Then
	$ P(S_n/n >= r) <= e^(-n I(r)). $
]

#example[
	Let $xi_k sim Nn(0, 1)$. Consider $lambda r - ln E e^(lambda xi)$:
	$ E e^(lambda xi) = 1/sqrt(2pi) integral_RR e^(lambda t) e^(-t^2/2) dif t = e^(lambda^2/2)/sqrt(2pi) integral_RR e^(-(t - lambda)^2/2) dif t = e^(lambda^2/2) ==> psi(lambda) = lambda^2/2. $
	Hence $lambda r - lambda^2/2$ attains its maximum at $lambda = r$, so $I(r) = r^2/2$. Therefore $P(S_n/n >= r) <= e^(-n r^2/2)$.
]

#example[
	Let $xi_k sim op("Exp")(1)$.
	Then, for $lambda < 1$,
	$ psi(lambda) = ln E e^(lambda xi) = ln (integral_0^(+oo) e^(lambda t) e^(-t) dif t) = ln lr((e^((lambda - 1) t))/(lambda - 1)|)_(t = 0)^(t = +oo) = ln 1/(1 - lambda) = -ln (1 - lambda), $
	so $lambda r + ln (1 - lambda)$ attains its maximum at $lambda = 1 - 1/r$, and the maximum equals $r - 1 - ln r$. Hence
	$ P(S_n/n >= r) <= e^(-n (r - 1 - ln r)). $
]

#exercise[
	If $xi_k sim Bern(p)$ are independent, then $psi(lambda) = ln E e^(lambda xi) = ln(1 - p + p e^lambda) <= p (e^lambda - 1)$. Work out what follows.
]

#exercise[
	Let $xi_k sim Bern(p_k)$ be independent (each with its own $p_k$), and let $mu = p_1 + p_2 + ... + p_n$. Prove that for any $delta > 0$
	$ P(S_n >= (1 + delta) mu) < e^((-delta^2 mu) / (delta + 2)). $
]
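A numerical sanity check of the exponential bound (my own sketch, not part of the notes; the parameters $n = 50$, $r = 1.5$ are chosen arbitrarily): simulate $S_n/n$ for $xi_k sim op("Exp")(1)$ and compare the empirical tail with $e^(-n (r - 1 - ln r))$.

```python
import math
import random

def rate(r):
    # I(r) = r - 1 - ln r for Exp(1), valid for r > 1.
    return r - 1 - math.log(r)

def empirical_tail(n, r, trials=20000, seed=0):
    """Monte Carlo estimate of P(S_n / n >= r) for i.i.d. Exp(1) variables."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.expovariate(1.0) for _ in range(n))
        if s / n >= r:
            hits += 1
    return hits / trials

n, r = 50, 1.5
bound = math.exp(-n * rate(r))
print(empirical_tail(n, r) <= bound)  # True
```

The bound is loose (it ignores the polynomial prefactor), so the empirical frequency sits well below it.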
https://github.com/Ngan-Ngoc-Dang-Nguyen/thesis
https://raw.githubusercontent.com/Ngan-Ngoc-Dang-Nguyen/thesis/main/docs/main.typ
typst
#import "../typst-orange.typ": project, part, chapter, my-bibliography, appendices, make-index, index, theorem, mathcal // // begin format of Theorem // #import "@preview/ctheorems:1.1.2": * // #show: thmrules.with(qed-symbol: $square$) // #set page(width: 16cm, height: auto, margin: 1.5cm) // #set heading(numbering: "1.1.") // #let theorem = thmbox("theorem", "Theorem", fill: rgb("#eeffee")) // #let corollary = thmplain( // "corollary", // "Corollary", // base: "theorem", // titlefmt: strong // ) // #let definition = thmbox("definition", "Definition", inset: (x: 1.2em, top: 1em)) // #let example = thmplain("example", "Example").with(numbering: none) // #let proof = thmproof("proof", "Proof") // // end of format of Theorem // #set text(font: "Times New Roman", size: 14pt) //#set text(font: "TeX Gyre Pagella") // #set text(font: "Lato") //#show math.equation: set text(font: "Fira Math") // #show math.equation: set text(font: "Lato Math") // #show raw: set text(font: "Fira Code") #show: project.with( title: "Bài toán nâng cấp bán kính ổn định cho vị trí 1-median trên cây", //subtitle: "A Practical Guide", date: "2024", author: "<NAME>", mainColor: rgb("#F36619"), // mainColor: blue, lang: "en", cover: image("../images/background.svg"), imageIndex: image("../images/orange1.jpg"), listOfFigureTitle: "List of Figures", listOfTableTitle: "List of Tables", supplementChapter: "Chương", supplementPart: "Phần", part_style: 0, copyright: [] ) #set text(font: "Times New Roman", size: 13pt) #set math.equation(numbering: "(1)") // #show math.equation: set text(blue) #show math.equation: it => { if it.body.fields().at("size", default: none) != "display" { return math.display(it) } it } // #show cite: it => [#text(fill: blue)[#cite(it)]] #show cite: set text(red) // #chapter("Sectioning Examples", image: image("../images/orange2.jpg"), l: "chap1") // #index("Sectioning") // #include "../tools/multi-section-ref.typ" // #import "../tools/macros.typ": eqref // #import 
"../typst-orange.typ": theorem, proof, lemma, proposition, corollary, example

// --------- CHAPTER 1
#part([Chương 1])
#pagebreak()
#include "../docs/xxx-how.typ"
#include "../docs/chapter1.typ"
// #include "../docs/chapter2.typ"
#include "../docs/1.2 median.typ"
#include "../docs/knapsack.typ"
#include "../docs/inverse 1-median.typ"
#include "../docs/reverse.typ"
#include "../docs/upgrading.typ"

// --------- CHAPTER 2
#part([Chương 2])
#pagebreak()
#include "../docs/stability radius.typ"

// --------- CHAPTER 3
#part([Chương 3])
#pagebreak()
#include "../docs/upgrading radius.typ"

#part([Tài Liệu Tham Khảo])
#pagebreak()
#bibliography("../docs/ref.bib",
  title: "Tài Liệu Tham Khảo",
  style: "annual-reviews-author-date",
)