licenses (sequence, length 1 to 3) | version (string, 677 classes) | tree_hash (string, length 40) | path (string, 1 class) | type (string, 2 classes) | size (string, length 2 to 8) | text (string, length 25 to 67.1M) | package_name (string, length 2 to 41) | repo (string, length 33 to 86) |
---|---|---|---|---|---|---|---|---|
[
"BSD-3-Clause"
] | 0.5.1 | e89f3d1830790439dcb1404bb99d017d8a986259 | docs | 2930 | # Distributed Example
An example of how to enable distributed computation within ProgressiveHedging.jl. This example is also available as the script distributed_example.jl in the examples directory.
Our first step is to set up the worker processes. To do this we will use Julia's native [Distributed](https://docs.julialang.org/en/v1/stdlib/Distributed/) package.
```julia
using Distributed
addprocs(2) # add 2 workers
```
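As a quick aside (not in the original script), you can confirm the workers are available:
```julia
nworkers() # should return 2
```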
Now we need to set up the environment as before, but the worker processes need to load the packages too. However, the worker processes start in the default Julia environment when launched. If the necessary packages are not installed in this environment, or you want to use a different environment, you'll need to explicitly activate it for the workers. Here we will activate the examples environment. This is done by first loading `Pkg` on every worker using the `@everywhere` macro and then using `Pkg.activate` to actually activate the environment.
```julia
@everywhere using Pkg
@everywhere Pkg.activate(joinpath(@__DIR__, "..", "examples"))
```
Finally, we again use the Distributed package's `@everywhere` macro to load the needed packages.
```julia
@everywhere using ProgressiveHedging
@everywhere import JuMP
@everywhere import Ipopt
```
Just as in every other case we define the function that is used to create a subproblem. In this case, however, we need to make sure that the worker processes are aware of the function. We once more do this with the `@everywhere` macro.
```julia
@everywhere function two_stage_model(scenario_id::ScenarioID)
model = JuMP.Model(()->Ipopt.Optimizer())
JuMP.set_optimizer_attribute(model, "print_level", 0)
JuMP.set_optimizer_attribute(model, "tol", 1e-12)
JuMP.set_optimizer_attribute(model, "acceptable_tol", 1e-12)
scen = value(scenario_id)
ref = JuMP.@variable(model, x >= 0.0)
stage1 = [ref]
ref = JuMP.@variable(model, y >= 0.0)
stage2 = [ref]
b_s = scen == 0 ? 11.0 : 4.0
c_s = scen == 0 ? 0.5 : 10.0
JuMP.@constraint(model, x + y == b_s)
JuMP.@objective(model, Min, 1.0*x + c_s*y)
return JuMPSubproblem(model,
scenario_id,
Dict(stid(1) => stage1,
stid(2) => stage2)
)
end
```
Now we proceed just as before: create the scenario tree and call the solve function. This is all done locally; distributing the computation is handled by PH.
```julia
scen_tree = two_stage_tree(2)
(niter, abs_res, rel_res, obj, soln_df, phd) = solve(scen_tree,
two_stage_model,
ScalarPenaltyParameter(1.0)
)
@show niter
@show abs_res
@show rel_res
@show obj
@show soln_df
```
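When the computation is finished, the worker processes can optionally be released. A small sketch using the standard Distributed API (not part of the original example):
```julia
rmprocs(workers()) # release the two workers added earlier
```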
| ProgressiveHedging | https://github.com/NREL/ProgressiveHedging.jl.git |
|
[
"BSD-3-Clause"
] | 0.5.1 | e89f3d1830790439dcb1404bb99d017d8a986259 | docs | 3861 | # Multi-Stage Example
An example of solving a multi-stage problem with ProgressiveHedging.jl. This example is also available as the script multistage_example.jl in the examples directory.
Using PH to solve a multi-stage problem is very similar to solving a two-stage problem. The only significant difference is in the creation of the scenario tree.
We will create a three-stage problem. First we import the proper packages and define the subproblem creation function.
```@example multistage
using ProgressiveHedging
import JuMP
import Ipopt
function create_model(scenario_id::ScenarioID)
model = JuMP.Model(()->Ipopt.Optimizer())
JuMP.set_optimizer_attribute(model, "print_level", 0)
JuMP.set_optimizer_attribute(model, "tol", 1e-12)
c = [1.0, 10.0, 0.01]
d = 7.0
a = 16.0
α = 1.0
β = 1.0
γ = 1.0
δ = 1.0
ϵ = 1.0
s1 = 8.0
s2 = 4.0
s11 = 9.0
s12 = 16.0
s21 = 5.0
s22 = 18.0
stage1 = JuMP.@variable(model, x[1:3] >= 0.0)
JuMP.@constraint(model, x[3] <= 1.0)
obj = zero(JuMP.GenericQuadExpr{Float64,JuMP.VariableRef})
JuMP.add_to_expression!(obj, sum(c.*x))
# Second stage
stage2 = Vector{JuMP.VariableRef}()
if scenario_id < scid(2)
vref = JuMP.@variable(model, y >= 0.0)
JuMP.@constraint(model, α*sum(x) + β*y >= s1)
JuMP.add_to_expression!(obj, d*y)
else
vref = JuMP.@variable(model, y >= 0.0)
JuMP.@constraint(model, α*sum(x) + β*y >= s2)
JuMP.add_to_expression!(obj, d*y)
end
push!(stage2, vref)
# Third stage
stage3 = Vector{JuMP.VariableRef}()
if scenario_id == scid(0)
vref = JuMP.@variable(model, z[1:2])
JuMP.@constraint(model, ϵ*sum(x) + γ*y + δ*sum(z) == s11)
JuMP.add_to_expression!(obj, a*sum(z[i]^2 for i in 1:2))
elseif scenario_id == scid(1)
vref = JuMP.@variable(model, z[1:2])
JuMP.@constraint(model, ϵ*sum(x) + γ*y + δ*sum(z) == s12)
JuMP.add_to_expression!(obj, a*sum(z[i]^2 for i in 1:2))
elseif scenario_id == scid(2)
vref = JuMP.@variable(model, z[1:2])
JuMP.@constraint(model, ϵ*sum(x) + γ*y + δ*sum(z) == s21)
JuMP.add_to_expression!(obj, a*sum(z[i]^2 for i in 1:2))
else
vref = JuMP.@variable(model, z[1:2])
JuMP.@constraint(model, ϵ*sum(x) + γ*y + δ*sum(z) == s22)
JuMP.add_to_expression!(obj, a*sum(z[i]^2 for i in 1:2))
end
append!(stage3, vref)
JuMP.@objective(model, Min, obj)
vdict = Dict{StageID, Vector{JuMP.VariableRef}}([stid(1) => stage1,
stid(2) => stage2,
stid(3) => stage3,
])
return JuMPSubproblem(model, scenario_id, vdict)
end
nothing # hide
```
Now we create the scenario tree with two branch nodes in the second stage.
```@example multistage; continued=true
scen_tree = ScenarioTree()
branch_node_1 = add_node(scen_tree, root(scen_tree))
branch_node_2 = add_node(scen_tree, root(scen_tree))
```
To each branch node we add two leaf nodes. The leaves receive scenario IDs in the order they are added (here 0 through 3), which matches the branching on `scenario_id` in `create_model`.
```@example multistage; continued=false
add_leaf(scen_tree, branch_node_1, 0.375)
add_leaf(scen_tree, branch_node_1, 0.125)
add_leaf(scen_tree, branch_node_2, 0.375)
add_leaf(scen_tree, branch_node_2, 0.125)
nothing # hide
```
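As a quick sanity check (an aside, not part of the original example), the leaf probabilities across the four scenarios must sum to one:
```julia
@assert 0.375 + 0.125 + 0.375 + 0.125 == 1.0
```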
Finally, we solve just as we normally would.
```@example multistage
(niter, abs_res, rel_res, obj, soln_df, phd) = solve(scen_tree,
create_model,
ScalarPenaltyParameter(25.0);
atol=1e-8, rtol=1e-12, max_iter=500)
@show niter
@show abs_res
@show rel_res
@show obj
@show soln_df
nothing # hide
```
| ProgressiveHedging | https://github.com/NREL/ProgressiveHedging.jl.git |
|
[
"MIT"
] | 0.0.6 | 9263f87d4cbf886e6193af356d03e1a26eb814d7 | code | 14114 | module SalesForceBulkApi
# pre requirements
## Other packages
using HTTP, LightXML, CSV, ProgressMeter, DataFrames, Distributed
import JSON
export login, sf_bulkapi_query, all_object_fields, fields_description, object_list, multiquery
# login
## login function and session token gathering
function login_post(username, password, version)
xml = """<?xml version="1.0" encoding="utf-8" ?>
<env:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
<env:Body>
<n1:login xmlns:n1="urn:partner.soap.sforce.com">
<username>$(username)</username>
<password>$(password)</password>
</n1:login>
</env:Body>
</env:Envelope>"""
HTTP.request("POST", "https://login.salesforce.com/services/Soap/u/$(version)",
["Content-Type" => "text/xml",
"SOAPAction" => "login"],
xml)
end
function login_base(username::String, password::String, version::String = "35.0")
session_info=login_post(username, password, version)
status = session_info.status;
http_status_exception_hand(status)
body = String(session_info.body)
if status == 200
return child_elem(body)
else
return status
end
end
function login(username::String, password::String, version::String = "35.0", tries::Int = 10)
current = 1
while current <= tries
try
session = login_base(username, password, version);
return session
catch
current += 1
@warn "Not successfull. Starting try $current of $tries"
end
end
end
# Bulk api functions
## create work
function jobcreater(session, object, queryall = false)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
if queryall == true
xml = """<?xml version="1.0" encoding="utf-8" ?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
<operation>queryAll</operation>
<object>$(object)</object>
<concurrencyMode>Parallel</concurrencyMode>
<contentType>CSV</contentType>
</jobInfo>"""
else
xml = """<?xml version="1.0" encoding="utf-8" ?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
<operation>query</operation>
<object>$(object)</object>
<concurrencyMode>Parallel</concurrencyMode>
<contentType>CSV</contentType>
</jobInfo>"""
end
job = HTTP.request("POST", url1 * "/services/async/" * apiVersion * "/job",
["Content-Type" => "text/plain",
"X-SFDC-Session" => session["sessionId"]],
xml)
status = job.status;
http_status_exception_hand(status)
body = String(job.body)
job = child_elem(body)
println("Job: " * job["id"])
println("Status: " * job["state"])
return job
end
## create query
function queryposter(session, job, query)
jobid = job["id"]
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
ret = HTTP.request("POST", url1 * "/services/async/" * apiVersion * "/job/" * jobid * "/batch",
["Content-Type" => "text/csv",
"X-SFDC-Session" => session["sessionId"]],
query)
status = ret.status;
http_status_exception_hand(status)
body = String(ret.body)
query = child_elem(body)
println("Job: " * query["id"])
println("Status: " * query["state"])
return query
end
## check status
function batchstatus(session, query; printing = true, tries = 10)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
jobid = query["jobId"]
batchid = query["id"]
ret = []
while tries > 0
try
ret = HTTP.request("GET", url1 * "/services/async/" * apiVersion * "/job/" * jobid * "/batch/" * batchid,
["Content-Type" => "text/plain",
"X-SFDC-Session" => session["sessionId"]])
tries = 0
catch
tries -= 1
@warn "Batchstatus: Not successfull. Tries left $tries"
end
end
status = ret.status;
http_status_exception_hand(status)
body = String(ret.body)
batch = child_elem(body)
if batch["state"] == "Failed"
println("Batch: " * batch["id"])
error("Status: " * batch["stateMessage"])
elseif printing == true
println("Batch: " * batch["id"])
println("Status: " * batch["state"])
end
return batch
end
## fetch results
function resultsid(session, batch)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
jobid = batch["jobId"]
batchid = batch["id"]
ret = HTTP.request("GET", url1 * "/services/async/" * apiVersion * "/job/" * jobid * "/batch/" * batchid * "/result",
["Content-Type" => "text/plain",
"X-SFDC-Session" => session["sessionId"]])
status = ret.status;
http_status_exception_hand(status)
body = String(ret.body)
results = Dict{String,String}()
for (i,x) in enumerate(child_elements(LightXML.root(parse_string(body))))
name_v, value = split(string(x), r"<|>")[2:3]
merge!(results, Dict([name_v*string(i) => value]))
end
return results
end
function results(session, batch)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
jobid = batch["jobId"]
batchid = batch["id"]
resultids = collect(values(resultsid(session,batch)))
body = DataFrame()
for resultid in resultids
ret = HTTP.request("GET", url1 * "/services/async/" * apiVersion * "/job/" * jobid * "/batch/" * batchid * "/result/" * resultid,
["Content-Type" => "text/plain",
"X-SFDC-Session" => session["sessionId"]])
status = ret.status;
http_status_exception_hand(status)
if size(body) == (0, 0)
body = mapcols(x -> replace(x, "" => missing), CSV.read(IOBuffer(String(ret.body)), missingstring = ""))
else
body = vcat(body, mapcols(x -> replace(x, "" => missing), CSV.read(IOBuffer(String(ret.body)), missingstring = "")))
end
end
return body
end
## close worker
function jobcloser(session, job)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
jobid = job["id"]
xml = """<?xml version="1.0" encoding="utf-8" ?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
<state>Closed</state>
</jobInfo>"""
ret = HTTP.request("POST", url1 * "/services/async/" * apiVersion * "/job/" * jobid,
["Content-Type" => "text/plain",
"X-SFDC-Session" => session["sessionId"]],
xml)
status = ret.status;
http_status_exception_hand(status)
body = String(ret.body)
job = child_elem(body)
println("Job: " * job["id"])
println("Status: " * job["state"])
return job
end
# Wrapper
# wrapper function for single task
function lowercase_query(query::String)
org = collect(eachmatch(r"([\"'])(?:(?=(\\?))\2.)*?\1", query))
query = lowercase(query)
new = collect(eachmatch(r"([\"'])(?:(?=(\\?))\2.)*?\1",query))
[query = replace(query, x) for x in Dict(zip([String(y.match) for y in new], [String(y.match) for y in org]))];
return query
end
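# Illustrative behavior (comments added for clarity): lowercase_query lowercases
# the whole query while restoring the original casing of quoted literals, e.g.
# lowercase_query("Select Name From Account Where Name = 'GenePoint'")
# returns "select name from account where name = 'GenePoint'".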
function sf_bulkapi_query(session, query::String, queryall::Bool = false)
query = lowercase_query(query)
objects = [x.match for x in eachmatch(r"(?<=from\s)(\w+)",query)]
length(objects) > 1 ? error("Query string includes multiple objects. It should have only one FROM statement") : nothing
objects = objects[1]
job = jobcreater(session, objects, queryall);
try
query = queryposter(session, job, query);
batch = batchstatus(session, query, printing=true);
if batch["state"] == "Failed"
error("Status: " * batch["stateMessage"])
else
while batch["state"] != "Completed"
sleep(3)
batch = batchstatus(session, query, printing=false);
end
if batch["state"] == "Completed"
res = results(session, batch)
end
return res
end
catch
return DataFrame()
finally
jobcloser(session, job)
end
end
# Functions for multiple queries
function startworker(session, joblist::RemoteChannel{Channel{String}}, res::RemoteChannel{Channel{Dict}}, queries, queryall = false)
function do_worker(session, joblist::RemoteChannel{Channel{String}}, res::RemoteChannel{Channel{Dict}}, queryall = false)
running = true
while running
try
query = take!(joblist)
result = sf_bulkapi_query(session, query, queryall)
put!(res, Dict([(query,result)]))
catch berror
if isa(berror, InvalidStateException)
running = false
elseif isa(berror, RemoteException)
running = false
else
running = false
println(berror)
end
end
end
end
for p in workers()
remote_do(do_worker, p, session, joblist, res, queryall)
end
end;
function startworker(session, joblist::Channel{String}, res::Channel{Dict}, queries, queryall = false)
function create_worker(session, res::Channel{Dict}, queryall = false)
query = take!(joblist)
println("Job taken")
put!(res, Dict([(query,sf_bulkapi_query(session, query, queryall))]))
end
for i in queries
@async create_worker(session, res, queryall)
end
end;
function create_joblist(queries::Array{String}, joblist)
for i in queries
put!(joblist, i)
end
close(joblist)
end;
function result_collector(queries, res)
totalres=Dict()
while isopen(res)
resa = take!(res)
merge!(totalres, resa)
if length(totalres) >= size(queries, 1)
close(res)
end
end
return totalres
end;
function multiquery(session, queries, queryall = false)
if length(workers()) > 1
res = RemoteChannel(()->Channel{Dict}(size(queries,1)));
joblist = RemoteChannel(()->Channel{String}(size(queries,1)));
else
res = Channel{Dict}(size(queries,1));
joblist = Channel{String}(size(queries,1));
end
println("Setup done")
println("Create Jobs")
create_joblist(queries, joblist)
println("Start worker")
@async startworker(session, joblist, res, queries, queryall)
ret = result_collector(queries, res)
println("Results collected")
return ret
end
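# Flow of multiquery (descriptive comments added for clarity):
# 1. create_joblist fills a channel (remote if workers exist) with the query
#    strings and closes it.
# 2. startworker launches workers (remote_do on each worker process, or @async
#    tasks locally) that take queries and put query => DataFrame pairs on the
#    result channel.
# 3. result_collector merges the pairs into a Dict and closes the result
#    channel once one result per query has arrived.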
# Helper functions
## All Tables
function object_list(session)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
ret = HTTP.request("GET", url1 * "/services/data/v" * apiVersion * "/sobjects",
["Content-Type" => "text/plain",
"Authorization" => "Bearer " * session["sessionId"],
"Accept" => "application/json"])
body = JSON.parse(String(ret.body));
objects = [x["name"] for x in body["sobjects"]]
return objects
end
## Field per Table
function fields_description(session, object::String)
apiVersion = match(r"/[0-9\.]{2,}/", session["serverUrl"]).match[2:end-1]
url1 = match(r".{0,}\.com", session["serverUrl"]).match
ret = HTTP.request("GET", url1 * "/services/data/v" * apiVersion * "/sobjects/" * object * "/describe",
["Content-Type" => "text/plain",
"Authorization" => "Bearer " * session["sessionId"],
"Accept" => "application/json"])
body = JSON.parse(String(ret.body));
ret = field_extractor(body["fields"], object)
return ret
end
## All Field in a Table
function fields_description(session, object::Array)
p = Progress(size(object,1), dt=1, barglyphs=BarGlyphs("[=> ]"), barlen=10, color=:green)
ret = fields_description(session, object[1])
next!(p)
for x in object[2:end]
append!(ret, fields_description(session,x))
next!(p)
end
return ret
end
## All fields + all tables
function all_object_fields(session)
objects = object_list(session)
ret = fields_description(session, objects)
return ret
end
## XML functions
function child_elem(x)
x = LightXML.root(parse_string(x))
res = Dict{String,String}()
child_elem(x, res)
end
function child_elem(x, res)
if size(collect(child_elements(x)),1) > 0
for x in child_elements(x)
name_v, value = split(string(x), r"<|>")[2:3]
merge!(res, Dict([name_v => value]))
child_elem(x, res)
end
end
return(res)
end
## Return stat behaviour
function http_status_exception_hand(x)
if (x >= 300) || (x < 200)
@error "HTTP code $x"
end
end
#Extracts all fields from a dict into columns of a DataFrame and appends the object name for reference
function field_extractor(x, object::String)
ret = []
for (i, x) in enumerate(x)
if i == 1
ret = DataFrame(reshape([x for x in values(x)],1,:), Symbol.(keys(x)))
else
append!(ret,DataFrame(reshape([x for x in values(x)],1,:), Symbol.(keys(x))))
end
end
ret[!,:object] .= object
return ret
end
end | SalesForceBulkApi | https://github.com/GregorMatheis/SalesForceBulkApi.jl.git |
|
[
"MIT"
] | 0.0.6 | 9263f87d4cbf886e6193af356d03e1a26eb814d7 | code | 1013 | #test/runtests.jl
#import Pkg; Pkg.add("Test")
using Test, DataFrames, SalesForceBulkApi
## Pulling the data ##
session = login("[email protected]", "9d3T67hTK8DwKjApVAiwZL4nmBmPGqFpMNnK2YoRE4B7Sgf78", "45.0")
all_object_fields_return = all_object_fields(session)
all_object_fields_return[[:name, :object]]
queries = ["Select Name From Account Limit 10", "Select LastName From Contact limit 10"]
res1 = sf_bulkapi_query(session, "Select LastName From Contact limit 10")
multi_result = multiquery(session, queries)
multi_result_all = multiquery(session, queries, true)
# Testing content #
@test eltype(session["sessionId"]) == Char
@test isa(all_object_fields_return, DataFrame)
@test size(all_object_fields_return, 1) > 1
@test size(all_object_fields_return, 2) > 50
@test size(res1) == (10,1)
@test typeof(multi_result) == Dict{Any,Any}
@test multi_result[queries[2]] == res1
@test multi_result[queries[1]][1,1] == "GenePoint"
@test size(multi_result[queries[1]],1) <= size(multi_result_all[queries[1]],1)
| SalesForceBulkApi | https://github.com/GregorMatheis/SalesForceBulkApi.jl.git |
|
[
"MIT"
] | 0.0.6 | 9263f87d4cbf886e6193af356d03e1a26eb814d7 | docs | 1592 | [](https://travis-ci.org/GregorMatheis/SalesForceBulkApi.jl)
[](https://coveralls.io/github/GregorMatheis/SalesForceBulkApi.jl?branch=master)
# SalesForceBulkApi.jl
Functions to query data with the sales force bulk api
Install:
```julia
import Pkg
Pkg.add("SalesForceBulkApi")
using SalesForceBulkApi
```
Usage:
Query data
```julia
session = login("[email protected]/Login", "Your Password", "Your API Version (e.g. 45.0)")
sf_bulkapi_query(session, "Select Name FROM account limit 100")
```
Get overview of all objects and fields per object:
```julia
session = login("[email protected]/Login", "Your Password", "Your API Version (e.g. 45.0)")
object_list(session) # List of all available objects
fields_description(session, "object name") # Gives all fields
all_object_fields(session) # Handy iterator that creates a complete dataframe with all objects and fields. Runs a couple of seconds
```
Query deleted data:
```julia
sf_bulkapi_query(session, "Select Name, IsDeleted From Account", true)
```
Running multiple queries at once
```julia
queries = ["Select Name From Account", "Select LastName From Contact"]
multi_result = multiquery(session, queries) #normal query
multi_result_all = multiquery(session, queries, true) #includes deleted objects
```
If multiple worker processes are available, the queries are distributed across them; otherwise the queries run asynchronously on a single process. Workers can be added with the standard Distributed package, as sketched below.
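A minimal sketch, assuming the standard Distributed workflow (not part of the original README):
```julia
using Distributed
addprocs(2) # add two worker processes
@everywhere using SalesForceBulkApi
```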
| SalesForceBulkApi | https://github.com/GregorMatheis/SalesForceBulkApi.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 479 | using Documenter, MeshCore, MeshSteward
makedocs(
modules = [MeshSteward],
doctest = false, clean = true,
format = Documenter.HTML(prettyurls = false),
authors = "Petr Krysl",
sitename = "MeshSteward.jl",
pages = Any[
"Home" => "index.md",
"How to guide" => "guide/guide.md",
"Reference" => Any[
"man/types.md",
"man/functions.md"],
"Concepts" => "concepts/concepts.md"
],
)
deploydocs(
repo = "github.com/PetrKryslUCSD/MeshSteward.jl.git",
)
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 2561 | module Exports
###############################################################################
using ..MeshSteward: initbox, updatebox!, boundingbox, inflatebox!, inbox, boxesoverlap, intersectboxes
export initbox, updatebox!, boundingbox, inflatebox!, inbox, boxesoverlap, intersectboxes
###############################################################################
using ..MeshSteward: vselect, connectedv
export vselect, connectedv
###############################################################################
using ..MeshSteward: eselect
export eselect
###############################################################################
using ..MeshSteward: linearspace, gradedspace
export linearspace, gradedspace
###############################################################################
using ..MeshSteward: T4blockx, T4block
export T4blockx, T4block
###############################################################################
using ..MeshSteward: T3blockx, T3block, T3toT6, T6blockx, T6block, T6toT3
export T3blockx, T3block, T3toT6, T6blockx, T6block, T6toT3
###############################################################################
using ..MeshSteward: Q4blockx, Q4block, Q4quadrilateral, Q4blockwdistortion
export Q4blockx, Q4block, Q4quadrilateral, Q4blockwdistortion
###############################################################################
using ..MeshSteward: L2blockx, L2block
export L2blockx, L2block
###############################################################################
using ..MeshSteward: import_MESH, import_NASTRAN, import_ABAQUS
export import_MESH, import_NASTRAN, import_ABAQUS
###############################################################################
using ..MeshSteward: vtkwrite, export_MESH
export vtkwrite, export_MESH
###############################################################################
using ..MeshSteward: transform, vconnected, vnewnumbering, compactify, fusevertices, withvertices, renumbered, cat, mergeirs, minimize_profile
export transform, vconnected, vnewnumbering, compactify, fusevertices, withvertices, renumbered, cat, mergeirs, minimize_profile
###############################################################################
using ..MeshSteward: Mesh
export Mesh
using ..MeshSteward: load, save, increl, attach!, basecode, nspacedims, baseincrel, geometry, summary, vselect, boundary, vertices, submesh, label
export load, save, increl, attach!, basecode, nspacedims, baseincrel, geometry, summary, vselect, boundary, vertices, submesh, label
end # module
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 520 | module MeshSteward
include("boxes.jl")
include("vselect.jl")
include("eselect.jl")
include("utilities.jl")
include("tetrahedra.jl")
include("triangles.jl")
include("quadrilaterals.jl")
include("lines.jl")
include("io.jl")
include("modification.jl")
include("mesh.jl")
# We can either use/import individual functions from MeshSteward like so:
# ```
# using MeshSteward: attach!
# ```
# or we can bring into our context all exported symbols as
# ```
# using MeshSteward.Exports
# ```
include("Exports.jl")
end # module
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 4223 | using LinearAlgebra
"""
initbox(x::AbstractVector{T}) where {T}
Create a bounding box and initialize with a single point.
"""
function initbox(x::AbstractVector{T}) where {T}
sdim = length(x)
box = fill(zero(T), 2*sdim)
for i in 1:sdim
box[2*i-1] = box[2*i] = x[i];
end
return box
end
function initbox(x::AbstractMatrix{T}) where {T}
box = initbox(x[1, :])
return updatebox!(box, x)
end
"""
updatebox!(box::AbstractVector{T}, x::AbstractVector{T}) where {T}
Update a box with another location, or create a new box.
If the `box` does not have the correct dimensions, it is correctly sized.
`box` = bounding box
for 1-D `box=[minx,maxx]`, or
for 2-D `box=[minx,maxx,miny,maxy]`, or
for 3-D `box=[minx,maxx,miny,maxy,minz,maxz]`
`x` = vector defining a point. The `box` is expanded to include the
supplied location `x`.
"""
function updatebox!(box::AbstractVector{T}, x::AbstractVector{T}) where {T}
sdim = length(x)
for i in 1:sdim
box[2*i-1] = min(box[2*i-1],x[i]);
box[2*i] = max(box[2*i],x[i]);
end
return box
end
function updatebox!(box::AbstractVector{T}, x::AbstractMatrix{T}) where {T}
for i in 1:size(x, 1)
updatebox!(box, x[i, :])
end
return box
end
"""
boundingbox(x::AbstractArray{T}) where {T}
Compute the bounding box of the points in `x`.
`x` = holds points, one per row.
Returns `box` = bounding box
for 1-D `box=[minx,maxx]`, or
for 2-D `box=[minx,maxx,miny,maxy]`, or
for 3-D `box=[minx,maxx,miny,maxy,minz,maxz]`
"""
function boundingbox(x::AbstractArray{T}) where {T}
box = initbox(x[1, :])
for i in 2:size(x, 1)
updatebox!(box, x[i, :])
end
return box
end
"""
inflatebox!(box::AbstractVector{T}, inflatevalue::T) where {T}
Inflate the box by the value supplied.
"""
function inflatebox!(box::AbstractVector{T}, inflatevalue::T) where {T}
abox = deepcopy(box)
sdim = Int(length(box)/2);
for i=1:sdim
box[2*i-1] = min(abox[2*i-1],abox[2*i]) - inflatevalue;
box[2*i] = max(abox[2*i-1],abox[2*i]) + inflatevalue;
end
return box
end
"""
inbox(box::AbstractVector{T}, x::AbstractVector{T}) where {T}
Is the given location inside the box?
- `box` = vector entries arranged as [minx,maxx,miny,maxy,minz,maxz] (or
adjusted in an obvious way for lower space dimension).
Note: point on the boundary of the box is counted as being inside.
"""
function inbox(box::AbstractVector{T}, x::AbstractVector{T}) where {T}
inrange(rangelo,rangehi,r) = ((r>=rangelo) && (r<=rangehi));
sdim=length(x);
@assert 2*sdim == length(box)
if !inrange(box[1], box[2], x[1])
return false # short-circuit
end
for i=2:sdim
if !inrange(box[2*i-1], box[2*i], x[i])
return false # short-circuit
end
end
return true
end
function inbox(box::AbstractVector{T}, x::AbstractArray{T}) where {T}
return inbox(box, vec(x))
end
"""
boxesoverlap(box1::AbstractVector{T}, box2::AbstractVector{T}) where {T}
Do the given boxes overlap?
"""
function boxesoverlap(box1::AbstractVector{T}, box2::AbstractVector{T}) where {T}
dim=Int(length(box1)/2);
@assert 2*dim == length(box2) "Mismatched boxes"
for i=1:dim
if box1[2*i-1]>box2[2*i]
return false;
end
if box1[2*i]<box2[2*i-1]
return false;
end
end
return true;
end
"""
intersectboxes(box1::AbstractVector{T}, box2::AbstractVector{T}) where {T}
Compute the intersection of two boxes.
The function returns an empty box (length(b) == 0) if the intersection is
empty; otherwise a box is returned.
"""
function intersectboxes(box1::AbstractVector{T}, box2::AbstractVector{T}) where {T}
@assert length(box1) == length(box2) "Mismatched boxes"
b = copy(box1)
dim=Int(length(box1)/2);
@assert 2*dim == length(box2) "Wrong box data"
for i=1:dim
lb = max(box1[2*i-1], box2[2*i-1])
ub = min(box1[2*i], box2[2*i])
if (ub <= lb) # intersection is empty
return eltype(box1)[] # box of length zero signifies empty intersection
end
b[2*i-1] = lb
b[2*i] = ub
end
return b;
end
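# Illustrative usage (an assumed sketch, not part of the original file):
# b1 = boundingbox([0.0 0.0; 1.0 1.0])             # -> [0.0, 1.0, 0.0, 1.0]
# b2 = updatebox!(initbox([0.5, 0.5]), [2.0, 2.0]) # -> [0.5, 2.0, 0.5, 2.0]
# boxesoverlap(b1, b2)                             # -> true
# intersectboxes(b1, b2)                           # -> [0.5, 1.0, 0.5, 1.0]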
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 7258 | using LinearAlgebra: norm, dot, cross
using Statistics: mean
using MeshCore: IncRel, indextype, nshapes, attribute, nentities, nrelations, manifdim, n1storderv, shapedesc
"""
eselect(ir::IncRel; kwargs...)
Select finite elements.
# Arguments
- `ir` = incidence relation representing finite element set `(d, 0)`. The
"elements" are the shapes on the left of the incidence relation.
- `kwargs` = keyword arguments to specify the selection criteria
## Selection criteria
### facing
Select all "boundary" elements that "face" a certain direction.
```
exteriorbfl = eselect(ir, facing=true, direction=x -> [1.0, 1.0, 0.0]);
```
or
```
exteriorbfl = eselect(ir, facing=true, direction=xyz -> xyz/norm(xyz), dotmin = 0.99);
```
where `xyz` is the location of the centroid of a boundary element.
Here the finite element is considered "facing" in the given direction if the dot
product of its normal and the direction vector is greater than `dotmin`.
The default value for `dotmin` is 0.01 (this corresponds to almost 90 degrees
between the normal to the finite element and the given direction).
This selection method makes sense only for elements that are surface-like (i.e. for boundary meshes).
### label
Select elements based on their label.
```
rl1 = eselect(ir, label=1)
```
### box, distance
Select elements based on some criteria that their nodes satisfy. See the
function `vselect()`.
Example:
Select all elements whose nodes are closer than `R+inflate` from the point `from`.
```
linner = eselect(ir, distance = R, from = [0.0 0.0 0.0], inflate = tolerance)
```
Example:
```
exteriorbfl = eselect(ir, box=[1.0, 1.0, 0.0, pi/2, 0.0, Th], inflate=tolerance)
```
where `Th` is a variable.
### Optional keyword arguments
Should we consider the element only if all its nodes are in?
- `allin` = Boolean: if true, then all nodes of an element must satisfy the
criterion; otherwise one is enough.
# Output
- `felist` = list of finite elements (shapes) from the from the collection on
the left of the incidence relation that satisfy the criteria
"""
function eselect(ir::IncRel; kwargs...)
# Extract arguments
allin = nothing; flood = nothing; facing = nothing; label = nothing;
# nearestto = nothing; smoothpatch = nothing;
startnode = 0; dotmin = 0.01
overlappingbox = nothing
for apair in pairs(kwargs)
sy, val = apair
if sy == :flood
flood = val
elseif sy == :facing
facing = val
elseif sy == :label
label = val
elseif sy == :overlappingbox
overlappingbox = val
elseif sy == :nearestto
nearestto = val
elseif sy == :allin
allin = val
end
end
if flood != nothing
for apair in pairs(kwargs)
sy, val = apair
if sy == :startnode
startnode = val
end
end
end
if facing != nothing
facing = true;
direction = nothing
dotmin = 0.01;
for apair in pairs(kwargs)
sy, val = apair
if sy == :direction
direction = val
elseif (sy == :dotmin) || (sy == :tolerance)# allow for obsolete keyword to work
dotmin = val
end
end
end
# The elements of this array are flipped from zero when the element satisfies
# the search condition. This list is eventually purged of the zero elements and
# returned.
felist = zeros(indextype(ir),nshapes(ir.left));
# Select based on fe label
if label != nothing
_label = attribute(ir.left, "label")
for i in 1:nshapes(ir.left)
if label == _label[i]
felist[i] = i; # matched this element
end
end
return felist[findall(x->x!=0, felist)]; # return the nonzero element numbers
end
# Select by flooding
# if flood != nothing && (flood)
# @assert startnode > 0
# fen2fe = FENodeToFEMap(connasarray(fes), count(fens))
# felist = zeros(FInt, count(fes));
# pfelist = zeros(FInt, count(fes));
# felist[fen2fe.map[startnode]] .= 1;
# while true
# copyto!(pfelist, felist);
# markedl = findall(x -> x != 0, felist)
# for j = markedl
# for k = fes.conn[j]
# felist[fen2fe.map[k]] .= 1;
# end
# end
# if sum(pfelist-felist) == 0 # If there are no more changes in this pass, we are done
# break;
# end
# end
# return findall(x -> x != 0, felist); # return the nonzero element numbers;
# end
# Helper function: calculate the normal to a boundary finite element
function normal2d(c, locs, n1stov)
t = locs[c[2]] - locs[c[1]]
n = [t[2], -t[1]]
return n/norm(n);
end
function normal3d(c, locs, n1stov)
if n1stov == 3 # triangle
v1 = locs[c[2]] - locs[c[1]]
v2 = locs[c[3]] - locs[c[1]]
n = cross(v1, v2)
return n/norm(n);
else
v1 = locs[c[3]] - locs[c[1]]
v2 = locs[c[4]] - locs[c[2]]
n = cross(v1, v2)
return n/norm(n);
end
end
center(c, locs) = begin
r = locs[c[1]]
for i in 2:length(c)
r += locs[c[i]]
end
return r ./ length(c)
end
# Select by in which direction the normal of the fes face
if (facing != nothing) && (facing)
locs = attribute(ir.right, "geom")
sdim = length(locs[1])
mdim = manifdim(shapedesc(ir.left));
@assert (mdim == sdim-1) "'Facing': only for Manifold dim. == Space dim.-1"
n1stov = n1storderv(shapedesc(ir.left))
@assert (mdim == 1) && (n1stov == 2) ||
(mdim == 2) && ((n1stov == 2) || (n1stov == 3)) "'Facing': shapes with n1stov = $(n1stov)"
normal = (mdim == 2) ? normal3d : normal2d
for i in 1:nrelations(ir)
c = ir[i]
n = normal(c, locs, n1stov)
d = direction(center(c, locs))
d = d / norm(d)
if (dot(vec(n),vec(d)) > dotmin)
felist[i] = i;
end
end
return felist[findall(x->x!=0, felist)]; # return the nonzero element numbers
end
# Default: Select based on location of nodes
# Should we consider the element only if all its nodes are in?
allinvalue = (allin == nothing) || ((allin != nothing) && (allin))
# Select elements whose nodes are in the selected node list
locs = attribute(ir.right, "geom")
vlist = vselect(locs; kwargs...);
allv = zeros(Bool, nshapes(ir.right));
allv[vlist] .= true
for i in 1:nrelations(ir)
found = 0
for nd in ir[i]
if allv[nd]
found = found + 1
end
end
if allinvalue
if found == nentities(ir, i)
felist[i] = i;
end
else
if found >= 1
felist[i] = i;
end
end
end
return felist[findall(x -> x!=0, felist)]; # return the nonzero element numbers
end
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 3280 |
# const VTKtypemap = Dict{DataType, Int}(P1=>1, L2=>3, T3=>5,
# Q4=>9, T4=>10, H8=>12, Q8=>23,
# L3=>21, T6=>22,
# T10=>24, H20=>25)
const _VTK_TYPE_MAP = Dict{AbsShapeDesc, Int}(P1=>1, L2=>3, T3=>5, Q4=>9, T4=>10, H8=>12, T6=>22, Q8=>23)
"""
vtkwrite(filename, connectivity, data = [])
Write VTK file with the mesh.
- `connectivity` = incidence relation of the type `d -> 0`. It must have a "geom"
attribute for access to the locations of all the vertices that the
connectivity incidence relation references.
- `data` = array of named tuples. Names of attributes of either the left or the
right shape collection of the `connectivity` incidence relation. Property
names: `:name` (required), the name of the attribute; `:allxyz` (optional),
whether to pad the attribute to a three-dimensional quantity (in the global
Cartesian coordinate system).
"""
function vtkwrite(filename, connectivity, data = [])
locs = attribute(connectivity.right, "geom")
# Fill in a matrix of point coordinates
points = fill(zero(eltype(locs[1])), length(locs[1]), length(locs))
for i in 1:length(locs)
points[:, i] .= locs[i]
end #
# Figure out the cell type
celltype = WriteVTK.VTKCellTypes.VTKCellType(_VTK_TYPE_MAP[shapedesc(connectivity.left)])
# Prepare an array of the cells
cells = [MeshCell(celltype, [j for j in connectivity[i]]) for i in 1:nrelations(connectivity)]
vtkfile = vtk_grid(filename, points, cells, compress=3)
for nt in data
an = nt.name # name of the shape collection
pn = propertynames(nt)
allxyz = (:allxyz in pn ? nt.allxyz : false)
if an in keys(connectivity.right.attributes)
a = attribute(connectivity.right, an)
nc = length(a[1])
ncz = ((nc < 3) && allxyz) ? 3 : nc
pdata = fill(0.0, ncz, length(a))
for j in 1:nc
for i in 1:length(a)
pdata[j, i] = a[i][j]
end
end
vtkfile[an] = pdata
elseif an in keys(connectivity.left.attributes)
a = attribute(connectivity.left, an)
nc = length(a[1])
cdata = fill(0.0, nc, length(a))
for j in 1:nc
for i in 1:length(a)
cdata[j, i] = a[i][j]
end
end
vtkfile[an] = cdata
end
end
return vtk_save(vtkfile)
end
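# Illustrative usage (an assumed sketch, not part of the original file): write
# the mesh with the vertex geometry exported as a point attribute padded to 3D:
# vtkwrite("block", connectivity, [(name = "geom", allxyz = true)])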
"""
export_MESH(meshfile, connectivity)
Export the incidence relation `connectivity` (together with the "geom"
attribute of its vertices) in the MESH format.
# Output
`true` if the files were written successfully.
"""
function export_MESH(meshfile, connectivity)
meshfilebase, ext = splitext(meshfile)
# Name of the shape descriptor
t = shapedesc(connectivity.left).name
datinfo = [meshfilebase * "-xyz.dat", t, meshfilebase * "-conn.dat"]
locs = attribute(connectivity.right, "geom")
open(datinfo[1], "w") do file
X = [locs[idx] for idx in 1:length(locs)]
writedlm(file, X, ' ')
end
open(datinfo[3], "w") do file
c = [connectivity[idx] for idx in 1:nrelations(connectivity)]
writedlm(file, c, ' ')
end
open(meshfilebase * ".mesh", "w") do file
writedlm(file, datinfo)
end
return true
end
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 13384 | """
import_MESH(meshfile)
Import vertices and shapes in the MESH format.
# Output
Array of incidence relations.
"""
function import_MESH(meshfile)
meshfilebase, ext = splitext(meshfile)
if ext == ""
ext = ".mesh"
end
meshfile = meshfilebase * ext
meshfiledir = dirname(meshfile)
# Mesh file in the format of the FinEtools .mesh file
datinfo = open(meshfile, "r") do file
readdlm(file)
end
# The first line is the name of the file with the coordinates
Xfile = isfile(datinfo[1]) ? datinfo[1] : joinpath(meshfiledir, datinfo[1])
X = open(Xfile, "r") do file
readdlm(file, ' ', Float64)
end
# RAW: should be able to handle multiple shape sets.
# The second line is the name of the shape descriptor.
sd = SHAPE_DESC[datinfo[2]]
Cfile = isfile(datinfo[3]) ? datinfo[3] : joinpath(meshfiledir, datinfo[3])
C = open(Cfile, "r") do file
readdlm(file, ' ', Int64)
end
N, T = size(X, 2), eltype(X)
locs = VecAttrib([SVector{N, T}(X[i, :]) for i in 1:size(X, 1)])
vrts = ShapeColl(P1, length(locs), "vertices")
vrts.attributes["geom"] = locs
shapes = ShapeColl(sd, size(C, 1), "elements")
connectivity = IncRel(shapes, vrts, C)
return [connectivity]
end
"""
!!! note
The arrays are reallocated as the data is read from files. The size of the
arrays is increased by this much. If the number of entities to be read is
large, the CHUNK should be probably increased so that only a few reallocations
are needed.
"""
const CHUNK = 100000
# Fix up an old style floating-point number without the exponent letter.
function _fixupdecimal(s)
os = ""
for i = length(s):-1:1
os = s[i] * os
if (s[i] == '-') && (i > 1) && (uppercase(s[i-1]) != 'E') && isdigit(s[i-1])
os = "E" * os
end
if (s[i] == '+') && (i > 1) && (uppercase(s[i-1]) != 'E') && isdigit(s[i-1])
os = "E" * os
end
end
return os
end
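# Illustrative example (comments added for clarity): the old-style number
# "-1.66618812195+1" is rewritten by _fixupdecimal as "-1.66618812195E+1",
# i.e. the missing exponent letter is restored before parse(Float64, ...).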
"""
import_NASTRAN(filename)
Import tetrahedral (4- and 10-node) NASTRAN mesh (.nas file).
Limitations:
1. only the GRID and CTETRA sections are read.
2. Only 4-node and 10-node tetrahedra are handled.
3. The file should be free-form (data separated by commas).
Some fixed-format files can also be processed (large-field, but not small-field).
# Output
Array of incidence relations.
"""
function import_NASTRAN(filename; allocationchunk=CHUNK, expectfixedformat = false)
lines = readlines(filename)
nnode = 0
node = zeros(allocationchunk, 4)
maxnodel = 10
nelem = 0
elem = zeros(Int64, allocationchunk, maxnodel + 3)
current_line = 1
while true
if current_line >= length(lines)
break
end
temp = lines[current_line]
current_line = current_line + 1
if (length(temp) >= 4) && (uppercase(temp[1:4]) == "GRID")
fixedformat = expectfixedformat
largefield = false
if (length(temp) >= 5) && (uppercase(temp[1:5]) == "GRID*")
largefield = true; fixedformat = true
end
@assert (!fixedformat) || (fixedformat && largefield) "Can handle either free format or large-field fixed format"
nnode = nnode + 1
if size(node, 1) < nnode
node = vcat(node, zeros(allocationchunk, 4))
end
if fixedformat
# $------1-------2-------3-------4-------5-------6-------7-------8-------9-------0
# GRID* 5 0-1.66618812195+1-3.85740337853+0
# * 1.546691269367+1 0
node[nnode, 1] = parse(Float64, _fixupdecimal(temp[9:24]))
node[nnode, 2] = parse(Float64, _fixupdecimal(temp[41:56]))
node[nnode, 3] = parse(Float64, _fixupdecimal(temp[57:72]))
temp = lines[current_line]
current_line = current_line + 1
node[nnode, 4] = parse(Float64, _fixupdecimal(temp[9:24]))
else
# Template:
# GRID,1,,-1.32846E-017,3.25378E-033,0.216954
A = split(replace(temp, "," => " "))
for six = 1:4
node[nnode, six] = parse(Float64, A[six+1])
end
end
end # GRID
if (length(temp) >= 6) && (uppercase(temp[1:6]) == "CTETRA")
# Template:
# CTETRA,1,3,15,14,12,16,8971,4853,8972,4850,8973,4848
nelem = nelem + 1
if size(elem, 1) < nelem
elem = vcat(elem, zeros(Int64, allocationchunk, maxnodel + 3))
end
continuation = ""
fixedformat = (length(temp) == 72) || expectfixedformat # Is this fixed-format record? This is a guess based on the length of the line.
if fixedformat
continuation = temp[min(length(temp), 72):length(temp)]
temp = temp[1:min(length(temp), 72)]
end
A = split(replace(temp, "," => " "))
elem[nelem, 1] = parse(Int64, A[2])
elem[nelem, 2] = parse(Int64, A[3])
if length(A) == 7 # nodes per element equals 4
nperel = 4
else
nperel = 10
if length(A) < 13 # the line is continued: read the next line
temp = lines[current_line]
current_line = current_line + 1
if fixedformat
temp = temp[length(continuation)+1:min(length(temp), 72)]
end
temp = strip(temp)
Acont = split(replace(temp, "," => " "))
A = vcat(A, Acont)
end
end
elem[nelem, 3] = nperel
for six = 1:nperel
elem[nelem, six+3] = parse(Int64, A[six+3])
end
end # CTETRA
end # while
node = node[1:nnode,:]
elem = elem[1:nelem,:]
# The nodes need to be in serial order: if they are not, the element
# connectivities will not point at the right nodes
@assert norm(collect(1:nnode)-node[:,1]) == 0 "Nodes are not in serial order"
# Process output arguments
# Extract coordinates
xyz = node[:,2:4]
# Cleanup element connectivities
ennod = unique(elem[:,3])
@assert length(ennod) == 1 "Cannot handle mixture of element types"
@assert ((ennod[1] == 4) || (ennod[1] == 10)) "Unknown element type"
conn = elem[:,4:3+convert(Int64, ennod[1])]
# Create output arguments. First the nodes
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
if ennod[1] == 4
shapes = ShapeColl(T4, size(conn, 1), "elements")
# else
# fes = FESetT10(conn) # RAW
end
vrts = ShapeColl(P1, length(locs), "vertices")
vrts.attributes["geom"] = locs
connectivity = IncRel(shapes, vrts, conn)
return [connectivity]
end
mutable struct AbaqusElementSection
ElementLine::AbstractString
nelem::Int64
elem::Array{Int64,2}
end
"""
import_ABAQUS(filename)
Import tetrahedral (4- and 10-node) or hexahedral (8- and 20-node) Abaqus mesh
(.inp file).
Limitations:
1. Only the `*NODE` and `*ELEMENT` sections are read
2. Only 4-node and 10-node tetrahedra, 8-node or 20-node hexahedra, 3-node triangles
are handled.
# Output
Array of incidence relations.
"""
function import_ABAQUS(filename; allocationchunk=CHUNK)
lines = readlines(filename)
maxelnodes = 20
warnings = String[]
nnode = 0
node = zeros(allocationchunk, 4)
Reading_nodes = false
next_line = 1
while true
if next_line > length(lines)
break
end
temp = uppercase(strip(lines[next_line]))
next_line = next_line + 1
if (length(temp) >= 5) && (temp[1:5] == "*NODE")
Reading_nodes = true
nnode = 0
node = zeros(allocationchunk, 4)
temp = uppercase(strip(lines[next_line]))
next_line = next_line + 1
end
if Reading_nodes
if temp[1:1] == "*" # another section started
Reading_nodes = false
break
end
nnode = nnode + 1
if size(node, 1) < nnode # if needed, allocate more space
node = vcat(node, zeros(allocationchunk, 4))
end
A = split(replace(temp, "," => " "))
for six = 1:length(A)
node[nnode, six] = parse(Float64, A[six])
end
end
end # while
nelemset = 0
elemset = AbaqusElementSection[]
Reading_elements = false
next_line = 1
while true
if next_line > length(lines)
break
end
temp = uppercase(strip(lines[next_line]))
next_line = next_line + 1
if (length(temp) >= 8) && (temp[1:8] == "*ELEMENT")
Reading_elements = true
nelemset = nelemset + 1
nelem = 0
a = AbaqusElementSection(temp, nelem, zeros(Int64, allocationchunk, maxelnodes+1))
push!(elemset, a)
temp = uppercase(strip(lines[next_line]))
next_line = next_line + 1
end
if Reading_elements
if temp[1:1] == "*" # another section started
Reading_elements = false
break
end
elemset[nelemset].nelem = elemset[nelemset].nelem + 1
if size(elemset[nelemset].elem, 1) < elemset[nelemset].nelem
elemset[nelemset].elem = vcat(elemset[nelemset].elem,
zeros(Int64, allocationchunk, maxelnodes+1))
end
A = split(temp, ",")
if (A[end] == "") # the present line is continued on the next one
temp = uppercase(strip(lines[next_line]))
next_line = next_line + 1
Acont = split(temp, ",")
A = vcat(A[1:end-1], Acont)
end
for ixxxx = 1:length(A)
elemset[nelemset].elem[elemset[nelemset].nelem, ixxxx] = parse(Int64, A[ixxxx])
end
end
end # while
node = node[1:nnode, :] # truncate the array to just the lines read
# The nodes need to be in serial order: if they are not, the element
# connectivities will not point at the right nodes. So, if that's the case we
# will figure out the sequential numbering of the nodes and then we will
# renumber the connectivvities of the elements.
# newnumbering = collect(1:nnode)
# if norm(collect(1:nnode)-node[:,1]) != 0
# newnumbering = zeros(Int64, convert(Int64, maximum(node[:,1])))
# jn = 1
# for ixxxx = 1:size(node, 1)
# if node[ixxxx,1] != 0
# on = convert(Int64, node[ixxxx,1])
# newnumbering[on] = jn
# jn = jn + 1
# end
# end
# end
# Process output arguments
# Nodes
xyz = node[:,2:4]
function feset_construct(elemset1)
temp = uppercase(strip(elemset1.ElementLine))
b = split(temp, ",")
for ixxx = 1:length(b)
c = split(b[ixxx], "=")
if (uppercase(strip(c[1])) == "TYPE") && (length(c) > 1)
TYPE = uppercase(strip(c[2]))
if (length(TYPE) >= 4) && (TYPE[1:4] == "C3D4")
return (T4, elemset1.elem[:, 2:5])
# elseif (length(TYPE) >= 4) && (TYPE[1:4] == "C3D8")
# return FESetH8(elemset1.elem[:, 2:9])
# elseif (length(TYPE) >= 5) && (TYPE[1:5] == "C3D20")
# return FESetH20(elemset1.elem[:, 2:21])
# elseif (length(TYPE) >= 5) && (TYPE[1:5] == "C3D10")
# return FESetT10(elemset1.elem[:, 2:11])
# elseif (length(TYPE) >= 5) && (TYPE[1:5] == "DC2D3")
# return FESetT3(elemset1.elem[:, 2:4])
elseif (length(TYPE) >= 4) && (TYPE[1:4] == "CPS4") # matches both CPS4 and CPS4R
return (Q4, elemset1.elem[:, 2:5])
elseif (length(TYPE) >= 2) && (TYPE[1:2] == "S3")
return (T3, elemset1.elem[:, 2:4])
else
return nothing, nothing
end
end
end
end
# Create output arguments. First the nodes
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs), "vertices")
vrts.attributes["geom"] = locs
# Element sets
connectivities = IncRel[]
for ixxxx = 1:length(elemset)
elemset[ixxxx].elem = elemset[ixxxx].elem[1:elemset[ixxxx].nelem, :]
fet, conn = feset_construct(elemset[ixxxx])
if (fet == nothing)
@warn "Don't know how to handle " * elemset[ixxxx].ElementLine
else
# fes = renumberconn!(fes, newnumbering)
elements = ShapeColl(fet, size(conn, 1), fet.name)
push!(connectivities, IncRel(elements, vrts, [SVector{size(conn, 2), Int64}(conn[i, :]) for i in 1:size(conn, 1)], fet.name))
end
end
return connectivities
end
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 281 | using DelimitedFiles
using WriteVTK
using MeshCore: AbsShapeDesc, SHAPE_DESC, P1, L2, T3, Q4, T4, H8, T6, Q8
using MeshCore: ShapeColl, shapedesc, nshapes, IncRel, nrelations
using MeshCore: VecAttrib, attribute
using LinearAlgebra: norm
include("import.jl")
include("export.jl")
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 1648 | using DelimitedFiles
using WriteVTK
using MeshCore: AbsShapeDesc, P1, L2, L3
using MeshCore: VecAttrib
using MeshCore: shapedesc, nshapes, IncRel
using MeshCore: ShapeColl
using LinearAlgebra: norm
using StaticArrays
"""
L2blockx(xs::Vector{T}; intbytes = 8) where {T}
Generate a graded mesh on an interval.
Keyword argument `intbytes` controls the size of the integer indexes.
# Return
By convention the function returns an incidence relation (`connectivity`)
between the elements (`connectivity.left`) and the vertices
(`connectivity.right`). The geometry is stored as the attribute "geom" of the
vertices.
"""
function L2blockx(xs::Vector{T}; intbytes = 8) where {T}
inttype = Int64
if intbytes == 4
inttype = Int32
end
nL = length(xs)-1;
nnodes = (nL+1);
ncells = (nL);
xys = zeros(T, nnodes, 1);
conns = zeros(inttype, ncells, 2);
f=1;
for i in 1:(nL+1)
xys[f, 1] = xs[i]
f=f+1;
end
gc=1;
for i in 1:nL
conns[gc, 1] = i
conns[gc, 2] = (i+1)
gc=gc+1;
end
C = inttype.(conns[1:gc-1, :]);
N, TC = size(xys, 2), eltype(xys)
locs = VecAttrib([SVector{N, TC}(xys[i, :]) for i in 1:size(xys, 1)])
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(L2, size(C, 1), "elements")
return IncRel(elements, vertices, C)
end
"""
L2block(Length, nL; kwargs...)
Generate a mesh of two-node line elements on an interval of the given `Length`, with `nL` elements.
See also: L2blockx
"""
function L2block(Length, nL; kwargs...)
return L2blockx(collect(linearspace(0.0, Length, nL+1)); kwargs...);
end
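# Illustrative usage (an assumed sketch, not part of the original file):
# conn = L2block(1.0, 4)                      # 4 uniform elements on [0.0, 1.0]
# conn = L2blockx([0.0, 0.1, 0.3, 0.7, 1.5])  # graded mesh from given vertices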
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 6743 | using MeshCore
using MeshCore: ShapeColl, IncRel, ir_skeleton
using MeshCore: shapedesc, nshapes, ir_code, attribute
import Base.show
using StaticArrays
"""
Mesh
The type of the mesh.
It stores the incidence relations keyed by the code of the relation.
The incidence relation code is `(d1, d2)`, where `d1` is the manifold dimension
of the shape collection on the left, and `d2` is the manifold dimension of the
shape collection on the right.
The incidence relations are stored with a key consisting of the code and a
string tag. If the string tag is unspecified, it is assumed to be an empty
string.
"""
struct Mesh
name::String # name of the mesh
_increls::Dict{Tuple{Tuple{Int64, Int64}, String}, IncRel} # dictionary of incidence relations
end
"""
Mesh()
Define the mesh with default name and empty dictionary of incidence relations.
"""
function Mesh()
Mesh("mesh", Dict{Tuple{Tuple{Int64, Int64}, String}, IncRel}())
end
"""
Mesh(s::String)
Define the mesh named `s` with an empty dictionary of incidence relations.
"""
function Mesh(s::String)
Mesh(s, Dict{Tuple{Tuple{Int64, Int64}, String}, IncRel}())
end
"""
load(m::Mesh, filename::String)
Load a mesh (incidence relation) from a MESH file.
!!! note
No check is performed that the loaded incidence relation is compatible
with the existing incidence relations in the mesh.
"""
function load(m::Mesh, filename::String)
conns = import_MESH(filename)
return attach!(m, conns[1])
end
"""
save(m::Mesh, filename::String)
Save a mesh base incidence relation to a MESH file.
"""
function save(m::Mesh, filename::String)
ir = increl(m, basecode(m))
return export_MESH(filename, ir)
end
"""
increl(m::Mesh, irc::Tuple{Int64, Int64})
Retrieve the named incidence relation based on the code.
Any tag is matched.
"""
function increl(m::Mesh, irc::Tuple{Int64, Int64})
for (k, v) in zip(keys(m._increls), values(m._increls))
if k[1] == irc
return v
end
end
return nothing
end
"""
increl(m::Mesh, fullirc::Tuple{Tuple{Int64, Int64}, String})
Retrieve the named incidence relation based on the full key (code + tag).
"""
increl(m::Mesh, fullirc::Tuple{Tuple{Int64, Int64}, String}) = m._increls[fullirc]
"""
attach!(m::Mesh, increl::IncRel)
Attach the incidence relation under its code and empty tag.
The code of the incidence relation combined with an empty tag (`""`) is the key
under which this relation is stored in the mesh.
"""
function attach!(m::Mesh, ir::IncRel)
return attach!(m, ir, "")
end
"""
attach!(m::Mesh, increl::IncRel, tag::String)
Attach the incidence relation under its code and given tag.
The code of the incidence relation combined with the tag is the key
under which this relation is stored in the mesh.
"""
function attach!(m::Mesh, ir::IncRel, tag::String)
m._increls[(ir_code(ir), tag)] = ir
return m
end
"""
basecode(m::Mesh)
Compute the code of the base relation.
The base incidence relation is `(d, 0)` that represents the elements of the
interior of the domain.
"""
function basecode(m::Mesh)
maxd = 0
for irc in keys(m._increls)
maxd = max(maxd, irc[1][1])
end
return (maxd, 0)
end
"""
nspacedims(m::Mesh)
Furnish the dimension of the space in which the mesh lives.
"""
function nspacedims(m::Mesh)
ir = increl(m, basecode(m))
a = attribute(ir.right, "geom")
return length(a[1])
end
"""
baseincrel(m::Mesh)
Retrieve the base incidence relation for the mesh.
"""
baseincrel(m::Mesh) = increl(m, basecode(m))
"""
baseincrel(m::Mesh, tag::String)
Retrieve the base incidence relation for the mesh distinguished by its tag.
"""
baseincrel(m::Mesh, tag::String) = increl(m, (basecode(m), tag))
"""
geometry(m::Mesh)
Retrieve the geometry attribute from the vertices.
"""
function geometry(m::Mesh)
ir = increl(m, basecode(m))
return attribute(ir.right, "geom")
end
"""
Base.summary(m::Mesh)
Form a brief summary of the mesh.
"""
function Base.summary(m::Mesh)
s = "Mesh $(m.name):"
for ir in m._increls
s = s * " $(ir[1]) = " * summary(ir[2]) * "; "
end
return s
end
"""
Base.summary(io::IO, m::Mesh)
Form a brief summary of the mesh.
"""
function Base.summary(io::IO, m::Mesh)
print(io, summary(m), "\n")
end
"""
vselect(m::Mesh; kwargs...)
Select vertices. Return as an incidence relation.
Refer to `vselect` that works with the geometry attribute.
"""
function vselect(m::Mesh; kwargs...)
ir = increl(m, basecode(m))
geom = attribute(ir.right, "geom")
list = vselect(geom; (kwargs...))
return IncRel(ShapeColl(MeshCore.P1, length(list)), ir.right, [[idx] for idx in list])
end
"""
boundary(m::Mesh)
Compute the boundary of the mesh.
The incidents relation is stored in the mesh with the tag "boundary".
"""
function boundary(m::Mesh)
ir = increl(m, basecode(m))
sir = ir_skeleton(ir, "skeleton") # compute the skeleton of the base incidence relation
attach!(m, sir) # insert the skeleton into the mesh
# Now construct the boundary incidence relation
isboundary = sir.left.attributes["isboundary"]
ind = [i for i in 1:length(isboundary) if isboundary[i]]
lft = ShapeColl(shapedesc(sir.left), length(ind), "facets")
bir = IncRel(lft, sir.right, deepcopy(sir._v[ind]))
attach!(m, bir, "boundary")
return bir
end
"""
vertices(m::Mesh)
Compute the `(0, 0)` incidence relation for the vertices of the base incidence
relation.
"""
function vertices(m::Mesh)
ir = increl(m, basecode(m))
return IncRel(ir.right, ir.right, [SVector{1, Int64}([idx]) for idx in 1:nshapes(ir.right)], "vertices")
end
"""
submesh(m::Mesh, list)
Extract a submesh constructed of a subset of the base relation.
"""
function submesh(m::Mesh, list)
ir = increl(m, basecode(m))
lft = ShapeColl(shapedesc(ir.left), length(list))
v = [ir[idx] for idx in list]
nir = IncRel(lft, ir.right, v)
nm = Mesh()
return attach!(nm, nir)
end
function _label(sc, list, lab)
if !("label" in keys(sc.attributes))
sc.attributes["label"] = VecAttrib([zero(typeof(lab)) for idx in 1:nshapes(sc)])
end
a = sc.attributes["label"]
for i in 1:length(list)
a[list[i]] = lab
end
end
"""
label(m::Mesh, irc, shapecoll, list, lab)
Label shapes in `list` with the label `lab`.
Label the shapes on the `shapecoll` of the incidence relation.
`shapecoll` must be either `:left` or `:right`.
"""
function label(m::Mesh, irc, shapecoll, list, lab)
ir = increl(m, irc)
if shapecoll == :left
_label(ir.left, list, lab)
elseif shapecoll == :right
_label(ir.right, list, lab)
end
end
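# Illustrative workflow (an assumed sketch, not part of the original file):
# m = Mesh("block")
# attach!(m, connectivity)  # connectivity = some (d, 0) incidence relation
# bir = boundary(m)         # boundary facets, stored under the "boundary" tag
# println(summary(m))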
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 11864 | import Base.cat
using DelimitedFiles
using WriteVTK
using SparseArrays
using SymRCM
using MeshCore: AbsShapeDesc, P1, L2, T3, Q4, T4, H8
using MeshCore: VecAttrib
using MeshCore: shapedesc, nshapes, IncRel
using MeshCore: ShapeColl
using LinearAlgebra: norm
# using UnicodePlots # for debugging only
"""
transform(ir, T = x -> x)
Change the locations of the vertices through the transformation `T`.
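# Example
A minimal sketch, scaling and shifting all vertex locations of the incidence
relation `ir`:
```
transform(ir, x -> 2.0 .* x .+ 1.0)
```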
"""
function transform(ir, T = x -> x)
locs = ir.right.attributes["geom"]
N = length(locs[1])
TC = eltype(locs[1])
nlocs = VecAttrib([SVector{N, TC}(T(locs[i])) for i in 1:length(locs)])
ir.right.attributes["geom"] = nlocs
end
"""
vconnected(ir)
Find whether or not the vertices are connected to any shape on the left of the
incidence relation.
- `isconnected` = vector of Boolean values, one for each vertex: `true` when
  vertex `k` is connected to at least one shape on the left, `false` when it
  is not.
"""
function vconnected(ir)
isconnected = falses(nshapes(ir.right));
for i in 1:nrelations(ir)
for j in 1:nentities(ir, i)
isconnected[ir[i, j]] = true
end
end
return isconnected
end
"""
vnewnumbering(ir, isconnected)
Compute the new numbering obtained by deleting unconnected vertices.
"""
function vnewnumbering(ir, isconnected)
@_check length(isconnected) == nshapes(ir.right)
p = fill(zero(indextype(ir)), nshapes(ir.right));
id = 1;
for i in 1:length(isconnected)
if (isconnected[i])
p[i] = id;
id = id+1;
end
end
return p
end
"""
compactify(ir, new_numbering)
Compact the vertex set of the incidence relation by deleting unconnected
vertices.
- `ir` = incidence relation from the elements to the vertices,
- `new_numbering` = array whose element `new_numbering[j]` is either 0 (when
  vertex `j` is unconnected), or the new serial number of vertex `j` (when it
  is connected to other vertices by at least one shape on the left). For
  instance, if vertex 5 was connected, and in the new numbering it is the
  third vertex, then `new_numbering[5]` is 3. Such an array is produced by
  `vnewnumbering`.
# Output
New incidence relation referring to the compacted vertex set.
# Examples
Let us say there are vertices not connected to any shape on the left that we
would like to remove from the mesh: here is how that could be accomplished
with the functions from this file.
```
isconnected = vconnected(ir);
new_numbering = vnewnumbering(ir, isconnected);
ir = compactify(ir, new_numbering);
```
"""
function compactify(ir, new_numbering)
@_check length(new_numbering) == nshapes(ir.right)
locs = ir.right.attributes["geom"] # this is the old geometry attribute
N = length(locs[1]) # number of spatial dimensions
TC = eltype(locs[1])
# Now generate a vector of the locations for connected vertices
v = [SVector{N, TC}(locs[i]) for i in 1:length(locs) if new_numbering[i] != 0]
locs = VecAttrib(v) # this is the new geometry attribute
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
# Elements
    N = nentities(ir, 1)
    C = [SVector{N}(ir[idx]) for idx in 1:nrelations(ir)] # connectivity, fixed cardinality
elements = ShapeColl(shapedesc(ir.left), length(C), "elements")
return IncRel(elements, vertices, C)
end
# This code was taken over from FinEtools: refer to fusenodes
function _fusen(xyz1, id1, xyz2, id2, tolerance)
dim = size(xyz1,2);
nn1 = size(xyz1, 1)
nn2 = size(xyz2, 1)
# Decide which nodes should be checked for proximity
ib = intersectboxes(inflatebox!(boundingbox(xyz1), tolerance), inflatebox!(boundingbox(xyz2), tolerance))
node1in = fill(false, nn1);
node2in = fill(false, nn2);
if length(ib) > 0
for i=1:nn1
node1in[i] = inbox(ib, @view xyz1[i, :])
end
for i=1:nn2
node2in[i] = inbox(ib, @view xyz2[i, :])
end
end
# Mark nodes from the first array that are duplicated in the second
if (tolerance > 0.0) # should we attempt to merge nodes?
for i=1:nn1
if node1in[i]
breakoff = false
for rx=1:nn2
if node2in[rx]
distance = 0.0
for cx=1:dim
distance = distance + abs(xyz2[rx,cx]-xyz1[i,cx]);
if (distance >= tolerance) # shortcut: if the distance is already too large, stop checking
break
end
end
if (distance < tolerance)
id1[i] = -rx; breakoff = true;
end
end
if breakoff
break
end
end
end
end
end
# Generate fused arrays of the nodes. First copy in the nodes from the second set...
xyzm = zeros(eltype(xyz1),nn1+nn2,dim);
for rx = 1:nn2
for cx = 1:dim
xyzm[rx,cx] = xyz2[rx,cx];
end
end
# idm = zeros(eltype(id1),nn1+nn2);
# for rx = 1:nn2
# idm[rx] = rx;
# end
mid=nn2+1;
# ...and then we add in only non-duplicated nodes from the first set
for i=1:nn1
if id1[i]>0
id1[i] = mid;
# idm[mid] = mid;
for cx = 1:dim
xyzm[mid,cx] = xyz1[i,cx];
end
mid = mid+1;
else
id1[i] = id2[-id1[i]];
end
end
nnodes = mid-1;
# The set 1 is described by these locations. The new numbering applies also
# to set 1.
xyzm = xyzm[1:nnodes,:];
new_indexes_of_set1_nodes = deepcopy(id1);
# The node set 2 numbering stays the same
return xyzm, new_indexes_of_set1_nodes
end
"""
fusevertices(locs1, locs2, tolerance)
Fuse together vertices from two vertex sets.
Fuse two vertex location sets, if necessary by gluing together vertices
located within `tolerance` of each other. The two vertex sets are fused
together by merging the vertices that fall within a box of size `tolerance`.
The merged location set and the new indexes of the vertices of the first set
are returned.
The locations of the second set, `locs2`, will be included unchanged, in the
same order, in the merged set.
The indexes of the first set, `locs1`, will have changed.
# Example
After the call to this function, `k = new_indexes_of_set1_nodes[j]` is the
vertex in the merged set which used to be vertex `j` in the first set. The
connectivity that used to refer to `locs1` needs to be updated to refer to
the same vertices in the merged set, for instance with `renumbered`.
"""
function fusevertices(locs1, locs2, tolerance)
@_check length(locs1[1]) == length(locs2[1])
dim = length(locs1[1]);
nn1 = length(locs1)
nn2 = length(locs2)
TC = eltype(locs1[1])
xyz1 = zeros(TC,nn1,dim);
for i in 1:length(locs1)
xyz1[i, :] = locs1[i]
end
xyz2 = zeros(TC,nn2,dim);
for i in 1:length(locs2)
xyz2[i, :] = locs2[i]
end
id1 = collect(1:nn1);
id2 = collect(1:nn2);
xyzm, new_indexes_of_set1_nodes = _fusen(xyz1, id1, xyz2, id2, tolerance)
N = size(xyzm, 2)
nlocs1 = VecAttrib([SVector{N, TC}(xyzm[i, :]) for i in 1:size(xyzm, 1)])
return nlocs1, new_indexes_of_set1_nodes
end
"""
withvertices(ir, locs)
Create a new incidence relation referring to a different set of vertices.
Presumably the set of vertices is simply broadened from the one used by the
incidence relation `ir`.
"""
function withvertices(ir, locs)
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(shapedesc(ir.left), nrelations(ir), "elements")
return IncRel(elements, vertices, [ir[idx] for idx in 1:nrelations(ir)])
end
"""
renumbered(ir, p)
Renumber the connectivity of the shapes based on a new numbering for the
vertices.
- `p` = new serial numbers for the vertices. The connectivity
should be changed as `conn[j]` --> `p[conn[j]]`
Returns new incidence relation with renumbered connectivity.
# Example
Let us say there are vertices not connected to any shape on the left that we
would like to remove from the mesh: here is how that could be accomplished
with the functions from this file.
```
isconnected = vconnected(ir);
p = vnewnumbering(ir, isconnected);
ir = compactify(ir, p);
ir = renumbered(ir, p);
```
"""
function renumbered(ir, p)
N = nentities(ir, 1)
C = SVector{N, indextype(ir)}[]
for i in 1:nrelations(ir)
c = ir[i]
push!(C, SVector{N}(p[c]))
end
locs = ir.right.attributes["geom"]
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(shapedesc(ir.left), length(C), "elements")
return IncRel(elements, vertices, C)
end
"""
cat(ir1::T, ir2::T) where {T<:IncRel}
Concatenate the connectivities of two shape collections.
The two shape collections must be of the same shape descriptor, and they must
refer to the same attribute describing the locations of the vertices.
"""
function cat(ir1::T, ir2::T) where {T<:IncRel}
# First, the left shapes must be of the same description
@_check shapedesc(ir1.left) == shapedesc(ir2.left)
# The shape collections on the right must be vertices
@_check shapedesc(ir1.right) == shapedesc(ir2.right) == P1
# The shape collections must refer to the same set of vertices. The length
# is a cheap proxy for checking that condition.
@_check length(ir1.right.attributes["geom"]) == length(ir2.right.attributes["geom"])
# The shape collections must have fixed cardinality.
N = nentities(ir1, 1)
C = SVector{N, indextype(ir1)}[]
for i in 1:nrelations(ir1)
push!(C, ir1[i])
end
for i in 1:nrelations(ir2)
push!(C, ir2[i])
end
locs = ir1.right.attributes["geom"]
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(shapedesc(ir1.left), length(C), "elements")
return IncRel(elements, vertices, C)
end
"""
mergeirs(conn1, conn2, tolerance = eps())
Merge two incidence relations, fusing vertices closer than `tolerance`.
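# Example
A minimal sketch, gluing together two abutting quadrilateral blocks generated
with `Q4quadrilateral` from this package:
```
c1 = Q4quadrilateral([0.0 0.0; 1.0 1.0], 2, 2)
c2 = Q4quadrilateral([1.0 0.0; 2.0 1.0], 2, 2)
c = mergeirs(c1, c2, 0.001)
```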
"""
function mergeirs(conn1, conn2, tolerance = eps())
locs1 = conn1.right.attributes["geom"]
locs2 = conn2.right.attributes["geom"]
nlocs1, ni1 = fusevertices(locs1, locs2, tolerance)
conn1 = withvertices(conn1, nlocs1)
conn2 = withvertices(conn2, nlocs1)
conn1 = renumbered(conn1, ni1)
return cat(conn1, conn2)
end
"""
minimize_profile(conn)
Re-number the vertices so that the profile can be minimized.
# Output
Renumbered incidence relation.
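# Example
A minimal sketch, renumbering a generated triangle-block mesh (`T3block` is
from this package):
```
conn = T3block(1.0, 1.0, 5, 7)
conn = minimize_profile(conn)
```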
"""
function minimize_profile(conn)
I = fill(zero(Int), 0)
J = fill(zero(Int), 0)
sizehint!(I, nshapes(conn.right))
for k in 1:nrelations(conn)
ne = nentities(conn, k)
for i in 1:ne
append!(I, conn[k])
for m in 1:ne
push!(J, conn[k, i])
end
end
end
V = fill(1.0, length(I))
S = sparse(I, J, V, nshapes(conn.right), nshapes(conn.right))
# display(spy(S))
# find the new numbering (permutation)
p = symrcm(S)
# display(spy(S[p, p]))
# number the vertices of the shapes on the left using the new permutation
conn = renumbered(conn, p)
# reorder the vertices attribute: the order of the vertices changed
ip = similar(p) # inverse permutation
ip[p] = 1:length(p)
locs = conn.right.attributes["geom"]
newlocs = VecAttrib([locs[k] for k in ip])
conn.right.attributes["geom"] = newlocs
return conn
end
using DelimitedFiles
using WriteVTK
using MeshCore: AbsShapeDesc, P1, L2, T3, Q4, T4, H8
using MeshCore: VecAttrib
using MeshCore: shapedesc, nshapes, IncRel
using MeshCore: ShapeColl
using LinearAlgebra: norm
using StaticArrays
"""
Q4blockx(xs::Vector{T}, ys::Vector{T}; intbytes = 8) where {T}
Generate a graded quadrilateral mesh of a 2D block.
Keyword argument `intbytes` controls the size of the integer indexes.
# Return
By convention the function returns an incidence relation (`connectivity`)
between the elements (`connectivity.left`) and the vertices
(`connectivity.right`). The geometry is stored as the attribute "geom" of the
vertices.
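# Example
A graded mesh may be produced by supplying non-uniform coordinate vectors,
for instance generated with `gradedspace` from this package:
```
conn = Q4blockx(collect(0.0:0.5:2.0), gradedspace(0.0, 1.0, 5))
```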
"""
function Q4blockx(xs::Vector{T}, ys::Vector{T}; intbytes = 8) where {T}
inttype = Int64
if intbytes == 4
inttype = Int32
end
nL = length(xs)-1;
nW = length(ys)-1;
nnodes = (nL+1)*(nW+1);
ncells = (nL)*(nW);
xys = zeros(T, nnodes, 2);
conns = zeros(inttype, ncells, 4);
f=1;
for j in 1:(nW+1)
for i in 1:(nL+1)
xys[f, 1] = xs[i]
xys[f, 2] = ys[j]
f=f+1;
end
end
gc=1;
for i in 1:nL
for j in 1:nW
            f = (j-1)*(nL+1) + i;
conns[gc, 1] = f
conns[gc, 2] = (f+1)
conns[gc, 3] = f+(nL+1)+1
conns[gc, 4] = f+(nL+1)
gc=gc+1;
end
end
C = inttype.(conns[1:gc-1, :]);
N, TC = size(xys, 2), eltype(xys)
locs = VecAttrib([SVector{N, TC}(xys[i, :]) for i in 1:size(xys, 1)])
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(Q4, size(C, 1), "elements")
return IncRel(elements, vertices, C)
end
"""
Q4block(Length, Width, nL, nW; kwargs...)
Generate a quadrilateral mesh of the 2D block.
See also: Q4blockx
"""
function Q4block(Length, Width, nL, nW; kwargs...)
return Q4blockx(collect(linearspace(0.0, Length, nL+1)),
collect(linearspace(0.0, Width, nW+1)); kwargs...);
end
"""
    Q4quadrilateral(xyz::Matrix{T}, nL, nW; intbytes = 8) where {T}
Mesh of a general quadrilateral given by the location of the vertices.
"""
function Q4quadrilateral(xyz::Matrix{T}, nL, nW; intbytes = 8) where {T}
npts = size(xyz,1);
if npts == 2 # In this case the quadrilateral must be defined in two dimensions
lo = minimum(xyz, dims = 1);
hi = maximum(xyz, dims = 1);
xyz = [[lo[1] lo[2]];
[hi[1] lo[2]];
[hi[1] hi[2]];
[lo[1] hi[2]]];
elseif npts != 4
error("Need 2 or 4 points");
end
# Generate elements and vertices in a 2D square
ir = Q4block(2.0, 2.0, nL, nW; intbytes = intbytes);
function bfun(param_coords)
return SMatrix{4, 1}([0.25 * (1. - param_coords[1]) * (1. - param_coords[2]);
0.25 * (1. + param_coords[1]) * (1. - param_coords[2]);
0.25 * (1. + param_coords[1]) * (1. + param_coords[2]);
0.25 * (1. - param_coords[1]) * (1. + param_coords[2])]);
end
# Remap the quadrilateral mesh to the physical coordinates
locs = ir.right.attributes["geom"];
N, TC = size(xyz, 2), eltype(locs[1])
v = SVector{N, TC}[]
for i = 1:length(locs)
bfv = bfun([locs[i][1]-1.0, locs[i][2]-1.0]);# shift coordinates by -1
push!(v, SVector{N, TC}(bfv'*xyz));
end
nlocs = VecAttrib(v)
ir.right.attributes["geom"] = nlocs
return ir
end
"""
Q4blockwdistortion(Length, Width, nL, nW; intbytes = 8)
Generate mesh of a rectangular block with distorted quadrilaterals.
"""
function Q4blockwdistortion(Length, Width, nL, nW; intbytes = 8)
nL2 = max(1, Int32(round(nL/2)))
nW2 = max(1, Int32(round(nW/2)))
c1 = Q4quadrilateral([-1 -1; -0.2 -1; -0.1 -0.2; -1 0.8], nL2, nW2; intbytes = intbytes)
c2 = Q4quadrilateral([-0.2 -1; 1 -1; 1 -0.5; -0.1 -0.2], nL2, nW2; intbytes = intbytes)
c3 = Q4quadrilateral([-0.1 -0.2; 1 -0.5; 1 1; 0.3 1], nL2, nW2; intbytes = intbytes)
c4 = Q4quadrilateral([-1 0.8; -0.1 -0.2; 0.3 1; -1 1], nL2, nW2; intbytes = intbytes)
c = mergeirs(c1, c2, 0.001/max(nL2, nW2))
c = mergeirs(c, c3, 0.001/max(nL2, nW2))
c = mergeirs(c, c4, 0.001/max(nL2, nW2))
transform(c, x -> begin
[(x[1]+1.0)/2*Length, (x[2]+1.0)/2*Width]
end)
return c
end
using DelimitedFiles
using WriteVTK
using MeshCore: AbsShapeDesc, P1, L2, T3, Q4, T4, H8
using MeshCore: VecAttrib
using MeshCore: shapedesc, nshapes, IncRel
using MeshCore: ShapeColl
using LinearAlgebra: norm
"""
T4blockx(xs::Vector{T}, ys::Vector{T}, zs::Vector{T}, orientation::Symbol) where {T}
Generate a graded tetrahedral mesh of a 3D block.
Four-node tetrahedra in a regular arrangement, with non-uniform given spacing
between the nodes, with a given orientation of the diagonals.
The mesh is produced by splitting each logical rectangular cell into a certain number of
tetrahedra. Orientation may be chosen as `:a`, `:b` (six tetrahedra per rectangular cell),
or `:ca` or `:cb` (five tetrahedra per rectangular cell). Keyword argument
`intbytes` controls the size of the integer indexes.
# Return
By convention the function returns an incidence relation (`connectivity`)
between the elements (`connectivity.left`) and the vertices
(`connectivity.right`). The geometry is stored as the attribute "geom" of the
vertices.
"""
function T4blockx(xs::Vector{T}, ys::Vector{T}, zs::Vector{T}, orientation::Symbol; intbytes = 8) where {T}
inttype = Int64
if intbytes == 4
inttype = Int32
end
nL = length(xs)-1;
nW = length(ys)-1;
nH = length(zs)-1;
nnodes = (nL+1)*(nW+1)*(nH+1);
ncells = 6*(nL)*(nW)*(nH);
xyzs = zeros(T, nnodes, 3);
conns = zeros(inttype, ncells, 4);
if (orientation==:a)
t4ia = inttype[1 8 5 6; 3 4 2 7; 7 2 6 8; 4 7 8 2; 2 1 6 8; 4 8 1 2];
t4ib = inttype[1 8 5 6; 3 4 2 7; 7 2 6 8; 4 7 8 2; 2 1 6 8; 4 8 1 2];
elseif (orientation==:b)
t4ia = inttype[2 7 5 6; 1 8 5 7; 1 3 4 8; 2 1 5 7; 1 2 3 7; 3 7 8 1];
        t4ib = inttype[2 7 5 6; 1 8 5 7; 1 3 4 8; 2 1 5 7; 1 2 3 7; 3 7 8 1];
elseif (orientation==:ca)
t4ia = inttype[8 4 7 5; 6 7 2 5; 3 4 2 7; 1 2 4 5; 7 4 2 5];
t4ib = inttype[7 3 6 8; 5 8 6 1; 2 3 1 6; 4 1 3 8; 6 3 1 8];
elseif (orientation==:cb)
t4ia = inttype[7 3 6 8; 5 8 6 1; 2 3 1 6; 4 1 3 8; 6 3 1 8];
t4ib = inttype[8 4 7 5; 6 7 2 5; 3 4 2 7; 1 2 4 5; 7 4 2 5];
else
error("Unknown orientation")
end
f=1;
for k=1:(nH+1)
for j=1:(nW+1)
for i=1:(nL+1)
xyzs[f, 1] = xs[i]
xyzs[f, 2] = ys[j]
xyzs[f, 3] = zs[k];
f=f+1;
end
end
end
function node_numbers(i, j, k, nL, nW, nH)
f=(k-1)*((nL+1)*(nW+1))+(j-1)*(nL+1)+i;
nn=[f (f+1) f+(nL+1)+1 f+(nL+1)];
return inttype[nn broadcast(+, nn, (nL+1)*(nW+1))];
end
gc=1;
for i=1:nL
for j=1:nW
for k=1:nH
nn=node_numbers(i, j, k, nL, nW, nH);
if (mod(sum( [i, j, k] ), 2)==0)
t4i =t4ib;
else
t4i =t4ia;
end
for r=1:size(t4i, 1)
for c1=1:size(t4i, 2)
conns[gc, c1] = nn[t4i[r, c1]];
end
gc=gc+1;
end
end
end
end
C = inttype.(conns[1:gc-1, :]);
N, TC = size(xyzs, 2), eltype(xyzs)
locs = VecAttrib([SVector{N, TC}(xyzs[i, :]) for i in 1:size(xyzs, 1)])
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(T4, size(C, 1), "elements")
return IncRel(elements, vertices, C)
end
"""
T4block(Length, Width, Height, nL, nW, nH, orientation = :a)
Generate a tetrahedral mesh of the 3D block.
Four-node tetrahedra in a regular arrangement, with uniform spacing between
the nodes, with a given orientation of the diagonals.
The mesh is produced by splitting each logical rectangular cell into six
tetrahedra. Range: <0, Length> x <0, Width> x <0, Height>.
Divided into elements: nL, nW, nH in the first, second, and
third direction (x, y, z).
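# Example
A minimal sketch of the use of this function:
```
conn = T4block(1.0, 2.0, 3.0, 2, 4, 6, :a)
```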
See also: T4blockx
"""
function T4block(Length, Width, Height, nL, nW, nH, orientation = :a; kwargs...)
return T4blockx(collect(linearspace(0.0, Length, nL+1)),
collect(linearspace(0.0, Width, nW+1)),
collect(linearspace(0.0, Height, nH+1)), orientation; kwargs...);
end
using DelimitedFiles
using WriteVTK
using MeshCore: AbsShapeDesc, P1, L2, T3, Q4, T4, H8, T6
using MeshCore: VecAttrib, @_check, datavaluetype
using MeshCore: shapedesc, nshapes, IncRel
using MeshCore: ShapeColl, ir_skeleton, ir_bbyfacets, nshapes
using LinearAlgebra: norm
"""
T3blockx(xs::Vector{T}, ys::Vector{T}, orientation::Symbol; intbytes = 8) where {T}
Generate a graded triangle mesh of a 2D block.
The mesh is produced by splitting each logical rectangular cell into two
triangles. Orientation may be chosen as `:a`, `:b`.
Keyword argument `intbytes` controls the size of the integer indexes.
# Return
By convention the function returns an incidence relation (`connectivity`)
between the elements (`connectivity.left`) and the vertices
(`connectivity.right`). The geometry is stored as the attribute "geom" of the
vertices.
"""
function T3blockx(xs::Vector{T}, ys::Vector{T}, orientation::Symbol; intbytes = 8) where {T}
if (orientation==:a) || (orientation==:b)
# nothing
else
error("Unknown orientation")
end
inttype = Int64
if intbytes == 4
inttype = Int32
end
nL = length(xs)-1;
nW = length(ys)-1;
nnodes = (nL+1)*(nW+1);
ncells = 2*(nL)*(nW);
xys = zeros(T, nnodes, 2);
conns = zeros(inttype, ncells, 3);
f=1;
for j in 1:(nW+1)
for i in 1:(nL+1)
xys[f, 1] = xs[i]
xys[f, 2] = ys[j]
f=f+1;
end
end
gc=1;
for i in 1:nL
for j in 1:nW
f=(j-1)*(nL+1)+i;
if (orientation==:a)
conns[gc,:] .= f, (f+1), f+(nL+1)
elseif (orientation==:b)
conns[gc,:] .= f, (f+1), f+(nL+1)+1
end
gc=gc+1;
if (orientation==:a)
conns[gc,:] .= (f+1), f+(nL+1)+1, f+(nL+1)
elseif (orientation==:b)
conns[gc,:] .= f, f+(nL+1)+1, f+(nL+1)
end
gc=gc+1;
end
end
C = inttype.(conns[1:gc-1, :]);
N, TC = size(xys, 2), eltype(xys)
locs = VecAttrib([SVector{N, TC}(xys[i, :]) for i in 1:size(xys, 1)])
vertices = ShapeColl(P1, length(locs), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(T3, size(C, 1), "elements")
return IncRel(elements, vertices, C)
end
"""
T3block(Length, Width, nL, nW, orientation = :a)
Generate a triangle mesh of a 2D block.
See also: T3blockx
"""
function T3block(Length, Width, nL, nW, orientation = :a; kwargs...)
return T3blockx(collect(linearspace(0.0, Length, nL+1)),
collect(linearspace(0.0, Width, nW+1)), orientation; kwargs...);
end
"""
T3toT6(ir)
Convert three node triangles (T3) to six node triangles (T6).
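# Example
A minimal sketch, refining a three-node triangle block into quadratic
triangles (`T3block` is from this package):
```
conn = T3block(1.0, 1.0, 4, 4)
conn6 = T3toT6(conn)
```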
"""
function T3toT6(ir)
@_check shapedesc(ir.left) == T3
locs = ir.right.attributes["geom"]
sk = ir_skeleton(ir) # skeleton consists of edges
bf = ir_bbyfacets(ir, sk) # edges bounding triangles
n = nshapes(ir.right) # number of vertices in the three-node triangle mesh
med = [idx+n for idx in 1:nshapes(sk.left)]
C = fill(zero(indextype(ir)), nshapes(ir.left), 6)
# Create array to hold the new coordinates
nx = fill(zero(eltype(datavaluetype(locs))), length(locs) + length(med), length(locs[1]))
for i in 1:length(locs) # copy in the old locations
nx[i, :] .= locs[i]
end
for i in 1:nshapes(ir.left)
C[i, 1:3] .= ir[i]
ec = abs.(bf[i]) # ignore the orientation
for (j, w) in enumerate((5, 6, 4))
en = med[ec[j]]
C[i, w] = en
ev = sk[ec[j]]
nx[en, :] = (locs[ev[1]] + locs[ev[2]]) / 2.0
end
end
locs = VecAttrib([SVector{size(nx, 2), eltype(locs[1])}(nx[i, :]) for i in 1:size(nx, 1)])
vertices = ShapeColl(P1, size(nx, 1), "vertices")
vertices.attributes["geom"] = locs
elements = ShapeColl(T6, size(C, 1), "elements")
return IncRel(elements, vertices, C)
end
"""
T6blockx(xs::Vector{T}, ys::Vector{T}, orientation::Symbol; intbytes = 8) where
{T}
Generate a graded quadratic triangle mesh of a 2D block.
The mesh is produced by splitting each logical rectangular cell into two
triangles. Orientation may be chosen as `:a`, `:b`.
Keyword argument `intbytes` controls the size of the integer indexes.
# Return
By convention the function returns an incidence relation (`connectivity`)
between the elements (`connectivity.left`) and the vertices
(`connectivity.right`). The geometry is stored as the attribute "geom" of the
vertices.
"""
function T6blockx(xs::Vector{T}, ys::Vector{T}, orientation::Symbol; intbytes = 8) where
{T}
ir = T3blockx(xs, ys, orientation; intbytes = intbytes)
return T3toT6(ir)
end
"""
T6block(Length, Width, nL, nW, orientation = :a)
Generate a quadratic triangle mesh of a 2D block.
See also: T6blockx
"""
function T6block(Length, Width, nL, nW, orientation = :a; kwargs...)
return T6blockx(collect(linearspace(0.0, Length, nL+1)),
collect(linearspace(0.0, Width, nW+1)), orientation; kwargs...);
end
"""
T6toT3(ir)
Convert six node triangles (T6) to three node triangles (T3).
Note: Unconnected vertices will be removed from the right-hand side shape
collection.
"""
function T6toT3(ir)
@_check shapedesc(ir.left) == T6
# The same vertices
vertices = ShapeColl(P1, nshapes(ir.right), "vertices")
locs = ir.right.attributes["geom"]
vertices.attributes["geom"] = locs
# Elements have only the three corner nodes
C = [SVector{3}(ir[idx][1:3]) for idx in 1:nrelations(ir)]
elements = ShapeColl(T3, length(C), "elements")
newir = IncRel(elements, vertices, C)
# Remove unconnected vertices.
ic = vconnected(newir)
nn = vnewnumbering(newir, ic)
return compactify(newir, nn)
end
using LinearAlgebra: norm
"""
linearspace(start, stop, N)
Generate linear space.
Generate a linear sequence of `N` numbers between `start` and `stop` (i.e. a
sequence of numbers with uniform intervals in between).
# Example
```
julia> linearspace(2.0, 3.0, 5)
5-element Array{Float64,1}:
 2.0
 2.25
 2.5
 2.75
 3.0
```
"""
function linearspace(start, stop, N)
return collect(range(Float64(start), stop = Float64(stop), length = Int64(N)))
end
"""
gradedspace(start, stop, N, strength=2)
Generate graded space.
Generate a graded sequence of `N` numbers between `start` and `stop`. The
separation of adjacent numbers increases from `start` to `stop` in proportion
given by the power coefficient `strength`.
# Example
```
julia> gradedspace(2.0, 3.0, 5)
5-element Array{Float64,1}:
2.0
2.0625
2.25
2.5625
3.0
```
"""
function gradedspace(start, stop, N, strength=2)
N = Int64(N)
x = range(0.0, stop = 1.0, length = N);
x = x.^strength
# for i = 1:strength
# x = cumsum(x);
# end
x = x./maximum(x);
out = Float64(start) .* (1.0 .- x) .+ Float64(stop) .* x;
end
using LinearAlgebra: norm, dot, cross
using Statistics: mean
using MeshCore: IncRel, indextype, nshapes
"""
selectnode(fens::FENodeSet; kwargs...)
Select nodes using some criterion.
# Arguments
- `v` = array of locations, one location per row
- `kwargs` = pairs of keyword argument/value
# Selection criteria:
## Distance from point
```
list = selectnode(fens.xyz, distance=1.0+0.1/2^nref, from=[0. 0.],
inflate=tolerance);
```
## Distance from a plane
```
candidates = selectnode(fens, plane = [0.0 0.0 1.0 0.0], thickness = h/1000)
```
The keyword `plane` defines the plane by its normal (the first two or
three numbers) and its distance from the origin (the last number). Nodes
are selected they lie on the plane, or near the plane within the
distance `thickness` from the plane. The normal is assumed to be of unit
length, if it isn't apply as such, it will be normalized internally.
## Nearest-to a point
Find the node nearest to the location given.
```
nh = selectnode(fens, nearestto = [R+Ro/2, 0.0, 0.0] )
```
"""
function _box_outputlist!(outputlist::Vector{IT}, abox::BT, sdim::IT, v::VT) where {IT, BT, VT}
# Helper functions
@inline inrange(rangelo,rangehi,x) = (rangelo <= x <= rangehi)
nn = 0
for j in eachindex(v)
matches = true
for i in 1:sdim
if !inrange(abox[2*i-1], abox[2*i], v[j][i])
matches = false; break
end
end
if matches
nn = nn + 1; outputlist[nn] = j;
end
end
return outputlist, nn
end
function _distance_outputlist!(outputlist::Vector{IT}, d, fromvalue, sdim::IT, v::VT) where {IT, VT}
# Helper functions
nn = 0
for j in eachindex(v)
if norm(fromvalue-v[j]) < d
nn = nn + 1; outputlist[nn] = j;
end
end
return outputlist, nn
end
function _plane_outputlist!(outputlist::Vector{IT}, distance, normal, t, sdim::IT, v::VT) where {IT, VT}
# Helper functions
nn = 0
for j in eachindex(v)
ad = dot(v[j], normal);
if abs(distance-ad)<t
nn = nn + 1; outputlist[nn] = j;
end
end
return outputlist, nn
end
function _nearestto_outputlist!(outputlist::Vector{IT}, nearestto, sdim::IT, v::VT) where {IT, VT}
distances = fill(0.0, length(v));
for j in eachindex(v)
distances[j] = norm(nearestto-v[j])
end
Mv,j = findmin(distances)
return [j], 1
end
"""
vselect(v::VT; kwargs...) where {VT}
Select locations (vertices) based on some criterion.
`VT` is an abstract array that returns the coordinates of a vertex given its
number.
## Selection criteria
### box
```
nLx = vselect(v, box = [0.0 Lx 0.0 0.0 0.0 0.0], inflate = Lx/1.0e5)
```
The keyword 'inflate' may be used to increase or decrease the extent of
the box (or the distance) to make sure some nodes which would be on the
boundary are either excluded or included.
### distance
```
list = vselect(v, distance=1.0+0.1/2^nref, from=[0. 0.], inflate=tolerance);
```
### plane
```
candidates = vselect(v, plane = [0.0 0.0 1.0 0.0], thickness = h/1000)
```
The keyword `plane` defines the plane by its normal (the first two or
three numbers) and its distance from the origin (the last number). Nodes
are selected when they lie on the plane, or near the plane within the
distance `thickness` from the plane. The normal is assumed to be of unit
length; if it isn't provided as such, it will be normalized internally.
### nearestto
Find the node nearest to the location given.
```
nh = vselect(v, nearestto = [R+Ro/2, 0.0, 0.0])
```
# Returns
The list of vertices that match the search criterion.
"""
function vselect(v::VT; kwargs...) where {VT}
# Extract arguments
box = nothing; distance = nothing; from = nothing; plane = nothing;
thickness = nothing; nearestto = nothing; inflate = 0.0;
for apair in pairs(kwargs)
sy, val = apair
if sy == :box
box = val
elseif sy == :distance
distance = val
elseif sy == :from
from = val
elseif sy == :plane
plane = val
elseif sy == :thickness
thickness = val
elseif sy == :nearestto
nearestto = val
elseif sy == :inflate
inflate = val
end
end
# Did we get an inflate value
inflatevalue = 0.0;
if inflate != nothing
inflatevalue = Float64(inflate);
end
# Initialize the output list
outputlist = zeros(Int64, length(v)); nn = 0;
sdim = length(v[1])
# Process the different options
if box != nothing
dim = Int64(round(length(box)/2.));
@assert dim == sdim "Dimension of box not matched to dimension of array of vertices"
abox = vec(box)
inflatebox!(abox, inflatevalue)
outputlist, nn = _box_outputlist!(outputlist, abox, sdim, v)
elseif distance != nothing
fromvalue = fill(0.0, sdim);
if from != nothing
fromvalue = from;
end
d = distance+inflatevalue;
outputlist, nn = _distance_outputlist!(outputlist, d, fromvalue, sdim, v)
elseif plane != nothing
normal = plane[1:end-1];
normal = vec(normal/norm(normal));
thicknessvalue = 0.0;
if thickness != nothing
thicknessvalue = thickness;
end
t = thicknessvalue+inflatevalue;
distance = plane[end];
outputlist, nn = _plane_outputlist!(outputlist, distance, normal, t, sdim, v)
elseif nearestto != nothing
nearestto = vec(nearestto);
outputlist, nn = _nearestto_outputlist!(outputlist, nearestto, sdim, v)
end
if (nn==0)
outputlist = Int64[];# nothing matched
else
outputlist = outputlist[1:nn];
end
return outputlist
end
"""
    connectedv(ir::IncRel)
Extract the numbers of the vertices connected by the shapes of the incidence
relation.
Extract the list of unique vertex numbers for the vertices that are connected
by the incidence relation `ir`. Note that it is assumed that all the shapes
on the left are of the same type (the same number of connected vertices per
shape).
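# Example
A minimal sketch, assuming `ir` is an element-to-vertex incidence relation:
```
vl = connectedv(ir) # unique vertex numbers referenced by the elements
```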
"""
function connectedv(ir::IncRel)
vl = fill(zero(indextype(ir)), 0)
for i in 1:nshapes(ir.left)
append!(vl, ir[i])
end
return unique(vl);
end
using Test
@time @testset "Lines" begin include("test_lines.jl") end
@time @testset "Quadrilaterals" begin include("test_quadrilaterals.jl") end
@time @testset "Search elements" begin include("test_eselect.jl") end
@time @testset "Boxes" begin include("test_boxes.jl") end
@time @testset "Search vertices" begin include("test_vselect.jl") end
@time @testset "Mesh import/export" begin include("test_io.jl") end
@time @testset "Triangles" begin include("test_triangles.jl") end
@time @testset "Tetrahedra" begin include("test_tetrahedra.jl") end
@time @testset "Modification" begin include("test_modification.jl") end
@time @testset "High-level" begin include("test_mesh.jl") end
module samplet3
function samplet3mesh()
xyz = [0.0 0.0
0.5 0.0
1.0 0.0
0.0 0.3333333333333333
0.5 0.3333333333333333
1.0 0.3333333333333333
0.0 0.6666666666666666
0.5 0.6666666666666666
1.0 0.6666666666666666
0.0 1.0
0.5 1.0
1.0 1.0]
c = [1 2 5
1 5 4
4 5 8
4 8 7
7 8 11
7 11 10
2 3 6
2 6 5
5 6 9
5 9 8
8 9 12
8 12 11
]
return xyz, c
end
end
module samplet4
function samplet4mesh()
xyz = [
0.0 0.0 0.0
1.0 0.0 0.0
2.0 0.0 0.0
3.0 0.0 0.0
0.0 2.0 0.0
1.0 2.0 0.0
2.0 2.0 0.0
3.0 2.0 0.0
0.0 4.0 0.0
1.0 4.0 0.0
2.0 4.0 0.0
3.0 4.0 0.0
0.0 6.0 0.0
1.0 6.0 0.0
2.0 6.0 0.0
3.0 6.0 0.0
0.0 8.0 0.0
1.0 8.0 0.0
2.0 8.0 0.0
3.0 8.0 0.0
0.0 0.0 2.5
1.0 0.0 2.5
2.0 0.0 2.5
3.0 0.0 2.5
0.0 2.0 2.5
1.0 2.0 2.5
2.0 2.0 2.5
3.0 2.0 2.5
0.0 4.0 2.5
1.0 4.0 2.5
2.0 4.0 2.5
3.0 4.0 2.5
0.0 6.0 2.5
1.0 6.0 2.5
2.0 6.0 2.5
3.0 6.0 2.5
0.0 8.0 2.5
1.0 8.0 2.5
2.0 8.0 2.5
3.0 8.0 2.5
0.0 0.0 5.0
1.0 0.0 5.0
2.0 0.0 5.0
3.0 0.0 5.0
0.0 2.0 5.0
1.0 2.0 5.0
2.0 2.0 5.0
3.0 2.0 5.0
0.0 4.0 5.0
1.0 4.0 5.0
2.0 4.0 5.0
3.0 4.0 5.0
0.0 6.0 5.0
1.0 6.0 5.0
2.0 6.0 5.0
3.0 6.0 5.0
0.0 8.0 5.0
1.0 8.0 5.0
2.0 8.0 5.0
3.0 8.0 5.0
]
c = [1 25 21 22
6 5 2 26
26 2 22 25
5 26 25 2
2 1 22 25
5 25 1 2
21 45 41 42
26 25 22 46
46 22 42 45
25 46 45 22
22 21 42 45
25 45 21 22
5 29 25 26
10 9 6 30
30 6 26 29
9 30 29 6
6 5 26 29
9 29 5 6
25 49 45 46
30 29 26 50
50 26 46 49
29 50 49 26
26 25 46 49
29 49 25 26
9 33 29 30
14 13 10 34
34 10 30 33
13 34 33 10
10 9 30 33
13 33 9 10
29 53 49 50
34 33 30 54
54 30 50 53
33 54 53 30
30 29 50 53
33 53 29 30
13 37 33 34
18 17 14 38
38 14 34 37
17 38 37 14
14 13 34 37
17 37 13 14
33 57 53 54
38 37 34 58
58 34 54 57
37 58 57 34
34 33 54 57
37 57 33 34
2 26 22 23
7 6 3 27
27 3 23 26
6 27 26 3
3 2 23 26
6 26 2 3
22 46 42 43
27 26 23 47
47 23 43 46
26 47 46 23
23 22 43 46
26 46 22 23
6 30 26 27
11 10 7 31
31 7 27 30
10 31 30 7
7 6 27 30
10 30 6 7
26 50 46 47
31 30 27 51
51 27 47 50
30 51 50 27
27 26 47 50
30 50 26 27
10 34 30 31
15 14 11 35
35 11 31 34
14 35 34 11
11 10 31 34
14 34 10 11
30 54 50 51
35 34 31 55
55 31 51 54
34 55 54 31
31 30 51 54
34 54 30 31
14 38 34 35
19 18 15 39
39 15 35 38
18 39 38 15
15 14 35 38
18 38 14 15
34 58 54 55
39 38 35 59
59 35 55 58
38 59 58 35
35 34 55 58
38 58 34 35
3 27 23 24
8 7 4 28
28 4 24 27
7 28 27 4
4 3 24 27
7 27 3 4
23 47 43 44
28 27 24 48
48 24 44 47
27 48 47 24
24 23 44 47
27 47 23 24
7 31 27 28
12 11 8 32
32 8 28 31
11 32 31 8
8 7 28 31
11 31 7 8
27 51 47 48
32 31 28 52
52 28 48 51
31 52 51 28
28 27 48 51
31 51 27 28
11 35 31 32
16 15 12 36
36 12 32 35
15 36 35 12
12 11 32 35
15 35 11 12
31 55 51 52
36 35 32 56
56 32 52 55
35 56 55 32
32 31 52 55
35 55 31 32
15 39 35 36
20 19 16 40
40 16 36 39
19 40 39 16
16 15 36 39
19 39 15 16
35 59 55 56
40 39 36 60
60 36 56 59
39 60 59 36
36 35 56 59
39 59 35 36
]
return xyz, c
end
end
module mbox1
using MeshSteward: initbox, updatebox!, inflatebox!, inbox, boxesoverlap
using Test
function test()
box = Float64[]
box = initbox([0.0])
@test isapprox(box, [0.0, 0.0])
box = updatebox!(box, [-1.0])
@test isapprox(box, [-1.0, 0.0])
box = Float64[]
box = initbox([1.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test isapprox(box, [0.99, 1.01])
box = initbox([0.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test inbox(box, [-0.003]) == true
box = initbox([2.0])
updatebox!(box, [1.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test inbox(box, [rand()+1.0]) == true
box = Float64[]
box = initbox([0.0; 1.0])
@test isapprox(box, [0.0, 0.0, 1.0, 1.0])
box = updatebox!(box, [1.0; -1.0])
@test isapprox(box, [0.0, 1.0, -1.0, 1.0])
box = Float64[]
box = initbox([0.0; 1.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test isapprox(box, [-0.01, 0.01, 0.99, 1.01])
box = initbox([0.0; 1.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test inbox(box, [-0.003; 0.995]) == true
box = initbox([0.0; 0.0])
updatebox!(box, [1.0; 1.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test inbox(box, [-0.003; 0.995]) == true
@test inbox(box, [rand(); rand()]) == true
@test inbox(box, [0.5; 0.5]) == true
@test inbox(box, [0.0-inflate; 0.0-inflate]) == true
@test inbox(box, [0.0-inflate; 0.0-2*inflate]) == false
@test inbox(box, [0.0+inflate; 1.0+inflate]) == true
box = initbox([0.0; 0.0; 0.0].-1.0)
updatebox!(box, [1.0; 1.0; 1.0])
inflate = 0.01
box = inflatebox!(box, inflate)
@test inbox(box, [-0.003; 0.995; 1.0]) == true
@test inbox(box, [rand(); rand(); rand()]) == true
@test inbox(box, [-rand(); rand(); rand()]) == true
@test inbox(box, [rand(); -rand(); rand()]) == true
@test inbox(box, [rand(); rand(); -rand()]) == true
@test inbox(box, [0.0-inflate; 0.0-inflate; 1.0]) == true
@test inbox(box, [0.0-inflate; 0.0-inflate; -1.0]) == true
return true
end
end
using .mbox1
mbox1.test()
module mbox2
using MeshSteward: initbox, updatebox!, inflatebox!, inbox, boxesoverlap, boundingbox
using LinearAlgebra
using Test
function test()
a = [0.431345 0.611088 0.913161;
0.913581 0.459229 0.82186;
0.999429 0.965389 0.571139;
0.390146 0.711732 0.302495;
0.873037 0.248077 0.51149;
0.999928 0.832524 0.93455]
b1 = boundingbox(a)
@test norm(b1 - [0.390146, 0.999928, 0.248077, 0.965389, 0.302495, 0.93455]) < 1.0e-4
b2 = updatebox!(b1, a)
@test norm(b1 - b2) < 1.0e-4
b2 = initbox(a)
@test norm(b1 - b2) < 1.0e-4
c = [-1.0, 3.0, -0.5]
b3 = updatebox!(b1, c)
# # println("$(b3)")
@test norm(b3 - [-1.0, 0.999928, 0.248077, 3.0, -0.5, 0.93455]) < 1.0e-4
x = [0.25 1.1 -0.3]
@test inbox(b3, x)
@test inbox(b3, c)
@test inbox(b3, a[2, :])
b4 = boundingbox(-a)
# # println("$(b3)")
# # println("$(b4)")
# # println("$(boxesoverlap(b3, b4))")
@test !boxesoverlap(b3, b4)
b5 = updatebox!(b3, [0.0 -0.4 0.0])
# # println("$(b5)")
# # println("$(boxesoverlap(b5, b4))")
@test boxesoverlap(b5, b4)
end
end
using .mbox2
mbox2.test()
module mboxt3
using MeshSteward: intersectboxes
using LinearAlgebra
using Test
function test()
b1 = [-1.0, 2.0, 0.5, 2.5]
b2 = [-0.7, 1.2, -0.5, 3.5]
ib = intersectboxes(b1, b2)
@test isapprox(ib, [-0.7, 1.2, 0.5, 2.5])
b1 = [-1.0, 2.0, 0.5, 2.5, 77.1, 1000.0]
b2 = [-0.7, 1.2, -0.5, 3.5, 1077.1, 10000.0]
ib = intersectboxes(b1, b2)
@test isapprox(ib, Float64[])
end
end
using .mboxt3
mboxt3.test()
include("samplet3.jl")
module mt4topo1e1
using StaticArrays
using MeshCore: P1, T3, ShapeColl, manifdim, nvertices, nshapes, indextype
using MeshCore: ir_skeleton, VecAttrib, nrelations
using MeshCore: IncRel, ir_transpose, nrelations, nentities, ir_boundary
using MeshSteward: connectedv, eselect
using ..samplet3: samplet3mesh
using Test
function test()
xyz, cc = samplet3mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tris = ShapeColl(T3, size(cc, 1))
ir = IncRel(tris, vrts, cc)
vl = connectedv(ir)
@test length(vl) == size(xyz, 1)
bir = ir_boundary(ir)
bir.right.attributes["geom"] = locs
el = eselect(bir; box = [0.0, 1.0, 1.0, 1.0], inflate = 0.01)
for i in el
vl = bir[i]
for j in vl
@test isapprox(locs[j][2], 1.0)
end
end
el = eselect(bir; box = [0.0, 0.0, 0.0, 1.0], inflate = 0.01)
for i in el
vl = bir[i]
for j in vl
@test isapprox(locs[j][1], 0.0)
end
end
true
end
end
using .mt4topo1e1
mt4topo1e1.test()
module mt4topo1e2
using StaticArrays
using MeshCore: P1, T3, ShapeColl, manifdim, nvertices, nshapes, indextype
using MeshCore: ir_skeleton, VecAttrib, nrelations
using MeshCore: IncRel, transpose, nrelations, nentities, ir_boundary
using MeshSteward: connectedv, eselect
using ..samplet3: samplet3mesh
using Test
function test()
xyz, cc = samplet3mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tris = ShapeColl(T3, size(cc, 1))
ir = IncRel(tris, vrts, cc)
vl = connectedv(ir)
@test length(vl) == size(xyz, 1)
bir = ir_boundary(ir)
bir.right.attributes["geom"] = locs
bir.left.attributes["label"] = VecAttrib([1 for i in 1:nshapes(bir.left)])
el = eselect(bir; box = [0.0, 1.0, 1.0, 1.0], inflate = 0.01)
for i in el
vl = bir[i]
for j in vl
@test isapprox(locs[j][2], 1.0)
end
end
for i in 1:length(el)
bir.left.attributes["label"][el[i]] = 2
end
el2 = eselect(bir; label = 2)
@test isapprox(el, el2)
el = eselect(bir; box = [0.0, 0.0, 0.0, 1.0], inflate = 0.01)
for i in el
vl = bir[i]
for j in vl
@test isapprox(locs[j][1], 0.0)
end
end
for i in 1:length(el)
bir.left.attributes["label"][el[i]] = 3
end
el2 = eselect(bir; label = 3)
@test isapprox(el, el2)
true
end
end
using .mt4topo1e2
mt4topo1e2.test()
module mmeses1
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_skeleton, ir_boundary, ir_subset
using MeshSteward: import_NASTRAN, vtkwrite, eselect
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vtkwrite("trunc_cyl_shell_0-search", connectivity)
try rm("trunc_cyl_shell_0-search.vtu"); catch end
bir = ir_boundary(connectivity)
el = eselect(bir; facing = true, direction = x -> [0.0, 0.0, 1.0], dotmin = 0.99)
@test length(el) == 44
vtkwrite("trunc_cyl_shell_0-search-z=top", ir_subset(bir, el))
try rm("trunc_cyl_shell_0-search-z=top.vtu"); catch end
el = eselect(bir; facing = true, direction = x -> [-x[1], -x[2], 0.0], dotmin = 0.99)
@test length(el) == 304
vtkwrite("trunc_cyl_shell_0-search-interior", ir_subset(bir, el))
try rm("trunc_cyl_shell_0-search-interior.vtu"); catch end
true
end
end
using .mmeses1
mmeses1.test()
module mmeses2
using StaticArrays
using MeshCore: P1, T3, ShapeColl, manifdim, nvertices, nshapes, indextype, IncRel
using MeshCore: attribute, nrelations, ir_skeleton, ir_boundary, ir_subset, VecAttrib
using MeshSteward: import_NASTRAN, vtkwrite, eselect, connectedv
using ..samplet3: samplet3mesh
using Test
function test()
xyz, cc = samplet3mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tris = ShapeColl(T3, size(cc, 1))
ir = IncRel(tris, vrts, cc)
vl = connectedv(ir)
@test length(vl) == size(xyz, 1)
bir = ir_boundary(ir)
bir.right.attributes["geom"] = locs
el = eselect(bir; facing = true, direction = x -> [-1.0, 0.0], dotmin = 0.99)
@test length(el) == 3
vtkwrite("samplet3mesh-search", ir)
vtkwrite("samplet3mesh-search-x=0_0", ir_subset(bir, el))
try rm("samplet3mesh-search" * ".vtu"); catch end
try rm("samplet3mesh-search-x=0_0" * ".vtu"); catch end
true
end
end
using .mmeses2
mmeses2.test()
module mmeshio1
using StaticArrays
using MeshCore: nshapes
using MeshSteward: import_NASTRAN
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
true
end
end
using .mmeshio1
mmeshio1.test()
module mmeshio2
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_skeleton
using MeshSteward: import_NASTRAN, vtkwrite
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vtkwrite("trunc_cyl_shell_0", connectivity)
try rm("trunc_cyl_shell_0.vtu"); catch end
ir00 = ir_skeleton(ir_skeleton(ir_skeleton(connectivity)))
@test (nshapes(ir00.right), nshapes(ir00.left)) == (376, 376)
vtkwrite("trunc_cyl_shell_0-0-skeleton", ir00)
try rm("trunc_cyl_shell_0-0-skeleton" * ".vtu"); catch end
true
end
end
using .mmeshio2
mmeshio2.test()
module mmeshio5
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_skeleton
using MeshSteward: import_NASTRAN, vtkwrite
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vtkwrite("trunc_cyl_shell_0", connectivity)
try rm("trunc_cyl_shell_0" * ".vtu"); catch end
ir20 = ir_skeleton(connectivity)
@test (nshapes(ir20.right), nshapes(ir20.left)) == (376, 2368)
vtkwrite("trunc_cyl_shell_0-2-skeleton", ir20)
try rm("trunc_cyl_shell_0-2-skeleton" * ".vtu"); catch end
true
end
end
using .mmeshio5
mmeshio5.test()
module mmeshio6
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary
using MeshSteward: import_NASTRAN, vtkwrite
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vtkwrite("trunc_cyl_shell_0", connectivity)
try rm("trunc_cyl_shell_0" * ".vtu"); catch end
ir20 = ir_boundary(connectivity)
@test (nshapes(ir20.right), nshapes(ir20.left)) == (376, 752)
vtkwrite("trunc_cyl_shell_0-boundary", ir20)
try rm("trunc_cyl_shell_0-boundary" * ".vtu"); catch end
true
end
end
using .mmeshio6
mmeshio6.test()
module mmeshio7
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary
using MeshSteward: import_NASTRAN, vtkwrite
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vtkwrite("trunc_cyl_shell_0", connectivity)
try rm("trunc_cyl_shell_0" * ".vtu"); catch end
ir20 = ir_boundary(connectivity)
@test (nshapes(ir20.right), nshapes(ir20.left)) == (376, 752)
vtkwrite("trunc_cyl_shell_0-boundary", ir20)
try rm("trunc_cyl_shell_0-boundary" * ".vtu"); catch end
true
end
end
using .mmeshio7
mmeshio7.test()
module mmeshio8
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary
using MeshSteward: import_MESH, vtkwrite, export_MESH
using Test
function test()
connectivities = import_MESH(joinpath("data", "q4-4-2.mesh"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (15, 8)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vtkwrite("q4-4-2", connectivity)
try rm("q4-4-2" * ".vtu"); catch end
ir20 = ir_boundary(connectivity)
@test (nshapes(ir20.right), nshapes(ir20.left)) == (15, 12)
vtkwrite("q4-4-2-boundary", ir20)
try rm("q4-4-2-boundary" * ".vtu"); catch end
# @show connectivity
@test export_MESH("q4-4-2-export.mesh", connectivity)
try rm("q4-4-2-export" * ".mesh"); catch end
try rm("q4-4-2-export-xyz" * ".dat"); catch end
try rm("q4-4-2-export-conn" * ".dat"); catch end
true
end
end
using .mmeshio8
mmeshio8.test()
module mmeshio9
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary
using MeshSteward: import_NASTRAN, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
beforeexport = (nshapes(connectivity.right), nshapes(connectivity.left))
@test export_MESH("test.mesh", connectivity)
connectivities = import_MESH("test.mesh")
    connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == beforeexport
vertices = connectivity.right
geom2 = attribute(vertices, "geom")
    s = sum([norm(geom2[i] - geom[i]) for i in 1:length(geom)])
@test s <= 1.0e-9
try rm("test" * ".mesh"); catch end
try rm("test-*" * ".dat"); catch end
true
end
end
using .mmeshio9
mmeshio9.test()
module mmeshio10
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary
using MeshSteward: import_ABAQUS, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_ABAQUS(joinpath("data", "block-w-hole.inp"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test connectivity.left.name == "Q4"
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (481, 430)
vertices = connectivity.right
geom = attribute(vertices, "geom")
beforeexport = (nshapes(connectivity.right), nshapes(connectivity.left))
@test export_MESH("test.mesh", connectivity)
connectivities = import_MESH("test.mesh")
    connectivity = connectivities[1]
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == beforeexport
vertices = connectivity.right
geom2 = attribute(vertices, "geom")
    s = sum([norm(geom2[i] - geom[i]) for i in 1:length(geom)])
@test s <= 1.0e-9
try rm("test" * ".mesh"); catch end
try rm("test" * "-xyz" * ".dat"); catch end
try rm("test" * "-conn" * ".dat"); catch end
true
end
end
using .mmeshio10
mmeshio10.test()
module mmeshio11
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary, VecAttrib
using MeshSteward: import_ABAQUS, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_ABAQUS(joinpath("data", "block-w-hole.inp"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test connectivity.left.name == "Q4"
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (481, 430)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vertices.attributes["distance"] = VecAttrib([norm(geom[i]) for i in 1:length(geom)])
vtkwrite("block-w-hole-distance", connectivity, [(name = "distance",)])
try rm("block-w-hole-distance" * ".vtu"); catch end
true
end
end
using .mmeshio11
mmeshio11.test()
module mmeshio12
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_boundary, VecAttrib
using MeshSteward: import_ABAQUS, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_ABAQUS(joinpath("data", "block-w-hole.inp"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test connectivity.left.name == "Q4"
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (481, 430)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vertices.attributes["dist"] = VecAttrib([norm(geom[i]) for i in 1:length(geom)])
vertices.attributes["x"] = VecAttrib([geom[i][1] for i in 1:length(geom)])
vtkwrite("block-w-hole-distance", connectivity, [(name = "dist",), (name = "x",)])
try rm("block-w-hole-distance" * ".vtu"); catch end
true
end
end
using .mmeshio12
mmeshio12.test()
module mmeshio13
using StaticArrays
using MeshCore: nshapes, nrelations
using MeshCore: attribute, nrelations, ir_boundary, VecAttrib
using MeshSteward: import_ABAQUS, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_ABAQUS(joinpath("data", "block-w-hole.inp"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test connectivity.left.name == "Q4"
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (481, 430)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vertices.attributes["dist"] = VecAttrib([norm(geom[i]) for i in 1:length(geom)])
vertices.attributes["x"] = VecAttrib([geom[i][1] for i in 1:length(geom)])
connectivity.left.attributes["invdist"] = VecAttrib([1.0/norm(sum(geom[connectivity[i]])) for i in 1:nrelations(connectivity)])
vtkwrite("block-w-hole-mixed", connectivity, [(name = "dist",), (name = "x",), (name = "invdist",)])
try rm("block-w-hole-mixed" * ".vtu"); catch end
true
end
end
using .mmeshio13
mmeshio13.test()
module mmeshio14
using StaticArrays
using MeshCore: nshapes, nrelations
using MeshCore: attribute, nrelations, ir_boundary, VecAttrib
using MeshSteward: import_ABAQUS, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_ABAQUS(joinpath("data", "block-w-hole.inp"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test connectivity.left.name == "Q4"
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (481, 430)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vertices.attributes["dist"] = VecAttrib([norm(geom[i]) for i in 1:length(geom)])
vertices.attributes["x"] = VecAttrib([geom[i][1] for i in 1:length(geom)])
vertices.attributes["v"] = VecAttrib([[geom[i][2], -geom[i][1], 0.0] for i in 1:length(geom)])
connectivity.left.attributes["invdist"] = VecAttrib([1.0/norm(sum(geom[connectivity[i]])) for i in 1:nrelations(connectivity)])
vtkwrite("block-w-hole-mixed", connectivity, [(name = "dist",), (name = "x",), (name = "invdist",), (name = "v",)])
try rm("block-w-hole-mixed" * ".vtu"); catch end
true
end
end
using .mmeshio14
mmeshio14.test()
module mmeshio15
using StaticArrays
using MeshCore: nshapes, nrelations
using MeshCore: attribute, nrelations, ir_boundary, VecAttrib
using MeshSteward: import_ABAQUS, vtkwrite, export_MESH, import_MESH
using LinearAlgebra
using Test
function test()
connectivities = import_ABAQUS(joinpath("data", "block-w-hole.inp"))
@test length(connectivities) == 1
connectivity = connectivities[1]
@test connectivity.left.name == "Q4"
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (481, 430)
vertices = connectivity.right
geom = attribute(vertices, "geom")
vertices.attributes["dist"] = VecAttrib([norm(geom[i]) for i in 1:length(geom)])
vertices.attributes["x"] = VecAttrib([geom[i][1] for i in 1:length(geom)])
vertices.attributes["v"] = VecAttrib([[geom[i][2], -geom[i][1]] for i in 1:length(geom)])
connectivity.left.attributes["invdist"] = VecAttrib([1.0/norm(sum(geom[connectivity[i]])) for i in 1:nrelations(connectivity)])
vtkwrite("block-w-hole-mixed", connectivity, [(name = "dist",), (name = "x",), (name = "invdist",), (name = "v", allxyz = true)])
try rm("block-w-hole-mixed" * ".vtu"); catch end
true
end
end
using .mmeshio15
mmeshio15.test()
module ml2gen1
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: L2blockx
using Test
function test()
connectivity = L2blockx([0.0, 1.0, 3.0])
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (3, 2)
vtkwrite("ml2gen1a", connectivity)
try rm("ml2gen1a.vtu"); catch end
true
end
end
using .ml2gen1
ml2gen1.test()
module ml2gen1b
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: L2blockx
using Test
function test()
connectivity = L2blockx([0.0, 1.0, 3.0])
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (3, 2)
vtkwrite("ml2gen1b", connectivity)
try rm("ml2gen1b.vtu"); catch end
true
end
end
using .ml2gen1b
ml2gen1b.test()
module ml2gen2
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: L2block
using Test
function test()
connectivity = L2block(1.0, 7)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (8, 7)
vtkwrite("ml2gen2", connectivity)
try rm("ml2gen2.vtu"); catch end
true
end
end
using .ml2gen2
ml2gen2.test()
module ml2gen3
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: L2blockx, gradedspace
using Test
function test()
connectivity = L2blockx(gradedspace(0.0, 5.0, 7, 2))
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (7, 6)
vtkwrite("ml2gen3", connectivity)
try rm("ml2gen3.vtu"); catch end
export_MESH("ml2gen3", connectivity)
try rm("ml2gen3.mesh"); catch end
try rm("ml2gen3-xyz.dat"); catch end
try rm("ml2gen3-conn.dat"); catch end
true
end
end
using .ml2gen3
ml2gen3.test()
module osamplet4
using StaticArrays
function osamplet4mesh()
xyz = [
0.0 0.0 0.0
1.0 0.0 0.0
2.0 0.0 0.0
3.0 0.0 0.0
0.0 2.0 0.0
1.0 2.0 0.0
2.0 2.0 0.0
3.0 2.0 0.0
0.0 4.0 0.0
1.0 4.0 0.0
2.0 4.0 0.0
3.0 4.0 0.0
0.0 6.0 0.0
1.0 6.0 0.0
2.0 6.0 0.0
3.0 6.0 0.0
0.0 8.0 0.0
1.0 8.0 0.0
2.0 8.0 0.0
3.0 8.0 0.0
0.0 0.0 2.5
1.0 0.0 2.5
2.0 0.0 2.5
3.0 0.0 2.5
0.0 2.0 2.5
1.0 2.0 2.5
2.0 2.0 2.5
3.0 2.0 2.5
0.0 4.0 2.5
1.0 4.0 2.5
2.0 4.0 2.5
3.0 4.0 2.5
0.0 6.0 2.5
1.0 6.0 2.5
2.0 6.0 2.5
3.0 6.0 2.5
0.0 8.0 2.5
1.0 8.0 2.5
2.0 8.0 2.5
3.0 8.0 2.5
0.0 0.0 5.0
1.0 0.0 5.0
2.0 0.0 5.0
3.0 0.0 5.0
0.0 2.0 5.0
1.0 2.0 5.0
2.0 2.0 5.0
3.0 2.0 5.0
0.0 4.0 5.0
1.0 4.0 5.0
2.0 4.0 5.0
3.0 4.0 5.0
0.0 6.0 5.0
1.0 6.0 5.0
2.0 6.0 5.0
3.0 6.0 5.0
0.0 8.0 5.0
1.0 8.0 5.0
2.0 8.0 5.0
3.0 8.0 5.0
]
c = [1 25 21 22
6 5 2 26
26 2 22 25
5 26 25 2
2 1 22 25
5 25 1 2
21 45 41 42
26 25 22 46
46 22 42 45
25 46 45 22
22 21 42 45
25 45 21 22
5 29 25 26
10 9 6 30
30 6 26 29
9 30 29 6
6 5 26 29
9 29 5 6
25 49 45 46
30 29 26 50
50 26 46 49
29 50 49 26
26 25 46 49
29 49 25 26
9 33 29 30
14 13 10 34
34 10 30 33
13 34 33 10
10 9 30 33
13 33 9 10
29 53 49 50
34 33 30 54
54 30 50 53
33 54 53 30
30 29 50 53
33 53 29 30
13 37 33 34
18 17 14 38
38 14 34 37
17 38 37 14
14 13 34 37
17 37 13 14
33 57 53 54
38 37 34 58
58 34 54 57
37 58 57 34
34 33 54 57
37 57 33 34
2 26 22 23
7 6 3 27
27 3 23 26
6 27 26 3
3 2 23 26
6 26 2 3
22 46 42 43
27 26 23 47
47 23 43 46
26 47 46 23
23 22 43 46
26 46 22 23
6 30 26 27
11 10 7 31
31 7 27 30
10 31 30 7
7 6 27 30
10 30 6 7
26 50 46 47
31 30 27 51
51 27 47 50
30 51 50 27
27 26 47 50
30 50 26 27
10 34 30 31
15 14 11 35
35 11 31 34
14 35 34 11
11 10 31 34
14 34 10 11
30 54 50 51
35 34 31 55
55 31 51 54
34 55 54 31
31 30 51 54
34 54 30 31
14 38 34 35
19 18 15 39
39 15 35 38
18 39 38 15
15 14 35 38
18 38 14 15
34 58 54 55
39 38 35 59
59 35 55 58
38 59 58 35
35 34 55 58
38 58 34 35
3 27 23 24
8 7 4 28
28 4 24 27
7 28 27 4
4 3 24 27
7 27 3 4
23 47 43 44
28 27 24 48
48 24 44 47
27 48 47 24
24 23 44 47
27 47 23 24
7 31 27 28
12 11 8 32
32 8 28 31
11 32 31 8
8 7 28 31
11 31 7 8
27 51 47 48
32 31 28 52
52 28 48 51
31 52 51 28
28 27 48 51
31 51 27 28
11 35 31 32
16 15 12 36
36 12 32 35
15 36 35 12
12 11 32 35
15 35 11 12
31 55 51 52
36 35 32 56
56 32 52 55
35 56 55 32
32 31 52 55
35 55 31 32
15 39 35 36
20 19 16 40
40 16 36 39
19 40 39 16
16 15 36 39
19 39 15 16
35 59 55 56
40 39 36 60
60 36 56 59
39 60 59 36
36 35 56 59
39 59 35 36
]
return xyz, c
end
end
module mt4mesh1
using StaticArrays
using MeshCore: P1, T4, ShapeColl, manifdim, nvertices, nridges, nshapes
using MeshCore: ir_bbyridges, ir_skeleton, ir_bbyfacets, nshifts, _sense
using MeshCore: IncRel, transpose, nrelations, nentities
using MeshCore: VecAttrib, attribute, ir_code
using MeshSteward: Mesh, attach!, increl, basecode
using ..osamplet4: osamplet4mesh
using Test
function test()
xyz, cc = osamplet4mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
vrts.attributes["geom"] = locs
tets = ShapeColl(T4, size(cc, 1))
ir30 = IncRel(tets, vrts, cc)
mesh = Mesh()
attach!(mesh, ir30)
irc = basecode(mesh)
@test irc == (3, 0)
@test increl(mesh, (3, 0)) == ir30
ir = increl(mesh, (3, 0))
locs = attribute(ir.right, "geom")
@test locs[nshapes(ir.right)] == [3.0, 8.0, 5.0]
true
end
end
using .mt4mesh1
mt4mesh1.test()
module mmeshio2a
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_skeleton
using MeshSteward: import_NASTRAN, vtkwrite
using MeshSteward: Mesh, attach!, increl, basecode
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
irc = basecode(mesh)
connectivity = increl(mesh, irc)
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
vertices = connectivity.right
geom = attribute(vertices, "geom")
# @show typeof(geom)
# @show typeof(geom.val)
vtkwrite("trunc_cyl_shell_0", connectivity)
try rm("trunc_cyl_shell_0" * ".vtu"); catch end
connectivity0 = ir_skeleton(ir_skeleton(ir_skeleton(connectivity)))
@test (nshapes(connectivity0.right), nshapes(connectivity0.left)) == (376, 376)
vtkwrite("trunc_cyl_shell_0-boundary-skeleton", connectivity0)
try rm("trunc_cyl_shell_0-boundary-skeleton" * ".vtu"); catch end
true
end
end
using .mmeshio2a
mmeshio2a.test()
module mmeshio3
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_skeleton
using MeshSteward: import_NASTRAN, vtkwrite, export_MESH
using MeshSteward: Mesh, attach!, increl, load, basecode, nspacedims
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
connectivity = increl(mesh, (3, 0))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
export_MESH("trunc_cyl_shell_0.mesh", connectivity)
mesh2 = Mesh("new mesh")
mesh2 = load(mesh2, "trunc_cyl_shell_0.mesh")
connectivity = increl(mesh2, (3, 0))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (376, 996)
irc = basecode(mesh)
@test irc == (3, 0)
@test nspacedims(mesh2) == 3
try rm("trunc_cyl_shell_0" * ".mesh"); catch end
try rm("trunc_cyl_shell_0-xyz" * ".dat"); catch end
try rm("trunc_cyl_shell_0-conn" * ".dat"); catch end
true
end
end
using .mmeshio3
mmeshio3.test()
module mmeshio4
using StaticArrays
using MeshCore: nshapes
using MeshCore: attribute, nrelations, ir_skeleton
using MeshSteward: import_NASTRAN, vtkwrite, export_MESH
using MeshSteward: Mesh, attach!, increl, load, basecode, nspacedims, save
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
save(mesh, "trunc_cyl_shell_0")
mesh2 = Mesh()
mesh2 = load(mesh2, "trunc_cyl_shell_0")
@test nspacedims(mesh) == nspacedims(mesh2)
try rm("trunc_cyl_shell_0" * ".mesh"); catch end
try rm("trunc_cyl_shell_0-xyz" * ".dat"); catch end
try rm("trunc_cyl_shell_0-conn" * ".dat"); catch end
true
end
end
using .mmeshio4
mmeshio4.test()
module mmfind1
using MeshSteward: boundingbox
using MeshSteward: Mesh, attach!, increl, load, basecode, nspacedims, save, baseincrel
using MeshSteward: vselect
using MeshSteward: import_NASTRAN, vtkwrite
using MeshCore: nshapes
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
mesh = Mesh()
attach!(mesh, connectivities[1])
# vtkwrite("trunc_cyl_shell_0-elements", baseincrel(mesh))
selectedv = vselect(mesh, box = boundingbox([-Inf -Inf 0.5; Inf Inf 0.5]), inflate = 0.001)
# vtkwrite("trunc_cyl_shell_0-selected-vertices", selectedv)
@test nshapes(selectedv.left) == 44
end
end
using .mmfind1
mmfind1.test()
module mmbd1
using MeshSteward: import_NASTRAN, vtkwrite
using MeshSteward: Mesh, attach!, boundary
using MeshSteward: summary
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
bir = boundary(mesh)
s = summary(mesh)
vtkwrite("trunc_cyl_shell_0-boundary", bir)
try rm("trunc_cyl_shell_0-boundary" * ".vtu"); catch end
end
end
using .mmbd1
mmbd1.test()
module mmvtx1
using MeshSteward: import_NASTRAN, vtkwrite
using MeshSteward: Mesh, attach!, vertices
using MeshSteward: summary
using MeshCore: nshapes
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
vir = vertices(mesh)
@test nshapes(vir.right) == nshapes(vir.left)
@test nshapes(vir.right) == nshapes(connectivity.right)
# s = summary(mesh)
# vtkwrite("trunc_cyl_shell_0-vertices", vir)
# try rm("trunc_cyl_shell_0-vertices" * ".vtu"); catch end
end
end
using .mmvtx1
mmvtx1.test()
module mmvtx2
using MeshSteward: import_NASTRAN, vtkwrite
using MeshSteward: Mesh, attach!, vertices, submesh
using MeshSteward: summary, basecode, boundary, eselect, label, initbox, updatebox!, baseincrel
using MeshCore: nshapes, attribute, ir_subset
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
@test basecode(mesh) == (3, 0)
vir = vertices(mesh)
@test nshapes(vir.right) == nshapes(vir.left)
@test nshapes(vir.right) == nshapes(connectivity.right)
geom = attribute(vir.right, "geom")
box = initbox(geom[1])
for i in 1:length(geom)
updatebox!(box, geom[i])
end
box[1] = (box[1] + box[2]) / 2
ir = baseincrel(mesh)
el = eselect(ir; box = box, inflate = 0.0009)
@test length(el) == 498
label(mesh, (3, 0), :left, el, 8)
el = eselect(ir; label = 8)
@test length(el) == 498
# @test length(el) == 44
@test summary(mesh) == "Mesh mesh: ((3, 0), \"\") = (elements, vertices): elements = 996 x T4 {label,}, vertices = 376 x P1 {geom,}; "
# vtkwrite("trunc_cyl_shell_0-full", ir)
# vtkwrite("trunc_cyl_shell_0-subset", ir_subset(ir, el))
# try rm("trunc_cyl_shell_0-vertices" * ".vtu"); catch end
return true
end
end
using .mmvtx2
mmvtx2.test()
module mmvtx3
using MeshSteward: import_NASTRAN, vtkwrite, geometry
using MeshSteward: Mesh, attach!, vertices, submesh, increl, baseincrel
using MeshSteward: summary, basecode, boundary, eselect, label, initbox, updatebox!, baseincrel
using MeshCore: nshapes, attribute, ir_subset, ir_code, nrelations
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
@test basecode(mesh) == (3, 0)
vir = vertices(mesh)
@test nshapes(vir.right) == nshapes(vir.left)
@test nshapes(vir.right) == nshapes(connectivity.right)
geom = geometry(mesh)
box = initbox(geom[1])
for i in 1:length(geom)
updatebox!(box, geom[i])
end
box[1] = (box[1] + box[2]) / 2
ir = baseincrel(mesh)
el = eselect(ir; box = box, inflate = 0.0009)
@test length(el) == 498
label(mesh, (3, 0), :left, el, 8)
el = eselect(ir; label = 8)
@test length(el) == 498
halfmesh = submesh(mesh, el)
# @show summary(halfmesh)
@test nrelations(baseincrel(halfmesh)) == 498
bir = boundary(mesh)
bir2 = increl(mesh, (ir_code(bir), "boundary"))
@test summary(bir) == summary(bir2)
# @test length(el) == 44
# @show summary(mesh)
# vtkwrite("trunc_cyl_shell_0-full", ir)
# vtkwrite("trunc_cyl_shell_0-subset", ir_subset(ir, el))
try rm("trunc_cyl_shell_0-full" * ".vtu"); catch end
try rm("trunc_cyl_shell_0-subset" * ".vtu"); catch end
try rm("trunc_cyl_shell_0-vertices" * ".vtu"); catch end
return true
end
end
using .mmvtx3
mmvtx3.test()
module mmvtx4
using MeshCore: nshapes, attribute, ir_subset, ir_code, nrelations
using MeshSteward.Exports
using Test
function test()
connectivities = import_NASTRAN(joinpath("data", "trunc_cyl_shell_0.nas"))
connectivity = connectivities[1]
mesh = Mesh()
attach!(mesh, connectivity)
@test basecode(mesh) == (3, 0)
vir = vertices(mesh)
@test nshapes(vir.right) == nshapes(vir.left)
@test nshapes(vir.right) == nshapes(connectivity.right)
geom = geometry(mesh)
box = initbox(geom[1])
for i in 1:length(geom)
updatebox!(box, geom[i])
end
box[1] = (box[1] + box[2]) / 2
ir = baseincrel(mesh)
el = eselect(ir; box = box, inflate = 0.0009)
@test length(el) == 498
label(mesh, (3, 0), :left, el, 8)
el = eselect(ir; label = 8)
@test length(el) == 498
halfmesh = submesh(mesh, el)
# @show summary(halfmesh)
@test nrelations(baseincrel(halfmesh)) == 498
bir = boundary(mesh)
bir2 = increl(mesh, (ir_code(bir), "boundary"))
@test summary(bir) == summary(bir2)
# @test length(el) == 44
# @show summary(mesh)
# vtkwrite("trunc_cyl_shell_0-full", ir)
# vtkwrite("trunc_cyl_shell_0-subset", ir_subset(ir, el))
try rm("trunc_cyl_shell_0-full" * ".vtu"); catch end
try rm("trunc_cyl_shell_0-subset" * ".vtu"); catch end
try rm("trunc_cyl_shell_0-vertices" * ".vtu"); catch end
return true
end
end
using .mmvtx4
mmvtx4.test() | MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 4914 |
module mmodt6gen4
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3block, T6block, T3toT6, T6toT3
using MeshSteward: vconnected, vnewnumbering, compactify
using Test
function test()
connectivity = T6block(2.0, 0.75*pi, 6, 5, :b)
locs = connectivity.right.attributes["geom"]
for i in 1:length(locs)
r, a = locs[i][1]+2.7, locs[i][2]
locs[i] = (cos(a)*r, sin(a)*r)
end
@test nshapes(connectivity.left) == 60
connectivity3 = T6toT3(connectivity)
@test nshapes(connectivity3.left) == nshapes(connectivity.left)
@test nshapes(connectivity3.right) == 42
vtkwrite("mmodt6gen4", connectivity3)
try rm("mmodt6gen4.vtu"); catch end
true
end
end
using .mmodt6gen4
mmodt6gen4.test()
module mmodt6gen5
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite, summary
using MeshSteward: Q4block, transform, fusevertices, cat, renumbered, withvertices
using MeshSteward: vconnected, vnewnumbering, compactify
using Test
function test()
conn1 = Q4block(2.0, 0.75*pi, 13, 12)
conn2 = Q4block(3.0, 0.75*pi, 7, 12)
transform(conn2, x -> [x[1]+2, x[2]])
# vtkwrite("mmodt6gen5-1", conn1)
# vtkwrite("mmodt6gen5-2", conn2)
locs1 = conn1.right.attributes["geom"]
locs2 = conn2.right.attributes["geom"]
tolerance = 1.0e-3
nlocs1, ni1 = fusevertices(locs1, locs2, tolerance)
conn1 = withvertices(conn1, nlocs1)
conn2 = withvertices(conn2, nlocs1)
conn1 = renumbered(conn1, ni1)
@test summary(conn1) == "(elements, vertices): elements = 156 x Q4, vertices = 273 x P1 {geom,}"
@test summary(conn2) == "(elements, vertices): elements = 84 x Q4, vertices = 273 x P1 {geom,}"
# vtkwrite("mmodt6gen5-1", conn1)
# vtkwrite("mmodt6gen5-2", conn2)
connectivity = cat(conn1, conn2)
locs = connectivity.right.attributes["geom"]
for i in 1:length(locs)
r, a = locs[i][1]+2.7, locs[i][2]
locs[i] = (cos(a)*r, sin(a)*r)
end
# vtkwrite("mmodt6gen5", connectivity)
# try rm("mmodt6gen5.vtu"); catch end
true
end
end
using .mmodt6gen5
mmodt6gen5.test()
module mmodt6gen6
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite, summary
using MeshSteward: Q4block, transform, fusevertices, cat, renumbered, withvertices
using MeshSteward: vconnected, vnewnumbering, compactify
using Test
function test()
conn1 = Q4block(2.0, 0.75*pi, 13, 12)
conn2 = Q4block(3.0, 0.75*pi, 7, 12)
transform(conn2, x -> [x[1]+2, x[2]])
# vtkwrite("mmodt6gen6-1", conn1)
# vtkwrite("mmodt6gen6-2", conn2)
locs1 = conn1.right.attributes["geom"]
locs2 = conn2.right.attributes["geom"]
tolerance = 1.0e-3
nlocs1, ni1 = fusevertices(locs1, locs2, tolerance)
conn1 = withvertices(conn1, nlocs1)
conn2 = withvertices(conn2, nlocs1)
conn1 = renumbered(conn1, ni1)
@test summary(conn1) == "(elements, vertices): elements = 156 x Q4, vertices = 273 x P1 {geom,}"
@test summary(conn2) == "(elements, vertices): elements = 84 x Q4, vertices = 273 x P1 {geom,}"
# vtkwrite("mmodt6gen6-1", conn1)
# vtkwrite("mmodt6gen6-2", conn2)
connectivity = cat(conn1, conn2)
transform(connectivity, x -> begin
r, a = x[1]+2.7, x[2]
[cos(a)*r, sin(a)*r]
end)
# locs = connectivity.right.attributes["geom"]
# for i in 1:length(locs)
# r, a = locs[i][1]+2.7, locs[i][2]
# locs[i] = (cos(a)*r, sin(a)*r)
# end
vtkwrite("mmodt6gen6", connectivity)
try rm("mmodt6gen6.vtu"); catch end
true
end
end
using .mmodt6gen6
mmodt6gen6.test()
module mmodt6gen7
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite, summary
using MeshSteward: Q4block, transform, fusevertices, cat, renumbered, withvertices
using MeshSteward: vconnected, vnewnumbering, compactify, mergeirs
using Test
function test()
conn1 = Q4block(2.0, 0.75*pi, 13, 12)
conn2 = Q4block(3.0, 0.75*pi, 7, 12)
transform(conn2, x -> [x[1]+2, x[2]])
connectivity = mergeirs(conn1, conn2, 0.001)
transform(connectivity, x -> begin
r, a = x[1]+2.7, x[2]
[cos(a)*r, sin(a)*r]
end)
vtkwrite("mmodt6gen7", connectivity)
try rm("mmodt6gen7.vtu"); catch end
true
end
end
using .mmodt6gen7
mmodt6gen7.test()
module mt4symrcm1a
using StaticArrays
using MeshCore.Exports
using MeshSteward.Exports
using Test
function test()
# connectivity = T4block(1.0, 2.0, 3.0, 1, 1, 1, :a)
connectivity = T4block(1.0, 2.0, 3.0, 7, 9, 13, :a)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
# @test (nshapes(connectivity.right), nshapes(connectivity.left)) == (1120, 4914)
connectivity = minimize_profile(connectivity)
vtkwrite("mt4symrcm1a", connectivity)
try rm("mt4symrcm1a.vtu"); catch end
true
end
end
using .mt4symrcm1a
mt4symrcm1a.test()
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 5451 |
module mq4gen1
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: Q4blockx
using Test
function test()
connectivity = Q4blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0])
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (9, 4)
vtkwrite("mq4gen1a", connectivity)
try rm("mq4gen1a.vtu"); catch end
true
end
end
using .mq4gen1
mq4gen1.test()
module mq4gen1b
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: Q4blockx
using Test
function test()
connectivity = Q4blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0])
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (9, 4)
vtkwrite("mq4gen1b", connectivity)
try rm("mq4gen1b.vtu"); catch end
true
end
end
using .mq4gen1b
mq4gen1b.test()
module mq4gen2
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: Q4block
using Test
function test()
connectivity = Q4block(1.0, 2.0, 7, 9)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (80, 63)
vtkwrite("mq4gen2", connectivity)
try rm("mq4gen2.vtu"); catch end
true
end
end
using .mq4gen2
mq4gen2.test()
module mq4gen3
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: Q4blockx, gradedspace
using Test
function test()
connectivity = Q4blockx([1.0, 2.0, 3.0], gradedspace(0.0, 5.0, 7, 2))
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (21, 12)
vtkwrite("mq4gen3", connectivity)
try rm("mq4gen3.vtu"); catch end
export_MESH("mq4gen3", connectivity)
try rm("mq4gen3.mesh"); catch end
try rm("mq4gen3-xyz.dat"); catch end
try rm("mq4gen3-conn.dat"); catch end
true
end
end
using .mq4gen3
mq4gen3.test()
module mq4gen4
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: Q4quadrilateral
using Test
function test()
connectivity = Q4quadrilateral([1.0 0.0; 1.5 1.7; -0.5 0.9; -0.1 -0.1], 3, 4)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (20, 12)
vtkwrite("mq4gen4", connectivity)
try rm("mq4gen4.vtu"); catch end
export_MESH("mq4gen4", connectivity)
try rm("mq4gen4.mesh"); catch end
try rm("mq4gen4-xyz.dat"); catch end
try rm("mq4gen4-conn.dat"); catch end
true
end
end
using .mq4gen4
mq4gen4.test()
module mq4gen5
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: Q4quadrilateral
using Test
function test()
connectivity = Q4quadrilateral([1.0 0.0; 1.5 1.7], 6, 4)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (35, 24)
vtkwrite("mq4gen5", connectivity)
try rm("mq4gen5.vtu"); catch end
export_MESH("mq4gen5", connectivity)
try rm("mq4gen5.mesh"); catch end
try rm("mq4gen5-xyz.dat"); catch end
try rm("mq4gen5-conn.dat"); catch end
true
end
end
using .mq4gen5
mq4gen5.test()
module mq4gen6
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: Q4quadrilateral
using Test
function test()
connectivity = Q4quadrilateral([1.0 0.0 0.0; 1.5 1.7 -0.1; 2.1 1.7 0.5; -1.0 1.0 0.3], 6, 4)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (35, 24)
vtkwrite("mq4gen6", connectivity)
try rm("mq4gen6.vtu"); catch end
export_MESH("mq4gen6", connectivity)
try rm("mq4gen6.mesh"); catch end
try rm("mq4gen6-xyz.dat"); catch end
try rm("mq4gen6-conn.dat"); catch end
true
end
end
using .mq4gen6
mq4gen6.test()
module mq4gen7
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: Q4quadrilateral, mergeirs
using Test
function test()
N = 2
c1 = Q4quadrilateral([-1 -1; -0.2 -1; -0.1 -0.2; -1 0.8], N, N)
c2 = Q4quadrilateral([-0.2 -1; 1 -1; 1 -0.5; -0.1 -0.2], N, N)
c3 = Q4quadrilateral([1 -0.5; 1 1; 0.3 1; -0.1 -0.2], N, N)
c4 = Q4quadrilateral([0.3 1; -1 1; -1 0.8; -0.1 -0.2], N, N)
c = mergeirs(c1, c2, 0.001)
c = mergeirs(c, c3, 0.001)
c = mergeirs(c, c4, 0.001)
@test nshapes(c.left) == 4 * N ^ 2
vtkwrite("mq4gen7", c)
try rm("mq4gen7.vtu"); catch end
# export_MESH("mq4gen7", connectivity)
# try rm("mq4gen7.mesh"); catch end
# try rm("mq4gen7-xyz.dat"); catch end
# try rm("mq4gen7-conn.dat"); catch end
true
end
end
using .mq4gen7
mq4gen7.test()
module mq4gen8
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: Q4blockwdistortion
using Test
function test()
c1 = Q4blockwdistortion(3.0, 1.0, 15, 6)
vtkwrite("mq4gen8", c1)
try rm("mq4gen8.vtu"); catch end
# export_MESH("mq4gen8", connectivity)
# try rm("mq4gen8.mesh"); catch end
# try rm("mq4gen8-xyz.dat"); catch end
# try rm("mq4gen8-conn.dat"); catch end
true
end
end
using .mq4gen8
mq4gen8.test()
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 3519 |
module mt4gen1
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4blockx
using Test
function test()
connectivity = T4blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0], [0.0, 1.0, 3.0], :a)
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (27, 48)
vtkwrite("mt4gen1", connectivity)
try rm("mt4gen1.vtu"); catch end
true
end
end
using .mt4gen1
mt4gen1.test()
module mt4gen2
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4block
using Test
function test()
connectivity = T4block(1.0, 2.0, 3.0, 7, 9, 13, :a)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (1120, 4914)
vtkwrite("mt4gen2", connectivity)
try rm("mt4gen2.vtu"); catch end
true
end
end
using .mt4gen2
mt4gen2.test()
module mt4gen3
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4blockx, gradedspace
using Test
function test()
connectivity = T4blockx([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], gradedspace(0.0, 5.0, 7, 2), :a)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (63, 144)
vtkwrite("mt4gen3", connectivity)
try rm("mt4gen3.vtu"); catch end
true
end
end
using .mt4gen3
mt4gen3.test()
module mt4gen4
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4blockx, linearspace
using Test
function test()
connectivity = T4blockx([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], linearspace(0.0, 5.0, 7), :a)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (63, 144)
vtkwrite("mt4gen4", connectivity)
try rm("mt4gen4.vtu"); catch end
true
end
end
using .mt4gen4
mt4gen4.test()
module mt4gen4b
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4blockx, linearspace
using Test
function test()
connectivity = T4blockx([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], linearspace(0.0, 5.0, 7), :b)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (63, 144)
vtkwrite("mt4gen4", connectivity)
try rm("mt4gen4.vtu"); catch end
true
end
end
using .mt4gen4b
mt4gen4b.test()
module mt4gen4ca
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4blockx, linearspace
using Test
function test()
connectivity = T4blockx([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], linearspace(0.0, 5.0, 7), :ca)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (63, 120)
vtkwrite("mt4gen4", connectivity)
try rm("mt4gen4.vtu"); catch end
true
end
end
using .mt4gen4ca
mt4gen4ca.test()
module mt4gen4cb
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T4blockx, linearspace
using Test
function test()
connectivity = T4blockx([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], linearspace(0.0, 5.0, 7), :cb)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (63, 120)
vtkwrite("mt4gen4", connectivity)
try rm("mt4gen4.vtu"); catch end
true
end
end
using .mt4gen4cb
mt4gen4cb.test() | MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 4681 |
module mt3gen1
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3blockx
using Test
function test()
connectivity = T3blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0], :a)
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (9, 8)
vtkwrite("mt3gen1a", connectivity)
try rm("mt3gen1a.vtu"); catch end
true
end
end
using .mt3gen1
mt3gen1.test()
module mt3gen1b
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3blockx
using Test
function test()
connectivity = T3blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0], :b)
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (9, 8)
vtkwrite("mt3gen1b", connectivity)
try rm("mt3gen1b.vtu"); catch end
true
end
end
using .mt3gen1b
mt3gen1b.test()
module mt3gen2
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3block
using Test
function test()
connectivity = T3block(1.0, 2.0, 7, 9, :a)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (80, 63*2)
vtkwrite("mt3gen2", connectivity)
try rm("mt3gen2.vtu"); catch end
true
end
end
using .mt3gen2
mt3gen2.test()
module mt3gen3
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: export_MESH
using MeshSteward: T3blockx, gradedspace
using Test
function test()
connectivity = T3blockx([1.0, 2.0, 3.0], gradedspace(0.0, 5.0, 7, 2), :a)
# @show (nshapes(connectivity.right), nshapes(connectivity.left))
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (21, 24)
vtkwrite("mt3gen3", connectivity)
try rm("mt3gen3.vtu"); catch end
export_MESH("mt3gen3", connectivity)
try rm("mt3gen3.mesh"); catch end
try rm("mt3gen3-xyz.dat"); catch end
try rm("mt3gen3-conn.dat"); catch end
true
end
end
using .mt3gen3
mt3gen3.test()
module mt3gen5
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3blockx
using Test
function test()
connectivity = T3blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0], :a)
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (9, 8)
vtkwrite("mt3gen5a", connectivity)
try rm("mt3gen5a.vtu"); catch end
true
end
end
using .mt3gen5
mt3gen5.test()
module mt6gen1
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T6blockx
using Test
function test()
connectivity = T6blockx([0.0, 1.0, 3.0], [0.0, 1.0, 3.0], :b)
@test (nshapes(connectivity.right), nshapes(connectivity.left)) == (25, 8)
vtkwrite("mt6gen1", connectivity)
try rm("mt6gen1.vtu"); catch end
true
end
end
using .mt6gen1
mt6gen1.test()
module mt6gen2
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T6block
using Test
function test()
connectivity = T6block(2.0, 0.75*pi, 6, 5, :b)
locs = connectivity.right.attributes["geom"]
for i in 1:length(locs)
r, a = locs[i][1]+2.7, locs[i][2]
locs[i] = (cos(a)*r, sin(a)*r)
end
@test nshapes(connectivity.left) == 60
vtkwrite("mt6gen2", connectivity)
try rm("mt6gen2.vtu"); catch end
true
end
end
using .mt6gen2
mt6gen2.test()
module mt6gen3
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3block, T6block, T3toT6
using Test
function test()
connectivity = T3block(2.0, 0.75*pi, 6, 5, :b)
locs = connectivity.right.attributes["geom"]
for i in 1:length(locs)
r, a = locs[i][1]+2.7, locs[i][2]
locs[i] = (cos(a)*r, sin(a)*r)
end
@test nshapes(connectivity.left) == 60
connectivity = T3toT6(connectivity)
vtkwrite("mt6gen3", connectivity)
try rm("mt6gen3.vtu"); catch end
true
end
end
using .mt6gen3
mt6gen3.test()
module mt6gen4
using StaticArrays
using MeshCore: nshapes
using MeshSteward: vtkwrite
using MeshSteward: T3block, T6block, T3toT6, T6toT3
using Test
function test()
connectivity = T6block(2.0, 0.75*pi, 6, 5, :b)
locs = connectivity.right.attributes["geom"]
for i in 1:length(locs)
r, a = locs[i][1]+2.7, locs[i][2]
locs[i] = (cos(a)*r, sin(a)*r)
end
@test nshapes(connectivity.left) == 60
connectivity3 = T6toT3(connectivity)
@test nshapes(connectivity3.left) == nshapes(connectivity.left)
@test nshapes(connectivity3.right) != nshapes(connectivity.right)
@test nshapes(connectivity3.right) == 42
vtkwrite("mt6gen4", connectivity3)
try rm("mt6gen4.vtu"); catch end
true
end
end
using .mt6gen4
mt6gen4.test()
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | code | 5053 |
module mtest2
using StaticArrays
using MeshCore: VecAttrib
import MeshSteward: vselect, initbox, updatebox!
using Test
function test()
xyz = [0.0 0.0; 633.3333333333334 0.0; 1266.6666666666667 0.0; 1900.0 0.0; 0.0 400.0; 633.3333333333334 400.0; 1266.6666666666667 400.0; 1900.0 400.0; 0.0 800.0; 633.3333333333334 800.0; 1266.6666666666667 800.0; 1900.0 800.0]
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
@test length(locs) == 12
x = locs[SVector{2}([2, 4])]
@test x[1] == SVector{2}([633.3333333333334 0.0])
@test length(locs[1]) == 2
@test eltype(locs[1]) == Float64
box = [0.0 0.0 0.0 0.0]
inflate = 0.01
outputlist = vselect(locs; box = box, inflate = inflate)
@test isapprox(outputlist, [1])
box = initbox(xyz)
outputlist = vselect(locs; box = box, inflate = inflate)
@test isapprox(outputlist, collect(1:size(xyz, 1)))
box = initbox(xyz[2, :])
updatebox!(box, xyz[6, :])
outputlist = vselect(locs; box = box, inflate = inflate)
@test isapprox(outputlist, [2, 6])
box = initbox(xyz[5, :])
updatebox!(box, xyz[6, :])
outputlist = vselect(locs; box = box, inflate = inflate)
@test isapprox(outputlist, [5, 6])
true
end
end
using .mtest2
mtest2.test()
include("samplet4.jl")
module mt4topo1
using StaticArrays
using MeshCore: P1, T4, ShapeColl, manifdim, nvertices, nshapes, indextype
using MeshCore: VecAttrib
using MeshCore: IncRel, nrelations, nentities, ir_boundary
using MeshSteward: connectedv
using ..samplet4: samplet4mesh
using Test
function test()
xyz, cc = samplet4mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tets = ShapeColl(T4, size(cc, 1))
ir30 = IncRel(tets, vrts, cc)
vl = connectedv(ir30)
@test length(vl) == size(xyz, 1)
ir20 = ir_boundary(ir30)
vl = connectedv(ir20)
@test length(vl) == 54
true
end
end
using .mt4topo1
mt4topo1.test()
module mt4topv1
using StaticArrays
using LinearAlgebra
using MeshCore: P1, T4, ShapeColl, manifdim, nvertices, nshapes, indextype
using MeshCore: VecAttrib
using MeshCore: IncRel, nrelations, nentities, ir_boundary
using MeshSteward: connectedv, vselect, vtkwrite
using ..samplet4: samplet4mesh
using Test
function test()
xyz, cc = samplet4mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tets = ShapeColl(T4, size(cc, 1))
ir30 = IncRel(tets, vrts, cc)
ir30.right.attributes["geom"] = locs
outputlist = vselect(locs; distance = 0.3, from = locs[1], inflate = 0.001)
nmatched = 0
for i in 1:length(locs)
if norm(locs[i]) < 0.3 + 0.001
nmatched = nmatched + 1
end
end
@test nmatched == length(outputlist)
vtkwrite("delete_me", ir30)
try rm("delete_me" * ".vtu"); catch end
true
end
end
using .mt4topv1
mt4topv1.test()
module mt4topv2
using StaticArrays
using LinearAlgebra
using MeshCore: P1, T4, ShapeColl, manifdim, nvertices, nshapes, indextype
using MeshCore: VecAttrib
using MeshCore: IncRel, nrelations, nentities, ir_boundary
using MeshSteward: connectedv, vselect, vtkwrite
using ..samplet4: samplet4mesh
using Test
function test()
xyz, cc = samplet4mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tets = ShapeColl(T4, size(cc, 1))
ir30 = IncRel(tets, vrts, cc)
ir30.right.attributes["geom"] = locs
outputlist = vselect(locs; plane = [1.0, 0.0, 0.0, 5.0], thickness = 0.001)
nmatched = 0
for i in 1:length(locs)
if abs(locs[i][1] - 5.0) < 0.001
nmatched = nmatched + 1
end
end
@test nmatched == length(outputlist)
# vtkwrite("delete_me", ir30)
# try rm("delete_me" * ".vtu"); catch end
true
end
end
using .mt4topv2
mt4topv2.test()
module mt4topv3
using StaticArrays
using LinearAlgebra
using MeshCore: P1, T4, ShapeColl, manifdim, nvertices, nshapes, indextype
using MeshCore: VecAttrib
using MeshCore: IncRel, nrelations, nentities, ir_boundary
using MeshSteward: connectedv, vselect, vtkwrite
using ..samplet4: samplet4mesh
using Test
function test()
xyz, cc = samplet4mesh()
# Construct the initial incidence relation
N, T = size(xyz, 2), eltype(xyz)
locs = VecAttrib([SVector{N, T}(xyz[i, :]) for i in 1:size(xyz, 1)])
vrts = ShapeColl(P1, length(locs))
tets = ShapeColl(T4, size(cc, 1))
ir30 = IncRel(tets, vrts, cc)
ir30.right.attributes["geom"] = locs
outputlist = vselect(locs; nearestto = locs[13])
@test length(outputlist) == 1
@test outputlist[1] == 13
true
end
end
using .mt4topv3
mt4topv3.test()
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 1540 | [](http://www.repostatus.org/#active)
[](https://github.com/PetrKryslUCSD/MeshSteward.jl/actions)
[](https://codecov.io/gh/PetrKryslUCSD/MeshSteward.jl)
[](https://petrkryslucsd.github.io/MeshSteward.jl/dev)
# MeshSteward.jl
Manages finite element meshes powered by [`MeshCore.jl`](https://github.com/PetrKryslUCSD/MeshCore.jl).
## Installation
This release is for Julia 1.5.
The package is registered: doing
```
]add MeshSteward
```
is enough.
Depends on: [`MeshCore`](https://github.com/PetrKryslUCSD/MeshCore.jl).
## Using
The user can either use/import individual functions from `MeshSteward` like so:
```
using MeshSteward: Mesh, attach!
```
or all exported symbols may be made available in the user's context as
```
using MeshSteward.Exports
```
## Learning
Please refer to the tutorials in the package [`MeshTutor.jl`](https://github.com/PetrKryslUCSD/MeshTutor.jl).
## News
- 12/15/2020: Tested with Julia 1.6.
- 07/06/2020: Exports have been added to facilitate use of the library.
- 06/17/2020: Key the stored relations with a tuple consisting of the code and a
string tag.
- 05/26/2020: First version.
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 553 | # Issues
- Test skeleton and boundary.
- Documenter used to generate documentation:
pkg"add DocumenterTools"
using DocumenterTools
DocumenterTools.genkeys(user="PetrKryslUCSD", repo="[email protected]:PetrKryslUCSD/Elfel.jl.git")
and follow the instructions to install the keys.
- Export method could take just the mesh. How do we handle multiple element types in the mesh? That would mean multiple connectivity incidence relations.
- Import methods need to be documented: the outputs are incorrectly described.
- Import mesh in the .h5mesh format.
- | MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 298 | # MeshSteward Documentation
## Conceptual guide
The concepts and ideas are described.
```@contents
Pages = [
"guide/guide.md",
]
Depth = 1
```
## Manual
The description of the types and of the functions.
```@contents
Pages = [
"man/types.md",
"man/functions.md",
]
Depth = 3
```
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 152 | [Table of contents](https://petrkryslucsd.github.io/MeshSteward.jl/latest/index.html)
# Concepts about the design and operation
Needs to be written.
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 2361 | [Table of contents](https://petrkryslucsd.github.io/MeshSteward.jl/latest/index.html)
# How to Guide
## How to create simple meshes
We will generate the tetrahedral mesh inside a rectangular block.
The block will have the dimensions shown below:
```
a, b, c = 2.0, 2.5, 3.0
```
The tetrahedra will be generated in a regular pattern, with the number of
edges per side of the block given as
```
na, nb, nc = 2, 2, 3
```
Now we bring in the mesh-generation function, and use it to generate the incidence relation representing the mesh.
```
using MeshSteward: T4block
conn = T4block(a, b, c, na, nb, nc);
```
The variable `conn` is an incidence relation. This will become the base
relation of the mesh. The mesh is first created.
```
using MeshSteward: Mesh
m = Mesh()
```
Then the ``(3, 0)`` incidence relation, which defines the tetrahedral elements in terms of the vertices at their corners, is attached to it.
```
using MeshSteward: attach!
attach!(m, conn);
```
We can now inspect the mesh by printing its summary.
```
println(summary(m))
```
## How to find a particular incidence relation
Find an incidence relation based on a code.
The ``(3, 0)`` incidence relation, which defines the tetrahedral elements in terms of the vertices at their corners, is found like this:
```
conn = increl(m, (3, 0))
```
We can extract the boundary of this incidence relation and attach it to the mesh:
```
using MeshCore: ir_boundary
bconn = ir_boundary(conn)
```
This incidence relation may then be attached to the mesh, with a name and a code.
```
attach!(m, bconn, "boundary_triangles")
```
To recover this incidence relation from the mesh we can do:
```
bconn = increl(m, ((2, 0), "boundary_triangles"))
```
## [How to visualize meshes](@id visualize)
The mesh can be exported for visualization. The tetrahedral elements are the
base incidence relation of the mesh.
```
using MeshSteward: baseincrel
using MeshSteward: vtkwrite
vtkwrite("trunc_cyl_shell_0-elements", baseincrel(mesh))
```
Start "Paraview", load the file `"trunc_cyl_shell_0-elements.vtu"` and
select for instance view as "Surface with Edges". The result will be a view
of the surface of the tetrahedral mesh.
```
@async run(`paraview trunc_cyl_shell_0-elements.vtu`)
```
## [How to export meshes ](@id export)
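An incidence relation can be written out in the native MESH format with
`export_MESH`. A minimal sketch, based on the package's test suite, using the
relation `conn` created above (the file name `"block"` is arbitrary):
```
using MeshSteward: export_MESH
export_MESH("block", conn)
```
This writes `block.mesh` together with the data files `block-xyz.dat` and
`block-conn.dat`. An entire mesh may also be stored with `save` and read back
with `load`:
```
using MeshSteward: save
save(m, "block")
```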
## [How to import meshes ](@id import)
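Meshes can be imported from NASTRAN (`.nas`), Abaqus (`.inp`), and native MESH
files. The import functions return an array of incidence relations, which may
then be attached to a mesh. A sketch, assuming the file
`trunc_cyl_shell_0.nas` is available:
```
using MeshSteward: import_NASTRAN, Mesh, attach!
connectivities = import_NASTRAN("trunc_cyl_shell_0.nas")
mesh = Mesh()
attach!(mesh, connectivities[1])
```
`import_ABAQUS` and `import_MESH` are used in the same way.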
## [How to select entities](@id select)
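Vertices and elements may be selected using geometric criteria. A minimal
sketch, assuming the mesh `m` created above and an arbitrary selection box:
```
using MeshSteward: vselect, eselect, baseincrel, boundingbox
box = boundingbox([-Inf -Inf 1.0; Inf Inf 1.0])
selectedv = vselect(m, box = box, inflate = 0.001)
el = eselect(baseincrel(m); box = box, inflate = 0.001)
```
`vselect` returns an incidence relation holding the selected vertices;
`eselect` returns a list of the selected element indices. If elements have
previously been labeled with `label`, they can also be retrieved by label,
`eselect(baseincrel(m); label = 8)`.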
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 508 | # Functions
```@meta
CurrentModule = MeshSteward
```
## Boxes
```@docs
initbox
updatebox!
boundingbox
inflatebox!
inbox
boxesoverlap
intersectboxes
```
## Searching vertices and elements
```@docs
vselect
eselect
```
## Import of meshes
```@docs
import_MESH
import_NASTRAN
import_ABAQUS
```
## Export of meshes
```@docs
vtkwrite
export_MESH
```
## Management of meshes
```@docs
load
save
increl
attach!
basecode
nspacedims
Base.summary
boundary
vertices
submesh
label
```
## Index
```@index
```
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 1.3.1 | 273512aa2ef5ac8019c19d3f971f7eccec0487ae | docs | 50 | # Types
```@meta
CurrentModule = MeshSteward
```
| MeshSteward | https://github.com/PetrKryslUCSD/MeshSteward.jl.git |
|
[
"MIT"
] | 0.1.5 | 073d5c20d44129b20fe954720b97069579fa403b | code | 387 | module StableTasks
macro spawn end
macro spawnat end
macro fetch end
macro fetchfrom end
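# A typed, atomically-written box that carries a task's result. Storing the
# result in a concretely-typed AtomicRef{T}, instead of relying on the Task's
# untyped result field, is what makes `fetch` type-stable.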
mutable struct AtomicRef{T}
@atomic x::T
AtomicRef{T}() where {T} = new{T}()
AtomicRef(x::T) where {T} = new{T}(x)
AtomicRef{T}(x) where {T} = new{T}(convert(T, x))
end
struct StableTask{T}
t::Task
ret::AtomicRef{T}
end
include("internals.jl")
end # module StableTasks
| StableTasks | https://github.com/JuliaFolds2/StableTasks.jl.git |
|
[
"MIT"
] | 0.1.5 | 073d5c20d44129b20fe954720b97069579fa403b | code | 6273 | module Internals
import StableTasks: @spawn, @spawnat, @fetch, @fetchfrom, StableTask, AtomicRef
Base.getindex(r::AtomicRef) = @atomic r.x
Base.setindex!(r::AtomicRef{T}, x) where {T} = @atomic r.x = convert(T, x)
function Base.fetch(t::StableTask{T}) where {T}
fetch(t.t)
t.ret[]
end
for func ∈ [:wait, :istaskdone, :istaskfailed, :istaskstarted, :yield, :yieldto]
if isdefined(Base, func)
@eval Base.$func(t::StableTask) = $func(t.t)
end
end
Base.yield(t::StableTask, x) = yield(t.t, x)
Base.yieldto(t::StableTask, x) = yieldto(t.t, x)
if isdefined(Base, :current_exceptions)
Base.current_exceptions(t::StableTask; backtrace::Bool=true) = current_exceptions(t.t; backtrace)
end
if isdefined(Base, :errormonitor)
Base.errormonitor(t::StableTask) = errormonitor(t.t)
end
Base.schedule(t::StableTask) = (schedule(t.t); t)
Base.schedule(t::StableTask, val; error=false) = (schedule(t.t, val; error); t)
"""
@spawn [:default|:interactive] expr
Similar to `Threads.@spawn` but type-stable. Creates a `Task` and schedules it to run on any available
thread in the specified threadpool (defaults to the `:default` threadpool).
"""
macro spawn(args...)
tp = QuoteNode(:default)
na = length(args)
if na == 2
ttype, ex = args
if ttype isa QuoteNode
ttype = ttype.value
if ttype !== :interactive && ttype !== :default
throw(ArgumentError("unsupported threadpool in StableTasks.@spawn: $ttype"))
end
tp = QuoteNode(ttype)
else
tp = ttype
end
elseif na == 1
ex = args[1]
else
throw(ArgumentError("wrong number of arguments in @spawn"))
end
letargs = _lift_one_interp!(ex)
thunk = replace_linenums!(:(() -> ($(esc(ex)))), __source__)
var = esc(Base.sync_varname) # This is for the @sync macro which sets a local variable whose name is
# the symbol bound to Base.sync_varname
# I asked on slack and this is apparently safe to consider a public API
quote
let $(letargs...)
f = $thunk
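# Ask inference for the thunk's return type up front, so the result slot
# below is concretely typed and `fetch` on the returned StableTask infers T.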
T = Core.Compiler.return_type(f, Tuple{})
ref = AtomicRef{T}()
f_wrap = () -> (ref[] = f(); nothing)
task = Task(f_wrap)
task.sticky = false
Threads._spawn_set_thrpool(task, $(esc(tp)))
if $(Expr(:islocal, var))
put!($var, task) # Sync will set up a Channel, and we want our task to be in there.
end
schedule(task)
StableTask{T}(task, ref)
end
end
end
"""
@spawnat thrdid expr
Similar to `StableTasks.@spawn` but creates a **sticky** `Task` and schedules it to run on the thread with the given id (`thrdid`).
The task is guaranteed to stay on this thread (it won't migrate to another thread).
"""
macro spawnat(thrdid, ex)
letargs = _lift_one_interp!(ex)
thunk = replace_linenums!(:(() -> ($(esc(ex)))), __source__)
var = esc(Base.sync_varname)
tid = esc(thrdid)
@static if VERSION < v"1.9"
nt = :(Threads.nthreads())
else
nt = :(Threads.maxthreadid())
end
quote
if $tid < 1 || $tid > $nt
throw(ArgumentError("Invalid thread id ($($tid)). Must be between in " *
"1:(total number of threads), i.e. $(1:$nt)."))
end
let $(letargs...)
thunk = $thunk
RT = Core.Compiler.return_type(thunk, Tuple{})
ret = AtomicRef{RT}()
thunk_wrap = () -> (ret[] = thunk(); nothing)
local task = Task(thunk_wrap)
task.sticky = true
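# Pin the sticky task to the requested thread (the runtime uses 0-based ids).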
ccall(:jl_set_task_tid, Cvoid, (Any, Cint), task, $tid - 1)
if $(Expr(:islocal, var))
put!($var, task)
end
schedule(task)
StableTask(task, ret)
end
end
end
"""
@fetch ex
Shortcut for `fetch(@spawn(ex))`.
"""
macro fetch(ex)
:(fetch(@spawn($(esc(ex)))))
end
"""
@fetchfrom thrdid ex
Shortcut for `fetch(@spawnat(thrdid, ex))`.
"""
macro fetchfrom(thrdid, ex)
:(fetch(@spawnat($(esc(thrdid)), $(esc(ex)))))
end
# Copied from base rather than calling it directly because who knows if it'll change in the future
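# Replaces each `$`-interpolated expression with a gensym and records a
# matching `let` binding, so interpolated values are evaluated eagerly in the
# caller's scope rather than inside the spawned task.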
function _lift_one_interp!(e)
letargs = Any[] # store the new gensymed arguments
_lift_one_interp_helper(e, false, letargs) # Start out _not_ in a quote context (false)
letargs
end
_lift_one_interp_helper(v, _, _) = v
function _lift_one_interp_helper(expr::Expr, in_quote_context, letargs)
if expr.head === :$
if in_quote_context # This $ is simply interpolating out of the quote
# Now, we're out of the quote, so any _further_ $ is ours.
in_quote_context = false
else
newarg = gensym()
push!(letargs, :($(esc(newarg)) = $(esc(expr.args[1]))))
return newarg # Don't recurse into the lifted $() exprs
end
elseif expr.head === :quote
in_quote_context = true # Don't try to lift $ directly out of quotes
elseif expr.head === :macrocall
return expr # Don't recur into macro calls, since some other macros use $
end
for (i, e) in enumerate(expr.args)
expr.args[i] = _lift_one_interp_helper(e, in_quote_context, letargs)
end
expr
end
# Copied from base rather than calling it directly because who knows if it'll change in the future
replace_linenums!(ex, ln::LineNumberNode) = ex
function replace_linenums!(ex::Expr, ln::LineNumberNode)
if ex.head === :block || ex.head === :quote
# replace line number expressions from metadata (not argument literal or inert) position
map!(ex.args, ex.args) do @nospecialize(x)
isa(x, Expr) && x.head === :line && length(x.args) == 1 && return Expr(:line, ln.line)
isa(x, Expr) && x.head === :line && length(x.args) == 2 && return Expr(:line, ln.line, ln.file)
isa(x, LineNumberNode) && return ln
return x
end
end
# preserve any linenums inside `esc(...)` guards
if ex.head !== :escape
for subex in ex.args
subex isa Expr && replace_linenums!(subex, ln)
end
end
return ex
end
end # module Internals
| StableTasks | https://github.com/JuliaFolds2/StableTasks.jl.git |
|
[
"MIT"
] | 0.1.5 | 073d5c20d44129b20fe954720b97069579fa403b | code | 1479 | using Test, StableTasks
using StableTasks: @spawn, @spawnat, @fetch, @fetchfrom
@testset "Type stability" begin
@test 2 == @inferred fetch(@spawn 1 + 1)
t = @eval @spawn inv([1 2 ; 3 4])
@test inv([1 2 ; 3 4]) == @inferred fetch(t)
@test 2 == @inferred fetch(@spawn :interactive 1 + 1)
t = @eval @spawn :interactive inv([1 2 ; 3 4])
@test inv([1 2 ; 3 4]) == @inferred fetch(t)
s = :default
@test 2 == @inferred fetch(@spawn s 1 + 1)
t = @eval @spawn $(QuoteNode(s)) inv([1 2 ; 3 4])
@test inv([1 2 ; 3 4]) == @inferred fetch(t)
@test 2 == @inferred fetch(@spawnat 1 1 + 1)
t = @eval @spawnat 1 inv([1 2 ; 3 4])
@test inv([1 2 ; 3 4]) == @inferred fetch(t)
end
@testset "API funcs" begin
T = @spawn rand(Bool)
@test isnothing(wait(T))
@test istaskdone(T)
@test istaskfailed(T) == false
@test istaskstarted(T)
r = Ref(0)
@sync begin
@spawn begin
sleep(5)
r[] = 1
end
@test r[] == 0
end
@test r[] == 1
T = @spawnat 1 rand(Bool)
@test isnothing(wait(T))
@test istaskdone(T)
@test istaskfailed(T) == false
@test istaskstarted(T)
@test fetch(@spawnat 1 Threads.threadid()) == 1
r = Ref(0)
@sync begin
@spawnat 1 begin
sleep(5)
r[] = 1
end
@test r[] == 0
end
@test r[] == 1
@test @fetch(3+3) == 6
@test @fetchfrom(1, Threads.threadid()) == 1
end
| StableTasks | https://github.com/JuliaFolds2/StableTasks.jl.git |
|
[
"MIT"
] | 0.1.5 | 073d5c20d44129b20fe954720b97069579fa403b | docs | 992 | # StableTasks.jl
StableTasks is a simple package with one main API `StableTasks.@spawn` (not exported by default).
It works like `Threads.@spawn`, except it is *type stable* to `fetch` from.
``` julia
julia> using StableTasks, Test
julia> @inferred fetch(StableTasks.@spawn 1 + 1)
2
```
versus
``` julia
julia> @inferred fetch(Threads.@spawn 1 + 1)
ERROR: return type Int64 does not match inferred return type Any
Stacktrace:
[1] error(s::String)
@ Base ./error.jl:35
[2] top-level scope
@ REPL[3]:1
```
The package also provides `StableTasks.@spawnat` (not exported), which is similar to `StableTasks.@spawn` but creates a *sticky* task (it won't migrate) on a specific thread.
```julia
julia> t = StableTasks.@spawnat 4 Threads.threadid();
julia> @inferred fetch(t)
4
```
For convenience, and similar to Distributed.jl, there are also `@fetch` and `@fetchfrom` macros:
```julia
julia> StableTasks.@fetch 3+3
6
julia> StableTasks.@fetchfrom 2 Threads.threadid()
2
```
| StableTasks | https://github.com/JuliaFolds2/StableTasks.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 413 | using AWSS3
using Documenter
DocMeta.setdocmeta!(AWSS3, :DocTestSetup, :(using AWSS3); recursive=true)
makedocs(;
modules=[AWSS3],
sitename="AWSS3.jl",
format=Documenter.HTML(;
canonical="https://juliacloud.github.io/AWSS3.jl/stable/", edit_link="main"
),
pages=["Home" => "index.md", "API" => "api.md"],
)
deploydocs(; repo="github.com/JuliaCloud/AWSS3.jl.git", push_preview=true)
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 47029 | #==============================================================================#
# AWSS3.jl
#
# S3 API. See http://docs.aws.amazon.com/AmazonS3/latest/API/APIRest.html
#
# Copyright OC Technology Pty Ltd 2014 - All rights reserved
#==============================================================================#
module AWSS3
export S3Path,
s3_arn,
s3_put,
s3_get,
s3_get_file,
s3_exists,
s3_delete,
s3_copy,
s3_create_bucket,
s3_put_cors,
s3_enable_versioning,
s3_delete_bucket,
s3_list_buckets,
s3_list_objects,
s3_list_keys,
s3_list_versions,
s3_nuke_object,
s3_get_meta,
s3_directory_stat,
s3_purge_versions,
s3_sign_url,
s3_begin_multipart_upload,
s3_upload_part,
s3_complete_multipart_upload,
s3_multipart_upload,
s3_get_tags,
s3_put_tags,
s3_delete_tags
using AWS
using AWS.AWSServices: s3
using ArrowTypes
using Base64
using Compat: @something
using Dates
using EzXML
using FilePathsBase
using FilePathsBase: /, join
using HTTP: HTTP
using Mocking
using OrderedCollections: OrderedDict, LittleDict
using Retry
using SymDict
using URIs
using UUIDs
using XMLDict
@service S3 use_response_type = true
const SSDict = Dict{String,String}
const AbstractS3Version = Union{AbstractString,Nothing}
const AbstractS3PathConfig = Union{AbstractAWSConfig,Nothing}
# Utility function to workaround https://github.com/JuliaCloud/AWS.jl/issues/547
function get_robust_case(x, key)
lkey = lowercase(key)
haskey(x, lkey) && return x[lkey]
return x[key]
end
__init__() = FilePathsBase.register(S3Path)
# Declare new `parse` function to avoid type piracy
# TODO: remove when changes are released: https://github.com/JuliaCloud/AWS.jl/pull/502
function parse(r::AWS.Response, mime::MIME)
# AWS doesn't always return a Content-Type which results the parsing returning bytes
# instead of a dictionary. To address this we'll allow passing in the MIME type.
return try
AWS._rewind(r.io) do io
AWS._read(io, mime)
end
catch e
@warn "Failed to parse the following content as $mime:\n\"\"\"$(String(r.body))\"\"\""
rethrow(e)
end
end
parse(args...; kwargs...) = Base.parse(args...; kwargs...)
"""
s3_arn(resource)
s3_arn(bucket,path)
[Amazon Resource Name](http://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html)
for S3 `resource` or `bucket` and `path`.
"""
s3_arn(resource) = "arn:aws:s3:::$resource"
s3_arn(bucket, path) = s3_arn("$bucket/$path")
"""
s3_get([::AbstractAWSConfig], bucket, path; <keyword arguments>)
Retrieves an object from the `bucket` for a given `path`.
# Optional Arguments
- `version=nothing`: version of object to get.
- `retry=true`: try again on "NoSuchBucket", "NoSuchKey" (common if object was recently
created).
- `raw=false`: return response as `Vector{UInt8}`
- `byte_range=nothing`: given an iterator of `(start_byte, end_byte)` gets only
the range of bytes of the object from `start_byte` to `end_byte`. For example,
`byte_range=1:4` gets bytes 1 to 4 inclusive. Arguments should use the Julia convention
of 1-based indexing.
- `header::Dict{String,String}`: pass in an HTTP header to the request.
As an example of how to set custom HTTP headers, the below is equivalent to
`s3_get(aws, bucket, path; byte_range=range)`:
```julia
s3_get(aws, bucket, path; headers=Dict{String,String}("Range" => "bytes=\$(first(range)-1)-\$(last(range)-1)"))
```
# API Calls
- [`GetObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetObject.html)
# Permissions
- [`s3:GetObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObject):
(conditional): required when `version === nothing`.
- [`s3:GetObjectVersion`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObjectVersion):
(conditional): required when `version !== nothing`.
- [`s3:ListBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucket)
(optional): allows requests to non-existent objects to throw a exception with HTTP status
code 404 (Not Found) instead of HTTP status code 403 (Access Denied).
"""
function s3_get(
aws::AbstractAWSConfig,
bucket,
path;
version::AbstractS3Version=nothing,
retry::Bool=true,
raw::Bool=false,
byte_range::Union{Nothing,AbstractVector}=nothing,
headers::AbstractDict{<:AbstractString,<:Any}=Dict{String,Any}(),
return_stream::Bool=false,
kwargs...,
)
@repeat 4 try
params = Dict{String,Any}()
return_stream && (params["response_stream"] = Base.BufferStream())
if version !== nothing
params["versionId"] = version
end
if byte_range !== nothing
headers = copy(headers) # make sure we don't mutate existing object
# we make sure we stick to the Julia convention of 1-based indexing
a, b = (first(byte_range) - 1), (last(byte_range) - 1)
headers["Range"] = "bytes=$a-$b"
end
if !isempty(headers)
params["headers"] = headers
end
r = S3.get_object(bucket, path, params; aws_config=aws, kwargs...)
return if return_stream
close(r.io)
r.io
elseif raw
r.body
else
parse(r)
end
catch e
#! format: off
# https://github.com/domluna/JuliaFormatter.jl/issues/459
@delay_retry if retry && ecode(e) in ["NoSuchBucket", "NoSuchKey"] end
#! format: on
end
end
s3_get(a...; b...) = s3_get(global_aws_config(), a...; b...)
"""
s3_get_file([::AbstractAWSConfig], bucket, path, filename; [version=], kwargs...)
Like `s3_get` but streams the result directly to `filename`. The keyword arguments accepted are
the same as those for `s3_get`.
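# Example
A sketch with illustrative bucket, key, and file names (the default global AWS
configuration is used when no config is passed):
```julia
s3_get_file("my-bucket", "data/object.csv", "/tmp/object.csv")
```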
"""
function s3_get_file(
aws::AbstractAWSConfig,
bucket,
path,
filename;
version::AbstractS3Version=nothing,
kwargs...,
)
stream = s3_get(aws, bucket, path; version=version, return_stream=true, kwargs...)
open(filename, "w") do file
while !eof(stream)
write(file, readavailable(stream))
end
end
end
s3_get_file(a...; b...) = s3_get_file(global_aws_config(), a...; b...)
function s3_get_file(
aws::AbstractAWSConfig,
buckets::Vector,
path,
filename;
version::AbstractS3Version=nothing,
kwargs...,
)
i = firstindex(buckets)
@repeat length(buckets) try
bucket = buckets[i]
i += 1
s3_get_file(aws, bucket, path, filename; version=version, kwargs...)
catch e
#! format: off
@retry if ecode(e) in ["NoSuchKey", "AccessDenied"] end
#! format: on
end
end
"""
s3_get_meta([::AbstractAWSConfig], bucket, path; [version], kwargs...)
Retrieves metadata from an object without returning the object itself.
# API Calls
- [`HeadObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_HeadObject.html)
# Permissions
- [`s3:GetObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObject)
(conditional): required when `version === nothing`.
- [`s3:GetObjectVersion`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObjectVersion):
(conditional): required when `version !== nothing`.
- [`s3:ListBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucket)
(optional): allows requests to non-existent objects to throw a exception with HTTP status
code 404 (Not Found) instead of HTTP status code 403 (Access Denied).
"""
function s3_get_meta(
aws::AbstractAWSConfig, bucket, path; version::AbstractS3Version=nothing, kwargs...
)
params = Dict{String,Any}()
if version !== nothing
params["versionId"] = version
end
r = S3.head_object(bucket, path, params; aws_config=aws, kwargs...)
return Dict(r.headers)
end
s3_get_meta(a...; b...) = s3_get_meta(global_aws_config(), a...; b...)
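# Internal helper for `s3_exists`: lists at most one key with `path` as prefix
# and reports existence only on an exact key match.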
function _s3_exists_file(aws::AbstractAWSConfig, bucket, path)
q = Dict("prefix" => path, "delimiter" => "/", "max-keys" => 1)
l = parse(S3.list_objects_v2(bucket, q; aws_config=aws))
c = get(l, "Contents", nothing)
c === nothing && return false
return get(c, "Key", "") == path
end
"""
_s3_exists_dir(aws::AbstractAWSConfig, bucket, path)
An internal function used by [`s3_exists`](@ref).
Checks if the given directory exists within the `bucket`. Since S3 uses a flat structure, as
opposed to being hierarchical like a file system, directories are actually just a collection
of object keys which share a common prefix. S3 implements empty directories as
[0-byte objects](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-folders.html)
with keys ending with the delimiter.
It is possible to create non-0-byte objects with a key ending in the delimiter
(e.g. `s3_put(bucket, "abomination/", "If I cannot inspire love, I will cause fear!")`)
which the AWS console interprets as the directory "abomination" containing the object "/".
"""
function _s3_exists_dir(aws::AbstractAWSConfig, bucket, path)
endswith(path, '/') || throw(ArgumentError("S3 directories must end with '/': $path"))
q = Dict("prefix" => path, "delimiter" => "/", "max-keys" => 1)
r = parse(S3.list_objects_v2(bucket, q; aws_config=aws))
return get(r, "KeyCount", "0") != "0"
end
"""
s3_exists_versioned([::AbstractAWSConfig], bucket, path, version)
Returns whether an object with version `version` exists at the key `path` in the `bucket`.
Note that the AWS API's support for object versioning is quite limited and this check will
involve `try`/`catch` logic. Prefer using [`s3_exists_unversioned`](@ref) where possible
for more performant checks.
See [`s3_exists`](@ref) and [`s3_exists_unversioned`](@ref).
# API Calls
- [`ListObjectV2`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html)
# Permissions
- [`s3:GetObjectVersion`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObjectVersion)
- [`s3:ListBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucket)
(optional): allows requests to non-existent objects to throw an exception with HTTP status
code 404 (Not Found) instead of HTTP status code 403 (Access Denied).
"""
function s3_exists_versioned(
aws::AbstractAWSConfig, bucket, path, version::AbstractS3Version
)
@repeat 2 try
s3_get_meta(aws, bucket, path; version=version)
return true
catch e
#! format: off
@delay_retry if ecode(e) in ["NoSuchBucket", "404", "NoSuchKey", "AccessDenied"] end
#! format: on
@ignore if ecode(e) in ["404", "NoSuchKey", "AccessDenied"]
return false
end
end
end
"""
s3_exists_unversioned([::AbstractAWSConfig], bucket, path)
Returns a boolean indicating whether an object exists at `path` in `bucket`.
See [`s3_exists`](@ref) and [`s3_exists_versioned`](@ref).
# API Calls
- [`ListObjectV2`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html)
# Permissions
- [`s3:GetObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObject)
- [`s3:ListBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucket)
(optional): allows requests to non-existent objects to throw an exception with HTTP status
code 404 (Not Found) instead of HTTP status code 403 (Access Denied).
"""
function s3_exists_unversioned(aws::AbstractAWSConfig, bucket, path)
f = endswith(path, '/') ? _s3_exists_dir : _s3_exists_file
return f(aws, bucket, path)
end
"""
s3_exists([::AbstractAWSConfig], bucket, path; version=nothing)
Returns whether an object exists with the key `path` in the `bucket`. If a `version` is specified
then an object must exist with the specified version identifier.
Note that the AWS API's support for object versioning is quite limited and this check will
involve `try`/`catch` logic if a `version` is specified.
# API Calls
- [`ListObjectV2`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html)
# Permissions
- [`s3:GetObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObject)
(conditional): required when `version === nothing`.
- [`s3:GetObjectVersion`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObjectVersion):
(conditional): required when `version !== nothing`.
- [`s3:ListBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucket)
(optional): allows requests to non-existent objects to throw an exception with HTTP status
code 404 (Not Found) instead of HTTP status code 403 (Access Denied).
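# Example
A minimal sketch (the bucket and key are assumptions):
```
if s3_exists("my-bucket", "data.txt")
    data = s3_get("my-bucket", "data.txt")
end
```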
"""
function s3_exists(aws::AbstractAWSConfig, bucket, path; version::AbstractS3Version=nothing)
if version !== nothing
s3_exists_versioned(aws, bucket, path, version)
else
s3_exists_unversioned(aws, bucket, path)
end
end
s3_exists(a...; b...) = s3_exists(global_aws_config(), a...; b...)
"""
s3_delete([::AbstractAWSConfig], bucket, path; [version], kwargs...)
Deletes an object from a bucket. The `version` argument can be used to delete a specific
version.
# API Calls
- [`DeleteObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteObject.html)
# Permissions
- [`s3:DeleteObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteObject)
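# Example
A minimal sketch (the bucket, key, and version id are assumptions):
```
s3_delete("my-bucket", "data.txt")                        # delete the latest version
s3_delete("my-bucket", "data.txt"; version="<versionId>") # delete one specific version
```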
"""
function s3_delete(
aws::AbstractAWSConfig, bucket, path; version::AbstractS3Version=nothing, kwargs...
)
params = Dict{String,Any}()
if version !== nothing
params["versionId"] = version
end
return parse(S3.delete_object(bucket, path, params; aws_config=aws, kwargs...))
end
s3_delete(a...; b...) = s3_delete(global_aws_config(), a...; b...)
"""
s3_nuke_object([::AbstractAWSConfig], bucket, path; kwargs...)
Deletes all versions of object `path` from `bucket`. All provided `kwargs` are forwarded to
[`s3_delete`](@ref). In the event an error occurs any object versions already deleted by
`s3_nuke_object` will be lost.
To only delete one specific version, use [`s3_delete`](@ref); to delete all versions
EXCEPT the latest version, use [`s3_purge_versions`](@ref); to delete all versions
in an entire bucket, use [`AWSS3.s3_nuke_bucket`](@ref).
# API Calls
- [`DeleteObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteObject.html)
- [`ListObjectVersions`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectVersions.html)
# Permissions
- [`s3:DeleteObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteObject)
- [`s3:ListBucketVersions`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucketVersions)
"""
function s3_nuke_object(aws::AbstractAWSConfig, bucket, path; kwargs...)
# Because list_versions returns ALL keys with the given _prefix_, we need to
# restrict the results to ones with the _exact same_ key.
for object in s3_list_versions(aws, bucket, path)
object["Key"] == path || continue
version = object["VersionId"]
try
s3_delete(aws, bucket, path; version, kwargs...)
catch e
@warn "Failed to delete version $(version) of $(path)"
rethrow(e)
end
end
return nothing
end
function s3_nuke_object(bucket, path; kwargs...)
return s3_nuke_object(global_aws_config(), bucket, path; kwargs...)
end
"""
s3_copy([::AbstractAWSConfig], bucket, path; acl::AbstractString="",
to_bucket=bucket, to_path=path, metadata::AbstractDict=SSDict(),
parse_response::Bool=true, kwargs...)
Copy the object at `path` in `bucket` to `to_path` in `to_bucket`.
# Optional Arguments
- `acl=`; `x-amz-acl` header for setting access permissions with canned config.
See [here](https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl).
- `metadata::Dict=`; `x-amz-meta-` headers.
- `parse_response::Bool=`; when `false`, return raw `AWS.Response`
- `kwargs`; additional kwargs passed through into `S3.copy_object`
# API Calls
- [`CopyObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_CopyObject.html)
# Permissions
- [`s3:PutObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-PutObject)
- [`s3:GetObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObject)
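# Example
A minimal sketch (the bucket and key names are assumptions):
```
s3_copy("my-bucket", "data.txt"; to_bucket="my-other-bucket", to_path="backup/data.txt")
```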
"""
function s3_copy(
aws::AbstractAWSConfig,
bucket,
path;
acl::AbstractString="",
to_bucket=bucket,
to_path=path,
metadata::AbstractDict=SSDict(),
parse_response::Bool=true,
kwargs...,
)
headers = SSDict(
"x-amz-metadata-directive" => "REPLACE",
Pair["x-amz-meta-$k" => v for (k, v) in metadata]...,
)
if !isempty(acl)
headers["x-amz-acl"] = acl
end
response = S3.copy_object(
to_bucket,
to_path,
"$bucket/$path",
Dict("headers" => headers);
aws_config=aws,
kwargs...,
)
return parse_response ? parse(response) : response
end
s3_copy(a...; b...) = s3_copy(global_aws_config(), a...; b...)
"""
s3_create_bucket([::AbstractAWSConfig], bucket; kwargs...)
Creates a new S3 bucket with the globally unique `bucket` name. The bucket will be created
in the AWS region associated with the `AbstractAWSConfig` (defaults to "us-east-1").
# API Calls
- [`CreateBucket`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_CreateBucket.html)
# Permissions
- [`s3:CreateBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-CreateBucket)
"""
function s3_create_bucket(aws::AbstractAWSConfig, bucket; kwargs...)
r = @protected try
if aws.region == "us-east-1"
S3.create_bucket(bucket; aws_config=aws, kwargs...)
else
bucket_config = """
<CreateBucketConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<LocationConstraint>$(aws.region)</LocationConstraint>
</CreateBucketConfiguration>
"""
S3.create_bucket(
bucket,
Dict("CreateBucketConfiguration" => bucket_config);
aws_config=aws,
kwargs...,
)
end
catch e
#! format: off
@ignore if ecode(e) == "BucketAlreadyOwnedByYou" end
#! format: on
end
return parse(r)
end
s3_create_bucket(a) = s3_create_bucket(global_aws_config(), a)
"""
s3_put_cors([::AbstractAWSConfig], bucket, cors_config; kwargs...)
[PUT Bucket cors](http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTcors.html)
```
s3_put_cors("my_bucket", \"\"\"
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
<AllowedOrigin>http://my.domain.com</AllowedOrigin>
<AllowedOrigin>http://my.other.domain.com</AllowedOrigin>
<AllowedMethod>GET</AllowedMethod>
<AllowedMethod>HEAD</AllowedMethod>
<AllowedHeader>*</AllowedHeader>
<ExposeHeader>Content-Range</ExposeHeader>
</CORSRule>
</CORSConfiguration>
\"\"\"
```
"""
function s3_put_cors(aws::AbstractAWSConfig, bucket, cors_config; kwargs...)
return parse(S3.put_bucket_cors(bucket, cors_config; aws_config=aws, kwargs...))
end
s3_put_cors(a...; b...) = s3_put_cors(AWS.global_aws_config(), a...; b...)
"""
s3_enable_versioning([::AbstractAWSConfig], bucket, [status]; kwargs...)
Enables or disables versioning for all objects within the given `bucket`. Use `status` to
either enable or disable versioning (respectively "Enabled" and "Suspended").
# API Calls
- [`PutBucketVersioning`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutBucketVersioning.html)
# Permissions
- [`s3:PutBucketVersioning`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-PutBucketVersioning)
"""
function s3_enable_versioning(aws::AbstractAWSConfig, bucket, status="Enabled"; kwargs...)
versioning_config = """
<VersioningConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<Status>$status</Status>
</VersioningConfiguration>
"""
r = s3(
"PUT",
"/$(bucket)?versioning",
Dict("body" => versioning_config);
aws_config=aws,
feature_set=AWS.FeatureSet(; use_response_type=true),
kwargs...,
)
return parse(r)
end
s3_enable_versioning(a; b...) = s3_enable_versioning(global_aws_config(), a; b...)
"""
s3_put_tags([::AbstractAWSConfig], bucket, [path], tags::Dict; kwargs...)
Sets the tags for a bucket or an existing object. When `path` is specified then tagging
is performed on the object, otherwise it is performed on the `bucket`.
See also [`s3_get_tags`](@ref), [`s3_delete_tags`](@ref), and the `tags` option of
[`s3_put`](@ref).
# API Calls
- [`PutBucketTagging`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutBucketTagging.html) (conditional): used when `path` is not specified (bucket tagging).
- [`PutObjectTagging`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutObjectTagging.html) (conditional): used when `path` is specified (object tagging).
# Permissions
- [`s3:PutBucketTagging`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-PutBucketTagging)
(conditional): required when `path` is not specified (bucket tagging).
- [`s3:PutObjectTagging`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-PutObjectTagging)
(conditional): required when `path` is specified (object tagging).
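# Example
A minimal sketch (the bucket, key, and tag values are assumptions):
```
s3_put_tags("my-bucket", Dict("env" => "test"))              # tag the bucket itself
s3_put_tags("my-bucket", "data.txt", Dict("env" => "test"))  # tag a single object
```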
"""
function s3_put_tags(aws::AbstractAWSConfig, bucket, path, tags::SSDict; kwargs...)
tag_set = Dict("Tag" => [Dict("Key" => k, "Value" => v) for (k, v) in tags])
tags = Dict("Tagging" => Dict("TagSet" => tag_set))
tags = XMLDict.node_xml(tags)
uri_path = isempty(path) ? "/$(bucket)?tagging" : "/$(bucket)/$(path)?tagging"
r = s3(
"PUT",
uri_path,
Dict("body" => tags);
feature_set=AWS.FeatureSet(; use_response_type=true),
aws_config=aws,
kwargs...,
)
return parse(r)
end
function s3_put_tags(aws::AbstractAWSConfig, bucket, tags::SSDict; kwargs...)
return s3_put_tags(aws, bucket, "", tags; kwargs...)
end
s3_put_tags(a...) = s3_put_tags(global_aws_config(), a...)
"""
s3_get_tags([::AbstractAWSConfig], bucket, [path]; kwargs...)
Get the tags associated with a bucket or an existing object. When `path` is specified then
tag retrieval is performed on the object, otherwise it is performed on the `bucket`.
See also [`s3_put_tags`](@ref) and [`s3_delete_tags`](@ref).
# API Calls
- [`GetBucketTagging`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetBucketTagging.html) (conditional): used when `path` is not specified (bucket tagging).
- [`GetObjectTagging`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetObjectTagging.html) (conditional): used when `path` is specified (object tagging).
# Permissions
- [`s3:GetBucketTagging`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetBucketTagging)
(conditional): required when `path` is not specified (bucket tagging).
- [`s3:GetObjectTagging`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObjectTagging)
(conditional): required when `path` is specified (object tagging).
"""
function s3_get_tags(aws::AbstractAWSConfig, bucket, path=""; kwargs...)
@protected try
tags = []
r = if isempty(path)
S3.get_bucket_tagging(bucket; aws_config=aws, kwargs...)
else
S3.get_object_tagging(bucket, path; aws_config=aws, kwargs...)
end
tags = parse(r, MIME"application/xml"())
if isempty(tags["TagSet"])
return SSDict()
end
tags = tags["TagSet"]
tags = isa(tags["Tag"], Vector) ? tags["Tag"] : [tags["Tag"]]
return SSDict(x["Key"] => x["Value"] for x in tags)
catch e
@ignore if ecode(e) == "NoSuchTagSet"
return SSDict()
end
end
end
s3_get_tags(a...; b...) = s3_get_tags(global_aws_config(), a...; b...)
"""
s3_delete_tags([::AbstractAWSConfig], bucket, [path])
Delete the tags associated with a bucket or an existing object. When `path` is specified then
tag deletion is performed on the object, otherwise it is performed on the `bucket`.
See also [`s3_put_tags`](@ref) and [`s3_get_tags`](@ref).
# API Calls
- [`DeleteBucketTagging`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteBucketTagging.html) (conditional): used when `path` is not specified (bucket tagging).
- [`DeleteObjectTagging`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteObjectTagging.html) (conditional): used when `path` is specified (object tagging).
# Permissions
- [`s3:PutBucketTagging`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-PutBucketTagging)
(conditional): required when `path` is not specified (bucket tagging).
- [`s3:DeleteObjectTagging`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteObjectTagging)
(conditional): required when `path` is specified (object tagging).
"""
function s3_delete_tags(aws::AbstractAWSConfig, bucket, path=""; kwargs...)
r = if isempty(path)
S3.delete_bucket_tagging(bucket; aws_config=aws, kwargs...)
else
S3.delete_object_tagging(bucket, path; aws_config=aws, kwargs...)
end
return parse(r)
end
s3_delete_tags(a...; b...) = s3_delete_tags(global_aws_config(), a...; b...)
"""
s3_delete_bucket([::AbstractAWSConfig], "bucket"; kwargs...)
Deletes an empty bucket. All objects in the bucket must be deleted before a bucket can be
deleted.
See also [`AWSS3.s3_nuke_bucket`](@ref).
# API Calls
- [`DeleteBucket`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteBucket.html)
# Permissions
- [`s3:DeleteBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteBucket)
"""
function s3_delete_bucket(aws::AbstractAWSConfig, bucket; kwargs...)
return parse(S3.delete_bucket(bucket; aws_config=aws, kwargs...))
end
s3_delete_bucket(a; b...) = s3_delete_bucket(global_aws_config(), a; b...)
"""
s3_list_buckets([::AbstractAWSConfig]; kwargs...)
Return a list of all of the buckets owned by the authenticated sender of the request.
# API Calls
- [`ListBuckets`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListBuckets.html)
# Permissions
- [`s3:ListAllMyBuckets`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListAllMyBuckets)
"""
function s3_list_buckets(aws::AbstractAWSConfig=global_aws_config(); kwargs...)
r = S3.list_buckets(; aws_config=aws, kwargs...)
buckets = parse(r)["Buckets"]
isempty(buckets) && return []
buckets = buckets["Bucket"]
return [b["Name"] for b in (isa(buckets, Vector) ? buckets : [buckets])]
end
"""
s3_list_objects([::AbstractAWSConfig], bucket, [path_prefix]; delimiter="/", start_after="", max_items=nothing, kwargs...)
[List Objects](http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketGET.html)
in `bucket` with optional `path_prefix`.
Returns an iterator of `Dict`s with keys `Key`, `LastModified`, `ETag`, `Size`,
`Owner`, `StorageClass`.
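# Example
A minimal sketch; the result is a lazily-populated `Channel` of `Dict`s (the bucket and
prefix are assumptions):
```
for obj in s3_list_objects("my-bucket", "some/prefix/")
    println(obj["Key"], " => ", obj["Size"])
end
```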
"""
function s3_list_objects(
aws::AbstractAWSConfig,
bucket,
path_prefix="";
delimiter="/",
start_after="",
max_items=nothing,
kwargs...,
)
return Channel(; ctype=LittleDict, csize=128) do chnl
more = true
num_objects = 0
token = ""
while more
q = Dict{String,String}("prefix" => path_prefix)
for (name, v) in [
("delimiter", delimiter),
("start-after", start_after),
("continuation-token", token),
]
isempty(v) || (q[name] = v)
end
if max_items !== nothing
# Note: AWS seems to only return up to 1000 items
q["max-keys"] = string(max_items - num_objects)
end
@repeat 4 try
# Request objects
r = parse(S3.list_objects_v2(bucket, q; aws_config=aws, kwargs...))
token = get(r, "NextContinuationToken", "")
isempty(token) && (more = false)
if haskey(r, "Contents")
l = isa(r["Contents"], Vector) ? r["Contents"] : [r["Contents"]]
for object in l
put!(chnl, object)
num_objects += 1
end
end
catch e
#! format: off
@delay_retry if ecode(e) in ["NoSuchBucket"] end
#! format: on
end
end
end
end
s3_list_objects(a...; kw...) = s3_list_objects(global_aws_config(), a...; kw...)
"""
s3_directory_stat([::AbstractAWSConfig], bucket, path)
Determine the properties of an S3 "directory" (its total size and the time of last modification),
which cannot be determined directly with the standard AWS API. This returns a tuple `(s, tmlast)`
where `s` is the size in bytes, and `tmlast` is the time of the latest modification to a file
within that directory.
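# Example
A minimal sketch (the bucket and prefix are assumptions):
```
bytes, mtime = s3_directory_stat("my-bucket", "some/prefix/")
```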
"""
function s3_directory_stat(
aws::AbstractAWSConfig, bucket::AbstractString, path::AbstractString
)
s = 0 # total size in bytes
tmlast = typemin(DateTime)
# setting delimiter is needed to get all objects within path,
# additionally, we have to make sure the path ends with "/" or it will pick up extra stuff
endswith(path, "/") || (path = path * "/")
for obj in s3_list_objects(aws, bucket, path; delimiter="")
s += parse(Int, get(obj, "Size", "0"))
t = get(obj, "LastModified", nothing)
t = t ≡ nothing ? tmlast : DateTime(t[1:(end - 4)])
tmlast = max(tmlast, t)
end
return s, tmlast
end
s3_directory_stat(a...) = s3_directory_stat(global_aws_config(), a...)
"""
s3_list_keys([::AbstractAWSConfig], bucket, [path_prefix]; kwargs...)
Like [`s3_list_objects`](@ref) but returns object keys as `Vector{String}`.
"""
function s3_list_keys(aws::AbstractAWSConfig, bucket, path_prefix=""; kwargs...)
return (o["Key"] for o in s3_list_objects(aws, bucket, path_prefix; kwargs...))
end
s3_list_keys(a...; b...) = s3_list_keys(global_aws_config(), a...; b...)
"""
s3_list_versions([::AbstractAWSConfig], bucket, [path_prefix]; kwargs...)
List metadata about all versions of the objects in the `bucket` matching the
optional `path_prefix`.
# API Calls
- [`ListObjectVersions`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectVersions.html)
# Permissions
- [`s3:ListBucketVersions`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucketVersions)
"""
function s3_list_versions(aws::AbstractAWSConfig, bucket, path_prefix=""; kwargs...)
more = true
versions = []
marker = ""
while more
query = Dict{String,Any}("versions" => "", "prefix" => path_prefix)
if !isempty(marker)
query["key-marker"] = marker
end
r = S3.list_object_versions(bucket, query; aws_config=aws, kwargs...)
r = parse_xml(String(r))
more = r["IsTruncated"] == "true"
for e in eachelement(EzXML.root(r.x))
if nodename(e) in ["Version", "DeleteMarker"]
version = xml_dict(e)
version["state"] = nodename(e)
push!(versions, version)
marker = version["Key"]
end
end
end
return versions
end
s3_list_versions(a...; b...) = s3_list_versions(global_aws_config(), a...; b...)
"""
s3_purge_versions([::AbstractAWSConfig], bucket, [path_prefix [, pattern]]; kwargs...)
Removes all versions of an object except for the latest version. When `path_prefix` is
provided then only objects whose key starts with `path_prefix` will be purged. Use of
`pattern` further restricts which objects are purged by only purging object keys containing
the `pattern` (i.e. a string literal or regex). When neither `path_prefix` nor `pattern` is
specified, all objects in the bucket will be purged.
# API Calls
- [`ListObjectVersions`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectVersions.html)
- [`DeleteObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteObject.html)
# Permissions
- [`s3:ListBucketVersions`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucketVersions)
- [`s3:DeleteObjectVersion`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteObjectVersion)
"""
function s3_purge_versions(
aws::AbstractAWSConfig, bucket, path_prefix="", pattern=""; kwargs...
)
for v in s3_list_versions(aws, bucket, path_prefix; kwargs...)
if pattern == "" || occursin(pattern, v["Key"])
if v["IsLatest"] != "true"
S3.delete_object(
bucket,
v["Key"],
Dict("versionId" => v["VersionId"]);
aws_config=aws,
kwargs...,
)
end
end
end
end
s3_purge_versions(a...; b...) = s3_purge_versions(global_aws_config(), a...; b...)
"""
s3_put([::AbstractAWSConfig], bucket, path, data, data_type="", encoding="";
acl::AbstractString="", metadata::SSDict=SSDict(), tags::AbstractDict=SSDict(),
parse_response::Bool=true, kwargs...)
[PUT Object](http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html)
`data` at `path` in `bucket`.
# Optional Arguments
- `data_type=`; `Content-Type` header.
- `encoding=`; `Content-Encoding` header.
- `acl=`; `x-amz-acl` header for setting access permissions with canned config.
See [here](https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl).
- `metadata::Dict=`; `x-amz-meta-` headers.
- `tags::Dict=`; `x-amz-tagging-` headers
(see also [`s3_put_tags`](@ref) and [`s3_get_tags`](@ref)).
- `parse_response::Bool=`; when `false`, return raw `AWS.Response`
- `kwargs`; additional kwargs passed through into `S3.put_object`
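# Example
A minimal sketch uploading a small text object (the bucket and key are assumptions):
```
s3_put("my-bucket", "hello.txt", "Hello, world!", "text/plain")
```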
"""
function s3_put(
aws::AbstractAWSConfig,
bucket,
path,
data::Union{String,Vector{UInt8}},
data_type="",
encoding="";
acl::AbstractString="",
metadata::SSDict=SSDict(),
tags::AbstractDict=SSDict(),
parse_response::Bool=true,
kwargs...,
)
headers = Dict{String,Any}(["x-amz-meta-$k" => v for (k, v) in metadata])
if isempty(data_type)
data_type = "application/octet-stream"
ext = splitext(path)[2]
for (e, t) in [
(".html", "text/html"),
(".js", "application/javascript"),
(".pdf", "application/pdf"),
(".csv", "text/csv"),
(".txt", "text/plain"),
(".log", "text/plain"),
(".dat", "application/octet-stream"),
(".gz", "application/octet-stream"),
(".bz2", "application/octet-stream"),
]
if ext == e
data_type = t
break
end
end
end
headers["Content-Type"] = data_type
if !isempty(tags)
headers["x-amz-tagging"] = URIs.escapeuri(tags)
end
if !isempty(acl)
headers["x-amz-acl"] = acl
end
if !isempty(encoding)
headers["Content-Encoding"] = encoding
end
args = Dict("body" => data, "headers" => headers)
response = S3.put_object(bucket, path, args; aws_config=aws, kwargs...)
return parse_response ? parse(response) : response
end
s3_put(a...; b...) = s3_put(global_aws_config(), a...; b...)
function s3_begin_multipart_upload(
aws::AbstractAWSConfig,
bucket,
path,
args=Dict{String,Any}();
kwargs...,
# format trick: using this comment to force use of multiple lines
)
r = S3.create_multipart_upload(bucket, path, args; aws_config=aws, kwargs...)
return parse(r, MIME"application/xml"())
end
function s3_upload_part(
aws::AbstractAWSConfig,
upload,
part_number,
part_data;
args=Dict{String,Any}(),
kwargs...,
)
args["body"] = part_data
response = S3.upload_part(
upload["Bucket"],
upload["Key"],
part_number,
upload["UploadId"],
args;
aws_config=aws,
kwargs...,
)
return get_robust_case(Dict(response.headers), "ETag")
end
function s3_complete_multipart_upload(
aws::AbstractAWSConfig,
upload,
parts::Vector{String},
args=Dict{String,Any}();
parse_response::Bool=true,
kwargs...,
)
doc = XMLDocument()
rootnode = setroot!(doc, ElementNode("CompleteMultipartUpload"))
for (i, etag) in enumerate(parts)
part = addelement!(rootnode, "Part")
addelement!(part, "PartNumber", string(i))
addelement!(part, "ETag", etag)
end
args["body"] = string(doc)
response = S3.complete_multipart_upload(
upload["Bucket"], upload["Key"], upload["UploadId"], args; aws_config=aws, kwargs...
)
return parse_response ? parse(response) : response
end
"""
s3_multipart_upload(aws::AbstractAWSConfig, bucket, path, io::IO, part_size_mb=50;
parse_response::Bool=true, kwargs...)
Upload data from `io` to `path` in `bucket` using a [multipart upload](https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html)
# Optional Arguments
- `part_size_mb`: maximum size per uploaded part, in megabytes.
- `parse_response`: when `false`, return raw `AWS.Response`
- `kwargs`: additional kwargs passed through into `s3_upload_part` and `s3_complete_multipart_upload`
# API Calls
- [`CreateMultipartUpload`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_CreateMultipartUpload.html)
- [`UploadPart`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_UploadPart.html)
- [`CompleteMultipartUpload`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_CompleteMultipartUpload.html)
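# Example
A minimal sketch streaming a local file in 50 MB parts (the bucket, key, and file name are
assumptions):
```
open("/tmp/big.bin", "r") do io
    s3_multipart_upload(global_aws_config(), "my-bucket", "big.bin", io, 50)
end
```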
"""
function s3_multipart_upload(
aws::AbstractAWSConfig,
bucket,
path,
io::IO,
part_size_mb=50;
parse_response::Bool=true,
kwargs...,
)
part_size = part_size_mb * 1024 * 1024
upload = s3_begin_multipart_upload(aws, bucket, path)
tags = Vector{String}()
buf = Vector{UInt8}(undef, part_size)
i = 0
while (n = readbytes!(io, buf, part_size)) > 0
if n < part_size
resize!(buf, n)
end
push!(tags, s3_upload_part(aws, upload, (i += 1), buf; kwargs...))
end
return s3_complete_multipart_upload(aws, upload, tags; parse_response, kwargs...)
end
using MbedTLS
function _s3_sign_url_v2(
aws::AbstractAWSConfig,
bucket,
path,
seconds=3600;
verb="GET",
content_type="application/octet-stream",
protocol="http",
)
path = URIs.escapepath(path)
expires = round(Int, Dates.datetime2unix(now(Dates.UTC)) + seconds)
query = SSDict(
"AWSAccessKeyId" => aws.credentials.access_key_id,
"x-amz-security-token" => aws.credentials.token,
"Expires" => string(expires),
"response-content-disposition" => "attachment",
)
if verb != "PUT"
content_type = ""
end
to_sign =
"$verb\n\n$content_type\n$(query["Expires"])\n" *
"x-amz-security-token:$(query["x-amz-security-token"])\n" *
"/$bucket/$path?" *
"response-content-disposition=attachment"
key = aws.credentials.secret_key
query["Signature"] = strip(base64encode(digest(MD_SHA1, to_sign, key)))
endpoint = string(protocol, "://", bucket, ".s3.", aws.region, ".amazonaws.com")
return "$endpoint/$path?$(URIs.escapeuri(query))"
end
function _s3_sign_url_v4(
aws::AbstractAWSConfig,
bucket,
path,
seconds=3600;
verb="GET",
content_type="application/octet-stream",
protocol="http",
)
path = URIs.escapepath("/$bucket/$path")
now_datetime = now(Dates.UTC)
datetime_stamp = Dates.format(now_datetime, "YYYYmmddTHHMMSS\\Z")
date_stamp = Dates.format(now_datetime, "YYYYmmdd")
service = "s3"
scheme = "AWS4"
algorithm = "HMAC-SHA256"
terminator = "aws4_request"
scope = "$date_stamp/$(aws.region)/$service/$terminator"
host = if aws.region == "us-east-1"
"s3.amazonaws.com"
else
"s3-$(aws.region).amazonaws.com"
end
headers = OrderedDict{String,String}("Host" => host)
sort!(headers; by=name -> lowercase(name))
canonical_header_names = join(map(name -> lowercase(name), collect(keys(headers))), ";")
    query = OrderedDict{String,String}(
        "X-Amz-Expires" => string(seconds),
        "X-Amz-Algorithm" => "$scheme-$algorithm",
        "X-Amz-Credential" => "$(aws.credentials.access_key_id)/$scope",
        "X-Amz-Date" => datetime_stamp,
        "X-Amz-SignedHeaders" => canonical_header_names,
    )
    # Only sign a security token when one is present; unconditionally including an empty
    # "X-Amz-Security-Token" would produce an invalid pre-signed URL.
    if !isempty(aws.credentials.token)
        query["X-Amz-Security-Token"] = aws.credentials.token
    end
sort!(query; by=name -> lowercase(name))
canonical_headers = join(
map(
header -> "$(lowercase(header.first)):$(lowercase(header.second))\n",
collect(headers),
),
)
canonical_request = string(
"$verb\n",
"$path\n",
"$(URIs.escapeuri(query))\n",
"$canonical_headers\n",
"$canonical_header_names\n",
"UNSIGNED-PAYLOAD",
)
string_to_sign = string(
"$scheme-$algorithm\n",
"$datetime_stamp\n",
"$scope\n",
bytes2hex(digest(MD_SHA256, canonical_request)),
)
key_secret = string(scheme, aws.credentials.secret_key)
key_date = digest(MD_SHA256, date_stamp, key_secret)
key_region = digest(MD_SHA256, aws.region, key_date)
key_service = digest(MD_SHA256, service, key_region)
key_signing = digest(MD_SHA256, terminator, key_service)
signature = digest(MD_SHA256, string_to_sign, key_signing)
query["X-Amz-Signature"] = bytes2hex(signature)
return string(protocol, "://", host, path, "?", URIs.escapeuri(query))
end
"""
s3_sign_url([::AbstractAWSConfig], bucket, path, [seconds=3600];
[verb="GET"], [content_type="application/octet-stream"],
[protocol="http"], [signature_version="v4"])
Create a [pre-signed url](http://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURL.html) for `bucket` and `path` (expires after `seconds` seconds).
To create an upload URL use `verb="PUT"` and set `content_type` to match
the type used in the `Content-Type` header of the PUT request.
For compatibility, the signature version 2 signing process can be used by setting
`signature_version="v2"` but it is [recommended](https://docs.aws.amazon.com/general/latest/gr/signature-version-2.html) that the default version 4 is used.
```
url = s3_sign_url("my_bucket", "my_file.txt"; verb="PUT")
HTTP.put(url, [], "Hello!")
```
```
url = s3_sign_url("my_bucket", "my_file.txt";
                  verb="PUT", content_type="text/plain")
HTTP.put(url, ["Content-Type" => "text/plain"], "Hello!")
```
# Permissions
- [`s3:GetObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-GetObject)
(conditional): required permission when `verb="GET"`.
- [`s3:PutObject`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-PutObject)
(conditional): required permission when `verb="PUT"`.
"""
function s3_sign_url(
aws::AbstractAWSConfig,
bucket,
path,
seconds=3600;
verb="GET",
content_type="application/octet-stream",
protocol="http",
signature_version="v4",
)
if signature_version == "v2"
_s3_sign_url_v2(aws, bucket, path, seconds; verb, content_type, protocol)
elseif signature_version == "v4"
_s3_sign_url_v4(aws, bucket, path, seconds; verb, content_type, protocol)
else
throw(ArgumentError("Unknown signature version $signature_version"))
end
end
s3_sign_url(a...; b...) = s3_sign_url(global_aws_config(), a...; b...)
"""
s3_nuke_bucket([::AbstractAWSConfig], bucket_name)
Deletes a bucket including all of the object versions in that bucket. Users should not call
this function unless they are certain they want to permanently delete all of the data that
resides within this bucket.
`s3_nuke_bucket` is purposefully *not* exported as a safeguard against accidental
usage.
!!! warning
Permanent data loss will occur when using this function. Do not use this function unless
you understand the risks. By using this function you accept all responsibility around
any repercussions with the loss of this data.
# API Calls
- [`ListObjectVersions`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectVersions.html)
- [`DeleteObject`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteObject.html)
- [`DeleteBucket`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_DeleteBucket.html)
# Permissions
- [`s3:ListBucketVersions`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-ListBucketVersions)
- [`s3:DeleteObjectVersion`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteObjectVersion):
required even on buckets that do not have versioning enabled.
- [`s3:DeleteBucket`](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-DeleteBucket)
"""
function s3_nuke_bucket(aws::AbstractAWSConfig, bucket_name)
for v in s3_list_versions(aws, bucket_name)
s3_delete(aws, bucket_name, v["Key"]; version=v["VersionId"])
end
return s3_delete_bucket(aws, bucket_name)
end
include("s3path.jl")
end #module AWSS3
#==============================================================================#
# End of file.
#==============================================================================#
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 25528 | # Amazon states that version IDs are UTF-8 encoded, URL-ready, opaque strings no longer than
# 1024 bytes
# – https://docs.aws.amazon.com/AmazonS3/latest/userguide/versioning-workflows.html#version-ids
#
# In practise version IDs seems to be much narrower in scope:
# https://github.com/JuliaCloud/AWSS3.jl/pull/199#issuecomment-901995960
#
# An unversioned object can be accessed using the "null" version ID. For details see:
# https://github.com/JuliaCloud/AWSS3.jl/issues/241
const VERSION_ID_REGEX = r"^(?:[0-9a-zA-Z\._]{32}|null)$"
struct S3Path{A<:AbstractS3PathConfig} <: AbstractPath
segments::Tuple{Vararg{String}}
root::String
drive::String
isdirectory::Bool
version::Union{String,Nothing}
config::A
# Inner constructor performs no data checking and is only meant for direct use by
# deserialization.
function S3Path{A}(
segments, root, drive, isdirectory, version, config::A
) where {A<:AbstractS3PathConfig}
return new(segments, root, drive, isdirectory, version, config)
end
end
function S3Path(
segments, root, drive, isdirectory, version::AbstractString, config::A
) where {A<:AbstractS3PathConfig}
# Validate the `version` string provided is valid. Having this check during construction
# allows us to fail early instead of having to wait to make an API call to fail.
if !occursin(VERSION_ID_REGEX, version)
throw(ArgumentError("`version` string is invalid: $(repr(version))"))
end
return S3Path{A}(segments, root, drive, isdirectory, version, config)
end
function S3Path(
segments, root, drive, isdirectory, version::Nothing, config::A
) where {A<:AbstractS3PathConfig}
return S3Path{A}(segments, root, drive, isdirectory, version, config)
end
"""
S3Path()
S3Path(str::AbstractString; version::$(AbstractS3Version)=nothing, config::$(AbstractS3PathConfig)=nothing)
S3Path(path::S3Path; isdirectory=path.isdirectory, version=path.version, config=path.config)
Construct a new AWS S3 path type which should be of the form
`"s3://<bucket>/prefix/to/my/object"`.
NOTES:
- Directories are required to have a trailing `/` due to how S3
distinguishes files from folders, as internally they're just
keys to objects.
- Objects `p"s3://bucket/a"` and `p"s3://bucket/a/b"` can co-exist.
If both of these objects exist, listing the keys for `p"s3://bucket/a"` returns
`[p"s3://bucket/a"]` while `p"s3://bucket/a/"` returns `[p"s3://bucket/a/b"]`.
- The drive property will return `"s3://<bucket>"`
- On top of the standard path properties (e.g., `segments`, `root`, `drive`,
`separator`), `S3Path`s also support `bucket` and `key` properties for your
convenience.
- If `version` argument is `nothing`, will return latest version of object. Version
can be provided via either kwarg `version` or as suffix `"?versionId=<object_version>"`
of `str`, e.g., `"s3://<bucket>/prefix/to/my/object?versionId=<object_version>"`.
- If `config` is left at its default value of `nothing`, then the
latest `global_aws_config()` will be used in any operations involving the
path. To "freeze" the config at construction time, explicitly pass an
`AbstractAWSConfig` to the `config` keyword argument.
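# Example
A minimal sketch (the bucket and key are assumptions):
```
fp = S3Path("s3://my-bucket/some/key.txt")
fp.bucket  # "my-bucket"
fp.key     # "some/key.txt"
```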
"""
S3Path() = S3Path((), "/", "", true, nothing, nothing)
# below definition needed by FilePathsBase
S3Path{A}() where {A<:AbstractS3PathConfig} = S3Path()
function S3Path(
bucket::AbstractString,
key::AbstractString;
isdirectory::Bool=endswith(key, "/"),
version::AbstractS3Version=nothing,
config::AbstractS3PathConfig=nothing,
)
return S3Path(
Tuple(filter!(!isempty, split(key, "/"))),
"/",
strip(startswith(bucket, "s3://") ? bucket : "s3://$bucket", '/'),
isdirectory,
version,
config,
)
end
function S3Path(
bucket::AbstractString,
key::AbstractPath;
isdirectory::Bool=false,
version::AbstractS3Version=nothing,
config::AbstractS3PathConfig=nothing,
)
return S3Path(
key.segments, "/", normalize_bucket_name(bucket), isdirectory, version, config
)
end
function S3Path(
path::S3Path; isdirectory=path.isdirectory, version=path.version, config=path.config
)
return S3Path(path.bucket, path.key; isdirectory, config, version)
end
# To avoid a breaking change.
function S3Path(
str::AbstractString;
isdirectory::Union{Bool,Nothing}=nothing,
version::AbstractS3Version=nothing,
config::AbstractS3PathConfig=nothing,
)
result = tryparse(S3Path, str; config)
result !== nothing || throw(ArgumentError("Invalid s3 path string: $str"))
ver = if version !== nothing
if result.version !== nothing && result.version != version
throw(ArgumentError("Conflicting object versions in `version` and `str`"))
end
version
else
result.version
end
# Replace the parsed isdirectory field with an explicit passed in argument.
is_dir = isdirectory === nothing ? result.isdirectory : isdirectory
# Warning: We need to use the full constructor because reconstructing with the bucket
# and key results in inconsistent `root` fields.
return S3Path(result.segments, result.root, result.drive, is_dir, ver, result.config)
end
# Parses a URI in the S3 scheme as an S3Path combining the conventions documented in:
# - https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-bucket-intro.html#accessing-a-bucket-using-S3-format
# - https://docs.aws.amazon.com/AmazonS3/latest/userguide/RetMetaOfObjVersion.html
function Base.tryparse(
::Type{<:S3Path}, str::AbstractString; config::AbstractS3PathConfig=nothing
)
uri = URI(str)
uri.scheme == "s3" || return nothing
drive = "s3://$(uri.host)"
root = isempty(uri.path) ? "" : "/"
isdirectory = isempty(uri.path) || endswith(uri.path, '/')
path = Tuple(split(uri.path, '/'; keepempty=false))
version = get(queryparams(uri), "versionId", nothing)
return S3Path(path, root, drive, isdirectory, version, config)
end
function Base.parse(::Type{P}, p::P; kwargs...) where {P<:S3Path}
return p
end
function normalize_bucket_name(bucket)
return strip(startswith(bucket, "s3://") ? bucket : "s3://$bucket", '/')
end
function Base.print(io::IO, fp::S3Path)
print(io, fp.anchor, fp.key)
fp.version !== nothing && print(io, "?versionId=", fp.version)
return nothing
end
function Base.:(==)(a::S3Path, b::S3Path)
return (
a.segments == b.segments &&
a.root == b.root &&
a.drive == b.drive &&
a.isdirectory == b.isdirectory &&
a.version == b.version
)
end
function Base.getproperty(fp::S3Path, attr::Symbol)
if attr === :anchor
return fp.drive * fp.root
elseif attr === :separator
return "/"
elseif attr === :bucket
return split(fp.drive, "//")[2]
elseif attr === :key
if isempty(fp.segments)
return ""
end
return join(fp.segments, '/') * (fp.isdirectory ? "/" : "")
else
return getfield(fp, attr)
end
end
# We need to special case join and parents so that we propagate
# directories correctly (see type docstring for details)
function FilePathsBase.join(prefix::S3Path, pieces::AbstractString...)
isempty(pieces) && return prefix
segments = String[prefix.segments...]
isdirectory = endswith(last(pieces), "/")
for p in pieces
append!(segments, filter!(!isempty, split(p, "/")))
end
return S3Path(
tuple(segments...),
"/",
prefix.drive,
isdirectory,
nothing, # Version is per-object, so we should not propagate it from the prefix
prefix.config,
)
end
function FilePathsBase.parents(fp::S3Path)
if hasparent(fp)
return map(0:(length(fp.segments) - 1)) do i
S3Path(fp.segments[1:i], fp.root, fp.drive, true, nothing, fp.config)
end
elseif fp.segments == tuple(".") || isempty(fp.segments)
return [fp]
else
return [isempty(fp.root) ? Path(fp, tuple(".")) : Path(fp, ())]
end
end
"""
get_config(fp::S3Path)
Returns the AWS configuration used by the path `fp`. This can be stored within the path itself, but if not
it will be fetched with `AWS.global_aws_config()`.
"""
get_config(fp::S3Path) = @something(fp.config, global_aws_config())
function FilePathsBase.exists(fp::S3Path)
return s3_exists(get_config(fp), fp.bucket, fp.key; fp.version)
end
Base.isfile(fp::S3Path) = !fp.isdirectory && exists(fp)
function Base.isdir(fp::S3Path)
fp.isdirectory || return false
if isempty(fp.segments) # special handling of buckets themselves
try
params = Dict("prefix" => "", "delimiter" => "/", "max-keys" => "0")
@mock S3.list_objects_v2(fp.bucket, params; aws_config=get_config(fp))
return true
catch e
if ecode(e) == "NoSuchBucket"
return false
else
rethrow()
end
end
else
exists(fp)
end
end
function FilePathsBase.walkpath(fp::S3Path; kwargs...)
# Select objects with that prefix
objects = s3_list_objects(get_config(fp), fp.bucket, fp.key; delimiter="")
root = joinpath(fp, "/")
# Construct a new Channel using a recursive internal `_walkpath!` function
return Channel(; ctype=typeof(fp), csize=128) do chnl
_walkpath!(root, root, Iterators.Stateful(objects), chnl; kwargs...)
end
end
function _walkpath!(
root::S3Path, prefix::S3Path, objects, chnl; topdown=true, onerror=throw, kwargs...
)
@assert root.isdirectory
@assert prefix.isdirectory
while true
try
# Start by inspecting the next element
obj = Base.peek(objects)
# Early exit condition if we've exhausted the iterator or just the current prefix.
obj === nothing && return nothing
# Extract the non-root part of the key
k = chop(obj["Key"]; head=length(root.key), tail=0)
fp = joinpath(root, k)
_parents = parents(fp)
# If the filepath matches our prefix then pop it off and continue
# Cause we would have already processed it before recursing
child = if prefix.segments == fp.segments
popfirst!(objects)
continue
# If the filpath is a direct descendant of our prefix then check if it
# is a directory too
elseif last(_parents) == prefix
popfirst!(objects)
# If our current path is a prefix for the next path then we can assume that
# the current path should be a directory without needing to call `isdir`
next = Base.peek(objects)
is_dir =
(next !== nothing && startswith(next["Key"], fp.key)) || isdir(fp)
# Reconstruct our next object and explicitly specify whether it is a
# directory.
S3Path(
fp.bucket,
fp.key;
isdirectory=is_dir,
config=fp.config,
version=fp.version,
)
# If our filepath is a distance descendant of the prefix then start filling in
# the intermediate paths
elseif prefix in _parents
i = findfirst(==(prefix), _parents)
_parents[i + 1]
# Otherwise we've established that the current filepath isn't a descendant
# of the prefix and we should exit
else
return nothing
end
# If we aren't dealing with the root and we're doing topdown iteration then
# insert the child into the results channel
!isempty(k) && topdown && put!(chnl, child)
# Apply our recursive call for the children as necessary
# NOTE: We're relying on the `isdirectory` field rather than calling `isdir`
# which will call out to AWS as a fallback.
if child.isdirectory
_walkpath!(
root, child, objects, chnl; topdown=topdown, onerror=onerror, kwargs...
)
end
# If we aren't dealing with the root and we're doing bottom up iteration then
# insert the child ion the result channel here
!isempty(k) && !topdown && put!(chnl, child)
catch e
isa(e, Base.IOError) ? onerror(e) : rethrow()
end
end
end
"""
stat(fp::S3Path)
Return the status struct for the S3 path analogously to `stat` for local directories.
Note that this cannot be used on a directory. This is because S3 is a pure key-value store and internally does
not have a concept of directories. In some cases, a directory may actually be an empty file, in which case
you should use `s3_get_meta`.
"""
function Base.stat(fp::S3Path)
# Currently AWSS3 would require a s3_get_acl call to fetch
# ownership and permission settings
m = Mode(; user=(READ + WRITE), group=(READ + WRITE), other=(READ + WRITE))
u = FilePathsBase.User()
g = FilePathsBase.Group()
blksize = 4096
blocks = 0
s = 0
last_modified = DateTime(0)
if isfile(fp)
resp = s3_get_meta(get_config(fp), fp.bucket, fp.key; version=fp.version)
# Example: "Thu, 03 Jan 2019 21:09:17 GMT"
last_modified = DateTime(
get_robust_case(resp, "Last-Modified")[1:(end - 4)], dateformat"e, d u Y H:M:S"
)
s = parse(Int, get_robust_case(resp, "Content-Length"))
blocks = ceil(Int, s / 4096)
end
return Status(0, 0, m, 0, u, g, 0, s, blksize, blocks, last_modified, last_modified)
end
"""
diskusage(fp::S3Path)
Compute the *total* size of all contents of a directory. Note that there is no direct
functionality for this in the S3 API, so it may be slow.
"""
function FilePathsBase.diskusage(fp::S3Path)
return if isfile(fp)
stat(fp).size
else
s3_directory_stat(get_config(fp), fp.bucket, fp.key)[1]
end
end
"""
lastmodified(fp::S3Path)
Returns a `DateTime` corresponding to the latest time at which the object (or, in the case of a
directory, any contained object) was modified.
"""
function lastmodified(fp::S3Path)
return if isfile(fp)
stat(fp).mtime
else
s3_directory_stat(get_config(fp), fp.bucket, fp.key)[2]
end
end
# Need API for accessing object ACL permissions for this to work
FilePathsBase.isexecutable(fp::S3Path) = false
Base.isreadable(fp::S3Path) = true
Base.iswritable(fp::S3Path) = true
Base.ismount(fp::S3Path) = false
"""
mkdir(fp::S3Path; recursive=false, exist_ok=false)
Create an empty directory within an existing bucket for the S3 path `fp`. If `recursive`,
this will create any previously non-existent directories which would contain `fp`. An error
will be thrown if an object exists at `fp` unless `exist_ok`.
!!! note
Creating a directory in [S3 creates a 0-byte object](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-folders.html)
with a key set to the provided directory name.
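# Example
A minimal sketch (the bucket and directory names are assumptions):
```
mkdir(S3Path("s3://my-bucket/new/dir/"); recursive=true)
```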
"""
function Base.mkdir(fp::S3Path; recursive=false, exist_ok=false)
fp.isdirectory || throw(ArgumentError("S3Path folders must end with '/': $fp"))
if exists(fp)
!exist_ok && error("$fp already exists.")
else
if hasparent(fp) && !exists(parent(fp))
if recursive
# don't try to create buckets this way, minio at least really doesn't like it
isempty(parent(fp).segments) || mkdir(parent(fp); recursive, exist_ok)
else
error(
"The parent of $fp does not exist. " *
"Pass `recursive=true` to create it.",
)
end
end
write(fp, "")
end
return fp
end
function Base.rm(fp::S3Path; recursive=false, kwargs...)
if isdir(fp)
files = readpath(fp)
if recursive
for f in files
rm(f; recursive, kwargs...)
end
elseif length(files) > 0
error("S3 path $fp is not empty. Use `recursive=true` to delete.")
end
end
@debug "delete: $fp"
return s3_delete(fp)
end
s3_delete(fp::S3Path) = s3_delete(get_config(fp), fp.bucket, fp.key; fp.version)
"""
s3_nuke_object(fp::S3Path)
Delete all versions of an object `fp`.
"""
function s3_nuke_object(fp::S3Path)
return s3_nuke_object(get_config(fp), fp.bucket, fp.key)
end
# We need to special case sync with S3Paths because of how directories
# are handled again.
# NOTE: This method signature only makes sense with FilePathsBase 0.6.2, but
# 1) It'd be odd for other packages to restrict FilePathsBase to a patch release
# 2) Seems cleaner to have it fallback and error rather than having
# slightly inconsistent handling of edge cases between the two versions.
function FilePathsBase.sync(
f::Function, src::AbstractPath, dst::S3Path; delete=false, overwrite=true
)
# Throw an error if the source path doesn't exist at all
exists(src) || throw(ArgumentError("Unable to sync from non-existent $src"))
# If the top level source is just a file then try to just sync that
# without calling walkpath
if isfile(src)
# If the destination exists then we should make sure it is a file and check
# if we should copy the source over.
if exists(dst)
isfile(dst) || throw(ArgumentError("Unable to sync file $src to non-file $dst"))
if overwrite && f(src, dst)
cp(src, dst; force=true)
end
else
cp(src, dst)
end
elseif isdir(src)
if exists(dst)
isdir(dst) ||
throw(ArgumentError("Unable to sync directory $src to non-directory $dst"))
# Create an index of all of the source files
src_paths = collect(walkpath(src))
#! format: off
# https://github.com/domluna/JuliaFormatter.jl/issues/458
index = Dict(
Tuple(setdiff(p.segments, src.segments)) => i
for (i, p) in enumerate(src_paths)
)
#! format: on
for dst_path in walkpath(dst)
k = Tuple(setdiff(dst_path.segments, dst.segments))
if haskey(index, k)
src_path = src_paths[pop!(index, k)]
if overwrite && f(src_path, dst_path)
cp(src_path, dst_path; force=true)
end
elseif delete
rm(dst_path; recursive=true)
end
end
# Finally, copy over files that don't exist at the destination
# But we need to iterate through it in a way that respects the original
# walkpath order otherwise we may end up trying to copy a file before its parents.
index_pairs = collect(pairs(index))
index_pairs = index_pairs[sortperm(index_pairs; by=last)]
for (seg, i) in index_pairs
new_dst = S3Path(
tuple(dst.segments..., seg...),
dst.root,
dst.drive,
isdir(src_paths[i]),
nothing,
dst.config,
)
cp(src_paths[i], new_dst; force=true)
end
else
cp(src, dst)
end
else
throw(ArgumentError("$src is neither a file or directory."))
end
end
# for some reason, sometimes we get back a `Pair`
# other times an `AbstractDict`.
function _pair_or_dict_get(p::Pair, k)
first(p) == k || return nothing
return last(p)
end
_pair_or_dict_get(d::AbstractDict, k) = get(d, k, nothing)
function _retrieve_prefixes!(results, objects, prefix_key, chop_head)
objects === nothing && return nothing
rm_key = s -> chop(s; head=chop_head, tail=0)
for p in objects
prefix = _pair_or_dict_get(p, prefix_key)
if prefix !== nothing
push!(results, rm_key(prefix))
end
end
return nothing
end
function _readdir_add_results!(results, response, key_length)
sizehint!(results, length(results) + parse(Int, response["KeyCount"]))
common_prefixes = get(response, "CommonPrefixes", nothing)
_retrieve_prefixes!(results, common_prefixes, "Prefix", key_length)
contents = get(response, "Contents", nothing)
_retrieve_prefixes!(results, contents, "Key", key_length)
return get(response, "NextContinuationToken", nothing)
end
function Base.readdir(fp::S3Path; join=false, sort=true)
if isdir(fp)
k = fp.key
key_length = length(k)
results = String[]
token = ""
while token !== nothing
response = @repeat 4 try
params = Dict("delimiter" => "/", "prefix" => k)
if !isempty(token)
params["continuation-token"] = token
end
parse(S3.list_objects_v2(fp.bucket, params; aws_config=get_config(fp)))
catch e
#! format: off
@delay_retry if ecode(e) in ["NoSuchBucket"] end
#! format: on
end
token = _readdir_add_results!(results, response, key_length)
end
# Filter out any empty object names which are valid in S3
filter!(!isempty, results)
# Sort results if sort=true
sort && sort!(results)
# Return results, possibly joined with the root path if join=true
return join ? joinpath.(fp, results) : results
else
throw(ArgumentError("\"$fp\" is not a directory"))
end
end
"""
read(fp::S3Path; byte_range=nothing)
Fetch data from the S3 path as a `Vector{UInt8}`. A subset of the object can be specified with
`byte_range` which should be a contiguous integer range, e.g. `1:4`.
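# Example
A minimal sketch reading part of a hypothetical object:
```
fp = S3Path("s3://my-bucket/data.bin")
header = read(fp; byte_range=1:4)  # first four bytes
```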
"""
function Base.read(fp::S3Path; byte_range=nothing)
return Vector{UInt8}(
s3_get(
get_config(fp),
fp.bucket,
fp.key;
raw=true,
byte_range=byte_range,
version=fp.version,
),
)
end
"""
Base.write(fp::S3Path, content::String; kwargs...)
Base.write(fp::S3Path, content::Vector{UInt8}; part_size_mb=50, multipart::Bool=true,
returns::Symbol=:parsed, other_kwargs...,)
Write `content` to S3Path `fp`.
# Optional Arguments
- `multipart`: when `true`, uploads `content` via [`s3_multipart_upload`](@ref) if it exceeds
  the maximum size of a single HTTP request body (about 2 GiB); when `false`, or when
  `content` is below that limit, uploads data via [`s3_put`](@ref).
- `part_size_mb`: when a multipart upload is used, sets the maximum size of each uploaded
  part, in megabytes.
- `returns`: determines the result returned by the function: `:response` (the AWS API
  response), `:parsed` (default; the parsed AWS API response), or `:path` (the newly created
[`S3Path`](@ref), including its `version` when versioning is enabled for the bucket).
- `other_kwargs`: additional kwargs passed through into [`s3_multipart_upload`](@ref).
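# Example
A minimal sketch (the path is an assumption):
```
fp = S3Path("s3://my-bucket/hello.txt")
write(fp, "Hello, world!")               # small payload, single-request upload
new_fp = write(fp, "v2"; returns=:path)  # S3Path that includes the new version id
```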
"""
function Base.write(fp::S3Path, content::String; kwargs...)
return Base.write(fp, Vector{UInt8}(content); kwargs...)
end
function Base.write(
fp::S3Path,
content::Vector{UInt8};
part_size_mb=50,
multipart::Bool=true,
returns::Symbol=:parsed,
other_kwargs...,
)
# avoid HTTPClientError('An HTTP Client raised an unhandled exception: string longer than 2147483647 bytes')
MAX_HTTP_BYTES = 2147483647
if fp.version !== nothing
throw(ArgumentError("Can't write to a specific object version ($(fp.version))"))
end
supported_return_values = (:parsed, :response, :path)
if !(returns in supported_return_values)
err = "Unsupported `returns` value `$returns`; supported options are `$(supported_return_values)`"
throw(ArgumentError(err))
end
config = get_config(fp)
response = if !multipart || length(content) < MAX_HTTP_BYTES
s3_put(config, fp.bucket, fp.key, content; parse_response=false)
else
io = IOBuffer(content)
s3_multipart_upload(
config,
fp.bucket,
fp.key,
io,
part_size_mb;
parse_response=false,
other_kwargs...,
)
end
if returns == :path
return S3Path(
fp.bucket,
fp.key;
isdirectory=fp.isdirectory,
version=HTTP.header(response.headers, "x-amz-version-id", nothing),
config=fp.config,
)
elseif returns == :parsed
return parse(response)
else
return response
end
end
function FilePathsBase.mktmpdir(parent::S3Path)
fp = parent / string(uuid4(), "/")
return mkdir(fp)
end
const S3PATH_ARROW_NAME = Symbol("JuliaLang.AWSS3.S3Path")
ArrowTypes.arrowname(::Type{<:S3Path}) = S3PATH_ARROW_NAME
ArrowTypes.ArrowType(::Type{<:S3Path}) = String
ArrowTypes.JuliaType(::Val{S3PATH_ARROW_NAME}, ::Any) = S3Path{Nothing}
ArrowTypes.fromarrow(::Type{<:S3Path}, uri_string) = S3Path(uri_string)
function ArrowTypes.toarrow(path::S3Path)
if !isnothing(path.config)
throw(ArgumentError("`S3Path` config must be `nothing` to serialize to Arrow"))
end
return string(path)
end
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 17061 | function awss3_tests(base_config)
bucket_name = gen_bucket_name()
@testset "Robust key selection" begin
lower_dict = Dict("foo-bar" => 1)
upper_dict = Dict("Foo-Bar" => 1)
@test AWSS3.get_robust_case(lower_dict, "Foo-Bar") == 1
@test AWSS3.get_robust_case(upper_dict, "Foo-Bar") == 1
@test_throws KeyError("Foo-Bar") AWSS3.get_robust_case(Dict(), "Foo-Bar")
end
@testset "Create Bucket" begin
config = assume_testset_role("CreateBucketTestset"; base_config)
s3_create_bucket(config, bucket_name)
@test bucket_name in s3_list_buckets(config)
is_aws(config) && s3_enable_versioning(config, bucket_name)
sleep(1)
end
@testset "Bucket Tagging" begin
config = assume_testset_role("BucketTaggingTestset"; base_config)
@test isempty(s3_get_tags(config, bucket_name))
tags = Dict("A" => "1", "B" => "2", "C" => "3")
s3_put_tags(config, bucket_name, tags)
@test s3_get_tags(config, bucket_name) == tags
s3_delete_tags(config, bucket_name)
@test isempty(s3_get_tags(config, bucket_name))
end
@testset "Create Objects" begin
config = assume_testset_role("CreateObjectsTestset"; base_config)
global_aws_config(config)
s3_put(config, bucket_name, "key1", "data1.v1")
s3_put(bucket_name, "key2", "data2.v1"; tags=Dict("Key" => "Value"))
s3_put(config, bucket_name, "key3", "data3.v1")
s3_put(config, bucket_name, "key3", "data3.v2")
s3_put(config, bucket_name, "key3", "data3.v3"; metadata=Dict("foo" => "bar"))
s3_put(config, bucket_name, "key4", "data3.v4"; acl="bucket-owner-full-control")
s3_put_tags(config, bucket_name, "key3", Dict("Left" => "Right"))
@test isempty(s3_get_tags(config, bucket_name, "key1"))
@test s3_get_tags(config, bucket_name, "key2")["Key"] == "Value"
@test s3_get_tags(config, bucket_name, "key3")["Left"] == "Right"
s3_delete_tags(config, bucket_name, "key2")
@test isempty(s3_get_tags(config, bucket_name, "key2"))
@test s3_get(config, bucket_name, "key1") == b"data1.v1"
@test s3_get(config, bucket_name, "key2") == b"data2.v1"
@test s3_get(bucket_name, "key3") == b"data3.v3"
@test s3_get(bucket_name, "key4") == b"data3.v4"
try
s3_get(config, bucket_name, "key5")
@test false
catch e
e isa AWSException || rethrow()
# Will see a 403 status if we lack the `s3:ListBucket` permission.
@test e.cause.status == 404
end
@test s3_get_meta(bucket_name, "key3")["x-amz-meta-foo"] == "bar"
@test isa(
s3_put(config, bucket_name, "key6", "data"; parse_response=false), AWS.Response
)
end
@testset "ASync Get" begin
config = assume_testset_role("ReadObject"; base_config)
@sync begin
for i in 1:2
@async begin
@test s3_get(bucket_name, "key3") == b"data3.v3"
end
end
end
end
@testset "Raw Return - XML" begin
config = assume_testset_role("ReadWriteObject"; base_config)
xml = "<?xml version='1.0'?><Doc><Text>Hello</Text></Doc>"
@test s3_put(config, bucket_name, "file.xml", xml, "text/xml") == UInt8[]
@test String(s3_get(config, bucket_name, "file.xml"; raw=true)) == xml
@test s3_get(config, bucket_name, "file.xml")["Text"] == "Hello"
end
@testset "Get byte range" begin
config = assume_testset_role("ReadWriteObject"; base_config)
teststr = "123456789"
s3_put(config, bucket_name, "byte_range", teststr)
range = 3:6
@test String(s3_get(config, bucket_name, "byte_range"; byte_range=range)) ==
teststr[range]
end
@testset "Object Copy" begin
config = assume_testset_role("ReadWriteObject"; base_config)
result = s3_copy(
config, bucket_name, "key1"; to_bucket=bucket_name, to_path="key1.copy"
)
@test result isa AbstractDict
@test s3_get(config, bucket_name, "key1.copy") == b"data1.v1"
result = s3_copy(
config,
bucket_name,
"key1";
to_bucket=bucket_name,
to_path="key1.copy",
parse_response=false,
)
@test result isa AWS.Response
if is_aws(base_config)
@test !isnothing(HTTP.header(result.headers, "x-amz-version-id", nothing))
end
end
@testset "Object exists" begin
config = assume_testset_role("ReadObject"; base_config)
for key in ["key1", "key2", "key3", "key1.copy"]
@test s3_exists(config, bucket_name, key)
end
end
@testset "List Objects" begin
config = assume_testset_role("ReadObject"; base_config)
for key in ["key1", "key2", "key3", "key1.copy"]
@test key in [o["Key"] for o in s3_list_objects(config, bucket_name)]
end
end
@testset "Object Delete" begin
config = assume_testset_role("ReadWriteObject"; base_config)
s3_delete(config, bucket_name, "key1.copy")
@test !("key1.copy" in [o["Key"] for o in s3_list_objects(config, bucket_name)])
end
@testset "Check Metadata" begin
config = assume_testset_role("ReadObject"; base_config)
meta = s3_get_meta(config, bucket_name, "key1")
@test meta["ETag"] == "\"68bc8898af64159b72f349b391a7ae35\""
end
# https://github.com/samoconnor/AWSS3.jl/issues/24
@testset "default Content-Type" begin
config = assume_testset_role("ReadWriteObject"; base_config)
ctype(key) = s3_get_meta(config, bucket_name, key)["Content-Type"]
for k in ["file.foo", "file", "file_html", "file.d/html", "foobar.html/file.htm"]
is_aws(config) && k == "file" && continue
s3_put(config, bucket_name, k, "x")
@test ctype(k) == "application/octet-stream"
end
for (k, t) in [
("foo/bar/file.html", "text/html"),
("x.y.z.js", "application/javascript"),
("downalods/foo.pdf", "application/pdf"),
("data/foo.csv", "text/csv"),
("this.is.a.file.txt", "text/plain"),
("my.log", "text/plain"),
("big.dat", "application/octet-stream"),
("some.tar.gz", "application/octet-stream"),
("data.bz2", "application/octet-stream"),
]
s3_put(config, bucket_name, k, "x")
@test ctype(k) == t
end
end
@testset "Multi-Part Upload" begin
config = assume_testset_role("MultipartUploadTestset"; base_config)
MIN_S3_CHUNK_SIZE = 5 * 1024 * 1024 # 5 MB
key_name = "multi-part-key"
upload = s3_begin_multipart_upload(config, bucket_name, key_name)
tags = Vector{String}()
for part_number in 1:5
push!(
tags,
s3_upload_part(config, upload, part_number, rand(UInt8, MIN_S3_CHUNK_SIZE)),
)
end
result = s3_complete_multipart_upload(config, upload, tags)
@test s3_exists(config, bucket_name, key_name)
@test isa(result, LittleDict)
end
@testset "Multi-Part Upload, return unparsed path" begin
config = assume_testset_role("MultipartUploadTestset"; base_config)
MIN_S3_CHUNK_SIZE = 5 * 1024 * 1024 # 5 MB
key_name = "multi-part-key"
upload = s3_begin_multipart_upload(config, bucket_name, key_name)
tags = Vector{String}()
for part_number in 1:5
push!(
tags,
s3_upload_part(config, upload, part_number, rand(UInt8, MIN_S3_CHUNK_SIZE)),
)
end
result = s3_complete_multipart_upload(config, upload, tags; parse_response=false)
@test s3_exists(config, bucket_name, key_name)
@test isa(result, AWS.Response)
end
    # these tests are needed because the lack of certain functionality in the underlying AWS
    # API makes some seemingly mundane tasks incredibly tricky: for example, checking whether
    # an "object" (file or directory) exists is very subtle
@testset "path naming edge cases" begin
config = assume_testset_role("ReadWriteObject"; base_config)
# this seemingly arbitrary operation is needed because of the insanely tricky way we
# need to check for directories
s3_put(config, bucket_name, "testdir.", "") # create an empty file called `testdir.`
s3_put(config, bucket_name, "testdir/", "") # create an empty file called `testdir/` which AWS will treat as an "empty directory"
@test s3_exists(config, bucket_name, "testdir/")
@test isdir(S3Path(bucket_name, "testdir/"; config))
@test !isfile(S3Path(bucket_name, "testdir/"; config))
@test s3_exists(config, bucket_name, "testdir.")
@test isfile(S3Path(bucket_name, "testdir."; config))
@test !isdir(S3Path(bucket_name, "testdir."; config))
@test !s3_exists(config, bucket_name, "testdir")
s3_put(config, bucket_name, "testdir/testfile.txt", "what up")
@test s3_exists(config, bucket_name, "testdir/testfile.txt")
@test isfile(S3Path(bucket_name, "testdir/testfile.txt"; config))
# make sure the directory still "exists" even though there's a key in there now
@test s3_exists(config, bucket_name, "testdir/")
@test isdir(S3Path(bucket_name, "testdir/"; config))
@test !isfile(S3Path(bucket_name, "testdir/"; config))
# but it is still a directory and not an object
@test !s3_exists(config, bucket_name, "testdir")
end
# Based upon this example: https://repost.aws/knowledge-center/iam-s3-user-specific-folder
#
# MinIO isn't currently setup with the restrictive prefix required to make the tests
# fail with "AccessDenied".
is_aws(base_config) && @testset "Restricted Prefix" begin
setup_config = assume_testset_role("ReadWriteObject"; base_config)
s3_put(
setup_config,
bucket_name,
"prefix/denied/secrets/top-secret",
"for british eyes only",
)
s3_put(setup_config, bucket_name, "prefix/granted/file", "hello")
config = assume_testset_role("RestrictedPrefixTestset"; base_config)
@test s3_exists(config, bucket_name, "prefix/granted/file")
@test !s3_exists(config, bucket_name, "prefix/granted/dne")
@test_throws_msg ["AccessDenied", "403"] begin
s3_exists(config, bucket_name, "prefix/denied/top-secret")
end
@test s3_exists(config, bucket_name, "prefix/granted/")
@test s3_exists(config, bucket_name, "prefix/")
# Ensure that `s3_list_objects` works with restricted prefixes
@test length(collect(s3_list_objects(config, bucket_name, "prefix/granted/"))) == 1
@test length(collect(s3_list_objects(config, bucket_name, "prefix/"))) == 0
# Validate that we have permissions to list the root without encountering an access error.
# Ideally we just want `@test_no_throws s3_list_objects(config, bucket_name)`.
@test length(collect(s3_list_objects(config, bucket_name))) >= 0
end
@testset "Version is empty" begin
config = assume_testset_role("ReadWriteObject"; base_config)
# Create the file to ensure we're only testing `version`
k = "version_empty.txt"
s3_put(config, bucket_name, k, "v1")
s3_put(config, bucket_name, k, "v2")
if is_aws(config)
@test_throws AWSException s3_get(config, bucket_name, k; version="")
@test_throws AWSException s3_get_meta(config, bucket_name, k; version="")
@test_throws AWSException s3_exists(config, bucket_name, k; version="")
@test_throws AWSException s3_delete(config, bucket_name, k; version="")
else
# Using an empty string as the version returns the latest version
@test s3_get(config, bucket_name, k; version="") == "v2"
@test s3_get_meta(config, bucket_name, k; version="") isa AbstractDict
@test s3_exists(config, bucket_name, k; version="")
@test s3_delete(config, bucket_name, k; version="") == UInt8[]
end
end
@testset "Version is nothing" begin
config = assume_testset_role("ReadWriteObject"; base_config)
# Create the file to ensure we're only testing `version`
k = "version_nothing.txt"
s3_put(config, bucket_name, k, "v1")
s3_put(config, bucket_name, k, "v2")
        # Using `nothing` as the version returns the latest version
@test s3_get(config, bucket_name, k; version=nothing) == "v2"
@test s3_get_meta(config, bucket_name, k; version=nothing) isa AbstractDict
@test s3_exists(config, bucket_name, k; version=nothing)
@test s3_delete(config, bucket_name, k; version=nothing) == UInt8[]
end
is_aws(base_config) && @testset "Sign URL" begin
config = assume_testset_role("SignUrlTestset"; base_config)
for v in ["v2", "v4"]
url = s3_sign_url(config, bucket_name, "key1"; signature_version=v)
curl_output = ""
@repeat 3 try
curl_output = read(`curl -s -o - $url`, String)
catch e
@delay_retry if true
end
end
@test curl_output == "data1.v1"
fn = "/tmp/jl_qws_test_key1"
if isfile(fn)
rm(fn)
end
@repeat 3 try
s3_get_file(config, bucket_name, "key1", fn)
catch e
sleep(1)
@retry if true
end
end
@test read(fn, String) == "data1.v1"
rm(fn)
end
end
is_aws(base_config) && @testset "Check Object Versions" begin
config = assume_testset_role("ReadObjectVersion"; base_config)
versions = s3_list_versions(config, bucket_name, "key3")
@test length(versions) == 3
@test (
s3_get(config, bucket_name, "key3"; version=versions[3]["VersionId"]) ==
b"data3.v1"
)
@test (
s3_get(config, bucket_name, "key3"; version=versions[2]["VersionId"]) ==
b"data3.v2"
)
@test (
s3_get(config, bucket_name, "key3"; version=versions[1]["VersionId"]) ==
b"data3.v3"
)
tmp_file = joinpath(tempdir(), "jl_qws_test_key3")
s3_get_file(config, bucket_name, "key3", tmp_file; version=versions[2]["VersionId"])
@test read(tmp_file) == b"data3.v2"
end
is_aws(base_config) && @testset "Purge Versions" begin
config = assume_testset_role("PurgeVersionsTestset"; base_config)
s3_purge_versions(config, bucket_name, "key3")
versions = s3_list_versions(config, bucket_name, "key3")
@test length(versions) == 1
@test s3_get(config, bucket_name, "key3") == b"data3.v3"
end
is_aws(base_config) && @testset "Delete All Versions" begin
config = assume_testset_role("NukeObjectTestset"; base_config)
key_to_delete = "NukeObjectTestset_key"
# Test that object that starts with the same prefix as `key_to_delete` is
# not _also_ deleted
key_not_to_delete = "NukeObjectTestset_key/rad"
function _s3_object_versions(config, bucket, key)
return filter!(x -> x["Key"] == key, s3_list_versions(config, bucket, key))
end
s3_put(config, bucket_name, key_to_delete, "foo.v1")
s3_put(config, bucket_name, key_to_delete, "foo.v2")
s3_put(config, bucket_name, key_to_delete, "foo.v3")
s3_put(config, bucket_name, key_not_to_delete, "rad.v1")
s3_put(config, bucket_name, key_not_to_delete, "rad.v2")
@test length(_s3_object_versions(config, bucket_name, key_to_delete)) == 3
@test length(_s3_object_versions(config, bucket_name, key_not_to_delete)) == 2
s3_nuke_object(config, bucket_name, key_to_delete)
@test length(_s3_object_versions(config, bucket_name, key_to_delete)) == 0
# Test that _only_ specific path was deleted---not paths at the same prefix
@test length(_s3_object_versions(config, bucket_name, key_not_to_delete)) == 2
end
if is_aws(base_config)
@testset "Empty and Delete Bucket" begin
config = assume_testset_role("EmptyAndDeleteBucketTestset"; base_config)
AWSS3.s3_nuke_bucket(config, bucket_name)
@test !in(bucket_name, s3_list_buckets(config))
end
@testset "Delete Non-Existent Bucket" begin
config = assume_testset_role("DeleteNonExistentBucketTestset"; base_config)
@test_throws AWS.AWSException s3_delete_bucket(config, bucket_name)
end
end
end
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 915 | using AWS
using AWS.AWSExceptions: AWSException
using AWSS3
using Arrow
using Dates
using FilePathsBase
using FilePathsBase: /, join
using FilePathsBase.TestPaths
using FilePathsBase.TestPaths: test
using HTTP
using JSON3
using Minio
using Mocking
using OrderedCollections: LittleDict
using Retry
using Test
using UUIDs: uuid4
Mocking.activate()
@service S3 use_response_type = true
include("utils.jl")
# Load the test functions
include("s3path.jl") # creates `awss3_tests(config)`
include("awss3.jl") # creates `s3path_tests(config)`
@testset "AWSS3.jl" begin
# We can run most tests locally under MinIO without requring AWS credentials
@testset "Minio" begin
minio_server() do config
awss3_tests(config)
s3path_tests(config)
end
end
@testset "S3" begin
config = AWSConfig()
awss3_tests(config)
s3path_tests(config)
end
end
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 33543 | function test_s3_constructors(ps::PathSet)
bucket_name = ps.root.bucket
@test S3Path(bucket_name, "pathset-root/foo/baz.txt") == ps.baz
@test S3Path(bucket_name, p"pathset-root/foo/baz.txt") == ps.baz
@test S3Path(bucket_name, p"/pathset-root/foo/baz.txt") == ps.baz
@test S3Path("s3://$bucket_name", p"/pathset-root/foo/baz.txt") == ps.baz
@test S3Path(bucket_name, "pathset-root/bar/qux"; isdirectory=true) == ps.qux
@test S3Path(bucket_name, "pathset-root/bar/qux/"; isdirectory=true) == ps.qux
@test S3Path(bucket_name, p"pathset-root/bar/qux"; isdirectory=true) == ps.qux
@test S3Path(bucket_name, p"/pathset-root/bar/qux"; isdirectory=true) == ps.qux
@test S3Path("s3://$bucket_name/pathset-root/bar/qux"; isdirectory=true) == ps.qux
end
function test_s3_parents(ps::PathSet)
@testset "parents" begin
@test parent(ps.foo) == ps.root
@test parent(ps.qux) == ps.bar
@test dirname(ps.foo) == ps.root
@test hasparent(ps.qux)
_parents = parents(ps.qux)
@test _parents[end] == ps.bar
@test _parents[end - 1] == ps.root
@test _parents[1] == Path(ps.root; segments=())
end
end
function test_s3_join(ps::PathSet)
@testset "join" begin
@test join(ps.root, "bar/") == ps.bar
@test ps.root / "foo" / "baz.txt" == ps.baz
@test ps.root / "foobaz.txt" == ps.root / "foo" * "baz.txt"
end
end
function test_s3_normalize(ps::PathSet)
@testset "norm" begin
@test normalize(ps.bar / ".." / "foo/") == ps.foo
@test normalize(ps.bar / ".." / "foo") != ps.foo
@test normalize(ps.bar / "./") == ps.bar
@test normalize(ps.bar / "../") == ps.root
end
end
function test_s3_mkdir(p::PathSet)
@testset "mkdir" begin
garply = p.root / "corge" / "grault" / "garply/"
mkdir(garply; recursive=true)
@test exists(garply)
rm(p.root / "corge/"; recursive=true)
@test !exists(garply)
end
end
function test_s3_download(base_config::AbstractAWSConfig)
config = assume_testset_role("ReadWriteObject"; base_config)
# Requires that the global AWS configuration is set so that `S3Path`s created within
# the tests have the correct permissions (e.g. `download(::S3Path, "s3://...")`)
return p -> with_aws_config(config) do
test_download(p)
end
end
function test_s3_readpath(p::PathSet)
@testset "readpath" begin
@test readdir(p.root) == ["bar/", "foo/", "fred/"]
@test readdir(p.qux) == ["quux.tar.gz"]
@test readpath(p.root) == [p.bar, p.foo, p.fred]
@test readpath(p.qux) == [p.quux]
end
end
function test_s3_walkpath(p::PathSet)
@testset "walkpath - S3" begin
# Test that we still return parent prefixes even when no "directory" objects
# have been created by a `mkdir`, retaining consistency with `readdir`.
_root = p.root / "s3_walkpath/"
_foo = _root / "foo/"
_baz = _foo / "baz.txt"
_bar = _root / "bar/"
_qux = _bar / "qux/"
_quux = _qux / "quux.tar.gz"
# Only write the leaf files
write(_baz, read(p.baz))
write(_quux, read(p.quux))
topdown = [_bar, _qux, _quux, _foo, _baz]
bottomup = [_quux, _qux, _bar, _baz, _foo]
@test collect(walkpath(_root; topdown=true)) == topdown
@test collect(walkpath(_root; topdown=false)) == bottomup
rm(_root; recursive=true)
end
end
function test_s3_cp(p::PathSet)
@testset "cp" begin
# In case the folder objects were deleted in a previous test
mkdir.([p.foo, p.qux, p.fred]; recursive=true, exist_ok=true)
@test exists(p.foo)
cp(p.foo, p.qux / "foo/"; force=true)
@test exists(p.qux / "foo" / "baz.txt")
rm(p.qux / "foo/"; recursive=true)
end
end
function test_s3_mv(p::PathSet)
@testset "mv" begin
# In case the folder objects were deleted in a previous test
mkdir.([p.foo, p.qux, p.fred]; recursive=true, exist_ok=true)
garply = p.root / "corge" / "grault" / "garply/"
mkdir(garply; recursive=true, exist_ok=true)
@test exists(garply)
mv(p.root / "corge/", p.foo / "corge/"; force=true)
@test exists(p.foo / "corge" / "grault" / "garply/")
rm(p.foo / "corge/"; recursive=true)
end
end
function test_s3_sync(ps::PathSet)
return p -> @testset "sync" begin
# In case the folder objects were deleted in a previous test
mkdir.([p.foo, p.qux, p.fred]; recursive=true, exist_ok=true)
# Base cp case
sync(p.foo, ps.qux / "foo/")
@test exists(p.qux / "foo" / "baz.txt")
# Test that the copied baz file has a newer modified time
baz_t = modified(p.qux / "foo" / "baz.txt")
@test modified(p.baz) < baz_t
# Don't cp unchanged files when a new file is added
        # NOTE: sleep before we make a new file, so it's clear that the
# modified time has changed.
sleep(1)
write(p.foo / "test.txt", "New File")
sync(p.foo, ps.qux / "foo/")
@test exists(p.qux / "foo" / "test.txt")
@test read(p.qux / "foo" / "test.txt") == b"New File"
@test read(p.qux / "foo" / "test.txt", String) == "New File"
@test modified(p.qux / "foo" / "baz.txt") == baz_t
@test modified(p.qux / "foo" / "test.txt") > baz_t
# Test not deleting a file on sync
rm(p.foo / "test.txt")
sync(p.foo, ps.qux / "foo/")
@test exists(p.qux / "foo" / "test.txt")
# Test passing delete flag
sync(p.foo, p.qux / "foo/"; delete=true)
@test !exists(p.qux / "foo" / "test.txt")
rm(p.qux / "foo/"; recursive=true)
end
end
function test_s3_properties(base_config::AbstractAWSConfig)
return ps -> @testset "s3_properties" begin
config = assume_testset_role("ReadWriteObject"; base_config)
fp1 = S3Path("s3://mybucket/path/to/some/object"; config)
fp2 = S3Path("s3://mybucket/path/to/some/prefix/"; config)
@test fp1.bucket == "mybucket"
@test fp1.key == "path/to/some/object"
@test fp2.bucket == "mybucket"
@test fp2.key == "path/to/some/prefix/"
@test fp2.version === nothing
try
fp3 = S3Path(ps.root.bucket, "/another/testdir/"; config)
strs = ["what up", "what else up", "what up again"]
write(fp3 / "testfile1.txt", strs[1])
write(fp3 / "testfile2.txt", strs[2])
write(fp3 / "inner" / "testfile3.txt", strs[3])
@test AWSS3.diskusage(fp3) == sum(ncodeunits.(strs))
# we deliberately pick an older file to compare to so we
# can be confident timestamps are different
@test AWSS3.lastmodified(fp3) > AWSS3.lastmodified(ps.foo)
finally
rm(S3Path(ps.root.bucket, "/another/"; config); recursive=true) # otherwise subsequent tests may fail
end
end
end
function test_s3_folders_and_files(ps::PathSet)
config = ps.root.config
@testset "s3_folders_and_files" begin
# Minio has slightly different semantics than s3 in that it does
# not support having prefixes that clash with files
# (https://github.com/minio/minio/issues/9865)
# Thus in these tests, we run certain tests only on s3.
# In case the ps.root doesn't exist
mkdir(ps.root; recursive=true, exist_ok=true)
# Test that the trailing slash matters
@test p"s3://mybucket/path/to/some/prefix/" != p"s3://mybucket/path/to/some/prefix"
# Test that we can have empty directory names
# I'm not sure if we want to support this in the future, but it may require more
# overloading of AbstractPath methods to support properly.
@test_broken p"s3://mybucket/path/to/some/prefix" !=
p"s3://mybucket/path//to/some/prefix"
write(ps.root / "foobar", "I'm an object")
if is_aws(config)
mkdir(ps.root / "foobar/")
write(ps.root / "foobar" / "car.txt", "I'm a different object")
end
@test read(ps.root / "foobar") == b"I'm an object"
@test read(ps.root / "foobar", String) == "I'm an object"
@test_throws ArgumentError readpath(ps.root / "foobar")
if is_aws(config)
@test readpath(ps.root / "foobar/") == [ps.root / "foobar" / "car.txt"]
@test read(ps.root / "foobar" / "car.txt", String) == "I'm a different object"
end
end
end
function test_multipart_write(ps::PathSet)
teststr = repeat("This is a test string!", round(Int, 2e5))
@testset "multipart write/read" begin
result = write(ps.quux, teststr; part_size_mb=1, multipart=true)
@test read(ps.quux, String) == teststr
@test result == UInt8[]
end
@testset "multipart write/read, return path" begin
result = write(ps.quux, teststr; part_size_mb=1, multipart=true, returns=:path)
@test read(ps.quux, String) == teststr
@test isa(result, S3Path)
end
@testset "multipart write/read, return response" begin
result = write(ps.quux, teststr; part_size_mb=1, multipart=true, returns=:response)
@test read(ps.quux, String) == teststr
@test isa(result, AWS.Response)
end
end
function test_write_returns(ps::PathSet)
@testset "write returns" begin
teststr = "Test string"
@test write(ps.quux, teststr) == UInt8[]
@test write(ps.quux, teststr; returns=:parsed) == UInt8[]
@test write(ps.quux, teststr; returns=:response) isa AWS.Response
@test write(ps.quux, teststr; returns=:path) isa S3Path
@test_throws ArgumentError write(ps.quux, teststr; returns=:unsupported_return_type)
end
end
"""
Hierarchy:
    bucket-name
    |-- test_01.txt
    |-- emptydir/
    |-- subdir1/
    |   |-- test_02.txt
    |   |-- test_03.txt
    |   |-- subdir2/
    |       |-- test_04.txt
    |       |-- subdir3/
"""
function initialize(config, bucket_name)
s3_put(config, bucket_name, "test_01.txt", "test01")
s3_put(config, bucket_name, "emptydir/", "")
s3_put(config, bucket_name, "subdir1/", "")
s3_put(config, bucket_name, "subdir1/test_02.txt", "test02")
s3_put(config, bucket_name, "subdir1/test_03.txt", "test03")
s3_put(config, bucket_name, "subdir1/subdir2/", "")
s3_put(config, bucket_name, "subdir1/subdir2/test_04.txt", "test04")
return s3_put(config, bucket_name, "subdir1/subdir2/subdir3/", "")
end
function verify_files(path::S3Path)
@test readdir(path) == ["emptydir/", "subdir1/", "test_01.txt"]
@test readdir(path; join=true) ==
[path / "emptydir/", path / "subdir1/", path / "test_01.txt"]
@test readdir(path / "emptydir/") == []
@test readdir(path / "emptydir/"; join=true) == []
@test readdir(path / "subdir1/") == ["subdir2/", "test_02.txt", "test_03.txt"]
@test readdir(path / "subdir1/"; join=true) == [
path / "subdir1/" / "subdir2/",
path / "subdir1/" / "test_02.txt",
path / "subdir1/" / "test_03.txt",
]
@test readdir(path / "subdir1/subdir2/") == ["subdir3/", "test_04.txt"]
@test readdir(path / "subdir1/subdir2/"; join=true) == [
path / "subdir1/subdir2/" / "subdir3/", path / "subdir1/subdir2/" / "test_04.txt"
]
@test readdir(path / "subdir1/subdir2/subdir3/") == []
@test readdir(path / "subdir1/subdir2/subdir3/"; join=true) == []
end
function verify_files(path::AbstractPath)
@test readdir(path) == ["emptydir", "subdir1", "test_01.txt"]
@test readdir(path; join=true) ==
[path / "emptydir", path / "subdir1", path / "test_01.txt"]
@test readdir(path / "emptydir/") == []
@test readdir(path / "emptydir/"; join=true) == []
@test readdir(path / "subdir1/") == ["subdir2", "test_02.txt", "test_03.txt"]
@test readdir(path / "subdir1/"; join=true) == [
path / "subdir1" / "subdir2",
path / "subdir1" / "test_02.txt",
path / "subdir1/" / "subdir1/test_03.txt",
]
@test readdir(path / "subdir1/subdir2/") == ["subdir3", "test_04.txt"]
@test readdir(path / "subdir1/subdir2/"; join=true) ==
[path / "subdir1/subdir2/" / "subdir3", path / "subdir1/subdir2/" / "test_04.txt"]
@test readdir(path / "subdir1/subdir2/subdir3/") == []
@test readdir(path / "subdir1/subdir2/subdir3/"; join=true) == []
end
# This is the main entrypoint for the S3Path tests
function s3path_tests(base_config)
bucket_name = gen_bucket_name()
let
config = assume_testset_role("CreateBucket"; base_config)
s3_create_bucket(config, bucket_name)
end
root = let
config = assume_testset_role("ReadWriteObject"; base_config)
S3Path("s3://$bucket_name/pathset-root/"; config)
end
ps = PathSet(
root,
root / "foo/",
root / "foo" / "baz.txt",
root / "bar/",
root / "bar" / "qux/",
root / "bar" / "qux" / "quux.tar.gz",
root / "fred/",
root / "fred" / "plugh",
false,
)
@testset "$(typeof(ps.root))" begin
testsets = [
test_s3_constructors,
test_registration,
test_show,
test_parse,
test_convert,
test_components,
test_indexing,
test_iteration,
test_s3_parents,
test_descendants_and_ascendants,
test_s3_join,
test_splitext,
test_basename,
test_filename,
test_extensions,
test_isempty,
test_s3_normalize,
# test_canonicalize, # real doesn't make sense for S3Paths
test_relative,
test_absolute,
test_isdir,
test_isfile,
test_stat,
test_filesize,
test_modified,
test_created,
test_cd,
test_s3_readpath,
test_walkpath,
test_read,
test_multipart_write,
test_write,
test_write_returns,
test_s3_mkdir,
# These tests seem to fail due to an eventual consistency issue?
test_s3_cp,
test_s3_mv,
test_s3_sync(ps),
test_symlink,
test_touch,
test_tmpname,
test_tmpdir,
test_mktmp,
test_mktmpdir,
test_s3_download(base_config),
test_issocket,
# These will also all work for our custom path type,
# but many implementations won't support them.
test_isfifo,
test_ischardev,
test_isblockdev,
test_ismount,
test_isexecutable,
test_isreadable,
test_iswritable,
# test_chown, # chmod & chown don't make sense for S3Paths
# test_chmod,
test_s3_properties(base_config),
test_s3_folders_and_files,
]
# Run all of the automated tests
#
# Note: `FilePathsBase.TestPaths.test` internally calls an `initialize` function
# which requires AWS permissions in order to write some files to S3. Due to this
# setup and how `test` passes in `ps` to each test it makes it hard to have each
# testset specify their required permissions separately. Currently, we embed the
# configuration in the paths themselves but it may make more sense to set the
# config globally temporarily via `with_aws_config`.
test(ps, testsets)
end
@testset "readdir" begin
config = assume_testset_role("ReadWriteObject"; base_config)
initialize(config, bucket_name)
@testset "S3" begin
verify_files(S3Path("s3://$bucket_name/"; config))
@test_throws ArgumentError("Invalid s3 path string: $bucket_name") S3Path(
bucket_name
)
end
@test_skip @testset "Local" begin
temp_path = Path(tempdir() * string(uuid4()))
mkdir(temp_path)
sync(S3Path("s3://$bucket_name/"; config), temp_path)
verify_files(temp_path)
rm(temp_path; force=true, recursive=true)
end
@testset "join" begin
@test ( # test trailing slash on prefix does not matter for join
p"s3://foo/bar" / "baz" == p"s3://foo/bar/" / "baz" == p"s3://foo/bar/baz"
)
@test ( # test trailing slash on root-only prefix in particular does not matter
p"s3://foo" / "bar" / "baz" ==
p"s3://foo/" / "bar" / "baz" ==
p"s3://foo/bar/baz"
)
# test extra leading and trailing slashes do not matter
@test p"s3://foo/" / "bar/" / "/baz" == p"s3://foo/bar/baz"
# test joining `/` and string concatentation `*` play nice as expected
@test p"s3://foo" * "/" / "bar" ==
p"s3://foo" / "/" * "bar" ==
p"s3://foo" / "bar"
@test p"s3://foo" / "bar" * "baz" ==
p"s3://foo/bar" * "baz" ==
p"s3://foo" / "barbaz"
# test trailing slash on final piece is included
@test p"s3://foo/bar" / "baz/" == p"s3://foo/bar/baz/"
end
@testset "readdir" begin
path = S3Path("s3://$(bucket_name)/A/A/B.txt"; config)
write(path, "test!")
results = readdir(S3Path("s3://$(bucket_name)/A/"; config))
@test results == ["A/"]
end
end
@testset "isdir" begin
config = assume_testset_role("ReadObject"; base_config)
function _generate_exception(code)
return AWSException(
code, "", nothing, AWS.HTTP.Exceptions.StatusError(404, "", "", ""), nothing
)
end
@testset "top level bucket" begin
@testset "success" begin
@test isdir(S3Path("s3://$(bucket_name)"; config))
@test isdir(S3Path("s3://$(bucket_name)/"; config))
end
@testset "NoSuchBucket" begin
test_exception = _generate_exception("NoSuchBucket")
patch = @patch function AWSS3.S3.list_objects_v2(args...; kwargs...)
throw(test_exception)
end
apply(patch) do
@test !isdir(S3Path("s3://$(bucket_name)"; config))
@test !isdir(S3Path("s3://$(bucket_name)/"; config))
end
end
@testset "Other Exception" begin
test_exception = _generate_exception("TestException")
patch = @patch function AWSS3.S3.list_objects_v2(args...; kwargs...)
throw(test_exception)
end
apply(patch) do
@test_throws AWSException isdir(S3Path("s3://$(bucket_name)"; config))
@test_throws AWSException isdir(S3Path("s3://$(bucket_name)/"; config))
end
end
end
# Based upon this example: https://repost.aws/knowledge-center/iam-s3-user-specific-folder
#
# MinIO isn't currently setup with the restrictive prefix required to make the tests
# fail with "AccessDenied".
is_aws(base_config) && @testset "Restricted Prefix" begin
setup_config = assume_testset_role("ReadWriteObject"; base_config)
s3_put(
setup_config,
bucket_name,
"prefix/denied/secrets/top-secret",
"for british eyes only",
)
s3_put(setup_config, bucket_name, "prefix/granted/file", "hello")
config = assume_testset_role("RestrictedPrefixTestset"; base_config)
@test isdir(S3Path("s3://$(bucket_name)/prefix/granted/"; config))
@test isdir(S3Path("s3://$(bucket_name)/prefix/"; config))
@test isdir(S3Path("s3://$(bucket_name)"; config))
@test_throws_msg ["AccessDenied", "403"] begin
isdir(S3Path("s3://$(bucket_name)/prefix/denied/"; config))
end
# The above call fails as we use `"prefix" => "prefix/denied/"`. However,
# this restricted role can still determine that the "denied" directory
# exists with some carefully crafted queries.
params = Dict("prefix" => "prefix/", "delimiter" => "/")
r = S3.list_objects_v2(bucket_name, params; aws_config=config)
prefixes = [x["Prefix"] for x in parse(r)["CommonPrefixes"]]
@test "prefix/denied/" in prefixes
@test_throws_msg ["AccessDenied", "403"] begin
!isdir(S3Path("s3://$(bucket_name)/prefix/dne/"; config))
end
@test_throws_msg ["AccessDenied", "403"] begin
!isdir(S3Path("s3://$(bucket_name)/prefix/denied/secrets/"; config))
end
end
end
@testset "JSON roundtripping" begin
config = assume_testset_role("ReadWriteObject"; base_config)
json_path = S3Path("s3://$(bucket_name)/test_json"; config)
my_dict = Dict("key" => "value", "key2" => 5.0)
# here we use the "application/json" MIME type to trigger the heuristic parsing into a `LittleDict`
# that will hit a `MethodError` at the `Vector{UInt8}` constructor of `read(::S3Path)` if `raw=true`
# was not passed to `s3_get` in that method.
result = s3_put(
config, bucket_name, "test_json", JSON3.write(my_dict), "application/json"
)
@test result == UInt8[]
json_bytes = read(json_path)
@test JSON3.read(json_bytes, Dict) == my_dict
rm(json_path)
end
@testset "Arrow <-> S3Path (de)serialization" begin
ver = String('A':'Z') * String('0':'5')
paths = Union{Missing,S3Path}[
missing,
S3Path("s3://$(bucket_name)/a"),
S3Path("s3://$(bucket_name)/b?versionId=$ver"),
# format trick: using this comment to force use of multiple lines
]
tbl = Arrow.Table(Arrow.tobuffer((; paths=paths)))
@test all(isequal.(tbl.paths, paths))
# Cannot serialize `S3Path`s with embedded `AWSConfig`s.
push!(paths, S3Path("s3://$(bucket_name)/c"; config=base_config))
@test_throws ArgumentError Arrow.tobuffer((; paths))
end
@testset "tryparse" begin
# The global `AWSConfig` is just used for comparison and isn't used for access
cfg = global_aws_config()
ver = String('A':'Z') * String('0':'5')
@test S3Path("s3://my_bucket/prefix/path") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", false, nothing, cfg)
@test S3Path("s3://my_bucket/prefix/path/") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", true, nothing, cfg)
@test S3Path("s3://my_bucket/") ==
S3Path((), "/", "s3://my_bucket", true, nothing, cfg)
@test S3Path("s3://my_bucket") ==
S3Path((), "", "s3://my_bucket", true, nothing, cfg)
@test S3Path("s3://my_bucket/prefix/path?versionId=$ver") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", false, ver, cfg)
@test S3Path("s3://my_bucket/prefix/path/?versionId=$ver") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", true, ver, cfg)
@test S3Path("s3://my_bucket/?versionId=$ver") ==
S3Path((), "/", "s3://my_bucket", true, ver, cfg)
@test S3Path("s3://my_bucket?versionId=$ver") ==
S3Path((), "", "s3://my_bucket", true, ver, cfg)
@test S3Path("s3://my_bucket/prefix/path/?versionId=$ver&radtimes=foo") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", true, ver, cfg)
@test S3Path("s3://my_bucket/prefix/path/?radtimes=foo&versionId=$ver") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", true, ver, cfg)
@test S3Path("s3://my_bucket/prefix/path?versionId=null") ==
S3Path(("prefix", "path"), "/", "s3://my_bucket", false, "null", cfg)
# Test to mark inconsistent root string behaviour when reconstructing parsed paths.
parsed = tryparse(S3Path, "s3://my_bucket")
@test_broken parsed == S3Path(
parsed.bucket, parsed.key; version=parsed.version, config=parsed.config
)
@test_throws ArgumentError S3Path("s3://my_bucket/?versionId=")
@test_throws ArgumentError S3Path("s3://my_bucket/?versionId=xyz")
end
@testset "version is empty" begin
@test_throws ArgumentError S3Path("my_bucket", "path"; version="")
@test_throws ArgumentError S3Path("s3://my_bucket/"; version="")
end
@testset "construct S3Path from S3Path" begin
# Use custom config to test that config is preserved in construction
config = AWSConfig(; region="bogus")
path = S3Path("s3://my_bucket/prefix"; config)
# When no kwargs provided, return identity
@test S3Path(path) === path
# version kwarg overrides path.version
version = String('A':'Z') * String('0':'5')
p = S3Path(path; version)
@test p != path
@test p.bucket == path.bucket
@test p.key == path.key
@test p.config == path.config
@test p.version == version != path.version
# ...if version already exists, overwrite silently
path_versioned = S3Path(path; version)
alt_version = String('0':'5') * String('A':'Z')
p = S3Path(path_versioned; version=alt_version)
@test p.version == alt_version != path_versioned.version
# config kwarg overrides path.config
alt_config = AWSConfig(; region="foo")
p = S3Path(path; config=alt_config)
@test p.config == alt_config != path.config
# isdirectory kwarg overrides path.config
p = S3Path(path; isdirectory=!path.isdirectory)
@test p.isdirectory != path.isdirectory
end
# `s3_list_versions` gives `SignatureDoesNotMatch` exceptions on Minio
if is_aws(base_config)
@testset "S3Path versioning" begin
config = assume_testset_role("S3PathVersioningTestset"; base_config)
s3_enable_versioning(config, bucket_name)
key = "test_versions"
r1 = s3_put(config, bucket_name, key, "data.v1"; parse_response=false)
r2 = s3_put(config, bucket_name, key, "data.v2"; parse_response=false)
rv1 = HTTP.header(r1.headers, "x-amz-version-id", nothing)
rv2 = HTTP.header(r2.headers, "x-amz-version-id", nothing)
# `s3_list_versions` returns versions in the order newest to oldest
listed_versions = s3_list_versions(config, bucket_name, key)
versions = [d["VersionId"] for d in reverse!(listed_versions)]
v1, v2 = first(versions), last(versions)
@test v1 == rv1
@test v2 == rv2
@test read(S3Path(bucket_name, key; config, version=v1), String) == "data.v1"
@test read(S3Path(bucket_name, key; config, version=v2), String) == "data.v2"
@test read(S3Path(bucket_name, key; config, version=v2), String) ==
read(S3Path(bucket_name, key; config), String)
@test read(S3Path(bucket_name, key; config, version=v2), String) ==
read(S3Path(bucket_name, key; config, version=nothing), String)
unversioned_path = S3Path(bucket_name, key; config)
versioned_path = S3Path(bucket_name, key; config, version=v2)
@test versioned_path.version == v2
@test unversioned_path.version === nothing
@test exists(versioned_path)
@test exists(unversioned_path)
dne = "feVMBvDgNiKSpMS17fKNJK3GV05bl8ir"
dne_versioned_path = S3Path(bucket_name, key; config, version=dne)
@test !exists(dne_versioned_path)
versioned_path_v1 = S3Path("s3://$bucket_name/$key"; version=v1)
versioned_path_v2 = S3Path("s3://$bucket_name/$key"; version=v2)
@test versioned_path_v1.version == v1
@test versioned_path_v1 != unversioned_path
@test versioned_path_v1 != versioned_path_v2
versioned_path_v1_from_url = S3Path("s3://$bucket_name/$key?versionId=$v1")
@test versioned_path_v1_from_url.key == key
@test versioned_path_v1_from_url.version == v1
@test S3Path("s3://$bucket_name/$key?versionId=$v1"; version=v1).version == v1
@test_throws ArgumentError begin
S3Path("s3://$bucket_name/$key?versionId=$v1"; version=v2)
end
str_v1 = string(versioned_path_v1)
roundtripped_v1 = S3Path(str_v1; config)
@test isequal(versioned_path_v1, roundtripped_v1)
@test str_v1 == "s3://" * bucket_name * "/" * key * "?versionId=" * v1
@test isa(stat(versioned_path), Status)
@test_throws ArgumentError write(versioned_path, "new_content")
rm(versioned_path)
@test !exists(versioned_path)
@test length(s3_list_versions(config, bucket_name, key)) == 1
fp = S3Path(bucket_name, "test_versions_nukeobject"; config)
foreach(_ -> write(fp, "foo"), 1:6)
@test length(s3_list_versions(fp.config, fp.bucket, fp.key)) == 6
s3_nuke_object(fp)
@test length(s3_list_versions(fp.config, fp.bucket, fp.key)) == 0
@test !exists(fp)
end
@testset "S3Path null version" begin
config = assume_testset_role("S3PathNullVersionTestset"; base_config)
b = gen_bucket_name("awss3.jl.test.null.")
k = "object"
function versioning_enabled(config, bucket)
d = parse(S3.get_bucket_versioning(bucket; aws_config=config))
return get(d, "Status", "Disabled") == "Enabled"
end
function list_version_ids(args...)
return [d["VersionId"] for d in reverse!(s3_list_versions(args...))]
end
try
# Create a new bucket that we know does not have versioning enabled
s3_create_bucket(config, b)
@test !versioning_enabled(config, b)
# Create an object which will have versionId set to "null"
r1 = s3_put(config, b, k, "original"; parse_response=false)
rv1 = HTTP.header(r1.headers, "x-amz-version-id", nothing)
@test isnothing(rv1)
versions = list_version_ids(config, b, k)
@test length(versions) == 1
@test versions[1] == "null"
@test read(S3Path(b, k; config, version=versions[1])) == b"original"
s3_enable_versioning(config, b)
@test versioning_enabled(config, b)
# Overwrite the original object with a new version
r2 = s3_put(config, b, k, "new and improved!"; parse_response=false)
rv2 = HTTP.header(r2.headers, "x-amz-version-id", nothing)
@test !isnothing(rv2)
versions = list_version_ids(config, b, k)
@test length(versions) == 2
@test versions[1] == "null"
@test versions[2] != "null"
@test versions[2] == rv2
@test read(S3Path(b, k; config, version=versions[1])) == b"original"
@test read(S3Path(b, k; config, version=versions[2])) ==
b"new and improved!"
finally
AWSS3.s3_nuke_bucket(config, b)
end
end
end
# <https://github.com/JuliaCloud/AWSS3.jl/issues/168>
@testset "Default `S3Path` does not freeze config" begin
path = S3Path("s3://$(bucket_name)/test_str.txt")
@test path.config === nothing
@test AWSS3.get_config(path) !== nothing
end
@testset "No-op constructor" begin
path = S3Path("s3://$(bucket_name)/test_str.txt")
path2 = S3Path(path)
@test path == path2
end
# MinIO does not care about regions, so this test doesn't work there
if is_aws(base_config)
@testset "Global config is not frozen at construction time" begin
config = assume_testset_role("ReadWriteObject"; base_config)
with_aws_config(config) do
# Setup: create a file holding a string `abc`
path = S3Path("s3://$(bucket_name)/test_str.txt")
write(path, "abc")
@test read(path, String) == "abc" # Have access to read file
alt_region = config.region == "us-east-2" ? "us-east-1" : "us-east-2"
alt_config = AWSConfig(; region=alt_region) # this is the wrong region!
with_aws_config(alt_config) do
@test_throws AWS.AWSException read(path, String)
end
# Now it works, without recreating `path`
@test read(path, String) == "abc"
rm(path)
end
end
end
# Broken on MinIO
if is_aws(base_config)
config = assume_testset_role("NukeBucket"; base_config)
AWSS3.s3_nuke_bucket(config, bucket_name)
end
end
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | code | 3514 | using Test: @test, @test_throws, contains_warn
const BUCKET_DATE_FORMAT = dateformat"yyyymmdd\THHMMSS\Z"
is_aws(config) = config isa AWSConfig
AWS.aws_account_number(::Minio.MinioConfig) = "123"
function minio_server(body, dirs=[mktempdir()]; address="localhost:9005")
server = Minio.Server(dirs; address)
try
run(server; wait=false)
sleep(0.5) # give the server just a bit of time, though it is amazingly fast to start
config = MinioConfig(
"http://$address"; username="minioadmin", password="minioadmin"
)
body(config)
finally
# Make sure we kill the server even if a test failed.
kill(server)
end
end
function gen_bucket_name(prefix="awss3.jl.test.")
    # https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html
return lowercase(prefix * Dates.format(now(Dates.UTC), BUCKET_DATE_FORMAT))
end
function assume_role(aws_config::AbstractAWSConfig, role; duration=nothing)
if startswith(role, "arn:aws:iam")
role_arn = role
role_name = basename(role)
else
response = AWSServices.sts(
"GetCallerIdentity";
aws_config,
feature_set=AWS.FeatureSet(; use_response_type=true),
)
account_id = parse(response)["GetCallerIdentityResult"]["Account"]
role_name = role
role_arn = "arn:aws:iam::$account_id:role/$role_name"
end
role_session = AWS._role_session_name(
"AWS.jl-role-",
role_name,
"-" * Dates.format(now(UTC), dateformat"yyyymmdd\THHMMSS\Z"),
)
params = Dict{String,Any}("RoleArn" => role_arn, "RoleSessionName" => role_session)
if duration !== nothing
params["DurationSeconds"] = duration
end
response = AWSServices.sts(
"AssumeRole",
params;
aws_config,
feature_set=AWS.FeatureSet(; use_response_type=true),
)
dict = parse(response)
role_creds = dict["AssumeRoleResult"]["Credentials"]
role_user = dict["AssumeRoleResult"]["AssumedRoleUser"]
return AWSConfig(;
creds=AWSCredentials(
role_creds["AccessKeyId"],
role_creds["SecretAccessKey"],
role_creds["SessionToken"],
role_user["Arn"];
expiry=DateTime(rstrip(role_creds["Expiration"], 'Z')),
            renew=() -> assume_role(aws_config, role_arn; duration).credentials,
),
)
end
# TODO: We're ignoring assume role calls when using a `MinioConfig` as we don't yet support
# this.
function assume_role(config::MinioConfig, role; kwargs...)
return config
end
function assume_testset_role(role_suffix; base_config)
return assume_role(base_config, "AWSS3.jl-$role_suffix")
end
function with_aws_config(f, config::AbstractAWSConfig)
local result
old_config = global_aws_config()
global_aws_config(config)
try
result = f()
finally
global_aws_config(old_config)
end
return result
end
# Rudimentary support for `@test_throws ["Try", "Complex"] sqrt(-1)` for Julia 1.6
macro test_throws_msg(extype, ex)
# https://github.com/JuliaLang/julia/pull/41888
expr = if VERSION >= v"1.8.0-DEV.363"
:(@test_throws $extype $ex)
else
quote
@test try
$ex
false
catch e
exc = sprint(showerror, e)
contains_warn(exc, $extype)
end
end
end
return esc(expr)
end
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | docs | 1952 | # AWSS3.jl v0.11 Release Notes
## Breaking changes
- v0.11.0: The `s3_exists`, `isdir(::S3Path)`, and `isfile(::S3Path)` calls now specify the `delimiter` to be `"/"` instead of `""` to support IAM policies which allow limited access to specified prefixes (see this [example](https://github.com/JuliaCloud/AWSS3.jl/pull/289#discussion_r1224636214)). Users who previously used the IAM policy condition `{"Condition":{"StringEquals":{"s3:delimiter":[""]}}}` with AWSS3.jl v0.10 will need to update their IAM policy to `{"s3:delimiter":["/"]}` with AWSS3.jl v0.11.0. To maintain compatibility with both versions of AWSS3.jl, use the policy `{"s3:delimiter":["","/"]}`. Any policies not using the `s3:delimiter` condition are unaffected ([#289]).
## Non-breaking changes
- v0.11.0: The `s3_exists` and `isdir(::S3Path)` calls no longer encounter HTTP 403 (Access Denied) errors when attempting to list resources which require an `s3:prefix` to be specified ([#289]).
- v0.11.1: The new keyword argument `returns` for `Base.write(fp::S3Path, ...)` determines the output returned from `write`, which can now be the raw `AWS.Response` (`returns=:response`) or the `S3Path` (`returns=:path`); this latter option returns an `S3Path` populated with the version ID of the written object (when versioning is enabled on the bucket) ([#293]).
- v0.11.2: `s3_copy` supports the `parse_response` keyword allowing for access to the unparsed AWS API response ([#300]).
- v0.11.2: Added `s3_nuke_object` function to delete all versions of an object ([#299]).
- v0.11.2: Added `S3Path` copy constructor for allowing updating `version`, `config`, and/or `isdirectory` ([#297]).
[#289]: https://github.com/JuliaCloud/AWSS3.jl/pull/289
[#293]: https://github.com/JuliaCloud/AWSS3.jl/pull/293
[#297]: https://github.com/JuliaCloud/AWSS3.jl/pull/297
[#299]: https://github.com/JuliaCloud/AWSS3.jl/pull/299
[#300]: https://github.com/JuliaCloud/AWSS3.jl/pull/300
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | docs | 1691 | # AWSS3
AWS S3 Interface for Julia
[](https://github.com/JuliaCloud/AWSS3.jl/actions/workflows/CI.yml)
[](https://github.com/invenia/BlueStyle)
[](https://github.com/SciML/ColPrac)
**Installation**: at the Julia REPL, `using Pkg; Pkg.add("AWSS3")`
**Documentation**: [![][docs-stable-img]][docs-stable-url] [![][docs-latest-img]][docs-latest-url]
[docs-latest-img]: https://img.shields.io/badge/docs-latest-blue.svg
[docs-latest-url]: http://juliacloud.github.io/AWSS3.jl/dev/
[docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg
[docs-stable-url]: http://juliacloud.github.io/AWSS3.jl/stable/
## Example
```julia
using AWSS3
using AWS # for `global_aws_config`
aws = global_aws_config(; region="us-east-2") # pass keyword arguments to change defaults
s3_create_bucket(aws, "my.bucket")
# if the config is omitted it will try to infer it as usual from AWS.jl
s3_delete_bucket("my.bucket")
p = S3Path("s3://my.bucket/test1.txt") # provides an filesystem-like interface
write(p, "some data")
read(p, byte_range=1:4) # returns b"some"
response = write(p, "other data"; returns=:response) # returns the raw `AWS.Response` on writing to S3
parsed_response = write(p, "other data"; returns=:parsed) # returns the parsed `AWS.Response` (default)
versioned_path = write(p, "other data"; returns=:path) # returns the `S3Path` written to S3, including the version ID
```
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | docs | 658 | ```@meta
CurrentModule = AWSS3
```
## S3 Interaction
```@docs
s3_arn
s3_get
s3_get_file
s3_get_meta
s3_exists
s3_delete
s3_copy
s3_create_bucket
s3_put_cors
s3_enable_versioning
s3_put_tags
s3_get_tags
s3_delete_tags
s3_delete_bucket
s3_list_buckets
s3_list_objects
s3_list_keys
s3_purge_versions
s3_put
s3_sign_url
s3_nuke_bucket
```
## `S3Path`
Note that `S3Path` implements the `AbstractPath` interface; see the FilePathsBase documentation for
the interface [here](https://rofinn.github.io/FilePathsBase.jl/stable/api/).
```@docs
S3Path
stat
mkdir
read
get_config
```
## Internal
```@docs
_s3_exists_dir
s3_exists_versioned
s3_exists_unversioned
```
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.11.2 | d87804d72660de156ceb3f675e5c6bbdc9bee607 | docs | 3057 | ```@meta
CurrentModule = AWSS3
```
# AWSS3.jl
AWSS3.jl is a Julia package for interacting with key-value data storage services [AWS S3](https://aws.amazon.com/s3/)
and [min.io](https://min.io/). It operates through HTTP calls to a REST API service. It is based
on the package [AWS.jl](https://github.com/JuliaCloud/AWS.jl) which provides a direct wrapper to
low-level API calls but provides a great deal of additional convenient functionality.
## Quick Start
```julia
using AWSS3
using AWS # for `global_aws_config`
aws = global_aws_config(; region="us-east-2") # pass keyword arguments to change defaults
s3_create_bucket(aws, "my.bucket")
s3_enable_versioning(aws, "my.bucket")
s3_put(aws, "my.bucket", "key", "Hello!")
println(s3_get(aws, "my.bucket", "key")) # prints "Hello!"
println(s3_get(aws, "my.bucket", "key", byte_range=1:2)) # prints only "He"
```
## `S3Path`
This package provides the `S3Path` object which implements the
[FilePathsBase](https://github.com/rofinn/FilePathsBase.jl) interface, thus providing a
filesystem-like abstraction for interacting with S3. In particular, this allows for interacting
with S3 using the [filesystem interface](https://docs.julialang.org/en/v1/base/file/) provided by
Julia's `Base`. This makes it possible to (mostly) write code which works the same way for S3 as it
does for the local filesystem.
```julia
julia> using AWSS3, AWS, FilePathsBase;
# global_aws_config() is also the default if no `config` argument is passed
julia> p = S3Path("s3://bucket-name/dir1/", config=global_aws_config());
julia> readdir(p)
1-element Vector{SubString{String}}:
"demo.txt"
julia> file = joinpath(p, "demo.txt")
p"s3://bucket-name/dir1/demo.txt"
julia> stat(file)
Status(
device = 0,
inode = 0,
mode = -rw-rw-rw-,
nlink = 0,
uid = 1000 (username),
gid = 1000 (username),
rdev = 0,
size = 34 (34.0),
blksize = 4096 (4.0K),
blocks = 1,
mtime = 2021-01-30T18:53:02,
ctime = 2021-01-30T18:53:02,
)
julia> String(read(file)) # fetch the file into memory
"this is a file for testing S3Path\n"
julia> String(read(file, byte_range=1:4)) # fetch a specific byte range of the file
"this"
julia> rm(file) # delete the file
UInt8[]
```
!!! warning
S3 is a pure [key-value store](https://en.wikipedia.org/wiki/Key%E2%80%93value_database),
    **NOT** a filesystem. Therefore, though S3 has, over time, gained features which often mimic a
filesystem interface, in some cases it can behave very differently. In particular "empty
directories" are, in actuality, 0-byte files and can have some unexpected behavior, e.g. there
is no `stat(dir)` like in a true filesystem.
## Min.io
Min.io is fully compatible with the S3 API and therefore this package can be used to interact with
it. To use Min.io requires a dedicated AWS configuration object, see the
[Minio.jl](https://gitlab.com/ExpandingMan/Minio.jl) package. This package also contains some
convenience functions for easily setting up a server for experimentation and testing with the
min.io/S3 interface.
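For quick local experimentation, a minimal sketch (assuming a min.io server listening on
`localhost:9000` with the default `minioadmin` credentials; adjust both to your deployment):

```julia
using Minio, AWS, AWSS3

config = MinioConfig("http://localhost:9000"; username="minioadmin", password="minioadmin")
global_aws_config(config)  # use the min.io server as the default configuration

s3_create_bucket("my.bucket")
s3_put("my.bucket", "key", "Hello from min.io!")
println(s3_get("my.bucket", "key"))  # prints "Hello from min.io!"
```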
| AWSS3 | https://github.com/JuliaCloud/AWSS3.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 832 | using Documenter, NeutralLandscapes
import Literate
# For GR docs bug
ENV["GKSwstype"] = "100"
vignettes = filter(
endswith(".jl"),
readdir(joinpath(@__DIR__, "src", "vignettes"); join = true, sort = true),
)
for vignette in vignettes
Literate.markdown(
vignette,
joinpath(@__DIR__, "src", "vignettes");
config = Dict("credit" => false, "execute" => true),
)
end
makedocs(;
sitename = "NeutralLandscapes",
authors = "M.D. Catchen",
modules = [NeutralLandscapes],
pages = [
"Index" => "index.md",
"Gallery" => "gallery.md",
"Vignettes" => [
"Overview" => "vignettes/overview.md",
],
],
checkdocs = :all,
)
deploydocs(
repo="github.com/EcoJulia/NeutralLandscapes.jl.git",
devbranch="main",
push_preview=true
)
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1787 | module NeutralLandscapes
import NaNMath
using StatsBase: sample, ZScoreTransform, fit, transform
using Random: rand!
using Statistics: quantile, mean
using Distributions: Normal, LogNormal, MvNormal, Categorical, pdf
using NearestNeighbors: KDTree, knn, nn, always_false, knn_point!, SVector
using DataStructures: IntDisjointSets, union!, find_root, push!
using Base: @kwdef
using HaltonSequences: haltonvalue
export rand, rand!
export classify!, classify, blend, label
export NeutralLandscapeMaker
export DiscreteVoronoi
export DiamondSquare, MidpointDisplacement
export EdgeGradient
export DistanceGradient
export NearestNeighborCluster
export NearestNeighborElement
export NoGradient
export PerlinNoise
export PlanarGradient
export RectangularCluster
export WaveSurface
export Patches
include("landscape.jl")
include("classify.jl")
include("makers/diamondsquare.jl")
include("makers/discretevoronoi.jl")
include("makers/distancegradient.jl")
include("makers/edgegradient.jl")
include("makers/nncluster.jl")
include("makers/nnelement.jl")
include("makers/nogradient.jl")
include("makers/perlinnoise.jl")
include("makers/planargradient.jl")
include("makers/rectangularcluster.jl")
include("makers/wavesurface.jl")
include("makers/patches.jl")
include("updaters/update.jl")
include("updaters/temporal.jl")
include("updaters/spatial.jl")
include("updaters/spatiotemporal.jl")
export update, update!
export NeutralLandscapeUpdater
export TemporallyVariableUpdater
export SpatiallyAutocorrelatedUpdater
export SpatiotemporallyAutocorrelatedUpdater
export rate, variability
export normalize
using Requires
function __init__()
@require SpeciesDistributionToolkit="72b53823-5c0b-4575-ad0e-8e97227ad13b" include(joinpath("integrations", "simplesdmlayers.jl"))
end
end # module
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 5007 | """
classify!(array, weights[, mask])
Classify an array in-place into proportions based upon a list of class weights.
"""
function classify!(array, weights, mask = nothing)
quantiles = zeros(length(weights))
cumsum!(quantiles, weights)
quantiles ./= quantiles[end]
boundaryvalues = if isnothing(mask)
quantile(filter(isfinite, array), quantiles)
else
quantile(array[mask .& isfinite.(array)], quantiles)
end
for i in eachindex(array)
array[i] = isnan(array[i]) ? NaN : searchsortedfirst(boundaryvalues, array[i])
end
array
end
classify!(array, weights::Integer, mask = nothing) = classify!(array, ones(weights), mask)
"""
classify(array, weights[, mask])
Classify an array into proportions based upon a list of class weights.
"""
classify(array, weights, mask = nothing) = classify!(copy(array), weights, mask)
classify(array, weights::Integer, mask = nothing) = classify(array, ones(weights), mask)
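# A usage sketch: split a continuous landscape into three discrete classes
# covering 50%, 25%, and 25% of the cells (assumes the default constructor):
#
#     env = rand(MidpointDisplacement(), (64, 64))
#     classify!(env, [2, 1, 1])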
# replace each labeled cluster of `clusterArray` with the mean value of `array`
# over that cluster, then rescale the result
function _clusterMean(clusterArray, array)
clusters = Dict{Float64, Float64}()
clustersum = Dict{Float64, Float64}()
labels, nlabels = label(clusterArray)
for ind in eachindex(labels, array)
temp = labels[ind]
if !haskey(clusters, temp)
clusters[temp] = clustersum[temp] = 0.0
end
clusters[temp] += 1.0
clustersum[temp] += array[ind]
end
for cl in keys(clusters)
clustersum[cl] /= clusters[cl]
end
clustersum[NaN] = NaN
_rescale!(get.(Ref(clustersum), labels, NaN))
end
"""
blend(arrays[, scaling])
Blend arrays weighted by scaling factors.
"""
function blend(arrays, scaling::AbstractVector{<:Number} = ones(length(arrays)))
if length(scaling) != length(arrays)
throw(DimensionMismatch("The array of landscapes (n = $(length(arrays))) and scaling (n = $(length(scaling))) must have the same length"))
end
ret = sum(arrays .* scaling)
_rescale!(ret)
end
"""
blend(clusterarray, arrays[, scaling])
Blend a primary cluster NLM with other arrays in which the mean value per
cluster is weighted by scaling factors.
"""
function blend(clusterarray, arrays::AbstractVector, scaling::AbstractVector{<:Number} = ones(length(arrays)))
ret = sum(_clusterMean.(Ref(clusterarray), arrays) .* scaling)
_rescale!(clusterarray + ret)
end
blend(clusterarray, array, scaling = 1) = blend(clusterarray, [array], [scaling])
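# Illustrative usage sketch (added for clarity, not part of the package source):
# blend a planar gradient with white noise, down-weighting the noise contribution.
#=
using NeutralLandscapes
grad = rand(PlanarGradient(), (40, 40))
noise = rand(NoGradient(), (40, 40))
blended = blend([grad, noise], [1.0, 0.3]) # result is rescaled back to [0, 1]
=#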
const _neighborhoods = Dict(
:rook => [(1, 0), (0, 1)],
:diagonal => [(1, 0), (0, 1), (1, 1)],
:queen => [(1, 0), (0, 1), (1, 1), (1, -1)],
)
"""
label(mat[, neighborhood = :rook])
Assign an arbitrary label to all clusters of contiguous matrix elements with the same value.
Returns a matrix of values and the total number of final clusters.
The `neighborhood` structure can be
`:rook`    `:queen`    `:diagonal`
0 1 0      1 1 1       0 1 1
1 x 1      1 x 1       1 x 1
0 1 0      1 1 1       1 1 0
`:rook` is the default
"""
function label(mat, neighborhood = :rook)
neighbors = _neighborhoods[neighborhood]
m, n = size(mat)
(m >= 3 && n >= 3) || error("The label algorithm requires the landscape to be at least 3 cells in each direction")
# initialize objects
ret = similar(mat)
clusters = IntDisjointSets(0)
# run through the matrix and make clusters
for j in axes(mat, 2), i in axes(mat, 1)
if isfinite(mat[i, j])
same = Bool[]
for neigh in neighbors
x,y = i-neigh[1], j-neigh[2]
push!(same, 1 <= x <= m && 1 <= y <= n && mat[x,y]==mat[i,j])
end
if count(same) == 0
push!(clusters)
ret[i, j] = length(clusters)
elseif count(same) == 1
n1, n2 = only(neighbors[same])
ret[i, j] = ret[i - n1, j - n2]
else
vals = []
for neigh in neighbors[same]
x,y = i-neigh[1], j-neigh[2]
push!(vals, ret[x,y])
end
unique!(vals)
if length(vals) == 1
ret[i, j] = only(vals)
else
for v in vals[2:end]
ret[i, j] = union!(clusters, Int(vals[1]), Int(v))
end
end
end
else
ret[i, j] = NaN
end
end
# merge adjacent clusters with same value
finalclusters = Set{eltype(ret)}()
for i in eachindex(ret)
ret[i] = isnan(ret[i]) ? NaN : find_root(clusters, Int(ret[i]))
push!(finalclusters, ret[i])
end
# assign each cluster a random number in steps of 1 (good for plotting)
randomcode = Dict(i => j for (i, j) in zip(finalclusters, 1.0:length(finalclusters)))
randomcode[NaN] = NaN
for i in eachindex(ret)
ret[i] = randomcode[ret[i]]
end
ret, length(finalclusters)
end
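# Illustrative usage sketch (added for clarity, not part of the package source):
# label the contiguous patches of a two-class percolation landscape.
#=
using NeutralLandscapes
binary = classify(rand(NoGradient(), (20, 20)), [0.5, 0.5])
labels, nclusters = label(binary, :rook)
=#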
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1924 | import Random.rand!
"""
NeutralLandscapeMaker
Abstract supertype that all algorithms are descended from. A new
algorithm must minimally implement a `_landscape!` method for this type.
"""
abstract type NeutralLandscapeMaker end
"""
rand(alg, dims::Tuple{Int64,Int64}; mask=nothing)
Creates a landscape of size `dims` (a tuple of two integers) following the model
defined by `alg`. The `mask` argument accepts a matrix of boolean values, and is
passed to `mask!` if it is not `nothing`.
"""
function Base.rand(alg::T, dims::Tuple{Int64,Int64}; mask=nothing) where {T <: NeutralLandscapeMaker}
ret = Matrix{Float64}(undef, dims...)
rand!(ret, alg; mask=mask)
end
Base.rand(alg::T, dims::Integer...; mask=nothing) where {T <: NeutralLandscapeMaker} = rand(alg, dims; mask = mask)
"""
rand!(mat, alg; mask=nothing)
Fill the matrix `mat` with a landscape created following the model defined by
`alg`. The `mask` argument accepts a matrix of boolean values, and is passed to
`mask!` if it is not `nothing`.
"""
function rand!(mat::AbstractMatrix{<:AbstractFloat}, alg::T; mask=nothing) where {T <: NeutralLandscapeMaker}
_landscape!(mat, alg)
isnothing(mask) || mask!(mat, mask)
_rescale!(mat)
end
"""
mask!(array::AbstractArray{<:AbstractFloat}, maskarray::AbstractArray{<:Bool})
Modifies `array` so that the positions at which `maskarray` is `false` are
replaced by `NaN`.
"""
function mask!(array::AbstractArray{<:Float64}, maskarray::AbstractArray{<:Bool})
(size(array) == size(maskarray)) || throw(DimensionMismatch("The dimensions of array, $(size(array)), and maskarray, $(size(maskarray)), must match. "))
array[.!maskarray] .= NaN
return array
end
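# Illustrative usage sketch (added for clarity, not part of the package source):
# cells where the mask is `false` come back as `NaN`.
#=
using NeutralLandscapes
m = rand(Bool, 30, 30)
land = rand(PlanarGradient(45.0), (30, 30); mask=m)
=#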
# Changes the matrix `mat` so that it is between `0` and `1`.
function _rescale!(mat)
mn, mx = NaNMath.extrema(mat)
mat .= (mat .- mn) ./ (mx - mn)
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 531 | @info "Loading NeutralLandscapes support for SimpleSDMLayers.jl..."
"""
NeutralLandscapes.mask!(array::AbstractArray{<:Float64}, masklayer::T) where {T<:SimpleSDMLayers.SimpleSDMLayer}
Masks an `array` by the values of `nothing` in `masklayer`, where `masklayer` is
a `SimpleSDMLayer`
"""
function NeutralLandscapes.mask!(array::AbstractArray{<:Float64}, masklayer::T) where {T<:SpeciesDistributionToolkit.SimpleSDMLayers.SimpleSDMLayer}
I = findall(isnothing, masklayer.grid)
array[I] .= NaN
return array
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 9631 | """
DiamondSquare <: NeutralLandscapeMaker
DiamondSquare(; H = 0.5)
DiamondSquare(H)
This type generates a neutral landscape using the diamond-squares
algorithm, which produces fractals with variable spatial autocorrelation.
https://en.wikipedia.org/wiki/Diamond-square_algorithm
The algorithm is named diamond-square because it is an iterative procedure of
"diamond" and "square" steps.
The degree of spatial autocorrelation is controlled by a parameter `H`,
which varies from 0.0 (low autocorrelation) to 1.0 (high autocorrelation) ---
note this is non-inclusive and H = 0 and H = 1 will not behave as expected.
The result of the diamond-square algorithm is a fractal with dimension D = 2 + H.
A similar algorithm, midpoint-displacement, functions almost
identically, except that in DiamondSquare, the square step interpolates
edge midpoints from the nearest two corners and the square's center, whereas
midpoint-displacement only interpolates from the nearest corners (see `MidpointDisplacement`).
"""
@kwdef struct DiamondSquare <: NeutralLandscapeMaker
H::Float64 = 0.5
function DiamondSquare(H::T) where {T <: Real}
@assert 0 <= H < 1
new(H)
end
end
"""
MidpointDisplacement <: NeutralLandscapeMaker
MidpointDisplacement(; H = 0.5)
Creates a midpoint-displacement algorithm object `MidpointDisplacement`.
The degree of spatial autocorrelation is controlled by a parameter `H`,
which varies from 0.0 (low autocorrelation) to 1.0 (high autocorrelation) ---
note this is non-inclusive and H = 0 and H = 1 will not behave as expected.
A similar algorithm, diamond-square, functions almost
identically, except that in diamond-square, the square step interpolates
edge midpoints from the nearest two corners and the square's center, whereas
`MidpointDisplacement` only interpolates from the nearest corners (see `DiamondSquare`).
"""
@kwdef struct MidpointDisplacement <: NeutralLandscapeMaker
H::Float64 = 0.5
function MidpointDisplacement(H::T) where {T <: Real}
@assert 0 <= H < 1
new(H)
end
end
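# Illustrative usage sketch (added for clarity, not part of the package source):
# `H` close to 1 yields a smoother, more autocorrelated surface than `H` close to 0.
#=
using NeutralLandscapes
rough = rand(DiamondSquare(0.1), (65, 65))
smooth = rand(DiamondSquare(0.9), (65, 65))
=#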
# Check if `mat` is the right size. If mat is not the correct size (DiamondSquare
# can only run on a lattice of size NxN where N = (2^n)+1 for integer n),
# allocates the smallest lattice large enough to contain `mat` that can run
# DiamondSquare.
function _landscape!(mat, alg::Union{DiamondSquare, MidpointDisplacement}; kw...)
rightSize::Bool = _isPowerOfTwo(size(mat)[1]-1) && _isPowerOfTwo(size(mat)[2]-1)
latticeSize::Int = size(mat)[1]
dsMat = mat
if !rightSize
dim1, dim2 = size(mat)
smallestContainingLattice::Int = 2^ceil(log2(max(dim1, dim2))) + 1
dsMat = zeros(smallestContainingLattice, smallestContainingLattice)
end
_diamondsquare!(dsMat, alg)
mat .= dsMat[1:size(mat)[1], 1:size(mat)[2]]
end
# Runs the diamond-square algorithm on a matrix `mat` of size
# `NxN`, where `N=(2^n)+1` for some integer `n`, i.e. (N=5,9,17,33,65)
# Diamond-square is an iterative procedure, where the lattice is divided
# into subsquares in subsequent rounds. At each round, the subsquares shrink in size,
# as previously uninitialized values in the lattice are interpolated as a mean of nearby points plus random displacement.
# As the rounds increase, the magnitude of this displacement decreases. This creates spatial autocorrelation, which is controlled
# by a single parameter `H` which varies between `0` (no autocorrelation) and `1` (high autocorrelation)
function _diamondsquare!(mat, alg)
latticeSize = size(mat)[1]
numberOfRounds::Int = log2(latticeSize-1)
_initializeDiamondSquare!(mat, alg)
for round in 0:(numberOfRounds-1) # counting from 0 saves us a headache later
subsquareSideLength::Int = 2^(numberOfRounds-(round))
numberOfSubsquaresPerAxis::Int = ((latticeSize-1) / subsquareSideLength)-1
for x in 0:numberOfSubsquaresPerAxis # iterate over the subsquares within the lattice at this side length
for y in 0:numberOfSubsquaresPerAxis
subsquareCorners = _subsquareCornerCoordinates(x,y,subsquareSideLength)
_diamond!(mat, alg, round, subsquareCorners)
_square!(mat, alg, round, subsquareCorners)
end
end
end
end
# Initialize's the `DiamondSquare` algorithm by displacing the four corners of the
# lattice using `displace`, scaled by the algorithm's autocorrelation `H`.
function _initializeDiamondSquare!(mat, alg)
latticeSize = size(mat)[1]
corners = _subsquareCornerCoordinates(0,0, latticeSize-1)
for mp in corners
mat[mp...] = _displace(alg.H, 1)
end
end
# Returns the coordinates for the corners of the subsquare (x,y) given a side-length `sideLength`.
function _subsquareCornerCoordinates(x::Int, y::Int, sideLength::Int)
return (1 .+ sideLength.*i for i in ((x,y), (x+1, y), (x, y+1), (x+1, y+1)))
end
# Runs the diamond step of the `DiamondSquare` algorithm on the square defined by
# `corners` on the matrix `mat`. The center of the square is interpolated from the
# four corners, and is displaced. The displacement is drawn according to `alg.H` and round using `displace`
function _diamond!(mat, alg, round::Int, corners)
centerPt = _centerCoordinate(corners)
mat[centerPt...] = _interpolate(mat, corners) + _displace(alg.H, round)
end
# Runs the square step of the `DiamondSquare` algorithm on the square defined
# by `corners` on the matrix `mat`. The midpoint of each edge of this square is interpolated
# by computing the mean value of the two corners on the edge and the center of the square, and the
# displacing it. The displacement is drawn according to `alg.H` and round using `displace`
function _square!(mat, alg::DiamondSquare, round::Int, corners)
bottomLeft,bottomRight,topLeft,topRight = corners
leftEdge, bottomEdge, topEdge, rightEdge = _edgeMidpointCoordinates(corners)
centerPoint = _centerCoordinate(corners)
mat[leftEdge...] = _interpolate(mat, (topLeft,bottomLeft,centerPoint)) + _displace(alg.H, round)
mat[bottomEdge...] = _interpolate(mat, (bottomLeft,bottomRight,centerPoint)) + _displace(alg.H, round)
mat[topEdge...] = _interpolate(mat, (topLeft,topRight,centerPoint)) + _displace(alg.H, round)
mat[rightEdge...] = _interpolate(mat, (topRight,bottomRight,centerPoint)) + _displace(alg.H, round)
end
# Runs the square step of the `MidpointDisplacement` algorithm on the square defined
# by `corners` on the matrix `mat`. The midpoint of each edge of this square is interpolated
# by computing the mean value of the two corners on the edge and the center of the square, and the
# displacing it. The displacement is drawn according to `alg.H` and round using `displace`
function _square!(mat, alg::MidpointDisplacement, round::Int, corners)
bottomLeft,bottomRight,topLeft,topRight = corners
leftEdge, bottomEdge, topEdge, rightEdge = _edgeMidpointCoordinates(corners)
mat[leftEdge...] = _interpolate(mat, (topLeft,bottomLeft)) + _displace(alg.H, round)
mat[bottomEdge...] = _interpolate(mat, (bottomLeft,bottomRight)) + _displace(alg.H, round)
mat[topEdge...] = _interpolate(mat, (topLeft,topRight)) + _displace(alg.H, round)
mat[rightEdge...] = _interpolate(mat, (topRight,bottomRight)) + _displace(alg.H, round)
end
# Computes the mean of a set of points, represented as a list of indicies to a matrix `mat`.
function _interpolate(mat, points)
return mean(mat[pt...] for pt in points)
end
# `displace` produces a random value as a function of `H`, which is the
# autocorrelation parameter used in `DiamondSquare` and must be between `0`
# and `1`, and `round` which describes the current tiling size for the
# DiamondSquare() algorithm.
# Random values are drawn from a Gaussian distribution using `Distributions.Normal`
# The standard deviation of this Gaussian, σ, is set to (1/2)^(round*H), which will
# move from 1.0 to 0 as `round` increases.
function _displace(H::Float64, round::Int)
σ = 0.5^(round*H)
return rand(Normal(0, σ))
end
# Returns the center coordinate for a square defined by `corners` for the
# `DiamondSquare` algorithm.
function _centerCoordinate(corners)
bottomLeft,bottomRight,topLeft,topRight = corners
centerX::Int = (_xcoord(bottomLeft)+_xcoord(bottomRight)) ÷ 2
centerY::Int = (_ycoord(topRight)+_ycoord(bottomRight)) ÷ 2
return (centerX, centerY)
end
# Returns the x-coordinate from a lattice coordinate `pt`.
_xcoord(pt::Tuple{Int,Int}) = pt[1]
# Returns the y-coordinate from a lattice coordinate `pt`.
_ycoord(pt::Tuple{Int,Int}) = pt[2]
# Returns an array of midpoints for a square defined by `corners` for the `DiamondSquare` algorithm.
function _edgeMidpointCoordinates(corners)
# bottom left, bottom right, top left, top right
bottomLeft,bottomRight,topLeft,topRight = corners
leftEdgeMidpoint::Tuple{Int,Int} = (_xcoord(bottomLeft), (_ycoord(bottomLeft)+_ycoord(topLeft))÷2 )
bottomEdgeMidpoint::Tuple{Int,Int} = ( (_xcoord(bottomLeft)+ _xcoord(bottomRight))÷2, _ycoord(bottomLeft) )
topEdgeMidpoint::Tuple{Int,Int} = ( (_xcoord(topLeft)+_xcoord(topRight))÷2, _ycoord(topLeft))
rightEdgeMidpoint::Tuple{Int,Int} = ( _xcoord(bottomRight), (_ycoord(bottomRight)+_ycoord(topRight))÷2)
edgeMidpoints = (leftEdgeMidpoint, bottomEdgeMidpoint, topEdgeMidpoint, rightEdgeMidpoint)
return edgeMidpoints
end
# Determines if `x`, an integer, can be expressed as `2^n`, where `n` is also an integer.
function _isPowerOfTwo(x::IT) where {IT <: Integer}
return x & (x-1) == 0
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 616 | """
DiscreteVoronoi <: NeutralLandscapeMaker
DiscreteVoronoi(; n=3)
DiscreteVoronoi(n)
This type provides a rasterization of a Voronoi-like diagram.
Assigns a value to each patch using a 1-NN algorithm with `n` initial clusters.
It is a `NearestNeighborElement` algorithm with `k` neighbors set to 1.
The default is to use three clusters.
"""
@kwdef struct DiscreteVoronoi <: NeutralLandscapeMaker
n::Int64 = 3
function DiscreteVoronoi(n::Int64)
@assert n > 0
new(n)
end
end
_landscape!(mat, alg::DiscreteVoronoi) = _landscape!(mat, NearestNeighborElement(alg.n, 1))
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1123 | """
DistanceGradient <: NeutralLandscapeMaker
DistanceGradient(; sources=[1])
DistanceGradient(sources)
The `sources` field is a `Vector{Integer}` of *linear* indices of the matrix,
from which the distance must be calculated.
"""
@kwdef struct DistanceGradient <: NeutralLandscapeMaker
sources::Vector{Integer} = [1]
end
function _landscape!(mat, alg::DistanceGradient)
@assert maximum(alg.sources) <= length(mat)
@assert minimum(alg.sources) > 0
coordinates = CartesianIndices(mat)
source_coordinates = map(alg.sources) do c
SVector(Tuple(coordinates[c]))
end
idx = Vector{Int}(undef, 1)
dist = Vector{Float64}(undef, 1)
tree = KDTree(source_coordinates)
_write_knn!(mat, dist, idx, tree, coordinates)
return mat
end
# Function barrier, somehow we lose type stability with this above
function _write_knn!(mat, dist, idx, tree, coordinates)
sortres = false
for i in eachindex(mat)
point = SVector(Tuple(coordinates[i]))
knn_point!(tree, point, sortres, dist, idx, always_false)
mat[i] = dist[1]
end
return mat
end
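# Illustrative usage sketch (added for clarity, not part of the package source):
# `sources` are *linear* indices into the (column-major) matrix.
#=
using NeutralLandscapes
land = rand(DistanceGradient([1, 450]), (30, 30)) # distance to cells 1 and 450
=#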
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 787 | """
EdgeGradient <: NeutralLandscapeMaker
EdgeGradient(; direction=360rand())
EdgeGradient(direction)
This type is used to generate an edge gradient landscape, where values change
as a bilinear function of the *x* and *y* coordinates. The direction is
expressed as a floating point value, which will be in *[0,360]*. The inner
constructor takes the mod of the value passed and 360, so that a value that is
out of the correct interval will be corrected.
"""
@kwdef struct EdgeGradient <: NeutralLandscapeMaker
direction::Float64 = 360rand()
EdgeGradient(x::T) where {T <: Real} = new(mod(x, 360.0))
end
function _landscape!(mat, alg::EdgeGradient)
_landscape!(mat, PlanarGradient(alg.direction))
mat .= -2.0abs.(0.5 .- mat) .+ 1.0
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1435 | """
NearestNeighborCluster <: NeutralLandscapeMaker
NearestNeighborCluster(; p=0.5, n=:rook)
NearestNeighborCluster(p, [n=:rook])
Create a random cluster nearest-neighbour neutral landscape model with
values ranging 0-1. `p` sets the density of original clusters, and `n`
sets the neighborhood for clustering (see `?label` for neighborhood options)
"""
@kwdef struct NearestNeighborCluster <: NeutralLandscapeMaker
p::Float64 = 0.5
n::Symbol = :rook
function NearestNeighborCluster(p::Float64, n::Symbol = :rook)
@assert p > 0
@assert n ∈ (:rook, :queen, :diagonal)
new(p,n)
end
end
function _landscape!(mat, alg::NearestNeighborCluster)
_landscape!(mat, NoGradient())
classify!(mat, [alg.p, 1 - alg.p])
replace!(mat, 2.0 => NaN)
clusters, nClusters = label(mat, alg.n)
coordinates = CartesianIndices(clusters)
sources = findall(!isnan, vec(clusters))
cluster_coordinates = map(sources) do c
SVector(Tuple(coordinates[c]))
end
idx = Vector{Int}(undef, 1)
dist = Vector{Float64}(undef, 1)
tree = KDTree(cluster_coordinates)
randvals = rand(nClusters)
sortres = false
for i in eachindex(mat)
point = SVector(Tuple(coordinates[i]))
knn_point!(tree, point, sortres, dist, idx, always_false)
cluster = clusters[sources[idx[1]]]
mat[i] = randvals[Int(cluster)]
end
return mat
end
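# Illustrative usage sketch (added for clarity, not part of the package source):
# a lower `p` seeds sparser initial clusters before the nearest-neighbour fill.
#=
using NeutralLandscapes
land = rand(NearestNeighborCluster(0.3), (50, 50))
=#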
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1414 | """
NearestNeighborElement <: NeutralLandscapeMaker
NearestNeighborElement(; n=3, k=1)
NearestNeighborElement(n, [k=1])
Assigns a value to each patch using a k-NN algorithm with `n` initial clusters
and `k` neighbors. The default is to use three clusters and a single neighbor.
"""
@kwdef struct NearestNeighborElement <: NeutralLandscapeMaker
n::Int64 = 3
k::Int64 = 1
function NearestNeighborElement(n::Int64, k::Int64 = 1)
@assert n > 0
@assert k > 0
@assert k <= n
new(n,k)
end
end
function _landscape!(mat, alg::NearestNeighborElement)
clusters = sample(eachindex(mat), alg.n; replace=false)
# Preallocate for NearestNeighbors
idx = Vector{Int}(undef, alg.k)
dist = Vector{Float64}(undef, alg.k)
coordinates = CartesianIndices(mat)
cluster_coordinates = map(clusters) do c
SVector(Tuple(coordinates[c]))
end
tree = KDTree(cluster_coordinates)
sortres = false
if alg.k == 1
for i in eachindex(mat)
point = SVector(Tuple(coordinates[i]))
knn_point!(tree, point, sortres, dist, idx, always_false)
mat[i] = idx[1]
end
else
for i in eachindex(mat)
point = SVector(Tuple(coordinates[i]))
knn_point!(tree, point, sortres, dist, idx, always_false)
mat[i] = mean(idx)
end
end
return mat
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 226 | """
NoGradient <: NeutralLandscapeMaker
NoGradient()
This type is used to generate a random landscape with no gradients
"""
struct NoGradient <: NeutralLandscapeMaker
end
_landscape!(mat, ::NoGradient) = rand!(mat)
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 4253 | @kwdef struct Patches <: NeutralLandscapeMaker
numpatches = 10
areaproportion = 0.3
kernel = (x, λ) -> exp(-λ*x)
σ_explore = 3.
σ_return = 1.
interweave_frequency = 2
smoothing_rounds = 15
smoothing_threshold = 4
size_distribution = LogNormal()
end
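# Field notes (descriptive comments added for clarity): `numpatches` patches are grown
# until they cover roughly `areaproportion` of the landscape, with patch sizes drawn
# from `size_distribution`; `σ_explore` and `σ_return` control the random-walk kernels
# used to grow each patch, and the smoothing parameters control the majority-vote
# cleanup passes applied afterwards.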
function _balanced_acceptance_lite(numpoints, dims)
x,y = dims
coords = []
seed = rand(1:10^6, 2)
for ptct in 1:numpoints
i, j = haltonvalue(seed[1] + ptct, 2), haltonvalue(seed[2] + ptct, 3)
coord = CartesianIndex(convert.(Int32, [ceil(x * i), ceil(y * j)])...)
push!(coords, coord)
end
coords
end
function _patch_distances(centers, knearest=2)
dists = zeros(length(centers))
for i in 1:length(centers)
thesedists = zeros(length(centers))
for j in 1:length(centers)
x = centers[i] - centers[j]
thesedists[j] = sqrt(x[1]^2+x[2]^2)
end
dists[i] = sum(sort(thesedists)[1:knearest])
end
dists
end
function _landscape!(mat, p::Patches; kw...)
mat .= 0
centers = _balanced_acceptance_lite(p.numpatches, size(mat))
size_dist = rand(p.size_distribution, p.numpatches)
px_per_patch = Int32.(floor.((size_dist ./ sum(size_dist) ) .* p.areaproportion*prod(size(mat))))
#TODO heuristic to avoid overlap:
# 1. compute mean dist from all other centers
# 2. largest patch is furthest, 2nd largest is 2nd furthest, etc.
sort!(px_per_patch)
sorted_Icenter = sortperm(_patch_distances(centers))
for (i,c) in enumerate(centers[sorted_Icenter])
_buildpatch!(mat, p, c, i, px_per_patch[i])
end
for _ in 1:p.smoothing_rounds
_smoothing!(mat, p.smoothing_threshold)
end
end
isinbounds(x, dims) = x[1] > 0 && x[1] <= dims[1] && x[2] > 0 && x[2] <= dims[2]
function _smoothing!(mat, thres)
for i in CartesianIndices(mat)
offsets = filter(!isequal(CartesianIndex(0,0)),CartesianIndices((-1:1,-1:1)))
check = [isinbounds(i + o, size(mat)) ? i+o : nothing for o in offsets]
filter!(!isnothing, check)
if sum([mat[c] == mat[check[begin]] for c in check]) > thres
mat[i] = mat[check[begin]]
end
end
end
function _buildpatch!(mat, p, center, id, pixels)
current = center
MAX_ITER = 10*pixels
pct, it = 0, 0
while pct < pixels && it < MAX_ITER
if mat[current] == 0
mat[current] = id
pct += 1
end
if it % 2 == 0
current = gaussian_explore_kernel(size(mat), current, center; σ=p.σ_explore)
else
current = gaussian_return_kernel(size(mat), current, center, σ=p.σ_return)
end
it += 1
end
end
function _ensure_inbounds!(dims, current)
if current[1] < 1
current = CartesianIndex(1, current[2])
elseif current[1] > dims[1]
current = CartesianIndex(dims[1], current[2])
end
if current[2] < 1
current = CartesianIndex(current[1], 1)
elseif current[2] > dims[2]
current = CartesianIndex(current[1], dims[2])
end
current
end
function gaussian_kernel(current, targ; σ=5.0)
N = MvNormal([targ[1], targ[2]], [σ 0; 0 σ])
offsets = CartesianIndices((-1:1,-1:1))
probmat = zeros(size(offsets))
for (i,offset) in enumerate(offsets)
idx = current + offset
x = [idx[1], idx[2]]
probmat[i] = pdf(N, x)
end
probmat = probmat ./ sum(probmat)
Ioff = rand(Categorical(vec(probmat)))
current + offsets[Ioff]
end
function gaussian_return_kernel(dims, current, center; kwargs...)
dx = current[1] < center[1] ? CartesianIndex(1,0) : CartesianIndex(-1,0)
dy = current[2] < center[2] ? CartesianIndex(0,1) : CartesianIndex(0,-1)
targ = current + dx + dy
_ensure_inbounds!(dims,gaussian_kernel(current, targ; kwargs...))
end
function gaussian_explore_kernel(dims, current, center; kwargs...)
#
dx = current[1] < center[1] ? CartesianIndex(-1,0) : CartesianIndex(1,0)
dy = current[2] < center[2] ? CartesianIndex(0,-1) : CartesianIndex(0,1)
targ = current + dx + dy
_ensure_inbounds!(dims, gaussian_kernel(current, targ ; kwargs...))
end
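# Illustrative usage sketch (added for clarity, not part of the package source):
#=
using NeutralLandscapes
land = rand(Patches(numpatches=5, areaproportion=0.4), (100, 100))
=#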
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 5194 | """
PerlinNoise <: NeutralLandscapeMaker
PerlinNoise(; kw...)
PerlinNoise(periods, [octaves=1, lacunarity=2, persistance=0.5, valley=:u])
Create a Perlin noise neutral landscape model with values ranging 0-1.
# Keywords
- `periods::Tuple{Int,Int}=(1,1)`: the number of periods of Perlin noise across row and
column dimensions for the first octave.
- `octaves::Int=1`: the number of octaves that will form the Perlin noise.
- `lacunarity::Int=2` : the rate at which the frequency of periods increases for each
octave.
- `persistance::Float64=0.5` : the rate at which the amplitude of periods decreases for each
octave.
- `valley::Symbol=:u`: the kind of valley bottom that will be mimicked: `:u` produces
u-shaped valleys, `:v` produces v-shaped valleys, and `:-` produces flat bottomed
valleys.
Note: This is a memory-intensive algorithm with some settings. Be careful using larger
prime numbers for `periods` when also using a large array size, high lacunarity and/or many
octaves. Memory use scales with the lowest common multiple of `periods`.
"""
@kwdef struct PerlinNoise <: NeutralLandscapeMaker
periods::Tuple{Int,Int} = (1, 1)
octaves::Int = 1
lacunarity::Int = 2
persistance::Float64 = 0.5
valley::Symbol = :u
function PerlinNoise(periods, octaves=1, lacunarity=2, persistance=0.5, valley=:u)
new(periods, octaves, lacunarity, persistance, valley)
end
end
function _landscape!(A, alg::PerlinNoise)
# nRow must equal nCol so determine the dimension of the smallest square
dim = max(size(A)...)
# Check the dim is a multiple of each octives maximum number of periods and
# expand dim if needed
rperiodsmax = alg.periods[1] * alg.lacunarity^(alg.octaves - 1)
cperiodsmax = alg.periods[2] * alg.lacunarity^(alg.octaves - 1)
periodsmultiple = lcm(rperiodsmax, cperiodsmax) # lowest common multiple
if dim % periodsmultiple != 0
dim = ceil(Int, dim / periodsmultiple) * periodsmultiple
end
# Generate the Perlin noise
noise = zeros(eltype(A), (dim, dim))
meshbuf1 = Array{eltype(A),2}(undef, dim, dim)
meshbuf2 = Array{eltype(A),2}(undef, dim, dim)
nbufs = ntuple(_->Array{eltype(A),2}(undef, dim, dim), 4)
for octave in 0:alg.octaves-1
octave_noise!(noise, meshbuf1, meshbuf2, nbufs, alg, octave, dim, dim)
end
# Randomly extract the desired array size
noiseview = _view_from_square(noise, size(A)...)
# Rescale the Perlin noise to mimic different kinds of valley bottoms
return if alg.valley == :u
A .= noiseview
elseif alg.valley == :v
A .= abs.(noiseview)
elseif alg.valley == :-
A .= noiseview.^2
else
error("$(alg.valley) not recognised for `valley` use `:u`, `:v` or `:-`")
end
end
function octave_noise!(
noise, m1, m2, (n11, n21, n12, n22), alg::PerlinNoise, octave, nrow, ncol
)
f(t) = @fastmath 6 * t ^ 5 - 15 * t ^ 4 + 10 * t ^ 3 # quintic smoothstep (Perlin's fade curve)
# Mesh
rp, cp = alg.periods .* alg.lacunarity^(octave)
delta = (rp / nrow, cp / ncol)
ranges = range(0, rp-delta[1], length=nrow), range(0, cp-delta[2], length=ncol)
_rem_meshes!(m1, m2, ranges...)
# Gradients
# This allocates, but the gradients size changes with octave so it needs to
# be a very smart in-place `repeat!` or some kind of generator, and the improvement
# may not be that large (~20%).
angles = 2pi .* rand(rp + 1, cp + 1)
@fastmath gradients = cat(cos.(angles), sin.(angles); dims=3)
d = (nrow ÷ rp, ncol ÷ cp)
grad = repeat(gradients, inner=[d[1], d[2], 1])
g111 = @view grad[1:nrow, 1:ncol, 1]
g211 = @view grad[end-nrow+1:end, 1:ncol, 1]
g121 = @view grad[1:nrow, end-ncol+1:end, 1]
g221 = @view grad[end-nrow+1:end, end-ncol+1:end, 1]
g112 = @view grad[1:nrow, 1:ncol, 2]
g212 = @view grad[end-nrow+1:end, 1:ncol, 2]
g122 = @view grad[1:nrow, end-ncol+1:end, 2]
g222 = @view grad[end-nrow+1:end, end-ncol+1:end, 2]
# Ramps
n11 .= ((m1 .+ m2 ) .* g111 .+ (m1 .+ m2 ) .* g112)
n21 .= ((m1 .-1 .+ m2 ) .* g211 .+ (m1 .-1 .+ m2 ) .* g212)
n12 .= ((m1 .+ m2 .- 1) .* g121 .+ (m1 .+ m2 .- 1) .* g122)
n22 .= ((m1 .-1 .+ m2 .- 1) .* g221 .+ (m1 .-1 .+ m2 .- 1) .* g222)
# Interpolation
m1 .= f.(m1)
m2 .= f.(m2)
noise .+= sqrt(2) .* (alg.persistance ^ octave) .*
((1 .- m2) .* (n11 .* (1 .- m1) .+ m1 .* n21) .+
m2 .* (n12 .* (1 .- m1) .+ m1 .* n22))
return noise
end
function _rem_meshes!(m1, m2, x, y)
for (i, ival) in enumerate(x), j in 1:length(y)
@fastmath m1[i, j] = ival % 1
end
for i in 1:length(x), (j, jval) in enumerate(y)
@fastmath m2[i, j] = jval % 1
end
return
end
function _view_from_square(source, nrow, ncol)
# Extract a portion of the array to match the dimensions
dim = size(source, 1)
startrow = rand(1:(dim - nrow + 1))
startcol = rand(1:(dim - ncol + 1))
return @view source[startrow:startrow + nrow - 1, startcol:startcol + ncol - 1]
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 899 | """
PlanarGradient <: NeutralLandscapeMaker
PlanarGradient(; direction=360rand())
PlanarGradient(direction)
This type is used to generate a planar gradient landscape, where values change
as a bilinear function of the *x* and *y* coordinates. The direction is
expressed as a floating point value, which will be in *[0,360]*. The inner
constructor takes the mod of the value passed and 360, so that a value that is
out of the correct interval will be corrected.
"""
@kwdef struct PlanarGradient <: NeutralLandscapeMaker
direction::Float64 = 360rand()
PlanarGradient(x::T) where {T <: Real} = new(mod(x, 360.0))
end
function _landscape!(mat, alg::PlanarGradient)
eastness = sin(deg2rad(alg.direction))
southness = -1cos(deg2rad(alg.direction))
rows, cols = axes(mat)
mat .= collect(rows) .* southness .+ cols' .* eastness
return mat
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 947 | """
RectangularCluster <: NeutralLandscapeMaker
RectangularCluster(; minimum=2, maximum=4)
RectangularCluster(minimum, [maximum=4])
Fills the landscape with rectangles containing a random value. The size of each
rectangle/patch is between `minimum` and `maximum` (the two can be equal for a
fixed size rectangle).
"""
@kwdef struct RectangularCluster <: NeutralLandscapeMaker
minimum::Integer = 2
maximum::Integer = 4
function RectangularCluster(minimum::T, maximum::T=4) where {T <: Integer}
@assert 0 < minimum <= maximum
new(minimum, maximum)
end
end
function _landscape!(mat, alg::RectangularCluster)
mat .= -1.0
while any(i -> i === -1.0, mat)
width, height = rand(alg.minimum:alg.maximum, 2)
row = rand(1:(size(mat,1)-(width-1)))
col = rand(1:(size(mat,2)-(height-1)))
mat[row:(row+(width-1)) , col:(col+(height-1))] .= rand()
end
return mat
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 714 | """
WaveSurface <: NeutralLandscapeMaker
WaveSurface(; direction=360rand(), periods=1)
WaveSurface(direction, [periods=1])
Creates a sinusoidal landscape with a `direction` and a number of `periods`. If
neither are specified, there will be a single period of random direction.
"""
@kwdef struct WaveSurface <: NeutralLandscapeMaker
direction::Float64 = 360rand()
periods::Int64 = 1
function WaveSurface(direction::T, periods::K = 1) where {T <: Real, K <: Integer}
@assert periods >= 1
new(mod(direction, 360.0), periods)
end
end
function _landscape!(mat, alg::WaveSurface)
rand!(mat, PlanarGradient(alg.direction))
mat .= sin.(mat .* (2π * alg.periods))
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1084 |
"""
SpatiallyAutocorrelatedUpdater{SU,R,V}
A `NeutralLandscapeUpdater` that has a prescribed level of
spatial variation (`variability`) and rate of change (`rate`),
and where the spatial distribution of this change is proportional
to a neutral landscape generated with `spatialupdater` at every time
step.
TODO: make it possible to fix a given spatial updater at each timestep.
"""
@kwdef struct SpatiallyAutocorrelatedUpdater{SU,R,V} <: NeutralLandscapeUpdater
spatialupdater::SU = DiamondSquare(0.5)
rate::R = 0.1
variability::V = 0.1
end
"""
_update(sau::SpatiallyAutocorrelatedUpdater, mat)
Updates `mat` using spatially autocorrelated change, using the rate, variability,
and spatial updater parameters from `sau`.
TODO: doesn't necessarily have to be a ZScoreTransform, could be arbitrary
argument
"""
function _update(sau::SpatiallyAutocorrelatedUpdater, mat)
change = rand(spatialupdater(sau), size(mat))
delta = rate(sau) .+ variability(sau) .* transform(fit(ZScoreTransform, change), change)
mat .+ delta
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1227 |
"""
SpatiotemporallyAutocorrelatedUpdater{SU,R,V}
A `NeutralLandscapeUpdater` that has a prescribed level of
spatial and temporal variation (`variability`) and rate of change (`rate`),
and where the spatial distribution of this change is proportional
to a neutral landscape generated with `spatialupdater` at every time
step.
TODO: perhaps spatial and temporal should each have their own variability param
"""
@kwdef struct SpatiotemporallyAutocorrelatedUpdater{SU,R,V} <: NeutralLandscapeUpdater
spatialupdater::SU = DiamondSquare(0.1)
rate::R = 0.1
variability::V = 0.1
end
"""
_update(stau::SpatiotemporallyAutocorrelatedUpdater, mat)
Updates `mat` using spatiotemporally autocorrelated change, using the rate, variability,
and spatial updater parameters from `stau`.
TODO: doesn't necessarily have to be a Normal distribution or ZScoreTransform,
could be arbitrary argument
"""
function _update(stau::SpatiotemporallyAutocorrelatedUpdater, mat)
change = rand(spatialupdater(stau), size(mat))
temporalshift = rand(Normal(0, variability(stau)), size(mat))
z = transform(fit(ZScoreTransform, change), change)
delta = rate(stau) .+ variability(stau) * z .+ temporalshift
mat .+ delta
end | NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 887 | """
TemporallyVariableUpdater{D,S} <: NeutralLandscapeUpdater
A `NeutralLandscapeUpdater` that has a prescribed level of temporal variation
(`variability`) and rate of change (`rate`), but no spatial correlation in where
change is distributed.
"""
@kwdef struct TemporallyVariableUpdater{D,R,V} <: NeutralLandscapeUpdater
spatialupdater::D = missing
rate::R = 0.1
variability::V = 0.1
end
"""
_update(tvu::TemporallyVariableUpdater, mat)
Updates `mat` using temporally variable change, using the rate and
variability parameters from `tvu`.
TODO: this doesn't have to be a Normal distribution, could be arbitrary
distribution that is continuous and can have mean 0 (or that can be transformed
to have mean 0)
"""
function _update(tvu::TemporallyVariableUpdater, mat)
U = rand(Normal(0, variability(tvu)), size(mat))
Δ = rate(tvu) .+ U
mat .+ Δ
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 2599 | """
NeutralLandscapeUpdater
NeutralLandscapeUpdater is an abstract type for methods for updating a landscape
matrix
"""
abstract type NeutralLandscapeUpdater end
"""
rate(up::NeutralLandscapeUpdater)
All `NeutralLandscapeUpdater`s have a field `rate` which defines the expected
(or mean) change across all cells per timestep.
"""
rate(up::NeutralLandscapeUpdater) = up.rate
"""
spatialupdater(up::NeutralLandscapeUpdater)
All `NeutralLandscapeUpdater`s have a `spatialupdater` field which is either a
`NeutralLandscapeMaker`, or `Missing` (in the case of temporally correlated
updaters).
"""
spatialupdater(up::NeutralLandscapeUpdater) = up.spatialupdater
"""
variability(up::NeutralLandscapeUpdater)
Returns the `variability` of a given `NeutralLandscapeUpdater`. The variability
of an updater is how much temporal variation there will be in a generated
time-series of landscapes.
"""
variability(up::NeutralLandscapeUpdater) = up.variability
"""
normalize(mats::Vector{M})
Normalizes a vector of neutral landscapes `mats` such that all values are between 0
and 1. Note that this does not preserve the `rate` parameter for a given
`NeutralLandscapeUpdater`, and instead rescales it proportional to the
difference between the total maximum and total minimum across all `mats`.
"""
function normalize(mats::Vector{M}) where {M<:AbstractMatrix}
mins, maxs = [NaNMath.min(x...) for x in mats], [NaNMath.max(x...) for x in mats]
totalmin, totalmax = NaNMath.min(mins...), NaNMath.max(maxs...)
returnmats = copy(mats)
for (i,mat) in enumerate(mats)
returnmats[i] = (mat .- totalmin) ./ (totalmax - totalmin)
end
return returnmats
end
"""
update(updater::T, mat)
Returns the result of one timestep applied to `mat` based on the `NeutralLandscapeUpdater`
provided (`updater`).
"""
function update(updater::T, mat) where {T<:NeutralLandscapeUpdater}
_update(updater, mat)
end
"""
update(updater::T, mat, n::I)
Returns a sequence of length `n` where the original neutral landscape `mat` is
updated by the `NeutralLandscapeUpdater` `updater` for `n` timesteps.
"""
function update(updater::T, mat, n::I) where {T<:NeutralLandscapeUpdater, I<:Integer}
sequence = [zeros(size(mat)) for _ in 1:n]
sequence[begin] .= mat
for i in 2:n
sequence[i] = _update(updater, sequence[i-1])
end
sequence
end
"""
update!(updater::T, mat)
Updates a landscape `mat` in-place by directly mutating `mat`.
"""
function update!(updater::T, mat) where {T<:NeutralLandscapeUpdater}
mat .= _update(updater, mat)
end
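# Illustrative usage sketch (added for clarity, not part of the package source):
# evolve a landscape for ten timesteps and rescale the whole sequence to [0, 1].
#=
using NeutralLandscapes
env = rand(MidpointDisplacement(0.5), (50, 50))
up = TemporallyVariableUpdater(rate=0.05, variability=0.01)
seq = normalize(update(up, env, 10))
=#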
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 324 | using NeutralLandscapes
using Test
using SpeciesDistributionToolkit
bbox = (left=-83.0, bottom=46.4, right=-55.2, top=63.7)
temp = SimpleSDMPredictor(RasterData(WorldClim2, AverageTemperature); bbox...)
mpd = rand(MidpointDisplacement(), size(temp), mask=temp)
@test findall(isnan, mpd) == findall(isnothing, temp.grid)
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 430 | using NeutralLandscapes
using Test
pl33 = rand(PlanarGradient(0.0), (3,3))
@test pl33 == [1 1 1; 1/2 1/2 1/2; 0 0 0]
testmask = [true false true; false true false; true true true]
pl33m = rand(PlanarGradient(0.0), (3,3); mask=testmask)
@test all(pl33m .=== [1 NaN 1; NaN 1/2 NaN; 0 0 0])
@test rand(PlanarGradient(0.0), (5, 5)) == rand(EdgeGradient(0.0), (5, 5))
mat = rand(Float64, (10, 10))
rand!(mat, PlanarGradient(90.0))
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1289 | using NeutralLandscapes, Test
@testset "_rescale" begin
@test NeutralLandscapes._rescale!([3.0 -2.0; 0.0 1.0]) == [1.0 0.0; 0.4 0.6]
end
algorithms = (
DiamondSquare(),
DiamondSquare(; H=0.2),
DiamondSquare(0.4),
DiscreteVoronoi(),
DiscreteVoronoi(; n=2),
DiscreteVoronoi(3),
DistanceGradient(),
DistanceGradient(; sources=[2]),
DistanceGradient([1, 2]),
EdgeGradient(),
EdgeGradient(; direction=120.0),
EdgeGradient(90.0),
MidpointDisplacement(),
MidpointDisplacement(;),
NearestNeighborElement(),
NearestNeighborElement(; k=1),
NearestNeighborElement(2),
NearestNeighborCluster(),
# NearestNeighborCluster(; n=:queen),
NearestNeighborCluster(; n=:diagonal),
NearestNeighborCluster(0.2, :diagonal),
NoGradient(),
Patches(),
PerlinNoise(),
PerlinNoise(; octaves=3, lacunarity=3, persistance=0.3),
PerlinNoise((1, 2), 3, 2, 0.2),
PlanarGradient(),
PlanarGradient(; direction=120.0),
PlanarGradient(90),
RectangularCluster(),
RectangularCluster(; maximum=5),
RectangularCluster(2),
WaveSurface(),
WaveSurface(90.0),
WaveSurface(; periods=2),
)
sizes = (50, 200), (100, 100), (301, 87)
for alg in algorithms, sze in sizes
A = rand(alg, sze)
@test size(A) == sze
end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 283 | using Test, SafeTestsets
@time @safetestset "updaters" begin include("updaters.jl") end
@time @safetestset "planar gradient" begin include("planar.jl") end
@time @safetestset "integrations" begin include("integrations.jl") end
@time @safetestset "rand" begin include("rand.jl") end
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 1515 | using NeutralLandscapes
using Test
# test it's running
@test TemporallyVariableUpdater() != π
@test SpatiallyAutocorrelatedUpdater() != π
@test SpatiotemporallyAutocorrelatedUpdater() != π
function testupdaters(model)
updater = model()
# Test defaults
@test rate(updater) == 0.1
@test variability(updater) == 0.1
# Test kwargs
updater = model(rate = 1.0, variability=0.05, spatialupdater=MidpointDisplacement(0.5))
@test rate(updater) == 1.0
@test variability(updater) == 0.05
@test typeof(updater.spatialupdater) <: NeutralLandscapeMaker
@test updater.spatialupdater == MidpointDisplacement(0.5)
# Test updating
env = rand(MidpointDisplacement(0.5), 50, 50)
newenv = update(updater, env)
@test env != newenv
oldenv = deepcopy(env)
update!(updater, env)
@test env != oldenv
end
function testnormalize(model)
updater = model()
env = rand(MidpointDisplacement(0.5), 50, 50)
seq = update(updater, env, 30)
normseq = normalize(seq)
@test length(findall(isnan, normseq[end])) == 0
for m in normseq
@test min(m...) >= 0 && max(m...) <= 1
end
env = [NaN 5 2 1 NaN; 3 4 5 2 1; 6 NaN 0 5 2; NaN NaN 0 4 5]
seq = update(updater, env, 30)
normseq = normalize(seq)
@test length(findall(isnan, normseq[end])) == 5
end
models = [
TemporallyVariableUpdater,
SpatiallyAutocorrelatedUpdater,
SpatiotemporallyAutocorrelatedUpdater
]
testnormalize.(models)
testupdaters.(models)
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | code | 93 | using NeutralLandscapes
using Test
# simply test it's running
@test DiscreteVoronoi(42) != π | NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 4107 | # Neutral Landscapes
This packages allows the generation of neutral landscapes in *Julia*. It is a port of the [`NLMPy` package](https://github.com/tretherington/nlmpy), which is described in greater detail at:
Etherington, T.R., Holland, E.P. & O’Sullivan, D. (2015) NLMpy: a python software package for the creation of neutral landscape models within a general numerical framework. _Methods in Ecology and Evolution_, __6__, 164–168.








[](https://ecojulia.github.io/NeutralLandscapes.jl/stable/)
[](https://ecojulia.github.io/NeutralLandscapes.jl/dev/)
All landscapes are generated using an overload of the `rand` (or `rand!`) method, taking as arguments a `NeutralLandscapeMaker`, as well as a dimension and a Boolean mask if required. The additional functions `classify` and `blend` are used, respectively, to discretize the landscape and to merge the results of different neutral generators.
The code below reproduces figure 1 of Etherington et al. (2015):
```julia
using NeutralLandscapes, Plots
siz = 50, 50
# Random NLM
Fig1a = rand(NoGradient(), siz)
# Planar gradient NLM
Fig1b = rand(PlanarGradient(), siz)
# Edge gradient NLM
Fig1c = rand(EdgeGradient(), siz)
# Mask example
Fig1d = falses(siz)
Fig1d[10:25, 10:25] .= true
# Distance gradient NLM
Fig1e = rand(DistanceGradient(findall(vec(Fig1d))), siz)
# Midpoint displacement NLM
Fig1f = rand(MidpointDisplacement(0.75), siz)
# Random rectangular cluster NLM
Fig1g = rand(RectangularCluster(4, 8), siz)
# Random element nearest-neighbor NLM
Fig1h = rand(NearestNeighborElement(200), siz)
# Random cluster nearest-neighbor NLM
Fig1i = rand(NearestNeighborCluster(0.4), siz)
# Blended NLM
Fig1j = blend([Fig1f, Fig1c])
# Patch blended NLM
Fig1k = blend(Fig1h, Fig1e, 1.5)
# Classifiend random cluster nearest-neighbor NLM
Fig1l = classify(Fig1i, ones(4))
# Percolation NLM
Fig1m = classify(Fig1a, [1-0.5, 0.5])
# Binary random rectangular cluster NLM
Fig1n = classify(Fig1g, [1-0.75, 0.75])
# Classified midpoint displacement NLM
Fig1o = classify(Fig1f, ones(3))
# Classified midpoint displacement NLM, with limited classification
Fig1p = classify(Fig1f, ones(3), Fig1d)
# Masked planar gradient NLM
Fig1q = rand(PlanarGradient(90), siz, mask = Fig1n .== 2) #TODO mask as keyword + should mask be matrix or vec or both? (Fig1e)
# Hierarchical NLM
Fig1r = ifelse.(Fig1o .== 2, Fig1m .+ 2, Fig1o)
# Rotated NLM
Fig1s = rotr90(Fig1l)
# Transposed NLM
Fig1t = Fig1o'
class = cgrad(:Set3_4, 4, categorical = true)
c2, c3, c4 = class[1:2], class[1:3], class[1:4]
gr(color = :fire, ticks = false, framestyle = :box, colorbar = false)
plot(
heatmap(Fig1a), heatmap(Fig1b), heatmap(Fig1c), heatmap(Fig1d, c = c2), heatmap(Fig1e),
heatmap(Fig1f), heatmap(Fig1g), heatmap(Fig1h), heatmap(Fig1i), heatmap(Fig1j),
heatmap(Fig1k), heatmap(Fig1l, c = c4), heatmap(Fig1m, c = c2), heatmap(Fig1n, c = c2), heatmap(Fig1o, c = c3),
heatmap(Fig1p, c = c4), heatmap(Fig1q), heatmap(Fig1r, c = c4), heatmap(Fig1s, c = c4), heatmap(Fig1t, c = c3),
layout = (4,5), size = (1600, 1270)
)
```

| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 1304 | **What the pull request does**
Explain in a few words what the pull request does.
**Type of change**
Please indicate the relevant option(s)
- [ ] :bug: Bug fix (non-breaking change which fixes an issue)
- [ ] :sparkle: New feature (non-breaking change which adds functionality)
- [ ] :boom: Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] :book: This change requires a documentation update
**Checklist**
- [ ] The changes are documented
- [ ] The docstrings of the different functions describe the arguments, purpose, and behavior
- [ ] There are examples in the documentation website
- [ ] The changes are tested
- [ ] The changes **do not** modify the behavior of the previously existing functions
- If they **do**, please explain why and how in the introduction paragraph
- [ ] For **new contributors** - my name and information are added to `.zenodo.json`
- [ ] The `Project.toml` field `version` has been updated
- Change the *last* number for a `v0.0.x` series release, a bug fix, or a performance improvement
- Change the *middle* number for additional features that *do not* break the current test suite (unless you fix a bug in the test suite)
- Change the *first* number for changes that break the current test suite
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 734 | ---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug, need-triage
assignees: 'tpoisot'
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
A [minimal reproducible example](https://stackoverflow.com/help/minimal-reproducible-example) that is enough to show what the problem is
~~~ julia
using NeutralLandscapes
# Add your code here
~~~
**Stacktrace**
~~~ julia
# Please paste your stacktrace here
~~~
**Expected behavior**
A clear and concise description of what you expected to happen.
**Environment:**
- OS:
- Julia version:
- Other packages used in the example and their versions:
**Additional context**
Add any other context about the problem here.
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 753 | ---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement, need-triage
assignees: 'tpoisot'
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
Ideally, this can take the form of code you would like to write:
~~~ julia
using NeutralLandscapes
# Write your dream code here
~~~
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 186 | ---
name: To-do
about: Short notes on development tasks
title: ''
labels: need-triage, to do
assignees: 'tpoisot'
---
**What to do?**
...
**Why?**
...
**Any ideas how?**
...
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 1310 | ```@example gallery
using NeutralLandscapes
using Plots
function demolandscape(alg::T) where {T <: NeutralLandscapeMaker}
heatmap(rand(alg, (200, 200)), frame=:none, aspectratio=1, c=:davos)
end
```
## No gradient
```@example gallery
demolandscape(NoGradient())
```
## Planar gradient
```@example gallery
demolandscape(PlanarGradient(35))
```
## Edge gradient
```@example gallery
demolandscape(EdgeGradient(186))
```
## Wave surface
```@example gallery
demolandscape(WaveSurface(35, 3))
```
## Rectangular cluster
```@example gallery
demolandscape(RectangularCluster())
```
## Distance gradient
```@example gallery
sources = unique(rand(1:40000, 50))
demolandscape(DistanceGradient(sources))
```
## Nearest-neighbor element
```@example gallery
heatmap(rand(NearestNeighborElement(20, 1), (45, 45)))
```
## Voronoi
```@example gallery
demolandscape(DiscreteVoronoi(40))
```
## Perlin Noise
```@example gallery
demolandscape(PerlinNoise())
```
## Classify landscape
```@example gallery
sources = unique(rand(1:40000, 50))
heatmap(NeutralLandscapes.classify!(rand(DistanceGradient(sources), (200, 200)), [0.5, 1, 1, 0.5]))
```
## Diamond Square
```@example gallery
demolandscape(DiamondSquare())
```
## Midpoint Displacement
```@example gallery
demolandscape(MidpointDisplacement())
```
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 0.1.4 | 323b6ca5bb37cd13a7cb7d881f13f86e643d2f72 | docs | 820 | # NeutralLandscapes.jl
A pure Julia port of https://github.com/tretherington/nlmpy
## Landscape models
```@docs
NeutralLandscapeMaker
DiamondSquare
DiscreteVoronoi
DistanceGradient
EdgeGradient
MidpointDisplacement
NearestNeighborCluster
NearestNeighborElement
NoGradient
PerlinNoise
PlanarGradient
RectangularCluster
WaveSurface
```
## Landscape generating function
```@docs
rand
rand!
```
## Temporal Change
```@docs
NeutralLandscapeUpdater
TemporallyVariableUpdater
SpatiallyAutocorrelatedUpdater
SpatiotemporallyAutocorrelatedUpdater
update
update!
normalize
NeutralLandscapes.rate
NeutralLandscapes.variability
NeutralLandscapes.spatialupdater
NeutralLandscapes._update
```
## Other functions
```@docs
classify
classify!
blend
label
NeutralLandscapes.mask!
```
| NeutralLandscapes | https://github.com/EcoJulia/NeutralLandscapes.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 500 | using Documenter, HeartRateVariability
makedocs(sitename = "HeartRateVariability.jl",
authors = "Jasmin Walter",
pages = [
"Introduction" => "introduction.md",
"Installation" => "installation.md",
"Quick Start" => "quickstart.md",
"API" => "index.md",
"License" => "LICENSE.md"
],
)
deploydocs(
repo = "github.com/LiScI-Lab/HeartRateVariability.jl.git",
)
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 1119 | module Frequency
import LombScargle
import Trapz
#=
This function calculates a Lomb-Scargle transformation
:param n: is the array that contains the NN-intervals
:return: the result of the Lomb-Scargle transformation
=#
function lomb_scargle(n)
t=cumsum(n).-n[1]
t=t./1000
plan=LombScargle.plan(t,n,normalization=:psd,minimum_frequency=0.003,maximum_frequency=0.4)
return LombScargle.lombscargle(plan)
end # lomb_scargle
#=
This function calculates the power of a frequency band between two given frequencies
:param freq: The frequencies of a Lomb-Scargle transformation
:param power: The power of a Lomb-Scargle transformation
:param min: The minimum value of the frequency band to be calculated
:param max: The maximum value of the frequency band to be calculated
:return p: The power of the frequency band
=#
function get_power(freq,power,min,max)
count=1
index=[]
for f in freq
if f>=min && f<max
push!(index,count)
end
count+=1
end
p=Trapz.trapz(freq[index[1]:index[end]],power[index[1]:index[end]])
return p
end # get_power
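# Illustrative usage sketch (added for clarity, not part of the package source):
# computing the VLF band power from a Lomb-Scargle periodogram of NN intervals in ms.
#=
ls = lomb_scargle(n) # n::Vector{Float64} of NN intervals
vlf = get_power(ls.freq, ls.power, 0.003, 0.04)
=#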
end # module
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 1107 | module Geometric
import Plots
import Images
#=
This function creates a Poincaré plot
:param n: is the array that contains the NN-intervals
:return: a plot object
=#
function poincare(n)
x=[]
y=[]
for i in 1:length(n)-1
push!(x,n[i])
push!(y,n[i+1])
end
p=Plots.scatter(x,y,xlabel="RRn",ylabel="RRn+1",legend=false);
return p;
end # poincare
#=
This function creates a recurrence plot
:param n: is the array that contains the NN-intervals
:param e: the maximum distance between two intervals, default="mean" == the mean value of successive differences
:return: a plot object
=#
function recurrence(n,e)
if e=="mean"
diff=[]
for i in 1:length(n)-1
push!(diff,abs(n[i+1]-n[i]))
end
e=sum(diff)/length(diff)
end
x=zeros(length(n),length(n))
for i in 1:length(n)
for j in i:length(n)
if sqrt((n[i]-n[j])^2)<=e
x[i,j]=1
x[j,i]=1
end
end
end
img=Images.Gray.(x)
r=Plots.plot(img);
return r;
end # recurrence
end # module
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 3899 | module HeartRateVariability
include("TimeDomain.jl")
include("Input.jl")
include("Frequency.jl")
include("Nonlinear.jl")
include("Geometric.jl")
"""
geometric(n,e="mean")
Arguments:
- n: the array that contains the NN-intervals
- e: the maximum distance between two intervals, default="mean" (the mean value of the successive differences), has to be "mean" or a number
Results:
- poincare: the Poincaré plot
- recurrence: the recurrence plot
"""
function geometric(n::Array{Float64,1},e="mean")
if (e!="mean" && !isa(e,Number))
error("e has to be a numerical value or 'mean'")
end
return (poincare=Geometric.poincare(n),recurrence=Geometric.recurrence(n,e))
end # geometric
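# Illustrative usage sketch (added for clarity, not part of the package source;
# assumes `n` holds NN intervals in ms):
#=
g = geometric(n)
g.poincare # Poincaré plot object
g.recurrence # recurrence plot object
=#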
"""
nonlinear(n,m=2,r=6)
Arguments:
- n: the array that contains the NN-intervals
- m: the embedding dimension, default=2
- r: the tolerance, default=6
Results:
- apen: the approximate entropy
- sampen: the sample entropy
- hurst: the Hurst exponent (only valid if the length of n is >= 100)
- renyi0, renyi1, renyi2: the Rényi entropy of order 0, 1, and 2
"""
function nonlinear(n::Array{Float64,1},m::Int64=2,r::Number=6)
if length(n)<100
@warn("To obtain a valid value for the hurst coefficient, the length of the data series must be greater than or equal to 100.")
end
return (apen=Nonlinear.apen(n,m,r), sampen=Nonlinear.sampen(n,m,r),
hurst=Nonlinear.hurst(n), renyi0=Nonlinear.renyi(n,0),
renyi1=Nonlinear.renyi(n,1), renyi2=Nonlinear.renyi(n,2))
end # nonlinear
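# Illustrative usage sketch (added for clarity, not part of the package source;
# assumes `n` holds at least 100 NN intervals in ms):
#=
nl = nonlinear(n)
nl.sampen # sample entropy with the default m=2, r=6
=#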
"""
frequency(n)
Arguments:
- n: the array that contains the NN-intervals
Results:
- vlf: the very low-frequency power
- lf: the low-frequency power
- hf: the high-frequency power
- lfhf_ratio: the lf/hf ratio
- tp: the total power
"""
function frequency(n::Array{Float64,1})
ls=Frequency.lomb_scargle(n)
vlf=Frequency.get_power(ls.freq,ls.power,0.003,0.04)
lf=Frequency.get_power(ls.freq,ls.power,0.04,0.15)
hf=Frequency.get_power(ls.freq,ls.power,0.15,0.4)
tp=vlf+lf+hf
return (vlf=vlf, lf=lf, hf=hf, lfhf_ratio=lf/hf, tp=tp)
end # frequency
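# Illustrative usage sketch (added for clarity, not part of the package source;
# assumes `n` holds NN intervals in ms):
#=
f = frequency(n)
f.lfhf_ratio # ratio of low- to high-frequency power
=#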
"""
time_domain(n)
Arguments:
- n: the array that contains the NN-intervals
Results:
- mean: the mean value
- sdnn: the standard deviation
- rmssd: the root mean square of successive differences
- sdsd: the standard deviation of successive differences
- nn50: the number of pairs of successive NN intervals that differ by more than 50 ms
- pnn50: the percentage of pairs of successive NN intervals that differ by more than 50 ms
- nn20: the number of pairs of successive NN intervals that differ by more than 20 ms
- pnn20: the percentage of pairs of successive NN intervals that differ by more than 20 ms
- rRR: the relative RR measure (median distance of successive relative RR intervals, in percent)
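Example (illustrative; the file name is an assumption):
```julia
n = infile("e1304.txt")
td = time_domain(n)
td.rmssd  # in the same units as the input intervals (ms)
```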
"""
function time_domain(n::Array{Float64,1})
diff=TimeDomain.nn_diff(n)
return (mean=TimeDomain.mean_nn(n),sdnn=TimeDomain.sdnn(n),
rmssd=TimeDomain.rmssd(diff), sdsd=TimeDomain.sdsd(diff),
nn50=TimeDomain.nn(diff,50), pnn50=TimeDomain.pnn(diff,50),
nn20=TimeDomain.nn(diff,20), pnn20=TimeDomain.pnn(diff,20),
rRR=TimeDomain.rRR(n))
end # time_domain
"""
infile(file)
This function reads the data from a txt or csv file.
Arguments:
- file: is the path of the input file
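Example (illustrative path):
```julia
n = infile("e1304.txt")  # returns an Array{Float64,1} of intervals
```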
"""
function infile(file::String)
return Input.read_txt(file)
end # infile
"""
infile(record,annotator)
This function reads the data from a wfdb file.
Arguments:
- record: is the name of the record
- annotator: is the annotator of the record
!!! note
    In order to use the infile function for wfdb files, the WFDB Software Package from
    PhysioNet is required. See [Installation](installation) for more information.
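Example (illustrative; assumes the record and annotation files are available locally):
```julia
n = infile("e1304", "atr")  # intervals are converted to milliseconds
```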
"""
function infile(record::String,annotator::String)
return Input.read_wfdb(record,annotator)
end # infile
export nonlinear, frequency, time_domain, infile, geometric
end # module
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 721 | module Input
#=
This function reads the data from a txt or csv file.
:param infile: the path of the file
:return: an array with the read data
=#
function read_txt(infile::String)
a=read(open(infile,"r"),String)
return parse.(Float64,filter!(e->e!="",split(a,r"[^\d.]")))
end # read_txt
#=
This function reads the data from a wfdb file.
:param record: the record name
:param annotator: the annotator of the record
:return: an array with the read data
=#
function read_wfdb(record::String,annotator::String)
temp=string(record,"_temp.txt")
run(pipeline(`ann2rr -r "$record" -a "$annotator" -i s -c`,stdout="$temp"))
a=read_txt("$temp")
rm(temp)
return a*1000
end # read_wfdb
end # module
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 4345 | module Nonlinear
import StatsBase
import Statistics
#=
This function calculates the approximate entropy
:param n: the array that contains the NN intervals
:param m: the embedding dimension, default=2
:param r: the tolerance, default=6
:return: the approximate entropy
=#
function apen(n,m,r)
c1=get_apen_dist(n,m,r)
c2=get_apen_dist(n,m+1,r)
return log(c1/c2)
end # apen
#=
This function calculates the sample entropy
:param n: the array that contains the NN intervals
:param m: the embedding dimension, default=2
:param r: the tolerance, default=6
:return: the sample entropy
=#
function sampen(n,m,r)
c1=get_sampen_dist(n,m,r,1)
c2=get_sampen_dist(n,m+1,r,0)
return -log(c2/c1)
end # sampen
#=
This function creates a template of a given array over an embedding dimension
:param n: the array that contains the NN intervals
:param m: the embedding dimension, default=2
:return template: the created template
=#
function get_template(n,m)
template=[]
for i in 1:length(n)-m+1
push!(template,n[i:i+m-1])
end
return template
end # get_template
#=
This function calculates the distances for the approximate entropy
:param n: the array that contains the NN intervals
:param m: the embedding dimension, default=2
:param r: the tolerance, default=6
:return: the distance for the approximate entropy
=#
function get_apen_dist(n,m,r)
template=get_template(n,m)
count=zeros(length(template))
for i in 1:length(template)
for j in i+1:length(template)
if maximum(abs.(template[i].-template[j]))<=r
count[i]+=1
count[j]+=1
end
end
end
return sum(count./(length(n)-m+1))/(length(n)-m+1)
end # get_apen_dist
#=
This function calculates the distances for the sample entropy
:param n: the array that contains the NN intervals
:param m: the embedding dimension, default=2
:param r: the tolerance, default=6
:param l: a value to limit the for-loops
:return: the distance for the sample entropy
=#
function get_sampen_dist(n,m,r,l)
template=get_template(n,m)
counts=[]
count=0
for i in 1:length(template)-l
for j in 1:length(template)-l
if maximum(abs.(template[i].-template[j]))>=r || i==j
push!(counts,count)
count=0
else
count+=1
end
end
end
return sum(counts)
end # get_sampen_dist
#=
This function calculates the Rényi entropy of a given order
:param n: the array that contains the NN intervals
:param a: the order of the Rényi entropy
:return: the calculated Rényi entropy
=#
function renyi(n,a)
return StatsBase.renyientropy(n,a)
end # renyi
#=
This function calculates the Hurst exponent
It was inspired by the python hurst package by Dmitry A. Mottl (https://github.com/Mottl/hurst)
:param n: the array that contains the NN intervals
:return H: the Hurst exponent
=#
function hurst(n)
ws=Array(range(log10(10),stop=log10(length(n)),step=0.25))
window=[]
for x in ws
push!(window,round(Int64,exp10(x),RoundDown))
end
if !(length(n) in window)
push!(window,length(n))
push!(ws,log10(length(n)))
end
RS=[]
for w in window
rs=[]
for start in (range(0,stop=length(n),step=w))
if (start+w)>length(n)
break
end
RS_part= get_rs(n[start+1:start+w])
if RS_part != 0
push!(rs,RS_part)
end
end
if length(rs)>0
push!(RS,Statistics.mean(rs))
end
end
A=Array{Float64}([ws ones(length(RS))])
RSlog=[]
for r in RS
push!(RSlog,log10(r))
end
B=Array{Float64}(RSlog)
H,c=A\B
return H
end # hurst
#=
This function calculates the rescaled range of a time series
It was inspired by the python hurst package by Dmitry A. Mottl (https://github.com/Mottl/hurst)
:param n: the array that contains the NN intervals
:return: the rescaled range
=#
function get_rs(n)
incs=n[2:end].-n[1:end-1]
mean_inc=(n[end]-n[1])/length(incs)
deviations=incs.-mean_inc
Z=cumsum(deviations)
R=maximum(Z)-minimum(Z)
S=Statistics.std(incs)
if R==0 || S==0
return 0
else
return R/S
end
end # get_rs
end # module
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 2537 | module TimeDomain
import Statistics
#=
This function calculates the differences between the NN intervals
:param n: is the array that contains the NN intervals
:return diff: an array containing the absolute successive differences
=#
function nn_diff(n)
diff=[]
for i in 1:length(n)-1
push!(diff,abs(n[i+1]-n[i]))
end
return diff
end #nn_diff
#=
This function calculates the standard deviation of the NN intervals
:param n: is the array that contains the NN intervals
:return: the standard deviation
=#
function sdnn(n)
return Statistics.std(n)
end # sdnn
#=
This function calculates the root mean square of successive differences
:param diff: is the array containing the differences between the NN intervals
:return: the rmssd
=#
function rmssd(diff)
return sqrt(Statistics.mean(diff.^2))
end # rmssd
#=
This function calculates the standard deviation of successive differences
:param diff: is the array containing the differences between the NN intervals
:return: the sdsd
=#
function sdsd(diff)
return Statistics.std(diff)
end # sdsd
#=
This function calculates the percentage of pairs of successive NN intervals
that differ by more than x ms
:param diff: is the array containing the differences between the NN intervals
:param x: is the number of milliseconds the intervals may differ
:return: the percentage of successive intervals with a difference > x ms
=#
function pnn(diff,x)
return nn(diff,x)/(length(diff)+1)*100
end # pnn
#=
This function calculates the number of pairs of successive NN intervals
that differ by more than x ms
:param diff: is the array containing the differences between the NN intervals
:param x: is the number of milliseconds the intervals may differ
:return: the number of successive intervals with a difference > x ms
=#
function nn(diff,x)
count=0
for d in diff
if d>x
count+=1
end
end
return count
end # nn
#=
This function calculates the mean of the NN intervals
:param n: is the array that contains the NN intervals
:return: the mean value
=#
function mean_nn(n)
return Statistics.mean(n)
end # mean_nn
#=
This function calculates the relative RR measure
:param n: is the array that contains the NN intervals
:return: the median distance of successive relative RR intervals, in percent
=#
function rRR(n)
rr=[]
for i in 2:length(n)
r=(2*(n[i]-n[i-1])/(n[i]+n[i-1]))
push!(rr,r)
end
m=sum(rr)/length(rr)
d=[]
for i in 1:length(rr)-1
push!(d,sqrt((m-rr[i])^2+(m-rr[i+1])^2))
end
return Statistics.median(d)*100
end # rRR
end # module
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | code | 1981 | using HeartRateVariability
using Test
n=HeartRateVariability.infile("e1304.txt")
td=HeartRateVariability.time_domain(n)
fd=HeartRateVariability.frequency(n)
nl=HeartRateVariability.nonlinear(n)
g=HeartRateVariability.geometric(n)
@testset "HeartRateVariability.jl" begin
@testset "HeartRateVariability.infile" begin
@test HeartRateVariability.infile("e1304","atr")==n
end
@testset "HeartRateVariability.time_domain" begin
@test td.mean≈917.24 atol=0.1
@test td.sdnn≈137.19 atol=0.1
@test td.rmssd≈27.85 atol=0.1
@test td.sdsd≈20.35 atol=0.1
@test td.nn50≈342 atol=1
@test td.pnn50≈4.41 atol=0.1
@test td.nn20≈2831 atol=1
@test td.pnn20≈36.53 atol=0.1
@test td.rRR≈2.67 atol=0.1
end
@testset "HeartRateVariability.frequency" begin
@test fd.vlf≈1317.96 atol=0.01*fd.vlf
@test fd.lf≈90.36 atol=0.01*fd.lf
@test fd.hf≈176.05 atol=0.01*fd.hf
@test fd.lfhf_ratio≈0.51 atol=0.01*fd.lfhf_ratio
@test fd.tp≈1584.35 atol=0.01*fd.tp
end
@testset "HeartRateVariability.nonlinear" begin
@test nl.apen≈2.16 atol=0.1
@test nl.sampen≈2.16 atol=0.1
@test nl.hurst≈0.37 atol=0.1
@test nl.renyi0≈-6.82 atol=0.1
@test nl.renyi1≈-6.83 atol=0.1
@test nl.renyi2≈-6.84 atol=0.1
# testing if get_rs from module Nonlinear returns 0 when S or R is 0
@test HeartRateVariability.Nonlinear.get_rs(ones(100))==0
# testing if the warning is thrown
@test_logs (:warn,"To obtain a valid value for the hurst coefficient, the length of the data series must be greater than or equal to 100.") HeartRateVariability.nonlinear([1.0,2.0,1.0,2.0,1.0,2.0,1.0,2.0,1.0,2.0])
end
@testset "HeartRateVariability.geometric" begin
@test g.poincare!=nothing
@test g.recurrence!=nothing
@test_throws ErrorException HeartRateVariability.geometric(n,"error")
end
end
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | docs | 790 | # HeartRateVariability
## Documentation
[](https://LiScI-Lab.github.io/HeartRateVariability.jl/dev/introduction)
## Build status
[](https://travis-ci.org/LiScI-Lab/HeartRateVariability.jl)
[](http://codecov.io/github/LiScI-Lab/HeartRateVariability.jl?branch=master)
This package implements the most common methods for heart rate variability analysis. For more information about the analysis methods used, as well as installation and usage, please read the [documentation](https://LiScI-Lab.github.io/HeartRateVariability.jl/dev/introduction).
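## Quick start
A minimal usage sketch (the file name is illustrative; it should contain RR intervals in milliseconds):
```julia
using HeartRateVariability

n  = HeartRateVariability.infile("e1304.txt")
td = HeartRateVariability.time_domain(n)
fd = HeartRateVariability.frequency(n)

td.rmssd, fd.lfhf_ratio
```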
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |
|
[
"MIT"
] | 1.0.0 | c170775f98cb8738f72c31b6ec3f00055a60184a | docs | 378 | # API
## Data import
The following functions are used to import data. They return the data as an array.
```@docs
infile
```
## Analysis
The following functions analyze the data using time-domain, frequency-domain, nonlinear, or geometric methods. Each of them returns a NamedTuple containing the results of the analysis.
```@docs
time_domain
frequency
nonlinear
geometric
```
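The analysis functions compose naturally with `infile`; a minimal sketch (the file name is illustrative):
```julia
using HeartRateVariability

n = HeartRateVariability.infile("e1304.txt")
HeartRateVariability.time_domain(n).sdnn
HeartRateVariability.nonlinear(n).sampen
```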
| HeartRateVariability | https://github.com/LiScI-Lab/HeartRateVariability.jl.git |