licenses (sequencelengths 1-3) | version (stringclasses, 677 values) | tree_hash (stringlengths 40) | path (stringclasses, 1 value) | type (stringclasses, 2 values) | size (stringlengths 2-8) | text (stringlengths 25-67.1M) | package_name (stringlengths 2-41) | repo (stringlengths 33-86) |
---|---|---|---|---|---|---|---|---|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | code | 6870 | # # Plucker vectors and coordinate transforms
#md # ```@meta
#md # CurrentModule = RigidBodyTools
#md # ```
#=
Here we discuss the use of Plücker vectors and their transforms for
describing rigid-body motion and force. Plücker vectors succinctly describe
both the angular (rotational) and linear (translational) part of motion, and the angular (moment) and
linear (force) part of force. In three dimensions, a Plücker vector is 6-dimensional,
e.g., Plücker velocity and force vectors are
$$v = \begin{bmatrix} \Omega_x \\ \Omega_y \\ \Omega_z \\ U_x \\ U_y \\ U_z \end{bmatrix}, \qquad
f = \begin{bmatrix} M_x \\ M_y \\ M_z \\ F_x \\ F_y \\ F_z \end{bmatrix}$$
In two dimensions, there is only one angular component and two linear components, e.g.,
$$v = \begin{bmatrix} \Omega_z \\ U_x \\ U_y \end{bmatrix}, \qquad f = \begin{bmatrix} M_z \\ F_x \\ F_y \end{bmatrix}$$
We need to be able to transform these vectors from one coordinate system to another.
This requires rotating their components and shifting their center from one origin to another.
For example, a translational velocity based at system B will be different from
the translational velocity at system A because of the rotational velocity, $\Omega \times {}^Br_{A}$,
where ${}^Br_{A}$ is the vector from the origin of A to the origin of B.
Similarly, the moment about B will be different from the moment about A due to
the moment arm ${}^Br_{A} \times F$.
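For instance, in two dimensions, with angular velocity $\Omega$ and ${}^Br_{A} = [x, y]$, the
linear parts are therefore related by
$$U_B = U_A + \Omega \begin{bmatrix} -y \\ x \end{bmatrix},$$
which we will verify in the final example below.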
=#
using RigidBodyTools
using LinearAlgebra
using Plots
#=
## Plücker vectors
A Plücker vector is easily created by simply supplying a vector of its components
=#
v = PluckerMotion([1.0,2.0,3.0])
#=
This created a 2d motion vector, with angular velocity 1.0 and linear
velocity (2.0,3.0). One can also supply the angular and linear
parts separately, using keywords. If one of these keywords is
omitted, it defaults to zero for that part. Note that we also need
to write this as `PluckerMotion{2}` to specify the physical dimensionality.
For a 3d motion vector, one would write `PluckerMotion{3}` here.
=#
v2 = PluckerMotion{2}(angular=1.0,linear=[2.0,3.0])
v2 == v
#=
We can also pick off the angular and linear parts
=#
angular_only(v)
#=
and
=#
linear_only(v)
#=
Force vectors are similar
=#
f = PluckerForce([-1.0,-3.5,2.25])
#=
The vectors of the same type can be added and subtracted
=#
v3 = v + v2
#=
We can also take a scalar product of force and motion vectors
=#
dot(f,v)
#=
## Transforms
Transforms are constructed by describing the relationship between the two
coordinate systems. Consider the example in the figure below.

To develop the 2d transform from A to B, we supply the position $r$ and
the rotation angle $\theta$. For example, if B is shifted by [1,1]
and rotated by angle $\pi/6$ counterclockwise about A, then we construct the transform
as
=#
Xm = MotionTransform([1,1],π/6)
#=
Note that it uses the angle of rotation, $\pi/6$, to create a rotation
matrix operator.
A 2d force transform would be constructed by
=#
Xf = ForceTransform([1,1],π/6)
#=
For 3d transforms, we need to supply the rotation operator itself (as well
as the 3d translation vector). Often, this rotation is done by
rotating about a certain axis by a certain angle. We do this with the
`rotation_about_axis` function. For example, to rotate by $\pi/4$ about
an axis parallel to the vector $[1,1,1]$, then we use
=#
R = rotation_about_axis(π/4,[1,1,1])
#=
and then to translate this rotated system by $[-1,-2,-3]$,
=#
Xm = MotionTransform([-1,-2,-3],R)
#=
and similarly for a force transform.
=#
#=
We can also compute the inverses of these transforms, to transform back from
B to A
=#
inv(Xm)
#=
Transforms of the same type (motion or force) can be composed via multiplication to transform
from, e.g., A to B to C.
=#
Xm1 = MotionTransform([1.5,1.5],π/6)
Xm2 = MotionTransform([-1,1],π/3)
Xm2*Xm1
#=
## Transforming bodies
We can use motion transforms, in particular, to place bodies. We simply
apply the transform as a function, and it transforms the body's
coordinates. For example, transform `Xm1` above shifts the
body to `[1.5,1.5]` and rotates it counterclockwise by `π/6`:
=#
b = Ellipse(1.0,0.2,0.02)
plot(b,xlims=(-3,3),ylims=(-3,3),fillcolor=:gray)
plot!(Xm1(b),xlims=(-3,3),ylims=(-3,3))
#=
In the example above, we did not affect the original body by applying the
transform as a function. Rather, we created a copy of the body.
If, instead, you wish to transform the body in place, use `update_body!`
=#
update_body!(b,Xm1)
#=
One important note: a body stores a set of coordinates in its own intrinsic
coordinate system, and when a transform is applied to the body, it always
acts on these coordinates. This means that the transform's application on the body
cannot be carried out as a composite of operations, e.g. `T2(T1(b))` is not possible.
Instead, in its application to the body, the transform is always interpreted such that system A
is the inertial coordinate system and B is the body system. Of course, the transform itself can always
be constructed from composite transforms.
=#
#=
Sometimes we need information about the normals in the body system.
For these, we can use `normalmid` with the flag `axes=:body`:
=#
nx, ny = normalmid(b,axes=:body)
#=
Finally, if you wish to transform the body's own coordinate system, rather
than use the transform to simply place the body in the inertial system, then
use `transform_body!`. This transforms the intrinsic coordinates of the body.
=#
transform_body!(b,Xm1)
#=
## Transforming Plücker vectors
Transforms can be applied to Plücker vectors to transform their components
between systems. Let's consider a 2d example in which the motion based at system A
is purely a rotation with angular velocity $\Omega = 1$, and we wish to transform this
to system B, translated by $[2,0]$ from A, but with axes aligned with those of A.
We expect that the velocity based at B should have the same angular velocity,
but also should have translational velocity equal to $[0,2]$ due to the angular
motion.
First we construct the motion vector at A
=#
Ω = 1.0
vA = PluckerMotion(Ω,[0,0])
#=
Now construct the transform from A to B:
=#
XA_to_B = MotionTransform([2,0],0)
#=
Now apply the transform to get the velocity at B:
=#
vB = XA_to_B*vA
#=
which gives the expected result. Now let's transform back, using the inverse,
and check that we get back to `vA`
=#
inv(XA_to_B)*vB
#md # ## Transform functions
#md # ```@docs
#md # PluckerMotion
#md # PluckerForce
#md # angular_only
#md # linear_only
#md # LinearAlgebra.dot(::AbstractPluckerForceVector,::AbstractPluckerMotionVector)
#md # MotionTransform
#md # ForceTransform
#md # Base.inv(::AbstractTransformOperator)
#md # Base.transpose(::AbstractTransformOperator)
#md # rotation_transform
#md # translation_transform
#md # update_body!
#md # transform_body!
#md # ```
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 1697 | ## RigidBodyTools.jl
_Tools for creating, moving, and discretizing rigid bodies_
| Documentation | Build Status |
|:---:|:---:|
| [](https://JuliaIBPM.github.io/RigidBodyTools.jl/stable) [](https://JuliaIBPM.github.io/RigidBodyTools.jl/dev) | [](https://github.com/JuliaIBPM/RigidBodyTools.jl/actions) [](https://codecov.io/gh/JuliaIBPM/RigidBodyTools.jl) |
## About the package
The purpose of this package is to provide tools for rigid bodies with
point-discretized surfaces. It currently includes methods for
* a library of 2D surface shape definitions and associated point discretizations
* calculation of geometric properties
* collections of multiple rigid bodies
* prescribed kinematics of single or linked systems of rigid and deforming bodies, with a library of joints
These tools support a variety of classes of problems, including those that involve bodies interacting with fluids, in which we would immerse these point-discretized representations into a computational grid.
<!--
Documentation can be found at https://JuliaIBPM.github.io/RigidBodyTools.jl/latest.
**RigidBodyTools.jl** is registered in the general Julia registry. To install, enter the package manager by typing
```julia
] add RigidBodyTools
```
Then, in any version, type
```julia
julia> using RigidBodyTools
```
For examples, consult the documentation or see the example Jupyter notebooks in the Examples folder.
-->
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 830 | # RigidBodyTools.jl
*Tools for creating, moving, and discretizing rigid bodies*
The purpose of this package is to provide tools for rigid bodies with
point-discretized surfaces. It includes methods for
* a library of surface shape definitions and associated point discretizations
* calculation of geometric properties
* rigid-body motion and transformation of surface points
* collections of multiple rigid bodies
## Installation
This package works on Julia `1.0` and above and is registered in the general Julia registry. To install from the REPL, type
e.g.,
```julia
] add RigidBodyTools
```
Then, in any version, type
```julia
julia> using RigidBodyTools
```
The plots in this documentation are generated using [Plots.jl](http://docs.juliaplots.org/latest/).
You might want to install that, too, to follow the examples.
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 2383 | ```@meta
EditURL = "../../../test/literate/bodylists.jl"
```
# Lists of bodies and their transforms
```@meta
CurrentModule = RigidBodyTools
```
We might want to have several distinct bodies. Here, we discuss how
to combine bodies into lists, and similarly, their transforms.
````@example bodylists
using RigidBodyTools
using Plots
````
## Body list
Suppose we have two bodies and we wish to combine them into a single list.
The advantage of doing so is that many of the operations we have presented
previously also extend to lists. We use `BodyList` to combine them.
````@example bodylists
b1 = Circle(1.0,0.02)
b2 = Rectangle(1.0,2.0,0.02)
bl = BodyList([b1,b2])
````
Another way to do this is to push each one onto the list:
````@example bodylists
bl = BodyList()
push!(bl,b1)
push!(bl,b2)
````
We can transform the list by creating a list of transforms with a `MotionTransformList`
````@example bodylists
X1 = MotionTransform([2.0,3.0],0.0)
X2 = MotionTransform([-2.0,-0.5],π/4)
tl = MotionTransformList([X1,X2])
````
The transform list can be applied to the whole body list simply with
````@example bodylists
tl(bl)
````
which creates a copy of the body list and transforms that, or
````@example bodylists
update_body!(bl,tl)
````
which updates each body in `bl` in place.
Let's see the effect
````@example bodylists
plot(bl)
````
It is important to note that the list points to the original bodies,
so that any change made to the list is reflected in the original bodies, e.g.
````@example bodylists
plot(b2)
````
## Utilities on lists
There are some specific utilities that are helpful for lists. For example,
to collect all of the x, y points (the segment midpoints) in the list into two
vectors, use
````@example bodylists
x, y = collect(bl)
````
For a vector of data defined on these concatenated surface points, we
can use `view` to access just one body's portion and change it:
````@example bodylists
f = zero(x)
f1 = view(f,bl,1)
f1 .= 1.0;
plot(f)
````
Also, we can sum up the values for one of the bodies:
````@example bodylists
sum(f,bl,2)
````
## Body and transform list functions
```@docs
BodyList
getrange
Base.collect(::BodyList)
Base.sum(::AbstractVector,::BodyList,::Int)
Base.view(::AbstractVector,::BodyList,::Int)
MotionTransformList
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 5143 | ```@meta
EditURL = "../../../test/literate/deformation.jl"
```
# Deforming bodies
```@meta
CurrentModule = RigidBodyTools
```
Thus far we have only shown rigid body motion. However, we can
also prescribe surface deformation as an additional component
of a body's motion.
````@example deformation
using RigidBodyTools
using Plots
````
Before we get started, let's define the same macro that we used earlier
in order to visualize our system's motion
````@example deformation
macro animate_motion(b,m,dt,tmax,xlim,ylim)
return esc(quote
bc = deepcopy($b)
t0, x0 = 0.0, init_motion_state(bc,$m)
dxdt = zero(x0)
x = copy(x0)
@gif for t in t0:$dt:t0+$tmax
motion_rhs!(dxdt,x,($m,bc),t)
global x += dxdt*$dt
update_body!(bc,x,$m)
plot(bc,xlims=$xlim,ylims=$ylim)
end every 5
end)
end
````
For deforming bodies, we specify the velocity of the surface directly. This
deformation velocity is expressed in the coordinate system attached to the
body, rather than the inertial coordinate system. This enables the
motion to be easily superposed with the rigid-body motion described earlier.
It is also important to note that the motion is applied **to the endpoints**
of the surface segments. The midpoints are then constructed from the
updated endpoints.
## Example: Basic deformation
Let's see an example. We will create an oscillatory deformation of a circle.
We create the motion by creating functions for each component of velocity.
````@example deformation
Ω = 2π
ufcn(x,y,t) = 0.25*x*y*Ω*cos(Ω*t)
vfcn(x,y,t) = 0.25*(x^2-y^2)*Ω*cos(Ω*t)
def = DeformationMotion(ufcn,vfcn)
````
We will create a simple fixed revolute joint that anchors the body's center to the
inertial system. (Note that we don't need to create a body list here, since we
are only working with one body and one joint.)
````@example deformation
Xp_to_jp = MotionTransform(0.0,0.0,0.0)
Xc_to_jc = MotionTransform(0.0,0.0,0.0)
dofs = [ConstantVelocityDOF(0.0)]
joint = Joint(RevoluteJoint,0,Xp_to_jp,1,Xc_to_jc,dofs)
body = Circle(1.0,0.02)
````
To construct the system, we supply the joint and body, as before, as well as the deformation.
````@example deformation
ls = RigidBodyMotion(joint,body,def)
````
Let's animate this motion
````@example deformation
@animate_motion body ls 0.01 4 (-2,2) (-2,2)
````
The body remains fixed, but the surface deforms!
## Example: Expanding motion
Now consider a circle undergoing an expansion. For this, we set the velocity
components to constant values: the coordinates of the surface segment endpoints, so that each point moves radially outward.
````@example deformation
body = Circle(1.0,0.02)
u = copy(body.x̃end)
v = copy(body.ỹend)
def = ConstantDeformationMotion(u,v)
ls = RigidBodyMotion(joint,body,def)
@animate_motion body ls 0.01 2 (-5,5) (-5,5)
````
## Example: Combining rigid motion and deforming motion.
Now, let's combine an oscillatory rigid-body rotation with
oscillatory deformation, this time applied to a square.
````@example deformation
Xp_to_jp = MotionTransform(0.0,0.0,0.0)
Xc_to_jc = MotionTransform(0.0,0.0,0.0)
Ω = 1.0
dofs = [OscillatoryDOF(π/4,Ω,0.0,0.0)]
joint = Joint(RevoluteJoint,0,Xp_to_jp,1,Xc_to_jc,dofs)
body = Square(1.0,0.02)
ufcn(x,y,t) = 0.25*(x^2+y^2)*y*Ω*cos(Ω*t)
vfcn(x,y,t) = -0.25*(x^2+y^2)*x*Ω*cos(Ω*t)
def = DeformationMotion(ufcn,vfcn)
ls = RigidBodyMotion(joint,body,def)
@animate_motion body ls π/100 4π (-2,2) (-2,2)
````
## Example: Defining new deformations
We can also define new types of deformation that are more specialized.
We need only define a subtype of `AbstractDeformationMotion`
and extend the function `deformation_velocity` to work with it.
The signature of this function is `deformation_velocity(body,deformation,time)`.
For example, let's define a motion on a rectangular shape that
will deform only the top side in the normal direction, but leave the rest of
the surface stationary. We will use the `side` field
of the `Polygon` shape type to access the top, and set
its vertical velocity.
````@example deformation
struct TopMotion{UT} <: AbstractDeformationMotion
vtop :: UT
end
function RigidBodyTools.deformation_velocity(body::Polygon,def::TopMotion,t::Real)
u, v = zero(body.x̃end), zero(body.ỹend)
top = body.side[3]
v[top] .= def.vtop.(body.x̃end[top],body.ỹend[top],t)
return vcat(u,v)
end
````
Now apply it
````@example deformation
Xp_to_jp = MotionTransform(0.0,0.0,0.0)
Xc_to_jc = MotionTransform(0.0,0.0,0.0)
dofs = [ConstantVelocityDOF(0.0)]
joint = Joint(RevoluteJoint,0,Xp_to_jp,1,Xc_to_jc,dofs)
body = Rectangle(1.0,2.0,0.02)
vfcn(x,y,t) = 0.2*(1-x^2)*cos(t)
def = TopMotion(vfcn)
ls = RigidBodyMotion(joint,body,def)
````
Let's try it out
````@example deformation
@animate_motion body ls π/100 4π (-1.5,1.5) (-2.5,2.5)
````
As desired, the top surface deforms vertically, but the rest of the
surface is stationary.
## Deformation functions
```@docs
DeformationMotion
ConstantDeformationMotion
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 4790 | ```@meta
EditURL = "../../../test/literate/dofmotions.jl"
```
# Behaviors of degrees of freedom
```@meta
CurrentModule = RigidBodyTools
```
To define the motion of a joint requires that we define the behavior
of each of the joint's degrees of freedom. There are three different types of
behavior of a degree of freedom:
(1) Its motion is prescribed with a given function of time, in which case we say that the
degree of freedom is *constrained*
(2) Its motion is specified directly, but determined by a process that lies outside
of the body system, in which case we say the degree of freedom is *exogenous*.
(3) Its motion is determined indirectly by some sort of force model for its behavior,
such as a spring or damper, in which case we say the degree of freedom is *unconstrained*.
Here, we will discuss how to set a degree of freedom with one of these behaviors,
and how to set the prescribed motion, if desired.
````@example dofmotions
using RigidBodyTools
using Plots
````
## Constrained behavior (i.e., prescribed motion)
When a degree of freedom is constrained, then its behavior over time
is set by some time-varying function. There are a number of different
types of predefined time-varying behaviors.
### Constant velocity
To specify constant velocity with some velocity, e.g. $U=1$, set
````@example dofmotions
U = 1.0
k = ConstantVelocityDOF(U)
````
For any prescribed motion, we can evaluate it at any specified time.
It returns data of type `DOFKinematicData`.
````@example dofmotions
t = 1.0
kt = k(t)
````
The kinematic data can be parsed with
`dof_position`, `dof_velocity`, and `dof_acceleration`:
````@example dofmotions
dof_position(kt)
````
````@example dofmotions
dof_velocity(kt)
````
````@example dofmotions
dof_acceleration(kt)
````
Let's plot the position over time
````@example dofmotions
t = range(0,3,length=301)
plot(t,dof_position.(k.(t)),xlims=(0,Inf),ylims=(0,Inf))
````
### Oscillatory motion
We can set the position to be a sinusoidal function. For this, we
set the amplitude, the angular frequency, the phase, and the mean velocity (typically zero).
````@example dofmotions
A = 1.0 ## amplitude
Ω = 2π ## angular frequency
ϕ = π/2 ## phase
vel = 0 ## mean velocity
k = OscillatoryDOF(A,Ω,ϕ,vel)
````
Plot the position, velocity, and acceleration
````@example dofmotions
plot(t,dof_position.(k.(t)),xlims=(0,Inf),label="x")
plot!(t,dof_velocity.(k.(t)),label="ẋ")
plot!(t,dof_acceleration.(k.(t)),label="ẍ")
````
### Smooth ramp motion
To ramp the position from one value to another, we use the `SmoothRampDOF`.
For this, we need to specify the nominal velocity of the ramp,
the change in position, and the time at which the ramp starts.
There is an optional argument `ramp` to control the ramp's smoothness.
It defaults to `EldredgeRamp(11.0)`, an Eldredge-type ramp with smoothness
factor 11.
````@example dofmotions
vel = 1.0 ## nominal ramp velocity
Δx = 1.0 ## change in position
t0 = 1.0 ## time of ramp start
k = SmoothRampDOF(vel,Δx,t0)
````
Plot the position
````@example dofmotions
plot(t,dof_position.(k.(t)),xlims=(0,Inf),label="x")
````
We can also ramp up the velocity from one value to another, using
`SmoothVelocityRampDOF`. For example,
````@example dofmotions
u1 = 1.0 ## initial velocity
u2 = 2.0 ## final velocity
acc = 1.0 ## nominal acceleration of the ramp
t0 = 1.0 ## time of ramp start
k = SmoothVelocityRampDOF(acc,u1,u2,t0)
````
Plot the velocity
````@example dofmotions
plot(t,dof_velocity.(k.(t)),xlims=(0,Inf),label="u")
````
and the position
````@example dofmotions
plot(t,dof_position.(k.(t)),xlims=(0,Inf),label="x")
````
### User-defined motion
The user can specify the time-varying position by supplying a function of time
and using `CustomDOF`. It automatically differentiates this function to
get velocity and acceleration.
For example, a quadratic behavior
````@example dofmotions
f(t) = 1.0*t + 2.0*t^2
k = CustomDOF(f)
````
Plot the position
````@example dofmotions
plot(t,dof_position.(k.(t)),xlims=(0,Inf),label="x")
````
and the velocity
````@example dofmotions
plot(t,dof_velocity.(k.(t)),xlims=(0,Inf),label="ẋ")
````
and the acceleration
````@example dofmotions
plot(t,dof_acceleration.(k.(t)),xlims=(0,Inf),label="ẍ")
````
## Exogenous and unconstrained behaviors
If the degree of freedom is to be *exogenous* or *unconstrained*,
then it can be designated as such, e.g,
````@example dofmotions
k = ExogenousDOF()
````
or
````@example dofmotions
k = UnconstrainedDOF()
````
## Degree of freedom functions
```@docs
ConstantVelocityDOF
OscillatoryDOF
SmoothRampDOF
CustomDOF
ExogenousDOF
UnconstrainedDOF
dof_position
dof_velocity
dof_acceleration
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 4146 | ```@meta
EditURL = "../../../test/literate/exogenous.jl"
```
# Exogenous degrees of freedom
```@meta
CurrentModule = RigidBodyTools
```
As we mentioned in previous pages, some degrees of freedom can be
designated as *exogenous*, meaning that their behavior is determined
by some external process that we do not model explicitly. In practice,
that means that the acceleration of such a degree of freedom must
be explicitly provided at every time step while the state vector is being
advanced.
````@example exogenous
using RigidBodyTools
using Plots
````
As usual, we will demonstrate this via an example. In the example,
a single flat plate will be commanded to pitch upward by 45 degrees
about its leading edge. It will move steadily in the +x direction.
However, its y acceleration will vary randomly via some exogenous process.
````@example exogenous
Xp_to_jp = MotionTransform(0.0,0.0,0.0)
Xc_to_jc = MotionTransform(0.5,0.0,0.0)
dofs = [SmoothRampDOF(0.4,π/4,0.5),ConstantVelocityDOF(1.0),ExogenousDOF()]
joint = Joint(FreeJoint2d,0,Xp_to_jp,1,Xc_to_jc,dofs)
body = ThickPlate(1.0,0.05,0.02)
````
We need to provide a means of passing along the time-varying information about
the exogenous accelerations to the time integrator. There are two ways we
can do this.
#### Method 1: A function for exogenous acclerations.
We can provide a function that will specify the exogenous y acceleration.
Here, we will create a function that sets it to a random value chosen from a
normal distribution. Note that the function must be mutating and have a signature
`(a,x,p,t)`, where `a` is the exogenous acceleration vector. The arguments `x`
and `p` are a state and a parameter, which can be flexibly defined.
Here, the *state* is the rigid-body system state and the *parameter*
is the `RigidBodyMotion` structure.
The last argument `t` is time.
In this example, we don't need any of those arguments.
````@example exogenous
function my_exogenous_function!(a,x,ls,t)
a .= randn(length(a))
end
````
We pass that along via the `exogenous` keyword argument.
````@example exogenous
ls = RigidBodyMotion(joint,body;exogenous=my_exogenous_function!)
````
#### Method 2: Setting the exogenous accelerations explicitly in the loop
Another approach we can take is to set the exogenous acceleration(s)
explicitly in the loop, using the `update_exogenous!` function.
This function saves the vector of accelerations in a buffer in the `RigidBodyMotion`
structure so that it is available to the time integrator. We will
demonstrate that approach here.
````@example exogenous
ls = RigidBodyMotion(joint,body)
````
Let's initialize the state vector and its rate of change
````@example exogenous
bc = deepcopy(body)
dt, tmax = 0.01, 3.0
t0, x0 = 0.0, init_motion_state(bc,ls)
dxdt = zero(x0)
x = copy(x0)
````
Note that the state vector has four elements. The first two are
associated with the prescribed motions for rotation and x translation.
The third is the y position, the exogenous degree of freedom. And the
fourth is the y velocity.
Why the y velocity? Because the exogenous behavior is specified via its
acceleration. Let's advance the system and animate it. We include
a horizontal line along the hinge axis to show the effect of the exogenous
motion.
````@example exogenous
xhist = []
a_edof = zero_exogenous(ls)
@gif for t in t0:dt:t0+tmax
a_edof .= randn(length(a_edof))
update_exogenous!(ls,a_edof)
motion_rhs!(dxdt,x,(ls,bc),t)
global x += dxdt*dt
update_body!(bc,x,ls)
push!(xhist,copy(x))
plot(bc,xlims=(-1,5),ylims=(-1.5,1.5))
hline!([0.0])
end every 5
````
Let's plot the exogenous state and its velocity
````@example exogenous
plot(t0:dt:t0+tmax,map(x -> exogenous_position_vector(x,ls,1)[1],xhist),label="y position",xlabel="t")
plot!(t0:dt:t0+tmax,map(x -> exogenous_velocity_vector(x,ls,1)[1],xhist),label="y velocity")
````
The variation in velocity is quite noisy (and constitutes a random walk).
In contrast, the change in position is relatively smooth, since it represents
an integral of this velocity.
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 11961 | ```@meta
EditURL = "../../../test/literate/joints.jl"
```
# Joints and body-joint systems
```@meta
CurrentModule = RigidBodyTools
```
For systems consisting of one or more rigid bodies, *joints* are used to specify
the degrees of freedom permitted for the body. Even for a single rigid body,
the body is assumed to be connected by a joint with the inertial coordinate system.
In three dimensions, there are several different classes of joint:
- `RevoluteJoint`, which rotates about an axis and has only one degree of freedom
- `PrismaticJoint`, which slides along one axis and has only one degree of freedom
- `HelicalJoint`, which rotates about and slides along one axis (two degrees of freedom)
- `SphericalJoint`, which rotates freely about one point (three degrees of freedom)
- `FreeJoint`, which can move in any manner (six degrees of freedom)
However, in two spatial dimensions, there are only two
- `RevoluteJoint` (one degree of freedom: rotation)
- `FreeJoint2d` (three degrees of freedom: rotation, two translation)
````@example joints
using RigidBodyTools
using Plots
````
A joint serves as a connection between two bodies; one of the bodies is the *parent*
and the other is the *child*. It is also common for a body to be connected to
the inertial coordinate system, in which case the inertial system is the parent
and the body is the child. The user must specify the id of the parent body and the child body.
The id of the inertial system is 0.
Before we discuss how a joint is specified, it is important to understand that
a joint is basically just an intermediate `MotionTransform` from the parent body's system
to the child body's system.
That is, to construct the transform from body P to body C, we would compose it
as follows
$${}^C X_{P} = {}^{C}X_{J(C)} {}^{J(C)}X_{J(P)} {}^{J(P)}X_{P}$$
The transform in the middle is the joint transform and is the only one of these that
can vary in time. It maps from one
coordinate system attached (rigidly) to the parent body to another coordinate
system attached (rigidly) to the child body.
The degrees of freedom of the joint constrain the behavior of this joint transform.
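As a concrete sketch of this composition (the three factors below are stand-in transforms
with arbitrary values, not an actual joint), the factors multiply just like ordinary
`MotionTransform`s:

````@example joints
Xjp_p  = MotionTransform([1.0,0.0],0.0)        # parent body P -> joint J(P) on the parent
Xjc_jp = MotionTransform([0.0,0.0],π/6)        # J(P) -> J(C): the (possibly time-varying) joint transform
Xc_jc  = inv(MotionTransform([-1.0,0.0],0.0))  # joint J(C) on the child -> child body C
Xc_p   = Xc_jc*Xjc_jp*Xjp_p
````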
Below, we show how joints are constructed and used.
## Joint construction generalities
The user must specify where the joint is located on each body, relative to the
origin of the body's coordinate system, and what orientation the joint takes
with respect to the body's coordinate system. These are specified with
`MotionTransform`s, transforming from each body's coordinate system to the joint's
coordinate system on that body. This transform is invariant -- it never changes.
Also, the user must specify the behavior of each of the degrees of freedom
of a joint, using the tools we discussed in the previous page.
The basic signature is
`Joint(joint_type,parent_body_id,Xp_to_jp,child_body_id,Xc_to_jc,doflist)`
However, there is a specialized signature if you simply wish to place
a body in some stationary configuration `X`:
`Joint(X,body_id)`
and if there is only one body and it is stationary, then
`Joint(X)`
will do.
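For instance, a minimal sketch of the stationary form (the configuration here is arbitrary):

````@example joints
Xstat = MotionTransform([0.0,2.0],π/4)  # an arbitrary stationary configuration
stationary_joint = Joint(Xstat)         # a single stationary body held in configuration Xstat
````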
## Example
Let's see a 2d example. Suppose we have two bodies, 1 and 2. Body 1 is to
be connected to the inertial coordinate system, and prescribed with motion
that causes it to oscillate rotationally (*pitch*) about a point located
at $(-1,0)$ and in the body's coordinate system, and this pitch axis
can oscillate up and down (*heave*).
Body 2 is to be connected by a hinge to body 1, and this hinge's angle will
also oscillate. We will set the hinge to be located at $(1.02,0)$
on body 1 and $(-1.02,0)$ on body 2.
### Joint from inertial system to body 1
First, let's construct the joint from the inertial system to body 1. This
should be a `FreeJoint2d`, since motion is prescribed in two of the three
degrees of freedom (as well as the third one, with zero velocity). We can assume
that the joint is attached to the origin of the parent (the inertial
system). On the child (body 1), the joint is to be at `[-1,0]`.
````@example joints
pid = 0
Xp_to_jp = MotionTransform([0,0],0)
cid = 1
Xc_to_jc = MotionTransform([-1,0],0)
````
For the rotational and the y degrees of freedom, we need oscillatory motion.
For the rotational motion, let's set the amplitude to $\pi/4$ and the
angular frequency to $\Omega = 2\pi$, but set the phase and mean velocity both
to zero.
````@example joints
Ar = π/4
Ω = 2π
ϕr = 0.0
vel = 0.0
kr = OscillatoryDOF(Ar,Ω,ϕr,vel)
````
For the plunging, we will set an amplitude of 1 and a phase lag of $\pi/2$, but keep the same
frequency as the pitching.
````@example joints
Ay = 1
ϕy = -π/2
ky = OscillatoryDOF(Ay,Ω,ϕy,vel)
````
The x degree of freedom is simply constant velocity, set to 0, to
ensure it does not move in the $x$ direction.
````@example joints
kx = ConstantVelocityDOF(0)
````
We put these together into a vector, to pass along to the joint constructor.
The ordering of these in the vector is important. It must be
[rotational, x, y].
````@example joints
dofs = [kr,kx,ky];
nothing #hide
````
Now set the joint
````@example joints
joint1 = Joint(FreeJoint2d,pid,Xp_to_jp,cid,Xc_to_jc,dofs)
````
Note that this joint has three constrained degrees of freedom,
no exogenous degrees of freedom, and no unconstrained degrees of freedom.
In a later example, we will change this.
### Joint from body 1 to body 2
This joint is a `RevoluteJoint`. First set the joint locations on each body.
````@example joints
pid = 1
Xp_to_jp = MotionTransform([1.02,0],0)
cid = 2
Xc_to_jc = MotionTransform([-1.02,0],0)
````
Now set its single degree of freedom (rotation) to have oscillatory
kinematics. We will set its amplitude the same as before,
but give it a phase lag
````@example joints
Ar = π/4
Ω = 2π
ϕr = -π/4
kr = OscillatoryDOF(Ar,Ω,ϕr,vel)
````
Put it in a one-element vector.
````@example joints
dofs = [kr];
nothing #hide
````
and construct the joint
````@example joints
joint2 = Joint(RevoluteJoint,pid,Xp_to_jp,cid,Xc_to_jc,dofs)
````
Group the two joints together into a vector. The order of this vector does not matter,
since the connectivity will be determined regardless of the ordering, but the joints
are numbered by the order in which they are provided here.
````@example joints
joints = [joint1,joint2];
nothing #hide
````
## Assembling the joints and bodies
The joints and bodies comprise an overall `RigidBodyMotion` system.
When this system is constructed, all of the connectivities are
determined (or missing connectivities are revealed). The
construction requires that we have set up the bodies themselves, so
let's do that first. We will make both body 1 and body 2 an ellipse
of aspect ratio 5. Note that ordering of bodies matters here, because
the first in the list is interpreted as body 1, etc.
````@example joints
b1 = Ellipse(1.0,0.2,0.02)
b2 = Ellipse(1.0,0.2,0.02)
bodies = BodyList([b1,b2])
````
Now we can construct the system
````@example joints
ls = RigidBodyMotion(joints,bodies)
````
Before we proceed, it is useful to demonstrate some of the tools
we have to probe the connectivity of the system. For example,
to find the parent joint of body 1, we use
````@example joints
parent_joint_of_body(1,ls)
````
This returns 1, since we have connected body 1 to joint 1.
How about the child joint of body 1? We expect it to be 2. Since there
might be more than one child, this returns a vector:
````@example joints
child_joints_of_body(1,ls)
````
We can also check the body connectivity of joints. This can be very useful
for more complicated systems in which the joint numbering is less clear.
The parent body of joint 1
````@example joints
parent_body_of_joint(1,ls)
````
This returns 0 since we have connected joint 1 to the inertial system. The child body of joint 2:
````@example joints
child_body_of_joint(2,ls)
````
## The system state vector
A key concept in advancing, plotting, and doing further analysis of this
system of joints and bodies is the *state vector*, $x$. This state vector
has entries for the position of every degree of freedom of all of the joints. It also may
have further entries for other quantities that need to be advanced, but for
this example, there are no other entries.
There are two functions that are useful for constructing the state vector. The
first is `zero_motion_state`, which simply creates a vector of zeros of the
correct size.
````@example joints
zero_motion_state(bodies,ls)
````
The second is `init_motion_state`, which fills in initial position values for any degrees of freedom
that have been prescribed.
````@example joints
x = init_motion_state(bodies,ls)
````
Note that neither of these functions has any mutating effect on the arguments (`bodies` and `ls`).
Also, it is always possible for the user to modify the entries in the state
vector after this function is called. In general, it would be difficult to
determine which entry is which in this state vector, so we can use a special
function for this. For example, to get access to just
the part of the state vector for the positions of joint 1,
````@example joints
jid = 1
x1 = position_vector(x,ls,jid)
````
This is a view on the overall state vector. Thus, if you decide to change an entry of `x1`, this, in turn,
changes the corresponding entry in `x`.
We can use the system state vector to put the bodies in their proper places,
using the joint positions in `x`.
````@example joints
update_body!(bodies,x,ls)
````
Let's plot this just to check
````@example joints
plot(bodies,xlims=(-4,4),ylims=(-4,4))
````
## Advancing the state vector
Once the initial state vector is constructed, then the system can
be advanced in time. In this example, there are no exogenous or
unconstrained degrees of freedom that require extra input, so
the system is *closed* as it is. To advance the system, we need
to solve the system of equations
$$\frac{\mathrm{d}x}{\mathrm{d}t} = f(x,t)$$
The function $f(x,t)$ describing the rate of change of $x$ is given by the
function `motion_rhs!`. This function mutates its first argument,
the rate-of-change vector `dxdt`, which can then be used to update the state.
The system and bodies are passed in as a tuple, followed by time.
Using a simple forward Euler method, the state vector can be advanced
as follows
````@example joints
t0, x0 = 0.0, init_motion_state(bodies,ls)
dxdt = zero(x0)
x = copy(x0)
dt, tmax = 0.01, 4.0
for t in t0:dt:t0+tmax
motion_rhs!(dxdt,x,(ls,bodies),t)
global x += dxdt*dt
end
````
Now that we know how to advance the state vector, let's create a macro
that can be used to make a movie of the evolving system.
````@example joints
macro animate_motion(b,m,dt,tmax,xlim,ylim)
return esc(quote
bc = deepcopy($b)
t0, x0 = 0.0, init_motion_state(bc,$m)
dxdt = zero(x0)
x = copy(x0)
@gif for t in t0:$dt:t0+$tmax
motion_rhs!(dxdt,x,($m,bc),t)
global x += dxdt*$dt
update_body!(bc,x,$m)
plot(bc,xlims=$xlim,ylims=$ylim)
end every 5
end)
end
````
Let's use it here
````@example joints
@animate_motion bodies ls 0.01 4 (-4,4) (-4,4)
````
## Joint functions
```@docs
Joint
zero_joint
init_joint
```
## System and state functions
```@docs
RigidBodyMotion
zero_motion_state
init_motion_state
Base.view(::AbstractVector,::RigidBodyMotion,::Int)
position_vector
velocity_vector
deformation_vector
exogenous_position_vector
exogenous_velocity_vector
unconstrained_position_vector
unconstrained_velocity_vector
body_transforms
motion_transform_from_A_to_B
force_transform_from_A_to_B
body_velocities
motion_rhs!
zero_exogenous
update_exogenous!
maxvelocity
ismoving(::RigidBodyMotion)
is_system_in_relative_motion
rebase_from_inertial_to_reference
```
## Joint types
```@docs
RevoluteJoint
PrismaticJoint
HelicalJoint
SphericalJoint
FreeJoint
FreeJoint2d
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 5169 | ```@meta
EditURL = "../../../test/literate/shapes.jl"
```
# Creating bodies
```@meta
CurrentModule = RigidBodyTools
```
The most basic functions of this package create an object of type
`Body`. There are a variety of such functions, a few of which we will demonstrate here.
Generally speaking, we are interested in creating the object and placing it
in a certain position and orientation. We do this in two steps: we create the
basic shape, centered at the origin with a default orientation, and then
we transform the shape to a desired location and orientation. We will discuss
the shapes in this notebook, and the transforms in the following notebook.
It is useful to stress that each body stores two types of points internally:
the *endpoints* of the segments that comprise the body surface, and the
*midpoints* of these segments. The midpoints are intended for use in downstream
calculations, e.g. as forcing points in the calculations on immersed layers.
The midpoints are simply the geometric averages of the endpoints, so
endpoints are the ones that are transformed first, and midpoints are updated next.
````@example shapes
using RigidBodyTools
using Plots
````
## Creating a shape
Let's first create a shape. For any shape, we have to make a choice of the
geometric dimensions (e.g, radius of the circle, side lengths of a rectangle),
as well as the points that we use to discretely represent the surface.
For this latter choice, there are two constructor types: we can specify the
number of points (as an integer), or we can specify the nominal spacing between
points (as a floating-point number).
The second approach is usually preferable when we use these tools for constructing immersed bodies.
It is important to stress that the algorithms for placing points attempt to make the spacing
as uniform as possible.
Let's create the most basic shape, a circle of radius 1. We will discretize
it with 100 points first:
````@example shapes
b = Circle(1.0,100)
````
Now we will create the same body with a spacing of 0.02
````@example shapes
b = Circle(1.0,0.02)
````
This choice led to 312 points along the circumference. Quick math will tell
you that the point spacing is probably not exactly 0.02. In fact, you can
find out the actual spacing with `dlengthmid`. This function calculates
the spacing associated with each point. (It does so by first calculating
the spacing between midpoints between each point and its two adjacent points.)
````@example shapes
dlengthmid(b)
````
It is just a bit larger than 0.02.
A few other useful functions on the shape. To simply know the number of points,
````@example shapes
length(b)
````
To find the outward normal vectors (based on the perpendicular to the line
joining the adjacent midpoints):
````@example shapes
nx, ny = normalmid(b)
````
We can also plot the shape
````@example shapes
plot(b)
````
Sometimes we don't want to fill in the shape (and we may want to change the line color).
In that case, we can use
````@example shapes
plot(b,fill=:false,linecolor=:black)
````
## Other shapes
Let's see some other shapes in action, like a square and an ellipse
````@example shapes
b1, b2 = Square(1.0,0.02), Ellipse(0.6,0.1,0.02)
plot(plot(b1), plot(b2))
````
A NACA 4412 airfoil, with chord length 1, and 0.02 spacing between points.
````@example shapes
b = NACA4(0.04,0.4,0.12,0.02)
plot(b)
````
A flat plate with no thickness
````@example shapes
b = Plate(1.0,0.02)
plot(b)
````
and a flat plate with a 5 percent thickness (and rounded ends)
````@example shapes
b = ThickPlate(1.0,0.05,0.01)
plot(b)
````
There are also some generic tools for creating shapes. A `BasicBody` simply
consists of points that describe the vertices. The interface for this is very simple.
````@example shapes
x = [1.0, 1.2, 0.7, 0.6, 0.2, -0.1, 0.1, 0.4]
y = [0.1, 0.5, 0.8, 1.2, 0.8, 0.6, 0.2, 0.3]
b = BasicBody(x,y)
````
````@example shapes
plot(b)
scatter!(b,markersize=3,markercolor=:black)
````
However, this function does not insert any points along the sides between vertices.
We have to do the work of specifying these points in the original call. For this reason,
there are a few functions that are more directly useful. For example, we can
create a polygon from these vertices, with a specified spacing between points
distributed along the polygon sides
````@example shapes
b = Polygon(x,y,0.02)
````
````@example shapes
plot(b)
scatter!(b,markersize=3,markercolor=:black)
````
Alternatively, we can interpret
those original points as control points for splines, with a spacing between points
along the splines provided:
````@example shapes
b = SplinedBody(x,y,0.02)
````
````@example shapes
plot(b)
scatter!(b,markersize=3,markercolor=:black)
````
## Body functions
```@docs
BasicBody
Polygon
Circle
Ellipse
NACA4
Plate
Rectangle
SplinedBody
Square
```
## Shape utilities
```@docs
centraldiff
Base.diff(::Body)
Base.diff(::BodyList)
dlength
dlengthmid
Base.length(::Body)
midpoints(::Body)
midpoints(::BodyList)
normal
normalmid
arccoord
arccoordmid
arclength(::Body)
arclength(::BodyList)
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 5004 | ```@meta
EditURL = "../../../test/literate/surfacevelocities.jl"
```
# Evaluating velocities on body surfaces
```@meta
CurrentModule = RigidBodyTools
```
For use in mechanics problems, it is important to be able to output
the velocity of the points on the surfaces of bodies at a given system state
and time. We use the function `surface_velocity!` for this. Here,
we will demonstrate its use, and particularly, its various options.
We will use a simple problem, consisting of two flat plates connected
by a `RevoluteJoint`, and one of the plates connected to the inertial
system by a `FreeJoint2d`, which oscillates only in the y coordinate.
````@example surfacevelocities
using RigidBodyTools
using Plots
````
## Set up the bodies, joints, and motion
````@example surfacevelocities
body1 = Plate(1.0,200)
body2 = deepcopy(body1)
bodies = BodyList([body1,body2])
````
Joint 1, connecting body 1 to inertial system
````@example surfacevelocities
Xp_to_j1 = MotionTransform(0.0,0.0,0.0)
Xch_to_j1 = MotionTransform(-0.5,0.0,0.0)
dofs1 = [ConstantVelocityDOF(0),ConstantVelocityDOF(0),OscillatoryDOF(1.0,2π,0,0.0)]
joint1 = Joint(FreeJoint2d,0,Xp_to_j1,1,Xch_to_j1,dofs1)
````
Joint 2, connecting body 2 to body 1
````@example surfacevelocities
Xp_to_j2 = MotionTransform(0.5,0.0,0.0)
Xch_to_j2 = MotionTransform(-0.5,0.0,0.0)
dofs2 = [OscillatoryDOF(π/4,2π,-π/4,0.0)]
joint2 = Joint(RevoluteJoint,1,Xp_to_j2,2,Xch_to_j2,dofs2)
````
Assemble, and get the initial joint state vector at t = 0
````@example surfacevelocities
joints = [joint1,joint2]
ls = RigidBodyMotion(joints,bodies)
t = 0
x = init_motion_state(bodies,ls;tinit=t)
````
Let's plot the bodies, just to visualize their current configuration.
````@example surfacevelocities
update_body!(bodies,x,ls)
plot(bodies,xlim=(-1,3),ylim=(-2,2))
````
Because of the phase difference, joint 2 is bent at an angle initially.
## Evaluate velocity on body points
First, initialize vectors for the `u` and `v` components in this 2d example,
using the `zero_body` function.
````@example surfacevelocities
u, v = zero_body(bodies), zero_body(bodies)
````
Now evaluate the velocities at time 0
````@example surfacevelocities
surface_velocity!(u,v,bodies,x,ls,t)
````
We can plot these on each body using the `view` function for `BodyList`.
For example, the vectors of u and v velocities on body 1 are
````@example surfacevelocities
bodyid = 1
s = arccoord(bodies[bodyid])
plot(s,view(u,bodies,bodyid),ylim=(-20,20),label="u1")
plot!(s,view(v,bodies,bodyid),label="v1")
````
This shows a pure vertical translation. On body 2,
we see a mixture of the translation of body 1, plus
the rotation of joint 2, which is reflected in the linear variation. Since the plate is at an angle,
some of this rotation shows up as a negative u component.
````@example surfacevelocities
bodyid = 2
s = arccoord(bodies[bodyid])
plot(s,view(u,bodies,bodyid),ylim=(-20,20),label="u2")
plot!(s,view(v,bodies,bodyid),label="v2")
````
We can compute these velocities in different components, using the optional `axes`
argument. For example, let's see the velocities on body 2 in its own coordinate
system
````@example surfacevelocities
surface_velocity!(u,v,bodies,x,ls,t;axes=2)
bodyid = 2
s = arccoord(bodies[bodyid])
plot(s,view(u,bodies,bodyid),ylim=(-20,20),label="u2")
plot!(s,view(v,bodies,bodyid),label="v2")
````
Now only the v component (the component perpendicular to the plate) depicts
the rotation, and the u component has only a portion of the translational motion
from joint 1.
We can also compute the velocity relative to a body reference frame, using
the optional `frame` argument. Let's compute the velocities relative
to the velocity of body 1 (and in body 1 coordinates) and plot them.
````@example surfacevelocities
surface_velocity!(u,v,bodies,x,ls,t;axes=1,frame=1)
bodyid = 1
s = arccoord(bodies[bodyid])
plot(s,view(u,bodies,bodyid),ylim=(-20,20),label="u1")
plot!(s,view(v,bodies,bodyid),label="v1")
````
Body 1 has no velocity in this reference frame. And body 2 has only
a pure rotation...
````@example surfacevelocities
bodyid = 2
s = arccoord(bodies[bodyid])
plot(s,view(u,bodies,bodyid),ylim=(-20,20),label="u2")
plot!(s,view(v,bodies,bodyid),label="v2")
````
We do not have to remove all of the motion of the reference body
when we compute velocity in a moving reference frame. We
can use the `motion_part` keyword argument to remove only, e.g., `:angular`
or `:linear` part (instead of `:full`, the default). For example,
if we remove the angular part of body 2's motion, we are left with only
the translation from joint 1
````@example surfacevelocities
surface_velocity!(u,v,bodies,x,ls,t;axes=2,frame=2,motion_part=:angular)
bodyid = 2
s = arccoord(bodies[bodyid])
plot(s,view(u,bodies,bodyid),ylim=(-20,20),label="u2")
plot!(s,view(v,bodies,bodyid),label="v2")
````
## Surface velocity functions
```@docs
surface_velocity!
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.5.8 | 00e976b2114783858a92e581e4895769d319ab83 | docs | 7431 | ```@meta
EditURL = "../../../test/literate/transforms.jl"
```
# Plucker vectors and coordinate transforms
```@meta
CurrentModule = RigidBodyTools
```
Here we discuss the use of Plücker vectors and their transforms for
describing rigid-body motion and force. Plücker vectors succinctly describe
both the angular (rotational) and linear (translational) part of motion, and the angular (moment) and
linear (force) part of force. In three dimensions, a Plücker vector is 6-dimensional,
e.g., Plücker velocity and force vectors are
$$v = \begin{bmatrix} \Omega_x \\ \Omega_y \\ \Omega_z \\ U_x \\ U_y \\ U_z \end{bmatrix}, \qquad
f = \begin{bmatrix} M_x \\ M_y \\ M_z \\ F_x \\ F_y \\ F_z \end{bmatrix}$$
In two dimensions, there is only one angular component and two linear components, e.g.,
$$v = \begin{bmatrix} \Omega_z \\ U_x \\ U_y \end{bmatrix}, \qquad f = \begin{bmatrix} M_z \\ F_x \\ F_y \end{bmatrix}$$
We need to be able to transform these vectors from one coordinate system to another.
This requires rotating their components and shifting their center from one origin to another.
For example, a translational velocity based at system B will be different from
the translational velocity at system A because of the rotational velocity, $\Omega \times {}^Br_{A}$,
where ${}^Br_{A}$ is the vector from the origin of A to the origin of B.
Similarly, the moment about B will be different from the moment about A due to
the moment arm ${}^Br_{A} \times F$.
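For instance, in two dimensions, with angular velocity $\Omega$ and ${}^Br_{A} = [x, y]$, the
linear parts are therefore related by
$$U_B = U_A + \Omega \begin{bmatrix} -y \\ x \end{bmatrix},$$
which we will verify in the final example below.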
````@example transforms
using RigidBodyTools
using LinearAlgebra
using Plots
````
## Plücker vectors
A Plücker vector is easily created by simply supplying a vector of its components
````@example transforms
v = PluckerMotion([1.0,2.0,3.0])
````
This created a 2d motion vector, with angular velocity 1.0 and linear
velocity (2.0,3.0). One can also supply the angular and linear
parts separately, using keywords. If one of these keywords is
omitted, it defaults to zero for that part. Note that we also need
to write this as `PluckerMotion{2}` to specify the physical dimensionality.
For a 3d motion vector, one would write `PluckerMotion{3}` here.
````@example transforms
v2 = PluckerMotion{2}(angular=1.0,linear=[2.0,3.0])
v2 == v
````
We can also pick off the angular and linear parts
````@example transforms
angular_only(v)
````
and
````@example transforms
linear_only(v)
````
Force vectors are similar
````@example transforms
f = PluckerForce([-1.0,-3.5,2.25])
````
The vectors of the same type can be added and subtracted
````@example transforms
v3 = v + v2
````
We can also take a scalar product of force and motion vectors
````@example transforms
dot(f,v)
````
## Transforms
Transforms are constructed by describing the relationship between the two
coordinate systems. Consider the example in the figure below.

To develop the 2d transform from A to B, we supply the position $r$ and
the rotation angle $\theta$. For example, if B is shifted by [1,1]
and rotated by angle $\pi/6$ counterclockwise about A, then we construct the transform
as
````@example transforms
Xm = MotionTransform([1,1],π/6)
````
Note that it uses the angle of rotation, $\pi/6$, to create a rotation
matrix operator.
A 2d force transform would be constructed by
````@example transforms
Xf = ForceTransform([1,1],π/6)
````
For 3d transforms, we need to supply the rotation operator itself (as well
as the 3d translation vector). Often, this rotation is done by
rotating about a certain axis by a certain angle. We do this with the
`rotation_about_axis` function. For example, to rotate by $\pi/4$ about
an axis parallel to the vector $[1,1,1]$, then we use
````@example transforms
R = rotation_about_axis(π/4,[1,1,1])
````
and then to translate this rotated system by $[-1,-2,-3]$,
````@example transforms
Xm = MotionTransform([-1,-2,-3],R)
````
and similarly for a force transform.
We can also compute the inverses of these transforms, to transform back from
B to A
````@example transforms
inv(Xm)
````
Transforms of the same type (motion or force) can be composed via multiplication to transform
from, e.g., A to B to C.
````@example transforms
Xm1 = MotionTransform([1.5,1.5],π/6)
Xm2 = MotionTransform([-1,1],π/3)
Xm2*Xm1
````
## Transforming bodies
We can use motion transforms, in particular, to place bodies. We simply
apply the transform as a function, and it transforms the body's
coordinates. For example, transform `Xm1` above shifts the
body to `[1.5,1.5]` and rotates it counterclockwise by `π/6`:
````@example transforms
b = Ellipse(1.0,0.2,0.02)
plot(b,xlims=(-3,3),ylims=(-3,3),fillcolor=:gray)
plot!(Xm1(b),xlims=(-3,3),ylims=(-3,3))
````
In the example above, we did not affect the original body by applying the
transform as a function. Rather, we created a copy of the body.
If, instead, you wish to transform the body in place, use `update_body!`
````@example transforms
update_body!(b,Xm1)
````
One important note: a body stores a set of coordinates in its own intrinsic
coordinate system, and when a transform is applied to the body, it always
acts on these coordinates. This means that the transform's application on the body
cannot be carried out as a composite of operations, e.g. `T2(T1(b))` is not possible.
Instead, in its application to the body, the transform is always interpreted such that system A
is the inertial coordinate system and B is the body system. Of course, the transform itself can always
be constructed from composite transforms.
Sometimes we need information about the normals in the body system.
For these, we can use `normalmid` with the flag `axes=:body`:
````@example transforms
nx, ny = normalmid(b,axes=:body)
````
Finally, if you wish to transform the body's own coordinate system, rather
than use the transform to simply place the body in the inertial system, then
use `transform_body!`. This transforms the intrinsic coordinates of the body.
````@example transforms
transform_body!(b,Xm1)
````
## Transforming Plücker vectors
Transforms can be applied to Plücker vectors to transform their components
between systems. Let's consider a 2d example in which the motion based at system A
is purely a rotation with angular velocity $\Omega = 1$, and we wish to transform this
to system B, translated by $[2,0]$ from A, but with axes aligned with those of A.
We expect that the velocity based at B should have the same angular velocity,
but also should have translational velocity equal to $[0,2]$ due to the angular
motion.
First we construct the motion vector at A
````@example transforms
Ω = 1.0
vA = PluckerMotion(Ω,[0,0])
````
Now construct the transform from A to B:
````@example transforms
XA_to_B = MotionTransform([2,0],0)
````
Now apply the transform to get the velocity at B:
````@example transforms
vB = XA_to_B*vA
````
which gives the expected result. Now let's transform back, using the inverse,
and check that we get back to `vA`
````@example transforms
inv(XA_to_B)*vB
````
## Transform functions
```@docs
PluckerMotion
PluckerForce
angular_only
linear_only
LinearAlgebra.dot(::AbstractPluckerForceVector,::AbstractPluckerMotionVector)
MotionTransform
ForceTransform
Base.inv(::AbstractTransformOperator)
Base.transpose(::AbstractTransformOperator)
rotation_transform
translation_transform
update_body!
transform_body!
```
---
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*
| RigidBodyTools | https://github.com/JuliaIBPM/RigidBodyTools.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 659 | using FastGradientProjection
using Documenter
DocMeta.setdocmeta!(FastGradientProjection, :DocTestSetup, :(using FastGradientProjection); recursive=true)
makedocs(;
modules=[FastGradientProjection],
authors="Shuhei Yoshida <[email protected]> and contributors",
sitename="FastGradientProjection.jl",
format=Documenter.HTML(;
canonical="https://syoshida1983.github.io/FastGradientProjection.jl",
edit_link="main",
assets=String[],
),
pages=[
"Home" => "index.md",
],
)
deploydocs(;
repo="github.com/syoshida1983/FastGradientProjection.jl",
devbranch="main",
)
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 1945 | export FGP
export FGP!
"""
FGP(b, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso")
return denoised (volume) image `b` based on the minimization problem ``\\min_{\\mathbf{x}\\in{C}}\\|\\mathbf{x} - \\mathbf{b}\\|^{2}_{F} + 2\\lambda\\mathrm{TV}(\\mathbf{x}).``
# Arguments
- `b`: input (volume) image.
- `λ`: regularization parameter.
- `N`: number of iterations.
- `lower_bound=-Inf`: lower bound of the convex closed set ``C``. If `b` is a complex array, projection to ``C`` is not performed.
- `upper_bound=Inf`: upper bound of the convex closed set ``C``.
- `TV="iso"`: if `TV="iso"` (default), denoising is based on isotropic TV. To specify anisotropic TV, set `TV="aniso"`.
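# Example
A minimal sketch (`img` is a hypothetical noisy 2D `Float64` array with values in [0, 1]):
```julia
x = FGP(img, 0.1, 50; lower_bound=0.0, upper_bound=1.0)
```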
"""
function FGP(b::AbstractArray{T,2}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
ProjPair = (TV=="iso") ? IsoProj : ((TV=="aniso") ? AnisoProj : throw(ArgumentError("Invalid keyword argument.")))
Ny, Nx = size(b)
p = zeros(eltype(b), Ny+1, Nx)
q = zeros(eltype(b), Ny, Nx+1)
p₋₁ = zeros(eltype(b), Ny+1, Nx)
q₋₁ = zeros(eltype(b), Ny, Nx+1)
p̃ = zeros(eltype(b), Ny+1, Nx)
q̃ = zeros(eltype(b), Ny, Nx+1)
t = 1.
t₊₁ = 1.
for k ∈ 1:N
p̃[2:end-1,:], q̃[:,2:end-1] = AdjointOp(ProjConvex.(b .- λ.*LinearOp(p₋₁, q₋₁), lower_bound, upper_bound))
p̃ = @. p₋₁ + p̃/(8λ)
q̃ = @. q₋₁ + q̃/(8λ)
p[2:end-1,:], q[:,2:end-1] = ProjPair(p̃, q̃)
t₊₁ = (1 + √(1 + 4t^2))/2
p₋₁ = @. p + (t - 1)/t₊₁*(p - p₋₁)
q₋₁ = @. q + (t - 1)/t₊₁*(q - q₋₁)
t = t₊₁
end
return ProjConvex.(b .- λ.*LinearOp(p₋₁, q₋₁), lower_bound, upper_bound)
end
"""
FGP!(b, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso")
Same as `FGP`, but operates in-place on `b`.
"""
function FGP!(b::AbstractArray{T,2}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
b[:,:] = FGP(b, λ, N, lower_bound=lower_bound, upper_bound=upper_bound, TV=TV)
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 1417 | function FGP(b::AbstractArray{T,3}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
ProjPair = (TV=="iso") ? IsoProj : ((TV=="aniso") ? AnisoProj : throw(ArgumentError("Invalid keyword argument.")))
Ny, Nx, Nz = size(b)
p = zeros(eltype(b), Ny+1, Nx, Nz)
q = zeros(eltype(b), Ny, Nx+1, Nz)
r = zeros(eltype(b), Ny, Nx, Nz+1)
p₋₁ = zeros(eltype(b), Ny+1, Nx, Nz)
q₋₁ = zeros(eltype(b), Ny, Nx+1, Nz)
r₋₁ = zeros(eltype(b), Ny, Nx, Nz+1)
p̃ = zeros(eltype(b), Ny+1, Nx, Nz)
q̃ = zeros(eltype(b), Ny, Nx+1, Nz)
r̃ = zeros(eltype(b), Ny, Nx, Nz+1)
t = 1.
t₊₁ = 1.
for k ∈ 1:N
p̃[2:end-1,:,:], q̃[:,2:end-1,:], r̃[:,:,2:end-1] = AdjointOp(ProjConvex.(b .- λ.*LinearOp(p₋₁, q₋₁, r₋₁), lower_bound, upper_bound))
p̃ = @. p₋₁ + p̃/(12λ)
q̃ = @. q₋₁ + q̃/(12λ)
r̃ = @. r₋₁ + r̃/(12λ)
p[2:end-1,:,:], q[:,2:end-1,:], r[:,:,2:end-1] = ProjPair(p̃, q̃, r̃)
t₊₁ = (1 + √(1 + 4t^2))/2
p₋₁ = @. p + (t - 1)/t₊₁*(p - p₋₁)
q₋₁ = @. q + (t - 1)/t₊₁*(q - q₋₁)
r₋₁ = @. r + (t - 1)/t₊₁*(r - r₋₁)
t = t₊₁
end
return ProjConvex.(b .- λ.*LinearOp(p₋₁, q₋₁, r₋₁), lower_bound, upper_bound)
end
function FGP!(b::AbstractArray{T,3}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
b[:,:,:] = FGP(b, λ, N, lower_bound=lower_bound, upper_bound=upper_bound, TV=TV)
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 162 | module FastGradientProjection
include("operator.jl")
include("projection.jl")
include("GP2d.jl")
include("GP3d.jl")
include("FGP2d.jl")
include("FGP3d.jl")
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 1199 | export GP
export GP!
"""
GP(b, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso")
return denoised (volume) image `b`. Compared to `FGP`, convergence is slower, but memory usage is lower.
"""
function GP(b::AbstractArray{T,2}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
ProjPair = (TV=="iso") ? IsoProj : ((TV=="aniso") ? AnisoProj : throw(ArgumentError("Invalid keyword argument.")))
Ny, Nx = size(b)
p = zeros(eltype(b), Ny+1, Nx)
q = zeros(eltype(b), Ny, Nx+1)
p̃ = zeros(eltype(b), Ny+1, Nx)
q̃ = zeros(eltype(b), Ny, Nx+1)
for k ∈ 1:N
p̃[2:end-1,:], q̃[:,2:end-1] = AdjointOp(ProjConvex.(b .- λ.*LinearOp(p, q), lower_bound, upper_bound))
p̃ = @. p + p̃/(8λ)
q̃ = @. q + q̃/(8λ)
p[2:end-1,:], q[:,2:end-1] = ProjPair(p̃, q̃)
end
return ProjConvex.(b .- λ.*LinearOp(p, q), lower_bound, upper_bound)
end
"""
GP!(b, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso")
Same as `GP`, but operates in-place on `b`.
"""
function GP!(b::AbstractArray{T,2}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
b[:,:] = GP(b, λ, N, lower_bound=lower_bound, upper_bound=upper_bound, TV=TV)
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 1070 | function GP(b::AbstractArray{T,3}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
ProjPair = (TV=="iso") ? IsoProj : ((TV=="aniso") ? AnisoProj : throw(ArgumentError("Invalid keyword argument.")))
Ny, Nx, Nz = size(b)
p = zeros(eltype(b), Ny+1, Nx, Nz)
q = zeros(eltype(b), Ny, Nx+1, Nz)
r = zeros(eltype(b), Ny, Nx, Nz+1)
p̃ = zeros(eltype(b), Ny+1, Nx, Nz)
q̃ = zeros(eltype(b), Ny, Nx+1, Nz)
r̃ = zeros(eltype(b), Ny, Nx, Nz+1)
for k ∈ 1:N
p̃[2:end-1,:,:], q̃[:,2:end-1,:], r̃[:,:,2:end-1] = AdjointOp(ProjConvex.(b .- λ.*LinearOp(p, q, r), lower_bound, upper_bound))
p̃ = @. p + p̃/(12λ)
q̃ = @. q + q̃/(12λ)
r̃ = @. r + r̃/(12λ)
p[2:end-1,:,:], q[:,2:end-1,:], r[:,:,2:end-1] = ProjPair(p̃, q̃, r̃)
end
return ProjConvex.(b .- λ.*LinearOp(p, q, r), lower_bound, upper_bound)
end
function GP!(b::AbstractArray{T,3}, λ, N; lower_bound=-Inf, upper_bound=Inf, TV="iso") where {T}
b[:,:,:] = GP(b, λ, N, lower_bound=lower_bound, upper_bound=upper_bound, TV=TV)
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 840 | function LinearOp(p::AbstractArray{T,2}, q::AbstractArray{T,2}) where {T}
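    # Discrete divergence-type operator: L(p,q)[i,j] = p[i+1,j] - p[i,j] + q[i,j+1] - q[i,j]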
return @. (@view p[2:end,:]) + (@view q[:,2:end]) - (@view p[1:end-1,:]) - (@view q[:,1:end-1])
end
function LinearOp(p::AbstractArray{T,3}, q::AbstractArray{T,3}, r::AbstractArray{T,3}) where {T}
return @. (@view p[2:end,:,:]) + (@view q[:,2:end,:]) + (@view r[:,:,2:end]) - (@view p[1:end-1,:,:]) - (@view q[:,1:end-1,:]) - (@view r[:,:,1:end-1])
end
function AdjointOp(x::AbstractArray{T,2}) where {T}
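    # Adjoint of LinearOp: negative forward differences of x along each dimension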
return (@view x[1:end-1,:]) .- (@view x[2:end,:]),
(@view x[:,1:end-1]) .- (@view x[:,2:end])
end
function AdjointOp(x::AbstractArray{T,3}) where {T}
return (@view x[1:end-1,:,:]) .- (@view x[2:end,:,:]),
(@view x[:,1:end-1,:]) .- (@view x[:,2:end,:]),
(@view x[:,:,1:end-1]) .- (@view x[:,:,2:end])
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 1592 | function ProjConvex(x::T, l, u) where {T<:Real}
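    # Orthogonal projection of x onto the interval [l, u] (the box constraint C)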
if x < l
return l
elseif x > u
return u
else
return x
end
end
function ProjConvex(x::T, l, u) where {T<:Complex}
return x
end
function IsoProj(p::AbstractArray{T,2}, q::AbstractArray{T,2}) where {T}
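    # Pointwise projection of each dual pair (p, q) onto the unit Euclidean ball (isotropic TV)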
return (@. (@view p[2:end-1,:])/max(1, √(abs2(@view p[2:end-1,:]) + abs2(@view q[1:end-1,2:end])))),
(@. (@view q[:,2:end-1])/max(1, √(abs2(@view p[2:end,1:end-1]) + abs2(@view q[:,2:end-1]))))
end
function IsoProj(p::AbstractArray{T,3}, q::AbstractArray{T,3}, r::AbstractArray{T,3}) where {T}
return (@. (@view p[2:end-1,:,:])/max(1, √(abs2(@view p[2:end-1,:,:]) + abs2(@view q[1:end-1,2:end,:]) + abs2(@view r[1:end-1,:,2:end])))),
(@. (@view q[:,2:end-1,:])/max(1, √(abs2(@view p[2:end,1:end-1,:]) + abs2(@view q[:,2:end-1,:]) + abs2(@view r[:,1:end-1,2:end])))),
(@. (@view r[:,:,2:end-1])/max(1, √(abs2(@view p[2:end,:,1:end-1]) + abs2(@view q[:,2:end,1:end-1]) + abs2(@view r[:,:,2:end-1]))))
end
function AnisoProj(p::AbstractArray{T,2}, q::AbstractArray{T,2}) where {T}
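    # Clamp each dual component to magnitude ≤ 1, i.e. projection onto the unit l∞ ball (anisotropic TV)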
return (@. (@view p[2:end-1,:])/max(1, abs(@view p[2:end-1,:]))),
(@. (@view q[:,2:end-1])/max(1, abs(@view q[:,2:end-1])))
end
function AnisoProj(p::AbstractArray{T,3}, q::AbstractArray{T,3}, r::AbstractArray{T,3}) where {T}
return (@. (@view p[2:end-1,:,:])/max(1, abs(@view p[2:end-1,:,:]))),
(@. (@view q[:,2:end-1,:])/max(1, abs(@view q[:,2:end-1,:]))),
(@. (@view r[:,:,2:end-1])/max(1, abs(@view r[:,:,2:end-1])))
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | code | 117 | using FastGradientProjection
using Test
@testset "FastGradientProjection.jl" begin
# Write your tests here.
end
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | docs | 3753 | # FastGradientProjection
[](https://syoshida1983.github.io/FastGradientProjection.jl/stable/)
[](https://syoshida1983.github.io/FastGradientProjection.jl/dev/)
[](https://github.com/syoshida1983/FastGradientProjection.jl/actions/workflows/CI.yml?query=branch%3Amain)
This package provides functions for total variation (TV) denoising with gradient projection (GP) and fast gradient projection (FGP). GP/FGP denoise a (volume) image $\mathbf{b}$ based on the following minimization problem
$$\min_{\mathbf{x}\in{C}}\\|\mathbf{x} - \mathbf{b}\\|^{2}_{F} + 2\lambda\mathrm{TV}(\mathbf{x}),$$
where $C$ is a convex closed set, $\\|\cdot\\|_{F}$ is the Frobenius norm, $\lambda$ is the regularization parameter, and $\mathrm{TV}$ is the total variation defined by
$$\begin{align}
\mathrm{TV}\_{I}(\mathbf{x}) &=
\begin{cases}
\sum\_{i}\sum\_{j}\sqrt{\\left(x\_{i,j} - x\_{i+1,j}\\right)^{2} + \\left(x\_{i,j} - x\_{i,j+1}\\right)^{2}} & \text{(2D)}\\
\sum\_{i}\sum\_{j}\sum\_{k}\sqrt{\\left(x\_{i,j,k} - x\_{i+1,j,k}\\right)^{2} + \\left(x\_{i,j,k} - x\_{i,j+1,k}\\right)^{2} + \\left(x\_{i,j,k} - x\_{i,j,k+1}\\right)^{2}} & \text{(3D)}
\end{cases},\\
\mathrm{TV}\_{l\_{1}}(\mathbf{x}) &=
\begin{cases}
\sum\_{i}\sum\_{j}\\left\\{\\left|x\_{i,j} - x\_{i+1,j}\\right| + \\left|x\_{i,j} - x\_{i,j+1}\\right|\\right\\} & \text{(2D)}\\
\sum\_{i}\sum\_{j}\sum\_{k}\\left\\{\\left|x\_{i,j,k} - x\_{i+1,j,k}\\right| + \\left|x\_{i,j,k} - x\_{i,j+1,k}\\right| + \\left|x\_{i,j,k} - x\_{i,j,k+1}\\right|\\right\\} & \text{(3D)}
\end{cases}.
\end{align}$$
$\mathrm{TV}\_{I}$ and $\mathrm{TV}\_{l\_{1}}$ express isotropic and $l_{1}$-based anisotropic TV, respectively.
For more information on the algorithm, please refer to the following reference.
> [Amir Beck and Marc Teboulle, "Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems," IEEE Trans. Image Process. **18**, 2419-2434 (2009)](https://doi.org/10.1109/TIP.2009.2028250)
# Installation
To install this package, open the Julia REPL and run
```julia
julia> ]add FastGradientProjection
```
or
```julia
julia> using Pkg
julia> Pkg.add("FastGradientProjection")
```
# Usage
Import the package first.
```julia
julia> using FastGradientProjection
```
For example, perform the following to denoise (volume) image $\mathbf{b}$ with FGP.
```julia
julia> x = FGP(b, 0.1, 100, lower_bound=0.0, upper_bound=1.0)
```
The second and third arguments are the regularization parameter $\lambda$ and the number of iterations. The keyword arguments `lower_bound` and `upper_bound` specify the lower and upper bounds of the convex closed set $C$; their default values are `lower_bound=-Inf` and `upper_bound=Inf`. If $\mathbf{b}$ is a complex array, projection to $C$ is not performed. The function FGP performs denoising based on isotropic TV by default; to perform denoising based on anisotropic TV, specify the keyword argument `TV="aniso"`. Refer to the [documentation](https://syoshida1983.github.io/FastGradientProjection.jl/stable/) for further information.
<p align="center">
<img src="https://github.com/syoshida1983/FastGradientProjection.jl/blob/images/noised.jpg" width="250px">
  
<img src="https://github.com/syoshida1983/FastGradientProjection.jl/blob/images/denoised.jpg" width="250px">
<br>
noised image
          
denoised image
</p>
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MIT"
] | 0.1.0 | 6eccff516780c4eb8301d339809483189a98d93c | docs | 250 | ```@meta
CurrentModule = FastGradientProjection
```
# FastGradientProjection
Documentation for [FastGradientProjection](https://github.com/syoshida1983/FastGradientProjection.jl).
```@index
```
```@autodocs
Modules = [FastGradientProjection]
```
| FastGradientProjection | https://github.com/syoshida1983/FastGradientProjection.jl.git |
|
[
"MPL-2.0"
] | 0.3.0 | 152c36adb7e533d799699a4377c6cb693699f743 | code | 505 | using Taxsim
using Documenter
makedocs(;
modules=[Taxsim],
authors="Johannes Fleck <[email protected]>",
repo="https://github.com/jo-fleck/Taxsim.jl/blob/{commit}{path}#L{line}",
sitename="Taxsim.jl",
format=Documenter.HTML(;
prettyurls=get(ENV, "CI", "false") == "true",
canonical="https://jo-fleck.github.io/Taxsim.jl",
assets=String[],
),
pages=[
"Home" => "index.md",
],
)
deploydocs(;
repo="github.com/jo-fleck/Taxsim.jl",
)
| Taxsim | https://github.com/jo-fleck/Taxsim.jl.git |
|
[
"MPL-2.0"
] | 0.3.0 | 152c36adb7e533d799699a4377c6cb693699f743 | code | 103 | module Taxsim
using DataFrames
using CSV
using FTPClient
include("taxsim32.jl")
export taxsim32
end
| Taxsim | https://github.com/jo-fleck/Taxsim.jl.git |
|
[
"MPL-2.0"
] | 0.3.0 | 152c36adb7e533d799699a4377c6cb693699f743 | code | 10107 |
"""
Before using `taxsim32`, please make yourself familiar with [Internet TAXSIM 32](https://taxsim.nber.org/taxsim32/). Submit a few individual observations and upload an entire csv file.
#### Syntax
`taxsim32(df; kwargs...)`
- `df` has to be a DataFrame object with at least one observation.
- Included columns have to be named exactly as in the Internet TAXSIM 32 variable list (bold names after boxes) but can be in any order. `taxsim32` returns typos and case errors.
- Non-provided input variables are set to zero by the TAXSIM server but `" "` (blanks as strings) or `missing` lead to non-response as the server only accepts Integers or Floats. `taxsim32` returns type errors.
#### Keyword Arguments
- `connection`: choose either `"SSH"` or `"FTP"`. `"SSH"` issues a system curl command while `"FTP"` uses the [FTPClient Package](https://github.com/invenia/FTPClient.jl). Defaults to `"SSH"` (which is faster).
- `full`: request the full list of TAXSIM return variables v1 to v45. Defaults to `false` which returns v1 to v9.
- `long_names`: name all return variables with their long TAXSIM names (as opposed to abbreviated names for v1 to v9 and no names for v10 to v45). Defaults to `false`.
#### Output
- Data frame with requested TAXSIM return variables. Column types are either Integer or Float.
- If `df` does not include `state` or if `state = 0`, the data frame returned by a `full` request does not include v30 to v41.
- Observations are ordered as in `df` so `hcat(df, df_output, makeunique=true)` merges all variables of the input and output data frames.
### Examples
```julia-repl
using DataFrames, Taxsim
df_small_input = DataFrame(year=1980, mstat=2, ltcg=100000)
1×3 DataFrame
│ Row │ year │ mstat │ ltcg │
│ │ Int64 │ Int64 │ Int64 │
├─────┼───────┼───────┼────────┤
│ 1 │ 1980 │ 2 │ 100000 │
df_small_output_default = taxsim32(df_small_input)
1×9 DataFrame
│ Row │ taxsimid │ year │ state │ fiitax │ siitax │ fica │ frate │ srate │ ficar │
│ │ Float64 │ Int64 │ Int64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │
├─────┼──────────┼───────┼───────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤
│ 1 │ 0.0 │ 1980 │ 0 │ 10920.0 │ 0.0 │ 0.0 │ 20.0 │ 0.0 │ 12.0 │
df_small_output_full = taxsim32(df_small_input, connection="FTP", full=true)
1×29 DataFrame
│ Row │ taxsimid │ year │ state │ fiitax │ siitax │ fica │ frate │ srate │ ficar │ v10 │ v11 │ ... | v29 │ v42 │ ... | v45 │
│ │ Float64 │ Int64 │ Int64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ ... │ Float64 │ Float64 | ... | Float64 |
├─────┼──────────┼───────┼───────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────┼─────────┼─────────┼─────┼─────────┼
│ 1 │ 0.0 │ 1980 │ 0 │ 10920.0 │ 0.0 │ 0.0 │ 20.0 │ 0.0 │ 12.26 │ 40000.0 │ 0.0 │ ... | 0.0 │ 0.0 | ... | 0.0 |
df_small_output_names = taxsim32(df_small_input, long_names=true)
1×9 DataFrame
│ Row │ Case ID │ Year │ State │ Federal income tax liability including capital gains rates, surtaxes, AMT and refundable and non-refundable credits │ ... │ federal marginal rate │
│ │ Float64 │ Int64 │ Int64 │ Float64 │ ... │ Float64 │
├─────┼─────────┼───────┼───────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼─────────────────────────────┼─
│ 1 │ 0.0 │ 1980 │ 0 │ 10920.0 │ ... │ 20.0 │
N = 10000
df_small_stateN = DataFrame(year=repeat([1980],inner=N), mstat=repeat([2],inner=N), ltcg=repeat([100000],inner=N), state=repeat([1],inner=N))
df_small_stateN_out = taxsim32(df_small_stateN)
10000×9 DataFrame
Row │ taxsimid year state fiitax siitax fica frate srate ficar
│ Float64 Int64 Int64 Float64 Float64 Float64 Float64 Float64 Float64
───────┼──────────────────────────────────────────────────────────────────────────────
1 │ 1.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
2 │ 2.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
3 │ 3.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
4 │ 4.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
⋮ │ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮
9998 │ 9998.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
9999 │ 9999.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
10000 │ 10000.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
```
"""
function taxsim32(df_in; connection = "SSH", full = false, long_names = false, checks = true)
# Input checks
if checks == true
if typeof(df_in) != DataFrame error("Input must be a data frame") end
if isempty(df_in) == true error("Input data frame is empty") end
        TAXSIM32_vars = ["taxsimid","year","state","mstat","page","sage","depx","dep13","dep17","dep18","pwages","swages","dividends","intrec","stcg","ltcg","otherprop","nonprop","pensions","gssi","ui","transfers","rentpaid","proptax","otheritem","childcare","mortgage","scorp","pbusinc","pprofinc","sbusinc","sprofinc"];
for (i, input_var) in enumerate(names(df_in))
if (input_var in TAXSIM32_vars) == false error("Input contains \"" * input_var *"\" which is not an allowed TAXSIM 32 variable name") end
if any(ismissing.(df_in[!, i])) == true error("Input contains \"" * input_var *"\" with missing(s) which TAXSIM does not accept") end
            if (eltype(df_in[!, i]) == Int || eltype(df_in[!, i]) == Float64 || eltype(df_in[!, i]) == Float32 || eltype(df_in[!, i]) == Float16) == false error("Input contains \"" * input_var *"\" which is neither an Integer nor a Float variable as required by TAXSIM") end
end
else
end
df = deepcopy(df_in)
# Add taxsimid column if not included and specify result request
if sum(occursin.("taxsimid", names(df))) == 0 insertcols!(df, 1, :taxsimid => 1:size(df,1)) end
insertcols!(df, size(df,2)+1, :idtl => 0)
if full == true
if size(df,1) == 1
df[end, :idtl] = 12
else
df[:, :idtl] = 2*ones(Int64,size(df,1))
df[end, :idtl] = 12
end
else
df[end, :idtl] = 10
end
if connection == "SSH"
io_out = IOBuffer();
if Sys.isapple() == true || Sys.islinux() == true
try
run(pipeline(`ssh -T -o ConnectTimeout=10 -o StrictHostKeyChecking=no [email protected]`, stdin=seekstart(CSV.write(IOBuffer(), df)), stdout=io_out))
catch
@error "Cannot connect to the TAXSIM server via SSH -> try FTP and check your firewall settings"
end
end
if Sys.iswindows() == true
try
run(pipeline(`ssh -T -o ConnectTimeout=10 -o StrictHostKeyChecking=no [email protected]`, stdin=seekstart(CSV.write(IOBuffer(), df)), stdout=io_out))
catch
@error "Cannot connect to the TAXSIM server via SSH -> try FTP and check your firewall settings"
end
end
df_res = CSV.read(seekstart(io_out), DataFrame; silencewarnings=true)
else
ftp = FTP(hostname="taxsimftp.nber.org", username="taxsim", password="02138")
try
cd(ftp, "tmp")
catch
@error "Cannot connect to the TAXSIM server via FTP -> try SSH and check your firewall settings"
end
upload(ftp, CSV.write(IOBuffer(), df), "/userid")
df_res = CSV.read(seekstart(download(ftp, "/userid.txm32")), DataFrame; silencewarnings=true)
end
if long_names == true
ll_default = ["Case ID","Year","State","Federal income tax liability including capital gains rates, surtaxes, AMT and refundable and non-refundable credits","State income tax liability","FICA (OADSI and HI, sum of employee AND employer)","federal marginal rate","state marginal rate","FICA rate"];
ll_full = ["Federal AGI","UI in AGI","Social Security in AGI","Zero Bracket Amount","Personal Exemptions","Exemption Phaseout","Deduction Phaseout","Deductions Allowed (Zero for non-itemizers)","Federal Taxable Income","Tax on Taxable Income (no special capital gains rates)","Exemption Surtax","General Tax Credit","Child Tax Credit (as adjusted)","Additional Child Tax Credit (refundable)","Child Care Credit","Earned Income Credit (total federal)","Income for the Alternative Minimum Tax","AMT Liability after credit for regular tax and other allowed credits","Federal Income Tax Before Credits (includes special treatment of Capital gains, exemption surtax (1988-1996) and 15% rate phaseout (1988-1990) but not AMT)","FICA"];
ll_state = ["State Household Income (imputation for property tax credit)","State Rent Expense (imputation for property tax credit)","State AGI","State Exemption amount","State Standard Deduction","State Itemized Deductions","State Taxable Income","State Property Tax Credit","State Child Care Credit","State EIC","State Total Credits","State Bracket Rate","Earned Self-Employment Income for FICA","Medicare Tax on Unearned Income","Medicare Tax on Earned Income","CARES act Recovery Rebates"];
if full == false
rename!(df_res, ll_default)
else
rename!(df_res, [ll_default; ll_full; ll_state])
end
end
if full == true && (sum(occursin.("state", names(df_in))) == 0 || (sum(occursin.("state", names(df_in))) == 1 && df_in[1, :state] == 0)) select!(df_res, Not(names(df_res)[30:41])) end # Drop v30 to v41 if no state or state == 0 in df_in
return df_res
end
| Taxsim | https://github.com/jo-fleck/Taxsim.jl.git |
|
[
"MPL-2.0"
] | 0.3.0 | 152c36adb7e533d799699a4377c6cb693699f743 | code | 5203 | using Taxsim
using Test
using DataFrames
# Input tests
array = Array{Int64,2}(undef, 2, 3)
df_empty = DataFrame(year=[], mstat=[], ltcg=[])
df_faulty_name = DataFrame(yyear=1980, mstat=2, ltcg=100000)
df_missing = DataFrame(year=1980, mstat=2, ltcg=missing)
df_string = DataFrame(year=1980, mstat="married", ltcg=100000)
@testset "Inputs" begin
@test_throws ErrorException("Input must be a data frame") taxsim32(array)
@test_throws ErrorException("Input data frame is empty") taxsim32(df_empty)
@test_throws ErrorException("Input contains \"yyear\" which is not an allowed TAXSIM 32 variable name") taxsim32(df_faulty_name)
@test_throws ErrorException("Input contains \"ltcg\" with missing(s) which TAXSIM does not accept") taxsim32(df_missing)
@test_throws ErrorException("Input contains \"mstat\" which is a neiter an Integer nor a Float variable as required by TAXSIM") taxsim32(df_string)
end
# Connection tests
df_small = DataFrame(year=1980, mstat=2, ltcg=100000)
@testset "SSH connection" begin
df_default_out_ssh = taxsim32(df_small)
@test typeof(df_default_out_ssh) == DataFrame
end
@testset "FTP connection" begin
df_default_out_ftp = taxsim32(df_small, connection = "FTP")
@test typeof(df_default_out_ftp) == DataFrame
end
# Output tests
df_small_state = DataFrame(year=1980, mstat=2, pwages=0, ltcg=100000, state=1)
@testset "1 filer output" begin
df_small_out = taxsim32(df_small)
@test df_small_out.fiitax[1] == 10920.0
@test df_small_out.frate[1] == 20.0
@test df_small_out.ficar[1] == 12.0
@test size(df_small_out,2) == 9
df_small_full_out = taxsim32(df_small, full = true)
@test size(df_small_full_out,2) == 33
df_small_state_out = taxsim32(df_small_state)
@test df_small_state_out.fiitax[1] == 10920.0
@test df_small_state_out.frate[1] == 20.0
@test df_small_state_out.siitax[1] == 1119.0
@test df_small_state_out.srate[1] == 4.0
@test size(df_small_state_out,2) == 9
df_small_state_full_out = taxsim32(df_small_state, full = true)
@test size(df_small_state_full_out,2) == 45
end
df_small_state2 = DataFrame(year=[1980, 1981], mstat=[2,1], pwages=[0,100000], ltcg=[100000,0], state=[1,5])
@testset "2 filer output" begin
df_small_state2_out = taxsim32(df_small_state2)
@test size(df_small_state2_out,2) == 9
@test size(df_small_state2_out,1) == 2
df_small_state2_full_out = taxsim32(df_small_state2, full = true)
@test df_small_state2_full_out.fiitax[1] == 10920.0
@test df_small_state2_full_out.siitax[1] == 1119.0
@test df_small_state2_full_out.fiitax[2] == 38344.85
@test df_small_state2_full_out.siitax[2] == 9559.5
@test size(df_small_state2_full_out,2) == 45
end
N = 100
df_small_stateN = DataFrame(year=repeat([1980],inner=N), mstat=repeat([2],inner=N), ltcg=repeat([100000],inner=N), state=repeat([1],inner=N))
@testset "N filer output: SSH connection" begin
df_small_stateN_full_out_ssh = taxsim32(df_small_stateN, full = true)
@test df_small_stateN_full_out_ssh.fiitax[N] == 10920.0
@test df_small_stateN_full_out_ssh.siitax[N] == 1119.0
@test size(df_small_stateN_full_out_ssh,2) == 45
end
@testset "N filer output: FTP connection" begin
df_small_stateN_full_out_ftp = taxsim32(df_small_stateN, connection = "FTP", full = true)
@test df_small_stateN_full_out_ftp.fiitax[N] == 10920.0
@test df_small_stateN_full_out_ftp.siitax[N] == 1119.0
@test size(df_small_stateN_full_out_ftp,2) == 45
end
# Long names tests
@testset "Long names" begin
df_small_out_long_names = taxsim32(df_small, long_names = true)
@test names(df_small_out_long_names)[9] == "FICA rate"
df_small_full_out_long_names = taxsim32(df_small, full = true, long_names = true)
@test names(df_small_full_out_long_names)[29] == "FICA"
@test names(df_small_full_out_long_names)[33] == "CARES act Recovery Rebates"
df_small_state_out_long_names = taxsim32(df_small_state, long_names = true)
@test names(df_small_state_out_long_names)[9] == "FICA rate"
df_small_state_full_out_long_names = taxsim32(df_small_state, full = true, long_names = true)
@test names(df_small_state_full_out_long_names)[45] == "CARES act Recovery Rebates"
end
# # Performance Tests:
#
# # 1 filer
#
# @timev taxsim32(df_small);
# @timev taxsim32(df_small, connection = "FTP");
#
# # N filer
#
# N_perf = 100000;
# df_small_stateN_perf = DataFrame(year=repeat([1980],inner=N_perf), mstat=repeat([2],inner=N_perf), ltcg=repeat([100000],inner=N_perf), state=repeat([1],inner=N_perf));
#
# @timev taxsim32(df_small_stateN_perf, full = true);
# @timev taxsim32(df_small_stateN_perf, connection = "FTP", full = true);
# @timev taxsim32(df_small_stateN_perf, full = true, checks = false);
#
# @code_native taxsim32(df_small_stateN_perf, full = true);
# @code_native taxsim32(df_small_stateN_perf, connection = "FTP", full = true);
# @code_native taxsim32(df_small_stateN_perf, full = true, checks = false);
#
# @profiler taxsim32(df_small_stateN_perf, full = true);
# @profiler taxsim32(df_small_stateN_perf, connection = "FTP", full = true);
# @profiler taxsim32(df_small_stateN_perf, full = true, checks = false);
| Taxsim | https://github.com/jo-fleck/Taxsim.jl.git |
|
[
"MPL-2.0"
] | 0.3.0 | 152c36adb7e533d799699a4377c6cb693699f743 | docs | 7784 |
[](https://github.com/jo-fleck/Taxsim.jl/actions)
[](https://codecov.io/gh/jo-fleck/Taxsim.jl)
<!-- [](https://jo-fleck.github.io/Taxsim.jl/stable)
[](https://jo-fleck.github.io/Taxsim.jl/dev) -->
# Taxsim.jl
[TAXSIM](https://taxsim.nber.org) is a program of the National Bureau of Economic Research (NBER) which calculates liabilities under US federal and state income tax laws. It can be accessed by uploading tax filer information to the NBER's TAXSIM server. The program then computes a number of variables (income taxes, tax credits, etc.) and returns them.
`Taxsim.jl` exchanges data between the Julia workspace and the server. Its function `taxsim32` supports the latest [TAXSIM version 32](https://taxsim.nber.org/taxsim32/). Future versions will be included.
#### Acknowledgments
Daniel Feenberg develops and maintains TAXSIM. He and his collaborators provide [helpful materials](http://users.nber.org/~taxsim/) including codes to prepare input files from household datasets (CPS, SCF, PSID).
Reach out to Daniel with questions on TAXSIM and follow his request on citation (see bottom of this [webpage](https://taxsim.nber.org/taxsim32/)).
### Installation and Instructions
`Taxsim.jl` can be installed via Julia's package manager using one of two options:
- REPL: `] add Taxsim`
- Pkg functions: `using Pkg; Pkg.add("Taxsim")`
Before using `taxsim32`, please make yourself familiar with [Internet TAXSIM 32](https://taxsim.nber.org/taxsim32/). Submit a few individual observations and upload an entire csv file.
#### Syntax
`taxsim32(df; kwargs...)`
- `df` has to be a DataFrame object with at least one observation.
- Included columns have to be named exactly as in the Internet TAXSIM 32 variable list (bold names after boxes) but can be in any order. `taxsim32` returns typos and case errors.
- Non-provided input variables are set to zero by the TAXSIM server but `" "` (blanks as strings) or `missing` lead to non-response as the server only accepts Integers or Floats. `taxsim32` returns type errors.
#### Keyword Arguments
- `connection`: choose either `"SSH"` or `"FTP"`. `"SSH"` issues a system curl command while `"FTP"` uses the [FTPClient Package](https://github.com/invenia/FTPClient.jl). Defaults to `"SSH"` (which is faster).
- `full`: request the full list of TAXSIM return variables v1 to v45. Defaults to `false` which returns v1 to v9.
- `long_names`: name all return variables with their long TAXSIM names. Defaults to `false` which returns abbreviated names for v1 to v9 and no names for v10 to v45.
#### Output
- Data frame with requested TAXSIM return variables. Column types are either Integer or Float.
- If `df` does not include `state` or if `state = 0`, the data frame returned by a `full` request does not include v30 to v41.
- Observations are ordered as in `df` so `hcat(df, df_output, makeunique=true)` merges all variables of the input and output data frames.
### Examples
````
using DataFrames, Taxsim
df_small_input = DataFrame(year=1980, mstat=2, ltcg=100000)
1×3 DataFrame
│ Row │ year │ mstat │ ltcg │
│ │ Int64 │ Int64 │ Int64 │
├─────┼───────┼───────┼────────┤
│ 1 │ 1980 │ 2 │ 100000 │
df_small_output_default = taxsim32(df_small_input)
1×9 DataFrame
│ Row │ taxsimid │ year │ state │ fiitax │ siitax │ fica │ frate │ srate │ ficar │
│ │ Float64 │ Int64 │ Int64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │
├─────┼──────────┼───────┼───────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤
│ 1 │ 0.0 │ 1980 │ 0 │ 10920.0 │ 0.0 │ 0.0 │ 20.0 │ 0.0 │ 12.0 │
df_small_output_full = taxsim32(df_small_input, connection="FTP", full=true)
1×29 DataFrame
│ Row │ taxsimid │ year │ state │ fiitax │ siitax │ fica │ frate │ srate │ ficar │ v10 │ v11 │ ... | v29 │ v42 │ ... | v45 │
│ │ Float64 │ Int64 │ Int64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ Float64 │ ... │ Float64 │ Float64 | ... | Float64 |
├─────┼──────────┼───────┼───────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────┼─────────┼─────────┼─────┼─────────┼
│ 1 │ 0.0 │ 1980 │ 0 │ 10920.0 │ 0.0 │ 0.0 │ 20.0 │ 0.0 │ 12.26 │ 40000.0 │ 0.0 │ ... | 0.0 │ 0.0 | ... | 0.0 |
df_small_output_names = taxsim32(df_small_input, long_names=true)
1×9 DataFrame
│ Row │ Case ID │ Year │ State │ Federal income tax liability including capital gains rates, surtaxes, AMT and refundable and non-refundable credits │ ... │ federal marginal rate │
│ │ Float64 │ Int64 │ Int64 │ Float64 │ ... │ Float64 │
├─────┼─────────┼───────┼───────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼─────────────────────────────┼─
│ 1 │ 0.0 │ 1980 │ 0 │ 10920.0 │ ... │ 20.0 │
N = 10000
df_small_stateN = DataFrame(year=repeat([1980],inner=N), mstat=repeat([2],inner=N), ltcg=repeat([100000],inner=N), state=repeat([1],inner=N))
df_small_stateN_out = taxsim32(df_small_stateN)
10000×9 DataFrame
Row │ taxsimid year state fiitax siitax fica frate srate ficar
│ Float64 Int64 Int64 Float64 Float64 Float64 Float64 Float64 Float64
───────┼──────────────────────────────────────────────────────────────────────────────
1 │ 1.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
2 │ 2.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
3 │ 3.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
4 │ 4.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
⋮ │ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮
9998 │ 9998.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
9999 │ 9999.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
10000 │ 10000.0 1980 1 10920.0 1119.0 0.0 20.0 4.0 12.0
````
### Troubleshooting
Expect three different kinds of errors
1. **Input Error** Adjust `df` so it meets the required column types and names.
2. **Connection Error** Indicates that `taxsim32` cannot connect to the TAXSIM server. Try a different connection option. If this does not help, check your internet and network settings and contact your network administrator - you're probably behind a restrictive firewall.
3. **Server Error** Returned from the TAXSIM server (error message begins with "TAXSIM: ... "). Either a faulty `df` passed the input tests or TAXSIM cannot compute the tax variables for some other reason which the error message hopefully helps to identify. Example: "TAXSIM: Non-joint return with 2 wage-earners"
Please file an issue if you experience problems with large input data frames (server non-response, truncated return data frames, etc).
### Scheduled Updates
- `taxsim32` currently returns marginal tax rates computed with respect to taxpayer earnings. Marginal rates for "Wage Income", "Spouse Earning", etc. will be included as keyword options in future releases.
- HTTP connection will be included as another connection option in future releases.
| Taxsim | https://github.com/jo-fleck/Taxsim.jl.git |
|
[
"MPL-2.0"
] | 0.3.0 | 152c36adb7e533d799699a4377c6cb693699f743 | docs | 98 | ```@meta
CurrentModule = Taxsim
```
# Taxsim
```@index
```
```@autodocs
Modules = [Taxsim]
```
| Taxsim | https://github.com/jo-fleck/Taxsim.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 1534 | """
JuliaFormatterTool
Implements a script that runs a simple loop that runs `JuliaFormatter.format` on
every iteration after the user presses enter.
"""
module JuliaFormatterTool
export run_formatter_loop
using JuliaFormatter
# This tool lives in <repo>/contrib/format/, format <repo> by default
const default_target_dir = joinpath(dirname(dirname(@__DIR__)))
"""
run_formatter_loop(target_dir = $default_target_dir)
This function runs a simple loop that runs `JuliaFormatter.format` on
every iteration, every time the user presses enter.
"""
function run_formatter_loop(target_dir::AbstractString=default_target_dir)
printstyled("Welcome to Julia Formatter Tool!\n"; color=:cyan, bold=true)
printstyled("--------------------------------\n"; color=:cyan, bold=true)
let running::Bool = true
while running
println()
printstyled(
"Press enter to format the directory $target_dir or `q[enter]` to quit\n";
color=:light_cyan
)
printstyled("format.jl> "; color=:green, bold=true)
input = readline()
running = input != "q" && input != "quit" && !startswith(input, "exit")
if running
println("Applying JuliaFormatter...")
@info "Is the current directory formatted?" target_dir format(target_dir)
end
end
println("Thank you for formatting HDF5.jl. Have a nice day!")
end
return nothing
end
end # module JuliaFormatterTool
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 607 | #!/bin/env julia
#
# Runs JuliaFormatter on the repository
# Invoke this script directly, `./contrib/format/format.jl`
# or via `julia --project=contrib/format contrib/format/format.jl`
# Install the project if not the current project environment
if Base.active_project() != joinpath(@__DIR__, "Project.toml")
using Pkg
Pkg.activate(@__DIR__)
Pkg.resolve()
Pkg.instantiate()
end
include("JuliaFormatterTool.jl")
using .JuliaFormatterTool
if abspath(PROGRAM_FILE) == @__FILE__
if length(ARGS) == 0
run_formatter_loop()
else
run_formatter_loop(ARGS[1])
end
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 1217 | using Documenter
using HDF5
using H5Zblosc
using H5Zbzip2
using H5Zlz4
using H5Zzstd
using MPI # needed to generate docs for parallel HDF5 API
DocMeta.setdocmeta!(HDF5, :DocTestSetup, :(using HDF5); recursive=true)
makedocs(;
sitename="HDF5.jl",
modules=[HDF5, H5Zblosc, H5Zbzip2, H5Zlz4, H5Zzstd],
authors="Mustafa Mohamad <[email protected]> and contributors",
format=Documenter.HTML(;
prettyurls=get(ENV, "CI", "false") == "true",
canonical="https://JuliaIO.github.io/HDF5.jl",
assets=String[],
sidebar_sitename=false
),
strict=true,
pages=[
"Home" => "index.md",
"Interface" => [
"interface/configuration.md",
"interface/files.md",
"interface/groups.md",
"interface/dataspaces.md",
"interface/dataset.md",
"interface/attributes.md",
"interface/properties.md",
"interface/filters.md",
"interface/objects.md",
],
"mpi.md",
"Low-level library bindings" => "api_bindings.md",
"Additional Resources" => "resources.md",
]
)
deploydocs(; repo="github.com/JuliaIO/HDF5.jl.git", push_preview=true)
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 2921 | module MPIExt
isdefined(Base, :get_extension) ? (using MPI) : (using ..MPI)
import Libdl
using HDF5: HDF5, API, Drivers, Drivers.Driver, Properties, h5doc, h5open
###
### MPIO
###
# define API functions here
function API.h5p_set_fapl_mpio(fapl_id, comm, info)
API.lock(API.liblock)
var"#status#" = try
ccall(
(:H5Pset_fapl_mpio, API.libhdf5),
API.herr_t,
(API.hid_t, MPI.MPI_Comm, MPI.MPI_Info),
fapl_id,
comm,
info
)
finally
API.unlock(API.liblock)
end
var"#status#" < 0 && API.@h5error("Error setting MPIO properties")
return nothing
end
function API.h5p_get_fapl_mpio(fapl_id, comm, info)
API.lock(API.liblock)
var"#status#" = try
ccall(
(:H5Pget_fapl_mpio, API.libhdf5),
API.herr_t,
(API.hid_t, Ptr{MPI.MPI_Comm}, Ptr{MPI.MPI_Info}),
fapl_id,
comm,
info
)
finally
API.unlock(API.liblock)
end
var"#status#" < 0 && API.@h5error("Error getting MPIO properties")
return nothing
end
# The docstring for `MPIO` is included in the function `MPIO` in
# src/drivers/drivers.jl.
struct MPIO <: Driver
comm::MPI.Comm
info::MPI.Info
end
Drivers.MPIO(comm::MPI.Comm, info::MPI.Info) = MPIO(comm, info)
Drivers.MPIO(comm::MPI.Comm; kwargs...) = MPIO(comm, MPI.Info(; kwargs...))
function Drivers.set_driver!(fapl::Properties, mpio::MPIO)
HDF5.has_parallel() || error(
"HDF5.jl has no parallel support." *
" Make sure that you're using MPI-enabled HDF5 libraries, and that" *
" MPI was loaded before HDF5." *
" See HDF5.jl docs for details."
)
# Note: HDF5 creates a COPY of the comm and info objects, so we don't need to keep a reference around.
API.h5p_set_fapl_mpio(fapl, mpio.comm, mpio.info)
Drivers.DRIVERS[API.h5p_get_driver(fapl)] = MPIO
return nothing
end
function Drivers.get_driver(fapl::Properties, ::Type{MPIO})
comm = MPI.Comm()
info = MPI.Info()
API.h5p_get_fapl_mpio(fapl, comm, info)
return MPIO(comm, info)
end
"""
h5open(filename, [mode="r"], comm::MPI.Comm, [info::MPI.Info]; pv...)
Open or create a parallel HDF5 file using the MPI-IO driver.
Equivalent to `h5open(filename, mode; fapl_mpio=(comm, info), pv...)`.
Throws an informative error if the loaded HDF5 libraries do not include parallel
support.
See the [HDF5 docs](https://portal.hdfgroup.org/display/HDF5/H5P_SET_FAPL_MPIO)
for details on the `comm` and `info` arguments.
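# Example
A minimal usage sketch (assumes an MPI-enabled HDF5 library and that MPI has been initialized):
```julia
using MPI, HDF5

MPI.Init()
f = h5open("pdata.h5", "w", MPI.COMM_WORLD)
close(f)
```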
"""
function HDF5.h5open(
filename::AbstractString,
mode::AbstractString,
comm::MPI.Comm,
info::MPI.Info=MPI.Info();
pv...
)
HDF5.h5open(filename, mode; driver=MPIO(comm, info), pv...)
end
HDF5.h5open(filename::AbstractString, comm::MPI.Comm, args...; pv...) =
HDF5.h5open(filename, "r", comm, args...; pv...)
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 11633 | #==
Julia code wrapping the bitshuffle filter for HDF5. A rough translation of
bshuf_h5filter.c by Kiyoshi Masui, see
https://github.com/kiyo-masui/bitshuffle.
==#
"""
The bitshuffle filter for HDF5. See https://portal.hdfgroup.org/display/support/Filters#Filters-32008
and https://github.com/kiyo-masui/bitshuffle for details.
"""
module H5Zbitshuffle
using bitshuffle_jll
using HDF5.API
import HDF5.Filters:
Filter,
filterid,
register_filter,
filtername,
filter_func,
filter_cfunc,
set_local_func,
set_local_cfunc
export BSHUF_H5_COMPRESS_LZ4,
BSHUF_H5_COMPRESS_ZSTD, BitshuffleFilter, H5Z_filter_bitshuffle
# From bshuf_h5filter.h
const BSHUF_H5_COMPRESS_LZ4 = 2
const BSHUF_H5_COMPRESS_ZSTD = 3
const H5Z_FILTER_BITSHUFFLE = API.H5Z_filter_t(32008)
const BSHUF_VERSION_MAJOR = 0
const BSHUF_VERSION_MINOR = 4
const BSHUF_VERSION_POINT = 2
const bitshuffle_name = "HDF5 bitshuffle filter; see https://github.com/kiyo-masui/bitshuffle"
# Set filter arguments
function bitshuffle_set_local(dcpl::API.hid_t, htype::API.hid_t, space::API.hid_t)
# Sanity check of provided values and set element size
bs_flags = Ref{Cuint}()
bs_values = Vector{Cuint}(undef, 8)
bs_nelements = Ref{Csize_t}(length(bs_values))
API.h5p_get_filter_by_id(
dcpl, H5Z_FILTER_BITSHUFFLE, bs_flags, bs_nelements, bs_values, 0, C_NULL, C_NULL
)
@debug "Initial filter info" bs_flags bs_values bs_nelements
flags = bs_flags[]
# set values
bs_values[1] = BSHUF_VERSION_MAJOR
bs_values[2] = BSHUF_VERSION_MINOR
elem_size = API.h5t_get_size(htype)
@debug "Element size for $htype reported as $elem_size"
if elem_size <= 0
return API.herr_t(-1)
end
bs_values[3] = elem_size
nelements = bs_nelements[]
# check user-supplied values
if nelements > 3
if bs_values[4] % 8 != 0 || bs_values[4] < 0
return API.herr_t(-1)
end
end
if nelements > 4
if !(bs_values[5] in (0, BSHUF_H5_COMPRESS_LZ4, BSHUF_H5_COMPRESS_ZSTD))
return API.herr_t(-1)
end
end
@debug "Final values" bs_values
API.h5p_modify_filter(dcpl, H5Z_FILTER_BITSHUFFLE, bs_flags[], nelements, bs_values)
return API.herr_t(1)
end
function H5Z_filter_bitshuffle(
flags::Cuint,
cd_nelmts::Csize_t,
cd_values::Ptr{Cuint},
nbytes::Csize_t,
buf_size::Ptr{Csize_t},
buf::Ptr{Ptr{Cvoid}}
)::Csize_t
in_buf = unsafe_load(buf) #in_buf is *void
out_buf = C_NULL
nbytes_out = 0
block_size = 0
try #mop up errors at end
@debug "nelmts" cd_nelmts
if cd_nelmts < 3
error("bitshuffle_h5plugin: Not enough elements provided to bitshuffle filter")
end
# Get needed information
major = unsafe_load(cd_values, 1)
minor = unsafe_load(cd_values, 2)
elem_size = unsafe_load(cd_values, 3)
comp_lvl = unsafe_load(cd_values, 6)
compress_flag = unsafe_load(cd_values, 5)
if cd_nelmts > 3
block_size = unsafe_load(cd_values, 4)
end
@debug "Major,minor:" major minor
@debug "element size, compress_level, compress_flag" elem_size comp_lvl compress_flag
if block_size == 0
block_size = ccall(
(:bshuf_default_block_size, libbitshuffle), Csize_t, (Csize_t,), elem_size
)
end
# Work out buffer sizes
if cd_nelmts > 4 &&
(compress_flag in (BSHUF_H5_COMPRESS_LZ4, BSHUF_H5_COMPRESS_ZSTD))
# Use compression
if (flags & API.H5Z_FLAG_REVERSE) != 0 # unshuffle and decompress
# First 8 bytes is number of uncompressed bytes
nbytes_uncomp = ccall(
(:bshuf_read_uint64_BE, libbitshuffle), UInt64, (Ptr{Cvoid},), in_buf
)
# Next 4 bytes are the block size
block_size =
ccall(
(:bshuf_read_uint32_BE, libbitshuffle),
UInt32,
(Ptr{Cvoid},),
in_buf + 8
) ÷ elem_size
in_buf += 12
buf_size_out = nbytes_uncomp
else #shuffle and compress
nbytes_uncomp = nbytes
if compress_flag == BSHUF_H5_COMPRESS_LZ4
buf_size_out =
ccall(
(:bshuf_compress_lz4_bound, libbitshuffle),
Csize_t,
(Csize_t, Csize_t, Csize_t),
nbytes_uncomp ÷ elem_size,
elem_size,
block_size
) + 12
elseif compress_flag == BSHUF_H5_COMPRESS_ZSTD
buf_size_out =
ccall(
(:bshuf_compress_zstd_bound, libbitshuffle),
Csize_t,
(Csize_t, Csize_t, Csize_t),
nbytes_uncomp ÷ elem_size,
elem_size,
block_size
) + 12
end
end
else # No compression required
nbytes_uncomp = nbytes
buf_size_out = nbytes
end
if nbytes_uncomp % elem_size != 0
error(
"bitshuffle_h5plugin: Uncompressed size $nbytes_uncomp is not a multiple of $elem_size"
)
end
size = nbytes_uncomp ÷ elem_size
out_buf = Libc.malloc(buf_size_out)
if out_buf == C_NULL
error(
"bitshuffle_h5plugin: Cannot allocate memory for outbuf during decompression"
)
end
# Now perform the decompression
if cd_nelmts > 4 &&
(compress_flag in (BSHUF_H5_COMPRESS_LZ4, BSHUF_H5_COMPRESS_ZSTD))
if flags & API.H5Z_FLAG_REVERSE != 0 #unshuffle and decompress
if compress_flag == BSHUF_H5_COMPRESS_LZ4
err = ccall(
(:bshuf_decompress_lz4, libbitshuffle),
Int64,
(Ptr{Cvoid}, Ptr{Cvoid}, Csize_t, Csize_t, Csize_t),
in_buf,
out_buf,
size,
elem_size,
block_size
)
elseif compress_flag == BSHUF_H5_COMPRESS_ZSTD
err = ccall(
(:bshuf_decompress_zstd, libbitshuffle),
Int64,
(Ptr{Cvoid}, Ptr{Cvoid}, Csize_t, Csize_t, Csize_t),
in_buf,
out_buf,
size,
elem_size,
block_size
)
end
nbytes_out = nbytes_uncomp
else #shuffle and compress
ccall(
(:bshuf_write_uint64_BE, libbitshuffle),
Cvoid,
(Ptr{Cvoid}, UInt64),
out_buf,
nbytes_uncomp
)
ccall(
(:bshuf_write_uint32_BE, libbitshuffle),
Cvoid,
(Ptr{Cvoid}, UInt32),
out_buf + 8,
block_size * elem_size
)
if compress_flag == BSHUF_H5_COMPRESS_LZ4
err = ccall(
(:bshuf_compress_lz4, libbitshuffle),
Int64,
(Ptr{Cvoid}, Ptr{Cvoid}, Csize_t, Csize_t, Csize_t),
in_buf,
out_buf + 12,
size,
elem_size,
block_size
)
else
err = ccall(
(:bshuf_compress_zstd, libbitshuffle),
Int64,
(Ptr{Cvoid}, Ptr{Cvoid}, Csize_t, Csize_t, Csize_t),
in_buf,
out_buf + 12,
size,
elem_size,
block_size
)
end
nbytes_out = err + 12
end
else # just the shuffle thanks
if flags & API.H5Z_FLAG_REVERSE != 0
err = ccall(
(:bshuf_bitunshuffle, libbitshuffle),
Int64,
(Ptr{Cvoid}, Ptr{Cvoid}, Csize_t, Csize_t, Csize_t),
in_buf,
out_buf,
size,
elem_size,
block_size
)
else
err = ccall(
(:bshuf_bitshuffle, libbitshuffle),
Int64,
(Ptr{Cvoid}, Ptr{Cvoid}, Csize_t, Csize_t, Csize_t),
in_buf,
out_buf,
size,
elem_size,
block_size
)
end
nbytes_out = nbytes
end
# And wrap it up
if err < 0
error("h5plugin_bitshuffle: Error in bitshuffle with code $err")
end
Libc.free(unsafe_load(buf))
unsafe_store!(buf, out_buf)
unsafe_store!(buf_size, Csize_t(buf_size_out))
out_buf = C_NULL
catch e
# On failure, return 0 and change no arguments
nbytes_out = Csize_t(0)
@error "Non-fatal H5 bitshuffle plugin error: " e
display(stacktrace(catch_backtrace()))
finally
if out_buf != C_NULL
Libc.free(out_buf)
end
end
return Csize_t(nbytes_out)
end
# Filter registration
# All information for the filter
struct BitshuffleFilter <: Filter
major::Cuint
minor::Cuint
typesize::Cuint
blocksize::Cuint
compression::Cuint
comp_level::Cuint #Zstd only
end
"""
BitshuffleFilter(blocksize=0,compressor=:none,comp_level=0)
The Bitshuffle filter can optionally include compression :lz4 or :zstd. For :zstd
comp_level can be provided. This is ignored for :lz4 compression. If `blocksize`
is zero the default bitshuffle blocksize is used.
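# Example
A construction sketch (parameter values are illustrative):
```julia
BitshuffleFilter(; compressor=:zstd, comp_level=3)
```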
"""
function BitshuffleFilter(; blocksize=0, compressor=:none, comp_level=0)
compressor in (:lz4, :zstd, :none) ||
throw(ArgumentError("Invalid bitshuffle compression $compressor"))
compcode = 0
if compressor == :lz4
compcode = BSHUF_H5_COMPRESS_LZ4
elseif compressor == :zstd
compcode = BSHUF_H5_COMPRESS_ZSTD
end
BitshuffleFilter(
BSHUF_VERSION_MAJOR, BSHUF_VERSION_MINOR, 0, blocksize, compcode, comp_level
)
end
filterid(::Type{BitshuffleFilter}) = H5Z_FILTER_BITSHUFFLE
filtername(::Type{BitshuffleFilter}) = bitshuffle_name
set_local_func(::Type{BitshuffleFilter}) = bitshuffle_set_local
set_local_cfunc(::Type{BitshuffleFilter}) =
@cfunction(bitshuffle_set_local, API.herr_t, (API.hid_t, API.hid_t, API.hid_t))
filter_func(::Type{BitshuffleFilter}) = H5Z_filter_bitshuffle
filter_cfunc(::Type{BitshuffleFilter}) = @cfunction(
H5Z_filter_bitshuffle,
Csize_t,
(Cuint, Csize_t, Ptr{Cuint}, Csize_t, Ptr{Csize_t}, Ptr{Ptr{Cvoid}})
)
function __init__()
register_filter(BitshuffleFilter)
end
end # module
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 7292 | module H5Zblosc
# port of https://github.com/Blosc/c-blosc/blob/3a668dcc9f61ad22b5c0a0ab45fe8dad387277fd/hdf5/blosc_filter.c (copyright 2010 Francesc Alted, license: MIT/expat)
import Blosc
using HDF5.API
import HDF5.Filters: Filter, FilterPipeline
import HDF5.Filters:
filterid,
register_filter,
filtername,
filter_func,
filter_cfunc,
set_local_func,
set_local_cfunc
import HDF5.Filters.Shuffle
export H5Z_FILTER_BLOSC, blosc_filter, BloscFilter
# Import Blosc shuffle constants
import Blosc: NOSHUFFLE, SHUFFLE, BITSHUFFLE
const H5Z_FILTER_BLOSC = API.H5Z_filter_t(32001) # Filter ID registered with the HDF Group for Blosc
const FILTER_BLOSC_VERSION = 2
const blosc_name = "blosc"
function blosc_set_local(dcpl::API.hid_t, htype::API.hid_t, space::API.hid_t)
blosc_flags = Ref{Cuint}()
blosc_values = Vector{Cuint}(undef, 8)
blosc_nelements = Ref{Csize_t}(length(blosc_values))
blosc_chunkdims = Vector{API.hsize_t}(undef, 32)
API.h5p_get_filter_by_id(
dcpl,
H5Z_FILTER_BLOSC,
blosc_flags,
blosc_nelements,
blosc_values,
0,
C_NULL,
C_NULL
)
flags = blosc_flags[]
nelements = max(blosc_nelements[], 4) # First 4 slots reserved
# Set Blosc info in first two slots
blosc_values[1] = FILTER_BLOSC_VERSION
blosc_values[2] = Blosc.VERSION_FORMAT
ndims = API.h5p_get_chunk(dcpl, 32, blosc_chunkdims)
chunksize = prod(resize!(blosc_chunkdims, ndims))
if ndims < 0 || ndims > 32 || chunksize > Blosc.MAX_BUFFERSIZE
return API.herr_t(-1)
end
htypesize = API.h5t_get_size(htype)
if API.h5t_get_class(htype) == API.H5T_ARRAY
hsuper = API.h5t_get_super(htype)
basetypesize = API.h5t_get_size(hsuper)
API.h5t_close(hsuper)
else
basetypesize = htypesize
end
# Limit large typesizes (they are pretty inefficient to shuffle
# and, in addition, Blosc does not handle typesizes larger than
# blocksizes).
if basetypesize > Blosc.MAX_TYPESIZE
basetypesize = 1
end
blosc_values[3] = basetypesize
blosc_values[4] = chunksize * htypesize # size of the chunk
API.h5p_modify_filter(dcpl, H5Z_FILTER_BLOSC, flags, nelements, blosc_values)
return API.herr_t(1)
end
function blosc_filter(
flags::Cuint,
cd_nelmts::Csize_t,
cd_values::Ptr{Cuint},
nbytes::Csize_t,
buf_size::Ptr{Csize_t},
buf::Ptr{Ptr{Cvoid}}
)
typesize = unsafe_load(cd_values, 3) # The datatype size
outbuf_size = unsafe_load(cd_values, 4)
# Compression level:
clevel = cd_nelmts >= 5 ? unsafe_load(cd_values, 5) : Cuint(5)
# Do shuffle:
doshuffle = cd_nelmts >= 6 ? unsafe_load(cd_values, 6) : SHUFFLE
if (flags & API.H5Z_FLAG_REVERSE) == 0 # compressing
# Allocate an output buffer exactly as long as the input data; if
# the result is larger, we simply return 0. The filter is flagged
# as optional, so HDF5 marks the chunk as uncompressed and proceeds.
outbuf_size = unsafe_load(buf_size)
outbuf = Libc.malloc(outbuf_size)
outbuf == C_NULL && return Csize_t(0)
compname = if cd_nelmts >= 7
compcode = unsafe_load(cd_values, 7)
Blosc.compname(compcode)
else
"blosclz"
end
Blosc.set_compressor(compname)
status = Blosc.blosc_compress(
clevel, doshuffle, typesize, nbytes, unsafe_load(buf), outbuf, nbytes
)
status < 0 && (Libc.free(outbuf); return Csize_t(0))
else # decompressing
# Extract the exact outbuf_size from the buffer header.
#
# NOTE: the guess value got from "cd_values" corresponds to the
# uncompressed chunk size but it should not be used in a general
# cases since other filters in the pipeline can modify the buffer
# size.
in = unsafe_load(buf)
# See https://github.com/JuliaLang/julia/issues/43402
# Resolved in https://github.com/JuliaLang/julia/pull/43408
outbuf_size, cbytes, blocksize = Blosc.cbuffer_sizes(in)
outbuf = Libc.malloc(outbuf_size)
outbuf == C_NULL && return Csize_t(0)
status = Blosc.blosc_decompress(in, outbuf, outbuf_size)
status <= 0 && (Libc.free(outbuf); return Csize_t(0))
end
if status != 0
Libc.free(unsafe_load(buf))
unsafe_store!(buf, outbuf)
unsafe_store!(buf_size, outbuf_size)
return Csize_t(status) # size of compressed/decompressed data
end
Libc.free(outbuf)
return Csize_t(0)
end
"""
BloscFilter(;level=5, shuffle=true, compressor="blosclz")
The Blosc compression filter, using [Blosc.jl](https://github.com/JuliaIO/Blosc.jl). Options:
- `level`: compression level
- `shuffle`: whether to shuffle data before compressing (this option should be used instead of the [`Shuffle`](@ref) filter)
- `compressor`: the compression algorithm. Call `Blosc.compressors()` for the available compressors.
# External links
* [What Is Blosc?](https://www.blosc.org/pages/blosc-in-depth/)
* [Blosc HDF5 Filter ID 32001](https://portal.hdfgroup.org/display/support/Filters#Filters-32001)
* [Blosc HDF5 Plugin Repository (C code)](https://github.com/Blosc/hdf5-blosc)
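# Example
A usage sketch (file name and data are illustrative; `chunk` and `filters` are assumed dataset-creation keywords of HDF5.jl):
```julia
using HDF5, H5Zblosc

h5open("blosc_example.h5", "w") do f
    f["data", chunk=(64, 64), filters=BloscFilter(; level=9, compressor="lz4")] = rand(Float64, 256, 256)
end
```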
"""
struct BloscFilter <: Filter
blosc_version::Cuint
version_format::Cuint
typesize::Cuint
bufsize::Cuint
level::Cuint
shuffle::Cuint
compcode::Cuint
end
function BloscFilter(; level=5, shuffle=SHUFFLE, compressor="blosclz")
Blosc.isvalidshuffle(shuffle) || throw(ArgumentError("invalid blosc shuffle $shuffle"))
compcode = Blosc.compcode(compressor)
BloscFilter(0, 0, 0, 0, level, shuffle, compcode)
end
filterid(::Type{BloscFilter}) = H5Z_FILTER_BLOSC
filtername(::Type{BloscFilter}) = blosc_name
set_local_func(::Type{BloscFilter}) = blosc_set_local
set_local_cfunc(::Type{BloscFilter}) =
@cfunction(blosc_set_local, API.herr_t, (API.hid_t, API.hid_t, API.hid_t))
filter_func(::Type{BloscFilter}) = blosc_filter
filter_cfunc(::Type{BloscFilter}) = @cfunction(
blosc_filter,
Csize_t,
(Cuint, Csize_t, Ptr{Cuint}, Csize_t, Ptr{Csize_t}, Ptr{Ptr{Cvoid}})
)
function Base.show(io::IO, blosc::BloscFilter)
print(
io,
BloscFilter,
"(level=",
Int(blosc.level),
",shuffle=",
blosc.shuffle == NOSHUFFLE ? "NOSHUFFLE" :
blosc.shuffle == SHUFFLE ? "SHUFFLE" :
blosc.shuffle == BITSHUFFLE ? "BITSHUFFLE" :
"UNKNOWN",
",compressor=",
Blosc.compname(blosc.compcode),
")"
)
end
function Base.push!(f::FilterPipeline, blosc::BloscFilter)
0 <= blosc.level <= 9 ||
throw(ArgumentError("blosc compression $(blosc.level) not in [0,9]"))
Blosc.isvalidshuffle(blosc.shuffle) ||
throw(ArgumentError("invalid blosc shuffle $(blosc.shuffle)"))
ref = Ref(blosc)
GC.@preserve ref begin
API.h5p_set_filter(
f.plist,
filterid(BloscFilter),
API.H5Z_FLAG_OPTIONAL,
div(sizeof(BloscFilter), sizeof(Cuint)),
pointer_from_objref(ref)
)
end
return f
end
function __init__()
register_filter(BloscFilter)
end
end # module H5Zblosc
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 7470 | #=
The code below has been ported to Julia from the original C source:
https://github.com/nexusformat/HDF5-External-Filter-Plugins/blob/master/BZIP2/src/H5Zbzip2.c
The filter function H5Z_filter_bzip2 was adopted from:
PyTables http://www.pytables.org.
The plugin can be used with the HDF5 library version 1.8.11+ to read HDF5 datasets compressed with bzip2 created by PyTables.
License: licenses/H5Zbzip2_LICENSE.txt
The following license applies to the Julia port.
Copyright (c) 2021 Mark Kittisopikul and Howard Hughes Medical Institute. License MIT, see LICENSE.txt
=#
module H5Zbzip2
using CodecBzip2
import CodecBzip2: libbzip2
using HDF5.API
import HDF5.Filters:
Filter, filterid, register_filter, filtername, filter_func, filter_cfunc
export H5Z_FILTER_BZIP2, H5Z_filter_bzip2, Bzip2Filter
const H5Z_FILTER_BZIP2 = API.H5Z_filter_t(307)
const bzip2_name = "HDF5 bzip2 filter; see http://www.hdfgroup.org/services/contributions.html"
function H5Z_filter_bzip2(
flags::Cuint,
cd_nelmts::Csize_t,
cd_values::Ptr{Cuint},
nbytes::Csize_t,
buf_size::Ptr{Csize_t},
buf::Ptr{Ptr{Cvoid}}
)::Csize_t
outbuf = C_NULL
outdatalen = Cuint(0)
# Prepare the output buffer
try
if flags & API.H5Z_FLAG_REVERSE != 0
# Decompress
outbuflen = nbytes * 3 + 1
outbuf = Libc.malloc(outbuflen)
if outbuf == C_NULL
error("H5Zbzip2: memory allocation failed for bzip2 decompression.")
end
stream = CodecBzip2.BZStream()
# Just use default malloc and free
stream.bzalloc = C_NULL
stream.bzfree = C_NULL
# BZ2_bzDecompressInit
ret = CodecBzip2.decompress_init!(stream, 0, false)
if ret != CodecBzip2.BZ_OK
errror("H5Zbzip2: bzip2 decompress start failed with error $ret.")
end
stream.next_out = outbuf
stream.avail_out = outbuflen
stream.next_in = unsafe_load(buf)
stream.avail_in = nbytes
cont = true
while cont
# BZ2_bzDecompress
ret = CodecBzip2.decompress!(stream)
if ret < 0
error("H5Zbzip2: bzip2 decompression failed with error $ret.")
end
cont = ret != CodecBzip2.BZ_STREAM_END
if cont && stream.avail_out == 0
# Grow the output buffer
newbuflen = outbuflen * 2
newbuf = Libc.realloc(outbuf, newbuflen)
if newbuf == C_NULL
error("H5Zbzip2: memory allocation failed for bzip2 decompression.")
end
stream.next_out = newbuf + outbuflen
stream.avail_out = outbuflen
outbuf = newbuf
outbuflen = newbuflen
end
end
outdatalen = stream.total_out_lo32
# BZ2_bzDecompressEnd
ret = CodecBzip2.decompress_end!(stream)
if ret != CodecBzip2.BZ_OK
error("H5Zbzip2: bzip2 compression end failed with error $ret.")
end
else
# Compress data
# Maybe not the same size as outdatalen
odatalen = Cuint(0)
blockSize100k = 9
# Get compression blocksize if present
if cd_nelmts > 0
blockSize100k = unsafe_load(cd_values)
if blockSize100k < 1 || blockSize100k > 9
error("H5Zbzip2: Invalid compression blocksize: $blockSize100k")
end
end
# Prepare the output buffer
outbuflen = nbytes + nbytes ÷ 100 + 600 # worst case (bzip2 docs)
outbuf = Libc.malloc(outbuflen)
@debug "Allocated" outbuflen outbuf
if outbuf == C_NULL
error("H5Zbzip2: Memory allocation failed for bzip2 compression")
end
# Compress data
odatalen = outbuflen
r_odatalen = Ref{Cuint}(odatalen)
ret = BZ2_bzBuffToBuffCompress(
outbuf, r_odatalen, unsafe_load(buf), nbytes, blockSize100k, 0, 0
)
outdatalen = r_odatalen[]
if ret != CodecBzip2.BZ_OK
error("H5Zbzip2: bzip2 compression failed with error $ret.")
end
end # if flags & API.H5Z_FLAG_REVERSE != 0
Libc.free(unsafe_load(buf))
unsafe_store!(buf, outbuf)
unsafe_store!(buf_size, outbuflen)
catch err
# "In the case of failure, the return value is 0 (zero) and all pointer arguments are left unchanged."
outdatalen = Csize_t(0)
if outbuf != C_NULL
Libc.free(outbuf)
end
@error "H5Zbzip2.jl Non-Fatal ERROR: " err
display(stacktrace(catch_backtrace()))
end # try - catch
return Csize_t(outdatalen)
end # function H5Z_filter_bzip2
# Need stdcall for 32-bit Windows?
function BZ2_bzBuffToBuffCompress(
dest, destLen, source, sourceLen, blockSize100k, verbosity, workFactor
)
@static if CodecBzip2.WIN32
return ccall(
("BZ2_bzBuffToBuffCompress@28", libbzip2),
stdcall,
Cint,
(Ptr{Cchar}, Ptr{Cuint}, Ptr{Cchar}, Cuint, Cint, Cint, Cint),
dest,
destLen,
source,
sourceLen,
blockSize100k,
verbosity,
workFactor
)
else
return ccall(
(:BZ2_bzBuffToBuffCompress, libbzip2),
Cint,
(Ptr{Cchar}, Ptr{Cuint}, Ptr{Cchar}, Cuint, Cint, Cint, Cint),
dest,
destLen,
source,
sourceLen,
blockSize100k,
verbosity,
workFactor
)
end
end
function BZ2_bzBuffToBuffDecompress(dest, destLen, source, sourceLen, small, verbosity)
@static if CodecBzip2.WIN32
return ccall(
("BZ2_bzBuffToBuffDecompress@24", libbzip2),
stdcall,
Cint,
(Ptr{Cchar}, Ptr{Cuint}, Ptr{Cchar}, Cuint, Cint, Cint),
dest,
destLen,
source,
sourceLen,
small,
verbosity
)
else
return ccall(
(:BZ2_bzBuffToBuffDecompress, libbzip2),
Cint,
(Ptr{Cchar}, Ptr{Cuint}, Ptr{Cchar}, Cuint, Cint, Cint),
dest,
destLen,
source,
sourceLen,
small,
verbosity
)
end
end
# Filters Module
"""
Bzip2Filter(blockSize100k)
Apply Bzip2 compression. The filter id is $H5Z_FILTER_BZIP2.
# External Links
* [BZIP2 HDF5 Filter ID 307](https://portal.hdfgroup.org/display/support/Filters#Filters-307)
* [PyTables Repository (C code)](https://github.com/PyTables/PyTables)
"""
struct Bzip2Filter <: Filter
blockSize100k::Cuint
end
Bzip2Filter() = Bzip2Filter(9)
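# Usage sketch (comment only, illustrative). The single constructor argument is
# the bzip2 block size in units of 100 kB; the filter function above rejects
# values outside 1:9. Attaching the filter to a chunked dataset follows the same
# hypothetical `filters=` pattern sketched in H5Zblosc:
#
#     Bzip2Filter()    # block size 9 (default, best compression)
#     Bzip2Filter(1)   # smallest block size, lower memory use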
filterid(::Type{Bzip2Filter}) = H5Z_FILTER_BZIP2
filtername(::Type{Bzip2Filter}) = bzip2_name
filter_func(::Type{Bzip2Filter}) = H5Z_filter_bzip2
filter_cfunc(::Type{Bzip2Filter}) = @cfunction(
H5Z_filter_bzip2,
Csize_t,
(Cuint, Csize_t, Ptr{Cuint}, Csize_t, Ptr{Csize_t}, Ptr{Ptr{Cvoid}})
)
function __init__()
register_filter(Bzip2Filter)
end
end # module H5Zbzip2
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 8114 | #=
This is a port of H5Zlz4.c to Julia
https://github.com/HDFGroup/hdf5_plugins/blob/master/LZ4/src/H5Zlz4.c
https://github.com/nexusformat/HDF5-External-Filter-Plugins/blob/master/LZ4/src/H5Zlz4.c
https://github.com/silx-kit/hdf5plugin/blob/main/src/LZ4/H5Zlz4.c
H5Zlz4 is originally a copyright of HDF Group. License: licenses/H5Zlz4_LICENSE.txt
The following license applies to the Julia port.
Copyright (c) 2021 Mark Kittisopikul and Howard Hughes Medical Institute. License MIT, see LICENSE.txt
=#
module H5Zlz4
using CodecLz4
using HDF5.API
import HDF5.Filters:
Filter, filterid, register_filter, filtername, filter_func, filter_cfunc
export H5Z_FILTER_LZ4, H5Z_filter_lz4, Lz4Filter
const H5Z_FILTER_LZ4 = API.H5Z_filter_t(32004)
const DEFAULT_BLOCK_SIZE = 1 << 30
const lz4_name = "HDF5 lz4 filter; see http://www.hdfgroup.org/services/contributions.html"
const LZ4_AGGRESSION = Ref(1)
# flags H5Z_FLAG_REVERSE or H5Z_FLAG_OPTIONAL
# cd_nelmts number of elements in cd_values (0 or 1)
# cd_values the first optional element must be the blockSize
# nbytes - number of valid bytes of data
# buf_size - total size of buffer
# buf - pointer to pointer of data
function H5Z_filter_lz4(
flags::Cuint,
cd_nelmts::Csize_t,
cd_values::Ptr{Cuint},
nbytes::Csize_t,
buf_size::Ptr{Csize_t},
buf::Ptr{Ptr{Cvoid}}
)::Csize_t
outBuf = C_NULL
ret_value = Csize_t(0)
try
if (flags & API.H5Z_FLAG_REVERSE) != 0 # reverse filter, decompressing
#i32Buf = Ref{UInt32}()
blockSize = UInt32(0)
roBuf = Ref{UInt8}()
rpos = Ptr{UInt8}(unsafe_load(buf))
#i64Buf = Ptr{UInt64}(rpos)
# Load the first 8 bytes from buffer as a big endian UInt64
# This is the original size of the buffer
origSize = ntoh(unsafe_load(Ptr{UInt64}(rpos)))
rpos += 8 # advance the pointer
# Next read the next four bytes from the buffer as a big endian UInt32
# This is the blocksize
#i32Buf[] = rpos
blockSize = ntoh(unsafe_load(Ptr{UInt32}(rpos)))
rpos += 4
if blockSize > origSize
blockSize = origSize
end
# malloc a byte buffer of origSize
# outBuf = Vector{UInt8}(undef, origSize)
@debug "OrigSize" origSize
outBuf = Libc.malloc(origSize)
# Julia should throw an error if it cannot allocate this
roBuf = Ptr{UInt8}(outBuf)
decompSize = 0
# Start with the first blockSize
while decompSize < origSize
# compressedBlockSize = UInt32(0)
if origSize - decompSize < blockSize # the last block can be smaller than block size
blockSize = origSize - decompSize
end
#i32Buf[] = rpos
compressedBlockSize = ntoh(unsafe_load(Ptr{UInt32}(rpos)))
rpos += 4
if compressedBlockSize == blockSize
# There was no compression
# memcpy(roBuf, rpos, blockSize)
unsafe_copyto!(roBuf, rpos, blockSize)
decompressedBytes = blockSize
else
# do the compression
# LZ4_decompress_fast, version number 10300 ?
@debug "decompress_safe" rpos roBuf compressedBlockSize (
origSize - decompSize
)
decompressedBytes = CodecLz4.LZ4_decompress_safe(
rpos, roBuf, compressedBlockSize, origSize - decompSize
)
@debug "decompressedBytes" decompressedBytes
end
rpos += compressedBlockSize
roBuf += blockSize
decompSize += decompressedBytes
end
Libc.free(unsafe_load(buf))
unsafe_store!(buf, outBuf)
outBuf = C_NULL
ret_value = Csize_t(origSize)
else
# forward filter
# compressing
#i64Buf = Ref{UInt64}()
#i32Buf = Ref{UInt32}()
if nbytes > typemax(Int32)
error("Can only compress chunks up to 2GB")
end
# Use the block size from cd_values when one was provided, otherwise fall back to the default
blockSize = cd_nelmts > 0 ? unsafe_load(cd_values) : Cuint(0)
if blockSize == 0
blockSize = DEFAULT_BLOCK_SIZE
end
if blockSize > nbytes
blockSize = nbytes
end
nBlocks = (nbytes - 1) ÷ blockSize + 1
maxDestSize =
nBlocks * CodecLz4.LZ4_compressBound(blockSize) + 4 + 8 + nBlocks * 4
outBuf = Libc.malloc(maxDestSize)
rpos = Ptr{UInt8}(unsafe_load(buf))
roBuf = Ptr{UInt8}(outBuf)
# Header
unsafe_store!(Ptr{UInt64}(roBuf), hton(UInt64(nbytes)))
roBuf += 8
unsafe_store!(Ptr{UInt32}(roBuf), hton(UInt32(blockSize)))
roBuf += 4
outSize = 12
for block in 0:(nBlocks - 1)
# compBlockSize::UInt32
origWritten = Csize_t(block * blockSize)
if nbytes - origWritten < blockSize # the last block may be < blockSize
blockSize = nbytes - origWritten
end
# aggression = 1 is the same LZ4_compress_default
@debug "LZ4_compress_fast args" rpos outBuf roBuf roBuf + 4 blockSize nBlocks CodecLz4.LZ4_compressBound(
blockSize
)
compBlockSize = UInt32(
CodecLz4.LZ4_compress_fast(
rpos,
roBuf + 4,
blockSize,
CodecLz4.LZ4_compressBound(blockSize),
LZ4_AGGRESSION[]
)
)
@debug "Compressed block size" compBlockSize
if compBlockSize == 0
error("Could not compress block $block")
end
if compBlockSize >= blockSize # compression did not save any space, do a memcpy instead
compBlockSize = blockSize
unsafe_copyto!(roBuf + 4, rpos, blockSize)
end
unsafe_store!(Ptr{UInt32}(roBuf), hton(UInt32(compBlockSize))) # write blocksize
roBuf += 4
rpos += blockSize
roBuf += compBlockSize
outSize += compBlockSize + 4
end
Libc.free(unsafe_load(buf))
unsafe_store!(buf, outBuf)
unsafe_store!(buf_size, outSize)
outBuf = C_NULL
ret_value = Csize_t(outSize)
end # (flags & API.H5Z_FLAG_REVERSE) != 0
catch err
# "In the case of failure, the return value is 0 (zero) and all pointer arguments are left unchanged."
ret_value = Csize_t(0)
@error "H5Zlz4.jl Non-Fatal ERROR: " err
display(stacktrace(catch_backtrace()))
finally
if outBuf != C_NULL
Libc.free(outBuf)
end
end
return Csize_t(ret_value)
end
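# Byte layout produced by the forward (compression) branch above and consumed by
# the reverse branch; all integer fields are stored big-endian:
#
#   bytes 0-7   UInt64  original (uncompressed) chunk size in bytes
#   bytes 8-11  UInt32  block size used to split the chunk
#   then, for each block:
#     UInt32    compressed size of this block (equal to the block size when the
#               block is stored uncompressed because compression saved no space)
#     payload   the LZ4-compressed block, or the raw block bytes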
# Filters Module
"""
Lz4Filter(blockSize)
Apply LZ4 compression. `blockSize` is the size in bytes of the blocks each chunk is split into before compression. The filter id is $H5Z_FILTER_LZ4.
# External Links
* [LZ4 HDF5 Filter ID 32004](https://portal.hdfgroup.org/display/support/Filters#Filters-32004)
* [LZ4 HDF5 Plugin Repository (C code)](https://github.com/nexusformat/HDF5-External-Filter-Plugins/tree/master/LZ4)
"""
struct Lz4Filter <: Filter
blockSize::Cuint
end
Lz4Filter() = Lz4Filter(DEFAULT_BLOCK_SIZE)
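# Usage sketch (comment only, illustrative). The optional constructor argument
# overrides the block size each chunk is split into before compression; the
# default DEFAULT_BLOCK_SIZE (1 << 30) effectively means one block per chunk for
# typical chunk sizes. The values below are hypothetical choices:
#
#     Lz4Filter()          # default block size
#     Lz4Filter(1 << 16)   # split each chunk into 64 kiB blocks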
filterid(::Type{Lz4Filter}) = H5Z_FILTER_LZ4
filtername(::Type{Lz4Filter}) = lz4_name
filter_func(::Type{Lz4Filter}) = H5Z_filter_lz4
filter_cfunc(::Type{Lz4Filter}) = @cfunction(
H5Z_filter_lz4,
Csize_t,
(Cuint, Csize_t, Ptr{Cuint}, Csize_t, Ptr{Csize_t}, Ptr{Ptr{Cvoid}})
)
function __init__()
register_filter(Lz4Filter)
end
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 3902 | #=
Derived from https://github.com/aparamon/HDF5Plugin-Zstandard, zstd_h5plugin.c
Licensed under Apache License Version 2.0, see licenses/H5Zzstd_LICENSE.txt
The following license applies to the Julia port.
Copyright (c) 2021 Mark Kittisopikul and Howard Hughes Medical Institute. License MIT, see LICENSE.txt
=#
module H5Zzstd
using CodecZstd
import CodecZstd.LibZstd
using HDF5.API
import HDF5.Filters:
Filter, filterid, register_filter, filtername, filter_func, filter_cfunc
const H5Z_FILTER_ZSTD = API.H5Z_filter_t(32015)
const zstd_name = "Zstandard compression: http://www.zstd.net"
export H5Z_filter_zstd, H5Z_FILTER_ZSTD, ZstdFilter
# cd_values First optional value is the compressor aggression
# Default is CodecZstd.LibZstd.ZSTD_CLEVEL_DEFAULT
function H5Z_filter_zstd(
flags::Cuint,
cd_nelmts::Csize_t,
cd_values::Ptr{Cuint},
nbytes::Csize_t,
buf_size::Ptr{Csize_t},
buf::Ptr{Ptr{Cvoid}}
)::Csize_t
inbuf = unsafe_load(buf)
outbuf = C_NULL
origSize = nbytes
ret_value = Csize_t(0)
try
if flags & API.H5Z_FLAG_REVERSE != 0
#decompresssion
decompSize = LibZstd.ZSTD_getDecompressedSize(inbuf, origSize)
outbuf = Libc.malloc(decompSize)
if outbuf == C_NULL
error(
"zstd_h5plugin: Cannot allocate memory for outbuf during decompression."
)
end
decompSize = LibZstd.ZSTD_decompress(outbuf, decompSize, inbuf, origSize)
Libc.free(inbuf)
unsafe_store!(buf, outbuf)
outbuf = C_NULL
ret_value = Csize_t(decompSize)
else
# compression
if cd_nelmts > 0
aggression = Cint(unsafe_load(cd_values))
else
aggression = CodecZstd.LibZstd.ZSTD_CLEVEL_DEFAULT
end
if aggression < 1
aggression = 1 # ZSTD_minCLevel()
elseif aggression > LibZstd.ZSTD_maxCLevel()
aggression = LibZstd.ZSTD_maxCLevel()
end
compSize = LibZstd.ZSTD_compressBound(origSize)
outbuf = Libc.malloc(compSize)
if outbuf == C_NULL
error(
"zstd_h5plugin: Cannot allocate memory for outbuf during compression."
)
end
compSize = LibZstd.ZSTD_compress(outbuf, compSize, inbuf, origSize, aggression)
Libc.free(unsafe_load(buf))
unsafe_store!(buf, outbuf)
unsafe_store!(buf_size, compSize)
outbuf = C_NULL
ret_value = compSize
end
catch err
# "In the case of failure, the return value is 0 (zero) and all pointer arguments are left unchanged."
ret_value = Csize_t(0)
@error "H5Zzstd Non-Fatal ERROR: " err
display(stacktrace(catch_backtrace()))
finally
if outbuf != C_NULL
Libc.free(outbuf)
end
end # try catch finally
return Csize_t(ret_value)
end
# Filters Module
"""
ZstdFilter(clevel)
Zstandard compression filter. `clevel` determines the compression level.
# External Links
* [Zstandard HDF5 Filter ID 32015](https://portal.hdfgroup.org/display/support/Filters#Filters-32015)
* [Zstandard HDF5 Plugin Repository (C code)](https://github.com/aparamon/HDF5Plugin-Zstandard)
"""
struct ZstdFilter <: Filter
clevel::Cuint
end
ZstdFilter() = ZstdFilter(CodecZstd.LibZstd.ZSTD_CLEVEL_DEFAULT)
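# Usage sketch (comment only, illustrative). `clevel` is clamped by the filter
# function above to the range [1, ZSTD_maxCLevel()]; the default constructor uses
# Zstandard's default level. The level 19 below is a hypothetical choice:
#
#     ZstdFilter()     # ZSTD_CLEVEL_DEFAULT
#     ZstdFilter(19)   # slower, higher-ratio compression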
filterid(::Type{ZstdFilter}) = H5Z_FILTER_ZSTD
filtername(::Type{ZstdFilter}) = zstd_name
filter_func(::Type{ZstdFilter}) = H5Z_filter_zstd
filter_cfunc(::Type{ZstdFilter}) = @cfunction(
H5Z_filter_zstd,
Csize_t,
(Cuint, Csize_t, Ptr{Cuint}, Csize_t, Ptr{Csize_t}, Ptr{Ptr{Cvoid}})
)
function __init__()
register_filter(ZstdFilter)
end
end # module H5Zzstd
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 68532 | #! format: off
# The `@bind` macro is used to automatically generate Julia bindings to the low-level
# HDF5 library functions.
#
# Each line should consist of two arguments:
#
# 1. An `@ccall`-like function definition expression for the C library interface,
# including the return type.
# 2. Either an error string (which is thrown by `hdf5error()`) or an expression which
# builds an error string and may refer to any of the named function arguments.
#
# The C library names are automatically generated from the Julia function name by
# uppercasing the `h5?` and removing the first underscore ---
# e.g. `h5d_close` -> `H5Dclose`. Versioned function names (such as
# `h5d_open2` -> `H5Dopen2`) have the trailing number removed for the Julia function
# definition. Other arbitrary mappings may be added by adding an entry to the
# `bind_exceptions` Dict in `bind_generator.jl`.
#
# Execute gen_wrappers.jl to generate ../src/api/functions.jl from this file.
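# Illustrative sketch (not itself processed by gen_wrappers.jl): a binding line
# such as
#
#     @bind h5d_close(dataset_id::hid_t)::herr_t "Error closing dataset"
#
# is expanded by the generator into a wrapper in ../src/api/functions.jl of
# roughly this shape; the exact generated code (locking, error-stack handling)
# is determined by gen_wrappers.jl and may differ:
#
#     function h5d_close(dataset_id)
#         var"#status#" = ccall((:H5Dclose, libhdf5), herr_t, (hid_t,), dataset_id)
#         var"#status#" < 0 && hdf5error("Error closing dataset")
#         return nothing
#     end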
###
### HDF5 General library functions
###
@bind h5_close()::herr_t "Error closing the HDF5 resources"
@bind h5_dont_atexit()::herr_t "Error calling dont_atexit"
@bind h5_free_memory(buf::Ptr{Cvoid})::herr_t "Error freeing memory"
@bind h5_garbage_collect()::herr_t "Error on garbage collect"
@bind h5_get_libversion(majnum::Ref{Cuint}, minnum::Ref{Cuint}, relnum::Ref{Cuint})::herr_t "Error getting HDF5 library version"
@bind h5_is_library_threadsafe(is_ts::Ref{Cuchar})::herr_t "Error determining thread safety"
@bind h5_open()::herr_t "Error initializing the HDF5 library"
@bind h5_set_free_list_limits(reg_global_lim::Cint, reg_list_lim::Cint, arr_global_lim::Cint, arr_list_lim::Cint, blk_global_lim::Cint, blk_list_lim::Cint)::herr_t "Error setting limits on free lists"
###
### Attribute Interface
###
@bind h5a_close(id::hid_t)::herr_t "Error closing attribute"
@bind h5a_create2(loc_id::hid_t, attr_name::Cstring, type_id::hid_t, space_id::hid_t, acpl_id::hid_t, aapl_id::hid_t)::hid_t string("Error creating attribute ", attr_name, " for object ", h5i_get_name(loc_id))
@bind h5a_create_by_name(loc_id::hid_t, obj_name::Cstring, attr_name::Cstring, type_id::hid_t, space_id::hid_t, acpl_id::hid_t, aapl_id::hid_t, lapl_id::hid_t)::hid_t string("Error creating attribute ", attr_name, " for object ", obj_name)
@bind h5a_delete(loc_id::hid_t, attr_name::Cstring)::herr_t string("Error deleting attribute ", attr_name)
@bind h5a_delete_by_idx(loc_id::hid_t, obj_name::Cstring, idx_type::Cint, order::Cint, n::hsize_t, lapl_id::hid_t)::herr_t string("Error deleting attribute ", n, " from object ", obj_name)
@bind h5a_delete_by_name(loc_id::hid_t, obj_name::Cstring, attr_name::Cstring, lapl_id::hid_t)::herr_t string("Error removing attribute ", attr_name, " from object ", obj_name)
@bind h5a_exists(obj_id::hid_t, attr_name::Cstring)::htri_t string("Error checking whether attribute ", attr_name, " exists")
@bind h5a_exists_by_name(loc_id::hid_t, obj_name::Cstring, attr_name::Cstring, lapl_id::hid_t)::htri_t string("Error checking whether object ", obj_name, " has attribute ", attr_name)
@bind h5a_get_create_plist(attr_id::hid_t)::hid_t "Cannot get creation property list"
@bind h5a_get_name(attr_id::hid_t, buf_size::Csize_t, buf::Ptr{UInt8})::Cssize_t "Error getting attribute name"
@bind h5a_get_name_by_idx(loc_id::hid_t, obj_name::Cstring, index_type::Cint, order::Cint, idx::hsize_t, name::Ptr{UInt8}, size::Csize_t, lapl_id::hid_t)::Cssize_t "Error getting attribute name"
@bind h5a_get_space(attr_id::hid_t)::hid_t "Error getting attribute dataspace"
@bind h5a_get_type(attr_id::hid_t)::hid_t "Error getting attribute type"
@bind h5a_iterate2(obj_id::hid_t, idx_type::Cint, order::Cint, n::Ptr{hsize_t}, op::Ptr{Cvoid}, op_data::Any)::herr_t string("Error iterating attributes in object ", h5i_get_name(obj_id))
@bind h5a_open(obj_id::hid_t, attr_name::Cstring, aapl_id::hid_t)::hid_t string("Error opening attribute ", attr_name, " for object ", h5i_get_name(obj_id))
@bind h5a_open_by_idx(obj_id::hid_t, pathname::Cstring, idx_type::Cint, order::Cint, n::hsize_t, aapl_id::hid_t, lapl_id::hid_t)::hid_t string("Error opening attribute ", n, " of ", h5i_get_name(obj_id), "/", pathname)
@bind h5a_read(attr_id::hid_t, mem_type_id::hid_t, buf::Ptr{Cvoid})::herr_t string("Error reading attribute ", h5a_get_name(attr_id))
@bind h5a_rename(loc_id::hid_t, old_attr_name::Cstring, new_attr_name::Cstring)::herr_t string("Could not rename attribute")
@bind h5a_write(attr_hid::hid_t, mem_type_id::hid_t, buf::Ptr{Cvoid})::herr_t "Error writing attribute data"
###
### Dataset Interface
###
@bind h5d_chunk_iter(dset_id::hid_t, dxpl_id::hid_t, cb::Ptr{Nothing}, op_data::Any)::herr_t "Error iterating over chunks" (v"1.12.3", nothing)
@bind h5d_close(dataset_id::hid_t)::herr_t "Error closing dataset"
@bind h5d_create2(loc_id::hid_t, pathname::Cstring, dtype_id::hid_t, space_id::hid_t, lcpl_id::hid_t, dcpl_id::hid_t, dapl_id::hid_t)::hid_t string("Error creating dataset ", h5i_get_name(loc_id), "/", pathname)
@bind h5d_create_anon(loc_id::hid_t, type_id::hid_t, space_id::hid_t, dcpl_id::hid_t, dapl_id::hid_t)::hid_t "Error in creating anonymous dataset"
@bind h5d_extend(dataset_id::hid_t, size::Ptr{hsize_t})::herr_t "Error extending dataset" # deprecated in favor of h5d_set_extent
@bind h5d_fill(fill::Ptr{Cvoid}, fill_type_id::hid_t, buf::Ptr{Cvoid}, buf_type_id::hid_t, space_id::hid_t)::herr_t "Error filling dataset"
@bind h5d_flush(dataset_id::hid_t)::herr_t "Error flushing dataset"
@bind h5d_gather(src_space_id::hid_t, src_buf::Ptr{Cvoid}, type_id::hid_t, dst_buf_size::Csize_t, dst_buf::Ptr{Cvoid}, op::Ptr{Cvoid}, op_data::Any)::herr_t "Error gathering dataset"
@bind h5d_get_access_plist(dataset_id::hid_t)::hid_t "Error getting dataset access property list"
@bind h5d_get_chunk_info(dataset_id::hid_t, fspace_id::hid_t, index::hsize_t, offset::Ptr{hsize_t}, filter_mask::Ptr{Cuint}, addr::Ptr{haddr_t}, size::Ptr{hsize_t})::herr_t "Error getting chunk info"
@bind h5d_get_chunk_info_by_coord(dataset_id::hid_t, offset::Ptr{hsize_t}, filter_mask::Ptr{Cuint}, addr::Ptr{haddr_t}, size::Ptr{hsize_t})::herr_t "Error getting chunk info by coord" (v"1.10.5",nothing)
@bind h5d_get_chunk_storage_size(dataset_id::hid_t, offset::Ptr{hsize_t}, chunk_nbytes::Ptr{hsize_t})::herr_t "Error getting chunk storage size"
@bind h5d_get_create_plist(dataset_id::hid_t)::hid_t "Error getting dataset create property list"
@bind h5d_get_num_chunks(dataset_id::hid_t, fspace_id::hid_t, nchunks::Ptr{hsize_t})::herr_t "Error getting number of chunks" (v"1.10.5",nothing)
@bind h5d_get_offset(dataset_id::hid_t)::haddr_t "Error getting offset"
@bind h5d_get_space(dataset_id::hid_t)::hid_t "Error getting dataspace"
@bind h5d_get_space_status(dataset_id::hid_t, status::Ref{Cint})::herr_t "Error getting dataspace status"
@bind h5d_get_storage_size(dataset_id::hid_t)::hsize_t "Error getting storage size"
@bind h5d_get_type(dataset_id::hid_t)::hid_t "Error getting dataspace type"
@bind h5d_iterate(buf::Ptr{Cvoid}, type_id::hid_t, space_id::hid_t, operator::Ptr{Cvoid}, operator_data::Any)::herr_t "Error iterating dataset"
@bind h5d_open2(loc_id::hid_t, pathname::Cstring, dapl_id::hid_t)::hid_t string("Error opening dataset ", h5i_get_name(loc_id), "/", pathname)
@bind h5d_read(dataset_id::hid_t, mem_type_id::hid_t, mem_space_id::hid_t, file_space_id::hid_t, xfer_plist_id::hid_t, buf::Ptr{Cvoid})::herr_t string("Error reading dataset ", h5i_get_name(dataset_id))
@bind h5d_read_chunk(dset::hid_t, dxpl_id::hid_t, offset::Ptr{hsize_t}, filters::Ptr{UInt32}, buf::Ptr{Cvoid})::herr_t "Error reading chunk"
@bind h5d_refresh(dataset_id::hid_t)::herr_t "Error refreshing dataset"
@bind h5d_scatter(op::Ptr{Cvoid}, op_data::Any, type_id::hid_t, dst_space_id::hid_t, dst_buf::Ptr{Cvoid})::herr_t "Error scattering to dataset"
@bind h5d_set_extent(dataset_id::hid_t, new_dims::Ptr{hsize_t})::herr_t "Error extending dataset dimensions"
@bind h5d_vlen_get_buf_size(dset_id::hid_t, type_id::hid_t, space_id::hid_t, buf::Ptr{hsize_t})::herr_t "Error getting vlen buffer size"
@bind h5d_vlen_reclaim(type_id::hid_t, space_id::hid_t, plist_id::hid_t, buf::Ptr{Cvoid})::herr_t "Error reclaiming vlen buffer"
@bind h5d_write(dataset_id::hid_t, mem_type_id::hid_t, mem_space_id::hid_t, file_space_id::hid_t, xfer_plist_id::hid_t, buf::Ptr{Cvoid})::herr_t "Error writing dataset"
@bind h5d_write_chunk(dset_id::hid_t, dxpl_id::hid_t, filter_mask::UInt32, offset::Ptr{hsize_t}, bufsize::Csize_t, buf::Ptr{Cvoid})::herr_t "Error writing chunk"
###
### Error Interface
###
@bind h5e_get_auto2(estack_id::hid_t, func::Ref{Ptr{Cvoid}}, client_data::Ref{Ptr{Cvoid}})::herr_t "Error getting error reporting behavior"
@bind h5e_set_auto2(estack_id::hid_t, func::Ptr{Cvoid}, client_data::Ptr{Cvoid})::herr_t "Error setting error reporting behavior"
@bind h5e_get_current_stack()::hid_t "Unable to return current error stack"
@bind h5e_get_msg(mesg_id::hid_t, mesg_type::Ref{Cint}, mesg::Ref{UInt8}, len::Csize_t)::Cssize_t "Error getting message"
@bind h5e_get_num(estack_id::hid_t)::Cssize_t "Error getting stack length"
@bind h5e_close_stack(stack_id::hid_t)::herr_t "Error closing stack"
@bind h5e_walk2(stack_id::hid_t, direction::Cint, op::Ptr{Cvoid}, op_data::Any)::herr_t "Error walking stack"
###
### File Interface
###
@bind h5f_clear_elink_file_cache(file_id::hid_t)::herr_t "Error in h5f_clear_elink_file_cache (not annotated)"
@bind h5f_close(file_id::hid_t)::herr_t "Error closing file"
@bind h5f_create(pathname::Cstring, flags::Cuint, fcpl_id::hid_t, fapl_id::hid_t)::hid_t "Error creating file $pathname"
@bind h5f_delete(filename::Cstring, fapl_id::hid_t)::herr_t "Error in h5f_delete (not annotated)"
@bind h5f_flush(object_id::hid_t, scope::Cint)::herr_t "Error flushing object to file"
@bind h5f_format_convert(fid::hid_t)::herr_t "Error in h5f_format_convert (not annotated)"
@bind h5f_get_access_plist(file_id::hid_t)::hid_t "Error getting file access property list"
@bind h5f_get_create_plist(file_id::hid_t)::hid_t "Error getting file create property list"
@bind h5f_get_dset_no_attrs_hint(file_id::hid_t, minimize::Ptr{hbool_t})::herr_t "Error getting dataset no attributes hint"
@bind h5f_get_eoa(file_id::hid_t, eoa::Ptr{haddr_t})::herr_t "Error in h5f_get_eoa (not annotated)"
@bind h5f_get_file_image(file_id::hid_t, buf_ptr::Ptr{Cvoid}, buf_len::Csize_t)::Cssize_t "Error in h5f_get_file_image (not annotated)"
@bind h5f_get_fileno(file_id::hid_t, fileno::Ptr{Culong})::herr_t "Error in h5f_get_fileno (not annotated)"
@bind h5f_get_filesize(file_id::hid_t, size::Ptr{hsize_t})::herr_t "Error in h5f_get_filesize (not annotated)"
@bind h5f_get_free_sections(file_id::hid_t, type::H5F_mem_t, nsects::Csize_t, sect_info::Ptr{H5F_sect_info_t})::Cssize_t "Error in h5f_get_free_sections (not annotated)"
@bind h5f_get_freespace(file_id::hid_t)::hssize_t "Error in h5f_get_freespace (not annotated)"
@bind h5f_get_intent(file_id::hid_t, intent::Ptr{Cuint})::herr_t "Error getting file intent"
@bind h5f_get_info2(obj_id::hid_t, file_info::Ptr{H5F_info2_t})::herr_t "Error in h5f_get_info2 (not annotated)"
@bind h5f_get_mdc_config(file_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})::herr_t "Error in h5f_get_mdc_config (not annotated)"
@bind h5f_get_mdc_hit_rate(file_id::hid_t, hit_rate_ptr::Ptr{Cdouble})::herr_t "Error in h5f_get_mdc_hit_rate (not annotated)"
@bind h5f_get_mdc_image_info(file_id::hid_t, image_addr::Ptr{haddr_t}, image_size::Ptr{hsize_t})::herr_t "Error in h5f_get_mdc_image_info (not annotated)"
@bind h5f_get_mdc_logging_status(file_id::hid_t, is_enabled::Ptr{hbool_t}, is_currently_logging::Ptr{hbool_t})::herr_t "Error in h5f_get_mdc_logging_status (not annotated)"
@bind h5f_get_mdc_size(file_id::hid_t, max_size_ptr::Ptr{Csize_t}, min_clean_size_ptr::Ptr{Csize_t}, cur_size_ptr::Ptr{Csize_t}, cur_num_entries_ptr::Ptr{Cint})::herr_t "Error in h5f_get_mdc_size (not annotated)"
@bind h5f_get_metadata_read_retry_info(file_id::hid_t, info::Ptr{H5F_retry_info_t})::herr_t "Error in h5f_get_metadata_read_retry_info (not annotated)"
@bind h5f_get_mpi_atomicity(file_id::hid_t, flag::Ptr{hbool_t})::herr_t "Error in h5f_get_mpi_atomicity (not annotated)"
@bind h5f_get_name(obj_id::hid_t, buf::Ptr{UInt8}, buf_size::Csize_t)::Cssize_t "Error getting file name"
@bind h5f_get_obj_count(file_id::hid_t, types::Cuint)::Cssize_t "Error getting object count"
@bind h5f_get_obj_ids(file_id::hid_t, types::Cuint, max_objs::Csize_t, obj_id_list::Ptr{hid_t})::Cssize_t "Error getting objects"
@bind h5f_get_page_buffering_stats(file_id::hid_t, accesses::Ptr{Cuint}, hits::Ptr{Cuint}, misses::Ptr{Cuint}, evictions::Ptr{Cuint}, bypasses::Ptr{Cuint})::herr_t "Error in h5f_get_page_buffering_stats (not annotated)"
@bind h5f_get_vfd_handle(file_id::hid_t, fapl_id::hid_t, file_handle::Ref{Ptr{Cvoid}})::herr_t "Error getting VFD handle"
@bind h5f_increment_filesize(file_id::hid_t, increment::hsize_t)::herr_t "Error in h5f_increment_filesize (not annotated)"
@bind h5f_is_accessible(container_name::Cstring, fapl_id::hid_t)::htri_t "Error in h5f_is_accessible (not annotated)"
@bind h5f_is_hdf5(pathname::Cstring)::htri_t "Unable to access file $pathname"
@bind h5f_mount(loc::hid_t, name::Cstring, child::hid_t, plist::hid_t)::herr_t "Error in h5f_mount (not annotated)"
@bind h5f_open(pathname::Cstring, flags::Cuint, fapl_id::hid_t)::hid_t "Error opening file $pathname"
@bind h5f_reopen(file_id::hid_t)::hid_t "Error in h5f_reopen (not annotated)"
@bind h5f_reset_mdc_hit_rate_stats(file_id::hid_t)::herr_t "Error in h5f_reset_mdc_hit_rate_stats (not annotated)"
@bind h5f_reset_page_buffering_stats(file_id::hid_t)::herr_t "Error in h5f_reset_page_buffering_stats (not annotated)"
@bind h5f_set_dset_no_attrs_hint(file_id::hid_t, minimize::hbool_t)::herr_t "Error in setting dataset no attributes hint"
@bind h5f_set_libver_bounds(file_id::hid_t, low::H5F_libver_t, high::H5F_libver_t)::herr_t "Error in h5f_set_libver_bounds (not annotated)"
@bind h5f_set_mdc_config(file_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})::herr_t "Error in h5f_set_mdc_config (not annotated)"
@bind h5f_set_mpi_atomicity(file_id::hid_t, flag::hbool_t)::herr_t "Error in h5f_set_mpi_atomicity (not annotated)"
@bind h5f_start_mdc_logging(file_id::hid_t)::herr_t "Error in h5f_start_mdc_logging (not annotated)"
@bind h5f_start_swmr_write(id::hid_t)::herr_t "Error starting SWMR write"
@bind h5f_stop_mdc_logging(file_id::hid_t)::herr_t "Error in h5f_stop_mdc_logging (not annotated)"
@bind h5f_unmount(loc::hid_t, name::Cstring)::herr_t "Error in h5f_unmount (not annotated)"
###
### Group Interface
###
@bind h5g_close(group_id::hid_t)::herr_t "Error closing group"
@bind h5g_create2(loc_id::hid_t, pathname::Cstring, lcpl_id::hid_t, gcpl_id::hid_t, gapl_id::hid_t)::hid_t "Error creating group $(h5i_get_name(loc_id))/$(pathname)"
@bind h5g_get_create_plist(group_id::hid_t)::hid_t "Error getting group create property list"
@bind h5g_get_info(group_id::hid_t, buf::Ptr{H5G_info_t})::herr_t "Error getting group info"
@bind h5g_get_num_objs(loc_id::hid_t, num_obj::Ptr{hsize_t})::hid_t "Error getting group length"
@bind h5g_get_objname_by_idx(loc_id::hid_t, idx::hsize_t, pathname::Ptr{UInt8}, size::Csize_t)::Cssize_t "Error getting group object name $(h5i_get_name(loc_id))/$(pathname)"
@bind h5g_open2(loc_id::hid_t, pathname::Cstring, gapl_id::hid_t)::hid_t "Error opening group $(h5i_get_name(loc_id))/$(pathname)"
###
### Identifier Interface
###
@bind h5i_dec_ref(obj_id::hid_t)::Cint "Error decrementing reference"
@bind h5i_get_file_id(obj_id::hid_t)::hid_t "Error getting file identifier"
@bind h5i_get_name(obj_id::hid_t, buf::Ptr{UInt8}, buf_size::Csize_t)::Cssize_t "Error getting object name"
@bind h5i_get_ref(obj_id::hid_t)::Cint "Error getting reference count"
@bind h5i_get_type(obj_id::hid_t)::Cint "Error getting type"
@bind h5i_inc_ref(obj_id::hid_t)::Cint "Error incrementing identifier refcount"
@bind h5i_is_valid(obj_id::hid_t)::htri_t "Cannot determine whether object is valid"
###
### Link Interface
###
@bind h5l_create_external(target_file_name::Cstring, target_obj_name::Cstring, link_loc_id::hid_t, link_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)::herr_t string("Error creating external link ", link_name, " pointing to ", target_obj_name, " in file ", target_file_name)
@bind h5l_create_hard(obj_loc_id::hid_t, obj_name::Cstring, link_loc_id::hid_t, link_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)::herr_t string("Error creating hard link ", link_name, " pointing to ", obj_name)
@bind h5l_create_soft(target_path::Cstring, link_loc_id::hid_t, link_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)::herr_t string("Error creating soft link ", link_name, " pointing to ", target_path)
@bind h5l_delete(obj_id::hid_t, pathname::Cstring, lapl_id::hid_t)::herr_t string("Error deleting ", h5i_get_name(obj_id), "/", pathname)
@bind h5l_move(src_obj_id::hid_t, src_name::Cstring, dest_obj_id::hid_t, dest_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)::herr_t string("Error moving ", h5i_get_name(src_obj_id), "/", src_name, " to ", h5i_get_name(dest_obj_id), "/", dest_name)
@bind h5l_exists(loc_id::hid_t, pathname::Cstring, lapl_id::hid_t)::htri_t string("Cannot determine whether ", pathname, " exists")
@bind h5l_get_info(link_loc_id::hid_t, link_name::Cstring, link_buf::Ptr{H5L_info_t}, lapl_id::hid_t)::herr_t string("Error getting info for link ", link_name)
@bind h5l_get_name_by_idx(loc_id::hid_t, group_name::Cstring, index_field::Cint, order::Cint, n::hsize_t, name::Ptr{UInt8}, size::Csize_t, lapl_id::hid_t)::Cssize_t "Error getting object name"
# libhdf5 v1.10 provides the name H5Literate
# libhdf5 v1.12 provides the same under H5Literate1, and a newer interface on H5Literate2
@bind h5l_iterate(group_id::hid_t, idx_type::Cint, order::Cint, idx::Ptr{hsize_t}, op::Ptr{Cvoid}, op_data::Any)::herr_t string("Error iterating through links in group ", h5i_get_name(group_id)) (nothing, v"1.12")
@bind h5l_iterate1(group_id::hid_t, idx_type::Cint, order::Cint, idx::Ptr{hsize_t}, op::Ptr{Cvoid}, op_data::Any)::herr_t string("Error iterating through links in group ", h5i_get_name(group_id)) (v"1.12", nothing)
###
### Object Interface
###
@bind h5o_are_mdc_flushes_disabled(object_id::hid_t, are_disabled::Ptr{hbool_t})::herr_t "Error in h5o_are_mdc_flushes_disabled (not annotated)"
@bind h5o_close(object_id::hid_t)::herr_t "Error closing object"
@bind h5o_copy(src_loc_id::hid_t, src_name::Cstring, dst_loc_id::hid_t, dst_name::Cstring, ocpypl_id::hid_t, lcpl_id::hid_t)::herr_t string("Error copying object ", h5i_get_name(src_loc_id), "/", src_name, " to ", h5i_get_name(dst_loc_id), "/", dst_name)
@bind h5o_decr_refcount(object_id::hid_t)::herr_t "Error in h5o_decr_refcount (not annotated)"
@bind h5o_disable_mdc_flushes(object_id::hid_t)::herr_t "Error in h5o_disable_mdc_flushes (not annotated)"
@bind h5o_enable_mdc_flushes(object_id::hid_t)::herr_t "Error in h5o_enable_mdc_flushes (not annotated)"
@bind h5o_exists_by_name(loc_id::hid_t, name::Cstring, lapl_id::hid_t)::htri_t "Error in h5o_exists_by_name (not annotated)"
@bind h5o_flush(obj_id::hid_t)::herr_t "Error in h5o_flush (not annotated)"
@bind h5o_get_comment(obj_id::hid_t, comment::Ptr{Cchar}, bufsize::Csize_t)::Cssize_t "Error in h5o_get_comment (not annotated)"
@bind h5o_get_comment_by_name(loc_id::hid_t, name::Cstring, comment::Ptr{Cchar}, bufsize::Csize_t, lapl_id::hid_t)::Cssize_t "Error in h5o_get_comment_by_name (not annotated)"
@bind h5o_get_info1(object_id::hid_t, buf::Ptr{H5O_info1_t})::herr_t "Error getting object info" (nothing, v"1.10.3")
@bind h5o_get_info2(loc_id::hid_t, oinfo::Ptr{H5O_info1_t}, fields::Cuint)::herr_t "Error in h5o_get_info2 (not annotated)" (v"1.10.3", v"1.12.0")
@bind h5o_get_info3(loc_id::hid_t, oinfo::Ptr{H5O_info2_t}, fields::Cuint)::herr_t "Error in h5o_get_info3 (not annotated)" (v"1.12.0", nothing)
@bind h5o_get_info_by_idx1(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_info1_t}, lapl_id::hid_t)::herr_t "Error in h5o_get_info_by_idx1 (not annotated)" (nothing, v"1.10.3")
@bind h5o_get_info_by_idx2(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_info1_t}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_get_info_by_idx2 (not annotated)" (v"1.10.3", v"1.12.0")
@bind h5o_get_info_by_idx3(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_info2_t}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_get_info_by_idx3 (not annotated)" (v"1.12.0", nothing)
@bind h5o_get_info_by_name1(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_info1_t}, lapl_id::hid_t)::herr_t "Error in h5o_get_info_by_name1 (not annotated)" (nothing, v"1.10.3")
@bind h5o_get_info_by_name2(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_info1_t}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_get_info_by_name2 (not annotated)" (v"1.10.3", v"1.12.0")
@bind h5o_get_info_by_name3(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_info2_t}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_get_info_by_name3 (not annotated)" (v"1.12.0", nothing)
@bind h5o_get_native_info(loc_id::hid_t, oinfo::Ptr{H5O_native_info_t}, fields::Cuint)::herr_t "Error in h5o_get_native_info (not annotated)"
@bind h5o_get_native_info_by_idx(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_native_info_t}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_get_native_info_by_idx (not annotated)"
@bind h5o_get_native_info_by_name(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_native_info_t}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_get_native_info_by_name (not annotated)"
@bind h5o_incr_refcount(object_id::hid_t)::herr_t "Error in h5o_incr_refcount (not annotated)"
@bind h5o_link(obj_id::hid_t, new_loc_id::hid_t, new_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)::herr_t "Error in h5o_link (not annotated)"
@bind h5o_open(loc_id::hid_t, pathname::Cstring, lapl_id::hid_t)::hid_t string("Error opening object ", h5i_get_name(loc_id), "/", pathname)
@bind h5o_open_by_addr(loc_id::hid_t, addr::haddr_t)::hid_t "Error opening object by address"
@bind h5o_open_by_idx(loc_id::hid_t, group_name::Cstring, index_type::Cint, order::Cint, n::hsize_t, lapl_id::hid_t)::hid_t string("Error opening object of index ", n)
@bind h5o_refresh(oid::hid_t)::herr_t "Error in h5o_refresh (not annotated)"
@bind h5o_set_comment(obj_id::hid_t, comment::Cstring)::herr_t "Error in h5o_set_comment (not annotated)"
@bind h5o_set_comment_by_name(loc_id::hid_t, name::Cstring, comment::Cstring, lapl_id::hid_t)::herr_t "Error in h5o_set_comment_by_name (not annotated)"
@bind h5o_token_cmp(loc_id::hid_t, token1::Ptr{H5O_token_t}, token2::Ptr{H5O_token_t}, cmp_value::Ptr{Cint})::herr_t "Error in h5o_token_cmp (not annotated)"
@bind h5o_token_from_str(loc_id::hid_t, token_str::Cstring, token::Ptr{H5O_token_t})::herr_t "Error in h5o_token_from_str (not annotated)"
@bind h5o_token_to_str(loc_id::hid_t, token::Ptr{H5O_token_t}, token_str::Ptr{Ptr{Cchar}})::herr_t "Error in h5o_token_to_str (not annotated)"
@bind h5o_visit1(obj_id::hid_t, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate1_t, op_data::Ptr{Cvoid})::herr_t "Error in h5o_visit1 (not annotated)" (nothing, v"1.12.0")
#@bind h5o_visit2(obj_id::hid_t, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate1_t, op_data::Ptr{Cvoid}, fields::Cuint)::herr_t "Error in h5o_visit2 (not annotated)"
@bind h5o_visit3(obj_id::hid_t, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate2_t, op_data::Ptr{Cvoid}, fields::Cuint)::herr_t "Error in h5o_visit3 (not annotated)" (v"1.12.0", nothing)
@bind h5o_visit_by_name1(loc_id::hid_t, obj_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate1_t, op_data::Ptr{Cvoid}, lapl_id::hid_t)::herr_t "Error in h5o_visit_by_name1 (not annotated)" (nothing, v"1.12.0")
#@bind h5o_visit_by_name2(loc_id::hid_t, obj_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate1_t, op_data::Ptr{Cvoid}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_visit_by_name2 (not annotated)"
@bind h5o_visit_by_name3(loc_id::hid_t, obj_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate2_t, op_data::Ptr{Cvoid}, fields::Cuint, lapl_id::hid_t)::herr_t "Error in h5o_visit_by_name3 (not annotated)" (v"1.12.0", nothing)
###
### Property Interface
###
# get
@bind h5p_get(plist_id::hid_t, name::Cstring, value::Ptr{Cvoid})::herr_t "Error in h5p_get (not annotated)"
@bind h5p_get_alignment(fapl_id::hid_t, threshold::Ref{hsize_t}, alignment::Ref{hsize_t})::herr_t "Error getting alignment"
@bind h5p_get_alloc_time(plist_id::hid_t, alloc_time::Ptr{Cint})::herr_t "Error getting allocation timing"
@bind h5p_get_append_flush(dapl_id::hid_t, dims::Cuint, boundary::Ptr{hsize_t}, func::Ptr{H5D_append_cb_t}, udata::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_append_flush (not annotated)"
@bind h5p_get_attr_creation_order(plist_id::hid_t, crt_order_flags::Ptr{Cuint})::herr_t "Error getting attribute creation order"
@bind h5p_get_attr_phase_change(plist_id::hid_t, max_compact::Ptr{Cuint}, min_dense::Ptr{Cuint})::herr_t "Error in h5p_get_attr_phase_change (not annotated)"
@bind h5p_get_btree_ratios(plist_id::hid_t, left::Ptr{Cdouble}, middle::Ptr{Cdouble}, right::Ptr{Cdouble})::herr_t "Error in h5p_get_btree_ratios (not annotated)"
@bind h5p_get_buffer(plist_id::hid_t, tconv::Ptr{Ptr{Cvoid}}, bkg::Ptr{Ptr{Cvoid}})::Csize_t "Error in h5p_get_buffer (not annotated)"
@bind h5p_get_cache(plist_id::hid_t, mdc_nelmts::Ptr{Cint}, rdcc_nslots::Ptr{Csize_t}, rdcc_nbytes::Ptr{Csize_t}, rdcc_w0::Ptr{Cdouble})::herr_t "Error in h5p_get_cache (not annotated)"
@bind h5p_get_char_encoding(plist_id::hid_t, encoding::Ref{Cint})::herr_t "Error getting char encoding"
@bind h5p_get_chunk(plist_id::hid_t, n_dims::Cint, dims::Ptr{hsize_t})::Cint "Error getting chunk size"
@bind h5p_get_chunk_cache(dapl_id::hid_t, rdcc_nslots::Ptr{Csize_t}, rdcc_nbytes::Ptr{Csize_t}, rdcc_w0::Ptr{Cdouble})::herr_t "Error in h5p_get_chunk_cache (not annotated)"
@bind h5p_get_chunk_opts(plist_id::hid_t, opts::Ptr{Cuint})::herr_t "Error in h5p_get_chunk_opts (not annotated)"
@bind h5p_get_class(plist_id::hid_t)::hid_t "Error in h5p_get_class (not annotated)"
#@bind h5p_get_class_name(pclass_id::hid_t)::Ptr{Cchar} "Error in h5p_get_class_name (not annotated)"
@bind h5p_get_class_parent(pclass_id::hid_t)::hid_t "Error in h5p_get_class_parent (not annotated)"
@bind h5p_get_copy_object(plist_id::hid_t, copy_options::Ptr{Cuint})::herr_t "Error in h5p_get_copy_object (not annotated)"
@bind h5p_get_core_write_tracking(fapl_id::hid_t, is_enabled::Ptr{hbool_t}, page_size::Ptr{Csize_t})::herr_t "Error in h5p_get_core_write_tracking (not annotated)"
@bind h5p_get_create_intermediate_group(lcpl_id::hid_t, crt_intermed_group::Ref{Cuint})::herr_t "Error getting create intermediate group property"
@bind h5p_get_data_transform(plist_id::hid_t, expression::Ptr{Cchar}, size::Csize_t)::Cssize_t "Error in h5p_get_data_transform (not annotated)"
@bind h5p_get_driver(plist_id::hid_t)::hid_t "Error getting driver identifier"
@bind h5p_get_driver_info(plist_id::hid_t)::Ptr{Cvoid} "Error getting driver info"
@bind h5p_get_dset_no_attrs_hint(dcpl_id::hid_t, minimize::Ptr{hbool_t})::herr_t "Error in getting dataset no attributes hint property"
@bind h5p_get_dxpl_mpio(dxpl_id::hid_t, xfer_mode::Ptr{Cint})::herr_t "Error getting MPIO transfer mode"
@bind h5p_get_edc_check(plist_id::hid_t)::H5Z_EDC_t "Error in h5p_get_edc_check (not annotated)"
@bind h5p_get_efile_prefix(dapl_id::hid_t, prefix::Ptr{UInt8}, size::Csize_t)::Cssize_t "Error getting external file prefix"
@bind h5p_get_elink_acc_flags(lapl_id::hid_t, flags::Ptr{Cuint})::herr_t "Error in h5p_get_elink_acc_flags (not annotated)"
@bind h5p_get_elink_cb(lapl_id::hid_t, func::Ptr{H5L_elink_traverse_t}, op_data::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_elink_cb (not annotated)"
@bind h5p_get_elink_fapl(lapl_id::hid_t)::hid_t "Error in h5p_get_elink_fapl (not annotated)"
@bind h5p_get_elink_file_cache_size(plist_id::hid_t, efc_size::Ptr{Cuint})::herr_t "Error in h5p_get_elink_file_cache_size (not annotated)"
@bind h5p_get_elink_prefix(plist_id::hid_t, prefix::Ptr{Cchar}, size::Csize_t)::Cssize_t "Error in h5p_get_elink_prefix (not annotated)"
@bind h5p_get_est_link_info(plist_id::hid_t, est_num_entries::Ptr{Cuint}, est_name_len::Ptr{Cuint})::herr_t "Error in h5p_get_est_link_info (not annotated)"
@bind h5p_get_evict_on_close(fapl_id::hid_t, evict_on_close::Ptr{hbool_t})::herr_t "Error in h5p_get_evict_on_close (not annotated)"
@bind h5p_get_external(plist::hid_t, idx::Cuint, name_size::Csize_t, name::Ptr{Cuchar}, offset::Ptr{off_t}, size::Ptr{hsize_t})::herr_t "Error getting external file properties"
@bind h5p_get_external_count(plist::hid_t)::Cint "Error getting external count"
@bind h5p_get_family_offset(fapl_id::hid_t, offset::Ptr{hsize_t})::herr_t "Error in h5p_get_family_offset (not annotated)"
@bind h5p_get_fapl_core(fapl_id::hid_t, increment::Ptr{Csize_t}, backing_store::Ptr{hbool_t})::herr_t "Error in h5p_get_fapl_core (not annotated)"
@bind h5p_get_fapl_family(fapl_id::hid_t, memb_size::Ptr{hsize_t}, memb_fapl_id::Ptr{hid_t})::herr_t "Error in h5p_get_fapl_family (not annotated)"
@bind h5p_get_fapl_hdfs(fapl_id::hid_t, fa_out::Ptr{H5FD_hdfs_fapl_t})::herr_t "Error in h5p_get_fapl_hdfs (not annotated)"
@bind h5p_get_fapl_multi(fapl_id::hid_t, memb_map::Ptr{H5FD_mem_t}, memb_fapl::Ptr{hid_t}, memb_name::Ptr{Ptr{Cchar}}, memb_addr::Ptr{haddr_t}, relax::Ptr{hbool_t})::herr_t "Error in h5p_get_fapl_multi (not annotated)"
@bind h5p_get_fapl_splitter(fapl_id::hid_t, config_ptr::Ptr{H5FD_splitter_vfd_config_t})::herr_t "Error in h5p_get_fapl_splitter (not annotated)"
@bind h5p_get_fapl_ros3(fapl_id::hid_t, fa_out::Ptr{H5FD_ros3_fapl_t})::herr_t "Error in getting ros3 properties"
@bind h5p_get_fclose_degree(fapl_id::hid_t, fc_degree::Ref{Cint})::herr_t "Error getting close degree"
@bind h5p_get_file_image(fapl_id::hid_t, buf_ptr_ptr::Ptr{Ptr{Cvoid}}, buf_len_ptr::Ptr{Csize_t})::herr_t "Error in h5p_get_file_image (not annotated)"
@bind h5p_get_file_image_callbacks(fapl_id::hid_t, callbacks_ptr::Ptr{H5FD_file_image_callbacks_t})::herr_t "Error in h5p_get_file_image_callbacks (not annotated)"
@bind h5p_get_file_locking(fapl_id::hid_t, use_file_locking::Ptr{hbool_t}, ignore_when_disabled::Ptr{hbool_t})::herr_t "Error in h5p_get_file_locking (not annotated)"
@bind h5p_get_file_space(plist_id::hid_t, strategy::Ptr{H5F_file_space_type_t}, threshold::Ptr{hsize_t})::herr_t "Error in h5p_get_file_space (not annotated)"
@bind h5p_get_file_space_page_size(plist_id::hid_t, fsp_size::Ptr{hsize_t})::herr_t "Error in h5p_get_file_space_page_size (not annotated)"
@bind h5p_get_file_space_strategy(plist_id::hid_t, strategy::Ptr{H5F_fspace_strategy_t}, persist::Ptr{hbool_t}, threshold::Ptr{hsize_t})::herr_t "Error in h5p_get_file_space_strategy (not annotated)"
@bind h5p_get_fill_time(plist_id::hid_t, fill_time::Ptr{H5D_fill_time_t})::herr_t "Error in h5p_get_fill_time (not annotated)"
@bind h5p_get_fill_value(plist_id::hid_t, type_id::hid_t, value::Ptr{Cvoid})::herr_t "Error in h5p_get_fill_value (not annotated)"
@bind h5p_get_filter2(plist_id::hid_t, idx::Cuint, flags::Ptr{Cuint}, cd_nemlts::Ref{Csize_t}, cd_values::Ptr{Cuint}, namelen::Csize_t, name::Ptr{Cchar}, filter_config::Ptr{Cuint})::H5Z_filter_t "Error getting filter"
@bind h5p_get_filter_by_id2(plist_id::hid_t, filter_id::H5Z_filter_t, flags::Ref{Cuint}, cd_nelmts::Ref{Csize_t}, cd_values::Ptr{Cuint}, namelen::Csize_t, name::Ptr{UInt8}, filter_config::Ptr{Cuint})::herr_t "Error getting filter ID"
@bind h5p_get_gc_references(fapl_id::hid_t, gc_ref::Ptr{Cuint})::herr_t "Error in h5p_get_gc_references (not annotated)"
@bind h5p_get_hyper_vector_size(fapl_id::hid_t, size::Ptr{Csize_t})::herr_t "Error in h5p_get_hyper_vector_size (not annotated)"
@bind h5p_get_istore_k(plist_id::hid_t, ik::Ptr{Cuint})::herr_t "Error in h5p_get_istore_k (not annotated)"
@bind h5p_get_layout(plist_id::hid_t)::Cint string("Error getting layout")
@bind h5p_get_libver_bounds(fapl_id::hid_t, low::Ref{Cint}, high::Ref{Cint})::herr_t "Error getting library version bounds"
@bind h5p_get_link_creation_order(plist_id::hid_t, crt_order_flags::Ptr{Cuint})::herr_t "Error getting link creation order"
@bind h5p_get_link_phase_change(plist_id::hid_t, max_compact::Ptr{Cuint}, min_dense::Ptr{Cuint})::herr_t "Error in h5p_get_link_phase_change (not annotated)"
@bind h5p_get_local_heap_size_hint(plist_id::hid_t, size_hint::Ref{Csize_t})::herr_t "Error getting local heap size hint"
@bind h5p_get_mcdt_search_cb(plist_id::hid_t, func::Ptr{H5O_mcdt_search_cb_t}, op_data::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_mcdt_search_cb (not annotated)"
@bind h5p_get_mdc_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})::herr_t "Error in h5p_get_mdc_config (not annotated)"
@bind h5p_get_mdc_image_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_image_config_t})::herr_t "Error in h5p_get_mdc_image_config (not annotated)"
@bind h5p_get_mdc_log_options(plist_id::hid_t, is_enabled::Ptr{hbool_t}, location::Ptr{Cchar}, location_size::Ptr{Csize_t}, start_on_access::Ptr{hbool_t})::herr_t "Error in h5p_get_mdc_log_options (not annotated)"
@bind h5p_get_meta_block_size(fapl_id::hid_t, size::Ptr{hsize_t})::herr_t "Error in h5p_get_meta_block_size (not annotated)"
@bind h5p_get_metadata_read_attempts(plist_id::hid_t, attempts::Ptr{Cuint})::herr_t "Error in h5p_get_metadata_read_attempts (not annotated)"
@bind h5p_get_multi_type(fapl_id::hid_t, type::Ptr{H5FD_mem_t})::herr_t "Error in h5p_get_multi_type (not annotated)"
@bind h5p_get_nfilters(plist_id::hid_t)::Cint "Error getting nfilters"
@bind h5p_get_nlinks(plist_id::hid_t, nlinks::Ptr{Csize_t})::herr_t "Error in h5p_get_nlinks (not annotated)"
@bind h5p_get_nprops(id::hid_t, nprops::Ptr{Csize_t})::herr_t "Error in h5p_get_nprops (not annotated)"
@bind h5p_get_obj_track_times(plist_id::hid_t, track_times::Ref{UInt8})::herr_t "Error getting object time tracking"
@bind h5p_get_object_flush_cb(plist_id::hid_t, func::Ptr{H5F_flush_cb_t}, udata::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_object_flush_cb (not annotated)"
@bind h5p_get_page_buffer_size(plist_id::hid_t, buf_size::Ptr{Csize_t}, min_meta_perc::Ptr{Cuint}, min_raw_perc::Ptr{Cuint})::herr_t "Error in h5p_get_page_buffer_size (not annotated)"
@bind h5p_get_preserve(plist_id::hid_t)::Cint "Error in h5p_get_preserve (not annotated)"
@bind h5p_get_shared_mesg_index(plist_id::hid_t, index_num::Cuint, mesg_type_flags::Ptr{Cuint}, min_mesg_size::Ptr{Cuint})::herr_t "Error in h5p_get_shared_mesg_index (not annotated)"
@bind h5p_get_shared_mesg_nindexes(plist_id::hid_t, nindexes::Ptr{Cuint})::herr_t "Error in h5p_get_shared_mesg_nindexes (not annotated)"
@bind h5p_get_shared_mesg_phase_change(plist_id::hid_t, max_list::Ptr{Cuint}, min_btree::Ptr{Cuint})::herr_t "Error in h5p_get_shared_mesg_phase_change (not annotated)"
@bind h5p_get_sieve_buf_size(fapl_id::hid_t, size::Ptr{Csize_t})::herr_t "Error in h5p_get_sieve_buf_size (not annotated)"
@bind h5p_get_size(id::hid_t, name::Ptr{Cchar}, size::Ptr{Csize_t})::herr_t "Error in h5p_get_size (not annotated)"
@bind h5p_get_sizes(plist_id::hid_t, sizeof_addr::Ptr{Csize_t}, sizeof_size::Ptr{Csize_t})::herr_t "Error in h5p_get_sizes (not annotated)"
@bind h5p_get_small_data_block_size(fapl_id::hid_t, size::Ptr{hsize_t})::herr_t "Error in h5p_get_small_data_block_size (not annotated)"
@bind h5p_get_sym_k(plist_id::hid_t, ik::Ptr{Cuint}, lk::Ptr{Cuint})::herr_t "Error in h5p_get_sym_k (not annotated)"
@bind h5p_get_type_conv_cb(dxpl_id::hid_t, op::Ptr{H5T_conv_except_func_t}, operate_data::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_type_conv_cb (not annotated)"
@bind h5p_get_userblock(plist_id::hid_t, len::Ptr{hsize_t})::herr_t "Error getting userblock"
@bind h5p_get_version(plist_id::hid_t, boot::Ptr{Cuint}, freelist::Ptr{Cuint}, stab::Ptr{Cuint}, shhdr::Ptr{Cuint})::herr_t "Error in h5p_get_version (not annotated)"
@bind h5p_get_virtual_count(dcpl_id::hid_t, count::Ptr{Csize_t})::herr_t "Error in h5p_get_virtual_count (not annotated)"
@bind h5p_get_virtual_dsetname(dcpl_id::hid_t, index::Csize_t, name::Ptr{Cchar}, size::Csize_t)::Cssize_t "Error in h5p_get_virtual_dsetname (not annotated)"
@bind h5p_get_virtual_filename(dcpl_id::hid_t, index::Csize_t, name::Ptr{Cchar}, size::Csize_t)::Cssize_t "Error in h5p_get_virtual_filename (not annotated)"
@bind h5p_get_virtual_prefix(dapl_id::hid_t, prefix::Ptr{Cchar}, size::Csize_t)::Cssize_t "Error in h5p_get_virtual_prefix (not annotated)"
@bind h5p_get_virtual_printf_gap(dapl_id::hid_t, gap_size::Ptr{hsize_t})::herr_t "Error in h5p_get_virtual_printf_gap (not annotated)"
@bind h5p_get_virtual_srcspace(dcpl_id::hid_t, index::Csize_t)::hid_t "Error in h5p_get_virtual_srcspace (not annotated)"
@bind h5p_get_virtual_view(dapl_id::hid_t, view::Ptr{H5D_vds_view_t})::herr_t "Error in h5p_get_virtual_view (not annotated)"
@bind h5p_get_virtual_vspace(dcpl_id::hid_t, index::Csize_t)::hid_t "Error in h5p_get_virtual_vspace (not annotated)"
@bind h5p_get_vlen_mem_manager(plist_id::hid_t, alloc_func::Ptr{H5MM_allocate_t}, alloc_info::Ptr{Ptr{Cvoid}}, free_func::Ptr{H5MM_free_t}, free_info::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_vlen_mem_manager (not annotated)"
@bind h5p_get_vol_id(plist_id::hid_t, vol_id::Ptr{hid_t})::herr_t "Error in h5p_get_vol_id (not annotated)"
@bind h5p_get_vol_info(plist_id::hid_t, vol_info::Ptr{Ptr{Cvoid}})::herr_t "Error in h5p_get_vol_info (not annotated)"
# set
@bind h5p_set(plist_id::hid_t, name::Cstring, value::Ptr{Cvoid})::herr_t "Error in h5p_set (not annotated)"
@bind h5p_set_alignment(plist_id::hid_t, threshold::hsize_t, alignment::hsize_t)::herr_t "Error setting alignment"
@bind h5p_set_alloc_time(plist_id::hid_t, alloc_time::Cint)::herr_t "Error setting allocation timing"
@bind h5p_set_append_flush(dapl_id::hid_t, ndims::Cuint, boundary::Ptr{hsize_t}, func::H5D_append_cb_t, udata::Ptr{Cvoid})::herr_t "Error in h5p_set_append_flush (not annotated)"
@bind h5p_set_attr_creation_order(plist_id::hid_t, crt_order_flags::Cuint)::herr_t "Error setting attribute creation order"
@bind h5p_set_attr_phase_change(plist_id::hid_t, max_compact::Cuint, min_dense::Cuint)::herr_t "Error in h5p_set_attr_phase_change (not annotated)"
@bind h5p_set_btree_ratios(plist_id::hid_t, left::Cdouble, middle::Cdouble, right::Cdouble)::herr_t "Error in h5p_set_btree_ratios (not annotated)"
@bind h5p_set_buffer(plist_id::hid_t, size::Csize_t, tconv::Ptr{Cvoid}, bkg::Ptr{Cvoid})::herr_t "Error in h5p_set_buffer (not annotated)"
@bind h5p_set_cache(plist_id::hid_t, mdc_nelmts::Cint, rdcc_nslots::Csize_t, rdcc_nbytes::Csize_t, rdcc_w0::Cdouble)::herr_t "Error in h5p_set_cache (not annotated)"
@bind h5p_set_char_encoding(plist_id::hid_t, encoding::Cint)::herr_t "Error setting char encoding"
@bind h5p_set_chunk(plist_id::hid_t, ndims::Cint, dims::Ptr{hsize_t})::herr_t "Error setting chunk size"
@bind h5p_set_chunk_cache(dapl_id::hid_t, rdcc_nslots::Csize_t, rdcc_nbytes::Csize_t, rdcc_w0::Cdouble)::herr_t "Error setting chunk cache"
@bind h5p_set_chunk_opts(plist_id::hid_t, opts::Cuint)::herr_t "Error in h5p_set_chunk_opts (not annotated)"
@bind h5p_set_copy_object(plist_id::hid_t, copy_options::Cuint)::herr_t "Error in h5p_set_copy_object (not annotated)"
@bind h5p_set_core_write_tracking(fapl_id::hid_t, is_enabled::hbool_t, page_size::Csize_t)::herr_t "Error in h5p_set_core_write_tracking (not annotated)"
@bind h5p_set_create_intermediate_group(plist_id::hid_t, setting::Cuint)::herr_t "Error setting create intermediate group"
@bind h5p_set_data_transform(plist_id::hid_t, expression::Cstring)::herr_t "Error in h5p_set_data_transform (not annotated)"
@bind h5p_set_deflate(plist_id::hid_t, setting::Cuint)::herr_t "Error setting compression method and level (deflate)"
@bind h5p_set_driver(plist_id::hid_t, driver_id::hid_t, driver_info::Ptr{Cvoid})::herr_t "Error in h5p_set_driver (not annotated)"
@bind h5p_set_dset_no_attrs_hint(dcpl_id::hid_t, minimize::hbool_t)::herr_t "Error in setting dataset no attributes hint property"
@bind h5p_set_dxpl_mpio(dxpl_id::hid_t, xfer_mode::Cint)::herr_t "Error setting MPIO transfer mode"
@bind h5p_set_edc_check(plist_id::hid_t, check::H5Z_EDC_t)::herr_t "Error in h5p_set_edc_check (not annotated)"
@bind h5p_set_efile_prefix(plist_id::hid_t, prefix::Cstring)::herr_t "Error setting external file prefix"
@bind h5p_set_elink_acc_flags(lapl_id::hid_t, flags::Cuint)::herr_t "Error in h5p_set_elink_acc_flags (not annotated)"
@bind h5p_set_elink_cb(lapl_id::hid_t, func::H5L_elink_traverse_t, op_data::Ptr{Cvoid})::herr_t "Error in h5p_set_elink_cb (not annotated)"
@bind h5p_set_elink_fapl(lapl_id::hid_t, fapl_id::hid_t)::herr_t "Error in h5p_set_elink_fapl (not annotated)"
@bind h5p_set_elink_file_cache_size(plist_id::hid_t, efc_size::Cuint)::herr_t "Error in h5p_set_elink_file_cache_size (not annotated)"
@bind h5p_set_elink_prefix(plist_id::hid_t, prefix::Cstring)::herr_t "Error in h5p_set_elink_prefix (not annotated)"
@bind h5p_set_est_link_info(plist_id::hid_t, est_num_entries::Cuint, est_name_len::Cuint)::herr_t "Error in h5p_set_est_link_info (not annotated)"
@bind h5p_set_evict_on_close(fapl_id::hid_t, evict_on_close::hbool_t)::herr_t "Error in h5p_set_evict_on_close (not annotated)"
@bind h5p_set_external(plist_id::hid_t, name::Cstring, offset::off_t, size::hsize_t)::herr_t "Error setting external property"
@bind h5p_set_family_offset(fapl_id::hid_t, offset::hsize_t)::herr_t "Error in h5p_set_family_offset (not annotated)"
@bind h5p_set_fapl_core(fapl_id::hid_t, increment::Csize_t, backing_store::hbool_t)::herr_t "Error in h5p_set_fapl_core (not annotated)"
@bind h5p_set_fapl_family(fapl_id::hid_t, memb_size::hsize_t, memb_fapl_id::hid_t)::herr_t "Error in h5p_set_fapl_family (not annotated)"
@bind h5p_set_fapl_hdfs(fapl_id::hid_t, fa::Ptr{H5FD_hdfs_fapl_t})::herr_t "Error in h5p_set_fapl_hdfs (not annotated)"
@bind h5p_set_fapl_log(fapl_id::hid_t, logfile::Cstring, flags::Culonglong, buf_size::Csize_t)::herr_t "Error in h5p_set_fapl_log (not annotated)"
@bind h5p_set_fapl_multi(fapl_id::hid_t, memb_map::Ptr{H5FD_mem_t}, memb_fapl::Ptr{hid_t}, memb_name::Ptr{Cstring}, memb_addr::Ptr{haddr_t}, relax::hbool_t)::herr_t "Error in h5p_set_fapl_multi (not annotated)"
@bind h5p_set_fapl_sec2(fapl_id::hid_t)::herr_t "Error setting Sec2 properties"
@bind h5p_set_fapl_ros3(fapl_id::hid_t, fa::Ptr{H5FD_ros3_fapl_t})::herr_t "Error in setting ros3 properties"
@bind h5p_set_fapl_split(fapl::hid_t, meta_ext::Cstring, meta_plist_id::hid_t, raw_ext::Cstring, raw_plist_id::hid_t)::herr_t "Error in h5p_set_fapl_split (not annotated)"
@bind h5p_set_fapl_splitter(fapl_id::hid_t, config_ptr::Ptr{H5FD_splitter_vfd_config_t})::herr_t "Error in h5p_set_fapl_splitter (not annotated)"
@bind h5p_set_fapl_stdio(fapl_id::hid_t)::herr_t "Error in h5p_set_fapl_stdio (not annotated)"
@bind h5p_set_fapl_windows(fapl_id::hid_t)::herr_t "Error in h5p_set_fapl_windows (not annotated)"
@bind h5p_set_fclose_degree(plist_id::hid_t, fc_degree::Cint)::herr_t "Error setting close degree"
@bind h5p_set_file_image(fapl_id::hid_t, buf_ptr::Ptr{Cvoid}, buf_len::Csize_t)::herr_t "Error in h5p_set_file_image (not annotated)"
@bind h5p_set_file_image_callbacks(fapl_id::hid_t, callbacks_ptr::Ptr{H5FD_file_image_callbacks_t})::herr_t "Error in h5p_set_file_image_callbacks (not annotated)"
@bind h5p_set_file_locking(fapl_id::hid_t, use_file_locking::hbool_t, ignore_when_disabled::hbool_t)::herr_t "Error in h5p_set_file_locking (not annotated)"
@bind h5p_set_file_space(plist_id::hid_t, strategy::H5F_file_space_type_t, threshold::hsize_t)::herr_t "Error in h5p_set_file_space (not annotated)"
@bind h5p_set_file_space_page_size(plist_id::hid_t, fsp_size::hsize_t)::herr_t "Error in h5p_set_file_space_page_size (not annotated)"
@bind h5p_set_file_space_strategy(plist_id::hid_t, strategy::H5F_fspace_strategy_t, persist::hbool_t, threshold::hsize_t)::herr_t "Error in h5p_set_file_space_strategy (not annotated)"
@bind h5p_set_fill_time(plist_id::hid_t, fill_time::H5D_fill_time_t)::herr_t "Error in h5p_set_fill_time (not annotated)"
@bind h5p_set_fill_value(plist_id::hid_t, type_id::hid_t, value::Ptr{Cvoid})::herr_t "Error in h5p_set_fill_value (not annotated)"
@bind h5p_set_filter(plist_id::hid_t, filter_id::H5Z_filter_t, flags::Cuint, cd_nelmts::Csize_t, cd_values::Ptr{Cuint})::herr_t "Error setting filter"
@bind h5p_set_filter_callback(plist_id::hid_t, func::H5Z_filter_func_t, op_data::Ptr{Cvoid})::herr_t "Error in h5p_set_filter_callback (not annotated)"
@bind h5p_set_fletcher32(plist_id::hid_t)::herr_t "Error enabling Fletcher32 filter"
@bind h5p_set_gc_references(fapl_id::hid_t, gc_ref::Cuint)::herr_t "Error in h5p_set_gc_references (not annotated)"
@bind h5p_set_hyper_vector_size(plist_id::hid_t, size::Csize_t)::herr_t "Error in h5p_set_hyper_vector_size (not annotated)"
@bind h5p_set_istore_k(plist_id::hid_t, ik::Cuint)::herr_t "Error in h5p_set_istore_k (not annotated)"
@bind h5p_set_layout(plist_id::hid_t, setting::Cint)::herr_t "Error setting layout"
@bind h5p_set_libver_bounds(fapl_id::hid_t, low::Cint, high::Cint)::herr_t "Error setting library version bounds"
@bind h5p_set_link_creation_order(plist_id::hid_t, crt_order_flags::Cuint)::herr_t "Error setting link creation order"
@bind h5p_set_link_phase_change(plist_id::hid_t, max_compact::Cuint, min_dense::Cuint)::herr_t "Error in h5p_set_link_phase_change (not annotated)"
@bind h5p_set_local_heap_size_hint(plist_id::hid_t, size_hint::Csize_t)::herr_t "Error setting local heap size hint"
@bind h5p_set_mcdt_search_cb(plist_id::hid_t, func::H5O_mcdt_search_cb_t, op_data::Ptr{Cvoid})::herr_t "Error in h5p_set_mcdt_search_cb (not annotated)"
@bind h5p_set_mdc_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})::herr_t "Error in h5p_set_mdc_config (not annotated)"
@bind h5p_set_mdc_image_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_image_config_t})::herr_t "Error in h5p_set_mdc_image_config (not annotated)"
@bind h5p_set_mdc_log_options(plist_id::hid_t, is_enabled::hbool_t, location::Cstring, start_on_access::hbool_t)::herr_t "Error in h5p_set_mdc_log_options (not annotated)"
@bind h5p_set_meta_block_size(fapl_id::hid_t, size::hsize_t)::herr_t "Error in h5p_set_meta_block_size (not annotated)"
@bind h5p_set_metadata_read_attempts(plist_id::hid_t, attempts::Cuint)::herr_t "Error in h5p_set_metadata_read_attempts (not annotated)"
@bind h5p_set_multi_type(fapl_id::hid_t, type::H5FD_mem_t)::herr_t "Error in h5p_set_multi_type (not annotated)"
@bind h5p_set_nbit(plist_id::hid_t)::herr_t "Error enabling nbit filter"
@bind h5p_set_nlinks(plist_id::hid_t, nlinks::Csize_t)::herr_t "Error in h5p_set_nlinks (not annotated)"
@bind h5p_set_obj_track_times(plist_id::hid_t, track_times::UInt8)::herr_t "Error setting object time tracking"
@bind h5p_set_object_flush_cb(plist_id::hid_t, func::H5F_flush_cb_t, udata::Ptr{Cvoid})::herr_t "Error in h5p_set_object_flush_cb (not annotated)"
@bind h5p_set_page_buffer_size(plist_id::hid_t, buf_size::Csize_t, min_meta_per::Cuint, min_raw_per::Cuint)::herr_t "Error in h5p_set_page_buffer_size (not annotated)"
@bind h5p_set_preserve(plist_id::hid_t, status::hbool_t)::herr_t "Error in h5p_set_preserve (not annotated)"
@bind h5p_set_scaleoffset(plist_id::hid_t, scale_type::Cint, scale_factor::Cint)::herr_t "Error enabling scaleoffset filter"
@bind h5p_set_shared_mesg_index(plist_id::hid_t, index_num::Cuint, mesg_type_flags::Cuint, min_mesg_size::Cuint)::herr_t "Error in h5p_set_shared_mesg_index (not annotated)"
@bind h5p_set_shared_mesg_nindexes(plist_id::hid_t, nindexes::Cuint)::herr_t "Error in h5p_set_shared_mesg_nindexes (not annotated)"
@bind h5p_set_shared_mesg_phase_change(plist_id::hid_t, max_list::Cuint, min_btree::Cuint)::herr_t "Error in h5p_set_shared_mesg_phase_change (not annotated)"
@bind h5p_set_shuffle(plist_id::hid_t)::herr_t "Error enabling shuffle filter"
@bind h5p_set_sieve_buf_size(fapl_id::hid_t, size::Csize_t)::herr_t "Error in h5p_set_sieve_buf_size (not annotated)"
@bind h5p_set_sizes(plist_id::hid_t, sizeof_addr::Csize_t, sizeof_size::Csize_t)::herr_t "Error in h5p_set_sizes (not annotated)"
@bind h5p_set_small_data_block_size(fapl_id::hid_t, size::hsize_t)::herr_t "Error in h5p_set_small_data_block_size (not annotated)"
@bind h5p_set_sym_k(plist_id::hid_t, ik::Cuint, lk::Cuint)::herr_t "Error in h5p_set_sym_k (not annotated)"
@bind h5p_set_szip(plist_id::hid_t, options_mask::Cuint, pixels_per_block::Cuint)::herr_t "Error enabling szip filter"
@bind h5p_set_type_conv_cb(dxpl_id::hid_t, op::H5T_conv_except_func_t, operate_data::Ptr{Cvoid})::herr_t "Error in h5p_set_type_conv_cb (not annotated)"
@bind h5p_set_userblock(plist_id::hid_t, len::hsize_t)::herr_t "Error setting userblock"
@bind h5p_set_virtual(dcpl_id::hid_t, vspace_id::hid_t, src_file_name::Cstring, src_dset_name::Cstring, src_space_id::hid_t)::herr_t "Error setting virtual"
@bind h5p_set_virtual_prefix(dapl_id::hid_t, prefix::Cstring)::herr_t "Error in h5p_set_virtual_prefix (not annotated)"
@bind h5p_set_virtual_printf_gap(dapl_id::hid_t, gap_size::hsize_t)::herr_t "Error in h5p_set_virtual_printf_gap (not annotated)"
@bind h5p_set_virtual_view(dapl_id::hid_t, view::H5D_vds_view_t)::herr_t "Error in h5p_set_virtual_view (not annotated)"
@bind h5p_set_vlen_mem_manager(plist_id::hid_t, alloc_func::H5MM_allocate_t, alloc_info::Ptr{Cvoid}, free_func::H5MM_free_t, free_info::Ptr{Cvoid})::herr_t "Error in h5p_set_vlen_mem_manager (not annotated)"
@bind h5p_set_vol(plist_id::hid_t, new_vol_id::hid_t, new_vol_info::Ptr{Cvoid})::herr_t "Error in h5p_set_vol (not annotated)"
# others
@bind h5p_add_merge_committed_dtype_path(plist_id::hid_t, path::Cstring)::herr_t "Error in h5p_add_merge_committed_dtype_path (not annotated)"
@bind h5p_all_filters_avail(plist_id::hid_t)::htri_t "Error in h5p_all_filters_avail (not annotated)"
@bind h5p_close(id::hid_t)::herr_t "Error closing property list"
@bind h5p_close_class(plist_id::hid_t)::herr_t "Error in h5p_close_class (not annotated)"
@bind h5p_copy(plist_id::hid_t)::hid_t "Error in h5p_copy (not annotated)"
@bind h5p_copy_prop(dst_id::hid_t, src_id::hid_t, name::Cstring)::herr_t "Error in h5p_copy_prop (not annotated)"
@bind h5p_create(cls_id::hid_t)::hid_t "Error creating property list"
@bind h5p_create_class(parent::hid_t, name::Cstring, create::H5P_cls_create_func_t, create_data::Ptr{Cvoid}, copy::H5P_cls_copy_func_t, copy_data::Ptr{Cvoid}, close::H5P_cls_close_func_t, close_data::Ptr{Cvoid})::hid_t "Error in h5p_create_class (not annotated)"
@bind h5p_decode(buf::Ptr{Cvoid})::hid_t "Error in h5p_decode (not annotated)"
@bind h5p_encode1(plist_id::hid_t, buf::Ptr{Cvoid}, nalloc::Ptr{Csize_t})::herr_t "Error in h5p_encode1 (not annotated)"
@bind h5p_encode2(plist_id::hid_t, buf::Ptr{Cvoid}, nalloc::Ptr{Csize_t}, fapl_id::hid_t)::herr_t "Error in h5p_encode2 (not annotated)"
@bind h5p_equal(id1::hid_t, id2::hid_t)::htri_t "Error in h5p_equal (not annotated)"
@bind h5p_exist(plist_id::hid_t, name::Cstring)::htri_t "Error in h5p_exist (not annotated)"
@bind h5p_fill_value_defined(plist::hid_t, status::Ptr{H5D_fill_value_t})::herr_t "Error in h5p_fill_value_defined (not annotated)"
@bind h5p_free_merge_committed_dtype_paths(plist_id::hid_t)::herr_t "Error in h5p_free_merge_committed_dtype_paths (not annotated)"
@bind h5p_insert1(plist_id::hid_t, name::Cstring, size::Csize_t, value::Ptr{Cvoid}, prp_set::H5P_prp_set_func_t, prp_get::H5P_prp_get_func_t, prp_delete::H5P_prp_delete_func_t, prp_copy::H5P_prp_copy_func_t, prp_close::H5P_prp_close_func_t)::herr_t "Error in h5p_insert1 (not annotated)"
@bind h5p_insert2(plist_id::hid_t, name::Cstring, size::Csize_t, value::Ptr{Cvoid}, set::H5P_prp_set_func_t, get::H5P_prp_get_func_t, prp_del::H5P_prp_delete_func_t, copy::H5P_prp_copy_func_t, compare::H5P_prp_compare_func_t, close::H5P_prp_close_func_t)::herr_t "Error in h5p_insert2 (not annotated)"
@bind h5p_isa_class(plist_id::hid_t, pclass_id::hid_t)::htri_t "Error in h5p_isa_class (not annotated)"
@bind h5p_iterate(id::hid_t, idx::Ptr{Cint}, iter_func::H5P_iterate_t, iter_data::Ptr{Cvoid})::Cint "Error in h5p_iterate (not annotated)"
@bind h5p_modify_filter(plist_id::hid_t, filter_id::H5Z_filter_t, flags::Cuint, cd_nelmts::Csize_t, cd_values::Ptr{Cuint})::herr_t "Error modifying filter"
@bind h5p_register1(cls_id::hid_t, name::Cstring, size::Csize_t, def_value::Ptr{Cvoid}, prp_create::H5P_prp_create_func_t, prp_set::H5P_prp_set_func_t, prp_get::H5P_prp_get_func_t, prp_del::H5P_prp_delete_func_t, prp_copy::H5P_prp_copy_func_t, prp_close::H5P_prp_close_func_t)::herr_t "Error in h5p_register1 (not annotated)"
@bind h5p_register2(cls_id::hid_t, name::Cstring, size::Csize_t, def_value::Ptr{Cvoid}, create::H5P_prp_create_func_t, set::H5P_prp_set_func_t, get::H5P_prp_get_func_t, prp_del::H5P_prp_delete_func_t, copy::H5P_prp_copy_func_t, compare::H5P_prp_compare_func_t, close::H5P_prp_close_func_t)::herr_t "Error in h5p_register2 (not annotated)"
@bind h5p_remove(plist_id::hid_t, name::Cstring)::herr_t "Error in h5p_remove (not annotated)"
@bind h5p_remove_filter(plist_id::hid_t, filter_id::H5Z_filter_t)::herr_t "Error removing filter"
@bind h5p_unregister(pclass_id::hid_t, name::Cstring)::herr_t "Error in h5p_unregister (not annotated)"
###
### Plugin Interface
###
@bind h5pl_set_loading_state(plugin_control_mask::Cuint)::herr_t "Error setting plugin loading state"
@bind h5pl_get_loading_state(plugin_control_mask::Ptr{Cuint})::herr_t "Error getting plugin loading state"
@bind h5pl_append(search_path::Cstring)::herr_t "Error appending plugin path"
@bind h5pl_prepend(search_path::Cstring)::herr_t "Error prepending plugin path"
@bind h5pl_replace(search_path::Cstring, index::Cuint)::herr_t "Error replacing plugin path"
@bind h5pl_insert(search_path::Cstring, index::Cuint)::herr_t "Error inserting plugin path"
@bind h5pl_remove(index::Cuint)::herr_t "Error removing plugin path"
@bind h5pl_get(index::Cuint, path_buf::Ptr{Cchar}, buf_size::Csize_t)::Cssize_t "Error getting plugin path"
@bind h5pl_size(num_paths::Ptr{Cuint})::herr_t "Error getting number of plugin paths"
###
### Reference Interface
###
@bind h5r_create(ref::Ptr{Cvoid}, loc_id::hid_t, pathname::Cstring, ref_type::Cint, space_id::hid_t)::herr_t string("Error creating reference to object ", h5i_get_name(loc_id), "/", pathname)
@bind h5r_dereference2(obj_id::hid_t, oapl_id::hid_t, ref_type::Cint, ref::Ptr{Cvoid})::hid_t "Error dereferencing object"
@bind h5r_get_obj_type2(loc_id::hid_t, ref_type::Cint, ref::Ptr{Cvoid}, obj_type::Ptr{Cint})::herr_t "Error getting object type"
@bind h5r_get_region(loc_id::hid_t, ref_type::Cint, ref::Ptr{Cvoid})::hid_t "Error getting region from reference"
###
### Dataspace Interface
###
@bind h5s_close(space_id::hid_t)::herr_t "Error closing dataspace"
@bind h5s_combine_hyperslab(dspace_id::hid_t, seloper::H5S_seloper_t, start::Ptr{hsize_t}, stride::Ptr{hsize_t}, count::Ptr{hsize_t}, block::Ptr{hsize_t})::herr_t "Error selecting hyperslab"
@bind h5s_combine_select(space1_id::hid_t, op::H5S_seloper_t, space2_id::hid_t)::hid_t "Error combining dataspaces" (v"1.10.7", nothing)
@bind h5s_copy(space_id::hid_t)::hid_t "Error copying dataspace"
@bind h5s_create(class::Cint)::hid_t "Error creating dataspace"
@bind h5s_create_simple(rank::Cint, current_dims::Ptr{hsize_t}, maximum_dims::Ptr{hsize_t})::hid_t "Error creating simple dataspace"
@bind h5s_extent_copy(dst::hid_t, src::hid_t)::herr_t "Error copying extent"
@bind h5s_extent_equal(space1_id::hid_t, space2_id::hid_t)::htri_t "Error comparing dataspaces"
@bind h5s_get_regular_hyperslab(space_id::hid_t, start::Ptr{hsize_t}, stride::Ptr{hsize_t}, count::Ptr{hsize_t}, block::Ptr{hsize_t})::herr_t "Error getting regular hyperslab selection"
@bind h5s_get_select_bounds(space_id::hid_t, starts::Ptr{hsize_t}, ends::Ptr{hsize_t})::herr_t "Error getting bounding box for selection"
@bind h5s_get_select_elem_npoints(space_id::hid_t)::hssize_t "Error getting number of elements in dataspace selection"
@bind h5s_get_select_elem_pointlist(space_id::hid_t, startpoint::hsize_t, numpoints::hsize_t, buf::Ptr{hsize_t})::herr_t "Error getting list of element points"
@bind h5s_get_select_hyper_blocklist(space_id::hid_t, startblock::hsize_t, numblocks::hsize_t, buf::Ptr{hsize_t})::herr_t "Error getting list of hyperslab blocks"
@bind h5s_get_select_hyper_nblocks(space_id::hid_t)::hssize_t "Error getting number of selected blocks"
@bind h5s_get_select_npoints(space_id::hid_t)::hsize_t "Error getting the number of selected points"
@bind h5s_get_select_type(space_id::hid_t)::H5S_sel_type "Error getting the selection type"
@bind h5s_get_simple_extent_dims(space_id::hid_t, dims::Ptr{hsize_t}, maxdims::Ptr{hsize_t})::Cint "Error getting the dimensions for a dataspace"
@bind h5s_get_simple_extent_ndims(space_id::hid_t)::Cint "Error getting the number of dimensions for a dataspace"
@bind h5s_get_simple_extent_type(space_id::hid_t)::H5S_class_t "Error getting the dataspace type"
@bind h5s_is_regular_hyperslab(space_id::hid_t)::htri_t "Error determining whether dataspace is regular hyperslab"
@bind h5s_is_simple(space_id::hid_t)::htri_t "Error determining whether dataspace is simple"
@bind h5s_modify_select(space_id::hid_t, op::H5S_seloper_t, space2_id::hid_t)::herr_t "Error modifying selection"
@bind h5s_offset_simple(space_id::hid_t, offset::Ptr{hssize_t})::herr_t "Error offsetting simple dataspace extent"
@bind h5s_select_adjust(space_id::hid_t, offset::Ptr{hssize_t})::herr_t "Error adjusting selection offset"
@bind h5s_select_all(space_id::hid_t)::herr_t "Error selecting all of dataspace"
@bind h5s_select_copy(dst::hid_t, src::hid_t)::herr_t "Error copying selection"
@bind h5s_select_elements(space_id::hid_t, op::H5S_seloper_t, num_elem::Csize_t, coord::Ptr{hsize_t})::herr_t "Error selecting elements"
@bind h5s_select_hyperslab(dspace_id::hid_t, seloper::H5S_seloper_t, start::Ptr{hsize_t}, stride::Ptr{hsize_t}, count::Ptr{hsize_t}, block::Ptr{hsize_t})::herr_t "Error selecting hyperslab"
@bind h5s_select_intersect_block(space_id::hid_t, starts::Ptr{hsize_t}, ends::Ptr{hsize_t})::htri_t "Error determining whether selection intersects block"
@bind h5s_select_shape_same(space1_id::hid_t, space2_id::hid_t)::htri_t "Error determining whether dataspace shapes are the same"
@bind h5s_select_valid(spaceid::hid_t)::htri_t "Error determining whether selection is within extent"
@bind h5s_set_extent_none(space_id::hid_t)::herr_t "Error setting dataspace extent to none"
@bind h5s_set_extent_simple(dspace_id::hid_t, rank::Cint, current_size::Ptr{hsize_t}, maximum_size::Ptr{hsize_t})::herr_t "Error setting dataspace size"
###
### Datatype Interface
###
@bind h5t_array_create2(basetype_id::hid_t, ndims::Cuint, sz::Ptr{hsize_t})::hid_t string("Error creating H5T_ARRAY of id ", basetype_id, " and size ", sz)
@bind h5t_close(dtype_id::hid_t)::herr_t "Error closing datatype"
@bind h5t_committed(dtype_id::hid_t)::htri_t "Error determining whether datatype is committed"
@bind h5t_commit2(loc_id::hid_t, name::Cstring, dtype_id::hid_t, lcpl_id::hid_t, tcpl_id::hid_t, tapl_id::hid_t)::herr_t "Error committing type"
# @bind h5t_commit_anon
# @bind h5t_compiler_conv
@bind h5t_copy(dtype_id::hid_t)::hid_t "Error copying datatype"
@bind h5t_create(class_id::Cint, sz::Csize_t)::hid_t string("Error creating datatype of id ", class_id)
# @bind h5t_decode
# @bind h5t_detect_class
# @bind h5t_encode
# @bind h5t_enum_create
@bind h5t_enum_insert(dtype_id::hid_t, name::Cstring, value::Ptr{Cvoid})::herr_t string("Error adding ", name, " to enum datatype")
# @bind h5t_enum_nameof
# @bind h5t_enum_valueof
@bind h5t_equal(dtype_id1::hid_t, dtype_id2::hid_t)::htri_t "Error checking datatype equality"
# @bind h5t_find
@bind h5t_get_array_dims2(dtype_id::hid_t, dims::Ptr{hsize_t})::Cint "Error getting dimensions of array"
@bind h5t_get_array_ndims(dtype_id::hid_t)::Cint "Error getting ndims of array"
@bind h5t_get_class(dtype_id::hid_t)::Cint "Error getting class"
@bind h5t_get_cset(dtype_id::hid_t)::Cint "Error getting character set encoding"
@bind h5t_get_ebias(dtype_id::hid_t)::Csize_t "Error getting exponent bias"
@bind h5t_get_fields(dtype_id::hid_t, spos::Ref{Csize_t}, epos::Ref{Csize_t}, esize::Ref{Csize_t}, mpos::Ref{Csize_t}, msize::Ref{Csize_t})::herr_t "Error getting datatype floating point bit positions"
@bind h5t_get_member_class(dtype_id::hid_t, index::Cuint)::Cint string("Error getting class of compound datatype member #", index)
@bind h5t_get_member_index(dtype_id::hid_t, membername::Cstring)::Cint string("Error getting index of compound datatype member \"", membername, "\"")
# @bind h5t_get_member_name(dtype_id::hid_t, index::Cuint)::Cstring string("Error getting name of compound datatype member #", index) # See below
@bind h5t_get_member_offset(dtype_id::hid_t, index::Cuint)::Csize_t "Error getting offset of compound datatype #$(index)"
@bind h5t_get_member_type(dtype_id::hid_t, index::Cuint)::hid_t string("Error getting type of compound datatype member #", index)
# @bind h5t_get_member_value
@bind h5t_get_native_type(dtype_id::hid_t, direction::Cint)::hid_t "Error getting native type"
@bind h5t_get_nmembers(dtype_id::hid_t)::Cint "Error getting the number of members"
# @bind h5t_get_norm
@bind h5t_get_offset(dtype_id::hid_t)::Cint "Error getting offset"
@bind h5t_get_order(dtype_id::hid_t)::Cint "Error getting order"
# @bind h5t_get_pad(dtype_id::hid_t, lsb::Ptr{H5T_pad_t}, msb::Ptr{H5T_pad_t})::herr_t "Error getting pad"
@bind h5t_get_precision(dtype_id::hid_t)::Csize_t "Error getting precision"
@bind h5t_get_sign(dtype_id::hid_t)::Cint "Error getting sign"
@bind h5t_get_size(dtype_id::hid_t)::Csize_t "Error getting type size"
@bind h5t_get_strpad(dtype_id::hid_t)::Cint "Error getting string padding"
@bind h5t_get_super(dtype_id::hid_t)::hid_t "Error getting super type"
# @bind h5t_get_tag(type_id::hid_t)::Cstring "Error getting datatype opaque tag" # See below
@bind h5t_insert(dtype_id::hid_t, fieldname::Cstring, offset::Csize_t, field_id::hid_t)::herr_t string("Error adding field ", fieldname, " to compound datatype")
@bind h5t_is_variable_str(type_id::hid_t)::htri_t "Error determining whether string is of variable length"
@bind h5t_lock(type_id::hid_t)::herr_t "Error locking type"
@bind h5t_open2(loc_id::hid_t, name::Cstring, tapl_id::hid_t)::hid_t string("Error opening type ", h5i_get_name(loc_id), "/", name)
# @bind h5t_pack
# @bind h5t_reclaim
# @bind h5t_refresh
# @bind h5t_register
@bind h5t_set_cset(dtype_id::hid_t, cset::Cint)::herr_t "Error setting character set in datatype"
@bind h5t_set_ebias(dtype_id::hid_t, ebias::Csize_t)::herr_t "Error setting datatype floating point exponent bias"
@bind h5t_set_fields(dtype_id::hid_t, spos::Csize_t, epos::Csize_t, esize::Csize_t, mpos::Csize_t, msize::Csize_t)::herr_t "Error setting datatype floating point bit positions"
# @bind h5t_set_inpad(dtype_id::hid_t, inpad::H5T_pad_t)::herr_t "Error setting inpad"
# @bind h5t_set_norm(dtype_id::hid_t, norm::H5T_norm_t)::herr_t "Error setting mantissa"
@bind h5t_set_offset(dtype_id::hid_t, offset::Csize_t)::herr_t "Error setting offset"
@bind h5t_set_order(dtype_id::hid_t, order::Cint)::herr_t "Error setting order"
@bind h5t_set_precision(dtype_id::hid_t, sz::Csize_t)::herr_t "Error setting precision of datatype"
@bind h5t_set_size(dtype_id::hid_t, sz::Csize_t)::herr_t "Error setting size of datatype"
@bind h5t_set_strpad(dtype_id::hid_t, sz::Cint)::herr_t "Error setting string padding of datatype"
@bind h5t_set_tag(dtype_id::hid_t, tag::Cstring)::herr_t "Error setting opaque tag"
# @bind h5t_unregister
@bind h5t_vlen_create(base_type_id::hid_t)::hid_t "Error creating vlen type"
# The following are not automatically wrapped since they have requirements about freeing
# the memory that is returned from the calls. They are implemented via api_helpers.jl
#@bind h5t_get_member_name(dtype_id::hid_t, index::Cuint)::Cstring string("Error getting name of compound datatype member #", index)
#@bind h5t_get_tag(type_id::hid_t)::Cstring "Error getting datatype opaque tag"
###
### Optimized Functions Interface
###
@bind h5do_append(dset_id::hid_t, dxpl_id::hid_t, index::Cuint, num_elem::hsize_t, memtype::hid_t, buffer::Ptr{Cvoid})::herr_t "Error appending"
# h5do_write_chunk is deprecated as of hdflib 1.10.3
@bind h5do_write_chunk(dset_id::hid_t, dxpl_id::hid_t, filter_mask::UInt32, offset::Ptr{hsize_t}, bufsize::Csize_t, buf::Ptr{Cvoid})::herr_t "Error writing chunk"
###
### High Level Dimension Scale Interface
###
@bind h5ds_attach_scale(did::hid_t, dsid::hid_t, idx::Cuint)::herr_t "Unable to attach scale"
@bind h5ds_detach_scale(did::hid_t, dsid::hid_t, idx::Cuint)::herr_t "Unable to detach scale"
@bind h5ds_get_label(did::hid_t, idx::Cuint, label::Ptr{UInt8}, size::hsize_t)::herr_t "Unable to get label"
@bind h5ds_get_num_scales(did::hid_t, idx::Cuint)::Cint "Error getting number of scales"
@bind h5ds_get_scale_name(did::hid_t, name::Ptr{UInt8}, size::Csize_t)::Cssize_t "Unable to get scale name"
@bind h5ds_is_attached(did::hid_t, dsid::hid_t, idx::Cuint)::htri_t "Unable to check if dimension is attached"
@bind h5ds_is_scale(did::hid_t)::htri_t "Unable to check if dataset is scale"
@bind h5ds_set_label(did::hid_t, idx::Cuint, label::Ref{UInt8})::herr_t "Unable to set label"
@bind h5ds_set_scale(dsid::hid_t, dimname::Cstring)::herr_t "Unable to set scale"
###
### HDF5 Lite Interface
###
@bind h5lt_dtype_to_text(datatype::hid_t, str::Ptr{UInt8}, lang_type::Cint, len::Ref{Csize_t})::herr_t "Error getting datatype text representation"
###
### Table Interface
###
@bind h5tb_append_records(loc_id::hid_t, dset_name::Cstring, nrecords::hsize_t, type_size::Csize_t, field_offset::Ptr{Csize_t}, field_sizes::Ptr{Csize_t}, data::Ptr{Cvoid})::herr_t "Error adding record to table"
@bind h5tb_get_field_info(loc_id::hid_t, table_name::Cstring, field_names::Ptr{Ptr{UInt8}}, field_sizes::Ptr{Csize_t}, field_offsets::Ptr{Csize_t}, type_size::Ptr{Csize_t})::herr_t "Error getting field information"
@bind h5tb_get_table_info(loc_id::hid_t, table_name::Cstring, nfields::Ptr{hsize_t}, nrecords::Ptr{hsize_t})::herr_t "Error getting table information"
# NOTE: The HDF5 docs incorrectly specify type_size::hsize_t where as it should be type_size::Csize_t
@bind h5tb_make_table(table_title::Cstring, loc_id::hid_t, dset_name::Cstring, nfields::hsize_t, nrecords::hsize_t, type_size::Csize_t, field_names::Ptr{Cstring}, field_offset::Ptr{Csize_t}, field_types::Ptr{hid_t}, chunk_size::hsize_t, fill_data::Ptr{Cvoid}, compress::Cint, data::Ptr{Cvoid})::herr_t "Error creating and writing dataset to table"
@bind h5tb_read_records(loc_id::hid_t, table_name::Cstring, start::hsize_t, nrecords::hsize_t, type_size::Csize_t, field_offsets::Ptr{Csize_t}, dst_sizes::Ptr{Csize_t}, data::Ptr{Cvoid})::herr_t "Error reading record from table"
@bind h5tb_read_table(loc_id::hid_t, table_name::Cstring, dst_size::Csize_t, dst_offset::Ptr{Csize_t}, dst_sizes::Ptr{Csize_t}, dst_buf::Ptr{Cvoid})::herr_t "Error reading table"
@bind h5tb_write_records(loc_id::hid_t, table_name::Cstring, start::hsize_t, nrecords::hsize_t, type_size::Csize_t, field_offsets::Ptr{Csize_t}, field_sizes::Ptr{Csize_t}, data::Ptr{Cvoid})::herr_t "Error writing record to table"
###
### Filter Interface
###
@bind h5z_register(filter_class::Ref{H5Z_class_t})::herr_t "Unable to register new filter"
@bind h5z_unregister(id::H5Z_filter_t)::herr_t "Unable to unregister filter"
@bind h5z_filter_avail(id::H5Z_filter_t)::htri_t "Unable to check filter availability"
@bind h5z_get_filter_info(filter::H5Z_filter_t, filter_config_flags::Ptr{Cuint})::herr_t "Error getting filter information"
###
### File driver interface
### FD consts: these are defined in hdf5 as macros
@bind h5fd_core_init()::hid_t "Error initializing file driver"
@bind h5fd_family_init()::hid_t "Error initializing file driver"
@bind h5fd_log_init()::hid_t "Error initializing file driver"
@bind h5fd_mpio_init()::hid_t "Error initializing file driver"
@bind h5fd_multi_init()::hid_t "Error initializing file driver"
@bind h5fd_sec2_init()::hid_t "Error initializing file driver"
@bind h5fd_stdio_init()::hid_t "Error initializing file driver"
using Base.Meta: isexpr, quot
# Some names don't follow the automatic conversion rules, so define how to make some
# of the translations explicitly.
const bind_exceptions = Dict{Symbol,Symbol}()
# Load URLs for HDF Group Doxygen
const urldict = Dict{String,String}([
func_urls[1] => func_urls[2] for
func_urls in split.(readlines("DoxygenTagParser/hdf5_func_urls.tsv"))
])
# have numbers at the end
bind_exceptions[:h5p_set_fletcher32] = :H5Pset_fletcher32
bind_exceptions[:h5p_set_fapl_sec2] = :H5Pset_fapl_sec2
bind_exceptions[:h5p_get_fapl_ros3] = :H5Pget_fapl_ros3
bind_exceptions[:h5p_set_fapl_ros3] = :H5Pset_fapl_ros3
# underscore separator not removed
bind_exceptions[:h5fd_core_init] = :H5FD_core_init
bind_exceptions[:h5fd_family_init] = :H5FD_family_init
bind_exceptions[:h5fd_log_init] = :H5FD_log_init
bind_exceptions[:h5fd_mpio_init] = :H5FD_mpio_init
bind_exceptions[:h5fd_multi_init] = :H5FD_multi_init
bind_exceptions[:h5fd_sec2_init] = :H5FD_sec2_init
bind_exceptions[:h5fd_stdio_init] = :H5FD_stdio_init
# An expression which is injected at the beginning of the API definitions to aid in doing
# (pre)compile-time conditional compilation based on the libhdf5 version.
_libhdf5_build_ver_expr = quote
_libhdf5_build_ver = let
majnum, minnum, relnum = Ref{Cuint}(), Ref{Cuint}(), Ref{Cuint}()
r = ccall(
(:H5get_libversion, libhdf5),
herr_t,
(Ref{Cuint}, Ref{Cuint}, Ref{Cuint}),
majnum,
minnum,
relnum
)
r < 0 && error("Error getting HDF5 library version")
VersionNumber(majnum[], minnum[], relnum[])
end
end
# We'll also use this processing pass to automatically generate documentation that simply
# lists all of the bound API functions.
const bound_api = Dict{String,Vector{String}}()
"""
@bind h5_function(arg1::Arg1Type, ...)::ReturnType [ErrorStringOrExpression] [(lb,ub)]
A binding generator for translating `@ccall`-like declarations of HDF5 library functions
to error-checked `ccall` expressions.
The provided function name is used to define the Julia function with any trailing version
number removed (such as `h5t_open2` -> `h5t_open`). The corresponding C function name is
auto-generated by uppercasing the prefix before the first `_` and then removing that `_`.
Explicit name mappings can be made by inserting a `:jlname => :h5name` pair into
the `bind_exceptions` dictionary.
The optional `ErrorStringOrExpression` can be either a string literal or an expression
which evaluates to a string. This string is used as the message in the `h5error(msg)`
call. Note that the expression may refer to any of the arguments by name.
The optional Tuple `(lb,ub)` is a tuple of version literals (e.g. v"1.10.5") or `nothing`
defining an inclusive lower bound (lb) and exclusive upper bound (ub) such that
the binding is only defined if `lb ≤ _libhdf5_build_ver < ub`. If either bound is `nothing`
then that bound is ignored and the version is unbounded on that side.
The declared return type in the function-like signature must be the return type of the C
function, and the Julia return type is inferred as one of the following possibilities:
1. If `ReturnType === :herr_t`, the Julia function returns `nothing` and the C return is
used only in error checking.
2. If `ReturnType === :htri_t`, the Julia function returns a boolean indicating whether
the return value of the C function was zero (`false`) or positive (`true`).
3. Otherwise, the C function return value is returned from the Julia function, with the
following exceptions:
- If the return type is a C integer type compatible with `Int`, the return value is
converted to an `Int`.
Furthermore, the C return value is interpreted to automatically generate error checks:
1. If `ReturnType === :herr_t` or `ReturnType === :htri_t`, an error is raised when the return
value is negative.
2. If `ReturnType === :haddr_t` or `ReturnType === :hsize_t`, an error is raised when the
return value is equivalent to `-1 % haddr_t` and `-1 % hsize_t`, respectively.
3. If `ReturnType` is a `Ptr` expression, an error is raised when the return value is
equal to `C_NULL`.
4. If `ReturnType` is a `Csize_t` expression, an error is raised when the return value is
equal to `0`. Note that at least in the case of `h5t_get_member_offset`, `0` is also a
valid return value, so it is necessary for `h5error()` to check the error stack to
determine whether or not an error should be thrown.
5. For all other return types, it is assumed a negative value indicates error.
It is assumed that the HDF5 library names are given in global constants named `libhdf5`
and `libhdf5_hl`. The former is used for all `ccall`s, except that the latter is used when
the C function name begins with "H5DO", "H5DS", "H5LT", or "H5TB".
"""
macro bind(sig::Expr, err::Union{String,Expr}, vers::Union{Expr,Nothing}=nothing)
expr = _bind(__module__, __source__, sig, err)
isnothing(vers) && return esc(expr)
isexpr(vers, :tuple) || error("Expected 2-tuple of version bounds, got ", vers)
length(vers.args) == 2 || error("Expected 2-tuple of version bounds, got ", vers)
lb = vers.args[1]
ub = vers.args[2]
if lb !== :nothing && !(isexpr(lb, :macrocall) && lb.args[1] == Symbol("@v_str"))
error("Lower version bound must be `nothing` or version number literal, got ", lb)
end
if ub !== :nothing && !(isexpr(ub, :macrocall) && ub.args[1] == Symbol("@v_str"))
error("Upper version bound must be `nothing` or version number literal, got ", ub)
end
if lb === :nothing && ub !== :nothing
conditional = :(_libhdf5_build_ver < $(ub))
elseif lb !== :nothing && ub === :nothing
conditional = :($(lb) ≤ _libhdf5_build_ver)
else
conditional = :($(lb) ≤ _libhdf5_build_ver < $(ub))
end
conditional = Expr(:if, conditional, expr)
return esc(Expr(:macrocall, Symbol("@static"), nothing, conditional))
end
function _bind(__module__, __source__, sig::Expr, err::Union{String,Expr,Nothing})
sig.head === :(::) || error("return type required on function signature")
# Pull apart return-type and rest of function declaration
rettype = sig.args[2]::Union{Symbol,Expr}
funcsig = sig.args[1]
isexpr(funcsig, :call) ||
error("expected function-like expression, found `", funcsig, "`")
funcsig = funcsig::Expr
# Extract function name and argument list
jlfuncname = funcsig.args[1]::Symbol
funcargs = funcsig.args[2:end]
# Pull apart argument names and types
args = Vector{Symbol}()
argt = Vector{Union{Expr,Symbol}}()
for ii in 1:length(funcargs)
argex = funcargs[ii]
if !isexpr(argex, :(::)) || !(argex.args[1] isa Symbol)
error(
"expected `name::type` expression in argument ", ii, ", got ", funcargs[ii]
)
end
push!(args, argex.args[1])
push!(argt, argex.args[2])
end
prefix, rest = split(string(jlfuncname), "_"; limit=2)
# Translate the C function name to a local equivalent
if haskey(bind_exceptions, jlfuncname)
cfuncname = bind_exceptions[jlfuncname]
else
# turn e.g. h5f_close into H5Fclose
cfuncname = Symbol(uppercase(prefix), rest)
# Remove the version number if present (excluding match to literal "hdf5" suffix)
if occursin(r"\d(?<!hdf5)$", String(jlfuncname))
jlfuncname = Symbol(chop(String(jlfuncname); tail=1))
end
end
# Store the function prototype in HDF5-module specific lists:
funclist = get!(bound_api, uppercase(prefix), Vector{String}(undef, 0))
string(jlfuncname) in funclist || push!(funclist, string(jlfuncname))
# Also start building the matching doc string.
docfunc = copy(funcsig)
docfunc.args[1] = jlfuncname
docstr = " $docfunc"
# Determine the underlying C library to call
lib = startswith(string(cfuncname), r"H5(DO|DS|LT|TB)") ? :libhdf5_hl : :libhdf5
# Now start building up the full expression:
statsym = Symbol("#status#") # not using gensym() to have stable naming
# The ccall(...) itself
cfunclib = Expr(:tuple, quot(cfuncname), lib)
ccallexpr = :(ccall($cfunclib, $rettype, ($(argt...),), $(args...)))
# The error condition expression
# errexpr = :(@hdf5error($err))
# avoid generating line numbers in macrocall
errexpr = Expr(:macrocall, Symbol("@h5error"), nothing, err)
if rettype === :Csize_t
# a return value of 0 may indicate an error; h5error() also consults the error stack (see note 4 above)
errexpr = :($statsym == 0 % $rettype && $errexpr)
elseif rettype === :haddr_t || rettype === :hsize_t
# Error typically indicated by negative values, but some return types are unsigned
# integers. From `H5public.h`:
# ADDR_UNDEF => (haddr_t)(-1)
# HSIZE_UNDEF => (hsize_t)(-1)
# which are both just `-1 % $type` in Julia
errexpr = :($statsym == -1 % $rettype && $errexpr)
elseif isexpr(rettype, :curly) && rettype.args[1] === :Ptr
errexpr = :($statsym == C_NULL && $errexpr)
else
errexpr = :($statsym < $rettype(0) && $errexpr)
end
# Three cases for handling the return type
if rettype === :htri_t
# Returns a Boolean on non-error
returnexpr = :(return $statsym > 0)
docstr *= " -> Bool"
elseif rettype === :herr_t
# Only used to indicate error status
returnexpr = :(return nothing)
elseif rettype === :Cint
# Convert to Int type
returnexpr = :(return Int($statsym))
docstr *= " -> Int"
else
# Returns a value
returnexpr = :(return $statsym)
docstr *= " -> $rettype"
end
if prefix == "h5fd" && endswith(rest, "_init")
# remove trailing _init
drivername = rest[1:(end - 5)]
docstr *=
"\n\nThis function is exposed in `libhdf5` as the macro `H5FD_$(uppercase(drivername))`. " *
"See `libhdf5` documentation for [`H5Pget_driver`]" *
"(" *
urldict["H5Pget_driver"] *
").\n"
else
docstr *=
"\n\nSee `libhdf5` documentation for [`$cfuncname`]" *
"(" *
get(urldict, string(cfuncname), "https://docs.hdfgroup.org/hdf5/v1_14/") *
").\n"
end
# Then assemble the pieces. Doing it through explicit Expr() objects
# avoids inserting the line number nodes for the macro --- the call site
# is instead explicitly injected into the function body via __source__.
jlfuncsig = Expr(:call, jlfuncname, args...)
jlfuncbody = Expr(
:block, __source__, :(lock(liblock)), :($statsym = try
$ccallexpr
finally
unlock(liblock)
end)
)
if errexpr !== nothing
push!(jlfuncbody.args, errexpr)
end
push!(jlfuncbody.args, returnexpr)
jlfuncexpr = Expr(:function, jlfuncsig, jlfuncbody)
jlfuncexpr = Expr(:macrocall, Symbol("@doc"), nothing, docstr, jlfuncexpr)
return jlfuncexpr
end
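# Illustrative sketch (not executed here) of what a single `@bind` declaration expands to,
# assuming the `liblock` lock and `@h5error` macro defined elsewhere in the API module.
# A hypothetical declaration such as
#   @bind h5a_close(id::hid_t)::herr_t "Error closing attribute"
# generates roughly the following error-checked wrapper (plus an attached doc string):
#   function h5a_close(id::hid_t)
#       lock(liblock)
#       status = try
#           ccall((:H5Aclose, libhdf5), herr_t, (hid_t,), id)
#       finally
#           unlock(liblock)
#       end
#       status < herr_t(0) && @h5error("Error closing attribute")
#       return nothing
#   end
# (the actual generated binding uses `Symbol("#status#")` in place of `status`)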
# Generate ../src/api/functions.jl
# Run `julia --project=.. gen_wrappers.jl` to execute this script
const group_url_dict = Dict{String,String}([
func_urls[1] => func_urls[2] for
func_urls in split.(readlines("DoxygenTagParser/hdf5_group_urls.tsv"))
])
group_url_dict["H5FD"] = group_url_dict["VFL"]
include(joinpath(@__DIR__, "bind_generator.jl"))
# Read in the API definition macros from the definitions file
defs = read(joinpath(@__DIR__, "api_defs.jl"), String)
# Have Julia expand/run the @bind macro to generate expressions for all of the functions
exprs = Base.include_string(
@__MODULE__, "@macroexpand1 begin\n" * defs * "\nend", "api_defs.jl"
)
# Insert the conditional version helper expression
prepend!(exprs.args, _libhdf5_build_ver_expr.args)
Base.remove_linenums!(exprs)
# Definitions which are not automatically generated, but should still be documented as
# part of the raw low-level API:
append!(bound_api["H5O"], [
# defined in src/api/helpers.jl
"h5o_get_info1",
])
append!(bound_api["H5P"], [
# defined in src/api/helpers.jl
"h5p_get_class_name",
"h5p_get_fapl_mpio",
"h5p_set_fapl_mpio",
])
append!(bound_api["H5T"], [
# defined in src/api/helpers.jl
"h5t_get_member_name",
"h5t_get_tag",
])
# Now dump the text representation to disk
open(joinpath(@__DIR__, "..", "src", "api", "functions.jl"), "w") do fid
println(
fid,
"""
#! format: off
# This file is autogenerated by HDF5.jl's `gen/gen_wrappers.jl` and should not be edited.
#
# To add new bindings, define the binding in `gen/api_defs.jl`, re-run
# `gen/gen_wrappers.jl`, and commit the updated `src/api/functions.jl`.
"""
)
function triplequote(s::String, indent="", prefix="")
ret = indent * prefix * "\"\"\"\n"
for l in eachline(IOBuffer(s))
ret *= isempty(l) ? "\n" : indent * l * "\n"
end
ret *= indent * "\"\"\"\n"
return ret
end
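# For example (illustrative input only), triplequote("h5a_close(id)\n", " "^4, "@doc ")
# returns the string
#     @doc """
#     h5a_close(id)
#     """
# i.e. a triple-quoted block indented by four spaces with the given prefix on the opening line.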
ismacro(ex, sym, n=0) =
isexpr(ex, :macrocall) && length(ex.args) >= n + 2 && ex.args[1] == sym
for funcblock in exprs.args
if ismacro(funcblock, Symbol("@doc"), 2)
# Pretty print the doc macro as just a juxtaposed doc string and function
# definition; the `@doc` construction is necessary in AST form for the docs
# to be included in interactive use of `@bind`, but in source form we can
# rely on Julia's parsing behavior.
print(fid, triplequote(funcblock.args[3]), funcblock.args[4], "\n\n")
elseif ismacro(funcblock, Symbol("@static"), 1) &&
isexpr(funcblock.args[3], :if, 2) &&
ismacro(funcblock.args[3].args[2], Symbol("@doc"), 2)
# Within a @static block, we have to keep the @doc prefix, but we can still
# switch to triple-quoting and there's special parsing to allow the function
# definition to be on the next line.
#
# Work around the expression printer in this more complex case by printing
# to a buffer and string-replacing a sentinel value
docstr = funcblock.args[3].args[2].args[3]
funcblock.args[3].args[2].args[3] = "SENTINEL_DOC"
buf = sprint(print, funcblock)
# Two-step deindent since `r"^\s{4}(\s{4})?"m => s"\1"` errors: see JuliaLang/julia#31456
buf = replace(buf, r"^\s{4}"m => s"") # deindent
buf = replace(buf, r"^(\s{4})\s{4}"m => s"\1") # deindent
# Now format the doc string and replace (note need to indent `function`)
buf = replace(
buf,
r"^\s+@doc \"SENTINEL_DOC\" "m =>
triplequote(docstr, " "^4, "@doc ") * " "^4
)
print(fid, buf, "\n\n")
else
# passthrough
print(fid, funcblock, "\n\n")
end
end
# Remove last endline
truncate(fid, position(fid) - 1)
end
# Also generate auto-docs that simply list all of the bound API functions
apidocs = ""
for (mod, desc, urltail) in (
("H5", "General Library Functions", "Library"),
("H5A", "Attribute Interface", "Attributes"),
("H5D", "Dataset Interface", "Datasets"),
("H5E", "Error Interface", "Error+Handling"),
("H5F", "File Interface", "Files"),
("H5G", "Group Interface", "Groups"),
("H5I", "Identifier Interface", "Identifiers"),
("H5L", "Link Interface", "Links"),
("H5O", "Object Interface", "Objects"),
("H5PL", "Plugin Interface", "Plugins"),
("H5P", "Property Interface", "Property+Lists"),
("H5R", "Reference Interface", "References"),
("H5S", "Dataspace Interface", "Dataspaces"),
("H5T", "Datatype Interface", "Datatypes"),
("H5Z", "Filter Interface", "Filters"),
("H5FD", "File Drivers", "File+Drivers"),
("H5DO", "Optimized Functions Interface", "Optimizations"),
("H5DS", "Dimension Scale Interface", "Dimension+Scales"),
("H5LT", "Lite Interface", "Lite"),
("H5TB", "Table Interface", "Tables"),
)
global apidocs
funclist = sort!(bound_api[mod])
index = join(["- [`$f`](@ref $f)" for f in funclist], "\n")
funcs = join(funclist, "\n")
apidocs *= """
---
## [[`$mod`]($(get(group_url_dict, mod, "https://docs.hdfgroup.org/hdf5/v1_14/"))) — $desc](@id $mod)
$index
```@docs
$funcs
```
"""
end
open(joinpath(@__DIR__, "..", "docs", "src", "api_bindings.md"), "w") do fid
write(
fid,
"""
```@raw html
<!-- This file is auto-generated and should not be manually edited. To update, run the
gen/gen_wrappers.jl script -->
```
```@meta
CurrentModule = HDF5.API
```
# Low-level library bindings
At the lowest level, `HDF5.jl` operates by calling the public API of the HDF5 shared
library through a set of `ccall` wrapper functions.
This page documents the function names and nominal C argument types of the API which
have bindings in this package.
Note that in many cases, high-level data types are valid arguments through automatic
`ccall` conversions.
For instance, `HDF5.Datatype` objects will be automatically converted to their `hid_t` ID
by Julia's `cconvert`+`unsafe_convert` `ccall` rules.
There are additional helper wrappers (often for out-argument functions) which are not
documented here.
$apidocs
"""
)
end
nothing
module DoxygenTagParser
using Downloads
using LightXML
export parse_tag_file, hdf5_func_url, save_to_tab_separated_values
struct HDF5FunctionInfo
name::String
anchorfile::String
anchor::String
arglist::String
end
struct HDF5GroupInfo
name::String
title::String
filename::String
end
const DEFAULT_URL_PREFIX = "https://docs.hdfgroup.org/hdf5/v1_14/"
const HDF5_TAG_URL = "$(DEFAULT_URL_PREFIX)hdf5.tag"
"""
parse_tag_file(url)
Parse a Doxygen tag file. This defaults to "$HDF5_TAG_URL".
"""
function parse_tag_file(hdf5_tag_url=HDF5_TAG_URL)
filename = if startswith(hdf5_tag_url, "https://")
Downloads.download(hdf5_tag_url, basename(hdf5_tag_url))
basename(hdf5_tag_url)
else
hdf5_tag_url
end
funcdict = Dict{String,HDF5FunctionInfo}()
groupdict = Dict{String,HDF5GroupInfo}()
parsed = LightXML.parse_file(filename)
tag_root = root(parsed)
for compound_element in child_elements(tag_root)
compound_kind = attribute(compound_element, "kind")
if compound_kind == "class"
# Java or C++ methods
continue
elseif compound_kind == "group" || compound_kind == "page"
group_name = ""
group_title = ""
group_filename = ""
for compound_child in child_elements(compound_element)
if name(compound_child) == "member" &&
attribute(compound_child, "kind") == "function"
func_name = ""
func_anchorfile = ""
func_anchor = ""
func_arglist = ""
for func_child in child_elements(compound_child)
func_child_name = name(func_child)
if func_child_name == "name"
func_name = content(func_child)
elseif func_child_name == "anchorfile"
func_anchorfile = content(func_child)
elseif func_child_name == "anchor"
func_anchor = content(func_child)
elseif func_child_name == "arglist"
func_arglist = content(func_child)
end
end
if func_name == "H5Pget_chunk"
println(compound_element)
end
funcdict[func_name] = HDF5FunctionInfo(
func_name, func_anchorfile, func_anchor, func_arglist
)
elseif name(compound_child) == "name"
group_name = content(compound_child)
elseif name(compound_child) == "title"
group_title = content(compound_child)
if startswith(group_title, "Java")
break
end
elseif name(compound_child) == "filename"
group_filename = content(compound_child)
end
end
if startswith(group_title, "Java")
continue
end
groupdict[group_name] = HDF5GroupInfo(group_name, group_title, group_filename)
end
end
return funcdict, groupdict
end
"""
hdf5_func_url
Build the documentation URL from the anchorfile and anchor.
"""
function hdf5_func_url(info::HDF5FunctionInfo; prefix=DEFAULT_URL_PREFIX)
return prefix * info.anchorfile * "#" * info.anchor
end
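# Sketch of the mapping; the anchor values below are made up for illustration:
#   info = HDF5FunctionInfo("H5Fopen", "group___h5_f.html", "ga1b2c3", "(const char *, unsigned, hid_t)")
#   hdf5_func_url(info) == "https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga1b2c3"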
function hdf5_group_url(info::HDF5GroupInfo; prefix=DEFAULT_URL_PREFIX)
return prefix * info.filename
end
"""
save_to_tab_separated_values
Save the function names and documentation URLs to a file, separated by a tab, with one function per line.
"""
function save_to_tab_separated_values(
func_filename::AbstractString="hdf5_func_urls.tsv",
group_filename::AbstractString="hdf5_group_urls.tsv",
info::Tuple{Dict{String,HDF5FunctionInfo},Dict{String,HDF5GroupInfo}}=parse_tag_file()
)
funcinfo, groupinfo = info
open(func_filename, "w") do f
sorted_funcs = sort!(collect(keys(funcinfo)))
for func in sorted_funcs
println(f, func, "\t", hdf5_func_url(funcinfo[func]))
end
end
open(group_filename, "w") do f
sorted_groups = sort!(collect(keys(groupinfo)))
for group in sorted_groups
println(f, group, "\t", hdf5_group_url(groupinfo[group]))
end
end
end
function __init__()
if abspath(PROGRAM_FILE) == @__FILE__()
main()
end
end
"""
main()
Executed when `julia --project=. src/DoxygenTagParser` is run from the shell.
"""
function main()
nargs = length(ARGS)
tsv_file = nargs > 0 ? ARGS[1] : "hdf5_func_urls.tsv"
group_file = nargs > 1 ? ARGS[2] : "hdf5_group_urls.tsv"
tag_file = nargs > 2 ? ARGS[3] : HDF5_TAG_URL
info = parse_tag_file(tag_file)
save_to_tab_separated_values(tsv_file, group_file, info)
end
end
module HDF5
using Base: unsafe_convert
using Requires: @require
using Mmap: Mmap
# needed for filter(f, tuple) in julia 1.3
using Compat
using UUIDs: uuid4
using Printf: @sprintf
### PUBLIC API ###
export @read,
@write,
h5open,
h5read,
h5write,
h5rewrite,
h5writeattr,
h5readattr,
create_attribute,
open_attribute,
read_attribute,
write_attribute,
delete_attribute,
rename_attribute,
attributes,
attrs,
create_dataset,
open_dataset,
read_dataset,
write_dataset,
create_group,
open_group,
copy_object,
open_object,
delete_object,
move_link,
create_datatype,
commit_datatype,
open_datatype,
create_property,
group_info,
object_info,
dataspace,
datatype,
Filters,
Drivers
### The following require module scoping ###
# file, filename, name,
# get_chunk, get_datasets,
# get_access_properties, get_create_properties,
# root, readmmap,
# iscontiguous, iscompact, ischunked,
# ishdf5, ismmappable,
# refresh
# start_swmr_write
# create_external, create_external_dataset
### Types
# H5DataStore, Attribute, File, Group, Dataset, Datatype, Opaque,
# Dataspace, Object, Properties, VLen, ChunkStorage, Reference
h5doc(name) = "[`$name`](https://portal.hdfgroup.org/display/HDF5/$(name))"
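# For instance, h5doc("H5Dopen") yields "[`H5Dopen`](https://portal.hdfgroup.org/display/HDF5/H5Dopen)"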
include("api/api.jl")
include("properties.jl")
include("context.jl")
include("types.jl")
include("file.jl")
include("objects.jl")
include("groups.jl")
include("datatypes.jl")
include("typeconversions.jl")
include("dataspaces.jl")
include("virtual.jl")
include("datasets.jl")
include("attributes.jl")
include("readwrite.jl")
include("references.jl")
include("show.jl")
include("api_midlevel.jl")
include("highlevel.jl")
# Functions that require special handling
const libversion = API.h5_get_libversion()
const HAS_PARALLEL = Ref(false)
const HAS_ROS3 = Ref(false)
"""
has_parallel()
Returns `true` if the HDF5 libraries were compiled with MPI parallel support via the [`Drivers.MPIO`](@ref) driver.
See [Parallel HDF5](@ref) for more details.
"""
has_parallel() = HAS_PARALLEL[]
"""
has_ros3()
Returns `true` if the HDF5 libraries were compiled with ros3 support
"""
has_ros3() = HAS_ROS3[]
function __init__()
# HDF5.API.__init__() is run first
#
# initialize default properties
ASCII_LINK_PROPERTIES.char_encoding = :ascii
ASCII_LINK_PROPERTIES.create_intermediate_group = true
UTF8_LINK_PROPERTIES.char_encoding = :utf8
UTF8_LINK_PROPERTIES.create_intermediate_group = true
ASCII_ATTRIBUTE_PROPERTIES.char_encoding = :ascii
UTF8_ATTRIBUTE_PROPERTIES.char_encoding = :utf8
@require FileIO = "5789e2e9-d7fb-5bc7-8068-2c6fae9b9549" include("fileio.jl")
@require H5Zblosc = "c8ec2601-a99c-407f-b158-e79c03c2f5f7" begin
set_blosc!(p::Properties, val::Bool) =
val && push!(Filters.FilterPipeline(p), H5Zblosc.BloscFilter())
set_blosc!(p::Properties, level::Integer) =
push!(Filters.FilterPipeline(p), H5Zblosc.BloscFilter(; level=level))
end
return nothing
end
include("deprecated.jl")
end # module
# This file defines midlevel api wrappers. We include name normalization for methods that are
# applicable to different hdf5 api-layers. We still try to adhere closely to the underlying
# method name in the hdf5-library.
"""
HDF5.set_extent_dims(dset::HDF5.Dataset, new_dims::Dims)
Change the current dimensions of a dataset to `new_dims`, limited by
`max_dims = get_extent_dims(dset)[2]`. Reduction is possible and leads to loss of truncated data.
"""
function set_extent_dims(dset::Dataset, size::Dims)
checkvalid(dset)
API.h5d_set_extent(dset, API.hsize_t[reverse(size)...])
end
"""
HDF5.set_extent_dims(dspace::HDF5.Dataspace, new_dims::Dims, max_dims::Union{Dims,Nothing} = nothing)
Change the dimensions of a dataspace `dspace` to `new_dims`, optionally with the maximum possible
dimensions `max_dims` different from the active size `new_dims`. If not given, `max_dims` is set equal
to `new_dims`.
"""
function set_extent_dims(
dspace::Dataspace, size::Dims, max_dims::Union{Dims,Nothing}=nothing
)
checkvalid(dspace)
rank = length(size)
current_size = API.hsize_t[reverse(size)...]
maximum_size = isnothing(max_dims) ? C_NULL : [reverse(max_dims .% API.hsize_t)...]
API.h5s_set_extent_simple(dspace, rank, current_size, maximum_size)
return nothing
end
"""
HDF5.get_extent_dims(obj::Union{HDF5.Dataspace, HDF5.Dataset, HDF5.Attribute}) -> dims, maxdims
Get the array dimensions from a dataspace, dataset, or attribute and return a tuple of `dims` and `maxdims`.
"""
function get_extent_dims(obj::Union{Dataspace,Dataset,Attribute})
dspace = obj isa Dataspace ? checkvalid(obj) : dataspace(obj)
h5_dims, h5_maxdims = API.h5s_get_simple_extent_dims(dspace)
# reverse dimensions since hdf5 uses C-style order
N = length(h5_dims)
dims = ntuple(i -> @inbounds(Int(h5_dims[N - i + 1])), N)
maxdims = ntuple(i -> @inbounds(h5_maxdims[N - i + 1]) % Int, N) # allows max_dims to be specified as -1 without triggering an overflow
obj isa Dataspace || close(dspace)
return dims, maxdims
end
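# Usage sketch (assumes `dset` was created elsewhere as an extendible dataset, e.g. with
# max_dims = (-1, 10), i.e. unlimited in the first dimension):
#   dims, maxdims = HDF5.get_extent_dims(dset)   # e.g. ((100, 10), (-1, 10))
#   HDF5.set_extent_dims(dset, (200, 10))        # grow the first dimension to 200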
"""
HDF5.get_chunk_offset(dataset_id, index)
Get 0-based offset of chunk from 0-based `index`. The offsets are returned in Julia's column-major order rather than hdf5 row-major order.
For a 1-based API, see `HDF5.ChunkStorage`.
"""
function get_chunk_offset(dataset_id, index)
extent = size(dataset_id)
chunk = get_chunk(dataset_id)
chunk_indices = CartesianIndices(
ntuple(i -> 0:(cld(extent[i], chunk[i]) - 1), length(extent))
)
offset = API.hsize_t.(chunk_indices[index + 1].I .* chunk)
return offset
end
"""
HDF5.get_chunk_index(dataset_id, offset)
Get 0-based index of chunk from 0-based `offset` returned in Julia's column-major order.
For a 1-based API, see `HDF5.ChunkStorage`.
"""
function get_chunk_index(dataset_id, offset)
extent = size(dataset_id)
chunk = get_chunk(dataset_id)
chunk_indices = LinearIndices(
ntuple(i -> 0:(cld(extent[i], chunk[i]) - 1), length(extent))
)
chunk_indices[(fld.(offset, chunk) .+ 1)...] - 1
end
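# Relationship sketch for a hypothetical 10×10 dataset stored in 5×5 chunks
# (chunks are numbered 0..3 in column-major order):
#   HDF5.get_chunk_offset(dset, 3)     # == (5, 5) as hsize_t values, the offset of the last chunk
#   HDF5.get_chunk_index(dset, (5, 5)) # == 3, the inverse mapping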
"""
HDF5.get_num_chunks_per_dim(dataset_id)
Get the number of chunks in each dimension in Julia's column-major order.
"""
function get_num_chunks_per_dim(dataset_id)
extent = size(dataset_id)
chunk = get_chunk(dataset_id)
return cld.(extent, chunk)
end
"""
HDF5.get_num_chunks(dataset_id)
Returns the number of chunks in a dataset. Equivalent to `API.h5d_get_num_chunks(dataset_id, HDF5.H5S_ALL)`.
"""
function get_num_chunks(dataset_id)
@static if v"1.10.5" ≤ API._libhdf5_build_ver
API.h5d_get_num_chunks(dataset_id)
else
prod(get_num_chunks_per_dim(dataset_id))
end
end
"""
HDF5.get_chunk_length(dataset_id)
Retrieves the chunk size in bytes. Equivalent to `API.h5d_get_chunk_info(dataset_id, index)[:size]`.
"""
function get_chunk_length(dataset_id)
type = API.h5d_get_type(dataset_id)
chunk = get_chunk(dataset_id)
return Int(API.h5t_get_size(type) * prod(chunk))
end
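# Continuing the hypothetical 10×10 Float64 dataset with 5×5 chunks from the sketch above:
#   HDF5.get_num_chunks_per_dim(dset) # == (2, 2)
#   HDF5.get_num_chunks(dset)         # == 4 (once every chunk has been allocated/written)
#   HDF5.get_chunk_length(dset)       # == 25 * sizeof(Float64), bytes per (uncompressed) chunk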
vlen_get_buf_size(dset::Dataset, dtype::Datatype, dspace::Dataspace) =
API.h5d_vlen_get_buf_size(dset, dtype, dspace)
function vlen_get_buf_size(dataset_id)
type = API.h5d_get_type(dataset_id)
space = API.h5d_get_space(dataset_id)
API.h5d_vlen_get_buf_size(dataset_id, type, space)
end
"""
HDF5.read_chunk(dataset_id, offset, [buf]; dxpl_id = HDF5.API.H5P_DEFAULT, filters = Ref{UInt32}())
Helper method to read chunks via 0-based offsets in a `Tuple`.
Argument `buf` is optional and defaults to a `Vector{UInt8}` of length determined by `HDF5.get_chunk_length`.
Argument `dxpl_id` can be supplied as a keyword and defaults to `HDF5.API.H5P_DEFAULT`.
Argument `filters` can be retrieved by supplying a `Ref{UInt32}` value via a keyword argument.
This method returns `Vector{UInt8}`.
"""
function read_chunk(
dataset_id,
offset,
buf::Vector{UInt8}=Vector{UInt8}(undef, get_chunk_length(dataset_id));
dxpl_id=API.H5P_DEFAULT,
filters=Ref{UInt32}()
)
API.h5d_read_chunk(dataset_id, dxpl_id, offset, filters, buf)
return buf
end
"""
HDF5.read_chunk(dataset_id, index::Integer, [buf]; dxpl_id = HDF5.API.H5P_DEFAULT, filters = Ref{UInt32}())
Helper method to read chunks via 0-based integer `index`.
Argument `buf` is optional and defaults to a `Vector{UInt8}` of length determined by `HDF5.get_chunk_length`.
Argument `dxpl_id` can be supplied as a keyword and defaults to `HDF5.API.H5P_DEFAULT`.
Argument `filters` can be retrieved by supplying a `Ref{UInt32}` value via a keyword argument.
This method returns `Vector{UInt8}`.
"""
function read_chunk(
dataset_id,
index::Integer,
buf::Vector{UInt8}=Vector{UInt8}(undef, get_chunk_length(dataset_id));
dxpl_id=API.H5P_DEFAULT,
filters=Ref{UInt32}()
)
offset = [reverse(get_chunk_offset(dataset_id, index))...]
read_chunk(dataset_id, offset, buf; dxpl_id=dxpl_id, filters=filters)
end
"""
HDF5.write_chunk(dataset_id, offset, buf::AbstractArray; dxpl_id = HDF5.API.H5P_DEFAULT, filter_mask = 0)
Helper method to write chunks via 0-based offsets `offset` as a `Tuple`.
"""
function write_chunk(
dataset_id, offset, buf::AbstractArray; dxpl_id=API.H5P_DEFAULT, filter_mask=0
)
# Borrowed from write_dataset stride detection
stride(buf, 1) == 1 ||
throw(ArgumentError("Cannot write arrays with a different stride than `Array`"))
API.h5d_write_chunk(dataset_id, dxpl_id, filter_mask, offset, sizeof(buf), buf)
end
function write_chunk(
dataset_id,
offset,
buf::Union{DenseArray,Base.FastContiguousSubArray};
dxpl_id=API.H5P_DEFAULT,
filter_mask=0
)
# We can bypass the need to check stride with Array and FastContiguousSubArray
API.h5d_write_chunk(dataset_id, dxpl_id, filter_mask, offset, sizeof(buf), buf)
end
"""
HDF5.write_chunk(dataset_id, index::Integer, buf::AbstractArray; dxpl_id = API.H5P_DEFAULT, filter_mask = 0)
Helper method to write chunks via 0-based integer `index`.
"""
function write_chunk(
dataset_id, index::Integer, buf::AbstractArray; dxpl_id=API.H5P_DEFAULT, filter_mask=0
)
offset = [reverse(get_chunk_offset(dataset_id, index))...]
write_chunk(dataset_id, offset, buf; dxpl_id=dxpl_id, filter_mask=filter_mask)
end
# Avoid ambiguous method with offset based versions
function write_chunk(
dataset_id,
index::Integer,
buf::Union{DenseArray,Base.FastContiguousSubArray};
dxpl_id=API.H5P_DEFAULT,
filter_mask=0
)
# We can bypass the need to check stride with Array and FastContiguousSubArray
offset = [reverse(get_chunk_offset(dataset_id, index))...]
write_chunk(dataset_id, offset, buf; dxpl_id=dxpl_id, filter_mask=filter_mask)
end
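# Round-trip sketch for raw chunk I/O (assumes `dset` is a chunked dataset without filters,
# so the raw bytes can simply be written back unchanged):
#   raw = HDF5.read_chunk(dset, 0)   # first chunk as a Vector{UInt8}
#   HDF5.write_chunk(dset, 0, raw)   # write the same bytes back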
function get_fill_value(plist_id, ::Type{T}) where {T}
value = Ref{T}()
API.h5p_get_fill_value(plist_id, datatype(T), value)
return value[]
end
get_fill_value(plist_id) = get_fill_value(plist_id, Float64)
function set_fill_value!(plist_id, value)
ref_value = Ref(value)
API.h5p_set_fill_value(plist_id, datatype(value), ref_value)
return plist_id
end
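# Sketch (assumes `dcpl` is a dataset-creation property list obtained elsewhere):
#   HDF5.set_fill_value!(dcpl, -999.0)
#   HDF5.get_fill_value(dcpl)   # == -999.0 (read back as Float64 by default)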
"""
HDF5.Attribute
An HDF5 attribute: this is a piece of metadata attached to an HDF5 `Group` or
`Dataset`. It acts like a `Dataset`, in that it has a defined datatype and
dataspace, and can `read` and `write` data to it.
See also
- [`open_attribute`](@ref)
- [`create_attribute`](@ref)
- [`read_attribute`](@ref)
- [`write_attribute`](@ref)
- [`delete_attribute`](@ref)
"""
Attribute # defined in types.jl
function Base.close(obj::Attribute)
if obj.id != -1
if obj.file.id != -1 && isvalid(obj)
API.h5a_close(obj)
end
obj.id = -1
end
nothing
end
name(attr::Attribute) = API.h5a_get_name(attr)
datatype(dset::Attribute) = Datatype(API.h5a_get_type(checkvalid(dset)), file(dset))
dataspace(attr::Attribute) = Dataspace(API.h5a_get_space(checkvalid(attr)))
function Base.write(obj::Attribute, x)
dtype = datatype(x)
try
write_attribute(obj, dtype, x)
finally
close(dtype)
end
end
"""
read_attribute(parent::Union{File,Group,Dataset,Datatype}, name::AbstractString)
Read the value of the named attribute on the parent object.
# Example
```julia-repl
julia> HDF5.read_attribute(g, "time")
2.45
```
"""
function read_attribute(parent::Union{File,Group,Dataset,Datatype}, name::AbstractString)
obj = open_attribute(parent, name)
try
return read(obj)
finally
close(obj)
end
end
read_attribute(attr::Attribute, memtype::Datatype, buf) = API.h5a_read(attr, memtype, buf)
"""
open_attribute(parent::Union{File,Group,Dataset,Datatype}, name::AbstractString)
Open the [`Attribute`](@ref) named `name` on the object `parent`.
"""
open_attribute(
parent::Union{File,Object},
name::AbstractString,
aapl::AttributeAccessProperties=AttributeAccessProperties()
) = Attribute(API.h5a_open(checkvalid(parent), name, aapl), file(parent))
"""
create_attribute(parent::Union{File,Object}, name::AbstractString, dtype::Datatype, space::Dataspace)
create_attribute(parent::Union{File,Object}, name::AbstractString, data)
Create a new [`Attribute`](@ref) object named `name` on the object `parent`,
either by specifying the `Datatype` and `Dataspace` of the attribute, or by
providing the data. Note that no data will be written: use
[`write_attribute`](@ref) to write the data.
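# Example
A minimal sketch (names are illustrative) using the low-level form followed by [`write_attribute`](@ref):
```julia
using HDF5
h5open("data.h5", "w") do f
    dtype = datatype(Float64)
    dspace = dataspace((3,))
    attr = create_attribute(f, "coeffs", dtype, dspace)
    write_attribute(attr, dtype, [1.0, 2.0, 3.0])
    close(attr); close(dspace); close(dtype)
end
```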
"""
function create_attribute(parent::Union{File,Object}, name::AbstractString, data; pv...)
dtype = datatype(data)
dspace = dataspace(data)
obj = try
create_attribute(parent, name, dtype, dspace; pv...)
finally
close(dspace)
end
return obj, dtype
end
function create_attribute(
parent::Union{File,Object}, name::AbstractString, dtype::Datatype, dspace::Dataspace
)
attrid = API.h5a_create(
checkvalid(parent), name, dtype, dspace, _attr_properties(name), API.H5P_DEFAULT
)
return Attribute(attrid, file(parent))
end
# generic method
function write_attribute(attr::Attribute, memtype::Datatype, x::T) where {T}
if isbitstype(T)
API.h5a_write(attr, memtype, x)
else
jl_type = get_mem_compatible_jl_type(memtype)
try
x_mem = convert(jl_type, x)
API.h5a_write(attr, memtype, Ref(x_mem))
catch err
if err isa MethodError
throw(
ArgumentError(
"Could not convert non-bitstype $T to $jl_type for writing to HDF5. Consider implementing `convert(::Type{$jl_type}, ::$T)`"
)
)
else
rethrow()
end
end
end
end
function write_attribute(attr::Attribute, memtype::Datatype, x::Ref{T}) where {T}
if isbitstype(T)
API.h5a_write(attr, memtype, x)
else
jl_type = get_mem_compatible_jl_type(memtype)
try
x_mem = convert(Ref{jl_type}, x[])
API.h5a_write(attr, memtype, x_mem)
catch err
if err isa MethodError
throw(
ArgumentError(
"Could not convert non-bitstype $T to $jl_type for writing to HDF5. Consider implementing `convert(::Type{$jl_type}, ::$T)`"
)
)
else
rethrow()
end
end
end
end
# specific methods
write_attribute(attr::Attribute, memtype::Datatype, x::VLen) =
API.h5a_write(attr, memtype, x)
function write_attribute(attr::Attribute, memtype::Datatype, x::AbstractArray{T}) where {T}
length(x) == length(attr) || throw(
ArgumentError(
"Invalid length: $(length(x)) != $(length(attr)), for attribute \"$(name(attr))\""
)
)
if isbitstype(T)
API.h5a_write(attr, memtype, x)
else
jl_type = get_mem_compatible_jl_type(memtype)
try
x_mem = convert(Array{jl_type}, x)
API.h5a_write(attr, memtype, x_mem)
catch err
if err isa MethodError
throw(
ArgumentError(
"Could not convert non-bitstype $T to $jl_type for writing to HDF5. Consider implementing `convert(::Type{$jl_type}, ::$T)`"
)
)
else
rethrow()
end
end
end
end
function write_attribute(attr::Attribute, memtype::Datatype, str::AbstractString)
strbuf = Base.cconvert(Cstring, str)
GC.@preserve strbuf begin
if API.h5t_is_variable_str(memtype)
ptr = Base.unsafe_convert(Cstring, strbuf)
write_attribute(attr, memtype, Ref(ptr))
else
ptr = Base.unsafe_convert(Ptr{UInt8}, strbuf)
write_attribute(attr, memtype, ptr)
end
end
end
function write_attribute(
attr::Attribute, memtype::Datatype, x::T
) where {T<:Union{ScalarType,Complex{<:ScalarType}}}
tmp = Ref{T}(x)
write_attribute(attr, memtype, tmp)
end
function write_attribute(attr::Attribute, memtype::Datatype, strs::Array{<:AbstractString})
p = Ref{Cstring}(strs)
write_attribute(attr, memtype, p)
end
write_attribute(attr::Attribute, memtype::Datatype, ::EmptyArray) = nothing
"""
write_attribute(parent::Union{File,Object}, name::AbstractString, data)
Write `data` as an [`Attribute`](@ref) named `name` on the object `parent`.
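# Example
A short sketch (the file and dataset names are placeholders):
```julia
using HDF5
h5open("data.h5", "r+") do f
    write_attribute(f, "version", 2)            # attribute on the file's root group
    write_attribute(f["A"], "units", "meters")  # attribute on an existing dataset
end
```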
"""
function write_attribute(parent::Union{File,Object}, name::AbstractString, data; pv...)
attr, dtype = create_attribute(parent, name, data; pv...)
try
write_attribute(attr, dtype, data)
catch exc
delete_attribute(parent, name)
rethrow(exc)
finally
close(attr)
close(dtype)
end
nothing
end
"""
rename_attribute(parent::Union{File,Object}, oldname::AbstractString, newname::AbstractString)
Rename the [`Attribute`](@ref) of the object `parent` named `oldname` to `newname`.
"""
rename_attribute(
parent::Union{File,Object}, oldname::AbstractString, newname::AbstractString
) = API.h5a_rename(checkvalid(parent), oldname, newname)
"""
delete_attribute(parent::Union{File,Object}, name::AbstractString)
Delete the [`Attribute`](@ref) named `name` on the object `parent`.
"""
delete_attribute(parent::Union{File,Object}, name::AbstractString) =
API.h5a_delete(checkvalid(parent), name)
"""
h5writeattr(filename, name::AbstractString, data::Dict)
Write `data` as attributes to the object at `name` in the HDF5 file `filename`.
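# Example
An illustrative sketch (paths are placeholders); [`h5readattr`](@ref) reads the attributes back:
```julia
using HDF5
h5writeattr("data.h5", "A", Dict("units" => "m", "scale" => 1.5))
h5readattr("data.h5", "A")  # Dict{String, Any}("units" => "m", "scale" => 1.5)
```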
"""
function h5writeattr(filename, name::AbstractString, data::Dict)
file = h5open(filename, "r+")
try
obj = file[name]
merge!(attrs(obj), data)
close(obj)
finally
close(file)
end
end
"""
h5readattr(filename, name::AbstractString)
Read the attributes of the object at `name` in the HDF5 file `filename`, returning a `Dict`.
"""
function h5readattr(filename, name::AbstractString)
local dat
file = h5open(filename, "r")
try
obj = file[name]
dat = Dict(attrs(obj))
close(obj)
finally
close(file)
end
dat
end
"""
num_attrs()
Retrieve the number of attributes from an object.
See [`API.h5o_get_info`](@ref).
"""
function num_attrs(obj)
info = @static if API._libhdf5_build_ver < v"1.12.0"
API.h5o_get_info(checkvalid(obj))
else
API.h5o_get_info(checkvalid(obj), API.H5O_INFO_NUM_ATTRS)
end
return Int(info.num_attrs)
end
struct AttributeDict <: AbstractDict{String,Any}
parent::Object
end
AttributeDict(file::File) = AttributeDict(open_group(file, "."))
"""
attrs(object::Union{File,Group,Dataset,Datatype})
The attributes dictionary of `object`. Returns an `AttributeDict`, a `Dict`-like
object for accessing the attributes of `object`.
```julia
attrs(object)["name"] = value # create/overwrite an attribute
attr = attrs(object)["name"] # read an attribute
delete!(attrs(object), "name") # delete an attribute
keys(attrs(object)) # list the attribute names
```
"""
function attrs(parent)
return AttributeDict(parent)
end
Base.haskey(attrdict::AttributeDict, path::AbstractString) =
API.h5a_exists(checkvalid(attrdict.parent), path)
Base.length(attrdict::AttributeDict) = num_attrs(attrdict.parent)
function Base.getindex(x::AttributeDict, name::AbstractString)
haskey(x, name) || throw(KeyError(name))
read_attribute(x.parent, name)
end
function Base.get(x::AttributeDict, name::AbstractString, default)
haskey(x, name) || return default
read_attribute(x.parent, name)
end
function Base.setindex!(attrdict::AttributeDict, val, name::AbstractString)
if haskey(attrdict, name)
# in case of an error, we write first to a temporary, then rename
_name = name * "_hdf5jl_" * string(uuid4())
haskey(attrdict, _name) && error("temp attribute name exists against all odds")
try
write_attribute(attrdict.parent, _name, val)
delete_attribute(attrdict.parent, name)
rename_attribute(attrdict.parent, _name, name)
finally
haskey(attrdict, _name) && delete_attribute(attrdict.parent, _name)
end
else
write_attribute(attrdict.parent, name, val)
end
end
Base.delete!(attrdict::AttributeDict, path::AbstractString) =
delete_attribute(attrdict.parent, path)
function Base.keys(attrdict::AttributeDict)
# faster than iteratively calling h5a_get_name_by_idx
checkvalid(attrdict.parent)
keyvec = sizehint!(String[], length(attrdict))
API.h5a_iterate(
attrdict.parent, idx_type(attrdict.parent), order(attrdict.parent)
) do _, attr_name, _
push!(keyvec, unsafe_string(attr_name))
return false
end
return keyvec
end
function Base.iterate(attrdict::AttributeDict)
# construct key vector, then iterate
# faster than calling h5a_open_by_idx
iterate(attrdict, (keys(attrdict), 1))
end
function Base.iterate(attrdict::AttributeDict, (keyvec, n))
iter = iterate(keyvec, n)
if isnothing(iter)
return iter
end
key, nn = iter
return (key => attrdict[key]), (keyvec, nn)
end
struct Attributes
parent::Union{File,Object}
end
"""
attributes(object::Union{File,Object})
The attributes of a file or object: this returns an `Attributes` object, which
is a `Dict`-like object for accessing the attributes of `object`: `getindex` will
return an [`Attribute`](@ref) object, and `setindex!` will call [`write_attribute`](@ref).
"""
attributes(p::Union{File,Object}) = Attributes(p)
Base.isvalid(obj::Attributes) = isvalid(obj.parent)
function Base.getindex(x::Attributes, name::AbstractString)
haskey(x, name) || throw(KeyError(name))
open_attribute(x.parent, name)
end
Base.setindex!(x::Attributes, val, name::AbstractString) =
write_attribute(x.parent, name, val)
Base.haskey(attr::Attributes, path::AbstractString) =
API.h5a_exists(checkvalid(attr.parent), path)
Base.length(x::Attributes) = num_attrs(x.parent)
function Base.keys(x::Attributes)
checkvalid(x.parent)
children = sizehint!(String[], length(x))
API.h5a_iterate(x.parent, idx_type(x.parent), order(x.parent)) do _, attr_name, _
push!(children, unsafe_string(attr_name))
return API.herr_t(0)
end
return children
end
Base.read(attr::Attributes, name::AbstractString) = read_attribute(attr.parent, name)
# Dataset methods which act like attributes
Base.write(parent::Dataset, name::AbstractString, data; pv...) =
write_attribute(parent, name, data; pv...)
function Base.getindex(dset::Dataset, name::AbstractString)
haskey(dset, name) || throw(KeyError(name))
open_attribute(dset, name)
end
Base.setindex!(dset::Dataset, val, name::AbstractString) = write_attribute(dset, name, val)
Base.haskey(dset::Union{Dataset,Datatype}, path::AbstractString) =
API.h5a_exists(checkvalid(dset), path)
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 2996 |
# The context API is under active development. This is an internal API and may change.
"""
HDF5Context
*Internal API*
An `HDF5Context` is a collection of HDF5 property lists. It is meant to be used
as a `Task` local mechanism to store state and change the default property lists
for new objects.
Use the function `get_context_property(name::Symbol)` to access a property
list within the local context.
The context in `task_local_storage()[:hdf5_context]` will be checked first.
A common global HDF5Context is stored in the constant `HDF5.CONTEXT` and
serves as the default context if the current task does not have a
`:hdf5_context`.
# Fields
* attribute_access
* attribute_create
* dataset_access
* dataset_create
* dataset_transfer
* datatype_access
* datatype_create
* file_access
* file_create
* file_mount
* group_access
* group_create
* link_access
* link_create
* object_copy
* object_create
* string_create
"""
struct HDF5Context
attribute_access :: AttributeAccessProperties
attribute_create :: AttributeCreateProperties
dataset_access :: DatasetAccessProperties
dataset_create :: DatasetCreateProperties
dataset_transfer :: DatasetTransferProperties
datatype_access :: DatatypeAccessProperties
datatype_create :: DatatypeCreateProperties
file_access :: FileAccessProperties
file_create :: FileCreateProperties
file_mount :: FileMountProperties
group_access :: GroupAccessProperties
group_create :: GroupCreateProperties
link_access :: LinkAccessProperties
link_create :: LinkCreateProperties
object_copy :: ObjectCopyProperties
object_create :: ObjectCreateProperties
string_create :: StringCreateProperties
end
Base.copy(ctx::HDF5Context) =
HDF5Context(map(n -> copy(getfield(ctx, n)), fieldnames(HDF5Context))...)
Base.close(ctx::HDF5Context) =
foreach(n -> close(getfield(ctx, n)), fieldnames(HDF5Context))
function HDF5Context()
HDF5Context(
AttributeAccessProperties(),
AttributeCreateProperties(),
DatasetAccessProperties(),
DatasetCreateProperties(),
DatasetTransferProperties(),
DatatypeAccessProperties(),
DatatypeCreateProperties(),
FileAccessProperties(),
FileCreateProperties(),
FileMountProperties(),
GroupAccessProperties(),
GroupCreateProperties(),
LinkAccessProperties(),
LinkCreateProperties(),
ObjectCopyProperties(),
ObjectCreateProperties(),
StringCreateProperties(),
)
end
"""
HDF5.CONTEXT
*Internal API*
Default `HDF5Context`.
"""
const CONTEXT = HDF5Context()
"""
get_context_property(name::Symbol)
*Internal API*
Retrieve a property list from the task local context, defaulting to
`HDF5.CONTEXT` if `task_local_storage()[:hdf5_context]` does not
exist.
"""
get_context_property(name::Symbol) =
getfield(get(task_local_storage(), :hdf5_context, CONTEXT), name)
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 28256 |
# Dataset defined in types.jl
# Get the dataspace of a dataset
dataspace(dset::Dataset) = Dataspace(API.h5d_get_space(checkvalid(dset)))
"""
open_dataset(parent::Union{File, Group}, path::AbstractString; properties...)
Open an existing [`HDF5.Dataset`](@ref) at `path` under `parent`.
Optional keyword arguments include any keywords that belong to
[`DatasetAccessProperties`](@ref) or [`DatasetTransferProperties`](@ref).
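# Example
A minimal sketch (file and dataset names are illustrative):
```julia
using HDF5
f = h5open("data.h5", "r")
dset = open_dataset(f, "A")
A = read(dset)
close(dset)
close(f)
```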
"""
open_dataset(
parent::Union{File,Group},
name::AbstractString,
dapl::DatasetAccessProperties,
dxpl::DatasetTransferProperties
) = Dataset(API.h5d_open(checkvalid(parent), name, dapl), file(parent), dxpl)
function open_dataset(parent::Union{File,Group}, name::AbstractString; pv...)
dapl = DatasetAccessProperties()
dxpl = DatasetTransferProperties()
pv = setproperties!(dapl, dxpl; pv...)
isempty(pv) || error("invalid keyword options $(keys(pv))")
open_dataset(parent, name, dapl, dxpl)
end
# Setting dset creation properties with name/value pairs
"""
create_dataset(parent, path, datatype, dataspace; properties...)
# Arguments
* `parent` - `File` or `Group`
* `path` - `String` describing the path of the dataset within the HDF5 file or
`nothing` to create an anonymous dataset
* `datatype` - `Datatype` or `Type` or the dataset
* `dataspace` - `Dataspace` or `Dims` of the dataset
* `properties` - keyword name-value pairs set properties of the dataset
# Keywords
There are many keyword properties that can be set. Below are a few select keywords.
* `chunk` - `Dims` describing the size of a chunk. Needed to apply filters.
* `filters` - `AbstractVector{<: Filters.Filter}` describing the order of the filters to apply to the data. See [`Filters`](@ref)
* `external` - `Tuple{AbstractString, Integer, Integer}` `(filepath, offset, filesize)` External dataset file location, data offset, and file size. See [`API.h5p_set_external`](@ref).
Additionally, the initial create, transfer, and access properties can be provided as a keyword:
* `dcpl` - [`DatasetCreateProperties`](@ref)
* `dxpl` - [`DatasetTransferProperties`](@ref)
* `dapl` - [`DatasetAccessProperties`](@ref)
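# Example
A hedged sketch (names and sizes are illustrative) creating a chunked, deflate-compressed dataset and writing to it:
```julia
using HDF5
h5open("data.h5", "w") do f
    dset = create_dataset(f, "A", Float64, (100, 100); chunk=(10, 10), deflate=3)
    write(dset, rand(100, 100))
end
```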
"""
function create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Datatype,
dspace::Dataspace;
dcpl::DatasetCreateProperties=DatasetCreateProperties(),
dxpl::DatasetTransferProperties=DatasetTransferProperties(),
dapl::DatasetAccessProperties=DatasetAccessProperties(),
pv...
)
!isnothing(path) &&
haskey(parent, path) &&
error(
"cannot create dataset: object \"", path, "\" already exists at ", name(parent)
)
pv = setproperties!(dcpl, dxpl, dapl; pv...)
isempty(pv) || error("invalid keyword options")
if isnothing(path)
ds = API.h5d_create_anon(parent, dtype, dspace, dcpl, dapl)
else
ds = API.h5d_create(parent, path, dtype, dspace, _link_properties(path), dcpl, dapl)
end
Dataset(ds, file(parent), dxpl)
end
create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Datatype,
dspace_dims::Dims;
pv...
) = create_dataset(checkvalid(parent), path, dtype, dataspace(dspace_dims); pv...)
create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Datatype,
dspace_dims::Tuple{Dims,Dims};
pv...
) = create_dataset(
checkvalid(parent),
path,
dtype,
dataspace(dspace_dims[1]; max_dims=dspace_dims[2]);
pv...
)
create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Type,
dspace_dims::Tuple{Dims,Dims};
pv...
) = create_dataset(
checkvalid(parent),
path,
datatype(dtype),
dataspace(dspace_dims[1]; max_dims=dspace_dims[2]);
pv...
)
create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Type,
dspace_dims::Dims;
pv...
) = create_dataset(checkvalid(parent), path, datatype(dtype), dataspace(dspace_dims); pv...)
create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Type,
dspace_dims::Int...;
pv...
) = create_dataset(checkvalid(parent), path, datatype(dtype), dataspace(dspace_dims); pv...)
create_dataset(
parent::Union{File,Group},
path::Union{AbstractString,Nothing},
dtype::Type,
dspace::Dataspace;
pv...
) = create_dataset(checkvalid(parent), path, datatype(dtype), dspace; pv...)
# Get the datatype of a dataset
datatype(dset::Dataset) = Datatype(API.h5d_get_type(checkvalid(dset)), file(dset))
"""
read_dataset(parent::Union{File,Group}, name::AbstractString)
Read a dataset with named `name` from `parent`. This will typically return an array.
The dataset will be opened, read, and closed.
See also [`HDF5.open_dataset`](@ref), [`Base.read`](@ref)
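# Example
A short sketch (names are illustrative):
```julia
using HDF5
A = h5open("data.h5", "r") do f
    read_dataset(f, "A")
end
```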
"""
function read_dataset(parent::Union{File,Group}, name::AbstractString)
local ret
obj = open_dataset(parent, name)
try
ret = read(obj)
finally
close(obj)
end
ret
end
refresh(ds::Dataset) = API.h5d_refresh(checkvalid(ds))
Base.flush(ds::Dataset) = API.h5d_flush(checkvalid(ds))
# Array constructor for datasets
Base.Array(x::Dataset) = read(x)
# The next two lines are kept for v"1.4" <= VERSION <= v"1.5"
Base.lastindex(dset::Dataset) = length(dset)
Base.lastindex(dset::Dataset, d::Int) = size(dset, d)
function iscompact(obj::Dataset)
prop = API.h5d_get_create_plist(checkvalid(obj))
try
API.h5p_get_layout(prop) == API.H5D_COMPACT
finally
API.h5p_close(prop)
end
end
function ischunked(obj::Dataset)
prop = API.h5d_get_create_plist(checkvalid(obj))
try
API.h5p_get_layout(prop) == API.H5D_CHUNKED
finally
API.h5p_close(prop)
end
end
function iscontiguous(obj::Dataset)
prop = API.h5d_get_create_plist(checkvalid(obj))
try
API.h5p_get_layout(prop) == API.H5D_CONTIGUOUS
finally
API.h5p_close(prop)
end
end
# Reading with mmap
ismmappable(::Type{<:ScalarType}) = true
ismmappable(::Type{Complex{T}}) where {T<:BitsType} = true
ismmappable(::Type) = false
ismmappable(obj::Dataset, ::Type{T}) where {T} = ismmappable(T) && iscontiguous(obj)
ismmappable(obj::Dataset) = ismmappable(obj, get_jl_type(obj))
function readmmap(obj::Dataset, ::Type{T}) where {T}
dspace = dataspace(obj)
stype = API.h5s_get_simple_extent_type(dspace)
(stype != API.H5S_SIMPLE) && error("can only mmap simple dataspaces")
dims = size(dspace)
if isempty(dims)
return T[]
end
if !Sys.iswindows()
local fdint
prop = API.h5d_get_access_plist(obj)
try
# TODO: Should check return value of API.h5f_get_driver()
fdptr = API.h5f_get_vfd_handle(obj.file, prop)
fdint = unsafe_load(convert(Ptr{Cint}, fdptr))
finally
API.h5p_close(prop)
end
fd = fdio(fdint)
else
# This is a workaround since the regular code path does not work on windows
# (see #89 for background). The error is that "Mmap.mmap(fd, ...)" cannot
# create create a valid file mapping. The question is if the handler
# returned by "API.h5f_get_vfd_handle" has
# the correct format as required by the "fdio" function. The former
# calls
# https://gitlabext.iag.uni-stuttgart.de/libs/hdf5/blob/develop/src/H5FDcore.c#L1209
#
# The workaround is to create a new file handle, which should not
# cause any problems. Since we need to know the permissions of the
# original file handle, we first retrieve them using the "API.h5f_get_intent"
# function
# Check permissions
intent = API.h5f_get_intent(obj.file)
flag = intent == API.H5F_ACC_RDONLY ? "r" : "r+"
fd = open(obj.file.filename, flag)
end
offset = API.h5d_get_offset(obj)
if offset == -1 % API.haddr_t
# note that API.h5d_get_offset may not actually raise an error, so we need to check it here
error("Error getting offset")
elseif offset % Base.datatype_alignment(T) == 0
A = Mmap.mmap(fd, Array{T,length(dims)}, dims, offset)
else
Aflat = Mmap.mmap(fd, Vector{UInt8}, prod(dims) * sizeof(T), offset)
A = reshape(reinterpret(T, Aflat), dims)
end
if Sys.iswindows()
close(fd)
end
return A
end
function readmmap(obj::Dataset)
T = get_jl_type(obj)
ismmappable(T) || error("Cannot mmap datasets of type $T")
iscontiguous(obj) || error("Cannot mmap discontiguous dataset")
readmmap(obj, T)
end
# Generic write
function Base.write(
parent::Union{File,Group},
name1::Union{AbstractString,Nothing},
val1,
name2::Union{AbstractString,Nothing},
val2,
nameval...
) # FIXME: remove?
if !iseven(length(nameval))
error("name, value arguments must come in pairs")
end
write(parent, name1, val1)
write(parent, name2, val2)
for i in 1:2:length(nameval)
thisname = nameval[i]
if !isa(thisname, AbstractString)
error("Argument ", i + 5, " should be a string, but it's a ", typeof(thisname))
end
write(parent, thisname, nameval[i + 1])
end
end
# Plain dataset & attribute writes
# Due to method ambiguities we generate these explicitly
# Create datasets and attributes with "native" types, but don't write the data.
# The return syntax is: dset, dtype = create_dataset(parent, name, data; properties...)
function create_dataset(
parent::Union{File,Group}, name::Union{AbstractString,Nothing}, data; pv...
)
dtype = datatype(data)
dspace = dataspace(data)
obj = try
create_dataset(parent, name, dtype, dspace; pv...)
finally
close(dspace)
end
return obj, dtype
end
# Create and write, closing the objects upon exit
"""
write_dataset(parent::Union{File,Group}, name::Union{AbstractString,Nothing}, data; pv...)
Create and write a dataset with `data`. Keywords are forwarded to [`create_dataset`](@ref).
Providing `nothing` as the name will create an anonymous dataset.
See also [`create_dataset`](@ref)
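# Example
A minimal sketch (names are illustrative); keywords such as `chunk` are forwarded to [`create_dataset`](@ref):
```julia
using HDF5
h5open("data.h5", "w") do f
    write_dataset(f, "x", collect(1:10))
    write_dataset(f, "y", rand(4, 6); chunk=(4, 3))
end
```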
"""
function write_dataset(
parent::Union{File,Group}, name::Union{AbstractString,Nothing}, data; pv...
)
obj, dtype = create_dataset(parent, name, data; pv...)
try
write_dataset(obj, dtype, data)
catch exc
delete_object(obj)
rethrow(exc)
finally
close(obj)
close(dtype)
end
nothing
end
# Write to already-created objects
function Base.write(obj::Dataset, x)
dtype = datatype(x)
try
write_dataset(obj, dtype, x)
finally
close(dtype)
end
end
# For plain files and groups, let "write(obj, name, val; properties...)" mean "write_dataset"
Base.write(parent::Union{File,Group}, name::Union{AbstractString,Nothing}, data; pv...) =
write_dataset(parent, name, data; pv...)
# Indexing
Base.eachindex(::IndexLinear, A::Dataset) = Base.OneTo(length(A))
Base.axes(dset::Dataset) = map(Base.OneTo, size(dset))
Base.axes(dset::Dataset, d::Integer) = Base.OneTo(size(dset, d))
# Write to a subset of a dataset using array slices: dataset[:,:,10] = array
const IndexType = Union{AbstractRange{Int},Int,Colon}
function Base.setindex!(dset::Dataset, X::Array{T}, I::IndexType...) where {T}
!isconcretetype(T) && error("type $T is not concrete")
U = get_jl_type(dset)
# perform conversions for numeric types
if (U <: Number) && (T <: Number) && U !== T
X = convert(Array{U}, X)
end
filetype = datatype(dset)
memtype = _memtype(filetype, eltype(X))
close(filetype)
dspace = dataspace(dset)
stype = API.h5s_get_simple_extent_type(dspace)
stype == API.H5S_NULL && error("attempting to write to null dataspace")
indices = Base.to_indices(dset, I)
dspace = hyperslab(dspace, indices...)
memspace = dataspace(X)
if API.h5s_get_select_npoints(dspace) != API.h5s_get_select_npoints(memspace)
error("number of elements in src and dest arrays must be equal")
end
try
API.h5d_write(dset, memtype, memspace, dspace, dset.xfer, X)
finally
close(memtype)
close(memspace)
close(dspace)
end
return X
end
function Base.setindex!(
dset::Dataset, X::Array{S}, I::IndexType...
) where {S<:AbstractString}
!isconcretetype(S) && error("type $S is not concrete")
U = get_jl_type(dset)
filetype = datatype(dset)
memtype = _memtype(filetype, eltype(X))
close(filetype)
dspace = dataspace(dset)
stype = API.h5s_get_simple_extent_type(dspace)
stype == API.H5S_NULL && error("attempting to write to null dataspace")
indices = Base.to_indices(dset, I)
dspace = hyperslab(dspace, indices...)
memspace = dataspace(X)
if API.h5s_get_select_npoints(dspace) != API.h5s_get_select_npoints(memspace)
error("number of elements in src and dest arrays must be equal")
end
p = Ref{Cstring}(X)
try
API.h5d_write(dset, memtype, memspace, dspace, dset.xfer, p)
finally
close(memtype)
close(memspace)
close(dspace)
end
return X
end
function Base.setindex!(dset::Dataset, x, I::IndexType...)
indices = Base.to_indices(dset, I)
X = fill(x, map(length, indices))
Base.setindex!(dset, X, indices...)
end
function Base.setindex!(dset::Dataset, x::T, I::IndexType...) where {T<:AbstractString}
indices = Base.to_indices(dset, I)
X = fill(x, map(length, indices))
Base.setindex!(dset, X, indices...)
end
function Base.setindex!(dset::Dataset, X::AbstractArray, I::IndexType...)
Base.setindex!(dset, Array(X), I...)
end
"""
create_external_dataset(parent, name, filepath, dtype, dspace, offset = 0)
Create an external dataset with data in an external file.
* `parent` - File or Group
* `name` - Name of the Dataset
* `filepath` - File path to where the data is stored
* `dtype` - Datatype, Type, or value where `datatype` is applicable
* `dspace` - Dataspace or Dims of the data stored in the external file
* `offset` - Offset, in bytes, from the beginning of the file to the location in the file where the data starts.
See also [`API.h5p_set_external`](@ref) to link to multiple segments.
"""
function create_external_dataset(
parent::Union{File,Group},
name::AbstractString,
filepath::AbstractString,
t,
sz::Dims,
offset::Integer=0
)
create_external_dataset(parent, name, filepath, datatype(t), dataspace(sz), offset)
end
function create_external_dataset(
parent::Union{File,Group},
name::AbstractString,
filepath::AbstractString,
dtype::Datatype,
dspace::Dataspace,
offset::Integer=0
)
checkvalid(parent)
create_dataset(
parent,
name,
dtype,
dspace;
external=(filepath, offset, length(dspace) * sizeof(dtype))
)
end
### HDF5 utilities ###
# default behavior
read_dataset(
dset::Dataset, memtype::Datatype, buf, xfer::DatasetTransferProperties=dset.xfer
) = API.h5d_read(dset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, buf)
write_dataset(
dset::Dataset, memtype::Datatype, x, xfer::DatasetTransferProperties=dset.xfer
) = API.h5d_write(dset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, x)
# type-specific behaviors
function _check_invalid(dataset::Dataset, buf::AbstractArray)
num_bytes_dset = Base.checked_mul(sizeof(datatype(dataset)), length(dataset))
num_bytes_buf = Base.checked_mul(sizeof(eltype(buf)), length(buf))
num_bytes_buf == num_bytes_dset || throw(
ArgumentError(
"Invalid number of bytes: $num_bytes_buf != $num_bytes_dset, for dataset \"$(name(dataset))\""
)
)
stride(buf, 1) == 1 || throw(
ArgumentError("Cannot read/write arrays with a different stride than `Array`")
)
end
function read_dataset(
dataset::Dataset,
memtype::Datatype,
buf::AbstractArray,
xfer::DatasetTransferProperties=dataset.xfer
)
_check_invalid(dataset, buf)
API.h5d_read(dataset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, buf)
end
function write_dataset(
dataset::Dataset,
memtype::Datatype,
buf::AbstractArray{T},
xfer::DatasetTransferProperties=dataset.xfer
) where {T}
_check_invalid(dataset, buf)
if isbitstype(T)
API.h5d_write(dataset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, buf)
else
# For non-bitstypes, we need to convert the buffer to a bitstype
# For mutable structs, this will usually be a NamedTuple.
jl_type = get_mem_compatible_jl_type(memtype)
try
memtype_buf = convert(Array{jl_type}, buf)
API.h5d_write(dataset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, memtype_buf)
catch err
if err isa MethodError
throw(
ArgumentError(
"Could not convert non-bitstype $T to $jl_type for writing to HDF5. Consider implementing `convert(::Type{$jl_type}, ::$T)`"
)
)
else
rethrow()
end
end
end
end
function write_dataset(
dataset::Dataset,
memtype::Datatype,
buf::Base.ReinterpretArray,
xfer::DatasetTransferProperties=dataset.xfer
)
# We cannot obtain a pointer to a ReinterpretArray in Julia 1.11 and beyond
# https://github.com/JuliaLang/julia/issues/51962
buf_copy = copy(buf)
@assert !(typeof(buf_copy) <: Base.ReinterpretArray) "Copying $(typeof(buf)) resulted in another Base.ReinterpretArray"
write_dataset(dataset, memtype, buf_copy, xfer)
end
function write_dataset(
dataset::Dataset,
memtype::Datatype,
str::Union{AbstractString,Nothing},
xfer::DatasetTransferProperties=dataset.xfer
)
strbuf = Base.cconvert(Cstring, str)
GC.@preserve strbuf begin
# unsafe_convert(Cstring, strbuf) is responsible for enforcing the no-'\0' policy,
# but then need explicit convert to Ptr{UInt8} since Ptr{Cstring} -> Ptr{Cvoid} is
# not automatic.
buf = convert(Ptr{UInt8}, Base.unsafe_convert(Cstring, strbuf))
API.h5d_write(dataset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, buf)
end
end
function write_dataset(
dataset::Dataset, memtype::Datatype, x::T, xfer::DatasetTransferProperties=dataset.xfer
) where {T<:Union{ScalarType,Complex{<:ScalarType}}}
tmp = Ref{T}(x)
API.h5d_write(dataset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, tmp)
end
function write_dataset(
dataset::Dataset,
memtype::Datatype,
strs::Array{<:AbstractString},
xfer::DatasetTransferProperties=dataset.xfer
)
p = Ref{Cstring}(strs)
API.h5d_write(dataset, memtype, API.H5S_ALL, API.H5S_ALL, xfer, p)
end
write_dataset(
dataset::Dataset,
memtype::Datatype,
::EmptyArray,
xfer::DatasetTransferProperties=dataset.xfer
) = nothing
"""
get_datasets(file::HDF5.File) -> datasets::Vector{HDF5.Dataset}
Get all the datasets in an HDF5 file without loading the data.
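# Example
An illustrative sketch listing every dataset in a file (the filename is a placeholder):
```julia
using HDF5
h5open("data.h5", "r") do f
    for dset in HDF5.get_datasets(f)
        println(HDF5.name(dset), " => ", size(dset))
    end
end
```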
"""
function get_datasets(file::File)
list = Dataset[]
get_datasets!(list, file)
list
end
function get_datasets!(list::Vector{Dataset}, node::Union{File,Group,Dataset})
if isa(node, Dataset)
push!(list, node)
else
for c in keys(node)
get_datasets!(list, node[c])
end
end
end
### Chunks ###
# heuristic chunk layout (return empty array to disable chunking)
function heuristic_chunk(T, shape)
Ts = sizeof(T)
sz = prod(shape)
sz == 0 && return Int[] # never return a zero-size chunk
chunk = [shape...]
nd = length(chunk)
# simplification of ugly heuristic target chunk size from PyTables/h5py:
target = min(1500000, max(12000, floor(Int, 300 * cbrt(Ts * sz))))
Ts > target && return ones(Int, nd) # a single element already exceeds the target chunk size
# divide last non-unit dimension by 2 until we get <= target
# (since Julia defaults to column-major, favor a contiguous first dimension)
while Ts * prod(chunk) > target
i = nd
while chunk[i] == 1
i -= 1
end
chunk[i] >>= 1
end
return chunk
end
heuristic_chunk(T, ::Tuple{}) = Int[]
heuristic_chunk(A::AbstractArray{T}) where {T} = heuristic_chunk(T, size(A))
heuristic_chunk(x) = Int[]
# (strings are saved as scalars, and hence cannot be chunked)
"""
do_write_chunk(dataset::Dataset, offset, chunk_bytes::AbstractArray, filter_mask=0)
Write a raw chunk at a given offset.
`chunk_bytes` is an AbstractArray that can be converted to a pointer, Ptr{Cvoid}.
`offset` is a 1-based list of rank `ndims(dataset)` and must fall on a chunk boundary.
"""
function do_write_chunk(dataset::Dataset, offset, chunk_bytes::AbstractArray, filter_mask=0)
checkvalid(dataset)
offs = collect(API.hsize_t, reverse(offset)) .- 1
write_chunk(dataset, offs, chunk_bytes; filter_mask=UInt32(filter_mask))
end
"""
do_write_chunk(dataset::Dataset, index, chunk_bytes::AbstractArray, filter_mask=0)
Write a raw chunk at a given linear index.
`chunk_bytes` is an AbstractArray that can be converted to a pointer, Ptr{Cvoid}.
`index` is 1-based and consecutive up to the number of chunks.
"""
function do_write_chunk(
dataset::Dataset, index::Integer, chunk_bytes::AbstractArray, filter_mask=0
)
checkvalid(dataset)
index -= 1
write_chunk(dataset, index, chunk_bytes; filter_mask=UInt32(filter_mask))
end
"""
do_read_chunk(dataset::Dataset, offset)
Read a raw chunk at a given offset.
`offset` is a 1-based list of rank `ndims(dataset)` and must fall on a chunk boundary.
"""
function do_read_chunk(dataset::Dataset, offset)
checkvalid(dataset)
offs = collect(API.hsize_t, reverse(offset)) .- 1
filters = Ref{UInt32}()
buf = read_chunk(dataset, offs; filters=filters)
return (filters[], buf)
end
"""
do_read_chunk(dataset::Dataset, index::Integer)
Read a raw chunk at a given index.
`index` is 1-based and consecutive up to the number of chunks.
"""
function do_read_chunk(dataset::Dataset, index::Integer)
checkvalid(dataset)
index -= 1
filters = Ref{UInt32}()
buf = read_chunk(dataset, index; filters=filters)
return (filters[], buf)
end
struct ChunkStorage{I<:IndexStyle,N} <: AbstractArray{Tuple{UInt32,Vector{UInt8}},N}
dataset::Dataset
end
ChunkStorage{I,N}(dataset) where {I,N} = ChunkStorage{I,N}(dataset)
Base.IndexStyle(::ChunkStorage{I}) where {I<:IndexStyle} = I()
# ChunkStorage{IndexCartesian,N} (default)
function ChunkStorage(dataset)
ChunkStorage{IndexCartesian,ndims(dataset)}(dataset)
end
Base.size(cs::ChunkStorage{IndexCartesian}) = get_num_chunks_per_dim(cs.dataset)
function Base.axes(cs::ChunkStorage{IndexCartesian})
chunk = get_chunk(cs.dataset)
extent = size(cs.dataset)
ntuple(i -> 1:chunk[i]:extent[i], length(extent))
end
# Filter flags provided
function Base.setindex!(
chunk_storage::ChunkStorage{IndexCartesian},
v::Tuple{<:Integer,AbstractArray},
index::Integer...
)
do_write_chunk(chunk_storage.dataset, index, v[2], v[1])
end
# Filter flags will default to 0
function Base.setindex!(
chunk_storage::ChunkStorage{IndexCartesian}, v::AbstractArray, index::Integer...
)
do_write_chunk(chunk_storage.dataset, index, v)
end
function Base.getindex(chunk_storage::ChunkStorage{IndexCartesian}, index::Integer...)
do_read_chunk(chunk_storage.dataset, API.hsize_t.(index))
end
# ChunkStorage{IndexLinear,1}
ChunkStorage{IndexLinear}(dataset) = ChunkStorage{IndexLinear,1}(dataset)
Base.size(cs::ChunkStorage{IndexLinear}) = (get_num_chunks(cs.dataset),)
Base.length(cs::ChunkStorage{IndexLinear}) = get_num_chunks(cs.dataset)
function Base.setindex!(
chunk_storage::ChunkStorage{IndexLinear},
v::Tuple{<:Integer,AbstractArray},
index::Integer
)
do_write_chunk(chunk_storage.dataset, index, v[2], v[1])
end
# Filter flags will default to 0
function Base.setindex!(
chunk_storage::ChunkStorage{IndexLinear}, v::AbstractArray, index::Integer
)
do_write_chunk(chunk_storage.dataset, index, v)
end
function Base.getindex(chunk_storage::ChunkStorage{IndexLinear}, index::Integer)
do_read_chunk(chunk_storage.dataset, index)
end
# TODO: Move to show.jl. May need to include show.jl after this line.
# ChunkStorage axes may be StepRanges, but this is not available until v"1.6.0"
# no method matching CartesianIndices(::Tuple{StepRange{Int64,Int64},UnitRange{Int64}}) until v"1.6.0"
function Base.show(io::IO, cs::ChunkStorage{IndexCartesian,N}) where {N}
println(io, "HDF5.ChunkStorage{IndexCartesian,$N}")
print(io, "Axes: ")
println(io, axes(cs))
print(io, cs.dataset)
end
Base.show(
io::IO, ::MIME{Symbol("text/plain")}, cs::ChunkStorage{IndexCartesian,N}
) where {N} = show(io, cs)
function get_chunk(dset::Dataset)
p = get_create_properties(dset)
local ret
try
ret = get_chunk(p)
finally
close(p)
end
ret
end
struct ChunkInfo{N}
offset::NTuple{N,Int}
filter_mask::Cuint
addr::API.haddr_t
size::API.hsize_t
end
function Base.show(io::IO, ::MIME"text/plain", info::Vector{<:ChunkInfo})
print(io, typeof(info))
println(io, " with $(length(info)) elements:")
println(io, "Offset \tFilter Mask \tAddress\tSize")
println(io, "----------\t--------------------------------\t-------\t----")
for ci in info
println(
io,
@sprintf("%10s", ci.offset),
"\t",
bitstring(ci.filter_mask),
"\t",
ci.addr,
"\t",
ci.size
)
end
end
"""
HDF5.get_chunk_info_all(dataset, [dxpl])
Obtain information on all the chunks in a dataset. Returns a
`Vector{ChunkInfo{N}}`. The fields of `ChunkInfo{N}` are
* offset - `NTuple{N, Int}` indicating the offset of the chunk in terms of elements, reversed to F-order
* filter_mask - Cuint, 32-bit flags indicating whether filters have been applied to the chunk
* addr - haddr_t, byte-offset of the chunk in the file
* size - hsize_t, size of the chunk in bytes
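# Example
A minimal sketch (the dataset name is a placeholder; the dataset is assumed to be chunked):
```julia
using HDF5
h5open("data.h5", "r") do f
    dset = f["A"]
    info = HDF5.get_chunk_info_all(dset)
    [ci.offset for ci in info]  # element offsets of each allocated chunk
end
```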
"""
function get_chunk_info_all(dataset, dxpl=API.H5P_DEFAULT)
@static if hasmethod(API.h5d_chunk_iter, Tuple{API.hid_t})
return _get_chunk_info_all_by_iter(dataset, dxpl)
else
return _get_chunk_info_all_by_index(dataset, dxpl)
end
end
"""
_get_chunk_info_all_by_iter(dataset, [dxpl])
Implementation of [`get_chunk_info_all`](@ref) via [`HDF5.API.h5d_chunk_iter`](@ref).
We expect this will be faster, O(N), than using `h5d_get_chunk_info` since this allows us to iterate
through the chunks once.
"""
@inline function _get_chunk_info_all_by_iter(dataset, dxpl=API.H5P_DEFAULT)
ds = dataspace(dataset)
N = ndims(ds)
info = ChunkInfo{N}[]
num_chunks = get_num_chunks(dataset)
sizehint!(info, num_chunks)
API.h5d_chunk_iter(dataset, dxpl) do offset, filter_mask, addr, size
_offset = reverse(unsafe_load(Ptr{NTuple{N,API.hsize_t}}(offset)))
push!(info, ChunkInfo{N}(_offset, filter_mask, addr, size))
return HDF5.API.H5_ITER_CONT
end
return info
end
"""
_get_chunk_info_all_by_index(dataset, [dxpl])
Implementation of [`get_chunk_info_all`](@ref) via [`HDF5.API.h5d_get_chunk_info`](@ref).
We expect this will be slower, O(N^2), than using `h5d_chunk_iter` since each call to `h5d_get_chunk_info`
iterates through the B-tree structure.
"""
@inline function _get_chunk_info_all_by_index(dataset, dxpl=API.H5P_DEFAULT)
ds = dataspace(dataset)
N = ndims(ds)
info = ChunkInfo{N}[]
num_chunks = get_num_chunks(dataset)
sizehint!(info, num_chunks)
for chunk_index in 0:(num_chunks - 1)
_info_nt = HDF5.API.h5d_get_chunk_info(dataset, chunk_index)
_offset = (reverse(_info_nt[:offset])...,)
filter_mask = _info_nt[:filter_mask]
addr = _info_nt[:addr]
size = _info_nt[:size]
push!(info, ChunkInfo{N}(_offset, filter_mask, addr, size))
end
return info
end
# properties that require chunks in order to work (e.g. any filter)
# values do not matter -- just needed to form a NamedTuple with the desired keys
const chunked_props = (; compress=nothing, deflate=nothing, blosc=nothing, shuffle=nothing)
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 10731 |
"""
HDF5.Dataspace
A dataspace defines the size and the shape of a dataset or an attribute.
A dataspace is typically constructed by calling [`dataspace`](@ref).
The following functions have methods defined for `Dataspace` objects
- `==`
- `ndims`
- `size`
- `length`
- `isempty`
- [`isnull`](@ref)
"""
Dataspace # defined in types.jl
Base.:(==)(dspace1::Dataspace, dspace2::Dataspace) =
API.h5s_extent_equal(checkvalid(dspace1), checkvalid(dspace2))
Base.hash(dspace::Dataspace, h::UInt) = hash(dspace.id, hash(Dataspace, h))
Base.copy(dspace::Dataspace) = Dataspace(API.h5s_copy(checkvalid(dspace)))
function Base.close(obj::Dataspace)
if obj.id != -1
if isvalid(obj)
API.h5s_close(obj)
end
obj.id = -1
end
nothing
end
"""
dataspace(obj::Union{Attribute, Dataset, Dataspace})
The [`Dataspace`](@ref) of `obj`.
"""
dataspace(ds::Dataspace) = ds
# Create a dataspace from in-memory types
"""
dataspace(data)
The default `Dataspace` used for representing a Julia object `data`:
- strings or numbers: a scalar `Dataspace`
- arrays: a simple `Dataspace`
- `struct` types: a scalar `Dataspace`
- `nothing` or an `EmptyArray`: a null dataspace
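# Example
A few illustrative calls:
```julia
using HDF5
dataspace(3.0)          # scalar dataspace
dataspace("text")       # scalar dataspace (strings are scalars)
dataspace(zeros(4, 5))  # simple dataspace of size (4, 5)
dataspace(nothing)      # null dataspace
```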
"""
dataspace(x::T) where {T} =
if isstructtype(T)
Dataspace(API.h5s_create(API.H5S_SCALAR))
else
throw(MethodError(dataspace, x))
end
dataspace(x::Union{T,Complex{T}}) where {T<:ScalarType} =
Dataspace(API.h5s_create(API.H5S_SCALAR))
dataspace(::AbstractString) = Dataspace(API.h5s_create(API.H5S_SCALAR))
function _dataspace(sz::Dims{N}, max_dims::Union{Dims{N},Tuple{}}=()) where {N}
dims = API.hsize_t[sz[i] for i in N:-1:1]
if isempty(max_dims)
maxd = dims
else
# This allows max_dims to be specified as -1 without triggering an overflow
# exception due to the signed -> unsigned conversion.
maxd = API.hsize_t[API.hssize_t(max_dims[i]) % API.hsize_t for i in N:-1:1]
end
return Dataspace(API.h5s_create_simple(length(dims), dims, maxd))
end
dataspace(A::AbstractArray{T,N}; max_dims::Union{Dims{N},Tuple{}}=()) where {T,N} =
_dataspace(size(A), max_dims)
# special array types
dataspace(v::VLen; max_dims::Union{Dims,Tuple{}}=()) = _dataspace(size(v.data), max_dims)
dataspace(A::EmptyArray) = Dataspace(API.h5s_create(API.H5S_NULL))
dataspace(n::Nothing) = Dataspace(API.h5s_create(API.H5S_NULL))
# for giving sizes explicitly
"""
dataspace(dims::Tuple; max_dims::Tuple=dims)
dataspace(dims::Tuple, max_dims::Tuple)
Construct a simple `Dataspace` for the given dimensions `dims`. The maximum
dimensions `max_dims` specifies the maximum possible size: `-1` can be used to
indicate unlimited dimensions.
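# Example
A short sketch; `-1` marks an extendible dimension:
```julia
using HDF5
dataspace((10, 5))                # fixed 10×5 dataspace
dataspace((10,); max_dims=(-1,))  # 10 elements now, unlimited maximum
```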
"""
dataspace(sz::Dims{N}; max_dims::Union{Dims{N},Tuple{}}=()) where {N} =
_dataspace(sz, max_dims)
dataspace(sz::Dims{N}, max_dims::Union{Dims{N},Tuple{}}) where {N} =
_dataspace(sz, max_dims)
dataspace(dims::Tuple{Dims{N},Dims{N}}) where {N} = _dataspace(first(dims), last(dims))
dataspace(sz1::Int, sz2::Int, sz3::Int...; max_dims::Union{Dims,Tuple{}}=()) =
_dataspace(tuple(sz1, sz2, sz3...), max_dims)
function Base.ndims(dspace::Dataspace)
API.h5s_get_simple_extent_ndims(checkvalid(dspace))
end
function Base.size(dspace::Dataspace)
h5_dims = API.h5s_get_simple_extent_dims(checkvalid(dspace), nothing)
N = length(h5_dims)
return ntuple(i -> @inbounds(Int(h5_dims[N - i + 1])), N)
end
function Base.size(dspace::Dataspace, d::Integer)
d > 0 || throw(ArgumentError("invalid dimension d; must be positive integer"))
N = ndims(dspace)
d > N && return 1
h5_dims = API.h5s_get_simple_extent_dims(dspace, nothing)
return @inbounds Int(h5_dims[N - d + 1])
end
function Base.length(dspace::Dataspace)
isnull(dspace) && return 0
h5_dims = API.h5s_get_simple_extent_dims(checkvalid(dspace), nothing)
return Int(prod(h5_dims))
end
Base.isempty(dspace::Dataspace) = length(dspace) == 0
"""
isnull(dspace::Union{HDF5.Dataspace, HDF5.Dataset, HDF5.Attribute})
Determines whether the given object has no size (consistent with the `API.H5S_NULL` dataspace).
# Examples
```julia-repl
julia> HDF5.isnull(dataspace(HDF5.EmptyArray{Float64}()))
true
julia> HDF5.isnull(dataspace((0,)))
false
```
"""
function isnull(dspace::Dataspace)
return API.h5s_get_simple_extent_type(checkvalid(dspace)) == API.H5S_NULL
end
"""
HDF5.get_regular_hyperslab(dspace)::Tuple
Get the hyperslab selection from `dspace`. Returns a tuple of [`BlockRange`](@ref) objects.
"""
function get_regular_hyperslab(dspace::Dataspace)
start0, stride, count, block = API.h5s_get_regular_hyperslab(dspace)
N = length(start0)
ntuple(N) do i
ri = N - i + 1
@inbounds BlockRange(start0[ri], stride[ri], count[ri], block[ri])
end
end
struct BlockRange
start0::API.hsize_t
stride::API.hsize_t
count::API.hsize_t
block::API.hsize_t
end
"""
HDF5.BlockRange(;start::Integer, stride::Integer=1, count::Integer=1, block::Integer=1)
A `BlockRange` represents a selection along a single dimension of a HDF5
hyperslab. It is similar to a Julia `range` object, with some extra features for
selecting multiple contiguous blocks.
- `start`: the index of the first element in the first block (1-based).
- `stride`: the step between the first element of each block (must be >0)
- `count`: the number of blocks (can be -1 for an unlimited number of blocks)
- `block`: the number of elements in each block.
HDF5.BlockRange(obj::Union{Integer, OrdinalRange})
Convert `obj` to a `BlockRange` object.
# External links
- [HDF5 User Guide, section 7.4.2.1 "Selecting Hyperslabs"](https://support.hdfgroup.org/HDF5/doc/UG/HDF5_Users_Guide-Responsive%20HTML5/index.html#t=HDF5_Users_Guide%2FDataspaces%2FHDF5_Dataspaces_and_Partial_I_O.htm%23TOC_7_4_2_Programming_Modelbc-8&rhtocid=7.2.0_2)
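# Example
Illustrative conversions:
```julia
using HDF5
HDF5.BlockRange(2:3:11)                                  # equivalent to the range 2:3:11
HDF5.BlockRange(; start=1, stride=10, count=3, block=5)  # three blocks of 5 elements, 10 apart
```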
"""
function BlockRange(; start::Integer, stride::Integer=1, count::Integer=1, block::Integer=1)
if count == -1
count = API.H5S_UNLIMITED
end
BlockRange(start - 1, stride, count, block)
end
BlockRange(start::Integer; stride=1, count=1, block=1) =
BlockRange(; start=start, stride=stride, count=count, block=block)
BlockRange(r::AbstractUnitRange; stride=max(length(r), 1), count=1) =
BlockRange(; start=first(r), stride=stride, count=count, block=length(r))
BlockRange(r::OrdinalRange) = BlockRange(; start=first(r), stride=step(r), count=length(r))
BlockRange(br::BlockRange) = br
Base.to_index(d::Dataset, br::BlockRange) = br
Base.length(br::BlockRange) = Int(br.count * br.block)
function Base.range(br::BlockRange)
start = Int(br.start0 + 1)
if br.count == 1
# UnitRange
return range(start; length=Int(br.block))
elseif br.block == 1 && br.count != API.H5S_UNLIMITED
# StepRange
return range(start; step=Int(br.stride), length=Int(br.count))
else
error("$br cannot be converted to a Julia range")
end
end
Base.convert(::Type{T}, br::BlockRange) where {T<:AbstractRange} = convert(T, range(br))
"""
HDF5.select_hyperslab!(dspace::Dataspace, [op, ], idxs::Tuple)
Selects a hyperslab region of the `dspace`. `idxs` should be a tuple of
integers, ranges or [`BlockRange`](@ref) objects.
- `op` determines how the new selection is to be combined with the already
selected dataspace:
- `:select` (default): replace the existing selection with the new selection.
- `:or`: adds the new selection to the existing selection.
Aliases: `|`, `∪`, `union`.
- `:and`: retains only the overlapping portions of the new and existing
selection. Aliases: `&`, `∩`, `intersect`.
- `:xor`: retains only the elements that are members of the new selection or
the existing selection, excluding elements that are members of both
selections. Aliases: `⊻`, `xor`
- `:notb`: retains only elements of the existing selection that are not in the
new selection. Alias: `setdiff`.
- `:nota`: retains only elements of the new selection that are not in the
existing selection.
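# Example
A minimal sketch combining two selections on a 10×10 dataspace:
```julia
using HDF5
dspace = dataspace((10, 10))
HDF5.select_hyperslab!(dspace, (1:2:9, 1:5))        # initial selection
HDF5.select_hyperslab!(dspace, :or, (10:10, 6:10))  # add a second region
```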
"""
function select_hyperslab!(
dspace::Dataspace, op::Union{Symbol,typeof.((&, |, ⊻, ∪, ∩, setdiff))...}, idxs::Tuple
)
N = ndims(dspace)
length(idxs) == N || error("Number of indices does not match dimension of Dataspace")
blockranges = map(BlockRange, idxs)
_start0 = API.hsize_t[blockranges[N - i + 1].start0 for i in 1:N]
_stride = API.hsize_t[blockranges[N - i + 1].stride for i in 1:N]
_count = API.hsize_t[blockranges[N - i + 1].count for i in 1:N]
_block = API.hsize_t[blockranges[N - i + 1].block for i in 1:N]
_op = if op == :select
API.H5S_SELECT_SET
elseif (op == :or || op === (|) || op === (∪))
API.H5S_SELECT_OR
elseif (op == :and || op === (&) || op === (∩))
API.H5S_SELECT_AND
elseif (op == :xor || op === (⊻))
API.H5S_SELECT_XOR
elseif op == :notb || op === setdiff
API.H5S_SELECT_NOTB
elseif op == :nota
API.H5S_SELECT_NOTA
else
error("invalid operator $op")
end
API.h5s_select_hyperslab(dspace, _op, _start0, _stride, _count, _block)
return dspace
end
select_hyperslab!(dspace::Dataspace, idxs::Tuple) = select_hyperslab!(dspace, :select, idxs)
hyperslab(dspace::Dataspace, I::Union{AbstractRange{Int},Integer,BlockRange}...) =
hyperslab(dspace, I)
function hyperslab(dspace::Dataspace, I::Tuple)
select_hyperslab!(copy(dspace), I)
end
# methods for Dataset/Attribute which operate on Dataspace
function Base.ndims(obj::Union{Dataset,Attribute})
dspace = dataspace(obj)
try
return Base.ndims(dspace)
finally
close(dspace)
end
end
function Base.size(obj::Union{Dataset,Attribute})
dspace = dataspace(obj)
try
return Base.size(dspace)
finally
close(dspace)
end
end
function Base.size(obj::Union{Dataset,Attribute}, d::Integer)
dspace = dataspace(obj)
try
return Base.size(dspace, d)
finally
close(dspace)
end
end
function Base.length(obj::Union{Dataset,Attribute})
dspace = dataspace(obj)
try
return Base.length(dspace)
finally
close(dspace)
end
end
function Base.isempty(obj::Union{Dataset,Attribute})
dspace = dataspace(obj)
try
return Base.isempty(dspace)
finally
close(dspace)
end
end
function isnull(obj::Union{Dataset,Attribute})
dspace = dataspace(obj)
try
return isnull(dspace)
finally
close(dspace)
end
end
function hyperslab(dset::Dataset, I::Union{AbstractRange{Int},Int}...)
dspace = dataspace(dset)
try
return hyperslab(dspace, I...)
finally
close(dspace)
end
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 1586 |
function Base.close(obj::Datatype)
if obj.toclose && obj.id != -1
if (!isdefined(obj, :file) || obj.file.id != -1) && isvalid(obj)
API.h5o_close(obj)
end
obj.id = -1
end
nothing
end
# The datatype of a Datatype is the Datatype
datatype(dt::Datatype) = dt
open_datatype(
parent::Union{File,Group}, name::AbstractString, tapl::DatatypeAccessProperties
) = Datatype(API.h5t_open(checkvalid(parent), name, tapl), file(parent))
"""
open_datatype(parent::Union{File,Group}, path::AbstractString; properties...)
Open an existing [`Datatype`](@ref) at `path` under the `parent` object.
Optional keyword arguments include any keywords that that belong to
[`DatatypeAccessProperties`](@ref).
"""
function open_datatype(parent::Union{File,Group}, name::AbstractString; pv...)
tapl = DatatypeAccessProperties(; pv...)
return open_datatype(parent, name, tapl)
end
# Note that H5Tcreate is very different; H5Tcommit is the analog of these others
create_datatype(class_id, sz) = Datatype(API.h5t_create(class_id, sz))
function commit_datatype(
parent::Union{File,Group},
path::AbstractString,
dtype::Datatype,
lcpl::LinkCreateProperties=LinkCreateProperties(),
tcpl::DatatypeCreateProperties=DatatypeCreateProperties(),
tapl::DatatypeAccessProperties=DatatypeAccessProperties()
)
lcpl.char_encoding = cset(typeof(path))
API.h5t_commit(checkvalid(parent), path, dtype, lcpl, tcpl, tapl)
dtype.file = file(parent)
return dtype
end
Base.sizeof(dtype::Datatype) = Int(API.h5t_get_size(dtype))
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 3721 |
import Base: @deprecate, @deprecate_binding, depwarn
###
### v0.16 deprecations
###
### Changed in PR #844
@deprecate silence_errors(f::Function) f()
for name in names(API; all=true)
if name ∉ names(HDF5; all=true) && startswith(uppercase(String(name)), "H")
depmsg = ", use HDF5.API.$name instead."
@eval Base.@deprecate_binding $name API.$name false $depmsg
end
end
### Changed in PR #847
import Base: getindex, setindex!
@deprecate getindex(p::Properties, name::Symbol) Base.getproperty(p, name)
@deprecate setindex!(p::Properties, val, name::Symbol) Base.setproperty!(p, name, val)
# UTF8_LINK_PROPERTIES etc. used to be Refs, so required UTF8_LINK_PROPERTIES[] to set.
@deprecate getindex(p::Properties) p
function create_property(class; kwargs...)
(oldname, newtype) =
class == HDF5.API.H5P_OBJECT_CREATE ? (:H5P_OBJECT_CREATE, ObjectCreateProperties) :
class == HDF5.API.H5P_FILE_CREATE ? (:H5P_FILE_CREATE, FileCreateProperties) :
class == HDF5.API.H5P_FILE_ACCESS ? (:H5P_FILE_ACCESS, FileAccessProperties) :
class == HDF5.API.H5P_DATASET_CREATE ? (:H5P_DATASET_CREATE, DatasetCreateProperties) :
class == HDF5.API.H5P_DATASET_ACCESS ? (:H5P_DATASET_ACCESS, DatasetAccessProperties) :
class == HDF5.API.H5P_DATASET_XFER ? (:H5P_DATASET_XFER, DatasetTransferProperties) :
class == HDF5.API.H5P_FILE_MOUNT ? (:H5P_FILE_MOUNT, FileMountProperties) :
class == HDF5.API.H5P_GROUP_CREATE ? (:H5P_GROUP_CREATE, GroupCreateProperties) :
class == HDF5.API.H5P_GROUP_ACCESS ? (:H5P_GROUP_ACCESS, GroupAccessProperties) :
class == HDF5.API.H5P_DATATYPE_CREATE ? (:H5P_DATATYPE_CREATE, DatatypeCreateProperties) :
class == HDF5.API.H5P_DATATYPE_ACCESS ? (:H5P_DATATYPE_ACCESS, DatatypeAccessProperties) :
class == HDF5.API.H5P_STRING_CREATE ? (:H5P_STRING_CREATE, StringCreateProperties) :
class == HDF5.API.H5P_ATTRIBUTE_CREATE ? (:H5P_ATTRIBUTE_CREATE, AttributeCreateProperties) :
class == HDF5.API.H5P_OBJECT_COPY ? (:H5P_OBJECT_COPY, ObjectCopyProperties) :
class == HDF5.API.H5P_LINK_CREATE ? (:H5P_LINK_CREATE, LinkCreateProperties) :
class == HDF5.API.H5P_LINK_ACCESS ? (:H5P_LINK_ACCESS, LinkAccessProperties) :
error("invalid class")
Base.depwarn(
"`create_property(HDF5.$oldname; kwargs...)` has been deprecated, use `$newtype(;kwargs...)` instead.",
:create_property
)
init!(newtype(; kwargs...))
end
@deprecate set_chunk(p::Properties, dims...) set_chunk!(p, dims) false
@deprecate get_userblock(p::Properties) p.userblock false
function set_shuffle!(p::Properties, ::Tuple{})
depwarn("`shuffle=()` option is deprecated, use `shuffle=true`", :set_shuffle!)
set_shuffle!(p, true)
end
# see src/properties.jl for the following deprecated keywords
# :compress
# :fapl_mpio
# :track_times
### Changed in PR #887
# see src/properties.jl for the following deprecated keyword
# :filter
### Changed in PR #902
import Base: append!, push!
import .Filters: ExternalFilter
@deprecate append!(filters::Filters.FilterPipeline, extra::NTuple{N,Integer}) where {N} append!(
filters, [ExternalFilter(extra...)]
)
@deprecate push!(p::Filters.FilterPipeline, f::NTuple{N,Integer}) where {N} push!(
p, ExternalFilter(f...)
)
@deprecate ExternalFilter(t::Tuple) ExternalFilter(t...) false
### Changed in PR #979
# Querying items in the file
@deprecate object_info(obj::Union{File,Object}) API.h5o_get_info1(checkvalid(obj))
### Changed in PR #994
@deprecate set_track_order(p::Properties, val::Bool) set_track_order!(
p::Properties, val::Bool
) false
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
["MIT"] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 7106 |
"""
h5open(filename::AbstractString, mode::AbstractString="r"; swmr=false, pv...)
Open or create an HDF5 file where `mode` is one of:
- "r" read only
- "r+" read and write
- "cw" read and write, create file if not existing, do not truncate
- "w" read and write, create a new file (destroys any existing contents)
Pass `swmr=true` to enable Single Writer Multiple Reader (SWMR) write access for "w" and
"r+", or SWMR read access for "r".
Properties can be specified as keywords for [`FileAccessProperties`](@ref) and [`FileCreateProperties`](@ref).
Also the keywords `fapl` and `fcpl` can be used to provide default instances of these property lists. Property
lists passed in via keyword will be closed. This is useful to set properties not currently defined by HDF5.jl.
Note that `h5open` uses `fclose_degree = :strong` by default, but this can be overridden by the `fapl` keyword.
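# Example
A minimal sketch (the filename is a placeholder):
```julia
using HDF5
f = h5open("data.h5", "cw")   # open read/write, creating the file if needed
write(f, "x", collect(1:3))
close(f)
```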
"""
function h5open(
filename::AbstractString,
mode::AbstractString,
fapl::FileAccessProperties,
fcpl::FileCreateProperties=FileCreateProperties();
swmr::Bool=false
)
#! format: off
rd, wr, cr, tr, ff =
mode == "r" ? (true, false, false, false, false) :
mode == "r+" ? (true, true, false, false, true ) :
mode == "cw" ? (false, true, true, false, true ) :
mode == "w" ? (false, true, true, true, false) :
# mode == "w+" ? (true, true, true, true, false) :
# mode == "a" ? (true, true, true, true, true ) :
error("invalid open mode: ", mode)
#! format: on
if ff && !wr
error("HDF5 does not support appending without writing")
end
if cr && (tr || !isfile(filename))
flag = swmr ? API.H5F_ACC_TRUNC | API.H5F_ACC_SWMR_WRITE : API.H5F_ACC_TRUNC
fid = API.h5f_create(filename, flag, fcpl, fapl)
elseif fapl.driver isa Drivers.Core && fapl.driver.backing_store == 0
flag = wr ? API.H5F_ACC_RDWR : API.H5F_ACC_RDONLY
fid = API.h5f_open(filename, flag, fapl)
else
occursin(r"(s3a?|https?)://", filename) ||
ishdf5(filename) ||
error(
"unable to determine if $filename is accessible in the HDF5 format (file may not exist)"
)
if wr
flag = swmr ? API.H5F_ACC_RDWR | API.H5F_ACC_SWMR_WRITE : API.H5F_ACC_RDWR
else
flag = swmr ? API.H5F_ACC_RDONLY | API.H5F_ACC_SWMR_READ : API.H5F_ACC_RDONLY
end
fid = API.h5f_open(filename, flag, fapl)
end
return File(fid, filename)
end
function h5open(
filename::AbstractString,
mode::AbstractString="r";
swmr::Bool=false,
# With garbage collection, the other modes don't make sense
fapl = FileAccessProperties(; fclose_degree=:strong),
fcpl = FileCreateProperties(),
pv...
)
try
pv = setproperties!(fapl, fcpl; pv...)
isempty(pv) || error("invalid keyword options $pv")
return h5open(filename, mode, fapl, fcpl; swmr=swmr)
finally
close(fapl)
close(fcpl)
end
end
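# Illustrative usage sketch (not part of the library source; the file and dataset
# names are hypothetical):
#
#     f = h5open("data.h5", "w")                 # create / truncate
#     f["A"] = collect(1:10)
#     close(f)
#
#     f = h5open("data.h5", "r+"; libver_bounds=(v"1.8", :latest))  # property keyword
#     f["B"] = rand(3, 3)
#     close(f)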
"""
h5open(f::Function, args...; pv...)
Apply the function `f` to the result of `h5open(args...; pv...)` and close the resulting
`HDF5.File` upon completion.
For example with a `do` block:
h5open("foo.h5","w") do h5
h5["foo"]=[1,2,3]
end
"""
function h5open(f::Function, args...; context=copy(CONTEXT), pv...)
file = h5open(args...; pv...)
task_local_storage(:hdf5_context, context) do
if (track_order = get(pv, :track_order, nothing)) !== nothing
context.file_create.track_order = context.group_create.track_order = track_order
end
try
f(file) # the function body can access `context` via `get_context_property`
finally
close(file)
close(context)
end
end
end
"""
h5open(file_image::Vector{UInt8}, mode::AbstractString="r";
name=nothing,
fapl=FileAccessProperties(),
increment=8192,
backing_store=false,
write_tracking=false,
page_size=524288,
pv...
)
Open a file image contained in a `Vector{UInt8}`. See [`API.h5p_set_file_image`](@ref).
Unlike [`Drivers.Core`](@ref) the default here is not to use a backing store.
"""
function h5open(
file_image::Vector{UInt8},
mode::AbstractString="r";
name=nothing,
fapl=FileAccessProperties(),
increment=8192,
backing_store=false,
write_tracking=false,
page_size=524288,
pv...
)
fapl.driver = Drivers.Core(; increment, backing_store, write_tracking, page_size)
fapl.file_image = file_image
if isnothing(name)
if fapl.driver.backing_store != 0
# The temporary file will be used as a backing store
name = tempname()
else
# Provide it with a unique name based on the objectid
name = "<memory objectid>: " * repr(objectid(file_image))
end
end
h5open(name, mode; fapl, pv...)
end
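# Sketch of an in-memory round trip through a file image (file name hypothetical);
# see also the `Vector{UInt8}(::File)` conversion defined later in this file:
#
#     img = h5open("image-demo.h5", "w") do f
#         f["x"] = collect(1:5)
#         Vector{UInt8}(f)              # serialize the open file to a byte buffer
#     end
#     fmem = h5open(img, "r")           # reopen the image without touching disk
#     x = read(fmem, "x")
#     close(fmem)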
function h5rewrite(f::Function, filename::AbstractString, args...)
tmppath, tmpio = mktemp(dirname(filename))
close(tmpio)
try
val = h5open(f, tmppath, "w", args...)
Base.Filesystem.rename(tmppath, filename)
return val
catch
Base.Filesystem.unlink(tmppath)
rethrow()
end
end
function Base.close(obj::File)
if obj.id != -1
API.h5f_close(obj)
obj.id = -1
end
nothing
end
"""
isopen(obj::HDF5.File)
Returns `true` if `obj` has not been closed, `false` if it has been closed.
"""
Base.isopen(obj::File) = obj.id != -1
"""
ishdf5(name::AbstractString)
Returns `true` if the file specified by `name` is in the HDF5 format, and `false` otherwise.
"""
function ishdf5(name::AbstractString)
    isfile(name) || return false # fast path in case the file is non-existent
    # TODO: v1.12 use the more robust API.h5f_is_accessible
try
# docs falsely claim API.h5f_is_hdf5 doesn't error, but it does
return API.h5f_is_hdf5(name)
catch
return false
end
end
# Extract the file
file(f::File) = f
file(o::Union{Object,Attribute}) = o.file
fd(obj::Object) = API.h5i_get_file_id(checkvalid(obj))
filename(obj::Union{File,Group,Dataset,Attribute,Datatype}) =
API.h5f_get_name(checkvalid(obj))
"""
start_swmr_write(h5::HDF5.File)
Start Single Writer Multiple Reader (SWMR) writing mode.
# External links
[*Single Writer Multiple Reader* from the HDF5 manual](https://portal.hdfgroup.org/display/HDF5/Single+Writer+Multiple+Reader++-+SWMR).
"""
start_swmr_write(h5::File) = API.h5f_start_swmr_write(h5)
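# Minimal SWMR sketch (one writer process, concurrent readers; file name is
# hypothetical and `create_dataset` comes from the dataset API elsewhere in the
# package):
#
#     fw = h5open("log.h5", "w")
#     d  = create_dataset(fw, "samples", Float64, ((0,), (-1,)); chunk=(100,))
#     start_swmr_write(fw)              # readers may now open while we keep writing
#
#     # in a separate reader process:
#     fr = h5open("log.h5", "r"; swmr=true)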
# Flush buffers
Base.flush(f::Union{Object,Attribute,Datatype,File}, scope=API.H5F_SCOPE_GLOBAL) =
API.h5f_flush(checkvalid(f), scope)
# File image conversion
function Vector{UInt8}(h5f::File)
flush(h5f)
API.h5f_get_file_image(h5f)
end
function File(file_image::Vector{UInt8}, name=nothing)
h5open(file_image; name)
end
Base.convert(::Type{Vector{UInt8}}, h5f::File) = Vector{UInt8}(h5f)
Base.convert(::Type{File}, file_image::Vector{UInt8}) = File(file_image)
# ---- next file from HDF5.jl v0.17.2 (MIT), https://github.com/JuliaIO/HDF5.jl.git ----
import .FileIO
function loadtodict!(d::AbstractDict, g::Union{File,Group}, prefix::String="")
for k in keys(g)
if (v = g[k]) isa Group
loadtodict!(d, v, prefix * k * "/")
else
d[prefix * k] = read(g, k)
end
end
return d
end
_infer_track_order(track_order::Union{Nothing,Bool}, dict::AbstractDict) =
something(track_order, false)
@require OrderedCollections = "bac558e1-5e72-5ebc-8fee-abe8a469f55d" begin
_infer_track_order(
track_order::Union{Nothing,Bool}, dict::OrderedCollections.OrderedDict
) = something(track_order, true)
end
# load with just a filename returns a flat dictionary containing all the variables
function fileio_load(
f::FileIO.File{FileIO.format"HDF5"};
dict=Dict{String,Any}(),
track_order::Union{Nothing,Bool}=nothing,
kwargs...
)
h5open(
FileIO.filename(f),
"r";
track_order=_infer_track_order(track_order, dict),
kwargs...
) do file
loadtodict!(dict, file)
end
end
# when called with explicitly requested variable names, return each one
function fileio_load(
f::FileIO.File{FileIO.format"HDF5"}, varname::AbstractString; kwargs...
)
h5open(FileIO.filename(f), "r"; kwargs...) do file
read(file, varname)
end
end
function fileio_load(
f::FileIO.File{FileIO.format"HDF5"}, varnames::AbstractString...; kwargs...
)
h5open(FileIO.filename(f), "r"; kwargs...) do file
map(var -> read(file, var), varnames)
end
end
# save all the key-value pairs in the dict as top-level variables
function fileio_save(
f::FileIO.File{FileIO.format"HDF5"},
dict::AbstractDict;
track_order::Union{Nothing,Bool}=nothing,
kwargs...
)
h5open(
FileIO.filename(f),
"w";
track_order=_infer_track_order(track_order, dict),
kwargs...
) do file
for (k, v) in dict
isa(k, AbstractString) || throw(
ArgumentError("keys must be strings (the names of variables), got $k")
)
write(file, String(k), v)
end
end
end
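# Sketch of how these entry points are normally reached through FileIO (file name
# and variables hypothetical; `OrderedDict` requires OrderedCollections):
#
#     using FileIO, OrderedCollections
#     save("results.h5", OrderedDict("a" => 1, "b" => [1.0, 2.0]))  # track_order inferred
#     d = load("results.h5")                # flat Dict of all variables
#     b = load("results.h5", "b")           # a single variable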
# ---- next file from HDF5.jl v0.17.2 (MIT), https://github.com/JuliaIO/HDF5.jl.git ----
"""
HDF5.Group
An object representing a [HDF5
group](https://docs.hdfgroup.org/hdf5/develop/_h5_d_m__u_g.html#subsubsec_data_model_abstract_group).
A group is analogous to a file system directory, in that, except for the root
group, every object must be a member of at least one group.
# See also
- [`create_group`](@ref)
- [`open_group`](@ref)
"""
Group
"""
create_group(parent::Union{File,Group}, path::AbstractString; properties...)
Create a new [`Group`](@ref) at `path` under the `parent` object. Optional keyword
arguments include any keywords that that belong to
[`LinkCreateProperties`](@ref) or [`GroupCreateProperties`](@ref).
"""
function create_group(
parent::Union{File,Group},
path::AbstractString,
lcpl::LinkCreateProperties,
gcpl::GroupCreateProperties;
pv...
)
if !isempty(pv)
depwarn(
"Passing properties as positional and keyword arguments in the same call is deprecated.",
:create_group
)
setproperties!(gcpl; pv...)
end
return Group(API.h5g_create(parent, path, lcpl, gcpl, API.H5P_DEFAULT), file(parent))
end
function create_group(parent::Union{File,Group}, path::AbstractString; pv...)
lcpl = _link_properties(path)
gcpl = GroupCreateProperties()
try
pv = setproperties!(lcpl, gcpl; pv...)
isempty(pv) || error("invalid keyword options $pv")
return create_group(parent, path, lcpl, gcpl)
finally
close(lcpl)
close(gcpl)
end
end
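# Illustrative sketch (file and group names hypothetical):
#
#     h5open("groups.h5", "w") do f
#         g = create_group(f, "simulation"; track_order=true)
#         sub = create_group(g, "step_1")
#         close(sub); close(g)
#     end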
"""
open_group(parent::Union{File,Group}, path::AbstractString; properties...)
Open an existing [`Group`](@ref) at `path` under the `parent` object.
Optional keyword arguments include any keywords that belong to
[`GroupAccessProperties`](@ref).
"""
function open_group(
parent::Union{File,Group}, name::AbstractString, gapl::GroupAccessProperties
)
return Group(API.h5g_open(checkvalid(parent), name, gapl), file(parent))
end
function open_group(parent::Union{File,Group}, name::AbstractString; pv...)
gapl = GroupAccessProperties(; pv...)
return open_group(parent, name, gapl)
end
# Get the root group
root(h5file::File) = open_group(h5file, "/")
root(obj::Union{Group,Dataset}) = open_group(file(obj), "/")
group_info(obj::Union{Group,File}) = API.h5g_get_info(checkvalid(obj))
Base.length(obj::Union{Group,File}) = Int(API.h5g_get_num_objs(checkvalid(obj)))
Base.isempty(x::Union{Group,File}) = length(x) == 0
# filename and name
name(obj::Union{File,Group,Dataset,Datatype}) = API.h5i_get_name(checkvalid(obj))
# iteration by objects
function Base.iterate(parent::Union{File,Group}, iter=(1, nothing))
n, prev_obj = iter
prev_obj ≢ nothing && close(prev_obj)
n > length(parent) && return nothing
obj = h5object(
API.h5o_open_by_idx(
checkvalid(parent), ".", idx_type(parent), order(parent), n - 1, API.H5P_DEFAULT
),
parent
)
return (obj, (n + 1, obj))
end
function Base.parent(obj::Union{File,Group,Dataset})
f = file(obj)
path = name(obj)
if length(path) == 1
return f
end
parentname = dirname(path)
if !isempty(parentname)
return open_object(f, dirname(path))
else
return root(f)
end
end
# Path manipulation
function split1(path::AbstractString)
ind = findfirst('/', path)
isnothing(ind) && return path, ""
if ind == 1 # matches root group
return "/", path[2:end]
else
indm1, indp1 = prevind(path, ind), nextind(path, ind)
return path[1:indm1], path[indp1:end] # better to use begin:indm1, but only available on v1.5
end
end
function Base.haskey(
parent::Union{File,Group},
path::AbstractString,
lapl::LinkAccessProperties=LinkAccessProperties()
)
# recursively check each step of the path exists
# see https://portal.hdfgroup.org/display/HDF5/H5L_EXISTS
checkvalid(parent)
first, rest = split1(path)
if first == "/"
parent = root(parent)
elseif !API.h5l_exists(parent, first, lapl)
return false
end
exists = true
if !isempty(rest)
obj = parent[first]
exists = haskey(obj, rest, lapl)
close(obj)
end
return exists
end
function Base.keys(x::Union{Group,File})
checkvalid(x)
children = sizehint!(String[], length(x))
API.h5l_iterate(x, idx_type(x), order(x)) do _, name, _
push!(children, unsafe_string(name))
return API.herr_t(0)
end
return children
end
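# Illustrative sketch of group navigation (assumes a file handle `f` opened above):
#
#     haskey(f, "simulation/step_1")    # check a nested path
#     keys(f)                           # names of the root group's children
#     for obj in f                      # iterate child objects; the previous one
#         println(name(obj))            # is closed automatically on each step
#     end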
"""
delete_object(parent::Union{File,Group}, path::AbstractString)
Delete the object at `parent[path]`.
# Examples
```julia
f = h5open("f.h5", "r+")
delete_object(f, "Group1")
```
"""
delete_object(
parent::Union{File,Group},
path::AbstractString,
lapl::LinkAccessProperties=LinkAccessProperties()
) = API.h5l_delete(checkvalid(parent), path, lapl)
delete_object(obj::Object) = delete_object(parent(obj), ascii(split(name(obj), "/")[end])) # FIXME: remove ascii?
# Move links
move_link(
src::Union{File,Group},
src_name::AbstractString,
dest::Union{File,Group},
dest_name::AbstractString=src_name,
lapl::LinkAccessProperties=LinkAccessProperties(),
lcpl::LinkCreateProperties=LinkCreateProperties()
) = API.h5l_move(checkvalid(src), src_name, checkvalid(dest), dest_name, lcpl, lapl)
move_link(
parent::Union{File,Group},
src_name::AbstractString,
dest_name::AbstractString,
lapl::LinkAccessProperties=LinkAccessProperties(),
lcpl::LinkCreateProperties=LinkCreateProperties()
) = API.h5l_move(checkvalid(parent), src_name, parent, dest_name, lcpl, lapl)
"""
create_external(source::Union{HDF5.File, HDF5.Group}, source_relpath, target_filename, target_path;
lcpl_id=HDF5.API.H5P_DEFAULT, lapl_id=HDF5.H5P.DEFAULT)
Create an external link such that `source[source_relpath]` points to `target_path` within the file
with path `target_filename`.
# See also
[`API.h5l_create_external`](@ref)
"""
function create_external(
source::Union{File,Group},
source_relpath,
target_filename,
target_path;
lcpl_id=API.H5P_DEFAULT,
lapl_id=API.H5P_DEFAULT
)
API.h5l_create_external(
target_filename, target_path, source, source_relpath, lcpl_id, lapl_id
)
nothing
end
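# Illustrative sketch: make "/ext" in main.h5 point at "/data" inside other.h5
# (all file and object names hypothetical):
#
#     h5open("main.h5", "w") do f
#         create_external(f, "ext", "other.h5", "/data")
#     end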
# ---- next file from HDF5.jl v0.17.2 (MIT), https://github.com/JuliaIO/HDF5.jl.git ----
### High-level interface ###
function h5write(filename, name::AbstractString, data; pv...)
file = h5open(filename, "cw"; pv...)
try
write(file, name, data)
finally
close(file)
end
end
function h5read(filename, name::AbstractString; pv...)
local dat, file
fapl = FileAccessProperties(; fclose_degree=:strong)
pv = setproperties!(fapl; pv...)
try
file = h5open(filename, "r", fapl)
finally
close(fapl)
end
try
obj = getindex(file, name; pv...)
dat = read(obj)
close(obj)
finally
close(file)
end
dat
end
function h5read(filename, name_type_pair::Pair{<:AbstractString,DataType}; pv...)
local dat, file
fapl = FileAccessProperties(; fclose_degree=:strong)
pv = setproperties!(fapl; pv...)
try
file = h5open(filename, "r", fapl)
finally
close(fapl)
end
try
obj = getindex(file, name_type_pair[1]; pv...)
dat = read(obj, name_type_pair[2])
close(obj)
finally
close(file)
end
dat
end
function h5read(
filename,
name::AbstractString,
indices::Tuple{Vararg{Union{AbstractRange{Int},Int,Colon}}};
pv...
)
local dat, file
fapl = FileAccessProperties(; fclose_degree=:strong)
pv = setproperties!(fapl; pv...)
try
file = h5open(filename, "r", fapl)
finally
close(fapl)
end
try
dset = getindex(file, name; pv...)
dat = dset[indices...]
close(dset)
finally
close(file)
end
dat
end
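# Illustrative sketch of the one-shot convenience functions above (file and
# dataset names hypothetical):
#
#     h5write("quick.h5", "group/A", rand(100, 100))
#     A  = h5read("quick.h5", "group/A")              # whole dataset
#     A1 = h5read("quick.h5", "group/A", (1:10, :))   # partial read by indices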
function Base.getindex(parent::Union{File,Group}, path::AbstractString; pv...)
haskey(parent, path) || throw(KeyError(path))
# Faster than below if defaults are OK
isempty(pv) && return open_object(parent, path)
obj_type = gettype(parent, path)
if obj_type == API.H5I_DATASET
return open_dataset(parent, path; pv...)
elseif obj_type == API.H5I_GROUP
return open_group(parent, path; pv...)
else#if obj_type == API.H5I_DATATYPE # only remaining choice
return open_datatype(parent, path; pv...)
end
end
# Assign syntax: obj[path] = value
# Create a dataset with properties: obj[path, prop = val, ...] = val
function Base.setindex!(
parent::Union{File,Group}, val, path::Union{AbstractString,Nothing}; pv...
)
need_chunks = any(k in keys(chunked_props) for k in keys(pv))
have_chunks = any(k == :chunk for k in keys(pv))
chunk = need_chunks ? heuristic_chunk(val) : Int[]
# ignore chunked_props (== compression) for empty datasets (issue #246):
discard_chunks = need_chunks && isempty(chunk)
if discard_chunks
pv = pairs(Base.structdiff((; pv...), chunked_props))
else
if need_chunks && !have_chunks
pv = pairs((; chunk=chunk, pv...))
end
end
write(parent, path, val; pv...)
end
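# Illustrative sketch of the assignment syntax with dataset-creation keywords
# (dataset names hypothetical); a chunk size is chosen heuristically when a
# compression keyword is given without an explicit `chunk`:
#
#     h5open("assign.h5", "w") do f
#         f["plain"]                 = rand(10)
#         f["compressed", deflate=3] = rand(1000, 1000)
#         f["chunked", chunk=(100, 100), shuffle=true, deflate=3] = rand(1000, 1000)
#     end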
### Property manipulation ###
get_access_properties(d::Dataset) = DatasetAccessProperties(API.h5d_get_access_plist(d))
get_access_properties(f::File) = FileAccessProperties(API.h5f_get_access_plist(f))
get_create_properties(d::Dataset) = DatasetCreateProperties(API.h5d_get_create_plist(d))
get_create_properties(g::Group) = GroupCreateProperties(API.h5g_get_create_plist(g))
get_create_properties(f::File) = FileCreateProperties(API.h5f_get_create_plist(f))
get_create_properties(a::Attribute) = AttributeCreateProperties(API.h5a_get_create_plist(a))
# ---- next file from HDF5.jl v0.17.2 (MIT), https://github.com/JuliaIO/HDF5.jl.git ----
# Ensure that objects haven't been closed
Base.isvalid(obj::Union{File,Datatype,Dataspace}) = obj.id != -1 && API.h5i_is_valid(obj)
Base.isvalid(obj::Union{Group,Dataset,Attribute}) =
obj.id != -1 && obj.file.id != -1 && API.h5i_is_valid(obj)
checkvalid(obj) = isvalid(obj) ? obj : error("File or object has been closed")
# Close functions
# Close functions that should first check that the file is still open. The common case is a
# file that has been closed with CLOSE_STRONG but there are still finalizers that have not run
# for the datasets, etc., in the file.
function Base.close(obj::Union{Group,Dataset})
if obj.id != -1
if obj.file.id != -1 && isvalid(obj)
API.h5o_close(obj)
end
obj.id = -1
end
nothing
end
# Object (group, named datatype, or dataset) open
function h5object(obj_id::API.hid_t, parent)
obj_type = API.h5i_get_type(obj_id)
if obj_type == API.H5I_GROUP
Group(obj_id, file(parent))
elseif obj_type == API.H5I_DATATYPE
Datatype(obj_id, file(parent))
elseif obj_type == API.H5I_DATASET
Dataset(obj_id, file(parent))
else
error("Invalid object type for path ", path)
end
end
open_object(parent, path::AbstractString) =
h5object(API.h5o_open(checkvalid(parent), path, API.H5P_DEFAULT), parent)
function gettype(parent, path::AbstractString)
obj_id = API.h5o_open(checkvalid(parent), path, API.H5P_DEFAULT)
obj_type = API.h5i_get_type(obj_id)
API.h5o_close(obj_id)
return obj_type
end
# Copy objects
"""
copy_object(src_parent::Union{File,Group}, src_path::AbstractString, dst_parent::Union{File,Group}, dst_path::AbstractString)
Copy data from `src_parent[src_path]` to `dst_parent[dst_path]`.
# Examples
```julia
f = h5open("f.h5", "r")
g = h5open("g.h5", "cw")
copy_object(f, "Group1", g, "GroupA")
copy_object(f["Group1"], "data1", g, "DataSet/data_1")
```
"""
copy_object(
src_parent::Union{File,Group},
src_path::AbstractString,
dst_parent::Union{File,Group},
dst_path::AbstractString
) = API.h5o_copy(
checkvalid(src_parent),
src_path,
checkvalid(dst_parent),
dst_path,
API.H5P_DEFAULT,
_link_properties(dst_path)
)
"""
copy_object(src_obj::Object, dst_parent::Union{File,Group}, dst_path::AbstractString)
# Examples
```julia
copy_object(f["Group1"], g, "GroupA")
copy_object(f["Group1/data1"], g, "DataSet/data_1")
```
"""
copy_object(src_obj::Object, dst_parent::Union{File,Group}, dst_path::AbstractString) =
API.h5o_copy(
checkvalid(src_obj),
".",
checkvalid(dst_parent),
dst_path,
API.H5P_DEFAULT,
_link_properties(dst_path)
)
# ---- next file from HDF5.jl v0.17.2 (MIT), https://github.com/JuliaIO/HDF5.jl.git ----
abstract type Properties end
include("filters/filters.jl")
include("drivers/drivers.jl")
Base.cconvert(::Type{API.hid_t}, obj::Properties) = obj
Base.unsafe_convert(::Type{API.hid_t}, obj::Properties) = obj.id
function Base.close(obj::Properties)
if obj.id != -1
if isvalid(obj)
API.h5p_close(obj)
end
obj.id = -1
end
nothing
end
Base.isvalid(obj::Properties) = obj.id != -1 && API.h5i_is_valid(obj)
Base.copy(obj::P) where {P<:Properties} = P(HDF5.API.h5p_copy(obj.id))
# By default, properties objects are only initialized lazily
function init!(prop::P) where {P<:Properties}
if !isvalid(prop)
prop.id = API.h5p_create(classid(P))
end
return prop
end
function (::Type{P})(; kwargs...) where {P<:Properties}
obj = P(API.H5P_DEFAULT)
for (k, v) in kwargs
setproperty!(obj, k, v)
end
return obj
end
# Properties() do syntax
function (::Type{P})(func::Function; kwargs...) where {P<:Properties}
p = P(; kwargs...)
# Eagerly initialize when using do syntax
    # This allows for use of low-level API calls
init!(p)
try
func(p)
finally
close(p)
end
end
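# Illustrative sketch: property lists can be built from keywords, or used through
# the `do` form, which hands over an initialized list and closes it afterwards:
#
#     fapl = FileAccessProperties(; fclose_degree=:weak)
#     close(fapl)
#
#     FileAccessProperties() do fapl
#         fapl.alignment = (4096, 4096)   # low-level tweak on the live list
#     end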
function Base.getproperty(p::P, name::Symbol) where {P<:Properties}
name === :id ? getfield(p, :id) : class_getproperty(P, init!(p), name)
end
function Base.setproperty!(p::P, name::Symbol, val) where {P<:Properties}
if name === :id
return setfield!(p, :id, API.hid_t(val))
end
init!(p)
class_setproperty!(P, p, name, val)
end
Base.propertynames(p::P) where {P<:Properties} = (all_propertynames(P)..., :id)
all_propertynames(::Type{P}) where {P<:Properties} =
(class_propertynames(P)..., all_propertynames(superclass(P))...,)
# defaults: refer to super class
class_getproperty(::Type{P}, props, name) where {P<:Properties} =
class_getproperty(superclass(P), props, name)
class_setproperty!(::Type{P}, p, name, val) where {P<:Properties} =
class_setproperty!(superclass(P), p, name, val)
class_propertynames(::Type{P}) where {P<:Properties} = ()
"""
@propertyclass P classid
Define a new subtype of `P <: Properties` corresponding to a HDF5 property list
with class identifier `classid`.
Once defined, the following interfaces can be defined:
superclass(::Type{P})
This should return the type from which `P` inherits. If not defined, it will
inherit from `GenericProperties`.
class_propertynames(::Type{P})
This should return a `Tuple` of `Symbol`s, being the names of the properties
associated with `P`.
class_getproperty(::Type{P}, p::Properties, name::Symbol)
If `name` is an associated property of type `P`, this should return the value of
the property, otherwise call `class_getproperty(superclass(P), p, name)`.
class_setproperty!(::Type{P}, p::Properties, name::Symbol, val)
If `name` is an associated property of type `P`, this should set the value of
the property, otherwise call `class_setproperty!(superclass(P), p, name, val)`.
"""
macro propertyclass(name, classid)
expr = quote
Core.@__doc__ mutable struct $name <: Properties
id::API.hid_t
function $name(id::API.hid_t)
obj = new(id)
finalizer(API.try_close_finalizer, obj)
obj
end
end
classid(::Type{$name}) = $classid
end
return esc(expr)
end
@propertyclass GenericProperties API.H5P_DEFAULT
superclass(::Type{P}) where {P<:Properties} = GenericProperties
class_getproperty(::Type{GenericProperties}, props, name) =
error("$(typeof(props)) has no property $name")
class_setproperty!(::Type{GenericProperties}, props, name, val) =
error("$(typeof(props)) has no property $name")
all_propertynames(::Type{GenericProperties}) = ()
# for initializing multiple Properties from a set of keyword arguments
"""
setproperties!(props::Properties...; kwargs...)
For each `(key, value)` pair in `kwargs`, set the corresponding properties in
each `Properties` object in `props`. Returns a `Dict` of any pairs which didn't
match properties in `props`.
"""
function setproperties!(props::Properties...; kwargs...)
filter(kwargs) do (k, v)
found = false
for prop in props
if k in all_propertynames(typeof(prop))
setproperty!(prop, k, v)
found = true
end
end
return !found
end
end
###
### Convenience macros for defining getter/setter functions
###
"""
@tuple_property(name)
"""
macro tuple_property(property)
get_property = Symbol(:get_, property)
set_property! = Symbol(:set_, property, :!)
api_get_property = :(API.$(Symbol(:h5p_get_, property)))
api_set_property = :(API.$(Symbol(:h5p_set_, property)))
quote
function $(esc(get_property))(p::Properties)
return $api_get_property(p)
end
function $(esc(set_property!))(p::Properties, val::Tuple)
return $api_set_property(p, val...)
end
end
end
"""
@enum_property(name, sym1 => enumvalue1, sym2 => enumvalue2, ...)
Wrap property getter/setter API functions that use enum values to use symbol instead.
"""
macro enum_property(property, pairs...)
get_property = Symbol(:get_, property)
set_property! = Symbol(:set_, property, :!)
api_get_property = :(API.$(Symbol(:h5p_get_, property)))
api_set_property = :(API.$(Symbol(:h5p_set_, property)))
get_expr = :(error("Unknown $property value $enum"))
set_expr = :(throw(ArgumentError("Invalid $property $val")))
for pair in reverse(pairs)
@assert pair isa Expr && pair.head == :call && pair.args[1] == :(=>)
_, val, enum = pair.args
get_expr = :(enum == $enum ? $val : $get_expr)
set_expr = :(val == $val ? $enum : $set_expr)
end
quote
function $(esc(get_property))(p::Properties)
property = $(QuoteNode(property))
enum = $api_get_property(p)
return $get_expr
end
function $(esc(set_property!))(p::Properties, val)
property = $(QuoteNode(property))
enum = $set_expr
return $api_set_property(p, enum)
end
function $(esc(set_property!))(p::Properties, enum::Integer)
# deprecate?
return $api_set_property(p, enum)
end
end
end
"""
@bool_property(name)
Wrap property getter/setter API functions that use `0`/`1` to use `Bool` values
"""
macro bool_property(property)
get_property = Symbol(:get_, property)
set_property! = Symbol(:set_, property, :!)
api_get_property = :(API.$(Symbol(:h5p_get_, property)))
api_set_property = :(API.$(Symbol(:h5p_set_, property)))
quote
function $(esc(get_property))(p::Properties)
return $api_get_property(p) != 0
end
function $(esc(set_property!))(p::Properties, val)
return $api_set_property(p, val)
end
end
end
###
### Define Properties types
###
#! format: off
"""
ObjectCreateProperties(;kws...)
ObjectCreateProperties(f::Function; kws...)
Properties used when creating a new object. Available options:
- `obj_track_times :: Bool`: governs the recording of times associated with an
object. If set to `true`, time data will be recorded. See
$(h5doc("H5P_SET_OBJ_TRACK_TIMES")).
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass ObjectCreateProperties API.H5P_OBJECT_CREATE
@bool_property(obj_track_times)
class_propertynames(::Type{ObjectCreateProperties}) = (
:obj_track_times,
:track_times,
)
function class_getproperty(::Type{ObjectCreateProperties}, p::Properties, name::Symbol)
name === :obj_track_times ? get_obj_track_times(p) :
# deprecated
name === :track_times ? (depwarn("`track_times` property is deprecated, use `obj_track_times` instead",:track_times); get_obj_track_times(p)) :
class_getproperty(superclass(ObjectCreateProperties), p, name)
end
function class_setproperty!(::Type{ObjectCreateProperties}, p::Properties, name::Symbol, val)
name === :obj_track_times ? set_obj_track_times!(p, val) :
# deprecated
name === :track_times ? (depwarn("`track_times=$val` keyword option is deprecated, use `obj_track_times=$val` instead",:track_times); set_obj_track_times!(p, val)) :
class_setproperty!(superclass(ObjectCreateProperties), p, name, val)
end
get_track_order(p::Properties) = API.h5p_get_link_creation_order(p) != 0 && API.h5p_get_attr_creation_order(p) != 0
function set_track_order!(p::Properties, val::Bool)
crt_order_flags = val ? (API.H5P_CRT_ORDER_TRACKED | API.H5P_CRT_ORDER_INDEXED) : 0
API.h5p_set_link_creation_order(p, crt_order_flags)
API.h5p_set_attr_creation_order(p, crt_order_flags)
nothing
end
"""
GroupCreateProperties(;kws...)
GroupCreateProperties(f::Function; kws...)
Properties used when creating a new `Group`. Inherits from
[`ObjectCreateProperties`](@ref), with additional options:
- `local_heap_size_hint :: Integer`: the anticipated maximum local heap size in
bytes. See $(h5doc("H5P_SET_LOCAL_HEAP_SIZE_HINT")).
- `track_order :: Bool`: tracks the group creation order.
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass GroupCreateProperties API.H5P_GROUP_CREATE
superclass(::Type{GroupCreateProperties}) = ObjectCreateProperties
class_propertynames(::Type{GroupCreateProperties}) = (
:local_heap_size_hint,
:track_order,
)
function class_getproperty(::Type{GroupCreateProperties}, p::Properties, name::Symbol)
name === :local_heap_size_hint ? API.h5p_get_local_heap_size_hint(p) :
name === :track_order ? get_track_order(p) :
class_getproperty(superclass(GroupCreateProperties), p, name)
end
function class_setproperty!(::Type{GroupCreateProperties}, p::Properties, name::Symbol, val)
name === :local_heap_size_hint ? API.h5p_set_local_heap_size_hint(p, val) :
name === :track_order ? set_track_order!(p, val) :
class_setproperty!(superclass(GroupCreateProperties), p, name, val)
end
"""
FileCreateProperties(;kws...)
FileCreateProperties(f::Function; kws...)
Properties used when creating a new `File`. Inherits from
[`ObjectCreateProperties`](@ref), with additional properties:
- `userblock :: Integer`: user block size in bytes. The default user block size
is 0; it may be set to any power of 2 equal to 512 or greater (512, 1024,
2048, etc.). See $(h5doc("H5P_SET_USERBLOCK")).
- `track_order :: Bool`: tracks the file creation order.
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass FileCreateProperties API.H5P_FILE_CREATE
superclass(::Type{FileCreateProperties}) = ObjectCreateProperties
class_propertynames(::Type{FileCreateProperties}) = (
:userblock,
:track_order,
:strategy,
:persist,
:threshold,
:file_space_page_size
)
const FSPACE_STRATEGY_SYMBOLS = Dict(
:fsm_aggr => API.H5F_FSPACE_STRATEGY_FSM_AGGR,
:page => API.H5F_FSPACE_STRATEGY_PAGE,
:aggr => API.H5F_FSPACE_STRATEGY_AGGR,
:none => API.H5F_FSPACE_STRATEGY_NONE,
:ntypes => API.H5F_FSPACE_STRATEGY_NTYPES
)
set_strategy!(p::FileCreateProperties, val) = API.h5p_set_file_space_strategy(p, strategy = val)
set_strategy!(p::FileCreateProperties, val::Symbol) = API.h5p_set_file_space_strategy(p, strategy = FSPACE_STRATEGY_SYMBOLS[val])
function get_strategy(p::FileCreateProperties)
strategy = API.h5p_get_file_space_strategy(p)[:strategy]
for (k, v) in FSPACE_STRATEGY_SYMBOLS
if v == strategy
return k
end
end
return :unknown
end
function class_getproperty(::Type{FileCreateProperties}, p::Properties, name::Symbol)
name === :userblock ? API.h5p_get_userblock(p) :
name === :track_order ? get_track_order(p) :
name === :strategy ? get_strategy(p) :
name === :persist ? API.h5p_get_file_space_strategy(p)[:persist] :
name === :threshold ? API.h5p_get_file_space_strategy(p)[:threshold] :
name === :file_space_page_size ? API.h5p_get_file_space_page_size(p) :
class_getproperty(superclass(FileCreateProperties), p, name)
end
function class_setproperty!(::Type{FileCreateProperties}, p::Properties, name::Symbol, val)
name === :userblock ? API.h5p_set_userblock(p, val) :
name === :track_order ? set_track_order!(p, val) :
name === :strategy ? set_strategy!(p, val) :
name === :persist ? API.h5p_set_file_space_strategy(p, persist = val) :
name === :threshold ? API.h5p_set_file_space_strategy(p, threshold = val) :
name === :file_space_page_size ? API.h5p_set_file_space_page_size(p, val) :
class_setproperty!(superclass(FileCreateProperties), p, name, val)
end
"""
DatatypeCreateProperties(;kws...)
DatatypeCreateProperties(f::Function; kws...)
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass DatatypeCreateProperties API.H5P_DATATYPE_CREATE
superclass(::Type{DatatypeCreateProperties}) = ObjectCreateProperties
"""
DatasetCreateProperties(;kws...)
DatasetCreateProperties(f::Function; kws...)
Properties used when creating a new `Dataset`. Inherits from
[`ObjectCreateProperties`](@ref), with additional properties:
- `alloc_time`: the timing for the allocation of storage space for a dataset's
raw data; one of:
- `:default`
- `:early`: allocate all space when the dataset is created
- `:incremental`: Allocate space incrementally, as data is written to the
dataset
- `:late`: Allocate all space when data is first written to the dataset.
See $(h5doc("H5P_SET_ALLOC_TIME")).
- `fill_time`: the timing of when the dataset should be filled; one of:
- `:alloc`: Fill when allocated
- `:never`: Never fill
- `:ifset`: Fill if a value is set
- `fill_value`: the fill value for a dataset. See $(h5doc("H5P_SET_FILL_VALUE")).
- `chunk`: a tuple containing the size of the chunks to store each dimension.
See $(h5doc("H5P_SET_CHUNK")) (note that this uses Julia's column-major
ordering).
- `external`: A tuple of `(name,offset,size)`, See $(h5doc("H5P_SET_EXTERNAL")).
- `filters` (only valid when `layout=:chunked`): a filter or vector of filters
  that are applied to each chunk of a dataset; see [Filters](@ref).
When accessed, will return a [`Filters.FilterPipeline`](@ref) object that can
be modified in-place.
- `layout`: the type of storage used to store the raw data for a dataset. Can be
one of:
- `:compact`: Store raw data in the dataset object header in file. This
should only be used for datasets with small amounts of raw data.
- `:contiguous`: Store raw data separately from the object header in one
large chunk in the file.
- `:chunked`: Store raw data separately from the object header as chunks of
data in separate locations in the file.
- `:virtual`: Draw raw data from multiple datasets in different files. See
the `virtual` property below.
See $(h5doc("H5P_SET_LAYOUT")).
- `no_attrs_hint`: Minimize the space for dataset metadata by hinting that no
attributes will be added if set to `true`. Attributes can still be added but
may exist elsewhere within the file. See
$(h5doc("H5P_SET_DSET_NO_ATTRS_HINT")).
- `virtual`: when specified, creates a virtual dataset (VDS). The argument
should be a "virtuala collection of [`VirtualMapping`](@ref) objects for
describing the mapping from the dataset to the source datasets. When accessed,
returns a [`VirtualLayout`](@ref) object.
The following options are shortcuts for the various filters, and are set-only.
They will be appended to the filter pipeline in the order in which they appear
- `blosc = true | level`: set the [`H5Zblosc.BloscFilter`](@ref) compression
filter; argument can be either `true`, or the compression level.
- `deflate = true | level`: set the [`Filters.Deflate`](@ref) compression
filter; argument can be either `true`, or the compression level.
- `fletcher32 = true`: set the [`Filters.Fletcher32`](@ref) checksum filter.
- `shuffle = true`: set the [`Filters.Shuffle`](@ref) filter.
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass DatasetCreateProperties API.H5P_DATASET_CREATE
superclass(::Type{DatasetCreateProperties}) = ObjectCreateProperties
@enum_property(alloc_time,
:default => API.H5D_ALLOC_TIME_DEFAULT,
:early => API.H5D_ALLOC_TIME_EARLY,
:incremental => API.H5D_ALLOC_TIME_INCR,
:late => API.H5D_ALLOC_TIME_LATE)
# reverse indices
function get_chunk(p::Properties)
dims, N = API.h5p_get_chunk(p)
ntuple(i -> Int(dims[N-i+1]), N)
end
set_chunk!(p::Properties, dims) = API.h5p_set_chunk(p, length(dims), API.hsize_t[reverse(dims)...])
@enum_property(layout,
:compact => API.H5D_COMPACT,
:contiguous => API.H5D_CONTIGUOUS,
:chunked => API.H5D_CHUNKED,
:virtual => API.H5D_VIRTUAL)
# See https://portal.hdfgroup.org/display/HDF5/H5P_SET_FILL_TIME
@enum_property(fill_time,
:alloc => API.H5D_FILL_TIME_ALLOC,
:never => API.H5D_FILL_TIME_NEVER,
:ifset => API.H5D_FILL_TIME_IFSET
)
# filters getters/setters
get_filters(p::Properties) = Filters.FilterPipeline(p)
set_filters!(p::Properties, val::Filters.Filter) = push!(empty!(Filters.FilterPipeline(p)), val)
set_filters!(p::Properties, vals::Union{Tuple, AbstractVector}) = append!(empty!(Filters.FilterPipeline(p)), vals)
# convenience
set_deflate!(p::Properties, val::Bool) = val && push!(Filters.FilterPipeline(p), Filters.Deflate())
set_deflate!(p::Properties, level::Integer) = push!(Filters.FilterPipeline(p), Filters.Deflate(level=level))
set_shuffle!(p::Properties, val::Bool) = val && push!(Filters.FilterPipeline(p), Filters.Shuffle())
set_fletcher32!(p::Properties, val::Bool) = val && push!(Filters.FilterPipeline(p), Filters.Fletcher32())
set_blosc!(p::Properties, val) = error("The Blosc filter now requires the H5Zblosc package be loaded")
get_virtual(p::Properties) = VirtualLayout(p)
set_virtual!(p::Properties, vmaps) = append!(VirtualLayout(p), vmaps)
class_propertynames(::Type{DatasetCreateProperties}) = (
:alloc_time,
:fill_time,
:fill_value,
:chunk,
:external,
:filters,
:layout,
:no_attrs_hint,
:virtual,
# convenience
:blosc,
:deflate,
:fletcher32,
:shuffle,
# deprecated
:compress,
:filter
)
function class_getproperty(::Type{DatasetCreateProperties}, p::Properties, name::Symbol)
name === :alloc_time ? get_alloc_time(p) :
name === :fill_time ? get_fill_time(p) :
name === :fill_value ? get_fill_value(p) :
name === :chunk ? get_chunk(p) :
name === :external ? API.h5p_get_external(p) :
name === :filters ? get_filters(p) :
name === :layout ? get_layout(p) :
name === :no_attrs_hint ?
@static(API.h5_get_libversion() < v"1.10.5" ?
false :
API.h5p_get_dset_no_attrs_hint(p)
) :
name === :virtual ? get_virtual(p) :
# deprecated
name === :filter ? (depwarn("`filter` property name is deprecated, use `filters` instead",:class_getproperty); get_filters(p)) :
class_getproperty(superclass(DatasetCreateProperties), p, name)
end
function class_setproperty!(::Type{DatasetCreateProperties}, p::Properties, name::Symbol, val)
name === :alloc_time ? set_alloc_time!(p, val) :
name === :fill_time ? set_fill_time!(p, val) :
name === :fill_value ? set_fill_value!(p, val) :
name === :chunk ? set_chunk!(p, val) :
name === :external ? API.h5p_set_external(p, val...) :
name === :filters ? set_filters!(p, val) :
name === :layout ? set_layout!(p, val) :
name === :no_attrs_hint ?
@static(API.h5_get_libversion() < v"1.10.5" ?
error("no_attrs_hint is only valid for HDF5 library versions 1.10.5 or greater") :
API.h5p_set_dset_no_attrs_hint(p, val)
) :
name === :virtual ? set_virtual!(p, val) :
# set-only for convenience
name === :blosc ? set_blosc!(p, val) :
name === :deflate ? set_deflate!(p, val) :
name === :fletcher32 ? set_fletcher32!(p, val) :
name === :shuffle ? set_shuffle!(p, val) :
# deprecated
name === :filter ? (depwarn("`filter=$val` keyword option is deprecated, use `filters=$val` instead",:class_setproperty!); set_filters!(p, val)) :
name === :compress ? (depwarn("`compress=$val` keyword option is deprecated, use `deflate=$val` instead",:class_setproperty!); set_deflate!(p, val)) :
class_setproperty!(superclass(DatasetCreateProperties), p, name, val)
end
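# Illustrative sketch (assumes `create_dataset` from the dataset API; names are
# hypothetical); the same keywords can also be passed directly at dataset creation:
#
#     dcpl = DatasetCreateProperties(; chunk=(100, 100), shuffle=true, deflate=3)
#     close(dcpl)
#     # or: create_dataset(f, "A", Float64, (1000, 1000); chunk=(100, 100), shuffle=true, deflate=3)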
"""
StringCreateProperties(;kws...)
StringCreateProperties(f::Function; kws...)
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass StringCreateProperties API.H5P_STRING_CREATE
@enum_property(char_encoding,
:ascii => API.H5T_CSET_ASCII,
:utf8 => API.H5T_CSET_UTF8)
class_propertynames(::Type{StringCreateProperties}) = (
:char_encoding,
)
function class_getproperty(::Type{StringCreateProperties}, p::Properties, name::Symbol)
name === :char_encoding ? get_char_encoding(p) :
class_getproperty(superclass(StringCreateProperties), p, name)
end
function class_setproperty!(::Type{StringCreateProperties}, p::Properties, name::Symbol, val)
name === :char_encoding ? set_char_encoding!(p, val) :
class_setproperty!(superclass(StringCreateProperties), p, name, val)
end
"""
LinkCreateProperties(;kws...)
LinkCreateProperties(f::Function; kws...)
Properties used when creating links.
- `char_encoding`: the character encoding, either `:ascii` or `:utf8`.
- `create_intermediate_group :: Bool`: if `true`, will create missing
intermediate groups.
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass LinkCreateProperties API.H5P_LINK_CREATE
superclass(::Type{LinkCreateProperties}) = StringCreateProperties
@bool_property(create_intermediate_group)
class_propertynames(::Type{LinkCreateProperties}) = (
:create_intermediate_group,
)
function class_getproperty(::Type{LinkCreateProperties}, p::Properties, name::Symbol)
name === :create_intermediate_group ? get_create_intermediate_group(p) :
class_getproperty(superclass(LinkCreateProperties), p, name)
end
function class_setproperty!(::Type{LinkCreateProperties}, p::Properties, name::Symbol, val)
name === :create_intermediate_group ? set_create_intermediate_group!(p, val) :
class_setproperty!(superclass(LinkCreateProperties), p, name, val)
end
"""
AttributeCreateProperties(;kws...)
AttributeCreateProperties(f::Function; kws...)
Properties used when creating attributes.
- `char_encoding`: the character encoding, either `:ascii` or `:utf8`.
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass AttributeCreateProperties API.H5P_ATTRIBUTE_CREATE
superclass(::Type{AttributeCreateProperties}) = StringCreateProperties
"""
FileAccessProperties(;kws...)
FileAccessProperties(f::Function; kws...)
Properties used when accessing files.
- `alignment :: Tuple{Integer, Integer}`: a `(threshold, alignment)` pair: any
file object greater than or equal in size to threshold bytes will be aligned
  on an address which is a multiple of alignment. The default is `(1, 1)`, implying
  no alignment.
- `driver`: the file driver used to access the file. See [Drivers](@ref).
- `driver_info` (get only)
- `fclose_degree`: file close degree property. One of:
- `:weak`
- `:semi`
- `:strong`
- `:default`
- `libver_bounds`: a `(low, high)` pair: `low` sets the earliest possible format
versions that the library will use when creating objects in the file; `high`
sets the latest format versions that the library will be allowed to use when
creating objects in the file. Values can be a `VersionNumber` for the hdf5
  library, `:earliest`, or `:latest`. See $(h5doc("H5P_SET_LIBVER_BOUNDS"))
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass FileAccessProperties API.H5P_FILE_ACCESS
# Defaults for FileAccessProperties
function init!(fapl::FileAccessProperties)
# Call default init! for Properties
invoke(init!, Tuple{Properties}, fapl)
# Disable file locking by default for mmap
@static if API.has_h5p_set_file_locking()
API.h5p_set_file_locking(fapl, false, true)
end
set_fclose_degree!(fapl, :strong)
return fapl
end
@tuple_property(alignment)
@enum_property(fclose_degree,
:weak => API.H5F_CLOSE_WEAK,
:semi => API.H5F_CLOSE_SEMI,
:strong => API.H5F_CLOSE_STRONG,
:default => API.H5F_CLOSE_DEFAULT)
# getter/setter for libver_bounds
libver_bound_to_enum(val::Integer) = val
libver_bound_to_enum(val::API.H5F_libver_t) = val
function libver_bound_to_enum(val::VersionNumber)
val >= v"1.15" ? API.H5F_LIBVER_V116 :
val >= v"1.14" ? API.H5F_LIBVER_V114 :
val >= v"1.12" ? API.H5F_LIBVER_V112 :
val >= v"1.10" ? API.H5F_LIBVER_V110 :
val >= v"1.8" ? API.H5F_LIBVER_V18 :
throw(ArgumentError("libver_bound must be >= v\"1.8\"."))
end
function libver_bound_to_enum(val::Symbol)
val == :earliest ? API.H5F_LIBVER_EARLIEST :
val == :latest ? API.H5F_LIBVER_LATEST :
throw(ArgumentError("Invalid libver_bound $val."))
end
function libver_bound_from_enum(enum::API.H5F_libver_t)
enum == API.H5F_LIBVER_EARLIEST ? :earliest :
enum == API.H5F_LIBVER_V18 ? v"1.8" :
enum == API.H5F_LIBVER_V110 ? v"1.10" :
enum == API.H5F_LIBVER_V112 ? v"1.12" :
enum == API.H5F_LIBVER_V114 ? v"1.14" :
enum == API.H5F_LIBVER_V116 ? v"1.16" :
error("Unknown libver_bound value $enum")
end
libver_bound_from_enum(enum) = libver_bound_from_enum(API.H5F_libver_t(enum))
function get_libver_bounds(p::Properties)
low, high = API.h5p_get_libver_bounds(p)
return libver_bound_from_enum(low), libver_bound_from_enum(high)
end
function set_libver_bounds!(p::Properties, (low, high)::Tuple{Any,Any})
API.h5p_set_libver_bounds(p, libver_bound_to_enum(low), libver_bound_to_enum(high))
end
function set_libver_bounds!(p::Properties, val)
API.h5p_set_libver_bounds(p, libver_bound_to_enum(val), libver_bound_to_enum(val))
end
class_propertynames(::Type{FileAccessProperties}) = (
:alignment,
:driver,
:driver_info,
:fapl_mpio,
:fclose_degree,
:file_locking,
:libver_bounds,
:meta_block_size,
:file_image,
)
function class_getproperty(::Type{FileAccessProperties}, p::Properties, name::Symbol)
name === :alignment ? get_alignment(p) :
name === :driver ? Drivers.get_driver(p) :
name === :driver_info ? API.h5p_get_driver_info(p) : # get only
name === :fclose_degree ? get_fclose_degree(p) :
name === :file_locking ? API.h5p_get_file_locking(p) :
name === :libver_bounds ? get_libver_bounds(p) :
name === :meta_block_size ? API.h5p_get_meta_block_size(p) :
name === :file_image ? API.h5p_get_file_image(p) :
# deprecated
name === :fapl_mpio ? (depwarn("The `fapl_mpio` property is deprecated, use `driver=HDF5.Drivers.MPIO(...)` instead.", :fapl_mpio); drv = get_driver(p, MPIO); (drv.comm, drv.info)) :
class_getproperty(superclass(FileAccessProperties), p, name)
end
function class_setproperty!(::Type{FileAccessProperties}, p::Properties, name::Symbol, val)
name === :alignment ? set_alignment!(p, val) :
name === :driver ? Drivers.set_driver!(p, val) :
name === :fclose_degree ? set_fclose_degree!(p, val) :
name === :file_locking ? API.h5p_set_file_locking(p, val...) :
name === :libver_bounds ? set_libver_bounds!(p, val) :
name === :meta_block_size ? API.h5p_set_meta_block_size(p, val) :
name === :file_image ? API.h5p_set_file_image(p, val) :
# deprecated
name === :fapl_mpio ? (depwarn("The `fapl_mpio` property is deprecated, use `driver=HDF5.Drivers.MPIO(...)` instead.", :fapl_mpio); Drivers.set_driver!(p, Drivers.MPIO(val...))) :
class_setproperty!(superclass(FileAccessProperties), p, name, val)
end
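# Illustrative sketch: build a file-access property list and pass it to the
# positional `h5open` method (file name hypothetical); unlike the keyword form,
# the caller is responsible for closing the list:
#
#     fapl = FileAccessProperties(; fclose_degree=:strong, libver_bounds=(v"1.8", :latest))
#     f = h5open("tuned.h5", "w", fapl)
#     close(f)
#     close(fapl)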
@propertyclass LinkAccessProperties API.H5P_LINK_ACCESS
"""
GroupAccessProperties(;kws...)
Properties used when accessing groups. None are currently defined.
"""
@propertyclass GroupAccessProperties API.H5P_GROUP_ACCESS
superclass(::Type{GroupAccessProperties}) = LinkAccessProperties
"""
DatatypeAccessProperties(;kws...)
Properties used when accessing datatypes. None are currently defined.
"""
@propertyclass DatatypeAccessProperties API.H5P_DATATYPE_ACCESS
superclass(::Type{DatatypeAccessProperties}) = LinkAccessProperties
"""
DatasetAccessProperties(;kws...)
DatasetAccessProperties(f::Function; kws...)
Properties that control access to data in external, virtual, and chunked datasets.
- `chunk_cache`: Chunk cache parameters as (nslots, nbytes, w0).
Default: (521, 0x100000, 0.75)
- `efile_prefix`: Path prefix for reading external files.
The default is the current working directory.
  - `:origin`: alias for `raw"\$ORIGIN"`, which will make the external file relative to
the HDF5 file.
- `virtual_prefix`: Path prefix for reading virtual datasets.
- `virtual_printf_gap`: The maximum number of missing source files and/or
datasets with the printf-style names when getting the extent of an unlimited
virtual dataset
- `virtual_view`: Influences whether the view of the virtual dataset includes
or excludes missing mapped elements
- `:first_missing`: includes all data before the first missing mapped data
- `:last_available`: includes all available mapped data
A function argument passed via `do` will be given an initialized property list
that will be closed.
See [Dataset Access Properties](https://portal.hdfgroup.org/display/HDF5/Dataset+Access+Properties)
"""
@propertyclass DatasetAccessProperties API.H5P_DATASET_ACCESS
superclass(::Type{DatasetAccessProperties}) = LinkAccessProperties
class_propertynames(::Type{DatasetAccessProperties}) = (
:chunk_cache,
:efile_prefix,
:virtual_prefix,
:virtual_printf_gap,
:virtual_view
)
@enum_property(virtual_view,
:first_missing => API.H5D_VDS_FIRST_MISSING,
:last_available => API.H5D_VDS_LAST_AVAILABLE
)
function class_getproperty(::Type{DatasetAccessProperties}, p::Properties, name::Symbol)
name === :chunk_cache ? API.h5p_get_chunk_cache(p) :
name === :efile_prefix ? API.h5p_get_efile_prefix(p) :
name === :virtual_prefix ? API.h5p_get_virtual_prefix(p) :
name === :virtual_printf_gap ? API.h5p_get_virtual_printf_gap(p) :
name === :virtual_view ? get_virtual_view(p) :
class_getproperty(superclass(DatasetAccessProperties), p, name)
end
function class_setproperty!(::Type{DatasetAccessProperties}, p::Properties, name::Symbol, val)
name === :chunk_cache ? API.h5p_set_chunk_cache(p, val...) :
name === :efile_prefix ? API.h5p_set_efile_prefix(p, val) :
name === :virtual_prefix ? API.h5p_set_virtual_prefix(p, val) :
name === :virtual_printf_gap ? API.h5p_set_virtual_printf_gap(p, val) :
name === :virtual_view ? set_virtual_view!(p, val) :
class_setproperty!(superclass(DatasetAccessProperties), p, name, val)
end
@propertyclass AttributeAccessProperties API.H5P_ATTRIBUTE_ACCESS
superclass(::Type{AttributeAccessProperties}) = LinkAccessProperties
"""
DatasetTransferProperties(;kws...)
DatasetTransferProperties(f::Function; kws...)
Properties used when transferring data to/from datasets
- `dxpl_mpio`: MPI transfer mode when using [`Drivers.MPIO`](@ref) file driver:
- `:independent`: use independent I/O access (default),
- `:collective`: use collective I/O access.
A function argument passed via `do` will be given an initialized property list
that will be closed.
"""
@propertyclass DatasetTransferProperties API.H5P_DATASET_XFER
@enum_property(dxpl_mpio,
:independent => API.H5FD_MPIO_INDEPENDENT,
:collective => API.H5FD_MPIO_COLLECTIVE)
class_propertynames(::Type{DatasetTransferProperties}) = (
:dxpl_mpio,
)
function class_getproperty(::Type{DatasetTransferProperties}, p::Properties, name::Symbol)
name === :dxpl_mpio ? get_dxpl_mpio(p) :
class_getproperty(superclass(DatasetTransferProperties), p, name)
end
function class_setproperty!(::Type{DatasetTransferProperties}, p::Properties, name::Symbol, val)
name === :dxpl_mpio ? set_dxpl_mpio!(p, val) :
class_setproperty!(superclass(DatasetTransferProperties), p, name, val)
end
@propertyclass FileMountProperties API.H5P_FILE_MOUNT
@propertyclass ObjectCopyProperties API.H5P_OBJECT_COPY
const DEFAULT_PROPERTIES = GenericProperties()
# These properties are initialized in __init__()
const ASCII_LINK_PROPERTIES = LinkCreateProperties()
const UTF8_LINK_PROPERTIES = LinkCreateProperties()
const ASCII_ATTRIBUTE_PROPERTIES = AttributeCreateProperties()
const UTF8_ATTRIBUTE_PROPERTIES = AttributeCreateProperties()
_link_properties(::AbstractString) = copy(UTF8_LINK_PROPERTIES)
_attr_properties(::AbstractString) = copy(UTF8_ATTRIBUTE_PROPERTIES)
#! format: on
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 11397 | ### Read and write operations common to multiple types
# Convenience macros
macro read(fid, sym)
!isa(sym, Symbol) && error("Second input to @read must be a symbol (i.e., a variable)")
esc(:($sym = read($fid, $(string(sym)))))
end
macro write(fid, sym)
!isa(sym, Symbol) && error("Second input to @write must be a symbol (i.e., a variable)")
esc(:(write($fid, $(string(sym)), $sym)))
end
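# Illustrative sketch: the macros read/write a variable under its own name
# (file name hypothetical):
#
#     x = collect(1:5)
#     h5open("vars.h5", "w") do fid
#         @write fid x              # same as write(fid, "x", x)
#     end
#     h5open("vars.h5", "r") do fid
#         @read fid x               # same as x = read(fid, "x")
#     end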
# Generic read functions
"""
read(parent::Union{HDF5.File, HDF5.Group}, name::AbstractString; pv...)
read(parent::Union{HDF5.File, HDF5.Group}, name::AbstractString => dt::HDF5.Datatype; pv...)
Read a dataset or attribute from a HDF5 file of group identified by `name`.
Optionally, specify the [`HDF5.Datatype`](@ref) to be read.
"""
function Base.read(parent::Union{File,Group}, name::AbstractString; pv...)
obj = getindex(parent, name; pv...)
val = read(obj)
close(obj)
val
end
function Base.read(
parent::Union{File,Group}, name_type_pair::Pair{<:AbstractString,DataType}; pv...
)
obj = getindex(parent, name_type_pair[1]; pv...)
val = read(obj, name_type_pair[2])
close(obj)
val
end
# "Plain" (unformatted) reads. These work only for simple types: scalars, arrays, and strings
# See also "Reading arrays using getindex" below
# This infers the Julia type from the HDF5.Datatype. Specific file formats should provide their own read(dset).
const DatasetOrAttribute = Union{Dataset,Attribute}
"""
    read(obj::HDF5.DatasetOrAttribute)
Read the data within a [`HDF5.Dataset`](@ref) or [`HDF5.Attribute`](@ref).
"""
function Base.read(obj::DatasetOrAttribute)
dtype = datatype(obj)
T = get_jl_type(dtype)
val = generic_read(obj, dtype, T)
close(dtype)
return val
end
function Base.getindex(obj::DatasetOrAttribute, I...)
dtype = datatype(obj)
T = get_jl_type(dtype)
val = generic_read(obj, dtype, T, I...)
close(dtype)
return val
end
function Base.read(obj::DatasetOrAttribute, ::Type{T}, I...) where {T}
dtype = datatype(obj)
val = generic_read(obj, dtype, T, I...)
close(dtype)
return val
end
# `Type{String}` does not have a definite size, so the generic_read does not accept
# it even though it will return a `String`. This explicit overload allows that usage.
function Base.read(obj::DatasetOrAttribute, ::Type{String}, I...)
dtype = datatype(obj)
T = get_jl_type(dtype)
T <: Union{Cstring,FixedString} || error(name(obj), " cannot be read as type `String`")
val = generic_read(obj, dtype, T, I...)
close(dtype)
return val
end
"""
copyto!(output_buffer::AbstractArray{T}, obj::Union{DatasetOrAttribute}) where T
Copy [part of] a HDF5 dataset or attribute to a preallocated output buffer.
The output buffer must be convertible to a pointer and have a contiguous layout.
"""
function Base.copyto!(
output_buffer::AbstractArray{T}, obj::DatasetOrAttribute, I...
) where {T}
dtype = datatype(obj)
val = nothing
try
val = generic_read!(output_buffer, obj, dtype, T, I...)
finally
close(dtype)
end
return val
end
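# Illustrative sketch of whole, partial, and in-place reads (assumes "reads.h5"
# contains a Float64 matrix dataset "A"):
#
#     h5open("reads.h5", "r") do f
#         dset = f["A"]
#         A    = read(dset)                       # whole dataset
#         row  = dset[1, :]                       # hyperslab via getindex
#         buf  = Matrix{Float64}(undef, size(dset)...)
#         copyto!(buf, dset)                      # read into a preallocated buffer
#         close(dset)
#     end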
# Special handling for reading OPAQUE datasets and attributes
function generic_read!(
buf::Matrix{UInt8}, obj::DatasetOrAttribute, filetype::Datatype, ::Type{Opaque}
)
generic_read(obj, filetype, Opaque, buf)
end
function generic_read(
obj::DatasetOrAttribute,
filetype::Datatype,
::Type{Opaque},
buf::Union{Matrix{UInt8},Nothing}=nothing
)
sz = size(obj)
if isnothing(buf)
buf = Matrix{UInt8}(undef, sizeof(filetype), prod(sz))
end
if obj isa Dataset
read_dataset(obj, filetype, buf, obj.xfer)
else
read_attribute(obj, filetype, buf)
end
tag = API.h5t_get_tag(filetype)
if isempty(sz)
# scalar (only) result
data = vec(buf)
else
# array of opaque objects
data = reshape([buf[:, i] for i in 1:prod(sz)], sz...)
end
return Opaque(data, tag)
end
# generic read function
function generic_read!(
buf::Union{AbstractMatrix{UInt8},AbstractArray{T}},
obj::DatasetOrAttribute,
filetype::Datatype,
::Type{T},
I...
) where {T}
return _generic_read(obj, filetype, T, buf, I...)
end
function generic_read(
obj::DatasetOrAttribute, filetype::Datatype, ::Type{T}, I...
) where {T}
return _generic_read(obj, filetype, T, nothing, I...)
end
function _generic_read(
obj::DatasetOrAttribute,
filetype::Datatype,
::Type{T},
buf::Union{AbstractMatrix{UInt8},AbstractArray{T},Nothing},
I...
) where {T}
sz, scalar, dspace = _size_of_buffer(obj, I)
if isempty(sz)
close(dspace)
return EmptyArray{T}()
end
try
if isnothing(buf)
buf = _normalized_buffer(T, sz)
else
sizeof(buf) != prod(sz) * sizeof(T) && error(
"Provided array buffer of size, $(size(buf)), and element type, $(eltype(buf)), does not match the dataset of size, $sz, and type, $T"
)
end
catch err
close(dspace)
rethrow(err)
end
memtype = _memtype(filetype, T)
memspace = isempty(I) ? dspace : dataspace(sz)
try
if obj isa Dataset
API.h5d_read(obj, memtype, memspace, dspace, obj.xfer, buf)
else
API.h5a_read(obj, memtype, buf)
end
if do_normalize(T)
out = reshape(normalize_types(T, buf), sz...)
else
out = buf
end
xfer_id = obj isa Dataset ? obj.xfer.id : API.H5P_DEFAULT
do_reclaim(T) && API.h5d_vlen_reclaim(memtype, memspace, xfer_id, buf)
if scalar
return out[1]
else
return out
end
catch e
# Add nicer errors if reading fails.
if obj isa Dataset
prop = get_create_properties(obj)
try
Filters.ensure_filters_available(Filters.FilterPipeline(prop))
finally
close(prop)
end
end
throw(e)
finally
close(memtype)
close(memspace)
close(dspace)
end
end
"""
similar(obj::DatasetOrAttribute, [::Type{T}], [dims::Integer...]; normalize = true)
Return an `Array{T}` or `Matrix{UInt8}` that can contain [part of] the dataset.
The `normalize` keyword will normalize the buffer for string and array datatypes.
"""
function Base.similar(
obj::DatasetOrAttribute, ::Type{T}, dims::Dims; normalize::Bool=true
) where {T}
filetype = datatype(obj)
try
return similar(obj, filetype, T, dims; normalize=normalize)
finally
close(filetype)
end
end
Base.similar(
obj::DatasetOrAttribute, ::Type{T}, dims::Integer...; normalize::Bool=true
) where {T} = similar(obj, T, Int.(dims); normalize=normalize)
# Base.similar without specifying the Julia type
function Base.similar(obj::DatasetOrAttribute, dims::Dims; normalize::Bool=true)
filetype = datatype(obj)
try
T = get_jl_type(filetype)
return similar(obj, filetype, T, dims; normalize=normalize)
finally
close(filetype)
end
end
Base.similar(obj::DatasetOrAttribute, dims::Integer...; normalize::Bool=true) =
similar(obj, Int.(dims); normalize=normalize)
# Opaque types
function Base.similar(
obj::DatasetOrAttribute, filetype::Datatype, ::Type{Opaque}; normalize::Bool=true
)
# normalize keyword for consistency, but it is ignored for Opaque
sz = size(obj)
return Matrix{UInt8}(undef, sizeof(filetype), prod(sz))
end
# Undocumented Base.similar signature allowing filetype to be specified
function Base.similar(
obj::DatasetOrAttribute, filetype::Datatype, ::Type{T}, dims::Dims; normalize::Bool=true
) where {T}
# We are reusing code that expect indices
I = Base.OneTo.(dims)
sz, scalar, dspace = _size_of_buffer(obj, I)
memtype = _memtype(filetype, T)
try
buf = _normalized_buffer(T, sz)
if normalize && do_normalize(T)
buf = reshape(normalize_types(T, buf), sz)
end
return buf
finally
close(dspace)
close(memtype)
end
end
Base.similar(
obj::DatasetOrAttribute,
filetype::Datatype,
::Type{T},
dims::Integer...;
normalize::Bool=true
) where {T} = similar(obj, filetype, T, Int.(dims); normalize=normalize)
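# Illustrative sketch: allocate a matching buffer with `similar`, then fill it in
# place (assumes an open dataset handle `dset`):
#
#     buf = similar(dset, size(dset)...)
#     copyto!(buf, dset)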
# Utilities used in Base.similar implementation
#=
_memtype(filetype::Datatype, T)
This is a utility function originally from generic_read.
It gets the native memory type for the system based on filetype, and checks
if the size matches.
=#
@inline function _memtype(filetype::Datatype, ::Type{T}) where {T}
!isconcretetype(T) && error("type $T is not concrete")
# padded layout in memory
memtype = Datatype(API.h5t_get_native_type(filetype))
if sizeof(T) != sizeof(memtype)
error("""
Type size mismatch
sizeof($T) = $(sizeof(T))
sizeof($memtype) = $(sizeof(memtype))
""")
end
return memtype
end
@inline function _memtype(filetype::Datatype, ::Type{S}) where {S<:AbstractString}
return datatype(S)
end
#=
_size_of_buffer(obj::DatasetOrAttribute, [I::Tuple, dspace::Dataspace])
This is a utility function originally from generic_read, but factored out.
The primary purpose is to determine the size and shape of the buffer to
create in order to hold the contents of a Dataset or Attribute.
# Arguments
* obj - A Dataset or Attribute
* I - (optional) indices, defaults to ()
* dspace - (optional) dataspace, defaults to dataspace(obj).
This argument will be consumed by hyperslab and returned.
# Returns
* `sz` the size of the selection
* `scalar`, which is true if the value should be read as a scalar.
* `dspace`, the dataspace, with a hyperslab selection applied if indices were provided
=#
@inline function _size_of_buffer(
obj::DatasetOrAttribute, I::Tuple=(), dspace::Dataspace=dataspace(obj)
)
!isempty(I) &&
obj isa Attribute &&
error("HDF5 attributes do not support hyperslab selections")
stype = API.h5s_get_simple_extent_type(dspace)
if !isempty(I) && stype != API.H5S_NULL
indices = Base.to_indices(obj, I)
dspace = hyperslab(dspace, indices...)
end
scalar = false
if stype == API.H5S_SCALAR
sz = (1,)
scalar = true
elseif stype == API.H5S_NULL
sz = ()
# scalar = false
elseif isempty(I)
sz = size(dspace)
# scalar = false
else
# Determine the size by the length of non-Int indices
sz = map(length, filter(i -> !isa(i, Int), indices))
if isempty(sz)
# All indices are Int, so this is scalar
sz = (1,)
scalar = true
end
end
return sz, scalar, dspace
end
#=
_normalized_buffer(T, sz)
Return a Matrix{UInt8} for a normalized type or `Array{T}` for a regular type.
See `do_normalize` in typeconversions.jl.
=#
@inline function _normalized_buffer(::Type{T}, sz::NTuple{N,Int}) where {T,N}
if do_normalize(T)
        # The entire dataset is read into a buffer matrix where the first dimension at
# any stage of normalization is the bytes for a single element of type `T`, and
# the second dimension of the matrix runs through all elements.
buf = Matrix{UInt8}(undef, sizeof(T), prod(sz))
else
buf = Array{T}(undef, sz...)
end
return buf
end
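#= Sketch of how the helpers above fit together when building a read buffer for
   a dataset `dset` (illustrative only; mirrors the `Base.similar` methods above):
       filetype = datatype(dset)
       T = get_jl_type(filetype)
       sz, scalar, dspace = _size_of_buffer(dset)
       memtype = _memtype(filetype, T)
       buf = _normalized_buffer(T, sz)
       # ... read into `buf`, then close memtype, dspace and filetype ...
=#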
Reference() = Reference(API.HOBJ_REF_T_NULL) # NULL reference to compare to
Base.:(==)(a::Reference, b::Reference) = a.r == b.r
Base.hash(x::Reference, h::UInt) = hash(x.r, h)
function Reference(parent::Union{File,Group,Dataset}, name::AbstractString)
ref = Ref{API.hobj_ref_t}()
API.h5r_create(ref, checkvalid(parent), name, API.H5R_OBJECT, -1)
return Reference(ref[])
end
# Dereference
function _deref(parent, r::Reference)
r == Reference() && error("Reference is null")
obj_id = API.h5r_dereference(checkvalid(parent), API.H5P_DEFAULT, API.H5R_OBJECT, r)
h5object(obj_id, parent)
end
Base.getindex(parent::Union{File,Group}, r::Reference) = _deref(parent, r)
Base.getindex(parent::Dataset, r::Reference) = _deref(parent, r) # defined separately to resolve ambiguity
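#= Illustrative usage sketch (assumes an open file handle `f::HDF5.File` that
   contains a dataset named "A"):
       r = HDF5.Reference(f, "A")   # create an object reference to f["A"]
       obj = f[r]                   # dereference it back into an HDF5 object
=#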
function Base.show(io::IO, fid::File)
if isvalid(fid)
intent = API.h5f_get_intent(fid)
RW_MASK = API.H5F_ACC_RDONLY | API.H5F_ACC_RDWR
SWMR_MASK = API.H5F_ACC_SWMR_READ | API.H5F_ACC_SWMR_WRITE
rw = (intent & RW_MASK) == API.H5F_ACC_RDONLY ? "(read-only" : "(read-write"
swmr = (intent & SWMR_MASK) != 0 ? ", swmr) " : ") "
print(io, "HDF5.File: ", rw, swmr, fid.filename)
else
print(io, "HDF5.File: (closed) ", fid.filename)
end
end
function Base.show(io::IO, g::Group)
if isvalid(g)
print(io, "HDF5.Group: ", name(g), " (file: ", g.file.filename, ")")
else
print(io, "HDF5.Group: (invalid)")
end
end
function Base.show(io::IO, prop::Properties)
print(io, typeof(prop))
if prop.id == API.H5P_DEFAULT
print(io, "()")
elseif !isvalid(prop)
print(io, ": (invalid)")
else
print(io, "(")
for name in all_propertynames(typeof(prop))
# skip deprecated names
if name in (:compress, :track_times, :fapl_mpio)
continue
end
# not all properties have getters (e.g. shuffle, deflate, etc.),
        # or are not always well-defined (e.g. chunk if layout != :chunked, dxpl_mpio if no MPI)
try
val = getproperty(prop, name)
if name == :file_image
print(
io,
"\n ",
rpad(name, 15),
" = ",
repr(typeof(val)),
", length = ",
length(val),
","
)
else
print(io, "\n ", rpad(name, 15), " = ", repr(val), ",")
end
catch e
end
end
print(io, "\n)")
end
end
function Base.show(io::IO, dset::Dataset)
if isvalid(dset)
print(
io,
"HDF5.Dataset: ",
name(dset),
" (file: ",
dset.file.filename,
" xfer_mode: ",
dset.xfer.id,
")"
)
else
print(io, "HDF5.Dataset: (invalid)")
end
end
function Base.show(io::IO, attr::Attribute)
if isvalid(attr)
print(io, "HDF5.Attribute: ", name(attr))
else
print(io, "HDF5.Attribute: (invalid)")
end
end
function Base.show(io::IO, attr::Attributes)
print(io, "Attributes of ", attr.parent)
end
Base.show(io::IO, attrdict::AttributeDict) = summary(io, attrdict)
function Base.summary(io::IO, attrdict::AttributeDict)
print(io, "AttributeDict of ", attrdict.parent)
if isvalid(attrdict.parent)
n = length(attrdict)
print(io, " with ", n, n == 1 ? " attribute" : " attributes")
end
end
const ENDIAN_DICT = Dict(
API.H5T_ORDER_LE => "little endian byte order",
API.H5T_ORDER_BE => "big endian byte order",
API.H5T_ORDER_VAX => "vax mixed endian byte order",
API.H5T_ORDER_MIXED => "mixed endian byte order",
API.H5T_ORDER_NONE => "no particular byte order",
)
function Base.show(io::IO, dtype::Datatype)
print(io, "HDF5.Datatype: ")
if isvalid(dtype)
API.h5t_committed(dtype) && print(io, name(dtype), " ")
buffer = IOBuffer(; sizehint=18)
print(buffer, API.h5lt_dtype_to_text(dtype))
str = String(take!(buffer))
if str == "undefined integer"
println(io, str)
println(io, " size: ", API.h5t_get_size(dtype), " bytes")
println(io, " precision: ", API.h5t_get_precision(dtype), " bits")
println(io, " offset: ", API.h5t_get_offset(dtype), " bits")
print(io, " order: ", ENDIAN_DICT[API.h5t_get_order(dtype)])
else
print(io, str)
end
else
# Note that API.h5i_is_valid returns `false` on the built-in datatypes (e.g. API.H5T_NATIVE_INT),
# apparently because they have refcounts of 0 yet are always valid. Just temporarily turn
# off error printing and try the call to probe if dtype is valid since API.H5LTdtype_to_text
# special-cases all of the built-in types internally.
local text
try
text = API.h5lt_dtype_to_text(dtype)
catch
text = "(invalid)"
end
print(io, text)
end
end
function Base.show(io::IO, br::BlockRange)
start = Int(br.start0 + 1)
# choose the simplest possible representation
if br.count == 1
if br.block == 1
# integer
r = start
else
# UnitRange
r = range(start; length=Int(br.block))
end
elseif br.block == 1 && br.count != API.H5S_UNLIMITED
# StepRange
r = range(start; step=Int(br.stride), length=Int(br.count))
else
# BlockRange(int; ...)
print(io, BlockRange, "(start=", start)
if br.stride != 1
print(io, ", stride=", Int(br.stride))
end
if br.count != 1
print(io, ", count=", br.count == API.API.H5S_UNLIMITED ? -1 : Int(br.count))
end
if br.block != 1
print(io, ", block=", Int(br.block))
end
print(io, ")")
return nothing
end
compact = get(io, :compact, false)
compact || print(io, BlockRange, "(")
print(io, r)
compact || print(io, ")")
return nothing
end
function Base.show(io::IO, dspace::Dataspace)
if !isvalid(dspace)
print(io, "HDF5.Dataspace: (invalid)")
return nothing
end
print(io, "HDF5.Dataspace: ")
type = API.h5s_get_simple_extent_type(dspace)
if type == API.H5S_NULL
print(io, "H5S_NULL")
return nothing
elseif type == API.H5S_SCALAR
print(io, "H5S_SCALAR")
return nothing
end
# otherwise type == API.H5S_SIMPLE
sz, maxsz = get_extent_dims(dspace)
sel = API.h5s_get_select_type(dspace)
if sel == API.H5S_SEL_HYPERSLABS && API.h5s_is_regular_hyperslab(dspace)
io_compact = IOContext(io, :compact => true)
blockranges = get_regular_hyperslab(dspace)
ndims = length(blockranges)
print(io_compact, "(")
for ii in 1:ndims
print(io_compact, blockranges[ii])
ii != ndims && print(io_compact, ", ")
end
print(io_compact, ") / (")
for ii in 1:ndims
print(io_compact, 1:maxsz[ii])
ii != ndims && print(io_compact, ", ")
end
print(io_compact, ")")
else
print(io, sz)
if maxsz != sz
print(io, " / ", maxsz)
end
if sel != API.H5S_SEL_ALL
print(io, " [irregular selection]")
end
end
end
"""
SHOW_TREE_ICONS = Ref{Bool}(true)
Configurable option to control whether emoji icons (`true`) or a plain-text annotation
(`false`) is used to indicate the object type by `show_tree`.
"""
const SHOW_TREE_ICONS = Ref{Bool}(true)
"""
SHOW_TREE_MAX_DEPTH = Ref{Int}(5)
Maximum recursive depth to descend during printing.
"""
const SHOW_TREE_MAX_DEPTH = Ref{Int}(5)
"""
SHOW_TREE_MAX_CHILDREN = Ref{Int}(50)
Maximum number of children to show at each node.
"""
const SHOW_TREE_MAX_CHILDREN = Ref{Int}(50)
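#= These Refs may be adjusted at runtime to tune tree printing, e.g.
   (illustrative):
       HDF5.SHOW_TREE_ICONS[] = false    # plain-text markers such as "[G]"/"[D]"
       HDF5.SHOW_TREE_MAX_DEPTH[] = 2    # stop descending after two levels
=#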
function Base.show(
io::IO, ::MIME"text/plain", obj::Union{File,Group,Dataset,Attributes,Attribute}
)
if get(io, :compact, false)::Bool
show(io, obj)
else
show_tree(io, obj)
end
end
_tree_icon(::Type{Attribute}) = SHOW_TREE_ICONS[] ? "🏷️" : "[A]"
_tree_icon(::Type{Group}) = SHOW_TREE_ICONS[] ? "📂" : "[G]"
_tree_icon(::Type{Dataset}) = SHOW_TREE_ICONS[] ? "🔢" : "[D]"
_tree_icon(::Type{Datatype}) = SHOW_TREE_ICONS[] ? "📄" : "[T]"
_tree_icon(::Type{File}) = SHOW_TREE_ICONS[] ? "🗂️" : "[F]"
_tree_icon(::Type) = SHOW_TREE_ICONS[] ? "❓" : "[?]"
_tree_icon(obj) = _tree_icon(typeof(obj))
_tree_icon(obj::Attributes) = _tree_icon(obj.parent)
_tree_head(io::IO, obj) = print(io, _tree_icon(obj), " ", obj)
_tree_head(io::IO, obj::Datatype) =
print(io, _tree_icon(obj), " HDF5.Datatype: ", name(obj))
_tree_count(parent::Union{File,Group}, attributes::Bool) =
length(parent) + (attributes ? length(HDF5.attrs(parent)) : 0)
_tree_count(parent::Dataset, attributes::Bool) = attributes ? length(HDF5.attrs(parent)) : 0
_tree_count(parent::Attributes, _::Bool) = length(parent)
_tree_count(parent::Union{Attribute,Datatype}, _::Bool) = 0
function _show_tree(
io::IO,
obj::Union{File,Group,Dataset,Datatype,Attributes,Attribute},
indent::String="";
attributes::Bool=true,
depth::Int=1
)
isempty(indent) && _tree_head(io, obj)
isvalid(obj) || return nothing
INDENT = " "
PIPE = "│ "
TEE = "├─ "
ELBOW = "└─ "
limit = get(io, :limit, false)::Bool
counter = 0
nchildren = _tree_count(obj, attributes)
@inline function childstr(io, n, more=" ")
print(io, "\n", indent, ELBOW * "(", n, more, n == 1 ? "child" : "children", ")")
end
@inline function depth_check()
counter += 1
if limit && counter > max(2, SHOW_TREE_MAX_CHILDREN[] ÷ depth)
childstr(io, nchildren - counter + 1, " more ")
return true
end
return false
end
if limit && nchildren > 0 && depth > SHOW_TREE_MAX_DEPTH[]
childstr(io, nchildren)
return nothing
end
if attributes && !isa(obj, Attribute)
obj′ = obj isa Attributes ? obj.parent : obj
API.h5a_iterate(obj′, idx_type(obj′), order(obj′)) do _, cname, _
depth_check() && return API.herr_t(1)
name = unsafe_string(cname)
icon = _tree_icon(Attribute)
islast = counter == nchildren
print(io, "\n", indent, islast ? ELBOW : TEE, icon, " ", name)
return API.herr_t(0)
end
end
typeof(obj) <: Union{File,Group} || return nothing
API.h5l_iterate(obj, idx_type(obj), order(obj)) do loc_id, cname, _
depth_check() && return API.herr_t(1)
name = unsafe_string(cname)
child = obj[name]
icon = _tree_icon(child)
islast = counter == nchildren
print(io, "\n", indent, islast ? ELBOW : TEE, icon, " ", name)
nextindent = indent * (islast ? INDENT : PIPE)
_show_tree(io, child, nextindent; attributes=attributes, depth=depth + 1)
close(child)
return API.herr_t(0)
end
return nothing
end
show_tree(obj; kws...) = show_tree(stdout, obj; kws...)
function show_tree(io::IO, obj; kws...)
buf = IOBuffer()
_show_tree(IOContext(buf, io), obj; kws...)
print(io, String(take!(buf)))
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 16863 | # Single character types
# These are needed to safely handle VLEN objects
abstract type CharType <: AbstractString end
struct ASCIIChar <: CharType
c::UInt8
end
Base.length(c::ASCIIChar) = 1
struct UTF8Char <: CharType
c::UInt8
end
Base.length(c::UTF8Char) = 1
chartype(::Type{String}) = ASCIIChar
stringtype(::Type{ASCIIChar}) = String
stringtype(::Type{UTF8Char}) = String
cset(::Type{<:AbstractString}) = API.H5T_CSET_UTF8
cset(::Type{UTF8Char}) = API.H5T_CSET_UTF8
cset(::Type{ASCIIChar}) = API.H5T_CSET_ASCII
function unpad(s::String, pad::Integer)::String
if pad == API.H5T_STR_NULLTERM # null-terminated
ind = findfirst(isequal('\0'), s)
isnothing(ind) ? s : s[1:prevind(s, ind)]
elseif pad == API.H5T_STR_NULLPAD # padded with nulls
rstrip(s, '\0')
elseif pad == API.H5T_STR_SPACEPAD # padded with spaces
rstrip(s, ' ')
else
error("Unrecognized string padding mode $pad")
end
end
unpad(s, pad::Integer) = unpad(String(s), pad)
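#= For example (illustrative), fixed-size strings read from a file keep their
   padding bytes until stripped here:
       unpad("abc\0\0\0", API.H5T_STR_NULLTERM)   # -> "abc"
       unpad("abc   ", API.H5T_STR_SPACEPAD)      # -> "abc"
=#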
# VLEN objects
struct VLen{T}
data::Array
end
VLen(strs::Array{S}) where {S<:String} = VLen{chartype(S)}(strs)
VLen(A::Array{Array{T}}) where {T<:ScalarType} = VLen{T}(A)
VLen(A::Array{Array{T,N}}) where {T<:ScalarType,N} = VLen{T}(A)
function Base.cconvert(::Type{Ptr{Cvoid}}, v::VLen)
len = length(v.data)
h = Vector{API.hvl_t}(undef, len)
for ii in 1:len
d = v.data[ii]
p = unsafe_convert(Ptr{UInt8}, d)
h[ii] = API.hvl_t(length(d), p)
end
return h
end
datatype(A::VLen{T}) where {T<:ScalarType} = Datatype(API.h5t_vlen_create(hdf5_type_id(T)))
function datatype(str::VLen{C}) where {C<:CharType}
type_id = API.h5t_copy(hdf5_type_id(C))
API.h5t_set_size(type_id, 1)
API.h5t_set_cset(type_id, cset(C))
Datatype(API.h5t_vlen_create(type_id))
end
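#= Illustrative sketch of wrapping ragged data for a variable-length dataset;
   the file handle `f` and dataset name "ragged" are assumptions:
       data = VLen([[1, 2], [3, 4, 5]])   # Vector{Vector{Int}} -> VLen{Int64}
       write(f, "ragged", data)
=#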
# Compound types
# These will use finalizers. Close them eagerly to avoid issues.
datatype(::T) where {T} = Datatype(hdf5_type_id(T), true)
datatype(::Type{T}) where {T} = Datatype(hdf5_type_id(T), true)
datatype(x::AbstractArray{T}) where {T} = Datatype(hdf5_type_id(T), true)
hdf5_type_id(::Type{T}) where {T} = hdf5_type_id(T, Val(isstructtype(T)))
function hdf5_type_id(::Type{T}, isstruct::Val{true}) where {T}
dtype = API.h5t_create(API.H5T_COMPOUND, sizeof(T))
for (idx, fn) in enumerate(fieldnames(T))
ftype = fieldtype(T, idx)
API.h5t_insert(dtype, Symbol(fn), fieldoffset(T, idx), hdf5_type_id(ftype))
end
return dtype
end
# Perhaps we need a custom error type here
hdf5_type_id(::Type{T}, isstruct::Val{false}) where {T} =
throw(MethodError(hdf5_type_id, (T, isstruct)))
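#= Illustrative sketch: a plain Julia struct maps to an H5T_COMPOUND datatype,
   with each field inserted at its `fieldoffset` (the struct is an example, not
   part of the library):
       struct Point2D
           x::Float64
           y::Float64
       end
       dt = datatype(Point2D)   # compound Datatype with members :x and :y
=#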
# Opaque types
struct Opaque
data
tag::String
end
# An empty array type
struct EmptyArray{T} <: AbstractArray{T,0} end
# Required AbstractArray interface
Base.size(::EmptyArray) = ()
Base.IndexStyle(::Type{<:EmptyArray}) = IndexLinear()
Base.getindex(::EmptyArray, ::Int) = error("cannot index an `EmptyArray`")
Base.setindex!(::EmptyArray, v, ::Int) = error("cannot assign to an `EmptyArray`")
# Optional interface
Base.similar(::EmptyArray{T}) where {T} = EmptyArray{T}()
Base.similar(::EmptyArray, ::Type{S}) where {S} = EmptyArray{S}()
Base.similar(::EmptyArray, ::Type{S}, dims::Dims) where {S} = Array{S}(undef, dims)
# Override behavior for 0-dimensional Array
Base.length(::EmptyArray) = 0
# Required to avoid indexing during printing
Base.show(io::IO, E::EmptyArray) = print(io, typeof(E), "()")
Base.show(io::IO, ::MIME"text/plain", E::EmptyArray) = show(io, E)
# FIXME: Concatenation doesn't work for this type (it's treated as a length-1 array like
# Base's 0-dimensional arrays), so just forceably abort.
Base.cat_size(::EmptyArray) = error("concatenation of HDF5.EmptyArray is unsupported")
Base.cat_size(::EmptyArray, d) = error("concatenation of HDF5.EmptyArray is unsupported")
# Stub types to encode fixed-size arrays for API.H5T_ARRAY
struct FixedArray{T,D,L}
data::NTuple{L,T}
end
Base.size(::Type{FixedArray{T,D,L}}) where {T,D,L} = D
Base.size(x::FixedArray) = size(typeof(x))
Base.eltype(::Type{FixedArray{T,D,L}}) where {T,D,L} = T
Base.eltype(x::FixedArray) = eltype(typeof(x))
struct FixedString{N,PAD}
data::NTuple{N,UInt8}
end
Base.length(::Type{FixedString{N,PAD}}) where {N,PAD} = N
Base.length(str::FixedString) = length(typeof(str))
pad(::Type{FixedString{N,PAD}}) where {N,PAD} = PAD
pad(x::T) where {T<:FixedString} = pad(T)
struct VariableArray{T}
len::Csize_t
p::Ptr{Cvoid}
end
Base.eltype(::Type{VariableArray{T}}) where {T} = T
## Conversion between Julia types and HDF5 atomic types
#! format: off
hdf5_type_id(::Type{Bool}) = API.H5T_NATIVE_B8
hdf5_type_id(::Type{Int8}) = API.H5T_NATIVE_INT8
hdf5_type_id(::Type{UInt8}) = API.H5T_NATIVE_UINT8
hdf5_type_id(::Type{Int16}) = API.H5T_NATIVE_INT16
hdf5_type_id(::Type{UInt16}) = API.H5T_NATIVE_UINT16
hdf5_type_id(::Type{Int32}) = API.H5T_NATIVE_INT32
hdf5_type_id(::Type{UInt32}) = API.H5T_NATIVE_UINT32
hdf5_type_id(::Type{Int64}) = API.H5T_NATIVE_INT64
hdf5_type_id(::Type{UInt64}) = API.H5T_NATIVE_UINT64
hdf5_type_id(::Type{Float32}) = API.H5T_NATIVE_FLOAT
hdf5_type_id(::Type{Float64}) = API.H5T_NATIVE_DOUBLE
hdf5_type_id(::Type{Reference}) = API.H5T_STD_REF_OBJ
hdf5_type_id(::Type{<:AbstractString}) = API.H5T_C_S1
#! format: on
# It's not safe to use particular id codes because these can change, so we use characteristics of the type.
function _hdf5_type_map(class_id, is_signed, native_size)
if class_id == API.H5T_INTEGER
if is_signed == API.H5T_SGN_2
return if native_size == 1
Int8
elseif native_size == 2
Int16
elseif native_size == 4
Int32
elseif native_size == 8
Int64
else
throw(KeyError((class_id, is_signed, native_size)))
end
else
return if native_size == 1
UInt8
elseif native_size == 2
UInt16
elseif native_size == 4
UInt32
elseif native_size == 8
UInt64
else
throw(KeyError((class_id, is_signed, native_size)))
end
end
else
return if native_size == 4
Float32
elseif native_size == 8
Float64
else
throw(KeyError((class_id, is_signed, native_size)))
end
end
end
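#= For example (illustrative):
       _hdf5_type_map(API.H5T_INTEGER, API.H5T_SGN_2, 8)   # -> Int64
       _hdf5_type_map(API.H5T_FLOAT, nothing, 4)           # -> Float32
=#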
# global configuration for complex support
const COMPLEX_SUPPORT = Ref(true)
const COMPLEX_FIELD_NAMES = Ref(("r", "i"))
enable_complex_support() = COMPLEX_SUPPORT[] = true
disable_complex_support() = COMPLEX_SUPPORT[] = false
set_complex_field_names(real::AbstractString, imag::AbstractString) =
COMPLEX_FIELD_NAMES[] = ((real, imag))
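#= Complex numbers are stored as a two-member compound type; the member names
   can be adjusted to match files produced by other tools, e.g. (illustrative):
       HDF5.set_complex_field_names("real", "imag")
       HDF5.disable_complex_support()   # read such compounds as NamedTuples instead
=#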
# Create a datatype from in-memory types
datatype(x::ScalarType) = Datatype(hdf5_type_id(typeof(x)), false)
datatype(::Type{T}) where {T<:ScalarType} = Datatype(hdf5_type_id(T), false)
datatype(A::AbstractArray{T}) where {T<:ScalarType} = Datatype(hdf5_type_id(T), false)
function datatype(::Type{Complex{T}}) where {T<:ScalarType}
COMPLEX_SUPPORT[] ||
error("complex support disabled. call HDF5.enable_complex_support() to enable")
dtype = API.h5t_create(API.H5T_COMPOUND, 2 * sizeof(T))
API.h5t_insert(dtype, COMPLEX_FIELD_NAMES[][1], 0, hdf5_type_id(T))
API.h5t_insert(dtype, COMPLEX_FIELD_NAMES[][2], sizeof(T), hdf5_type_id(T))
return Datatype(dtype)
end
datatype(x::Complex{<:ScalarType}) = datatype(typeof(x))
datatype(A::AbstractArray{Complex{T}}) where {T<:ScalarType} = datatype(eltype(A))
function datatype(str::AbstractString)
type_id = API.h5t_copy(hdf5_type_id(typeof(str)))
API.h5t_set_size(type_id, max(sizeof(str), 1))
API.h5t_set_cset(type_id, cset(typeof(str)))
Datatype(type_id)
end
function datatype(::Type{S}) where {S<:AbstractString}
type_id = API.h5t_copy(hdf5_type_id(S))
API.h5t_set_size(type_id, API.H5T_VARIABLE)
API.h5t_set_cset(type_id, cset(S))
Datatype(type_id)
end
function datatype(::Array{S}) where {S<:AbstractString}
type_id = API.h5t_copy(hdf5_type_id(S))
API.h5t_set_size(type_id, API.H5T_VARIABLE)
API.h5t_set_cset(type_id, cset(S))
Datatype(type_id)
end
# conversions to Julia types
function get_jl_type(obj_type::Datatype)
class_id = API.h5t_get_class(obj_type)
if class_id == API.H5T_OPAQUE
return Opaque
else
return get_mem_compatible_jl_type(obj_type)
end
end
function get_jl_type(obj)
dtype = datatype(obj)
try
return get_jl_type(dtype)
finally
close(dtype)
end
end
Base.eltype(dset::Union{Dataset,Attribute}) = get_jl_type(dset)
function get_mem_compatible_jl_type(obj_type::Datatype)
class_id = API.h5t_get_class(obj_type)
if class_id == API.H5T_STRING
if API.h5t_is_variable_str(obj_type)
return Cstring
else
N = sizeof(obj_type)
PAD = API.h5t_get_strpad(obj_type)
return FixedString{N,PAD}
end
elseif class_id == API.H5T_INTEGER || class_id == API.H5T_FLOAT
native_type = API.h5t_get_native_type(obj_type)
try
native_size = API.h5t_get_size(native_type)
if class_id == API.H5T_INTEGER
is_signed = API.h5t_get_sign(native_type)
else
is_signed = nothing
end
return _hdf5_type_map(class_id, is_signed, native_size)
finally
API.h5t_close(native_type)
end
elseif class_id == API.H5T_BITFIELD
return Bool
elseif class_id == API.H5T_ENUM
super_type = API.h5t_get_super(obj_type)
try
native_type = API.h5t_get_native_type(super_type)
try
native_size = API.h5t_get_size(native_type)
is_signed = API.h5t_get_sign(native_type)
return _hdf5_type_map(API.H5T_INTEGER, is_signed, native_size)
finally
API.h5t_close(native_type)
end
finally
API.h5t_close(super_type)
end
elseif class_id == API.H5T_REFERENCE
# TODO update to use version 1.12 reference functions/types
return Reference
elseif class_id == API.H5T_OPAQUE
# TODO: opaque objects should get their own fixed-size data type; punning like
# this permits recursively reading (i.e. compound data type containing an
# opaque field). Requires figuring out what to do about the tag...
len = Int(API.h5t_get_size(obj_type))
return FixedArray{UInt8,(len,),len}
elseif class_id == API.H5T_VLEN
superid = API.h5t_get_super(obj_type)
return VariableArray{get_mem_compatible_jl_type(Datatype(superid))}
elseif class_id == API.H5T_COMPOUND
N = API.h5t_get_nmembers(obj_type)
membernames = ntuple(N) do i
API.h5t_get_member_name(obj_type, i - 1)
end
membertypes = ntuple(N) do i
dtype = Datatype(API.h5t_get_member_type(obj_type, i - 1))
return get_mem_compatible_jl_type(dtype)
end
# check if should be interpreted as complex
iscomplex =
COMPLEX_SUPPORT[] &&
N == 2 &&
(membernames == COMPLEX_FIELD_NAMES[]) &&
(membertypes[1] == membertypes[2]) &&
(membertypes[1] <: ScalarType)
if iscomplex
return Complex{membertypes[1]}
else
return NamedTuple{Symbol.(membernames),Tuple{membertypes...}}
end
elseif class_id == API.H5T_ARRAY
dims = API.h5t_get_array_dims(obj_type)
nd = length(dims)
eltyp = Datatype(API.h5t_get_super(obj_type))
elT = get_mem_compatible_jl_type(eltyp)
dimsizes = ntuple(i -> Int(dims[nd - i + 1]), nd) # reverse order
return FixedArray{elT,dimsizes,prod(dimsizes)}
end
error("Class id ", class_id, " is not yet supported")
end
# convert special types to native julia types
function normalize_types(::Type{T}, buf::AbstractMatrix{UInt8}) where {T}
# First dimension spans bytes of a single element of type T --- (recursively) normalize
# each range of bytes to final type, returning vector of normalized data.
return [_normalize_types(T, view(buf, :, ind)) for ind in axes(buf, 2)]
end
# high-level description which should always work --- here, the buffer contains the bytes
# for exactly 1 element of an object of type T, so reinterpret the `UInt8` vector as a
# length-1 array of type `T` and extract the (only) element.
function _typed_load(::Type{T}, buf::AbstractVector{UInt8}) where {T}
return @inbounds reinterpret(T, buf)[1]
end
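#= For instance (illustrative), the eight bytes of a Float64 load back as the
   number itself:
       bytes = collect(reinterpret(UInt8, [1.5]))   # 8-element Vector{UInt8}
       _typed_load(Float64, bytes)                  # -> 1.5
=#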
# fast-path for common concrete types with simple layout (which should be nearly all cases)
@static if VERSION ≤ v"1.10.0-DEV.1390" # Maybe a few dev versions earlier
function _typed_load(
::Type{T}, buf::V
) where {T,V<:Union{Vector{UInt8},Base.FastContiguousSubArray{UInt8,1}}}
dest = Ref{T}()
GC.@preserve dest buf Base._memcpy!(
unsafe_convert(Ptr{Cvoid}, dest), pointer(buf), sizeof(T)
)
return dest[]
# TODO: The above can maybe be replaced with
        # return GC.@preserve buf unsafe_load(convert(Ptr{T}, pointer(buf)))
# dependent on data elements being properly aligned for all datatypes, on all
# platforms.
end
else
# TODO reimplement fast path _typed_load for Julia 1.10, consider refactor
function _typed_load(
::Type{T}, buf::V
) where {T,V<:Union{Vector{UInt8},Base.FastContiguousSubArray{UInt8,1}}}
dest = Ref{T}()
GC.@preserve dest buf Libc.memcpy(
unsafe_convert(Ptr{Cvoid}, dest), pointer(buf), sizeof(T)
)
return dest[]
# TODO: The above can maybe be replaced with
        # return GC.@preserve buf unsafe_load(convert(Ptr{T}, pointer(buf)))
# dependent on data elements being properly aligned for all datatypes, on all
# platforms.
end
end
_normalize_types(::Type{T}, buf::AbstractVector{UInt8}) where {T} = _typed_load(T, buf)
function _normalize_types(::Type{T}, buf::AbstractVector{UInt8}) where {K,T<:NamedTuple{K}}
# Compound data types do not necessarily have members of uniform size, so instead of
# dim-1 => bytes of single element and dim-2 => over elements, just loop over exact
# byte ranges within the provided buffer vector.
nv = ntuple(length(K)) do ii
elT = fieldtype(T, ii)
off = fieldoffset(T, ii) % Int
sub = view(buf, off .+ (1:sizeof(elT)))
return _normalize_types(elT, sub)
end
return NamedTuple{K}(nv)
end
function _normalize_types(
::Type{V}, buf::AbstractVector{UInt8}
) where {T,V<:VariableArray{T}}
va = _typed_load(V, buf)
pbuf = unsafe_wrap(Array, convert(Ptr{UInt8}, va.p), (sizeof(T), Int(va.len)))
if do_normalize(T)
        # If `T` is a non-trivial type, recursively normalize the vlen buffer.
return normalize_types(T, pbuf)
else
        # Otherwise, if `T` is a simple type, directly reinterpret the vlen buffer.
# (copy since libhdf5 will reclaim `pbuf = va.p` in `API.h5d_vlen_reclaim`)
return copy(vec(reinterpret(T, pbuf)))
end
end
function _normalize_types(::Type{F}, buf::AbstractVector{UInt8}) where {T,F<:FixedArray{T}}
if do_normalize(T)
        # If `T` is a non-trivial type, recursively normalize the buffer after reshaping to
# matrix with dim-1 => bytes of single element and dim-2 => over elements.
return reshape(normalize_types(T, reshape(buf, sizeof(T), :)), size(F)...)
else
        # Otherwise, if `T` is a simple type, directly reinterpret the array and reshape to
# final dimensions. The copy ensures (a) the returned array is independent of
# [potentially much larger] read() buffer, and (b) that the returned data is an
# Array and not ReshapedArray of ReinterpretArray of SubArray of ...
return copy(reshape(reinterpret(T, buf), size(F)...))
end
end
_normalize_types(::Type{Cstring}, buf::AbstractVector{UInt8}) =
unsafe_string(_typed_load(Ptr{UInt8}, buf))
_normalize_types(::Type{T}, buf::AbstractVector{UInt8}) where {T<:FixedString} =
unpad(String(buf), pad(T))
do_normalize(::Type{T}) where {T} = false
do_normalize(::Type{NamedTuple{T,U}}) where {U,T} =
any(i -> do_normalize(fieldtype(U, i)), 1:fieldcount(U))
do_normalize(::Type{T}) where {T<:Union{Cstring,FixedString,FixedArray,VariableArray}} =
true
do_reclaim(::Type{T}) where {T} = false
do_reclaim(::Type{NamedTuple{T,U}}) where {U,T} =
any(i -> do_reclaim(fieldtype(U, i)), 1:fieldcount(U))
do_reclaim(::Type{T}) where {T<:Union{Cstring,VariableArray}} = true
### Generic H5DataStore interface ###
# Common methods that could be applicable to any interface for reading/writing variables from a file, e.g. HDF5, JLD, or MAT files.
# Types inheriting from H5DataStore should have names, read, and write methods.
# Supertype of HDF5.File, HDF5.Group, JldFile, JldGroup, Matlabv5File, and MatlabHDF5File.
abstract type H5DataStore end
"""
read(parent::H5DataStore)
read(parent::H5DataStore, names...)
Read a list of variables, read(parent, "A", "B", "x", ...).
If no variables are specified, read every variable in the file.
"""
function Base.read(parent::H5DataStore, name::AbstractString...)
tuple((read(parent, x) for x in name)...)
end
# Read every variable in the file
function Base.read(f::H5DataStore)
vars = keys(f)
vals = Vector{Any}(undef, length(vars))
for i in 1:length(vars)
vals[i] = read(f, vars[i])
end
Dict(zip(vars, vals))
end
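#= Illustrative usage sketch (assumes a file "data.h5" with datasets "A" and "B"):
       h5open("data.h5", "r") do f
           A, B = read(f, "A", "B")   # read selected variables
           vars = read(f)             # Dict of every top-level variable
       end
=#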
### Base HDF5 structs ###
## HDF5 uses a plain integer to refer to each file, group, or
## dataset. These are wrapped into special types in order to allow
## method dispatch.
# Note re finalizers: we use them to ensure that objects passed back
# to the user will eventually be cleaned up properly. However, since
# finalizers don't run on a predictable schedule, we also call close
# directly on function exit. (This avoids certain problems, like those
# that occur when passing a freshly-created file to some other
# application).
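#= In practice this means resources should still be released eagerly rather
   than left to the garbage collector, e.g. (illustrative):
       f = h5open("data.h5", "w")
       try
           # ... work with f ...
       finally
           close(f)   # or use the do-block form of h5open
       end
=#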
# This defines an "unformatted" HDF5 data file. Formatted files are defined in separate modules.
mutable struct File <: H5DataStore
id::API.hid_t
filename::String
function File(id, filename, toclose::Bool=true)
f = new(id, filename)
if toclose
finalizer(API.try_close_finalizer, f)
end
f
end
end
Base.cconvert(::Type{API.hid_t}, f::File) = f
Base.unsafe_convert(::Type{API.hid_t}, f::File) = f.id
mutable struct Group <: H5DataStore
id::API.hid_t
file::File # the parent file
function Group(id, file)
g = new(id, file)
finalizer(API.try_close_finalizer, g)
g
end
end
Base.cconvert(::Type{API.hid_t}, g::Group) = g
Base.unsafe_convert(::Type{API.hid_t}, g::Group) = g.id
"""
HDF5.Dataset
A mutable wrapper for an HDF5 Dataset `HDF5.API.hid_t`.
"""
mutable struct Dataset
id::API.hid_t
file::File
xfer::DatasetTransferProperties
function Dataset(id, file, xfer=DatasetTransferProperties())
dset = new(id, file, xfer)
finalizer(API.try_close_finalizer, dset)
dset
end
end
Base.cconvert(::Type{API.hid_t}, dset::Dataset) = dset
Base.unsafe_convert(::Type{API.hid_t}, dset::Dataset) = dset.id
"""
HDF5.Datatype(id, toclose = true)
Wrapper for an HDF5 datatype id. If `toclose` is true, the finalizer will close
the datatype.
"""
mutable struct Datatype
id::API.hid_t
toclose::Bool
file::File
function Datatype(id, toclose::Bool=true)
nt = new(id, toclose)
if toclose
finalizer(API.try_close_finalizer, nt)
end
nt
end
function Datatype(id, file::File, toclose::Bool=true)
nt = new(id, toclose, file)
if toclose
finalizer(API.try_close_finalizer, nt)
end
nt
end
end
Base.cconvert(::Type{API.hid_t}, dtype::Datatype) = dtype
Base.unsafe_convert(::Type{API.hid_t}, dtype::Datatype) = dtype.id
Base.hash(dtype::Datatype, h::UInt) = hash(dtype.id, hash(Datatype, h))
Base.:(==)(dt1::Datatype, dt2::Datatype) = API.h5t_equal(dt1, dt2)
mutable struct Dataspace
id::API.hid_t
function Dataspace(id)
dspace = new(id)
finalizer(API.try_close_finalizer, dspace)
dspace
end
end
Base.cconvert(::Type{API.hid_t}, dspace::Dataspace) = dspace
Base.unsafe_convert(::Type{API.hid_t}, dspace::Dataspace) = dspace.id
mutable struct Attribute
id::API.hid_t
file::File
function Attribute(id, file)
dset = new(id, file)
finalizer(API.try_close_finalizer, dset)
dset
end
end
Base.cconvert(::Type{API.hid_t}, attr::Attribute) = attr
Base.unsafe_convert(::Type{API.hid_t}, attr::Attribute) = attr.id
# High-level reference handler
struct Reference
r::API.hobj_ref_t
end
Base.cconvert(
::Type{Ptr{T}}, ref::Reference
) where {T<:Union{Reference,API.hobj_ref_t,Cvoid}} = Ref(ref)
const BitsType = Union{
Bool,Int8,UInt8,Int16,UInt16,Int32,UInt32,Int64,UInt64,Float32,Float64
}
const ScalarType = Union{BitsType,Reference}
# Define an H5O Object type
const Object = Union{Group,Dataset,Datatype}
idx_type(obj::File) =
if get_context_property(:file_create).track_order ||
get_create_properties(obj).track_order
API.H5_INDEX_CRT_ORDER
else
API.H5_INDEX_NAME
end
idx_type(obj::Group) =
if get_context_property(:group_create).track_order ||
get_create_properties(obj).track_order
API.H5_INDEX_CRT_ORDER
else
API.H5_INDEX_NAME
end
idx_type(obj) = API.H5_INDEX_NAME
# TODO: implement alternative iteration order ?
order(obj) = API.H5_ITER_INC
"""
VirtualMapping(
vspace::Dataspace,
srcfile::AbstractString,
srcdset::AbstractString,
srcspace::Dataspace
)
Specify a map of elements of the virtual dataset (VDS) described by `vspace` to
the elements of the source dataset described by `srcspace`. The source dataset
is identified by the name of the file where it is located, `srcfile`, and the
name of the dataset, `srcdset`.
Both `srcfile` and `srcdset` support "printf"-style formats with `%b` being
replaced by the block count of the selection.
For more details on how source file resolution works, see
[`H5P_SET_VIRTUAL`](https://portal.hdfgroup.org/display/HDF5/H5P_SET_VIRTUAL).
"""
struct VirtualMapping
vspace::Dataspace
srcfile::String
srcdset::String
srcspace::Dataspace
end
"""
VirtualLayout(dcpl::DatasetCreateProperties)
The collection of [`VirtualMapping`](@ref)s associated with `dcpl`. This is an
`AbstractVector{VirtualMapping}`, supporting `length`, `getindex` and `push!`.
"""
struct VirtualLayout <: AbstractVector{VirtualMapping}
dcpl::DatasetCreateProperties
end
function Base.length(vlayout::VirtualLayout)
return API.h5p_get_virtual_count(vlayout.dcpl)
end
Base.size(vlayout::VirtualLayout) = (length(vlayout),)
function Base.push!(vlayout::VirtualLayout, vmap::VirtualMapping)
API.h5p_set_virtual(
vlayout.dcpl, vmap.vspace, vmap.srcfile, vmap.srcdset, vmap.srcspace
)
return vlayout
end
function Base.append!(vlayout::VirtualLayout, vmaps)
for vmap in vmaps
push!(vlayout, vmap)
end
return vlayout
end
function Base.getindex(vlayout::VirtualLayout, i::Integer)
vspace = Dataspace(API.h5p_get_virtual_vspace(vlayout.dcpl, i - 1))
srcfile = API.h5p_get_virtual_filename(vlayout.dcpl, i - 1)
srcdset = API.h5p_get_virtual_dsetname(vlayout.dcpl, i - 1)
srcspace = Dataspace(API.h5p_get_virtual_srcspace(vlayout.dcpl, i - 1))
return VirtualMapping(vspace, srcfile, srcdset, srcspace)
end
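#= Illustrative sketch of mapping a source dataset into a virtual dataset; the
   file and dataset names are assumptions, not part of the library:
       dcpl = DatasetCreateProperties()
       vspace = dataspace((100,))
       srcspace = dataspace((100,))
       push!(VirtualLayout(dcpl), VirtualMapping(vspace, "source.h5", "A", srcspace))
=#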
module API
using Libdl: dlopen, dlclose, dlpath, dlsym, RTLD_LAZY, RTLD_NODELETE
using Base: StringVector
using Preferences: @load_preference, delete_preferences!, set_preferences!
using UUIDs: UUID
const _PREFERENCE_LIBHDF5 = @load_preference("libhdf5", nothing)
const _PREFERENCE_LIBHDF5_HL = @load_preference("libhdf5_hl", nothing)
if _PREFERENCE_LIBHDF5 === nothing && _PREFERENCE_LIBHDF5_HL === nothing
using HDF5_jll
elseif _PREFERENCE_LIBHDF5 !== nothing && _PREFERENCE_LIBHDF5_HL === nothing
error("You have only set a preference for the path of libhdf5, but not of libhdf5_hl.")
elseif _PREFERENCE_LIBHDF5 === nothing && _PREFERENCE_LIBHDF5_HL !== nothing
error("You have only set a preference for the path of libhdf5_hl, but not of libhdf5.")
else
libhdf5 = _PREFERENCE_LIBHDF5
libhdf5_hl = _PREFERENCE_LIBHDF5_HL
# Check whether we can open the libraries
flags = RTLD_LAZY | RTLD_NODELETE
dlopen(libhdf5, flags; throw_error=true)
dlopen(libhdf5_hl, flags; throw_error=true)
libhdf5_size = filesize(dlpath(libhdf5))
end
const HDF5_JL_UUID = UUID("f67ccb44-e63f-5c2f-98bd-6dc0ccc4ba2f")
const HDF5_JLL_JL_UUID = UUID("0234f1f7-429e-5d53-9886-15a909be8d59")
"""
set_libraries!(libhdf5 = nothing, libhdf5_hl = nothing; force = true)
Convenience function to set the preferences for a system-provided HDF5 library.
Pass the paths pointing to the libraries `libhdf5` and `libhdf5_hl` as strings
to set the preference. If `libhdf5` and `libhdf5_hl` are `nothing`, use the default,
i.e. the binaries provided by HDF5_jll.jl.
"""
function set_libraries!(libhdf5=nothing, libhdf5_hl=nothing; force=true)
if isnothing(libhdf5) && isnothing(libhdf5_hl)
delete_preferences!(HDF5_JL_UUID, "libhdf5"; force)
delete_preferences!(HDF5_JL_UUID, "libhdf5_hl"; force)
delete_preferences!(HDF5_JLL_JL_UUID, "libhdf5_path", "libhdf5_hl_path"; force)
@info "The libraries from HDF5_jll will be used."
elseif isnothing(libhdf5) || isnothing(libhdf5_hl)
throw(
ArgumentError(
"Specify either no positional arguments or both positional arguments."
)
)
else
isfile(libhdf5) || throw(ArgumentError("$libhdf5 is not a file that exists."))
isfile(libhdf5_hl) || throw(ArgumentError("$libhdf5_hl is not a file that exists."))
set_preferences!(HDF5_JL_UUID, "libhdf5" => libhdf5; force)
set_preferences!(HDF5_JL_UUID, "libhdf5_hl" => libhdf5_hl; force)
# Also set the HDF5_jll override settings in case some other package tries to use HDF5_jll
set_preferences!(
HDF5_JLL_JL_UUID, "libhdf5_path" => libhdf5, "libhdf5_hl_path" => libhdf5_hl
)
end
@info "Please restart Julia and reload HDF5.jl for the library changes to take effect"
end
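#= Illustrative usage; the library paths are assumptions for a system-provided
   HDF5 installation:
       HDF5.API.set_libraries!("/usr/lib/libhdf5.so", "/usr/lib/libhdf5_hl.so")
   and to revert to the binaries shipped with HDF5_jll:
       HDF5.API.set_libraries!()
=#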
include("lock.jl")
include("types.jl")
include("error.jl")
include("functions.jl") # core API ccall wrappers
include("helpers.jl")
function __init__()
# HDF5.API.__init__() is run before HDF5.__init__()
# Ensure this is reinitialized on using
libhdf5handle[] = dlopen(libhdf5)
# Warn if the environment is set and does not agree with Preferences.jl
if haskey(ENV, "JULIA_HDF5_PATH")
if _PREFERENCE_LIBHDF5 === nothing
@warn "The environment variable JULIA_HDF5_PATH is deprecated and ignored. Use Preferences.jl as detailed in the documentation." ENV["JULIA_HDF5_PATH"] _PREFERENCE_LIBHDF5
elseif !startswith(_PREFERENCE_LIBHDF5, ENV["JULIA_HDF5_PATH"])
@warn "The environment variable JULIA_HDF5_PATH is deprecated and does not agree with the Preferences.jl setting." ENV["JULIA_HDF5_PATH"] _PREFERENCE_LIBHDF5
else
@debug "The environment variable JULIA_HDF5_PATH is set and agrees with the Preferences.jl setting." ENV["JULIA_HDF5_PATH"] _PREFERENCE_LIBHDF5
end
end
# Disable file locking as that can cause problems with mmap'ing.
# File locking is disabled in HDF5.init!(::FileAccessPropertyList)
# or here if h5p_set_file_locking is not available
@static if !has_h5p_set_file_locking() && !haskey(ENV, "HDF5_USE_FILE_LOCKING")
ENV["HDF5_USE_FILE_LOCKING"] = "FALSE"
end
# use our own error handling machinery (i.e. turn off automatic error printing)
h5e_set_auto(API.H5E_DEFAULT, C_NULL, C_NULL)
end
end # module API
# An error thrown by libhdf5
mutable struct H5Error <: Exception
msg::String
id::hid_t
end
macro h5error(msg)
    # Check if there are actually any errors on the stack. This is necessary as there are a
# small number of functions which return `0` in case of an error, but `0` is also a
# valid return value, e.g. `h5t_get_member_offset`
# This needs to be a macro as we need to call `h5e_get_current_stack()` _before_
# evaluating the message expression, as some message expressions can call API
# functions, which would clear the error stack.
quote
err_id = h5e_get_current_stack()
if h5e_get_num(err_id) > 0
throw(H5Error($(esc(msg)), err_id))
else
h5e_close_stack(err_id)
end
end
end
Base.cconvert(::Type{hid_t}, err::H5Error) = err
Base.unsafe_convert(::Type{hid_t}, err::H5Error) = err.id
function Base.close(err::H5Error)
if err.id != -1 && isvalid(err)
h5e_close_stack(err)
err.id = -1
end
return nothing
end
Base.isvalid(err::H5Error) = err.id != -1 && h5i_is_valid(err)
Base.length(err::H5Error) = h5e_get_num(err)
Base.isempty(err::H5Error) = length(err) == 0
function H5Error(msg::AbstractString)
id = h5e_get_current_stack()
err = H5Error(msg, id)
finalizer(close, err)
return err
end
const SHORT_ERROR = Ref(true)
function Base.showerror(io::IO, err::H5Error)
n_total = length(err)
print(io, "$(typeof(err)): ", err.msg)
print(io, "\nlibhdf5 Stacktrace:")
h5e_walk(err, H5E_WALK_UPWARD) do n, errptr
n += 1 # 1-based indexing
errval = unsafe_load(errptr)
print(io, "\n", lpad("[$n] ", 4 + ndigits(n_total)))
if errval.func_name != C_NULL
printstyled(io, unsafe_string(errval.func_name); bold=true)
print(io, ": ")
end
major = h5e_get_msg(errval.maj_num)[2]
minor = h5e_get_msg(errval.min_num)[2]
print(io, major, "/", minor)
if errval.desc != C_NULL
printstyled(
io,
"\n",
" "^(4 + ndigits(n_total)),
unsafe_string(errval.desc);
color=:light_black
)
end
if SHORT_ERROR[]
if n_total > 1
print(io, "\n", lpad("⋮", 2 + ndigits(n_total)))
end
return true # stop iterating
end
return false
end
return nothing
end
#! format: off
# This file is autogenerated by HDF5.jl's `gen/gen_wrappers.jl` and should not be editted.
#
# To add new bindings, define the binding in `gen/api_defs.jl`, re-run
# `gen/gen_wrappers.jl`, and commit the updated `src/api/functions.jl`.
_libhdf5_build_ver = let
(majnum, minnum, relnum) = (Ref{Cuint}(), Ref{Cuint}(), Ref{Cuint}())
r = ccall((:H5get_libversion, libhdf5), herr_t, (Ref{Cuint}, Ref{Cuint}, Ref{Cuint}), majnum, minnum, relnum)
r < 0 && error("Error getting HDF5 library version")
VersionNumber(majnum[], minnum[], relnum[])
end
"""
h5_close()
See `libhdf5` documentation for [`H5close`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#ga8a9fe81dcf66972ed75ea481e7750574).
"""
function h5_close()
lock(liblock)
var"#status#" = try
ccall((:H5close, libhdf5), herr_t, ())
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing the HDF5 resources")
return nothing
end
"""
h5_dont_atexit()
See `libhdf5` documentation for [`H5dont_atexit`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#ga7f80eb63b5e78812b9d0d50ac46764e8).
"""
function h5_dont_atexit()
lock(liblock)
var"#status#" = try
ccall((:H5dont_atexit, libhdf5), herr_t, ())
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error calling dont_atexit")
return nothing
end
"""
h5_free_memory(buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5free_memory`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#ga71872bf6445cba956da86d4762b662cf).
"""
function h5_free_memory(buf)
lock(liblock)
var"#status#" = try
ccall((:H5free_memory, libhdf5), herr_t, (Ptr{Cvoid},), buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error freeing memory")
return nothing
end
"""
h5_garbage_collect()
See `libhdf5` documentation for [`H5garbage_collect`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#gae511943bcb837a52a012a3a5dd7b90ef).
"""
function h5_garbage_collect()
lock(liblock)
var"#status#" = try
ccall((:H5garbage_collect, libhdf5), herr_t, ())
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error on garbage collect")
return nothing
end
"""
h5_get_libversion(majnum::Ref{Cuint}, minnum::Ref{Cuint}, relnum::Ref{Cuint})
See `libhdf5` documentation for [`H5get_libversion`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#gaf87da966fdf896ec7bca794e21d4ab0a).
"""
function h5_get_libversion(majnum, minnum, relnum)
lock(liblock)
var"#status#" = try
ccall((:H5get_libversion, libhdf5), herr_t, (Ref{Cuint}, Ref{Cuint}, Ref{Cuint}), majnum, minnum, relnum)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting HDF5 library version")
return nothing
end
"""
h5_is_library_threadsafe(is_ts::Ref{Cuchar})
See `libhdf5` documentation for [`H5is_library_threadsafe`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#ga70bfde4acd009cdd7bcd2f54c594e28a).
"""
function h5_is_library_threadsafe(is_ts)
lock(liblock)
var"#status#" = try
ccall((:H5is_library_threadsafe, libhdf5), herr_t, (Ref{Cuchar},), is_ts)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error determining thread safety")
return nothing
end
"""
h5_open()
See `libhdf5` documentation for [`H5open`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#ga27fa33dc262dda95c5aa8df533837480).
"""
function h5_open()
lock(liblock)
var"#status#" = try
ccall((:H5open, libhdf5), herr_t, ())
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error initializing the HDF5 library")
return nothing
end
"""
h5_set_free_list_limits(reg_global_lim::Cint, reg_list_lim::Cint, arr_global_lim::Cint, arr_list_lim::Cint, blk_global_lim::Cint, blk_list_lim::Cint)
See `libhdf5` documentation for [`H5set_free_list_limits`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5.html#gaa3f78b24b8a1ff4168db2b7ddca21545).
"""
function h5_set_free_list_limits(reg_global_lim, reg_list_lim, arr_global_lim, arr_list_lim, blk_global_lim, blk_list_lim)
lock(liblock)
var"#status#" = try
ccall((:H5set_free_list_limits, libhdf5), herr_t, (Cint, Cint, Cint, Cint, Cint, Cint), reg_global_lim, reg_list_lim, arr_global_lim, arr_list_lim, blk_global_lim, blk_list_lim)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting limits on free lists")
return nothing
end
"""
h5a_close(id::hid_t)
See `libhdf5` documentation for [`H5Aclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gaef4394b661e2c930879e9868e122bdda).
"""
function h5a_close(id)
lock(liblock)
var"#status#" = try
ccall((:H5Aclose, libhdf5), herr_t, (hid_t,), id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing attribute")
return nothing
end
"""
h5a_create(loc_id::hid_t, attr_name::Cstring, type_id::hid_t, space_id::hid_t, acpl_id::hid_t, aapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Acreate2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga4f4e5248c09f689633079ed8afc0b308).
"""
function h5a_create(loc_id, attr_name, type_id, space_id, acpl_id, aapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Acreate2, libhdf5), hid_t, (hid_t, Cstring, hid_t, hid_t, hid_t, hid_t), loc_id, attr_name, type_id, space_id, acpl_id, aapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error creating attribute ", attr_name, " for object ", h5i_get_name(loc_id)))
return var"#status#"
end
"""
h5a_create_by_name(loc_id::hid_t, obj_name::Cstring, attr_name::Cstring, type_id::hid_t, space_id::hid_t, acpl_id::hid_t, aapl_id::hid_t, lapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Acreate_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga004160c28e281455ec48aa7fe557ef8a).
"""
function h5a_create_by_name(loc_id, obj_name, attr_name, type_id, space_id, acpl_id, aapl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Acreate_by_name, libhdf5), hid_t, (hid_t, Cstring, Cstring, hid_t, hid_t, hid_t, hid_t, hid_t), loc_id, obj_name, attr_name, type_id, space_id, acpl_id, aapl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error creating attribute ", attr_name, " for object ", obj_name))
return var"#status#"
end
"""
h5a_delete(loc_id::hid_t, attr_name::Cstring)
See `libhdf5` documentation for [`H5Adelete`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gada9fa3d6db52329f1fd55662de6ff6ba).
"""
function h5a_delete(loc_id, attr_name)
lock(liblock)
var"#status#" = try
ccall((:H5Adelete, libhdf5), herr_t, (hid_t, Cstring), loc_id, attr_name)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error deleting attribute ", attr_name))
return nothing
end
"""
h5a_delete_by_idx(loc_id::hid_t, obj_name::Cstring, idx_type::Cint, order::Cint, n::hsize_t, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Adelete_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga06711a4e77ff8ab49e427010fd38ac9e).
"""
function h5a_delete_by_idx(loc_id, obj_name, idx_type, order, n, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Adelete_by_idx, libhdf5), herr_t, (hid_t, Cstring, Cint, Cint, hsize_t, hid_t), loc_id, obj_name, idx_type, order, n, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error deleting attribute ", n, " from object ", obj_name))
return nothing
end
"""
h5a_delete_by_name(loc_id::hid_t, obj_name::Cstring, attr_name::Cstring, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Adelete_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gacbf689308f851428dd641b64f5f94feb).
"""
function h5a_delete_by_name(loc_id, obj_name, attr_name, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Adelete_by_name, libhdf5), herr_t, (hid_t, Cstring, Cstring, hid_t), loc_id, obj_name, attr_name, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error removing attribute ", attr_name, " from object ", obj_name))
return nothing
end
"""
h5a_exists(obj_id::hid_t, attr_name::Cstring) -> Bool
See `libhdf5` documentation for [`H5Aexists`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga293b5be270d90cd5e47f782ca9aec80b).
"""
function h5a_exists(obj_id, attr_name)
lock(liblock)
var"#status#" = try
ccall((:H5Aexists, libhdf5), htri_t, (hid_t, Cstring), obj_id, attr_name)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error(string("Error checking whether attribute ", attr_name, " exists"))
return var"#status#" > 0
end
"""
h5a_exists_by_name(loc_id::hid_t, obj_name::Cstring, attr_name::Cstring, lapl_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Aexists_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gaa1d2305651a4524f6aa0f8b56eec1a37).
"""
function h5a_exists_by_name(loc_id, obj_name, attr_name, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aexists_by_name, libhdf5), htri_t, (hid_t, Cstring, Cstring, hid_t), loc_id, obj_name, attr_name, lapl_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error(string("Error checking whether object ", obj_name, " has attribute ", attr_name))
return var"#status#" > 0
end
"""
h5a_get_create_plist(attr_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Aget_create_plist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga0f6b545850bd21f128904eff51df226d).
"""
function h5a_get_create_plist(attr_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aget_create_plist, libhdf5), hid_t, (hid_t,), attr_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Cannot get creation property list")
return var"#status#"
end
"""
h5a_get_name(attr_id::hid_t, buf_size::Csize_t, buf::Ptr{UInt8}) -> Cssize_t
See `libhdf5` documentation for [`H5Aget_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga05e195aabab8c623b1c52009aeb99674).
"""
function h5a_get_name(attr_id, buf_size, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Aget_name, libhdf5), Cssize_t, (hid_t, Csize_t, Ptr{UInt8}), attr_id, buf_size, buf)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting attribute name")
return var"#status#"
end
"""
h5a_get_name_by_idx(loc_id::hid_t, obj_name::Cstring, index_type::Cint, order::Cint, idx::hsize_t, name::Ptr{UInt8}, size::Csize_t, lapl_id::hid_t) -> Cssize_t
See `libhdf5` documentation for [`H5Aget_name_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga4c552b2db32371f8ea20d87475313fb6).
"""
function h5a_get_name_by_idx(loc_id, obj_name, index_type, order, idx, name, size, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aget_name_by_idx, libhdf5), Cssize_t, (hid_t, Cstring, Cint, Cint, hsize_t, Ptr{UInt8}, Csize_t, hid_t), loc_id, obj_name, index_type, order, idx, name, size, lapl_id)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting attribute name")
return var"#status#"
end
"""
h5a_get_space(attr_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Aget_space`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga9e21e544119d03f9342530b45a71d74d).
"""
function h5a_get_space(attr_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aget_space, libhdf5), hid_t, (hid_t,), attr_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting attribute dataspace")
return var"#status#"
end
"""
h5a_get_type(attr_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Aget_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga0b070b714b2e535df2e1cb3005026a44).
"""
function h5a_get_type(attr_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aget_type, libhdf5), hid_t, (hid_t,), attr_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting attribute type")
return var"#status#"
end
"""
h5a_iterate(obj_id::hid_t, idx_type::Cint, order::Cint, n::Ptr{hsize_t}, op::Ptr{Cvoid}, op_data::Any)
See `libhdf5` documentation for [`H5Aiterate2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga9315a22b60468b6e996559b1b8a77251).
"""
function h5a_iterate(obj_id, idx_type, order, n, op, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Aiterate2, libhdf5), herr_t, (hid_t, Cint, Cint, Ptr{hsize_t}, Ptr{Cvoid}, Any), obj_id, idx_type, order, n, op, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error iterating attributes in object ", h5i_get_name(obj_id)))
return nothing
end
"""
h5a_open(obj_id::hid_t, attr_name::Cstring, aapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Aopen`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga59863b205b6d93b2145f0fbca49656f7).
"""
function h5a_open(obj_id, attr_name, aapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aopen, libhdf5), hid_t, (hid_t, Cstring, hid_t), obj_id, attr_name, aapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error opening attribute ", attr_name, " for object ", h5i_get_name(obj_id)))
return var"#status#"
end
"""
h5a_open_by_idx(obj_id::hid_t, pathname::Cstring, idx_type::Cint, order::Cint, n::hsize_t, aapl_id::hid_t, lapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Aopen_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gab1451cdff4f77dcf9feaee83c8179b2d).
"""
function h5a_open_by_idx(obj_id, pathname, idx_type, order, n, aapl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Aopen_by_idx, libhdf5), hid_t, (hid_t, Cstring, Cint, Cint, hsize_t, hid_t, hid_t), obj_id, pathname, idx_type, order, n, aapl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error opening attribute ", n, " of ", h5i_get_name(obj_id), "/", pathname))
return var"#status#"
end
"""
h5a_read(attr_id::hid_t, mem_type_id::hid_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Aread`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gaacb27a997f7c98e8a833d0fd63b58f1c).
"""
function h5a_read(attr_id, mem_type_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Aread, libhdf5), herr_t, (hid_t, hid_t, Ptr{Cvoid}), attr_id, mem_type_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error reading attribute ", h5a_get_name(attr_id)))
return nothing
end
"""
h5a_rename(loc_id::hid_t, old_attr_name::Cstring, new_attr_name::Cstring)
See `libhdf5` documentation for [`H5Arename`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#ga490dcd6db246c1fda7295badfce28203).
"""
function h5a_rename(loc_id, old_attr_name, new_attr_name)
lock(liblock)
var"#status#" = try
ccall((:H5Arename, libhdf5), herr_t, (hid_t, Cstring, Cstring), loc_id, old_attr_name, new_attr_name)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Could not rename attribute"))
return nothing
end
"""
h5a_write(attr_hid::hid_t, mem_type_id::hid_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Awrite`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_a.html#gab70871e205d57450c83efd9912be2b5c).
"""
function h5a_write(attr_hid, mem_type_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Awrite, libhdf5), herr_t, (hid_t, hid_t, Ptr{Cvoid}), attr_hid, mem_type_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error writing attribute data")
return nothing
end
@static if v"1.12.3" ≤ _libhdf5_build_ver
@doc """
h5d_chunk_iter(dset_id::hid_t, dxpl_id::hid_t, cb::Ptr{Nothing}, op_data::Any)
See `libhdf5` documentation for [`H5Dchunk_iter`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gac482c2386aa3aea4c44730a627a7adb8).
"""
function h5d_chunk_iter(dset_id, dxpl_id, cb, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Dchunk_iter, libhdf5), herr_t, (hid_t, hid_t, Ptr{Nothing}, Any), dset_id, dxpl_id, cb, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error iterating over chunks")
return nothing
end
end
"""
h5d_close(dataset_id::hid_t)
See `libhdf5` documentation for [`H5Dclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gae47c3f38db49db127faf221624c30609).
"""
function h5d_close(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dclose, libhdf5), herr_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing dataset")
return nothing
end
"""
h5d_create(loc_id::hid_t, pathname::Cstring, dtype_id::hid_t, space_id::hid_t, lcpl_id::hid_t, dcpl_id::hid_t, dapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dcreate2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gabf62045119f4e9c512d87d77f2f992df).
"""
function h5d_create(loc_id, pathname, dtype_id, space_id, lcpl_id, dcpl_id, dapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dcreate2, libhdf5), hid_t, (hid_t, Cstring, hid_t, hid_t, hid_t, hid_t, hid_t), loc_id, pathname, dtype_id, space_id, lcpl_id, dcpl_id, dapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error creating dataset ", h5i_get_name(loc_id), "/", pathname))
return var"#status#"
end
"""
h5d_create_anon(loc_id::hid_t, type_id::hid_t, space_id::hid_t, dcpl_id::hid_t, dapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dcreate_anon`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga15a77e82383d821fee8ecbf9ab8408cb).
"""
function h5d_create_anon(loc_id, type_id, space_id, dcpl_id, dapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dcreate_anon, libhdf5), hid_t, (hid_t, hid_t, hid_t, hid_t, hid_t), loc_id, type_id, space_id, dcpl_id, dapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in creating anonymous dataset")
return var"#status#"
end
"""
h5d_extend(dataset_id::hid_t, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dextend`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gac4c0ff57977b1f39c1055296e39cbe91).
"""
function h5d_extend(dataset_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Dextend, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), dataset_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error extending dataset")
return nothing
end
"""
h5d_fill(fill::Ptr{Cvoid}, fill_type_id::hid_t, buf::Ptr{Cvoid}, buf_type_id::hid_t, space_id::hid_t)
See `libhdf5` documentation for [`H5Dfill`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga8d4a57e2b2b8c95cfecf6f75bdaa8343).
"""
function h5d_fill(fill, fill_type_id, buf, buf_type_id, space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dfill, libhdf5), herr_t, (Ptr{Cvoid}, hid_t, Ptr{Cvoid}, hid_t, hid_t), fill, fill_type_id, buf, buf_type_id, space_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error filling dataset")
return nothing
end
"""
h5d_flush(dataset_id::hid_t)
See `libhdf5` documentation for [`H5Dflush`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga4a2175a62baa1e35ad2467bb1fdff1f7).
"""
function h5d_flush(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dflush, libhdf5), herr_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error flushing dataset")
return nothing
end
"""
h5d_gather(src_space_id::hid_t, src_buf::Ptr{Cvoid}, type_id::hid_t, dst_buf_size::Csize_t, dst_buf::Ptr{Cvoid}, op::Ptr{Cvoid}, op_data::Any)
See `libhdf5` documentation for [`H5Dgather`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga1f6a428a8234d7c2ccba7da4742d79be).
"""
function h5d_gather(src_space_id, src_buf, type_id, dst_buf_size, dst_buf, op, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Dgather, libhdf5), herr_t, (hid_t, Ptr{Cvoid}, hid_t, Csize_t, Ptr{Cvoid}, Ptr{Cvoid}, Any), src_space_id, src_buf, type_id, dst_buf_size, dst_buf, op, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error gathering dataset")
return nothing
end
"""
h5d_get_access_plist(dataset_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dget_access_plist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga252c0ddac7a7817bd757190e7398353b).
"""
function h5d_get_access_plist(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_access_plist, libhdf5), hid_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting dataset access property list")
return var"#status#"
end
"""
h5d_get_chunk_info(dataset_id::hid_t, fspace_id::hid_t, index::hsize_t, offset::Ptr{hsize_t}, filter_mask::Ptr{Cuint}, addr::Ptr{haddr_t}, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dget_chunk_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gaccff213d3e0765b86f66d08dd9959807).
"""
function h5d_get_chunk_info(dataset_id, fspace_id, index, offset, filter_mask, addr, size)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_chunk_info, libhdf5), herr_t, (hid_t, hid_t, hsize_t, Ptr{hsize_t}, Ptr{Cuint}, Ptr{haddr_t}, Ptr{hsize_t}), dataset_id, fspace_id, index, offset, filter_mask, addr, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting chunk info")
return nothing
end
@static if v"1.10.5" ≤ _libhdf5_build_ver
@doc """
h5d_get_chunk_info_by_coord(dataset_id::hid_t, offset::Ptr{hsize_t}, filter_mask::Ptr{Cuint}, addr::Ptr{haddr_t}, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dget_chunk_info_by_coord`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga408a49c6ec59c5b65ce4c791f8d26cb0).
"""
function h5d_get_chunk_info_by_coord(dataset_id, offset, filter_mask, addr, size)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_chunk_info_by_coord, libhdf5), herr_t, (hid_t, Ptr{hsize_t}, Ptr{Cuint}, Ptr{haddr_t}, Ptr{hsize_t}), dataset_id, offset, filter_mask, addr, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting chunk info by coord")
return nothing
end
end
"""
h5d_get_chunk_storage_size(dataset_id::hid_t, offset::Ptr{hsize_t}, chunk_nbytes::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dget_chunk_storage_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gaaeea958861de082db9051fc4bf215234).
"""
function h5d_get_chunk_storage_size(dataset_id, offset, chunk_nbytes)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_chunk_storage_size, libhdf5), herr_t, (hid_t, Ptr{hsize_t}, Ptr{hsize_t}), dataset_id, offset, chunk_nbytes)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting chunk storage size")
return nothing
end
"""
h5d_get_create_plist(dataset_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dget_create_plist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga8848f14f4aba8e6160c3d8bb7f1be163).
"""
function h5d_get_create_plist(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_create_plist, libhdf5), hid_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting dataset create property list")
return var"#status#"
end
@static if v"1.10.5" ≤ _libhdf5_build_ver
@doc """
h5d_get_num_chunks(dataset_id::hid_t, fspace_id::hid_t, nchunks::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dget_num_chunks`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga8e15897dcc5799d6c09806644b492d7a).
"""
function h5d_get_num_chunks(dataset_id, fspace_id, nchunks)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_num_chunks, libhdf5), herr_t, (hid_t, hid_t, Ptr{hsize_t}), dataset_id, fspace_id, nchunks)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting number of chunks")
return nothing
end
end
"""
h5d_get_offset(dataset_id::hid_t) -> haddr_t
See `libhdf5` documentation for [`H5Dget_offset`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga70ce7ab523b06c6c6a93fb28e916c2b3).
"""
function h5d_get_offset(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_offset, libhdf5), haddr_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" == -1 % haddr_t && @h5error("Error getting offset")
return var"#status#"
end
"""
h5d_get_space(dataset_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dget_space`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gad42a46be153d895d8c28a11ebf5a0d0a).
"""
function h5d_get_space(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_space, libhdf5), hid_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting dataspace")
return var"#status#"
end
"""
h5d_get_space_status(dataset_id::hid_t, status::Ref{Cint})
See `libhdf5` documentation for [`H5Dget_space_status`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga7639ef5c12cb906c71670ce73b856a4c).
"""
function h5d_get_space_status(dataset_id, status)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_space_status, libhdf5), herr_t, (hid_t, Ref{Cint}), dataset_id, status)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting dataspace status")
return nothing
end
"""
h5d_get_storage_size(dataset_id::hid_t) -> hsize_t
See `libhdf5` documentation for [`H5Dget_storage_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gafb249479a493e80891f0c7f5d8a91b00).
"""
function h5d_get_storage_size(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_storage_size, libhdf5), hsize_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" == -1 % hsize_t && @h5error("Error getting storage size")
return var"#status#"
end
"""
h5d_get_type(dataset_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dget_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga7cd04b8332e8a0939b9973fbc500cadb).
"""
function h5d_get_type(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dget_type, libhdf5), hid_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting dataspace type")
return var"#status#"
end
"""
h5d_iterate(buf::Ptr{Cvoid}, type_id::hid_t, space_id::hid_t, operator::Ptr{Cvoid}, operator_data::Any)
See `libhdf5` documentation for [`H5Diterate`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga71421c684884ab49765748720fe938db).
"""
function h5d_iterate(buf, type_id, space_id, operator, operator_data)
lock(liblock)
var"#status#" = try
ccall((:H5Diterate, libhdf5), herr_t, (Ptr{Cvoid}, hid_t, hid_t, Ptr{Cvoid}, Any), buf, type_id, space_id, operator, operator_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error iterating dataset")
return nothing
end
"""
h5d_open(loc_id::hid_t, pathname::Cstring, dapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Dopen2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga04198c4cf0b849ed3a8921f6c7169ee2).
"""
function h5d_open(loc_id, pathname, dapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Dopen2, libhdf5), hid_t, (hid_t, Cstring, hid_t), loc_id, pathname, dapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error opening dataset ", h5i_get_name(loc_id), "/", pathname))
return var"#status#"
end
"""
h5d_read(dataset_id::hid_t, mem_type_id::hid_t, mem_space_id::hid_t, file_space_id::hid_t, xfer_plist_id::hid_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Dread`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga8287d5a7be7b8e55ffeff68f7d26811c).
"""
function h5d_read(dataset_id, mem_type_id, mem_space_id, file_space_id, xfer_plist_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dread, libhdf5), herr_t, (hid_t, hid_t, hid_t, hid_t, hid_t, Ptr{Cvoid}), dataset_id, mem_type_id, mem_space_id, file_space_id, xfer_plist_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error reading dataset ", h5i_get_name(dataset_id)))
return nothing
end
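# Illustrative sketch (not executed): reading an entire dataset of 10 doubles
# into a preallocated Julia array. It assumes `dset_id` is an open dataset and
# that `H5S_ALL`, `H5T_NATIVE_DOUBLE`, and `H5P_DEFAULT` are defined elsewhere
# in this module.
#
#     data = Vector{Float64}(undef, 10)
#     h5d_read(dset_id, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, data)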
"""
h5d_read_chunk(dset::hid_t, dxpl_id::hid_t, offset::Ptr{hsize_t}, filters::Ptr{UInt32}, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Dread_chunk`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gac1092a63b718ec949d6539590a914b60).
"""
function h5d_read_chunk(dset, dxpl_id, offset, filters, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dread_chunk, libhdf5), herr_t, (hid_t, hid_t, Ptr{hsize_t}, Ptr{UInt32}, Ptr{Cvoid}), dset, dxpl_id, offset, filters, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error reading chunk")
return nothing
end
"""
h5d_refresh(dataset_id::hid_t)
See `libhdf5` documentation for [`H5Drefresh`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga3c1ea7e5db3f62d9cf03dd62d1fb08da).
"""
function h5d_refresh(dataset_id)
lock(liblock)
var"#status#" = try
ccall((:H5Drefresh, libhdf5), herr_t, (hid_t,), dataset_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error refreshing dataset")
return nothing
end
"""
h5d_scatter(op::Ptr{Cvoid}, op_data::Any, type_id::hid_t, dst_space_id::hid_t, dst_buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Dscatter`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga3525b15235ba1fd415f988899e48dc5c).
"""
function h5d_scatter(op, op_data, type_id, dst_space_id, dst_buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dscatter, libhdf5), herr_t, (Ptr{Cvoid}, Any, hid_t, hid_t, Ptr{Cvoid}), op, op_data, type_id, dst_space_id, dst_buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error scattering to dataset")
return nothing
end
"""
h5d_set_extent(dataset_id::hid_t, new_dims::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dset_extent`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#gad31e1e0129f4520c531ce524de2a056f).
"""
function h5d_set_extent(dataset_id, new_dims)
lock(liblock)
var"#status#" = try
ccall((:H5Dset_extent, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), dataset_id, new_dims)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error extending dataset dimensions")
return nothing
end
"""
h5d_vlen_get_buf_size(dset_id::hid_t, type_id::hid_t, space_id::hid_t, buf::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Dvlen_get_buf_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga0e97bbd8a8ee4a8b5b78ccce8547ce76).
"""
function h5d_vlen_get_buf_size(dset_id, type_id, space_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dvlen_get_buf_size, libhdf5), herr_t, (hid_t, hid_t, hid_t, Ptr{hsize_t}), dset_id, type_id, space_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting vlen buffer size")
return nothing
end
"""
h5d_vlen_reclaim(type_id::hid_t, space_id::hid_t, plist_id::hid_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Dvlen_reclaim`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga222a2fd93868e2524b2e42c3c6146119).
"""
function h5d_vlen_reclaim(type_id, space_id, plist_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dvlen_reclaim, libhdf5), herr_t, (hid_t, hid_t, hid_t, Ptr{Cvoid}), type_id, space_id, plist_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error reclaiming vlen buffer")
return nothing
end
"""
h5d_write(dataset_id::hid_t, mem_type_id::hid_t, mem_space_id::hid_t, file_space_id::hid_t, xfer_plist_id::hid_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Dwrite`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga98f44998b67587662af8b0d8a0a75906).
"""
function h5d_write(dataset_id, mem_type_id, mem_space_id, file_space_id, xfer_plist_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dwrite, libhdf5), herr_t, (hid_t, hid_t, hid_t, hid_t, hid_t, Ptr{Cvoid}), dataset_id, mem_type_id, mem_space_id, file_space_id, xfer_plist_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error writing dataset")
return nothing
end
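# Illustrative sketch (not executed): writing a whole Julia array to an open
# dataset whose file dataspace matches the array length, using the same
# assumed constants as the `h5d_read` sketch above.
#
#     data = collect(1.0:10.0)
#     h5d_write(dset_id, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, data)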
"""
h5d_write_chunk(dset_id::hid_t, dxpl_id::hid_t, filter_mask::UInt32, offset::Ptr{hsize_t}, bufsize::Csize_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Dwrite_chunk`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d.html#ga416ccd200929b11386a10e9024977109).
"""
function h5d_write_chunk(dset_id, dxpl_id, filter_mask, offset, bufsize, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Dwrite_chunk, libhdf5), herr_t, (hid_t, hid_t, UInt32, Ptr{hsize_t}, Csize_t, Ptr{Cvoid}), dset_id, dxpl_id, filter_mask, offset, bufsize, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error writing chunk")
return nothing
end
"""
h5e_get_auto(estack_id::hid_t, func::Ref{Ptr{Cvoid}}, client_data::Ref{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Eget_auto2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#ga2eda33cbadd9be5bfddbaa91e863c936).
"""
function h5e_get_auto(estack_id, func, client_data)
lock(liblock)
var"#status#" = try
ccall((:H5Eget_auto2, libhdf5), herr_t, (hid_t, Ref{Ptr{Cvoid}}, Ref{Ptr{Cvoid}}), estack_id, func, client_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting error reporting behavior")
return nothing
end
"""
h5e_set_auto(estack_id::hid_t, func::Ptr{Cvoid}, client_data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Eset_auto2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#gaf0d6b18cd5160517fe5165b9a8443c69).
"""
function h5e_set_auto(estack_id, func, client_data)
lock(liblock)
var"#status#" = try
ccall((:H5Eset_auto2, libhdf5), herr_t, (hid_t, Ptr{Cvoid}, Ptr{Cvoid}), estack_id, func, client_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting error reporting behavior")
return nothing
end
"""
h5e_get_current_stack() -> hid_t
See `libhdf5` documentation for [`H5Eget_current_stack`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#gac66c0955a6d821a472a3a408cdc95ae6).
"""
function h5e_get_current_stack()
lock(liblock)
var"#status#" = try
ccall((:H5Eget_current_stack, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Unable to return current error stack")
return var"#status#"
end
"""
h5e_get_msg(mesg_id::hid_t, mesg_type::Ref{Cint}, mesg::Ref{UInt8}, len::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Eget_msg`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#ga64714effca13c23c4f95529256621fa0).
"""
function h5e_get_msg(mesg_id, mesg_type, mesg, len)
lock(liblock)
var"#status#" = try
ccall((:H5Eget_msg, libhdf5), Cssize_t, (hid_t, Ref{Cint}, Ref{UInt8}, Csize_t), mesg_id, mesg_type, mesg, len)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting message")
return var"#status#"
end
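# Illustrative sketch (not executed): the usual two-pass pattern for
# variable-length strings — a first call with a zero-length buffer reports the
# message length, then a buffer of that size is filled. `mesg_id` is assumed
# to come from walking an error stack.
#
#     mesg_type = Ref{Cint}()
#     len = h5e_get_msg(mesg_id, mesg_type, UInt8[0x00], 0)
#     buf = Vector{UInt8}(undef, len + 1)
#     h5e_get_msg(mesg_id, mesg_type, buf, length(buf))
#     msg = String(buf[1:len])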
"""
h5e_get_num(estack_id::hid_t) -> Cssize_t
See `libhdf5` documentation for [`H5Eget_num`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#ga5c42673e2059c385a95ce3c597e0756d).
"""
function h5e_get_num(estack_id)
lock(liblock)
var"#status#" = try
ccall((:H5Eget_num, libhdf5), Cssize_t, (hid_t,), estack_id)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting stack length")
return var"#status#"
end
"""
h5e_close_stack(stack_id::hid_t)
See `libhdf5` documentation for [`H5Eclose_stack`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#ga41c2ed13fd6aac6e413fe7383b9090fa).
"""
function h5e_close_stack(stack_id)
lock(liblock)
var"#status#" = try
ccall((:H5Eclose_stack, libhdf5), herr_t, (hid_t,), stack_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing stack")
return nothing
end
"""
h5e_walk(stack_id::hid_t, direction::Cint, op::Ptr{Cvoid}, op_data::Any)
See `libhdf5` documentation for [`H5Ewalk2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_e.html#ga4ecc0f6a1ea5bb821373a5a7b8070655).
"""
function h5e_walk(stack_id, direction, op, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Ewalk2, libhdf5), herr_t, (hid_t, Cint, Ptr{Cvoid}, Any), stack_id, direction, op, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error walking stack")
return nothing
end
"""
h5f_clear_elink_file_cache(file_id::hid_t)
See `libhdf5` documentation for [`H5Fclear_elink_file_cache`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gafcc153d8606829d4401e93305e5246d7).
"""
function h5f_clear_elink_file_cache(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fclear_elink_file_cache, libhdf5), herr_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_clear_elink_file_cache (not annotated)")
return nothing
end
"""
h5f_close(file_id::hid_t)
See `libhdf5` documentation for [`H5Fclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gac55cd91d80822e4f8c2a7f04ea71b124).
"""
function h5f_close(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fclose, libhdf5), herr_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing file")
return nothing
end
"""
h5f_create(pathname::Cstring, flags::Cuint, fcpl_id::hid_t, fapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Fcreate`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gae64b51ee9ac0781bc4ccc599d98387f4).
"""
function h5f_create(pathname, flags, fcpl_id, fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fcreate, libhdf5), hid_t, (Cstring, Cuint, hid_t, hid_t), pathname, flags, fcpl_id, fapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error creating file $(pathname)")
return var"#status#"
end
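# Illustrative sketch (not executed): creating (or truncating) a file with
# default property lists and closing it again. It assumes `H5F_ACC_TRUNC` and
# `H5P_DEFAULT` are the usual constants defined elsewhere in this module.
#
#     fid = h5f_create("data.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT)
#     # ... create groups and datasets under fid ...
#     h5f_close(fid)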
"""
h5f_delete(filename::Cstring, fapl_id::hid_t)
See `libhdf5` documentation for [`H5Fdelete`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga2e8b5e19b343123e8ab21442f9169a62).
"""
function h5f_delete(filename, fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fdelete, libhdf5), herr_t, (Cstring, hid_t), filename, fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_delete (not annotated)")
return nothing
end
"""
h5f_flush(object_id::hid_t, scope::Cint)
See `libhdf5` documentation for [`H5Fflush`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gae686870f0a276c4d06bbc667b2c24124).
"""
function h5f_flush(object_id, scope)
lock(liblock)
var"#status#" = try
ccall((:H5Fflush, libhdf5), herr_t, (hid_t, Cint), object_id, scope)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error flushing object to file")
return nothing
end
"""
h5f_format_convert(fid::hid_t)
See `libhdf5` documentation for [`H5Fformat_convert`](https://docs.hdfgroup.org/hdf5/v1_14/).
"""
function h5f_format_convert(fid)
lock(liblock)
var"#status#" = try
ccall((:H5Fformat_convert, libhdf5), herr_t, (hid_t,), fid)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_format_convert (not annotated)")
return nothing
end
"""
h5f_get_access_plist(file_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Fget_access_plist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga359585c49f82f5199178777b39e780f4).
"""
function h5f_get_access_plist(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_access_plist, libhdf5), hid_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting file access property list")
return var"#status#"
end
"""
h5f_get_create_plist(file_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Fget_create_plist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga2f823a9e929b00b06a6be80619a61778).
"""
function h5f_get_create_plist(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_create_plist, libhdf5), hid_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting file create property list")
return var"#status#"
end
"""
h5f_get_dset_no_attrs_hint(file_id::hid_t, minimize::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Fget_dset_no_attrs_hint`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gacbf3ba8b36750c42b49740567a9732c4).
"""
function h5f_get_dset_no_attrs_hint(file_id, minimize)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_dset_no_attrs_hint, libhdf5), herr_t, (hid_t, Ptr{hbool_t}), file_id, minimize)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting dataset no attributes hint")
return nothing
end
"""
h5f_get_eoa(file_id::hid_t, eoa::Ptr{haddr_t})
See `libhdf5` documentation for [`H5Fget_eoa`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga4c18bddafc652203944d889a602bd53f).
"""
function h5f_get_eoa(file_id, eoa)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_eoa, libhdf5), herr_t, (hid_t, Ptr{haddr_t}), file_id, eoa)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_eoa (not annotated)")
return nothing
end
"""
h5f_get_file_image(file_id::hid_t, buf_ptr::Ptr{Cvoid}, buf_len::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Fget_file_image`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gadc53f4e76b1199cb5d2a8cb7fbb114ad).
"""
function h5f_get_file_image(file_id, buf_ptr, buf_len)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_file_image, libhdf5), Cssize_t, (hid_t, Ptr{Cvoid}, Csize_t), file_id, buf_ptr, buf_len)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5f_get_file_image (not annotated)")
return var"#status#"
end
"""
h5f_get_fileno(file_id::hid_t, fileno::Ptr{Culong})
See `libhdf5` documentation for [`H5Fget_fileno`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga402205688af065ab5db0fe20417d5484).
"""
function h5f_get_fileno(file_id, fileno)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_fileno, libhdf5), herr_t, (hid_t, Ptr{Culong}), file_id, fileno)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_fileno (not annotated)")
return nothing
end
"""
h5f_get_filesize(file_id::hid_t, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Fget_filesize`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga515426821321c261a825b4e4a3f576fe).
"""
function h5f_get_filesize(file_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_filesize, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), file_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_filesize (not annotated)")
return nothing
end
"""
h5f_get_free_sections(file_id::hid_t, type::H5F_mem_t, nsects::Csize_t, sect_info::Ptr{H5F_sect_info_t}) -> Cssize_t
See `libhdf5` documentation for [`H5Fget_free_sections`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gab9cbf1a45f9dcda34b43f985b7848434).
"""
function h5f_get_free_sections(file_id, type, nsects, sect_info)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_free_sections, libhdf5), Cssize_t, (hid_t, H5F_mem_t, Csize_t, Ptr{H5F_sect_info_t}), file_id, type, nsects, sect_info)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5f_get_free_sections (not annotated)")
return var"#status#"
end
"""
h5f_get_freespace(file_id::hid_t) -> hssize_t
See `libhdf5` documentation for [`H5Fget_freespace`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga3ef2673183567543346668a8f1eca2e9).
"""
function h5f_get_freespace(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_freespace, libhdf5), hssize_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < hssize_t(0) && @h5error("Error in h5f_get_freespace (not annotated)")
return var"#status#"
end
"""
h5f_get_intent(file_id::hid_t, intent::Ptr{Cuint})
See `libhdf5` documentation for [`H5Fget_intent`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga466179d7783d256329c2e3110055a16c).
"""
function h5f_get_intent(file_id, intent)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_intent, libhdf5), herr_t, (hid_t, Ptr{Cuint}), file_id, intent)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting file intent")
return nothing
end
"""
h5f_get_info(obj_id::hid_t, file_info::Ptr{H5F_info2_t})
See `libhdf5` documentation for [`H5Fget_info2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gaced8c09c1559636a9c3f33dff3f4520e).
"""
function h5f_get_info(obj_id, file_info)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_info2, libhdf5), herr_t, (hid_t, Ptr{H5F_info2_t}), obj_id, file_info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_info2 (not annotated)")
return nothing
end
"""
h5f_get_mdc_config(file_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})
See `libhdf5` documentation for [`H5Fget_mdc_config`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#gaa67f127242d4aaf244ae8ac4a1fe6a59).
"""
function h5f_get_mdc_config(file_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_mdc_config, libhdf5), herr_t, (hid_t, Ptr{H5AC_cache_config_t}), file_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_mdc_config (not annotated)")
return nothing
end
"""
h5f_get_mdc_hit_rate(file_id::hid_t, hit_rate_ptr::Ptr{Cdouble})
See `libhdf5` documentation for [`H5Fget_mdc_hit_rate`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#gabea066c3fd924d2cf868ecee66a7c41f).
"""
function h5f_get_mdc_hit_rate(file_id, hit_rate_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_mdc_hit_rate, libhdf5), herr_t, (hid_t, Ptr{Cdouble}), file_id, hit_rate_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_mdc_hit_rate (not annotated)")
return nothing
end
"""
h5f_get_mdc_image_info(file_id::hid_t, image_addr::Ptr{haddr_t}, image_size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Fget_mdc_image_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#ga7b37da15ff80c4aa5c275649f1f45b0a).
"""
function h5f_get_mdc_image_info(file_id, image_addr, image_size)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_mdc_image_info, libhdf5), herr_t, (hid_t, Ptr{haddr_t}, Ptr{hsize_t}), file_id, image_addr, image_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_mdc_image_info (not annotated)")
return nothing
end
"""
h5f_get_mdc_logging_status(file_id::hid_t, is_enabled::Ptr{hbool_t}, is_currently_logging::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Fget_mdc_logging_status`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#ga998ebdc7b5190cf3d0fdf2fbe71e9780).
"""
function h5f_get_mdc_logging_status(file_id, is_enabled, is_currently_logging)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_mdc_logging_status, libhdf5), herr_t, (hid_t, Ptr{hbool_t}, Ptr{hbool_t}), file_id, is_enabled, is_currently_logging)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_mdc_logging_status (not annotated)")
return nothing
end
"""
h5f_get_mdc_size(file_id::hid_t, max_size_ptr::Ptr{Csize_t}, min_clean_size_ptr::Ptr{Csize_t}, cur_size_ptr::Ptr{Csize_t}, cur_num_entries_ptr::Ptr{Cint})
See `libhdf5` documentation for [`H5Fget_mdc_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#gacda6cbd60d3c50b59f801eba4e5a335f).
"""
function h5f_get_mdc_size(file_id, max_size_ptr, min_clean_size_ptr, cur_size_ptr, cur_num_entries_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_mdc_size, libhdf5), herr_t, (hid_t, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cint}), file_id, max_size_ptr, min_clean_size_ptr, cur_size_ptr, cur_num_entries_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_mdc_size (not annotated)")
return nothing
end
"""
h5f_get_metadata_read_retry_info(file_id::hid_t, info::Ptr{H5F_retry_info_t})
See `libhdf5` documentation for [`H5Fget_metadata_read_retry_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___s_w_m_r.html#gaa80bd62f19993e414e383db7d1623e5f).
"""
function h5f_get_metadata_read_retry_info(file_id, info)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_metadata_read_retry_info, libhdf5), herr_t, (hid_t, Ptr{H5F_retry_info_t}), file_id, info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_metadata_read_retry_info (not annotated)")
return nothing
end
"""
h5f_get_mpi_atomicity(file_id::hid_t, flag::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Fget_mpi_atomicity`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_h5_f.html#ga849316b77788799fecb321a87d987ade).
"""
function h5f_get_mpi_atomicity(file_id, flag)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_mpi_atomicity, libhdf5), herr_t, (hid_t, Ptr{hbool_t}), file_id, flag)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_mpi_atomicity (not annotated)")
return nothing
end
"""
h5f_get_name(obj_id::hid_t, buf::Ptr{UInt8}, buf_size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Fget_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga0ed43dbe476a160b73f55127c7db797c).
"""
function h5f_get_name(obj_id, buf, buf_size)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_name, libhdf5), Cssize_t, (hid_t, Ptr{UInt8}, Csize_t), obj_id, buf, buf_size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting file name")
return var"#status#"
end
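# Illustrative sketch (not executed): sizing the name buffer with a first call
# before retrieving the file name, mirroring the C API's two-pass convention.
#
#     len = h5f_get_name(obj_id, C_NULL, 0)
#     buf = Vector{UInt8}(undef, len + 1)
#     h5f_get_name(obj_id, buf, length(buf))
#     filename = String(buf[1:len])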
"""
h5f_get_obj_count(file_id::hid_t, types::Cuint) -> Cssize_t
See `libhdf5` documentation for [`H5Fget_obj_count`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gadcdae0aca7c88064db0d32de7f1e31f2).
"""
function h5f_get_obj_count(file_id, types)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_obj_count, libhdf5), Cssize_t, (hid_t, Cuint), file_id, types)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting object count")
return var"#status#"
end
"""
h5f_get_obj_ids(file_id::hid_t, types::Cuint, max_objs::Csize_t, obj_id_list::Ptr{hid_t}) -> Cssize_t
See `libhdf5` documentation for [`H5Fget_obj_ids`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga35e72579bd07433162b80ddc0bd0c5b1).
"""
function h5f_get_obj_ids(file_id, types, max_objs, obj_id_list)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_obj_ids, libhdf5), Cssize_t, (hid_t, Cuint, Csize_t, Ptr{hid_t}), file_id, types, max_objs, obj_id_list)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting objects")
return var"#status#"
end
"""
h5f_get_page_buffering_stats(file_id::hid_t, accesses::Ptr{Cuint}, hits::Ptr{Cuint}, misses::Ptr{Cuint}, evictions::Ptr{Cuint}, bypasses::Ptr{Cuint})
See `libhdf5` documentation for [`H5Fget_page_buffering_stats`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga0663defe0143631f4292267c21e94202).
"""
function h5f_get_page_buffering_stats(file_id, accesses, hits, misses, evictions, bypasses)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_page_buffering_stats, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}, Ptr{Cuint}, Ptr{Cuint}, Ptr{Cuint}), file_id, accesses, hits, misses, evictions, bypasses)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_get_page_buffering_stats (not annotated)")
return nothing
end
"""
h5f_get_vfd_handle(file_id::hid_t, fapl_id::hid_t, file_handle::Ref{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Fget_vfd_handle`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gae4020a66fb8da0586e3b74c81ffccea4).
"""
function h5f_get_vfd_handle(file_id, fapl_id, file_handle)
lock(liblock)
var"#status#" = try
ccall((:H5Fget_vfd_handle, libhdf5), herr_t, (hid_t, hid_t, Ref{Ptr{Cvoid}}), file_id, fapl_id, file_handle)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting VFD handle")
return nothing
end
"""
h5f_increment_filesize(file_id::hid_t, increment::hsize_t)
See `libhdf5` documentation for [`H5Fincrement_filesize`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gadbe82c1f6e16c21062fabd20b0ffccd4).
"""
function h5f_increment_filesize(file_id, increment)
lock(liblock)
var"#status#" = try
ccall((:H5Fincrement_filesize, libhdf5), herr_t, (hid_t, hsize_t), file_id, increment)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_increment_filesize (not annotated)")
return nothing
end
"""
h5f_is_accessible(container_name::Cstring, fapl_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Fis_accessible`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga584471c3b98453b9b04a4bf9af847442).
"""
function h5f_is_accessible(container_name, fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fis_accessible, libhdf5), htri_t, (Cstring, hid_t), container_name, fapl_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error in h5f_is_accessible (not annotated)")
return var"#status#" > 0
end
"""
h5f_is_hdf5(pathname::Cstring) -> Bool
See `libhdf5` documentation for [`H5Fis_hdf5`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga6055c2ea3438bd4aaf221eba66843225).
"""
function h5f_is_hdf5(pathname)
lock(liblock)
var"#status#" = try
ccall((:H5Fis_hdf5, libhdf5), htri_t, (Cstring,), pathname)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Unable to access file $(pathname)")
return var"#status#" > 0
end
"""
h5f_mount(loc::hid_t, name::Cstring, child::hid_t, plist::hid_t)
See `libhdf5` documentation for [`H5Fmount`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga7c4865fd36ee25d839725252150bb53b).
"""
function h5f_mount(loc, name, child, plist)
lock(liblock)
var"#status#" = try
ccall((:H5Fmount, libhdf5), herr_t, (hid_t, Cstring, hid_t, hid_t), loc, name, child, plist)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_mount (not annotated)")
return nothing
end
"""
h5f_open(pathname::Cstring, flags::Cuint, fapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Fopen`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gaa3f4f877b9bb591f3880423ed2bf44bc).
"""
function h5f_open(pathname, flags, fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fopen, libhdf5), hid_t, (Cstring, Cuint, hid_t), pathname, flags, fapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error opening file $(pathname)")
return var"#status#"
end
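# Illustrative sketch (not executed): opening an existing file read-only after
# checking that it really is an HDF5 file. It assumes `H5F_ACC_RDONLY` and
# `H5P_DEFAULT` are the usual constants defined elsewhere in this module.
#
#     if h5f_is_hdf5("data.h5")
#         fid = h5f_open("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT)
#     end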
"""
h5f_reopen(file_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Freopen`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga3f213eb05c5419d63ba168c30036e47b).
"""
function h5f_reopen(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Freopen, libhdf5), hid_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5f_reopen (not annotated)")
return var"#status#"
end
"""
h5f_reset_mdc_hit_rate_stats(file_id::hid_t)
See `libhdf5` documentation for [`H5Freset_mdc_hit_rate_stats`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#ga6708886c2bb8740327d9078d7840197f).
"""
function h5f_reset_mdc_hit_rate_stats(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Freset_mdc_hit_rate_stats, libhdf5), herr_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_reset_mdc_hit_rate_stats (not annotated)")
return nothing
end
"""
h5f_reset_page_buffering_stats(file_id::hid_t)
See `libhdf5` documentation for [`H5Freset_page_buffering_stats`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga7ef1c0aab9a7a9112a8d0a788ec8696c).
"""
function h5f_reset_page_buffering_stats(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Freset_page_buffering_stats, libhdf5), herr_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_reset_page_buffering_stats (not annotated)")
return nothing
end
"""
h5f_set_dset_no_attrs_hint(file_id::hid_t, minimize::hbool_t)
See `libhdf5` documentation for [`H5Fset_dset_no_attrs_hint`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gafc0166070f920f037e6b1a5c66e5464c).
"""
function h5f_set_dset_no_attrs_hint(file_id, minimize)
lock(liblock)
var"#status#" = try
ccall((:H5Fset_dset_no_attrs_hint, libhdf5), herr_t, (hid_t, hbool_t), file_id, minimize)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in setting dataset no attributes hint")
return nothing
end
"""
h5f_set_libver_bounds(file_id::hid_t, low::H5F_libver_t, high::H5F_libver_t)
See `libhdf5` documentation for [`H5Fset_libver_bounds`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#ga4b833c33fe2e141a26b6f2ad559d3610).
"""
function h5f_set_libver_bounds(file_id, low, high)
lock(liblock)
var"#status#" = try
ccall((:H5Fset_libver_bounds, libhdf5), herr_t, (hid_t, H5F_libver_t, H5F_libver_t), file_id, low, high)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_set_libver_bounds (not annotated)")
return nothing
end
"""
h5f_set_mdc_config(file_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})
See `libhdf5` documentation for [`H5Fset_mdc_config`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#ga81bc06be69131484eb04d01511b9c8f8).
"""
function h5f_set_mdc_config(file_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Fset_mdc_config, libhdf5), herr_t, (hid_t, Ptr{H5AC_cache_config_t}), file_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_set_mdc_config (not annotated)")
return nothing
end
"""
h5f_set_mpi_atomicity(file_id::hid_t, flag::hbool_t)
See `libhdf5` documentation for [`H5Fset_mpi_atomicity`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_h5_f.html#ga087dbbf37cb5c451ae8a6f66b227f0fc).
"""
function h5f_set_mpi_atomicity(file_id, flag)
lock(liblock)
var"#status#" = try
ccall((:H5Fset_mpi_atomicity, libhdf5), herr_t, (hid_t, hbool_t), file_id, flag)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_set_mpi_atomicity (not annotated)")
return nothing
end
"""
h5f_start_mdc_logging(file_id::hid_t)
See `libhdf5` documentation for [`H5Fstart_mdc_logging`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#ga378fb5863071278b47070cf205f53e67).
"""
function h5f_start_mdc_logging(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fstart_mdc_logging, libhdf5), herr_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_start_mdc_logging (not annotated)")
return nothing
end
"""
h5f_start_swmr_write(id::hid_t)
See `libhdf5` documentation for [`H5Fstart_swmr_write`](https://docs.hdfgroup.org/hdf5/v1_14/group___s_w_m_r.html#ga159be34fbe7e4a959589310ef0196dfe).
"""
function h5f_start_swmr_write(id)
lock(liblock)
var"#status#" = try
ccall((:H5Fstart_swmr_write, libhdf5), herr_t, (hid_t,), id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error starting SWMR write")
return nothing
end
"""
h5f_stop_mdc_logging(file_id::hid_t)
See `libhdf5` documentation for [`H5Fstop_mdc_logging`](https://docs.hdfgroup.org/hdf5/v1_14/group___m_d_c.html#ga78627b23010f82002b837f4d312bf234).
"""
function h5f_stop_mdc_logging(file_id)
lock(liblock)
var"#status#" = try
ccall((:H5Fstop_mdc_logging, libhdf5), herr_t, (hid_t,), file_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_stop_mdc_logging (not annotated)")
return nothing
end
"""
h5f_unmount(loc::hid_t, name::Cstring)
See `libhdf5` documentation for [`H5Funmount`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_f.html#gae8f807d3f04a33f132ffb6c5295e897f).
"""
function h5f_unmount(loc, name)
lock(liblock)
var"#status#" = try
ccall((:H5Funmount, libhdf5), herr_t, (hid_t, Cstring), loc, name)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5f_unmount (not annotated)")
return nothing
end
"""
h5g_close(group_id::hid_t)
See `libhdf5` documentation for [`H5Gclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#ga8dbe20b390d2504f0bd3589ed8f4e221).
"""
function h5g_close(group_id)
lock(liblock)
var"#status#" = try
ccall((:H5Gclose, libhdf5), herr_t, (hid_t,), group_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing group")
return nothing
end
"""
h5g_create(loc_id::hid_t, pathname::Cstring, lcpl_id::hid_t, gcpl_id::hid_t, gapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Gcreate2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#ga86d93295965f750ef25dea2505a711d9).
"""
function h5g_create(loc_id, pathname, lcpl_id, gcpl_id, gapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Gcreate2, libhdf5), hid_t, (hid_t, Cstring, hid_t, hid_t, hid_t), loc_id, pathname, lcpl_id, gcpl_id, gapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error creating group $(h5i_get_name(loc_id))/$(pathname)")
return var"#status#"
end
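# Illustrative sketch (not executed): creating a group with default link,
# creation, and access property lists, then releasing its identifier.
#
#     gid = h5g_create(fid, "mygroup", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT)
#     h5g_close(gid)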
"""
h5g_get_create_plist(group_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Gget_create_plist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#ga0b959a53cbffa48f5d68ce33b43b7ed8).
"""
function h5g_get_create_plist(group_id)
lock(liblock)
var"#status#" = try
ccall((:H5Gget_create_plist, libhdf5), hid_t, (hid_t,), group_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting group create property list")
return var"#status#"
end
"""
h5g_get_info(group_id::hid_t, buf::Ptr{H5G_info_t})
See `libhdf5` documentation for [`H5Gget_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#gad4be126ab7bbf2001435e8e70089f3d3).
"""
function h5g_get_info(group_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Gget_info, libhdf5), herr_t, (hid_t, Ptr{H5G_info_t}), group_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting group info")
return nothing
end
"""
h5g_get_num_objs(loc_id::hid_t, num_obj::Ptr{hsize_t}) -> hid_t
See `libhdf5` documentation for [`H5Gget_num_objs`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#ga3e30142e15ccf9a08bfc91ca9925c14d).
"""
function h5g_get_num_objs(loc_id, num_obj)
lock(liblock)
var"#status#" = try
ccall((:H5Gget_num_objs, libhdf5), hid_t, (hid_t, Ptr{hsize_t}), loc_id, num_obj)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting group length")
return var"#status#"
end
"""
h5g_get_objname_by_idx(loc_id::hid_t, idx::hsize_t, pathname::Ptr{UInt8}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Gget_objname_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#ga80180e7b819d3c9b3b3f1895e9baaf5b).
"""
function h5g_get_objname_by_idx(loc_id, idx, pathname, size)
lock(liblock)
var"#status#" = try
ccall((:H5Gget_objname_by_idx, libhdf5), Cssize_t, (hid_t, hsize_t, Ptr{UInt8}, Csize_t), loc_id, idx, pathname, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting group object name $(h5i_get_name(loc_id))/$(pathname)")
return var"#status#"
end
"""
h5g_open(loc_id::hid_t, pathname::Cstring, gapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Gopen2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_g.html#gadab91e2dd7a7e253dcc0e4fe04b81403).
"""
function h5g_open(loc_id, pathname, gapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Gopen2, libhdf5), hid_t, (hid_t, Cstring, hid_t), loc_id, pathname, gapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error opening group $(h5i_get_name(loc_id))/$(pathname)")
return var"#status#"
end
"""
h5i_dec_ref(obj_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Idec_ref`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#gaea2aa78caea892edf2a6a6ac70486ed9).
"""
function h5i_dec_ref(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Idec_ref, libhdf5), Cint, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error decementing reference")
return Int(var"#status#")
end
"""
h5i_get_file_id(obj_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Iget_file_id`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#ga6ce32e88051e4cdf7d02439d86e5e042).
"""
function h5i_get_file_id(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Iget_file_id, libhdf5), hid_t, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting file identifier")
return var"#status#"
end
"""
h5i_get_name(obj_id::hid_t, buf::Ptr{UInt8}, buf_size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Iget_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#ga9c84a8dc29566b82b6d1ff7a6e6828f1).
"""
function h5i_get_name(obj_id, buf, buf_size)
lock(liblock)
var"#status#" = try
ccall((:H5Iget_name, libhdf5), Cssize_t, (hid_t, Ptr{UInt8}, Csize_t), obj_id, buf, buf_size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting object name")
return var"#status#"
end
"""
h5i_get_ref(obj_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Iget_ref`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#gac9638ade14cc75b7b125b3723f319c81).
"""
function h5i_get_ref(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Iget_ref, libhdf5), Cint, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting reference count")
return Int(var"#status#")
end
"""
h5i_get_type(obj_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Iget_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#ga4941435d4d64de3d7095d2316f415f2d).
"""
function h5i_get_type(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Iget_type, libhdf5), Cint, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting type")
return Int(var"#status#")
end
"""
h5i_inc_ref(obj_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Iinc_ref`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#ga3fd0c157704573965cafd6e1aa7f368f).
"""
function h5i_inc_ref(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Iinc_ref, libhdf5), Cint, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error incrementing identifier refcount")
return Int(var"#status#")
end
"""
h5i_is_valid(obj_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Iis_valid`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_i.html#ga20eb10c559d9ed5ba6f77b31d6a3ba9a).
"""
function h5i_is_valid(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Iis_valid, libhdf5), htri_t, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Cannot determine whether object is valid")
return var"#status#" > 0
end
"""
h5l_create_external(target_file_name::Cstring, target_obj_name::Cstring, link_loc_id::hid_t, link_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Lcreate_external`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga15dfaeb9b1c0b3136533cb97ee45e683).
"""
function h5l_create_external(target_file_name, target_obj_name, link_loc_id, link_name, lcpl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lcreate_external, libhdf5), herr_t, (Cstring, Cstring, hid_t, Cstring, hid_t, hid_t), target_file_name, target_obj_name, link_loc_id, link_name, lcpl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error creating external link ", link_name, " pointing to ", target_obj_name, " in file ", target_file_name))
return nothing
end
"""
h5l_create_hard(obj_loc_id::hid_t, obj_name::Cstring, link_loc_id::hid_t, link_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Lcreate_hard`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga69d50f7acdfd2f1dc7c4372397e63bd2).
"""
function h5l_create_hard(obj_loc_id, obj_name, link_loc_id, link_name, lcpl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lcreate_hard, libhdf5), herr_t, (hid_t, Cstring, hid_t, Cstring, hid_t, hid_t), obj_loc_id, obj_name, link_loc_id, link_name, lcpl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error creating hard link ", link_name, " pointing to ", obj_name))
return nothing
end
"""
h5l_create_soft(target_path::Cstring, link_loc_id::hid_t, link_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Lcreate_soft`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga894444623b58ce1ac3bd35538245ac78).
"""
function h5l_create_soft(target_path, link_loc_id, link_name, lcpl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lcreate_soft, libhdf5), herr_t, (Cstring, hid_t, Cstring, hid_t, hid_t), target_path, link_loc_id, link_name, lcpl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error creating soft link ", link_name, " pointing to ", target_path))
return nothing
end
"""
h5l_delete(obj_id::hid_t, pathname::Cstring, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Ldelete`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga5b4e7f59f5d4bdae94fd8ce6875295cf).
"""
function h5l_delete(obj_id, pathname, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Ldelete, libhdf5), herr_t, (hid_t, Cstring, hid_t), obj_id, pathname, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error deleting ", h5i_get_name(obj_id), "/", pathname))
return nothing
end
"""
h5l_move(src_obj_id::hid_t, src_name::Cstring, dest_obj_id::hid_t, dest_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Lmove`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga0bbc7f9bf25c8aca9dd8433a325c8acb).
"""
function h5l_move(src_obj_id, src_name, dest_obj_id, dest_name, lcpl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lmove, libhdf5), herr_t, (hid_t, Cstring, hid_t, Cstring, hid_t, hid_t), src_obj_id, src_name, dest_obj_id, dest_name, lcpl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error moving ", h5i_get_name(src_obj_id), "/", src_name, " to ", h5i_get_name(dest_obj_id), "/", dest_name))
return nothing
end
"""
h5l_exists(loc_id::hid_t, pathname::Cstring, lapl_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Lexists`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga171be6e41dc1a464edc402df0ebdf801).
"""
function h5l_exists(loc_id, pathname, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lexists, libhdf5), htri_t, (hid_t, Cstring, hid_t), loc_id, pathname, lapl_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error(string("Cannot determine whether ", pathname, " exists"))
return var"#status#" > 0
end
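# Illustrative sketch (not executed): guarding a dataset open on link
# existence. Note that for a nested path each intermediate link must exist;
# this sketch assumes a single-level name.
#
#     if h5l_exists(fid, "x", H5P_DEFAULT)
#         dset_id = h5d_open(fid, "x", H5P_DEFAULT)
#     end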
"""
h5l_get_info(link_loc_id::hid_t, link_name::Cstring, link_buf::Ptr{H5L_info_t}, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Lget_info`](https://docs.hdfgroup.org/hdf5/v1_14/).
"""
function h5l_get_info(link_loc_id, link_name, link_buf, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lget_info, libhdf5), herr_t, (hid_t, Cstring, Ptr{H5L_info_t}, hid_t), link_loc_id, link_name, link_buf, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error getting info for link ", link_name))
return nothing
end
"""
h5l_get_name_by_idx(loc_id::hid_t, group_name::Cstring, index_field::Cint, order::Cint, n::hsize_t, name::Ptr{UInt8}, size::Csize_t, lapl_id::hid_t) -> Cssize_t
See `libhdf5` documentation for [`H5Lget_name_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l.html#ga453ea40c3bb85ec8120dd17deed2bd90).
"""
function h5l_get_name_by_idx(loc_id, group_name, index_field, order, n, name, size, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Lget_name_by_idx, libhdf5), Cssize_t, (hid_t, Cstring, Cint, Cint, hsize_t, Ptr{UInt8}, Csize_t, hid_t), loc_id, group_name, index_field, order, n, name, size, lapl_id)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting object name")
return var"#status#"
end
@static if _libhdf5_build_ver < v"1.12"
@doc """
h5l_iterate(group_id::hid_t, idx_type::Cint, order::Cint, idx::Ptr{hsize_t}, op::Ptr{Cvoid}, op_data::Any)
See `libhdf5` documentation for [`H5Literate`](https://docs.hdfgroup.org/hdf5/v1_14/).
"""
function h5l_iterate(group_id, idx_type, order, idx, op, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Literate, libhdf5), herr_t, (hid_t, Cint, Cint, Ptr{hsize_t}, Ptr{Cvoid}, Any), group_id, idx_type, order, idx, op, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error iterating through links in group ", h5i_get_name(group_id)))
return nothing
end
end
@static if v"1.12" ≤ _libhdf5_build_ver
@doc """
h5l_iterate(group_id::hid_t, idx_type::Cint, order::Cint, idx::Ptr{hsize_t}, op::Ptr{Cvoid}, op_data::Any)
See `libhdf5` documentation for [`H5Literate1`](https://docs.hdfgroup.org/hdf5/v1_14/group___t_r_a_v.html#ga1e7c0a8cf17699563c02e128f27042f1).
"""
function h5l_iterate(group_id, idx_type, order, idx, op, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Literate1, libhdf5), herr_t, (hid_t, Cint, Cint, Ptr{hsize_t}, Ptr{Cvoid}, Any), group_id, idx_type, order, idx, op, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error iterating through links in group ", h5i_get_name(group_id)))
return nothing
end
end
"""
h5o_are_mdc_flushes_disabled(object_id::hid_t, are_disabled::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Oare_mdc_flushes_disabled`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gab2fa388aadd1ff154ee150cbb4884c1c).
"""
function h5o_are_mdc_flushes_disabled(object_id, are_disabled)
lock(liblock)
var"#status#" = try
ccall((:H5Oare_mdc_flushes_disabled, libhdf5), herr_t, (hid_t, Ptr{hbool_t}), object_id, are_disabled)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_are_mdc_flushes_disabled (not annotated)")
return nothing
end
"""
h5o_close(object_id::hid_t)
See `libhdf5` documentation for [`H5Oclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga545ad7c54987013ebd50b40fe9e73c61).
"""
function h5o_close(object_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oclose, libhdf5), herr_t, (hid_t,), object_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing object")
return nothing
end
"""
h5o_copy(src_loc_id::hid_t, src_name::Cstring, dst_loc_id::hid_t, dst_name::Cstring, ocpypl_id::hid_t, lcpl_id::hid_t)
See `libhdf5` documentation for [`H5Ocopy`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaa94449be6f67f499be5ddd3fc44f4225).
"""
function h5o_copy(src_loc_id, src_name, dst_loc_id, dst_name, ocpypl_id, lcpl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Ocopy, libhdf5), herr_t, (hid_t, Cstring, hid_t, Cstring, hid_t, hid_t), src_loc_id, src_name, dst_loc_id, dst_name, ocpypl_id, lcpl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error copying object ", h5i_get_name(src_loc_id), "/", src_name, " to ", h5i_get_name(dst_loc_id), "/", dst_name))
return nothing
end
"""
h5o_decr_refcount(object_id::hid_t)
See `libhdf5` documentation for [`H5Odecr_refcount`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga60c20da5e244c28a653d4fa23d316b44).
"""
function h5o_decr_refcount(object_id)
lock(liblock)
var"#status#" = try
ccall((:H5Odecr_refcount, libhdf5), herr_t, (hid_t,), object_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_decr_refcount (not annotated)")
return nothing
end
"""
h5o_disable_mdc_flushes(object_id::hid_t)
See `libhdf5` documentation for [`H5Odisable_mdc_flushes`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga0908be309da1fb4f771c1e264fac22ae).
"""
function h5o_disable_mdc_flushes(object_id)
lock(liblock)
var"#status#" = try
ccall((:H5Odisable_mdc_flushes, libhdf5), herr_t, (hid_t,), object_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_disable_mdc_flushes (not annotated)")
return nothing
end
"""
h5o_enable_mdc_flushes(object_id::hid_t)
See `libhdf5` documentation for [`H5Oenable_mdc_flushes`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga21014920bdabf6973e233796d7174156).
"""
function h5o_enable_mdc_flushes(object_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oenable_mdc_flushes, libhdf5), herr_t, (hid_t,), object_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_enable_mdc_flushes (not annotated)")
return nothing
end
"""
h5o_exists_by_name(loc_id::hid_t, name::Cstring, lapl_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Oexists_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gab0fef18d97844c4f83d412c5a22def7b).
"""
function h5o_exists_by_name(loc_id, name, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oexists_by_name, libhdf5), htri_t, (hid_t, Cstring, hid_t), loc_id, name, lapl_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error in h5o_exists_by_name (not annotated)")
return var"#status#" > 0
end
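# Usage sketch (assumption): unlike most wrappers here, `h5o_exists_by_name` converts the
# `htri_t` result to a `Bool`, so it can be used directly in a condition. With `fid` an open
# file identifier:
#
#     if h5o_exists_by_name(fid, "group/dataset", H5P_DEFAULT)
#         obj = h5o_open(fid, "group/dataset", H5P_DEFAULT)
#         # ... inspect the object ...
#         h5o_close(obj)
#     end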
"""
h5o_flush(obj_id::hid_t)
See `libhdf5` documentation for [`H5Oflush`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gad99f35048cba4534b6393214684f090f).
"""
function h5o_flush(obj_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oflush, libhdf5), herr_t, (hid_t,), obj_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_flush (not annotated)")
return nothing
end
"""
h5o_get_comment(obj_id::hid_t, comment::Ptr{Cchar}, bufsize::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Oget_comment`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaa1511ce5e2fe01ce7ea58f2f851d694b).
"""
function h5o_get_comment(obj_id, comment, bufsize)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_comment, libhdf5), Cssize_t, (hid_t, Ptr{Cchar}, Csize_t), obj_id, comment, bufsize)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5o_get_comment (not annotated)")
return var"#status#"
end
"""
h5o_get_comment_by_name(loc_id::hid_t, name::Cstring, comment::Ptr{Cchar}, bufsize::Csize_t, lapl_id::hid_t) -> Cssize_t
See `libhdf5` documentation for [`H5Oget_comment_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gae6d92d597c5a292d342a1bda91e41171).
"""
function h5o_get_comment_by_name(loc_id, name, comment, bufsize, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_comment_by_name, libhdf5), Cssize_t, (hid_t, Cstring, Ptr{Cchar}, Csize_t, hid_t), loc_id, name, comment, bufsize, lapl_id)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5o_get_comment_by_name (not annotated)")
return var"#status#"
end
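# Usage sketch (assumption): as with many string-returning libhdf5 calls, the comment length is
# queried first with a null buffer, then the call is repeated with a buffer of that size. With
# `oid` an open object identifier:
#
#     len = h5o_get_comment(oid, C_NULL, Csize_t(0))          # length without the NUL terminator
#     buf = Vector{Cchar}(undef, len + 1)
#     h5o_get_comment(oid, buf, Csize_t(len + 1))
#     comment = GC.@preserve buf unsafe_string(pointer(buf))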
@static if _libhdf5_build_ver < v"1.10.3"
@doc """
h5o_get_info(object_id::hid_t, buf::Ptr{H5O_info1_t})
See `libhdf5` documentation for [`H5Oget_info1`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaf3751684a6706e3ba49b863406011f80).
"""
function h5o_get_info(object_id, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info1, libhdf5), herr_t, (hid_t, Ptr{H5O_info1_t}), object_id, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting object info")
return nothing
end
end
@static if v"1.10.3" ≤ _libhdf5_build_ver < v"1.12.0"
@doc """
h5o_get_info(loc_id::hid_t, oinfo::Ptr{H5O_info1_t}, fields::Cuint)
See `libhdf5` documentation for [`H5Oget_info2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga06f896e14fe4fa940fbc2bc235e0cf74).
"""
function h5o_get_info(loc_id, oinfo, fields)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info2, libhdf5), herr_t, (hid_t, Ptr{H5O_info1_t}, Cuint), loc_id, oinfo, fields)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info2 (not annotated)")
return nothing
end
end
@static if v"1.12.0" ≤ _libhdf5_build_ver
@doc """
h5o_get_info(loc_id::hid_t, oinfo::Ptr{H5O_info2_t}, fields::Cuint)
See `libhdf5` documentation for [`H5Oget_info3`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaf0fbf7d780a1eefce920facadb198013).
"""
function h5o_get_info(loc_id, oinfo, fields)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info3, libhdf5), herr_t, (hid_t, Ptr{H5O_info2_t}, Cuint), loc_id, oinfo, fields)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info3 (not annotated)")
return nothing
end
end
@static if _libhdf5_build_ver < v"1.10.3"
@doc """
h5o_get_info_by_idx(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_info1_t}, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_info_by_idx1`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga7208d2cf198dcfc875603323841bffae).
"""
function h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info_by_idx1, libhdf5), herr_t, (hid_t, Cstring, H5_index_t, H5_iter_order_t, hsize_t, Ptr{H5O_info1_t}, hid_t), loc_id, group_name, idx_type, order, n, oinfo, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info_by_idx1 (not annotated)")
return nothing
end
end
@static if v"1.10.3" ≤ _libhdf5_build_ver < v"1.12.0"
@doc """
h5o_get_info_by_idx(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_info1_t}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_info_by_idx2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga85e15e65922874111da1a5efd5dd7bed).
"""
function h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info_by_idx2, libhdf5), herr_t, (hid_t, Cstring, H5_index_t, H5_iter_order_t, hsize_t, Ptr{H5O_info1_t}, Cuint, hid_t), loc_id, group_name, idx_type, order, n, oinfo, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info_by_idx2 (not annotated)")
return nothing
end
end
@static if v"1.12.0" ≤ _libhdf5_build_ver
@doc """
h5o_get_info_by_idx(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_info2_t}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_info_by_idx3`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gafa2f8884f7d3e7fd9b8549f5b59fd9eb).
"""
function h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info_by_idx3, libhdf5), herr_t, (hid_t, Cstring, H5_index_t, H5_iter_order_t, hsize_t, Ptr{H5O_info2_t}, Cuint, hid_t), loc_id, group_name, idx_type, order, n, oinfo, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info_by_idx3 (not annotated)")
return nothing
end
end
@static if _libhdf5_build_ver < v"1.10.3"
@doc """
h5o_get_info_by_name(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_info1_t}, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_info_by_name1`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga96ce408ffda805210844246904da2842).
"""
function h5o_get_info_by_name(loc_id, name, oinfo, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info_by_name1, libhdf5), herr_t, (hid_t, Cstring, Ptr{H5O_info1_t}, hid_t), loc_id, name, oinfo, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info_by_name1 (not annotated)")
return nothing
end
end
@static if v"1.10.3" ≤ _libhdf5_build_ver < v"1.12.0"
@doc """
h5o_get_info_by_name(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_info1_t}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_info_by_name2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga0090da86c086c1c63a5acfaed39a035e).
"""
function h5o_get_info_by_name(loc_id, name, oinfo, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info_by_name2, libhdf5), herr_t, (hid_t, Cstring, Ptr{H5O_info1_t}, Cuint, hid_t), loc_id, name, oinfo, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info_by_name2 (not annotated)")
return nothing
end
end
@static if v"1.12.0" ≤ _libhdf5_build_ver
@doc """
h5o_get_info_by_name(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_info2_t}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_info_by_name3`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gabb69c962999e027cef0079bbb1282199).
"""
function h5o_get_info_by_name(loc_id, name, oinfo, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_info_by_name3, libhdf5), herr_t, (hid_t, Cstring, Ptr{H5O_info2_t}, Cuint, hid_t), loc_id, name, oinfo, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_info_by_name3 (not annotated)")
return nothing
end
end
"""
h5o_get_native_info(loc_id::hid_t, oinfo::Ptr{H5O_native_info_t}, fields::Cuint)
See `libhdf5` documentation for [`H5Oget_native_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga677d99ab106e2032b991b75b75de0e46).
"""
function h5o_get_native_info(loc_id, oinfo, fields)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_native_info, libhdf5), herr_t, (hid_t, Ptr{H5O_native_info_t}, Cuint), loc_id, oinfo, fields)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_native_info (not annotated)")
return nothing
end
"""
h5o_get_native_info_by_idx(loc_id::hid_t, group_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, n::hsize_t, oinfo::Ptr{H5O_native_info_t}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_native_info_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gafa6570d8b0ef6e2aff75093e1f99f67e).
"""
function h5o_get_native_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_native_info_by_idx, libhdf5), herr_t, (hid_t, Cstring, H5_index_t, H5_iter_order_t, hsize_t, Ptr{H5O_native_info_t}, Cuint, hid_t), loc_id, group_name, idx_type, order, n, oinfo, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_native_info_by_idx (not annotated)")
return nothing
end
"""
h5o_get_native_info_by_name(loc_id::hid_t, name::Cstring, oinfo::Ptr{H5O_native_info_t}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oget_native_info_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga296ded21aeac3921fee07272353b8476).
"""
function h5o_get_native_info_by_name(loc_id, name, oinfo, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oget_native_info_by_name, libhdf5), herr_t, (hid_t, Cstring, Ptr{H5O_native_info_t}, Cuint, hid_t), loc_id, name, oinfo, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_get_native_info_by_name (not annotated)")
return nothing
end
"""
h5o_incr_refcount(object_id::hid_t)
See `libhdf5` documentation for [`H5Oincr_refcount`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga2086bad6c3cd2a711c306a48c093ff55).
"""
function h5o_incr_refcount(object_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oincr_refcount, libhdf5), herr_t, (hid_t,), object_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_incr_refcount (not annotated)")
return nothing
end
"""
h5o_link(obj_id::hid_t, new_loc_id::hid_t, new_name::Cstring, lcpl_id::hid_t, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Olink`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga2c97dd58e64b67d16325fceb7e02113f).
"""
function h5o_link(obj_id, new_loc_id, new_name, lcpl_id, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Olink, libhdf5), herr_t, (hid_t, hid_t, Cstring, hid_t, hid_t), obj_id, new_loc_id, new_name, lcpl_id, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_link (not annotated)")
return nothing
end
"""
h5o_open(loc_id::hid_t, pathname::Cstring, lapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Oopen`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga9f635f58c7ddf17f87c253bfbca08bc1).
"""
function h5o_open(loc_id, pathname, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oopen, libhdf5), hid_t, (hid_t, Cstring, hid_t), loc_id, pathname, lapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error opening object ", h5i_get_name(loc_id), "/", pathname))
return var"#status#"
end
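# Usage sketch (assumption): `h5o_open` returns a fresh object identifier that the caller must
# release with `h5o_close`. With `fid` an open file identifier:
#
#     obj = h5o_open(fid, "/group/dataset", H5P_DEFAULT)
#     try
#         # ... query the object, e.g. h5o_get_comment(obj, ...) ...
#     finally
#         h5o_close(obj)
#     end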
"""
h5o_open_by_addr(loc_id::hid_t, addr::haddr_t) -> hid_t
See `libhdf5` documentation for [`H5Oopen_by_addr`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga137f3823adab4daaaf8fe87b40453fa2).
"""
function h5o_open_by_addr(loc_id, addr)
lock(liblock)
var"#status#" = try
ccall((:H5Oopen_by_addr, libhdf5), hid_t, (hid_t, haddr_t), loc_id, addr)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error opening object by address")
return var"#status#"
end
"""
h5o_open_by_idx(loc_id::hid_t, group_name::Cstring, index_type::Cint, order::Cint, n::hsize_t, lapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Oopen_by_idx`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaeb66e5cbb3ca79890fc284a0b06762be).
"""
function h5o_open_by_idx(loc_id, group_name, index_type, order, n, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oopen_by_idx, libhdf5), hid_t, (hid_t, Cstring, Cint, Cint, hsize_t, hid_t), loc_id, group_name, index_type, order, n, lapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error opening object of index ", n))
return var"#status#"
end
"""
h5o_refresh(oid::hid_t)
See `libhdf5` documentation for [`H5Orefresh`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaf0318b68be9ab23a92b8a6bee0af9e2f).
"""
function h5o_refresh(oid)
lock(liblock)
var"#status#" = try
ccall((:H5Orefresh, libhdf5), herr_t, (hid_t,), oid)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_refresh (not annotated)")
return nothing
end
"""
h5o_set_comment(obj_id::hid_t, comment::Cstring)
See `libhdf5` documentation for [`H5Oset_comment`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga8b5cf8e916204e29616516046121f631).
"""
function h5o_set_comment(obj_id, comment)
lock(liblock)
var"#status#" = try
ccall((:H5Oset_comment, libhdf5), herr_t, (hid_t, Cstring), obj_id, comment)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_set_comment (not annotated)")
return nothing
end
"""
h5o_set_comment_by_name(loc_id::hid_t, name::Cstring, comment::Cstring, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Oset_comment_by_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gafeb5242de7f1080b5c19f4fe19784505).
"""
function h5o_set_comment_by_name(loc_id, name, comment, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Oset_comment_by_name, libhdf5), herr_t, (hid_t, Cstring, Cstring, hid_t), loc_id, name, comment, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_set_comment_by_name (not annotated)")
return nothing
end
"""
h5o_token_cmp(loc_id::hid_t, token1::Ptr{H5O_token_t}, token2::Ptr{H5O_token_t}, cmp_value::Ptr{Cint})
See `libhdf5` documentation for [`H5Otoken_cmp`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaeb8da4fbe62f8a3cd9146a7ac1093562).
"""
function h5o_token_cmp(loc_id, token1, token2, cmp_value)
lock(liblock)
var"#status#" = try
ccall((:H5Otoken_cmp, libhdf5), herr_t, (hid_t, Ptr{H5O_token_t}, Ptr{H5O_token_t}, Ptr{Cint}), loc_id, token1, token2, cmp_value)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_token_cmp (not annotated)")
return nothing
end
"""
h5o_token_from_str(loc_id::hid_t, token_str::Cstring, token::Ptr{H5O_token_t})
See `libhdf5` documentation for [`H5Otoken_from_str`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga5136c14b4e907f15007030d7a6d6cd24).
"""
function h5o_token_from_str(loc_id, token_str, token)
lock(liblock)
var"#status#" = try
ccall((:H5Otoken_from_str, libhdf5), herr_t, (hid_t, Cstring, Ptr{H5O_token_t}), loc_id, token_str, token)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_token_from_str (not annotated)")
return nothing
end
"""
h5o_token_to_str(loc_id::hid_t, token::Ptr{H5O_token_t}, token_str::Ptr{Ptr{Cchar}})
See `libhdf5` documentation for [`H5Otoken_to_str`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga2bdd7528090f7f2c4b361ab4cc7735f6).
"""
function h5o_token_to_str(loc_id, token, token_str)
lock(liblock)
var"#status#" = try
ccall((:H5Otoken_to_str, libhdf5), herr_t, (hid_t, Ptr{H5O_token_t}, Ptr{Ptr{Cchar}}), loc_id, token, token_str)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_token_to_str (not annotated)")
return nothing
end
@static if _libhdf5_build_ver < v"1.12.0"
@doc """
h5o_visit(obj_id::hid_t, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate1_t, op_data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Ovisit1`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga6efdb2a0a9fe9fe46695cc0f7bd993e7).
"""
function h5o_visit(obj_id, idx_type, order, op, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Ovisit1, libhdf5), herr_t, (hid_t, H5_index_t, H5_iter_order_t, H5O_iterate1_t, Ptr{Cvoid}), obj_id, idx_type, order, op, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_visit1 (not annotated)")
return nothing
end
end
@static if v"1.12.0" ≤ _libhdf5_build_ver
@doc """
h5o_visit(obj_id::hid_t, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate2_t, op_data::Ptr{Cvoid}, fields::Cuint)
See `libhdf5` documentation for [`H5Ovisit3`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga6d03115ae0e5e5b516bbf35bb492266a).
"""
function h5o_visit(obj_id, idx_type, order, op, op_data, fields)
lock(liblock)
var"#status#" = try
ccall((:H5Ovisit3, libhdf5), herr_t, (hid_t, H5_index_t, H5_iter_order_t, H5O_iterate2_t, Ptr{Cvoid}, Cuint), obj_id, idx_type, order, op, op_data, fields)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_visit3 (not annotated)")
return nothing
end
end
@static if _libhdf5_build_ver < v"1.12.0"
@doc """
h5o_visit_by_name(loc_id::hid_t, obj_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate1_t, op_data::Ptr{Cvoid}, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Ovisit_by_name1`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#gaffacf3bd66f4fe074099eae1c80914f2).
"""
function h5o_visit_by_name(loc_id, obj_name, idx_type, order, op, op_data, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Ovisit_by_name1, libhdf5), herr_t, (hid_t, Cstring, H5_index_t, H5_iter_order_t, H5O_iterate1_t, Ptr{Cvoid}, hid_t), loc_id, obj_name, idx_type, order, op, op_data, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_visit_by_name1 (not annotated)")
return nothing
end
end
@static if v"1.12.0" ≤ _libhdf5_build_ver
@doc """
h5o_visit_by_name(loc_id::hid_t, obj_name::Cstring, idx_type::H5_index_t, order::H5_iter_order_t, op::H5O_iterate2_t, op_data::Ptr{Cvoid}, fields::Cuint, lapl_id::hid_t)
See `libhdf5` documentation for [`H5Ovisit_by_name3`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_o.html#ga34815400b01df59c4dac19436124885a).
"""
function h5o_visit_by_name(loc_id, obj_name, idx_type, order, op, op_data, fields, lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Ovisit_by_name3, libhdf5), herr_t, (hid_t, Cstring, H5_index_t, H5_iter_order_t, H5O_iterate2_t, Ptr{Cvoid}, Cuint, hid_t), loc_id, obj_name, idx_type, order, op, op_data, fields, lapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5o_visit_by_name3 (not annotated)")
return nothing
end
end
"""
h5p_get(plist_id::hid_t, name::Cstring, value::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pget`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga40f1c3042011462c632844464a746db3).
"""
function h5p_get(plist_id, name, value)
lock(liblock)
var"#status#" = try
ccall((:H5Pget, libhdf5), herr_t, (hid_t, Cstring, Ptr{Cvoid}), plist_id, name, value)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get (not annotated)")
return nothing
end
"""
h5p_get_alignment(fapl_id::hid_t, threshold::Ref{hsize_t}, alignment::Ref{hsize_t})
See `libhdf5` documentation for [`H5Pget_alignment`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga6735afde382cfd746b92a1a3b0e6a2ab).
"""
function h5p_get_alignment(fapl_id, threshold, alignment)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_alignment, libhdf5), herr_t, (hid_t, Ref{hsize_t}, Ref{hsize_t}), fapl_id, threshold, alignment)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting alignment")
return nothing
end
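# Usage sketch (assumption): output arguments typed `Ref{...}` are filled in place and read back
# with `[]`. With `fapl` a file-access property list identifier:
#
#     threshold = Ref{hsize_t}(0)
#     alignment = Ref{hsize_t}(0)
#     h5p_get_alignment(fapl, threshold, alignment)
#     threshold[], alignment[]          # values written by libhdf5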
"""
h5p_get_alloc_time(plist_id::hid_t, alloc_time::Ptr{Cint})
See `libhdf5` documentation for [`H5Pget_alloc_time`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#gaf507a3efa5d1f37448baea089fc053d8).
"""
function h5p_get_alloc_time(plist_id, alloc_time)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_alloc_time, libhdf5), herr_t, (hid_t, Ptr{Cint}), plist_id, alloc_time)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting allocation timing")
return nothing
end
"""
h5p_get_append_flush(dapl_id::hid_t, dims::Cuint, boundary::Ptr{hsize_t}, func::Ptr{H5D_append_cb_t}, udata::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_append_flush`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#gacd6803640eebd20e408c330192b09fa6).
"""
function h5p_get_append_flush(dapl_id, dims, boundary, func, udata)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_append_flush, libhdf5), herr_t, (hid_t, Cuint, Ptr{hsize_t}, Ptr{H5D_append_cb_t}, Ptr{Ptr{Cvoid}}), dapl_id, dims, boundary, func, udata)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_append_flush (not annotated)")
return nothing
end
"""
h5p_get_attr_creation_order(plist_id::hid_t, crt_order_flags::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_attr_creation_order`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga2a54d1ff8d7a0d0e8d652f373c18bc37).
"""
function h5p_get_attr_creation_order(plist_id, crt_order_flags)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_attr_creation_order, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, crt_order_flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting attribute creation order")
return nothing
end
"""
h5p_get_attr_phase_change(plist_id::hid_t, max_compact::Ptr{Cuint}, min_dense::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_attr_phase_change`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#gaf7c57a6e78a4123f82450559623ab534).
"""
function h5p_get_attr_phase_change(plist_id, max_compact, min_dense)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_attr_phase_change, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}), plist_id, max_compact, min_dense)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_attr_phase_change (not annotated)")
return nothing
end
"""
h5p_get_btree_ratios(plist_id::hid_t, left::Ptr{Cdouble}, middle::Ptr{Cdouble}, right::Ptr{Cdouble})
See `libhdf5` documentation for [`H5Pget_btree_ratios`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga3f3df48ce44b6af7517654b23a37fa02).
"""
function h5p_get_btree_ratios(plist_id, left, middle, right)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_btree_ratios, libhdf5), herr_t, (hid_t, Ptr{Cdouble}, Ptr{Cdouble}, Ptr{Cdouble}), plist_id, left, middle, right)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_btree_ratios (not annotated)")
return nothing
end
"""
h5p_get_buffer(plist_id::hid_t, tconv::Ptr{Ptr{Cvoid}}, bkg::Ptr{Ptr{Cvoid}}) -> Csize_t
See `libhdf5` documentation for [`H5Pget_buffer`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga1278b9979cc833e77d699cc878c6dab4).
"""
function h5p_get_buffer(plist_id, tconv, bkg)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_buffer, libhdf5), Csize_t, (hid_t, Ptr{Ptr{Cvoid}}, Ptr{Ptr{Cvoid}}), plist_id, tconv, bkg)
finally
unlock(liblock)
end
var"#status#" == Csize_t(0) && @h5error("Error in h5p_get_buffer (not annotated)")
return var"#status#"
end
"""
h5p_get_cache(plist_id::hid_t, mdc_nelmts::Ptr{Cint}, rdcc_nslots::Ptr{Csize_t}, rdcc_nbytes::Ptr{Csize_t}, rdcc_w0::Ptr{Cdouble})
See `libhdf5` documentation for [`H5Pget_cache`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga9481a0b08d729ec68897d57db1827861).
"""
function h5p_get_cache(plist_id, mdc_nelmts, rdcc_nslots, rdcc_nbytes, rdcc_w0)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_cache, libhdf5), herr_t, (hid_t, Ptr{Cint}, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cdouble}), plist_id, mdc_nelmts, rdcc_nslots, rdcc_nbytes, rdcc_w0)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_cache (not annotated)")
return nothing
end
"""
h5p_get_char_encoding(plist_id::hid_t, encoding::Ref{Cint})
See `libhdf5` documentation for [`H5Pget_char_encoding`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_c_p_l.html#ga9b35ef9add6463997330e9b4b606603d).
"""
function h5p_get_char_encoding(plist_id, encoding)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_char_encoding, libhdf5), herr_t, (hid_t, Ref{Cint}), plist_id, encoding)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting char encoding")
return nothing
end
"""
h5p_get_chunk(plist_id::hid_t, n_dims::Cint, dims::Ptr{hsize_t}) -> Int
See `libhdf5` documentation for [`H5Pget_chunk`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga4ef814034f601f48ab1ed6db79b4354c).
"""
function h5p_get_chunk(plist_id, n_dims, dims)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_chunk, libhdf5), Cint, (hid_t, Cint, Ptr{hsize_t}), plist_id, n_dims, dims)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting chunk size")
return Int(var"#status#")
end
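# Usage sketch (assumption): the caller supplies a buffer of at least `n_dims` elements and the
# return value is the true chunk rank. With `dcpl` the creation property list of a chunked
# dataset of known rank `rank`:
#
#     dims = Vector{hsize_t}(undef, rank)
#     nd = h5p_get_chunk(dcpl, Cint(rank), dims)
#     chunk = dims[1:nd]                # chunk dimensions in C (row-major) order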
"""
h5p_get_chunk_cache(dapl_id::hid_t, rdcc_nslots::Ptr{Csize_t}, rdcc_nbytes::Ptr{Csize_t}, rdcc_w0::Ptr{Cdouble})
See `libhdf5` documentation for [`H5Pget_chunk_cache`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#gaeda015dfee4167cc60baab1d1f0560fe).
"""
function h5p_get_chunk_cache(dapl_id, rdcc_nslots, rdcc_nbytes, rdcc_w0)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_chunk_cache, libhdf5), herr_t, (hid_t, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cdouble}), dapl_id, rdcc_nslots, rdcc_nbytes, rdcc_w0)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_chunk_cache (not annotated)")
return nothing
end
"""
h5p_get_chunk_opts(plist_id::hid_t, opts::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_chunk_opts`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga6e8d8f6a14b79bd110e27666d95031cf).
"""
function h5p_get_chunk_opts(plist_id, opts)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_chunk_opts, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, opts)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_chunk_opts (not annotated)")
return nothing
end
"""
h5p_get_class(plist_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Pget_class`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r.html#ga9b230c1e85790f9f45c4ca2e79dd62c5).
"""
function h5p_get_class(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_class, libhdf5), hid_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_get_class (not annotated)")
return var"#status#"
end
"""
h5p_get_class_parent(pclass_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Pget_class_parent`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga89f228c04207992d93fc3f2dddd860a5).
"""
function h5p_get_class_parent(pclass_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_class_parent, libhdf5), hid_t, (hid_t,), pclass_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_get_class_parent (not annotated)")
return var"#status#"
end
"""
h5p_get_copy_object(plist_id::hid_t, copy_options::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_copy_object`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_y_p_l.html#gad81b509481ba53a1ef1ba3c7083fc295).
"""
function h5p_get_copy_object(plist_id, copy_options)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_copy_object, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, copy_options)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_copy_object (not annotated)")
return nothing
end
"""
h5p_get_core_write_tracking(fapl_id::hid_t, is_enabled::Ptr{hbool_t}, page_size::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_core_write_tracking`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga60ec714459a4911d28e46deb201f4f2e).
"""
function h5p_get_core_write_tracking(fapl_id, is_enabled, page_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_core_write_tracking, libhdf5), herr_t, (hid_t, Ptr{hbool_t}, Ptr{Csize_t}), fapl_id, is_enabled, page_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_core_write_tracking (not annotated)")
return nothing
end
"""
h5p_get_create_intermediate_group(lcpl_id::hid_t, crt_intermed_group::Ref{Cuint})
See `libhdf5` documentation for [`H5Pget_create_intermediate_group`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_c_p_l.html#gaf7db1b7ce19703f30f1827b7c899c3b0).
"""
function h5p_get_create_intermediate_group(lcpl_id, crt_intermed_group)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_create_intermediate_group, libhdf5), herr_t, (hid_t, Ref{Cuint}), lcpl_id, crt_intermed_group)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting create intermediate group property")
return nothing
end
"""
h5p_get_data_transform(plist_id::hid_t, expression::Ptr{Cchar}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Pget_data_transform`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga865b05218e704578fe8db0c9dec07b25).
"""
function h5p_get_data_transform(plist_id, expression, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_data_transform, libhdf5), Cssize_t, (hid_t, Ptr{Cchar}, Csize_t), plist_id, expression, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5p_get_data_transform (not annotated)")
return var"#status#"
end
"""
h5p_get_driver(plist_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5p_get_driver(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_driver, libhdf5), hid_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting driver identifier")
return var"#status#"
end
"""
h5p_get_driver_info(plist_id::hid_t) -> Ptr{Cvoid}
See `libhdf5` documentation for [`H5Pget_driver_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga1b072297fed53cd8586604e45c483a56).
"""
function h5p_get_driver_info(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_driver_info, libhdf5), Ptr{Cvoid}, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" == C_NULL && @h5error("Error getting driver info")
return var"#status#"
end
"""
h5p_get_dset_no_attrs_hint(dcpl_id::hid_t, minimize::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Pget_dset_no_attrs_hint`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga2fd4f0446a38186db8256cef4c97a970).
"""
function h5p_get_dset_no_attrs_hint(dcpl_id, minimize)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_dset_no_attrs_hint, libhdf5), herr_t, (hid_t, Ptr{hbool_t}), dcpl_id, minimize)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in getting dataset no attributes hint property")
return nothing
end
"""
h5p_get_dxpl_mpio(dxpl_id::hid_t, xfer_mode::Ptr{Cint})
See `libhdf5` documentation for [`H5Pget_dxpl_mpio`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gab66eca0259c33d575b4050eebfb6f2cd).
"""
function h5p_get_dxpl_mpio(dxpl_id, xfer_mode)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_dxpl_mpio, libhdf5), herr_t, (hid_t, Ptr{Cint}), dxpl_id, xfer_mode)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting MPIO transfer mode")
return nothing
end
"""
h5p_get_edc_check(plist_id::hid_t) -> H5Z_EDC_t
See `libhdf5` documentation for [`H5Pget_edc_check`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gabc2b1e2af542ac15ee1613f4f89117e1).
"""
function h5p_get_edc_check(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_edc_check, libhdf5), H5Z_EDC_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < H5Z_EDC_t(0) && @h5error("Error in h5p_get_edc_check (not annotated)")
return var"#status#"
end
"""
h5p_get_efile_prefix(dapl_id::hid_t, prefix::Ptr{UInt8}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Pget_efile_prefix`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga442647d48171db920c71a7baf6fdeee6).
"""
function h5p_get_efile_prefix(dapl_id, prefix, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_efile_prefix, libhdf5), Cssize_t, (hid_t, Ptr{UInt8}, Csize_t), dapl_id, prefix, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting external file prefix")
return var"#status#"
end
"""
h5p_get_elink_acc_flags(lapl_id::hid_t, flags::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_elink_acc_flags`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#gaf1357eb0940f171efecae06a9ed6155b).
"""
function h5p_get_elink_acc_flags(lapl_id, flags)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_elink_acc_flags, libhdf5), herr_t, (hid_t, Ptr{Cuint}), lapl_id, flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_elink_acc_flags (not annotated)")
return nothing
end
"""
h5p_get_elink_cb(lapl_id::hid_t, func::Ptr{H5L_elink_traverse_t}, op_data::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_elink_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#gacbf576bd8f7e63f3a91134b12d6b2d12).
"""
function h5p_get_elink_cb(lapl_id, func, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_elink_cb, libhdf5), herr_t, (hid_t, Ptr{H5L_elink_traverse_t}, Ptr{Ptr{Cvoid}}), lapl_id, func, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_elink_cb (not annotated)")
return nothing
end
"""
h5p_get_elink_fapl(lapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Pget_elink_fapl`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#ga2c2fe0a0396b9a0a02b28402e4ee108a).
"""
function h5p_get_elink_fapl(lapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_elink_fapl, libhdf5), hid_t, (hid_t,), lapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_get_elink_fapl (not annotated)")
return var"#status#"
end
"""
h5p_get_elink_file_cache_size(plist_id::hid_t, efc_size::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_elink_file_cache_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga4c9bcfff90f48bfefa2c25e551485923).
"""
function h5p_get_elink_file_cache_size(plist_id, efc_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_elink_file_cache_size, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, efc_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_elink_file_cache_size (not annotated)")
return nothing
end
"""
h5p_get_elink_prefix(plist_id::hid_t, prefix::Ptr{Cchar}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Pget_elink_prefix`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#ga7960f746797bcf35f70746cd644f8b5a).
"""
function h5p_get_elink_prefix(plist_id, prefix, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_elink_prefix, libhdf5), Cssize_t, (hid_t, Ptr{Cchar}, Csize_t), plist_id, prefix, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5p_get_elink_prefix (not annotated)")
return var"#status#"
end
"""
h5p_get_est_link_info(plist_id::hid_t, est_num_entries::Ptr{Cuint}, est_name_len::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_est_link_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#ga701867215546a345dea7b8e9cf7a1b61).
"""
function h5p_get_est_link_info(plist_id, est_num_entries, est_name_len)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_est_link_info, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}), plist_id, est_num_entries, est_name_len)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_est_link_info (not annotated)")
return nothing
end
"""
h5p_get_evict_on_close(fapl_id::hid_t, evict_on_close::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Pget_evict_on_close`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga12789fcfeaea073c13202e6401f404a6).
"""
function h5p_get_evict_on_close(fapl_id, evict_on_close)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_evict_on_close, libhdf5), herr_t, (hid_t, Ptr{hbool_t}), fapl_id, evict_on_close)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_evict_on_close (not annotated)")
return nothing
end
"""
h5p_get_external(plist::hid_t, idx::Cuint, name_size::Csize_t, name::Ptr{Cuchar}, offset::Ptr{off_t}, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_external`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga78253b80b6c86faf7ff0db135146521d).
"""
function h5p_get_external(plist, idx, name_size, name, offset, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_external, libhdf5), herr_t, (hid_t, Cuint, Csize_t, Ptr{Cuchar}, Ptr{off_t}, Ptr{hsize_t}), plist, idx, name_size, name, offset, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting external file properties")
return nothing
end
"""
h5p_get_external_count(plist::hid_t) -> Int
See `libhdf5` documentation for [`H5Pget_external_count`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga4c45d90845ea7627b6238f95168c41ce).
"""
function h5p_get_external_count(plist)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_external_count, libhdf5), Cint, (hid_t,), plist)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting external count")
return Int(var"#status#")
end
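# Usage sketch (assumption): `h5p_get_external_count` and `h5p_get_external` (defined above) can
# be combined to list the external files backing a dataset. With `dcpl` a dataset-creation
# property list identifier:
#
#     for i in 0:(h5p_get_external_count(dcpl) - 1)
#         namebuf = Vector{Cuchar}(undef, 256)
#         offset  = Ref{off_t}(0)
#         sz      = Ref{hsize_t}(0)
#         h5p_get_external(dcpl, Cuint(i), Csize_t(length(namebuf)), namebuf, offset, sz)
#         fname = GC.@preserve namebuf unsafe_string(pointer(namebuf))
#         @info "external segment" fname offset[] sz[]
#     end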
"""
h5p_get_family_offset(fapl_id::hid_t, offset::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_family_offset`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga14977eaaf6565ba871b575de3163f1b3).
"""
function h5p_get_family_offset(fapl_id, offset)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_family_offset, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), fapl_id, offset)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_family_offset (not annotated)")
return nothing
end
"""
h5p_get_fapl_core(fapl_id::hid_t, increment::Ptr{Csize_t}, backing_store::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Pget_fapl_core`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gac347d401cbb28fecd78c5f15ddb4c9c1).
"""
function h5p_get_fapl_core(fapl_id, increment, backing_store)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fapl_core, libhdf5), herr_t, (hid_t, Ptr{Csize_t}, Ptr{hbool_t}), fapl_id, increment, backing_store)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fapl_core (not annotated)")
return nothing
end
"""
h5p_get_fapl_family(fapl_id::hid_t, memb_size::Ptr{hsize_t}, memb_fapl_id::Ptr{hid_t})
See `libhdf5` documentation for [`H5Pget_fapl_family`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga0bc1a003f26bf4b53e4487b6ca117389).
"""
function h5p_get_fapl_family(fapl_id, memb_size, memb_fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fapl_family, libhdf5), herr_t, (hid_t, Ptr{hsize_t}, Ptr{hid_t}), fapl_id, memb_size, memb_fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fapl_family (not annotated)")
return nothing
end
"""
h5p_get_fapl_hdfs(fapl_id::hid_t, fa_out::Ptr{H5FD_hdfs_fapl_t})
See `libhdf5` documentation for [`H5Pget_fapl_hdfs`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gae59e7d8e0e8823e6dd6034b66418ed00).
"""
function h5p_get_fapl_hdfs(fapl_id, fa_out)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fapl_hdfs, libhdf5), herr_t, (hid_t, Ptr{H5FD_hdfs_fapl_t}), fapl_id, fa_out)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fapl_hdfs (not annotated)")
return nothing
end
"""
h5p_get_fapl_multi(fapl_id::hid_t, memb_map::Ptr{H5FD_mem_t}, memb_fapl::Ptr{hid_t}, memb_name::Ptr{Ptr{Cchar}}, memb_addr::Ptr{haddr_t}, relax::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Pget_fapl_multi`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga765b7880795a139f3b567743ac88c3c7).
"""
function h5p_get_fapl_multi(fapl_id, memb_map, memb_fapl, memb_name, memb_addr, relax)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fapl_multi, libhdf5), herr_t, (hid_t, Ptr{H5FD_mem_t}, Ptr{hid_t}, Ptr{Ptr{Cchar}}, Ptr{haddr_t}, Ptr{hbool_t}), fapl_id, memb_map, memb_fapl, memb_name, memb_addr, relax)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fapl_multi (not annotated)")
return nothing
end
"""
h5p_get_fapl_splitter(fapl_id::hid_t, config_ptr::Ptr{H5FD_splitter_vfd_config_t})
See `libhdf5` documentation for [`H5Pget_fapl_splitter`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaf6ac1c131acee33dfb878593dfefb4ac).
"""
function h5p_get_fapl_splitter(fapl_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fapl_splitter, libhdf5), herr_t, (hid_t, Ptr{H5FD_splitter_vfd_config_t}), fapl_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fapl_splitter (not annotated)")
return nothing
end
"""
h5p_get_fapl_ros3(fapl_id::hid_t, fa_out::Ptr{H5FD_ros3_fapl_t})
See `libhdf5` documentation for [`H5Pget_fapl_ros3`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga13e273711e160cbd58e60c701b4f50e6).
"""
function h5p_get_fapl_ros3(fapl_id, fa_out)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fapl_ros3, libhdf5), herr_t, (hid_t, Ptr{H5FD_ros3_fapl_t}), fapl_id, fa_out)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in getting ros3 properties")
return nothing
end
"""
h5p_get_fclose_degree(fapl_id::hid_t, fc_degree::Ref{Cint})
See `libhdf5` documentation for [`H5Pget_fclose_degree`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga41da04bb4f823ba9f7d6c57dc8fe2878).
"""
function h5p_get_fclose_degree(fapl_id, fc_degree)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fclose_degree, libhdf5), herr_t, (hid_t, Ref{Cint}), fapl_id, fc_degree)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting close degree")
return nothing
end
"""
h5p_get_file_image(fapl_id::hid_t, buf_ptr_ptr::Ptr{Ptr{Cvoid}}, buf_len_ptr::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_file_image`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga337626cc516d5d1e3303ea6bc350e56b).
"""
function h5p_get_file_image(fapl_id, buf_ptr_ptr, buf_len_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_file_image, libhdf5), herr_t, (hid_t, Ptr{Ptr{Cvoid}}, Ptr{Csize_t}), fapl_id, buf_ptr_ptr, buf_len_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_file_image (not annotated)")
return nothing
end
"""
h5p_get_file_image_callbacks(fapl_id::hid_t, callbacks_ptr::Ptr{H5FD_file_image_callbacks_t})
See `libhdf5` documentation for [`H5Pget_file_image_callbacks`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gae17e38082dfdbadd75c897f1e6a9096e).
"""
function h5p_get_file_image_callbacks(fapl_id, callbacks_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_file_image_callbacks, libhdf5), herr_t, (hid_t, Ptr{H5FD_file_image_callbacks_t}), fapl_id, callbacks_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_file_image_callbacks (not annotated)")
return nothing
end
"""
h5p_get_file_locking(fapl_id::hid_t, use_file_locking::Ptr{hbool_t}, ignore_when_disabled::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Pget_file_locking`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga5de19a5a8ac23ca417aa2d49d708dc2d).
"""
function h5p_get_file_locking(fapl_id, use_file_locking, ignore_when_disabled)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_file_locking, libhdf5), herr_t, (hid_t, Ptr{hbool_t}, Ptr{hbool_t}), fapl_id, use_file_locking, ignore_when_disabled)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_file_locking (not annotated)")
return nothing
end
"""
h5p_get_file_space(plist_id::hid_t, strategy::Ptr{H5F_file_space_type_t}, threshold::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_file_space`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga9873dad32f2be5b4bb41497e2fbf5619).
"""
function h5p_get_file_space(plist_id, strategy, threshold)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_file_space, libhdf5), herr_t, (hid_t, Ptr{H5F_file_space_type_t}, Ptr{hsize_t}), plist_id, strategy, threshold)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_file_space (not annotated)")
return nothing
end
"""
h5p_get_file_space_page_size(plist_id::hid_t, fsp_size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_file_space_page_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#gaab5e8c08e4f588e0af1d937fcebfc885).
"""
function h5p_get_file_space_page_size(plist_id, fsp_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_file_space_page_size, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), plist_id, fsp_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_file_space_page_size (not annotated)")
return nothing
end
"""
h5p_get_file_space_strategy(plist_id::hid_t, strategy::Ptr{H5F_fspace_strategy_t}, persist::Ptr{hbool_t}, threshold::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_file_space_strategy`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga54cf6ca4f897ba9ee3695a15fe8e6029).
"""
function h5p_get_file_space_strategy(plist_id, strategy, persist, threshold)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_file_space_strategy, libhdf5), herr_t, (hid_t, Ptr{H5F_fspace_strategy_t}, Ptr{hbool_t}, Ptr{hsize_t}), plist_id, strategy, persist, threshold)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_file_space_strategy (not annotated)")
return nothing
end
"""
h5p_get_fill_time(plist_id::hid_t, fill_time::Ptr{H5D_fill_time_t})
See `libhdf5` documentation for [`H5Pget_fill_time`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga92c5eb5ee19bfd4a9184cf0428d1b00c).
"""
function h5p_get_fill_time(plist_id, fill_time)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fill_time, libhdf5), herr_t, (hid_t, Ptr{H5D_fill_time_t}), plist_id, fill_time)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fill_time (not annotated)")
return nothing
end
"""
h5p_get_fill_value(plist_id::hid_t, type_id::hid_t, value::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pget_fill_value`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga82bbe8c77c7eb9c460bfd1eb26881355).
"""
function h5p_get_fill_value(plist_id, type_id, value)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_fill_value, libhdf5), herr_t, (hid_t, hid_t, Ptr{Cvoid}), plist_id, type_id, value)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_fill_value (not annotated)")
return nothing
end
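# Usage sketch (assumption): the fill value is written into caller-provided memory whose layout
# is described by `type_id`. For a `Float64` fill value, using the native double datatype
# identifier (assumed available as `H5T_NATIVE_DOUBLE`) and `dcpl` the dataset-creation
# property list:
#
#     fill = Ref{Float64}(0.0)
#     h5p_get_fill_value(dcpl, H5T_NATIVE_DOUBLE, fill)
#     fill[]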
"""
h5p_get_filter(plist_id::hid_t, idx::Cuint, flags::Ptr{Cuint}, cd_nelmts::Ref{Csize_t}, cd_values::Ptr{Cuint}, namelen::Csize_t, name::Ptr{Cchar}, filter_config::Ptr{Cuint}) -> H5Z_filter_t
See `libhdf5` documentation for [`H5Pget_filter2`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga024d200a6a07e12f008a62c4e62d0bcc).
"""
function h5p_get_filter(plist_id, idx, flags, cd_nelmts, cd_values, namelen, name, filter_config)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_filter2, libhdf5), H5Z_filter_t, (hid_t, Cuint, Ptr{Cuint}, Ref{Csize_t}, Ptr{Cuint}, Csize_t, Ptr{Cchar}, Ptr{Cuint}), plist_id, idx, flags, cd_nelmts, cd_values, namelen, name, filter_config)
finally
unlock(liblock)
end
var"#status#" < H5Z_filter_t(0) && @h5error("Error getting filter")
return var"#status#"
end
"""
h5p_get_filter_by_id(plist_id::hid_t, filter_id::H5Z_filter_t, flags::Ref{Cuint}, cd_nelmts::Ref{Csize_t}, cd_values::Ptr{Cuint}, namelen::Csize_t, name::Ptr{UInt8}, filter_config::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_filter_by_id2`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga2d5e9df5f0e93abae11ee5edd82fcec3).
"""
function h5p_get_filter_by_id(plist_id, filter_id, flags, cd_nelmts, cd_values, namelen, name, filter_config)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_filter_by_id2, libhdf5), herr_t, (hid_t, H5Z_filter_t, Ref{Cuint}, Ref{Csize_t}, Ptr{Cuint}, Csize_t, Ptr{UInt8}, Ptr{Cuint}), plist_id, filter_id, flags, cd_nelmts, cd_values, namelen, name, filter_config)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting filter ID")
return nothing
end
"""
h5p_get_gc_references(fapl_id::hid_t, gc_ref::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_gc_references`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaaa81d8427b419d80eff6e1d216d99b71).
"""
function h5p_get_gc_references(fapl_id, gc_ref)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_gc_references, libhdf5), herr_t, (hid_t, Ptr{Cuint}), fapl_id, gc_ref)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_gc_references (not annotated)")
return nothing
end
"""
h5p_get_hyper_vector_size(fapl_id::hid_t, size::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_hyper_vector_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gaa55e7a6dd26a8df51b331febfeeb376b).
"""
function h5p_get_hyper_vector_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_hyper_vector_size, libhdf5), herr_t, (hid_t, Ptr{Csize_t}), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_hyper_vector_size (not annotated)")
return nothing
end
"""
h5p_get_istore_k(plist_id::hid_t, ik::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_istore_k`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga2179b032be5d2efbca63d8f82a292ec1).
"""
function h5p_get_istore_k(plist_id, ik)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_istore_k, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, ik)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_istore_k (not annotated)")
return nothing
end
"""
h5p_get_layout(plist_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Pget_layout`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga655530b0f40990507fedeef6b3068db3).
"""
function h5p_get_layout(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_layout, libhdf5), Cint, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error(string("Error getting layout"))
return Int(var"#status#")
end
"""
h5p_get_libver_bounds(fapl_id::hid_t, low::Ref{Cint}, high::Ref{Cint})
See `libhdf5` documentation for [`H5Pget_libver_bounds`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gad5d7e671c3a06bcee64bc25841aaf607).
"""
function h5p_get_libver_bounds(fapl_id, low, high)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_libver_bounds, libhdf5), herr_t, (hid_t, Ref{Cint}, Ref{Cint}), fapl_id, low, high)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting library version bounds")
return nothing
end
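# Usage sketch (assumption): the bounds come back as the integer values of the `H5F_libver_t`
# enum. With `fapl` a file-access property list identifier:
#
#     low, high = Ref{Cint}(0), Ref{Cint}(0)
#     h5p_get_libver_bounds(fapl, low, high)
#     low[], high[]                     # e.g. 0 corresponds to H5F_LIBVER_EARLIEST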
"""
h5p_get_link_creation_order(plist_id::hid_t, crt_order_flags::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_link_creation_order`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#gaa2c2f433c7e65f694e0444e7f0ed2d33).
"""
function h5p_get_link_creation_order(plist_id, crt_order_flags)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_link_creation_order, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, crt_order_flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting link creation order")
return nothing
end
"""
h5p_get_link_phase_change(plist_id::hid_t, max_compact::Ptr{Cuint}, min_dense::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_link_phase_change`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#gacab66461dca6c2beafd624c2e4d9f94d).
"""
function h5p_get_link_phase_change(plist_id, max_compact, min_dense)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_link_phase_change, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}), plist_id, max_compact, min_dense)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_link_phase_change (not annotated)")
return nothing
end
"""
h5p_get_local_heap_size_hint(plist_id::hid_t, size_hint::Ref{Csize_t})
See `libhdf5` documentation for [`H5Pget_local_heap_size_hint`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#ga49e14718767fa160248e3852c2abdd74).
"""
function h5p_get_local_heap_size_hint(plist_id, size_hint)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_local_heap_size_hint, libhdf5), herr_t, (hid_t, Ref{Csize_t}), plist_id, size_hint)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting local heap size hint")
return nothing
end
"""
h5p_get_mcdt_search_cb(plist_id::hid_t, func::Ptr{H5O_mcdt_search_cb_t}, op_data::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_mcdt_search_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_y_p_l.html#ga5d7b82394d37bda28769a0435300d396).
"""
function h5p_get_mcdt_search_cb(plist_id, func, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_mcdt_search_cb, libhdf5), herr_t, (hid_t, Ptr{H5O_mcdt_search_cb_t}, Ptr{Ptr{Cvoid}}), plist_id, func, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_mcdt_search_cb (not annotated)")
return nothing
end
"""
h5p_get_mdc_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})
See `libhdf5` documentation for [`H5Pget_mdc_config`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga3012f7f3310c7d25ada7617896bef1ee).
"""
function h5p_get_mdc_config(plist_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_mdc_config, libhdf5), herr_t, (hid_t, Ptr{H5AC_cache_config_t}), plist_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_mdc_config (not annotated)")
return nothing
end
"""
h5p_get_mdc_image_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_image_config_t})
See `libhdf5` documentation for [`H5Pget_mdc_image_config`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaaa18d59ee9efb12626410b1638f76f00).
"""
function h5p_get_mdc_image_config(plist_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_mdc_image_config, libhdf5), herr_t, (hid_t, Ptr{H5AC_cache_image_config_t}), plist_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_mdc_image_config (not annotated)")
return nothing
end
"""
h5p_get_mdc_log_options(plist_id::hid_t, is_enabled::Ptr{hbool_t}, location::Ptr{Cchar}, location_size::Ptr{Csize_t}, start_on_access::Ptr{hbool_t})
See `libhdf5` documentation for [`H5Pget_mdc_log_options`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaa3a1ca6e294cc5074933239cc3d0e4a3).
"""
function h5p_get_mdc_log_options(plist_id, is_enabled, location, location_size, start_on_access)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_mdc_log_options, libhdf5), herr_t, (hid_t, Ptr{hbool_t}, Ptr{Cchar}, Ptr{Csize_t}, Ptr{hbool_t}), plist_id, is_enabled, location, location_size, start_on_access)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_mdc_log_options (not annotated)")
return nothing
end
"""
h5p_get_meta_block_size(fapl_id::hid_t, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_meta_block_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gac17861181246af0209c0da5209305461).
"""
function h5p_get_meta_block_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_meta_block_size, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_meta_block_size (not annotated)")
return nothing
end
"""
h5p_get_metadata_read_attempts(plist_id::hid_t, attempts::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_metadata_read_attempts`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga670948d56435920f1e1c2e88b823935e).
"""
function h5p_get_metadata_read_attempts(plist_id, attempts)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_metadata_read_attempts, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, attempts)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_metadata_read_attempts (not annotated)")
return nothing
end
"""
h5p_get_multi_type(fapl_id::hid_t, type::Ptr{H5FD_mem_t})
See `libhdf5` documentation for [`H5Pget_multi_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga251515e9fee4641037b4866a4f7c49fe).
"""
function h5p_get_multi_type(fapl_id, type)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_multi_type, libhdf5), herr_t, (hid_t, Ptr{H5FD_mem_t}), fapl_id, type)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_multi_type (not annotated)")
return nothing
end
"""
h5p_get_nfilters(plist_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Pget_nfilters`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#gacbad1ca36a61246b439a25f28e7575fb).
"""
function h5p_get_nfilters(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_nfilters, libhdf5), Cint, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting nfilters")
return Int(var"#status#")
end
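# Illustrative usage sketch: unlike the `Ref`-based getters, this wrapper returns
# its result directly, so (assuming `dcpl` is a dataset-creation property list id
# obtained elsewhere) the number of filters in the pipeline is simply:
#
#     nf = h5p_get_nfilters(dcpl)   # ::Int, e.g. 0 for an unfiltered dataset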
"""
h5p_get_nlinks(plist_id::hid_t, nlinks::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_nlinks`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#ga6bfa33fa9a77011cbdc06d0fbc907177).
"""
function h5p_get_nlinks(plist_id, nlinks)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_nlinks, libhdf5), herr_t, (hid_t, Ptr{Csize_t}), plist_id, nlinks)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_nlinks (not annotated)")
return nothing
end
"""
h5p_get_nprops(id::hid_t, nprops::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_nprops`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga13f41512715a8019e89529ea093c2c43).
"""
function h5p_get_nprops(id, nprops)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_nprops, libhdf5), herr_t, (hid_t, Ptr{Csize_t}), id, nprops)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_nprops (not annotated)")
return nothing
end
"""
h5p_get_obj_track_times(plist_id::hid_t, track_times::Ref{UInt8})
See `libhdf5` documentation for [`H5Pget_obj_track_times`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#gad99400915d340da978dd6ac5676122c6).
"""
function h5p_get_obj_track_times(plist_id, track_times)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_obj_track_times, libhdf5), herr_t, (hid_t, Ref{UInt8}), plist_id, track_times)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting object time tracking")
return nothing
end
"""
h5p_get_object_flush_cb(plist_id::hid_t, func::Ptr{H5F_flush_cb_t}, udata::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_object_flush_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gadb66d434fd8d2f600213b0eec539564e).
"""
function h5p_get_object_flush_cb(plist_id, func, udata)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_object_flush_cb, libhdf5), herr_t, (hid_t, Ptr{H5F_flush_cb_t}, Ptr{Ptr{Cvoid}}), plist_id, func, udata)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_object_flush_cb (not annotated)")
return nothing
end
"""
h5p_get_page_buffer_size(plist_id::hid_t, buf_size::Ptr{Csize_t}, min_meta_perc::Ptr{Cuint}, min_raw_perc::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_page_buffer_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga0da11baf31cf424d053aa7952c933d98).
"""
function h5p_get_page_buffer_size(plist_id, buf_size, min_meta_perc, min_raw_perc)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_page_buffer_size, libhdf5), herr_t, (hid_t, Ptr{Csize_t}, Ptr{Cuint}, Ptr{Cuint}), plist_id, buf_size, min_meta_perc, min_raw_perc)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_page_buffer_size (not annotated)")
return nothing
end
"""
h5p_get_preserve(plist_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Pget_preserve`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gacca1a094d70c3b2277175145142fda10).
"""
function h5p_get_preserve(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_preserve, libhdf5), Cint, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error in h5p_get_preserve (not annotated)")
return Int(var"#status#")
end
"""
h5p_get_shared_mesg_index(plist_id::hid_t, index_num::Cuint, mesg_type_flags::Ptr{Cuint}, min_mesg_size::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_shared_mesg_index`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#gac6bac4446c45d348c953b3afdecede2c).
"""
function h5p_get_shared_mesg_index(plist_id, index_num, mesg_type_flags, min_mesg_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_shared_mesg_index, libhdf5), herr_t, (hid_t, Cuint, Ptr{Cuint}, Ptr{Cuint}), plist_id, index_num, mesg_type_flags, min_mesg_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_shared_mesg_index (not annotated)")
return nothing
end
"""
h5p_get_shared_mesg_nindexes(plist_id::hid_t, nindexes::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_shared_mesg_nindexes`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga30980db1814a251e7b40362af1006652).
"""
function h5p_get_shared_mesg_nindexes(plist_id, nindexes)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_shared_mesg_nindexes, libhdf5), herr_t, (hid_t, Ptr{Cuint}), plist_id, nindexes)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_shared_mesg_nindexes (not annotated)")
return nothing
end
"""
h5p_get_shared_mesg_phase_change(plist_id::hid_t, max_list::Ptr{Cuint}, min_btree::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_shared_mesg_phase_change`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#gab013e791706b44f545a97096d8e4c72e).
"""
function h5p_get_shared_mesg_phase_change(plist_id, max_list, min_btree)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_shared_mesg_phase_change, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}), plist_id, max_list, min_btree)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_shared_mesg_phase_change (not annotated)")
return nothing
end
"""
h5p_get_sieve_buf_size(fapl_id::hid_t, size::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_sieve_buf_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gac2321d0c34bb2b3cf33cd7bf02ca8e66).
"""
function h5p_get_sieve_buf_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_sieve_buf_size, libhdf5), herr_t, (hid_t, Ptr{Csize_t}), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_sieve_buf_size (not annotated)")
return nothing
end
"""
h5p_get_size(id::hid_t, name::Ptr{Cchar}, size::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#gaaad1c5ad2069145e2f5397ce4ab3a93c).
"""
function h5p_get_size(id, name, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_size, libhdf5), herr_t, (hid_t, Ptr{Cchar}, Ptr{Csize_t}), id, name, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_size (not annotated)")
return nothing
end
"""
h5p_get_sizes(plist_id::hid_t, sizeof_addr::Ptr{Csize_t}, sizeof_size::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_sizes`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga8da25b0367cf226c2888141661fd7a2d).
"""
function h5p_get_sizes(plist_id, sizeof_addr, sizeof_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_sizes, libhdf5), herr_t, (hid_t, Ptr{Csize_t}, Ptr{Csize_t}), plist_id, sizeof_addr, sizeof_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_sizes (not annotated)")
return nothing
end
"""
h5p_get_small_data_block_size(fapl_id::hid_t, size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_small_data_block_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga6896bea06d7744b56e22347f572f5470).
"""
function h5p_get_small_data_block_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_small_data_block_size, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_small_data_block_size (not annotated)")
return nothing
end
"""
h5p_get_sym_k(plist_id::hid_t, ik::Ptr{Cuint}, lk::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_sym_k`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga1d4ee26c030ced6d7a314543578c88b1).
"""
function h5p_get_sym_k(plist_id, ik, lk)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_sym_k, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}), plist_id, ik, lk)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_sym_k (not annotated)")
return nothing
end
"""
h5p_get_type_conv_cb(dxpl_id::hid_t, op::Ptr{H5T_conv_except_func_t}, operate_data::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_type_conv_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gae8c8557d303fa914b569da0182284e89).
"""
function h5p_get_type_conv_cb(dxpl_id, op, operate_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_type_conv_cb, libhdf5), herr_t, (hid_t, Ptr{H5T_conv_except_func_t}, Ptr{Ptr{Cvoid}}), dxpl_id, op, operate_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_type_conv_cb (not annotated)")
return nothing
end
"""
h5p_get_userblock(plist_id::hid_t, len::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_userblock`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga75b312bb0c70419fc428d743a65bed86).
"""
function h5p_get_userblock(plist_id, len)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_userblock, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), plist_id, len)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting userblock")
return nothing
end
"""
h5p_get_version(plist_id::hid_t, boot::Ptr{Cuint}, freelist::Ptr{Cuint}, stab::Ptr{Cuint}, shhdr::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pget_version`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga99c0afbb68e8e775ae70cac44404a534).
"""
function h5p_get_version(plist_id, boot, freelist, stab, shhdr)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_version, libhdf5), herr_t, (hid_t, Ptr{Cuint}, Ptr{Cuint}, Ptr{Cuint}, Ptr{Cuint}), plist_id, boot, freelist, stab, shhdr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_version (not annotated)")
return nothing
end
"""
h5p_get_virtual_count(dcpl_id::hid_t, count::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pget_virtual_count`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga83dcce1ce110d1ff6eae0fb77d4a7c85).
"""
function h5p_get_virtual_count(dcpl_id, count)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_count, libhdf5), herr_t, (hid_t, Ptr{Csize_t}), dcpl_id, count)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_virtual_count (not annotated)")
return nothing
end
"""
h5p_get_virtual_dsetname(dcpl_id::hid_t, index::Csize_t, name::Ptr{Cchar}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Pget_virtual_dsetname`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#gaf50620fd5d83dc9ca1e5c3f374c5a952).
"""
function h5p_get_virtual_dsetname(dcpl_id, index, name, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_dsetname, libhdf5), Cssize_t, (hid_t, Csize_t, Ptr{Cchar}, Csize_t), dcpl_id, index, name, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5p_get_virtual_dsetname (not annotated)")
return var"#status#"
end
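# Illustrative sketch of the usual libhdf5 two-call string pattern (assuming `dcpl`
# is the dataset-creation property list of a virtual dataset, obtained elsewhere):
# first query the name length with a NULL buffer, then allocate and fetch it.
#
#     len = h5p_get_virtual_dsetname(dcpl, 0, C_NULL, 0)   # length without NUL
#     buf = Vector{UInt8}(undef, len + 1)
#     h5p_get_virtual_dsetname(dcpl, 0, buf, length(buf))
#     name = String(buf[1:len])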
"""
h5p_get_virtual_filename(dcpl_id::hid_t, index::Csize_t, name::Ptr{Cchar}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Pget_virtual_filename`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga5c17780cc9a72a0f62d70f6138510afa).
"""
function h5p_get_virtual_filename(dcpl_id, index, name, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_filename, libhdf5), Cssize_t, (hid_t, Csize_t, Ptr{Cchar}, Csize_t), dcpl_id, index, name, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5p_get_virtual_filename (not annotated)")
return var"#status#"
end
"""
h5p_get_virtual_prefix(dapl_id::hid_t, prefix::Ptr{Cchar}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5Pget_virtual_prefix`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga9a48c80955877c20d53e8fd3f49a2995).
"""
function h5p_get_virtual_prefix(dapl_id, prefix, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_prefix, libhdf5), Cssize_t, (hid_t, Ptr{Cchar}, Csize_t), dapl_id, prefix, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error in h5p_get_virtual_prefix (not annotated)")
return var"#status#"
end
"""
h5p_get_virtual_printf_gap(dapl_id::hid_t, gap_size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pget_virtual_printf_gap`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga833dfc6d9c87738c9d94b610e70a818f).
"""
function h5p_get_virtual_printf_gap(dapl_id, gap_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_printf_gap, libhdf5), herr_t, (hid_t, Ptr{hsize_t}), dapl_id, gap_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_virtual_printf_gap (not annotated)")
return nothing
end
"""
h5p_get_virtual_srcspace(dcpl_id::hid_t, index::Csize_t) -> hid_t
See `libhdf5` documentation for [`H5Pget_virtual_srcspace`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga8319e9386cdb9b3881a8b698edfc78fc).
"""
function h5p_get_virtual_srcspace(dcpl_id, index)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_srcspace, libhdf5), hid_t, (hid_t, Csize_t), dcpl_id, index)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_get_virtual_srcspace (not annotated)")
return var"#status#"
end
"""
h5p_get_virtual_view(dapl_id::hid_t, view::Ptr{H5D_vds_view_t})
See `libhdf5` documentation for [`H5Pget_virtual_view`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga7173663654b085e8583ab609c988b47c).
"""
function h5p_get_virtual_view(dapl_id, view)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_view, libhdf5), herr_t, (hid_t, Ptr{H5D_vds_view_t}), dapl_id, view)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_virtual_view (not annotated)")
return nothing
end
"""
h5p_get_virtual_vspace(dcpl_id::hid_t, index::Csize_t) -> hid_t
See `libhdf5` documentation for [`H5Pget_virtual_vspace`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga6425cabbc055b66e218b4728d6eb911d).
"""
function h5p_get_virtual_vspace(dcpl_id, index)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_virtual_vspace, libhdf5), hid_t, (hid_t, Csize_t), dcpl_id, index)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_get_virtual_vspace (not annotated)")
return var"#status#"
end
"""
h5p_get_vlen_mem_manager(plist_id::hid_t, alloc_func::Ptr{H5MM_allocate_t}, alloc_info::Ptr{Ptr{Cvoid}}, free_func::Ptr{H5MM_free_t}, free_info::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_vlen_mem_manager`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga9de8cad8b5664a956d965fd9414c376e).
"""
function h5p_get_vlen_mem_manager(plist_id, alloc_func, alloc_info, free_func, free_info)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_vlen_mem_manager, libhdf5), herr_t, (hid_t, Ptr{H5MM_allocate_t}, Ptr{Ptr{Cvoid}}, Ptr{H5MM_free_t}, Ptr{Ptr{Cvoid}}), plist_id, alloc_func, alloc_info, free_func, free_info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_vlen_mem_manager (not annotated)")
return nothing
end
"""
h5p_get_vol_id(plist_id::hid_t, vol_id::Ptr{hid_t})
See `libhdf5` documentation for [`H5Pget_vol_id`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga5f133bdf09ca5a32622688d1ba5cc838).
"""
function h5p_get_vol_id(plist_id, vol_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_vol_id, libhdf5), herr_t, (hid_t, Ptr{hid_t}), plist_id, vol_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_vol_id (not annotated)")
return nothing
end
"""
h5p_get_vol_info(plist_id::hid_t, vol_info::Ptr{Ptr{Cvoid}})
See `libhdf5` documentation for [`H5Pget_vol_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gafc58db23c257cdcf2f0c1c3ae911ab0f).
"""
function h5p_get_vol_info(plist_id, vol_info)
lock(liblock)
var"#status#" = try
ccall((:H5Pget_vol_info, libhdf5), herr_t, (hid_t, Ptr{Ptr{Cvoid}}), plist_id, vol_info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_get_vol_info (not annotated)")
return nothing
end
"""
h5p_set(plist_id::hid_t, name::Cstring, value::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga971d2f700cb98ccdfcdf93a39118983b).
"""
function h5p_set(plist_id, name, value)
lock(liblock)
var"#status#" = try
ccall((:H5Pset, libhdf5), herr_t, (hid_t, Cstring, Ptr{Cvoid}), plist_id, name, value)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set (not annotated)")
return nothing
end
"""
h5p_set_alignment(plist_id::hid_t, threshold::hsize_t, alignment::hsize_t)
See `libhdf5` documentation for [`H5Pset_alignment`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a).
"""
function h5p_set_alignment(plist_id, threshold, alignment)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_alignment, libhdf5), herr_t, (hid_t, hsize_t, hsize_t), plist_id, threshold, alignment)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting alignment")
return nothing
end
"""
h5p_set_alloc_time(plist_id::hid_t, alloc_time::Cint)
See `libhdf5` documentation for [`H5Pset_alloc_time`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga85faefca58387bba409b65c470d7d851).
"""
function h5p_set_alloc_time(plist_id, alloc_time)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_alloc_time, libhdf5), herr_t, (hid_t, Cint), plist_id, alloc_time)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting allocation timing")
return nothing
end
"""
h5p_set_append_flush(dapl_id::hid_t, ndims::Cuint, boundary::Ptr{hsize_t}, func::H5D_append_cb_t, udata::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_append_flush`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga2f685a7b3f3a4fa35ddcd1659ab4a835).
"""
function h5p_set_append_flush(dapl_id, ndims, boundary, func, udata)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_append_flush, libhdf5), herr_t, (hid_t, Cuint, Ptr{hsize_t}, H5D_append_cb_t, Ptr{Cvoid}), dapl_id, ndims, boundary, func, udata)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_append_flush (not annotated)")
return nothing
end
"""
h5p_set_attr_creation_order(plist_id::hid_t, crt_order_flags::Cuint)
See `libhdf5` documentation for [`H5Pset_attr_creation_order`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#gade132fded1df87300a4c7175c6bd766a).
"""
function h5p_set_attr_creation_order(plist_id, crt_order_flags)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_attr_creation_order, libhdf5), herr_t, (hid_t, Cuint), plist_id, crt_order_flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting attribute creation order")
return nothing
end
"""
h5p_set_attr_phase_change(plist_id::hid_t, max_compact::Cuint, min_dense::Cuint)
See `libhdf5` documentation for [`H5Pset_attr_phase_change`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga0115b13dcbd8770cbdcef3db2ac12ea1).
"""
function h5p_set_attr_phase_change(plist_id, max_compact, min_dense)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_attr_phase_change, libhdf5), herr_t, (hid_t, Cuint, Cuint), plist_id, max_compact, min_dense)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_attr_phase_change (not annotated)")
return nothing
end
"""
h5p_set_btree_ratios(plist_id::hid_t, left::Cdouble, middle::Cdouble, right::Cdouble)
See `libhdf5` documentation for [`H5Pset_btree_ratios`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga51d126d64fa766d44160a95057a2c733).
"""
function h5p_set_btree_ratios(plist_id, left, middle, right)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_btree_ratios, libhdf5), herr_t, (hid_t, Cdouble, Cdouble, Cdouble), plist_id, left, middle, right)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_btree_ratios (not annotated)")
return nothing
end
"""
h5p_set_buffer(plist_id::hid_t, size::Csize_t, tconv::Ptr{Cvoid}, bkg::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_buffer`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga777e8c171c9e462230a9fa40874b38ce).
"""
function h5p_set_buffer(plist_id, size, tconv, bkg)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_buffer, libhdf5), herr_t, (hid_t, Csize_t, Ptr{Cvoid}, Ptr{Cvoid}), plist_id, size, tconv, bkg)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_buffer (not annotated)")
return nothing
end
"""
h5p_set_cache(plist_id::hid_t, mdc_nelmts::Cint, rdcc_nslots::Csize_t, rdcc_nbytes::Csize_t, rdcc_w0::Cdouble)
See `libhdf5` documentation for [`H5Pset_cache`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga034a5fc54d9b05296555544d8dd9fe89).
"""
function h5p_set_cache(plist_id, mdc_nelmts, rdcc_nslots, rdcc_nbytes, rdcc_w0)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_cache, libhdf5), herr_t, (hid_t, Cint, Csize_t, Csize_t, Cdouble), plist_id, mdc_nelmts, rdcc_nslots, rdcc_nbytes, rdcc_w0)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_cache (not annotated)")
return nothing
end
"""
h5p_set_char_encoding(plist_id::hid_t, encoding::Cint)
See `libhdf5` documentation for [`H5Pset_char_encoding`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_c_p_l.html#gad4fa8e2d17236786f770cf17eef908cc).
"""
function h5p_set_char_encoding(plist_id, encoding)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_char_encoding, libhdf5), herr_t, (hid_t, Cint), plist_id, encoding)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting char encoding")
return nothing
end
"""
h5p_set_chunk(plist_id::hid_t, ndims::Cint, dims::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Pset_chunk`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga3584d592e377da3604b7604e266dcf5b).
"""
function h5p_set_chunk(plist_id, ndims, dims)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_chunk, libhdf5), herr_t, (hid_t, Cint, Ptr{hsize_t}), plist_id, ndims, dims)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting chunk size")
return nothing
end
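# Illustrative usage sketch (assuming `dcpl` is a dataset-creation property list id
# obtained elsewhere): chunk dimensions are given in HDF5's C order, i.e. the
# slowest-varying dimension first, so the dims of a Julia array are typically
# reversed before this call.
#
#     dims = hsize_t[100, 50]                 # 100 x 50 chunks, C order
#     h5p_set_chunk(dcpl, length(dims), dims)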
"""
h5p_set_chunk_cache(dapl_id::hid_t, rdcc_nslots::Csize_t, rdcc_nbytes::Csize_t, rdcc_w0::Cdouble)
See `libhdf5` documentation for [`H5Pset_chunk_cache`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga104d00442c31714ee073dee518f661f1).
"""
function h5p_set_chunk_cache(dapl_id, rdcc_nslots, rdcc_nbytes, rdcc_w0)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_chunk_cache, libhdf5), herr_t, (hid_t, Csize_t, Csize_t, Cdouble), dapl_id, rdcc_nslots, rdcc_nbytes, rdcc_w0)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting chunk cache")
return nothing
end
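# Illustrative usage sketch (assuming `dapl` is a dataset-access property list id
# obtained elsewhere): the raw-data chunk cache is tuned per dataset, here to 521
# hash slots, a 16 MiB cache, and a 0.75 preemption policy (example values only).
#
#     h5p_set_chunk_cache(dapl, 521, 16 * 1024 * 1024, 0.75)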
"""
h5p_set_chunk_opts(plist_id::hid_t, opts::Cuint)
See `libhdf5` documentation for [`H5Pset_chunk_opts`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga8e60618d9030dc1b99ad9c8ff7867873).
"""
function h5p_set_chunk_opts(plist_id, opts)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_chunk_opts, libhdf5), herr_t, (hid_t, Cuint), plist_id, opts)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_chunk_opts (not annotated)")
return nothing
end
"""
h5p_set_copy_object(plist_id::hid_t, copy_options::Cuint)
See `libhdf5` documentation for [`H5Pset_copy_object`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_y_p_l.html#ga8819261e0b4663827212892e10dfc8a6).
"""
function h5p_set_copy_object(plist_id, copy_options)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_copy_object, libhdf5), herr_t, (hid_t, Cuint), plist_id, copy_options)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_copy_object (not annotated)")
return nothing
end
"""
h5p_set_core_write_tracking(fapl_id::hid_t, is_enabled::hbool_t, page_size::Csize_t)
See `libhdf5` documentation for [`H5Pset_core_write_tracking`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga237e300b96222a259896b24cf52405b0).
"""
function h5p_set_core_write_tracking(fapl_id, is_enabled, page_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_core_write_tracking, libhdf5), herr_t, (hid_t, hbool_t, Csize_t), fapl_id, is_enabled, page_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_core_write_tracking (not annotated)")
return nothing
end
"""
h5p_set_create_intermediate_group(plist_id::hid_t, setting::Cuint)
See `libhdf5` documentation for [`H5Pset_create_intermediate_group`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_c_p_l.html#ga66c4c5d3f34e5cf65d00e47a5387383c).
"""
function h5p_set_create_intermediate_group(plist_id, setting)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_create_intermediate_group, libhdf5), herr_t, (hid_t, Cuint), plist_id, setting)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting create intermediate group")
return nothing
end
"""
h5p_set_data_transform(plist_id::hid_t, expression::Cstring)
See `libhdf5` documentation for [`H5Pset_data_transform`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gaa8c317b6164ae22a6ddae4131bbbcd22).
"""
function h5p_set_data_transform(plist_id, expression)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_data_transform, libhdf5), herr_t, (hid_t, Cstring), plist_id, expression)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_data_transform (not annotated)")
return nothing
end
"""
h5p_set_deflate(plist_id::hid_t, setting::Cuint)
See `libhdf5` documentation for [`H5Pset_deflate`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#gaf1f569bfc54552bdb9317d2b63318a0d).
"""
function h5p_set_deflate(plist_id, setting)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_deflate, libhdf5), herr_t, (hid_t, Cuint), plist_id, setting)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting compression method and level (deflate)")
return nothing
end
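# Illustrative usage sketch (assuming `dcpl` is a dataset-creation property list id
# obtained elsewhere): enable gzip/deflate compression at level 6 (valid levels are
# 0 to 9). The dataset must also use a chunked layout for the filter to apply.
#
#     h5p_set_deflate(dcpl, 6)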
"""
h5p_set_driver(plist_id::hid_t, driver_id::hid_t, driver_info::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga8bcce60e23e9d2a019212c63b146502e).
"""
function h5p_set_driver(plist_id, driver_id, driver_info)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_driver, libhdf5), herr_t, (hid_t, hid_t, Ptr{Cvoid}), plist_id, driver_id, driver_info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_driver (not annotated)")
return nothing
end
"""
h5p_set_dset_no_attrs_hint(dcpl_id::hid_t, minimize::hbool_t)
See `libhdf5` documentation for [`H5Pset_dset_no_attrs_hint`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#gaf5ae8c0257c02e3fbe50bde70b1eb8be).
"""
function h5p_set_dset_no_attrs_hint(dcpl_id, minimize)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_dset_no_attrs_hint, libhdf5), herr_t, (hid_t, hbool_t), dcpl_id, minimize)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in setting dataset no attributes hint property")
return nothing
end
"""
h5p_set_dxpl_mpio(dxpl_id::hid_t, xfer_mode::Cint)
See `libhdf5` documentation for [`H5Pset_dxpl_mpio`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga001a22b64f60b815abf5de8b4776f09e).
"""
function h5p_set_dxpl_mpio(dxpl_id, xfer_mode)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_dxpl_mpio, libhdf5), herr_t, (hid_t, Cint), dxpl_id, xfer_mode)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting MPIO transfer mode")
return nothing
end
"""
h5p_set_edc_check(plist_id::hid_t, check::H5Z_EDC_t)
See `libhdf5` documentation for [`H5Pset_edc_check`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga0d95dfa506784acc9aed850c99713609).
"""
function h5p_set_edc_check(plist_id, check)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_edc_check, libhdf5), herr_t, (hid_t, H5Z_EDC_t), plist_id, check)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_edc_check (not annotated)")
return nothing
end
"""
h5p_set_efile_prefix(plist_id::hid_t, prefix::Cstring)
See `libhdf5` documentation for [`H5Pset_efile_prefix`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#gad487f84157fd0944cbe1cbd4dea4e1b8).
"""
function h5p_set_efile_prefix(plist_id, prefix)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_efile_prefix, libhdf5), herr_t, (hid_t, Cstring), plist_id, prefix)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting external file prefix")
return nothing
end
"""
h5p_set_elink_acc_flags(lapl_id::hid_t, flags::Cuint)
See `libhdf5` documentation for [`H5Pset_elink_acc_flags`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#ga020f7eb2eae01043286af50db0a76d82).
"""
function h5p_set_elink_acc_flags(lapl_id, flags)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_elink_acc_flags, libhdf5), herr_t, (hid_t, Cuint), lapl_id, flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_elink_acc_flags (not annotated)")
return nothing
end
"""
h5p_set_elink_cb(lapl_id::hid_t, func::H5L_elink_traverse_t, op_data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_elink_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#ga8850042eed51777866d7bd0d050cfdc2).
"""
function h5p_set_elink_cb(lapl_id, func, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_elink_cb, libhdf5), herr_t, (hid_t, H5L_elink_traverse_t, Ptr{Cvoid}), lapl_id, func, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_elink_cb (not annotated)")
return nothing
end
"""
h5p_set_elink_fapl(lapl_id::hid_t, fapl_id::hid_t)
See `libhdf5` documentation for [`H5Pset_elink_fapl`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#ga3895e8e60ce8f0b6f32ab7a22c715d1a).
"""
function h5p_set_elink_fapl(lapl_id, fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_elink_fapl, libhdf5), herr_t, (hid_t, hid_t), lapl_id, fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_elink_fapl (not annotated)")
return nothing
end
"""
h5p_set_elink_file_cache_size(plist_id::hid_t, efc_size::Cuint)
See `libhdf5` documentation for [`H5Pset_elink_file_cache_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gac21a815e9b133802df625c9f766ef325).
"""
function h5p_set_elink_file_cache_size(plist_id, efc_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_elink_file_cache_size, libhdf5), herr_t, (hid_t, Cuint), plist_id, efc_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_elink_file_cache_size (not annotated)")
return nothing
end
"""
h5p_set_elink_prefix(plist_id::hid_t, prefix::Cstring)
See `libhdf5` documentation for [`H5Pset_elink_prefix`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#gafa5eced13ba3a00cdd65669626dc7294).
"""
function h5p_set_elink_prefix(plist_id, prefix)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_elink_prefix, libhdf5), herr_t, (hid_t, Cstring), plist_id, prefix)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_elink_prefix (not annotated)")
return nothing
end
"""
h5p_set_est_link_info(plist_id::hid_t, est_num_entries::Cuint, est_name_len::Cuint)
See `libhdf5` documentation for [`H5Pset_est_link_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#gaa8571642d45e73ab5a9ae71cf00501f9).
"""
function h5p_set_est_link_info(plist_id, est_num_entries, est_name_len)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_est_link_info, libhdf5), herr_t, (hid_t, Cuint, Cuint), plist_id, est_num_entries, est_name_len)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_est_link_info (not annotated)")
return nothing
end
"""
h5p_set_evict_on_close(fapl_id::hid_t, evict_on_close::hbool_t)
See `libhdf5` documentation for [`H5Pset_evict_on_close`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaa44cc0e592608e12082dad9305b3c74d).
"""
function h5p_set_evict_on_close(fapl_id, evict_on_close)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_evict_on_close, libhdf5), herr_t, (hid_t, hbool_t), fapl_id, evict_on_close)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_evict_on_close (not annotated)")
return nothing
end
"""
h5p_set_external(plist_id::hid_t, name::Cstring, offset::off_t, size::hsize_t)
See `libhdf5` documentation for [`H5Pset_external`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga85ff7c9c827fa524041cd58c199b77b8).
"""
function h5p_set_external(plist_id, name, offset, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_external, libhdf5), herr_t, (hid_t, Cstring, off_t, hsize_t), plist_id, name, offset, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting external property")
return nothing
end
"""
h5p_set_family_offset(fapl_id::hid_t, offset::hsize_t)
See `libhdf5` documentation for [`H5Pset_family_offset`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga6b24e6daf4816bbfb89b63bab40aa982).
"""
function h5p_set_family_offset(fapl_id, offset)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_family_offset, libhdf5), herr_t, (hid_t, hsize_t), fapl_id, offset)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_family_offset (not annotated)")
return nothing
end
"""
h5p_set_fapl_core(fapl_id::hid_t, increment::Csize_t, backing_store::hbool_t)
See `libhdf5` documentation for [`H5Pset_fapl_core`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga6e6628f620a1c58c704129cf07282849).
"""
function h5p_set_fapl_core(fapl_id, increment, backing_store)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_core, libhdf5), herr_t, (hid_t, Csize_t, hbool_t), fapl_id, increment, backing_store)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_core (not annotated)")
return nothing
end
"""
h5p_set_fapl_family(fapl_id::hid_t, memb_size::hsize_t, memb_fapl_id::hid_t)
See `libhdf5` documentation for [`H5Pset_fapl_family`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga95d19da25f196ce1ace10af00f49ab53).
"""
function h5p_set_fapl_family(fapl_id, memb_size, memb_fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_family, libhdf5), herr_t, (hid_t, hsize_t, hid_t), fapl_id, memb_size, memb_fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_family (not annotated)")
return nothing
end
"""
h5p_set_fapl_hdfs(fapl_id::hid_t, fa::Ptr{H5FD_hdfs_fapl_t})
See `libhdf5` documentation for [`H5Pset_fapl_hdfs`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga970d077c8e712a4692f43fa4f38dde14).
"""
function h5p_set_fapl_hdfs(fapl_id, fa)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_hdfs, libhdf5), herr_t, (hid_t, Ptr{H5FD_hdfs_fapl_t}), fapl_id, fa)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_hdfs (not annotated)")
return nothing
end
"""
h5p_set_fapl_log(fapl_id::hid_t, logfile::Cstring, flags::Culonglong, buf_size::Csize_t)
See `libhdf5` documentation for [`H5Pset_fapl_log`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga4e03be2fe83ed02b32266a6c81427beb).
"""
function h5p_set_fapl_log(fapl_id, logfile, flags, buf_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_log, libhdf5), herr_t, (hid_t, Cstring, Culonglong, Csize_t), fapl_id, logfile, flags, buf_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_log (not annotated)")
return nothing
end
"""
h5p_set_fapl_multi(fapl_id::hid_t, memb_map::Ptr{H5FD_mem_t}, memb_fapl::Ptr{hid_t}, memb_name::Ptr{Cstring}, memb_addr::Ptr{haddr_t}, relax::hbool_t)
See `libhdf5` documentation for [`H5Pset_fapl_multi`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga296bd22cc1e462351f8f0a00a46baf58).
"""
function h5p_set_fapl_multi(fapl_id, memb_map, memb_fapl, memb_name, memb_addr, relax)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_multi, libhdf5), herr_t, (hid_t, Ptr{H5FD_mem_t}, Ptr{hid_t}, Ptr{Cstring}, Ptr{haddr_t}, hbool_t), fapl_id, memb_map, memb_fapl, memb_name, memb_addr, relax)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_multi (not annotated)")
return nothing
end
"""
h5p_set_fapl_sec2(fapl_id::hid_t)
See `libhdf5` documentation for [`H5Pset_fapl_sec2`](https://docs.hdfgroup.org/hdf5/v1_14/).
"""
function h5p_set_fapl_sec2(fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_sec2, libhdf5), herr_t, (hid_t,), fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting Sec2 properties")
return nothing
end
"""
h5p_set_fapl_ros3(fapl_id::hid_t, fa::Ptr{H5FD_ros3_fapl_t})
See `libhdf5` documentation for [`H5Pset_fapl_ros3`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaad28d8c24f236590193215c5ae7a8f18).
"""
function h5p_set_fapl_ros3(fapl_id, fa)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_ros3, libhdf5), herr_t, (hid_t, Ptr{H5FD_ros3_fapl_t}), fapl_id, fa)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in setting ros3 properties")
return nothing
end
"""
h5p_set_fapl_split(fapl::hid_t, meta_ext::Cstring, meta_plist_id::hid_t, raw_ext::Cstring, raw_plist_id::hid_t)
See `libhdf5` documentation for [`H5Pset_fapl_split`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga502f1ad38f5143cf281df8282fef26ed).
"""
function h5p_set_fapl_split(fapl, meta_ext, meta_plist_id, raw_ext, raw_plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_split, libhdf5), herr_t, (hid_t, Cstring, hid_t, Cstring, hid_t), fapl, meta_ext, meta_plist_id, raw_ext, raw_plist_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_split (not annotated)")
return nothing
end
"""
h5p_set_fapl_splitter(fapl_id::hid_t, config_ptr::Ptr{H5FD_splitter_vfd_config_t})
See `libhdf5` documentation for [`H5Pset_fapl_splitter`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga49f386ea235bb48128e54c962c499f07).
"""
function h5p_set_fapl_splitter(fapl_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_splitter, libhdf5), herr_t, (hid_t, Ptr{H5FD_splitter_vfd_config_t}), fapl_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_splitter (not annotated)")
return nothing
end
"""
h5p_set_fapl_stdio(fapl_id::hid_t)
See `libhdf5` documentation for [`H5Pset_fapl_stdio`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga77f0643117835e7f7992d573761b5052).
"""
function h5p_set_fapl_stdio(fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_stdio, libhdf5), herr_t, (hid_t,), fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_stdio (not annotated)")
return nothing
end
"""
h5p_set_fapl_windows(fapl_id::hid_t)
See `libhdf5` documentation for [`H5Pset_fapl_windows`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga0017f78e0f3de465621fde556f679830).
"""
function h5p_set_fapl_windows(fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fapl_windows, libhdf5), herr_t, (hid_t,), fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fapl_windows (not annotated)")
return nothing
end
"""
h5p_set_fclose_degree(plist_id::hid_t, fc_degree::Cint)
See `libhdf5` documentation for [`H5Pset_fclose_degree`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga60e3567f677fd3ade75b909b636d7b9c).
"""
function h5p_set_fclose_degree(plist_id, fc_degree)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fclose_degree, libhdf5), herr_t, (hid_t, Cint), plist_id, fc_degree)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting close degree")
return nothing
end
"""
h5p_set_file_image(fapl_id::hid_t, buf_ptr::Ptr{Cvoid}, buf_len::Csize_t)
See `libhdf5` documentation for [`H5Pset_file_image`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga31d0299f6ad287e013b2a02a8ccc1fa2).
"""
function h5p_set_file_image(fapl_id, buf_ptr, buf_len)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_file_image, libhdf5), herr_t, (hid_t, Ptr{Cvoid}, Csize_t), fapl_id, buf_ptr, buf_len)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_file_image (not annotated)")
return nothing
end
"""
h5p_set_file_image_callbacks(fapl_id::hid_t, callbacks_ptr::Ptr{H5FD_file_image_callbacks_t})
See `libhdf5` documentation for [`H5Pset_file_image_callbacks`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga14ea3598215afd078b964b672b40d63c).
"""
function h5p_set_file_image_callbacks(fapl_id, callbacks_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_file_image_callbacks, libhdf5), herr_t, (hid_t, Ptr{H5FD_file_image_callbacks_t}), fapl_id, callbacks_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_file_image_callbacks (not annotated)")
return nothing
end
"""
h5p_set_file_locking(fapl_id::hid_t, use_file_locking::hbool_t, ignore_when_disabled::hbool_t)
See `libhdf5` documentation for [`H5Pset_file_locking`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga503e9ff6121a67cf53f8b67054ed9391).
"""
function h5p_set_file_locking(fapl_id, use_file_locking, ignore_when_disabled)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_file_locking, libhdf5), herr_t, (hid_t, hbool_t, hbool_t), fapl_id, use_file_locking, ignore_when_disabled)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_file_locking (not annotated)")
return nothing
end
"""
h5p_set_file_space(plist_id::hid_t, strategy::H5F_file_space_type_t, threshold::hsize_t)
See `libhdf5` documentation for [`H5Pset_file_space`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#gad388f8cfa213762c6fc3e45619aa5db6).
"""
function h5p_set_file_space(plist_id, strategy, threshold)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_file_space, libhdf5), herr_t, (hid_t, H5F_file_space_type_t, hsize_t), plist_id, strategy, threshold)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_file_space (not annotated)")
return nothing
end
"""
h5p_set_file_space_page_size(plist_id::hid_t, fsp_size::hsize_t)
See `libhdf5` documentation for [`H5Pset_file_space_page_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#gad012d7f3c2f1e1999eb1770aae3a4963).
"""
function h5p_set_file_space_page_size(plist_id, fsp_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_file_space_page_size, libhdf5), herr_t, (hid_t, hsize_t), plist_id, fsp_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_file_space_page_size (not annotated)")
return nothing
end
"""
h5p_set_file_space_strategy(plist_id::hid_t, strategy::H5F_fspace_strategy_t, persist::hbool_t, threshold::hsize_t)
See `libhdf5` documentation for [`H5Pset_file_space_strategy`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga167ff65f392ca3b7f1933b1cee1b9f70).
"""
function h5p_set_file_space_strategy(plist_id, strategy, persist, threshold)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_file_space_strategy, libhdf5), herr_t, (hid_t, H5F_fspace_strategy_t, hbool_t, hsize_t), plist_id, strategy, persist, threshold)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_file_space_strategy (not annotated)")
return nothing
end
"""
h5p_set_fill_time(plist_id::hid_t, fill_time::H5D_fill_time_t)
See `libhdf5` documentation for [`H5Pset_fill_time`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga6bd822266b31f86551a9a1d79601b6a2).
"""
function h5p_set_fill_time(plist_id, fill_time)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fill_time, libhdf5), herr_t, (hid_t, H5D_fill_time_t), plist_id, fill_time)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fill_time (not annotated)")
return nothing
end
"""
h5p_set_fill_value(plist_id::hid_t, type_id::hid_t, value::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_fill_value`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga4335bb45b35386daa837b4ff1b9cd4a4).
"""
function h5p_set_fill_value(plist_id, type_id, value)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fill_value, libhdf5), herr_t, (hid_t, hid_t, Ptr{Cvoid}), plist_id, type_id, value)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_fill_value (not annotated)")
return nothing
end
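# Illustrative usage sketch (assuming `dcpl` is a dataset-creation property list id
# and `H5T_NATIVE_DOUBLE` is the usual native-double datatype id, both obtained
# elsewhere): the fill value is passed as a pointer to a value of that datatype.
#
#     fill = Ref{Cdouble}(-999.0)
#     h5p_set_fill_value(dcpl, H5T_NATIVE_DOUBLE, fill)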
"""
h5p_set_filter(plist_id::hid_t, filter_id::H5Z_filter_t, flags::Cuint, cd_nelmts::Csize_t, cd_values::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pset_filter`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga191c567ee50b2063979cdef156a768c5).
"""
function h5p_set_filter(plist_id, filter_id, flags, cd_nelmts, cd_values)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_filter, libhdf5), herr_t, (hid_t, H5Z_filter_t, Cuint, Csize_t, Ptr{Cuint}), plist_id, filter_id, flags, cd_nelmts, cd_values)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting filter")
return nothing
end
"""
h5p_set_filter_callback(plist_id::hid_t, func::H5Z_filter_func_t, op_data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_filter_callback`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga1890e730c34efa20cd6a5d1d2a0e8caa).
"""
function h5p_set_filter_callback(plist_id, func, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_filter_callback, libhdf5), herr_t, (hid_t, H5Z_filter_func_t, Ptr{Cvoid}), plist_id, func, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_filter_callback (not annotated)")
return nothing
end
"""
h5p_set_fletcher32(plist_id::hid_t)
See `libhdf5` documentation for [`H5Pset_fletcher32`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga8bc81abfbd0393b0a46e121f817a3f81).
"""
function h5p_set_fletcher32(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_fletcher32, libhdf5), herr_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error enabling Fletcher32 filter")
return nothing
end
"""
h5p_set_gc_references(fapl_id::hid_t, gc_ref::Cuint)
See `libhdf5` documentation for [`H5Pset_gc_references`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga61f01a12d5392ccf1321168f3c28f36f).
"""
function h5p_set_gc_references(fapl_id, gc_ref)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_gc_references, libhdf5), herr_t, (hid_t, Cuint), fapl_id, gc_ref)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_gc_references (not annotated)")
return nothing
end
"""
h5p_set_hyper_vector_size(plist_id::hid_t, size::Csize_t)
See `libhdf5` documentation for [`H5Pset_hyper_vector_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gad8c1582c86e3316c70b0658b3b8e2071).
"""
function h5p_set_hyper_vector_size(plist_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_hyper_vector_size, libhdf5), herr_t, (hid_t, Csize_t), plist_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_hyper_vector_size (not annotated)")
return nothing
end
"""
h5p_set_istore_k(plist_id::hid_t, ik::Cuint)
See `libhdf5` documentation for [`H5Pset_istore_k`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga84a72f59d17841c37ab34674bf22a10c).
"""
function h5p_set_istore_k(plist_id, ik)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_istore_k, libhdf5), herr_t, (hid_t, Cuint), plist_id, ik)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_istore_k (not annotated)")
return nothing
end
"""
h5p_set_layout(plist_id::hid_t, setting::Cint)
See `libhdf5` documentation for [`H5Pset_layout`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga75d80991a8f467e0d454c53a383ae7f9).
"""
function h5p_set_layout(plist_id, setting)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_layout, libhdf5), herr_t, (hid_t, Cint), plist_id, setting)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting layout")
return nothing
end
"""
h5p_set_libver_bounds(fapl_id::hid_t, low::Cint, high::Cint)
See `libhdf5` documentation for [`H5Pset_libver_bounds`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gacbe1724e7f70cd17ed687417a1d2a910).
"""
function h5p_set_libver_bounds(fapl_id, low, high)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_libver_bounds, libhdf5), herr_t, (hid_t, Cint, Cint), fapl_id, low, high)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting library version bounds")
return nothing
end
"""
h5p_set_link_creation_order(plist_id::hid_t, crt_order_flags::Cuint)
See `libhdf5` documentation for [`H5Pset_link_creation_order`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#ga24817b5c9553df3872de57c20bf11512).
"""
function h5p_set_link_creation_order(plist_id, crt_order_flags)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_link_creation_order, libhdf5), herr_t, (hid_t, Cuint), plist_id, crt_order_flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting link creation order")
return nothing
end
"""
h5p_set_link_phase_change(plist_id::hid_t, max_compact::Cuint, min_dense::Cuint)
See `libhdf5` documentation for [`H5Pset_link_phase_change`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#gab463ac9355728469eddfd973b4a5964f).
"""
function h5p_set_link_phase_change(plist_id, max_compact, min_dense)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_link_phase_change, libhdf5), herr_t, (hid_t, Cuint, Cuint), plist_id, max_compact, min_dense)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_link_phase_change (not annotated)")
return nothing
end
"""
h5p_set_local_heap_size_hint(plist_id::hid_t, size_hint::Csize_t)
See `libhdf5` documentation for [`H5Pset_local_heap_size_hint`](https://docs.hdfgroup.org/hdf5/v1_14/group___g_c_p_l.html#ga870728af2bf3c0b16edafd762a1c44d6).
"""
function h5p_set_local_heap_size_hint(plist_id, size_hint)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_local_heap_size_hint, libhdf5), herr_t, (hid_t, Csize_t), plist_id, size_hint)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting local heap size hint")
return nothing
end
"""
h5p_set_mcdt_search_cb(plist_id::hid_t, func::H5O_mcdt_search_cb_t, op_data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_mcdt_search_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_y_p_l.html#ga9e0448885990a1b9ebd4493b7604f0c1).
"""
function h5p_set_mcdt_search_cb(plist_id, func, op_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_mcdt_search_cb, libhdf5), herr_t, (hid_t, H5O_mcdt_search_cb_t, Ptr{Cvoid}), plist_id, func, op_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_mcdt_search_cb (not annotated)")
return nothing
end
"""
h5p_set_mdc_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_config_t})
See `libhdf5` documentation for [`H5Pset_mdc_config`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gaf234199ad4cf9c708f45893f7f9cd4d3).
"""
function h5p_set_mdc_config(plist_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_mdc_config, libhdf5), herr_t, (hid_t, Ptr{H5AC_cache_config_t}), plist_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_mdc_config (not annotated)")
return nothing
end
"""
h5p_set_mdc_image_config(plist_id::hid_t, config_ptr::Ptr{H5AC_cache_image_config_t})
See `libhdf5` documentation for [`H5Pset_mdc_image_config`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga65cf9fea33d1324009efc2d5db848434).
"""
function h5p_set_mdc_image_config(plist_id, config_ptr)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_mdc_image_config, libhdf5), herr_t, (hid_t, Ptr{H5AC_cache_image_config_t}), plist_id, config_ptr)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_mdc_image_config (not annotated)")
return nothing
end
"""
h5p_set_mdc_log_options(plist_id::hid_t, is_enabled::hbool_t, location::Cstring, start_on_access::hbool_t)
See `libhdf5` documentation for [`H5Pset_mdc_log_options`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga4d7206c5679d7243410058eceae59b2c).
"""
function h5p_set_mdc_log_options(plist_id, is_enabled, location, start_on_access)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_mdc_log_options, libhdf5), herr_t, (hid_t, hbool_t, Cstring, hbool_t), plist_id, is_enabled, location, start_on_access)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_mdc_log_options (not annotated)")
return nothing
end
"""
h5p_set_meta_block_size(fapl_id::hid_t, size::hsize_t)
See `libhdf5` documentation for [`H5Pset_meta_block_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga8822e3dedc8e1414f20871a87d533cb1).
"""
function h5p_set_meta_block_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_meta_block_size, libhdf5), herr_t, (hid_t, hsize_t), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_meta_block_size (not annotated)")
return nothing
end
"""
h5p_set_metadata_read_attempts(plist_id::hid_t, attempts::Cuint)
See `libhdf5` documentation for [`H5Pset_metadata_read_attempts`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gab827cef16ec569c87cec94a8b3f350c5).
"""
function h5p_set_metadata_read_attempts(plist_id, attempts)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_metadata_read_attempts, libhdf5), herr_t, (hid_t, Cuint), plist_id, attempts)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_metadata_read_attempts (not annotated)")
return nothing
end
"""
h5p_set_multi_type(fapl_id::hid_t, type::H5FD_mem_t)
See `libhdf5` documentation for [`H5Pset_multi_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga507341f31848c57008a3225bff3fe128).
"""
function h5p_set_multi_type(fapl_id, type)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_multi_type, libhdf5), herr_t, (hid_t, H5FD_mem_t), fapl_id, type)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_multi_type (not annotated)")
return nothing
end
"""
h5p_set_nbit(plist_id::hid_t)
See `libhdf5` documentation for [`H5Pset_nbit`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#gad58a9c0e766ef71d4075b2c2a755e91c).
"""
function h5p_set_nbit(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_nbit, libhdf5), herr_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error enabling nbit filter")
return nothing
end
"""
h5p_set_nlinks(plist_id::hid_t, nlinks::Csize_t)
See `libhdf5` documentation for [`H5Pset_nlinks`](https://docs.hdfgroup.org/hdf5/v1_14/group___l_a_p_l.html#gaa46c63c196a0cf5cd94dede039c030f4).
"""
function h5p_set_nlinks(plist_id, nlinks)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_nlinks, libhdf5), herr_t, (hid_t, Csize_t), plist_id, nlinks)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_nlinks (not annotated)")
return nothing
end
"""
h5p_set_obj_track_times(plist_id::hid_t, track_times::UInt8)
See `libhdf5` documentation for [`H5Pset_obj_track_times`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#gafa87fab7ebb6c4a8da9a75a86cc62fa3).
"""
function h5p_set_obj_track_times(plist_id, track_times)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_obj_track_times, libhdf5), herr_t, (hid_t, UInt8), plist_id, track_times)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting object time tracking")
return nothing
end
"""
h5p_set_object_flush_cb(plist_id::hid_t, func::H5F_flush_cb_t, udata::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_object_flush_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#gab4a4a788af5b6e88381dda0df2efbf19).
"""
function h5p_set_object_flush_cb(plist_id, func, udata)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_object_flush_cb, libhdf5), herr_t, (hid_t, H5F_flush_cb_t, Ptr{Cvoid}), plist_id, func, udata)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_object_flush_cb (not annotated)")
return nothing
end
"""
h5p_set_page_buffer_size(plist_id::hid_t, buf_size::Csize_t, min_meta_per::Cuint, min_raw_per::Cuint)
See `libhdf5` documentation for [`H5Pset_page_buffer_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga8008cddafa81bd1ddada23f6d9a161ca).
"""
function h5p_set_page_buffer_size(plist_id, buf_size, min_meta_per, min_raw_per)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_page_buffer_size, libhdf5), herr_t, (hid_t, Csize_t, Cuint, Cuint), plist_id, buf_size, min_meta_per, min_raw_per)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_page_buffer_size (not annotated)")
return nothing
end
"""
h5p_set_preserve(plist_id::hid_t, status::hbool_t)
See `libhdf5` documentation for [`H5Pset_preserve`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#gae8b87209ba6a3943eb614b6dfe55e588).
"""
function h5p_set_preserve(plist_id, status)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_preserve, libhdf5), herr_t, (hid_t, hbool_t), plist_id, status)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_preserve (not annotated)")
return nothing
end
"""
h5p_set_scaleoffset(plist_id::hid_t, scale_type::Cint, scale_factor::Cint)
See `libhdf5` documentation for [`H5Pset_scaleoffset`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga5c10165b670e0e984db431aee818cc7e).
"""
function h5p_set_scaleoffset(plist_id, scale_type, scale_factor)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_scaleoffset, libhdf5), herr_t, (hid_t, Cint, Cint), plist_id, scale_type, scale_factor)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error enabling scaleoffset filter")
return nothing
end
"""
h5p_set_shared_mesg_index(plist_id::hid_t, index_num::Cuint, mesg_type_flags::Cuint, min_mesg_size::Cuint)
See `libhdf5` documentation for [`H5Pset_shared_mesg_index`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga052aba0c1c5a3908a62335fc28e287ef).
"""
function h5p_set_shared_mesg_index(plist_id, index_num, mesg_type_flags, min_mesg_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_shared_mesg_index, libhdf5), herr_t, (hid_t, Cuint, Cuint, Cuint), plist_id, index_num, mesg_type_flags, min_mesg_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_shared_mesg_index (not annotated)")
return nothing
end
"""
h5p_set_shared_mesg_nindexes(plist_id::hid_t, nindexes::Cuint)
See `libhdf5` documentation for [`H5Pset_shared_mesg_nindexes`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga5e5020b1e2579da4617ea115e3cc50f1).
"""
function h5p_set_shared_mesg_nindexes(plist_id, nindexes)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_shared_mesg_nindexes, libhdf5), herr_t, (hid_t, Cuint), plist_id, nindexes)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_shared_mesg_nindexes (not annotated)")
return nothing
end
"""
h5p_set_shared_mesg_phase_change(plist_id::hid_t, max_list::Cuint, min_btree::Cuint)
See `libhdf5` documentation for [`H5Pset_shared_mesg_phase_change`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga967f961f4002d63804dc67b3bcd8f354).
"""
function h5p_set_shared_mesg_phase_change(plist_id, max_list, min_btree)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_shared_mesg_phase_change, libhdf5), herr_t, (hid_t, Cuint, Cuint), plist_id, max_list, min_btree)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_shared_mesg_phase_change (not annotated)")
return nothing
end
"""
h5p_set_shuffle(plist_id::hid_t)
See `libhdf5` documentation for [`H5Pset_shuffle`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga31e09cb0bf2da2893eed8a72220e6521).
"""
function h5p_set_shuffle(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_shuffle, libhdf5), herr_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error enabling shuffle filter")
return nothing
end
"""
h5p_set_sieve_buf_size(fapl_id::hid_t, size::Csize_t)
See `libhdf5` documentation for [`H5Pset_sieve_buf_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga24fd737955839194bf5605d5f47928ee).
"""
function h5p_set_sieve_buf_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_sieve_buf_size, libhdf5), herr_t, (hid_t, Csize_t), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_sieve_buf_size (not annotated)")
return nothing
end
"""
h5p_set_sizes(plist_id::hid_t, sizeof_addr::Csize_t, sizeof_size::Csize_t)
See `libhdf5` documentation for [`H5Pset_sizes`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#gae5eb3ba16f063d151d1b56d33e0710a9).
"""
function h5p_set_sizes(plist_id, sizeof_addr, sizeof_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_sizes, libhdf5), herr_t, (hid_t, Csize_t, Csize_t), plist_id, sizeof_addr, sizeof_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_sizes (not annotated)")
return nothing
end
"""
h5p_set_small_data_block_size(fapl_id::hid_t, size::hsize_t)
See `libhdf5` documentation for [`H5Pset_small_data_block_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga5a99962a79412814b79be830f14c23dd).
"""
function h5p_set_small_data_block_size(fapl_id, size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_small_data_block_size, libhdf5), herr_t, (hid_t, hsize_t), fapl_id, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_small_data_block_size (not annotated)")
return nothing
end
"""
h5p_set_sym_k(plist_id::hid_t, ik::Cuint, lk::Cuint)
See `libhdf5` documentation for [`H5Pset_sym_k`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga444ca905f084f9f96b7fe60d2a8c8176).
"""
function h5p_set_sym_k(plist_id, ik, lk)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_sym_k, libhdf5), herr_t, (hid_t, Cuint, Cuint), plist_id, ik, lk)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_sym_k (not annotated)")
return nothing
end
"""
h5p_set_szip(plist_id::hid_t, options_mask::Cuint, pixels_per_block::Cuint)
See `libhdf5` documentation for [`H5Pset_szip`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga37de4b6071a94574cfab5cd6de9c3fc6).
"""
function h5p_set_szip(plist_id, options_mask, pixels_per_block)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_szip, libhdf5), herr_t, (hid_t, Cuint, Cuint), plist_id, options_mask, pixels_per_block)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error enabling szip filter")
return nothing
end
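#=
Illustrative sketch (not part of the generated bindings): the filter setters above are
typically applied to a dataset creation property list before the dataset is created.
`H5P_DATASET_CREATE` and `H5_SZIP_NN_OPTION_MASK` are assumed to be defined elsewhere
in this API module.

    dcpl = h5p_create(H5P_DATASET_CREATE)
    h5p_set_shuffle(dcpl)                          # byte-shuffle before compression
    h5p_set_szip(dcpl, H5_SZIP_NN_OPTION_MASK, 8)  # szip with 8 pixels per block
    # ... create the dataset with this dcpl ...
    h5p_close(dcpl)
=#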
"""
h5p_set_type_conv_cb(dxpl_id::hid_t, op::H5T_conv_except_func_t, operate_data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_type_conv_cb`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga10a80b29444d933da1aa2003f46cf003).
"""
function h5p_set_type_conv_cb(dxpl_id, op, operate_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_type_conv_cb, libhdf5), herr_t, (hid_t, H5T_conv_except_func_t, Ptr{Cvoid}), dxpl_id, op, operate_data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_type_conv_cb (not annotated)")
return nothing
end
"""
h5p_set_userblock(plist_id::hid_t, len::hsize_t)
See `libhdf5` documentation for [`H5Pset_userblock`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_c_p_l.html#ga403bd982a2976c932237b186ed1cff4d).
"""
function h5p_set_userblock(plist_id, len)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_userblock, libhdf5), herr_t, (hid_t, hsize_t), plist_id, len)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting userblock")
return nothing
end
"""
h5p_set_virtual(dcpl_id::hid_t, vspace_id::hid_t, src_file_name::Cstring, src_dset_name::Cstring, src_space_id::hid_t)
See `libhdf5` documentation for [`H5Pset_virtual`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#gadec895092dbbedb94f85d9cacf8924f5).
"""
function h5p_set_virtual(dcpl_id, vspace_id, src_file_name, src_dset_name, src_space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_virtual, libhdf5), herr_t, (hid_t, hid_t, Cstring, Cstring, hid_t), dcpl_id, vspace_id, src_file_name, src_dset_name, src_space_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting virtual")
return nothing
end
"""
h5p_set_virtual_prefix(dapl_id::hid_t, prefix::Cstring)
See `libhdf5` documentation for [`H5Pset_virtual_prefix`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga6816e0de35a335f636922c3cd5569819).
"""
function h5p_set_virtual_prefix(dapl_id, prefix)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_virtual_prefix, libhdf5), herr_t, (hid_t, Cstring), dapl_id, prefix)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_virtual_prefix (not annotated)")
return nothing
end
"""
h5p_set_virtual_printf_gap(dapl_id::hid_t, gap_size::hsize_t)
See `libhdf5` documentation for [`H5Pset_virtual_printf_gap`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#ga8bb25e402e860133b8af3715e429bacf).
"""
function h5p_set_virtual_printf_gap(dapl_id, gap_size)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_virtual_printf_gap, libhdf5), herr_t, (hid_t, hsize_t), dapl_id, gap_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_virtual_printf_gap (not annotated)")
return nothing
end
"""
h5p_set_virtual_view(dapl_id::hid_t, view::H5D_vds_view_t)
See `libhdf5` documentation for [`H5Pset_virtual_view`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_a_p_l.html#gac65520e7cd7748f93d94c4a42abd01b4).
"""
function h5p_set_virtual_view(dapl_id, view)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_virtual_view, libhdf5), herr_t, (hid_t, H5D_vds_view_t), dapl_id, view)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_virtual_view (not annotated)")
return nothing
end
"""
h5p_set_vlen_mem_manager(plist_id::hid_t, alloc_func::H5MM_allocate_t, alloc_info::Ptr{Cvoid}, free_func::H5MM_free_t, free_info::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_vlen_mem_manager`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_x_p_l.html#ga2220ab75de470b6a6d5b1173d12aa0cf).
"""
function h5p_set_vlen_mem_manager(plist_id, alloc_func, alloc_info, free_func, free_info)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_vlen_mem_manager, libhdf5), herr_t, (hid_t, H5MM_allocate_t, Ptr{Cvoid}, H5MM_free_t, Ptr{Cvoid}), plist_id, alloc_func, alloc_info, free_func, free_info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_vlen_mem_manager (not annotated)")
return nothing
end
"""
h5p_set_vol(plist_id::hid_t, new_vol_id::hid_t, new_vol_info::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Pset_vol`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga8aaa97e70b2544c3d95d908e1ae5b0f0).
"""
function h5p_set_vol(plist_id, new_vol_id, new_vol_info)
lock(liblock)
var"#status#" = try
ccall((:H5Pset_vol, libhdf5), herr_t, (hid_t, hid_t, Ptr{Cvoid}), plist_id, new_vol_id, new_vol_info)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_set_vol (not annotated)")
return nothing
end
"""
h5p_add_merge_committed_dtype_path(plist_id::hid_t, path::Cstring)
See `libhdf5` documentation for [`H5Padd_merge_committed_dtype_path`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_y_p_l.html#gab89c9debe50afca848151ff046afc82f).
"""
function h5p_add_merge_committed_dtype_path(plist_id, path)
lock(liblock)
var"#status#" = try
ccall((:H5Padd_merge_committed_dtype_path, libhdf5), herr_t, (hid_t, Cstring), plist_id, path)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_add_merge_committed_dtype_path (not annotated)")
return nothing
end
"""
h5p_all_filters_avail(plist_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Pall_filters_avail`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga70f5346250698afc950532e9593c3988).
"""
function h5p_all_filters_avail(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pall_filters_avail, libhdf5), htri_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error in h5p_all_filters_avail (not annotated)")
return var"#status#" > 0
end
"""
h5p_close(id::hid_t)
See `libhdf5` documentation for [`H5Pclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r.html#ga5dce61149211d3ef319452aa598887fb).
"""
function h5p_close(id)
lock(liblock)
var"#status#" = try
ccall((:H5Pclose, libhdf5), herr_t, (hid_t,), id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing property list")
return nothing
end
"""
h5p_close_class(plist_id::hid_t)
See `libhdf5` documentation for [`H5Pclose_class`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#gaa7978af06720106af04b9d034e57fcfa).
"""
function h5p_close_class(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pclose_class, libhdf5), herr_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_close_class (not annotated)")
return nothing
end
"""
h5p_copy(plist_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Pcopy`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r.html#gad2663ccbcbf76b96cde4c104588ae21b).
"""
function h5p_copy(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pcopy, libhdf5), hid_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_copy (not annotated)")
return var"#status#"
end
"""
h5p_copy_prop(dst_id::hid_t, src_id::hid_t, name::Cstring)
See `libhdf5` documentation for [`H5Pcopy_prop`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga339a27b865cf2d57ff95a6b26e94a581).
"""
function h5p_copy_prop(dst_id, src_id, name)
lock(liblock)
var"#status#" = try
ccall((:H5Pcopy_prop, libhdf5), herr_t, (hid_t, hid_t, Cstring), dst_id, src_id, name)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_copy_prop (not annotated)")
return nothing
end
"""
h5p_create(cls_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Pcreate`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r.html#gaf1b11da01d4d45d788c45f8bc5f0cbfa).
"""
function h5p_create(cls_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pcreate, libhdf5), hid_t, (hid_t,), cls_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error creating property list")
return var"#status#"
end
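#=
Illustrative sketch (not part of the generated bindings): a property list created with
`h5p_create` can be duplicated with `h5p_copy` and must eventually be released with
`h5p_close`. The class constant `H5P_FILE_ACCESS` is assumed to be defined elsewhere in
this API module.

    fapl  = h5p_create(H5P_FILE_ACCESS)
    fapl2 = h5p_copy(fapl)
    h5p_equal(fapl, fapl2)   # a fresh copy should compare equal
    h5p_close(fapl2)
    h5p_close(fapl)
=#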
"""
h5p_create_class(parent::hid_t, name::Cstring, create::H5P_cls_create_func_t, create_data::Ptr{Cvoid}, copy::H5P_cls_copy_func_t, copy_data::Ptr{Cvoid}, close::H5P_cls_close_func_t, close_data::Ptr{Cvoid}) -> hid_t
See `libhdf5` documentation for [`H5Pcreate_class`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga62a1119b6ad2c33bcc9149df5e24ae12).
"""
function h5p_create_class(parent, name, create, create_data, copy, copy_data, close, close_data)
lock(liblock)
var"#status#" = try
ccall((:H5Pcreate_class, libhdf5), hid_t, (hid_t, Cstring, H5P_cls_create_func_t, Ptr{Cvoid}, H5P_cls_copy_func_t, Ptr{Cvoid}, H5P_cls_close_func_t, Ptr{Cvoid}), parent, name, create, create_data, copy, copy_data, close, close_data)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_create_class (not annotated)")
return var"#status#"
end
"""
h5p_decode(buf::Ptr{Cvoid}) -> hid_t
See `libhdf5` documentation for [`H5Pdecode`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r.html#gafd75009eb96922e72beffa78718d4bdd).
"""
function h5p_decode(buf)
lock(liblock)
var"#status#" = try
ccall((:H5Pdecode, libhdf5), hid_t, (Ptr{Cvoid},), buf)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error in h5p_decode (not annotated)")
return var"#status#"
end
"""
h5p_encode(plist_id::hid_t, buf::Ptr{Cvoid}, nalloc::Ptr{Csize_t})
See `libhdf5` documentation for [`H5Pencode1`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#gaf40518cb161ee9508da4b9c0d34553bf).
"""
function h5p_encode(plist_id, buf, nalloc)
lock(liblock)
var"#status#" = try
ccall((:H5Pencode1, libhdf5), herr_t, (hid_t, Ptr{Cvoid}, Ptr{Csize_t}), plist_id, buf, nalloc)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_encode1 (not annotated)")
return nothing
end
"""
h5p_encode(plist_id::hid_t, buf::Ptr{Cvoid}, nalloc::Ptr{Csize_t}, fapl_id::hid_t)
See `libhdf5` documentation for [`H5Pencode2`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r.html#ga37b1b6666e62a86389015e7dfc384faa).
"""
function h5p_encode(plist_id, buf, nalloc, fapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pencode2, libhdf5), herr_t, (hid_t, Ptr{Cvoid}, Ptr{Csize_t}, hid_t), plist_id, buf, nalloc, fapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_encode2 (not annotated)")
return nothing
end
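#=
Illustrative sketch (not part of the generated bindings): `h5p_encode` follows the usual
two-call pattern, first querying the required buffer size and then filling the buffer,
which `h5p_decode` can later turn back into a property list. `plist` is assumed to be a
valid property-list identifier obtained elsewhere (e.g. from `h5p_create`).

    nalloc = Ref{Csize_t}(0)
    h5p_encode(plist, C_NULL, nalloc)    # query the required size
    buf = Vector{UInt8}(undef, nalloc[])
    h5p_encode(plist, buf, nalloc)       # encode into the buffer
    restored = h5p_decode(buf)           # returns a new id; close it when done
=#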
"""
h5p_equal(id1::hid_t, id2::hid_t) -> Bool
See `libhdf5` documentation for [`H5Pequal`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga9425ef9f3bc3ee661eca6be654aeae20).
"""
function h5p_equal(id1, id2)
lock(liblock)
var"#status#" = try
ccall((:H5Pequal, libhdf5), htri_t, (hid_t, hid_t), id1, id2)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error in h5p_equal (not annotated)")
return var"#status#" > 0
end
"""
h5p_exist(plist_id::hid_t, name::Cstring) -> Bool
See `libhdf5` documentation for [`H5Pexist`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#gae135cf333c12375f3808cfe931ea9190).
"""
function h5p_exist(plist_id, name)
lock(liblock)
var"#status#" = try
ccall((:H5Pexist, libhdf5), htri_t, (hid_t, Cstring), plist_id, name)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error in h5p_exist (not annotated)")
return var"#status#" > 0
end
"""
h5p_fill_value_defined(plist::hid_t, status::Ptr{H5D_fill_value_t})
See `libhdf5` documentation for [`H5Pfill_value_defined`](https://docs.hdfgroup.org/hdf5/v1_14/group___d_c_p_l.html#ga14f9bc2a0d6f9e62ab95661fc1045ad6).
"""
function h5p_fill_value_defined(plist, status)
lock(liblock)
var"#status#" = try
ccall((:H5Pfill_value_defined, libhdf5), herr_t, (hid_t, Ptr{H5D_fill_value_t}), plist, status)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_fill_value_defined (not annotated)")
return nothing
end
"""
h5p_free_merge_committed_dtype_paths(plist_id::hid_t)
See `libhdf5` documentation for [`H5Pfree_merge_committed_dtype_paths`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_y_p_l.html#ga4b8d6496ac56b167dba16729a8bd7adf).
"""
function h5p_free_merge_committed_dtype_paths(plist_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pfree_merge_committed_dtype_paths, libhdf5), herr_t, (hid_t,), plist_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_free_merge_committed_dtype_paths (not annotated)")
return nothing
end
"""
h5p_insert(plist_id::hid_t, name::Cstring, size::Csize_t, value::Ptr{Cvoid}, prp_set::H5P_prp_set_func_t, prp_get::H5P_prp_get_func_t, prp_delete::H5P_prp_delete_func_t, prp_copy::H5P_prp_copy_func_t, prp_close::H5P_prp_close_func_t)
See `libhdf5` documentation for [`H5Pinsert1`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga6ba9694c03ae97c9f514470366a909f9).
"""
function h5p_insert(plist_id, name, size, value, prp_set, prp_get, prp_delete, prp_copy, prp_close)
lock(liblock)
var"#status#" = try
ccall((:H5Pinsert1, libhdf5), herr_t, (hid_t, Cstring, Csize_t, Ptr{Cvoid}, H5P_prp_set_func_t, H5P_prp_get_func_t, H5P_prp_delete_func_t, H5P_prp_copy_func_t, H5P_prp_close_func_t), plist_id, name, size, value, prp_set, prp_get, prp_delete, prp_copy, prp_close)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_insert1 (not annotated)")
return nothing
end
"""
h5p_insert(plist_id::hid_t, name::Cstring, size::Csize_t, value::Ptr{Cvoid}, set::H5P_prp_set_func_t, get::H5P_prp_get_func_t, prp_del::H5P_prp_delete_func_t, copy::H5P_prp_copy_func_t, compare::H5P_prp_compare_func_t, close::H5P_prp_close_func_t)
See `libhdf5` documentation for [`H5Pinsert2`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga930e15d5f994e223bea80621ef3065d4).
"""
function h5p_insert(plist_id, name, size, value, set, get, prp_del, copy, compare, close)
lock(liblock)
var"#status#" = try
ccall((:H5Pinsert2, libhdf5), herr_t, (hid_t, Cstring, Csize_t, Ptr{Cvoid}, H5P_prp_set_func_t, H5P_prp_get_func_t, H5P_prp_delete_func_t, H5P_prp_copy_func_t, H5P_prp_compare_func_t, H5P_prp_close_func_t), plist_id, name, size, value, set, get, prp_del, copy, compare, close)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_insert2 (not annotated)")
return nothing
end
"""
h5p_isa_class(plist_id::hid_t, pclass_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Pisa_class`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga68bc660e09b46dcb5ea3f25b245aff60).
"""
function h5p_isa_class(plist_id, pclass_id)
lock(liblock)
var"#status#" = try
ccall((:H5Pisa_class, libhdf5), htri_t, (hid_t, hid_t), plist_id, pclass_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error in h5p_isa_class (not annotated)")
return var"#status#" > 0
end
"""
h5p_iterate(id::hid_t, idx::Ptr{Cint}, iter_func::H5P_iterate_t, iter_data::Ptr{Cvoid}) -> Int
See `libhdf5` documentation for [`H5Piterate`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga1c52aa0f6d1839798982973d7db9569e).
"""
function h5p_iterate(id, idx, iter_func, iter_data)
lock(liblock)
var"#status#" = try
ccall((:H5Piterate, libhdf5), Cint, (hid_t, Ptr{Cint}, H5P_iterate_t, Ptr{Cvoid}), id, idx, iter_func, iter_data)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error in h5p_iterate (not annotated)")
return Int(var"#status#")
end
"""
h5p_modify_filter(plist_id::hid_t, filter_id::H5Z_filter_t, flags::Cuint, cd_nelmts::Csize_t, cd_values::Ptr{Cuint})
See `libhdf5` documentation for [`H5Pmodify_filter`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#ga12a358b3725a889c1768bbd2b5f541d8).
"""
function h5p_modify_filter(plist_id, filter_id, flags, cd_nelmts, cd_values)
lock(liblock)
var"#status#" = try
ccall((:H5Pmodify_filter, libhdf5), herr_t, (hid_t, H5Z_filter_t, Cuint, Csize_t, Ptr{Cuint}), plist_id, filter_id, flags, cd_nelmts, cd_values)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error modifying filter")
return nothing
end
"""
h5p_register(cls_id::hid_t, name::Cstring, size::Csize_t, def_value::Ptr{Cvoid}, prp_create::H5P_prp_create_func_t, prp_set::H5P_prp_set_func_t, prp_get::H5P_prp_get_func_t, prp_del::H5P_prp_delete_func_t, prp_copy::H5P_prp_copy_func_t, prp_close::H5P_prp_close_func_t)
See `libhdf5` documentation for [`H5Pregister1`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga91799f6cda78911e9ecc2cfaaea3a3b5).
"""
function h5p_register(cls_id, name, size, def_value, prp_create, prp_set, prp_get, prp_del, prp_copy, prp_close)
lock(liblock)
var"#status#" = try
ccall((:H5Pregister1, libhdf5), herr_t, (hid_t, Cstring, Csize_t, Ptr{Cvoid}, H5P_prp_create_func_t, H5P_prp_set_func_t, H5P_prp_get_func_t, H5P_prp_delete_func_t, H5P_prp_copy_func_t, H5P_prp_close_func_t), cls_id, name, size, def_value, prp_create, prp_set, prp_get, prp_del, prp_copy, prp_close)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_register1 (not annotated)")
return nothing
end
"""
h5p_register(cls_id::hid_t, name::Cstring, size::Csize_t, def_value::Ptr{Cvoid}, create::H5P_prp_create_func_t, set::H5P_prp_set_func_t, get::H5P_prp_get_func_t, prp_del::H5P_prp_delete_func_t, copy::H5P_prp_copy_func_t, compare::H5P_prp_compare_func_t, close::H5P_prp_close_func_t)
See `libhdf5` documentation for [`H5Pregister2`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#gaac3f957a5d3cbb4adc8b7ba2aa5f1719).
"""
function h5p_register(cls_id, name, size, def_value, create, set, get, prp_del, copy, compare, close)
lock(liblock)
var"#status#" = try
ccall((:H5Pregister2, libhdf5), herr_t, (hid_t, Cstring, Csize_t, Ptr{Cvoid}, H5P_prp_create_func_t, H5P_prp_set_func_t, H5P_prp_get_func_t, H5P_prp_delete_func_t, H5P_prp_copy_func_t, H5P_prp_compare_func_t, H5P_prp_close_func_t), cls_id, name, size, def_value, create, set, get, prp_del, copy, compare, close)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_register2 (not annotated)")
return nothing
end
"""
h5p_remove(plist_id::hid_t, name::Cstring)
See `libhdf5` documentation for [`H5Premove`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#ga2bf026d42a7823e89b6070a4cafc284b).
"""
function h5p_remove(plist_id, name)
lock(liblock)
var"#status#" = try
ccall((:H5Premove, libhdf5), herr_t, (hid_t, Cstring), plist_id, name)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_remove (not annotated)")
return nothing
end
"""
h5p_remove_filter(plist_id::hid_t, filter_id::H5Z_filter_t)
See `libhdf5` documentation for [`H5Premove_filter`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_c_p_l.html#gabffbf6d013c090fa052ac4bafce8e532).
"""
function h5p_remove_filter(plist_id, filter_id)
lock(liblock)
var"#status#" = try
ccall((:H5Premove_filter, libhdf5), herr_t, (hid_t, H5Z_filter_t), plist_id, filter_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error removing filter")
return nothing
end
"""
h5p_unregister(pclass_id::hid_t, name::Cstring)
See `libhdf5` documentation for [`H5Punregister`](https://docs.hdfgroup.org/hdf5/v1_14/group___p_l_c_r_a.html#gaefb44d3535e309ba4041e420f3712aea).
"""
function h5p_unregister(pclass_id, name)
lock(liblock)
var"#status#" = try
ccall((:H5Punregister, libhdf5), herr_t, (hid_t, Cstring), pclass_id, name)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error in h5p_unregister (not annotated)")
return nothing
end
"""
h5pl_set_loading_state(plugin_control_mask::Cuint)
See `libhdf5` documentation for [`H5PLset_loading_state`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#ga3ee7276ca1168b175661db3866f8db0e).
"""
function h5pl_set_loading_state(plugin_control_mask)
lock(liblock)
var"#status#" = try
ccall((:H5PLset_loading_state, libhdf5), herr_t, (Cuint,), plugin_control_mask)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting plugin loading state")
return nothing
end
"""
h5pl_get_loading_state(plugin_control_mask::Ptr{Cuint})
See `libhdf5` documentation for [`H5PLget_loading_state`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#gadde42938cf1cc41fa392e8719050b52a).
"""
function h5pl_get_loading_state(plugin_control_mask)
lock(liblock)
var"#status#" = try
ccall((:H5PLget_loading_state, libhdf5), herr_t, (Ptr{Cuint},), plugin_control_mask)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting plugin loading state")
return nothing
end
"""
h5pl_append(search_path::Cstring)
See `libhdf5` documentation for [`H5PLappend`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#gac74200fdc02f794f3fae753fe8b850b0).
"""
function h5pl_append(search_path)
lock(liblock)
var"#status#" = try
ccall((:H5PLappend, libhdf5), herr_t, (Cstring,), search_path)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error appending plugin path")
return nothing
end
"""
h5pl_prepend(search_path::Cstring)
See `libhdf5` documentation for [`H5PLprepend`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#ga1f2300ef2de6e430af330de7d194576f).
"""
function h5pl_prepend(search_path)
lock(liblock)
var"#status#" = try
ccall((:H5PLprepend, libhdf5), herr_t, (Cstring,), search_path)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error prepending plugin path")
return nothing
end
"""
h5pl_replace(search_path::Cstring, index::Cuint)
See `libhdf5` documentation for [`H5PLreplace`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#gab0f8d4e8d0b81cb55cf8b9de5095dc0b).
"""
function h5pl_replace(search_path, index)
lock(liblock)
var"#status#" = try
ccall((:H5PLreplace, libhdf5), herr_t, (Cstring, Cuint), search_path, index)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error replacing plugin path")
return nothing
end
"""
h5pl_insert(search_path::Cstring, index::Cuint)
See `libhdf5` documentation for [`H5PLinsert`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#gacc5153b0db6b3f876c3980bf34f931fc).
"""
function h5pl_insert(search_path, index)
lock(liblock)
var"#status#" = try
ccall((:H5PLinsert, libhdf5), herr_t, (Cstring, Cuint), search_path, index)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error inserting plugin path")
return nothing
end
"""
h5pl_remove(index::Cuint)
See `libhdf5` documentation for [`H5PLremove`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#gaa566196b7c6970c255feac4cf9f3bf40).
"""
function h5pl_remove(index)
lock(liblock)
var"#status#" = try
ccall((:H5PLremove, libhdf5), herr_t, (Cuint,), index)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error removing plugin path")
return nothing
end
"""
h5pl_get(index::Cuint, path_buf::Ptr{Cchar}, buf_size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5PLget`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#ga64a3c5450d91455624ecba582553d905).
"""
function h5pl_get(index, path_buf, buf_size)
lock(liblock)
var"#status#" = try
ccall((:H5PLget, libhdf5), Cssize_t, (Cuint, Ptr{Cchar}, Csize_t), index, path_buf, buf_size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Error getting plugin path")
return var"#status#"
end
"""
h5pl_size(num_paths::Ptr{Cuint})
See `libhdf5` documentation for [`H5PLsize`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_p_l.html#ga30b799ad7e9645312ef8975a610b4b18).
"""
function h5pl_size(num_paths)
lock(liblock)
var"#status#" = try
ccall((:H5PLsize, libhdf5), herr_t, (Ptr{Cuint},), num_paths)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting the number of plugin paths")
return nothing
end
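#=
Illustrative sketch (not part of the generated bindings): listing the plugin search
paths by combining `h5pl_size` with the two-call length/fill pattern of `h5pl_get`.

    npaths = Ref{Cuint}(0)
    h5pl_size(npaths)
    for i in 0:Int(npaths[]) - 1
        len = h5pl_get(i, C_NULL, 0)        # query the path length
        buf = Vector{Cchar}(undef, len + 1)
        h5pl_get(i, buf, length(buf))
        println(unsafe_string(pointer(buf)))
    end
=#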
"""
h5r_create(ref::Ptr{Cvoid}, loc_id::hid_t, pathname::Cstring, ref_type::Cint, space_id::hid_t)
See `libhdf5` documentation for [`H5Rcreate`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_r.html#ga0ac6997b5de26b11d91a95de2869950d).
"""
function h5r_create(ref, loc_id, pathname, ref_type, space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Rcreate, libhdf5), herr_t, (Ptr{Cvoid}, hid_t, Cstring, Cint, hid_t), ref, loc_id, pathname, ref_type, space_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error creating reference to object ", h5i_get_name(loc_id), "/", pathname))
return nothing
end
"""
h5r_dereference(obj_id::hid_t, oapl_id::hid_t, ref_type::Cint, ref::Ptr{Cvoid}) -> hid_t
See `libhdf5` documentation for [`H5Rdereference2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_r.html#ga9b09586f7b6ec708434dd8f95f58a9b7).
"""
function h5r_dereference(obj_id, oapl_id, ref_type, ref)
lock(liblock)
var"#status#" = try
ccall((:H5Rdereference2, libhdf5), hid_t, (hid_t, hid_t, Cint, Ptr{Cvoid}), obj_id, oapl_id, ref_type, ref)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error dereferencing object")
return var"#status#"
end
"""
h5r_get_obj_type(loc_id::hid_t, ref_type::Cint, ref::Ptr{Cvoid}, obj_type::Ptr{Cint})
See `libhdf5` documentation for [`H5Rget_obj_type2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_r.html#ga766e39a76bcdd68dc514425353eff807).
"""
function h5r_get_obj_type(loc_id, ref_type, ref, obj_type)
lock(liblock)
var"#status#" = try
ccall((:H5Rget_obj_type2, libhdf5), herr_t, (hid_t, Cint, Ptr{Cvoid}, Ptr{Cint}), loc_id, ref_type, ref, obj_type)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting object type")
return nothing
end
"""
h5r_get_region(loc_id::hid_t, ref_type::Cint, ref::Ptr{Cvoid}) -> hid_t
See `libhdf5` documentation for [`H5Rget_region`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_r.html#ga1702d609e85b9edd3d1e526a0276484f).
"""
function h5r_get_region(loc_id, ref_type, ref)
lock(liblock)
var"#status#" = try
ccall((:H5Rget_region, libhdf5), hid_t, (hid_t, Cint, Ptr{Cvoid}), loc_id, ref_type, ref)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting region from reference")
return var"#status#"
end
"""
h5s_close(space_id::hid_t)
See `libhdf5` documentation for [`H5Sclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga2b53128a39c8f104c1c9c2a91590fcc1).
"""
function h5s_close(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sclose, libhdf5), herr_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing dataspace")
return nothing
end
"""
h5s_combine_hyperslab(dspace_id::hid_t, seloper::H5S_seloper_t, start::Ptr{hsize_t}, stride::Ptr{hsize_t}, count::Ptr{hsize_t}, block::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Scombine_hyperslab`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gae7578a93bb7b22989bcb737f26b60ad1).
"""
function h5s_combine_hyperslab(dspace_id, seloper, start, stride, count, block)
lock(liblock)
var"#status#" = try
ccall((:H5Scombine_hyperslab, libhdf5), herr_t, (hid_t, H5S_seloper_t, Ptr{hsize_t}, Ptr{hsize_t}, Ptr{hsize_t}, Ptr{hsize_t}), dspace_id, seloper, start, stride, count, block)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error selecting hyperslab")
return nothing
end
@static if v"1.10.7" ≤ _libhdf5_build_ver
@doc """
h5s_combine_select(space1_id::hid_t, op::H5S_seloper_t, space2_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Scombine_select`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga356600d12d3cf0db53cc27b212d75b08).
"""
function h5s_combine_select(space1_id, op, space2_id)
lock(liblock)
var"#status#" = try
ccall((:H5Scombine_select, libhdf5), hid_t, (hid_t, H5S_seloper_t, hid_t), space1_id, op, space2_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error combining dataspaces")
return var"#status#"
end
end
"""
h5s_copy(space_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Scopy`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gae5e26a8f8191768a600d40ec518ed66b).
"""
function h5s_copy(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Scopy, libhdf5), hid_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error copying dataspace")
return var"#status#"
end
"""
h5s_create(class::Cint) -> hid_t
See `libhdf5` documentation for [`H5Screate`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gabee514327cba34ca9951b24fa14fb083).
"""
function h5s_create(class)
lock(liblock)
var"#status#" = try
ccall((:H5Screate, libhdf5), hid_t, (Cint,), class)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error creating dataspace")
return var"#status#"
end
"""
h5s_create_simple(rank::Cint, current_dims::Ptr{hsize_t}, maximum_dims::Ptr{hsize_t}) -> hid_t
See `libhdf5` documentation for [`H5Screate_simple`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga8e35eea5738b4805856eac7d595254ae).
"""
function h5s_create_simple(rank, current_dims, maximum_dims)
lock(liblock)
var"#status#" = try
ccall((:H5Screate_simple, libhdf5), hid_t, (Cint, Ptr{hsize_t}, Ptr{hsize_t}), rank, current_dims, maximum_dims)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error creating simple dataspace")
return var"#status#"
end
"""
h5s_extent_copy(dst::hid_t, src::hid_t)
See `libhdf5` documentation for [`H5Sextent_copy`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga0eae5447eaabaa9444fac0464cd1b8d5).
"""
function h5s_extent_copy(dst, src)
lock(liblock)
var"#status#" = try
ccall((:H5Sextent_copy, libhdf5), herr_t, (hid_t, hid_t), dst, src)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error copying extent")
return nothing
end
"""
h5s_extent_equal(space1_id::hid_t, space2_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Sextent_equal`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gae58bca0c154ceaed9ad36c58c78e145c).
"""
function h5s_extent_equal(space1_id, space2_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sextent_equal, libhdf5), htri_t, (hid_t, hid_t), space1_id, space2_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error comparing dataspaces")
return var"#status#" > 0
end
"""
h5s_get_regular_hyperslab(space_id::hid_t, start::Ptr{hsize_t}, stride::Ptr{hsize_t}, count::Ptr{hsize_t}, block::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sget_regular_hyperslab`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gabc974bbc041538a1d3032729df2ddfc0).
"""
function h5s_get_regular_hyperslab(space_id, start, stride, count, block)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_regular_hyperslab, libhdf5), herr_t, (hid_t, Ptr{hsize_t}, Ptr{hsize_t}, Ptr{hsize_t}, Ptr{hsize_t}), space_id, start, stride, count, block)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting regular hyperslab selection")
return nothing
end
"""
h5s_get_select_bounds(space_id::hid_t, starts::Ptr{hsize_t}, ends::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sget_select_bounds`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga645591ec939b89732c10efd5867a6205).
"""
function h5s_get_select_bounds(space_id, starts, ends)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_bounds, libhdf5), herr_t, (hid_t, Ptr{hsize_t}, Ptr{hsize_t}), space_id, starts, ends)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting bounding box for selection")
return nothing
end
"""
h5s_get_select_elem_npoints(space_id::hid_t) -> hssize_t
See `libhdf5` documentation for [`H5Sget_select_elem_npoints`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga217b839584cd7c7995b47fc30fe92f4c).
"""
function h5s_get_select_elem_npoints(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_elem_npoints, libhdf5), hssize_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < hssize_t(0) && @h5error("Error getting number of elements in dataspace selection")
return var"#status#"
end
"""
h5s_get_select_elem_pointlist(space_id::hid_t, startpoint::hsize_t, numpoints::hsize_t, buf::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sget_select_elem_pointlist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga61459c488147254d1d06537a9ab6e2d4).
"""
function h5s_get_select_elem_pointlist(space_id, startpoint, numpoints, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_elem_pointlist, libhdf5), herr_t, (hid_t, hsize_t, hsize_t, Ptr{hsize_t}), space_id, startpoint, numpoints, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting list of element points")
return nothing
end
"""
h5s_get_select_hyper_blocklist(space_id::hid_t, startblock::hsize_t, numblocks::hsize_t, buf::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sget_select_hyper_blocklist`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga8534829a8db2eca8e987bb9fe8a3d628).
"""
function h5s_get_select_hyper_blocklist(space_id, startblock, numblocks, buf)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_hyper_blocklist, libhdf5), herr_t, (hid_t, hsize_t, hsize_t, Ptr{hsize_t}), space_id, startblock, numblocks, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting list of hyperslab blocks")
return nothing
end
"""
h5s_get_select_hyper_nblocks(space_id::hid_t) -> hssize_t
See `libhdf5` documentation for [`H5Sget_select_hyper_nblocks`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gad873b2f3b82ee8c426c26ceeb1c67f86).
"""
function h5s_get_select_hyper_nblocks(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_hyper_nblocks, libhdf5), hssize_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < hssize_t(0) && @h5error("Error getting number of selected blocks")
return var"#status#"
end
"""
h5s_get_select_npoints(space_id::hid_t) -> hsize_t
See `libhdf5` documentation for [`H5Sget_select_npoints`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga1a44dde97206f40f366f99d9c39b6046).
"""
function h5s_get_select_npoints(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_npoints, libhdf5), hsize_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" == -1 % hsize_t && @h5error("Error getting the number of selected points")
return var"#status#"
end
"""
h5s_get_select_type(space_id::hid_t) -> H5S_sel_type
See `libhdf5` documentation for [`H5Sget_select_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga51ae555e5b2492d95c7fefab2e0d5018).
"""
function h5s_get_select_type(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_select_type, libhdf5), H5S_sel_type, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < H5S_sel_type(0) && @h5error("Error getting the selection type")
return var"#status#"
end
"""
h5s_get_simple_extent_dims(space_id::hid_t, dims::Ptr{hsize_t}, maxdims::Ptr{hsize_t}) -> Int
See `libhdf5` documentation for [`H5Sget_simple_extent_dims`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gac494409b615d8e67c5edd9eb2848b2f3).
"""
function h5s_get_simple_extent_dims(space_id, dims, maxdims)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_simple_extent_dims, libhdf5), Cint, (hid_t, Ptr{hsize_t}, Ptr{hsize_t}), space_id, dims, maxdims)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting the dimensions for a dataspace")
return Int(var"#status#")
end
"""
h5s_get_simple_extent_ndims(space_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Sget_simple_extent_ndims`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gae5282a81692b80b5b19dd12d05b9b28e).
"""
function h5s_get_simple_extent_ndims(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_simple_extent_ndims, libhdf5), Cint, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting the number of dimensions for a dataspace")
return Int(var"#status#")
end
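#=
Illustrative sketch (not part of the generated bindings): the dimensions of a simple
dataspace are usually queried by first asking for the rank and then allocating the
output buffers. `space` is assumed to be a valid dataspace identifier.

    ndims   = h5s_get_simple_extent_ndims(space)
    dims    = Vector{hsize_t}(undef, ndims)
    maxdims = Vector{hsize_t}(undef, ndims)
    h5s_get_simple_extent_dims(space, dims, maxdims)   # dims are in C (row-major) order
=#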
"""
h5s_get_simple_extent_type(space_id::hid_t) -> H5S_class_t
See `libhdf5` documentation for [`H5Sget_simple_extent_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gaf63af02b385e80c8c10b1c43763c251f).
"""
function h5s_get_simple_extent_type(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sget_simple_extent_type, libhdf5), H5S_class_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < H5S_class_t(0) && @h5error("Error getting the dataspace type")
return var"#status#"
end
"""
h5s_is_regular_hyperslab(space_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Sis_regular_hyperslab`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga8a5bc33fae4be442093329f2cfec3f49).
"""
function h5s_is_regular_hyperslab(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sis_regular_hyperslab, libhdf5), htri_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether dataspace is a regular hyperslab")
return var"#status#" > 0
end
"""
h5s_is_simple(space_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Sis_simple`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gab0b1560f7c8402986f332522e2adae1d).
"""
function h5s_is_simple(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sis_simple, libhdf5), htri_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether dataspace is simple")
return var"#status#" > 0
end
"""
h5s_modify_select(space_id::hid_t, op::H5S_seloper_t, space2_id::hid_t)
See `libhdf5` documentation for [`H5Smodify_select`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga0ccb190f72fe41a927407ffb9f19ef1b).
"""
function h5s_modify_select(space_id, op, space2_id)
lock(liblock)
var"#status#" = try
ccall((:H5Smodify_select, libhdf5), herr_t, (hid_t, H5S_seloper_t, hid_t), space_id, op, space2_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error modifying selection")
return nothing
end
"""
h5s_offset_simple(space_id::hid_t, offset::Ptr{hssize_t})
See `libhdf5` documentation for [`H5Soffset_simple`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga8e31da08f4110c3c7dfb18e9758e180d).
"""
function h5s_offset_simple(space_id, offset)
lock(liblock)
var"#status#" = try
ccall((:H5Soffset_simple, libhdf5), herr_t, (hid_t, Ptr{hssize_t}), space_id, offset)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error offsetting simple dataspace extent")
return nothing
end
"""
h5s_select_adjust(space_id::hid_t, offset::Ptr{hssize_t})
See `libhdf5` documentation for [`H5Sselect_adjust`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga64f08c187b899f2728d4ac016d44f890).
"""
function h5s_select_adjust(space_id, offset)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_adjust, libhdf5), herr_t, (hid_t, Ptr{hssize_t}), space_id, offset)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error adjusting selection offset")
return nothing
end
"""
h5s_select_all(space_id::hid_t)
See `libhdf5` documentation for [`H5Sselect_all`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gae183b79831506fd4b0c3ba9821eab33e).
"""
function h5s_select_all(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_all, libhdf5), herr_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error selecting all of dataspace")
return nothing
end
"""
h5s_select_copy(dst::hid_t, src::hid_t)
See `libhdf5` documentation for [`H5Sselect_copy`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga57e5eba2d1b282803835ba3f7e0b9bfa).
"""
function h5s_select_copy(dst, src)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_copy, libhdf5), herr_t, (hid_t, hid_t), dst, src)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error copying selection")
return nothing
end
"""
h5s_select_elements(space_id::hid_t, op::H5S_seloper_t, num_elem::Csize_t, coord::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sselect_elements`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga2f4407dd73d0ec37e5d9e80e4382483d).
"""
function h5s_select_elements(space_id, op, num_elem, coord)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_elements, libhdf5), herr_t, (hid_t, H5S_seloper_t, Csize_t, Ptr{hsize_t}), space_id, op, num_elem, coord)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error selecting elements")
return nothing
end
"""
h5s_select_hyperslab(dspace_id::hid_t, seloper::H5S_seloper_t, start::Ptr{hsize_t}, stride::Ptr{hsize_t}, count::Ptr{hsize_t}, block::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sselect_hyperslab`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga6adfdf1b95dc108a65bf66e97d38536d).
"""
function h5s_select_hyperslab(dspace_id, seloper, start, stride, count, block)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_hyperslab, libhdf5), herr_t, (hid_t, H5S_seloper_t, Ptr{hsize_t}, Ptr{hsize_t}, Ptr{hsize_t}, Ptr{hsize_t}), dspace_id, seloper, start, stride, count, block)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error selecting hyperslab")
return nothing
end
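#=
Illustrative sketch (not part of the generated bindings): creating a simple dataspace
and selecting a contiguous hyperslab on it. The selection operator `H5S_SELECT_SET` is
assumed to be defined elsewhere in this API module; passing `C_NULL` for stride and
block selects unit stride and 1-element blocks.

    dims  = hsize_t[10, 20]
    space = h5s_create_simple(length(dims), dims, dims)
    start = hsize_t[0, 0]
    count = hsize_t[5, 10]
    h5s_select_hyperslab(space, H5S_SELECT_SET, start, C_NULL, count, C_NULL)
    h5s_get_select_npoints(space)   # 50 points selected
    h5s_close(space)
=#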
"""
h5s_select_intersect_block(space_id::hid_t, starts::Ptr{hsize_t}, ends::Ptr{hsize_t}) -> Bool
See `libhdf5` documentation for [`H5Sselect_intersect_block`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga51472bcb9af024675fba6294a6aefa5e).
"""
function h5s_select_intersect_block(space_id, starts, ends)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_intersect_block, libhdf5), htri_t, (hid_t, Ptr{hsize_t}, Ptr{hsize_t}), space_id, starts, ends)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether selection intersects block")
return var"#status#" > 0
end
"""
h5s_select_shape_same(space1_id::hid_t, space2_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Sselect_shape_same`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gafc6cafae877900ee060709eaa0b9b261).
"""
function h5s_select_shape_same(space1_id, space2_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_shape_same, libhdf5), htri_t, (hid_t, hid_t), space1_id, space2_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether dataspace shapes are the same")
return var"#status#" > 0
end
"""
h5s_select_valid(spaceid::hid_t) -> Bool
See `libhdf5` documentation for [`H5Sselect_valid`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#ga1abfdec1248c262ca8791b5308e67d4b).
"""
function h5s_select_valid(spaceid)
lock(liblock)
var"#status#" = try
ccall((:H5Sselect_valid, libhdf5), htri_t, (hid_t,), spaceid)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether selection is within extent")
return var"#status#" > 0
end
"""
h5s_set_extent_none(space_id::hid_t)
See `libhdf5` documentation for [`H5Sset_extent_none`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gacf8a5c48d7b7edb5ff73d9d02dbd073d).
"""
function h5s_set_extent_none(space_id)
lock(liblock)
var"#status#" = try
ccall((:H5Sset_extent_none, libhdf5), herr_t, (hid_t,), space_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting dataspace extent to none")
return nothing
end
"""
h5s_set_extent_simple(dspace_id::hid_t, rank::Cint, current_size::Ptr{hsize_t}, maximum_size::Ptr{hsize_t})
See `libhdf5` documentation for [`H5Sset_extent_simple`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_s.html#gaf2526a41d2f4506e2c52098510517343).
"""
function h5s_set_extent_simple(dspace_id, rank, current_size, maximum_size)
lock(liblock)
var"#status#" = try
ccall((:H5Sset_extent_simple, libhdf5), herr_t, (hid_t, Cint, Ptr{hsize_t}, Ptr{hsize_t}), dspace_id, rank, current_size, maximum_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting dataspace size")
return nothing
end
"""
h5t_array_create(basetype_id::hid_t, ndims::Cuint, sz::Ptr{hsize_t}) -> hid_t
See `libhdf5` documentation for [`H5Tarray_create2`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_r_r_a_y.html#ga9d9aea590106fdab7a2c07c04346f618).
"""
function h5t_array_create(basetype_id, ndims, sz)
lock(liblock)
var"#status#" = try
ccall((:H5Tarray_create2, libhdf5), hid_t, (hid_t, Cuint, Ptr{hsize_t}), basetype_id, ndims, sz)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error creating H5T_ARRAY of id ", basetype_id, " and size ", sz))
return var"#status#"
end
"""
h5t_close(dtype_id::hid_t)
See `libhdf5` documentation for [`H5Tclose`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#gafcba4db244f6a4d71e99c6e72b8678f0).
"""
function h5t_close(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tclose, libhdf5), herr_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error closing datatype")
return nothing
end
"""
h5t_committed(dtype_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Tcommitted`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga0eba38d8c49784269e71ac9fa79b0f0a).
"""
function h5t_committed(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tcommitted, libhdf5), htri_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether datatype is committed")
return var"#status#" > 0
end
"""
h5t_commit(loc_id::hid_t, name::Cstring, dtype_id::hid_t, lcpl_id::hid_t, tcpl_id::hid_t, tapl_id::hid_t)
See `libhdf5` documentation for [`H5Tcommit2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga10352b6fa9ac58a7fbd5299496f1df31).
"""
function h5t_commit(loc_id, name, dtype_id, lcpl_id, tcpl_id, tapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tcommit2, libhdf5), herr_t, (hid_t, Cstring, hid_t, hid_t, hid_t, hid_t), loc_id, name, dtype_id, lcpl_id, tcpl_id, tapl_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error committing type")
return nothing
end
"""
h5t_copy(dtype_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Tcopy`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#gaec07efbab84f4e5b4ed22f010786be8e).
"""
function h5t_copy(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tcopy, libhdf5), hid_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error copying datatype")
return var"#status#"
end
"""
h5t_create(class_id::Cint, sz::Csize_t) -> hid_t
See `libhdf5` documentation for [`H5Tcreate`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#gaa9afc38e1a7d35e4d0bec24c569b3c65).
"""
function h5t_create(class_id, sz)
lock(liblock)
var"#status#" = try
ccall((:H5Tcreate, libhdf5), hid_t, (Cint, Csize_t), class_id, sz)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error creating datatype of id ", class_id))
return var"#status#"
end
"""
h5t_enum_insert(dtype_id::hid_t, name::Cstring, value::Ptr{Cvoid})
See `libhdf5` documentation for [`H5Tenum_insert`](https://docs.hdfgroup.org/hdf5/v1_14/group___e_n_u_m.html#ga7bbddcff3a5d18ee983fbe5654fdc41f).
"""
function h5t_enum_insert(dtype_id, name, value)
lock(liblock)
var"#status#" = try
ccall((:H5Tenum_insert, libhdf5), herr_t, (hid_t, Cstring, Ptr{Cvoid}), dtype_id, name, value)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error adding ", name, " to enum datatype"))
return nothing
end
"""
h5t_equal(dtype_id1::hid_t, dtype_id2::hid_t) -> Bool
See `libhdf5` documentation for [`H5Tequal`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#gaa92250f289b557b63cba974defa20b0f).
"""
function h5t_equal(dtype_id1, dtype_id2)
lock(liblock)
var"#status#" = try
ccall((:H5Tequal, libhdf5), htri_t, (hid_t, hid_t), dtype_id1, dtype_id2)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error checking datatype equality")
return var"#status#" > 0
end
"""
h5t_get_array_dims(dtype_id::hid_t, dims::Ptr{hsize_t}) -> Int
See `libhdf5` documentation for [`H5Tget_array_dims2`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_r_r_a_y.html#ga3ea18a56f03d3b9c8f3ff4091c784769).
"""
function h5t_get_array_dims(dtype_id, dims)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_array_dims2, libhdf5), Cint, (hid_t, Ptr{hsize_t}), dtype_id, dims)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting dimensions of array")
return Int(var"#status#")
end
"""
h5t_get_array_ndims(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_array_ndims`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_r_r_a_y.html#gadec89de23da8efaba4677abfd818a9c0).
"""
function h5t_get_array_ndims(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_array_ndims, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting ndims of array")
return Int(var"#status#")
end
"""
h5t_get_class(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_class`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga364545c053f925fec65880b235e37898).
"""
function h5t_get_class(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_class, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting class")
return Int(var"#status#")
end
"""
h5t_get_cset(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_cset`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga5bc2f3e8f708f5bcdd0d8667950310c1).
"""
function h5t_get_cset(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_cset, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting character set encoding")
return Int(var"#status#")
end
"""
h5t_get_ebias(dtype_id::hid_t) -> Csize_t
See `libhdf5` documentation for [`H5Tget_ebias`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga302b1c22cc6007ca69724a9e387e3888).
"""
function h5t_get_ebias(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_ebias, libhdf5), Csize_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
@h5error "Error getting exponent bias"
return var"#status#"
end
"""
h5t_get_fields(dtype_id::hid_t, spos::Ref{Csize_t}, epos::Ref{Csize_t}, esize::Ref{Csize_t}, mpos::Ref{Csize_t}, msize::Ref{Csize_t})
See `libhdf5` documentation for [`H5Tget_fields`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga42e62cb497fdec8f08cb9ac3c6de0e14).
"""
function h5t_get_fields(dtype_id, spos, epos, esize, mpos, msize)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_fields, libhdf5), herr_t, (hid_t, Ref{Csize_t}, Ref{Csize_t}, Ref{Csize_t}, Ref{Csize_t}, Ref{Csize_t}), dtype_id, spos, epos, esize, mpos, msize)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting datatype floating point bit positions")
return nothing
end
"""
h5t_get_member_class(dtype_id::hid_t, index::Cuint) -> Int
See `libhdf5` documentation for [`H5Tget_member_class`](https://docs.hdfgroup.org/hdf5/v1_14/group___c_o_m_p_o_u_n_d.html#gac8476d164fb972fbf7b8c4584b8e916b).
"""
function h5t_get_member_class(dtype_id, index)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_member_class, libhdf5), Cint, (hid_t, Cuint), dtype_id, index)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error(string("Error getting class of compound datatype member #", index))
return Int(var"#status#")
end
"""
h5t_get_member_index(dtype_id::hid_t, membername::Cstring) -> Int
See `libhdf5` documentation for [`H5Tget_member_index`](https://docs.hdfgroup.org/hdf5/v1_14/group___c_o_m_p_e_n_u_m.html#gabe31b13b2b8bf29d1a4c3b04cf917c6c).
"""
function h5t_get_member_index(dtype_id, membername)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_member_index, libhdf5), Cint, (hid_t, Cstring), dtype_id, membername)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error(string("Error getting index of compound datatype member \"", membername, "\""))
return Int(var"#status#")
end
"""
h5t_get_member_offset(dtype_id::hid_t, index::Cuint) -> Csize_t
See `libhdf5` documentation for [`H5Tget_member_offset`](https://docs.hdfgroup.org/hdf5/v1_14/group___c_o_m_p_o_u_n_d.html#ga46cf2a60b54a08695635749c215af4af).
"""
function h5t_get_member_offset(dtype_id, index)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_member_offset, libhdf5), Csize_t, (hid_t, Cuint), dtype_id, index)
finally
unlock(liblock)
end
@h5error "Error getting offset of compound datatype #$(index)"
return var"#status#"
end
"""
h5t_get_member_type(dtype_id::hid_t, index::Cuint) -> hid_t
See `libhdf5` documentation for [`H5Tget_member_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___c_o_m_p_o_u_n_d.html#gaf5de0eabe28246f040342e275b9a63eb).
"""
function h5t_get_member_type(dtype_id, index)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_member_type, libhdf5), hid_t, (hid_t, Cuint), dtype_id, index)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error getting type of compound datatype member #", index))
return var"#status#"
end
"""
h5t_get_native_type(dtype_id::hid_t, direction::Cint) -> hid_t
See `libhdf5` documentation for [`H5Tget_native_type`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga05b99133058637e8daa5d745381ddd3d).
"""
function h5t_get_native_type(dtype_id, direction)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_native_type, libhdf5), hid_t, (hid_t, Cint), dtype_id, direction)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting native type")
return var"#status#"
end
"""
h5t_get_nmembers(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_nmembers`](https://docs.hdfgroup.org/hdf5/v1_14/group___c_o_m_p_e_n_u_m.html#ga21bdfc706f71ebe298a433e74b5bc626).
"""
function h5t_get_nmembers(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_nmembers, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting the number of members")
return Int(var"#status#")
end
"""
h5t_get_offset(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_offset`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga225f0b6d173f90d3696bb68b88ae07c1).
"""
function h5t_get_offset(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_offset, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting offset")
return Int(var"#status#")
end
"""
h5t_get_order(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_order`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gaeb5bd7ec46787a4b6d33947dc73c2a5f).
"""
function h5t_get_order(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_order, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting order")
return Int(var"#status#")
end
"""
h5t_get_precision(dtype_id::hid_t) -> Csize_t
See `libhdf5` documentation for [`H5Tget_precision`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gaac9f5410c8cf456f048011030b7f90f9).
"""
function h5t_get_precision(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_precision, libhdf5), Csize_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
@h5error "Error getting precision"
return var"#status#"
end
"""
h5t_get_sign(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_sign`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga636f7655e706ccf7a3f23566ca561e90).
"""
function h5t_get_sign(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_sign, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting sign")
return Int(var"#status#")
end
"""
h5t_get_size(dtype_id::hid_t) -> Csize_t
See `libhdf5` documentation for [`H5Tget_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga1b971589cd7a86f3e84affdee455564e).
"""
function h5t_get_size(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_size, libhdf5), Csize_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
@h5error "Error getting type size"
return var"#status#"
end
"""
h5t_get_strpad(dtype_id::hid_t) -> Int
See `libhdf5` documentation for [`H5Tget_strpad`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga564b21cc269467c39f59462feb0d5903).
"""
function h5t_get_strpad(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_strpad, libhdf5), Cint, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting string padding")
return Int(var"#status#")
end
"""
h5t_get_super(dtype_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Tget_super`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga331e8f7b388a50af77294018db788de3).
"""
function h5t_get_super(dtype_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tget_super, libhdf5), hid_t, (hid_t,), dtype_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error getting super type")
return var"#status#"
end
"""
h5t_insert(dtype_id::hid_t, fieldname::Cstring, offset::Csize_t, field_id::hid_t)
See `libhdf5` documentation for [`H5Tinsert`](https://docs.hdfgroup.org/hdf5/v1_14/group___c_o_m_p_o_u_n_d.html#ga487d8f64a76f48b6eeb7f402d3b8b081).
"""
function h5t_insert(dtype_id, fieldname, offset, field_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tinsert, libhdf5), herr_t, (hid_t, Cstring, Csize_t, hid_t), dtype_id, fieldname, offset, field_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error(string("Error adding field ", fieldname, " to compound datatype"))
return nothing
end
"""
h5t_is_variable_str(type_id::hid_t) -> Bool
See `libhdf5` documentation for [`H5Tis_variable_str`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gac16f1dd88eda4bc5ae5b325809dc2bee).
"""
function h5t_is_variable_str(type_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tis_variable_str, libhdf5), htri_t, (hid_t,), type_id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Error determining whether string is of variable length")
return var"#status#" > 0
end
"""
h5t_lock(type_id::hid_t)
See `libhdf5` documentation for [`H5Tlock`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga523642dbf4c60a83127fff87664a965b).
"""
function h5t_lock(type_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tlock, libhdf5), herr_t, (hid_t,), type_id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error locking type")
return nothing
end
"""
h5t_open(loc_id::hid_t, name::Cstring, tapl_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Topen2`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#ga7e65e77634f1fb4ba38cbcdab9a59bc2).
"""
function h5t_open(loc_id, name, tapl_id)
lock(liblock)
var"#status#" = try
ccall((:H5Topen2, libhdf5), hid_t, (hid_t, Cstring, hid_t), loc_id, name, tapl_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error(string("Error opening type ", h5i_get_name(loc_id), "/", name))
return var"#status#"
end
"""
h5t_set_cset(dtype_id::hid_t, cset::Cint)
See `libhdf5` documentation for [`H5Tset_cset`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#ga4909c0c3d97c3d212fee032cc8dc031a).
"""
function h5t_set_cset(dtype_id, cset)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_cset, libhdf5), herr_t, (hid_t, Cint), dtype_id, cset)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting character set in datatype")
return nothing
end
"""
h5t_set_ebias(dtype_id::hid_t, ebias::Csize_t)
See `libhdf5` documentation for [`H5Tset_ebias`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gad2c4a8f09672f4166f39efe83d44dba2).
"""
function h5t_set_ebias(dtype_id, ebias)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_ebias, libhdf5), herr_t, (hid_t, Csize_t), dtype_id, ebias)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting datatype floating point exponent bias")
return nothing
end
"""
h5t_set_fields(dtype_id::hid_t, spos::Csize_t, epos::Csize_t, esize::Csize_t, mpos::Csize_t, msize::Csize_t)
See `libhdf5` documentation for [`H5Tset_fields`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gafbdc98b45749e5cfbaf1a8689f3c403d).
"""
function h5t_set_fields(dtype_id, spos, epos, esize, mpos, msize)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_fields, libhdf5), herr_t, (hid_t, Csize_t, Csize_t, Csize_t, Csize_t, Csize_t), dtype_id, spos, epos, esize, mpos, msize)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting datatype floating point bit positions")
return nothing
end
"""
h5t_set_offset(dtype_id::hid_t, offset::Csize_t)
See `libhdf5` documentation for [`H5Tset_offset`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gafd22e4b0aecbe6dad9a899c5bf567e2f).
"""
function h5t_set_offset(dtype_id, offset)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_offset, libhdf5), herr_t, (hid_t, Csize_t), dtype_id, offset)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting offset")
return nothing
end
"""
h5t_set_order(dtype_id::hid_t, order::Cint)
See `libhdf5` documentation for [`H5Tset_order`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gab1aab76b1214a819281f2156c6d45d71).
"""
function h5t_set_order(dtype_id, order)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_order, libhdf5), herr_t, (hid_t, Cint), dtype_id, order)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting order")
return nothing
end
"""
h5t_set_precision(dtype_id::hid_t, sz::Csize_t)
See `libhdf5` documentation for [`H5Tset_precision`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gab0f4dccfc2fb47bf2c7e06c9bf84c1f7).
"""
function h5t_set_precision(dtype_id, sz)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_precision, libhdf5), herr_t, (hid_t, Csize_t), dtype_id, sz)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting precision of datatype")
return nothing
end
"""
h5t_set_size(dtype_id::hid_t, sz::Csize_t)
See `libhdf5` documentation for [`H5Tset_size`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t.html#gae5f38bfd4a4c557496b3194b5180212c).
"""
function h5t_set_size(dtype_id, sz)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_size, libhdf5), herr_t, (hid_t, Csize_t), dtype_id, sz)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting size of datatype")
return nothing
end
"""
h5t_set_strpad(dtype_id::hid_t, sz::Cint)
See `libhdf5` documentation for [`H5Tset_strpad`](https://docs.hdfgroup.org/hdf5/v1_14/group___a_t_o_m.html#gaec9ebf44e766cc5b932d0bf26dcf8700).
"""
function h5t_set_strpad(dtype_id, sz)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_strpad, libhdf5), herr_t, (hid_t, Cint), dtype_id, sz)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting string padding of datatype")
return nothing
end
"""
h5t_set_tag(dtype_id::hid_t, tag::Cstring)
See `libhdf5` documentation for [`H5Tset_tag`](https://docs.hdfgroup.org/hdf5/v1_14/group___o_p_a_q_u_e.html#ga3543ad909983a2a20e651d16502de43d).
"""
function h5t_set_tag(dtype_id, tag)
lock(liblock)
var"#status#" = try
ccall((:H5Tset_tag, libhdf5), herr_t, (hid_t, Cstring), dtype_id, tag)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error setting opaque tag")
return nothing
end
"""
h5t_vlen_create(base_type_id::hid_t) -> hid_t
See `libhdf5` documentation for [`H5Tvlen_create`](https://docs.hdfgroup.org/hdf5/v1_14/group___v_l_e_n.html#ga6841355fa5b3c924876b121dedb8ed2f).
"""
function h5t_vlen_create(base_type_id)
lock(liblock)
var"#status#" = try
ccall((:H5Tvlen_create, libhdf5), hid_t, (hid_t,), base_type_id)
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error creating vlen type")
return var"#status#"
end
"""
h5do_append(dset_id::hid_t, dxpl_id::hid_t, index::Cuint, num_elem::hsize_t, memtype::hid_t, buffer::Ptr{Cvoid})
See `libhdf5` documentation for [`H5DOappend`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_o.html#ga316caac160af15192e0c78228667341e).
"""
function h5do_append(dset_id, dxpl_id, index, num_elem, memtype, buffer)
lock(liblock)
var"#status#" = try
ccall((:H5DOappend, libhdf5_hl), herr_t, (hid_t, hid_t, Cuint, hsize_t, hid_t, Ptr{Cvoid}), dset_id, dxpl_id, index, num_elem, memtype, buffer)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error appending")
return nothing
end
"""
h5do_write_chunk(dset_id::hid_t, dxpl_id::hid_t, filter_mask::UInt32, offset::Ptr{hsize_t}, bufsize::Csize_t, buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5DOwrite_chunk`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_o.html#gabc3c07ed2cf29dd34035194919fbeb31).
"""
function h5do_write_chunk(dset_id, dxpl_id, filter_mask, offset, bufsize, buf)
lock(liblock)
var"#status#" = try
ccall((:H5DOwrite_chunk, libhdf5_hl), herr_t, (hid_t, hid_t, UInt32, Ptr{hsize_t}, Csize_t, Ptr{Cvoid}), dset_id, dxpl_id, filter_mask, offset, bufsize, buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error writing chunk")
return nothing
end
"""
h5ds_attach_scale(did::hid_t, dsid::hid_t, idx::Cuint)
See `libhdf5` documentation for [`H5DSattach_scale`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#ga4149785cd62520d6fc2736489f74e296).
"""
function h5ds_attach_scale(did, dsid, idx)
lock(liblock)
var"#status#" = try
ccall((:H5DSattach_scale, libhdf5_hl), herr_t, (hid_t, hid_t, Cuint), did, dsid, idx)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to attach scale")
return nothing
end
"""
h5ds_detach_scale(did::hid_t, dsid::hid_t, idx::Cuint)
See `libhdf5` documentation for [`H5DSdetach_scale`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#gac94c498a5f8fc38b0619307f2ac1593a).
"""
function h5ds_detach_scale(did, dsid, idx)
lock(liblock)
var"#status#" = try
ccall((:H5DSdetach_scale, libhdf5_hl), herr_t, (hid_t, hid_t, Cuint), did, dsid, idx)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to detach scale")
return nothing
end
"""
h5ds_get_label(did::hid_t, idx::Cuint, label::Ptr{UInt8}, size::hsize_t)
See `libhdf5` documentation for [`H5DSget_label`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#gaaefa5dcd96a7dba034764b70e2f3aa38).
"""
function h5ds_get_label(did, idx, label, size)
lock(liblock)
var"#status#" = try
ccall((:H5DSget_label, libhdf5_hl), herr_t, (hid_t, Cuint, Ptr{UInt8}, hsize_t), did, idx, label, size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to get label")
return nothing
end
"""
h5ds_get_num_scales(did::hid_t, idx::Cuint) -> Int
See `libhdf5` documentation for [`H5DSget_num_scales`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#gaa373f8cb78fa2014c50fe8e3dd85ea90).
"""
function h5ds_get_num_scales(did, idx)
lock(liblock)
var"#status#" = try
ccall((:H5DSget_num_scales, libhdf5_hl), Cint, (hid_t, Cuint), did, idx)
finally
unlock(liblock)
end
var"#status#" < Cint(0) && @h5error("Error getting number of scales")
return Int(var"#status#")
end
"""
h5ds_get_scale_name(did::hid_t, name::Ptr{UInt8}, size::Csize_t) -> Cssize_t
See `libhdf5` documentation for [`H5DSget_scale_name`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#ga81de6ce31478c7bc3a9f80b8c600170a).
"""
function h5ds_get_scale_name(did, name, size)
lock(liblock)
var"#status#" = try
ccall((:H5DSget_scale_name, libhdf5_hl), Cssize_t, (hid_t, Ptr{UInt8}, Csize_t), did, name, size)
finally
unlock(liblock)
end
var"#status#" < Cssize_t(0) && @h5error("Unable to get scale name")
return var"#status#"
end
"""
h5ds_is_attached(did::hid_t, dsid::hid_t, idx::Cuint) -> Bool
See `libhdf5` documentation for [`H5DSis_attached`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#ga76884eddb41d52adf4eb3278c135bbe4).
"""
function h5ds_is_attached(did, dsid, idx)
lock(liblock)
var"#status#" = try
ccall((:H5DSis_attached, libhdf5_hl), htri_t, (hid_t, hid_t, Cuint), did, dsid, idx)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Unable to check if dimension is attached")
return var"#status#" > 0
end
"""
h5ds_is_scale(did::hid_t) -> Bool
See `libhdf5` documentation for [`H5DSis_scale`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#ga06f897270fe32408f40bba69c747fc6b).
"""
function h5ds_is_scale(did)
lock(liblock)
var"#status#" = try
ccall((:H5DSis_scale, libhdf5_hl), htri_t, (hid_t,), did)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Unable to check if dataset is scale")
return var"#status#" > 0
end
"""
h5ds_set_label(did::hid_t, idx::Cuint, label::Ref{UInt8})
See `libhdf5` documentation for [`H5DSset_label`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#gaf3810cf2fde9e8d06d8151a879e081b2).
"""
function h5ds_set_label(did, idx, label)
lock(liblock)
var"#status#" = try
ccall((:H5DSset_label, libhdf5_hl), herr_t, (hid_t, Cuint, Ref{UInt8}), did, idx, label)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to set label")
return nothing
end
"""
h5ds_set_scale(dsid::hid_t, dimname::Cstring)
See `libhdf5` documentation for [`H5DSset_scale`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_d_s.html#ga508a06962e9fc11dff32ed356e0a71fa).
"""
function h5ds_set_scale(dsid, dimname)
lock(liblock)
var"#status#" = try
ccall((:H5DSset_scale, libhdf5_hl), herr_t, (hid_t, Cstring), dsid, dimname)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to set scale")
return nothing
end
"""
h5lt_dtype_to_text(datatype::hid_t, str::Ptr{UInt8}, lang_type::Cint, len::Ref{Csize_t})
See `libhdf5` documentation for [`H5LTdtype_to_text`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_l_t.html#ga6ec5b6204c4cddc5bd323918e51755dc).
"""
function h5lt_dtype_to_text(datatype, str, lang_type, len)
lock(liblock)
var"#status#" = try
ccall((:H5LTdtype_to_text, libhdf5_hl), herr_t, (hid_t, Ptr{UInt8}, Cint, Ref{Csize_t}), datatype, str, lang_type, len)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting datatype text representation")
return nothing
end
"""
h5tb_append_records(loc_id::hid_t, dset_name::Cstring, nrecords::hsize_t, type_size::Csize_t, field_offset::Ptr{Csize_t}, field_sizes::Ptr{Csize_t}, data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5TBappend_records`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#ga86f4907fb623ca561df39028dad83201).
"""
function h5tb_append_records(loc_id, dset_name, nrecords, type_size, field_offset, field_sizes, data)
lock(liblock)
var"#status#" = try
ccall((:H5TBappend_records, libhdf5_hl), herr_t, (hid_t, Cstring, hsize_t, Csize_t, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cvoid}), loc_id, dset_name, nrecords, type_size, field_offset, field_sizes, data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error adding record to table")
return nothing
end
"""
h5tb_get_field_info(loc_id::hid_t, table_name::Cstring, field_names::Ptr{Ptr{UInt8}}, field_sizes::Ptr{Csize_t}, field_offsets::Ptr{Csize_t}, type_size::Ptr{Csize_t})
See `libhdf5` documentation for [`H5TBget_field_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#gaa0f9db05008cec2c37de8487531000ba).
"""
function h5tb_get_field_info(loc_id, table_name, field_names, field_sizes, field_offsets, type_size)
lock(liblock)
var"#status#" = try
ccall((:H5TBget_field_info, libhdf5_hl), herr_t, (hid_t, Cstring, Ptr{Ptr{UInt8}}, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Csize_t}), loc_id, table_name, field_names, field_sizes, field_offsets, type_size)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting field information")
return nothing
end
"""
h5tb_get_table_info(loc_id::hid_t, table_name::Cstring, nfields::Ptr{hsize_t}, nrecords::Ptr{hsize_t})
See `libhdf5` documentation for [`H5TBget_table_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#ga35c13910216330ca1768396ac7ecd99c).
"""
function h5tb_get_table_info(loc_id, table_name, nfields, nrecords)
lock(liblock)
var"#status#" = try
ccall((:H5TBget_table_info, libhdf5_hl), herr_t, (hid_t, Cstring, Ptr{hsize_t}, Ptr{hsize_t}), loc_id, table_name, nfields, nrecords)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting table information")
return nothing
end
"""
h5tb_make_table(table_title::Cstring, loc_id::hid_t, dset_name::Cstring, nfields::hsize_t, nrecords::hsize_t, type_size::Csize_t, field_names::Ptr{Cstring}, field_offset::Ptr{Csize_t}, field_types::Ptr{hid_t}, chunk_size::hsize_t, fill_data::Ptr{Cvoid}, compress::Cint, data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5TBmake_table`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#gacce384a15825421d1bddfc7b3ab1e7d0).
"""
function h5tb_make_table(table_title, loc_id, dset_name, nfields, nrecords, type_size, field_names, field_offset, field_types, chunk_size, fill_data, compress, data)
lock(liblock)
var"#status#" = try
ccall((:H5TBmake_table, libhdf5_hl), herr_t, (Cstring, hid_t, Cstring, hsize_t, hsize_t, Csize_t, Ptr{Cstring}, Ptr{Csize_t}, Ptr{hid_t}, hsize_t, Ptr{Cvoid}, Cint, Ptr{Cvoid}), table_title, loc_id, dset_name, nfields, nrecords, type_size, field_names, field_offset, field_types, chunk_size, fill_data, compress, data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error creating and writing dataset to table")
return nothing
end
"""
h5tb_read_records(loc_id::hid_t, table_name::Cstring, start::hsize_t, nrecords::hsize_t, type_size::Csize_t, field_offsets::Ptr{Csize_t}, dst_sizes::Ptr{Csize_t}, data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5TBread_records`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#ga597aa39196582e086ebb6ff60abcd3fc).
"""
function h5tb_read_records(loc_id, table_name, start, nrecords, type_size, field_offsets, dst_sizes, data)
lock(liblock)
var"#status#" = try
ccall((:H5TBread_records, libhdf5_hl), herr_t, (hid_t, Cstring, hsize_t, hsize_t, Csize_t, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cvoid}), loc_id, table_name, start, nrecords, type_size, field_offsets, dst_sizes, data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error reading record from table")
return nothing
end
"""
h5tb_read_table(loc_id::hid_t, table_name::Cstring, dst_size::Csize_t, dst_offset::Ptr{Csize_t}, dst_sizes::Ptr{Csize_t}, dst_buf::Ptr{Cvoid})
See `libhdf5` documentation for [`H5TBread_table`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#gae3f29b60e32a09a4d6c7bae56374a3bb).
"""
function h5tb_read_table(loc_id, table_name, dst_size, dst_offset, dst_sizes, dst_buf)
lock(liblock)
var"#status#" = try
ccall((:H5TBread_table, libhdf5_hl), herr_t, (hid_t, Cstring, Csize_t, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cvoid}), loc_id, table_name, dst_size, dst_offset, dst_sizes, dst_buf)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error reading table")
return nothing
end
"""
h5tb_write_records(loc_id::hid_t, table_name::Cstring, start::hsize_t, nrecords::hsize_t, type_size::Csize_t, field_offsets::Ptr{Csize_t}, field_sizes::Ptr{Csize_t}, data::Ptr{Cvoid})
See `libhdf5` documentation for [`H5TBwrite_records`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_t_b.html#ga04c5ef340c562ff96bff16d222c5677e).
"""
function h5tb_write_records(loc_id, table_name, start, nrecords, type_size, field_offsets, field_sizes, data)
lock(liblock)
var"#status#" = try
ccall((:H5TBwrite_records, libhdf5_hl), herr_t, (hid_t, Cstring, hsize_t, hsize_t, Csize_t, Ptr{Csize_t}, Ptr{Csize_t}, Ptr{Cvoid}), loc_id, table_name, start, nrecords, type_size, field_offsets, field_sizes, data)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error writing record to table")
return nothing
end
"""
h5z_register(filter_class::Ref{H5Z_class_t})
See `libhdf5` documentation for [`H5Zregister`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_z.html#ga93145acc38c2c60d832b7a9b0123706b).
"""
function h5z_register(filter_class)
lock(liblock)
var"#status#" = try
ccall((:H5Zregister, libhdf5), herr_t, (Ref{H5Z_class_t},), filter_class)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to register new filter")
return nothing
end
"""
h5z_unregister(id::H5Z_filter_t)
See `libhdf5` documentation for [`H5Zunregister`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_z.html#ga6b8bcdde70c9256c50c7c62ba66380f8).
"""
function h5z_unregister(id)
lock(liblock)
var"#status#" = try
ccall((:H5Zunregister, libhdf5), herr_t, (H5Z_filter_t,), id)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Unable to unregister filter")
return nothing
end
"""
h5z_filter_avail(id::H5Z_filter_t) -> Bool
See `libhdf5` documentation for [`H5Zfilter_avail`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_z.html#ga3594e10d70739ccda55ebb55b17b50ee).
"""
function h5z_filter_avail(id)
lock(liblock)
var"#status#" = try
ccall((:H5Zfilter_avail, libhdf5), htri_t, (H5Z_filter_t,), id)
finally
unlock(liblock)
end
var"#status#" < htri_t(0) && @h5error("Unable to check filter availability")
return var"#status#" > 0
end
"""
h5z_get_filter_info(filter::H5Z_filter_t, filter_config_flags::Ptr{Cuint})
See `libhdf5` documentation for [`H5Zget_filter_info`](https://docs.hdfgroup.org/hdf5/v1_14/group___h5_z.html#ga9ef800ceec249c8819492545def9adba).
"""
function h5z_get_filter_info(filter, filter_config_flags)
lock(liblock)
var"#status#" = try
ccall((:H5Zget_filter_info, libhdf5), herr_t, (H5Z_filter_t, Ptr{Cuint}), filter, filter_config_flags)
finally
unlock(liblock)
end
var"#status#" < herr_t(0) && @h5error("Error getting filter information")
return nothing
end
"""
h5fd_core_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_CORE`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_core_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_core_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
"""
h5fd_family_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_FAMILY`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_family_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_family_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
"""
h5fd_log_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_LOG`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_log_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_log_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
"""
h5fd_mpio_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_MPIO`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_mpio_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_mpio_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
"""
h5fd_multi_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_MULTI`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_multi_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_multi_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
"""
h5fd_sec2_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_SEC2`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_sec2_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_sec2_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
"""
h5fd_stdio_init() -> hid_t
This function is exposed in `libhdf5` as the macro `H5FD_STDIO`. See `libhdf5` documentation for [`H5Pget_driver`](https://docs.hdfgroup.org/hdf5/v1_14/group___f_a_p_l.html#ga43a733fe9723dd15f5ad7abda909a1b8).
"""
function h5fd_stdio_init()
lock(liblock)
var"#status#" = try
ccall((:H5FD_stdio_init, libhdf5), hid_t, ())
finally
unlock(liblock)
end
var"#status#" < hid_t(0) && @h5error("Error initializing file driver")
return var"#status#"
end
# This file is a companion to `src/api.jl` --- it defines the raw ccall wrappers, while
# here small normalizations are made to make the calls more Julian.
# For instance, many property getters return values through pointer output arguments,
# so the methods here handle making the appropriate `Ref`s and return them (as tuples).
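# For example, the `h5f_get_intent(file_id)` method defined below allocates a `Ref{Cuint}`,
# passes it to the two-argument ccall wrapper as the output argument, and returns the
# dereferenced value, so callers receive the intent flags directly instead of a pointer.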
const H5F_LIBVER_LATEST = if _libhdf5_build_ver >= v"1.15"
H5F_LIBVER_V116
elseif _libhdf5_build_ver >= v"1.14"
H5F_LIBVER_V114
elseif _libhdf5_build_ver >= v"1.12"
H5F_LIBVER_V112
elseif _libhdf5_build_ver >= v"1.10"
H5F_LIBVER_V110
else
H5F_LIBVER_V108
end
###
### HDF5 General library functions
###
function h5_get_libversion()
majnum, minnum, relnum = Ref{Cuint}(), Ref{Cuint}(), Ref{Cuint}()
h5_get_libversion(majnum, minnum, relnum)
VersionNumber(majnum[], minnum[], relnum[])
end
function h5_is_library_threadsafe()
is_ts = Ref{Cuchar}(0)
h5_is_library_threadsafe(is_ts)
return is_ts[] > 0
end
###
### HDF5 File Interface
###
function h5f_get_dset_no_attrs_hint(file_id)::Bool
minimize = Ref{hbool_t}(false)
h5f_get_dset_no_attrs_hint(file_id, minimize)
return minimize[]
end
"""
h5f_get_file_image(file_id)
Return a `Vector{UInt8}` containing the file image. Does not include the user block.
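# Examples
A minimal sketch, assuming `file_id` is the id of an open HDF5 file:
```julia-repl
julia> img = HDF5.API.h5f_get_file_image(file_id);
julia> length(img)  # number of bytes in the in-memory file image
```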
"""
function h5f_get_file_image(file_id)
buffer_length = h5f_get_file_image(file_id, C_NULL, 0)
buffer = Vector{UInt8}(undef, buffer_length)
h5f_get_file_image(file_id, buffer, buffer_length)
return buffer
end
"""
h5f_get_file_image(file_id, buffer::Vector{UInt8})
Store the file image in the provided buffer.
"""
function h5f_get_file_image(file_id, buffer::Vector{UInt8})
h5f_get_file_image(file_id, buffer, length(buffer))
end
###
### Attribute Interface
###
function h5a_get_name(attr_id)
len = h5a_get_name(attr_id, 0, C_NULL)
buf = StringVector(len)
h5a_get_name(attr_id, len + 1, buf)
return String(buf)
end
function h5a_get_name_by_idx(loc_id, obj_name, idx_type, order, idx, lapl_id)
len = h5a_get_name_by_idx(loc_id, obj_name, idx_type, order, idx, C_NULL, 0, lapl_id)
buf = StringVector(len)
h5a_get_name_by_idx(loc_id, obj_name, idx_type, order, idx, buf, len + 1, lapl_id)
return String(buf)
end
# libhdf5 supports proper closure environments, so we use that support rather than
# emulating it with the less desirable form of creating closure handles directly in
# `@cfunction` with `$f`.
# This helper translates between the two preferred forms for each respective language.
function h5a_iterate_helper(
loc_id::hid_t, attr_name::Ptr{Cchar}, ainfo::Ptr{H5A_info_t}, @nospecialize(data::Any)
)::herr_t
f, err_ref = data
try
return herr_t(f(loc_id, attr_name, ainfo))
catch err
err_ref[] = err
return herr_t(-1)
end
end
"""
h5a_iterate(f, loc_id, idx_type, order, idx = 0) -> hsize_t
Executes [`h5a_iterate`](@ref h5a_iterate(::hid_t, ::Cint, ::Cint,
::Ptr{hsize_t}, ::Ptr{Cvoid}, ::Ptr{Cvoid})) with the user-provided callback
function `f`, returning the index where iteration ends.
The callback function must correspond to the signature
```
f(loc::HDF5.API.hid_t, name::Ptr{Cchar}, info::Ptr{HDF5.API.H5A_info_t}) -> Union{Bool, Integer}
```
where a negative return value halts iteration abnormally (triggering an error),
a `true` or a positive value halts iteration successfully, and `false` or zero
continues iteration.
# Examples
```julia-repl
julia> HDF5.API.h5a_iterate(obj, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC) do loc, name, info
println(unsafe_string(name))
return false
end
```
"""
function h5a_iterate(@nospecialize(f), obj_id, idx_type, order, idx=0)
err_ref = Ref{Any}(nothing)
idxref = Ref{hsize_t}(idx)
fptr = @cfunction(h5a_iterate_helper, herr_t, (hid_t, Ptr{Cchar}, Ptr{H5A_info_t}, Any))
try
h5a_iterate(obj_id, idx_type, order, idxref, fptr, (f, err_ref))
catch h5err
jlerr = err_ref[]
if !isnothing(jlerr)
rethrow(jlerr)
end
rethrow(h5err)
end
return idxref[]
end
###
### Dataset Interface
###
"""
h5d_vlen_get_buf_size(dataset_id, type_id, space_id)
Helper method to determine the number of bytes required to store the variable length data from the dataset. Returns a value of type `HDF5.API.hsize_t`.
"""
function h5d_vlen_get_buf_size(dataset_id, type_id, space_id)
sz = Ref{hsize_t}()
h5d_vlen_get_buf_size(dataset_id, type_id, space_id, sz)
return sz[]
end
"""
h5d_get_chunk_info(dataset_id, fspace_id, index)
h5d_get_chunk_info(dataset_id, index; fspace_id = HDF5.API.H5S_ALL)
Helper method to retrieve chunk information.
Returns a `NamedTuple{(:offset, :filter_mask, :addr, :size), Tuple{HDF5.API.hsize_t, UInt32, HDF5.API.haddr_t, HDF5.API.hsize_t}}`.
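# Examples
A minimal sketch, assuming `dset_id` refers to an open chunked dataset:
```julia-repl
julia> info = HDF5.API.h5d_get_chunk_info(dset_id, 0);  # first chunk, default file dataspace
julia> info.offset, info.size  # logical offset of the chunk and its storage size in bytes
```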
"""
function h5d_get_chunk_info(dataset_id, fspace_id, index)
offset = Vector{hsize_t}(undef, ndims(dataset_id))
filter_mask = Ref{UInt32}()
addr = Ref{haddr_t}()
size = Ref{hsize_t}()
h5d_get_chunk_info(dataset_id, fspace_id, index, offset, filter_mask, addr, size)
return (offset=offset, filter_mask=filter_mask[], addr=addr[], size=size[])
end
h5d_get_chunk_info(dataset_id, index; fspace_id=H5S_ALL) =
h5d_get_chunk_info(dataset_id, fspace_id, index)
"""
h5d_get_chunk_info_by_coord(dataset_id, offset)
Helper method to read chunk information by coordinate. Returns a `NamedTuple{(:filter_mask, :addr, :size), Tuple{UInt32, HDF5.API.haddr_t, HDF5.API.hsize_t}}`.
"""
function h5d_get_chunk_info_by_coord(dataset_id, offset)
filter_mask = Ref{UInt32}()
addr = Ref{haddr_t}()
size = Ref{hsize_t}()
h5d_get_chunk_info_by_coord(dataset_id, offset, filter_mask, addr, size)
return (filter_mask=filter_mask[], addr=addr[], size=size[])
end
"""
h5d_get_chunk_storage_size(dataset_id, offset)
Helper method to retrieve the chunk storage size in bytes. Returns an integer of type `HDF5.API.hsize_t`.
"""
function h5d_get_chunk_storage_size(dataset_id, offset)
chunk_nbytes = Ref{hsize_t}()
h5d_get_chunk_storage_size(dataset_id, offset, chunk_nbytes)
return chunk_nbytes[]
end
@static if v"1.10.5" ≤ _libhdf5_build_ver
"""
h5d_get_num_chunks(dataset_id, fspace_id = H5S_ALL)
Helper method to retrieve the number of chunks. Returns an integer of type `HDF5.API.hsize_t`.
"""
function h5d_get_num_chunks(dataset_id, fspace_id=H5S_ALL)
nchunks = Ref{hsize_t}()
h5d_get_num_chunks(dataset_id, fspace_id, nchunks)
return nchunks[]
end
end
"""
h5d_chunk_iter(f, dataset, [dxpl_id=H5P_DEFAULT])
Call `f(offset::Ptr{hsize_t}, filter_mask::Cuint, addr::haddr_t, size::hsize_t)` for each chunk.
`dataset` may be an `HDF5.Dataset` or a dataset id.
`dxpl_id` is the dataset transfer property list and is optional.
Available only for libhdf5 1.10.10 or later in the 1.10 series, or for libhdf5 1.12.3 or later.
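# Examples
A minimal sketch of visiting every chunk, assuming `dset_id` refers to an open chunked dataset and the loaded libhdf5 provides `H5Dchunk_iter`:
```julia-repl
julia> HDF5.API.h5d_chunk_iter(dset_id) do offset, filter_mask, addr, size
           # `offset` is a `Ptr{hsize_t}`; read its elements with `unsafe_load` if needed
           return HDF5.API.H5_ITER_CONT
       end
```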
"""
h5d_chunk_iter() = nothing
@static if v"1.12.3" ≤ _libhdf5_build_ver ||
(_libhdf5_build_ver.minor == 10 && _libhdf5_build_ver.patch >= 10)
# H5Dchunk_iter is first available in 1.10.10, 1.12.3, and 1.14.0 in the 1.10, 1.12, and 1.14 minor version series, respectively
function h5d_chunk_iter_helper(
offset::Ptr{hsize_t},
filter_mask::Cuint,
addr::haddr_t,
size::hsize_t,
@nospecialize(data::Any)
)::H5_iter_t
func, err_ref = data
try
return convert(H5_iter_t, func(offset, filter_mask, addr, size))
catch err
err_ref[] = err
return H5_ITER_ERROR
end
end
function h5d_chunk_iter(@nospecialize(f), dset_id, dxpl_id=H5P_DEFAULT)
err_ref = Ref{Any}(nothing)
fptr = @cfunction(
h5d_chunk_iter_helper, H5_iter_t, (Ptr{hsize_t}, Cuint, haddr_t, hsize_t, Any)
)
try
return h5d_chunk_iter(dset_id, dxpl_id, fptr, (f, err_ref))
catch h5err
jlerr = err_ref[]
if !isnothing(jlerr)
rethrow(jlerr)
end
rethrow(h5err)
end
end
end
"""
h5d_get_space_status(dataset_id)
Helper method to retrieve the status of the dataset space.
Returns a `HDF5.API.H5D_space_status_t` (`Cint`) indicating the status, see `HDF5.API.H5D_SPACE_STATUS_`* constants.
"""
function h5d_get_space_status(dataset_id)
r = Ref{H5D_space_status_t}()
h5d_get_space_status(dataset_id, r)
return r[]
end
###
### Error Interface
###
function h5e_get_auto(estack_id)
func = Ref{Ptr{Cvoid}}()
client_data = Ref{Ptr{Cvoid}}()
h5e_get_auto(estack_id, func, client_data)
return func[], client_data[]
end
"""
mesg_type, mesg = h5e_get_msg(mesg_id)
"""
function h5e_get_msg(mesg_id)
mesg_type = Ref{Cint}()
mesg_len = h5e_get_msg(mesg_id, mesg_type, C_NULL, 0)
buffer = StringVector(mesg_len)
h5e_get_msg(mesg_id, mesg_type, buffer, mesg_len + 1)
resize!(buffer, mesg_len)
return mesg_type[], String(buffer)
end
# See explanation for h5a_iterate above.
function h5e_walk_helper(
n::Cuint, err_desc::Ptr{H5E_error2_t}, @nospecialize(data::Any)
)::herr_t
f, err_ref = data
try
return herr_t(f(n, err_desc))
catch err
err_ref[] = err
return herr_t(-1)
end
end
function h5e_walk(f::Function, stack_id, direction)
err_ref = Ref{Any}(nothing)
fptr = @cfunction(h5e_walk_helper, herr_t, (Cuint, Ptr{H5E_error2_t}, Any))
try
h5e_walk(stack_id, direction, fptr, (f, err_ref))
catch h5err
jlerr = err_ref[]
if !isnothing(jlerr)
rethrow(jlerr)
end
rethrow(h5err)
end
end
###
### File Interface
###
function h5f_get_intent(file_id)
intent = Ref{Cuint}()
h5f_get_intent(file_id, intent)
return intent[]
end
function h5f_get_name(loc_id)
len = h5f_get_name(loc_id, C_NULL, 0)
buf = StringVector(len)
h5f_get_name(loc_id, buf, len + 1)
return String(buf)
end
function h5f_get_obj_ids(file_id, types)
sz = h5f_get_obj_count(file_id, types)
hids = Vector{hid_t}(undef, sz)
sz2 = h5f_get_obj_ids(file_id, types, sz, hids)
sz2 != sz && resize!(hids, sz2)
return hids
end
function h5f_get_vfd_handle(file_id, fapl)
file_handle = Ref{Ptr{Cvoid}}()
h5f_get_vfd_handle(file_id, fapl, file_handle)
return file_handle[]
end
"""
h5f_get_free_sections(file_id, type, [sect_info::AbstractVector{H5F_sect_info_t}])::AbstractVector{H5F_sect_info_t}
Return an `AbstractVector` of the free section information. If `sect_info` is not provided a new `Vector` will be allocated and returned.
If `sect_info` is provided, a view, a `SubArray`, will be returned.
"""
function h5f_get_free_sections(file_id, type)
nsects = h5f_get_free_sections(file_id, type, 0, C_NULL)
sect_info = Vector{H5F_sect_info_t}(undef, nsects)
if nsects > 0
h5f_get_free_sections(file_id, type, nsects, sect_info)
end
return sect_info
end
function h5f_get_free_sections(file_id, type, sect_info::AbstractVector{H5F_sect_info_t})
nsects = length(sect_info)
nsects = h5f_get_free_sections(file_id, type, nsects, sect_info)
return @view(sect_info[1:nsects])
end
function h5p_get_file_locking(fapl)
use_file_locking = Ref{API.hbool_t}(0)
ignore_when_disabled = Ref{API.hbool_t}(0)
h5p_get_file_locking(fapl, use_file_locking, ignore_when_disabled)
return (
use_file_locking = Bool(use_file_locking[]),
ignore_when_disabled = Bool(ignore_when_disabled[])
)
end
# Check to see if h5p_set_file_locking should exist
const _has_h5p_set_file_locking = _has_symbol(:H5Pset_file_locking)
function has_h5p_set_file_locking()
return _has_h5p_set_file_locking
#=
h5_version = h5_get_libversion()
if (h5_version >= v"1.10" && h5_version < v"1.10.7") ||
(h5_version >= v"1.12" && h5_version < v"1.12.1") ||
(h5_version < v"1.10")
return false
else
return true
end
=#
end
function h5p_get_file_space_strategy(plist_id)
strategy = Ref{H5F_fspace_strategy_t}()
persist = Ref{hbool_t}(0)
threshold = Ref{hsize_t}()
h5p_get_file_space_strategy(plist_id, strategy, persist, threshold)
return (strategy=strategy[], persist=persist[], threshold=threshold[])
end
function h5p_get_file_space_page_size(plist_id)
fsp_size = Ref{hsize_t}()
h5p_get_file_space_page_size(plist_id, fsp_size)
return fsp_size[]
end
function h5p_set_file_space_strategy(
plist_id; strategy=nothing, persist=nothing, threshold=nothing
)
current = h5p_get_file_space_strategy(plist_id)
strategy = isnothing(strategy) ? current[:strategy] : strategy
persist = isnothing(persist) ? current[:persist] : persist
threshold = isnothing(threshold) ? current[:threshold] : threshold
return h5p_set_file_space_strategy(plist_id, strategy, persist, threshold)
end
###
### Group Interface
###
function h5g_get_info(loc_id)
ginfo = Ref{H5G_info_t}()
h5g_get_info(loc_id, ginfo)
return ginfo[]
end
function h5g_get_num_objs(loc_id)
num_objs = Ref{hsize_t}()
h5g_get_num_objs(loc_id, num_objs)
return num_objs[]
end
###
### Identifier Interface
###
function h5i_get_name(loc_id)
len = h5i_get_name(loc_id, C_NULL, 0)
buf = StringVector(len)
h5i_get_name(loc_id, buf, len + 1)
return String(buf)
end
###
### Link Interface
###
function h5l_get_info(link_loc_id, link_name, lapl_id)
info = Ref{H5L_info_t}()
h5l_get_info(link_loc_id, link_name, info, lapl_id)
return info[]
end
function h5l_get_name_by_idx(loc_id, group_name, idx_type, order, idx, lapl_id)
len = h5l_get_name_by_idx(loc_id, group_name, idx_type, order, idx, C_NULL, 0, lapl_id)
buf = StringVector(len)
h5l_get_name_by_idx(loc_id, group_name, idx_type, order, idx, buf, len + 1, lapl_id)
return String(buf)
end
# See explanation for h5a_iterate above.
function h5l_iterate_helper(
group::hid_t, name::Ptr{Cchar}, info::Ptr{H5L_info_t}, @nospecialize(data::Any)
)::herr_t
f, err_ref = data
try
return herr_t(f(group, name, info))
catch err
err_ref[] = err
return herr_t(-1)
end
end
"""
h5l_iterate(f, group_id, idx_type, order, idx = 0) -> hsize_t
Executes [`h5l_iterate`](@ref h5l_iterate(::hid_t, ::Cint, ::Cint,
::Ptr{hsize_t}, ::Ptr{Cvoid}, ::Ptr{Cvoid})) with the user-provided callback
function `f`, returning the index where iteration ends.
The callback function must correspond to the signature
```
f(group::HDF5.API.hid_t, name::Ptr{Cchar}, info::Ptr{HDF5.API.H5L_info_t}) -> Union{Bool, Integer}
```
where a negative return value halts iteration abnormally, `true` or a positive
value halts iteration successfully, and `false` or zero continues iteration.
# Examples
```julia-repl
julia> HDF5.API.h5l_iterate(hfile, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC) do group, name, info
println(unsafe_string(name))
return HDF5.API.herr_t(0)
end
```
"""
function h5l_iterate(@nospecialize(f), group_id, idx_type, order, idx=0)
err_ref = Ref{Any}(nothing)
idxref = Ref{hsize_t}(idx)
fptr = @cfunction(h5l_iterate_helper, herr_t, (hid_t, Ptr{Cchar}, Ptr{H5L_info_t}, Any))
try
h5l_iterate(group_id, idx_type, order, idxref, fptr, (f, err_ref))
catch h5err
jlerr = err_ref[]
if !isnothing(jlerr)
rethrow(jlerr)
end
rethrow(h5err)
end
return idxref[]
end
###
### Object Interface
###
@static if _libhdf5_build_ver < v"1.10.3"
# H5Oget_info1
function h5o_get_info(loc_id)
oinfo = Ref{H5O_info1_t}()
h5o_get_info(loc_id, oinfo)
return oinfo[]
end
# H5Oget_info_by_name1
function h5o_get_info_by_name(loc_id, name, lapl=H5P_DEFAULT)
oinfo = Ref{H5O_info1_t}()
h5o_get_info_by_name(loc_id, name, oinfo, lapl)
return oinfo[]
end
# H5Oget_info_by_idx1
function h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, lapl=H5P_DEFAULT)
oinfo = Ref{H5O_info1_t}()
h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, lapl)
return oinfo[]
end
elseif _libhdf5_build_ver >= v"1.10.3" && _libhdf5_build_ver < v"1.12.0"
# H5Oget_info2
function h5o_get_info(loc_id, fields=H5O_INFO_ALL)
oinfo = Ref{H5O_info1_t}()
h5o_get_info(loc_id, oinfo, fields)
return oinfo[]
end
# H5Oget_info_by_name2
function h5o_get_info_by_name(loc_id, name, fields=H5O_INFO_ALL, lapl=H5P_DEFAULT)
oinfo = Ref{H5O_info1_t}()
h5o_get_info_by_name(loc_id, name, oinfo, fields, lapl)
return oinfo[]
end
# H5Oget_info_by_idx2
function h5o_get_info_by_idx(
loc_id, group_name, idx_type, order, n, fields=H5O_INFO_ALL, lapl=H5P_DEFAULT
)
oinfo = Ref{H5O_info1_t}()
h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, fields, lapl)
return oinfo[]
end
else # _libhdf5_build_ver >= v"1.12.0"
# H5Oget_info3
function h5o_get_info(loc_id, fields=H5O_INFO_ALL)
oinfo = Ref{H5O_info2_t}()
h5o_get_info(loc_id, oinfo, fields)
return oinfo[]
end
# H5Oget_info_by_name3
function h5o_get_info_by_name(loc_id, name, fields=H5O_INFO_ALL, lapl=H5P_DEFAULT)
oinfo = Ref{H5O_info2_t}()
h5o_get_info_by_name(loc_id, name, oinfo, fields, lapl)
return oinfo[]
end
# H5Oget_info_by_idx3
function h5o_get_info_by_idx(
loc_id, group_name, idx_type, order, n, fields=H5O_INFO_ALL, lapl=H5P_DEFAULT
)
oinfo = Ref{H5O_info2_t}()
h5o_get_info_by_idx(loc_id, group_name, idx_type, order, n, oinfo, fields, lapl)
return oinfo[]
end
function h5o_get_native_info(loc_id, fields=H5O_NATIVE_INFO_ALL)
oinfo = Ref{H5O_native_info_t}()
h5o_get_native_info(loc_id, oinfo, fields)
return oinfo[]
end
function h5o_get_native_info_by_idx(
loc_id, group_name, idx_type, order, n, fields=H5O_NATIVE_INFO_ALL, lapl=H5P_DEFAULT
)
oinfo = Ref{H5O_native_info_t}()
h5o_get_native_info_by_idx(
loc_id, group_name, idx_type, order, n, oinfo, fields, lapl
)
return oinfo[]
end
function h5o_get_native_info_by_name(
loc_id, name, fields=H5O_NATIVE_INFO_ALL, lapl=H5P_DEFAULT
)
oinfo = Ref{H5O_native_info_t}()
h5o_get_native_info_by_name(loc_id, name, oinfo, fields, lapl)
return oinfo[]
end
end # @static if _libhdf5_build_ver < v"1.12.0"
# Add a default link access property list if not specified
function h5o_exists_by_name(loc_id, name)
return h5o_exists_by_name(loc_id, name, H5P_DEFAULT)
end
# Legacy h5o_get_info1 interface, for compat with pre-1.12.0
# Used by deprecated object_info function
"""
h5o_get_info1(object_id, [buf])
Deprecated HDF5 function. Use [`h5o_get_info`](@ref) or [`h5o_get_native_info`](@ref) if possible.
See `libhdf5` documentation for [`H5Oget_info1`](https://portal.hdfgroup.org/display/HDF5/H5O_GET_INFO1).
"""
function h5o_get_info1(object_id, buf)
var"#status#" = ccall(
(:H5Oget_info1, libhdf5), herr_t, (hid_t, Ptr{H5O_info_t}), object_id, buf
)
var"#status#" < 0 && @h5error("Error getting object info")
return nothing
end
function h5o_get_info1(loc_id)
oinfo = Ref{H5O_info1_t}()
h5o_get_info1(loc_id, oinfo)
return oinfo[]
end
###
### Property Interface
###
function h5p_get_alignment(fapl_id)
threshold = Ref{hsize_t}()
alignment = Ref{hsize_t}()
h5p_get_alignment(fapl_id, threshold, alignment)
return threshold[], alignment[]
end
function h5p_get_alloc_time(plist_id)
alloc_time = Ref{Cint}()
h5p_get_alloc_time(plist_id, alloc_time)
return alloc_time[]
end
function h5p_get_char_encoding(plist_id)
encoding = Ref{Cint}()
h5p_get_char_encoding(plist_id, encoding)
return encoding[]
end
function h5p_get_chunk(plist_id)
ndims = h5p_get_chunk(plist_id, 0, C_NULL)
dims = Vector{hsize_t}(undef, ndims)
h5p_get_chunk(plist_id, ndims, dims)
return dims, ndims
end
function h5p_get_chunk_cache(dapl_id)
nslots = Ref{Csize_t}()
nbytes = Ref{Csize_t}()
w0 = Ref{Cdouble}()
h5p_get_chunk_cache(dapl_id, nslots, nbytes, w0)
return (nslots=nslots[], nbytes=nbytes[], w0=w0[])
end
function h5p_get_create_intermediate_group(plist_id)
cig = Ref{Cuint}()
h5p_get_create_intermediate_group(plist_id, cig)
return cig[]
end
function h5p_get_dxpl_mpio(dxpl_id)
xfer_mode = Ref{Cint}()
h5p_get_dxpl_mpio(dxpl_id, xfer_mode)
return xfer_mode[]
end
function h5p_get_efile_prefix(plist)
efile_len = h5p_get_efile_prefix(plist, C_NULL, 0)
buffer = StringVector(efile_len)
prefix_size = h5p_get_efile_prefix(plist, buffer, efile_len + 1)
return String(buffer)
end
function h5p_set_efile_prefix(plist, sym::Symbol)
if sym === :origin
h5p_set_efile_prefix(plist, raw"$ORIGIN")
else
throw(
ArgumentError(
"The only valid `Symbol` argument for `h5p_set_efile_prefix` is `:origin`. Got `$sym`."
)
)
end
end
function h5p_get_external(plist, idx=0)
offset = Ref{off_t}(0)
sz = Ref{hsize_t}(0)
name_size = 64
name = Base.StringVector(name_size)
while true
h5p_get_external(plist, idx, name_size, name, offset, sz)
null_id = findfirst(==(0x00), name)
if isnothing(null_id)
name_size *= 2
resize!(name, name_size)
else
resize!(name, null_id - 1)
break
end
end
# Heuristic for 32-bit Windows bug
# Possibly related:
# https://github.com/HDFGroup/hdf5/pull/1821
# Quote:
# The offset parameter is of type off_t and the offset field of H5O_efl_entry_t
# is HDoff_t which is a different type on Windows (off_t is a 32-bit long,
# HDoff_t is __int64, a 64-bit type).
@static if Sys.iswindows() && sizeof(Int) == 4
lower = 0xffffffff & sz[]
upper = 0xffffffff & (sz[] >> 32)
# Scenario 1: The size is in the lower 32 bits, upper 32 bits contains garbage v1.12.2
# Scenario 2: The size is in the upper 32 bits, lower 32 bits is 0 as of HDF5 v1.12.1
sz[] = lower == 0 && upper != 0xffffffff ? upper : lower
end
return (name=String(name), offset=offset[], size=sz[])
end
function h5p_get_fclose_degree(fapl_id)
out = Ref{Cint}()
h5p_get_fclose_degree(fapl_id, out)
return out[]
end
function h5p_get_fill_time(plist_id)
out = Ref{H5D_fill_time_t}()
h5p_get_fill_time(plist_id, out)
return out[]
end
function h5p_get_libver_bounds(plist_id)
low = Ref{Cint}()
high = Ref{Cint}()
h5p_get_libver_bounds(plist_id, low, high)
return low[], high[]
end
function h5p_get_local_heap_size_hint(plist_id)
size_hint = Ref{Csize_t}()
h5p_get_local_heap_size_hint(plist_id, size_hint)
return size_hint[]
end
function h5p_get_meta_block_size(fapl_id)
sz = Ref{hsize_t}(0)
h5p_get_meta_block_size(fapl_id, sz)
return sz[]
end
function h5p_get_obj_track_times(plist_id)
track_times = Ref{UInt8}()
h5p_get_obj_track_times(plist_id, track_times)
return track_times[] != 0x0
end
function h5p_get_userblock(plist_id)
len = Ref{hsize_t}()
h5p_get_userblock(plist_id, len)
return len[]
end
function h5p_get_virtual_count(dcpl_id)
count = Ref{Csize_t}()
h5p_get_virtual_count(dcpl_id, count)
return count[]
end
function h5p_get_virtual_dsetname(dcpl_id, index)
len = h5p_get_virtual_dsetname(dcpl_id, index, C_NULL, 0)
buffer = StringVector(len)
h5p_get_virtual_dsetname(dcpl_id, index, buffer, len + 1)
return String(buffer)
end
function h5p_get_virtual_filename(dcpl_id, index)
len = h5p_get_virtual_filename(dcpl_id, index, C_NULL, 0)
buffer = StringVector(len)
h5p_get_virtual_filename(dcpl_id, index, buffer, len + 1)
return String(buffer)
end
function h5p_get_virtual_prefix(dapl_id)
virtual_file_len = h5p_get_virtual_prefix(dapl_id, C_NULL, 0)
buffer = StringVector(virtual_file_len)
prefix_size = h5p_get_virtual_prefix(dapl_id, buffer, virtual_file_len + 1)
return String(buffer)
end
function h5p_get_virtual_printf_gap(dapl_id)
gap = Ref{hsize_t}()
h5p_get_virtual_printf_gap(dapl_id, gap)
return gap[]
end
function h5p_get_virtual_view(dapl_id)
view = Ref{H5D_vds_view_t}()
h5p_get_virtual_view(dapl_id, view)
return view[]
end
"""
h5p_get_file_image(fapl_id)::Vector{UInt8}
Retrieve a file image of the appropriate size in a `Vector{UInt8}`.
"""
function h5p_get_file_image(fapl_id)::Vector{UInt8}
cb = h5p_get_file_image_callbacks(fapl_id)
if cb.image_free != C_NULL
# The user has configured their own memory deallocation routines.
# The user should use a lower level call to properly handle deallocation
error(
"File image callback image_free is not C_NULL. Use the three argument method of h5p_get_file_image when setting file image callbacks."
)
end
buf_ptr_ref = Ref{Ptr{Nothing}}()
buf_len_ref = Ref{Csize_t}(0)
h5p_get_file_image(fapl_id, buf_ptr_ref, buf_len_ref)
image = unsafe_wrap(Array{UInt8}, Ptr{UInt8}(buf_ptr_ref[]), buf_len_ref[]; own=false)
finalizer(image) do image
# Use h5_free_memory to ensure we are using the correct free
h5_free_memory(image)
end
return image
end
"""
h5p_set_file_image(fapl_id, image::Vector{UInt8})
Set the file image from a `Vector{UInt8}`.
"""
function h5p_set_file_image(fapl_id, image::Vector{UInt8})
h5p_set_file_image(fapl_id, image, length(image))
end
"""
h5p_get_file_image_callbacks(fapl_id)
Retrieve the file image callbacks for memory operations
"""
function h5p_get_file_image_callbacks(fapl_id)
cb = H5FD_file_image_callbacks_t(C_NULL, C_NULL, C_NULL, C_NULL, C_NULL, C_NULL, C_NULL)
r = Ref(cb)
h5p_get_file_image_callbacks(fapl_id, r)
return r[]
end
# Note: The following function(s) implement direct ccalls because the binding generator
# cannot (yet) do the string wrapping and memory freeing.
"""
h5p_get_class_name(pcid::hid_t) -> String
See `libhdf5` documentation for [`H5P_GET_CLASS_NAME`](https://portal.hdfgroup.org/display/HDF5/H5P_GET_CLASS_NAME).
"""
function h5p_get_class_name(pcid)
pc = ccall((:H5Pget_class_name, libhdf5), Ptr{UInt8}, (hid_t,), pcid)
if pc == C_NULL
@h5error("Error getting class name")
end
s = unsafe_string(pc)
h5_free_memory(pc)
return s
end
function h5p_get_attr_creation_order(p)
attr = Ref{UInt32}()
h5p_get_attr_creation_order(p, attr)
return attr[]
end
function h5p_get_link_creation_order(p)
link = Ref{UInt32}()
h5p_get_link_creation_order(p, link)
return link[]
end
function h5p_get_dset_no_attrs_hint(dcpl)::hbool_t
minimize = Ref{hbool_t}(false)
h5p_get_dset_no_attrs_hint(dcpl, minimize)
return minimize[] > 0
end
###
### Plugin Interface
###
function h5pl_get_loading_state()
plugin_control_mask = Ref{Cuint}()
h5pl_get_loading_state(plugin_control_mask)
plugin_control_mask[]
end
function h5pl_get(index=0)
buf_size = Csize_t(1024)
path_buf = Vector{Cchar}(undef, buf_size)
h5pl_get(index, path_buf, buf_size)
unsafe_string(pointer(path_buf))
end
function h5pl_size()
num_paths = Ref{Cuint}()
h5pl_size(num_paths)
num_paths[]
end
###
### Reference Interface
###
###
### Dataspace Interface
###
function h5s_get_regular_hyperslab(space_id)
n = h5s_get_simple_extent_ndims(space_id)
start = Vector{hsize_t}(undef, n)
stride = Vector{hsize_t}(undef, n)
count = Vector{hsize_t}(undef, n)
block = Vector{hsize_t}(undef, n)
h5s_get_regular_hyperslab(space_id, start, stride, count, block)
return start, stride, count, block
end
function h5s_get_simple_extent_dims(space_id)
n = h5s_get_simple_extent_ndims(space_id)
dims = Vector{hsize_t}(undef, n)
maxdims = Vector{hsize_t}(undef, n)
h5s_get_simple_extent_dims(space_id, dims, maxdims)
return dims, maxdims
end
function h5s_get_simple_extent_dims(space_id, ::Nothing)
n = h5s_get_simple_extent_ndims(space_id)
dims = Vector{hsize_t}(undef, n)
h5s_get_simple_extent_dims(space_id, dims, C_NULL)
return dims
end
###
### Datatype Interface
###
function h5t_get_array_dims(type_id)
nd = h5t_get_array_ndims(type_id)
dims = Vector{hsize_t}(undef, nd)
h5t_get_array_dims(type_id, dims)
return dims
end
function h5t_get_fields(type_id)
spos = Ref{Csize_t}()
epos = Ref{Csize_t}()
esize = Ref{Csize_t}()
mpos = Ref{Csize_t}()
msize = Ref{Csize_t}()
h5t_get_fields(type_id, spos, epos, esize, mpos, msize)
return (spos[], epos[], esize[], mpos[], msize[])
end
# Note: The following two functions implement direct ccalls because the binding generator
# cannot (yet) do the string wrapping and memory freeing.
"""
h5t_get_member_name(type_id::hid_t, index::Cuint) -> String
See `libhdf5` documentation for [`H5T_GET_MEMBER_NAME`](https://portal.hdfgroup.org/display/HDF5/H5T_GET_MEMBER_NAME).
"""
function h5t_get_member_name(type_id, index)
pn = ccall((:H5Tget_member_name, libhdf5), Ptr{UInt8}, (hid_t, Cuint), type_id, index)
if pn == C_NULL
@h5error("Error getting name of compound datatype member #$index")
end
s = unsafe_string(pn)
h5_free_memory(pn)
return s
end
"""
h5t_get_tag(type_id::hid_t) -> String
See `libhdf5` documentation for [`H5T_GET_TAG`](https://portal.hdfgroup.org/display/HDF5/H5T_GET_TAG).
"""
function h5t_get_tag(type_id)
pc = ccall((:H5Tget_tag, libhdf5), Ptr{UInt8}, (hid_t,), type_id)
if pc == C_NULL
@h5error("Error getting opaque tag")
end
s = unsafe_string(pc)
h5_free_memory(pc)
return s
end
h5t_get_native_type(type_id) = h5t_get_native_type(type_id, H5T_DIR_ASCEND)
###
### Optimized Functions Interface
###
###
### HDF5 Lite Interface
###
function h5lt_dtype_to_text(dtype_id)
len = Ref{Csize_t}()
h5lt_dtype_to_text(dtype_id, C_NULL, 0, len)
buf = StringVector(len[] - 1)
h5lt_dtype_to_text(dtype_id, buf, 0, len)
return String(buf)
end
###
### Table Interface
###
function h5tb_get_table_info(loc_id, table_name)
nfields = Ref{hsize_t}()
nrecords = Ref{hsize_t}()
h5tb_get_table_info(loc_id, table_name, nfields, nrecords)
return nfields[], nrecords[]
end
function h5tb_get_field_info(loc_id, table_name)
nfields, = h5tb_get_table_info(loc_id, table_name)
field_sizes = Vector{Csize_t}(undef, nfields)
field_offsets = Vector{Csize_t}(undef, nfields)
type_size = Ref{Csize_t}()
# pass C_NULL to field_names argument since libhdf5 does not provide a way to determine if the
# allocated buffer is the correct length, which is thus susceptible to a buffer overflow if
# an incorrect buffer length is passed. Instead, we manually compute the column names using the
# same calls that h5tb_get_field_info internally uses.
h5tb_get_field_info(loc_id, table_name, C_NULL, field_sizes, field_offsets, type_size)
did = h5d_open(loc_id, table_name, H5P_DEFAULT)
tid = h5d_get_type(did)
h5d_close(did)
field_names = [h5t_get_member_name(tid, i - 1) for i in 1:nfields]
h5t_close(tid)
return field_names, field_sizes, field_offsets, type_size[]
end
###
### Filter Interface
###
function h5z_get_filter_info(filter)
ref = Ref{Cuint}()
h5z_get_filter_info(filter, ref)
ref[]
end
###
### MPIO
###
# define these stubs, but can't define the methods as the types aren't
# known until MPI.jl is loaded.
"""
h5p_get_fapl_mpio(fapl_id::hid_t, comm::Ptr{MPI.MPI_Comm}, info::Ptr{MPI.MPI_Info})
See `libhdf5` documentation for [`H5Pget_fapl_mpio`](https://portal.hdfgroup.org/display/HDF5/H5P_GET_FAPL_MPIO32).
"""
function h5p_get_fapl_mpio end
"""
h5p_set_fapl_mpio(fapl_id::hid_t, comm::MPI.MPI_Comm, info::MPI.MPI_Info)
See `libhdf5` documentation for [`H5Pset_fapl_mpio`](https://portal.hdfgroup.org/display/HDF5/H5P_SET_FAPL_MPIO32).
"""
function h5p_set_fapl_mpio end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 381 | const liblock = ReentrantLock()
# Try to acquire the lock (test-test-set) and close if successful
# Otherwise, defer finalization
# https://github.com/JuliaIO/HDF5.jl/issues/1048
function try_close_finalizer(x)
if !islocked(liblock) && trylock(liblock) do
close(x)
true
end
else
finalizer(try_close_finalizer, x)
end
return nothing
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 22214 | ## C types
const Ctime_t = Int
## HDF5 types and constants
const haddr_t = UInt64
const hbool_t = Bool
const herr_t = Cint
const hid_t = Int64
const hsize_t = UInt64
const hssize_t = Int64
const htri_t = Cint # pseudo-boolean (negative if error)
@static if Sys.iswindows()
const off_t = Int64
else
const off_t = Int
end
const H5Z_filter_t = Cint
# For VLEN
struct hvl_t
len::Csize_t
p::Ptr{Cvoid}
end
const HVL_SIZE = sizeof(hvl_t)
# Reference types
struct hobj_ref_t
buf::UInt64 # H5R_OBJ_REF_BUF_SIZE bytes
end
struct hdset_reg_ref_t
buf::NTuple{12,UInt8} # H5R_DSET_REG_REF_BUF_SIZE bytes
end
const HOBJ_REF_T_NULL = hobj_ref_t(0x0)
const HDSET_REG_REF_T_NULL = hdset_reg_ref_t(ntuple(_ -> 0x0, Val(12)))
#= TODO: when upgraded to using newer HDF5 v1.12 reference API, can replace both with:
struct H5R_ref_t
# Element type of UInt64 and not UInt8 to get 8-byte alignment
buf::NTuple{8,UInt64} # H5R_REF_BUF_SIZE bytes
end
const H5R_REF_T_NULL = H5R_ref_t(ntuple(_ -> 0x0, Val(64)))
=#
# For attribute information
struct H5A_info_t
corder_valid::hbool_t
corder::UInt32 # typedef uint32_t H5O_msg_crt_idx_t
cset::Cint
data_size::hsize_t
end
# For group information
struct H5G_info_t
storage_type::Cint # enum H5G_storage_type_t
nlinks::hsize_t
max_corder::Int64
mounted::hbool_t
end
# For objects
@enum H5_index_t::Cint begin
H5_INDEX_UNKNOWN = -1
H5_INDEX_NAME = 0
H5_INDEX_CRT_ORDER = 1
H5_INDEX_N = 2
end
@enum H5_iter_order_t::Cint begin
H5_ITER_UNKNOWN = -1
H5_ITER_INC = 0
H5_ITER_DEC = 1
H5_ITER_NATIVE = 2
H5_ITER_N = 3
end
@enum H5_iter_t::Cint begin
H5_ITER_CONT = 0
H5_ITER_ERROR = -1
H5_ITER_STOP = 1
end
Base.convert(::Type{H5_iter_t}, x::Integer) = H5_iter_t(x)
const H5O_iterate1_t = Ptr{Cvoid}
const H5O_iterate2_t = Ptr{Cvoid}
struct _H5O_hdr_info_t_space
total::hsize_t
meta::hsize_t
mesg::hsize_t
free::hsize_t
end
struct _H5O_hdr_info_t_mesg
present::UInt64
shared::UInt64
end
struct H5O_hdr_info_t
version::Cuint
nmesgs::Cuint
nchunks::Cuint
flags::Cuint
space::_H5O_hdr_info_t_space
mesg::_H5O_hdr_info_t_mesg
end
struct H5_ih_info_t
index_size::hsize_t
heap_size::hsize_t
end
struct _H5O_native_info_t_meta_size
obj::H5_ih_info_t
attr::H5_ih_info_t
end
struct H5O_native_info_t
hdr::H5O_hdr_info_t
meta_size::_H5O_native_info_t_meta_size
end
const H5O_NATIVE_INFO_HDR = 0x0008
const H5O_NATIVE_INFO_META_SIZE = 0x0010
const H5O_NATIVE_INFO_ALL = H5O_NATIVE_INFO_HDR | H5O_NATIVE_INFO_META_SIZE
struct H5O_token_t
__data::NTuple{16,UInt8}
end
@enum H5O_type_t::Cint begin
H5O_TYPE_UNKNOWN = -1
H5O_TYPE_GROUP = 0
H5O_TYPE_DATASET = 1
H5O_TYPE_NAMED_DATATYPE = 2
H5O_TYPE_MAP = 3
H5O_TYPE_NTYPES = 4
end
struct H5O_info1_t # version 1 type H5O_info1_t
fileno::Culong
addr::haddr_t
otype::H5O_type_t # enum H5O_type_t
rc::Cuint
atime::Ctime_t
mtime::Ctime_t
ctime::Ctime_t
btime::Ctime_t
num_attrs::hsize_t
#{ inlined struct H5O_hdr_info_t named type
version::Cuint
nmesgs::Cuint
nchunks::Cuint
flags::Cuint
total::hsize_t
meta::hsize_t
mesg::hsize_t
free::hsize_t
present::UInt64
shared::UInt64
#}
#{ inlined anonymous struct named meta_size
meta_obj::H5_ih_info_t
meta_attr::H5_ih_info_t
#}
end
const H5O_info_t = H5O_info1_t
# The field "otype" is originally "type"
# Alias "otype" as "type" for compat with H5O_info2_t
Base.getproperty(oinfo::H5O_info1_t, field::Symbol) =
field == :type ? getfield(oinfo, :otype) : getfield(oinfo, field)
struct H5O_info2_t
fileno::Culong
token::H5O_token_t
type::H5O_type_t
rc::Cuint
atime::Ctime_t
mtime::Ctime_t
ctime::Ctime_t
btime::Ctime_t
num_attrs::hsize_t
end
# For links
struct H5L_info_t
linktype::Cint
corder_valid::hbool_t
corder::Int64
cset::Cint # enum H5T_cset_t
u::haddr_t # union { haddr_t address, size_t val_size }
end
# For registering filters
struct H5Z_class_t # version 2 type H5Z_class2_t
version::Cint # = H5Z_CLASS_T_VERS
id::H5Z_filter_t # Filter ID number
encoder_present::Cuint # Does this filter have an encoder?
decoder_present::Cuint # Does this filter have a decoder?
name::Ptr{UInt8} # Comment for debugging
can_apply::Ptr{Cvoid} # The "can apply" callback
set_local::Ptr{Cvoid} # The "set local" callback
filter::Ptr{Cvoid} # The filter callback
end
# Information about an error; element of error stack.
# See https://github.com/HDFGroup/hdf5/blob/hdf5-1_12_0/src/H5Epublic.h#L36-L44
struct H5E_error2_t
cls_id::hid_t # class ID"
maj_num::hid_t # major error ID
min_num::hid_t # minor error number
line::Cuint # line in file where the error occurs
func_name::Cstring # function where the error occurred
file_name::Cstring # file in which the error occurred
desc::Cstring # optional supplied description
end
# HDFS Drivers
struct H5FD_hdfs_fapl_t
version::Int32
namenode_name::NTuple{129,Cchar}
namenode_port::Int32
user_name::NTuple{129,Cchar}
kerberos_ticket_cache::NTuple{129,Cchar}
stream_buffer_size::Int32
end
const H5FD_ROS3_MAX_REGION_LEN = 32
const H5FD_ROS3_MAX_SECRET_ID_LEN = 128
const H5FD_ROS3_MAX_SECRET_KEY_LEN = 128
struct H5FD_ros3_fapl_t
version::Int32
authenticate::hbool_t
aws_region::NTuple{H5FD_ROS3_MAX_REGION_LEN + 1,Cchar}
secret_id::NTuple{H5FD_ROS3_MAX_SECRET_ID_LEN + 1,Cchar}
secret_key::NTuple{H5FD_ROS3_MAX_SECRET_KEY_LEN + 1,Cchar}
end
struct H5FD_splitter_vfd_config_t
magic::Int32
version::Cuint
rw_fapl_id::hid_t
wo_fapl_id::hid_t
wo_path::NTuple{4097,Cchar}
log_file_path::NTuple{4097,Cchar}
ignore_wo_errs::hbool_t
end
# Private function to extract exported global library constants.
# Need to call H5open to ensure the library is initialized before reading these constants.
# Although these are runtime-initialized constants, in practice their values are stable, so
# we can precompile for improved latency.
const libhdf5handle = Ref(dlopen(libhdf5))
ccall(dlsym(libhdf5handle[], :H5open), herr_t, ())
_read_const(sym::Symbol) = unsafe_load(cglobal(dlsym(libhdf5handle[], sym), hid_t))
_has_symbol(sym::Symbol) = dlsym(libhdf5handle[], sym; throw_error=false) !== nothing
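# For example, `_read_const(:H5T_NATIVE_DOUBLE_g)` loads the runtime value of the
# exported global `H5T_NATIVE_DOUBLE_g`; the type and property-list constants below
# are obtained this way.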
# iteration order constants
# Moved to H5_iter_order_t enum
#const H5_ITER_UNKNOWN = -1
#const H5_ITER_INC = 0
#const H5_ITER_DEC = 1
#const H5_ITER_NATIVE = 2
#const H5_ITER_N = 3
# indexing type constants
# Moved to H5_index_t enum
#const H5_INDEX_UNKNOWN = -1
#const H5_INDEX_NAME = 0
#const H5_INDEX_CRT_ORDER = 1
#const H5_INDEX_N = 2
# dataset constants
const H5D_COMPACT = 0
const H5D_CONTIGUOUS = 1
const H5D_CHUNKED = 2
const H5D_VIRTUAL = 3
# allocation times (C enum H5D_alloc_time_t)
const H5D_ALLOC_TIME_ERROR = -1
const H5D_ALLOC_TIME_DEFAULT = 0
const H5D_ALLOC_TIME_EARLY = 1
const H5D_ALLOC_TIME_LATE = 2
const H5D_ALLOC_TIME_INCR = 3
# used to "unset" chunk cache configuration parameters
const H5D_CHUNK_CACHE_NSLOTS_DEFAULT = -1 % Csize_t
const H5D_CHUNK_CACHE_NBYTES_DEFAULT = -1 % Csize_t
const H5D_CHUNK_CACHE_W0_DEFAULT = Cdouble(-1)
# space status
const H5D_SPACE_STATUS_ERROR = Cint(-1)
const H5D_SPACE_STATUS_NOT_ALLOCATED = Cint(0)
const H5D_SPACE_STATUS_PART_ALLOCATED = Cint(1)
const H5D_SPACE_STATUS_ALLOCATED = Cint(2)
const H5D_space_status_t = Cint
# error-related constants
const H5E_DEFAULT = 0
const H5E_WALK_UPWARD = 0
const H5E_WALK_DOWNWARD = 1
# file access modes
const H5F_ACC_RDONLY = 0x0000
const H5F_ACC_RDWR = 0x0001
const H5F_ACC_TRUNC = 0x0002
const H5F_ACC_EXCL = 0x0004
const H5F_ACC_DEBUG = 0x0008
const H5F_ACC_CREAT = 0x0010
const H5F_ACC_SWMR_WRITE = 0x0020
const H5F_ACC_SWMR_READ = 0x0040
# Library versions
@enum H5F_libver_t::Int32 begin
H5F_LIBVER_ERROR = -1
H5F_LIBVER_EARLIEST = 0
H5F_LIBVER_V18 = 1
H5F_LIBVER_V110 = 2
H5F_LIBVER_V112 = 3
H5F_LIBVER_V114 = 4
H5F_LIBVER_V116 = 5
H5F_LIBVER_NBOUNDS = 6
end
# H5F_LIBVER_LATEST defined in helpers.jl
# object types
const H5F_OBJ_FILE = 0x0001
const H5F_OBJ_DATASET = 0x0002
const H5F_OBJ_GROUP = 0x0004
const H5F_OBJ_DATATYPE = 0x0008
const H5F_OBJ_ATTR = 0x0010
const H5F_OBJ_ALL = (H5F_OBJ_FILE | H5F_OBJ_DATASET | H5F_OBJ_GROUP | H5F_OBJ_DATATYPE | H5F_OBJ_ATTR)
const H5F_OBJ_LOCAL = 0x0020
# other file constants
const H5F_SCOPE_LOCAL = 0
const H5F_SCOPE_GLOBAL = 1
const H5F_CLOSE_DEFAULT = 0
const H5F_CLOSE_WEAK = 1
const H5F_CLOSE_SEMI = 2
const H5F_CLOSE_STRONG = 3
# file driver constants
const H5FD_MPIO_INDEPENDENT = 0
const H5FD_MPIO_COLLECTIVE = 1
const H5FD_MPIO_CHUNK_DEFAULT = 0
const H5FD_MPIO_CHUNK_ONE_IO = 1
const H5FD_MPIO_CHUNK_MULTI_IO = 2
const H5FD_MPIO_COLLECTIVE_IO = 0
const H5FD_MPIO_INDIVIDUAL_IO = 1
# object types (C enum H5Itype_t)
const H5I_FILE = 1
const H5I_GROUP = 2
const H5I_DATATYPE = 3
const H5I_DATASPACE = 4
const H5I_DATASET = 5
const H5I_ATTR = 6
const H5I_REFERENCE = 7
const H5I_VFL = 8
# Link constants
const H5L_TYPE_HARD = 0
const H5L_TYPE_SOFT = 1
const H5L_TYPE_EXTERNAL = 2
# H5O_INFO constants
const H5O_INFO_BASIC = Cuint(0x0001)
const H5O_INFO_TIME = Cuint(0x0002)
const H5O_INFO_NUM_ATTRS = Cuint(0x0004)
const H5O_INFO_HDR = Cuint(0x0008)
const H5O_INFO_META_SIZE = Cuint(0x0010)
const H5O_INFO_ALL =
H5O_INFO_BASIC | H5O_INFO_TIME | H5O_INFO_NUM_ATTRS | H5O_INFO_HDR | H5O_INFO_META_SIZE
# Object constants
# Moved to H5O_type_t enum
#const H5O_TYPE_GROUP = 0
#const H5O_TYPE_DATASET = 1
#const H5O_TYPE_NAMED_DATATYPE = 2
# Property constants
const H5P_DEFAULT = hid_t(0)
const H5P_OBJECT_CREATE = _read_const(:H5P_CLS_OBJECT_CREATE_ID_g)
const H5P_FILE_CREATE = _read_const(:H5P_CLS_FILE_CREATE_ID_g)
const H5P_FILE_ACCESS = _read_const(:H5P_CLS_FILE_ACCESS_ID_g)
const H5P_DATASET_CREATE = _read_const(:H5P_CLS_DATASET_CREATE_ID_g)
const H5P_DATASET_ACCESS = _read_const(:H5P_CLS_DATASET_ACCESS_ID_g)
const H5P_DATASET_XFER = _read_const(:H5P_CLS_DATASET_XFER_ID_g)
const H5P_FILE_MOUNT = _read_const(:H5P_CLS_FILE_MOUNT_ID_g)
const H5P_GROUP_CREATE = _read_const(:H5P_CLS_GROUP_CREATE_ID_g)
const H5P_GROUP_ACCESS = _read_const(:H5P_CLS_GROUP_ACCESS_ID_g)
const H5P_DATATYPE_CREATE = _read_const(:H5P_CLS_DATATYPE_CREATE_ID_g)
const H5P_DATATYPE_ACCESS = _read_const(:H5P_CLS_DATATYPE_ACCESS_ID_g)
const H5P_STRING_CREATE = _read_const(:H5P_CLS_STRING_CREATE_ID_g)
const H5P_ATTRIBUTE_ACCESS = _read_const(:H5P_CLS_ATTRIBUTE_ACCESS_ID_g)
const H5P_ATTRIBUTE_CREATE = _read_const(:H5P_CLS_ATTRIBUTE_CREATE_ID_g)
const H5P_OBJECT_COPY = _read_const(:H5P_CLS_OBJECT_COPY_ID_g)
const H5P_LINK_CREATE = _read_const(:H5P_CLS_LINK_CREATE_ID_g)
const H5P_LINK_ACCESS = _read_const(:H5P_CLS_LINK_ACCESS_ID_g)
# Plugin constants, H5PL_type_t
const H5PL_TYPE_ERROR = -1
const H5PL_TYPE_FILTER = 0
const H5PL_TYPE_VOL = 1
const H5PL_TYPE_NONE = 2
const H5P_CRT_ORDER_TRACKED = 1
const H5P_CRT_ORDER_INDEXED = 2
# Reference constants
const H5R_OBJECT = 0
const H5R_DATASET_REGION = 1
const H5R_OBJ_REF_BUF_SIZE = 8 # == sizeof(hobj_ref_t)
const H5R_DSET_REG_REF_BUF_SIZE = 12 # == sizeof(hdset_reg_ref_t)
# Dataspace constants
const H5S_ALL = hid_t(0)
const H5S_UNLIMITED = typemax(hsize_t)
# Dataspace classes (C enum H5S_class_t)
@enum H5S_class_t::Cint begin
H5S_NO_CLASS = -1
H5S_SCALAR = 0
H5S_SIMPLE = 1
H5S_NULL = 2
end
# Dataspace selection constants (C enum H5S_seloper_t)
@enum H5S_seloper_t::Cint begin
H5S_SELECT_NOOP = -1
H5S_SELECT_SET = 0
H5S_SELECT_OR = 1
H5S_SELECT_AND = 2
H5S_SELECT_XOR = 3
H5S_SELECT_NOTB = 4
H5S_SELECT_NOTA = 5
H5S_SELECT_APPEND = 6
H5S_SELECT_PREPEND = 7
H5S_SELECT_INVALID = 8
end
# Dataspace selection types (C enum H5S_sel_type)
@enum H5S_sel_type::Cint begin
H5S_SEL_ERROR = -1
H5S_SEL_NONE = 0
H5S_SEL_POINTS = 1
H5S_SEL_HYPERSLABS = 2
H5S_SEL_ALL = 3
H5S_SEL_N = 4
end
# type classes (C enum H5T_class_t)
const H5T_NO_CLASS = hid_t(-1)
const H5T_INTEGER = hid_t(0)
const H5T_FLOAT = hid_t(1)
const H5T_TIME = hid_t(2) # not supported by HDF5 library
const H5T_STRING = hid_t(3)
const H5T_BITFIELD = hid_t(4)
const H5T_OPAQUE = hid_t(5)
const H5T_COMPOUND = hid_t(6)
const H5T_REFERENCE = hid_t(7)
const H5T_ENUM = hid_t(8)
const H5T_VLEN = hid_t(9)
const H5T_ARRAY = hid_t(10)
# Byte orders (C enum H5T_order_t)
const H5T_ORDER_ERROR = -1 # error
const H5T_ORDER_LE = 0 # little endian
const H5T_ORDER_BE = 1 # bit endian
const H5T_ORDER_VAX = 2 # VAX mixed endian
const H5T_ORDER_MIXED = 3 # Compound type with mixed member orders
const H5T_ORDER_NONE = 4 # no particular order (strings, bits,..)
# Floating-point normalization schemes (C enum H5T_norm_t)
const H5T_NORM_ERROR = -1 # error
const H5T_NORM_IMPLIED = 0 # msb of mantissa isn't stored, always 1
const H5T_NORM_MSBSET = 1 # msb of mantissa is always 1
const H5T_NORM_NONE = 2 # not normalized
# Character types
const H5T_CSET_ASCII = 0
const H5T_CSET_UTF8 = 1
# Sign types (C enum H5T_sign_t)
const H5T_SGN_ERROR = Cint(-1) # error
const H5T_SGN_NONE = Cint(0) # unsigned
const H5T_SGN_2 = Cint(1) # 2's complement
const H5T_NSGN = Cint(2) # sentinel: this must be last!
# Search directions
const H5T_DIR_ASCEND = 1
const H5T_DIR_DESCEND = 2
# String padding modes
const H5T_STR_NULLTERM = 0
const H5T_STR_NULLPAD = 1
const H5T_STR_SPACEPAD = 2
# Type_id constants (LE = little endian, I16 = Int16, etc)
const H5T_STD_I8LE = _read_const(:H5T_STD_I8LE_g)
const H5T_STD_I8BE = _read_const(:H5T_STD_I8BE_g)
const H5T_STD_U8LE = _read_const(:H5T_STD_U8LE_g)
const H5T_STD_U8BE = _read_const(:H5T_STD_U8BE_g)
const H5T_STD_I16LE = _read_const(:H5T_STD_I16LE_g)
const H5T_STD_I16BE = _read_const(:H5T_STD_I16BE_g)
const H5T_STD_U16LE = _read_const(:H5T_STD_U16LE_g)
const H5T_STD_U16BE = _read_const(:H5T_STD_U16BE_g)
const H5T_STD_I32LE = _read_const(:H5T_STD_I32LE_g)
const H5T_STD_I32BE = _read_const(:H5T_STD_I32BE_g)
const H5T_STD_U32LE = _read_const(:H5T_STD_U32LE_g)
const H5T_STD_U32BE = _read_const(:H5T_STD_U32BE_g)
const H5T_STD_I64LE = _read_const(:H5T_STD_I64LE_g)
const H5T_STD_I64BE = _read_const(:H5T_STD_I64BE_g)
const H5T_STD_U64LE = _read_const(:H5T_STD_U64LE_g)
const H5T_STD_U64BE = _read_const(:H5T_STD_U64BE_g)
const H5T_IEEE_F32LE = _read_const(:H5T_IEEE_F32LE_g)
const H5T_IEEE_F32BE = _read_const(:H5T_IEEE_F32BE_g)
const H5T_IEEE_F64LE = _read_const(:H5T_IEEE_F64LE_g)
const H5T_IEEE_F64BE = _read_const(:H5T_IEEE_F64BE_g)
const H5T_C_S1 = _read_const(:H5T_C_S1_g)
const H5T_STD_REF_OBJ = _read_const(:H5T_STD_REF_OBJ_g)
const H5T_STD_REF_DSETREG = _read_const(:H5T_STD_REF_DSETREG_g)
# Native types
const H5T_NATIVE_B8 = _read_const(:H5T_NATIVE_B8_g)
const H5T_NATIVE_INT8 = _read_const(:H5T_NATIVE_INT8_g)
const H5T_NATIVE_UINT8 = _read_const(:H5T_NATIVE_UINT8_g)
const H5T_NATIVE_INT16 = _read_const(:H5T_NATIVE_INT16_g)
const H5T_NATIVE_UINT16 = _read_const(:H5T_NATIVE_UINT16_g)
const H5T_NATIVE_INT32 = _read_const(:H5T_NATIVE_INT32_g)
const H5T_NATIVE_UINT32 = _read_const(:H5T_NATIVE_UINT32_g)
const H5T_NATIVE_INT64 = _read_const(:H5T_NATIVE_INT64_g)
const H5T_NATIVE_UINT64 = _read_const(:H5T_NATIVE_UINT64_g)
const H5T_NATIVE_FLOAT = _read_const(:H5T_NATIVE_FLOAT_g)
const H5T_NATIVE_DOUBLE = _read_const(:H5T_NATIVE_DOUBLE_g)
# Other type constants
const H5T_VARIABLE = reinterpret(UInt, -1)
# Filter constants
const H5Z_FLAG_MANDATORY = 0x0000
const H5Z_FLAG_OPTIONAL = 0x0001
const H5Z_FLAG_REVERSE = 0x0100
const H5Z_CLASS_T_VERS = 1
# predefined filters
const H5Z_FILTER_ALL = H5Z_filter_t(0)
const H5Z_FILTER_NONE = H5Z_filter_t(0)
const H5Z_FILTER_DEFLATE = H5Z_filter_t(1)
const H5Z_FILTER_SHUFFLE = H5Z_filter_t(2)
const H5Z_FILTER_FLETCHER32 = H5Z_filter_t(3)
const H5Z_FILTER_SZIP = H5Z_filter_t(4)
const H5Z_FILTER_NBIT = H5Z_filter_t(5)
const H5Z_FILTER_SCALEOFFSET = H5Z_filter_t(6)
const H5_SZIP_EC_OPTION_MASK = Cuint(4)
const H5_SZIP_NN_OPTION_MASK = Cuint(32)
const H5_SZIP_MAX_PIXELS_PER_BLOCK = Cuint(32)
const H5Z_FILTER_CONFIG_ENCODE_ENABLED = 0x0001
const H5Z_FILTER_CONFIG_DECODE_ENABLED = 0x0002
# fill time
@enum H5D_fill_time_t::Int32 begin
H5D_FILL_TIME_ERROR = -1
H5D_FILL_TIME_ALLOC = 0
H5D_FILL_TIME_NEVER = 1
H5D_FILL_TIME_IFSET = 2
end
@enum H5F_mem_t::Int32 begin
H5FD_MEM_NOLIST = -1
H5FD_MEM_DEFAULT = 0
H5FD_MEM_SUPER = 1
H5FD_MEM_BTREE = 2
H5FD_MEM_DRAW = 3
H5FD_MEM_GHEAP = 4
H5FD_MEM_LHEAP = 5
H5FD_MEM_OHDR = 6
H5FD_MEM_NTYPES = 7
end
const H5FD_mem_t = H5F_mem_t
struct H5FD_file_image_callbacks_t
image_malloc::Ptr{Cvoid}
image_memcpy::Ptr{Cvoid}
image_realloc::Ptr{Cvoid}
image_free::Ptr{Cvoid}
udata_copy::Ptr{Cvoid}
udata_free::Ptr{Cvoid}
udata::Ptr{Cvoid}
end
struct H5F_sect_info_t
addr::haddr_t
size::hsize_t
end
@enum H5F_fspace_strategy_t::UInt32 begin
H5F_FSPACE_STRATEGY_FSM_AGGR = 0
H5F_FSPACE_STRATEGY_PAGE = 1
H5F_FSPACE_STRATEGY_AGGR = 2
H5F_FSPACE_STRATEGY_NONE = 3
H5F_FSPACE_STRATEGY_NTYPES = 4
end
@enum H5F_file_space_type_t::UInt32 begin
H5F_FILE_SPACE_DEFAULT = 0
H5F_FILE_SPACE_ALL_PERSIST = 1
H5F_FILE_SPACE_ALL = 2
H5F_FILE_SPACE_AGGR_VFD = 3
H5F_FILE_SPACE_VFD = 4
H5F_FILE_SPACE_NTYPES = 5
end
@enum H5Z_EDC_t::Int32 begin
H5Z_ERROR_EDC = -1
H5Z_DISABLE_EDC = 0
H5Z_ENABLE_EDC = 1
H5Z_NO_EDC = 2
end
# Callbacks
# typedef herr_t ( * H5P_prp_cb1_t ) ( const char * name , size_t size , void * value )
const H5P_prp_cb1_t = Ptr{Cvoid}
const H5P_prp_copy_func_t = H5P_prp_cb1_t
# typedef int ( * H5P_prp_compare_func_t ) ( const void * value1 , const void * value2 , size_t size )
const H5P_prp_compare_func_t = Ptr{Cvoid}
const H5P_prp_close_func_t = H5P_prp_cb1_t
const H5P_prp_create_func_t = H5P_prp_cb1_t
const H5P_prp_cb2_t = Ptr{Cvoid}
const H5P_prp_set_func_t = H5P_prp_cb2_t
const H5P_prp_get_func_t = H5P_prp_cb2_t
const H5P_prp_delete_func_t = H5P_prp_cb2_t
const H5P_cls_create_func_t = Ptr{Cvoid}
const H5P_cls_copy_func_t = Ptr{Cvoid}
const H5P_cls_close_func_t = Ptr{Cvoid}
const H5P_iterate_t = Ptr{Cvoid}
const H5D_append_cb_t = Ptr{Cvoid}
const H5L_elink_traverse_t = Ptr{Cvoid}
# typedef herr_t ( * H5F_flush_cb_t ) ( hid_t object_id , void * udata )
const H5F_flush_cb_t = Ptr{Cvoid}
# typedef H5O_mcdt_search_ret_t ( * H5O_mcdt_search_cb_t ) ( void * op_data )
const H5O_mcdt_search_cb_t = Ptr{Cvoid}
# typedef herr_t ( * H5T_conv_t ) ( hid_t src_id , hid_t dst_id , H5T_cdata_t * cdata , size_t nelmts , size_t buf_stride , size_t bkg_stride , void * buf , void * bkg , hid_t dset_xfer_plist )
const H5T_conv_t = Ptr{Cvoid}
# typedef H5T_conv_ret_t ( * H5T_conv_except_func_t ) ( H5T_conv_except_t except_type , hid_t src_id , hid_t dst_id , void * src_buf , void * dst_buf , void * user_data )
const H5T_conv_except_func_t = Ptr{Cvoid}
# typedef herr_t ( * H5M_iterate_t ) ( hid_t map_id , const void * key , void * op_data )
const H5M_iterate_t = Ptr{Cvoid}
# typedef void * ( * H5MM_allocate_t ) ( size_t size , void * alloc_info )
const H5MM_allocate_t = Ptr{Cvoid}
# typedef void ( * H5MM_free_t ) ( void * mem , void * free_info )
const H5MM_free_t = Ptr{Cvoid}
# typedef H5Z_cb_return_t ( * H5Z_filter_func_t ) ( H5Z_filter_t filter , void * buf , size_t buf_size , void * op_data )
const H5Z_filter_func_t = Ptr{Cvoid}
struct H5Z_cb_t
func::H5Z_filter_func_t
op_data::Ptr{Cvoid}
end
@enum H5C_cache_incr_mode::UInt32 begin
H5C_incr__off = 0
H5C_incr__threshold = 1
end
@enum H5C_cache_flash_incr_mode::UInt32 begin
H5C_flash_incr__off = 0
H5C_flash_incr__add_space = 1
end
@enum H5C_cache_decr_mode::UInt32 begin
H5C_decr__off = 0
H5C_decr__threshold = 1
H5C_decr__age_out = 2
H5C_decr__age_out_with_threshold = 3
end
struct H5AC_cache_config_t
version::Cint
rpt_fcn_enabled::hbool_t
open_trace_file::hbool_t
close_trace_file::hbool_t
trace_file_name::NTuple{1025,Cchar}
evictions_enabled::hbool_t
set_initial_size::hbool_t
initial_size::Csize_t
min_clean_fraction::Cdouble
max_size::Csize_t
min_size::Csize_t
epoch_length::Clong
incr_mode::H5C_cache_incr_mode
lower_hr_threshold::Cdouble
increment::Cdouble
apply_max_increment::hbool_t
max_increment::Csize_t
flash_incr_mode::H5C_cache_flash_incr_mode
flash_multiple::Cdouble
flash_threshold::Cdouble
decr_mode::H5C_cache_decr_mode
upper_hr_threshold::Cdouble
decrement::Cdouble
apply_max_decrement::hbool_t
max_decrement::Csize_t
epochs_before_eviction::Cint
apply_empty_reserve::hbool_t
empty_reserve::Cdouble
dirty_bytes_threshold::Csize_t
metadata_write_strategy::Cint
end
struct H5AC_cache_image_config_t
version::Cint
generate_image::hbool_t
save_resize_status::hbool_t
entry_ageout::Cint
end
@enum H5D_vds_view_t::Int32 begin
H5D_VDS_ERROR = -1
H5D_VDS_FIRST_MISSING = 0
H5D_VDS_LAST_AVAILABLE = 1
end
@enum H5D_fill_value_t::Int32 begin
H5D_FILL_VALUE_ERROR = -1
H5D_FILL_VALUE_UNDEFINED = 0
H5D_FILL_VALUE_DEFAULT = 1
H5D_FILL_VALUE_USER_DEFINED = 2
end
struct H5F_info2_super
version::Cuint
super_size::hsize_t
super_ext_size::hsize_t
end
struct H5F_info2_free
version::Cuint
meta_size::hsize_t
tot_space::hsize_t
end
struct H5F_info2_sohm
version::Cuint
hdr_size::hsize_t
msgs_info::H5_ih_info_t
end
struct H5F_info2_t
super::H5F_info2_super
free::H5F_info2_free
sohm::H5F_info2_sohm
end
struct H5F_retry_info_t
nbins::Cuint
retries::NTuple{21,Ptr{UInt32}}
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 4119 | module Drivers
export POSIX, ROS3
import ..API
import ..HDF5: HDF5, Properties, h5doc
using Libdl: dlopen, dlsym
if !isdefined(Base, :get_extension)
using Requires: @require
end
function get_driver(p::Properties)
driver_id = API.h5p_get_driver(p)
D = get(DRIVERS, driver_id) do
error("Unknown driver type")
end
get_driver(p, D)
end
abstract type Driver end
const DRIVERS = Dict{API.hid_t,Type{<:Driver}}()
"""
Core([increment::Csize_t, backing_store::Cuint, [write_tracking::Cuint, page_size::Csize_t]])
Core(; increment::Csize_t = 8192, backing_store::Cuint = true, write_tracking::Cuint = false, page_size::Csize_t = 524288)
# Arguments
* `increment`: specifies the increment by which allocated memory is to be increased each time more memory is required. (default: 8192)
* `backing_store`: Boolean flag indicating whether to write the file contents to disk when the file is closed. (default: true)
* `write_tracking`: Boolean flag indicating whether write tracking is enabled. (default: false)
* `page_size`: Size, in bytes, of write aggregation pages. (default: 524288)
"""
struct Core <: Driver
increment::Csize_t
backing_store::Cuint #Bool
write_tracking::Cuint #Bool
page_size::Csize_t
end
Core(increment, backing_store) = Core(increment, backing_store, false, 524288)
Core(; increment=8192, backing_store=true, write_tracking=false, page_size=524288) =
Core(increment, backing_store, write_tracking, page_size)
function get_driver(p::Properties, ::Type{Core})
r_increment = Ref{Csize_t}(0)
r_backing_store = Ref{Bool}(0)
r_write_tracking = Ref{Bool}(0)
r_page_size = Ref{Csize_t}(0)
API.h5p_get_fapl_core(p, r_increment, r_backing_store)
API.h5p_get_core_write_tracking(p, r_write_tracking, r_page_size)
return Core(r_increment[], r_backing_store[], r_write_tracking[], r_page_size[])
end
function set_driver!(fapl::Properties, driver::Core)
HDF5.init!(fapl)
API.h5p_set_fapl_core(fapl, driver.increment, driver.backing_store)
API.h5p_set_core_write_tracking(fapl, driver.write_tracking, driver.page_size)
DRIVERS[API.h5p_get_driver(fapl)] = Core
return nothing
end
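# Example usage (sketch): keep the file in memory via the Core driver. The filename
# and the `driver` keyword of `h5open` are illustrative here.
#     f = h5open("scratch.h5", "w"; driver=Drivers.Core(; backing_store=false))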
"""
POSIX()
Also referred to as SEC2, this driver uses POSIX file-system functions like read and
write to perform I/O to a single, permanent file on local disk with no system
buffering. This driver is POSIX-compliant and is the default file driver for all systems.
"""
struct POSIX <: Driver end
function get_driver(fapl::Properties, ::Type{POSIX})
POSIX()
end
function set_driver!(fapl::Properties, ::POSIX)
HDF5.init!(fapl)
API.h5p_set_fapl_sec2(fapl)
DRIVERS[API.h5p_get_driver(fapl)] = POSIX
return nothing
end
function __init__()
# Initialize POSIX key in DRIVERS
HDF5.FileAccessProperties() do fapl
set_driver!(fapl, POSIX())
end
# Check whether libhdf5 was compiled with parallel support.
HDF5.HAS_PARALLEL[] = API._has_symbol(:H5Pset_fapl_mpio)
HDF5.HAS_ROS3[] = API._has_symbol(:H5Pset_fapl_ros3)
@static if !isdefined(Base, :get_extension)
@require MPI = "da04e1cc-30fd-572f-bb4f-1f8673147195" include("../../ext/MPIExt.jl")
end
end
# The docstring for `MPIO` basically belongs to the struct `MPIO` in
# ext/MPIExt/MPIExt.jl. However, we need to document it here to make it
# work smoothly both for Julia without package extensions (up to v1.8) and
# with package extensions (v1.9 and newer).
@doc """
MPIO(comm::MPI.Comm, info::MPI.Info)
MPIO(comm::MPI.Comm; kwargs....)
The parallel MPI file driver. This requires the use of
[MPI.jl](https://github.com/JuliaParallel/MPI.jl), and a custom HDF5 binary that has been
built with MPI support.
- `comm` is the communicator over which the file will be opened.
- `info`/`kwargs` are MPI-IO options, and are passed to `MPI_FILE_OPEN`.
# See also
- [`HDF5.has_parallel`](@ref)
- [Parallel HDF5](@ref)
# External links
- $(h5doc("H5P_SET_FAPL_MPIO"))
- [Parallel HDF5](https://portal.hdfgroup.org/display/HDF5/Parallel+HDF5)
"""
function MPIO end
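# Example usage (sketch, requires MPI.jl and an MPI-enabled libhdf5); the file name
# and the `driver` keyword of `h5open` are illustrative:
#     using MPI; MPI.Init()
#     f = h5open("pdata.h5", "w"; driver=MPIO(MPI.COMM_WORLD))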
include("ros3.jl")
end # module
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 2833 | """
ROS3()
ROS3(aws_region::String, secret_id::String, secret_key::String)
ROS3(version::Int32, authenticate::Bool, aws_region::String, secret_id::String, secret_key::String)
This is the read-only virtual driver that enables access to HDF5 objects stored in AWS S3
"""
struct ROS3 <: Driver
version::Int32
authenticate::Bool
aws_region::String
secret_id::String
secret_key::String
function ROS3(version, authenticate, aws_region, secret_id, secret_key)
        length(aws_region) <= API.H5FD_ROS3_MAX_REGION_LEN || error(
            "length(aws_region), $(length(aws_region)), must be less than or equal to $(API.H5FD_ROS3_MAX_REGION_LEN)"
        )
        length(secret_id) <= API.H5FD_ROS3_MAX_SECRET_ID_LEN || error(
            "length(secret_id), $(length(secret_id)), must be less than or equal to $(API.H5FD_ROS3_MAX_SECRET_ID_LEN)"
        )
        length(secret_key) <= API.H5FD_ROS3_MAX_SECRET_KEY_LEN || error(
            "length(secret_key), $(length(secret_key)), must be less than or equal to $(API.H5FD_ROS3_MAX_SECRET_KEY_LEN)"
        )
new(version, authenticate, aws_region, secret_id, secret_key)
end
end
ROS3() = ROS3(1, false, "", "", "")
ROS3(region::AbstractString, id::AbstractString, key::AbstractString) =
ROS3(1, true, region, id, key)
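# Example usage (sketch): open a file stored on S3 read-only. The URL, region, and
# credential strings are placeholders, and the `driver` keyword of `h5open` is
# illustrative.
#     ros3 = ROS3("us-east-1", "<access key id>", "<secret access key>")
#     f = h5open("http://s3.us-east-1.amazonaws.com/bucket/data.h5", "r"; driver=ros3)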
function ROS3(driver::API.H5FD_ros3_fapl_t)
aws_region = _ntuple_to_string(driver.aws_region)
secret_id = _ntuple_to_string(driver.secret_id)
secret_key = _ntuple_to_string(driver.secret_key)
ROS3(driver.version, driver.authenticate, aws_region, secret_id, secret_key)
end
# keep the Ref rooted (GC.@preserve) while reading through its raw pointer
_ntuple_to_string(x) = (r = Ref(x); GC.@preserve r unsafe_string(Ptr{Cchar}(pointer_from_objref(r)), length(x)))
function Base.convert(::Type{API.H5FD_ros3_fapl_t}, driver::ROS3)
aws_region = ntuple(
i -> i <= length(driver.aws_region) ? Cchar(driver.aws_region[i]) : Cchar(0x0),
API.H5FD_ROS3_MAX_REGION_LEN + 1
)
secret_id = ntuple(
i -> i <= length(driver.secret_id) ? Cchar(driver.secret_id[i]) : Cchar(0x0),
API.H5FD_ROS3_MAX_SECRET_ID_LEN + 1
)
secret_key = ntuple(
i -> i <= length(driver.secret_key) ? Cchar(driver.secret_key[i]) : Cchar(0x0),
API.H5FD_ROS3_MAX_SECRET_KEY_LEN + 1
)
    return API.H5FD_ros3_fapl_t(
        driver.version, driver.authenticate, aws_region, secret_id, secret_key
    )
end
function get_driver(fapl::Properties, ::Type{ROS3})
r_fa = Ref{API.H5FD_ros3_fapl_t}()
API.h5p_get_fapl_ros3(fapl, r_fa)
return ROS3(r_fa[])
end
function set_driver!(fapl::Properties, driver::ROS3)
HDF5.has_ros3() || error(
"HDF5.jl has no ros3 support." *
" Make sure that you're using ROS3-enabled HDF5 libraries"
)
HDF5.init!(fapl)
API.h5p_set_fapl_ros3(fapl, Ref{API.H5FD_ros3_fapl_t}(driver))
DRIVERS[API.h5p_get_driver(fapl)] = ROS3
return nothing
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 4837 | """
Deflate(level=5)
Deflate/ZLIB lossless compression filter. `level` is an integer between 0 and 9,
inclusive, denoting the compression level, with 0 being no compression, 9 being the
highest compression (but slowest speed).
# External links
- $(h5doc("H5P_SET_DEFLATE"))
- [_Deflate_ on Wikipedia](https://en.wikipedia.org/wiki/Deflate)
"""
struct Deflate <: Filter
level::Cuint
end
Deflate(; level=5) = Deflate(level)
Base.show(io::IO, deflate::Deflate) = print(io, Deflate, "(level=", Int(deflate.level), ")")
filterid(::Type{Deflate}) = API.H5Z_FILTER_DEFLATE
FILTERS[API.H5Z_FILTER_DEFLATE] = Deflate
function Base.push!(f::FilterPipeline, deflate::Deflate)
API.h5p_set_deflate(f.plist, deflate.level)
return f
end
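# Example usage (sketch): request Deflate compression on a chunked dataset via the
# `filters` keyword of `create_dataset` (as in the module docstring).
#     create_dataset(f, "compressed", datatype(Float64), dataspace((100, 100));
#                    chunk=(10, 10), filters=Deflate(; level=3))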
"""
Shuffle()
The shuffle filter de-interlaces a block of data by reordering the bytes. All the bytes
from one consistent byte position of each data element are placed together in one block;
all bytes from a second consistent byte position of each data element are placed together
a second block; etc. For example, given three data elements of a 4-byte datatype stored as
012301230123, shuffling will re-order data as 000111222333. This can be a valuable step in
an effective compression algorithm because the bytes in each byte position are often
closely related to each other and putting them together can increase the compression
ratio.
As implied above, the primary value of the shuffle filter lies in its coordinated use with
a compression filter; it does not provide data compression when used alone. When the
shuffle filter is applied to a dataset immediately prior to the use of a compression
filter, the compression ratio achieved is often superior to that achieved by the use of a
compression filter without the shuffle filter.
# External links
- $(h5doc("H5P_SET_SHUFFLE"))
"""
struct Shuffle <: Filter end
filterid(::Type{Shuffle}) = API.H5Z_FILTER_SHUFFLE
FILTERS[API.H5Z_FILTER_SHUFFLE] = Shuffle
function Base.push!(f::FilterPipeline, ::Shuffle)
API.h5p_set_shuffle(f.plist)
return f
end
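# Example usage (sketch): Shuffle is normally paired with a compression filter and
# placed before it in the pipeline.
#     create_dataset(f, "shufdef", datatype(data), dataspace(data);
#                    chunk=(100, 100), filters=[Shuffle(), Deflate(3)])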
"""
Fletcher32()
The Fletcher32 checksum filter. This filter does not compress the data; instead it adds a checksum that is used to verify the integrity of the stored data.
This should be applied _after_ any lossy filters have been applied.
# External links
- $(h5doc("H5P_SET_FLETCHER32"))
- [_Fletcher's checksum_ on Wikipedia](https://en.wikipedia.org/wiki/Fletcher's_checksum)
"""
struct Fletcher32 <: Filter end
filterid(::Type{Fletcher32}) = API.H5Z_FILTER_FLETCHER32
FILTERS[API.H5Z_FILTER_FLETCHER32] = Fletcher32
function Base.push!(f::FilterPipeline, ::Fletcher32)
API.h5p_set_fletcher32(f.plist)
return f
end
"""
Szip(coding=:nn, pixels_per_block=8)
Szip lossless compression filter. Options:
- `coding`: the coding method: either `:ec` (entropy coding) or `:nn` (nearest neighbors,
default)
- `pixels_per_block`: The number of pixels or data elements in each data block (typically
8, 10, 16, or 32)
# External links
- $(h5doc("H5P_SET_SZIP"))
- [Szip Compression in HDF Products](https://support.hdfgroup.org/doc_resource/SZIP/)
"""
struct Szip <: Filter
options_mask::Cuint
pixels_per_block::Cuint
end
function Szip(; coding=:nn, pixels_per_block=8)
options_mask = Cuint(0)
if coding == :ec
options_mask |= API.H5_SZIP_EC_OPTION_MASK
elseif coding == :nn
options_mask |= API.H5_SZIP_NN_OPTION_MASK
else
error("invalid coding option")
end
Szip(options_mask, pixels_per_block)
end
function Base.show(io::IO, szip::Szip)
print(io, Szip, "(")
if szip.options_mask & API.H5_SZIP_EC_OPTION_MASK != 0
print(io, "coding=:ec,")
elseif szip.options_mask & API.H5_SZIP_NN_OPTION_MASK != 0
print(io, "coding=:nn,")
end
print(io, "pixels_per_block=", Int(szip.pixels_per_block))
end
filterid(::Type{Szip}) = API.H5Z_FILTER_SZIP
FILTERS[API.H5Z_FILTER_SZIP] = Szip
function Base.push!(f::FilterPipeline, szip::Szip)
API.h5p_set_szip(f.plist, szip.options_mask, szip.pixels_per_block)
return f
end
"""
NBit()
The N-Bit filter.
# External links
- $(h5doc("H5P_SET_NBIT"))
"""
struct NBit <: Filter end
filterid(::Type{NBit}) = API.H5Z_FILTER_NBIT
FILTERS[API.H5Z_FILTER_NBIT] = NBit
function Base.push!(f::FilterPipeline, ::NBit)
API.h5p_set_nbit(f.plist)
return f
end
"""
ScaleOffset(scale_type::Integer, scale_offset::Integer)
The scale-offset filter.
# External links
- $(h5doc("H5P_SET_SCALEOFFSET"))
"""
struct ScaleOffset <: Filter
scale_type::Cint
scale_factor::Cint
end
filterid(::Type{ScaleOffset}) = API.H5Z_FILTER_SCALEOFFSET
FILTERS[API.H5Z_FILTER_SCALEOFFSET] = ScaleOffset
function Base.push!(f::FilterPipeline, scaleoffset::ScaleOffset)
API.h5p_set_scaleoffset(f.plist, scaleoffset.scale_type, scaleoffset.scale_factor)
return f
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 15170 | """
HDF5.Filters
This module contains the interface for using filters in HDF5.jl.
# Example Usage
```julia
using HDF5
using HDF5.Filters
# Create a new file
fn = tempname()
# Create test data
data = rand(1000, 1000)
# Open temp file for writing
f = h5open(fn, "w")
# Create datasets
dsdeflate = create_dataset(f, "deflate", datatype(data), dataspace(data),
chunk=(100, 100), deflate=3)
dsshufdef = create_dataset(f, "shufdef", datatype(data), dataspace(data),
chunk=(100, 100), shuffle=true, deflate=3)
dsfiltdef = create_dataset(f, "filtdef", datatype(data), dataspace(data),
chunk=(100, 100), filters=Filters.Deflate(3))
dsfiltshufdef = create_dataset(f, "filtshufdef", datatype(data), dataspace(data),
chunk=(100, 100), filters=[Filters.Shuffle(), Filters.Deflate(3)])
# Write data
write(dsdeflate, data)
write(dsshufdef, data)
write(dsfiltdef, data)
write(dsfiltshufdef, data)
close(f)
```
## Additional Examples
See [test/filter.jl](https://github.com/JuliaIO/HDF5.jl/blob/master/test/filter.jl) for further examples.
"""
module Filters
# builtin filters
export Deflate, Shuffle, Fletcher32, Szip, NBit, ScaleOffset, ExternalFilter
import ..HDF5: Properties, h5doc, API
"""
Filter
Abstract type to describe HDF5 Filters.
See the Extended Help for information on implementing a new filter.
# Extended Help
## Filter interface
The Filter interface is implemented upon the Filter subtype.
See [`API.h5z_register`](@ref) for details.
### Required Methods to Implement
* `filterid` - registered filter ID
* `filter_func` - implement the actual filter
### Optional Methods to Implement
* `filtername` - defaults to "Unnamed Filter"
* `encoder_present` - defaults to true
* `decoder_present` - defaults to true
* `can_apply_func` - defaults to nothing
* `set_local_func` - defaults to nothing
### Advanced Methods to Implement
* `can_apply_cfunc` - Defaults to wrapping @cfunction around the result of `can_apply_func`
* `set_local_cfunc` - Defaults to wrapping @cfunction around the result of `set_local_func`
* `filter_cfunc` - Defaults to wrapping @cfunction around the result of `filter_func`
* `register_filter` - Defaults to using the above functions to register the filter
Implement the Advanced Methods to prevent `@cfunction` from generating a runtime closure, which may not work on all systems.
"""
abstract type Filter end
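# Minimal sketch of a user-defined filter implementing the interface above. The id
# (62000), the name, and the XOR transform are hypothetical; a real filter would use
# a properly assigned id. XOR is an involution, so the same callback serves both the
# encode and decode passes (the `flags` argument distinguishes them if needed).
#
#     struct XorFilter <: Filter end
#     filterid(::Type{XorFilter}) = API.H5Z_filter_t(62000)
#     filtername(::Type{XorFilter}) = "xor demo filter"
#     function filter_func(::Type{XorFilter})
#         # signature must match the `@cfunction` wrapper in `filter_cfunc` below
#         function xor_filter(flags::Cuint, cd_nelmts::Csize_t, cd_values::Ptr{Cuint},
#                             nbytes::Csize_t, buf_size::Ptr{Csize_t}, buf::Ptr{Ptr{Cvoid}})
#             data = unsafe_wrap(Array, Ptr{UInt8}(unsafe_load(buf)), nbytes)
#             data .⊻= 0xa5          # in-place, size-preserving transform
#             return nbytes          # number of valid bytes left in the buffer
#         end
#         return xor_filter
#     end
#     register_filter(XorFilter)     # see `register_filter` later in this module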
"""
FILTERS
Maps filter id to filter type.
"""
const FILTERS = Dict{API.H5Z_filter_t,Type{<:Filter}}()
"""
filterid(F) where {F <: Filter}
The internal filter id of a filter of type `F`.
"""
filterid
"""
encoder_present(::Type{F}) where {F<:Filter}
Can the filter encode or compress the data?
Defaults to true.
Returns a Bool. See [`API.h5z_register`](@ref).
"""
encoder_present(::Type{F}) where {F<:Filter} = true
"""
decoder_present(::Type{F}) where {F<:Filter}
Can the filter decode or decompress the data?
Defaults to true.
Returns a Bool.
See [`API.h5z_register`](@ref)
"""
decoder_present(::Type{F}) where {F<:Filter} = true
"""
filtername(::Type{F}) where {F<:Filter}
What is the name of a filter?
Defaults to "Unnamed Filter"
Returns a String describing the filter. See [`API.h5z_register`](@ref)
"""
filtername(::Type{F}) where {F<:Filter} = "Unnamed Filter"
"""
can_apply_func(::Type{F}) where {F<:Filter}
Return a function indicating whether the filter can be applied or `nothing` if no function exists.
The function signature is `func(dcpl_id::API.hid_t, type_id::API.hid_t, space_id::API.hid_t)`.
See [`API.h5z_register`](@ref)
"""
can_apply_func(::Type{F}) where {F<:Filter} = nothing
"""
can_apply_cfunc(::Type{F}) where {F<:Filter}
Return a C function pointer for the can apply function.
By default, this will return the result of using `@cfunction` on the function
specified by `can_apply_func(F)` or `C_NULL` if `nothing`.
Overriding this will allow `@cfunction` to return a `Ptr{Nothing}` rather
than a `CFunction` closure, which may not work on all systems.
"""
function can_apply_cfunc(::Type{F}) where {F<:Filter}
func = can_apply_func(F)
if func === nothing
return C_NULL
else
return @cfunction($func, API.herr_t, (API.hid_t, API.hid_t, API.hid_t))
end
end
"""
set_local_func(::Type{F}) where {F<:Filter}
Return a function that sets dataset specific parameters or `nothing` if no function exists.
The function signature is `func(dcpl_id::API.hid_t, type_id::API.hid_t, space_id::API.hid_t)`.
See [`API.h5z_register`](@ref).
"""
set_local_func(::Type{F}) where {F<:Filter} = nothing
"""
set_local_cfunc(::Type{F}) where {F<:Filter}
Return a C function pointer for the set local function.
By default, this will return the result of using `@cfunction` on the function
specified by `set_local_func(F)` or `C_NULL` if `nothing`.
Overriding this will allow `@cfunction` to return a `Ptr{Nothing}` rather
than a `CFunction` closure, which may not work on all systems.
"""
function set_local_cfunc(::Type{F}) where {F<:Filter}
func = set_local_func(F)
if func === nothing
return C_NULL
else
return @cfunction($func, API.herr_t, (API.hid_t, API.hid_t, API.hid_t))
end
end
"""
filter_func(::Type{F}) where {F<:Filter}
Returns a function that performs the actual filtering.
See [`API.h5z_register`](@ref)
"""
filter_func(::Type{F}) where {F<:Filter} = nothing
"""
filter_cfunc(::Type{F}) where {F<:Filter}
Return a C function pointer for the filter function.
By default, this will return the result of using `@cfunction` on the function
specified by `filter_func(F)` or will throw an error if `nothing`.
Overriding this will allow `@cfunction` to return a `Ptr{Nothing}` rather
than a `CFunction` closure, which may not work on all systems.
"""
function filter_cfunc(::Type{F}) where {F<:Filter}
func = filter_func(F)
if func === nothing
error("Filter function for $F must be defined via `filter_func`.")
end
c_filter_func = @cfunction(
$func, Csize_t, (Cuint, Csize_t, Ptr{Cuint}, Csize_t, Ptr{Csize_t}, Ptr{Ptr{Cvoid}})
)
return c_filter_func
end
# Generic implementation of register_filter
"""
register_filter(::Type{F}) where F <: Filter
Register the filter with the HDF5 library via [`API.h5z_register`](@ref).
Also add F to the FILTERS dictionary.
"""
function register_filter(::Type{F}) where {F<:Filter}
id = filterid(F)
encoder = encoder_present(F)
decoder = decoder_present(F)
name = filtername(F)
can_apply = can_apply_cfunc(F)
set_local = set_local_cfunc(F)
func = filter_cfunc(F)
GC.@preserve name begin
API.h5z_register(
API.H5Z_class_t(
API.H5Z_CLASS_T_VERS,
id,
encoder,
decoder,
pointer(name),
can_apply,
set_local,
func
)
)
end
FILTERS[id] = F
return nothing
end
"""
ExternalFilter(filter_id::API.H5Z_filter_t, flags::Cuint, data::Vector{Cuint}, name::String, config::Cuint)
ExternalFilter(filter_id, flags, data::Integer...)
ExternalFilter(filter_id, data::AbstractVector{<:Integer} = Cuint[])
Intended to support arbitrary, unregistered, external filters. Allows the
quick use of internal or proprietary filters without subtyping
`HDF5.Filters.Filter`.
Where possible, users are encouraged to define subtypes of `HDF5.Filters.Filter` instead.
# Fields / Arguments
* `filter_id` - (required) `Integer` filter identifier.
* `flags` - (optional) bit vector describing general properties of the filter. Defaults to `API.H5Z_FLAG_MANDATORY`
* `data` - (optional) auxiliary data for the filter. See [`cd_values`](@ref API.h5p_set_filter). Defaults to `Cuint[]`
* `name` - (optional) `String` describing the name of the filter. Defaults to "Unknown Filter with id [filter_id]"
* `config` - (optional) bit vector representing information about the filter regarding whether it is able to encode data, decode data, neither, or both. Defaults to `0`.
# See also:
* [`API.h5p_set_filter`](@ref)
* [`H5Z_GET_FILTER_INFO`](https://portal.hdfgroup.org/display/HDF5/H5Z_GET_FILTER_INFO).
* [Registered Filter Plugins](https://portal.hdfgroup.org/display/support/Registered+Filter+Plugins)
`flags` bits
* `API.H5Z_FLAG_OPTIONAL`
* `API.H5Z_FLAG_MANDATORY`
`config` bits
* `API.H5Z_FILTER_CONFIG_ENCODE_ENABLED`
* `API.H5Z_FILTER_CONFIG_DECODE_ENABLED`
"""
struct ExternalFilter <: Filter
filter_id::API.H5Z_filter_t
flags::Cuint
data::Vector{Cuint}
name::String
config::Cuint
end
function ExternalFilter(filter_id, flags, data::AbstractVector{<:Integer})
ExternalFilter(filter_id, flags, Cuint.(data), "Unknown Filter with id $filter_id", 0)
end
function ExternalFilter(filter_id, flags, data::Integer...)
ExternalFilter(filter_id, flags, Cuint[data...])
end
function ExternalFilter(filter_id, data::AbstractVector{<:Integer}=Cuint[])
ExternalFilter(filter_id, API.H5Z_FLAG_MANDATORY, data)
end
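# Example usage (sketch): describe the registered BZIP2 filter (id 307) without
# defining a `Filter` subtype; the single auxiliary value (here 9) is intended as
# the BZIP2 block-size parameter.
#     bzip2 = ExternalFilter(307, Cuint[9])
#     create_dataset(f, "bz", datatype(Float64), dataspace((100, 100));
#                    chunk=(10, 10), filters=bzip2)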
filterid(filter::ExternalFilter) = filter.filter_id
filtername(filter::ExternalFilter) = filter.name
filtername(::Type{ExternalFilter}) = "Unknown Filter"
encoder_present(::Type{ExternalFilter}) = false
decoder_present(::Type{ExternalFilter}) = false
"""
UnknownFilter
Unknown filter type. Alias for [`ExternalFilter`](@ref) (see related documentation).
"""
const UnknownFilter = ExternalFilter
"""
FilterPipeline(plist::DatasetCreateProperties)
The filter pipeline associated with `plist`. Acts like a `AbstractVector{Filter}`,
supporting the following operations:
- `length(pipeline)`: the number of filters.
- `pipeline[i]` to return the `i`th filter.
- `pipeline[FilterType]` to return a filter of type `FilterType`
- `push!(pipeline, filter)` to add an extra filter to the pipeline.
- `append!(pipeline, filters)` to add multiple filters to the pipeline.
- `delete!(pipeline, FilterType)` to remove a filter of type `FilterType` from the pipeline.
- `empty!(pipeline)` to remove all filters from the pipeline.
"""
struct FilterPipeline{P<:Properties} <: AbstractVector{Filter}
plist::P
end
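# Example usage (sketch): `dcpl` stands for a dataset creation property list,
# e.g. obtained via `HDF5.get_create_properties(dset)`.
#     pipeline = FilterPipeline(dcpl)
#     length(pipeline)                    # number of filters
#     push!(pipeline, Deflate(; level=3))
#     delete!(pipeline, Shuffle)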
function Base.length(f::FilterPipeline)
API.h5p_get_nfilters(f.plist)
end
Base.size(f::FilterPipeline) = (length(f),)
function Base.getindex(f::FilterPipeline, i::Integer)
id = API.h5p_get_filter(f.plist, i - 1, C_NULL, C_NULL, C_NULL, 0, C_NULL, C_NULL)
F = get(FILTERS, id, ExternalFilter)
return getindex(f, F, i)
end
function Base.getindex(
f::FilterPipeline, ::Type{ExternalFilter}, i::Integer, cd_values::Vector{Cuint}=Cuint[]
)
flags = Ref{Cuint}()
cd_nelmts = Ref{Csize_t}(length(cd_values))
namebuf = Array{UInt8}(undef, 256)
config = Ref{Cuint}()
id = API.h5p_get_filter(
f.plist, i - 1, flags, cd_nelmts, cd_values, length(namebuf), namebuf, config
)
if cd_nelmts[] > length(cd_values)
resize!(cd_values, cd_nelmts[])
return getindex(f, ExternalFilter, i, cd_values)
end
resize!(namebuf, findfirst(isequal(0), namebuf) - 1)
resize!(cd_values, cd_nelmts[])
return ExternalFilter(id, flags[], cd_values, String(namebuf), config[])
end
function Base.getindex(f::FilterPipeline, ::Type{F}, i::Integer) where {F<:Filter}
@assert isbitstype(F)
ref = Ref{F}()
GC.@preserve ref begin
id = API.h5p_get_filter(
f.plist,
i - 1,
C_NULL,
div(sizeof(F), sizeof(Cuint)),
pointer_from_objref(ref),
0,
C_NULL,
C_NULL
)
end
@assert id == filterid(F)
return ref[]
end
function Base.getindex(f::FilterPipeline, ::Type{F}) where {F<:Filter}
@assert isbitstype(F)
ref = Ref{F}()
GC.@preserve ref begin
API.h5p_get_filter_by_id(
f.plist,
filterid(F),
C_NULL,
div(sizeof(F), sizeof(Cuint)),
pointer_from_objref(ref),
0,
C_NULL,
C_NULL
)
end
return ref[]
end
function Base.empty!(filters::FilterPipeline)
API.h5p_remove_filter(filters.plist, API.H5Z_FILTER_ALL)
return filters
end
function Base.delete!(filters::FilterPipeline, ::Type{F}) where {F<:Filter}
API.h5p_remove_filter(filters.plist, filterid(F))
return filters
end
function Base.append!(
filters::FilterPipeline, extra::Union{AbstractVector{<:Filter},NTuple{N,Filter} where N}
)
for filter in extra
push!(filters, filter)
end
return filters
end
function Base.push!(p::FilterPipeline, f::F) where {F<:Filter}
ref = Ref(f)
GC.@preserve ref begin
API.h5p_set_filter(
p.plist,
filterid(F),
API.H5Z_FLAG_OPTIONAL,
div(sizeof(F), sizeof(Cuint)),
pointer_from_objref(ref)
)
end
return p
end
function Base.push!(p::FilterPipeline, f::ExternalFilter)
GC.@preserve f begin
API.h5p_set_filter(p.plist, f.filter_id, f.flags, length(f.data), pointer(f.data))
end
return p
end
# Convert a Filter to an Integer subtype using filterid
function Base.convert(::Type{I}, ::Type{F}) where {I<:Integer,F<:Filter}
Base.convert(I, filterid(F))
end
function Base.convert(::Type{I}, f::Filter) where {I<:Integer}
Base.convert(I, filterid(f))
end
"""
EXTERNAL_FILTER_JULIA_PACKAGES
Maps filter id to the Julia package name that contains the filter.
"""
const EXTERNAL_FILTER_JULIA_PACKAGES = Dict{API.H5Z_filter_t,String}([
32008 => "H5Zbitshuffle",
32001 => "H5Zblosc",
307 => "H5Zbzip2",
32004 => "H5Zlz4",
32015 => "H5Zzstd",
])
"""
Throw an error if any filter in the filter pipeline is not available.
"""
function ensure_filters_available(f::FilterPipeline)
if !API.h5p_all_filters_avail(f.plist)
nfilters = length(f)
for i in 1:nfilters
filter::UnknownFilter = getindex(f, UnknownFilter, i)
filter_id = filterid(filter)
filter_name = filtername(filter)
if !API.h5z_filter_avail(filter_id)
if haskey(EXTERNAL_FILTER_JULIA_PACKAGES, filter_id)
error(
"""
filter missing, filter id: $filter_id name: $filter_name
Try running `import $(EXTERNAL_FILTER_JULIA_PACKAGES[filter_id])` to install this filter.
"""
)
else
error(
"""
filter missing, filter id: $filter_id name: $filter_name
This filter is not currently available as a Julia package.
For more information, see https://portal.hdfgroup.org/display/support/Registered+Filter+Plugins
"""
)
end
end
end
else
return nothing
end
error("unreachable")
end
include("builtin.jl")
include("filters_midlevel.jl")
include("registered.jl")
end # module
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 903 | """
isavailable(filter_or_id)
Given a subtype of `Filters.Filter` or the filter ID number as an integer,
return `true` if the filter is available and `false` otherwise.
"""
isavailable(filter_or_id) = API.h5z_filter_avail(filter_or_id)
"""
isencoderenabled(filter_or_id)
Given a subtype of `Filters.Filter` or the filter ID number as an integer,
return `true` if the filter can encode or compress data.
"""
function isencoderenabled(filter_or_id)
info = API.h5z_get_filter_info(filter_or_id)
return info & API.H5Z_FILTER_CONFIG_ENCODE_ENABLED != 0
end
"""
isdecoderenabled(filter_or_id)
Given a subtype of `Filters.Filter` or the filter ID number as an integer,
return `true` if the filter can decode or decompress data.
"""
function isdecoderenabled(filter_or_id)
info = API.h5z_get_filter_info(filter_or_id)
return info & API.H5Z_FILTER_CONFIG_DECODE_ENABLED != 0
end
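# Example usage (sketch):
#     isavailable(Deflate)                        # typically true for standard libhdf5 builds
#     isencoderenabled(API.H5Z_FILTER_DEFLATE)    # can the filter compress?
#     isdecoderenabled(307)                       # query BZIP2 by numeric id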
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 4356 | """
HDF5.Filters.Registered
Module containing convenience methods to create `ExternalFilter` instances
of [HDF5 registered filters](https://portal.hdfgroup.org/display/support/Registered+Filter+Plugins).
This module does not implement any filter or guarantee filter availability.
Rather, the functions within this module create `ExternalFilter` instances for convenience.
These instances can be used to determine if a filter is available. They can also
be incorporated as part of a filter pipeline.
Examine `REGISTERED_FILTERS`, a `Dict{H5Z_filter_t, Function}`, for a list of
filter functions contained within this module, which are exported.
```jldoctest
julia> println.(values(HDF5.Filters.Registered.REGISTERED_FILTERS));
FCIDECOMPFilter
LZOFilter
BitGroomFilter
SZ3Filter
Delta_RiceFilter
fpzipFilter
LPC_RiceFilter
LZFFilter
FLACFilter
VBZFilter
FAPECFilter
zfpFilter
CBFFilter
JPEG_XRFilter
LZ4Filter
BLOSC2Filter
ZstandardFilter
SZFilter
Granular_BitRoundFilter
JPEGFilter
SnappyFilter
B³DFilter
APAXFilter
BLOSCFilter
SPDPFilter
bitshuffleFilter
MAFISCFilter
BZIP2Filter
CCSDS_123Filter
JPEG_LSFilter
```
"""
module Registered
using HDF5.Filters:
Filters, Filter, ExternalFilter, EXTERNAL_FILTER_JULIA_PACKAGES, isavailable
using HDF5.API: API, H5Z_filter_t, H5Z_FLAG_MANDATORY
const _REGISTERED_FILTERIDS_DICT = Dict{H5Z_filter_t,Symbol}(
305 => :LZO,
307 => :BZIP2,
32000 => :LZF,
32001 => :BLOSC,
32002 => :MAFISC,
32003 => :Snappy,
32004 => :LZ4,
32005 => :APAX,
32006 => :CBF,
32007 => :JPEG_XR,
32008 => :bitshuffle,
32009 => :SPDP,
32010 => :LPC_Rice,
32011 => :CCSDS_123,
32012 => :JPEG_LS,
32013 => :zfp,
32014 => :fpzip,
32015 => :Zstandard,
32016 => :B³D,
32017 => :SZ,
32018 => :FCIDECOMP,
32019 => :JPEG,
32020 => :VBZ,
32021 => :FAPEC,
32022 => :BitGroom,
32023 => :Granular_BitRound,
32024 => :SZ3,
32025 => :Delta_Rice,
32026 => :BLOSC2,
32027 => :FLAC
)
const REGISTERED_FILTERS = Dict{H5Z_filter_t,Function}()
for (filter_id, filter_name) in _REGISTERED_FILTERIDS_DICT
fn_string = String(filter_name) * "Filter"
fn = Symbol(fn_string)
filter_name_string = replace(String(filter_name), "_" => raw"\_")
@eval begin
@doc """
$($fn_string)(flags=API.H5Z_FLAG_MANDATORY, data::AbstractVector{<: Integer}=Cuint[], config::Cuint=0)
$($fn_string)(flags=API.H5Z_FLAG_MANDATORY, data::Integer...)
Create an [`ExternalFilter`](@ref) for $($filter_name_string) with filter id $($filter_id).
$(haskey(EXTERNAL_FILTER_JULIA_PACKAGES, $filter_id) ?
"Users are instead encouraged to use the Julia package $(EXTERNAL_FILTER_JULIA_PACKAGES[$filter_id])." :
"Users should consider defining a subtype of [`Filter`](@ref) to specify the data."
)
# Fields / Arguments
* `flags` - (optional) bit vector describing general properties of the filter. Defaults to `API.H5Z_FLAG_MANDATORY`
* `data` - (optional) auxiliary data for the filter. See [`cd_values`](@ref API.h5p_set_filter). Defaults to `Cuint[]`
* `config` - (optional) bit vector representing information about the filter regarding whether it is able to encode data, decode data, neither, or both. Defaults to `0`.
See [`ExternalFilter`](@ref) for valid argument values.
""" $fn
export $fn
$fn(flags, data::AbstractVector{<:Integer}) =
ExternalFilter($filter_id, flags, Cuint.(data), $filter_name_string, 0)
$fn(flags, data::Integer...) =
ExternalFilter($filter_id, flags, Cuint[data...], $filter_name_string, 0)
$fn(data::AbstractVector{<:Integer}=Cuint[]) = ExternalFilter(
$filter_id, H5Z_FLAG_MANDATORY, Cuint.(data), $filter_name_string, 0
)
$fn(flags, data, config) =
ExternalFilter($filter_id, flags, data, $filter_name_string, config)
REGISTERED_FILTERS[$filter_id] = $fn
end
end
"""
available_registered_filters()::Dict{H5Z_filter_t, Function}
Return a `Dict{H5Z_filter_t, Function}` listing the available filter ids and
their corresponding convenience function.
"""
function available_registered_filters()
filter(p -> isavailable(first(p)), REGISTERED_FILTERS)
end
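# Example usage (sketch):
#     zstd = ZstandardFilter()            # ExternalFilter with filter id 32015
#     isavailable(zstd.filter_id)         # is the plugin present in this libhdf5?
#     available_registered_filters()      # Dict of id => convenience function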
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 2897 | using HDF5, Test
@testset "h5a_iterate" begin
filename = tempname()
f = h5open(filename, "w")
write_attribute(f, "a", 1)
write_attribute(f, "b", 2)
# iterate over attributes
names = String[]
@test HDF5.API.h5a_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
push!(names, unsafe_string(name))
return false
end == 2
@test names == ["a", "b"]
# iterate over attributes in reverse
names = String[]
@test HDF5.API.h5a_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_DEC
) do loc, name, info
push!(names, unsafe_string(name))
return false
end == 2
@test names == ["b", "a"]
# only iterate once
names = String[]
@test HDF5.API.h5a_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
push!(names, unsafe_string(name))
return true
end == 1
@test names == ["a"]
# HDF5 error
@test_throws HDF5.API.H5Error HDF5.API.h5a_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
return -1
end
# Julia error
@test_throws AssertionError HDF5.API.h5a_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
@assert false
end
end
@testset "h5l_iterate" begin
filename = tempname()
f = h5open(filename, "w")
create_group(f, "a")
create_group(f, "b")
# iterate over groups
names = String[]
@test HDF5.API.h5l_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
push!(names, unsafe_string(name))
return false
end == 2
@test names == ["a", "b"]
# iterate over attributes in reverse
names = String[]
@test HDF5.API.h5l_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_DEC
) do loc, name, info
push!(names, unsafe_string(name))
return false
end == 2
@test names == ["b", "a"]
# only iterate once
names = String[]
@test HDF5.API.h5l_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
push!(names, unsafe_string(name))
return true
end == 1
@test names == ["a"]
# HDF5 error
@test_throws HDF5.API.H5Error HDF5.API.h5l_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
return -1
end
# Julia error
@test_throws AssertionError HDF5.API.h5l_iterate(
f, HDF5.API.H5_INDEX_NAME, HDF5.API.H5_ITER_INC
) do loc, name, info
@assert false
end
end
@testset "h5dchunk_iter" begin
@test convert(HDF5.API.H5_iter_t, 0) == HDF5.API.H5_ITER_CONT
@test convert(HDF5.API.H5_iter_t, 1) == HDF5.API.H5_ITER_STOP
@test convert(HDF5.API.H5_iter_t, -1) == HDF5.API.H5_ITER_ERROR
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5, Test
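# `test_attrs` exercises the dictionary-like `attrs` interface on a file, group, dataset,
# or committed datatype: insertion, lookup (including `get` with a default), overwriting
# with a different size or type, key listing, and deletion.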
function test_attrs(o::Union{HDF5.File,HDF5.Object})
@test attrs(o) isa HDF5.AttributeDict
attrs(o)["a"] = 1
@test haskey(attrs(o), "a")
@test attrs(o)["a"] == 1
attrs(o)["b"] = [2, 3]
@test attrs(o)["b"] == [2, 3]
@test haskey(attrs(o), "a")
@test length(attrs(o)) == 2
@test sort(keys(attrs(o))) == ["a", "b"]
@test !haskey(attrs(o), "c")
# overwrite: same type
attrs(o)["a"] = 4
@test attrs(o)["a"] == 4
@test get(attrs(o), "a", nothing) == 4
@test length(attrs(o)) == 2
@test sort(keys(attrs(o))) == ["a", "b"]
# overwrite: different size
attrs(o)["b"] = [4, 5, 6]
@test attrs(o)["b"] == [4, 5, 6]
@test length(attrs(o)) == 2
@test sort(keys(attrs(o))) == ["a", "b"]
# overwrite: different type
attrs(o)["b"] = "a string"
@test attrs(o)["b"] == "a string"
@test length(attrs(o)) == 2
@test sort(keys(attrs(o))) == ["a", "b"]
# delete a key
delete!(attrs(o), "a")
@test !haskey(attrs(o), "a")
@test length(attrs(o)) == 1
@test sort(keys(attrs(o))) == ["b"]
@test_throws KeyError attrs(o)["a"]
@test isnothing(get(attrs(o), "a", nothing))
end
@testset "attrs interface" begin
filename = tempname()
f = h5open(filename, "w")
try
# Test attrs on a HDF5.File
test_attrs(f)
# Test attrs on a HDF5.Group
g = create_group(f, "group_foo")
test_attrs(g)
# Test attrs on a HDF5.Dataset
d = create_dataset(g, "dataset_bar", Int, (32, 32))
test_attrs(d)
# Test attrs on a HDF5.Datatype
t = commit_datatype(
g, "datatype_int16", HDF5.Datatype(HDF5.API.h5t_copy(HDF5.API.H5T_NATIVE_INT16))
)
test_attrs(t)
finally
close(f)
end
end
@testset "variable length strings" begin
filename = tempname()
h5open(filename, "w") do f
# https://github.com/JuliaIO/HDF5.jl/issues/1129
attr = create_attribute(f, "attr-name", datatype(String), dataspace(String))
write_attribute(attr, datatype(String), "attr-value")
@test attrs(f)["attr-name"] == "attr-value"
end
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
@testset "Raw Chunk I/O" begin
fn = tempname()
# Direct chunk write is no longer dependent on HL library
# Test direct chunk writing Cartesian index
h5open(fn, "w") do f
d = create_dataset(f, "dataset", datatype(Int), dataspace(4, 5); chunk=(2, 2))
HDF5.API.h5d_extend(d, HDF5.API.hsize_t[3, 3]) # should do nothing (deprecated call)
HDF5.API.h5d_extend(d, HDF5.API.hsize_t[4, 4]) # should do nothing (deprecated call)
raw = HDF5.ChunkStorage(d)
raw[1, 1] = 0, collect(reinterpret(UInt8, [1, 2, 5, 6]))
raw[3, 1] = 0, collect(reinterpret(UInt8, [3, 4, 7, 8]))
raw[1, 3] = 0, collect(reinterpret(UInt8, [9, 10, 13, 14]))
raw[3, 3] = 0, collect(reinterpret(UInt8, [11, 12, 15, 16]))
raw[1, 5] = 0, collect(reinterpret(UInt8, [17, 18, 21, 22]))
raw[3, 5] = 0, collect(reinterpret(UInt8, [19, 20, 23, 24]))
end
# Test read back
@test h5open(fn, "r") do f
vec(f["dataset"][:, :])
end == collect(1:20)
# Test reading direct chunks via linear indexing
h5open(fn, "r") do f
d = f["dataset"]
raw = HDF5.ChunkStorage{IndexLinear}(d)
@test size(raw) == (6,)
@test length(raw) == 6
@test axes(raw) == (Base.OneTo(6),)
@test prod(HDF5.get_num_chunks_per_dim(d)) == HDF5.get_num_chunks(d)
if v"1.10.5" ≤ HDF5.API._libhdf5_build_ver
@test HDF5.get_chunk_length(d) == HDF5.API.h5d_get_chunk_info(d, 1)[:size]
end
@test reinterpret(Int, raw[1][2]) == [1, 2, 5, 6]
@test reinterpret(Int, raw[2][2]) == [3, 4, 7, 8]
@test reinterpret(Int, raw[3][2]) == [9, 10, 13, 14]
@test reinterpret(Int, raw[4][2]) == [11, 12, 15, 16]
@test reinterpret(Int, raw[5][2])[1:2] == [17, 18]
@test reinterpret(Int, raw[6][2])[1:2] == [19, 20]
# Test 0-based indexed API
@test HDF5.get_chunk_offset(d, 0) == (0, 0)
@test HDF5.get_chunk_offset(d, 1) == (2, 0)
@test HDF5.get_chunk_offset(d, 2) == (0, 2)
@test HDF5.get_chunk_offset(d, 3) == (2, 2)
@test HDF5.get_chunk_offset(d, 4) == (0, 4)
@test HDF5.get_chunk_offset(d, 5) == (2, 4)
# Test reverse look up of index from coords
@test HDF5.get_chunk_index(d, (0, 0)) == 0
@test HDF5.get_chunk_index(d, (2, 0)) == 1
@test HDF5.get_chunk_index(d, (0, 2)) == 2
@test HDF5.get_chunk_index(d, (2, 2)) == 3
@test HDF5.get_chunk_index(d, (0, 4)) == 4
@test HDF5.get_chunk_index(d, (2, 4)) == 5
# Test internal coordinates
@test HDF5.get_chunk_index(d, (0, 1)) == 0
@test HDF5.get_chunk_index(d, (1, 0)) == 0
@test HDF5.get_chunk_index(d, (1, 1)) == 0
@test HDF5.get_chunk_index(d, (3, 1)) == 1
@test HDF5.get_chunk_index(d, (1, 3)) == 2
@test HDF5.get_chunk_index(d, (3, 3)) == 3
@test HDF5.get_chunk_index(d, (1, 5)) == 4
@test HDF5.get_chunk_index(d, (2, 5)) == 5
@test HDF5.get_chunk_index(d, (3, 4)) == 5
@test HDF5.get_chunk_index(d, (3, 5)) == 5
# Test chunk iter
if v"1.12.3" ≤ HDF5.API._libhdf5_build_ver
infos = HDF5.get_chunk_info_all(d)
offsets = [info.offset for info in infos]
addrs = [info.addr for info in infos]
filter_masks = [info.filter_mask for info in infos]
sizes = [info.size for info in infos]
@test isempty(
setdiff(offsets, [(0, 0), (2, 0), (0, 2), (2, 2), (0, 4), (2, 4)])
)
@test length(unique(addrs)) == 6
@test only(unique(filter_masks)) === UInt32(0)
@test only(unique(sizes)) == 4 * sizeof(Int)
end
end
# Test direct chunk writing via linear indexing
h5open(fn, "w") do f
d = create_dataset(f, "dataset", datatype(Int64), dataspace(4, 5); chunk=(2, 3))
raw = HDF5.ChunkStorage{IndexLinear}(d)
raw[1] = 0, collect(reinterpret(UInt8, Int64[1, 2, 5, 6, 9, 10]))
raw[2] = 0, collect(reinterpret(UInt8, Int64[3, 4, 7, 8, 11, 12]))
raw[3] = 0, collect(reinterpret(UInt8, Int64[13, 14, 17, 18, 21, 22]))
raw[4] = 0, collect(reinterpret(UInt8, Int64[15, 16, 19, 20, 23, 24]))
end
@test h5open(fn, "r") do f
f["dataset"][:, :]
end == reshape(1:20, 4, 5)
h5open(fn, "r") do f
d = f["dataset"]
raw = HDF5.ChunkStorage(d)
chunk = HDF5.get_chunk(d)
extent = HDF5.get_extent_dims(d)[1]
@test chunk == (2, 3)
@test extent == (4, 5)
@test size(raw) == (2, 2)
@test length(raw) == 4
@test axes(raw) == (1:2:4, 1:3:5)
@test prod(HDF5.get_num_chunks_per_dim(d)) == HDF5.get_num_chunks(d)
# Test 0-based indexed API
@test HDF5.get_chunk_offset(d, 0) == (0, 0)
@test HDF5.get_chunk_offset(d, 1) == (2, 0)
@test HDF5.get_chunk_offset(d, 2) == (0, 3)
@test HDF5.get_chunk_offset(d, 3) == (2, 3)
# Test reverse look up of index from coords
@test HDF5.get_chunk_index(d, (0, 0)) == 0
@test HDF5.get_chunk_index(d, (2, 0)) == 1
@test HDF5.get_chunk_index(d, (0, 3)) == 2
@test HDF5.get_chunk_index(d, (2, 3)) == 3
# Test internal coordinates
@test HDF5.get_chunk_index(d, (0, 1)) == 0
@test HDF5.get_chunk_index(d, (0, 2)) == 0
@test HDF5.get_chunk_index(d, (1, 0)) == 0
@test HDF5.get_chunk_index(d, (1, 1)) == 0
@test HDF5.get_chunk_index(d, (1, 2)) == 0
@test HDF5.get_chunk_index(d, (3, 1)) == 1
@test HDF5.get_chunk_index(d, (1, 4)) == 2
@test HDF5.get_chunk_index(d, (2, 4)) == 3
@test HDF5.get_chunk_index(d, (2, 5)) == 3
@test HDF5.get_chunk_index(d, (3, 3)) == 3
@test HDF5.get_chunk_index(d, (3, 4)) == 3
@test HDF5.get_chunk_index(d, (3, 5)) == 3
if v"1.10.5" ≤ HDF5.API._libhdf5_build_ver
chunk_length = HDF5.get_chunk_length(d)
origin = HDF5.API.h5d_get_chunk_info(d, 0)
@test chunk_length == origin[:size]
chunk_info = HDF5.API.h5d_get_chunk_info_by_coord(d, HDF5.API.hsize_t[0, 1])
@test chunk_info[:filter_mask] == 0
@test chunk_info[:size] == chunk_length
# Test HDF5.get_chunk_offset equivalence to h5d_get_chunk_info information
@test all(
reverse(HDF5.API.h5d_get_chunk_info(d, 3)[:offset]) .==
HDF5.get_chunk_offset(d, 3)
)
# Test HDF5.get_chunk_index equivalence to h5d_get_chunk_info_by_coord information
offset = HDF5.API.hsize_t[2, 3]
chunk_info = HDF5.API.h5d_get_chunk_info_by_coord(d, reverse(offset))
@test HDF5.get_chunk_index(d, offset) ==
fld(chunk_info[:addr] - origin[:addr], chunk_info[:size])
@test HDF5.API.h5d_get_chunk_storage_size(d, HDF5.API.hsize_t[0, 1]) ==
chunk_length
@test HDF5.API.h5d_get_storage_size(d) == sizeof(Int64) * 24
if v"1.12.2" ≤ HDF5.API._libhdf5_build_ver
@test HDF5.API.h5d_get_space_status(d) ==
HDF5.API.H5D_SPACE_STATUS_ALLOCATED
else
@test HDF5.API.h5d_get_space_status(d) ==
HDF5.API.H5D_SPACE_STATUS_PART_ALLOCATED
end
end
# Manually reconstruct matrix
A = Matrix{Int}(undef, extent)
for (r1, c1) in Iterators.product(axes(raw)...)
r2 = min(extent[1], r1 + chunk[1] - 1)
c2 = min(extent[2], c1 + chunk[2] - 1)
dims = (r2 - r1 + 1, c2 - c1 + 1)
bytes = 8 * prod(dims)
A[r1:r2, c1:c2] .= reshape(reinterpret(Int64, raw[r1, c1][2][1:bytes]), dims)
end
@test A == reshape(1:20, extent)
end
@static if VERSION >= v"1.6"
# CartesianIndices does not accept StepRange
h5open(fn, "w") do f
d = create_dataset(f, "dataset", datatype(Int), dataspace(4, 5); chunk=(2, 3))
raw = HDF5.ChunkStorage(d)
data = permutedims(reshape(1:24, 2, 2, 3, 2), (1, 3, 2, 4))
ci = CartesianIndices(raw)
for ind in eachindex(ci)
raw[ci[ind]] = data[:, :, ind]
end
end
@test h5open(fn, "r") do f
f["dataset"][:, :]
end == reshape(1:20, 4, 5)
end
# Test direct chunk writing via linear indexing, using views and without the filter flag
h5open(fn, "w") do f
d = create_dataset(f, "dataset", datatype(Int), dataspace(4, 5); chunk=(2, 3))
raw = HDF5.ChunkStorage{IndexLinear}(d)
data = permutedims(reshape(1:24, 2, 2, 3, 2), (1, 3, 2, 4))
chunks = Iterators.partition(data, 6)
i = 1
for c in chunks
raw[i] = c
i += 1
end
end
@test h5open(fn, "r") do f
f["dataset"][:, :]
end == reshape(1:20, 4, 5)
# Test chunk info retrieval method performance
h5open(fn, "w") do f
d = create_dataset(
f,
"dataset",
datatype(UInt8),
dataspace(256, 256);
chunk=(16, 16),
alloc_time=:early
)
if v"1.10.5" ≤ HDF5.API._libhdf5_build_ver
HDF5._get_chunk_info_all_by_index(d)
index_time = @elapsed infos_by_index = HDF5._get_chunk_info_all_by_index(d)
@test length(infos_by_index) == 256
iob = IOBuffer()
show(iob, MIME"text/plain"(), infos_by_index)
seekstart(iob)
@test length(readlines(iob)) == 259
if v"1.12.3" ≤ HDF5.API._libhdf5_build_ver
HDF5._get_chunk_info_all_by_iter(d)
iter_time = @elapsed infos_by_iter = HDF5._get_chunk_info_all_by_iter(d)
@test infos_by_iter == infos_by_index
@test iter_time < index_time
end
end
end
rm(fn)
end # testset "Raw Chunk I/O"
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using Random, Test, HDF5
import HDF5.datatype
import Base.convert
import Base.unsafe_convert
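# `foo` is mirrored by `foo_hdf5`, whose fields match the memory layout HDF5 expects for
# the compound datatype built in `datatype(::Type{foo_hdf5})` below: a variable-length
# string as a `Cstring`, a 20-byte fixed-length string as an `NTuple{20,UInt8}`, a 3x3
# matrix flattened to an `NTuple{9,ComplexF64}`, and a variable-length vector as an
# `HDF5.API.hvl_t` (length plus pointer).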
struct foo
a::Float64
b::String
c::String
d::Array{ComplexF64,2}
e::Array{Int64,1}
end
struct foo_hdf5
a::Float64
b::Cstring
c::NTuple{20,UInt8}
d::NTuple{9,ComplexF64}
e::HDF5.API.hvl_t
end
function unsafe_convert(::Type{foo_hdf5}, x::foo)
foo_hdf5(
x.a,
Base.unsafe_convert(Cstring, x.b),
ntuple(i -> i <= ncodeunits(x.c) ? codeunit(x.c, i) : '\0', 20),
ntuple(i -> x.d[i], length(x.d)),
HDF5.API.hvl_t(length(x.e), pointer(x.e))
)
end
function datatype(::Type{foo_hdf5})
dtype = HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(foo_hdf5))
HDF5.API.h5t_insert(dtype, "a", fieldoffset(foo_hdf5, 1), datatype(Float64))
vlenstr_dtype = HDF5.API.h5t_copy(HDF5.API.H5T_C_S1)
HDF5.API.h5t_set_size(vlenstr_dtype, HDF5.API.H5T_VARIABLE)
HDF5.API.h5t_set_cset(vlenstr_dtype, HDF5.API.H5T_CSET_UTF8)
HDF5.API.h5t_insert(dtype, "b", fieldoffset(foo_hdf5, 2), vlenstr_dtype)
fixedstr_dtype = HDF5.API.h5t_copy(HDF5.API.H5T_C_S1)
HDF5.API.h5t_set_size(fixedstr_dtype, 20 * sizeof(UInt8))
HDF5.API.h5t_set_cset(fixedstr_dtype, HDF5.API.H5T_CSET_UTF8)
HDF5.API.h5t_set_strpad(fixedstr_dtype, HDF5.API.H5T_STR_NULLPAD)
HDF5.API.h5t_insert(dtype, "c", fieldoffset(foo_hdf5, 3), fixedstr_dtype)
hsz = HDF5.API.hsize_t[3, 3]
array_dtype = HDF5.API.h5t_array_create(datatype(ComplexF64), 2, hsz)
HDF5.API.h5t_insert(dtype, "d", fieldoffset(foo_hdf5, 4), array_dtype)
vlen_dtype = HDF5.API.h5t_vlen_create(datatype(Int64))
HDF5.API.h5t_insert(dtype, "e", fieldoffset(foo_hdf5, 5), vlen_dtype)
HDF5.Datatype(dtype)
end
struct bar
a::Vector{String}
end
struct bar_hdf5
a::NTuple{2,NTuple{20,UInt8}}
end
function datatype(::Type{bar_hdf5})
dtype = HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(bar_hdf5))
fixedstr_dtype = HDF5.API.h5t_copy(HDF5.API.H5T_C_S1)
HDF5.API.h5t_set_size(fixedstr_dtype, 20 * sizeof(UInt8))
HDF5.API.h5t_set_cset(fixedstr_dtype, HDF5.API.H5T_CSET_UTF8)
hsz = HDF5.API.hsize_t[2]
array_dtype = HDF5.API.h5t_array_create(fixedstr_dtype, 1, hsz)
HDF5.API.h5t_insert(dtype, "a", fieldoffset(bar_hdf5, 1), array_dtype)
HDF5.Datatype(dtype)
end
function convert(::Type{bar_hdf5}, x::bar)
bar_hdf5(
ntuple(
i -> ntuple(j -> j <= ncodeunits(x.a[i]) ? codeunit(x.a[i], j) : '\0', 20), 2
)
)
end
@testset "compound" begin
N = 10
v = [
foo(
rand(),
randstring(rand(10:100)),
randstring(10),
rand(ComplexF64, 3, 3),
rand(1:10, rand(10:100))
) for _ in 1:N
]
v[1] = foo(1.0, "uniçº∂e", "uniçº∂e", rand(ComplexF64, 3, 3), rand(1:10, rand(10:100)))
v_write = unsafe_convert.(foo_hdf5, v)
w = [bar(["uniçº∂e", "hello"])]
w_write = convert.(bar_hdf5, w)
fn = tempname()
h5open(fn, "w") do h5f
foo_dtype = datatype(foo_hdf5)
foo_space = dataspace(v_write)
foo_dset = create_dataset(h5f, "foo", foo_dtype, foo_space)
write_dataset(foo_dset, foo_dtype, v_write)
bar_dtype = datatype(bar_hdf5)
bar_space = dataspace(w_write)
bar_dset = create_dataset(h5f, "bar", bar_dtype, bar_space)
write_dataset(bar_dset, bar_dtype, w_write)
end
v_read = h5read(fn, "foo")
for field in (:a, :b, :c, :d, :e)
f = x -> getfield(x, field)
@test f.(v) == f.(v_read)
end
w_read = h5read(fn, "bar")
for field in (:a,)
f = x -> getfield(x, field)
@test f.(w) == f.(w_read)
end
T = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,Cstring}}
TT = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,T}}
TTT = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,TT}}
TTTT = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,TTT}}
@test HDF5.do_reclaim(TTTT) == true
@test HDF5.do_normalize(TTTT) == true
T = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,HDF5.FixedArray}}
TT = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,T}}
TTT = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,TT}}
TTTT = NamedTuple{(:a, :b, :c, :d, :e, :f),Tuple{Int,Int,Int,Int,Int,TTT}}
@test HDF5.do_reclaim(TTTT) == false
@test HDF5.do_normalize(TTTT) == true
end
struct Bar
a::Int32
b::Float64
c::Bool
end
mutable struct MutableBar
x::Int64
end
@testset "create_dataset (compound)" begin
bars = [Bar(1, 2, true), Bar(3, 4, false), Bar(5, 6, true), Bar(7, 8, false)]
fn = tempname()
h5open(fn, "w") do h5f
d = create_dataset(h5f, "the/bars", Bar, ((2,), (-1,)); chunk=(100,))
d[1:2] = bars[1:2]
end
h5open(fn, "cw") do h5f
d = h5f["the/bars"]
HDF5.set_extent_dims(d, (4,))
d[3:4] = bars[3:4]
end
thebars = h5open(fn, "r") do h5f
read(h5f, "the/bars")
end
@test 4 == length(thebars)
@test 1 == thebars[1].a
@test true == thebars[3].c
@test 8 == thebars[4].b
end
@testset "write_compound" begin
bars = [
[Bar(1, 1.1, true) Bar(2, 2.1, false) Bar(3, 3.1, true)]
[Bar(4, 4.1, false) Bar(5, 5.1, true) Bar(6, 6.1, false)]
]
namedtuples = [(a=1, b=2.3), (a=4, b=5.6)]
tuples = [(1, namedtuples[1], (0.1,)), (2, namedtuples[2], (1.0,))]
fn = tempname()
h5open(fn, "w") do h5f
write_dataset(h5f, "the/bars", bars)
write_dataset(h5f, "the/namedtuples", namedtuples)
write_dataset(h5f, "the/tuples", tuples)
end
thebars = h5open(fn, "r") do h5f
read(h5f, "the/bars")
end
@test (2, 3) == size(thebars)
@test thebars[1, 2].b ≈ 2.1
@test thebars[2, 1].a == 4
@test thebars[1, 3].c
thebars_r = reinterpret(Bar, thebars)
@test (2, 3) == size(thebars_r)
@test thebars_r[1, 2].b ≈ 2.1
@test thebars_r[2, 1].a == 4
@test thebars_r[1, 3].c
thenamedtuples = h5open(fn, "r") do h5f
read(h5f, "the/namedtuples")
end
@test (2,) == size(thenamedtuples)
@test thenamedtuples[1].a == 1
@test thenamedtuples[1].b ≈ 2.3
@test thenamedtuples[2].a == 4
@test thenamedtuples[2].b ≈ 5.6
thetuples = h5open(fn, "r") do h5f
read(h5f, "the/tuples")
end
@test (2,) == size(thetuples)
@test thetuples[1][1] == 1
@test thetuples[1][2].a == 1
@test thetuples[1][2].b ≈ 2.3
@test thetuples[1][3][1] ≈ 0.1
@test thetuples[2][1] == 2
@test thetuples[2][2].a == 4
@test thetuples[2][2].b ≈ 5.6
@test thetuples[2][3][1] ≈ 1.0
mutable_bars = [MutableBar(1), MutableBar(2), MutableBar(3)]
fn = tempname()
@test_throws ArgumentError begin
h5open(fn, "w") do h5f
write_dataset(h5f, "the/mutable_bars", mutable_bars)
end
end
Base.convert(::Type{NamedTuple{(:x,),Tuple{Int64}}}, mb::MutableBar) = (x=mb.x,)
Base.unsafe_convert(::Type{Ptr{Nothing}}, mb::MutableBar) = pointer_from_objref(mb)
h5open(fn, "w") do h5f
write_dataset(h5f, "the/mutable_bars", mutable_bars)
write_dataset(h5f, "the/mutable_bar", first(mutable_bars))
end
h5open(fn, "r") do h5f
@test [1, 2, 3] == [b.x for b in read(h5f, "the/mutable_bars")]
@test 1 == read(h5f, "the/mutable_bar").x
end
h5open(fn, "w") do h5f
write_attribute(h5f, "mutable_bars", mutable_bars)
write_attribute(h5f, "mutable_bar", first(mutable_bars))
end
h5open(fn, "r") do h5f
@test [1, 2, 3] == [b.x for b in attrs(h5f)["mutable_bars"]]
@test 1 == attrs(h5f)["mutable_bar"].x
end
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using Pkg, Libdl
Pkg.activate(dirname(@__DIR__))
Pkg.instantiate()
# Configure the test setup based on environment variables set in CI.
# First, we get the settings and remove all local preference configurations
# that may still exist.
const JULIA_HDF5_PATH = get(ENV, "JULIA_HDF5_PATH", "")
rm(joinpath(dirname(@__DIR__), "LocalPreferences.toml"); force=true)
# Next, we configure MPI.jl appropriately.
import MPIPreferences
MPIPreferences.use_system_binary()
# Finally, we configure HDF5.jl as desired.
import UUIDs, Preferences
Preferences.set_preferences!(
UUIDs.UUID("f67ccb44-e63f-5c2f-98bd-6dc0ccc4ba2f"), # UUID of HDF5.jl
"libhdf5" => joinpath(JULIA_HDF5_PATH, "libhdf5." * Libdl.dlext),
"libhdf5_hl" => joinpath(JULIA_HDF5_PATH, "libhdf5_hl." * Libdl.dlext);
force=true
)
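# Optional check (added for illustration): the preference written above can be inspected
# afterwards with `Preferences.load_preference`, e.g.
#
#     Preferences.load_preference(UUIDs.UUID("f67ccb44-e63f-5c2f-98bd-6dc0ccc4ba2f"), "libhdf5")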
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
"""
Test Set create_dataset
Test the combination of arguments to create_dataset.
"""
@testset "create_dataset" begin
mktemp() do fn, io
h5open(fn, "w") do h5f
h5g = create_group(h5f, "test_group")
# Arguments
# Test file and group
parents = (h5f, h5g)
# Test anonymous dataset, String, and SubString
names = (nothing, "test_dataset", @view("test_dataset"[1:4]))
# Test primitive, HDF5.Datatype, non-primitive, non-primitive HDF5.Datatype
types = (UInt8, datatype(UInt8), Complex{Float32}, datatype(Complex{Float32}))
# Test Tuple, HDF5.Dataspace, two tuples (extendible), extendible HDF5.Dataspace
spaces = (
(3, 4),
dataspace((16, 16)),
((4, 4), (8, 8)),
dataspace((16, 16); max_dims=(32, 32))
)
# TODO: test keywords
# Create argument cross product
p = Iterators.product(parents, names, types, spaces)
for (parent, name, type, space) in p
try
# create a chunked dataset since contiguous datasets are not extendible
ds = create_dataset(parent, name, type, space; chunk=(2, 2))
@test datatype(ds) == datatype(type)
@test dataspace(ds) == dataspace(space)
@test isvalid(ds)
close(ds)
if !isnothing(name)
# if it is not an anonymous dataset, try to open it
ds2 = open_dataset(parent, name)
@test isvalid(ds2)
close(ds2)
delete_object(parent, name)
end
catch err
throw(ArgumentError("""
Error occurred with (
$parent :: $(typeof(parent)),
$name :: $(typeof(name)),
$type :: $(typeof(type)),
$space :: $(typeof(space)))
"""))
end
end
end
end
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using Random, Test, HDF5
import HDF5.datatype
struct Simple
a::Float64
b::Int
end
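# Map `Simple` onto an HDF5 compound datatype by inserting each field at its Julia field
# offset; this lets the dataset written below be read back directly as `Simple`.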
function datatype(::Type{Simple})
dtype = HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(Simple))
HDF5.API.h5t_insert(dtype, "a", fieldoffset(Simple, 1), datatype(Float64))
HDF5.API.h5t_insert(dtype, "b", fieldoffset(Simple, 2), datatype(Int))
HDF5.Datatype(dtype)
end
@testset "custom" begin
N = 5
v = [Simple(rand(Float64), rand(Int)) for i in 1:N, j in 1:N]
fn = tempname()
h5open(fn, "w") do h5f
dtype = datatype(Simple)
dspace = dataspace(v)
dset = create_dataset(h5f, "data", dtype, dspace)
write_dataset(dset, dtype, v)
end
h5open(fn, "r") do h5f
dset = h5f["data"]
@test_throws ErrorException read(dset, Float64)
@test_throws ErrorException read(dset, Union{Float64,Int})
v_read = read(dset, Simple)
@test v_read == v
indices = (1, 1:2:4)
v_read = read(dset, Simple, indices...)
@test v_read == v[indices...]
v_read = read(h5f, "data" => Simple)
@test v_read == v
end
v_read = h5read(fn, "data" => Simple)
@test v_read == v
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
@testset "Dataspaces" begin
hsize_t = HDF5.API.hsize_t
# Reference objects without using high-level API
ds_null = HDF5.Dataspace(HDF5.API.h5s_create(HDF5.API.H5S_NULL))
ds_scalar = HDF5.Dataspace(HDF5.API.h5s_create(HDF5.API.H5S_SCALAR))
ds_zerosz = HDF5.Dataspace(HDF5.API.h5s_create_simple(1, hsize_t[0], hsize_t[0]))
ds_vector = HDF5.Dataspace(HDF5.API.h5s_create_simple(1, hsize_t[5], hsize_t[5]))
ds_matrix = HDF5.Dataspace(HDF5.API.h5s_create_simple(2, hsize_t[7, 5], hsize_t[7, 5]))
ds_maxdim = HDF5.Dataspace(HDF5.API.h5s_create_simple(2, hsize_t[7, 5], hsize_t[20, 20]))
ds_unlim = HDF5.Dataspace(HDF5.API.h5s_create_simple(1, hsize_t[1], [HDF5.API.H5S_UNLIMITED]))
# Testing basic property accessors of dataspaces
@test isvalid(ds_scalar)
@test ndims(ds_null) === 0
@test ndims(ds_scalar) === 0
@test ndims(ds_zerosz) === 1
@test ndims(ds_vector) === 1
@test ndims(ds_matrix) === 2
# Test that properties of existing datasets can be extracted.
# Note: Julia reverses the order of dimensions when using the high-level API versus
# the dimensions used above to create the reference objects.
@test size(ds_null) === ()
@test size(ds_scalar) === ()
@test size(ds_zerosz) === (0,)
@test size(ds_vector) === (5,)
@test size(ds_matrix) === (5, 7)
@test size(ds_maxdim) === (5, 7)
@test size(ds_null, 5) === 1
@test size(ds_scalar, 5) === 1
@test size(ds_zerosz, 5) === 1
@test size(ds_vector, 5) === 1
@test size(ds_matrix, 5) === 1
@test size(ds_maxdim, 5) === 1
@test_throws ArgumentError("invalid dimension d; must be positive integer") size(
ds_null, 0
)
@test_throws ArgumentError("invalid dimension d; must be positive integer") size(
ds_scalar, -1
)
@test length(ds_null) === 0
@test length(ds_scalar) === 1
@test length(ds_zerosz) === 0
@test length(ds_vector) === 5
@test length(ds_matrix) === 35
@test length(ds_maxdim) === 35
@test isempty(ds_null)
@test !isempty(ds_scalar)
@test isempty(ds_zerosz)
@test !isempty(ds_vector)
@test HDF5.isnull(ds_null)
@test !HDF5.isnull(ds_scalar)
@test !HDF5.isnull(ds_zerosz)
@test !HDF5.isnull(ds_vector)
@test HDF5.get_extent_dims(ds_null) === ((), ())
@test HDF5.get_extent_dims(ds_scalar) === ((), ())
@test HDF5.get_extent_dims(ds_zerosz) === ((0,), (0,))
@test HDF5.get_extent_dims(ds_vector) === ((5,), (5,))
@test HDF5.get_extent_dims(ds_matrix) === ((5, 7), (5, 7))
@test HDF5.get_extent_dims(ds_maxdim) === ((5, 7), (20, 20))
@test HDF5.get_extent_dims(ds_unlim) === ((1,), (-1,))
# Can create new copies
ds_tmp = copy(ds_maxdim)
ds_tmp2 = HDF5.Dataspace(ds_tmp.id) # copy of ID, but new Julia object
@test ds_tmp.id == ds_tmp2.id != ds_maxdim.id
# Equality and hashing
@test ds_tmp == ds_maxdim
@test ds_tmp !== ds_maxdim
@test hash(ds_tmp) != hash(ds_maxdim)
@test ds_tmp == ds_tmp2
@test ds_tmp !== ds_tmp2
@test hash(ds_tmp) == hash(ds_tmp2)
# Behavior of closing dataspace objects
close(ds_tmp)
@test ds_tmp.id == -1
@test !isvalid(ds_tmp)
@test !isvalid(ds_tmp2)
# Validity checking in high-level operations
@test_throws ErrorException("File or object has been closed") copy(ds_tmp)
@test_throws ErrorException("File or object has been closed") ndims(ds_tmp)
@test_throws ErrorException("File or object has been closed") size(ds_tmp)
@test_throws ErrorException("File or object has been closed") size(ds_tmp, 1)
@test_throws ErrorException("File or object has been closed") length(ds_tmp)
@test_throws ErrorException("File or object has been closed") ds_tmp == ds_tmp2
@test close(ds_tmp) === nothing # no error
# Test ability to create explicitly-sized dataspaces
@test dataspace(()) == ds_scalar
@test dataspace((5,)) == ds_vector
@test dataspace((5, 7)) == ds_matrix != ds_maxdim
@test dataspace((5, 7); max_dims=(20, 20)) == ds_maxdim != ds_matrix
@test dataspace((5, 7), (20, 20)) == ds_maxdim
@test dataspace(((5, 7), (20, 20))) == ds_maxdim
@test dataspace((1,); max_dims=(-1,)) == ds_unlim
@test dataspace((1,), (-1,)) == ds_unlim
@test dataspace(((1,), (-1,))) == ds_unlim
# for ≥ 2 numbers, same as single tuple argument
@test dataspace(5, 7) == ds_matrix
@test dataspace(5, 7, 1) == dataspace((5, 7, 1))
# Test dataspaces derived from data
@test dataspace(nothing) == ds_null
@test dataspace(HDF5.EmptyArray{Bool}()) == ds_null
@test dataspace(fill(1.0)) == ds_scalar
@test dataspace(1) == ds_scalar
@test dataspace(1 + 1im) == ds_scalar
@test dataspace("string") == ds_scalar
@test dataspace(zeros(0)) == ds_zerosz
@test dataspace(zeros(0, 0)) != ds_zerosz
@test dataspace(zeros(5, 7)) == ds_matrix
@test dataspace(HDF5.VLen([[1]])) == dataspace((1,))
@test dataspace(HDF5.VLen([[1], [2]])) == dataspace((2,))
# Constructing dataspace for/from HDF5 dataset or attribute
mktemp() do path, io
close(io)
h5open(path, "w") do hid
dset = create_dataset(hid, "dset", datatype(Int), ds_matrix)
attr = create_attribute(dset, "attr", datatype(Bool), ds_vector)
@test dataspace(dset) == ds_matrix
@test dataspace(dset) !== ds_matrix
@test dataspace(attr) == ds_vector
@test dataspace(attr) !== ds_vector
close(dset)
close(attr)
@test_throws ErrorException("File or object has been closed") dataspace(dset)
@test_throws ErrorException("File or object has been closed") dataspace(attr)
end
end
# Test mid-level routines: set/get_extent_dims
dspace_norm = dataspace((100, 4))
@test HDF5.get_extent_dims(dspace_norm)[1] ==
HDF5.get_extent_dims(dspace_norm)[2] ==
(100, 4)
HDF5.set_extent_dims(dspace_norm, (8, 2))
@test HDF5.get_extent_dims(dspace_norm)[1] ==
HDF5.get_extent_dims(dspace_norm)[2] ==
(8, 2)
dspace_maxd = dataspace((100, 4); max_dims=(256, 5))
@test HDF5.get_extent_dims(dspace_maxd)[1] == (100, 4)
@test HDF5.get_extent_dims(dspace_maxd)[2] == (256, 5)
HDF5.set_extent_dims(dspace_maxd, (8, 2))
@test HDF5.get_extent_dims(dspace_maxd)[1] == (8, 2)
HDF5.set_extent_dims(dspace_maxd, (3, 1), (4, 2))
@test HDF5.get_extent_dims(dspace_maxd)[1] == (3, 1)
@test HDF5.get_extent_dims(dspace_maxd)[2] == (4, 2)
HDF5.set_extent_dims(dspace_maxd, (3, 1), (-1, -1)) # unlimited max size
@test HDF5.get_extent_dims(dspace_maxd)[1] == (3, 1)
@test HDF5.get_extent_dims(dspace_maxd)[2] == (-1, -1)
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
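# Build a custom 3-byte unsigned integer datatype (24 significant bits) by copying the
# standard 32-bit little-endian type and shrinking its size and precision.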
function create_h5_uint24()
dt = HDF5.API.h5t_copy(HDF5.API.H5T_STD_U32LE)
HDF5.API.h5t_set_size(dt, 3)
HDF5.API.h5t_set_precision(dt, 24)
return HDF5.Datatype(dt)
end
@testset "Datatypes" begin
DT = create_h5_uint24()
@test HDF5.API.h5t_get_size(DT) == 3
@test HDF5.API.h5t_get_precision(DT) == 24
@test HDF5.API.h5t_get_offset(DT) == 0
@test HDF5.API.h5t_get_order(DT) == HDF5.API.H5T_ORDER_LE
HDF5.API.h5t_set_precision(DT, 12)
@test HDF5.API.h5t_get_precision(DT) == 12
@test HDF5.API.h5t_get_offset(DT) == 0
HDF5.API.h5t_set_offset(DT, 12)
@test HDF5.API.h5t_get_precision(DT) == 12
@test HDF5.API.h5t_get_offset(DT) == 12
io = IOBuffer()
show(io, DT)
str = String(take!(io))
@test match(r"undefined integer", str) !== nothing
@test match(r"size: 3 bytes", str) !== nothing
@test match(r"precision: 12 bits", str) !== nothing
@test match(r"offset: 12 bits", str) !== nothing
@test match(r"order: little endian byte order", str) !== nothing
HDF5.API.h5t_set_order(DT, HDF5.API.H5T_ORDER_BE)
@test HDF5.API.h5t_get_order(DT) == HDF5.API.H5T_ORDER_BE
show(io, DT)
str = String(take!(io))
@test match(r"order: big endian byte order", str) !== nothing
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
import HDF5.Drivers
using Test
@testset "Drivers" begin
fn = tempname()
A = rand(UInt8, 256, 128)
h5open(fn, "w"; driver=Drivers.Core()) do f
ds = write_dataset(f, "core_dataset", A)
end
@test isfile(fn)
h5open(fn, "r") do f
@test f["core_dataset"][] == A
end
fn = tempname()
file_image = h5open(fn, "w"; driver=Drivers.Core(; backing_store=false)) do f
ds = write_dataset(f, "core_dataset", A)
Vector{UInt8}(f)
end
@test !isfile(fn)
h5open(file_image) do f
@test f["core_dataset"][] == A
end
fn = tempname()
h5open(fn, "r"; driver=Drivers.Core(; backing_store=false), file_image) do f
@test f["core_dataset"][] == A
end
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
@testset "extend" begin
fn = tempname()
fid = h5open(fn, "w")
g = create_group(fid, "shoe")
d = create_dataset(g, "foo", datatype(Float64), ((10, 20), (100, 200)); chunk=(1, 1))
#println("d is size current $(map(int,HDF5.get_extent_dims(d)[1])) max $(map(int,HDF5.get_extent_dims(d)[2]))")
dims, max_dims = HDF5.get_extent_dims(d)
@test dims == (UInt64(10), UInt64(20))
@test max_dims == (UInt64(100), UInt64(200))
HDF5.set_extent_dims(d, (100, 150))
dims, max_dims = HDF5.get_extent_dims(d)
@test dims == (UInt64(100), UInt64(150))
@test max_dims == (UInt64(100), UInt64(200))
d[1, 1:5] = [1.1231, 1.313, 5.123, 2.231, 4.1231]
HDF5.set_extent_dims(d, (1, 5))
@test size(d) == (1, 5)
# Indexing returns correct array dimensions
@test d[1, end] ≈ 4.1231
@test d[:, end] ≈ [4.1231]
@test d[end, :] == [1.1231, 1.313, 5.123, 2.231, 4.1231]
@test d[:, :] == [1.1231 1.313 5.123 2.231 4.1231]
if VERSION >= v"1.4"
include("extend_test_begin.jl")
extend_test_begin(d)
end
# Test all integer types work
@test d[UInt8(1), UInt16(1)] == 1.1231
@test d[UInt32(1), UInt128(1)] == 1.1231
@test d[Int8(1), Int16(1)] == 1.1231
@test d[Int32(1), Int128(1)] == 1.1231
# Test ranges work with steps
@test d[1, 1:2:5] == [1.1231, 5.123, 4.1231]
@test d[1:1, 2:2:4] == [1.313 2.231]
# Test Array constructor
@test Array(d) == [1.1231 1.313 5.123 2.231 4.1231]
#println("d is size current $(map(int,HDF5.get_extent_dims(d)[1])) max $(map(int,HDF5.get_extent_dims(d)[2]))")
b = create_dataset(fid, "b", Int, ((1000,), (-1,)); chunk=(100,)) #-1 is equivalent to typemax(hsize_t) as far as I can tell
#println("b is size current $(map(int,HDF5.get_extent_dims(b)[1])) max $(map(int,HDF5.get_extent_dims(b)[2]))")
b[1:200] = ones(200)
dims, max_dims = HDF5.get_extent_dims(b)
@test dims == (UInt64(1000),)
@test max_dims == (HDF5.API.H5S_UNLIMITED % Int,)
HDF5.set_extent_dims(b, (10000,))
dims, max_dims = HDF5.get_extent_dims(b)
@test dims == (UInt64(10000),)
@test max_dims == (HDF5.API.H5S_UNLIMITED % Int,)
#println("b is size current $(map(int,HDF5.get_extent_dims(b)[1])) max $(map(int,HDF5.get_extent_dims(b)[2]))")
# b[:] = [1:10000] # gave error no method lastindex(HDF5.Dataset{PlainHDF5File},),
# so I defined lastindex(dset::HDF5.Dataset) = length(dset), and exported lastindex
# but that didn't fix the error, despite the lastindex function working
# d[1] produces error ERROR: Wrong number of indices supplied, should datasets support linear indexing?
b[1:10000] = [1:10000;]
#println(b[1:100])
close(fid)
fid = h5open(fn, "r")
d_again = fid["shoe/foo"]
dims, max_dims = HDF5.get_extent_dims(d_again)
@test dims == (UInt64(1), UInt64(5))
@test max_dims == (UInt64(100), UInt64(200))
@test (sum(d_again[1, 1:5]) - sum([1.1231, 1.313, 5.123, 2.231, 4.1231])) == 0
#println("d is size current $(map(int,HDF5.get_extent_dims(re_d)[1])) max $(map(int,HDF5.get_extent_dims(re_d)[2]))")
@test fid["b"][1:10000] == [1:10000;]
b_again = fid["b"]
dims, max_dims = HDF5.get_extent_dims(b_again)
@test dims == (UInt64(10000),)
@test max_dims == (HDF5.API.H5S_UNLIMITED % Int,)
#println("b is size current $(map(int,HDF5.get_extent_dims(b)[1])) max $(map(int,HDF5.get_extent_dims(b)[2]))")
close(fid)
rm(fn)
end # testset extend_test
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
function extend_test_begin(d)
@test d[1, begin] ≈ 1.1231
@test d[:, begin] ≈ [1.1231]
@test d[begin, :] == [1.1231, 1.313, 5.123, 2.231, 4.1231]
@test d[:, begin:end] == [1.1231 1.313 5.123 2.231 4.1231]
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
@testset "external" begin
# roughly following https://www.hdfgroup.org/ftp/HDF5/current/src/unpacked/examples/h5_extlink.c
fn1 = tempname()
fn2 = tempname()
source_file = h5open(fn1, "w")
agroup = create_group(source_file, "agroup")
target_file = h5open(fn2, "w")
target_group = create_group(target_file, "target_group")
target_group["abc"] = "abc"
target_group["1"] = 1
target_group["1.1"] = 1.1
close(target_file)
# create external link such that source_file["ext_link"] points to target_file["target_group"]
# test both an HDF5.File and an HDF5.Group for first argument
HDF5.create_external(source_file, "ext_link", target_file.filename, "target_group")
HDF5.create_external(agroup, "ext_link", target_file.filename, "target_group")
close(agroup)
# write some things via the external link
new_group = create_group(source_file["ext_link"], "new_group")
new_group["abc"] = "abc"
new_group["1"] = 1
new_group["1.1"] = 1.1
close(new_group)
# read things from target_group via the external link created with the HDF5.File argument
group = source_file["ext_link"]
@test read(group["abc"]) == "abc"
@test read(group["1"]) == 1
@test read(group["1.1"]) == 1.1
close(group)
# read things from target_group via the external link created with HDF5.Group argument
groupalt = source_file["agroup/ext_link"]
@test read(groupalt["abc"]) == "abc"
@test read(groupalt["1"]) == 1
@test read(groupalt["1.1"]) == 1.1
close(groupalt)
close(source_file)
##### tests that should be included but don't work
# when one restarts julia and keeps track of target_file.filename,
# these tests succeed
# reopening the target_file crashes due to "file close degree doesn't match"
# target_file = h5open(target_file.filename, "r")
# group2 = target_file["target_group"]["new_group"]
# @test read(group2["abc"])=="abc"
# @test read(group2["1"])==1
# @test read(group2["1.1"])==1.1
rm(fn1)
# rm(fn2)
end # testset external
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5, OrderedCollections, FileIO, Test
@testset "fileio" begin
fn = tempname() * ".h5"
hfile = h5open(fn, "w")
hfile["A"] = 1.0
hfile["B"] = [1, 2, 3]
create_group(hfile, "G")
hfile["G/A"] = collect(-3:4)
create_group(hfile, "G1/G2")
hfile["G1/G2/A"] = "hello"
close(hfile)
# test loader
data = Dict("A" => 1.0, "B" => [1, 2, 3], "G/A" => collect(-3:4), "G1/G2/A" => "hello")
@test load(fn) == data
@test load(fn, "A") == 1.0
@test load(fn, "A", "B") == (1.0, [1, 2, 3])
@test load(fn, "G/A") == collect(-3:4)
rm(fn)
# test saver
save(fn, data)
@test load(fn) == data
@test load(fn, "A") == 1.0
fr = h5open(fn, "r")
@test read(fr, "A") == 1.0
close(fr)
rm(fn)
end
@testset "track order" begin
let fn = tempname() * ".h5"
h5open(fn, "w"; track_order=true) do io
fcpl = HDF5.get_create_properties(io)
@test fcpl.track_order
io["b"] = 1
io["a"] = 2
g = create_group(io, "G"; track_order=true)
gcpl = HDF5.get_create_properties(io["G"])
@test gcpl.track_order
write(g, "z", 3)
write(g, "f", 4)
end
dat = load(fn; dict=OrderedDict()) # `track_order` is inferred from `OrderedDict`
@test all(keys(dat) .== ["b", "a", "G/z", "G/f"])
# issue #939
h5open(fn, "r"; track_order=true) do io
@test HDF5.get_context_property(:file_create).track_order
@test all(keys(io) .== ["b", "a", "G"])
@test HDF5.get_context_property(:group_create).track_order
@test HDF5.get_create_properties(io["G"]).track_order # inferred from file, created with `track_order=true`
@test all(keys(io["G"]) .== ["z", "f"])
end
h5open(fn, "r"; track_order=false) do io
@test !HDF5.get_context_property(:file_create).track_order
@test all(keys(io) .== ["G", "a", "b"])
@test !HDF5.get_context_property(:group_create).track_order
@test HDF5.get_create_properties(io["G"]).track_order # inferred from file
@test all(keys(io["G"]) .== ["z", "f"])
end
h5open(fn, "r") do io
@test !HDF5.get_create_properties(io).track_order
@test all(keys(io) .== ["G", "a", "b"])
@test HDF5.get_create_properties(io["G"]).track_order # inferred from file
@test all(keys(io["G"]) .== ["z", "f"])
end
end
let fn = tempname() * ".h5"
save(fn, OrderedDict("b" => 1, "a" => 2, "G/z" => 3, "G/f" => 4))
dat = load(fn; dict=OrderedDict())
@test all(keys(dat) .== ["b", "a", "G/z", "G/f"])
end
end # @testset track_order
@static if HDF5.API.h5_get_libversion() >= v"1.10.5"
@testset "h5f_get_dset_no_attrs_hint" begin
fn = tempname()
threshold = 300
h5open(fn, "w"; libver_bounds=:latest, meta_block_size=threshold) do f
HDF5.API.h5f_set_dset_no_attrs_hint(f, true)
@test HDF5.API.h5f_get_dset_no_attrs_hint(f)
f["test"] = 0x1
# We expect that with the hint, the offset will actually be 300
@test HDF5.API.h5d_get_offset(f["test"]) == threshold
end
@test filesize(fn) == threshold + 1
h5open(fn, "w"; libver_bounds=:latest, meta_block_size=threshold) do f
HDF5.API.h5f_set_dset_no_attrs_hint(f, false)
@test !HDF5.API.h5f_get_dset_no_attrs_hint(f)
f["test"] = 0x1
# We expect that with the hint, the offset will be greater than 300
@test HDF5.API.h5d_get_offset(f["test"]) > threshold
end
@test filesize(fn) > threshold + 1
end
end # @static if HDF5.API.h5_get_libversion() >= v"1.10.5"
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using HDF5.Filters
import HDF5.Filters.Registered
using Test
using H5Zblosc, H5Zlz4, H5Zbzip2, H5Zzstd
using Preferences
@static if VERSION >= v"1.6"
using H5Zbitshuffle
end
using HDF5.Filters: ExternalFilter, isavailable, isencoderenabled, isdecoderenabled
@testset "filter" begin
# Create a new file
fn = tempname()
# Create test data
data = rand(1000, 1000)
# Open temp file for writing
f = h5open(fn, "w")
# Create datasets
dsdeflate = create_dataset(
f, "deflate", datatype(data), dataspace(data); chunk=(100, 100), deflate=3
)
dsshufdef = create_dataset(
f,
"shufdef",
datatype(data),
dataspace(data);
chunk=(100, 100),
shuffle=true,
deflate=3
)
dsfiltdef = create_dataset(
f,
"filtdef",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=Filters.Deflate(3)
)
dsfiltshufdef = create_dataset(
f,
"filtshufdef",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=[Filters.Shuffle(), Filters.Deflate(3)]
)
# Write data
write(dsdeflate, data)
write(dsshufdef, data)
write(dsfiltdef, data)
write(dsfiltshufdef, data)
# Test compression filters
compressionFilters = Dict(
"blosc" => BloscFilter,
"bzip2" => Bzip2Filter,
"lz4" => Lz4Filter,
"zstd" => ZstdFilter,
)
for (name, filter) in compressionFilters
ds = create_dataset(
f, name, datatype(data), dataspace(data); chunk=(100, 100), filters=filter()
)
write(ds, data)
ds = create_dataset(
f,
"shuffle+" * name,
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=[Filters.Shuffle(), filter()]
)
write(ds, data)
end
ds = create_dataset(
f,
"blosc_bitshuffle",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=BloscFilter(; shuffle=H5Zblosc.BITSHUFFLE)
)
write(ds, data)
function extra_bitshuffle()
ds = create_dataset(
f,
"bitshuffle_lz4",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=BitshuffleFilter(; compressor=:lz4)
)
write(ds, data)
ds = create_dataset(
f,
"bitshuffle_zstd",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=BitshuffleFilter(; compressor=:zstd, comp_level=5)
)
write(ds, data)
ds = create_dataset(
f,
"bitshuffle_plain",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=BitshuffleFilter()
)
write(ds, data)
end
@static VERSION >= v"1.6" ? extra_bitshuffle() : nothing
# Close and re-open file for reading
close(f)
f = h5open(fn)
try
# Read datasets and test for equality
for name in keys(f)
ds = f[name]
@testset "$name" begin
@debug "Filter Dataset" HDF5.name(ds)
@test ds[] == data
filters = HDF5.get_create_properties(ds).filters
if startswith(name, "shuffle+")
@test filters[1] isa Shuffle
@test filters[2] isa compressionFilters[name[9:end]]
elseif haskey(compressionFilters, name) || name == "blosc_bitshuffle"
name = replace(name, r"_.*" => "")
@test filters[1] isa compressionFilters[name]
end
if v"1.12.3" ≤ HDF5.API._libhdf5_build_ver
infos = HDF5.get_chunk_info_all(ds)
filter_masks = [info.filter_mask for info in infos]
@test only(unique(filter_masks)) === UInt32(0)
end
end
end
finally
close(f)
end
# Test that reading a dataset with a missing filter has an informative error message.
h5open(fn, "w") do f
data = zeros(100, 100)
ds = create_dataset(
f,
"data",
datatype(data),
dataspace(data);
chunk=(100, 100),
filters=Lz4Filter()
)
write(ds, data)
close(ds)
end
HDF5.API.h5z_unregister(Filters.filterid(H5Zlz4.Lz4Filter))
h5open(fn) do f
filter_name = Filters.filtername(H5Zlz4.Lz4Filter)
filter_id = Filters.filterid(H5Zlz4.Lz4Filter)
@test_throws(
ErrorException("""
filter missing, filter id: $filter_id name: $filter_name
Try running `import H5Zlz4` to install this filter.
"""),
read(f["data"])
)
HDF5.Filters.register_filter(H5Zlz4.Lz4Filter)
end
# Issue #896 and https://github.com/JuliaIO/HDF5.jl/issues/285#issuecomment-1002243321
# Create an ExternalFilter from a Tuple
h5open(fn, "w") do f
data = rand(UInt8, 512, 16, 512)
# Tuple of integers should become an Unknown Filter
ds, dt = create_dataset(
f, "data", data; chunk=(256, 1, 256), filter=(H5Z_FILTER_BZIP2, 0)
)
# Tuple of Filters should get pushed into the pipeline one by one
dsfiltshufdef = create_dataset(
f,
"filtshufdef",
datatype(data),
dataspace(data);
chunk=(128, 4, 128),
filters=(Filters.Shuffle(), Filters.Deflate(3))
)
write(ds, data)
close(ds)
write(dsfiltshufdef, data)
close(dsfiltshufdef)
end
h5open(fn, "r") do f
@test f["data"][] == data
@test f["filtshufdef"][] == data
end
# Filter Pipeline test for ExternalFilter
FILTERS_backup = copy(HDF5.Filters.FILTERS)
empty!(HDF5.Filters.FILTERS)
h5open(fn, "w") do f
data = collect(1:128)
filter = ExternalFilter(
H5Z_FILTER_LZ4, 0, Cuint[0, 2, 4, 6, 8, 10], "Unknown LZ4", 0
)
ds, dt = create_dataset(f, "data", data; chunk=(32,), filters=filter)
dcpl = HDF5.get_create_properties(ds)
pipeline = HDF5.Filters.FilterPipeline(dcpl)
@test pipeline[1].data == filter.data
end
merge!(HDF5.Filters.FILTERS, FILTERS_backup)
@test HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_DEFLATE)
@test HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_FLETCHER32)
@test HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_NBIT)
@test HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_SCALEOFFSET)
@test HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_SHUFFLE)
if !Preferences.has_preference(HDF5, "libhdf5")
if HDF5.API.h5_get_libversion() < v"1.14"
@test_broken HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_SZIP)
else
@test HDF5.API.h5z_filter_avail(HDF5.API.H5Z_FILTER_SZIP)
end
end
@test HDF5.API.h5z_filter_avail(H5Z_FILTER_BZIP2)
@test HDF5.API.h5z_filter_avail(H5Z_FILTER_LZ4)
@test HDF5.API.h5z_filter_avail(H5Z_FILTER_ZSTD)
@test HDF5.API.h5z_filter_avail(H5Z_FILTER_BLOSC)
# Test the RegisteredFilter module for filters we know to be loaded
reg_loaded = [
Registered.BZIP2Filter,
Registered.LZ4Filter,
Registered.ZstandardFilter,
Registered.BLOSCFilter
]
for func in reg_loaded
f = func()
@test HDF5.API.h5z_filter_avail(f)
@test (Filters.filterid(f) => func) in Registered.available_registered_filters()
@test func(HDF5.API.H5Z_FLAG_OPTIONAL) isa ExternalFilter
@test func(
HDF5.API.H5Z_FLAG_OPTIONAL, Cuint[], HDF5.API.H5Z_FILTER_CONFIG_ENCODE_ENABLED
) isa ExternalFilter
end
HDF5.API.h5z_unregister(H5Z_FILTER_LZ4)
HDF5.Filters.register_filter(H5Zlz4.Lz4Filter)
@test isavailable(H5Z_FILTER_LZ4)
@test isavailable(Lz4Filter)
@test isencoderenabled(H5Z_FILTER_LZ4)
@test isdecoderenabled(H5Z_FILTER_LZ4)
@test isencoderenabled(Lz4Filter)
@test isdecoderenabled(Lz4Filter)
end # @testset "filter"
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
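# `@gcvalid` forces a garbage-collection cycle (GC is otherwise disabled in the testset
# below) and checks that the given HDF5 handles are still valid, i.e. were not finalized
# prematurely.  `@closederror` asserts that touching an object after its file has been
# closed raises the expected "File or object has been closed" error.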
macro gcvalid(args...)
Expr(
:block,
quote
GC.enable(true)
GC.gc()
GC.enable(false)
end,
[:(@test HDF5.isvalid($(esc(x)))) for x in args]...
)
end
macro closederror(x)
quote
try
$(esc(x))
catch e
isa(e, ErrorException) || rethrow(e)
e.msg == "File or object has been closed" ||
error("Attempt to access closed object did not throw")
end
end
end
@testset "gc" begin
GC.enable(false)
fn = tempname()
for i in 1:10
file = h5open(fn, "w")
memtype_id = HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, 2 * sizeof(Float64))
HDF5.API.h5t_insert(memtype_id, "real", 0, HDF5.hdf5_type_id(Float64))
HDF5.API.h5t_insert(memtype_id, "imag", sizeof(Float64), HDF5.hdf5_type_id(Float64))
dt = HDF5.Datatype(memtype_id)
commit_datatype(file, "dt", dt)
ds = dataspace((2,))
d = create_dataset(file, "d", dt, ds)
g = create_group(file, "g")
a = create_attribute(file, "a", dt, ds)
@gcvalid dt ds d g a
close(file)
@closederror read(d)
for obj in (d, g)
@closederror read_attribute(obj, "a")
@closederror write_attribute(obj, "a", 1)
end
for obj in (g, file)
@closederror open_dataset(obj, "d")
@closederror read_dataset(obj, "d")
@closederror write_dataset(obj, "d", 1)
@closederror read(obj, "x")
@closederror write(obj, "x", "y")
end
end
for i in 1:10
file = h5open(fn, "r")
dt = file["dt"]
d = file["d"]
ds = dataspace(d)
g = file["g"]
a = attributes(file)["a"]
@gcvalid dt ds d g a
close(file)
end
GC.enable(true)
let plist = HDF5.init!(HDF5.FileAccessProperties()) # related to issue #620
HDF5.API.h5p_close(plist)
@test_nowarn finalize(plist)
end
rm(fn)
end # testset gc
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using Random, Test, HDF5
@testset "BlockRange" begin
br = HDF5.BlockRange(2)
@test length(br) == 1
@test range(br) === 2:2
@test convert(AbstractRange, br) === 2:2
@test convert(UnitRange, br) === 2:2
@test convert(StepRange, br) === 2:1:2
@test repr(br) == "HDF5.BlockRange(2)"
@test repr(br; context=:compact => true) == "2"
br = HDF5.BlockRange(Base.OneTo(3))
@test length(br) == 3
@test range(br) == 1:3
@test convert(AbstractRange, br) === 1:3
@test convert(UnitRange, br) === 1:3
@test convert(StepRange, br) === 1:1:3
@test repr(br) == "HDF5.BlockRange(1:3)"
@test repr(br; context=:compact => true) == "1:3"
br = HDF5.BlockRange(2:7)
@test length(br) == 6
@test range(br) == 2:7
@test convert(AbstractRange, br) === 2:7
@test convert(UnitRange, br) === 2:7
@test convert(StepRange, br) === 2:1:7
@test repr(br) == "HDF5.BlockRange(2:7)"
@test repr(br; context=:compact => true) == "2:7"
br = HDF5.BlockRange(1:2:7)
@test length(br) == 4
@test range(br) == 1:2:7
@test convert(AbstractRange, br) === 1:2:7
@test_throws Exception convert(UnitRange, br)
@test convert(StepRange, br) === 1:2:7
@test repr(br) == "HDF5.BlockRange(1:2:7)"
@test repr(br; context=:compact => true) == "1:2:7"
br = HDF5.BlockRange(; start=2, stride=8, count=3, block=2)
@test length(br) == 6
@test_throws Exception range(br)
@test_throws Exception convert(AbstractRange, br)
@test_throws Exception convert(UnitRange, br)
@test_throws Exception convert(StepRange, br)
@test repr(br) == "HDF5.BlockRange(start=2, stride=8, count=3, block=2)"
@test repr(br; context=:compact => true) ==
"BlockRange(start=2, stride=8, count=3, block=2)"
end
@testset "hyperslab" begin
N = 10
v = [randstring(rand(5:10)) for i in 1:N, j in 1:N]
fn = tempname()
h5open(fn, "w") do f
f["data"] = v
end
h5open(fn, "r") do f
dset = f["data"]
indices = (1, 1)
@test dset[indices...] == v[indices...]
indices = (1:10, 1)
@test dset[indices...] == v[indices...]
indices = (1, 1:10)
@test dset[indices...] == v[indices...]
indices = (1:2:10, 1:3:10)
@test dset[indices...] == v[indices...]
indices = (
HDF5.BlockRange(1:2; stride=4, count=2), HDF5.BlockRange(1; stride=5, count=2)
)
@test dset[indices...] == vcat(v[1:2, 1:5:6], v[5:6, 1:5:6])
end
end
@testset "read 0-length arrays: issue #859" begin
fname = tempname()
dsetname = "foo"
h5open(fname, "w") do fid
create_dataset(fid, dsetname, datatype(Float32), ((0,), (-1,)); chunk=(100,))
end
h5open(fname, "r") do fid
fid[dsetname][:]
end
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
const PRINT_MEMORY = `ps -p $(getpid()) -o rss=`
const DATA = zeros(1000)
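# `@memtest` runs the wrapped expression 100 x 100 times, triggering a GC sweep after each
# batch of 100 and printing this process's resident set size via `ps`, so a steadily
# growing number indicates a leak in the wrapped HDF5 calls.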
macro memtest(ex)
@info :Memory
quote
for i in 1:100
for _ in 1:100
$ex
end
# HDF5.h5_garbage_collect()
GC.gc()
print(rpad(i, 8))
run(PRINT_MEMORY)
end
end
end
@memtest h5open("/tmp/memtest.h5", "w") do file
dset = create_dataset(file, "A", datatype(DATA), dataspace(DATA); chunk=(100,))
dset[:] = DATA[:]
end
@memtest h5open("/tmp/memtest.h5", "w") do file
file["A", chunk=(100,)] = DATA[:]
end
@memtest h5open("/tmp/memtest.h5", "r") do file
file["A", "dxpl_mpio", 0]
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
@testset "mmap" begin
# Create a new file
fn = tempname()
f = h5open(fn, "w")
@test isopen(f)
# Create two datasets, one with late allocation (the default for contiguous
# datasets) and the other with explicit early allocation.
hdf5_A = create_dataset(f, "A", datatype(Int64), dataspace(3, 3))
hdf5_B = create_dataset(
f, "B", datatype(Float64), dataspace(3, 3); alloc_time=HDF5.API.H5D_ALLOC_TIME_EARLY
)
# The late case cannot be mapped yet.
@test_throws ErrorException("Error getting offset") HDF5.readmmap(f["A"])
# Then write and fill dataset A, making it mappable. B was filled with 0.0 at
# creation.
A = rand(Int64, 3, 3)
hdf5_A[:, :] = A
flush(f)
close(f)
# Read HDF5 file & MMAP
f = h5open(fn, "r")
A_mmaped = HDF5.readmmap(f["A"])
@test all(A .== A_mmaped)
@test all(iszero, HDF5.readmmap(f["B"]))
# Check that it is read only
@test_throws ReadOnlyMemoryError A_mmaped[1, 1] = 33
close(f)
# Now check if we can write
f = h5open(fn, "r+")
A_mmaped = HDF5.readmmap(f["A"])
A_mmaped[1, 1] = 33
close(f)
# issue #863 - fix mmapping complex arrays
fn = tempname()
f = h5open(fn, "w")
A = rand(ComplexF32, 5, 5)
f["A"] = A
close(f)
f = h5open(fn, "r+")
complex_support = HDF5.COMPLEX_SUPPORT[]
# Complex arrays can be mmapped when complex support is enabled
complex_support || HDF5.enable_complex_support()
@test A == read(f["A"])
@test A == HDF5.readmmap(f["A"])
# But mmapping should throw an error when support is disabled
HDF5.disable_complex_support()
At = [(r=real(c), i=imag(c)) for c in A]
@test read(f["A"]) == At # readable as array of NamedTuples
@test_throws ErrorException("Cannot mmap datasets of type $(eltype(At))") HDF5.readmmap(
f["A"]
)
close(f)
# Restore complex support state
complex_support && HDF5.enable_complex_support()
end # testset
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using MPI
using HDF5
using Test
@testset "mpio" begin
HDF5.FileAccessProperties() do fapl
Drivers.set_driver!(fapl, Drivers.Core())
end
MPI.Init()
comm = MPI.COMM_WORLD
if !HDF5.has_parallel()
@test_throws ErrorException(
"HDF5.jl has no parallel support. Make sure that you're using MPI-enabled HDF5 libraries, and that MPI was loaded before HDF5. See HDF5.jl docs for details."
) HDF5.FileAccessProperties(driver=HDF5.Drivers.MPIO(comm))
else
nprocs = MPI.Comm_size(comm)
myrank = MPI.Comm_rank(comm)
# Check that serial drivers are still there after loading MPI (#928)
@test Drivers.Core ∈ values(Drivers.DRIVERS)
@test Drivers.POSIX ∈ values(Drivers.DRIVERS)
let fileprop = HDF5.FileAccessProperties()
fileprop.driver = HDF5.Drivers.MPIO(comm)
driver = fileprop.driver
h5comm = driver.comm
h5info = driver.info
# check that the two communicators point to the same group
if isdefined(MPI, :Comm_compare) # requires recent MPI.jl version
@test MPI.Comm_compare(comm, h5comm) == MPI.CONGRUENT
end
HDF5.close(fileprop)
end
# open file in parallel and write dataset
fn = MPI.bcast(tempname(), 0, comm)
A = [myrank + i for i in 1:10]
h5open(fn, "w", comm) do f
@test isopen(f)
g = create_group(f, "mygroup")
dset = create_dataset(
g,
"B",
datatype(Int64),
dataspace(10, nprocs);
chunk=(10, 1),
dxpl_mpio=:collective
)
dset[:, myrank + 1] = A
end
MPI.Barrier(comm)
h5open(fn, comm) do f # default: opened in read mode, with default MPI.Info()
@test isopen(f)
@test keys(f) == ["mygroup"]
B = read(f, "mygroup/B"; dxpl_mpio=:collective)
@test !isempty(B)
@test A == vec(B[:, myrank + 1])
B = f["mygroup/B", dxpl_mpio=:collective]
@test !isempty(B)
@test A == vec(B[:, myrank + 1])
end
MPI.Barrier(comm)
B = h5read(fn, "mygroup/B"; driver=HDF5.Drivers.MPIO(comm), dxpl_mpio=:collective)
@test A == vec(B[:, myrank + 1])
MPI.Barrier(comm)
B = h5read(
fn,
"mygroup/B",
(:, myrank + 1);
driver=HDF5.Drivers.MPIO(comm),
dxpl_mpio=:collective
)
@test A == vec(B)
end
MPI.Finalize()
end # testset mpio
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using HDF5
using Test
@testset "non-allocating methods" begin
fn = tempname()
data = rand(UInt16, 16, 16)
h5open(fn, "w") do h5f
h5f["data"] = data
end
h5open(fn, "r") do h5f
buffer = similar(h5f["data"])
copyto!(buffer, h5f["data"])
@test isequal(buffer, data)
# Consider making this a view later
v = h5f["data"][1:4, 1:4]
buffer = similar(v)
@test size(buffer) == (4, 4)
copyto!(buffer, v)
@test isequal(buffer, @view(data[1:4, 1:4]))
@test size(similar(h5f["data"], Int16)) == size(h5f["data"])
@test size(similar(h5f["data"], 5, 6)) == (5, 6)
@test size(similar(h5f["data"], Int16, 8, 7)) == (8, 7)
@test size(similar(h5f["data"], Int16, 8, 7; normalize=false)) == (8, 7)
@test_broken size(similar(h5f["data"], Int8, 8, 7)) == (8, 7)
@test size(similar(h5f["data"], (5, 6))) == (5, 6)
@test size(similar(h5f["data"], Int16, (8, 7))) == (8, 7)
@test size(similar(h5f["data"], Int16, (8, 7); normalize=false)) == (8, 7)
@test size(similar(h5f["data"], Int16, 0x8, 0x7; normalize=false)) == (8, 7)
end
rm(fn)
end
# ---- HDF5.jl v0.17.2 (MIT) -- next file ----
using Test
using HDF5
using HDF5.API
@testset "Object API" begin
fn = tempname()
# Write some data
h5open(fn, "w") do h5f
h5f["data"] = 5
h5f["lore"] = 9.0
attrs(h5f["lore"])["evil"] = true
end
# Read the data
h5open(fn, "r") do h5f
@test API.h5o_exists_by_name(h5f, "data")
@test API.h5o_exists_by_name(h5f, "lore")
@test_throws API.H5Error API.h5o_exists_by_name(h5f, "noonian")
loc_id = API.h5o_open(h5f, "data", API.H5P_DEFAULT)
try
@test loc_id > 0
oinfo = API.h5o_get_info(loc_id)
@test oinfo.num_attrs == 0
@test oinfo.type == API.H5O_TYPE_DATASET
oinfo1 = API.h5o_get_info1(loc_id)
@test oinfo1.num_attrs == 0
@test oinfo1.type == API.H5O_TYPE_DATASET
@static if HDF5.API.h5_get_libversion() >= v"1.12.0"
oninfo = API.h5o_get_native_info(loc_id)
@test oninfo.hdr.version > 0
@test oninfo.hdr.nmesgs > 0
@test oninfo.hdr.nchunks > 0
@test oninfo.hdr.flags > 0
@test oninfo.hdr.space.total > 0
@test oninfo.hdr.space.meta > 0
@test oninfo.hdr.space.mesg > 0
@test oninfo.hdr.space.free > 0
@test oninfo.hdr.mesg.present > 0
end
finally
API.h5o_close(loc_id)
end
oinfo = API.h5o_get_info_by_name(h5f, ".")
@test oinfo.type == API.H5O_TYPE_GROUP
oinfo = API.h5o_get_info_by_name(h5f, "lore")
@test oinfo.num_attrs == 1
@static if HDF5.API.h5_get_libversion() >= v"1.12.0"
oninfo = API.h5o_get_native_info_by_name(h5f, "lore")
@test oninfo.hdr.version > 0
@test oninfo.hdr.nmesgs > 0
@test oninfo.hdr.nchunks > 0
@test oninfo.hdr.flags > 0
@test oninfo.hdr.space.total > 0
@test oninfo.hdr.space.meta > 0
@test oninfo.hdr.space.mesg > 0
@test oninfo.hdr.space.free > 0
@test oninfo.hdr.mesg.present > 0
end
oinfo = API.h5o_get_info_by_idx(h5f, ".", API.H5_INDEX_NAME, API.H5_ITER_INC, 0)
@test oinfo.num_attrs == 0
@static if HDF5.API.h5_get_libversion() >= v"1.12.0"
oninfo = API.h5o_get_native_info_by_idx(
h5f, ".", API.H5_INDEX_NAME, API.H5_ITER_INC, 1
)
@test oninfo.hdr.version > 0
@test oninfo.hdr.nmesgs > 0
@test oninfo.hdr.nchunks > 0
@test oninfo.hdr.flags > 0
@test oninfo.hdr.space.total > 0
@test oninfo.hdr.space.meta > 0
@test oninfo.hdr.space.mesg > 0
@test oninfo.hdr.space.free > 0
@test oninfo.hdr.mesg.present > 0
end
end
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 50830 | using HDF5
using CRC32c
using H5Zblosc
using Test
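# Callback stubs for the h5d_gather/h5d_scatter tests below: each returns herr_t(0) on
# success or herr_t(-1) on failure; the *_data variants succeed only when op_data == 9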
gatherf(dst_buf, dst_buf_bytes_used, op_data) = HDF5.API.herr_t(0)
gatherf_bad(dst_buf, dst_buf_bytes_used, op_data) = HDF5.API.herr_t(-1)
gatherf_data(dst_buf, dst_buf_bytes_used, op_data) = HDF5.API.herr_t((op_data == 9) - 1)
function scatterf(src_buf, src_buf_bytes_used, op_data)
A = [1, 2, 3, 4]
unsafe_store!(src_buf, pointer(A))
unsafe_store!(src_buf_bytes_used, sizeof(A))
return HDF5.API.herr_t(0)
end
scatterf_bad(src_buf, src_buf_bytes_used, op_data) = HDF5.API.herr_t(-1)
function scatterf_data(src_buf, src_buf_bytes_used, op_data)
A = [1, 2, 3, 4]
unsafe_store!(src_buf, pointer(A))
unsafe_store!(src_buf_bytes_used, sizeof(A))
return HDF5.API.herr_t((op_data == 9) - 1)
end
@testset "plain" begin
# Create a new file
fn = tempname()
f = h5open(fn, "w")
@test isopen(f)
# Write scalars
f["Float64"] = 3.2
f["Int16"] = Int16(4)
# compression of empty array (issue #246)
f["compressedempty", shuffle=true, deflate=4] = Int64[]
# compression of zero-dimensional array (pull request #445)
f["compressed_zerodim", shuffle=true, deflate=4] = fill(Int32(42), ())
f["bloscempty", blosc=4] = Int64[]
# test creation of an anonymous dataset
f[nothing] = 5
# Create arrays of different types
A = randn(3, 5)
write(f, "Afloat64", convert(Matrix{Float64}, A))
write(f, "Afloat32", convert(Matrix{Float32}, A))
Ai = rand(1:20, 2, 4)
write(f, "Aint8", convert(Matrix{Int8}, Ai))
f["Aint16"] = convert(Matrix{Int16}, Ai)
write(f, "Aint32", convert(Matrix{Int32}, Ai))
write(f, "Aint64", convert(Matrix{Int64}, Ai))
write(f, "Auint8", convert(Matrix{UInt8}, Ai))
write(f, "Auint16", convert(Matrix{UInt16}, Ai))
# test writing multiple variable (issue #599)
write(f, "Auint32", convert(Matrix{UInt32}, Ai), "Auint64", convert(Matrix{UInt64}, Ai))
# Arrays of bools (pull request #540)
Abool = [false, true, false]
write(f, "Abool", Abool)
salut = "Hi there"
ucode = "uniçº∂e"
write(f, "salut", salut)
write(f, "ucode", ucode)
# Manually write a variable-length string (issue #187)
let
dtype = HDF5.Datatype(HDF5.API.h5t_copy(HDF5.API.H5T_C_S1))
HDF5.API.h5t_set_size(dtype, HDF5.API.H5T_VARIABLE)
HDF5.API.h5t_set_cset(dtype, HDF5.cset(typeof(salut)))
dspace = dataspace(salut)
dset = create_dataset(f, "salut-vlen", dtype, dspace)
GC.@preserve salut begin
HDF5.API.h5d_write(
dset,
dtype,
HDF5.API.H5S_ALL,
HDF5.API.H5S_ALL,
HDF5.API.H5P_DEFAULT,
[pointer(salut)]
)
end
end
# Arrays of strings
salut_split = ["Hi", "there"]
write(f, "salut_split", salut_split)
salut_2d = ["Hi" "there"; "Salut" "friend"]
write(f, "salut_2d", salut_2d)
# Arrays of strings as vlen
vlen = HDF5.VLen(salut_split)
write_dataset(f, "salut_vlen", vlen)
# Arrays of scalars as vlen
vlen_int = [[3], [1], [4]]
vleni = HDF5.VLen(vlen_int)
write_dataset(f, "int_vlen", vleni)
write_attribute(f["int_vlen"], "vlen_attr", vleni)
# Empty arrays
empty = UInt32[]
write(f, "empty", empty)
write(f, nothing, empty)
# Empty strings
empty_string = ""
write(f, "empty_string", empty_string)
# Empty array of strings
empty_string_array = String[]
write(f, "empty_string_array", empty_string_array)
# Array of empty string
empty_array_of_strings = [""]
write(f, "empty_array_of_strings", empty_array_of_strings)
# attributes
species = [["N", "C"]; ["A", "B"]]
attributes(f)["species"] = species
@test read(attributes(f)["species"]) == species
@test attributes(f)["species"][] == species
C∞ = 42
attributes(f)["C∞"] = C∞
dset = f["salut"]
@test !isempty(dset)
label = "This is a string"
attributes(dset)["typeinfo"] = label
@test read(attributes(dset)["typeinfo"]) == label
@test attributes(dset)["typeinfo"][] == label
@test dset["typeinfo"][] == label
close(dset)
# Scalar reference values in attributes
attributes(f)["ref_test"] = HDF5.Reference(f, "empty_array_of_strings")
@test read(attributes(f)["ref_test"]) === HDF5.Reference(f, "empty_array_of_strings")
# Group
g = create_group(f, "mygroup")
# Test dataset with compression
R = rand(1:20, 20, 40)
g["CompressedA", chunk=(5, 6), shuffle=true, deflate=9] = R
g["BloscA", chunk=(5, 6), shuffle=true, blosc=9] = R
close(g)
# Copy group containing dataset
copy_object(f, "mygroup", f, "mygroup2")
# Copy dataset
g = create_group(f, "mygroup3")
copy_object(f["mygroup/CompressedA"], g, "CompressedA")
copy_object(f["mygroup/BloscA"], g, "BloscA")
close(g)
# Writing hyperslabs
dset = create_dataset(
f, "slab", datatype(Float64), dataspace(20, 20, 5); chunk=(5, 5, 1)
)
Xslab = randn(20, 20, 5)
for i in 1:5
dset[:, :, i] = Xslab[:, :, i]
end
dset = create_dataset(
f, nothing, datatype(Float64), dataspace(20, 20, 5); chunk=(5, 5, 1)
)
dset[:, :, :] = 3.0
# More complex hyperslab and assignment with "incorrect" types (issue #34)
d = create_dataset(f, "slab2", datatype(Float64), ((10, 20), (100, 200)); chunk=(1, 1))
d[:, :] = 5
d[1, 1] = 4
# 1d indexing
d = create_dataset(f, "slab3", datatype(Int), ((10,), (-1,)); chunk=(5,))
@test d[:] == zeros(Int, 10)
d[3:5] = 3:5
# Create a dataset designed to be deleted
f["deleteme"] = 17.2
close(f)
@test !isopen(f)
# Test the h5read/write interface, with attributes
W = copy(reshape(1:120, 15, 8))
Wa = Dict("a" => 1, "b" => 2)
h5write(fn, "newgroup/W", W)
h5writeattr(fn, "newgroup/W", Wa)
# Read the file back in
fr = h5open(fn)
x = read(fr, "Float64")
@test x == 3.2 && isa(x, Float64)
y = read(fr, "Int16")
@test y == 4 && isa(y, Int16)
zerodim = read(fr, "compressed_zerodim")
@test zerodim == 42 && isa(zerodim, Int32)
bloscempty = read(fr, "bloscempty")
@test bloscempty == Int64[] && isa(bloscempty, Vector{Int64})
Af32 = read(fr, "Afloat32")
@test convert(Matrix{Float32}, A) == Af32
@test eltype(Af32) == Float32
Af64 = read(fr, "Afloat64")
@test convert(Matrix{Float64}, A) == Af64
@test eltype(Af64) == Float64
@test eltype(fr["Afloat64"]) == Float64 # issue 167
Ai8 = read(fr, "Aint8")
@test Ai == Ai8
@test eltype(Ai8) == Int8
Ai16 = read(fr, "Aint16")
@test Ai == Ai16
@test eltype(Ai16) == Int16
Ai32 = read(fr, "Aint32")
@test Ai == Ai32
@test eltype(Ai32) == Int32
Ai64 = read(fr, "Aint64")
@test Ai == Ai64
@test eltype(Ai64) == Int64
Ai8 = read(fr, "Auint8")
@test Ai == Ai8
@test eltype(Ai8) == UInt8
Ai16 = read(fr, "Auint16")
@test Ai == Ai16
@test eltype(Ai16) == UInt16
Ai32 = read(fr, "Auint32")
@test Ai == Ai32
@test eltype(Ai32) == UInt32
Ai64 = read(fr, "Auint64")
@test Ai == Ai64
@test eltype(Ai64) == UInt64
Abool_read = read(fr, "Abool")
@test Abool_read == Abool
@test eltype(Abool_read) == Bool
salutr = read(fr, "salut")
@test salut == salutr
salutr = read(fr, "salut-vlen")
@test salut == salutr
ucoder = read(fr, "ucode")
@test ucode == ucoder
salut_splitr = read(fr, "salut_split")
@test salut_splitr == salut_split
salut_2dr = read(fr, "salut_2d")
@test salut_2d == salut_2dr
salut_vlenr = read(fr, "salut_vlen")
@test HDF5.vlen_get_buf_size(fr["salut_vlen"]) == 7
@test HDF5.API.h5d_get_access_plist(fr["salut-vlen"]) != 0
#@test salut_vlenr == salut_split
vlen_intr = read(fr, "int_vlen")
@test vlen_intr == vlen_int
vlen_attrr = read(fr["int_vlen"]["vlen_attr"])
@test vlen_attrr == vlen_int
Rr = read(fr, "mygroup/CompressedA")
@test Rr == R
Rr2 = read(fr, "mygroup2/CompressedA")
@test Rr2 == R
Rr3 = read(fr, "mygroup3/CompressedA")
@test Rr3 == R
Rr4 = read(fr, "mygroup/BloscA")
@test Rr4 == R
Rr5 = read(fr, "mygroup2/BloscA")
@test Rr5 == R
Rr6 = read(fr, "mygroup3/BloscA")
@test Rr6 == R
dset = fr["mygroup/CompressedA"]
@test HDF5.get_chunk(dset) == (5, 6)
@test HDF5.name(dset) == "/mygroup/CompressedA"
dset2 = fr["mygroup/BloscA"]
@test HDF5.get_chunk(dset2) == (5, 6)
@test HDF5.name(dset2) == "/mygroup/BloscA"
Xslabr = read(fr, "slab")
@test Xslabr == Xslab
Xslabr = h5read(fn, "slab", (:, :, :)) # issue #87
@test Xslabr == Xslab
Xslab2r = read(fr, "slab2")
target = fill(5, 10, 20)
target[1] = 4
@test Xslab2r == target
dset = fr["slab3"]
@test dset[3:5] == [3:5;]
emptyr = read(fr, "empty")
@test isempty(emptyr)
empty_stringr = read(fr, "empty_string")
@test empty_stringr == empty_string
empty_string_arrayr = read(fr, "empty_string_array")
@test empty_string_arrayr == empty_string_array
empty_array_of_stringsr = read(fr, "empty_array_of_strings")
@test empty_array_of_stringsr == empty_array_of_strings
@test read_attribute(fr, "species") == species
@test read_attribute(fr, "C∞") == C∞
dset = fr["salut"]
@test read_attribute(dset, "typeinfo") == label
close(dset)
# Test ref-based reading
Aref = fr["Afloat64"]
sel = (2:3, 1:2:5)
Asub = Aref[sel...]
@test Asub == A[sel...]
close(Aref)
# Test iteration, name, and parent
for obj in fr
@test HDF5.filename(obj) == fn
n = HDF5.name(obj)
p = parent(obj)
end
# Test reading multiple vars at once
z = read(fr, "Float64", "Int16")
@test z == (3.2, 4)
@test typeof(z) == Tuple{Float64,Int16}
# Test reading entire file at once
z = read(fr)
@test z["Float64"] == 3.2
close(fr)
# Test object deletion
fr = h5open(fn, "r+")
@test haskey(fr, "deleteme")
delete_object(fr, "deleteme")
@test !haskey(fr, "deleteme")
close(fr)
# Test object move
h5open(fn, "r+") do io
io["moveme"] = [1, 2, 3]
create_group(io, "moveto")
end
h5open(fn, "r+") do io
@test haskey(io, "moveme")
@test haskey(io, "moveto") && !haskey(io, "moveto/moveme")
move_link(io, "moveme", io["moveto"])
@test haskey(io, "moveto/moveme") && !haskey(io, "moveme")
end
# Test the h5read interface
Wr = h5read(fn, "newgroup/W")
@test Wr == W
rng = (2:3:15, 3:5)
Wr = h5read(fn, "newgroup/W", rng)
@test Wr == W[rng...]
War = h5readattr(fn, "newgroup/W")
@test War == Wa
# issue #618
# Test that invalid writes treat implicit creation as a transaction, cleaning up the partial
# operation
hid = h5open(fn, "w")
A = rand(3, 3)'
@test !haskey(hid, "A")
@test_throws ArgumentError write(hid, "A", A)
@test !haskey(hid, "A")
dset = create_dataset(hid, "attr", datatype(Int), dataspace(0))
@test !haskey(attributes(dset), "attr")
# broken test - writing attributes does not check that the stride is correct
@test_skip @test_throws ArgumentError write(dset, "attr", A)
@test !haskey(attributes(dset), "attr")
close(hid)
# more do syntax
h5open(fn, "w") do fid
g = create_group(fid, "mygroup")
write(g, "x", 3.2)
end
fid = h5open(fn, "r")
@test keys(fid) == ["mygroup"]
g = fid["mygroup"]
@test keys(g) == ["x"]
close(g)
close(fid)
rm(fn)
# more do syntax: atomic rename version
tmpdir = mktempdir()
outfile = joinpath(tmpdir, "test.h5")
# create a new file
h5rewrite(outfile) do fid
g = create_group(fid, "mygroup")
write(g, "x", 3.3)
end
@test length(readdir(tmpdir)) == 1
h5open(outfile, "r") do fid
@test keys(fid) == ["mygroup"]
@test keys(fid["mygroup"]) == ["x"]
end
# fail to overwrite
@test_throws ErrorException h5rewrite(outfile) do fid
g = create_group(fid, "mygroup")
write(g, "oops", 3.3)
error("failed")
end
@test length(readdir(tmpdir)) == 1
h5open(outfile, "r") do fid
@test keys(fid) == ["mygroup"]
@test keys(fid["mygroup"]) == ["x"]
end
# overwrite
h5rewrite(outfile) do fid
g = create_group(fid, "mygroup")
write(g, "y", 3.3)
end
@test length(readdir(tmpdir)) == 1
h5open(outfile, "r") do fid
@test keys(fid) == ["mygroup"]
@test keys(fid["mygroup"]) == ["y"]
end
rm(tmpdir; recursive=true)
test_files = joinpath(@__DIR__, "test_files")
d = h5read(joinpath(test_files, "compound.h5"), "/data")
@test typeof(d[1]) == NamedTuple{
(:wgt, :xyz, :uvw, :E),Tuple{Float64,Array{Float64,1},Array{Float64,1},Float64}
}
# get-datasets
fn = tempname()
fd = h5open(fn, "w")
fd["level_0"] = [1, 2, 3]
grp = create_group(fd, "mygroup")
fd["mygroup/level_1"] = [4, 5]
grp2 = create_group(grp, "deep_group")
fd["mygroup/deep_group/level_2"] = [6.0, 7.0]
datasets = HDF5.get_datasets(fd)
@test sort(map(HDF5.name, datasets)) ==
sort(["/level_0", "/mygroup/deep_group/level_2", "/mygroup/level_1"])
close(fd)
rm(fn)
# File creation and access property lists
fid = h5open(
fn,
"w";
userblock=1024,
libver_bounds=(HDF5.API.H5F_LIBVER_EARLIEST, HDF5.API.H5F_LIBVER_LATEST)
)
write(fid, "intarray", [1, 2, 3])
close(fid)
h5open(
fn, "r"; libver_bounds=(HDF5.API.H5F_LIBVER_EARLIEST, HDF5.API.H5F_LIBVER_LATEST)
) do fid
intarray = read(fid, "intarray")
@test intarray == [1, 2, 3]
end
# Test null terminated ASCII string (e.g. exported by h5py) #332
h5open(joinpath(test_files, "nullterm_ascii.h5"), "r") do fid
str = read(fid["test"])
@test str == "Hello World"
end
@test HDF5.unpad(UInt8[0x43, 0x43, 0x41], 1) == "CCA"
# Test the h5read/write interface with a filename as a first argument, when
# the file does not exist
rm(fn)
h5write(fn, "newgroup/W", W)
Wr = h5read(fn, "newgroup/W")
@test Wr == W
close(f)
rm(fn)
# Test dataspace convenience versions of create_dataset
try
h5open(fn, "w") do f
create_dataset(f, "test", Int, (128, 32))
create_dataset(f, "test2", Float64, 128, 64)
@test size(f["test"]) == (128, 32)
@test size(f["test2"]) == (128, 64)
end
finally
rm(fn)
end
@testset "h5d_fill" begin
val = 5
h5open(fn, "w") do f
d = create_dataset(f, "dataset", datatype(Int), dataspace(6, 6); chunk=(2, 3))
buf = Array{Int,2}(undef, (6, 6))
dtype = datatype(Int)
HDF5.API.h5d_fill(Ref(val), dtype, buf, datatype(Int), dataspace(d))
@test all(buf .== 5)
HDF5.API.h5d_write(
d, dtype, HDF5.API.H5S_ALL, HDF5.API.H5S_ALL, HDF5.API.H5P_DEFAULT, buf
)
end
h5open(fn, "r") do f
@test all(f["dataset"][:, :] .== 5)
end
rm(fn)
end # testset "Test h5d_fill
@testset "h5d_gather" begin
src_buf = rand(Int, (4, 4))
dst_buf = Array{Int,2}(undef, (4, 4))
h5open(fn, "w") do f
d = create_dataset(f, "dataset", datatype(Int), dataspace(4, 4); chunk=(2, 2))
@test isnothing(
HDF5.API.h5d_gather(
dataspace(d),
src_buf,
datatype(Int),
sizeof(dst_buf),
dst_buf,
C_NULL,
C_NULL
)
)
@test src_buf == dst_buf
gatherf_ptr = @cfunction(
gatherf, HDF5.API.herr_t, (Ptr{Nothing}, Csize_t, Ptr{Nothing})
)
@test isnothing(
HDF5.API.h5d_gather(
dataspace(d),
src_buf,
datatype(Int),
sizeof(dst_buf) ÷ 2,
dst_buf,
gatherf_ptr,
C_NULL
)
)
gatherf_bad_ptr = @cfunction(
gatherf_bad, HDF5.API.herr_t, (Ptr{Nothing}, Csize_t, Ptr{Nothing})
)
@test_throws HDF5.API.H5Error HDF5.API.h5d_gather(
dataspace(d),
src_buf,
datatype(Int),
sizeof(dst_buf) ÷ 2,
dst_buf,
gatherf_bad_ptr,
C_NULL
)
gatherf_data_ptr = @cfunction(
gatherf_data, HDF5.API.herr_t, (Ptr{Nothing}, Csize_t, Ref{Int})
)
@test isnothing(
HDF5.API.h5d_gather(
dataspace(d),
src_buf,
datatype(Int),
sizeof(dst_buf) ÷ 2,
dst_buf,
gatherf_data_ptr,
Ref(9)
)
)
@test_throws HDF5.API.H5Error HDF5.API.h5d_gather(
dataspace(d),
src_buf,
datatype(Int),
sizeof(dst_buf) ÷ 2,
dst_buf,
gatherf_data_ptr,
10
)
end
rm(fn)
end
@testset "h5d_scatter" begin
h5open(fn, "w") do f
dst_buf = Array{Int,2}(undef, (4, 4))
d = create_dataset(f, "dataset", datatype(Int), dataspace(4, 4); chunk=(2, 2))
scatterf_ptr = @cfunction(
scatterf, HDF5.API.herr_t, (Ptr{Ptr{Nothing}}, Ptr{Csize_t}, Ptr{Nothing})
)
@test isnothing(
HDF5.API.h5d_scatter(
scatterf_ptr, C_NULL, datatype(Int), dataspace(d), dst_buf
)
)
scatterf_bad_ptr = @cfunction(
scatterf_bad,
HDF5.API.herr_t,
(Ptr{Ptr{Nothing}}, Ptr{Csize_t}, Ptr{Nothing})
)
@test_throws HDF5.API.H5Error HDF5.API.h5d_scatter(
scatterf_bad_ptr, C_NULL, datatype(Int), dataspace(d), dst_buf
)
scatterf_data_ptr = @cfunction(
scatterf_data, HDF5.API.herr_t, (Ptr{Ptr{Int}}, Ptr{Csize_t}, Ref{Int})
)
@test isnothing(
HDF5.API.h5d_scatter(
scatterf_data_ptr, Ref(9), datatype(Int), dataspace(d), dst_buf
)
)
end
rm(fn)
end
# Test that switching time tracking off results in identical files
fn1 = tempname()
fn2 = tempname()
h5open(fn1, "w") do f
f["x", obj_track_times=false] = [1, 2, 3]
end
sleep(1)
h5open(fn2, "w") do f
f["x", obj_track_times=false] = [1, 2, 3]
end
@test open(crc32c, fn1) == open(crc32c, fn2)
rm(fn1)
rm(fn2)
end # testset plain
@testset "complex" begin
HDF5.enable_complex_support()
fn = tempname()
f = h5open(fn, "w")
f["ComplexF64"] = 1.0 + 2.0im
attributes(f["ComplexF64"])["ComplexInt64"] = 1im
Acmplx = rand(ComplexF64, 3, 5)
write(f, "Acmplx64", convert(Matrix{ComplexF64}, Acmplx))
write(f, "Acmplx32", convert(Matrix{ComplexF32}, Acmplx))
dset = create_dataset(
f, "Acmplx64_hyperslab", datatype(Complex{Float64}), dataspace(Acmplx)
)
for i in 1:size(Acmplx, 2)
dset[:, i] = Acmplx[:, i]
end
HDF5.disable_complex_support()
@test_throws ErrorException f["_ComplexF64"] = 1.0 + 2.0im
@test_throws ErrorException write(f, "_Acmplx64", convert(Matrix{ComplexF64}, Acmplx))
@test_throws ErrorException write(f, "_Acmplx32", convert(Matrix{ComplexF32}, Acmplx))
HDF5.enable_complex_support()
close(f)
fr = h5open(fn)
z = read(fr, "ComplexF64")
@test z == 1.0 + 2.0im && isa(z, ComplexF64)
z_attrs = attributes(fr["ComplexF64"])
@test read(z_attrs["ComplexInt64"]) == 1im
Acmplx32 = read(fr, "Acmplx32")
@test convert(Matrix{ComplexF32}, Acmplx) == Acmplx32
@test eltype(Acmplx32) == ComplexF32
Acmplx64 = read(fr, "Acmplx64")
@test convert(Matrix{ComplexF64}, Acmplx) == Acmplx64
@test eltype(Acmplx64) == ComplexF64
dset = fr["Acmplx64_hyperslab"]
Acmplx64_hyperslab = zeros(eltype(dset), size(dset))
for i in 1:size(dset, 2)
Acmplx64_hyperslab[:, i] = dset[:, i]
end
@test convert(Matrix{ComplexF64}, Acmplx) == Acmplx64_hyperslab
HDF5.disable_complex_support()
z = read(fr, "ComplexF64")
@test isa(z, NamedTuple{(:r, :i),Tuple{Float64,Float64}})
Acmplx32 = read(fr, "Acmplx32")
@test eltype(Acmplx32) == NamedTuple{(:r, :i),Tuple{Float32,Float32}}
Acmplx64 = read(fr, "Acmplx64")
@test eltype(Acmplx64) == NamedTuple{(:r, :i),Tuple{Float64,Float64}}
close(fr)
HDF5.enable_complex_support()
end
# test strings with null and undefined references
@testset "undefined and null" begin
fn = tempname()
f = h5open(fn, "w")
# don't silently truncate data
@test_throws ArgumentError write(f, "test", ["hello", "there", "\0"])
@test_throws ArgumentError write(f, "trunc1", "\0")
@test_throws ArgumentError write(f, "trunc2", "trunc\0ateme")
# test writing uninitialized string arrays
undefstrarr = similar(Vector(1:3), String) # strs = String[#undef, #undef, #undef]
@test_throws UndefRefError write(f, "undef", undefstrarr)
close(f)
rm(fn)
end # testset null and undefined
# test writing abstract arrays
@testset "abstract arrays" begin
# test writing reinterpreted data
fn = tempname()
try
h5open(fn, "w") do f
data = reinterpret(UInt8, [true, false, false])
write(f, "reinterpret array", data)
end
@test h5open(fn, "r") do f
read(f, "reinterpret array")
end == UInt8[0x01, 0x00, 0x00]
finally
rm(fn)
end
# don't silently fail for arrays with a different stride
fn = tempname()
try
data = rand(UInt16, 2, 3)
pdv_data = PermutedDimsArray(data, (2, 1))
@test_throws ArgumentError h5write(fn, "pdv_data", pdv_data)
finally
rm(fn)
end
# test alignment
fn = tempname()
h5open(fn, "w"; alignment=(0, 8)) do fid
fid["x"] = zeros(10, 10)
end
end # writing abstract arrays
# issue #705
@testset "empty and 0-size arrays" begin
fn = tempname()
hfile = h5open(fn, "w")
# Write datasets with various 0-sizes
write(hfile, "empty", HDF5.EmptyArray{Int64}()) # HDF5 empty
write(hfile, "zerodim", fill(1.0π)) # 0-dimensional
write(hfile, "zerovec", zeros(0)) # 1-dimensional, size 0
write(hfile, "zeromat", zeros(0, 0)) # 2-dimensional, size 0
write(hfile, "zeromat2", zeros(0, 1)) # 2-dimensional, size 0 with non-zero axis
dempty = hfile["empty"]
dzerodim = hfile["zerodim"]
dzerovec = hfile["zerovec"]
dzeromat = hfile["zeromat"]
dzeromat2 = hfile["zeromat2"]
# Test that eltype is preserved (especially for EmptyArray)
@test eltype(dempty) == Int64
@test eltype(dzerodim) == Float64
@test eltype(dzerovec) == Float64
@test eltype(dzeromat) == Float64
@test eltype(dzeromat2) == Float64
# Test sizes are as expected
@test size(dempty) == ()
@test size(dzerovec) == (0,)
@test size(dzeromat) == (0, 0)
@test size(dzeromat2) == (0, 1)
@test HDF5.isnull(dempty)
@test !HDF5.isnull(dzerovec)
@test !HDF5.isnull(dzeromat)
@test !HDF5.isnull(dzeromat2)
# Reading back must preserve emptiness
@test read(dempty) isa HDF5.EmptyArray
# but 0-dimensional Array{T,0} are stored as HDF5 scalar
@test size(dzerodim) == ()
@test !HDF5.isnull(dzerodim)
@test read(dzerodim) == 1.0π
# Similar tests for writing to attributes
write(dempty, "attr", HDF5.EmptyArray{Float64}())
write(dzerodim, "attr", fill(1.0ℯ))
write(dzerovec, "attr", zeros(Int64, 0))
write(dzeromat, "attr", zeros(Int64, 0, 0))
write(dzeromat2, "attr", zeros(Int64, 0, 1))
aempty = dempty["attr"]
azerodim = dzerodim["attr"]
azerovec = dzerovec["attr"]
azeromat = dzeromat["attr"]
azeromat2 = dzeromat2["attr"]
# Test that eltype is preserved (especially for EmptyArray)
@test eltype(aempty) == Float64
@test eltype(azerodim) == Float64
@test eltype(azerovec) == Int64
@test eltype(azeromat) == Int64
@test eltype(azeromat2) == Int64
# Test sizes are as expected
@test size(aempty) == ()
@test size(azerovec) == (0,)
@test size(azeromat) == (0, 0)
@test size(azeromat2) == (0, 1)
@test HDF5.isnull(aempty)
@test !HDF5.isnull(azerovec)
@test !HDF5.isnull(azeromat)
@test !HDF5.isnull(azeromat2)
# Reading back must preserve emptiness
@test read(aempty) isa HDF5.EmptyArray
# but 0-dimensional Array{T,0} are stored as HDF5 scalar
@test size(azerodim) == ()
@test !HDF5.isnull(azerodim)
@test read(azerodim) == 1.0ℯ
# Concatenation of EmptyArrays is not supported
x = HDF5.EmptyArray{Float64}()
@test_throws ErrorException [x x]
@test_throws ErrorException [x; x]
@test_throws ErrorException [x x; x x]
close(hfile)
rm(fn)
# check that printing EmptyArray doesn't error
buf = IOBuffer()
show(buf, HDF5.EmptyArray{Int64}())
@test String(take!(buf)) == "HDF5.EmptyArray{Int64}()"
show(buf, MIME"text/plain"(), HDF5.EmptyArray{Int64}())
@test String(take!(buf)) == "HDF5.EmptyArray{Int64}()"
end # empty and 0-size arrays
@testset "generic read of native types" begin
fn = tempname()
hfile = h5open(fn, "w")
dtype_varstring = HDF5.Datatype(HDF5.API.h5t_copy(HDF5.API.H5T_C_S1))
HDF5.API.h5t_set_size(dtype_varstring, HDF5.API.H5T_VARIABLE)
write(hfile, "uint8_array", UInt8[(1:8)...])
write(hfile, "bool_scalar", true)
fixstring = "fix"
varstring = "var"
write(hfile, "fixed_string", fixstring)
vardset = create_dataset(
hfile, "variable_string", dtype_varstring, dataspace(varstring)
)
GC.@preserve varstring begin
HDF5.API.h5d_write(
vardset,
dtype_varstring,
HDF5.API.H5S_ALL,
HDF5.API.H5S_ALL,
HDF5.API.H5P_DEFAULT,
[pointer(varstring)]
)
end
flush(hfile)
close(dtype_varstring)
# generic read() handles concrete types with definite sizes transparently
d = read(hfile["uint8_array"], UInt8)
@test d isa Vector{UInt8}
@test d == 1:8
d = read(hfile["bool_scalar"], Bool)
@test d isa Bool
@test d == true
d = read(hfile["fixed_string"], HDF5.FixedString{length(fixstring),0})
@test d isa String
@test d == fixstring
d = read(hfile["variable_string"], Cstring)
@test d isa String
@test d == varstring
# will also accept memory-compatible reinterpretations
d = read(hfile["uint8_array"], Int8)
@test d isa Vector{Int8}
@test d == 1:8
d = read(hfile["bool_scalar"], UInt8)
@test d isa UInt8
@test d == 0x1
# but should throw on non-compatible types
@test_throws ErrorException("""
Type size mismatch
sizeof(UInt16) = 2
sizeof($(sprint(show, datatype(UInt8)))) = 1
""") read(hfile["uint8_array"], UInt16)
# Strings are not fixed size, but generic read still handles them if given the correct
# underlying FixedString or Cstring type; a method overload makes String work, too.
d = read(hfile["fixed_string"], String)
@test d isa String
@test d == fixstring
d = read(hfile["variable_string"], String)
@test d isa String
@test d == varstring
close(hfile)
rm(fn)
end # generic read of native types
@testset "show" begin
fn = tempname()
# First create data objects and make sure they print useful outputs
hfile = h5open(fn, "w"; swmr=true)
@test sprint(show, hfile) == "HDF5.File: (read-write, swmr) $fn"
group = create_group(hfile, "group")
@test sprint(show, group) == "HDF5.Group: /group (file: $fn)"
dset = create_dataset(group, "dset", datatype(Int), dataspace((1,)))
@test sprint(show, dset) == "HDF5.Dataset: /group/dset (file: $fn xfer_mode: 0)"
meta = create_attribute(dset, "meta", datatype(Bool), dataspace((1,)))
@test sprint(show, meta) == "HDF5.Attribute: meta"
dsetattrs = attributes(dset)
@test sprint(show, dsetattrs) ==
"Attributes of HDF5.Dataset: /group/dset (file: $fn xfer_mode: 0)"
prop = HDF5.init!(HDF5.LinkCreateProperties())
@test sprint(show, prop) == """
HDF5.LinkCreateProperties(
create_intermediate_group = false,
char_encoding = :ascii,
)"""
prop = HDF5.DatasetCreateProperties()
@test sprint(show, prop) == "HDF5.DatasetCreateProperties()"
dtype = HDF5.Datatype(HDF5.API.h5t_copy(HDF5.API.H5T_IEEE_F64LE))
@test sprint(show, dtype) == "HDF5.Datatype: H5T_IEEE_F64LE"
commit_datatype(hfile, "type", dtype)
@test sprint(show, dtype) == "HDF5.Datatype: /type H5T_IEEE_F64LE"
dtypemeta = create_attribute(dtype, "dtypemeta", datatype(Bool), dataspace((1,)))
@test sprint(show, dtypemeta) == "HDF5.Attribute: dtypemeta"
dtypeattrs = attributes(dtype)
@test sprint(show, dtypeattrs) == "Attributes of HDF5.Datatype: /type H5T_IEEE_F64LE"
dspace_null = HDF5.Dataspace(HDF5.API.h5s_create(HDF5.API.H5S_NULL))
dspace_scal = HDF5.Dataspace(HDF5.API.h5s_create(HDF5.API.H5S_SCALAR))
dspace_norm = dataspace((100, 4))
dspace_maxd = dataspace((100, 4); max_dims=(256, 4))
dspace_slab = HDF5.hyperslab(dataspace((100, 4)), 1:20:100, 1:4)
if HDF5.libversion ≥ v"1.10.7"
dspace_irrg = HDF5.Dataspace(
HDF5.API.h5s_combine_select(
HDF5.API.h5s_copy(dspace_slab),
HDF5.API.H5S_SELECT_OR,
HDF5.hyperslab(dataspace((100, 4)), 2, 2)
)
)
@test sprint(show, dspace_irrg) == "HDF5.Dataspace: (100, 4) [irregular selection]"
end
@test sprint(show, dspace_null) == "HDF5.Dataspace: H5S_NULL"
@test sprint(show, dspace_scal) == "HDF5.Dataspace: H5S_SCALAR"
@test sprint(show, dspace_norm) == "HDF5.Dataspace: (100, 4)"
@test sprint(show, dspace_maxd) == "HDF5.Dataspace: (100, 4) / (256, 4)"
@test sprint(show, dspace_slab) == "HDF5.Dataspace: (1:20:81, 1:4) / (1:100, 1:4)"
# Now test printing after closing each object
close(dspace_null)
@test sprint(show, dspace_null) == "HDF5.Dataspace: (invalid)"
close(dtype)
@test sprint(show, dtype) == "HDF5.Datatype: (invalid)"
close(prop)
@test sprint(show, prop) == "HDF5.DatasetCreateProperties: (invalid)"
close(meta)
@test sprint(show, meta) == "HDF5.Attribute: (invalid)"
close(dtypemeta)
@test sprint(show, dtypemeta) == "HDF5.Attribute: (invalid)"
close(dset)
@test sprint(show, dset) == "HDF5.Dataset: (invalid)"
@test sprint(show, dsetattrs) == "Attributes of HDF5.Dataset: (invalid)"
close(group)
@test sprint(show, group) == "HDF5.Group: (invalid)"
close(hfile)
@test sprint(show, hfile) == "HDF5.File: (closed) $fn"
# Go back and check different access modes for file printing
hfile = h5open(fn, "r+"; swmr=true)
@test sprint(show, hfile) == "HDF5.File: (read-write, swmr) $fn"
close(hfile)
hfile = h5open(fn, "r"; swmr=true)
@test sprint(show, hfile) == "HDF5.File: (read-only, swmr) $fn"
close(hfile)
hfile = h5open(fn, "r")
@test sprint(show, hfile) == "HDF5.File: (read-only) $fn"
close(hfile)
hfile = h5open(fn, "cw")
@test sprint(show, hfile) == "HDF5.File: (read-write) $fn"
close(hfile)
rm(fn)
# Make an interesting file tree
hfile = h5open(fn, "w")
# file level
hfile["version"] = 1.0
attributes(hfile)["creator"] = "HDF5.jl"
# group level
create_group(hfile, "inner")
attributes(hfile["inner"])["dirty"] = true
# dataset level
hfile["inner/data"] = collect(-5:5)
attributes(hfile["inner/data"])["mode"] = 1
# non-trivial committed datatype
# TODO: print more datatype information
tmeta = HDF5.Datatype(
HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(Int) + sizeof(Float64))
)
HDF5.API.h5t_insert(tmeta, "scale", 0, HDF5.hdf5_type_id(Int))
HDF5.API.h5t_insert(tmeta, "bias", sizeof(Int), HDF5.hdf5_type_id(Float64))
tstr = datatype("fixed")
t = HDF5.Datatype(
HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(tmeta) + sizeof(tstr))
)
HDF5.API.h5t_insert(t, "meta", 0, tmeta)
HDF5.API.h5t_insert(t, "type", sizeof(tmeta), tstr)
commit_datatype(hfile, "dtype", t)
buf = IOBuffer()
iobuf = IOContext(buf, :limit => true, :module => Main)
show3(io::IO, x) = show(IOContext(io, iobuf), MIME"text/plain"(), x)
HDF5.show_tree(iobuf, hfile)
msg = String(take!(buf))
@test occursin(
r"""
🗂️ HDF5.File: .*$
├─ 🏷️ creator
├─ 📄 dtype
├─ 📂 inner
│ ├─ 🏷️ dirty
│ └─ 🔢 data
│ └─ 🏷️ mode
└─ 🔢 version"""m,
msg
)
@test sprint(show3, hfile) == msg
HDF5.show_tree(iobuf, hfile; attributes=false)
@test occursin(
r"""
🗂️ HDF5.File: .*$
├─ 📄 dtype
├─ 📂 inner
│ └─ 🔢 data
└─ 🔢 version"""m,
String(take!(buf))
)
HDF5.show_tree(iobuf, attributes(hfile))
msg = String(take!(buf))
@test occursin(
r"""
🗂️ Attributes of HDF5.File: .*$
└─ 🏷️ creator"""m,
msg
)
@test sprint(show3, attributes(hfile)) == msg
HDF5.show_tree(iobuf, hfile["inner"])
msg = String(take!(buf))
@test occursin(
r"""
📂 HDF5.Group: /inner .*$
├─ 🏷️ dirty
└─ 🔢 data
└─ 🏷️ mode"""m,
msg
)
@test sprint(show3, hfile["inner"]) == msg
HDF5.show_tree(iobuf, hfile["inner"]; attributes=false)
@test occursin(
r"""
📂 HDF5.Group: /inner .*$
└─ 🔢 data"""m,
String(take!(buf))
)
HDF5.show_tree(iobuf, hfile["inner/data"])
msg = String(take!(buf))
@test occursin(
r"""
🔢 HDF5.Dataset: /inner/data .*$
└─ 🏷️ mode"""m,
msg
)
# xfer_mode changes between printings, so need regex again
@test occursin(
r"""
🔢 HDF5.Dataset: /inner/data .*$
└─ 🏷️ mode"""m,
sprint(show3, hfile["inner/data"])
)
HDF5.show_tree(iobuf, hfile["inner/data"]; attributes=false)
@test occursin(
r"""
🔢 HDF5.Dataset: /inner/data .*$"""m,
String(take!(buf))
)
HDF5.show_tree(iobuf, hfile["dtype"])
@test occursin(
r"""
📄 HDF5.Datatype: /dtype""",
String(take!(buf))
)
HDF5.show_tree(iobuf, hfile["inner/data"]["mode"]; attributes=true)
@test occursin(
r"""
🏷️ HDF5.Attribute: mode""",
String(take!(buf))
)
# configurable options
# no emoji icons
HDF5.SHOW_TREE_ICONS[] = false
@test occursin(
r"""
\[F\] HDF5.File: .*$
├─ \[A\] creator
├─ \[T\] dtype
├─ \[G\] inner
│ ├─ \[A\] dirty
│ └─ \[D\] data
│ └─ \[A\] mode
└─ \[D\] version"""m,
sprint(show3, hfile)
)
HDF5.SHOW_TREE_ICONS[] = true
# no tree printing
show(IOContext(iobuf, :compact => true), MIME"text/plain"(), hfile)
msg = String(take!(buf))
@test msg == sprint(show, hfile)
close(hfile)
# Now test the print-limiting heuristics for large/complex datasets
# group with a large number of children; tests child entry truncation heuristic
h5open(fn, "w") do hfile
dt, ds = datatype(Int), dataspace(())
opts = Iterators.product('A':'Z', 1:9)
for ii in opts
create_dataset(hfile, string(ii...), dt, ds)
end
def = HDF5.SHOW_TREE_MAX_CHILDREN[]
HDF5.SHOW_TREE_MAX_CHILDREN[] = 5
HDF5.show_tree(iobuf, hfile)
msg = String(take!(buf))
@test occursin(
r"""
🗂️ HDF5.File: .*$
├─ 🔢 A1
├─ 🔢 A2
├─ 🔢 A3
├─ 🔢 A4
├─ 🔢 A5
└─ \(229 more children\)"""m,
msg
)
@test sprint(show3, hfile) == msg
HDF5.SHOW_TREE_MAX_CHILDREN[] = def
# IOContext can halt limiting
HDF5.show_tree(IOContext(iobuf, :limit => false), hfile)
@test countlines(seekstart(buf)) == length(opts) + 1
truncate(buf, 0)
end
# deeply nested set of elements; test that the tree is truncated
h5open(fn, "w") do hfile
p = HDF5.root(hfile)::HDF5.Group
opts = 'A':'Z'
for ii in opts
p = create_group(p, string(ii))
end
def = HDF5.SHOW_TREE_MAX_DEPTH[]
HDF5.SHOW_TREE_MAX_DEPTH[] = 5
HDF5.show_tree(iobuf, hfile)
msg = String(take!(buf))
@test occursin(
r"""
🗂️ HDF5.File: .*$
└─ 📂 A
└─ 📂 B
└─ 📂 C
└─ 📂 D
└─ 📂 E
└─ \(1 child\)"""m,
msg
)
@test sprint(show3, hfile) == msg
HDF5.SHOW_TREE_MAX_DEPTH[] = def
# IOContext can halt limiting
HDF5.show_tree(IOContext(iobuf, :limit => false), hfile)
@test countlines(seekstart(buf)) == length(opts) + 1
truncate(buf, 0)
end
rm(fn)
end # show tests
@testset "split1" begin
@test HDF5.split1("/") == ("/", "")
@test HDF5.split1("a") == ("a", "")
@test HDF5.split1("/a/b/c") == ("/", "a/b/c")
@test HDF5.split1("a/b/c") == ("a", "b/c")
@test HDF5.split1(GenericString("a")) == ("a", "")
@test HDF5.split1(GenericString("/a/b/c")) == ("/", "a/b/c")
@test HDF5.split1(GenericString("a/b/c")) == ("a", "b/c")
# The following two paths have the same graphemes but different code unit structures:
# the first one is
# <latin small letter a with circumflex> "/" <greek small letter alpha>
# while the second one is
# "a" <combining circumflex accent> "/" <greek small letter alpha>
circa = "â" # <latin small leter a with circumflex>
acomb = "â" # "a" + <combining circumflex accent>
path1 = circa * "/α"
path2 = acomb * "/α"
# Sanity checks that the two strings are different but equivalent under normalization
@test path1 != path2
@test Base.Unicode.normalize(path1, :NFC) == Base.Unicode.normalize(path2, :NFC)
# Check split1 operates correctly
@test HDF5.split1(path1) == (circa, "α")
@test HDF5.split1(path2) == (acomb, "α")
@test HDF5.split1("/" * path1) == ("/", path1)
@test HDF5.split1("/" * path2) == ("/", path2)
end # split1 tests
# Also tests AbstractString interface
@testset "haskey" begin
fn = tempname()
hfile = h5open(fn, "w")
group1 = create_group(hfile, "group1")
group2 = create_group(group1, "group2")
@test haskey(hfile, "/")
@test haskey(hfile, GenericString("group1"))
@test !haskey(hfile, GenericString("groupna"))
@test haskey(hfile, "group1/group2")
@test !haskey(hfile, "group1/groupna")
@test_throws KeyError hfile["nothing"]
dset1 = create_dataset(hfile, "dset1", datatype(Int), dataspace((1,)))
dset2 = create_dataset(group1, "dset2", datatype(Int), dataspace((1,)))
@test haskey(hfile, "dset1")
@test !haskey(hfile, "dsetna")
@test haskey(hfile, "group1/dset2")
@test !haskey(hfile, "group1/dsetna")
meta1 = create_attribute(dset1, "meta1", datatype(Bool), dataspace((1,)))
@test haskey(dset1, "meta1")
@test !haskey(dset1, "metana")
@test_throws KeyError dset1["nothing"]
attribs = attributes(hfile)
attribs["test1"] = true
attribs["test2"] = "foo"
@test haskey(attribs, "test1")
@test haskey(attribs, "test2")
@test !haskey(attribs, "testna")
@test_throws KeyError attribs["nothing"]
attribs = attributes(dset2)
attribs["attr"] = "foo"
@test haskey(attribs, GenericString("attr"))
close(hfile)
rm(fn)
end # haskey tests
@testset "AbstractString" begin
fn = GenericString(tempname())
hfile = h5open(fn, "w")
close(hfile)
hfile = h5open(fn)
close(hfile)
hfile = h5open(fn, "w")
@test_nowarn create_group(hfile, GenericString("group1"))
@test_nowarn create_dataset(
hfile, GenericString("dset1"), datatype(Int), dataspace((1,))
)
@test_nowarn create_dataset(hfile, GenericString("dset2"), 1)
@test_nowarn hfile[GenericString("group1")]
@test_nowarn hfile[GenericString("dset1")]
dset1 = hfile["dset1"]
@test_nowarn create_attribute(
dset1, GenericString("meta1"), datatype(Bool), dataspace((1,))
)
@test_nowarn create_attribute(dset1, GenericString("meta2"), 1)
@test_nowarn dset1[GenericString("meta1")]
@test_nowarn dset1[GenericString("x")] = 2
array_of_strings = ["test",]
write(hfile, "array_of_strings", array_of_strings)
@test_nowarn attributes(hfile)[GenericString("ref_test")] = HDF5.Reference(
hfile, GenericString("array_of_strings")
)
@test read(attributes(hfile)[GenericString("ref_test")]) ===
HDF5.Reference(hfile, "array_of_strings")
hfile[GenericString("test")] = 17.2
@test_nowarn delete_object(hfile, GenericString("test"))
@test_nowarn delete_attribute(dset1, GenericString("meta1"))
# transient types
memtype_id = HDF5.API.h5t_copy(HDF5.API.H5T_NATIVE_DOUBLE)
dt = HDF5.Datatype(memtype_id)
@test !HDF5.API.h5t_committed(dt)
commit_datatype(hfile, GenericString("dt"), dt)
@test HDF5.API.h5t_committed(dt)
dt = datatype(Int)
ds = dataspace(0)
d = create_dataset(hfile, GenericString("d"), dt, ds)
g = create_group(hfile, GenericString("g"))
a = create_attribute(hfile, GenericString("a"), dt, ds)
for obj in (d, g)
@test_nowarn write_attribute(obj, GenericString("a"), 1)
@test_nowarn read_attribute(obj, GenericString("a"))
@test_nowarn write(obj, GenericString("aa"), 1)
@test_nowarn attributes(obj)["attr1"] = GenericString("b")
end
@test_nowarn write(d, "attr2", GenericString("c"))
@test_nowarn write_dataset(g, GenericString("ag"), GenericString("gg"))
@test_nowarn write_dataset(
g, GenericString("ag_array"), [GenericString("a1"), GenericString("a2")]
)
genstrs = GenericString["fee", "fi", "foo"]
@test_nowarn write_attribute(d, GenericString("myattr"), genstrs)
@test genstrs == read(d["myattr"])
for obj in (hfile,)
@test_nowarn open_dataset(obj, GenericString("d"))
@test_nowarn write_dataset(obj, GenericString("dd"), 1)
@test_nowarn read_dataset(obj, GenericString("dd"))
@test_nowarn read(obj, GenericString("dd"))
@test_nowarn read(obj, GenericString("dd") => Int)
end
read(attributes(hfile), GenericString("a"))
write(hfile, GenericString("ASD"), GenericString("Aa"))
write(g, GenericString("ASD"), GenericString("Aa"))
write(g, GenericString("ASD1"), [GenericString("Aa")])
# test writing multiple variable
@test_nowarn write(
hfile, GenericString("a1"), rand(2, 2), GenericString("a2"), rand(2, 2)
)
# copy methods
d1 = create_dataset(hfile, GenericString("d1"), dt, ds)
d1["x"] = 32
@test_nowarn copy_object(hfile, GenericString("d1"), hfile, GenericString("d1copy1"))
@test_nowarn copy_object(d1, hfile, GenericString("d1copy2"))
fn = GenericString(tempname())
A = Matrix(reshape(1:120, 15, 8))
@test_nowarn h5write(fn, GenericString("A"), A)
@test_nowarn h5read(fn, GenericString("A"))
@test_nowarn h5read(fn, GenericString("A"), (2:3:15, 3:5))
@test_nowarn h5write(fn, GenericString("x"), 1)
@test_nowarn h5read(fn, GenericString("x") => Int)
@test_nowarn h5rewrite(fn) do fid
g = create_group(fid, "mygroup")
write(g, "x", 3.3)
end
@test_nowarn h5rewrite(fn) do fid
g = create_group(fid, "mygroup")
write(g, "y", 3.3)
end
@test_nowarn h5write(fn, "W", [1 2; 3 4])
@test_nowarn h5writeattr(fn, GenericString("W"), Dict("a" => 1, "b" => 2))
@test_nowarn h5readattr(fn, GenericString("W"))
fn_external = GenericString(tempname())
dset = HDF5.create_external_dataset(hfile, "ext", fn_external, Int, (10, 20))
dcpl = HDF5.get_create_properties(dset)
@test HDF5.API.h5p_get_external_count(dcpl) == 1
ext_prop = HDF5.API.h5p_get_external(dcpl)
@test ext_prop.name == fn_external
@test ext_prop.offset == 0
@test ext_prop.size == 10 * 20 * sizeof(Int)
dapl = HDF5.get_access_properties(dset)
dapl.efile_prefix = "efile_test"
@test HDF5.API.h5p_get_efile_prefix(dapl) == "efile_test"
close(hfile)
end
@testset "opaque data" begin
mktemp() do path, io
close(io)
fid = h5open(path, "w")
num = 1
olen = 4
otype = HDF5.Datatype(HDF5.API.h5t_create(HDF5.API.H5T_OPAQUE, olen))
HDF5.API.h5t_set_tag(otype, "opaque test")
# scalar
dat0 = rand(UInt8, olen)
create_dataset(fid, "scalar", otype, dataspace(()))
write_dataset(fid["scalar"], otype, dat0)
# vector
dat1 = [rand(UInt8, olen) for _ in 1:4]
buf1 = reduce(vcat, dat1)
create_dataset(fid, "vector", otype, dataspace(dat1))
write_dataset(fid["vector"], otype, buf1)
# matrix
dat2 = [rand(UInt8, olen) for _ in 1:4, _ in 1:2]
buf2 = reduce(vcat, dat2)
create_dataset(fid, "matrix", otype, dataspace(dat2))
write_dataset(fid["matrix"], otype, buf2)
# opaque data within a compound data type
ctype = HDF5.Datatype(
HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(num) + sizeof(otype))
)
HDF5.API.h5t_insert(ctype, "v", 0, datatype(num))
HDF5.API.h5t_insert(ctype, "d", sizeof(num), otype)
cdat = vcat(reinterpret(UInt8, [num]), dat0)
create_dataset(fid, "compound", ctype, dataspace(()))
write_dataset(fid["compound"], ctype, cdat)
opaque0 = read(fid["scalar"])
@test opaque0.tag == "opaque test"
@test opaque0.data == dat0
opaque1 = read(fid["vector"])
@test opaque1.tag == "opaque test"
@test opaque1.data == dat1
opaque2 = read(fid["matrix"])
@test opaque2.tag == "opaque test"
@test opaque2.data == dat2
# Note: opaque tag is lost
compound = read(fid["compound"])
@test compound == (v=num, d=dat0)
close(fid)
end
end
@testset "FixedStrings and FixedArrays" begin
# properties for FixedString
fix = HDF5.FixedString{4,0}((b"test"...,))
@test length(typeof(fix)) == 4
@test length(fix) == 4
@test HDF5.pad(typeof(fix)) == 0
@test HDF5.pad(fix) == 0
# issue #742, large fixed strings are readable
mktemp() do path, io
close(io)
num = Int64(9)
ref = join('a':'z')^1000
fid = h5open(path, "w")
# long string serialized as FixedString
fid["longstring"] = ref
# compound datatype containing a FixedString
compound_dtype = HDF5.Datatype(
HDF5.API.h5t_create(HDF5.API.H5T_COMPOUND, sizeof(num) + sizeof(ref))
)
HDF5.API.h5t_insert(compound_dtype, "n", 0, datatype(num))
HDF5.API.h5t_insert(compound_dtype, "a", sizeof(num), datatype(ref))
c = create_dataset(fid, "compoundlongstring", compound_dtype, dataspace(()))
# normally this is done with a `struct name{N}; n::Int64; a::NTuple{N,Char}; end`,
# but we need to not actually instantiate the `NTuple`.
buf = IOBuffer()
write(buf, num, ref)
@assert position(buf) == sizeof(compound_dtype)
write_dataset(c, compound_dtype, take!(buf))
# Test reading without stalling
d = fid["longstring"]
T = HDF5.get_jl_type(d)
@test T <: HDF5.FixedString
@test length(T) == length(ref)
@test read(d) == ref
T = HDF5.get_jl_type(c)
@test T <: NamedTuple
@test fieldnames(T) == (:n, :a)
@test read(c) == (n=num, a=ref)
close(fid)
end
fix = HDF5.FixedArray{Float64,(2, 2),4}((1, 2, 3, 4))
@test size(typeof(fix)) == (2, 2)
@test size(fix) == (2, 2)
@test eltype(typeof(fix)) == Float64
@test eltype(fix) == Float64
# large fixed arrays are readable
mktemp() do path, io
close(io)
ref = rand(Float64, 3000)
t = HDF5.Datatype(
HDF5.API.h5t_array_create(datatype(Float64), ndims(ref), collect(size(ref)))
)
scalarspace = dataspace(())
fid = h5open(path, "w")
d = create_dataset(fid, "longnums", t, scalarspace)
write_dataset(d, t, ref)
T = HDF5.get_jl_type(d)
@test T <: HDF5.FixedArray
@test size(T) == size(ref)
@test eltype(T) == eltype(ref)
@test read(d) == ref
close(fid)
end
end
@testset "Object Exists" begin
hfile = h5open(tempname(), "w")
g1 = create_group(hfile, "group1")
@test_throws HDF5.API.H5Error create_group(hfile, "group1")
create_group(g1, "group1a")
@test_throws HDF5.API.H5Error create_group(hfile, "/group1/group1a")
@test_throws HDF5.API.H5Error create_group(g1, "group1a")
# issue #1055
create_group(hfile, SubString("abcdef", 1, 3))
@test haskey(hfile, "abc")
@test !haskey(hfile, "abcdef")
create_dataset(hfile, "dset1", 1)
create_dataset(hfile, "/group1/dset1", 1)
@test_throws ErrorException create_dataset(hfile, "dset1", 1)
@test_throws ErrorException create_dataset(hfile, "group1", 1)
@test_throws ErrorException create_dataset(g1, "dset1", 1)
close(hfile)
end
@testset "HDF5 existance" begin
fn1 = tempname()
fn2 = tempname()
open(fn1, "w") do f
write(f, "Hello text file")
end
@test !HDF5.ishdf5(fn1) # check that a non-hdf5 file returns false
@test !HDF5.ishdf5(fn2) # checks that a file that does not exist returns false
@test_throws ErrorException h5write(fn1, "x", 1) # non hdf5 file throws
h5write(fn2, "x", 1)
@test HDF5.ishdf5(fn2)
rm(fn1)
rm(fn2)
end
@testset "bounds" begin
# issue #954
h5open(tempname(), "w") do f
a, _ = create_attribute(f, "a", zeros(4))
@test_throws ArgumentError write(a, ones(2))
d, _ = create_dataset(f, "dd", zeros(4))
@test_throws ArgumentError write(d, ones(2))
end
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 654 | import HDF5.API
@testset "plugins" begin
state = API.h5pl_get_loading_state()
@test state != API.H5PL_TYPE_ERROR
@test API.h5pl_set_loading_state(state) === nothing
tmp = mktempdir()
@test API.h5pl_append(tmp) === nothing
@test API.h5pl_get(1) == tmp
tmp = mktempdir()
@test API.h5pl_prepend(tmp) === nothing
@test API.h5pl_get(0) == tmp
tmp = mktempdir()
@test API.h5pl_replace(tmp, 1) === nothing
@test API.h5pl_get(1) == tmp
tmp = mktempdir()
@test API.h5pl_insert(tmp, 1) === nothing
@test API.h5pl_get(1) == tmp
@test API.h5pl_remove(1) === nothing
@test API.h5pl_size() == 3
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 4443 | using HDF5
using Test
@testset "properties" begin
fn = tempname()
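# Create a file with a broad set of file-creation and file-access properties so that the
# corresponding property getters can be checked below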
h5open(
fn,
"w";
userblock=1024,
alignment=(0, sizeof(Int)),
libver_bounds=(:earliest, :latest),
meta_block_size=1024,
strategy=:fsm_aggr,
persist=1,
threshold=2,
file_space_page_size=0x800
) do hfile
# generic
g = create_group(hfile, "group")
if HDF5.API.h5_get_libversion() >= v"1.10.5"
kwargs = (:no_attrs_hint => true,)
else
kwargs = ()
end
d = create_dataset(
g,
"dataset",
datatype(Int),
dataspace((500, 50));
alloc_time=HDF5.API.H5D_ALLOC_TIME_EARLY,
chunk=(5, 10),
fill_value=1,
fill_time=:never,
obj_track_times=false,
chunk_cache=(522, 0x200000, 0.80),
efile_prefix=:origin,
virtual_prefix="virtual",
virtual_printf_gap=2,
virtual_view=:last_available,
kwargs...
)
attributes(d)["metadata"] = "test"
flush(hfile)
fcpl = HDF5.get_create_properties(hfile)
fapl = HDF5.get_access_properties(hfile)
gcpl = HDF5.get_create_properties(hfile["group"])
dcpl = HDF5.get_create_properties(d)
dapl = HDF5.get_access_properties(d)
acpl = HDF5.get_create_properties(attributes(d)["metadata"])
# Retrievability of properties
@test isvalid(fcpl)
@test isvalid(fapl)
@test isvalid(gcpl)
@test isvalid(dcpl)
@test isvalid(dapl)
@test isvalid(acpl)
# Retrieving property values:
@test fcpl.userblock == 1024
@test fcpl.obj_track_times
@test fcpl.file_space_page_size == 0x800
@test fcpl.strategy == :fsm_aggr
@test fcpl.persist == 1
@test fcpl.threshold == 2
@test fapl.alignment == (0, sizeof(Int))
@test fapl.driver == HDF5.Drivers.POSIX()
@test_throws HDF5.API.H5Error fapl.driver_info
@test fapl.fclose_degree == :strong
@test fapl.libver_bounds ==
(:earliest, HDF5.libver_bound_from_enum(HDF5.API.H5F_LIBVER_LATEST))
@test fapl.meta_block_size == 1024
@test gcpl.local_heap_size_hint == 0
@test gcpl.obj_track_times
@test HDF5.UTF8_LINK_PROPERTIES.char_encoding == :utf8
@test HDF5.UTF8_LINK_PROPERTIES.create_intermediate_group
@test dcpl.alloc_time == :early
@test dcpl.chunk == (5, 10)
@test dcpl.layout == :chunked
@test !dcpl.obj_track_times
@test dcpl.fill_time == :never
@test dcpl.fill_value == 1.0
if HDF5.API.h5_get_libversion() >= v"1.10.5"
@test dcpl.no_attrs_hint == true
end
@test dapl.chunk_cache.nslots == 522
@test dapl.chunk_cache.nbytes == 0x200000
@test dapl.chunk_cache.w0 == 0.8
@test dapl.efile_prefix == raw"$ORIGIN"
@test dapl.virtual_prefix == "virtual"
# We probably need to actually use a virtual dataset
@test_broken dapl.virtual_printf_gap == 2
if HDF5.API.h5_get_libversion() >= v"1.14.0"
@test dapl.virtual_view == :last_available
else
@test_broken dapl.virtual_view == :last_available
end
@test acpl.char_encoding == :utf8
# Test auto-initialization of property lists on get
dcpl2 = HDF5.DatasetCreateProperties() # uninitialized
@test dcpl2.id < 1 # 0 or -1
@test !isvalid(dcpl2)
@test dcpl2.alloc_time == :late
@test isvalid(dcpl2)
# Test H5Pcopy
dapl2 = copy(dapl)
@test dapl2.id != dapl.id
@test dapl2.virtual_prefix == dapl.virtual_prefix
dapl2.virtual_prefix = "somewhere_else"
@test dapl2.virtual_prefix != dapl.virtual_prefix
nothing
end
file_image = read(fn)
rm(fn; force=true)
fapl = HDF5.FileAccessProperties()
fapl.driver = HDF5.Drivers.Core(; backing_store=false)
fapl.file_image = copy(file_image)
@test fapl.file_image == file_image
file_image2 = h5open(fn, "r"; fapl) do f
@test haskey(f, "group")
convert(Vector{UInt8}, f)
end
# file_image2 does not include user block
@test file_image2 == @view(file_image[1025:end])
rm(fn; force=true)
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 5297 | using HDF5
using Test
using LinearAlgebra: norm
@testset "readremote" begin
# check that we can read the official HDF5 example files
# download and save test file via:
# urlbase = "https://support.hdfgroup.org/ftp/HDF5/examples/files/exbyapi/"
test_files = joinpath(@__DIR__, "test_files")
# if !isdir(test_files)
# mkdir(test_files)
# end
# function joinpath(test_files, name)
# file = joinpath(test_files, name)
# if !isfile(file)
# file = download(urlbase*name, file)
# end
# file
# end
#! format: off
fcmp =
[
0 1 2 3 4 5 6
2 1.66667 2.4 3.28571 4.22222 5.18182 6.15385
4 2.33333 2.8 3.57143 4.44444 5.36364 6.30769
6 3 3.2 3.85714 4.66667 5.54545 6.46154
]'
icmp = [
0 -1 -2 -3 -4 -5 -6
0 0 0 0 0 0 0
0 1 2 3 4 5 6
0 2 4 6 8 10 12
]'
SOLID, LIQUID, GAS, PLASMA = 0, 1, 2, 3
ecmp =
[
SOLID SOLID SOLID SOLID SOLID SOLID SOLID
SOLID LIQUID GAS PLASMA SOLID LIQUID GAS
SOLID GAS SOLID GAS SOLID GAS SOLID
SOLID PLASMA GAS LIQUID SOLID PLASMA GAS
]'
scmp = ["Parting", "is such", "sweet", "sorrow."]
vicmp = Array{Int32}[[3, 2, 1], [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]]
opq = Array{UInt8}[
[0x4f, 0x50, 0x41, 0x51, 0x55, 0x45, 0x30],
[0x4f, 0x50, 0x41, 0x51, 0x55, 0x45, 0x31],
[0x4f, 0x50, 0x41, 0x51, 0x55, 0x45, 0x32],
[0x4f, 0x50, 0x41, 0x51, 0x55, 0x45, 0x33]
]
# For H5T_ARRAY
AA = Array{Int,2}[
[0 0 0;
0 -1 -2;
0 -2 -4;
0 -3 -6;
0 -4 -8],
[0 1 2;
1 1 1;
2 1 0;
3 1 -1;
4 1 -2],
[0 2 4;
2 3 4;
4 4 4;
6 5 4;
8 6 4],
[0 3 6;
3 5 7;
6 7 8;
9 9 9;
12 11 10]]
#! format: on
file = joinpath(test_files, "h5ex_t_floatatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test norm(a - fcmp) < 1.5e-5
close(fid)
file = joinpath(test_files, "h5ex_t_float.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test norm(d - fcmp) < 1.5e-5
close(fid)
file = joinpath(test_files, "h5ex_t_intatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test a == icmp
close(fid)
file = joinpath(test_files, "h5ex_t_int.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test d == icmp
close(fid)
if HDF5.API.h5_get_libversion() >= v"1.8.11"
file = joinpath(test_files, "h5ex_t_enumatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test a == ecmp
close(fid)
file = joinpath(test_files, "h5ex_t_enum.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test d == ecmp
close(fid)
end
file = joinpath(test_files, "h5ex_t_objrefatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
g = fid[a[1]]
@test isa(g, HDF5.Group)
ds2 = fid[a[2]]
ds2v = read(ds2)
@test isa(ds2v, HDF5.EmptyArray{Int32})
@test isempty(ds2v)
close(fid)
file = joinpath(test_files, "h5ex_t_objref.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
g = fid[d[1]]
@test isa(g, HDF5.Group)
ds2 = fid[d[2]]
ds2v = read(ds2)
@test isa(ds2v, HDF5.EmptyArray{Int32})
@test isempty(ds2v)
close(fid)
file = joinpath(test_files, "h5ex_t_stringatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test a == scmp
close(fid)
file = joinpath(test_files, "h5ex_t_string.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test d == scmp
close(fid)
file = joinpath(test_files, "h5ex_t_vlenatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test a == vicmp
close(fid)
file = joinpath(test_files, "h5ex_t_vlen.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test d == vicmp
close(fid)
file = joinpath(test_files, "h5ex_t_vlstringatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test a == scmp
close(fid)
file = joinpath(test_files, "h5ex_t_vlstring.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test d == scmp
close(fid)
file = joinpath(test_files, "h5ex_t_opaqueatt.h5")
fid = h5open(file, "r")
dset = fid["DS1"]
a = read_attribute(dset, "A1")
@test a.tag == "Character array"
@test a.data == opq
close(fid)
file = joinpath(test_files, "h5ex_t_opaque.h5")
fid = h5open(file, "r")
d = read(fid, "DS1")
@test d.tag == "Character array"
@test d.data == opq
close(fid)
file = joinpath(test_files, "h5ex_t_array.h5")
fid = h5open(file, "r")
A = read(fid, "DS1")
@test A == AA
close(fid)
end # testset readremote
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |
|
[
"MIT"
] | 0.17.2 | e856eef26cf5bf2b0f95f8f4fc37553c72c8641c | code | 997 | using Random, Test, HDF5
@testset "reference" begin
data = rand(100)
fn = tempname()
f = h5open(fn, "w")
f["data"] = data
f["group_data"] = data
f["attr_data"] = data
# reference attached to file
f["file_ref"] = HDF5.Reference(f, "data")
# reference attached to group
g = create_group(f, "sub")
g["group_ref"] = HDF5.Reference(f, "group_data")
# reference attached to dataset
f["data"]["attr_ref"] = HDF5.Reference(f, "attr_data")
close(f)
f = h5open(fn, "r")
# read back file-attached reference
ref = read(f["file_ref"])
@test ref isa HDF5.Reference
@test data == read(f[ref])
# read back group-attached reference
gref = read(f["sub"]["group_ref"])
@test gref isa HDF5.Reference
@test data == read(f["sub"][gref])
# read back dataset-attached reference
aref = read(f["data"]["attr_ref"])
@test aref isa HDF5.Reference
@test data == read(f["data"][aref])
close(f)
rm(fn)
end
| HDF5 | https://github.com/JuliaIO/HDF5.jl.git |